Hold on ....
Are you saying all software hosted on github is infected with copilot? Or am I misreading the situation?
Your confusion is understandable since MS has called like 4 different products “Copilot”. This refers to the coding assistant built into GitHub for everything from CI/CD to coding itself.
All code uploaded to GitHub is subject to being scraped by Copilot to both train and provide inference context to its model(s).
Basically, having your code on GitHub is implicit consent to have it fed to MS's LLMs.
All code uploaded to GitHub is subject to being scraped
No kidding: that was literally my very first thought back in the day when I learned that M$ had taken over GitHub.
(Copilot did not exist then)
Mine too. More precisely: code uploaded to GH won't be yours anymore. IIRC there were changes to the TOS that supported this. But even if not, predicting the obvious doesn't make us prophets.
Copilot steals from all the code on github.
I guess it's about copilot scanning the code, submitting PRs, reporting security issues, doing code reviews and such.
Excellent news! I have been preaching the good word of Codeberg for months; delighted to see it's working.
If I can get NixOS to move, I will be the happiest gal in the world...
More distros need to follow. No FOSS should have any relationship to Microsoft or their products.
Did this a few months ago. Everyone should do the same.
Gentoo is still around‽ But Arch exists and eMachines was discontinued like 10 years ago!
I know this is probably sarcastic but honestly Gentoo's great if you don't trust binaries by default. Nothing is an absolute guarantee against compromise, but it's an awful lot harder to compromise a source code repository or a compiler without anyone noticing (especially if you stick to stable versions) than it is to compromise a particular binary of some random software package. I trust most package maintainers, but they're typically overworked volunteers and not all of them are going to have flawless security or be universally trustworthy.
I like building my own binaries from source code whenever possible.
Genuine question from a longtime Linux user who never tried Gentoo: doesn't updating take forever? I used a source build of Firefox for a bit and the build took ages, not to mention the kernel itself.
The long update has the advantage of providing an opportunity to touch grass.
touch grass is literally a one-liner, cmon bro
Gentoo doesn't always ship the latest builds, at least not by default.
Update times depend on how many packages you have, your hardware, and how much of that hardware you're willing to dedicate to compiling.
I don't use a DE, just dwm+dmenu, so my biggest packages are Firefox and LibreOffice, which can take 3+ hours with dependencies. KDE or GNOME would most likely add more.
But you can set the number of cores used for compiling in the config. If your PC is on most of the day, you can set it to 1 or 2 and you most likely won't even notice it.
Or, if you have a 16-core CPU, let 14 do the compiling and browse the web with the remaining two.
This all assumes you have enough RAM as well. RAM is less of a bottleneck, but you should have at least 32 GB.
The distro is smooth, way more than anything I've ever tried, and I'm not switching from it.
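The core-count tuning described above normally lives in /etc/portage/make.conf; a minimal sketch with illustrative values (the -j number is an assumption, tune it to your own machine):

```shell
# /etc/portage/make.conf -- illustrative values, not a recommendation
MAKEOPTS="-j14 -l14"            # up to 14 parallel jobs; back off when load exceeds 14
EMERGE_DEFAULT_OPTS="--jobs=2"  # let emerge build two packages in parallel
```

Dropping -j to 1 or 2, as suggested above, is what makes a background world update barely noticeable on an always-on machine.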
Depends on your system specs, but.... yes, generally speaking. There is a reason most people and most distros use binaries. Even Gentoo can use binaries for some stuff.
Are you going to suffer significant damage if your updates take forever though? What's the hurry? The number of times I have literally needed the absolute latest version of something installed right now is pretty damn minimal. The major exception is widespread, exploited zero-day remote-access vulnerabilities, but those are rare, and especially rare are ones that affect the exact versions and configurations of software that I am currently using and cannot reasonably just opt to "stop" using. Even so, there are usually other ways to block the network traffic, disable the offending part of the configuration, or otherwise mitigate the risk. Besides, there's nothing stopping you from literally just downloading a patched binary if that's what you need at that moment.
Patience is a virtue, and it's generally good for you. You don't have to be addicted to constant updates, but you do need to be thoughtful and understand how to build defense-in-depth.
It's not so much "I must have the latest version NOW" and more that while it was building my system load would spike from 0.1 to 7+ and everything ran like shit for like half an hour.
I'm a messy, impatient boy - I know my limitations!
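For what it's worth, that load spike can usually be tamed by deprioritizing the build rather than giving up on source builds; a sketch using Portage's make.conf knobs (illustrative values):

```shell
# /etc/portage/make.conf -- run builds at low CPU and I/O priority (illustrative)
PORTAGE_NICENESS="19"                           # lowest CPU scheduling priority for emerge
PORTAGE_IONICE_COMMAND="ionice -c 3 -p \${PID}"  # idle I/O class, so disk access yields too
```

With these set, a three-hour Firefox build mostly runs in the gaps your interactive workload leaves free, instead of competing with it.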
Gentoo is more Linux than anything. It is literally a penguin. What does Arch have?
Gentoo is still a better distro; it underpins every ChromeOS device (they just do the compilation for you).
The "because of training" claim is wrong.
Quoting the Gentoo post:
Mostly because of the continuous attempts to force Copilot usage for our repositories,
It seems to be about GitHub pushing Copilot usage, not about them training on data. Moving away doesn't prevent training anyway, and I'm sure someone will host a mirror on GitHub if they don't.
Excellent!