Fubarberry

joined 1 year ago
[–] Fubarberry@sopuli.xyz 31 points 3 months ago (6 children)

The Nintendo Switch had 527 games released this year.

Xbox had 355 games.

PS5 had 395 games.

I'm sure there are other factors (like quality of the games, platform exclusivity, etc.), but the Switch is still doing pretty well for an 8-year-old console that's too weak to run new multi-platform games.

[–] Fubarberry@sopuli.xyz 5 points 3 months ago

Looks great. I've been using Stealth, but it has trouble with albums and some other content, and it doesn't let me search for posts in a specific subreddit.

This looks like it will do both of those things, as well as being actively maintained and improved. Thanks for making it.

[–] Fubarberry@sopuli.xyz 49 points 4 months ago

Yeah, I'm fine with it.

[–] Fubarberry@sopuli.xyz 3 points 4 months ago (4 children)

A lot of password managers support 2FA now. I use Enpass because I got a lifetime license a long time ago (it's also available to people with Google Play Pass), but I know some other popular options have it too.
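For context, the 2FA codes these managers generate are almost always TOTP (RFC 6238). Here's a minimal sketch of the algorithm using just the Python standard library; the base32 demo secret is an arbitrary placeholder, not tied to any real account:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Generate an RFC 6238 TOTP code from a base32-encoded secret."""
    # Pad the secret to a multiple of 8 chars so b32decode accepts it
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = struct.pack(">Q", int(time.time()) // period)  # moving factor
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # placeholder secret
```

A password manager that "supports 2FA" is basically just storing that secret alongside your password and running this same computation when you need a code.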

[–] Fubarberry@sopuli.xyz 13 points 4 months ago* (last edited 4 months ago)

That's covered by section 107 of US copyright law, and is actually fine and protected as fair use in most cases, as long as the work isn't a direct copy and instead transforms it into something different.

All parody-type music is protected this way, whether it's new lyrics to a song or even something less "creative," like performing the lyrics of song A in the melody and style of song B.

[–] Fubarberry@sopuli.xyz 4 points 4 months ago* (last edited 4 months ago)

I think a fairer comparison in that case would be the difficulty of building a camera vs the difficulty of building and programming an AI capable computer.

That doesn't really make sense either way, though; no one is building their camera/computer from raw materials and then arguing that gives them better intellectual property rights.

[–] Fubarberry@sopuli.xyz -4 points 4 months ago (2 children)

If you snap a photo of something, you own the photo (at least in the US).

There's a solid argument that someone doing complex AI image generation has done way more to create the final product than someone snapping a quick pic with their phone.

[–] Fubarberry@sopuli.xyz 93 points 4 months ago (31 children)

There's nothing stopping you from going to youtube, listening to a bunch of hit country songs there, and using that inspiration to write a "hit country song about getting your balls caught in a screen door". That music was free to access, and your ability to create derivative works is fully protected by copyright law.

So if that's what the AI is doing, then it would be fully legal if it was a person. The question courts are trying to figure out is if AI should be treated like people when it comes to "learning" and creating derivative works.

I think there are good arguments on both sides of that issue. The big advantage of ruling against AI having those rights is that record labels and other rights holders could get compensation for their content being used. The main disadvantage is that high cost barriers to training material would kill off open-source and small-company AI, guaranteeing that generative AI is fully controlled by tech giants like Google, Microsoft, and Adobe.

I think the best legal outcome is one that attempts to protect both: companies and individuals below a certain revenue threshold (or other scale metric) can freely train on the open web, but are required to track what was used for training. As they grow, they'd hit tiers where they're required to start paying for the content their models were trained on. Obviously this solution needs a lot of work before it's viable, but I think something along these lines is the best way to both keep competition in the AI space and make sure people get compensated.

[–] Fubarberry@sopuli.xyz 2 points 4 months ago

There's not much concrete data I can find on accident rates on highways vs. non-highways. You would expect accidents on small side streets to have lower fatality rates, though, with wrecks at highway speeds having much higher ones. From what I can see, a government investigation into how safe Autopilot is determined there were 13 deaths, which is a very low number given the billions of miles driven with Autopilot on (3 billion+ in 2020, probably 5-10 billion now? Just guessing here since I can't find a newer number).
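For scale, a rough back-of-the-envelope comparison (the miles figure is my midpoint guess from above, and the ~1.3 deaths per 100 million vehicle miles US average is an assumption based on commonly cited NHTSA figures):

```python
# Rough fatality-rate comparison; every input here is an estimate from above.
autopilot_deaths = 13
autopilot_miles = 5e9        # midpoint of the 5-10 billion guess
us_avg_per_100m = 1.3        # assumed: commonly cited NHTSA average

autopilot_per_100m = autopilot_deaths / (autopilot_miles / 1e8)
print(f"Autopilot:  ~{autopilot_per_100m:.2f} deaths per 100M miles")  # ~0.26
print(f"US average: ~{us_avg_per_100m:.2f} deaths per 100M miles")
```

Even if the real Autopilot mileage is at the low end of that guess, the implied rate is well below the overall average, though this ignores where and how those miles were driven.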

But yeah, there are so many factors with driving that it's hard to get an exact idea. Rural roads have the highest fatality rates (accounting for up to 90% of traffic fatalities in some states), and it's not hard to imagine that Teslas are less popular in rural communities (although they seem to be pretty popular where I live).

But rural roads are also a perfect use case for Autopilot: generally easy driving conditions where most deaths happen because of speeding and drivers not paying attention. Increased adoption of self-driving cars in rural communities would probably save a lot of lives.

[–] Fubarberry@sopuli.xyz 7 points 4 months ago (4 children)

It reminds me of the debate around self-driving cars. Tesla has a flawed implementation of self-driving tech that tries to gather all the information it needs through camera inputs instead of using multiple sensor types. This doesn't always work, and it has led to some questionable crashes that a human driver could probably have avoided.

However, even with Tesla's flawed self-driving, they're supposed to have far fewer wrecks than human drivers. According to Tesla's safety report, Teslas in self-driving mode average 5-6 million miles per accident, vs. 1-1.5 million miles for Tesla drivers not using self-driving (the US average is 500-750k miles per accident).
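Taking the midpoints of those ranges, the implied ratios look like this (numbers straight from the safety-report figures above, so the usual caveats about Tesla self-reporting apply):

```python
# Relative accident rates implied by the miles-per-accident figures above.
autopilot    = (5e6 + 6e6) / 2      # Tesla with self-driving engaged
tesla_manual = (1e6 + 1.5e6) / 2    # Tesla without self-driving
us_average   = (0.5e6 + 0.75e6) / 2

print(f"vs Tesla manual: ~{autopilot / tesla_manual:.1f}x more miles per accident")
print(f"vs US average:   ~{autopilot / us_average:.1f}x more miles per accident")
```

That works out to roughly 4x fewer accidents per mile than manual Tesla driving and roughly 9x fewer than the US average.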

So a system like this doesn't have to be perfect to do a far better job than people can, but that doesn't mean it won't feel terrible for the unlucky people things go poorly for.

[–] Fubarberry@sopuli.xyz 3 points 5 months ago (2 children)

No, the version they released isn't the full parameter set, and it leads to really bad results for a lot of prompts. You get dramatically better results using their API version, so the full SD3 model is good, but the version we have is not.

Here's an example of SD3 API version: SD3 API

And here's the same prompt on the local weights version they released: SD3 local weights 2B

People think Stability AI censored NSFW content in the released model, which has crippled its ability to understand a lot of poses and how anatomy works in general.
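If you want to reproduce the local-weights side of the comparison yourself, here's a minimal sketch using Hugging Face diffusers; the pipeline class and model id match the SD3 medium release as I understand it, but verify against the current diffusers docs:

```python
# Sketch: run the released SD3 medium (2B) weights locally via diffusers.
import torch
from diffusers import StableDiffusion3Pipeline

pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    "a woman lying in the grass",  # the kind of pose the thread shows failing
    num_inference_steps=28,
    guidance_scale=7.0,
).images[0]
image.save("sd3_local.png")
```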

For more examples of the issues with SD3, I'd recommend checking this reddit thread.

[–] Fubarberry@sopuli.xyz 3 points 5 months ago (1 children)

That lawsuit is from 2021, and it was thrown out later that year for failing to meet "the most basic requirements of an antitrust case."
