SirEDCaLot

joined 1 year ago
[–] SirEDCaLot@lemmy.today 2 points 1 month ago (2 children)

So you're left with departments full of clock punchers who don't have vision or leadership. If you want to kill your golden goose, that's a good way to do it. The remaining departments, full of drone followers, aren't going to be making you the exciting, groundbreaking products that bring in the money.

Then again, I personally see value in employees; maybe business leadership does not, or thinks they're all generic and replaceable.

[–] SirEDCaLot@lemmy.today 1 points 1 month ago

No, but it will bring into question the process by which they were acquired to begin with. Somebody will ask: why did you spend x billion on real estate when it was obvious that remote work was the future? Or if they're locked into a long-term lease, eventually the question will come up: why are we spending all this money for office space we aren't using? Shouldn't we have thought of this earlier? Not having workers in the office makes it obvious that the real estate was a bad investment, and many of these companies are pretty heavily invested in real estate. It's easier to screw the workers with what can be explained away as a management strategy than to admit you wasted a whole bunch of money buying and building and renovating space you don't need.

[–] SirEDCaLot@lemmy.today 30 points 1 month ago (7 children)

That and executive ass covering, a way to avoid admitting to shareholders that they wasted their money on useless commercial real estate.

It's also shooting themselves in the foot. The first people to leave aren't going to be the clock punchers; it will be the best and brightest, who can easily find other jobs.

[–] SirEDCaLot@lemmy.today 14 points 2 months ago

The only way you can do this is if the only service you use the provider for is storage. Encrypt the data before you send it to the provider, and then they don't know what they're storing.
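
A minimal sketch of that approach in Python, assuming the `cryptography` package is available; `upload()` and the file name are just hypothetical stand-ins for whatever your provider's client API looks like:

```python
# Client-side encryption before upload: the provider only ever sees ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep this locally; never send it to the provider
f = Fernet(key)

plaintext = open("backup.tar", "rb").read()
ciphertext = f.encrypt(plaintext)  # authenticated symmetric encryption (AES-128-CBC + HMAC-SHA256)

# upload(ciphertext)               # hypothetical call to the storage provider
# Later, after downloading the blob back:
assert f.decrypt(ciphertext) == plaintext
```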

If they have to do any processing on it at all, then conceptually they need a plaintext copy to feed into the CPU. And if they have that, there is nothing you can do to stop them from stealing it or using it.

There has been some research in this field; the concept is called homomorphic encryption. That's where you encrypt something in a way that allows a third party to manipulate the data without possessing the key. It's still very limited, and likely always will be, due to the extreme difficulty of the problem.
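
To make the concept concrete, here's a toy sketch in Python. It leans on the fact that textbook (unpadded) RSA happens to be multiplicatively homomorphic; it is not secure and not how practical schemes (Paillier, BFV/CKKS, etc.) work, but it shows a third party combining ciphertexts without ever holding the key:

```python
# Toy illustration of the homomorphic idea, abusing the fact that textbook
# (unpadded) RSA is multiplicatively homomorphic. NOT secure, NOT practical.

# Tiny hand-picked RSA key pair: n = p*q, e is public, d is private.
p, q = 61, 53
n = p * q          # 3233
e = 17             # public exponent
d = 2753           # private exponent: e*d == 1 (mod lcm(p-1, q-1))

encrypt = lambda m: pow(m, e, n)   # the "cloud" only ever sees values like this
decrypt = lambda c: pow(c, d, n)   # only the key holder can do this

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# The server multiplies the two ciphertexts without ever learning a or b...
c_product = (ca * cb) % n

# ...and the key holder decrypts the result to a*b.
assert decrypt(c_product) == a * b
print(decrypt(c_product))          # -> 42
```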

[–] SirEDCaLot@lemmy.today 1 points 2 months ago

> with an outside control interface that’s quite literally about as optimal as it can be.

Which is probably true, as long as you make one assumption: that the operator dedicates a significant amount of time to learning it. Granting that assumption, I'll take your word that it becomes much more efficient than a Nano/Notepad-style editor.

I'm happy to concede, without any personal knowledge, that if you're hardcore editing code it may well be worth the time to learn Vim, on the principle that it's probably the single most efficient terminal-based text editor.

But what if you're NOT hardcore editing code? What if you just need to edit a config file here and there? You don't need the 'absolute most efficient' system because it's NOT efficient for you to take the time to learn it. You just want to comment out a line and type a replacement below it. And you've been using Notepad-style text editors for years.

Thus my point-- there is ABSOLUTELY a place for Vim. But wanting to just edit a file without having to learn a whole new editor doesn't make one lazy. It means you're being efficient, focusing your time on getting what you need done, done.

[–] SirEDCaLot@lemmy.today 1 points 2 months ago

That's the appropriate reaction to many of these so-called threats to society. Internet chat rooms, generative AI, drugs, opioids, guns, pornography, trashy TV, you name it. I think it's been pretty well demonstrated throughout history that the majority of the time some 'threat to public safety' comes out and a well-meaning group tries to get the government to shove the genie back in the bottle, the cure ends up being worse than the disease. And it's a lot easier to set up a bureaucracy than to dismantle it.

The sad thing is, whatever regulation they set up will be pointless. Someone will download an open-source model and run it locally with the watermark code removed. Or some other nation will realize that hobbling their AI industry with stupid regulations won't help them get ahead in the world, and they will become a source for non-watermarked output and watermark-free models.

So we hobble ourselves with some ridiculous AI enforcement bureaucracy, and it will do precisely zero good because the people who would do bad things will just do them on offshore servers or in their basement.

It applies everywhere else too. I'm all for ending the opioid crisis, but the current attempt to end opioids entirely is not the solution. A good friend of mine takes a lot of opioids, prescribed by a doctor, for a serious pain condition resulting from a car accident. This person's back and neck are full of metal pins and screws and plates and whatnot.
For this person, opioids like OxyContin are the difference between being in constant pain and being able to do things like work out at the gym and enjoy life.
But because of the well-meaning war on opioids, this person and their doctor are persecuted. Pharmacies don't want to deal with OxyContin, and the doctor is getting constant flak from insurers and the DEA for prescribing too much of it.
I mean really, a pain management doctor prescribes a lot of pain medication. That's definitely something fishy that we should turn the screws on him for...

It's really infuriating. In my opinion, the only two people who should decide what drugs get taken are a person and their doctor. For anyone else to try and intrude on that is a violation of that person's rights.

[–] SirEDCaLot@lemmy.today 1 points 2 months ago (1 children)

I agree it's hypocritical, but for different reasons.

I think a nude/sex scene can be important to the plot and add a lot to the story, in some situations. Yeah, it's often thrown in as eye candy to get more viewers, but sometimes it counts for a lot. Look at Season 1 of Game of Thrones, for example: there are a couple of sex scenes between Dany and Khal Drogo, and IMHO they do a lot more to further the story than just show T&A-- in the first one Dany is basically being raped, but as the season goes on you see her start to fall in love with Drogo and it becomes more like making love. Hard to get the same effect without sex scenes.
Same thing anytime you have two people in bed: crappy, unrealistic TV sex, where the girl never takes her shirt off and half a second later they're both wrapped tightly but conveniently in sheets, can break suspension of disbelief.
So I can sympathize with an actor who agrees to artistic nude scenes or sex scenes because they're important to the plot, but then has that specific 20 seconds of video taken out of context and circulated on porn sites.

At the same time, an actor doesn't get to order the audience to experience the film in any certain way. Just as you say about 'The Piano', it depends on how you watch it. It's not illegal to buy the film, fast-forward to the nude scenes, and stop watching when they're done. So to think you get any sort of control over that is hypocritical; it's like ordering a reader to read the entire book and not share passages with a friend.

[–] SirEDCaLot@lemmy.today 0 points 2 months ago (2 children)

I'm not fine with that, as it will have wide-ranging repercussions on society at large that aren't all good.

But I fully accept it as the cold hard reality that WILL happen now that the genie's out of the bottle, and the reality that any ham-fisted legal attempt to rebottle the genie will be far worse for society and only delay the inevitable acceptance that photographs are no longer proof.

And as such, I (and most other adults mature enough to accept a less-than-preferred reality as reality) stand with you and give the statists the middle finger, along with everyone else who thinks you can legislate any genie back into its bottle. In the 1990s it was the 'protect kids from Internet porn' people, in the 2000s it was the 'protect kids from violent video games' and 'stop Internet piracy' people, I guess today it's the 'stop generative AI' people. They are all children who think crying to Daddy will remake the ways of the world. It won't.

[–] SirEDCaLot@lemmy.today 3 points 2 months ago

I don't think there are nearly as many as you think. It's a perception bias: the few that there are stick out a lot because they're hilariously stupid, so you read about them a lot and it seems common.

Keep in mind that in the US, about half the households are armed. And the half that own guns own enough guns to arm the other half. There are more guns than people in this country. If there truly were a significant overlap between very stupid people and gun ownership, the nation would be like a Road Runner cartoon, with Yosemite Sam-type shootouts and people firing into the air on every street corner. That is seriously not the case.

[–] SirEDCaLot@lemmy.today 3 points 2 months ago (2 children)

They generally are.
Problem is, there's a tiny little overlap in the middle of that Venn diagram, and that tiny overlap seems to be responsible for a great many problems.

[–] SirEDCaLot@lemmy.today 0 points 2 months ago

Probably the best idea yet. It's definitely not foolproof though. The best you could do is put a security chip in the camera that digitally signs the pictures, but that's imperfect because eventually someone will extract the key or figure out how to get the camera to sign pictures of their choosing that it didn't actually take.

A creator-level key is more likely, so you choose who you trust.

But most of the pictures that would be taken as proof of anything probably won't be signed by one of those.
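
Roughly what that signing step might look like, sketched in Python with the `cryptography` package. In a real camera the private key would live inside the tamper-resistant chip and never leave it; the names here (`camera_private_key`, `photo.jpg`) are just placeholders:

```python
# Sketch of per-camera image signing and verification with Ed25519.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Inside the camera's secure element:
camera_private_key = ed25519.Ed25519PrivateKey.generate()
camera_public_key = camera_private_key.public_key()   # published by the manufacturer/creator

image_bytes = open("photo.jpg", "rb").read()
signature = camera_private_key.sign(image_bytes)       # shipped alongside the image file

# Anyone verifying the photo later:
try:
    camera_public_key.verify(signature, image_bytes)
    print("signature valid: bytes unchanged since signing")
except InvalidSignature:
    print("signature invalid: image altered or not from this camera")
```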

[–] SirEDCaLot@lemmy.today 0 points 2 months ago (1 children)

I'm not talking about the copyright violation of sharing parts of a copyrighted movie. That is obviously infringement. I am talking about generated nude images.

If the pencil drawing is not harming anybody, is the photorealistic but completely hand-done painting somehow more harmful? Does it become even more harmful if you use AI to help with the painting?

If the pencil drawing is legal, and the AI generated deep fake is illegal, I am asking where exactly the line is. Because there is a whole spectrum between the two, so at what point does it become illegal?
