7heo

joined 2 years ago
[–] 7heo@lemmy.ml 14 points 9 months ago (3 children)

"Religion for yourself" in the age of the internet is called "personal belief". So, the term "religion" now only means, like it or not, "institutionalized religion".

This is 100% caused by the fact that people "identify" as Y (not using X as a variable, as it is now a fucking confusing buzzword), and are subsequently grouped together in "echo rooms" by the various platforms' algorithms. This happened so overwhelmingly that, in less than a decade, it redefined the default behavior of people online, and you will now see people automatically seeking out those echo rooms. Even on Lemmy, people literally seek out instances that will validate their own beliefs, and block the ones they do not share.

[–] 7heo@lemmy.ml 1 points 9 months ago* (last edited 9 months ago)

TL;DR: stay away from Trustpilot; they are anything but trustworthy. The EU really has to provide such a platform itself, and not rely on a private corporation, if it wants to promote trust in the EU. In the meantime, we should really get a review-site aggregator, to spot the inconsistencies.

If I've learned anything recently, it's that Trustpilot is essentially an online "private security" firm at best, and an online "protection business" at worst. They abuse their market-dominant position by waiting for users to post damaging (however truthful) reviews of a business, and then "offering" said business an opportunity to "manage and display"[^1] its reviews (what is there to manage about reviews third parties left of your business, aside from removing them?) for a modest sum starting at €250 a month, more than doubling for every tier, and reaching undisclosed amounts for the "enterprise" offer.

However, they also do nothing against fake positive reviews (as evidenced by the sheer number of different websites offering them), and you can buy several dozen online for around €250.

I discovered all this recently, after noticing concerning patterns and running tests (with the means available to me). In the process, I discovered a very well rated (essentially 5 out of 5) company (which I won't name) that straight up lies about its entire offer, and merely sublets (without disclosing it) the offer of a much larger and much, much cheaper company, all the while offering broken basic features.
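The aggregator idea above could be as simple as comparing a business's scores across platforms and flagging large disagreements. A minimal sketch, where the site names, ratings, and threshold are all made up for illustration:

```python
from statistics import mean, pstdev

# Hypothetical scraped ratings (1-5 stars) for one business across review sites.
ratings = {
    "trustpilot": 4.9,
    "google": 3.1,
    "site_c": 2.8,
}

def inconsistency(scores):
    # A large spread across platforms is the red flag worth investigating:
    # one site disagreeing sharply with the others suggests manipulation.
    return pstdev(scores.values())

avg = mean(ratings.values())          # 3.6 overall
flagged = inconsistency(ratings) > 0.75  # arbitrary threshold
```

The point is not the exact statistic; any cross-site comparison would already surface the kind of inconsistency a single platform can quietly "manage" away.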

[^1]: taken from their website:

> Manage reviews for stores and branches
>
> Stand out in local search as you manage and display content on each of your sites

[–] 7heo@lemmy.ml -5 points 9 months ago* (last edited 9 months ago)

No budget was stated, and I'm not gonna assume you don't want a "good piece of hardware" just because you looked at something two orders of magnitude cheaper. If I had the cash, I would definitely get one (or more!) of those bad boys and run all my infra on them... In that case, I might still look at an additional SBC, just to plug into the IPMI interfaces and turn the machines on and off at will.

[–] 7heo@lemmy.ml -4 points 9 months ago* (last edited 9 months ago) (2 children)

Supermicro's latest H13 servers are good pieces of hardware. They can also run Jellyfin. For optimal longevity, I recommend a Supermicro AS-2025HS-TNR fitted with two EPYC 9654s, 12 DIMMs of 64GB DDR5, and 12 20TB HDDs.

So that would be my pick, with the stated requirements.

[–] 7heo@lemmy.ml 1 points 10 months ago

OK I'm officially too tired to actually contribute to Lemmy. I'll be on my way... 😭

[–] 7heo@lemmy.ml 1 points 10 months ago (2 children)

I don't see how a repair that causes the nose of a plane to "fall off" would not be considered a "bigger repair"...

I'm not saying that Boeing would be involved in replacing a tire on the landing gear. But something major enough to make the actual nose of the plane literally fall off? That sounds important enough to me.

[–] 7heo@lemmy.ml 4 points 10 months ago* (last edited 10 months ago) (1 children)

The thing is, intelligence is the capacity to create information that can be separately verified.

For this you need two abilities:

  1. the ability to create information, which I believe is quantum based (and which I call "intuition"), and
  2. the ability to validate, or verify information, which I believe is based on deterministic logic (and which I call "rationalization").

If you get the first without the second, you end up in a case we call "insanity", and if you have the second without the first, you are merely a computer.

Animals, for example, often have exemplary intuition, but very limited rationalization (which happens mostly empirically, not through deduction), and if they were humans, most would be "bat shit crazy".

My point is that computers have had the ability to rationalize since day one. But they have never had the ability to generate new data, which is a requirement for intuition. In fact, the same is true of random number generators, for the very same reasons. And in the exact same way that we have pseudorandom generators, in my view, LLMs are pseudointuitive. That is, close enough to the real thing to fool most humans, but distinguishably different to a formal system.
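The pseudorandom analogy can be made concrete: a seeded generator only replays a deterministic sequence, so it never creates information, however "random" the output looks. A minimal Python sketch:

```python
import random

# A pseudorandom generator is fully deterministic: the same seed
# always reproduces the exact same "random-looking" sequence.
def sample(seed, n=5):
    rng = random.Random(seed)
    return [rng.randint(0, 99) for _ in range(n)]

a = sample(42)
b = sample(42)
assert a == b  # identical runs: no new information was created
```

In the same sense, a model whose outputs are a deterministic (or merely sampled) function of its inputs and weights can look intuitive without generating anything genuinely new.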

As of right now, we have successfully created a technology that creates pseudointuitive data out of seemingly unrelated, real life, actually intuitive data. We still need to find a way to reliably apply rationalization to that data.

And until then, it is vitally important that we do not conflate our premature use of that technology with "the inability of computers to produce accurate results".

[–] 7heo@lemmy.ml 4 points 10 months ago* (last edited 10 months ago)

> if you are at the receiving end of a mistake made by either a classic algorithm or a machine learning algorithm, then you probably won't care whether it was the computer or the programmer making the mistake

I'm absolutely expecting corporations to get away with the argument that "they cannot be blamed for the outcome of a system that they neither control nor understand, and that is shown to work in X% of cases". Or at least to spend billions trying to.

And in case you think traceability doesn't matter anyway, think again.

IMHO it's crucial we defend the "computers don't make mistakes" fact for two reasons:

  1. Computers are defined as working through the flawless execution of rational logic. And somehow, I don't see a "broader" definition working in favor of the public (i.e. less waste, more fault-tolerant systems), but strictly in favor of mega corporations.
  2. If we let public opinion mix up "computers" with the LLMs that run on them, we will get even more restrictive, ultra-broad legislation against the general public. Think "3D printer ownership heavily restricted because some people printed guns with them", but on an unprecedented scale. All we will have left are smartphones, because we are not their owners.

[–] 7heo@lemmy.ml 0 points 10 months ago

Because of regulations, because of contracts, because of a myriad reasons I won't waste my time listing here.

The point is that they have been in business for over a century, and the aerospace industry is heavily regulated, so I somewhat expect them to have processes and responsibilities in place to make sure the planes are delivered according to, and remain within, their design specification.

And you don't strike me as someone who knows more than me (a total newbie) on the matter, so maybe we stop wasting each other's time on a pointless argument about shit that is absolutely beyond us both. Yeah?

[–] 7heo@lemmy.ml 3 points 10 months ago (4 children)

I thought there were specific "critical" operations that would require them (Delta, Boeing, or both) to record an entry in Boeing's Collaborative Manufacturing Execution Systems (CMES) database. But I'm new to this field, so I don't know whether, in this context, a distinction is made between before and after delivery, or whether normal plane maintenance is covered by the same processes. That's why I'm asking, not stating.

However, if one doesn't know more than I do, stating isn't any more correct.

[–] 7heo@lemmy.ml 22 points 10 months ago (6 children)

> We spent decades educating people that "computers don't make mistakes" and now you want them to accept that they do?

We filled them with shit, that's what. We don't even know how that shit works anymore.

Let's be honest here.

[–] 7heo@lemmy.ml 1 points 10 months ago (10 children)

Isn't Boeing QA supposed to inspect the plane and sign it off after maintenance?
