stevecrox

joined 10 months ago
[–] stevecrox@kbin.run 4 points 10 months ago (2 children)

I thought server-side anti-cheat was the most effective, since it can't be modified by clients and can track them for impossible behaviour.
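
For example, a minimal sketch of the kind of server-side plausibility check I mean (hypothetical names, not taken from any real engine): the server re-derives whether a reported move was physically possible instead of trusting the client.

```java
// Hypothetical sketch: the server re-checks client movement reports against
// physical limits instead of trusting the client.
public class MovementValidator {
    private static final double MAX_SPEED = 10.0; // assumed game speed limit, units per second
    private static final double TOLERANCE = 1.2;  // slack for lag / rubber-banding

    /** Returns true if the reported move is physically plausible. */
    public boolean isPlausible(double lastX, double lastY, long lastMillis,
                               double newX, double newY, long newMillis) {
        double dt = (newMillis - lastMillis) / 1000.0;
        if (dt <= 0) {
            return false; // time going backwards is itself suspicious
        }
        double distance = Math.hypot(newX - lastX, newY - lastY);
        return distance <= MAX_SPEED * dt * TOLERANCE;
    }
}
```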

[–] stevecrox@kbin.run 8 points 10 months ago (2 children)

I wish a company would build a 4.5"-5.5" and a 5.5"-6.5" flagship phone and pack as many features as make sense into each.

Then when you release a new flagship, the previous flagship devices become your 'mid range' and you drop the price accordingly, with the mid range dropping to budget the year after.

When Nokia had 15 different phones out at a time it made sense because they would be wildly different (size, shape, button layout, etc...).

These days everyone wants as large a screen as possible on a device that is comfortable to hold; we really don't need 15 different models with slightly different screen ratios.

[–] stevecrox@kbin.run 1 points 10 months ago* (last edited 10 months ago)

I actually researched my list: most of the technologies were used internally for years and were either publicly released after better public alternatives had been adopted, or the buzz reached me years after Google's first release. So I am wrong.

Between 2012 and 2015 I used to consult on Apache Ivy projects (ideally moving them to Maven and purging the insanity people had written). As a result I would get called in when projects had dependency issues.

The biggest culprits were Guava/Gson. Projects would often choose to use them (because Google) and then discover a bug that had been fixed in a later patch release (e.g. they used 2.2.1 and 2.2.2 had the fix). However, the reason they were on 2.2.1 was that a library they needed depended on it, and bumping the version usually caused things to break.

The standard solution was to ask 'why' they needed Guava/Gson, and every time you would find it was some function also found in one of the Apache Commons libraries. So I would pull in the Commons library and rewrite that bit (often they worked identically).
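
As a hypothetical illustration of the kind of swap (the exact functions varied per project): most of the Guava calls people actually relied on had near-identical equivalents in Commons Lang, so dropping the Guava dependency removed the patch-version conflict without changing behaviour.

```java
import java.util.List;
import org.apache.commons.lang3.StringUtils;
// import com.google.common.base.Joiner; // the Guava dependency being removed

public class JoinExample {
    public static void main(String[] args) {
        List<String> parts = List.of("alpha", "beta", "gamma");

        // What the project used via Guava:
        // String joined = Joiner.on(",").join(parts);

        // The Commons Lang equivalent, which behaves the same here
        // and avoids the Guava patch-version conflict entirely:
        String joined = StringUtils.join(parts, ",");
        System.out.println(joined); // alpha,beta,gamma
    }
}
```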

Fun side note: in 2016-2017 I got called in to consult on a lot of Gradle projects, to fix the same kind of convoluted bespoke things people did with Apache Ivy. The Ivy community knew those Gradle 'features' were a massive headache back in 2012 and told you to use Maven for exactly those reasons. C'est la vie.

We tried using Protobuf in 2008 and it was worse than Apache Axis for JSON conversion (which feels too harsh to say). Similarly, I had been using AMQP or Kafka for years and tried gRPC when it was released (Google says 2016 but I am sure we tried it in 2014), and it was worse on every metric; I still don't understand why it exists.

I was using Vaadin in 2011 and honestly thought GWT was released in 2012. I had to use GWT in 2014, and its workflow, compile time and look are just worse than Vaadin's.

[–] stevecrox@kbin.run 2 points 10 months ago* (last edited 10 months ago) (2 children)

The FAANG companies have an internal kind of elitism that would make staff less effective.

If you look at any Google Java library (GWT, Gson, Guava, Gradle, Protobuf, etc.) there was a commonly used open source library that existed years before and covered 90% of the functionality.

The Google staff just don't think to look outside Google (after all, if Google hasn't solved it, there's no chance outsiders have) and so write something entirely from scratch.

Then normally within 6 months the open source library has added the killer new feature. The Google library only persists because people hold FAANG in high regard ("It's by Google so it must be good!"), yet it normally has serious issues/limitations.

The Google libraries that actually succeeded weren't owned by Google (e.g. Yahoo wrote Hadoop, Kubernetes got spun away from Google's control, etc.).

[–] stevecrox@kbin.run 1 points 10 months ago (1 children)

I wouldn't use "certified" in this context.

Limiting support of software to specific software configurations makes sense.

It's stuff like: Debian might ship Python 3.8, Ubuntu Python 3.9, OpenSUSE Python 3.9, etc. Your application might use a library that needs Python 3.9 and act odd on 3.8 but fine on 3.9, so only supporting X distributions lets you keep the test/QA process sane.
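
A hypothetical Java analogue of the same problem (the point above is about Python, but the mechanics are identical): code built against a newer runtime API quietly breaks on a distro shipping an older runtime, so you only claim support for the runtimes you actually test.

```java
// Hypothetical sketch: refuse to run on runtimes we haven't tested,
// the same way a vendor only supports a short list of distributions.
public class SupportedRuntimeCheck {
    public static void main(String[] args) {
        int feature = Runtime.version().feature(); // e.g. 11, 17, 21

        if (feature < 11) {
            System.err.println("Unsupported Java runtime: " + feature
                    + " (tested on 11 and 17 only)");
            return;
        }

        // String.isBlank() only exists from Java 11; on an older runtime this
        // call would fail at runtime even though the code looks harmless.
        System.out.println("   ".isBlank());
    }
}
```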

This is also why Docker/Flatpak exist, since you can pin all of this down yourself.

However, the normal mix is RHEL/SUSE/Ubuntu, because those target businesses and your target market will most likely be running one of them.

[–] stevecrox@kbin.run 2 points 10 months ago (3 children)

I suspect they mean around packaging.

I honestly believe Red Hat has a policy that everything should pull in Gnome. I have had headless RHEL installs where half the CLI tools require Gnome Keyring (even if they don't deal with secrets or store any). Back in RHEL 7, Kate, the KDE-based text editor, somehow pulled in a bunch of GTK dependencies.

Certification really means someone paid to go through a process, and so it's designed so they pass.

Think about the people you know who are Agile/Cloud/whatever certified and how all it means is they have learnt the basic examples.

It's no different when a business gets certified.

The only reason people care is that they can point to the cert if it all goes wrong.

[–] stevecrox@kbin.run 1 points 10 months ago* (last edited 10 months ago) (1 children)

Debian isn't old == stable, it's tested == stable.

Debian has an effective rolling distribution through testing that can get ahead of Arch.

At some point they freeze the software versions in testing and look for release-critical and major bugs. Once they have shaken everything out and submitted fixes where possible, it then becomes stable.

The idea is people have tested a set baseline of software and there are no known major bugs.

For the last 4-5 releases Debian has released every 2 years (similar to Ubuntu LTS). Debian tends to align its release with LTS kernel and Mesa releases, so there have been times the latest stable was running newer versions than Ubuntu, and the 'newest software' crown switches between Ubuntu LTS and Debian each year.

For some, the priority is to run software that won't have major bugs; that is what Debian, Ubuntu LTS and RHEL offer.

[–] stevecrox@kbin.run 1 points 10 months ago (1 children)

I switched my computer-illiterate family members over to reduce the effort of helping them, and they didn't notice.

As a helper..

There are distributions focussed on the latest and greatest (Arch, Fedora, etc..) and ones aiming for stability (Debian, Ubuntu, etc..). Think of them as groups with different views.

So Linux Mint is Ubuntu but it has the latest Cinnamon desktop. Ubuntu is Debian but focused on fixed releases and adds 'snaps' and includes "non-free" by default.

People have different views on how the desktop should work. The two big desktops are Gnome and KDE.

Gnome is like Marmite. It works completely differently to any other desktop and people either love it or loathe it. It's often the distribution default.

With Windows 10/11 I think Microsoft was trying to steal some of KDE's best features. By default KDE looks very much like a Windows desktop, but lots of people mod it to look/act like macOS. Some people struggle with the options it provides.

Then there are lots of other desktops, for example Cinnamon takes Gnome and turns it into a normal desktop.

Personally I would suggest Kubuntu as your first attempt. This is a fairly decent install guide.

Ubuntu tries to minimise the choices you need to make and the 6 month update cycle keeps it fairly stable.

Kubuntu is Ubuntu it just makes KDE the default instead of Gnome.
