LeFantome

joined 1 year ago
[–] LeFantome@programming.dev 1 points 4 months ago (1 children)

Don’t get me wrong, I am very excited by the possibility of having an independent browser engine. Firefox was that. Mozilla as an organization has compromised that. I hope Ladybird can avoid the same issues.

The rationale for SerenityOS beyond therapy and fun was precisely the “being responsible for everything ourselves” aspect of the project. Andreas has previously described it as a form of accountability. He has also described it as a kind of freedom in contrast to the Linux model with its inevitable politics, bike-shedding, and inefficiency as you try to get dozens of projects to agree on direction and merge code. Ultimately, the mono-repo allowed the project to do things right in the right place in the stack ( in the code ). It allowed the project to move quickly, to avoid legacy, and to continuously improve and modernize. It allowed harmony across the entire developer AND user experience.

Linux is famously fragmented. There is no open source desktop OS that provides “whole system” design and user experience harmony. Haiku comes close but it has never gotten much native app momentum. On Haiku, you are going to run the Falkon browser, use GIMP, and run LibreOffice. None of those are developed in concert with the OS.

SerenityOS held the promise of Ladybird, and Hack Studio, and the userland utilities ( and everything else ) all being developed in concert. The same GUI toolkit was used for everything. The same standard library. The same error handling and code level security measures.

This was the “need or use case”. Not anymore.

And it is not just SerenityOS. Ladybird is not as independent as it was, and not just because of the sponsorships but because of the code. SerenityOS is no longer a dependency, but 8 other projects now are. No more mono-repo, and goodbye to all the reasons it was considered a good thing.

[–] LeFantome@programming.dev 1 points 4 months ago

Well, that is like, your opinion man.

Ok. Obviously different licenses are useful in different circumstances. So, what you are saying is clearly true.

That said, even though the MIT license is, I believe, already the most used license, I wish MIT were used more and the GPL less.

I do not want to create or get drawn into a debate ( because we likely have the same facts and just disagree ) but what I dislike about the GPL is that it does not respect freedom, specifically developer freedom. It constrains freedom and hopes that what it calls “the 4 freedoms” emerge as a side effect. In my view, the GPL restricts freedom to bestow rights ( a net negative for freedom ).

My opinion is no more valuable than yours. We do not have to convince each other. I am just explaining my view.

Don’t get me wrong, the ability of the original author to choose the GPL is something I totally support. It is a totally valid constraint to place on people that want to use your code. A developer should get to choose the terms under which people can use their code. It is exactly this freedom that the GPL restricts. Again, I think this is totally ok ( as would be demanding money ) but it is certainly a restriction which, by definition, is not freedom.

[–] LeFantome@programming.dev 1 points 4 months ago (1 children)

I think you may have made the same mistake that I did. Freeciv and DOOM are mid-90’s but Tux Racer did not appear until 2000 and Quake 3 did not come to Linux until even later.

What I remember as 90’s Linux gaming is probably from the early 2000’s.

Check out the “Top 10” at the end of 2000:

https://edition.cnn.com/2000/TECH/computing/12/20/linux.games.idg/index.html

[–] LeFantome@programming.dev 1 points 4 months ago* (last edited 4 months ago)

All the id Software stuff worked ( e.g. DOOM, Quake ) but outside of them and Loki, there was not a lot of commercial stuff. And I don’t think the Loki catalogue was very current or extensive.

There were lots of Linux native games but they were much more primitive ( though not necessarily less fun ) like Tux Racer and Pingus. There were also adventure games.

There was also a thriving game engine “clone” scene, especially for Blizzard stuff. Not all of it ever really got there in terms of features or quality. These engines were designed to work with the “assets” from actual commercial games. There were Wargus for Warcraft II and Stargus for StarCraft, for example. There is DevilutionX for Diablo, which is great. Often there were fully open source games built off these same engines ( Bos Wars? ).

By the standards of today, the Linux gaming scene would have seemed pretty shitty. You were not playing the same AAA titles as your Windows friends. However, if you were a Linux enthusiast, there were plenty of really fun options to keep you entertained.

I think Linux has always been a bit better off than Mac with regards to gaming.

[ Edit: Memory correction - DevilutionX is way more recent. Even Stargus did not appear until 1998, and neither did Pingus. Tux Racer was not out until 2000. Loki was a 1999 thing too. So, my comments above are perhaps more valid for 1998 - 2005 than 1991 - 1998. DOOM came to Linux in 1994 and Quake in 1996. ]

[–] LeFantome@programming.dev 1 points 4 months ago (1 children)

Not an American but, other than the odd actual Linux distribution ( like Red Hat ), I do not think I ever saw boxed Linux software for sale. That sounds amazing.

I mean, you could order things like WordPerfect I guess. But I never saw it on a shelf.

[–] LeFantome@programming.dev 4 points 4 months ago* (last edited 4 months ago)

Well, XFree86 ( before Xorg and before KMS ) was an adventure. I spent hours guessing the vertical and horizontal frequencies of my monitor trying to get decent resolutions.

Other than that, Linux was way more work but “felt” powerful relative to the other OS options of the time. Windows was still crashy. The five of us that used OS/2 hated that it still had a lot of 16 bit code under the hood. Linux was pure 32 bit.

Later in the 90’s, you could run a handful of Windows apps on Linux and they seemed to run better on Linux. For example, file system operations were dramatically faster.

And Linux was improving incredibly rapidly so it felt inevitable that it would outpace everything else.

The reality though was that it was super limited and a pain in the ass. “Normal” people would never have put up with it. It did not run anything you wanted it to ( if you had apps you liked on Mac, Windows, OS/2, Amiga, NeXTstep, BeOS, or whatever else you were using; there were lots of potential options at the time ). But even for the pure UNIX and POSIX stuff, it was hard.

Obviously installation was technical and complex. And everything was a hodge-podge of independently developed software. “Usability” was not a thing. Ubuntu was not released until 2004.

Linux back then was a lot of hitting FTP sites to download software that you would build yourself from source. Stuff could be anywhere on the Internet and your connection was probably slow. And it was dependency hell so you would be building a lot of software just to be able to build the software you want. And there was a decent chance that applications would disagree about what dependencies they needed ( like versions ). Or the config files would be expected in a different location. Or the build system could not find the required libraries because they were not where the Makefile was looking for them.
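The dependency chase described above is essentially a graph-walking problem, and it is exactly what package managers later automated. Here is a toy sketch of it; the package names and the `needs` graph are made up for illustration, not any real 90’s software:

```python
# Toy model of 90's-style dependency hell: before you could build "app"
# from source, you had to track down and build everything it depended on,
# in the right order, by hand. A package manager automates this walk.

# Hypothetical dependency graph: package -> packages it needs built first.
needs = {
    "app":    ["libgui", "libnet"],
    "libgui": ["libc"],
    "libnet": ["libc"],
    "libc":   [],
}

def build_order(target, needs, done=None, order=None):
    """Return the order in which sources must be built, dependencies first."""
    if done is None:
        done, order = set(), []
    for dep in needs[target]:
        if dep not in done:
            build_order(dep, needs, done, order)
    if target not in done:
        done.add(target)
        order.append(target)
    return order

print(build_order("app", needs))  # every dependency comes before "app"
```

In the 90’s, you were the resolver: each entry in that list meant another FTP hunt and another `make`, and a version mismatch anywhere in the graph meant starting over.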

Linux in the 90’s had no package management. This is maybe the biggest difference between Linux then and Linux now. When package management finally arrived, it came in two stages. First came packages, but you were still grabbing them individually from FTP. Second came the package manager, which handled dependencies and retrieval.

The most popular Linux in the mid to late 90’s was Red Hat. This was before RHEL and before Fedora. There was just “Red Hat Linux”. Red Hat featured RPMs ( packages ) but you were still installing them and any required dependencies yourself at the command line. YUM ( the precursor to DNF ) was not added until Fedora Core 1 was released in 2003!

APT ( apt-get ) was not added to Debian until 1998.

And all of this meant that every Linux system ( not distro — individual computer ) was a unique snowflake. No two were alike. So bundling binary software to work on “Linux” was a real horror-show. People like Loki gave it a good run but I cannot imagine the pain they went through. To make matters worse, the Linux “community” was almost entirely people that had self-selected to give up pre-packaged software and to trade sweat-equity for free software. Getting large numbers of people to give you money for software was hard. I mean, as far as we have come, that is still harder on Linux than on Windows or macOS.

You can download early Debian or Red Hat distros today if you want to experience it for yourself. That said, even the world of hardware has changed. You will probably not be wrestling IRQs to get sound or networking running on modern hardware or in a VM. Your BIOS will probably not be buggy. You will have VESA at least and not be stuck on VGA. But all of that was just “computing” in the 90’s and the Windows crowd had the same problems.

One 90’s hardware quirk was “Windows” printers and modems ( WinPrinters and WinModems ), where half the device logic was implemented in the Windows driver. The hardware was too limited or too dumb to work on its own, and to save money your computer would do some of the work. Good luck getting Linux support for those though.

Even without trying old distros, just try to go a few days on your current Linux distro without using apt, dnf, pacman, zypper, the GUI app store, or what have you. Imagine never being able to use those tools again. What would that be like?

Finally, on my much, much slower 90’s PC, I compiled my own kernel all the time. Honestly multiple times per month I would guess. Compiling new kernels was a significant fraction of where my computing resources went at the time. I cannot remember the last time I compiled a kernel.

It was a different world.

When Linus from LTT tried Linux not that long ago ( he wanted to game ), he commented that he felt like he was playing “with” his computer instead of playing “on” his computer. That comment still describes Linux to some extent but it really, really captures Linux in the 90’s.

[–] LeFantome@programming.dev 7 points 4 months ago

The Caja file manager from MATE has a spatial mode. I think ROX-filer does as well.

[–] LeFantome@programming.dev 1 points 4 months ago

I used a system with 6 GB daily until not long ago. I had to constantly restart my web browsers to reclaim memory. RAM was a constant issue. A 32 bit distro made things a lot better.

[–] LeFantome@programming.dev 3 points 4 months ago (3 children)

Your biggest problem is going to be the 4 GB of RAM. Saving a few hundred megs on the DE will help but not much. If you run a web browser ( and I cannot imagine using a computer without one ) that RAM is going to fill up fast.

Honestly, I would use a 32 bit distro on that hardware.

Q4OS with Trinity, Antix, Adelie, and DSL are all pretty decent options.

[–] LeFantome@programming.dev 8 points 4 months ago (1 children)

If you don’t need a full desktop environment, check out IceWM.

I recently checked out Trinity ( essentially KDE 3 modernized ) and was surprised how decent it was. I used it in Q4OS but it may be available in your distro.

[–] LeFantome@programming.dev 6 points 4 months ago (1 children)

Does OpenBSD really default to FVWM in 2024? Metal.

[–] LeFantome@programming.dev 4 points 4 months ago

Many distros customize the colour schemes and theming of their desktops. The out-of-the-box XFCE in EOS looks nothing at all like vanilla XFCE for example.
