this post was submitted on 16 Aug 2024

Linux
[–] theshatterstone54@feddit.uk 56 points 3 months ago (9 children)

As others have said, discrete math is one of the obvious missing pieces. My uni also has C as the first language students learn as part of their degree, and follows up with Java and Haskell to teach students about OOP and FP as paradigms. It's useful to have something like C so students can learn about memory management. I'm also not seeing anything on Networking or Cyber Security (aside from Cryptography), which my university also taught.

[–] dubyakay@lemmy.ca -2 points 3 months ago (2 children)

Why is it important in this day and age to learn about memory management? That's like saying it's important to learn cursive, when it really isn't.

[–] mysteryname101@lemmy.world 7 points 3 months ago (2 children)

Embedded. I’m currently writing software with 96 bytes of RAM. My next project I get to splurge and have 8kB of RAM and 32k of Flash.

I’m more scared with how badly I’ll handle/manage the 8k of RAM.
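At budgets like that you don't malloc at all: every buffer is a static array, so worst-case RAM use is known at link time. A rough sketch (the 8-entry ring buffer is just an illustration, not my actual project):

```c
#include <stdint.h>

/* No heap on a part this small: static buffers only, so the
 * worst-case memory footprint is fixed at link time. */
#define LOG_SIZE 8u

static uint8_t samples[LOG_SIZE];   /* 8 bytes, fixed at link time */
static uint8_t head;                /* index of the next slot to write */

void log_sample(uint8_t v) {
    samples[head] = v;
    head = (uint8_t)((head + 1u) % LOG_SIZE);   /* wrap around */
}

uint8_t latest_sample(void) {
    return samples[(head + LOG_SIZE - 1u) % LOG_SIZE];
}
```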

[–] SkunkWorkz@lemmy.world 6 points 3 months ago

Also anywhere a GC is just too slow. Like in video game engines.

[–] ChairmanMeow@programming.dev 1 points 3 months ago (1 children)

That's a very specific use case though, one that the majority of programmers will likely never have to face.

[–] gamma@programming.dev 7 points 3 months ago* (last edited 3 months ago)

Taking courses which involve subjects that you will likely never encounter in the workforce is a thing in every discipline. Most engineers don't need to manually solve differential equations in their day jobs, they just need to know that they exist and will often require numerical solutions.

Getting your hands dirty with the content provides a better understanding when dealing with higher level concepts.

[–] Suppoze@beehaw.org 5 points 3 months ago (1 children)

I think it's more important than ever. Software is getting slower and slower, and the bloat is ridiculous. Imho this is because we just work with abstractions over abstractions, ignorant of how it will be computed on a real machine. I think a more appropriate metaphor would be that you can speak and understand a language (program) while being illiterate at the same time (not having a fundamental understanding of how a computer works). Of course this is an exaggeration; you don't need to know this stuff to be an adequate programmer, I think.

[–] Syulang@aus.social 1 points 3 months ago

@Suppoze @dubyakay one thing I liked about programming on Atari 8-bit machines was that your code could, and was expected to, hit the hardware directly. It was assumed the programmer understood the nature of the hardware and would directly "talk" to it to get it to perform their task. This made coding very efficient. Not a single CPU cycle or byte of RAM was wasted. A program that analysed data from multiple environmental sensors, tabulated, averaged and plotted the results and sent them to a chart plotter would run comfortably in 16 KB of RAM.

My phone takes a thousand times that to fail to open my emails.
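In C terms that direct style is just a store through a volatile pointer at a fixed address. A sketch (a plain variable stands in for the register here so it runs anywhere; on the real machine you'd point at an actual chip register, e.g. GTIA's background colour register):

```c
#include <stdint.h>

/* Stand-in for a memory-mapped hardware register, so this sketch
 * compiles and runs on any machine. */
volatile uint8_t fake_reg;

/* On target hardware this would instead be a fixed address, roughly:
 *   #define COLBK (*(volatile uint8_t *)0xD01A)   -- illustrative
 * volatile tells the compiler every store must actually happen. */
#define COLBK fake_reg

void set_background(uint8_t colour) {
    COLBK = colour;     /* one store, and the hardware reacts */
}
```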
