this post was submitted on 02 Mar 2024
61 points (90.7% liked)

Lately, I've been going through the blog of a math professor whose class I took at a community college back when I was in high school. Having gone the path I did in life, I looked into his credentials and found that he had completed a computer science degree sometime in the 1970s. He had a curmudgeonly, standoffish personality, and his IT skills were nonexistent back when I was his student.

It's fascinating to see his perspective on computing, and how many of the things I learned in my undergraduate studies were already being taught as far back as the 1950s. It also seems like the computer science degree used to be more intertwined with its electrical engineering fraternal twin.

Although the title of this post is inherently provocative, I'm curious to hear from those of you who did computer science, electrical engineering, or similar technical degrees in decades past. Are there topics or subjects that have been phased out over the years that you think leave younger programmers/engineers ill-equipped today? What common practices were you happy to see thrown in the dumpster and kicked away forever?

The community also seems like it was significantly smaller and more interconnected back then. Was nepotism as prevalent in the technology industry then as it is today?

This is just the start of a discussion, please feel free to share your thoughts!

[–] mdhughes@lemmy.ml 5 points 8 months ago (1 children)

In the good old days, you had to learn assembly/machine language, C, and OS-level programming to get anything done. Even if you mostly worked on applications, you'd drop down and do something useful; at the time, that meant writing machine language routines to call from BASIC. It's still a practical skill: for instance, I mostly work in Scheme, but use the C FFI to hook into native functionality, and debug in lldb.
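
To make that concrete, here's a minimal sketch of the native half of such a setup (the `clamp` routine, file names, and build line are illustrative, not the commenter's actual code): a small C function compiled into a shared library, which a Scheme like Guile or Chez can then bind through its C FFI.

```c
/* clamp.c -- a tiny native routine exposed through the C FFI.
 * Build as a shared library:
 *   cc -shared -fPIC -o libclamp.so clamp.c
 * From Guile, for example, it could then be bound with:
 *   (pointer->procedure double
 *     (dynamic-func "clamp" (dynamic-link "libclamp"))
 *     (list double double double))
 */
double clamp(double x, double lo, double hi)
{
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}
```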

Computer Science is supposed to be more math than practical training, though when I took it we also did low-level graphics (BIOS calls & framebuffers), OS implementation, and other useful skills. These days almost all CS courses are job training, with no theory and no implementation.
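
For the curious, here's roughly what the framebuffer half of that coursework reduces to on a modern Linux box (a sketch, not the original coursework; it assumes a 32-bit-per-pixel display on the standard /dev/fb0 device): map the framebuffer and write a pixel, the direct descendant of poking VGA memory.

```c
/* fbpixel.c -- draw one white pixel via the Linux framebuffer device.
 * Run from a text console (not under X/Wayland), typically as root.
 */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) { perror("open /dev/fb0"); return 1; }

    struct fb_var_screeninfo var;  /* resolution, bits per pixel */
    struct fb_fix_screeninfo fix;  /* bytes per scanline */
    if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
        ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0) {
        perror("ioctl");
        return 1;
    }

    /* Map the whole framebuffer into our address space. */
    size_t len = (size_t)fix.line_length * var.yres;
    uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) { perror("mmap"); return 1; }

    /* One white pixel dead center (assumes 32 bits per pixel). */
    uint32_t *row = (uint32_t *)(fb + (var.yres / 2) * fix.line_length);
    row[var.xres / 2] = 0x00FFFFFF;

    munmap(fb, len);
    close(fd);
    return 0;
}
```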

Younger programmers typically have no experience below the application language (Java, C#, Python, PHP) they work in, and only those with extensive CS degrees will ever see a C compiler. Even the shell, filesystems, and simple toolchains like Make are lost arts.
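
On the Make point, the lost art is tiny. A minimal Makefile (the `hello` names and flags are illustrative) that recompiles a C program only when its source is newer than the binary; note the recipe line must begin with a tab:

```make
# Rebuild `hello` only when hello.c has changed.
# $@ is the target, $< is the first prerequisite.
CC     = cc
CFLAGS = -Wall -O2

hello: hello.c
	$(CC) $(CFLAGS) -o $@ $<
```

Running `make` twice in a row shows the point: the second run does nothing, because the target is already up to date.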

The MIT Missing Semester covers some of the mid-to-high levels of that, but there's no real training at the levels from digital logic up to the OS.

[–] Luftruessel@lemmy.world 2 points 8 months ago

I will totally check this out, thanks for the reference!