Bethesda has said that they aren't going to do one until after the next Elder Scrolls game, so if anything in the Fallout world is going to come out on any kind of a near-term schedule, it's going to have to be via someone with available bandwidth licensing it.
If you ever want a great collection of more fun stories from early Apple days, check out folklore.org. It's earlier than this stuff, but covers a lot of shenanigans.
In August 1993, the project was canceled. A year of my work evaporated, my contract ended, and I was unemployed.
I was frustrated by all the wasted effort, so I decided to uncancel my small part of the project. I had been paid to do a job, and I wanted to finish it. My electronic badge still opened Apple's doors, so I just kept showing up.
I asked my friend Greg Robbins to help me. His contract in another division at Apple had just ended, so he told his manager that he would start reporting to me. She didn't ask who I was and let him keep his office and badge. In turn, I told people that I was reporting to him. Since that left no managers in the loop, we had no meetings and could be extremely productive.
They created a pretty handy app that was bundled with the base OS, and which I remember having fun using. So it's probably just as well that Apple didn't hassle them. But in all seriousness, that's not the most amazing building security ever.
reads further
Hah!
We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.
And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.
At least some Dell laptops authenticate to the charger so that only "authentic Dell chargers" can charge the battery, though they'll run off third-party chargers without charging the battery.
Unfortunately, it's a common problem -- and I've seen this myself -- for the authentication pin on an "authentic Dell charger" to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.
I bet the charger on yours is a barrel charger with that pin down the middle.
hits Amazon
Yeah, looks like it.
https://www.amazon.com/dp/B086VYSZVL?psc=1
I don't have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.
If you want to keep using that laptop and want to use the battery, I'd try swapping out the charger. If you don't have an official Dell charger, make sure that the one you get is one of those (unless some "universal charger" has managed to break their authentication scheme in the intervening years; I haven't been following things).
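Just to make the failure mode concrete, here's a rough sketch of the behavior -- purely an illustrative model, not Dell's actual protocol (I believe the real handshake reads an ID chip through that center pin):

```python
# Illustrative model only, not Dell's real handshake. The point is the policy:
# if the center-pin identification fails (bent pin, third-party charger), the
# laptop still runs on wall power but refuses to charge the battery.

def power_policy(wall_power_present, charger_id=None):
    """charger_id is None when the center-pin handshake fails; otherwise it's
    whatever the ID chip reports (the "DELL" prefix here is made up)."""
    authenticated = charger_id is not None and charger_id.startswith("DELL")
    return {
        "run_from_wall_power": wall_power_present,                # works either way
        "charge_battery": wall_power_present and authenticated,   # needs a good handshake
    }

print(power_policy(True, "DELL-65W"))   # charges normally
print(power_policy(True, None))         # runs, but the battery won't charge
```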
EDIT: Even one of the top reviews on that Amazon page mentions it:
I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good...
Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software just ran twice as quickly after every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.
In that environment, it was quite important to upgrade the CPU.
But that hasn't been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.
This is about ten years old now:
https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/
Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.
Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.
If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.
We can also look at the roughly twelve years since then, over which things have been even slower:
https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K
This is using a benchmark to compare the single-threaded performance of the i7-4960X (Intel's high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed to get single-threaded performance about (5068/2070) ≈ 2.45 times that of the 12-year-old processor. That's (5068/2070)^(1/12) ≈ 1.077, about a 7.7% performance improvement per year. The age of a processor doesn't matter nearly as much in that environment.
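If you want to check the arithmetic, here's a quick sketch -- the 2070 and 5068 single-thread scores are from the cpubenchmark comparison linked above, and the 4.6x-over-eight-years figure is from the preshing article:

```python
# Quick check of the annualized single-thread improvement figures above.

def annualized(total_ratio, years):
    """Convert a total performance ratio over `years` into a per-year rate."""
    return total_ratio ** (1 / years) - 1

# preshing.com figure: ~4.6x from January 2004 to January 2012
print(f"2004-2012: {annualized(4.6, 8):.1%} per year")           # ~21% per year

# cpubenchmark single-thread scores: i7-4960X ~2070, Ultra 9 285K ~5068
print(f"2013-2025: {annualized(5068 / 2070, 12):.1%} per year")  # ~7.7% per year
```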
We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn't a "free" performance improvement -- software needs to be rewritten to take advantage of it, many problems are hard to parallelize, and some can't be parallelized at all.
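As a toy illustration of why it isn't free (a hypothetical example, not tied to any particular program): the serial version below only gets faster if a single core gets faster, while getting anything out of extra cores means restructuring the work yourself -- and that only helps when the pieces are independent.

```python
# Toy example: a serial loop versus the same work explicitly spread across cores.
from concurrent.futures import ProcessPoolExecutor

def expensive(n):
    # stand-in for some CPU-heavy, independent piece of work
    return sum(i * i for i in range(n))

def serial(jobs):
    # speeds up only when single-thread performance improves
    return [expensive(n) for n in jobs]

def parallel(jobs):
    # had to be restructured to use multiple cores at all
    with ProcessPoolExecutor() as pool:
        return list(pool.map(expensive, jobs))

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    assert serial(jobs) == parallel(jobs)
    # Something inherently sequential -- e.g. x = f(x) iterated N times, where
    # each step needs the previous result -- can't be split up this way at all.
```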
Honestly, I'd say that the most-noticeable shift is away from rotational drives to SSDs -- there are tasks for which SSDs can greatly outperform rotational drives.
One problem, I think, is that if you have a lot of assets invested in a particular game style, then it's costly to revise the game.
I remember that happening with the original Halo, where the game was massively revised across different genres during development. But I think that in general, once you've made the assets, it's increasingly painful to dramatically change the game.
I've also heard complaints that AAA studios are "risk-averse" -- but, honestly, I'd be kind of cautious about gambling a lot of asset money on an unproven game too.
Whereas game genres that are extremely asset-light, like traditional roguelikes, often have pretty polished gameplay -- the developers can cheaply iterate on the gameplay, because they don't have to throw out much asset work.
A lot of indie games today kind of fall into this camp, doing stuff like low-res pixel art to save on asset costs.
One thing I've kind of wondered about is whether more of the video game industry should look like a two-phase affair. You have games made on relatively small asset budgets, kind of like indie games. Some fail, some succeed.
But then when one is really successful, it becomes common for a studio that specializes in AAA titles to acquire it and do a high-production-value version of the game. That de-risks the game somewhat, since the AAA studio knows that it has a game with popular gameplay, and specializes in churning out a really high-value form.
Now, okay. That doesn't work with all genres. Some genres, like adventure games, you only really play once. Some games don't do very well on the low-asset side -- it's hard to create an open-world FPS game on a budget.
But there have been a lot of times that I've purchased a low-asset-cost game that I really like and then thought "I wish that there was more stuff on the asset side", that I could go and pay more and get it.
Like, for those low-res pixel art games, I'd like to have the ability to get full-res art. I'd often like a fuller soundtrack. I've played a few games that have had outstanding voice acting, like Logan Cunningham in Transistor or Ron Perlman in Fallout: New Vegas, and I think that you could usually take many existing games, go back and add good voice acting, and make the experience a lot better. A lot of 3D games could take more-extensive modeling and texturing.
Yeah, some old games get remakes to take advantage of new technology, and sometimes they get fancier assets when that happens, but this isn't that -- I'm talking about taking a popular, relatively-current game with a limited asset budget and giving it a high-budget makeover.
Hah, clever name!
It might be too technical for some, but Linux Weekly News (lwn.net) has been a long-running source of articles -- dating back to the 1990s, if memory serves -- that's kind of in that ballpark.
EDIT: Hmm. It looks like at some point, some of their articles went subscriber-only, though.
I've kind of felt the same way; I'd rather have a somewhat stronger focus on technology in this community.
The current top few pages of posts are pretty much all just talking about drama at social media companies, which frankly isn't really what I think of as technology.
That being said, "technology" kind of runs the gamut in various news sources. I've often seen "technology news" basically amount to promoting new consumer gadgets, which isn't exactly what I'd like to see from the thing, either. I don't really want to see leaked photos of whatever the latest Android tablet from Lenovo or whatever is either.
I'd be more interested in reading about technological advances and changes.
I suppose that if someone wants to start a more-focused community, I'd also be willing to join that and give it a shot.
EDIT: I'd note that the current content here kind of mirrors what's on Reddit at /r/Technology, which is also basically drama at social media companies. I suppose that there's probably interest from some in that. It's just not really what I'm primarily looking for.
It feels slow for me -- though maybe that's because I'm checking on a cell link -- but it seems to be reachable now, at least.
I think that California should take keeping itself competitive as a tech center more seriously. I think that a lot of what has made California competitive for tech is that it had tech from earlier, and that past a certain threshold, it becomes advantageous to start more companies in an area -- you have a pool of employees and investors and such. But what matters is having a sufficiently large pool, and if you let that advantage erode enough, your edge goes away.
We were just talking about high California electricity prices, for example. A number of datacenters have shifted out of California because the cost of electricity is a significant input. Now, okay -- you don't have to be right on top of your datacenters to be doing tech work. You can run a Silicon Valley-based company that has its hardware in Washington state, but it's one more factor that makes it less appealing to be located in California.
The electricity price issue came up a lot back when people were talking about Bitcoin mining more, since there weren't a whole lot of inputs and it's otherwise pretty location-agnostic.
https://www.cnbc.com/2021/09/30/this-map-shows-the-best-us-states-to-mine-for-bitcoin.html
In California and Connecticut, electricity costs 18 to 19 cents per kilowatt hour, more than double that in Texas, Wyoming, Washington, and Kentucky, according to the Global Energy Institute.
(Prices are higher everywhere now, since this was before the COVID-19-era inflation, but California remains expensive electricity-wise.)
I think that there is a certain chunk of California that is kind of under the impression that the tech industry in California is a magic cash cow that is always going to be there, no matter what California does, and I think that that's kind of a cavalier approach to take.
EDIT: COVID-19's shift to remote work also did a lot to hurt California here, since a lot of people decided "if I don't have to pay California cost-of-living and can still keep the same job, why should I pay those costs?" and just moved out of state. If you look at COVID-19-era population-change data for the counties around the San Francisco Bay Area, there was a pretty remarkable drop.
https://www.apricitas.io/p/california-is-losing-tech-jobs
California is Losing Tech Jobs
The Golden State Used to Dominate Tech Employment—But Its Share of Total US Tech Jobs has Now Fallen to the Lowest Level in a Decade
Nevertheless, many of the tech industry’s traditional hubs have indeed suffered significantly since the onset of the tech-cession—and nowhere more so than California. As the home of Silicon Valley, the state represented roughly 30% of total US tech sector output and got roughly 10% of its statewide GDP from the tech industry in 2021. However, the Golden State has been bleeding tech jobs over the last year and a half—since August 2022, California has lost 21k jobs in computer systems design & related, 15k in streaming & social networks, 11k in software publishing, and 7k in web search & related—while gaining less than 1k in computing infrastructure & data processing. Since the beginning of COVID, California has added a sum total of only 6k jobs in the tech industry—compared to roughly 570k across the rest of the United States.
For California, the loss of tech jobs represents a major drag on the state’s economy, a driver of acute budgetary problems, and an upending of housing market dynamics—but most importantly, it represents a squandering of many of the opportunities the industry afforded the state throughout the 2010s.
Musk lost a lot of money on his last social media company purchase, Twitter, after spending some time in court trying to abort the purchase. I'm not at all sure that he wants to buy another social media company.