addie

joined 1 year ago
[–] addie@feddit.uk 2 points 1 month ago (7 children)

Ah, nice. Had been experimenting with using my Raspberry Pi 3B as my home Git server for all my personal projects - easy sync between my laptop and desktop, and another backup for the stuff that I'd been working on.

Tried running Gitea on it to start with, but it's a bit too heavy for a device like that. Forgejo runs perfectly, and has almost exactly the same "very GitHub-inspired" interface. Time to run some updates...
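If anyone fancies doing the same, the Docker route is probably the least fiddly. A minimal sketch, assuming a 64-bit OS on the Pi - the image tag, ports and data path are illustrative, so check the Forgejo docs for current values:

```sh
# Official image, published on Codeberg; pin a proper release tag in practice.
docker run -d --name forgejo \
  -v /srv/forgejo:/data \
  -p 3000:3000 \
  -p 2222:22 \
  codeberg.org/forgejo/forgejo:11
# 3000 is the web UI; mapping 2222 -> 22 keeps SSH clone/push off the host's own sshd.
```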

[–] addie@feddit.uk 2 points 2 months ago

Annoys me that "less" is always correct, which makes "fewer" completely redundant, and yet "fewer" is a short word that could be valuable in conversation if opened up and reused for something everyday that currently has a long name.

"Before I leave the house, I always check that I've got my keys, phone, and fure in my pockets."

[–] addie@feddit.uk 10 points 2 months ago (1 children)

We've found it to be the "least bad option" for DnD. Have a Discord window open for everyone to video chat in, have a browser window open with Owlbear Rodeo or Foundry / Forge for your tokens and character sheets, and it all works smoothly enough. The text chat is sufficient for sending the DM a private message, and the group chat works for sharing art of the things you've just run into or organising the next session.

Completely agree that for anything "less transient" the UX is beyond awful, and trying to find anything historical is a massive PITA.

[–] addie@feddit.uk 20 points 2 months ago (1 children)

Should have used Vim instead, that's a real text editor. No-one who starts using it ever moves on to something else.

[–] addie@feddit.uk 14 points 3 months ago (2 children)

I quite liked all the people complaining about 'unrealistic' Russian accents, when every single character in the game is voiced by a native speaker. Many lols. Bit like people complaining about Yvonne Strahovski's 'phoney' Australian accent when playing Miranda in Mass Effect.

[–] addie@feddit.uk 2 points 3 months ago

Yeah, it's always had really strong art direction - still holds up, and you don't notice missing shadows so much in the middle of a frenetic sequence anyway.

Good to see ray tracing coming along. You could get the same shadows and lighting as the RTX version demonstrates in a modern rasterising engine now, but at the cost of much more development time. It would be great to see graphics like that become available to smaller studios, and larger games become feasible for bigger ones. HL2 is massive compared to modern shooters, and not having to spend forever tweaking each scene helps with that.

[–] addie@feddit.uk 11 points 3 months ago (6 children)

When I was still dual-booting Windows and Linux, I found that "raw disk" mode virtual machines worked wonders. I used VirtualBox, so you'd want a guide somewhat like this: https://superuser.com/questions/495025/use-physical-harddisk-in-virtual-box - other VM solutions are available, which don't require you to accept an agreement with Oracle.

Essentially, rather than setting aside a file on disk as your VM's disk, you can set aside a whole existing disk. That can be a disk that already has Windows installed on it; it doesn't erase what you have. Then you can start Windows in a VM and let it do its updates - since it can't see the bootloader from within the VM, it can't fuck it up. You can run any software that doesn't have particularly high graphics requirements, too.
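The one-off setup from that guide boils down to a couple of commands - assuming VirtualBox, and with /dev/sdb standing in for whichever disk actually holds your Windows install:

```sh
# Your user needs read/write access to the raw device;
# on Debian-likes, membership of the disk group provides that.
sudo usermod -aG disk "$USER"   # log out and in again for it to take effect

# Create a tiny .vmdk wrapper that points at the physical disk.
VBoxManage internalcommands createrawvmdk \
  -filename ~/win-raw.vmdk -rawdisk /dev/sdb
```

Then attach win-raw.vmdk to the VM as its hard disk like any ordinary disk image.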

I was also able to just "restart in Windows" if I wanted full performance for a game or something like that, but since Linux has gotten very good indeed at running games, that became less and less necessary until one day I just erased my Windows partition to recover the space.

[–] addie@feddit.uk 44 points 3 months ago (1 children)

In which case, the job becomes transferring the bottled samples into sample tubes in trays so that the machine can process them, and usually adding a barcode to each sample tube. The sample tubes need to be kept immaculate as well - some of the things that we test water for, like pesticides, are only present in minuscule concentrations. Might not actually save a great deal of time, and you need to buy and maintain a very expensive automated sampler.

When I used to work in the water industry, we were usually able to get PhD-qualified research chemists to do all this mind-numbing laboratory work. There's a bit of a surplus of qualified chemists compared to the number of chemist jobs available, so you got absurdly over-qualified people applying for these roles.

[–] addie@feddit.uk 53 points 3 months ago (2 children)

Obligatory www.web3isgoinggreat.com - catalogues all of the grifts, hacks and thefts, with a running $$$ total.

[–] addie@feddit.uk 48 points 4 months ago (12 children)

Cheaper for now, since venture capitalist cash is paying to keep those extremely expensive servers running. The AI experiments at my work (automatically generating documentation) have got about an 80% reject rate - sometimes they're not right, sometimes they're not even wrong - and the time spent reviewing it all isn't really a saving versus just doing the work ourselves.

No doubt there are places where AI makes sense; a lot of those places seem to be in enhancing the output of someone who is already very skilled. So let's see how "cheaper" works out.

[–] addie@feddit.uk 10 points 4 months ago (1 children)

The PS3 most certainly had a separate GPU - it was based on the GeForce 7800 GTX. Console GPUs tend to be a little faster than their desktop equivalents, as they share the same memory. Rather than the CPU having to send e.g. model updates across a bus to update what the GPU is going to draw in the next frame, it can change the values directly in the GPU memory. And of course, the CPU can read the GPU framebuffer and make tweaks to it - that's incredibly slow on desktop PCs, but console games can do things like tone mapping whenever they like, and it's been a big problem for the RPCS3 developers to make that kind of thing run quickly.

The cell cores are a bit more like the 'tensor' cores that you'd get on an AI accelerator than full-blown CPU cores. They can't speak to RAM directly, just exchange data between themselves - the main CPU has to copy data in and out of them, and schedule any jobs that run on them; they can't do that themselves. They're also a lot more limited in what they can do than a main CPU core, but they are very, very fast at what they can do.

If you are doing the kind of calculations where you've a small amount of data that needs a lot of repetitive maths done on it, they're ideal. Bitcoin mining or crypto breaking for instance - set them up, let them go, check in on them occasionally. The main CPU acts as an orchestrator, keeping all the cell cores filled up with work to do and processing the end results. But if that's not what you're trying to do, then they're borderline useless, and that's a problem for the PS3, because most of its processing power is tied up in those cores.
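The shape of the pattern, in ordinary C with one thread standing in for each cell core - not actual Cell SDK code, just a sketch of the orchestrator idea, where each worker only ever sees its own small buffer and the main core copies work in and results out:

```c
/* Sketch only: threads stand in for cell cores, memcpy stands in for DMA. */
#include <pthread.h>
#include <string.h>
#include <stdio.h>

#define N_WORKERS   6     /* the PS3 exposed six SPEs to games            */
#define LOCAL_STORE 256   /* a worker can only see this much data at once */

static float local[N_WORKERS][LOCAL_STORE];  /* one "local store" each */

static void *worker(void *arg)
{
    float *store = arg;
    for (int i = 0; i < LOCAL_STORE; i++)    /* lots of repetitive maths */
        store[i] = store[i] * store[i] + 1.0f;
    return NULL;
}

int main(void)
{
    static float input[N_WORKERS][LOCAL_STORE];   /* pretend main RAM */
    static float output[N_WORKERS][LOCAL_STORE];
    pthread_t t[N_WORKERS];

    /* The main core's job: fill each local store, kick the job off,
     * and collect the results - the workers never touch main RAM. */
    for (int i = 0; i < N_WORKERS; i++) {
        memcpy(local[i], input[i], sizeof local[i]);    /* "DMA in"  */
        pthread_create(&t[i], NULL, worker, local[i]);
    }
    for (int i = 0; i < N_WORKERS; i++) {
        pthread_join(t[i], NULL);
        memcpy(output[i], local[i], sizeof output[i]);  /* "DMA out" */
    }
    printf("first result: %f\n", output[0][0]);
    return 0;
}
```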

Some games have a somewhat predictable workload where offloading makes sense. Got some particle effects, say some smoke, where you need to do complicated fluid-and-gravity simulations before copying the end result to the GPU? Maybe your main villain has a very dramatic cape that they like to twirl, and you need to run that simulation separately from everything else you're doing? Problem is, working out what you can and can't offload is a massive pain in the ass; it requires a lot of developer time to optimise, when really you'd want the design team implementing that kind of thing; and slightly newer GPUs are a lot more programmable, and can do the simpler versions of those calculations both faster and much more in parallel.

The Cell processor turned out to be an evolutionary dead end. The resources needed to work on it (expensive developer time) just didn't really make sense for a gaming machine. The things that it was better at are things that it just wasn't quite good enough at - modern GPUs are Bitcoin monsters, far exceeding what the Cell can do, and if you're really serious about crypto breaking then you probably have your own ASICs. Lots of identical, fast CPU cores are what developers want to work on - they're much easier to reason about.

[–] addie@feddit.uk 6 points 4 months ago (3 children)

Yes, because it doesn't do as much to protect you from data corruption.

If you have a use case where a barely-measurable increase in speed is essential, but not so essential that you wouldn't just pay for more RAM to keep it in cache, and also it doesn't matter if you get the wrong answer because you've not noticed the disk is failing, and you can afford to lose everything in the case of a power cut, then sure, use a legacy filesystem. Otherwise, use a modern one.
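To make that concrete: on ZFS, say, every block is checksummed, so you can ask the pool to read everything back and verify it - something a legacy filesystem simply has no data to do. Pool name here is illustrative:

```sh
# Read back every block in the pool and verify it against its checksum.
sudo zpool scrub tank
# Shows scrub progress and any corruption found, with affected files listed.
sudo zpool status -v tank
```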
