Eyron

joined 2 years ago
[–] Eyron@lemmy.world 2 points 4 months ago

Did you purposely miss the first and last questions: which laptop is a good value?

I never said people need to run LLMs. I said Apple dominates high-end laptops, and I wanted a good high-end alternative to compare to the high-end MacBooks.

Instead of just complaining about Apple, can you do what I asked? Name the best cheaper laptop alternative that checks the non-LLM boxes I mentioned:

If you want good cooling, good power (CPU and GPU), good screen, good keyboard, good battery, good WiFi, etc., the options get limited quickly.

[–] Eyron@lemmy.world -1 points 4 months ago* (last edited 4 months ago) (2 children)

Is there a particular model you're thinking of? Not just the line. I usually find that Windows laptops don't have enough cooling or make other sacrifices. If you want good cooling, good power (CPU and GPU), good screen, good keyboard, good battery, good WiFi, etc., the options get limited quickly.

Even the RAM cost misses some of the picture. Apple Silicon's RAM is available to the GPU and can run local LLMs and other machine learning models. Pre-AI-hype Macs from 2021 (maybe 2020) already had this hardware. Compare that to PC laptops from the same era. Even in this era, try getting Apple's 200-400GB/s memory bandwidth on a PC laptop.

PC desktop hardware is the most flexible option for any budget and is cost-effective for most budgets. For laptops, Apple dominates their price points, even pre-Apple Silicon.

The OS becomes the final nail in the coffin. Linux is great, but a lot of software still only supports Windows and Apple; Linux support for the latest/current hardware can be hit or miss (my three-year-old, 12th-gen ThinkPad just started running well). If the choice is between macOS and Windows 11, is there much of a choice? Does that change if a company wants to buy, manage, and support it? Which model should we be looking at? It's about time to replace my ThinkPad.

[–] Eyron@lemmy.world 6 points 6 months ago

A hidden experimental flag isn't "fixed." It might be the start, but until it's stable and usable through the normal UI, it's not really done.

[–] Eyron@lemmy.world 6 points 10 months ago* (last edited 10 months ago) (6 children)

That many steps? WindowsKey+Break > Change computer name.

If you're okay with three steps, on Windows 10 and newer you can right-click the Start menu and open System. Just about any version also supports right-clicking "My Computer" or "This PC" and selecting Properties.
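
For completeness, the rename can also be done without the GUI at all. A minimal sketch in Python, assuming an elevated (administrator) session on Windows; the name "MY-NEW-PC" is just a placeholder, and a reboot is still needed for the change to apply:

```python
# Minimal sketch: rename a Windows machine via the Win32 SetComputerNameExW call.
# Assumes an elevated (administrator) Python session on Windows.
import ctypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)

# COMPUTER_NAME_FORMAT value for the physical DNS hostname.
ComputerNamePhysicalDnsHostname = 5

def rename_computer(new_name: str) -> None:
    """Set the computer name; the change takes effect after a reboot."""
    if not kernel32.SetComputerNameExW(ComputerNamePhysicalDnsHostname, new_name):
        raise ctypes.WinError(ctypes.get_last_error())

if __name__ == "__main__":
    rename_computer("MY-NEW-PC")  # placeholder name
```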

[–] Eyron@lemmy.world 40 points 10 months ago* (last edited 10 months ago)

Do you remember the Internet Explorer days? This, unfortunately, is still much better.

Pretty good reason to switch to Firefox now. Nearly everything will work, unlike the Internet Explorer days.

[–] Eyron@lemmy.world 2 points 1 year ago

I'm still rocking a Galaxy Watch 4: one of the first Samsung watches with WearOS. It has a true always-on screen, and most smartwatches should. The always-on display was essential to me, and I generally notice within 60 minutes if an update or some "feature" tries to turn it off. Unfortunately, that's the only thing off about your comment.

It's a pretty rough experience. The battery is hit or miss. At its best, I could get 3 days. Keeping it locked (like after charging) used to kill it within 60 minutes (thankfully, that was fixed after a year). Bad updates can kill the battery life, even when new: from 3 days to 10 hours, then back to 3 days again. Now, after almost 3 years, it's probably about 30 hours rather than 3 days.

In general, the battery should last more than 24 hours with the always-on display. That'd be pretty acceptable for a smartwatch, but is it really a smartwatch?

It can't play music on its own without overheating. It can't hold a phone call on its own without overheating. App support is limited, yet the processor still seems to struggle most of the time. Truly smart features seem rare, especially for something that needs charging this often.

Most would be better off with a Pebble or another less "smart" watch: better water resistance, better battery life, longer support, 90% of the usable features, and other perks that help make up for any differences.

[–] Eyron@lemmy.world 1 points 2 years ago (1 children)

To me, your attempt at defending it, or calling it a retcon, is an awkward characterization. Even in your last reply, you're now calling it an approximation. Dividing by 1024 is an approximation? Did computers have trouble dividing by 1000? Did it lead to any benefit in the 640KB/384KB split of the conventional memory model? Does it lead to a benefit today?

Somehow, every other computer measurement avoids this binary-prefix problem. Some, like you, seem to defend it as the more practical choice compared to the "standard" choice every other unit uses (e.g., the 1.536 Mbps T1 or "54" Mbps 802.11g).

The confusion this continues to cause wastes quite a bit of time and money today. Vendors continue to show both units on the same spec sheets (open up a page to buy a computer or server). News outlets still report the difference as bloat. Customers still complain to customer support, which escalates to management and back down to project management and development. It'd be one thing if this didn't waste time or cause confusion, but we're still doing it today. It's long past time to move on.

The standard meaning of "kilo" was 1000 long before computer science existed. Things that need binary units have binary prefixes available, but they're probably not needed, even in computer science. Trying to call kilo/kibi a retcon just seems like a way to defend the 1024 usage today, despite the fact that nearly nothing else (even in computers) uses the binary prefixes.

[–] Eyron@lemmy.world 1 points 2 years ago

209GB? That probably doesn't include all of the RAM: like in the SSD, GPU, NIC, and similar. Ironically, I'd probably approximate it to 200GB if that were the standard, but it isn't. It wouldn't be much of a downgrade to go to 200GB from 192GiB. Are 192 and 209 that different? It's not much different from remembering the numbers for a 1.44MB floppy, 1.544Mbps T1 lines, or the ~3.14159 approximation of pi. Numbers generally end up getting weird: trying to keep them in binary prefixes doesn't really change that.
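
For what it's worth, the conversion between the two prefix systems is just multiplication by a constant. A quick sketch in Python, using the 192GiB/209GB figures above purely as illustrative inputs:

```python
# Quick sketch: converting between binary (GiB) and decimal (GB) prefixes.
GIB = 1024 ** 3  # gibibyte: 1,073,741,824 bytes
GB = 1000 ** 3   # gigabyte: 1,000,000,000 bytes

def gib_to_gb(gib: float) -> float:
    return gib * GIB / GB

def gb_to_gib(gb: float) -> float:
    return gb * GB / GIB

print(f"192 GiB = {gib_to_gb(192):.1f} GB")   # 192 GiB = 206.2 GB
print(f"209 GB  = {gb_to_gib(209):.1f} GiB")  # 209 GB  = 194.6 GiB
```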

The definition of kilo as 1000 was standard before computer science existed. If computing used it in a non-standard way, that may have been common or a decent approximation at the time, but it wasn't standard. Does that justify the situation today, where many vendors show both definitions on the same page when you're buying a computer or a server? Does that justify the development time and confusion from people still not understanding the difference? Was it worth the PR response from Samsung to, yet again, point out the difference?

It'd be one thing if this confusion had stopped years ago and everyone understood the difference today, but we're not there, and we're probably not going to get there. We have binary prefixes; it's long past time to use them when appropriate. But even the appropriate uses are far fewer than they appear: it's not like you have a practical 640KiB/2GiB limit per program anymore. Even in the cases where you do, is it worth confusing millions or billions of people on consumer spec sheets?

[–] Eyron@lemmy.world -2 points 2 years ago* (last edited 2 years ago) (8 children)

This is all explained in the post we're commenting on. The standard "kilo" prefix, from the metric system, predates modern computing and even the definition of a byte: the 1700s versus the 1900s. It seems very odd to argue that the older definition is the one doing the retconning.

The binary usage in software was, and is, common, but it's definitely more recent, and it causes a lot of confusion because it doesn't match the older and broader standard. Computers are very good at numbers; they never should have tried to hijack the existing prefix, especially when it was already defined by existing international standards. One might argue that the US hadn't really adopted the metric system at that point, but the use of 1000 to define kilo is clearly older than the use of 1024 to define the kilobyte. The main new thing here (within the last 100 years) is that 1024 bytes is a kibibyte.

Kibi is the retcon. Not kilo.

[–] Eyron@lemmy.world -1 points 2 years ago* (last edited 2 years ago) (10 children)

How do you define a retcon? Were kilograms 1024 grams, too? When did that change? It seems kilo has meant 1000 since the metric system was created in the 1700s; the binary prefixes came along much later.

From the looks of it, software vendors were trying to retcon the definition of "kilo" to be 1024.

[–] Eyron@lemmy.world -5 points 2 years ago* (last edited 2 years ago)

It's only recent in some computers, which used a non-standard definition. The kilo prefix has meant 1000 since at least 1795, which predates just about any kilobyte.
