This post was submitted on 06 Nov 2025
512 points (99.2% liked)

Technology

top 38 comments
[–] kibiz0r@midwest.social 85 points 2 days ago (2 children)

14kB club: “Amateurs!!!”

https://dev.to/shadowfaxrodeo/why-your-website-should-be-under-14kb-in-size-398n

A 14kB page can load much faster than a 15kB page — maybe 612ms faster — while the difference between a 15kB and a 16kB page is trivial.

This is because of the TCP slow start algorithm. This article will cover what that is, how it works, and why you should care.
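For context, the 14kB cliff comes from the initial congestion window: with the common RFC 6928 default of 10 segments and a typical ~1460-byte MSS, roughly 14.6 kB fits in the first round trip, and the window only doubles on each round trip after that. A rough sketch of the arithmetic (the constants are typical defaults, not guarantees):

```typescript
// Rough sketch of TCP slow start growth. Assumes the common initial
// congestion window of 10 segments (RFC 6928) and a ~1460-byte MSS;
// real stacks and networks vary.
const MSS = 1460;        // typical maximum segment size, in bytes
const INITIAL_CWND = 10; // initial congestion window, in segments

// How many round trips to deliver `bytes` during slow start?
function roundTrips(bytes: number): number {
  let cwnd = INITIAL_CWND * MSS; // bytes deliverable in this round trip
  let sent = 0;
  let rtts = 0;
  while (sent < bytes) {
    sent += cwnd;
    cwnd *= 2; // slow start: the window doubles every RTT
    rtts += 1;
  }
  return rtts;
}

console.log(roundTrips(14 * 1024)); // 1 RTT: fits in the first window
console.log(roundTrips(15 * 1024)); // 2 RTTs: just over the cliff
```

On a high-latency link, that single extra round trip is the kind of gap the quoted 612ms figure describes.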

[–] sobchak@programming.dev 3 points 1 day ago (2 children)

Is it just the HTML that should be under 14kB? I think scripts, CSS, and images (except embedded SVGs) are separate requests? So those would each individually need to be under 14kB to get the benefit?

[–] xthexder@l.sw0.com 5 points 1 day ago

In an ideal world, there's enough CSS/JS inlined in the HTML that the page layout is consistent and usable without secondary requests.
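A minimal sketch of what that inlining could look like as a build step (the file names and the placeholder comment here are hypothetical):

```typescript
// Build-time inlining sketch: splice the critical CSS into the HTML so
// the first response can render without waiting on a second request.
import { readFileSync, writeFileSync } from "node:fs";

const criticalCss = readFileSync("critical.css", "utf8");
const template = readFileSync("index.template.html", "utf8");

writeFileSync(
  "index.html",
  template.replace("<!-- critical-css -->", `<style>${criticalCss}</style>`)
);
```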

[–] kibiz0r@midwest.social 3 points 1 day ago (1 children)

Those additional requests will reuse the existing connection, so they’ll have more bandwidth at that point.

[–] sobchak@programming.dev 2 points 1 day ago

Interesting, didn't know that's how modern browsers worked. Guess my understanding was outdated, left over from the HTTP/1 standard.

[–] mlg@lemmy.world 15 points 2 days ago (1 children)

Something something QUIC something something

[–] hamFoilHat@lemmy.world 23 points 2 days ago (1 children)

I actually read the link and they mention QUIC

there is a notion that HTTP/3 and QUIC will do away with the 14kB rule — this is not true. QUIC recommends the same 14kB rule.

[–] mlg@lemmy.world 12 points 1 day ago

Damn, I was actually gonna add more context to my original comment about how QUIC is an overrated, in-place UDP upgrade for HTTP, but I didn't wanna open my mouth because I haven't read the QUIC spec.

Thank you for this lol

Spoiler: sliding windows are for losers, spam packets at gigabit rates or go home /s

[–] htrayl@lemmy.world 129 points 2 days ago (3 children)

A small critique of that project - a large portion of the websites included are simply personal sites for developers, barely more technical than a business card or CV. I would exclude those or categorize them differently, as their "usefulness" seems like a relatively edge case to me.

[–] Taldan@lemmy.world 64 points 2 days ago (1 children)

I clicked on 6 sites

  • 4 were personal portfolio sites

  • 1 was a personal blog

  • 1 was a web design company

Pretty disappointing, and I'm not going to keep clicking through more in the hope of finding something interesting.

[–] dandu3@lemmy.world 24 points 1 day ago

I clicked on random and got a tic-tac-toe game that's apparently purpose-made. Works fine, too.

[–] tedd_deireadh@lemmy.world 42 points 2 days ago

In the FAQ they actually do address that and mention they're reviewing those sites for removal.

[–] unexposedhazard@discuss.tchncs.de -2 points 2 days ago (1 children)

They are useful for those people, though. You can put a QR code or URL on your business card and it will give people all the information they need for your business or something.

[–] ripcord@lemmy.world 15 points 2 days ago* (last edited 2 days ago) (1 children)

I don't think anyone is arguing that having, like, a personal website isn't useful. But if they're not particularly interesting then they don't really fit here.

The point of something like this is generally to come up with interesting/creative/useful things within arbitrary resource limits. Not just a bunch of really really limited boring stuff.

[–] frongt@lemmy.zip 1 points 1 day ago (1 children)

That's not one of their requirements. You might want to look at a different competition.

[–] unexposedhazard@discuss.tchncs.de 1 points 1 day ago* (last edited 1 day ago)

Yeah, I don't get this complaint. This is just a label that people can qualify for; it's not a competition or a curated list of totally great websites. It's literally just like an energy efficiency sticker on a TV.

[–] Treczoks@lemmy.world 6 points 1 day ago

My web pages are larger, but not because of script junk and graphics. The texts are simply that long.

It's basically 1% markup and 99% visible text.

[–] knobbysideup@sh.itjust.works 39 points 2 days ago (1 children)
[–] Deceptichum@quokk.au 12 points 2 days ago (1 children)

Oh man, reminds me of that amazing 3D FPS demo from 20 years ago.

[–] Yaky@slrpnk.net 9 points 1 day ago

.kkrieger for those who want to look it up

[–] Korhaka@sopuli.xyz 20 points 2 days ago

512kB? So much bloat...

[–] mlg@lemmy.world 13 points 2 days ago (2 children)

I need a CDN-free, single-GET-request club

Why exactly? Do you know what a CDN does and why it's there in the first place?

[–] xyx@sh.itjust.works 2 points 1 day ago

Best I can do is a Firefox extension (Decentraleyes)

[–] tedd_deireadh@lemmy.world 13 points 2 days ago (2 children)

That's a pretty cool site. I always wanted to set up my own simple HTML page for a personal blog. Lots of inspiration in these pages. Crazy that there are functional web pages less than 2.03KB in size.

[–] cmnybo@discuss.tchncs.de 29 points 2 days ago

It's not hard to make a useful website that's small. You just have to avoid using javascript libraries and keep images to a minimum. There are a number of static web page generators if you don't want to write HTML yourself.

Keep in mind that a 50 kB page is about a 15 second load on a typical dial-up connection. Before high speed internet, almost everyone kept their web pages small.
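A quick back-of-the-envelope check of that dial-up figure, assuming roughly 28.8 kbit/s of usable throughput (real "56k" modems rarely did better):

```typescript
// Rough load-time estimate for a 50 kB page over dial-up.
// 28,800 bit/s is an assumed usable rate, not a measured one.
const pageBytes = 50 * 1024;
const linkBitsPerSecond = 28_800;

const seconds = (pageBytes * 8) / linkBitsPerSecond;
console.log(`${seconds.toFixed(1)} s`); // ~14.2 s, close to the 15 s figure
```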

[–] LucidNightmare@lemmy.dbzer0.com 9 points 2 days ago (1 children)
[–] tedd_deireadh@lemmy.world 2 points 1 day ago

I'll look into it! Thanks!

[–] shalafi@lemmy.world 7 points 2 days ago

I remember being amazed at Yahoo!'s load times on 56K. Pulled the code, 79K.

[–] dual_sport_dork@lemmy.world 3 points 2 days ago (1 children)

First of all, I take a bit of umbrage at the author's constant reference to "website size" without defining what this means until you dig into the FAQ. Just blithely referring to everything as "size" is a bit misleading, since I imagine most people would immediately assume size on disk which obviously makes no sense from a web browsing perspective. And indeed, they actually mean total data transferred on a page load.

Also, basically all this does is punish sites that use images. I run an ecommerce website (and no, I'm not telling you lunatics which one) and mine absolutely would qualify handily, except... I have to provide product images. If I didn't, my site would technically still "work" in a broad and objective sense, but my customers would stage a riot.

A home page load on our site is just a shade over 2 megabytes transferred, the vast majority of which is product images. You can go ahead and run an online store that actually doesn't present your customers any products on the landing page if you want to, and let me know how that works out for you.

I don't use any frameworks or external libraries or jQuery or any of that kind of bullshit that has to be pulled down on page load. Everything else is a paltry (these days) 115.33 kB. I'mna go ahead and point out that this is actually less to transfer than jabroni has got on his own landing page, which is 199.31 kB. That's code and content only for both metrics, also not including his sole image — which is his favicon, and that is for some inexplicable reason given the circumstances a 512x512 .png. (I used the Firefox network profiler to generate these numbers.)

[–] JustAnotherKay@lemmy.world 1 points 2 days ago (1 children)

Do you actually have to provide the image? Couldn't you provide a pointer to the image? Like those thumbnails that are just links on the backend but appear as images when loaded.

[–] dual_sport_dork@lemmy.world 9 points 2 days ago (2 children)

If you're going to display pixels on the user's screen, you have to send those pixels to the user. Magic still doesn't exist. HTML img tags are indeed a "pointer," but once the user's browser has the path to that image file it will download the entire thing.

That said, there's no reason to send an image that's any bigger than it needs to be. Sending a scaled down thumbnail if you know it will be displayed small is sensible. Sending the entire 1200px wide or whatever image it is and just squashing it into a 100px wide box in the user's browser is not.
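For illustration, a server-side resize sketch; this assumes the `sharp` image library for Node, and the file names are made up:

```typescript
// Generate a 100px-wide thumbnail ahead of time instead of shipping the
// full 1200px image and letting CSS squash it down in the browser.
import sharp from "sharp";

await sharp("product-1200w.jpg")
  .resize({ width: 100 }) // match the size it will actually render at
  .toFile("product-100w.jpg");
```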

[–] JustAnotherKay@lemmy.world 2 points 1 day ago (1 children)

Once the user's browser has the path to that image…

I dunno why that didn’t occur to me, that makes sense

[–] dual_sport_dork@lemmy.world 2 points 1 day ago (1 children)

That's how it works.

You may be thinking of "lazy loading," where some scriptwork is used to delay downloading images until some time after the initial page load completes. This still requires all the data to be sent to the user — all of the data always has to be sent to the user eventually — but just not right away. This can have perceptible load time benefits, especially if whatever content you're loading won't be visible in the viewport initially anyway.
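One common flavor of that scriptwork, sketched with IntersectionObserver; the data-src convention here is just one popular pattern, and modern browsers also offer a declarative loading="lazy" attribute that needs no script at all:

```typescript
// Lazy-loading sketch: keep the real URL in data-src and only assign it
// to src once the image approaches the viewport, deferring the download.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src!; // the actual download starts here
    obs.unobserve(img);
  }
});

document.querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```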

[–] JustAnotherKay@lemmy.world 3 points 1 day ago (1 children)

Tbh I’m just new to the computer science scene - I’ve taken one class so far on the fundamentals of programming and have only seen a real language in my free time as of yet.

It didn’t occur to me that the webpage which references another for an image would still be culpable for the space taken up by the image, because with on-disk memory management you can do tricks to reduce sizes with pointers and I just thought it would be analogous. It feels painfully obvious to me why that’s stupid now lol

[–] dual_sport_dork@lemmy.world 2 points 1 day ago

It's the same line of logic as when you see people post on a forum something like [img]c:\Users\Bob\Documents\My_Image.bmp[/img] and then wonder why it doesn't work.

"But I can see it on my computer!"

Over the internet, the origin of all data is on someone else's computer. All means all. And all of it needs to come down the wire to you at some point.

You're on the right track in one regard, though, in a roundabout way with caching: Browsers will keep local copies of media or even the entire content of webpages on disk for some period of time, and refer to those files when the page is visited again without redownloading the data. This is especially useful for images that appear in multiple places on a website, like header and logo graphics, etc.

This can actually become a problem if an image is updated on the server's side but your browser is not smart enough to figure this out. It will blithely show the old image it has in its cache, which is now outdated. (In all current modern browsers, you can force a refresh by holding Shift when you refresh or press F5; the page reloads while explicitly ignoring any files already in the cache, so all the images and content are fully redownloaded. That's how you get around this if it happens to you.)
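A common server-side workaround is to fingerprint file names with a content hash, so an updated image gets a brand-new URL and the stale cache entry is simply never referenced again. A sketch (file names are illustrative):

```typescript
// Cache-busting sketch: derive a short content hash and bake it into the
// published file name, e.g. logo.3fa4b2c1.png.
import { createHash } from "node:crypto";
import { copyFileSync, readFileSync } from "node:fs";

const data = readFileSync("logo.png");
const hash = createHash("sha256").update(data).digest("hex").slice(0, 8);
copyFileSync("logo.png", `logo.${hash}.png`); // reference this name in HTML
```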

[–] dondelelcaro@lemmy.world 3 points 1 day ago

And use SVG when you can; bitmaps which should be vectors are frequently big and ugly.

[–] mpramann@discuss.tchncs.de 2 points 2 days ago

Thanks for sharing. Great inspiring collection.