
Browser Wars



What browser? I’m guessing FF!


Yeah I’m using FF 56.0.1


One thing I’ve noticed is that some people have a tendency to base64 encode images and embed them in CSS. That’s one way you can easily hit 4MB of CSS.
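To put a number on that: base64 maps every 3 input bytes to 4 ASCII characters, so an image embedded as a data: URI makes the CSS roughly a third larger than the image file itself, and the image can no longer be cached or lazy-loaded on its own. A rough sketch (the "image" here is just random bytes standing in for a real file):

```shell
# base64 maps every 3 bytes to 4 characters, so an image embedded as a
# data: URI in CSS is ~33% larger than the original file.
head -c 300000 /dev/urandom > image.bin      # stand-in for a ~300 KB image
base64 image.bin | tr -d '\n' > image.b64    # the payload a data: URI would carry
wc -c image.bin image.b64                    # 300000 vs 400000 bytes
```

Embed a handful of those and 4MB of CSS stops being surprising.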

As far as fonts go, 1.1MB is not terrible. Fonts often ship with full character sets and multiple weights, and should be optimized (subset to the glyphs and weights actually used). That’s where the problem lies.


Eh, how exactly is authored CSS size related to the browser (not counting conditional IE comments, which are… a thing)? Can you elaborate? I fail to see how this would be Firefox’s fault (and for the record, I’m not a Firefox user; in fact, I’m not exactly a fan).

Embedding images as base64 in CSS can be a reasonable approach in certain scenarios; the use case matters, of course.
What many sites don’t seem to do is above-the-fold (critical) CSS, though to be fair that is REALLY hard to optimize for.
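For anyone unfamiliar with the pattern: the idea is to inline the few rules needed to paint the first screen and load the full stylesheet without blocking rendering. A sketch (file names and rules are placeholders):

```html
<!-- Sketch of the critical-CSS pattern: inline only the rules needed to
     render above-the-fold content, then load the full stylesheet without
     blocking first paint. -->
<head>
  <style>
    /* critical rules only: layout, header, first-screen typography */
    body { margin: 0; font: 16px/1.5 sans-serif; }
    .header { height: 60px; background: #222; }
  </style>
  <link rel="preload" href="full.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="full.css"></noscript>
</head>
```

The hard part, as noted, is figuring out which rules are actually "critical", which is why tooling usually has to do it.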

The problem isn’t the use of CSS/JS frameworks in itself, the problem is how developers use them. Far too often I see devs include a whole framework, then overwrite half the styles to “reset” them, then overwrite them again with their target values. Which is kinda backwards in the first place.
If you dig deeper, most CSS frameworks are modular: you can take just what you want. You can even integrate them with SASS or LESS to customize them in the build process, but not a lot of devs seem to do that.

If you self-host, compile your own version with only the modules and styles you need; chances are the file won’t even be that big in the end.
If you’re a dev and you want or need to include the complete framework, all modules included, and overwrite select styles, that’s fine too. But for the love of all that is holy, at least use a widely used CDN and don’t host it on your own servers/CDN. Because that’s what’s really frustrating. With a popular CDN, chances are visitors already have the file cached and it’s not a big deal; if you’re self-hosting, it’s just fkin annoying (and also costly, because traffic costs money).
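The "compile your own version" approach looks roughly like this, assuming a Bootstrap-style modular framework installed locally (paths and module names are illustrative):

```scss
// Sketch: pull in only the modules the site uses, and customize variables
// *before* the imports instead of "resetting" compiled styles afterwards.
$primary: #0a6;                         // set in the build, not via overrides
@import "bootstrap/scss/functions";
@import "bootstrap/scss/variables";     // framework defaults use !default,
@import "bootstrap/scss/mixins";        // so the value above wins
@import "bootstrap/scss/grid";          // only what the site actually needs
@import "bootstrap/scss/buttons";
```

Skipping the unused modules is usually where most of the size savings come from.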


That’s the major problem.

In addition, if you minify your CSS for production and enable compression on the web server, you’ll improve load times significantly.
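Compression helps even after minification, because CSS is full of repeated selectors and property names. A synthetic demonstration (the file here is artificially repetitive, so the ratio is exaggerated compared to real stylesheets; gzip stands in for the server’s gzip/brotli module):

```shell
# Repetitive rules compress extremely well, which is why enabling
# gzip/brotli on the server helps even after minification.
yes '.btn{color:#333;margin:0;padding:8px 12px;border-radius:4px}' \
  | head -n 2000 > styles.min.css
gzip -k -9 styles.min.css
wc -c styles.min.css styles.min.css.gz   # compare on-disk vs on-the-wire size
```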


It is minified, and it’s still huge.


oh … :confused:

They spend 2000 characters loading fonts. Someone needs to be fired.


2000 characters… Are they loading every newspaper ever? lol


Well… I think that’s a matter of course these days; didn’t think it would need mentioning :stuck_out_tongue:
Also… one thing that approximately 200% of developers don’t understand is file versioning… query-strings. are. bad.
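For context: the complaint is about cache-busting with `style.css?v=3`, which some caches and proxies have historically handled poorly. The usual alternative is to put a content hash in the filename, so the URL only changes when the content does and the file can be served with far-future cache headers. A sketch of what bundlers do automatically (file names hypothetical):

```shell
# Content-hash versioning: the name changes only when the content changes,
# so the file can get a far-future Cache-Control header instead of ?v= queries.
printf 'body{margin:0}' > styles.css
hash=$(sha256sum styles.css | cut -c1-8)
cp styles.css "styles.${hash}.css"
ls styles.*.css    # e.g. styles.<hash>.css
```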


Yes. I’m not insanely frustrated by it though because I’ve never seen an internet connection (in the last 5 or so years) that couldn’t efficiently load a page of that size.


It’s not so much about loading speed; it’s about best practices, and how highly paid devs fail to understand some basic principles.

Aside from that, there are people who don’t have unlimited data, especially on mobile…


Perhaps the deeper you go into technology, the more obvious things tend to get obfuscated.


That guy nailed it :+1:


Aside from that, there are people who don’t have unlimited data, especially on mobile…

Is this also one of the many motivations that Lunduke had in joining the W3C?


Eh, he can’t really reduce payload sizes through the W3C, but it’s definitely not a bad goal to have.

The efficient thing to do is what this forum does: transfer the CSS/JS once and use JSON to fetch the content.


I dunno I watched the video but it was a little too ranty for me so my head kinda skipped over half of it ¯\_(ツ)_/¯


Good point. On a separate note, have you guys heard of / used the QuickJava extension yet?


Well, he was just talking about joining the W3C and what he should do with his time there.


First I’m hearing of it.


Pretty neat extension; you can use it to tweak sites as needed for optimization.