The World Wide Web Sucks - Lunduke

Not so much about the video specifically, more about the web in general.

Might change when I get to watch the video, who knows :stuck_out_tongue:

Not sure what you mean by that, plugins? Because the whole idea is that it's supposed to work without plugins. Of course it doesn't, but that's a different story.

But not entirely, though, right? I'm not really using Firefox anymore, we just have it at work (the ESR version though), and I have two processes, one for normal windows and one for private windows. Is that a setting or a flag? Or is it not rolled out for everyone?

I mean, whenever you load a page with JS, you execute arbitrary 3rd party code.

I agree, the philosophy of "why not?" needs to be turned into "why tho?"

Yet you are using the web atm, and in fact executing tons of JS.


While I'm not a fan of the current state of the web myself, it would be a shame if the web never existed. Without it I would never have learned to code, I wouldn't be able to talk with tons of people, and I wouldn't be able to shitpost.

I agree about the spying, hell, I use uBlock and Privacy Badger, and sometimes even Tor (though I see no point in using Tor when I am going to authenticate in the end). But overall the web can be a great thing. We just need to try and win it back.

And in the end the worst problems aren't in the efficiency (or lack thereof) and RAM usage. It's exactly what you're talking about. The biggest problem with the web is not the web, but the people who use it. We, as users, let it become centralized, which in turn allowed it to be used for surveillance (or at least made that easier). We traded privacy for convenience.

We, as devs, got lazy with software. We default to "hardware will get faster" and "works fine on my machine with 16 GB of memory" instead of actually looking into a problem, profiling and optimizing. I get why, though: in the '80s we had constraints; now, not so much.

So in the end, even though I have a beef with the web (hell, even the whole internet) on a technical level, I have much more of a problem with it on a political level, and that is only fixable by people, not tech.


Hm, interesting... might be an ESR thing then, who knows.

I might be wrong here, but I think everything that is in {} is a thread, not a process.

Consider this zsh process with suspended processes.

And this description from the pstree man page:

       pstree  shows running processes as a tree.  The tree is rooted at either pid or init if pid is omitted.  If a user name is specified, all process trees rooted
       at processes owned by that user are shown.

       pstree visually merges identical branches by putting them in square brackets and prefixing them with the repetition count, e.g.

           init-+-getty
                |-getty
                |-getty
                `-getty

       becomes

           init---4*[getty]

       Child threads of a process are found under the parent process and are shown with the process name in curly braces, e.g.

           icecast2---13*[{icecast2}]

Derailing the thread:

Fucking Linux, man, confusing as shit :stuck_out_tongue:
I think this answers why threads have PIDs: https://stackoverflow.com/questions/9154671/distinction-between-processes-and-threads-in-linux

Servo is actually a separate project (not part of Firefox), but since it was created by Mozilla, they decided to use some of its features in the new Firefox engine. :slight_smile:


Yeah I'm so lazy I know :stuck_out_tongue:

That was very enjoyable to watch. Much better than most of his other videos.

I tried to load bestbuy.com in the Tor Browser and it doesn't even load. Just an Access Denied message.

Am I the only one who thinks that going to prison for downloading copyrighted music is wrong?


No you're not, and I don't think that either. But proper penalties are not the responsibility of "the web" either. DRM and penalties for copyright infringement are two very different things. Not to mention that infringement is handled very differently in different countries, and I'm pretty sure you don't go to jail for one song or one album :slight_smile: The chances they find you for one download are rather slim in the first place, and even if they do, there are other ways :slight_smile:

That's fine, who isn't :slight_smile: But regarding the last sentence in your quoted post:

The new thing in Firefox 54 will be enabling four content processes by default.

Gonna take a year or so until it hits ESR then, so that explains it :slight_smile:

On reflection I think that I should clarify that I do not believe that web and browser developers are incompetent, don't care, are unable to create a safe, secure and pleasant browsing experience, or are generally bad people. Rather, some of their collective decisions have been irresponsible. We don't need half a gigabyte (and growing!) of JavaScript to display a couple of news headlines. Likewise, building browsers that inherently trust all JavaScript that they encounter and then enthusiastically and autonomously execute it is a recipe for disaster. Certainly, browser architects can't be held responsible for blackhat shenanigans, but they need to recognize that they have set the stage for an orgy of mischief. Listen, if I were to walk into a bar at midnight and start handing out hand grenades to the patrons, would anyone be surprised if there were some unintended consequences? The law would certainly hold me to a higher standard, and it is my assertion that the devs should likewise hold themselves to a higher standard.

We currently have regular websites and mobile websites. Why can't we have plain HTML websites and, for the more adventurous, "enhanced" websites with enhanced RAM and CPU requirements, along with the attendant sluggishness and enhanced security risks? Enhanced websites should not be the default; we should have to opt in, in order to experience the web in all of its bloat and "splendor."

And while we are at it, why are there ten different bots (that I know of!) tracking every single character that I type into this post?!?!?! AND there are another five bots (that I know of!) that are watching, but haven't yet decided if they are going to track me?!?!?! Again, the stage has been set for mischief and lo and behold, mischief is upon us!

I agree, but that's on web devs, not browser devs.

Don't most browsers run JS in a sandbox?

That's more or less what I am talking about: people should stop and think before making a website. Do they need a web page (or a collection of them) or a webapp?

Care to elaborate about the bots? All I know about are tracking cookies.

Care to elaborate about the bots? All I know about are tracking cookies.

As reported by Privacy Badger:
googleads.g.doubleclick.net
static.doubleclick.net
www.googleadservices.com
r18---sn-aigllnde.googlevid...
r5---sn-q4f7sn7s.googlevide...
www.google.com
img.youtube.com
s.youtube.com
www.youtube.com
i.ytimg.com
img.gta5-mods.com
gtaforums.com
www.rockpapershotgun.com
cdn.edgecast.steamstatic.com
erolives.com

OK so... just got around to watching the video (and I know he exaggerates a lot in everything in the history of ever, so... yeah, I know). And sorry for the length and rambling, too lazy to keep it short.

OK so he's taking an average over "all" websites tracked by whatever service he used to get his data. The problem with that is that it's not representative at all. There are websites whose single pages grow immensely over the years, but there are also sites that stay in about the same range for years. Especially social media sites grow like crazy in page size because people upload their stupid images everywhere. Of course that skews the overall average. Also, do those services track sites with video and audio content? Because then the whole statistic becomes ridiculous. But he doesn't state his source other than "many websites", so who knows...

In addition to that, if you do have a slow connection you can disable the immediate download of images, for example, and load them on demand (one more reason why developers should use <img width/height already).
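
For illustration, a minimal sketch of that <img> usage (the file name and dimensions here are made up): explicit width/height attributes let the browser reserve the layout space before, or even without, downloading the file.

    <!-- Hypothetical example: width/height reserve the layout slot
         so the page doesn't reflow when (or if) the image loads. -->
    <img src="header.jpg" width="640" height="360" alt="Article header">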

The web evolves, connections get faster overall, so why not use that? There's also a limit on how far you can compress content. Minifying and gzipping are done on basically every larger website already, and there is a limit on how far you can compress media content. And spoiler alert: people like seeing media content.

Yep, my point from earlier still stands, and he even says so himself.

OK so his examples give a little context. However, that's more an issue of lazy programmers than of "the web", because "the web", or rather the W3C, has a spec for obtaining geolocation data from the user, with their knowledge, that works in a very similar way to the ones used in his examples.
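
A minimal sketch of that Geolocation API; the browser prompts the user for permission before any coordinates reach the page (the logging here is just illustrative):

    // W3C Geolocation API: the browser asks the user for permission
    // before handing any coordinates to the page.
    if ('geolocation' in navigator) {
      navigator.geolocation.getCurrentPosition(
        (pos) => console.log(pos.coords.latitude, pos.coords.longitude),
        (err) => console.log('Denied or unavailable:', err.message)
      );
    }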

Yeah, my point from earlier definitely still stands: don't use browsers from the stone age, maybe. On the one hand he advocates that people should update their shit, because... you know, security... but on the other hand he's complaining that old shit doesn't work? Yeah. That totally makes sense.

And every sane developer (and I'm pretty sure CNN isn't 100% insane) will have some kind of support for older browsers (the recommended/best practice is about 3 versions back, but it's usually gathered from analytics on the most used browser versions), either through pre-deployment and/or build steps, or through polyfills. But you can't have polyfills for everything, because... you know, "Web page size sucks!" already.
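
For illustration, a simplified sketch of the polyfill idea (not spec-complete; real polyfills also handle NaN and a fromIndex argument): the feature is only patched in when the browser doesn't already provide it.

    // Simplified polyfill sketch for Array.prototype.includes:
    // only patch it when the browser doesn't ship it natively.
    if (!Array.prototype.includes) {
      Array.prototype.includes = function (needle) {
        return this.indexOf(needle) !== -1;
      };
    }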

And oh well, who knew Mosaic wouldn't display a page from 2017, what a total surprise. Who outside the "nerd" community even knows that thing anymore, let alone uses it?

Of course you "can" build a website that works on those, but why would you if no one's using them? Just funny that he didn't talk about IE there :slight_smile:

It's not that developers choose to have their sites used in "restrained circumstances", it's just that no one's using those things anymore.

"The W3C is the gatekeeper to all things web. They set the standards, they say these are the standards for HTML, for JavaScript".

It's funny, because he should know better. HTML5 is not a standard, it's a specification, because the W3C is not a standardisation organisation. And the W3C has nothing to do with JavaScript. JavaScript is technically ECMAScript, and that is maintained by Ecma International (Brendan Eich created the language, but the spec is maintained by Ecma's TC39 committee). And also, JavaScript is a trademark of Oracle.

Since he's specifically talking about the Encrypted Media Extensions here, well... Here's the thing: we have DRM already; the difference is that it is not specced by the W3C, but that doesn't change the fact that it is already there. And for his first quote it even makes sense. Certain content needs protection to make it possible to spread it over the web. Music and video services need DRM to even be allowed to distribute their content. And as I see it there is nothing wrong with that. Content creators should have a right to protect their property, and if that weren't the case, they wouldn't allow it to be spread.
Yes, it does contradict the second quote. But that quote isn't exactly from yesterday either. Things change over time; there's nothing wrong with that. And just because a site has some kind of DRM doesn't mean it's not possible to access it. On the contrary, it makes it possible to access it, for the reasons above.
And for his third quote: at first he says that "DRM is just a broad concept", not a specific implementation. DRM doesn't have to restrict based on geographical location. It doesn't have to restrict based on network access (whatever the hell he means by that). It doesn't have to be a paid service. The software issue is exactly the one EME is trying to address, so that plugins aren't browser-specific anymore and might not exist for browser X, but instead there is one single spec that is used in all browsers. The same goes for the operating system. It doesn't have to restrict hardware. Actually, the only thing I know of that restricts hardware is Netflix's 4K streaming, and that is mostly so that people don't whine about a bad viewing experience when using... I dunno, Mosaic as a browser, for example.
And no, restricting is not the "entire point" of DRM; the "entire point" of DRM is protecting property. How that's implemented, and whether it works or not, is a different story.

And they didn't decide to throw anything out the window. Until now EME is just a proposal; it's neither a draft nor a recommendation, let alone a spec.

Also, spoiler alert: even if the W3C rejected the proposal, it could still be implemented in browsers; it really doesn't matter. It might not make it into Firefox, but it surely will in Chrome and Edge. Firefox had the blink element for ages, and Chrome supports certain CSS properties that are not in the official spec.


I feel like every time I hear him talk he's just there to hate on everything just so he can hate on something. I know he exaggerates a lot, but it still annoys me every time I see it, even when he's trying to be nice. Maybe it's just his voice and speech pattern for me.

I mean he could quit talking about how bad the code is and make it better? That could be a thing.

Also cnn.com drinking game anyone? Take a shot every time he says cnn.com and chances are people are gonna pass out within the first minute.


At its core I mostly agree with Bryan, but some of his points should be taken with a grain of salt.

It's been a very long time since I used Firefox, for example. It always had issues with memory usage and had terrible video performance on YouTube compared to Chrome. Chrome is quite good regarding RAM, but I chose to load several plugins like uBlock, uMatrix and HTTPS Everywhere, which increase the memory footprint quite a lot.
When this is such a huge concern, I wonder why he didn't mention Microsoft Edge, as it is in fact a well-integrated browser with very low CPU and RAM demands (without plugins; I can't test that because I deleted the Store ;D).

His point about backwards compatibility is kinda stupid in my opinion. Unless you are really working on decade-old hardware (why would you, though?), there is no point in staying with earlier versions of your web browser, not to mention how dangerous that is with regard to bug and security fixes.

The true answer lies in the middle, as is mostly the case. Both the need for and the supply of memory and bandwidth increase, and it's our job to use them economically to keep a balance.
And there is no excuse for DRM, to hell with that!


Thanks for watching the vid!

I think that a little exaggeration, especially when used with a pinch of humor, is entirely fair when trying to make a point.

Why not use it, you ask? Well again, just because you can, doesn't make it a good idea. Not everyone's connection is getting faster. More specifically, they are not getting faster at the same rate that web pages are growing larger. There are literally millions of us whose best available option is DSL, satellite, or cellular. And it's precisely because there is a limit to how much content can be compressed that web developers should be mindful of how bloated their sites have become.

And what is the substance of this bloat? Third-party cruft and JavaScript doing god only knows what? I'll pass... at least I'd like to. Too bad I can't block third-party content as easily as I can block third-party cookies. In the event that third-party content is actually important to a specific web page, we have a solution for that. It's called a hyperlink.

I have a spoiler alert for you. If I visit a site and it automatically starts playing audio or video assets, I leave immediately and I don't go back. I consider this to be invasive, antisocial behavior, and it is only about half a notch less annoying than flashing ads which intentionally try to induce a seizure. Please have the good manners to either ask, or allow the viewer to initiate multimedia playback. There are few things more annoying than having sixty or more tabs open and then trying to find the one tab that is automatically playing a thirty-second media clip every few minutes!


Any language that allows you to divide a bool by an array containing a single string element, and not only doesn't freak out and fail, but further returns a number, is most assuredly the work of the devil, i.e. console.log(typeof (false / ["twelve"])); ... "number" ... WAT?!?!
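
For the curious, a sketch of the coercion chain that produces that result (expected output in the comments):

    // The coercion chain behind false / ["twelve"]:
    Number(false);                // 0        (false coerces to 0)
    String(["twelve"]);           // "twelve" (an array coerces to its joined string)
    Number("twelve");             // NaN      ("twelve" is not numeric)
    0 / NaN;                      // NaN
    typeof NaN;                   // "number" (NaN is a numeric value in JS)
    typeof (false / ["twelve"]);  // "number"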

World Wide Web ... let's be honest ... more like World Wide Web (of Web Services)


'use strict';

Should help alleviate some of the bad programming.
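
For example (a minimal sketch), strict mode turns some silent mistakes into hard errors:

    'use strict';
    // In strict mode, assigning to an undeclared variable throws a
    // ReferenceError instead of silently creating a global:
    function demo() {
      misspeledTotal = 42; // ReferenceError: misspeledTotal is not defined
    }
    demo();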

This is probably true, however, there are easy ways of blocking stuff from rendering/downloading client-side if that is an issue.

Really depends on how the site is bloated. Of course it can be third-party stuff (and that is easily blockable...), but it might also just be media content. And depending on the browser, you can choose not to download images directly, but only when specifically requested by the user.

That's understandable, and I do the same. I don't care so much about video if it's muted, to be honest, but audio is just a no.

This is only a tangent, but... most browsers have an icon on the tab for this issue.

And also use a linter.

The uMatrix addon can help with that, but plenty of pages will then require you to do some fiddling and allow some of the 3rd party stuff before they'll work properly.


I agreed with a lot of what Lunduke said about websites auto-tracking location and heavily relying on it for no real reason when it's not appropriate, and about DRM. However, one thing he fails to mention is that over time a lot of the server-side logic has moved to the client side. There has also been a massive shift in the observable quality of the content on pages. Not to mention that websites now have to have code to deal with devices of all sizes when they used to only have to deal with one.

Frameworks have become more sophisticated, to do more. It makes sense that they would do more, and then you need a more powerful browser engine to run the more advanced frameworks, and then the cycle repeats in an arms race, at least until we reach a point where it's "good enough". Higher quality everything: images, video, etc., also contributes to the bloat.

JavaScript is even more popular now than it's ever been. Rightly so, as it's very powerful for the user experience. HTML for the structure of the page, CSS for the look of the page, JavaScript for the behavior of the page; now with both client-side and server-side logic.

Devices like small phones, large phones, phablets (mini, regular, plus), laptops (13", 15", 17"), desktop monitors (20" to 36"), and lastly large smart TVs. That's a lot of devices to have to deal with for just one site. More code is required, but by no means does that mean you get to be sloppy.

Also, you really shouldn't be using a browser more than 10 years old. The fact that GitHub even worked on FF4 was cool though. 10 years of backwards compatibility is pretty much the standard as everything goes forward. One thing I do agree with, though, is that in web design you are taught from the beginning to make the site usable in plain text form, so that if for any reason the assets fail to load, you can still use the site. Sites that don't adhere to this are following bad practice.


I forgot to mention this earlier, but this is actually a thing I do heavily agree with in his video. A site shouldn't be completely unusable just because the browser is old. That the layout won't work is somewhat obvious (and he showed it with his own very simple website, where it doesn't exactly work either), but I agree the content should still be there. I haven't really looked into CNN's code though, so I'm not sure what they are doing. From the short visit I did, it looked like they only load above-the-fold content (which is not necessarily a bad practice) and load in more articles when the user scrolls a bit. It's not a typical endless-scroll page like some others, but a few articles apparently load on scroll.
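
For context, a purely hypothetical sketch of how such load-on-scroll behavior is commonly wired up (the element id and loadMoreArticles function are made up; I have no idea how CNN actually implements it):

    // Hypothetical load-on-scroll sketch: when a sentinel element near
    // the bottom of the page scrolls into view, fetch the next batch.
    const sentinel = document.querySelector('#load-more'); // made-up id
    const observer = new IntersectionObserver((entries) => {
      if (entries[0].isIntersecting) {
        loadMoreArticles(); // made-up function that fetches and appends articles
      }
    });
    observer.observe(sentinel);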

That is true. Some frameworks have the issue that they need to be included completely before they do anything, even if only a few features are used. Then there are other frameworks that can actually be split by features, but where some developers are just too lazy to build their own version with just the features they use, and load the whole thing regardless.
Now I know that in the latter case they are mostly included from CDNs, at which point this is somewhat "fine" since it stays in the cache. But a lot of developers still host them themselves, where they could easily build their own version... (see the sketch below).
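
As a hedged sketch of the "import only what you use" idea, using lodash's real per-function modules as the example (assuming lodash is installed):

    // Pulling in the entire library:
    const _ = require('lodash');                 // the whole bundle
    // versus importing only the one function actually used:
    const debounce = require('lodash/debounce'); // a small fraction of the size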