The World Wide Web Sucks - Lunduke

That's a consumer problem, not a web problem. People like having it "easy", and "easy" means having everything in one place. Understandable for the regular consumer, though.

WebAssembly is a W3C project; Mozilla is just a small part of the group working on it.

True. I've been trying to de-googlify myself, but I still can't ditch my account because of Android and mail.

I just enumerated the good things; there are commas separating 'em :stuck_out_tongue:

Ah well that explains it :smiley:

Yeah, what Mozilla is currently working on is making Firefox multi-process. Gonna take forever to complete, though.

Actually, Firefox has been multi-process for quite some time.

Yeah, but they're doing a rewrite. I can't find where I read this, but they said to expect something in FF 56. There's also Project Quantum, a new web engine from Mozilla.

Quantum is about the new engine; Electrolysis is the multi-process project. There was a discussion about it >>here<< on the forum. :wink:


You certainly have a lot of opinions about Bryan's video for someone who admittedly hasn't watched it.

Personally, I don't think that Bryan went far enough. The web has literally become both a tool for mass surveillance and a security minefield. I don't want my browser secretly executing arbitrary code in the background (particularly third-party code!), and there is absolutely no reason why the web should crash and burn if I, in an attempt to preserve some modicum of security, try to prevent that.

Just because something is possible doesn't make it a good idea. It seems that the mindset that gave us notoriously insecure webcams is the same pervasive mindset of web and browser developers. Security, if in fact it is ever considered, seems to be merely an afterthought at best. Mary Shelley famously foresaw this 200 years ago, when she wrote of a miraculous invention whose creator, only after having loosed it upon the world, recognized the evil he had given life to. It would seem that some of our web and browser developers are still in denial.

There's also https://servo.org/, and they're porting some stuff from it into Gecko.

Not so much about the video specifically, only in general.

Might change when I get to watch the video, who knows :stuck_out_tongue:

Not sure what you mean by that. Plugins? Because the whole idea is that it's supposed to work without plugins. Of course it doesn't, but that's a different story.

But not entirely, though, right? I'm not really using Firefox anymore; we just have it at work (the ESR version, though), and I have two processes: one for the normal window and one for the private window. Is that a setting or a flag? Or is it not rolled out for everyone?

I mean that whenever you load a page with JS, you execute arbitrary third-party code.

I agree, the philosophy of "why not?" needs to be turned into "why tho?"

Yet you are using the web atm, and in fact executing tons of JS.


While I'm not a fan of the current state of the web myself, it would be a shame if the web had never existed. Without it I would never have learned to code, I wouldn't be able to talk with tons of people, and I wouldn't be able to shitpost.

I agree about the spying; hell, I use uBlock and Privacy Badger, and sometimes even Tor (though I see no point in using Tor when I'm going to authenticate in the end). But overall the web can be a great thing. We just need to try and win it back.

And in the end, the worst problems aren't efficiency (or the lack thereof) and RAM usage. It's exactly what you're talking about. The biggest problem with the web is not the web, but the people who use it. We, as users, let it become centralized, which in turn let it be used for surveillance (or at least made that easier). We traded privacy for convenience.

We, as devs, got lazy with software. We default to "hardware will get faster" and "works fine on my machine with 16 GB of memory" instead of actually looking into a problem, profiling, and optimizing. I get why, though: in the 80s we had constraints; now, not so much.

So in the end, even though I have a beef with the web (hell, even the whole internet) on a technical level, I have much more of a problem with it on a political level, and that is only fixable by people, not tech.


Hm, interesting... might be an ESR thing then, who knows.

I might be wrong here, but I think everything that's in {} is a thread, not a process.

Consider this zsh process with suspended processes.

And this description from pstree man:

       pstree  shows running processes as a tree.  The tree is rooted at either pid or init if pid is omitted.  If a user name is specified, all process trees rooted
       at processes owned by that user are shown.

       pstree visually merges identical branches by putting them in square brackets and prefixing them with the repetition count, e.g.

           init-+-getty
                |-getty
                |-getty
                `-getty

       becomes

           init---4*[getty]

       Child threads of a process are found under the parent process and are shown with the process name in curly braces, e.g.

           icecast2---13*[{icecast2}]

Derailing the thread:

Fucking Linux, man. Confusing as shit :stuck_out_tongue:
I think this answers why threads have PIDs: https://stackoverflow.com/questions/9154671/distinction-between-processes-and-threads-in-linux
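You can actually watch that distinction from userland. Here's a quick Node.js sketch (assuming a Node version that ships worker_threads): a worker thread reports the same process.pid as its parent, while a forked child gets its own.

    // Threads share their parent's PID; child processes get a new one.
    const { Worker, isMainThread } = require('worker_threads');
    const { fork } = require('child_process');

    if (isMainThread && !process.env.FORKED) {
      console.log('main pid  :', process.pid);
      new Worker(__filename);                     // same process, new thread
      fork(__filename, { env: { FORKED: '1' } }); // brand-new process
    } else if (isMainThread) {
      console.log('forked pid:', process.pid);    // differs from main
    } else {
      console.log('worker pid:', process.pid);    // same as main
    }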

Servo is actually a separate project (not part of Firefox), but since it was created by Mozilla, they decided to use some of its features in the new Firefox engine. :slight_smile:


Yeah I'm so lazy I know :stuck_out_tongue:

That was very enjoyable to watch. Much better than most of his other videos.

I tried to load bestbuy.com in the Tor browser and it doesn't even load; just an Access Denied message.

Am I the only one who thinks that going to prison for downloading copyrighted music is wrong?


No, you're not, and I don't think that either. But proper penalties aren't the responsibility of "the web" either. DRM and penalties for copyright infringement are two very different things. Not to mention that infringement is handled very differently in different countries, and I'm pretty sure you don't go to jail for one song or one album :slight_smile: The chances they find you for one download are rather slim in the first place, and even if they do, there are other ways :slight_smile:

That's fine, who isn't :slight_smile: But regarding the last sentence in your quoted post:

The new thing in 54 will be enabling 4 content processes by default.
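If you want to poke at it yourself, it's just prefs. A user.js sketch (assuming a non-ESR Firefox where e10s is available; check about:support under "Multiprocess Windows"):

    // user.js - enable Electrolysis and bump the content process count
    user_pref("browser.tabs.remote.autostart", true); // multi-process on
    user_pref("dom.ipc.processCount", 4);             // content processes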

Gonna take a year or so until it hits ESR then, so that explains it :slight_smile:

On reflection, I think I should clarify that I do not believe that web and browser developers are incompetent, don't care, are unable to create a safe, secure, and pleasant browsing experience, or are generally bad people. Rather, some of their collective decisions have been irresponsible. We don't need half a gigabyte (and growing!) of JavaScript to display a couple of news headlines. Likewise, building browsers that inherently trust all JavaScript they encounter and then enthusiastically and autonomously execute it is a recipe for disaster. Certainly, browser architects can't be held responsible for blackhat shenanigans, but they need to recognize that they have set the stage for an orgy of mischief. Listen, if I were to walk into a bar at midnight and start handing out hand grenades to the patrons, would anyone be surprised if there were some unintended consequences? The law would certainly hold me to a higher standard, and it is my assertion that the devs should likewise hold themselves to a higher standard.

We currently have regular websites and mobile websites. Why can't we have plain HTML websites and, for the more adventurous, "enhanced" websites with enhanced RAM and CPU requirements, along with the attendant sluggishness and enhanced security risks? Enhanced websites should not be the default; we should have to opt in to experience the web in all of its bloat and "splendor."

And while we are at it, why are there ten different bots (that I know of!) tracking every single character that I type into this post?!?!?! AND there are another five bots (that I know of!) that are watching, but haven't yet decided if they are going to track me?!?!?! Again, the stage has been set for mischief and lo and behold, mischief is upon us!

I agree, but that's on web devs, not browser devs.

Don't most browsers run JS in a sandbox?

That's more or less what I'm talking about: people should stop and think before making a website. Do they need a web page (or a collection of them), or a web app?

Care to elaborate about the bots? All I know about are tracking cookies.


As reported by Privacy Badger:
googleads.g.doubleclick.net
static.doubleclick.net
www.googleadservices.com
r18---sn-aigllnde.googlevid...
r5---sn-q4f7sn7s.googlevide...
www.google.com
img.youtube.com
s.youtube.com
www.youtube.com
i.ytimg.com
img.gta5-mods.com
gtaforums.com
www.rockpapershotgun.com
cdn.edgecast.steamstatic.com
erolives.com

OK so... I just got around to watching the video (and I know he exaggerates a lot in everything in the history of ever, so... yeah, I know). Sorry for the length and rambling; too lazy to keep it short.

OK, so he's taking an average over "all" websites tracked by whatever service he used to get his data. The problem with that is that it's not representative at all. There are websites whose individual pages grow immensely over the years, but there are also sites that stay in about the same range for years. Social media sites especially grow like crazy in page size because people upload their stupid images everywhere. Of course that skews the overall average. Also, do those services track sites with video and audio content? Because then the whole statistic becomes ridiculous. But he doesn't state his source other than "many websites", so who knows...

In addition to that, if you do have a slow connection, you can disable the immediate download of images, for example, and load them on demand (one more reason why developers should finally set width/height on <img>).
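Loading on demand doesn't take much either. A minimal sketch using IntersectionObserver, assuming the markup puts the URL in a data-src attribute instead of src:

    // Lazy-load images: the browser only downloads an image once
    // it scrolls near the viewport.
    const io = new IntersectionObserver((entries, observer) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        entry.target.src = entry.target.dataset.src; // triggers the download
        observer.unobserve(entry.target);            // each image only once
      }
    });
    document.querySelectorAll('img[data-src]').forEach(img => io.observe(img));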

The web evolves and connections get faster overall, so why not use that? There's also a limit on how far you can compress content: minifying and gzipping is done on basically every larger website already, and media content in particular can only be compressed so far. And spoiler alert: people like seeing media content.
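To see that limit for yourself, a small Node.js sketch (zlib ships with Node; the random bytes stand in for already-compressed media like JPEG or H.264, which is similarly high-entropy):

    // Repetitive text compresses extremely well under gzip;
    // high-entropy data (i.e. already-compressed media) barely moves.
    const zlib = require('zlib');
    const crypto = require('crypto');

    const text  = Buffer.from('<div class="post">hello</div>'.repeat(1000));
    const media = crypto.randomBytes(text.length);

    console.log('text :', text.length,  '->', zlib.gzipSync(text).length);
    console.log('media:', media.length, '->', zlib.gzipSync(media).length);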

Yep, my point from earlier still stands, and he even says so himself.

OK, so his examples give a little context. However, that's more an issue of lazy programmers than of "the web", because "the web", or rather the W3C, has a spec for obtaining geolocation data from the user, with their knowledge, that works in a very similar way to the ones used in his examples.
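That spec is basically a one-liner to use, and the browser has to ask the user for permission first. A minimal sketch:

    // The W3C Geolocation API: the browser prompts the user before
    // any coordinates are handed to the page.
    navigator.geolocation.getCurrentPosition(
      pos => console.log(pos.coords.latitude, pos.coords.longitude),
      err => console.log('denied or unavailable:', err.message)
    );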

Yeah, point from earlier definitely still stands. Don't use browsers from the stone age maybe. On the one hand he advocates that people should update their shit, because... you know, security... but on the other hand he's complaining that old shit doesn't work? Yeah. That totally makes sense.

And every sane developer (and I'm pretty sure CNN isn't 100% insane) will have some kind of support for older browsers (recommended/best practice is about 3 versions back, but usually derived from analytics on the most-used browser versions), either through pre-deployment and/or build steps, or through polyfills. But you can't have polyfills for everything, because... you know, "Web page size sucks!" already.
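For anyone wondering, a polyfill is nothing magical: a feature check plus a fallback. A minimal sketch for Array.prototype.includes (real polyfills also handle NaN and fromIndex; this one deliberately doesn't):

    // Sketch of a polyfill: only define the method if the
    // browser doesn't already ship it.
    if (!Array.prototype.includes) {
      Array.prototype.includes = function (needle) {
        return this.indexOf(needle) !== -1; // close enough, minus NaN handling
      };
    }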

And oh well, who knew Mosaic wouldn't display a page from 2017? What a total surprise. Who outside the "nerd" community even knows that thing anymore, let alone uses it?

Of course you "can" build a website that works on those, but why would you if no one's using them? Just funny that he didn't talk about IE there :slight_smile:

It's not that developers chose for their sites not to work in such "restraint circumstances"; it's just that no one uses those things anymore.

"The W3C is the gatekeeper to all things web. They set the standards, they say these are the standards for HTML, for JavaScript".

It's funny, because he should know better. HTML5 is not a standard, it's a specification, because the W3C is not a standardisation organisation. And the W3C has nothing to do with JavaScript: JavaScript is technically ECMAScript, which is maintained by Ecma International, not the W3C (Brendan Eich created the language). Also, "JavaScript" is a trademark of Oracle.

Since he's specifically talking about the Encrypted Media Extensions here, well... here's the thing. We have DRM already; the difference is that it isn't specced by the W3C, but that doesn't change the fact that it's already there. And for his first quote it even makes sense. Certain content needs protection before it can be spread over the web at all: music and video services need DRM to even be allowed to distribute their content. As I see it, there's nothing wrong with that. Content creators should have the right to protect their property, and if they couldn't, they wouldn't allow it to be distributed in the first place.
Yes, it does contradict the second quote. But that quote isn't exactly from yesterday either; things change over time, and there's nothing wrong with that. And just because a site has some kind of DRM doesn't mean it's impossible to access. On the contrary, for the reasons above, the DRM is what makes the content accessible at all.
As for his third quote: he himself says that "DRM is just a broad concept", not a specific implementation. DRM doesn't have to restrict based on geographical location. It doesn't have to restrict based on network access (whatever the hell he means by that). It doesn't have to be a paid service. The software issue is exactly the one EME is trying to address: instead of plugins that are browser-specific and might not exist for browser X, there's one single spec used in all browsers. The same goes for the operating system. It doesn't have to restrict hardware either; actually, the only thing I know of that restricts hardware is Netflix's 4K streaming, and that's mostly so people don't whine about a bad viewing experience when using... I dunno, Mosaic as a browser, for example.
And no, restricting is not the "entire point" of DRM. The "entire point" of DRM is protecting property; how that's implemented, and whether it works or not, is a different story.

And they didn't decide to throw anything out the window. So far, EME is only a proposed recommendation, not a finished standard.

Also, spoiler alert: even if the W3C rejected the proposal, it could still be implemented in browsers; it really doesn't matter. It might not make it into Firefox, but it surely will in Chrome and Edge. Firefox shipped the blink element for years, and Chrome supports certain CSS properties that aren't in the official spec.
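For reference, here's what EME actually looks like from a page's point of view. A hedged sketch probing for Widevine (com.widevine.alpha is the CDM that Chrome and Firefox ship):

    // Encrypted Media Extensions: ask the browser whether a DRM
    // key system is available before attempting protected playback.
    navigator.requestMediaKeySystemAccess('com.widevine.alpha', [{
      initDataTypes: ['cenc'],
      videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }]
    }])
      .then(access => console.log('key system OK:', access.keySystem))
      .catch(() => console.log('no usable CDM for this key system'));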


I feel like every time I hear him talk he's just there to hate on everything just so he can hate on something. I know he exaggerates a lot, but it still annoys me every time I see it, even when he's trying to be nice. Maybe it's just his voice and speech pattern for me.

I mean he could quit talking about how bad the code is and make it better? That could be a thing.

Also cnn.com drinking game anyone? Take a shot every time he says cnn.com and chances are people are gonna pass out within the first minute.
