Today I unplugged my router’s RJ45 cable for about five seconds to move a cable around. I was prepared to restart all my SSH sessions and relaunch my zfs send. But the router did not even detect that its port had been unplugged: the Ethernet port’s LEDs on the router stayed constantly lit, while the ones on the switch went dark. When I plugged the cable back in, the LEDs on the router stopped lighting up, then the switch’s LEDs turned on, followed by the router’s. Yes, the router’s LEDs only turned off after I plugged the cable back in.
My SSH connections did not flinch, my zfs send did not bat an eye, and my VPN connection showed no events at all. The outage was so short that, to everything on my network, it looked like just a few ordinary lost packets, which happens more often than I’d like: about 1.3% loss out of 15k packets according to MTR against the gateway’s IP, and about 3.4% against the VPN’s internal IP address, tho I don’t notice it at all during my daily activity.
It is quite fascinating how resilient the software we use is to the unreliability of the internet. And the internet is unreliable. The bad part is that most people nowadays believe the internet is not only reliable, but that it will always be up. We see things like “whatever goes on the internet, stays on the internet,” but that is only true because there are still people who treat the internet as the unreliable mess that it is and make surprise decentralized backups of online data.
If more people treated the internet as it should be treated, like the unreliable medium that it is, we would not have such difficult times.
> strong men create good internet
> good internet creates weak men
> weak men create bad internet
> bad internet creates strong men
The software protocols we have today were created by absolute chads. Today’s soyboys are making horrible software that becomes unresponsive the moment the connection from the browser to the server gets severed. I can’t wait for the day when people start decentralizing again and get rid of the abominations that are internet 2.0 and 3.0.
For real, if people planned for the day the internet goes down, nobody would care if a hurricane hit and took out the infrastructure. Back in the day, people just turned on their radios and picked up broadcast waves from all around. In the modern day, everyone should have a solar-powered router that forms a mesh network with all the other routers around it, through protocols like B.A.T.M.A.N. or BMX6.
But we need discovery protocols for the services around us. Sure, you can run nmap around and probe ports 80 and 443, but that ain’t really gonna cut it. We should have a protocol in which you broadcast a packet with a small TTL, then get unicast replies back describing what services are available nearby. Of course, each reply would come from a server set up by an admin who wants their service to be public. It would, for the most part, replace search engines for discovering nearby stuff.
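The shape of this idea (probe out, service list back) is roughly what zeroconf systems like mDNS/DNS-SD already do, and it can be sketched in a few lines of socket code. This is a toy, not a protocol: it runs entirely over localhost unicast so it’s self-contained, whereas the real thing would send the probe to the LAN broadcast address with a small IP TTL. The probe payload, port choice, and service list here are all made up for illustration.

```python
# Toy version of the "broadcast a probe, get a unicast service list back" idea.
# Runs on localhost for demonstration; a real deployment would sendto() the
# LAN broadcast address with SO_BROADCAST set and a small TTL.
import json
import socket
import threading

DISCOVER_MSG = b"DISCOVER?"  # hypothetical probe payload

def beacon(sock, services):
    """The admin's beacon: answer one probe with a JSON service list."""
    data, addr = sock.recvfrom(1024)
    if data == DISCOVER_MSG:
        sock.sendto(json.dumps(services).encode(), addr)

# Beacon side, advertising two made-up services.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # OS picks a free port
port = server.getsockname()[1]
services = [{"name": "wiki", "port": 80}, {"name": "files", "port": 443}]
threading.Thread(target=beacon, args=(server, services), daemon=True).start()

# Discoverer side: send the probe, print whatever answers.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2)
client.sendto(DISCOVER_MSG, ("127.0.0.1", port))
reply, _ = client.recvfrom(4096)
for svc in json.loads(reply):
    print(f"found {svc['name']} on port {svc['port']}")
```

The nice property of this design is that discovery needs no central registry: anyone on the segment who answers the probe is, by definition, nearby.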
Well, I can dream. I am too noob to write such a good piece of software; I could probably only write insecure exploitware™. But I know that, even if I tried, it would at least not be soyftware™.