Now, I know we are hip-deep in OS designs from the 1960s, but could we need, or do, a reboot at some point?
This latest Meltdown has alien-face-hugged Intel, and Spectre has put a theoretical ghost over the whole idea of speculative execution finding meaning in kernel memory.
I remember Microsoft making a research OS called Singularity, where just about everything, drivers included, ran in userland.
Are we getting to the point where performance can tank a little so that our data is secure? It seems desktop computing has been riding the wave of performance > all, and data centres of power-to-performance > all.
I just wonder when the security of our data on an OS will become the priority.
Minix uses a microkernel. Communication between services is handled by the microkernel; everything else runs in userland. If a driver or service crashes, only that process goes down, and a watchdog then attempts to restart it. In theory this will not crash the whole system or the kernel, just the userspace component, which can then be isolated if it proves problematic. GNU/Hurd works on a similar principle. Both OSes have been used as research tools and have been in development since the late 80s and/or early 90s.
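The restart-then-isolate idea can be sketched in a few lines. This is a hypothetical userspace supervisor loop, not Minix's actual reincarnation server: it relaunches a failed service up to a limit, then gives up and leaves it isolated rather than taking the system down.

```rust
use std::process::Command;

// Hypothetical watchdog sketch (not Minix's real reincarnation server).
// Restarts a crashing service a bounded number of times, then isolates it.
fn supervise(program: &str, max_restarts: u32) -> u32 {
    let mut restarts = 0;
    loop {
        // Run the service and wait for it to exit.
        let status = Command::new(program)
            .status()
            .expect("failed to launch service");
        if status.success() {
            // Clean exit: nothing to restart.
            return restarts;
        }
        restarts += 1;
        if restarts >= max_restarts {
            // Repeated failures: stop restarting and isolate the service
            // instead of letting it drag the rest of the system down.
            return restarts;
        }
        // Otherwise loop around and restart the failed service.
    }
}

fn main() {
    // "false" always exits non-zero, standing in for a crashing driver.
    let restarts = supervise("false", 3);
    println!("service restarted {} times before isolation", restarts);
}
```

The point of the design is that the kernel never has to trust the driver: the crash is contained to one process, and policy (restart vs. isolate) lives entirely in userland.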
Redox OS is written in Rust and uses a memory-safe programming language to prevent buffer overruns and other memory exploits caused by developer error.
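As a small illustration of what that buys you (my own toy example, not Redox code): where unchecked C indexing would read out of bounds, Rust's checked accessors return an `Option` instead, so the error is handled rather than silently corrupting memory.

```rust
// Illustrative sketch: Rust turns a classic out-of-bounds read into a
// safe, checked result instead of undefined behaviour.
fn last_byte(data: &[u8]) -> Option<u8> {
    // .get() bounds-checks the index and returns None on failure.
    // wrapping_sub keeps the empty-slice case from panicking on underflow.
    data.get(data.len().wrapping_sub(1)).copied()
}

fn main() {
    assert_eq!(last_byte(b"abc"), Some(b'c'));
    // An empty slice yields None rather than an out-of-bounds read.
    assert_eq!(last_byte(b""), None);
    println!("no out-of-bounds access possible");
}
```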
Then you have systems like Plan 9. It was never meant as a mainstream system, but instead as a researcher's playground.