I ran it in a VM using virtio, since that's supported by the BSD network stack.
Works fine in a VM; it was one of the things I did while learning QEMU, since I wanted to install Linux on an iBook G3. Ended up just shelving it after I had it ready, because I'm sure as hell never taking an iBook G3 apart again.
Using a VM is beside the point.
macOS was able to transparently run multi-arch binaries, without any (user initiated) VM shenanigans, almost 15 years ago.
It is so transparent that I didn't even know Diablo 1 was a PPC binary when I first played it on my 2009 (Intel) Mac Mini. It was that seamless: just open the app, and the OS took care of it via Rosetta, totally transparently.
This is why I have no doubt that Apple could deal quite easily with Intel going away, or with transitioning to their own ARM-based CPU.
They've done this (extremely well) before, from 68k to PPC, and from PPC to Intel.
Then you run into the problem that they got rid of the engineers who were capable of building that stuff. If they transition to ARM it will NOT go well. Not at all. Not unless they combine Mac and mobile first to get a feel for it, which they don't want to do.
OS X died at 12. We're not going to see any real improvements in it any time soon; at least that's my opinion.
I don’t believe that.
Apple has a bunch of talented engineers (you can't tell me shit like ARKit is made by muppets); it's just that right now they're being directed to focus on end-user shiny rather than back-end. And if they have to, they can hire.
Point being, unlike other vendors, for Apple, UX is everything. There's no reason people within the Linux community couldn't have come up with something similar; the issue is simply that nobody with the skills cares enough to do so, because "I'll just spin up a VM" is a "good enough" solution for the crowd who run Linux (as demonstrated by the thinking in this thread above). The motivation simply isn't there.
i.e., it isn’t a super hard engineering problem. It’s simply an issue of motivation. Apple’s culture has it. Linux’s does not.
It is because of situations like this that the crowd who runs Linux is limited to more technical types.
Even worse, there are some in the Linux community who think that doing stuff like this is some sort of badge of honour because it is complicated. It doesn't have to be this difficult. The fact that it is, is a FAILING of the platform.
The fat binary capability is still there in macOS, and has been there since NEXTSTEP. All Apple would need to do is develop (or BUY/license — at LEAST one already exists; Qualcomm(?) were running x86 on ARM a year or two back) an x86 → ARM JIT compiler, and away you go. And that's assuming they don't build a hardware x86 → ARM front end into an Axx processor. It would not surprise me if AMD already has one, as ARM is most definitely on their roadmap too. The engineering hassles to get this done aren't huge.
Oh, one other relevant thing I forgot.
IIRC, any App Store uploads (at least for iOS, unsure about macOS) have for some years now involved uploading the LLVM intermediate bitcode to Apple, specifically so Apple can recompile for future architectures without needing your source. They just run the final intermediate-to-native compilation step in their own build farm.
So it's entirely feasible that any App Store app could be recompiled by Apple for the new architecture, independently of whether or not the developer bothers to do so.
Any library calls will be native, any App Store apps will probably be native, and an LLVM-based x86 → ARM JIT compiler handles anything else.
Well, AMD processors — except the Athlon and early Phenoms — have actually been RISC cores underneath for years. Ever since the K5, in fact.
We need to split this into another thread if we’re going to continue this conversation though.
"If" Linux dies, which we all know it won't, I would go back to macOS.
I don't think Linux is going away either.
I totally forgot about Redox. It's an impressive effort that slipped my mind. One of the System76 guys was doing it. I should check it out.
He's still doing it, but it's a mature codebase already, so he doesn't have to hammer on it as often.
I too have interest, but I cannot betray my lovely Icaros.
I will use Windows like I always have. Well, actually, I gave Linux a chance, but it was beyond me. I also recently tried writing some simple code with a program someone recommended; I completed the initial task, but when I moved on to what I needed to fix, I never got it right. I'll keep using the program to goof around with, though, since I can do that in a Windows environment. And if Linux died, and the other OSes died, and we went back to a no-tech world, I'd have no problem adjusting: I'd just shoot hoops like I do on a regular basis, and play board and card games like I did in the past, including a kids' game my daughter bought because she had good memories of playing it with me as a child. Heck, there's also table hockey.
I’d just move over to the Linux successor, whatever that may be.
What’s the hypothetical RIP linux scenario?
I can really only see 3 possibilities:
- Something far superior comes along and is adopted en masse as a replacement for Linux, which then falls into obscurity. This replacement would have to be significantly better and fulfill everyone's needs while developing faster than Linux.
- Something sparks a patent war, burying useful parts of the Linux codebase. A new kernel must be built from the ground up or adapted from an existing project, while userspace effectively remains the same.
- Some upheaval causes a mass exodus from the main project; the original project is forked multiple times and all but abandoned. Effective development on the kernel side of the ecosystem stops, which heavily slows and eventually halts adoption of the now-fractured platforms. Essentially, every distro ends up with its own separate kernel based on Linux, and over time these kernels diverge.
I'm curious whether someone else has a creative way Linux could die.
One of the FANG guys — minus Google, because of Fuchsia — buys Linux and makes it closed source.
I don't know if Linux can even be made closed source, though. I don't know all the legal jargon revolving around licensing and stuff like that.
Linux is GPLv2, so it's a long shot.
The GPL is essentially a developer's Ulysses pact against closing the source of a project.
Let's be real, this whole thing with the CoC probably isn't that big a deal. Even if people start getting kicked out of their projects or whatnot, you just form another group that works on the project. More likely, if anyone starts abusing said CoC standards to try to remove people, more and more people will stop working on the project, or they'll remove said rules lawyer from the campaign…
Wait, we're still talking about Linux, right?
Regardless, Google still runs despite the memo; all these inclusivity things always end up with not a lot of change overall. Just a lot of disgruntled nerds. If you're Ubisoft, it just means more minorities and women in HR.
Just to be clear, this thread is NOT about the whole CoC situation; there's a separate thread for that. This is about what Linux users would hypothetically switch to if Linux went away.
I'd just never update and make a couple of backups, then, until I figure out BSD.
Kernels 4.16 and 4.14 are chill with me. No NSA bullshit code and no CoC. Rebase off one of them :3