Hey lads, I’m trying to get back into Linux a little by setting it up as a desktop environment on my home rig for web browsing, documents, etc., while Windows remains for gaming and application compatibility. However, it’s important to me that I can run them simultaneously rather than having to reboot to switch between them. Ideally I’d do this with a virtual machine, but I’m struggling to get performance up to an acceptable standard for daily use - VirtualBox is very choppy, VMware a little better. I’ve connected over RDP and VNC as suggested elsewhere, but that made performance worse. I’m certain the settings are correct - guest additions are installed, and plenty of resources are provided on a machine that shouldn’t be struggling (R7 1800X, 24GB RAM, 500GB VM-exclusive SSD + 500GB NVMe for the host, R9 390X).
I was wondering if it’s possible to dedicate a second GPU exclusively to the VM in a Windows 10 host environment? That should help performance a lot. I’m unsure whether it’s supported from Windows’ perspective - I’ve found plenty of threads with conflicting information, some saying support was even added in the recent Creators Update. I could easily pick up a $40 graphics card from Amazon, or even use an older one I have lying around if I could get it to work. Performance doesn’t need to be high end, but I need to be able to at least watch YouTube.
Are VMs in Windows 10 completely pointless if you want to do anything as intensive as watching a 480p video? I just want to be sure there’s no way I can get this set up, because aside from the performance, this is completely ideal. My only other option would be to dedicate a whole system to the monitor and use Synergy or something for mouse/keyboard sharing, I guess.
I don’t think the video card is really the issue - I don’t have any problems using just the iGPU on an Intel chip.
What OS are you trying to run in the VM?
Have you updated something like ffmpeg to allow HD playback in Firefox (not everyone has to do this, but I did), or have you tried Chrome in the VM?
Did you give the machine enough RAM in the System tab of VirtualBox? 8GB would be great considering you have 24, but surely 4 would work as well.
Have you set the number of processors in that System tab as well? With your CPU you could easily give it 4 cores.
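For what it’s worth, both of those can also be set from the command line with VBoxManage while the VM is shut down (the VM name "linux-vm" below is just a placeholder - swap in your own):

```shell
# Give the VM 8 GB of RAM and 4 CPU cores ("linux-vm" is a placeholder name)
VBoxManage modifyvm "linux-vm" --memory 8192 --cpus 4

# Confirm what the VM is currently configured with
VBoxManage showvminfo "linux-vm" | grep -E "Memory size|Number of CPUs"
```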
Are you sure? I hope you’re right.
I’ve been messing with Ubuntu Budgie and ElementaryOS for the time being; I plan to play with more in the future.
I’ve tried Chrome, no change. Even moving a terminal window around the screen is choppy, while moving a window around in Windows is completely buttery.
I’ve been giving it 12GB, since most of its use will be productivity tasks, and the remaining 12GB is still enough for gaming in Windows. I’ve tried other configurations too; performance doesn’t seem to be affected much by the amount.
I’ve definitely checked that. Virtualisation is enabled in the BIOS, and I’ve given the VM 8 of my 16 CPU threads, the max (128MB) of video memory, and 12GB of RAM. I’ve messed with all these settings, but currently I’m using the KVM paravirtualisation interface, with PAE/NX, AMD-V, Nested Paging, and host I/O cache all enabled.
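In case anyone wants to replicate this config, here’s roughly what it maps to in VBoxManage terms (VM name "linux-vm" and the storage controller name "SATA" are placeholders - check yours with `VBoxManage showvminfo`):

```shell
# CPU, RAM and video memory allocation
VBoxManage modifyvm "linux-vm" --cpus 8 --memory 12288 --vram 128

# Acceleration settings: PAE/NX, AMD-V (hwvirtex), nested paging, KVM paravirt interface
VBoxManage modifyvm "linux-vm" --pae on --hwvirtex on --nestedpaging on --paravirtprovider kvm

# Host I/O cache is set per storage controller, not per VM
VBoxManage storagectl "linux-vm" --name "SATA" --hostiocache on
```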
I should also note I’m using a 1440p monitor, lowering the resolution seems to improve performance, but again it’s nothing like as smooth as Windows, especially for videos. The 1440p monitor is also key for the VM, as they’re both intended for productivity.
I don’t think Budgie comes with a compositor, so it’s going to look choppy even on bare metal. You need something with a compositor (or you need to add one) to smooth the screen out. KDE, Cinnamon, and I think GNOME come with one by default; lighter desktops tend to leave it out because of the overhead. For instance, the new Lubuntu Next with LXQt includes compton, and once it’s enabled it runs almost as heavy as Cinnamon (~600MB from my testing, vs ~120MB without compton).
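If you want to test whether the compositor is what’s missing, you can install and toggle compton by hand (assuming an Ubuntu-based guest; package name may differ elsewhere):

```shell
# Install compton (Ubuntu/Debian package name)
sudo apt install compton

# Run it with the GLX backend and vsync, then move some windows around
compton --backend glx --vsync opengl-swc &

# Later, kill it and compare the difference
killall compton
```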
With a compositor you get all of the flashy effects - transparency, windows gradually opening and closing instead of popping in and out of existence - and little or no screen tearing when scrolling or playing videos. Some of this may also depend on which graphics card and driver you use; there are tons of videos out there of people fixing screen tearing on Linux with Nvidia cards.
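For reference, the usual Nvidia tearing fix those videos show is forcing the full composition pipeline via the proprietary driver (bare metal only - this won’t apply inside a VM):

```shell
# Force the composition pipeline on an Nvidia card with the proprietary driver
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```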
I’ve tried out a whole load of them now, including KDE on openSUSE. It doesn’t seem to change things. I’m using an AMD R9 390X, but I don’t think it matters, as the VM won’t even see it - the guest uses VirtualBox/VMware display drivers, not AMD/Nvidia’s.