My 128 GB Framework desktop was delivered today, and I’m trying to decide how to set it up to play with local LLMs etc. (Probably just LLMs at first.) A few additional caveats: I don’t have one of the nice Level1Techs KVMs, and buying this thing has blown my cool gadgets budget until probably the middle of next year, so once I do the initial setup, it’s getting tucked into a corner with power and Ethernet, where I will just access it over my home network. (Fortunately, it’s tiny and light, so moving it somewhere to work at it locally on occasion isn’t the end of the world, but day-to-day, I’m looking for headless operation.)
As stated in the title, I’m new to local AI. (Also, might as well call my cloud AI experience extremely minimal, superficial, consumer-level stuff.) On top of that, my Linux experience is… fairly minimal and mostly outdated.
Context out of the way, 0th question: Windows or Linux? Linux. Next question.
First question: Which distro? On occasions in the last decade or so when I’ve wanted “a Linux box” I’ve gone with Ubuntu. However, from what little research I’ve done so far, I can tell that for this use case I’ll probably want something that keeps closer to the bleeding edge. Fedora and Arch are the two most top-of-mind for me, but I’d be willing to consider other options.
Second question: What software should I use to run the models? LM Studio? llama.cpp? Since I’m new to this (and from what I’ve seen, that doesn’t change, and may actually get “worse,” with experience), I’ll probably be doing a lot of hopping from model to model and adjusting parameters. For recommendations here, keep in mind that I want to run the system headless, and ideally I’d prefer the Framework desktop not spend any resources on a desktop GUI.
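For what it’s worth, here’s the kind of workflow I’ve seen suggested for exactly this headless setup: llama.cpp ships a `llama-server` binary that serves models over an OpenAI-compatible HTTP API with no GUI at all, so everything happens over the network. This is only a sketch under assumptions: the model path is a placeholder, the flag values are illustrative, and building with the Vulkan backend (rather than ROCm) for the Strix Halo iGPU is an assumption, not a recommendation.

```shell
# Build llama.cpp with the Vulkan backend (one option for the Radeon iGPU;
# ROCm is an alternative). Run once over SSH after the OS is installed.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Serve a model headlessly on the LAN. The .gguf path is a placeholder;
# tune -ngl (GPU layers to offload) and -c (context size) per model.
./build/bin/llama-server \
  -m ~/models/your-model.gguf \
  --host 0.0.0.0 --port 8080 \
  -ngl 99 -c 8192
```

From any other machine on the network, you can then point a chat client (or plain `curl`) at `http://<desktop-ip>:8080/v1/chat/completions`, which makes swapping models just a matter of restarting the server with a different `-m`.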
Third question: What other questions am I forgetting to ask?
Thank you for your consideration. (And I apologize if this is already answered well somewhere. Feel free to throw a tutorial link at me and tell me to go away.)