Any advice for a beginner looking to get more serious into Local LLMs?

If you want an easy, friendly UI, LM Studio is the way to go

It has a user interface and a model browser you can use to figure out which models are best for your hardware configuration

Just be careful if you use it for work, as the public version is licensed for personal use only

Otherwise, go with Ollama plus Open WebUI, per Sean's suggestion

Wendell wrote a Linux guide. On Mac, you just install the app, start it up, and download a model
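For reference, the Ollama + Open WebUI route boils down to a few commands. This is a minimal sketch based on the projects' documented quickstarts; the model name, port mapping, and volume name are just examples you can change:

```shell
# Install Ollama on Linux (on Mac, just download the app from ollama.com instead)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and chat with it in the terminal (llama3.2 is an example model tag)
ollama pull llama3.2
ollama run llama3.2

# Optional: run Open WebUI in Docker on top of the local Ollama instance.
# The web UI is then reachable at http://localhost:3000
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

The Docker flags map the UI to port 3000, let the container reach the host's Ollama server, and persist chat data in a named volume so it survives container restarts.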

Now if you want to build a server that can run the full model, check out this guide
