Old RTX 3090 beats new 5080 for AI

AI is here to stay, and it’s far more than just using online tools like ChatGPT and Copilot. Whether you’re a developer, a hobbyist, or simply want to learn some new skills and a little about how these models work, running AI locally gives you more control than the cloud tools do.

Ollama is one of the easiest and most popular ways to dabble with LLMs on your local PC. But unlike ChatGPT, it requires reasonably powerful local hardware of your own. That goes for Ollama especially, because it currently only supports dedicated GPUs. Even if you use LM Studio, which can run on integrated GPUs, you still need decent hardware to get good performance.
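Getting started with Ollama is a matter of a few terminal commands. A minimal sketch, assuming Ollama is already installed and that the model name (here `llama3.2`, a relatively small model) is just an example you can swap for any model in the Ollama library:

```shell
# Pull a model and run a one-shot prompt, but only if Ollama is installed.
if command -v ollama >/dev/null 2>&1; then
    ollama pull llama3.2                 # download the model weights locally
    ollama run llama3.2 "Hello there"    # run a single prompt from the CLI
    ollama list                          # show which models are stored locally
else
    echo "Ollama is not installed; grab an installer from ollama.com first."
fi
```

Running `ollama run` with no prompt instead drops you into an interactive chat session in the terminal.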
