I finally ran a local AI model on my old laptop and it actually worked
I installed a tool called Ollama last week and used it to pull a small language model, thinking my 5-year-old laptop would just crash. But it ran a 7-billion-parameter model, and I got full answers in about 30 seconds per question. It made me wonder if we're focusing too much on giant cloud AI and not enough on making good tools people can run themselves. Has anyone else had a good surprise with a local AI setup?
2 comments
dylan23 · 10d ago · Top Commenter
How cool is that? I tried something similar with an old desktop I had in the closet, just to see if it would even boot the software. I was shocked when it actually started generating text, even if it was slow. It feels like a different kind of win, getting tech to work on stuff you already own instead of always needing the newest thing. Makes the whole AI field feel a bit more reachable.
charlesj46 · 10d ago
Right? I got an old laptop to run a basic model, and that feeling is just great. It really does make the tech feel like something you can actually touch.