Forget ChatGPT: why researchers now run small AIs on their laptops

Open-weight large language models (LLMs) that run locally are gaining traction. Researchers are turning to them to avoid usage costs, protect privacy and ensure reproducibility. Cloud-based models still hold advantages, but the rapid progress of local LLMs suggests they could soon be good enough for most research use cases.
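
For a sense of what "running locally" can look like in practice, here is a minimal sketch using the Hugging Face `transformers` library in Python; the model name and prompt are illustrative choices, not the specific setups discussed in the article, and many researchers instead use tools such as Ollama or llama.cpp.

```python
# Minimal sketch: text generation with a small open-weight model run
# entirely on a local machine via Hugging Face transformers.
# The model name below is an illustrative example of a small chat model.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # downloads once, then runs offline
)

prompt = "List three reasons a researcher might run an LLM locally."
result = generator(prompt, max_new_tokens=80, do_sample=False)
print(result[0]["generated_text"])
```

After the initial model download, no data leaves the machine, which is the privacy and reproducibility argument in a nutshell.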

Posted on 26 Sep 2024