Published: Apr 03, 2024

Not on the current edition: this blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.

Apr 2024: Assess

Ollama is an open-source tool for running and managing large language models (LLMs) on your local machine. We've previously talked about the benefits of self-hosted LLMs, and we're pleased to see the ecosystem mature with tools like Ollama. It supports several popular models — including Llama 2, Code Llama, Falcon and Mistral — that you can download and run locally. Once a model is downloaded, you can interact with it through the CLI, API or SDK. We're evaluating Ollama and are seeing early success as it improves the developer experience of working with LLMs locally.
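As a sketch of that workflow, the commands below pull a model, run it from the CLI, and query the local REST API (Ollama serves on port 11434 by default; this assumes Ollama is installed and running, and uses Mistral as an example model):

```shell
# Download a model from the Ollama registry (Mistral used as an example)
ollama pull mistral

# Run a one-off prompt against the model from the CLI
ollama run mistral "Summarize the benefits of self-hosted LLMs in one sentence."

# Query the same model via the local REST API
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Summarize the benefits of self-hosted LLMs in one sentence.",
  "stream": false
}'
```

The API also powers the official SDKs, so the same request can be issued programmatically rather than through `curl`.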
