Published : Apr 03, 2024
Assess: Worth exploring with the goal of understanding how it will affect your enterprise.

Ollama is an open-source tool for running and managing large language models (LLMs) on your local machine. Previously, we talked about the benefits of self-hosted LLMs, and we're pleased to see the ecosystem mature with tools like Ollama. It supports several popular models — including Llama 2, Code Llama, Falcon and Mistral — that you can download and run locally. Once a model is downloaded, you can use the CLI, API or SDK to interact with it and perform your tasks. We're evaluating Ollama and are seeing early success as it improves the developer experience of working with LLMs locally.
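As a sketch of the API workflow described above: Ollama serves a REST API on localhost (port 11434 by default), and a prompt can be sent to a locally pulled model with nothing beyond the Python standard library. The model name `mistral` and the example prompt are illustrative; this assumes the Ollama server is already running and the model has been pulled with `ollama pull`.

```python
import json
import urllib.request

# Ollama's default local API endpoint; adjust if you run it elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_prompt_request(model: str, prompt: str) -> bytes:
    """Serialize a generate request for Ollama's REST API.

    stream=False asks for a single JSON response instead of
    a stream of partial responses.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_prompt_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires a running Ollama server and a pulled model):
#   ollama pull mistral
#   then: generate("mistral", "Explain self-hosted LLMs in one sentence.")
```

The same request shape works from the CLI with `curl`, and the official SDKs wrap this endpoint; running the model locally means no prompt data leaves your machine.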
