Ollama LLMs

Using deepseek-r1 + other LLMs on a MacBook Pro!

Hardware Specifications

As with all AI/ML projects, the hardware you use plays a large role in the results you get. For this project, I'm going to be using a 2024 MacBook Pro with the following specs:

MacBook Pro - Late 2024 (16-inch)

Lower-spec devices will still work, but may take longer to yield similar results.

Ollama

Ollama's Official Website

Ollama is an open-source project that lets users run large language models (LLMs) locally on their own computers. First released in 2023, it has gained popularity as a way to run AI models without relying on cloud services or third-party APIs.

Downloading Ollama is simple on all platforms; check out their website for instructions for your device. Once it's installed, you can chat with a model straight from the terminal with `ollama run`, or talk to it programmatically through its local HTTP API, as in the sketch below.
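
As a minimal example, here's a short Python sketch that sends a single prompt to a locally running Ollama server over its HTTP API. It assumes the Ollama app is installed and serving on its default port (11434), and that you've already pulled the deepseek-r1 model (for example with `ollama pull deepseek-r1` in a terminal); adjust the model name if you're using something else.

```python
# Minimal sketch: query a locally running Ollama server from Python.
# Assumes Ollama is serving on its default port (11434) and the
# deepseek-r1 model has already been pulled (`ollama pull deepseek-r1`).
import json
import urllib.request


def ask(prompt: str, model: str = "deepseek-r1") -> str:
    """Send one prompt to the local Ollama API and return the reply text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON object instead of a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body.get("response", "")


if __name__ == "__main__":
    print(ask("Explain what Ollama does in one sentence."))
```

If you'd rather skip the code entirely, running `ollama run deepseek-r1` in a terminal drops you into an interactive chat with the same model.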