FAQs about Ollama AI Installation

Ollama AI on Windows 10

1. What are the default installation paths for Ollama on Windows?

# Ollama installation path
C:\Ollama\ollama.exe

# Python environment path
C:\python311

# Virtual environment path
C:\venv
[Screenshot: Windows Ollama installation path]
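To confirm the installation on your own machine, you can invoke the binary directly (a quick check, assuming the default installation path shown above):

```shell
# Print the installed Ollama version
C:\Ollama\ollama.exe --version

# List the models already downloaded to this machine
C:\Ollama\ollama.exe list
```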

2. How do I install an LLM model?

Run the `ollama` command from a cmd or PowerShell console. The llama2 model is installed by default; download and install other models as needed.

[Screenshot: Ollama in the cmd console]
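For example, to fetch and try out an additional model (the model name here is illustrative; any model from the Ollama library is handled the same way):

```shell
# Download a model from the Ollama library
ollama pull mistral

# Start an interactive chat session with it
ollama run mistral

# Remove a model you no longer need to free disk space
ollama rm mistral
```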

3. How do I start the Ollama service?

The Ollama services are managed with supervisorctl.exe. Its configuration files are in the C:\supervisor\conf directory, and two services are started by default: the Ollama API and the Ollama WebUI.

[Screenshot: supervisorctl on Windows]
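Typical supervisorctl commands for managing the two services are shown below (the service name `ollama` is an assumption for illustration; check the configuration files in C:\supervisor\conf for the exact names used on your image):

```shell
# Show the status of all services managed by supervisor
supervisorctl.exe status

# Restart or stop a single service (replace the name with the one from your config)
supervisorctl.exe restart ollama
supervisorctl.exe stop ollama
```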

4. How do I access the Ollama Web UI?

If you need to use the Ollama WebUI, open http://127.0.0.1:8080/auth/ in a browser.

[Screenshot: Ollama WebUI on Windows]
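You can also check both services from the command line (a sketch: the WebUI address is from the section above, and the API is assumed to listen on Ollama's default port 11434 unless the image reconfigures it; the model name is illustrative):

```shell
# Check that the WebUI is responding (prints the HTTP status line and headers)
curl -I http://127.0.0.1:8080/auth/

# Send a quick generation request to the Ollama API
curl http://127.0.0.1:11434/api/generate -d "{\"model\": \"llama2\", \"prompt\": \"Hello\"}"
```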