Local AI Models Setup

Ollama Setup for Local AI

To use local Ollama models with our application, you need to configure CORS (Cross-Origin Resource Sharing). This allows our web application to interact with the locally running Ollama API.

Setup Instructions:

  1. Download and install Ollama for your operating system
  2. Download the CORS configuration script for your operating system (links below)
  3. Run the script with administrator privileges
  4. Restart Ollama if necessary

Note:

The script configures Ollama to accept requests from any origin, which our application needs in order to reach the local API from your browser.
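
For reference, the usual mechanism behind such a script is Ollama's documented OLLAMA_ORIGINS environment variable, which controls which origins the API accepts. A minimal manual sketch, assuming you start Ollama yourself from a shell:

# Allow any origin for this session, then start the server.
# OLLAMA_ORIGINS is Ollama's setting for allowed CORS origins.
export OLLAMA_ORIGINS="*"
ollama serve

The downloaded scripts are expected to make an equivalent setting persistent for your operating system, so you don't have to export it on every start.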

How to Use:

For Windows: Run the downloaded .bat file as administrator.
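
If you would rather apply the setting by hand than run the .bat file, a rough equivalent (an assumption about what the script automates, not its exact contents) is to persist the variable from a Command Prompt and restart Ollama:

:: Persist OLLAMA_ORIGINS for the current user; newly started processes pick it up.
setx OLLAMA_ORIGINS "*"
:: Quit Ollama from the system tray, then start it again to apply the change.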

For Linux/macOS: Open a terminal, navigate to the directory containing the downloaded file, and run:

chmod +x ollama-cors_linux-mac.sh
sudo ./ollama-cors_linux-mac.sh
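
On Linux with a systemd-managed Ollama service, or on macOS with the desktop app, the script's effect can also be reproduced manually; a sketch under those assumptions:

# Linux (systemd): add Environment="OLLAMA_ORIGINS=*" in the override file,
# then restart the service.
sudo systemctl edit ollama.service
sudo systemctl restart ollama

# macOS (desktop app): set the variable for GUI apps, then restart Ollama.
launchctl setenv OLLAMA_ORIGINS "*"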

After successful CORS configuration, you'll be able to use local Ollama models in our application.
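
One way to verify the setup is to call the API with an Origin header and look for a permissive CORS response. A quick check, assuming Ollama's default port 11434 and a hypothetical origin:

# List local models while simulating a cross-origin request.
curl -is -H "Origin: https://app.example.com" http://localhost:11434/api/tags | head -n 15
# Success: an HTTP 200 status and an Access-Control-Allow-Origin header in the output.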

Additional Information

Using local Ollama models provides several benefits:

  • Enhanced privacy for your data
  • Reduced latency when generating responses
  • No API limits or usage fees

Recommended models to use with DreamAuto:

  • ollama pull gemma2:2b - a small, fast model suitable for most tasks (see the example below)
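
For example, to fetch the recommended model and confirm it responds locally (the prompt is only an illustration):

# Download the model, then run a one-off prompt as a smoke test.
ollama pull gemma2:2b
ollama run gemma2:2b "Say hello in one sentence."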