Ollama VM with Ubuntu & Open WebUI

Updated: August 11, 2024

Ollama with a VMware VM

Download the Ollama VM

The pre-configured Ollama VM, built on Ubuntu 22.04 Minimal Desktop with Open WebUI, significantly simplifies installation for users. The VM is already built and available for download on 1fichier, saving anyone who wants to run LLM models the setup work. In this article, we detail the VM configuration, the access information, and the steps to use it effectively.


VM Configuration

Technical Specifications

To ensure optimal performance, here are the recommended specifications for the VM:

  • CPU: 4 threads
  • RAM: 10 GB (adjust depending on your LLM model)
  • Disk: 35 GB
  • Network: Bridge

These specifications allow for efficient management of LLM models while ensuring smooth use of the VM.
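Under VMware, these choices map to a handful of entries in the VM's .vmx file. A sketch of the relevant lines (key names follow VMware's configuration format; note that the disk size is not stored in the .vmx itself but in the .vmdk virtual disk):

```ini
numvcpus = "4"
memsize = "10240"
ethernet0.connectionType = "bridged"
```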

Access Information

The following information is essential for accessing and managing the VM:

  • User: ABCDO
  • Password: abcdo.tn
  • Static IPv4: 192.168.1.122
  • Open WebUI URL: http://192.168.1.122:8080
  • Open WebUI admin: [email protected]
  • Open WebUI password: ABCDO
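If the VM also runs an SSH server (not confirmed in this configuration, but common on such builds), the same credentials can be used from another machine on the LAN. A quick, non-interactive reachability check:

```shell
# Hypothetical SSH check against the VM's static IP. BatchMode disables
# password prompts so the command fails fast instead of hanging.
VM_IP="192.168.1.122"
ssh -o BatchMode=yes -o ConnectTimeout=2 "ABCDO@$VM_IP" hostname \
  || echo "VM not reachable (or SSH not enabled) at $VM_IP"
```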


Available LLM Models

The VM ships with the following language model:

  • qwen:0.5b

This small model fits comfortably within the recommended RAM; additional models can be pulled through Ollama or the Open WebUI once the VM is running.
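From the VM's terminal, the model can also be managed directly with the Ollama CLI (a sketch; the prompt text is illustrative, and the pull is a no-op if the model is already present):

```shell
# Pull and query the bundled model from the command line. Assumes the
# Ollama service installed in the VM is running.
MODEL="qwen:0.5b"
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"                                   # fetch/refresh the weights
  ollama run "$MODEL" "Say hello in one short sentence." # one-shot generation
else
  echo "ollama not found: run this inside the VM"
fi
```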

Downloading and Using the Preconfigured VM

Download

The VM is preconfigured and uploaded to 1fichier to facilitate its installation. You can download it by following this link: Download the Ollama VM.

Using the VM Under VMware Workstation 14

This VM is configured to run under VMware Workstation 14. If you wish to migrate this VM to an ESXi environment, follow the detailed instructions in this article: Convert VMware Workstation to ESXi.

For a video tutorial on this conversion, you can watch this YouTube video: How to Convert VMware Workstation to ESXi.

VM Installation and Configuration

Operating System

The VM runs on Ubuntu 22.04 LTS Minimal Desktop, a lightweight and efficient version of Ubuntu that offers increased stability and performance for critical applications.

Software Versions

The installed software versions are as follows:

  • Open WebUI Version: v0.3.11
  • Ollama Version: v0.3.10

These versions ensure compatibility with the latest features and bug fixes, guaranteeing a smooth user experience.

Network Configuration

The bridge network option allows the VM to behave like a physical device on the local network, with a static IP address set to 192.168.1.122. This facilitates remote access and management via the Open WebUI.
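Inside the VM, Ubuntu 22.04 sets static addressing through netplan. A sketch of what the netplan YAML might contain — the interface name ens33 and the gateway and DNS values are assumptions to adapt to your network (check the actual interface name with ip a), then apply with sudo netplan apply:

```yaml
network:
  version: 2
  ethernets:
    ens33:
      dhcp4: false
      addresses: [192.168.1.122/24]
      routes:
        - to: default
          via: 192.168.1.1
      nameservers:
        addresses: [1.1.1.1, 8.8.8.8]
```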

Management and Use of Open WebUI

Accessing the Interface

To access the Open WebUI interface, open a web browser and go to http://192.168.1.122:8080. Log in with the administrator credentials listed in the access information above.


Open WebUI Features

The Open WebUI offers an intuitive interface for managing LLM models, including:

  1. Model Deployment: Deploy and manage models.
  2. Resource Monitoring: Monitor CPU, RAM, and disk space usage in real-time.
  3. Task Configuration: Schedule and execute specific tasks based on natural language processing needs.

Executing and Managing LLM Models

Model Deployment

To deploy a model, go to the “Models” section of the Open WebUI. Select the desired model (qwen:0.5b) and click “Deploy”. Follow the instructions to configure the model’s specific parameters.

Task Execution

Once the model is deployed, you can execute natural language processing tasks. For example, to analyze text or generate a response, navigate to the “Tasks” section of the interface and select the desired task. Enter the necessary data and start execution.
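The same generation tasks can also be driven without the interface, through Ollama's REST API on its default port 11434 (the endpoint and JSON shape follow Ollama's /api/generate route; the prompt is illustrative):

```shell
# Query the deployed model over HTTP instead of the Web UI. The IP is the
# VM's static address from the configuration above.
OLLAMA_URL="http://192.168.1.122:11434/api/generate"
PAYLOAD='{"model": "qwen:0.5b", "prompt": "Summarise LLMs in one sentence.", "stream": false}'
if curl -s --connect-timeout 2 "$OLLAMA_URL" -d "$PAYLOAD"; then
  echo    # terminate the JSON response with a newline
else
  echo "Ollama API unreachable: is the VM running?"
fi
```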

Performance Monitoring

Use the Open WebUI monitoring tools to check the status of the VM and deployed models. Monitor resource usage to optimize performance and avoid overloads.
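Alongside the Web UI, the usual Ubuntu utilities give a quick resource check from the VM's terminal:

```shell
# Spot-check CPU, RAM and disk from the shell (standard coreutils/procps tools).
nproc               # CPU threads available (4 on the recommended spec)
free -h | head -n 2 # total and used RAM
df -h / | tail -n 1 # root filesystem usage
```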

Installation Steps Used

Step 1: Update Ubuntu

sudo apt-get update -y

Step 2: Install curl

sudo apt-get install -y curl

Step 3: Install Ollama

Before starting, visit the following link to download Ollama: Download Ollama.

curl -fsSL https://ollama.com/install.sh | sh
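After the script finishes, it is worth confirming the install (on Ubuntu the script registers an ollama systemd service, so the daemon should already be running):

```shell
# Verify the Ollama binary and service. STATUS is set either way so the
# check degrades gracefully on machines without Ollama.
if command -v ollama >/dev/null 2>&1; then
  STATUS="installed"
  ollama --version
  systemctl is-active ollama 2>/dev/null || echo "ollama service not active"
else
  STATUS="missing"
  echo "ollama not on PATH: re-run the install script or open a new shell"
fi
echo "ollama: $STATUS"
```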

Step 4: Prepare the Docker Environment

sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc

Add the Docker repository to Apt sources:

echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update

Step 5: Install Docker

sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin

Step 6: Run the Open WebUI Docker Container

sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
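For anyone who prefers Docker Compose, here is a docker-compose.yml sketch equivalent to the docker run command above (same image, host networking, named volume, and restart policy); start it with sudo docker compose up -d:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    network_mode: host
    restart: always
    environment:
      - OLLAMA_BASE_URL=http://127.0.0.1:11434
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```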

Conclusion

The Ollama VM on Ubuntu 22.04 Minimal Desktop with Open WebUI is an ideal solution for anyone looking to manage LLM models efficiently. By downloading this preconfigured VM, you save time and effort on installation and can quickly start working on your projects. The technical specifications, intuitive user interface, and real-time monitoring capabilities make this configuration a strong choice for natural language processing work.

For more downloads, visit this link: VM Ollama.

Info: VM size is over 22 GB.
