Local Ollama URL incorrect and no adjustment. #1

Open
opened 2025-04-09 14:55:24 +00:00 by tcsenpai · 0 comments
Owner

Originally created by @BloodBlight on 12/7/2024

When attempting to use a local instance of Ollama, I get this error:

Failed to generate step after 3 attempts. **Error: 404 Client Error: Not Found** for url: http://192.168.10.50:11434/api/chat

However, manually verifying that the service is up and running works:

curl http://192.168.10.50:11434/api/generate -d '{
  "model": "llama3.1:latest",
  "prompt": "Why is the sky blue?"
}'
{"model":"llama3.1:latest","created_at":"2024-12-07T22:59:12.977900524Z","response":"The","done":false}
{"model":"llama3.1:latest","created_at":"2024-12-07T22:59:12.988189036Z","response":" sky","done":false}
{"model":"llama3.1:latest","created_at":"2024-12-07T22:59:12.998730522Z","response":" appears","done":false}
{"model":"llama3.1:latest","created_at":"2024-12-07T22:59:13.009296528Z","response":" blue","done":false}
{"model":"llama3.1:latest","created_at":"2024-12-07T22:59:13.019687252Z","response":" to","done":false}

It APPEARS that "/api/chat" is not the correct URL, and without digging into the source there doesn't appear to be a way to change it.
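For reference, /api/chat does exist on recent Ollama releases, but it expects a messages array rather than a prompt field. A quick way to check whether this particular server exposes it (reusing the IP and model from the test above; the payload shape follows Ollama's documented chat API) would be something like:

# should stream chat responses if the endpoint exists;
# a 404 here points at the Ollama server rather than multi1's choice of URL
curl http://192.168.10.50:11434/api/chat -d '{
  "model": "llama3.1:latest",
  "messages": [
    {"role": "user", "content": "Why is the sky blue?"}
  ]
}'

If that call also returns 404, the endpoint is genuinely missing on the server side rather than being the wrong URL in multi1.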

I have a fairly generic Ollama install; the only oddity is the use of ROCm, as I have an AMD GPU:

version: '3.8'

services:
  ollama:
    image: ollama/ollama:rocm
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ./Ollama/data:/root/.ollama
    devices:
      - /dev/kfd
      - /dev/dri
    shm_size: '16gb'
    group_add:
      - video
    cap_add:
      - SYS_PTRACE
    security_opt:
      - seccomp=unconfined
    restart: always


  ollama-webui:
    build:
      context: .
      args:
        OLLAMA_API_BASE_URL: '/ollama/api'
      dockerfile: Dockerfile
    image: ghcr.io/ollama-webui/ollama-webui:main
    container_name: ollama-webui
    volumes:
      - ./Ollama/webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 8081:8080
    environment:
      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: always
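Since /api/chat was only added in later Ollama releases, one thing worth checking with this setup is which server version the ollama/ollama:rocm container is actually running (a sketch, assuming the /api/version endpoint that current Ollama builds expose):

# prints something like {"version":"0.5.1"} (hypothetical output);
# builds old enough to predate the chat endpoint will 404 on /api/chat
curl http://192.168.10.50:11434/api/version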

EDIT: Updated the test to use the IP rather than localhost, just to confirm. No change.

Reference: tcsenpai/multi1#1