From c22ec9b0749b7d870785502d728f8d60e53783a0 Mon Sep 17 00:00:00 2001
From: martin legrand
Date: Thu, 27 Mar 2025 19:01:18 +0100
Subject: [PATCH] udpate readme

---
 README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 13fb73a..6d96921 100644
--- a/README.md
+++ b/README.md
@@ -306,12 +306,12 @@ The table below show the available providers:
 | Provider | Local? | Description |
 |-----------|--------|-----------------------------------------------------------|
-| Ollama | Yes | Run LLMs locally with ease using ollama as a LLM provider |
+| ollama | Yes | Run LLMs locally with ease using ollama as a LLM provider |
 | Server | Yes | Host the model on another machine, run your local machine |
-| LM studio | Yes | Run LLM locally with LM studio (set `provider_name` to `lm-studio`)|
-| OpenAI | No | Use ChatGPT API (non-private) |
-| Deepseek | No | Deepseek API (non-private) |
-| HuggingFace| No | Hugging-Face API (non-private) |
+| lm-studio | Yes | Run LLM locally with LM studio (set `provider_name` to `lm-studio`)|
+| openai | No | Use ChatGPT API (non-private) |
+| deepseek-api | No | Deepseek API (non-private) |
+| huggingface| No | Hugging-Face API (non-private) |
 
 To select a provider change the config.ini:
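The hunk ends where the README points the reader at config.ini for provider selection. A minimal sketch of what that selection might look like is below; only the `provider_name` key is confirmed by the table in the patch, while the `[MAIN]` section name and the other keys are illustrative assumptions.

```ini
; Hedged sketch only: provider_name is the key named in the table above;
; the [MAIN] section header and the remaining keys are assumptions.
[MAIN]
provider_name = lm-studio            ; e.g. ollama, lm-studio, openai, deepseek-api, huggingface
provider_model = <model-id>          ; hypothetical key naming the model to load
provider_server_address = 127.0.0.1:1234  ; hypothetical key; 1234 is LM Studio's default local port
```

With a local provider such as `ollama` or `lm-studio`, the value of `provider_name` is expected to match the lowercase spellings introduced by this patch.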