Mirror of https://github.com/tcsenpai/pensieve.git, synced 2025-06-06 03:05:25 +00:00
docs: update docs
This commit is contained in:
commit 7a489ec373 (parent 7aa2fa6655)
README.md (10 lines changed)
@@ -122,13 +122,11 @@ Before deciding to enable the VLM feature, please note the following:
 - Enabling VLM will significantly increase system power consumption
 - Consider using other devices to provide OpenAI API compatible model services

 #### Enabling Steps

-1. **Install Ollama**
+#### 1. Install Ollama

 Visit the [Ollama official documentation](https://ollama.com) for detailed installation and configuration instructions.

-2. **Prepare the Multimodal Model**
+#### 2. Prepare the Multimodal Model

 Download and run the multimodal model `minicpm-v` using the following command:
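As background for the step above (not part of this commit): once `minicpm-v` is served by Ollama, a caption request can be issued through an OpenAI-compatible chat endpoint, which is what the README's note about "OpenAI API compatible model services" refers to. A minimal sketch of building such a request payload — the endpoint path `http://localhost:11434/v1/chat/completions` and the exact payload shape are assumptions about Ollama's OpenAI-compatible API, not taken from this diff:

```python
import base64
import json

def build_vlm_request(image_bytes: bytes, model: str = "minicpm-v",
                      prompt: str = "Describe this screenshot") -> dict:
    """Build an OpenAI-style chat payload carrying one image.

    Assumption: the server accepts `image_url` content parts with a
    base64 data URI, as OpenAI-compatible chat APIs commonly do.
    """
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }

# The payload would be POSTed to e.g. http://localhost:11434/v1/chat/completions
payload = build_vlm_request(b"\x89PNG...", prompt="Describe what this service is")
print(json.dumps(payload)[:60])
```

This only constructs the request; sending it requires a running Ollama instance with the model pulled, as described in the steps above.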
@@ -138,7 +136,7 @@ ollama run minicpm-v "Describe what this service is"

 This command will download and run the minicpm-v model. If the running speed is too slow, it is not recommended to use this feature.

-3. **Configure Memos to Use Ollama**
+#### 3. Configure Memos to Use Ollama

 Open the `~/.memos/config.yaml` file with your preferred text editor and modify the `vlm` configuration:
@@ -163,7 +161,7 @@ default_plugins:

 This adds the `builtin_vlm` plugin to the default plugin list.

-4. **Restart Memos Service**
+#### 4. Restart Memos Service

 ```sh
 memos stop
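For orientation (the hunks above elide most of the file): the `vlm` section of `~/.memos/config.yaml` pairs an endpoint with a model name, and `default_plugins` lists the plugins enabled by default. A hypothetical sketch — the key names and the port are assumptions, not taken from this diff; only `default_plugins:` and `builtin_vlm` appear in the hunks above, so consult the actual file shipped with Memos for the real names:

```yaml
# Hypothetical sketch of ~/.memos/config.yaml.
# Key names and port are assumptions, except default_plugins / builtin_vlm.
vlm:
  endpoint: http://localhost:11434   # where Ollama is assumed to listen
  modelname: minicpm-v               # the multimodal model pulled in step 2

default_plugins:
  - builtin_vlm                      # enables the VLM plugin by default
```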
README_ZH.md (10 lines changed)
@@ -125,13 +125,11 @@ memos reindex --force
 - Enabling VLM will significantly increase system power consumption
 - Consider using other devices to provide OpenAI API compatible model services

 #### Enabling Steps

-1. **Install Ollama**
+#### 1. Install Ollama

 Visit the [Ollama official documentation](https://ollama.com) for detailed installation and configuration instructions.

-2. **Prepare the Multimodal Model**
+#### 2. Prepare the Multimodal Model

 Download and run the multimodal model `minicpm-v` using the following command:
@@ -141,7 +139,7 @@ ollama run minicpm-v "Describe what this service is"

 This command will download and run the minicpm-v model. If it runs too slowly, using this feature is not recommended.

-3. **Configure Memos to Use Ollama**
+#### 3. Configure Memos to Use Ollama

 Open the `~/.memos/config.yaml` file with your preferred text editor and modify the `vlm` configuration:
@@ -166,7 +164,7 @@ default_plugins:

 This adds the `builtin_vlm` plugin to the default plugin list.

-4. **Restart Memos Service**
+#### 4. Restart Memos Service

 ```sh
 memos stop