tcsenpai/ollama: mirror of https://github.com/tcsenpai/ollama.git, synced 2025-06-10 13:07:08 +00:00
ollama/llm
Latest commit: 3367b5f3df by Bruce MacDonald, "remove unused generate patches (#1810)", 2024-01-05 11:25:45 -05:00
File                          Last commit message                                         Date
ext_server                    Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
generate                      remove unused generate patches (#1810)                      2024-01-05 11:25:45 -05:00
llama.cpp@328b83de23          Init submodule with new path                                2024-01-04 13:00:13 -08:00
dynamic_shim.c                Switch windows build to fully dynamic                       2024-01-02 15:36:16 -08:00
dynamic_shim.h                Refactor how we augment llama.cpp                           2024-01-02 15:35:55 -08:00
ext_server_common.go          Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
ext_server_default.go         fix: relay request opts to loaded llm prediction (#1761)    2024-01-03 12:01:42 -05:00
ext_server_windows.go         Load dynamic cpu lib on windows                             2024-01-04 08:41:41 -08:00
ggml.go                       deprecate ggml                                              2023-12-19 09:05:46 -08:00
gguf.go                       remove per-model types                                      2023-12-11 09:40:21 -08:00
llama.go                      fix: relay request opts to loaded llm prediction (#1761)    2024-01-03 12:01:42 -05:00
llm.go                        Load dynamic cpu lib on windows                             2024-01-04 08:41:41 -08:00
shim_darwin.go                Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
shim_ext_server_linux.go      Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
shim_ext_server_windows.go    Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
shim_ext_server.go            Code shuffle to clean up the llm dir                        2024-01-04 12:12:05 -08:00
utils.go                      partial decode ggml bin for more info                       2023-08-10 09:23:10 -07:00