yzma
veggiemonk/yzma
yzma lets you use Go to perform local inference with Vision Language Models (VLMs) and Large Language Models (LLMs) via llama.cpp, without CGo.
Stars: 0
Forks: 0
Open issues: 0
Watchers: 0
Size: 0.7 MB
Languages: Go, Other
Created: Oct 13, 2025
Updated: Nov 26, 2025
Last push: Nov 26, 2025