exllamav2
oobabooga/exllamav2
A fast inference library for running LLMs locally on modern consumer-class GPUs
Stars: 8
Forks: 1
Open issues: 0
Watchers: 8
Size: 21.7 MB
Language: Python
License: MIT
Created: Sep 12, 2023
Updated: Apr 30, 2025
Last push: Apr 30, 2025