Abiray/gemma-4-E4B-it-heretic-GGUF
Likes: 32
Tags: Any-to-Any · GGUF · llama-cpp · gemma-4 · imatrix · conversational
License: apache-2.0
Branch: main · Total size: 34.1 GB
1 contributor · History: 16 commits
Latest commit: Jay · "Upload gemma-4-E4B-it-heretic-Q3_K_M.gguf with huggingface_hub" · e8d3eb1 (verified) · 16 days ago
.gitattributes · 2.03 kB · Upload mmproj-F32.gguf with huggingface_hub · 17 days ago
README.md · 712 Bytes · Update README.md · 16 days ago
gemma-4-E4B-it-heretic-Q3_K_M.gguf · 4.85 GB · Upload gemma-4-E4B-it-heretic-Q3_K_M.gguf with huggingface_hub · 16 days ago
gemma-4-E4B-it-heretic-Q4_K_M.gguf · 5.34 GB · Upload gemma-4-E4B-it-heretic-Q4_K_M.gguf with huggingface_hub · 16 days ago
gemma-4-E4B-it-heretic-Q5_K_M.gguf · 5.76 GB · Upload gemma-4-E4B-it-heretic-Q5_K_M.gguf with huggingface_hub · 16 days ago
gemma-4-E4B-it-heretic-Q6_K.gguf · 6.22 GB · Upload fixed gemma-4-E4B-it-heretic-Q6_K.gguf with official chat template · 16 days ago
gemma-4-E4B-it-heretic-Q8_0.gguf · 8.03 GB · Upload fixed gemma-4-E4B-it-heretic-Q8_0.gguf with official chat template · 16 days ago
mmproj-BF16.gguf · 992 MB · Upload mmproj-BF16.gguf with huggingface_hub · 17 days ago
mmproj-F16.gguf · 990 MB · Upload mmproj-F16.gguf with huggingface_hub · 17 days ago
mmproj-F32.gguf · 1.91 GB · Upload mmproj-F32.gguf with huggingface_hub · 17 days ago