Text Classification

Tags: PEFT · Safetensors · Transformers · LoRA · QLoRA · multi-label · decoder-only · trl · bitsandbytes
Instructions to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForSequenceClassification

# Load the base model, then attach the fine-tuned adapter on top of it.
base_model = AutoModelForSequenceClassification.from_pretrained("meta-llama/Llama-3.2-1B")
model = PeftModel.from_pretrained(base_model, "Amirhossein75/LLM-Decoder-Tuning-Text-Classification")
```

- Transformers
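Since the tags mark this as a multi-label classifier, a minimal sketch of turning the loaded model's logits into per-label predictions may be useful. The logit values below are invented purely for illustration; in practice they would come from `model(**inputs).logits`.

```python
import torch

# Invented logits from a sequence-classification head:
# one row per input, one column per label.
logits = torch.tensor([[2.1, -0.4, 0.8]])

# Multi-label decoding: an independent sigmoid per label thresholded at 0.5,
# rather than a softmax over mutually exclusive classes.
probs = torch.sigmoid(logits)
predicted = (probs > 0.5).long()
print(predicted.tolist())  # [[1, 0, 1]]
```

Each label is decided independently, so an input can activate zero, one, or several labels at once.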
How to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Amirhossein75/LLM-Decoder-Tuning-Text-Classification")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Amirhossein75/LLM-Decoder-Tuning-Text-Classification", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle