Summarization
Tags: Transformers, PyTorch, TensorBoard, t5, text2text-generation, Generated from Trainer, Eval Results (legacy), text-generation-inference
Instructions to use autoevaluate/summarization with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use autoevaluate/summarization with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: the "summarization" pipeline type is no longer supported in transformers v5.
# Either load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="autoevaluate/summarization")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/summarization")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/summarization")
```
- Notebooks
- Google Colab
- Kaggle
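The direct-loading snippet above stops at loading the tokenizer and model; a minimal sketch of actually generating a summary with them follows. The `"summarize: "` task prefix is an assumption based on T5's usual convention, and the generation parameters are illustrative, not taken from the model card.

```python
# Minimal sketch: summarize a passage with the directly loaded model.
# Note: the "summarize: " prefix follows T5's usual convention and is an
# assumption here; check the model's training setup to confirm it applies.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/summarization")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/summarization")

text = (
    "summarize: The tower is 324 metres tall, about the same height as an "
    "81-storey building, and was the tallest man-made structure in the world "
    "for 41 years until the Chrysler Building was finished in 1930."
)

# Tokenize, generate, and decode the summary.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```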
Add evaluation results on xsum
#1
Opened by lewtun (HF Staff)
Beep boop, I am a bot from Hugging Face's automatic evaluation service! Your model has been evaluated on the xsum dataset. Accept this pull request to see the results displayed on the Hub leaderboard. Evaluate your model on more datasets here.