I used the PEGASUS model to improve text summarization performance. The dataset I used, SAMSum, pairs dialogues with human-written summaries in separate columns, making it well suited for summarization training. I chose PEGASUS for its strong performance on summarization tasks, which stems in part from its pretraining objectives, Masked Language Modeling (MLM) and, in particular, Gap Sentences Generation (GSG), the latter designed specifically for abstractive summarization. Due to resource limitations, my initial training run was limited to a single epoch, but longer fine-tuning and hyperparameter adjustments should yield further improvements. The results are promising; future work includes hyperparameter tuning, multi-epoch training on larger compute, and evaluating the model on additional summarization benchmarks and real-world datasets. A minimal fine-tuning sketch is shown below.
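
For reference, here is a minimal fine-tuning sketch using the Hugging Face `datasets` and `transformers` libraries. The `google/pegasus-cnn_dailymail` checkpoint and the hyperparameter values are assumptions for illustration, not necessarily the exact configuration used in this repository.

```python
# A minimal fine-tuning sketch for PEGASUS on SAMSum.
# Assumptions: the "google/pegasus-cnn_dailymail" checkpoint and the
# hyperparameters below are illustrative, not this repo's exact setup.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

checkpoint = "google/pegasus-cnn_dailymail"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# SAMSum has "dialogue" and "summary" columns (loading it requires py7zr).
dataset = load_dataset("samsum")

def preprocess(batch):
    # Tokenize dialogues as encoder input and summaries as decoder targets.
    model_inputs = tokenizer(batch["dialogue"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="pegasus-samsum",
    num_train_epochs=1,              # one epoch, matching the resource-limited run above
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,  # simulate a larger batch on limited hardware
    learning_rate=5e-5,
    weight_decay=0.01,
    logging_steps=10,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```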
About
I used the pre-trained PEGASUS model from Hugging Face to generate summaries on the SAMSum dataset.
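
Once fine-tuned (or with a pre-trained checkpoint), summaries can be generated with the `transformers` summarization pipeline. This is a quick inference sketch; the checkpoint name is an assumption, and the sample dialogue is only illustrative.

```python
# A quick inference sketch; "google/pegasus-cnn_dailymail" is an assumed
# checkpoint, not necessarily the one used in this repo.
from transformers import pipeline

summarizer = pipeline("summarization", model="google/pegasus-cnn_dailymail")

dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)
# The pipeline returns a list of dicts with a "summary_text" field.
print(summarizer(dialogue, max_length=64, min_length=5)[0]["summary_text"])
```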