CompressedBART: Fine-Tuning for Summarization through Latent Space Compression (Paper Review)
Author(s): Ala Alam Falaki. Originally published on Towards AI. Paper title: A Robust Approach to Fine-tune Pre-trained Transformer-based Models for Text Summarization through Latent Space Compression. "Can we compress a pre-trained encoder while keeping its language generation abilities?" This is the main question …