PopTheBubble – A Product Idea for Measuring Media Bias
Author(s): Sanghamesh Vastrad Originally published on Towards AI. A Product Manager's Perspective on building a Crowdsourced Media Bias Tracker and Anonymous Political News Aggregator. A couple of months ago, I decided to try something new. …
Modern NLP: A Detailed Overview. Part 1: Transformers
Author(s): Abhijit Roy Originally published on Towards AI. In the past half-decade, we have seen massive achievements in the Natural Language Processing domain with the introduction of ideas like BERT and GPT. In this article, we aim to dive into the …
HydraSum: Disentangling Stylistic Features in Text Summarization… (Paper Review)
Author(s): Ala Alam Falaki Originally published on Towards AI. Is it possible to train a model with the Transformer architecture to learn to generate summaries in different styles? Figure 1. The multi-decoder architecture scheme. (Image from [1]) While it's true that deep learning …
ArgMiner: End-to-End Argument Mining
Author(s): Yousef Nami Originally published on Towards AI. A PyTorch-based package for processing, augmenting, training, and performing inference on SOTA Argument Mining datasets. Figure: A pictorial representation of the task of Argument Mining. Argument Mining (AM) is the task of extracting argument components …
How To Train a Seq2Seq Summarization Model Using "BERT" as Both Encoder and Decoder! (BERT2BERT)
Author(s): Ala Alam Falaki Originally published on Towards AI. BERT is a well-known and powerful pre-trained "encoder" model. Let's see how we can use it as a "decoder" to form an encoder-decoder architecture. The Transformer architecture …