Last Updated on April 1, 2023 by Editorial Team
Author(s): Thomas Rochefort-Beaudoin
Originally published on Towards AI.
In this article, I go through the basics of finetuning large language models like BLOOM on a legal text dataset. You can try it on HuggingFace Spaces!
I am currently on a quest to learn the ropes of working with large language models (LLMs). This includes learning how to finetune pre-trained models such as OPT and BLOOM.
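To make the idea of finetuning a causal language model concrete, here is a minimal sketch of a single training step with the HuggingFace Transformers library. It uses a tiny, randomly initialized BLOOM-architecture model (an assumption for illustration, standing in for a real checkpoint such as bigscience/bloom-560m) so it runs without downloading weights; the configuration sizes are arbitrary.

```python
# Minimal sketch of one causal-LM finetuning step, assuming the
# HuggingFace Transformers and PyTorch libraries are installed.
# A tiny random-weight BLOOM-architecture model stands in for a
# real pre-trained checkpoint so the example needs no download.
import torch
from transformers import BloomConfig, BloomForCausalLM

config = BloomConfig(vocab_size=256, hidden_size=64, n_layer=2, n_head=4)
model = BloomForCausalLM(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Dummy batch of token ids; in practice these would come from
# tokenizing the legal-text dataset.
input_ids = torch.randint(0, 256, (2, 16))

# For causal LM finetuning, labels are the input ids themselves;
# the model shifts them internally to predict the next token.
outputs = model(input_ids=input_ids, labels=input_ids)
outputs.loss.backward()
optimizer.step()
```

With a real checkpoint, the only change is loading the model with `BloomForCausalLM.from_pretrained(...)` instead of building it from a config.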
To move forward on that quest, I decided to build a model that can answer legal questions. I therefore present to you: BetterCallBloom, your friendly AI legal assistant!
Screenshot of the interface of the BetterCallBloom space
In this article, I will go through the steps required…