Scaling Training of HuggingFace Transformers With Determined
Last Updated on June 17, 2021 by Editorial Team
Author(s): Towards AI Team
Join an exciting lunch-and-learn event by our friends at Determined AI
Training complex state-of-the-art natural language processing (NLP) models is now a breeze, thanks to HuggingFace, the essential open-source go-to for data scientists and machine learning engineers who want to implement Transformer models and configure them as state-of-the-art NLP models with straightforward library calls. As a result, the library has become crucial for training NLP models at companies like Baidu and Alibaba, and has contributed to state-of-the-art results on several NLP tasks.
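To give a flavor of those straightforward library calls, here is a minimal sketch using the transformers pipeline API (the input sentence is purely illustrative; the pipeline downloads a default pretrained sentiment model):

```python
from transformers import pipeline

# One call downloads a pretrained model and tokenizer and wires them together.
classifier = pipeline("sentiment-analysis")

# Run inference on an example sentence (illustrative input).
result = classifier("Training large NLP models just got a lot easier.")
print(result)
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```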
Our friends at Determined AI are hosting an exciting lunch-and-learn covering training HuggingFace Transformers at scale using Determined! Learn to train Transformers with distributed training, hyperparameter searches, and cheap spot instances, all without modifying code.
Please consider joining on Wednesday, June 30th at 10 AM PT for a hands-on tutorial from Liam Li, Senior Machine Learning Engineer, and Angela Jiang, Product Manager, both at Determined AI (lunch included!).
This fantastic hands-on tutorial will cover the basics of Determined and walk through how to build a chatbot using a large Transformer language model with distributed training and spot instances.
Join in and come away with an understanding of how to train Transformers at scale with distributed training, experiment and artifact tracking, and resource management, all without needing to modify code.
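For a sense of what "without modifying code" means in practice, here is a minimal sketch of a Determined experiment configuration. The field names follow Determined's YAML experiment config format; the specific values and the hypothetical `model_def:MyTrial` entrypoint are illustrative assumptions, not the tutorial's actual setup:

```yaml
name: transformer-chatbot            # illustrative experiment name
entrypoint: model_def:MyTrial        # hypothetical Trial class in model_def.py
hyperparameters:
  global_batch_size: 32
  learning_rate:                     # searched over, not hard-coded in training code
    type: log
    base: 10
    minval: -6
    maxval: -4
resources:
  slots_per_trial: 8                 # distributed training across 8 GPUs, config-only
searcher:
  name: adaptive_asha                # Determined's adaptive hyperparameter search
  metric: validation_loss
  smaller_is_better: true
  max_trials: 16
```

Scaling from one GPU to many, or from a single trial to a full hyperparameter search, is then a matter of editing this config rather than the model code; spot instances are likewise handled at the cluster level rather than in the training script.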
Please make sure to register for the Meetup event and the Zoom meeting, and if you use Slack, please consider joining Determined's Slack community!
Published via Towards AI