
Mini NLP Cypher | Mini Year Review

Last Updated on July 27, 2023 by Editorial Team

Author(s): Ricky Costa

Originally published on Towards AI.

Das Eismeer | Friedrich

__init__.py


👋👋 2020 – The Year That Never Was

Good riddance to the woeful 12 months that made up the year 2020. We spent the entire time wearing masks and nervously watching the news for vaccine updates. And while the Earth stood still for a full calendar year, software (and hardware) marched forward and never stopped. Even as the year wound down, and all was quiet, maybe too quiet, we couldn’t help but witness Microsoft and Google go head-to-head once more in the never-ending SuperGLUE battle:

Microsoft added DeBERTa to supersede Google’s T5 atop the benchmark, only for it to be overtaken 12 hours later by a new deployment of T5 + Meena (what?). 🤣

At Quantum Stat, we kept moving forward as well. We added 800+ datasets and 300+ notebooks to our inventories, in addition to thousands of inference code snippets for NLP models. 😵 Thank you to all the contributors who made it possible!

OK, so what does NLP look like for 2021? A bifurcation of SUPER large models vs. smaller compressed models? Advancements in sparsity for pretrained models? Or models small enough to fit natively on the edge getting closer to reality?

Maybe all of the above. Additionally, we’ll probably see graphs and deep learning finally get married; 2021 will be their honeymoon. Several libraries are already out there and have been maturing for years, like PyTorch Geometric, DGL, and DeepMind’s Graph Nets. Here are their GitHub star growth trajectories over the years:

Chart via the Paperspace blog.
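For anyone who hasn’t touched graph deep learning yet, here is a minimal sketch of a two-layer graph convolutional network in PyTorch Geometric. The toy graph, feature dimensions, and class count are made-up illustration values, not a recommended setup:

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Hypothetical toy graph: 3 nodes with 8-dim features, 4 directed edges.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 8)
data = Data(x=x, edge_index=edge_index)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        # Each GCNConv layer aggregates features from a node's neighbors.
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = GCN(in_dim=8, hidden_dim=16, num_classes=2)
logits = model(data.x, data.edge_index)  # per-node class logits, shape (3, 2)
```

The same few lines scale from toy graphs like this one to citation networks and molecules, which is a big part of why these libraries are catching on.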

With regard to model architecture, we are also seeing a few alternatives that bring memory savings, better handling of longer text sequences, and improved training objectives. A few examples, with a usage sketch after the list:

Longformer

Reformer

ELECTRA
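To make the long-sequence angle concrete, here is a sketch of running Longformer through Hugging Face transformers. The allenai/longformer-base-4096 checkpoint accepts up to 4,096 tokens, versus BERT’s 512; the input text below is a made-up placeholder:

```python
import torch
from transformers import LongformerModel, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

# Hypothetical long document, well beyond BERT's 512-token limit.
long_text = " ".join(["Sparse attention handles long documents."] * 400)
inputs = tokenizer(long_text, return_tensors="pt",
                   truncation=True, max_length=4096)

with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```

Longformer gets there with a sliding-window attention pattern that scales roughly linearly with sequence length instead of quadratically.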

Also, domain-specific adaptation of NLP models will continue to proliferate. And by domain, I’m referring to three dimensions: language, textual format (Twitter text vs. formal text, etc.), and sector (legal, healthcare, etc.).

A few examples, with a loading sketch after the list:

Language-Focused: BERTurk, CamemBERT, AlBERTo, mBERT

Text-Focused: BERTweet, CharBERT

Sector-Focused: BioBERT, FinBERT, Legal-BERT
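Swapping in one of these domain-specific checkpoints is usually a one-line change with the transformers Auto classes. The hub IDs below are my best reading of the public Hugging Face names for three of the models above; treat them as assumptions to verify:

```python
from transformers import AutoModel, AutoTokenizer

# Hub IDs for three of the checkpoints above (verify before relying on them).
checkpoints = [
    "dmis-lab/biobert-v1.1",            # sector: biomedical text
    "vinai/bertweet-base",              # format: Twitter text
    "nlpaueb/legal-bert-base-uncased",  # sector: legal text
]

for name in checkpoints:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    print(name, model.config.hidden_size)
```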

Inference optimization was a big winner this past year, with several libraries being released. This focus area will help continue to bridge the performance gap between research and the enterprise, so expect more from it in the upcoming year. Here are a few libraries that help with optimizing transformers, with a minimal export sketch after the list:

FastFormers

TurboTransformers

DeLight

ONNX Transformers
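As a rough illustration of what these tools automate, here is a sketch of one common recipe: exporting a transformer to ONNX with plain torch.onnx.export and running it with onnxruntime. The checkpoint and opset are illustrative choices on my part, not the method of any particular library above:

```python
import torch
import onnxruntime as ort
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
# return_dict=False makes the model emit a plain tuple, which traces cleanly.
model = AutoModelForSequenceClassification.from_pretrained(
    name, return_dict=False).eval()

inputs = tokenizer("Inference optimization was a big winner.",
                   return_tensors="pt")

# Export the PyTorch graph to ONNX with dynamic batch/sequence axes.
torch.onnx.export(
    model,
    (inputs["input_ids"], inputs["attention_mask"]),
    "model.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                  "attention_mask": {0: "batch", 1: "seq"}},
    opset_version=14,
)

# Run the exported graph with onnxruntime.
session = ort.InferenceSession("model.onnx")
logits = session.run(["logits"], {
    "input_ids": inputs["input_ids"].numpy(),
    "attention_mask": inputs["attention_mask"].numpy(),
})[0]
print(logits.shape)  # (1, 2)
```

The dedicated libraries layer quantization, kernel fusion, and pruning on top of this basic export-and-serve flow.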

BERT seems so far away now, with so many new model architectures and novel use cases having made 2020 a weird one, given the circumstances.

But 2021 is shaping up to be a good year for all of us. So until then…

Happy New Year 🎇🎆🎇, and see you on the other side! ✌✌ 2021

P.S., regular NLP Cypher arriving Sunday.


Published via Towards AI
