A Lite BERT for Reducing Inference Time
Last Updated on July 24, 2023 by Editorial Team
Author(s): Edward Ma
Originally published on Towards AI.
Photo by Ksenia Makagonova on Unsplash
BERT (Devlin et al., 2018) achieved a long list of state-of-the-art results in 2018. However, it is not easy to use BERT (Devlin et al., 2018) in production, or even in small-footprint experiments. The base version of BERT (Devlin et al., 2018) includes 108M parameters, while the X-Large version includes 1270M parameters.
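As a quick sanity check of these figures (not part of the original post), here is a minimal sketch that counts the parameters of the publicly released BERT checkpoints, assuming the Hugging Face transformers library and PyTorch are installed. The 1270M X-Large variant was never publicly released, so it is omitted here.

```python
# Minimal sketch: count parameters of the released BERT checkpoints.
# Assumes `transformers` and `torch` are installed.
from transformers import AutoModel

for name in ["bert-base-uncased", "bert-large-uncased"]:
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```

Running this prints roughly 109M parameters for the base model and 335M for the large model, in line with the numbers quoted above.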
This extremely large number of parameters introduces two major problems. First, the model requires a large memory footprint and may not scale out easily in production due to cost concerns. Second, training and inference times grow longer when… Read the full blog for free on Medium.
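To make the inference-time point concrete, here is a rough timing sketch that is not from the original post. It assumes PyTorch and transformers running on CPU, batch size 1, and sequence length 128, and compares a forward pass of BERT-base against BERT-large; absolute numbers will vary with hardware.

```python
# Rough latency sketch: time a single forward pass of BERT-base vs. BERT-large.
# Assumptions: CPU inference, batch size 1, sequence length 128.
import time
import torch
from transformers import AutoModel, AutoTokenizer

def time_forward(name, text="BERT is expensive to serve.", runs=10):
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name).eval()
    inputs = tokenizer(text, return_tensors="pt",
                       padding="max_length", max_length=128)
    with torch.no_grad():
        model(**inputs)  # warm-up pass
        start = time.perf_counter()
        for _ in range(runs):
            model(**inputs)
    return (time.perf_counter() - start) / runs

for name in ["bert-base-uncased", "bert-large-uncased"]:
    print(f"{name}: {time_forward(name) * 1000:.0f} ms per forward pass")
```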
Published via Towards AI