Microsoft has announced that it has integrated an optimized implementation of BERT (Bidirectional Encoder Representations from Transformers) with the open source ONNX Runtime. Developers can use this implementation for scalable, cost-effective BERT inferencing.
In 2018, Google open sourced a new technique for pre-training natural language processing (NLP) models called Bidirectional Encoder Representations from Transformers (BERT). This release revolutionized the conversational user experience domain by empowering developers to train their own state-of-the-art question-answering systems and a variety of other natural language-based models.
Read the entire article at Forbes