Transformer-based language models such as BERT and GPT-3 have achieved state-of-the-art results on a variety of Natural Language Processing (NLP) tasks, such as information extraction, text classification, sentence similarity, text generation, text summarization, and question answering.
In this talk, I will review state-of-the-art open-source tools based on pre-trained language models that can be used to build the foundation of an AI-based healthcare system. These tools can be applied in medical practice management (MPM) or electronic health record (EHR) management software.
For each tool, I will review its architecture, usage, capabilities, and limitations, and I will showcase its usage and limitations through a set of demos.
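The abstract does not name the specific tools; as a rough illustration of the kind of workflow such tools enable, the minimal sketch below uses the open-source Hugging Face `transformers` library with a general-purpose pre-trained model to extract entities from a short clinical-style note. The library, pipeline task, and model name are assumptions for illustration only, not the tools covered in the talk.

```python
# Illustrative sketch only: the Hugging Face `transformers` library and the
# general-domain NER model below are stand-ins for the healthcare-specific
# tools reviewed in the talk.
from transformers import pipeline

# Load a pre-trained token-classification (NER) pipeline.
# "dslim/bert-base-NER" is a publicly available general-domain model,
# used here purely as an example.
ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",
    aggregation_strategy="simple",
)

note = "Patient John Smith was seen at Boston General Hospital on 3 May."

# Print each extracted entity with its label and confidence score.
for entity in ner(note):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```

In practice, a healthcare-oriented tool would swap in a domain-adapted model and add components for de-identification, terminology mapping, and integration with MPM or EHR systems.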