Sending BERT to Med School – Injecting Medical Knowledge into BERT

General NLP research has greatly advanced over the past several years thanks to large pre-trained neural language models such as BERT and GPT. But in the healthcare domain, pre-training on unlabeled clinical texts can take us only so far, due to the special properties of medical texts and the lack of labeled data. One possible solution to this problem is to use medical ontologies to enrich pre-trained neural networks with medical knowledge.

This approach is not limited to the healthcare domain — recent studies have shown that knowledge graphs, such as WordNet and ConceptNet, can be used to infuse common sense and world knowledge into neural language models. In this lecture we will explore similar approaches using medical knowledge structures such as UMLS, ICD, and MeSH, and examine how structured knowledge can be used in a neural network setup to solve various healthcare information retrieval challenges.
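The abstract does not commit to a specific architecture, but one common pattern for this kind of knowledge injection (used, for example, by ERNIE- and KnowBert-style models) is to link text spans to ontology concepts and add learned concept embeddings to the encoder's contextual token representations. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the class name, the concept_ids input, and the assumption of a Hugging Face-style encoder output are illustrative choices, not details from the lecture.

```python
# Minimal sketch (illustrative, not the speaker's method): fuse medical
# concept embeddings, looked up from an ontology such as UMLS, into the
# token representations produced by a pre-trained BERT-style encoder.
import torch
import torch.nn as nn


class ConceptAugmentedEncoder(nn.Module):
    def __init__(self, encoder, num_concepts, hidden_size):
        super().__init__()
        self.encoder = encoder  # e.g. a pre-trained Hugging Face BERT model
        # Index 0 is reserved for tokens with no linked medical concept.
        self.concept_emb = nn.Embedding(num_concepts + 1, hidden_size, padding_idx=0)

    def forward(self, input_ids, attention_mask, concept_ids):
        # concept_ids: (batch, seq_len) integer ids of linked ontology
        # concepts (e.g. UMLS CUIs mapped to indices), 0 where unlinked.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state       # (batch, seq_len, hidden)
        knowledge = self.concept_emb(concept_ids)  # (batch, seq_len, hidden)
        return token_states + knowledge            # knowledge-augmented states
```

Upstream, an entity linker that maps text spans to ontology concept identifiers would produce concept_ids; the fused representations would then feed a task-specific head such as a classifier or retrieval scorer.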

This lecture is intended for people interested in the challenges of healthcare language processing and for people in other specialized domains who want to get their structured knowledge sources out of the attic.


About the speaker

Rachel Wities

NLP Data Scientist at Zebra Medical Vision

Rachel Wities is an NLP Data Scientist at Zebra Medical Vision who believes that understanding doctors and their way of thinking is the key to successfully implementing healthcare AI algorithms.

Rachel holds an M.Sc. from the BIU NLP lab, where she researched knowledge-graph representations of text semantics.

Previously, she was a research scientist at PayPal. She loves her family, God, and Oxford comma jokes.