Transformers and the representation of biomedical background knowledge
Wysocki, Oskar ; Zhou, Zili ; O'Regan, Paul ; Ferreira, D. ; Wysocka, Magdalena ; Landers, Donal ; Freitas, Andre
Abstract
Specialised transformer-based models (such as BioBERT and BioMegatron) are adapted for the
biomedical domain based on publicly available biomedical corpora. As such, they have the potential
to encode large-scale biological knowledge. We investigate the encoding and representation of
biological knowledge in these models, and its potential utility to support inference in cancer precision
medicine, namely the interpretation of the clinical significance of genomic alterations. We compare
the performance of different transformer baselines; we use probing to determine the consistency of
encodings for distinct entities; and we use clustering methods to compare and contrast the internal
properties of the embeddings for genes, variants, drugs and diseases. We show that these models do
indeed encode biological knowledge, although some of this is lost in fine-tuning for specific tasks.
Finally, we analyse how the models behave with regard to biases and imbalances in the dataset.
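The abstract's clustering analysis contrasts the internal geometry of embeddings for different entity types (genes, variants, drugs, diseases). The idea can be sketched as comparing intra-type and inter-type cosine similarity of entity vectors. The snippet below is a minimal, hypothetical illustration using synthetic vectors in place of real BioBERT embeddings; it is not the authors' code, and the entity types, dimensions, and noise level are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic stand-ins for entity embeddings: each entity type is drawn
# around its own centre, mimicking type-coherent representations that a
# biomedical transformer might produce.
dim = 64
centres = {t: rng.normal(size=dim) for t in ["gene", "drug"]}
emb = {t: c + 0.3 * rng.normal(size=(20, dim)) for t, c in centres.items()}

def mean_sim(X, Y):
    """Average pairwise cosine similarity between two sets of vectors."""
    return float(np.mean([cosine(x, y) for x in X for y in Y]))

# If the model encodes entity-type structure, same-type pairs should be
# more similar on average than cross-type pairs.
intra = mean_sim(emb["gene"], emb["gene"])
inter = mean_sim(emb["gene"], emb["drug"])
```

In the paper's setting, `emb` would instead hold embeddings extracted from a pretrained or fine-tuned model for real gene and drug mentions, and clustering methods (rather than raw similarity averages) are used to compare types.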
Date
2023
Type
Article
Citation
Wysocki O, Zhou ZL, O'Regan P, Ferreira D, Wysocka M, Landers D, et al. Transformers and the Representation of Biomedical Background Knowledge. Computational Linguistics. 2023 Mar;49(1):73-115. Web of Science ID: WOS:000993797000002.