Building multi-modal interactive health records

Project Description

Can we use multi-modal analysis of medical health records to interlink textual and image information and present the data in an intuitively clear manner?

Today, patients access their medical information through the mijnRadboud portal. Medical records, however, are not written with the patient in mind: the vocabulary gap between language a patient would understand and the actual contents of their electronic health record is huge. Online health records also include medical images, which even physicians with extensive medical training rely on radiologists to interpret.

The proposed project explores (semi-)automatic methods to transform electronic health records into an interactive online version that patients can easily explore and understand. To reduce the vocabulary gap, we will transform the textual information in the health record and/or link it to medical knowledge graphs. The scientific question is whether textual and image information reinforce each other, improving the effectiveness of these methods. A specific problem to overcome is the limited availability of resources for processing Dutch medical health records. Developing word embeddings, entity tagging and knowledge graph extraction for Dutch medical records will have a sustainable impact on health innovation at Radboudumc.
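To make the vocabulary-gap idea concrete, here is a minimal sketch of dictionary-based entity tagging: Dutch medical jargon is matched in a record and annotated with a lay-language gloss plus a knowledge-graph identifier. The mini-lexicon and the `KG:` identifiers are purely illustrative assumptions; a real system would draw on resources such as UMLS or a Dutch medical knowledge graph, and on learned embeddings rather than exact string matching.

```python
import re

# Hypothetical mini-lexicon: Dutch medical jargon mapped to a lay-language
# gloss and an (invented) knowledge-graph identifier. A real system would
# derive this mapping from a medical knowledge graph, not hard-code it.
LEXICON = {
    "hypertensie": {"lay": "hoge bloeddruk", "kg_id": "KG:0001"},
    "fractuur": {"lay": "botbreuk", "kg_id": "KG:0002"},
}

def tag_entities(text):
    """Find lexicon terms in a record; return (start, end, term, gloss, kg_id) spans."""
    matches = []
    for term, entry in LEXICON.items():
        for m in re.finditer(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            matches.append((m.start(), m.end(), term, entry["lay"], entry["kg_id"]))
    return sorted(matches)

def annotate(text):
    """Rewrite the record, appending the lay gloss and KG link after each tagged term."""
    out, prev = [], 0
    for start, end, term, lay, kg_id in tag_entities(text):
        out.append(text[prev:start])
        out.append(f"{text[start:end]} ({lay}, {kg_id})")
        prev = end
    out.append(text[prev:])
    return "".join(out)

record = "Patiënt heeft hypertensie en een fractuur van de linkerpols."
print(annotate(record))
# → Patiënt heeft hypertensie (hoge bloeddruk, KG:0001) en een fractuur (botbreuk, KG:0002) van de linkerpols.
```

In the envisioned interactive record, such annotations would become clickable links into the knowledge graph rather than inline text.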

People

Koen Dercksen

PhD student

Data Science, Radboud University

Arjen de Vries

Professor

Data Science, Radboud University

Faegheh Hasibi

Assistant professor

Institute of Computing and Information Sciences, Radboud University

Liesbeth Langenhuysen

Manager supply

REshape

Monique Brink

Radiologist

Radiology, Radboudumc

Ritse Mann

Breast and interventional radiologist

DIAG AIIM

Bram van Ginneken

Professor

Diagnostic Image Analysis Group