Care providers generally experience a high workload, largely due to the substantial time required for adequate documentation. This paper presents our visionary idea of real-time automated medical reporting through the integration of speech and action recognition technology with knowledge-based summarization of the interaction between care provider and patient. We introduce the Patient Medical Graph as a formal representation of the dialogue and actions during a medical consultation. This knowledge graph represents human anatomical entities, symptoms, medical observations, diagnoses, and treatment plans. The formal representation enables automated preparation of a consultation report by means of sentence plans for natural language generation.
The architecture and functionality of the Care2Report prototype illustrate our vision of automated reporting of human communication and activities using knowledge graphs and NLP tools.
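To make the pipeline concrete, the sketch below shows one way a Patient Medical Graph could be encoded as subject–predicate–object triples and rendered into report sentences via sentence plans. All entity names, predicates, and templates here are illustrative assumptions, not the paper's actual schema or implementation:

```python
# Hypothetical sketch: a "Patient Medical Graph" as a list of
# subject-predicate-object triples, plus template-based sentence plans
# that render a consultation report. The schema and templates are
# invented for illustration only.

# Triples captured during a fictional consultation.
graph = [
    ("patient", "reports_symptom", "ear pain"),
    ("physician", "observes", "red eardrum"),
    ("physician", "diagnoses", "otitis media"),
    ("physician", "prescribes", "amoxicillin"),
]

# Sentence plans: one surface template per predicate type.
sentence_plans = {
    "reports_symptom": "The patient reports {obj}.",
    "observes": "Examination shows a {obj}.",
    "diagnoses": "Diagnosis: {obj}.",
    "prescribes": "Treatment plan: {obj}.",
}

def generate_report(triples):
    """Render each triple with the sentence plan for its predicate."""
    lines = []
    for subj, pred, obj in triples:
        plan = sentence_plans.get(pred)
        if plan:
            lines.append(plan.format(obj=obj))
    return " ".join(lines)

print(generate_report(graph))
```

In this toy version, the graph structure decouples what was recognized during the consultation from how it is phrased in the report, which is the separation the sentence-plan approach relies on.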
Download the publication here in PDF:
Automated Medical Reporting: From Multimodal Inputs to Medical Reports through Knowledge Graphs (HealthINF_Care2ReportDesign_Published20012221)