UMLS-based approach for developing VoiS: Voice-Activated Conversational Agent for self-management of multiple chronic conditions
Park, Min Sook; Oh, Hyunkyoung; Luo, Jake; Ahamed, Sheikh Iqbal; Upama, Paramita Basak; Anik, Adib Ahmed; Tian, Shiyu; Rabbani, Masud
Description
Title
UMLS-based approach for developing VoiS: Voice-Activated Conversational Agent for self-management of multiple chronic conditions
Author(s)
Park, Min Sook
Oh, Hyunkyoung
Luo, Jake
Ahamed, Sheikh Iqbal
Upama, Paramita Basak
Anik, Adib Ahmed
Tian, Shiyu
Rabbani, Masud
Issue Date
2024-01
Keyword(s)
Medical Ontologies
Health Informatics
Conversational Agents
mHealth
System Design
Mobile Systems
Ontologies
Abstract
This abstract proposes a system design for an ontology-based conversational agent (CA) for the self-management of chronic conditions. The proposed system plans to integrate the largest medical ontology, the Unified Medical Language System (UMLS) (Bodenreider, 2004), with the aim of narrowing the vocabulary gap between health professionals and patients and making the agent more responsive to users. Conversational agents (CAs) such as ChatGPT, i.e., computer dialog systems that simulate human-to-human communication in natural language, have recently gained popularity in various health contexts (Bin Sawad et al., 2022; Montenegro et al., 2019), including the self-management of chronic conditions (Griffin et al., 2020). Despite their potential, currently available CAs are often criticized for their limited ability to understand natural language inputs (Montenegro et al., 2019). This limitation is especially pronounced in medical domains because of the well-documented vocabulary gap between health professionals and health consumers. A knowledge-grounded dialog flow has the potential to lift this limitation, enabling CAs to converse naturally with their users. In the proposed voice-activated self-monitoring support (VoiS) application, the research team plans to integrate the UMLS so that the agent can better understand lay terms used by patients and properly map those terms to medical concepts. This automated process is expected to improve the user experience in two ways: a) promoting the quality of communication between patients and health providers, and b) making the VoiS app more responsive to user inputs, rather than accepting only constrained inputs (e.g., multiple-choice utterances).
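The abstract does not specify how the lay-term-to-concept mapping would be implemented; the following is only a minimal sketch of that step, assuming access to the NLM UMLS Terminology Services (UTS) REST search API and a valid API key. The function name, parameters, and the example term are illustrative, not part of the VoiS design.

import requests

# Public UTS REST search endpoint (assumed access via an API key).
UTS_SEARCH_URL = "https://uts-ws.nlm.nih.gov/rest/search/current"


def map_lay_term_to_concepts(lay_term: str, api_key: str, top_k: int = 3):
    """Return up to top_k candidate UMLS concepts (CUI, preferred name)
    for a patient-entered lay term such as 'high blood pressure'."""
    params = {
        "string": lay_term,   # free-text term taken from the user utterance
        "apiKey": api_key,    # UTS API key (assumed to be provisioned)
        "pageSize": top_k,
    }
    response = requests.get(UTS_SEARCH_URL, params=params, timeout=10)
    response.raise_for_status()
    results = response.json().get("result", {}).get("results", [])
    # Each search result carries a Concept Unique Identifier (ui) and name.
    return [(r["ui"], r["name"]) for r in results[:top_k]]


if __name__ == "__main__":
    # Illustrative use: a spoken lay term is normalized to UMLS concepts,
    # e.g. 'high blood pressure' maps to C0020538 (Hypertensive disease).
    for cui, name in map_lay_term_to_concepts("high blood pressure",
                                              api_key="YOUR_UTS_API_KEY"):
        print(cui, name)

In a pipeline like the one described, the returned concept identifiers could then be used to ground the agent's dialog flow and to record patient-reported information in professional terminology.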
Series/Report Name or Number
Proceedings of the ALISE Annual Conference, 2023
Type of Resource
text
Language
eng
Handle URL
https://hdl.handle.net/2142/123162
DOI
https://doi.org/10.21900/j.alise.2023.1251
Copyright and License Information
Copyright 2023 Min Sook Park, Hyunkyoung Oh, Jake Luo, Sheikh Iqbal Ahamed, Paramita Basak Upama, Adib Ahmed Anik, Shiyu Tian, Masud Rabbani
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License (https://creativecommons.org/licenses/by-sa/4.0/).