Virtual healthcare assistant
Virtual healthcare assistants (VHAs) or healthcare chatbots offer information and advice on a range of healthcare issues.
VHAs are generally chatbots on websites or apps that can be accessed from smartphones and computers.
They are increasingly used to automate tasks that are traditionally carried out by healthcare staff, such as suggesting a diagnosis, tracking symptoms and treatment progress, or recommending the use of specific drugs or exercise routines. A VHA may also recommend seeking in-person medical care, such as visiting a GP or going to a hospital.
VHAs use AI technology to process text or audio inputs from a user and then generate a response.
In clinical settings, VHAs can be deployed to automate the triaging process, using text or audio inputs about patients’ symptoms to establish the most urgent cases. VHAs can also be used by healthcare providers for remote patient monitoring, for example to track vital signs like blood pressure or temperature.
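To make the triaging idea concrete, the sketch below shows a toy, rule-based symptom scorer in Python. It is purely illustrative: real VHAs use trained AI models rather than keyword lists, and the symptom phrases, weights and priority thresholds here are invented for the example.

```python
# Toy triage scorer: assumptions only, not how any real VHA works.
# Each symptom phrase carries an invented urgency weight.
URGENCY_KEYWORDS = {
    "chest pain": 3,
    "difficulty breathing": 3,
    "fever": 2,
    "headache": 1,
    "cough": 1,
}

def triage_score(symptom_text: str) -> int:
    """Sum the urgency weights of every keyword found in the description."""
    text = symptom_text.lower()
    return sum(weight for phrase, weight in URGENCY_KEYWORDS.items()
               if phrase in text)

def triage_level(symptom_text: str) -> str:
    """Map a score to a simple priority band (thresholds are illustrative)."""
    score = triage_score(symptom_text)
    if score >= 3:
        return "urgent"
    if score >= 2:
        return "soon"
    return "routine"

print(triage_level("I have chest pain and a cough"))      # urgent
print(triage_level("Mild headache since this morning"))   # routine
```

Even in this simplified form, the example shows why triage automation raises the questions discussed below: the outcome depends entirely on what the system was built (or trained) to recognise.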
What are the benefits of this technology?
The use of VHAs has the potential to reduce GP and hospital patient waiting times, while increasing access to round-the-clock medical advice.
The same applications may also save healthcare providers time and money, as they can increase efficiency and help prioritise in-person care for those who need it.
What are the risks of this technology?
The use of VHAs carries risks, including personal data breaches and, depending on the terms and conditions, the sale of data to third parties such as health insurance companies.
The use of VHAs may also worsen healthcare inequalities, as not everyone has access to these technologies or knows how to use them.
Studies have also shown that AI systems deployed in healthcare are likely to worsen racial and gender bias, as training data is often based on a ‘standard patient’ assumed to be white, male and affluent. This could lead to misinformation and biased advice for groups such as women and girls, ethnic minorities and those from less affluent backgrounds.
Lastly, in-person healthcare often relies on an interpersonal relationship that goes beyond symptom checklists. AI technologies may lack the contextual and intuitive cues that clinicians can observe in face-to-face consultations.