Blog: AI in Healthcare (+5 Ways It’s Used in 2019) – G2 Crowd
It’s no secret that healthcare is a perpetually in-demand field.
We will always need nurses, doctors, and healthcare professionals to help us achieve wellness in our daily lives. So, with continuous advancements in healthcare and medicine, it’s easy to see how integrating AI technology into the processes by which people achieve health is inevitable.
How are medical professionals using AI in healthcare?
It’s fascinating to see how many subsets of the medical field have already taken steps toward incorporating artificial intelligence technology in their approach to patient-centered care. From fully automated medical consulting chatbots, to robots that assist with surgical procedures, AI is showing up in every corner of the healthcare field.
What is AI in healthcare?
AI in healthcare is the introduction of computer systems that have human-like intelligence to assist medical professionals in areas of consulting, digitizing, data tracking, and other often human-centered facets of healthcare.
Read on to learn about five innovative efforts bringing AI technology to the forefront of healthcare and medicine, and how AI will continue to improve as time goes on.
- Enhancing digital consultation
- Using fitness trackers to supplement patient data
- Streamlining how digital medical records are kept
- Providing access to care in underdeveloped nations
- Increasing quality of medical tools by using smart devices
Digital health consultation tools have grown in popularity thanks to smartphones and people’s need for 24/7 accessible healthcare. With so many medical facilities open only during standard working hours Monday through Friday, it can be difficult for people with typical 9-to-5 schedules to squeeze in a visit to their doctor.
On top of that, some people may not know when an illness is severe enough to actually make the trek out to their physician rather than just sleeping it off and taking over-the-counter medications.
In 2018, the Kaiser Family Foundation conducted a survey and found that 45 percent of people between the ages of 18 and 29 did not have a primary care provider or a regular doctor they go to. Thus, the upswing of and necessity for digital health consultations arose. Digitized health is popular among younger working generations especially because it provides a convenient alternative to actually going to a doctor’s office.
Let’s say you’re sick but don’t own a car, and your nearest clinic is over an hour away by public transit. Is it really worth the time there and back to see a doctor for something that might end up being classified as just a common cold? According to the data above, many 18 to 29-year-olds don’t think so.
So it makes sense why younger working people have been drawn to digital health consultation alternatives. Not only does digital consulting allow people to receive health assistance remotely, but it reduces the time patients and doctors spend together, thus giving doctors more time to see more patients.
Because so many people turn to digital consultation as an alternative to a standard in-office doctor’s visit, AI is being used through smartphone applications as a way to identify, detect, and treat health concerns from a distance.
A current digital consultation tool that utilizes artificial intelligence is Buoy, an interactive symptom-checking chatbot. It functions similarly to WebMD’s Symptom Checker, but the main difference between the two is that Buoy uses AI whereas WebMD does not.
Buoy is a chatbot that uses canned responses to prompt the patient to choose from a series of options based on their health concerns. Despite relying on those canned responses to push the conversation forward, it communicates in a surprisingly natural tone.
If it determines that a patient’s health issues may be of larger concern, it will prompt the person to seek medical help from an actual human doctor. But instead of leaving a patient to their own devices, it actually offers up a list of nearby clinics based on the user’s location – remedying the issue of traveling an hour out of their way to see a physician.
Image courtesy of Buoy
Overall, digital consultation tools like this can help time-strapped patients find an easy-to-navigate alternative that helps with the decision-making processes involved in healthcare.
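Under the hood, a canned-response triage flow like this can be sketched as a small decision tree. The symptoms, prompts, and dispositions below are invented for illustration and are not Buoy’s actual logic:

```python
# Minimal sketch of a symptom-triage chatbot flow (hypothetical rules,
# not Buoy's real decision model). Each node offers canned options, and
# the patient's answers narrow things down until the bot recommends
# self-care or a visit to a clinic.

TRIAGE_TREE = {
    "start": {
        "prompt": "What is your main symptom?",
        "options": {"fever": "fever_severity", "cough": "cough_duration"},
    },
    "fever_severity": {
        "prompt": "Is your temperature above 103°F (39.4°C)?",
        "options": {"yes": "SEE_DOCTOR", "no": "SELF_CARE"},
    },
    "cough_duration": {
        "prompt": "Has the cough lasted more than three weeks?",
        "options": {"yes": "SEE_DOCTOR", "no": "SELF_CARE"},
    },
}

def triage(answers):
    """Walk the tree using a list of canned answers; return a disposition."""
    node = "start"
    for answer in answers:
        node = TRIAGE_TREE[node]["options"][answer]
        if node in ("SEE_DOCTOR", "SELF_CARE"):
            return node
    return node
```

A real tool layers natural-language understanding on top of a flow like this, so patients can type freely instead of picking from a fixed menu.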
Keeping in line with the desire to maintain health and wellness at all costs, many people use fitness trackers, such as Fitbit or Apple Watch. These devices track heart rate, activity levels, and the quality and amount of sleep, and notify the wearer of abnormal readings via an app on their phones.
Despite the level of detail and amount of data these wearable fitness monitors generate and track, the data stays in the hands of users and the companies themselves. However, if people were to consent to having their data shared and analyzed externally, the findings would produce a wealth of knowledge and insight into the health of people all around the world.
Extracted health data, combined with patient-provided information like gender, weight, height, ethnicity, and family health history, can be used by AI to create actionable outcomes. These outcomes could include large, general databases of health information that help medical professionals make diagnoses; assist individuals with meeting their personal health and wellness goals; and allow employers to provide health benefits individualized according to each employee’s health profile.
GIF courtesy of Monika Madurska via dribbble.com
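As a toy example of turning raw wearable data into an actionable outcome, the sketch below flags days when a wearer’s resting heart rate deviates sharply from their own baseline. The rule and threshold are illustrative, not a clinical standard:

```python
from statistics import mean, stdev

def flag_abnormal_days(resting_hr_by_day, z_threshold=2.0):
    """Return the indices of days whose resting heart rate deviates
    sharply from the wearer's own baseline.

    Illustrative rule only: baseline is the mean over the window, and a
    day is flagged when it sits more than z_threshold sample standard
    deviations away from that mean.
    """
    baseline = mean(resting_hr_by_day)
    spread = stdev(resting_hr_by_day)
    return [
        day for day, hr in enumerate(resting_hr_by_day)
        if abs(hr - baseline) > z_threshold * spread
    ]
```

A personal baseline matters here: a resting heart rate that is normal for one wearer can be an outlier for another, which is exactly the kind of individualized insight aggregated wearable data could offer.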
For years, clinics and doctors’ offices have kept stacks of manila file folders full of paper documents outlining patients’ extensive medical histories. Though effective in the past, paper medical records are no longer the way of the world and no longer make sense in terms of efficiency or information security.
Electronic health record (EHR) technicians have taken on the burden of digitizing medical records for years, but the stress, pressure, and mental burnout they experience are overwhelming and can ultimately lead to inaccuracies in the data that gets entered.
To help reduce burnout, EHR developers are using artificial intelligence to automate some of the monotonous processes that are usually the responsibility of a human. Some of the most time-consuming tasks EHRs contend with are documentation, order entry, and sorting through the in-basket; it’s logical, then, to apply artificial intelligence to these tasks to reduce the time spent on them.
Two AI-powered tools already in use are voice recognition and natural language processing (NLP), though both remain imperfect. Voice recognition still depends on NLP algorithms to recognize, interpret, and transcribe human speech well enough that an outside reader couldn’t tell a spoken transcription from a hand-typed one.
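To make the transcription step concrete, here is a deliberately tiny sketch of one post-processing task in medical dictation: converting spoken punctuation into written form. Real EHR speech pipelines are far more involved, and the vocabulary below is invented:

```python
# Toy post-processing step for dictated clinical notes: map spoken
# punctuation ("period", "comma") to written symbols and restore
# sentence casing. A naive substring replace like this would mangle
# words such as "periodic"; a real system tokenizes first.

SPOKEN_PUNCTUATION = {"period": ".", "comma": ","}

def normalize_dictation(raw):
    text = raw.lower()
    for spoken, written in SPOKEN_PUNCTUATION.items():
        text = text.replace(" " + spoken, written)
    # Capitalize the start of each sentence.
    sentences = [s.strip().capitalize() for s in text.split(". ")]
    return ". ".join(sentences)
```

Even this trivial step shows why the technology still lags hand-typing: turning the literal word stream into a clean written note requires language understanding, not just accurate speech recognition.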
An alarming number of un- and underdeveloped nations lack access to trained medical professionals, health centers, and proper equipment typical of a hospital, clinic, or similar center. To fix this, some places have been implementing telehealth programs that can digitally connect a doctor from afar (often the United States) to a patient in an underserved nation.
However, telehealth alone cannot provide the life-saving care that specialists, such as radiologists or ultrasound technicians, can. That’s where AI comes in.
Artificial intelligence can help solve this problem by taking on some of the duties typically allocated to humans, like interpreting x-rays, digital images, and other scans a radiologist reads. To do this, healthcare professionals in the given area would need access to a smartphone running AI-ready applications with digital imaging capabilities.
At Stanford University, researchers created an app programmed with an algorithm to do just that.
Researchers developed a deep learning algorithm called CheXNeXt, which runs via an application on a smartphone. The algorithm is able to detect abnormalities and diseases present in a chest x-ray, and on top of that, it can also predict outcomes and suggest options for treating the diseases it finds.
The process is fairly straightforward: feed the app an image of an x-ray (or take a picture of one), and the image is uploaded to the cloud, run through CheXNeXt, and shortly thereafter the app returns results on what it believes the issue to be.
Image courtesy of Stanford ML Group
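The last step of the pipeline just described, turning the model’s raw scores into a readable result, can be sketched as follows. The 14 labels come from the ChestX-ray14 dataset CheXNeXt was trained on; the logits and threshold are made up for illustration, and the convolutional network itself is omitted:

```python
import math

# The 14 pathology labels from the ChestX-ray14 dataset. CheXNeXt is a
# multi-label classifier: each pathology gets its own independent
# probability, since one x-ray can show several conditions at once.

PATHOLOGIES = ["Atelectasis", "Cardiomegaly", "Effusion", "Infiltration",
               "Mass", "Nodule", "Pneumonia", "Pneumothorax",
               "Consolidation", "Edema", "Emphysema", "Fibrosis",
               "Pleural Thickening", "Hernia"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def findings_from_logits(logits, threshold=0.5):
    """Convert raw per-pathology scores (logits) from the network into
    a report: every pathology whose probability clears the threshold."""
    return {
        name: round(sigmoid(z), 3)
        for name, z in zip(PATHOLOGIES, logits)
        if sigmoid(z) >= threshold
    }
```

In practice the threshold per pathology is tuned against radiologist labels to balance sensitivity and specificity, the same measures CheXNeXt was evaluated on.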
This technology can be brought to underserved nations, eliminating the need for multiple in-house radiologists. CheXNeXt was tested against radiologists on accuracy, specificity, sensitivity, and speed. It matched the radiologists on the first three measures but far surpassed them on speed: where it took the radiologists four hours to interpret the x-rays, CheXNeXt returned results, at the same level of accuracy, in under two minutes.
Implementing CheXNeXt and similar deep learning algorithms in applications this way can change the speed with which a person receives test results. In underserved populations and beyond, deep learning and artificial intelligence can transform both how quickly and how accurately patients receive help.
Still, these algorithms have to be trained to take into account the particular physiological and environmental factors that may affect groups of people for which there is currently limited data. They must be tested on multiple, diverse ethnic groups so that there is a wealth of data across populations backing the accuracy of their results.
Smart devices might seem like fun tools intended to distract people from their work and home tasks, but they’re used for much more than entertainment. In the medical field, smart devices are being used to help monitor patients who are critical or high-risk.
With these smart devices, AI is used to track changes in a patient’s status when a nurse is not at the bedside. If artificial intelligence can detect patient improvement or deterioration, it can help nurses, doctors, and specialists decide how to allocate their time among patients.
In addition, AI can track data in the same way wearable fitness devices do; it can monitor vital signs such as heart rate and flag elevated or abnormal readings.
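A rule for deciding when such monitoring should page the care team can be sketched as a band-based scoring function, loosely modeled on the early-warning scores hospitals use. The thresholds below are a simplified illustration, not a clinical standard:

```python
def heart_rate_score(bpm):
    """Score a heart-rate reading into risk bands (0 = normal,
    3 = extreme). Thresholds are illustrative only, loosely in the
    spirit of hospital early-warning scores, not actual clinical values.
    """
    if bpm <= 40 or bpm >= 131:
        return 3
    if 111 <= bpm <= 130:
        return 2
    if bpm <= 50 or bpm >= 91:
        return 1
    return 0

def should_alert(readings, score_threshold=2):
    """Page staff if any recent reading scores at or above the
    threshold, so nurses' time goes to the patients who need it most."""
    return any(heart_rate_score(r) >= score_threshold for r in readings)
```

Real bedside monitoring combines several vitals (respiration, blood pressure, oxygen saturation) into one composite score, but the principle is the same: continuous AI-driven checks between nurse visits.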
Where will the future of AI in healthcare take us next?
Artificial intelligence is already making waves in healthcare, and with continued testing, research, and improvements, who knows where we’ll be by 2025! It wouldn’t be surprising if AI and robotic process automation (RPA) continue trending upward.
Soon, AI won’t be the only one lending a hand at the bedside; RPA, and perhaps your friendly robot nurse, will be too!