Industry Focus
Human + Machine: The Future of Digital Healthcare
At the recent Medidata NEXT conference in New York, Ronan Wisdom, global lead of Accenture’s Connected Health practice, discussed the latest developments in human-machine collaboration and their potential impact on healthcare. Susanne Hauner reports
“We're entering an age of human empowerment, where technology will augment humanity and fundamentally improve the way we live and work,” Accenture’s global lead for connected health, Ronan Wisdom, told delegates at the Medidata NEXT conference in New York in October.
Wisdom’s keynote was titled ‘Human + Machine’, after the book co-authored by Accenture’s chief technology and innovation officer, Paul Daugherty. “The central premise is that plus sign between human and machine,” Wisdom explained, “because despite everything Hollywood would like us to believe, all the work we're doing now, all of our research, our view into the future, is that we're entering an age of human empowerment where technology will augment humanity and fundamentally improve the way we live and work.”
Rather than fully autonomous machines taking over, Wisdom believes the next big step in artificial intelligence (AI) development will lie in more sophisticated forms of collaboration between humans and machines.
He identified three key trends that are driving innovation in human-machine collaboration: “The first is the growth of smart devices and smart products,” he said. “The second is new forms of interaction with those devices and the third is new forms of AI that are underpinning everything. And we think the combination of those three advances is going to change the opportunities for research across industries - healthcare included.”
Smarter devices create new healthcare applications
Connected health devices, such as wearable sensors and trackers, have been around for a while, but the latest generation of these devices is moving far beyond the wrist-worn health tracker. They are also becoming more clinically relevant and medically accurate, Wisdom explained.
This allows for new forms of health monitoring – for example, tracking disease progression by monitoring behaviour and cognitive function with smart clothing, using injectable nano devices as biomarkers to track diseases in the body, and using digital pills to ensure patients adhere to their treatments.
Aside from devices designed specifically for medical use, everyday objects around us are also becoming smarter and more connected, which Wisdom argues will give them new potential for healthcare applications.
“I'm not talking about a smartphone, or a connected health wearable, or a regulated device,” he explained. “I'm talking about everyday objects that we interact with – smart cameras, household appliances, cars and so on. Those objects are getting really, really smart and we think they're going to play an increasing role in our healthcare.
“Do we think the digital health of the future will be bound by our smartphone and a wearable?” he asked. “When the mirror we use in the morning can detect changes in skin condition and eye health. When the bed we sleep in at night can detect sleep patterns, interruptions and body temperature.
“When a car, understanding that we're diabetic and having access to our connected glucometer data, can analyse our driving patterns, maybe the composition of our sweat, and warn us that we're at risk of an impending hyperglycaemic event. When the bus shelter that we stand at waiting for a bus is not only measuring air pollution, it's listening for coughing to detect the potential spread of infectious disease.
“I'm not kidding with that one. It exists and it's out there.”
We will see these smarter everyday objects playing an increasing role in our healthcare, Wisdom said. What’s more, they will create new data and opportunities for insight that will help to inform future clinical research.
The top ten applications of AI in healthcare and the value they will produce to 2026, according to Accenture’s forecast. Source: Accenture analysis
New forms of collaboration between humans and machines
The second trend driving AI applications, Wisdom said, is the development of new forms of interaction between humans and machines. He pointed to collaborative robots, or cobots, as a good example, since they are designed to work alongside humans - “to be that human + machine value equation,” as he put it.
Cobots are typically used in a production capacity, such as assembly facilities, and can learn new tasks from humans through haptics and computer vision. In the automotive sector, for instance, human + cobot teams have proven more efficient than fully robotic production lines in dealing with the increased levels of customisation required in car manufacture.
Of course, cobots have also made their way into life sciences, the most prominent example being the surgical robot - an increasingly common sight in operating theatres around the world.
However, Wisdom believes the real significance of cobots lies in the fact that their development is driving forward research into human-computer interaction - a field with even greater potential for healthcare applications.
The changing nature of human-machine interaction
Conversational interfaces are an example of this evolution of human-machine interaction. Driven by recent advances in natural language processing, conversational interfaces could impact healthcare in a number of ways, from making administrative tasks more efficient to providing assistance to patients.
The most significant technological leap in this field, Wisdom believes, will lie in automating patient advisory services, because that requires a highly sophisticated AI.
“An advisory service has to know us as a patient for a much longer time,” he explained. “It has to build a profile based on us, based on our preferences, but also based on our behaviours and activities. It has to be able to filter content based on our preferences, suggest new content and options and watch over us for harm in terms of making decisions.”
As an example of an assistive technology application that combines smart objects, new forms of interaction and AI, Wisdom showcased the Drishti app, which was developed by Accenture’s Tech for Good programme to help the visually impaired make sense of their surroundings.
Aside from their use in assistive healthcare applications, Wisdom noted that conversational interfaces also have potential in clinical studies.
“Of course for electronic patient-reported outcomes it makes obvious sense, but beyond the quantitative nature of what a patient might say to a conversational machine, we're really interested in the qualitative nature of what's said - the behavioural aspects of how we speak. Turn-taking, repetition, interruption, frustration, emotion - all of these factors create a vocal biomarker that we can use to assess whether a patient is engaging with a service, whether they are likely to drop out of a service, and this is really relevant when we think about challenges in clinical studies.”
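The behavioural signals Wisdom lists - turn-taking, repetition, interruption - are the kind of features that can be derived from a timestamped conversation transcript. As an illustration only (this is a hypothetical sketch, not Accenture's or Medidata's method; the `Turn` structure and feature definitions are assumptions), a minimal feature extractor might look like this:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Turn:
    """One turn in a patient-agent dialogue (hypothetical structure)."""
    speaker: str   # "patient" or "agent"
    start: float   # start time in seconds
    end: float     # end time in seconds
    text: str

def engagement_features(turns):
    """Derive simple conversational features of the kind described:
    lexical repetition, interruptions (overlapping speech), and
    turn-taking rate. Definitions here are illustrative assumptions."""
    patient_words = " ".join(
        t.text.lower() for t in turns if t.speaker == "patient"
    ).split()
    # Repetition: share of patient words that are repeats (0 = all unique).
    repetition = (
        1 - len(Counter(patient_words)) / len(patient_words)
        if patient_words else 0.0
    )
    # Interruption: a speaker change where the new turn starts before
    # the previous turn has ended.
    interruptions = sum(
        1 for prev, cur in zip(turns, turns[1:])
        if cur.speaker != prev.speaker and cur.start < prev.end
    )
    duration = turns[-1].end - turns[0].start if turns else 0.0
    turns_per_min = len(turns) / duration * 60 if duration else 0.0
    return {
        "repetition": round(repetition, 2),
        "interruptions": interruptions,
        "turns_per_min": round(turns_per_min, 1),
    }

dialogue = [
    Turn("agent", 0.0, 4.0, "How are you feeling today"),
    Turn("patient", 3.5, 8.0, "tired tired I feel tired"),
    Turn("agent", 8.5, 12.0, "Did you take your medication"),
    Turn("patient", 12.5, 15.0, "yes this morning"),
]
print(engagement_features(dialogue))
# → {'repetition': 0.25, 'interruptions': 1, 'turns_per_min': 16.0}
```

In a real study these raw features would feed a model that estimates engagement or dropout risk over time, rather than being read off a single conversation.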
Future opportunities for AI in healthcare
Artificial intelligence has been around for a while, but the most significant progress has been very recent.
“If you think about AI, we describe it as a spry 70 or 80-year-old,” Wisdom said, “because although it's existed since 1950, it's only really in the last five to seven years that new forms of machine learning have empowered applications like computer vision so that they have become really penetrative from a commercial perspective.”
These recent advancements are reflected in the levels of funding received by the AI community overall, as well as by data science and AI startups – a trend that is particularly relevant in life sciences and healthcare.
However, Wisdom said, there is a missing link in the human + machine equation which needs to be addressed as a next step.
“We call it collaborative intelligence,” he explained. “If you think about what a human is good at – articulation, communication, improvisation, generalisation – and what a machine is good at – repetition, memorisation, prediction – we need new ways to bring those two ends together to unlock new value.
“And we think there's a lot of opportunity in that middle space to create new solutions that are based on collaborative intelligence.”