Thought Leadership | 6th June 2018

The Future of Artificial Intelligence in Healthcare

Read Time: 4 minutes

Marks and Spencer recently announced that the branch in the town where I live will be closing. With British Home Stores and McDonald's long gone, and our House of Fraser department store on the brink of closure, the town centre is no more.

The good news for our townsfolk is that our local hospital, alongside two other neighbouring Trusts, has won its campaign to keep A&E open at all three sites. While local councillors, campaigners, and MPs all jumped for joy, as it is always a vote winner, I wondered whether this decision was something we should be celebrating.

As our high street brands are changing where they interact with their customers, so is the NHS. Marks and Spencer may be closing their 100-year-old store, but they are opening a state-of-the-art M&S Food store just out of town, responding to consumer needs and demands.

Last month, Jeremy Hunt explained in a newspaper interview that local NHS organisations need to transform services to support their sustainability, especially if they are to take advantage of all the new technology that is being introduced to improve patient diagnosis, care and treatment outcomes. Simply keeping services open because they have always been there is perhaps not the best way forward.

If you talk to anyone in the UK, they will always want care to be delivered locally, and many are resistant to change simply because they do not know, and cannot visualise, how care could be delivered any other way.

Take Artificial Intelligence (AI), for example: most people would probably associate it with Amazon's Alexa-powered Echo rather than with a new way of transforming and solving healthcare problems. Throughout the UK, AI is being integrated into the NHS by those organisations that realise the value of the technology in delivering healthcare.

Babylon Health, for example, is using AI to provide accessible healthcare for millions in the palm of their hands. Designed around how a doctor thinks, their AI-powered chatbot has enabled over 1.2 million patients in North London to check their symptoms through its platform. Once a patient inputs their symptoms, the platform responds with follow-on questions to determine the seriousness of the illness, condition or injury. Patients are then advised on potential next steps: seeking medical assistance, going to a chemist or self-managing their condition at home.

Last week, we also read how AI is now more effective than doctors at spotting skin cancers, and how Theresa May plans to deploy AI to prevent over 20,000 cancer-related deaths each year by 2033. The Prime Minister wants health charities, the NHS and the AI sector to pool data to transform the diagnosis of chronic diseases. Medical records, along with information about patients' habits and genetics, will be cross-referenced with national data to spot those at an early stage of cancer.

Streams is a secure mobile phone app, developed by Google's DeepMind Health, that aims to address what clinicians call "failure to rescue" – when the right nurse or doctor doesn't get to the right patient in time. Each year, thousands of people in UK hospitals die from conditions like sepsis and acute kidney injury because the warning signs aren't picked up and acted on in time.

Streams brings together important medical information, like patients' blood test results, in one place, allowing hospital clinicians to spot serious issues while they are on the move. If a problem is found, Streams can send an urgent, secure smartphone alert to the right clinician, along with information about previous conditions, so they can make an immediate diagnosis.

Streams also allows clinicians to instantly review their patients' vital signs, such as heart rate and blood pressure, and to record these observations straight into the app.

So perhaps it is now time to walk into that A&E department that has just been saved, and talk to the people who have been sitting there for over four hours waiting to see a doctor about what life could be like.

Your daughter has had a rash on her face for a couple of days now, so you seek advice via the AI-powered health app on your smartphone. It suggests it could be the "slapped cheek" virus and advises you to seek further medical advice, so you book an e-consultation with your doctor online. Just then your smart watch buzzes to remind you that you need to take your statin medication – it's such a shame Dad didn't have smart pill technology to remind him to take his medication, otherwise maybe he would still be here.

Right on cue, your doctor video calls you an hour later and uses the diagnostic device plugged into your smartphone to review your daughter's vital signs. You show her how the rash has progressed to her chest and confirm she has now developed a sore throat and headache too. She diagnoses scarlet fever and prescribes a course of antibiotics.

You say goodbye as you've got another call coming through. It's from the telecare company that monitors your mother's wellbeing at her bungalow on the other side of the country via a range of networked remote devices. At 82 she's still a force of nature, if a little frail, but she insists she is capable of living in her own home, and who are you to argue? They're calling to let you know her smoke alarm just went off for three minutes. You ring your mother and she says she was only toasting a crumpet and that you should be looking after your daughter, not worrying about her. You've got to go; the doorbell is buzzing. It's the delivery driver from the pharmacy with your daughter's medication.

Thanks to PwC for this case study.

We will of course always need buildings and beds, but when access to a local GP is incredibly difficult, and NHS Digital has identified that 59.5% of A&E attendances in my CCG were unnecessary, perhaps it's time for a rethink and a realignment of service provision.