More than half (57%) of the UK public is apprehensive about the rapid adoption of artificial intelligence (AI) in personalised medicine, fearing insufficient training may compromise safety, while the protection of health data is another concern for 49% of those polled by QBE.
Opinium Research, on behalf of the business insurer, conducted a survey among 2,000 UK adults to gauge their views on personalised medicine – tailored treatments based on patients’ unique genetic makeup, lifestyle, and disease characteristics – and the use of AI.
The report notes that the AI market in life sciences is projected to reach US$7.09 billion by 2028, and that half of the world's top 50 pharmaceutical companies have partnerships or licensing agreements in place with AI firms.
QBE’s research reveals that more than a third (36%) of respondents are unaware of AI’s role in personalised medicine, and three in five (60%) are uninformed about its rapid integration. Among those who are informed, however, 54% believe AI will enhance healthcare treatment.
Healthcare and the NHS are pivotal issues for voters, second only to the economy, according to pre-election polling.
Tim Galloway, life sciences portfolio manager at QBE Europe, commented: “While AI holds great promise for enhancing personalised medicine and the life sciences industry, we must proceed with caution. Continuous review of AI systems is vital to identify and rectify any glitches that could jeopardise patient safety.
“Substantial training programmes are necessary to ensure that healthcare professionals are proficient in using AI technologies correctly. AI will likely become a key aspect of risk management in life sciences, particularly around patient centricity.”
Potential benefits of AI in life sciences include drug discovery, genomics, medical device customisation, diagnostics, clinical trials, drug repurposing, regulatory compliance, biotechnology, and healthcare management. However, risks associated with rapid AI adoption include a lack of consolidated standards, training gaps, human error, data privacy and security issues, and medical device failures.