
How HCPs Are Actually Using AI at the Point of Care - And What It Means for HCP Engagement

Post by Dr Myles Furnace

Imagine a clinician who can ask an app for the latest treatment guidelines mid-consultation, or a hospital doctor whose AI assistant automatically writes the clinic note while they focus on the patient. This isn’t science fiction – it’s happening now across the UK. From general practice to specialty wards, artificial intelligence is increasingly present at the point of care. While flashy uses like AI in medical imaging often grab headlines, a quieter revolution is underway in knowledge retrieval and ambient clinical listening tools. These AI-powered assistants are helping clinicians instantly retrieve information and offload paperwork, fundamentally changing daily workflows. In this article, we dive deep into how UK healthcare professionals are using these tools, the current adoption and trust trends, and why industry partners (pharma, MedTech, and digital health firms) should take note.

Smarter Searches: AI Knowledge Assistants in Every Specialty

Clinicians are inundated with information – new research, evolving guidelines, complex patient data – yet they must make quick decisions. AI knowledge retrieval tools (including semantic search engines and clinical decision support platforms) have emerged to give HCPs instant answers from trusted sources. Instead of sifting through static databases, doctors can ask a question in natural language and get relevant, evidence-based guidance on the spot.

A number of examples now exist, including Eolas Medical (pronounced “oh-lus”), which lets clinicians ask natural-language questions – “How do I manage new-onset atrial fibrillation in a patient with asthma?”, for instance – and get an immediate response: the AI scans a curated knowledge base and pulls up the most relevant guidance, with key text passages highlighted. An independent evaluation found this saved around 2.5 minutes of a typical 10-minute GP appointment by streamlining information retrieval. That is 25% of the consultation time freed up – a significant efficiency gain when multiplied across hundreds of patients per week.

Importantly, these AI systems are designed with trust and accuracy in mind. Tools like Eolas Medical don’t fabricate answers; they retrieve them from pre-vetted clinical sources. The platform only searches credible guidelines, research and local protocols, avoiding the “hallucinations” a generic chatbot might produce, and provides evidence-based answers tailored to the clinician’s context: right doctor, right information, right time. This focus on trusted content is critical for adoption – busy clinicians won’t use something they don’t trust. Early users report that it feels like using a supercharged clinical search engine rather than chatting with an unpredictable AI.
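
For the technically curious, the “retrieve, don’t invent” approach can be sketched in a few lines of Python. The snippet below is a generic illustration only – not Eolas Medical’s actual implementation – with placeholder passages and a naive keyword-overlap ranking; a production system would use embedding-based semantic search over licensed guidelines and local protocols, plus a generative model to summarise what it retrieves.

```python
# Illustrative sketch of answering only from a pre-vetted knowledge base.
# The passages, sources and ranking method below are placeholders.

from dataclasses import dataclass

@dataclass
class Passage:
    source: str   # where the text came from, so the answer stays attributable
    text: str

# Hypothetical curated knowledge base (real systems index licensed guidelines,
# research and local protocols, typically with embeddings rather than keywords).
KNOWLEDGE_BASE = [
    Passage("National AF guideline (illustrative)",
            "In new-onset atrial fibrillation, assess rate versus rhythm control and stroke risk..."),
    Passage("Local asthma protocol (illustrative)",
            "Beta-blockers should be used with caution in patients with asthma..."),
]

def retrieve(question: str, k: int = 2) -> list[Passage]:
    """Rank passages by naive keyword overlap with the question and return the top k."""
    q_words = set(question.lower().split())
    ranked = sorted(
        KNOWLEDGE_BASE,
        key=lambda p: len(q_words & set(p.text.lower().split())),
        reverse=True,
    )
    return ranked[:k]

if __name__ == "__main__":
    question = "How to manage new-onset atrial fibrillation in a patient with asthma?"
    for passage in retrieve(question):
        print(f"[{passage.source}] {passage.text}")
```

Because every answer is traced back to a named source, the clinician can always check the underlying guidance rather than trusting a free-text generation.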

That said, generative AI is already creeping in. Some clinicians have experimented with mainstream chatbots (like ChatGPT or Bing’s AI) for quick answers or drafting text – often under the radar. A 2024 survey of 1,000 UK GPs found that one in five had used AI tools such as ChatGPT in their practice, and a separate study carried out by Eolas found that over 45% of clinicians had used such tools for clinical questions.

This grassroots use shows the perceived utility of AI, but it also raises concerns: one survey warned about patient confidentiality risks, since it is unclear how data entered into public AI services like ChatGPT might be used – which is precisely why healthcare-specific solutions are so important.

Real-World Use Cases and Benefits

AI knowledge tools are already proving their worth across various specialties and settings:

  • General Practice: Family doctors use AI assistants to quickly look up treatment guidelines, drug dosing, or differential diagnoses for uncommon symptoms. For instance, if a GP encounters a rare condition or an unfamiliar drug interaction, an AI query can fetch the answer in seconds. Clinicians say this reduces the need for interruptive “phone-a-friend” consults or flipping through reference books during a patient visit. In NHS pilot studies, GPs saved time and felt more confident that they weren’t missing something obscure when AI decision support was on hand. Some GPs even use AI as a documentation aid – typing a quick bullet list of a consultation’s findings and letting the AI expand it into a structured clinical letter (to be refined by the doctor). This overlaps with the “ambient scribe” concept discussed later.

  • Hospital Specialties: In hospitals, specialists tap into AI for decision support on complex cases. A cardiologist might query an AI tool about the latest evidence for a difficult-to-treat arrhythmia, or a junior doctor in A&E might use it to double-check the recommended antibiotic for an unusual infection. International platforms like UpToDate and ClinicalKey are increasingly embedding AI features to make searching faster and more intuitive. Notably, Elsevier’s ClinicalKey (a popular reference database) launched a generative AI-powered version in 2024. It integrates directly into EHR systems like Epic, so a doctor can query knowledge without leaving the patient’s record, and it serves up succinct answers backed by journal articles and guidelines. This tight integration into workflow is aimed at delivering just-in-time knowledge. Early feedback indicates clinicians value saving those precious few minutes and clicks, especially in fast-paced settings.

  • Across All Fields: Clinicians uniformly appreciate anything that saves time or reduces cognitive load. AI search tools help by sifting mountains of data – whether it’s a GP quickly finding local referral criteria, an oncologist accessing the latest trial results for a tumour board discussion, or a mental health professional retrieving a relevant NICE guideline on therapy approaches.

Despite these advantages, trust and validation remain paramount. Many HCPs currently use AI in a double-checking mode – they’ll consult the AI, but also verify its suggestions against their own knowledge or a trusted source. If an AI tool suggests a diagnosis or treatment that a clinician feels is off-base, they will (and should) override it without hesitation. This professional skepticism is healthy; over time, as tools prove their accuracy and are rigorously validated, confidence in their recommendations will grow. The ideal scenario is a “centaur” model – human clinicians partnered with AI assistants, each complementing the other. The AI brings the breadth of knowledge and speed; the human brings experience, context, and empathy. Used in this way, AI decision support can enhance clinicians’ own reasoning rather than replace it.

Looking ahead, experts anticipate even more seamless knowledge support; it is increasingly clear that this is the direction of travel for how HCPs will access medical knowledge.

Ambient AI Scribes: Let Doctors Doctor, Not Dictate

Another huge pain point for HCPs is documentation. Studies show clinicians can spend up to half their workday on paperwork and data entry, clicking through electronic health records (EHRs) or writing notes, instead of interacting with patients. This not only consumes time but also contributes to burnout – many doctors feel they act as “highly trained clerical workers” some days. Enter AI-powered ambient clinical listening tools, often called AI scribes. These systems use speech recognition and natural language processing (NLP), combined with generative AI, to listen to doctor-patient conversations and automatically draft clinical notes, letters, or even populate EHR fields. Essentially, they aim to take the notepad (or keyboard) out of the clinician’s hands, so they can give full attention to the patient.
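
Conceptually, the pipeline behind these tools is straightforward: transcribe the conversation, structure the transcript into a draft note, and hand that draft to the clinician for review and sign-off. The sketch below illustrates that flow in schematic Python, with the speech-to-text and drafting steps stubbed out; it is an assumption-laden illustration, not how TORTUS or any particular product is actually built.

```python
# Schematic ambient-scribe flow: audio -> transcript -> draft note -> clinician sign-off.
# The transcription and drafting functions are stubs; real systems use dedicated
# speech-recognition engines and a generative model for these steps.

def transcribe(audio_chunk: bytes) -> str:
    """Placeholder for a speech-to-text engine running during the consultation."""
    return "Patient reports two weeks of productive cough, no fever, no red flags..."

def draft_note(transcript: str) -> dict:
    """Placeholder for the generative step that structures the conversation
    into the sections a clinic note and patient letter normally contain."""
    return {
        "history": transcript,
        "examination": "",      # populated from the conversation in a real system
        "plan": "",
        "patient_letter": "",
    }

def clinician_sign_off(draft: dict, edits: dict) -> dict:
    """The clinician stays accountable: the note is only filed once they have
    reviewed, edited and approved the AI-generated draft."""
    final = {**draft, **edits}
    final["status"] = "approved_by_clinician"
    return final

if __name__ == "__main__":
    note = draft_note(transcribe(b"...audio stream..."))
    final_note = clinician_sign_off(
        note, {"plan": "Delayed antibiotic prescription; safety-netting advice given."}
    )
    print(final_note)
```

The important design point is the last step: the AI produces a draft, but nothing enters the record until a human has reviewed and approved it.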

Ambient AI scribing has gained significant momentum in the UK over the past year. In late 2024, Great Ormond Street Hospital (GOSH) in London led the first NHS trial of a bespoke AI scribe tool called TORTUS – and the results have been striking. The TORTUS system uses advanced speech-to-text combined with a generative AI to produce consultation notes. During an outpatient visit, it listens to the conversation (via a microphone in the room or a clinician’s device) and in real time creates a draft of the clinic note and a patient letter. By the end of the appointment, the clinician has a written summary that they only need to lightly edit and approve, rather than compose from scratch. In GOSH’s early trials, 100 real outpatient appointments were run with the AI scribe in the room. Clinicians reported that the AI’s presence did not interfere with the consultation – if anything, it improved it. Doctors could maintain eye contact and engage with the patient, knowing the note was being taken care of in the background. All of the clinicians said the tool helped them give “full attention to their patients” without reducing the quality of documentation. In fact, the draft letters were accurate and detailed, often needing only minor tweaks.

Why Ambient Scribes Are a Game-Changer

Early adopters and NHS digital leaders are excited about ambient AI scribes because the benefits touch on multiple chronic problems in healthcare delivery. According to NHS England’s review of these technologies, adoption of ambient scribing can transform care settings by improving efficiency, documentation quality, and staff well-being. Here are some key advantages being observed:

  • Reduced Administrative Burden: Doctors and nurses spend less time on manual data entry and paperwork. Routine tasks like typing up notes, filling forms, or writing letters are handled by the AI, freeing clinicians from the keyboard. This not only saves time but also mental energy.

  • More Face-to-Face Patient Time: With note-taking offloaded, clinicians can focus fully on the patient during the encounter. They can maintain eye contact and listen more deeply instead of worrying about jotting everything down. Patients feel heard and report higher satisfaction when the clinician isn’t glued to a screen. In trials, doctors overwhelmingly said they could give undivided attention to patients and felt their consultations were more human-centred.

  • Improved Documentation Quality: The AI captures detailed notes in real time, producing comprehensive and structured records. Nothing important gets missed due to memory or haste.

  • Shorter Appointments & Productivity Gains: By handling note-writing during the consult (instead of after), ambient scribes can shorten the total time per patient. Early findings show some appointments could end a few minutes sooner while still covering everything, meaning clinics can potentially see more patients in the same time frame.

  • Reduced Burnout: Documentation overload is a known contributor to clinician burnout. Offloading this to an AI (even partially) can help alleviate stress. Doctors can go home earlier (since there are fewer notes to finish after hours) and spend more of their day on the parts of the job that feel meaningful – patient care and problem-solving.

  • Better Data for the System: Interestingly, by making it easier to capture data into the EHR, these tools can lead to more consistent use of digital records. NHS analysts noted that ambient voice tech increased utilisation of the EHR systems’ structured fields, improving the overall volume and consistency of data collected.



Of course, implementing AI scribes isn’t as simple as flipping a switch. Trust and safety are just as crucial here as with knowledge AI – perhaps even more so, since patient-identifiable data is being processed. Both clinicians and patients need confidence that the AI will respect privacy and keep data secure. The good news is that NHS England is taking a proactive approach: in April 2025, it published detailed guidance for trusts on deploying ambient scribing tech safely, with emphasis on data governance, security, and regulatory compliance. All deployments must ensure that patient data is handled confidentially (e.g. processed either locally or via approved secure cloud routes) and that clinicians validate all AI-generated content. In practice, the AI drafts remain under the clinician’s control – they edit and sign off the final note, maintaining accountability.

Adoption and Trust: The HCP Perspective

It’s clear that AI for knowledge retrieval and documentation is making waves, but how do healthcare professionals feel about these tools? Adoption ultimately hinges on frontline buy-in. Several themes have emerged in the UK:

  • Eager but Cautious: Doctors and nurses are generally excited about anything that can ease their workload or improve patient care. Surveys show a majority of NHS staff support the use of AI in patient care and even more in administrative tasks. They see the potential for faster decisions, fewer errors, and less drudgery. However, clinicians are also naturally cautious – they’re trained to “first, do no harm.” So, while many HCPs are trying AI tools (as evidenced by the one-fifth of GPs using ChatGPT-like tools informally), they do so carefully. They double-check AI outputs and are alert to the technology’s current limits. As one doctor in a GMC-commissioned study noted, some diagnostic AI systems are not yet accurate enough for them to fully trust, leading them to spend extra time verifying suggestions. This shows that if an AI tool isn’t reliable, it can ironically create more work (the doctor cross-checks it) – so accuracy is paramount for sustained use.

  • Maintaining Control: Clinicians insist that responsibility lies with them, not the AI. They want AI to assist, not autonomously decide. The same GMC-commissioned research highlighted that doctors feel confident to override AI decisions and will do so whenever the AI’s output conflicts with their clinical judgment. In essence, HCPs treat AI advice like they would a human colleague’s advice – helpful, but ultimately advisory. This attitude is important for safety; it also means AI needs to be integrated into clinical workflows as a support tool, with clear ways for clinicians to accept, modify, or reject its outputs.

  • Training and Understanding: There’s a growing recognition that HCPs need some level of AI literacy. If a tool is suggesting a diagnosis or auto-drafting a note, clinicians want to understand how it works and its known pitfalls. The NHS is starting to include AI training in curricula (e.g., the Topol Review recommendations for upskilling staff in digital health). The more clinicians grasp what an AI can/can’t do, the more effectively they can use it. During deployments, hands-on training and support are critical. For example, when rolling out an AI scribe, a hospital might run simulated sessions for doctors to get comfortable with editing AI-generated notes and to set expectations about accuracy. Users who understand a tool’s strengths (and verify its results initially) will build trust through experience as they see it consistently perform well.

  • Addressing Privacy & Ethics: Trust is also about doing right by the patient. Clinicians are vocal that any AI must uphold patient confidentiality and data security. Professional guidelines (from bodies like the BMA and GMC) are being developed to clarify how patient data can be used with AI, what consent might be needed, and how to be transparent with patients. The BMA researchers explicitly warned that using tools like ChatGPT with patient info could undermine confidentiality if data isn’t properly protected. This has prompted calls for NHS-sanctioned platforms where generative AI can be used on patient data safely (for instance, an NHS digital sandbox version of ChatGPT that doesn’t leak information). Until then, clinicians are treading carefully and often limiting AI to non-sensitive tasks or using only de-identified data. Having clear ethical guidelines and robust data security in AI tools will be key to earning clinician trust at scale.

  • Demand for Evidence: Lastly, HCPs expect to see evidence of benefit before relying on a new tool. It’s one thing to pilot an AI in a controlled setting; it’s another to embed it into everyday practice. Clinicians want to know: does using this AI actually improve outcomes? Does it save me time, net of the effort to use it?

Overall, UK clinicians appear to be approaching AI with a mix of open-mindedness and healthy skepticism. They see that AI will inevitably play a bigger role, and they don’t want to be left behind – especially if it can help with pressing issues like workload and waiting lists. But they also won’t accept AI that isn’t safe, effective, and easy to use. The onus is on technology providers and health system leaders to meet those expectations, through robust design, user-centric implementation, and ongoing support.

Why These Trends Matter for Pharma, MedTech, and Digital Health Companies

The rise of AI at the point of care isn’t just a clinical story – it’s also an industry game-changer. Healthcare industry partners such as pharmaceutical companies, medical device firms, and digital health platform providers need to understand these trends and adapt their strategies. Here’s why:

1. The New Frontier of HCP Engagement: Traditionally, pharma and medtech companies engage healthcare professionals (HCPs) through reps, conferences, journals, and educational materials. But if HCPs increasingly turn to AI assistants for information and support, then those assistants become critical channels for engagement. In short, AI tools are changing how HCPs find and consume information, including product and therapy updates. For example, instead of browsing a website or pamphlet, a doctor might ask their AI assistant, “What are the latest treatment options for Condition X?” If you’re a pharma company with a leading therapy for X, you want to ensure the AI’s answer includes accurate, up-to-date information about your drug. This means industry must collaborate with the providers of these AI tools so that their content (e.g. clinical trial results, new guidelines, safety updates) is integrated.

2. Real-Time Content Delivery and “Next Best Actions”: AI at the point of care enables something very powerful: delivering the right information at the right time. For instance, an AI integrated into the HCP’s workflow could show a suggestion when a certain clinical scenario arises – “Consider Therapy Y for this patient, since they have characteristics ABC.” From a pharma perspective, if Therapy Y is your product, having that AI-driven notification can be gold. Of course, it must be evidence-based and appropriate (this isn’t about ads, but about clinical decision support that solves the doctor’s problem and benefits the patient). Several medtech and pharma-backed initiatives are looking at how to use AI-driven alerts to prompt clinicians when a patient could benefit from a specific intervention, and many have already partnered with Eolas.

3. Medical Education in the AI Era: Pharma and device companies invest heavily in medical education – sponsoring CME events, publishing educational content, and so on – to help clinicians stay current (and of course to inform them about the latest therapies). With AI knowledge tools, HCPs may rely less on traditional educational sessions and more on on-demand learning. This is a chance for industry to reimagine medical education through AI – for example, letting clinicians earn CME credits while using a knowledge platform for clinical queries. Every time a doctor looks up a topic, that learning can count toward their formal education requirements. Industry could leverage this by contributing high-quality educational content to such platforms – ensuring that when an AI provides an answer, it also points to a module or reference that offers deeper learning, perhaps sponsored or co-created by industry experts.

Imagine an oncologist asks an AI about a new class of cancer drugs; the AI could provide a summary and also link to a short video CME snippet (produced with a pharma’s support) about how those drugs work. By meeting HCPs in the moment of need, companies can foster engagement and be seen as value-adding partners in education. Digital health companies can also integrate AI chatbots in their physician portals or apps to answer questions about devices or diagnostics, providing instant support and training tips. Ultimately, AI can facilitate a shift from periodic, scheduled education to continuous, bite-sized learning, and industry players who enable that will build stronger relationships with clinicians.

4. Improved HCP Experience = Better Engagement: Healthcare providers are more likely to engage with industry-provided content or tools if they are seamlessly integrated into the clinical workflow and actually helpful. AI solutions offer an opportunity to do exactly that. For example, a medtech company might integrate an AI assistant into its medical device interface to guide surgeons during a procedure or help troubleshoot – essentially turning product support into an AI-driven co-pilot.

5. Data and Insights Feedback Loop: As HCPs use AI tools, those tools (with appropriate consent and anonymisation) can generate valuable insights into what clinicians need and what challenges patients are facing. For example, aggregate data from a knowledge platform might show that many doctors are querying about a particular complication or struggling with a certain guideline. Pharma and medtech companies can utilise these insights for better content creation and support. If a pharma sees that thousands of clinicians asked the AI about managing a drug’s side effect, they know to perhaps create better educational material on that, or even feed that information back to R&D or safety teams. It’s a more immediate and dynamic understanding of HCP needs.
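
As a toy illustration of that feedback loop, the snippet below simply counts anonymised query topics to surface the themes clinicians ask about most. The data and field names are hypothetical, and any real deployment would sit behind proper de-identification, consent, and information-governance controls before aggregate analysis of this kind.

```python
# Hypothetical example: spotting the topics clinicians query most often,
# using only anonymised, aggregated data (illustrative records below).

from collections import Counter

anonymised_queries = [
    {"specialty": "GP", "topic": "drug X - nausea management"},
    {"specialty": "GP", "topic": "drug X - nausea management"},
    {"specialty": "Oncology", "topic": "drug X - dose adjustment in renal impairment"},
    {"specialty": "GP", "topic": "condition Y - referral criteria"},
]

topic_counts = Counter(q["topic"] for q in anonymised_queries)

# Surface the most common themes so educational content (or safety follow-up)
# can be targeted where clinicians actually have questions.
for topic, count in topic_counts.most_common(3):
    print(f"{count} queries: {topic}")
```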

In summary, the trends of AI knowledge assistants and ambient scribes are not just an IT upgrade – they represent a paradigm shift in how information flows in healthcare. Pharmaceutical and medtech companies, as well as health IT providers, should view these AI tools as the new interface between them and clinicians. Those that adapt by integrating their offerings and messaging into these AI-driven workflows will remain visible and valuable to HCPs. Those that don’t may find it harder to get attention in the future, as traditional channels become less frequented.

Conclusion: AI-powered HCP engagement is here to stay

UK healthcare is at an exciting inflection point. Across hospitals, clinics, and GP surgeries, AI is beginning to weave into the fabric of clinical practice, augmenting how professionals retrieve knowledge and record information. Early adopters have shown that, when thoughtfully implemented, these technologies can save precious minutes, support clinical decisions, and reduce the grind of administrative tasks – all while preserving or even enhancing the quality of care delivered. Crucially, this isn’t about AI replacing clinicians; it’s about freeing clinicians to do what only humans can: empathise, exercise judgment, and connect with patients.

Never miss an insight 👉 Subscribe now: https://www.eolasmedical.com/hcp-engagement-and-insights-newsletter