Moorfields Eye Hospital launches first virtual assistant to connect with patients
The virtual assistant, known as the Oriel Assistant, will answer questions from patients, staff and the general public and let them have their say on the proposal – a partnership between Moorfields Eye Hospital, the UCL Institute of Ophthalmology and Moorfields Eye Charity. The Oriel Assistant expands the options for patients and the general public to give feedback on the proposal; other options include open discussion groups and events.
As one of the largest NHS projects currently under way in London, the proposal was put to a public consultation launched by Camden Clinical Commissioning Group (CCG) and NHS England Specialised Commissioning in May, ensuring patients and the public have their say on the exciting plans.
The Oriel Assistant will support the consultation by answering questions, but it will also interact with patients, staff and the public in a new way. It turns the traditional virtual assistant model on its head: using its understanding of natural language, it asks a relevant question from the consultation whenever a similar question or theme comes up in the chat, creating a two-way question-and-answer conversation between the person and the virtual assistant. Answers gathered by the Oriel Assistant will be fed into the broader consultation, which closes on 16 September.
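The two-way exchange described above can be pictured as follows. This is an illustrative sketch only – the theme names, answers and matching logic are invented for the example and do not reflect the production Watson Assistant configuration, which uses genuine natural-language intent detection rather than keyword matching.

```python
# Hypothetical sketch of the two-way flow: when a visitor's message matches
# a theme, the assistant answers and, if that theme has an associated open
# consultation question, asks it back. All data here is illustrative.

# theme -> (canned answer, optional consultation question to ask back)
THEMES = {
    "transport": (
        "The proposed new site is close to St Pancras station.",
        "How would the move affect your journey to appointments?",
    ),
    "opening hours": (
        "Current clinic hours would carry over to the new centre.",
        None,  # no consultation question attached to this theme
    ),
}

def detect_theme(message):
    """Very crude stand-in for natural-language theme detection."""
    for theme in THEMES:
        if theme in message.lower():
            return theme
    return None

def respond(message):
    """Return (answer, follow_up); the follow-up feeds the consultation."""
    theme = detect_theme(message)
    if theme is None:
        return ("Sorry, I don't have an answer for that yet.", None)
    answer, follow_up = THEMES[theme]
    return (answer, follow_up)

answer, follow_up = respond("Will transport links be good at the new site?")
```

In a real deployment the visitor's reply to the follow-up question would be logged and passed into the consultation responses alongside those gathered at discussion groups and events.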
David Probert, chief executive of Moorfields Eye Hospital, said: “This is an exciting step forward towards delivering the modern, efficient and effective care our patients deserve. Central to this is our proposal to move our hospital to a new purpose-built centre where we would be able to transform lives, turn research into new treatments faster and share our knowledge and understanding with the clinicians of tomorrow.

“But innovation is not just limited to our clinical environment; it is also about how we connect with our patients. That is why we have developed the Oriel Assistant, to provide round-the-clock answers to questions and information about how the proposal could affect our patients, staff and the wider public.

“After this initial launch, we will look at how we can expand the service to support patients with a range of issues, such as information on appointments and improved access for those with sight loss.”
The Oriel Assistant has been trained on over 500 questions gathered by surveying the general public, covering everything from directions to the proposed new site, the accessibility of the new location and current services, to how the move will affect younger patients, staff and private patients. The Oriel Assistant will monitor and analyse the questions it receives and develop over time as it learns from its interactions with patients and staff, detecting trends and providing insight.
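The trend detection mentioned above could, in its simplest form, amount to tallying which themes come up most often in the chat log. The sketch below is a hypothetical illustration, not Moorfields' actual analytics; the themes and questions are invented for the example.

```python
# Illustrative sketch of surfacing trends from logged questions by counting
# how often each detected theme appears. Data here is invented.
from collections import Counter

def theme_trends(logged_questions):
    """Rank themes by how often they appear in (question, theme) pairs."""
    counts = Counter(theme for _, theme in logged_questions if theme)
    return counts.most_common()

log = [
    ("How do I get to the new site?", "directions"),
    ("Is the new building wheelchair accessible?", "accessibility"),
    ("What bus routes serve the site?", "directions"),
]
trends = theme_trends(log)  # [('directions', 2), ('accessibility', 1)]
```

A rising count for a theme would flag it as a topic the consultation team may want to address proactively.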
Developed using IBM Watson Assistant and hosted on the IBM Cloud, the Oriel Assistant lets patients and the general public interact with it directly, via a dedicated landing page on the Oriel website, and receive rapid and consistent answers to their questions. To cater for a broad range of patients and staff, the Oriel Assistant will be accessible to visually impaired users: a custom-made interface has been designed that is fully compatible with screen readers, and users can adapt the font size or change the colour contrast on screen to suit their needs.
The Oriel Assistant is the first product of the Maia (Moorfields AI Assistant) Project. After the initial roll-out, Moorfields plans to expand the capabilities of its AI-powered assistants, training them to respond to day-to-day queries patients have about the hospital, their appointments and their care, while continuing to make them ever more accessible to those with reduced vision.
Peter Thomas, Director of Digital Innovation and Consultant Paediatric Ophthalmologist at Moorfields, said: “Moorfields has established a reputation as an early leader in researching the clinical applications of artificial intelligence. The release of the Oriel Assistant is an important milestone as we are making an AI-powered service available to the public, and also helping to avoid the digital exclusion of visually impaired users. As we move forward with the Maia project, we plan to expand its capabilities to communicate with patients when they most need it, or are most worried about their vision – even if that is in the middle of the night. Importantly, we plan to make our AI assistants available via voice, so that patients with any level of vision can benefit from this technology.”