Individualised disability aids with extended reality technologies.



About us
Why do we develop individualised disability aids?
Whether in the real world or in XR, our access to information, knowledge, and society at large depends on our ability to move and sense our surroundings. Despite declared equal rights of access, there are real, practical impediments to participation in society for many people with motor or sensory impairments – whether affecting movement, vision, or hearing. XR has the potential to transform access to work and society via individualised augmentation of information in real time.
People with disabilities form the largest minority in society, with an estimated 15% of the global population living with some form of disability. Among the most prevalent disabilities are hearing loss, affecting c. 466 million people, vision loss, affecting c. 250 million people, and mobility impairments, with more than 65 million people needing wheelchairs. A 2017 review reported a 26% rise in the number of people with moderate to severe vision impairment between 1990 and 2015, attributed to population growth and ageing; this number is expected to continue rising rapidly, with an urgent need to “scale up vision impairment alleviation efforts at all levels” (Bourne et al., 2017, The Lancet Global Health, p. e888). The World Health Organization estimates that about 5% of the world’s population (430 million people) have hearing loss that requires treatment. This number is expected to nearly double over the next 30 years as the world gets noisier and the number of elderly people increases.
As we have developed AL-EYE since 2019, more and more people are realising the potential of XR technologies to transform the lives of persons with disabilities – Meta/Facebook, for example, commissioned a review from Funka Nu AB outlining future developments. However, AL-EYE remains unique in 2025 for its inclusion of eye tracking and individualised support, covered by five pending patents.
- Funka Nu AB (2021). New Realities: Unlocking the Potential of XR for Persons with Disabilities.
- World Health Organization (2011). World Report on Disability.
- International Federation of Hard of Hearing People (IFHOH) (2020).
- Bourne RRA, Flaxman SR, Braithwaite T, Cicinelli MV, Das A, et al. Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: a systematic review and meta-analysis. The Lancet Global Health. 2017;5(9):e888–e897.
- World Health Organization (2021). World Report on Hearing. Geneva: WHO. Licence: CC BY-NC-SA 3.0 IGO.
Multi-sensory support
Universal Design
By solving problems for people with sensory and motor disabilities, we push the state of the art of XR technologies, incorporating the neuroscience of visual and auditory perception and attention, together with methods from signal processing, optics, and machine learning, into the pipeline.
All users are disabled to some degree at various times – either temporarily, due to circumstances such as poor light conditions or not having hands free to interact, or in the longer term, due to age-related hearing or vision loss. In many situations, alternative access or control methods and sensory support benefit everyone.
By focusing on universal design, including users with disabilities in the design process from day one, and availing of the existing ethical and regulatory framework for the development of assistive devices, we aim to push the state of the art of extended reality technologies for good.
Our team

Mark Bo Jensen
Senior Software Engineer
AL-EYE – STORY
Our Journey.
Every story has a beginning 
In 2019, after conducting a review of available visual aids, Fiona designed a visual aid based on emerging mixed reality hardware with integrated eye tracking. No visual aid on the market used eye tracking – and without it, individualised support across the retina is not possible.

Prototype & X-Tech
In 2020, Fiona proposes the AL-EYE visual aid to a group taking DTU’s X-Tech course, and the first prototype is built.

Ongoing work, testing, and our funding partners
From 2020 to 2022, the DTU campus is very quiet under lockdown, but in our lab the work continues, with six Master’s theses, three Bachelor’s projects, and three special courses developing and testing the visual aid and aspects of multisensory perceptual support. Kevin and Mark join the project to lead software development and participatory design with users, and we win funding from DTU SkyLab and from InnoFonden to support the work. We carry out trials with low-vision users in collaboration with the Danish Institute for the Blind (IBOS) and a specialist low-vision clinic at Rigshospitalet Glostrup. Many people with vision loss visit DTU, spending many hours trying out our individually tailored solutions and teaching us what it is like to have vision loss and what challenges they face in everyday life.

EU demo showcase!
Master’s student Mateusz Sadowski and Research Assistant Boasheng Hou demonstrate the AL-EYE visual aid to EU Commissioner Mariya Gabriel on 11 November 2021. Fiona pitches for funds and is supported by InnoFonden and DTU SkyLab Discovery and PoC grants.
DTU X-Tech builds XR perimetry
In 2022, Fiona leads the second X-Tech team to work on AL-EYE – XR-based perimetry, without the need to keep the eye still. This team works on a user interface for clinicians and opticians, and a gamified version of the tests suitable for children.
Perfecting our universal disability aids
2023 and 2024 bring more special courses and Master’s theses that contribute to XR-based “universal disability aids”, including eye-tracker data handling methods and new measures of balance and vestibular function in XR, which model the user more completely and enhance each user’s digital twin. Fiona pitches again for funds, and InnoFonden supports us. The tests of visual function are integrated into the visual aid. Yang Xu joins as Research Assistant to work on software, refining the integration of the many coordinate systems involved to make AL-EYE as robust and accurate as possible. Meta visits DTU, and Fiona presents AL-EYE to Jan Linden.
The story continues, into more powerful multisensory aids.
In 2025, we continue development of AL-EYE towards multisensory support for persons with hearing loss as well as vision loss. We launch the AL-EYE website, designed by Ziru Li, and prototype methods of multisensory support for sound source localisation in hearing aid and cochlear implant users – to help them follow conversations. We are also developing methods of translating music, and specifically harmonic tension in music, into 3D visualisations. We also make a push for investment and funding. If you would like to support our work in any capacity, please contact Fiona Brid Mulvey.