IMAGE Project


Making internet graphics accessible through rich audio and touch

Download the IMAGE Chrome browser extension here! (It works on the Chrome, Edge, Brave, and Opera browsers.)

Welcome to the website of IMAGE (Internet Multimodal Access to Graphical Exploration). This project is carried out by McGill University's Shared Reality Lab (SRL), in strategic partnership with Gateway Navigation CCC Ltd and the Canadian Council of the Blind (CCB). The project was initially funded by Innovation, Science and Economic Development Canada through the Assistive Technology Program, and is now funded by Healthy Brains, Healthy Lives through the IGNITE Neuro Commercialization Grant. The motivation for this project is to improve access to internet graphics for people who are blind or partially sighted.

The Problem

On the internet, graphic material such as maps, photographs, and charts representing numerical information is clear and straightforward to those who can see it. For people who are blind or have low vision, this is not the case. Graphical information is often rendered only as manually generated HTML alt-text labels, which are frequently abridged and lacking in richness. This is a better-than-nothing solution, but it remains woefully inadequate. Artificial intelligence (AI) technology can improve the situation, but existing solutions are non-interactive and provide a minimal summary at best, without conveying a cognitive understanding of the content, such as points of interest within a map or the relationships between elements of a schematic diagram. As a result, the essential information in the graphic frequently remains inaccessible.


Our Approach

We use rich audio (sonification) together with the sense of touch (haptics) to provide a faster and more nuanced experience of graphics on the web. For example, with spatial audio, where the user experiences sound moving around them through their headphones, information about the spatial relationships between objects in a scene can be conveyed quickly, without long descriptions being read aloud. In addition, rather than offering only a passive listening experience, an optional haptic device lets the user literally feel aspects of the content, such as the regions of a landscape, the objects in a photo, or the trend of a line on a graph. This permits interpretation of maps, charts, and photographs in which the visual experience is replaced with multimodal sensory feedback, rendered in a manner that helps overcome access barriers for users who are blind, deafblind, or partially sighted.

Engaging the Community

Collaborating with the community is key when creating accessible technology. Our team is partnering with Gateway Navigation CCC Ltd and the Canadian Council of the Blind (CCB), a consumer organization of Canadians who are blind, to ensure that our system is in line with the needs of the community. As part of our co-design approach, we are in regular contact with community members, who are helping to guide the development process, but there is always room for more voices. If you'd like to contribute to the project, we invite you to fill out our community survey.

Participate in our community survey.


Our Technology

Our project is designed to be as freely available as possible, as well as extensible so that artists, technologists, or even companies can produce new experiences for specific graphical content that they know how to render. If someone has a special way of rendering cat photos, they do not have to reinvent the wheel, but can create a module that focuses on their specific audio and haptic rendering, and plug it into our overall system.
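To make the plug-in idea above concrete, here is a minimal sketch of what such a module interface could look like, in TypeScript. All names and shapes here (RendererModule, Rendering, the "cat photo" module) are illustrative assumptions for this page, not the actual IMAGE API.

```typescript
// Hypothetical sketch of a pluggable rendering module. The interface names
// and fields are assumptions for illustration, not the real IMAGE API.

interface Rendering {
  description: string;   // short text summary of the graphic
  audioUrl?: string;     // optional sonification to play back
  hapticTrace?: number[]; // optional data for a haptic device
}

interface RendererModule {
  name: string;
  // Report whether this module knows how to render the given kind of graphic.
  canHandle(contentType: string): boolean;
  // Produce an audio/haptic rendering for the graphic's raw bytes.
  render(imageData: Uint8Array): Rendering;
}

// A toy module specialized for cat photos, echoing the example above.
const catPhotoRenderer: RendererModule = {
  name: "cat-photo-renderer",
  canHandle: (contentType) => contentType === "photo/cat",
  render: (imageData) => ({
    description: `A cat photo (${imageData.length} bytes analyzed)`,
  }),
};

// The core system dispatches each graphic to every module that can handle it,
// so new modules extend the system without touching existing ones.
function renderAll(
  modules: RendererModule[],
  contentType: string,
  data: Uint8Array,
): Rendering[] {
  return modules
    .filter((m) => m.canHandle(contentType))
    .map((m) => m.render(data));
}
```

Under this kind of design, registering a new experience is just adding one more object to the module list; the dispatcher and the other modules are unaffected.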

Learn more about how our system works, or take a peek at what we're planning for IMAGE in the future. Maybe you'd like to use the server code or maybe the browser code in your own project.

Contact Us

For any information related to the project, please contact us. You can also follow our lab on Twitter and LinkedIn.