The IMAGE extension is fully supported on Chrome, but also works on Edge, Brave, Safari, and Opera on desktop and laptop computers. To use IMAGE, install the extension in your web browser by visiting the IMAGE Project page on the Chrome Web Store and clicking "Add to Chrome". A button labelled "Interpret this graphic with IMAGE" will appear after graphics, charts, and maps, so you can choose a specific graphic for which you would like a richer experience. Activating one of these buttons sends the graphic to the IMAGE server, which sends back one or more interpretations presented to you through audio, text, or haptics. Because the actual graphic is sent to a server, never use IMAGE on graphics that may contain sensitive or personal information.
If you prefer, you can open the IMAGE Extension Launch Pad by pressing Alt+I on your keyboard. This opens a window where you can access this help page, open a file from your computer, or manage the extension's options.
For the best experience listening to the interpretations, please use stereo headphones. Additionally, if you wish, you can use haptic devices with IMAGE. This is a work in progress, and entirely optional, but it will enhance the experience! You can use the Haply 2DIY; for a more in-depth explanation of this device, see our Technology page.
Once you have IMAGE installed, you can activate it in several ways:
IMAGE interprets photos as being made up of two broad classes of content: the first is regions, and the second is things and people.
Regions are broad areas of a photo made up of similar objects, like the trees in a forest, or of large things, such as a wall or floor. This gives you an idea of where the larger components of the photo are relative to other elements of the graphic, as well as their rough shape and spatial extent.
This is what three regions sound like together:
To understand what you are hearing or feeling, imagine that, guided by the buzzing sound, you are stretching your arm out in front of you to feel the very edge of the region and tracing the contour of that edge clockwise, left to right, with the pitch rising and falling as the trace moves down and up.
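If it helps to picture the mapping, here is a minimal TypeScript sketch using the Web Audio API. It is an illustration only, not the extension's actual implementation: each point along the contour is mapped so that its horizontal position controls the stereo pan and its vertical position controls the pitch of a buzzing oscillator.

```typescript
// Illustrative sketch only: sonify a region contour with the Web Audio API.
// `contour` holds points in normalized image coordinates, x and y in [0, 1],
// with y = 0 at the top of the image (a hypothetical input format).
interface Point { x: number; y: number; }

function traceContour(contour: Point[], durationSec = 3): void {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  const pan = ctx.createStereoPanner();
  osc.type = "sawtooth";                       // rough, buzzing timbre
  osc.connect(pan).connect(ctx.destination);

  const start = ctx.currentTime;
  const step = durationSec / Math.max(contour.length - 1, 1);

  contour.forEach((p, i) => {
    const t = start + i * step;
    // Points near the left edge pan left, near the right edge pan right.
    pan.pan.linearRampToValueAtTime(p.x * 2 - 1, t);
    // Points higher in the image get a higher pitch (200 to 800 Hz here).
    osc.frequency.linearRampToValueAtTime(200 + (1 - p.y) * 600, t);
  });

  osc.start(start);
  osc.stop(start + durationSec);
}
```

Listing the contour points in clockwise order starting from the left is what produces the left-to-right, down-and-up sweep described above.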
Things and people are found by our object detection programs. Things encompass a wide range of objects, while people are the humans detected in the photo. These will generally be grouped together.
Here is an example of things and people:
Two different groups of things, as well as some people, were identified in this graphic, and you should have heard their relative locations in the scene as popping noises.
To run IMAGE:
Try the same steps above on a more complex photo. Try to get a feeling for where all the things are in the photograph.
IMAGE uses embedded maps from Google Maps to create point-of-interest style renderings. If you are familiar with the Shared Reality Lab's Autour project, this will feel similar, though not identical.
If the embedded Google Map has a latitude and a longitude, you will be able to hear the map with a Points-of-Interest rendering. Imagine that you're standing at the location given by the latitude/longitude and facing north. You'll hear a little jingle coming from a direction relative to due north, followed by the name of a place. The volume of the jingle indicates how close the place is: louder if it's closer, quieter if it's farther away.
Here are 5 points of interest centered around a popular tourist spot in Toronto, Ontario. Two of them are to the right, one is behind, and the other two are to the left.
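For readers who want to see the idea in code, here is a hedged TypeScript sketch, not the extension's actual implementation, of how a single point of interest could be placed in the stereo field: the bearing from the map's latitude/longitude to the place, measured clockwise from north, sets the pan, and the distance sets the volume.

```typescript
// Illustrative sketch only: position one point of interest in stereo.
// The listener is assumed to stand at the map's lat/lon, facing due north.
interface LatLon { lat: number; lon: number; }

const toRad = (deg: number) => (deg * Math.PI) / 180;

// Initial bearing from `from` to `to`, in degrees clockwise from north.
function bearing(from: LatLon, to: LatLon): number {
  const phi1 = toRad(from.lat), phi2 = toRad(to.lat);
  const dLon = toRad(to.lon - from.lon);
  const y = Math.sin(dLon) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  return (Math.atan2(y, x) * 180) / Math.PI;   // -180..180, 0 = due north
}

// Great-circle distance in metres (haversine formula).
function distanceMetres(from: LatLon, to: LatLon): number {
  const R = 6371e3;
  const dLat = toRad(to.lat - from.lat), dLon = toRad(to.lon - from.lon);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(from.lat)) * Math.cos(toRad(to.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

function playJingle(listener: LatLon, poi: LatLon, ctx: AudioContext): void {
  const theta = bearing(listener, poi);        // 0 = straight ahead, 90 = to the right
  const panner = ctx.createStereoPanner();
  const gain = ctx.createGain();
  panner.pan.value = Math.sin(toRad(theta));   // right of north pans right
  // Closer places are louder; here the volume halves every 500 m (arbitrary).
  gain.gain.value = Math.pow(0.5, distanceMetres(listener, poi) / 500);

  const osc = ctx.createOscillator();          // stand-in for the jingle sound
  osc.connect(gain).connect(panner).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.3);
}
```

A simple stereo panner cannot distinguish a sound behind you from one in front; spatialization of this kind would typically use HRTF-based rendering (for example, a Web Audio PannerNode) so that all directions around your head are audible.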
Here's how you can get the Points-of-Interest experience for an embedded Google map.
You will hear the points of interest going around your head as if you were standing facing north on the map.
IMAGE can interpret graphs and charts made using Highcharts.
Currently, IMAGE can turn line graphs with a single variable into spatialized audio. You will hear the title of the chart, followed by the website the chart is on and the variable being measured. You will then hear a sound sweep from left to right as it moves from the start of the x-axis to the end. The sound goes up in pitch as the value it represents goes up, and down in pitch as the value goes down.
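As a rough illustration of that mapping (assuming the data points have already been extracted from the chart), a TypeScript sketch using the Web Audio API could look like the following; it is not the extension's actual code.

```typescript
// Illustrative sketch only: sonify a single-series line chart.
// `values` are the series' y-values in x order, e.g. [3, 5, 4, 9, 12].
function sonifySeries(values: number[], durationSec = 4): void {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  const pan = ctx.createStereoPanner();
  osc.connect(pan).connect(ctx.destination);

  const min = Math.min(...values);
  const max = Math.max(...values);
  const span = Math.max(values.length - 1, 1);
  const start = ctx.currentTime;

  values.forEach((v, i) => {
    const t = start + (i / span) * durationSec;
    const norm = max === min ? 0.5 : (v - min) / (max - min);
    // Sweep from the left channel to the right along the x-axis...
    pan.pan.linearRampToValueAtTime((i / span) * 2 - 1, t);
    // ...while larger values raise the pitch (220 to 880 Hz here).
    osc.frequency.linearRampToValueAtTime(220 + norm * 660, t);
  });

  osc.start(start);
  osc.stop(start + durationSec);
}
```

The frequency range and duration are arbitrary choices for the sketch; the point is simply that time stands in for the x-axis and pitch stands in for the value.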
Here is an example of what you will hear.
Try navigating to a Highcharts embedded chart. We have had pretty good luck with the ones on Etherscan.
Notice the pitch change as the value goes up and down.