Blind accessibility: addons, the autonomy they give, and developer requests to further them

Hi all

I have had conversations with two great addon developers, the authors of WOWAccess and Keyboard-UI (KUI).
These addons bring great autonomy to many blind people wanting to play WoW.
In those conversations they spoke of many things that could be done from Blizzard's end so these addons could develop further and give even more autonomy.
Personally, I have no idea about addon development, so I asked the author of WOWAccess to compile a list of what would help further these addons' development.

This is an overview of accessibility for the blind and visually impaired within World of Warcraft Retail, written by one of the co-authors of WOWAccess and Keyboard-UI, two addOns focused on making WOW accessible in all aspects for blind and VI players. I will cover as much as I can. Given that disabilities are unique from person to person, and I am only one disabled person, my perspective is only one of tens of millions. However, it should be a good start. A lot of what I am going to describe here is not implemented in our addOns yet, but for everything in the list below, the APIs do exist. It's just a matter of time and effort to write the scripts.

Currently, the things that can be made accessible via addOn implementation are the following:

  1. TTS narration of the user interface: inventory, gossip menus, the game menu, the spell book, talents, vendors, quest titles, descriptions, and objectives, party information, and so on. Essentially, any API that generates text can be used to narrate information to the player, which is key. (A minimal narration sketch follows this list.)
  2. Manipulation of the UI via the keyboard: opening and closing the spellbook, selecting a spell, placing it on the action bars, opening the game menu, navigating to game options and changing their values, buying and selling at vendors. All of this can be done with the keyboard. The idea is to limit mouse usage within the UI and game world.
  3. Navigation: Currently, the player can gain access to their 2D mixin coordinates, both global and map-specific, which lets them form a sense of their position within the game world with the aid of an addOn's interpretation. They can also access their heading (north, south, east, west, etc.), and their zone and area information, so they know when they have entered a new location. It is possible to use the waypoint system to place beacons on the map that the player can follow with the aid of either spoken directions from the TTS or audio indicators. However, this is only truly beneficial for pre-constructed waypoints, such as those for quests; custom waypoints have no pathfinding, so they are perhaps not as useful as the in-game systems, which will guide you pretty well through cities and the like. We have not implemented this in our addOn just yet, so it is unknown how accurate and helpful this system will be, but I have high hopes it will be key. (See the navigation sketch after this list.)
  4. Sound effect feedback: The WOW client ships with a number of very helpful sound effects by default that cue the user in as to which menus are open, and even within the game world the audio quality is fantastic. The API also allows us to play additional sounds from our addOns for further customization and alerts. For example, our addOn plays sounds that correlate to the player character's health, as well as the health of their teammates, with each one having its own unique sound and each sound having multiple variants to indicate the change in information. This is very useful when combined with event frames. (The wall-detection and health-sound sketches further below build on this.)
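
To make point 1 concrete, here is a minimal narration sketch, assuming the retail C_VoiceChat TTS API. The voice ID, rate, and volume values are placeholders; a real addOn would expose these as user settings.

```lua
-- Minimal TTS narration sketch, assuming the retail C_VoiceChat TTS API.
-- Voice ID 0, rate 0, and volume 100 are placeholder values; a real addOn
-- would let the user pick a voice via C_VoiceChat.GetTtsVoices().
local function Speak(text)
    C_VoiceChat.SpeakText(
        0,                                      -- voiceID (assumed default voice)
        text,                                   -- the string to narrate
        Enum.VoiceTtsDestination.LocalPlayback, -- play locally, not into voice chat
        0,                                      -- rate: -10 (slowest) to 10 (fastest)
        100                                     -- volume: 0 to 100
    )
end

-- Narrate the zone name whenever the player enters a new zone.
local narrator = CreateFrame("Frame")
narrator:RegisterEvent("ZONE_CHANGED_NEW_AREA")
narrator:SetScript("OnEvent", function()
    Speak("Entering " .. GetZoneText())
end)
```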

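Likewise, here is a minimal navigation sketch for point 3, assuming the retail C_Map, C_SuperTrack, and GetPlayerFacing calls. The waypoint coordinates are illustrative only, and none of this is our exact implementation.

```lua
-- Navigation sketch: read the 2D map coordinates and heading, then place and
-- super-track a user waypoint. These calls work in the open world but are
-- restricted inside instanced content.
local function GetPlayerMapXY()
    local uiMapID = C_Map.GetBestMapForUnit("player")
    if not uiMapID then return nil end
    local pos = C_Map.GetPlayerMapPosition(uiMapID, "player")
    if not pos then return nil end
    return pos:GetXY() -- map-relative x, y in the 0..1 range
end

local facing = GetPlayerFacing() -- heading in radians (0 = north), or nil when restricted

-- Drop a waypoint at the centre of the current map (0.5, 0.5 is just an example
-- target) and super-track it, so the addOn can later read C_Navigation.GetDistance()
-- and announce the remaining distance over TTS.
local uiMapID = C_Map.GetBestMapForUnit("player")
if uiMapID and C_Map.CanSetUserWaypointOnMap(uiMapID) then
    C_Map.SetUserWaypoint(UiMapPoint.CreateFromCoordinates(uiMapID, 0.5, 0.5))
    C_SuperTrack.SetSuperTrackedUserWaypoint(true)
end
```
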
Limitations of the API

The above list is quite extensive, and there is probably plenty more I could add on top of it; I'm just trying not to write a book here. Plenty can be done just using what is already available within the API. It's just a matter of putting in the effort and time. This is also from the point of view of leaving accessibility up to addOn developers. That being said, there are some real barriers that prevent us from fully enabling your blind players. For example, not having access to the player's z coordinate prevents them from knowing if they are climbing stairs or running into a wall.
I mentioned the 2D mixin above; it is also how the addOn tells whether the player is in motion or not. This matters because the walking animation runs, along with the footstep sounds, even when there is an obstacle in the way preventing them from moving. Because of this, the addOn has to monitor the player's current coordinates and compare them to the next set to determine whether motion has actually occurred. There is about a half-second delay between each iteration to allow for change to occur. If there is no reasonable change within that time period, the addOn alerts the player that they are running into a wall by playing a sound. However, if the player goes up stairs or climbs a ladder, there is nothing to tell the addOn that they are progressing, because we only have access to two of the three navigational values, so it generates a false positive. The z value is also just very useful information to have for spatial reasoning. A potential solution would be to grant access to the z-axis information again. If there is some other reason why this information is not accessible, then an API that can detect when ascent or descent has occurred would be acceptable as well.
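
For reference, the wall-detection loop described above boils down to something like the sketch below. The half-second interval comes from the description; the movement threshold, the sound file path, and the use of IsPlayerMoving() as a stand-in for "the walking animation is playing" are assumptions, not our exact values.

```lua
-- Wall-detection sketch: sample the 2D position every half second and compare
-- it to the previous sample. If the player appears to be moving (approximated
-- here with IsPlayerMoving()) but the coordinates barely change, play an alert.
local WALL_SOUND = "Interface\\AddOns\\MyAddOn\\Sounds\\wall_bump.ogg" -- placeholder file
local THRESHOLD  = 0.0001 -- minimum coordinate delta treated as real movement (assumption)

local lastX, lastY

C_Timer.NewTicker(0.5, function()
    local uiMapID = C_Map.GetBestMapForUnit("player")
    local pos = uiMapID and C_Map.GetPlayerMapPosition(uiMapID, "player")
    if not pos then return end -- no position data (e.g. inside an instance)

    local x, y = pos:GetXY()
    if lastX and IsPlayerMoving() then
        local dx, dy = x - lastX, y - lastY
        if (dx * dx + dy * dy) < (THRESHOLD * THRESHOLD) then
            -- Trying to move, but the position is not changing: likely a wall.
            -- Stairs and ladders cause false positives because we lack the z axis.
            PlaySoundFile(WALL_SOUND, "Master")
        end
    end
    lastX, lastY = x, y
end)
```
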
Another area that forms a barrier is the removal of the map UI in instanced content like dungeons and raids. Because the map UI is removed, the addOn can no longer supply the player with vital navigation information, such as the direction they are facing, or even whether they are running into a wall. Essentially, everything I said above about navigational accessibility goes out the window. A solution would be to allow these API calls to keep working, or to develop a new API specifically for this purpose that can be used in and out of dungeons. Alternatively, if the worry is bot-related, the API could be restricted to use while not in combat, possibly with reduced precision. Fortunately, because of how dungeons work, the player can follow other party members, which is great. However, that leads us to the next barrier.
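
For clarity, this is roughly what the restriction looks like from the addOn's side; as I understand it, the same position call that works outdoors simply returns nil inside instances, so the best the addOn can do is tell the player that.

```lua
-- Inside dungeons, raids, and other instanced content the map position call
-- returns nil, so all of the navigation feedback above silently stops working.
local inInstance, instanceType = IsInInstance()
local uiMapID = C_Map.GetBestMapForUnit("player")
local pos = uiMapID and C_Map.GetPlayerMapPosition(uiMapID, "player")
if inInstance and not pos then
    -- Coordinates (and, as I understand the restriction, facing) are unavailable
    -- here; all the addOn can do is announce the fact.
    C_VoiceChat.SpeakText(0, "Position information is not available in this " .. instanceType,
        Enum.VoiceTtsDestination.LocalPlayback, 0, 100)
end
```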

Because of environmental obstacles, following targets is difficult: as I explained earlier, a blind user cannot tell whether they are actually in motion or running into a wall. If the pathfinding of auto-follow could be made more intelligent and avoid those obstacles, it would resolve a lot of frustration on our part. I imagine this is something sighted players also find frustrating.

One additional feature I would love to see, which would improve independent navigation for blind and VI users, is a way to check what the terrain is like around the player character. Radar systems in other audio games for the blind usually do this by passing coordinates to functions that check them against the map data and conveying the relevant information to the player with audio cues. Usually this information relates to whether or not the space ahead of them is free of hazards. Being able to generate this information for a given set of coordinates would be incredibly useful for navigation. If checking arbitrary coordinates is too much power, then being able to get this data relative to the player would be acceptable, e.g. getting data on what is 5 yards in front of, behind, and to either side of them. Useful data would include the terrain type, whether or not it is clear to move through, and whether it is a hazard or a cliff. (A purely hypothetical sketch of what such an API could look like follows.)
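
To make the request concrete, the shape of such an API might look like the sketch below. Nothing in it exists today; C_TerrainInfo and all of its fields are purely hypothetical and only show the kind of query radar-style audio cues would need. The sound file paths are placeholders too.

```lua
-- HYPOTHETICAL API SKETCH: C_TerrainInfo and every field below are invented to
-- illustrate the request; nothing in this block exists in the current client.
local BLOCKED_SOUND = "Interface\\AddOns\\MyAddOn\\Sounds\\blocked.ogg" -- placeholder file
local CLIFF_SOUND   = "Interface\\AddOns\\MyAddOn\\Sounds\\cliff.ogg"   -- placeholder file

local function ProbeAhead(distanceYards)
    -- Imagined call: sample the terrain a given distance in front of the player.
    local info = C_TerrainInfo.GetTerrainRelativeToPlayer(distanceYards, 0)
    if not info then return end
    if not info.isWalkable then
        PlaySoundFile(BLOCKED_SOUND, "Master") -- something solid in the way
    elseif info.isDropOff then
        PlaySoundFile(CLIFF_SOUND, "Master")   -- drop-off or cliff edge ahead
    end
    -- info.terrainType ("ground", "water", "lava", ...) could drive further cues.
end
```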

The next area I would like to discuss is environmental objects, specifically environmental objects required for quests, for example a quest where you have to go out and collect apples off of a tree. They are somewhere in the zone, and you have to go find them, collect them, and bring them back to the NPC. This is an area I have zero solutions for. Perhaps there is some API that can tell the player exactly where to go, but in my research I have found none. There is absolutely no reliable way for the player to review what objects are in the immediate vicinity of the character without meticulously checking every pixel on the screen with the mouse, hoping the object has a tooltip associated with it, and having the TTS read off the tooltip. Given the size of the world, it is easy to understand how impractical this approach is. Because of this, the user cannot do these kinds of quests independently. A solution could be a way to review this kind of information via the API, or an audio indication of where the item is, or even one that plays when they get close. A way to target it without the use of the mouse would be ideal as well. For example, the player sets out to find the apples. When they get close to one, they hear a chime sound played in the game world at the location of the item in question, similar to the glint I am told sighted players see. The player navigates towards it and can then either pick it up automatically, or perhaps use a slash command to pick it up or activate it once they are within interaction range. (A hypothetical sketch of that slash command follows below.)
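
Purely to illustrate the slash-command half of that idea: the registration below is ordinary addOn API, but the command name is an invention and the interaction call it would need does not exist; that missing call is exactly what is being requested.

```lua
-- The slash-command registration below is standard addOn API; the function it
-- would call is HYPOTHETICAL and stands in for the capability being requested.
SLASH_QUESTINTERACT1 = "/qinteract"
SlashCmdList["QUESTINTERACT"] = function()
    -- Imagined call: interact with the closest quest object within range, e.g.
    -- the apple on the tree, without having to find it with the mouse.
    -- C_QuestObjects.InteractWithNearest() -- does not exist today
    C_VoiceChat.SpeakText(0, "Interacting with the nearest quest object",
        Enum.VoiceTtsDestination.LocalPlayback, 0, 100)
end
```
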
The character creation process is one pretty obvious area that addOn developers can't bridge for accessibility. It is part of the glue screens, where addOns cannot run, so it is something that needs to be addressed directly by the WOW client to lower the barrier to entry.
The last area I wanted to address here is the sound-playing API. It is very restrictive in how much information can be conveyed to the user. For example, the health monitoring I described earlier uses sounds to convey the information by panning the unique sound for each party member from left to right depending on their health percentage. However, the panning is done manually, requiring me to create five alternate versions of each sound. Each audio file is the same, save for being moved a little along the stereo field. The sound that indicates your own health does not pan along the stereo field, but instead changes in pitch as your health shifts from one stage to the next. Again, this requires a single sound with multiple alternate versions, all manually created and packaged with the addOn. If some of the functions that control sounds and their properties (panning, pitch) could be exposed as API calls, it would open the door to far more creative solutions for accessibility. (A rough sketch of the current workaround follows.)
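
For context, the manual workaround described above looks roughly like this sketch: bucket the unit's health percentage and play the matching pre-rendered file. The file names are placeholders, and only one shared sound set is shown, whereas the real addOn uses a unique sound per party member.

```lua
-- Health feedback sketch: there is no API to pan or pitch-shift a sound at
-- runtime, so the addOn ships pre-rendered variants and picks one per bucket.
-- File names are placeholders for the five manually panned copies of one sound.
local HEALTH_VARIANTS = {
    "Interface\\AddOns\\MyAddOn\\Sounds\\party_health_1.ogg", --   0-20%
    "Interface\\AddOns\\MyAddOn\\Sounds\\party_health_2.ogg", --  21-40%
    "Interface\\AddOns\\MyAddOn\\Sounds\\party_health_3.ogg", --  41-60%
    "Interface\\AddOns\\MyAddOn\\Sounds\\party_health_4.ogg", --  61-80%
    "Interface\\AddOns\\MyAddOn\\Sounds\\party_health_5.ogg", --  81-100%
}

local healthWatcher = CreateFrame("Frame")
healthWatcher:RegisterEvent("UNIT_HEALTH")
healthWatcher:SetScript("OnEvent", function(_, _, unit)
    if not UnitInParty(unit) then return end
    local maxHealth = UnitHealthMax(unit)
    if maxHealth == 0 then return end
    local percent = UnitHealth(unit) / maxHealth
    -- Map 0..1 health onto one of the five pre-panned variants.
    local bucket = math.min(5, math.floor(percent * 5) + 1)
    PlaySoundFile(HEALTH_VARIANTS[bucket], "Master")
end)
```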

Thank you for reading, and for your consideration. I only began playing WOW myself around October of last year, but it has been something I've wanted to play ever since it began. The past few months have been a great experience for me, and I'm grateful for what is already available. My hope is that, truly, no one gets left behind. With a real commitment to accessibility, not just for blind users specifically, that will ring true for the future.
