Musings on ARIA role="application"

The W3C Accessible Rich Internet Applications 1.0 specification (ARIA) is currently in "Last Call". The purpose of ARIA is to provide a means for web authors to explicitly identify roles, states, and properties of interactive user interface components on web pages, thereby making them accessible to users of assistive technologies (AT).

ARIA introduces the role attribute, and defines several dozen roles that can be assigned to this attribute. Seven of these are so-called landmark roles, which identify specific sections that commonly appear on web pages: banner, navigation, search, main, complementary (for sidebars), and contentinfo (for footers). Those of you who were counting may be wondering why I only listed six landmark roles.
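To make the six concrete, here's a hedged sketch of how they might be applied to a typical page layout (the element contents are hypothetical, not from any particular site):

```html
<!-- Illustrative page skeleton showing the six landmark roles -->
<div role="banner">Site logo and masthead</div>
<div role="navigation">Main menu links</div>
<div role="search">Site search form</div>
<div role="main">Primary page content</div>
<div role="complementary">Sidebar content</div>
<div role="contentinfo">Footer: copyright and contact links</div>
```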

That's because the seventh, "application", is fundamentally different. The other six landmarks are useful as targets for AT, such as the popular screen reader JAWS, which jumps directly to the next landmark when users press the semi-colon key. Historically web authors have had to add this functionality themselves using "Skip navigation" and similar links, which jump to defined targets on the same page.

The "application" role is supported in the same way as other landmark roles (for navigation), but it has a much greater role than the others in determining how AT will behave. According to the "application" section of the ARIA spec:

The intent is to hint to the assistive technology to switch its normal browsing mode functionality to one which they would use for an application. User agents have a browse navigation mode where keys, such as up and down arrows, are used to browse the document. This native behavior prevents the use of these keys by a web application.

In other words, when screen readers and other assistive technologies encounter role="application", they're supposed to relinquish control of users' keystrokes to the web application.
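As a sketch of what this hand-off enables, consider a results list inside a region marked role="application". In browse mode the screen reader would intercept the arrow keys; in application mode they reach the page's script, which must now supply all of the navigation behavior itself. The ids and handler name below are hypothetical, not taken from any real application:

```html
<!-- Illustrative sketch only: ids and handler names are hypothetical -->
<div role="application">
  <ul id="results" tabindex="0" onkeydown="return handleResultsKey(event);">
    <li>First search result</li>
    <li>Second search result</li>
  </ul>
</div>
<script type="text/javascript">
  function handleResultsKey(e) {
    e = e || window.event;
    // In browse mode the screen reader would consume these keys; in
    // application mode they arrive here, and the script is responsible
    // for providing equivalent navigation itself.
    if (e.keyCode === 40) { /* down arrow: move selection down */ return false; }
    if (e.keyCode === 38) { /* up arrow: move selection up */ return false; }
    return true; // let all other keys through
  }
</script>
```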

This raises some concerns for me:

  1. How will users handle their AT not working as they expect it to?
  2. Will web application developers do a satisfactory job of providing keyboard and non-visual access to their applications, so that the usual AT keystrokes truly aren't necessary?
  3. Will web applications be built using standard keyboard interfaces, so users don't have to memorize a different set of keystrokes for every online application they encounter?

If web authors can't answer Yes to questions 2 and 3, I question whether it's safe for AT to relinquish control.

An Example Web Application

An excellent example web application is QueueMusic (thanks, Jayme Johnson!). This application allows you to "Find music, put it in your queue, and then let it stream." The body element looks like this:

<body role="application">

In this application, the web page is divided into three main sections, each corresponding with one of the three steps in the process. Each section has a heading atop it (marked up with <h3>): Song Search, Your Queue, and Now Playing.

screen shot of the Queue Music interface

In the Song Search section, there is a search field. You enter a search phrase into that field, press Enter, and a list of results appears. If you're a mouse user you can now click any item in the results list and that's added to Your Queue (the middle section of the page). Then, if you click any item in Your Queue it plays in the media player that occupies the Now Playing section.

Here's what happens if you're a screen reader user (specifically, if you're using JAWS 11). The behavior is more-or-less the same whether you're using Firefox 3.5 or Internet Explorer 8:

You launch the page, and immediately JAWS announces "Application Mode On". On most web pages, you would now press "H" to jump through the headings, because this gives you a sense of how the page is organized and structured. You try that on this page, and although the page does in fact have headings that would be useful for navigation, JAWS doesn't respond because you're in Application Mode. So you don't know there are headings, and even if you did, you couldn't jump to them.

Something's amiss, though, because when a page has no headings JAWS usually tells you so, whereas here JAWS isn't doing or saying anything. Curious, you try Insert+F6, which normally brings up a list of headings. Now JAWS says "This feature is only available from within a virtual document, such as a page on the Internet." You curse JAWS, because you know with some certainty that you are in fact visiting a page on the Internet.

You also try Insert + F7, which normally would bring up a list of links from the page. JAWS, however, repeats what it told you a moment ago, and you curse JAWS a second time, this time louder and more profane than the first.

You press "f" to jump to the first form field, but nothing happens. (Even after you've found the form field by other means, this will continue to be a frustration for you because you can't easily get back to the form field to search again).

You eventually discover that you can tab to the search field, and you type in your text and press Enter. This returns a list of results, and you're able to navigate down through that list with the down arrow. However, when you find the item you want to add to your queue, pressing Enter doesn't queue it. If you could see the screen, you might guess that the right arrow adds the item to your queue, since your queue is positioned to the right of the results list. Since you're navigating audibly, this spatial relationship is meaningless to you.

Once an item is in your queue, you might think you send that item to the media player using right arrow, since that worked in the previous step and the same spatial logic still applies. Not so though. The Enter key may have failed you in the previous step, but it actually works in this step. If you had read the application Help in the beginning, you would have discovered and memorized the sixteen keystrokes that operate this application, but you didn't do that because the Help link is at the very bottom of the page, and it didn't show up when you pressed Insert+F7 to see a links list. Plus, that's a lot of keystrokes to remember. You use many different web applications each day, and it's unrealistic for you to have to memorize a different set of keystrokes for each one.
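One way an application could soften this problem (a sketch, not what QueueMusic actually does) is to attach its keystroke documentation directly to the widget with aria-describedby, so that AT can announce the custom keystrokes when the widget receives focus. The ids and help text here are invented for illustration:

```html
<!-- Hypothetical sketch: announce custom keystrokes on focus -->
<p id="results-help">
  Use the up and down arrows to move through results.
  Press the right arrow to add the selected song to your queue.
</p>
<ul id="results" tabindex="0" aria-describedby="results-help">
  <li>First search result</li>
  <li>Second search result</li>
</ul>
```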

Eventually you may also discover that you can toggle Application Mode off: when you're in Application Mode in JAWS, Insert + V cycles between Application Mode and Virtual Mode. At present, Application Mode isn't documented at all within JAWS Help.

What Users Are Saying

It may be too soon to know how users are reacting, or will be reacting, to web applications as defined by ARIA. This is all new enough that it isn't yet in widespread use. I encourage those who are actively working to define the ARIA spec to do usability testing with average AT users. I did find a couple of user comments on-line:

First, there's this post from the JAWS-Users list:

Hi Listers, I'm using Jaws 10 and Vista. Tonight everytime I try to go to Yahoo mail, Jaws says application mode on. I don't have any idea what that means, but I can't navigate the page at all with the PC/virtual cursor. I can read with the Jaws cursor, but when I route PC to Jaws, reading is very strange with Jaws not recognizing links or responding to any commands. I would like to understand what application mode is and how to not have it come on without being asked for.

Similarly, there's this two-part quote from a single thread on the WebAIM list:

I dont know what you've done but it messes up JAWS real bad. I checked the 2 examples and somewhere down the page JAWS goes into a mode I've never heard before, applications mode. When it is in this mode I cant navigate anywhere and have to close the page. I'm using JAWS 10.0...
...I have never heard of applications mode, I found it frustrating and if I run into it I'll just close down the page and leave, how many others will do the same?


Despite my being critical in this blog post, I don't necessarily think role="application" is a bad idea. However, all stakeholders need to carefully consider what to do with it. AT makers need to carefully consider the impact on their users, and implement support in a way that makes web applications easier for their users to use, not harder (take note, Freedom Scientific: telling users a feature only works on the Internet when they're already on the Internet is not a good implementation). And web authors need to carefully consider whether role="application" is truly necessary for their site, and if so, they need to be extremely thorough in developing an interface that works well for AT users whose typical interface no longer works. This obviously will require extraordinary attention to detail, and most importantly, plenty of usability testing that includes AT users.

I haven't seen any great resources yet on how to develop web applications that meet these conditions. A section of the WAI-ARIA Best Practices provides some useful guidance, but we need more and better tutorials, or I'm afraid we'll just end up with a bunch of bad apps. If folks know of more and better resources, let me know and I'll add them here.

6 comments on “Musings on ARIA role="application"”

  1. Honestly, I'm a total newbie to accessibility (actually I'm reading "Web Accessibility: Web Standards and Regulatory Compliance" for the first time), so take my opinion with a pinch of salt.

    I've read a few blog posts about this mode and I thought what you're showing now: most people would be really confused about it if special care wasn't taken by the AT.

    I think the user should be given more information and prompted whether to enter the applications mode or not. Something like: "Application Mode: This website wants to enter application mode, press [key] to activate it. While in this mode your [screen-reader name] keyboard shortcuts will stop working and be replaced by the web application's shortcuts. You can toggle Application Mode with the [key] key."

    Then some key could be pressed to stop repeating this message each time a web application wants to enter this mode.

    I think this mode has great potential, but developers must take a lot more care so as to not confuse users. In my opinion, the role="application" on the body is a really bad idea. Another option could be explaining what's going to happen (at least until AT handle this properly) with some text just before entering Application Mode.

    Anyway, I better get back to reading the book and learning more about this topic before saying stupid things like I'm doing now 😛

  2. Application mode is still a relatively new concept, and can understandably be confusing to people who are not screen reader power users and have not been following the ARIA developments.

    When JAWS says "application mode", this informs you that you can expect an interface with similar interaction to a desktop application, rather than a traditional web page.

    The mode is definitely an improvement for complex, dynamic web application interfaces that simply would not work when navigated in virtual mode. Virtual mode would not be sufficient here because (amongst other things) the interaction on such interfaces depends largely on keyboard interaction (that would normally be prohibited by the screen reader).

    The alternative to using application mode would be expecting the user to realize that virtual mode needs to be turned off manually for successful navigation (which in JAWS can be done using the JAWSKEY + Z shortcut). In my opinion this is a more confusing solution, because then the web application makes the user believe that this interface is interacted with like a regular web page.

    So basically, this means the following things:
    1. In application mode, the user can expect the same screen reader support as in a desktop application, which means no link lists or webpage specific screen reader shortcuts.
    2. As in desktop applications, it is the developer's responsibility to ensure that the interface is fully keyboard and screen reader accessible. The application role should only be used when the web application has been built with accessibility in mind and tested for it.
    3. Additionally, it is also the web application developer's task to provide documentation about how the interface should be used (such as that the right arrow key needs to be used to add an item to the queue). Similarly, ARIA can be used to add descriptive hints about how a certain component works.
    4. Standardized keyboard interfaces are definitely being worked on, for example in

    Having said all that, I can understand that this mode can be confusing and annoying for users, especially if the role is used incorrectly and simply slapped on as a quick fix. Personally, I think screen readers should allow users to choose whether application roles actually trigger a change in screen reader mode. For example, one option could be that the screen reader simply announces "application mode", and then asks the user whether to turn off virtual mode.

    Finally, for people that find this mode annoying and want to use their trusted screen reader shortcuts, you can always switch back into virtual mode using the JAWSKEY + Z shortcut.

  3. I've been in and out of the development of ARIA (over the last nine or ten years). But my understanding of role="application" isn't at all what JAWS is doing.

    I had understood it as a notification that an AT could use *IF* it recognised the application, to provide an enhanced mode of interaction.

    Assuming that the application should simply be given control seems completely wrong. And handing over keyboard control is a *huge* mistake.

    One of the reasons I need to get back into ARIA this week is to try and fight the mess that is created by applications' current approach to keyboard control, of using Javascript to trap keyboard events. That interferes seriously with all kinds of UIs (not least, screen readers and similar assistive technology, which offer an extensive set of commands and need a way for the user to activate those commands), and simply fails on all kinds of platforms where the keyboard isn't what the author expected - from touchscreen devices and phones to TVs using more-or-less standard remote controls.

    There are ways to provide keyboard control of applications. But I think the current dominant paradigms are simply wrong, and need a genuine overhaul 🙁 (Fortunately I am not the only person, nor even the only one from a browser maker, who thinks this way...)

  4. Gonzalo is correct, IMHO. One of the first problems we face is that users are not aware of what the "application mode" really is. So, the assistive technology needs to take a better care of providing the necessary messaging to the user.
    To that end, JAWS11 handles the role of application much better than JAWS10 did. JAWS10 used to invoke the "application mode" automatically, whereas JAWS11 allows the user to make that choice.
    To those users who use JAWS10, there is no other fix than to upgrade to JAWS11 or use NVDA, an open source screen reader, at

  5. A very interesting and timely article, Terrill.

    The very addition of a new "mode" is something that is of concern. I am seeing more and more potential for "complexity bloat" with new RIAs (whether they use ARIA or not) that will be a cause of confusion for many users of AT.

    IMO the languages/platforms that are used should mediate some of this complexity and not add to it. What your post shows is the existence of a "third mode" of user interaction. You may be interested in some user testing I did with a blind colleague of mine using an RIA (Google Reader). The video shows how some of this complexity could affect the user if these modal glitches are not ironed out (note: the video is not a critique of ARIA but of RIA complexity in general, and illustrates difficulties with introducing new "modes"). [1]

    The ironic thing is that screen reader users are so used to virtual cursor enabled browsing that the move towards desktop applications in the web space may not be as welcome as first thought - as it breaks the current (and established) paradigm of user interaction.


  6. As someone working on ARIA I think role=application is very important, however you do make some important points, which I do think should be considered. My experience of people using AT with a browser delivered application which is not ARIA enabled, is that they do expect more of a desktop style interaction, however this has been with many core business system applications when the user knows they aren't 'in the wild'. Perhaps when this style of interaction pops up in the wider internet it is more unexpected.

    My experience of colleagues who are screen reader users is that they don't go straight to navigation by headings on a new page, they tab around the screen to give themselves an understanding of how the page is organised. Doing this with an ARIA enabled application is great as tab orders should be logical and understandable. When they understand the page, they then navigate using the headings or links shortcuts. This of course also highlights that everyone uses their screen reader differently and we cannot assume one method of working.

    We must remember that the concept of ARIA is new to users and like anything that is new it will take time to get used to. Users of AT do need support and understanding, tutorials as you mention and some of this should come bundled with their AT so they know what to expect. Other materials indeed should come from those of us working on the specification.

    Joshue mentioned a paradigm shift, and that is exactly what this is, not just for users of AT, but for us all. The way we use the internet is changing fundamentally, and although those of us without AT will not benefit from ARIA per se, we most certainly will benefit from the keyboard shortcuts that developers should include for navigation. Hans has mentioned the Style Guide, which is a key piece of work to support this.

    We need to help people understand this change, but this change will improve the internet for users of AT.