2 Community-Driven Projects to Independently Improve the Accessibility of Inaccessible Web Sites

I know of two projects intended to improve the accessibility of inaccessible Web sites.  Two intriguing qualities they have in common are that:

  • they depend upon the community to report accessibility problems, to fix them, and to share their fixes with the rest of the community;
  • their intention is that the accessibility problems can be fixed even if the inaccessible designs of Web sites do not change!

The projects are:

AccessMonkey, developed by the Department of Computer Science and Engineering at the University of Washington, and funded by a National Science Foundation grant.

It works like this: the community reports accessibility problems, AccessMonkey developers create scripts that ameliorate those problems, and the community shares the scripts with others.  AccessMonkey is based upon GreaseMonkey, the widely used Firefox extension that lets people customize the way Web pages look and function.
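To make the mechanism concrete, here is a hypothetical sketch of the kind of repair such a user script might perform (my illustration, not an actual AccessMonkey script): filling in missing alternative text on images so screen readers announce something meaningful instead of a raw URL.

```javascript
// Hypothetical AccessMonkey-style repair: give images that lack alternative
// text a fallback alt derived from the image file name.
function addFallbackAltText(images) {
  for (const img of images) {
    if (!img.alt || img.alt.trim() === "") {
      // e.g. ".../golden-gate_bridge.jpg" -> "golden gate bridge"
      const name = img.src.split("/").pop().split(".")[0];
      img.alt = name.replace(/[-_]+/g, " ");
    }
  }
  return images;
}

// In a GreaseMonkey user script, this would run against the live page:
// addFallbackAltText(document.querySelectorAll("img"));
```

The point is that the fix lives in the visitor's browser, so it works even if the Web site itself never changes.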

Note: I am worried about the future of this project because I have not seen any recent activity on its Web site.

Social Accessibility Project, by IBM alphaWorks.

Focused on improving accessibility for users of screen readers (JAWS only at this time), it works in the following way.  Users report a Web site’s accessibility problems to the Social Accessibility server.  Volunteers respond by creating and publishing accessibility metadata.  The metadata are attached to the original Web page, so all users who visit the page benefit from them.  Users can also improve a page themselves by submitting landmarks to the server; these, too, are then made available to all screen reader users.
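The metadata mechanism can be sketched as follows (a hypothetical illustration of the idea only; IBM's actual metadata format is not documented here): volunteer-authored records pair a page location with a fix, and a client applies the records whenever the page is visited.

```javascript
// Hypothetical volunteer-authored records: each pairs a page location with an
// accessibility fix (alternative text or a navigation landmark).
const records = [
  { selector: "#logo", alt: "Clear Helper home" },
  { selector: "#menu", landmark: "navigation" }
];

// Apply the records to a page. `elements` maps selectors to element-like
// objects; a real client would resolve selectors with document.querySelector.
function applyAccessibilityMetadata(elements, records) {
  for (const rec of records) {
    const el = elements[rec.selector];
    if (!el) continue; // the page may have changed since the record was made
    if (rec.alt) el.alt = rec.alt;
    if (rec.landmark) el.role = rec.landmark; // exposed to the screen reader
  }
  return elements;
}
```

Because the records live on a server rather than in the page, one volunteer's fix benefits every subsequent visitor.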

Note: You can sign in as a guest if you want to just explore this active utility.

On the “Clear Helper” Web site, I may be able to incorporate the community-reporting aspect of these two projects easily and usefully.  Perhaps I could set up a site feature enabling people with cognitive disabilities to report problems they experience on popular Web sites.  Based upon that information, I could create tutorials on how to use particular features of those sites.

Juicy Studio Readability Test: Contradictory Results

For the current text of the “Clear Helper” Web site home page, I conducted the Juicy Studio Readability Test.  The results of the test’s automatic calculation were significantly different from those of my manual calculations.  I do not know how to account for this, and it makes me distrust the test.

Automatic Summary Results

I submitted the URL of the Clear Helper home page.  A summary of reading level results was produced.

Total sentences: 59
Total words: 370
Average words per sentence: 6.27
Words with 1 syllable: 211
Words with 2 syllables: 66
Words with 3 syllables: 47
Words with 4 or more syllables: 46
Percentage of words with three or more syllables: 25.14%
Average syllables per word: 1.81
Gunning Fog Index: 12.56
Flesch Reading Ease: 47.73
Flesch-Kincaid Grade: 8.16

Note the last three results.  They indicate that understanding the current text of the Clear Helper home page requires nearly 13 years of education (Gunning Fog) or an eighth-grade reading level (Flesch-Kincaid).  The Flesch Reading Ease score of 47.73 falls below the ideal range of 60 – 70.

These scores and their indications are worse than those I determined by following Juicy Studio’s instructions on how to calculate the scores manually.

Gunning-Fog Index: Manual Calculations

  • This test “… is a rough measure of how many years of schooling it would take someone to understand the content.”
    • To calculate this score, as instructed, I added the average number of words per sentence (6.27) to the number of words with three or more syllables (93) and multiplied the total by .04. Score = 4.
      • This indicates a fourth-grader would be able to understand the home page text.
        • Note: I think there is an error in Juicy Studio’s instructions.  It first says to use the percentage of 3-syllable words but, in the formula, indicates the number of 3-syllable words should be used.  Calculating the formula with the percentage produced a score of 1.  Because it seemed quite unreasonable that a first-grader could read the home page text, I instead calculated the formula using the number of 3-syllable words.

Flesch Reading Ease: Manual Calculations

  • For this test, “… the higher the score, the easier it is to understand the document.” The ideal score is 60 to 70.
    • To calculate this score, as instructed, I subtracted the product (153.126) of 84.6 and the average number of syllables per word (1.81) from the product (6.36405) of 1.015 and the average number of words per sentence (6.27). Result = -146.76195. I then subtracted this result from 206.835 to arrive at a score of 60.
      • The score of 60 falls within the ideal range.
        • Note: In calculating the score, I had to change the negative result (-146.76195) to a positive number.  It was the only way to produce a reasonable result.

Flesch-Kincaid Grade Level: Manual Calculations

  • This test “… is a rough measure of how many years of schooling it would take someone to understand the content.”
    • To calculate this score, as instructed, I multiplied the average number of words per sentence (6.27) by 0.39 and added the product (0.24453) to the product (21.358) of the average number of syllables per word (1.81) and 11.8. Result = 21.60253. I then subtracted 15.50 from the result. Score = 6.
      • This indicates the home page text requires a sixth-grade education to understand.

Conclusions & Speculation

The results from the manual calculations indicate that someone with a fourth- to sixth-grade education should understand the home page text, and that its readability falls within the ideal range.  This is significantly better than the results of the readability test’s automatic calculations.

I wonder if the different results are a consequence of following contradictory and confusing instructions for the manual calculations.  Perhaps the errors are mine.
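For reference, the standard published formulas can be applied directly to the counts in the automatic summary above (6.27 words per sentence, 25.14 percent words of three or more syllables, 1.81 syllables per word).  This sketch assumes Juicy Studio implements the standard formulas; note that Gunning Fog multiplies by 0.4 (not .04) and uses the percentage of complex words, and Flesch-Kincaid subtracts 15.59 (not 15.50).

```javascript
// The standard readability formulas, as published.
function gunningFog(wordsPerSentence, pctComplexWords) {
  return 0.4 * (wordsPerSentence + pctComplexWords);
}
function fleschReadingEase(wordsPerSentence, syllablesPerWord) {
  return 206.835 - 1.015 * wordsPerSentence - 84.6 * syllablesPerWord;
}
function fleschKincaidGrade(wordsPerSentence, syllablesPerWord) {
  return 0.39 * wordsPerSentence + 11.8 * syllablesPerWord - 15.59;
}

// Using the counts from the automatic summary:
gunningFog(6.27, 25.14);        // ≈ 12.56, matching the automatic result
fleschReadingEase(6.27, 1.81);  // ≈ 47.3, close to the automatic 47.73
fleschKincaidGrade(6.27, 1.81); // ≈ 8.21, close to the automatic 8.16
```

The close agreement (the small residual differences presumably come from rounding the averages in the summary) supports the possibility that the manual instructions, rather than the automatic test, are at fault.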

I will have to revisit this at a later date.

A Beginning for the Clear Helper Web Site

The “Clear Helper” Web site has its first page!  The home page has a decent look for a first attempt, and it experiments with a few accessibility features.  However, its design is definitely not the one I imagine for the site once it is available for use by people with cognitive disabilities.

HTML 5 & CSS 3

It took me eight hours to create the home page, much of it spent learning enough HTML 5 and CSS 3.  I’m using those technologies primarily because they should work best with WAI-ARIA, which defines a way for assistive-technology users to identify and navigate visually rich user interfaces.  Will I be using such interfaces or applets on the site?  I don’t know, but I want to be prepared for the possibility.
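As a small, hypothetical illustration of what WAI-ARIA provides (not code from the home page): a visually rich control built from a generic element is invisible to assistive technology until it is given a role, a state, and keyboard focusability.

```javascript
// Turn a generic element (e.g. a styled <div>) into a control that assistive
// technology can identify and operate: WAI-ARIA supplies the role and the
// state, and tabindex makes the element reachable from the keyboard.
function makeAccessibleToggle(el) {
  el.setAttribute("role", "button");        // announced as a button
  el.setAttribute("aria-pressed", "false"); // toggle state exposed
  el.setAttribute("tabindex", "0");         // keyboard focusable
  return el;
}
```

A screen reader can then announce the element as a toggle button and report whether it is pressed, instead of skipping it entirely.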

Voice Narration

One of the accessibility features of the home page is a way for visitors to have its text read to them without needing a screen reader.  I created this feature with two tools.

I used Cognable’s speech demo to create an MP3 narration of the home page’s text.  I did this by:

  1. entering text for the MP3 title;
  2. copying and pasting the home page text;
  3. selecting the “Chilled US/Canadian Male” voice font;
  4. proving to the Captcha tool that I am human; and
  5. pressing the “Create MP3” button.

I then downloaded the created MP3 file for use on the home page.  Easy!

I embedded NCAM’s accessible MP3 player into the top, right of the home page.  I followed the simple instructions provided for the ccMP3Player.

The voice narration sounds okay, but I suspect it would be significantly improved by a commercial voice font.  I did not set up closed captioning for the MP3 file because its entire text is right on the home page.

Immediate Next Steps

I will test the page using two tools:

  • WebAIM’s WAVE.  There are other Web accessibility evaluation tools, and I will use them.  Yet I am especially interested in WAVE because it will soon incorporate tests specifically for cognitive-disability accessibility attributes.  (More on this later.)
  • Juicy Studio’s Readability Test.  I plan to use this tool to analyze the home page text, then revise the text until it reaches a reading level likely to be understood by most people.  I hope this experimentation will train me to write explanatory text at an appropriate reading level.

I will be doing a lot more testing, experimentation and design revision, all of which will be the subject of future blog posts.

An Evangelist of Web Accessibility for People with Cognitive Disabilities

A project by Inclusive New Media Design in England is evangelizing Web accessibility for people with intellectual / cognitive disabilities, which it also refers to as “learning disabilities”.  It has been running workshops to train Web designers and developers, and to include people with cognitive disabilities as testers.

Its Web site has a section of tips on making Web sites work for people with cognitive disabilities, which includes links to examples of Web sites designed for the population, and information about assistive technology.

Its Web site has several nice features I plan to incorporate into the future “Clear Helper” Web site.

  • a design using HTML 5 and CSS 3;
  • a Flash-based, text-to-speech applet on every page;
  • CSS-based switching of page-background coloring (light / dark);
  • large font sizes;
  • breadcrumbs for site navigation;
  • contextually-relevant icons;
  • presentation of content in multiple formats (text, audio, video); and
  • embedded, closed-captioned videos.

I anticipate learning much from this great resource.

Reservations on the Usability of Automatic Captions

Lately in the mainstream press, there have been articles trumpeting that Google is adding automatic captions to YouTube videos.

Captioning For YouTube Is Not New

For at least a year now, Google has offered the public the capability of adding captions and subtitles to videos uploaded to YouTube, enabling human-generated transcriptions to serve as the captioning.  What’s new, at least for YouTube, is that Google now uses speech recognition technology to convert speech to text automatically; no transcription files have to be uploaded manually.

Automatic Captioning Is Not New

Unheralded by the mainstream press, The National Institute on Disability and Rehabilitation Research (NIDRR) last year funded a project that used IBM Research Labs technology to automate the closed captioning of video-based instruction.  It was intended to improve accessible distance learning for people with cognitive disabilities, the Deaf, and the hard of hearing.


This feature, developed in parallel by both companies, appears to be a boon for the affected populations.  Unfortunately, speech recognition technology still has far to go before it produces language understandable by most people, especially people with cognitive disabilities.

The YouTube automatic captioning is based upon Google Voice technology.  I am a Google Voice user.  Its conversion of voice-mail messages to transcripts is so bad that I keep using it only because it consistently gives me a good laugh.  Its speech recognition is quite poor.

In fairness, Google “… promises that the technology will improve over time” and IBM advertises an “… over 90 percent accuracy”.  But even 90 percent word accuracy means roughly one misrecognized word in every ten; errors that frequent make the resulting transcriptions difficult to comprehend.

The other feature Google announced is that it is giving people the option of using its automatic translation system to read the captions in any of 51 languages.

It was almost twenty years ago that I first researched, and started following, the effort to computerize the translation of written text from one language to another.  Despite great strides since then, its Achilles’ heel has always been parsing context.  Here is a simple example.  If employees are said to be green, Americans understand that to mean they are inexperienced.  Computers have always had difficulty determining context, so a common automatic-translation mistake is to render that as “the employees are the color green”.

Users of the current Google Translator service, which translates Web site text from one language to another, often tell me it conveys the gist of the content, but that it has far to go to match human translation.


The promise of these automatic tools is that they will make it much easier to caption videos, thus promoting the widespread use of captioning.  Given the insufficiency of speech recognition, and the problem of context parsing, I predict that for years to come automatically added YouTube captions will require human revision to make them understandable.  Once people realize this, I expect use and adoption to be low.

The good news is that Google has a strong financial incentive to get this right.  Its empire relies upon the association of advertisements with textual content.  The more accurate Google can make automatic captioning and translation, the more it will be able to monetize other content, such as video and audio, via their captioned text.

3 Tools to Measure Usability of Navigation Icons

One possible way to evaluate the usability of navigation icons is to measure how much Web site visitors interact with them.  Common sense says that the higher the click rate, the more visitors are attending to the icons.  Of course, click rate alone would not show that visitors understand an icon’s intended purpose, but repeated use should mitigate that: the more often a visitor uses a navigation icon, the more opportunities there are to learn its purpose.

At the time of this writing, I know of three tools that could provide an accurate measure for the click rate of navigation icons.  Each is marketed as an evaluator of Web site usability.

They are CrazyEgg, ClickTale and Google Web Site Optimizer.  These tools work in essentially the same way.  On a test Web page, a “heat map” is set up that tracks where on the page visitors click, including clicks on embedded navigation icons.  A high percentage of clicks on a navigation icon, compared to clicks on other elements of the test page, would indicate that visitors are attending to it.
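The underlying measurement is easy to sketch (a hypothetical illustration, not the implementation of any of the three tools): record which element each click lands on, then compute each element’s share of all recorded clicks.

```javascript
// Given a log of clicks (the id of the element each click landed on), compute
// each element's share of the total -- the "click rate" a heat map visualizes.
// A real tool would populate the log from a click listener, e.g.
// document.addEventListener("click", e => log.push(e.target.id)).
function clickRates(clickLog) {
  const counts = {};
  for (const id of clickLog) counts[id] = (counts[id] || 0) + 1;
  const rates = {};
  for (const id in counts) rates[id] = counts[id] / clickLog.length;
  return rates;
}
```

Comparing the rates for the BACK and NEXT icons against the rates for other page elements would give the kind of evidence described above.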

I plan to use the Google Web Site Optimizer because it appears it will meet my purpose, it is free, and it is simple to set up.  Future blog posts will describe the setup process and the related test results.

A drawback to all of these tools, I believe, is that they use JavaScript in their implementation.  Some people with disabilities do not use JavaScript-enabled Web browsers, or they disable JavaScript in the browsers they do use.  For those visitors, click-rate measuring won’t work.

Note: No endorsement is intended or implied for the tools listed above.

Proposed Back & Next Navigation Icons For Future Clear Helper Web Site

I plan to use a consistent set of navigation icons for the future Clear Helper Web site.  My first attempt is represented by the “BACK” and the “NEXT” icons below.  Others in the set will follow.

The icons are not derived from research, which is beyond the scope of the project at this point.  However, they were created following WebAIM’s “Creating Accessible Images” guidelines.  In particular, they are not animated; they do not use color alone to convey meaning; they look acceptable even when enlarged to 500%; and they use good color contrast between text and background.

Of these guidelines, the last can be objectively measured with the following tools.  Each is based upon the latest Web Content Accessibility Guidelines (WCAG 2.0).

Accessibility Color Wheel, created by Giacomo Mazzocato.  This tool helps choose a pair of foreground and background colors that remains accessible to people with any of three color-blindness conditions.  It rated the color contrast of the icons below as “13:1 ok”.

Luminosity Colour Contrast Ratio Analyser from Juicy Studio.  This tool also rates color contrast.  It reported the ratio as “12.96:1 … very good … ” and passing at WCAG 2.0 Level AAA, which is the highest standard.
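Both tools are applying the WCAG 2.0 luminosity contrast formula, which can be computed directly from the foreground and background colors (a sketch of the published formula; the tools above may round differently):

```javascript
// WCAG 2.0 relative luminance of an sRGB color given as [r, g, b] in 0-255.
function relativeLuminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05), where L1 is
// the lighter color's luminance. WCAG 2.0 Level AAA requires at least 7:1
// for normal-size text.
function contrastRatio(fg, bg) {
  const l1 = Math.max(relativeLuminance(fg), relativeLuminance(bg));
  const l2 = Math.min(relativeLuminance(fg), relativeLuminance(bg));
  return (l1 + 0.05) / (l2 + 0.05);
}

contrastRatio([0, 0, 0], [255, 255, 255]); // black on white: 21:1, the maximum
```

The icons’ reported ratio of 12.96:1 comfortably clears the 7:1 threshold for Level AAA.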

I would appreciate any constructive feedback about these icons.  They are being shown at the size I plan to use on the Web site.



Accessible Video Players

On the future Clear Helper Web site, I plan to embed videos as an option for visitors to take tutorials.  At the time of this writing, I know of three accessible video players.  On the Clear Helper Web site, I may experiment with each to see which works the best.

  • ccPlayer, from the National Center for Accessible Media, is the one I have been using for a couple of years on another Web site.  I selected it for its accessibility features for screen reader and keyboard users, for its closed-captioning feature, and for its ability to play the site’s streaming-video files well.
  • Easy YouTube, developed by Christian Heilmann, may be an option if Clear Helper’s videos are hosted on YouTube.  It has big buttons and clear video-size options.  Visitors with cognitive disabilities may find those controls easy to use.
  • Section 508 Video Player was just released by Business.GOV, an official site of the U.S. Government.  Like Easy YouTube, the Section 508 Video Player is intended for use with YouTube videos.  One feature that may make it unique is that it “… plays a single video or a playlist, which is a group of videos within a single player.”

E-Mail Software for People with Cognitive Disabilities

CogLink is e-mail software designed for use by people with cognitive disabilities.  It comes with “automated” training and unlimited access to help-desk staff.  At the time of this writing, there is a one-time cost of $49; there are no continuing subscription fees.

CogLink was designed based in part upon a longitudinal study using participants with cognitive disabilities.  A separate Web site, Think and Link, details the related research.  It was conducted by The University of Oregon and was sponsored by The National Institute on Disability and Rehabilitation Research.

I look forward to reviewing the “automated” training, especially the video-based tutorials.  There is one about how to use a mouse, one on how to use a keyboard (not touch-typing) and one on how to use the CogLink e-mail software.  They were developed using instructional techniques of task analysis, errorless learning, chaining and practice repetition.  Perhaps I will be able to use these videos as models or as reference points when developing the video tutorials for the future Clear Helper Web site.

I have just purchased CogLink for evaluation.  A subsequent, related posting will follow.  No endorsement is intended or implied for CogLink.

Note: The Email Standards Project may be of interest to readers.  It “… works with email client developers and the design community to improve web standards support and accessibility in email.”  At the time of this writing, its Web site contains reviews of over a dozen e-mail clients.

WebAIM: Insights into Cognitive Web Accessibility

My idea for the future Clear Helper Web site, and the reason I named it “Clear Helper”, is that it will offer tutorials intended for people with cognitive disabilities.  My current thinking is that each tutorial will be offered in three modes: text-only, text with pictures, and video.  Visitors to the site, presumably, would choose the mode easiest for them to follow.

So it was with interest that I reviewed the notes from a brief, related study conducted by WebAIM and reported by Jared Smith, Associate Director of WebAIM.  The notes were from a presentation entitled “Insights into Cognitive Web Accessibility.”  It described a user test that attempted to measure the efficiency, effectiveness, and satisfaction of its participants (N = 8; students in grades 6 – 12 with cognitive or learning disabilities).

Among the findings, detailed in the presentation notes, were that participants did better with: larger text; images paired with text; short line lengths; and video-based instruction.  Insights included recommendations to “make your page LOOK easy” (“simple and intuitive”); “provide error recovery mechanisms”; and “keep visual aids clean, simple, and complementary to the content”.

I will keep these findings and recommendations in mind when designing the tutorials on the future Clear Helper Web site.