I am trying to evaluate the barriers of online security and privacy to people with cognitive disabilities. This work will help inform the effort of the W3C’s Cognitive and Learning Disabilities Accessibility Task Force to recommend standards on how to make online security and privacy more accessible.
Problem
I am struggling with how to go about this evaluation. It is a daunting task to come up with common barriers and solution recommendations (see the sketch after this list for one way the evaluation space might be organized) across:
- the many end-user security and privacy techniques, e.g.:
  - passwords;
  - two-factor authentication;
  - biometrics; and
  - encryption
- the variety of platforms upon which the techniques are implemented, e.g.:
  - operating systems;
  - devices;
  - web and mobile apps; and
  - messaging
- the different ways the many techniques within the various platforms have been implemented by the major players, e.g.:
  - Apple;
  - Google;
  - Microsoft; and
  - Open Source
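
To make that scope concrete, here is a minimal sketch, in Python, of how the technique-by-platform-by-implementer evaluation space might be organized as data. The class name, field names, and the example entry are my own illustrative assumptions, not part of any existing tool or finding.

```python
from dataclasses import dataclass, field

# Vocabularies drawn directly from the lists above.
TECHNIQUES = ["passwords", "two-factor authentication", "biometrics", "encryption"]
PLATFORMS = ["operating systems", "devices", "web and mobile apps", "messaging"]
IMPLEMENTERS = ["Apple", "Google", "Microsoft", "Open Source"]

@dataclass
class Evaluation:
    """One cell of the technique x platform x implementer evaluation space."""
    technique: str
    platform: str
    implementer: str
    barriers: list[str] = field(default_factory=list)          # observed barriers
    recommendations: list[str] = field(default_factory=list)   # candidate solutions

# Hypothetical example entry; the barrier and recommendation text is illustrative only.
example = Evaluation(
    technique="two-factor authentication",
    platform="web and mobile apps",
    implementer="Google",
    barriers=["a one-time code must be remembered and retyped before it expires"],
    recommendations=["allow more time, or offer a push-approval alternative"],
)

# The full space is the cross product of the three lists.
total_cells = len(TECHNIQUES) * len(PLATFORMS) * len(IMPLEMENTERS)
print(f"{total_cells} technique/platform/implementer combinations to evaluate")
```

Even with these short example lists, that is 64 combinations, which is why identifying barriers common across techniques matters so much.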
Background
It is well known that people sacrifice security and privacy for the sake of convenience. Security and privacy techniques are too difficult to use, and thus are inconvenient. For people with cognitive disabilities, such “inconvenience” amounts to a significant barrier. (See my recent blog post about CAPTCHA for an illustration.)
It appears to me, from my research so far, that there is a lot of work on how to improve the security standards of information and communications technology (ICT), but not much focus on its usability and accessibility. For example, I could not even find the terms “usability” and “accessibility” in the ICT Security Standards Roadmap of the International Telecommunication Union, an agency of the United Nations.
Improvement
Determining how to make online security usable for everyone must include people with cognitive disabilities. Doing so will mean the related user experience is designed to be as simple as possible. The easier the experience is to use, the more likely everyone will be to protect their assets and privacy.
A piece of good news is that the Electronic Frontier Foundation is researching how to measure the usability of secure-messaging implementations as part of its “Designing a Prize for Usable Cryptography”. I expect that work will help develop usability- and accessibility-evaluation standards for online security and privacy in general, and will inform the creation of related recommendations for people with cognitive disabilities.
Solution?
I am working on a list of barriers, based upon functional limitations, that are common to end-user security techniques, plus sublists of barriers unique to each technique. I am not a security and privacy expert. Thus the limitations I am considering are based solely upon my expertise in accessibility and cognitive disability, and upon what seems logical to me. (For an example, see my recent blog post about CAPTCHA.) I suppose that effort will have to suffice until a security expert, such as the Electronic Frontier Foundation, determines how to measure related usability.
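
The sketch below, again in Python and again purely illustrative, shows one way that list could be structured: common barriers keyed by functional limitation, plus a sublist of barriers unique to each technique. The specific limitations and barrier descriptions are my own assumed placeholders, not the actual working list.

```python
# A hypothetical sketch of the list structure described above:
# common barriers organized by functional limitation, plus a
# sublist of barriers unique to each technique. All entries are
# illustrative placeholders, not the actual working list.

common_barriers = {
    # functional limitation -> barriers shared across techniques
    "impaired memory": [
        "recalling secrets, such as passwords, on demand",
    ],
    "impaired attention": [
        "completing multi-step procedures before a timeout",
    ],
}

technique_barriers = {
    # technique -> barriers unique to that technique
    "two-factor authentication": [
        "transcribing a one-time code from one device to another",
    ],
    "biometrics": [
        "understanding error messages when a scan fails",
    ],
}

# Combine the common list with the sublist for a given technique.
def barriers_for(technique: str) -> list[str]:
    shared = [b for bs in common_barriers.values() for b in bs]
    return shared + technique_barriers.get(technique, [])

print(barriers_for("two-factor authentication"))
```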
Help Needed
I welcome comments with:
- suggestions about how to evaluate the barriers of online security and privacy to people with cognitive disabilities; and
- information about any effort to evaluate the usability and accessibility of online security and privacy techniques
Notes:
- See Stay Safe Online for online security-related instructions and information.
- No endorsement is intended or implied of the organizations and their efforts mentioned in this blog post.