Shaping the Design

Designing for the vision impaired

Blind users possess an enhanced sense of hearing, smell and touch (Bauer et al., 2017). They also display enhanced cognitive functions, such as language and memory. Their brains rewire to enable better interaction with the environment. These heightened senses of touch and hearing (Röder et al., 2004) are aspects to consider when designing for the blind.

Memory and spatial perception have a significant influence on how the visually impaired perform certain tasks (Harrar et al., 2017). Because of difficulties in accessing some technologies and inappropriate user interfaces (e.g. unlabelled buttons), visually impaired users resort to memory to identify the position and function of specific buttons.

In technological terms, there are some particularities in the way blind users access and interact with content. On mobile devices, as covered previously, they resort to screen readers, which recognise readable elements on the screen.

Universal Design

The seven principles of Universal Design were defined in 1997 by a multidisciplinary group of researchers – architects, product designers, engineers and environmental designers – led by Ron Mace, in collaboration with the Center for Universal Design at NC State University (Connell et al., 1997).

1. Equitable Use
The design needs to be useful, marketable, attractive and safe for people with different abilities without stigmatising or segregating any user.

2. Flexibility in Use
The design should accommodate an extensive variety of preferences and personal abilities, allowing the user to choose a method of use and adapting to the user's pace and accuracy.

3. Simple and Intuitive Use
The comprehension of the design needs to be independent of the user's experience, knowledge and literacy. It should also eliminate unnecessary complexity, match the user's expectations and intuition, present information consistently and give feedback to the user during and after the completion of a task.

4. Perceptible Information
The design should communicate and inform effectively, regardless of the user's abilities or the surrounding environment. It needs to differentiate and contrast elements, use diverse media (pictorial, verbal and tactile) and enable access to information for people with sensory limitations.

5. Tolerance for Error
The design needs to minimise the hazards and errors caused by accidental or involuntary actions. It should provide warnings of hazards, failures and errors, isolate or shield hazardous elements from routine tasks, and discourage unconscious actions in tasks that require attention.

6. Low Physical Effort
The design needs to be efficient, allowing the user to operate it with minimal effort. It should let users maintain a neutral body position and accomplish the workload with effort appropriate to the task, eliminating repetitive actions and excessive physical effort.

7. Size and Space for Approach and Use
The space provided for approach, use, reach and manipulation needs to be appropriate regardless of the user's body size, posture and mobility. The design should give a clear line of sight to important elements, provide access to all elements whether the user is standing or seated, and allow enough space for the use of assistive devices or personal assistance.

Designers need to apply concepts like Universal Design and User-Centred Design to reach broader audiences. It is important that the whole process, from conception to development, has the user at its core (Langdon et al., 2012). As mentioned previously, Universal Design argues that solutions have to work well both for those with average abilities and for those with disabilities (Aslaksen, 1997), ensuring a universal and equitable use (Ziefle & Jakobs, 2010).

According to Zaphiris & Ang (2009), Universal Design answers the need for inclusion in the use and development of new applications and services, overcoming the idea of applications accessible exclusively to users with disabilities. Zaphiris & Ang (2009) also suggest it is paramount to consider the particularities and needs of the largest possible number of users to provide an inclusive and universal use of technologies, whether those users have a disability or any other particular need.

As stated by Herriott (2012), Inclusive Design needs to be based on methodologies that improve products and encompass a broad range of users with different motor and cognitive skills. Herriott (2012) also believes that user participation during product development can contribute to achieving Universal Design.

Following Zaphiris, Kurniawan & Ghiawdwala (2007), the Universal Design philosophy recognises, values and accommodates a significant range of abilities, preferences and human needs in a single service or product. These authors also suggest it is essential to create specific guidelines for designers who develop new technologies.

The concept of Universal Design advocates for an equal and inclusive solution for any product – it helps designers focus on specific values and rules, putting accessibility and usability in the spotlight.

Burgstahler (2015) developed a Universal Design process, establishing the stages that the development of new products should follow. Burgstahler (2015) also considers it crucial to evaluate the developed products periodically to ensure that they continue to follow good Universal Design practices.

Other Guidelines

In general, accessibility on the web is tested against the success criteria established by the WCAG. However, some studies demonstrate that websites or apps that conform to the WCAG still do not guarantee accessibility for people with different types of disabilities (Rømen and Svanæs, 2008). Power et al. (2012) demonstrate that only 49.6% of the problems found by blind users are covered by the WCAG (Table 1).

Table 1. Categories of user problems with the total number of problems and the percentage (number) of user problems covered by WCAG 2.0 success criteria. Source: Power et al. (2012).

Exploring other guidelines and understanding the particularities of blind users is vital to creating a solution that fits their usage. Apple and Google offer developers and designers some basic guidelines regarding accessibility. Even though they are not specific to particular disabilities, they can aid compliance with accessibility standards. These guidelines highlight the correct naming and classification of elements, enabling better integration between the app and screen readers (Apple Inc., 2012).

Pezzuto et al. (2017) mapped the main accessibility issues faced by blind users, proposing solutions and recommendations in several categories: buttons, gesture-based interactions, screen readers, screen size, user feedback, voice commands and data input. Pezzuto et al. (2017) proposed the following guidelines:

  1. Use physical buttons that are easy to find, for example in the quadrants of the screen, or add fixed virtual buttons.
  2. Offer new ways of interacting with the virtual keyboard that use more than one finger, emulating the use of physical keyboards.
  3. Develop new forms of gesture recognition on mobile devices that consider the characteristics of visually impaired users.
  4. Screen readers with verbal feedback should have a natural voice and be available in the local language, offering different voice configurations, adjustable speech rate and alternatives to linear reading.
  5. Identify the edges of the screen physically.
  6. Provide multiple forms of feedback to the user: sound, voice and vibration (see the sketch after this list).
  7. Offer voice recognition that supports a broader range of vocal tones, accents and locales.
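
As an illustration of guideline 6, the minimal sketch below combines spoken and tactile confirmation of an action on Android; the function name and the announcement text are hypothetical, and a real app would normally also let the user configure which feedback channels are used.

```kotlin
// Minimal sketch of multimodal feedback (guideline 6): voice plus vibration.
// confirmAction() and its message are illustrative, not part of the cited guidelines.
import android.view.HapticFeedbackConstants
import android.view.View

fun confirmAction(view: View, message: String) {
    // Spoken feedback: if a screen reader such as TalkBack is active,
    // it reads the message aloud.
    view.announceForAccessibility(message)

    // Tactile feedback: a short, system-defined vibration tied to the view.
    view.performHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY)
}

// Hypothetical usage: confirmAction(sendButton, "Message sent")
```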

Kopeček and Batůšek (1999) suggest some principles for the design and feedback of interfaces aimed at blind users; their recommendations focus mainly on the user's comfort and on interface customisation. Kane and Wobbrock (2011) studied the techniques blind users preferred and the differences between the gestures of blind and sighted users. After the study, a set of Design Guidelines for Accessible Touch Screens was produced:

  1. Avoid symbols used in print writing.
  2. Favour edges, corners and other landmarks.
  3. Reduce the demand for location accuracy.
  4. Limit time-based gesture processing.
  5. Reproduce traditional spatial layouts when possible.

References

Apple Inc., (2012). Making Your iOS App Accessible. Accessibility Programming Guide for iOS. Retrieved from https://developer.apple.com/library/archive/documentation/UserExperience/Conceptual/iPhoneAccessibility/Making_Application_Accessible/Making_Application_Accessible.html#//apple_ref/doc/uid/TP40008785-CH102-SW5

Aslaksen, F. et al., (1997) Universal Design: Planning and Design for All. Gladnet Collection. Retrieved from https://digitalcommons.ilr.cornell.edu/cgi/viewcontent.cgi?article=1329&context=gladnetcollect

Bauer, C. et al. (2017). Multimodal MR-imaging reveals large-scale structural and functional connectivity changes in profound early blindness. PLOS ONE. Retrieved from https://doi.org/10.1371/journal.pone.0173064

Burgstahler, S., (2015). Universal Design: Process, Principles, and Applications. How to apply universal design to any product or environment, Washington: University of Washington – College of Engineering.

Connell, B R. et al., (1997) The Principles Of Universal Design. The Center for Universal Design. Retrieved from https://projects.ncsu.edu/ncsu/design/cud/about_ud/udprinciplestext.htm

Harrar, V. et al., (2017). Mobility of Visually Impaired People. Cham, Switzerland: Springer Nature

Kane, S. and Wobbrock, J. (2011). Usable Gestures for Blind People: Understanding Preference and Performance. Proceedings of CHI 2011. Retrieved from http://faculty.washington.edu/wobbrock/pubs/chi-11.05.pdf

Kopeček, I., & Batůšek, R. (1999). User Interfaces for Visually Impaired People. Proceedings of 5th ERCIM Workshop on User Interfaces for All, pp. 167-174. Retrieved from http://ui4all.ics.forth.gr/UI4ALL-99/Batusek.pdf

Langdon, P. et al. (2012). Designing Inclusive Systems – Designing Inclusion for Real-world Applications. London: Springer.

Power, C. et al. (2012). Guidelines are only half of the story: accessibility problems encountered by blind users on the web. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’12). ACM, New York, NY, USA, 433-442. Retrieved from http://www-users.cs.york.ac.uk/cpower/pubs/2012CHIPowerFreireGuidelines.pdf

Röder, B. et al. (2004). Early Vision Impairs Tactile Perception in the Blind. Current Biology. Retrieved from https://www.cell.com/current-biology/fulltext/S0960-9822(03)00984-9

Rømen, D. and Svanæs, D. (2008). Evaluating web site accessibility: validating the WAI guidelines through usability testing with disabled users. In Proceedings of the 5th Nordic conference on Human-computer interaction: building bridges (NordiCHI ’08). ACM, New York, NY, USA. Retrieved from http://medialt.no/pub/uikt/u2010/021-Romen/index.html

Ziefle, M., & Jakobs, E. M. (2010). New Challenges In Human-Computer Interaction: Strategic Directions And Interdisciplinary Trends. Proceedings of the International Conference on Competitive Manufacturing (COMA10). Retrieved from http://www.comm.rwth-aachen.de/files/coma_10_ziefle_jakobs_2.pdf

Shaping the Design Language

As mentioned previously, Apple and Google have both published accessibility guidelines for mobile apps. These guidelines cover basic, generic information concerning accessibility, but applying them can have a positive impact on the accessibility and usability of an app – labelling items and images, positioning elements and organising menus are some of the topics they address.

It is essential to integrate apps with the other software adopted by visually impaired users, especially the built-in screen readers: TalkBack on Android and VoiceOver on iOS. This is why it is essential to label every component added to the screen.
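
To make the labelling requirement concrete, here is a minimal Android sketch in Kotlin; the layout, view IDs and label text are assumptions for illustration, and iOS exposes an equivalent accessibilityLabel property for VoiceOver.

```kotlin
// Minimal sketch: labelling components so TalkBack can announce them.
// The layout resource, view IDs and label text are hypothetical.
import android.os.Bundle
import android.view.View
import android.widget.ImageButton
import android.widget.ImageView
import androidx.appcompat.app.AppCompatActivity

class MenuActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_menu)

        // An icon-only button has no visible text, so give the screen reader an explicit label.
        findViewById<ImageButton>(R.id.sendButton).contentDescription = "Send message"

        // Hide purely decorative images from the screen reader to reduce noise
        // during linear navigation.
        findViewById<ImageView>(R.id.decorativeDivider).importantForAccessibility =
            View.IMPORTANT_FOR_ACCESSIBILITY_NO
    }
}
```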

Menus and buttons are another critical aspect of app development. It is crucial to build screens from elements the user already knows. According to Babich (2018), this eliminates the need for explanation, as users are already familiar with them. Users can draw on past experience to interact with the app, reducing the learning curve.

It is equally important for the user to be able to use the functionalities of the app simultaneously with other apps; for example, the user might need to copy and paste the address from an email app to a navigation app.

Background alerts are another relevant aspect and should be taken into consideration in the design.

Principle: Equitable Use
Guidelines: The components of the app (menus, videos, buttons, links, images) should be identifiable by the users; the components and the contents of the app should be labelled correctly.
Rationale: Because blind users rely on screen readers to navigate and interact with apps, it is essential to label all components correctly. Proper labelling lets users identify the function of each component, improves usability and ensures accessibility.

Principle: Layout Consistency
Guidelines: Keep the placement of menus and buttons consistent across screens; avoid unnecessary changes in the position of elements.
Rationale: The construction of menus and the layout of the screens should be consistent throughout the user's navigation, allowing the user to explore the app more intuitively.

Principle: Uniform Actions
Guidelines: Do not add components that fall outside the standard guidelines; use the components provided by the iOS and Android guidelines.
Rationale: Interactions, gestures, commands and controls that improve the usability of apps already exist in Android and iOS. Using them creates consistency between different apps, helping the user recognise familiar components and architecture and therefore reducing the learning curve.

Principle: Simultaneous Use
Guidelines: Allow the app to run in the background without losing information; allow the customisation of alerts and notifications (see the sketch after the table).
Rationale: The visually impaired user needs to access more than one app simultaneously to complete and verify information. When returning to a minimised app, the user should find it exactly where it was left. It is also crucial that the user can customise notifications and alerts.

Principle: Usage Customisation
Guidelines: Allow customisation of the amount of information displayed (high, medium, low) and of the preferred type of feedback (audio or tactile).
Rationale: The level of information must be set according to the user's familiarity with the product, and the user should be able to choose the preferred type of feedback.

Principle: Reliability of Information
Guidelines: All information needs to be validated and updated on a regular basis.
Rationale: The information available in the app should be reliable. Text should not merely support an image; it should be as descriptive as possible. For example, when a dish is displayed, describe the ingredients, the serving size, etc.

Principle: User-Centred Design
Guidelines: Include the user when validating the app and its functionalities; develop, test and validate the app with the user.
Rationale: Functionalities can be deemed unusable and useless if not tested and approved by the user. Including the user in the development process is essential to ensure accessibility.

Table 1. Compilation of principles and guidelines collected during the research on blind users. Sources: Google (2017), Apple Inc. (2012), W3C (2010), Connell et al. (1997), Burgstahler (2015), Dix et al. (2004), Butean et al. (2015).
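
The Simultaneous Use principle above asks that an app keep its state when the user switches to another app. A minimal Android sketch of that behaviour, assuming a hypothetical activity with a text field holding an address, could look like this:

```kotlin
// Sketch for the "Simultaneous Use" guideline: keep the user's input when the app
// is sent to the background and its process is later recreated.
// The activity, layout, view ID and key name are hypothetical.
import android.os.Bundle
import android.widget.EditText
import androidx.appcompat.app.AppCompatActivity

class AddressActivity : AppCompatActivity() {

    private lateinit var addressField: EditText

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_address)
        addressField = findViewById(R.id.addressField)

        // Restore what the user had typed before switching apps.
        savedInstanceState?.getString(KEY_ADDRESS)?.let { addressField.setText(it) }
    }

    override fun onSaveInstanceState(outState: Bundle) {
        super.onSaveInstanceState(outState)
        // Persist the draft so nothing is lost while another app is in the foreground.
        outState.putString(KEY_ADDRESS, addressField.text.toString())
    }

    private companion object {
        const val KEY_ADDRESS = "address_draft"
    }
}
```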

References

Apple Inc. (2012). Guides and Sample Code – Accessibility Programming Guide for iOS. Retrieved from https://developer.apple.com/library/content/documentation/UserExperience/Conceptual/iPhoneAccessibility/Introduction/Introduction.html#//apple_ref/doc/uid/TP40008785

Babich, N. (2018) A Comprehensive Guide To Mobile App Design. Smashing Magazine. Retrieved from https://www.smashingmagazine.com/2018/02/comprehensive-guide-to-mobile-app-design/

Bogdan, T., Butean, A., Moldoveanu, A. & Balan, O. (2015). Introducing Basic Geometric Shapes to Visually Impaired People Using a Mobile App. Bucharest, ResearchGate, pp. 1-4.

Burgstahler, S., (2015). Universal Design: Process, Principles, and Applications. How to apply universal design to any product or environment, Washington: University of Washington – College of Engineering.

Connell, B R. et al., (1997) The Principles Of Universal Design. The Center for Universal Design. Retrieved from https://projects.ncsu.edu/ncsu/design/cud/about_ud/udprinciplestext.htm

Dix, A., Finlay, J., Abowd, G. & Beale, R., (2004). Human-computer Interaction. 3rd edition, England: Pearson Education Limited.

Google. (2017) Android Accessibility Help. Retrieved from https://support.google.com/accessibility/android/answer/6151827?hl=en&ref_topic=3529932

W3C (2010). Mobile Web Application Best Practices. Retrieved from: https://www.w3.org/TR/mwabp/#bp-presentation-interaction

Design Language

There is a great deal of information about, and many names for, design languages: style guides, design patterns, visual language guides, UI style guides, frameworks, visual guides, atomic design, design systems. All of them serve the same purpose: managing design at scale by creating a series of components that can be reused in different combinations (Clark, 2018).

One of the first and best-known design languages for digital user interfaces is Bootstrap. It was created in 2011 by the Twitter engineers Mark Otto and Jacob Thornton. The idea was to create an internal tool to solve inconsistencies in development within the company.

As a project grows, naturally more people get involved, and consequently, different methods to develop and design are introduced. This creates inconsistencies in the design, and it can be time-consuming to align each team member towards a particular design pattern to be used in the project. Design languages are born with the goal of reducing these inconsistencies.

The designer Brad Frost (2013) created the concept of Atomic Design, drawing an analogy with chemistry to explain how the different components of a page interact with one another.

Figure 1. The concept of Atomic Design by Brad Frost. Source: Edwin Kato.

Atoms are the smallest pieces, such as a button or a header. When Atoms come together they produce Molecules; for example, a button combined with an input field can make a search bar. A few Molecules together can make Organisms, such as navigation bars or chat interfaces. Organisms are combined into Templates, wireframe-level layouts that finally become high-fidelity versions called Pages.

Figure 2. An example of applied Atomic Design. Source: Brad Frost.
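
As a rough sketch of the same idea in code, assuming a Jetpack Compose setup (all composable names and parameters here are illustrative), two Atoms can be combined into a Molecule:

```kotlin
// Illustrative Kotlin/Jetpack Compose sketch of Atomic Design:
// two Atoms (a text field and a button) combine into a Molecule (a search bar).
import androidx.compose.foundation.layout.Row
import androidx.compose.material.Button
import androidx.compose.material.OutlinedTextField
import androidx.compose.material.Text
import androidx.compose.runtime.Composable

// Atom: a plain text input.
@Composable
fun SearchInput(query: String, onQueryChange: (String) -> Unit) {
    OutlinedTextField(value = query, onValueChange = onQueryChange)
}

// Atom: a labelled button.
@Composable
fun SearchButton(onClick: () -> Unit) {
    Button(onClick = onClick) { Text("Search") }
}

// Molecule: the two Atoms arranged side by side form a search bar.
// Organisms would combine Molecules like this one into, for example, a navigation bar.
@Composable
fun SearchBar(query: String, onQueryChange: (String) -> Unit, onSearch: () -> Unit) {
    Row {
        SearchInput(query, onQueryChange)
        SearchButton(onSearch)
    }
}
```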

According to Kato (2017), Atomic Design makes the layout simple and easy to understand. Because components are reusable, the prototyping process becomes faster and smoother; changing a single Atom affects not only itself but every component built from it. Reusing components also results in a smaller code base, which is consequently easier to maintain.

Following the ideas of Bootstrap and Atomic Design, a multitude of design languages has since been created: Material Design (Google), iOS Design Themes (Apple), Carbon Design System (IBM) and others.


Video 1. Example of Visa’s Design Language, Vision, in use.

References

Clark, C. (2018) What Is a Design System? Forum One. Retrieved from https://forumone.com/ideas/what-is-design-system

Frost, B. (2013) Atomic Design. Brad Frost. Retrieved from http://bradfrost.com/blog/post/atomic-web-design/

Kato, E (2017) Atomic Design: The Secret to Building Powerful User Interfaces. Medium. Retrieved from https://medium.com/the-andela-way/https-medium-com-edwinkato-atomic-design-bc3c77a5cb1a

Choosing the right Design Language

All design languages cover users with disabilities to some extent. The general accessibility guidelines relevant to blind users include providing alternative text labels for components, using sufficient colour contrast ratios and offering audio descriptions.

Before choosing the Design Language to be used, an analysis of applications familiar to blind users was performed (Figure 1), and their performance was benchmarked against the guidelines set previously. The apps chosen were mentioned during the user interviews and during the research. WhatsApp, Instagram, Gmail and Spotify are mainstream applications with a large user base, including users with disabilities. ColorID, TapTapSee, BeMyEyes and Eye-D are made specifically for blind users. ColorID identifies the colour of whatever the phone's camera is pointed at; it is used, for example, to help users choose clothing or identify banknotes. TapTapSee offers similar functionality but recognises the objects the camera is pointed at. BeMyEyes is a collaborative tool in which volunteers are contacted to help blind users identify certain objects, for example to distinguish a can of peas from a can of corn. The last app, Eye-D, is aimed at helping blind users with various tasks: finding transport and locating ATMs, restaurants and cinemas. It also identifies objects and colours and recognises text.

Figure 1. Analysis of apps familiar to blind users.

Material Design was created by Google and has extensive documentation on accessibility. Its framework offers tools to check the colour contrast of text elements against coloured backgrounds (Figure 2), helping designers create apps that are accessible to low-vision users. Other tools include the Accessibility Test Framework for Android, which performs various accessibility-related checks on the view hierarchy based on the WAI guidelines, and the Accessibility Scanner, which suggests accessibility improvements for Android apps without requiring technical skills (Figure 3).

Figure 2. Screenshot of Google Colour Contrast verification tool.

Figure 3. Accessibility Scanner provided by Google.
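
To make the contrast criterion concrete, the sketch below reproduces the WCAG 2.0 contrast-ratio calculation that checkers such as the one in Figure 2 apply; the colour values in the example are illustrative, and 4.5:1 is the WCAG AA minimum for normal body text.

```kotlin
// Sketch of the WCAG 2.0 contrast-ratio calculation used by colour contrast checkers.
// Colours are given as 0xRRGGBB integers.
import kotlin.math.pow

// Relative luminance of an sRGB colour, per the WCAG definition.
fun relativeLuminance(rgb: Int): Double {
    fun channel(value: Int): Double {
        val c = value / 255.0
        return if (c <= 0.03928) c / 12.92 else ((c + 0.055) / 1.055).pow(2.4)
    }
    val r = channel((rgb shr 16) and 0xFF)
    val g = channel((rgb shr 8) and 0xFF)
    val b = channel(rgb and 0xFF)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
}

// Contrast ratio between two colours: (L_lighter + 0.05) / (L_darker + 0.05).
fun contrastRatio(foreground: Int, background: Int): Double {
    val l1 = relativeLuminance(foreground)
    val l2 = relativeLuminance(background)
    return (maxOf(l1, l2) + 0.05) / (minOf(l1, l2) + 0.05)
}

fun main() {
    // Black text on a white background yields 21.0, well above the 4.5:1 minimum.
    println(contrastRatio(0x000000, 0xFFFFFF))
}
```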

One of the major issues blind users have to face, detected during the initial exploration with the users, is dealing with unfamiliar interfaces that have a poor hierarchy (Figure 4).

Figure 4. Example of a poor hierarchy system. Source: Material Design.

Google's Material Design also covers the sizing of elements like buttons and icons, the spacing between components, and the appropriate invisible touch areas surrounding icons (Figure 5) to improve the user's accuracy.

Figure 5. Example of touch and pointer targets. Source: Material Design.
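
A common way to implement that invisible area on Android, assuming a hypothetical small icon inside a parent layout, is a TouchDelegate that enlarges the touchable rectangle without changing the visible size; the amount of extra padding passed in is purely illustrative.

```kotlin
// Sketch: enlarging the touchable area around a small icon with a TouchDelegate.
// The views and the amount of extra padding are illustrative; note that a parent
// view can host only one TouchDelegate at a time.
import android.graphics.Rect
import android.view.TouchDelegate
import android.view.View

fun expandTouchTarget(icon: View, extraPx: Int) {
    val parent = icon.parent as View
    // Wait until layout has finished so the icon's position inside the parent is known.
    parent.post {
        val hitRect = Rect()
        icon.getHitRect(hitRect)
        // Grow the rectangle on every side; touches landing in this area are
        // forwarded to the icon even though its visible size is unchanged.
        hitRect.inset(-extraPx, -extraPx)
        parent.touchDelegate = TouchDelegate(hitRect, icon)
    }
}
```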

Based on the compilation of Principles and Guidelines (Shaping the Design Language – Table 1), together with the extensive documentation that Google has created for its design language, Material Design seems to be an appropriate fit for this project.
