UX=Accessibility & Accessibility=UX

Gareth Ford Williams
Published in UX Collective
22 min read · Aug 16, 2021


Following on from my article on Legal Experience Design, I began wondering what Accessibility would look like as a framework if it was UX or CX led rather than compliance led.

This started me thinking about the lack of evaluation based on user outcomes, which, if we did it, would tell us how successful our accessibility or inclusive design programmes actually are.

The focus on guidelines rather than user outcomes is the direct result of how accessibility has evolved from a civil rights movement that has been legally benchmarked using a very specific technical approach. But people and the barriers they face are complicated, so I asked myself what accessibility would be like if it originated from a more ethical approach to UX Design.

If we step away from the compliance model and think of accessibility being first and foremost about people and the rich diversity we find within any audience, it starts to raise a lot of questions about what ‘good’ actually is.

In the conversations we have about accessibility we have a tendency to focus on very specific demographics. We talk about conditions such as blindness, hearing or vision impairment, motor control or neurodivergence as singular experiences whose success criteria are based on predetermined guidelines. Even if those guidelines are sensible, does this approach in some way dehumanise the audience, turning them from people you should design experiences for into a checklist of requirements you ensure are met at the end?

I understand why this has come about, and it has been relatively successful. A guidelines-based approach has worked well in getting accessibility programmes started, but when they mature and need to become more embedded in an organisation the cracks start to appear. The practices associated with accessibility can create a sense of otherness and separation, especially from the UX or CX teams where they should in fact be rooted. After all, what is accessibility if it isn’t about ethical customer experiences designed around lived experiences?

People are more complicated than the current accessibility model admits. By using the rigid categorisation and labelling that our society gives users, we lose a great deal of information about people’s expectations, approaches, skills, habits and preferences when it comes to engaging with a product or service. Guidelines are useful, but not thinking beyond them to the people they were written about can be very limiting. We are all multifaceted in terms of our abilities, and as a result people with very similar conditions and challenges have very different lived experiences. When it comes to usability we seem able to readily accept this for a mainstream audience, but accessibility struggles to evolve into a mature human-centric UX model and as such creates its own evolutionary barrier.

There are terms like “Inclusive Design”, which make me uncomfortable, that seem to have become a reaction to this problem. The term itself makes me wonder whether calling out ‘inclusion’ in a design programme is an admission by the organisation that its design up to that point has been discriminatory and ableist. Maybe that is a useful stage to go through in an organisation’s accessibility journey, but as the U in UX and the C in CX do not come with caveats of “except them”, the term “inclusive” should quickly become redundant. Designers design things for people to use, so accessibility has to start with design.

So Have We Been Approaching UX Accessibility All Wrong?

When it comes to accessibility as part of UX Design there is a fundamental issue with the way we both approach it and evaluate what success looks like.

I am male, early 50s, white, I have various hobbies, I fit in a social group and look like I’ve had a good Christmas… None of this is data relevant to customer experiences.

I have dyslexia, ADHD and I wear glasses, which seems more useful, but there are so many differences between people with similar conditions or intersections that none of this tells you a great deal that is conclusive.

On the other hand I prefer dark mode to reduce fatigue and use pinch-zoom a lot to reduce visual noise. I am not a confident reader, forms make me anxious, I am very easily distracted, I misread words a lot, I get fatigued quickly, etc… This is interesting user insight as it starts to unearth potential barriers and behaviours that can be considered in a design approach.

Looking beyond a condition and taking time to understand related barriers, obstacles and preferences gives designers insight that can help them create solutions that work for more people in more contexts. In this way, shifting from personal identity to lived experiences can stop the segregation that uses demographics determined by how people see themselves or by what boxes society puts them in. It could also unlock the missing data accessibility programmes need to mature.

Consider for a moment that 80%+ of your customer base experience barriers. Hold that thought. We don’t know the exact figures, as pretty much all data on the subject is estimated and/or incomplete. There are also many difficulties in collecting data because lots of people have impairments and experience barriers but don’t identify as a “disabled person”; the barriers they face are no less important than those faced by people who do identify as Deaf, disabled or neurodivergent, so self-declaration is a flawed approach. The demographics often used are rarely intersectional, miss out a lot of people and come with all sorts of legal (GDPR) and ethical issues, and yet understanding experienced barriers and user preferences is paramount to accessible design. The data also ignores barriers that are transient and disabling but do not directly relate to any specific impairment or condition.

Demographics also lead to unhelpful and divisive questions like, “how many deaf people are there?”, where a better question would be, “what percentage of our customers using our service have no access to sound?”

A question like this covers far more people than just a single demographic, and they all have a shared barrier, and therefore the impact of designing around the barrier is more accurately understood. This is a very specific example but we know this is a common barrier as this is one of the few accessibility issues we have where there is reliable data.

Firstly we have to get back to basics and remember that Accessibility as a design function is about ensuring that designers do not introduce barriers that disable people.

If you are not familiar with the Social Model of Disability in the context of design, it’s quite simple. People are a diverse bunch. There are many differences including gender identity, sexuality, neurodiversity, culture, ethnicity, age and impairment. These are all intersectional and when you add character traits, learning, confidence, skills and experience you have an infinite number of combinations, and you quickly realise there is no such thing as a standard issue person.

Some, but not all of these characteristics can affect how users interact with products and services. Interactions can be improved or impacted by things such as available technologies, the environment the users are in or the situation in which they are trying to use the product. They could be in a public space, on a train, whilst carrying bags, after having drunk alcohol, or simply be tired.

All of these and more impact anyone’s ability to fully function from a neurological, sensory or physical perspective.

This is what accessibility is all about. People are diverse and their lives are complicated. If they have a permanent or transient impairment, or impairments, that in combination with their environment (digital and physical) isn’t considered in a design, they experience a designed barrier that disables them.

Impairment + Environment = Disability

Before a product or service is designed, when it is still a concept, everyone has a comparative and I dare say equal experience of it. But as soon as designers start to make decisions, rightly or wrongly, more and more people are designed out either permanently or contextually. This is rarely proactive, but is a consequence of a lack of empathy between a designer and the reality of the lives of the customers they are designing for. Dieter Rams summed this up nicely.

“Indifference towards people and the reality in which they live is actually the one and only cardinal sin in design”

What can we as designers do?

Guidelines and guidance are tools that can help with understanding the options and current best practices. Mixed-ability recruitment in qualitative research helps with feedback during the design process, and training, especially in the ways users overcome barriers, is particularly useful for understanding assistive technology usage. But to ensure a product has the widest possible reach you need to understand the obstacles that cause UX barriers and accommodate common user preferences.
Think about success as knowing that all your customers have a comparative experience and are satisfied that they can independently comprehend, navigate and interact in a way that they determine is reasonable, and meets their expectations. Then think about how this can be measured on live products so you can continuously improve the customer experience.

With this in mind I have started collecting together a list of the intersectional UX obstacles all designers of software applications, mobile applications, websites and video games should consider.

10 Human Intersectional UX Obstacles within any Product or Service’s Design

When designing a product that measures success by user outcomes, it is imperative to understand the obstacles people face as well as their preferences for overcoming them. This is how comparative experiences can be designed effectively.

Obstacles are a byproduct of a design process that did not design for the user’s impairment or situation in combination with the physical or digital environment they find themselves in. It is also important to remember that obstacles can be experienced by multiple user groups, with different conditions or impairments, in similar environments or situations.

Obstacles are also useful to focus on because, when collecting quantitative data under GDPR, organisations have to be very careful about collecting any data about a user that could be determined as medical information. Asking direct questions about people’s conditions, ages or the specific assistive technologies they use, or detecting and tracking the use of assistive technologies, could mean that someone’s medical profile can easily be reverse engineered and they could be identified. Assistive technology or condition tracking is also unethical and of limited practical use.

Instead I have based the following model on 10 common UX obstacles, all of which can be experienced by multiple user groups including people with either permanent or transient impairments. I have followed those 10 obstacles with 15 UX human preferences that need to be designed for. These relate to user needs and have to be supported in order to enable users to overcome those 10 UX obstacles.

  1. Vision Reliant. There are many situations where not having access to audio is either the result of an impairment or a choice determined by the environment or situation a user finds themselves in.
    In general this is not an issue, as most web pages are silent, but when it comes to accessing audio, audio and video (AV) content or games it can be a significant barrier.
    AV and games narrative content can be made accessible with the sound off through the addition of closed captions. Up to 80% of social media users access AV with the audio muted because of the situations and environments they find themselves in; browsers block autoplaying video audio; and users who have hearing or cognitive impairments can depend on captions.
    In games and drama there is the additional issue of sound effects being an important part of the story or providing important information. Ensure for every sound there is an equivalent indication or caption description.
    Not everyone who is deaf is also culturally Deaf, especially if they have lost their hearing later in life, but there is a cultural aspect to this group as most sign languages have a different structure to their spoken counterpart language, so think about in-vision signing as an additional consideration where possible.
    With just the social media stats in mind at over 80%, it is worth remembering that this barrier group is huge; it includes the majority of your audience, and therefore captions and indicators should be a core consideration in every offering.
  2. Audio Reliant. Websites and applications should be designed to be both visual and non-visual experiences. We traditionally think about users who have no vision or have a condition which means that audio is their primary mode when using a website. Users with severely impaired vision can use screen readers, but there are also people with poor vision, cognitive conditions such as autism or dyslexia, and people with learning or reading difficulties who can also be audio-first users, and their technologies can vary from screen readers to text-to-speech engines.
    This group extends even further when you consider voice platforms such as Amazon Echo, Google Nest, Sonos One or Apple’s HomePod, in-car devices built using the assistive technologies of Text-To-Speech and Speech-To-Text, and other mobile experiences where looking at screens is not a safe option.
    This is a difficult group to estimate the size of because of its diversity. It covers blindness, vision impairment, cognitive conditions as well as voice first platforms.
  3. Impaired Vision. This group consists of anyone who is a visual first user but has impaired vision. This does not necessarily mean they have an eye condition as vision impairment can be transient.
    It can include people with a range of conditions or environments that impact on the subject enough for them to instigate strategies including use of magnification technologies, pinch zoom, changes of font, increased font size, the use of highlighters or the selection of higher contrast colour schemes.
    This can include anyone who wears glasses or contact lenses for reading, most of whom do not think of themselves as disabled despite wearing an assistive technology on their face every day.
    One of the other most common behaviours to consider is the strategy people use for extending the battery life of their portable devices. They turn the brightness of their screen down. So if your audience is in any way mobile, colour contrast should be a key consideration in your design, because if it is not, you are not supporting a core user behaviour.
    In a recent study by The Readability Group, in a sample of 2,500 users, 54% said that they used pinch zoom on their devices.
  4. Impaired Audio. Hearing impairment affects everyone: we know that from the data we have, and from acknowledging that people access content in environments where there can be competing audio. Like Impaired Vision it will eventually affect us all, because our sight and hearing deteriorate as we age. Games and audio/video content are where this can impact the most and, like the Vision Reliant group, there are assets such as captioning that can help. Some of the people in this group also depend on lip reading, and there can also be a cultural consideration when it comes to people whose first language is signed.
    The creation of barriers can come down to poor audio production and a lack of synchronised visual equivalents. The best guidance on captions is provided by the BBC.
    As well as this we should consider the production of the audio itself: a good close microphone (but not so close that it distorts), no background noise or music loud enough to compete with the speech, and lips in view, because all sighted people read lips to some degree. Facial expression and body language are also very important visual cues to meaning.
    This is a universal barrier for anyone with some degree of hearing.
  5. Phonology. Language barriers are not just experienced by people who have adult literacy issues or learning difficulties; this barrier is also experienced by people who struggle to process written language because of a cognitive condition, or who will find difficult language a struggle when it is used in an interface. Dyslexia is the most common condition which impedes the ability to process language and therefore concepts. This group could also include people with aphasia, and similar barriers exist for people who are not native speakers of the core language used on the site.
    In the US, statistics suggest that between 25% and 54% of the population are impacted by this barrier.
    This Article from Forbes outlines the cost of adult literacy.
  6. Cognition. In broader terms cognition related barriers are based on user processes including thinking, knowing, remembering, judgement, comprehension and problem-solving. When people experience cognitive barriers they become more impacted by designs that drift from conventions and where there is a lack of affordances in relation to interacting with objects.
    There are many cognitive conditions, environments and situations that impact cognitive function: everything from Autism Spectrum Disorder, ADHD and Dyslexia to other learning disabilities, while ageing, stress, fatigue, prescription or recreational drugs and alcohol can also significantly impact cognitive function. The requirements of this group can be covered by taking a neurodivergent perspective on the NN/g Usability Heuristics. Statistics for this group vary because screening is still imperfect and has only relatively recently been introduced, but estimates are always over 20% of the audience. If you add to that transience through fatigue, alcohol or medication, this barrier is relevant to your entire audience.
  7. Colour Perception. This barrier is common to colour blindness, cognitive conditions such as Irlen Syndrome, and various vision impairments. It can also be impacted by the performance of end user equipment, resulting in a similar barrier. It impacts roughly 8 to 10% of your audience.
    The problem arises when comprehension is wholly dependent on the recognition of different colours. In UI design we see this go wrong in the design of information graphics such as graphs, charts and maps, and in games this can happen when competing teams have colours that look remarkably similar to people with conditions like Colour Blindness.
    So check your colour schemes and also use icons, labelling, pattern, sound and animation as parallel identifiers.
    A simple test is to check all your designs in black and white.
  8. Colour Contrast Processing. There are conditions that are associated with dyslexia but are not unique to people who are dyslexic. One of these is Irlen Syndrome. It can cause the issues some dyslexic people describe as rivers of white in justified text, and is also the reason they need specific colours to read. Often people with this condition will use coloured gels or tinted glasses for reading print.
    This barrier is not unique to Irlen Syndrome; it is also linked to conditions such as migraine and eye strain, and can even arise through fatigue from long exposure to bright screens, which is why Dark Mode can be so useful.
    The statistics on this are difficult to pin down because for so many people it is transient, but because of the association with screen exposure it will impact the majority of your sighted customers if not considered.
  9. Target Accuracy. There are many reasons why accuracy can be impacted when trying to click on interactive elements. These range from impaired fine or gross motor control and involuntary movement to a different physicality, such as missing fingers through illness or injury. This barrier is also relevant to any user’s situation, such as being jolted on transport, carrying bags or babies, or holding other objects like drinks. Even having drunk alcohol, being tired, or simply preferring the ergonomics of one- or two-thumb input on a mobile phone impacts target accuracy.
    If targets are too small to hit or too close together, accuracy cannot be guaranteed for any user.
    For games this can be purposeful for gameplay which is when options for things like targeting assistance are always welcome.
    This is another barrier that is transient and impacts the entire audience.
  10. Fatigue. All sighted users spend an extraordinary amount of time every day looking at computer, TV or mobile screens, especially on work days, and many suffer from fatigue from prolonged use. Fatigue as a barrier is also on the rise: it is one of the main symptoms of Long COVID, which affects 15%+ of COVID survivors, so it is an obstacle whose impact is likely to grow. Fatigue can be the cause of other obstacles such as cognition, colour contrast processing and target accuracy, so it should be factored into your design thinking.
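The Colour Perception and Colour Contrast Processing obstacles above can be checked mechanically. As a minimal sketch, here is the WCAG relative luminance and contrast ratio maths in Python; the two “team colours” at the end are illustrative values chosen to show how colours that differ in hue can still collapse to near-identical greys:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB colour given as 0-255 ints."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

white, black = (255, 255, 255), (0, 0, 0)
orange, green = (255, 165, 0), (0, 200, 0)

print(round(contrast_ratio(white, black), 1))   # 21.0, the maximum
# This pair comes out around 1.15:1, far below the 3:1 often used for
# non-text elements: hard to tell apart for many colour-blind players.
print(round(contrast_ratio(orange, green), 2))
```

This is effectively what the “check your designs in black and white” advice does by eye: relative luminance is the grey value a colour collapses to.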

There might be more barriers than these and this list might grow, but I think the 10 listed cover most accessibility issues. But this is not enough. To help anyone understand the options and approaches available, it is important to then understand the associated user strategies and preferences.

15 UX Human Preferences

As well as identifiable obstacles that cause barriers, people have preferences that afford them greater access in overcoming those obstacles. We all have preferences that form a key part of our coping strategies, and the ones we are concerned with here are the ones that afford greater control and access. If a product or service is designed with the user preferences associated with coping strategies in mind, then this will afford the majority of the audience more opportunity for engagement and improve customer satisfaction.

This following list is universal for mobile and desktop websites and applications.

For games the list is largely the same, with the exception of Magnification and No JavaScript, plus four additional preferences I’ll list separately at the end.

  1. Larger Text. Fonts are the foundation of all screen based user experiences. They can impact on how easily, quickly and accurately we can read and therefore how we comprehend content and available actions. The most important thing to do when it comes to visual accessibility is to choose an optimal font that supports emotional, functional and technical accessibility.
    Then there are typographic choices we can make to optimise that experience, but this can still not be enough for many users; they will need to increase the size of the font, and your design will need to take this into consideration.
    Associated Obstacles: Impaired Vision, Phonology, Cognition and Fatigue.
  2. Spoken Text. There are several ways website designs can support the transposition of text to audio and its associated interactions, and each has its own merit in relation to different reasons why this is important. Users might use a screen reader or text to speech engine which might either be built into their device such as Apple’s Voiceover, Google’s TalkBack or Microsoft’s Narrator, or they can be third party developed technologies such as JAWS, Claro or NVDA, or they can be built into websites such as ReadSpeaker or Reachdeck.
    Each has similarities in output and differences in interaction, but it is essential that any interface is designed to work as both a visual-only and an audio experience, so every user gets a comparable experience.
    This is not just about output but also about interactions which have to be designed in parallel to work effectively.
    Associated Obstacles: Audio Reliance, Impaired Vision, Phonology, Cognition and Fatigue.
  3. High Contrast. For many users with either eye conditions, situational impairments such as reducing the brightness of a screen to ensure battery longevity or an environmental impairment such as strong ambient lighting, contrast is incredibly important. They might use forced colours like in Microsoft’s Edge, or change settings in a browser using settings like the ones available in FireFox, use a third party assistive technology or use a plug-in such as High Contrast for Chrome to customise the page.
    Even if users do not use an additional technology the contrast still needs to be optimal to work with brightness control. There are many tools to help with this like TPGi’s Color Contrast Checker.
    Associated Obstacles: Impaired Vision, Phonology, Cognition, Colour Perception, Colour Contrast Processing and Fatigue.
  4. Magnification/Zoom. Apple Macs, Windows PCs, Chromebooks, Apple iPads & iPhones and Google Android Phones and tablets all have built-in magnification options, and there are third party magnifiers available as stand alone assistive technologies such as Zoom Text. These are used for two main reasons, either someone needs the items on the screen bigger so they can see them or they zoom into a section of the screen to reduce the number of items, reducing cognitive load. On mobile devices the ability to pinch zoom means that anything that is either hard to read or interact with can be made larger in order for the users to complete a task or consume content. Pinch zoom is also used in situations where a user is temporarily vision impaired, such as in strong ambient lighting, or when screen brightness is turned down. Pinch zoom is also used by people with cognitive conditions such as ASD, Dyslexia and ADD/ADHD, not because it makes things bigger, but because it reduces the amount of things that can be focused on, making it easier to consume content or complete tasks.
    Associated Obstacles: Impaired Vision, Phonology, Cognition, Colour Perception, Colour Contrast Processing and Fatigue.
  5. Dark Mode. With progress, sometimes good ideas are lost and then rediscovered, and Dark Mode is one of them. In the early days of computing it was recognised that black-on-white text induced fatigue, was not energy efficient and could be difficult for some people to process. As such, white-on-black was a preferred route, and green screens won the battle against strain and fatigue. But when more and more colours became available, fashion outweighed functionality and software interfaces moved to brighter and less user-centric colour schemes. Dark Mode sees the return of a less decorative but more functional UI option that has gained popularity for obvious reasons. As a preference it cuts across sections of cognitively and vision impaired user groups as well as being popular with mainstream users, as it reduces fatigue and instances of eyestrain and migraines. Everyone has their own reasons for this preference related to barrier-based accessibility, but for some, impairing themselves a little for the sake of reducing their environmental impact is an active choice: the two preferences that support this are Dark Mode and turning the brightness down, both of which can create barriers for the user but are friendlier to the planet.
    Associated Obstacles: Impaired Vision, Phonology, Cognition and Fatigue.
  6. Pointer Control. Screen cursors can be controlled by a large family of devices, including mouse, trackpad, joystick, trackball, breath or eye-tracking. Pointing devices should be universally supported, and this preference isn’t of particular interest until it is cross-referenced with other preferences or barriers.
    Any product or service should be wholly operable using a pointing device only, including text input.
    Associated Obstacles: Impaired Vision, Target Accuracy and Fatigue.
  7. Keyboard Control. There are a large number of ways a keyboard can be used to navigate or control a product, including tabbing, arrowing and the use of standard keyboard shortcuts and screen reader keyboard shortcuts. These can be essential to enable the use of assistive technologies such as magnifiers and screen readers, as well as devices that improve target accuracy such as switch controllers.
    There are many users who suffer from RSI, CFS, Fibromyalgia or Carpal Tunnel Syndrome who use keyboard interaction as a more ergonomic form of control and for respite from the associated pain.
    Tabbing and arrowing controls either a visible cursor or the focus in audio first interactions.
    Keyboard control also supports environments where the use of a pointing device or touch screen is not practical, such as in space, because mice and trackpads are dependent on the presence of gravity.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
  8. Touch Screen Control. Where a device supports touch screens it is important to consider them in your design. It is also worth considering how people hold these devices in your testing, and making sure any test set-ups allow people to hold them the way they normally would. Children and older users especially might need safe areas, and mainstream users might use their thumbs as primary digits, so target areas need to be maximised and proximity to safe zones has to be considered.
    There are also different ways that users interact with their screens such as gestures, swiping or touch explore that should be understood. One thing to be mindful of is to avoid introducing non-standard gestures.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
  9. Voice Control. With the inclusion of technologies like Apple Voice Control as standard on Apple devices and Alexa on Amazon Echo, talking to computers is becoming more commonplace. There is also a long tail of assistive technology that supports speech input, such as Dragon NaturallySpeaking, and speech control is on the rise on voice platforms like Amazon’s Alexa and in in-car UX.
    Associated Obstacles: Audio Reliance, Impaired Vision, Phonology, Cognition, Target Accuracy and Fatigue.
  10. Captions. Closed captions, which are also known in the UK as subtitles, are a mainstream asset used by anywhere up to 80% of an audience. The largest use is on social media platforms, where video on demand content is often accessed in public spaces or social settings where using audio would be impolite. Captions are also used by people who are hearing impaired or deaf, either as a supplement to the audio narrative or to lip reading, and by people who are Deaf where there is no signed or sign-interpreted alternative. Captions are also used by a lot of people with cognitive conditions such as ASD or Dyslexia as a comprehension aid, and by people who are learning a language. There are recognised best practices: the BBC Subtitle Guidelines.
    Associated Obstacles: Vision Reliance, Impaired Audio, Phonology, Cognition and Fatigue.
  11. No JavaScript. This is much less common than it used to be, but it is an important fallback strategy for users who have figured out how to turn JavaScript off in their browsers when the JavaScript on a website creates barriers. JavaScript in itself isn’t the problem, but when it is used without testing its impact on assistive technology users it can completely block people. Working out how to switch JavaScript off is not easy, its success depends on how accessible the underlying HTML UX is, and it should not be something planned for, as it is an advanced user strategy. There is a very informative blog on the subject by Remy Sharp.
    Associated Obstacles: Audio Reliance, Impaired Vision, Phonology, Cognition and Target Accuracy.
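Caption quality (preference 10 above) has at least one property that can be measured automatically: reading speed, which guidance such as the BBC Subtitle Guidelines commonly frames as roughly 160 to 180 words per minute. A minimal sketch of a cue lint under that assumption; the `(start, end, text)` cue format and the exact threshold here are illustrative, not taken from any standard:

```python
# Hypothetical cue shape: (start_seconds, end_seconds, caption_text).
MAX_WORDS_PER_MINUTE = 180  # assumed ceiling, loosely based on BBC guidance

def too_fast(cue):
    """True if a caption cue asks the reader to read faster than the ceiling."""
    start, end, text = cue
    duration_minutes = (end - start) / 60
    words = len(text.split())
    return words / duration_minutes > MAX_WORDS_PER_MINUTE

cues = [
    (0.0, 2.0, "Hello, and welcome back."),  # 4 words in 2s: 120 wpm, fine
    (2.0, 3.0, "We need to talk about what happened yesterday."),  # 480 wpm
]

for start, end, text in cues:
    if too_fast((start, end, text)):
        print("Too fast to read:", text)
```

A real pipeline would parse WebVTT or similar and also check line length and cue overlap, but the reading-speed check alone already catches a common failure.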

The following four requirements are specific to video games:

  1. Games Controller. Controller support is vital for all games, especially for adaptive controllers such as Microsoft’s Xbox Adaptive Controller. This gives a player more options to customise their controls to a schema that best suits their requirements.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
  2. Control Remapping. Although it is important to try to make the default mapping of your game’s controls as ergonomically efficient as possible, and to fit conventions as closely as possible, ensuring players can re-map every control supports schemas that suit individuals’ physical and cognitive models.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
  3. Combined Pointer and Keyboard Control. It is important for all games to be designed to be controlled exclusively by keyboard, pointer or controller, but combinations should always be considered, especially if a player is using a combination of assistive technology inputs. This could be an eye tracker and a switch for example.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
  4. Assistance. Games have to have barriers; they are challenging by design, but comparative challenge differs between users. If a user does not have the dexterity to control fine-movement targeting, then the addition of targeting assistance will keep them in the game on a level playing field. The same can be necessary for rapid fire, or even for difficulty at key points where a player is otherwise not able to continue within a game’s narrative.
    Associated Obstacles: Audio Reliance, Impaired Vision, Cognition, Target Accuracy and Fatigue.
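Control remapping as described above boils down to a bindings table the player can edit, with the game resolving raw inputs to actions through that table rather than hard-coding keys. A minimal sketch; the action and input names are illustrative, not from any real engine’s API:

```python
# Hypothetical default scheme: every action is remappable.
DEFAULT_BINDINGS = {"jump": "space", "fire": "mouse_left", "crouch": "ctrl"}

class ControlScheme:
    def __init__(self, bindings=None):
        self.bindings = dict(bindings or DEFAULT_BINDINGS)

    def remap(self, action, new_input):
        """Bind an action to a new input, refusing clashes with other actions."""
        for other, bound in self.bindings.items():
            if bound == new_input and other != action:
                raise ValueError(f"{new_input!r} is already bound to {other!r}")
        self.bindings[action] = new_input

    def action_for(self, pressed_input):
        """Resolve a raw input event to a game action, if any."""
        for action, bound in self.bindings.items():
            if bound == pressed_input:
                return action
        return None

scheme = ControlScheme()
scheme.remap("jump", "mouse_right")      # e.g. a switch presenting as a mouse button
print(scheme.action_for("mouse_right"))  # jump
```

Keeping the lookup indirect like this is what lets one input schema serve a keyboard, a controller, or a combination of assistive devices without touching game logic.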

UX Human-Centric Data

An obstacles-and-preferences-based model of accessibility would not only help designers and design researchers think about what questions need exploring, but would also stop accessibility being a post-UX tick-box exercise and make it more about the actual experiences of actual users.

Obstacles and preferences ensure that the widest context of accessibility is also understood, as it is more about the flexible design that needs to work for everyone in any situation, and they stop UX barriers being designed into products.

The final benefit of this type of approach is that it gives context to the types of questioning and monitoring necessary to evaluate how successful any accessibility or inclusive design programme is. By being intersectionally focused it will be next to impossible to reverse engineer the medical conditions of any user, and any associated data model is most likely to be GDPR compliant.

By shifting from a compliance first to a user outcome first UX model, there is a lot to be gained in terms of impact, engagement, inclusion and importantly we’d have meaningful data.

#a11y #accessibility #UXdesign #UXresearch #inclusivedesign #Gamesdesign #diversity #inclusion #designthinking #usability #data

The UX Collective donates US$1 for each article we publish. This story contributed to World-Class Designer School: a college-level, tuition-free design school focused on preparing young and talented African designers for the local and international digital product market. Build the design community you believe in.


Director at Ab11y.com and The Readability Group. Ex-Head of UX Design and Accessibility at the BBC. I have ADHD and I’m dyslexic.