Do you think you are customer focused? If you are not doing better than legislation in dire need of updating, then you do not really care about the customer. You need to ensure that anyone can seamlessly use your digital properties, no matter their situation in life. It’s not only good for your brand; it’s good for people.
Look at this law. It is ancient. When it was created, Star Trek was our closest idea of the iPhone. When it was created, you still needed to carry around a “case quarter” so you could make a call while on the go. When it was created, the only way to enter a virtual space was to have your telephone squawk at another phone somewhere else.
We need to do better. Read on.
When gathering data, we need to also look at wildcards. They may not really be wild cards but they might be enough to shake our biases before they get baked into any digital solution.
I am not writing here about racism, though that can be part of it. Think about historic redlining affecting your ability to get a loan from an automated system. Don’t know what I mean?
Here is IBM realizing that a common bias can become part of virtual reality and artificial intelligence simply because the designers carry unstated assumptions. For example, anchoring bias: the first thing we hear is most likely right. Or C. S. Lewis’s chronological snobbery: the assumption that because we are more modern, we know more than previous generations. Or gender bias: when looking for an image of a “cop” or “pilot” you (wrongly) expect and (wrongly) get men, and when you google “nurse” you (wrongly) expect and (wrongly) get women.
These things can go really wrong when there are medical conditions we stay quiet about, or when we treat the male body as the norm for symptoms. A big one: the symptoms of a heart attack. A friend of mine had a heart attack while out on a hike, and she did not go to the doctor for several hours because her symptoms presented as tiredness and indigestion, which are common presenting symptoms in the female body. Or when our data gap outright ignores groups of people where those differences might matter.
This is happening everywhere, and you can read about it with regard to women in Caroline Criado Perez’s excellent book Invisible Women. You can get around this by purposefully including groups (like women) as part of your research, not only recruiting around roles. Here I would argue that you also need a randomized wildcard that excludes the common traits you are screening for.
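One way to read that “randomized wildcard” idea in practice: when recruiting research participants, reserve a couple of slots for people who do *not* match your screener, chosen at random. This is a minimal sketch, not a definitive method; the participant pool, the `screener` predicate, and all field names here are hypothetical assumptions for illustration.

```python
import random

def pick_participants(pool, screener, n_core, n_wildcards, seed=None):
    """Select n_core participants who match the screener, plus
    n_wildcards drawn at random from those who do NOT match it.
    The wildcards are there to shake out unstated assumptions."""
    rng = random.Random(seed)
    matches = [p for p in pool if screener(p)]
    others = [p for p in pool if not screener(p)]
    return rng.sample(matches, n_core) + rng.sample(others, n_wildcards)

# Hypothetical pool: the study screens for a "manager" role,
# but two slots go to people outside that role entirely.
pool = [{"id": i, "role": "manager" if i % 3 == 0 else "clerk"}
        for i in range(30)]
chosen = pick_participants(pool, lambda p: p["role"] == "manager",
                           n_core=6, n_wildcards=2, seed=42)
```

The point of the design is that the wildcard slots are filled randomly rather than by the researcher’s judgment, so the biases baked into the screener cannot also decide who the exceptions are.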
I am not saying this is the only answer, but we need to do better lest we forget large swaths of humans in our supposedly human-centered solutions.
I’ve updated this post with newer articles.