How to make XAI more human | Feature importance case study

So, you want to explain to your customers how a specific algorithm output was produced: why they got that recommendation, or what they can do to improve their chances of getting a loan approved. But they do not understand your eXplainable Artificial Intelligence (XAI) results. In this article, I show you how to design explanations so that users understand them better.

Note: The Boston Housing dataset is used for this case study.

Case study background

Let’s assume you own a real estate company that makes use of machine learning methods. One of the things you do is provide home valuation information for your clients.

Client problem

A client receives their property price prediction, which happens to be much lower than they had anticipated. They ask you a few questions:

  • How did you determine this value?
  • How can I improve my house value?

Client solution: Feature importance method

You used the popular SHAP method to determine which features were significant in the client's predicted value.

Image from mljar 
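
For context, here is a minimal sketch of how such SHAP values might be produced. The file name boston.csv is a hypothetical local copy of the dataset, and the random forest stands in for whatever model the company actually uses.

```python
# A minimal sketch of computing SHAP feature importances.
# Assumes the Boston Housing data is available locally as "boston.csv"
# (a hypothetical path) with the 13 feature columns and a MEDV target column.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

data = pd.read_csv("boston.csv")           # hypothetical local copy of the dataset
X = data.drop(columns=["MEDV"])            # the 13 features (CRIM, ZN, ..., LSTAT)
y = data["MEDV"]                           # median home value, the prediction target

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)     # shape: (n_samples, n_features)

# Summary plot like the one shown above: global feature importance.
shap.summary_plot(shap_values, X)
```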

You also explain each feature to the client as follows:


1. CRIM – per capita crime rate by town.
2. ZN – proportion of residential land zoned for lots over 25,000 sq. ft.
3. INDUS – proportion of non-retail business acres per town.
4. CHAS – Charles River dummy variable.
5. NOX – nitric oxides concentration.
6. RM – average number of rooms per dwelling.
7. AGE – proportion of owner-occupied units built prior to 1940.
8. DIS – weighted distances to five Boston employment centres.
9. RAD – index of accessibility to radial highways.
10. TAX – full-value property-tax rate.
11. PTRATIO – pupil-teacher ratio by town.
12. B – proportion of Black residents by town.
13. LSTAT – proportion of the population that is lower status.
A full explanation of the dataset can be found in Harrison and Rubinfeld's article.

“But what is this Charles River dummy variable?” the client asks.

Making the results more human

In previous articles, I showed that people are more likely to trust what you tell them if they understand it. For example, in the article How to enhance customer trust: Understanding how humans think, I explain how neuroscience research shows that the human brain cannot cope with disorder.

Here is how the data should be displayed to serve the needs of the user:

Solution designed by Somila in Adobe XD

New solution described

There are a few things we can learn from psychology and human-computer interaction:

1. The client wants to know why the valuation is low relative to the value they had in mind.

In his paper, Lipton describes how people think: in contrasts.

Although someone may ask, “Why was my loan rejected?” what they usually mean is, “Why was it rejected rather than accepted? In other words, what could I have done to get the loan accepted?”

How to make XAI more human, then?

Rather than showing all the contributing features, which could lead to confusion, show only the significant ones.

For example, suppose a person applied for a loan and was rejected even though their age was a favourable feature. Telling the client that their age was favourable adds little value. Instead, tell them that their income affected the loan negatively; that way, they know what to improve. The sketch below shows one way to select such features from SHAP values.
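
Continuing the earlier snippet, here is a minimal sketch of how the features that hurt a single prediction might be selected. The row index client_idx and the cutoff of three features are illustrative assumptions, and the dollar conversion assumes the target is, like MEDV, expressed in thousands of dollars.

```python
# For one client's property, keep only the features that pushed the
# prediction down the most, rather than all 13.
import numpy as np

client_idx = 0                          # hypothetical row for this client
client_shap = shap_values[client_idx]   # SHAP value per feature for this prediction

# Features with negative SHAP values lowered the valuation; these answer
# "why is my valuation low rather than high?"
order = np.argsort(client_shap)         # most negative contributions first
top_negative = [
    (X.columns[i], client_shap[i])
    for i in order[:3]                  # show e.g. the three biggest detractors
    if client_shap[i] < 0
]

for name, value in top_negative:
    # MEDV is in $1000s, so SHAP contributions are too.
    print(f"{name} lowered the predicted value by ${abs(value) * 1000:,.0f}")
```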

2. Select only the features relevant to the client, rather than showing them everything.

Showing the user everything can lead to information overload or, worse yet, confusion. The Interaction Design Foundation explains the dangers of information overload for users. Ultimately, it can decrease customer satisfaction and trust.

How to make XAI more human, then?

Display only the features that are significant to the user, as in the example and sketch under point 1.

3. People tend to understand, and believe, what is consistent with their prior beliefs.

Confirmation bias teaches us that people tend to look for, and believe, things that are in line with what they already believe. This means that they also have a tendency to dismiss, or doubt, that which is contrary.

How to make XAI more human, then?

Indicate the factors that are consistent with what your clients already know. For example, it is common knowledge that bigger houses tend to be worth more money. Therefore, when the client sees the “Rooms” feature included in the list, they are unlikely to dismiss it. The sketch below shows one way to phrase the selected features in the client's own terms.
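
Building on the sketches above, one way to do this is to translate the selected feature codes into terms the client already uses. The PLAIN_LANGUAGE wording below is illustrative, not taken from the dataset documentation.

```python
# Sketch: translate feature codes into plain language the client already
# understands, so the explanation aligns with their prior beliefs.
# The wording in this mapping is an illustrative assumption.
PLAIN_LANGUAGE = {
    "RM": "the number of rooms in your house",
    "LSTAT": "the income level of the surrounding area",
    "CRIM": "the crime rate in your town",
    "PTRATIO": "the pupil-teacher ratio at nearby schools",
}

def explain(top_negative):
    """Build a client-facing sentence from (feature, shap_value) pairs."""
    reasons = [PLAIN_LANGUAGE.get(name, name) for name, _ in top_negative]
    return (
        "One or more of these factors is affecting your valuation: "
        + "; ".join(reasons) + "."
    )

print(explain(top_negative))
```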

Lastly

The phrase “One or more of these factors is affecting your valuation” is used in the opening sentence to give the user the freedom to decide whether both factors apply to them or just one. The user may have preconceived beliefs about the area that make them disagree that it is a low-income area, for example; in that case, they still would not disregard the explanation altogether, because they can see that they do not have many bedrooms on the property.

Conclusion

When writing an explanation for humans, consider what information matters to them, rather than simply presenting the raw results of whatever feature importance method you used. Doing the latter may leave the user confused and distrustful.