Unfairness as complex system failure

In this chapter, you have been equipped with an arsenal of technical tools to make machine learning models fairer. However, a model does not operate in a vacuum. Models are embedded in complex socio-technical systems: there are humans who develop and monitor the model, source the data, and decide what to do with the model's output, and there are other machines that produce the data or consume the model's predictions. Different players might try to game the system in different ways.

Unfairness is equally complex. We've already discussed the two general definitions of unfairness, disparate impact and disparate treatment. Disparate treatment can occur against any combination of features (age, gender, race, nationality, income, and so on), often in complex and non-linear ways. This section draws on Richard Cook's 1998 paper, How Complex Systems Fail, available at https://web.mit.edu/2.75/resources/random/How%20Complex%20Systems%20Fail.pdf, and applies its observations to how complex machine learning-driven systems fail to be fair. Cook lists 18 points, some of which are discussed in the following sections.
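
To make the "any combination of features" point concrete, the following is a minimal sketch of how disparate impact can hide in intersections of features even when every single feature looks fine on its own. The pandas DataFrame, its column names (gender, age_group, approved), and the 80% rule of thumb are illustrative assumptions, not part of Cook's paper or a legal standard:

import pandas as pd

# Made-up loan decisions with two protected attributes.
df = pd.DataFrame({
    "gender":    ["f", "f", "f", "f", "m", "m", "m", "m"],
    "age_group": ["young", "young", "old", "old", "young", "young", "old", "old"],
    "approved":  [0, 1, 1, 1, 1, 1, 1, 0],
})

# Looked at one feature at a time, nothing stands out: every gender and every
# age group is approved 75% of the time in this toy data.
print(df.groupby("gender")["approved"].mean())
print(df.groupby("age_group")["approved"].mean())

# Looked at in combination, young women and older men are approved only half
# as often as the other groups.
rates = df.groupby(["gender", "age_group"])["approved"].mean()

# Disparate impact ratio of each group relative to the best-treated group;
# the 0.8 cut-off is a common rule of thumb, not a hard legal threshold.
ratios = rates / rates.max()
print(ratios[ratios < 0.8])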

Complex systems are intrinsically hazardous systems

Systems are usually complex because they are hazardous, and many safeguards have been created because of that fact. The financial system is a hazardous system; if it goes off the rails, it can break the economy or ruin people's lives. Thus, many regulations have been created and many players in the market work to make the system safer.

Since the financial system is so hazardous, it is important to make sure it is safe against unfairness, too. Luckily, there are a number of safeguards in place to keep the system fair. Naturally, these safeguards can break, and they do so constantly in a number of small ways.

Catastrophes are caused by multiple failures

In a complex system, no single point of failure can cause a catastrophe, because there are many safeguards in place. Catastrophic failure usually results from multiple failures coming together. In the 2008 financial crisis, for example, banks created risky products, but regulators failed to stop them.

For widespread discrimination to happen, not only does the model have to make unfair predictions, but employees must also blindly follow the model and criticism must be suppressed. On the flip side, just fixing your model will not magically keep all unfairness away. The procedures and culture inside and outside the firm can cause discrimination even with a fair model.

Complex systems run in degraded mode

In most accident reports, there is a section that lists "proto-accidents": instances in the past where the same accident nearly happened but was averted. The model might have made erratic predictions before, for example, but a human operator stepped in.

It is important to know that in a complex system, failures that could lead to catastrophe occur all the time. The complexity of the system makes it prone to error, but the heavy safeguards against catastrophe keep those errors from turning into disasters. Once these safeguards fail, however, catastrophe is right around the corner. Even if your system seems to run smoothly, check for proto-accidents and strange behavior before it is too late.
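
One way to look for such proto-accidents, sketched below under the assumption that you log both the model's recommendation and the operator's final decision (the schema, column names, and the 1.5x alert factor are made up for illustration), is to monitor how often operators have to override the model for each group:

import pandas as pd

# Assumed decision log: the model's recommendation, the operator's final
# decision, and a protected group label.
log = pd.DataFrame({
    "group":      ["a", "a", "a", "b", "b", "b", "b", "b"],
    "model_said": [1, 0, 1, 0, 1, 1, 0, 1],
    "human_did":  [1, 0, 1, 1, 0, 0, 1, 0],
})

# Every human override is a proto-accident candidate: the safeguard (the
# operator) disagreed with the model and possibly caught an unfair decision.
log["override"] = (log["model_said"] != log["human_did"]).astype(int)

# Override rate per group versus the overall rate. A group that needs far more
# corrections than average is a warning sign, even before any complaint is filed.
per_group = log.groupby("group")["override"].mean()
overall = log["override"].mean()
print(per_group[per_group > 1.5 * overall])  # 1.5 is an arbitrary alert factor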

Human operators both cause and prevent accidents

Once things have gone wrong, blame is often placed on the human operators who "must have known" that their behavior would "inevitably" lead to an accident. On the other hand, it is usually humans who step in at the last minute to prevent accidents from happening. Counterintuitively, it is rarely one human and one action that cause an accident, but rather the behavior of many humans over many actions. For a model to be fair, the entire team has to work to keep it fair.

Accident-free operation requires experience with failure

When it comes to fairness, the single biggest problem is often that the designers of a system never experience it discriminating against them. It is therefore important to bring the insights of a diverse group of people into the development process. Since your system constantly fails in small ways, you should capture what you learn from those small failures before bigger accidents happen.
