DSAN 5450: Data Ethics and Policy
Spring 2024, Georgetown University
Wednesday, January 31, 2024
On average, being classified as a White man as opposed to a Coloured man would have more than quadrupled a person’s income. (Pellicer and Ranchhod 2023)
A scary-sounding word that just means:
“What we talk about when we talk about ethics”,
in contrast to
“What we talk about when we talk about [insert particular ethical framework here]”
| Descriptive (Is) | Normative (Ought) |
|---|---|
| Grass is green (true) | Grass ought to be green (?) |
| Grass is blue (false) | Grass ought to be blue (?) |
How did you acquire the concept “red”?
*(Tiny text footnote: Except for, perhaps, a few fun but rare onomatopoetic cases)
How did you acquire the concept “good”?
Jesus said to his disciples, “Truly, I say to you, only with difficulty will a rich person enter the kingdom of heaven. Again I tell you, it is easier for a camel to go through the eye of a needle than for a rich person to enter the kingdom of God.” (Matthew 19:23-24)
Oh, were we loving God worthily, we should have no love at all for money! (St. Augustine 1874, pg. 28)
*(…jumpscare: REIFICATION!)
The earliest capitalists lacked legitimacy in the moral climate in which they found themselves. One of the means they found [to legitimize their behavior] was to appropriate the evaluative vocabulary of Protestantism. (Skinner 2012, pg. 157)
Calvinism added [to Luther’s doctrine] the necessity of proving one’s faith in worldly activity, [replacing] spiritual aristocracy of monks outside of/above the world with spiritual aristocracy of predestined saints within it. (pg. 121).
The immunity Israel has received over the last fifty years encourages others, regimes and oppositions alike, to believe that human and civil rights are irrelevant in the Middle East. The dismantling of the mega-prison in Palestine will send a different, and more hopeful, message.
A Jewish state would not have come into being without the uprooting of 700,000 Palestinians. Therefore it was necessary to uproot them. There was no choice but to expel that population. It was necessary to cleanse the hinterland and cleanse the border areas and cleanse the main roads.
Millions are kept permanently happy, on the one simple condition that a certain lost soul on the far-off edge of things should lead a life of lonely torture (James 1891)
\[ \underbrace{\texttt{A} \prec \texttt{B} \prec \cdots \prec \texttt{Z}}_{\mathclap{\substack{\text{Individual Rights} \\ \text{Basic Goods}}}} \phantom{\prec} \prec \phantom{\prec} \underbrace{\texttt{a} \prec \texttt{b} \prec \cdots \prec \texttt{z}}_{\mathclap{\substack{\text{Distributive Principles} \\ \text{Money and whatnot}}}} \]
\[ \text{Seniors} \prec \text{Juniors} \prec \text{Sophomores} \prec \text{Freshmen} \]
| | | \(B\) | |
|---|---|---|---|
| | | Stop | Drive |
| \(A\) | Stop | \(-1,-1\) | \(-3,\phantom{-}0\) |
| | Drive | \(\phantom{-}0, -3\) | \(-10,-10\) |
| | | \(B\) | |
|---|---|---|---|
| | | Stop | Drive |
| \(A\) | Stop | \({\color{orange}\cancel{\color{black}-1}},{\color{lightblue}\cancel{\color{black}-1}}\) | \(\boxed{-3},\boxed{0}\) |
| | Drive | \(\boxed{0}, \boxed{-3}\) | \({\color{orange}\cancel{\color{black}-10}},{\color{lightblue}\cancel{\color{black}-10}}\) |
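The crossed-out and boxed payoffs above are just best-response elimination, which we can mechanize. A minimal sketch (the payoff values are from the table; the dictionary structure and `is_nash` helper are illustrative, not a particular library's API):

```python
# Payoff matrix for the Chicken game above:
# keys = (A's action, B's action), values = (u_A, u_B)
payoffs = {
    ("Stop", "Stop"):   (-1, -1),
    ("Stop", "Drive"):  (-3,  0),
    ("Drive", "Stop"):  ( 0, -3),
    ("Drive", "Drive"): (-10, -10),
}
actions = ["Stop", "Drive"]

def is_nash(sa, sb):
    """A profile is a (pure) Nash equilibrium iff neither player
    can unilaterally deviate and strictly gain."""
    ua, ub = payoffs[(sa, sb)]
    a_ok = all(payoffs[(alt, sb)][0] <= ua for alt in actions)
    b_ok = all(payoffs[(sa, alt)][1] <= ub for alt in actions)
    return a_ok and b_ok

equilibria = [(sa, sb) for sa in actions for sb in actions if is_nash(sa, sb)]
print(equilibria)  # [('Stop', 'Drive'), ('Drive', 'Stop')]
```

The two boxed profiles, (Stop, Drive) and (Drive, Stop), are exactly the pure-strategy equilibria; the symmetric profiles are both eliminated.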
\[ \begin{align*} \mathbb{E}[u_A] = \mathbb{E}[u_B] &= \int_{0}^{1}\int_{0}^{1}\left(x - 2y - 8xy - 1\right)dy \, dx = -3.5 \\ \underbrace{\mathbb{E}\mkern-3mu\left[u_A + u_B\right]}_{\mathclap{\text{Utilitarian Social Welfare}}} &= -7 \end{align*} \]
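We can sanity-check the integral by Monte Carlo, drawing the players' drive probabilities \(x\) and \(y\) independently from Uniform(0,1) and averaging the integrand (the sample size and seed here are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of E[u_A] when A drives with probability x
# and B drives with probability y, each Uniform(0,1)
rng = np.random.default_rng(5450)
n = 1_000_000
x = rng.random(n)
y = rng.random(n)
eu_a = x - 2*y - 8*x*y - 1   # integrand from the double integral above
print(round(eu_a.mean(), 2))  # ≈ -3.5
```

By symmetry \(\mathbb{E}[u_B]\) is the same, so utilitarian social welfare \(\mathbb{E}[u_A + u_B]\) comes out to \(-7\).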
| | | \(B\) | |
|---|---|---|---|
| | | Stop | Drive |
| \(A\) | Stop | \({\color{orange}\cancel{\color{black}-1}},{\color{lightblue}\cancel{\color{black}-1}}\) | \(\boxed{-3},\boxed{0}\) |
| | Drive | \(\boxed{0}, \boxed{-3}\) | \({\color{orange}\cancel{\color{black}-10}},{\color{lightblue}\cancel{\color{black}-10}}\) |
*(through, for example, traffic laws: equal in theory… In practice? Another story)
\[ \underbrace{p(x)}_{\substack{\text{Accept ethical} \\ \text{framework }x}} \implies \underbrace{q(y)}_{\substack{\text{Algorithms should} \\ \text{satisfy condition }y}} \]
Roughly, approaches to fairness/bias in AI can be categorized as follows:
Ah, la majestueuse égalité des lois, qui interdit au riche comme au pauvre de coucher sous les ponts, de mendier dans les rues et de voler du pain!
(Ah, the majestic equality of the law, which prohibits rich and poor alike from sleeping under bridges, begging in the streets, and stealing loaves of bread!)
Anatole France, Le Lys Rouge (France 1894)
\[ A_i = \begin{cases} 0 &\text{if }i\text{ self-reported ``white''} \\ 1 &\text{if }i\text{ self-reported ``black''} \end{cases} \]
Notice: choice of mapping into \(\{0, 1\}\) here non-arbitrary!
We want our models/criteria to be descriptively but also normatively robust; e.g.:
If (antecedent I hold, though majority in US do not) one believes that ending (much less repairing) centuries of unrelenting white supremacist violence here might require asymmetric race-based policies,
Then our model should allow different normative labels and differential weights on
\[ \begin{align*} \Delta &= (\text{Fairness} \mid A = 1) - (\text{Fairness} \mid A = 0) \\ \nabla &= (\text{Fairness} \mid A = 0) - (\text{Fairness} \mid A = 1) \end{align*} \]
despite the descriptive fact that \(\Delta = -\nabla\).
\[ \Pr(D = 1 \mid A = 0) = \Pr(D = 1 \mid A = 1) \]
\[ D \perp A \iff \Pr(D = d, A = a) = \Pr(D = d)\Pr(A = a) \]
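In code, checking demographic parity amounts to comparing acceptance rates across groups. A minimal sketch on synthetic data (the arrays `d` and `a` are hypothetical; here \(D\) is generated independently of \(A\), so the gap should be near zero):

```python
import numpy as np

# Hypothetical decisions D and group labels A, drawn independently
rng = np.random.default_rng(5450)
n = 10_000
a = rng.integers(0, 2, size=n)   # group membership A
d = rng.integers(0, 2, size=n)   # decisions D, independent of A by construction

# Demographic parity gap: Pr(D = 1 | A = 1) - Pr(D = 1 | A = 0)
rate_a0 = d[a == 0].mean()
rate_a1 = d[a == 1].mean()
print(abs(rate_a1 - rate_a0))  # small, up to sampling noise
```

On real data the gap will rarely be exactly zero, so in practice one tests whether it falls below some tolerance.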
\[ \Pr(D = 1 \mid Y = 0, A = 0) = \Pr(D = 1 \mid Y = 0, A = 1) \]
\[ \Pr(D = 0 \mid Y = 1, A = 0) = \Pr(D = 0 \mid Y = 1, A = 1) \]
\[ \Pr(D = d, A = a \mid Y = y) = \Pr(D = d \mid Y = y)\Pr(A = a \mid Y = y) \]
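Separation can be checked the same way, now conditioning on the true outcome \(Y\): compare false-positive rates and false-negative rates across groups. A sketch on synthetic data (all rates here are hypothetical; decisions depend on \(Y\) only, so both gaps should be near zero):

```python
import numpy as np

rng = np.random.default_rng(5450)
n = 20_000
a = rng.integers(0, 2, size=n)   # group A
y = rng.integers(0, 2, size=n)   # true outcome Y
# Decisions depend on Y alone (accept 80% of Y=1, 10% of Y=0),
# so separation holds by construction
d = (rng.random(n) < np.where(y == 1, 0.8, 0.1)).astype(int)

# FPR parity: Pr(D = 1 | Y = 0, A = a)
fpr0 = d[(y == 0) & (a == 0)].mean()
fpr1 = d[(y == 0) & (a == 1)].mean()
# FNR parity: Pr(D = 0 | Y = 1, A = a)
fnr0 = 1 - d[(y == 1) & (a == 0)].mean()
fnr1 = 1 - d[(y == 1) & (a == 1)].mean()
print(round(abs(fpr1 - fpr0), 3), round(abs(fnr1 - fnr0), 3))
```

Satisfying both conditions at once is the "equalized odds" criterion.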
| | Labeled Low-Risk | Labeled High-Risk |
|---|---|---|
| Didn't Do More Crimes | True Negative | False Positive |
| Did More Crimes | False Negative | True Positive |
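The error rates in the separation criteria above come straight from these four cells. A sketch with hypothetical counts (the numbers are invented for illustration):

```python
# Hypothetical confusion-matrix counts for the recidivism table above
tn, fp, fn, tp = 400, 100, 50, 450

# FPR: Pr(labeled high-risk | didn't do more crimes)
fpr = fp / (fp + tn)
# FNR: Pr(labeled low-risk | did more crimes)
fnr = fn / (fn + tp)
print(fpr, fnr)  # 0.2 0.1
```

Computing these per group \(A\) and comparing them is exactly the equalized-odds check.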
DSAN 5450 Week 3: (Descriptive) Fairness in AI