Explanation is a concept in an AI-induced crisis, they say. But badly explained pandemic politics illustrate how its core values have never been safe. Can we use the momentum?
Abstract
Duties to explain decisions to individuals exist in laws and other types of regulation. Rules are stricter where explainees' dependencies are greater and unequal powers add weight to the information imbalance. Of late, decision makers' knowledge-ability of decision support technology is said to be decreasing to critical levels. Machine conclusions seem unreasonable, and dignitarian concerns are raised: humane treatment is said to depend on the ability to explain, especially in sensitive contexts.
If this is true, why haven’t our most fundamental laws prevented this corrosion? Covert ‘algorithmic harms’ to groups and individuals were exposed in environments where explanation was regulated, and in sensitive contexts. Fundamental unsafety still slipped in, with highly disparate impact.
Insights from the research fields of epistemic (in)justice help us understand how this happens.
When the social dynamics of knowledge practices go unchecked, epistemic authority easily becomes a function of other powers, and patterns of marginalization appear. ‘Other’ people’s knowledge, capacities, and participation are wrongly excluded, dismissed, and misused.
Wrongful knowledge is made, and harms play out on individual and collective levels.
Core values of explanation promote the ability to recognize when, what, and whom to trust and distrust with regard to what is professed. Democratic societies depend heavily on this capability. It is true that current challenges to these values are not sufficiently met by regulation, but this problem does not follow from technological developments; it precedes them.
When the Corona crisis hit, national authorities based decisions with fundamental impact on people’s lives on real-time knowledge making. Many professed to build on expert advice, science, and technology, yet still asked to be trusted on the basis of their political authority. Critical choices with regard to expertise and experts remained unexplained, and concepts went unreasoned.
Whose jobs are crucial, who is vulnerable, what does prioritizing health and safety mean?
Patterns of marginalization appeared, and policy measures showed disparate impact.
In times of crisis, leaning on authority rather than honest explanation and diverse knowledge co-creation is a recurring pattern. This contribution argues for using this dual momentum to assess and reinforce our explanation regulation. If we truly want that regulation to express the fundamental importance of explanation, insights from the fields of epistemic (in)justice should lead the way. To support such efforts, this contribution presents a working model of explanation as a type of interactive, testimonial practice.
| Original language | English |
| --- | --- |
| Publication status | Published - 2021 |
| Event | TILTing Perspectives: Regulating in Times of Crisis - TILT, Tilburg University (online), Tilburg, Netherlands. Duration: 19 May 2021 → 21 May 2021. https://www.tilburguniversity.edu/research/institutes-and-research-groups/tilt/events/tilting-perspectives |
Conference
| Conference | TILTing Perspectives |
| --- | --- |
| Country/Territory | Netherlands |
| City | Tilburg |
| Period | 19/05/21 → 21/05/21 |
| Internet address | https://www.tilburguniversity.edu/research/institutes-and-research-groups/tilt/events/tilting-perspectives |
Activities
- 1 Oral presentation
  - Explanations (are) in crises
    de Groot, A. (Speaker), 21 May 2021
    Activity: Talk or presentation types › Oral presentation › Scientific