Cyborg Accountability

Research output: Contribution to conference › Paper › peer-review


Abstract

Innovations in AI – often mundane in application but always profound in implication – suggest that AI is not simply an exogenous shock to be managed within the parameters of public accountability as we understand it now. Rather, AI is becoming ever more deeply embedded as an endogenous feature of administrative work. This means that the measures and standards we apply to the public management of societal challenges are already infused with algorithmic processes. This will have an impact not simply on the kinds of information people use, but on the institutional and social logics through which they imagine their roles.
Answering the question of what accountability means when there are ‘robots on the team’ will require getting to grips with what it is to be accountable in the first place. I argue that informal accountability, which acts as the locus for understanding what more formal accountability mechanisms entail, is the central facet of all accountability, even of the kinds of mechanisms and structures that more formal accountability institutions articulate.
I take a relatively ‘thick’ line on informal accountability relationships here, describing them as a class of ‘plural subjectivity.’ Such relationships, I think, are always rooted in collective intentions and understandings, because intentions and understandings always emerge in group contexts. Accountability is sited in group agency. It is underpinned and informed by the emergence of joint commitments and intentions between actors (following Gilbert, 1989) in the context of how they understand, and thus navigate, their terms of work. Formal accountability mechanisms are themselves formed from joint commitments and understandings. They are team efforts. That said, what it means to adhere to an accountability mechanism – what is understood by being accountable – is itself navigated reflexively. Understandings form within accountability teams, not least in light of how authorities assert themselves.
This thick perspective on informal accountability gives us a route into how AI is becoming, and will continue to become, endogenous to accountability relationships and, from there, to governance more broadly. Seeing accountability as a ‘we’ phenomenon offers a better sense of how algorithmic processes, each possessing a degree of agency, can be ‘part of the team.’
Original language: English
Number of pages: 29
Publication status: Published - 08 Dec 2021
Event: Digital Legal Talks 2021 - Tilburg University/Online, Tilburg, Netherlands
Duration: 08 Dec 2021 - 08 Dec 2021
https://www.sectorplandls.nl/wordpress/digital-legal-talks-2021/

Conference

Conference: Digital Legal Talks 2021
Country/Territory: Netherlands
City: Tilburg
Period: 08/12/2021 - 08/12/2021
