Reviewing the Need for Explainable Artificial Intelligence (xAI)

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

The diffusion of artificial intelligence (AI) applications in organizations and society has fueled research on explaining AI decisions. The explainable AI (xAI) field is rapidly expanding, with numerous ways of extracting information from and visualizing the output of AI technologies (e.g., deep neural networks). Yet, we have a limited understanding of how xAI research addresses the need for explainable AI. We conduct a systematic review of the xAI literature and identify four thematic debates central to how xAI addresses the black-box problem. Based on this critical analysis of the xAI scholarship, we synthesize the findings into a future research agenda to further the xAI body of knowledge.
Original language: English
Title of host publication: Proceedings of the 54th Hawaii International Conference on System Sciences
Number of pages: 10
Place of publication: Hawaii
Publisher: Hawaii International Conference on System Sciences (HICSS)
Publication date: 2021
Pages: 1284-1293
ISBN (Electronic): 9780998133140
Publication status: Published - 2021
Event: The 54th Hawaii International Conference on System Sciences, HICSS 2021 - Online, United States
Duration: 5 Jan 2021 - 8 Jan 2021
Conference number: 54
https://www.insna.org/events/54th-hawaii-international-conference-on-system-sciences-hicss

Conference

Conference: The 54th Hawaii International Conference on System Sciences, HICSS 2021
Number: 54
Location: Online
Country: United States
Period: 05/01/2021 - 08/01/2021
Series: Proceedings of the Annual Hawaii International Conference on System Sciences
ISSN: 1060-3425
