
Automation of Explainability Auditing for Image Recognition

Author(s): Duleep Rathgamage Don (Kennesaw State University, USA), Jonathan Boardman (Kennesaw State University, USA), Sudhashree Sayenju (Kennesaw State University, USA), Ramazan Aygun (Kennesaw State University, USA), Yifan Zhang (Kennesaw State University, USA), Bill Franks (Kennesaw State University, USA), Sereres Johnston (The Travelers Companies, Inc., USA), George Lee (The Travelers Companies, Inc., USA), Dan Sullivan (The Travelers Companies, Inc., USA) and Girish Modgil (The Travelers Companies, Inc., USA)
Copyright: 2023
Volume: 14
Issue: 1
Pages: 17
Source title: International Journal of Multimedia Data Engineering and Management (IJMDEM)
Editor(s)-in-Chief: Chengcui Zhang (University of Alabama at Birmingham, USA) and Shu-Ching Chen (University of Missouri-Kansas City, United States)
DOI: 10.4018/IJMDEM.332882


Abstract

Explainable AI (XAI) requires artificial intelligence systems to provide explanations for their decisions and actions so that they can be reviewed. However, in big data systems where decisions are made frequently, it is technically impossible to have an expert monitor every decision. To solve this problem, the authors propose an explainability auditing method for image recognition that checks whether the explanations are relevant to the decision made by a black-box model and involves an expert only when the explanations are doubtful. The explainability auditing system classifies explanations as weak or satisfactory using a local explainability model that analyzes the image segments that influenced the decision. This version of the proposed method uses LIME to generate the local explanations as superpixels. A bag of image patches is then extracted from the superpixels to determine their texture and evaluate the local explanations. Using a rooftop image dataset, the authors show that the proposed method detects 95.7% of the cases that need to be audited.
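The abstract describes a pipeline of LIME superpixel explanations, patch extraction, and a texture-based weak/satisfactory audit. The Python sketch below illustrates that flow under stated assumptions: a dummy classifier stands in for the black-box model, a simple patch-variance rule stands in for the paper's audit classifier, and the function names, patch size, and threshold are illustrative, not the authors' implementation.

# Minimal sketch of the auditing flow outlined in the abstract, assuming the
# numpy and lime packages are installed. The dummy classifier, patch size,
# and variance threshold below are illustrative stand-ins only.
import numpy as np
from lime import lime_image


def dummy_classifier(batch):
    """Stand-in for the black-box image model: returns fake two-class scores."""
    brightness = batch.mean(axis=(1, 2, 3)) / 255.0
    p1 = np.clip(brightness, 0.0, 1.0)
    return np.stack([1.0 - p1, p1], axis=1)


def extract_patches(image, mask, size=16):
    """Bag of size x size patches whose centers fall inside the LIME superpixel mask."""
    patches = []
    h, w = mask.shape
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            if mask[y + size // 2, x + size // 2]:
                patches.append(image[y:y + size, x:x + size])
    return patches


def audit_explanation(patches, variance_threshold=50.0):
    """Toy audit rule: low-texture (low-variance) patches suggest a weak explanation."""
    if not patches:
        return "weak"
    mean_variance = float(np.mean([p.var() for p in patches]))
    return "satisfactory" if mean_variance >= variance_threshold else "weak"


if __name__ == "__main__":
    # Placeholder image standing in for a rooftop photo from the dataset.
    image = (np.random.rand(224, 224, 3) * 255).astype(np.uint8)

    # LIME generates a local explanation as a set of contributing superpixels.
    explainer = lime_image.LimeImageExplainer()
    explanation = explainer.explain_instance(
        image, dummy_classifier, top_labels=1, hide_color=0, num_samples=200
    )
    _, mask = explanation.get_image_and_mask(
        explanation.top_labels[0], positive_only=True, num_features=5, hide_rest=False
    )

    # Extract patches from the highlighted superpixels and audit their texture.
    bag = extract_patches(image, mask.astype(bool))
    print(f"{len(bag)} patches extracted; audit verdict: {audit_explanation(bag)}")

In the paper's workflow, an expert would be consulted only for explanations the audit flags as weak; the variance rule here merely marks where that decision would plug in.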

Related Content

Yasasi Abeysinghe, Bhanuka Mahanama, Gavindya Jayawardena, Yasith Jayawardana, Mohan Sunkara, Andrew T. Duchowski, Vikas Ashok, Sampath Jayarathna. © 2024. 20 pages.
Chengxuan Huang, Evan Brock, Dalei Wu, Yu Liang. © 2023. 23 pages.
Wei-An Teng, Su-Ling Yeh, Homer H. Chen. © 2023. 17 pages.
Hemanth Gudaparthi, Prudhviraj Naidu, Nan Niu. © 2022. 20 pages.
Anchen Sun, Yudong Tao, Mei-Ling Shyu, Angela Blizzard, William Andrew Rothenberg, Dainelys Garcia, Jason F. Jent. © 2022. 19 pages.
Suvojit Acharjee, Sheli Sinha Chaudhuri. © 2022. 16 pages.