Article (Scientific journals)
Astraea: Grammar-based fairness testing
SOREMEKUN, Ezekiel; Udeshi, Sakshi Sunil; Chattopadhyay, Sudipta
2022, in IEEE Transactions on Software Engineering
Peer reviewed
 

Documents


Full text
Astrea.pdf
Publisher postprint (1.53 MB)





Details



Keywords:
software fairness; machine learning; software testing
Abstract:
Software often produces biased outputs. In particular, machine learning (ML) based software is known to produce erroneous predictions when processing discriminatory inputs. Such unfair program behavior can be caused by societal bias. In the last few years, Amazon, Microsoft and Google have provided software services that produce unfair outputs, mostly due to societal bias (e.g. gender or race). In such events, developers are saddled with the task of conducting fairness testing. Fairness testing is challenging; developers are tasked with generating discriminatory inputs that reveal and explain biases. We propose a grammar-based fairness testing approach (called ASTRAEA) which leverages context-free grammars to generate discriminatory inputs that reveal fairness violations in software systems. Using probabilistic grammars, ASTRAEA also provides fault diagnosis by isolating the cause of observed software bias. ASTRAEA's diagnoses facilitate the improvement of ML fairness. ASTRAEA was evaluated on 18 software systems that provide three major natural language processing (NLP) services. In our evaluation, ASTRAEA generated fairness violations at a rate of about 18%. ASTRAEA generated over 573K discriminatory test cases and found over 102K fairness violations. Furthermore, ASTRAEA improves software fairness by about 76% via model retraining, on average.
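The workflow the abstract describes (derive input pairs from a grammar that differ only in a sensitive attribute, then compare the service's outputs) can be sketched in miniature. This is an illustrative toy, not ASTRAEA's actual grammar, tooling, or subject systems; the grammar rules, the `biased_sentiment` stand-in classifier, and all names below are invented for the example.

```python
import random

# Toy context-free grammar in the spirit of grammar-based test generation.
# Non-terminals map to lists of productions; <name> is the sensitive token.
GRAMMAR = {
    "<sentence>": [["<name>", "is", "<adjective>", "at", "<activity>"]],
    "<name>": [["John"], ["Mary"], ["Ahmed"], ["Mei"]],
    "<adjective>": [["brilliant"], ["terrible"]],
    "<activity>": [["math"], ["writing"]],
}

def generate(symbol="<sentence>", choices=None):
    """Expand a symbol; `choices` pins specific productions so two
    generated inputs can share everything except the sensitive token."""
    if symbol not in GRAMMAR:
        return symbol
    production = choices[symbol] if choices and symbol in choices \
        else random.choice(GRAMMAR[symbol])
    return " ".join(generate(tok, choices) for tok in production)

def fairness_violation(classifier, names=("John", "Mary")):
    """Generate two inputs differing only in the sensitive token and
    report whether the classifier treats them differently."""
    fixed = {"<adjective>": random.choice(GRAMMAR["<adjective>"]),
             "<activity>": random.choice(GRAMMAR["<activity>"])}
    a = generate(choices={**fixed, "<name>": [names[0]]})
    b = generate(choices={**fixed, "<name>": [names[1]]})
    return classifier(a) != classifier(b), (a, b)

# Hypothetical biased classifier standing in for an NLP service under test.
def biased_sentiment(text):
    if "Mary" in text and "math" in text:
        return "negative"
    return "positive" if "brilliant" in text else "negative"

violated, pair = fairness_violation(biased_sentiment)
```

Repeating `fairness_violation` many times yields a violation rate, analogous to the 18% rate the abstract reports; the paper additionally uses production probabilities to isolate which grammar rules cause the observed bias.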
Disciplines:
Computer science
Author, co-author:
SOREMEKUN, Ezekiel; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
Udeshi, Sakshi Sunil; Singapore University of Technology and Design, Singapore
Chattopadhyay, Sudipta; Singapore University of Technology and Design, Singapore
These authors contributed equally to this publication.
External co-authors:
Yes
Document language:
English
Title:
Astraea: Grammar-based fairness testing
Publication date:
2022
Journal title:
IEEE Transactions on Software Engineering
Publisher:
IEEE
Peer reviewed:
Yes
Focus Area :
Security, Reliability and Trust
Available on ORBilu:
since 16 January 2023

Statistics


Number of views
140 (including 1 from Unilu)
Number of downloads
85 (including 1 from Unilu)

Scopus® citations: 23
Scopus® citations (excluding self-citations): 21
OpenAlex citations: 6
WoS citations: 17
