Measuring Hidden Bias within Face Recognition via Racial Phenotypes
Paper - Poster - Video - Dataset
Recent work reports disparate performance across intersectional racial groups for the core face recognition tasks: face verification and identification. However, the definition of those racial groups has a significant impact on the underlying findings of such racial bias analyses. Previous studies define these groups based on either demographic information (e.g. African, Asian) or skin tone (e.g. lighter or darker skin). The use of such sensitive or broad group definitions has disadvantages for bias investigation and for the subsequent design of counter-bias solutions. By contrast, this study introduces an alternative racial bias analysis methodology for face recognition based on facial phenotype attributes. We use the set of observable characteristics of an individual face, such that a race-related facial phenotype is specific to the human face and correlated with the racial profile of the subject. We propose categorical test cases to investigate the individual influence of those attributes on bias within face recognition tasks. We compare our phenotype-based grouping methodology with previous grouping strategies and show that phenotype-based groupings uncover hidden bias without relying on potentially protected attributes or ill-defined grouping strategies. Furthermore, we contribute corresponding phenotype attribute category labels for two face recognition tasks: RFW for face verification and the VGGFace2 test set for face identification.
We propose using race-related facial (phenotype) characteristics within face recognition to investigate racial bias, categorising representative racial characteristics of the face and exploring the impact of each phenotype attribute: skin type, eyelid type, nose shape, lip shape, hair colour and hair type.
Facial phenotype attributes and their categorisation.
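To make this categorisation concrete, the sketch below encodes the six phenotype attributes as a Python mapping. The category labels are assumptions approximating the paper's taxonomy (e.g. Fitzpatrick-inspired skin types 1-6) and should be checked against the released annotation files.

```python
# Hypothetical sketch of the phenotype attribute taxonomy. Category labels
# approximate the paper's categorisation and should be verified against the
# released annotation files before use.
PHENOTYPE_ATTRIBUTES = {
    "skin_type":   ["type_1", "type_2", "type_3", "type_4", "type_5", "type_6"],
    "eyelid_type": ["monolid", "other"],
    "nose_shape":  ["wide", "narrow"],
    "lip_shape":   ["full", "small"],
    "hair_type":   ["straight", "wavy", "curly", "bald"],
    "hair_colour": ["red", "blonde", "brown", "black", "grey"],
}

def is_valid_annotation(attribute: str, category: str) -> bool:
    """Return True if (attribute, category) belongs to the taxonomy."""
    return category in PHENOTYPE_ATTRIBUTES.get(attribute, [])
```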
To start working with this project, you will need to take the following steps:
- Install the required Python packages:

  ```
  conda env create --file environment.yaml
  ```
- For face verification, download the RFW dataset; for face identification, download the VGGFace2 test set.
- Download the pre-trained models and annotations from here. After downloading, place model.ckpt under the models/ folder and the FDA files under the test_assets/ folder (the expected layout is sketched after this list).
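After these steps, the working directory should look roughly as follows. This layout is inferred from the commands below, so exact names may differ from the downloaded bundle:

```
models/
└── setup1_model/          # pre-trained model checkpoint(s), e.g. model.ckpt
test_assets/
├── AttributePairs/        # per-attribute verification pair lists
└── AttributeCrossPairs/   # cross-attribute pair lists
datasets/                  # RFW / VGGFace2 data and aligned outputs
```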
To reproduce the performance reported in the paper, first align all images to 112x112:
```
python face_alignment.py --dataset_name RFW --data_dir datasets/test/data/African/ --output_dir datasets/test_aligned/African --landmark_file datasets/test/txts/African/African_lmk.txt
python face_alignment.py --dataset_name VGGFace2 --data_dir datasets/VGGFace2/ --output_dir datasets/test_aligned/VGGFace2_aligned --landmark_file datasets/VGGFace2/bb_landmark/loose_bb_test.csv
```
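The RFW command above aligns a single subset; RFW ships separate image folders and landmark files for its African, Asian, Caucasian and Indian subsets, so alignment is typically repeated per subset. A minimal Python sketch of that loop, assuming the directory layout used in the command above:

```python
import subprocess

# RFW provides one image folder and landmark file per subset; the paths
# below follow the example alignment command and may need adjusting.
RFW_SUBSETS = ["African", "Asian", "Caucasian", "Indian"]

for subset in RFW_SUBSETS:
    subprocess.run(
        [
            "python", "face_alignment.py",
            "--dataset_name", "RFW",
            "--data_dir", f"datasets/test/data/{subset}/",
            "--output_dir", f"datasets/test_aligned/{subset}",
            "--landmark_file", f"datasets/test/txts/{subset}/{subset}_lmk.txt",
        ],
        check=True,
    )
```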
Next, run attribute-based face verification on the aligned images, one attribute pair file at a time:

```
python face_atribute_verification.py --data_dir datasets/test_aligned/ --model_dir models/setup1_model/model --pair_file test_assets/AttributePairs/setup1/skintype_type1_6000.csv --batch_size 32
```
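For context, attribute-based pair verification reduces to thresholding a distance between the embeddings of each pair's two faces. The sketch below is not the script's implementation; it is a minimal illustration of that evaluation, assuming precomputed L2-normalised embeddings and binary same-identity labels:

```python
import numpy as np

def best_verification_accuracy(emb1, emb2, labels, num_thresholds=400):
    """Best pair-verification accuracy over a sweep of cosine-distance thresholds.

    emb1, emb2: (N, D) L2-normalised embeddings of each pair's two faces.
    labels:     (N,) booleans, True where the pair shares an identity.
    """
    dists = 1.0 - np.sum(emb1 * emb2, axis=1)       # cosine distance in [0, 2]
    best = 0.0
    for t in np.linspace(0.0, 2.0, num_thresholds):
        acc = np.mean((dists < t) == labels)        # accept pairs below threshold t
        best = max(best, acc)
    return best
```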
Finally, run cross-attribute verification on the resulting pair predictions:

```
python face_cross_atribute_verification.py --input_predictions test_assets/AttributeCrossPairs/skintype_type2.csv --dist_name 'vgg_dist' --output_path test_assets/AttributeCrossPairs
```
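As a rough illustration of what can be done with the resulting predictions file, the hypothetical sketch below computes a false-match rate per attribute-category pairing from the distance column. Only the 'vgg_dist' column name is taken from the command above; the 'category_pair' column and the threshold value are assumptions about the CSV layout:

```python
import pandas as pd

# Hypothetical sketch: false-match rate per attribute-category pairing.
# 'vgg_dist' comes from the command above; 'category_pair' and the
# threshold are assumed, not taken from the repository.
df = pd.read_csv("test_assets/AttributeCrossPairs/skintype_type2.csv")
threshold = 1.0                                  # illustrative accept threshold
df["false_match"] = df["vgg_dist"] < threshold   # cross-category pairs are non-matches
print(df.groupby("category_pair")["false_match"].mean())
```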
The distribution of race-relevant phenotype attributes across the RFW and VGGFace2 test datasets.
If you make use of this work in any way (including our pre-trained models or datasets), please reference the following article in any report, publication, presentation, software release or other associated materials:
```bibtex
@InProceedings{yucermeasuring,
  author    = {Yucer, S. and Tektas, F. and Al Moubayed, N. and Breckon, T.P.},
  title     = {Measuring Hidden Bias within Face Recognition via Racial Phenotypes},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  year      = {2022},
  publisher = {IEEE},
  arxiv     = {https://arxiv.org/abs/2110.09839},
}
```