no code implementations • 27 May 2024 • Armin Moradi, Nicola Neophytou, Golnoosh Farnadi
In this work, we identify these biases in recommendations for artists from underrepresented cultural groups in prototype-based matrix factorization methods.
no code implementations • 23 May 2022 • Sikha Pentyala, Nicola Neophytou, Anderson Nascimento, Martine De Cock, Golnoosh Farnadi
Group fairness ensures that the outcomes of machine learning (ML)-based decision-making systems are not biased against a certain group of people defined by a sensitive attribute such as gender or ethnicity.
no code implementations • 15 Oct 2021 • Nicola Neophytou, Bhaskar Mitra, Catherine Stinson
We find statistically significant differences in recommender performance by both age and gender.