Our results show that women’s contributions tend to be accepted more often than men’s [when their gender is hidden]. However, when a woman’s gender is identifiable, her contributions are rejected more often. Our results suggest that although women on GitHub may be more competent overall, bias against them exists nonetheless.
Their link wasn’t to the paper but to a license meant to poison any AIs training on our posts. Idk if that actually does anything, though.