Since the advent of online dating applications, Tinder has been among the most successful and is used widely across the world. With an estimated 7.86 million users across 190 countries (as of September 2019), the company generates innumerable matches every day. Tinder's algorithms play a crucial role in how individuals match with each other, and thus in shaping people's dating preferences. Even though the company has never officially revealed how it handles the immense amount of data it collects every day, researchers of online dating apps believe that Tinder's algorithms mirror societal practices and preferences, and could potentially reinforce already existing racial biases.
Tinder's algorithms have been criticized for being biased. The interference and influence of algorithms on individuals' dating choices have both a practical and a moral dimension. On the one hand, from a practical perspective, Tinder's algorithms are supposed to "know" individuals' preferences best and to show users the profiles they are most likely to match with; algorithms are expected to be rational and objective, and therefore to provide the best suggestions for potential dating partners. On the other hand, humans are spontaneous and sometimes irrational, especially when it comes to dating, even though most of the time our actions follow a certain pattern. Researchers have therefore argued over whether algorithmic interference in dating apps is ethical. The biases of Tinder's algorithms are the most debated issue: long-time users have noticed that the app seems to suggest people of the same race, or people who share similar hobbies, which could reinforce social and racial prejudices that already exist in our society (Sharma, 2016; Hutson, Taft, Barocas and Levy, 2018). In addition, "although partner preferences are extremely personal, it is argued that culture shapes our preferences, and dating apps influence our decisions" (Lefkowitz, 2018). The influence of algorithms could either be beneficial to dating or negatively affect users' swiping experience while amplifying racial and social biases.
Algorithms are created, and their data selected, by computer scientists, so there is a risk that those algorithms reflect their creators' thinking and beliefs. Because our society is biased towards certain cultures and races, algorithms on Tinder and other online social platforms reflect those biases as well. Tinder's algorithms should be questioned and analyzed more deeply as "a key feature of the cultural forms emerging in their shadows" (Anderson, 2011; Striphas, 2010). The sociological dimension of algorithm studies is relevant here, since it calls into question the objectivity and impartiality of the collected data.
In our research, we created two identical female profiles with the same description, the same age, and a similar pose. The only differences were their names (so that it would not look suspicious if the same men liked both profiles) and the fact that one woman is Black and the other is white. Through continuous data collection, the goal was to understand whether race affects individuals' attractiveness and to what extent racial biases are present in Tinder's algorithms. A study released by OkCupid in 2014 confirmed that the racial bias present in contemporary society also appears on online dating apps, where it affects people's dating preferences and swiping behavior: Black women and Asian men, who are already marginalized in society, are additionally discriminated against in online dating environments (Sharma, 2016). Moreover, if a user has had several Caucasian matches in the past, the algorithm will suggest that individual more Caucasian profiles as "good matches" in the future, rather than showing profiles of different backgrounds or origins (Lefkowitz, 2018). This system is harmful to societal norms, because "if past users made discriminatory decisions, the algorithm will continue on the same, biased trajectory" (Hutson, Taft, Barocas and Levy, 2018).
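The self-reinforcing trajectory described by Hutson et al. can be illustrated with a toy recommender. Everything below is an illustrative assumption, not Tinder's actual method: the group labels, the match history, and the scoring rule (rank candidates by how often their group appears among past matches) are invented for the sketch.

```python
from collections import Counter

def recommend(candidates, match_history):
    """Rank candidates by how often their group appears among past matches.

    A deliberately simplified scoring rule: profiles from groups the user
    matched with before are pushed to the top, so any initial skew in the
    match history is amplified in future suggestions.
    """
    counts = Counter(profile["group"] for profile in match_history)
    return sorted(candidates, key=lambda p: counts[p["group"]], reverse=True)

# Hypothetical match history skewed toward group A (three past matches vs. one).
history = [{"group": "A"}, {"group": "A"}, {"group": "A"}, {"group": "B"}]
pool = [{"id": i, "group": g} for i, g in enumerate("ABABAB")]

# Every group-A profile now outranks every group-B profile, making
# future matches with group A even more likely.
print([p["group"] for p in recommend(pool, history)])  # ['A', 'A', 'A', 'B', 'B', 'B']
```

Even this minimal rule shows the feedback loop: a small initial skew in past decisions becomes a systematic ordering of who gets seen at all.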
Tinder's racial biases are related to the mechanism that collects data. Tinder ranks and clusters people by identifying and scoring their attractiveness, and then deliberately keeps "lower"-ranked profiles out of sight of "higher"-ranked ones. One of the newer approaches Tinder has used is called TinVec: each user is represented as a vector, and the mechanism calculates the proximity between two users' vectors. The closer the two vectors are, the more similar characteristics the two users share and the more likely they are to match, as presented by Tinder's chief scientist Steve Liu at MLconf 2017 in San Francisco (Liu, 2017). The problem is that such algorithms tend to reproduce people's cultural practices and "select what is most relevant from a corpus of data composed of traces of our activities, preferences, and expressions" (Gillespie, 2014: 168).
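TinVec's internals have not been published, so the following is only a minimal sketch of the general idea of vector proximity, using cosine similarity; the hand-written example vectors stand in for embeddings that a real system would learn from swipe behavior.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical user vectors; each component would encode some learned trait.
alice = [0.9, 0.1, 0.4]
bob   = [0.8, 0.2, 0.5]
carol = [0.1, 0.9, 0.3]

# Users whose vectors point in similar directions are treated as more
# likely to match: here alice is "closer" to bob than to carol.
print(cosine_similarity(alice, bob) > cosine_similarity(alice, carol))  # True
```

The sociological concern follows directly from the geometry: if the learned vectors encode traits correlated with race or culture, "close" users will tend to be demographically similar, and proximity-based matching reproduces that clustering.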
To some extent, it can be concluded that Tinder's algorithms are indeed biased and do not allocate profiles to users at random, because the company wants to generate profit and to help users match with people who are similar to themselves. The data collected in our experiment confirms Tinder's racial bias: Sara, the Black woman, was "offered" more Black male profiles than Anna, the white woman, and consequently matched with more of them. Beyond the discussion of the racial bias of algorithms, perhaps we should question and try to solve the problem of racial bias in contemporary society itself, because the online virtual world only reflects the world we live in.
Bibliography
Gillespie, T. (2014). The relevance of algorithms. In Gillespie, Tarleton, Pablo J. Boczkowski & Kirsten A. Foot (eds.) Media technologies: Essays on communication, materiality and society. MIT Scholarship Online, 167-193.
Hutson, J. A., Taft, J. G., Barocas, S. and Levy, K. (2018). "Debiasing Desire: Addressing Bias & Discrimination on Intimate Platforms." Proceedings of the ACM on Human-Computer Interaction (CSCW) 2: 1–18. https://doi.org/10.1145/3274342
Lefkowitz, M. (2018). Redesign dating apps to lessen racial bias, study recommends. http://news.cornell.edu/stories/2018/09/redesign-dating-apps-lessen-racial-bias-study-recommends
Liu, S. (2017). Personalized Recommendations at Tinder: The TinVec Approach. SlideShare.
MLconf (2017). MLconf 2017 San Francisco. https://mlconf.com/sessions/?event=mlconf-2017-sf/
Sharma, M. (2016). Tinder has a race problem nobody wants to talk about. https://www.smh.com.au/technology/tinder-has-a-race-problem-nobody-wants-to-talk-about-20160215-gmu0pj.html


