Context

In recent years, we have witnessed a significant transformation in the way people consume cultural products. The rise and growth of streaming platforms has revolutionized the way we interact with music and film content. With options ranging from ad-supported free playback to vast libraries of affordable songs and films, the cultural landscape has never been the same. Thus, as pointed out by Valente (2016), we have observed a reorganization of the industry, which involves the new presence of social actors previously unrelated to the sector, such as tech companies, and of actors that now play roles that did not exist before1.

In the case of music streaming platforms such as Spotify and Deezer, we see that they are becoming one of the most popular ways to listen to music in Brazil and worldwide. Spotify is the most widely used music streaming platform, with 195 million paying users worldwide, while Deezer has 9.6 million subscribers, of which 2.7 million are in Brazil, making the country the platform's second-largest market, behind only France. According to the survey TIC Domicílios 2021, 61% of Brazilians listen to music through the internet. In addition, there was a reduction in the proportion of users who downloaded songs: 35% in 2021, compared to 41% in 2019, which also points to the growth of streaming platforms. The survey Panorama Mobile Time/Opinion Box, about the use of apps in Brazil, confirms this trend: from July 2021 to July 2022, the share of Brazilians paying for music streaming services on their smartphones grew from 33% to 38%2.

One of the main features of these platforms is their ability to offer recommendations of other songs and artists that may interest listeners, from the very first clicks. The selection of upcoming music content is made through what is called a recommendation algorithm.

Although the specific details of how each platform uses its recommendation algorithms are largely unknown due to trade secrets and lack of transparency, we can understand these algorithms as systems that draw on users' data, such as play history, music genres listened to, playlists, favorites lists and browsing behavior, to provide personalized suggestions.

Personalized suggestions can determine whether a user joins one streaming service instead of another. Who gets it right? Who best captures my profile and taste? These are some of the questions consumers pose when choosing which platform to use.

On the one hand, we may ask whether people are failing to discover songs and artists on their own, relying entirely on platform suggestions. Even if we have the autonomy to decide whether to skip a song, it seems that contact with this or that song now necessarily passes through a platform, without users autonomously thinking about what they want to find. On the other hand, we can argue that these platforms give users more knowledge about a specific musical genre or an artist they already like. Both perspectives prompt reflection on the impacts and changes in the ways of listening to, knowing and enjoying musical content, raising concerns among intellectuals, activists and artists.

Algorithmic discrimination versus recommendation algorithms

In addition to the aspects mentioned above, there is the challenge of understanding how content recommendation algorithms can contribute to the marginalization of certain groups and/or reinforce already existing social inequalities. However, it is important to avoid falling into technological pessimism: the reinforcement or creation of inequalities is often intertwined with structural social issues. In this context, our goal is to explore what the data reveal and how to address the disparities found.

We need, however, to take a step back and define what an algorithm is. An algorithm can be explained as a sequence of instructions to perform a certain task in a computing system. Artificial Intelligence algorithms can be trained on databases, for example, a set of pictures for training a facial recognition algorithm. Among the different types of Artificial Intelligence algorithms, recommendation algorithms can also learn from data based on interactions between different users of a system. Thinking about recommendation algorithms is, therefore, also thinking about artificial intelligence. For some authors, artificial intelligence algorithms exert authority over humans, since they supposedly make decisions on our behalf3. It is worth noting, however, that, in our view, these algorithms do make decisions, but these decisions also correlate with user behavior, which makes understanding their systems and social impacts more complex than a simple determination of what we do based on their mediation.
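The idea that recommendation algorithms learn from interactions between different users can be illustrated with a minimal collaborative-filtering sketch. All names and play counts below are invented for illustration; real platforms combine far richer signals and far more sophisticated models.

```python
import math

# Toy play counts: user -> {track: plays}. Entirely hypothetical data.
plays = {
    "ana":   {"song_a": 5, "song_b": 3, "song_c": 0},
    "bruno": {"song_a": 4, "song_b": 0, "song_c": 1},
    "carla": {"song_a": 0, "song_b": 4, "song_c": 5},
}

def cosine(u, v):
    """Cosine similarity between two users' play-count vectors."""
    tracks = set(u) | set(v)
    dot = sum(u.get(t, 0) * v.get(t, 0) for t in tracks)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user, k=1):
    """Score tracks the user has not played by the similarity-weighted
    play counts of other users, and return the top k."""
    scores = {}
    for other, vec in plays.items():
        if other == user:
            continue
        sim = cosine(plays[user], vec)
        for track, count in vec.items():
            if plays[user].get(track, 0) == 0 and count > 0:
                scores[track] = scores.get(track, 0) + sim * count
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # → ['song_c']
```

The point of the sketch is structural: the suggestion for one listener is derived from the behavior of other listeners, which is why these systems are said to learn from interactions between users rather than from any single profile in isolation.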

Even though some social actors understand that music platforms somehow reproduce what we used to see in radio stations and in the industry before the arrival of streaming4, it is important to point out the considerable differences between the previous forms of mediation of this content and the ones we identify today. One of the main differences is the opacity related to the computerized nature of these platforms, which leads to few people having the literacy to understand the topic in depth. Moreover, the computerization of musical production and consumption adds other relevant layers to be researched in depth5.

At this point, a warning: do not mistake algorithmic recommendation for algorithmic discrimination! Although the two concepts are related, algorithmic recommendation is based on reading user behavior, collecting data and, from there, personalizing what is offered to the consumer, whereas several authors have argued that algorithmic discrimination occurs when the application of algorithms perpetuates prejudice and social inequalities based on data and forms of treatment that contain or reproduce biases and stereotypes6.

“Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings. While we often think of terms such as “big data” and “algorithms” as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors.”7

In the context of our research, it is essential to highlight that an algorithm may shape recommendations in ways that limit users' exploration of other music styles and artists. In addition, it is worth raising hypotheses about how algorithmic recommendations may lead people in general to predominantly consume products from artists and/or groups composed of men instead of women, of white people instead of black people, and of artists from the Global North instead of the Global South. These possible tendencies deserve an in-depth critical analysis of the social and cultural impacts of algorithmic influence on the music industry.
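One way to test such hypotheses empirically would be to audit the demographic composition of the recommendation lists a platform serves. The sketch below, with entirely hypothetical artist names and labels, shows the kind of share computation such an audit would rest on; obtaining the underlying artist metadata is itself a barrier, since platforms rarely expose markers such as gender or race.

```python
from collections import Counter

# Hypothetical recommendation log: (artist, gender, region) triples.
# In a real audit, this metadata would have to be collected separately.
recommended = [
    ("artist_1", "man", "north"),
    ("artist_2", "man", "north"),
    ("artist_3", "woman", "south"),
    ("artist_4", "man", "north"),
    ("artist_5", "woman", "north"),
]

def share(index):
    """Return the proportion of recommendations per value of one marker
    (index 1 = gender, index 2 = region in the tuples above)."""
    counts = Counter(item[index] for item in recommended)
    total = sum(counts.values())
    return {key: count / total for key, count in counts.items()}

print(share(1))  # gender distribution of the recommended list
print(share(2))  # region distribution of the recommended list
```

A real study would compare these shares against a baseline (the catalog, the charts, or the user's own library) before claiming bias; a skewed list is only evidence of discrimination relative to some reference distribution.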

It is also important to mention that it was the intersection of recommendation algorithms and algorithmic discrimination that sparked our interest in this research. After six months of studying algorithmic discrimination in a study group, we wanted not only to look at possible biases in recommendation algorithms, but to go beyond the field of violence studies and think about discrimination in cultural contexts.

For this reason, we wanted to investigate whether, based on the data collected, we could identify evidence of gender and race biases, adopting an intersectional perspective. However, as noted above, we faced a first barrier related to data access: streaming platforms do not always provide detailed information about the social markers of users and artists, as is the case with the category "race" on Spotify and Deezer.

The import of technologies and geopolitical markers 

Although streaming platforms are widely used in the Global South, it is important to note that most of them were developed in the Global North. This origin adds a layer of complexity, raising the question of whether these platforms are oriented toward the audiences and cultural perspectives of that region.

Our research identified implications related to the Global North origin of these platforms. For example, we noticed that music apps often implement changes in a privileged way, prioritizing one geographic pole before expanding to the other. This can be observed when platforms launch new features, such as the non-binary gender identification option, which are made available first in the North and only later in the South.

In addition, we noticed that music genres from the Global North have more refined categorizations, which allows for more personalized and elaborate content recommendations. A concrete example of this disparity can be seen on Spotify. When examining the list of music genres available in the music recommendation code through the Spotify API (genre seeds)8, we noticed greater granularity in the distinctions among musical styles from the US, including specific rhythms such as "honky-tonk", "chicago-house" and "detroit-techno". In contrast, Brazilian musical styles have broader categories, such as "brazil" and "mpb".

These differences in the way music genres are categorized can have an impact on the quality and diversity of recommendations offered to users in the Global South. As Nina da Hora, a researcher focused on algorithmic bias, explains, dealing with technologies developed in different contexts requires applying the concept of "mitigation"9. The researcher points out, however, that it is impossible to completely eliminate the various biases of these technologies, given their place of origin, and she emphasizes the urgency of treating the structural inequalities present in Brazilian society, such as racism and misogyny, as the main focal point for mitigation when discussing algorithms and artificial intelligence10.

Therefore, when dealing with apps created in other regions, it is necessary to adopt comparative perspectives between both regions, in order to understand how the problems occurring at each pole of the globe intersect or diverge. Understanding the differences between the Global North and the Global South can shed light on the gaps that are more present at one pole than at the other, or on how certain phenomena can be a direct effect of this geopolitical relation.

Notas
  • 1

Despite the importance of YouTube to the music consumption scenario in Brazil, the differences in the APIs and the difficulties in comparing YouTube to the Deezer and Spotify platforms led us to leave YouTube aside in this edition of the research.

  • 2

    To learn more, access: https://www.mobiletime.com.br/noticias/01/07/2022/streaming-de-musica-pago-alcanca-38-da-base-de-brasileiros-com-smartphone/.

  • 3

To learn more about the subject, read: SILVA, Tarcízio (org. and ed.). Comunidades, algoritmos e ativismos digitais: olhares afrodiaspóricos. São Paulo: LiteraRUA, 2020.

  • 4

Follow the presentation and discussion of this topic in: FRANCISCO, Pedro Augusto Pereira; VALENTE, Mariana Giorgetti (orgs.). Da rádio ao streaming: ECAD, direito autoral e música no Brasil. 1st ed. Rio de Janeiro: Beco do Azougue, 2016.

  • 5

    Follow this discussion at: https://www.gov.uk/government/publications/research-into-the-impact-of-streaming-services-algorithms-on-music-consumption/the-impact-of-algorithmically-driven-recommendation-systems-on-music-consumption-and-production-a-literature-review. Accessed on 03 June 2023.

  • 6

In this sense, see: (1) O'NEIL, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Crown, 2016; (2) ERIKSSON, Maria; FLEISCHER, Rasmus; JOHANSSON, Anna; SNICKARS, Pelle; and VONDERAU, Patrick. Spotify Teardown: Inside the Black Box of Streaming Music. Cambridge, MA/London: The MIT Press, 2019; (3) SILVA, Tarcízio. Racismo Algorítmico em Plataformas Digitais: microagressões e discriminação em código. In: Anais do IV Simpósio Internacional LAVITS – Assimetrias e (In)visibilidades: Vigilância, Gênero e Raça. Salvador, Bahia, Brazil, 2019.

  • 7

NOBLE, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018, p. 1-2.

  • 8

    https://gist.github.com/drumnation

  • 9

    According to Nina Da Hora, in her participation in the Seminar “The Construction of the Legal Framework for Artificial Intelligence in Brazil”, organized by the Center for Judicial Studies of the Federal Justice Council (CEJ/CJF) and the Superior Court of Justice (STJ).

  • 10

    Available at: https://www.youtube.com/watch?v=khFcS8k6zbc. Accessed on 25 May 2023.