Cobbe believes that it is important to regulate recommenders because they play a key role in promoting content to a "large audience" and thus have a larger impact [1]. This makes sense considering that over 400 hours of video content is uploaded to YouTube every minute [2], which makes regulating all of that content a non-trivial task. Since 70% of YouTube videos watched are driven by recommendations [3], regulating recommenders will address most of the problem at a fraction of the cost. The principle of "De minimis non curat lex" is observed here: instead of imposing the burden of regulating thousands of hours of content, some of which attracts barely any views, it is more effective to regulate only the popular recommended content which reaches a wider audience.
Cobbe distinguishes between three different types of recommending: Open Recommending, where freely uploaded user-generated content can be recommended; Curated Recommending, where recommendations are drawn from a curated list of content; and Closed Recommending, where the platform recommends content it produces itself [4]. This distinction allows Cobbe to determine whether harmful content is able to enter the system in the first place, and subsequently whether that harmful content could be recommended by the system. Cobbe concludes that Open Recommending is likely to be the "biggest contributor to systemic issues" [5].
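To make this taxonomy concrete, the following is a minimal illustrative sketch of my own rather than anything from Cobbe's paper; the Platform class, the can_enter_pool method and the example values are hypothetical. It shows how the three types differ only in whose content can enter the pool from which recommendations are drawn.

    from enum import Enum
    from dataclasses import dataclass, field

    class RecommendingType(Enum):
        OPEN = "open"        # any user-generated content may be recommended
        CURATED = "curated"  # only content from a vetted list of sources may be recommended
        CLOSED = "closed"    # only the platform's own content may be recommended

    @dataclass
    class Platform:
        rec_type: RecommendingType
        curated_sources: set = field(default_factory=set)

        def can_enter_pool(self, uploader: str, is_platform_content: bool) -> bool:
            """Can an item from this uploader reach the pool of recommendable content?"""
            if self.rec_type is RecommendingType.OPEN:
                return True                      # arbitrary user uploads can enter unchecked
            if self.rec_type is RecommendingType.CURATED:
                return uploader in self.curated_sources
            return is_platform_content           # CLOSED: the platform's own content only

    open_platform = Platform(RecommendingType.OPEN)
    print(open_platform.can_enter_pool(uploader="anonymous_user", is_platform_content=False))  # True

Under this sketch, only Open Recommending allows arbitrary user uploads into the recommendable pool, which is consistent with Cobbe treating it as the likeliest source of systemic issues.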
Cobbe believes that such recommender systems "serve the interest of the platform" and ultimately drive "platform dominance" [6]. By showing users similar content, they also have the propensity to shape users' worldviews and may ultimately influence users to create more similar content, completing the feedback loop (sketched below) [7]. I agree with Cobbe's observation that such a phenomenon is likely to occur. However, I also believe that Cobbe may have overemphasized the role of online platforms in an individual's life. The number of tertiary education students globally has doubled in the last 20 years [8]. Since critical thinking is a key skill developed in tertiary education, these students should be able to seek out alternative sources of information. Most people also spend considerable time interacting with colleagues in the workplace and with family and friends elsewhere, so they should be exposed to other viewpoints from these sources as well.
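To illustrate the feedback loop referred to above, the following toy simulation is my own sketch rather than anything from Cobbe's paper; the topics and numbers are invented. It simply shows how always promoting the currently most-engaged-with topic concentrates a user's attention over repeated recommendations.

    # Hypothetical shares of a user's attention across three topics.
    attention = {"politics": 0.30, "cooking": 0.35, "music": 0.35}

    for step in range(10):
        promoted = max(attention, key=attention.get)   # recommender promotes the current favourite
        attention[promoted] += 0.05                     # exposure nudges engagement further up
        total = sum(attention.values())
        attention = {topic: share / total for topic, share in attention.items()}  # renormalise

    print(attention)  # the promoted topic's share of attention grows towards dominance

Even in this crude form, the loop Cobbe describes is visible: the more a topic is recommended, the more it is engaged with, and the more it is recommended in turn.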
Art. 27(1) of the EU Digital Services Act will put greater responsibility on online platforms to be transparent about how and when recommender systems are used, thus alleviating Cobbe's concerns about "echo chambers" and about recommender systems being leveraged to benefit other products from the same company [9]. Arts. 34 and 35 will allay concerns that unmoderated Open Recommender systems could promote harmful content. However, issues will remain whereby bots can manipulate recommender systems [10]. It is difficult for online platforms to detect such widespread, targeted efforts, especially when conducted by well-resourced nation-state adversaries [11], hence little can be done in the form of regulation.
[1] Cobbe J and Singh J, ‘Regulating Recommending: Motivations, Considerations, and Principles’ (2019) 10 European Journal of Law and Technology, p. 2
[2] Hendricks VF and Vestergaard M, Reality Lost: Markets of Attention, Misinformation and Manipulation (1st edn, Springer Open 2019), ch 1.2 ‘The Price of Information’
[3] Rodriguez A, ‘YouTube's recommendations drive 70% of what we watch’ (Quartz, 2018) https://qz.com/1178125/youtubes-recommendations-drive-70-of-what-we-watch accessed 31 January 2024
[4] Cobbe J and Singh J, ‘Regulating Recommending: Motivations, Considerations, and Principles’ (2019) 10 European Journal of Law and Technology, p. 6
[5] ibid.
[6] ibid, p. 7
[7] ibid, pp. 8-9
[8] UNESCO, ‘Higher education figures at a glance’, https://uis.unesco.org/sites/default/files/documents/f_unesco1015_brochure_web_en.pdf accessed 31 January 2024
[9] Cobbe J and Singh J, ‘Regulating Recommending: Motivations, Considerations, and Principles’ (2019) 10 European Journal of Law and Technology, pp. 9, 11
[10] ibid, pp. 9-10
[11] Ferrara E and others, ‘Characterizing social media manipulation in the 2020 U.S. presidential election’, First Monday, https://firstmonday.org/ojs/index.php/fm/article/view/11431 accessed 31 January 2024, p. 7