As the York college devoted to critically analyzing public policy and furthering the betterment of society, McLaughlin presents a series of debates exploring significant public policy issues of the day. You'll hear from outstanding expert speakers from the fields of interest to McLaughlin students, including the legal profession, public service, and the non-profit sector. Past debate topics have included the future of democracy, the deployment of COVID-19 vaccines, and artificial intelligence as an existential threat. Check out our upcoming and past events below.
Chair & Moderator: Prof. James C. Simeon
Panelists: Professors Étienne Brown, Natasha Kusikov, Anne F. MacLennan, and Regina Rini
McLaughlin College Union Debate
November 11, 2021
Following the January 6, 2021, insurrection at the US Congress, a number of social media platforms, including Facebook and Twitter, blocked then US President Donald Trump. This came in the wake of Trump's use of these platforms "to rile up his supporters and bully his enemies" in posts "that were often filled with falsehoods and threats." (Mike Isaac and Kate Conger, "Facebook Bars Trump Through End of His Term," The New York Times, January 7, 2021, https://www.nytimes.com/2021/01/07/technology/facebook-trump-ban.html (accessed October 21, 2021)) This decision was not accepted universally, as others saw it as a limitation on freedom of speech. It has been noted that "the question of when and how it's appropriate for private companies to 'de-platform' people – especially notable public figures like Trump – is not so obvious." (Dipayan Ghosh, "Are We Entering a New Era of Social Media Regulation?" Harvard Business Review, January 14, 2021, https://hbr.org/2021/01/are-we-entering-a-new-era-of-social-media-regulation (accessed October 21, 2021))
It is noteworthy that even the Canadian Civil Liberties Association (CCLA) does not oppose the regulation of online communications, although it does acknowledge that the devil is in the details. (CCLA, "Regulating Social Media: Into the Unknown," February 10, 2021, https://ccla.org/fundamental-freedoms/regulating-social-media-into-the-unknown/ (accessed October 21, 2021)) It has also been argued persuasively that a significant portion of the responsibility for addressing the harms that flow from online communication rests with those who consume the information. The CCLA admonishes us to learn the "critical skills" necessary to discern what is fact and what is fiction. It is up to each of us to respond to and counter harmful speech that may lie outside the law's reach. (Ibid.)
Governments have already proposed and introduced legislation that would require social media companies to remove harmful content from their platforms within 24 hours of it being reported. New regulatory bodies have also been proposed to monitor social media platforms for harmful content in five categories drawn from the Criminal Code: hate speech; child sexual exploitation; non-consensual sharing of intimate images; incitement to violence; and terrorism. (Rachel Emmanuel, "Ottawa proposes plan to regulate social-media content," iPolitics, July 29, 2021, https://ipolitics.ca/2021/07/29/ottawa-proposes-plan-to-regulate-social-media-content/ (accessed October 21, 2021)) We have put together a group of experts who are prepared to debate the following proposition:
"This House accepts that social media platforms should apply the same content moderation rules to global leaders as they do to other users."