Stop Prioritizing Big Tech Profits Over Our Children’s Mental and Physical Health
The outlook for 2023 may seem bleak. On the bright side, however, is the opportunity Canada will have in the coming year to pass legislation that puts the safety and well-being of Canadians – especially children – ahead of Big Tech profits.
For the most part, the public debate about how best to improve online safety and minimize online harm has focused on content regulation, with design and algorithmic considerations taking a back seat. This is misguided. The crux of the problem is not individual bad actors posting individual pieces of malicious content; it is the systems and incentives built into the platforms themselves, which enable the creation and dissemination of harmful content at scale in the first place.
After engaging in consultations with stakeholders and a myriad of experts, the federal government said it was explicitly moving away from an approach focused on content takedowns and toward a systems approach, bringing Canada much more in line with peer jurisdictions like the U.K. and the European Union. Both have developed approaches to online harm that focus on the systemic incentives that lead to the creation, spread and amplification of harmful content, in addition to requiring much more transparency from the platforms about how their algorithms operate.
As Canada moves forward, it is imperative that we keep certain things in mind. First and foremost, we need to get past the false dichotomy presented by conservatives – as well as many Canadian commentators – that any regulatory or governance structure imposed on Big Tech is somehow a political version of "Sophie's Choice" between freedom of expression and safety. This framing is simply incorrect.
Regulating private, for-profit companies in a way that holds them to the same standards as any other company that supplies products to consumers does not stifle free speech. It ensures that platforms consider the impact of the products they manufacture and supply to Canadians, particularly on children. Serving Canadian children algorithmically targeted harmful content, such as posts about self-harm or eating disorders, should outrage all 338 MPs. We have effectively prioritized Big Tech's profits over the mental and physical health of our children. This must change.
All other consumer products are subject to a duty to act responsibly, meaning companies must assess the risks of their products and then demonstrate that measures have been taken to mitigate those risks. Continuing to exempt technology and social media platforms from this standard makes no legal or moral sense.
Second, we need to be clear-eyed about what any governance framework can and cannot do. Holding Big Tech to the same standard as other companies that offer products to Canadians will not mean that no one ever has a negative experience on an online platform again. It will mean that we have taken active steps to mitigate some of these harms, effectively employing a harm reduction strategy.
Finally, it is imperative that any direct involvement of police or national security forces in addressing online harm is kept to a minimum. Measures such as mandatory reporting to law enforcement or national security agencies without a production order would raise significant and legitimate privacy and freedom of expression concerns. We cannot police our way out of the systemic issues plaguing Big Tech.
For too long, Big Tech's lobbying and public relations efforts have essentially given these companies a free pass when it comes to harming our society. Hopefully, 2023 is the year Canada ends that.