Parents and campaign groups calling for tighter restrictions on social media have welcomed a landmark jury verdict in Los Angeles, where a young woman known as Kaley succeeded in her claim against Meta and YouTube over what was described as a childhood addiction to their platforms. Jurors found that Meta, which owns Instagram, Facebook and WhatsApp, and Google, which owns YouTube, had intentionally built products that encouraged compulsive use and, in doing so, contributed to measurable harm to her mental health. She was awarded US$6 million in damages. Both companies have indicated that they will appeal. But the real weight of the case does not sit in the damages, nor even in whether the verdict survives appeal. It sits in the question the court was prepared to entertain: whether the architecture of these platforms, not just the content they carry, can give rise to liability. Once that question is opened, it does not remain confined to California.
This case does not read like an isolated dispute. It reads like the beginning of a shift. For years, the dominant defence advanced by large technology platforms has been that they are neutral intermediaries, hosting content generated by users and therefore not responsible for its downstream effects. That position begins to unravel the moment attention shifts from content to design. A system deliberately structured to reduce friction, remove stopping points, anticipate user behaviour and feed it back in increasingly refined form cannot comfortably sit within the idea of neutrality. It does not simply reflect behaviour. It participates in shaping it: quietly, incrementally, and with far more precision than most users realise.
The mechanics behind this are not abstract. People do not enter these environments as detached decision-makers weighing each action in isolation. They arrive with context: stress, curiosity, loneliness, anger, distraction. The platform does not create these states, but it is built to respond to them. It observes behaviour, identifies what sustains attention, and then prioritises that content. Over time, this becomes a loop. The system learns, adjusts, and feeds back material that is more likely to hold the user for longer. The process is subtle enough to feel natural, but consistent enough to be effective. At some point, the line between what the user is choosing and what the system is encouraging begins to blur, not because choice disappears, but because it is being shaped within a controlled environment.
That reality lands differently in African contexts, and in many respects more sharply. The continent is digitising at speed, particularly through mobile access, but the expansion of connectivity has not been matched by an equal development in behavioural awareness or regulatory depth. In South Africa alone, tens of millions of people are already online, and across the continent a predominantly young population is coming of age inside systems designed elsewhere, calibrated for engagement rather than social outcomes, and governed at a distance. The result is not simply increased access, but deep immersion in environments that are already highly optimised to capture attention.
The economic layer makes disengagement even less straightforward. In many African settings, social media is not just social; it is infrastructure. It is how businesses reach customers, how individuals build visibility, how opportunity is created in real time. Visibility carries value. Engagement carries consequence. Under those conditions, stepping away is not always a viable option. The same systems that shape behaviour also sustain livelihoods, and that creates a tension that cannot be resolved by placing the burden entirely on the user. Behaviour here is not only influenced; it is incentivised.
There is also a collective effect that cannot be ignored. In societies where identity and community remain central, platforms do not simply influence individuals, they shape group dynamics. Algorithms that prioritise engagement tend to elevate emotionally charged content, and in doing so, they amplify narratives that divide, provoke, and mobilise. What begins as individual expression can quickly scale into shared sentiment, often without the friction that would slow it down in offline settings. The impact, in that sense, extends beyond the psychological and into the social fabric itself.
Treating this case as a purely American development misses the point. History rarely announces its turning points with clarity. Sarajevo did not immediately present itself as the origin of a global realignment, yet it triggered forces that extended far beyond its borders. This verdict carries a similar, quieter significance. It signals the early stages of a broader legal and regulatory shift: one in which courts begin to interrogate not only what happens on digital platforms, but how those outcomes are produced. If that shift gathers momentum, its implications will not stop at the United States. They will reach jurisdictions that are already grappling with the consequences of rapid digital integration.
The question now is not whether platforms like Meta and YouTube will continue to operate, but whether the terms of that operation are beginning to change. Regulation, if it is to be meaningful, will have to move beyond content and engage directly with design: how systems rank, recommend, and repeat. At the same time, digital literacy must evolve into something more precise: an understanding not just of how to use platforms, but of how they shape behaviour in return.
Because reality is increasingly difficult to avoid. The internet is no longer just a tool that reflects society. It is an environment that interacts with it, shapes it, and at times distorts it. And in environments that are carefully designed, behaviour does not simply unfold; it is guided, reinforced, and, over time, reshaped.
Kundai Darlington Vambe is a lawyer and researcher focusing on law, governance and technology, with a particular interest in artificial intelligence, cybercrime and international legal frameworks. He holds an LLB and is an LLM candidate specialising in cybercrime, cybersecurity and international law.







