We are living in an age where one online clip can ignite national outrage, divide communities, and test the strength of our constitutional values. That is precisely what unfolded with The Open Chats Podcast, where the hosts, in a widely shared and now-removed episode, made vile remarks about South Africa’s coloured community.
With neither pause nor shame, they asked whether coloured people “sleep with their siblings” and described them as “crazy.” The remarks were not only disgusting. They were dangerous. In a country still haunted by the legacy of apartheid and racial tension, language like this does not remain online. It echoes in real life.
The response was swift. Political parties issued strong statements. The Democratic Alliance reported the matter to the South African Human Rights Commission. The Patriotic Alliance opened a criminal case. TikTok eventually removed the video, but only after it had done considerable damage. And that delay points to a wider issue: by the time platforms act, it is often far too late.
This is not just about one clip or one podcast. It is about a country trying to find its footing in a digital world where harmful content spreads faster than any system we have in place to contain it. We are not keeping pace with the nature and scale of online harm.

South Africa is not without legal tools. The Cybercrimes Act, which came into force in 2021, criminalises the sharing of digital messages that incite hatred or violence based on race, gender, religion, or ethnicity. Section 14 of that Act applies directly to this case. And yet, having a law is not the same as being ready to enforce it.
The reality is that most law enforcement officials and prosecutors are not adequately equipped to handle cyber offences. Evidence disappears in hours. Platforms operate across borders. And harmful content travels faster than the wheels of justice. We can welcome the complaints and condemnations, but without real capacity and resources on the ground, justice will always lag behind.
There is another challenge we must face head-on: regulation. Unlike traditional broadcasters, podcasts operate with almost no oversight. They are self-produced, freely distributed online, and escape the scrutiny applied to television and radio. Despite this, some have audiences in the millions.
The idea that digital content creators are somehow exempt from accountability because they are not part of formal media structures no longer holds. With reach comes responsibility. And with responsibility should come clear and enforceable standards. Bodies like ICASA and the BCCSA must urgently look at how to bring high-impact digital content into the regulatory fold. This is not about censorship. It is about ensuring that where real harm is caused, there is a real path to accountability.
Some will argue, as they always do, that this is a question of free speech. But let us be clear: this is not about silencing dissent or banning satire. It is about recognising when speech crosses a line. When it incites hatred, when it targets historically marginalised communities, when it reinforces violent stereotypes, it is no longer a matter of opinion.
The Constitution fiercely protects freedom of expression, and rightly so. But it also draws a line where speech causes harm. That line exists because we know, as a country, just how damaging words can be. Freedom of expression was never meant to shield people from the consequences of reckless and dangerous speech.
This is a moment that calls for more than outrage. It calls for real action. South Africa needs to strengthen its cybercrime enforcement. Law enforcement officials, prosecutors, and magistrates must be trained in handling offences involving digital content. A specialised digital crimes unit with the right tools and legal authority is no longer optional. It is overdue.
Digital platforms, both local and global, must do more. The Cybercrimes Act already compels service providers to report offences within seventy-two hours. That is the law. We must ensure that it is followed and enforced.
More broadly, we need to reflect as a society on what we share and why. Outrage thrives in the online ecosystem because it drives views and engagement. But we have agency. We can choose to be more discerning, more responsible, more humane in our digital conduct.
This is not just about The Open Chats Podcast. It is about how we deal with harmful speech in the digital era. It is about how quickly things can spiral out of control, and how slowly our systems have responded.
Words matter. Platforms matter. And regulation matters. We cannot afford to dismiss every offensive broadcast with a shrug and say, “It’s just the internet.” The internet is no longer separate from real life. What happens there shapes our society, our politics, and our sense of identity.
In response to this incident, the Deputy Minister in the Presidency for Youth and Persons with Disabilities, Mmapaseka Steve Letsike, strongly condemned the remarks. She made clear that these were not simply offensive words but statements that threaten the values of inclusion and mutual respect on which our society is built. Every community in South Africa, she noted, deserves to be treated with dignity. Public platforms must be spaces where unity is promoted, not division.
While the podcast hosts have issued an apology, the Deputy Minister emphasised that more must be done to rebuild trust. "All of us, black, white, Indian or coloured, each represent our unique traditions, languages, and shared experiences that reflect our multicultural heritage and diversity as a nation," she said.
She also stressed the importance of applying regulation to all forms of media, including podcasts. “Hate speech is one of the most resilient manifestations of cyberviolence and is not to be equalled to free speech,” she said. “Addressing hate speech does not mean limiting freedom of speech. It means keeping it from escalating into discrimination, hostility, and violence which is prohibited under our constitutional law.”
As digital spaces become more central to everyday life, it is essential that we build an online environment based on dignity, fairness, and mutual respect. The Ministry’s response is an important step, but it must be backed by stronger enforcement, better education, and meaningful reforms.
South Africa has a proud Constitution, a painful past, and a future that depends on how well we defend both freedom and dignity. If we do not act, we risk allowing our digital spaces to be shaped by those who are the loudest, the most reckless, and the least accountable. The time to draw the line, legally, morally, and publicly, is now.
The views expressed in this article are those of the writer and do not necessarily reflect those of this publication. Kundai Darlington Vambe holds an LLB (Hons) from the University of London. He specialises in the intersection of law, technology, and digital rights, with a focus on cybersecurity, content regulation and emerging technologies, with a particular interest in the ethical implications of digital systems.