Chara Bakalis, a Reader in Law at Oxford Brookes, has been part of the policy debate about online safety for almost a decade and sees the Online Safety Act as a step in the right direction for victims of online hate.
Chara Bakalis, Deputy Head for Strategy and Development in the Faculty of Humanities and Social Sciences, is a world-leading expert in the study of online hate speech. She focuses on the regulation of online hate and social media company liability.
In the build-up to the enactment of the Online Safety Act, she worked to put her research into action with British and European legal and policy authorities. She has also provided expert commentary on online hate to leading media outlets and charity organisations.
She has given evidence to Parliamentary inquiries, including oral evidence to the 2021 Inquiry into Tackling Online Abuse conducted by the Parliamentary Petitions Committee.
This was part of the inquiry into the online abuse received in 2020 by Harvey, the disabled son of UK media personality Katie Price.
Chara notes that online hate speech is “an expression of hatred towards a group based on their protected characteristics”. Without regulation, the Internet has been a place where many victims of online hate do not feel safe in a space that is supposed to be democratic. Victims may choose to leave social media as a result of the abuse they face, and then lose their online voice.
The online abuse that occurred during Euro 2020 was another high-profile case, on which Chara was invited to comment in a BBC podcast about Tackling Online Abuse in Football. Some British football players left Twitter (now known as 'X') and changed their Instagram settings from public to private to get away from the online bullying. They were effectively silenced on social media.
Chara notes that the harm caused in the online world can be very different from the harm caused in the offline world. When hate speech is broadcast online, the harm can be far more devastating.
The distinction between online and offline harm has been key to her understanding of how the law should address online hate speech, as she explains in a podcast episode about Digitally-Enabled Stalking hosted by the Suzy Lamplugh Trust, a leading women's safety charity. For example, a stalker in the offline world might constantly follow a person home. The victim might realise that someone is stalking them, and may no longer feel safe leaving the house.
However, if the perpetrator also engages in cyberstalking, as most stalkers do, the victim's life can be destroyed in ways that would be far more difficult if the stalking were carried out offline alone.
The stalker could gather personal information from the victim's social media and use it to publish disparaging claims about the victim online. If widely shared, such a post has a seemingly infinite reach.
The perpetrator could remain anonymous online, or even live on another continent, meaning the victim might never know who they are. This makes policing cyberstalking very challenging.
It is vital that the law takes these features of online hate speech into account. Some have argued that the best way to deal with online hate speech is for social media platforms to remove illegal material. Unfortunately, social media companies have not done so consistently, and they need to be legally compelled to act. The recently passed Online Safety Act proposes to do just that. A regulator will oversee the conduct of social media platforms and ensure they comply with their duties under the Act, or risk heavy fines. Greater protection will be given to material that can be accessed by children. Whilst it would be naive to believe that the Online Safety Act will solve the problem, it is undoubtedly a step in the right direction.
Chara's research on hate crime has been published in journals such as Legal Studies and the Criminal Law Review. Her work on hate speech, social media companies and online regulation was recently published in the journal Studies in Law, Politics and Society.
What can we expect from Chara’s research in the future? She is interested in examining algorithms and the proliferation of hate speech from the angle of freedom of thought.
Social media algorithms push extreme forms of content, and most of what people see on social media is recommended by an algorithm rather than chosen by them. She would like to explore the extent to which this undermines people's fundamental right to freedom of thought.
Chara is a member of the Oxford Brookes Equality, Diversity and Inclusion Network (EDIN) as well as the Artificial Intelligence and Data Analysis Network (AIDAN).