Outrage cannot make the web safer, warns Ofcom chief after criticism from Paul Dacre

Google and Facebook must not be regulated "only by outrage", the chief executive of Ofcom has said, in comments that will be seen as a retort to criticism from the former Daily Mail editor Paul Dacre.

Dame Melanie Dawes said the "time has come for strong, external oversight" of web search and social media as she underscored her commitment to tackling harmful online content, days after Mr Dacre questioned whether she had the "wherewithal" to do the job.

Mr Dacre – an arch critic of Google and Facebook – also launched a stinging attack on senior civil servants as he pulled out of the race to become the next chairman of Ofcom and rejoined the Mail’s publisher DMG Media. 

He complained that without left-wing associations he would "have more chance of winning the lottery" than getting the job, as he wished Ofcom "all the luck in the world" regulating the "omnipotent, ruthless and, as we’ve learnt, amoral tech giants". 

Writing in The Telegraph, Dame Melanie accused search and social media companies of prioritising growth over the safety of their users as she prepares the media regulator to inherit new powers through the Online Safety Bill that will allow it to dole out multi-billion pound fines.

"By designing their services to maximise reach, they may have inadvertently promoted harmful content: bullying or harassment, hate speech, self-harm. They may not be quick enough to tackle terrorism or sexual abuse," she said. 

"Today, when people spend a quarter of their waking day connected, safety matters as much online as it always has in the home, school or workplace.

"Six in ten connected adults – and eight in ten older children – have had at least one harmful experience online in the past year.

"Most people support tighter rules. But at the moment, search and social media sites can choose whether to heed those concerns.

"If these companies are regulated at all, it is only by outrage. The time has come for strong, external oversight."

Longtime Mail editor Paul Dacre has pulled out of the race to lead Ofcom to return to the paper. Credit: Julian Simmonds

Dame Melanie is hiring 150 new tech and cybersecurity staff at a new hub in Manchester to help the regulator become the "Red Adair" of big tech regulation, in reference to the firefighter who extinguished oil well blazes with explosives. 

However, she said the new laws would not mean regulating individual pieces of content because the "sheer volume would make that impractical", but that Ofcom would hold companies to account for how they use "algorithms, address complaints and ensure a safe experience for children".

"We won’t act as a censor, prevent robust debate or trample over users’ rights," she added.

"Free speech is the lifeblood of the internet. It is a foundation of democratic society, at the heart of public life, and a value that Ofcom holds dear.

"Instead, our job will be to lift the ‘vague and cloudy uncertainty’ that hovers over search and social media."

Her comments come as Nadine Dorries, the culture secretary, rode to the defence of the permanent secretary Sarah Healey, who was also attacked by Mr Dacre through a letter in The Times. 

Referencing Ms Healey, Mr Dacre wrote: "Senior civil servants working from home so they can spend more time exercising on their Peloton bikes and polishing their political correctness, safe in the knowledge that it is they, not elected politicians, who really run this country."

In her first appearance in front of the culture committee, Ms Dorries said: "There are many male permanent secretaries who went for their jog each morning, or for their cycle ride, or walked their dog. Nobody had anything to say about that."

We must expose tech’s algorithms and regulate social media properly

By Dame Melanie Dawes

Three years before she composed the first modern algorithm, Ada Lovelace pondered the potential for machines to master games like chess and solitaire. If a computer could achieve such a feat, where might this lead?

“I see nothing but vague and cloudy uncertainty,” she confessed. “Yet I discern a very bright light a good way further on.”

For all her visionary brilliance in 1840, not even Lovelace could foresee algorithms becoming the hand that guides people’s travel, shopping and entertainment. More than that, they define our modern experience of being online, fuelling what we see in search results and on social media.

Algorithms have personalised the internet, created new business opportunities, and given ordinary people the power to speak to large audiences. Through their ability to target online advertising, they have also fuelled the rapid rise of trillion-dollar tech giants.

But too often, companies appear to have prioritised growth over the safety of their users. By designing their services to maximise reach, they may have inadvertently promoted harmful content: bullying or harassment, hate speech, self-harm. They may not be quick enough to tackle terrorism or sexual abuse.

Today, when people spend a quarter of their waking day connected to the internet, safety matters as much online as it always has in the home, school or workplace. Six in 10 connected adults – and eight in 10 older children – have had at least one harmful experience online in the past year. Most people support tighter rules.

It's time to hold Big Tech accountable for keeping us and our children safe online, argues Dame Melanie Dawes. Credit: The Image Bank RF

But at the moment, search and social media sites can choose whether to heed those concerns. If these companies are regulated at all, it is only by outrage. The time has come for strong, external oversight.

So the draft Online Safety Bill, currently being scrutinised by Parliament, is an important piece of law. It means tech firms will have new duties of care to their users, which Ofcom will enforce. We plan to build on our track record of upholding broadcast standards, supporting a range of views and promoting innovation.

Equally, everyone should understand what the new laws will not mean. Ofcom will not be regulating or moderating individual pieces of online content. The Government recognises – and we agree – that the sheer volume would make that impractical.

We won’t act as a censor, prevent robust debate or trample over users’ rights. Free speech is the lifeblood of the internet. It is a foundation of democratic society, at the heart of public life, and a value that Ofcom holds dear.

Instead, our job will be to lift the “vague and cloudy uncertainty” that hovers over search and social media.

As a user, you have no idea how these platforms really work. Why do you see the content you see? What are they doing to protect your children from abuse or harassment? When they design their services, is safety their first priority or just a secondary thought?

[Chart: Social media users by age]

When we regulate online safety, Ofcom will demand answers to these questions. We will require companies to assess risk with the user’s perspective in mind. They will need to explain what they are doing to minimise, and quickly remove, illegal content – and to protect children from harm.

We will hold companies to account on how they use algorithms, address complaints and ensure a safe experience for children. The biggest services must also explain how they protect journalistic and democratic content.

Today, these decisions are being made behind companies’ doors, with no visibility or accountability.

Ofcom will set codes of practice, and report publicly on platforms’ performance. If we find companies fail in their duties of care, we can levy fines or audit their work.

And as other countries follow with similar laws, we will work closely with our international counterparts. When I met tech leaders from around the world at Lisbon’s web summit this month, I saw a collective determination to find global solutions to these challenges.

Ofcom is gearing up for the job, acquiring skills in areas like data and technology. And we have opened a technology hub in Manchester to help us attract skills and expertise from across the country and beyond.

We will be ready; and I believe the new laws will make a genuine difference. By shining that very bright light on immensely powerful companies, we can ensure they take proper care of their users and create a safer life online for everyone.

Dame Melanie Dawes is chief executive of Ofcom
