How homophily, internet affordances and the ubiquity of flawed reporting necessitate responsibility

Brevity is the soul of wit, but the torpidity of wisdom

A side effect of the blisteringly fast 24-hour news cycle is the trade-off between the efficiency of turnaround, or of the message itself, and the amount of detail and content you can publish.

Evidently, certain topics deserve a thorough analysis, yet not all platforms afford this.

The book No Time to Think unflinchingly covers this topic and cites haste-driven errors such as “the bias of convenience,” or “the law of least effort,” where there is a tendency to be “satisfied by answers that are the easiest to obtain.”

More chilling though is a journalistic expression relayed by Ron Nessen: “too good to check … some stories you don’t want to spend too much time checking because you don’t want to find out you’re wrong.”

But what can we expect when a journalist can say with a straight face something like this:

We have been talking for weeks … about what the right thing to do is. Do we cover the story and present both sides of it, do we ignore them, are we just giving them a soapbox?

Regardless of what you think of someone, you should at least attempt to present them fairly.

That is what the principles of ethical journalism are for.

We presume that journalists try to objectively relay the news. But how can we assume that online outlets or individual content creators will uphold the principles of accuracy, independence, or impartiality?

Unfortunately the idea that we’re in a post-truth age is old news. It’s common for emotionally weighty or widely believed ideas to spread virally regardless of their factual accuracy.

Is winning the hearts and minds of the public more important than truth? In the digital age is truth determined by the most shared and most retweeted?

I believe we’ve underestimated how online publishing impacts society.

The internet is not divorced from reality

Social media and online environments are increasingly where we negotiate our democracies.

Yet these are undeniably littered with fake accounts, forged reviews, and manipulative content used to garner traffic, or promote a business or ideological position.

The most nefarious example of this was the alleged glut of fake accounts on Facebook during the 2016 US election.

Facebook believes 120 fake Russian-backed pages created 80,000 posts that reached 29 million Americans directly, and a much bigger audience once users shared, liked and followed the posts.

Facebook and other platforms have codes of conduct attempting to mitigate this. And Facebook is reinstating some human fact-checkers after its algorithms demonstrably failed at the task, although it will probably re-encounter the issues of bias it faced before with its Trending Topics feature.

But without giving platforms too much editorial authority, we could seek to foster a culture of critical thinking. It is not solely the platform’s responsibility; content creators and consumers must also carry responsibility for the information they disseminate online.

As this article states:

Thanks to social media and other interactive digital tools, audiences are now participating vigorously in the process of disseminating, recommending and modifying content.

Why wouldn’t a publisher leverage their audience to improve the dissemination of their content in an increasingly saturated market?

It’s important to recognise that it’s not necessarily in a content creator’s best interests to give you the whole story, or perhaps they cannot.

Sharing content is not the problem. The problem is that often we do it without actually reading it, or knowing the context.

As Nandagopal Rajan from The Indian Express says, often “we are just leading with the headline.”

… there is something in online journalism called CTR … (Click-Through Rate). The click-through rates are horrible, you know, it’s 1%, 2%. Something goes viral then it goes to like 7% or 10% max … but it also means that if you see a Facebook post it’ll have thousands of shares but people actually clicking and reading that post is again in that 2–3%.
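
To put those percentages in concrete terms, click-through rate is simply clicks divided by impressions. Here is a minimal sketch of the arithmetic in Python, using invented numbers; only the roughly 2% rate comes from the quote above:

```python
# Hypothetical numbers: only the ~2% rate is taken from the quote above.
impressions = 100_000  # users who saw the post in their feed
clicks = 2_000         # users who actually opened the article

ctr = clicks / impressions
print(f"click-through rate: {ctr:.0%}")  # -> click-through rate: 2%
```

In other words, a post can rack up thousands of shares while only a small fraction of the people amplifying it ever read past the headline.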

Surely people are sharing content because they have more than a passing interest in it?

In her TEDx talk Anita Li points out that the clear boons of online journalism are that “the news has now been democratised,” that it can give audiences more of what they want, and that it can give “everyone a voice.”

However, even the good things you covet will eventually fester. And unfortunately some content creators will seek to manipulate readers.

I don’t believe everyone is qualified to uphold, or is interested in upholding, journalistic ethics. And I don’t believe all readers are able to, or want to, critically evaluate what they read online.

We put the blinkers on ourselves

Homophily, the inexorable tendency to associate disproportionately with similar others, is one of the most pervasive and fundamental features of our social lives.

Homophily is an unavoidable, yet variable, characteristic of human nature.

And I believe it could contribute to the creation of “filter bubbles,” which this article states could arise from a multitude of factors.

… search engines, news aggregators, and social networks are increasingly personalizing content through machine-learning models, potentially creating “filter bubbles” in which algorithms inadvertently amplify ideological segregation by automatically recommending content an individual is likely to agree with.
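
To illustrate the feedback loop the article describes, here is a toy sketch in Python. It is purely illustrative, not any platform’s actual recommender: articles are ranked solely by predicted agreement with the user’s inferred lean, and every click nudges that inference further.

```python
# Toy feedback loop: purely illustrative, not any real platform's system.
import random

random.seed(42)

# Each article has an ideological lean somewhere in [-1, +1].
articles = [random.uniform(-1, 1) for _ in range(500)]

profile = 0.2  # the user's inferred lean, learned from past behaviour
seen = []

for _ in range(30):
    # "Relevance" = predicted agreement: the closer an article's lean
    # is to the profile, the higher it ranks.
    pick = min(articles, key=lambda lean: abs(lean - profile))
    seen.append(pick)
    articles.remove(pick)  # don't recommend the same article twice
    profile = 0.9 * profile + 0.1 * pick  # each click reinforces the inference

print(f"leans recommended: {min(seen):+.2f} to {max(seen):+.2f}")
```

Run it and the recommended leans cluster in a narrow band around the starting profile; the user is never shown anything they are likely to disagree with, which is exactly the inadvertent segregation the article warns about.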

Interestingly, the article also states that people are more likely to share, or express, opinions they believe conform with those held by their immediate peers.

This article claims that, “although people do not actively avoid online information they disagree with …, the internet allows individuals to easily seek out and consume like-minded political news.”

I experience this myself. It is difficult to approach the world with scepticism, or to proactively test your ideas against others.

And our psychology works against us somewhat.

Information that challenges strongly held political views paradoxically pushes individuals to engage in cognitions and behaviors designed to reinforce their existing beliefs.

I believe this is because people’s beliefs form part of the fundamental notions of “what they know,” “how they act,” and ultimately who they are. When their beliefs are challenged, they feel personally attacked and compelled to validate their very being.

On the internet it’s common to unexpectedly encounter content that challenges you. And the fact that it was unexpected could “pose a unique identity threat … [and] be particularly dissonance-inducing.”

A 2012 Pew survey found that “18% [of SNS users] … have blocked, unfriended, or hidden someone” due to finding their political SNS content disagreeable, offensive or posted too frequently. While “16% of SNS users have friended or followed someone because that person shared the user’s political views.”

Isn’t it possible that the more you seek to cut yourself off from unwanted information the more significant it feels when you encounter it?

Of course it’s valuable for reporters and content creators to contribute to the breadth of information online. But due to haste or naivety you may end up misleading people or merely strengthening their convictions.

We created the post-truth world

Online, the stories we encounter are often incomplete or spuriously intended to reinforce, or alter, our views.

Algorithms assist us in funnelling ourselves toward content we already agree with. And encountering opposition will likely cause us to buttress our pre-existing views.

Oh dear.

We’d better not make the post-truth world a post-responsibility world too.

