Misinformation: the (post)truth is in the middle

It almost comes naturally to shake your head in disappointment, disapproval, and disbelief when someone on your friends list shares false news on a social media platform. What is more challenging is making them understand how and why the information they are spreading does not square with reliable sources and the truth. Harder still is doing so without sounding patronizing.

Fake news is a widely used term for the spread of false information, but using it is often the first misstep, as such news is frequently based on some degree of truth. More appropriate terms are misinformation and disinformation. With that established, it is important to ask how some people can disregard the truth so easily in favor of spreading unchecked information. Although it has become hard to discern what is true and what is (partially) false, there are ways to counteract this trend without breaking friendships and families apart.

A recipe for the perfect misinformation environment

When discussing social media platforms and disinformation, one question keeps coming up: is the spread of misinformation due to a faulty platform infrastructure, or to individuals who are unable to tell what is true from what is “clickbait junk”? Framed in such a binary way, the question will never have an answer. The two issues often collide and influence each other, and the truth is in the middle. Indeed, it is hard to hold algorithms alone responsible for the traffic in misinformation, though action should be taken at the regulatory level to hold digital infrastructures accountable for hosting and amplifying messages and news that go to such lengths to manipulate the truth.

However, if the average online user is not taught how to recognize news that spreads misinformation, even the most advanced and carefully regulated technology will at some point fail and keep serving misleading content. Experts in the social and hard sciences need to come together, willing to share knowledge and cooperate, in order to build more inclusive and more humane platforms that do not nudge users by design to click on and share misinformation. The average social media user must be encouraged to share news that contains the truth, even when it looks underwhelming at first glance and is not immediately profitable.

As the scholar Lee McIntyre argues (2018, p. 5), our society is moving towards an acceptance of post-truth, in which objective facts are less influential in shaping public opinion than personal beliefs and emotions. We must counteract the notion that truth is irrelevant. Post-truth serves as an umbrella term for anything that is not truth, because the manipulation of truth is not all of a kind. In some cases good faith is at play; in others, our unwillingness to check whether information is true causes misinformation to spread; and in still others, misinformation stems from the very intention of lying to others and making them believe what has no basis in reality (McIntyre, 2018, p. 7). Another issue that becomes important within the post-truth framework is that some truths come to take priority over others (McIntyre, 2018, p. 10). This is alarming, as all truths should be treated the same.

Of course, not everyone has the tools to interpret the facts in every case (the average citizen, for example, may not be able to understand the findings of scientific articles), and everyone who holds the power of knowledge should provide others with the tools to understand better. This lack of scientific understanding helps explain the popularity of a documentary that surfaced during the COVID-19 pandemic: Plandemic. In it, many facts about the virus were challenged by people who were either unqualified to do so or interested in creating havoc at such a delicate time as the one we are still living through. Although it was strongly condemned by the scientific community, the documentary went viral and became a reason for many to doubt the actual science that had been communicated up to that point.

What is being done, and what can we do

Clearly, this situation is not irreversible. However, these problems must be addressed in a systemic way. Grassroots and activist groups already fight misinformation with enormous dedication. They make a point of not publishing too quickly, preferring to verify the reliability of information first. Some also train journalists in the same forma mentis, to slow the spread of clickbait headlines and misinformation, at least from the more reliable news sources available to citizens and users online.

As I mentioned before, it is hard to see friends and family share content that is so far from the truth. It is even more challenging to see how quickly they become defensive if you point out that what they shared is false. To correct this, the issue must be tackled from a different perspective: one needs to re-train them in new approaches to online content. After all, if we are able to correctly tell truth from misinformation, it means we have had access to both social and cultural capital, a luxury and a privilege, as many do not have the chance to invest heavily in higher education, for instance (Sadowski, 2019, p. 4).

Indeed, being able to navigate the web proficiently cannot be considered the norm: having access to the right kind of information is a privilege, and we should not use that acquired power to act as gatekeepers of knowledge. Clearly, instructing others on how to spot and avoid misinformation requires great patience and flexibility. But at the end of the journey, if even one person has learned what should be shared and what should be left unclicked, our job will have been done well.



Hale, S. A. (2020, November 30). Human or machine? Social science or computer science? Yes, we need them all. The Commons. https://wearecommons.us/2020/11/30/human-or-machine-social-science-or-computer-science-yes-we-need-them-all/

McIntyre, L. (2018). Post-truth. MIT Press.

Robson, D. (2020, November 29). It’s only fake-believe: How to deal with a conspiracy theorist. The Guardian. https://www.theguardian.com/society/2020/nov/29/how-to-deal-with-a-conspiracy-theorist-5g-covid-plandemic-qanon

Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society, 6(1), 1–12. https://doi.org/10.1177/2053951718820549

