How to Skeptic

With that odd title out of the way, I want to get to the heart of what our role should be as skeptics, and how to practise scientific skepticism properly.

I was initially attracted to skepticism because of a teaching colleague of mine back in the early 1990s who was a creationist. He was clearly very smart, yet he held a belief that ran counter to science. I wanted to understand why smart people can believe weird things. I thought at the time that simply showing somebody the “error of their ways”, presenting them with the evidence (in this case for evolution, since denying evolution is what creationism is really about), would be enough to correct their thinking. Surprisingly, this doesn't work out so well!

As skeptics we should be interested in how we can get people to come to the right conclusions about things. By right, I mean the consensus of expert scientific opinion on any particular topic. Some might see this as an appeal to authority – a common logical fallacy. It is an appeal to authority, but to an appropriately qualified authority. The appeal to authority fallacy is only a fallacy when it's an appeal to an inappropriately qualified authority – often to somebody who's commenting outside their field of expertise.

In an ideal world everybody would be engaged with the topics they care about and be capable of thinking critically and assessing the evidence for or against a particular claim. This is unrealistic. Take, as an example, climate change, and in particular the claim that humans are the cause of the majority of climate change, known as anthropogenic climate change. A good website that summarises the evidence for this is skepticalscience.com. It lays out the claims of climate change deniers (or whatever is the currently accepted term they prefer) and why they're wrong. But drilling down into that website reveals a huge mass of comments that dig deeper into the claims. Unless you're a climate scientist, it's impossible to judge the merits of those comments. The impression a reader could easily take away is that there's a huge scientific controversy over climate change (there isn't!).

Encouraging people to do their own research will typically end badly, with people coming to the wrong conclusions. This is a common refrain of the anti-vax community – telling parents that they should do their own research about the safety and efficacy of vaccines. People falling into this trap are likely to be swayed by misleading claims (for example, lists of scary-sounding ingredients in vaccines) and be led to the wrong conclusion. The anti-vax community is filled with people who have an axe to grind and who like to self-promote.

In principle, anybody with an appropriate level of education should be capable of drilling down into the arguments and evidence, assuming such evidence is publicly available and not hidden behind paywalls! But this is like the claim that an individual working alone in their garage could revolutionise a scientific field with a single discovery. It's a Victorian-era view of science. These days, science is done by teams of people collaborating on discoveries. In the same way, it's unlikely that an individual is experienced or educated enough to understand multiple detailed and highly technical scientific papers and come to the same conclusion as the consensus of scientific experts. We should encourage people to read primary sources, but not as the sole means of determining the scientific consensus of a whole branch of science.

I've found the best approach is to listen to science communicators – people who specialise in summarising the conclusions of science and giving us a window into how it works. They're not your typical journalist writing up the latest study results that supposedly call for rewriting the textbooks!

Science communication, as an industry, is not without its flaws, though. It's often guilty of presenting only the conclusions and not really delving into how they were reached. One of the best questions that somebody can ask about a scientific fact is “how do we know this?”. This is the sort of critical thinking that should be promoted. But there's a fine balance between ensuring that the take-away message is communicated and getting into the weeds of the reasoning behind the conclusions.

I recently finished listening to the audiobook version of The Skeptics' Guide to the Universe, written by the hosts of the popular Skeptics' Guide to the Universe (SGU) podcast. The book was a real eye-opener (or should that be ear-opener, since I listened to it?). I would wholeheartedly recommend it to anybody interested in scientific skepticism.

The book is an excellent introduction to skepticism. It presents bite-sized chunks on individual topics we typically encounter (e.g. psychics, GMOs, climate change), and each chapter would make a good summary to show to a believer in that particular topic. But the best content covers how to think, and the biases we all have that lead to flaws in our thinking. One particularly interesting topic is conspiracy theories and motivated reasoning.

The main message of the book comes at the end, in the section called “Changing yourself and the world”. It calls on us, as skeptics, to point our skeptical thinking inwards and examine our own biases.

Ideally, we'd love everybody to be a skeptic and to think critically, able to follow logical reasoning and tell good arguments from bad. But, as humans, we have heuristic brains – we have to take shortcuts – and we can't spend every waking moment analysing every single claim. At some point, we have to trust what we've been told and hope that it comes from reliable sources. But we also have to be aware of the danger of falling into the trap of thinking that what we'd like to be true is true.

Coming back to my colleague from the 1990s, I now recognise that he was subject to motivated reasoning: his devout religious beliefs coloured the way he viewed the science, and he wanted it to support his beliefs.

I recently attended Professor Brian Cox's Universal tour in Auckland. Professor Cox (and his sidekick, Robin Ince) laid out a fascinating tour of cosmology, stepping through the thinking behind some pretty complex scientific concepts. The second part of the evening moved on to the origin of complexity out of order, with a nice demonstration involving mixing cream into coffee. Even with the undeniably wild ticket prices (which Brian and Robin joked about), the event was attended by about 3,000 people. There is clearly a hunger for knowledge in the community and, as skeptics, we can help feed it. While we can't all be literal rock-star science communicators like Professor Cox, I believe our role as skeptics is to promote science and encourage understanding of the process by which its conclusions are reached.

References:

Skeptical Science: https://skepticalscience.com/

The Skeptics' Guide to the Universe (goodreads.com): https://tinyurl.com/y6y2tsnx

Brian Cox was in two bands, Dare and D:Ream.