Burning Snow
Amateur researchers should try and find out what experts think before proposing their own speculative theories
"Burning Snow" by thorner is licensed under CC BY 2.0.
A friend of mine has a slightly misguided mission to turn me into a conspiracy theorist. The way he does this is to send me strange videos from amateur researchers and cranks on WhatsApp.
I’m not really sure why my friend thinks this is going to be persuasive. It is true that we can all fall for disinformation or misinformation at some point or another, but conspiracy theories are far from my favourite genre of fiction, as long-time readers of this blog will know.
Sander van der Linden, a psychologist who has developed an online game to help inoculate people against fake news, says that the people who are most at risk from this kind of material tend to have low digital or media literacy skills; are more intuitive thinkers; are less open-minded than average (presumably he means that they are less open to the idea that they might be wrong); are more politically extreme; are conspiratorial thinkers; and tend to get their news from social media. Sadly for my friend, I don’t tick any of these boxes - follow the links to see my counter-argument to each point.
Without meaning to sound smug, I should point out that my friend knows I am an investigative journalist who gets paid well for my fact-checking skills. I also publish this weekly blog on how authoritarians, grifters and con artists sell conspiracy speculation to weaken democratic institutions and make a fast buck; and I’ve even written a book on how conspiracy theories act as the bodyguards of bad ideas.
I guess my friend thinks that all the conspiracy speculation he consumes must be true. Anyone who disagrees must be (at best) wrong or (at worst) complicit in a completely imaginary conspiracy. Maybe this one video will change my mind? Sadly, of course, this doesn’t happen. I must confess I don’t always react as well as I should to his attempts to convert me to his cause. To make up for my slightly grumpy replies, this essay will take one video he sent and explore how it could be improved. The lessons apply to other videos too.
One of the first videos that my friend sent me showed an American man making a snowball and burning it with a lighter. I won’t link to it, but you can find plenty of similar material if you look for it. Burning snowballs are actually pretty cool! It makes an amazing party trick; and the people who do this would be very dangerous indeed in a snowball fight.
The big problem is what happens next. The research stops there. The people who make videos like this inevitably invent some narrative about Bill Gates and fake snow and environmental hacking and blah blah blah (insert a conspiracy cliché here). Some of them will try and set themselves up as gurus, who have discovered something essential about the universe. The next step for many will be to monetize their videos.
The Sharpen Your Axe project is based on an Ancient Greek philosophy called skepticism, which emphasizes doubt instead of certainty. The philosophy takes its name from the Greek word skepsis, which means investigation or research. I think it is wonderful that non-scientists are discovering strange facts for themselves. Keep the research going!
Why stop the investigation with just one burning snowball? Does it matter how tightly the snowball is packed? Are the results different in different places? This blog has said before that research is a team sport. An important step after any interesting observation should be to do a literature review. Has anyone else noticed burning snowballs before? Why exactly does snow burn?
Amateur researchers should go to Google or ChatGPT or their local library and try and find out why this happens. Alternative options include calling up a scientist at the local university, enrolling in a science course at a local education centre or politely contacting scientists on social media. Whatever option you choose, the mission will always be the same: Find out what the experts think before you offer your own opinion!
If the amateur snowball researchers had done this, they might have stumbled upon this research paper from 2018, which shows that ice evaporates much faster than non-skiers might expect. This means that when someone burns a snowball, the frozen snow turns straight to water vapor instead of liquid water. The technical term for a solid skipping the liquid phase and turning straight into gas is sublimation. The process also happens in reverse, which explains why your breath freezes on the coldest days. Sublimation is so well known that it even has its own Wikipedia page.
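For readers who like to see the numbers, here is a rough back-of-envelope sketch of sublimation using the Clausius-Clapeyron relation. The constants are approximate textbook values rather than anything taken from the 2018 paper, so treat it as an illustration, not a result: even at -10 °C, ice has a vapor pressure of roughly 260 pascals, which means water molecules are constantly escaping straight from the solid into the air.

```python
import math

# Rough Clausius-Clapeyron estimate of the vapor pressure of ice.
# Approximate textbook constants, used purely for illustration.
R = 8.314                # gas constant, J/(mol·K)
L_SUBLIMATION = 51_000   # enthalpy of sublimation of ice, J/mol (approximate)
P_TRIPLE = 611.657       # vapor pressure at the triple point, Pa
T_TRIPLE = 273.16        # triple-point temperature, K

def ice_vapor_pressure(temp_kelvin: float) -> float:
    """Approximate vapor pressure of ice (Pa) at a given temperature."""
    exponent = -(L_SUBLIMATION / R) * (1 / temp_kelvin - 1 / T_TRIPLE)
    return P_TRIPLE * math.exp(exponent)

# Even well below freezing, the vapor pressure is far from zero,
# so solid ice can pass straight into the gas phase.
print(f"{ice_vapor_pressure(263.15):.0f} Pa at -10 °C")
```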
The lesson we can draw from videos of burning snowballs is that it is very difficult to draw convincing conclusions about causality from a single observation. The scientific method is designed to create contact with the world outside our heads by testing many possibilities and throwing away the vast majority of our intuitive hunches.
Of course, the scientific method came late to humanity. Our ancestors mostly used guesswork, just as amateur researchers do today. Superstitious thinking, which takes a wild guess about a causal relationship between two unrelated events, appears to have deep roots. In a famous experiment published in 1948, psychologist B.F. Skinner trained pigeons to develop superstitions.
Skinner fed his pigeons completely at random. They started to repeat whatever they had been doing just before getting fed - just like superstitious sports fans, who wear the same clothes every match day during a good run. Pigeons shared a last common ancestor with humans about 310 to 330 million years ago.
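For anyone who wants to see the logic rather than take it on trust, here is a minimal simulation in the spirit of Skinner’s setup. The behaviours and probabilities are invented for illustration and are not taken from his paper: the rewards arrive completely at random, yet one behaviour always ends up looking “lucky”.

```python
import random
from collections import Counter

random.seed(42)

# A pigeon cycles through behaviours at random; food also arrives at random.
# Neither depends on the other, yet some behaviour will precede food most often.
behaviours = ["turn left", "bob head", "peck corner", "flap wings"]
reward_probability = 0.05  # food drops on ~5% of time steps, regardless of behaviour

preceded_reward = Counter()
for _ in range(10_000):
    behaviour = random.choice(behaviours)      # what the pigeon happens to be doing
    if random.random() < reward_probability:   # food arrives independently of that
        preceded_reward[behaviour] += 1

# By chance alone, one behaviour "wins" and looks like it causes the reward.
for behaviour, count in preceded_reward.most_common():
    print(f"{behaviour}: preceded food {count} times")
```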
We can see something similar with anti-vaxxers and their speculation about “vaccine injury” during the COVID-19 pandemic. More than 5.55 billion people (about 72.3% of the world population) received a vaccination - one of the greatest achievements in human history. There is a clear contrast with the wave of epidemics that weakened the Northern Song dynasty in China before humanity developed vaccinations.
However, the world is very big. Lots of things happen every day. Anti-vaxxers often assume direct causality between A (someone receiving a shot) and B (something completely unrelated that happened soon afterwards). “My mother-in-law’s cousin’s neighbour got vaccinated and then he was diagnosed with cancer.” Strangely enough, the people who do this rarely (or never) mention that the cancer patient spent 30 years smoking like a chimney before getting vaccinated.
Unfortunately, anti-vaxxers who enjoy these narratives are often closer to Skinner’s pigeons or people speculating about burning snowballs than they are to real scientists or doctors, who would ask plenty of follow-up questions. Lots of people are diagnosed with cancer every day. It is pretty much impossible to draw any conclusions about causality from just one case. Instead, it is better to get massive amounts of data and subject it to sophisticated statistical analysis. A little knowledge of toxicology goes a long way; and our risk analysis should avoid being one-dimensional.
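A quick back-of-envelope calculation shows why. The inputs below are rough assumptions chosen for illustration rather than real epidemiological figures, but the order of magnitude is the point: in any very large vaccinated population, a huge number of unrelated diagnoses will land shortly after a shot by coincidence alone.

```python
# Back-of-envelope coincidence estimate with assumed, illustrative inputs.
vaccinated_people = 5_000_000_000   # assumption: ~5 billion vaccinated people
annual_cancer_rate = 0.002          # assumption: ~0.2% diagnosed with cancer per year
window_days = 14                    # the "shortly after the jab" window we care about

# Diagnoses expected to fall inside the two weeks after a shot purely by chance.
expected_coincidences = vaccinated_people * annual_cancer_rate * (window_days / 365)
print(f"Expected coincidental diagnoses within {window_days} days: {expected_coincidences:,.0f}")

# With these assumptions, roughly 380,000 people would be diagnosed soon after
# vaccination even if the vaccine had no effect on cancer at all. Single stories
# cannot establish causality; only population-level comparisons can.
```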
The culture war over vaccines in the US provided scientists with a great tool to weigh the dangers of getting vaccinated against the dangers of not getting vaccinated. It turns out that people in Republican districts, who were discouraged by their leaders from getting shots, died at much higher rates than people in Democratic districts, whose leaders were much more likely to pass on scientific advice to get vaccinated.
The conclusion should be that in the midst of a global pandemic, getting jabbed is much safer than not getting jabbed. Anyone who twists the statistics to tell you anything different is almost certainly trying to sell you something. This might be a quack cure that doesn’t really work; or it could be a populist politician trying to win your vote by engaging with primal emotions like fear.
Finally, it is worth mentioning that internet information about vaccine injuries can contain a small grain of contact with reality. The European Medicines Agency (EMA), which monitors the safety of COVID-19 vaccines authorized in the European Union (EU), says that close to 1 billion doses of vaccine have been given in the EU and the European Economic Area (EEA). They are “effective and safe”; and the vast majority of side effects are mild and short-lived. Serious safety problems are “extremely rare.” The small number of people who have experienced issues have my full sympathy, as do their loved ones.
If, on the other hand, you have self-diagnosed some kind of vaccine injury, please contact a medical professional and ask for a more informed opinion before continuing to research the issue yourself on the internet. Self-diagnosis is a slippery slope. The anti-vax movement is full of crackpots, conspiracy nuts and vendors of quack cures who will all tell you what you want to hear.
Unfortunately, crank magnetism (becoming attracted to multiple unrelated crackpot ideas simultaneously) is just as real as motivated reasoning. You will find bleach-drinking cults at the bottom of this particular rabbit hole. Caveat emptor! The comments are open. See you next week!
Further Reading
Article in Popular Mechanics on burning snow
Play Van der Linden’s Bad News game
Van der Linden lays out the theory behind the game in Foolproof
Sharpen Your Axe is a project to develop a community who want to think critically about the media, conspiracy theories and current affairs without getting conned by gurus selling fringe views. Please subscribe to get this content in your inbox every week. Shares on social media are appreciated!
If this is the first post you have seen, I recommend starting with the second anniversary post. You can also find an ultra-cheap Kindle book here. If you want to read the book on your phone, tablet or computer, you can download the Kindle software for Android, Apple or Windows for free.
Opinions expressed on Substack, Twitter, Mastodon and Post are those of Rupert Cocke as an individual and do not reflect the opinions or views of the organization where he works or its subsidiaries.
You write: "Amateur researchers should go to Google or ChatGPT or their local library and try and find out why this happens."
No one in their right mind should go to ChatGPT expecting accurate information. All the Large Language Models (LLMs) appear to be prone to confabulating, i.e. making up something plausible-ish when they don't have a good answer. (If you google this, the commonly used term appears to be "hallucinate" rather than "confabulate", for reasons I don't understand; confabulate better describes the behaviour.) This includes such helpful tricks as adding footnotes that point to articles which never existed. See e.g. https://www.theguardian.com/commentisfree/2023/apr/06/ai-chatgpt-guardian-technology-risks-fake-article