Jeremy Berg, the editor-in-chief of Science, recently voiced concern about what he sees as a “crisis in public trust in science,” the rejection of scientific findings by large parts of the public. This view is supported by public opinion data revealing a widening divide between the views of scientists and those of the general public, most drastically on issues like global warming, vaccination and the safety of genetically modified foods. Survey data from 1974 through 2010 also show that public trust in scientists has declined despite rising education levels.
This decay in the trust afforded to scientists is a surprising and disturbing trend, given both the pace of technological advancement and the significance of scientific findings in building and sustaining a modern society. Beyond the ignorance that arises when new knowledge is summarily rejected, it is essential that today’s citizens be able to make well-informed decisions on matters of public policy, especially as issues such as climate change and pollution confront us with complex problems to solve.
Part of the blame for this decline lies with the media, which corrodes trust in science by exaggerating results and highlighting advice from non-scientific celebrities. Tim Caulfield of the University of Alberta also argues that frequent conflicting stories about health research lead people to regard health messages as unreliable. This is not to say that skepticism toward scientific findings is wrong or that critiques of findings are unnecessary. Scientific publishing does face legitimate problems, including rising retraction rates and irreproducible results. The risk lies not in healthy skepticism, however, but in outright disbelief.
There are serious consequences to mistrust in science. For example, suspicion of vaccinations in certain communities contributed to a measles outbreak in California. Public policy, often influenced more by misguided public sentiment than by established findings, can add unnecessary financial barriers to investment and research and prevent needed precautions. A less immediate consequence may be reductions in government funding for scientific research — an action that would hamper future discoveries while also undercutting our economy’s competitiveness.
Solutions to this problem must be manifold. For one, the issues in scientific publishing must be addressed, a process that has already begun. For example, AllTrials is an organization that fights to make research open in order to end the cherry-picking of data. But what matters is not so much understanding the results themselves as understanding the method by which researchers arrived at them. Comprehensive education about scientific history would demonstrate how new knowledge has been obtained, how important theories rose to prominence and how new observations led to the revision of those theories. In addition, it would help dispel the belief that science deals in certainties, allowing the public to understand that uncertainty is inevitable.
Some have urged a clearer distinction between scientific findings and the policies derived from them, in order to preserve the neutrality of those findings in the public eye. Berg echoed this view by urging scientists not to venture beyond explanation into policymaking or to overhype their results.
Whatever the method, it is clear the scientific community must halt and, ideally, reverse this trend. Trump’s presidency makes this all the more urgent given his history of promoting discredited scientific theories. The announcement that Robert F. Kennedy Jr., a prominent advocate of the debunked link between vaccines and autism, is expected to chair a vaccine safety commission for the President is a troubling sign that such views persist at the highest levels.
Alex Mink is an Opinion columnist for The Cavalier Daily. He can be reached at email@example.com.