Journalism provides us with important information about what’s going on in the world. But when you consider the incentives that journalists have, combine that with their usual lack of scientific training, and add in the constraints of the medium in which they work, serious distortions of reality can result. Many journalists produce excellent work. But others leave you less informed after reading their articles than before you began.
What causes journalistic distortion?
1. Equal time to each side. There are many issues for which there are two or more reasonable positions that a person can hold. Then there are those issues where one side is supported by nearly everyone who has relevant expertise, and only a few fringe people oppose that view. The trouble is that stories about highly unbalanced issues can create a false impression of balance, either because the journalist feels compelled to spend equal time discussing each viewpoint, or because the journalist is himself unaware of which side is more trustworthy. And a person with highly unrepresentative but highly quotable opinions may be quoted in the article more than is warranted. It may seem less biased to present both sides, but when one side is almost certainly right, an equal presentation may distort more than it informs.
2. Selective reporting. Since news organizations are in the business of selling the news (or, at least, driving traffic to their websites), they have a monetary incentive to produce news that people will be eager to read. Feel-good stories about a dog saving someone’s life can beat out information that might be more important or relevant to most people. What’s problematic from a reality-distortion perspective, though, is that the rate at which events occur and the rate at which they are reported are massively out of sync. For each story about someone coming home from work only to be murdered by their ex-boyfriend, we never hear the millions of tales of people coming home from work only to sit down and eat dinner. This is problematic because the way the human brain tries to estimate how likely something is to occur involves an attempt to retrieve instances of that thing from memory. The more easily you can retrieve those instances, the more frequent you will tend to assume that thing is. If you’ve recently read about a few murder cases, you may have the impression that murder has become more common than it used to be, even if this is merely an artifact of journalists choosing (for whatever reason) to report on more murders. If you can easily think of an example of a shark attack, you may overestimate the frequency of sharks killing people.
The vividness of the accounts we hear can also alter our perception. A vivid retelling not only increases the chance that we remember an account, but also tends to increase our emotional response to it. If you’ve recently read an article that described a gruesome murder in horrid detail, you may subsequently be more afraid when walking alone on an empty street. Through this mechanism, news reading can cause people to have excessive fear of things that aren’t very likely to harm them, and fail to fear far more dangerous things that are rarely reported on. You’re much more likely to die in a car crash than be killed by terrorism, yet in a world where terrorism is reported on constantly, you will likely fear terrorism more.
3. Mix-ups of correlation and causation. Just because X tends to occur together with Y doesn’t mean that X causes Y. In fact, it could be that Y causes X instead, or that both X and Y are caused by some third thing. Unfortunately, reporters frequently get this wrong (or at least fail to make the distinction clear to the reader), especially when reporting on scientific findings. Articles will insinuate that since the latest study found higher wine/broccoli/nicotine consumption was associated with greater longevity/health/focus, that means that wine/broccoli/nicotine must actually cause those benefits. A related problem that you’ll sometimes see (especially in articles about finance) is the implication that since Y came after X, that means that Y was caused by X. It may be true that many stock market investors reacted negatively to a new bill that was just signed into law, but that doesn’t mean that’s a causal explanation of why the stock market fell 1% today. There surely were many factors influencing the change in the market’s price, some tending to push it up, others tending to push it down. Even if it were true that the signing of the bill had a strong effect (which it might be difficult to confirm), that event certainly cannot take all the credit for determining the change in the market’s price.
[Cartoon by Jim Borgman]
4. Use of low quality studies. Just because a study “proves” something doesn’t make it so. In fact, most studies that are conducted are of poor quality for one reason or another. This could be due to a small number of study participants, lack of a control group, lack of randomization, the wrong choice of statistical test, flawed experimental protocol, poor choice of outcome measures, selective reporting of study results, or a variety of other reasons. Unfortunately, journalists rarely make it clear whether a study was of high quality, being mainly interested in what the study claimed to have found. Even the reporting of high quality studies can be problematic, if the journalist fails to mention other high quality studies that found different results. Given all the things that can go wrong in the design and execution of a study, we should be hesitant to accept the results even of those studies that look to be of high quality, until we have seen replication of the study by a different research team.
5. Lack of understanding. Many journalists write about a wide range of subjects. It is rare that they are true experts in the subject of a particular article. But as non-experts writing about what are sometimes very complicated subjects, there is the danger that journalists misunderstand the underlying subject matter. This problem occurs especially often in articles about highly technical research. The issue is compounded further by the fact that journalists are often working under tight deadlines, and so may lack the ability to do extensive background research.
6. Selective use of the facts. Even within a single story, the problem of selective reporting can be substantial. Not all facts in a case are equally entertaining or fit the narrative equally well. There is some incentive to favor those facts that improve the story over drier, though perhaps more important, material. Of course a political or other agenda on the part of the reporter can also determine which facts he chooses to report on. Since there is a tendency for liberals to read liberal news sources while conservatives read conservative sources, both groups may have their pre-existing views bolstered by selectively reported evidence.
7. Exaggeration of importance. News sells better if it sounds important, so news organizations have an incentive to make their news fit this criterion. One way to do this is to report on stories that actually matter to a lot of people, but sometimes it is easier for the organization to make whatever they’re reporting on sound more important than it is. The next big scientific breakthrough turns out to be completely forgotten a few months or years later (but who remembers?). One of the most common forms of exaggeration in journalism is the construction of a trend from a few data points. If a handful of celebrities are eating a lot of coconut, or museums have recently become a little more popular among people in their twenties, that doesn’t mean there’s a new fad that the world should hear about.
Choose your news sources carefully, because the information you consume determines what you believe about the world. And as incredibly valuable as journalism is, it can distort reality.
But universities and their representatives - the intellectuals, the academics - also have a remarkable tendency to generate statements that are intentionally not quite true. This is not quite lying, but it is not quite the truth, either. I will call this truth distortion. When others believe them and pass the information on, either intentionally or otherwise, we have misinformation. The consequences can be devastating, even if the misinformation is later corrected -- because in that case people tend to remember both versions, the old and the new, and the confusion continues.
Truth distortion is everywhere, it seems. It infects not only institutions (politics, religion, business) but also our everyday life and relationships. If anyone has any doubt about the relevance of issues surrounding truth telling versus truth distortion in modern life, just read about today's most famous whistleblowers, Julian Assange, Edward Snowden and Bradley Manning, and what happened to them (link).
Distortion of the truth is often for personal gain. It is misleading and insincere. It is not blatant lying, nor is it quite the same as nonsense. It is not funny, because other people generally suffer or lose out as a result. Because truth distortion is not clearly untrue, it is often hard to identify, criticise and eliminate. This is encouraging for the epistemological opportunists, who feel free to continue their evil work. The truth distortion mounts higher and higher.
The responsibility of intellectuals
Noam Chomsky, whose brilliant contributions to linguistics made him perhaps the most frequently cited academic author of all time, wrote about the responsibility of intellectuals to tell the truth in 1967, and has spent the decades since courageously doing what he promised to do. Given the enormous implications of his revelations about, for example, US foreign policy, we academics can only bow our heads in respect and attempt to follow his example, hoping that other academics will follow suit.
Unfortunately, very few do that. Most of us bury our heads in our academic work and pretend that politics is none of our business, while at the same time enjoying the fruits of political and economic systems for which truth-tellers of the past fought and died. As if we had not grown up yet.
Since universities are primarily in the business of generating and spreading knowledge, they should be in a better position than other institutions to identify and eliminate truth distortion. Chomsky argued convincingly that
Intellectuals are in a position to expose the lies of governments, to analyze actions according to their causes and motives and often hidden intentions. In the Western world, at least, they have the power that comes from political liberty, from access to information and freedom of expression. For a privileged minority, Western democracy provides the leisure, the facilities, and the training to seek the truth lying hidden behind the veil of distortion and misrepresentation, ideology and class interest, through which the events of current history are presented to us.
It follows from this that
It is the responsibility of intellectuals to speak the truth and to expose lies.
This is an obvious conclusion, and as I see it, it really is as simple as that. Intellectuals are in the business of formulating the truth. That is our job. That is what research and teaching are about, at the most fundamental level. Universities give people professorships on the understanding that they will use them to formulate true statements in their discipline for teaching and research purposes. Research grant agencies give researchers money in order to find out the truth about specific questions.
For this reason you would think that universities would be centrally concerned with the truth in general: of all the public institutions that could be concerned with the truth, universities are the most likely to perceive themselves as having this task and are (or should be) in the best position to execute it.
But as Chomsky explained, universities and intellectuals often do not tell the truth. This is the case both within academic disciplines and in the political arena. That is surely a shocking revelation, and it should motivate anybody who cares about truth and about such things as human rights, which so often depend directly on whether the truth is told or not, to act to solve the problem within their sphere of influence. As Chomsky also suggested, and showed by his personal example, that sphere is often bigger than you think.
Global poverty and global warming
It is reasonable to argue that truth distortion is preventing the world's biggest problems from being solved, and in that way indirectly causing the world's biggest problems. I have argued from a human rights viewpoint that the world's biggest problems are global poverty and global warming, and the associated current and future death rates. The problem of global poverty is not being solved because the rich countries are refusing to make the necessary reforms to the global economic system, while at the same time refusing to finance official developmental assistance at the agreed rate of 0.7% of GNP. They are justifying this refusal by veritable cascades of truth distortion: deliberately failing to mention the world's biggest problems when discussing other related problems, pretending that it is not possible to raise the necessary finance, and pretending that developmental assistance does not alleviate poverty. Similarly, the rich countries are refusing to solve the problem of global warming; we have seen an enormous amount of "hot air" but far too little decisive action. Again, truth distortion is the main reason behind the failure to act. Powerful minorities in the rich countries are continuing to pretend that global warming does not exist or that strategies to slow global warming do not work. They are even pretending that people without any relevant qualification can know more about climate science than internationally leading climate scientists. So much for the "truth".
These are primarily global political issues, although academics are centrally involved in them. Here is a more purely academic example. Right now in 2015, a university is organising a conference called "Arts and environment". The aim and rationale are explained on its homepage:
Rapid environmental changes, especially as driven by human activity, present critical global issues calling for urgent action. Several prominent voices have recently called for those in the arts to be change agents towards a more responsible and sustainable future. Within such a context this conference seeks to explore some of the opportunities, responses, and responsibilities in the arts, including connections between the arts and science.
That certainly sounds promising! So I contacted the organisers and asked if they have any policy for reducing the carbon footprint of the conference, and the answer was no. They had no plans for videoconferencing or other forms of web conferencing, no reduced registration fee for those who do not fly, no donation of the registration proceeds to relevant organisations, no high rejection rate to increase general quality based on peer review. Instead they appeared to be planning, as academic conference organisers have done for decades at the same time as the evidence for global warming has been growing, to encourage as many people as possible to fly to the conference from all over the world. I have repeatedly been guilty of this crime myself, so I can hardly complain. Now, given the increasingly clear findings of climate science, I see that this behavior has got to stop, and I feel an obligation to explain this to colleagues in all academic disciplines.
The flying-to-conferences problem is one of countless examples of truth distortion in an academic context. As in countless other such cases, those distorting the truth are not doing so deliberately, and feel completely innocent. They may argue for example that since everyone else is organising conferences, why shouldn't they, or since no-one is coming to their conference in a private jet, it cannot be all that bad. These reactions are partially correct, which is why I am talking about "truth distortion" and not about "lying".
Truth distortion in general
In the following pages, I will argue that if we are going to solve such problems we need to take a more general look at the problem of academic truth distortion. Before acting to reduce truth distortion, we need to understand the problem itself. What is it? What motivates it? Who is doing it, and why? What are the consequences? I will analyse some of many ways in which academics distort the truth within research and teaching in their disciplines. Beyond that, I will be concerned with how the truth can be distorted in three main areas: in academic discourse within universities, in the public domain when intellectuals participate in political debates, and in everyday life.
Allow me to admit that I am a utilitarian. I believe in pursuing the greatest good for the greatest number. Some people think that is a bad thing, or at least not the optimum approach, but let's not get into that now. One way to achieve the greatest good for the greatest number is to tell the truth, and to expose distortions of the truth. For example, we might tell the truth about climate change or the invasion of Iraq in 2003. Millions of lives have depended, and continue to depend, on the way these issues are publicly discussed. Along the way, some people are bound to be offended (you can't make big progress in any aspect of human affairs without offending someone, it seems), but I believe that if we remain honest and authentic in our quest to formulate the truth, however defined, the number of people who benefit will far outweigh the number whose toes have been trodden upon. This, essentially, is the rationale behind my discussion of the examples of truth distortion in the following pages.
Before continuing I should also make clear that I am not a qualified philosopher, nor am I qualified in several other relevant disciplines that I will consider. This article has not been carefully checked by an expert, and probably contains errors, exaggerations, omissions and/or logical fallacies. All such problems are unintentional and I am doing my best to avoid them by focusing on ideas that seem obvious - so obvious that any arguments against them are evidently cases of truth distortion. Readers who find exceptions are asked to send me their thoughts.
Truth distortion outside of academia
Sociologists have studied everyday lying and truth distortion extensively (e.g. Barnes, 1994). You can lie to get what you want, or to protect someone else. Lying is not always bad, but it would help if there were less of it.
People are constantly evaluating each other's reproductive potential, and Charles Darwin famously explained why. Flirting is a form of acting, and acting is a form of truth distortion. So perhaps there is an evolutionary explanation for why we are so good at it? Billy Joel was right when he sang that "Honesty is such a lonely word", which presumably explains the high divorce rate. (By the way, there is nothing generally wrong with acting - or with flirting of course. In fact, theater is an excellent medium in which to explore and understand the phenomenon of truth distortion.)
We are constantly being encouraged to buy things that we do not need - which is gradually destroying the planet. Most of us are confronted with truth distortion in advertising several times per day. Advertising belongs to the foundations of a capitalist economy; the only consolation is that communism is even worse.
Most politicians are constantly trying to convince voters to re-elect them by pretending to work effectively and solve problems. They conceal essential facts from voters and often appeal to their irrational fears. For example, populist politicians like to explain to the electorate that foreigners take away local jobs and dilute local culture, when in fact the opposite is true: an influx of foreign ideas and expertise can boost local industries, and local culture can be enriched by intercultural interactions. This is truth distortion: it is basically, but not completely, untrue. Because politicians primarily aim to be re-elected, they tend to confine their planning to the short term - avoiding the biggest global problems, because they cannot be solved in three years. This state of affairs is bringing the world to the brink of disaster (global warming).
One of the most serious forms of truth distortion in politics and economics is denying world hunger and poverty, especially given that the problem could realistically be solved within about two decades if only the rich countries would keep their old promise of spending 0.7% of GNP on foreign aid (Sachs, 2005). The simple act of NOT talking about this, perhaps the most serious of all problems, is classic truth distortion.
A related example is wealth tax. There are now almost 2000 US-dollar billionaires in the world, and that number is rising as inexorably (and perhaps irreversibly) as the mean global temperature (link). Extreme wealth concentration is threatening social order and democracy at the same time as millions of people die every year from hunger or disease in connection with poverty. When all of these things are happening at once, it is patently obvious that new wealth taxes are necessary. One may well ask: why should any individual have the right to a billion dollars while millions of others are living and dying in poverty? Why indeed. Surely that is taking capitalism just a tad too far? Is it possible for one person to deserve 1 000, 1 000 000 or 1 000 000 000 times more money than another person? Obviously not, and anyone who claims otherwise is blatantly distorting the truth. Moreover, given the obvious problem of capital flight (the rich can shift their wealth to a country with low or non-existent taxes), a global agreement on wealth tax is necessary, so that wealth tax rates are about the same in different countries.
So why aren't the G20 talking about it? There is a remarkable tendency for both the centre-right political parties and the economists (or the entire discipline of economics, for that matter) in the rich countries to ignore this problem, or to immerse it in pseudo-scientific nonsense so that people believe there is no clear solution. That is one of today's most shocking examples of truth distortion. Economists, it seems, are very talented at dreaming up complex reasons why wealth tax might be a bad idea, and the rich just love them for doing it. Conversely, the rich really do not like those economists who have the courage to tell the truth about wealth inequality and wealth tax. If you want to get a good job as an economist, and you are as cowardly as most people, you really should avoid telling the truth about wealth.
Thomas Piketty, in his 2013 book "Capital in the 21st Century", is one of the few to present the truth about rising inequality and its consequences. Most economists (and journalists) are still too chicken to talk about the problem clearly and directly. They prefer instead to wheel out pseudo-sophisticated criticisms of his work, which they hope will impress the rich.
Within academia, new medications are typically tested by randomised double-blind trials with carefully defined control groups. Several studies may be performed by different research groups, and meta-analyses conducted. Complete publication of all methodological details is necessary to enable others to evaluate the quality of a study and the validity of the conclusions. Care and exactitude are important because the results can mean the difference between life and death for individual patients.
But humans are not always very smart, and truth distortion is rife in this area, too. I will confine my comments to the distortion of medical research results in the public domain, as brilliantly critiqued by Ben Goldacre in his book "Bad Science" (Harper Collins, London, 2008). An astonishing proportion of what is reported in the media as medical science is nonsense, generated by people who are living in a fantasy world, want to make money out of fooling the public, or simply don't know what they are doing, because they are not qualified or never did an experiment themselves. Examples of nonsense supported by pseudoscience include non-standard uses of the word "energy" (e.g. your personal "aura"), "brain gym" (exercises to make you more intelligent), homeopathy (sugar-based placebos), detox (getting non-existent chemicals out of your body), Hopi candles, fish oil (omega-3), antioxidants, vitamins as a cure for cancer or AIDS (instead of chemotherapy or anti-retroviral medication), and irrational fears of vaccination (because it might cause autism). If you believe that any of these treatments or approaches are valid, please read Ben's book and think again. Of course the global pharmaceutical industry ("big pharma") also has serious problems, but those problems are not solved by pretending that all medical research is biased and opting instead for superstitious alternatives.
Being overweight

This is a big one. You can hardly mention the word "overweight" without people getting offended. That is a sure sign that the truth is being distorted. If only we could talk about it openly, the problem would be half solved. Let me first relate this problem to the other problems discussed on this website.
Suppose that the world's biggest problems are poverty and climate change. Being overweight is connected to both of them: we in the rich countries have too much to eat while the developing countries have too little, and we are causing climate change in part by our laziness. We drive cars instead of cycling and taking public transport, fly when we could take the train, and overheat our homes rather than wearing a pullover. Many of us could solve several problems at once by changing our lifestyle. We could adapt both our mobility and our eating habits (less red meat!) to the needs of both our body and the planet. It goes without saying that our (sexual) relationships would also improve, not to mention our happiness.
Another big source of truth distortion is the idea that some people are genetically more prone to becoming overweight. That is not untrue, and you can find a lot of medical literature on it. But the idea is often taken too far, which is an example of truth distortion. Given the important difference between self-efficacy and victim mentality, the truth about being overweight is that it is primarily caused by three things: eating too much, eating the wrong food, and not doing enough exercise, of which the third point is probably the most important. That applies regardless of our genetic makeup. Regarding genes: our physiology, which includes the gastrointestinal system and the nerves, hormones, and blood composition changes that signal to our brain that we are hungry, is the result of an interaction between genes and environment. As Maybebop pointed out, we are not victims of our genes; the environment and our behavior are equally important. If you have a similar body-weight problem to your parents, it may simply be that you copied some aspect of their eating or exercise habits.
If you are a parent, your unhealthy attitudes to food or exercise, and your resultant weight, can contribute to the emergence of anorexia in your children. This is a shocking realisation for many parents, but it can also help parents to solve the problem, even after their child is diagnosed (more). Children need role models who look after themselves in a normal way by eating normally and getting normal exercise - and enjoying these things in a normal way. As a result, their bodies are normal, and normal is beautiful, just as mother nature intended. My tip: ride your bike in a normal kind of way in normal clothes (don't pretend you're doing the Tour de France or that you have a perfect cyclist's bum). It's never too late to start. Stop watching advertisements on TV, and recommend that your children follow suit, for their own sanity.
Xenophobia

The public promotion of xenophobia was an important cause of the greatest crime in history, the Holocaust. If you want to prevent something from happening again, the best strategy is to address its causes. One of those causes is the public display of xenophobia, and the acceptance of those displays by the powers that be. This much should be obvious.
Are we serious about "never again"? Evidently not. Today, incredibly, it is still possible to publicly promote xenophobia, as if the Holocaust had never happened, or as if we had not learned anything from it. Right across Europe and in many other countries, the far-right parties are doing it all the time. In fact, it is probably their single main voter-wooing strategy.
Why is the public promotion of xenophobia not illegal? Surely the law is there to protect innocent people from being attacked? Other dangerous things are illegal: driving on the wrong side of the road, for example. One might object that "public promotion of xenophobia" is difficult to define, but other illegal things are similarly difficult to pin down: if two cars collide head-on in the middle of the road, how can you decide which was on the wrong side, and hence which caused the accident? Courts of law are constantly forced to make such "subjective" decisions, which is one reason why law is so interesting.
The public promotion of xenophobia could be systematically prevented by preventing or punishing examples that are worse than precedents that were also prevented or punished, according to an independent judge with experience making such judgments. If that happened, at last we would be freed from the most outrageous truth distortions of the far-right parties, especially during election campaigns. Some semblance of civilised behavior and good manners might return. I live in a country where it is considered bad manners not to greet someone formally (in other countries, you just start talking) or to put your hands on your lap while you are eating (heaven knows what you might be doing with them down there), but at the same time it is legal and generally accepted when a political party blames "foreigners", including asylum seekers, for the country's woes and wins elections on that basis.
Some people argue that banning xenophobic advertising would not help. In fact, it could make xenophobia even worse: people would be so offended that they would double their efforts to be xenophobic, as a form of public protest. It is true that xenophobia is part of human nature, so there is a natural level of xenophobia in the population that can never be completely eliminated. That is an important point that many people don't realise; it can be explained by considering the evolutionary foundations of human behavior. But there are also natural tendencies for people to do all kinds of other things. For example, there may be a natural tendency to try out consciousness-changing drugs, or to steal. But preventing people from smoking (e.g. by taxing cigarettes or banning smoking in restaurants) does not cause them to smoke more, nor does preventing people from robbing houses (by means of burglar alarms or Neighborhood Watch) cause them to rob more. Many similar examples could be listed. Heterosexual men have a natural desire for sex with attractive women, but punishing rapists does not increase the incidence of rape. Need I say more? The idea that the public promotion of xenophobia "cannot" be legally suppressed is a clear case of truth distortion. If one is too cowardly to protect the rights and dignity of minorities and migrants, one might as well admit it. That would at least be honest.
Religion and spirituality

In one way or another, spirituality is important for everyone. But the people who talk about it the loudest are often masters (less often mistresses) of truth distortion. In Europe, people are leaving the Catholic Church in droves (or perhaps flocks) because some of its representatives habitually distort the truth. They deny that women can and should have positions of power, that contraception can and should be used to prevent AIDS, that holy texts were written by humans, and that the world's religions have remarkably similar aims and functions. The prominent representatives of Islam and Judaism are no better than the Christians, and sometimes worse.
If religious leaders were genuinely interested in the truth (as Pope Francis is, refreshingly) they would find many passages to support truth and honesty in their sacred texts (e.g. Matthew 22:15-16). On that basis, they could allow themselves to be more honest and scientific, without denying the universal importance of religion and spirituality. They could create a new middle path between traditional religious teachings and the extreme atheist alternatives. I don't agree that the world would be better without any religion at all (even if John Lennon's "Imagine" is otherwise a brilliant song), because people will always need and promote religion. People also need honesty, and the simple truth is that it is possible to offer them both at once.
Truth distortion in an academic context
Academic truth distortion is surprisingly common in both research and teaching. Perhaps the most familiar case is when academics try to blind their readers (academic colleagues, students, or the general public) with an impenetrable writing style, which makes it difficult for the reader to evaluate the content. That is classic academic truth distortion. But there is much more to it than that.
To understand academic truth distortion (henceforth ATD), it helps to take a quick philosophical look at the idea of knowledge. Knowledge comprises claims that are generally supposed to be true. But what is truth? Well, there is a long philosophical tradition of discussion about the nature of truth - whether it exists at all and if so how it can be identified. Different academic disciplines approach the concept differently. Scientists tend to believe in the existence of an absolute truth that can be discovered and will be true in all times and all places; the "laws of physics" are an example (but even these have a remarkable tendency to change when physicists make new discoveries, as any physicist will explain). Humanities scholars hasten to point out that truth is always relative to social, historical, or cultural context, which can hardly be denied - although this opinion can be problematic when anything that one claims to be true can be shot down by others as merely relative (in which case why do we bother trying to formulate the truth at all?). Besides, the existence of "historical facts" is obvious, and in cases like the Holocaust or the Armenian genocide it is important to insist on them. In spite of these fundamental differences and ontological issues surrounding the concept of truth, one thing is for sure: however defined or regarded, and however it depends on context, truth is centrally important for all academic disciplines. Because if something is evidently not true, it is evidently not knowledge, either.
Given this background, the concept of ATD seems like a contradiction. If academia is about searching for, identifying and sharing the truth, however defined, and ATD is about undermining it, how can ATD exist at all? Why would an academic do such a thing? Is this not a case of stabbing oneself in the back?
The reason why ATD exists is evidently that academics do not work for nothing. Like everyone else, they need motivation. They are rewarded (by jobs, research grants, prestige and so on) for discovering, expanding or constructing "truth". But it is often difficult to judge the extent to which an academic has succeeded in doing that. So mediocre academics develop the art of tricking others into thinking that they are good at what they do. We are all under pressure, after all. To cover up their real or perceived inadequacy, they develop the gentle art of ATD.
ATD is no small problem. Academics may find themselves surrounded by it at all levels - administration, teaching, research. It follows that if an academic is to make a significant and lasting contribution toward the "truth" (also called "knowledge"), s/he must be skilled in identifying, exposing and avoiding ATD. This, then, is one of the main skills that students should acquire before getting a university degree. Unfortunately, sometimes the opposite occurs - masters of ATD naturally and effortlessly pass their art on to their students.
I can think of two main ways to stop ATD-generators in their tracks. One is peer review. Academics who are forced to publish their research in good peer-review journals will soon change their ways. Good anonymous reviewers will not put up with ATD for long. But poor reviewers, alas, may already be up to their necks in their own ATD. While this system often fails and works better in some disciplines than others, there is no better one. The other way is to train students in the art of critical thinking. An important part of critical thinking is the identification and rejection of ATD in teaching materials. University students should choose their courses, instructors and supervisors accordingly. That's not as difficult as it sounds, and one aim of this text is to help students to do just that.
ATD is a serious threat to research and teaching everywhere. If universities are worried about their public funding and social relevance, they should take a long, hard, honest, self-critical look at the ATD that is going on within their own ivory towers. The best way to reduce the incidence of ATD is not to accuse individuals of generating it, but to raise awareness of ATD so that people can at least recognize it. Peel back the layers of truth distortion, one by one. Expose them, talk about them, understand why they are there, and on that basis try to get rid of them.
Varieties of ATD
It's not easy to divide ATD into different kinds, because they often overlap. Here is a possible analysis.
Incomprehensibility and pseudocomplexity
If your academic colleagues cannot understand a text that you write, they cannot evaluate or criticize it. At the same time, you can even accuse them of being too stupid to understand it. Or at least imply that they are. If you are not very intelligent yourself, you may start to believe your colleagues really are stupid. And so it goes on. That is one of the ways conflicts can emerge in mediocre academic settings. Such conflicts tend to waste a lot of time and money and reduce the academic standard even further.
Incomprehensibility and pseudocomplexity are perhaps the most common and familiar forms of ATD. To expose them, we need to be clear about what we mean by an "appropriate academic writing style". Academics do not usually write in the style of everyday speech, because everyday speech contains colloquialisms (for which clear definitions may not be universally accepted or found in dictionaries) and repetitions (which in a written text can often be avoided). Moreover, the way we speak - when for example we present our work at conferences - is different from the way we write, because the reader of a text has the chance to skip from one place to another and read the same sentence several times, but a listener has to follow the temporal course of an argument from start to finish.
For these and other reasons, one might say that an "appropriate academic writing style" is a style that is concise (saying as much as possible in few words - important especially if you have to summarize an entire research project in a 200-word abstract - and considering that most academics don't have much time to read anything properly) and clear (using terms that are either clearly defined or whose different meanings are discussed in the text so the reader can get a clear idea of what the writer is trying to say). Technical terms, abbreviations and words borrowed from foreign languages should only be used if they are necessary and clearly understandable to the readership. It often helps to avoid technical terms altogether, to open the text to more readers - both within academia in different disciplines and outside of it. Beyond that, the material should be tightly organised and presented in a logical order so that a typical reader will understand it.
Because of the complexity of some academic theories and writing, there is a feeling among the general public, and unfortunately among many academics themselves (not to mention their innocent students), that complex theories are generally superior to simple ones. In fact, a good theory generally has an appropriate amount of complexity to achieve a given goal. You can find out what an appropriate level is by increasing the complexity of a given theory beyond a given level, and checking whether that improves its explanatory power (the qualitative aspect) or its ability to predict new data (the quantitative aspect). If not, either your theory is already too complex or there is a brilliant idea out there that you haven't discovered yet. Brilliant ideas, needless to say, can usually be expressed simply and concisely. What I am saying here is essentially the same as what the philosopher William of Ockham said in the early 14th century, often called "Ockham's razor". It is a principle that is more important for the sciences than the humanities, in which a certain richness of material and detailed understanding is usually considered to be a goal in itself. A remarkable thing about Ockham's razor from a modern scientific perspective is that many scientists know about it and can explain it, but few seem to apply it in their research. At least, that is the impression I often get when I peruse leading journals in disciplines such as psychology, sociology and economics, and I should know what I am talking about, having written and reviewed original contributions to many psychology journals.
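The complexity test just described can be made concrete in a few lines of code. The sketch below is purely illustrative (the data, the polynomial model family, and the function names are invented for the example): it fits polynomials of increasing degree to noisy data generated from a quadratic "truth", and measures each model's ability to predict held-out data. Beyond the true complexity, adding further terms stops improving prediction - Ockham's razor in quantitative form.

```python
# Toy demonstration: choose model complexity by predictive power, not by
# impressiveness. Data are generated from a quadratic plus noise; we fit
# polynomials of degree 1..8 and score each on held-out points.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 60)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(0, 0.5, x.size)  # true model: degree 2

# Every second point is used for fitting; the rest are held out.
train, hold = slice(0, None, 2), slice(1, None, 2)

def holdout_error(degree):
    """Mean squared prediction error on held-out data for a given degree."""
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x[hold])
    return float(np.mean((pred - y[hold]) ** 2))

errors = {d: holdout_error(d) for d in range(1, 9)}
best = min(errors, key=errors.get)
print(best, errors[best])
```

Running this, the linear model (degree 1) predicts clearly worse than the quadratic, while the higher-degree models add complexity without reliably improving the held-out error - which is the signal that the extra complexity is decoration rather than explanation.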
ATD fans try to fool their readers by writing in a style that sounds academic but in fact breaks the above basic rules of academic communication. A text that contains important-sounding words that are not used in everyday communication may sound academic to a reader from outside the field in question, or a student who has not yet been initiated into that field. Such readers can easily jump to the conclusion that those important-sounding words are necessary. They do not want to admit their ignorance and guess that if they keep reading, they will start to understand those words. In ATD, however, these words are often not clearly defined, or not defined at all. No amount of reading will ever make them clear. Even international leaders in the same specific discipline have to guess what they are supposed to mean - which they will not do for long before reading something else. And if those experts cannot understand the text, it is useless. It is not worth the paper it is printed on.
A common form of ATD is to exaggerate one's own positive achievements and to exaggerate the negative achievements of others. The former tends to be done publicly, the latter privately.
A familiar example of the first form of exaggeration is the word "excellence". These days it seems that just about any academic program that promotes an above-average standard can be called "excellent". But the term "excellence" often means no more than "academic quality". There is no point talking about "research excellence" before a system and a tradition of good research evaluation is in place; the term "excellence" should only refer to the best of a large body of research that is already well above average. A similar thing happens when departments give final grades to students at the end of their degree programs. Students often get very good grades (A or 1) although their performance clearly merits B or 2. This is a common form of ATD used by university departments to bolster their self-image. "Look at how many of our students got an A at the end of the course - we must be good!"
Exaggeration of the negative achievements of others is a central feature of academic mobbing or bullying. Mobbing is a time-honored method of suppressing colleagues who are competing for the same resources. Just attack them repeatedly over a long period until their reputation is destroyed and they can easily be pushed around. When describing the work of a colleague, it is not difficult to make just about everything that s/he does sound bad, especially if s/he is doing creative new things, challenging traditions, or merely being different. Academics have a lot of practice at evaluating students and colleagues, and some take the opportunity of applying this skill to the task of destroying their opponents. A good "evaluator" can easily and quickly formulate a convincing story that a given colleague is generally bad - poor in administration, research and teaching - even if that person is in fact one of the best in all three areas. Most universities have anti-mobbing policies, but they generally sound better than they really are, which itself is a form of ATD. When mobbing victims have the courage to look around for the support promised by such policies, they usually manage to find sympathetic colleagues, and those colleagues do offer them reassuring talk; but they usually stop short of politically effective action, so in the end nothing changes. It is hard enough to have a mobbing claim investigated, let alone have a mobber clearly identified as such. And even if that happens, there is little chance that the mobber will be appropriately punished - as a signal to other potential mobbers that mobbing is not acceptable behavior and a massive waste of public funds.
People just love to talk about science and scientific method, and naive ideas about the power and wonder of science are prevalent both within and outside of academia. Just say the word "science" and people spontaneously think of "great men" like Newton, Darwin, Einstein, and all the long-term benefits that accrued from their scientific work. But science is also supposed to be hard to understand, and people often have naive ideas about how science works, or what scientific method means and how reliable it is. In fact, they are often unclear about the difference between science and humanities, a distinction that is absolutely central if you want to understand how academia works. The overused expression "scientific studies have shown that..." is often a warning sign that what is about to come is a dubious statement that has been supported by a flawed methodology, or perhaps a paper that has been rejected by a peer-reviewed journal.
Both experts and laypeople can be led astray in this way. In my experience, academic grant applications are more likely to succeed if they include the word "science" or "scientific" on every page. Try it! It cannot hurt and it probably increases your chance of success. I got this idea from a highly respected academic colleague - one of the global leaders in my discipline, if you count the citations. What he did not tell me is this: if you are applying for money from the humanities, you should avoid the word "science" at all costs, because humanities scholars regard scientists as a threat, like the members of an opposing football team. The feeling is almost mutual: scientists often have not the slightest idea what the humanities scholars are up to, except that it is probably not "scientific", so it cannot be good. I need not remind the reader that all of these cases are examples of ATD.
These are great opportunities for academic truth distorters. They know that their ideas are more likely to be accepted if scientifically naive readers think they are "scientific". One approach is the suggestive use of neuroscientific ideas and terminology in academic disciplines that involve human behavior. These disciplines include not only psychology, anthropology, and education, but also just about anything: religion, law, cultural studies, history, art appreciation, economics, and so on.
Consider the field of education research for example. "Educational scientists" (as some like to be known) sometimes justify pedagogical and didactical approaches by impressive talk about the left and right sides of the brain, which are apparently associated with analytic and holistic thinking respectively (and a whole host of other things). But any neuroscientist will tell you how misleading this idea is. The brain is so much more complex than that! Besides, the kind of empirical knowledge that educators need to support their educational methods can generally be derived from behavioral studies alone. So why cite neuroscience? You guessed it: they do it to impress people, both within and outside of academia.
The psychological effectiveness of talking about neuroscience, even if it is largely irrelevant to the matter in hand or merely opens a can of worms, was demonstrated in a rather cunning little study by Weisberg et al. (2008). The authors showed that if you talk about the brain when trying to explain some aspect of behavior, the explanation sounds more convincing to non-experts (people not trained in psychology or neuroscience), regardless of whether that information was logically relevant or not. Here's the reference: Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470-477.
A classic form of ATD is to deny that something exists, although it is evidently important. This is done in various ways, of which the following are examples.
Ignoring relevant research (also called cherry picking): One way to convince non-experts that you are right is to present only arguments in favor of your thesis and deliberately leave out well-known arguments against it. Just pretend that the counterarguments do not exist. This form of ATD is commonly applied by climate deniers, but it is also remarkably common in respectable academic circles, and I have even heard supervisors recommend it to their research students. That is essentially how contradictory schools of thought emerge. A school of thought that is in conflict with other schools may reflect either a promising new idea or a general inability to communicate with colleagues and consider contrasting ideas and approaches.
Ignoring practical applications: Good research does not necessarily have any practical application at all, and sometimes practical applications only emerge long after the research has been done. But that does not mean that pure research without practical applications is superior, or that academics should be encouraged to lock themselves in an ivory tower and free themselves from practical considerations. The opposite is true: since academia is largely publicly funded, academics have a responsibility to inform the public about the practical applications of their work. Pretending that practical applications are irrelevant is ATD.
There are two kinds of plagiarism. One involves copying the wording of sentences. Today, this kind of plagiarism is fairly easy to discover automatically by computer, because most texts from which an academic might copy are on the internet.
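To see why only this first kind is easy to catch, here is a minimal sketch of the sort of comparison plagiarism software performs (the function names and sample sentences are my own invention, not any real tool's interface): each text is reduced to its set of word 5-grams and the overlap between the sets is scored. Near-verbatim copies score high; a paraphrase of the same idea in fresh words scores near zero, which is precisely how the second kind of plagiarism slips through.

```python
# Sketch of verbatim-overlap detection: compare the word 5-grams of two
# texts and report their Jaccard similarity (shared n-grams / all n-grams).
def ngrams(text, n=5):
    """Return the set of word n-grams of a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    """Jaccard similarity of the two texts' word n-gram sets (0.0 to 1.0)."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

original = "the important thing is the meaning and implications of the theory"
copied = "the important thing is the meaning and implications of the idea"
unrelated = "computers can easily compare texts against sources on the internet"

print(overlap(original, copied))    # high: near-verbatim copy
print(overlap(original, unrelated)) # 0.0: no shared 5-grams
```

A paraphrased theft of an idea would look like the "unrelated" case to this kind of test, even though a human reader would recognize it immediately.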
The other kind of plagiarism involves copying an idea and presenting it in a new formulation. Computers can't find that, but it is arguably more important, because ideas are the final outcome of research. The way Darwin formulated his theory of evolution is not the most important thing: the important thing is the meaning and implications of the theory. If someone at the time had tried to publish Darwin's theory of evolution in different words and formulations, that would have been a serious case of plagiarism.
In 2016, I discovered a published article in a high-profile international journal that looked like my own work but was expressed in different words. An author whom I did not know had presented an idea of mine as if it was his own. In previous years I had presented this idea in different places, some of them high-profile peer-reviewed journals and books, and I am well known for this idea in my academic circles. The author had easy access to this literature and had evidently seen some of it.
So I asked other colleagues what to do. Should I write to the author or the journal? Some thought I should write to the journal, because the author might just deny it and may have no interest in solving the problem. After all, his article had been published, and he had put a lot of work into it.
So I wrote to the journal and pointed out this obvious and blatant example of plagiarism. After all, the main point of the article in question was the same as the main point of my own articles. What could be more blatant than that? I asked the journal to delete the article until the matter had been resolved. The journal was open access so this was merely a matter of deleting an internet page.
The journal was so big that it had an "Ethics and Integrity Manager". One would have thought that this person would have the authority to decide whether something is plagiarism or not. Instead, he sent my complaint to editors further up the hierarchy, who evidently had no special expertise in "ethics and integrity", or at least not in questions of plagiarism. The message came back that those editors did not believe this was plagiarism and saw no reason to do anything--as if there were no such thing as plagiarism, or as if they were unable to recognize that two papers presented the same idea in different words.
After that I asked a legal colleague at my university for advice. I could take the journal to court, but the procedure could cost 30 000 Euros and I might lose. I then asked my university administration whether other members of the university had similar problems, in which case we could support each other, and the answer was no.
From this I learned that you can take anyone's idea, express it in different words, and publish it without citing the original work, or citing it in a misleading way. After that, it will look like the idea is yours. Probably, nothing will happen. If the author of the original idea complains, just claim that you were unaware of the author's work, or present some other kind of truth distortion as an excuse. If people cite your work instead of the work that you copied, you've won.
But watch out--if you copy sentences from someone's paper, you could lose your doctoral qualification. In practice, plagiarism is not what it seems to be. It is not what you will find in the usual definitions. In practice, plagiarism is only plagiarism if a computer can find it. Have people stopped thinking?
ATD in action
To understand ATD, we need to understand the processes that enable it to happen. Here are some of the most important ones. You can find them happening at just about any university.
Avoiding expert evaluation
ATD fans don't like evaluation procedures, for obvious reasons. So they apply their truth-distorting skills to the task of systematically avoiding evaluation, while at the same time maintaining an aura of academic respectability.
Peer review. One approach is to develop sophisticated arguments about why peer review is bad and should be avoided. Common arguments include: (i) reviewers often have ulterior motives, (ii) the review procedure can suppress motivation and creativity, and (iii) great thinkers of the past never had to deal with peer review. All these points are excellent examples of ATD: they are not untrue, but then again they are not really true, either. The real truth is that (i) conflicts of interest can usually be avoided, (ii) review procedures can also motivate researchers by giving them good ideas, and (iii) almost all would-be great thinkers of the past have now been forgotten. Peer review is not perfect, but neither is democracy - and the simple truth is probably that there is no better system of academic quality control than peer review. (Given that peer review is more prevalent in some disciplines than others, it may be appropriate to apply the following rule of thumb. A researcher who regularly cites refereed articles, or addresses topics that are addressed in refereed articles, should her- or himself publish in peer-review journals. If s/he does not, s/he is either distorting the truth or free-riding.)
Accepting criticism. Those who avoid evaluation tend also not to take suggestions and constructive criticism very well. In this case, ATD involves rejecting constructive suggestions without careful consideration. Since good research is almost always the result of interaction between many different researchers (even if the article in which it is published has only one author), the willingness and ability to implement serious suggestions is an essential skill of any academic.
Language. Another way of avoiding expert evaluation is to publish your research or write your dissertation in a language that most international experts in your specific area do not understand. A linguistic asymmetry of this kind can be very convenient: you can understand what they do, but they cannot understand what you do. Since the international language of academia in most disciplines is now English, this trick works in just about any language that is not English. You can write things that you know international experts don't like, but potential reviewers within the specialist academic world of your minority language are more likely to accept - either out of solidarity for other speakers of their language, or because you know all three of them personally (Fritz, Maria and Rudolf) and know how to write the things that they want to read. The result is ATD.
Proof by ATD
Research is about expanding knowledge. That often involves formulating claims (theses) and then trying to convince colleagues that the claims are true. There are several ways that ATD can play a role in this process.
Proof by superficial impression: A reader who thinks an idea sounds good but does not really understand it tends to accept it as the truth rather than admit her/his ignorance (which can be embarrassing) or try to understand it properly (which can be hard work). Impressive, complicated jargon is used to impress non-experts. The jargon is deliberately not clearly defined, which gives the writing an aura of mystique. New words are invented, and foreign words and fancy abbreviations are used unnecessarily. Non-experts (including students) may be impressed by this kind of speaking and writing, but experts (people who do research in the same specific area and have a good international reputation) just laugh and press the delete button.
Proof by repetition: If you say something often enough, people will start to believe it. In the period leading up to the US invasion of Iraq in 2003, political leaders such as George W. Bush and Tony Blair repeatedly claimed that Iraq had been involved in the 2001 terrorist attack on New York and was subsequently developing nuclear weapons. Both points were obviously untrue, but people started to believe them, simply because they heard them so often. Another example: during the 1990s, multinational oil companies and their supporters repeatedly claimed that global warming either was not happening or, if it was, was not being caused by humans - at the same time as climate scientists were explaining the opposite. The resultant confusion enabled the American president to refuse to sign the Kyoto treaty, with disastrous consequences. ATD in the form of climate denial is ongoing, and its proponents are gambling with hundreds of millions (perhaps billions) of future human lives, not to mention the countless species that are already doomed to extinction in the coming century. Often, the deniers are merely repeating claims that are obviously not true; after a while, mere repetition means that people will start to believe them. Those are the lengths to which so-called homo sapiens will go to defend his (sic.) right to distort the truth. Academics who repeat the same half-baked ideas again and again to their students or research colleagues are applying the same technique, albeit in a less catastrophic fashion.
Proof by authorship: The underlying idea here is that if a famous person said something, then it must be true. But famous intellectuals are often famous because they were creative in their ideas and took the risk of making claims that were unpopular, both of which suggest that a lot of what they claimed was unreliable or experimental in nature. In any case, the truth content of a claim can only be determined by analysing the argument. This is surely the main skill that students should be learning. There is nothing wrong with citing the ideas of famous people - but the fact that they were famous is not necessarily relevant for a critical evaluation of the argument. A variant of this idea is that a statement by a member of a group of people with which an author and/or her/his intended audience strongly identify is more likely to be true than an equivalent statement by a member of another group. So if I am a humanities scholar I may consider the opinions of humanities scholars to be generally superior to those of scientists (or vice-versa). Or if I am an English speaker I may think (or subtly suggest) that the opinions of English speakers are generally superior to those of German speakers. Nonsense, of course.
Researchers whose research is consistently rejected by their academic peers may instead turn to their students for feedback. If the students seem to like it, it must be good! The researcher finds a new success strategy: base your research and teaching on student feedback (rather than feedback from academic colleagues in review procedures). Research evaluation by students is a kind of academic populism.
Here's how it works. Course content emerges gradually over years of interaction with students. The teacher learns from each wave of fresh new enthusiastic students how best to impress the next wave. Over the years, the students become increasingly impressed. They attend not only the courses taught by the academic populist but also choose her/him as a supervisor (advisor) for their bachelor's, master's and doctoral projects and theses. The content of the teaching then becomes the content of the research. Top-down becomes bottom-up.
This strategy can work so well that many colleagues and administrators may be fooled by it. Colleagues with an international profile in the same specific discipline are more likely to see what is going on, and they may even be so courageous as to point out that the academic populist has no internationally recognized publications and is presumably polluting the minds of innocent students with ATD. But others are so impressed by the academic populist's success that all warnings are ignored. Just look at all those satisfied students! In this way, a mediocre and unscrupulous academic can build up a good reputation at her/his home university, while at the same time quietly and almost invisibly undermining the university's international reputation.
This strategy is surprisingly prevalent and often tolerated, although it evidently makes a mockery of the academic system. Why is it wrong? The main problem is that students are basically unable to evaluate research content - otherwise they would not be students. In fact, as every journal editor knows, most researchers are unable to evaluate academic content unless they are currently working in the same subdiscipline and preferably on the same research question. The "research" that emerges from academic populism often includes interesting questions (that is certainly a positive aspect) but if neither teacher nor student is capable of the kind of critical thinking that is essential for research progress, the answers that are offered to those questions exhibit the typical characteristics of ATD.
Students who choose an academic populist as a research supervisor may destroy their chances of a respectable academic career. If they ever start to interact with international experts (and many never do), they may find their ideas and approach to research are consistently rejected or ignored. To some extent, they are prepared for this, since their supervisor has often made disparaging remarks about international colleagues - which now seem to be confirmed. But they are quite unequipped to deal with such a consistently negative reaction. At that point, they drop out or recede to the protection of their populist mentor at their home university. They can only survive if they adopt their mentor's isolationist, populist strategy - and the cycle continues.