Reasons to Question Credibility
|"Very few really seek knowledge in this world. Mortal or immortal, few really ask. On the contrary, they try to wring from the unknown the answers they have already shaped in their own minds -- justification, explanations, forms of consolation without which they can't go on. To really ask is to open the door to the whirlwind. The answer may annihilate the question and the questioner." -- Ann Rice, in The Vampire Lestat|
Conflict information comes in many forms and from many sources. Because obtaining trustworthy information is an integral component of conflict resolution, various strategies exist for ensuring this information's credibility. Some of these strategies are specific to conflict resolution; others are common to all kinds of information gathering. But before such strategies can be discussed, it is important to understand the challenges that conflict situations pose to credibility. What reasons might justify questioning the credibility of a given body of information?
First, and most obvious, is the fact that the information is conflict information. In any given conflict, when two (or more) sides are at odds, information has powerful strategic value. Parties will want to use information to aid their cause and to discredit information that comes from the opposing side. This is sometimes called "adversary science," especially with regard to controversial technical facts. In these cases, parties to a conflict employ sympathetic experts and facts that support their side. Everyone wants to make the strongest case they can in order to get what they want. The result is a default position of mutual distrust when it comes to the other side's factual claims and arguments. This situation is almost guaranteed if the conflict involves the distribution of resources (as in territorial or economic disputes), since each side feels it has something significant to lose. Yet parties involved in "issue" or moral conflicts (such as gay rights or abortion) also make factual claims, and adversarial skepticism of these claims is likewise the norm.
Second, whether in a conflict or not, information is never gathered in a vacuum. There is always a context in which information is sought, and special interests are often involved. When a scientific team on the payroll of a large industrial corporation produces a report that favors loose environmental standards, skepticism may be justified. Equally questionable would be new and astounding research findings published by a research organization or interest group that stands to benefit from them. Though economic gain may be the most common motivation for special interests, the allure of peer recognition or public attention, a tactical advantage over competitors, the need to get elected or published, or simply the desire to get "even" can all skew relevant information toward bias and undermine its credibility.
Manipulating information for one's own advantage, whatever the motivation, can be accomplished in many ways. The most extreme forms are outright lying, deception, and fraud. If someone stands to gain by lying, and feels there is a good chance of getting away with it, fraud and deception are often employed. Yet more often, deception is subtle, even subconscious: estimates are made in one's own favor, background evidence is selected to fit a particular theory or story, undesirable anomalies are ignored, and implicit assumptions go unexamined. Even when data is genuine, its interpretation can be easily (and strategically) slanted. Most data is susceptible to slight misrepresentation and manipulation. The subtler the deception, the more difficult it is to detect, and the easier it is to deny if discovered.
Sanda Kaufman discusses how parties are likely to frame information in a way that favors their own interests or concerns.
Finally, information may simply be in error because of poor fact-finding methods. One must always ask whether the information is thorough, complete, and objective; whether it is applicable and up-to-date; and whether an authoritative, respected agent was responsible for collecting it. Claims resulting from unscientific collection methods, or "pseudoscience," are suspect. A pseudoscience is a knowledge system in which the technical detail and methods of its practitioners give the field the appearance of a science when in reality it is far from one. The medical industry, for example, faces the difficult task of determining which aspects of "holistic" medicine are genuinely helpful and which are pseudoscience. Cross-cultural conflicts can also involve one or both sides gathering information via culturally accepted, but scientifically unsound, methods. The effectiveness and limitations of such methods must be evaluated if the resulting information is to be useful to those outside the culture of origin.
Agreeing on the Facts
Given the many potential reasons to be skeptical of conflict information, it is not surprising that agreeing on the facts can be a real challenge, one that contributes to a conflict's intractability. Yet it is possible for conflicting parties to agree on important facts, and though this might not resolve a conflict completely, it is certainly a step in the right direction. In seeking to reach agreement on facts, it is important to realize that there are two areas to focus on: the information-gathering processes and the attitudes of the persons who must be convinced. It is unlikely that poorly gathered or biased information is going to be persuasive. At the same time, use of the best information-gathering methods does not guarantee that key people will accept the resulting information.
Regarding information-gathering methodology, the overall objective is to minimize the three negative possibilities explained above -- adversarial tactics, bias, and error -- so as to uncover, to the extent possible, the objective facts. Several formal methods have been designed specifically for conflict fact-finding, each suited to a particular scenario: a truth commission or international tribunal is used when a conflict involves serious violations of international law; scientific facts can be investigated through joint fact-finding when the parties are willing to work together; neutral fact-finding, on the other hand, works best when the parties are highly uncooperative and untrusting; and groups that lack technical expertise may require technical assistance. Each of these methods provides a forum for conflicting groups to come together to agree on both information-gathering methods and the facts themselves, while ultimately easing the general tension and mistrust between the opposing sides.
All these information-gathering methods operate under two implicit assumptions. First, they assume the parties acknowledge the general value of the scientific method. Second, they assume that key conflict figures are willing to ratify facts that have been uncovered by fair means, even if doing so could have undesirable consequences for them. Unfortunately, acquiescing to the facts may not be palatable to some. A person or group committed to a predetermined position will not be convinced by any of the various fact-finding methods. Therefore, it must be recognized that if key conflict figures truly want an improved situation, such adversarial tactics must be set aside.
Guy Burgess, Co-Director of the University of Colorado Conflict Research Consortium and the Beyond Intractability Project, describes how the internet can provide a low cost, accessible source of information for people involved in conflict.
What this really means for the disputants is that they must strike a balance between cautious skepticism and gullibility, having an open mind while not allowing themselves to be conned. This stance is called "critical thinking," and there is an ongoing debate on how to achieve this mode of thought. However, critical thinking essentially involves focusing on two aspects of factual claims: the arguments and the evidence. The critical thinker is able to recognize flawed arguments as well as evaluate the adequacy and applicability of evidence. Importantly, they do this with their own arguments as well as with the claims of others. A critical thinker will, by definition, assent to the results of fair fact-finding methods. In addition, such a person is generally in a better position to obtain trustworthy information in the first place, being skilled at filtering out lies and inaccuracies. In this respect, use of critical thinking may aid in preventing factual disputes, or may be valuable when formal fact-finding methods are used.
The importance of critical thinking to conflict is illustrated by the fallacy of argumentum ad hominem: to judge an argument by its source (whether a person or a group) is a mistake. The fact that a party (whether a large industrial corporation, a religious group, or an environmental activist organization) has a special interest in a certain position does not automatically make its claims false. Though conflicts of interest are a factor, claims must also be judged on their own merits, evaluated by the validity of the arguments and the strength of the evidence. If adopted by both sides of a conflict, this one insight of critical thinking will incline all parties toward acceptance of the facts -- your opponent, though you may not like them, may be right after all. Such an attitude paves the way for successful fact-finding.
The most effective step in obtaining credible information in a conflict situation is for the parties to take a step back from the dispute and be receptive to facts. Disputants who do this, who set aside their adversarial tactics, put in motion the pursuit of a common goal with other like-minded individuals, quite possibly including those on the other side of the conflict. In doing so, the human barriers to obtaining trustworthy information are breached. Though technical challenges still exist, overcoming this human challenge is the first step to a successful fact-finding mission.
 Frauds, lies, and deceptions are actually quite common in conflicts. Obvious examples are wartime propaganda, which is nearly universally practiced, and disinformation campaigns that can happen at the international/political level. In recent times, top business executives have been caught lying to their shareholders, the government and their employees about company financial status (for example, Enron and WorldCom). Even scientific activities are not immune (see references regarding "Conduct and Misconduct in Science" and the Piltdown Man hoax).
 What is often called "alternative medicine" or "holistic medicine" is not clearly defined, so it is difficult to tell authentic remedies from "snake oil" hoaxes. Some practices, such as acupuncture and massage therapy, have achieved a fair amount of recognition from the medical establishment as a result of extensive studies, while others such as homeopathic drugs and some herbal remedies prove to be devoid of benefit when subjected to the scrutiny of the laboratory. For more information, see The National Council Against Health Fraud Web site, listed in the references.
 It must be granted that the scientific method may not be the only way to find facts. Alternative ways of knowing may exist, and the emphasis placed on the scientific method in this essay is not intended to nullify the possible value of such alternatives. It remains the case, though, that the scientific method (broadly construed) offers a means of settling factual disputes that is widely accepted as fair.
 Critical thinking receives a great deal of attention these days. It has become an education buzzword, especially as increasing numbers of schools and universities in the United States adopt "critical thinking" requirements. In general, the focus on critical thinking in education is a focus on teaching students transferable thinking skills rather than merely loading their brains with facts and figures -- teaching how to think instead of what to think. The concept itself has been defined in several ways: John Dewey defined it as "active, persistent, and careful consideration of a belief or supposed form of knowledge in light of the grounds which support it and the further conclusions to which it tends," while Robert Ennis defined it as "reasonable, reflective thinking that is focused on what to believe or do." The underlying thread in all critical thinking, though, is a demand for rational examination of the arguments and evidence for any given claim before assenting to belief or disbelief.
Use the following to cite this article:
Schultz, Norman. "Obtaining Trustworthy Information." Beyond Intractability. Eds. Guy Burgess and Heidi Burgess. Conflict Information Consortium, University of Colorado, Boulder. Posted: January 2004 <http://www.beyondintractability.org/essay/obtaining-trustworthy-info>.