Belief Systems and Social Perception Structures

Logic and Fallacies | Influence and Persuasion | Suppression of Sound Ideas

There is nothing men more readily give themselves to than pushing their own beliefs.
When ordinary means fail, they add commandment, violence, fire and sword.
(Michel Eyquem de Montaigne - French Philosopher 1533-1592)

belief: Whatever an individual is willing to accept without direct verification by experience or without the support of evidence, resulting in an assumption which is taken as a basis for action or non-action.

Do You Believe In ________?

"Belief drives behavior, but often belief is not based on experience and so does not reach or reflect the intimately lived dimension of human existence. Indeed, the very nature of belief precludes the necessity of experience. Belief does not merely dispense with the evidence of experience, it can go further and deny the evidence of experience. And it often does. Therein lies the power of belief. Belief is motivation by reliance on an assigned version of reality or an assigned version of what might be imagined. Ultimately, the problem introduced by belief is not a matter of believing versus non-believing, because annulment of the will to believe is not possible. The true conflict here is between believing and learning. "The unexamined belief is not worth holding." True enough, but the examined belief may not be worth holding, either. A great many beliefs, once they are examined, may prove to be worthless as indicators of truth or guides to experience, although they may serve to define identity and confer a sense of belonging." 

"Some things are proposed to have certain properties which may be logically inconsistent, and hence these things can be proved not to exist."
                  Dr. Niclas Berggren, from "A Note on the Concept of Belief"

"I know what I believe. I will continue to articulate what I believe 
and what I believe — I believe what I believe is right." —George W. Bush, in Rome, July 22, 2001


Belief - mental acceptance of a proposition, statement, or fact as true, on the ground of apparent authority, which does not have to be based on actual fact. Assent to a proposition or affirmation, or the acceptance of a fact, opinion, or assertion as real or true, without immediate personal knowledge; reliance upon word or testimony; partial or full assurance without positive knowledge or absolute certainty; persuasion; conviction; confidence; as, belief of a witness; the belief of our senses. Also: a religious doctrine that is proclaimed as true without proof [syn: dogma, tenet].

Something believed, i.e., accepted as true. Example: Most religions of the world hold the belief that the universe was created by a divine, unseen being.

Experiential knowledge always trumps a belief having no basis in actual experience. 

Hysteron Proteron: The logical fallacy of assuming as true and using as a premise a proposition that is yet to be proved.


"A Note on The Concept of Belief"

"We may choose in any evaluative process of thought to adopt the set of criteria which we later use to judge fact claims. But the central thing to note here is that by rational people these criteria are not chosen to correspond to what beliefs they wish to hold. They choose the criteria a priori that in some sense fulfill their need to know things about the world in the best manner. They do not choose the criteria a priori that lead to certain, specific beliefs: the criteria are general and universal and are adopted to be applicable to all judgments of fact claims. Being able to choose irrationally is not the same as wanting to do so ... the criterion of faith is about accepting fact claims without or even in opposition to available evidence.

In short, it is an irrational criterion to use for gathering knowledge."

Why is it irrational? The reason is that this criterion for judging fact claims is unable to discriminate between competing fact claims in a rational manner (i.e., by discussing evidence pro et con). In other words, it leads to un-falsifiable fact claims. 

If you accept the fact claim "God exists" without or even in opposition to evidence, then how can you demonstrate that the mutually exclusive fact claims "Allah exists", "Zeus exists", "Krishna exists" and "Thor exists" are false? You cannot. The general problem with choosing to use an irrational criterion for assessing fact claims is that one is not concerned with the issue of truth but rather with other issues, such as feeling good. This is not done on a conscious level of thought.

A related problem with the Christian process of belief formation is the tendency to disregard all evidence which is contrary to the desired belief. In other words, it is not just that the criterion for judging facts accepts beliefs without or even in opposition to all available evidence, it is also the case that all available evidence is not taken into consideration. The wish to retain a certain belief - that an external God exists - for pragmatic reasons, rather than truth reasons, is evidently so strong as to override all rationality concerns."

        Dr. Niclas Berggren from "A Note on the Concept of Belief"

Creation of Belief Systems

"Within social structures, social interaction takes place. This social interaction is presented in the form of text/discourse, which is then cognizized according to a cognitive system/memory. This "system/memory" consists of short-term memory, in which "strategic process," or decoding and interpretation takes place. Long-term memory, however, serves as a holder of "socio-cultural knowledge," which consists of knowledge of language, discourse, communication, persons, groups and events-existing in the form of "scripts." "Social (group) attitudes" also reside within long-term memory and provide further decoding guides. Each of these "group attitudes" can represent an array of ideologies which combine to create one's own personal ideology which conforms to one's identity, goals, social position, values and resources.

This "process" of framing "beliefs and opinions" that benefit one particular group, says Van Dijk, is not final. "Some people may be forced or persuaded, socially or economically" to go against their "best interests."

- from Critical Discourse Analysis, ©1995 Brett Dellinger.  Related Links: Discourse Analysis | Social Cognition and Organization of Knowledge | The Sociology of Knowledge | Experience

credibility: Trust conferred on the source of a belief, rather than on the substance of the belief itself.

aligned belief: chosen after careful consideration of options or alternatives.
assigned belief: A belief acquired from one’s familial, cultural and religious background and accepted like a task or role assigned to the believer, rather than chosen on a voluntary basis.
blind belief: refuses to be questioned or examined. Contrast to open belief.
compound belief: combines various modes of belief in the same syndrome.
conflicted belief: contains contradictory and opposing elements that confuse the believer.
conflictual belief: compels the believer into antagonism toward others.
consensual belief: held by consent rather than chosen with deliberation. We consent to believe what others believe. Here the primary appeal of the belief may consist in the fact that many others hold it. The mainstream religions of the world depend on consensus rather than upon individual deliberation and choice. To consent to believe something is not to choose to believe it, but rather to join company with those who believe it. The primary accent of consensual belief is inclusion in a group.
corporate belief: belongs to a program or agenda and serves the ends proposed in that program or agenda.
default belief: held due to lack of considering any alternatives.
deliberated belief: chosen by a process of considering and evaluating options. Synonymous with aligned belief.

dereasoning: The process of separating the reasons and conditions for adopting a belief from its truth value.
dereasoned belief: deprived of its original properties by the process of dereasoning, i.e., isolating the conditions and reasons for holding a belief and thus reducing it to its inherent truth value, if it has any.
dissenting belief: deliberately opposed to conventional and established beliefs.
doctrinal belief: based on predefined dogmas or doctrines. Contrast to intuitive belief.
ethical belief: relates to a way of behaving or prescribes a code of behavior.
extremist belief: enacted in uncompromising or fanatical behavior. Often associated with violence, if not directly used as a justification for violence.
fundamentalist belief: received from a tradition and not allowed to be altered or questioned.
heretic belief: chosen in direct opposition to a widely accepted belief.
humanist belief: based on the assumption that human intelligence is the best author of convictions, without need of attributing beliefs and rules for living to a superhuman agency.
ideological belief: expressed in ideological form, that is, in a systematic body of abstractions or formal ideas.
imperative belief: stated in a flat non-narrative form.
latent belief: held but not enacted.
ludic belief: able to be modified by playing with it.


Earth Culture Conceptual Variations/Distortions Regarding Knowledge

Note: "Modern Psychology" has had over 200 theories of personality - that should tell you something right there - they haven't a clue. The conceptual dynamic on Earth relative to the subject of knowledge is somewhat similar, in that it is set in the context of a body-ID material, linear reality, and conceived of from the middle ages to the present time. In actuality, in terms of their list, knowledge involves several statements below simultaneously. The attempt to define only one approach was intended to further obscure evolution of personal perspective, in order to maintain the status quo over time.


"I'm only human--I'm just a man/woman.
Help me believe in what I believe and all that I am."

Yeah...right...silly child...wake up from the dream!

"The Bible tells us to 'be like God', and then on page after page it describes God as a mass murderer. This may be the single most important key to the political behavior of Western Civilization." 
                                                                        - Robert Anton Wilson

See related link discussions:  Morality and Religion  and Christianity


Various Internet Essays of Interest:

On Belief Systems and Learning
The Nature of Belief Systems
What is a Belief State?
Alternative Analysis of Mass Belief Systems
Belief Systems in Africa
Belief Coercion in Religious Groups
Belief Without Evidence
What Is Belief?
Belief and Knowledge
Definition of Cognitive Distortions
Definition and Meaning
Psychiatry As A Modern Belief System
Reality, Belief and the Mind (Good)
The Fixation of Belief
Core Beliefs
Dogma and Belief: Famous Quotes
The Absence of Belief
Metapsychology: The Un-Belief System
The Biology of Belief
Excellence in Critical Thinking
Thought Contagion
My Reasons for Being an Atheist
The Culture of Cults


Definition of Cognitive Distortions:
(See also Taboos in the Paradigm areas)

Cognitive distortions are logical, but they are not rational. They can create real difficulty with your thinking. See if you are doing any of the ten common distortions that people use. Rate yourself from one to ten with one being low and ten being high. Ask yourself if you can stop using the distortions and think in a different way.

ALL-OR-NOTHING THINKING: You see things in black-and-white categories. If your performance falls short of perfect, you see yourself as a total failure.

OVERGENERALIZATION: You see a single negative event as a never-ending pattern of defeat.

MENTAL FILTER: You pick out a single negative detail and dwell on it exclusively so that your vision of all reality becomes darkened, like the drop of ink that discolors the entire beaker of water.

DISQUALIFYING THE POSITIVE: You reject positive experiences by insisting they "don't count" for some reason or other. In this way you can maintain a negative belief that is contradicted by your everyday experiences.

JUMPING TO CONCLUSIONS: You make a negative interpretation even though there are no definite facts that convincingly support your conclusion.

MIND READING: You arbitrarily conclude that someone is reacting negatively to you, and you don't bother to check this out.

THE FORTUNETELLER ERROR: You anticipate that things will turn out badly, and you feel convinced that your prediction is an already-established fact.

MAGNIFICATION (CATASTROPHIZING) OR MINIMIZATION: You exaggerate the importance of things (such as your goof-up or someone else's achievement), or you inappropriately shrink things until they appear tiny (your own desirable qualities or the other fellow's imperfections). This is also called the binocular trick.

EMOTIONAL REASONING: You assume that your negative emotions necessarily reflect the way things really are: "I feel it, therefore it must be true."

SHOULD STATEMENTS: You try to motivate yourself with should and shouldn't, as if you had to be whipped and punished before you could be expected to do anything. "Musts" and "oughts" are also offenders. The emotional consequence is guilt. When you direct should statements toward others, you feel anger, frustration, and resentment.

LABELING AND MISLABELING: This is an extreme form of overgeneralization. Instead of describing your error, you attach a negative label to yourself: "I'm a loser." When someone else's behavior rubs you the wrong way, you attach a negative label to him: "He's a Goddamn louse." Mislabeling involves describing an event with language that is highly colored and emotionally loaded.

PERSONALIZATION: You see yourself as the cause of some negative external event, which in fact you were not primarily responsible for.

Some Interesting Quotes


"The paradox of our era is that we extend toleration to systems of belief that
are themselves intrinsically intolerant and abhorrent to modern consciousness."

"A belief is an idea that is held based on some support - even if that support is the result of prior fabrication by someone else who needs one to believe as he does..."
"To believe in something is not the same as knowing something. Intrinsic to the concept of belief is the implication that there is an opposite to belief: disbelief. Not everyone will believe something is true, but all sane and rational people will acknowledge an observable fact."

"Belief is based only on unconfirmed information, so the person declaring the belief is always hedging his/her bet as to whether the belief is 'correct', and seeks the company of those who 'believe' and seeks to separate those who don't, with the strongest beliefs attaching themselves to concepts of identity and the apparent nature of the reality around them, with a peculiar preference for religions, 'belief' in external god figures and more."

"Religion, in its essence, is thus not a scheme of conduct, but a theory of causes. What brought it into the world in the remote days I try to conjure up by hypotheses in Section I were man's eternal wonder and his eternal hope. It represents one of his 'boldest efforts' to 'penetrate the unknowable', to 'put down the intolerable', to 'refashion the universe nearer to his heart's desire'. My belief is that it is a poor device to that end--that when it is examined objectively it testifies to his lack of sense quite as much as to his high striving. But that belief is just a belief. The immense interest and importance of the thing itself remains." H.L. Mencken, Treatise on the Gods (NY: Alfred A. Knopf, 1930, revised 1946)   In other words, religion is a mental construct based on a belief system, not objective reality.  For more background, click here.


The Eventual Result At Some Point

 "Kill the disbelievers!"
 (Typical comment from a 'believer') = Planetary Discord, Terrorism, Violence, Ethnic Cleansing, etc.


Logic & Fallacies

A fallacy is, very generally, an error in reasoning. This differs from a factual error, which is simply being wrong about the facts. To be more specific, a fallacy is an "argument" in which the premises given for the conclusion do not provide the needed degree of support. 

A deductive fallacy is a deductive argument that is invalid (it is such that it could have all true premises and still have a false conclusion). 

An inductive fallacy is less formal than a deductive fallacy. Inductive fallacies are simply "arguments" which appear to be inductive arguments, but the premises do not provide enough support for the conclusion. In such cases, even if the premises were true, the conclusion would not be more likely to be true.

Visit  Fallacy Tutorial Pro 3.0  online.

Logic and Fallacies
by Mathew 1995-1997


There's a lot of debate on the net. Unfortunately, much of it is of very low quality. The aim of this document is to explain the basics of logical reasoning, and hopefully improve the overall quality of debate.

The Concise Oxford English Dictionary defines logic as "the science of reasoning, proof, thinking, or inference". Logic will let you analyze an argument or a piece of reasoning, and work out whether it is likely to be correct or not. You don't need to know logic to argue, of course; but if you know even a little, you'll find it easier to spot invalid arguments.

There are many kinds of logic, such as fuzzy logic and constructive logic; they have different rules, and different strengths and weaknesses. This document discusses simple Boolean logic, because it's commonplace and relatively easy to understand. When people talk about something being 'logical', they usually mean the type of logic described here. 

What Logic Is Not

It's worth mentioning a couple of things which logic is not.

Firstly, logical reasoning is not an absolute law which governs the universe. Many times in the past, people have concluded that because something is logically impossible (given the science of the day), it must be impossible, period. It was also believed at one time that Euclidean geometry was a universal law; it is, after all, logically consistent. Again, we now know that the rules of Euclidean geometry are not universal.

Secondly, logic is not a set of rules which govern human behavior. Humans may have logically conflicting goals. For example:

  1. John wishes to speak to whoever is in charge

  2. The person in charge is Steve

  3. Therefore John wishes to speak to Steve

Unfortunately, John may have a conflicting goal of avoiding Steve, meaning that the reasoned answer may be inapplicable to real life.

This document only explains how to use logic; you must decide whether logic is the right tool for the job. There are other ways to communicate, discuss and debate.


An argument is, to quote the Monty Python sketch, "a connected series of statements to establish a definite proposition".

Many types of argument exist; we will discuss the deductive argument. Deductive arguments are generally viewed as the most precise and the most persuasive; they provide conclusive proof of their conclusion, and are either valid or invalid.

Deductive arguments have three stages: premises, inference, and conclusion. However, before we can consider those stages in detail, we must discuss the building blocks of a deductive argument: propositions.


A proposition is a statement which is either true or false. The proposition is the meaning of the statement, not the precise arrangement of words used to convey that meaning.

For example, "There exists an even prime number greater than two" is a proposition. (A false one, in this case.) "An even prime number greater than two exists" is the same proposition, re-worded.

Unfortunately, it's very easy to unintentionally change the meaning of a statement by rephrasing it. It's generally safer to consider the wording of a proposition as significant.

It's possible to use formal linguistics to analyze and re-phrase a statement without changing its meaning; but how to do so is outside the scope of this document.


A deductive argument always requires a number of core assumptions. These are called premises, and are the assumptions the argument is built on; or to look at it another way, the reasons for accepting the argument. Premises are only premises in the context of a particular argument; they might be conclusions in other arguments, for example.

You should always state the premises of the argument explicitly; this is the principle of audiatur et altera pars. Failing to state your assumptions is often viewed as suspicious, and will likely reduce the acceptance of your argument.

The premises of an argument are often introduced with words such as "Assume...", "Since...", "Obviously..." and "Because...". It's a good idea to get your opponent to agree with the premises of your argument before proceeding any further.

The word "obviously" is also often viewed with suspicion. It occasionally gets used to persuade people to accept false statements, rather than admit that they don't understand why something is 'obvious'. So don't be afraid to question statements which people tell you are 'obvious' -- when you've heard the explanation you can always say something like "You're right, now that I think about it that way, it is obvious."


Once the premises have been agreed, the argument proceeds via a step-by-step process called inference.

In inference, you start with one or more propositions which have been accepted; you then use those propositions to arrive at a new proposition. If the inference is valid, that proposition should also be accepted. You can use the new proposition for inference later on.

So initially, you can only infer things from the premises of the argument. But as the argument proceeds, the number of statements available for inference increases.

There are various kinds of valid inference - and also some invalid kinds, which we'll look at later in this document. Inference steps are often identified by phrases like "therefore..." or "...implies that..."


Hopefully you will arrive at a proposition which is the conclusion of the argument - the result you are trying to prove. The conclusion is the result of the final step of inference. It's only a conclusion in the context of a particular argument; it could be a premise or assumption in another argument.

The conclusion is said to be affirmed on the basis of the premises, and the inference from them. This is a subtle point which deserves further explanation.

Implication in detail

Clearly you can build a valid argument from true premises, and arrive at a true conclusion. You can also build a valid argument from false premises, and arrive at a false conclusion.

The tricky part is that you can start with false premises, proceed via valid inference, and reach a true conclusion. For example:

  1. Premise: All mammals can fly

  2. Premise: Bats are mammals

  3. Conclusion: Therefore bats can fly

The first premise is false, yet the inference is valid and the conclusion happens to be true.

There's one thing you can't do, though: start from true premises, proceed via valid deductive inference, and reach a false conclusion.

We can summarize these results as a "truth table" for implication. The symbol "=>" denotes implication; "A" is the premise, "B" the conclusion. "T" and "F" represent true and false respectively.

Truth Table for Implication

  A    B    A => B
  F    F      T
  F    T      T
  T    F      F
  T    T      T
So the fact that an argument is valid doesn't necessarily mean that its conclusion holds -- it may have started from false premises.

If an argument is valid, and in addition it started from true premises, then it is called a sound argument. A sound argument must arrive at a true conclusion.
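The implication table above can be checked mechanically. The sketch below (my own illustration, plain Python with only the standard library) enumerates every truth assignment and confirms that the only case where "A => B" fails is a true premise with a false conclusion:

```python
from itertools import product

def implies(a, b):
    # Material implication: "A => B" is false only when A is true and B is false.
    return (not a) or b

# Print the full truth table for implication.
for a, b in product([False, True], repeat=2):
    print(a, b, implies(a, b))

# Collect the rows where the implication fails.
failing = [(a, b) for a, b in product([False, True], repeat=2)
           if not implies(a, b)]
print(failing)  # [(True, False)]
```

The single failing row (A true, B false) is exactly the "true premises, false conclusion" combination that a valid deductive inference rules out.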

Example argument

Here's an example of an argument which is valid, and which may or may not be sound:

  1. Premise: Every event has a cause

  2. Premise: The universe has a beginning

  3. Premise: All beginnings involve an event

  4. Inference: This implies that the beginning of the universe involved an event

  5. Inference: Therefore the beginning of the universe had a cause

  6. Conclusion: The universe had a cause

The proposition in line 4 is inferred from lines 2 and 3. Line 1 is then used, with the proposition derived in line 4, to infer a new proposition in line 5. The result of the inference in line 5 is then re-stated (in slightly simplified form) as the conclusion.
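The step-by-step inference just described can be mimicked by a tiny forward-chaining loop. This is purely an illustration of the inference process, not part of the original text; the proposition strings and the rule encoding are my own invention:

```python
# Premise 2 as a starting fact.
facts = {"the universe has a beginning"}

# Each rule maps a set of antecedent propositions to a consequent proposition.
rules = [
    # Premise 3 applied: all beginnings involve an event.
    ({"the universe has a beginning"},
     "the beginning of the universe involved an event"),
    # Premise 1 applied: every event has a cause.
    ({"the beginning of the universe involved an event"},
     "the beginning of the universe had a cause"),
]

# Forward chaining: keep applying rules until no new propositions appear.
changed = True
while changed:
    changed = False
    for antecedents, consequent in rules:
        if antecedents <= facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print("the beginning of the universe had a cause" in facts)  # True
```

Note that, just as in the prose version, the loop only establishes the conclusion relative to the premises: if a premise is false, the derived propositions may be false too.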

Spotting arguments

Spotting an argument is harder than spotting premises or a conclusion. Lots of people shower their writing with assertions, without ever producing anything you might reasonably call an argument.

Sometimes arguments don't follow the pattern described above. For example, people may state their conclusions first, and then justify them afterwards. This is valid, but it can be a little confusing.

To make the situation worse, some statements look like arguments but aren't. For example:

"If the Bible is accurate, Jesus must either have been insane, an evil liar, or the Son of God."

That's not an argument; it's a conditional statement. It doesn't state the premises necessary to support its conclusion, and even if you add those assertions it suffers from a number of other flaws which are described in more detail in the Atheist Arguments document.

An argument is also not the same as an explanation. Suppose that you are trying to argue that Albert Einstein believed in God, and say:

"Einstein made his famous statement 'God does not play dice' because of his belief in God."

That may look like a relevant argument, but it's not; it's an explanation of Einstein's statement. To see this, remember that a statement of the form "X because Y" can be re-phrased as an equivalent statement, of the form "Y therefore X". Doing so gives us:

"Einstein believed in God, therefore he made his famous statement 'God does not play dice'."

Now it's clear that the statement, which looked like an argument, is actually assuming the result which it is supposed to be proving, in order to explain the Einstein quote.

Furthermore, Einstein did not believe in a personal God concerned with human affairs -- again, see the Atheist Arguments document.

Further reading

We've outlined the structure of a sound deductive argument, from premises to conclusion. But ultimately, the conclusion of a valid logical argument is only as compelling as the premises you started from. Logic in itself doesn't solve the problem of verifying the basic assertions which support arguments; for that, we need some other tool.

The dominant means of verifying basic assertions is scientific enquiry. However, the philosophy of science and the scientific method are huge topics which are quite beyond the scope of this document.

For a more comprehensive introduction to logic, try Flew's "Thinking Straight", listed in the Atheist Media document. A much more detailed book is Copi's "Introduction to Logic". The Electronic Resources document also lists LOGIC-L, a LISTSERV mailing list devoted to discussing the teaching of elementary logic.


There are a number of common pitfalls to avoid when constructing a deductive argument; they're known as fallacies. In everyday English, we refer to many kinds of mistaken beliefs as fallacies; but in logic, the term has a more specific meaning: a fallacy is a technical flaw which makes an argument unsound or invalid.

(Note that you can criticize more than just the soundness of an argument. Arguments are almost always presented with some specific purpose in mind -- and the intent of the argument may also be worthy of criticism.)

Arguments which contain fallacies are described as fallacious. They often appear valid and convincing; sometimes only close inspection reveals the logical flaw.

Below is a list of some common fallacies, and also some rhetorical devices often used in debate. The list isn't intended to be exhaustive; the hope is that if you learn to recognize some of the more common fallacies, you'll be able to avoid being fooled by them.

The Nizkor Project has another excellent list of logical fallacies; Stephen Downes maintains a list too. The reference works mentioned above also all contain fallacy lists.

Sadly, many of the examples below have been taken directly from Usenet, though some have been rephrased for the sake of clarity.

List of fallacies


Accent

Accent is a form of fallacy through shifting meaning. In this case, the meaning is changed by altering which parts of a statement are emphasized. For example:

"We should not speak ill of our *friends*"

"We should not speak *ill* of our friends"

Be particularly wary of this fallacy on the net, where it's easy to misread the emphasis of what's written.

Ad hoc

As mentioned earlier, there is a difference between argument and explanation. If we're interested in establishing A, and B is offered as evidence, the statement "A because B" is an argument. If we're trying to establish the truth of B, then "A because B" is not an argument, it's an explanation.

The Ad Hoc fallacy is to give an after-the-fact explanation which doesn't apply to other situations. Often this ad hoc explanation will be dressed up to look like an argument. For example, if we assume that God treats all people equally, then the following is an ad hoc explanation:

"I was healed from cancer."

"Praise the Lord, then. He is your healer."

"So, will He heal others who have cancer?"

"Er... The ways of God are mysterious."

Affirmation of the consequent

This fallacy is an argument of the form "A implies B, B is true, therefore A is true". To understand why it is a fallacy, examine the truth table for implication given earlier. Here's an example:

"If the universe had been created by a supernatural being, we would see order and organization everywhere. And we do see order, not randomness -- so it's clear that the universe had a creator."

This is the converse of Denial of the Antecedent.
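Affirmation of the consequent can be exposed with the same truth-table machinery referred to above: search for an assignment where both premises ("A implies B" and "B") hold while the conclusion ("A") is false. This short check (my own illustration, standard library only) finds exactly one such counterexample:

```python
from itertools import product

def implies(a, b):
    # Material implication: false only when a is true and b is false.
    return (not a) or b

# Premises: "A implies B" and "B". Conclusion: "A".
# A counterexample is any assignment where both premises hold but A is false.
counterexamples = [(a, b) for a, b in product([False, True], repeat=2)
                   if implies(a, b) and b and not a]
print(counterexamples)  # [(False, True)]
```

Because a counterexample exists (A false, B true), the premises can be true while the conclusion is false, which is precisely what makes the inference invalid.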


Amphiboly

Amphiboly occurs when the premises used in an argument are ambiguous because of careless or ungrammatical phrasing. For example:

"Premise: Belief in God fills a much-needed gap."

Anecdotal evidence

One of the simplest fallacies is to rely on anecdotal evidence. For example:

"There's abundant proof that God exists and is still performing miracles today. Just last week I read about a girl who was dying of cancer. Her whole family went to church and prayed for her, and she was cured."

It's quite valid to use personal experience to illustrate a point; but such anecdotes don't actually prove anything to anyone. Your friend may say he met Elvis in the supermarket, but those who haven't had the same experience will require more than your friend's anecdotal evidence to convince them.

Anecdotal evidence can seem very compelling, especially if the audience wants to believe it. This is part of the explanation for urban legends; stories which are verifiably false have been known to circulate as anecdotes for years.

Argumentum ad antiquitatem

This is the fallacy of asserting that something is right or good simply because it's old, or because "that's the way it's always been." The opposite of Argumentum ad Novitatem.

"For thousands of years Christians have believed in Jesus Christ. Christianity must be true, to have persisted so long even in the face of persecution."

Argumentum ad baculum / Appeal to force

An Appeal to Force happens when someone resorts to force (or the threat of force) to try and push others to accept a conclusion. This fallacy is often used by politicians, and can be summarized as "might makes right". The threat doesn't have to come directly from the person arguing. For example:

"... Thus there is ample proof of the truth of the Bible. All those who refuse to accept that truth will burn in Hell."

"... In any case, I know your phone number and I know where you live. Have I mentioned I am licensed to carry concealed weapons?"

Argumentum ad crumenam

The fallacy of believing that money is a criterion of correctness; that those with more money are more likely to be right. The opposite of Argumentum ad Lazarum. Example:

"Microsoft software is undoubtedly superior; why else would Bill Gates have got so rich?"

Argumentum ad hominem

Argumentum ad hominem literally means "argument directed at the man"; there are two varieties.

The first is the abusive form. If you refuse to accept a statement, and justify your refusal by criticizing the person who made the statement, then you are guilty of abusive argumentum ad hominem. For example:

"You claim that atheists can be moral -- yet I happen to know that you abandoned your wife and children."

This is a fallacy because the truth of an assertion doesn't depend on the virtues of the person asserting it. A less blatant argumentum ad hominem is to reject a proposition based on the fact that it was also asserted by some other easily criticized person. For example:

"Therefore we should close down the church? Hitler and Stalin would have agreed with you."

A second form of argumentum ad hominem is to try and persuade someone to accept a statement you make, by referring to that person's particular circumstances. For example:

"Therefore it is perfectly acceptable to kill animals for food. I hope you won't argue otherwise, given that you're quite happy to wear leather shoes."

This is known as circumstantial argumentum ad hominem. The fallacy can also be used as an excuse to reject a particular conclusion. For example:

"Of course you'd argue that positive discrimination is a bad thing. You're white."

This particular form of Argumentum ad Hominem, when you allege that someone is rationalizing a conclusion for selfish reasons, is also known as "poisoning the well".

It's not always invalid to refer to the circumstances of an individual who is making a claim. If someone is a known perjurer or liar, that fact will reduce their credibility as a witness. It won't, however, prove that their testimony is false in this case. It also won't alter the soundness of any logical arguments they may make.

Argumentum ad ignorantiam

Argumentum ad ignorantiam means "argument from ignorance". The fallacy occurs when it's argued that something must be true, simply because it hasn't been proved false. Or, equivalently, when it is argued that something must be false because it hasn't been proved true.

(Note that this isn't the same as assuming something is false until it has been proved true. In law, for example, you're generally assumed innocent until proven guilty.)

Here are a couple of examples:

"Of course the Bible is true. Nobody can prove otherwise."

"Of course telepathy and other psychic phenomena do not exist. Nobody has shown any proof that they are real."

In scientific investigation, if it is known that an event would produce certain evidence of its having occurred, the absence of such evidence can validly be used to infer that the event didn't occur. It does not prove it with certainty, however.

For example:

"A flood as described in the Bible would require an enormous volume of water to be present on the earth. The earth doesn't have a tenth as much water, even if we count that which is frozen into ice at the poles. Therefore no such flood occurred."

It is, of course, possible that some unknown process occurred to remove the water. Good science would then demand a plausible testable theory to explain how it vanished.

Of course, the history of science is full of logically valid bad predictions. In 1893, the Royal Academy of Science were convinced by Sir Robert Ball that communication with the planet Mars was a physical impossibility, because it would require a flag as large as Ireland, which it would be impossible to wave.

[Fortean Times, Number 82.]


See also Shifting the Burden of Proof.

Argumentum ad lazarum

The fallacy of assuming that someone poor is sounder or more virtuous than someone who's wealthier. This fallacy is the opposite of the Argumentum ad Crumenam. For example:

"Monks are more likely to possess insight into the meaning of life, as they have given up the distractions of wealth."

Argumentum ad logicam

This is the "fallacy fallacy" of arguing that a proposition is false because it has been presented as the conclusion of a fallacious argument. Remember always that fallacious arguments can arrive at true conclusions.

"Take the fraction 16/64. Now, cancelling a six on top and a six on the bottom, we get that 16/64 = 1/4."

"Wait a second! You can't just cancel the six!"

"Oh, so you're telling us 16/64 is not equal to 1/4, are you?"
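The dialogue is easy to check numerically: the "cancel a six" method is bogus, yet its conclusion happens to be true, and rejecting the conclusion because the argument was fallacious is itself the fallacy. A quick illustrative sketch in Python (not part of the original text):

```python
from fractions import Fraction

# The "cancel a six" method is invalid, but its conclusion here is true:
assert Fraction(16, 64) == Fraction(1, 4)

# The same invalid method gives a false result elsewhere; "cancelling
# the ones" in 13/31 would give 3/3 = 1, which is wrong:
assert Fraction(13, 31) != Fraction(3, 3)

print("fallacious method, true conclusion")
```

The lesson stands: a fallacious derivation tells you nothing either way about the truth of its conclusion.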

Argumentum ad misericordiam

This is the Appeal to Pity, also known as Special Pleading. The fallacy is committed when someone appeals to pity for the sake of getting a conclusion accepted. For example:

"I did not murder my mother and father with an axe! Please don't find me guilty; I'm suffering enough through being an orphan."

Argumentum ad nauseam

This is the incorrect belief that an assertion is more likely to be true, or is more likely to be accepted as true, the more often it is heard. So an Argumentum ad Nauseam is one that employs constant repetition in asserting something; saying the same thing over and over again until you're sick of hearing it.

On Usenet, your argument is often less likely to be heard if you repeat it over and over again, as people will tend to put you in their kill files.

Argumentum ad novitatem

This is the opposite of the Argumentum ad Antiquitatem; it's the fallacy of asserting that something is better or more correct simply because it is new, or newer than something else.

"BeOS is a far better choice of operating system than OpenStep, as it has a much newer design."

Argumentum ad numerum

This fallacy is closely related to the argumentum ad populum. It consists of asserting that the more people who support or believe a proposition, the more likely it is that that proposition is correct. For example:

"The vast majority of people in this country believe that capital punishment has a noticeable deterrent effect. To suggest that it doesn't in the face of so much evidence is ridiculous."

"All I'm saying is that thousands of people believe in pyramid power, so there must be something to it."

Argumentum ad populum

This is known as Appealing to the Gallery, or Appealing to the People. You commit this fallacy if you attempt to win acceptance of an assertion by appealing to a large group of people. This form of fallacy is often characterized by emotive language. For example:

"Pornography must be banned. It is violence against women."

"For thousands of years people have believed in Jesus and the Bible. This belief has had a great impact on their lives. What more evidence do you need that Jesus was the Son of God? Are you trying to tell those people that they are all mistaken fools?"

Argumentum ad verecundiam

The Appeal to Authority uses admiration of a famous person to try and win support for an assertion. For example:

"Isaac Newton was a genius and he believed in God."

This line of argument isn't always completely bogus; for example, it may be relevant to refer to a widely-regarded authority in a particular field, if you're discussing that subject. For example, we can distinguish quite clearly between:

"Hawking has concluded that black holes give off radiation"

and

"Penrose has concluded that it is impossible to build an intelligent computer"

Hawking is a physicist, and so we can reasonably expect his opinions on black hole radiation to be informed. Penrose is a mathematician, so it is questionable whether he is well-qualified to speak on the subject of machine intelligence.

Audiatur et altera pars

Often, people will argue from assumptions which they don't bother to state. The principle of Audiatur et Altera Pars is that all of the premises of an argument should be stated explicitly. It's not strictly a fallacy to fail to state all of your assumptions; however, it's often viewed with suspicion.


Bifurcation

Also referred to as the "black and white" fallacy, bifurcation occurs if someone presents a situation as having only two alternatives, where in fact other alternatives exist or can exist. For example:

"Either man was created, as the Bible tells us, or he evolved from inanimate chemicals by pure random chance, as scientists tell us. The latter is incredibly unlikely, so..."

Circulus in demonstrando

This fallacy occurs if you assume as a premise the conclusion which you wish to reach. Often, the proposition is rephrased so that the fallacy appears to be a valid argument. For example:

"Homosexuals must not be allowed to hold government office. Hence any government official who is revealed to be a homosexual will lose his job. Therefore homosexuals will do anything to hide their secret, and will be open to blackmail. Therefore homosexuals cannot be allowed to hold government office."

Note that the argument is entirely circular; the premise is the same as the conclusion. An argument like the above has actually been cited as the reason for the British Secret Services' official ban on homosexual employees.

Circular arguments are surprisingly common, unfortunately. If you've already reached a particular conclusion once, it's easy to accidentally make it an assertion when explaining your reasoning to someone else.

Complex question / Fallacy of interrogation / Fallacy of presupposition

This is the interrogative form of Begging the Question. One example is the classic loaded question:

"Have you stopped beating your wife?"

The question presupposes a definite answer to another question which has not even been asked. This trick is often used by lawyers in cross-examination, when they ask questions like:

"Where did you hide the money you stole?"

Similarly, politicians often ask loaded questions such as:

"How long will this EU interference in our affairs be allowed to continue?"

or

"Does the Chancellor plan two more years of ruinous privatization?"

Another form of this fallacy is to ask for an explanation of something which is untrue or not yet established.

Fallacies of composition

The Fallacy of Composition is to conclude that a property shared by a number of individual items, is also shared by a collection of those items; or that a property of the parts of an object, must also be a property of the whole thing. Examples:

"The bicycle is made entirely of low mass components, and is therefore very lightweight."

"A car uses less petrochemicals and causes less pollution than a bus. Therefore cars are less environmentally damaging than buses."

Converse accident / Hasty generalization

This fallacy is the reverse of the Fallacy of Accident. It occurs when you form a general rule by examining only a few specific cases which aren't representative of all possible cases. For example:

"Jim Bakker was an insincere Christian. Therefore all Christians are insincere."

Converting a conditional

This fallacy is an argument of the form "If A then B, therefore if B then A".

"If educational standards are lowered, the quality of argument seen on the Internet worsens. So if we see the level of debate on the net get worse over the next few years, we'll know that our educational standards are still falling."

This fallacy is similar to the Affirmation of the Consequent, but phrased as a conditional statement.
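The gap in the inference can be made concrete with a two-line truth-table check. This sketch (illustrative only, not from the original text) shows that "if A then B" can be true while its converse "if B then A" is false:

```python
# Material implication: "if a then b" is false only when a is true and b is false.
def implies(a: bool, b: bool) -> bool:
    return (not a) or b

# Enumerate all truth assignments where the conditional holds
# but its converse fails.
gaps = [(a, b) for a in (True, False) for b in (True, False)
        if implies(a, b) and not implies(b, a)]
print(gaps)  # [(False, True)] -- A false, B true satisfies "A implies B" but not "B implies A"
```

In the example above: even if lowered standards imply worse debate, worse debate can have other causes, so observing it does not establish that standards fell.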

Cum hoc ergo propter hoc

This fallacy is similar to post hoc ergo propter hoc. The fallacy is to assert that because two events occur together, they must be causally related. It's a fallacy because it ignores other factors that may be the cause(s) of the events.

"Literacy rates have steadily declined since the advent of television. Clearly television viewing impedes learning."

This fallacy is a special case of the more general non causa pro causa.

Denial of the antecedent

This fallacy is an argument of the form "A implies B, A is false, therefore B is false". The truth table for implication makes it clear why this is a fallacy.

Note that this fallacy is different from Non Causa Pro Causa. That has the form "A implies B, A is false, therefore B is false", where A does not in fact imply B at all. Here, the problem isn't that the implication is invalid; rather it's that the falseness of A doesn't allow us to deduce anything about B.

"If the God of the Bible appeared to me, personally, that would certainly prove that Christianity was true. But God has never appeared to me, so the Bible must be a work of fiction."

This is the converse of the fallacy of Affirmation of the Consequent.
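As noted above, the truth table for implication shows why the inference fails; a short sketch (illustrative, not from the original text) enumerates the row that breaks it:

```python
# Denial of the antecedent: from "A implies B" and "not A", infer "not B".
def implies(a: bool, b: bool) -> bool:
    # Material implication is false only when a is true and b is false.
    return (not a) or b

# Look for a row where both premises hold yet the conclusion "not B" fails.
counterexamples = [(a, b) for a in (True, False) for b in (True, False)
                   if implies(a, b) and (not a) and b]
print(counterexamples)  # [(False, True)] -- premises are true, yet B is also true
```

When A is false, "A implies B" holds regardless of B, so the falseness of A licenses no conclusion about B: Christianity could be true even if no personal apparition ever occurs.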

The fallacy of accident / Sweeping generalization / Dicto simpliciter

A sweeping generalization occurs when a general rule is applied to a particular situation, but the features of that particular situation mean the rule is inapplicable. It's the error made when you go from the general to the specific. For example:

"Christians generally dislike atheists. You are a Christian, so you must dislike atheists."

This fallacy is often committed by people who try to decide moral and legal questions by mechanically applying general rules.

Fallacy of division

The fallacy of division is the opposite of the Fallacy of Composition. It consists of assuming that a property of some thing must apply to its parts; or that a property of a collection of items is shared by each item.

"You are studying at a rich college. Therefore you must be rich."

"Ants can destroy a tree. Therefore this ant can destroy a tree."

Equivocation / Fallacy of four terms

Equivocation occurs when a key word is used with two or more different meanings in the same argument. For example:

"What could be more affordable than free software? But to make sure that it remains free, that users can do what they like with it, we must place a license on it to make sure that will always be freely redistributable."

One way to avoid this fallacy is to choose your terminology carefully before beginning the argument, and avoid words like "free" which have many meanings.

The extended analogy

The fallacy of the Extended Analogy often occurs when some suggested general rule is being argued over. The fallacy is to assume that mentioning two different situations, in an argument about a general rule, constitutes a claim that those situations are analogous to each other.

Here's a real example from an online debate about anti-cryptography legislation:

"I believe it is always wrong to oppose the law by breaking it."

"Such a position is odious: it implies that you would not have supported Martin Luther King."

"Are you saying that cryptography legislation is as important as the struggle for Black liberation? How dare you!"

Ignoratio elenchi / Irrelevant conclusion

The fallacy of Irrelevant Conclusion consists of claiming that an argument supports a particular conclusion when it is actually logically nothing to do with that conclusion.

For example, a Christian may begin by saying that he will argue that the teachings of Christianity are undoubtedly true. If he then argues at length that Christianity is of great help to many people, no matter how well he argues he will not have shown that Christian teachings are true.

Sadly, these kinds of irrelevant arguments are often successful, because they lead people to view the supposed conclusion in a more favorable light.

The Natural Law fallacy / Appeal to Nature

The Appeal to Nature is a common fallacy in political arguments. One version consists of drawing an analogy between a particular conclusion, and some aspect of the natural world -- and then stating that the conclusion is inevitable, because the natural world is similar:

"The natural world is characterized by competition; animals struggle against each other for ownership of limited natural resources. Capitalism, the competitive struggle for ownership of capital, is simply an inevitable part of human nature. It's how the natural world works."

Another form of appeal to nature is to argue that because human beings are products of the natural world, we must mimic behavior seen in the natural world, and that to do otherwise is 'unnatural':

"Of course homosexuality is unnatural. When's the last time you saw two animals of the same sex mating?"

Robert Anton Wilson deals with this form of fallacy at length in his book "Natural Law". A recent example of "Appeal to Nature" taken to extremes is The Unabomber Manifesto.

The "No True Scotsman..." fallacy

Suppose I assert that no Scotsman puts sugar on his porridge. You counter this by pointing out that your friend Angus likes sugar with his porridge. I then say "Ah, yes, but no true Scotsman puts sugar on his porridge."

This is an example of an ad hoc change being used to shore up an assertion, combined with an attempt to shift the meaning of the words used in the original assertion; you might call it a combination of fallacies.

Non causa pro causa

The fallacy of Non Causa Pro Causa occurs when something is identified as the cause of an event, but it has not actually been shown to be the cause. For example:

"I took an aspirin and prayed to God, and my headache disappeared. So God cured me of the headache."

This is known as a false cause fallacy. Two specific forms of non causa pro causa fallacy are the cum hoc ergo propter hoc and post hoc ergo propter hoc fallacies.

Non sequitur

A non sequitur is an argument where the conclusion is drawn from premises which aren't logically connected with it. For example:

"Since Egyptians did so much excavation to construct the pyramids, they were well versed in paleontology."

(Non sequiturs are an important ingredient in a lot of humor. They're still fallacies, though.)

Petitio principii / Begging the question

This fallacy occurs when the premises are at least as questionable as the conclusion reached. Typically the premises of the argument implicitly assume the result which the argument purports to prove, in a disguised form. For example:

"The Bible is the word of God. The word of God cannot be doubted, and the Bible states that the Bible is true. Therefore the Bible must be true."

Begging the question is similar to circulus in demonstrando, where the conclusion is exactly the same as the premise.

Plurium interrogationum / Many questions

This fallacy occurs when someone demands a simple (or simplistic) answer to a complex question.

"Are higher taxes an impediment to business or not? Yes or no?"

Post hoc ergo propter hoc

The fallacy of Post Hoc Ergo Propter Hoc occurs when something is assumed to be the cause of an event merely because it happened before that event. For example:

"The Soviet Union collapsed after instituting state atheism. Therefore we must avoid atheism for the same reasons."

This is another type of false cause fallacy.

Red herring

This fallacy is committed when someone introduces irrelevant material to the issue being discussed, so that everyone's attention is diverted away from the points made, towards a different conclusion.

"You may claim that the death penalty is an ineffective deterrent against crime -- but what about the victims of crime? How do you think surviving family members feel when they see the man who murdered their son kept in prison at their expense? Is it right that they should pay for their son's murderer to be fed and housed?"

Reification / Hypostatization

Reification occurs when an abstract concept is treated as a concrete thing.

"I noticed you described him as 'evil'. Where does this 'evil' exist within the brain? You can't show it to me, so I claim it doesn't exist, and no man is 'evil'."

Shifting the burden of proof

The burden of proof is always on the person asserting something. Shifting the burden of proof, a special case of Argumentum ad Ignorantiam, is the fallacy of putting the burden of proof on the person who denies or questions the assertion. The source of the fallacy is the assumption that something is true unless proven otherwise.

For further discussion of this idea, see the "Introduction to Atheism" document.

"OK, so if you don't think the grey aliens have gained control of the US government, can you prove it?"

The slippery slope argument

This argument states that should one event occur, so will other harmful events. There is no proof made that the harmful events are caused by the first event. For example:

"If we legalize marijuana, then more people would start to take crack and heroin, and we'd have to legalize those too. Before long we'd have a nation full of drug-addicts on welfare. Therefore we cannot legalize marijuana."

Straw man

The straw man fallacy is when you misrepresent someone else's position so that it can be attacked more easily, knock down that misrepresented position, then conclude that the original position has been demolished. It's a fallacy because it fails to deal with the actual arguments that have been made.

"To be an atheist, you have to believe with absolute certainty that there is no God. In order to convince yourself with absolute certainty, you must examine all the Universe and all the places where God could possibly be. Since you obviously haven't, your position is indefensible."

The above straw man argument appears about once a week on the net. If you can't see what's wrong with it, read the "Introduction to Atheism" document.

Tu quoque

This is the famous "you too" fallacy. It occurs if you argue that an action is acceptable because your opponent has performed it. For instance:

"You're just being randomly abusive."

"So? You've been abusive too."

This is a personal attack, and is therefore a special case of Argumentum ad Hominem.

Fallacy of the Undistributed Middle / "A is based on B" fallacies / "...is a type of..." fallacies

These fallacies occur if you attempt to argue that things are in some way similar, but you don't actually specify in what way they are similar. Examples:

"Isn't history based upon faith? If so, then isn't the Bible also a form of history?"

"Islam is based on faith, Christianity is based on faith, so isn't Islam a form of Christianity?"

"Cats are a form of animal based on carbon chemistry, dogs are a form of animal based on carbon chemistry, so aren't dogs a form of cat?"

A Continuum of Influence and Persuasion

Focus of body of knowledge

Education: Many bodies of knowledge, based on scientific findings in various fields.

Advertising: Body of knowledge concerns product, competitors; how to sell and influence via legal persuasion.

Propaganda: Body of knowledge centers on political persuasion of masses of people.

Indoctrination: Body of knowledge is explicitly designed to inculcate organizational values.

Thought Reform: Body of knowledge centers on changing people without their knowledge.

Direction & degree of exchange

Education: Two-way pupil-teacher exchange encouraged.

Advertising: Exchange can occur but communication generally one-sided.

Propaganda: Some exchange occurs but communication generally one-sided.

Indoctrination: Limited exchange occurs; communication is one-sided.

Thought Reform: No exchange occurs; communication is one-sided.

Ability to change

Education: Change occurs as science advances; as students and other scholars offer criticisms; as students & citizens evaluate programs.

Advertising: Change made by those who pay for it, based upon the success of ad programs by consumers, law, & in response to consumer complaints.

Propaganda: Change based on changing tides in world politics and on political need to promote the group, nation, or international organization.

Indoctrination: Change made through formal channels, via written suggestions to higher-ups.

Thought Reform: Change occurs rarely; organization remains fairly rigid; change occurs primarily to improve thought-reform effectiveness.

Structure of persuasion

Education: Uses teacher-pupil structure; logical thinking encouraged.

Advertising: Uses an instructional mode to persuade consumer/buyer.

Propaganda: Takes authoritarian stance to persuade masses.

Indoctrination: Takes authoritarian & hierarchical stance.

Thought Reform: Takes authoritarian & hierarchical stance; no full awareness on part of learner.

Type of relationship

Education: Instruction is time-limited; consensual.

Advertising: Consumer/buyer can accept or ignore communication.

Propaganda: Learner support & engrossment expected.

Indoctrination: Instruction is contractual; consensual.

Thought Reform: Group attempts to retain people forever.

Deceptiveness

Education: Is not deceptive.

Advertising: Can be deceptive, selecting only positive views.

Propaganda: Can be deceptive, often exaggerated.

Indoctrination: Is not deceptive.

Thought Reform: Is deceptive.

Breadth of learning

Education: Focuses on learning to learn & learning about reality; broad goal is rounded knowledge for development of the individual.

Advertising: Has a narrow goal of swaying opinion to promote and sell an idea, object, or program; another goal is to enhance seller & possibly buyer.

Propaganda: Targets large political masses to make them believe a specific view or circumstance is good.

Indoctrination: Stresses narrow learning for a specific goal; to become something or to train for performance of duties.

Thought Reform: Individualized target; hidden agenda (you will be changed one step at a time to become deployable to serve leaders).

Tolerance

Education: Respects differences.

Advertising: Puts down competition.

Propaganda: Wants to lessen opposition.

Indoctrination: Aware of differences.

Thought Reform: No respect for differences.

Methods

Education: Instructional techniques.

Advertising: Mild to heavy persuasion.

Propaganda: Overt persuasion, sometimes unethical.

Indoctrination: Disciplinary techniques.

Thought Reform: Improper and unethical techniques.


  1. Lifton, R.J. (1961). Thought Reform and the Psychology of Totalism. New York: W.W. Norton. (Also: 1993, University of North Carolina Press.)

  2. Lifton, R.J. (1987). Cults: Totalism and civil liberties. In R.J. Lifton, The Future of Immortality and Other Essays for a Nuclear Age. New York: Basic Books.

  3. Lifton, R.J. (1991, February). Cult formation. Harvard Mental Health Letter.

  4. Hunter, E. (1951). Brainwashing in China. New York: Vanguard.

  5. Schein, E.H. (1961). Coercive Persuasion. New York: W. W. Norton.

  6. Singer, M.T. (1987). Group psychodynamics. In R. Berkow (Ed.). Merck Manual, 15th ed. Rahway, NJ: Merck, Sharp, & Dohme.

  7. West, L.J., & Singer, M.T. (1980). Cults, quacks, and nonprofessional psychotherapies. In H.I. Kaplan, A.M. Freedman, & B.J. Sadock (Eds.),  Comprehensive Textbook of Psychiatry III, 3245-3258. Baltimore: Williams & Wilkins.

  8. Ofshe, R., & Singer, M.T. (1986). Attacks on peripheral versus central elements of self and the impact of thought reforming techniques. Cultic Studies Journal, 3, 3-24.

  9. Singer, M.T., & Ofshe, R. (1990). Thought reform programs and the production of psychiatric casualties. Psychiatric Annals, 20, 188-193.

  10. Ofshe, R. (1992). Coercive persuasion and attitude change. Encyclopedia of Sociology, Vol. 1, 212-224. New York: Macmillan.

  11. Wright, S. (1987). Leaving Cults: The Dynamics of Defection. Society for the Scientific Study of Religion, Monograph no. 7, Washington, DC.

Cognitive Processes and the Suppression of Sound Scientific Ideas

J. Sacherman 1997


American and British history is riddled with examples of valid research and inventions which have been suppressed and derogated by the conventional science community, at great cost to society and to individual scientists. Rather than furthering the pursuit of new scientific frontiers, the structure of British and American scientific institutions encourages conformity and consensus-seeking. Scientists are generally like other people when it comes to the biases and self-justifications that cause them to make bad decisions and evade the truth. Some topics in science are 'taboo' subjects; two examples are the field of psychic phenomena and the field of new energy devices such as cold fusion. Journals, books and internet sites exist for those scientists who want an alternative to conformist scientific venues.

Although some scientific ideas are truly unfounded, this paper will explore instances when valuable scientific ideas were unfairly reviled and rejected, and will discuss the cognitive processes, including cognitive dissonance, conformity, and various biases, which contribute to such suppression.

Examples from history of suppression in the sciences

A legacy of cognitive biases and faulty judgments typifies the history of American and British scientific inquiry and research.

One of the earliest examples, with which nearly everyone is familiar, occurred in the early seventeenth century: Galileo was branded a heretic and placed under house arrest for declaring that the earth traveled around the sun (Manning, 1996).

This paper will concentrate on examples from a period starting closer to the industrial age and continuing until the present. The first example presented here is drawn from Richard Milton's (1996) book Alternative Science. Antoine Lavoisier, the scientific authority of eighteenth- and early nineteenth-century Europe and father of modern chemistry, assured his fellow Academicians in 1790 that meteorites could not fall from the sky, as there were no stones in the sky (Milton, 1996). In spite of first-hand reports of meteors falling from the sky, Lavoisier was believed, and nearly all of the meteorites in public and private collections were then thrown out. Only one meteorite that was too heavy to move was saved, so today the world has few specimens that predate 1790. Meteorites were not commonly collected again until mounting evidence for their extraterrestrial origin predominated, about 50 years later.

Milton (1996) continued with the history of powered flight. During the years between 1903 and 1908, Wilbur and Orville Wright repeatedly demonstrated the flight capability of their invention, the airplane. Despite these demonstrations, plus numerous independent affidavits and photographs from local enthusiasts as well, the Wrights' claims were not believed. Scientific American, the New York Herald, the US Army and most American scientists discredited the Wrights and proclaimed that their mechanism was a hoax. Noted experts from the US Navy and from Johns Hopkins University decried powered human flight as ". . . absurd" (Milton, 1996, p. 11).

In a similar vein, the inventors of the turbine ship engine, the mechanical naval gunnery control, the electric ship's telegraph, and the steel ship hull all initially met with disinterest, disbelief and derision from the British Navy of the nineteenth century (Milton, 1996).

There are numerous accounts of useful scientific ideas that received such treatment; however, this writer will discuss just a few of the inventions and ideas of the best-known scientists. Milton (1996) explained how the invention of what is now considered a very ordinary object, the light bulb, was initially mired in controversy and disbelief. When Thomas Edison was finally successful in finding a light bulb filament which could glow while sustaining the heat of electrical conduction, he invited members of the scientific community to observe his demonstration (Milton, 1996). Although the general public traveled to witness his electric lamp, the noted scientists of the day refused to, and claimed the following about Edison:

"Such startling announcements as these should be deprecated as being unworthy of science and mischievous to its true progress." -Sir William Siemens, England's most distinguished engineer (Milton, 1996, p. 18)

"The Sorcerer of Menlo Park appears not to be acquainted with the subtleties of the electrical sciences. Mr. Edison takes us backwards. One must have lost all recollection of American hoaxes to accept such claims." -Professor Du Moncel (Milton, 1996, p. 18)

Edison's claims are "so manifestly absurd as to indicate a positive want of knowledge of the electric circuit and the principles governing the construction and operation of electrical machines." -Edwin Weston, specialist in arc lighting (Milton, 1996, p. 18)

Luckily, the disinterest and derision of Edison's scientific peers did not prevent sharp speculators like J. P. Morgan and William Vanderbilt from investing funds and helping Edison's inventions become universally adopted (Milton, 1996). Other inventors of the day were not always so lucky.

Cost to individuals and to society

Many invaluable concepts for inventions from Edison's era were not granted financial backing (Milton, 1996). This was the case for most of the ideas of Nikola Tesla, who is known for the discovery and development of AC current. In her book The Coming Energy Revolution, Jeanne Manning (1996) told of how the treatment of Tesla contrasted with that of his contemporary, Edison. Tesla did not bother, as Edison did, to "play the game" (p. 24) with the U.S. science establishment, the media and the investors. Manning (1996) continued by explaining that even though Tesla was the main trail-blazer of the age of electricity, his almost inaccessible brilliance, his lack of interest in publishing, and his wish to give everyone free electric power may have caused substantial professional jealousy. Manning (1996) further postulated that this jealousy and Tesla's non-conformity were responsible for the lack of support and acknowledgment he received. Moreover, Manning (1996) continued, many of the products that came out of the age of electricity were directly due to Tesla's concepts, even though other inventors were often credited for them. Among these inventions was Marconi's radio, which was presented to the public in 1901 and used 17 of Tesla's patented ideas. In 1943, the Supreme Court in fact ruled that Tesla was the radio's inventor (Manning, 1996); unfortunately for Tesla, that was some years after his death. After the US science community and investors turned their backs on Tesla, he descended "into wild eccentricity" (p. 26). However, Manning (1996) asserted, his research on wireless power conveyance, bladeless turbines, excess-output energy machines and other futuristic devices is still being marveled at and studied by those who have rediscovered this unappreciated genius.

Other innovators described by Milton (1996) as victims of the insults of the skeptical scientific power elite included men such as John Logie Baird, inventor of television. Baird had been described by the British Royal Society as "a swindler" (p. 19). Likewise, Wilhelm Roentgen's discovery of X-rays was decried as an "elaborate hoax" (p. 22) by Lord Kelvin, the most influential scientist in Europe in 1895. Scientists of Roentgen's day produced film-fogging X-rays on a substantial scale but were unwilling to consider the wide-ranging implications of Roentgen's work for 10 years after his discovery (Milton, 1996). 

Another example of such victimization, presented by Dean Radin (1996) in his book The Conscious Universe, involved the theory of German meteorologist Alfred Wegener. This theory, which Wegener developed in 1915, contended that the earth's continents had once been a single mass of land which later drifted apart. Although Wegener carefully cataloged geological evidence, his American and British colleagues ridiculed both him and his idea (Radin, 1996). Although Wegener died an intellectual outcast in 1930, every schoolchild is currently taught his theory, which is known as continental drift. 

The cost of scientific suppression to society can be seen in the history of the development of the tank. According to Milton (1996), at a time when 1,000 men a day were dying on World War I battlefields for want of protection from shelling and gunfire, the British Admiralty of that epoch had the following to say about E. L. de Mole's invention, the tank: 

"Caterpillar landships are idiotic and useless. Nobody has asked for them and nobody wants them. Those officers and men are wasting their time and are not pulling their proper weight in the war"(p. 20). 

Derogation, Trivialization and Reduction of Dissonance

Some quotations collected by Christopher Cerf and Victor Navasky in their book The Experts Speak (1984) further illustrate the hostile or trivializing attitude towards different ideas, scientific inquiries, and revolutionary discoveries. 

"Louis Pasteur's theory of germs is ridiculous fiction." -Pierre Pachet, Professor of Physiology France, 1872 (p.30) 

"Fooling around with alternating current in just a waste of time. Nobody will use it, ever." -Thomas Edison, 1889 (p.207) 

"I laughed till. . . my sides were sore." -Adam Sedgwick, British geologist in a letter to Darwin in regards to his theory of evolution, 1857 (p.9) 

"If the whole of the English language could be condensed into one word, it would not suffice to express the utter contempt those invite who are so deluded as to be disciples of such an imposture as Darwinism." -Francis Orpen Morris, British ornithologist 1877 (p.10) 

"Airplanes are interesting toys, but of no military value." - Marechal Ferdinand Foch, Professor of Strategy, Ecole Superieure de Guerre (p.245) 

"To affirm that the aeroplane is going to 'revolutionize' naval warfare of the future is to be guilty of the wildest exaggeration." -Scientific American, 1910 (p.246) 

"Who the hell wants to hear actors talk?" - H. M. Warner, Warner Brothers Studios, 1927 (p.72) 

"The whole procedure of shooting rockets into space. . . presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." -Richard van der Riet Wooley, British astronomer (p.257) 

"The energy produced by the atom is a very poor kind of thing. Anyone who expects a source of power from the transformation of these atoms is talking moonshine." Ernst Rutherford, 1933 (p.215) 

"Space travel is bunk" - Sir Harold Spencer Jones, Astronomer Royal of Britain, 1957, two weeks before the launch of Sputnik (p.258) 

"But what hell is it good for?" -Engineer Robert Lloyd, IBM 1968, commenting on the microchip (p.209) 

"There is no reason anyone would want a computer in their home." -Ken Olson, president of Digital Equipment Corp. 1977 (p.209) 

Several of the above examples show new ideas that were grievously misjudged by scientific peers and those in authority. 

Today, scientific research is still judged by peer review. Henry Bauer (1994), in his book Scientific Literacy and the Myth of the Scientific Method, revealed how research is generally funded through association with a university. In Western civilization, said Bauer (1994), selected peers judge the journal articles that academic scientists must publish to retain their university positions and ensure future funding. 

Specific questions about the process of peer review were examined by sociologist Michael J. Mahoney of the University of Pennsylvania. In an interview granted to Boston Globe science reporter David Chandler (1987), Mahoney discussed his study. Mahoney sent copies of a paper to 75 reviewers, but doctored the results so that in some cases the research appeared to support mainstream theories, while in other cases it deviated from them (Chandler, 1987). When the doctored results ran contrary to a reviewer's theoretical beliefs, the author's procedures were berated and the manuscript was rejected. When the results in the doctored papers confirmed the reviewer's beliefs, the same procedures were lauded and the manuscript was recommended for publication (Chandler, 1987). 

Mahoney presented the results of this study to the American Association for the Advancement of Science. Afterwards, Mahoney received 200 to 300 letters and phone calls from scientists who felt they had been victimized because the results of their research conflicted with the generally accepted scientific viewpoint or with their reviewers' beliefs (Chandler, 1987). 

Daniel Koshland, editor of the leading U.S. scientific journal Science, said this in an interview with Chandler (1987) about science that threatens to change the parameters of what is accepted: 

"I think it's fair to say that a new idea, something that confronts existing dogma, has an uphill road. . .There certainly is no question that there is a prejudice in favor of the existing dogma"(Chandler 1987). 
In the same interview with Chandler (1987), Koshland cited, as one example, biochemist Edwin G. Krebs's discovery of reversible protein phosphorylation, a fundamental mechanism regulating enzyme activity in living organisms, for which he later received the Nobel Prize. It was initially rejected. 

Koshland (Chandler, 1987) continued with the history of biologist Lynn Margulis's work showing the evolution of cell structure through symbiotic unions of primitive organisms. It too was initially rejected and even scorned (Chandler, 1987). Although her work has become accepted dogma and appears in textbooks, in 1970 the National Science Foundation not only turned her down for funding, but told her that she should never apply again. Koshland stated that there are other examples such as these (Chandler, 1987). 

In-Group and Out-Group Effects

Koshland's statement about the prejudice against ideas that go against existing dogma (Chandler, 1987), and the examples Koshland gave, lead this author to suppose that in-group biases could be blinding the scientific authorities to the validity of unorthodox, out-group ideas. As Aronson (1995) revealed, the valid points which the out-group makes are not readily perceived by the in-group; moreover, the weak points or elements of the out-group preponderate in the mind of the in-group. Aronson (1995) explained the tendency toward "in-group favoritism" (p. 144), in which members were thought to produce better output than non-members. This author believes that scientists with challenging ideas have been viewed as an out-group by the in-group of conventional scientists. 

The Urge to Conform

Chemistry and science studies professor Henry H. Bauer (1994), in his book Scientific Literacy and the Myth of the Scientific Method, urged us to realize that scientists are only human and are therefore subject to all the variations that humans possess. He claimed that although scientists have been seen as single-mindedly pursuing truth in all fields, in actuality scientists are generally expert in only one field, and the pursuit of truth may not be a top priority. The fact that modern scientists are financially dependent on university and foundation research positions, which are in turn dependent on grants, is a key factor in the formulation of a scientist's priorities (Bauer, 1994). This financial dependence and instability, declared Bauer (1994), creates a direct conflict of interest between pure scientific pursuit and behavior aimed at keeping funding and positions. 

A job in scientific research seems, to this writer, to be much like any precarious career position. There could be the usual tendencies to conform and participate in group-think. Criticism by the science community and loss of livelihood appear to this author to be punishments, while acceptance by the science community and financial security seem like rewards. According to Aronson (1996), punishment and rewards generally compel one to conform. 

Bauer (1994) painted a picture of "an elite research community" (p. 99) consisting of a few dozen universities, which traditionally have been deemed to have the most experts. These universities are thought to turn out the best results and publications and are the top choice to receive both government and private research money. 

Bauer (1994) explained that there is little money in this country for more exploratory pursuits for the "sake of scientific progress" (p. 98). Funding and acknowledgment go to virtually the same schools and the same groups of scientists, so the scope of exploration and scientific thought becomes limited and intellectual inbreeding occurs (Bauer, 1994). Most of the scientists chosen to be journal editors and peer reviewers are also selected from this same narrow, inbred group. This phenomenon was referred to by Bauer (1994) as the "imperfections of the filter" (p. 99). 

Like the "concurrence seeking" (p. 18) member of Hitler's inner circle, described by Aronson (1995), this "highly filtered" (Bauer p. 99) group of scientists tend to be in a position that often demand consensus of opinion and necessitates conformity. 

Bauer (1994) illustrated how, throughout history, the course of scientific discovery was impeded by the social environment and prejudices of the time. He gave the example of how, in Nazi Germany, scientists were unable to make progress. The reason for this, Bauer (1994) explained, is that they had been commanded to work without the theory of relativity, as that theory had been originated and developed by a purportedly inferior Jew. Similarly, the Soviets were commanded to do without the theory of wave mechanics, which also had an unpopular genesis (Bauer, 1994). The punishment for being a maverick scientist in either of those societies was death or forced labor, so the writer of this paper supposes the urge to conform must have been very compelling. 

Bauer (1994) asserted that conformity within the scientific community leads to the evasion of all unwanted or inconsistent facts, and this obstructs the practice of science. This avoidance of facts and truth by a group seems, to this writer, to be very much akin to the consensus seeking and evasion of reality that led up to the faulty decision to launch the Challenger space shuttle. Even though the shuttle had parts which were known to be of dubious quality, "NASA and Thiokol executives ... reinforced one another's commitment to proceed" (Aronson, 1995, p. 17). 

Thomas Gold, a professor and researcher at Cornell, wrote in his 1989 journal article "New Ideas in Science" that he attributed the tendency for consensus seeking among scientists to a primarily vestigial instinct, "a herd mentality" (p. 103). Gold supported this notion of the herd mentality by stating how petroleum geology and other disciplines have become completely intolerant of any new ideas. He also told of how he had made colleagues violently angry with him because he had proposed that there was some uncertainty about the origin of petroleum (Gold, 1989). Moreover, Gold (1989) claimed, the fresh and genuinely different research from other countries that are outsiders to the U.S. herds casts light on the truly one-dimensional nature of our science institutions. 

Gold (1989) conjectured that going against the herd and adopting a deviant viewpoint feels uncomfortable for personal cognitive and emotional reasons, as well as for the practical reasons listed above by Bauer. Furthermore, Gold (1989) postulated that conformist scientists may be unconsciously motivated by the protection afforded to them by the herd, "against being challenged ... or having their ignorance exposed" (p. 106). 

Cognitive Dissonance

According to Aronson (1996), when people are confronted with beliefs that oppose or are incompatible with their own, they are likely to ignore or negate those beliefs. They do this in order to convince themselves that they have not behaved foolishly by committing to false beliefs. To assure themselves that they have been wise in supporting their position, they often convince themselves that those who oppose that position are foolish and truly objects for contempt and derision (Aronson, 1996, pp. 184-8). 

Aronson (1996) also stated that most people, when confronted with information that they have behaved in a cruel manner, attempt to reduce the subsequent dissonant feelings of perceiving themselves as unkind. They often do this by creating a belief that cruelty towards the victim is actually justified. Studies by Karen Hobden and James M. Olson (1994) examined disparagement humor directed at an out-group. Hobden and Olson (1994) had a confederate tell extremely disparaging jokes about lawyers to a group of subjects. The dissonance caused by disparaging the lawyer out-group prompted the majority of the subjects to change both their public and private attitudes about lawyers to ones that were substantially less favorable (Hobden & Olson, 1994). 

Another study, by Linda Simon, Jeff Greenberg, and Jack Brehm (1995), showed that trivialization is also effectively employed as a mode of dissonance reduction. The subjects in Simon et al.'s (1995) study were led to engage in counter-attitudinal behaviors. They later chose to trivialize the dissonant information about themselves more often than they chose to change their opinions (Simon et al., 1995). 

Many of the quotes contained in this paper, in which a member of mainstream science reacts to new inventions or discoveries, are steeped in trivialization and disparagement. This leads this writer to believe that scientists reduce their cognitive dissonance about challenging science ideas with the same faulty cognitions and methods in which non-scientists engage. 

Outside the Paradigm 

Science author Patrick Huyghe (1995), in his internet article "Extraordinary Claim? Move the Goal Posts!," claimed that although a new science idea may have proof, if it defies convention, then instead of consideration and acceptance: 

"There's often some hasty rewriting of the rules of the game. For the would-be extraordinary, for the unorthodox claim on the verge of scientific success, the ground rules are gratefully changed. This practice, often referred to as 'Moving the goal posts' is an extraordinary phenomenon in itself and deserves recognition."(p.1) 

In The Big Splash, a book co-authored by science writer Patrick Huyghe and physicist Louis A. Frank (1990), this moving of the goal posts was depicted in the conventional science community's reaction to a challenging discovery made by Dr. Frank. Frank and Huyghe (1990) wrote of how Dr. Frank found evidence that the Earth was being showered by approximately twenty house-sized ice comets per minute, all of which broke up in the atmosphere. His research led him to believe that millennia of bombardment by these ice comets were responsible for the presence of water on Earth. Dr. Frank presented his data and his photographs of the ice comets to a geophysics journal for publication (Huyghe, 1990). At the time of the announcement of Dr. Frank's discovery, the academic standard of proof in astronomy was to have two images of the same object. Although Dr. Frank presented such proof, the appearance of ice comets in his photographs was considered to be merely a technical fluke, and a higher standard of proof was then required (Huyghe, 1990). As each subsequent level of proof was delivered by Dr. Frank, a yet higher tier of standards was demanded (Huyghe, 1990). 

This writer believes that this goal post shifting is similar to some of the tendencies examined by Aronson (1995). Aronson cited a survey done to assess people's reactions to the 1964 surgeon general's report about the serious health risks of cigarettes. Aronson (1995) found that smokers who had tried to quit unsuccessfully experienced dissonance over their inability to stop the habit. Those smokers tended to change their cognitions and create the belief that smoking was not dangerous for them (Aronson, 1995). Pointing to intelligent people who also smoked, or deluding themselves "that a filter traps all of the cancer-producing materials" (p. 179), reduced the smokers' dissonance and made them feel that their actions were justified. Just like moving the goal posts, these cognitive ploys changed the standard by which information was judged. 

James McClenon's (1984) book Deviant Science: The Case of Parapsychology and Dean Radin's (1997) book The Conscious Universe both deal with the topic of psychic phenomena as a suppressed science. Radin (1997) cited dissonance reduction as the reason why conventional science authorities have suppressed numerous valid studies on psychic phenomena. Radin (1997) stated that people have an uncomfortable feeling when they are confronted with information that seems impossible to them. Evidence of psychic phenomena, also known as psi, therefore becomes dissonant information. Although most of Deviant Science and The Conscious Universe were devoted to describing the many reproducible, strictly scientific experiments that support the existence of ESP, the writers also speculated about why this field has been found unacceptable. Both Radin (1997) and McClenon (1984) claimed that the dismissal of well-executed studies was due not to skepticism, but mainly to blatant attacks by those who are threatened by the shifting of perceptions in the sciences. McClenon (1984) cited the 1970s science philosophy of Thomas Kuhn, who coined the term for such shifting perceptions, "paradigm shifts" (p. 21). McClenon (1984) had the following to say about Kuhn's definition of paradigms, cited from Kuhn's The Structure of Scientific Revolutions: 

"Paradigms are the universally accepted scientific achievements that for a time provide model problems and solutions to a community of practitioners . . . an object for future articulation and specification under new or more stringent conditions" (p.21). 
When an anomaly outside of this accepted model happens frequently enough, McClenon (1984) explained, there is a crisis. The anomalies that violate the current ruling paradigm are then either incorporated and resolved within the paradigm, or there is a "revolutionary upheaval" (p. 21). 

Aronson (1995) described how people commonly have a low tolerance for anomalous, dissonant information. He had this to say about how people generally deal with challenges to their beliefs and thereby reduce their dissonance: 

"People don't like to see or hear things that conflict with their deeply held beliefs or wishes. An ancient response to such bad news was to kill the messenger"(p. 185). 
This writer sees such "killing" going on in the deriding and dismissing of challenging science ideas and of the "messenger" scientists. 

Confirmation Bias

Radin (1997) also explained that the rejection of serious studies on psychic phenomena is due to a particular type of confirmation bias, the "expectancy effect" (p. 234). This expectancy effect, as studied by sociologist Harry Collins in his book The Golem (1993), showed that for controversial scientific topics where the existence of a phenomenon is in question, scientific criticism is generally determined by the critic's prior expectations. 

Collins's work, cited by Radin (1997), also explained a phenomenon termed "scientific regress" (p. 236). Scientific regress happens when experimental results are predicted by a well-accepted theory and the outcome is then examined to see if it matches the initial expectations. Radin (1997) reasoned that with psi research there is no well-accepted theory with which to compare the results, so skeptics use "scientific regress" to invalidate all of the scientific results in this field of study. 

Radin (1997) also called attention to another form of the confirmation bias, that of seeking to confirm one's original hypothesis when a situation is unclear or confusing. Radin's definition here matches Aronson's (1995) definition of "the confirmation bias - the tendency to confirm our original hypotheses and beliefs" (p. 150). 

Radin (1997) said confirmation biases are especially problematic for older, more experienced scientists because "their commitment to their theories grows so strong, that simpler or different solutions get overlooked" (p. 236). These biases, Radin claimed, preserve ideas that are already established and cause the suppression of non-standard science research. 

Dean Radin (1997) broke down the acceptance of a new science idea into the following four predictable stages, which this author sees as being rife with the various aforementioned biases and dissonance reduction: 

Stage 1, skeptics proclaim that the idea is impossible. 

Stage 2, skeptics reluctantly concede that the idea is possible, but trivial. 

Stage 3, the mainstream realizes that the idea is more important than the trivializing scientists in authority lead them to believe. 

Stage 4, even the skeptics proclaim that they knew it all along or even that they thought of it first (p. 243). 

This writer believes that the cognitions in this last stage are attributable to what Aronson (1996) termed as "the hindsight effect" (p.7). 

Taboo or Unpopular Science

The Golem (Collins, 1993), Fire from Ice (Mallove, 1991), The Coming Energy Revolution (Manning, 1996) and Alternative Science (Milton, 1996) all had chapters which described the genesis of cold fusion and gave important evidence for its validity. These books told of the findings of two chemists, Professor Martin Fleischmann of Southampton University and his former student, Professor Stanley Pons of the University of Utah. Fleischmann and Pons held a 1989 press conference at which they announced the discovery of cold fusion. Milton (1996) defined cold fusion as "the production of usable amounts of excess energy by a nuclear process occurring in water at room temperature" (p. 25). 

Manning (1996), Milton (1996), and Collins (1993) all stated that by announcing their success at a press conference, these two distinguished scientists broke with the tradition of first submitting an article to peer review for publication. Manning (1996) contended that it was mainly this departure from the expected way of introducing the phenomenon, not the failing of the results, which led to the trivializing and derogating of cold fusion, and of Fleischmann and Pons as well, by the majority of mainstream scientists. 

Manning (1996) suggested that a secondary cause for disapproval was the fact that science did not yet have a framework for how these cold fusion experiments produced the energy. This lack of a previously existing framework seems to cause most mainstream scientists to invalidate anomalous data through experimental regress and confirmation bias. 

Evidently Pons and Fleischmann intended to keep the means of producing cold fusion to themselves in hopes of becoming wealthy, so they were not forthcoming about the details of the methodology used. Although they were able to repeatedly get the same verifiable results, other scientists of the time were not able to independently duplicate what Pons and Fleischmann had done (Manning, 1996). 

A third cause for disapproval, explained Manning (1996), was that the massively funded hot fusion research organizations had been trying for decades to obtain some of the same findings as those from the cold fusion experiments and may have harbored professional jealousy. 

This writer believes that the suppression of cold fusion could have been due to some of the same cognitive distortions which led to the suppression of other maverick science ideas and inventions throughout history. These cognitions include the in-group/out-group, confirmation, and expectancy biases, as well as cognitive dissonance reactions to anomalies. 

Manning (1996) wrote of how, in America, Fleischmann's and Pons's reputations as cold fusion researchers were tarnished. Cold fusion articles were suddenly banished from science journals and U.S. patents for cold fusion were dismissed. 

Manning (1996) continued that only Japan was still putting major funding into cold fusion research. As a heavily populated island with few natural energy resources, Japan had everything to gain from clean, safe energy production. Also, because many Easterners have a "spiritual belief in an all-pervading energy which comes in many forms" (p. 102), the idea of fusion reactions taking place without extremely high temperatures was not quite such a dissonant idea as it had been for Westerners. 

Other methods to derive usable energy that are considered to be in opposition to the beliefs of mainstream science were discussed by Manning (1996). These included solid state energy devices, vibrational devices developed by nineteenth century musician and inventor John Ernst Worrell Keeley, vortex and magnetic energy mechanisms, new technologies for using waste and hydropower, and the use of hydrogen for power. 

Alternatives for Excluded Scientists

The internet has, in the last few years, become a valuable resource for those scientists who have been discouraged from experimenting with and publishing unorthodox studies. It gives them the opportunity to network with others interested in their research. 

Some websites for these discussion groups can be found at the Yahoo website under the subheading "alternative science." In addition, there are sites where one can find free energy, cold fusion and otology discussion groups under the subheadings freenergy-L, vortex-L and taoshum-L. 

There are journals created specifically for printing professionally written studies on unpopular topics. Since involvement with these non-standard topics might lead to a professional scientist's ostracism, one publication, The Journal of Scientific Exploration (1986-1997), prints articles by academic research scientists anonymously. This journal provides a forum for presentation, criticism and debate of topics that are ignored or ridiculed by mainstream science. It also has the secondary goal of publishing articles that help to promote understanding of the factors that limit scientific inquiry. 

Galilean Electrodynamics is a publication devoted to professionally written journal articles that challenge Einstein's ideas. Only papers that are in the realm of mathematics, engineering or physics and that are relativity-related are considered for publication in this journal. 

Infinite Energy: Cold Fusion and New Energy Technology (1994-1998) is a magazine edited by Eugene Mallove and is devoted to energy experimentation that is beyond the scope of orthodox accepted science. 

Looking Forward

Bauer (1994) called on science institutions to help foster objectivity by making sure they include scientists from backgrounds and viewpoints that are as varied as possible. He also asked that scientists fight their personal biases and hidden social agendas by vigilantly examining their own motives, and by trying to see an objective reality rather than one influenced by expectations (p. 102). 

Dr. Brian Martin (1998), in his writings posted on the internet, "Suppression Stories," asked that researchers publish more accounts of suppression, and claimed that this would provide necessary support for dissident and struggling scientists. 

Radin (1997) closed his book with the hope that this process of suppressing new ideas will not continue at the cost of good science and scientists. He included this quote by Lewis Thomas, biologist and author of The Medusa and the Snail: 

"The only solid piece of scientific truth about which I feel totally confident is that we are profoundly ignorant about nature. . . It is this sudden confrontation with the depth and scope of ignorance that represents the most significant contribution of twentieth-century science to the human intellect"(p. 289). 
This author will bring this paper to a close with a quote from Bill Beaty's (1998) webpage article "Quotes Against Excessive Skepticism": 

"Daring ideas are like chessmen. Moved forward, they may be defeated, but they start a winning game." -Goethe 


Aronson, Elliot (1995). The Social Animal. New York: W. H. Freeman and Co. 
Bauer, Henry H. (1994). Scientific Literacy and the Myth of the Scientific Method. Chicago: University of Illinois Press. 
Beaty, William J. (1998). Closeminded Science. Online, Internet. 
Brockman, John (1995). The Third Culture: Beyond the Scientific Revolution. New York: Simon & Schuster. 
Cerf, Christopher and Navasky, Victor (1984). The Experts Speak: The Definitive Compendium of Authoritative Misinformation. New York: Pantheon Books. 
Chandler, David L. (1987). "Maverick Scientists Encounter Barriers; Peer Review Called Curb to Creativity." The Boston Globe, June 22, 1987. 
Collins, Harry and Pinch, Trevor (1993). The Golem: What Everyone Should Know About Science. Cambridge, UK: Cambridge University Press. 
Duncan, Ronald (1977). The Encyclopaedia of Ignorance: Everything You Ever Wanted to Know About the Unknown. Oxford, UK: Pergamon Press. 
Gold, Thomas (1989). "New Ideas in Science." Journal of Scientific Exploration, Vol. 3(2), pp. 103-112. 
Haisch, Bernhard, ed. (1990-1998). Journal of Scientific Exploration: A Publication of the Society for Scientific Exploration, Vols. 1-12. 
Huyghe, Patrick (1995). "Extraordinary Claim? Move the Goal Posts!" The Anomalist Homepage. Online, Internet. 
Huyghe, Patrick and Frank, Louis A. (1990). The Big Splash. New York: Birch Lane Press. 
Mallove, Eugene (1991). Fire from Ice: Searching for the Truth Behind the Cold Fusion Furor. New York: John Wiley & Sons, Inc. 
Mallove, Eugene, ed. (1996-1998). Infinite Energy: Cold Fusion and New Energy Technology, Vol. 1(1)-Vol. 3(17). 
Manning, Jeanne (1996). The Coming Energy Revolution: The Search for Free Energy. New York: Avery. 
Martin, Brian (1996). "Suppression Stories; Peer Review as Scholarly Conformity." Department of Science and Technology, University of Wollongong. Online, Internet. 
McClenon, James (1984). Deviant Science: The Case of Parapsychology. Philadelphia: University of Pennsylvania Press. 
Milton, Richard (1996). Alternative Science: Challenging the Myths of the Scientific Establishment. Vermont: Park Street Press. 
Radin, Dean (1997). The Conscious Universe: The Scientific Truth of Psychic Phenomena. New York: HarperCollins. 
Westrum, Ron. "Fringes of Reason." Whole Earth Catalog. Online, Internet. 
Zimbardo, Philip (1969). The Cognitive Control of Motivation. Illinois: Scott, Foresman and Company. 



Copyright © 1988- Leading Edge International Research Group