I just came across a great blog, You Are Not So Smart (http://youarenotsosmart.com/), published by David McRaney, a journalist and broadcast media director. I want to highlight two interesting posts.

Anchoring Effect discusses some of the many experiments showing how immediately preceding information strongly influences the estimates and valuations people make. Subjects exposed to a low number and then immediately asked to estimate how many things fall into a certain category give significantly lower estimates than subjects first exposed to a high number. Of course, the effect occurs only when the subjects have no idea of the correct answer and are making pure guesses.

Another effect is that someone is more likely to make an impulse purchase if they are first shown a high price (say $1,000) and then told the item has just been marked down to $400 than if the first price they see is $400.

We are all too easily influenced by irrelevant factors encountered just before we make an estimate or valuation. Most of us are quite unaware that these kinds of processes are acting on us.

Confirmation Bias addresses the common effect of information filtering. We tend to accept information that agrees with what we already know or believe and to discount information that conflicts with our biases. For some people, Paul Krugman can offer no usable information; others react the same way to Glenn Beck. Both groups suffer from strong confirmation bias.

Confirmation bias is why I like to investigate things that can be expressed in mathematical terms: graphical plots, correlation coefficients, algebraic equations, etc. Over the years I have found that I have had to modify what I “knew” after experiencing some clear mathematical logic. And I have also learned that what once seemed like “clear mathematical logic” based on experimental data can be changed by additional data.
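The point that "clear mathematical logic" can be overturned by additional data is easy to demonstrate with a correlation coefficient. The toy numbers below are my own illustration, not data from the article: an apparently strong positive correlation in an early sample can weaken, or even reverse sign, once more observations arrive.

```python
# Toy illustration (invented data): how a conclusion drawn from a
# correlation coefficient can change as new observations are added.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Early data suggest a strong positive relationship...
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 6]
r_early = pearson_r(xs, ys)

# ...but additional observations reverse the picture.
xs += [6, 7, 8]
ys += [3, 2, 1]
r_later = pearson_r(xs, ys)

print(f"early r = {r_early:.2f}, after more data r = {r_later:.2f}")
# prints: early r = 0.85, after more data r = -0.36
```

A reader who anchored on the first five points would "know" the relationship is positive; the fuller data set says otherwise, which is exactly why conclusions from limited samples deserve continued scrutiny.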

For those who really want to improve understanding of very complex issues or systems, it is necessary to recognize that everything you think you know is suspect. Even that which is mathematically proven must be challenged, unless you can verify every assumption (explicit and implicit) and determine that no additional independent data can be added in the future.

There are other interesting-looking posts at You Are Not So Smart. I plan to return there to further investigate my stupidity.

This article was originally published July 31 at *Seeking Alpha*.
