Learning How to Think - By Nicholas D. Kristof

Posted by ProjectC 
"I, too, have a pet little evil, to which in more passionate moments I am apt to attribute all the others. This evil is the neglect of thinking. And when I say thinking I mean real thinking, independent thinking, hard thinking.

You protest. You say men are thinking more now than they ever were. You bring out the almanac to prove by statistics that illiteracy is declining. You point to our magnificent libraries. You point to the multiplication of books. You show beyond a doubt that people are reading more now than ever before in all history...

Very well, exactly. That is just the trouble. Most people, when confronted with a problem, immediately acquire an inordinate desire to "read up" on it. When they get stuck mentally, the first thing such people do is to run to a book. Confess it, have you not often been in a waiting room or a Pullman, noticed people all about you reading, and finding yourself without any reading matter, have you not wished that you had some?—something to "occupy your mind"? And did it ever occur to you that you had within you the power to occupy your mind, and do it more profitably than all those assiduous readers? Briefly, did it ever occur to you to think?

Of course you "thought"—in a sense. Thinking means a variety of things. You may have looked out of your train window while passing a field, and it may have occurred to you that that field would make an excellent baseball diamond. Then you "thought" of the time when you played baseball, "thought" of some particular game perhaps, "thought" how you had made a grandstand play or a bad muff, and how one day it began to rain in the middle of the game, and the team took refuge in the carriage shed. Then you "thought" of other rainy days rendered particularly vivid for some reason or other, or perhaps your mind came back to considering the present weather, and how long it was going to last. . . . And of course, in one sense you were "thinking." But when I use the word thinking, I mean thinking with a purpose, with an end in view, thinking to solve a problem. I mean the kind of thinking that is forced on us when we are deciding on a course to pursue, on a life work to take up perhaps; the kind of thinking that was forced on us in our younger days when we had to find a solution to a problem in mathematics, or when we tackled psychology in college. I do not mean "thinking" in snatches, or holding petty opinions on this subject and on that. I mean thought on significant questions which lie outside the bounds of your narrow personal welfare. This is the kind of thinking which is now so rare—so sadly needed!

Of course before this can be revived we must arouse a desire for it. We must arouse a desire for thinking for its own sake; solving problems for the mere sake of solving problems. But a mere desire for thinking, praiseworthy as it is, is not enough. We must know how to think, and to that end we must search for those rules and methods of procedure which will most help us in thinking creatively, originally, and not least of all surely, correctly."

- Henry Hazlitt, Thinking as a Science

Learning How to Think

By NICHOLAS D. KRISTOF
March 26, 2009
Op-Ed Columnist

Ever wonder how financial experts could lead the world over the economic cliff?

One explanation is that so-called experts turn out to be, in many situations, a stunningly poor source of expertise. There’s evidence that what matters in making a sound forecast or decision isn’t so much knowledge or experience as good judgment — or, to be more precise, the way a person’s mind works.

More on that in a moment. First, let’s acknowledge that even very smart people allow themselves to be buffaloed by an apparent “expert” on occasion.

The best example of the awe that an “expert” inspires is the “Dr. Fox effect.” It’s named for a pioneering series of psychology experiments in which an actor was paid to give a meaningless presentation to professional educators.

The actor was introduced as “Dr. Myron L. Fox” (no such real person existed) and was described as an eminent authority on the application of mathematics to human behavior. He then delivered a lecture on “mathematical game theory as applied to physician education” — except that by design it had no point and was completely devoid of substance. However, it was warmly delivered and full of jokes and interesting neologisms.

Afterward, those in attendance were given questionnaires and asked to rate “Dr. Fox.” They were mostly impressed. “Excellent presentation, enjoyed listening,” wrote one. Another protested: “Too intellectual a presentation.”

A different study illustrated the genuflection to “experts” another way. It found that a president who goes on television to make a case moves public opinion only negligibly, by less than a percentage point. But experts who are trotted out on television can move public opinion by more than 3 percentage points, because they seem to be reliable or impartial authorities.

But do experts actually get it right themselves?

The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, “Expert Political Judgment,” is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about.

The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board.

“It made virtually no difference whether participants had doctorates, whether they were economists, political scientists, journalists or historians, whether they had policy experience or access to classified information, or whether they had logged many or few years of experience,” Mr. Tetlock wrote.

Indeed, the only consistent predictor was fame — and it was an inverse relationship. The more famous experts did worse than unknown ones. That had to do with a fault in the media. Talent bookers for television shows and reporters tended to call up experts who provided strong, coherent points of view, who saw things in blacks and whites. People who shouted — like, yes, Jim Cramer!

Mr. Tetlock called experts such as these the “hedgehogs,” after a famous distinction by the late Sir Isaiah Berlin (my favorite philosopher) between hedgehogs and foxes. Hedgehogs tend to have a focused worldview, an ideological leaning, strong convictions; foxes are more cautious, more centrist, more likely to adjust their views, more pragmatic, more prone to self-doubt, more inclined to see complexity and nuance. And it turns out that while foxes don’t give great sound-bites, they are far more likely to get things right.

This was the distinction that mattered most among the forecasters, not whether they had expertise. Over all, the foxes did significantly better, both in areas they knew well and in areas they didn’t.

Other studies have confirmed the general sense that expertise is overrated. In one experiment, clinical psychologists did no better than their secretaries in their diagnoses. In another, a white rat in a maze repeatedly beat groups of Yale undergraduates in understanding the optimal way to get food dropped in the maze. The students overanalyzed and saw patterns that didn’t exist, so they were beaten by the rodent.

The marketplace of ideas for now doesn’t clear out bad pundits and bad ideas partly because there’s no accountability. We trumpet our successes and ignore failures — or else attempt to explain that the failure doesn’t count because the situation changed or that we were basically right but the timing was off.

For example, I boast about having warned in 2002 and 2003 that Iraq would be a violent mess after we invaded. But I tend to make excuses for my own incorrect forecast in early 2007 that the troop “surge” would fail.

So what about a system to evaluate us prognosticators? Professor Tetlock suggests that various foundations might try to create a “trans-ideological Consumer Reports for punditry,” monitoring and evaluating the records of various experts and pundits as a public service. I agree: Hold us accountable!