Ecological rationality


Posted on Nov 26, 2014

What is the difference between a bias and a heuristic, explained in layman’s terms?

Dr. Gigerenzer’s book is a well-written and accessible read on applying a sensible, statistics-based approach to risk and risk-based decisions. His chapters on the practice of medicine, and on the many fallacies held by physicians themselves and the general public about topics like screening tests and how to interpret positive or negative test results, are quite relevant and important for physicians to read closely.

Gerd Gigerenzer

Is anyone else struggling to understand Gigerenzer’s point about regression towards the mean, with respect to risk estimation? Regression to the mean is a consequence of X and Y-X being negatively correlated if X and Y are independent, right?
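If it helps, here is a quick numerical check of that claim (my own illustration, not from the book): when X and Y are independent with equal variance, Cov(X, Y − X) = −Var(X), so unusually extreme first measurements tend to be followed by second measurements closer to the mean.

```python
# Minimal check that independent X and Y give a negative correlation between
# X and Y - X, and hence regression to the mean (illustrative numbers only).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=100.0, scale=15.0, size=100_000)   # first measurement
y = rng.normal(loc=100.0, scale=15.0, size=100_000)   # independent second measurement

print(np.corrcoef(x, y - x)[0, 1])         # about -0.71, i.e. -1/sqrt(2)
print(np.cov(x, y - x)[0, 1], -x.var())    # both roughly -225 = -Var(X)

# The most extreme first scores are, on average, followed by scores near the mean.
top = x > np.quantile(x, 0.95)
print(x[top].mean(), y[top].mean())        # roughly 131 vs 100
```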

German psychologist Gerd Gigerenzer goes beyond Simon in dismissing the importance of optimization in decision making. He argues that simple heuristics (experience-based techniques for problem solving) can lead to better decision outcomes than more thorough, theoretically optimal processes that consider vast amounts of information.
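As a concrete illustration of such a heuristic, here is a minimal sketch of the recognition heuristic (my own simplified rendering with made-up data, not code from Gigerenzer’s work): when exactly one of two objects is recognized, infer that the recognized one scores higher on the criterion.

```python
# Recognition heuristic sketch (simplified, hypothetical interface and data):
# if exactly one of two objects is recognized, infer it has the higher
# criterion value (e.g. which of two cities is larger).

def recognition_heuristic(a, b, recognized, fallback=None):
    """Return the object inferred to score higher on the criterion."""
    a_known, b_known = a in recognized, b in recognized
    if a_known and not b_known:
        return a
    if b_known and not a_known:
        return b
    # Recognition does not discriminate: defer to another strategy, or guess.
    return fallback(a, b) if fallback else a

# Hypothetical usage: judging US city sizes from partial recognition.
recognized_cities = {"New York", "Chicago", "Los Angeles"}
print(recognition_heuristic("Chicago", "Chattanooga", recognized_cities))  # Chicago
```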

Having said that, I think that some people are simply better diagnosticians regardless of their training; they are always accessible when such perspicuity is demanded. In other words, people who took (and did well in?) probability theory are accustomed to the phrasing and to the differences between what these terms represent. Other people may _think_ that they understood what they read, but many actually don’t. So I’m not surprised that once the problem is restated using those “natural frequencies”, people in general do much better.
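A toy calculation shows why the restatement helps (the screening-test numbers below are illustrative assumptions, not figures from the book): natural frequencies give the same answer as Bayes’ rule, but the arithmetic is far easier to follow.

```python
# Illustrative screening-test numbers (assumed for this example):
# 1% prevalence, 90% sensitivity, 9% false-positive rate.
prevalence = 0.01
sensitivity = 0.90
false_positive_rate = 0.09

# Conditional-probability version: Bayes' rule for P(disease | positive test).
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
ppv_bayes = sensitivity * prevalence / p_positive

# Natural-frequency version: think of 1,000 concrete people.
n = 1000
sick = n * prevalence                                  # 10 people have the disease
true_positives = sick * sensitivity                    # 9 of them test positive
false_positives = (n - sick) * false_positive_rate     # about 89 healthy people also test positive
ppv_natural = true_positives / (true_positives + false_positives)

print(round(ppv_bayes, 3), round(ppv_natural, 3))      # both about 0.092
```

Either way, only about 1 in 11 people with a positive result actually has the disease, which is exactly the kind of number both physicians and patients tend to misjudge when the problem is stated in conditional probabilities.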

We discuss some of the major progress made so far, focusing on the discovery of less-is-more effects and on the study of the ecological rationality of heuristics, which examines in which environments a given strategy succeeds or fails, and why. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. This line of research has its limitations, and a few suggestions for further work have emerged from it.
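One way to see how a biased mind can be more robust is a small bias-variance simulation (my own hedged illustration, not an analysis from the article): with only a handful of training cases, a unit-weight “tallying” rule that ignores the exact cue weights can predict new cases about as well as, or better than, a full least-squares regression, because it trades a little bias for much less variance.

```python
# Illustrative less-is-more simulation (assumptions: a linear environment with
# five noisy cues that all point the same way, and only ten training cases).
# Tallying adds the cues with equal weights and fits one scale factor;
# OLS estimates all five weights and tends to overfit when data are this scarce.
import numpy as np

rng = np.random.default_rng(1)
n_cues, n_train, n_test, n_reps = 5, 10, 500, 200
tally_err, ols_err = [], []

for _ in range(n_reps):
    true_w = rng.uniform(0.5, 1.5, n_cues)
    X_tr = rng.normal(size=(n_train, n_cues))
    X_te = rng.normal(size=(n_test, n_cues))
    y_tr = X_tr @ true_w + rng.normal(scale=2.0, size=n_train)
    y_te = X_te @ true_w + rng.normal(scale=2.0, size=n_test)

    # Full regression: five estimated weights.
    w_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
    ols_err.append(np.mean((X_te @ w_ols - y_te) ** 2))

    # Tallying: unit weights, a single scale factor estimated from the data.
    s_tr, s_te = X_tr.sum(axis=1), X_te.sum(axis=1)
    b = (s_tr @ y_tr) / (s_tr @ s_tr)
    tally_err.append(np.mean((b * s_te - y_te) ** 2))

print("OLS mean test MSE:     ", round(float(np.mean(ols_err)), 2))
print("Tallying mean test MSE:", round(float(np.mean(tally_err)), 2))
```

On this assumed setup the simpler rule wins; with ample data or strongly unequal cue weights the ranking can reverse, which is the point of asking about ecological rationality.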

He conceptualizes rational decisions in terms of the adaptive toolbox (the repertoire of heuristics an individual or institution has) and the ability to choose a good heuristic for the task at hand. A heuristic is called ecologically rational to the degree that it is adapted to the structure of an environment. Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making, especially in medicine. A critic of the work of Daniel Kahneman and Amos Tversky, he argues that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive of rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus. I think just about everyone could benefit from reading the second chapter (on the difference between risk and uncertainty – I don’t have the math skills to enter into the more academic aspect of the debate, but it seems like a meaningful and true distinction in a world of real decisions) and the chapters on financial and medical decision making.

For instance, loss aversion is often presented as a truism; in contrast, a review of the literature concluded that the “evidence does not support that losses, on balance, tend to be any more impactful than gains” (Gal and Rucker, 2018). Research outside the heuristics-and-biases program that does not confirm this message (including most of the psychological research described in this article) is rarely cited in the behavioral economics literature (Gigerenzer, 2015). You can see how this can lead economists to get a distorted view of the content of psychology and cognitive science. Gigerenzer goes through a series of classic examples of cognitive errors, including the use of base rates in conditional probability, perceptions of patterns in short sequences, the hot hand, bias in estimates of risks, systematic errors in almanac questions, the Lake Wobegon effect, and framing effects.

Yet statistics, like every tool, has its limits outside a world of known risks. Gigerenzer calls for a second revolution, a heuristic revolution, that would provide us with a second set of tools, heuristics and their ecological rationality. These would equip humans to deal more specifically with the many situations they face in which not all alternatives and probabilities are known, and surprises can happen. With Daniel Goldstein he first theorized the recognition heuristic and the take-the-best heuristic.
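For readers curious what take-the-best looks like, here is a minimal sketch (my own simplified rendering with made-up cue values, not Gigerenzer and Goldstein’s original implementation): cues are checked one at a time in order of validity, and the first cue that discriminates between the two options decides.

```python
# Take-the-best sketch (simplified; cue order and cue values are assumed).
# Cues are ordered by validity; search stops at the first discriminating cue.

def take_the_best(a, b, cues):
    """cues: list of functions, highest validity first; each returns True or False."""
    for cue in cues:
        va, vb = cue(a), cue(b)
        if va != vb:
            return a if va else b          # one-reason decision: stop searching
    return a                               # no cue discriminates: guess

# Hypothetical city-size comparison with invented cue values.
cities = {
    "Hamburg":  {"capital": False, "top_league_team": True, "intercity_hub": True},
    "Hannover": {"capital": False, "top_league_team": True, "intercity_hub": False},
}
cue_order = [
    lambda c: cities[c]["capital"],
    lambda c: cities[c]["top_league_team"],
    lambda c: cities[c]["intercity_hub"],
]
print(take_the_best("Hamburg", "Hannover", cue_order))   # Hamburg
```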

This is a very good book about common misconceptions in understanding risks in everyday life as well as decision theory. I won this book in a Goodreads giveaway. It was pleasantly different from what I expected: I was expecting something about the psychology of risk.

Efficient decision heuristics

Where an exhaustive search is impractical, heuristic methods are used to speed up the process of finding a satisfactory solution. Bounded rationality is the idea that an individual’s ability to act rationally is constrained by the information they have, the cognitive limitations of their minds, and the finite amount of time and resources they have to make a decision. The examples were meant to show that humans have had plenty of time to develop techniques for analyzing situations systematically in the service of risk management and long-term thinking.
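To make “satisfactory rather than optimal” concrete, here is a small satisficing sketch (my own illustration; the option values are made up): instead of exhaustively scoring every option to find the best one, the decision maker takes the first option that clears an aspiration level.

```python
# Satisficing vs. exhaustive search (illustrative; option values are invented).
# Satisficing stops at the first option that meets the aspiration level,
# so it usually examines far fewer options than finding the true maximum.

def satisfice(options, aspiration):
    """Return (chosen value, number of options examined)."""
    for examined, value in enumerate(options, start=1):
        if value >= aspiration:
            return value, examined
    return None, len(options)              # nothing good enough: searched everything

option_values = [42, 55, 61, 90, 73, 88, 95, 60, 81, 70]   # hypothetical scores
print(satisfice(option_values, aspiration=85))   # (90, 4): good enough, found early
print(max(option_values), len(option_values))    # (95, 10): the optimum needs a full search
```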

The above studies suggest that similar phenomena may be observed regarding users’ credibility judgments of Wikipedia. That is, peripheral cues may affect the credibility judgments of college students concerning Wikipedia. Based on dual-process theories, Reinhard and Sporer (2010) conducted a series of experiments to test whether there were relationships between the use of source cues and the level of task involvement in making credibility judgments. One of their experiments used the attractiveness of images as a source cue, which can be considered a peripheral cue. They found that only peripheral cues influenced the credibility judgments of participants with low task involvement, whereas both central and peripheral cues had an impact on the credibility judgments of participants with high task involvement.

Further, an exploratory question regarding the use of other peripheral cues provides some useful insights into credibility research and library practice. When the respondents were uncertain about the credibility of a Wikipedia article, they tended to look at peripheral cues that they could easily process, and tended not to examine cues such as a discussion page, which demands more cognitive effort. Further research is needed to examine under which conditions users tend to use certain peripheral cues.

In addition, a high percentage of the respondents reported using external links. This result provides practical implications for library and educational practices. College libraries can promote their library sources or suggest other useful sources by actively inserting their sources or other useful Web sources into relevant Wikipedia articles.

While heuristics can speed up problem solving and decision making, they can also introduce errors. Just because something has worked in the past does not mean that it will work again, and relying on an existing heuristic can make it difficult to see alternative solutions or come up with new ideas.
