One definition of an expert is:
Basis of credibility of a person who is perceived to be knowledgeable in an area or topic due to his or her study, training, or experience in the subject matter.
The definition suggests that what the expert knows or informs others about can be trusted. We can view the advice or decision as credible and use it with confidence.
But can we? Yes and no. For background on why this might be the case, read this article on visual decision-making (here).
Having looked at the link on decision-making, you will understand that experts are prone to errors. They are also subject to biases, just like everyone else.
What is Expertise?
Many studies use the example of a chess Grand Master. Other studies look at firefighters, doctors, surgeons, engineers, and pilots.
Across many different fields of expert work, professionals make difficult decisions under conditions of complexity, uncertainty, and time constraints. The assumption is that these individuals have either:

- superior analytical and reasoning abilities, or
- a vast store of patterns in memory, built up through experience.

This type of study has been going on for over 70 years, conducted by various researchers.
Contemporary research strongly favors pattern recognition stored in memory as the determinant of expertise.
We gain expertise through the ability to recognize patterns that connect the problem's context to prior experience. This knowledge is acquired over many years of practice and study, and the patterns are stored in long-term memory. The level of expertise depends on the intensity and duration of training in a particular field of work.
When an expert recognizes a pattern (a chunk), it prompts the expert to recall a strategy based on previous experience. A good decision becomes possible.
For expertise to develop, two conditions are necessary:
Expertise = Experience + Quick Feedback
Expertise is nothing more than intuitive judgment built from experience and reinforced by specific and timely feedback. But not all domains (fields of work) can develop expertise, because timely feedback is not always available.
For example, chefs, teachers, surgeons, and firefighters can acquire genuine expertise, but stockbrokers and radiologists cannot.
What this suggests is that expertise is nothing more than vast amounts of knowledge, pattern-based retrieval, and planning mechanisms acquired over many years of experience in the associated domain.
A great memory is the essence of expertise. For example, an anesthesiologist will develop better intuition than a radiologist. This is because feedback is almost immediate for an anesthesiologist, whereas a radiologist often gets none. Without feedback, radiologists cannot learn and develop reliable patterns.
Experts use a small number of cues (no more than four) when dealing with difficult issues. These cues are triggers matched to past experiences. When a match is found, adjustments are made to account for the new context, and the decision is made.
This process works in relatively simple circumstances. But when the decision-maker faces complexity, where the new situation has many interacting variables (sometimes called wicked problems: environmental, political, social, and so on), the process falls apart.
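The cue-matching mechanism described above can be sketched as a toy simulation. Everything here, including the cue names, the stored patterns, and the overlap-based scoring rule, is invented for illustration; it loosely follows the idea that an expert extracts a handful of cues, retrieves the best-matching stored pattern, and acts on that pattern's strategy.

```python
# Toy sketch of expert cue matching: the expert extracts a few cues
# from the situation, finds the stored pattern with the most
# overlapping cues, and retrieves that pattern's strategy.
# All patterns, cues, and strategies below are hypothetical.

# "Long-term memory": patterns built up from years of experience.
PATTERNS = {
    "kitchen fire": {
        "cues": {"smoke", "heat", "grease smell"},
        "strategy": "ventilate and attack from entry",
    },
    "basement fire": {
        "cues": {"smoke", "heat", "floor vibration"},
        "strategy": "withdraw and attack from outside",
    },
}

def decide(observed_cues, patterns=PATTERNS, max_cues=4):
    """Match at most `max_cues` observed cues against stored patterns."""
    cues = set(list(observed_cues)[:max_cues])  # experts rely on few cues
    best_name, best_overlap = None, 0
    for name, pattern in patterns.items():
        overlap = len(cues & pattern["cues"])
        if overlap > best_overlap:
            best_name, best_overlap = name, overlap
    if best_name is None:
        # No stored pattern fires: the mechanism has nothing to retrieve.
        return "no pattern matches: intuition fails"
    return patterns[best_name]["strategy"]

# A familiar situation triggers a stored pattern:
print(decide({"smoke", "heat", "grease smell"}))
# A novel, cross-domain situation triggers nothing:
print(decide({"community displacement", "political pressure"}))
```

The second call illustrates the article's point: when a complex situation presents cues outside the expert's domain, no pattern matches and the retrieval mechanism simply fails rather than producing a considered analysis.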
To solve complex problems, we need superior analytical skills in generating and evaluating alternative solutions. But as we have seen, experts do not have this. They have domain-specific knowledge.
Complex problems typically mix several domains, not just one discipline. An engineer can design a dam wall that does not collapse. However, the engineer might be causing other problems, such as the displacement of communities due to changes in water levels.
Solving one aspect of a problem very often creates other problems. Experts have difficulty analyzing the real problem because it usually involves other domains in which they have no expertise.
They also tend to jump straight into solution mode without really understanding what the problem might be. This funnel effect happens because their stored patterns appear, at a superficial level, to match the situation: the request to build the dam wall. For the other 'stuff', such as the communities living on the existing riverbank, there is no pattern that matches their reality.
Complex problems are multi-causal and involve unknown situations with little or no precedent. In such cases, the expert is not going to provide the correct solution. Experts tend to address only the basic layer; they have no patterns for spotting potential problems that have nothing to do with calculating dam walls. Their expertise limits them: they cannot see the problem behind the problem. This happens because of how experts think. They rely on very few cues to retrieve a past solution, and that is the mechanism by which experts solve problems.
In a survey of 106 experts from 91 private and public companies in 17 countries, 85% were found to be poor at understanding the 'true' problem and often solved the 'wrong' problem. Had they used the tripartite approach of visual problem solving, they could have corrected this weakness.
Diagnosing the problem is more important than solving it, a point emphasized by people like Albert Einstein and Peter Drucker.
In short, becoming an expert means developing an automated, inbuilt algorithm. These automatic pattern triggers are fine when the problem is nowhere near complex, with no overlap with other areas such as social issues. It is true, you can claim expertise for calculating dam walls, but in real life the title of 'expert' carries more responsibility. For this reason, the 'super jobs' of the future (see this post) are in high demand. Dam wall calculations are best done by computers - oops, there goes an expert.
de Groot, A. D. (1978). Thought and choice in chess. The Hague: Mouton. (Original work published 1946)
Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Simon, H. A. (1992). What is an explanation of behavior? Psychological Science, 3, 150–161.
Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 49–81). New York: Cambridge University Press.