
Game Thinking and Troubleshooting to Solve Problems

One of the great constraints for many game thinkers is that, though they are skilled at developing winning strategies, they are not very skilled at avoiding errors in judgment. One cannot be a master strategist and problem solver without a command of both skill sets.

In order to develop this skill one must have a profound understanding of numerous concepts drawn from game theory and design thinking, including collaborative intelligence, critical mass, tipping points, ripple effects, Black Swans, the Butterfly Effect, the Support Triangle, cognitive bias, the Trembling Hand, puzzle thinking, and other elements.


Within the Game Thinking community there are troubleshooters whose job it is to check the understanding of experts and specialists for mental errors. Any skilled game theorist knows that the more brilliant, knowledgeable, and expert a specialist is, the more likely they are to make some error, often a small one, that can have a major impact within any system, especially a game space. Troubleshooters are individuals, often experts themselves, who are skilled at focusing on just the types of thinking errors certain types of specialists are likely to make. Acclaimed experts are often confused, and even annoyed, at the idea that someone as respected and knowledgeable as themselves could possibly need a less acclaimed person looking over their shoulder and making suggestions. Yet any serious strategist soon comes to appreciate these individuals.

 

Visionary Thinking: Tips, Techniques & Strategies by Lewis Harrison

The more acclaimed a person is in a specialty, the greater the likelihood that they may come to see themselves as an infallible expert. This is when they get themselves into trouble. Wherever there is genius at work there will be judgments made, and where there are judgments made there must also be some level of uncertainty. Taking this stream of thought to the next level: wherever there is uncertainty, there is an opportunity for human fallibility and preventable human error.

There are a number of key reasons why highly skilled individuals make mistakes that can have major ripple effects, costing millions of dollars and leading to death and destruction. These include:

  • Failing to see that the information they received from other experts on their own team or project was unreliable.
  • Paying attention mainly to what they were asked to pay attention to, thus missing the bigger picture.
  • Failing to notice what they were not directly asked to notice.
  • Addressing a small problem without realizing that it is merely a reflection or an indication of a much larger problem.

The reverse also happens. In this situation an expert deals with the larger problem without realizing that there is a very small, seemingly irrelevant constraint at the root of it. When either the large problem or the small one is ignored, one may win the battle but lose the war.

Troubleshooters often ask, and even train, experts to notice small details in an environment that they might not have noticed before. Is something missing that is usually there? Has there been a change in an old pattern without any apparent reason or explanation for that change? Any skilled mentalist or magician can tell you that there is a lot to be learned about people and situations through careful observation.

The great challenge for most experts is that they tend to notice only what they were trained to notice. This is especially so among engineers and medical doctors.

One of the great challenges here is information bias, a type of cognitive bias that involves a distorted evaluation of information. An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant to the decision. This problem has been observed in studies of physicians diagnosing fictitious (hypothetical) diseases: they often request additional tests even when the results could not change the diagnosis or treatment.

Experts often isolate a constraint in a system without considering, statistically, that another cause is more likely for that constraint. Many game thinkers think statistically, but not statistically enough. They seldom consider that probabilities apply to their own situation, problem, or game scenario. Most ordinary individuals don't believe statistical probabilities apply to them. Most drunk drivers don't think the statistics showing that they are more likely to be killed if they drive "under the influence" than if sober apply to them.

I have many brilliant friends who believe whatever their own statistical analysis supports. The problem is that they are not skilled statisticians, and even if they were, they are not the kind of individuals likely to reach out to a game thinking troubleshooter to check their "numbers" and the conclusions they have reached from those numbers. In doing so they ignore the representativeness heuristic, a heuristic used when making judgments about the probability of an event under uncertainty. It is one of a group of heuristics (simple rules governing judgment or decision-making) proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s.
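To make the statistical point concrete, here is a minimal sketch, with invented numbers, of the kind of error the representativeness heuristic produces when base rates are ignored: a symptom may strongly resemble a rare cause and still be far more likely to come from a mundane one.

```python
# A minimal illustration of base-rate neglect (all figures are hypothetical).
# An "exotic" failure is rare (1%) but almost always produces the symptom;
# a mundane cause is common (99%) and produces the same symptom 10% of the time.

def posterior(prior, likelihood, prior_other, likelihood_other):
    """P(cause | symptom) for two mutually exclusive candidate causes, by Bayes' rule."""
    evidence = prior * likelihood + prior_other * likelihood_other
    return (prior * likelihood) / evidence

p_exotic = posterior(prior=0.01, likelihood=0.95,
                     prior_other=0.99, likelihood_other=0.10)

print(f"P(exotic cause | symptom) = {p_exotic:.2f}")  # about 0.09
# Despite the strong resemblance, under these assumed figures the exotic cause
# explains the symptom less than one time in ten.
```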

 

Experts, more than anyone else, need to be conscientious when defining the source of a problem. Often something instantly pops into their mind that seems to perfectly explain the core constraint and all the related elements rippling out from it. Just before acting on what seems obvious, it is important to bring in a troubleshooter. This backup expert helps the master game thinker just before they act. Often what first came to mind was correct, yet this is not always the case, and if the game thinker is wrong the consequences can be grave.

As Kahneman and Tversky said long ago, a person making a prediction is allowed to ignore statistics that question a conclusion only if they are completely certain they are correct. In a complex game scenario one can never be completely certain about anything. When you are certain, a Nash equilibrium can help you define the best strategy, but in human situations one can never be too sure. This is why a skilled game thinking troubleshooter can be so valuable.
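For readers who want to see what that kind of formal certainty looks like, here is a minimal sketch of checking a small two-player game for Nash equilibria. The payoff matrix is a hypothetical prisoner's-dilemma-style example, not drawn from any real game space.

```python
# Each cell holds (row player's payoff, column player's payoff); values are invented.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def is_nash(row, col):
    """A profile is a Nash equilibrium if neither player gains by deviating alone."""
    row_payoff, col_payoff = payoffs[(row, col)]
    row_ok = all(payoffs[(r, col)][0] <= row_payoff for r in strategies)
    col_ok = all(payoffs[(row, c)][1] <= col_payoff for c in strategies)
    return row_ok and col_ok

equilibria = [(r, c) for r in strategies for c in strategies if is_nash(r, c)]
print(equilibria)  # [('defect', 'defect')] is the only equilibrium of this matrix
```

Notice that the check only works because every payoff is known with certainty; in messy human situations those numbers are exactly what we cannot pin down, which is the point made above.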

Let’s explore this further.

Game Thinking Troubleshooting* (GTT) is a form of problem solving, often applied to repair strategies or to fix problems in processes within any game space or system. It is a logical, systematic search for the source of a problem or constraint in order to solve it and renew or recreate a winning strategy. The first step is to identify the symptoms. Determining the most likely cause is a process of elimination: ruling out potential causes of the problem or constraint one by one. Finally, GTT requires confirmation that the solution restores the strategy or process to its peak state.
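As a minimal sketch of that three-step loop, the routine below works through a list of candidate causes, ordered most likely first, and confirms the fix before declaring success. Every name and check in it is a hypothetical placeholder, not part of any real GTT toolkit.

```python
def troubleshoot(symptom, candidate_causes, confirm_fixed):
    """candidate_causes: list of (name, is_cause, apply_fix), ordered most likely first.
    is_cause and apply_fix are callables; confirm_fixed() re-tests the expected behavior."""
    print(f"Symptom observed: {symptom}")
    for name, is_cause, apply_fix in candidate_causes:
        if not is_cause():
            print(f"Eliminated: {name}")          # step 2: process of elimination
            continue
        print(f"Probable cause identified: {name}")
        apply_fix()
        if confirm_fixed():                       # step 3: confirm the restoration
            print("Confirmed: strategy/process restored to its expected state.")
            return name
        print(f"Fixing '{name}' did not clear the symptom; continuing the search.")
    print("No candidate cause confirmed; widen the search.")
    return None

# Usage shape (all three callables are hypothetical):
# troubleshoot("nothing was printed",
#              [("printer offline", check_offline, bring_online)],
#              test_print)
```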

In general, GTT is the identification and diagnosis of a "problem or constraint" in the application of a strategy by an individual or team. The problem or constraint is initially described as symptoms of malfunction, and GTT is the process of determining and remedying the causes of those symptoms.

As we have discussed earlier in this book, a system can be described in terms of its expected, desired, or intended behavior. Events or specific strategies are expected to generate specific results or outputs. A simple example: selecting the "print" option from various computer applications is intended to result in a hard copy emerging from some specific device. Any unexpected or undesirable behavior is a symptom, and GTT is the process of isolating the specific cause or causes of the symptom. Frequently the symptom is nothing more than a failure of the strategy to produce any result at all (nothing was printed, for example). Corrective action can then be taken to prevent further failures of a similar kind. But what action is it best to take?

Some of the most skilled game thinking troubleshooters recognize that one source of a problem or constraint in a game space is failed "tools." Here they may need to use methods drawn from forensic engineering in tracing the problem or constraint. Forensic engineering is the investigation of materials, products, structures, or components that fail or do not operate or function as intended. The consequences of this type of failure are typically dealt with through the law of product liability.

In Game Thinking, a wide range of analytical techniques is available to determine the cause or causes of specific failures, and corrective action can then be taken to prevent further failures of a similar kind. Preventive action is possible using failure mode and effects analysis (FMEA)* and fault tree analysis (FTA)* before full-scale production, and these methods can also be used for failure analysis*.
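FMEA is commonly operationalized by rating each failure mode for severity, occurrence, and detectability (conventionally on scales of 1 to 10) and multiplying the three into a Risk Priority Number (RPN) that ranks what to address first. The sketch below uses invented failure modes and ratings purely for illustration.

```python
failure_modes = [
    # (failure mode,               severity, occurrence, detection)  (invented ratings)
    ("wrong input data",                  7,          6,         4),
    ("misread opponent strategy",         9,          3,         7),
    ("tool or component breakdown",       8,          2,         3),
]

# RPN = severity * occurrence * detection; a higher RPN gets attention first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for mode, s, o, d in ranked:
    print(f"{mode:<30} RPN = {s * o * d}")
```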

Usually troubleshooting is applied to something that has suddenly stopped working, since its previously working state forms the expectations about its continued behavior. So the initial focus is often on recent changes to the system or to the environment in which it exists (for example, a printer that "was working when it was plugged in over there"). However, there is a well-known principle that correlation does not imply causality: the failure of a device shortly after it has been plugged into a different outlet does not necessarily mean that the events were related; the failure could have been a coincidence. Therefore, troubleshooting demands critical thinking rather than magical thinking.

It is useful to consider our common experience with light bulbs. Light bulbs "burn out" more or less at random; eventually the repeated heating and cooling of the filament, and fluctuations in the power supplied to it, cause the filament to crack or vaporize. The same principle applies to most other electronic devices, and similar principles apply to mechanical devices. Some failures are simply part of the normal wear and tear of components in a system.

A basic principle in troubleshooting is to start with the simplest and most probable problems first. This is illustrated by the old saying "When you see hoof prints, look for horses, not zebras," or, to use another maxim, the KISS principle* (Keep it simple, stupid!). This principle lies behind the common complaint about help desks and manuals that they sometimes begin by asking, "Is it plugged in, and does that receptacle have power?" This should not be taken as an affront; rather, it should serve as a reminder, or conditioning, to always check the simple things first before calling for help.

A game thinking troubleshooter could check each element in a system one by one, substituting known good components or approaches for each potentially suspect one. However, this process of "serial substitution" degenerates when components are swapped without regard to a hypothesis about how their failure could produce the symptoms being diagnosed.

Simple and intermediate systems are characterized by lists or “information trees*” of dependencies among their components or subsystems. More complex systems require more sophisticated approaches.
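As a minimal sketch of how such a dependency list or tree can drive fault isolation, the example below walks from a failing top-level behavior down through its dependencies and reports the deepest component that fails its own check. The tree, the health checks, and the "nothing was printed" scenario are all hypothetical.

```python
# Hypothetical dependency tree, loosely following the earlier printing example.
dependencies = {
    "print job":   ["application", "printer"],
    "application": [],
    "printer":     ["power", "cable", "paper"],
    "power": [], "cable": [], "paper": [],
}

# Hypothetical health checks; in practice each value would come from a real test.
status = {"print job": False, "application": True, "printer": False,
          "power": True, "cable": False, "paper": True}

def isolate(component):
    """Return the deepest failing dependency of a failing component, else None."""
    if status[component]:
        return None
    for child in dependencies[component]:
        deeper = isolate(child)
        if deeper is not None:
            return deeper
    return component  # no failing child, so this component is the likely culprit

print(isolate("print job"))  # -> "cable"
```

Depth-first isolation of this kind embodies the hypothesis-driven approach mentioned above: each check is run only because the component sits on the dependency path of the observed symptom, rather than by blind serial substitution.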

oooooooooooooooooooooooooooooooooooooo

You can purchase a number of books on game thinking and information at:

 

http://www.realuguru.com/products/ebooks/the-realugurus-guide-to-wealth-and-success-through-big-data-and-information-science

 

You can also purchase other titles, including:

The RealUGuru’s Guide To Healing Your Emotions

 

Lewis Harrison is a writer, content-rich motivational speaker, and entrepreneur specializing in game-based thinking, applied game theory, and Game Thinking.

 

Known as the RealUGuru, he is the author of over twenty-two books published in five languages, including business books.

Don't forget to tune in to my radio show today on WIOX 91.3 FM or on your smart device at WIOXRadio.org.