
A Procedural Approach to Problem Solving and Decision Making: Baby Steps

If you’ve ever tried untangling kite string, you know that starting in the wrong place can leave you with lots of knots. The same might be said about trying to solve complex problems. If you dive into a solution without much thought, you might make things messier than ever. With that in mind, for complex problems that involve multiple decisions, having a systematic, well-thought-out plan can help you avoid some serious snarls. In fact, dozens of authors have described step-by-step guidelines for both individuals and groups faced with the task of finding solutions (e.g., see Beebe & Masterson, 2003; Bransford & Stein, 1984; Dewey, 1910; Hammond, Keeney & Raiffa, 1999; Hayes, 1981; Vaughn, 2007). Although such guidelines vary to some degree, practically all include some or most of the steps that we briefly outline in this section. Although abiding by such a “cookie cutter” approach does not guarantee success, we believe that, under most circumstances, considering these steps will help you avoid many common pitfalls that await unsuspecting problem solvers and decision makers.

Step 1: Define the Problem

The old saying, “A problem well stated is a problem half solved,” suggests that taking time to articulate the problem and your assumptions about the problem is time well spent (Beebe & Masterson, 2003). As Hammond et al. (1999) noted, how you pose a problem has profound implications for the course of action you choose. For instance, if you’ve moved to a new city and ask “Which apartment should I pick?” you may prevent yourself from considering the possibility of renting a house, buying a condo, finding a roommate, and so forth (Hammond et al., 1999). With that in mind, Beebe and Masterson (2003) suggest that problem solvers start by asking themselves questions such as: What is the specific problem? Is the problem stated clearly? Who is harmed by the problem?

Figure 10.5 Cartoon: “Off the Mark” by Mark Parisi, 2009-11-02. © Mark Parisi.

Step 2: Analyze the Problem

In this step, problem solvers investigate causes and effects of the problem. Along the way, they gather facts that help them understand how serious the problem is, where it came from, and what obstacles might keep them from solving the problem. Skipping this step can lead to terrible results. For example, back when he was a student, one of the authors worked in a department store, selling men’s suits on commission. One day, a new manager decided to hire more salespeople. Apparently, the manager noticed a lot of customers waiting around and concluded that there wasn’t enough help on the sales floor. The real cause of the problem, however, was not enough tailors. The customers, it turns out, were not waiting to purchase suits. They were keeping themselves busy while waiting to be fitted! By not searching for facts, and misdiagnosing the cause of the problem, the manager made matters worse. Specifically, with fewer sales to go around, the star salespeople found jobs elsewhere, leaving mediocre staff behind. Not surprisingly, the department’s profits plummeted.

Step 3: Establish Criteria

For this step, the problem solver determines what standards must be met in order for the problem to be considered resolved. By way of example, when shopping for a car, you might list several criteria—e.g., is it safe, comfortable, snazzy?—that need to be satisfied before deciding to make a purchase. In the same way, a manager in a suit department might decide that a problem is resolved if customers aren’t waiting, salespeople are happy, profits are high, and so forth.

Step 4: Generate Possible Solutions

In this step, an approach known as brainstorming, which fosters the creative generation of ideas, can be useful. Although a number of techniques have been proposed (see Vaughn, 2007), the classic approach was created by Alex Osborn (1957). According to this approach, while working in groups, problem solvers should: 1) generate as many solutions, no matter how wild or outlandish, as possible; 2) avoid criticizing or judging any ideas for the time being; and 3) build on other people’s ideas.

Step 5: Select the Best Solution

To accomplish this step, problem solvers might consider each possible solution alongside the criteria generated in Step 3, and then decide which solution seems to meet the criteria best. In addition, for each solution, problem solvers might generate a list of pros and cons. For example, although training salespeople to measure and mark suits for alteration (solution 1) might lessen the tailors’ workload and provide continuity in customer service (pros), it could also lead to fewer sales and more alteration mistakes (cons). Once such a list is generated, the advantages and disadvantages of each solution can be compared and analyzed until the best solution is found.
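Taken together, Steps 3 and 5 amount to a weighted comparison of candidate solutions against the criteria, which can be sketched in a few lines of code. The criteria, weights, and 1–5 ratings below are hypothetical illustrations loosely based on the suit-department example, not part of any formal method described in the chapter.

```python
# A minimal sketch of Steps 3 and 5 as a weighted decision matrix.
# All criteria, weights, and ratings are hypothetical illustrations.

def best_solution(weights, scores):
    """Pick the candidate whose weighted criterion ratings sum highest.

    weights: {criterion: importance weight}
    scores:  {candidate: {criterion: rating, ...}}
    Returns (winning candidate, {candidate: weighted total}).
    """
    totals = {
        name: sum(weights[c] * ratings.get(c, 0) for c in weights)
        for name, ratings in scores.items()
    }
    return max(totals, key=totals.get), totals

# Hypothetical criteria and weights for the suit department (Step 3).
weights = {"customer_wait": 0.5, "staff_morale": 0.3, "profit": 0.2}

# Hypothetical 1-5 ratings for two candidate solutions (Step 4).
scores = {
    "hire_more_salespeople": {"customer_wait": 2, "staff_morale": 3, "profit": 1},
    "hire_more_tailors": {"customer_wait": 5, "staff_morale": 4, "profit": 4},
}

winner, totals = best_solution(weights, scores)
print(winner)  # hire_more_tailors
```

Listing pros and cons, as described above, is the qualitative version of the same comparison; attaching explicit weights simply makes the trade-offs visible.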

Step 6: Implement the Solution

Just because a solution is found, does not necessarily mean it will be successful. Its chances are better, though, if some thought is put into how the solution will be put into effect. Thus, for this last step, problem solvers should consider questions such as: Who is in charge of applying the solution? What must be accomplished to implement the solution? How will the changes be explained to others? And how long will the implementation take?

Decision Making in Groups: Are Two (or More) Heads Better Than One?

Although old sayings such as “Too many cooks spoil the soup” or “A camel is a horse designed by a committee” suggest that groups are lousy problem solvers, if group members are able to pool resources while managing both social and task-related issues, the quality of their decisions will usually be higher than when working alone (Cathcart, Samovar & Henman, 1996). That “if” is a big one, though, and volumes have been written on how groups can be more effective. Although we don’t have space to cover all the topics necessary to make you a group guru, we use the remainder of this chapter to warn you about two hurdles that can get in your way when you are working in groups.

Groupthink: Don’t Rock the Boat!

In Chapter 2 we discussed a phenomenon known as groupthink (Janis, 1972), which, if you recall, occurs when members in a group are so concerned with getting along with each other that they make bad decisions. Among other things, such groups are characterized by overconfidence, closed-mindedness, and intolerance of disagreement within the group. With that in mind, if you ever find yourself in such a group, Janis (1972) suggested several tactics for reducing groupthink. Most of these focus on encouraging critical thinking and argumentation. For instance, disagreement should be encouraged from people inside the group and invited from people outside the group. Along the way, group members can be assigned the role of devil’s advocate, or work individually or in smaller teams to consider alternative viewpoints.

Social Loafing: Couch Potatoes Need Not Apply

If you’ve ever caught yourself slacking off while riding a tandem bicycle, moving a heavy piece of furniture with others, or completing a group project, then you are already familiar with social loafing. According to Karau and Williams (1993), “social loafing is the reduction in motivation and effort when individuals work collectively compared with when they work individually or coactively” (p. 681). Previous research suggests that there are several potential causes for such loafing (for a review, see Gass & Seiter, 2018). Whatever the cause, however, it’s clear that loafing can impede groups from reaching their full potential. With that in mind, what might be done to reduce social loafing? Previous research indicates that social loafing may be attenuated when people identify with the group and are held accountable for their work (Barreto & Ellemers, 2000). As such, building rapport in a group while monitoring individual performance might be a useful approach.


In this chapter we examined several topics related to judgment, decision making, and problem solving. First, we examined expected value and expected utility, two normative models of decision making rooted in economics and mathematics. Second, we saw that several heuristics and biases can function to hinder good judgment and decision making. These included sunk cost effects, message framing, the availability heuristic, the representativeness heuristic, overconfidence, and the confirmation bias. Third, we presented a procedural approach to problem solving and decision making that outlined the steps one might take individually or in groups when tackling a complex issue or decision. Finally, we talked about two phenomena, groupthink and social loafing, which can impede effective decision making in groups.


  • 1 Technically speaking, whether or not a computer can generate a truly random number depends on what you mean by “random.” For details, see Rubin (2011).
  • 2 Likewise, people lack the capacity to generate other random responses such as pressing keys on a keyboard (see Baddeley, Emslie, Kolodny & Duncan, 1998).
  • 3 Although we’ve not verified this rumor (to see it yourself, Google “car horn key f”), regardless of its accuracy, car horns honk in one key or another.


Aissia, D.B. (2016). Developments in non-expected utility theories: An empirical study of risk-aversion. Journal of Economics and Finance, 30(2), 299-318, doi: 10.1007/s12197-014-9305-3.

Arkes, H.R. & Blumer, C. (1985). The psychology of sunk costs. Organizational Behavior and Human Decision Processes, 35(1), 124-140, doi: 10.1016/0749-5978(85)90049-4.

Arkes, H.R., Christensen, C., Lai, C. & Blumer, C. (1987). Two methods of reducing overconfidence. Organizational Behavior and Human Decision Processes, 39(1), 133-144.

Baddeley, A., Emslie, H., Kolodny, J. & Duncan, J. (1998). Random generation and the executive control of working memory. Quarterly Journal of Experimental Psychology, 51(4), 819-852, doi: 10.1080/713755788.

Bakan, P. (1960). Response tendencies in attempts to generate random binary series. American Journal of Psychology, 73(1), 127-131, doi: 10.2307/1419124.

Barberis, N.C. (2013). Thirty years of prospect theory in economics: A review and assessment. The Journal of Economic Perspectives, 27(1), 173-195, doi: 10.1257/jep.27.1.173.

Barreto, M. & Ellemers, N. (2000). You can’t always do what you want: Social identity and self-presentational determinants of the choice to work for a low-status group. Personality and Social Psychology Bulletin, 26(8), 891-906, doi: 10.1177/01461672002610001.

Barsalou, L.W. (1992). Cognitive psychology: An overview for cognitive scientists. Hillsdale, NJ: Erlbaum.

Beebe, S.A. & Masterson, J.T. (2003). Communicating in small groups (7th Ed.). Boston, MA: Allyn & Bacon.

Birkett, A.J. (2015). Status of vaccine research and development of vaccines for malaria. Vaccine, 34(26), 2915-2920, doi: 10.1016/j.vaccine.2015.12.074.

Bransford, J.D. & Stein, B.S. (1984). The ideal problem solver: A guide for improving thinking, learning, and creativity. New York: Freeman.

Buehler, R., Griffin, D. & Ross, M. (1994). Exploring the “planning fallacy”: Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67(3), 366-381, doi: 10.1037/0022-3514.67.3.366.

Cathcart, R.S., Samovar, L.A. & Henman, L.D. (1996). Small group communication: Theory and practice (7th Ed.). Madison, WI: Brown & Benchmark.

Chabris, C. & Simons, D. (2010). The invisible gorilla and other ways our intuitions deceive us. New York: Crown.

Cooper, A., Woo, C. & Dunkelberg, W. (1988). Entrepreneurs’ perceived chances for success. Journal of Business Venturing, 3(2), 97-108, doi: 10.1016/0883-9026(88)90020-1.

Della Vigna, S. & Malmendier, U. (2006). Paying not to go to the gym. American Economic Review, 96(3), 694-719, doi: 10.1257/aer.96.3.694.

de Roos, N. & Sarafidis, Y. (2010). Decision making under risk in Deal or No Deal. Journal of Applied Econometrics, 25(6), 987-1027, doi: 10.1002/jae.1110.

Dewey, J. (1910). How we think. New York: Heath.

Gass, R.H. & Seiter, J.S. (2018). Persuasion, social influence, and compliance gaining (6th Ed.). Boston, MA: Routledge.

Gilovich, T., Griffin, D. & Kahneman, D. (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge, MA: Cambridge University Press.

Gladwell, M. (2005). Blink: The power of thinking without thinking. New York: Little, Brown, and Company.

Hammond, J.S., Keeney, R.L. & Raiffa, H. (1999). Smart choices: A practical guide to making better decisions. New York: Broadway Books.

Hardman, D. (2009). Judgment and decision making: Psychological perspectives. Oxford: Blackwell.

Harth, M. (2013, April 16). 5 tips from Warren Buffet on mindfulness [Blog post]. Huffington Post. Retrieved on December 22, 2017.
Hastie, R. & Dawes, R.M. (2010). Rational choice in an uncertain world: The psychology of judgment and decision making (2nd Ed.). Los Angeles, CA: Sage.

Hayes, J.R. (1981). The complete problem solver. Philadelphia, PA: Franklin Institute Press.

Hogarth, R.M., Karelaia, N. & Trujillo, C.A. (2012). When should I quit? Gender differences in exiting competitions. Journal of Economic Behavior & Organization, 83(1), 136-150, doi: 10.1016/j.jebo.2011.06.021.

Ingraham, C. (2015, June 16). Chart: The animals that are most likely to kill you this summer. Washington Post. Retrieved on December 22, 2017.

Janis, I.L. (1972). Victims of groupthink. Boston, MA: Houghton Mifflin.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus, and Giroux.

Kahneman, D., Knetsch, J.L. & Thaler, R.H. (1990). Experimental tests of the endowment effect and the Coase theorem. Journal of Political Economy, 98(6), 1325-1348, doi: 10.1086/261737.

Kahneman, D., Slovic, P. & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge, MA: Cambridge University Press.

Kahneman, D. & Tversky, A. (2013). Prospect theory: An analysis of decision under risk. In L.C. MacLean & W.T. Ziemba (Eds), Handbook of the fundamentals of financial decision making (Part 1, pp. 99-127). Toh Tuck Link, Singapore: World Scientific Publishing.

Kahyaoglu, M.B. & Ican, O. (2017). Risk aversion and emotions in DoND. International Journal of Economics and Finance, 9(1), 32-46, doi: 10.5539/ijef.v9n1p32.

Karau, S.J. & Williams, K.D. (1993). Social loafing: A meta-analytic review and theoretical integration. Journal of Personality and Social Psychology, 65(4), 681-706, doi: 10.1037/0022-3514.65.4.681.

Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134, doi: 10.1037/0022-3514.77.6.1121.

Langer, E.J. & Roth, J. (1975). Heads I win, tails it’s chance: The illusion of control as a function of the sequence of outcomes in a purely chance task. Journal of Personality and Social Psychology, 32(6), 951-955, doi: 10.1037/0022-3514.32.6.951.

Lehrer, J. (2009). How we decide. Boston, MA: Houghton Mifflin Harcourt.

Levy, S. (2005, February 6). Does your iPod play favorites? Newsweek online. Retrieved on August 24, 2018.

Lichtenstein, S., Fischhoff, B. & Phillips, L.D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds), Judgment under uncertainty: Heuristics and biases (pp. 306-334). Cambridge, MA: Cambridge University Press.

Lord, C.G., Ross, L. & Lepper, M.R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109, doi: org/10.1037/0022-3514.37.11.2098.

Maxwell, N.L. & Lopus, J.S. (1994). The Lake Wobegon Effect in student self-reported data. The American Economic Review, 84(2), 201—205.

McKenzie, C.R.M. (2005). Judgment and decision making. In K. Lamberts & R.L. Goldstone (Eds), The handbook of cognition (pp. 321-338). London: Sage.

Morris, A.H., Lee, K.H. & Orme, J. (2017). Illusory superiority: A barrier to reaching clinical practice goals [Abstract]. American Journal of Respiratory and Critical Care Medicine, 195, A1226.

Osborn, A.F. (1957). Applied imagination. New York: Scribner.

Payne, J.W., Bettman, J.R. & Johnson, E.J. (1988). Adaptive strategy selection in decision making. Journal of Experimental Psychology: Learning, Memory, and Cognition, 14(3), 534-552, doi: 10.1037/0278-7393.14.3.534.

Plous, S. (1993). The psychology of judgment and decision making. New York: McGraw-Hill.

Post, T., van den Assem, M.J., Baltussen, G. & Thaler, R.H. (2008). Deal or no deal? Decision making under risk in a large-payoff game show. American Economic Review, 98(1), 38-71, doi: 10.1257/aer.98.1.38.

Rubin, J.M. (2011, November 1). Can a computer generate a truly random number? Ask an Engineer. Retrieved February 23, 2019.

Savage, L.J. (1954). The foundations of statistics. New York: Wiley.

Schick, T., Jr. & Vaughn, L. (2011). How to think about weird things: Critical thinking for a new age (6th Ed.). New York: McGraw Hill.

Simon, H.A. (1956). Rational choice and the structure of the environment. Psychological Review, 63, 129-138.

Simons, D.J. & Chabris, C.F. (2010, May 30). The trouble with intuition. The Chronicle of Higher Education. Retrieved on June 6, 2010.

Tetlock, P.E. (2005). Expert political judgment: How good is it? How can we know? Princeton, NJ: Princeton University Press.

Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1130, doi: 10.1126/science.185.4157.1124.

Tversky, A. & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453-458, doi: 10.1126/science.7455683.

Vaughn, R.H. (2007). Decision making and problem solving in management (3rd Ed.). Brunswick, OH: Crown Custom Publishing.

von Neumann, J. & Morgenstern, O. (1947). Theory of games and economic behavior (3rd Ed.). New York: Science editions.

Wilson, T.D. & Schooler, J.W. (1991). Thinking too much: Introspection can reduce the quality of preferences and decisions. Journal of Personality and Social Psychology, 60(2), 181-192, doi: 10.1037/0022-3514.60.2.181.

Wolf, J.H. & Wolf, K.S. (2013). The Lake Wobegon Effect: Are all cancer patients above average? Milbank Quarterly, 91(4), 690-728, doi: 10.1111/1468-0009.12030.
