Sunday, May 10, 2009

After the Smoke Clears: Organizational Learning in HROs

A lesson is truly learned when we modify our behavior to reflect what we now know

-Wildland Fire Lessons Learned Center


One hallmark of highly reliable organizations is their ability to focus on their failures. While an obsession with failure may seem contrary to successful operations, it is this preoccupation with errors that helps prevent mistakes before they occur. When errors do happen, highly reliable organizations pay attention, examine the causes of the mistakes, and take steps to ensure they are contained and avoided in the future.


Many HROs, such as the military and fire and police departments, use After Action Reviews (AARs) to facilitate this post-crisis learning. AARs enable groups to:

  • discuss what happened
  • identify what went well
  • pinpoint what needs improvement
  • determine what lessons can be learned from the experience

Successful organizations learn from mistakes, whether they are large or small. Of course, large, catastrophic errors are easy to detect, but successful organizations also pay close attention to potential errors (a misplaced ladder, forgotten safety goggles, a faulty lock, etc.).


The Wildland Fire Lessons Learned Center (LLC) is a website dedicated to “promoting a learning culture to enhance and sustain safe and effective work practices in the wildland fire community.” In an information collection team report, interviews with firefighters revealed potential warning signs that were overlooked:

  • “One week before the escape, the people who were continually on the burn repeatedly said ‘It’s just a matter of time until this escapes.’ This thought was not clearly communicated.”
  • “I didn’t pick up the extent of the drying trend as reported by field personnel.”
  • “In the field we were a little ‘fuzzy’ about the ramifications…”
  • “We missed checking the weather radar.”

Successful HROs are especially adept at paying attention to small errors and weak signals of trouble and at learning from these experiences. In order to learn from mistakes, organizations need to:

  1. Reflect on the context in which the learning will be applied, analyzing problems and issues from the outset
  2. Identify strategies that can be put in place to avoid the same mistakes
  3. Produce an action plan, allocate tasks, and set deadlines
  4. Revisit the action plan regularly
  5. Share what was learned with colleagues to ensure they know of the improvements to be made

Of course, all of these steps require strong communication, both during the AAR and after it. Employees need to feel free to communicate potential errors up the corporate ladder, and management needs to be open to hearing suggestions. Furthermore, once errors are identified, they need to be communicated throughout the organization to foster learning.


Though small mistakes can be handled by these steps, crises of a larger scale demand a period of heightened awareness in which large-scale learning can take place. Organizational learning takes three forms: 1) retrospective sensemaking, 2) structural reconsideration, and 3) vicarious learning.


Retrospective sensemaking and structural reconsideration relate to many of the steps considered above. Organizations must be able to look back and make sense of the situation in order to analyze where things went wrong, and, at times, organizations may need to be restructured or their systems improved so that the mistake does not recur. An important addition, however, is the notion of vicarious learning.


High reliability organizations need to pay attention to their environment, their competitors, and their allies in order to learn from others’ mistakes. It is important to note that when a crisis occurs, it provides a learning opportunity not only for that organization but for other organizations as well.

Wednesday, April 1, 2009

Are we ready for a High Reliability…Government?

Theoretically, I would love for my government to function like an HRO and successfully manage the unexpected highs and lows of our daily functioning as a nation. Wouldn’t it be nice to know that our government consistently detects weak signals of growing danger and neutralizes threats before they become major headlines? Wouldn’t we all rest easier knowing that our current administration, and all of the local governments and agencies subject to it, successfully applies lessons learned from previous mistakes and competently avoids future failure? The Obama administration’s response to the flooding in North Dakota and Minnesota provides a snapshot of what such a “High Reliability Government” might entail.

President Obama personally called Fargo, N.D., Mayor Dennis Walaker to offer a pledge of resources well before flood levels reached their peak. The New York Times’ Monica Davey reported that there were so many National Guard, U.S. Army Corps of Engineers, FEMA, and other resources dispatched to the area that the city administrator, Pat Zavoral, commented that “they were getting in the way…But that’s the way we want it.” Since President Obama’s initial conversation with Mayor Walaker, 1,700 National Guard members have worked alongside hundreds of volunteers to fight the rising Red River, the acting FEMA director has been dispatched to the area, active-duty military aerial support has arrived, and President Obama has issued emergency and disaster declarations for both North Dakota and Minnesota.

Click here to watch a CNN iReport from a volunteer relaying her account of the flood preparation.

Nearly every American political campaign runs on the promise of change and better protection of and for constituents. I have often wondered what this would look like. Obama’s administration is clearly trying to avoid the type of delayed, reactive response the Bush administration was chastised for and instead implement a proactive model that builds on lessons from past administrative shortcomings. While this new approach seems to be welcomed in light of the public’s low expectations for government assistance, how long will the contentment last? Will the public begin to get uneasy with how quickly this administration steps up to help without invitation? Or will public memory of past administrations that were slow, or completely negligent, in anticipating and neutralizing potential danger remain long enough for our nation to appreciate a new trend of high reliability government? Look at the comments on this YouTube clip of Obama’s most recent public address before you decide.

I am not, in any way, trying to say that our current administration is the perfect model of high reliability organizing. However, I am highlighting some of the HRO tendencies of the current administration and the questions they raise about what is, and what is not, ideal about high reliability organizing. Even if our country were to have an administration that consistently exhibited HRO characteristics, we would never be fully satisfied.

Click here to find out how you can donate to relief efforts in the affected areas.

Monday, March 30, 2009

Stronger isn't always better: The issue of culture in HROs

When you look at many high reliability organizations (HROs) such as fire departments, NASA, the U.S. military, and hospitals, you notice one thing they have in common: a strong culture. We refer to these organizations as high reliability organizations because even though their work is considered high risk (i.e., complex, ambiguous, and dangerous), they are able to manage this risk appropriately and maintain a relatively safe environment. Often, a strong culture (one where organizational members have shared norms, values, and beliefs tied to organizational values) helps these organizations run like well-oiled machines.

However, having a strong culture doesn’t necessarily mean having a safe culture. Cultures with formally defined hierarchies must overcome the hurdle of rank when reporting errors. Status differences can often lead to a lack of clear communication during critical times. For instance, a study of airline safety found that more accidents occurred when the more experienced pilot was the one flying the plane. At first this may seem counterintuitive. However, a closer look yields more insight.

When a high-ranking individual is second in command, they are not concerned with offending the pilot in command when pointing out mistakes or potential trouble. On the other hand, when a lower-ranking individual must point out mistakes to the captain in command, they often use mitigated speech to avoid offending or embarrassing the higher-ranking individual.

HROs are successful and safe when weak signals of potential trouble are acknowledged and trusted no matter the rank of the employee who raises them. This applies to airline pilots, but also to the status differences between nurses and surgeons, captains and firefighters, and managers and employees.

A 2001 Agency for Healthcare Research and Quality white paper, Making Health Care Safer, identified four components of a culture of safety:

• Acknowledgement of the high-risk, error-prone nature of an organization's activities
• Blame-free environment where individuals are able to report errors and close calls without punishment
• Expectation of collaboration across ranks to seek solutions to vulnerabilities
• Willingness on the part of the organization to direct resources to address safety concerns



The Impact of National Culture



In his book Outliers, Malcolm Gladwell unravels the story of Korean Air. Between 1988 and 1998, Korean Air had a loss rate (planes lost per departure) more than 17 times higher than that of United Airlines, a typical American carrier. By all accounts, it had one of the worst reputations in the airline world.
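To put that gap in perspective, the loss-rate figures commonly cited from Outliers are approximately 4.79 losses per million departures for Korean Air over that period versus about 0.27 per million for United Airlines; 4.79 / 0.27 ≈ 17.7, which is where the “more than 17 times” figure comes from.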

Many of the problems with Korean Air flights could be traced to nuances within South Korean culture, mainly speech patterns between individuals of differing status. In South Korean culture, it is nearly unheard of to question or criticize a superior. As a result, lower-ranking pilots only hinted at weak signals, and these suggestions were rarely voiced a second time, even when it was clear that danger was approaching.

According to Hofstede’s research on cultural variability, South Korea scores extremely high on the power distance index, which measures a culture’s emphasis on status differences, rank, and authority. A report issued by Boeing found a clear correlation between a country’s rate of plane crashes and its score on Hofstede’s power distance dimension. In other words, countries that place more emphasis on status differences also had higher rates of airline accidents. From this example we see that national culture can have a strong influence on organizational culture.


When Korean Air realized that it needed a turnaround, it brought in an outsider, David Greenberg from Delta Air Lines. The first thing Greenberg did was make English the official language of Korean Air, an attempt to separate the organizational culture from a regimented national culture. Freed from the status gradients embedded in their native language, pilots were able to speak plainly about problems and to issue commands rather than suggestions in times of crisis. By 2006, Korean Air had turned itself around, and it has operated with an unblemished safety record since 1999.

See also:

Anatomy of a System Accident: The Crash of Avianca Flight 052

Wednesday, March 25, 2009

Strategic Ambiguity and Missile Strikes: The New Doctrine of the United States of America

Recent news reports indicate that U.S.-led missile strikes in the tribal regions between Pakistan and Afghanistan are on the rise. The purpose of the attacks is to continue the assault on Taliban and Al-Qaeda militants and thus thwart further terrorist attacks in the United States and around the world. In a news article and audio report on NPR, several experts from military and civilian higher education institutions weigh in on the ambiguity surrounding the increase in missile strikes. Some want Pakistan to openly condone the strikes, while others argue that such an act would open other countries up to similar strikes, U.S.-led or otherwise.


In the report, Ruth Wedgwood, an expert in international law at Johns Hopkins University, asserts that perhaps the United States is being deliberately ambiguous. Perhaps it is even “strategically ambiguous”. Such a characterization of the language surrounding the increased missile attacks connects current leader behavior in the U.S. military to the idea of strategic ambiguity as defined by Eric Eisenberg. In his book, “Strategic Ambiguities: Essays on Communication, Organization, and Identity”, Eisenberg discusses how organizations, and individuals in organizations, use language that is deliberately unclear to allow for multiple, often conflicting, interpretations. Although Eisenberg was referring to ambiguity in language generally, other scholars have weighed in on the very issues raised by the missile strikes in these tribal regions.

Strategically ambiguous communication, like that surrounding the missile strikes, allows for multiple plausible interpretations and gives leaders the ability to deny many of them. Some interpretations may attribute inaccurate motives to leaders, but the ambiguity of the message allows leaders to deny the most heinous of these interpretations. Again, the goal is not deception, but legal and plausible deniability if something were to go wrong.



In keeping with our current example, using language that is strategically ambiguous allows the troops in Pakistan and Afghanistan to continue to carry out assaults on areas that are not fully under either country’s sovereignty. Often the missile strikes are carried out by drones and later verified by ground troops. Because the U.S. keeps its rhetoric ambiguous, neither country can claim that the U.S. has overstepped its authority, and neither country wants to openly condone or condemn the attacks. Bloggers and experts alike say that this sort of limbo puts the U.S.-led coalition in danger of swift action on the part of either country. Although that is possible, it is likely that the status quo will hold until one of the countries attempts to clarify the ambiguity.


Tuesday, March 3, 2009

Managing Expectations to Manage the Unexpected

Crises inherently involve unanticipated events. And one way that leaders often shape how people around them think about crises is by talking about expectations. For example, much of the recent talk initiated and perpetuated by leaders within the U.S. government incorporates aspects of expectation management.

Take, for instance, President Barack Obama’s Feb. 24 address before a joint session of Congress. In his speech, which dealt largely with his plans to bolster the economy, Mr. Obama incorporated several elements of expectation management. Specifically, after discussing his immediate plans for economic recovery, Mr. Obama’s rhetoric shifted to describe plans with a decidedly future-oriented cast. To illustrate with a simple example, let’s consider the president’s use of two phrases: “short term” and “long term.”

For starters, Mr. Obama used the phrase “short term” twice while mentioning “long term” six times. But what is compelling from an expectation-management perspective is that both times that he said “short term,” he immediately juxtaposed “short term” with “long term.” Early in the speech, Mr. Obama compared the two, saying, “Short-term gains were prized over long-term prosperity.” Later, he said, “The recovery plan and the financial stability plan are the immediate steps we’re taking to revive our economy in the short-term. But the only way to fully restore America’s economic strength is to make the long-term investments that will lead to new jobs, new industries, and a renewed ability to compete with the rest of the world.”

In his recent column on Forbes.com, Shaun Rein describes Mr. Obama’s expectation-management strategy as one that business leaders should adopt in difficult times. Rein wrote, “President Obama has continually lowered expectations about his ability to right the economy quickly. This has given him time to maneuver and allowed for more upside potential … Managing the expectations of investors and employees is critical now. One of the biggest mistakes senior executives make is trying to put too positive a spin on a situation.” Indeed, business bloggers are also picking up on the importance of expectation management in the face of crises.

Bloomberg News columnist Caroline Baum focused instead on Mr. Obama’s optimism. In her Feb. 26 column, she wrote, “Chicago is home to, among other things, rational expectations theory, the idea that outcomes depend to some extent on what people expect to happen. It would have been hard to spend that much time in Hyde Park without some of Chicago rubbing off on Obama … If we expect the future to be better, rational expectations dictate that it will be.”

Taking a scholarly perspective, organizational theorists argue that people’s expectations regarding what constitutes the ordinary shape how they make sense of and ascribe meaning to the world around them. So in terms of leadership, it behooves leaders to manage expectations carefully, keeping in mind the power of suggestion and using talk to frame how others regard their environments. At the same time, however, it’s crucial to manage expectations in such a way that people are more likely—not less—to notice and publicize the weak signals and small deviations from normality that are all too often beacons warning of impending disaster.

Saturday, February 7, 2009

Calm, Composed, and Heroic: How one pilot dealt with “hitting two birds with one plane”

On February 5, 2009, the F.A.A. released audio recordings and transcripts of the air traffic control communications from U.S. Airways Flight 1549, which crash-landed in the Hudson River on January 15, 2009. The pilot, Chesley B. ‘Sully’ Sullenberger, can be heard calmly talking with air traffic control while still piloting the unpowered plane to safety. Captain Sullenberger was clearly trying to stay calm, explain the situation to the air traffic controller, and make sense of what was happening. As the incident unfolded, he had to make sense of new and conflicting information as it came in from the radio, the flight instruments, the crew he commanded, and so on. This act of heroism is also an example of how important it is to make sense of the many things happening all at once, or what social science researchers call “sensemaking”.

Sensemaking in crisis situations is the idea that people collectively try to understand their environment and, through that process, begin to realize the nature and seriousness of the situation. Captain Sullenberger made sense of the situation by talking with the air traffic controller, interacting with his co-pilot, observing the instruments on the panel, looking out the cockpit window, and so on. The sheer number of things going on in the environment would be mind-numbing and paralyzing for most individuals, and perhaps even for pilots with less background experience.

Psychologist Karl Weick, best known for developing the concept of sensemaking, wrote, “Sensemaking in crisis conditions is made more difficult because action that is instrumental to understanding the crisis often intensifies the crisis.” On Flight 1549, when Captain Sullenberger recognized that the engines had lost thrust due to bird strikes, the crisis intensified because he became aware that the plane lacked the power needed to reach its destination or even return to the airport. In other words, the act of simply noticing and finding meaning in the events that were occurring actually intensified the crisis. The nature of the crisis did not change, but the meaning of the crisis evolved as the urgency of resolving the situation became more apparent.

When listening to Captain Sullenberger, one notices that he does not sound anxious, worried, concerned, or fearful. All of those emotions are easily communicated through changes in our voice; we all know what someone sounds like when they are scared, worried, or fearful. News stations, newspapers, TV talk-show hosts, and bloggers alike are amazed at the calm and composed demeanor of Captain Sullenberger. Remaining emotionally flat on the surface may have actually helped keep the crew calm and helped them focus more on the situation and less on the feelings that were likely to surface.

So, was Captain Sullenberger emotionless, or was he simply managing his emotions in that situation?

Captain Sullenberger told Katie Couric in a CBS "60 Minutes" interview that it was "the worst sickening, pit-of-your-stomach, falling-through-the-floor feeling" he had ever had.

Perhaps, instead of being emotionless, Captain Sullenberger managed his emotions for the purpose of managing others’ emotions. Through his calm demeanor, he actually helped others (e.g., the crew, the passengers) also remain calm, alert, and composed in this crisis situation.

Sunday, January 18, 2009

The Financial Crisis: A Corporate Leadership Game-Changer?

Crises necessarily challenge assumptions. And certainly the recent turmoil on Wall Street has shaken many assumptions about economic and financial stability. Looking back on 2008, however, what can we learn about corporate leadership and management? In the Jan. 19 BusinessWeek cover story, titled “Managing Through a Crisis: The New Rules,” senior writer Emily Thornton suggests that current economic conditions have fundamentally changed how business leaders should make decisions, and that within economic turbulence reside new opportunities.

In the article, Thornton mentions several ways in which the decision-making milieu for managers has shifted, including reduced consumer confidence, tighter credit, the prospect of stricter regulations, and general ambiguity about both current conditions and future prospects. Furthermore, she discusses five sets of productive actions taken by chief executives in 2008:

  • Change your mindset. Recognize that market conditions have changed, and that those changes should call into question many aspects of how you’ve traditionally done business.

  • Get your financial house in order. Make tough choices, possibly including eliminating lines of business and issuing more stock, to strengthen your balance sheet and to secure your firm’s fundamental financial health.

  • Make a move for market share. Focus on your core business and be on the lookout for newly available resources, both human and asset-related.

  • Rethink your reward system. Avoid cutting compensation across the board; instead, find non-monetary ways to reward employees and improve morale.

  • Dare to innovate. Taking the time and effort to innovate during the downturn could open new doors in the future. It’s risky, but may result in high returns.

The past year has forced us to think differently about what it means to undertake risk. Additionally, it seems that obtaining actionable information about potential risks is becoming increasingly difficult. It may not be a lack of information that fuels this difficulty; rather, it may be that managers today have such an abundance of information to process—via numerous financial reporting services, for example—that they cannot reasonably evaluate competing courses of action. Much of 2009 will be about figuring out what exactly happened to markets during 2008, but whether managers can use that information to guide their firms successfully remains to be seen.

This entry also appears at Stepping into the Void.