Transcript and Presenter's Notes

Title: The Safety Climate


1
The Safety Climate
  • Steven R. Ash, Ph.D.
  • The University of Akron
  • ash@uakron.edu

2
Disasters
  • With each disaster that occurs, our knowledge of
    the factors that make organizations vulnerable
    to failure has grown.
  • It has become clear that such vulnerability does
    not originate from human error, chance
    environmental factors, or technological failures
    alone.
  • Rather, it is the ingrained organizational
    policies and standards which have repeatedly been
    shown to predate the catastrophe.
  • (Gadd, S. & Collins, A.M., 2002)

3
Safety Culture
  • The attitudes, beliefs, perceptions, and values
    that employees share in relation to safety (Cox,
    S. & Cox, T., 1991)
  • 1986 Chernobyl Incident

4
  • The way we typically do things around here.

5
Organizational Culture
  • Where does an organization's culture come from?
  • Founders
  • Leaders
  • How is a culture sustained?
  • Reinforced by demonstrations of acceptable and
    unacceptable behaviors and decisions

6
Safety Climate
  • A summary of perceptions that employees share
    about their work environment (Zohar, 1980)
  • 1980 Factories in Israel

7
Why Safety Climate?
  • Results of questionnaires can serve as leading
    indicators as opposed to lagging indicators

8
Lagging Indicators
  • Statistics related to accidents
  • Injury frequency
  • Injury severity
  • OSHA recordable injuries
  • Lost workdays
  • Workers' compensation costs
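Two of these statistics are usually normalized to hours worked so that sites of different sizes can be compared. The sketch below is illustrative only (the function names and sample figures are assumptions, not from the presentation); it uses the standard OSHA normalization factor of 200,000 hours, roughly 100 full-time employees working a year.

```python
def osha_incidence_rate(recordable_injuries: int, hours_worked: float) -> float:
    """Recordable-injury frequency per 100 full-time workers.

    200,000 = 100 employees x 40 hours/week x 50 weeks/year.
    """
    return recordable_injuries * 200_000 / hours_worked


def severity_rate(lost_workdays: int, hours_worked: float) -> float:
    """Lost workdays per 100 full-time workers (a severity measure)."""
    return lost_workdays * 200_000 / hours_worked


if __name__ == "__main__":
    # Hypothetical plant: 7 recordable injuries, 42 lost workdays,
    # 450,000 hours worked in the year.
    print(round(osha_incidence_rate(7, 450_000), 2))  # ~3.11
    print(round(severity_rate(42, 450_000), 2))       # ~18.67
```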

9
Leading Indicators
  • Measures that predict behavioral outcomes
  • Safety training
  • Ergonomic opportunities identified and corrected
  • Reduction of risk factors
  • Employee perception surveys
  • Safety audits

10
Crisis Agent Categories
  • Man-made
    • Deliberate
      • Terrorism (9/11 attacks)
      • Internal Sabotage (disgruntled employee)
    • Unintentional
      • Human Error
        • Design
        • Maintenance
        • Operations
  • Natural Disaster
    • Acts-of-God (meteor)
    • Probability Events
      • Tornado
      • Hurricane
      • Earthquake
      • Flood
      • Tsunami

11
Some General Sources of Human Error
  • Poor Information (overload or lack of)
    • Poor data or incomprehensible information
  • Inadequate Ability or Training
    • Knowledge or understanding is lacking
  • Improper Tools/Equipment
    • Right tools not available/used
  • Poor System Design
    • Improper human factors (man-machine interfaces)
  • Individual Cognition
    • Biases, perceptual errors, emotional override
  • Social Environment (Culture)
    • Leadership, communication, time pressures,
      personality, etc.

12
Cognitive Biases and Decision Making
  • Heuristics: rules of thumb (humans live by
    these)
  • E.g., In a building with several floors,
    restrooms on one floor are often located in
    roughly the same place as on other floors.

13
Example
  • A bat and ball cost $1.10.
  • The bat costs $1.00 more than the ball.
  • How much does the ball cost?

14
Answer
  • The number that came to your mind is, of course,
    10 cents.
  • It is intuitive.
  • It is appealing.
  • It is wrong!

15
Answer
  • Cost
  • Ball   $0.10
  • Bat    $1.10 (one dollar more than ball)
  • Total  $1.20  WRONG!

16
Answer
  • The correct answer is the ball costs 5 cents.
  • Ball   $0.05
  • Bat    $1.05 (one dollar more than ball)
  • Total  $1.10  CORRECT!
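A one-line algebraic check of the answer, restating the slide's arithmetic in LaTeX:

```latex
\begin{align*}
x + (x + 1.00) &= 1.10 && \text{let } x = \text{price of the ball}\\
2x &= 0.10\\
x &= 0.05 && \Rightarrow\ \text{ball} = \$0.05,\ \text{bat} = \$1.05,\ \text{total} = \$1.10
\end{align*}
```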

17
Types of Cognitive Biases in Individual Decision
Making
  1. Availability Bias
  2. Representativeness
  3. Confirmation Bias
  4. Anchoring and Adjustment
  5. Overconfidence Bias
  6. Hindsight Bias
  7. Framing Bias
  8. Escalating Commitment
  9. Randomness Error
  10. Barnum Effect

18
Stroop Effect
  • On the following slide, as quickly as possible,
    say aloud the color of each word; do not read
    the words.

19
Unlearning
  • It is very difficult to change the way we think.
  • Unlearning something that we have spent a lot of
    time learning is quite challenging.

20
Confirmation Bias
  • We find what we are looking for!
  • "There's the proof! I knew it all along."

21
Seeing Patterns Where None Exist (Pareidolia)
  • Pre-existing expectations (Confirmation Bias)
  • Sexual references are often seen
  • References to strong beliefs
  • Faces in particular seem to be innate
  • Our brains can distinguish between faces and
    other objects in less than 0.20 seconds
    (Hadjikhani, 2009)

22
Seeing Divinity
  • Remarkable sightings have been reported by the
    faithful from all walks of life

23
Pattern Recognition
  • "When it works well, we can find our lost child
    in the middle of a huge crowd at the mall. When
    it works too well, we spot deities in pastries,
    trends in stock prices, and other relationships
    that aren't really there."
  • (Chabris & Simons, 2010)

24
  • We tend to seek out and attend to information
    that supports earlier decisions, and ignore
    information that is contradictory.
  • We are generally blind to things we are not
    expecting.

25
Responses to Human Error
  • You want to encourage information flow, but also
    recognize that some discipline may be necessary
  • You want to do something about the employee who
    is truly dangerous, while still encouraging
    reporting from conscientious employees

26
Types of Errors
  • Consider the employee's motivation in acting
    when deciding on punishment, so as to create a
    feeling of trust among all involved

27
Example
  • Nurses in the state of Texas who made 3
    medication errors in 1 year would lose their
    license
  • What type of reporting would you expect?
  • "Rather than improving safety, punishment made
    reducing errors much more difficult by providing
    strong incentives for nurses to hide their
    mistakes, thus preventing recognition, analysis,
    and correction of underlying causes" (Leape, L.L.,
    1994)

28
Types of Safety Cultures
  • Pathologic
  • We don't make errors, and we don't tolerate
    people who do. This organization is likely to
    shoot the messenger.
  • Bureaucratic
  • If something occurs, we will write a new rule.
  • Learning
  • Seeks to understand the broader implications of
    error.
  • (Westrum, R. 1993)

29
Wrong Door
  • In 1990, Martin Marietta deployed a satellite
    into the wrong orbit when engineers told the
    computer programmers to "open the bay door to
    the hatch containing the satellite." The
    programmers complied; however, they opened the
    wrong door. Today, the $150 million satellite
    sits dead in orbit around the Earth. The total
    cost of the single miscommunication is estimated
    to be $500 million (AP, 1990).

30
Looking but Not Seeing
  • Homeland Security screeners failed to spot
    weapons of any kind one-third of the time
  • J. McClarey, "Elements of Human Performance in
    Baggage X-ray Screening," 4th Annual Aviation
    Security Technology Symposium, Washington, D.C.,
    2006

31
Organizational Errors
  • It takes a village to really screw things up!

32
Disasters
  • When multiple errors combine in complex systems,
    large-scale disasters become possible

33
Error Chains
  • Swiss Cheese effect (after Reason, 1990)
  • Tragedies are seldom the result of a single error
  • Serious errors are compounded, multiplying the
    impact
  • There are often many opportunities to stop
    disasters

34
Swiss Cheese Effect
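The Swiss cheese figure itself is not reproduced in this transcript. As a rough numerical illustration of the idea (the layer names and probabilities below are made up, and real defensive layers are rarely independent), a disaster requires the holes in every layer to line up, so the chance of an error getting through is the product of the individual layers' failure probabilities:

```python
from math import prod

# Hypothetical, illustrative failure probabilities for four defensive layers
# (design, maintenance, operations, emergency response). Real layers are
# neither independent nor this precisely characterized.
layer_failure_probs = [0.10, 0.05, 0.08, 0.02]

# Under an independence assumption, a disaster requires every "slice of
# cheese" to have a hole in the same place: multiply the probabilities.
p_all_layers_fail = prod(layer_failure_probs)
print(f"P(all four layers fail) = {p_all_layers_fail:.6f}")  # 0.000008

# The same arithmetic shows how quickly protection erodes when layers are
# removed or degraded (e.g., inspections skipped, detectors left off).
p_two_layers_only = prod(layer_failure_probs[:2])
print(f"P(error passes with only two layers) = {p_two_layers_only:.4f}")  # 0.0050
```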
35
Nuclear Power Plant Emergencies
  • Oyster Creek (1979)
  • Three Mile Island (1979)
  • Ginna (1982)
  • Davis-Besse (1985)
  • Chernobyl (1986)
  • Fukushima (2011)

36
Fukushima Nuclear Plant
  • On March 11, 2011, a massive magnitude-9.0
    earthquake hit Japan.
  • Terrible loss of life
  • One of the scariest parts was the nuclear
    emergency at Fukushima
  • Japanese Commission Report
  • "We believe that the underlying causes of the
    accident are to be found in the organizational
    and control systems that supported wrong
    decisions and actions."

37
Three Mile Island
  • 1979: The operators did not recognize that the
    relief valve on the pressurizer was stuck open.
    The panel display indicated that the relief valve
    switch was selected closed. They took this to
    indicate that the valve was shut, even though
    this switch only activated the opening and
    shutting mechanisms. They did not consider the
    possibility that this mechanism could have (and
    actually had) failed independently and that a
    stuck-open valve could not be revealed by the
    selector display on the control panel.
  • Worst nuclear incident on American soil.

38
Error Chain Construction
  • On the night of April 14-15, 1912, RMS Titanic
    sank, claiming the lives of 1,513 of the 2,224
    people on board. Only about one-third (711)
    survived.
  • Why did so many people die?

39
Error Chain Examples
  • Case - Jose Eric Martinez
  •  Iatrogenic injury
  • An injury to a patient resulting from medical
    management rather than from the patient's
    underlying or antecedent condition

40
Group Errors
  • When groups work well, synergy is the result
    (2 + 2 = 5)
  • When they don't, outcomes can be catastrophic
  • Potential Group Problems
  • Conformity
  • Obedience
  • Groupthink

41
1) Group Conformity
  • Asch experiments
  • Line length
  • Multiple confederates
  • Many unquestioningly went along with the majority

42
Group Conformity
  • Which line is longer?

(Figure: comparison lines labeled A, B, and C)
43
2) Obedience to Authority
  • Milgram experiments
  • Question: How could the Nazis commit such
    atrocities?
  • White-coated researcher: "You must continue."
  • Increasing amounts of voltage were administered
  • Why blind obedience?

44
Challenge
  • An NTSB study found that 25% of all accidents
    could have been prevented if the pilot had been
    challenged when making an error.
  • E. Tarnow, "Self-Destructive Obedience," in
    Obedience to Authority, T. Blass (Ed.), 2000

45
Obedience
  • In an experiment, twenty-one of twenty-two nurses
    were prepared to administer an obviously deadly
    dose of medicine to a patient.
  • No resistance, no internal conflict, no conscious
    awareness of a problem.
  • C. Hofling, 1966

46
3) Groupthink
  • When groups override a realistic appraisal of the
    situation in order to maintain unanimity and
    cohesiveness

47
Examples of Groupthink
  • Bay of Pigs
  • Pearl Harbor
  • Space Shuttle Challenger

48
Symptoms of Groupthink
  1. Invulnerability
  2. Rationalization
  3. Morality
  4. Stereotypes
  5. Pressure
  6. Self-censorship
  7. Unanimity
  8. Mindguards

49
Avoiding Groupthink 1
  • Leader encourages open expression of doubt
  • Leader creates climate where dissenting opinions
    are OK
  • High-status members offer opinions last
  • Receive recommendations from duplicate group
  • Periodically divide into subgroups

50
Avoiding Groupthink 2
  • Get reactions from trusted outsiders
  • Periodically invite outsiders to join discussions
  • Assign role of devil's advocate
  • Develop possible outcome scenarios

51
Devil's Advocate
  • Historically, the Catholic Church made use of a
    devil's advocate in canonization decisions. He
    was the promotor fidei, the "promoter of the
    faith." His role was to build a case against
    sainthood.
  • John Paul II eliminated the office in 1983.
  • Since then, saints have been canonized about 20
    times faster than under the old system.

52
Murder Board
  • The Pentagon uses a "murder board."
  • This group is staffed with highly skilled and
    experienced officers.
  • Their job is to try to kill ill-conceived
    missions.

53
Safety Culture Traits
  • The U.S. Nuclear Regulatory Commission (NRC) has
    developed a list of 9 traits associated with a
    positive safety culture, along with examples of
    each.
  • http://www.nrc.gov/about-nrc/safety-culture.html

54
1. Leadership
  • Leadership Safety Values and Actions, in which
    leaders demonstrate a commitment to safety in
    their decisions and behaviors

55
2. Problem Identification
  • Problem Identification and Resolution, in which
    issues potentially affecting safety are promptly
    identified, fully evaluated, and promptly
    addressed and corrected

56
3. Personal Accountability
  • Personal Accountability, in which all individuals
    take personal responsibility for safety

57
4. Work Processes
  • Work Processes, in which the process of planning
    and controlling work activities is implemented to
    maintain safety

58
5. Continuous Learning
  • Continuous Learning, in which opportunities to
    learn about ways to ensure safety are sought out
    and implemented

59
6. Environment for Raising Concerns
  • Environment for Raising Concerns, in which a
    safety-conscious work environment is maintained
    where personnel feel free to raise safety
    concerns without fear of retaliation,
    intimidation, harassment, or discrimination

60
7. Effective Communication
  • Effective Safety Communication, in which
    communications maintain a focus on safety

61
8. Respectful Environment
  • Respectful Work Environment, in which trust and
    respect permeate the organization

62
9. Questioning Attitude
  • Questioning Attitude, in which individuals avoid
    complacency and continuously challenge existing
    conditions and activities in order to identify
    discrepancies that might result in error or
    inappropriate action

63
Example Application: Upper Big Branch Mine
Explosion
  • Information derived from:
  • Mine Safety and Health Administration (MSHA),
    Coal Mine Safety and Health, Report of
    Investigation: Fatal Underground Mine Explosion,
    April 5, 2010 (issued December 6, 2011)

64
Upper Big Branch Mine
  • On April 5, 2010, a series of explosions occurred
    inside the Upper Big Branch (UBB) mine in
    southern West Virginia. Twenty-nine coal miners
    working for Performance Coal Company (a
    subsidiary of Massey Energy Company) lost their
    lives in the largest coal mine disaster in the
    United States in 40 years.

65
1. Leadership
  • One specific work process that the Massey
    leadership had in place was to illegally provide
    advance notice to miners of MSHA inspections.
    This was a flagrant violation of Section 103(a)
    of the Federal Mine Safety and Health Act of 1977

66
2. Problem Identification
  • When a worker told the foreman about the air
    reversal (air moving the opposite direction of
    where it should have been in order to properly
    vent the mine), "He didn't say nothing, he just
    walked away."
  • The preshift and onshift examination system,
    devised to identify problems and address them
    before they became disasters, was a failure.

67
3. Personal Accountability
  • In the weeks preceding the disaster,
    investigators found that one foreman's hand-held
    methane detector had not been turned on, even
    though he filled in the examiner's books as if he
    had taken gas readings.
  • This "data integrity" issue raises doubt about
    the daily and weekly air readings and other data
    recorded by the crew foreman in the weeks leading
    up to the disaster.

68
4. Work Processes
  • In instances in which a section boss did halt
    production because of a dangerous condition, such
    as wholly inadequate ventilation, he was
    instructed to write only "downtime." He was not
    to create a record acknowledging a potentially
    deadly situation.

69
5. Continuous Learning
  • Testimony indicates that Massey inadequately
    trained its examiners, foremen, and miners in
    health and safety, especially in hazard
    recognition, performing new job tasks, and
    required annual refresher training. This left
    miners unequipped to identify and correct
    hazards.

70
6. Environment for Raising Concerns
  • Witness testimony revealed that miners were
    intimidated by management and were told that
    raising safety concerns would jeopardize their
    jobs. As a result, no whistleblower disclosures
    were made in the 4 years preceding the explosion,
    despite an extensive record of Massey safety and
    health violations at the UBB mine during this
    period.

71
7. Effective Communication
  • Workers were treated on a "need to know" basis.
    They were not apprised of conditions in parts of
    the mine where they did not work. Only a
    privileged few knew what was really going on
    throughout the mine.

72
8. Respectful Environment
  • Miners also mentioned disrespectful written
    messages they received from a senior manager.
    Others were intimidated by a manager's "nasty
    notes" and didn't say anything because they were
    "job-scared."

73
9. Questioning Attitude
  • Testimony revealed that miners were intimidated
    to prevent them from exercising their
    whistleblower rights. Production delays to
    resolve safety-related issues often were met by
    officials with threats of retaliation and
    disciplinary actions.

74
Summary
  • While violations of particular safety standards
    led to the conditions that caused the explosion,
    the unlawful policies and practices implemented
    by Massey were the root cause of this tragedy.

75
Donald L. Blankenship
  • The CEO of Massey Energy was sentenced on April
    6, 2016, to a year in prison for conspiring to
    violate federal mine safety standards (a
    misdemeanor).
  • The prison term, the maximum allowed by law, came
    six years and one day after an explosion ripped
    through Massey's Upper Big Branch mine, killing
    29 men.
  • Federal officials have said the guilty verdict
    was the first time such a high-ranking executive
    had been convicted of a workplace safety
    violation.

76
Measuring Climate to Assess Culture
  • Washington Metropolitan Area Transit Authority
    (WMATA)
  • Survey question categories
  • 1. Tone at the Top
  • 2. Supervisor Leadership
  • 3. Reporting Tendency
  • 4. Responsiveness to Incidents
  • 5. Comfort Speaking Up
  • 6. Openness of Communications
  • 7. Awareness and Training
  • 8. Fairness
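A minimal sketch of how a climate survey like this can be scored: average each item across respondents, then average the items belonging to each category. The item-to-category mapping and the Likert responses below are hypothetical, not WMATA data.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical mapping of survey items to the categories listed above,
# and 1-5 Likert responses from three made-up respondents.
item_category = {
    "q1": "Tone at the Top",
    "q2": "Tone at the Top",
    "q3": "Reporting Tendency",
    "q4": "Comfort Speaking Up",
}
responses = [
    {"q1": 4, "q2": 5, "q3": 2, "q4": 3},
    {"q1": 3, "q2": 4, "q3": 1, "q4": 4},
    {"q1": 5, "q2": 4, "q3": 2, "q4": 2},
]

# Average each item across respondents, then average items within a category.
scores_by_category = defaultdict(list)
for item, category in item_category.items():
    item_mean = mean(r[item] for r in responses)
    scores_by_category[category].append(item_mean)

for category, item_means in scores_by_category.items():
    print(f"{category}: {mean(item_means):.2f}")
```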

77
Conclusion 1
  • Culture is complicated and includes the
    behaviors, communication, and decision-making
    styles of employees.
  • Climate is something you can assess. It includes
    the perceptions and attitudes of employees.

78
Conclusion 2
  • Researchers have found a significant association
    between the safety climate scores and injury data
    for many industries.
  • Even among lone workers, safety climate is a
    valid predictor of safety outcomes.
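Such associations are commonly reported as correlations between unit-level climate scores and subsequent injury rates. The sketch below uses made-up site data purely to show the computation; it is not drawn from any study cited here.

```python
from statistics import correlation

# Hypothetical site-level data (for illustration only): mean safety-climate
# survey score (1-5) and recordable-injury rate per 100 workers at six sites.
climate_scores = [4.4, 3.1, 3.8, 2.6, 4.1, 3.3]
injury_rates   = [1.2, 4.0, 2.1, 5.3, 1.6, 3.4]

# Pearson correlation (statistics.correlation requires Python 3.10+).
# A strongly negative r is consistent with better climate scores going
# hand in hand with lower injury rates.
r = correlation(climate_scores, injury_rates)
print(f"r = {r:.2f}")
```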

79
Conclusion 3
  • Get Everyone on Board!
  • Hands and backs can be bought, but hearts and
    minds must be won.
  • If the leadership has not bought in, neither will
    the employees.
  • Leadership is the biggest determinant of culture!

80
  • Thank you for your attention