
Upcoming Learning Laboratories

Critical Thinking in Safety - an introduction to the field

January 13-17, 2025 - on campus at Lund University


Applications for the learning lab are open and are submitted through this link: Learning lab application (external page). Students starting the 2025 MSc programme in Human Factors and Systems Safety do not need to register for the learning lab.

This Learning Laboratory will help you form a new view of human factors and systems safety. During the lab, you will be exposed to perspectives that will help you develop a critical stance when thinking about human factors and system safety. The lab will take you beyond classical “Newtonian thinking”, grounded in a basic cause-and-effect philosophy. It will challenge common assumptions regarding accountability, failed human components and linear progressions to accidents, and it will help you distinguish safety from quality. You will be introduced to the history (and conflicts) of Human Factors and position the so-called 'new view' in its ideological heritage.

We will address various theoretical concepts used to explain accidents and the emergence of risk and safety. The lab will take you beyond the label "human error" and help you discover the mindsets, understandings, expectations and knowledge that shape people's work at the sharp end of safety-critical environments such as airlines, firefighting, hospitals, process control, emergency response, shipping and air traffic control.

Most of all, this learning lab is about exposing, understanding and deconstructing safety discourse(s) in order to understand that language matters. If we want to change the way we think, speak and write when explaining accidents, writing recommendations or suggesting system improvements, we also need to learn a new safety language.

The ethics of safety

June 2025 - on campus at Lund University

Link to application: https://dinkurs.se/appliance/?event_id=90653 

Please note that the number of available seats is limited. If you apply for a learning lab that is fully booked, you will be offered a reserve slot. Students in the MSc programme in Human Factors and Systems Safety do not need to apply to the learning lab.

In this learning laboratory we do not hesitate to ask the really tricky questions of how organizations do (or could, or perhaps even should) respond to adverse and often traumatic events. In a highly interactive format, learning lab participants from a wide variety of domains delve into discussions and exchange experiences on how to stimulate honest disclosure, what it means to treat someone justly, whether the recent emphasis on organizational resilience is merely an innovative way to increase risk exposure, and how we can care for the sharp-end staff who are often the most exposed to high-risk processes in our organisations. Two central concepts in this learning lab will be Justice and Safety Culture.

Organizational justice

You want people to tell you about safety and other problems they have in their work, and about incidents that happen, certainly if there's no other way for you to find out. But for people to do that, they have to feel that their reports will be treated fairly, and that there are no negative or disproportionate consequences if they report. The dilemma, of course, is that there'll be cases where you feel you have to demand accountability, even if it may dampen people's willingness to share similar stories. This is where a Just Culture comes in: to balance accountability and learning, and to change the way we think about accountability so that it becomes compatible with learning. Now you're wrong if you think you can have a just culture by saying: we'll treat your reporting fairly unless there's gross negligence, willful violations, or other bad behavior. This still leaves people in uncertainty, because we don't have clear definitions for any of these categories. Whether something is seen as negligence (which, by the way, is a legal term, not a human factors one) depends on standards of good practice, definitions of skill, prudence, reasonable care, and foreseeability of harm. And somebody needs to interpret all that. These are all judgment calls that somebody will have to make. So the real question is: who makes that judgment? Whom do you give the power to make that judgment, to draw the line?

Safety Culture

Emerging as a concept after the 1986 Chernobyl disaster, and later popularised by Professor James Reason, Safety Culture has become both a concept and a promise for how to develop, understand and measure safety management strategies. The learning lab offers the opportunity to engage in a critical discussion and analysis of Safety Culture's discursive meaning(s), of different academic approaches to Safety Culture (from functionalist to interpretivist), and of the consequences of introducing the language of Safety Culture into organisational practices.

