A Sad Safety Track Record

The Economic Times - Breaking Ideas - Sarabjit Arjan Singh

Three serious rail accidents in the last two months are not only a cause for concern but an indication of an alarming breakdown in the systems that shape railway safety. Safety is a by-product of the normal functioning of the Indian Railways (IR). Clearly, that functioning is now not able to guarantee the required level of safety.

IR doesn't look at safety in this manner. It believes in what is called the 'bad apple' theory: the systems for ensuring safety are fail-safe but for the erratic behaviour of some unreliable 'bad apples'. Hence, after every accident, one hears the customary refrain that those found responsible for the mishap will be made an example of.

Since the statutory inquiry by the Commissioner of Railway Safety normally limits itself to the sequence of events that led to the accident, and does not go into the organisational factors behind it, systemic failures remain unrecognised and unaddressed. It is time IR discarded the 'bad apple' theory and recognised that human error is a symptom of trouble deeper in the system.

Human failure needs to be seen as the starting point of an investigation to understand what the error points to: what difficulties led people to take the decisions they did, and why those assessments and decisions would have made sense to them at the time they took them. Adopting this approach turns mistakes into a window on how the whole system works.

For IR to accept this view of human failure, it has to admit that systems are not inherently safe. It is the people-system interface that makes systems safe, by orienting them towards making safety the prime objective.

Safety is not the only goal when people operate the system. Multiple pressures and goals are always at work, including ensuring punctuality and completing maintenance tasks within the available time and with the available materials.

These involve trade-offs between safety and other goals. The tricky part of such decisions is how to balance safety and non-safety goals. For example, making up for the late arrival of a train by reducing the time available for maintenance achieves the immediate objective of ensuring punctuality. The result is apparent. But how much this decision has 'borrowed' from safety is not so easily measurable.

Management and staff face this dilemma every day. They are constantly asked to balance multiple objectives, one of which is sticking to the timetable. For example, in the face of capacity constraints, there is strong resistance to closing a track for maintenance, thereby sacrificing safety for the timetable. There are endless such examples.

Underinvestment over decades has eroded the old resilience that was built into the system through its strong safety culture and nurtured by a stable workforce. This happened mainly because, when norms were watered down to meet commercial and political imperatives, the risk picture was never made explicit.

IR needs to do a number of things. First, it needs to rethink its accident model. It can no longer limit itself to seeing accidents only as a chain of events that led to a failure. It must see accidents as related to latent failures that hide in everything from management decisions to procedures to equipment design. And it must recognise that an accident emerges from the normal working of the system: a systematic by-product of management and staff pursuing success under constraints, with an imperfect understanding of the risks they are creating, or trying to reduce, during normal maintenance.

The accident inquiry should also go beyond the existing chain-of-events model and look into the management decisions that created latent unsafe conditions. Second, IR should develop a risk-assessment system that spots when operations have moved outside a clearly defined safe envelope, so that it can stop, regroup and restart only once safe operating conditions have been re-established. IR needs to accept that making safety an active feature may require sacrificing other goals, such as traffic volumes and punctuality.

Third, it needs to get away from using fear of punishment as an instrument for achieving safety. It does not work, because human errors are a symptom of a systemic problem to which everyone may be vulnerable. The error is not the end of an inquiry but the beginning of an investigation.

Finally, IR has to avoid the trap of new technologies. A new technology may remove a particular error potential, but it will most likely present new complexities and error traps, with associated safety concerns.

The writer is a former general manager, Indian Railways