

Integrated response system for disaster mitigation management

May 2007

 
A framework for an online nodal agency for automated processing of all the data
   

In the recent past, humanity has suffered an increasing number of natural disasters, affecting more than 2.5 billion people, killing 478,100 and causing economic losses of about US$690 bn (UNEP/GRID-Arendal, 2005). Some of the distinctive instances are the December 2004 Indian Ocean earthquake and its concomitant tsunamis; hurricanes Katrina, Rita, Stan and Wilma, which struck the US eastern coast and Central America in 2005, as Mitch had in 1998; the Pakistan earthquake of 2005; and the current avian influenza outbreak in Asia and Europe. Natural and manmade tragedies, such as earthquakes, floods and nuclear catastrophes, pose an ever-present challenge to emergency services. Victims and societies at large have responded differently in each case. Some were heroic, some responsible, but many panicked and responded irrationally, aggravating an already bad situation. All of them could have responded more effectively had they been better informed and properly managed.

It is clear that, despite excellent efforts by many groups, the wealth of data residing with various organizations is often not effectively utilized in disaster management. Disaster management is not a linear process that can be documented easily in a flow chart with a readily apparent beginning and an absolute end. Rather, it is a cyclical process of approximation, response and re-calibration that involves many different actors, whose roles in relation to one another are likely to change dynamically based on circumstances and the stage in the process. The one constant evident in the process is the chaos and entropy that drive the system.

Existing technology can provide disaster managers with products that could save lives, reduce damage to property, mitigate overall damage, conserve resources and ameliorate human suffering. The current situation, however, has many defects and relies on piecemeal methods of disaster management. To develop effective architectures and technologies that meet the needs of disaster management, there must be a precise understanding of the disaster management lifecycle. All the communities involved must work in concert to define the disaster management system. They must necessarily be associated with the cycle of data genesis, dissemination, analysis and review. There should be an accurate understanding of the dynamics between these ingredients and the interfaces that these dynamics imply. Only with such an understanding can we effectively model the process and derive technology solutions that map well onto the business model of disaster management.

The profile of a pre-disaster situation can be built in terms of four distinct groups: the hazardous entities; the victims; the intermediaries whose presence or absence can aggravate or mitigate the impact of the disaster; and the agencies holding information potentially useful in disaster management. The paper discusses the construction of such a profile by quantification of the disaster potential of the hazardous entities and the impact of the intermediaries. An enormous amount of information for such a vulnerability evaluation and for disaster management is available with disparate governmental and public agencies. The paper proposes a framework for an online nodal agency for automated processing of all this data to build up such profiles. This can help in the evolution of comprehensive disaster mitigation and management plans.

The paper briefly discusses how such a system would have responded in situations such as the Bhopal gas disaster.

Knowledge discovery for mitigation

Data integration
The information required for disaster mitigation comes from a variety of databases and resources maintained by diverse agencies. A large amount of information in the public domain about accessibility by rail, road and air can be augmented and updated using remote sensing technologies. Information about land use patterns, water supply, power supply, sewage disposal, fire services and the availability of health services is available with the local administration. Telecommunication companies control vast networks that can be used as channels of information to reach individuals, families and whole populations in the disaster area. The power supply companies, the water supply companies, the hospitals and public health departments, banks and insurance companies, the police and other security agencies hold vital information which can be used for disaster mitigation. Normally, all these databases are independently designed, maintained and used by these agencies. In order to cope with a disaster in a fast and highly coordinated manner, information sharing among these agencies is vital. Police, fire departments, public health, civil defence and other organizations not only have to perform efficiently individually, but also in a unified manner.


Apart from the factual information from different agencies, the disaster mitigation system must obtain information on regulatory and standardization parameters from the appropriate authorities. The databases need to talk to each other in real time on a common platform. Data integration will require a certain level of standardization and compulsory data sharing. This requires a national-level effort in the form of law enactment and clear regulations guiding everyone. Such a disaster mitigation authority may have a centralized or distributed architecture to handle local disasters as well as regional or national disasters.
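
A minimal sketch of this common-platform idea is given below; the agency names, record fields and locality key are illustrative assumptions, not prescriptions from this paper. Each agency exposes its records in an agreed, standardized form so that the nodal agency can merge them into a single profile for a locality.

from dataclasses import dataclass

@dataclass
class AgencyRecord:
    agency: str      # e.g. "public_health", "fire_services"
    locality: str    # geographic key shared by every agency
    attribute: str   # e.g. "hospital_beds", "fire_tenders"
    value: float
    updated: str     # date of last update, useful for freshness checks

def merge_for_locality(records, locality):
    """Pool every agency's records for one locality into a single profile."""
    profile = {}
    for rec in records:
        if rec.locality == locality:
            profile[(rec.agency, rec.attribute)] = rec.value
    return profile

feeds = [
    AgencyRecord("public_health", "ward_12", "hospital_beds", 340, "2007-04-01"),
    AgencyRecord("fire_services", "ward_12", "fire_tenders", 3, "2007-03-15"),
    AgencyRecord("water_supply", "ward_07", "hydrants", 18, "2007-02-20"),
]
print(merge_for_locality(feeds, "ward_12"))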

Hazard evaluation
To develop effective architectures and technologies that meet the needs of disaster management, there must be a precise understanding of the disaster management lifecycle. All the communities in disaster mitigation must be associated with the data cycle of genesis, dissemination, analysis and review. There should be an accurate understanding of the dynamics between these ingredients and the interfaces that these dynamics imply. Only with such an understanding can we effectively model the process and derive technology solutions that map well onto the business model of disaster management.


The vulnerability analysis
The first step in any disaster mitigation management effort would be a vulnerability analysis of the area and the population. Broadly speaking, the vulnerability of a system, population or individual to a threat relates to its capacity to be harmed by that threat. Vulnerability varies widely across peoples, sectors and regions. The occurrence of extreme events and their interplay with environmental degradation is usually a local or regional phenomenon, while the expected outcomes are global. Continuous scientific discussion exists about general concepts of vulnerability and the development of indicators that are suitable for different scales and conditions.

The methodological challenge is to develop a reporting framework that can include qualitative and quantitative appraisal of vulnerability. Such an appraisal must be context-specific and linked to data on adaptive capacity. Ideally, vulnerability assessments should be continuously updated. Although assessments are often carried out at a particular scale, there are significant cross-scale interactions, due to the interconnectedness of economic, social and environmental systems.
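
As a purely illustrative sketch of such a quantitative appraisal, a vulnerability score could combine exposure, sensitivity and adaptive capacity; the multiplicative form, the 0-1 scales and the equal weighting below are assumptions, not a standard proposed in this paper.

def vulnerability_index(exposure, sensitivity, adaptive_capacity):
    # Toy score on a 0-1 scale: higher exposure and sensitivity raise it,
    # higher adaptive capacity lowers it. Form and weights are assumed.
    score = exposure * sensitivity * (1.0 - adaptive_capacity)
    return max(0.0, min(1.0, score))

# Example: a densely populated ward close to a hazardous plant,
# served by weak emergency services (low adaptive capacity).
print(vulnerability_index(exposure=0.9, sensitivity=0.8, adaptive_capacity=0.2))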

Catastrophic value of hazard (CVH)
CVH is an index of the net damage that a hazardous entity can inflict on its surroundings. The CVH analysis is based on extensive cross-referencing and data authentication using data drawn from a variety of resources. CVH indicators should be used to evaluate adaptive strategies and measures for monitoring development processes. CVH analysis applications borrow the necessary information from the databases described above. Two approaches to CVH assessment can be taken: risk-based and exposure-based.

Point catastrophic failure (PCF) value
There is a long chain of intermediaries between the hazardous entity and the potential victims. The presence or absence of these intermediaries can aggravate or mitigate the damage caused by the hazardous entity to its surroundings. The hazard can be compared to a fountainhead with a specific CVH, while the intermediaries increase or decrease the severity of the damage depending on their PCF values.
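
A minimal sketch of how CVH and PCF values might combine is given below; the multiplicative chain and the numeric scales are assumptions made only for illustration, not the methodology the expert agencies would devise.

def effective_threat(cvh, pcf_factors):
    # Start from the hazard's CVH and let each intermediary scale it by its
    # PCF factor: values above 1 aggravate, values below 1 mitigate.
    threat = cvh
    for factor in pcf_factors:
        threat *= factor
    return threat

# A high-CVH chemical store, an offline scrubber (aggravating, 1.5)
# and a rehearsed evacuation plan (mitigating, 0.6).
print(effective_threat(cvh=0.8, pcf_factors=[1.5, 0.6]))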

The methodology to evaluate the threat posed to the population, the environment and various natural resources by man-made hazards in terms of CVH and PCF values can be devised by appropriate expert agencies. The different available methodologies and patterns must be compared, evaluated and updated on a continuous basis. The disaster mitigation mechanism must use the factual data, the regulatory and policy framework and the damage assessment methods (provided by experts) to generate the CVH, PCF and vulnerability values, and these values can then be utilized on a real-time basis.

The CVH and PCF values can not only identify the current threat levels but also generate a recommended set of actions. Insertion of any new hazard or intermediary, a deviation from the standards, or a change in the standards or assessment methods will automatically lead to a reappraisal of the status. Hence the system functions like a conscious organism capable of responding to its ever-changing surroundings.
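
The following sketch illustrates this self-reappraising behaviour; the class, its methods and the aggregation rule (which simply reuses the toy multiplicative model above) are hypothetical.

class MitigationRegistry:
    # Recomputes the aggregate threat level whenever a hazard,
    # an intermediary or a standard changes.
    def __init__(self):
        self.hazards = {}          # hazard name -> CVH
        self.intermediaries = {}   # intermediary name -> PCF factor
        self.threat_level = 0.0

    def _reappraise(self):
        total = 0.0
        for cvh in self.hazards.values():
            effective = cvh
            for factor in self.intermediaries.values():
                effective *= factor
            total += effective
        self.threat_level = total

    def update_hazard(self, name, cvh):
        self.hazards[name] = cvh
        self._reappraise()

    def update_intermediary(self, name, pcf):
        self.intermediaries[name] = pcf
        self._reappraise()

registry = MitigationRegistry()
registry.update_hazard("chemical_store", 0.9)
registry.update_intermediary("scrubber_offline", 1.5)  # change triggers reappraisal
print(registry.threat_level)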

The system may run its own housekeeping programs which continuously cross-check the inputs provided by the different agencies and bring out inconsistencies (a small sketch of such a cross-check is given after the list below). A common conclusion is that it is essential to assess CVH as an integral part of the causal chain of risk, and to appreciate that changing vulnerability (through consciousness and awareness) is an effective strategy for risk management. Vulnerability analyses, particularly those aimed at advancing sustainability, can be characterized by the following elements:
– Multiple interacting perturbations and stressors/stresses and the sequencing of them;
– Exposure beyond the presence of a perturbation and stressor/stress, including the manner in which the coupled system experiences hazards;
– Sensitivity of the coupled system to the exposure;
– The system’s capacities to cope or respond (resilience), including the consequences and attendant risks of slow (or poor) recovery;
– The system’s restructuring after the responses taken (i.e., adjustments or adaptations);
– Nested scales and scalar dynamics of hazards, coupled systems, and their responses.
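
As a hedged sketch of such a housekeeping cross-check (the staleness threshold, record layout and agency names are assumptions), the system could flag records that are stale or that conflict between agencies:

from datetime import date

def cross_check(records, today, max_age_days=180):
    # Flag stale records and conflicting values reported by different
    # agencies for the same locality and attribute.
    issues = []
    seen = {}
    for agency, locality, attribute, value, updated in records:
        if (today - updated).days > max_age_days:
            issues.append(f"stale: {agency} {locality}/{attribute}")
        key = (locality, attribute)
        if key in seen and seen[key][1] != value:
            issues.append(f"conflict: {locality}/{attribute} "
                          f"{seen[key][0]}={seen[key][1]} vs {agency}={value}")
        else:
            seen[key] = (agency, value)
    return issues

records = [
    ("census", "ward_12", "population", 41000, date(2006, 1, 10)),
    ("public_health", "ward_12", "population", 38500, date(2007, 3, 5)),
]
print(cross_check(records, today=date(2007, 5, 1)))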

Case study: Bhopal gas tragedy

Description
The incident occurred at approximately 0030 on Monday, 3 December 1984, in the suburbs of Bhopal, India. It is one of the worst industrial air pollution disasters in the world, affecting nearly 200,000 people. The source of the incident was a large pesticide factory owned by Union Carbide; one of the intermediates of pesticide production there was the poisonous gas methyl isocyanate (MIC), which leaked out from one of the storage tanks. On the day of the incident, the temperature in one of the tanks rose to 38°C, bringing the MIC close to its boiling point. The increased pressure exceeded the design value of the tank, rupturing a relief valve. Approximately 40-41 tonnes of MIC vapour escaped through a 33 m high atmospheric vent line. The sodium hydroxide scrubber designed to neutralize MIC was not in operation at that time, and when it was eventually switched on it was unable to cope with the volume of MIC released. The release continued for about 90 minutes into the cool, dry, stable atmosphere of Bhopal. The cloud of gas that formed over the factory slowly drifted towards the city in the darkness of the night. Some residents of the affected area realized that something was wrong and tried to escape, but many of them never had a chance and simply perished. Such was the lack of timely direction that workers at Bhopal railway station were found dead approximately two hours after the release. Five days after the incident the toll was more than 2,500; within a month, more than 3,500 precious lives had been lost. Some of the striking irregularities observed were:

* During the initial 48 hours following the incident, no measurements of atmospheric MIC were taken.
* The sampling team had little idea of what chemicals they were looking for.
* Patients were provided with symptomatic treatment, but hospitals could not cope as the toll climbed.
* There was little toxicological understanding of MIC.
* Lack of emergency management systems within the factory.
* No specific hazard contingency plan for emergency services.
* No detailed meteorological plot of the movement of the plume.


Integrated mitigation response (IMR)
If a hypothetical mitigation system as described above had been in place at Bhopal before the disaster, it could have generated advisories, cautions and stark warnings on the following matters in terms of CVH and PCF values (a simple sketch of such advisory generation is given after the list).

* Storage of hazardous chemicals (High CVH) dangerously close to populated areas (High Vulnerability).
* Absence of adequate storage safety equipment to ensure public safety. (High PCF)
* Need for procedural periodic checks by a central agency observer as third-party verification. (High PCF)
* A complete lack of knowledge of the toxicological implications of the chemical. (Moderate PCF)
* Road evacuation plans to be kept in place at all times. (High PCF)
* Absence of a mechanism to detect toxic chemicals in the atmosphere and track their spread through a multi-point wind movement tracking system. (Very High PCF)
* A list of all residents and their telephone details, linked at all times, so that the central system can send advisories to the affected area. (Moderate PCF)
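
A minimal sketch of how such advisories might be generated by thresholding CVH and PCF values is given below; the 0.4 and 0.7 cut-offs and the list of findings are illustrative assumptions.

def advisory(finding, value, kind):
    # Map a CVH or PCF value to an advisory tier; cut-offs are assumed.
    if value >= 0.7:
        tier = "WARNING"
    elif value >= 0.4:
        tier = "CAUTION"
    else:
        tier = "ADVISORY"
    return f"{tier}: {finding} ({kind}={value:.2f})"

findings = [
    ("Hazardous chemical store close to populated area", 0.85, "CVH"),
    ("No toxic-gas detection and plume-tracking mechanism", 0.80, "PCF"),
    ("Incomplete toxicological data on stored chemical", 0.50, "PCF"),
]
for finding, value, kind in findings:
    print(advisory(finding, value, kind))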

It is quite likely that these cautions would have been ignored before the disaster, but they would certainly have helped towards more effective post-disaster management and response in the following areas.

* Identification / prioritization of the population requiring emergency evacuation.
* Identification of the transport, accommodation, public health and communication resources at disposal.
* Coordinated and effective response for moving people out of the disaster area.
* Identification of the appropriate clinical procedures for treatment of the victims.
* Emergency treatment of the victims.

Conclusion

Presently, much of the effort to compile the data and carry out vulnerability, CVH and PCF value analysis looks like the proverbial digging up of a mountain in search of a rat. The truth is out there for anyone who cares to see: there is a shortage of electricity, the water supply is inadequate, malnourishment is common, crime is rampant, there is no approach road to the village, and a fire tender can never hope to make its way through the maze of by-lanes, yet people go about their lives. The system must deal with the situation as it is, not as it ought to be.

Normal life, with all its well-known problems, is irreversibly damaged by a disaster, which creates waves like a stone thrown into a pond. An integrated disaster mitigation apparatus can generate a more comprehensive and better prioritized set of ‘things to be done’ in the pre-disaster as well as the post-disaster phase than any manual system. The disaster mitigation machinery functions like a war-waging machine, only with the opposite aim.

Disaster mitigation management requires a very coordinated and rapid response. Today, technology offers us an unprecedented chance to rise to the occasion in a well-thought-out and transparent manner. Investment in this vital area will definitely save societies and nations from a great deal of death and destruction.

References

* “SOGOS – A Distributed Meta Level Architecture for the Self-Organizing Grid of Services”, C. Beckstein, P. Dittrich, Ch. Erfurth, D. Fey, B. König-Ries, M. Mundhenk, H. Sack
* “Modeling Multi-Hazard Disaster Reduction Strategies with Computer-Aided Morphological Analysis”, Tom Ritchey, Swedish Defence Research Agency
* “Socio-Economic Aspects of Water-Related Disaster Response”, ISDR
* “Flood and Flash Flood Disaster Management Measures”
* “Culture of Disaster Prevention”, UNCRD
* “Natural Disaster Data and Information Management System”, H. Assilzadeh, S.B. Mansor, Institute of Advanced Technology (ITMA), Universiti Putra Malaysia
* “Towards a Research Agenda to Guide the Implementation of Spatial Data Infrastructures – A Case Study from India”, Yola Georgiadou, Satish Puri and Sundeep Sahay
* “Geo-information as Disaster Management Tools in Aceh and Nias, Indonesia: a Post-disaster Area”, Rizqi Abdulharis, D.M. Hakim, Akhmad Riqqi and Siyka Zlatanova
* “Ontologies for Intelligent Search and Semantic Translation in Spatial Data Infrastructures”, L. Bernard, U. Einspanier, S. Haubrock, S. Hübner, W. Kuhn, R. Lessing, M. Lutz, U. Visser
* “Infrastructure Development Relating to Disaster Management”, ESCAP

 

Squadron Leader Mudit Mathur

is a commissioned officer in the Indian Air Force. He has undergone advanced training in the field of aerospace remote sensing and its applications at the National Remote Sensing Agency (NRSA), Vikram Sarabhai Space Centre (VSSC) and École Militaire (Paris).
mr.mudit@gmail.com
   

Wing Commander YD Andurkar

is a commissioned officer in the Indian Air Force and a mechanical engineer by profession. He has undergone training in the field of aeronautics, and is a postgraduate from IIT Chennai and FMS New Delhi.
ydandurkar@gmail.com
   
     
 