ABSTRACT

Title of document: REAL-TIME DECISION AID DISPLAY

Jennifer Au, Anthony Bonomo, Laura Freyman, Brian Kwong, Benjamin Li, Jessica Lieberman, Levon Mkrtchyan, Michael Price, Andrew Skoda, Mary Tellers, Andrew Tomaschko, Johnny Wu

Directed by: Dr. Frederick W. Mowrer, Associate Professor Emeritus, Department of Fire Protection Engineering

Fire sensor systems effectively monitor the state of a building, detect fire, and alert occupants in the event of an emergency. However, fire sensor technology is limited in its ability to convey information to firefighters. Even though all of the necessary information can be obtained through Fire Alarm Control Panels (FACPs), it is difficult to use them to track the progression of a fire. We designed and prototyped a decision aid system to illustrate our approach to this problem. Our goal was to create a tactical decision aid display that can present building information through an intuitive interface in real time. We used previous research on the information needs of firefighters in designing the interface. Our key insight was to use a floor plan with a sensor information overlay to organize information. We implemented a prototype that interfaces with FACPs using existing facilities management system communication protocols.

REAL-TIME DECISION AID DISPLAY

by Team Future Firefighting Advancements (FFA)
Jennifer Au, Anthony Bonomo, Laura Freyman, Brian Kwong, Benjamin Li, Jessica Lieberman, Levon Mkrtchyan, Michael Price, Andrew Skoda, Mary Tellers, Andrew Tomaschko, Johnny Wu

Thesis submitted in partial fulfillment of the Gemstone Program, University of Maryland, 2011

Advisory Committee:
Dr. Frederick W. Mowrer, Chair
Mr. Millard B. Holmes
Dr. James A. Milke, P.E.
Dr. James Purtilo
Dr. Peter B. Sunderland
Mr. Scott Wood

© Copyright by Team FFA
Jennifer Au, Anthony Bonomo, Laura Freyman, Brian Kwong, Benjamin Li, Jessica Lieberman, Levon Mkrtchyan, Michael Price, Andrew Skoda, Mary Tellers, Andrew Tomaschko, Johnny Wu
2011

Acknowledgments

We would like to thank SFPE ESF for their generous grant and the opportunity to present at the SFPE conference. In addition, on campus we would like to thank David Doucette, Jim Robinson, and Olga Zeller for their time, patience, and helpfulness throughout the project, as well as the other facilities management staff who lent their time. We are also grateful to Millard Holmes and SimplexGrinnell for the donation of hardware and the expertise to help us configure it for our project. We would also like to thank Honeywell for their donation of hardware as well. We also appreciate the Gemstone program's support throughout this four-year process. Finally, we would like to thank the Fire Protection Engineering Department for the space they gave us for meetings and lab work.

Table of Contents

Acknowledgments
Table of Contents
List of Figures
List of Abbreviations
1 Introduction
2 Literature Review
  2.1 Fire Emergency Responder Standard Operating Procedures
    2.1.1 Size-Up
    2.1.2 Operations - High-Rise Buildings
  2.2 NFPA Standards
  2.3 BACnet
  2.4 Building Tactical Information Project
  2.5 FireGrid
  2.6 Current Building Technology
    2.6.1 Overview
    2.6.2 Major Companies of the Fire Alarm and Detection Industry
      2.6.2.1 SimplexGrinnell
      2.6.2.2 Honeywell International, Inc.
      2.6.2.3 Siemens AG
    2.6.3 Other Relevant Technology
      2.6.3.1 Keltron
  2.7 Current Firefighter Technology
    2.7.1 Pre-planning Software
    2.7.2 Locator and Vital Sign Indicator Technology
  2.8 Human Factors
    2.8.1 Decision Support Systems (DSS)
    2.8.2 GUI Design
3 Methodology
  3.1 Case Study Methodology
  3.2 System Design
    3.2.1 Assessing EFR Needs
    3.2.2 Evaluation of Prior Work
    3.2.3 Design Goals and Basic Layout
    3.2.4 Final GUI Design
  3.3 Software Implementation
    3.3.1 Software Design Choices
    3.3.2 Annotation Tool
  3.4 Hardware Testing
    3.4.1 Overview
    3.4.2 Panel Output Protocols
    3.4.3 Hardware and Scenario Mockup
    3.4.4 Procedure
      3.4.4.1 Analysis
  3.5 Fire Dynamics Simulator Tests
    3.5.1 FDS Overview
    3.5.2 Simulations
    3.5.3 Testing Procedure
      3.5.3.1 Test 1
      3.5.3.2 Test 2
4 Results
  4.1 Hardware Test
    4.1.1 Honeywell Mockup Test
    4.1.2 SimplexGrinnell Mockup Test
    4.1.3 Fire Progression
    4.1.4 Analysis
  4.2 FDS Testing of GUI
    4.2.1 Test 1 Results
      4.2.1.1 JMP Scenario
      4.2.1.2 Multilevel Scenario
    4.2.2 Test 2 Results
      4.2.2.1 Start of incident
      4.2.2.2 Flow switch activation: Four minutes from start of fire
      4.2.2.3 All smoke sensors in alarm: Eight minutes from start of fire
      4.2.2.4 All sensors in alarm for sparse case: Twelve minutes from start of fire
5 Conclusions
  5.0.3 Future Directions
A Features List
B Hardware Test: Emergency Scenario
C Sensor Triggering Testing
D FDS Test 1: Full Results
  D.1 Side-by-side comparison of JMP simulation to GUI
  D.2 Side-by-side comparison of Multilevel simulation to GUI
E FDS Code
  E.1 JMP FDS code
  E.2 Multi-level FDS code
F Prototype Source Code
References

List of Figures

1.1 Smoke can be seen pouring out of the top of the building as the fire continues to burn on the first floor
2.1 4190 PC Annunciator User Interface
2.2 TrueSite Floor Plan Window
2.3 TrueSite Historical Log Window
2.4 ONYXWorks Workstation software
2.5 Interface of ONYX FirstVision
2.6 Relationship of nine key elements of the emergency decision process
3.1 On-site Screen of NIST's Prototype Tactical Decision Aid Display (Davis, Holmberg, Reneke, Brassell, & Vettori, 2007)
3.2 Team FFA Final GUI Mockup
3.3 A mockup floor plan for real lab hardware to simulate
4.1 Frame A
4.2 Frame B
4.3 Frame C
4.4 Frame D
4.5 Frame E
4.6 Frame F
4.7 Frame G
4.8 Side-by-side comparison of JMP simulation to GUI at the three minute mark
4.9 Side-by-side comparison of JMP simulation to GUI at the eight minute mark
4.10 Side-by-side comparison of JMP simulation to GUI at the twelve minute mark
4.11 Incident start
4.12 Flow switch activation
4.13 All smoke sensors in alarm
4.14 All sensors in alarm
A.1 NEMA SB 30 Symbols Used
B.1 Floor 1
B.2 Floor 2
B.3 Floor 3
D.1 Side-by-side comparison of JMP simulation to GUI at the one minute mark
D.2 Side-by-side comparison of JMP simulation to GUI at the two minute mark
D.3 Side-by-side comparison of JMP simulation to GUI at the three minute mark
D.4 Side-by-side comparison of JMP simulation to GUI at the four minute mark
D.5 Side-by-side comparison of JMP simulation to GUI at the five minute mark
D.6 Side-by-side comparison of JMP simulation to GUI at the six minute mark
D.7 Side-by-side comparison of JMP simulation to GUI at the seven minute mark
D.8 Side-by-side comparison of JMP simulation to GUI at the eight minute mark
D.9 Side-by-side comparison of JMP simulation to GUI at the nine minute mark
D.10 Side-by-side comparison of JMP simulation to GUI at the ten minute mark
D.11 Side-by-side comparison of JMP simulation to GUI at the eleven minute mark
D.12 Side-by-side comparison of JMP simulation to GUI at the twelve minute mark
D.13 Side-by-side comparison of JMP simulation to GUI at the thirteen minute mark
D.14 Side-by-side comparison of JMP simulation to GUI at the fourteen minute mark
D.15 Side-by-side comparison of JMP simulation to GUI at the fifteen minute mark
D.16 Side-by-side comparison of JMP simulation to GUI at the sixteen minute mark
D.17 Side-by-side comparison of JMP simulation to GUI at the seventeen minute mark
D.18 Side-by-side comparison of Multilevel simulation to GUI at the one minute mark
D.19 Side-by-side comparison of Multilevel simulation to GUI at the two minute mark
D.20 Side-by-side comparison of Multilevel simulation to GUI at the three minute mark
D.21 Side-by-side comparison of Multilevel simulation to GUI at the four minute mark
D.22 Side-by-side comparison of Multilevel simulation to GUI at the five minute mark
List of Abbreviations

Antibody Identification Assistant (AIDA)
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)
Building automation system (BAS)
Building information services and control system (BISACS)
Building services interface (BSI)
Chemical, biological, radiation (CBR)
Command Decision Support Interface (CODSI)
Comma-separated value (CSV)
Decision support system (DSS)
Emergency first responders (EFRs)
Fire Alarm Control Panel (FACP)
Fire Dynamics Simulator (FDS)
Future Firefighting Advancements (FFA)
Geographic information system (GIS)
Gallons per minute (gpm)
Global positioning system (GPS)
Graphical user interface (GUI)
Hazardous materials (hazmats)
Human-computer interaction (HCI)
Incident commanders (ICs)
James M. Patterson (JMP)
Maryland Fire and Rescue Institute (MFRI)
National Electrical Manufacturers Association (NEMA)
National Fire Protection Association (NFPA)
National Institute of Standards and Technology (NIST)
Occupational Safety and Health Administration (OSHA)
Personal Identity Verification (PIV)
Regional Crime Analysis Program (RECAP)
Radio-frequency identification (RFID)
Records management system (RMS)
Self-contained breathing apparatus (SCBA)
Sensor Driven Fire Model (SDFM)
Society of Fire Protection Engineers (SFPE)
SFPE Education and Scientific Foundation (SFPE ESF)
Signaling line circuit (SLC)
Ultra wideband (UWB)
Very Early Smoke Detection Apparatus (VESDA)
Extensible markup language (XML)

Chapter 1
Introduction

For decades, building sensor manufacturers have pursued the goal of detecting emergency situations earlier while reducing the occurrence of nuisance alarms. In this respect, great progress has been made. Modern building sensors are much more reliable and accurate than their predecessors. At the same time, the goal of using building sensors as an aid to emergency first responders (EFRs) has been given less attention. Building sensor data could provide important information, and history has shown that having that data in a timely manner could aid in emergency response.

The Pang Seattle warehouse fire is an example of how having information immediately can change the outcome of an emergency. The fire started on July 5, 1995 in the warehouse as a result of arson, and firefighters were deployed inside the building in order to suppress the flames. The firefighters on the first floor were able to successfully put out the fire in that area and believed that they had the emergency under control. However, the fire raged on in the basement, with another dispatched group of firefighters trying to suppress it. The firefighters on the first level did not communicate with the firefighters on the lower level and vice versa, and neither communicated with incident command. Consequently, neither the incident commander nor the firefighters on the first floor had any idea that there was a raging fire in the basement. There was a sudden collapse of the first floor of the warehouse, resulting in the deaths of four firefighters (Routley, 1995). If the information of where the fire was located had been immediately available, the incident commander would have been much better informed when formulating an attack plan for the fire. The need for more pre-planning data, as well as more data communication during an emergency, was made very clear in this incident (Thiel, 1999).

The First Interstate Bank building fire in Los Angeles on May 4, 1988 is another example of an incident where more information could have been provided to the personnel involved.
At the time, this incident was considered to be one of the most devastating fires in history. The blaze destroyed four floors and damaged a fifth, resulting in over $50 million in property damage. One person was killed, and thirty-five occupants and fourteen firefighters were injured during the incident.

The fire, which is believed to have been caused by an electrical source, originated on the twelfth floor of the building. Shortly after, a manual pull station was activated and silenced by security personnel within minutes; they believed the alarm was the result of repair work being performed in the building. A series of smoke detector alarms were activated next, but each one was also reset by security personnel. Eventually, an employee went to the twelfth floor to determine the source of the alarms, only to be engulfed by the flames in the lobby. The multiple alarm resets delayed the notification of the fire department, leading to a much larger fire by the time firefighters arrived. This scenario is one where a display of the alarm history may have been helpful to security personnel, who were unable to piece together the multiple alarms and recognize the extent of the fire.

In addition, the main fire pumps were shut down, resulting in poor water pressure in the first minutes of response. Firefighters eventually found the building fire pumps, mainly because the sprinkler installation supervisor was present and able to inform the incident commander of the existence of these pumps (Routley, 1998). In this case, critical static information could have been made available to the incident commander prior to arriving on the scene, which would have prevented any delays in response. Again, this highlights the need for more pre-planning data.

The MGM Grand Hotel fire took place on November 21, 1980. The fire began early in the morning, at approximately 7:15 AM, as a result of an electrical fault that had been smoldering in the walls of a deli on the first floor of the hotel. The fire soon reached flashover in the deli, sending a fireball expanding rapidly throughout the first-floor casino at about nineteen feet per second (Puit, 2000). The casino was full of fuel for the fire, including a highly flammable adhesive used to attach ceiling tiles. The fire had engulfed the first floor by the time the first responders arrived at the scene four minutes after the start of the fire.

In Figure 1.1, copious amounts of smoke can be seen pouring from the building at the top and some of the sides. However, that smoke is the limit of visual information that can be gained from an external view. While it may appear that the fire has spread throughout the entire building by looking at the smoke, the fire is only on the first floor. The massive amount of smoke generated by the fire quickly rose through ventilation shafts, moved through the hallways on the top floor, and escaped out the windows. Because this smoke movement is very dependent on environmental factors, it becomes hard to visually ascertain exactly where a fire is within a building just from seeing the smoke pour out of it.

Figure 1.1: Smoke can be seen pouring out of the top of the building as the fire continues to burn on the first floor.

There were many critical mistakes made that led to the severity of the fire at the MGM Grand Hotel, including a lack of fire detectors of any form and a lack of sprinklers. Either of these fire safety measures could have provided valuable data to personnel and emergency first responders. In the event that fire spread could not be limited by sprinklers, data from building sensors could have provided information as to where the fire was spreading within the building and how to further plan for effective suppression efforts.

Currently, sensor data is typically presented sequentially and out of context, making the data less useful for decision making. Most modern building sensor systems still use a text-only display, which is hard to interpret and takes too much effort to access and understand the information one is seeking. In order to assess the situation in a building, it is necessary to contextualize the sensor data; that is, it must be presented in the context of related data. For example, the state of any one sensor may not say much about the progression of smoke in the building, but the states of all of the smoke detectors on a floor do.
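To make the idea of contextualization concrete, the short sketch below contrasts a sequential, text-only event log with the same events grouped by floor. The sensor identifiers and event format are hypothetical; this is an illustration, not the prototype's actual code.

```python
from collections import defaultdict

# Hypothetical alarm events as an FACP might report them, in the order received.
# Each tuple: (sensor_id, floor, room, state)
events = [
    ("SD-201", 2, "201", "ALARM"),
    ("SD-204", 2, "204", "ALARM"),
    ("HD-105", 1, "Lobby", "NORMAL"),
    ("SD-206", 2, "206", "ALARM"),
]

# Sequential, text-only presentation: each event stands alone.
for sensor_id, floor, room, state in events:
    print(f"{sensor_id} ({room}, floor {floor}): {state}")

# Contextualized presentation: group alarm states by floor so the spread is visible.
by_floor = defaultdict(list)
for sensor_id, floor, room, state in events:
    if state == "ALARM":
        by_floor[floor].append(room)

for floor, rooms in sorted(by_floor.items()):
    print(f"Floor {floor}: {len(rooms)} detectors in alarm ({', '.join(rooms)})")
```

The grouped view is the textual equivalent of the floor-plan overlay the prototype uses: the same events, but organized by where they are in the building.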
There are a number of barriers to contextualizing building data. Each sensor manufacturer uses its own protocols for communication between sensors and the annunciator panels. This lack of standardization makes it difficult to create a product that can work with systems from different manufacturers. Another problem is that, frequently, the sensor data is not easily obtained from the annunciator panel. Retrieving analog sensor values such as room temperature and smoke obscuration can be more difficult than simply obtaining sensor trouble or alarm states. The extent of these problems was revealed during work on our project, but we did not attempt to address them since there are known solutions.

Our team explored the approach of data contextualization through an emergency visualization system. Overlaying data on a floor plan is the simplest way to contextualize it. We focused on making the system useful as a decision support tool during an emergency. We limited the prototype to work only with fire sensor systems due to resource limitations, but intend the concept to be easily extensible to all building sensor systems.

In this thesis we describe the design, prototyping, and evaluation of our system. Our intent is for this project to serve as a proof of concept. We want to show that building sensor data can be more useful when contextualized. The decision support tool we developed demonstrates our approach to visualizing the state of a building. Finally, we wanted to find out whether modern fire sensor systems can be incorporated into such a visualization system without any changes to sensor technology. To satisfy this goal we focused on the design stages of the process but left the implementation as a prototype rather than as a finished product.

We are aware that in order to put a building state visualization system to practical use, a number of infrastructure questions must be resolved. How are building floor plans stored and made available for use in our system? Who annotates the floor plans with sensor locations, and when? How is the sensor data made externally available without compromising security? How do EFRs incorporate use of the system into their existing practices? It is not the aim of this thesis to address any of these questions. These questions can be answered once a fully functional decision support tool has been realized.

Chapter 2
Literature Review

2.1 Fire Emergency Responder Standard Operating Procedures

2.1.1 Size-Up

In order to effectively combat any fire emergency, firefighters are required to obtain and process a great deal of information for every fire situation.
According to the Fire Officer's Handbook of Tactics by John Norman, there is a thirteen-point outline used by fire chiefs that covers a majority of fireground considerations, conveniently summed up by the mnemonic COAL WAS WEALTH: Construction, Occupancy, Apparatus and manpower, Life hazard, Water supply, Auxiliary appliances, Street conditions, Weather, Exposures, Area and height, Location and extent of fire, Time, and Hazardous materials. These points are interrelated, and each is not simply considered separately when sizing up a fire situation.

Norman establishes that life hazard is the most important factor in determining fire operations and tactics. Life hazard comes in two forms: civilians and firefighters. Tactics employed by firefighters will usually be more aggressive if there are civilians in need of rescue, but incident commanders (ICs) will not risk the safety of firefighters if there is no significant civilian life hazard. Risk to civilian life is best prevented before an incident occurs by imposing occupancy restrictions, specifying fire doors and exits, and installing an automatic sprinkler system throughout the building.

Occupancy has considerable bearing on life hazard and is usually dependent on the time of day and the type of building. For example, hospitals and residential buildings create a high life hazard at all hours of the day, storage warehouses pose a uniformly low life hazard, and the life hazard of a school varies greatly with the time of day.

The time of year affects fire operations as well. Certain times of year, such as the holiday season or hunting and fishing seasons in some regions, can affect the amount of manpower available to ICs, especially in volunteer departments (Norman, 2005). Another important aspect of time in fire operations is the elapsed time of a fire incident. ICs have some rules of thumb to gauge how long a fire has been burning before their arrival, such as looking to see if fire is venting out of windows, but these are typically inexact and require very experienced ICs to apply them to any given situation. ICs also have mechanisms in place to track the time spent fighting a fire, such as the dispatcher requesting a status report from the IC five minutes after arrival and every ten minutes for the first hour of the incident (Norman, 2005).

Construction of the building is another important factor in determining the fire tactics utilized by ICs. Buildings are classified into five general types: fire-resistive, noncombustible, ordinary construction, heavy timber, and wood frame. These classifications are based on four criteria: the degree of compartmentation a building provides, the degree to which a building contributes to the fire load, the number of hidden voids in the building, and the ability of the building to resist collapse (Norman, 2005).

Area and height of a building are concerns during size-up for several reasons. Most notably, they indicate the maximum possible fire area. Building height can also give clues as to what kind of construction the building is and what auxiliary appliances are available. For instance, if the building is over a certain height, it might be required by law to be built of Class I fire-resistive construction or have a sprinkler or standpipe system (Norman, 2005).

The physical location and extent of the fire has an influence on the tactics used to control it. In general, the lower a fire is in a building, the more problematic the incident due to more potential for vertical spread.
Other locations that create firefighting problems include the top floors of buildings with ordinary brick and wood-frame construction; fires below grade, such as in a cellar, a tunnel, or below deck on a ship; and fires beyond the reach of ladders (Norman, 2005). Determining the extent of a fire can be difficult, as certain factors such as central air conditioning and moving elevators can cause smoke to move in ways it otherwise would not.

ICs must also identify and protect exposures, or areas around the building that could create additional life and property hazards. Firefighters often use a numbering system to identify exposures, labeling the sides of the building as one through four, starting at the front of the building and proceeding clockwise around the perimeter (Norman, 2005).

The IC must then identify the resources available at the scene and start to determine a plan of action for attacking the incident. The IC must identify the apparatus and manpower at his disposal and determine how to utilize them. Closely related to these factors is the available water supply, which includes not only the sources of water but also the devices and manpower used to transport the water to extinguish the fire, such as hoses and pumpers. ICs must also determine how much water is needed to put out the fire, which is usually measured in gallons per minute (gpm) based on the square footage of the fire. This can range from 10 gpm for 100 square feet of light fire load to 50 gpm for 100 square feet of heavy fire load. ICs must also determine the presence and status of auxiliary appliances such as sprinkler systems, standpipe systems, and foam suppression systems (Norman, 2005).

Several other factors should be accounted for when determining firefighting tactics. Weather conditions can affect an IC's strategy; high temperatures and humidity will fatigue firefighters more quickly, while high winds might make ventilation of a fire impractical. Certain street conditions such as construction and illegally parked cars can hamper maneuvering and apparatus placement and utilization. Hazardous materials (hazmats) present in and around a building pose a variety of problems, ranging from health hazards to acceleration of the fire extension. Identifying and attacking all of the above factors makes the IC's job very challenging (Norman, 2005).

2.1.2 Operations - High-Rise Buildings

The basic strategic plan for a high-rise fire incident starts with determining and verifying the specific fire floor before committing hose lines. This information is critical but difficult to ascertain, as the initial information is often vague, such as smoke seen from the fifth, sixth, and seventh floors. Next, firefighters must begin a controlled evacuation, evacuating those in immediate danger first, preventing a panicked exit by those not endangered, and searching the fire floor and all floors above the fire. Firefighters must then gain control of the building systems, including HVAC, elevators, public announcement, and other vital systems, and proceed to confine and extinguish the fire (Norman, 2005).

A tremendous amount of resources and experience is required to fight a high-rise building fire. In New York City, the signal of a possible working fire in a high-rise calls for a deputy chief and four battalion chiefs to respond, and a second alarm calls for another battalion chief, another deputy chief, and a senior staff chief.
The command post is usually set up in the lobby of the high-rise, while the operations officer, usually one of the first responding chiefs other than the IC, goes to the floor below the fire to assume direct command of the fire operations. The operations officer must communicate constantly with the command post, keep the line of attack moving forward, and maintain resources (Norman, 2005).

Firefighters must take steps to remove the smoke and heat buildup caused by a fire, which is usually done through ventilation of the building. One way this can be achieved is through vertical ventilation, or opening the top of the building using a stairwell or elevator shaft. The IC should send two EFRs to the top of the building to open the stairwell door on the roof. The EFRs should check with the IC to see how the fire responds to this door opening. If favorable, the door to the fire area should then be opened, after which smoke and heat will be drawn to the roof of the building, with the stairwell acting as a chimney (Norman, 2005).

Another way to vent smoke, heat, and gas out of the building is horizontal ventilation. This technique involves breaking windows on the fire floor to allow smoke to exit out the sides of the building. This tactic is much more complex than vertical ventilation because, if done improperly or in the wrong weather conditions, it can cause smoke to blow into the building and upward to the top floors. This tactic also has the added danger of glass falling to the street outside, which can injure bystanders and fire personnel as well as sever hose lines. The IC should only authorize horizontal ventilation after careful consideration (Norman, 2005).

After ventilation of the fire, gaining access to the fire area is the next firefighting concern. For large high-rises, using the elevators is considered a "necessary evil" because a firefighter will not be very effective at fire suppression after walking up over twenty stories carrying forcible entry tools, hoses, and other gear. Certain safety precautions must be taken when using elevators in a fire situation, which include attempting to accurately determine the fire floor, ensuring each team using the elevator has been documented and properly equipped, using firemen's service elevators if possible, pressing the call cancel button when boarding to eliminate any other selections made, and being prepared to don facemasks when arriving at the destination. If elevators become unavailable, extra supplies must be transported manually to the operations post. One firefighter would be assigned to transport equipment for every two floors from the ground floor to the fire floor. If there is not enough fire personnel to transport equipment, non-fire personnel such as police can be used as long as they are not put in threatening situations (Norman, 2005).

The first-arriving units are charged with locating the fire, determining its size and likely paths of travel, and redeploying to prevent extension. The attack crews on the fire floor, however, will have a difficult time putting out the fire, as it might be challenging to attack the base of the fire due to obstructions such as desks and partitions. There is also the issue of having enough pressure and water flow from standpipes that are so high above the ground floor. A fully involved floor that is over 30,000 square feet requires a minimum of 3,000 gpm to extinguish a fire at a light fire load, but the NFPA 14 standard's 1993 revision requires only 1,250 gpm of water flow from standpipes.
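As a rough illustration of the rule-of-thumb flow requirements cited above (10 to 50 gpm per 100 square feet of involved area), the following sketch shows the arithmetic; it is an illustration only, not a fire protection design calculation.

```python
def required_flow_gpm(area_sq_ft: float, gpm_per_100_sq_ft: float) -> float:
    """Rule-of-thumb fire flow: a rate per 100 sq ft of involved floor area."""
    return area_sq_ft / 100.0 * gpm_per_100_sq_ft

# Light fire load (~10 gpm per 100 sq ft) over a fully involved 30,000 sq ft floor.
print(required_flow_gpm(30_000, 10))   # 3000.0 gpm, versus the 1,250 gpm standpipe minimum
# Heavy fire load (~50 gpm per 100 sq ft) over a 100 sq ft area.
print(required_flow_gpm(100, 50))      # 50.0 gpm
```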
The next step would be to get above the fire floor and contain the fire until it consumes enough fuel to be fought manually. The tactics used for this process are considered last-resort measures to get water onto a fire that is otherwise untenable (Norman, 2005).

The tactics outlined above are used by essentially all firefighters and ICs across the country. Despite the advancements in fire protection and building construction made in recent years, a high-rise building disaster remains a very real possibility (Norman, 2005).

2.2 NFPA Standards

The National Fire Protection Association (NFPA) is responsible for creating and advocating three hundred consensus codes and standards. In general, the purpose of these standards is to decrease the potential risks and negative effects of fire through the establishment of criteria for building, processing, design, service, and installation (National Fire Protection Association, 2010). The NFPA uses committees of volunteers from the fire protection industry to create and update these standards. While wording from NFPA standards has been incorporated into particular federal or state Occupational Safety and Health Administration (OSHA) regulations, compliance with most of the three hundred codes and standards is voluntary unless adopted into law (National Volunteer Fire Council, n.d.). Three NFPA standards, NFPA 72, NFPA 170, and NFPA 1620, are of particular significance to our research project.

NFPA 72 contains the National Fire Alarm and Signaling Code, which provides standards for the installation and use of fire alarm systems. Requirements for initiating devices (smoke and heat detectors, radiant energy sensing fire detectors, gas detectors, waterflow alarm-initiating devices, and other fire detectors) and notification appliances are thoroughly outlined. The NEMA (National Electrical Manufacturers Association) SB 30 Fire Service Annunciator and Interface standards included in Annex E are especially important to our research. NEMA SB 30 provides a cross-industry standard for the symbols and other elements involved in interface systems related to fire protection (NFPA 72: National Fire Alarm and Signaling Code, 2010). Our research team plans to implement the symbols suggested by NEMA SB 30 whenever possible.

NFPA 170 contains the Standard for Fire Safety and Emergency Symbols, which presents a standard set of symbols to be used in representing fire safety, emergency, and associated hazards (NFPA 170: Standard for Fire Safety and Emergency Symbols, 2009). Included are symbols for general use, for use by the fire service, for use in architectural and engineering drawings, and for use in pre-incident planning sketches. We plan to use the symbols contained in NFPA 170 to augment Annex E of NFPA 72, as NFPA 170 includes symbols for elements of a fire emergency not addressed by the NEMA SB 30 standard.

NFPA 1620, the Standard for Pre-Incident Planning, establishes the essential features of any pre-planning system. It addresses the method of pre-planning, occupancy, fire protection and suppression systems, specific hazards, emergency operations, and plan testing and maintenance. It also addresses the site evaluation and other physical considerations. Some essential aspects of the building assessment portion are construction, building management, external site conditions, and internal and external security measures (NFPA 1620: Standard for Pre-Incident Planning, 2010).
NFPA 1620 details exactly what information about a building needs to be included in pre-planning materials. A standard like NFPA 1620 allows for cooperation between multiple departments on the same event.

2.3 BACnet

In order to facilitate communications between our system and third-party peripherals, our team decided to focus on using the BACnet protocol to receive and interpret data output from building information systems. The BACnet protocol was developed in 1995 by ASHRAE, the American Society of Heating, Refrigerating and Air-Conditioning Engineers, as a communications protocol to aid cross-platform messaging between disparate building information systems (Bushby, 1996). It was internationally recognized and formalized by the ISO standard 16484-5 in 2003. Prior to the development of BACnet, building information systems exclusively used proprietary communication protocols, and they were incapable of directly communicating or working with each other. Without a universal protocol like BACnet, systems that read output from multiple building information systems, such as ours, would be extremely difficult, if not impossible, to implement, for both technical and legal reasons. The different protocols would have to be handled individually in the code of the program to enable it to understand the output from the building information systems, requiring a great deal of space, time, and effort to implement an interpreter for each protocol. In addition, because these protocols are proprietary, there could be legal issues associated with making such an interpreter, since it could involve reverse-engineering these protocols to attain a sufficient understanding of them (Newman, 2000).

BACnet is an object-oriented protocol with a client-server communications paradigm. Sensors within a BACnet-compatible system are represented as "devices." Every device is made up of one or more "objects," which encapsulate a specific function of that device, such as binary input or analog output. These objects contain properties which represent the state of that function, such as whether it is in alarm, or the value of an analog sensor. These properties are accessed by the BACnet server through "services," which query the objects about the state of their properties. Every BACnet object can respond meaningfully to a subset of the available services, with additional services supported as required (Bushby, 1996).
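A minimal conceptual sketch of this device/object/property/service vocabulary is shown below. It does not use a real BACnet stack, and the device ID, object types, and property values are hypothetical; the point is only to show how a ReadProperty-style query maps onto the structure described above.

```python
from dataclasses import dataclass, field

@dataclass
class BACnetObject:
    """One function of a device (e.g., an analog input), exposed as named properties."""
    object_type: str
    instance: int
    properties: dict = field(default_factory=dict)

@dataclass
class BACnetDevice:
    """A sensor or panel on the network, made up of one or more objects."""
    device_id: int
    objects: list = field(default_factory=list)

def read_property(device: BACnetDevice, object_type: str, instance: int, prop: str):
    """Conceptual ReadProperty service: query one property of one object."""
    for obj in device.objects:
        if obj.object_type == object_type and obj.instance == instance:
            return obj.properties.get(prop)
    return None

# A hypothetical smoke detector exposing an analog smoke-obscuration value.
detector = BACnetDevice(
    device_id=2201,
    objects=[BACnetObject("analog-input", 1,
                          {"present-value": 2.4,
                           "units": "percent-per-foot",
                           "status-flags": {"in-alarm": False}})],
)
print(read_property(detector, "analog-input", 1, "present-value"))  # 2.4
```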
This wide applicability theoretically enables our decision support tool to receive and use data from any BACnet-compatible device present within a building, even if it is not directly related to re detection or prevention (Bushby, 1996). 2.4 Building Tactical Information Project The ongoing Building Tactical Information project being conducted by the National Institute of Standards and Technology (NIST) is concerned with the de- velopment of technology that aims to make real-time building information available to EFRs (Holmberg, Treado, & Reed, 2006). The project has four main objectives: 17 1) determining which data would be most useful to EFRs in emergency response situations, 2) developing a standard method for dispersing the data collected by buildings to EFRs, 3) demonstrating the e ectiveness of the technology proposed by the project and 4) addressing the security issues in the system proposed. In order to make a system of complexity and sophistication that would be capable of the goals of our project, it must be determined what information is key to emergency response. To make a complex system for a target group of people such as EFRs, it is a necessary step of product development to talk to the end users to determine their needs and speci cations. To this end, NIST held a workshop for emergency responders to de ne what information they needed during a build- ing emergency to aid them in making decisions. The results of this workshop are presented in the paper entitled \Workshop to De ne Information Needed for Emer- gency First Responders During Building Emergencies," by Jones, Holmberg, Davis, Evans, Bushby, and Reed. It is important for our project to have the feedback and information generated by users from NIST?s workshop because it was conducted on a scale that would not be feasible for a Gemstone Team. The following paragraphs introduce information found from the workshop that directly pertains to speci cations we need to address. First, the participants in the workshop identi ed two essential categories of information based on the time frame typically associated with emergency response: \En-Route," de ned as the \ rst ve minutes" and \On the Scene," after the EFRs arrive at the site of the emergency. These two time frames represent the important phases of emergency response in which decisions are made. Second, the workshop 18 identi ed three major areas of information important to creating a system: static in- formation, dynamic information, and the display of information. Static information is de ned as information available before an emergency occurs. This information includes building plans, sensor layouts, and pre-established escape routes. Also in- cluded are the more speci c qualities of a structure such as where access points of the building are, locations of elevators, nearby hazard conditions, and the types of power systems that are running in the building. Dynamic information is de ned as the real-time information that would be received from the sensor systems. This information includes direct sensor readings from building re panels as well as in- formation from decision support tools which take in and analyze data in order to give a useful conclusion such as re location and spread (Jones et al., 2005). All of this information would have to be managed in a simple, yet comprehensive display. 
For the En-Route display, to be used during the rst ve minutes of an emer- gency when responders are on the way to the scene, a comprehensive list of static and dynamic information to be displayed was decided on by the EFRs who took part in the workshop. However, it was noted that given the circumstances of the rst ve minutes, with the commander and other rst responders all en-route to the incident, only so much information could be e ectively communicated due to di culties of reading from computers while in motion. With that in mind, a list of static information was generated; aspects of this list that are relevant to our project are presented below: Building condition (let burn, unsafe to enter, dangerous roof, sprinklered and other suppression systems) 19 Building type (single family, commercial, gas storage, school) Building style (one story, two story, n story, auditorium, sublevels, etc.) Building construction (type I, II, III, IV or V; re resistive, noncombustible or limited combustible, ordinary, heavy timber, or wood frame) Roof construction (light weight metal or wood trusses) Hazardous materials or unusual hazards (above ground propane tank, gas lines, chemicals, etc.) Location of re hydrants on map with building outline, nonstandard thread sizes included Location of re department hookups for sprinkler system/standpipes Other sources of water nearby Location of staging areas and entrances and exits to building History of location in case re stages before police arrive Routing information for emergency equipment to reach the building in case of construction (Jones et al., 2005) Additionally, the dynamic information generated includes: Con dence in the incident being real (based on number of sensors in alarm and/or calculated re size) Approximate location of re within building Fire size and duration Sprinklers are owing/no sprinklers or other working systems Fire growth (fast, medium, or slow) CBR (chemical, biological, radiation) sensors present and in alarm Police on the scene Presence of occupants in the building Stairwell smoke/heat conditions for positioning Standpipes to use to get to the re 20 Exposures (Jones et al., 2005) The above points of information that would aid in tactical decisions will most likely be available in large commercial structures that have the infrastructure to sup- ply it. In smaller structures, only some of the information is likely to be available. However, any and all of the above information would aid emergency response. Nor- mally, none of the above information can be obtained en-route and in the best-case scenario, only fractions of it are obtained on scene without the aid of technology. Once at the scene, incident commanders need more information to make their decisions. Information crucial to decision making lies in the oor plans of a building. Ideally, these oor plans would be easily viewable, with relevant information shown in the correct location with respect to the oor being viewed. \The oor plan (static data) would include layers/overlays that would allow the incident commander to locate: Doors, windows (with types and which can be used for egress), stairwell risers, re walls (with ratings and area separation), roof access, re sensors. Security sensors, closed circuit TV cameras, occupancy sensors, security con- trol room. Fire alarm panel and remote annunciator panels. Utility shuto . Building generator (with indication of what it powers). 
Building system controls (HVAC, smoke control, others), areas covered, special operating systems, and which ones should and should not be used by the responders. Evacuation quality elevators, oors served, and location of elevator overrides and how to control. Convenience stairs/evacuation stairs. Areas (zone boundaries) protected by sprinklers or other devices. Vertical openings. 21 Extremely valuable materials. (Jones et al., 2005) Currently, the ONYX FirstVision re panel is being used by NIST to incorpo- rate this kind of information overlay. It clearly labels all oors, all sensor locations, important points of entry and exit, and other aspects of a building useful to emer- gency responders. Where the FirstVision panel excels is in the processing of dynamic information on the scene, as determined by the workshop as the following: Location of re detectors in alarm Location of CBR sensors in alarm Location and size of re(s) Duration of the re(s) Location and condition of smoke Presence of smoke in elevator shafts or stairwells Identi cation of activation of sprinklers or other devices Location of elevators used during the incident Location of people in need of rescue (911 calls or visual sightings) Warnings of structural collapse based on material type, re location, re size and duration Location of operational elevators Alarm, occupant, and system histories of the building (Jones et al., 2005) The kind of system that could take into account all of this information cannot be limited to just re sensor technology though the most pertinent information about the movement and severity of the re comes from such sensors; it would require the incorporation of security system data, HVAC information, facilities management information, such as elevator location, and tracking systems to monitor the location 22 of EFRs within a building. A normal re sensor system would be inadequate on its own. However, re sensor systems have an important component, the re panel annunciator, which, if sophisticated enough, can be used to process and analyze information, and, given the right inputs, incorporate other system technologies by using a common communication protocol such as BACnet. Finally, an incident commander would need to know information about what is going on outside the building, such as who has responded to the emergency (medical personnel, police, re ghters, etc.), what apparatus has been dispatched and where personnel are located, in order to make fully educated decisions. 
Static information regarding what is going on outside the building to be included on the plot of the oor plan consists of: Building location with street designations Location of re ghting obstacles such as street widths, overhead clearance and elevations Location of underground pipelines and other utilities Name and phone numbers of building owners and managers Name and phone numbers of utility contact people Location of police line necessary to isolate the incident Indicated runo or water table problems Helicopter landing areas Evacuation routes Dynamic information displayed on the oor plan would include: Location of responding units ( re, police, and EMS) Location of units responding but not yet on scene Hospital availability Helicopter availability Hazmat response 23 Location of police line necessary to isolate the incident Location of triage or evacuation area Suggested hazard perimeter Local weather conditions and predicted spread directions Wind direction and velocity The data that could be provided and used by EFRs is extensive and the need for a system that can distribute all of this information is very real. The needs for EFRs in a building emergency determined by NIST?s workshop serve as the backbone for the standards set for our system. There are, however, concerns that must be considered in designing such a system. Speci cally, standardization is a key issue. EFRs expressed concerns with usability of a system, mentioning a need for every system control display to be the same, so there is no new learning curve from building to building, incident to incident. Also, the system has to be fully functional, not partially, as the Fire Service will not adopt a new, questionable system. The Fire Service cannot be expected to pay for new technology, thus the system must also be a ordable enough to be installed and used in real-time when buildings are rst constructed (Jones et al., 2005). Further literature published by NIST addresses the remaining objectives of the Building Tactical Information project. To address the second objective, NIST proposes a building information server to provide access to tactical information in real-time to EFRs (Holmberg et al., 2006). This tactical information includes static building information on a given building?s construction, occupancy type, oor plans, and system schematics, as well as dynamic information collected including alarms and changes in a building?s status as they occur. According to the project, the desirable features for such a building information server system include the ability to: Forward alarms and status messages as they occur 24 Deliver descriptive building information Allow only authorized access Allow authentication by multiple accessing connections Allow asynchronous access Present information in a common format Minimize the volume of transmitted information Maximize the use of layered communications protocols Continue to provide information in the event of partial system failure The project also proposes the use of extensible markup language (XML) in messages received and dispatched by the server in addition to the implementation of the Internet TCP/IP suite as the base communication protocol (Holmberg et al., 2006). In addition to the implementation of a building information server, the project also advocates the use of the Sensor Driven Fire Model (SDFM) support system. The SDFM is a prototype decision support system that converts sensor signals to predictions and issues warnings based on these predictions. 
In addition to the implementation of a building information server, the project also advocates the use of the Sensor Driven Fire Model (SDFM) support system. The SDFM is a prototype decision support system that converts sensor signals into predictions and issues warnings based on these predictions. In order to present the information collected by the building information server and interpreted by the SDFM system, the project proposes both en-route and on-scene information presentation screens. The en-route screen provides information about the area in the immediate vicinity of the building, while the on-scene screen presents detailed information about the building's interior (Holmberg et al., 2006).

In February 2005, a demonstration of much of the technology described above was held at NIST, accomplishing the project's third objective. The demonstration was documented on video. The presentation included demonstrations of building sensor data and fire modeling data formatted in an XML schema, the transfer of building data to the building information server, and the use of the proposed software client and interface software. At the demonstration, it was determined that EFRs are often not aware of what information is available from the control systems contained in buildings; an educational video was produced to address this deficiency (Holmberg et al., 2006).

To accomplish the fourth objective of the Building Tactical Information Project, NIST outlines a method for allowing only secure access to building information for use in emergency response. The methodology described would permit public safety officials remote access to real-time information. In addition, it would also grant access to EFRs en route (Treado, Vinh, Holmberg, & Galler, 2007). As proposed, dynamic information would be provided to emergency personnel through a building automation system, which coordinates the activities of HVAC, lighting and access controllers, and fire detection and security systems. The sensor data and video streams provided by these systems would be delivered to EFRs through enhancements to the BACnet building automation system communication protocol. Static information such as floor plans and equipment schematics would already be available to public safety officials in building information models (Treado et al., 2007).

To access the information provided by the building automation system, a secure proof-of-identity credential, such as a federal PIV (Personal Identity Verification) credential, would be required. The right to access the varying levels of the hierarchical database structure would be governed by factors such as identity, jurisdiction, duty status, incident need, and chain of command. The secure connection to these databases would be facilitated by a building information services and control system (BISACS), a system lending itself to standardized protocols and secure communication. The BISACS would be centrally located in a given building, and credential reader interfaces at access nodes would permit authenticated internal and external user access. A secure web-based link between the BISACS and the BACnet building automation system (BAS) using a building services interface (BSI) would allow information and resources to be exchanged and shared securely. To maintain security, information would only be permitted to travel in one direction, from the BAS to the BISACS (Treado et al., 2007).

The system proposed by the Building Tactical Information Project provided a sound foundation for the design and implementation of our team's own prototype decision support tool. The prototype our team created aims to satisfy many of the same design criteria. The background information utilized, the data collected, and the results to date of the Building Tactical Information Project were invaluable to the design of our decision support tool.
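The one-way BAS-to-BISACS flow and credential-gated access described above can be sketched conceptually as follows. The class, method, and credential names are hypothetical, and the credential check is a placeholder; this is not NIST's implementation.

```python
# Conceptual sketch of the one-way BAS -> BISACS flow described above.
AUTHORIZED_CREDENTIALS = {"PIV-12345"}   # placeholder for real PIV validation

class Bisacs:
    def __init__(self):
        self._status_log = []              # data accumulates on the BISACS side only

    def receive_from_bas(self, message: dict) -> None:
        """Inbound direction only: the BAS pushes status; nothing flows back to it."""
        self._status_log.append(message)

    def query(self, credential: str):
        """External responders read the log only with a valid credential."""
        if credential not in AUTHORIZED_CREDENTIALS:
            raise PermissionError("credential not authorized")
        return list(self._status_log)

bisacs = Bisacs()
bisacs.receive_from_bas({"device": "SD-204", "state": "ALARM", "floor": 2})
print(bisacs.query("PIV-12345"))
```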
2.5 FireGrid

The FireGrid project, led by the School of Engineering and Electronics at the University of Edinburgh, is attempting to achieve the same goal we sought to accomplish with our research project: provide firefighters with valuable building sensor data in a usable format in real time. However, unlike our project, the FireGrid consortium has placed heavy emphasis on modeling a fire's progression to predict its future course using high-performance computers, grid-enabled distributed computing capabilities, computational fluid dynamics models, and finite element structural models (Berry et al., 2005). According to Potter and Wickler (2008), incident commanders would use one of two command and control interfaces to send a request concerning the future development of a particular fire emergency to the Query Manager, which would search for available models that can answer the query, run the most appropriate model to obtain the information requested, and return this information to the incident commander. While the FireGrid consortium's attempt to integrate predictive modeling into a command and control decision support tool for firefighters is certainly compelling, we feel that it is rather far-sighted and thus did not find their work to be particularly helpful to the advancement of our project.

2.6 Current Building Technology

2.6.1 Overview

Most of the fire detection systems on the market feature a fire alarm control panel that is permanently installed near the entrance of a building and uses a loop to monitor and control a number of devices located throughout the building. These devices are often point-addressable smoke detectors, thermal sensors, flow switches, or other detectors that can relay their current status back to the control panel. Some detectors are intelligent in that they have remote programming capabilities that allow a user to program alarm profiles into the detector. Device information is usually displayed on the control panel itself, whose display can range from a simple set of LED lights indicating a zone in alarm to a more sophisticated multi-character display with icon and graphics support. History logs are often maintained on control panels and are readily available to technicians and firefighters who respond to an alarm, trouble, or supervisory signal.

Many manufacturers offer methods to monitor these systems through a network. A network can connect a number of fire detection systems together, but the diversity of these systems, as well as the modularity of the control panels, is usually limited by the specific manufacturer. In some cases, manufacturers provide separately sold devices or modules that allow control panels to use BACnet, a communications protocol. BACnet has the potential to allow different brands of systems to communicate with each other. Some manufacturers also offer monitoring from remote locations. This usually comes in the form of a computer workstation that can provide a visual representation of an emergency. These workstations typically feature a floor plan of the building that contains multiple icons representing different types of signals and detectors. The location of these workstations is fixed, since the computer is usually installed in a specific area. Other types of building technology are HVAC and other monitoring systems. HVAC systems have been used to monitor fire alarm panels, with variable success.
Many of these have given way to systems like those of Keltron, which uses wireless transceivers to monitor a variety of building systems through radio or Ethernet.

2.6.2 Major Companies of the Fire Alarm and Detection Industry

In the United States, Tyco International Ltd. is a major player in the security, burglar, and fire alarm industry, making up 26.2 percent of the market share (Culbert, 2010). Other major companies that play a role in this industry include Honeywell International, Siemens AG, and UTC Fire and Security.

2.6.2.1 SimplexGrinnell

The subsidiaries of Tyco International include several companies that manufacture safety, security, and electrical products, as well as SimplexGrinnell, which manufactures many of the fire alarm systems in the United States. SimplexGrinnell's addressable fire alarm panels are the 4100U, the 4010, and the 4008 (Addressable Fire Alarm Panels, 2011). While the 4010 and 4008 panels are most applicable to small to mid-sized buildings, the 4100U is designed for larger applications. It has a maximum capacity of 2,000 addressable points, with up to 250 points that support TrueAlarm sensors, allowing sensor analog values to be communicated back to the control panel (Simplex 4100U Fire Alarm Control Panel, 2010). For TrueAlarm smoke sensors, the control panel maintains a current value, peak value, and average value for each sensor and tracks the status of each sensor by comparing the current value to the average value. This allows the system to account for dirty sensors that could otherwise result in false alarms. The 4100U also features a two-line by 40-character LCD display for information readout, a numeric keypad, six system status indicator LEDs, three programmable LEDs, and five programmable function switches. The panel's alarm and trouble signal history log can maintain up to 1,300 total events, which are viewable on the LCD display or can be printed by connecting a printer to the panel's RS-232 port.

The 4100U is also compatible with other communication devices, such as the BACpac Ethernet module and the SafeLINC internet interface. The BACpac Ethernet module converts data from the panel to the BACnet protocol, allowing communication with other BACnet-compatible devices (BACPAC BACnet Products, 2010). The module is connected to the panel through the RS-232 port and has an output port for an Ethernet LAN connection. The SafeLINC internet interface allows devices such as personal computers and PDAs to remotely monitor the fire alarm panels through an internet connection (SafeLINC Fire Panel Internet Interface, 2010). The module provides access for up to 20 different user accounts, but only one user can access the interface at a time. The interface is modeled after an internet browser and includes a system snapshot of the control panel status, access to history logs and reports, the local time, a side menu with additional links, and a page of panel details. SimplexGrinnell also provides a line of annunciators that are compatible with the 4100U. One particular annunciator is the 4190 PC annunciator. Its interface is text-based and essentially provides a printout of each sensor's state as it changes. Historical logs and reports are also accessible. Figure 2.1 shows its interface.
Figure 2.1: 4190 PC Annunciator User Interface

SimplexGrinnell's other annunciator is the TrueSite Workstation, which provides a graphical display of the building's floor plan and a textual display of the alarm states (TrueSite Workstation with Multi-Client Capability, 2010). The graphical display has navigational tools to view the floor plan in greater or lesser detail and supports standard fire service annunciation icons. Custom alarm and system messages, as well as historical logs, are viewable on the TrueSite Workstation. Figure 2.2 and Figure 2.3 show both of the Workstation's interfaces.

Figure 2.2: TrueSite Floor Plan Window

Figure 2.3: TrueSite Historical Log Window

SimplexGrinnell also manufactures initiating devices that are compatible with its fire alarm panels (SimplexGrinnell Initiating Devices, 2011). These include the TrueAlarm sensors, manual pull stations, carbon monoxide detectors, and modules that provide addressability to conventional devices. It also manufactures specialty devices, such as an infrared smoke detector, a VESDA VLC-600 TrueAlarm Laser, and a video smoke and flame detector that provides smoke and flame video in real time (SimplexGrinnell Specialized Detection Devices, 2011). The Very Early Smoke Detection Apparatus (VESDA) periodically samples the air in an area for smoke (VESDA VLC-600 TrueAlarm Laser COMPACT, 2010).

2.6.2.2 Honeywell International, Inc.

Honeywell has a number of subsidiaries that manufacture fire alarm systems. These include Silent Knight, Fire Lite Alarms, Gamewell Fire Control Instruments, and Notifier. Notifier is one of the acceptable manufacturers on the University of Maryland campus (Design Criteria/Facilities Standard Manual, 2005), and for this reason, the products of this subsidiary are explored in depth. The intelligent, addressable control panels of Notifier include the ONYX series, the FireWarden series, and the Spartan-25, the latter two being applicable in smaller facilities. Of the four different products in the ONYX series, two, the NFS2-640 and NFS2-3030, fit the requirements of medium to large facilities (Honeywell International, 2010a). The NFS2-640 features one signaling line circuit, or SLC, which can be expanded to two SLCs (Honeywell International, 2010d). The NFS2-3030 can be expanded to ten SLCs (Honeywell International, 2010c). Each SLC can hold up to 159 intelligent detectors and 159 addressable modules (which include pull stations), giving the NFS2-640 a maximum capacity of 636 addressable points and the NFS2-3030 a maximum capacity of 3,180 addressable points. Both panels contain a large 640-character display with a full QWERTY keyboard, as well as an EIA-232 printer port for communication with external devices. The NFS2-640 has a history log that can hold 800 events and 200 alarm-only events, while the NFS2-3030's history log can hold 4,000 events and 1,000 alarm-only events.

The NFS2-640 and NFS2-3030 both utilize FlashScan and ONYX intelligent sensing (Honeywell International, 2010c, 2010d). FlashScan is a detector protocol that allows a panel to poll up to 318 devices in less than two seconds and activate up to 159 outputs in less than five seconds, allowing for near real-time annunciation. ONYX intelligent sensing is a software algorithm for detector readings. It allows a user to program different levels of sensitivity adjustment and perform sensitivity measurements to avoid false alarms and determine whether detector maintenance is required.
The algorithm also gives the detectors a self-optimization capability. With this feature, the detector takes normal analog readings over an extended period of time and then sets its pre-alarm values just above those readings. Cooperative multi-detector sensing, the ability of a smoke detector to use readings from nearby sensors in alarm decisions, is also possible with ONYX intelligent sensing.

Many of the Notifier panels use networks and modules for monitoring and control. The NFS2-640 and NFS2-3030 support the FireWatch Internet monitoring modules, which use the Internet to monitor alarm signals (Honeywell International, 2010c, 2010d). Another option is NOTI-FIRE-NET, a network that connects multiple Notifier panels using a combination of fiber-optic and wire communication paths (Honeywell International, 2004). Each panel on the network is a node, and each node can monitor and control other nodes. The failure of one node does not affect communication among the other nodes in the network. NOTI-FIRE-NET can be monitored on either a network command annunciator or an ONYXWorks workstation. The ONYXWorks workstation, an industrial, high-performance computer, can integrate a number of life safety and building systems, including fire, security, CCTV, and other facility information (Honeywell International, 2010f). It was designed to be modular and to allow one workstation to manage a combination of different manufacturers, technologies, and networks. Communication between the workstation and its systems can occur over local Ethernet or wide-area TCP/IP networks. The ONYXWorks interface runs on Windows XP and features a floor plan, real-time printing of system-wide events, control of security and fire panels, a history log, and the capability for mass notification.

Figure 2.4: ONYXWorks Workstation software

Other network devices include the BACnet gateway and ONYX FirstVision. The BACnet gateway allows a single panel or panels on NOTI-FIRE-NET to communicate with a network using the BACnet/IP communication protocol (Honeywell International, 2009). On NOTI-FIRE-NET, the BACnet gateway can support up to 14 other nodes and 15,000 objects. The ONYX FirstVision is a building-installed, touch-screen display that acts as a navigational tool for EFRs (Honeywell International, 2010e). It uses CAD drawings and the VeriFire Tools database to display a floor plan annotated with the locations of activated detectors and the sequence of activation. The ONYX FirstVision also includes the location of water supplies, evacuation routes, special hazards, and the shutoff valves for gas, power, and HVAC systems.

Figure 2.5: Interface of ONYX FirstVision

Notifier offers a number of conventional and intelligent detectors. Its conventional detectors include ionization and photoelectric smoke detectors, fixed-temperature and rate-of-rise thermal detectors, ultraviolet and infrared flame detectors, and reflective beam detectors (Honeywell International, 2010b). The intelligent detectors utilize the FlashScan capability described above.

2.6.2.3 Siemens AG

Siemens' line of addressable alarm panels includes the FireFinder XLS, the MXL, and the FireSeeker 250. While the FireSeeker 250 is mainly meant for small or medium-sized facilities, the FireFinder XLS and MXL series are capable of handling larger facilities, such as commercial or industrial buildings.
The FireFinder XLS has the capacity for 2,500 addressable points, which can include both analog and conventional detection devices (Siemens Industry, Inc., 2010a). It also provides manual control of a building's HVAC system. The addressable panel features a 6" backlit LCD display with touch-screen capabilities. The system interface has the capacity for hundreds of large-font text characters, hazmat icons, NFPA fire safety symbols, and graphic maps, and provides access to a history log capable of holding 5,000 events. This series also includes a remote printer module that provides an output port that, when configured, can be used to communicate with external systems. In the MXL series, the MXL is an analog fire protection system and is one of the acceptable fire alarm systems on the University of Maryland campus. It is capable of handling more than 2,000 intelligent input devices and can also monitor security devices (Siemens Industry, Inc., 2006). Some of its features are not as sophisticated as those of the XLS. The panel includes only an 80-character backlit alphanumeric display and an 800-event history log. The MXLV has capabilities similar to those of the MXL but also includes a voice evacuation system (Siemens Industry, Inc., 2005). Both the XLS and MXL can be combined into a network with other XLS and MXL systems that can be monitored and controlled using a network color-graphics command center.

Siemens also provides three types of detection devices: conventional, intelligent, and specialized. The conventional detection devices are similar to those of SimplexGrinnell and Honeywell. The thermal detectors can be rate-of-rise, fixed-temperature, a combination of the two, or Detect-A-Fire (Siemens Industry, Inc., 2002). Detect-A-Fire thermal detectors are rate-compensating and can accurately detect the surrounding air temperature regardless of the fire growth rate. The intelligent detectors include photoelectric detectors, ionization detectors, FirePrint detectors, and thermal detectors. These differ from their conventional counterparts in that each contains a microprocessor that allows more complex detection, error-checking, and supervision algorithms to be remotely programmed into the detectors (Siemens Industry, Inc., 2010b). This allows the detectors to distinguish between false alarms and actual fire hazards. The FirePrint detector has photoelectric and thermal sensing capabilities and can compare its data against fire hazard profiles programmed from the control panel. Specialized detectors include a VESDA air sampling system, the Series 3/X3 air duct detector, and a linear beam smoke detector. The VESDA LaserPLUS detector draws air from a network of air sampling pipes, with each pipe containing a sensor that detects changes in airflow (Siemens Industry, Inc., 2003). Before the air reaches a laser light source inside the VESDA detector, it passes through a dual-stage air filtration system that filters dust from the air and prevents the inner chamber from being contaminated. The VESDA LaserPLUS detector can communicate with other VESDA detectors through a communications protocol called VESDAnet, which is separate from NOTI-FIRE-NET. The Series 3/X3 air duct detector operates independently, while the linear beam smoke detector is compatible with the MXL series of fire alarm control panels.
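The baseline-tracking ideas described in the preceding subsections, such as TrueAlarm's comparison of current readings against a running average and ONYX's self-optimizing pre-alarm levels, can be illustrated with a minimal sketch. The code below is not any vendor's algorithm; the class name, smoothing constant, and alarm threshold are arbitrary assumptions, and real panels apply considerably more sophisticated compensation and verification logic.

    // Illustrative only: a simplified baseline-tracking detector threshold.
    public class BaselineTrackingDetector {
        private double average;          // long-term running average (baseline)
        private final double smoothing;  // weight of each new reading in the average
        private final double alarmDelta; // how far above baseline triggers an alarm

        public BaselineTrackingDetector(double initialReading, double smoothing, double alarmDelta) {
            this.average = initialReading;
            this.smoothing = smoothing;
            this.alarmDelta = alarmDelta;
        }

        /** Process one analog reading; returns true if the reading indicates alarm. */
        public boolean update(double reading) {
            boolean alarm = reading > average + alarmDelta;
            if (!alarm) {
                // Only drift the baseline on non-alarm readings, so slow contamination
                // (a "dirty" sensor) raises the baseline gradually, while a rapid rise
                // from an actual fire still crosses the threshold.
                average = (1 - smoothing) * average + smoothing * reading;
            }
            return alarm;
        }
    }

The design point this sketch captures is that the alarm decision is made relative to a slowly adapting baseline rather than a fixed absolute value, which is what allows such systems to reduce false alarms from dirty or drifting sensors.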
2.6.3 Other Relevant Technology

2.6.3.1 Keltron

The University of Maryland Facilities Management recently installed a Keltron Life Safety Event Management System to monitor the more than 200 fire alarm systems located on its campus (Robinson, 2009). For decades, the University had been using its HVAC system for this purpose, and this presented a number of issues. First, only sixty percent of the campus alarm systems could be monitored using the HVAC systems, forcing management to rely on people to call in whenever an alarm was set off. Second, the HVAC system was not designed to handle the variety of brands and models represented on the campus. This greatly limited the amount of information available to the technicians and firefighters, all of whom had to defer to the building's control panel for further details. Third, HVAC systems are not held to the same strict codes as fire alarm systems, leaving the University with major flaws in the communication pathways between the detector and the central monitoring system.

The solution to these issues was the installation of the Keltron Active Network Radio System (Corporation, 2009). It utilizes wireless transceivers with eight programmable inputs that allow for two-way transmission between the detector and the central monitoring location. With the Keltron DataTap, transceivers can take point-specific information from fire alarm control panels and relay it through the network. This device is connected to the control panel's RS-232 port and is compatible with the four manufacturers discussed above. Keltron also offers Ethernet networking systems that have capabilities similar to those of its radio system. The company also features its fire and security alarm monitoring system, the DMP703 (Corporation, 2008). The DMP703 can handle 20,000 direct-connect inputs for continuous supervised monitoring and includes a touch screen. The system can also interface with other fire alarm control panels via their RS-232 ports.

2.7 Current Firefighter Technology

In addition to the technology of building sensor systems, EFRs use, or are beginning to use, a variety of technologies for pre-planning, firefighter location, and vital signs monitoring. These technologies can be applied to EMS and hazmat incidents as well as fires; however, most of the technology is aimed at firefighters. In an emergency situation, knowing the hazards of the building and the location and health of firefighters is essential to a successful resolution of the event.

2.7.1 Pre-planning Software

Pre-planning is an essential tool for firefighters. Pre-plans locate fire hydrants, stairwells, hazardous materials in a building, and fire alarm control panels. In addition, pre-plans make note of building construction in terms of flammability, known difficulties in access, and surrounding traffic patterns (NFPA 1620: Standard for pre-incident planning, 2010). While pre-planning has traditionally been done on paper, several companies provide software to aid in creating pre-plans. The CAD Zone, Iron Compass, RealView, and FDM are some of the producers of pre-planning software. Each pre-planning aid aims to comply with NFPA 1620. The CAD Zone was one of the first companies to begin producing software for pre-planning purposes. Its Fire Zone product helps fire service personnel draw site plans and add pertinent pre-incident planning information. It uses simple drawing tools to construct the floor plans, as well as pre-made symbols to represent stairs, hydrants, and hazards (Zone, 2010a).
To make the pre-plans accessible on the scene of an incident, The CAD Zone also sells First Look Pro, a mobile pre-plan organizer. It is designed to be used on mobile computers, including touch-screen computers. It uses a search function to find addresses and displays the information entered into Fire Zone. It also includes a mobile mapping function that shows the fire service vehicle's progress toward the location of the event through global positioning system (GPS) tracking (Zone, 2010b). The next producer of pre-incident planning technology is Iron Compass, a company known for making maps. Its OnScene Xplorer product also uses a search function to find addresses or intersections, with "spell-right" technology ensuring proper street name entry. Depending on the zoom level of the image, different information is displayed to simplify readout. In addition, GPS technology can be used to find fire hydrants and give precise coordinates to MedEvac helicopters. It also provides space to note important information such as occupancy, shut-off valve locations, hazardous materials, and fire suppression systems. Rather than having a separate mobile program, OnScene Xplorer can be used to create and view pre-plans en route (Compass, 2010). RealView produces Command Scope, its mobile pre-planning software, available as a yearly subscription. All the pre-planning data is synchronized across a department through a RealView server. Command Scope allows for both commercial and residential pre-plans by asking residents to submit their own pre-plan information online. Like the other software packages, Command Scope shows building site information, hazmat details, and geographic information system (GIS) maps (LLC, 2010). Another producer of pre-planning software is FDM. FDM produces records management system (RMS) modules for a variety of applications, including hydrant tracking, personnel management, maintenance records, training details, and property information. The properties module includes records from prior incidents, permit information, and inspection results, as well as information on occupancy, building construction, hazards, important contacts, and built-in safety features. All of FDM's modules are fully customizable, allow for image viewing, and connect with the fire department's CAD program so that dispatchers can have access to information and building plans (FDM, 2010).

2.7.2 Locator and Vital Sign Indicator Technology

Tracking firefighters within an incident allows the incident commander to be certain that all firefighters are in their intended locations, both to better combat the fire and to avoid injury. There are several ways that firefighters can be tracked within a building. Pedometry uses calibrated step lengths to estimate distance and direction, as well as vertical movement. It counts steps to provide a relative location and displays the estimated path over the floor plan. In ad hoc self-forming networks, also known as mesh networks, radios form a network that uses the time of signals from other users to track location and direction. Like pedometry, this approach provides a relative location that can be overlaid on the floor plan. If the incident commander can be linked in, he or she can see the locations of all the firefighters. Unfortunately, linking the inside of the building to the outside can be difficult with ad hoc networks because they require a grid layout to operate correctly. Another method is triangulation.
Using transmitters with known coordinates, such as those in fire service vehicles outside the building, locations are pinpointed and can be displayed on the floor plan. Triangulation can successfully transmit through different materials, and the incident commander can see the entire team. Other technologies used to track firefighters include ultra wideband (UWB) and radio-frequency identification (RFID) technologies (Bryner, 2008). Scott Health and Safety produces the Pak-Tracker, a firefighter-locating device. The Pak-Tracker operates on a directional 2.4 GHz radio frequency. The handheld monitor can connect with up to thirty-six individual units. It is designed to be operated with gloves on and weighs only 2.2 lbs. It can be used alone or incorporated into a self-contained breathing apparatus (SCBA) system. If the system does not detect movement for a period of time, it goes into full transmission mode so that other firefighters can find the immobile firefighter. When using the handheld module, audio and visual tools help locate the firefighter (Health & Safety, 2010). Harris Corp's GR-100 firefighter tracker uses an SCBA-mounted transmitter and a radio network at 700 and 800 MHz frequencies to track firefighters. The data is then displayed on a graphical user interface (GUI) so that an incident commander can track the firefighters in 3D, updated every few seconds. The system also uses accuracy-enhancing algorithms, inertial information, and GPS tracking. In order to address some structures that are challenging for fire departments, Harris aimed the product at single-family residences, strip malls, and small office buildings. The GR-100 is able to track the identity, location, and path of individual firefighters in three dimensions (Roberts, 2010).

Knowing the location of a firefighter is important, but knowing whether that firefighter is under physical duress is a top priority. In the fire service, half of all firefighter fatalities result from heart attacks. Several companies produce undershirts intended to track the vital signs of the firefighter. At the National Personal Protective Laboratory, a vest called a "lifeshirt" uses five sensors to track body temperature, heart rate, and breathing ("Vitals vest - physiologists create undergarment to measure vital signs of firefighters", 2008). Zephyr Technology developed its BioHarness First Responder System to monitor body temperature, heart rate, and breathing in the context of the firefighter's activity level and posture, which could alert an incident commander to a dangerous situation. Using a monitoring system, the vital signs can be assessed at a glance on a computer that can provide real-time alerts or assist in post-event analysis (Thompson, 2009). Vital signs monitoring can provide essential information to protect firefighters on scene.

2.8 Human Factors

As the goal of our project is to develop and demonstrate the effectiveness of a tactical decision aid display benefiting firefighters and other EFRs, it was important for us to keep the needs and limitations of the user in mind throughout the design process. The field of human factors is concerned with the study of the capabilities and limitations of human beings and how these factors affect human-computer interaction (HCI), making research in this field very relevant to our project.
Since the decision support tool we have developed is a relatively simple decision support system (DSS) with a graphical user interface (GUI), human factors research dealing with both DSS and GUI design was studied and used to help guide our design choices.

2.8.1 Decision Support Systems (DSS)

We define decision support systems to be those computer systems dedicated to aiding the decision-making process of the user by providing him or her with information gained through analysis of available data. Research concerning DSS designed to support emergency response is particularly relevant to our project. Unfortunately, literature dealing with emergency DSS is lacking when compared to the research that has been conducted concerning DSS with more conventional, corporate applications. It should be noted, however, that the body of research on emergency DSS has been steadily growing in recent years (Dai, Wang, & Yang, 1994). Dai et al. (1994) studied the design and development of computer support systems intended to aid emergency decision-making. Their paper addresses three main issues: the differences between emergency decision-making and what the authors call conventional decision-making, the kinds of aid DSS are capable of providing for emergency decision-making, and the specific requirements that should be considered in the design and development of emergency DSS. According to Dai et al., emergency decision-making problems differ from conventional decision-making problems in the following five ways:

1. Attributes of an emergency decision-making problem are often uncertain.
2. Environmental factors often change rapidly in emergency scenarios.
3. Emergency decisions are often time-dependent and are often made with incomplete or inexact information.
4. Usually only one or two goals are important in an emergency situation.
5. Even the soundest decision may involve substantial risks when faced with an emergency scenario. (Dai et al., 1994)

The paper also emphasizes that the user of an emergency DSS is often under duress and that the DSS should be designed to ease the user's stress as much as possible. According to the authors, this can be achieved by ensuring that the information presented by an emergency DSS is concise, adaptable, and insensitive to imprecision in the input data, and that limited interaction is required between the user and the AI governing the DSS during the course of an emergency (Dai et al., 1994). Dai et al. then go on to explain how an emergency DSS should function differently at different stages of the emergency scenario. During the pre-planning stage, the emergency DSS should be able to provide the following functions:

- The ability to collect and store information dealing with possible emergencies
- The ability to acquire expert knowledge about an emergency
- The ability to aid managers in developing an emergency plan
- The ability to run simulations evaluating the effectiveness of the standing emergency policy
- The ability to process and use real-time data to predict and monitor developing emergencies (Dai et al., 1994)

Meanwhile, the emergency DSS should be able to perform these functions during the course of an emergency situation:

- The ability to retrieve information about a developing emergency in real time
- The ability to analyze an emergency scenario as it progresses
- The ability to convey expert-level advice and make proposals to the DSS user
- The ability to aid the user in making and implementing a decision (Dai et al., 1994)

Furthermore, Dai et al.
state that the response time of an emergency DSS should be minimized, even at the cost of storage capacity. The paper concludes with a discussion of the basic framework of an emergency DSS and a description of a prototype emergency DSS used for disaster response in coal mines. Ye, Wang, Li, and Dai (2008) studied the development of an emergency DSS based on the general decision process of emergency management personnel. The paper outlines nine key elements present in the general emergency decision process: (O, A, C, E, R, RES, S, T, V), where O is the set of roles of those involved in emergency response, C is the environment in which the emergency takes place, E is the set containing the events of the emergency, R is the relationship of the events contained in E, RES is the set of all emergency resources, S is the set of statistics describing the emergency system, T is time, and V is geographic space (Ye et al., 2008). Figure 2.6, reproduced from Ye et al. (2008), illustrates the relationship of these nine factors.

Figure 2.6: Relationship of nine key elements of the emergency decision process

The authors then use this model of the general emergency decision process as a basis for their proposal of a general architecture for an emergency DSS. It should be noted that, unlike in the previous paper discussed, the proposed architecture dictates constant interaction between the DSS and the human user throughout the emergency scenario. Drury, Klein, Pfaff, and More (2009) studied DSS that utilize simulations to predict how a given decision will affect the course of an emergency situation. A cost metric was developed to compare a decision to its alternatives. This cost comparison was visually conveyed to the user through the display of box-and-whisker plots. To assess the usefulness of the developed DSS, a test was designed in which two groups had to determine the number of emergency-response vehicles to dispatch to a developing emergency situation. While both groups were given a textual description of the emergency scenario, only one group was given access to the developed DSS. It was found that the DSS group was able to make decisions closer to the normatively correct decision. Furthermore, the DSS group reported a higher degree of confidence in their decisions and rated the amount of decision support they received higher than the control group did.

While not focusing on emergency DSS specifically, other human factors research concerning DSS contains conclusions relevant to our project. Research dealing with DSS designed to aid command decisions is especially relevant, as the target user of our tactical decision aid display is the incident commander directing an emergency response effort. Breton, Paradis, and Roy (2002) studied the work being done at Defence R & D Canada - Valcartier toward developing a command and control DSS design tool called CODSI (Command Decision Support Interface). CODSI is used to evaluate and validate design concepts from both a human factors perspective and an operational perspective. To illustrate the effectiveness of CODSI, the design of a tactical DSS intended for naval threat-assessment applications was studied. Through this study, CODSI was shown to accurately model and assess human factors considerations of DSS with command and control applications. Another facet of DSS research relevant to our project concerns the display of uncertain information. Bisantz, Marsiglio, and Munch (2005) conducted four studies regarding the display of probabilistic information.
The first three studies dealt with user performance of stock purchase tasks in which information regarding the profitability of a given stock was probabilistic. Two aspects of the display were varied in these studies: the display format and the specificity level of the probabilistic information. From these studies it was concluded that while performance did not vary with display format, increased specificity improved performance. The fourth study had participants create membership functions that corresponded to three distinct display formats. From this study it was found that users can successfully interpret probabilistic information displayed in a non-numeric manner and use this information to make effective decisions.

A good portion of the human factors research being conducted on DSS deals with the design of so-called "intelligent" systems, systems with a high degree of automation. Guerlain, Brown, and Mastrangelo (2000) studied characteristics that are common to three intelligent DSS shown to be successful in practice: the Antibody Identification Assistant (AIDA), the Regional Crime Analysis Program (RECAP), and the River Flooding Forecasting System. Six attributes representative of successful intelligent DSS were determined from this study and are given below:

1. Interactivity - the system is able to work well with both human users and databases. Users are given a wide variety of possibilities instead of a single "optimal" decision.
2. Event and Change Detection - the system is able to effectively communicate status changes and important events.
3. Representation Aiding - the system is able to convey information in an informative yet human-like way.
4. Error Detection and Recovery - the system is able to check for typical reasoning errors made by users. The system is also able to identify situations that are beyond its own capabilities.
5. Information out of Data - the system is able to extract information from large data sets and provide the user tools for dealing with smaller data sets, outliers, and possible sources of error.
6. Predictive Capabilities - the system is able to predict both the future consequences of a given action and the future state of an environment. (Guerlain et al., 2000)

Despite the appeal of intelligent DSS, it is important to keep the limitations of such systems in mind. Parasuraman and Wickens (2008) argue that, despite the high degree of automation of current DSS, the human user is still indispensable to the decision-making process. Though current technology allows almost complete autonomy for some DSS, the authors argue that operator input must be factored into high-risk decisions. Without this input, it is possible that the user will follow the decision made by the system incorrectly due to a lack of understanding. The authors also point out how a lack of involvement by the user may lead to a lack of trust and, in turn, a breakdown of the decision-making process. Rovira, McGarry, and Parasuraman (2007) studied the effects imperfect automation has had on command and control decisions. To assess these effects, the authors designed a test in which eighteen participants performed a targeting simulation based on sensor data. It was found that while reliable automation reduced decision-making time considerably, imperfect automation led to a compromised decision-making process with high costs. Based on this study, the authors recommend that automation features should not be included in a DSS unless fully dependable automation can be ensured.
Sarter and Schroeder (2001) studied how uncertainty of information and time pressure affect DSS-aided high-risk decision making. The authors examined the effectiveness of two types of DSS: status displays and command and control displays. In their study, twenty-seven pilots flew twenty simulated flight patterns with a possibility of in-flight icing while the accuracy of the DSS provided and the experience of the pilot were varied. It was shown that while accurate information significantly improved how pilots approached and handled in-flight icing, inaccurate information led to pilots performing below the level of those with no DSS at all. Based on this study, the authors recommend the use of DSS that simply provide information relating to a particular scenario over DSS that make particular command suggestions for applications where perfect reliability of information cannot be guaranteed.

Another limitation of DSS involves displays that can become cluttered with symbols and other information. When DSS displays become cluttered, sensory overload can occur and the performance of the user can suffer greatly. John, Smallman, Manes, Feher, and Morrison (2005) studied how heuristic automation can be used to "declutter" displays with this problem. To assess the effectiveness of heuristic automation, twenty-seven Navy users with experience related to DSS with tactical displays monitored a simulated airspace and were told to find and respond to aircraft that posed a significant threat. Meanwhile, a heuristic algorithm that dimmed the symbols of aircraft it deemed less threatening was constantly run in the background of some DSS. While the DSS users did not let the dimming of symbols affect which aircraft they deemed to be threatening, the decluttering algorithm was shown to improve their response time to threatening aircraft by twenty-five percent. Furthermore, twenty-five of the twenty-seven users said they preferred systems with the decluttering heuristic. The authors concluded that properly designed heuristic decluttering of a display may prove valuable in many applications where the display can easily become cluttered, such as in air-traffic control or crisis team management.

2.8.2 GUI Design

A significant component of our research project involves the design of a graphical user interface to provide EFRs with useful information concerning a developing fire emergency in a timely manner. For this reason, it was important that we were aware of the human factors work related to GUI design. According to An Introduction to Human Factors Engineering (2004) by Wickens et al., there are thirteen principles of display design to consider when creating an effective GUI. These thirteen principles are as follows:

1. Make displays legible (or audible): The characters or objects being displayed must be discernible.
2. Avoid absolute judgment: Users should not be required to judge the level of a variable on the basis of a single sensory variable.
3. Top-down processing: People will perceive and interpret signals based on prior experience.
4. Redundancy gain: The more times a message is presented, the more likely it is to be correctly interpreted.
5. Discriminability: Similar-appearing signals will most likely be confused.
6. Principle of pictorial realism: Symbols employed should look like the objects or elements they represent.
7. Principle of the moving part: Moving elements should move consistently with the user's expectations.
8. Minimizing information access costs: Keep frequently accessed sources in a location such that the cost of traveling between them is small.
9. Proximity compatibility principle: Sometimes two or more separate sources of information are required to complete a given task and must be mentally integrated.
10. Principle of multiple resources: Information should be divided across resources to allow easier processing by the user.
11. Principle of predictive aiding: A display that predicts future events will usually enhance human performance.
12. Replace memory with visual information (knowledge in the world): Place specific cues in the task environment to remind the user what needs to be done.
13. Principle of consistency: Symbols and other design elements should be consistent with other displays the user may be using. (Wickens, Lee, & Liu, 1997)

While some of these principles are not applicable to the project at hand, the majority of these guidelines should be considered to ensure the design of an effective GUI that is not hindered by the limitations and preconceived notions of the human user. Priegnitz et al. (1997) wrote a paper on the human factors considerations involved in the design of a GUI for open systems radar product generation. The authors studied how a GUI could be used to improve upon the existing unit control position interface. The paper provides a good understanding of the merits of a graphical display and paints a clear picture of exactly what steps go into GUI design. Talcott, Bennett, Martinez, Shattuck, and Stansifer (2007) studied the GUI design of military command and control decision aids and used the principles of direct perception, direct motivation, and perception-action loops to develop a system of perception-action icons. Participants in their study used GUIs with and without these perception-action icons implemented to provide estimates of combat resources. The results showed that GUIs with these perception-action icons improved user performance significantly. This paper demonstrates how a system of standardized icons with incorporated color changes can be used to effectively communicate status changes that demand the user's attention. A particularly useful source related to GUI design is the Federal Aviation Administration's Human Factors Criteria for Displays (2007). While too extensive to summarize here, Chapter 5 of the technical report covers criteria for displays ranging from office use to heads-up and tactical displays (Ahlstrom & Kudrick, 2007). While dealing more with hardware than with GUI programming considerations, the suggestions made by the FAA proved valuable in the design of our decision support tool.

Chapter 3

Methodology

3.1 Case Study Methodology

Before our team began the formulation of a decision support visualization tool, a thorough understanding of the current capabilities of building information systems needed to be established through a case study. The objective of the case study was to better understand the types of fire alarm systems found in current, state-of-the-art buildings, as well as the different functioning modes of purpose-specific buildings. Each building is intended for different events and uses, so there are different requirements and operation techniques for each fire alarm system. The case studies included observations of the Jeong H. Kim Engineering Building, a laboratory and classroom building; the Comcast Center, an event arena with office space; and Keltron, the central monitoring system on the University of Maryland campus.
All of these buildings were located on the University of Maryland, College Park campus. An additional office building, the CareSource Building in Dayton, OH, was also studied. This building was chosen because it is a state-of-the-art building aimed at single-occupant office use that was available to our team. The following is a list of the building and fire alarm system features that were noted during the case study, along with the important aspects of each feature:

- Building Type and Usage: This information provided context for each building case study. Each building was purpose-specific, and the fire alarm control panels and emergency operations were customized around the specific needs of the building and its occupants. This knowledge gave a framework for the study and provided insight into the different operations of current, high-tech buildings.
- Type of Fire Alarm Control Panel (FACP): An observation of FACPs allowed Team FFA to understand what types of panels are currently used in state-of-the-art buildings. These findings were used to direct our efforts in requesting donations of equipment, since it was necessary to obtain commonly used panels for further testing.
- Location of FACP and Other Control Panels: To understand the current operations of firefighters during emergency situations, it was necessary to note the typical locations of control panels in buildings. The current locations emphasized a need for external access to FACP information.
- Current Floor Plans or Maps: An examination of the floor plans and maps provided insight into current systems. The floor plans and maps found in these buildings were simplistic and vague, indicating an opportunity to improve the information provided on floor plans.
- Sensor Numbering and Addressability: Many current sensors are point-addressable. Sensor location data were pinpointed as a possibly useful yet underutilized aspect of current technology.
- Specialty Sensors: Chemical sensors, high-tech smoke detection systems, pressurized stairwells, and elevators were all noted as items of particular interest.

3.2 System Design

3.2.1 Assessing EFR Needs

In order to adequately demonstrate how a decision support visualization tool can effectively utilize, contextualize, and present building sensor data, our team found it necessary to construct a prototype decision support tool. In accordance with most established design methodologies, we began our prototype design by considering the needs and limitations of the end users, namely firefighters and other EFRs. In 2004, NIST held a workshop to identify and inventory the information needs of EFRs during the course of an emergency situation (Jones et al., 2005). Our team used the results of this workshop as a starting point toward determining what types of information should be included in and conveyed by our prototype visualization tool. Appendix C of the technical report "Workshop to Define Information Needed by Emergency Responders during Building Emergencies" contains several lists of the kinds of information that firefighters would like to know both en route to a fire emergency and on the scene of the fire (Jones et al., 2005), as discussed in Section 2.4 of this thesis. These lists are quite exhaustive, so our team felt it was important to prioritize the items requested and include only the most important information in our prototype.
After consolidating all the information requested by EFRs during the course of the workshop into two lists (static and dynamic information), we categorized each item requested as either high priority (information essential to understanding the nature and progression of a fire emergency that must be included if available), medium priority (potentially useful information that is either not entirely necessary or would be hard to integrate into our prototype), or low priority (information that is excessive, unnecessary, or not currently available). In order to avoid clutter, we decided to include only the information classified as high priority. The resulting list of static information derived from the workshop includes:

- Building style (e.g., one story, two story, n-story, auditorium, sublevels, etc.)
- Building construction (e.g., type I, II, III, IV, or V; fire resistive, noncombustible or limited combustible, ordinary, heavy timber, or wood frame)
- Presence and location of hazardous materials (e.g., above-ground propane tanks, gas lines, chemicals, etc.)
- Location of fire department hookups for sprinkler systems and/or standpipes
- Location of staging areas and entrances and exits to buildings
- Location of doors, windows, stairwell risers, fire walls (with ratings and area separation), roof access, and fire sensors
- Location of fire alarm panels and remote annunciator panels
- Location of utility shutoffs
- Location of building system controls (e.g., HVAC, smoke control, etc.), areas of effect, and special operating systems
- Location of evacuation-quality elevators, floors served, and location of elevator overrides and how to control them
- Location of convenience stairs and evacuation stairs
- Areas (i.e., zone boundaries) protected by sprinklers or other devices

The list of dynamic information included from the workshop is as follows:

- Approximate location of the fire within the building
- Fire size and duration
- Location of fire detectors in alarm
- Location of sprinklers activated
- Location of CBR sensors in alarm
- Location and condition of smoke
- Location of elevators (and their state of operation) during an incident

3.2.2 Evaluation of Prior Work

In the NIST technical report "Demonstration of Real-Time Tactical Decision Aid Displays" (Davis et al., 2007), a prototype tactical decision aid display interface is proposed to meet the EFR needs determined during NIST's previously held workshop. Before starting to formulate the features of our own prototype decision support visualization tool, our team thought it best to review NIST's proposed design and evaluate where it succeeded and how it could be improved. NIST's prototype system consisted of two decision aid display interfaces: one to be used by EFRs en route to a fire emergency and one to aid the incident commander on site. The proposed en-route GUI would display a plan view of the building and surrounding area and would provide information such as the location of building standpipes, the building's name and address, and the location of entrances to the building. After reviewing this en-route display, our team determined that, while the information displayed is indeed useful, the manner in which it is displayed is only a marginal improvement over the pre-planning and dispatch software already in use by many fire departments. For this reason, we decided to devote our attention solely to the proposed on-site interface.
The top of NIST's on-site display (reproduced in Figure 3.1) is populated with several buttons that allow the user to choose which kinds of information should be overlaid on a floor plan of the building. The types of information that can be called up by each button are as follows: security (e.g., information from security sensors), elevator (e.g., number and location of elevators), fire-fighting equipment (e.g., location of fire extinguishers), utilities (e.g., location of utility shutoffs), building generator (i.e., location of the building generator), fire walls (i.e., location and ratings of fire walls), standpipes (i.e., location of interior fire department standpipe connections), hazardous materials (e.g., location of hazardous materials within the building), building hazards (i.e., information on potential hazards to EFRs), and valuable materials (i.e., location of extremely valuable materials) (Davis et al., 2007). When a button is selected, the associated information is symbolically displayed on the floor plan in real time. Next to the building's floor plan are buttons that allow the user to zoom in and out, pan, and reset the view to the default setting. There are also buttons that allow the user to change floors. On the bottom of the display, three buttons are present: the first displays the building information from the en-route display, the second allows the user to view a log of the alarms and changing building conditions, and the third displays a legend of the symbols used in the map overlay.

Figure 3.1: On-site Screen of NIST's Prototype Tactical Decision Aid Display (Davis et al., 2007)

This on-site tactical decision aid display interface is successful in many ways. From the description of the on-site interface's features, it is clear that all of the EFR information needs highlighted for inclusion in our prototype are addressed. Furthermore, overlaying symbols on the floor plan in real time is an effective way of conveying spatial and temporal information about the building's state. The fact that an event log exists to complement the visual display is also a positive aspect. Finally, the buttons on the right side of the interface allow for easy manipulation of the map. Despite the successes listed above, there are many ways in which this interface could be improved. First of all, requiring the user to select buttons to toggle which information is displayed on the floor plan demands too much interaction. From a human factors perspective, it is unreasonable to expect an emergency responder to devote significant amounts of time to manipulating the controls of the GUI while managing and reacting to a developing fire emergency (Dai et al., 1994). However, the rationale behind the toggle buttons, preventing information overload, is clear and a valid concern. Our team believes that a better way to avoid information overload would be to display only the information most pertinent to conveying the status of the building. In addition, the fact that the event log and the building pre-planning information are not displayed at all times on the interface diminishes the effectiveness of the system. Our team also noted that a function allowing the user to replay the progression of the emergency would be very useful, as more details about a fire's development will lead to more informed tactical decisions. One last improvement would be graphically highlighting the floors on which the fire emergency is occurring to allow the user to quickly return to the floors of greatest importance.
3.2.3 Design Goals and Basic Layout

After finishing our assessment of the NIST prototype tactical decision aid display, considering the standard operating procedures of firefighters, and reviewing the available literature on human factors, our team formulated the broad design goals for our prototype system. These goals are as follows:

- Our decision support visualization tool should effectively achieve temporal and spatial contextualization of the information obtained from building sensor systems.
- The interface should be intuitive to view and manipulate. This goal should be accomplished by utilizing standardized symbols and familiar visuals.
- The display should be designed in such a way as to optimize the amount of information that can be gleaned at a glance, with particular emphasis placed on the most important data.
- The prototype's design should be modular to allow future technology to be integrated with relative ease.

Once we established these goals, our team created a basic layout that we thought would best meet them. We decided that our interface should be divided into four main components: a static information (or pre-planning) panel, a dynamic information panel, a floor plan with overlaid visual information, and an alert pop-up window. The static information panel was to textually display building and pre-planning information that is difficult to illustrate graphically. The dynamic information panel was designed to textually display a queue of alarms and events as they happen in real time, to complement the visually displayed data overlaid on the floor plan. As in the NIST prototype display, the floor plan was to be overlaid with symbols and other visual cues to effectively show the state of the building. The alert pop-up window was devised to ensure that the user is aware of key events, such as the fire spreading to another floor. After outlining the basic layout of our prototype's display, a rough mockup of our GUI was created and shown to a panel of incident commanders at the Maryland Fire and Rescue Institute (MFRI) for feedback.

3.2.4 Final GUI Design

Our final prototype GUI was developed based on our design goals, existing building technology, our prioritized list of EFR needs, our assessment of NIST's prototype display, and the feedback we received from MFRI. A screen capture of our final GUI mockup is shown in Figure 3.2 below.

Figure 3.2: Team FFA Final GUI Mockup

The above GUI shows the four main parts of our design: the floor plan display, the pre-planning panel, the dynamic information panel, and the alert window. Located prominently at the center of the design, the floor plan display is the GUI's most important feature; its purpose is to visually display the location of sensors in alarm and other items of importance. Dynamic information is graphically overlaid in real time onto the static, two-dimensional floor plan. Only the most useful information is included, to prevent the display from becoming cluttered. In order to allow the prototype to fit seamlessly into the standard operating procedures of firefighters, NEMA SB 30 icons and NFPA 170 symbols are utilized as deemed necessary. To display the state of a particular location on the map, a color-shading scheme was chosen in favor of the existing NEMA SB 30 icons. This shading is quite intuitive (e.g., red shading for heat detectors in alarm, grey shading for active smoke detectors in alarm, etc.) and calls attention to the development of the emergency more readily than the existing icons.
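A minimal sketch of how such a color-shading scheme might be expressed in code is shown below. Only the red (heat) and grey (smoke) mappings come from the scheme described above; the remaining sensor types, the specific color values, and the class name are illustrative assumptions rather than the team's actual implementation.

    import java.awt.Color;

    // Maps a sensor's type and alarm state to a translucent floor-plan overlay color.
    public final class OverlayColors {
        public enum SensorType { HEAT, SMOKE, FLOW_SWITCH, PULL_STATION }

        public static Color colorFor(SensorType type, boolean inAlarm) {
            if (!inAlarm) {
                return new Color(0, 0, 0, 0); // fully transparent: no shading when normal
            }
            switch (type) {
                case HEAT:         return new Color(255, 0, 0, 128);     // translucent red (from the scheme above)
                case SMOKE:        return new Color(128, 128, 128, 128); // translucent grey (from the scheme above)
                case FLOW_SWITCH:  return new Color(0, 0, 255, 128);     // assumed: blue for water flow
                case PULL_STATION: return new Color(255, 165, 0, 128);   // assumed: orange for a manual pull
                default:           return new Color(255, 255, 0, 128);   // assumed fallback color
            }
        }
    }

Because the alpha channel keeps the shading translucent, the underlying floor plan remains readable while the affected area is still visually emphasized.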
One other item of particular note included in the floor plan display is the replay control bar. As designed, the developed visualization tool includes a feature that allows emergency responders to watch an animation on the floor plan showing the progression of the emergency up until that point, at several factors faster than real time. Please refer to Appendix A for a full list of features included in the floor plan display.

The pre-planning and dynamic information panels are located to the left and right of the floor plan, respectively. The pre-planning panel provides a synopsis of the building involved in the emergency to the firefighters before they arrive on the scene. For the sake of simplicity and to prevent information overload, the information displayed on this panel is limited to the use and construction of the building, a list of possible hazards in the building and their locations, and the time of the first alarm. On the other side of the floor plan, the dynamic information panel displays any important alerts as they occur in real time. All events listed are complete with location and time stamp to ensure that an accurate picture of the emergency situation is painted both temporally and spatially. In addition to complementing the floor plan display in conveying the progression of the emergency, this information allows a log documenting the emergency situation to be generated by the system with minimal effort. The dynamic information panel also shows the amount of time that has elapsed since the beginning of the emergency situation. Refer to Appendix A for the full feature lists for these two panels.

One final main feature of the prototype's GUI is the pop-up alert window. When an event of particular importance occurs (such as the fire spreading to a new floor), a pop-up window appears below the floor plan accompanied by a sound and does not disappear until acknowledged. This feature is designed to bring events of particular importance to the attention of the incident commander despite the frenzy of the emergency situation. See Appendix A for further detail.

3.3 Software Implementation

The task of implementing the software consisted of accessing and parsing sensor data and displaying the information according to design specifications. Accordingly, the largest modules of the software are 1) the data parser, 2) the building representation, and 3) the interface. The data parser processes the raw data obtained from the sensor panel and updates the building representation. The building representation organizes the received information to make it easy to find the state of all sensors on a given floor at a given time. The interface displays the state of the building at the floor and time selected by the user. The overarching modules can, in turn, consist of a number of sub-modules. The desired behavior of the data parser and the visualization interface is already well defined: the parser needs to accept the protocols used by building sensor systems, and the design of the visualization interface was described in the previous section.

Throughout this implementation phase, we focused on developing software that could be used to evaluate the effectiveness of the proposed approach to the problem of data visualization. In the prototype, some functionality that would be necessary for a full-featured application was omitted. We chose not to provide solutions to previously solved problems, such as software distribution questions and the networking necessary to make the obtained sensor data remotely available.
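To make these module boundaries concrete, the sketch below shows one way the three modules and the per-protocol parsers described in the next subsection could fit together in Java. All class and method names here are hypothetical simplifications, not the prototype's actual code.

    import java.util.ArrayList;
    import java.util.List;

    // One parsed state change from any source (FACP via RS-232 or BACnet, or FDS output).
    class SensorEvent {
        final String sensorId;   // label used by the annunciator panel or simulation
        final long   timeMillis; // time of the state change
        final String newState;   // e.g., "ALARM", "TROUBLE"
        SensorEvent(String id, long time, String state) {
            sensorId = id; timeMillis = time; newState = state;
        }
    }

    // Data parser module: each input protocol is a separate implementation of this
    // interface, so new protocols can be added without touching the rest of the system.
    interface EventSource {
        List<SensorEvent> poll(); // return events received since the last call
    }

    // Building representation module: keeps the received events so that the state of all
    // sensors on a given floor at a given time can be looked up (and replayed).
    class BuildingState {
        private final List<SensorEvent> log = new ArrayList<>();
        void apply(SensorEvent e)   { log.add(e); }
        List<SensorEvent> history() { return log; }
    }

    // Interface module glue: pull events from the active parser and update the display model.
    class VisualizationLoop {
        static void step(EventSource source, BuildingState building) {
            for (SensorEvent e : source.poll()) {
                building.apply(e);
                // a GUI layer would repaint the floor-plan overlay and event queue here
            }
        }
    }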
3.3.1 Software Design Choices

The visualization display interface was implemented in Java. The object-oriented paradigm employed in Java was conducive to a logical representation of building information: the building, floors, and sensors were represented by separate object classes. More importantly, Java's type polymorphism (a feature that enables a uniform interface to handle different complex data types) simplified the task of writing modular code. Since one of the goals of the project was to create a system that could interface with building sensor systems regardless of type or manufacturer, the software was designed to receive input from a variety of different sources. For each protocol used, a separate parser module is needed, but the other modules can be used without modification. We wrote parsers for data received over RS-232 and BACnet, since those were the two transmission methods used by the hardware donated to our team. Additionally, we wrote a parser for Fire Dynamics Simulator output for testing purposes.

3.3.2 Annotation Tool

Whereas the implementation of the data parser and the interface was simply done according to specifications, the building representation required further consideration. It was necessary to associate the sensor labels used by the annunciator panels with their locations on the floor plan. It was also necessary to know the area of effect for each sensor in order to highlight the appropriate parts of the floor plan when the sensors go into alarm. Unfortunately, this information is rarely stored in a way that would allow us to use it directly in the software, so it is necessary to input it by hand for each building. Since this is a time-consuming and error-prone task, we decided to develop an annotation tool: a separate application that simplifies the task of entering building information by hand.

The annotation tool uses an interface that is very similar to our decision support tool's interface. The user marks the sensor location and area of effect directly on the floor plan. Using a familiar graphical interface to enter information makes it possible to quickly and accurately annotate the floor plan. The annotation tool also allows text-based information to be entered for any sensor, any building floor, or the entire building. The text information is organized into parameters and values to allow comparison of different sensors or floors based on the values entered. The variety of data that can be entered makes the annotation tool a robust pre-planning utility that can be used for many different purposes. The annotation is stored in XML format, which is well defined, robust, and can be handled by many existing tools. We propose that the annotation tool or similar pre-planning software be used to standardize the way that building information is stored and accessed.

3.4 Hardware Testing

3.4.1 Overview

The goal of the hardware testing was to demonstrate that the decision support tool is capable of processing actual sensor inputs as well as displaying the information accurately. We used mockups of fire sensor equipment donated to our team by Honeywell and SimplexGrinnell to obtain these inputs. The hardware setup consisted of two different FACPs, one Notifier NFS-640 and one Simplex 4100U, both of which are approved for use in buildings on campus.
Each FACP was hooked up to the following sensors:

3 photoelectric smoke sensors
3 ionization smoke sensors
3 heat sensors
1 flow switch
1 tamper switch
1 manual pull station

The FACPs were in turn connected to a laptop. The laptop used to run the decision support software had the following key specifications:

Processor: Intel Pentium M, 1.73 GHz
Hard drive: Fujitsu MHV2060AH ATA Device, 5400 RPM
RAM: 1024 MB at 266 MHz

Due to the age of these parts, it is possible that the specifications of the laptop contributed to delays in the response time of the decision support tool.

Using the sensors available to each FACP, a floor plan was devised placing each sensor on one of three floors. The floor plan itself, shown below, is a basic hallway with three rooms, a door on the first floor, and a stairwell leading up to each floor.

Figure 3.3: A mockup floor plan for real lab hardware to simulate.

Using the devised sensor layout, the floor plan was then formatted for input using our annotation tool. This annotation identified each sensor on the floor plan and its approximate area of effect, which is visually displayed by the system should the sensor go into alarm.

In order to use the hardware to simulate an emergency scenario, sensors were manually triggered according to a pre-written plan of events. This plan, provided in Appendix B, established an emergency scenario, identifying which sensors go off at which time steps. By controlling the input to the sensors, the output of the system can be analyzed to determine whether the real sensor hardware performs as expected. For this experiment, it is expected that the system will accurately denote which sensors are going off on a floor plan and that there will be little lag time (less than ten seconds) between when a sensor goes into alarm and when it is displayed as being in alarm by the prototype system.

3.4.2 Panel Output Protocols

The Simplex 4100U panel and the Notifier NFS-640 panel are significant because they both have BACnet gateways installed. The BACnet gateway allowed the output of each system to be standardized, so that sensor states from each system are identical. BACnet transferred this data to the system over an Ethernet cable through a network card on a computer. The default settings on the BACnet card provided with the Simplex panel were not configured to monitor analog values, so the card was reprogrammed to specifically monitor analog values of sensors over time. Heat sensors that measured room temperature would then report their current readings in degrees Fahrenheit, and smoke sensors would report their values as the obscuration of the air. The output of the sensors via BACnet then becomes the input to the system.

As previously mentioned, BACnet is convenient because it is a standard protocol installed on a variety of FACPs across several manufacturers in addition to Honeywell and SimplexGrinnell. However, in the interest of thorough experimentation, another communication protocol was used: RS-232. RS-232 is a well-established way to transfer serial data from one device to the next. Virtually every FACP has RS-232 output; serial data transfer is the primary method for communicating with the printer. With very basic wiring, the printer data stream can be tapped and fed into a computer's serial data port, or into a USB port with a DB9 serial data adapter.
Because of the simplicity of serial data and the ubiquity of RS-232 output from FACPs, RS-232 was also explored as a method for taking the output of the FACPs and making it the input for the decision support tool.

3.4.3 Hardware and Scenario Mockup

Each FACP in the mockup was hooked up to twelve different detectors and modules in a Type A loop, meaning a single wire connected each sensor in a loop that began and ended at the FACP. Each flow switch and tamper switch was connected to manual toggle switches, allowing both the manual pull stations and the tamper and flow switches to be instantly activated as necessary. Each photoelectric and ion smoke sensor, as well as each heat sensor, was activated by applying a magnet at a specific point near the LED light. The sensors triggered on application of the magnet; however, there was a significant delay between the application of the magnet and when the sensor actually went into alarm, and this delay appeared to differ from sensor to sensor. Brief experimentation was required to obtain the average response time of each sensor to ensure reliable system input. Appendix C outlines the testing procedure and results for this experimentation. Additionally, for the Simplex 4100U panel, analog values were measured by the sensors.

The specific output from the FACPs includes: the time of the sensor alarm; which sensor on the loop went into alarm, identified by its loop address; its custom ID (entered specifically to match the system input); the alarm state of the sensor (supervisory warning, trouble signal, or fire alarm state); and the analog value of the sensor, if it has one. In order to test for analog values (which magnet tests do not affect), canned smoke was applied in specified bursts to each smoke sensor, and a heat gun was placed beneath each heat sensor in order to trigger the sensor and measure changes in analog readings. With these methods, test inputs could be performed in accordance with a testing plan.

A basic description of the scenario is as follows: at time equals zero, the fire starts on the second floor in the bottom left corner room of the floor plan (shown earlier in Figure 3.3). A hot smoke layer soon forms on the ceiling of the room and begins to exit into the hallway through an open door. As the hot gases move over the second floor heat detector, the heat detector goes into alarm roughly sixty seconds from the start of the scenario, and the audio/visual alarms activate to make personnel in the building aware of the emergency. The smoke layer expands quickly through the small hallway, and the second floor ionization smoke sensor detects the smoke layer at time equals ninety seconds. At this point, personnel have begun evacuating and going down the stairwell. As one person moves to the fire escape, they pull the manual pull station on the way out at time equals 120 seconds. The smoke layer on the second floor now begins to extend all the way to the end of the hallway towards the stairwell, and the photoelectric smoke detector on that floor goes into alarm at time equals 150 seconds. Because the stairwell doors have been left open, the smoke enters the stairwell and rises, entering the third floor. Because of the increased burning conditions of the initial fire, the smoke layer is hot and dense, moving up the stairwell at a rapid rate. At time equals 180 seconds, the third floor photoelectric smoke detector has been triggered.
By time equals 210 seconds, the third floor ionization smoke sensor has gone into alarm as the smoke rapidly fills the third floor. At time equals 240 seconds, the third floor heat sensor has been triggered. The scenario has concluded at this point.

3.4.4 Procedure

In order to carry out tests of the scenario, two people were required. The scenario was experimentally demonstrated as follows:

1. At the start of the test, a stopwatch was triggered by Person 1.
2. At each time point that a sensor was activated in the scenario, a sensor was triggered on the sensor mockup by Person 2. Smoke and heat sensors were activated with the application of a magnet; tamper and flow switches, as well as manual pull stations, were manually triggered by their respective toggle switches.
3. (a) In order to accurately follow the scenario, the magnets were applied by Person 2 after a cue from Person 1. The experimentally determined lead times were taken into account. These lead times are provided in Appendix C.
   (b) For the Simplex panel, canned smoke was applied in a five-second burst to the smoke sensors, and a small flame was placed under each heat sensor according to the appropriate trigger times. The same call-out procedure by Person 1 was employed.
4. The data output by the FACP as a result of the sensor triggers were sent to our prototype using the appropriate communication protocol (BACnet or RS-232).
5. Our prototype interpreted the data and displayed the information on the user interface.
6. A video of the decision support tool was recorded as it responded to the sensor input.
7. The video was then compared to the pre-designed simulation (which was artificially entered to have zero delay).

3.4.4.1 Analysis

We compared the video output from the hardware mockup with the pre-designed video of the scenario. This comparison achieved two things: 1) it determined whether or not the sensor technology and communication protocols worked appropriately with the designed software, and 2) it identified whether there were significant delays between the hardware response time and the ideal case of no delay. Comparing the video output to the pre-designed scenario allowed us to determine whether the system's performance was acceptable. The analysis is qualitative in nature due to the uncertainty associated with the triggering of each sensor.

3.5 Fire Dynamics Simulator Tests

3.5.1 FDS Overview

Fire Dynamics Simulator (FDS) is a program developed by NIST for fire dynamics simulation ("Smokeview (Version 5) - A Tool for Visualizing Fire Dynamics Simulation Data, Volume I: User's Guide", n.d.). The input files for this program are written in a Fortran namelist format. Information contained within an input file can include simulated building obstructions, ignition sources, sensors, and the mesh over which the solution is computed. Sensors that are readily available in FDS include smoke detectors, heat detectors, and sprinklers. The device properties can be set manually for the different sensors (for our scenarios, the values from the FDS user guide were utilized). Smokeview is software developed by NIST that graphically displays the numeric solution found by FDS. By default, Smokeview will display the obstructions written in the input file, with the ability to show the movement of the smoke and other particles within the simulation. Smokeview is also capable of displaying temperatures at any point within the simulation.
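For our tests, the prototype's FDS parser consumed the device readings that FDS writes to a comma-separated output file (typically named CHID_devc.csv) rather than live panel data. The sketch below shows, under stated assumptions, how such a file can be scanned for the first time each device crosses an alarm threshold; the file name, the quote handling, and the single shared threshold are illustrative (in the actual tests, threshold values were set per sensor type, as described in Section 3.5.3).

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hedged sketch: find the first time each FDS device reading crosses a threshold.
    // Assumes the usual FDS CSV layout: a units row, a name row, then time + values.
    public class FdsActivationScanner {
        public static Map<String, Double> firstCrossings(String csvPath, double threshold)
                throws Exception {
            Map<String, Double> activation = new LinkedHashMap<>();
            try (BufferedReader in = new BufferedReader(new FileReader(csvPath))) {
                in.readLine();                              // skip the units row
                String[] names = in.readLine().split(",");  // device names; names[0] is the time column
                String line;
                while ((line = in.readLine()) != null) {
                    String[] cols = line.split(",");
                    double t = Double.parseDouble(cols[0].trim());
                    for (int i = 1; i < cols.length; i++) {
                        String device = names[i].trim().replace("\"", "");
                        if (!activation.containsKey(device)
                                && Double.parseDouble(cols[i].trim()) >= threshold) {
                            activation.put(device, t);      // first time the reading exceeds the threshold
                        }
                    }
                }
            }
            return activation;
        }
    }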
Simulating in FDS and displaying the results in Smokeview eliminated the need to test the prototype in an actual building, since performing such a test would not be feasible.

3.5.2 Simulations

A simple four-room case was used to demonstrate that the prototype could process the data. This simulation consisted of four rooms connected by a hallway in the middle. The rooms contained sprinklers, smoke detectors, and heat detectors. This simulation was created to establish what information was output by the FDS program and to check that the devices and fire source worked as intended. The simulation's results were also employed as a preliminary test of the prototype's capabilities, ensuring that the prototype was capable of handling data generated from FDS.

FDS simulations of greater complexity were utilized to test the final functionality of our prototype decision support tool. These simulations included a multilevel case and several cases modeling a wing of the James M. Patterson (JMP) building on the University of Maryland campus. Initially, the simulations were run in the single-processor mode of FDS on computers with Intel dual-core processors to check whether the obstructions in the scenarios were dimensionally correct. The scenarios were then run in full on a dedicated Linux server. Each scenario contained a single continuous burner that increased heat over time. This choice of fire source simplified the simulation, allowing continuous smoke to set off the different sensors without making the cases dependent on the location and quantity of combustibles. The intent of the FDS simulation was to show that the different sensors being triggered into alarm at different times could be displayed by the prototype, not to demonstrate the realism of the fire simulation itself.

The multilevel case contained three rooms stacked on top of one another, with a single smoke detector, heat detector, and sprinkler on each floor. The floors were connected by a staircase on the side of the building. The obstructions in the input file were created by first sketching out the floor plan and then translating that information into the proper obstructions. The multilevel simulation was created to test the prototype's ability to handle multiple floors; if the prototype could not switch between floors or accurately display the situation in this scenario, the prototype would need to be modified. The scenario was run for thirty minutes in simulation time, with the burner on the bottom level to allow the smoke to rise to the other levels.

The JMP simulations were created to demonstrate how the prototype would function when faced with larger building sizes and more complex geometries. This simulation contained multiple rooms and hallways in order to provide a more complex path for the smoke to flow through. The floor plan for the wing of JMP was drawn in AutoCAD and imported into FDS code through an AutoCAD plugin. The imported obstructions were used as the base floor plan for all the input files. Multiple sensor placement variations on the same floor plan were used to establish how sensor density affected the display capabilities of the prototype. Originally there were to be three sensor setups: one representing the minimal number of sensors, another representing the actual as-built setup of JMP, and the last containing a heat and smoke detector in every room. For this test, the maximum sensor density and the as-built setup of JMP were the only two variants used.
The reason for this decision is that, for commercial buildings, smoke detectors are not typically required as long as sprinklers are present (NFPA 101: Life Safety Code, 2009). The simulation was fitted with a single sprinkler, representing a flow switch for the entire floor; multiple sprinklers were not placed because the modeled wing of JMP is a single sprinkler zone as built. Using one sprinkler also avoided the added complexity of water droplets lengthening the simulation's computation time.

3.5.3 Testing Procedure

There were two different objectives for testing with simulations. The first was to determine whether the prototype could accurately show the scenario; the results of the multilevel case and the regular JMP simulation were used for this objective. The second was to determine the sensor density for which our prototype decision support tool performs most effectively; the placement variations described earlier for JMP were used for this objective.

First, the prototype was set up to use the FDS output in place of sensor data. This setup took several steps:

1. Annotate the floor plan used to design the FDS simulation with the sensor locations and areas of effect.
2. Label the sensors so that the sensor labels match the data field labels in the FDS output.
3. Set threshold values for each sensor type to determine when the sensors go into alarm.
4. Parse the FDS output to read in sensor values and find when each sensor goes into alarm.
5. Use the timestamps in the FDS output to simulate sensor activation at the appropriate time intervals.
6. Pass the sensor activations as input to the visualization software.

As the prototype was displaying the changing state of the FDS simulation, a video of the simulation's display was created by utilizing an existing function within Smokeview. This function exported the individual frames that Smokeview displayed as JPEGs, which were then spliced together into a single video of the scenario. The time between frames was determined by the numerical solutions found by FDS. Both displays have timestamps located on the screen; timing comparisons were gauged based on those stamps. Pictures were taken at one-minute intervals in order to examine whether or not our prototype accurately displayed the progression of the fire scenario.

3.5.3.1 Test 1

The first test compared the information given by the prototype to the changing state of the simulation. The test was to determine whether the progression of the smoke movement, representing our fire scenario, was followed by the sensor activations in the prototype. If a sensor had been activated in the simulation and not in the prototype, it was noted. A visual comparison of the activations was assessed at this point. The videos showing the display of the prototype and the simulation were placed side by side. At one-minute intervals, the two videos were compared to check whether the prototype was accurately describing the scenario displayed in Smokeview.

3.5.3.2 Test 2

For Test 2, JMP was annotated in two different ways: a full sensor layout and a sparse sensor layout. The full sensor layout was identical to the one used in Test 1; each room and hallway was equipped with a heat sensor and a smoke sensor. These sensors were physically located close to the center of each room. There was also a single flow switch monitoring the sprinklers for the whole floor, located close to the fire source. In the sparse sensor layout, only the major rooms (1, 2, 3, 4, and 26) were equipped with sensors.
The sensors were positioned identically to the corresponding sensors in the full case. Our original plan was to analyze the results of this test at one-minute intervals. However, the sparse case does not change very frequently, and four minutes proved to be a more appropriate interval for describing the development of that scenario.

Chapter 4
Results

4.1 Hardware Test

We tested our prototype with two different fire sensor systems. The purpose of these tests was to demonstrate that existing hardware can support the proposed visualization system. A second goal was to determine what limitations current hardware imposes on the system and how those limitations may be worked around. Please refer to Section 3.4.4 for the procedure of each test.

4.1.1 Honeywell Mockup Test

The Honeywell mockup test was performed by tapping the RS-232 port on the NFS-640 panel to obtain streaming information about which alarms were triggered. At t = 0, the first alarm, H2, the heat sensor on the second floor, went off. There was a four-second delay between when the sensor went off and when the GUI actually displayed the alarm. The prototype did react immediately and kept track of the four seconds it took to display the data, so the time-elapsed clock was still correct. The ion smoke sensor I2 went off approximately six seconds early, at t = 29, due to the variation in trigger times associated with activating the sensors by magnet. After that, the manual pull station was pulled at t = 62 and the screen shifted to look at the first floor; the second floor was then manually reselected. The photoelectric smoke sensor P2 went off approximately on time at t = 88. Next, the photoelectric smoke sensor P3 at the top of the stairwell on the third floor went off at t = 127. Once again, the four-second delay can be seen here as the system jumps to a new floor, but the clock still follows the correct time after the four seconds. Finally, the third floor ion smoke sensor I3 triggered 12 seconds late at t = 145, and the third floor heat sensor H3 triggered at t = 174.

4.1.2 SimplexGrinnell Mockup Test

The SimplexGrinnell mockup test was performed by programming the BACnet card on the Simplex 4100U panel to monitor changes in the analog values of smoke and heat sensors. As a result of this programming, sensor states could not be obtained and only analog values could be read. At t = 0, the first alarm, H2, the heat sensor on the second floor, went off. There was a four-second delay between when the sensor went off and when the system actually displayed the alarm. As with the Honeywell test, the system did react immediately and kept track of the four seconds it took to display the data, so the time-elapsed clock was still correct. The ion smoke sensor I2 went off approximately five seconds late, at t = 35, due to the variation in trigger times associated with triggering the sensors by magnet. The manual pull station was not pulled because the BACnet gateway was programmed to monitor changes in analog data only; manual pull stations do not provide analog data, and polling such a device for analog data is an invalid call on the Simplex 4100U panel that shuts off the BACnet card. The second floor photoelectric smoke sensor P2 went off four seconds early at t = 86. Next, the photoelectric smoke sensor at the top of the stairwell on the third floor went off fifteen seconds late at t = 135. Once again, the four-second delay can be seen here as the system jumps to a new floor, but the clock still follows the correct time after the four seconds.
Finally, the third floor heat sensor H3 triggered nine seconds early at t = 159, and the third floor ion smoke sensor I3 triggered nineteen seconds late at t = 169.

4.1.3 Fire Progression

Key screenshots from the hardware tests have been selected and are presented here. The selected screenshots show what the prototype displayed immediately after each sensor was triggered. From these screenshots, a story of the fire has been constructed as it would be understood by an observer who is familiar with the system and has no information about the fire other than what is presented in the screenshots.

(a) Honeywell (b) Simplex
Figure 4.1: Frame A

The above screenshots show that the fire originated on the second floor. The fire probably started with a clean-burning material that did not trip the smoke sensors. In the NFS-640 annotation file, only the corner appears in red, identifying the zone of effect for the heat detector. In the 4100U annotation, the zone of effect for the sensor is the whole floor, making it more difficult to visually understand where the fire initially started, but it still clearly shows the floor where the incident began.

(a) Honeywell (b) Simplex
Figure 4.2: Frame B

The east wing of the second floor is filling with smoke. The fire is continuing to spread and probably originated in the east wing.

(a) Honeywell (b) Simplex
Figure 4.3: Frame C

In the Honeywell test, the manual pull station on floor one has been activated. This activation indicates that there are people on floor one and that evacuation of the building has begun. In the Simplex test, the manual pull station was not pulled due to the difficulty of reading non-analog alarm states via BACnet, as described earlier.

(a) Honeywell (b) Simplex
Figure 4.4: Frame D

In the Simplex test, we can see that smoke has spread throughout the second floor and has reached the stairwell. In both trials, the text-based display on the right side of the screenshots shows that a new sensor has been activated. However, in the Honeywell test the first floor was still being shown as a result of the manual pull station being triggered, so the view had to be manually switched back to the second floor.

(a) Honeywell (b) Simplex
Figure 4.5: Frame E

The smoke has spread up the stairwell from the second floor to the north wing of the third floor.

(a) Honeywell (b) Simplex
Figure 4.6: Frame F

Smoke has spread throughout the third floor.

(a) Honeywell (b) Simplex
Figure 4.7: Frame G

Material on the third floor has probably ignited.

4.1.4 Analysis

The tests have shown that the system can interface with existing Honeywell and Simplex hardware, though with some difficulty. It is easy to track the progression of the fire by looking at the displayed information, assuming that there are multiple sensors present in the building. The text-based display on the right side of the screen should be improved for readability and should display information such as sensor floor and sensor type.

One observation to note in the results is that, in the Simplex test, there were a few occasions where the prototype system detected an alarm before the Simplex 4100U panel did. This anomaly is most likely due to the fact that the BACnet plugin for the system monitors the changes in analog values and detects alarms when the threshold values of the sensors are reached. The 4100U most likely has more sophisticated measures that prevent it from picking up nuisance alarms, hence the slight delay in sounding the alarm so that it can be more confident the alarm is real.
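The behavior just described can be made concrete with a small sketch of threshold-based detection over streamed analog values. The debounce step (requiring several consecutive readings above the threshold before declaring an alarm) is our own illustration of the kind of nuisance-alarm filtering a panel might apply; the threshold and count values are assumptions, not settings taken from the 4100U.

    // Hedged sketch of threshold-based alarm detection over streamed analog readings,
    // with a simple debounce of the sort a panel might use to suppress nuisance alarms.
    public class AnalogAlarmDetector {
        private final double threshold;            // alarm threshold for this sensor type (assumed)
        private final int requiredConsecutive;     // readings that must stay above the threshold
        private int aboveCount = 0;
        private boolean inAlarm = false;

        public AnalogAlarmDetector(double threshold, int requiredConsecutive) {
            this.threshold = threshold;
            this.requiredConsecutive = requiredConsecutive;
        }

        // Feed one analog reading; returns true only on the transition into alarm.
        public boolean update(double reading) {
            if (reading >= threshold) {
                aboveCount++;
            } else {
                aboveCount = 0;
            }
            if (!inAlarm && aboveCount >= requiredConsecutive) {
                inAlarm = true;
                return true;   // report the new alarm to the building representation
            }
            return false;
        }
    }

A detector constructed with requiredConsecutive = 1 reports an alarm on the first threshold crossing, which is what the prototype effectively did and why it occasionally declared an alarm a few seconds before the panel.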
In addition to the output of the prototype system, the normal output of the printer port from the Notifier NFS-640 panel was captured and is shown below:

ALARM: HEAT(FIXED) DETECTOR ADDR 1D053 Z001 04:15P 012411 1D053
ACKNOWLEDGE 04:16P 012411 Mon
ALARM: SMOKE (ION) DETECTOR ADDR 1D051 Z002 04:16P 012411 1D051
ACKNOWLEDGE 04:16P 012411 Mon
ALARM: MONITOR MODULE ADDR 1M014 Z004 04:16P 012411 1M014
ACKNOWLEDGE 04:17P 012411 Mon
ALARM: SMOKE(PHOTO) DETECTOR ADDR 1D052 Z003 04:17P 012411 1D052
ACKNOWLEDGE 04:17P 012411 Mon
ALARM: SMOKE(PHOTO) DETECTOR ADDR 1D012 Z003 04:18P 012411 1D012
ACKNOWLEDGE 04:18P 012411 Mon
ALARM: SMOKE (ION) DETECTOR ADDR 1D011 Z002 04:18P 012411 1D011
ACKNOWLEDGE 04:18P 012411 Mon
ALARM: HEAT(FIXED) DETECTOR ADDR 1D013 Z001 04:18P 012411 1D013
ACKNOWLEDGE 04:18P 012411 Mon

As can be seen from the default configuration of the panel, simply reading the alarm states as they go off makes it hard to gain a visual perspective of the scenario, and harder still to get a visual picture of its progression. Custom labeling of the sensors could aid in this process, since this is a small building with few sensors, but in larger configurations it would be much more difficult to get a full visual picture of how the fire started and how it has progressed. The visual nature of the prototype, as well as its replay feature, can potentially aid in providing an image of the spread of the fire in a building.

4.2 FDS Testing of GUI

4.2.1 Test 1 Results

A qualitative analysis of the correlation between the fire activity in the FDS simulation and the GUI of our decision support tool prototype was undertaken. Images from FDS and the GUI were compared at one-minute intervals to assess the accuracy of the view of fire progression through the decision support tool. This assessment was done by matching rooms with smoke visible in Smokeview to rooms showing smoke detector activations in the GUI. The timeline used was based on the Smokeview simulation; the GUI time of first alarm is approximately 65 seconds after the Smokeview timeline begins. Refer to Section 3.5.3 for the full procedure of this test.

4.2.1.1 JMP Scenario

A visual assessment of the decision support tool's display for the JMP simulation in comparison to the Smokeview images showed only two incongruities. At the three-minute mark, smoke was visible at a low concentration in room 9 but had not yet been detected by the decision aid display. This discrepancy can be seen in Figure 4.8, where there is no indication on the GUI display that an alarm has activated (neither an entry in the dynamic panel on the right nor a graphical change in the floor plan overlay). However, by the four-minute mark, the GUI showed smoke in room 9.

Figure 4.8: Side by Side comparison of JMP simulation to GUI at the three minute mark

At the seven-minute mark, room 22 showed smoke in Smokeview but not in the GUI. Again, by eight minutes, the GUI indicated the alarm. Also by eight minutes, the prototype's GUI showed smoke in every room. It is at this point that visual assessment of smoke progress was halted, because the simulation would primarily show an increase in the intensity of smoke concentration. The GUI is not able to show these changes in intensity once all the smoke detectors are activated; it is only capable of displaying whether or not a sensor is triggered, not the analog sensor output. Figure 4.9 displays the simulation at the eight minute mark, while Figure 4.10 shows the simulation at the twelve minute mark.
As evidenced in the figures, the display of activated smoke sensors on the GUI saw no change, while in the simulation all of the rooms show an increase in smoke concentration. A visual assessment of heat progress was not carried out.

Figure 4.9: Side by Side comparison of JMP simulation to GUI at the eight minute mark

Figure 4.10: Side by Side comparison of JMP simulation to GUI at the twelve minute mark

The discrepancies described above are likely due to the threshold values for the smoke detectors. Smoke can be seen in the Smokeview simulation before the alarm triggers, and each alarm triggers somewhere within the fairly large time gap of one minute. For room 9, the GUI activation appeared seconds after the image that was used as the comparison between the GUI and the simulation. For room 22, the GUI activation time was at 7.5 minutes. Because the activation times fall between the minute markers, these discrepancies do not indicate any error in the decision support tool. A visual assessment of the smoke progression in the JMP simulation shows a successful rendering of the smoke progression in the building.

4.2.1.2 Multilevel Scenario

In the multilevel simulation, the visual assessment shows a mostly accurate depiction of the smoke progression. At three minutes, smoke is visible in the stairs, but no detectors are located there in the simulation. At four minutes, smoke can be seen entering the second and third floors but has not encompassed the entire room, so it is reasonable that the smoke alarm has not yet triggered. At five minutes, both the first and second floors show smoke detected in the GUI. However, the third floor has a significant quantity of smoke, yet its smoke detector has not been activated. Looking at the activation times, the third floor smoke detector triggers twenty seconds after the second floor detector. This occurrence is somewhat surprising because, from a horizontal view, the smoke on floor three appears to be more highly concentrated than on floor two. The sensor on the third floor is mounted 0.1 m higher than the other sensors and is located farther from the door than on the other two floors, likely explaining the delay in sensor activation on the third floor. However, the discrepancy is only five seconds. At six minutes, all of the floors show smoke in both the GUI and Smokeview, and the visual assessment was completed. Considering the minute-long intervals, the visual assessment concludes that the GUI is sufficiently accurate in portraying the smoke progression across multiple floors.

4.2.2 Test 2 Results

The second test determined how sensor density affected the prototype's performance. This test involved comparing the videos of the two different JMP sensor placement cases previously described. The test examines the difference between sensor densities to see whether more visual information can be obtained from buildings with more sensors. Refer to Section 3.5.3 for the full procedure of this test.

4.2.2.1 Start of incident

(a) sparse (b) full
Figure 4.11: Incident start

The sparse sensor layout detects the fire over a minute after the full sensor layout has gone into alarm. Room 3, where the initial sensor in the sparse layout is located, is still fairly close to the fire source. Due to this proximity, locating the initial source from the data obtained from the sparse sensor layout is still possible. However, this requires an inference that is not necessary when viewing the data from the full sensor layout.
4.2.2.2 Flow switch activation: Four minutes from start of fire

(a) sparse (b) full
Figure 4.12: Flow switch activation

By this point, the full layout shows that the smoke has spread across the building. The sparse layout shows a very different picture, with sensors in alarm only in the top left corner of the building. The sparse case does not correctly represent the spread of smoke; the movement of smoke is seen more clearly with the full sensor layout. About half of the building's smoke sensors are in alarm in both scenarios (two out of five for the sparse case, and twelve out of twenty-seven for the full case). However, without an indication of which rooms have sensors, the sparse case appears to under-represent the extent to which smoke has spread. The fact that only two sensors are in alarm for the sparse case could be deceiving.

4.2.2.3 All smoke sensors in alarm: Eight minutes from start of fire

(a) sparse (b) full
Figure 4.13: All smoke sensors in alarm

In both sensor layouts, all smoke sensors are now in alarm. However, on the display showing the sparse layout, less of the area is marked to indicate smoke. The effect is quite pronounced; at a glance, the sparse case appears to be only half-filled with smoke, while in the full case it is clear that smoke is present in every room. Another aspect of this effect is that the source of the smoke in room 26 is not readily apparent, raising the question of whether there is a separate source of smoke. From the full case it is clear that the smoke did in fact propagate as expected. The full case now has two heat sensors in alarm, signifying a more serious incident. The sparse case has only smoke sensors in alarm, and no new sensors have gone into alarm for the past two minutes. The full case indicates further developments in the fire that the sparse case cannot represent.

4.2.2.4 All sensors in alarm for sparse case: Twelve minutes from start of fire

(a) sparse (b) full
Figure 4.14: All sensors in alarm

All the sensors from the sparse case that are triggered by the simulation are now in alarm. The sparse case finally gives an indication that temperatures have climbed much higher close to the source of the fire. However, it is still missing information compared to the full case, which shows that the high temperatures have already spread to the center of the building.

Chapter 5
Conclusions

Currently, EFRs accessing building sensor data are given sensor activation data sequentially and without context. A clear picture of the situation in the building is virtually inaccessible from textual input of this sort. We proposed a decision support tool to interpret building sensor data and visualize the progression of a developing fire emergency. The goal was to provide as accurate a visual representation of the emergency as possible to key decision-making personnel, using only technology already available in buildings. We developed a prototype decision support tool to show that a decision aid display can be an effective and feasible way to provide valuable information to incident commanders. The prototype was tested using sensor activation inputs as well as virtual simulation inputs to verify the prototype's capability to handle physical sensor data and its ability to track fire emergency development accurately.

Using a sequence of triggered alarms, the prototype demonstrated that it could successfully handle sensor data inputs. In current systems, RS-232 has been a common form of data output for the printers commonly connected to FACPs.
Basic printer output from FACPs did not provide any useful visual context of the fire progression. When the same data from the printer output were used as input for the prototype decision support tool, the GUI clearly showed the events of the emergency scenario in real time. The delay between sensor activation and visualization on the GUI was minimal (four seconds at most) and was in large part due to inefficiencies of the chosen programming language, Java, and the hardware limitations of the lab laptop. Tests with physical sensor inputs demonstrate that a decision support tool can connect to existing building sensor systems and provide more effective information.

In order to perform experiments on a larger scale, virtual simulations of fire scenarios were created in FDS. The virtual sensor output of the FDS simulations was used as the input to the prototype, and the visual outputs of Smokeview and the GUI were then compared. There were two virtual experiments. One was an analysis of a residential-sized building with three sensors per floor and a total of three levels. This multilevel experiment was of a scale similar in size and sensor density to the hardware tests. The virtual experiments matched the general results of the hardware experiments and reached the same conclusions.

The next FDS experiment was of a significantly larger scale and sensor density, in a replication of the J.M. Patterson building at the University of Maryland. This comparison demonstrated that the progression of a fire emergency in a full-scale building can be accurately portrayed by a visualization tool. The experiment was run in two parts, one with a low sensor density (the real, as-built sensor configuration) and one with a high sensor density, closer to an ideal case. In the low density test, the results showed that it was very difficult to get a detailed view of the progression of the fire. In the high density experiment, however, the progression of the fire was clear and identifiable. The high density scenario provided more at-a-glance information for EFRs. In addition, the stage of development of the fire was more readily apparent by comparing the progression of activated smoke sensors to heat sensors. While a higher sensor density leads to a more precise picture, even in low sensor density configurations the visual display provides more at-a-glance information than the plain text output of an FACP.

5.0.3 Future Directions

The system proposed in this project has many potential implications for its own portion of the industry as well as others. The future directions inspired by this project fall into two distinct categories: advances that can be made to improve the decision support tool itself, and advances to the industry that would vastly increase the potential of incorporating such a system.

For the system itself, there are many improvements to be made. Further work can be done to streamline the user interface (UI) and maximize the human interfacing components of potential hardware that could be used. Currently, the software is in a proof-of-concept phase, running on whatever computer it is installed on. It is envisioned to be installed on touchscreen devices such as Toughbooks and other tablet devices for fast, intuitive touch interaction. In order to reach the envisioned state, more research on human interfacing must be considered. Usability testing needs to be done to determine what kind of difference this tool makes in EFR response to an emergency.
Questions such as "Do EFRs make more effective decisions, or make decisions significantly faster, with the decision support tool?" are paramount in justifying its further development. Research to evaluate whether or not too much information is displayed is also imperative. Optimizing the hardware and software for use by first responders is a key step toward the system being practically adopted.

Other advances to the decision support tool include increasing the variety of inputs it can accept. The tool is designed to allow new sensor technologies to be added in a modular way, and annotations of floor plans can be updated to include new devices. The result is that there is room for new technology to supply more, and increasingly sophisticated, data for analysis by the decision support tool. Additions that could be made based on current technology include HVAC sensors, security systems, and structural health monitoring devices. HVAC sensors can provide further information on the air flows within a building, monitor humidity, and track water flows. Security systems offer access to motion sensors, which could be used to monitor motion in rooms due to either smoke currents or persons still inside the building. Additionally, they provide security cameras whose video feeds could be used to provide an internal view of the building. Such cameras, combined with infrared camera technology or new devices such as fire- and smoke-sensing cameras, could bring significant visual data. Through cameras, the progression of the fire and the location of civilians and EFRs within a building could be seen in detail. Additional technology that could be included is GPS tracking of individual EFRs within a building. While this is most likely a more complicated inclusion, it is potentially valuable information when responding to an emergency. Thorough and properly visualized inclusion of current technology could provide a whole new realm of data to aid first responders in analyzing an emergency scenario.

With modularity being a key goal of the system, taking in new types of data as new technologies evolve is also crucial. Ideally, new sensors that are able to more accurately monitor temperature, structural integrity, and overall building health could be used to predict dangerous scenarios such as flashover or building collapse. Predictive analysis based on accurate sensor data could potentially save lives. It is essential that, as the proposed decision support tool evolves, modularity stays at the forefront of priorities in its future direction.

For the industry, significant changes could lead to innovation and development in emergency technology. An important recommendation for a new standard is increasing the requirements for building sensor configurations. As shown in the sensor density experiments, installing more sensors in a building allows for a more accurate visualization of the fire scenario. Requiring more sensors would greatly increase the benefits of decision aid displays like the tested prototype, but may not be cost effective; further cost-benefit analysis should be undertaken. The single biggest recommendation for the future is the adoption of a unified standard protocol for sensor and other building data. Such a standard is essential for the decision support tool to be able to take in new inputs. BACnet, a protocol used in our team's experiments, is a good start toward such a standard.
It unifies the outputs of sensors and currently works across fire sensors and HVAC sensors. However, it does not have the capacity to deal with new kinds of sensor systems such as security or EFR tracking systems. In addition, our experimentation with BACnet was limited to a sensor mockup of only twelve input devices; in a full-sized building, there could be orders of magnitude more sensors to incorporate. The refresh rate of the BACnet protocol may not be fast enough to handle real-time monitoring of data on that scale, and further experimentation is necessary to determine whether it is. Another important standard for the proposed system is NEMA SB 30, the standard that outlines how visuals are shown on an interface for EFRs. A key point for the adoption of a new technology by first responders is a consistent visual interface. All layouts, symbols, colors, and indicators need to be standardized across every display so that there is no new learning curve when moving from device to device.

As another direction, the industry could work to advance technology in order to increase the potential of the proposed tool. Making sensors more robust in emergency scenarios (e.g., more heat resistant so that they fail at a later time in fire scenarios), making them take in higher ranges of data at faster rates, and increasing the variety of data available could all increase the information available to first responders. Ultimately, the goal of the system is to provide better data for EFRs to make more informed decisions.

While discussing future directions of this complex system, there are also challenges to consider. Widespread implementation of the proposed decision support tool would require initial training of EFRs, and there are both logistical and financial costs associated with such a significant amount of training. The economic challenges of mass hardware implementation must also be recognized. Conveniently, the system is primarily software in nature and could run on hardware already existing in the field, such as Toughbooks. However, not all fire departments have the necessary technology. Furthermore, if a new standard protocol is implemented, the fire sensor industry and all additional sensor industries that would become included will need to adopt that protocol and make appropriate hardware accommodations. While this project did not attempt to address these issues, it would be important to consider them in future work on this kind of system.

Appendix A
Features List

Static/Pre-planning Sidebar

The purpose of the static sidebar is to provide a synopsis of the building involved in the emergency situation. Firefighters will use this information before they arrive at the scene. This information will be displayed on the left side of the display in plain text. The following information, if available, MUST be displayed in this sidebar in the following order:

Occupancy (notation): assembly, business, educational, factory, high-hazard, institutional, mercantile, residential, storage, utilities, miscellaneous (International Building Code 2006)
Construction (notation): fire-resistive non-combustible, protected non-combustible, unprotected non-combustible, protected combustible, unprotected combustible, heavy timber, protected wood frame, unprotected wood frame
Number of floors
Possible hazards: chemical, gas tanks, biological, radioactive
Display time of first alarm in 24-hour mode in hours-minutes-seconds (i.e., 20:15:30, as opposed to 08:15:30 P.M.)
Dynamic Alert Sidebar

The purpose of the dynamic sidebar is to display all important alerts that have occurred during the emergency situation. This information will be displayed on the right side of the display in plain text.

Display elapsed time since first alarm
Display alarms chronologically with the newest alarms on top. Alarms notated as: sprinklers active, smoke alarm, heat alarm, chemical alarm, chem. suppression, pull station
Should state location of alarms by room, sprinklers by zone
Should time stamp in 24-hour time
Scroll bar if the space of the display window is exceeded
Display first alarm of each floor and time occurred

Floor Plan

The purpose of the floor plan is to allow Incident Commanders to visually pinpoint the locations of sensors in alarm and other items of importance. The dynamic information displayed will be overlaid on a static, two-dimensional floor plan. The system will utilize NEMA SB 30 icons and icons created by our team. Only sensors in alarm will be displayed on the floor plan; idle sensors will be hidden from view. The floor plan should be draggable (if possible) but at minimum move based on arrows. The following items must be displayed on the floor plan:

Location and state of smoke detectors (NEMA SB 30)
- Rooms with active smoke detectors will be shaded grey. If a room is determined to contain both smoke and heat, the room will be shaded red. The presence of fire will take precedence over the presence of smoke.
Location of activated heat sensors (NEMA SB 30)
- Rooms determined to have a heat detector going off will be shaded red
Location of activated hazardous material (hazmat) detectors (NEMA SB 30)
- Rooms determined to have activated hazmat detectors will be shaded green
- Rooms with hazmat and heat detectors activated will be striped green and red
- Rooms with hazmat and smoke detectors activated will be striped green and grey
Location of hazardous materials: chemical, biological, radioactive (NEMA SB 30)
Location of troubled sensors (NEMA SB 30)
- Sensors in alarm that stop reporting will display their most recent state and the troubled symbol
Location of high-pressure gas storage (NEMA SB 30)
Locations of elevators/elevator controls (NEMA SB 30)
- The floor an elevator is on should be displayed on the elevator icon
Locations of standpipes/fire hydrants (NEMA SB 30)
Location of utility shutoffs (gas, electric, water)
Location of smoke vents and exhaust fans (NEMA SB 30)
Locations of evacuation/pressurized stairs (NEMA SB 30)
Location of standpipes and sprinkler shutoff (NEMA SB 30?)
Locations of activated sprinkler systems
- Zones with activated sprinklers will be outlined in blue. Zones in a building will not otherwise be displayed. Displayed zones will have a centered label
Locations of activated chemical suppression systems
- Rooms with activated chemical suppression systems will be outlined in purple
Zoom (like Google Maps): three-button control
- Zoom in: moves in a set magnification (will have to figure out the actual amount later)
- Zoom out: moves out a set magnification (max zoom out is the default view)
- Default: this button should set the map so that the entire floor is in view
- Buttons will be placed at the upper left of the map
Replay:
- A scroll bar will allow flipping through the recorded map images
- The animation needs to be faster (by several factors) than real time
- The animation will replace the current map (and needs to be distinguished from the real-time image)
- Wherever the scroll bar is dragged and dropped, playback will continue until it reaches the current situation
- Pressing the play button starts the animation at the beginning and plays to the end
- Only the map area will be affected by replay; all other areas function as normal
Floor control:
- Two areas to pick floors: the first is floors in alarm, the second is a complete list of floors
- Incident Commanders will have the ability to change floors through a scroll function, with point-and-click on the floor of interest
- The system will default to the floor where the first alarm occurred

Pop-up Alert Window Area

The purpose of the pop-up alert window is to display current special alerts and crucial warnings to Incident Commanders. The pop-up alert window will be located in the lower section of the screen, directly below the floor plan.

Pop-up will appear with sound and will not close until acknowledged
Newest alert will be in front of the stack of pop-ups
Information on the pop-up will show up on the dynamic sidebar when the pop-up appears
The following alerts will be displayed in the pop-up window:
- When an alarm goes off on another floor (notated as: Alert: Alarm on floor X)
- When a sprinkler system activates (notated as: Alert: Sprinklers activated, floor X, zone Y)
- When a chemical/special hazard occurs (notated as: Alert: K hazard, floor X, room Z)
- When a chemical suppression system activates (notated as: Chemical suppression systems activate, floor X, room Z)

(a) Troubled Alarm (b) Chemical Hazard (c) Biological Hazard (d) Radioactive Hazard (e) Elevator (displayed with floor number in the middle) (f) Gas Tank (g) Pull Station (displayed when in alarm) (h) Stairs (i) Standpipe (j) Sprinkler Shutoff
Figure A.1: NEMA SB30 Symbols Used

Appendix B
Hardware Test: Emergency Scenario

T = 0 - fire starts
T = 0:00 = 60 seconds - H2 goes into alarm, A/V alarms sound (A/V sensors not visualized)
T = 0:30 = 90 seconds - I2 goes into alarm
T = 1:00 = 120 seconds - Manual pull station activated
T = 1:30 = 150 seconds - P2 goes into alarm
T = 2:00 = 180 seconds - P3 goes into alarm
T = 2:30 = 210 seconds - I3 goes into alarm
T = 3:00 = 240 seconds - H3 goes into alarm

At time equals zero, the fire starts on the second floor in the bottom left corner room on the floor plan (all floors shown below). The fire is very sooty, producing a very hot, dense smoke. A hot smoke layer soon forms on the ceiling of the room and begins to exit into the hallway through an open door. As significant quantities of hot gases move over the second floor heat detector, the heat detector goes into alarm roughly sixty seconds from the start of the scenario, and the audio/visual alarms activate to make personnel in the building aware of the emergency. The smoke layer expands quickly through the small hallway, and the ionization smoke sensor to the right detects the smoke layer at time equals ninety seconds. At this point, personnel have begun evacuating and going down the stairwell. As one person moves to the fire escape, they pull the manual pull station on the way out at time equals 120 seconds. The smoke layer on the second floor now begins to extend all the way to the end of the hallway towards the stairwell, and the photoelectric smoke detector on that floor goes into alarm at time equals 150 seconds.
Figure B.1: Floor 1
Figure B.2: Floor 2
Figure B.3: Floor 3

Appendix C Sensor Triggering Testing

Because alarms did not trigger immediately upon application of the magnet test trigger, there needed to be a way to account for the delay when testing the system. Each Honeywell sensor was triggered a total of fifteen times with a magnet. Each SimplexGrinnell sensor was triggered a total of seven times with a flame or with canned smoke, because those sensors were activated by changes in their analog values rather than by magnets. The sensors were triggered in random order to reduce systematic errors that could be introduced by consistently triggering them at similar time intervals. The time between when a sensor was triggered and when the FACP registered the event was recorded. For each sensor, the mean and standard deviation of its response times were computed. These values were then used to test for normality and to perform a two-tailed t-test for means at the 95% confidence level. The test was done to determine whether a standard response time could be assumed for each sensor when testing the whole system. The results can be seen in the tables below.
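As a minimal sketch of the analysis just described: the appendix does not state which t-test variant was used, so the two-sample Welch statistic below is only one plausible reading, and the class and method names are illustrative. The data are the L1D12 and L1D52 response times from the tables that follow.

```java
import java.util.Arrays;

/** Illustrative only: sample statistics and a Welch two-sample t statistic for sensor response times. */
public class ResponseTimeStats {

    static double mean(double[] x) {
        return Arrays.stream(x).average().orElse(Double.NaN);
    }

    static double sampleStdDev(double[] x) {
        double m = mean(x);
        double ss = Arrays.stream(x).map(v -> (v - m) * (v - m)).sum();
        return Math.sqrt(ss / (x.length - 1));   // sample standard deviation (n - 1 denominator)
    }

    /** Welch t statistic for the difference of two means (unequal variances assumed). */
    static double welchT(double[] a, double[] b) {
        double va = Math.pow(sampleStdDev(a), 2) / a.length;
        double vb = Math.pow(sampleStdDev(b), 2) / b.length;
        return (mean(a) - mean(b)) / Math.sqrt(va + vb);
    }

    public static void main(String[] args) {
        // Recorded response times (seconds) for the photoelectric detectors L1D12 and L1D52.
        double[] l1d12 = {16.78, 13.76, 16.97, 16.62, 14.60, 18.93, 18.72, 16.90, 17.53, 17.67};
        double[] l1d52 = {14.39, 17.46, 9.52, 15.57, 16.90, 8.24, 14.74, 17.04, 14.40, 17.83};

        System.out.printf("L1D12: mean %.2f s, st. dev. %.2f s%n", mean(l1d12), sampleStdDev(l1d12));
        System.out.printf("L1D52: mean %.2f s, st. dev. %.2f s%n", mean(l1d52), sampleStdDev(l1d52));
        System.out.printf("Welch t statistic: %.2f%n", welchT(l1d12, l1d52));
    }
}
```

The resulting t value would then be compared against the two-tailed critical value at the 5% significance level to decide whether the two sensors' mean response times differ.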
Sensor Key

Detector type    First Floor    Second Floor    Third Floor
Photo            L1D12          L1D52           (none listed)
Ion              L1D11          L1D51           L1D91
Heat             L1D13          L1D53           L1D93

Recorded response times:

Sensor    Response times (seconds)                                                 Mean     St. dev.
L1D12     16.78, 13.76, 16.97, 16.62, 14.60, 18.93, 18.72, 16.90, 17.53, 17.67     16.85    1.62
L1D52     14.39, 17.46, 9.52, 15.57, 16.90, 8.24, 14.74, 17.04, 14.40, 17.83       14.61    3.29
L1D11     8.31, 7.96, 8.17, 6.70, 6.84, 9.09, 7.12, 7.12, 7.28, 7.47               7.61     0.76
L1D51     7.12, 6.91, 7.82, 7.47, 14.88, 8.16, 6.84, 9.92, 6.84                    8.44     2.60
L1D13     6.00, 5.93, 5.86, 7.40, 6.56, 6.70, 6.49, 8.45, 9.22, 8.59               7.12     1.23
L1D53     6.78, 7.26, 6.07, 8.24, 7.57, 9.71, 7.47, 9.01, 4.40, 6.70               7.32     1.50
L1D91     7.05, 8.24, 8.11, 9.08, 7.41, 8.66, 8.45, 7.40, 9.71, 6.84               8.10     0.92
L1D93     9.08, 7.40, 7.54, 8.38, 6.84, 9.22, 8.87, 8.45, 8.24, 11.39, 7.48        8.44     1.24

Appendix D FDS Test 1: Full Results

D.1 Side by Side comparison of JMP simulation to GUI

Figures D.1 through D.17: side-by-side comparison of the JMP simulation to the GUI at the one-minute through seventeen-minute marks (one figure per minute).

D.2 Side by Side comparison of Multilevel simulation to GUI

Figures D.18 through D.22: side-by-side comparison of the Multilevel simulation to the GUI at the one-minute through five-minute marks (one figure per minute).

Appendix E FDS Code

E.1 JMP FDS code

&HEAD CHID='jmp_fs' /
&MESH IJK=120,120,12, XB=0.0,40,0.0,25,0.0,4.0, COLOR='BLACK' /
&TIME T_END=3600. / single burner, no objects, 30 min run
&SURF ID='FIRE', HRRPUA=10000.0, TAU_Q=600.00 /
&OBST XB=1.5,2,1.5,2,0,1.5, SURF_IDS='FIRE','INERT','INERT' /
&PART ID='water drops', WATER=.TRUE., SAMPLING_FACTOR=1 /
&PROP ID='Acme Sprinkler'
, QUANTITY=?SPRINKLER LINK TEMPERATURE? , RTI =95, C FACTOR=0.4 , ACTIVATION TEMPERATURE=68, OFFSET=0.10 ,PART ID=?water drops ? , FLOW RATE=189.3 , DROPLET VELOCITY=10. , SPRAY ANGLE=30. ,80. / &PROP ID=?Acme Smoke Detector ? , QUANTITY=?CHAMBER OBSCURATION? , LENGTH =1.8 , ACTIVATION OBSCURATION=3.28 / &PROP ID=?Acme Heat ? , QUANTITY=?LINK TEMPERATURE? , RTI=132. , ACTIVATION TEMPERATURE=74. / smoke detector &DEVC XYZ=6 ,17 ,3.9 , ID=?smoke 2 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=6 ,7 ,3.9 , ID=?smoke 1 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=6 ,1 ,3.9 , ID=?smoke 23 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=14 ,20.5 ,3.9 , ID=?smoke 5 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=17.5 ,20.5 ,3.9 , ID=?smoke 6 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=22 ,20.5 ,3.9 , ID=?smoke 7 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=17 ,17 ,3.9 , ID=?smoke 27 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=17.5 ,13 ,3.9 , ID=?smoke 4 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=17.5 ,6.333 ,3.9 , ID=?smoke 3 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=18 ,1 ,3.9 , ID=?smoke 24 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=25.5 ,3 ,3.9 , ID=?smoke 8 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=25.5 ,6.5 ,3.9 , ID=?smoke 9 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=26.5 ,10.5 ,3.9 , ID=?smoke 10 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=26.5 ,15 ,3.9 , ID=?smoke 11 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=26.5 ,18 ,3.9 , ID=?smoke 12 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=26.5 ,21.3333 ,3.9 , ID=?smoke 13 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=31 ,2.5 ,3.9 , ID=?smoke 14 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=31 ,7 ,3.9 , ID=?smoke 15 ? , PROP ID=?Acme Smoke Detector ? / 116 &DEVC XYZ=32 ,12.5 ,3.9 , ID=?smoke 26 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=31 ,20 ,3.9 , ID=?smoke 16 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=36.5 ,1.5 ,3.9 , ID=?smoke 17 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=37 ,5.5 ,3.9 , ID=?smoke 18 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=37 ,10 ,3.9 , ID=?smoke 19 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=37 ,13.5 ,3.9 , ID=?smoke 20 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=37 ,16.5 ,3.9 , ID=?smoke 21 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=37 ,20 ,3.9 , ID=?smoke 22 ? , PROP ID=?Acme Smoke Detector ? / &DEVC XYZ=23.25 ,9 ,3.9 , ID=?smoke 25 ? , PROP ID=?Acme Smoke Detector ? / s p r i n k l e r &DEVC XYZ=4 ,1 ,3.9 , ID=? s p r i n k l e r 1 ? , PROP ID=?Acme Sprinkler ? / heat detector &DEVC XYZ=5.5 ,1 ,3.9 , ID=? heat 23 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=5 . 5 ,7 .5 , 3 .9 , ID=? heat 1 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=5.5 ,16.5 ,3.9 , ID=? heat 2 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=17.5 ,1 ,3.9 , ID=? heat 24 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=18 ,16 ,3.9 , ID=? heat 3 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=18 ,13.5 ,3.9 , ID=? heat 4 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=18 ,17 ,3.9 , ID=? heat 27 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=14.333 ,20.5 ,3.9 , ID=? heat 5 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=17.333 ,20.5 ,3.9 , ID=? heat 6 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=22.333 ,20.5 ,3.9 , ID=? heat 7 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=23 ,9 ,3.9 , ID=? heat 25 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=25.5 ,3.25 ,3.9 , ID=? heat 8 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=25.5 ,6.75 ,3.9 , ID=? heat 9 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=27 ,10.5 ,3.9 , ID=? heat 10 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=27 ,15.25 ,3.9 , ID=? heat 11 ? , PROP ID=?Acme Heat ? 
/ &DEVC XYZ=27 ,18 ,3.9 , ID=? heat 12 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=27 ,21.5 ,3.9 , ID=? heat 13 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=31.33 ,3 ,3.9 , ID=? heat 14 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=31.33 ,6.75 ,3.9 , ID=? heat 15 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=32 ,13 ,3.9 , ID=? heat 26 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=31.33 ,20.5 ,3.9 , ID=? heat 16 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=36.25 ,2 ,3.9 , ID=? heat 17 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=37.25 ,5.25 ,3.9 , ID=? heat 18 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=37.25 ,10.25 ,3.9 , ID=? heat 19 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=37.25 ,13.25 ,3.9 , ID=? heat 20 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=37.25 ,16.75 ,3.9 , ID=? heat 21 ? , PROP ID=?Acme Heat ? / &DEVC XYZ=37.25 ,20.25 ,3.9 , ID=? heat 22 ? , PROP ID=?Acme Heat ? / f l o o r p l a n &OBST XB=34.1316 ,34.4649 ,3.59167 ,7.59167 ,0.0 ,4.0/ &OBST XB=38.7983 ,39.1316 ,0.333334 ,3.27333 ,0.0 ,4.0 / &OBST XB=34.1316 ,38.7983 ,3.25833 ,3.59167 ,0.0 ,4.0/ &OBST XB=38.7983 ,39.1316 ,3.25833 ,8.25833 ,0.0 ,4.0/ &OBST XB=32.4667 ,39.1333 ,0 ,0.333333 ,0.0 ,4.0 / &OBST XB=34.1316 ,34.465 ,8.5917 ,9.5917 ,0.0 ,4.0 / &OBST XB=34.1316 ,34.4649 ,10.5917 ,11.5917 ,0.0 ,4.0 / &OBST XB=34.1316 ,38.7983 ,8.25833 ,8.59167 ,0.0 ,4.0 / &OBST XB=38.7983 ,39.1316 ,8.25833 ,11.5917 ,0.0 ,4.0/ &OBST XB=34.1316 ,34.4649 ,11.925 ,12.925 ,0.0 ,4.0 / &OBST XB=34.1316 ,34.4649 ,13.925 ,14.925 ,0.0 ,4.0 / 117 &OBST XB=34.1316 ,38.7983 ,11.5917 ,11.925 ,0.0 ,4.0 / &OBST XB=38.7983 ,39.1316 ,11.5917 ,14.925 ,0.0 ,4.0 / &OBST XB= 3 0 . 0 , 3 2 . 5 , 8 . 3 3 3 , 8 . 6 6 6 3 3 , 0 . 0 , 4 . 0 / &OBST XB=28.885 ,32.135 ,5.333 ,5.66633 ,0.0 ,4.0 / &OBST XB=32.1333 ,32.4667 ,5.333 ,8.333 ,0.0 ,4.0 / &OBST XB=32.135 ,32.4683 ,0.333333 ,3.99999 ,0.0 ,4.0 / &OBST XB=28.885 ,32.4683 ,0.0 ,0.333333 ,0.0 ,4.0 / &OBST XB=34.1316 ,34.4649 ,15.2584 ,15.5917 ,0.0 ,4.0 / &OBST XB=34.1316 ,38.7983 ,14.925 ,15.2584 ,0.0 ,4.0 / &OBST XB=38.7983 ,39.1316 ,14.925 ,18.2584 ,0.0 ,4.0 / &OBST XB=34.137 ,34.4703 ,16.666 ,18.2533 ,0.0 ,4.0 / &OBST XB=34.137 ,38.803 ,18.2533 ,18.5867 ,0.0 ,4.0 / &OBST XB=38.7983 ,39.1317 ,18.2583 ,22.3417 ,0.0 ,4.0 / &OBST XB=32.4667 ,39.1333 ,22.3333 ,22.6667 ,0.0 ,4.0 / &OBST XB=29.8888 ,32.4721 ,16.666 ,16.9994 ,0.0 ,4.0 / &OBST XB=32.1388 ,32.4721 ,16.9994 ,22.3327 ,0.0 ,4.0 / &OBST XB=28.8888 ,32.4721 ,22.3333 ,22.6667 ,0.0 ,4.0 / &OBST XB=28.55 ,28.8833 ,0.332967 ,8.33297 ,0.0 ,4.0 / &OBST XB=2 3.885 ,28. 885 ,0.0 ,0.33 3333 ,0.0 ,4.0 / &OBST XB=1 2.335 ,23. 885 ,0.0 ,0.33 3333 ,0.0 ,4.0 / &OBST XB=23.8867 ,25.5533 ,1.16666 ,1.49999 ,0.0 ,4.0 / &OBST XB=23.8833 ,24.2167 ,1.49999 ,4.99999 ,0.0 ,4.0 / &OBST XB=26.55 ,26.8833 ,1.16334 ,4.99667 ,0.0 ,4.0 / &OBST XB=26.5533 ,26.8867 ,5.33315 ,8.33315 ,0.0 ,4.0 / &OBST XB=23.8833 ,26.8833 ,4.99999 ,5.33333 ,0.0 ,4.0 / &OBST XB=23.8833 ,24.2167 ,7.33333 ,8.33333 ,0.0 ,4.0 / &OBST XB=23.8833 ,24.2167 ,5.3333 ,6.3333 ,0.0 ,4.0 / &OBST XB=28.55 ,28.8833 ,8.6663 ,10.1663 ,0.0 ,4.0 / &OBST XB=23.8833 ,28.8833 ,8.33297 ,8.6663 ,0.0 ,4.0 / &OBST XB=23.8833 ,24.2167 ,8.6663 ,10.1663 ,0.0 ,4.0 / &OBST XB=28.55 ,28.8833 ,11.6663 ,13.1663 ,0.0 ,4.0 / &OBST XB=23.8833 ,24.2167 ,11.6663 ,13.1663 ,0.0 ,4.0 / &OBST XB=28.55 ,28.8833 ,13.4997 ,14.4997 ,0.0 ,4.0 / &OBST XB= 2 8 . 5 5 3 3 , 2 8 . 8 8 6 7 , 1 5 . 5 , 1 6 . 5 , 0 . 0 , 4 . 0 / &OBST XB=23.8833 ,28.8833 ,13.1663 ,13.4997 ,0.0 ,4.0 / &OBST XB= 2 3 . 8 8 3 3 , 2 4 . 2 1 6 7 , 1 3 . 5 , 1 4 . 5 , 0 . 0 , 4 . 0 / &OBST XB= 2 3 . 8 8 6 7 , 2 4 . 2 2 , 1 5 . 5 , 1 6 . 5 , 0 . 0 , 4 . 
0 / &OBST XB=24.22 ,27.5533 ,16.5033 ,16.8367 ,0.0 ,4.0 / &OBST XB=28.5533 ,28.8867 ,16.5033 ,19.3367 ,0.0 ,4.0 / &OBST XB= 1 5 . 0 , 2 1 . 6 6 6 7 , 3 . 0 , 3 . 3 3 3 3 3 , 0 . 0 , 4 . 0 / &OBST XB=21.6667 ,22.0 ,2.99934 ,10.3327 ,0.0 ,4.0 / &OBST XB=12.3333 ,21.6667 ,10.3333 ,10.6667 ,0.0 ,4.0 / &OBST XB=21.6667 ,22.0 ,10.3333 ,14.6667 ,0.0 ,4.0 / &OBST XB=13.3333 ,22.0 ,15.9019 ,16.2352 ,0.0 ,4.0 / &OBST XB=23.8867 ,24.22 ,16.5033 ,19.3367 ,0.0 ,4.0 / &OBST XB=23.9133 ,24.2467 ,20.82 ,22.32 ,0.0 ,4.0 / &OBST XB=23.8867 ,28.5533 ,19.3333 ,19.6667 ,0.0 ,4.0 / &OBST XB=28.5533 ,28.8867 ,19.3333 ,22.3333 ,0.0 ,4.0 / &OBST XB=24.2221 ,28.8888 ,22.3333 ,22.6667 ,0.0 ,4.0 / &OBST XB=19.3267 ,22.9933 ,17.8333 ,18.1667 ,0.0 ,4.0 / &OBST XB=19.3333 ,24.2221 ,22.3333 ,22.6667 ,0.0 ,4.0 / &OBST XB=16.6666 ,18.9999 ,17.8333 ,18.1667 ,0.0 ,4.0 / &OBST XB=18.9999 ,19.3333 ,17.83 ,22.33 ,0.0 ,4.0 / &OBST XB=16.0 ,19.3333 ,22.3333 ,22.6667 ,0.0 ,4.0 / 118 &OBST XB=12.3333 ,15.0 ,17.8333 ,18.1667 ,0.0 ,4.0 / &OBST XB= 1 5 . 6 6 6 7 , 1 6 . 0 , 1 7 . 8 3 , 2 2 . 3 3 , 0 . 0 , 4 . 0 / &OBST XB=12.3334 ,16.0 ,22.3333 ,22.6667 ,0.0 ,4.0 / &OBST XB=7.52333 ,12.3567 ,3.0 ,3.33333 ,0.0 ,4.0 / &OBST XB=0.333333 ,6.33333 ,3.0 ,3.33333 ,0.0 ,4.0 / &OBST XB=12.0 ,12.3333 ,3.32667 ,11.4933 ,0.0 ,4.0 / &OBST XB=0.333333 ,12.3333 ,0.0 ,0.333333 ,0.0 ,4.0 / &OBST XB= 1 2 . 0 , 1 2 . 3 3 3 3 , 1 1 . 8 3 , 1 6 . 3 3 , 0 . 0 , 4 . 0 / &OBST XB=0.3333 ,12.3333 ,11.5 ,11.8333 ,0.0 ,4.0 / &OBST XB= 1 2 . 0 , 1 2 . 3 3 3 3 , 1 7 . 8 3 , 2 2 . 3 3 , 0 . 0 , 4 . 0 / &OBST XB= 0 . 0 , 0 . 3 3 3 3 3 3 , 1 1 . 6 6 , 2 2 . 6 6 , 0 . 0 , 4 . 0 / &OBST XB= 0 . 0 , 0 . 3 3 3 3 3 3 , 3 . 0 , 1 1 . 6 6 6 7 , 0 . 0 , 4 . 0 / &OBST XB= 0 . 0 , 0 . 3 3 3 3 3 3 , 0 . 0 , 3 . 0 , 0 . 0 , 4 . 0 / &OBST XB=0.333333 ,12.3333 ,22.3332 ,22.6668 ,0.0 ,4.0 / &HOLE XB= 0 . 2 5 , 0 . 5 , . 5 , . 7 5 , 0 , 0 . 2 5 / &t a i l / E.2 Multi-level FDS code &HEAD CHID=? m u l t i l e v e l ? , TITLE=?Multi l e v e l Case ? &TIME T END=1500. / &MESH IJK=28 ,16 ,18 , XB= 0 . 0 , 1 4 . 0 , 0 . 0 , 8 . 0 , 0 . 0 , 9 . 0 / &OBST XB= 1 1 . 5 , 1 2 . 0 , 0 . 0 , 8 . 0 , 0 . 0 , 9 . 0 / &OBST XB= 1 2 . 0 , 1 4 . 0 , 3 . 5 , 4 . 0 , 0 . 0 , 9 . 0 / &OBST XB= 0 . 0 , 1 1 . 5 , 0 . 0 , 8 . 0 , 2 . 5 , 3 . 0 / &OBST XB= 0 . 0 , 1 1 . 5 , 0 . 0 , 8 . 0 , 5 . 5 , 6 . 0 / &OBST XB= 1 3 . 0 , 1 4 . 0 , 4 . 5 , 5 . 0 , 0 . 0 , 0 . 5 / &OBST XB= 1 3 . 0 , 1 4 . 0 , 5 . 0 , 5 . 5 , 0 . 0 , 1 . 0 / &OBST XB= 1 3 . 0 , 1 4 . 0 , 5 . 5 , 6 . 0 , 0 . 0 , 1 . 5 / &OBST XB= 1 3 . 0 , 1 4 . 0 , 6 . 0 , 6 . 5 , 0 . 0 , 2 . 0 / &OBST XB= 1 3 . 0 , 1 4 . 0 , 6 . 5 , 7 . 0 , 0 . 0 , 2 . 5 / &OBST XB= 1 2 . 0 , 1 4 . 0 , 7 . 0 , 8 . 0 , 2 . 5 , 3 . 0 / &OBST XB= 1 2 . 0 , 1 3 . 0 , 7 . 0 , 7 . 5 , 3 . 0 , 3 . 5 / &OBST XB= 1 2 . 0 , 1 3 . 0 , 6 . 5 , 7 . 0 , 3 . 0 , 4 . 0 / &OBST XB= 1 2 . 0 , 1 3 . 0 , 6 . 0 , 6 . 5 , 3 . 0 , 4 . 5 / &OBST XB= 1 2 . 0 , 1 3 . 0 , 5 . 5 , 6 . 0 , 3 . 0 , 5 . 0 / &OBST XB= 1 2 . 0 , 1 3 . 0 , 5 . 0 , 5 . 5 , 3 . 0 , 5 . 5 / &OBST XB= 1 2 . 0 , 1 4 . 0 , 4 . 0 , 5 . 0 , 5 . 5 , 6 . 0 / &HOLE XB= 1 1 . 5 , 1 2 . 0 , 4 . 0 , 5 . 0 , 0 . 0 , 2 . 5 / &HOLE XB= 1 1 . 5 , 1 2 . 0 , 7 . 0 , 8 . 0 , 3 . 0 , 5 . 5 / &HOLE XB= 1 1 . 5 , 1 2 . 0 , 4 . 0 , 5 . 0 , 6 . 0 , 8 . 5 / &HOLE XB= 0 . 2 5 , 0 . 5 , 0 . 2 5 , 0 . 5 , 0 . 0 , 0 . 2 5 / &PROP ID=?Acme Smoke Detector ? , QUANTITY=?CHAMBER OBSCURATION? , LENGTH =1.8 , ACTIVATION OBSCURATION=3.28 / 119 &PROP ID=?Acme Heat ? , QUANTITY=?LINK TEMPERATURE? , RTI=132. , ACTIVATION TEMPERATURE=74. / &DEVC ID=?Smoke 1 ? 
, PROP ID=?Acme Smoke Detector ? , XYZ=5.75 ,4.0 ,2.4 / &DEVC ID=?Smoke 2 ? , PROP ID=?Acme Smoke Detector ? , XYZ=5.75 ,4.0 ,5.4 / &DEVC ID=?Smoke 3 ? , PROP ID=?Acme Smoke Detector ? , XYZ=5.75 ,4.0 ,9.0 / &DEVC ID=?Heat 1 ? , PROP ID=?Acme Heat ? , XYZ=6.0 ,4.0 ,2.4 / &DEVC ID=?Heat 2 ? , PROP ID=?Acme Heat ? , XYZ=6.0 ,4.0 ,5.4 / &DEVC ID=?Heat 3 ? , PROP ID=?Acme Heat ? , XYZ=6.0 ,4.0 ,9.0 / &PART ID=? water drops ? , WATER=.TRUE. , SAMPLING FACTOR=1, QUANTITIES=? DROPLET DIAMETER? , DIAMETER=2000. / &PROP ID=?K 11 ? , QUANTITY=?SPRINKLER LINK TEMPERATURE? , RTI=95, C FACTOR=0.4 , ACTIVATION TEMPERATURE=68, OFFSET=0.10 ,PART ID=? water drops ? , FLOW RATE =189.3 , DROPLET VELOCITY=10. , SPRAY ANGLE=30. ,80. / &DEVC ID=? Sprinkler 1 ? , XYZ= 6. 0 ,3 . 0 ,2 .4 , PROP ID=?K 11? / &DEVC ID=? Sprinkler 2 ? , XYZ= 6. 0 ,3 . 0 ,5 .4 , PROP ID=?K 11? / &DEVC ID=? Sprinkler 3 ? , XYZ= 6. 0 ,3 . 0 ,9 .0 , PROP ID=?K 11? / &SURF ID=?FIRE ? ,HRRPUA=2000.0 , TAU Q= 600.00 / &OBST XB= 5 . 5 , 6 . 5 , 3 . 5 , 4 . 5 , 0 . 0 , 0 . 5 , SURF IDS=?FIRE ? , ?INERT? , ?INERT? / &HOLE XB= 5 , 5 . 2 5 , 3 . 5 , 3 . 7 5 , 0 . 0 , 0 . 2 5 / 120 Appendix F Prototype Source Code / . / f f a / annotation /AnnotationIO . java / package f f a . annotation ; import java . awt . Polygon ; import java . i o . F i l e ; import java . i o . IOException ; import java . i o . PrintWriter ; import java . u t i l . L i s t ; import java . u t i l . TreeMap ; import java . u t i l . regex . Matcher ; import java . u t i l . regex . Pattern ; import javax . imageio . ImageIO ; import org . jdom . Document ; import org . jdom . Element ; import org . jdom . JDOMException ; import org . jdom . input . SAXBuilder ; import org . jdom . output . Format ; import org . jdom . output . XMLOutputter ; import f f a . model . Building ; import f f a . model . Floor ; import f f a . model . SensorLayer ; import f f a . model . SensorType ; public c l a s s AnnotationIO f p r i v a t e s t a t i c F i l e annotationFile ( F i l e root ) f return new F i l e ( root , " annotation ") ; g p r i v a t e s t a t i c F i l e f l o o r F i l e ( F i l e root , i n t f l o o r ) f return new F i l e ( root , String . format (" f l o o r%d . png " , f l o o r ) ) ; g public s t a t i c Building newAnnotation ( F i l e f o l d e r ) f 121 i f ( annotationFile ( f o l d e r ) . e x i s t s ( ) ) f return importAnnotation ( f o l d e r ) ; g F i l e [ ] img = f o l d e r . l i s t F i l e s ( ) ; Pattern floorNum = Pattern . compile (" f l o o r ([0 9]+)nn. png ") ; i n t floorCount = 0 ; f o r ( F i l e f : img ) f String name = f . getName ( ) ; Matcher matcher = floorNum . matcher (name) ; i f ( matcher . matches ( ) ) f i n t f l o o r = I n t e g e r . parseInt ( matcher . group (1) ) ; i f ( f l o o r > floorCount ) f floorCount = f l o o r ; g g g Building buil ding = new Building ( ) ; f o r ( i n t i = 0 ; i < floorCount ; i++) f bui ldin g . f l o o r s . put ( i , new Floor ( ) ) ; g setupZoneMap ( b uild ing ) ; loadFloorImages ( building , f o l d e r ) ; return bui ldin g ; g public s t a t i c void exportAnnotation ( Building building , F i l e save ) f Element root = new Element (" bu ildi ng ") ; Document doc = new Document ( root ) ; f o r ( I n t e g e r index : b uild ing . f l o o r s . keySet ( ) ) f writeFloor ( root , index , bui lding . getFloor ( index ) ) ; g XMLOutputter out = new XMLOutputter ( Format . getPrettyFormat ( ) ) ; try f out . 
output ( doc , new PrintWriter ( save ) ) ; g catch ( IOException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g g p r i v a t e s t a t i c void writeFloor ( Element root , I n t e g e r index , Floor f l o o r ) f Element elem = new Element (" f l o o r ") ; root . addContent ( elem ) ; 122 elem . addContent (new Element (" index ") . addContent ( index . toString ( ) ) ) ; f o r ( SensorLayer l a y e r : f l o o r . l a y e r s . values ( ) ) f writeLayer ( elem , l a y e r ) ; g g p r i v a t e s t a t i c void writeLayer ( Element root , SensorLayer l a y e r ) f Element elem = new Element (" l a y e r ") ; root . addContent ( elem ) ; elem . s e t A t t r i b u t e (" type " , l a y e r . getType ( ) . name ( ) ) ; f o r ( Zone zone : l a y e r . getZones ( ) ) f writeZone ( elem , zone ) ; g g p r i v a t e s t a t i c void writeZone ( Element root , Zone zone ) f Element zoneElem = new Element (" zone ") ; zoneElem . addContent (new Element (" l a b e l ") . setText ( zone . getLabel ( ) ) ) ; Polygon shape = zone . getZone ( ) ; Element shapeElem = new Element (" zone ") ; zoneElem . addContent ( shapeElem ) ; f o r ( i n t i = 0 ; i < shape . npoints ; i++) f shapeElem . addContent (new Element (" vertex ") . addContent (new Element (" x ") . setText ( String . valueOf ( shape . xpoints [ i ] ) ) ) . addContent (new Element (" y ") . setText ( String . valueOf ( shape . ypoints [ i ] ) ) ) ) ; g root . addContent ( zoneElem ) ; g @SuppressWarnings (" unchecked ") public s t a t i c Building importAnnotation ( F i l e f o l d e r ) f Building buil ding = new Building ( ) ; bu ildi ng . f l o o r s = new TreeMap() ; Document doc = n u l l ; try f doc = new SAXBuilder ( ) . build ( annotationFile ( f o l d e r ) ) ; f o r ( Element e l e : ( List)doc . getRootElement ( ) . getChildren ( ) ) f i f ( e l e . getName ( ) . equals (" s t a t i c ") ) f bui ldin g . s t a t i c I n f o . add ( r e a d S t a t i c ( e l e ) ) ; g e l s e i f ( e l e . getName ( ) . equals (" f l o o r ") ) f IndexedFloor f l o o r = readFloor ( e l e ) ; bui ldin g . f l o o r s . put ( f l o o r . index , f l o o r . f l o o r ) ; g e l s e f 123 throw new UnsupportedOperationException ("Unknown XML tag : " + e l e . getName ( ) ) ; g g g catch (JDOMException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g catch ( IOException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g setupZoneMap ( b uild ing ) ; loadFloorImages ( building , f o l d e r ) ; return bui ldin g ; g p r i v a t e s t a t i c void loadFloorImages ( Building building , F i l e f o l d e r ) f f o r ( i n t i : buil ding . f l o o r s . keySet ( ) )f tryf F i l e f l o o r P l a n F i l e = f l o o r F i l e ( f o l d e r , i +1) ; i f ( ! f l o o r P l a n F i l e . e x i s t s ( ) )f System . out . p r i n t l n (" f i l e : " + f l o o r P l a n F i l e + " does not e x i s t ! " ) ; g bui ldin g . getFloor ( i ) . image = ImageIO . read ( f l o o r P l a n F i l e ) ; gcatch ( Exception e )f System . out . p r i n t l n (" Cannot read f l o o r plans . Fatal e r r o r ") ; e . printStackTrace ( ) ; System . e x i t (0) ; g g//end f o r loop g p r i v a t e s t a t i c void setupZoneMap ( Building bui ldin g ) f bu ildi ng . zones = new TreeMap() ; f o r ( Floor f l o o r : b uild ing . f l o o r s . values ( ) ) f f o r ( SensorLayer l a y e r : f l o o r . l a y e r s . values ( ) ) f f o r ( Zone zone : l a y e r . getZones ( ) ) f bui ldin g . zones . put ( zone . 
getLabel ( ) , zone ) ; g g g g p r i v a t e s t a t i c String r e a d S t a t i c ( Element root ) f return root . getTextTrim ( ) ; g @SuppressWarnings (" unchecked ") 124 p r i v a t e s t a t i c IndexedFloor readFloor ( Element root ) f Floor f l o o r = new Floor ( ) ; I n t e g e r index = I n t e g e r . parseInt ( root . getChildText (" index ") ) ; f o r ( Element element : ( List)root . getChildren (" l a y e r ") ) f SensorLayer l a y e r = readLayer ( element ) ; f l o o r . l a y e r s . put ( l a y e r . getType ( ) , l a y e r ) ; g return new IndexedFloor ( f l o o r , index ) ; g @SuppressWarnings (" unchecked ") p r i v a t e s t a t i c SensorLayer readLayer ( Element root ) f SensorType type = SensorType . valueOf ( root . getAttribute (" type ") . getValue ( ) ) ; SensorLayer l a y e r = new SensorLayer ( type ) ; f o r ( Element element : ( List)root . getChildren ( ) ) f i f ( element . getName ( ) . equals (" zone ") ) f l a y e r . addZone ( readZone ( element ) ) ; g g return l a y e r ; g @SuppressWarnings (" unchecked ") p r i v a t e s t a t i c Zone readZone ( Element root ) f Zone zone = new Zone ( ) ; zone . setLabel ( root . getChildText (" l a b e l ") ) ; Polygon shape = zone . getZone ( ) ; f o r ( Element vertex : ( List)root . getChild (" zone ") . getChildren (" vertex ") ) f i n t x = I n t e g e r . parseInt ( vertex . getChildText (" x ") ) ; i n t y = I n t e g e r . parseInt ( vertex . getChildText (" y ") ) ; shape . addPoint (x , y ) ; g return zone ; g p r i v a t e s t a t i c c l a s s IndexedFloor f Floor f l o o r ; I n t e g e r index ; public IndexedFloor ( Floor f l o o r , I n t e g e r index ) f t h i s . f l o o r = f l o o r ; t h i s . index = index ; g g g 125 / . / f f a / annotation /ModeChangedListener . java / package f f a . annotation ; import f f a . model . SensorType ; public i n t e r f a c e ModeChangedListener f public void modeChanged ( AnnotationMode mode) ; public void layerChanged ( SensorType l a y e r ) ; g / . / f f a / annotation /AnnotationWindow . java / package f f a . annotation ; import i n f o . clearthought . layout . TableLayout ; import java . awt . EventQueue ; import java . awt . event . ActionEvent ; import java . awt . event . ActionListener ; import java . i o . F i l e ; import javax . swing . JFileChooser ; import javax . swing . JFrame ; import javax . swing . JMenu ; import javax . swing . JMenuBar ; import javax . swing . JMenuItem ; import f f a . model . Building ; import f f a . ui . MapDisplay ; import f f a . ui . U t i l ; @SuppressWarnings (" s e r i a l ") public c l a s s AnnotationWindow extends JFrame f p r i v a t e MapDisplay mapDisplay ; p r i v a t e AnnotationLayer aLayer ; p r i v a t e EditPanel editPanel ; p r i v a t e Toolbox toolbox ; 126 // Three columns , l e f t and r i g h t are 25%, one row , f i l l s screen p r i v a t e s t a t i c f i n a l double s i z e [ ] [ ] = ff.25 , TableLayout . FILL , .25g,fTableLayout . FILLgg; p r i v a t e F i l e l a s t = new F i l e (" s t a t i c I n f o ") ; public AnnotationWindow ( ) fg public void initGUI ( Building bui ldin g ) f getContentPane ( ) . removeAll ( ) ; U t i l . checkEDT ( ) ; / I n i t i a l i z e Components / editPanel = new EditPanel ( ) ; F i l e annotationFolder = new F i l e (" s t a t i c I n f o ") ; i f ( b uild ing == n u l l ) f bui ldin g = AnnotationIO . 
newAnnotation ( annotationFolder ) ; g mapDisplay = new MapDisplay ( this , b uild ing ) ; aLayer = new AnnotationLayer ( mapDisplay , editPanel ) ; toolbox = new Toolbox ( ) ; toolbox . addModeChangedListener ( aLayer ) ; JMenuBar menuBar = new JMenuBar ( ) ; JMenu menu = new JMenu(" F i l e ") ; menuBar . add (menu) ; JMenuItem item = new JMenuItem (" Save ") ; menu . add ( item ) ; item . addActionListener (new ActionListener ( ) f @Override public void actionPerformed ( ActionEvent e ) f JFileChooser chooser = new JFileChooser ( l a s t ) ; i n t r e t = chooser . showSaveDialog ( AnnotationWindow . t h i s ) ; i f ( r e t == JFileChooser .APPROVE OPTION) f AnnotationIO . exportAnnotation ( aLayer . getBuilding ( ) , chooser . g e t S e l e c t e d F i l e ( ) ) ; l a s t = chooser . g e t S e l e c t e d F i l e ( ) ; g g g) ; item = new JMenuItem (" Load ") ; menu . add ( item ) ; item . addActionListener (new ActionListener ( ) f @Override public void actionPerformed ( ActionEvent e ) f JFileChooser chooser = new JFileChooser ( l a s t ) ; chooser . setFileSelectionMode ( JFileChooser .DIRECTORIES ONLY) ; i n t r e t = chooser . showOpenDialog ( AnnotationWindow . t h i s ) ; i f ( r e t == JFileChooser .APPROVE OPTION) f Building buil ding = AnnotationIO . importAnnotation ( chooser . g e t S e l e c t e d F i l e ( ) ) ; 127 s e t V i s i b l e ( f a l s e ) ; initGUI ( bu ildi ng ) ; repaint ( ) ; s e t V i s i b l e ( true ) ; g g g) ; / I n i t i a l i z e Frame / t h i s . setJMenuBar (menuBar) ; // I b e l i e v e t h i s i s the standard screen s i z e we ? re working with t h i s . s e t S i z e (1024 , 768) ; // Centers the window t h i s . setLocationRelativeTo ( n u l l ) ; // This i s the main app window setDefaultCloseOperation ( JFrame .EXIT ON CLOSE) ; setLayout (new TableLayout ( s i z e ) ) ; getContentPane ( ) . add ( mapDisplay , "1 ,0 ,1 ,0") ; getContentPane ( ) . add ( toolbox , "0 ,0 ,0 ,0") ; getContentPane ( ) . add ( editPanel , "2 ,0 ,2 ,0") ; g public void initGUI ( ) f initGUI ( n u l l ) ; g public s t a t i c void main ( String [ ] args ) f f i n a l AnnotationWindow window = new AnnotationWindow ( ) ; EventQueue . invokeLater (new Runnable ( )f public void run ( ) f window . initGUI ( ) ; window . s e t V i s i b l e ( true ) ; g g) ; g g / . / f f a / annotation / AnnotationLayer . java / / Levon K. Mkrtchyan 128 This Component d i s p l a y s annotation s t u f f s and accepts annotation commands / package f f a . annotation ; import java . awt . Color ; import java . awt . Cursor ; import java . awt . Graphics ; import java . awt . Graphics2D ; import java . awt . Polygon ; import java . awt . Rectangle ; import java . awt . event . ComponentAdapter ; import java . awt . event . ComponentEvent ; import java . awt . event . ComponentListener ; import java . awt . event . KeyAdapter ; import java . awt . event . KeyEvent ; import java . awt . event . KeyListener ; import java . awt . event . MouseAdapter ; import java . awt . event . MouseEvent ; import java . awt . geom . AffineTransform ; import java . awt . geom . Line2D ; import java . awt . geom . Point2D ; import java . u t i l . I t e r a t o r ; import javax . swing . JComponent ; import f f a . model . Building ; import f f a . model . Floor ; import f f a . model . SensorLayer ; import f f a . model . SensorType ; import f f a . ui . 
MapDisplay ; public c l a s s AnnotationLayer extends JComponent implements ModeChangedListenerf s t a t i c f i n a l long serialVersionUID = 1L ; p r i v a t e EditPanel e d i t o r ; p r i v a t e Zone curZone = n u l l ; p r i v a t e i n t curZoneFloor = 0 ; p r i v a t e i n t curVertex = 1; p r i v a t e MapDisplay d i s p la y = n u l l ; p r i v a t e Building b uild ing ; // p r i v a t e I n t e g e r curFloor = 0 ; p r i v a t e SensorType l a y e r = SensorType .HEAT; p r i v a t e ComponentListener c l = new ComponentAdapter ( )f public void componentResized ( ComponentEvent e )f 129 AnnotationLayer . t h i s . s e t S i z e ( e . getComponent ( ) . getWidth ( ) , e . getComponent ( ) . getHeight ( ) ) ; g g; p r i v a t e KeyListener kl = new KeyAdapter ( )f public void keyTyped ( KeyEvent e )f i f ( e . getKeyChar ( )==KeyEvent .VK DELETE)f Zone nz = nextZone ( curZone ) ; i f ( curVertex != 1)f deleteVertex ( curZone , curVertex ) ; curVertex ; i f ( curVertex <0) curVertex=curZone . getZone ( ) . npoints 1; i f ( curZone . getZone ( ) . npoints==0)f curZone=nz ; curVertex= 1; g ge l s e i f ( curZone!= n u l l )f deleteZone ( curZone ) ; curZone=nz ; g ge l s e i f ( e . getKeyChar ( )==KeyEvent .VK TAB)f Zone nz ; i f ( e . isShiftDown ( ) )f nz = prevZone ( curZone ) ; ge l s ef nz = nextZone ( curZone ) ; g curVertex= 1; curZone=nz ; g d i s p la y . repaint ( ) ; g g; p r i v a t e MouseAdapter moveML = n u l l ; p r i v a t e MouseAdapter zoneML = new MouseAdapter ( )f public void mousePressed ( MouseEvent e )f i f ( curVertex== 1)f e d i t o r . c l e a r ( ) ; curZone = new Zone ( ) ; curZoneFloor=d i s p l a y . getCurrentFloor ( ) ; getBuilding ( ) . getLayer ( d is p l a y . getCurrentFloor ( ) , l a y e r ) . addZone ( curZone ) ; e d i t o r . e d i t ( curZone ) ; g curVertex = curZone . getZone ( ) . npoints ; Point2D pt = d i s p l a y . screenToMap ( e . getPoint ( ) ) ; curZone . getZone ( ) . addPoint ( ( i n t ) pt . getX ( ) , ( i n t ) pt . getY ( ) ) ; d i s p la y . repaint ( ) ; 130 g public void mouseClicked ( MouseEvent e )f i f ( e . getClickCount ( )==2) curVertex= 1; d i s p la y . repaint ( ) ; g public void mouseDragged ( MouseEvent e )f Polygon poly = curZone . getZone ( ) ; Point2D pt = d i s p l a y . screenToMap ( e . getPoint ( ) ) ; poly . xpoints [ curVertex ]=( i n t ) pt . getX ( ) ; poly . ypoints [ curVertex ]=( i n t ) pt . getY ( ) ; d i s p la y . repaint ( ) ; g g; p r i v a t e MouseAdapter editML = new MouseAdapter ( )f double zoneOffsetX = 0 f ; double zoneOffsetY = 0 f ; public void mousePressed ( MouseEvent e )f Point2D pt = d i s p l a y . screenToMap ( e . getPoint ( ) ) ; i f ( curZone!= n u l l )f Polygon poly = curZone . getZone ( ) ; f o r ( i n t n=0;nn ; i )f poly . xpoints [ i ]= poly . xpoints [ i 1]; poly . ypoints [ i ]= poly . ypoints [ i 1]; g poly . xpoints [ n]=( i n t ) pt . getX ( ) ; 131 poly . ypoints [ n]=( i n t ) pt . getY ( ) ; curVertex=n ; d i s pl a y . setCursor (new Cursor ( Cursor .MOVE CURSOR) ) ; d i s pl a y . repaint ( ) ; return ; g g curZone=n u l l ; curVertex= 1; e d i t o r . c l e a r ( ) ; SensorLayer sLayer = getBuilding ( ) . getLayer ( d i s p l a y . getCurrentFloor ( ) , l a y e r ) ; f o r ( Zone z : sLayer . getZones ( ) )f i f ( z . getZone ( ) . contains ( pt ) )f curZone=z ; curZoneFloor=d i s p l a y . getCurrentFloor ( ) ; Rectangle bounds = curZone . getZone ( ) . getBounds ( ) ; zoneOffsetX=pt . getX ( ) bounds . x ; zoneOffsetY=pt . getY ( ) bounds . 
y ; e d i t o r . e d i t ( z ) ; d i s p la y . setCursor (new Cursor ( Cursor .MOVE CURSOR) ) ; d i s p la y . repaint ( ) ; return ; g g d i s p la y . repaint ( ) ; g public void mouseReleased ( MouseEvent e )f d i s p la y . setCursor (new Cursor ( Cursor .DEFAULT CURSOR) ) ; g public void mouseDragged ( MouseEvent e )f Point2D pt = d i s p l a y . screenToMap ( e . getPoint ( ) ) ; i f ( curZone!= n u l l && curVertex== 1)f Rectangle bounds = curZone . getZone ( ) . getBounds ( ) ; curZone . getZone ( ) . t r a n s l a t e ( ( i n t ) ( pt . getX ( ) zoneOffsetX bounds . x ) , ( i n t ) ( pt . getY ( ) zoneOffsetY bounds . y ) ) ; d i s p la y . repaint ( ) ; ge l s e i f ( curZone!= n u l l && curVertex>=0)f curZone . getZone ( ) . xpoints [ curVertex ]=( i n t ) pt . getX ( ) ; curZone . getZone ( ) . ypoints [ curVertex ]=( i n t ) pt . getY ( ) ; d i s p la y . repaint ( ) ; g g g; public void modeChanged ( AnnotationMode mode)f 132 curZone=n u l l ; e d i t o r . c l e a r ( ) ; t h i s . removeMouseListener (moveML) ; t h i s . removeMouseMotionListener (moveML) ; t h i s . removeMouseListener (zoneML) ; t h i s . removeMouseMotionListener (zoneML) ; t h i s . removeMouseListener ( editML ) ; t h i s . removeMouseMotionListener ( editML ) ; switch (mode)f case MOVE: t h i s . addMouseListener (moveML) ; t h i s . addMouseMotionListener (moveML) ; t h i s . addMouseWheelListener (moveML) ; break ; case EDIT: t h i s . addMouseListener ( editML ) ; t h i s . addMouseMotionListener ( editML ) ; break ; case ZONE: t h i s . addMouseListener (zoneML) ; t h i s . addMouseMotionListener (zoneML) ; break ; g d i s pl a y . repaint ( ) ; g @Override public void layerChanged ( SensorType l a y e r ) f t h i s . l a y e r=l a y e r ; curZone=n u l l ; curVertex= 1; e d i t o r . c l e a r ( ) ; d i s pl a y . repaint ( ) ; g public AnnotationLayer ( MapDisplay display , EditPanel e d i t o r )f t h i s . e d i t o r = e d i t o r ; t h i s . d is p l a y = d i s p la y ; t h i s . setOpaque ( f a l s e ) ; d i s pl a y . add ( this , new I n t e g e r (1) ) ; moveML = d i s p l a y . mapMouseListener ; d i s pl a y . addComponentListener ( c l ) ; t h i s . setBackground (new Color (255 ,255 ,255 , Color .TRANSLUCENT) ) ; t h i s . setForeground (new Color (255 ,255 ,255 , Color .TRANSLUCENT) ) ; s et B ui l di n g ( d i s p l a y . bui ldin g ) ; layerChanged ( SensorType .SMOKE) ; / Sample Key L i s t e n e r Code / setFocusable ( true ) ; // Component needs focus f o r keys setFocusTraversalKeysEnabled ( f a l s e ) ; // Disable tab and s h i f t tab 133 // Request focus when mouse e n t e r s or when mouse i s c l i c k e d . // TODO: Not sure which of these two i s most appropriate addMouseListener (new MouseAdapter ( ) f @Override public void mouseEntered ( MouseEvent e ) f AnnotationLayer . t h i s . requestFocusInWindow ( ) ; //System . out . p r i n t l n (" hi ") ; g @Override public void mouseClicked ( MouseEvent e ) f AnnotationLayer . t h i s . requestFocusInWindow ( ) ; //System . out . p r i n t l n (" h a l l o ") ; g g) ; t h i s . addKeyListener ( kl ) ; g public void paint ( Graphics g )f i f ( curZoneFloor != d i s p l a y . getCurrentFloor ( ) ) curZone=n u l l ; Graphics2D g2d = ( Graphics2D ) g ; AffineTransform o r i g i n a l = g2d . getTransform ( ) ; g2d . transform ( d i s p l a y . getTransform ( ) ) ; f l o a t zoom = d i s pl a y . getZoom ( ) ; Color opaque = Color . black ; Color transparent ; switch ( l a y e r )f case HEAT: opaque = Color . 
red ; break ; case SMOKE: opaque = Color . gray ; break ; case FLOW: opaque = Color . blue ; break ; case MANUAL: opaque = Color . green ; break ; g transparent = new Color ( opaque . getRed ( ) , opaque . getGreen ( ) , opaque . getBlue ( ) ,50) ; i f ( curZone!= n u l l )f g . setColor ( transparent ) ; g . f i l l P o l y g o n ( curZone . getZone ( ) ) ; g 134 g . setColor ( opaque ) ; SensorLayer sLayer = getBuilding ( ) . getLayer ( d i s p l a y . getCurrentFloor ( ) , l a y e r ) ; f o r ( Zone z : sLayer . getZones ( ) )f g . drawPolygon ( z . getZone ( ) ) ; f o r ( i n t n=0;n=0)f g . setColor ( Color . black ) ; g . f i l l R e c t ( curZone . getZone ( ) . xpoints [ curVertex ] ( i n t ) (2/zoom) , curZone . getZone ( ) . ypoints [ curVertex ] ( i n t ) (2/zoom) , ( i n t ) (5/ zoom) , ( i n t ) (5/zoom) ) ; g g2d . setTransform ( o r i g i n a l ) ; g / public void setLayers (Map l a y e r s ) f t h i s . l a y e r s = l a y e r s ; curZone = n u l l ; e d i t o r . c l e a r ( ) ; d i s pl a y . repaint ( ) ; g / p r i v a t e Zone prevZone ( Zone z )f Zone r e t=n u l l ; I t e r a t o r i t r = getBuilding ( ) . getLayer ( d i s p la y . getCurrentFloor ( ) , l a y e r ) . getZones ( ) . i t e r a t o r ( ) ; while ( i t r . hasNext ( ) )f Zone next = i t r . next ( ) ; i f ( next==z && r e t != n u l l ) return r e t ; e l s e r e t=next ; g i t r = getBuilding ( ) . getLayer ( d i s p la y . getCurrentFloor ( ) , l a y e r ) . getZones ( ) . i t e r a t o r ( ) ; i f ( i t r . hasNext ( ) && i t r . next ( )==z ) return r e t ; e l s e return n u l l ; g p r i v a t e Zone nextZone ( Zone z )f i f ( z==n u l l ) return n u l l ; 135 Zone r e t=n u l l ; I t e r a t o r i t r = getBuilding ( ) . getLayer ( d i s p la y . getCurrentFloor ( ) , l a y e r ) . getZones ( ) . i t e r a t o r ( ) ; while ( i t r . hasNext ( ) )f i f ( r e t==z ) return i t r . next ( ) ; e l s e r e t=i t r . next ( ) ; g i f ( r e t==z ) return getBuilding ( ) . getLayer ( d i s p la y . getCurrentFloor ( ) , l a y e r ) . getZones ( ) . i t e r a t o r ( ) . next ( ) ; e l s e return n u l l ; g p r i v a t e void deleteZone ( Zone z )f getBuilding ( ) . getLayer ( d is p l a y . getCurrentFloor ( ) , l a y e r ) . removeZone ( z ) ; g p r i v a t e void deleteVertex ( Zone z , i n t v )f Polygon p = z . getZone ( ) ; i f (p . npoints<=v jj v<0) return ; i f (p . npoints==1)f p . npoints =0; deleteZone ( z ) ; return ; g p . npoints ; f o r ( i n t i=v ; i

events ;// t h i s f i e l d not used by AnnotationLayer , i t i s only used by MetaInfoLayer in the main v i s u a l i z a t i o n software . f i n a l s t a t i c long displayDuration = 15000; public void addEvent ( Event e )f 138 events . add ( e ) ; g public Zone ( ) f events = new TreeSet(new Comparator()f public i n t compare ( Event a , Event b)f return a . getTime ( ) . compareTo (b . getTime ( ) ) ; g g) ; zone = new Polygon ( ) ; g public i n t getOpacity ( DateTime dt )f long s y s M i l l i s = dt . g e t M i l l i s ( ) ; Event recentEvent = n u l l ; I t e r a t o r i t r = events . d e s c e n d i n g I t e r a t o r ( ) ; while ( i t r . hasNext ( ) && recentEvent==n u l l )f Event e = i t r . next ( ) ; i f ( s y s M i l l i s > e . getTime ( ) . getTime ( ) ) recentEvent = e ; g i f ( recentEvent==n u l l ) // jj s y s M i l l i s recentEvent . getTime ( ) . getTime ( )>displayDuration ) return 0 ; return 200;//( i n t ) ( ( displayDuration s y s M i l l i s+recentEvent . getTime ( ) . getTime ( ) ) 200/ displayDuration ) ; g public Polygon getZone ( ) f return zone ; g protected String getLabel ( ) f return l a b e l ; g protected void setLabel ( String l a b e l ) f t h i s . l a b e l = l a b e l ; g g / . / f f a / annotation /Toolbox . java / package f f a . annotation ; 139 import i n f o . clearthought . layout . TableLayout ; import java . awt . event . ActionEvent ; import java . awt . event . ActionListener ; import java . u t i l . ArrayList ; import java . u t i l . L i s t ; import javax . swing . ButtonGroup ; import javax . swing . JPanel ; import javax . swing . JToggleButton ; import f f a . model . SensorType ; import f f a . ui . U t i l ; @SuppressWarnings (" s e r i a l ") public c l a s s Toolbox extends JPanel implements ActionListener f p r i v a t e List modeChangedList = n u l l ; public Toolbox ( ) f modeChangedList = new ArrayList() ; U t i l . checkEDT ( ) ; setLayout (new TableLayout (new double [ ] [ ]ffTableLayout . FILL g,f. 5 , . 5gg) ) ; JPanel buttonPanel = new JPanel ( new TableLayout (new double [ ] [ ]f f. 5 , . 5g, fTableLayout .PREFERRED, TableLayout .PREFERRED, TableLayout . PREFERREDg g) ) ; JPanel layerPanel = new JPanel ( ) ; add ( buttonPanel , "0 ,0") ; add ( layerPanel , "0 ,1") ; ButtonGroup toolboxGroup = new ButtonGroup ( ) ; ToolboxButton button = new ToolboxButton ( AnnotationMode .MOVE, " Move ") ; button . addActionListener ( t h i s ) ; toolboxGroup . add ( button ) ; buttonPanel . add ( button , "0 ,0") ; button . doClick ( ) ;// s e l e c t s the move t o o l by d e f a u l t button = new ToolboxButton ( AnnotationMode . EDIT, " Edit ") ; button . addActionListener ( t h i s ) ; toolboxGroup . add ( button ) ; buttonPanel . add ( button , "1 ,0") ; button = new ToolboxButton ( AnnotationMode .ZONE, "Zone ") ; button . addActionListener ( t h i s ) ; toolboxGroup . add ( button ) ; buttonPanel . add ( button , "1 ,1") ; 140 / button = new ToolboxButton ( AnnotationMode .DELETE, " Delete ") ; button . addActionListener ( t h i s ) ; toolboxGroup . add ( button ) ; buttonPanel . add ( button , "1 ,2") ; / ButtonGroup layerGroup = new ButtonGroup ( ) ; f o r ( SensorType type : SensorType . values ( ) ) f LayerButton layerButton = new LayerButton ( type ) ; layerButton . addActionListener ( t h i s ) ; layerPanel . add ( layerButton ) ; layerGroup . add ( layerButton ) ; g layerGroup . getElements ( ) . nextElement ( ) . doClick ( ) ; // s e l e c t s a l a y e r by d e f a u l t we don ? 
t r e a l l y care which one g public void actionPerformed ( ActionEvent e ) f i f ( e . getSource ( ) i n s t a n c e o f ToolboxButton ) f ToolboxButton source = ( ToolboxButton ) e . getSource ( ) ; f o r ( ModeChangedListener l i s t e n e r : modeChangedList ) f l i s t e n e r . modeChanged ( source . getMode ( ) ) ; g g e l s e i f ( e . getSource ( ) i n s t a n c e o f LayerButton ) f LayerButton button = ( LayerButton ) e . getSource ( ) ; f o r ( ModeChangedListener l i s t e n e r : modeChangedList ) f l i s t e n e r . layerChanged ( button . type ) ; g System . out . p r i n t l n ( button . type ) ; g g public void addModeChangedListener ( ModeChangedListener l ) f modeChangedList . add ( l ) ; g p r i v a t e c l a s s ToolboxButton extends JToggleButtonf p r i v a t e AnnotationMode mode ; public ToolboxButton ( AnnotationMode mode , String text ) f t h i s . mode = mode ; t h i s . setText ( text ) ; g public AnnotationMode getMode ( ) f return mode ; g g p r i v a t e c l a s s LayerButton extends JToggleButton f public f i n a l SensorType type ; public LayerButton ( SensorType type ) f t h i s . type = type ; setText ( type . t i t l e ) ; 141 g g g / . / f f a / annotation /AnnotationMode . java / package f f a . annotation ; public enum AnnotationMode f MOVE, EDIT, SENSOR, ZONE, LINK, DELETE; g / . / f f a /model/ Sensor . java / package f f a . model ; / @author jwu / public abstract c l a s s Sensor f String ID = ""; Location l o c a t i o n = n u l l ; boolean alarm = f a l s e ; boolean trouble = f a l s e ; public Sensor ( Location l o c a t i o n ) f super ( ) ; t h i s . l o c a t i o n = l o c a t i o n ; g public Sensor ( String id ) f super ( ) ; t h i s . ID = id ; g public Sensor ( Location loc , String id ) f super ( ) ; 142 t h i s . l o c a t i o n = l o c ; t h i s . ID = id ; g public boolean inAlarm ( ) f return alarm ; g public String getID ( ) f return ID ; g public boolean inTrouble ( ) f return trouble ; g public void setAlarm ( boolean s t a t u s ) f t h i s . alarm = s t a t u s ; g public void setTrouble ( boolean s t a t u s ) f t h i s . alarm = trouble ; g g / . / f f a /model/RS232Demo . java / package f f a . model ; import java . i o . ; import gnu . i o . ; / Demo of input from RS232 . This r e q u i r e s a proper implementation of the javax .comm l i b r a r y f o r a given operating system . The one c u r r e n t l y used i s the Windows version , found in RXTXcomm. j a r / public c l a s s RS232Demo f public s t a t i c void main ( String [ ] args ) f CommPortIdentifier ident = n u l l ; String portName = "COM3" ; 143 i f ( args . length > 0) portName = args [ 0 ] ; tryf ident = CommPortIdentifier . g e t P o r t I d e n t i f i e r ( portName ) ; g catch ( NoSuchPortException e ) f System . e r r . p r i n t l n ("No such port : " + portName ) ; System . e x i t (0) ; g i f ( ident == n u l l jj ident . isCurrentlyOwned ( ) ) f System . e r r . p r i n t l n (" Port " + portName + " in use ") ; System . e x i t (0) ; g e l s e f tryf CommPort commPort = ident . open ("Demo" , 2000) ; i f ( commPort i n s t a n c e o f S e r i a l P o r t ) f S e r i a l P o r t s e r i a l P o r t = ( S e r i a l P o r t ) commPort ; s e r i a l P o r t . setSerialPortParams (2400 , S e r i a l P o r t . DATABITS 7 , S e r i a l P o r t . STOPBITS 1 , S e r i a l P o r t .PARITY EVEN) ; InputStream in = s e r i a l P o r t . getInputStream ( ) ; s e r i a l P o r t . addEventListener (new SerialReader ( in ) ) ; s e r i a l P o r t . 
notifyOnDataAvailable ( true ) ; g e l s e f System . out . p r i n t l n (" Error : Only s e r i a l ports are handled by t h i s example . " ) ; g g catch ( Exception e ) f e . printStackTrace ( ) ; g g g public s t a t i c c l a s s SerialReader implements SerialPortEventListener f p r i v a t e InputStream in ; p r i v a t e byte [ ] b u f f e r = new byte [ 1 0 2 4 ] ; public SerialReader ( InputStream in ) f t h i s . in = in ; g 144 public void s e r i a l E v e n t ( SerialPortEvent arg0 ) f i n t data ; String message ; try f i n t len = 0 ; while ( ( data = in . read ( ) ) > 1 ) f i f ( data == ?nn ? ) f break ; g b u f f e r [ len++] = ( byte ) data ; g message = parseString (new String ( buffer , 0 , len ) ) ; i f ( message != n u l l ) System . out . p r i n t l n ("ALERT: " + message ) ; g catch ( IOException e ) f e . printStackTrace ( ) ; System . e x i t ( 1) ; g g public String parseString ( String input ) f String s t r = n u l l ; String [ ] inputArray ; inputArray = input . s p l i t ("nns +", 7) ; i f ( inputArray [ 0 ] . equals ("ALARM: " ) ) f s t r = inputArray [ 1 ] + " alarm with ID : " + inputArray [ 2 ] + " in zone " + inputArray [ 3 ] + " went o f f at " + inputArray [ 4 ] + " on " + inputArray [ 5 ] . s u bs tr i ng (0 ,2) + "/" + inputArray [ 5 ] . s ub s tr in g (2 ,4) + "/" + inputArray [ 5 ] . s ub st r in g (4 ,6) ; g e l s e i f ( inputArray [ 0 ] . equals ("TROUBL") ) f i f ( inputArray . length == 7) s t r = "System in trouble at " + inputArray [ 4 ] + " on " + inputArray [ 5 ] . s u bs tr i ng (0 ,2) + "/" + inputArray [ 5 ] . s ub s tr in g (2 ,4) + "/" + inputArray [ 5 ] . s ub s tr in g (4 ,6) ; e l s e i f ( inputArray . length == 6) s t r = "System in trouble on " + inputArray [ 4 ] . su bs t ri ng (0 ,2) + "/" + inputArray [ 4 ] . s ub s tr in g (2 ,4) + "/" + inputArray [ 4 ] . s ub s tr in g (4 ,6) ; 145 e l s e i f ( inputArray . length < 4) f s t r = "System in trouble " ; g g return s t r ; g g g / . / f f a /model/ F i r e F i l t e r . java / package f f a . model ; / I n t e r f a c e f o r f i r e f i l t e r s . This d e c l a r e s only the g e n e r i c f i l t e r ( ) method , which does the p r e p r o c e s s i n g with help from i n t e r n a l v a r i a b l e s . / public i n t e r f a c e F i r e F i l t e r f public Alert f i l t e r ( Event e ) ; g / . / f f a /model/BACNetPluginRip . java / / ============================================================================ GNU Lesser General Public License ============================================================================ Copyright (C) 2006 2009 Serotonin Software Technologies Inc . http :// s e r o t o n i n s o f t w a r e . com @author Matthew Lohbihler This l i b r a r y i s f r e e software ; you can r e d i s t r i b u t e i t and/ or 146 modify i t under the terms of the GNU Lesser General Public License as published by the Free Software Foundation ; e i t h e r v er si on 2.1 of the License , or ( at your option ) any l a t e r v er si on . This l i b r a r y i s d i s t r i b u t e d in the hope that i t w i l l be useful , but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License f o r more d e t a i l s . You should have r e c e i v e d a copy of the GNU Lesser General Public License along with t h i s l i b r a r y ; i f not , write to the Free Software Foundation , Inc . , 59 Temple Place , Suite 330 , Boston , MA 02111 1307 , USA. / package f f a . 
model ; import java . i o . IOException ; import java . u t i l . ArrayList ; import java . u t i l . L i s t ; import java . u t i l .Map; import java . u t i l . Set ; import java . u t i l . TreeMap ; import java . u t i l . TreeSet ; import com . s e r o t o n i n . bacnet4j . LocalDevice ; import com . s e r o t o n i n . bacnet4j . RemoteDevice ; import com . s e r o t o n i n . bacnet4j . RemoteObject ; import com . s e r o t o n i n . bacnet4j . event . DeviceEventListener ; import com . s e r o t o n i n . bacnet4j . exception . BACnetException ; import com . s e r o t o n i n . bacnet4j . obj . BACnetObject ; import com . s e r o t o n i n . bacnet4j . s e r v i c e . confirmed . R e i n i t i a l i z e D e v i c e R e q u e s t . R e i n i t i a l i z e d S t a t e O f D e v i c e ; import com . s e r o t o n i n . bacnet4j . s e r v i c e . unconfirmed . WhoIsRequest ; import com . s e r o t o n i n . bacnet4j . type . Encodable ; import com . s e r o t o n i n . bacnet4j . type . constructed . Choice ; import com . s e r o t o n i n . bacnet4j . type . constructed . ObjectPropertyReference ; import com . s e r o t o n i n . bacnet4j . type . constructed . PropertyValue ; import com . s e r o t o n i n . bacnet4j . type . constructed . SequenceOf ; import com . s e r o t o n i n . bacnet4j . type . constructed . TimeStamp ; import com . s e r o t o n i n . bacnet4j . type . enumerated . EventState ; import com . s e r o t o n i n . bacnet4j . type . enumerated . EventType ; import com . s e r o t o n i n . bacnet4j . type . enumerated . MessagePriority ; import com . s e r o t o n i n . bacnet4j . type . enumerated . NotifyType ; import com . s e r o t o n i n . bacnet4j . type . enumerated . P r o p e r t y I d e n t i f i e r ; import com . s e r o t o n i n . bacnet4j . type . n o t i f i c a t i o n P a r a m e t e r s . NotificationParameters ; import com . s e r o t o n i n . bacnet4j . type . p r i m i t i v e . Boolean ; import com . s e r o t o n i n . bacnet4j . type . p r i m i t i v e . CharacterString ; import com . s e r o t o n i n . bacnet4j . type . p r i m i t i v e . O b j e c t I d e n t i f i e r ; import com . s e r o t o n i n . bacnet4j . type . p r i m i t i v e . UnsignedInteger ; import com . s e r o t o n i n . bacnet4j . u t i l . PropertyReferences ; import com . s e r o t o n i n . bacnet4j . u t i l . PropertyValues ; / 147 Discovers and d e v ic e s and p ri nt a l l p r o p e r t i e s of a l l o b j e c t s found . t h i s i s done by using P r o p e r t y I d e n t i f i e r . a l l so the Device w i l l send a l l propertys that are s e t . i f you want p o l l a l l PropertyId f@link ReadPropertyRangeTestg. @author Matthew Lohbihler @author Arne P l s e / public c l a s s BACNetPluginRip f public s t a t i c String BROADCAST ADDRESS = " 1 9 2 . 1 6 8 . 1 . 2 5 5 " ; / / " 1 2 7 . 0 . 0 . 2 5 5 " ; p r i v a t e LoopDevice loopDevice ; p r i v a t e f i n a l LocalDevice l o c a l D e v i c e ; // remote de v i c e s found p r i v a t e f i n a l List remoteDevices = new ArrayList< RemoteDevice>() ; //maps d e v i ce s to s t a t e s p r i v a t e Map deviceMap = new TreeMap< RemoteDevice , String >() ; public BACNetPluginRip ( String broadcastAddress , i n t port ) throws IOException f l o c a l D e v i c e = new LocalDevice (1 , broadcastAddress ) ; l o c a l D e v i c e . setPort ( port ) ; l o c a l D e v i c e . getEventHandler ( ) . 
addListener (new DeviceEventListener ( ) f public void l i s t e n e r E x c e p t i o n ( Throwable e ) f System . out . p r i n t l n (" DiscoveryTest l i s t e n e r E x c e p t i o n ") ; g public void iAmReceived ( RemoteDevice d) f System . out . p r i n t l n (" DiscoveryTest iAmReceived ! " ) ; remoteDevices . add (d) ; synchronized ( BACNetPluginRip . t h i s ) f BACNetPluginRip . t h i s . n o t i f y A l l ( ) ; g g public boolean allowPropertyWrite ( BACnetObject obj , PropertyValue pv ) f System . out . p r i n t l n (" DiscoveryTest allowPropertyWrite ") ; return true ; g public void propertyWritten ( BACnetObject obj , PropertyValue pv ) f System . out . p r i n t l n (" DiscoveryTest propertyWritten ") ; g public void iHaveReceived ( RemoteDevice d , RemoteObject o ) f System . out . p r i n t l n (" DiscoveryTest iHaveReceived ") ; 148 g public void covNotificationReceived ( UnsignedInteger s u b s c r i b e r P r o c e s s I d e n t i f i e r , RemoteDevice i n i t i a t i n g D e v i c e , O b j e c t I d e n t i f i e r monitoredObjectIdentifier , UnsignedInteger timeRemaining , SequenceOf l i s t O f V a l u e s ) f System . out . p r i n t l n (" DiscoveryTest covNotificationReceived ") ; g public void e ven tNo tif ica ti onR ece ive d ( UnsignedInteger p r o c e s s I d e n t i f i e r , RemoteDevice i n i t i a t i n g D e v i c e , O b j e c t I d e n t i f i e r e v e n t O b j e c t I d e n t i f i e r , TimeStamp timeStamp , UnsignedInteger n o t i f i c a t i o n C l a s s , UnsignedInteger p r i o r i t y , EventType eventType , CharacterString messageText , NotifyType notifyType , Boolean ackRequired , EventState fromState , EventState toState , NotificationParameters eventValues ) f System . out . p r i n t l n (" BACNetPlugin ev ent No tif ica tio nRe ce ive d : ID=" + p r o c e s s I d e n t i f i e r . longValue ( ) + " EventType= " + eventType . toString ( ) + " fromState=" + fromState . TYPE ID + " toState= " + toState . TYPE ID) ; g public void textMessageReceived ( RemoteDevice textMessageSourceDevice , Choice messageClass , MessagePriority messagePriority , CharacterString message ) f System . out . p r i n t l n (" DiscoveryTest textMessageReceived ") ; g public void privateTransferReceived ( UnsignedInteger vendorId , UnsignedInteger serviceNumber , Encodable serviceParameters ) f System . out . p r i n t l n (" DiscoveryTest privateTransferReceived ") ; g @Override public void r e i n i t i a l i z e D e v i c e ( R e i n i t i a l i z e d S t a t e O f D e v i c e arg0 ) f // TODO Auto generated method stub g g) ; l o c a l D e v i c e . i n i t i a l i z e ( ) ; g / 149 Send a WhoIs request and wait f o r the f i r s t to answer @throws java . lang . Exception / public void doDiscover ( ) throws Exception f // Who i s System . out . p r i n t l n (" Send Broadcast WhoIsRequest ( ) ") ; // Send the broadcast to the c o r r e c t port of the LoopDevice ! ! ! // l o c a l D e v i c e . sendBroadcast ( loopDevice . getPort ( ) , new WhoIsRequest ( null , n u l l ) ) ; l o c a l D e v i c e . sendBroadcast (47808 , new WhoIsRequest ( null , n u l l ) ) ; // wait f o r n o t i f i c a t i o n in iAmReceived ( ) Timeout 5 sec synchronized ( t h i s ) f f i n a l long s t a r t = System . currentTimeMillis ( ) ; t h i s . wait (5000) ; System . out . p r i n t l n (" waited f o r iAmReceived : " + ( System . currentTimeMillis ( ) s t a r t ) + " ms") ; g // An other way to get to the l i s t of d e vi c e s // return l o c a l D e v i c e . 
    @SuppressWarnings("unchecked")
    private void printDevices() throws BACnetException {
        for (RemoteDevice d : remoteDevices) {
            localDevice.getExtendedDeviceInformation(d);
            List<ObjectIdentifier> oids = ((SequenceOf<ObjectIdentifier>) localDevice.sendReadPropertyAllowNull(
                    d, d.getObjectIdentifier(), PropertyIdentifier.objectList)).getValues();

            // xxx (10/19)
            System.out.println("removeDevice: " + d);

            // and now from all objects under the device object >> ai0, ai1, bi0, bi1...
            for (ObjectIdentifier oid : oids) {
                PropertyReferences refs = new PropertyReferences();
                // add the property references of the "device object" to the list
                refs.add(d.getObjectIdentifier(), PropertyIdentifier.all);
                refs.add(oid, PropertyIdentifier.all);

                System.out.println("Start read properties");
                final long start = System.currentTimeMillis();
                System.out.println("JWU: refs = " + refs.size()); // xxx
                PropertyValues pvs = localDevice.readProperties(d, refs);
                System.out.println(String.format("Properties read done in %d ms", System.currentTimeMillis() - start));

                String temp = printObject(d.getObjectIdentifier(), pvs);
                System.out.println("print1: " + temp);
                temp = printObject(oid, pvs);
                System.out.println("print2: " + temp);
            }
        }
        System.out.println("Remote devices done...");
    }

    // xxx hacked to return state (10/19)
    private String printObject(ObjectIdentifier oid, PropertyValues pvs) {
        System.out.println(String.format("\t%s", oid));
        Set<String> testing = new TreeSet<String>();
        testing.add("Object identifier");
        testing.add("Object type");
        testing.add("Object name");
        testing.add("state");
        testing.add("value");
        String temp;
        for (ObjectPropertyReference opr : pvs) {
            // if (oid.equals(opr.getObjectIdentifier())) {
            //     System.out.println(String.format("\t\t%s = %s", opr.getPropertyIdentifier().toString(),
            //             pvs.getNoErrorCheck(opr)));
            // }
            temp = opr.getPropertyIdentifier().toString();
            for (String s : testing) {
                if (temp.contains(s))
                    System.out.println(String.format("\t\t%s = %s", opr.getPropertyIdentifier().toString(),
                            pvs.getNoErrorCheck(opr)));
                if (temp.contains("state"))
                    return pvs.getNoErrorCheck(opr).toString();
            }
        }
        return "No State";
    }

    /**
     * Note same Broadcast address, but different ports!!!
     *
     * @param args
     * @throws java.lang.Exception
     */
    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        BACNetPluginRip dt = new BACNetPluginRip(BROADCAST_ADDRESS, 47808);
        try {
            dt.setLoopDevice(new LoopDevice(BROADCAST_ADDRESS, 47808 + 1));
        } catch (RuntimeException e) {
            dt.localDevice.terminate();
            throw e;
        }
        // try {
        dt.doDiscover();
        dt.printDevices();
        // } finally {
        //     dt.localDevice.terminate();
        //     System.out.println("Cleanup loopDevice");
        //     dt.getLoopDevice().doTerminate();
        // }
        while (true) {
        }
    }

    /** @return the loopDevice */
    public LoopDevice getLoopDevice() { return loopDevice; }

    /** @param loopDevice the loopDevice to set */
    public void setLoopDevice(LoopDevice loopDevice) { this.loopDevice = loopDevice; }
}

/* ./ffa/model/SensorLayer.java */
package ffa.model;

import java.util.Comparator;
import java.util.TreeSet;
import java.util.Collection;
import java.awt.Rectangle;

import ffa.annotation.Zone;

public class SensorLayer {
    private SensorType type;
    private Collection<Zone> zoneList;

    public SensorLayer(SensorType type) {
        this.type = type;
        zoneList = new TreeSet<Zone>(new Comparator<Zone>() {
            public int compare(Zone a, Zone b) {
                Rectangle aBounds = a.getZone().getBounds();
                Rectangle bBounds = b.getZone().getBounds();
                int aCenterX = aBounds.x + aBounds.width / 2;
                int aCenterY = aBounds.y + aBounds.height / 2;
                int bCenterX = bBounds.x + bBounds.width / 2;
                int bCenterY = bBounds.y + bBounds.height / 2;
                if ((bCenterX > aCenterX) && (bCenterY > aBounds.y)) return 1;
                if ((bCenterX < aCenterX) && (bCenterY < aBounds.y)) return -1;
                if (aCenterY > bCenterY) return 1;
                return new Integer(bCenterX).compareTo(new Integer(aCenterX));
            }
        });
    }

    public void addZone(Zone zone) { zoneList.add(zone); }

    public void removeZone(Zone zone) { zoneList.remove(zone); }

    public Iterable<Zone> getZones() { return zoneList; }

    public SensorType getType() { return type; }
}

/* ./ffa/model/LoopDevice.java */
/*
 * ============================================================================
 * GNU Lesser General Public License
 * ============================================================================
 *
 * Copyright (C) 2006-2009 Serotonin Software Technologies Inc. http://serotoninsoftware.com
 * @author Matthew Lohbihler
 *
 * This library is free software; you can redistribute it and/or modify it under the terms of
 * the GNU Lesser General Public License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
 * without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
 * See the GNU Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License along with this
 * library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330,
 * Boston, MA 02111-1307, USA.
 */
package ffa.model;

import java.io.IOException;

import com.serotonin.bacnet4j.LocalDevice;
import com.serotonin.bacnet4j.RemoteDevice;
import com.serotonin.bacnet4j.RemoteObject;
import com.serotonin.bacnet4j.event.DeviceEventListener;
import com.serotonin.bacnet4j.exception.BACnetServiceException;
import com.serotonin.bacnet4j.obj.BACnetObject;
import com.serotonin.bacnet4j.service.confirmed.ReinitializeDeviceRequest.ReinitializedStateOfDevice;
import com.serotonin.bacnet4j.type.Encodable;
import com.serotonin.bacnet4j.type.constructed.Choice;
import com.serotonin.bacnet4j.type.constructed.PropertyValue;
import com.serotonin.bacnet4j.type.constructed.SequenceOf;
import com.serotonin.bacnet4j.type.constructed.TimeStamp;
import com.serotonin.bacnet4j.type.enumerated.BinaryPV;
import com.serotonin.bacnet4j.type.enumerated.EngineeringUnits;
import com.serotonin.bacnet4j.type.enumerated.EventState;
import com.serotonin.bacnet4j.type.enumerated.EventType;
import com.serotonin.bacnet4j.type.enumerated.MessagePriority;
import com.serotonin.bacnet4j.type.enumerated.NotifyType;
import com.serotonin.bacnet4j.type.enumerated.ObjectType;
import com.serotonin.bacnet4j.type.enumerated.PropertyIdentifier;
import com.serotonin.bacnet4j.type.enumerated.Reliability;
import com.serotonin.bacnet4j.type.notificationParameters.NotificationParameters;
import com.serotonin.bacnet4j.type.primitive.Boolean;
import com.serotonin.bacnet4j.type.primitive.CharacterString;
import com.serotonin.bacnet4j.type.primitive.ObjectIdentifier;
import com.serotonin.bacnet4j.type.primitive.Real;
import com.serotonin.bacnet4j.type.primitive.UnsignedInteger;

/**
 * software only device default local loop ;)
 *
 * @author mlohbihler
 * @author aploese
 */
public class LoopDevice implements Runnable {
    public static void main(String[] args) throws Exception {
        LoopDevice ld = new LoopDevice("127.0.0.255", LocalDevice.DEFAULT_PORT + 1);
        Thread.sleep(12000); // wait 2 min
        ld.doTerminate();
    }

    private boolean terminate;
    private LocalDevice localDevice;
    private BACnetObject ai0;
    private BACnetObject ai1;
    private BACnetObject bi0;
    private BACnetObject bi1;
    private BACnetObject mso0;
    private BACnetObject ao0;

    public LoopDevice(String broadcastAddress, int port) throws BACnetServiceException, IOException {
        localDevice = new LocalDevice(1968, broadcastAddress);
        try {
            localDevice.setPort(port);
            localDevice.getEventHandler().addListener(new DeviceEventListener() {
                public void listenerException(Throwable e) { System.out.println("loopDevice listenerException"); }

                public void iAmReceived(RemoteDevice d) { System.out.println("loopDevice iAmReceived"); }

                public boolean allowPropertyWrite(BACnetObject obj, PropertyValue pv) {
                    System.out.println("loopDevice allowPropertyWrite");
                    return true;
                }

                public void propertyWritten(BACnetObject obj, PropertyValue pv) { System.out.println("loopDevice propertyWritten"); }

                public void iHaveReceived(RemoteDevice d, RemoteObject o) { System.out.println("loopDevice iHaveReceived"); }

                public void covNotificationReceived(UnsignedInteger subscriberProcessIdentifier,
                        RemoteDevice initiatingDevice, ObjectIdentifier monitoredObjectIdentifier,
                        UnsignedInteger timeRemaining, SequenceOf<PropertyValue> listOfValues) {
                    System.out.println("loopDevice covNotificationReceived");
                }

                public void eventNotificationReceived(UnsignedInteger processIdentifier,
                        RemoteDevice initiatingDevice, ObjectIdentifier eventObjectIdentifier,
                        TimeStamp timeStamp, UnsignedInteger notificationClass, UnsignedInteger priority,
                        EventType eventType, CharacterString messageText, NotifyType notifyType,
                        Boolean ackRequired, EventState fromState, EventState toState,
                        NotificationParameters eventValues) {
                    System.out.println("loopDevice eventNotificationReceived");
                }

                public void textMessageReceived(RemoteDevice textMessageSourceDevice, Choice messageClass,
                        MessagePriority messagePriority, CharacterString message) {
                    System.out.println("loopDevice textMessageReceived");
                }

                public void privateTransferReceived(UnsignedInteger vendorId, UnsignedInteger serviceNumber,
                        Encodable serviceParameters) {
                    System.out.println("loopDevice privateTransferReceived");
                }

                @Override
                public void reinitializeDevice(ReinitializedStateOfDevice arg0) {
                    // TODO Auto-generated method stub
                }
            });

            // for valid property values with valid datatypes see com.serotonin.bacnet4j.obj.ObjectProperties
            // and there look for the big static block at the end;

            // properties of device object
            localDevice.getConfiguration().setProperty(PropertyIdentifier.modelName,
                    new CharacterString("BACnet4J LoopDevice"));

            // Set up a few objects.
            ai0 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.analogInput));
            // mandatory properties
            ai0.setProperty(PropertyIdentifier.objectName, new CharacterString("G1-RLT03-TM-01"));
            // this is a cryptic encoded name of a temp sensor from a drawing... (ahm. actually taken from a book ;))
            ai0.setProperty(PropertyIdentifier.presentValue, new Real(11));
            ai0.setProperty(PropertyIdentifier.outOfService, new Boolean(false));
            ai0.setProperty(PropertyIdentifier.units, EngineeringUnits.degreesCelsius);
            // some optional properties
            ai0.setProperty(PropertyIdentifier.description, new CharacterString("temperature"));
            ai0.setProperty(PropertyIdentifier.deviceType, new CharacterString("random values"));
            ai0.setProperty(PropertyIdentifier.reliability, Reliability.noFaultDetected);
            ai0.setProperty(PropertyIdentifier.updateInterval, new UnsignedInteger(10));
            ai0.setProperty(PropertyIdentifier.minPresValue, new Real(-70));
            ai0.setProperty(PropertyIdentifier.maxPresValue, new Real(120));
            ai0.setProperty(PropertyIdentifier.resolution, new Real((float) 0.1));
            ai0.setProperty(PropertyIdentifier.profileName, new CharacterString("funny reader"));
            localDevice.addObject(ai0);

            ai1 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.analogInput));
            ai1.setProperty(PropertyIdentifier.units, EngineeringUnits.percentObscurationPerFoot);
            localDevice.addObject(ai1);

            bi0 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.binaryInput));
            localDevice.addObject(bi0);
            bi0.setProperty(PropertyIdentifier.objectName, new CharacterString("Off and on"));
            bi0.setProperty(PropertyIdentifier.inactiveText, new CharacterString("Off"));
            bi0.setProperty(PropertyIdentifier.activeText, new CharacterString("On"));

            bi1 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.binaryInput));
            localDevice.addObject(bi1);
            bi1.setProperty(PropertyIdentifier.objectName, new CharacterString("Good and bad"));
            bi1.setProperty(PropertyIdentifier.inactiveText, new CharacterString("Bad"));
            bi1.setProperty(PropertyIdentifier.activeText, new CharacterString("Good"));

            mso0 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.multiStateOutput));
            mso0.setProperty(PropertyIdentifier.objectName, new CharacterString("Vegetable"));
            mso0.setProperty(PropertyIdentifier.numberOfStates, new UnsignedInteger(4));
            mso0.setProperty(PropertyIdentifier.stateText, 1, new CharacterString("Tomato"));
            mso0.setProperty(PropertyIdentifier.stateText, 2, new CharacterString("Potato"));
            mso0.setProperty(PropertyIdentifier.stateText, 3, new CharacterString("Onion"));
            mso0.setProperty(PropertyIdentifier.stateText, 4, new CharacterString("Broccoli"));
            mso0.setProperty(PropertyIdentifier.presentValue, new UnsignedInteger(1));
            localDevice.addObject(mso0);

            ao0 = new BACnetObject(localDevice, localDevice.getNextInstanceObjectIdentifier(ObjectType.analogOutput));
            ao0.setProperty(PropertyIdentifier.objectName, new CharacterString("Settable analog"));
            localDevice.addObject(ao0);

            // Start the local device.
            localDevice.initialize();
            new Thread(this).start();
        } catch (RuntimeException e) {
            System.out.println("Ex in LoopDevice()");
            e.printStackTrace();
            localDevice.terminate();
            localDevice = null;
            throw e;
        }
    }

    public void run() {
        try {
            System.out.println("LoopDevice start changing values " + this);

            // Let it go...
            float ai0value = 0;
            float ai1value = 0;
            boolean bi0value = false;
            boolean bi1value = false;

            getMso0().setProperty(PropertyIdentifier.presentValue, new UnsignedInteger(2));
            while (!isTerminate()) {
                System.out.println("Change values of LoopDevice " + this);
                // Update the values in the objects.
                ai0.setProperty(PropertyIdentifier.presentValue, new Real(ai0value));
                ai1.setProperty(PropertyIdentifier.presentValue, new Real(ai1value));
                bi0.setProperty(PropertyIdentifier.presentValue, bi0value ? BinaryPV.active : BinaryPV.inactive);
                bi1.setProperty(PropertyIdentifier.presentValue, bi1value ? BinaryPV.active : BinaryPV.inactive);

                synchronized (this) {
                    wait(1000); // 1 second or notified (faster exit then stupid wait for 1 second)
                }
            }
            System.out.println("Close LoopDevive " + this);
        } catch (Exception ex) {
        }
        localDevice.terminate();
        localDevice = null;
    }

    @Override
    protected void finalize() throws Throwable {
        if (localDevice != null) {
            localDevice.terminate();
            localDevice = null;
        }
    }

    /** @return the terminate */
    public boolean isTerminate() { return terminate; }

    /** @param terminate the terminate to set */
    public void doTerminate() {
        terminate = true;
        synchronized (this) {
            notifyAll(); // we may wait for this in run()...
        }
    }

    /** @return the broadcastAddress */
    public String getBroadcastAddress() { return localDevice.getBroadcastAddress(); }

    /** @return the port */
    public int getPort() { return localDevice.getPort(); }

    /** @return the localDevice */
    public LocalDevice getLocalDevice() { return localDevice; }

    /** @return the ai0 */
    public BACnetObject getAi0() { return ai0; }

    /** @return the ai1 */
    public BACnetObject getAi1() { return ai1; }

    /** @return the bi0 */
    public BACnetObject getBi0() { return bi0; }

    /** @return the bi1 */
    public BACnetObject getBi1() { return bi1; }

    /** @return the mso0 */
    public BACnetObject getMso0() { return mso0; }

    /** @return the ao0 */
    public BACnetObject getAo0() { return ao0; }
}

/* ./ffa/model/Event.java */
package ffa.model;

import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;

/**
 * @author jwu
 * @date Nov 16, 2009
 */
public class Event {
    private Date time;
    private Sensor trigger;
    DateFormat timeFormat = new SimpleDateFormat("kk:mm:ss");

    public Event(Date time, Sensor trigger) {
        this.time = time;
        this.trigger = trigger;
    }

    protected void setTime(Date time) { this.time = time; }

    protected void setTrigger(Sensor trigger) { this.trigger = trigger; }

    public String getText() {
        String msg = "";
        msg = msg + trigger.toString() + " " + trigger.getID() + "  " + timeFormat.format(time) + "\n"
                + "      " + trigger.location.getDescription();
        return msg;
    }

    public Date getTime() { return time; }

    public Sensor getTrigger() { return trigger; }

    /**
     * @author jwu
     * @return this event's location
     */
    public Location getLocation() { return this.trigger.location; }

    public boolean equals(Object o) {
        if (this == o) return true;
        // Also checks o == null
        if (!(o instanceof Event)) return false;
        // Cast is now safe
        Event e = (Event) o;
        if (!e.getTime().equals(getTime())) return false;
        if (!e.getText().equals(getText())) return false;
        if (!e.getTrigger().equals(getTrigger())) return false;
        return true;
    }

    public String getID() { return trigger.getID(); }
}

/* ./ffa/model/DummyLord.java */
/*
 * Used for testing the RS232 plugin; it could easily be used for testing any other plugin, as
 * well. Basically, instead of actually doing anything with the input from the plugin, it simply
 * prints out information about it.
 */
package ffa.model;

public class DummyLord extends EventLord {
    public DummyLord() {
    }

    public void updateIncident(Event e) {
        System.out.print("Event received at " + e.getTime());
        System.out.println(": name is " + e.getID());
    }
}

/* ./ffa/model/FloorFilter.java */
/*
 * Filter that determines whether a sensor has gone off on a new floor. Implements FireFilter.
 * TODO: Figure out how to send an alert
 */
package ffa.model;

public class FloorFilter implements FireFilter {
    private boolean[] floors;
    private int numFloors;

    /**
     * Constructs the filter, creating and initializing the floor array. Takes as an argument
     * the number of floors in the building. I don't think arrays need to be initialized in
     * java, but better safe than sorry.
     */
    public FloorFilter(int n) {
        numFloors = n;
        floors = new boolean[n];
        for (int i = 0; i < n; i++)
            floors[i] = false;
    }

    /**
     * Processes the given event. Checks the value at the index that corresponds to the floor
     * of the event, and if that value is false, set it to true and return an alarm. Else
     * return null
     */
    public Alert filter(Event e) {
        int f = e.getLocation().getFloor();
        if (!floors[f]) {
            floors[f] = true;
            return new Alert(e.getTime(), e.getTrigger(), "Alarm triggered on new floor " + (f + 1));
        }
        return null;
    }
}

/* ./ffa/model/EventLord.java */
/*
 * Eventlord class. Handles input from the network through NetworkClient, preprocesses it for
 * alerts, then passes it to the Incident and notifies Window of an update.
 */
package ffa.model;

import java.util.ArrayList;
import java.util.ListIterator;

public class EventLord {
    private Incident incident;
    private ArrayList<FireFilter> filters;

    /**
     * Constructor. Takes as arguments the incident of the system, and an arraylist of all
     * desired filters to be applied to new events.
     */
    public EventLord() {
    }

    public EventLord(Incident stub, ArrayList<FireFilter> filter) {
        incident = stub;
        filters = filter;
    }

    /**
     * Update the incident. This function takes an argument from the NetworkClient, adds it to
     * the event list, applies all filters to it, and notifies the incident. Each individual
     * filter generates the actual alert, which eventLord then updates the incident with.
     * TODO: special handling of alerts as opposed to events?
     */
    public void updateIncident(Event e) {
        Alert a;
        ListIterator<FireFilter> iter = filters.listIterator();
        FireFilter f;
        incident.addEvent(e);
        while (iter.hasNext()) {
            f = iter.next();
            a = f.filter(e);
            if (a != null) {
                incident.addEvent(a);
            }
        }
        incident.update();
    }
}

/* ./ffa/model/ThresholdSensor.java */
package ffa.model;

/**
 * @author jwu
 * @date Nov 16, 2009
 */
public class ThresholdSensor extends Sensor {
    public ThresholdSensor(Location location, String ID) {
        super(location, ID);
    }

    public String toString() { return "Threshold Sensor"; }
}

/* ./ffa/model/Building.java */
package ffa.model;

import java.util.ArrayList;
import java.util.Map;
import java.util.TreeMap;

import ffa.annotation.Zone;

public class Building {
    public Map<Integer, Floor> floors;
    public Map<String, Zone> zones;
    public ArrayList staticInfo = new ArrayList();

    public Building() {
        floors = new TreeMap<Integer, Floor>();
    }

    // Convenience method
    public SensorLayer getLayer(Integer floor, SensorType type) {
        Floor fl = floors.get(floor);
        SensorLayer sl = fl.getLayer(type);
        return sl;
    }

    public Floor getFloor(Integer floor) { return floors.get(floor); }

    public void addEvent(Event e) {
        Sensor s = e.getTrigger();
        Zone z = zones.get(s.getID());
        if (z != null) {
            z.addEvent(e);
        } else {
            System.err.println("nonexistant sensor id: " + e.getID());
        }
    }
}

/* ./ffa/model/BACNetPlugin.java */
package ffa.model;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.serotonin.bacnet4j.LocalDevice;
import com.serotonin.bacnet4j.RemoteDevice;
import com.serotonin.bacnet4j.RemoteObject;
import com.serotonin.bacnet4j.event.DeviceEventListener;
import com.serotonin.bacnet4j.exception.BACnetException;
import com.serotonin.bacnet4j.obj.BACnetObject;
import com.serotonin.bacnet4j.service.confirmed.ReinitializeDeviceRequest.ReinitializedStateOfDevice;
import com.serotonin.bacnet4j.service.unconfirmed.WhoIsRequest;
import com.serotonin.bacnet4j.type.Encodable;
import com.serotonin.bacnet4j.type.constructed.Choice;
import com.serotonin.bacnet4j.type.constructed.ObjectPropertyReference;
import com.serotonin.bacnet4j.type.constructed.PropertyValue;
import com.serotonin.bacnet4j.type.constructed.SequenceOf;
import com.serotonin.bacnet4j.type.constructed.TimeStamp;
import com.serotonin.bacnet4j.type.enumerated.EventState;
import com.serotonin.bacnet4j.type.enumerated.EventType;
import com.serotonin.bacnet4j.type.enumerated.MessagePriority;
import com.serotonin.bacnet4j.type.enumerated.NotifyType;
import com.serotonin.bacnet4j.type.enumerated.PropertyIdentifier;
import com.serotonin.bacnet4j.type.notificationParameters.NotificationParameters;
import com.serotonin.bacnet4j.type.primitive.CharacterString;
import com.serotonin.bacnet4j.type.primitive.ObjectIdentifier;
import com.serotonin.bacnet4j.type.primitive.UnsignedInteger;
import com.serotonin.bacnet4j.util.PropertyReferences;
import com.serotonin.bacnet4j.util.PropertyValues;

public class BACNetPlugin extends Plugin {
    EventLord lord; // EventLord

    // basic constructor
    public BACNetPlugin(EventLord theLord, String broadcastAddress, int port) throws IOException {
        super.events = true;
        lord = theLord;
        localDevice = new LocalDevice(1, broadcastAddress);
        localDevice.setPort(port);
        localDevice.getEventHandler().addListener(new DeviceEventListener() {
            // TODO: edit the effects of these methods
            public void listenerException(Throwable e) { System.out.println("DiscoveryTest listenerException"); }

            public void iAmReceived(RemoteDevice d) {
                System.out.println("DiscoveryTest iAmReceived");
                remoteDevices.add(d);
                synchronized (BACNetPlugin.this) {
                    BACNetPlugin.this.notifyAll();
                }
            }

            public boolean allowPropertyWrite(BACnetObject obj, PropertyValue pv) {
                System.out.println("DiscoveryTest allowPropertyWrite");
                return true;
            }

            public void propertyWritten(BACnetObject obj, PropertyValue pv) {
                System.out.println("DiscoveryTest propertyWritten: " + pv.getValue());
            }

            public void iHaveReceived(RemoteDevice d, RemoteObject o) { System.out.println("DiscoveryTest iHaveReceived"); }

            public void covNotificationReceived(UnsignedInteger subscriberProcessIdentifier,
                    RemoteDevice initiatingDevice, ObjectIdentifier monitoredObjectIdentifier,
                    UnsignedInteger timeRemaining, SequenceOf<PropertyValue> listOfValues) {
                System.out.println("DiscoveryTest covNotificationReceived");
            }

            public void textMessageReceived(RemoteDevice textMessageSourceDevice, Choice messageClass,
                    MessagePriority messagePriority, CharacterString message) {
                System.out.println("DiscoveryTest textMessageReceived");
            }

            public void privateTransferReceived(UnsignedInteger vendorId, UnsignedInteger serviceNumber,
                    Encodable serviceParameters) {
                System.out.println("DiscoveryTest privateTransferReceived");
            }

            public void eventNotificationReceived(UnsignedInteger processIdentifier,
                    RemoteDevice initiatingDevice, ObjectIdentifier eventObjectIdentifier,
                    TimeStamp timeStamp, UnsignedInteger notificationClass, UnsignedInteger priority,
                    EventType eventType, CharacterString messageText, NotifyType notifyType,
                    com.serotonin.bacnet4j.type.primitive.Boolean ackRequired,
                    EventState fromState, EventState toState, NotificationParameters eventValues) {
                System.out.println("BACNetPlugin eventNotificationReceived: ID=" + processIdentifier.longValue()
                        + " EventType=" + eventType.toString() + " fromState=" + fromState.TYPE_ID
                        + " toState=" + toState.TYPE_ID);
                Date d = new Date(timeStamp.getDateTime().getTimeMillis());
                String id = initiatingDevice.getName() + initiatingDevice.getInstanceNumber();
                Sensor s = new SystemSensor(new Location(null, 0, eventType.toString()), id);
                Event e = new Event(d, s);
                lord.updateIncident(e);
                // type id = 1 for change of state eventType
                // type id = 2 for change of value eventType
            }

            @Override
            public void reinitializeDevice(ReinitializedStateOfDevice arg0) {
                // TODO Auto-generated method stub
            }
        });
        localDevice.initialize(); // starts event listening
    }

    @Override
    public void prepare() {
        // TODO Auto-generated method stub
        // does not need to do anything
    }

    @Override
    public void start() {
        // TODO Auto-generated method stub
        try {
            TestUseListener();
            // TestUsePolling();
            // TestUseNaive();
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    private LoopDevice loopDevice;
    private final LocalDevice localDevice;
    // remote devices found
    private final List<RemoteDevice> remoteDevices = new ArrayList<RemoteDevice>();

    // 3 implementation styles to be tested
    private void TestUseListener() throws Exception {
        try {
            this.setLoopDevice(new LoopDevice("192.168.1.255", 47808 + 1)); // TODO not hardcoded
        } catch (RuntimeException e) {
            this.localDevice.terminate();
            throw e;
        }
        try {
            this.doDiscover();
            this.printDevices();
        } finally {
            this.localDevice.terminate();
            System.out.println("Cleanup loopDevice");
            this.getLoopDevice().doTerminate();
        }
    }

    /**
     * Send a WhoIs request and wait for the first to answer
     *
     * @throws java.lang.Exception
     */
    public void doDiscover() throws Exception {
        // Who is
        System.out.println("Send Broadcast WhoIsRequest()");
        // Send the broadcast to the correct port of the LoopDevice!!!
        // localDevice.sendBroadcast(loopDevice.getPort(), new WhoIsRequest(null, null));
        localDevice.sendBroadcast(47808, new WhoIsRequest(null, null));
        // wait for notification in iAmReceived() Timeout 5 sec
        synchronized (this) {
            final long start = System.currentTimeMillis();
            this.wait(5000);
            System.out.println("waited for iAmReceived: " + (System.currentTimeMillis() - start) + " ms");
        }
        // Another way to get to the list of devices
        // return localDevice.getRemoteDevices();
    }

    @SuppressWarnings("unchecked")
    private void printDevices() throws BACnetException {
        for (RemoteDevice d : remoteDevices) {
            localDevice.getExtendedDeviceInformation(d);
            List<ObjectIdentifier> oids = ((SequenceOf<ObjectIdentifier>) localDevice.sendReadPropertyAllowNull(
                    d, d.getObjectIdentifier(), PropertyIdentifier.objectList)).getValues();

            PropertyReferences refs = new PropertyReferences();
            // add the property references of the "device object" to the list
            refs.add(d.getObjectIdentifier(), PropertyIdentifier.all);
            // and now from all objects under the device object >> ai0, ai1, bi0, bi1...
            for (ObjectIdentifier oid : oids) {
                refs.add(oid, PropertyIdentifier.all);
            }

            System.out.println("Start read properties");
            final long start = System.currentTimeMillis();
            PropertyValues pvs = localDevice.readProperties(d, refs);
            System.out.println(String.format("Properties read done in %d ms", System.currentTimeMillis() - start));

            printObject(d.getObjectIdentifier(), pvs);
            for (ObjectIdentifier oid : oids) {
                printObject(oid, pvs);
            }
        }
    }

    private void printObject(ObjectIdentifier oid, PropertyValues pvs) {
        System.out.println(String.format("\t%s", oid));
        for (ObjectPropertyReference opr : pvs) {
            if (oid.equals(opr.getObjectIdentifier())) {
                System.out.println(String.format("\t\t%s = %s", opr.getPropertyIdentifier().toString(),
                        pvs.getNoErrorCheck(opr)));
            }
        }
    }

    /** @return the loopDevice */
    public LoopDevice getLoopDevice() { return loopDevice; }

    /** @param loopDevice the loopDevice to set */
    public void setLoopDevice(LoopDevice loopDevice) { this.loopDevice = loopDevice; }

    private static void TestUsePolling() {
        // get list of devices, loop through checking states
    }

    private static void TestUseNaive() {
    }
}

/* ./ffa/model/Plugin.java */
package ffa.model;

public abstract class Plugin {
    boolean events;

    public boolean isEventDriven() { return events; }

    public abstract void prepare();

    public abstract void start();
}

/* ./ffa/model/RSPluginDemo.java */
package ffa.model;

public class RSPluginDemo {
    public static void main(String[] args) {
        EventLord lord = new DummyLord();
        RS232Plugin plugin = new RS232Plugin(lord, null);
        plugin.prepare();
        plugin.start();
    }
}

/* ./ffa/model/RS232Plugin.java */
package ffa.model;

/*
 * Implementation of the RS232 plugin. This enables communication between a fire panel and the
 * eventlord. This implementation is hardware specific; some of the serial port settings (e.g.
 * baud rate), as well as the string parsing, are dependent on at least the manufacturing brand,
 * if not the individual machine. It takes the ASCII output of the fire panel, parses it for
 * information about the date, time, sensor ID, etc., puts this data into an Event object, and
 * passes it along to the EventLord, which then performs the appropriate preprocessing and
 * distributing. At startup, it displays a window that offers the choice of all serial ports
 * displayed on a system; need to work on displaying the names of these ports, rather than their
 * Object.toString(), which is just a memory address (yikes).
 *
 * Credit is due to rxtx.org; most of this code is based off their examples, and I got the
 * implementation of the javax.comm API from them.
 */

import java.io.IOException;
import java.io.InputStream;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Enumeration;
import java.util.HashSet;

import javax.swing.*;

import ffa.ui.Window;
import gnu.io.*;

public class RS232Plugin extends Plugin {
    EventLord lord; // EventLord
    CommPortIdentifier ident; // Name of the serial port that will be used
    Window window; // Window of the GUI; used to anchor the dialog box

    // basic constructor
    public RS232Plugin(EventLord theLord, Window theWindow) {
        super.events = true;
        lord = theLord;
        window = theWindow;
    }

    /*
     * This is where the magic happens. This method sets up the port and establishes the event
     * listener. The basic structure of this code comes from
     * http://rxtx.qbang.org/wiki/index.php/Event_based_two_way_Communication, although it is
     * modified; since only one way communication is necessary, and it performs more intensive
     * manipulation of the data.
     */
    public void start() {
        ident = null;
        HashSet<CommPortIdentifier> commPorts = getAvailableSerialPorts();
        // Discovers available serial ports and puts them in an array
        if (commPorts == null || commPorts.isEmpty()) {
            // If there are no serial ports available, exit
            System.out.println("No ports available!");
            System.exit(0); // TODO: something more elegant to do?
        }
        String[] names = new String[commPorts.size()];
        int i = 0;
        for (CommPortIdentifier c : commPorts) {
            names[i] = c.getName();
            i++;
        }
        // Ask the user which serial port he wishes to use; TODO: find a better way to display port names
        try {
            ident = CommPortIdentifier.getPortIdentifier((String) JOptionPane.showInputDialog(window,
                    "Select the serial port to use:", "Serial Port Selection",
                    JOptionPane.QUESTION_MESSAGE, null, names, null));
        } catch (NoSuchPortException e) {
            System.out.println("Error: no such port");
            System.exit(0);
        }
        // houston, we have a problem
        if (ident == null || ident.isCurrentlyOwned()) {
            System.err.println("Port nonexistent or in use");
            System.exit(0);
        } else {
            try {
                // Attempt to open the port, using the name Plugin to reserve it.
                CommPort commPort = ident.open("Plugin", 2000);
                if (commPort instanceof SerialPort) {
                    // Associate the serial port with the proper object, and set its properties
                    // to match the fire panel's
                    // NOTE WELL: THESE SETTINGS ARE POTENTIALLY MACHINE SPECIFIC
                    SerialPort serialPort = (SerialPort) commPort;
                    serialPort.setSerialPortParams(2400, SerialPort.DATABITS_7, SerialPort.STOPBITS_1,
                            SerialPort.PARITY_EVEN);
                    serialPort.setFlowControlMode(SerialPort.FLOWCONTROL_XONXOFF_IN);
                    // Get input stream of serial port, associate it with the event listener, and start listening
                    InputStream in = serialPort.getInputStream();
                    serialPort.addEventListener(new SerialReader(in, lord));
                    serialPort.notifyOnDataAvailable(true);
                } else {
                    System.out.println("Error: Only serial ports are handled by this plugin.");
                    System.exit(0);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    /*
     * This inner class handles the input events of the Serial port.
     */
    public static class SerialReader implements SerialPortEventListener {
        private InputStream in;
        private byte[] buffer = new byte[1024];
        private EventLord lord;

        // basic constructor
        public SerialReader(InputStream in, EventLord theLord) {
            this.in = in;
            lord = theLord;
        }

        /*
         * When a serial event is activated, this event method is called. It reads the string
         * until the EOL, parses the information, and passes it to the event lord for further
         * handling.
         */
        public void serialEvent(SerialPortEvent arg0) {
            int data;
            String input;
            Event event = null;
            try {
                // Read the input
                int len = 0;
                while ((data = in.read()) > -1) {
                    if (data == '\n') {
                        break;
                    }
                    buffer[len++] = (byte) data;
                }
                input = new String(buffer, 0, len);
                event = parseString(input); // Parse the string and create a new event
                lord.updateIncident(event); // pass the event to the event lord and we're done
            } catch (IOException e) {
                e.printStackTrace();
                System.exit(-1);
            }
        }

        /*
         * This helper method for the SerialEvent method parses the actual string received from
         * the serial port.
         * NOTE WELL: THIS PARSING IS POTENTIALLY MACHINE SPECIFIC
         */
        public Event parseString(String input) {
            String[] inputArray;
            Date d = null;
            Event e = null;
            String date = null;
            SimpleDateFormat df = new SimpleDateFormat("MMddyyhh:mma");
            inputArray = input.split("\\s", 7);
            date = "" + inputArray[5].substring(0, 6) + inputArray[4].substring(0, 4);
            System.out.println(inputArray[4]);
            if (inputArray[4].charAt(4) == 'a')
                date += "am";
            else
                date += "pm";
            if (inputArray[0].equals("ALARM:")) {
                try {
                    d = df.parse(date);
                } catch (ParseException ex) {
                    // TODO Auto-generated catch block
                    ex.printStackTrace();
                }
                e = new Event(d, new SystemSensor(new Location(null, 0, "location"), inputArray[2]));
            }
            return e;
        }
    }

    /*
     * This is a helper method for start(); it gathers a HashSet containing the
     * CommPortIdentifiers associated with all available serial ports. This is lifted directly
     * from http://rxtx.qbang.org/wiki/index.php/Discovering_available_comm_ports; no changes
     * were made.
     */
    public static HashSet<CommPortIdentifier> getAvailableSerialPorts() {
        HashSet<CommPortIdentifier> h = new HashSet<CommPortIdentifier>();
        Enumeration thePorts = CommPortIdentifier.getPortIdentifiers();
        while (thePorts.hasMoreElements()) {
            CommPortIdentifier com = (CommPortIdentifier) thePorts.nextElement();
            switch (com.getPortType()) {
                case CommPortIdentifier.PORT_SERIAL:
                    try {
                        CommPort thePort = com.open("CommUtil", 50);
                        thePort.close();
                        h.add(com);
                    } catch (PortInUseException e) {
                        System.out.println("Port, " + com.getName() + ", is in use.");
                    } catch (Exception e) {
                        System.err.println("Failed to open port " + com.getName());
                        e.printStackTrace();
                    }
            }
        }
        return h;
    }

    /*
     * This function is required by the abstract superclass Plugin; for this specific
     * implementation, it does nothing.
     */
    public void prepare() {
        // TODO Auto-generated method stub
    }
}

/* ./ffa/model/Incident.java */
package ffa.model;

import java.util.Comparator;
import java.util.Date;
import java.util.concurrent.ConcurrentSkipListSet;

import ffa.ui.Util;

public class Incident {
    /*
     * First attempt at a signaling mechanism.
     */
    private boolean updated = false;
    private ConcurrentSkipListSet<Event> eventList = new ConcurrentSkipListSet<Event>(new EventComparator());
    private Date start;

    private class EventComparator implements Comparator<Event> {
        @Override
        public int compare(Event o1, Event o2) {
            int i = o1.getTime().compareTo(o2.getTime());
            if (i == 0)
                i = o1.getText().compareTo(o2.getText());
            if (i == 0)
                i = new Integer(o1.getTrigger().hashCode()).compareTo(o2.getTrigger().hashCode());
            return i;
        }
    }

    public Incident() {
    } // default constructor jwu

    /**
     * @author jwu
     * @param start
     * Constructor for the incident with a start date
     */
    public Incident(Date start) {
        this.eventList = new ConcurrentSkipListSet<Event>(new EventComparator());
        this.start = start;
    }

    /**
     * @author jwu
     * Currently like this to just copy functionality of testing previously in DynamicDisplay
     * @return the ConcurrentSkipListSet of events
     */
    public ConcurrentSkipListSet<Event> getEvents() { return this.eventList; }

    public synchronized void waitForUpdate() {
        while (!updated) {
            try {
                wait();
            } catch (InterruptedException e) {
                // Should not be interrupted afaik
                if (Util.DEBUG)
                    e.printStackTrace();
            }
        }
        updated = false;
    }

    public synchronized void update() {
        updated = true;
        notifyAll();
    }

    /**
     * @author jwu
     * @param event The event that has updated the incident
     * Updates the incident by adding the event to the eventList
     */
    public synchronized void update(Event event) {
        updated = true;
        this.eventList.add(event);
        notifyAll();
    }

    /*
     * Add an event WITHOUT updating, so that we don't have to refresh the screen every time
     */
    public synchronized void addEvent(Event e) {
        this.eventList.add(e);
    }
}

/* ./ffa/model/SystemSensor.java */
package ffa.model;

/**
 * Created originally to give the alert that the building has burned down for testing.
 *
 * @author jwu
 * @date Feb. 16, 2010
 */
public class SystemSensor extends Sensor {
    public SystemSensor(Location location) {
        super(location);
    }

    public SystemSensor(String id) {
        super(id);
    }

    public SystemSensor(Location loc, String id) {
        super(loc, id);
    }

    public String toString() { return "System Event"; }
}

/* ./ffa/model/Location.java */
package ffa.model;

import java.awt.Point;

/**
 * @author jwu
 * @date Nov 16, 2009
 */
public class Location {
    private Point point;
    private int floor;
    private String description;

    public Location(Point point, int floor, String description) {
        super();
        this.point = point;
        this.floor = floor;
        this.description = description;
    }

    public void setPoint(Point point) { // should only be used in annotation.
        this.point = point;
    }

    public Point getPoint() { return point; }

    public int getFloor() { return floor; }

    public String getDescription() { return description; }
}

/* ./ffa/model/Alert.java */
package ffa.model;

import java.util.Date;
/**
 * @author jwu
 * @date Nov 16, 2009
 */
public class Alert extends Event {
    String message;

    public Alert(Date time, Sensor trigger, String message) {
        super(time, trigger);
        this.message = message;
    }

    @Override
    public String getText() {
        // TODO Auto-generated method stub
        return message;
    }

    @Override
    public String toString() { return message; }
}

/* ./ffa/model/BACNetPluginDemo.java */
package ffa.model;

import java.io.IOException;

public class BACNetPluginDemo {
    // public static String BROADCAST_ADDRESS = "192.168.1.24"; // TODO user provided
    public static String BROADCAST_ADDRESS = "192.168.1.255"; // TODO user provided

    public static void main(String[] args) {
        EventLord lord = new DummyLord();
        BACNetPlugin plugin = null;
        try {
            plugin = new BACNetPlugin(lord, BROADCAST_ADDRESS, 47808);
        } catch (IOException e) {
            e.printStackTrace();
        }
        plugin.prepare();
        plugin.start();
    }
}

/* ./ffa/model/Floor.java */
package ffa.model;

import java.awt.image.BufferedImage;
import java.util.HashMap;
import java.util.Map;

public class Floor {
    public BufferedImage image;
    public Map<SensorType, SensorLayer> layers;

    public Floor() {
        layers = new HashMap<SensorType, SensorLayer>();
        for (SensorType type : SensorType.values())
            layers.put(type, new SensorLayer(type));
    }

    public SensorLayer getLayer(SensorType type) { return layers.get(type); }
}

/* ./ffa/model/SensorType.java */
package ffa.model;

public enum SensorType {
    SMOKE("Smoke"),
    HEAT("Heat"),
    FLOW("Flow"),
    MANUAL("Manual");

    public final String title;

    private SensorType(String title) {
        this.title = title;
    }
}

/* ./ffa/testing/PluginTesting.java */
package ffa.testing;

import java.awt.EventQueue;
import java.io.File;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Date;

import ffa.model.*;
import ffa.annotation.*;
import ffa.ui.*;

public class PluginTesting {
    private static Window window;
    private static Plugin plugin;
    private static EventLord theLord;

    public static void initialize(int floors) {
        Incident incident = new Incident(new Date());
        window = new Window(incident);
        // TODO: Currently using just a new ArrayList, change later
        ArrayList<FireFilter> fireFilters = new ArrayList<FireFilter>();
        theLord = new EventLord(incident, fireFilters);
        plugin = new RS232Plugin(theLord, window);
        try {
            EventQueue.invokeAndWait(new Runnable() {
                public void run() {
                    window.initGUI();
                    window.setVisible(true);
                }
            });
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (InvocationTargetException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        initialize(4);
        plugin.prepare();
        plugin.start();
        window.await();
    }
}

/* ./ffa/testing/FDSSensor.java */
package ffa.testing;

/*
 * Temporary sensor class for FDS integration
 */
public class FDSSensor {
    private String xyz;
    private String name;
    private String type;
    private double alarmThreshold;
    private double troubleThreshold;

    public FDSSensor() {
        this.xyz = null;
        this.name = null;
        this.type = null;
        this.alarmThreshold = 0.0;
        this.troubleThreshold = 0.0;
    }

    public FDSSensor(String loc, String nm, String ty, double alarm, double trouble) {
        this.xyz = loc;
        this.name = nm;
        this.type = ty;
        this.alarmThreshold = alarm;
        this.troubleThreshold = trouble;
    }

    public String getXyz() { return xyz; }
    public void setXyz(String xyz) { this.xyz = xyz; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getType() { return type; }
    public void setType(String type) { this.type = type; }
    public double getAlarmThreshold() { return alarmThreshold; }
    public void setAlarmThreshold(double alarmThreshold) { this.alarmThreshold = alarmThreshold; }
    public double getTroubleThreshold() { return troubleThreshold; }
    public void setTroubleThreshold(double troubleThreshold) { this.troubleThreshold = troubleThreshold; }

    public String toString() {
        return "name=" + this.name + ", type=" + this.type + ", alarm=" + this.alarmThreshold
                + ", trouble=" + this.troubleThreshold;
    }
}

/* ./ffa/testing/EventTesting.java */
package ffa.testing;

import java.awt.EventQueue;
import java.awt.FileDialog;
import java.awt.Frame;
import java.awt.Point;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Date;
import java.util.Iterator;
import java.util.Random;
import java.util.Scanner;
import java.util.TreeMap;

import junit.framework.TestCase;

import ffa.model.Alert;
import ffa.model.Event;
import ffa.model.EventLord;
import ffa.model.FireFilter;
import ffa.model.FloorFilter;
import ffa.model.Incident;
import ffa.model.Location;
import ffa.model.Sensor;
import ffa.model.SystemSensor;
import ffa.model.ThresholdSensor;
import ffa.ui.Window;

/**
 * @author jwu
 */
public class EventTesting extends TestCase {
    private static final String TEST1 = "src/ffa/testing/test1.txt";
    private static final String PULSE1 = "src/ffa/testing/pulse1";
    private static final String KIMBUILDINGBURN = "src/ffa/testing/jmp.ffa"; // "src/ffa/testing/Conf_video.ffa"; // "src/ffa/testing/KimBuildingBurn";

    private ArrayList<Event> events = new ArrayList<Event>();
    private ArrayList<Long> timeBetweenEvents = new ArrayList<Long>();
    // timeBetweenEvents.get(i) = time between events.get(i-1) and events.get(i)
    private EventLord theLord;
    private Window window;

    public EventTesting() {
    }

    /**
     * @author jwu
     * initializes testing by creating the Incident, creating a Window, creating an EventLord
     * Starts the window and then returns
     */
    public void initialize(int floors) {
        Incident incident = new Incident();
        window = new Window(incident);
        // TODO: Currently using just a new ArrayList, change later
        ArrayList<FireFilter> fireFilters = new ArrayList<FireFilter>();
        fireFilters.add(new FloorFilter(floors));
        theLord = new EventLord(incident, fireFilters);
        try {
            EventQueue.invokeAndWait(new Runnable() {
                public void run() {
                    window.initGUI();
                    window.setVisible(true);
                }
            });
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (InvocationTargetException e) {
            e.printStackTrace();
        }
    }

    /**
     * @author jwu
     * Basic test to start the testing
     * parse event information from a file
     * start a thread to simulate the events and have the window await
     */
    /*
    public void test1() {
        initialize(2);
        Thread t = getTestThread(new File(TEST1));
        t.start();
        window.await();
    }
    */

    /*
     * Generates a test file... not quite working due to not writing to file for some reason...
     * Will add more parameters later to change how the simulation runs
     * Very basic right now
     */
    /*
    public void testGeneratePulse() {
        pulseGenerate(4, 400, new File(KIMBUILDINGBURN));
    }
    */

    /*
     * Basically the same as other test, but uses pulse file instead
     */
    /*
    public void testPulse2() {
        initialize(10);
        Thread t = getTestThread(new File(PULSE1));
        t.start();
        window.await();
    }
    */

    public void testBurnKim() {
        initialize(4);
        Thread t = getTestThread(new File(KIMBUILDINGBURN));
        t.start();
        window.await();
    }

    public void pulseGenerate(int floors, int numSensors, File f) {
        try {
            BufferedWriter output = new BufferedWriter(new FileWriter(f));
            int topFloor = 0;
            int activatedSensors = 0;
            boolean stalling = true;
            long timeBetween = 0;
            long randomTimeBetween = 0;
            int numberToDo = 0;
            int numberDone = 0;
            double chanceOfActivate = 0.0;
            Random r = new Random();

            System.out.println(0 + " " + 0 + " Threshold First Event");
            output.write(0 + " " + 0 + " Threshold First Event");
            output.newLine();

            while (topFloor < floors && numberDone <= numSensors) {
                if (activatedSensors >= ((numSensors / floors) / 10) + r.nextInt((numSensors / floors) / 10)) {
                    activatedSensors = 0;
                    topFloor++;
                }
                if (stalling) {
                    timeBetween = 10000;
                    randomTimeBetween = 1000;
                    numberToDo = r.nextInt(5); // (numSensors / floors) / 10 + 1;
                    chanceOfActivate = 0.7;
                } else {
                    timeBetween = 100;
                    randomTimeBetween = 1000;
                    numberToDo = (numSensors / floors) / 4;
                    chanceOfActivate = 0.4;
                }
                // System.out.println("number to write = " + numberToDo);
                for (int i = 0; i < numberToDo; i++) {
                    long time = timeBetween + r.nextInt((int) randomTimeBetween);
                    int floor = r.nextInt(topFloor + 1);
                    String message = "";
                    if (r.nextDouble() < chanceOfActivate) {
                        message = "In Alarm";
                        activatedSensors++;
                    } else {
                        message = "In Trouble";
                        activatedSensors++;
                    }
                    // writer.write(time + " " + floor + " Threshold " + message);
                    // writer.newLine();
                    System.out.println(time + " " + floor + " Threshold " + message);
                    output.write(time + " " + floor + " Threshold " + message);
                    output.newLine();
                }
                stalling = !stalling;
            }
            output.write(2010 + " " + (floors - 1) + " System " + "");
            output.newLine();
            output.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Expected format of a line of the file is:
     * <time between> <floor> <label> <Sensor Type> (<sensor ID>) <description>
     *
     * @author jwu
     * @param f
     */
    private void parseFile(File f) {
        try {
            Scanner s = new Scanner(f);
            String str;
            long timeBetween;
            Sensor sensor;
            Event alert;
            long total = System.currentTimeMillis();
            while (s.hasNextLine()) {
                // get time
                str = s.next();
                timeBetween = Long.parseLong(str);
                this.timeBetweenEvents.add(timeBetween);
                total += timeBetween;

                str = s.next();
                int floor = Integer.parseInt(str);

                str = s.next();
                String sensorLabel = str;

                // get sensor information
                str = s.next();
                if (str.equals("Threshold")) {
                    // New sensor ID
                    String sensorID = s.next();
                    // Prune parens
                    sensorID = sensorID.substring(1, sensorID.length() - 1);
                    str = s.nextLine();
                    sensor = new ThresholdSensor(new Location(null, floor, str.trim()), sensorID);
                } else if (str.equals("System")) {
                    str = s.nextLine();
                    sensor = null;
                    // sensor = new SystemSensor(new Location(null, floor, str.trim()), sensorLabel);
                } else {
                    throw new IllegalArgumentException(); // unrecognized sensor
                }

                // get message
                if (sensor != null) {
                    alert = new Event(new Date(total), sensor);
                    this.events.add(alert);
                }
                // System.out.println(str);
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }

    /**
     * @author jwu
     * @param f the file from which to get event information for this thread
     * @return a thread that will be run for testing that does the events described in passed file
     */
    private Thread getTestThread(File f) {
        parseFile(f);
        return new Thread(new Runnable() {
            public void run() {
                Iterator<Event> eventIterator = events.iterator();
                Iterator<Long> timeIterator = timeBetweenEvents.iterator();
                while (eventIterator.hasNext()) {
                    // theLord.updateIncident(new Alert(new Time(0),
                    //         new ThresholdSensor(), Double.toString(Math.random())));
                    try {
                        Thread.sleep(timeIterator.next());
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    theLord.updateIncident(eventIterator.next());
                    // Util.printDebug("Updating");
                    // incident.update();
                }
            }
        });
    }

    // runs a test file
    public static void main(String[] args) {
        /*
        FileDialog f = new FileDialog((Frame) null, "Select the annotation file");
        f.setMode(FileDialog.LOAD);
        f.setVisible(true);
        File annotation_file = new File(f.getDirectory() + f.getFile());
        */
        FileDialog f = new FileDialog((Frame) null, "Select the ffa file");
        f.setMode(FileDialog.LOAD);
        f.setVisible(true);
        File ffa_file = new File(f.getDirectory() + f.getFile());

        EventTesting e = new EventTesting();
        e.initialize(1);
        Thread t = e.getTestThread(ffa_file);
        t.start();
        e.window.await();
    }
}

/* ./ffa/testing/FDSPlugin.java */
package ffa.testing;

import java.awt.FileDialog;
import java.awt.Frame;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
FileNotFoundException ; import java . i o . FileWriter ; import java . i o . IOException ; import java . u t i l . ArrayList ; import java . u t i l . Scanner ; import java . u t i l . TreeMap ; import javax . swing . JFileChooser ; import javax . swing . JOptionPane ; public c l a s s FDSPlugin f public s t a t i c F i l e fdsInput ; public s t a t i c F i l e fdsOutput ; public s t a t i c F i l e parseOutput ; public s t a t i c TreeMap s e n s o r s = new TreeMap() ; // every sensor in the simulation w i l l have a name to FDSSensor mapping here public s t a t i c void main ( String [ ] args ) f // choose to use e x i s t i n g f f a f i l e or make new one // i n t choice = 0 ; // String input = JOptionPane . showInputDialog (" Enter choice :nn0 Use e x i s t i n g . f f a f i l enn1 Enter FDS f i l e s to generate and run a . f f a f i l e ") ; // try f // choice = I n t e g e r . parseInt ( input ) ; // i f ( choice != 0 && choice != 1) // throw new IllegalArgumentException ( ) ; // g catch ( Exception e ) f 194 // JOptionPane . showMessageDialog ( null , " I n v a l i d option . " ) ; // System . e x i t (0) ; // g // // i f ( choice == 1) f // get FDS input f i l e fdsInput = g e t F i l e (" S e l e c t FDS Input f i l e ( . fds f i l e ) " , true , n u l l ) ; fdsOutput = g e t F i l e (" S e l e c t FDS Output f i l e ( . cvs f i l e ) " , true , n u l l ) ; parseOutput = g e t F i l e (" Save the r e s u l t i n g . f f a f i l e as . . . " , f a l s e , " . f f a ") ; //TODO d e l e t e System . out . p r i n t l n ( fdsInput . toString ( ) + "nn" + fdsOutput . toString ( ) + "nn" + parseOutput . toString ( ) ) ; // parse input f i l e f o r s e n s o r s parseForSensors ( ) ; // parse output f i l e f o r sensor a c t i v a t i o n times ( generate . f f a f i l e ) generateFFA ( ) ; // g e l s e f // // j u s t use an e x i s t i n g . f f a f i l e // parseOutput = g e t F i l e (" S e l e c t the e x i s t i n g f i l e to simulate ( . f f a f i l e ) " , true , n u l l ) ; // // //TODO d e l e t e // System . out . p r i n t l n ( parseOutput . toString ( ) ) ; // g // JOptionPane . showMessageDialog ( null , "program termination ") ; System . e x i t (0) ; g / Get ? s a f i l e to be used f o r something , i f open save i s true , d i s p l a y s open dialog , otherwise d i s p l a y s save d i a l o g / p r i v a t e s t a t i c F i l e g e t F i l e ( String titleMessage , boolean open save , String f i l e e x t ) f F i l e Di a l o g f = new F i l e D i al o g ( ( Frame) null , t i t l e M e s s a g e ) ; i f ( open save ) f . setMode ( F i l e Di a l o g .LOAD) ; e l s e f . setMode ( F i l e Di a l o g .SAVE) ; f . s e t V i s i b l e ( true ) ; String filename = f . getDirectory ( ) + f . g e t F i l e ( ) ; i f ( f i l e e x t != n u l l ) f F i l e r e t ; i f ( filename . endsWith ( f i l e e x t ) ) 195 r e t = new F i l e ( filename ) ; e l s e r e t = new F i l e ( filename + f i l e e x t ) ; return r e t ; g System . out . p r i n t l n ( f . getDirectory ( ) + " . . . " + f . g e t F i l e ( ) ) ; return new F i l e ( f . getDirectory ( )+f . 
g e t F i l e ( ) ) ; g / This i s c a l l e d once we have a mapping of device names to a s s o c i a t e d FDSSensors ( which contain threshold information ) at every step , get the tokens , and check each token to see i f a new event has occurred check i f sensor dead ( i f so skip ) check i f in trouble ( i f so , s e t to dead , skip ) check i f in alarm ( i f so and not already in alarm , s e t to alarm ) / p r i v a t e s t a t i c f i n a l double IN STATE = 1; p r i v a t e s t a t i c void generateFFA ( ) f // c r e a t e array of device names , array of alarmsThresholds , array of troubleThresholds // whenever a sensor e n t e r s a state , the appropriate array w i l l be modified to have 1 so that the check isn ? t made again //when a l l s e n s o r s enter trouble , simulation terminates String [ ] deviceNames ; double [ ] alarmThresholds ; double [ ] troubleThresholds ; String [ ] current ; long lastEventTime = 0 ; boolean firstEventOccurred = f a l s e ; long intervalTime = 0 ; long currentTime = 0 ; try f Scanner sc = new Scanner ( fdsOutput ) ; BufferedWriter out = new BufferedWriter (new FileWriter ( parseOutput ) ) ; String l i n e ; sc . nextLine ( ) ; //must have the units l i n e l i n e = sc . nextLine ( ) ; // i n i t device names , alarmThresholds , troubleThresholds deviceNames = l i n e . s p l i t ( "n" ,n"j,n"jn" " ) ; alarmThresholds = new double [ deviceNames . length ] ; troubleThresholds = new double [ deviceNames . length ] ; f o r ( i n t i = 1 ; i < deviceNames . length ; i++) f String s = deviceNames [ i ] ; alarmThresholds [ i ] = s e n s o r s . get ( s ) . getAlarmThreshold ( ) ; troubleThresholds [ i ] = s e n s o r s . get ( s ) . getTroubleThreshold ( ) ; g 196 //TODO multiple f l o o r s f o r f i r s t event String sensor name ; while ( sc . hasNextLine ( ) ) f l i n e = sc . nextLine ( ) ; current = l i n e . s p l i t (" , ") ; i n t a l i v e s e n s o r s = 0 ; // counts number of a l i v e s e n s o r s in each round , i f a l l dead ends parsing double c u r s e n s v a l = 0 . 0 ; current [ 0 ] = current [ 0 ] . trim ( ) ; System . out . p r i n t l n (" current [ 0 ] = " + current [ 0 ] ) ; //TODO remove currentTime = ( long ) ( eToDouble ( current [ 0 ] ) 1000) ; intervalTime = currentTime lastEventTime ; //TODO can be moved into a i f body f o r e f f i c i e n c y f o r ( i n t i = 1 ; i < current . length ; i++) f // parse current values and compare with threshold values , i f an event occurs , write i t i f ( troubleThresholds [ i ] == 1) //dead sensor , no p r o c e s s i n g needed continue ; c u r s e n s v a l = eToDouble ( current [ i ] ) ; i f ( c u r s e n s v a l >= troubleThresholds [ i ] ) f // sensor has entered trouble s t a t e i f ( ! firstEventOccurred ) f // i f f i r s t event , write f i r s t E v e n t a l e r t out . write ( intervalTime + " 0 smk1 System F i r s t Event ") ; out . newLine ( ) ; intervalTime = 0 ; firstEventOccurred = true ; g sensor name = s e n s o r s . get ( deviceNames [ i ] ) . getName ( ) ; out . write ( "" + intervalTime + " 0 " + //TODO multiple f l o o r s s e n s o r s . get ( deviceNames [ i ] ) . getType ( ) + sensor name . charAt ( sensor name . length ( ) 1) + " Threshold " + " (" + sensor name + ") In Trouble ") ; out . newLine ( ) ; troubleThresholds [ i ] = 1; lastEventTime = currentTime ; g e l s e i f ( alarmThresholds [ i ] != 1 && c u r s e n s v a l >= alarmThresholds [ i ] ) f // not already in alarm , and going into alarm i f ( ! 
firstEventOccurred ) f // i f f i r s t event , write f i r s t E v e n t a l e r t 197 out . write ( intervalTime + " 0 smk1 System F i r s t Event ") ; out . newLine ( ) ; intervalTime = 0 ; firstEventOccurred = true ; g sensor name = s e n s o r s . get ( deviceNames [ i ] ) . getName ( ) ; out . write ( "" + intervalTime + " 0 " + //TODO multiple f l o o r s s e n s o r s . get ( deviceNames [ i ] ) . getType ( ) + sensor name . charAt ( sensor name . length ( ) 1) + " Threshold " + " (" + sensor name + ") In Alarm ") ; out . newLine ( ) ; alarmThresholds [ i ] = 1; lastEventTime = currentTime ; a l i v e s e n s o r s ++; g e l s e a l i v e s e n s o r s ++; //no change in s t a t e g g out . write ("0 0 smk1 System Simulation End") ; out . newLine ( ) ; // Close the output stream out . c l o s e ( ) ; g catch ( FileNotFoundException e ) f e . printStackTrace ( ) ; g catch ( IOException e ) f e . printStackTrace ( ) ; g System . out . p r i n t l n (" f i n i s h i n g generateFFA ") ; g / given a parameter of the form d . ddddddddE(+j )ddd convert to an return a double / p r i v a t e s t a t i c double eToDouble ( String s t r ) f System . out . pr in t (" e2d : " + s t r ) ; //TODO remove String [ ] toks = s t r . s p l i t ("E") ; double num = Double . parseDouble ( toks [ 0 ] ) ; i n t power = I n t e g e r . parseInt ( toks [ 1 ] . s u bs t ri ng (1) ) ; i f ( toks [ 1 ] . charAt (0) == ? ?) power = 1; double r e s u l t = num Math . pow(10 , power ) ; System . out . p r i n t l n (" == " + r e s u l t ) ; //TODO remove return r e s u l t ; g 198 p r i v a t e s t a t i c void parseForSensors ( ) f ArrayList devc = new ArrayList() ; TreeMap propid = new TreeMap() ; StringBuilder s t r = new StringBuilder ( ) ; try f Scanner sc = new Scanner ( fdsInput ) ; String l i n e ; String [ ] tokens ; while ( sc . hasNextLine ( ) ) f l i n e = sc . nextLine ( ) ; s t r = new StringBuilder ( l i n e ) ; while ( ! l i n e . endsWith ("/") ) f l i n e = sc . nextLine ( ) ; s t r . append ( l i n e ) ; g System . out . p r i n t l n ( s t r . toString ( ) ) ; //TODO remove i f ( s t r . toString ( ) . startsWith ("&DEVC") ) f System . out . p r i n t l n (" devc found ") ; //TODO remove devc . add ( s t r . toString ( ) ) ; //can ? t be processed u n t i l a f t e r reading whole f i l e g e l s e i f ( s t r . toString ( ) . startsWith ("&PROP") ) f System . out . p r i n t l n (" prop found ") ; //TODO remove FDSSensor ps = new FDSSensor ( ) ; //TODO get xyz // get sensor type name tokens = s t r . toString ( ) . s p l i t (" ?") ; String sensorTypeName = tokens [ 1 ] ; System . out . p r i n t l n ("name = " + sensorTypeName ) ; // get a c t i v a t i o n tokens = s t r . toString ( ) . s p l i t (".+ACTIVATIONnnD+") ; String a c t i v a t i o n = tokens [ 1 ] . s ub st r in g (0 , tokens [ 1 ] . indexOf (" ") ) ; i f ( a c t i v a t i o n . endsWith ( " . " ) ) a c t i v a t i o n = a c t i v a t i o n . s ub st r in g (0 , a c t i v a t i o n . length ( ) 1) ; System . out . p r i n t l n (" a c t i v a t i o n = " + a c t i v a t i o n ) ; //TODO remove ps . setAlarmThreshold ( Double . parseDouble ( a c t i v a t i o n ) ) ; // get trouble ps . setTroubleThreshold ( Double .MAX VALUE) ; // get type i f ( s t r . toString ( ) . contains (" ?LINK TEMPERATURE? " ) ) ps . setType (" het ") ;// ps . setType (" Heat ") ; 199 e l s e i f ( s t r . toString ( ) . contains (" ?SPRINKLER LINK TEMPERATURE ? " ) ) ps . setType (" f l o ") ;// ps . 
setType (" S p r i n k l e r ") ; e l s e i f ( s t r . toString ( ) . contains (" ?CHAMBER OBSCURATION? " ) ) ps . setType ("smk") ;// ps . setType ("Smoke") ; e l s e JOptionPane . showMessageDialog ( null , " unrecognized sensor type : nn" + s t r . toString ( ) ) ; System . out . p r i n t l n (" sensor type = " + ps . getType ( ) ) ; //TODO remove propid . put ( sensorTypeName , ps ) ; g g //TODO remove System . out . p r i n t l n (" propid : " + propid . toString ( ) ) ; System . out . p r i n t l n (" devc : " + devc . toString ( ) ) ; // parse d e v i c e s now that prop id information known f o r ( String s : devc ) f String [ ] toks = s . s p l i t (" ?") ; FDSSensor prop = propid . get ( toks [ 3 ] ) ; FDSSensor p = new FDSSensor ( prop . getXyz ( ) , toks [ 1 ] , prop . getType ( ) , prop . getAlarmThreshold ( ) , prop . getTroubleThreshold ( ) ) ; System . out . p r i n t l n ("new devc = " + p . toString ( ) ) ; //TODO remove s e n s o r s . put (p . getName ( ) , p) ; g g catch ( FileNotFoundException e ) f e . printStackTrace ( ) ; g g g / . / f f a / ui / MetaInfoLayer . java / 200 / Levon K. Mkrtchyan This Component d i s p l a y s annotation s t u f f s and accepts annotation commands / package f f a . ui ; import java . awt . BasicStroke ; import java . awt . Color ; import java . awt . Cursor ; import java . awt . Graphics ; import java . awt . Graphics2D ; import java . awt . Polygon ; import java . awt . Rectangle ; import java . awt . Stroke ; import java . awt . event . ComponentAdapter ; import java . awt . event . ComponentEvent ; import java . awt . event . ComponentListener ; import java . awt . event . KeyAdapter ; import java . awt . event . KeyEvent ; import java . awt . event . KeyListener ; import java . awt . event . MouseAdapter ; import java . awt . event . MouseEvent ; import java . awt . geom . AffineTransform ; import java . awt . geom . Line2D ; import java . awt . geom . Point2D ; import java . u t i l . I t e r a t o r ; import javax . swing . JComponent ; import f f a . model . Building ; import f f a . model . Floor ; import f f a . model . SensorLayer ; import f f a . model . SensorType ; import f f a . ui . MapDisplay ; import f f a . annotation . Zone ; public c l a s s MetaInfoLayer extends JComponentf s t a t i c f i n a l long serialVersionUID = 1L ; p r i v a t e ClockPanel clockPanel = n u l l ; p r i v a t e MapDisplay d i s p la y = n u l l ; p r i v a t e Building b uild ing ; p r i v a t e I n t e g e r curFloor = 0 ; p r i v a t e ComponentListener c l = new ComponentAdapter ( )f public void componentResized ( ComponentEvent e )f MetaInfoLayer . t h i s . s e t S i z e ( e . getComponent ( ) . getWidth ( ) , e . getComponent ( ) . getHeight ( ) ) ; g 201 g; public MetaInfoLayer ( MapDisplay display , ClockPanel cp )f clockPanel = cp ; t h i s . d is p l a y = d i s p la y ; t h i s . setOpaque ( f a l s e ) ; d i s pl a y . add ( this , new I n t e g e r (1) ) ; d i s pl a y . addComponentListener ( c l ) ; t h i s . setBackground (new Color (255 ,255 ,255 , Color .TRANSLUCENT) ) ; t h i s . setForeground (new Color (255 ,255 ,255 , Color .TRANSLUCENT) ) ; s et B ui l di n g ( d i s p l a y . bui ldin g ) ; g public void paint ( Graphics g )f Graphics2D g2d = ( Graphics2D ) g ; Stroke oldStroke = g2d . getStroke ( ) ; AffineTransform o r i g i n a l = g2d . getTransform ( ) ; g2d . transform ( d i s p l a y . getTransform ( ) ) ; // f l o a t zoom = di s p l a y . getZoom ( ) ; Floor f l o o r = getBuilding ( ) . 
getFloor ( curFloor ) ; f o r ( SensorType layerType : SensorType . values ( ) )f // f l o o r . l a y e r s . keySet ( ) ) f Color o u t l i n e = Color . black ; Color f i l l ; switch ( layerType )f case HEAT: o u t l i n e = new Color (0 ,0 ,0 ,0) ; f i l l = Color . red ; break ; case SMOKE: o u t l i n e = new Color (0 ,0 ,0 ,0) ; f i l l = new Color ( Color . gray . getRed ( ) , Color . gray . getGreen ( ) , Color . gray . getBlue ( ) ,150) ; break ; case FLOW: g2d . setStroke (new BasicStroke (10 f ) ) ; o u t l i n e = Color . blue ; f i l l = new Color (0 ,0 ,0 ,0) ; break ; d e f a u l t :// manual o u t l i n e = Color . green ; f i l l = new Color (0 ,0 ,0 ,0) ; break ; g f o r ( Zone z : f l o o r . getLayer ( layerType ) . getZones ( ) )f i n t opacity = z . getOpacity ( clockPanel . getClockTime ( ) ) ; i f ( opacity >0)f 202 // Color transparent = new Color ( opaque . getRed ( ) , opaque . getGreen ( ) , opaque . getBlue ( ) , opacity ) ; g . setColor ( f i l l ) ; g . f i l l P o l y g o n ( z . getZone ( ) ) ; g . setColor ( o u t l i n e ) ; g . drawPolygon ( z . getZone ( ) ) ; g g g2d . setStroke ( oldStroke ) ; g g2d . setTransform ( o r i g i n a l ) ; g public void s e tB u il d in g ( Building buil ding ) f t h i s . bu ildi ng = buil ding ; g public Building getBuilding ( ) f return bui ldin g ; g g / . / f f a / ui / ClockPanel . java / package f f a . ui ; import i n f o . clearthought . layout . TableLayout ; import java . awt . Color ; import java . awt . EventQueue ; import java . awt . Font ; import java . text . DateFormat ; import java . text . SimpleDateFormat ; import java . u t i l . Date ; import javax . swing . JLabel ; import javax . swing . JPanel ; import javax . swing . SwingConstants ; import org . joda . time . DateTime ; import org . joda . time . I n t e r v a l ; import org . joda . time . Period ; import org . joda . time . PeriodType ; @SuppressWarnings (" s e r i a l ") public c l a s s ClockPanel extends JPanelf p r i v a t e DateTime s t a r t ; p r i v a t e boolean running = true ; p r i v a t e Period duration = n u l l ; 203 p r i v a t e MapDisplay d i s p la y = n u l l ; JLabel msgLabel ; JLabel clockLabel ; DateFormat timeFormat = new SimpleDateFormat (" kk :mm: ss ") ; / S t a t i c panel doesn ? t do anything / public ClockPanel ( Date date , String msg) f super ( ) ; U t i l . checkEDT ( ) ; msgLabel = new JLabel (msg) ; f i n a l String text = timeFormat . format ( date ) ; clockLabel = new JLabel ( text ) ; layoutPanel ( ) ; g / Dynamic panel , w i l l update time / public ClockPanel ( DateTime start , String msg , MapDisplay disp ) f super ( ) ; U t i l . checkEDT ( ) ; t h i s . s t a r t = s t a r t ; msgLabel = new JLabel (msg) ; clockLabel = new JLabel ("") ; d i s pl a y = disp ; layoutPanel ( ) ; s t a r t ( ) ; g p r i v a t e void layoutPanel ( ) f clockLabel . setBackground (new Color (51 ,51 ,51) ) ; clockLabel . setOpaque ( true ) ; clockLabel . setForeground ( Color . white ) ; clockLabel . setHorizontalAlignment ( SwingConstants .CENTER) ; msgLabel . setFont ( msgLabel . getFont ( ) . deriveFont (15 f ) ) ; clockLabel . setFont ( clockLabel . getFont ( ) . deriveFont (25 f ) . deriveFont ( Font .BOLD) ) ; double border = 15; double iborder = 10; double [ ] [ ] s i z e = ffborder , TableLayout . FILL , borderg,fborder , . 2 5 , iborder , TableLayout . 
FILL , bordergg; setLayout (new TableLayout ( s i z e ) ) ; add ( msgLabel , " 1 , 1 , c , c ") ; add ( clockLabel , " 1 , 3 , f , f ") ; 204 g public void s t a r t ( ) f Thread t = new Thread (new Runnable ( )f public void run ( ) f while ( running ) f try f Thread . s l e e p (1000) ; g catch ( InterruptedException e ) f e . printStackTrace ( ) ; g ClockPanel . t h i s . t i c k ( ) ; g g g) ; t . s t a r t ( ) ; g p r i v a t e void t i c k ( ) f DateTime current = new DateTime ( ) ; duration = new I n t e r v a l ( start , current ) . toPeriod ( PeriodType . time ( ) ) ; f i n a l String text = String . format("

%02d:%02d:%02d" , duration . getHours ( ) , duration . getMinutes ( ) , duration . getSeconds ( ) ) ; EventQueue . invokeLater (new Runnable ( )f public void run ( ) f clockLabel . setText ( text ) ; g g) ; d i s pl a y . repaint ( ) ; // repaint the map every timestep g public DateTime getClockTime ( )f return s t a r t . plus ( duration ) ; g g / . / f f a / ui /DynamicDisplay . java / package f f a . ui ; import java . awt . Color ; import java . awt . GridLayout ; 205 import java . text . DateFormat ; import java . text . SimpleDateFormat ; import java . u t i l . concurrent . ConcurrentSkipListSet ; import javax . swing . BorderFactory ; import javax . swing . JLabel ; import javax . swing . JPanel ; import f f a . model . Event ; @SuppressWarnings (" s e r i a l ") public c l a s s DynamicDisplay extends JPanel f p r i v a t e ComponentList eventDisplayList ; p r i v a t e MapDisplay mapDisplay ; DateFormat timeFormat = new SimpleDateFormat (" kk :mm: ss ") ; public DynamicDisplay ( ) f U t i l . checkEDT ( ) ; eventDisplayList = new ComponentList ( ) ; setBorder ( BorderFactory . createLineBorder ( Color .GREEN) ) ; setLayout (new GridLayout (1 ,1) ) ; add ( eventDisplayList . getScrollPane ( ) ) ; g public void addMapDisplay ( MapDisplay md)f mapDisplay = md; g / @author jwu @param events the l i s t of events to r e f l e c t changes with Modified to take a l i s t of Events s i n c e t e s t i n g was moved to f f a . t e s t i n g / public void changeMade ( ConcurrentSkipListSet events ) f U t i l . checkEDT ( ) ; eventDisplayList . c l e a r ( ) ; f o r ( Event e : events ) f eventDisplayList . add ( getComponent ( e ) ) ; mapDisplay . b uild ing . addEvent ( e ) ; g eventDisplayList . r e f r e s h ( ) ; mapDisplay . repaint ( ) ; g / @author jwu @param e 206 @return panel Modified to take Events as opposed to EventStubs / public JPanel getComponent ( Event e ) f JPanel panel = new JPanel ( ) ; panel . setLayout (new GridLayout (1 ,1) ) ; panel . add (new JLabel ( e . getText ( ) ) ) ; panel . setBorder ( BorderFactory . createEtchedBorder ( ) ) ; return panel ; g g / . / f f a / ui /MapDisplay . java / package f f a . ui ; import java . awt . Color ; import java . awt . Cursor ; import java . awt . Dimension ; import java . awt . Graphics ; import java . awt . Graphics2D ; import java . awt . Point ; import java . awt . RenderingHints ; import java . awt . event . ComponentAdapter ; import java . awt . event . ComponentEvent ; import java . awt . event . MouseAdapter ; import java . awt . event . MouseEvent ; import java . awt . event . MouseWheelEvent ; import java . awt . geom . AffineTransform ; import java . awt . geom . Point2D ; import java . awt . image . BufferedImage ; import java . i o . F i l e ; import javax . imageio . ImageIO ; import javax . swing . BorderFactory ; import javax . swing . JLayeredPane ; import javax . swing . JPanel ; import f f a . annotation . AnnotationIO ; import f f a . model . Building ; @SuppressWarnings (" s e r i a l ") public c l a s s MapDisplay extends JLayeredPanef p r i v a t e java . awt . Window window ; p r i v a t e JPanel mapPanel ; p r i v a t e ControlPanel controlPanel ; 207 p r i v a t e f i n a l String floorPlanRootPath = " s t a t i c I n f o / f l o o r " ; p r i v a t e f i n a l String floorPlanExt = " . 
png " ; p r i v a t e f i n a l String annotationPath = " s t a t i c I n f o / annotation " ; p r i v a t e AffineTransform i n v e r s e = n u l l ; public Building buil ding = n u l l ; public MouseAdapter mapMouseListener = new MouseAdapter ( )f Point2D prev=n u l l ; boolean exited=true ; public void mouseEntered ( MouseEvent e )f window . setCursor (new Cursor ( Cursor .HAND CURSOR) ) ; exited=f a l s e ; g public void mouseExited ( MouseEvent e )f window . setCursor (new Cursor ( Cursor .DEFAULT CURSOR) ) ; prev=n u l l ; exited=true ; g public void mousePressed ( MouseEvent e )f window . setCursor (new Cursor ( Cursor .MOVE CURSOR) ) ; prev=screenToMap ( e . getPoint ( ) ) ; g public void mouseDragged ( MouseEvent e )f i f ( prev != n u l l )f Point2D cur=screenToMap ( e . getPoint ( ) ) ; controlPanel . pan ( ( i n t ) ( cur . getX ( ) prev . getX ( ) ) ,( i n t ) ( cur . getY ( ) prev . getY ( ) ) ) ; prev=screenToMap ( e . getPoint ( ) ) ; g g public void mouseReleased ( MouseEvent e )f i f ( ! exited )f window . setCursor (new Cursor ( Cursor .HAND CURSOR) ) ; prev=n u l l ; g g public void mouseWheelMoved ( MouseWheelEvent e )f controlPanel . zoomBy(Math . pow ( . 9 , e . getWheelRotation ( ) ) ) ; repaint ( ) ; g public void mouseClicked ( MouseEvent e )f i f ( e . getButton ( )==MouseEvent .BUTTON2)f controlPanel . zoomDefault ( ) ; repaint ( ) ; 208 g g g; public void setCursor ( Cursor cur )f window . setCursor ( cur ) ; g public void recomputeInverse ( )f i n v e r s e = getTransform ( ) ; tryf i n v e r s e = i n v e r s e . c r e a t e I n v e r s e ( ) ; gcatch ( Exception ex )f ex . printStackTrace ( ) ; i n v e r s e = n u l l ; g g public AffineTransform getTransform ( )f AffineTransform pannedZoomed = AffineTransform . g e t S c a l e I n s t a n c e ( controlPanel . getZoom ( ) , controlPanel . getZoom ( ) ) ; pannedZoomed . t r a n s l a t e ( controlPanel . getPanX ( ) , controlPanel . getPanY ( ) ) ; return pannedZoomed ; g public f l o a t getZoom ( )f return controlPanel . getZoom ( ) ; g public Point2D screenToMap ( Point pt )f return i n v e r s e . transform ( pt , n u l l ) ; g public i n t getCurrentFloor ( )f return controlPanel . getCurrentFloor ( ) ; g public f l o a t computeScale ( i n t f l o o r )f BufferedImage temp = bui ldin g . getFloor ( f l o o r ) . image ; return Math . min ( mapPanel . getWidth ( ) /( f l o a t ) temp . getWidth ( ) , mapPanel . getHeight ( ) /( f l o a t ) temp . getHeight ( ) ) ; g public BufferedImage getFloorImage ( i n t i )f return bui ldin g . getFloor ( i ) . image ; g public MapDisplay ( java . awt . Window parentWindow , Building buil ding ) f t h i s . window = parentWindow ; U t i l . checkEDT ( ) ; t h i s . bu ildi ng = buil ding ; 209 setBorder ( BorderFactory . createLineBorder ( Color .RED) ) ; //TODO decide on a s i z e f o r the map s e t P r e f e r r e d S i z e (new Dimension (600 , 600) ) ; // setLayout (new GridLayout (2 ,1) ) ; mapPanel = new JPanel ( )f public void paint ( Graphics g )f Graphics2D g2d = ( Graphics2D ) g ; g . setColor ( Color . white ) ; g . f i l l R e c t (0 , 0 , t h i s . getWidth ( ) , t h i s . getHeight ( ) ) ; g2d . setRenderingHint ( RenderingHints .KEY INTERPOLATION, RenderingHints .VALUE INTERPOLATION BICUBIC) ; AffineTransform originalTransform = g2d . getTransform ( ) ; AffineTransform pannedZoomed = getTransform ( ) ; Point2D temp = pannedZoomed . transform (new Point (0 ,0) , n u l l ) ; i f ( controlPanel . getWorkingImage ( ) != n u l l ) g . 
drawImage ( controlPanel . getWorkingImage ( ) ,( i n t ) temp . getX ( ) , ( i n t ) temp . getY ( ) , n u l l ) ; g2d . transform ( pannedZoomed ) ; // drawing s t a t e of f i r e goes here g2d . setTransform ( originalTransform ) ; g2d . transform ( AffineTransform . getTranslateInstance (10 , 10) ) ; g//end method draw g;// end mapPanel anonymous c l a s s d e f i n i t i o n mapPanel . addComponentListener (new ComponentAdapter ( )f public void componentResized ( ComponentEvent e )f controlPanel . recomputeScale ( ) ; g g) ; mapPanel . addMouseListener ( mapMouseListener ) ; mapPanel . addMouseMotionListener ( mapMouseListener ) ; mapPanel . s e t S i z e ( getWidth ( ) , getHeight ( ) ) ; t h i s . addComponentListener (new ComponentAdapter ( )f public void componentResized ( ComponentEvent e )f mapPanel . s e t S i z e ( getWidth ( ) , getHeight ( ) ) ; g g) ; add ( mapPanel , new I n t e g e r (0) ) ; controlPanel = new ControlPanel ( this , b uild ing . f l o o r s . s i z e ( ) ) ; recomputeInverse ( ) ; //add ( controlPanel ) ; g 210 g / . / f f a / ui /ComponentList . java / package f f a . ui ; import i n f o . clearthought . layout . TableLayout ; import java . awt . Component ; import java . awt . Dimension ; import javax . swing . JPanel ; import javax . swing . JScrollPane ; public c l a s s ComponentList f p r i v a t e JScrollPane s c r o l l P a n e ; p r i v a t e JPanel panel ; p r i v a t e TableLayout layout ; p r i v a t e s t a t i c f i n a l double s i z e [ ] [ ] = ffTableLayout . FILLg,fgg; public ComponentList ( ) f U t i l . checkEDT ( ) ; panel = new JPanel ( ) ; s c r o l l P a n e = new JScrollPane ( panel , JScrollPane . VERTICAL SCROLLBAR AS NEEDED, JScrollPane .HORIZONTAL SCROLLBAR NEVER) ; layout = new TableLayout ( s i z e ) ; panel . setLayout ( layout ) ; g public JScrollPane getScrollPane ( ) f return s c r o l l P a n e ; g public void c l e a r ( ) f layout = new TableLayout ( s i z e ) ; panel . setLayout ( layout ) ; panel . removeAll ( ) ; g 211 // Does not re render public void add ( Component c ) f layout . insertRow (0 , TableLayout .PREFERRED) ; panel . add ( c , "0 ,0 , f , t ") ; g public void addLast ( Component c ) f i n t row = layout . getNumRow( ) ; layout . insertRow ( row , TableLayout .PREFERRED) ; panel . add ( c , String . format ("0 ,%d , f , t " , row ) ) ; g public void r e f r e s h ( ) f U t i l . checkEDT ( ) ; panel . r e v a l i d a t e ( ) ; panel . repaint ( ) ; g public void s e t P r e f e r r e d S i z e ( Dimension d) f U t i l . checkEDT ( ) ; panel . s e t P r e f e r r e d S i z e (d) ; r e f r e s h ( ) ; g g / . / f f a / ui /Window . java / package f f a . ui ; import i n f o . clearthought . layout . TableLayout ; import java . awt . EventQueue ; import java . i o . F i l e ; import java . lang . r e f l e c t . InvocationTargetException ; import java . u t i l . Date ; import java . u t i l . concurrent . ExecutionException ; import java . u t i l . concurrent . ExecutorService ; import java . u t i l . concurrent . Executors ; import java . u t i l . concurrent . Future ; import javax . swing . JFrame ; import org . joda . time . DateTime ; import f f a . annotation . AnnotationIO ; 212 import f f a . model . Building ; import f f a . model . 
Incident ; @SuppressWarnings (" s e r i a l ") public c l a s s Window extends JFrame f p r i v a t e ExecutorService eventExecutor ; p r i v a t e S t a t i c D i s p l a y s t a t i c D i s p l a y ; p r i v a t e DynamicDisplay dynamicDisplay ; p r i v a t e MapDisplay mapDisplay ; p r i v a t e Future eventThread ; p r i v a t e Incident i n c i d e n t = new Incident ( ) ; // Three columns , l e f t and r i g h t are 25%, one row , f i l l s screen p r i v a t e s t a t i c f i n a l double s i z e [ ] [ ] = ff.25 , TableLayout . FILL , .25g,fTableLayout . FILL , . 1 5gg; public Window( ) fg / @author jwu @param i n c i d e n t / public Window( Incident i n c i d e n t )f t h i s . i n c i d e n t = i n c i d e n t ; g public void initGUI ( ) f U t i l . checkEDT ( ) ; / Load bui ldin g f i l e / F i l e annotationFolder = new F i l e (" s t a t i c I n f o ") ; Building buil ding = AnnotationIO . importAnnotation ( annotationFolder ) ; / I n i t i a l i z e Components / s t a t i c D i s p l a y = new S t a t i c D i s p l a y ( b uild ing ) ; dynamicDisplay = new DynamicDisplay ( ) ; mapDisplay = new MapDisplay ( this , b uild ing ) ; dynamicDisplay . addMapDisplay ( mapDisplay ) ; ClockPanel startTime = new ClockPanel (new Date ( ) , "Time of F i r s t Alarm ") ; ClockPanel elapsedTime = new ClockPanel (new DateTime ( ) , " Elapsed Time" , mapDisplay ) ; new MetaInfoLayer ( mapDisplay , elapsedTime ) ; / I n i t i a l i z e Frame / // I b e l i e v e t h i s i s the standard screen s i z e we ? re working with t h i s . s e t S i z e (1024 , 768) ; // Centers the window 213 t h i s . setLocationRelativeTo ( n u l l ) ; // This i s the main app window setDefaultCloseOperation ( JFrame .EXIT ON CLOSE) ; setLayout (new TableLayout ( s i z e ) ) ; add ( s t a t i c D i s p l a y , "0 ,0") ; add ( mapDisplay , "1 ,0 ,1 ,1") ; add ( dynamicDisplay , "2 ,0") ; add ( startTime , " 0 , 1 " ) ; add ( elapsedTime , " 2 , 1 " ) ; / I n i t i a l i z e Event Thread / eventExecutor = Executors . newSingleThreadExecutor ( ) ; eventThread = eventExecutor . submit (new Runnable ( ) f @Override public void run ( ) f eventThread ( ) ; g g) ; // TODO j u s t f o r debug purposes , remove l a t e r / Thread t = new Thread (new Runnable ( )f public void run ( ) f while ( true ) f dynamicDisplay . tempAddStuff ( ) ; try f Thread . s l e e p (2000) ; g catch ( InterruptedException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g U t i l . printDebug (" Updating ") ; i n c i d e n t . update ( ) ; g g g) ; t . s t a r t ( ) ; / g p r i v a t e void eventThread ( ) f while ( true ) f i n c i d e n t . waitForUpdate ( ) ; U t i l . printDebug (" Updated ") ; 214 EventQueue . invokeLater (new Runnable ( )f public void run ( ) f dynamicDisplay . changeMade ( i n c i d e n t . getEvents ( ) ) ; g g) ; //TODO n o t i f y the various panels that there ? s something new going on g g public void await ( ) f try f eventThread . get ( ) ; g catch ( InterruptedException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g catch ( ExecutionException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g g public s t a t i c void main ( String [ ] args ) f f i n a l Window window = new Window( ) ; try f EventQueue . invokeAndWait (new Runnable ( )f public void run ( ) f window . initGUI ( ) ; window . s e t V i s i b l e ( true ) ; g g) ; g catch ( InterruptedException e ) f // TODO Auto generated catch block e . 
printStackTrace ( ) ; g catch ( InvocationTargetException e ) f // TODO Auto generated catch block e . printStackTrace ( ) ; g window . await ( ) ; g g / . / f f a / ui / U t i l . java 215 / package f f a . ui ; import javax . swing . S w i n g U t i l i t i e s ; public c l a s s U t i l f // Allow compiler to remove debugging code public s t a t i c f i n a l boolean DEBUG = true ; public s t a t i c void checkEDT ( ) f i f ( !DEBUG) return ; i f ( ! S w i n g U t i l i t i e s . isEventDispatchThread ( ) ) throw new I l l e g a l S t a t e E x c e p t i o n ( ) ; g public s t a t i c void printDebug ( String s ) f i f ( !DEBUG) return ; System . e r r . p r i n t l n ( s ) ; g g / . / f f a / ui / S t a t i c D i s p l a y . java / package f f a . ui ; import java . awt . Color ; import java . awt . GridLayout ; import javax . swing . BorderFactory ; import javax . swing . JLabel ; import javax . swing . JPanel ; import f f a . model . Building ; @SuppressWarnings (" s e r i a l ") public c l a s s S t a t i c D i s p l a y extends JPanel f p r i v a t e ComponentList s t a t i c D i s p l a y L i s t ; public S t a t i c D i s p l a y ( Building bu ildi ng ) f U t i l . checkEDT ( ) ; setBorder ( BorderFactory . createLineBorder ( Color .BLACK) ) ; setLayout (new GridLayout (1 ,1) ) ; 216 s t a t i c D i s p l a y L i s t = new ComponentList ( ) ; add ( s t a t i c D i s p l a y L i s t . getScrollPane ( ) ) ; f o r ( String i n f o : bu ildi ng . s t a t i c I n f o ) f addEntry ( i n f o ) ; g g p r i v a t e void addEntry ( String text ) f JPanel panel = new JPanel ( ) ; panel . setLayout (new GridLayout (1 ,1) ) ; panel . add (new JLabel(""+text ) ) ; panel . setBorder ( BorderFactory . createEtchedBorder ( ) ) ; s t a t i c D i s p l a y L i s t . addLast ( panel ) ; g g / . / f f a / ui / ControlPanel . java / package f f a . ui ; import java . u t i l . ArrayList ; import java . u t i l . SortedSet ; import java . u t i l . TreeSet ; import java . awt . Color ; import java . awt . Dimension ; import java . awt . event . ; import java . awt . Image ; import java . awt . image . BufferedImage ; import java . awt . GridLayout ; import java . awt . Point ; import javax . swing . BorderFactory ; import javax . swing . border . ; import javax . swing . BoxLayout ; import javax . swing . ImageIcon ; import javax . swing . JButton ; // import javax . swing . JComponent ; import javax . swing . JLabel ; import javax . swing . JPanel ; import javax . swing . JScrollPane ; import javax . swing . 
J S l i d e r ; // @SuppressWarnings (" s e r i a l ") public c l a s s ControlPanel / extends JPanel / f p r i v a t e MapDisplay d i s p la y ; 217 p r i v a t e JButton mapLeft ; p r i v a t e JButton mapRight ; p r i v a t e JButton mapUp; p r i v a t e JButton mapDown; p r i v a t e JButton zoomIn ; p r i v a t e JButton zoomOut ; p r i v a t e JButton zoomDefault ; p r i v a t e JPanel replayPanel ; p r i v a t e JButton replayRun ; p r i v a t e J S l i d e r replayProgress ; p r i v a t e ArrayList f l o o r L i s t = new ArrayList() ; p r i v a t e JPanel f l o o r S e l e c t P a n e l ; p r i v a t e SortedSet t r o u b l e F l o o r L i s t = new TreeSet< FloorButton >() ; p r i v a t e JPanel troublePanel ; p r i v a t e ComponentList t r o u b l e F l o o r D i s p l a y L i s t ; p r i v a t e ComponentList f l o o r D i s p l a y L i s t ; p r i v a t e f l o a t s c a l e = 1.0 f ; p r i v a t e f l o a t zoom = 1.0 f ; p r i v a t e Point pan = new Point (0 ,0) ; p r i v a t e i n t currentFloor = 0 ; p r i v a t e Image workingImage = n u l l ; p r i v a t e ImageIcon arrowIcon = new ImageIcon (" images /arrow . png ") ; p r i v a t e ImageIcon blankIcon = new ImageIcon (" images / blank10 . png ") ; ActionListener mapActionListener = new ActionListener ( )f public void actionPerformed ( ActionEvent e )f String command = e . getActionCommand ( ) ; i f (command . equals (" mapLeft ") )f pan . x+=50/zoom ; g e l s e i f (command . equals (" mapRight ") )f pan . x =50/zoom ; g e l s e i f (command . equals ("mapUp") )f pan . y+=50/zoom ; g e l s e i f (command . equals ("mapDown") )f pan . y =50/zoom ; g e l s e i f (command . equals (" zoomIn ") )f zoomBy ( 1 .2 5 f ) ; g e l s e i f (command . equals ("zoomOut") )f zoomBy ( . 8 f ) ; g e l s e i f (command . equals (" zoomDefault ") )f zoomDefault ( ) ; g e l s e i f (command . matches ("^ f [0 9]+$ ") )f i n t f l o o r = I n t e g e r . parseInt (command . su b st ri n g (1) ) ; JButton newButton = f l o o r L i s t . get ( f l o o r ) ; 218 newButton . setBackground ( Color . green ) ; newButton . s et I c o n ( arrowIcon ) ; JButton oldButton = f l o o r L i s t . get ( currentFloor ) ; oldButton . setBackground (new Color (0xAAFFFF) ) ; oldButton . s et I c o n ( blankIcon ) ; currentFloor=f l o o r ; recomputeScale ( ) ; g d i s p la y . recomputeInverse ( ) ; d i s p la y . repaint ( ) ; g//end method actionPerformed g; p r i v a t e void recomputeWorkingImage ( )f BufferedImage unscaled = d i s pl a y . getFloorImage ( currentFloor ) ; workingImage = unscaled . getScaledInstance ( new Float ( unscaled . getWidth ( ) zoom s c a l e ) . intValue ( ) , new Float ( unscaled . getHeight ( ) zoom s c a l e ) . intValue ( ) , BufferedImage .SCALE AREA AVERAGING) ; g public void recomputeScale ( )f replayPanel . setLocation ( d i s pl a y . getWidth ( ) replayPanel . getWidth ( ) 15, d is p l a y . getHeight ( ) replayPanel . getHeight ( ) 15) ; f l o o r S e l e c t P a n e l . setLocation ( d i s p l a y . getWidth ( ) f l o o r S e l e c t P a n e l . getWidth ( ) 15, d is p l a y . getHeight ( ) /2 f l o o r S e l e c t P a n e l . getHeight ( ) /2) ; s c a l e = d is p l a y . computeScale ( currentFloor ) ; recomputeWorkingImage ( ) ; d i s pl a y . recomputeInverse ( ) ; d i s pl a y . repaint ( ) ; g public void zoomBy( double zoomBy)f zoom =zoomBy ; recomputeWorkingImage ( ) ; d i s pl a y . recomputeInverse ( ) ; g public void zoomDefault ( )f zoom=1.0 f ; recomputeWorkingImage ( ) ; d i s pl a y . 
recomputeInverse ( ) ; g public f l o a t getZoom ( )f return zoom s c a l e ; g public i n t getPanX ( )f 219 return pan . x ; g public i n t getPanY ( )f return pan . y ; g public void pan ( i n t x , i n t y )f pan . x+=x ; pan . y+=y ; d i s pl a y . recomputeInverse ( ) ; d i s pl a y . repaint ( ) ; g public Image getWorkingImage ( )f return workingImage ; g public i n t getCurrentFloor ( )f return currentFloor ; g public ControlPanel ( MapDisplay display , i n t numFloors ) f t h i s . d is p l a y = d i s p la y ; U t i l . checkEDT ( ) ; Border emptyBorder = BorderFactory . createEmptyBorder ( ) ; Color transparent = new Color (255 ,255 ,255 , Color .TRANSLUCENT) ; mapLeft = new JButton (new ImageIcon (" images / arrowLeft . png ") ) ; mapLeft . setActionCommand (" mapLeft ") ; mapLeft . addActionListener ( mapActionListener ) ; mapLeft . setBorder ( emptyBorder ) ; mapLeft . setBackground ( Color . white ) ; mapLeft . s e t P r e f e r r e d S i z e (new Dimension (10 ,10) ) ; mapRight = new JButton (new ImageIcon (" images / arrowRight . png ") ) ; mapRight . setActionCommand (" mapRight ") ; mapRight . addActionListener ( mapActionListener ) ; mapRight . setBorder ( emptyBorder ) ; mapRight . setBackground ( Color . white ) ; mapRight . s e t P r e f e r r e d S i z e (new Dimension (10 ,10) ) ; mapUp = new JButton (new ImageIcon (" images /arrowUp . png ") ) ; mapUp. setActionCommand ("mapUp") ; mapUp. addActionListener ( mapActionListener ) ; mapUp. setBorder ( emptyBorder ) ; mapUp. setBackground ( Color . white ) ; mapUp. s e t P r e f e r r e d S i z e (new Dimension (10 ,10) ) ; mapDown = new JButton (new ImageIcon (" images /arrowDown . png ") ) ; mapDown. setActionCommand ("mapDown") ; mapDown. addActionListener ( mapActionListener ) ; mapDown. setBorder ( emptyBorder ) ; 220 mapDown. setBackground ( Color . white ) ; mapDown. s e t P r e f e r r e d S i z e (new Dimension (10 ,10) ) ; zoomIn = new JButton (new ImageIcon (" images /zoomIn . png ") ) ; zoomIn . setActionCommand (" zoomIn ") ; zoomIn . addActionListener ( mapActionListener ) ; zoomIn . setBorder ( emptyBorder ) ; zoomIn . setBackground ( Color . white ) ; zoomIn . s e t P r e f e r r e d S i z e (new Dimension (20 ,20) ) ; zoomOut = new JButton (new ImageIcon (" images /zoomOut . png ") ) ; zoomOut . setActionCommand ("zoomOut") ; zoomOut . addActionListener ( mapActionListener ) ; zoomOut . setBorder ( emptyBorder ) ; zoomOut . setBackground ( Color . white ) ; zoomOut . s e t P r e f e r r e d S i z e (new Dimension (20 ,20) ) ; zoomDefault = new JButton (new ImageIcon (" images / zoomDefault . png ") ) ; zoomDefault . setActionCommand (" zoomDefault ") ; zoomDefault . addActionListener ( mapActionListener ) ; zoomDefault . setBorder ( emptyBorder ) ; zoomDefault . setBackground ( Color . white ) ; zoomDefault . s e t P r e f e r r e d S i z e (new Dimension (20 ,20) ) ; replayRun = new JButton (new ImageIcon (" images /run . png ") ) ; replayRun . setBorder ( emptyBorder ) ; replayRun . setBackground ( Color . white ) ; replayRun . s e t P r e f e r r e d S i z e (new Dimension (20 ,20) ) ; replayProgress = new J S l i d e r ( ) ; replayProgress . setValue (0) ; replayProgress . setBorder ( emptyBorder ) ; replayProgress . setBackground ( Color . white ) ; replayProgress . 
s e t P r e f e r r e d S i z e (new Dimension (150 ,20) ) ; f l o o r D i s p l a y L i s t = new ComponentList ( ) ; t r o u b l e F l o o r D i s p l a y L i s t = new ComponentList ( ) ; genFloors ( numFloors ) ; f o r ( FloorButton f : f l o o r L i s t ) f f l o o r D i s p l a y L i s t . add ( f ) ; g JButton currentFloorButton = f l o o r L i s t . get ( currentFloor ) ; currentFloorButton . setBackground ( Color . green ) ; currentFloorButton . s e t I c o n ( arrowIcon ) ; f l o o r D i s p l a y L i s t . r e f r e s h ( ) ; / t r o u b l e F l o o r L i s t . add ( f l o o r L i s t . get (2) ) ; // t r o u b l e F l o o r L i s t . add ( f l o o r L i s t . get (4) ) ; t r o u b l e F l o o r L i s t . add ( f l o o r L i s t . get (3) ) ; f o r ( FloorButton f : t r o u b l e F l o o r L i s t ) f t r o u b l e F l o o r D i s p l a y L i s t . add ( getComponent ( f ) ) ; g 221 t r o u b l e F l o o r D i s p l a y L i s t . r e f r e s h ( ) ; / JPanel movePanel = new JPanel (new GridLayout (3 ,3 ,10 ,10) ) ; JPanel f i l l e r = new JPanel ( ) ; f i l l e r . setBackground ( transparent ) ; movePanel . add ( f i l l e r ) ; movePanel . add (mapUp) ; f i l l e r = new JPanel ( ) ; f i l l e r . setBackground ( transparent ) ; movePanel . add ( f i l l e r ) ; movePanel . add ( mapLeft ) ; f i l l e r = new JPanel ( ) ; f i l l e r . setBackground ( transparent ) ; movePanel . add ( f i l l e r ) ; movePanel . add ( mapRight ) ; f i l l e r = new JPanel ( ) ; f i l l e r . setBackground ( transparent ) ; movePanel . add ( f i l l e r ) ; movePanel . add (mapDown) ; f i l l e r = new JPanel ( ) ; f i l l e r . setBackground ( transparent ) ; movePanel . add ( f i l l e r ) ; movePanel . setBounds (15 ,15 ,50 ,50) ; movePanel . setBackground ( transparent ) ; d i s pl a y . add ( movePanel , new I n t e g e r (1) ) ; JPanel zoomPanel = new JPanel (new GridLayout (3 ,1 ,5 ,5) ) ; zoomPanel . add ( zoomIn ) ; zoomPanel . add ( zoomDefault ) ; zoomPanel . add (zoomOut) ; zoomPanel . setBackground ( transparent ) ; zoomPanel . setBounds (30 ,100 ,20 ,70) ; d i s pl a y . add ( zoomPanel , new I n t e g e r (1) ) ; replayPanel = new JPanel ( ) ; replayPanel . setLayout (new BoxLayout ( replayPanel , BoxLayout . X AXIS) ) ; replayPanel . add ( replayRun ) ; replayPanel . add ( replayProgress ) ; replayPanel . setBackground ( transparent ) ; replayPanel . setBounds ( 200, 200, replayRun . g e t P r e f e r r e d S i z e ( ) . width + replayProgress . g e t P r e f e r r e d S i z e ( ) . width , Math . max( replayRun . g e t P r e f e r r e d S i z e ( ) . height , replayProgress . g e t P r e f e r r e d S i z e ( ) . height ) ) ; d i s pl a y . add ( replayPanel , new I n t e g e r (1) ) ; JLabel f l o o r S e l e c t L a b e l = new JLabel (" Floors ") ; // f l o o r S e l e c t L a b e l . setHorizontalTextPosition ( JLabel .LEADING) ; JScrollPane f l o o r S c r o l l = f l o o r D i s p l a y L i s t . getScrollPane ( ) ; f l o o r S c r o l l . setBorder ( emptyBorder ) ; f l o o r S c r o l l . setBackground ( transparent ) ; f l o o r S c r o l l . s e t P r e f e r r e d S i z e (new Dimension ( f l o o r S c r o l l . g e t P r e f e r r e d S i z e ( ) . width +10,Math . min (100 , f l o o r S c r o l l . 222 g e t P r e f e r r e d S i z e ( ) . height ) ) ) ; f l o o r S e l e c t P a n e l = new JPanel ( ) ; f l o o r S e l e c t P a n e l . setLayout (new BoxLayout ( f l o o r S e l e c t P a n e l , BoxLayout . Y AXIS) ) ; f l o o r S e l e c t P a n e l . 
add(floorSelectLabel);
        floorSelectPanel.add(floorScroll);
        floorSelectPanel.setBackground(Color.white);
        floorSelectPanel.setBounds(-200, -200, 55, // Math.max(floorScroll.getPreferredSize().width, floorSelectLabel.getPreferredSize().width),
                floorScroll.getPreferredSize().height + floorSelectLabel.getPreferredSize().height);
        display.add(floorSelectPanel, new Integer(1));

        /*
        add(mapLeft);
        add(mapRight);
        add(mapUp);
        add(mapDown);
        add(zoomIn);
        add(zoomDefault);
        add(zoomOut);
        add(replayRun);
        add(replayProgress);
        add(floorDisplayList.getScrollPane());
        add(troubleFloorDisplayList.getScrollPane());
        */
    }

    private void genFloors(int maxFloor) {
        Color fColor = new Color(0xAAFFFF);
        Border fBorder = BorderFactory.createEtchedBorder(EtchedBorder.RAISED);
        for (int i = 0; i < maxFloor; i++) {
            FloorButton fButton = new FloorButton(i);
            fButton.setBackground(fColor);
            fButton.setBorder(fBorder);
            fButton.setIcon(blankIcon);
            fButton.setIconTextGap(5);
            fButton.setHorizontalTextPosition(JButton.TRAILING);
            fButton.setActionCommand("f" + i);
            fButton.addActionListener(mapActionListener);
            floorList.add(fButton);
        }
    }

    @SuppressWarnings("serial")
    private class FloorButton extends JButton implements Comparable<FloorButton> {
        int floor;

        public FloorButton(int floor) {
            super("" + (floor + 1));
            this.floor = floor;
        }

        @Override
        public int compareTo(FloorButton arg0) {
            return Integer.valueOf(floor).compareTo(arg0.floor);
        }
    }
}
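Note on the simulation event files used by the testing classes above: from the lines written by FDSPlugin.generateFFA and the tokens read back by EventTesting.parseFile, each event line appears to consist of the time offset in milliseconds since the previous event, a floor index, a sensor label, a sensor category (Threshold or System), and, for Threshold sensors, a parenthesized sensor ID followed by a status message. The sample lines below are only an illustration of that reading; the labels, IDs, and times are invented and do not come from any recorded test.

0 0 smk1 System First Event
4000 0 smk3 Threshold (SD3) In Alarm
1500 0 het2 Threshold (HD2) In Trouble
0 0 smk1 System Simulation End

EventTesting.testBurnKim plays such a file back against the GUI: getTestThread sleeps for each time offset in turn and then hands the corresponding Event to EventLord.updateIncident, so the display advances at the same pace as the simulated incident.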