Geographic Information Systems in the PSAP: Strategic Planning for Data Integrity
The influence of geographic information systems (GIS) on emergency communications continues to move to the front and center of the workstation, and for good reason. The number one job of the public safety answering point (PSAP) telecommunicator is to determine a precise dispatchable location so that first responders reach the right place as quickly as possible. Map-centric software systems help visualize the location of an incident and its uncertainty, but as location and supplemental data technologies advance, can we start to think past the pin on a map and help telecommunicators with contextual awareness and decision-making? We need to move the mapping application from a secondary screen into the call-handling application.
Let’s start with the concept of decision-making. We all know how difficult the telecommunicator job can be. And while we can’t take the stress out of the situations they encounter, helping those in need on their worst day, we can try to streamline as much of the process as possible when it comes to characterizing an incident. Much of the time spent from answering a call to dispatching a response is focused on understanding the parameters of an incident and what help is required. Telecommunicators have to quickly and accurately determine the emergency resources needed based on limited information from callers who may be under extreme duress and confusion or, in today’s world, from a digital device of some sort. And let's face it, the devices today are not as intelligent as we had hoped. Devices can most certainly report and request help, but the context of the situation is still left to the people involved in the incident and those charged with responding to it.
From afar, a telecommunicator has to manage many degrees of ambiguity, from location uncertainty to the number of people or vehicles involved, the extent of their injuries, whether a crime scene is active and people are in danger, and so on. They have to envision the incident as they piece together the information gathered from their workstation screens and from the caller in their ear. Do I need to send one ambulance or two? Or three? On what floor of the building is the caller hiding from their assailant? Where is the science lab located inside the school where the fire alarm was triggered, and where are the nearest entry points?
As I study the science behind decision-making, I’ve noticed that much of the research around how humans work and think was designed in a controlled environment for situations where people had time to strategize and contemplate. The literature outlines clear, logical functions like gathering information, weighing alternatives, and considering consequences. And while the research that I found didn’t address the time pressures and degrees of uncertainty found in emergency communications, there were some interesting elements to the science that we need to consider as we design and implement software support systems for emergency communications.
Decision-making relies on determining a truth, or what we believe to be true. In the case of a 911 call, relying solely on information seen through the eyes of a caller or on a single security alarm is fraught with risk and justifiably treated with skepticism, depending on the situation. But the opposite holds true now as well: with the abundance of supplemental data, including incident-related imagery and Internet of Things (IoT) sensors, too many sources of truth can also be troubling. Those data can be conflicting, hard to decipher, or just plain erroneous. Telecommunicators who once relied on their experience and intuition to make decisions risk falling into the dreaded analysis-paralysis syndrome.
Consolidation versus Aggregation
One option, of course, would be to try to filter out or ignore all those supplemental data. The workflow works fine as is and has served us well for many years. Perhaps so, but it appears that a couple of trends may be pushing us more in the direction of the Trekkie mantra that resistance is futile. By many accounts, the number of sensor-based requests for assistance (RFAs) is rising dramatically, with one analyst, Frost & Sullivan, projecting that over half (52%) of 911 calls will come from sensors, not humans, within a few years. I don’t see a world where direct calls from people will go away anytime soon, but we also don’t get to dictate how people and businesses will use innovative information and communications technologies to interact with emergency services.
The first driver that I would look to is the public's expectation when it comes to supplemental data. Technology providers, including consumer and communications electronics manufacturers, are placing some powerful and interactive equipment in the hands of consumers and enterprises. Smartphones, connected vehicles and homes, surveillance cameras, and alarm systems are adding safety and lifestyle features that allow people and business operations to generate numerous forms of device-driven requests for assistance (RFAs). RFAs with supplemental data are thrown over the wall to PSAPs whether the PSAPs asked for them or not.
Ask yourself whether the public expects the supplemental data provided by their smart devices to be made available to and used by first responders to render aid, and whether first responders would likewise value those data but want the emergency communications centers (ECCs) to aggregate and analyze the relevant information for them. And if we did disregard those data for the sake of velocity, would auto manufacturers or security providers continue to invest in connected-vehicle or sensor innovation that went unused?
The public expects that the supplemental data they provide from their smart devices will be used by first responders to render aid.
Another challenge that falls, in part, on ECCs is the growing scarcity of first responder resources. News stories across the nation point to the double whammy of a noticeable rise in crime and false alarms pitted against the disturbing challenges in ECC staffing and recruiting. Some agencies are forced to alert their communities that certain incidents will no longer be serviceable by law enforcement or that community-based resources will be enlisted in the response. Those agencies may look to the ECC to better optimize their dispatch of resources, perhaps through some form of improved incident intelligence and insights.
To meet the ever-changing challenges of our times, we will have to look at the data flows, who “owns” those data, and how they are analyzed and transmitted across the 911 process. From what we hear and read, it appears that first responders want these data to help them strategize their response to an incident but likely won't have the time or wherewithal to analyze those data as they race to the scene. (For example, a camera feed of an active shooter scene can help police officers determine which entry points give them protective cover, and real-time temperature readings in a building can help a fire officer track the spread of a fire.)
What is the role of supplemental data when telecommunicators are under extreme and growing pressure to diagnose and dispatch as quickly as possible? They live by the mantra that seconds count, especially in life-and-death situations. Most, if not all, of the supplemental data are sent to the ECC, so by default, it’s in their lap, right? Well, yes and no. Before those data reach the telecommunicator’s fingertips, they pass through other hands, from the alarm company that generated the non-verbal RFA to the application and data brokering providers who help route and present the RFA and supplemental data to the ECC.
Let me digress a little here, if you will, on how we see a 911 call as a flow of 1s and 0s, and what people want to do with all this data that the connected world generates. A dear colleague of mine shared a presentation from a top marketing blogger, Scott Brinker, on the difference between data consolidation and data aggregation. Brinker made the distinction that where consolidation was defined as the act of “reducing a large set of things into a fewer number of things (or just one),” aggregation could be viewed as “making a large set of things easier to consume or access through a single source.” Why am I sharing this model with you here? Well, because I think the solution to the 911 data problem might lie in how we see it.
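Brinker's distinction can be made concrete with a small sketch. The scenario and data below are hypothetical, invented purely to illustrate the two verbs: consolidation reduces many location estimates to one actionable point, while aggregation keeps every estimate but exposes them through a single, easy-to-scan view.

```python
from statistics import median

# Hypothetical raw location estimates (lat, lon) for one incident,
# each reported by a different independent source.
estimates = {
    "caller_dbh": (40.7128, -74.0060),
    "vehicle_gps": (40.7131, -74.0057),
    "alarm_panel": (40.7125, -74.0063),
}

def consolidate(points):
    """Consolidation: reduce many estimates to a single representative point."""
    lats, lons = zip(*points.values())
    return (median(lats), median(lons))

def aggregate(points):
    """Aggregation: expose every estimate through one labeled, consumable view."""
    return [f"{source}: {lat:.4f}, {lon:.4f}"
            for source, (lat, lon) in points.items()]

print(consolidate(estimates))  # one point a dispatcher can act on
print(aggregate(estimates))    # every source, preserved and accessible
```

The telecommunicator-as-aggregator role maps to the second function: nothing is thrown away, but everything arrives through one door.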
A PSAP director or shift supervisor would likely be hard-pressed to ask their telecommunicators to become data consolidators. Perhaps that could be the role of another layer of data analysts, artificial intelligence, or fusion centers, but let’s set that aside for a future blog topic. Asking telecommunicators to run all those analytics in the 30-60 seconds they have to make a decision on a dispatch recommendation would be overly burdensome, but asking them to take on the role of data aggregators (with some technology support) might be feasible. They could become responsible for ensuring the relevant supplemental data are passed down the line to the dispatchers and first responders who might need them.
Location-Centric Consolidation
So, what if the data consolidation was executed as the data was transmitted from the incident to the telecommunicator? We have the data conduits in place with services like Intrado’s Emergency Data Broker to capture and serve up relevant, digestible data. We already work with IoT partners such as ADT to streamline alarm-based RFAs and deliver them in a way that is easily consumable by the PSAP.
The originators of such data are working hard to transmit emergency data in a format that is standardized and in a common language. In support of that effort, APCO (Association of Public-Safety Communications Officials) and NENA (National Emergency Number Association), for example, have devised a Vehicular Emergency Data Set (VEDS) for advanced automatic collision notification. At the risk of oversimplifying, picture that instead of a telematics service provider sending raw inclinometer data—the vehicle’s angular measurement in degrees or a percentage reference to a level plane—the PSAP received a data message that indicated whether a rollover had occurred and the number of rollover turns a vehicle experienced.
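As a toy illustration of that idea, the sketch below turns a stream of raw inclinometer readings into the kind of consolidated message described above. The field names, the 90-degree threshold, and the turn-counting logic are all assumptions for demonstration only; they are not the actual VEDS schema or any telematics provider's real algorithm.

```python
def summarize_rollover(roll_angles_deg):
    """Consolidate a time series of raw inclinometer readings (degrees of
    roll from a level plane) into a PSAP-consumable rollover summary.

    Illustrative assumptions: roll past 90 degrees counts as a rollover,
    and accumulated roll past 360 degrees counts as a full turn.
    """
    rolled_over = any(abs(a) >= 90 for a in roll_angles_deg)
    max_angle = max(abs(a) for a in roll_angles_deg)
    full_turns = int(max_angle // 360)
    return {"rollover": rolled_over, "full_turns": full_turns}

# Raw telemetry: the vehicle rolls past vertical and completes one full turn.
print(summarize_rollover([2, 15, 95, 180, 270, 360, 410]))
```

The telecommunicator never sees the degree-by-degree telemetry, only the answer they actually need: did it roll, and roughly how many times.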
For any particular incident, the PSAP could receive dozens of pieces of such data from the vehicles involved, bystanders calling 911, medical data from passenger smartwatches, highway traffic cameras, and so on. What might be the best way to organize and consolidate such data? Well, as with any 911 call, it starts with “What is the location of your emergency?” Location consolidation?
What might consolidating data around location mean for the decision-making we discussed earlier? Putting a pin on a map where the crash occurred is just the start of the relevant data gathering and analysis. A key role of location centricity is using the location as the center point to rapidly characterize and analyze an incident, reduce the many rings of ambiguity, and make those data consumable and useful to the telecommunicator.
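Mechanically, using the location as the center point can be as simple as a proximity filter: of all the supplemental sources streaming in, surface only those within a relevant radius of the incident pin. The sources and the 150-meter radius below are hypothetical, chosen only to illustrate the idea.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Hypothetical supplemental sources, each with its own reported position.
sources = [
    {"name": "traffic_cam_14", "lat": 40.7130, "lon": -74.0059},
    {"name": "smartwatch_hr", "lat": 40.7129, "lon": -74.0061},
    {"name": "cam_other_exit", "lat": 40.7210, "lon": -74.0150},
]

def relevant_to(incident_lat, incident_lon, sources, radius_m=150):
    """Location-centric filter: keep only sources near the incident."""
    return [s for s in sources
            if haversine_m(incident_lat, incident_lon, s["lat"], s["lon"]) <= radius_m]

print([s["name"] for s in relevant_to(40.7128, -74.0060, sources)])
```

The camera a kilometer away drops out automatically; the telecommunicator's screen starts from what is near the pin, not from everything the network happens to carry.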
What is the difference between map-centric and location-centric? Esri, the go-to leader in GIS, penned a great blog on the subject, illustrating the difference between your local municipality using GIS to tell you what day to put your trash to the curb and it showing you a map of trash collection pickup zones by day that makes you search for your address. Esri describes map-centric as starting with data and requiring the user to “explore the data to find answers,” while a location-centric approach “starts with purpose,” where insights are presented immediately.
For 911 workflows, that means moving the GIS from a distinct application into a fully integrated, location-centric call-handling system. Intrado Spatial Insight, together with data brokering, now consolidates information from multiple relevant sources into a single view of an incident over its course. Such integration means the state-of-the-art Device Based Hybrid (DBH) location data are seamlessly presented in the call-handling process. The first ring of uncertainty is addressed immediately, where the accuracy of DBH data, wherever available, greatly improves the confidence levels of the supplemental data gathered thereafter.
With a location-centric system, the PSAP's role of data aggregators can be explored further. In addition to the location and the nature of the incident, what additional data might be available to telecommunicators in the vicinity of the incident that can help their dispatchers and first responders render aid more efficiently and effectively? Where would a firefighter find the nearest hydrant on the block? Where can a police officer find a registered AED (Automated External Defibrillator) should an ambulance get stuck in traffic? To which floors of a building do the Z-axis coordinates and uncertainties translate, whereby the first responders could narrow their search for a victim?
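The Z-axis question above can be made concrete with a small sketch. The uniform floor height and the uncertainty handling are assumptions for illustration only; a real deployment would need a verified floor-height model for each building.

```python
def candidate_floors(z_above_ground_m, uncertainty_m, floor_height_m=3.0):
    """Translate a Z-axis estimate and its uncertainty band into the range
    of floors a search team should cover (ground floor = 1).

    Illustrative assumption: every floor is floor_height_m tall.
    """
    low = max(0.0, z_above_ground_m - uncertainty_m)
    high = z_above_ground_m + uncertainty_m
    first = int(low // floor_height_m) + 1
    last = int(high // floor_height_m) + 1
    return list(range(first, last + 1))

# A caller estimated at 17.2 m above ground with +/- 3 m uncertainty:
print(candidate_floors(17.2, 3.0))
```

Instead of handing responders a coordinate and an error radius, the system hands them a short list of floors to clear, which is the form the question actually takes on scene.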
Figure: Intrado Spatial Insight integrating geospatial intelligence into the call-handling application
A location-centric approach is designed to align with the pragmatic approach that PSAPs might want to take with supplemental data. We want to present the most relevant data, reducing ambiguities, particularly with trusted data from IoT partners, while simplifying the process wherever possible so as not to overwhelm the telecommunicators. Supplemental data are valuable only to the extent that emergency services can get them and use them wisely.
Supplemental data are valuable only to the extent that emergency services can get them and use them wisely.
Technology is rarely a panacea for workflow stressors, but in the case of potential information overload, we need to continue to press forward with software solutions focused on reducing uncertainty and optimizing decision-making. Location-centric decision engines are designed to support human experience and intuition with a broader set of data points and greater data accuracy. Processing and visualizing those data seem to be the key to embracing the opportunities presented by the next generation of 911, but as with any technology advancement, we want to open a dialog through the lens of innovation as much as through the introduction of a new workflow-optimization tool.
If you’ve been following my blog, I hope that you appreciate that I believe the Why matters. Why hybrid clouds. Why edge computing. Why location-centric over map-centric. These discussions may seem esoteric but the problems we are trying to address are as real to us as technologists and coders as they are to you as practitioners.