Reducing gun violence is a top policy priority in the United States. But it is hard to address the problem until we better understand its actual incidence, and existing data sources make that difficult. When the data don't tell the full story, research and sound policymaking suffer – and the data can even lead us in the wrong direction.
In a new working paper, Jillian Carr and I use data from a technology called ShotSpotter to present new evidence on the underreporting of gun violence. ShotSpotter uses audio sensors to detect and triangulate the location of gunfire incidents. Because it doesn’t depend on victims, witnesses, or police to report shots fired, it provides a more complete and accurate picture of gun violence in communities across the country than do other crime data sources.
We combine ShotSpotter data from Washington, D.C., and Oakland, CA, with the next-best data available on gun violence from those cities: reported crime data and 911 calls. Using individual gunfire incidents as initial events, we estimate the likelihood that each incident results in a 911 call or crime report.
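The matching step behind these estimates can be illustrated with a toy sketch. This is not the paper's actual code or data; the column choices, the 10-minute window, and the 500-meter radius are all illustrative assumptions. The idea is simply to count a gunfire incident as "reported" when at least one 911 call occurs close to it in both time and space, then divide matched incidents by total incidents.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

@dataclass
class Event:
    """A timestamped, geocoded record (a gunfire incident or a 911 call)."""
    time: datetime
    lat: float
    lon: float

def haversine_m(a: Event, b: Event) -> float:
    """Great-circle distance in meters between two events."""
    R = 6_371_000  # Earth radius in meters
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(h))

def reporting_rate(incidents, calls, max_minutes=10, max_meters=500):
    """Share of gunfire incidents with at least one nearby, timely 911 call.

    The time window and distance threshold are illustrative, not the
    paper's actual matching criteria.
    """
    window = timedelta(minutes=max_minutes)
    matched = 0
    for inc in incidents:
        if any(abs(c.time - inc.time) <= window and haversine_m(inc, c) <= max_meters
               for c in calls):
            matched += 1
    return matched / len(incidents)

# Toy example: two incidents, one of which draws a nearby call 3 minutes later.
t = datetime(2012, 1, 1, 22, 0)
incidents = [Event(t, 38.90, -77.03), Event(t, 38.95, -77.00)]
calls = [Event(datetime(2012, 1, 1, 22, 3), 38.901, -77.031)]
print(reporting_rate(incidents, calls))  # 0.5
```

A real analysis would of course need to handle duplicate calls, sensor location error, and calls that plausibly match more than one incident; the sketch only conveys the shape of the exercise.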
Few gunfire incidents in Washington, D.C., and Oakland result in a 911 call
It turns out that the reporting of gunfire incidents is extremely low. Based on data from January 2011 through June 2013 in D.C., only 22 percent of gunfire results in a 911 call, which could include calls for an ambulance. Twenty-one percent of gunfire results in a call for police assistance, which could include calls to report a crime, such as an assault with a dangerous weapon (the relevant offense if someone fires a gun at you). Even when someone at the scene calls 911 to report an injury or crime, we might expect nearby residents to call 911 as well to report hearing gunshots, just in case someone needs help. In fact, only 12 percent of gunfire incidents result in a 911 call to report shots fired.
These estimates also vary across the city. In D.C.'s Police District 3, which includes Adams Morgan, Shaw, and Columbia Heights, only 15 percent of gunfire incidents result in a call for police, and only 9 percent prompt a call to report hearing gunshots. In Police District 5, which includes much of Northeast D.C., those numbers are 40 percent and 18 percent, respectively.
Figure: Gunfire incidents and reported crime in Washington, D.C., 2011-2014
Figure: Gunfire incidents and reported crime in Oakland, CA, 2011-2014
Studies that rely on homicide data miss a big part of the gun violence story
It is clear that relying on 911 calls to report gunshots tells only a small part of the story, and the completeness of those data varies from place to place.
However, relying on reported crime data isn’t any better. The good news is that a tiny fraction of gunfire incidents results in a homicide. This is, in part, because a lot of bullets don’t hit their targets or innocent bystanders. (In dense urban areas like D.C. and Oakland, this is mostly a matter of luck.) In addition, improvements in medical technology have increased the likelihood that ER doctors can save a gunshot victim’s life. However, this means that studies of gun violence that rely on homicide data are missing a very large part of the gun violence story. In D.C., only 0.5 percent of gunfire incidents result in a homicide. In Oakland, between January 2008 and October 2013, only 1 percent of gunfire incidents result in a homicide.
Slightly more incidents result in a report of an assault with a dangerous weapon (AWDW): 2.3 percent in D.C. and 6.4 percent in Oakland. If the vast majority of gunfire in these cities is intended to threaten someone – which seems likely – then those incidents constitute AWDWs, and these estimates imply extremely low reporting rates. Such low rates are consistent with a scenario in which neither the victim nor the assailant wants to call or cooperate with the police (perhaps a drug deal gone bad, a dispute between rival gangs, or even a more mundane situation in which the victim simply doesn't trust the police to help and so chooses not to report the offense). This means that our traditional crime data are missing a substantial portion of violent crime in these cities. If we don't know where and when crime occurs, how can we address it?
Underreported gun violence creates problems for policymaking—but ShotSpotter data can help
Underreporting is an especially big problem when we’re trying to measure policies’ effectiveness. For instance, recent anecdotal evidence of police slowdowns (the so-called “Ferguson effect”) might prompt researchers and policymakers to analyze crime data to see what happened to public safety. However, if police are less active, the detection of crime will fall – even if criminal activity increases or remains unchanged. Likewise, estimating the effect of police surges is difficult because putting more police on the streets probably increases the detection of crime. When a policy affects both actual crime and the rate at which crime is reported, it could lead us to the wrong conclusion about a policy’s impact on criminal activity. (For another example, see our work on juvenile curfews.)
Using data from ShotSpotter – which detects crime through sensors, not reports from police or others – makes such studies far more straightforward. The accuracy of the sensors doesn't depend on human reporting behavior, so we get a clear picture of what happens to criminal activity and can more easily tell whether a policy is helping or hurting. Granted, these data only tell us what happens to gun violence. But gun violence is an important outcome, and it's correlated with other types of violent crime.
The main takeaway of this study is that surveillance and sensor data can paint a much more complete picture of what life is like in a dangerous neighborhood. (This assumes, of course, that we can get our hands on the data – a challenge that I’ve written about before.) As more such data become available, it will be easier to understand crime patterns and focus our resources on crime-reduction policies that are effective. Our cities will be safer as a result.