In this field, there are:
- Gold standard sources
- Acceptable sources
- Questionable sources
- Unacceptable sources
Gun Facts strives to use gold standard sources. This is not always possible as some data is fragmented, incomplete, or statistically fragile.
That said, there are some sources that are unusable, though this has not stopped lesser news organizations than yours from using them. Below is a list of sources and our observations about them.
Gold standard sources
CDC: The Centers for Disease Control and Prevention gathers a great deal of data from primary sources at the local level (e.g., death certificate tallies). They also have good statisticians and a long history of quality number crunching. Their web portals and data exploration tools are much better than average.
The primary portal is called WONDER (Wide-ranging ONline Data for Epidemiologic Research). Its “Underlying Cause of Death” database is the primary gateway to mortality data, down to the county level, for anything deadly, including guns.
The other important tool is WISQARS (Web-based Injury Statistics Query and Reporting System), which can lead you to non-fatal gun injury data.
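To show the kind of number crunching a WONDER export supports, here is a minimal sketch that parses WONDER-style tab-delimited rows and computes crude death rates per 100,000. The column names match WONDER’s export layout, but the sample figures are hypothetical and real exports carry additional columns and a footnote section.

```python
import csv
import io

# Illustrative rows in the tab-delimited layout that WONDER's "Underlying
# Cause of Death" tool exports. The figures below are hypothetical; real
# exports include more columns and a trailing "Notes" section we ignore here.
sample = "State\tDeaths\tPopulation\nIllinois\t1745\t12812508\n"

def crude_rates_per_100k(text):
    """Compute crude death rates per 100,000 from WONDER-style rows."""
    rates = {}
    for row in csv.DictReader(io.StringIO(text), delimiter="\t"):
        deaths = int(row["Deaths"])
        population = int(row["Population"])
        rates[row["State"]] = round(deaths / population * 100_000, 1)
    return rates

print(crude_rates_per_100k(sample))  # → {'Illinois': 13.6}
```

WONDER itself reports crude and age-adjusted rates, so this is mainly useful for checking its arithmetic or combining exports.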
FBI: The FBI started gathering structured crime data in the 1940s through the Uniform Crime Reports (UCR) program, recently augmented by the National Incident-Based Reporting System (NIBRS). It remains the primary source of data for crime measurement.
The FBI recently replaced an old, simple but useful crime data exploration system with a new web portal that is an impediment to easily obtaining quality crime data. The new system, the Crime Data Explorer (CDE), is useful for getting high-level data and making simple comparisons (e.g., the homicide rate for Illinois compared to the nation at large).
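A state-versus-nation comparison like the one above comes down to per-100,000 normalization, sketched below. The counts and populations are hypothetical, for illustration only, not actual CDE output.

```python
# The counts and populations below are hypothetical, for illustration only.

def rate_per_100k(count: int, population: int) -> float:
    """Convert a raw incident count into a rate per 100,000 residents."""
    return count / population * 100_000

illinois = rate_per_100k(970, 12_671_469)      # hypothetical state figures
national = rate_per_100k(22_900, 331_449_281)  # hypothetical U.S. figures
print(f"Illinois {illinois:.1f} vs. U.S. {national:.1f} per 100,000")
```

Normalizing by population is what makes jurisdictions of different sizes comparable; raw counts alone would simply rank populous states highest.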
NACJD: The National Archive of Criminal Justice Data (NACJD) is a repository for crime data and studies. They also take difficult-to-use data (e.g., the FBI’s raw UCR data) and repackage it to make it simpler to obtain and use.
BJS: The Bureau of Justice Statistics is the Department of Justice’s primary statistical agency for crime data and analysis. They produce a number of recurring reports (e.g., prisoner surveys about the sources of crime guns).
Acceptable sources
Violence Project and Gun Facts Mass Public Shooting database: There are several mass public shooting databases that use the criminology-consensus definition of “mass public shooting” (which is not the definition the Gun Violence Archive uses; more on that later).
The Violence Project (VP) is an amazing resource, tracking the largest collection of variables about mass public shootings (MPS). They have also been granted unparalleled access to shooter profiles, histories, mental health assessments, et cetera. Gun Facts refers to their work often.
That said, they don’t track everything, and their dataset organization has some inconveniences. Hence, Gun Facts maintains an independent MPS database that includes data not found elsewhere, in a slightly more user-friendly format. Interestingly, during a periodic update of our database, we identified two events that the Violence Project had missed (we notified them so they could update their database).
The reason we flag these as “acceptable” when they are otherwise gold standard is that the data on the details of MPS remains fragmented. You can learn much from either MPS database, but there are gaps in the data, especially for events that occurred before the commercialization of the Internet.
CPRC Global Mass Shooting Database: The Crime Prevention Research Center hired native speakers of the top global languages to comb the internet for all reported MPS. Though many regions with little mass media or with localized languages (e.g., Chemehuevi) are not covered, the database does cover the first and second worlds, and a fair bit of third-world countries.
Questionable sources
Gun Violence Archive (GVA): There are too many problems with GVA to consider it a reliable source. It may be acceptable for rough, ballpark assessments, but it is unsuitable for proper research.
Some of GVA’s shortcomings include:
- History of data quality issues
- Incomplete scope of data
- Invents terms that do not comport with established definitions
- No transparency on operations and methods
Early in its history, GVA was essentially crowdsourced and had few checks and balances. Sundry researchers noted duplicate entries, missing entries, events that were never updated after initial logging, no follow-up on adjudication, and other shortcomings.
Over time they have improved somewhat and made their methodology more defensible. But it is still woefully lacking, and their home-grown definitions have added to public confusion.
For example, GVA claims to have (as of 2023-10-17) 7,500+ media and law enforcement “sources.” That said, there are over 18,000 law enforcement agencies in the United States, so GVA’s sources cover well under half of them. Hence, GVA cannot be considered a comprehensive, ground-up data source.
Also, GVA does not base its data collection, classification, and reporting on established criminology terms, which makes comparing its work to gold standard sources nearly impossible. For example, the criminology-consensus definition of “mass public shooting” is a public shooting event in which four or more people are killed, not including the perpetrator, in a single location. GVA invented its own definition of “mass shooting” (note the omission of the word “public”), which reads, “Four or more shot [injured] and/or killed in a single event [incident], at the same general time and location not including the shooter.” This adds private-property events, self-defense incidents, gangland gun battles, and more; importantly, it counts both the wounded and the killed, whereas the rest of the field uses only deaths as the metric.
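The definitional gap can be made concrete with a short sketch. The function names, parameters, and the sample incident below are our own, for illustration only; they encode the two definitions as stated above.

```python
# Contrasting the two definitions. Field names and the sample incident
# are illustrative, not drawn from either organization's data schema.

def is_mps_consensus(killed_excl_shooter: int, in_public: bool) -> bool:
    """Criminology-consensus 'mass public shooting': four or more killed,
    not counting the perpetrator, in a public location."""
    return killed_excl_shooter >= 4 and in_public

def is_mass_shooting_gva(killed_excl_shooter: int, injured: int) -> bool:
    """GVA 'mass shooting': four or more shot (injured and/or killed),
    not counting the shooter, regardless of location."""
    return killed_excl_shooter + injured >= 4

# One killed and three wounded on private property: a GVA "mass shooting,"
# but not a consensus mass public shooting.
print(is_mass_shooting_gva(1, 3))            # → True
print(is_mps_consensus(1, in_public=False))  # → False
```

Incidents like this one are why GVA’s “mass shooting” counts run far higher than any tally built on the consensus definition.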
Unacceptable sources
K–12 School Shooting Database: There are many defects in this source. It appears to be a one-man project. As the curve of incidents shows, a “reporting awareness” effect appears to skew the data (i.e., over time, greater awareness of events led to more entries, rather than a consistent finding/gathering process having been in place from the start).
The next problem is that their definition of “school shooting” is impossibly broad. Some examples of what could constitute a “school shooting” in their database include:
- Brandishing a gun at a nighttime football game
- A stray bullet from a drive-by shooting landing on school grounds
- A depressed teacher committing suicide in the parking lot at 3 AM on the weekend
The overreach is so bad that in 2020, the Government Accountability Office (GAO) edited the K–12 database down to only authentic school shooting incidents (see the report titled “Characteristics of School Shootings”).