Nearly 18,000 people died in Syrian government prisons between March 2011 and December 2015. My colleagues and I at the Human Rights Data Analysis Group (HRDAG) calculated this estimate for Amnesty International, which published the findings on Aug. 18. That’s just a small fraction of the overall death toll from the Syrian war: Hundreds of thousands of people have been killed and millions more displaced by this five-years-and-counting conflict. But what does this catalog of shattered lives mean? How can we gather accurate statistics about the patterns of killings in Syria and use them as a tool to bring the perpetrators to justice?
The total number of violent deaths tells only a small part of the story. Beyond the magnitude of the horrors that have befallen Syrian civilians, it is even more important to understand the patterns of violence: Where in Syria is the violence worst? Are deaths decreasing over time or increasing? Does violence mostly affect urban elites or people in rural areas?
The fog of war is a convenient screen for those who commit mass violence, and even more so for those who apologize for the perpetrators. But careful documentation, meticulous forensic investigation, and rigorous statistics can help us pierce the fog to understand how violence works in conflict. Perhaps most importantly, real facts enable meaningful accountability, from historical memory to war crimes trials.
Over the last 20 years, rigorous statistical analysis of patterns of killing has helped bring mass murderers to justice in trials for crimes against humanity, war crimes, and genocide. For example, in the trial of former Yugoslav President Slobodan Milosevic in 2002, I presented statistics about migration and killings in Kosovo in 1999. While we cannot prove with statistics that the violence was caused by actions of the Yugoslav government, the patterns make clear that the war’s suffering could not have been the result of NATO’s bombing campaign, nor could it have been caused by the insurgents of the Kosovo Liberation Army. The prosecutor used this argument to rebut the defense’s deflection of responsibility, and to support the claim that Yugoslav government actions were the most likely cause of the suffering.
In human rights, survivors’ testimonies tell us what happened — but statistics answer questions about how widespread or systematic the violence was. They provide the context so that the victims’ stories can be understood as part of a wider framework.
Counting the dead
Real accountability requires real statistics. When a group reports that more than 250,000 people have been killed in Syria, it might mean many different things. The group might have a list of 250,000 individuals, identified by name and place of death. This is what nongovernmental organizations like the Violations Documentation Centre, the Syrian Network for Human Rights, and the Syrian Observatory for Human Rights do. The 2013 reports HRDAG wrote for the United Nations also provided counts of identified individuals drawn from several sources.
These reports have the advantage of knowing something about each death — cataloging the total number of casualties is as simple as counting the names on all the available lists and then figuring out which records are duplicate entries about the same people. However, the lists do not capture the full magnitude of the violence:
If nobody reports a death, it’s as if it never occurred. Of course, collecting details on the victims of violence in the midst of conflict is enormously difficult, and consequently many deaths are never documented. That is especially true when it comes to the thousands of Syrians who have disappeared into government prisons. Our estimate that nearly 18,000 people died in Syrian government prisons combines the records of thousands of victims we can name individually with a probability model that enabled us to estimate the killings that cannot be directly observed. This allowed us to draw conclusions about all the dead — even when we had only some of the data.
Our calculation began by combining data from four Syrian NGOs: the Violations Documentation Centre, the Syrian Network for Human Rights, the Damascus Center for Human Rights, and the Syrian Center for Statistics and Research. We merged the NGOs’ lists of deaths, identifying all the records that refer to the same death, a process called “database de-duplication.” Then we identified the deaths that occurred while the victims were held by Syrian authorities.
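That merge step can be sketched in code. The example below is only a toy illustration, not HRDAG’s actual record-linkage procedure: the field names (name, date, governorate, source), the matching rule, and the 0.9 name-similarity threshold are all assumptions made for the sketch.

```python
# Toy de-duplication across several casualty lists. Field names, the matching
# rule, and the similarity threshold are illustrative assumptions, not the
# actual HRDAG pipeline.
from difflib import SequenceMatcher

def similar(a, b, threshold=0.9):
    """Return True when two name strings are nearly identical."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def deduplicate(records):
    """Merge records that appear to describe the same death.

    Each record is a dict such as:
      {"name": "...", "date": "2013-01-05", "governorate": "Homs", "source": "VDC"}
    Two records match when they share a date and governorate and have
    very similar names.
    """
    unique = []  # one entry per distinct death
    for rec in records:
        for u in unique:
            if (rec["date"] == u["date"]
                    and rec["governorate"] == u["governorate"]
                    and similar(rec["name"], u["name"])):
                u["sources"].add(rec["source"])  # same death, seen on another list
                break
        else:
            entry = dict(rec)
            entry["sources"] = {entry.pop("source")}
            unique.append(entry)
    return unique
```

The set of sources attached to each unique death, that is, which lists reported it, is exactly the overlap information that the capture-recapture step described next depends on.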
With the records of deaths in custody, we used a technique called “capture-recapture” to estimate the number of deaths in custody that were not reported to our partner NGOs. I once explained this method by asking readers to imagine that they were trying to determine which of two rooms was larger:
“Our only tool for assessing the rooms’ sizes is a handful of small rubber balls. The balls have the curious property that when they strike each other, they make a distinctive clicking noise. We take the rubber balls, throw them into the first room and listen — click, click, click-click, click. Then we gather the balls and throw them into the second room — click. Which room is larger?
The second room is larger. The smaller room forces the balls together more closely than the larger room, so the balls have less room to bounce around, and they therefore hit each other more often.”
In the context of estimating total homicides in a conflict, the colliding balls represent duplicate entries of the same death across multiple databases. When multiple, independent databases report many of the same deaths, we infer that together they may be close to covering all the deaths. But when the deaths each group reports rarely appear on the other groups’ lists, we infer that many deaths have not yet been reported by any project.
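For two lists, the simplest formal version of this intuition is the classic two-list capture-recapture estimate, shown below in Chapman’s form. The counts are invented purely for illustration; the Syria analysis relied on models that handle four lists and unequal chances of a death being recorded.

```python
# Toy two-list capture-recapture (Chapman's version of the Lincoln-Petersen
# estimator). The counts below are invented for illustration only.

def chapman_estimate(n1, n2, m):
    """Estimate the total number of deaths from two overlapping lists.

    n1: deaths on list A
    n2: deaths on list B
    m:  deaths appearing on both lists (the "clicks")
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Two documentation groups with heavy overlap between their lists...
print(chapman_estimate(n1=900, n2=700, m=600))  # ~1,050: lists nearly complete
# ...and the same list sizes with very little overlap.
print(chapman_estimate(n1=900, n2=700, m=60))   # ~10,353: many deaths unreported
```

With heavy overlap, the estimate barely exceeds the documented count; with sparse overlap, the same list sizes imply a far larger hidden total. That is the few-clicks-in-a-big-room intuition expressed in numbers.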
One way to spot a formal model — as opposed to a count — is that the former will come with what nonstatisticians sometimes call a “margin of error.” A probability model tells us what the most likely answer is, and the probability that, given the data, the real answer lies within a certain range.
In the case of our report on deaths in Syrian government prisons, there is a 95 percent probability that the true value falls in the range of 13,409 to 18,713 deaths. This range tells us what the true number of people killed probably is, taking into account both the deaths that have been observed and an estimate of those that were never documented by the four NGOs that provided data.
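One simplified way such a range can arise is a parametric bootstrap around a two-list estimate, sketched below with invented counts. This is not how the published interval was computed, which came from a model fit jointly to the four lists, but it shows how a probability model turns data into a range rather than a single number.

```python
# Illustrative parametric bootstrap of a two-list capture-recapture estimate.
# All inputs are invented; treat this only as a sketch of how an uncertainty
# interval arises, not as the method behind the published Syria figures.
import random

def chapman(n1, n2, m):
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

def bootstrap_interval(n1, n2, m, reps=1000, seed=1):
    random.seed(seed)
    n_hat = chapman(n1, n2, m)       # point estimate of the total
    p1, p2 = n1 / n_hat, n2 / n_hat  # each list's estimated coverage
    estimates = []
    for _ in range(reps):
        # Re-simulate which deaths land on each list, assuming the lists
        # record deaths independently of each other.
        a = b = both = 0
        for _ in range(round(n_hat)):
            on1 = random.random() < p1
            on2 = random.random() < p2
            a += on1
            b += on2
            both += on1 and on2
        if both:  # skip the rare replicate with no overlap, where the estimate blows up
            estimates.append(chapman(a, b, both))
    estimates.sort()
    lo = estimates[int(0.025 * len(estimates))]
    hi = estimates[int(0.975 * len(estimates))]
    return n_hat, lo, hi

print(bootstrap_interval(n1=900, n2=700, m=60))
```

The 2.5th and 97.5th percentiles of the resimulated estimates play the role of the 95 percent range reported above.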
Probability models are particularly useful for answering the question: Is violence increasing or decreasing? In a study my HRDAG colleagues and I published last year, we showed that trends can be difficult to assess using only the data we can observe. For example, we compared the governorates of Hama and Homs from December 2012 to March 2013. In the figure below, the purple bars show killings that were observed by one or more of the human rights NGOs tracking deaths. The light blue portion of each bar shows the estimated total, and the black vertical line shows the confidence interval for each estimate.
In Homs, the observed data track the estimate of total deaths. In other words, the top of the purple bars goes up and down consistently with the top of the light blue bars. But in Hama, the observed total across the top of the purple bars drops between December 2012 and January 2013, while the estimated total increases. This means that there was a surge in unobserved killings in January 2013.
If we had treated the observed totals as representing the overall picture, we would still have largely understood the pattern of violence in Homs. But our analysis of Hama would have missed the crucial surge in violence in January 2013. That’s because events on the ground were changing at the time: A major military push by rebel forces started in mid-December 2012, but the rebels’ offensive had mostly stalled by February. The increase in killings in January may have been undocumented because human rights observers were unable to adequately access the region due to the violence. Or perhaps as fighting intensified, secret executions of civilians also increased. What we can affirm is that something quite substantial happened in January, but it went mostly unrecorded.
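A small numerical sketch, with invented figures, shows how an observed count can fall even as the true count climbs: all it takes is for documentation coverage to collapse during the worst of the fighting.

```python
# Invented illustration: observed counts can fall while true killings rise,
# if documentation coverage drops during the worst of the fighting.
months = ["Dec 2012", "Jan 2013", "Feb 2013"]
true_killings = [400, 650, 500]  # hypothetical true totals
coverage = [0.80, 0.35, 0.70]    # fraction of deaths observers manage to document

for month, total, c in zip(months, true_killings, coverage):
    observed = round(total * c)
    print(f"{month}: observed {observed} of an actual {total}")
# Dec 2012: observed 320 of an actual 400
# Jan 2013: observed 228 of an actual 650   <- documented count drops, true count surges
# Feb 2013: observed 350 of an actual 500
```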
If we want to get pattern analysis right, we need to use statistical inferences that include — as former Defense Secretary Donald Rumsfeld might have put it — both the known knowns and the known unknowns. Accounting for the process by which data are generated — adding the known unknowns to the known knowns — is the central point of inferential statistics.
How statistics can serve justice
This is far more than an abstract intellectual exercise. Developing reliable statistics about mass violence is the first step in bringing its perpetrators to justice.
Justice for mass crimes is not the simple story of holding a perpetrator accountable for his — and it is, generally, his — specific bloody deeds. War crimes and crimes against humanity are crimes of policy: Those responsible formulated goals, made plans, and gave orders that other people executed. Statistics can help shed light on how these policies played out in practice.
The root of the word “statistics” originally meant “knowledge about the state.”
To understand how a state commits crimes, we need detailed statistics on the dynamics of the violence: Who suffered, where, and when? Who committed the crimes? If we don’t know exactly who did it, do the patterns in time and geography yield clues? From these statistical patterns, can we infer the existence of a policy of deliberate action or criminal neglect?
Statistics played important roles in the recent trials of three former heads of state for war crimes, crimes against humanity, and genocide, as in the Milosevic trial. Human rights narratives are fundamentally about the experience of the victims. But in cases that involve thousands of deaths over months and spread across a wide area, statistics can help us see the big patterns.
In the 2013 trial of former Guatemalan President José Efraín Ríos Montt for genocide, statistics showed that the probability of being killed by the army was eight times greater for indigenous people than for their nonindigenous neighbors. In the 2015 trial of former Chadian President Hissène Habré, statistics played a slightly different role. Analysis of government documents recovered by the Chadian truth commission showed that in the jails run by the secret police, prisoner mortality in 1986 and 1987 was hundreds of times greater than normal adult mortality in Chad. Worse, prisoners died at a rate up to five times greater than the death rate among U.S. prisoners in Japanese custody during World War II. (Mistreatment of American and British prisoners was judged a war crime by the International Military Tribunal for the Far East, and the total mortality among the prisoners was explicitly cited as one of the bases for that judgment.) In the Habré verdict, the judges cited the statistical analysis to rebut defense claims that the prisons were not especially unpleasant.
The Syrian civil war will eventually end, and if Syria is to turn to a more just and more democratic future, there will need to be some form of transitional justice. This process may include public acknowledgements of past crimes, criminal prosecutions, memorials, an official history written by a truth commission, or some combination of those steps. But all of these approaches depend on knowing the truth about the past. One part of that truth is the statistical pattern — the trends and magnitude of mass violence.
This work has already started with the courageous documentation done by human rights NGOs on the ground. To be just, to be proportional, to be legitimate, we need to be right, and rigorous statistics are a key part of being right about mass violence.