IBC's system of documenting violent deaths in Iraq is but one of several approaches to gaining knowledge of conflict casualties. Another approach is to estimate their number through statistical sampling, and most such studies to date produce estimates somewhat higher than, but generally compatible with, IBC's, including a survey published in PLoS Medicine in 2013.

Here we compare the results of this survey to IBC and others, as part of an occasional series to provide background info on specialist topics in casualty recording and estimation.

PLOS study compared to IBC and others

21 April 2015

The following article compares numbers from Iraq Body Count (IBC) with the results of the University Collaborative Iraq Mortality Survey (UCIMS), whose results were reported in the journal PLoS Medicine in October 2013. The report's lead author was Amy Hagopian, and its co-authors included Gilbert Burnham, the lead author of the Lancet survey report published in 2006, and Tim Takaro, co-author of a recent, poorly researched publication by "Physicians for Social Responsibility" that was hypercritical of IBC.

In the latter publication, and certain other commentary, it is claimed that the UCIMS survey supports the extremely high estimates of the earlier Lancet survey and the notion that IBC is a “gross underestimate” of deaths from violence in Iraq. These assertions have no basis in reality. The results of the UCIMS survey are instead consistent with other credible and much larger studies such as the Iraq Family Health Survey (IFHS) and the earlier Iraq Living Conditions Survey (ILCS), and in fact consistent with the numbers recorded by IBC, which we show below based on the data published along with the UCIMS report in the PLoS journal. It is the Lancet survey, and not IBC, which has been and remains the outlier among the credible research on the Iraq conflict.

The UCIMS report focused primarily on estimating “excess deaths”, which includes both violent deaths and non-violent deaths believed to be indirectly caused by the war. This needs to be distinguished from IBC numbers, which refer to civilian deaths from violence. Currently, this IBC number for civilian deaths from violence stands at 137,693-155,994. IBC also publishes on its home page a broader number that includes both civilian and combatant deaths, which currently stands at 211,000.

The appropriate comparison to be made between IBC and UCIMS is on violent deaths, and since the UCIMS survey does not distinguish between civilians and combatants, the proper comparison is with IBC’s number that includes combatants (currently 211,000). This number also includes the deaths of Coalition forces, which are not included in the UCIMS survey and should therefore be removed for this comparison, making the final comparable number 206,000. Other comparisons between the two sources – such as civilian violent deaths vs all violent deaths, or civilian violent deaths vs violent and non-violent “excess deaths” – would be comparing different things, apples and oranges.

Another key point to be taken into account is that the UCIMS study covered the period 2003-2011, while the IBC numbers given above are to the present, and must therefore be adjusted down to the 2003-2011 time period. The corresponding IBC number for civilian deaths would therefore be 119,000, while the broader IBC number including combatants would be roughly 162,000. This number of 162,000 is therefore the appropriate IBC number to compare to the UCIMS survey.
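The adjustments described above reduce to simple subtraction, as a quick sketch with the article's rounded figures shows (the 5,000 Coalition figure is inferred from the 211,000 to 206,000 step stated above, not given separately):

```python
# Making IBC's broader number (civilians plus combatants) comparable to
# the UCIMS survey, which excludes Coalition forces.
ibc_broad = 211_000
coalition = 5_000          # implied by the article's 211,000 -> 206,000 step
ibc_comparable = ibc_broad - coalition
print(ibc_comparable)      # 206000

# Restricted to the UCIMS coverage period (2003-2011), the article's
# corresponding broader IBC figure is roughly 162,000.
ibc_2003_2011 = 162_000
```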

The UCIMS study actually performed two separate random sample surveys, a Household Survey and a Sibling Survey. The report's main excess deaths estimate was drawn from the Household Survey, but the only proper estimate for violent deaths given in the report is "132,000 (95% UI 89,000–174,000)" deaths among adults aged 15-60, drawn from the Sibling Survey. The IBC total of 162,000 is higher than this estimate. A direct comparison is not quite appropriate, however, because the UCIMS estimate is limited to victims aged 15-60 while the IBC number includes victims of all ages. The 15-60 age category would nonetheless account for the vast majority of violent deaths in the war. This means the IBC number would have to be lowered somewhat from 162,000 to make an appropriate comparison, but the reduction would be unlikely to exceed about 20,000, and IBC would still come out either modestly higher than, or about the same as, the UCIMS estimate of 132,000. So we can already see that IBC matches up extremely well with the only proper estimate for violent deaths actually given in the report. This UCIMS estimate does not suggest IBC missed any deaths; if anything, it would suggest that IBC was slightly too high.

The UCIMS Household Survey was separately used to make the report's main estimate of "excess deaths", but the report did not actually give any proper violent deaths estimate from this Household Survey. It gives only an excess deaths estimate of "405,000 (95% UI 48,000–751,000)", and the only hint at the violent death numbers in the Household Survey is the following statement from the report:

“We estimate that more than 60% of excess deaths were directly attributable to violence, with the rest associated with the collapse of infrastructure and other indirect, but war-related, causes.” [emphasis added]

The report does not provide a precise estimate of violent deaths from the Household Survey, nor an uncertainty interval (UI), just this somewhat vaguely worded proportion, asserting that “more than 60%” of 405,000 estimated excess deaths were from violence. This would translate to an estimate of around 245,000 violent deaths, or “more” than this depending on what exactly the loose phrase “more than 60%” is supposed to mean. However, the “more than 60%” assertion itself appears to be incorrect based on the survey data the authors published along with their report.
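The translation from proportion to count is simple arithmetic with the report's published central estimate; "more than 60%" implies a figure somewhat above this floor, consistent with the roughly 245,000 mentioned above:

```python
# The report's "more than 60%" of its 405,000 excess deaths estimate
# implies a violent deaths figure above this floor:
excess_deaths = 405_000
violent_floor = 0.60 * excess_deaths
print(round(violent_floor))   # 243000, i.e. "more" than roughly 243,000
```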

As in many surveys, some provinces in the UCIMS Household Survey were over- or under-sampled relative to others. This is not unusual, and it is typically corrected by weighting when building the final estimates. If, for example, you over-sample high-violence areas, your estimates will overstate violent deaths unless the sample is correctly weighted; likewise, if you over-sample peaceful areas, your estimates will understate violent deaths.

It appears that the UCIMS authors did not do this weighting in making their estimate of excess deaths. In this particular case, the "excess deaths" estimate happens to come out to almost exactly the same number whether the weighting is done or not, and the authors calculated it without the weighting. The problem is that while the weighting does not appear to matter to the final excess deaths number, it does matter to the violent deaths number. Violent deaths have a very different geographic distribution, time trend and victim demographics than non-violent deaths, so issues of weighting can affect the two estimates very differently. In the case of the UCIMS Household Survey, the un-weighted (incorrect) violent deaths estimate from the published survey data would come out to approximately 245,000, and this appears to be how the authors arrived at "more than 60%" of 405,000. The weighted (correct) estimate, however, would come out to approximately 220,000, and it would be more appropriate to say that about 55% of excess deaths were due to violence, not "more than 60%" as stated in the report.
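The effect of weighting can be seen in a deliberately simple, hypothetical two-province example; the province names, population shares, sample sizes and death counts below are invented for illustration and are not the UCIMS data:

```python
# Hypothetical: province A (violent) is over-sampled relative to its
# population share, province B (peaceful) is under-sampled.
provinces = {
    # name: (population share, households sampled, violent deaths found)
    "A": (0.20, 600, 30),   # over-sampled high-violence province
    "B": (0.80, 400, 4),    # under-sampled low-violence province
}

total_sampled = sum(n for _, n, _ in provinces.values())

# Unweighted: pool all households as if the sample matched the population.
unweighted_rate = sum(d for _, _, d in provinces.values()) / total_sampled

# Weighted: compute each province's own rate, then weight by population share.
weighted_rate = sum(share * (d / n) for share, n, d in provinces.values())

print(f"unweighted: {unweighted_rate:.3f}")  # 0.034 deaths per household
print(f"weighted:   {weighted_rate:.3f}")    # 0.018 deaths per household
```

The unweighted rate overstates violence here precisely because the high-violence province contributes 60% of the sample but only 20% of the population.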

The 95% UI for the figure of 220,000 violent deaths would be roughly 140,000-340,000. We can then see that while the central estimate of 220,000 from the Household Survey is higher than the IBC number of 162,000 (by a factor of about 1.4), the IBC number still fits well within the statistical error margins, which means that even though IBC and UCIMS provide two different specific numbers, they are still statistically consistent. The UCIMS Household Survey therefore suggests that IBC probably missed around 60,000 deaths, but at the same time it can't say with high confidence that IBC really missed any deaths.
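Statistical consistency here means only that the IBC figure falls inside the survey's uncertainty interval; a quick check with the numbers above:

```python
ucims_central = 220_000
ucims_ui = (140_000, 340_000)   # rough 95% UI for the weighted estimate
ibc = 162_000

ratio = ucims_central / ibc
consistent = ucims_ui[0] <= ibc <= ucims_ui[1]

print(f"ratio: {ratio:.2f}")           # 1.36, i.e. about 1.4
print(f"IBC within UI: {consistent}")  # True
```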

For excess deaths, the UCIMS authors also performed an additional calculation aimed at estimating deaths among refugees, who they believe may have had higher death rates than the non-refugees who remained in Iraq across the coverage period. On this basis, they proposed raising their excess deaths estimate from 405,000 to 461,000. If we apply the same adjustment to the violent deaths estimate, this would raise that number from 220,000 to approximately 250,000 (or a factor of about 1.6 above IBC). The refugee adjustment is more speculation-driven than data-driven, so there is no credible way to calculate a proper UI for it (note that the authors did not provide one for the elevated 461,000 excess deaths estimate). Under any reasonable interpretation, however, the uncertainty associated with this speculative adjustment would have to be even greater than that of the straightforward data-driven estimate, and it would therefore not credibly change the consistent relationship between the UCIMS and IBC numbers already shown above.
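Applying the same proportional adjustment to the violent deaths estimate is a single scaling step, using the 461,000/405,000 ratio:

```python
excess_base = 405_000
excess_adjusted = 461_000       # with the speculative refugee adjustment
violent_base = 220_000          # weighted Household Survey estimate

scale = excess_adjusted / excess_base
violent_adjusted = violent_base * scale
print(int(round(violent_adjusted, -3)))   # 250000, to the nearest thousand
```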

The relationships shown above are also similar to the relationships between IBC and data from earlier studies such as the Iraq Family Health Survey (IFHS) and the Iraq Living Conditions Survey (ILCS), meaning that the UCIMS survey is quite consistent with not just IBC, but with both of those earlier surveys as well. Given how closely all of these studies align with each other, it is hard to see why those who consider the UCIMS to be a credible survey shine such a harsh light on IBC, or on other studies such as the IFHS.

In stark contrast, the UCIMS survey is not remotely consistent with the Lancet survey. Recall that the Lancet survey claimed a number of violent deaths some 10 times higher than IBC, while the UCIMS Sibling Survey finds numbers that are basically the same as IBC and the UCIMS Household Survey finds numbers that are 1.4 (or possibly 1.6) times higher than IBC. So how do UCIMS and Lancet estimates for violent deaths compare to each other? If we narrow the UCIMS Household Survey down to the time period covered by the Lancet survey (March 2003 – June 2006), the UCIMS estimate for violent deaths would be approximately 130,000 (95% UI 20,000-280,000), while the number claimed in the Lancet was an astonishing 601,027 (95% CI 426,369-793,663).

The UCIMS Household Survey and Lancet surveys disagree by almost a factor of 5, and are not even remotely close to being consistent with each other. The enormous gulf between the Lancet and the UCIMS Sibling Survey would be even wider still. The IFHS survey, on the other hand, estimated 150,000 deaths for this same period, very close to the UCIMS.
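Both the factor-of-5 gap and the lack of statistical overlap can be read directly off the two intervals quoted above:

```python
ucims_est, ucims_ui = 130_000, (20_000, 280_000)      # 2003 - mid-2006
lancet_est, lancet_ci = 601_027, (426_369, 793_663)   # same period

ratio = lancet_est / ucims_est
# Two intervals overlap only if each lower bound lies below the
# other interval's upper bound.
overlap = lancet_ci[0] <= ucims_ui[1] and ucims_ui[0] <= lancet_ci[1]

print(f"ratio: {ratio:.1f}")            # 4.6
print(f"intervals overlap: {overlap}")  # False
```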

If one is going to suggest the UCIMS survey corroborates another, it does not remotely corroborate the Lancet survey. It corroborates the IFHS.

The UCIMS Sibling, UCIMS Household, IFHS and ILCS surveys are all quite consistent with each other on the violent deaths question. The only survey to sit far afield from this generally convergent range of results is the Lancet survey, which is radically higher than, and statistically inconsistent with, all of them.

Beyond these total-number comparisons, UCIMS and IBC also match up rather well with each other on the profile of violent deaths in several other ways, including geographic distribution, trend over time, demographics and weapons used. The IFHS and ILCS also match up quite well with IBC on these kinds of indicators. The Lancet study, by contrast, differed from IBC and all the others quite dramatically on the geographic distribution of violence and the trend over time. Not only does the Lancet survey suggest a radically different number from all the other credible surveys, it suggests a very different picture of where this violence took place and when.

Despite all of this, there are some who seem determined to treat the Lancet survey of more than eight years ago as the most accurate – and the only genuinely tragic – picture of the war in Iraq, and to single out IBC as some sort of gross distortion of reality. This impulse is clearly not based on a reasonable evaluation of the available evidence.