Although this article leads with recent survey results about what Americans know and think about the Holocaust, I'll start with its summary of earlier surveys and move forward in time to the recent results:
Whether or not the assumptions in the Claims Conference survey are fair, and how to tell, is at the core of a decades-long debate over Holocaust knowledge surveys, which are notoriously difficult to design. In 1994, Roper Starch Worldwide, which conducted a poll for the American Jewish Committee, admitted that its widely publicized Holocaust denial question was “flawed.” Initially, it appeared that 1 in 5, or 22 percent, of Americans thought it was possible the Holocaust never happened. But pollsters later determined that the question—“Does it seem possible or does it seem impossible to you that the Nazi extermination of the Jews never happened?”—was confusing and biased the sample. In a subsequent Gallup poll, when asked to explain their views on the Holocaust in their own words, “only about 4 percent [of Americans] have real doubts about the Holocaust; the others are just insecure about their historical knowledge or won’t believe anything they have not experienced themselves,” according to an Associated Press report at the time. More recently, the Anti-Defamation League was criticized for a 2014 worldwide study that asked respondents to rate 11 statements (“People hate Jews because of the way they behave,” for example) as “probably true” or “probably false.” If respondents said “probably true” to six or more of the statements, they were considered to harbor anti-Semitic views, a line that many experts said could not adequately represent real beliefs…
Just two years ago, the Claims Conference released another survey of Americans that found “Two-Thirds of Millennials Don’t Know What Auschwitz Is,” as a Washington Post headline summarized it. The New York Times reported on the numbers at the time as proof that the “Holocaust is fading from memory.” Lest it appear the group is singling out Americans, the Claims Conference also released surveys with “stunning” results from Canada, France, and Austria.
But a deeper look at the Claims Conference data, which was collected by the firm Schoen Cooperman Research, reveals methodological choices that conflate specific terms (the ability to ID Auschwitz) and figures (that 6 million Jews were murdered) about the Holocaust with general knowledge of it, and knowledge with attitudes or beliefs toward Jews and Judaism. This is not to discount the real issues of anti-Semitism in the United States. But it is an important reminder that the Claims Conference, which seeks restitution for the victims of Nazi persecution and also to “ensure that future generations learn the lessons of the Holocaust,” is doing its job: generating data and headlines that it hopes will support its worthy cause.
The new Claims Conference survey is actually divided into two, with one set of data from a 1,000-person national survey and another set from 50 state-by-state surveys of 200 people each. In both iterations, the pollsters aimed to assess Holocaust knowledge according to three foundational criteria: the ability to recognize the term the Holocaust, name a concentration camp, and state the number of Jews murdered. The results weren’t great—fully 12 percent of national survey respondents had not heard, or did not think they had heard, the term Holocaust—but some of the questions weren’t necessarily written to help respondents succeed. Only 44 percent were “familiar with Auschwitz,” according to the executive summary of the data, but that statistic was determined by an open-ended question: “Can you name any concentration camps, death camps, or ghettos you have heard of?” This type of active, as opposed to passive, recall is not necessarily indicative of real knowledge. The Claims Conference also emphasized that 36 percent of respondents “believe” 2 million or fewer Jews were killed in the Holocaust (the correct answer is 6 million), but respondents were actually given a multiple-choice question with seven options—25,000, 100,000, 1 million, 2 million, 6 million, 20 million, and “not sure”—four of which were lowball figures. (Six million was by far the most common answer, at 37 percent, followed by “not sure.”)
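To make that last point concrete, here is a minimal sketch (in Python) of how a headline figure like "36 percent believe 2 million or fewer" falls out of a multiple-choice distribution once options are binned. The per-option splits below are hypothetical; only the 6 million share and the binned total are anchored to the reported results.

```python
# Sketch of how binning multiple-choice options yields a headline number.
# Per-option splits are HYPOTHETICAL; only the 6 million share (37%) and the
# binned "2 million or fewer" total (36%) match the reported results.

responses = {
    "25,000": 5,       # hypothetical
    "100,000": 6,      # hypothetical
    "1 million": 12,   # hypothetical
    "2 million": 13,   # hypothetical
    "6 million": 37,   # reported: by far the most common answer
    "20 million": 5,   # hypothetical
    "not sure": 22,    # reported as second most common; exact share assumed
}

# Four of the seven options fall below the correct answer, so uncertain
# guesses are disproportionately likely to land in the alarming bin.
lowball = {"25,000", "100,000", "1 million", "2 million"}
binned = sum(pct for option, pct in responses.items() if option in lowball)

print(f'"2 million or fewer": {binned}%')                 # -> 36%
print(f'correct answer (6 million): {responses["6 million"]}%')
```

However the individual lowball options actually split, offering four of them (and only one option above the correct figure) makes the binned statistic easy to generate and easy to headline.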
The first example above, the flawed 1994 Roper question, has made it into research methods textbooks as a lesson in how much the wording of survey questions matters. The ongoing discussion in this article could illustrate the same textbook lessons: how questions are asked, and how researchers interpret the results, matter a great deal.
There are other actors in this process who can help or harm the interpretation of the data:
- Funders/organizations behind the data. What do they do with the results?
- How the media reports the information. Do they accurately represent the data? Do they report on how the data was collected and analyzed?
- Does the public understand what the data means? Or, do they solely take their cues from the researchers and/or the media reports?
- Other researchers who look at the data. Would they measure the topics in the same way and, if not, what might be gained by alternatives? (The sketch after this list illustrates one such alternative.)
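On that last bullet: here is a minimal sketch, again with entirely made-up numbers, of how an ADL-style index classification depends on where the cutoff is drawn. Six or more "probably true" answers out of 11 was the line the ADL used; a researcher choosing a different line would report a different prevalence from the exact same responses.

```python
# Sketch: sensitivity of an index classification to its cutoff.
# Maps (number of the 11 statements rated "probably true") to a
# HYPOTHETICAL count of respondents -- illustrative only.
counts = {0: 300, 1: 180, 2: 130, 3: 100, 4: 80, 5: 70,
          6: 50, 7: 35, 8: 25, 9: 15, 10: 10, 11: 5}
total = sum(counts.values())  # 1,000 hypothetical respondents

def prevalence(cutoff: int) -> float:
    """Share classified as harboring anti-Semitic views at a given cutoff."""
    return sum(n for score, n in counts.items() if score >= cutoff) / total

for cutoff in (5, 6, 7):
    print(f"cutoff >= {cutoff}: {prevalence(cutoff):.1%} classified")
# With these made-up numbers: 21.0%, 14.0%, and 9.0% -- the same answers
# produce three different headlines.
```

The numbers are invented; the point is that the threshold itself is a researcher's choice, which is exactly why experts questioned whether any single line could adequately represent real beliefs.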
These may seem like boring details to many, but going from choosing research topics and developing questions to sharing results with the public and having others interpret them is a long process. The hope is that all of the actors involved can help get as close as possible to what is actually happening; in this case, that means accurately measuring and reporting attitudes and beliefs.