More evidence for having IRBs: sociologist finds that US Army released toxic cadmium into St. Louis air in the 1950s and 1960s

A sociologist in St. Louis says she has uncovered a previously unknown story of the US Army releasing cadmium into the air in the 1950s and 1960s.

The aerosol was sprayed from blowers installed on rooftops and mounted on vehicles. “The Army claims that they were spraying a quote ‘harmless’ zinc cadmium sulfide,” says Dr. Lisa Martino-Taylor, a professor of sociology at St. Louis Community College. Yet, Martino-Taylor points out, cadmium was a known toxin at the time of the spraying in the mid-1950s and mid-1960s. Worse, she says the aerosol was laced with a fluorescent additive – a suspected radiological compound – produced by U.S. Radium, a company linked to the deaths of workers at a watch factory decades before.

Martino-Taylor says thousands upon thousands of St. Louis residents likely inhaled the spray. “The powder was milled to a very, very fine particulate level. This stuff traveled for up to 40 miles. So really all of the city of St. Louis was ultimately inundated by the stuff.”

Martino-Taylor says she’s obtained documents from multiple federal agencies showing the government concocted an elaborate story to keep the testing secret. “There was a reason this was kept secret.  They knew that the people of St. Louis would not tolerate it.” She says part of the deception came from false news reports planted by government agencies.  “And they told local officials and media that they were going to test clouds under which to hide the city in the event of aerial attack.” Martino-Taylor says some of the key players in the cover-up were also members of the Manhattan Atomic Bomb Project and involved in other radiological testing across the United States at the time. “This was against all military guidelines of the day, against all ethical guidelines, against all international codes such as the Nuremberg Code.”

She says the spraying occurred between 1953 and 1954 and again from 1963 to 1965 in areas of North St. Louis and eventually in parts of South St. Louis. Martino-Taylor launched her research after hearing independent reports of cancers among city residents living in those areas at the time.

When students ask why we have Institutional Review Boards (IRBs) and why IRBs seem to make researchers jump through a series of hoops, I remind them of stories like this. This experiment took place even after the Nuremberg Code established the beginnings of modern ethical guidelines for science. It was not so long ago that the government and other organizations undertook secret experiments and violated two of the primary ethical principles sociologists and others hold to: do not harm participants and ensure that they are participating on a voluntary basis.

Another note: it sounds like these experiments were justified in the name of safety. The tests were conducted under the cover story that the city needed to prepare for a possible bombing, presumably by Russia.

History class “Lying About the Past” fools Wikipedia and the Internet…for a short time

Here is a fascinating story of a history class at George Mason University that asked students to fabricate information on Wikipedia and it worked…for a short time.

Each tale was carefully fabricated by undergraduates at George Mason University who were enrolled in T. Mills Kelly’s course, Lying About the Past. Their escapades not only went unpunished, they were actually encouraged by their professor. Four years ago, students created a Wikipedia page detailing the exploits of Edward Owens, successfully fooling Wikipedia’s community of editors. This year, though, one group of students made the mistake of launching their hoax on Reddit. What they learned in the process provides a valuable lesson for anyone who turns to the Internet for information.

The first time Kelly taught the course, in 2008, his students confected the life of Edward Owens, mixing together actual lives and events with brazen fabrications. They created YouTube videos, interviewed experts, scanned and transcribed primary documents, and built a Wikipedia page to honor Owens’ memory. The romantic tale of a pirate plying his trade in the Chesapeake struck a chord, and quickly landed on USA Today’s pop culture blog. When Kelly announced the hoax at the end of the semester, some were amused, applauding his pedagogical innovations. Many others were livid.

Critics decried the creation of a fake Wikipedia page as digital vandalism. “Things like that really, really, really annoy me,” fumed founder Jimmy Wales, comparing it to dumping trash in the streets to test the willingness of a community to keep it clean. But the indignation may, in part, have been compounded by the weaknesses the project exposed. Wikipedia operates on a presumption of good will. Determined contributors, from public relations firms to activists to pranksters, often exploit that, inserting information they would like displayed. The sprawling scale of Wikipedia, with nearly four million English-language entries, ensures that even if overall quality remains high, many such efforts will prove successful…

Sometimes even an apparent failure can mask an underlying success. The students may have failed to pull off a spectacular hoax, but they surely learned a tremendous amount in the process. “Why would I design a course,” Kelly asks on his syllabus, “that is both a study of historical hoaxes and then has the specific aim of promoting a lie (or two) about the past?” Kelly explains that he hopes to mold his students into “much better consumers of historical information,” and at the same time, “to lighten up a little” in contrast to “overly stuffy” approaches to the subject. He defends his creative approach to teaching the mechanics of the historian’s craft, and plans to convert the class from an experimental course into a regular offering.

Should this professor be applauded for his innovative use of technology or questioned about the possible unethical nature of asking students to create stories online?

I’d love to see the student evaluations for this course. This course could be practical on a variety of levels: it reveals some insights into how history is “made” (it requires a certain number of credible sources and a narrator or place where the facts can be put together), it involves current technology (a plus for today’s college students, who spend a lot of time online and rely heavily on Wikipedia), and it shows students how to evaluate information (whether online or otherwise). These sound like laudable goals. Here is the syllabus for the second iteration of the course (Spring 2012) and some of the material from the first page:

Why would I design a course that is both a study of historical hoaxes and then has the specific aim of promoting a lie (or two) about the past? I have two answers to this question, both of which I hope will convince you that I’m onto something. The first answer is that by learning about historical fakery, lying, and hoaxes, we all become much better consumers of historical information. In short, we are much less likely to be tricked by what we find in our own personal research about the past. That alone ought to be enough of a reason to teach this course. But my second reason is that I believe that the study of history ought to be fun and that too often historians (I include myself in this category) take an overly stuffy approach to the past. Maybe it’s our conditioning in graduate school, or maybe we’re afraid that if we get too playful with our
field we won’t be taken seriously as scholars. Whatever the reason, I think history has just gotten a bit too boring for its own good. This course is my attempt to lighten up a little and see where it gets us.

In the interest of full disclosure, I have only taught this class once before and to my knowledge,
no other history professor in the world is willing to teach something similar (or works in a
department where they could get away with it). Various courses taught around the world spend
some time on hoaxes and hoaxing, but I haven’t found one that is all about the hoax. So the only
model to work from is the one I used last time (Fall 2008). The last time around, the final class
project generated a great deal of discussion (much, but not all of it negative) in the academic
blogosphere. As you’ll see when we discuss the previous iteration of this course, I’m not
particularly sympathetic to those who took a dim view of what my students did.

Learning Goals

I do have some specific learning goals for this course. I hope that you’ll improve your research
and analytical skills and that you’ll become a much better consumer of historical information. I
hope you’ll become more skeptical without becoming too skeptical for your own good. I hope
you’ll learn some new skills in the digital realm that can translate to other courses you take or to
your eventual career. And, I hope you’ll be at least a little sneakier than you were before you
started the course.

Interesting.

Increase in retractions of scientific articles tied to problems in scientific process

Several scientists are calling for changes in how scientific work is conducted and published because of a rise in retracted articles:

Dr. Fang became curious how far the rot extended. To find out, he teamed up with a fellow editor at the journal, Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York. And before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.

Dr. Casadevall, now editor in chief of the journal mBio, said he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct…

Last month, in a pair of editorials in Infection and Immunity, the two editors issued a plea for fundamental reforms. They also presented their concerns at the March 27 meeting of the National Academies of Sciences committee on science, technology and the law.

Here is what Fang and Casadevall suggest may help reduce these issues:

To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”

They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.

In other words, give graduate students more training in ethics and the sociology of science while also redistributing scientific research money so that more researchers can be involved. There is a lot to consider here. Of course, there might always be researchers tempted to commit fraud, yet these scientists are arguing that the current system and circumstances need to be tweaked to fight this. Graduate students and young faculty are well aware of what they have to do: publish research in the highest-ranked journals they can. Jobs and livelihoods are on the line. With that pressure, it makes sense that some may resort to unethical measures to get published.

Three other thoughts:

1. How often is social science research retracted? If it is infrequent, should it happen more often?

2. Even if an article or study is retracted, this doesn’t solve the whole issue, as that work may have been cited widely and become well known. Perhaps the bigger problem is “erasing” the study from the collective scientific memory. This reminds me of newspaper corrections: when you go find the original printing, you don’t know there was a later correction. The same thing can happen here: scientific studies can have long lives.

3. Should disciplines or journals have groups that routinely assess the validity of research studies? This would go beyond peer review and give a group the authority to ask questions about suspicious papers. Alas, even this wouldn’t catch most of the problematic papers…

Quick Review: The Immortal Life of Henrietta Lacks

After a few people mentioned a particular New York Times bestseller to me recently, I decided to read The Immortal Life of Henrietta Lacks. While the story itself was interesting, there is a lot of material here that could be used in research methods and ethics classes. A few thoughts about the book:

1. The story is split into two narratives. One is about both the progress science has made with Lacks’ cells and the struggle of her family to understand what has actually been done with those cells. The story of scientific progress is unmistakable: we have come a long way in identifying and curing some diseases in the last sixty years. (This narrative reminded me of the book The Emperor of All Maladies.)

2. The second narrative is about the personal side of scientific research and how patients and relatives interpret what is going on. The author initially finds that the Lacks know very little about how their sister or mother’s cells have been used. These problems are compounded by race, class, and educational differences between the Lacks and the doctors utilizing Henrietta’s cells. In my opinion, this aspect is understated in this book. At the least, this is a reminder about how inequality can affect health care. But I think this personal narrative is the best part of the book. When I talk in class about the reasons for Institutional Review Boards, informed consent, and ethics, students often wonder how much social science research can really harm people. As this book discusses, there are some moments in relatively recent history that we would agree were atrocious: Nazi experiments, the Tuskegee experiments, experiments in Guatemala, and so on. Going beyond those egregious cases, this book illustrates the kind of mental and social harm that can result from research even if using Henrietta’s cells never physically harmed the Lacks. I’m thinking about using some sections of this narrative in class to illustrate what could happen; even if new research appears to be safe, we have to make sure we are protecting our research subjects.

3. This book reminded me of the occasional paternalistic side of the medical field. The book suggests this isn’t just an artifact of the 1950s or a racial division; doctors still appear slow in addressing concerns some people might have about the use of human tissue in research. I realize that there is a lot at stake here: the afterword of the book makes clear how difficult it would be to regulate all of this and how regulation might severely limit needed medical research. At the same time, doctors and other medical professionals could go further in explaining the processes and the possible outcomes to patients. Perhaps this is why the MCAT is moving toward involving more sociology and psychology.

4. There is room here to contrast the discussions about using body tissue for research and online privacy. In both cases, a person is giving up something personal. Are people more disturbed by their tissue being used or their personal information being used and sold online?

All in all, this book discusses scientific breakthroughs, how patients can be hurt by the system, and a number of ethical issues that have yet to be resolved.

Sociological study says junk food sales in middle schools don’t lead to weight gain

A new study in the Sociology of Education provides some insights into the current debate over whether public schools should be selling junk food to students:

The authors found that 59.2 percent of fifth graders and 86.3 percent of eighth graders in their study attended schools that sold junk food. But, while there was a significant increase in the percentage of students who attended schools that sold junk food between fifth and eighth grades, there was no rise in the percentage of students who were overweight or obese. In fact, despite the increased availability of junk food, the percentage of students who were overweight or obese actually decreased from fifth grade to eighth grade, from 39.1 percent to 35.4 percent.

“There has been a great deal of focus in the media on how schools make a lot of money from the sale of junk food to students, and on how schools have the ability to help reduce childhood obesity,” Van Hook said. “In that light, we expected to find a definitive connection between the sale of junk food in middle schools and weight gain among children between fifth and eighth grades. But, our study suggests that—when it comes to weight issues—we need to be looking far beyond schools and, more specifically, junk food sales in schools, to make a difference.”

According to Van Hook, policies that aim to reduce childhood obesity and prevent unhealthy weight gain need to concentrate more on the home and family environments as well as the broader environments outside of school.

“Schools only represent a small portion of children’s food environment,” Van Hook said. “They can get food at home, they can get food in their neighborhoods, and they can go across the street from the school to buy food. Additionally, kids are actually very busy at school. When they’re not in class, they have to get from one class to another and they have certain fixed times when they can eat. So, there really isn’t a lot of opportunity for children to eat while they’re in school, or at least eat endlessly, compared to when they’re at home. As a result, whether or not junk food is available to them at school may not have much bearing on how much junk food they eat.”

This study has a large sample of nearly 20,000 students, and the findings were so counterintuitive that the authors waited two years to publish the results.

While this study suggests schools don’t contribute to weight gain, it doesn’t necessarily mean that schools should suddenly revert to selling all kinds of junk. At first glance, this could be the sort of study that people worried about the “nanny state” could jump on. For example, see the response of the Center for Consumer Freedom: “Maybe it’s time for the ‘food police’ to educate themselves. All the attempts to limit choices apparently won’t do the students any good.” At the same time, schools can be part of a larger package of social forces pushing for better eating and exercise, but they aren’t likely to solve the problem by themselves or by operating in simplistic ways.

I wonder if this points to a bigger issue: Americans expect that schools will be able to solve a lot of social ills. In this case, obesity is a complex issue that schools by themselves can’t overcome. As the authors note, there are a lot of other factors at play, and by the time students reach middle school, they have already been shaped in significant ways. While education is one of the best ways to promote upward mobility and the opportunity to compete in a rapidly changing world economy, it is not a silver bullet for all problems. Of course, public policy is limited in what it can feasibly or popularly change, and politicians and advocates only have so many levers they can pull.

Another thing to note: I wonder how some might view an admission from one of the authors, who said, “We were really surprised by that result and, in fact, we held back from publishing our study for roughly two years because we kept looking for a connection that just wasn’t there.” Some might be suspicious and wonder if there is an ethical issue: did the authors data mine, looking for other connections? Were the authors afraid of how some might respond to their findings? At the same time, scientists can also simply be surprised by their findings, and I would guess they were just being thorough before exposing their work to the public.

After case of fraud, researchers discuss other means of “misusing research data”

The news that a prominent Dutch social psychologist published fraudulent work has pushed other researchers to talk about other forms of “misusing research data”:

Even before the Stapel case broke, a flurry of articles had begun appearing this fall that pointed to supposed systemic flaws in the way psychologists handle data. But one methodological expert, Eric-Jan Wagenmakers, of the University of Amsterdam, added a sociological twist to the statistical debate: Psychology, he argued in a recent blog post and an interview, has become addicted to surprising, counterintuitive findings that catch the news media’s eye, and that trend is warping the field…

In September, in comments quoted by the statistician Andrew Gelman on his blog, Mr. Wagenmakers wrote: “The field of social psychology has become very competitive, and high-impact publications are only possible for results that are really surprising. Unfortunately, most surprising hypotheses are wrong. That is, unless you test them against data you’ve created yourself.”…

To show just how easy it is to get a nonsensical but “statistically significant” result, three scholars, in an article in November’s Psychological Science titled “False-Positive Psychology,” first showed that listening to a children’s song made test subjects feel older. Nothing too controversial there.

Then they “demonstrated” that listening to the Beatles’ “When I’m 64” made the test subjects literally younger, relative to when they listened to a control song. Crucially, the study followed all the rules for reporting on an experimental study. What the researchers omitted, as they went on to explain in the rest of the paper, was just how many variables they poked and prodded before sheer chance threw up a headline-making result—a clearly false headline-making result.
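The dynamic the “False-Positive Psychology” authors exposed — test enough variables under the null and chance alone will eventually hand you a “significant” result — can be sketched with a small simulation. This is only an illustration, not the paper’s actual analysis: the `t_test_p` and `simulate` helpers are hypothetical names, and the test uses a normal approximation to the t-distribution for simplicity.

```python
import random
import math

def t_test_p(a, b):
    """Two-sided p-value for a two-sample Welch test,
    using a normal approximation for simplicity."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    # two-sided p from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def simulate(n_outcomes, n_subjects=20, trials=2000, alpha=0.05):
    """Fraction of purely null experiments in which at least one of
    n_outcomes measured variables comes out 'significant' by chance."""
    hits = 0
    for _ in range(trials):
        if any(
            t_test_p([random.gauss(0, 1) for _ in range(n_subjects)],
                     [random.gauss(0, 1) for _ in range(n_subjects)]) < alpha
            for _ in range(n_outcomes)
        ):
            hits += 1
    return hits / trials

random.seed(1)
print(simulate(1))   # near the nominal 0.05 when only one variable is tested
print(simulate(10))  # roughly 0.4 when ten variables are poked and prodded
```

With a single outcome, the false-positive rate stays near the advertised 5 percent; with ten outcomes (and no real effect anywhere), close to half of the experiments produce a headline-ready result — which is exactly why the omitted “how many variables did you try?” detail matters.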

If the pressure is great to publish (and it certainly is), then there have to be some countermeasures to limit unethical research practices. Here are a few ideas:

1. Give more people access to the data. In this way, people could check up on others’ published findings. But if fraudulent studies are already published, perhaps this comes too late.

2. Have more people provide oversight over the project along the way. This doesn’t necessarily have to be a bureaucratic board, but having only one researcher looking at the data and doing the analysis (as in the Stapel case) means there is more opportunity for an individual to twist the data. This could be an argument for collaborative data analysis.

3. Could there be more space within disciplines and journals to discuss the research project? While papers tend to present very formal hypotheses, a lot of messy work goes into them, and there is very little room to discuss how the researchers arrived at them.

4. Decrease the value of media attention. I don’t know how to deal with this one. What researcher doesn’t want to have more people read their research?

5. Have a better-educated media so that they don’t report so many inconsequential yet shocking studies. We need more people like Malcolm Gladwell who look at a broad swath of research and summarize it, rather than dozens of reports grabbing onto small studies. This is the classic issue with nutrition reporting: eggs are great! A new study says they are terrible! A third says they are great for pregnant women and no one else! We rarely get overviews of this research or real questions about its value. We just get: “a study proved this oddity today…”

6. Resist data mining. Atheoretical correlations don’t help much. Let theories guide statistical models.

7. Have more space to publish negative findings. This would help researchers feel less pressure to come up with positive results.

More details of unethical US medical experiments in Guatemala in the 1940s

Research methods courses tend to cover the same classic examples of unethical studies. With more details emerging from a government panel, the US medical experiments undertaken in Guatemala during the 1940s could join this list.

From 1946-48, the U.S. Public Health Service and the Pan American Sanitary Bureau worked with several Guatemalan government agencies to do medical research — paid for by the U.S. government — that involved deliberately exposing people to sexually transmitted diseases…

The research came up with no useful medical information, according to some experts. It was hidden for decades but came to light last year, after a Wellesley College medical historian discovered records among the papers of Dr. John Cutler, who led the experiments…

During that time, other researchers were also using people as human guinea pigs, in some cases infecting them with illnesses. Studies weren’t as regulated then, and the planning-on-the-fly feel of Cutler’s work was not unique, some experts have noted.

But panel members concluded that the Guatemala research was bad even by the standards of the time. They compared the work to a 1943 experiment by Cutler and others in which prison inmates were infected with gonorrhea in Terre Haute, Ind. The inmates were volunteers who were told what was involved in the study and gave their consent. The Guatemalan participants — or many of them — received no such explanations and did not give informed consent, the commission said.

Ugh – a study that gives both researchers and Americans a bad name. It is also a good reminder of why we need IRBs.

While the article suggests President Obama apologized to the Guatemalan president, is anything else going to be done to try to make up for this? I also wonder how this is viewed in Central America: yet more details about the intrusiveness of Americans over the last century?

(See my original post on this here.)

Wired’s “seven creepy experiments” short on social science options

When I first saw the headline for this article in my copy of Wired, I was excited to see what they had dreamed up. Alas, the article “Seven Creepy Experiments That Could Teach Us So Much (If They Weren’t So Wrong)” is mainly about biological experiments. One experiment, splitting up twins and fixing their environments, could be interesting: it would provide insights into the ongoing nature vs. nurture debate.

I would be interested to see how social scientists would respond to a question about what “creepy” or unethical experiments they would like to see happen. In research methods classes, we have the classic examples of experiments that should not be replicated: Milgram’s experiment on obedience to authority, Zimbardo’s Stanford Prison Experiment, and Humphreys’ Tearoom Trade study tend to come up. From more popular sources, we could talk about a setup like the one depicted in The Truman Show or intentionally creating settings like those found in Lord of the Flies or The Hunger Games.

What sociological experiments would produce invaluable information but would never pass an IRB?

Considering the ethics of adopting children to study them

An ethicist looks at three scenarios to help sort out the difference between studying one’s biological vs. adopted children:

Ethics has a bizarre blind spot around parents and children. For no justifiable reason that I can discern, we deem it perfectly tolerable for a parent to decide unilaterally to raise their child genderless or under the Tiger Mother or laissez-faire method of parenting, but horror at the idea of someone “testing” one of these parental styles on a child. Recall, there is no test to become a parent, no minimum qualification or form of licensing. In fact, if you are so irresponsible as to unintentionally have a child you do not want and cannot support, you have more of a right (and obligation) to rear that child than a stranger with the means and desire to give that child a better life…

I would like to test this reproduce-rearing correlation with a thought experiment. The details of the thought experiment appear below the fold, but the conclusion is as follows: it would be ethically permissible for a scientist to adopt a large group of children and then perform specific, non-harmful, nature-vs-nurture social experiments on those children…

After running through three scenarios, here is the conclusion:

Therefore, if it is morally permissible for parents to independently decide how to raise their children in regards to gender, it should be morally permissible for a team of scientists to conduct a rigorous experiment with their own adopted children on the impact of rearing on gender and sexual preferences.

I imagine an IRB would have a very difficult time approving a formal proposal for this.

Several other methodological issues come to mind:

1. There could be issues of objectivity: how do we know parents of either biological or adoptive children could “objectively” observe their own kids? This may be a bigger problem in some disciplines than others: ethnographies, for example, rely on participant observation, in which parents would certainly be involved. But even then, there are concerns about a researcher becoming too immersed in the setting of the study and losing an outsider’s point of view. Scenario #3 simply assumes that sociologist parents would make “unbiased observations.”

2. How could an experimenter be sure that results from adopted (or even biological) children are the result of the treatment rather than prior experiences and behaviors? Experiments try to isolate the effects of treatments but adopted children could have numerous confounding factors from their pre-adoption days.

Pivoting toward greater competition

Ryan Singel over at Wired magazine writes about a new start-up called LawPivot that helps start-ups with their legal questions:

LawPivot’s solution is to create a Q&A site where startups can ask legal questions confidentially and then get recommended lawyers to answer the question, which can lead to the former hiring the latter.

While California-based startups can now ask three free questions a month, LawPivot will soon be charging companies $80 for each question. For lawyers, the benefit is being able to land new clients for themselves or their firms, and to build a reputation — though they don’t get paid to answer a question.

Despite potential ethical issues and haughty dismissals by certain blogs, this certainly is where the legal profession is heading. In a globalized world with plenty of lawyers looking for work, more competition is inevitable. Fees are going to go down.