Sociology grad student: scholars need to and can make their research and writing more public

Sociology PhD student Nathan Jurgenson argues that scholars need to make their research more public:

To echo folks like Steven Seidman or danah boyd, we have an obligation to change this; academics have a responsibility to make their work relevant for the society they exist within.

The good news is that the tools to counter this deficiency in academic relevance are here for the taking. Now we need the culture of academia to catch up. Simply, to become more relevant, academics need to make their ideas more accessible.

There are two different, yet equally important, ways academics need to make their ideas accessible:

(1) Accessible by availability: ideas should not be locked behind paywalls.

(2) Accessible by design: ideas should be expressed in ways that are interesting, readable and engaging.

Considering that Jurgenson researches social media (see my earlier post on another of his arguments), I’m not surprised to see him make this argument. Though most of his argument is tilted toward the brokenness of the current system, Jurgenson wants to help the academic world see that we now have the tools, particularly online, to do some new things.

A few other thoughts:

1. Does every generation of graduate students suggest the current system is broken or is this really a point in time where a big shift could occur?

2. Jurgenson also hints that academics need to be better able to write for larger publics. So it is not just about the tools but about the style and rhetoric needed to speak through these other means. I can’t imagine any “Blogging Sociology” courses appearing in grad schools anytime soon, but Jurgenson is raising a familiar complaint: academics sometimes have difficulty making their case to people who are not academics.

3. Jurgenson doesn’t really get at this but these new tools also mean that data, not just writing, can be shared more widely. This could also become an important piece of a more open academia.

4. The idea that academic writing should or could be fun is intriguing. How many academics could pull this off? Might this reduce the gravitas of academic research?

Academics flock to research the Occupy movement

A New York Times article suggests a number of academics have seized the opportunity presented by the Occupy movement to not only teach about but also research the protests:

“This thing just erupted so quickly,” said Alex S. Vitale, a sociologist at Brooklyn College who studies the policing of demonstrations. “It’s almost overwhelming to deal with all the information that’s out there.”

Mr. Vitale is finishing a 10-city study of interactions between protesters and the police since last fall, which he said showed a lack of overall “militarization” in police response in major cities. (New York is an exception, said Mr. Vitale, who organized a demonstration against police tactics in Zuccotti Park last fall but said he did not consider himself part of the Occupy movement.) Other researchers are doing ethnographic studies, crunching survey data, recording oral histories and analyzing material by and about the movement, all at lightning speed compared with the usual pace of scholarship.

“Academics are used to taking forever, but we don’t have to,” said Theda Skocpol, a sociologist at Harvard and author, with Vanessa Williamson, of “The Tea Party and the Remaking of Republican Conservatism,” a study of Occupy’s right-wing counterpart published in January…

Some researchers also say that the sympathy many academics feel for the movement risks undermining objective research.

It will be very interesting to see the research and then the resulting discussions.

This highlights a larger issue in academia: the common lag time between events and publishable research. This often takes a few years as researchers draw up plans, collect data, analyze it, and then work through the review process. I imagine there will be pressure to get some of this Occupy research out more quickly, both to address and understand the phenomenon while it unfolds and to capitalize on political momentum. Could this change how research is presented and considered in the future? Work could be published in web-based or open-access journals. What about books rushed into print or, even timelier, e-books?

Increase in retractions of scientific articles tied to problems in scientific process

Several scientists are calling for changes in how scientific work is conducted and published because of a rise in retracted articles:

Dr. Fang became curious how far the rot extended. To find out, he teamed up with a fellow editor at the journal, Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York. And before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.

Dr. Casadevall, now editor in chief of the journal mBio, said he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct…

Last month, in a pair of editorials in Infection and Immunity, the two editors issued a plea for fundamental reforms. They also presented their concerns at the March 27 meeting of the National Academies of Sciences committee on science, technology and the law.

Here is what Fang and Casadevall suggest may help reduce these issues:

To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”

They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.

In other words, give graduate students more training in ethics and the sociology of science while also redistributing research money so that more researchers can be involved. There is a lot to consider here. Of course, there may always be researchers tempted to commit fraud, yet these scientists argue that the current system and circumstances need to be tweaked to fight this. Graduate students and young faculty are well aware of what they have to do: publish research in the highest-ranked journals they can. Jobs and livelihoods are on the line. With that pressure, it makes sense that some may resort to unethical measures to get published.

Three other thoughts:

1. How often is social science research retracted? If it is infrequent, should it happen more often?

2. Even if an article or study is retracted, this doesn’t solve the whole issue, as that work may have been cited often and become well known. Perhaps the bigger problem is “erasing” the study from collective scientific memory. This reminds me of newspaper corrections: when you find the original printing, you don’t know there was a later correction. The same thing can happen here; scientific studies can have long lives.

3. Should disciplines or journals have groups that routinely assess the validity of research studies? This would go beyond peer review and give a group the authority to ask questions about suspicious papers. Alas, even this might not catch most of the problematic papers…

Journal editors push authors to add citations to improve impact factors?

A new study in Science suggests that some journal editors push authors to include citations in their soon-to-be published studies to boost the reputation of their journals:

A system of “impact factors”, tied to references listed in studies, pervades the scholarly enterprise, note survey authors Allen Wilhite and Eric Fong of the University of Alabama in Huntsville, who report the survey of 6,672 researchers in economics, sociology, psychology, and business research in the current Science journal. The survey covered journal editor behavior at 832 publications.

Overall, about 20% of survey respondents say that a journal editor had coerced extra citations to their own journal from them. Broadly, journals with higher impact factors attract more prestige, advertising and power in hiring and firing decisions in scholarly circles, the authors note, giving journal editors an incentive to extort added citations to their publications in the studies they consider for publication. “(T)he message is clear: Add citations or risk rejection,” write the authors.

In particular, younger professors with few co-authors who need publications to keep their jobs reported the most pressure. Business journal editors coerced the most often, followed by economics, and then psychology and other social sciences. As well, “journals published by commercial, for-profit companies show significantly greater use of coercive tactics than journals from university presses. Academic societies also coerce more than university presses.”

Less than 7% of the respondents thought researchers would resist this coercion, so desperate for publication are professors. “Although this behavior is less common in economics, psychology, and sociology, these disciplines are not immune—every discipline reported multiple instances of coercion. And there are published references to coercion in fields beyond the social sciences,” concludes the survey report…

While I’m glad to see that sociology seems to be toward the bottom of this list, this is still a problem. In some ways, this is not surprising as many in academia feel the pressure to make their work stand out.

However, I think you could ask broader questions about the system of citations. Here are a few other ideas:

1. Do researchers feel pressure to add citations to articles simply to reach a “magic number” or to have enough so that it looks like they have “properly” scoped out the topic?

2. How much have citations increased with the widespread use of online databases that make it much easier to find articles?

2a. Since I assume this has increased the number of citations, does this lead to “better research”?

3. When choosing what articles to cite, how much are researchers influenced by how many other people have cited the article (supposedly a measure of its value) and the impact factors of the journal the article is in?

Moving away from academic journals and toward “Performative Social Science”

Most sociologists aim to publish research in academic journals or books. One sociologist suggests a new venue for sharing research: creating fiction films.

Kip Jones hates PowerPoint presentations. He doesn’t care much for academic journals, either. An American-born sociologist, who teaches in the school of health and social care at Bournemouth University in England, Mr. Jones says that “the shame of research is that you spend a lot of money and the knowledge just disappears — or worse, ends up as an article in a scholarly journal.”

So when he was invited to participate in “The Gay and Pleasant Land” project — an investigation into the lives of older gay men and lesbians living in rural England and Wales — Mr. Jones decided that the best way to present the project’s findings to the public wasn’t by publishing the results or delivering a paper at a scholarly conference, but by making a short fiction film…

That’s what Mr. Jones is counting on. “Most of my own work is around developing a method — what’s known as Performative Social Science. I’ve worked with theater. I’ve worked with dancers,” he said. The idea is to combine serious scholarship and popular culture, using performance-based tools to present research outcomes.

Jones suggests that research is often forgotten and that is why he sought to make a film. This raises some questions:

1. Is a film more “permanent” than a research article or book? Without widespread distribution, I suspect the film is less permanent.

2. Is this really about reaching a bigger audience? Academics sometimes joke that journal articles might reach only a few hundred people in the world who care. A film could reach more people, but it would need effective distribution or a number of showings for this to happen. This also requires work; how many academic films actually reach a broad audience?

3. Can a film acceptably convey research results compared to a more data-driven paper? Both data-driven work and films need to tell a story and/or make an argument but they are different venues.

In the end, I don’t think we will see a sudden rush to make such films instead of writing more academic work. However, I wouldn’t be surprised to see more established researchers create films and documentaries to supplement their work. (See the documentary disc included with Mitchell Duneier’s Sidewalk.) Such films could reach a broader and younger audience, placing the work in the YouTube world of today’s college students.

(Another note: can you find many academics who would actually defend the use of PowerPoint? It seems like an odd way to begin the story.)

A humorous yet relevant comment from the scientific past: “Oh, well, nobody is perfect.”

When I look at sociology journal articles from the past, a few things strike me: the lack of high-powered statistics and a simplicity in explanation and research design. In the current world of publishing demands and the constant push for high-quality, ground-breaking work, these earlier articles look like they came from a more innocent era.

I was reminded of this by a recent Wired post. In this case, a geology journal had published an article in the early 1960s and another scholar had responded in print to this article by pointing out a mistake on the part of the original authors. This is not uncommon. What does look particularly uncommon is the response by the original authors: “Oh, well, nobody is perfect.”

In a perfect world, isn’t this how science is supposed to work: admit your mistakes, don’t repeat them, and move on? But I can’t imagine many current scholars giving such a reply, perhaps out of fear that their career or reputation would be in jeopardy. And in the world of scientific journals, is this sort of candid back-and-forth even possible much of the time?

I also infer a sense of humility on the part of the original authors. Instead of going on for pages about why their mistake was defensible or trying to pass the blame, a quick one-liner admits the mistake, defuses the situation, and lets everyone move on.