Studies suggest texting in class is related to lower grades, GPA

Several studies in recent years have examined the link between students' texting and Facebook use in class and their grades. The Chicago Tribune summarizes the studies:

In the past five years researchers have published the results of five surveys and experiments that link texting and Facebooking with lower academic performance. In 2011, researchers at California State University reported that students who received or sent a high number of text messages during a video recorded lecture scored worse on a quiz than those who received or sent few or no text messages.

In a 2012 study, a researcher at Lock Haven University in Pennsylvania surveyed 1,800 students about how often they Facebook, instant message, email, text, search online and talk on the phone in class. Among the results: 69 percent of students reported they had texted in class, and students who texted or used Facebook more frequently in class had lower overall semester GPAs. The author of that study, Reynol Junco, also co-wrote a study that linked texting and Facebooking during study time with lower GPAs…

“That I can be definitive about: That’s not working. If you’re going to search (online) during class, I don’t have any data telling you to stop. If you’re going to email during class, I don’t have any data to tell you to stop. But do not text or Facebook during class. Do not text or Facebook while you’re studying for your classes, because that’s another area where this is definitely a negative.”…

Junco’s evidence against texting and Facebooking is correlational, meaning that his studies show that students who Facebook or text more in class or while studying do worse academically, but not that the texting or Facebooking itself is causing the problem. It’s possible that, say, an easily distractible student is texting a lot and doing poorly in class, with the underlying cause of poor performance being the distractibility, not the texting. But Junco points to two other college studies in which researchers, not students, largely determined which students would do the most media multitasking in class.

In both the studies — the California State study and one published in 2012 in the journal Computers and Education — a form of media multitasking (texting or Facebooking) was linked to lower student performance.

It would be interesting to pair data like this with students' perceptions of whether they are doing better or worse in a class because they are texting, browsing, or not. It would be one thing if students knew that texting was distracting and did it anyway, and quite another if they were so used to texting that they were unaware of its possible effects.

Let’s say future studies more clearly establish a causal link. Would colleges then move to banning cell phone use or Facebooking in class? Or is this one of those areas that would generate a lot of negative feedback from students who would want the freedom to do more of what they want in class?

Positive results for teaching statistics by computer

A recent study shows that students taking a largely online statistics course utilizing software from Carnegie Mellon do just as well as students who take a traditional classroom course:

The study, called “Interactive Learning Online at Public Universities,” involved students taking introductory statistics courses at six (unnamed) public universities. A total of 605 students were randomly assigned to take the course in a “hybrid” format: they met in person with their instructors for one hour a week; otherwise, they worked through lessons and exercises using an artificially intelligent learning platform developed by learning scientists at Carnegie Mellon University’s Open Learning Initiative.

Researchers compared these students against their peers in the traditional-format courses, for which students met with a live instructor for three hours per week, using several measuring sticks: whether they passed the course, their performance on a standardized test (the Comprehensive Assessment of Statistics), and the final exam for the course, which was the same for both sections of the course at each of the universities…

The robotic software did have disadvantages, the researchers found. For one, students found it duller than listening to a live instructor. Some felt as though they had learned less, even if they scored just as well on tests. Engaging students, such as professors might by sprinkling their lectures with personal anecdotes and entertaining asides, remains one area where humans have the upper hand.

But on straight teaching the machines were judged to be as effective as, and more efficient than, their personality-having counterparts.

As someone who regularly teaches both Statistics and Social Research (a research methods course), I find these findings intriguing. I understand the urge to curb costs while still providing a good education. However, I have three questions that perhaps go beyond these findings:

1. Are there any benefits for students from being in a classroom for three hours a week beyond learning outcomes? Is there a social dimension to the classroom setting that could enhance learning? For example, it is common for professors to have students work in groups or with each other, sometimes with the idea that being able to teach or effectively help another student will increase a student’s learning. Also, I wonder about learning becoming strictly an individualistic activity. Sure, there are ways to do this online (discussion boards, using Skype, etc.) but does this replicate the kind of discussions faculty and students can have in a classroom?

2. Are there any professors in the United States who might secretly welcome not having to teach statistics?

3. Is there a point in a discipline, like statistics, where the difficulty of the subject matter makes it more helpful to have a live instructor? This study looked at introductory stats courses but would the findings be the same if the courses covered more advanced topics that require more “intuition” and “art” than pure steps or facts?

h/t Instapundit

In response to criticism, sociologist argues academics need to explain better what they do

A recent Washington Post op-ed suggested college faculty do not work hard enough:

An executive who works a 40-hour week for 50 weeks puts in a minimum of 2,000 hours yearly. But faculty members teaching 12 to 15 hours per week for 30 weeks spend only 360 to 450 hours per year in the classroom. Even in the unlikely event that they devote an equal amount of time to grading and class preparation, their workload is still only 36 to 45 percent of that of non-academic professionals. Yet they receive the same compensation.

If the higher education community were to adjust its schedules and semester structure so that teaching faculty clocked a 40-hour week (roughly 20 hours of class time and equal time spent on grading, preparation and related duties) for 11 months, the enhanced efficiency could be the equivalent of a dramatic budget increase. Many colleges would not need tuition raises or adjustments to public budget priorities in the near future. The vacancies created by attrition would be filled by the existing faculty’s expanded teaching loads — from 12 to 15 hours a week to 20, and from 30 weeks to 48; increasing teachers’ overall classroom impact by 113 percent to 167 percent.

Critics may argue that teaching faculty members require long hours for preparation, grading and advising. Therefore they would have us believe that despite teaching only 12 to 15 hours a week, their workloads do approximate those of other upper-middle-class professionals. While time outside of class can vary substantially by discipline and by the academic cycle (for instance, more papers and tests to grade at the end of a semester), the notion that faculty in teaching institutions work a 40-hour week is a myth. And whatever the weekly hours may be, there is still the 30-week academic year, which leaves almost 22 weeks for vacation or additional employment.
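The op-ed's figures can be checked with quick arithmetic. A minimal sketch, using only the numbers quoted above:

```python
# Checking the op-ed's arithmetic (all figures come from the quote above).
EXEC_HOURS = 40 * 50  # executive benchmark: 40-hour weeks for 50 weeks = 2,000 hours

for weekly_hours in (12, 15):
    classroom = weekly_hours * 30        # 30-week academic year
    total = classroom * 2                # doubled for grading and class preparation
    share = total / EXEC_HOURS * 100     # workload as a share of the executive's year
    proposed = 20 * 48                   # proposed schedule: 20 hrs/week for 48 weeks
    increase = (proposed / classroom - 1) * 100
    print(f"{weekly_hours} hrs/week: {classroom} classroom hours, "
          f"{share:.0f}% of an executive's hours, "
          f"{increase:.0f}% more classroom time under the proposed schedule")
```

Running this reproduces the op-ed's 360 to 450 annual classroom hours, its 36 to 45 percent workload comparison, and its 113 to 167 percent increase in classroom impact, so the disagreement is over the assumptions (how much time grading and preparation actually take), not the multiplication.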

One article about the subsequent conversation regarding the op-ed quotes sociologist Jerry Jacobs talking about how academics do not explain their jobs to the public well:

Faculty-baiting might exist because people have certain perceptions of how college professors operate, some experts said. “I do not think we do a good job of explaining what we do,” said Jerry Jacobs, a professor of sociology at the University of Pennsylvania. Jacobs, who has researched faculty life, said that students often graduate from research universities without a clear understanding of what a professor’s job entails. “Meanwhile people see that the costs of college are going up and to them, faculty at colleges don’t seem to work 40 hours a week like high school teachers do,” he said.

In a 2004 article in the Sociological Forum, Jacobs found that full-time faculty members spend an average of just above 50 hours a week working. The data for his analysis came from the 1998 National Study of Postsecondary Faculty by the U.S. Department of Education and the faculty sample included 819 colleges and universities. “As a point of comparison, the average work week for men in the U.S. labor force is 43 hours per week and 37 for women. About one-quarter of men in the labor force work over 50 hours per week (26.5 percent), along with one in ten women (11.3 percent),” Jacobs said. Many academics, of course, report working far more than 50 hours a week — and for adjuncts, the pay is a fraction of the figures cited by Levy, and many work without health or retirement benefits, or any job security.

It may be a job with more flexibility than others, but there is certainly plenty of work for academics invested in their classrooms, research, and schools.

So what would Jacobs say academics should do? How can we explain to the public what academic life is like?

One option is to tie our roles to helping prepare students for jobs. However, this downplays aspects that aren’t as clearly vocational.

Another option: be more clear with students about what we do and how we do it. Instead of making our jobs like “black boxes” that are mysterious and capricious, explain what we are doing as we go along. Why should our students learn about a particular topic? Why do we grade the way we do? What do we do when we put together a research paper? I’ve tried some of these strategies and while students don’t seem overjoyed, some do appear to appreciate hearing the process behind it.

A third option would be to more clearly relate our teaching and research to everyday life, whether this is in the classroom or the community. While public sociology might be a sort of trendy term, it could help show people why what we do matters. We don’t just sit around and write for ten other academics; in our research we are hoping to draw attention to particular issues, influence public policy, help people who care about the topics, and interact with others who are also interested.

Fourth, we could defend the classroom experience. It is not easy to effectively impart knowledge and wisdom to others and to lead discussions. These days it might be cheaper to do more online learning, but something is missing: the community and atmosphere that can come from being in a classroom where both the instructor and students are engaged. This sort of criticism is also often leveled at teachers: “anyone could teach these lessons.” I don’t think everyone could.

Sociologist uses Twitter for class but are the students learning more?

Stories like this are not uncommon: professors utilizing technology to engage their students (and here is another one about clicker use).

Wendy Welch is incorporating the use of the social networking site Twitter into her cultural geography class this semester. The adjunct instructor said she decided to use the social networking site in her class after having problems with students using their phones in class for less-than-appropriate purposes. “If you can’t beat them, get ahead of them,” Welch said. “That’s the way the world works now.”…

Each student was assigned a country in Africa and asked to tweet facts about their country, such as languages and population, using designated hash tags, or categories. That way, each student only has to research one country but has access to all the information they may need from other students.

Welch also plans to have students use their mobile devices or laptops to research information during class sessions, she said…

She said she hopes to “get students to understand and participate in their own education.”

Perhaps this does increase the engagement level of the students. All professors want their students to be engaged and we don’t want to be seen as being behind the times. But, I think there is often something missing from discussions about student engagement and the use of technology in the classroom: does this actually lead to higher levels of student learning or student outcomes?

I suspect professors will always try to keep up with technology as it changes and each of these changes will be accompanied by hand-wringing. However, we need to be able to distinguish between engaging students with technology versus helping them learn. Take this Twitter example from class: do students do better on tests? Do they retain the knowledge better? Can they apply their knowledge from this particular class to other settings, particularly if the technology is not present? Does technology itself help students think more deeply about the big questions of our world?

If technology alone becomes the answer in the classroom, we will be in trouble.

How long do students keep notes from their college classes?

While discussing some of the things that he left behind in the transition between the analog and digital world, a writer includes his notes from Sociology 101:

I collected a lot of things. A large part of my identity revolved around the acquisition and accumulation of books. I also collected CDs, DVDs, comics and other cultural ephemera. I kept movie tickets, clippings of articles, flyers, interesting things I picked up. I couldn’t bear to throw these out because I thought that there might come a time when I might need something —like, say, my readings in Sociology 101 from the year 2000.

Who knew when I would have to define the sociological imagination? Or when I would need to define the political dynamics and do a comparative analysis of the authoritarian leadership styles of Lee Kuan Yew and Saddam Hussein based on my studies of Politics and Change in the Third World in 2001? Oh and there were empty liquor bottles signed by friends from the early Noughties wishing me a happy nineteenth or twentieth birthday, and lord knows a situation might arise when I might need those too.

If I were the professor of this Soc 101 class, what should my response be on hearing this? Happiness that a former student might have turned to these notes? Depression because the student had years to look at them and never did? Or indifference, since this student seemed to collect a lot of things, not just sociology notes?

More broadly, I would be curious to know how often college students return to their books and notes from school. Does anyone have any systematic data on the subject? I suspect the data would look like a low-mean Poisson curve: most students have never returned to these sources. But couldn’t this be a measure of the “effectiveness” or “success” of a particular class, an outcome that colleges and professors might be interested in knowing about? Typically, we get information on evaluation forms from the closing moments of class, a time when students can judge the immediate effect of a class but can shed little light on the longer-lasting impact of a particular course. Imagine if we found that a more popular sociological text like Gang Leader for a Day was popular in the short term but a text like The Truly Disadvantaged stuck with students for years. Both outcomes could be desirable – a short-term book or lecture can draw people into the subject or enhance the classroom experience while a longer-term book or lecture can influence lives down the road – but they are qualitatively different pieces of information.
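To illustrate the Poisson hunch above (the mean of 0.3 returns per student is purely hypothetical, not from any study), a low-mean Poisson distribution does put most of its probability mass at zero:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical average number of times a student returns to old class notes.
lam = 0.3
for k in range(4):
    print(f"P(returns = {k}) = {poisson_pmf(k, lam):.3f}")
# With a mean of 0.3, roughly three-quarters of students never return at all.
```

Whatever the true mean, a survey of former students would reveal the shape of the curve, and whether a heavy spike at zero is the norm.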

Perhaps this could all be explained by personality types: there are people who keep things from the past and those who do not. But I suspect that professors would like to think that they have the potential in many lectures or in the sources they put in front of students to influence any student for years.

Getting better data on how students use laptops in class: spy on them

Professors like to talk about how students use laptops in the classroom. Two recent studies shed some new light on this issue and they are unique in how they obtained the data: they spied on students.

Still, there is one notable consistency that spans the literature on laptops in class: most researchers obtained their data by surveying students and professors.

The authors of two recent studies of laptops and classroom learning decided that relying on student and professor testimony would not do. They decided instead to spy on students.

In one study, a St. John’s University law professor hired research assistants to peek over students’ shoulders from the back of the lecture hall. In the other, a pair of University of Vermont business professors used computer spyware to monitor their students’ browsing activities during lectures.

The authors of both papers acknowledged that their respective studies had plenty of flaws (including possibly understating the extent of non-class use). But they also suggested that neither sweeping bans nor unalloyed permissions reflect the nuances of how laptops affect student behavior in class. And by contrasting data collected through surveys with data obtained through more sophisticated means, the Vermont professors also show why professors should be skeptical of previous studies that rely on self-reporting from students — which is to say, most of them.

While these studies might be useful for dealing with the growing use of laptops in classrooms, discussing the data itself would be interesting. A few questions come to mind:

1. What discussions took place with an IRB? It seems this might have been a problem in the study using spyware on students’ computers, and it was reflected in the generalizability of the data, with just 46% of students agreeing to have the spyware on their computers. The other study could also run into issues if students were identifiable. (Just a thought: could a professor insist on spyware being on student computers if the students insisted on having a laptop in class?)

2. These studies get at the disparities between self-reported data and other forms of data collection. I would guess that students underestimate their distractible laptop use on self-reported surveys because they suspect that this is the answer they should give (social desirability bias). But the comparison could also reveal how cognizant computer/Internet users are of how many windows and applications they actually cycle through.

3. Both of these studies are on a relatively small scale: one had 45 students, the other a little more than 1,000, though that data was “less precise” since it involved TAs sitting in the back monitoring students. Expanding the Vermont study and linking laptop use to outcomes on a larger scale would be even better: move beyond just talking about the classroom experience and look at its impact on learning outcomes. Why doesn’t someone do this on a larger scale and in multiple settings? Would it be too difficult to get past some of the IRB issues?

In looking at the comments about this story, it seems that having better data on this topic would go a long way toward moving the discussion beyond anecdotal evidence.

Students suffer withdrawal in a one-day media blackout

Professors and teachers can often provide anecdotal evidence of how students react when told that smartphones (and other devices like laptops) are not to be used in the classroom. A new study suggests that the problem isn’t really the classroom: simply not having these devices at all could be the issue.

Researchers found that 79 per cent of students subjected to a complete media blackout for just one day reported adverse reactions ranging from distress to confusion and isolation.

In vivid accounts, they told of overwhelming cravings, with one saying they were ‘itching like a crackhead [crack cocaine addict]’.

The study focused on people aged between 17 and 23 in ten countries, including the UK, where about 150 students at Bournemouth University spent 24 hours banned from using phones, social networking sites, the internet and TV.

They were allowed to use landline phones or read books and were asked to keep a diary.

One in five reported feelings of withdrawal akin to an addiction while 11 per cent said they were confused or felt like a failure.

Nearly one in five (19 per cent) reported feelings of distress and 11 per cent felt isolated. Just 21 per cent said they could feel the benefits of being unplugged.

Some students took their mobile phone with them just to touch them.

While some of these symptoms don’t seem as bad as others, it is interesting that only 21% “could feel the benefits” of being “unplugged.” These devices and social networking tools really have become necessities in a short amount of time.

In reactions to this study, it would be interesting to see whether people advocate a complete move away from such technology because of these possibly dangerous side effects or suggest more moderate usage. But if usage really is an addiction, then moderate usage could still be an issue. I would like to see a follow-up to this study that examines a longer-term media blackout – how long does it take for students to readjust to life without all this media, and what would be their thoughts about what they might be missing (or gaining)?