Why I am excited to teach in Fall 2020

Starting up college classes in Fall 2020 is a difficult and uncertain task. Many decisions and much planning have gone into schools reopening or preparing to reopen. Here is why I am excited to be back in the classroom when classes start next week:

Image capture from “Why Study Sociology and Anthropology at Wheaton?”
  1. I am always excited for learning to begin. There is much for all of us to learn; the well-worn phrase “the more you learn, the less you know” (or some variation) is true. The start of a new class marks the beginning of a process by which an instructor and students learn together. Colleges and universities are now about many other things, but learning is at the heart of the mission. Teaching at the undergraduate level means that courses are just the start of what could become life-long conversations or projects; there is potential to spark new interests, paths, or epiphanies. Even though I have taught each of my two classes this fall semester more than ten times, I am excited to share the material, ways of thinking, and skills with new sets of students. We have minds and bodies, and we are called to put them to use in learning and then applying or living out that knowledge.
  2. Learning together. Learning is not only a solitary task; it comes to full fruition when done in community. Over sixteen weeks of classes, we will get to know each other a little better, hear alternative perspectives, and consider what it all means. Since my institution is small, I can know every student’s name, run into people on campus, and find opportunities to link broader or structural concepts to individual experiences. Even with masks this semester, or having gone virtual for the second half of the Spring 2020 semester, we can build relationships during class discussions, through assignments, and outside of class. By the end of the semester, it is hard to let go of a class as an instructor prepares to start the process all over again the next term.
  3. This is a critical time to address issues in society and in our world. One of the reasons I enjoy sociology is that it always applies to current circumstances, and now is no different with COVID-19, a presidential election cycle, conversation and action about race, changing economies and cultures, and more. Classrooms provide spaces to explore what is happening through a particular disciplinary lens, and since sociology examines all aspects of human behavior, there is much to consider (much more than we can do in any sixteen-week semester!). There is much for us to apply the sociological imagination to. And with a shared faith commitment on our campus, we can connect sociology’s (or other disciplines’) approach to the world to our religious beliefs, belonging, and behavior.
  4. Getting back to some sort of routine. COVID-19 has disrupted a lot of daily patterns. As my campus gets back to on-campus classes, we will hopefully be able to settle into a rhythm and structure that helps nudge us in positive directions. Living in chaotic or uncertain times is difficult for humans; we need routines and patterns. The academic calendar is one such pattern that has done much to structure my own life, first through my own educational experiences and now through teaching. By the time August starts, I am ready for the school year to begin, even as I am grateful for the change that summer brings with a more flexible schedule and time for research.

Intro to Sociology with 82-year-old “godfather of Canadian menswear”

I imagine Intro to Sociology might be a little different with an 82-year-old menswear magnate in class:

Even in his school duds — no tie, sometimes even jeans, if you can believe it — Harry Rosen was the best-dressed student this fall in Intro Sociology.

“I dress casually for class, but never without a jacket,” stated the godfather of Canadian menswear, who, at 82, decided this year to start studying humanities at Ryerson University.

He has been excused from exams because he still juggles part-time duty with his luxury clothing empire — he has a meeting Friday with a customer who still prefers to “Ask Harry,” semi-retired or not; some are now fourth-generation clients. He also fundraises for Bridgepoint Health and the University Health Network’s stem-cell team that created a research chair in his name, and serves on boards of institutions such as Ryerson…

History Prof. Martin Greig said he enjoyed the “octogenarian sitting amongst the 17- and 18-year-olds who made up the bulk of this first year course on medieval Europe. He was very attentive and seemed genuinely appreciative of my efforts. It was fun to have him there and I hope that he follows through with his intention to take my Cold War course in the winter term.”

“I love learning and I need that activity, in good measure because of my regrets at not getting a university-level education when I was young,” said Rosen, a self-taught retail mogul who went from high school straight to work, opening a modest men’s shop with his brother and then spending the next 60 years learning what he needed from carefully chosen partners.

It is good to hear about life-long learners who want to find out more about the world. Of course, this doesn’t have to happen in a college classroom. Yet I think his example could go a long way with younger college students. Given some of the discouraging figures about student learning and completion rates in college, his interaction with students might be the most valuable thing that happens in the classroom.

Argument: The Myth of ‘I’m Bad at Math’

Two professors argue that being good at math is about hard work, not genetics:

We hear it all the time. And we’ve had enough. Because we believe that the idea of “math people” is the most self-destructive idea in America today. The truth is, you probably are a math person, and by thinking otherwise, you are possibly hamstringing your own career. Worse, you may be helping to perpetuate a pernicious myth that is harming underprivileged children—the myth of inborn genetic math ability…

Again and again, we have seen the following pattern repeat itself:

  1. Different kids with different levels of preparation come into a math class. Some of these kids have parents who have drilled them on math from a young age, while others never had that kind of parental input.
  2. On the first few tests, the well-prepared kids get perfect scores, while the unprepared kids get only what they could figure out by winging it—maybe 80 or 85%, a solid B.
  3. The unprepared kids, not realizing that the top scorers were well-prepared, assume that genetic ability was what determined the performance differences. Deciding that they “just aren’t math people,” they don’t try hard in future classes, and fall further behind.
  4. The well-prepared kids, not realizing that the B students were simply unprepared, assume that they are “math people,” and work hard in the future, cementing their advantage.

Thus, people’s belief that math ability can’t change becomes a self-fulfilling prophecy.

Interesting argument: if you believe you can’t do well at a subject, you probably won’t. The authors then go on to hint at broader social beliefs: Americans tend to believe in talent, while other countries tend to emphasize the value of hard work.
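The feedback loop the authors describe can be sketched as a toy simulation. This is my own illustration with invented numbers, not the authors’ model: belief about being a “math person” drives effort, and effort drives later scores.

```python
def simulate_scores(prepared: bool, terms: int = 4) -> list[float]:
    """Toy model of the self-fulfilling prophecy: students misread an
    early preparation gap as innate ability, which then shapes effort."""
    score = 95.0 if prepared else 82.0   # the first-test gap comes from preparation
    believes_math_person = prepared      # but the gap gets misread as genetics
    scores = [score]
    for _ in range(terms - 1):
        effort = 1.0 if believes_math_person else 0.2
        # Effort above the break-even point raises scores; effort below lowers them.
        score = min(100.0, score + 10 * effort - 5)
        scores.append(score)
    return scores

print(simulate_scores(True))   # believer: the early gap holds or widens
print(simulate_scores(False))  # non-believer: scores slide term after term
```

The point of the sketch is that nothing in the model is genetic: the only initial difference is preparation, yet the belief it produces is enough to make the gap grow.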

This lines up with what I was recently reading about athletes in The Sports Gene. The author reviews a lot of research suggesting that training and genetics both matter. But genetics may not matter in the way people typically think it does: more often, it matters less that people are “naturally gifted” and more that some learn more quickly than others. So the 10,000 hours to become an expert, an idea popularized by Malcolm Gladwell, is the average time it takes to become an expert; some people can do it much more quickly and some much more slowly because of their different rates of learning.
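A quick numeric illustration of that point (the hours below are invented for the example, not taken from The Sports Gene): a group can average 10,000 hours to expertise while individuals range widely around that mean.

```python
import statistics

# Hypothetical hours-to-expertise for seven learners; the figures are
# made up so the mean lands at 10,000, echoing the popularized average.
hours_to_expertise = [3_000, 6_000, 8_000, 10_000, 11_000, 12_000, 20_000]

mean_hours = statistics.mean(hours_to_expertise)
spread = max(hours_to_expertise) - min(hours_to_expertise)

print(mean_hours)  # 10000: the average matches the famous figure
print(spread)      # 17000: but individuals vary enormously around it
```

The average is a real summary of the group, but it says little about how long any one person will take.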

Commercials that market smartphones as education devices shouldn’t fool many

In the past few months, I’ve heard several commercials for smartphones that suggest kids can and will use them for educational purposes. When your child needs help on their homework, they can whip out their phone and find the answer.

Who do they think they are fooling? While parents want to hear about helping their kids succeed in school (this is an American constant over the decades), these commercials offer implausible possibilities. Kids could use their phone for homework or studying. But, I suspect the smartphone is used for two other tasks that will far outweigh educational purposes: social interaction (texting, chatting, Facebook, etc.) and media consumption (music, YouTube, TV and movies, etc.). The real education provided here might be in how to be a media-saturated, 21st century American kid.

This may be effective marketing, but it also hints at another issue: the idea that new technological devices automatically lead to more learning. Where is the evidence for this? We can argue that kids need to keep up with technology to understand it and use it for their own good, like applying for jobs. We can argue that new technology engages kids. We can argue that technology can open up new opportunities, like forming and maintaining beneficial relationships or learning how to code. But suggesting it actually leads to more learning is a more difficult case to make.

My conclusion: such commercials play off the interests of parents who would say they want to help their kids succeed without marshalling much evidence that the new smartphone will help kids learn.

How does the rise in non-tenured college faculty affect education?

There has been much conversation about this in academia lately but here are some actual numbers about the percentages of tenured and non-tenured faculty:

Once, being a college professor was a career. Today, it’s a gig.

That, broadly speaking, is the transformation captured in the graph below from a new report by the American Association of University Professors. Since 1975, tenure and tenure-track professors have gone from roughly 45 percent of all teaching staff to less than a quarter. Meanwhile, part-time faculty are now more than 40 percent of college instructors, as shown by the line soaring towards the top of the graph.

This doesn’t actually mean that there are fewer full-time professors today than four decades ago. College faculties have grown considerably over the years, and as the AAUP notes, the ranks of the tenured and tenure-track professoriate are up 26 percent since 1975. Part-time appointments, however, have exploded by 300 percent. The proportions vary depending on the kind of school you’re talking about. At public four-year colleges, about 64 percent of teaching staff were full-time as of 2009. At private four-year schools, about 49 percent were, and at community colleges, only about 30 percent were. But the big story across academia is broadly the same: if it were a movie, it’d be called “Rise of the Adjuncts.”

This is quite a shift over several decades. While there is a lot to explore here about the economics of colleges and universities, there is another question we could ask about the college experience: how does this change educational experiences and outcomes? Are students learning more or less depending on what kind of faculty member is in the classroom? Does it matter?

Do “Anthropological Video Games” lead to anthropological learning?

The New Yorker has a short article about several anthropological video games including “Guess My Race,” “The Cat and the Coup,” and “Sweatshop.”

A cluster of teen-agers gathered around a small table, and passersby could hear them exclaim, “Asian! Yeah, I knew it!” and “Aryan? That seems ridiculous.” They hovered over two iPads in the Grand Gallery of the Museum of Natural History during the Margaret Mead Film Festival, playing a game called “Guess My Race.” It was one of five video games in the Mead Arcade; the others included “The Cat and the Coup,” which traces the downfall of Iran’s first democratically elected Prime Minister, Mohammad Mossadegh, and “Sweatshop,” in which you hire and fire workers for your loathsome factory.

Aiding the swarms of museum patrons who stopped to play were volunteers from Games for Change, a New York City-based nonprofit that encourages the development of what it calls “social-impact games.” (All of the games at the arcade are also available for free through the organization’s Web site.) I sat down at a laptop to try my hand at running a sweatshop. To a bouncy techno soundtrack, the boss, a floor manager who keenly evoked Hitler, spewed insults and directions—”Lazybones! How are you today? Shh-h-h-h. I don’t care!”—and the orders started pouring in for shoes, shirts, hats, and bags…

In 1940, Margaret Mead created a card game along with her husband, the anthropologist Gregory Bateson. Called “Democracies and Dictators,” its cards contained instructions such as “Dictator! Crippled Industries: You have put your leading industrialists into concentration camps. (lose a card in 5).” Mead wrote that it was based on “the basic ideas that democracies and dictators play by different rules and work with different values.” She tried to sell the idea to Parker Brothers, but it was never produced for public consumption. The games on display at the Mead Arcade have been markedly more successful. “Sweatshop” had a million plays during its first three months, and “The Cat and the Coup” has received acclaim from gamers around the world—including one German reviewer who wrote that it is “like Monty Python being dropped in a bowl full of Persian kitsch.”…

But if games train players in the rules of culture, what happens when those rules become too complicated to follow, or, perhaps, obsolete? Settling down to play “Guess My Race,” the player looks at photographs of ten faces—no artifacts here, the subjects are familiarly modern. You choose from six possible races that vary widely from one round to the next—descriptions might be nationalities, skin colors, religions, or loaded epithets like “Illegal” or “East Coast.” The player might have to select from options that would seem to be simultaneously plausible (i.e., Asian versus Indonesian, or Black versus Caribbean) with answers that suggest race is self-defined, not regionally or ethnically determined.

And so the gamification of the world continues. I’m not surprised these games are featured at a museum; when recently visiting the Museum of Science and Industry in Chicago for the first time in a few years, I was struck by the number of hands-on exhibits and games that allow one or two users to explore some dimension of science. It is interesting to see that these games have had so many downloads – people are either interested in the topics or there are a lot of gamers out there willing to try a lot of things.

My biggest question about these games is whether players learn the intended lessons. As the article notes, games have been used and proposed for decades to teach players different lessons. We know, for example, that Monopoly is partly about capitalism. It seems to me that the crop of more recent Euro games, from Settlers of Catan on downward, tend to teach about what is needed to grow a community or society. Even new video games like Assassin’s Creed III are related to historical events. However, having played a lot of games over recent years, I wonder how much I’ve actually learned about anything as opposed to enjoyed competing. Is the point of the board game Agricola to teach me that Germans living in the 1600s needed a diverse base of multiple foodstuffs? Did the video game Civilization (II-IV) teach me something meaningful about how civilizations actually develop? I’m not sure.

Also, I have to ask: what would a sociological game look like?

Daily Herald discusses appropriate tablet use in school but misses bigger point: do new tablets help students learn more?

The Daily Herald had an editorial yesterday that argued tablets used by students at school and home need to be used appropriately:

Other districts also have committed to the devices’ use, though some taxpayers might see them as an extravagance. Educators get just as starry-eyed over new technology as the rest of us, and why shouldn’t they? Kids are growing up with lightning-fast change in the electronic tools we use every day. This is the world they know and will need to keep knowing, and schools are adapting.

But how they adapt is key. Without good policies and a solid technology plan that includes training and evaluation, the tablet revolution — called “one-to-one” programs by educators — could amount to little more than handing kids high-tech notebooks at best or, worse, free video gaming.

Gurnee Elementary District 56, for one, is rolling out its iPad initiative for middle-schoolers this week and appears to be doing things right by involving parents in a checkout night with consent forms, user agreements, guidelines and a downloadable instruction manual. It cannot stop there, however, nor can administrators be certain those 46-page manuals will be read.

Perhaps the most important way to make these devices as cost-effective as possible is to ensure teachers on the front lines have the training to use them to their fullest and to focus the instruction on learning, not the device.

The rest of the article goes on to talk about the appropriate use of tablets but the key is here in the last sentence of the quote above. Do the tablets and iPads contribute to student learning? It is one thing to suggest students need to learn about new technology. This is helpful in itself, particularly for kids who may not have consistent access at home. And tablets may help students to be more engaged in the classroom. But, there is a bigger question that should be asked here: does using a tablet help students learn math or reading or science or social studies or other subjects better?

The real question to ask about iBooks 2, the textbook killer: will it help students learn?

There is a lot of buzz about iBooks 2, but I have a simple question: will students learn more using it? One description of the new program doesn’t really cover this:

Yet, the iPad offers a big opportunity for students to get excited about learning again. The iPad has already demonstrated it can help children with learning disabilities make leaps and bounds in their development, and schools have already invested heavily in Apple’s tablet. Roughly 1.5 million iPads are currently in use in educational institutions.

Schiller said that the problem with textbooks is not the content, which is “amazing,” but the weight of the physical book. They need to last five or six years when they’re written, and they’re not very durable or interactive. Searching is also difficult.

At that point, Schiller introduced iBooks 2, which has a new textbook experience for the iPad. The first demonstration showed what it’s like to open a biology textbook, and see an intro movie playing right before you even get to the book’s contents. When you get to the book itself, images are large and beautiful, and thumbnails accompany the text. To make searching easier, all users need to do is tap on a word and they go straight to the glossary and index section in the back of the book…

Jobs had long hoped to bring sweeping changes to higher education for much of his life. When he left Apple and launched NeXT in 1986, Jobs wanted the company’s first computer — a distinctive all-black magnesium cube — to be designed specifically for higher education establishments and what Jobs called “aggressive end users.”…

“‘The process by which states certify textbooks is corrupt,’ he said. ‘But if we can make the textbooks free, and they come with the iPad, then they don’t have to be certified. The crappy economy at the state level will last for a decade, and we can give them an opportunity to circumvent the whole process and save money.'”

Based on this article, I see five things that are good about iBooks 2: it will excite students; it is lighter, so students don’t have to carry so much weight around; it will be cheaper for everyone in the long run; there are some cool features like searching and embedded videos; and it could make Apple a lot of money (while traditional textbook publishers will presumably lose money unless they adapt).

But we have been told for decades that better technology in the classroom (computers, laptops, the Internet, etc.) will lead to improvements in learning and test scores. Isn’t this how iBooks 2 should be measured? It is good if kids are excited about learning again, but will this tool actually help them learn more? The technology may be better and cheaper in the long run, but this doesn’t necessarily mean it will lead to improvements in the education system.

I wouldn’t go so far as to say that iPads or iBooks 2 can’t lead to better learning, but I would want to know a lot more about their effects on educational outcomes before simply adopting the technology.

22 possible ways of assessing a course

Assessment is an important issue in schools at all levels today. At the college level, there is a growing emphasis on collecting data about particular courses and programs, assessing whether they met critical goals, and then using that data to improve what is being offered.

The Chronicle of Higher Education has put together 22 different assessment measures for a hypothetical college course. Broken into three categories (the instructor, during the course, and after the course), the table quickly suggests how easy or difficult it is to collect the data as well as the limitations of each measure.

Just looking at this chart, here is what one could take away from it:

1. It is relatively easy to assess the qualifications of the instructor.

2. Measurement during the course appears easier than after the course.

3. A key issue with the after-the-course data is that it is difficult to determine exactly what impact one particular course had when students take other courses as well and get educational input outside of courses.

4. It appears that a variety of data would be useful to help avoid the limitations of individual measures.

5. Assessment can be a time-consuming and complex task.

Rethinking how to study

The New York Times highlights recent research that suggests older methods or habits for studying may not be worthwhile. Instead, there are new suggestions for studying that haven’t yet caught on:

[P]sychologists have discovered that some of the most hallowed advice on study habits is flat wrong. For instance, many study skills courses insist that students find a specific place, a study room or a quiet corner of the library, to take their work. The research finds just the opposite…

Varying the type of material studied in a single sitting — alternating, for example, among vocabulary, reading and speaking in a new language — seems to leave a deeper impression on the brain than does concentrating on just one skill at a time. Musicians have known this for years, and their practice sessions often include a mix of scales, musical pieces and rhythmic work. Many athletes, too, routinely mix their workouts with strength, speed and skill drills…

When the neural suitcase is packed carefully and gradually, it holds its contents for far, far longer. An hour of study tonight, an hour on the weekend, another session a week from now: such so-called spacing improves later recall, without requiring students to put in more overall study effort or pay more attention, dozens of studies have found.

These suggestions would be worthwhile at any type or stage of learning. While it may be initially difficult to change ingrained habits, switching to new study methods would pay off in the end with an improved ability to retain and utilize knowledge.
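The spacing idea from the quote can be sketched as a simple expanding-interval review plan. This is a minimal sketch of my own, not the researchers’ method; real spaced-repetition systems (SM-2, for example) adapt the gaps to how well you actually recall the material.

```python
from datetime import date, timedelta

def review_schedule(start: date, sessions: int = 5,
                    first_gap_days: int = 1, factor: int = 2) -> list[date]:
    """Return review dates where each gap grows by `factor`: 1, 2, 4, 8... days.
    Spreading sessions out like this is the "spacing" the research describes,
    as opposed to massing all the study into one sitting."""
    dates, gap, current = [], first_gap_days, start
    for _ in range(sessions):
        dates.append(current)
        current += timedelta(days=gap)
        gap *= factor
    return dates

# Five study sessions starting Sept 1 land on days 0, 1, 3, 7, and 15.
print(review_schedule(date(2010, 9, 1)))
```

The doubling gaps mean the same five sessions cover more than two weeks instead of one evening, which is exactly the trade the spacing research recommends: no extra total effort, just a different distribution of it.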

Reading about this could lead to some interesting questions regarding how people and students learn or acquire their study habits. Is it an intuitive process that each person needs to figure out for themselves? Do most people simply do what others have told them to do? How often do we assess our own studying/learning habits to determine their effectiveness?