Intro to Sociology with 82-year-old “godfather of Canadian menswear”

I imagine Intro to Sociology might be a little different with an 82-year-old menswear magnate in class:

Even in his school duds — no tie, sometimes even jeans, if you can believe it — Harry Rosen was the best-dressed student this fall in Intro Sociology.

“I dress casually for class, but never without a jacket,” stated the godfather of Canadian menswear, who, at 82, decided this year to start studying humanities at Ryerson University.

He has been excused from exams because he still juggles part-time duty with his luxury clothing empire — he has a meeting Friday with a customer who still prefers to “Ask Harry,” semi-retired or not; some are now fourth-generation clients. He also fundraises for Bridgepoint Health and the University Health Network’s stem-cell team that created a research chair in his name, and serves on boards of institutions such as Ryerson…

History Prof. Martin Greig said he enjoyed the “octogenarian sitting amongst the 17- and 18-year-olds who made up the bulk of this first year course on medieval Europe. He was very attentive and seemed genuinely appreciative of my efforts. It was fun to have him there and I hope that he follows through with his intention to take my Cold War course in the winter term.”

“I love learning and I need that activity, in good measure because of my regrets at not getting a university-level education when I was young,” said Rosen, a self-taught retail mogul who went from high school straight to work, opening a modest men’s shop with his brother and then spending the next 60 years learning what he needed from carefully chosen partners.

It is good to hear about life-long learners who want to find out more about the world. Of course, this doesn’t have to happen in a college classroom. Yet, I think his example could go a long way with younger college students. Given some of the discouraging figures about student learning and completion rates in college, his interaction with students might be the most valuable thing that happens in the classroom.

Argument: The Myth of ‘I’m Bad at Math’

Two professors argue that being good at math is about hard work, not genetics:

We hear it all the time. And we’ve had enough. Because we believe that the idea of “math people” is the most self-destructive idea in America today. The truth is, you probably are a math person, and by thinking otherwise, you are possibly hamstringing your own career. Worse, you may be helping to perpetuate a pernicious myth that is harming underprivileged children—the myth of inborn genetic math ability…

Again and again, we have seen the following pattern repeat itself:

  1. Different kids with different levels of preparation come into a math class. Some of these kids have parents who have drilled them on math from a young age, while others never had that kind of parental input.
  2. On the first few tests, the well-prepared kids get perfect scores, while the unprepared kids get only what they could figure out by winging it—maybe 80 or 85%, a solid B.
  3. The unprepared kids, not realizing that the top scorers were well-prepared, assume that genetic ability was what determined the performance differences. Deciding that they “just aren’t math people,” they don’t try hard in future classes, and fall further behind.
  4. The well-prepared kids, not realizing that the B students were simply unprepared, assume that they are “math people,” and work hard in the future, cementing their advantage.

Thus, people’s belief that math ability can’t change becomes a self-fulfilling prophecy.

Interesting argument: if you believe you can’t do well at a subject, you probably won’t. The authors then go on to hint at broader social beliefs: Americans tend to believe in talent, while other countries tend to emphasize the value of hard work.

This lines up with what I was recently reading about athletes in The Sports Gene. The author reviews a lot of research suggesting that training and genetics both matter. But genetics may not matter in the way people typically think: more often, it matters less that people are “naturally gifted” and more that some learn more quickly than others. So the 10,000 hours needed to become an expert, an idea popularized by Malcolm Gladwell, is an average; some people can get there much more quickly and some much more slowly because of their different rates of learning.
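The four-step pattern quoted above can be sketched as a toy simulation. This is my own illustration, not something from the article; the `simulate` function and all of its numbers are arbitrary assumptions chosen only to show how belief-driven effort turns an initial preparation gap into a widening one:

```python
# Toy simulation of the four-step pattern quoted above: an initial
# preparation gap shapes beliefs about being "a math person," which
# shape effort, which widens the gap. All numbers are arbitrary
# assumptions for illustration, not data from the article.

def simulate(prepared: bool, courses: int = 5) -> float:
    """Return a student's math skill after a run of courses."""
    skill = 1.0 if prepared else 0.0         # head start from early drilling
    first_score = 80 + 20 * skill            # prepared kids ace the first test
    is_math_person = first_score >= 90       # belief inferred from one test
    effort = 1.0 if is_math_person else 0.5  # belief drives future effort
    for _ in range(courses):
        skill += effort                      # skill grows with effort
    return skill

initial_gap = 1.0
final_gap = simulate(prepared=True) - simulate(prepared=False)
print(final_gap > initial_gap)  # True: the gap widens through belief alone
```

Both students start out trying equally hard; the only difference is one point of early preparation. Yet the final skill gap is several times the initial one, because the first test sets the belief and the belief sets the effort.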

Commercials that market smartphones as educational devices shouldn’t fool many

In the past few months, I’ve heard several commercials for smartphones that suggest kids can and will use them for educational purposes. When your child needs help on their homework, they can whip out their phone and find the answer.

Who do they think they are fooling? While parents want to hear about helping their kids succeed in school (this is an American constant over the decades), these commercials offer implausible possibilities. Kids could use their phones for homework or studying. But I suspect the smartphone will be used for two other tasks that far outweigh educational purposes: social interaction (texting, chatting, Facebook, etc.) and media consumption (music, YouTube, TV and movies, etc.). The real education provided here might be in how to be a media-saturated, 21st-century American kid.

This may be effective marketing, but it also hints at another issue: the idea that new technological devices automatically lead to more learning. Where is the evidence for this? We can argue that kids need to keep up with technology to understand it and use it to their benefit, such as when applying for jobs. We can argue that new technology engages kids. We can argue that technology can open up new opportunities, like forming and maintaining beneficial relationships or learning how to code. But suggesting it actually leads to more learning is a more difficult case to make.

My conclusion: such commercials play to the interests of parents who want to help their kids succeed, without marshalling much evidence that the new smartphone will actually help kids learn.

How does the rise in non-tenured college faculty affect education?

There has been much conversation about this in academia lately but here are some actual numbers about the percentages of tenured and non-tenured faculty:

Once, being a college professor was a career. Today, it’s a gig.

That, broadly speaking, is the transformation captured in the graph below from a new report by the American Association of University Professors. Since 1975, tenure and tenure-track professors have gone from roughly 45 percent of all teaching staff to less than a quarter. Meanwhile, part-time faculty are now more than 40 percent of college instructors, as shown by the line soaring towards the top of the graph.

This doesn’t actually mean that there are fewer full-time professors today than four decades ago. College faculties have grown considerably over the years, and as the AAUP notes, the ranks of the tenured and tenure-track professoriate are up 26 percent since 1975. Part-time appointments, however, have exploded by 300 percent. The proportions vary depending on the kind of school you’re talking about. At public four-year colleges, about 64 percent of teaching staff were full-time as of 2009. At private four-year schools, about 49 percent were, and at community colleges, only about 30 percent were. But the big story across academia is broadly the same: if it were a movie, it’d be called “Rise of the Adjuncts.”

This is quite a shift over several decades. While there is a lot to explore here about economic life in colleges and universities, there is another question we could ask about how this affects the college experience: how does this change educational experiences and outcomes? Are students learning more or less depending on what kind of faculty is in the classroom? Does it matter?

Do “Anthropological Video Games” lead to anthropological learning?

The New Yorker has a short article about several anthropological video games including “Guess My Race,” “The Cat and the Coup,” and “Sweatshop.”

A cluster of teen-agers gathered around a small table, and passersby could hear them exclaim, “Asian! Yeah, I knew it!” and “Aryan? That seems ridiculous.” They hovered over two iPads in the Grand Gallery of the Museum of Natural History during the Margaret Mead Film Festival, playing a game called “Guess My Race.” It was one of five video games in the Mead Arcade; the others included “The Cat and the Coup,” which traces the downfall of Iran’s first democratically elected Prime Minister, Mohammad Mossadegh, and “Sweatshop,” in which you hire and fire workers for your loathsome factory.

Aiding the swarms of museum patrons who stopped to play were volunteers from Games for Change, a New York City-based nonprofit that encourages the development of what it calls “social-impact games.” (All of the games at the arcade are also available for free through the organization’s Web site.) I sat down at a laptop to try my hand at running a sweatshop. To a bouncy techno soundtrack, the boss floor manager, who keenly evoked Hitler, spewed insults and directions—”Lazybones! How are you today? Shh-h-h-h. I don’t care!”—and the orders started pouring in for shoes, shirts, hats, and bags…

In 1940, Margaret Mead created a card game along with her husband, the anthropologist Gregory Bateson. Called “Democracies and Dictators,” its cards contained instructions such as “Dictator! Crippled Industries: You have put your leading industrialists into concentration camps. (lose a card in 5).” Mead wrote that it was based on “the basic ideas that democracies and dictators play by different rules and work with different values.” She tried to sell the idea to Parker Brothers, but it was never produced for public consumption. The games on display at the Mead Arcade have been markedly more successful. “Sweatshop” had a million plays during its first three months, and “The Cat and the Coup” has received acclaim from gamers around the world—including one German reviewer who wrote that it is “like Monty Python being dropped in a bowl full of Persian kitsch.”…

But if games train players in the rules of culture, what happens when those rules become too complicated to follow, or, perhaps, obsolete? Settling down to play “Guess My Race,” the player looks at photographs of ten faces—no artifacts here, the subjects are familiarly modern. You choose from six possible races that vary widely from one round to the next—descriptions might be nationalities, skin colors, religions, or loaded epithets like “Illegal” or “East Coast.” The player might have to select from options that would seem to be simultaneously plausible (i.e., Asian versus Indonesian, or Black versus Caribbean) with answers that suggest race is self-defined, not regionally or ethnically determined.

And so the gamification of the world continues. I’m not surprised these games are featured at a museum; when recently visiting the Museum of Science and Industry in Chicago for the first time in a few years, I was struck by the number of hands-on exhibits and games that allow one or two users to explore some dimension of science. It is interesting to see that these games have had so many downloads – people are either interested in the topics or there are a lot of gamers out there willing to try a lot of things.

My biggest question about these games is whether players learn the intended lessons. As the article notes, games have been used and proposed for decades to teach players different lessons. We know, for example, that Monopoly is partly about capitalism. It seems to me that the crop of more recent Euro games, from Settlers of Catan on downward, tend to teach about what is needed to grow a community or society. Even new video games like Assassin’s Creed III are related to historical events. However, having played a lot of games over recent years, I wonder how much I’ve actually learned about anything as opposed to enjoyed competing. Is the point of the board game Agricola to teach me that Germans living in the 1600s needed a diverse base of multiple foodstuffs? Did the video game Civilization (II-IV) teach me something meaningful about how civilizations actually develop? I’m not sure.

Also, I have to ask: what would a sociological game look like?

Daily Herald discusses appropriate tablet use in school but misses bigger point: do new tablets help students learn more?

The Daily Herald had an editorial yesterday that argued tablets used by students at school and home need to be used appropriately:

Other districts also have committed to the devices’ use, though some taxpayers might see them as an extravagance. Educators get just as starry-eyed over new technology as the rest of us, and why shouldn’t they? Kids are growing up with lightning-fast change in the electronic tools we use every day. This is the world they know and will need to keep knowing, and schools are adapting.

But how they adapt is key. Without good policies and a solid technology plan that includes training and evaluation, the tablet revolution — called “one-to-one” programs by educators — could amount to little more than handing kids high-tech notebooks at best or, worse, free video gaming.

Gurnee Elementary District 56, for one, is rolling out its iPad initiative for middle-schoolers this week and appears to be doing things right by involving parents in a checkout night with consent forms, user agreements, guidelines and a downloadable instruction manual. It cannot stop there, however, nor can administrators be certain those 46-page manuals will be read.

Perhaps the most important way to make these devices as cost-effective as possible is to ensure teachers on the front lines have the training to use them to their fullest and to focus the instruction on learning, not the device.

The rest of the article goes on to talk about the appropriate use of tablets but the key is here in the last sentence of the quote above. Do the tablets and iPads contribute to student learning? It is one thing to suggest students need to learn about new technology. This is helpful in itself, particularly for kids who may not have consistent access at home. And tablets may help students to be more engaged in the classroom. But, there is a bigger question that should be asked here: does using a tablet help students learn math or reading or science or social studies or other subjects better?

The real question to ask about iBooks 2, the textbook killer: will it help students learn?

There is a lot of buzz about iBooks 2, but I have a simple question: will students learn more using it? In one description of the new program, this isn’t really covered:

Yet, the iPad offers a big opportunity for students to get excited about learning again. The iPad has already demonstrated it can help children with learning disabilities make leaps and bounds in their development, and schools have already invested heavily in Apple’s tablet. Roughly 1.5 million iPads are currently in use in educational institutions.

Schiller said that the problem with textbooks is not the content, which is “amazing,” but the weight of the physical book. They need to last five or six years when they’re written, and they’re not very durable or interactive. Searching is also difficult.

At that point, Schiller introduced iBooks 2, which has a new textbook experience for the iPad. The first demonstration showed what it’s like to open a biology textbook, and see an intro movie playing right before you even get to the book’s contents. When you get to the book itself, images are large and beautiful, and thumbnails accompany the text. To make searching easier, all users need to do is tap on a word and they go straight to the glossary and index section in the back of the book…

Jobs had long hoped to bring sweeping changes to higher education for much of his life. When he left Apple and launched NeXT in 1986, Jobs wanted the company’s first computer — a distinctive all-black magnesium cube — to be designed specifically for higher education establishments and what Jobs called “aggressive end users.”…

“‘The process by which states certify textbooks is corrupt,’ he said. ‘But if we can make the textbooks free, and they come with the iPad, then they don’t have to be certified. The crappy economy at the state level will last for a decade, and we can give them an opportunity to circumvent the whole process and save money.'”

Based on this article, I see five things that are good about iBooks 2: it will excite students; it is lighter, so students don’t have to carry so much weight around; it will be cheaper for everyone in the long run; there are some cool features like searching and embedded videos; and it could make Apple a lot of money (and presumably traditional textbook publishers will lose money unless they adapt?).

But we have been told for decades that better technology in the classroom (computers, laptops, the Internet, etc.) will lead to improvements in learning and test scores. Isn’t this how iBooks 2 should be measured? It is good if kids are excited about learning again, but will this tool actually help them learn more? The technology may be better and cheaper in the long run, but this doesn’t necessarily mean it will lead to improvements in the education system.

I wouldn’t go so far as to say that iPads or iBooks 2 can’t lead to better learning, but I would want to know a lot more about their effects on educational outcomes before simply adopting the technology.