Accessing the public domain through JSTOR

Academic journal archiver JSTOR has just made public domain articles a lot more accessible:

[W]e are making journal content on JSTOR published prior to 1923 in the United States and prior to 1870 elsewhere, freely available to the public for reading and downloading. This includes nearly 500,000 articles from more than 200 journals, representing approximately 6% of the total content on JSTOR.

We are taking this step as part of our continuous effort to provide the widest possible access to the content on JSTOR while ensuring the long-term preservation of this important material.

Mike Masnick over at Techdirt recounts some history that provides context for JSTOR’s decision:

You may recall that following the indictment of Aaron Swartz for downloading some JSTOR papers, a guy named Greg Maxwell decided to upload 33GBs of public domain papers from JSTOR and make them available via The Pirate Bay. He had the papers for a while, but was afraid that he’d get legally harassed for distributing them.

JSTOR explicitly acknowledges this history in its announcement (emphasis added):

I realize that some people may speculate that making the Early Journal Content free to the public today is a direct response to widely-publicized events over the summer involving an individual [Aaron Swartz] who was indicted for downloading a substantial portion of content from JSTOR, allegedly for the purpose of posting it to file sharing sites. While we had been working on releasing the pre-1923/pre-1870 content before the incident took place, it would be inaccurate to say that these events have had no impact on our planning. We considered whether to delay or accelerate this action, largely out of concern that people might draw incorrect conclusions about our motivations. In the end, we decided to press ahead with our plans to make the Early Journal Content available, which we believe is in the best interest of our library and publisher partners, and students, scholars, and researchers everywhere.

Regardless of how this happened, I applaud JSTOR for greatly furthering access to public domain academic journal articles.

H/T Techdirt/Copycense.

How long do students keep notes from their college classes?

While discussing some of the things that he left behind in the transition between the analog and digital world, a writer includes his notes from Sociology 101:

I collected a lot of things. A large part of my identity revolved around the acquisition and accumulation of books. I also collected CDs, DVDs, comics and other cultural ephemera. I kept movie tickets, clippings of articles, flyers, interesting things I picked up. I couldn’t bear to throw these out because I thought that there might come a time when I might need something—like, say, my readings in Sociology 101 from the year 2000.

Who knew when I would have to define the sociological imagination? Or when I would need to define the political dynamics and do a comparative analysis of the authoritarian leadership styles of Lee Kuan Yew and Saddam Hussein based on my studies of Politics and Change in the Third World in 2001? Oh and there were empty liquor bottles signed by friends from the early Noughties wishing me a happy nineteenth or twentieth birthday, and lord knows a situation might arise when I might need those too.

If I were the professor of this Soc 101 class, what should my response be on hearing this? Happiness that a former student might have returned to these notes? Depression because the student had years to look at them and never did? Or indifference, since this student seemed to collect a lot of things, not just sociology notes?

More broadly, I would be curious to know how often college students return to their books and notes from school. Does anyone have any systematic data on the subject? I suspect the data would look like a Poisson distribution with a small mean: most students have never returned to these sources at all. But couldn’t this be a measure of the “effectiveness” or “success” of a particular class, an outcome that colleges and professors might be interested in knowing about? Typically, we get information on evaluation forms in the closing moments of a class, a time when students might be able to judge its immediate effect but can shed little light on the longer-lasting impact of a particular course. Imagine if we found that a more popular sociological text like Gang Leader for a Day was popular in the short term but a text like The Truly Disadvantaged stuck with students for years. Both outcomes could be desirable, since a short-term book or lecture can draw people into the subject or enhance the classroom experience while a longer-term book or lecture can influence lives down the road, but they are qualitatively different pieces of information.
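To make the intuition behind that suspicion concrete, here is a minimal sketch in Python, using a purely hypothetical mean of 0.3 return visits per student, of why a Poisson distribution with a small mean puts most of its probability on zero:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Probability that a Poisson random variable with mean lam equals k."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical assumption: students reopen their old notes 0.3 times on average.
lam = 0.3
probs = {k: poisson_pmf(k, lam) for k in range(4)}
# probs[0] = e^(-0.3), roughly 0.74: most students would never return at all.
```

Under that assumed mean, roughly three-quarters of students would never open their notes again, which is consistent with the hunch above; actual data could of course look quite different.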

Perhaps this could all be explained by personality types: there are people who keep things from the past and those who do not. But I suspect that professors would like to think that they have the potential in many lectures or in the sources they put in front of students to influence any student for years.

[CollegeNameHere].com coming to a browser near you

Even though colleges have their own Internet domain, .edu, some colleges are thinking about branching out into .com addresses:

Some observers worry, though, that an influx of new names might dilute the power of “.edu,” which has been the online way to say “a legitimately accredited institution of higher education in the United States.”

Weber State University is among those that have already started branching out, with “getintoweber.com” as an online destination. It is “a vanity URL we pursued to dovetail with our ‘Get Into Weber’ marketing campaign that started in 2007,” says John L. Kowaleski, director of media relations. “We wanted something catchy and easy to remember, since the intended audience for “getintoweber.com” was prospective students.”

Why not simply add a “getintoweber.edu” address to the existing “weber.edu”? Because “.edu” is restricted by the “one per institution” rule that has been in effect since 2001, says Gregory A. Jackson, a vice president of Educause, the higher-education-technology group that administers the “.edu” domain. “The U.S. Commerce Department, which gave us the contract to administer the domain, views ‘.edu’ as something that identifies an institution, not multiple names that mean the same institution,” he says…

Asking the Internet Corporation for Assigned Names and Numbers for a domain of one’s own—”.weberstate” or “.trinity,” for instance—would avoid some of those problems. But that’s an expensive route to go. A college has to pay Icann $185,000 to become the administrator of a domain, and then $25,000 each year to maintain it. And the college has to adhere to strict rules about who gets the domain and who doesn’t, which could cause other problems. “What if you say that alumni can have ‘.dartmouth’ in order to strengthen connection to the school?” Mr. Jackson says. “And then an alumnus involved in some shady dealings uses that address? You can’t ban them. Icann won’t let you pick who you like and who you don’t.”

If the .com addresses are just for marketing purposes, why haven’t more colleges gone this route already? It isn’t very hard to set up a targeted site and then link through to the college’s main page.

It sounds like some of the issue is the meaning or symbolism behind the .edu domain. If prospective students and parents are searching for schools, they know the .edu domain is pretty safe. The .com realm is more open, and there could be some confusion about who put a site together. Particularly for less comfortable web users, going to a .edu could be a safer and more trustworthy proposition.

Of course, the rules about the use of .edu sites hint at a bigger problem across the internet: a need for more top-level domains to accommodate more online destinations.

(With all of this talk, shouldn’t some enterprising people buy up a bunch of the possible .com sites? For example, wheatoncollege.com is available but wheaton.com is not.)

94% of American parents expect their kid to go to college

Looking at the article “Is a college education worth the price?”, I was pointed to Pew survey data released in May 2011 about what Americans think about college. Among the findings:

Nearly every parent surveyed (94%) says they expect their child to attend college, but even as college enrollments have reached record levels, most young adults in this country still do not attend a four-year college. The main barrier is financial. Among adults ages 18 to 34 who are not in school and do not have a bachelor’s degree, two-thirds say a major reason for not continuing their education is the need to support a family. Also, 57% say they would prefer to work and make money; and 48% say they can’t afford to go to college.

These are pretty high aspirations that cut across income levels and backgrounds. Pew suggests the primary barrier to reaching these expectations is money: the need to support oneself and a family gets in the way.

But I wonder if there is another barrier that is partly due to finances and partly due to other factors: it can be difficult to translate aspirations into outcomes. In today’s world, and particularly in America, where parents have always desired great things for their children (I remember this coming out distinctly in the original Middletown study), what parent wouldn’t say that their kid will attend college? If one comes from a privileged background, a child can see how this path will logically play out: you move through the stages of school and naturally from high school to college (with finances somehow taken care of and parents socking away money in a college fund for over a decade). But, in lesser circumstances, where is this easy path? It may be doable, but there are a lot of obstacles standing in the way.

This reminds me of Annette Lareau’s Unequal Childhoods. While I can’t remember whether she specifically talks about college aspirations, the class-based styles of parenting she outlines could lead to different outcomes in achieving these parental aspirations.

Syllabus bloat: the ever-lengthening college syllabus

A professor discusses the reasons why college syllabi keep getting bigger:

Nowadays my course syllabi tend to run to many pages and always include a punctilious day-by-day calendar of the semester stipulating, for example, precisely which pages in what book students need to have read for class.  My instructions to students concerning formal written work have also become replete with prescription in a way that I would not have thought necessary even ten years ago.  Colleagues concur that instructors at the state-college level can take little or nothing for granted about student preparedness and that everything, absolutely everything, must be spelled out in advance.  Without abundant guidance and prescription, students complain of being lost, as perhaps they are, or of “not understanding what the professor wants,” as is perhaps the case…

First-year college students have a drastically diminished vision of what higher education portends for them.  The idea of discipline that enabled my UCLA instructors to assume procedural competency in their students, and that enabled most students to acquit themselves during the term with only a minimal syllabus, no longer exists…

The enlargement of the syllabus also stems from the need to define, explain, and insofar as possible justify the course itself, something that no syllabus from my undergraduate career ever bothered to do.  The syllabus of my survey of ancient literature (“Western Heritage”) addresses the basic notion of historical indebtedness, the idea of continuity of insight, and of the dignity of knowledge as opposed to ignominy of ignorance.  The syllabus also addresses the difficulty of reading; it tells students that an epic poem by Homer or a philosophical dialogue by Plato is not like a TV drama or a movie, in which in the first few minutes, one can predict the remainder…

Some of this effort—and much of the hypertrophied syllabus—is precautionary. It is precautionary on behalf of students, who, from day one, will know in advance every requirement and assignment of the course. It is also precautionary on behalf of the syllabus-writer, who seeks protection from petulant students claiming they never knew the schedule or failed to receive procedural knowledge concerning the semester.  Syllabus in hand, no one can plead ignorance.

The general idea is this: today’s college students need college explained to them, point by point. This could quickly turn into a generational argument bigger than just college classes: the role of college has changed from a place of learning to four years of job training. Society, and consequently college students, simply do not know what college is about when they should, and professors have to do the extra work to explain it. This could also be tied to the issue that a number of college students are not ready to do college-level work.

There may be some truth to this but, as the article hints at, there could be good reasons to have longer syllabi:

1. Expectations are made clear from the beginning. This could cut down students’ anxiety, as there is less “guesswork” involved. If a relatively short document (compared to books/journal articles) can help eliminate ignorance, why not?

2. Why not have a short part of the syllabus that explains what the class is about? Certain subjects, like sociology, are relatively unknown, and a one- or two-paragraph introduction can give students an engaging foundation.

3. I like having the day-to-day calendar for myself so why not provide it for the students as well? Perhaps this is just because I like to be organized.

4. I wonder if a detailed, longer syllabus, just by its thoroughness, conveys to students the importance of the task. Some students may groan at seeing how much there is to read, but others will feel the gravity.

We could transfer these ideas to another context: would many employees find it acceptable to come to work each day with little idea of what to expect? On one hand, we should promote internal motivation, but some structure is helpful. We can rue the loss of the “gravity” and “mystery” students once had or felt regarding college, or we can try to convey these ideas in our syllabi and in what we say and do in the classroom.

h/t Instapundit

College athletes clustering in a few majors, including sociology

I’ve written before about sociology being considered an “easy major” by athletes. A new report looks at some notable schools and considers how clustered male athletes are within majors:

Since the NCAA invented the APR [Academic Progress Rate] in 2003, critics have worried that it would discourage athletes from choosing difficult majors or from changing course once they started down a given track. Some have anticipated a “clustering” of athletes in certain majors, such as sociology or communication, and others have expressed concern about the creation of broad programs such as general studies with athletes in mind.

A 2008 analysis by USA Today found that clustering happens at most institutions, and of the three sports programs Shalala compares, Miami football is most questionable, with 62.5 percent of the team studying one of two majors. While clustering on a small scale isn’t necessarily unusual, researchers who study the phenomenon say the 25-percent mark is where things start getting fishy.

A full 37.5 percent of Miami’s junior and senior football players were majoring in liberal arts in 2008, and 25 percent in sports administration. The same 37.5 percent of Stanford’s junior and senior softball players were in one major — but it was human biology — and 36.8 percent of baseball players majored in sociology. Notre Dame athletes didn’t cluster at all, according to USA Today’s analysis.

While this report by Donna Shalala, president of Miami, seems tied to the troubles its football program has had with violating NCAA regulations, USA Today’s 2008 analysis offers more insight. Sociology is lumped within the social sciences in the graphics, but if you mouse over them you can see that while most of the clustering occurs in the social sciences broadly, clusters in sociology specifically are numerous.
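The clustering measure the researchers describe is straightforward to compute: the share of a roster enrolled in its most common majors. A minimal sketch, using a hypothetical 16-player roster sized to echo the Miami football figures quoted above (37.5 percent in one major, 25 percent in a second):

```python
from collections import Counter

def top_k_major_share(majors, k=2):
    """Fraction of a roster enrolled in its k most common majors."""
    counts = Counter(majors)
    top = sum(n for _, n in counts.most_common(k))
    return top / len(majors)

# Hypothetical roster: 6 liberal arts, 4 sports administration, 6 singletons.
roster = (["liberal arts"] * 6 + ["sports administration"] * 4
          + ["biology", "history", "economics", "english", "math", "physics"])
share = top_k_major_share(roster, k=2)  # 10/16 = 0.625, well past the 25% mark
```

By this measure the hypothetical roster sits at 62.5 percent, matching the figure that the researchers flag as far beyond the point where clustering "starts getting fishy."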

Alas, this collected data is still limited:

Assisted by sports information and other school offices, USA TODAY obtained the majors for about 85% of the athletes in the study. For most of the rest, no major was listed. Primary or first-listed majors were used in the cases of students with multiple majors.

Initially, part of the intent was to compare the percentages of athletes in a major with those of the student body as a whole. That is, if 30% of baseball players are in sociology, is 30% of the entire student body enrolled in sociology? However, short of getting athletes’ private records and the federal reporting code of each athlete’s major, large-scale comparisons are unreliable because some schools have multiple versions of some majors.

The NCAA collects similar information, but does not release it and has no current plans to study it.

Hmmm…I wonder why the NCAA has no interest in analyzing this data.

Measuring colleges by their service to community and country

Many publications want to get into the college rankings business and Washington Monthly released their own take today. The difference? They emphasize how the college gives back to society:

The Monthly’s list aims to be a corrective to the annual ranking of colleges published by U.S. News & World Report–the industry-standard roster that typically leads with well-endowed Ivy League schools that turn away the vast majority of applicants.

Instead, the Monthly ranks schools using three main categories: how many low-income students the college enrolls, how much community and national service a given college’s students engage in, and the volume of groundbreaking research the university produces (in part measured by how many undergraduates go on to get PhDs). To paraphrase the long-ago dictum of President John F. Kennedy, the Monthly is seeking, in essence, to ask not so much what colleges can do for themselves as what they can be doing for their country.

By that measure, only one Ivy cracked the top 10–Harvard. The University of California system dominated, with six of its campuses among the top 30 national universities. Texas A&M, which is ranked 63rd by U.S. News, shot into the top 20 in part because of how many of its students participate in ROTC. Meanwhile, Washington University in St. Louis plunged in these rankings to 112 from U.S. News’ 13, because only 6 percent of its student body qualifies for federal Pell grants, an indication that Washington’s students come almost entirely from upper- and middle-class backgrounds.

The U.S. News & World Report “relies on crude and easily manipulated measures of wealth, exclusivity, and prestige for its rankings,” Washington Monthly editor Paul Glastris wrote. The U.S. News’ rankings take into account freshmen retention rate, admissions’ selectivity, high school counselors’ opinions of the school, faculty salary, per-pupil spending and the rate of alumni giving, among other things.

While the editor suggests these new rankings are not as influenced by status and wealth, I wonder if the measures really get away from these entirely. It takes resources to enroll low-income students, to support ground-breaking research, and perhaps to give students the extra time to engage in community and national service. On the other hand, colleges make decisions about how to spend their money and could choose to put their resources into these particular areas.

I’m sure there will be questions about methodology: how did they measure impactful research? How much should ROTC count for and how did they measure community engagement?

New rankings also give more schools an opportunity to claim that they are at the top. For example, Northwestern College in Iowa now trumpets on its main page that “Washington Monthly ranks NWC third in the nation.” Read past the headline and you find that it is third among baccalaureate colleges. On the other side, will schools like Washington University in St. Louis even acknowledge these new rankings, since they don’t look so good in them?

College students don’t know how to use Google

I recently heard about this study at a faculty development day: college students have difficulty understanding and using search results.

Researchers with the Ethnographic Research in Illinois Academic Libraries project watched 30 students at Illinois Wesleyan University try to search for different topics online and found that only seven of them were able to conduct “what a librarian might consider a reasonably well-executed search.”

The students “appeared to lack even some of the most basic information literacy skills that we assumed they would have mastered in high school,” Lynda Duke and Andrew Asher write in a book on the project coming out this fall.

At all five Illinois universities, students reported feeling “anxious” and confused when trying to research. Many felt overwhelmed by the volume of results their searches would turn up, not realizing that there are ways to narrow those searches and get more tailored results. Others would abandon their research topics when they couldn’t find enough sources, unaware that they were using the wrong search terms or database for their topics.

The researchers found that students did not know “how to build a search to narrow or expand results, how to use subject headings, and how various search engines (including Google) organize and display results.” That means that some students didn’t understand how to search only for news articles, or only for scholarly articles. Most only know how to punch in keywords and hope for the best.

Such trust in technology. Wonder where this came from?

I like how anthropologists were involved in this study. Including an observation component could make this data quite unique. I don’t think many people would think that ethnographic methods could be used to examine such up-to-date technology.

Several other thoughts:

1. How many adults could explain how Google displays pages?

1a. If people knew how Google organized things, would they go elsewhere for information?

2. Finding and sorting through information is a key problem of our age. The problem is not a lack of information or possible sources; rather, there is too much.

3. Who exactly in schools should be responsible for teaching this? Librarians, perhaps, but students have limited contact. Preferably, all teachers/professors should know something about this and talk about it. Parents could also impart this information at home.

4. I’m now tempted to ask students to include all of their search terms in final projects so that I can check and see whether they actually sorted through articles or they simply picked the top few results.

ACT scores suggest most students not ready for college

The ACT has released a report that says the majority of students who take the test are not ready for college:

Only one in four college-bound high school graduates is adequately prepared for college-level English, reading, math and science, according to a report released Wednesday by the ACT college admissions test.

Some 28 percent of the members of the high school class of 2011 failed to meet readiness benchmarks in any of the four core subject areas.

“ACT results continue to show an alarmingly high number of students who are graduating without all the academic skills they need to succeed after high school,” the report stated…

Readiness was defined as a student having a 50 percent chance of getting a B or a 75 percent chance of getting a C in first-year courses English Composition, College Algebra, Biology and social sciences.

Additionally, there are some pretty big gaps between racial and ethnic groups.

Here are some possible courses of action in response to this information:

1. Tell colleges that they need to offer more remedial classes and get students up to speed.

2. Add to the argument that perhaps college isn’t for all students.

3. Tell high schools that they need to keep their standards high and improve their ability to prepare students for college.

3a. Push the issue further down the educational ladder before high school.

4. Attack the ACT test. Perhaps it isn’t a great predictor of success, perhaps it is culturally biased, perhaps the students who take the ACT are not the same who take the SAT, etc.

I wonder how colleges will respond to this information. I would guess that this really doesn’t impact more elite schools who have their pick of students who have higher ACT scores. But where does this leave schools that accept a broader range of students?

Sociological findings of Academically Adrift in Doonesbury

The findings of Academically Adrift stirred up a lot of discussion. (See an earlier post here.) Eight months after the book was released, its findings made their way to the Sunday comics (August 14), as Doonesbury picked up on the information.

Neither colleges nor emerging adults look too good here.

It would be interesting to hear Garry Trudeau talk about how he discovered this information and what he wanted to say in this particular comic strip.