Using Twitter to predict when you will get sick with 90% accuracy

A new study uses tweets in New York City to predict when a user will get sick – and does so with 90% accuracy.

Using 4.4 million GPS-tagged tweets from over 630,000 users in New York City, Sadilek and his team were able to predict when an individual would get sick with the flu and tweet about it up to eight days in advance of their first symptoms — and to do so with 90 percent accuracy.

Similar to Google Flu Trends, which uses flu-related search queries to pinpoint where and how outbreaks are spreading, Sadilek’s system uses an algorithm to differentiate between different senses of the word ‘sick.’ For example, “My stomach is in revolt. Knew I shouldn’t have licked that door knob. Think I’m sick,” is different from “I’m so sick of ESPN’s constant coverage of Tim Tebow.”
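The kind of disambiguation Sadilek’s system performs can be sketched with a tiny naive Bayes text classifier. Everything below — the training tweets, the labels, the two-class setup — is a made-up illustration of the general technique, not the study’s actual method or data:

```python
import math
from collections import Counter

# Hypothetical labeled tweets -- the study trained on a far larger,
# richer corpus; these four examples just illustrate the idea.
TRAIN = [
    ("my stomach hurts and i feel feverish think i am sick", "illness"),
    ("coughing all night definitely coming down with something", "illness"),
    ("so sick of the constant sports coverage", "other"),
    ("that concert was sick best show ever", "other"),
]

def train(examples, alpha=1.0):
    """Count words per class for a naive Bayes 'sick' disambiguator."""
    word_counts = {}
    doc_counts = Counter()
    for text, label in examples:
        word_counts.setdefault(label, Counter()).update(text.split())
        doc_counts[label] += 1
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, doc_counts, vocab, alpha

def classify(text, model):
    """Return the most probable label, using Laplace smoothing."""
    word_counts, doc_counts, vocab, alpha = model
    total_docs = sum(doc_counts.values())
    best_label, best_score = None, float("-inf")
    for label, counts in word_counts.items():
        score = math.log(doc_counts[label] / total_docs)  # log prior
        denom = sum(counts.values()) + alpha * len(vocab)
        for word in text.split():
            # counts[word] is 0 for unseen words; smoothing keeps it nonzero
            score += math.log((counts[word] + alpha) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train(TRAIN)
print(classify("i feel sick and feverish", model))   # illness
print(classify("sick of all this coverage", model))  # other
```

A real system would need vastly more training data and richer features (the study also drew on location and who users interacted with), but the core idea — letting surrounding words decide which sense of “sick” a tweet means — is the same.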

Of course, Sadilek’s system isn’t an exhaustive crystal ball. Not everyone tweets about their symptoms and not everyone is on Twitter. But considering New York City has more Twitter users than any other city in the world, the Big Apple is as good a place as any for this study.

While one could look at this and marvel at the power of Twitter, I think the real story here is about two things: (1) the power of big data and (2) the power of social networks that Twitter harnesses. If you have people volunteering information about their lives, access to the data, and information about who users are connected to, you can do things that would have been very difficult even ten years ago.

It is interesting that this study was conducted in New York City where there is a high percentage of Twitter users. How good are predictions in cities with lower usage rates? Are we headed toward a world where public health requires people to report on their health so that outbreaks can be contained or quelled?

Seeing urban growth from the Landsat satellite system

Among other things, the Landsat satellites took pictures of big cities over time. Here are images of 11 of these big cities with roughly 30-40 years between each picture. A few thoughts:

1. I find several of the desert city images, such as Dubai and Las Vegas, to be most fascinating.

2. I’ve always liked overhead or satellite pictures of cities; they give a helpful perspective in which one can see the big picture rather than just the immediate area.

3. I’ve wondered several times how difficult it might have been for city dwellers who lived before the 19th century to truly imagine an overhead view of their city. Clearly it could be done, but it is one thing to imagine such a view and another to see it from an airplane (or hot air balloon or dirigible) or a really tall building.

4. I would be interested in spending some time with these images to see if there are discernible patterns. I assume the first thing people would notice is the expansion of development but I assume there are some other things in here such as important transportation corridors (highways and trains) and different kinds of development located in different places.

Bonus: here are some pairs/series of images from American locations.


Four tips for making a good infographic

The head of a new infographic website suggests four tips for making a good infographic:

1. Apply a journalist’s code of ethics

An infographic starts with a great data set. Even if you’re not a journalist — but an advertiser or independent contractor, say — you need to represent the data ethically in order to preserve your credibility with your audience. Don’t source from blogs. Don’t source from Wikipedia. Don’t misrepresent your data with images.

2. Find the story in the data

There’s a popular misconception that creating a great infographic just requires hiring a great graphic designer. But even the best designer can only do so much with poor material. Mapping out the key points in your narrative should be the first order of business. “The most accessible graphics we’ve ever done are the ones that tell a story. It should have an arc, a climax and a conclusion,” Langille says. When you find a great data set, mock up your visualization first and figure out what you want to say, before contacting a designer.

3. Make it mobile and personal

As the media becomes more sophisticated, designers are developing non-static infographics. An interactive infographic might seem pretty “sexy,” Langille says, but it’s much less shareable. A video infographic, on the other hand, is both interactive and easy to port from site to site. Another way to involve readers is to create a graphic that allows them to input and share their own information.

4. Don’t let the code out

One of the easiest ways to protect your work is to share it on a community site. Visual.ly offers Creative Commons licensing to users who upload a graphic to the site. When visitors who want to use the graphic grab embed code from the site, the embedded image automatically links back to its creator. Langille suggests adding branding to the bottom of your work and never releasing the actual source file — only the PNG, JPEG, or PDF. And what if your work goes viral without proper credit? For god’s sake, don’t be a pain and demand that the thieves take it down. “It’s better to let it go and ask for a link back and credits on the graphics,” Langille said.

The first two points apply to all charts and graphs: you need to have good and compelling data and then use the graphic to tell this story. Infographics should make the relevant data easier to understand than having someone read through denser text. An easy temptation is to try new ways of displaying data without thinking through whether they are easily readable.

It would be interesting to know whether infographics are actually more effective in conveying information to viewers. In other words, is a traditional bar graph made in Excel really worse in the basic task of sharing information than a snazzy infographic? I imagine websites and publications would rather have infographics because they look better and take advantage of newer tools but a better visual does not necessarily equal connecting more with viewers.

Side note: the “meta Infographic” at the beginning of this article and the “Most Popular Infographics You Can Find Around the Web” at the end are amusing.

Thinking about Americans losing the ability to work with their hands

A New York Times essay argues we are losing something as Americans because fewer people can work skillfully with their hands:

“In an earlier generation, we lost our connection to the land, and now we are losing our connection to the machinery we depend on,” says Michael Hout, a sociologist at the University of California, Berkeley. “People who work with their hands,” he went on, “are doing things today that we call service jobs, in restaurants and laundries, or in medical technology and the like.”

That’s one explanation for the decline in traditional craftsmanship. Lack of interest is another. The big money is in fields like finance. Starting in the 1980s, skill in finance grew in stature, and, as depicted in the news media and the movies, became a more appealing source of income…

Craft work has higher status in nations like Germany, which invests in apprenticeship programs for high school students. “Corporations in Germany realized that there was an interest to be served economically and patriotically in building up a skilled labor force at home; we never had that ethos,” says Richard Sennett, a New York University sociologist who has written about the connection of craft and culture…

As for craftsmanship itself, the issue is how to preserve it as a valued skill in the general population. Ms. Milkman, the sociologist, argues that American craftsmanship isn’t disappearing as quickly as some would argue — that it has instead shifted to immigrants. “Pride in craft, it is alive in the immigrant world,” she says.

I don’t doubt that the ability to produce craftsmanship is worthwhile, particularly if one is a homeowner. But I wonder about the larger value of working with one’s hands. Why can’t using a mouse or a controller be considered “working with one’s hands”? Of course, it fits in a literal sense but there is a difference in production and skills. Yet it still requires effort and finesse to effectively utilize the newest machines. Perhaps we have swapped our traditional toolbox for a “digital toolbox.”

If the world is moving toward an information and service economy, is this necessarily bad? This reminds me of a piece in The Atlantic months ago about a contest where programmers had to try to put together a computer that could converse like a human. Working with tools is not uniquely human but thinking and reasoning might be. Does this make working with our hands less valuable compared to other possible activities?

Google adding more and more indoor maps of buildings

Google continues to expand its Maps program by adding more and more indoor maps:

10,000 indoor maps. You can consider this proof-positive that Google is making headway in its effort to chart every nook and cranny of navigable terrain, even if this includes carpet and linoleum.

Even more noteworthy: A great many of these floor plans weren’t created in partnership with Google. Instead, they were uploaded by users — business owners and institutional leaders who were motivated to make their properties just a bit more open to all. A steakhouse in Massachusetts. A camera store in New York. Even the Mayo Clinic in Scottsdale, Arizona. More and more pioneering spirits are using Google’s self-service tool to upload their building layouts for everyone to see.

But there’s a caveat: It’s nearly impossible to find most of these indoor maps, unless you happen to stumble upon one during your day-to-day use of the Maps app. Or unless you read Wired.

Google launched its indoor mapping initiative and its Google Maps Floor Plans self-publishing tool in November 2011. But right now, if you look at the Google Maps support site, you’ll find a bare-bones list of some 80 available indoor maps inside the U.S. This list only includes major museums, airports, and business locations that Google has partnered with.

Much more interesting to Wired are the individual businesses and organizations that have made their own indoor-mapping leaps of faith. We were smitten with the idea that so many people willingly uploaded their floorplans to the mapping database, so we asked Google to share a sampling of user-submitted examples. As you can see from the images above, some of the maps are most noteworthy for their sheer, well, normal-ness. But this, in part, reflects the limitations that Google puts on people who voluntarily opt into the service.

While the last uncharted area of the Earth may be deep under the oceans, providing widely available maps of public indoor spaces (Google is not yet accepting private buildings) is also pretty cool. These maps could be really helpful to visitors who don’t realize what may be around the corner or down the corridor inside a nearby building.

So when can I start getting turn-by-turn directions on my smartphone from the entrance to the Field Museum in Chicago to my favorite exhibits?

New York Times review of SimCity Social

Here is evidence that the world is a changed place: the New York Times has a short review of the new SimCity Social game for Facebook.

SimCity Social brings the original city-building video game to Facebook, though fans will be hard-pressed to find any of the depth and complexity of that popular PC series. Players place businesses, factories, houses and various attractions, as their expansionist ambitions are kept in check by an energy meter that slowly refills.

The game allows friends to establish sister cities or rival cities, which enables some entertaining cross-border acts of charity or benign sabotage. SimCity Social is a cute and capable social city builder. It’s also a shameless attempt to capitalize on the success of Zynga’s wildly popular CityVille, slapping a powerful name on a game that could never live up to SimCity’s legacy.

As a long-time SimCity fan, I’m tempted to try out this new version. However, several things will stop me:

1. I don’t want a watered-down version. I’d rather use my computer and Xbox 360 to play full, more stunning versions of games.

2. I’m not sure even a full-scale social version would add to the gameplay.

3. Does this app bug all of your friends like Farmville and the like? If so, I’m staying far away.

4. It sounds like this version may have become more “gamified” rather than being the free-flowing game I’m used to. Here is another review that explains some of the game:

So it’s technically Facebook, but when you’re playing it, it feels like a place (OR A CITY) of its own. I started playing it last Friday and I can’t stop. I am on Level 17, my population is at a healthy 6,000, and SimCitySocialCheat.com is the website I aspire to be managing editor of. There’s something about the colorful utopia that I cannot stop thinking about.

Maybe it’s the constant yearning of completing tasks to get more energy bolts, thus being able to build more houses and increase population and, in doing so, unlock the next level and new attractions.

Perhaps it’s the constant praise the game heaps on you for doing something so dumb and pointless, like planting a tree in a highly populated area. The real world just doesn’t offer that, unless you send a tree to Israel. (Then you get a fancy certificate back in return.)

And my friends are redeeming themselves there. You find an inner-circle of people that you can trust and rely on—not for moral support, but for land permits, teamwork badges, and Dunkin’ Donut energy bonuses: Jordanville runs on Dunkin’.

SimCity has always had some incentive to grow as you get to build different kinds of things. This often worked like it does for real cities: as a city grows, it can support monuments, cultural attractions, and more complicated transportation options. However, it sounds like this new version takes it to another level.


Some big cities only made possible by air conditioning?

This seems pertinent with the recent heatwave in the Midwest and East Coast: how many of the major cities of the world wouldn’t exist without air conditioning?

It wasn’t until the beginning of World War II that homes in southern U.S. cities began using air conditioning units. By 1955, one in every 22 American homes had air conditioning. In the South, that number was about 1 in 10, according to the historian Raymond Arsenault [PDF]. Since this increase in air conditioning use, many of these Southern cities experienced a population boom.

I took a look at the metro areas in the U.S. with more than 1 million people and found which have historically been the hottest, based on the number of cooling degree days per year — a statistic that measures how far and for how many days the outside temperature in a location rises above 65 degrees. Using numbers from NOAA, I found that between 1971-2000, six big cities in the South had an average of at least 3,000 cooling degree days. I also compared the 1940 metro population (when available) to the metro population in 2010. From the time just before air conditioning became popular in the South to today, population growth in the region has skyrocketed. This raises the question: would these hot Southern cities be around, at least in their present form, if air conditioning hadn’t been invented?
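The cooling-degree-day statistic is simple to compute from daily temperature data. A minimal sketch — the 65°F base is NOAA’s standard, but the week of temperatures below is made up for illustration:

```python
BASE_F = 65.0  # NOAA's standard base temperature for cooling degree days

def cooling_degree_days(daily_mean_temps):
    """Sum how far each day's mean temperature exceeds the base."""
    return sum(t - BASE_F for t in daily_mean_temps if t > BASE_F)

# A made-up week of daily mean temperatures (degrees Fahrenheit)
week = [60, 63, 66, 72, 80, 85, 70]
print(cooling_degree_days(week))  # 1 + 7 + 15 + 20 + 5 = 48.0
```

Summed over a year, a total of 3,000 cooling degree days means a lot of hot days stacked on top of each other — which is exactly what makes air conditioning so consequential in these metros.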

But, of course, there are bigger, hotter cities across the globe. In fact, seven of the largest metros in the world have an average high temperature above 90 degrees Fahrenheit.

Not surprisingly, all of these cities are found in developing countries. As Michael Sivak, a professor at the University of Michigan notes, only two of the warmest 30 global metros can be found in developed countries. With the middle class growing in warm metros in countries like India, demand for air conditioning is increasing. A recent New York Times article reported that sales of air conditioning units in India and China are growing 20 percent per year and are fast becoming a middle-class status symbol. Last year, 55 percent of new air conditioners were sold in the Asia Pacific region.

Is there some sort of giant control group we could use to figure this out? Over the weekend, I was in a 150-year-old church with no air conditioning. It was hot, though I think this was primarily because there was no air movement; indeed, when we walked outside afterward, it felt more pleasant as there was a slight breeze. Before air conditioning, people obviously survived in such temperatures (and also survived the winters without central heating as we know it today).

So this seems to be the real question: could we expect major changes in population distributions if there were no air conditioning whatsoever? Would Florida really have few people, and would the post-World War II Sunbelt expansion not have taken place? The best solution to all of this would be to have people move to more temperate climates where it doesn’t get too hot in the summer or too cold in the winter. This generally requires consistent breezes, usually off major bodies of water. Of course, not everyone can live in places like Hawaii, whose record high temperature is only 100 degrees. Did the Mediterranean climate help give rise to civilizations like Greece and Rome (though that makes it difficult to explain the Sumerian, Assyrian, Babylonian, and Persian empires, which must have adapted to desert climates)?

More broadly, we could discuss the influence of ecology on population growth and state building. I remember studying the mysterious decline of the Maya in southeastern Mexico/Guatemala. More recent scholars have suggested some kind of ecological explanation, perhaps a drought, that led to increased contentious competition for dwindling resources.

A sociologist on the iPhone at 5: “There has been no other device that has changed social and technological life in such a short time”

The iPhone just turned five years old and a sociologist makes some big claims about the impact of the device:

“There has been no other device that has changed social and technological life in such a short time,” said Clifford Nass, a Stanford University sociologist and psychologist who studies how technology impacts society. “There has been nothing like it in the world.”

This is a bold claim. I assume this is primarily about the time period: important technology today has the ability to make rapid changes. This is one of the defining features of today’s globalization: stuff happens and spreads quickly. The iPhone itself is influential but it quickly led to other changes and pushed Android and other phone makers as well. I can admit that the smartphone world has some advantages.

At the same time, I wonder if this claim is too much. Looking at the broad sweep of human history, how does the iPhone stack up? What about the printing press, the plow, the steam engine, and so on? These devices may not have had such a quick effect but these led or contributed to whole eras like the Renaissance, the Agricultural Revolution, and the Industrial Revolution. Will we look back in fifty or one hundred years and see the iPhone as a similar singular device or is it part of the computer-age process?

Why sociologists should make their own apps

A sociologist who has made her own medical sociology app argues that her colleagues should be making their own apps:

My decision to make an app stemmed from two major reasons. First, I have long been interested in the ways people interact with computer technologies, and have published some research on this in the past.

More recently my interest has turned to health-related apps available for smartphones and tablet computers. I had been researching the various apps available for such purposes and had noted that many apps have been developed for teaching purposes for medical students.

Second, we have mobile digital devices at home that are very popular with my two school-aged daughters. I had noticed the huge number of educational apps that are available for children’s use, from infancy to high-school level. Some Australian high schools, including my older daughter’s school, have acknowledged young people’s high take-up of mobile digital devices and are beginning to advocate that students bring their devices to school and use them for educational purposes during the school day.

The relevance for tertiary-level education appeared obvious. I wondered whether many universities, academic publishers or academics themselves had begun to develop apps. Yet, having searched both the Android and the Apple App Stores using the search term of my discipline, ‘sociology’, I discovered only a handful of apps related to this subject for tertiary students. Nor were there many for other social sciences. There seemed to be a wide-open gap in the market…

My app is very simple. It is text-based only and has no illustrations or graphics, but there is provision for these to be included if the developer so chooses. Apps developed using this particular wizard are only available for use on Android devices, but having looked at similar app makers for Apple devices I was put off by their more technical nature and the greater expense involved.

In just a couple of hours my app was ready. I had typed in over 25 medical sociology key concepts (for example, social class, discourse, identity, illness narratives, poststructuralism), plus a list of books for further reading, chosen a nice-looking background and paid US$79.00 for the app to appear without ads and to guarantee that it would be submitted to the Android App Store.

Three issues I could see with this:

1. How much demand is there really for such apps? I can’t imagine too many people look for sociology or social science apps. Of course, it is relatively easy to make so it isn’t like tons of time has to be invested in such apps (though there could be a relationship between the time put into an app and how engaging it is).

2. The assumption here is that people want to use these apps for educational purposes. Would this work? Can apps effectively be used for education?

3. How much better is making an app than putting together a website?

I’m glad to see more sociologists venturing into new technologies but it is worthwhile to consider the payoffs and how they are really going to be used.

When you find out that your dissertation is for sale as an ebook without you knowing about it

A recent sociology PhD describes an interesting experience: he found that his dissertation was being sold online as an ebook.

A Google search brought me to a link to BarnesandNoble.com, where with one click I soon discovered that my dissertation was being sold. It took a minute of staring at the computer screen to fully accept that my work could be purchased for (at the time) $32.34 as an eTextbook for the Nook reader. I thought the price was a steal. Literally.

I had graduated about a year earlier with a Ph.D. in sociology. Although I had hoped to turn my dissertation into a book one day, I had not yet started that process. I hadn’t even secured a contract with a publisher…

I began investigating how it could have come to be for sale. Like many graduate students, I was burned out after defending my dissertation. My immediate thoughts were not about which publisher I should contact but about whether I would be able to afford rent and food in this economy. My final weeks of graduate school had been a bit foggy, and I couldn’t recall the specific publication options I had selected when I submitted the dissertation to my university as a degree requirement.

So I dug out my copies of handouts from the Office of Graduate Studies, describing my options for publication with ProQuest (the university’s publisher of theses and dissertations). Reading through the papers, I could find nothing on all the possible ways my dissertation could be sold.

Then I logged into my account on the ProQuest site and saw that when I had submitted my dissertation electronically I had chosen an option for third-party selling. At the time, I was unsure what that meant, and in my end-of-graduate-school haze, I had neglected to find out. I assumed it meant that some other academic company could sell my work to individual researchers, typically few in number, who would have to exert great effort even to discover its whereabouts. I never thought it meant it could be sold, in its entirety, on the same site where one can purchase calendars and the complete series of The Sopranos on DVD.

Remembering some similar feelings at the end of my time in graduate school as I was looking to complete and defend my dissertation, I could see how this information could slip through the cracks. However, the lesson still remains: read all of that fine print so you know what you are agreeing to.