Facebook owes a debt to sociological research on social networks

At a recent conference, two Facebook employees discussed how their product was based on sociological research on social networks:

Two of Facebook’s data scientists were in Cambridge today presenting on big data at EmTech, the conference by MIT Technology Review, and discussing the science behind the network. Eytan Bakshy and Andrew Fiore each have a PhD and have held research or lecture positions at top universities. Their job is to find value in Facebook’s massive collection of data.

And their presentation underscored, unsurprisingly, the academic roots of their work. Fiore, for instance, cited the seminal 1973 sociology paper on networks, The Strength of Weak Ties, to explain Facebook’s research showing that we’re more likely to share links from our close acquaintances, but given the volume of those weaker connections, in aggregate weak ties matter more. As Facebook attempts to extract value from its users, it’s standing on the shoulders of social science to do it. It may seem banal to point out, but its insights are dependent on a rich history of academic research…

These data scientists were referencing an article by sociologist Mark Granovetter that has to be one of the most cited sociology articles of all time. I just looked up the 1973 piece in the database Sociological Abstracts, and the site says the article has been cited 4,251 times. Granovetter helped kick off an exploding body of research on social networks and how they affect different areas of life.
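The aggregate logic behind the data scientists' point, that weak ties matter more in volume, is easy to see with a toy calculation. Here is a minimal sketch in Python; the per-tie sharing rates and tie counts are invented for illustration, not Facebook's actual figures:

```python
# Toy version of the weak-ties argument applied to link sharing:
# each strong tie is more likely to pass a link along, but weak
# ties vastly outnumber strong ones. All numbers are made up.

strong_ties = 15        # close friends
weak_ties = 300         # acquaintances
p_share_strong = 0.10   # chance a given strong tie shares your link
p_share_weak = 0.02     # chance a given weak tie shares your link

expected_strong = strong_ties * p_share_strong  # 1.5 expected shares
expected_weak = weak_ties * p_share_weak        # 6.0 expected shares

print(f"Expected shares via strong ties: {expected_strong:.1f}")
print(f"Expected shares via weak ties:   {expected_weak:.1f}")
```

Even though any single weak tie is far less likely to share, the sheer volume of weak ties makes them the larger channel in aggregate, which is the shape of the finding the Facebook researchers described.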

Some of the other conclusions in the news article are interesting as well. The writer suggests the pipeline between academia and Facebook should be open both ways, as both the company and scholars would benefit from access to Facebook data:

Select academics do frequently get granted access to data at companies like Facebook to conduct and publish research (though typically not the datasets), and some researchers manage to glean public data by scraping the social network. But not all researchers are satisfied. After tweeting about the issue, I heard from Ben Zhao, an associate professor of Computer Science at UC Santa Barbara, who has done research on Facebook. “I think many of us in academia are disappointed with the lack of effort to engage from FB,” he told me over email.

The research mentioned above and presented at EmTech was published earlier this year, by Facebook, on Facebook. Which is great. But it points to the power that Facebook, Google, and others now have in the research environment. They have all the data, and they can afford to hire top tier researchers to work in-house. And yet it’s important that the insights now being generated about how people live and communicate be shared with and verified by the academic community.

This is the world of big data, and who has access to proprietary data will matter a great deal. More broadly, it should also prompt discussion about whether corporations should be able to sit on such potentially valuable data and primarily pursue profits, or whether they should make it more available so we can learn more about humanity at large. I know which side many academics would be on…

Apple iPad mini launch similar to a “religious revival meeting”?

An anthropologist discusses how the recent iPad mini launch has some religious dimensions:

She [anthropologist Kirsten Bell] came to some of the same conclusions as her predecessors, including Eastern Washington University sociologist Pui-Yan Lam, who published an academic paper more than a decade ago that called Mac fandom an “implicit religion.”…

Apple’s product launches take place in a building “littered with sacred symbols, especially the iconic Apple sign itself,” she said. During keynote speeches, an Apple leader “addresses the audience to reawaken and renew their faith in the core message and tenets of the brand/religion.”

Even Apple’s tradition of not broadcasting launches in real time is akin to a religious event, Bell said. (Today’s event was available live on Apple’s website.) “Like many Sacred Ceremonies, the Apple Product Launch cannot be broadcast live,” she wrote. “The Scribes/tech journalists act as Witness, testifying to the wonders they behold via live blog feeds.”…

Yet there are strong reasons people have long compared Apple culture to religion, Bell said. “They are selling something more than a product,” she said. “When you look at the way they advertise their product, it’s really about a more connected life.” A better life is something many faiths promise, she said.

I wrote about this earlier when a commentator made a similar argument after the passing of Steve Jobs. Comparisons like this, whether to a product launch, a big sporting event, or a rock concert, tend to draw on similar Durkheimian ideas: these are rituals; they can generate feelings of collective effervescence and emotional energy; they can strengthen group bonds; they involve important symbols that often require some inside knowledge to fully understand; there are clear lines demarcating the sacred from the profane. It may not be religion as the public typically thinks of it, involving some real or perceived spiritual or supernatural force, but its actions and consequences can be similar.


A UN report discusses how Facebook can be used for terrorism

The United Nations Office on Drugs and Crime released a report this week on how terrorists are using new platforms like Facebook:

Terrorists are increasingly turning to social media such as Facebook, Twitter and YouTube to spread propaganda, recruit sympathizers and plot potential attacks, a United Nations’ report released Monday says.

The UN Office on Drugs and Crime said Internet-based social platforms are fertile, low-cost grounds for promotion of extremist rhetoric encouraging violent acts, with terrorists able to virtually cross borders and hide behind fake identities…

The University of Waterloo sociologist said networks like Facebook are effective tools to screen potential recruits, who could then be directed to encrypted militant Islamic websites affiliated with al-Qaida, for example.

Check out what the full report says about Facebook. Here is the first mention of Facebook (p.4):

The promotion of extremist rhetoric encouraging violent acts is also a common trend across the growing range of Internet-based platforms that host user-generated content. Content that might formerly have been distributed to a relatively limited audience, in person or via physical media such as compact discs (CDs) and digital video discs (DVDs), has increasingly migrated to the Internet. Such content may be distributed using a broad range of tools, such as dedicated websites, targeted virtual chat rooms and forums, online magazines, social networking platforms such as Twitter and Facebook, and popular video and file-sharing websites, such as YouTube and Rapidshare, respectively. The use of indexing services such as Internet search engines also makes it easier to identify and retrieve terrorism-related content.

The second mention (p.11):

Particularly in the age of popular social networking media, such as Facebook, Twitter, YouTube, Flickr and blogging platforms, individuals also publish, voluntarily or inadvertently, an unprecedented amount of sensitive information on the Internet. While the intent of those distributing the information may be to provide news or other updates to their audience for informational or social purposes, some of this information may be misappropriated and used for the benefit of criminal activity.

And that’s about it when it comes to specifics about Facebook in the report. One case involving Facebook was cited specifically, but the bulk of the terrorist activity appeared to happen on other websites. On one hand, officials say they will continue to monitor Facebook. On the other hand, Facebook is just one popular website, among others, where Internet users can interact.

I imagine Facebook as a company is also interested in this, and it’s too bad they didn’t respond, at least not to Bloomberg Businessweek:

Spokespeople at Facebook, Google and Twitter didn’t immediately return phone calls and e-mails seeking comment.

Summarizing sociological theories in 140 characters or less

A sociology instructor is having his students tweet criminal-justice theories:

“They have all these theories to learn,” Atherton said. “Some of them are very dense, and complex. What I try to get them to do, and I tie some extra credit to it, is see if they can boil the theory down, the essence of it, to 140 characters.”…

In a recent class session, Atherton shared tweets from a lesson on a theory of social disorganization, displaying the tweets under Twitter’s signature bluebird.

“Social disorganization refers to communities as a whole not coming together for common goals, ultimately causing a disruption,” the first tweet stated.

Another tweet on the topic read: “theory suggests criminal activity comes from the neighborhood where someone lives and how it shapes them living there.”

If the American Sociological Association is working on a Wikipedia initiative, why not also start a Twitter push? Since it looks like Karl Marx’s Das Kapital is being tweeted (over 41,000 tweets and counting), there is work to be done.

While I think this could be an interesting pedagogical exercise, as it allows students to use a current medium and to put complex theories into their own terms, I wonder if it doesn’t perfectly illustrate the issues with Twitter. Sociological theories are often messy and complex, taking some time to explain and think through. For a very basic understanding, 140 characters could work, but if this is all students know about sociological theories, is the exercise worthwhile in the long run?

How powerful is the distrust of Facebook among its 900 million plus users?

A commentator who praises Facebook tries to get at why so many users are suspicious of the company and willing to believe rumors like the recent one that Facebook was revealing private messages on users’ walls:

The problem is that when technologists talk about data and privacy, for many of us it is still in the abstract. For technologists and computer scientists, data is a thing that lives somewhere, it has a logic and can be parsed, made sense of, organized into databases. It can be searched and ultimately sold. But as Nathan Jurgenson, a social-media theorist, points out, for most people “data is this weird nebulous concept that somebody knows something about me, but I don’t know what they know.”…

A Democratic candidate for the Maine State Senate was attacked recently by her Republican opponent for her playing of the multiplayer online game “World of Warcraft.” According to her critics, the politician playing a “rogue orc assassin” was unbecoming. This collision of two seemingly different personalities — on the one hand, a social worker and moderate politician, and on the other, a violent assassin (online) who likes stabbing things — is what sociologists have called “role strain.”

“Identities that were cultivated in little tide pools, that were conceived to be separate, come clashing together,” says Marc A. Smith, a sociologist and social-media expert. “The issue now is that all of these other identities, the idea that we can perform them on separate stages and that they had separate audiences, that is collapsing and the sound of its collapse is the sound of people squealing.”

In his 1959 “Presentation of Self In Everyday Life,” the sociologist Erving Goffman wrote about the idea of “front stage” and “back stage.” In Goffman’s theory, when they’re “front stage,” people engage in “impression management,” choosing their clothing, speech, and adapting the way they present themselves to their audience. “Back stage” they can be more themselves, which might mean shedding their societal role. In the era of social media, Smith says that “we live in a culture where the back stage keeps disappearing.” We think the conversations we are having are in private, but, in fact, they are publicly accessible and data has a long half-life. When U.S. presidential candidate Mitt Romney spoke to a select audience about the “47 percent,” he was, in fact, speaking to everyone. What happens in “World of Warcraft” doesn’t always stay in “World of Warcraft.”…

Or perhaps front stage there is a deep sense of unease about Facebook, but back stage we are not half as worried as we seem.

The suggestion here is that the world of audience segregation and impression management, where we can and do craft our actions, words, and behaviors for a particular audience, is slowly fading away. By doing more things online, these different parts of life are coming together in new ways. And I tend to agree with this journalist: there are over 900 million Facebook users, many of whom have calculated that they are willing to put at least a little information out there in return for the benefits Facebook offers, like keeping in touch with friends, being able to access information about others that was previously unavailable, or even acquiring the status that comes with keeping up with everyone else. A good number of users complain about features of Facebook that make them uneasy, but relatively few are willing to give it up altogether.

Indeed, we might be in the middle of a very important era in which individuals are slowly thinking about and practicing new ways to present themselves and see others through media like Facebook. Mark Zuckerberg has expressed the goal of a more open society in which even less information on Facebook would be private, hidden, or restricted to friends. We could also look at this from the other angle: isn’t it remarkable that millions of people around the world, in a span of less than ten years, have voluntarily put out information about themselves? One key might be that Facebook doesn’t force them to reveal everything; users can still practice impression management by crafting a profile. However, these are not “fake” or “untrue” profiles; rather, the information is an approximation of the user’s true self.

Lack of WASP candidates in the election due to the Internet?

Several commentators have picked up on this feature of the 2012 presidential election: none of the major-party presidential or vice-presidential candidates is a WASP.

Right now, we’re looking at an absence that would have been a startling presence 50 years ago. With all the focus on economic issues in the U.S. presidential race, there’s hardly any talk about the fact that, for the first time, none of the leading presidential and vice-presidential candidates is a white, Anglo-Saxon Protestant. Moreover, the U.S. Supreme Court has no WASPs. These are new phenomena in the United States.

The totally non-WASP tickets signify major political and social shifts in the networked age. As Robert Putnam showed a decade ago in Bowling Alone, organized groups such as churches, political clubs, fraternal clubs and Scouts have declined in importance. People have moved sharply away from traditional, tightly knit groups into more loosely knit networks that have fewer clan boundaries and more tolerance. The rise of the Internet and mobile connectivity has pushed the trend along by allowing people to expand the number and variety of their social ties…

In 1955, sociologist Will Herberg showed how white America was rigidly divided in Protestant, Catholic, Jew. Indeed, one of the authors of this article was barred from college fraternities because he was Jewish.

Now, when Chelsea Clinton marries, no one remarks on the kippa on her husband’s head. This year, a poll by the Pew Research Center found that 81 per cent of those who know Republican Mitt Romney is a Mormon are either comfortable with his affiliation or say it doesn’t matter to them.

I’m not sure I buy the Internet argument; WASPs lost their elite control because of the Internet? That process had started well before this. I wonder if the most basic explanation is that there are simply fewer WASPs overall in the population. Since the 1950s, there has been a sharp uptick in immigration, and more people have had access to college and graduate degrees.

Argument: George Lucas is the “greatest artist of our time”

Camille Paglia explains why she believes George Lucas is “the greatest artist of our time”:

Who is the greatest artist of our time? Normally, we would look to literature and the fine arts to make that judgment. But Pop Art’s happy marriage to commercial mass media marked the end of an era. The supreme artists of the half century following Jackson Pollock were not painters but innovators who had embraced technology—such as the film director Ingmar Bergman and the singer-songwriter Bob Dylan. During the decades bridging the 20th and 21st centuries, as the fine arts steadily shrank in visibility and importance, only one cultural figure had the pioneering boldness and world impact that we associate with the early masters of avant-garde modernism: George Lucas, an epic filmmaker who turned dazzling new technology into an expressive personal genre.

The digital revolution was the latest phase in the rapid transformation of modern communications, a process that began with the invention of the camera and typewriter and the debut of mass-market newspapers and would produce the telegraph, telephone, motion pictures, phonograph, radio, television, desktop computer, and Internet. Except for Futurists and Surrealists, the art world was initially hostile or indifferent to this massive surge in popular culture. Industrial design, however, rooted in De Stijl and the Bauhaus, embraced mechanization and grew in sophistication and influence until it has now eclipsed the fine arts.

No one has closed the gap between art and technology more successfully than George Lucas. In his epochal six-film Star Wars saga, he fused ancient hero legends from East and West with futuristic science fiction and created characters who have entered the dream lives of millions. He constructed a vast, original, self-referential mythology like that of James Macpherson’s pseudo-Celtic Ossian poems, which swept Europe in the late 18th century, or the Angria and Gondal story cycle spun by the Brontë children in their isolation in the Yorkshire moors. Lucas was a digital visionary who prophesied and helped shape a host of advances, such as computer-generated imagery; computerized film editing, sound mixing, and virtual set design; high-definition cinematography; fiber-optic transmission of dailies; digital movie duplication and distribution; theater and home-entertainment stereo surround sound; and refinements in video-game graphics, interactivity, and music.

Read the entire interesting argument.

Four quick thoughts:

1. This broadens the common definition of artist. It acknowledges the shift away from “high art,” the sort of music, painting, and cultural works typically found in museums and other respected venues, toward “popular art” like movies and popular music.

2. The argument doesn’t seem to be that Lucas is the best filmmaker or best storyteller. Rather, it rests on his ability to draw together different cultural strands in a powerful way. Paglia argues he brought together art and technology, combined stories from the past and present, and promoted new technologies that were influential far beyond his own films.

3. Another way to identify a “great artist” is to try to project an artist’s legacy. How will George Lucas be viewed in 50 or 100 years? Of course, this is hard to do. But part of creating this legacy starts now, as people review an artist’s career, though the assessment could change with future generations. I wonder: if technology is changing at a quicker pace, does this also mean the legacy of cultural creators will have a shorter cycle? For example, if movies as we know them today are relics in 50 years, will Lucas even matter?

4. How would George Lucas himself react to this? Who would he name as the “greatest artist” of today?

The first laptop met with distaste because it was associated with the gendered job of secretary

The “first recognizable laptop,” created in 1982, ran into some problems, such as its hefty price tag and its association with typing, and with who did the typing in many offices:

But Jeff Hawkins, founder of Palm and Handspring (makers of the Treo), was there in 1982 and he told a different story at the Computer History Museum a few years ago during a panel on the laptop. For him, the problems were not exclusively in the harder domains of currency and form factor. No, sociological and psychological reasons made the GRiD Compass hard to sell to businessmen…

This is an amazing fact. We had this product. It was designed for business executives. And the biggest obstacle, one of the biggest obstacles, we had for selling the product was the fact — believe it or not — that it had a keyboard. I was in sales and marketing. I saw this first-hand. At that time, 1982, business people, who were in their 40s and 50s, did not have any computer or keyboard in their offices. And it was associated with being part of the secretarial pool or the word processing (remember that industry?) department. And so you’d put this thing in their office and they’d say, “Get that out of here.” It was like getting a demotion. They really were uncomfortable with it…

The second reason they were uncomfortable with it is that none of them knew how to type. And it wasn’t like they said, “Oh, I’ll have to learn how to type.” They were very afraid — I saw this first-hand — they were very afraid of appearing inept. Like, “You give me this thing, and I’m gonna push the wrong keys. I’m gonna fail.”

In Hawkins’ telling, at least, there was no way around these obstacles. “We couldn’t solve this problem. It took a generational change, for the next younger group who had been exposed to terminals and computers to grow up,” he continued. “That was an amazing technology adoption problem you would have never thought about.”

This is a great example of underlying sociological issues that might not be considered fully when making and marketing a new product. On one hand, this was an exciting new technology; on the other hand, existing social factors made it difficult for businessmen to seize the opportunity the technology represented. Ideas about gender, and about who was supposed to be a typist (viewed as a lower-status position), influenced technology adoption.

Also, this story could lead into the history of secretaries and typists. Around the beginning of the 20th century, the occupation of secretary shifted from men to women. Like other gendered occupations dominated by women, secretary became a lower-status position with relatively lower pay.

A lot of web traffic comes through “dark social,” not through social network sites

Alexis Madrigal argues that while social network sites like Facebook get a lot of attention, a lot of web traffic is influenced by social processes that are much more difficult to see and measure:

Here’s a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web’s users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I’m not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the ‘Social Web.’…

There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications, and whenever someone is moving from a secure site (“https://mail.google.com/blahblahblah”) to a non-secure site (http://www.theatlantic.com).

This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as “direct” or “typed/bookmarked” traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that’s not actually what’s happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you…

Just look at that graph. On the one hand, you have all the social networks that you know. They’re about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that’s delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It’s more than 2.5x Facebook’s impact on the site…

If what I’m saying is true, then the tradeoff we make on social networks is not the one that we’re told we’re making. We’re not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people — a larger set than exists on any social network — already do that outside the social networks. Rather, we’re exchanging our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you’ve been told you made.

Two thoughts about this:

1. Here is how I might interpret this argument from a sociological point of view: Internet traffic is heavily dependent on social connections. Whether this happens on sites like Facebook, which are more publicly social, or through email, which is hidden from public view but still quite social, the interactions people have influence where they go on the web. In this sense, the Internet is an important social domain that may have some of its own norms and rules, as well as its own advantages and disadvantages, but it is built around human connections.

2. This sounds like a fantastic business and/or research opportunity: what is going on in this “dark social” realm? Could there be ways of getting at these activities that would help us better understand and analyze the importance of social connections and interactions? And could this information be monetized as well? (A rough sense of the measurement problem is sketched below.)
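To make the measurement problem concrete, here is a minimal sketch of how an analytics program ends up lumping this traffic together. Classification hinges on the HTTP referrer header; the domain lists and bucket names below are my own illustrative choices, not any particular vendor’s logic:

```python
from urllib.parse import urlparse

# Illustrative domain lists; real analytics tools maintain far larger ones.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "reddit.com"}
SEARCH_DOMAINS = {"google.com", "bing.com", "yahoo.com"}

def classify_visit(referrer):
    """Bucket a pageview by its referrer, roughly as analytics packages do."""
    if not referrer:
        # No referrer: email clients, IM, many mobile apps, or an
        # https -> http hop. Analytics tools report this as "direct,"
        # hiding the share that actually drove the visit: "dark social."
        return "direct / dark social"
    host = urlparse(referrer).netloc
    if host.startswith("www."):
        host = host[4:]
    if host in SOCIAL_DOMAINS:
        return "social network"
    if host in SEARCH_DOMAINS:
        return "search"
    return "other referral"

print(classify_visit(None))                         # direct / dark social
print(classify_visit("https://www.facebook.com/"))  # social network
```

The sketch also shows why the “dark” share is hard to pin down: everything without a referrer lands in one bucket, whether it came from a chat message, an email list, or an actual typed-in URL.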

Cell phone users now comprise half of Gallup’s polling contacts

Even as Americans grow less interested in participating in telephone surveys, polling firms are trying to keep up. Gallup has responded by making sure 50% of the respondents in each poll are contacted via cell phone:

Polling works only when it is truly representative of the population it seeks to understand. So, naturally, Gallup’s daily tracking political surveys include cellphone numbers, given how many Americans have given up on land lines altogether. But what’s kind of amazing is that it now makes sure that 50 percent of respondents in each poll are contacted via mobile numbers.

Gallup’s editor in chief, Frank Newport, wrote yesterday about the evolution of Gallup’s methods to remain “consistent with changes in the communication behavior and habits of those we are interviewing.” In the 1980s the company moved from door-to-door polling to phone calls. In 2008 it added cellphones. To reflect the growing number of Americans who have gone mobile-only, it has steadily increased the percentage of those numbers it contacts.

“If we were starting from scratch today,” Newport told Wired, “we would start with cellphones.”…

Although it may be a better reflection of society, mobile-phone polling is more expensive, says Newport. They have to call more numbers because the response rate is lower due to the nature of mobile communication.

As technology and social conventions change, researchers have to try to keep up. This is a difficult task, particularly if fewer people want to participate and technologies offer more and more options for screening out unknown requests. Where are we going next: polling by text? Utilizing well-used platforms like Facebook (where we know many people turn every day)?
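Mechanically, the quota Gallup describes is a dual-frame design: landline and cell numbers are dialed from separate frames until each fills its share of the completed sample. Here is a minimal sketch of that logic; the 50/50 split matches the article, but the response rates are assumptions for illustration, not Gallup’s actual figures:

```python
import random

random.seed(42)  # reproducible toy run

# Assumed response rates, not Gallup's; cell is set lower to reflect
# the report that mobile polling requires dialing more numbers.
RESPONSE_RATE = {"cell": 0.07, "landline": 0.09}

def dual_frame_sample(n_total, cell_share=0.5):
    """Fill per-frame quotas, counting how many dials each frame needs."""
    quotas = {"cell": int(n_total * cell_share)}
    quotas["landline"] = n_total - quotas["cell"]
    dials = {frame: 0 for frame in quotas}
    completes = {frame: 0 for frame in quotas}
    for frame, quota in quotas.items():
        while completes[frame] < quota:
            dials[frame] += 1
            if random.random() < RESPONSE_RATE[frame]:
                completes[frame] += 1
    return dials, completes

dials, completes = dual_frame_sample(1000)
print("Completed interviews:", completes)  # 500 per frame
print("Numbers dialed:      ", dials)      # cell frame needs more dials
```

The expense Newport mentions falls straight out of the arithmetic: at an assumed 7 percent response rate, 500 completed cell interviews take roughly 7,000 dials.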