Reminder from Manti Te’o saga: few people are “Catfished” online

The unfolding of the Manti Te’o girlfriend hoax story has been both strange and fascinating. Here is one thing we should take away from Te’o’s experience: few people online are in danger of experiencing something similar, of being “Catfished.” In a statement issued by Te’o on Wednesday, January 16, here is his second-to-last sentence:

If anything good comes of this, I hope it is that others will be far more guarded when they engage with people online than I was.

People should use common sense online. But we know that many users of the Internet and of common social networking sites like Facebook are not there to meet strangers and begin relationships. Rather, most users are interested in connecting with people they already know or people within a common circle, say, an incoming college freshman class or a larger organization. To be “Catfished,” an Internet user would have to seek out relationships with unknown or random people. Since many people are not seeking this out or responding to the occasional odd request, this is not a huge problem for the general population of Internet users. While the movie Catfish presents such a scenario and MTV has a show with the same name and theme, this does not mean it is a common occurrence.

Argument: “the Internet probably hasn’t made people less religious”

Has the Internet led to decreased religiosity? One lab researcher and research assistant doesn’t think so:

Given these data, I think it’s really unlikely that the Internet has played any substantive role in bringing Americans out of religion. Everyone has a self-serving bias, and atheists aren’t immune. Atheist writers seem really optimistic — they say we have the truth on our side, information is widely accessible, and we’re growing in numbers. But it seems like these first two things don’t really matter that much, and our growth seems to be more in organization and political influence, rather than genuine conversion.

To me, this supports a focus on values rather than beliefs, and about this I’m optimistic — if America is becoming more socially liberal but remains God-fearing, then that’s fine with me. So long as we have a cultural momentum geared toward gay rights, secular government, and social justice, the politically liberal religiously unaffiliated can help to push this progress forward. And there the Internet might help, no matter what anyone believes about God.

This sounds like an interesting research question that would be the flip-side of a recent paper I co-authored where we looked at how religiosity affects Facebook use. I don’t know how this new question would turn out but it does get at a question we raise at the end of our paper: is the Internet more of a secular or sacred sphere? Are there more people promoting belief or unbelief, how many websites are devoted to each topic, how many visitors do such websites receive, and do certain groups have more appealing approaches and sites? And it may not even matter what exactly is being promoted on the Internet; perhaps it is a function of time spent online versus doing other things.

A mid-twentieth century vision of “the future” versus welcome changes to everyday life for average Americans

Virginia Postrel compares the vision of “the future” decades ago versus the changes that have made the everyday lives of many Americans better:

Forget the big, obvious things like Internet search, GPS, smartphones or molecularly targeted cancer treatments. Compared with the real 21st century, old projections of The Future offered a paucity of fundamentally new technologies. They included no laparoscopic surgery or effective acne treatments or ADHD medications or Lasik or lithotripsy — to name just a few medical advances that don’t significantly affect life expectancy…

Nor was much business innovation evident in those 20th century visions. The glamorous future included no FedEx or Wal-Mart, no Starbucks or Nike or Craigslist — culturally transformative enterprises that use technology but derive their real value from organization and insight. Nobody used shipping containers or optimized supply chains. The manufacturing revolution that began at Toyota never happened. And forget about such complex but quotidian inventions as wickable fabrics or salad in a bag.

The point isn’t that people in the past failed to predict all these innovations. It’s that people in the present take them for granted.

Technologists who lament the “end of the future” are denigrating the decentralized, incremental advances that actually improve everyday life. And they’re promoting a truncated idea of past innovation: economic history with railroads but no department stores, radio but no ready-to-wear apparel, vaccines but no consumer packaged goods, jets but no plastics.

I wonder if another way to categorize this would be to say that many of the changes in recent decades have been more about quality of life than about significantly different ways of doing things or viewing the world (outside of the Internet). Quality of life is harder to measure, but if we take the long view, the average life of a middle-class American today contains improvements over decades before. Also, is this primarily a history or perspective issue? History tends to be told (and written) by people in charge who often focus on the big people and moments. It is harder to track, understand, and analyze what the “average” person experiences day to day.

I can imagine some might see Postrel’s argument and suggest we are deluded by some of these quality of life improvements and we forget about what we have given up. While some of this might be mythologizing about a golden era that never quite was, it is common to hear such arguments about the Internet and Facebook: it brings new opportunities but fundamentally changes how humans interact with each other and machines (see Alone Together by Sherry Turkle). We now have Amazon and Walmart but have lost any relationships with small business owners and community shops. We may have Starbucks coffee but it may not be good for us.

The rise of misattributed quotes on the Internet, social media

An editor at RealClearPolitics examines an erroneous online list of Mark Twain quotes and takes a broad view of quotes in the age of the Internet and social media:

The point of this example is that lists of quotes without specific and verifiable citations — where and when it appeared — are useless, and invariably rife with errors. Websites with names like “Brainyquote” and “Thinkexist.com” are essentially Internet compost piles.

In the pre-Internet days, “Bartlett’s Familiar Quotations” and “The Oxford Dictionary of Quotations” were the gold standards, although sometimes misattributed quotes found their way into those volumes. Much of this material is now online, but the best source of accurate quotes today is the “Yale Book of Quotations,” edited by the rigorous and charming Fred R. Shapiro.

Many of the most frequently misquoted historical figures have websites devoted to keeping the record straight for their heroes. These range from one established by a conscientious amateur Twain aficionada named Barbara Schmidt to WinstonChurchill.org, which is run by the Churchill Centre and Museum in London. The latter site even has a section called “Quotes Falsely Attributed.”

In his anthology, Shapiro goes the extra mile in tracking down the origin of erroneous quotes. Thus, he is no stranger to the misuse of quotations or even obvious forgeries. But even he was astonished at the casual speciousness of the Huffington Post inventory.

This has been a widespread issue in recent years – remember the fake MLK viral quote after the death of Osama bin Laden? While Wikipedia might have relatively good information that is regularly edited, quotations are simply floating around the Internet and social media.

I think this is tied to two other phenomena related to the Internet and social media:

1. The desire people have to find a quote that represents them. In an era of profiles and status updates, people are defined more and more by short, snappy bursts. There is simply not space to write more, and who wants to read a long piece about someone’s existence (except on blogs)? Finding the right sentence or two that sums up one’s existence or current state is a difficult task that can be aided with quotes attributed to famous figures. If you don’t want to use quotes, you can always use pictures – witness the rise of Instagram.

2. Many of these quotes are inspirational or witty. If you look at the inspirational quotes on Facebook profiles or Twitter feeds, many suggest people are continually facing and then overcoming challenges and obstacles. The overcoming-type quotes are empowering as individuals can quickly equate their challenges to some of the greatest in history. The witty quotes do something else; they suggest the user is facing life with verve and can find and wield profound words. Witty quotes can then become another status game as users try to one-up each other with piercing and whimsical takes on the world.

Perhaps this is how the average person gets to participate on a daily basis in a sound bite culture.

Claim: 90% of information ever created by humans was created in the last two years

An article on big data makes a claim about how much information humans have created in the last two years:

In the last two years, humans have created 90% of all information ever created by our species. If our data output used to be a sprinkler, it is now a firehose that’s only getting stronger, and it is revealing information about our relationships, health, and undiscovered trends in society that are just beginning to be understood.

This is quite a bit of data. But a few points in response:

1. I assume this refers only to recorded data. While there are more people on earth than before, humans are expressive creatures and have been for a long time.

2. This article could be interpreted by some to mean that we need to pay more attention to online privacy, but I would guess much of this information is volunteered. Think of Facebook: users voluntarily submit information that their friends and Facebook can access. Or blogs: people voluntarily put together content.

3. This claim also suggests we need better ways to sort through and make sense of all this data. How can the average Internet user put all this data together in a meaningful way? We are simply awash in information, and I wonder how many people, particularly younger people, know how to make sense of all that is out there.

4. Of course, having all of this information out there doesn’t necessarily mean it is meaningful or worthwhile.
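As a hypothetical sanity check (my own back-of-the-envelope math, not from the article), the claim implies an extraordinary growth rate. If the amount of data produced each year grows by a constant factor g, then after many years the share of all data produced in the most recent two years approaches 1 − 1/g². Setting that equal to 0.9 implies g = √10 ≈ 3.16, i.e., annual data output more than tripling every year:

```python
import math

# Implied annual growth factor if the last two years account for 90%
# of all data ever created: 1 - 1/g**2 = 0.9  =>  g = sqrt(10).
g = math.sqrt(1 / (1 - 0.9))
print(round(g, 2))  # ~3.16

# Sanity check with an explicit 20-year simulation: yearly output
# grows by factor g, then we compute the share from the last 2 years.
output = [g ** year for year in range(20)]
share_last_two = sum(output[-2:]) / sum(output)
print(round(share_last_two, 2))  # ~0.9
```

The 20-year horizon is arbitrary; the share converges to 90% for any sufficiently long run, which is what makes the claim plausible for recorded digital data but clearly false for human expression generally.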

An argument for Amazon’s one-star reviews reveals the role of cultural critics

A professional critic praises Amazon’s one-star reviews:

About a year ago, while shopping online for holiday gifts, I became an unabashed connoisseur of the one-star amateur Amazon review. Here I found the barbed, unvarnished, angry and uncomfortably personal hatchet job very much alive. Indeed, I became so enamored of Amazon’s user-generated reviews of books, films and music that my interest expanded to the one-star notices on Goodreads, Yelp and Netflix, where, for instance, a “Moneyball” review notes the movie “did not make you feel warm and fuzzy at the end as a good sports film should.” How true! A rare opinion on a critical darling!…

But there is a visceral thrill to reading amateur reviewers on Amazon who, unlike professional critics, do not claim to be informed or even knowledgeable, who do not consider context or history or ambition, who do not claim any pretense at all. Their reviews, particularly of classics, often read as though these works had dropped out of space into their laps, and they were first to experience it. About “Moby-Dick,” one critic writes: “Essentially, they rip off the plot to ‘Jaws.'” About “Ulysses,” another critic writes: “I honestly cannot figure out the point, other than cleverness for cleverness’ sake.”

Likewise, to seriously dismiss “The Great Gatsby” as “‘Twilight’ without the vampires,” as an Amazon reviewer did, may be glib and reductive, but it’s also brilliantly spot on, the kind of comparison a more mannered critic might not dare. “Whoever made that ‘Twilight’ comparison, whether they know it, is showing their education, that they can connect new media with old works and draw fresh conclusions,” said David Raskin, chair of the art history, theory and criticism department at the School of the Art Institute of Chicago…

Speaking of honesty: It should be pointed out here that, in general, online amateur reviews are not mean but usually as forgiving as the professional sort. Bing Liu, a data-mining expert at the University of Illinois at Chicago who has studied online reviews — “partly because I was curious if they were real or just someone gaming the system” — told me that 60 percent of Amazon reviews are five-star reviews and another 20 percent are four-star. The information research firm Gartner released a study in September predicting that, within a couple of years, between 10 and 15 percent of online reviews will be paid for by companies — rigged.

It sounds like the argument is this: you can find the average American in the one-star Amazon reviews. Instead of the filtered, sophisticated review typically found in media sources, these reviewers give the unvarnished pop culture take. Underlying this argument are ideas about social class and education. An approved reviewer or critic, the typical gatekeeper, is able to put a work in its context. The educated critic tries to make the work understandable for others and often has experience and education backing their opinions. In contrast, the Internet opens up spaces for individuals to post their own reactions and, through aggregation, such as the Amazon five-star review system, have some say in how products and cultural works are perceived.

This new reality doesn’t render cultural gatekeepers completely irrelevant, but it does do several things. First, it dilutes their influence, or at least makes it possible for more critics to get involved. Second, it makes the opinions of average citizens more visible. Instead of just theorizing about mass culture or pop culture, we can all see what the masses are thinking at the moment they are thinking it. (Think of the possibilities on Twitter!) Third, it provides space, as in this article, for reviewers to admit they don’t always want to write erudite pieces but want to have a “normal person reaction.”

Just one problem with this piece: the critic says he doesn’t really read the one-star Amazon reviews for information. Instead, he appreciates the “visceral thrill.” He quotes an academic who says such reviews reveal cultural gaps. Thus, celebrating the one-star reviews may be just another way to assert the traditional reviewer’s cultural capital. Read the one-star reviews for entertainment but continue to go back to the educated reviewer for the context and more valued perspective.

Real estate firm survey: younger Americans still want to own a home

Even though a number of commentators have suggested younger Americans are not as interested in homeownership, a recent survey conducted by the Better Homes & Gardens real estate brand suggests this may not be the case:

Nearly all of them said they were willing to adjust their lifestyles to save for a home. Sixty-two percent said they’d eat out less. Forty percent said they’d work a second job. And 23 percent said they’d move back home with their parents to save money — they’re being strategic about saving money to own a home.

They also said that all of the media coverage of the housing crisis has taught them the importance of doing their research and planning, and they think they’re more knowledgeable about the process than their parents were at their age. But they want to be ready to own — 69 percent said that someone is ready to buy if they can maintain their lifestyle (while owning), and 61 percent agreed that the “readiness indicator” is if they have a secure job.

And even if these younger adults do want to own a home, the real estate industry has to be ready to appeal to this group:

Well, as an industry and certainly as a brand, we’d have to step up our campaign to show young buyers the importance of real estate as a long-term investment and lifestyle.

On a related note, something else also drove us to do this survey: the big disconnect in the average age of a first-time buyer (36), versus the average age of a real estate agent (56). This younger generation of buyers’ habits are different — they’re comfortable using technology, especially mobile devices, to buy and track everything, and agents need to learn this.

Several things are interesting here. First, it appears a good number of younger Americans do want a home, but they are also more aware of what it will take to make it happen. If homeownership is such a big investment, younger Americans want to do their homework to know what they are getting into. This could mean that fewer people in this group will buy a home until they find a more “perfect” situation, which might decrease the homeownership rate, but it could also mean that those who do buy a home are more committed.

Second, it is suggested that the real estate industry needs to stay relevant in the era of the Internet. Traditionally, real estate agents have been necessary people in the middle with expertise that the average homeowner would not have. But potential homebuyers now have much more information at their fingertips, and if more people are selling their own homes, the real estate industry needs to continually show what extra value it offers. Also, this article hints at the aging of real estate agents: is this a desirable job for young people to pursue? If you look at a table of occupational prestige in the United States, real estate agent is at the bottom.

I wonder if the story for younger Americans and homeownership will be a bifurcated one based on socioeconomic status. Those with higher education and good jobs will continue to buy homes. Those who don’t have college degrees and/or struggle to find a good job may not have the option to do so.

A UN report discusses how Facebook can be used for terrorism

The United Nations Office on Drugs and Crime released a report this week on how terrorists are using new platforms like Facebook:

Terrorists are increasingly turning to social media such as Facebook, Twitter and YouTube to spread propaganda, recruit sympathizers and plot potential attacks, a United Nations’ report released Monday says.

The UN Office on Drugs and Crime said Internet-based social platforms are fertile, low-cost grounds for promotion of extremist rhetoric encouraging violent acts, with terrorists able to virtually cross borders and hide behind fake identities…

The University of Waterloo sociologist said networks like Facebook are effective tools to screen potential recruits, who could then be directed to encrypted militant Islamic websites affiliated with al-Qaida, for example.

Check out what the full report says about Facebook. Here is the first mention of Facebook (p.4):

The promotion of extremist rhetoric encouraging violent acts is also a common trend across the growing range of Internet-based platforms that host user-generated content. Content that might formerly have been distributed to a relatively limited audience, in person or via physical media such as compact discs (CDs) and digital video discs (DVDs), has increasingly migrated to the Internet. Such content may be distributed using a broad range of tools, such as dedicated websites, targeted virtual chat rooms and forums, online magazines, social networking platforms such as Twitter and Facebook, and popular video and file-sharing websites, such as YouTube and Rapidshare, respectively. The use of indexing services such as Internet search engines also makes it easier to identify and retrieve terrorism-related content.

The second mention (p.11):

Particularly in the age of popular social networking media, such as Facebook, Twitter, YouTube, Flickr and blogging platforms, individuals also publish, voluntarily or inadvertently, an unprecedented amount of sensitive information on the Internet. While the intent of those distributing the information may be to provide news or other updates to their audience for informational or social purposes, some of this information may be misappropriated and used for the benefit of criminal activity.

And that’s about it when it comes to specifics about Facebook in the report. One case involving Facebook was cited specifically, but the bulk of the terrorist activity appeared to happen on other websites. On one hand, officials say they will continue to monitor Facebook. On the other hand, Facebook is one popular website, among others, where Internet users can interact.

I imagine Facebook as a company is also interested in this, and it’s too bad they didn’t respond, at least not to Bloomberg Businessweek:

Spokespeople at Facebook, Google and Twitter didn’t immediately return phone calls and e-mails seeking comment.

Lack of a WASP candidate in the election due to the Internet?

Several commentators have picked up on this feature of the 2012 presidential election: neither candidate is a WASP.

Right now, we’re looking at an absence that would have been a startling presence 50 years ago. With all the focus on economic issues in the U.S. presidential race, there’s hardly any talk about the fact that, for the first time, none of the leading presidential and vice-presidential candidates is a white, Anglo-Saxon Protestant. Moreover, the U.S. Supreme Court has no WASPs. These are new phenomena in the United States.

The totally non-WASP tickets signify major political and social shifts in the networked age. As Robert Putnam showed a decade ago in Bowling Alone, organized groups such as churches, political clubs, fraternal clubs and Scouts have declined in importance. People have moved sharply away from traditional, tightly knit groups into more loosely knit networks that have fewer clan boundaries and more tolerance. The rise of the Internet and mobile connectivity has pushed the trend along by allowing people to expand the number and variety of their social ties…

In 1955, sociologist Will Herberg showed how white America was rigidly divided in Protestant, Catholic, Jew. Indeed, one of the authors of this article was barred from college fraternities because he was Jewish.

Now, when Chelsea Clinton marries, no one remarks on the kippa on her husband’s head. This year, a poll by the Pew Research Center found that 81 per cent of those who know Republican Mitt Romney is a Mormon are either comfortable with his affiliation or say it doesn’t matter to them.

I’m not sure I buy the Internet argument; did WASPs lose their elite control because of the Internet? I think the process had started well before this. I wonder if the most basic explanation is that there are simply fewer WASPs overall in the population. Since the 1950s, there has been a sharp uptick in immigration, and more people have had access to education and college and graduate degrees.

A lot of web traffic comes through the “dark social,” not through social network sites

Alexis Madrigal argues that while social network sites like Facebook get a lot of attention, a lot of web traffic is influenced by social processes that are much more difficult to see and measure:

Here’s a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web’s users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I’m not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the ‘Social Web.’…

There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as “direct” or “typed/bookmarked” traffic, which implies to many site owners that you actually have a bookmark or typed in www.theatlantic.com into your browser. But that’s not actually what’s happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you…
Just look at that graph. On the one hand, you have all the social networks that you know. They’re about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that’s delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It’s more than 2.5x Facebook’s impact on the site…
If what I’m saying is true, then the tradeoffs we make on social networks is not the one that we’re told we’re making. We’re not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people — a larger set than exists on any social network — already do that outside the social networks. Rather, we’re exchanging our personal data in exchange for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you’ve been told you made.

Two thoughts about this:

1. Here is how I might interpret this argument from a sociological point of view: Internet traffic is heavily dependent on social connections. Whether this is done on sites like Facebook, which are more publicly social, or through email, which is restricted from public view but is still quite social, the interactions people have influence where they go on the web. In this sense, the Internet is an important social domain that may have some of its own norms, rules, advantages, and disadvantages, but it is built around human connections.

2. This sounds like a fantastic business and/or research opportunity: what is going on in this “dark social” realm? Could there be ways of getting at these activities that would help us better understand and analyze the importance of social connections and interactions, and could this information be monetized as well?
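To see why dark social is so hard to measure, here is a minimal sketch (my own illustration, not The Atlantic’s actual analytics code) of how an analytics program buckets visits by the HTTP Referer header. The domain lists are purely illustrative; the key point is the empty-referrer case, where email, IM, mobile apps, and https-to-http transitions all collapse into one indistinguishable “dark” bucket:

```python
from urllib.parse import urlparse

# Illustrative (not exhaustive) referrer domain lists.
SOCIAL = {"facebook.com", "twitter.com", "t.co", "plus.google.com"}
SEARCH = {"google.com", "bing.com", "yahoo.com"}

def classify_visit(referrer):
    """Bucket a visit by its HTTP Referer header.

    An empty referrer is 'dark': the visitor may have clicked a link
    in an email, an instant message, or a mobile app, or crossed an
    https->http boundary, but the analytics software cannot tell
    these apart from typed-in or bookmarked traffic.
    """
    if not referrer:
        return "dark"
    host = urlparse(referrer).netloc.lower().removeprefix("www.")
    if host in SOCIAL:
        return "social"
    if host in SEARCH:
        return "search"
    return "other"

print(classify_visit("https://www.facebook.com/"))           # social
print(classify_visit(""))                                    # dark
print(classify_visit("https://www.google.com/search?q=x"))   # search
```

Madrigal’s argument, in these terms, is that the “dark” bucket is larger than all the named social buckets combined, even though dashboards typically label it as “direct” traffic.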