Are we closer to the end of the era of the car than the beginning?

One academic argues we are getting closer to the end of the automobile era:

This prediction sounds bold primarily for the fact that most of us don’t think about technology – or the history of technology – in century-long increments: “We’re probably closer to the end of the automobility era than we are to its beginning,” says Maurie Cohen, an associate professor in the Department of Chemistry and Environmental Science at the New Jersey Institute of Technology. “If we’re 100 years into the automobile era, it seems pretty inconceivable that the car as we know it is going to be around for another 100 years.”

Cohen figures that we’re unlikely to maintain the deteriorating Interstate Highway System for the next century, or to perpetuate for generations to come the public policies and subsidies that have supported the car up until now. Sitting in the present, automobiles are so embedded in society that it’s hard to envision any future without them. But no technology – no matter how essential it seems in its own era – is ever permanent. Consider, just to borrow some examples from transportation history, the sailboat, the steamship, the canal system, the carriage, and the streetcar…

“The replacement of the car is probably out there,” Cohen adds. “We just don’t fully recognize it yet.”

In fact, he predicts, it will probably come from China, which would make for an ironic comeuppance by history. The car was largely developed in America to fit the American landscape, with our wide-open spaces and brand-new communities. And then the car was awkwardly grafted onto other places, like dense, old European cities and developing countries. If the car’s replacement comes out of China, it will be designed to fit the particular needs and conditions of China, and then it will spread from there. The result probably won’t work as well in the U.S., Cohen says, in the same way that the car never worked as well in Florence as it did in Detroit.

In our modern world, 100 years is a long time for a technology to hold on. While I imagine there is some technology that would be better than cars, it is harder to imagine the complete overhaul that would have to take place to replace the car. What happens to all of the roads and asphalt? What happens to the garage, which has become a more prominent feature of houses? What happens to cities that based their planning around the most efficient pathways for cars? What about the oil industry and automakers?

Cohen also notes that change could come from China. What if we end up in a world where certain countries adopt a replacement technology for cars because of its efficiency, their larger populations, and so on, while wealthier countries like the United States retain their use of the automobile?

Of course, Cohen is correct to note that it is hard to see the future from the present. This discussion may look very silly several decades from now…

A mid-twentieth century vision of “the future” versus welcome changes to everyday life for average Americans

Virginia Postrel compares the vision of “the future” decades ago versus the changes that have made the everyday lives of many Americans better:

Forget the big, obvious things like Internet search, GPS, smartphones or molecularly targeted cancer treatments. Compared with the real 21st century, old projections of The Future offered a paucity of fundamentally new technologies. They included no laparoscopic surgery or effective acne treatments or ADHD medications or Lasik or lithotripsy — to name just a few medical advances that don’t significantly affect life expectancy…

Nor was much business innovation evident in those 20th century visions. The glamorous future included no FedEx or Wal-Mart, no Starbucks or Nike or Craigslist — culturally transformative enterprises that use technology but derive their real value from organization and insight. Nobody used shipping containers or optimized supply chains. The manufacturing revolution that began at Toyota never happened. And forget about such complex but quotidian inventions as wickable fabrics or salad in a bag.

The point isn’t that people in the past failed to predict all these innovations. It’s that people in the present take them for granted.

Technologists who lament the “end of the future” are denigrating the decentralized, incremental advances that actually improve everyday life. And they’re promoting a truncated idea of past innovation: economic history with railroads but no department stores, radio but no ready-to-wear apparel, vaccines but no consumer packaged goods, jets but no plastics.

I wonder if another way to categorize this would be to say that many of the changes in recent decades have been more about quality of life than about significantly different ways of doing things or viewing the world (outside of the Internet). Quality of life is harder to measure, but if we take the long view, the average life of a middle-class American today contains clear improvements over decades before. Also, is this primarily a history or perspective issue? History tends to be told (and written) by people in charge, who often focus on the big figures and moments. It is harder to track, understand, and analyze what the “average” person experiences day to day.

I can imagine some might see Postrel’s argument and suggest we are deluded by some of these quality-of-life improvements and forget about what we have given up. While some of this might be mythologizing about a golden era that never quite was, it is common to hear such arguments about the Internet and Facebook: they bring new opportunities but fundamentally change how humans interact with each other and with machines (see Alone Together by Sherry Turkle). We now have Amazon and Walmart but have lost our relationships with small business owners and community shops. We may have Starbucks coffee, but it may not be good for us.

Modern skeuomorphs are touches of the past in a digital age

Clive Thompson discusses skeuomorphs, “a derivative object that retains ornamental design cues to a structure that was necessary in the original” (Wikipedia definition), in a digital world:

Now ask yourself: Why does Google Calendar—and nearly every other digital calendar—work that way? It’s a strange waste of space, forcing you to look at three weeks of the past. Those weeks are mostly irrelevant now. A digital calendar could be much more clever: It could reformat on the fly, putting the current week at the top of the screen, so you always see the next three weeks at a glance…

Because they’re governed by skeuomorphs—bits of design that are based on old-fashioned, physical objects. As Google Calendar shows, skeuomorphs are hobbling innovation by lashing designers to metaphors of the past. Unless we start weaning ourselves off them, we’ll fail to produce digital tools that harness what computers do best.

Now, skeuomorphs aren’t always bad. They exist partly to orient us to new technologies. (As literary critic N. Katherine Hayles nicely puts it, they’re “threshold devices, smoothing the transition between one conceptual constellation and another.”) The Kindle is easy to use precisely because it behaves so much like a traditional print book.

But just as often, skeuomorphs kick around long past the point of reason. Early automobiles often included a buggy-whip holder on the dashboard—a useless fillip that designers couldn’t bear to part with.

I’ve noticed the same thing on my Microsoft Outlook calendar: the default is to show the full month of February even now, when I don’t really care to look back at the earlier weeks of February and would much rather see what is coming up in March. I can alter this somewhat in the options by displaying two months at a time, but the view still shows all of the earlier part of February.
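To make Thompson’s alternative concrete, here is a minimal sketch in Python (standard library only) of a calendar view anchored to the current week rather than to the first of the month. The function name and the four-week window are my own illustrative choices, not anything from Google’s or Microsoft’s actual products.

```python
from datetime import date, timedelta

def rolling_weeks(today: date, num_weeks: int = 4):
    """Return dates for a calendar view anchored to the current week.

    Instead of the skeuomorphic month grid (which wastes rows on weeks
    that have already passed), start at the Monday of the current week
    and show the next `num_weeks` weeks.
    """
    monday = today - timedelta(days=today.weekday())  # back up to Monday
    return [
        [monday + timedelta(days=week * 7 + day) for day in range(7)]
        for week in range(num_weeks)
    ]

if __name__ == "__main__":
    for week in rolling_weeks(date.today()):
        print("  ".join(d.strftime("%b %d") for d in week))
```

Nothing here is difficult; the point is that the month-grid view is a design choice inherited from paper, not a technical constraint.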

It would be interesting to hear Thompson discuss the half-life of skeuomorphs. If they are indeed useful for helping users make the transition from an old technology to a new one, how long should the old feature stick around? Is this made more complicated when the product has a broader audience? For example, iPhone users could be anyone from a 14-year-old to an 80-year-old. Presumably, the 14-year-old wants changes to come more quickly and tends to acquire newer devices earlier, but the device still has to work for the 80-year-old who is just getting a first smartphone, partly because smartphones only recently became so cheap. How do companies make this decision? Could a critical mass of users “force” or prompt a change?

This is also a good reminder that new technologies sometimes get penalized for being too futuristic or too different. If skeuomorphs are used, users will make the necessary steps over time toward new behaviors and ways of seeing the world. Perhaps Facebook falls into this category. Having “friends” all in one category is often clunky, but if users had simply been required to open their information to anyone, who would want to participate? However, by gradually changing the structure (remember, we once had networks, a comforting feature because you could easily place people within an existing community), Facebook users can be moved toward a more open environment.

In general, social change takes time, even if the schedule in recent decades has become more compressed.

Argument: Chomsky wrong to suggest Twitter is “superficial, shallow, evanescent”

Nathan Jurgenson argues that Noam Chomsky’s thoughts about Twitter are misguided:

Noam Chomsky has been one of the most important critics of the way big media crowd out “everyday” voices in order to control knowledge and “manufacture consent.” So it is surprising that the MIT linguist dismisses much of our new digital communications produced from the bottom-up as “superficial, shallow, evanescent.” We have heard this critique of texting and tweeting from many others, such as Andrew Keen and Nicholas Carr. And these claims are important because they put Twitter and texting in a hierarchy of thought. Among other things, Chomsky and Co. are making assertions that one way of communicating, thinking and knowing is better than another…

Claiming that certain styles of communicating and knowing are not serious and not worthy of extended attention is nothing new. It’s akin to those claims that graffiti isn’t art and rap isn’t music. The study of knowledge (aka epistemology) is filled with revealing works by people like Michel Foucault, Jean-François Lyotard or Patricia Hill Collins who show how ways of knowing get disqualified or subjugated as less true, deep or important…

In fact, in the debate about whether rapid and social media really are inherently less deep than other media, there are compelling arguments for and against. Yes, any individual tweet might be superficial, but a stream of tweets from a political confrontation like Tahrir Square, a war zone like Gaza or a list of carefully-selected thinkers makes for a collection of expression that is anything but shallow. Social media is like radio: It all depends on how you tune it…

Chomsky, a politically progressive linguist, should know better than to dismiss new forms of language-production that he does not understand as “shallow.” This argument, whether voiced by him or others, risks reducing those who primarily communicate in this way as an “other,” one who is less fully human and capable. This was Foucault’s point: Any claim to knowledge is always a claim to power. We might ask Chomsky today, when digital communications are disqualified as less deep, who benefits?

Back to a classic question: is it the medium or the message? Is there something inherent about 140-character statements and how they must be put together that makes them different from other forms of human communication? I like that Jurgenson notes historical precedent: these arguments have also accompanied the introduction of radio, television, and the Internet.

But could we tweak Chomsky’s thoughts to make them more palatable? If Chomsky had said that the average Twitter experience was superficial, would he be incorrect? Perhaps the right comparison is necessary: is Twitter more superficial than face-to-face contact? But is it more superficial than no contact, given that face-to-face time is limited? Jurgenson emphasizes the big picture of Twitter: its ability to bring people together and give people the opportunity to follow others and “tune in.” In particular, Twitter and other social media forms allow the average person in the world to potentially have a voice in a way that was never possible before. But how much does the average user benefit: are they tuned in to major social movements or to celebrity feeds? To what their friends are saying right now or to progress updates from non-profit organizations? Is this a beneficial public space for the average user?

Additionally, would it matter here if Twitter had advertisements and made a big push to make money versus providing a more democratic space? Is Twitter more democratic and deep than Facebook? How would one decide?

In the end, is this simply a generational split?

(See earlier posts on a similar topic: Malcolm Gladwell on the power of Twitter, how Twitter contributed (or didn’t) to movements in the Middle East, and whether using Twitter in the classroom improves student learning outcomes.)

General Motors’ “Parade of Progress” bus tour

General Motors has had difficulty in recent years, but at one point GM was important and big enough to cast a vision for America’s future. In addition to the “Futurama” exhibit, which featured an impressive highway system, GM also had a bus tour that gave Americans a glimpse of the future:

General Motors’ research Vice President Charles Kettering (Boss Ket) decided to take GM’s show on the road. Between 1936 and 1956, the company’s “Parade of Progress” toured the U.S., Canada, Mexico and Cuba, visiting hundreds of towns and showing millions how working examples of modern technology would transform their everyday lives.

Eight 30-foot, streamlined buses led the parade, six with walk-through exhibits, one with a stage and one carrying equipment, while nine tractor-trailers carried the remaining gear, and new models of GM cars followed. The red-and-white buses would pull into a small town, circle the wagons at the football field, and the buses would open like clams while electric floodlights rose on poles. A crew accompanied the parade and erected a tent that could accommodate up to 1,500 people for a free technology show.

The show was such a success that GM built 12 Futurliner buses in 1940, after the New York World’s Fair. The parade continued to tour until Pearl Harbor, after which it was disbanded and the buses stored in Ohio. They wouldn’t see the light of day for 12 years, until the “Parade of Progress” was revived in 1953, with 12 buses. But the world had changed. TV had stolen the parade’s thunder, and even though the show included new exhibits — Highways of Tomorrow, How a Jet Engine Works, Wonders of Stereo, Kitchen of Tomorrow and What is the Atom? — it was over by 1956.

It really does seem like a bygone era: a bus tour would pull into a community and residents would come out to see the technology of the future. It is interesting that the article notes that television was part of the demise of these bus tours. With the information television provided, plus the information available to anyone today through the Internet, who needs to check out a bus tour? At the same time, these newer experiences are quite different in that they are solitary and more passive. Additionally, I imagine quite a crowd or energy could build at these exhibitions, a Durkheimian experience of “collective effervescence.” What would be the equivalent today: people showing up at the Apple store to see the latest technological wizardry? But that sort of experience is about one or a few digital devices and less about an exciting vision of the future. Is there any place these days that offers a comprehensive and positive view of the future?

I also wonder how much these GM exhibits helped push the narrative of scientific and technological progress that seemed to develop in the post-World War II United States.

The limits of GPS in the West

Technology can be a good thing, but it can also lead people astray. Hence, a warning out West regarding the use of GPS in certain areas:

Travelers in the western U.S. should not rely solely on technology such as GPS for navigation, authorities said, after a Canadian couple were lost in the Nevada wilderness for 48 days.

Albert Chretien, 59, and his wife Rita Chretien, 56, sought a shorter route between Boise, Idaho and Jackpot, Nevada during a road trip from British Columbia to Las Vegas…

Sheriff’s offices in remote, high-elevation parts of Idaho, Nevada and Wyoming report the past two years have brought a rise in the number of GPS-guided travelers driving off marked and paved highways and into trouble.

The spike has prompted Death Valley National Park in California to caution on its web site that “GPS navigation to remote locations like Death Valley are notoriously unreliable.”

When two roads diverge in Western lands, take the one more traveled, authorities said.

Perhaps this could be read as a warning about over-reliance on technology: it is not infallible. You can occasionally find stories of people driving into retention ponds or crashing into things because the GPS told them to turn. At the same time, how bad are these GPS maps that people can get lost so easily? This would seem to be bad news for GPS makers if they don’t cover certain areas very well. Could a GPS maker ever have any liability for these unpleasant occurrences? Additionally, I wonder how many GPS owners also carry a map of some kind in their vehicle or on their person.

More broadly, this is a reminder that one doesn’t have to travel very far to leave the comforts of the modern world and get lost in nature.

Two economists explain why college has come to cost so much

Two economists first summarize some of the arguments for why a college education has become so expensive and then provide their own overview based on “the technological forces that have reshaped the entire American economy”: costs that rise relative to the price of goods because of the time necessary to build relationships between faculty and students; a highly educated workforce; and the necessity for schools to purchase expensive technology to keep up with particular fields of research.
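That first force is what economists call cost disease, and some stylized arithmetic (my own toy numbers, not figures from the economists) shows how it works: if wages rise with economy-wide productivity but teaching still takes the same faculty hours per student, instruction gets more expensive relative to goods even though nothing about the classroom changes.

```python
# Stylized cost-disease arithmetic; the growth rates are illustrative
# assumptions, not estimates from the economists' work.
years = 40
productivity_growth = 0.02  # assumed annual productivity growth in goods
wage_growth = 0.02          # wages assumed to track that productivity

output_per_hour = (1 + productivity_growth) ** years  # goods per hour, ~2.2x
wage = (1 + wage_growth) ** years                     # hourly wage, ~2.2x

# Goods need proportionally fewer hours, so their price (wage per unit
# of output) stays flat.
goods_price = wage / output_per_hour                  # = 1.0

# Teaching still takes the same faculty hours per student, so its cost
# rises one-for-one with wages.
instruction_cost = wage                               # ~2.2x

print(f"Relative price of goods after {years} years: {goods_price:.2f}x")
print(f"Relative cost of instruction after {years} years: {instruction_cost:.2f}x")
```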

Taking this sort of view suggests that it won’t be easy to reduce the costs of education, since the issues present in colleges and universities are issues the entire economy faces.

h/t Instapundit