First American hospital inpatient facility for Internet addiction to open

Internet addiction is a growing topic of discussion and the first hospital inpatient facility to address it is set to open soon in Pennsylvania:

The voluntary, 10-day program is set to open on Sept. 9 at the Behavioral Health Services at Bradford Regional Medical Center. The program was organized by experts in the field and cognitive specialists with backgrounds in treating more familiar addictions like drug and alcohol abuse.

“[Internet addiction] is a problem in this country that can be more pervasive than alcoholism,” said Dr. Kimberly Young, the psychologist who founded the non-profit program. “The Internet is free, legal and fat free.”…

Young and other experts are quick to caution that mere dependence on modern technology does not make someone an Internet addict. The 20-year-old who divides his time between his girlfriend and “World of Warcraft” likely does not require intensive treatment. The program is designed for those whose lives are spiraling out of control because of their obsession with the Internet. These individuals have been stripped of their ability to function in daily life and have tried in the past to stop but cannot…

Last May, the American Psychiatric Association released the fifth edition of its Diagnostic and Statistical Manual of Mental Disorders, or DSM-5, which for the first time listed “Internet Gaming Disorder” in Section III of the manual, meaning it requires further research before being formally recognized as a disorder.

This bears watching. Internet addiction will likely be a real problem for a small subset of the population, yet critics of the Internet could use it to condemn all Internet use. How exactly this is constructed as a social problem (or not) will strongly influence how it is perceived in the United States.

It would be interesting to know why exactly the first hospital facility is being set up in rural Pennsylvania. Why not elsewhere?

Homelessness went down in the last decade but not much coverage of this policy success

Here is a story you may not have heard: homelessness in the United States has gone down in the last decade.

The National Alliance to End Homelessness, a leader in homelessness service and research, estimates a 17% decrease in total homelessness from 2005 to 2012. As a refresher: this covers a period when unemployment doubled (2007-2010) and foreclosure proceedings quadrupled (2005-2009)…And what about the presidents responsible for this feat? General anti-poverty measures – for example, expanding the Earned Income Tax Credit — have helped to raise post-tax income for the poorest families. But our last two presidents have made targeted efforts, as well. President George W. Bush’s “housing first” program helped reduce chronic homelessness by around 30% from 2005 to 2007. The “housing first” approach put emphasis on permanent housing for individuals before treatment for disability and addiction.

The Great Recession threatened to undo this progress, but the stimulus package of 2009 created a new $1.5 billion program, the Homeless Prevention and Rapid Re-Housing Program. This furthered what the National Alliance called “ground-breaking work at the federal level…to improve the homelessness system by adopting evidence-based, cost effective interventions.” The program is thought to have aided 700,000 at-risk or homeless people in its first year alone, “preventing a significant increase in homelessness.”

Since then, the Obama administration also quietly announced in 2010 a 10-year federal plan to end homelessness. This is all to say that the control of homelessness, in spite of countervailing forces, can be traced directly to Washington—a fact openly admitted by independent organizations like the National Alliance to End Homelessness.

The article goes on to suggest why there hasn’t been much coverage of this success: homelessness is not much of a social problem in Washington or the national media. Either the social construction of homelessness as a problem deserving lots of public attention hasn’t been very successful, it was never really attempted, or other social problems (like the various wars on crime, poverty, terrorism, etc.) have captured more attention.

But, if all the numbers cited above are correct, it seems a shame that a positive effect of public policies regarding a difficult problem is going relatively unnoticed…

Should the Newtown shootings be considered a mass shooting, a school shooting, mass murder, workplace violence…

Sociologist Joel Best provides a history of how American scholars and media have classified mass shootings:

It may seem self-evident that the killings at Sandy Hook Elementary ought to be classified as a shooting event, or as a school shooting or a mass shooting. Of course we classify events into categories that make sense to us, and it is easy to take familiar categories for granted. We learn of terrible crimes and we are accustomed to commentators talking about incidents as instances. But the ways we make sense of the world—the terms we use to describe that world—are created by people, and they are continually evolving, so that specific categories come into and fall out of favor. In fact, in recent decades, Americans have understood events like the Newtown killings in a variety of ways…

By the early 1980s, the Federal Bureau of Investigation promoted the distinction between mass murder and serial murder. The Bureau had a new databank—the Violent Criminal Apprehension Program, or VICAP—that could help law enforcement identify similar crimes that had occurred in other jurisdictions. But in the aftermath of revelations about the FBI’s surveillance of the civil rights movement, an effort to expand the bureau’s domestic data collection invited suspicion and resistance. The FBI used the serial murderer menace—and particularly the idea that serial killers might be nomadic, able to kill in different jurisdictions without the authorities ever recognizing that crimes in different places might be linked—to justify the VICAP program. That set the stage for Clarice Starling and all the other heroic FBI agents who began pitting their wits against serial murderers in crime fiction and movies. “Son of Sam” would no longer be classified with Charles Whitman…

Journalists notice patterns, so similarities between cases invite the creation of new categories. For example, in 1986, a postal worker killed 14 postal employees; then, in 1991, there were two more incidents involving former postal workers killing employees at post offices. This led to the expression “going postal.” Eventually, after further incidents in 1993, the Postal Service responded with a program to improve their workplace and prevent violence. Some criminologists began writing about workplace violence, although this category was defined as including any violence in a workplace, not just mass murders. Under that definition, a large share of workplace violence involved robberies. (According to one analysis, the three most common sites for workplace violence were taxicabs, liquor stores, and gas stations—all isolated settings likely to have cash on hand.)

At the end of the 1990s, attention shifted to schools. During the 1997–98 academic year, there were heavily publicized incidents in West Paducah, Kentucky; Jonesboro, Arkansas; and Springfield, Oregon. The Jonesboro story made the cover of Time, which featured a photo of one of the shooters as a young child wearing camouflage and holding a rifle, with the caption “Armed & Dangerous.” Thus, the expression “school shooting” was already familiar a year before the April 1999 killings at Columbine. Advocates and academics began compiling databases of school violence, although the results were surprising: The average number of deaths per year fell, from 48 during the period from the fall of 1992 through the spring of 1997, to 32 during the period spanning September 1997 through the end of the school year in 2001, even though Columbine and the other best-publicized cases occurred during the latter period. In spite of commentators declaring that the nation was experiencing a wave or epidemic of school shootings, the evidence suggested that violent deaths in schools were declining.

Best has written a lot about how social categories, experiences, and data then get used in political and civic discussions. Whether an event is classified as a school shooting or as the result of mental illness shapes the rest of the discussion, including what should be done in the future. Conditions don’t become social problems until someone can show they are worth the public’s attention.

What Best doesn’t do is try to forecast how the Newtown shooting will come to be known in the future. What is the dominant narrative that will develop? And how will it be used?

The changing definition and use of “Latino”

Here is a quick recap of how American society has defined and used the term Latino in recent decades:

If all ethnic identities are created, imagined or negotiated to some degree, American Hispanics provide an especially stark example. As part of an effort in the 1970s to better measure who was using what kind of social services, the federal government established the word “Hispanic” to denote anyone with ancestry traced to Spain or Latin America, and mandated the collection of data on this group. “The term is a U.S. invention,” explains Mark Hugo Lopez, associate director of the Pew Hispanic Center. “If you go to El Salvador or the Dominican Republic, you won’t necessarily hear people say they are ‘Latino’ or ‘Hispanic.’”

You may not hear it much in the United States, either. According to a 2012 Pew survey, only about a quarter of Hispanic adults say they identify themselves most often as Hispanic or Latino. About half say they prefer to cite their family’s country of origin, while one-fifth say they use “American.” (Among third-generation Latinos, nearly half identify as American.)

The Office of Management and Budget defines a Hispanic as “a person of Cuban, Mexican, Puerto Rican, South or Central American, or other Spanish culture or origin regardless of race” — about as specific as calling someone European.

“There is no coherence to the term,” says Marta Tienda, a sociologist and director of Latino studies at Princeton University. For instance, even though it’s officially supposed to connote ethnicity and nationality rather than race — after all, Hispanics can be black, white or any other race — the term “has become a racialized category in the United States,” Tienda says. “Latinos have become a race by default, just by usage of the category.”

A good discussion throughout. And the definition and usage of the term Latino or Hispanic is likely to keep changing in the decades to come. Altogether, it suggests racial and ethnic categories can be quite fluid, depending on a whole host of social factors.

Arguing over Frank Gehry’s plans for the Eisenhower Memorial illustrates the social construction of memorials

Architect Frank Gehry’s designs for the Eisenhower Memorial in Washington D.C. are drawing criticism. Curbed sums it up:

Anyone who still believes that “any press is good press” doesn’t know a thing about Frank Gehry’s plans for D.C.’s Eisenhower Memorial, which, ever since renderings were released for public fodder well over two years ago, has attracted a publicity buzz not unlike flies swarming a dying animal. Indeed, the memorial’s most hyperbolic and outspoken critic, the National Civic Art Society, has called Gehry’s plans for an architectural memorial park—which, with 80-foot columns and woven steel tapestries, is as nonlinear and flourished as the rest of his oeuvre—”sentimental kitsch,” “a temple to nothingness,” and a “behemoth [that] commemorates Gehry’s ego, not Eisenhower’s greatness and humility.” President Eisenhower’s grandchildren have spoken out against the design, as well, most recently calling it “regretfully, unworkable.” Oh, and don’t even get them started on those tapestries, which have been likened to the stuff of Communist regimes, derided as an “Iron Curtain to Ike,” and described by the NCAS as “a rat’s nest of tangled steel, a true maintenance nightmare.”

This week, Congress joined the clamor: Rep. Rob Bishop, a Republican from Utah, has just introduced legislation that would officially halt all of Gehry’s efforts and start the whole process afresh. Rep. Tom McClintock (R-Calif.) chimed in: “I want to know how we came up with this monstrosity.” This, of course, has ruffled a whole other set of feathers, namely those of the American Institute of Architects, which has said in a statement that the bill “is nothing more than an effort to intimidate the innovative thinking for which our profession is recognized at home and around the globe.”

This highlights the socially constructed nature of memorials. What are they supposed to look like? To answer that, we often look to genres. We have memorials that celebrate war victories, and they look a certain way: perhaps a big arch, perhaps a leader on a horse. We have memorials that mark the loss of life and the ambiguous outcomes of war: see the Vietnam Veterans Memorial or the Memorial to the Murdered Jews of Europe in Berlin. These public discussions can help ensure the public or leaders get what they want out of the memorial but might also stifle innovation.

In addition to this issue of genre, I see a few other issues in this criticism:

1. Why build a memorial for Eisenhower in the first place? Is it for presiding over a time of prosperity as president, or is it for his leadership in World War II (though we tend not to honor generals in these large ways anymore)? Here is the reasoning courtesy of the official website: eisenhowermemorial.gov.

Why honor President Eisenhower with a Memorial?

Congress approved the Dwight D. Eisenhower National Memorial in 1999 with the passage of Public Law 106-79, signed into law by President Clinton. The Eisenhower Memorial Commission is entrusted with the task of building an enduring memorial honoring Dwight D. Eisenhower as the Supreme Commander of the Allied Forces in Europe during World War II and the 34th President of the United States. Eisenhower understood war as only a soldier could and believed the possibility of a nuclear or thermonuclear World War III would be unwinnable for mankind. He set in place a strategy for winning the Cold War that was followed and implemented by future Presidents until the collapse of the Soviet Union. Eisenhower’s prescience and his strategic understanding of science and technology in establishing the United States as a pre-eminent world power was essential to securing freedom for generations of Americans to come. Eisenhower was influential in bringing World War II to an end and his efforts throughout the War, especially with the planning and execution of D-Day, stopped the Nazi war machine. He also ended the Korean War and maintained active communications with the Soviet Union during the Cold War.

This Memorial will not only tell the story of Eisenhower, the young man from Kansas who became a great soldier, a U.S. President, and a world leader, but will also reflect the story of America – humble, isolated beginnings, and a rapid ascension on the world stage. His example is an inspiration that, through leadership, cooperation, and public service, we too can achieve the American dream and make a difference in the world. Eisenhower, like America, rose to the occasion with courage and integrity.

With the 60th Anniversary of his election to President and the 70th anniversary of victory in World War II, it is fitting to celebrate Eisenhower’s numerous accomplishments as a General, President, and world citizen. Dwight D. Eisenhower’s dedicated service to his country spanned 50 years. It is appropriate that the first national presidential memorial of the 21st century will honor President Eisenhower. If there was ever a moment in our nation’s history to recognize a leader committed to both security and peace for the good of his nation and the world, now is that time.

How many presidents will receive memorials like this? How many should and who gets to decide?

2. I wonder how much of this is tied to Frank Gehry being the architect. Gehry has a particular approach to structures. What if it were a lesser-known architect or even an unknown? Back to the official website:

How was Frank Gehry selected to design the Eisenhower Memorial?

Mr. Gehry was one of four finalists in a competitive process managed by GSA under the guidelines of the General Services Administration Design Excellence Program. The process consisted of three stages. A notice was published in FedBizOpps announcing the opportunity for any designer with an existing portfolio to compete for the project. Submissions were received from forty-four qualified design firms in 2008. Evaluation factors included previous work, ability to work within the constraints of an urban site, interviews, and responses to the memorial’s pre-design program. That program addressed Eisenhower’s accomplishments as well as the physical parameters of the memorial site. Mr. Gehry’s creativity, ingenuity and inventiveness demonstrated his understanding of Eisenhower as a General, President, and world citizen. An independent panel of reviewers, including Commissioner David Eisenhower, reviewed the presentations by the final four designers and recommended Frank Gehry. The Eisenhower Commission unanimously accepted their recommendation.

3. How much should the family of the memorialized person be involved? Curbed cites the family’s dislike for the structure. But, isn’t the memorial more for the people of the United States? This is a matter of competing interests.

4. I wonder if there are any critics of Eisenhower’s presidency who might object loudly to the design of the memorial. The Eisenhower administration wasn’t perfect…

In the end, this memorial partly reflects something about Eisenhower himself, but it also strongly reflects our understanding of Eisenhower between 1999, when the memorial process started, and 2016, when the project is supposed to be completed.

How the Facebook equation 6÷2(1+2)= reveals the social construction of the order of operations

An equation on Facebook that has generated a lot of debate actually illustrates where the mathematical order of operations comes from:

Some of you are already insisting in your head that 6 ÷ 2(1+2) has only one right answer, but hear me out. The problem isn’t the mathematical operations. It’s knowing what operations the author of the problem wants you to do, and in what order. Simple, right? We use an “order of operations” rule we memorized in childhood: “Please excuse my dear Aunt Sally,” or PEMDAS, which stands for Parentheses Exponents Multiplication Division Addition Subtraction. This handy acronym should settle any debate—except it doesn’t, because it’s not a rule at all. It’s a convention, a customary way of doing things we’ve developed only recently, and like other customs, it has evolved over time. (And even math teachers argue over order of operations.)

“In earlier times, the conventions didn’t seem as rigid and people were supposed to just figure it out if they were mathematically competent,” says Judy Grabiner, a historian of mathematics at Pitzer College in Claremont, Calif. Mathematicians generally began their written work with a list of the conventions they were using, but the rise of mass math education and the textbook industry, as well as the subsequent development of computer programming languages, required something more codified. That codification occurred somewhere around the turn of the last century. The first reference to PEMDAS is hard to pin down. Even a short list of what different early algebra texts taught reveals how inconsistently the order of operations was applied…

The bottom line is that “order of operations” conventions are not universal truths in the same way that the sum of 2 and 2 is always 4. Conventions evolve throughout history in response to cultural and technological shifts. Meanwhile, those ranting online about gaps in U.S. math education and about the “right” answer to these intentionally ambiguous math problems might be, ironically, missing a bigger point.

“To my mind,” says Grabiner, “the major deficit in U.S. math education is that people think math is about calculation and formulas and getting the one right answer, rather than being about exciting ideas that cut across all sorts of intellectual categories, clear and logical thinking, the power of abstraction and a language that lets you solve problems you’ve never seen before.” Even if that language, like any other, can be a bit ambiguous sometimes.

Another way to restate this conclusion from Grabiner is that math is more about problem-solving than calculations.

This reminds me of well-known areas of sociology that deal with the norms of everyday interactions. In order to interpret the actions of others, we need to know the agreed-upon assumptions. When those assumptions are blurry or are not followed, people get nervous. Hence, as this article suggests, many people get anxious when the rules/norms of math are seemingly violated. If these sorts of basic equations can’t be easily figured out, what hope is there of understanding the rest of math? But norms are not always cut and dried, and that can be okay…as long as the people participating are aware of this.
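To make the ambiguity concrete, here is a minimal sketch in Python (mine, not from the original article) of the two defensible readings of 6 ÷ 2(1+2); the only difference is how tightly the implied multiplication “2(1+2)” is assumed to bind:

```python
# Two readings of 6 ÷ 2(1+2). Python has no implied multiplication,
# so each reading must be written out explicitly.

# Reading 1: division and multiplication applied strictly left to right
# (the convention most calculators and PEMDAS-as-taught imply).
left_to_right = 6 / 2 * (1 + 2)    # (6 / 2) * 3 = 9.0

# Reading 2: implied multiplication binds tighter than division
# (a convention used in some textbooks and journals).
implied_first = 6 / (2 * (1 + 2))  # 6 / 6 = 1.0

print(left_to_right, implied_first)  # 9.0 1.0
```

Both lines are perfectly well-defined once a convention is chosen; the argument is only over which convention to assume.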

Sociologist: downgrade threat of terrorism in US to a “tiny” threat

Remember when terrorism was the number one concern in the United States? A new report features a sociologist arguing that terrorism is now a “tiny” threat in this country. Here is some of the evidence:

Kurzman’s report, “Muslim-American Terrorism in the Decade Since 9/11,” said that compared to the 14,000 murders in the U.S. last year, the potential for Muslim Americans to take up terrorism is “tiny.”

In the 10 years since the 9/11 terrorist attacks, 193 Muslim Americans have been indicted in terrorist plots, or fewer than 20 per year, Kurzman said.

Just one of those indicted last year was actually charged with carrying out an attack — Yonathan Melaku, who fired shots at military buildings in northern Virginia — compared to six Muslim Americans who carried out attacks in 2010, including Faisal Shahzad, the failed Times Square bomber.

“This number is not negligible — small numbers of Muslim Americans continue to radicalize each year and plot violence,” Kurzman wrote. “However, the rate of radicalization is far less than many feared in the aftermath of 9/11.”

This reminds me of the idea that the “war on terror” is more of a social construction than an actual threat. Granted, the money and resources spent fighting terrorism may themselves have contributed to the low number of terrorists, but the heavy application of resources, plus the political rhetoric (remember the days of terror alerts?), plus media accounts, may have blown this up into a bigger issue than it actually was.

It would be interesting to hear what Kurzman thinks should be done in response to this data. On one hand, perhaps we should spend less time and effort fighting terrorism, particularly in an era of many other issues and fiscal shortfalls. On the other hand, who wants to be the politician or expert who says things are okay right before some major incident occurs? Is even one incident of terrorism too many to accept? This sounds very similar to the tradeoffs involved in dealing with (falling rates of) crime.

Why many products are always “on sale” – and why buyers fall for it

For a while, my wife and I had a running joke about the Kohl’s circular that came in the Sunday newspaper: every week brought “the biggest sale of the year!” This is a common strategy for many retailers, and consumers continue to fall for it:

“People don’t have a gut sense of absolute value. It’s just that they’re sensitive to contrast. So if you say I’m getting 40 percent off, I’m interested, no matter what the actual cost is.”

“The whole concept of a sale or a discount has become really perverted,” said Shell, a co-director of Boston University’s Center for Science & Medical Journalism and a contributing editor to Atlantic Monthly. “So what is the price? We think of price as a number, something that’s coolly objective, but it’s not. It’s a highly emotional construct. Price is manipulated to attract the consumer.

“If people see a sweater on a table for $50, they don’t buy it. If they see the same sweater was once $100, they will. We’re highly swayed by reference price. … There are some things that are almost perennially on sale, like mattresses and jewelry. We buy almost all our clothing on sale.”

“Retailers are now outfoxing consumers,” said Kit Yarrow, chair of the psychology department at Golden Gate University, where she is a jointly appointed professor of both psychology and marketing. “They’ve figured out how to offer a bargain in a way that the consumer doesn’t even know what they’re buying anymore.”

So how could consumers fight back? Some common strategies:

1. In certain areas, like credit card offers and statements or the calories in restaurant meals, require sellers to display more information so that consumers can theoretically make more rational decisions. Do all consumers use this information? Does the extra information “wear off” over time, particularly in light of enticing promotions or marketing? You can hear the same argument about health care from some people: if everyone, doctors and patients alike, knew how much every test or treatment was going to cost, different choices would be made.

2. Use an envelope system (or a debit card) for spending money so that one has a better idea of the total spending limit (see the sketch after this list). This may help curb overspending, but does it eliminate all “impulse buys” or purchases made simply because they are deals?

3. Aren’t consumer education classes in high school supposed to cover finances and the like? And do they help much? Do such classes typically talk about how marketing works and about different ways to think about deals?

4. There are companies that claim not to offer deals and instead have “no-haggle prices” or something like that. Think of CarMax or Saturn. Since most other retailers do offer deals, some companies can take the opposite tack.
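As a concrete illustration of the envelope idea in point 2, here is a minimal sketch in Python (hypothetical categories and amounts, not from the original post) of how fixed envelopes put a hard ceiling on each spending category:

```python
# Minimal envelope-budget sketch: each category gets a fixed amount,
# and a purchase only goes through if its envelope still has enough left.
envelopes = {"groceries": 400.00, "clothing": 150.00, "fun": 100.00}

def spend(category: str, amount: float) -> bool:
    """Deduct from the envelope if possible; refuse the purchase otherwise."""
    if envelopes.get(category, 0) >= amount:
        envelopes[category] -= amount
        return True
    return False

print(spend("clothing", 50))    # True  -> clothing envelope now 100.0
print(spend("clothing", 120))   # False -> the "sale" sweater is refused
print(envelopes["clothing"])    # 100.0
```

The point of the design is that the refusal happens at the moment of purchase, which is exactly when a “40 percent off” sign is doing its work.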

The conclusion: prices are a social construction, and sale pricing taps into the basic human impulse to avoid losses (in this case, paying the higher reference price).

It is the best of times for teaching sociology and the worst of times for America

A minister and adjunct instructor of sociology raises an intriguing question: when times are good for teaching sociology, they may be bad times for society.

This is a great time for teaching sociology, which means it is a bad time.

The study of sociology was born of the Industrial Revolution, when the gap between the rich and the poor became the greatest ever known. The two groups which I straddle, the religious community and the academic community, became interested and attempted to study social phenomena with a scientific approach, replacing social myths with evidence and facts.

Somehow we have managed to return statistically to that time. In our country, the wealthiest one percent of the population own 33 percent of the wealth and the wealthiest 10 percent own 70 percent of “our” wealth. It seems we have returned to the ruling class mode of the 19th century in Russia and France—a time when America was awash with “robber barons.”

No wonder folks are taking to the streets.

I wonder if anyone has researched the relationship between the popularity of sociology and the historical milieu. Sociology did emerge out of turbulent times in the mid-1800s, but it also seemed to reach peaks in the United States in the prosperous 1920s (the Chicago School) and in the 1960s and 1970s, when there was both unrest and prosperity. Might this suggest that when academia thrives, i.e. student populations are increasing as well as budgets, sociology (and perhaps other disciplines) thrives? At the very least, we could look at figures on undergraduate majors and students enrolled in sociology graduate programs over the years. Perhaps there simply wouldn’t be many dips in the data as sociology programs expand over time and spread into more schools.

Probably the better argument to make here is that sociology appears more relevant in unsettled times. As society dips toward troubles and chaos, people want answers and explanations. Additionally, perceptions of social problems might be more important here than the scale of actual problems. However, I wonder if this tends to give sociology a bad name as people then equate it only with social problems rather than solutions and thriving societies.

Genealogies as “heavily curated social constructions”

Tracking genealogies is both a popular hobby and big business. A sociologist argues that these genealogies are actually social constructions of our past:

In Ancestors and Relatives: Genealogy, Identity, and Community, Eviatar Zerubavel, a sociologist at Rutgers, pulls back the curtain on the genealogical obsession. Genealogies, he argues, aren’t the straightforward, objective accounts of our ancestries we often presume them to be. Instead, they’re heavily curated social constructions, and are as much about our values as they are about the facts of who gave birth to whom…

“No other animals have ‘second cousins once removed,'” Zerubavel points out, “or are aware of having had great-great-great-grandparents”; only people have the more abstract sorts of relatives necessary for a real genealogy. In the meantime, as categories for relatives proliferate and family trees expand, we accrue large numbers of ‘optional’ relatives. We construct our genealogies by choosing, out of a nearly endless array of possibly important or interesting ancestors, the ones who matter to us.

Those choices are highly motivated, and often obviously artificial. Because we want to stretch our family lines far into the past, we often “cut and paste” different branches, claiming, for example, a great-great-grandmother’s stepfather as one of our own ancestors, and following his line into the past. We “braid” ancestral identities together, emphasizing, as President Obama has, that we come from two distinct lines of descent (“a mother from Kansas and a father from Kenya”). Sometimes, though, the opposite impulses take hold. We might deliberately “lump” our diverse ancestries together, aiming to consolidate them, using a label like “Eurasian,” to lower the contrast (as Tiger Woods does when he refers to himself as “Cablinasian” — a combination of Caucasian, black, American Indian, and Asian). Or we might “clip” our family trees, obscuring their origins so as to preserve coherence and purity. That, Zerubavel writes, is what the Nazis did with Jewish genealogies: “Going only two generations back when formally defining Jewishness… helped the Nazis avoid realizing how many ‘Aryan’ Germans actually also had Jewish ancestors.”…

The point, Zerubavel writes, is that genealogies don’t all follow the same rules. Depending on what you’re trying to emphasize, you accept, reject, combine, or contrast individuals, families, and even whole ethnic identities. The most objective point of view, as Richard Dawkins has written, would probably hold that “all living creatures are cousins.” But genealogies are partial, selective, subjective, and social. They are as much about the present as they are about the past.

This isn’t too surprising: we commonly pick and choose what we want to believe and then display it to others. Could we argue that genealogies are simply another tool of impression management, where we show our best (past) side to others and cover up the people we aren’t as proud of? This doesn’t seem that different from communities that cover up infamous parts of their histories or from patriotic narratives that emphasize only the positives.

This reminds me of a high school history project I had to do. For my American History class, we had to make a poster out of our genealogies and there was a prize handed out to the person who could go the farthest back. Several of my family lines didn’t go more than four or five generations back but one of them had been extensively researched back to 46 generations and Alfred the Great, king of the Anglo-Saxons in the late 800s. Several things struck me then as odd:

1. I ended up losing out to a girl who could trace her family back 47 generations. Is this a prize-worthy objective anyway?

2. Who has the time and money to spend on tracing one’s family back 46 generations? Perhaps this doesn’t require too many resources these days, with online resources plus what is often available at libraries, but it still requires time.

3. Some of the family line was strange: I think at one point it ran through a cousin and at another through a daughter rather than a son. It seemed clearly set up to get back to people like Sir Francis Bacon and Alfred the Great.

But, for the day or two that my poster was up in the classroom, I could say that I could trace my family back 46 generations when most people could not.