Earning more yearly from the growing value of your home than from a minimum wage job?

Zillow suggests that in about half of the United States’ largest cities, the past year’s growth in home values exceeded the annual earnings of a full-time minimum wage job:

The typical U.S. home appreciated 7.6 percent over the past year, from a median value of $195,400 in February 2017 to $210,200 at the end of February 2018. That $14,800 bump in value translates to a gain in home equity of $7.09 for every hour the typical U.S. homeowner was at the office last year (assuming a standard 40-hour work week),[1] a shade less than the federal minimum wage of $7.25 per hour.

Overall, owners of the median-valued home in 24 of the nation’s 50 largest cities earned more in equity per hour over the past year than their local minimum wage.[2] But homeowners in a handful of U.S. cities made out a lot better than that – in some cases much, much better.

The median U.S. household earned roughly $60,000 in 2017 ($58,978 to be exact),[3] or a little more than $28 per hour. But in six U.S. cities – New York, San Diego, San Jose, San Francisco, Seattle and Oakland – owners of the median-valued local home gained more than that in home equity alone. And if earning a six-figure annual salary represents a certain amount of privilege, homeowners in San Francisco, San Jose and Seattle all made comfortably more than that simply by virtue of owning a local home…

A home is often a person’s biggest financial investment, and according to the 2017 Zillow Group Consumer Housing Trends Report, the typical American homeowner has 40 percent of their wealth tied up in their home. A recent Zillow survey found that 70 percent of Americans[4] view their home as a positive long-term investment.
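To make the quoted figures concrete, here is the back-of-the-envelope arithmetic behind the per-hour comparison. This is a sketch, not Zillow’s actual calculation; the 2,080 hours-per-year assumption is mine, which is likely why it lands at $7.12 rather than the quoted $7.09.

```python
# Rough reconstruction of the per-hour comparison from the figures quoted above.
# The 2,080 hours-per-year figure (40 hours x 52 weeks) is an assumption;
# Zillow may have used a slightly different number of work hours.
start_value = 195_400                 # median U.S. home value, Feb 2017
end_value = 210_200                   # median U.S. home value, Feb 2018
hours_per_year = 40 * 52              # "standard 40-hour work week"

equity_gain = end_value - start_value                  # $14,800
equity_per_hour = equity_gain / hours_per_year         # ~$7.12 vs. the quoted $7.09
income_per_hour = 58_978 / hours_per_year              # median household income, ~$28.35/hour

print(f"Home equity gained per work hour: ${equity_per_hour:.2f}")
print(f"Median household income per work hour: ${income_per_hour:.2f}")
print("Federal minimum wage: $7.25")
```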

This is both an interesting and weird comparison. For the interesting part: most people understand the abstract idea of working a minimum wage job. They should know that a full year of work at that rate does not generate much money. The reader is supposed to be surprised that simply owning a home could be a more profitable activity than working.

But, there are a number of weird features of this comparison. Here are four:

First, not all that many Americans work full-time minimum wage jobs. People understand the idea but tend to overestimate how many people work for just the minimum wage.

Second, roughly half the cities on this list did not experience such an increase in housing values. Without comparisons over time, it is hard to know whether this information about 24 out of 50 cities is noteworthy or not.

Third, the comparison hints that a homeowner could choose not to work and instead reap the benefits of their home’s rising value. The question is posed in the first paragraph: “Why work a 9-5 slog, when you can sit back and collect substantial hourly home equity ‘earnings’ instead?” Oddly, only after the data is presented does a disclaimer section at the end explain the difference between working a job and earning money through selling a home.

Fourth, to purchase a home, particularly in the hottest markets cited, someone has to start with a good amount of capital. In other words, the people who would be working full-time minimum wage jobs for a full year are not likely to be the ones who benefit from growth in home equity. It takes a certain amount of wealth just to own a home, and even more to profit from owning homes alone.

Overall, I would give Zillow some credit for trying to compare the growth in home values to a known entity (a minimum wage job), but the comparison falls apart pretty quickly once one gets past the headline.

The double-edged sword of record home prices in many American metro areas

The housing bubble of the late 2000s may be long gone, but housing prices continue to rise:

Prices for single-family homes, which climbed 5.3 percent from a year earlier nationally, reached a peak in 64 percent of metropolitan areas measured, the National Association of Realtors said Tuesday. Of the 177 regions in the group’s survey, 15 percent had double-digit price growth, up from 11 percent in the third quarter.

Home values have grown steadily as the improving job market drives demand for a scarcity of properties on the market. While prices jumped 48 percent since 2011, incomes have climbed only 15 percent, putting purchases out of reach for many would-be buyers.

The consistent price gains “have certainly been great news for homeowners, and especially for those who were at one time in a negative equity situation,” Lawrence Yun, the Realtors group’s chief economist, said in a statement. “However, the shortage of new homes being built over the past decade is really burdening local markets and making homebuying less affordable.”

Having read a number of stories like this, I wonder if there is a better way to distinguish between economic indicators that are good all around and ones like this that may appear good – home values are going up! – but really mask significant issues – values may be going up while many would-be buyers cannot afford increasingly costly homes. The news story includes this information, but I suspect many readers will just see the headline and assume things are good. Another example has shown up in a lot of partisan commentary in recent years (with supporters of both sides suggesting this when their party did not hold the presidency): the unemployment rate is down, but it does not account for the people who have stopped looking for work.
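One concrete way to keep a headline number from masking the affordability issue is to track a derived measure, such as the ratio of prices to incomes. Here is a back-of-the-envelope sketch using only the growth figures quoted above; it assumes both series cover the same period since 2011.

```python
# If home prices rose 48% since 2011 while incomes rose only 15% (figures
# quoted above), a crude price-to-income ratio worsened by roughly 29%.
price_growth = 0.48
income_growth = 0.15

ratio_change = (1 + price_growth) / (1 + income_growth) - 1
print(f"Price-to-income ratio rose about {ratio_change:.0%} since 2011")   # ~29%
```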

In the long run, we need (1) better measures that can encompass more dimensions of particular issues, (2) better reporting on economic indicators, and (3) a better understanding among the general populace about what these statistics are and what they mean.

Multiple measures and small trends: American birthrates down, births per woman up

A new Pew report explains this statistical oddity: the annual birthrate in the US is down but women are having more children.

How can fertility be down even as the number of women who are having children is going up? There are complex statistical reasons for this, but the main cause of this confusing discrepancy is the age at which women are having children. Women are having children later in life — the median age for having a first baby is 26 now, up from 23 in 1994 — and this delay causes annual birth rates to go down, even as the cumulative number of babies per woman has risen…


Another factor, Livingston said, is the drop in teen birth rates, with black women seeing the biggest drop in that category.

See the Pew report here. An additional part of the explanation is that there are multiple measures at play here. A Pew report from earlier in 2018 explains:

But aside from this debate, the question remains: Is this really a record low? The short answer is: It’s complicated.

That’s because there are different ways to measure fertility. Three of the most commonly used indicators of fertility are the general fertility rate (GFR); completed fertility; and the total fertility rate (TFR). All three reflect fertility behavior in slightly different ways – respectively, in terms of the annual rate at which women are presently having kids; the number of kids they ultimately have; or the hypothetical number they would likely have based on present fertility patterns.

None of these indicators is “right” or “wrong,” but each tells a different story about when fertility bottomed out.

Measurement matters and the different measures can fit different social and political views.
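To see how a period measure and a cohort measure can move in opposite directions when births shift to later ages, here is a toy simulation. All of the numbers are invented for illustration; this is not Pew’s data or model, just the “tempo effect” in miniature.

```python
# Toy model of the "tempo effect": every cohort of women ends up with exactly
# two children, but later cohorts have them at older ages, so annual (period)
# birth counts dip during the transition even though completed fertility
# never changes. All ages and years below are made up.

def birth_ages(cohort_year):
    """Ages at which a woman born in cohort_year has her two children."""
    if cohort_year < 2000:     # earlier cohorts: children at ages 23 and 25
        return (23, 25)
    return (27, 29)            # later cohorts: children at ages 27 and 29

# Assume one woman is born per year, so the pool of potential mothers is constant.
births_by_year = {}
for cohort in range(1950, 2031):
    for age in birth_ages(cohort):
        year = cohort + age
        births_by_year[year] = births_by_year.get(year, 0) + 1

for year in range(2018, 2032):
    print(year, births_by_year.get(year, 0))
# Annual births run at 2 through 2022, dip to 1 in 2023-2024 and 0 in 2025-2026,
# then recover to 2 by 2029, yet every cohort's completed fertility is exactly 2.
```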

I wonder if part of the issue is also that there is a clear drop in births from the earlier era – roughly 1950 to 1970, the period we often associate with the Baby Boomers – while the last three-plus decades have been relatively flat. This plateau of recent decades means researchers and commentators may be more prone to jump on small changes in the data. Many people would love to predict the next big rise or fall in the numbers, but a significant change may simply not be there, particularly when looking at multiple measures.

Reading into a decreasing poverty rate, increasing median household income

Here are a few notable trends in the new data showing that the poverty rate is down in the United States and median household incomes are up:

Regionally, economic growth was uneven.
The median household income in the Midwest grew just 0.9 percent from last year, which is not a statistically significant amount. In the South, by contrast, the median income grew 3.9 percent; in the West, it grew 3.3 percent. “The Midwest is the place where we should have the greatest worry in part because we didn’t see any significant growth,” said Mary Coleman, the senior vice president of Economic Mobility Pathways, a national nonprofit that tries to move people out of poverty. Median household income was also stagnant in rural areas, growing 13 percent, to $45,830. In contrast, it jumped significantly inside cities, by 5.4 percent, to $54,834, showing that cities are continuing to pull away from the rest of the country in terms of economic success…

African Americans and Hispanics experienced significant gains in income, but still trail far behind whites and Asians.
All ethnic groups saw incomes rise between 2015 and 2016, the second such annual increase in a row. The median income of black families jumped 5.7 percent between 2015 and 2016, to $39,490. Hispanic residents also saw a growth in incomes, by 4.3 percent, to $47,675. Asians had the highest median household income in 2016, at $81,431. Whites saw a less significant increase than African Americans and Hispanics, of 1.6 percent, but their earnings are still far higher, at $61,858.

The poverty rate for black residents also decreased last year, falling to 22 percent, from 24.1 percent the previous year. The poverty rate of Hispanics decreased to 19.4 percent, from 21.4 percent in 2015. In comparison, 8.8 percent of whites, or 17.3 million people, were in poverty in 2016, which was not a statistically significant change from the previous year, and 10.1 percent of Asians, or 1.9 million people, were in poverty, which was also similar to 2015…

Income inequality isn’t disappearing anytime soon.
Despite the improvements in poverty and income across ethnic groups, the American economy is still characterized by significant income inequality; while the poor are finally finding more stable footing following the recession, the rich have been doing well for quite some time now. The average household income of the top 20 percent of Americans grew $13,749 from a decade ago, while the average household income of the bottom 20 percent of Americans fell $571 over the same time period. The top 20 percent of earners made 51.5 percent of all income in the U.S. last year, while the bottom 20 percent made just 3.5 percent. Around 13 percent of households made more than $150,000 last year; a decade ago, by comparison, 8.5 percent did. While that’s something to cheer, without a solid middle class, it’s not indicative of an economy that is healthy and stable more broadly.

Both of these figures – the poverty rate and median household incomes – are important indicators of American social and economic life. Thus, it is good news that both are trending in the right direction.

Yet, we also have the impulse these days to (1) dig deeper into the data and (2) highlight how these trends may not last, particularly in the era of Trump. The trends noted above (and there are others also discussed in the article) can be viewed as troubling insofar as the gains made by some either were not shared by others or do not erase large gaps between groups. Our understandings of these income and poverty figures can change over time as measurements change and perceptions of what is important change. For example, rising median household income could suggest that more Americans have more income, or we may now care less about absolute incomes and pay more attention to relative incomes (and particularly the gap between those at the top and the bottom).
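As one concrete example of a relative measure, the income shares quoted above can be turned into a “20:20 ratio” comparing the top and bottom fifths of households. This is a minimal sketch using only the quoted shares; it is just one of many possible gap measures.

```python
# The "20:20 ratio": total income of the top fifth relative to the bottom fifth,
# using the shares quoted above (51.5% vs. 3.5% of all income).
top_share, bottom_share = 0.515, 0.035

print(f"Top fifth earned {top_share / bottom_share:.1f}x the bottom fifth's total income")  # ~14.7x
```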

In other words, interpreting data is influenced by a variety of social forces. Numbers do not interpret themselves and our lenses consistently change. Two reasonable people could disagree on whether the latest data is good for America or suggests there are enduring issues that still need to be addressed.

Mutant stat: 4.2% of American kids witnessed a shooting last year

Here is how a mutant statistic about the exposure of children to shootings came to be:

It all started in 2015, when University of New Hampshire sociology professor David Finkelhor and two colleagues published a study called “Prevalence of Childhood Exposure to Violence, Crime, and Abuse.” They gathered data by conducting phone interviews with parents and kids around the country.

The Finkelhor study included a table showing the percentage of kids “witnessing or having indirect exposure” to different kinds of violence in the past year. The figure under “exposure to shooting” was 4 percent.

The findings were then reinterpreted:

Earlier this month, researchers from the CDC and the University of Texas published a nationwide study of gun violence in the journal Pediatrics. They reported that, on average, 7,100 children under 18 were shot each year from 2012 to 2014, and that about 1,300 a year died. No one has questioned those stats.

The CDC-UT researchers also quoted the “exposure to shooting” statistic from the Finkelhor study, changing the wording — and, for some reason, the stat — just slightly:

“Recent evidence from the National Survey of Children’s Exposure to Violence indicates that 4.2 percent of children aged 0 to 17 in the United States have witnessed a shooting in the past year.”

The reinterpreted findings were picked up by the media:

The Dallas Morning News picked up a version of the Washington Post story.

When the Dallas Morning News figured out something was up (due to a question raised by a reader) and asked about the origins of the statistic, they uncovered some confusion:

According to Finkelhor, the actual question the researchers asked was, “At any time in (your child’s/your) life, (was your child/were you) in any place in real life where (he/she/you) could see or hear people being shot, bombs going off, or street riots?”

So the question was about much more than just shootings. But you never would have known from looking at the table.

This appears to be a classic example of a mutant statistic as described by sociologist Joel Best in Damned Lies and Statistics. As Best explains, it doesn’t take much for a number to be unintentionally twisted into something nonsensical yet interesting to the public because it seems shocking. And while the Dallas Morning News might deserve some credit for catching the issue and trying to set the record straight, the incorrect statistic is now out in public and can easily be found and repeated.

Claim: we see more information today so we see more “improbable” events

Are more rare events happening in the world or are we just more aware of what is going on?

In other words, the more data you have, the greater the likelihood you’ll see wildly improbable phenomena. And that’s particularly relevant in this era of unlimited information. “Because of the Internet, we have access to billions of events around the world,” says Len Stefanski, who teaches statistics at North Carolina State University. “So yeah, it feels like the world’s going crazy. But if you think about it logically, there are so many possibilities for something unusual to happen. We’re just seeing more of them.” Science says that uncovering and accessing more data will help us make sense of the world. But it’s also true that more data exposes how random the world really is.
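Stefanski’s point is easy to check with a quick back-of-the-envelope calculation. The probability and the number of observed events below are made-up round numbers chosen only to illustrate the scale effect.

```python
# If an event has a one-in-a-million chance on any single observation, how often
# should we expect to see it once we are exposed to a billion observations?
# Both numbers here are invented for illustration.
p = 1e-6                 # chance of the rare event on one observation
n = 1_000_000_000        # observations we are exposed to (news, feeds, etc.)

expected_occurrences = n * p              # mean number of times it happens
prob_at_least_one = 1 - (1 - p) ** n      # chance of seeing it at least once

print(f"Expected occurrences: {expected_occurrences:.0f}")     # ~1,000
print(f"P(at least one occurrence): {prob_at_least_one:.6f}")  # ~1.000000
```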

Here is an alternative explanation for why all these rare events seem to be happening: we are bumping up against our limited ability to predict all the complexity of the world.

All of this, though, ignores a more fundamental and unsettling possibility: that the models were simply wrong. That the Falcons were never 99.6 percent favorites to win. That Trump’s odds never fell as low as the polling suggested. That the mathematicians and statisticians missed something in painting their numerical portrait of the universe, and that our ability to make predictions was thus inherently flawed. It’s this feeling—that our mental models have somehow failed us—that haunted so many of us during the Super Bowl. It’s a feeling that the Trump administration exploits every time it makes the argument that the mainstream media, in failing to predict Trump’s victory, betrayed a deep misunderstanding about the country and the world and therefore can’t be trusted.

And maybe it isn’t very easy to reconcile these two explanations:

So: Which is it? Does the Super Bowl, and the election before it, represent an improbable but ultimately-not-confidence-shattering freak event? Or does it indicate that our models are broken, that—when it comes down to it—our understanding of the world is deeply incomplete or mistaken? We can’t know. It’s the nature of probability that it can never be disproven, unless you can replicate the exact same football game or hold the same election thousands of times simultaneously. (You can’t.) That’s not to say that models aren’t valuable, or that you should ignore them entirely; that would suggest that data is meaningless, that there’s no possibility of accurately representing the world through math, and we know that’s not true. And perhaps at some point, the world will revert to the mean, and behave in a more predictable fashion. But you have to ask yourself: What are the odds?

I know there is a lot of celebration of having so much information available today, but adjusting to the change is not necessarily easy. Taking it all in requires effort on its own, but the harder work lies in interpreting it and knowing what to do with it all.

Perhaps a class in statistics – in addition to existing efforts involving digital or media literacy – could help many people better understand all of this.

A better interpretation of crime statistics for Chicago suburbs

The Daily Herald looks at recent crime figures in Chicago area suburbs. How should we interpret such numbers?

Violent crimes increased last year in half of 80 suburbs, says a new report by the FBI we’ve been analyzing.

Property crimes increased in more than 40 percent of the suburbs.

The Uniform Crime Reporting Program’s 2015 report shows Rosemont had a 94 percent increase in violent crimes, from 18 in 2014 to 35 in 2015. Most are assaults, but the category also includes rape, homicide and robbery. The village had a 29 percent increase in property crimes, which include arson, burglary and vehicle theft.

Other more populous suburbs had larger numbers of violent crimes in 2015, including 650 in Aurora, 261 in Elgin and 128 in Naperville.

Violent crimes remained largely flat in Palatine, with 36; Des Plaines, with 50; and Arlington Heights, with 42; while some communities saw crimes decrease across the board. Buffalo Grove saw an 80 percent decrease in violent crimes, to 2, and an 18 percent decrease in property crimes, to 234, while Prospect Heights saw a 33 percent decrease in violent crimes, to 14, and a 29 percent decrease in property crimes, to 112.

What I would take away:

  1. Looking across communities, there was not much change: violent crimes did not rise in half of the suburbs, and property crimes increased in less than half of them.
  2. It is interesting to note the larger jumps in crime in certain communities. However, these should be interpreted in light of #1, and it would be more helpful to look at crime rates in these larger suburbs rather than relying on raw counts of incidents.
  3. The last paragraph notes some major changes in other suburbs. But some of these suburbs are smaller, and a large percentage decrease (the 80% drop in Buffalo Grove is a decline from 10 incidents to 2) or increase could be more a function of low overall counts than evidence of a larger trend (see the sketch after this list).
  4. There is little indication of crime figures or rates over time, which would help put the 2015 figures in better perspective.
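On point #3, a quick simulation shows how easily large percentage swings arise when the underlying counts are small. The assumed rate of 6 violent crimes per year is a made-up figure for illustration, not any suburb’s actual rate.

```python
import numpy as np

# Simulate a small suburb whose "true" underlying rate never changes:
# 6 violent crimes per year, drawn independently for two consecutive years.
rng = np.random.default_rng(0)
trials = 10_000

year1 = rng.poisson(6, trials)
year2 = rng.poisson(6, trials)

valid = year1 > 0                                   # avoid dividing by zero
pct_change = np.abs(year2[valid] - year1[valid]) / year1[valid]

print(f"Share of year-to-year comparisons with a 50%+ swing: {(pct_change >= 0.5).mean():.0%}")
# Even with no real change in the underlying rate, large percentage swings are
# common, so an 80% drop from 10 incidents to 2 says little by itself.
```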

Altogether, the headline "40 suburbs see spike in violent crimes in 2015" is not the most accurate. It may catch readers' attention, but neither the headline nor the article sufficiently discusses the statistics.