Thursday, August 29, 2019

What is a "host culture"? (Covert cultural hegemony)

Abstract: The concept of a "host culture" has been appropriated by the tourism industry from sociology, and is used incoherently when applied to Hawaii. Native Hawaiians are typically described as a "host culture" distinct from the "local culture", although no one would consider locals in Hawaii to be "guests". Native Hawaiian culture might better be understood as having been a very influential "hegemonic" culture, despite the declining relative fortunes of Native Hawaiians. In the latter half of the 20th century, it was ethnic Japanese in Hawaii whose values and attitudes became covertly culturally hegemonic within the local population, much to the benefit of Hawaii. In the 21st century, mainstream American culture might have a greater purchase on Hawaii's youth than locals might recognize.
The concept of a "host culture"
The term "host culture" does not seem to be used much in an academic context today. One of the earliest glimmers of the use of the term is from post-WW2 sociology in referring to the conflict between immigrants and the dominant culture that they encounter in their new country. The assumption was that a "failure" to assimilate on the part of immigrants led to a racist backlash in mainstream society.

https://en.wikipedia.org/wiki/Immigrant-host_model
The immigrant-host model was an approach that developed in postwar sociology to explain the new patterns of immigration and racism. It explained the racism of hosts as a reaction to the different cultural traditions of the immigrants, which acted as obstacles for their economic development and social integration.[1] The model assumed that the disruption immigration caused to stability would be solved by the cultural assimilation of immigrants into the dominant culture.
This perspective is rooted in the integrationist model that underlay the civil rights movement of the 1960s, which began to disintegrate in the face of the successes and failures of that movement. Even after the enforcement of voting rights and desegregation, blacks found that old prejudices did not change with the change in laws, and even where attitudes did change, economic conditions did not. Also, among northern whites who supported the civil rights movement when it rolled through the South, the quasi-liberal segregationist slogan of "separate but equal" became appealing when the movement took hold in their own neighborhoods. White supremacy had been squelched in the South, but white and black separatism replaced it. The South transformed and became the "New South" that relinquished explicit racial domination in favor of an unofficial policy of avoidance -- and so did the rest of the USA. Hence the 1970s renaissance of American multiculturalism. This is the current orthodoxy across the political spectrum.

Subsequently, in sociology, the integrationist immigrant-host model fell out of fashion. The perceived problem was no longer immigrants clinging to their culture and refusing to assimilate, but an unfair and exploitative situation in which immigrants were discriminated against on the pretext of their difference. The 1950s immigrant-host model was itself now perceived as yet another example of the racist "blame-the-victim" mentality of the dominant society.
Since the 1970s, however, the assumptions in this model have been increasingly discarded in the sociology of race relations. The core idea of the model that the immigrants' children would gradually assimilate and, thus, that racism and racial inequality would cease proved false. The model was criticized and blamed for reflecting and, even, reinforcing the racist assumptions by describing the cultures of immigrants as social problems and ignoring the role structural inequality plays in their subjugation.
The very concept of a "host culture" seems to have disappeared from the social sciences. In fact, there is no coherent definition of a "host culture" out there anywhere. The concept of a host culture seems to have been appropriated by the travel industry, and is no longer used in an academic world that would at least take pains to define its terms. The typical travel industry website makes a distinction between tourists or "guests", on the one hand, and the "host culture" of the tourist destination, on the other hand. The host culture is the mainstream culture of those destinations. Notably, the host culture is not the culture of indigenous people in places like Australia or Brazil, but rather the mainstream, dominant culture of those countries.
The "host culture" in Hawaii
The one exception to this trend is found on the Travel Weekly website, where the author distinguishes between the "host culture" and the "local culture". He takes Hawaii as his example.
https://www.travelweekly.com/Arnie-Weissmann/Host-culture-local-culture
One consequence of globalization is an acceleration in the gap that's widening between host cultures and local cultures. The implications for the travel industry are significant. 
Hawaii is the destination that perhaps provides the clearest distinction between a host culture and a local culture. 
The Hawaiian traditions, language, customs and, above all, the aloha spirit, define its host culture. Sacred as that is, the influences of everything and everyone who have passed through or settled on the islands has made an impact on modern Hawaiian life and shaped a distinctive local culture that's quite different from the host culture. 
World War II GIs can take credit, if they want, for the prevalence of Spam on local menus. And Japanese tourists' demand for familiar ingredients has helped the local cuisine evolve in, well, less processed ways. 

In Hawaii, host culture and local culture coexist in relative harmony, in large measure because few in the local culture challenge the importance of the host culture, and the host culture's spirit is so inherently welcoming that many outside influences are not viewed as conflicting with traditional beliefs.
Today, both local and host cultures are being erased by the spread of corporate mass culture -- Starbucks, McDonald's, Walmart.
But elsewhere, globalization has been accused of contributing to the dilution of dominant host cultures while concurrently contributing to the homogenization of local cultures. 
This part of the story is familiar: Fast-food outlets, retail stores and hotels have become brands without borders. Popular culture, from singers to movie franchises, is similarly ubiquitous. 
It would seem that the world is moving toward sameness just as it's getting easier for more people to move around and explore the world's differences.
As far back as the 1950s, the French historian Fernand Braudel spoke of "global civilization" as a modernizing force that was transforming the world into somewhat similar societies. There is some irony in the French fear of globalization -- in particular, the spread of American corporate mass culture -- as a threat to the French sense of distinctiveness. Historically, it was other countries -- notably, German-speaking lands -- that chafed under the universal appeal of a French culture that aggressively broadcast itself throughout Europe and the world. Paralleling the split between Protestantism and Catholicism, Germans distinguished between "Kultur" as distinctive inward spiritual values that were under threat from the outward, material aspects of "civilization", such as political and legal systems, economies, technology, infrastructure, even rationality and science. (Adam Kuper's 1999 book "Culture: The Anthropologists' Account".)

Historically, the role of mass culture is downplayed in normative debates on how immigrants should behave in relation to society. In the USA, three models vied for legitimacy:
  • Anglo-dominant assimilation, in which immigrants gratefully conform -- at least outwardly -- to the sociocultural norms of the society that has admitted them;
  • fusion and synthesis of various cultures, from which a "New American" culture would be born; and
  • multiculturalism, in which subcultures maintain their distinct identity.
Interestingly, these three competing models are also the conceptual parameters of the 1990s debates on globalization, and whether the emerging world order fosters diversity or is really an "artificially intelligent" way of imposing Anglo-American domination (e.g., Benjamin Barber's "Jihad vs. McWorld: How Globalism and Tribalism Are Reshaping the World"). This itself mirrors Cold War-era foreign policy debates on whether the 20th-century American project ("liberal internationalism") of "liberating" the world by promoting the spread of liberty and democracy is itself merely a sneaky form of imperialism (so insidious that it even fools the American leadership that buys into it).

These normative debates on how people should act might fail to notice the descriptive realities of how ethnic groups actually do act. The reality of how groups behave turns out to be a mix of all three models: ethnic groups simultaneously conform, create new cultures and also remain distinct. Moreover, there is a huge element of "global civilization" and "mass culture" in ethnic cultures distinct from those three models. Making things even more complicated, "global civilization" is composed to a remarkable extent of creative, even subversive, proudly local popular cultures of lower-class minorities -- for example, the rap music of urban America -- that have been packaged and sold by multinational corporations that are usually associated with bland, generic "mass culture".

But back to the distinction between "host cultures" and "local cultures" made above by the travel writer. Usually the host culture is associated with the majority population ("British culture"), and local cultures refer to particular subsets of that population (rural Yorkshire). The typical use of these terms when applied to Hawaii, however, reverses this definition -- the local culture is associated with the majority, and the host culture is associated with Native Hawaiians who are a subset of that local population. Logically, Native Hawaiians should be considered a "native" element within the local culture, and this local culture would properly be considered the host culture of Hawaii.

However ... Hawaii might be a special case. As the travel writer explained, there is a sense that the Native Hawaiian culture exerted and still exerts a formative influence on the locality. Beyond the cultural influence of Native Hawaiians on contemporary Hawaii, one finds early influences in the legal-political institutions, from the very strong powers granted to Hawaii's governor (as opposed to Texas, where the governor has more limited power) down to simple government functions (in Hawaii, the deed to a house is kept not by the owner but by two separate State bureaucracies, an arrangement dating back to the Kingdom). This is to say that Hawaii's local population is to some extent still operating within a framework that is a holdover from a host society that was predominantly Native Hawaiian. Economically, Native Hawaiians might have lost ground in relation to the rest of Hawaii's population, but they -- as did their ancestors -- still exert an outsized cultural influence on the local population.
The concept of "cultural hegemony"
This confounds the notion of "cultural hegemony", usually understood as the upper classes imposing their values, attitudes and perspectives onto the general population. At one time in Hawaii, the Hawaiian upper classes did do just that, but that culture has persisted despite the waning of the Hawaiian upper classes. As stated above, it is ordinary Hawaiians who still exert a strong cultural influence on the local population.

https://en.wikipedia.org/wiki/Cultural_hegemony
In Marxist philosophy, cultural hegemony is the domination of a culturally diverse society by the ruling class who manipulate the culture of that society—the beliefs, explanations, perceptions, values, and mores—so that their imposed, ruling-class worldview becomes the accepted cultural norm; the universally valid dominant ideology, which justifies the social, political, and economic status quo as natural and inevitable, perpetual and beneficial for everyone, rather than as artificial social constructs that benefit only the ruling class.
The 2011 movie "The Descendants" intimates something like this.

https://en.wikipedia.org/wiki/The_Descendants
Matthew ("Matt") King is a Honolulu-based attorney and the sole trustee of a family trust of 25,000 pristine acres on Kauai. The land has great monetary value, but is also a family legacy.
By the end of the movie, Matt King rejects selling the land, arguing that although his family was originally Native Hawaiian, it married into a long series of white American families until its members became white Americans. The final scene in the movie involves multiple reconciliations.
Later, the three are at home sitting together sharing ice cream and watching television, all wrapped in the Hawaiian quilt Elizabeth had been lying in.
The film rankled a critic in Hawaii, Chad Blair, because of its elitism.

https://www.civilbeat.org/2012/04/15402-descent-into-haole-the-descendants-dissed/
Things seemed promising early on in the film: shots of boxy apartment buildings, congested freeways, Diamond Head devoid of rainfall, homeless people downtown, Chinatown shoppers, squatters on beaches. 
“Paradise?” says Clooney’s character. “Paradise can go fuck itself.” 
I thought: Tell it like it is, Alexander Payne! 
But that all changed, right about the time I learned that Clooney’s surname is King and that he’s descended from Hawaii royalty and haole aristocracy and that he lives in a beautiful house in Manoa (or Nuuanu or Tantalus; someplace like that) with a swimming pool and the kids go to Punahou and they hang out at the Outrigger Canoe Club and they own a beach and mountain on the Garden Isle. 
“The Descendants” had gone from hibiscus and Gabby Pahinui to a dashboard hula doll and Laird Hamilton in just minutes, and the film kept right on descending.
Blair implies that the book was written by a member of Hawaii's white elite, and that the non-white elite loved it.
On March 1, the Hawaii State Senate honored Kaui Hart Hemmings, the author of the book version of “The Descendants” and an advisor to the film. (And stepdaughter of former GOP state Sen. Fred Hemmings.) 
“We are very proud of Kaui Hart Hemmings and the role she played in showcasing Hawaii in her novel. She shares the story of life in our islands through the eyes of a Kamaaina, which everyone in the world would be able to appreciate,” said Senate Majority Leader Brickwood Galuteria, who presented the certificate. That’s from a Senate press release. 
“Scenery from Kauai’s iconic properties and landscapes are beautifully photographed and highlighted in the film, thanks to Kaui and the producers of the movie,” said Sen. Ron Kouchi of Kauai. “We are pleased with being able to share our island lifestyle with those who watch the movie.” 
“Our island lifestyle”? More like the richest 1 percent, the ones teeing up at Princeville.
The local politicians loved the movie, and maybe that in part reflects a complacent attitude toward hierarchy in Hawaii that winds through Hawaii's colonial era but was even more profound in the pre-contact era. The flip side of the Native Hawaiian legacy was authoritarianism, and one can see that strain in Hawaii's political culture, which otherwise exhibits a casual populism and egalitarianism.

On the other hand, Chad Blair applauds how a little bit of social criticism was sneaked into the movie in George Clooney's big speech, which is included in Chad Blair's article (in big bold letters).
We didn’t do anything to own this land — it was entrusted to us. Now, we’re haole as shit and we go to private schools and clubs and we can hardly speak Pidgin let alone Hawaiian. 
But we’ve got Hawaiian blood, and we’re tied to this land and our children are tied to this land. 
Now, it’s a miracle that, for some bullshit reason, 150 years ago, we owned this much of paradise. But we do. And, for whatever bullshit reason, I’m the trustee now and I’m not signing.
In the case of Hawaii, the term "host culture" might be understood as the culture that is secretly hegemonic amidst a mix of all sorts of ethnicities. "The Descendants" does a pretty good job of showing how among those who descend from the old white elite, there is still some sort of residue of values and attitudes that derive from Native Hawaiian culture. That might be something that people in Hawaii -- and their politicians -- could recognize when they saw the movie.
California uber alles?
A sharp contrast to "The Descendants" might be found in the 2010 comedy-drama "The Kids Are All Right".

https://en.wikipedia.org/wiki/The_Kids_Are_All_Right_(film)
Nic (Annette Bening) and Jules (Julianne Moore) are a married same-sex couple living in the Los Angeles area. Nic is an obstetrician and Jules is a housewife who is starting a landscape design business. Each has given birth to a child using the same sperm donor.
The kids locate the donor, Paul, who has an affair with Jules and wants to marry her and adopt the kids. This causes a crisis, but by the end of the movie, there is a reconciliation.
The next morning, the family drives Joni to college. While Nic and Jules hug Joni goodbye, they also affectionately touch each other. During the ride home, Laser tells his mothers that they should not break up because they are too old. Jules and Nic giggle, and the film ends with them smiling at each other and holding hands.
The actual feel of the final scene of the movie is not as warm as a description of it might suggest. Joni breaks down as she watches her family drive away, because her parents may split, but also perhaps because she is now an adult and on her own. It is a very different final scene from that of "The Descendants". It's much more classically American, and all very Californian.

["The Kids Are Alright", 2010, final scene, "You're too old."]

https://www.youtube.com/watch?v=eodbWmxFgOU


Does California exert an outsized cultural influence on suburban America? And insofar as California is culturally a proverbial "postcard from the future", does California help the rest of America deal with social changes that California has already undergone before other places (for example, the blended family of the "Brady Bunch")? That is, through a television industry that is very Californian, does the rest of middle-class, suburban America find a culture that it can relate to despite the provincialism of its own surroundings? For example, the TV show "Freaks and Geeks" was set in the Midwest in the 1980s, but that region was still adjusting to the realities that had already beset more urban places like California in the 1970s -- realities like the rise of lonely latchkey children.

https://www.nytimes.com/interactive/2017/08/13/arts/freaks-and-geeks.html
A little setup: Bill Haverchuck (Martin Starr) is having a terrible time of it. He's a latchkey kid. He’s horrible in phys-ed class. And he learns, in this episode, that his mom is seriously dating his gym teacher, whom he hates. 

First, the music. “I’m One” by the Who, from the 1973 album “Quadrophenia.” It builds from mournfulness (“I’m a loser / No chance to win”) to a defiant chorus. And it's a great example of how “Freaks and Geeks” chose its soundtracks. The episode is set in 1981, but it avoids on-the-nose ’80s-song choices. Paul Feig, the show’s creator, once told me that the thing about the early ’80s in the Midwest was that they were really still the ’70s.
The old switcheroo
If the dominant values can be a product of social groups that are not dominant (Native Hawaiians in Hawaii) or are not physically present in the locality (Californian influence in the Midwest, or French hegemony in traditional Europe), could one group's secret influence be replaced by that of another group?

It could be that through the latter half of the 20th century, cultural hegemony was exerted in Hawaii by local ethnic Japanese who disproportionately made up the schoolteachers and politicians in this period (in fact, a potential majority).

https://en.wikipedia.org/wiki/Japanese_in_Hawaii
The Japanese in Hawaii (simply Japanese or "Local Japanese", rarely Kepanī) are the second largest ethnic group in Hawaii. At their height in 1920, they constituted 43% of Hawaii's population.[2] They now number about 16.7% of the islands' population, according to the 2000 U.S. Census. The U.S. Census categorizes mixed-race individuals separately, so the proportion of people with some Japanese ancestry is likely much larger.
During elections in Hawaii, politicians are constantly talking about the welfare of the "keiki" and the "kupuna". The Hawaiian language is being utilized, but the target audiences are Asians in general and the coveted Japanese voting bloc in particular. The image and identity of Hawaii might be located in a Polynesian past, but during the second half of the 20th century this masked the hegemony of Asian American values and attitudes. Similarly, Japanese American soldiers in WW2 supposedly suffered the highest level of fatalities in that war among American soldiers, and the entrenched rhetoric of patriotic self-sacrifice that one still encounters in Hawaii among the local Japanese ironically serves as a vehicle for perpetuating culturally conservative Asian values under the guise of Americanization.

Asian cultural hegemony in Hawaii might have been crucial in terms of accelerating Hawaii's economic development in the second half of the 20th century. As became clear in the 2016 American presidential election, there is in places like rural West Virginia a segment of the population that identifies with family, community, tradition, religion and hard physical labor, and it resists urbanization and formal education. In Hawaii, the Asian and especially the Japanese population identify with those culturally conservative values, but they also put a premium on professional orientation and technical education, and this serves as a crucial bridge into the future. Without this cultural bridge between the rural community-oriented working class and the individualistic middle-class suburbs, there might not have been much difference between the economic fortunes of Hawaii and a place like Puerto Rico. Moreover, the idea that you can "have it both ways" in Hawaii means that almost anyone can live in Hawaii and, in their own little niche, enjoy a very high quality of life (if they can afford it).
That was then and this is now
That being said, the 21st century might be another matter entirely. Cultural hegemony in Hawaii might have shifted yet again, this time toward mainstream American culture.

For example, the orientation of children of Asian immigrants in Hawaii today seems quite different from the past. In the past, Asian immigrants in Hawaii would typically assimilate into the local culture; today, they assimilate just as much into American culture. There has long been a local literature in Hawaii that stresses dysfunctional continuities with the past. This contrasts with an Asian American literature in the continental USA that stresses the profound chasms between the generations, signifying not just a generation gap but thousands of years of difference ("The Joy Luck Club"). Today, however, the young writers in Hawaii who are the educated offspring of immigrants engage in a form of literature that is typically Asian American. The torch has passed to a new generation.

Monday, August 26, 2019

Old thinking, new realities (2020 recession, trends v. events)

The 2020 recession
The economist Nouriel Roubini argues that there is going to be a recession in 2020. Unfortunately, at some point the USA will not be able to use the usual stimulus measures to revive the economy. This is because there will be several "permanent supply shocks" at work, including trade and technology conflict between the USA and China, as well as military conflict between the USA and Iran that would lead to rising oil prices. In the face of inflation, the USA will not be able to stimulate the economy in the midst of recession because this could trigger hyperinflation.

https://www.theguardian.com/business/2019/aug/23/global-recession-immune-monetary-solution-negative-supply-shock
All three of these potential shocks would have a stagflationary effect, increasing the price of imported consumer goods, intermediate inputs, technological components and energy, while reducing output by disrupting global supply chains. Worse, the Sino-American conflict is already fuelling a broader process of deglobalisation, because countries and firms can no longer count on the long-term stability of these integrated value chains. As trade in goods, services, capital, labour, information, data and technology becomes increasingly balkanized, global production costs will rise across all industries.
Roubini is known as "Dr. Doom" because he is a pessimist. As far back as 2005 he accurately predicted the meltdown in real estate and financial markets. Roubini worked in the Clinton administration, and is associated with the Democratic Party.

Here is a list of economists who accurately predicted the 2008 meltdown:

https://www.businesstoday.in/top-story/these-people-predicted-the-2008-recession-and-were-laughed-at/story/283071.html

It is said that very few economists foresaw the financial crisis of 2008. Actually, quite a few topnotch economists predicted the 2008 crisis. The problem was that few listened.

Just prior to that crisis, the premier economist within the Republican Party was David Rosenberg. He accurately explained why the USA was headed into the worst recession since the Great Depression. He is again saying something like that now. For instance, there is a bubble in corporate debt.

https://www.forbes.com/sites/greatspeculations/2019/04/08/what-ballooning-corporate-debt-means-for-investors/#11869fa9636c
Since the last recession, nonfinancial corporate debt has ballooned to more than $9 trillion as of November 2018, which is nearly half of U.S. GDP. As you can see below, each recession going back to the mid-1980s coincided with elevated debt-to-GDP levels—most notably the 2007-2008 financial crisis, the 2000 dot-com bubble and the early '90s slowdown. 

Through 2023, as much as $4.88 trillion of this debt is scheduled to mature. And because of higher rates, many companies are increasingly having difficulty making interest payments on their debt, which is growing faster than the U.S. economy, according to the Institute of International Finance (IIF). 
On top of that, the very fastest-growing type of debt is riskier BBB-rated bonds—just one step up from “junk.” This is literally the junkiest corporate bond environment we’ve ever seen. 
Combine this with tighter monetary policy, and it could be a recipe for trouble in the coming months.
Along with the bubble in corporate debt, there is a bubble in consumer spending. Earnings are down, and spending is being financed by borrowing.

https://www.cnbc.com/2019/08/21/consumer-spending-is-a-bubble-waiting-to-burst-david-rosenberg-warns.html
“What no one seems to talk about is the underlying fundamentals behind the consumer are actually deteriorating before our very eyes,” the firm’s chief economist and strategist said Tuesday on CNBC’s “Futures Now. ” 
Yet the latest economic data suggests the consumer, considered the main driver of the U.S. economy, is on solid ground. Commerce Department figures show retail sales in July rose 0.7%, after a 0.3% increase in June. 
In a recent note, Rosenberg criticized the government’s retail report. He estimated the bullish retail sales number was completely financed by credit and, therefore, unsustainable. 

According to Rosenberg, the problems lie in the Bureau of Labor Statistics’ real average weekly earnings decline in July of 0.3% from June.
“It’s actually been flat to down now in six of the past eight months,” said Rosenberg. “I’m actually wondering how long the consumer is actually going to keep it up and hold the glue together for the economy when real incomes are starting to subside as much as they are.”
In order to maintain a middle-class lifestyle, Americans are going into debt.

https://www.wsj.com/articles/families-go-deep-in-debt-to-stay-in-the-middle-class-11564673734?mod=rsswn
The American middle class is falling deeper into debt to maintain a middle-class lifestyle. 
Cars, college, houses and medical care have become steadily more costly, but incomes have been largely stagnant for two decades, despite a recent uptick. Filling the gap between earning and spending is an explosion of finance into nearly every corner of the consumer economy.
Average tuition at public four-year colleges, however, went up 549%, not adjusted for inflation, according to data from the College Board. On the same basis, average per capita personal health-care expenditures rose about 276% over a slightly shorter period, 1990 to 2017, according to data from the Centers for Medicare and Medicaid Services.
And average housing prices swelled 188% over those three decades, according to the S&P CoreLogic Case-Shiller National Home Price Index.
Growing debt reflects confidence in the economy, but when borrowers bet wrong and the economy sours, debt blows up in their faces -- and in the economy. The crucial question is whether the debt is paying for investments or for luxuries. Going into debt to buy status symbols makes people poorer, and this increases wealth inequality.
Taking on a mortgage to buy a house that could appreciate, or borrowing for a college degree that should boost earning power, can be wise decisions. Borrowing for everyday consumption or for assets such as cars that lose value makes it harder to save and invest in stocks and real estate that tend to create wealth. So the rise in consumer borrowing exacerbates the wealth gap.
A case study suggests that typical borrowing is for both investments and luxury.
Jonathan Guzman and Mayra Finol earn about $130,000 a year, combined, in technology jobs. Though that is more than double the median, debt from their years at St. John’s University in New York has been hard to overcome. 
The two 28-year-olds in West Hartford, Conn., have about $51,000 in student debt, plus $18,000 in auto loans and $50,000 across eight credit cards. Adding financial pressure are a baby daughter and a mortgage of around $270,000. 
“I’m normally a worrier, but this is next-level stuff. I’ve never been more stressed,” Mr. Guzman said. “Never would I have thought with the amount we make I would have these problems.” 
They no longer dine out several times a week. Other hits to their budget were hard to avoid, such as a wrecked car that forced them to borrow more.
First of all, these are not middle-class people; they are upper-middle class. Second, in an earlier generation, middle-class people did not eat out several times a week; they ate out maybe several times a month or several times a year. And when they did dine out, they ate food like chicken and waffles.

https://www.laweekly.com/what-to-eat-during-hbos-mildred-pierce-a-fried-chicken-recipe/

One consistent pattern in the rising cost of cars, college, houses and medical care is that everything has changed, but the way of thinking remains stuck in the past.
Cars (actually, trucks and SUVs)
The rising price of automobiles is telling, and so is the rising method of paying for cars -- borrowing.
Nowhere is the struggle to maintain a middle-class lifestyle more apparent than in cars. The average new-car price in the U.S. was $37,285 in June, according to Kelley Blue Book. It didn’t deter buyers. The industry sold or leased at least 17 million cars each year from 2015 to 2018, its best four-year stretch ever. Partly because of demand satisfied by that run, sales are projected to be off modestly this year.
How households earning $61,000 can acquire cars costing half their gross income is a story of the financialization of the economy. Some 85% of new cars in the first quarter of this year were financed, including leases, according to Experian. That is up from 76% in the first quarter of 2009. 

And 32% of new-car loans were for six to seven years. A decade ago, only 12% were that long. The shorter-term loans of the past gave many owners several years of driving without car payments. 
Now, a third of new car buyers roll debt from their old loans into a new one. That’s up from roughly 25% in the years before the financial crisis. The average amount rolled into the new loan is just over $5,000, according to Edmunds, an auto-industry research firm. 

Leasing, which often entails lower payments than purchase loans, accounted for 34% of financed new vehicles in the first quarter, up from 20% a decade earlier, according to Experian. Drivers of used cars also finance them—more than half did last year.
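To make the financing arithmetic concrete, here is a minimal sketch in Python. The $37,285 price is the Kelley Blue Book average quoted above; the 6% APR and the assumption that the full price is financed are illustrative, not figures from the article. It shows why stretching a loan from four to seven years makes an expensive vehicle feel affordable while raising the total amount paid:

# Minimal sketch: standard amortized loan payment (principal and interest).
# The 6% APR and 100% financing are illustrative assumptions.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

price = 37_285                        # average new-car price cited above
for years in (4, 7):
    pay = monthly_payment(price, 0.06, years)
    total = pay * years * 12
    print(f"{years}-year loan: ${pay:,.0f}/month, ${total:,.0f} paid in total")

Under these assumptions, the seven-year loan cuts the monthly payment by roughly a third (from about $876 to about $545) while adding several thousand dollars of interest over the life of the loan -- which is how a household earning $61,000 can "afford" a vehicle costing half its gross income.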
The primary driver of the increasing price of automobiles is improved safety features. The gorgeous cars of the 1950s would today be seen as flying coffins. Some of the money that Americans save by not winding up in the hospital after a car accident, they must now pay up front when they buy a car.

https://www.nbcnews.com/business/autos/new-car-safety-technology-saves-lives-can-double-cost-repairs-n925536
So, dent a fender or smack a mirror and you might need to repair or replace one of these sensors. AAA found that in a minor front or rear collision involving a car with ADAS technology the repair costs can run as high as $5,300. That’s about $3,000 more than repairing the same vehicle without the safety features.
Consequently, the chances of dying in an automobile accident continue to fall (even as vehicle miles traveled have increased).

[Graph: U.S. traffic fatalities relative to distance traveled, 1900-2011]
http://blog.fusedgrid.ca/wp-content/uploads/2015/03/VKM-Fatalities-USA-Historic-1900-to20111.jpg

In a related development, automobiles are built to last much longer. The average lifespan of a car today is 12 years.

https://wolfstreet.com/wp-content/uploads/2018/08/US-auto-cars-trucks-average-age-2017.png

https://www.nytimes.com/2012/03/18/automobiles/as-cars-are-kept-longer-200000-is-new-100000.html
In the 1960s and ’70s, when odometers typically registered no more than 99,999 miles before returning to all zeros, the idea of keeping a car for more than 100,000 miles was the automotive equivalent of driving on thin ice. You could try it, but you’d better be prepared to swim.

But today, as more owners drive their vehicles farther, some are learning that the imagined limits of vehicular endurance may not be real limits at all. Several factors have aligned to make pushing a car farther much more realistic.
As with safety features, the increasing longevity of motor vehicles is partly a result of government regulation.
Customer satisfaction surveys show cars having fewer and fewer problems with each passing year. Much of this improvement is a result of intense global competition — a carmaker simply can’t allow its products to leak oil, break down or wear out prematurely.
But another, less obvious factor has been the government-mandated push for lower emissions.
“The California Air Resources Board and the E.P.A. have been very focused on making sure that catalytic converters perform within 96 percent of their original capability at 100,000 miles,” said Jagadish Sorab, technical leader for engine design at Ford Motor. “Because of this, we needed to reduce the amount of oil being used by the engine to reduce the oil reaching the catalysts.
“Fifteen years ago, piston rings would show perhaps 50 microns of wear over the useful life of a vehicle,” Mr. Sorab said, referring to the engine part responsible for sealing combustion in the cylinder. “Today, it is less than 10 microns. As a benchmark, a human hair is 200 microns thick.
“Materials are much better,” Mr. Sorab continued. “We can use very durable, diamondlike carbon finishes to prevent wear. We have tested our newest breed of EcoBoost engines, in our F-150 pickup, for 250,000 miles. When we tear the engines down, we cannot see any evidence of wear.”
Another reason for more durable cars is competition from Japan.
The trend toward better, longer-lasting cars seems to have begun way back in the ’60s, when the first imports from Asia started to encroach on American and European carmakers’ sales figures.

Another factor is that cars from the ’60s and ’70s were susceptible to rust and corrosion — many literally fell apart before their engines and transmission wore out. But advances in corrosion protection, some propelled by government requirements for anticorrosion warranties, have greatly reduced that problem.
“Competition is part of it,” said Peter Egan, a former auto mechanic and now editor at large of Road & Track magazine. “Japanese cars kind of upped everyone’s game a bit. With some exceptions, the engines would go a long time without burning oil or having other major problems.”
Hyundai and Kia, the South Korean carmakers, now include 100,000-mile/10-year warranties on their cars’ powertrains. If a relatively abusive driver can count on no major mechanical failures before 100,000 miles, a careful owner can — and does — expect his car to go much farther.
In a sense, American cars slowly morphed into German cars -- more solid, safer, more expensive. Americans, however, have not adopted German habits of automobile ownership. Germans buy cars with cash, not credit, and keep the car forever, replacing every part of a car until it essentially becomes a new car. The American mentality is rooted in the 1950s practice of trading in a car for a better car every few years, a practice based on two assumptions: 1) the income of the car owner will rise in the interval, and 2) a car is the equivalent of a disposable razor blade that needs to be swapped out.

Also, Americans generally don't buy cars; they buy trucks and SUVs. Seven of the top ten best-selling vehicles in the USA in 2018 were trucks and SUVs. While Americans swear that they need these big vehicles for practical reasons, these vehicles are actually much less practical.

https://www.foxnews.com/auto/the-10-best-selling-vehicles-in-the-united-states-in-2018-were-mostly-trucks-and-suvs

https://www.nytimes.com/2016/12/31/us/texas-truck-culture.html

In sum, automobile technology has changed, but Americans haven't.
College
The pattern of new realities afflicted by old ways of thinking also holds for the rising cost of college.
Back in the 1950s, only about 5% of American adults completed college. That percentage now approaches 30%.

https://en.wikipedia.org/wiki/Educational_attainment_in_the_United_States

Relatively few people went to college, and they tended to be either the wealthiest people or the smartest people. Most likely, the banker was the son (and grandson) of a banker, went to Harvard and got mediocre grades, and focused instead on getting into the right clubs and marrying the right woman from the right family. In fact, studying was seen as unseemly. Elite universities were in cahoots with the American aristocracy's self-replication.

https://www.quora.com/What-is-a-gentleman%E2%80%99s-C
Given the decades-long history of grade inflation, particularly in Ivy League schools in the non-engineering disciplines, a “gentleman’s C” is proxy for “it should have been an F but we are too civilized to tarnish your record as such.” 
In graduate schools this is perhaps even more pronounced. 
I went to a top 30 or so law school where the grades were curved to a strict 3.17. 
The saying there was “It’s hard to get an A; it’s even harder to get a C.” In practice the distribution was something like 15% As, 75% Bs, 10% Cs in each class. 
I never heard of anyone actually failing, but it was well known that a C+ was the equivalent of a D, and a C- was tantamount to an F.
This stands in contrast to those students who graduated from elite schools and who were not from the elite. The educational career of Robert McNamara is a case in point.

https://en.wikipedia.org/wiki/Robert_McNamara
Robert Strange McNamara (June 9, 1916 – July 6, 2009) was an American business executive and the eighth United States Secretary of Defense, serving from 1961 to 1968 under Presidents John F. Kennedy and Lyndon B. Johnson

Robert McNamara was born in San Francisco, California.[3] His father was Robert James McNamara, sales manager of a wholesale shoe company, and his mother was Clara Nell (Strange) McNamara.[5][6][7] His father's family was Irish and, in about 1850, following the Great Irish Famine, had emigrated to the U.S., first to Massachusetts and later to California.[8] He graduated from Piedmont High School in Piedmont in 1933, where he was president of the Rigma Lions boys club[9] and earned the rank of Eagle Scout. McNamara attended the University of California, Berkeley and graduated in 1937 with a B.A. in economics with minors in mathematics and philosophy. He was a member of Phi Gamma Delta fraternity,[10] was elected to Phi Beta Kappa his sophomore year, and earned a varsity letter in crew. McNamara before commissioning into the Army Air Force, was a Cadet in the Golden Bear Battalion at U.C. Berkeley [11] McNamara was also a member of the UC Berkeley's Order of the Golden Bear which was a fellowship of students and leading faculty members formed to promote leadership within the student body. He then attended Harvard Business School, where he earned an M.B.A. in 1939.
Immediately thereafter, McNamara worked a year for the accounting firm Price Waterhouse in San Francisco. He returned to Harvard in August 1940 to teach accounting in the Business School and became the institution's highest paid and youngest assistant professor at that time.
UC Berkeley charged no tuition until Governor Ronald Reagan pushed to impose tuition in the late 1960s. Berkeley might be understood to have been a free public school for the intellectual elite (like City College of New York) that has become a partly subsidized private elite university for the elites in general.

https://thebottomline.as.ucsb.edu/2017/10/a-brief-history-of-uc-tuition

It is often stated that there is a hierarchy of intellectual rigor in the university.
  • STEM fields like math are the most difficult.
  • The social sciences and humanities are not quite as hard.
  • Business degrees are the easiest.
Historically, the aristocracy that went to college majored in things like history and business -- things that would teach practical skills and critical thinking skills useful for leadership without being unduly difficult. The ordinary person, in contrast, would learn vocational skills, often on the job. The intellectual elite not born into wealth might have a trajectory very similar to McNamara's. They might major in a practical and rigorous field like economics but they would also be fascinated by abstract, rigorous fields like math and philosophy. (In the 2003 documentary "The Fog of War", McNamara says that he had never been so excited in his life as when he discovered philosophy in college.)

Rich students study general things; smart students study difficult things; everyone else learns a trade. The Bush dynasty is a case in point. President George W. Bush majored in history at Yale; his father, President George H.W. Bush, majored in economics at Yale; his father, Senator Prescott Bush, attended Yale (major unknown); his father, the industrialist Samuel Bush, attended Stevens Institute of Technology; his father, Rev. James Smith Bush, attended Yale; his father, Obadiah Bush, was a schoolteacher, prospector and abolitionist; his father, Timothy Bush, was a blacksmith. As the Bushes moved up in the world, they went from the trades at the bottom to a generalized education for the aristocracy.

As more people go to college, this pattern might have become inverted.

In order to graduate from college, academically challenged students whose parents did not go to college gravitate to the social sciences and humanities, which have a reputation for being easy, "Mickey Mouse" majors. Ironically, this is the kind of academic career that once typified the aristocracy. One problem is that this liberal arts education for ordinary students is funded today by student loans, not by trust funds from Swiss bank accounts. Another problem is that unless one is going to Yale and one's father is President of the United States of America, such a degree is only really good for going to graduate school. Graduate school is perfectly fine for the likes of Robert McNamara, but does it really make sense for 90% of the population whose IQ is below 120?
In the meantime, the aristocracy has developed its own quasi-vocational career path that includes a traditional aristocratic education. In the UK, the PPE course of study is exemplary.

https://en.wikipedia.org/wiki/Philosophy,_politics_and_economics
Philosophy, politics and economics or politics, philosophy, and economics (PPE) is an interdisciplinary undergraduate/postgraduate degree which combines study from three disciplines.  
The first institution to offer degrees in PPE was the University of Oxford in the 1920s. This particular course has produced a significant number of notable graduates such as Aung San Suu Kyi, Burmese politician, State Counsellor of Myanmar, Nobel Peace Prize winner; Princess Haya bint Hussein daughter of the late King Hussein of Jordan and wife of the ruler of Dubai; Christopher Hitchens, the British–American polemicist, [1][2] Oscar winning writer and director Florian Henckel von Donnersmarck; Philippa Foot a British philosopher; Harold Wilson, Edward Heath and David Cameron, former Prime Ministers of the United Kingdom; Hugh Gaitskell, William Hague and Ed Miliband, former Leaders of the Opposition; former Prime Minister of Pakistan Benazir Bhutto and current Prime Minister of Pakistan Imran Khan; and Malcolm Fraser, Bob Hawke and Tony Abbott, former Prime Ministers of Australia.[3][4] The course received fresh attention in 2017, when Nobel Peace Prize winner Malala Yousafzai earned a place.[5][6] 

In the 1980s, the University of York went on to establish its own PPE degree based upon the Oxford model; King's College London, the University of Warwick, the University of Manchester, and other British universities later followed. According to the BBC, the Oxford PPE "dominate[s] public life" (in the UK).[7] It is now offered at several other leading colleges and universities around the world. More recently Warwick University and King’s College added a new degree under the name of PPL (Politics, Philosophy and Law) with the aim to bring an alternative to the more classical PPE degrees.
The PPE major is a weird hybrid in that it is modern vocational training for an established aristocracy.
Oxford PPE graduate Nick Cohen and former tutor Iain McLean consider the course's breadth important to its appeal, especially "because British society values generalists over specialists". Academic and Labour peer Maurice Glasman noted that "PPE combines the status of an elite university degree – PPE is the ultimate form of being good at school – with the stamp of a vocational course. It is perfect training for cabinet membership, and it gives you a view of life". However he also noted that it had an orientation towards consensus politics and technocracy.[4] 
Geoffrey Evans, an Oxford fellow in politics and a senior tutor, critiques that the Oxford course's success and consequent over-demand is a self-perpetuating feature of those in front of and behind the scenes in national administration, in stating "all in all, it's how the class system works". In the current economic system he bemoans the unavoidable inequalities besetting admissions and thereby enviable recruitment prospects of successful graduates. The argument itself intended as a paternalistic ethical reflection on how governments and peoples can perpetuate social stratification.[7] 
Stewart Wood, a former adviser to Ed Miliband who studied PPE at Oxford in the 1980s and taught politics there in the 1990s and 2000s, acknowledged that the programme has been slow to catch up with contemporary political developments, saying that "it does still feel like a course for people who are going to run the Raj in 1936... In the politics part of PPE, you can go three years without discussing a single contemporary public policy issue". He also stated that the structure of the course gave it a centrist bias, due to the range of material covered: "...most students think, mistakenly, that the only way to do it justice is to take a centre position".[4]
To what extent is the quasi-vocational training of the elites a disaster? After the 2016 election, Barack Obama lectured Mark Zuckerberg on why Facebook's mission to "connect the world" and its policy to "move fast and break things" was maybe not such a good idea, yet Zuckerberg only came away confused. Focused on computer science, Zuckerberg seems never to have learned how to think while he was at Harvard.

Silicon Valley's Peter Thiel argues that college is obsolete and needs to be disrupted. The standard reply to Thiel is that the irrelevance of college only applies to a few Silicon Valley success stories like Bill Gates, Steve Jobs and Zuckerberg, men who dropped out of elite universities or prestigious colleges. But Zuckerberg's cluelessness is evidence that even though Thiel is wrong, the usual criticism of Thiel is even more wrongheaded. That is, future decision-makers in particular should be like Robert McNamara and be exposed to, fascinated by and engaged in the humanities. In fact, Steve Jobs would insist as much. (Jobs's great disappointment in life was dropping out of Reed College because, having been adopted by a working-class family, he lacked the funds and the educational preparation, as he angrily recounted in his Stanford speech.)

The point is that in the past, very few people -- the aristocracy, men of talent -- went to college. The current assumption that everyone should go to college is based on assuming that what was once true for those few who did go to college will also apply to everyone.
  • "Men of talent have rising incomes thanks to their college diplomas." This is a true statement.
  • "The Aristocracy is generally not so talented, but they can handle college." This is likewise true.
  • "The liberal arts are an excellent preparation for life." This is true for both the aristocracy and the Talented Tenth.
  • "There is a correlation between a college degree and higher salaries." This is true for men of talent.
But these statements do not apply to 90% of the population who neither possess superior ability nor attended elite preparatory academies. The reality is that as more people go to college, the college degree becomes less useful in the marketplace, and this is particularly true for the liberal arts.
The thesis here is that everything has changed, but our way of thinking is stuck in the past. The irony is that the previous system of higher education might serve as a better model.
  • Elite but easy education for the aristocracy, as long as they pay for it (Bushes);
  • Elite, rigorous education for the Talented Tenth, free of cost (McNamara); and
  • Vocational training for 90%, free of cost (although it might be called "college").
There would be no student debt in this model and, in fact, the cost of "college" would fall. Students would attend state universities for free and they would take classes in the humanities and social sciences as part of the general curriculum, but only the Talented Tenth would be allowed to major in such liberal arts disciplines. In fact, 90% of students would be encouraged to first attend a (free) community college and attain a vocational degree. They would be encouraged to patronize their local library.
Houses
The rising cost of houses in the USA has many unexpected causes. Home sizes keep increasing, even while family sizes continue to decrease. Homeowners, many of them liberals, oppose upzoning their neighborhoods for higher density, which would lower home prices and make homes affordable for young families. Lonely seniors will not downsize and move out of their big houses -- a lifestyle that is also becoming popular among younger childless couples in the suburbs.

One common theme on rising home prices is young Millennials conflicting with the older generation(s). The classic stereotype of Millennials is of the minimalist lifestyle: small, spare rented apartments in the city with few possessions. However, studies show that Millennials somehow expect to someday own a house even larger than their parents' house.

As Albert Einstein said about nuclear war, everything has changed except the way we think.
Medical care
The only things I know about healthcare policy are from reading articles by the surgeon Atul Gawande.

https://www.newyorker.com/contributors/atul-gawande

However, I do know a little about healthcare spending on pets. During the worst days of the 2009 recession, spending on all things fell except for two categories: vitamins and pets. People ate less food and ate cheap food, and supplemented their diets with vitamins. They reexamined their lives and concluded that the best thing in their lives was their pet, and so they spent more on pet food and veterinary care.

[Charts: rising cost of veterinary care]
https://www.economist.com/sites/default/files/20170114_WOC583_3.png
https://www.cross-check.com/hs-fs/hub/138250/file-17679504-jpg/images/veterinary_expenditures_-_consumer_price_level.jpg

This is a "crisis trend", in which a trend is exposed and accelerated by crises like war and recession.
Let's contrast this with the past, specifically with the BBC classic "All Creatures Great and Small" about a veterinary clinic in rural northern England.

https://en.wikipedia.org/wiki/All_Creatures_Great_and_Small_(TV_series)

The primary job of veterinarians used to be to help large farm animals like cows and horses give birth. The primary tool of the trade was a stethoscope.

Today, dogs and cats are primarily companion animals, and as members of the family, people are willing to spend large amounts of money on them. The tools of the trade now include MRIs, insulin shots and stem cell therapy.

We talk about the "rising cost of technology", but technology drives costs down over time. What rises are expectations. People import expectations from their knowledge of human life spans and human medical technology and apply that to their pets. The lifespan of a serious working dog might be five years (which is the typical lifespan of a wolf in the wild). As a pet, a dog might now live 12 years (the lifespan of a wolf in captivity), but people assume that there is something wrong with this -- that it is too short a period. So while veterinary technology has been transformed, people have unrealistic expectations based on earlier ways of thinking (about humans).
Trends versus predictions
The forecasts of preeminent economists feel a bit like fortune-telling. The fortune teller will amaze the customer with insider knowledge and then make a very specific prediction about exactly what will happen in the future and precisely when it will happen.

The insider knowledge consists of processes and trends that no one else is observing. Out come the charts about previous downturns and statistics that everyone else has forgotten about. It seems plausible and illuminating.

The predictions have an odd certainty. These two countries will go to war and interest rates will go down by so many points. After so many months, the credit bubble will then burst.

It seems difficult enough to observe or at least try to apprehend trends without making predictions.

One general trend now is for the USA finally to move away from global leadership, following the demise of the Soviet Union and the end of the Cold War almost three decades ago. "America First" isolationism might not have been inevitable, but the old role of global policeman began to ring hollow over a generation. In this situation, one would expect increasingly open conflict between American allies like Japan and South Korea, and regional trade conflicts would ensue. So the collapse of free trade might be much worse than Roubini expects from the current tariff spats between the USA and China.

Thursday, August 22, 2019

Community resilience indicators (Gross National Antifragility?)

Community resilience indicators
How would society survive a great disaster? Nassim Taleb argues that the profoundly destructive Black Swan events that define human history are fundamentally unpredictable. Nevertheless, he argues, such events can be planned for by making society more "antifragile". Up to a point, antifragile systems not only manage to survive uncertainty and hardship, but end up benefiting from such shocks.

How does one make society antifragile in terms of natural disasters?

It's FEMA to the rescue.

The Federal Emergency Management Agency has compiled commonly used resilience indicators.
They are listed in Table 1 on page 7 of the following PDF.

https://www.fema.gov/media-library-data/1549906639681-ac6f6d5fb54af1649f0077feed876b9e/Community_Resilience_Indicator_Analysis_December_2018_508.pdf

Population-Focused Indicators:
  • Educational Attainment
  • Unemployment Rate
  • Disability
  • English Language Proficiency
  • Home Ownership
  • Mobility (geographic)
  • Age
  • Household Income
  • Income Inequality
  • Health Insurance
  • Single-Parent Household
Community-Focused Indicators:
  • Connection to Civic and Social Organizations
  • Hospital Capacity
  • Medical Professional Capacity
  • Affiliation With a Religion
  • Presence of Mobile Homes
  • Public School Capacity
  • Population Change
  • Hotel/Motel Capacity
  • Rental Property Capacity
Taking all of these resilience indicators into account, the more southern regions of the USA seem most vulnerable, in particular Puerto Rico.

https://cdn.theatlantic.com/assets/media/img/posts/2019/08/Screen_Shot_2019_08_16_at_3.02.51_PM/7f1a46264.png

A new study finds that those cities which are most vulnerable to natural disasters -- heat waves, flooding, rising seas, drought -- are the least prepared.

https://www.citylab.com/environment/2019/08/climate-impacts-resilient-cities-environmental-justice/596251/
"Gross National Antifragility"
To some extent, the list of resilience indicators corresponds with factors linked with social mobility.

https://www.smartcitiesdive.com/ex/sustainablecitiescollective/income-mobility-rankings-harvard/1070681/
  • less segregation by income and race;
  • lower levels of income inequality;
  • better schools;
  • lower rates of violent crime; and
  • a larger share of two-parent households.
This has implications for how we define a flourishing city.

Currently, there is a tendency to look at urban growth in terms of population and incomes.

https://www.citylab.com/life/2019/08/job-ranking-top-cities-population-growth-census-data-us/596485/

The concept of national growth always needs to be interrogated.

At the national level, economic progress was at one time measured by how many railroad cars a country possessed. This was later replaced with measures like GNP and GDP. Serious alternatives to these have been proposed.

https://en.wikipedia.org/wiki/Gross_domestic_product#Proposals_to_overcome_GDP_limitations
In terms of resilience, perhaps cities and nations could pursue "Gross National Antifragility".

Again, community resilience indicators seem to resonate both with social mobility factors and with Taleb's notion of "antifragility".

This resonance might also hold true for some of the indicators for alternative measures of national well-being. These too could be incorporated as indicators of Gross National Antifragility.
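As a purely illustrative sketch of what a composite "Gross National Antifragility" score might look like, here is one way to roll such indicators up into a single number. The indicator names, bounds, weights and sample values below are hypothetical; they are not FEMA's methodology or anyone's official index.

# Hypothetical composite resilience score; indicator names, bounds, weights and
# sample values are invented for illustration only.

def normalize(value, low, high):
    """Scale a raw indicator onto 0..1 given plausible bounds."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

# indicator: (raw value, plausible low, plausible high, +1 if more is better, weight)
indicators = {
    "educational_attainment_pct": (88.0, 50.0, 100.0, +1, 0.15),
    "unemployment_rate_pct":      ( 6.5,  0.0,  20.0, -1, 0.15),
    "health_insurance_pct":       (91.0, 60.0, 100.0, +1, 0.15),
    "income_inequality_gini":     ( 0.47, 0.30,  0.60, -1, 0.15),
    "hospital_beds_per_1k":       ( 2.4,  0.0,   6.0, +1, 0.20),
    "civic_org_membership_pct":   (35.0,  0.0,  80.0, +1, 0.20),
}

def resilience_score(inds):
    """Weighted sum of normalized indicators: 0 = most fragile, 1 = most resilient."""
    score = 0.0
    for raw, low, high, sign, weight in inds.values():
        x = normalize(raw, low, high)
        score += weight * (x if sign > 0 else 1.0 - x)
    return round(score, 3)

print(resilience_score(indicators))

The point of the sketch is only that such a score is easy to construct; the hard questions are which indicators to include and how to weight them.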
"Well-being": American happiness versus realistic joy
The real distinction to be made here is not between various indexes of well-being.

The real variation lies in temperament and attitude: in how well-being or "happiness" is defined apart from these indexes, and in how the indexes are subsequently interpreted.

The classic American attitude is that happiness is a permanent emotional endpoint achieved by accruing worldly possessions -- a materialistic version of the Christian concept of Heaven. In the past, this implied hard work, but this might no longer be the case. Hundreds of years ago, the "American dream" was religious freedom; later, this dream evolved into the goal of establishing a republic; by the 2000s, the American dream was to own a gigantic fiberboard house in the suburbs; by the 2010s, the American dream morphed into the "California dream" of attaining wealth and fame without effort or talent (Kardashians). By most indexes of well-being, the Kardashians are healthy and wealthy and empowered. So the indexes of well-being can be misleading in themselves when they are used to support the wrong value system. The things associated with happiness typically represent permanence, security, rootedness and finality: marriage, family, professional success, fame, houses, domesticity. This is a false sense of security, and it signifies a preference for the merely robust over the antifragile.

Antifragility likewise involves a notion of well-being -- specifically, the brief joy of managing to survive a trauma that makes one stronger. Health, education and wealth and other indexes of well-being are essential in order to survive hardship, and when hardship is overcome, health and wealth and education often increase. But this joy is based on temporarily overcoming serious obstacles, as opposed to the common fantasy of permanently escaping from obstacles by amassing a fortress of wealth. Although the various indexes are valid insofar as they promote antifragility, they are usually interpreted in a quasi-Christian, materialistic way as promoting security and consumerism.

Happiness is not the same thing as joy. Ordinary Americans choose the illusion of happiness, whereas the artist chooses a brief spasm of joy with each hard-won accomplishment.

["Le Plaisir", 1952, ending of final part, "La Modele"]




The indicators of well-being are misinterpreted by decision makers to emphasize quantity over quality (success means more money and more people and big houses, etc.), happiness over joy, the illusion of attaining permanent security over the reality of the temporary, the robust over the antifragile.

Monday, August 19, 2019

The plagues of 2100

My previous prediction:

The developing world will be swept by a series of viral epidemics that will eliminate most of the population. These plagues, however, will fail to devastate the more affluent developed world.

However, in the developed world, people with compromised immune systems -- the sick, the very young and the elderly -- will be subject to constant fatalities from antibiotic-resistant bacteria.

My new prediction:

Antibiotic resistance will have epidemic-like consequences for all age groups; moreover, this will happen all over the world, not just the wealthy developed world.

All this will happen around 2100, when the world population begins to decline.

The new information:

The overuse of antibiotics in agriculture globally has created antibiotic resistance all over the world, not just in the USA. Also, global warming has played an unexpected role.

https://www.nytimes.com/2019/04/06/health/drug-resistant-candida-auris.html
Last May, an elderly man was admitted to the Brooklyn branch of Mount Sinai Hospital for abdominal surgery. A blood test revealed that he was infected with a newly discovered germ as deadly as it was mysterious. Doctors swiftly isolated him in the intensive care unit.
The germ, a fungus called Candida auris, preys on people with weakened immune systems, and it is quietly spreading across the globe. Over the last five years, it has hit a neonatal unit in Venezuela, swept through a hospital in Spain, forced a prestigious British medical center to shut down its intensive care unit, and taken root in India, Pakistan and South Africa.
Recently C. auris reached New York, New Jersey and Illinois, leading the federal Centers for Disease Control and Prevention to add it to a list of germs deemed “urgent threats.”
The man at Mount Sinai died after 90 days in the hospital, but C. auris did not. Tests showed it was everywhere in his room, so invasive that the hospital needed special cleaning equipment and had to rip out some of the ceiling and floor tiles to eradicate it.
“Everything was positive — the walls, the bed, the doors, the curtains, the phones, the sink, the whiteboard, the poles, the pump,” said Dr. Scott Lorin, the hospital’s president. “The mattress, the bed rails, the canister holes, the window shades, the ceiling, everything in the room was positive.”
C. auris is so tenacious, in part, because it is impervious to major antifungal medications, making it a new example of one of the world’s most intractable health threats: the rise of drug-resistant infections.
https://www.bbc.com/news/health-49170866
The drug-resistant fungus, Candida auris, was only discovered 10 years ago, but is now one of the world's most feared hospital microbes.
There have been outbreaks across the world, and new research shows higher temperatures may have led to an increase in infections.
Most fungi prefer the cooler temperatures found in soil. But, as global temperatures have risen, C. auris has been forced to adapt to higher temperatures.
This may have made it easier for the fungus to thrive in the human body, which is warm at 36C to 37C.
https://www.cnn.com/2019/04/09/health/candida-auris-fungus-drug-resistance/index.html
When it comes to bacteria, drug-resistant infections affect 2 million people a year in the United States, killing at least 23,000, the CDC says. And drug-resistant infections more broadly could claim 10 million lives per year around the globe by 2050 -- up from today's 700,000, according to one estimate.
"We live in a world covered with antibiotics," Chiller said. "We really need to be thinking hard about how we use those drugs."
When will deaths from viruses and bacteria reach epidemic proportions?

Perhaps we can get clues from the past.
The plague of 1347
When the human population shrinks under economic and environmental stressors, humans become vulnerable to epidemics. That is what happened in the 14th century.

https://en.wikipedia.org/wiki/Black_Death
The Black Death, also known as the Great Plague or the Plague, or less commonly the Black Plague, was one of the most devastating pandemics in human history, resulting in the deaths of an estimated 75 to 200 million people in Eurasia and peaking in Europe from 1347 to 1351.
The Black Death is estimated to have killed 30% to 60% of Europe's population.[7] In total, the plague may have reduced the world population from an estimated 450 million to 350–375 million in the 14th century.[8] It took 200 years for the world population to recover to its previous level.
One curious thing is that Europe's population was already shrinking when the plague struck around 1350.

https://www.economics.utoronto.ca/munro5/L02MedievalPopulationC.pdf



Climate change was already disrupting European life and reducing the population, and the plague might have been just one more symptom of ongoing crisis.

https://en.wikipedia.org/wiki/Little_Ice_Age
The new plagues will strike in 2100 ( ... or 2055?)
World population is expected to peak in the year 2100 at 11 billion people. In these conditions, there will be increased resource scarcity coupled with shrinking demand, an economic double whammy of less supply and less demand. The large number of retirees in relation to the population means strained finances, both nationally and at the household level. (While increased immigration might seem like an obvious remedy, in conditions of economic stagnation, people tend to become more averse to immigrants.) Women will in that case fill the vacuum by making up a greater part of the workforce, altering social norms. Fewer children means greater investment in education per child. Fewer children also means the extinction of the extended family, which currently serves as a social safety net. Fewer young people means less creativity.

https://www.theatlantic.com/family/archive/2019/07/world-population-stop-growing/595165/

All of this is now happening in Japan. Japan is a demographic postcard from the future.

https://www.nytimes.com/2019/08/03/world/asia/japan-single-women-marriage.html

https://www.nytimes.com/2017/11/30/world/asia/japan-lonely-deaths-the-end.html

https://www.bbc.com/news/stories-47033704

According to a 2014 report by Deutsche Bank, world population will peak at 8.7 billion people in 2055. In this scenario, peak population and its consequences would happen globally within a generation.

https://www.cnbc.com/id/101018722

If population decline and all that goes with it (e.g., economic contraction) can be connected to the rise of epidemics, then one might expect Japan to be especially vulnerable to epidemics. Japan is therefore an important test case as a classic developed country losing population and suffering from chronic economic stagnation. However, in the past thirty years of Japanese economic torpor, Japan has not been laid waste by an epidemic. With a highly advanced economy, Japan is insulated from epidemics by superior technology and a superior healthcare infrastructure.

Perhaps another reason that Japan's population remains healthy despite Japan's economic malaise is Japan's intense social cohesion. The characteristics that allow Japan to function as a clean, efficient society in the face of crisis -- the willingness of the Japanese to work themselves to death, to tolerate a system in which promotion is based on seniority and not merit, to cooperate with one another, to be so obedient to authority -- are all expressions of a national social contract that assures every social group that it will not be forgotten and will be taken care of. This social contract gives Japan the unique ability to function even as its economy frays. But Japan's low birth rate is transforming it into a more atomized society, and the consequence for future generations is much less social cohesion. When Japan's social fabric and its economy both begin to fray, Japan may become much more vulnerable to pandemics.

(In the 2007 documentary "Young Yakuza", a Yakuza boss describes young Japanese as "aliens" who listen to rap music. For the young Japanese men in the film who drop out of society and profess to only care about their personal "freedom", the brotherhood of the Yakuza has no appeal. The documentary illuminates how those angry young men who reject a life in the Yakuza are much more disturbing than the violence and criminal activity of Japanese organized crime. The direction of social change in Japan might be toward this kind of unfulfilling and disengaged individualism.)

["Young Yakuza", 2007 documentary, trailer]

https://mubi.com/films/young-yakuza

Of course, this thesis -- that epidemics will begin to afflict humanity as the world population falls in 2100 -- ignores the impact of climate change on the economy, and how climate-related economic crises would diminish the human population and foster the conditions in which epidemics would arise. That is, my pet theory overlooks the climate-change aspect of the 14th century plague that inspired the theory. I blithely ignore the effect of climate change in my model because the effects of climate change are a big unknown -- a "known unknown", to those who recognize the reality of climate change, and an "unknown unknown" to the rest of society -- whereas the ongoing decline in global population that will create economic and societal conditions similar to contemporary Japan is more predictable. So this analysis is really based on the current Japan scenario, not the 14th century Black Death scenario. (My approach is reminiscent of the joke about the drunk who lost his keys in the dark, so he crossed the street to search for his keys under a street lamp.)

Another problem with my theory is that it compares apples and oranges: the distant past versus the distant (and not-yet-existent) future. Various stressors like war, taxes and climate change were already driving down Europe's population in the 14th century, and the Black Death was simply one more factor afflicting an impoverished, overpopulated western Europe. In contrast, current population decline is based on dramatic improvements in the human condition all over the world. My theory relies on stressors that are expected to arise around 2100 in a world that will be much more developed than today.

Also, it is foolhardy to even attempt to predict the future. Nassim Taleb points out that the defining events in human history were fundamentally unpredictable, despite bogus later attempts to explain them away as perfectly understandable and logical. Instead of pretending to understand and master risk, societies should focus on reducing their vulnerability to risk (e.g., by avoiding debt) in order to minimize damage when the totally unpredictable devastating event inevitably does happen.

https://en.wikipedia.org/wiki/The_Black_Swan:_The_Impact_of_the_Highly_Improbable
The Black Swan: The Impact of the Highly Improbable is a 2007 book by author and former options trader Nassim Nicholas Taleb. The book focuses on the extreme impact of rare and unpredictable outlier events — and the human tendency to find simplistic explanations for these events, retrospectively. Taleb calls this the Black Swan theory.
A central idea in Taleb's book is not to attempt to predict Black Swan events, but to build robustness to negative events and an ability to exploit positive events. Taleb contends that banks and trading firms are vulnerable to hazardous Black Swan events and are exposed to losses beyond those predicted by their defective financial models.
Further defying Taleb, one can venture to guess how many people will die in the coming plagues.
How many people are going to die in the plague(s) of 2100?
What portion of a population might be lost to an epidemic? One might assume that death rates are inversely proportional to the level of economic development. Economic development is also inversely proportional to birth rates. So fertility rates might indicate how many people would perish in an epidemic.

https://www.un.org/en/development/desa/population/publications/pdf/fertility/world-fertility-patterns-2015.pdf
  • global fertility rates average about 2.5 children per woman
  • African fertility rates average 4.7
  • Asian rates are 2.2
  • Europe is 1.6
  • Latin America and the Caribbean are 2.2
  • North America is 1.9
If one-third to two-thirds of Europe's population perished from bubonic plague in the 14th century, then perhaps we can expect similarly steep, double-digit percentage losses in the future plagues.

If one multiplies current fertility rates by a factor of ten, one gets an idea of what the future may hold in store. In that scenario, 47% of the population of Africa would perish from epidemics, 22% of Asians would perish, 16% of Europeans, 22% of Latin Americans, and 19% of North Americans.
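Spelled out as simple arithmetic (the "multiply by ten" factor is this post's own rough heuristic, not an epidemiological model), using the UN fertility figures cited above:

# The post's rough heuristic: projected epidemic mortality (%) ≈ total fertility rate × 10.
# Fertility figures (children per woman) as cited from the UN report above.
fertility = {
    "World": 2.5,
    "Africa": 4.7,
    "Asia": 2.2,
    "Europe": 1.6,
    "Latin America & Caribbean": 2.2,
    "North America": 1.9,
}

for region, tfr in fertility.items():
    projected_loss_pct = tfr * 10   # heuristic only, not an epidemiological model
    print(f"{region}: {projected_loss_pct:.0f}% of population lost")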

Because the economies of developed societies are based on specialized skills cultivated through formal education rather than generalized skills taught by the family, the long-term effects of epidemics might be just as challenging for developed societies as for less developed ones. For example, a country like Afghanistan might lose one-third of its population to war and disease, but because most Afghans are farmers with high birth rates, the population can bounce back within a generation. In contrast, in countries like the USA, people tend to have specialized skills, so the loss of ten percent of the population would be economically crippling, perhaps permanently. (For example, it takes about 25 years of fancy book learning to train a PhD.)

The world's population may peak and start to fall somewhere between 2055 and 2100. That is not so far off in time (most students in college today will probably still be alive in 2055, and some in 2100). But that is the time frame in which the plagues are going to hit.
Systems that are safe after failing
In the popular mind, a "fail-safe" system is one that can withstand anything thrown at it. This is an incorrect notion. Actually, fail-safe refers to a system that will minimize damage or disruption when it does fail. (This might accord with Taleb's Black Swan thinking: rather than pretend to predict risks that are fundamentally unpredictable, it is wiser to insure against the inevitable disaster.)

https://en.wikipedia.org/wiki/Fail-safe
In engineering, a fail-safe is a design feature or practice that in the event of a specific type of failure, inherently responds in a way that will cause no or minimal harm to other equipment, the environment or to people.
Unlike inherent safety to a particular hazard, a system being "fail-safe" does not mean that failure is impossible or improbable, but rather that the system's design prevents or mitigates unsafe consequences of the system's failure. That is, if and when a "fail-safe" system "fails," it is "safe" or at least no less safe than when it was operating correctly.
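As a toy illustration of that definition (hypothetical code, not from the article above): a fail-safe controller does not try to make failure impossible; it only guarantees that when failure happens, the system lands in the least harmful state.

# Hypothetical fail-safe heater controller. The design question is not
# "how do we prevent the sensor from failing?" but
# "which state causes the least harm when it does fail?"

def read_sensor():
    """Stand-in for real hardware; in this toy it always fails."""
    raise IOError("temperature sensor disconnected")

def heater_command(target_c=20.0):
    try:
        current_c = read_sensor()
    except IOError:
        return "OFF"                                     # fail-safe default: the harmless state
    return "ON" if current_c < target_c else "OFF"       # normal operation

print(heater_command())                                  # prints "OFF" despite the failure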
Sociologically, Japan might be described as a classic case of a fail-safe system. In such a system, amidst calamity, things do not fall apart as they would in other societies because of remarkable social cohesion. Unfortunately, that kind of system is also not a dynamic, creative system that can re-invent itself. When the economy fails, no one starves, but there is seemingly endless stagnation.

The challenge is to create a system that is not only more robust and better able to withstand impacts, but also more vital, so that in the aftermath of a disaster the system can reinvent itself. It could be that a bottom-up system is not only more capable of spontaneous regeneration, but would also remake itself into a new form.
Self-organizing systems
Perhaps a self-organizing system -- in which some form of overall order arises from local interactions between parts of an initially disordered system -- would be both fail-safe and more robust to begin with. A self-organizing system might even be more self-healing and recover more quickly. For example, the internet is a collection of overlapping systems that continue to function together when any one of those systems fails. That is a fail-safe design, and it is also self-organizing.

https://en.wikipedia.org/wiki/Self-organization

In terms of urban planning, self-organization would involve the transition to autonomous and net-zero energy buildings that engage not only in distributed production but also in the exchange and coordination of resources between buildings. Moreover, localities and regions would also be required to have some degree of self-sufficiency in food, water and energy production and consumption, coordinated at the local and regional level.
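A minimal sketch of the idea, with all numbers and rules invented for illustration: each building trades only with its immediate neighbors according to one local rule, yet the block as a whole drifts toward balance with no central dispatcher.

# Toy self-organizing micro-grid: along each connection, a building exchanges a
# fixed fraction of the difference in surplus with its neighbor (a purely local
# rule, essentially diffusion). With no central dispatcher, balances drift toward
# the neighborhood average, and the total amount of energy is conserved.
import random

random.seed(0)
balances = [random.uniform(-5, 5) for _ in range(8)]   # kWh surplus (+) / deficit (-)
ALPHA = 0.25                                           # fraction exchanged per step

def step(balances):
    new = balances[:]
    for i in range(len(balances) - 1):                 # each adjacent pair trades locally
        flow = ALPHA * (balances[i] - balances[i + 1])
        new[i] -= flow
        new[i + 1] += flow
    return new

for _ in range(100):
    balances = step(balances)

print([round(b, 2) for b in balances])   # values cluster around the overall average
print(round(sum(balances), 2))           # total is unchanged from the start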

Indeed, in his book "Antifragile", Taleb argues for building such insurance in, pointing to the naturalistic model of organisms, which always have a backup system -- organs come in pairs -- a strategy that most economists would scorn as inefficient.

Taleb's notion of "antifragility" offers a whole new way of thinking about disaster preparation.
Antifragile
In Taleb's understanding, there are policies that would make it possible to actually grow and benefit from disasters that are not overly severe. A society with the right policies might not only survive a plague, but profit by it. What does not kill you makes you stronger.
For Taleb, there are three kinds of entities (a toy numerical sketch follows this list):
  • those that are fragile and break easily,
  • those that are more robust and shockproof -- up to a point, and
  • those that not only survive moderate shocks, but improve because of them.
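One common way to make the distinction concrete, in the spirit of Taleb's own convexity arguments, is to look at the shape of the response to shocks: the fragile loses more from swings than it gains (concave), the robust responds roughly in proportion (linear), and the antifragile gains more from swings than it loses (convex). The functions and numbers below are invented purely for illustration.

# Toy illustration of the fragile / robust / antifragile distinction as the
# shape of the response to shocks; functions and numbers are invented.

def fragile(x):      return x - x**2    # concave: large swings hurt disproportionately
def robust(x):       return x           # linear: swings roughly cancel out
def antifragile(x):  return x + x**2    # convex: swings help on net

calm     = [0.0, 0.0]        # no shocks
volatile = [-1.0, +1.0]      # same average conditions, but with swings

for name, f in [("fragile", fragile), ("robust", robust), ("antifragile", antifragile)]:
    avg_calm     = sum(f(x) for x in calm) / len(calm)
    avg_volatile = sum(f(x) for x in volatile) / len(volatile)
    print(f"{name:12s} calm={avg_calm:+.2f}  volatile={avg_volatile:+.2f}")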
Taleb writes:

https://www.fooledbyrandomness.com/prologue.pdf
I. HOW TO LOVE THE WIND
Wind extinguishes a candle and energizes fire.
Likewise with randomness, uncertainty, chaos: you want to use them, not hide from them. You want to be the fire and wish for the wind. This summarizes this author’s nonmeek attitude to randomness and uncertainty.
We just don’t want to just survive uncertainty, to just about make it. We want to survive uncertainty and, in addition— like a certain class of aggressive Roman Stoics— have the last word. The mission is how to domesticate, even dominate, even conquer, the unseen, the opaque, and the inexplicable. 
How? 
II. THE ANTIFRAGILE
Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile.
Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better. This property is behind everything that has changed with time: evolution, culture, ideas, revolutions, political systems, technological innovation, cultural and economic success, corporate survival, good recipes (say, chicken soup or steak tartare with a drop of cognac), the rise of cities, cultures, legal systems, equatorial forests, bacterial resistance . . . even our own existence as a species on this planet. And antifragility determines the boundary between what is living and organic (or complex), say, the human body, and what is inert, say, a physical object like the stapler on your desk.  
The antifragile loves randomness and uncertainty, which also means — crucially — a love of errors, a certain class of errors. Antifragility has a singular property of allowing us to deal with the unknown, to do things without understanding them— and do them well.
Taleb suggests that the best strategy for making a system antifragile is to build uncertainty and hardship into the system itself. He suggests several ways to do this.

https://en.wikipedia.org/wiki/Antifragile

Skin in the game. Decision-makers should always have proverbial "skin in the game". That is, the leadership of a society or business should be personally vested in the survival of the enterprise that they lead, or they will game policies to make the system fragile and failure-prone. If the policy is not for the captain to go down with the ship, the ship is doomed. Profits will be privatized, and risks will be socialized.
To me, every opinion maker needs to have “skin in the game” in the event of harm caused by reliance on his information or opinion (not having such persons as, say, the people who helped cause the criminal Iraq invasion come out of it completely unscathed). Further, anyone producing a forecast or making an economic analysis needs to have something to lose from it, given that others rely on those forecasts (to repeat, forecasts induce risk taking; they are more toxic to us than any other form of human pollution).
Situation in which the manager of a business is not the true owner, so he follows a strategy that cosmetically seems to be sound, but in a hidden way benefits him and makes him antifragile at the expense (fragility) of the true owners or society. When he is right, he collects large benefits; when he is wrong, others pay the price.
Typically this problem leads to fragility, as it is easy to hide risks. It also affects politicians and academics. A major source of fragility.
Via negativa. Another strategy to promote antifragility is, in a sense, to achieve health by subtracting medication. We must edit our lives because quality of life is based not on what we have, but on what we can eliminate. (Humphrey Bogart used to say that the only really good thing about having money is that it allows one to tell jerks to go screw themselves.)
I would add that, in my own experience, a considerable jump in my personal health has been achieved by removing offensive irritants: the morning newspapers (the mere mention of the names of the fragilista journalists Thomas Friedman or Paul Krugman can lead to explosive bouts of unrequited anger on my part), the boss, the daily commute, air-conditioning (though not heating), television, emails from documentary filmmakers, economic forecasts, news about the stock market, gym “strength training” machines, and many more.
Lindy effect. One thing to eliminate is "neomania", or the craze for the new and fashionable.
A technology, or anything nonperishable, increases in life expectancy with every day of its life—unlike perishable items (such as humans, cats, dogs, and tomatoes). So a book that has been a hundred years in print is likely to stay in print another hundred years. The opposite is Neomania, a love of change for its own sake, a form of philistinism that does not comply with the Lindy effect and that understands fragility. Forecasts the future by adding, not subtracting.
Barbell strategy. Extremely risky strategies are fragile, whereas safe policies are robust but stagnant. The antifragile strategy is to combine both of these paths simultaneously.
A dual strategy, a combination of two extremes, one safe and one speculative, deemed more robust than a “monomodal” strategy; often a necessary condition for antifragility. For instance, in biological systems, the equivalent of marrying an accountant and having an occasional fling with a rock star; for a writer, getting a stable sinecure and writing without the pressures of the market during spare time. Even trial and error are a form of barbell.
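A schematic numerical sketch of the barbell idea (all allocation figures and scenario returns below are invented for illustration): keep most resources in something that cannot blow up, expose a small slice to open-ended upside, and compare the worst case with an "all middling risk" approach.

# Toy barbell vs. "monomodal" comparison; all allocations and returns are invented.
# Barbell: 90% in a very safe asset, 10% in a speculative one with open-ended upside.
# Monomodal: 100% in a "moderate" asset that can still lose most of its value in a crash.

def outcome(weights_and_returns):
    """Final value of 1 unit of wealth given (weight, return) pairs."""
    return sum(w * (1 + r) for w, r in weights_and_returns)

scenarios = {
    "crash":  {"safe": 0.00, "speculative": -1.00, "moderate": -0.60},
    "boring": {"safe": 0.01, "speculative": -0.20, "moderate":  0.05},
    "boom":   {"safe": 0.01, "speculative":  4.00, "moderate":  0.30},
}

for name, r in scenarios.items():
    barbell   = outcome([(0.9, r["safe"]), (0.1, r["speculative"])])
    monomodal = outcome([(1.0, r["moderate"])])
    print(f"{name:6s}  barbell={barbell:.2f}  monomodal={monomodal:.2f}")

In the invented "crash" scenario the barbell portfolio keeps 90% of its value while the monomodal one loses most of it, and in the "boom" scenario the barbell still captures the outsized upside. That capped downside with open upside is the point of the strategy.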
How does Taleb's concept of antifragility apply to epidemics?

Applied to the Black Death of the 14th century: in biological terms, the plague tended to kill off those who were more frail, and it left the descendants of survivors more resistant to disease.

https://www.americanscientist.org/article/the-bright-side-of-the-black-death

Economically and politically, the labor shortage created by the plague lifted living standards for workers and destroyed a feudal system already in decline, even as outsiders came under persecution. The favorable conditions for labor diminished as populations recovered, although the persecution persisted. Nevertheless, the Black Death was a classic unexpected, transformative Black Swan event that rendered western Europe more antifragile.

https://en.wikipedia.org/wiki/Consequences_of_the_Black_Death


What will be the consequences of the Plague of 2100? Will it make the world more antifragile? What policies promoting antifragility would prepare humanity for the coming plagues?