Monday, June 8, 2020

New SARS2 metrics & factors (UBI, labor, work & action)

 Abstract: As the SARS2 pandemic continues, new metrics for measuring its growth have emerged into public view. Their usefulness depends upon the stage of the pandemic in which a society finds itself. Some metrics are more abstract and fixed, whereas metrics that are more concrete tend to be dynamic. New factors in the contagiousness and virulence of SARS2 have been suggested, but they remain uncertain or have been discredited. The metrics of an epidemic are typically assumed to be solely a property of the pathogen, but they should be understood to include social conditions, including the response of a particular society to the epidemic, and other factors. A game changer in the pandemic would be the alteration of the simple hygiene habits of the population. Yet such habits tend to be stubbornly entrenched, and evolve only slowly over time. A conditional Guaranteed Minimum Income might be paid to those who take these personal precautions and who are willing to test, trace and isolate. A Universal Basic Income would not aid in this effort, and runs against human nature.
  • In the early stages of a pandemic, the best metric is the number of deaths by country.
  • Retrospectively determining the best responses to pandemics involves looking at death rates per population.
  • The basic reproduction number of a disease is known as “R0” and represents how many people a single case is expected to infect, in a fully susceptible population, at a particular time and place.
  • The actual transmission rate as it varies over time is known as “Rt”.
  • A transmission rate (Rt) over 1 (each person infecting on average more than one other) means the disease is growing, and results in lockdown.
  • In cold, crowded, enclosed environments so loud that people have to shout, the virus is highly contagious.
  • But in other circumstances, SARS2 turns out to be not so contagious.
  • This is due to the dispersion factor, which is known as “k”.
  • SARS2’s low dispersion factor (k) has great consequences for the so-called “herd immunity” strategy.
  • The dispersion factor (k) might also help to explain Japan’s unique success in containing SARS2.
  • The experts recommended that the Japanese avoid the “three Cs”:
    1. Closed spaces (with poor ventilation).
    2. Crowded places (with many people nearby).
    3. Close-contact settings (such as close-range conversations).
  • Another popular metric is the “case fatality rate”.
  • But the more interesting metric might be closed cases.
  • The closed case death rate might therefore mark the progress of the disease in an afflicted country.
  • The ACE2 gene had been perceived as a factor in resistance to SARS2.
  • Having Type A blood was linked to a 50 percent increase in the likelihood that a patient would need to get oxygen or to go on a ventilator.
  • COVID-19 death rates have been linked to socioeconomic status.
  • Culturally, there could be lifestyle traits unique to Korea that render Koreans and East Asians less susceptible to death and infection from SARS2.
  • There are historical precedents on how case fatality rates can vary by ethnicity.
  • Is it possible for a government to alter the hygiene habits of a people?
  • Stimulus checks extended into a recovery would be a form of Universal Basic Income (UBI), which is unconditional.
  • Guaranteed Minimum Incomes are already found in many societies, and are conditional.
  • A Universal Basic Income is impossible because it goes against human nature.
  • People do not want an income; they want jobs in order to labor.
  • The need to labor can be understood in the distinction between labor, work and action.
    • The purpose of labor is purely the endless task of sustaining the life processes of the individual.
    • Work has an end goal involving the creation of the artificial from nature.
    • A third type of activity is action, which is characterized by how individuals express their individuality.
    • Action might involve a certain self-destructiveness.
    • The realm of action has a political economy of death.
    • Intensified awareness of mortality might not characterize only the realm of action, but also the realm of contemplation.
In the coronavirus pandemic, comparing international statistics can be difficult.

https://www.bbc.com/news/52311014

First, the number of fatalities needs to be put in the context of the population size of a country, but that can be skewed when comparing large countries (China) with small ones (Belgium).

Second, countries differ in who is included among COVID-19 fatalities — for example, only hospital deaths, or only fatalities tested and found positive, or fatalities with no other illnesses.

Third, death rates are uncertain because they are calculated against those who have been tested, and testing rates vary by country.

Fourth, comparing national fatality rates only makes sense when adjusting for when the disease was first introduced to the country.

Fifth, factors like population density (cities), the advanced age of the population (rural areas) and poverty rates (the developing world) need to be taken into account.

The points above are valid and useful, but they can be extended and elaborated.

With regard to the first point — on the relevance, however problematic, of population size in the spread of a pandemic — four assertions are pertinent.
  1. The mass media ignores the proper metric of pandemics by focusing on the turnover of big numbers (e.g., the global number of infected passing one million, then two million).
  2. A more relevant metric in terms of measuring the effectiveness of national policy is the national death rate per million inhabitants.
  3. In the beginning of the crisis, however, cases per country and by locality are the best metrics.
  4. Also, these rates need to be understood in terms of time — specifically, how fast the case rate and deaths per population ascend over time.
The news media can be superficial and sensationalist in its presentation of statistics, stressing the attainment of round numbers, much like the rollover of an odometer.

Identifying the national death rate by population is more pertinent in terms of looking for explanations of what influences outcomes, especially after the fact.

However, during the early stages of infection, the most important thing is to look at the growing number of deaths in a country, regardless of a country’s population size.

In the early stages of a pandemic, the best metric is the number of deaths by country.

This is because — all things being equal — the rate of spread of the disease is the same in each country.

In fact, the initial outbreak in a country is typically concentrated in a particular region.

The national fatality rate by population size can distract from the real concern, which is that the pandemic is becoming established in the country as it tears through a region.

In fact, the national fatality rate by population would be distorted by unaffected regions.

For example, New York City was an early and dramatic focus of the coronavirus outbreak in the USA.

During that period, most of the USA, especially rural areas, seemed unaffected.

The “big picture” view presented by the percentage of deaths in the overall population would thus underplay the dire rate of increase in deaths in New York City.

Hence the importance also of looking at the death rate by region or city, because the initial outbreak is local.
https://www.ft.com/coronavirus-latest

Only in the aftermath of a pandemic does the national death rate by population take on relevance.

Retrospectively determining the best responses to pandemics involves looking at death rates per population.

East Asian societies might be in such a post-pandemic period in which policies can be reviewed.

The coronavirus death rates per million in South Korea and Singapore are in the single digits (5 and 2 deaths per million, respectively).

The death rates in Taiwan and Hong Kong are below one (0.3 and 0.5 deaths per million, respectively).

In contrast, the death rates in the USA and the UK are now in the triple digits.
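
These per-million figures are simple arithmetic: cumulative deaths divided by population, scaled to one million. A minimal sketch; the death counts and populations below are rough approximations chosen to be consistent with the rates quoted above, not authoritative figures:

```python
# Deaths per million = cumulative deaths / population * 1,000,000.
# Death counts and populations are rough mid-2020 approximations.
countries = {
    # name: (approx. deaths, approx. population)
    "South Korea": (260, 51_700_000),
    "Singapore":   (12,   5_700_000),
    "Taiwan":      (7,   23_800_000),
    "Hong Kong":   (4,    7_500_000),
}

for name, (deaths, population) in countries.items():
    per_million = deaths / population * 1_000_000
    print(f"{name}: {per_million:.1f} deaths per million")
```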

By contrasting fatality rates by population, the policies that were most successful in East Asia can be identified.

The effective policies seem to be:
  • Aggressively test,
  • aggressively trace and
  • aggressively isolate cases.
https://www.businessinsider.com/south-koreas-coronavirus-curve-timeline-2020-4

The countries that most successfully implemented this regimen of testing — Taiwan, South Korea, Hong Kong — did not need to implement lockdowns.

Lockdowns only happened in countries that failed to implement widespread, aggressive testing.

In some of these countries, specifically the USA, this testing is still not happening.

In the West, where the disease is still smoldering, it is better to look at the daily number of deaths by country.

For instance, the graph below is useful precisely because it looks at the number of fatalities over time by country, not at fatalities per population.

One might object that looking at numbers rather than ratios would underplay the trauma suffered by smaller countries.

That is a legitimate concern.

For example, Spain and Italy have roughly one-fifth and one-seventh the population of the USA, respectively, yet half as many deaths as the USA.

However, because the graph is logarithmic rather than linear, it visually adjusts for the bias toward exaggerating fatalities in large countries.
https://www.ft.com/coronavirus-latest

The usefulness of the graph reaffirms the importance of the metric of time.

Time is important irrespective of whether it is the number of fatalities or the fatality rate per population that is being studied.

In some metrics of the spread of disease, however, time is not an issue.

For example, the basic reproduction number is an estimate of how many further cases a single case of a disease is expected to create in ideal conditions (a fully susceptible population, with no prior immunity or interventions).

The basic reproduction number of a disease is known as “R0” and represents how many people a single case is expected to infect, in a fully susceptible population, at a particular time and place.

https://en.wikipedia.org/wiki/Basic_reproduction_number
In epidemiology, the basic reproduction number (sometimes called basic reproductive ratio, or incorrectly basic reproductive rate, and denoted R0, pronounced R nought or R zero[17]) of an infection can be thought of as the expected number of cases directly generated by one case in a population where all individuals are susceptible to infection.[18] The definition describes the state where no other individuals are infected or immunized (naturally or through vaccination).
By definition, R0 cannot be modified through vaccination campaigns. Also, it is important to note that R0 is a dimensionless number and not a rate, which would have units of time[20] like doubling time.
Nevertheless, the basic reproduction number is not a universal constant, but is dependent on things like environmental conditions, human behavior and what is being observed.
R0 is not a biological constant for a pathogen as it is also affected by other factors such as environmental conditions and the behaviour of the infected population. Furthermore R0 values are usually estimated from mathematical models, and the estimated values are dependent on the model used and values of other parameters.
The basic reproduction number is therefore dynamic and can change, even though it is always a snapshot of a given moment and not a rate.
The most important uses of R0 are determining if an emerging infectious disease can spread in a population and determining what proportion of the population should be immunized through vaccination to eradicate a disease. In commonly used infection models, when R0 > 1 the infection will be able to start spreading in a population, but not if R0 < 1. Generally, the larger the value of R0, the harder it is to control the epidemic.
For simple models, the proportion of the population that needs to be effectively immunized (meaning not susceptible to infection) to prevent sustained spread of the infection has to be larger than 1 − 1/R0.
The article states that the basic reproduction number for the coronavirus ranges from 1.4 to 5.7.
This is comparable to the estimated fuel consumption for a particular brand of automobile, where there is a range between estimated highway and city mileage.

Again, both these numbers — 1.4 and 5.7 — are quite high because anything over 1 means that the disease is spreading, whereas anything below 1 means that the disease is diminishing.
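
The quoted Wikipedia passage gives the herd immunity threshold for simple models as 1 − 1/R0, so the estimated range of R0 translates directly into a range of immunization targets. A minimal sketch; the intermediate R0 values are added for illustration:

```python
# Herd immunity threshold in simple epidemic models: 1 - 1/R0.
# Endpoints (1.4, 5.7) are from the range quoted above; 2.5 and 3.0
# are illustrative intermediate values.
for r0 in (1.4, 2.5, 3.0, 5.7):
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: ~{threshold:.0%} of the population must be immune")
```

Note that the 60 to 70 percent herd immunity figure cited later in this post corresponds to an R0 of roughly 2.5 to 3.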

The following presentation explains this in a more user-friendly manner.

https://www.bbc.com/news/health-52300114
On average, without social distancing in place, scientific modelling suggests that people infected with coronavirus would pass it on to another three people.
This high level of infection would overwhelm the health care system.

But what happens if one radically alters one’s behavior?

For example, if one drove one’s car well below the speed limit and avoided stop-and-go traffic, then one’s mileage would begin to vary from the official fuel consumption estimates.

Likewise, once precautionary measures are put in place during a pandemic, the transmission numbers decline.

Time is now a factor in the measure of the progress of the pandemic.

At that point, transmission can be understood as a rate.

The actual transmission rate as it varies over time is known as “Rt”.

https://qz.com/1834700/rt-the-real-time-r0-guiding-how-to-lift-coronavirus-lockdowns/
Early on in the Covid-19 outbreak, different teams of researchers came up with varying estimates of R0, with most ranging between two and three. Some put the number lower, like the World Health Organization’s estimates of 1.4 to 2.5. But R0 is not set in stone. It is an average, and can also vary from place to place. As science journalist Ed Yong put it in the Atlantic, R0 “is a measure of a disease’s potential,” and once response measures are put in place—screening and quarantines, for example—the actual transmission rate can be lowered. The actual or “effective” version of the reproductive number, as opposed to the basic version, is known as Rt—that is, the virus’s actual transmission rate at a given time, t.
In Wuhan, the Rt was originally quite high — over 2 — but fell below 1 after the lockdown.
“In Wuhan, Rt was probably two-point-x in December, two-point-x in January, until about January 23,” when the city of 11 million residents was put under lockdown, said Ben Cowling, division head of epidemiology and biostatistics at the University of Hong Kong (HKU). “And at that point, most likely the effective R dropped pretty substantially to 0.3, and then stayed at a low level almost until now.” As Wuhan’s lockdown was lifted today, the question will be whether China can avoid a second wave of infections.
In Germany, the effective reproduction number is now around 1, and so Germany can now exit its lockdown (carefully).

A transmission rate (Rt) over 1 (each person infecting on average more than one other) means the disease is growing, and results in lockdown.

Germany’s chancellor Angela Merkel explains that if the number goes up to 1.3, Germany’s healthcare system will be overwhelmed and another lockdown will ensue.

https://www.theguardian.com/world/video/2020/apr/16/merkel-sets-out-clear-explanation-of-how-coronavirus-transmission-works-video

https://qz.com/1839030/angela-merkel-explains-how-coronavirus-transmission-works/
As societies begin to contemplate how to re-start their economies after weeks of shutdowns, epidemiologists have urged a multi-cycle strategy of “suppression and lift:” a regimen of relaxing and tightening social distancing measures to fine-tune them, so that they are just right for a particular population at a particular time. The idea is to relax the measures when case growth has fallen sufficiently, but to tighten them again if infections again start to spread. Hong Kong and Singapore, for example, are already testing out this strategy.
At the core of the “suppress and lift” strategy is one key number: Rt, or the real-time effective reproductive number. Rt tells us a virus’s actual transmission rate at a given time, t. That is, in a particular population at a particular time,  how many other people will catch the disease from a single infected person?
This is the concept that Merkel explained so well yesterday. She noted that the Rt in Germany was currently around one, meaning that on average a person with the virus infects one other person. One is the critical threshold: below one, the epidemic gradually fades out. Above one, it will grow, possibly exponentially. (Hong Kong has a dashboard showing a frequently updated Rt; the latest Rt is just above 0.3.)
Merkel then sketched out what it would mean if Germany’s Rt edged up to 1.1.
“If we get to the point where everybody infects 1.1 people, then by October we will reach the capacity level of our health system, with the assumed level of intensive care beds,” she said.
And if it edges up further still, to 1.2, “everyone is infecting 20% more.”
But 20% is arguably an abstract number, and hard for the average citizen to grasp. Merkel seemed to recognize this, and explained the percentage more concretely: “Out of five people, one infects two and the rest one.” At this rate, Germany’s health care system will reach its limit in July. At an Rt of 1.3, the health care system maxes out in June.
“So you see what little leeway we have,” she said.
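
Merkel's point about leeway can be made concrete with a toy projection. A minimal sketch, assuming infections multiply by Rt once per serial interval of about four days; the starting count and the capacity line are hypothetical, not figures from her presentation:

```python
# Toy projection: infections grow by a factor of Rt every "generation"
# (serial interval, assumed ~4 days here). The starting infection count
# and the health-system capacity are hypothetical.
GENERATION_DAYS = 4
START_INFECTIONS = 1_000
CAPACITY = 100_000  # hypothetical health-system limit

for rt in (1.1, 1.2, 1.3):
    infections, days = START_INFECTIONS, 0
    while infections < CAPACITY:
        infections *= rt
        days += GENERATION_DAYS
    print(f"Rt = {rt}: capacity reached after ~{days} days")
```

Even with these toy numbers, the ordering matches her timeline: roughly six months of headroom at an Rt of 1.1, but only about three months at 1.2 and just over two months at 1.3.
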
One shortcoming of the reproduction number is its neglect of context.

The contagiousness of SARS2 varies with circumstance.

In cold, crowded, enclosed environments so loud that people have to shout, the virus is highly contagious.

But in other circumstances, SARS2 turns out to be not so contagious.

https://www.sciencemag.org/news/2020/05/why-do-some-covid-19-patients-infect-many-others-whereas-most-don-t-spread-virus-all
SARS-CoV-2, like two of its cousins, severe acute respiratory syndrome (SARS) and Middle East respiratory syndrome (MERS), seems especially prone to attacking groups of tightly connected people while sparing others.
Most of the discussion around the spread of SARS-CoV-2 has concentrated on the average number of new infections caused by each patient. Without social distancing, this reproduction number (R) is about three. But in real life, some people infect many others and others don’t spread the disease at all. In fact, the latter is the norm, Lloyd-Smith says: “The consistent pattern is that the most common number is zero. Most people do not transmit.”

This is due to the dispersion factor, which is known as “k”.

That’s why in addition to R, scientists use a value called the dispersion factor (k), which describes how much a disease clusters. The lower k is, the more transmission comes from a small number of people.
Estimates of k for SARS-CoV-2 vary.
Adam Kucharski of LSHTM estimated that k for COVID-19 is as low as 0.1. “Probably about 10% of cases lead to 80% of the spread,” Kucharski says.
On the one hand, the coronavirus has not swept across the world in the way that it was once expected to.
That could explain some puzzling aspects of this pandemic, including why the virus did not take off around the world sooner after it emerged in China, and why some very early cases elsewhere—such as one in France in late December 2019, reported on 3 May—apparently failed to ignite a wider outbreak. If k is really 0.1, then most chains of infection die out by themselves and SARS-CoV-2 needs to be introduced undetected into a new country at least four times to have an even chance of establishing itself, Kucharski says. If the Chinese epidemic was a big fire that sent sparks flying around the world, most of the sparks simply fizzled out.
On the other hand, SARS2 will sweep through a workplace or a nightclub, infecting many of the people in such confined spaces.
SARS-CoV-2 appears to transmit mostly through droplets, but it does occasionally spread through finer aerosols that can stay suspended in the air, enabling one person to infect many. Most published large transmission clusters “seem to implicate aerosol transmission,” Fraser says.
Researchers in China studying the spread of the coronavirus outside Hubei province—ground zero for the pandemic—identified 318 clusters of three or more cases between 4 January and 11 February, only one of which originated outdoors. A study in Japan found that the risk of infection indoors is almost 19 times higher than outdoors. (Japan, which was hit early but has kept the epidemic under control, has built its COVID-19 strategy explicitly around avoiding clusters, advising citizens to avoid closed spaces and crowded conditions.)
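
The effect of a low k can be simulated directly. In the standard formulation (a negative binomial offspring distribution, built as a Gamma-Poisson mixture), each case draws an individual reproductive number from a Gamma distribution with mean R, then infects a Poisson number of others around it. A minimal sketch using the R of about 3 and the k of 0.1 quoted above:

```python
import numpy as np

# Negative binomial offspring distribution via a Gamma-Poisson mixture:
# each case draws an individual reproductive number nu ~ Gamma(shape=k,
# scale=R/k), which has mean R, and then infects Poisson(nu) others.
# A low k produces a heavy-tailed, superspreading-prone distribution.
rng = np.random.default_rng(0)
R, k, n_cases = 3.0, 0.1, 100_000

nu = rng.gamma(shape=k, scale=R / k, size=n_cases)
offspring = rng.poisson(nu)

print(f"mean offspring: {offspring.mean():.2f}")                 # close to R
print(f"share infecting nobody: {(offspring == 0).mean():.0%}")  # the majority
top_decile = np.sort(offspring)[-(n_cases // 10):].sum()
print(f"transmission from top 10% of cases: {top_decile / offspring.sum():.0%}")
```

With these parameters the simulation reproduces the quoted pattern: roughly 70 percent of cases infect nobody, while on the order of 80 percent of transmission comes from the top 10 percent of cases.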

SARS2’s low dispersion factor (k) has great consequences for the so-called “herd immunity” strategy.

The idea in the UK and Sweden was that at-risk populations would be sequestered, while the bulk of the population would become harmlessly infected and thereby immune.
It turned out that a number of observations diverge from this idea.
  • Because little is known about COVID-19, it was premature to characterize young people without preexisting medical conditions as low-risk.
  • Care home residents account for nearly half of deaths linked to Covid-19 in Sweden.
  • Most new Covid-19 hospitalizations in New York state were from people who were staying home and not venturing much outside.
  • A Swedish study found that only 7.3 percent of Stockholmers had developed antibodies by late April.
  • For much of the New York City metropolitan area, much less than 20 percent of the population have been infected. That’s a long way from the 60 to 70 percent to begin to achieve herd immunity.
  • Homeless encampments in Hawaii, perceived to be the dirtiest and most careless of settings, had zero cases.
Again, the paradox of SARS2 presents itself:

For people out and about in fresh air and sunshine who are taking all the simple precautions, SARS2 is not so contagious or deadly.

For those in cold, loud, crowded, confined spaces, SARS2 is very infectious and dangerous.

Half of all Swedes live by themselves, and this low household density could be why Sweden’s COVID-19 death rate is not as high as that of Italy or Spain.

Likewise, high-density Manhattan has only 1.2 persons per household, whereas the hardest hit areas of NY state are low-rise districts in Queens with a household density of 2.8 persons.

https://www.businessinsider.com/washington-post-coronavirus-young-people-developing-world-2020-5
https://www.bbc.com/news/world-europe-52704836
https://www.cnbc.com/2020/05/06/ny-gov-cuomo-says-its-shocking-most-new-coronavirus-hospitalizations-are-people-staying-home.html
https://www.mirror.co.uk/news/world-news/doubts-over-swedens-herd-immunity-22079529
https://www.newsweek.com/contact-tracing-wont-solve-coronavirus-crisis-says-this-renowned-epidemiologistheres-what-1506209
https://www.civilbeat.org/2020/05/denby-fawcett-hawaiis-homeless-have-avoided-covid-19-so-far/
https://en.wikipedia.org/wiki/Demographics_of_Queens
https://en.wikipedia.org/wiki/Corona,_Queens

The dispersion factor (k) might also help to explain Japan’s unique success in containing SARS2.

Japan had very few cases of COVID-19.

Japan had very limited restrictions on its citizens.

https://time.com/5842139/japan-beat-coronavirus-testing-lockdowns
Japan’s state of emergency is set to end with new cases of the coronavirus dwindling to mere dozens. It got there despite largely ignoring the default playbook.
No restrictions were placed on residents’ movements, and businesses from restaurants to hairdressers stayed open. No high-tech apps that tracked people’s movements were deployed. The country doesn’t have a center for disease control. And even as nations were exhorted to “test, test, test,” Japan has tested just 0.2% of its population — one of the lowest rates among developed countries.
Yet the curve has been flattened, with deaths well below 1,000, by far the fewest among the Group of Seven developed nations. In Tokyo, its dense center, cases have dropped to single digits on most days.
Nobody, not even the Japanese, understands why Japan has been so successful in combating SARS2.
There is a general assumption that a number of things contributed to this success.

Many of the factors — frequent hand washing, not shaking hands, removing shoes in homes, wearing masks — are Japanese habits that preexisted the SARS2 pandemic.

Like the successful response in Hong Kong, Japan’s early reaction to the SARS2 threat was at the grassroots level, and largely bypassed a lethargic central government.

Japan has no CDC, but rather relies on a system of local health centers that have long engaged in old-school forms of contact tracing for all kinds of disease.
An early grassroots response to rising infections was crucial. While the central government has been criticized for its slow policy steps, experts praise the role of Japan’s contact tracers, which swung into action after the first infections were found in January. The fast response was enabled by one of Japan’s inbuilt advantages — its public health centers, which in 2018 employed more than half of 50,000 public health nurses who are experienced in infection tracing. In normal times, these nurses would be tracking down more common infections such as influenza and tuberculosis.
Japan, in many ways, is an old fashioned country (every household and business in Japan still has and uses a fax machine), and this is reflected in the contact tracing methods.
“It’s very analog — it’s not an app-based system like Singapore,” said Kazuto Suzuki, a professor of public policy at Hokkaido University who has written about Japan’s response. “But nevertheless, it has been very useful.”
While countries such as the U.S. and the U.K. are just beginning to hire and train contact tracers as they attempt to reopen their economies, Japan has been tracking the movement of the disease since the first handful of cases were found. These local experts focused on tackling so-called clusters, or groups of infections from a single location such as clubs or hospitals, to contain cases before they got out of control.
The Japanese healthcare system’s containment of SARS2 represents a victory of localism.
“Many people say we don’t have a Centers for Disease Control in Japan,” said Yoko Tsukamoto, a professor of infection control at the Health Sciences University of Hokkaido, citing a frequently held complaint about Japan’s infection management. “But the public health center is a kind of local CDC.”
Because medical experts appealed to the public with simple messages, the Japanese public embraced their guidance.

The experts recommended that the Japanese avoid the “three Cs”:

  1. Closed spaces (with poor ventilation).
  2. Crowded places (with many people nearby).
  3. Close-contact settings (such as close-range conversations).
Experts are also credited with creating an easy-to-understand message of avoiding what are called the “Three C’s” — closed spaces, crowded spaces and close-contact settings — rather than keeping away from others entirely. Although political leadership was criticized as lacking, that allowed doctors and medical experts to come to the fore — typically seen as a best practice in managing public health emergencies. “You could say that Japan has had an expert-led approach, unlike other countries,” Tanaka said.
The Japanese avoidance of the three Cs seems perfectly focused on addressing the dispersion factor (k).

Another popular metric is the “case fatality rate”.

The case fatality rate compares the number of people who have been determined to have been infected with the number of recorded deaths attributed to the disease.

https://www.worldometers.info/coronavirus/

The case fatality rate is a useful but flawed measurement.

For example, on April 16, there were 2.2 million cases and 145,000 deaths worldwide, which results in a 7% death rate.

This overlooks both a lack of testing and deaths that were improperly counted (e.g., excluding untested corpses or, conversely, including undetermined or misdiagnosed deaths).

The case fatality rate might be interesting for a theoretical reason.

This is because it is based on an overlooked distinction between active and closed cases.

Among the cases, there were 1.5 million active cases with currently infected patients: 96% were in mild condition, and 4% were critical.

There were also roughly 700,000 closed cases that had an outcome: almost 80% had recovered or been discharged, and the other 20% (the 145,000 deaths) had died.

To determine the case fatality rate, the total number of cases — open and closed — is contrasted with deaths, which are a subset of closed cases.
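
The distinction is easy to make concrete. A minimal sketch using the approximate mid-April figures above; the recovered count is an assumption consistent with those totals:

```python
# Approximate global totals for mid-April 2020, as discussed above.
# The recovered figure is an assumption consistent with those totals.
total_cases = 2_200_000
deaths      =   145_000
recovered   =   555_000

closed_cases = deaths + recovered        # cases with a final outcome
naive_cfr    = deaths / total_cases      # deaths over ALL cases, open + closed
closed_cfr   = deaths / closed_cases     # deaths over resolved cases only

print(f"naive case fatality rate:  {naive_cfr:.1%}")   # ~6.6%
print(f"closed case fatality rate: {closed_cfr:.1%}")  # ~20.7%
```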

But the more interesting metric might be closed cases.

The 20% global death rate among SARS2 closed cases, which held steady for some time, was much higher than that of the Spanish flu.
The Spanish flu is generally understood to have had a case fatality rate of over 2.5%.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3291398/

But even that is uncertain.

The Spanish flu infected around 500 million people, about a quarter or a third of the world’s population.

But the number of resulting deaths remains uncertain and contested.

https://en.wikipedia.org/wiki/Spanish_flu#Mortality
  • An estimate from 1991 states that the virus killed between 25 and 39 million people.[52] That is a case fatality rate between 5% and 7.8%.
  • A 2005 estimate put the death toll at probably 50 million (less than 3% of the global population), and possibly as high as 100 million (more than 5%).[53][54] That is a case fatality rate between 10% and 20%.
  • However, a reassessment in 2018 estimated the total to be about 17 million,[55] though this has been contested. That is a case fatality rate of 3.4%.
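
The case fatality rates in the bullets above follow from dividing each death-toll estimate by the roughly 500 million estimated infections. A minimal sketch:

```python
# Spanish flu case fatality rates implied by the death-toll estimates
# above, against an estimated 500 million infections.
infections = 500_000_000
estimates = {
    "1991 estimate":     (25_000_000, 39_000_000),
    "2005 estimate":     (50_000_000, 100_000_000),
    "2018 reassessment": (17_000_000, 17_000_000),
}

for label, (low, high) in estimates.items():
    print(f"{label}: CFR {low / infections:.1%} to {high / infections:.1%}")
```
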
When we look at the case fatality rate of the Spanish flu, we are looking backward, so that there are no open cases, only closed cases.

When the millions or billions of open cases of SARS2 eventually resolve themselves into closed cases, what will be the mortality rate?

At the moment, among closed cases, the ratio of the living to the dead has been a stable four to one since the beginning of April.

The closed case fatality rate has followed a historical progression:
  • At the beginning of February, the closed case death rate was around 40%.
  • It fell to almost 5% in the beginning of March.
  • By the beginning of April it had slowly climbed up to 20%.
The 40% death rate might have marked the high point of the pandemic in Asia.

The diminution of the closed case death rate to a mere 5% might signify the successful vanquishing of the pandemic in Asia.

The subsequent 20% death rate might reflect the rise of the pandemic in the Western world.

The closed case death rate might therefore mark the progress of the disease in an afflicted country.

  1. Early in the outbreak, a country suffers a closed case death rate of around 40%.
  2. In the middle of the lockdown, the closed case fatality rate falls to 20%.
  3. In the post-lockdown phase, a country’s closed case fatality rate falls below 5%.
In the USA, Belgium and France, the closed case death rate was roughly 40% in March, suggesting that they are in the early stages of their crisis.

https://www.worldometers.info/coronavirus/country/us/
https://www.worldometers.info/coronavirus/country/france/

(Incidentally, that 60-40 ratio is roughly the golden ratio.)

https://en.wikipedia.org/wiki/Golden_ratio

Spain was around the 20% mark of closed case deaths in March.

https://www.worldometers.info/coronavirus/country/spain/

(Interestingly, that 80-20 ratio is the Pareto principle.)

https://en.wikipedia.org/wiki/Pareto_principle

The closed case death rate in Sweden in March was 73%, implying that in Sweden, the real ordeal had yet to begin.

https://www.worldometers.info/coronavirus/country/sweden/

(That might be the inversion of Pareto.)
https://www.economist.com/graphic-detail/2020/04/17/coronavirus-infections-have-peaked-in-much-of-the-rich-world

In China, the closed case fatality rate is 6%.

https://www.worldometers.info/coronavirus/country/china/

In Taiwan, it is 3%.

https://www.worldometers.info/coronavirus/country/taiwan/

It’s also 3% in South Korea.

https://www.worldometers.info/coronavirus/country/south-korea/

In Singapore, it is 1%.

https://www.worldometers.info/coronavirus/country/singapore/

It seems that eventually the closed case fatality rate will fall to where the total case fatality rate is.
But the total case fatality rate is uncertain.

Above, it was (crudely) estimated to be 7%.

In the popular media, it is often reported to be 1.4%.

The South Korean government reports that it is 0.6%.

Random testing for SARS2 antibodies in Santa Clara County suggests that the number of infected people is 50 to 85 times higher than currently assumed.

This would drive down the case fatality rate in Santa Clara to a fraction of what it was estimated to be in South Korea, making SARS2 comparable to the seasonal flu.

https://www.cnn.com/2020/04/17/health/santa-clara-coronavirus-infections-study/index.html
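
The arithmetic behind that claim is simple: if true infections are 50 to 85 times the confirmed count, the implied infection fatality rate is the confirmed-case rate divided by the same factor. A minimal sketch; the 3% confirmed-case fatality rate is an illustrative assumption, not a figure from the study:

```python
# If true infections are 50-85x the confirmed cases, the implied
# infection fatality rate shrinks by the same multiplier.
confirmed_cfr = 0.03  # illustrative fatality rate among confirmed cases

for multiplier in (50, 85):
    implied_ifr = confirmed_cfr / multiplier
    print(f"{multiplier}x undercount -> implied fatality rate {implied_ifr:.3%}")
```

That lands in the neighborhood of the seasonal flu, whose fatality rate is commonly cited as around 0.1%.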

Nevertheless, where SARS2 flares up, the death rate soars.

For example, deaths in New York City are more than double the usual total.

https://www.nytimes.com/interactive/2020/04/10/upshot/coronavirus-deaths-new-york-city.html
Over the 31 days ending April 4, more than twice the typical number of New Yorkers died.
That total for the city includes deaths directly linked to the novel coronavirus as well as those from other causes, like heart attacks and cancer. Even this is only a partial count; we expect this number to rise as more deaths are counted.
These numbers contradict the notion that many people who are dying from the new virus would have died shortly anyway. And they suggest that the current coronavirus death figures understate the real toll of the virus, either because of undercounting of coronavirus deaths, increases in deaths that are normally preventable, or both.
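
Excess mortality, the metric behind those numbers, is simply observed deaths minus the usual baseline for the same period. A minimal sketch with hypothetical numbers shaped like the “more than double” pattern described above:

```python
# Excess mortality: observed deaths minus the seasonal baseline.
# Both numbers below are hypothetical, for illustration only.
baseline_deaths = 13_000  # typical deaths over a ~31-day window (assumed)
observed_deaths = 27_000  # deaths observed in the same window (assumed)

excess = observed_deaths - baseline_deaths
ratio = observed_deaths / baseline_deaths
print(f"excess deaths: {excess:,} ({ratio:.1f}x the baseline)")
```
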
Covid-19 is rapidly becoming America’s leading cause of death.

https://www.washingtonpost.com/outlook/2020/04/16/coronavirus-leading-cause-death/
There is a tendency to understand the fatality rate of a disease as a scientific constant in all circumstances, much like the speed of light.

But there might be other ways of looking at this.

It is important to remember that the case fatality rate is an abstraction — an educated guess.
There is a tendency to forget that abstract models are just that.

https://en.wikipedia.org/wiki/Reification_(fallacy)
Reification (also known as concretism, hypostatization, or the fallacy of misplaced concreteness) is a fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete real event or physical entity.[1][2] In other words, it is the error of treating something that is not concrete, such as an idea, as a concrete thing. A common case of reification is the confusion of a model with reality: “the map is not the territory”.
Reification takes place when natural or social processes are misunderstood or simplified; for example, when human creations are described as “facts of nature, results of cosmic laws, or manifestations of divine will”.
Reification may derive from an inborn tendency to simplify experience by assuming constancy as much as possible.
Relatedly, a case fatality rate is not completely inherent to a pathogen.

The fatality rate would also reflect the behavior of the victims of the pathogen.

There is a tendency to assume that the perceived qualities of a being are purely inherent to the entity, and not the accidental products of a historical process.

https://en.wikipedia.org/wiki/Essentialism
Essentialism is the view that every entity has a set of attributes that are necessary to its identity and function.[1] In early Western thought, Plato’s idealism held that all things have such an “essence”—an “idea” or “form”. In Categories, Aristotle similarly proposed that all objects have a substance that, as George Lakoff put it, “make the thing what it is, and without which it would be not that kind of thing”.[2] The contrary view—non-essentialism—denies the need to posit such an “essence”.
An example would be the SARS2 case death rate in South Korea.

With an aggressive regime of widespread testing, the South Koreans have determined that the death rate of SARS2 is a mere 0.6%.

That is, the number of people who die from SARS2 is a tiny tip of the vast number of people who become infected but are largely unaffected.

But this 0.6% case fatality rate might only hold true in South Korea.

The ACE2 gene has been perceived as a factor in resistance to SARS2.

First, individuals with a strong expression of ACE2 gene might have a genetic advantage in dealing with being infected by the coronavirus.

This includes young people, East Asians, women, and individuals who are not type 2 diabetics.

https://www.preprints.org/manuscript/202003.0191/v1
The COVID19 coronavirus SARS-CoV2 spreading in Wuhan and now worldwide has been shown to use angiotensin-converting enzyme 2 ACE2 as its host cell receptor, like the severe acute respiratory syndrome coronavirus (SARS-CoV). Epidemiology studies found different sex and age groups have different susceptibility to infection, and very skewed severity and mortality of the virus infection, with male, old age, and comorbidity being the most inflicted. Here by analyzing GTEx and other public data in 30 tissues across thousands of individuals, we found significantly higher expression in Asian females compared to males and other ethnic groups, an age dependent ACE2 expression decrease and a highly significant decrease in type II diabetic patients. Consistently, the most significant expression quantitative loci (eQTLs) contributing to high ACE2 expression are close to 100% in East Asians, >30% higher than other ethnic groups. Together with the shockingly common enrichment of viral infection pathways among ACE2 anti-expressed genes, binding of virus infection-related transcription factors at ACE2 regulatory regions, the repression of ACE2 expression by inflammatory cytokines and by type 2 diabetes, and the induction by estrogen and androgen (both decrease with age) established a negative correlation between ACE2 expression and CovID19 fatality at both population and molecular levels. Our results will be instrumental when designing potential prevention and treatment strategies for ACE2 binding coronaviruses in general.
Along with socioeconomic status, this might help to explain the higher death rates in New York City among blacks and Hispanics.

This does not bode well for Africa and Latin America, which are yet to face the full force of the SARS2 pandemic.

In the USA, this might suggest a favorable outcome for the state of Hawaii, which has a large population of people of East Asian background.

Unfortunately, many East Asians in Hawaii — specifically, ethnic Japanese — often have higher rates of type 2 diabetes, which would make them more susceptible to SARS2.

Earlier research had erroneously concluded that the higher rates of ACE2 expression in East Asians made East Asians MORE vulnerable to SARS2 than Europeans or Africans.

This was readily embraced by American racists who wanted to discount the possibility that the USA would be impacted by SARS2, and who wanted to play up racial differences.

https://vdare.com/articles/it-s-official-chinese-scientists-find-genetic-explanation-for-coronavirus-discriminating-by-race

The ACE2 gene is now being discounted as a factor.

https://www.nytimes.com/2020/06/03/health/coronavirus-blood-type-genetics.html

Yet other genetic factors are emerging.

Having Type A blood was linked to a 50 percent increase in the likelihood that a patient would need to get oxygen or to go on a ventilator.

There is one interesting discrepancy in comparing the USA and the UK.

Again, in the USA, Asians have lower COVID-19 death rates than whites.

In the UK, Asians have higher death rates from COVID-19 than whites.

COVID-19 death rates have been linked to socioeconomic status.

The media in the UK explain this as a result of socioeconomic disparities between whites and people of color.

https://www.newsweek.com/people-color-more-likely-die-coronavirus-white-people-1498083

However, the label “Asians” in the UK typically refers to South Asians.

In the USA, “Asians” generally refers to East Asians.

South Asians might be genetically more vulnerable to SARS2 than East Asians, which might become a tragedy for India, Pakistan and Bangladesh.

Interestingly, the USA diverges from the UK in socioeconomic disparities by ethnicity.

In fact, South Asians are the wealthiest group in the USA in a ranking of ethnic groups by household income.

https://en.wikipedia.org/wiki/List_of_ethnic_groups_in_the_United_States_by_household_income
  1. Indian (ancestry from India)
  2. East Asian
  3. White
  4. Middle Eastern
  5. Native Hawaiians and Pacific Islanders
  6. Hispanic or Latino
  7. Black or African American
  8. American Indian or Native American
That Asians are generally more prosperous than whites and others might explain the SARS2 death rates by ethnicity in New York City, with Asians being least afflicted.

But it might be instructive to compare death rates and income between East Asians and South Asians in NYC.

This study would have to take into account the number of healthcare workers by ethnicity, because Asian groups might be disproportionately represented and thus more vulnerable.

This might be especially true of Filipinos, who are alternately and sometimes simultaneously classified as Asian and/or Pacific Islander, and who are often healthcare professionals.

IIRC, there was a comparative study of terrorist acts by Moslem immigrants in the USA and the UK, which contrasted the household incomes of immigrants in those two countries.

In the UK, Jamaican and Pakistani immigrants had half the income of white Englishmen.

In the USA, Jamaican and Pakistani immigrants had twice the income of white Americans.

In the UK, whites do earn more than Jamaicans and Pakistanis, but perhaps not twice as much.
https://www.bbc.com/news/business-48919813
In the USA, Pakistani incomes are higher than the average American income, but not twice as much.
https://prosperitynow.org/blog/racial-wealth-snapshot-asian-americans

It is difficult to find information on Jamaican immigrants in the USA, but Caribbean immigrants and black immigrants tend to have lower incomes than average US households.
In 2017, households headed by a Caribbean immigrant had a median income of $47,000, compared to $56,700 and $60,800 for all immigrant and U.S.-born households, respectively.
https://www.migrationpolicy.org/article/caribbean-immigrants-united-states
https://www.pewsocialtrends.org/2015/04/09/chapter-1-statistical-portrait-of-the-u-s-black-immigrant-population/

It could be that Pakistani immigrants to the USA are from the professional classes, whereas Pakistani immigrants in the UK are poorer and less educated.

But it also draws out the distinction between income equality and social mobility.

The USA might have higher income disparities, but the USA might have higher social mobility rates for certain immigrants, both compared to the UK and to non-immigrant Americans.

One can see this difference in the lives of Nigerian immigrants to the USA.

The most educated ethnic group in the USA are Nigerians, not Asians.

https://en.wikipedia.org/wiki/Nigerian_Americans
According to a data provided by Rice University in Texas, Nigerian-Americans are the most educated ethnic group in the United States.[14] According to the Migrations Policy Institute 29% of Nigerian-Americans have graduate degrees (compared to 11% of the overall American population).[15] Furthermore, a minimum of four percent of Nigerian-Americans are also Ph.D holders. This is at least three times higher than any other ethnic group in the United States of America.[16] Nigerian-Americans are also known for their exploits in medicine, science, technology, and literature.
In their open ambitiousness, Nigerians seem like the Koreans of western Africa.

Speaking of Koreans….

Culturally, there could be lifestyle traits unique to Korea that render Koreans and East Asians less susceptible to death and infection from the coronavirus.

From 2002 to 2004, the SARS epidemic — from a virus related to SARS-CoV-2 — spread to different countries, infecting 8,098 people and killing 774.

https://en.wikipedia.org/wiki/Severe_acute_respiratory_syndrome#Epidemiology
SARS was a relatively rare disease; at the end of the epidemic in June 2003, the incidence was 8,422 cases with a case fatality rate (CFR) of 11%.
Despite SARS’s origin in China and its focus on East Asia, only three Koreans became infected.
At the time, it was widely noted that the centerpiece of the Korean diet was spiced, fermented vegetables, which are not a part of other East Asian cuisines.

https://en.wikipedia.org/wiki/Kimchi#Nutrition_and_health
During the 2003 SARS outbreak in Asia, many people believed that kimchi could protect against infection. While there was no scientific evidence to support this belief, kimchi sales rose by 40%.
South Korea consumes 1.85 million metric tons of kimchi annually, or nearly 80 pounds a person.
Koreans also eat a lot of food.

IIRC, historically, a Korean soldier consumed three times as many calories as a Japanese soldier.

In traditional warfare, this gave Korean soldiers an advantage because they were larger, stronger and more energetic than their Japanese counterparts.

With the advent of modern weaponry, the advantage turned toward the Japanese because they could now field three times as many troops as the Koreans.

Koreans could eat so much without becoming unhealthy because their diet was basically vegetarian, consisting mostly of rice and (uncooked) fermented vegetables.

Also, for the past seven thousand years, Koreans have had underfloor heating.

https://en.wikipedia.org/wiki/Underfloor_heating#History

This influenced all sorts of aspects of Korean culture.
  • Koreans spent a lot of time lying on the warm floor.
  • Koreans thus tended to wear comfortable, baggy clothing.
  • They played games suited to sitting on the floor.
  • They ate a lot of slow-cooking stews because there was always a fire in the kitchen.
  • The kitchen was below ground and screened off from the living area, eliminating smoke.
Many of these habits persist and tend to boost the immune system.

Like other East Asians, Koreans bow more often than they shake hands, which reduces the transmission of pathogens.

Koreans and other East Asians also readily wear face masks because the air quality in East Asia can be so terrible and because they have a recent familiarity with pandemics.

https://www.upi.com/Top_News/World-News/2018/11/27/South-Korea-smothered-in-Chinese-air-pollution/3891543308613/

https://cdn.cnn.com/cnnnext/dam/assets/151022135937-smog-woman-2-mask-pollution-exlarge-169.jpg

Koreans are working and socializing much as they did before the pandemic, but they are wearing face masks.

The typical face mask is an imperfect form of protection from pathogens.

However, that imperfection might be of benefit in an unexpected way.

The chances of dying from exposure to a lethal virus are in proportion to the amount of virus involved.

https://www.nytimes.com/2020/04/01/opinion/coronavirus-viral-dose.html
The importance of viral dose is being overlooked in discussions of the coronavirus. As with any other poison, viruses are usually more dangerous in larger amounts. Small initial exposures tend to lead to mild or asymptomatic infections, while larger doses can be lethal.
By going about their usual business but wearing crude face masks, South Koreans might be harmlessly exposing other Koreans to small, manageable amounts of coronavirus.

South Koreans might thus be unintentionally and very successfully carrying out the idiosyncratic British policy of using a pandemic to create “herd immunity”.

The British healthcare establishment adopted the strategy of allowing a plague to infect its entire population as a form of mass vaccination.

This would presumably protect against … being infected by that very same plague.

In the South Korean case, people might be exposed to coronavirus in small doses so that they register as having been infected.

That low level of infection is not the same kind of massive exposure that healthcare workers can experience.

However, a low level of exposure that does not cause disease might not create immunity.

Moreover, this coronavirus might not offer any long-term immunity, anyway.

https://www.cnbc.com/2020/04/17/who-issues-warning-on-coronavirus-testing-theres-no-evidence-antibody-tests-show-immunity.html

The point is that detecting this mass exposure would skew the case fatality rate in South Korea.
There would be few fatalities despite a huge number of infections.

This would be the exact opposite of the experience of health care workers who are bombarded by large amounts of the virus.

Despite not fitting the classic profile of those most vulnerable to SARS2, young, healthy female nurses in China have died in disproportionate numbers.

So if one takes strong precautions yet becomes mildly infected by the coronavirus, the outcome might be quite favorable.

In contrast, if one is inundated by up-close, prolonged and intense contact with dying patients who are saturated with the coronavirus, the outcome would be dire.

This is not true of the seasonal flu.

So perhaps it’s dumb to panic over SARS2 — if you are an ordinary civilian constantly wearing a mask, washing your hands and social distancing.

But if one is a physician in a hospital intensive care unit, then it’s dumb not to take extreme measures over a virus that still very much remains an unknown quantity.

This explains the paradox of certain epidemiologists early in the crisis claiming that SARS2 was not so different from the flu.

Meanwhile, healthcare workers were strongly asserting that it is very different from the flu.

It’s like astronomers claiming that a black hole is theoretically not so different from a star, while astronauts on the event horizon beg to differ.

There are historical precedents on how case fatality rates can vary by ethnicity.

One example is the Black Death of the 1300s.

Some areas (in green) saw relatively few cases of the plague, despite what would be expected.
(map of the spread of the bubonic plague in Europe in the 1300s)
https://upload.wikimedia.org/wikipedia/commons/thumb/d/dc/Bubonic_plague_map.PNG/800px-Bubonic_plague_map.PNG

One theory is that these unexpectedly less afflicted areas had large Jewish populations.

The Jews in these regions would have had strict dietary laws and other proscriptions meant to promote ritual purity that might have promoted hygiene.

https://en.wikipedia.org/wiki/Kosher_foods

In order to win converts, Christianity abandoned such restrictions.

In a thought experiment, one can imagine an alternate universe where that did not happen.

In fact, what if Christians had adopted religious proscriptions that promoted hygiene — for example, not shaking hands?

In that case, the Black Death, which altered western history so significantly, might have hardly perturbed Europe.

The plague destroyed feudalism in western Europe, but reinforced serfdom in eastern Europe.

Likewise, the aftermath of the SARS2 pandemic offers two developmental pathways.

There could be either a sundering of the status quo or a reactionary retreat into deep conservatism and a rebound into inequality, as after the 2008 recession.

http://inthesetimes.com/article/22445/scholar-pandemics-inequality-coronavirus-recession-elites-walter-scheidel

In the aftermath of economic collapse, societies become radicalized.

However, when the economic crises are joined with pandemics, societies become xenophobic and authoritarian.

By this understanding, a world that was already trending toward populism will probably pivot hard toward even greater authoritarianism and scapegoating.

https://www.nytimes.com/2020/05/06/opinion/coronavirus-trump-authoritarianism.html

But back to the thought experiment….

In particular, what if Christians had been required by their religion to drink tea and avoid alcohol?

Historically, Europeans drank alcoholic beverages — beer and wine — instead of water because they considered water too unsafe to drink.

Widespread alcohol consumption might inject a certain violence and instability into a society, but it might also promote individualism and creativity, which are modern values.

Here, one thinks of the Celts, whose culture seemed to revolve around alcohol (and still does).

But this culture of alcohol might have made the Celts vulnerable to more disciplined and organized cultures, like the Romans.

(map of Celtic civilization in Europe)
https://en.wikipedia.org/wiki/Celts

The adoption of tea drinking in China has been called perhaps the single most successful public health policy in history.

https://en.wikipedia.org/wiki/History_of_tea_in_China
According to legend, tea was first discovered by the legendary Chinese emperor and herbalist, Shennong, in 2737 BCE.[3] It is said that the emperor liked his drinking water boiled before he drank it so it would be clean, so that is what his servants did. One day, on a trip to a distant region, he and his army stopped to rest. A servant began boiling water for him to drink, and a dead leaf from the wild tea bush fell into the water. It turned a brownish color, but it was unnoticed and presented to the emperor anyway. The emperor drank it and found it very refreshing, and cha (tea) came into being.
Among other habits, tea drinking among Chinese railroad workers in the USA made them healthier than the Irish workers.

https://www.pbs.org/wgbh/americanexperience/features/tcrr-workers-central-union-pacific-railroad/
Healthier Habits
Workers lived in canvas camps alongside the grade. In the mountains, wooden bunkhouses protected them from the drifting snow, although these were often compromised by the elements. Each gang had a cook who purchased dried food from the Chinese districts of Sacramento and San Francisco to prepare on site. While Irish crews stuck to an unvarying menu of boiled food — beef & potatoes — the Chinese ate vegetables and seafood, and kept live pigs and chickens for weekend meals. To the dull palates of the Irishmen, the Chinese menu was a full-blown sensory assault. The newcomers seemed alien in other ways: they bathed themselves, washed their clothes, stayed away from whiskey. Instead of water they drank lukewarm tea, boiled in the mornings and dispensed to them throughout the day. In such a manner they avoided the dysentery that ravaged white crews.
A fixed diet of boiled beef and potatoes sounds grim, but by Irish standards this was the American dream come true.

The American stereotype of the Irish is that they love to feast on corned beef and cabbage.

But historically, beef — too expensive for the Irish — was eaten by the English.

Irish cattle were shipped to England for slaughter and consumption (the export of cattle was banned after Irish independence), and the Irish turned toward the potato.

Historically, the mainstay of the Irish diet was mashed potatoes with butter — a complete meal.

If they were lucky enough to stumble on a dead sheep, then its meat would be added to the boiling potatoes (“Irish stew”).

https://en.wikipedia.org/wiki/Corned_beef
The Celtic grazing lands of … Ireland had been used to pasture cows for centuries. The British colonized … the Irish, transforming much of their countryside into an extended grazing land to raise cattle for a hungry consumer market at home … The British taste for beef had a devastating impact on the impoverished and disenfranchised people of … Ireland. Pushed off the best pasture land and forced to farm smaller plots of marginal land, the Irish turned to the potato, a crop that could be grown abundantly in less favorable soil. Eventually, cows took over much of Ireland, leaving the native population virtually dependent on the potato for survival.
Despite being a major producer of beef, most of the people of Ireland during this period consumed little of the meat produced, in either fresh or salted form, due to its prohibitive cost. This was because most of the farms and its produce were owned by wealthy Anglo-Irish who were absentee landlords and that most of the population were from families of poor tenant farmers, and that most of the corned beef was exported.
The lack of beef or corned beef in the Irish diet is especially true in the north of Ireland and areas away from the major centres for corned beef production. However, individuals living in these production centres such as Cork did consume the product to a certain extent. The majority of Irish who resided in Ireland at the time mainly consumed dairy products and meats such as pork or salt pork,[12] bacon and cabbage being a notable example of a traditional Irish snack.
Anthropologically, in terms of the distinction between nature and culture (“raw” versus “cooked”), preparing tea in a ritualistic manner (e.g., tea time) is a step toward civilized life.

That is, the very inconvenience of certain daily practices — for example, the Japanese removing their shoes when they enter a home — promotes a sense of boundaries and a sense of decency.

From these rituals, people gain a sense of themselves as being superior to outsiders who seem so barbaric in their uninhibited habits.

Hence, sometimes Chinese would add just a single leaf of tea to boiled water for their “tea”, in order to distinguish it from mere water.

Also, having a ritual such as tea time makes people more productive.

IIRC, when the Irish and the Chinese competed to lay the most track, the Irish would work without a break while the Chinese would stop to drink tea, yet the Chinese still outperformed the Irish.

It is the ritual interruptions that increase productivity.

https://en.wikipedia.org/wiki/Donald_Francis_Roy

Perhaps even better than tea would be coffee.

The Irish drank water and whiskey, but they also drank coffee.

That’s perfect for the Irish, because coffee feeds imagination, talkativeness and political rebellion.
Hence, coffee was often banned by traditionalist monarchs.

https://www.npr.org/sections/thesalt/2012/01/10/144988133/drink-coffee-off-with-your-head
Wherever it spread, coffee was popular with the masses but challenged by the powerful.
“If you look at the rhetoric about drugs that we’re dealing with now — like, say, crack — it’s very similar to what was said about coffee,” Stewart Allen, author of The Devil’s Cup: Coffee, the Driving Force in History, tells The Salt.
Monarchs and tyrants publicly argued that coffee was poison for the bodies and souls of their subjects, but Mark Pendergrast — author of Uncommon Grounds: The History of Coffee and How It Transformed Our World — says their real concern was political.
“Coffee has a tendency to loosen people’s imaginations … and mouths,” he tells The Salt.
And inventive, chatty citizens scare dictators.
According to one story, an Ottoman Grand Vizier secretly visited a coffeehouse in Istanbul.
“He observed that the people drinking alcohol would just get drunk and sing and be jolly, whereas the people drinking coffee remained sober and plotted against the government,” says Allen.
Coffee fueled dissent — not just in the Ottoman Empire but all through the Western world. The French and American Revolutions were planned, in part, in the dark corners of coffeehouses. In Germany, a fearful Frederick the Great demanded that Germans switch from coffee to beer. He sent soldiers sniffing through the streets, searching for the slightest whiff of the illegal bean.
In England, King Charles II issued an order to shut down all coffeehouses after he traced some clever but seditious poetry to them. The backlash was throne-shaking. In just 11 days, Charles reversed his ruling.
And so coffee took its place in the center of culture. Where so many other underground movements — religious, political, even musical — were squashed, coffee managed to go mainstream.

Is it possible for a government to alter the hygiene habits of a people?

To some extent, this is the mission of the government of Singapore.

Lee Kuan Yew, the founder of modern Singapore, explained that the only way for a provincial society to develop is to alter the habits of the population.

For example, Chinese men love to urinate inside elevators, so the Singapore government installed sensors in elevator carpets to set off an alarm when the carpets became wet.

To outfox the sensors, the men would then wait to get off the elevator, turn around and then urinate into the elevator from outside, forcing the government to install cameras.

It is a constant game of wits.

As Lee Kuan Yew asked aloud, “Can the Chinese government get one billion Chinese to stop spitting in public?”

Singapore managed to do this, but its attainment was and remains an all-consuming task.

Within a generation, Singapore achieved not only full economic development, but the culmination of a civilizing process that typically takes centuries.

https://en.wikipedia.org/wiki/The_Civilizing_Process
The first volume, The History of Manners, traces the historical developments of the European habitus, or “second nature”, the particular individual psychic structures molded by social attitudes. Elias traced how post-medieval European standards regarding violence, sexual behaviour, bodily functions, table manners and forms of speech were gradually transformed by increasing thresholds of shame and repugnance, working outward from a nucleus in court etiquette. The internalized “self-restraint” imposed by increasingly complex networks of social connections developed the “psychological” self-perceptions that Freud recognized as the “super-ego“.
There might be a better way to promote hygiene than Singapore’s policy of imposing fines on the public for the slightest infraction.

A system of rewards might actually be more effective than punishment.

https://en.wikipedia.org/wiki/Reinforcement#Positive_reinforcement

One proposal during the SARS2 crisis is that every American should receive $2000 a month in emergency assistance from the federal government.

https://www.marketwatch.com/story/one-1200-stimulus-check-wont-cut-it-give-americans-2000-a-month-to-fire-up-the-economy-2020-04-20

https://www.forbes.com/sites/ryanguina/2020/04/18/proposed-2000-monthly-stimulus-checks-and-canceled-rent-and-mortgage-payments-for-1-year

One counterargument to this proposal is that if this emergency assistance were extended well into the post-crisis stage, it would inhibit a recovery.

The public, still spooked about a pandemic, would simply remain at home rather than look for jobs or create new businesses.

The stimulus checks extended into a recovery would be a form of Universal Basic Income (UBI), which is unconditional.

https://en.wikipedia.org/wiki/Basic_income
Basic income, also called universal basic income (UBI), citizen’s income, citizen’s basic income, basic income guarantee, basic living stipend, guaranteed annual income, or universal demogrant, is a governmental public program for a periodic payment delivered to all on an individual basis without means test or work requirement.
The UBI can take different forms.
Basic income can be implemented nationally, regionally or locally. An unconditional income that is sufficient to meet a person’s basic needs (at or above the poverty line) is sometimes called a full basic income while if it is less than that amount, it is sometimes called partial. A welfare system with some characteristics similar to those of a basic income is a negative income tax in which the government stipend is gradually reduced with higher labour income.
When conditions are imposed, it is not really a UBI; it is a guaranteed minimum income.
Some welfare systems are sometimes regarded as steps on the way to a basic income, but because they have conditionalities attached they are not basic incomes. If they raise household incomes to specified minima they are called guaranteed minimum income systems. For example, Bolsa Família in Brazil is restricted to poor families and the children are obligated to attend school.
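
The negative income tax mentioned above reduces to a one-line formula. Here is a minimal sketch in Python, assuming a purely hypothetical guarantee level and phase-out rate (any real program would set its own parameters):

    # Minimal sketch of a negative income tax: the government stipend
    # shrinks as labour income rises. All parameters are hypothetical.
    def negative_income_tax(earned_income, guarantee=12_000.0, phase_out_rate=0.5):
        return max(0.0, guarantee - phase_out_rate * earned_income)

    # The stipend reaches zero at guarantee / phase_out_rate = $24,000 of earnings.
    for income in (0, 10_000, 24_000, 40_000):
        print(income, negative_income_tax(income))  # 12000.0, 7000.0, 0.0, 0.0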

Guaranteed Minimum Incomes are already found in many societies, and are conditional.

https://en.wikipedia.org/wiki/Guaranteed_minimum_income
Guaranteed minimum income (GMI), also called minimum income, is a system[1] of social welfare provision that guarantees that all citizens or families have an income sufficient to live on, provided they meet certain conditions. Eligibility is typically determined by the following: citizenship, a means test, and either availability for the labor market or a willingness to perform community services. The primary goal of a guaranteed minimum income is to reduce poverty. If citizenship is the only requirement, the system turns into a universal basic income.
A system of guaranteed minimum income can consist of several elements.
Under the proposal considered here, individuals who responsibly engage in SARS2 precautions like social distancing would be financially rewarded by the government.

From a conservative point of view, this would avoid moral hazard, which is the punishment of virtuous behavior and the incentivizing of abuse.

https://en.wikipedia.org/wiki/Moral_hazard
In economics, moral hazard occurs when an individual has an incentive to increase their exposure to risk because they do not bear the full costs of that risk. For example, when a person is insured, they may take on higher risk knowing that their insurance will pay the associated costs. A moral hazard may occur where the actions of the risk-taking party change to the detriment of the cost-bearing party after a financial transaction has taken place.
Economist Paul Krugman described moral hazard as “any situation in which one person makes the decision about how much risk to take, while someone else bears the cost if things go badly.”
https://www.newyorker.com/magazine/2018/07/09/who-really-stands-to-win-from-universal-basic-income

The ideological counterpart to this moralist conservative view would be moral idealism.

Liberals and socialists are moral idealists who believe that every individual is entitled to certain basic goods.

There seems to be widespread agreement that during and immediately after an emergency such as a natural disaster, humanitarian relief is in order.

But most people don’t feel that it should become permanent.

Liberals and socialists might be attracted to the idea of a Universal Basic Income on those moral grounds.

The premise underlying moral idealism is the concept of basic human rights.

One flaw of this viewpoint is that human rights do not exist.

“Human rights” is an amorphous label that politicians and interest groups can and will apply to whatever they desire.

Civil rights exist, and can be altered through the changing of laws.

Human welfare as a norm exists, and its standards can be raised.

It is a rhetorical sleight of hand, however, to fuse “civil rights” with standards of “human welfare” in order to conceive a universal “human rights”.

“Human rights” issues should be treated as the human welfare issues that they really are.

It would improve human welfare during the SARS2 crisis if people socially distanced from one another, tested themselves and wore masks — and this would be financially rewarded.

From a libertarian point of view, one function of government is to compensate for “externalities”.
https://en.wikipedia.org/wiki/Externality
In economics, an externality is the cost or benefit that affects a third party who did not choose to incur that cost or benefit.[1] Externalities often occur when the production or consumption of a product or service’s private price equilibrium cannot reflect the true costs or benefits of that product or service for society as a whole.
For example, manufacturing activities that cause air pollution impose health and clean-up costs on the whole society, whereas the neighbors of individuals who choose to fire-proof their homes may benefit from a reduced risk of a fire spreading to their own houses. If external costs exist, such as pollution, the producer may choose to produce more of the product than would be produced if the producer were required to pay all associated environmental costs. Because responsibility or consequence for self-directed action lies partly outside the self, an element of externalization is involved. If there are external benefits, such as in public safety, less of the good may be produced than would be the case if the producer were to receive payment for the external benefits to others. For the purpose of these statements, overall cost and benefit to society is defined as the sum of the imputed monetary value of benefits and costs to all parties involved.
Those who socially distance and engage in other safe behaviors in the face of the SARS2 threat promote the safety of the entire society.
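
To see the incentive problem in miniature, here is a toy sketch; every number is invented purely for illustration:

    # A precaution (masking, distancing) that costs the individual more
    # than it privately benefits him, but benefits society far more.
    private_benefit = 20.0    # value of the precaution to the individual, in $
    private_cost = 50.0       # inconvenience cost to the individual, in $
    external_benefit = 200.0  # value to everyone else (infections averted), in $

    # Left alone, the individual weighs only his own benefit and cost:
    print(private_benefit > private_cost)           # False: precaution skipped

    # A government reward equal to the external benefit aligns the incentives:
    reward = external_benefit
    print(private_benefit + reward > private_cost)  # True: precaution taken

This is the standard textbook remedy for a positive externality: pay the actor the value of the benefit that would otherwise leak away to third parties.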

Such positive externalities are very high in the case of a pandemic because the growth rate of the disease is exponential.

https://en.wikipedia.org/wiki/Doubling_time
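
A back-of-the-envelope sketch shows why the stakes compound so quickly; the five-day serial interval and the Rt values below are assumptions chosen for illustration, not measurements:

    import math

    SERIAL_INTERVAL = 5.0  # assumed days between successive generations of infection

    def doubling_time(rt, serial_interval=SERIAL_INTERVAL):
        # Days for case counts to double; meaningful only for Rt > 1.
        return serial_interval * math.log(2) / math.log(rt)

    def cases_after(days, initial_cases, rt, serial_interval=SERIAL_INTERVAL):
        # Naive exponential projection: each interval multiplies cases by Rt.
        return initial_cases * rt ** (days / serial_interval)

    print(doubling_time(1.4))          # ~10.3 days
    print(cases_after(30, 1000, 1.4))  # ~7,530 cases after a month
    print(cases_after(30, 1000, 1.1))  # ~1,772 cases if precautions cut Rt to 1.1

Under these toy numbers, nudging Rt from 1.4 down to 1.1 spares thousands of infections within a single month; that difference is the external benefit a conditional payment would be rewarding.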

Moreover, for East Asian societies with a recent history of pandemics, and for intelligent, educated and socially conscious people like Bill Gates, pandemics are to be expected.

These societies responded to SARS2 with testing, tracing, isolating and masks — but they never shut down.

In contrast, in ignorant and irresponsible societies in the West, the pandemic seemed to come out of nowhere — a pseudo-black swan event.

https://en.wikipedia.org/wiki/Black_swan_theory

Unprepared, these societies needed to enter a devastating lockdown.

This triggered a long, powerful series of unforeseen consequences, which might include permanent decline and historical irrelevance (already true for the UK, Spain and Italy).

So the consequences — that is, negative externalities — of not taking personal precautions are profound.

This means that those who do engage in safe personal conduct should be rewarded.

There have been public protests against the lockdowns in the USA.

The protesters assert that the lockdown is unconstitutional and infringes on their personal liberties.
The protesters are populists who can be said to embody an American “folk libertarian” tradition.
Perhaps “libertarian” would be the wrong adjective to describe the populist.

A libertarian would recognize that personal liberties have limits.

In fact, liberties are a framework of mutual constraints that protects individuals from each other.

https://www.nytimes.com/2003/03/23/weekinreview/nation-freedom-vs-liberty-more-than-just-another-word-for-nothing-left-lose.html
As the political theorist Hanna Fenichel Pitkin has observed, liberty implies a system of rules, a ”network of restraint and order,” hence the word’s close association with political life. Freedom has a more general meaning, which ranges from an opposition to slavery to the absence of psychological or personal encumbrances (no one would describe liberty as another name for nothing left to lose).
A populist might be better understood as an “inconsistent anarchist”.

In contrast to the minimal-government libertarian, the anarchist does not want any government whatsoever.

The populist conceives “liberty” as a total absence of constraints.

For example, conservatives oppose the Affordable Care Act on the grounds that it gives health care benefits to those who do not want to work for it.

https://www.newyorker.com/magazine/2017/10/02/is-health-care-a-right

In contrast, populists hate Obamacare because it forces families to buy some form of health insurance.

At the same time, the populist is inconsistent because the populist also wants a large, generous government — for themselves, but not for others.

So populists desire health care from the government, but they don’t want to be forced to pay for it and they don’t want outsiders to benefit from it.

Some of the things that populists hate:
  • “elites” (a flexible category that can encompass ANYONE and ANYTHING)
  • foreigners
  • outsiders
  • minorities
  • government
  • corporations
  • cities
  • EDUCATION
The populist loves the army as protector of the tribe, but does not perceive the military as a government institution.
The worldview of the populist is marked by very concrete thought processes.
  • Egocentric morality. Almost all children have moral feelings, but these ethical impulses are typically tribal and socially exclusive in nature. A child will say, “They should not close the factory because my father works in the factory.” Only through the maturation process does the individual learn to universalize morality to include strangers.
  • Pragmatic ideation. Modern people classify objects by categories. In traditional societies, objects are classified by their use. So when viewing a hammer, a shovel, a hoe and a potato, a modern person will identify them as three tools and a vegetable. A person in a traditional society will see one woodworking implement, and three farming objects.
  • Contradictory perceptions. It is said that “The mob is fickle.” Yet the populist does not so much readily change his mind as he simultaneously maintains divergent beliefs about everything. He wants this AND he wants that, and he makes a fuss about it.
In contrast to the populist, the romantic is a young, over-educated, semi-privileged, urban white guy.

Like the populist, the romantic is full of contradictions, but these contradictions are of a different nature than those of the populist.

The romantic also hates some of the things that populists hate — such as “elites”, corporations, government, the educational system.

However, the romantic embodies these things.

For example, in San Francisco, pseudo-liberal billionaires are pricing out pseudo-liberal millionaires.

https://www.newyorker.com/magazine/2014/07/07/california-screaming

The over-educated romantic represents the top 20% of income earners going to war against the top 1% on the pretext of problems that were actually created by the top 20%.

https://www.nytimes.com/2020/04/23/opinion/income-inequality.html

The romantic, however, identifies with foreigners, minorities, and outsiders — even though the romantic is not one of them.

The movie star who manifests the contradictions of the romantic in virtually all the roles that he plays is Matt Damon.

Matt Damon’s niche is the talented, semi-elite, semi-insider, semi-outsider with identity confusion.

The romantic suffers from what might be called “race, class and gender dysphoria”.

https://en.wikipedia.org/wiki/Dysphoria

The rural populist affirms “My country, right or wrong”, and blames his own problems and his country’s problems on the outside world.

In contrast, the urban American romantic asserts that the real source of problems in the world today is … his own country.

While the populist glorifies the military, the American romantic demonizes the CIA.

While the humble rural ethos pleads “Please don’t look at me, I’m normal”, the urban individualist asserts “Look at me! I’m SPECIAL!”

The romantic is a self-righteous narcissist waging a glorious revolution to liberate the world.

https://en.wikipedia.org/wiki/Lord_Byron

In fact, in some cases — like that of Ira Einhorn, the ur-Bernie Bro of the 1960s — the romantic is a sociopath.

https://www.nytimes.com/2020/04/07/us/ira-einhorn-dead.html

The personality of the romantic is characterized by a tendency toward abstraction.
  • Ideal self-interest. The sociologist Max Weber said that humans have material self-interest, but their self-image also comprises a form of self-interest (economics focuses on the former, whereas sociology examines the latter in the form of social status). This is related to conscience, but also to narcissism, and the romantic has both in abundance.
  • Reification. The romantic is all theory, and he mistakes the map for the territory.
  • Hypocrisy. A person's worldview is often an expression of his personality, but sometimes it is the perfect reverse mirror image of it. The romantic is typically in the latter camp.
Once in power, both the romantic and the populist exhibit a certain dual personality.

One can see this in the USA in the presidencies of populist politicians.

The populist leader seeks to simultaneously buttress government authority and undermine it.

Literally in the same breath that he pronounces the government’s endorsement of wearing masks to limit the spread of coronavirus, Trump emphasizes that he refuses to wear them.

When Reagan was elected, he removed the solar panels that his predecessor Jimmy Carter had installed on the roof of the White House.

Reagan objected to the solar panels on the grounds that they detracted from the dignity and authority of the central symbol of the governmental power.

At the same time, Reagan continued to lament government power.

Going back even further, one finds Andrew Jackson adopting a “spoils system” whereby patrons of the winning presidential party would displace the professional civil service.

Jackson’s goal was not to eliminate government and the civil service, but to gut them of experts and replace those experts with party hacks whose power would be transitory and ephemeral.

One finds the same paradox with politicians who were romantics.

In post-revolutionary China, Mao Zedong was disturbed to find the Communist Party rapidly transforming into a career-oriented bureaucratic hierarchy like that of any other country.

Mao’s response was to foster a kind of “permanent revolution” in which governmental institutions would be disrupted through campaigns meant to foster chaos.

At the same time, the leader of this government was Mao himself (importantly, as a “chairman”, not a CEO).

The countercultural movement of the 1960s has been seen as a popular, Americanized version of romanticism.

https://en.wikipedia.org/wiki/The_Making_of_a_Counter_Culture
Theodore Roszak found common ground between 1960s student radicals and hippie dropouts in their mutual rejection of what he calls the technocracy–the regime of corporate and technological expertise that dominates industrial society.
At the heart of the counterculture was a rejection of expertise and rationality.
https://en.wikipedia.org/wiki/Technocracy
Technocracy is an ideological system of governance in which decision-makers are selected on the basis of their expertise in a given area of responsibility, particularly with regard to scientific or technical knowledge. This system explicitly contrasts with the notion that elected representatives should be the primary decision-makers in government,[1] though it does not necessarily imply eliminating elected representatives.
Leadership skills for decision-makers are selected on the basis of specialized knowledge and performance, rather than political affiliations or parliamentary skills.
Of course, rural populists are likewise famous for their disdain and resentment of technocrats and experts.

Yet unlike the populists, the romantics are so often themselves technocrats and experts, usually engaged in a career involving symbolic manipulation or information technology.

This is hinted at in the Netflix documentary “Wild Wild Country”, about the Oregon commune of the cult leader Bhagwan Shri Rajneesh.

The ranchers who neighbored the commune described the cultists as financially successful, over-educated professionals who lacked common sense.

Similarly, the over-educated romantic may hate elites, but unlike the populist, the romantic does not really want to create an egalitarian society.

Rather, romantics fantasize about becoming the new elites.

Importantly, however, despite their obsession with politics, romantics do not really want to get involved in actual messy politics or policy.

Politics is fantasy football for “tenured radical” romantics.

When over-educated romantics get together on the weekends to drink expensive wine and beer, they talk in intoxicated detail about how they would run the world if given the chance.

The rural populist and the over-educated romantic therefore might tend to oppose the idea of a guaranteed minimum income conditional on adopting hygienic practices.

This is because they would chafe at the idea of conditions imposed from above.

What of progressives who want society to be rationally managed as an enlightened technocracy?

Just as libertarians want to allow for rational behavior at the micro level of the individual, progressives desire rational policies at the macro, societal level.

https://en.wikipedia.org/wiki/Age_of_Enlightenment

The classic progressive policy would be establishing a rational system of measurement — the metric system.

IIRC, the first modern unit of currency based on a decimal system was the US dollar.

https://en.wikipedia.org/wiki/United_States_dollar#Overview
Unlike the Spanish milled dollar, the U.S. dollar has been based upon a decimal system of values since the time of the Continental Congress.[22] This decimal system was again described in the Coinage Act of 1792: in addition to the dollar, the Act officially established monetary units of mill or one-thousandth of a dollar (symbol ₥), cent or one-hundredth of a dollar (symbol ¢), dime or one-tenth of a dollar, and eagle or ten dollars, with prescribed weights and composition of gold, silver, or copper for each.
A less successful attempt to impose the decimal system was the French Revolution’s system of keeping time.

https://en.wikipedia.org/wiki/Decimal_time

In post-revolutionary America, spelling reform was a manifestation of the desire to rationalize language.

Noah Webster’s dictionary was a successful attempt to mold the spelling of English words into scientific form.

https://en.wikipedia.org/wiki/English-language_spelling_reform#19th_century

More radical attempts to accomplish this in the 20th century fizzled.

https://en.wikipedia.org/wiki/English-language_spelling_reform#20th_century_onward

In the contemporary political scene, the quintessential progressive would be the German leader Angela Merkel — a scientist.

The following article describes Merkel as an idealist.

https://www.theatlantic.com/international/archive/2020/04/angela-merkel-germany-coronavirus-pandemic/610225/
Merkel—for whom, as a former East German, liberty and freedom are known to be paramount—understood all too personally what the lockdown meant for her fellow citizens.
In contrast, a New Yorker profile describes Merkel as a pragmatist who has maintained power for so long by cleverly portraying herself as an idealist.

https://www.newyorker.com/magazine/2014/12/01/quiet-german

Having grown up in communist East Germany, Merkel believes that society needs some measure of liberty to function properly, but her attitude toward freedom is largely utilitarian.

In fact, Merkel despises idealists as emotional grandstanders who fail to do their homework (she happily spends her weekends reading policy papers).

For this reason, Merkel initially disdained Barack Obama and his soaring idealistic rhetoric — until she concluded that he was secretly a rationalist and pragmatist like herself.

It has been said of Obama that his idea of a perfect society would be a cross between Sweden and Singapore — a rationally enlightened and starkly disciplined society.

Merkel would approve.

To stay in power, Merkel adopted a simple trick.

Whenever her approval ratings declined, she would ostentatiously adopt an idealistic policy that she herself did not believe in.

One example was her abandonment of nuclear power after the Fukushima Daiichi nuclear disaster in 2011 (like many progressives such as Obama, Merkel is quietly pro-nuclear).

Another example would be her allowing refugees into Germany.

In 2015, Merkel faced widespread ridicule after trying to comfort a Palestinian girl who faced deportation.

Merkel had upset the girl by straightforwardly explaining that countries simply cannot have an open immigration policy.

It was a moment of public honesty rare for a politician.

https://www.reuters.com/article/us-germany-asylum-merkel/merkel-mocked-online-over-refugee-girls-tears-idUSKCN0PQ1O520150716
“I understand,” said Merkel. “However … sometimes politics is hard. When you stand in front of me and you are a very nice person, but you know in Palestinian refugee camps in Lebanon there are thousands and thousands (of people) and if we say you can all come and you can all come from Africa and you can all come. We can’t manage that.”
At that point Reem cried.
Later that year, Merkel opened the floodgates to refugees and, as she had predicted, migrants came to Europe from every corner of the Earth.

This marked a decisive turning point in European and world history toward populism, nationalism and authoritarianism.

Although in terms of its basis in scientific understanding, technocracy is vastly superior to the irrationality of populism, Merkel’s leadership reveals at least two problems with it.

The first problem is the tendency of technocratic authorities to manipulate public opinion with the same expertise that they regulate the economy.

For example, the American philosopher John Dewey supported America’s entry into WW1, and engaged with critics of the war.

https://en.wikipedia.org/wiki/Randolph_Bourne

Nevertheless, Dewey was astounded by how rapidly the public institutions meant to support democracy — namely, education — were transformed into vehicles of state control.

After the war, this machinery of persuasion was appropriated by commerce and renamed “public relations”.

https://en.wikipedia.org/wiki/Edward_Bernays

The television series “Mad Men” portrays the technocratic manipulators of subjectivity and their ideology of “creativity” (the same ideology of the countercultural protesters).

https://www.press.uchicago.edu/Misc/Chicago/259919.html

The second problem that Merkel’s progressive regime exposes is the fragility of this technocratic mass manipulation.

There is a constant need to finesse the irrational masses who otherwise mock good, honest policy.

But this playing to the public sentimentality — e.g., allowing mass migration into Germany — can so easily backfire and inflame the mob.

A literal progressive would be Theodore Roosevelt (who supported radical spelling reform).
Roosevelt was a leader of the progressive movement.

https://en.wikipedia.org/wiki/Progressive_Era
The Progressive Era was a period of widespread social activism and political reform across the United States that spanned the 1890s to the 1920s.[1] The main objectives of the Progressive movement were addressing problems caused by industrialization, urbanization, immigration, and political corruption. The movement primarily targeted political machines and their bosses. By taking down these corrupt representatives in office, a further means of direct democracy would be established. They also sought regulation of monopolies (trustbusting) and corporations through antitrust laws, which were seen as a way to promote equal competition for the advantage of legitimate competitors. They also advocated for new government roles and regulations, and new agencies to carry out those roles, such as the FDA.
Roosevelt wrapped himself in the trappings of populism.

https://en.wikipedia.org/wiki/People%27s_Party_(United_States)
The People’s Party, also known as the Populist Party or simply the Populists, was a left-wing[2] agrarian populist[3] late-19th-century political party in the United States. The Populist Party emerged in the early 1890s as an important force in the Southern United States and the Western United States, but the party collapsed after it nominated Democrat William Jennings Bryan in the 1896 United States presidential election. A rump faction of the party continued to operate into the first decade of the 20th century but never matched the popularity of the party in the early 1890s.
That’s kind of like Al Gore pretending to be Donald Trump, and succeeding.

Incidentally, in the 2004 US presidential election, John Kerry was advised that if he wanted to be elected, his wealthy wife should buy him a ranch.

When Kerry objected that he was not a rancher, the advisors pointed out that neither was President George W. Bush.

All Bush does on his “ranch” down in Texas is ride around on his bicycle.

Kerry did not understand this advice and did not buy a ranch.

Kerry was never elected.

This brings us to the complicated figure of President Jimmy Carter.

Jimmy Carter had a strong populist appeal because he was a peanut farmer from Georgia.
[1976 US presidential election map: https://en.wikipedia.org/wiki/1976_United_States_presidential_election]

Carter was a liberal, but when he first went into politics, he publicly portrayed himself as an arch-conservative.
That September, Carter came ahead of Sanders in the first ballot by 49 to 38 percent, leading to a runoff. The subsequent campaign grew even more bitter; despite his early support for civil rights, Carter’s campaign criticized Sanders for supporting Martin Luther King Jr. Carter won the runoff election with 60 percent of the vote—winning 7 percent of the black vote—and went on to win the general election easily over the Republican Hal Suit, a local news anchor. Once he was elected, Carter changed his tone, and began to speak against Georgia’s racist politics. Leroy Johnson, a black state Senator, voiced his support for Carter, saying, “I understand why he ran that kind of ultra-conservative campaign. … I don’t believe you can win this state without being a racist.”
After winning, Carter came out as a liberal in his shocking inaugural speech.
Carter was sworn in as the 76th Governor of Georgia on January 12, 1971. He declared in his inaugural speech that “the time of racial discrimination is over. … No poor, rural, weak, or black person should ever have to bear the additional burden of being deprived of the opportunity for an education, a job or simple justice.”[37] The crowd was reportedly shocked by this message, contrasting starkly with Georgia’s political culture and particularly Carter’s campaign. The many segregationists who had supported Carter during the race felt betrayed. Time ran a story on the progressive “New South” governors elected that year in a May 1971 issue, featuring a cover illustration of Carter.
Making his profile even more complicated, Carter was a military officer with a background in engineering and science.
Carter had long dreamed of attending the U.S. Naval Academy. In 1941, he started undergraduate coursework in engineering at Georgia Southwestern College in nearby Americus. The following year, he transferred to the Georgia Institute of Technology in Atlanta, and he achieved admission to the Naval Academy in 1943. He was a good student but was seen as reserved and quiet, in contrast to the academy’s culture of aggressive hazing of freshmen.
In 1952, Carter began an association with the US Navy‘s fledgling nuclear submarine program, then led by Captain Hyman G. Rickover. Rickover’s demands on his men and machines were legendary, and Carter later said that, next to his parents, Rickover was the greatest influence on his life.[10] He was sent to the Naval Reactors Branch of the Atomic Energy Commission in Washington, D.C. for three months of temporary duty.
In March 1953, Carter began nuclear power school, a six-month non-credit course covering nuclear power plant operation at Union College in Schenectady.[8] His intent was to eventually work aboard USS Seawolf, which was planned to be one of the first two U.S. nuclear submarines.
The political scientist Samuel Huntington describes the professional ethic of the military’s officer corps as “conservative realism…. It exalts obedience as the highest virtue of military men. The military ethic is thus:
  • pessimistic [in its view of human nature as irrational, foolish and selfish],
  • collectivist [insofar as soldiers need to be bound by tradition, discipline and ritual],
  • historically inclined [in valuing experience over speculative rationality],
  • power-oriented [versus idealism],
  • nationalistic,
  • militaristic [in its desire for preparedness],
  • pacifist [in its desire to avoid unnecessary war], and
  • instrumentalist in its view of the military profession.”
https://en.wikipedia.org/wiki/The_Soldier_and_the_State

Six out of seven military officers in the USA are Republicans, and their political orientation reflects the cautious, realistic, conservative “military mind” that Huntington describes.

This stands in contrast to the optimistic, gung-ho “mission ready” mentality inculcated into enlisted men.

It’s been said that among enlisted men, the white guys are Republicans and everybody else is a Democrat.

But the Republican enlisted man might better be described as a populist than a conservative.

Populists have a way of flip-flopping on foreign policy.

The classic conservative military officer would cringe at George W. Bush’s neoconservative “adventurist” foreign policy.

In contrast, the populist loves military adventures — at least in the early days of war, when the air is still fresh with the smell of easy victory.

But when those ill-conceived wars turn into debacles, the populist blames elites and foreigners and warms up to isolationist politicians (Tulsi Gabbard).

There is a paradox regarding the military officer’s relationship with rationality.

The officer class maintains high standards of expertise, and officers are elite technocrats in a centralized hierarchy — classic characteristics of the progressive.

But in their pessimism about human nature, they cannot properly be described as progressives who have a modern faith in historical human progress.

And so perhaps despite Jimmy Carter’s liberalism and his background in engineering and science, he would not be a classic progressive the way that, say, Steven Pinker is.

Steven Pinker seems to personify a progressive-liberal hybrid, compounding a faith in reason with a belief that reason leads to ever-increasing levels of liberation.

With this qualification, one might likewise question whether Angela Merkel — a conservative Christian Democrat — is best described as a “progressive” who believes in progress.

That is, her scientific and technocratic orientation might not make her a progressive in the greater sense.

Steven Pinker might be described as a “big time” (grand mal) progressive, whereas Carter and Merkel might be labeled “small time” (petit mal) progressives.

Making things even more complicated, Carter’s liberalism was rooted in his Christian faith.

Jimmy Carter’s political stances are typically liberal, except for his opposition to abortion at the personal level (Carter supports legalized abortion).

But all of Carter’s policy positions are grounded in his Christianity.

https://en.wikipedia.org/wiki/Jimmy_Carter#Political_positions

So while Theodore Roosevelt was a progressive aggressively disguised as a populist, Carter is better described as a liberal (Christian) presented as a populist.

However, there is yet another complication in Jimmy Carter’s political orientation.

Much to the dismay of liberals, Carter as president followed the path laid by President Nixon in deregulating the American economy, beginning with the airline industry.

[Carter signing the Airline Deregulation Act, 1978]
Deregulation
In 1977, Carter appointed Alfred E. Kahn to lead the Civil Aeronautics Board (CAB). He was part of a push for deregulation of the industry, supported by leading economists, leading ‘think tanks’ in Washington, a civil society coalition advocating the reform (patterned on a coalition earlier developed for the truck-and-rail-reform efforts), the head of the regulatory agency, Senate leadership, the Carter administration, and even some in the airline industry. This coalition swiftly gained legislative results in 1978.[151]
Carter signed the Airline Deregulation Act into law on October 24, 1978. The main purpose of the act was to remove government control over fares, routes and market entry (of new airlines) from commercial aviation. The Civil Aeronautics Board‘s powers of regulation were to be phased out, eventually allowing market forces to determine routes and fares. The Act did not remove or diminish the FAA’s regulatory powers over all aspects of airline safety.
Carter’s progressive emphasis on rationality allowed him to adopt a deregulation agenda that would align him with libertarians.

For a liberal, that would be evil.

Here one finds a crack in the union between American liberals and progressives that would more widely split in the 1970s with the rise of the environmental movement.

Much of FDR’s pragmatic policy in the New Deal and during WW2 involved uniting big business, big labor and big government into an alliance for the sake of promoting stability.

The “liberal” model here is one of political negotiation between stakeholders at the expense of scientific expertise.

Joe Biden is considered an old-school paragon of this “liberal” model of politics as negotiation.

The liberal theme behind backroom deals is the idea that we can talk out our differences and find a common ground (even if that commonality is patronage).

https://www.theguardian.com/us-news/2019/sep/16/corn-pop-joe-biden-story-what-happened-is-it-real-swimming-pool-confrontation

The progressive critique is that this “liberal” model promotes corruption and mindless patronage projects.

https://en.wikipedia.org/wiki/Iron_triangle_(US_politics)
In United States politics, the “iron triangle” comprises the policy-making relationship among the congressional committees, the bureaucracy, and interest groups.
Central to the concept of an iron triangle is the assumption that bureaucratic agencies, as political entities, seek to create and consolidate their own power base.
In this view an agency’s power is determined by its constituency, not by its consumers.
One finds glimmers of the progressive-liberal alliance in the early days of social media, when Mark Zuckerberg professed that Facebook would “interconnect” humanity.

In his youthful idealism, Zuckerberg had the liberal faith in bringing people together and the progressive hope that it would be technology that would accomplish this.

In 2004, when Facebook was launched, there were grounds for believing this.

The 1990s were the golden age of the internet; after the World Wide Web was invented in 1989, access was at first largely limited to scientists and academics.

But as more and more people gained access to the internet, its nature grew darker, going from knowledge to entertainment to commerce to various forms of evil.

This concludes a roundabout inquiry into the characteristics and mentality of a typical “progressive”.
This culminates in a question about what policies are acceptable to progressives.

Again, what would a progressive think about a Guaranteed Minimum Income based on the condition that those receiving it engage in social distancing?

A progressive would probably be open to the idea of a rationalistic, guided public policy.

It is useful to compare the progressive policies of the past — e.g., the adoption of the decimal system of measurement, spelling reform — to find when these policies best succeed.

It seems that moderate progressive reforms that are rooted in civil society (e.g., Webster’s dictionary) and not harshly imposed by the state are the measures that best succeed.

This might suggest the superiority of a conditional Guaranteed Minimum Income over a Universal Basic Income.

In the face of an economic meltdown, however, both a GMI and a UBI might be vastly superior to bailouts of corporations for economic, moral and political reasons.

The case against these bailouts rests on several considerations:
  • the questionable relevance of these industries to national security;
  • the over-supply that seemed evident in these industries prior to the crisis;
  • the signs of obsolescence in these industries that were already hinted at a decade ago; and
  • the moral and practical implications of bailing out irresponsible businesses.
Airlines around the world are arguing that they need massive government bailouts to the tune of $200 billion.

https://www.nytimes.com/reuters/2020/03/17/business/17reuters-health-coronavirus-iata.html
LONDON — Global airlines need up to $200 billion of government support to help them survive the coronavirus crisis, the International Air Transport Association said on Tuesday.
The $150 billion to $200 billion IATA estimate includes indirect support such as loan guarantees and comes after U.S. airlines asked for a $50 billion bailout on Monday.
IATA chief economist Brian Pearce also said cash was running out for many airlines and that 75% of them had cash to cover less than three months of unavoidable fixed costs.
The airlines are among a cluster of interlinked industries that are failing in the wake of the coronavirus pandemic.
  • airlines;
  • performing arts;
  • sports;
  • gambling;
  • recreation;
  • hotels;
  • restaurants; and
  • bars
The five sectors experiencing the most direct and immediate collapse in demand or facing government-mandated shutdowns because of coronavirus are air transportation; performing arts and sports; gambling and recreation; hotels and other lodging; and restaurants and bars.
Together, they accounted for $574 billion in total employee compensation in 2018, about 10 percent of the total. It was spread among 13.8 million full-time equivalent employees.
Those numbers represent the share of the economy at most direct risk. These are the industries and workers where revenue is likely to plummet; they will simply not have enough revenue to fulfill their usual obligations. In danger is the $11 billion a week they normally pay their employees, not to mention all those payments for rent, debt service and property taxes.
The first question to ask is:

1. Are these industries essential to national security?

Taken together, these industries would seem to constitute the entertainment industry.

On that count, no, these are not essential industries the way the failing banking industry was considered fundamental back in 2008.

In a prosperous, modern, consumption-driven society, there might be a tendency to perceive entertainment as essential.

One sees this in the takeover of the educational system in the USA by sports.

A more recent example is the refusal of the video game store GameStop to close during the coronavirus shutdown, on the grounds that it provides an essential service.

https://www.newsweek.com/gamestop-clerks-should-tell-police-store-can-remain-open-its-essential-retail-company-says-1493339
Video game retailer GameStop is reportedly urging store employees to keep stores open in defiance of any orders to close in light of the ongoing COVID-19 pandemic.
Employees at thousands of GameStop stores were sent a memo Thursday insisting that the company provides an “essential retail” service because it helps people “work from home,” according to a report from Kotaku. Essential retail is generally defined to include businesses like grocery stores and pharmacies.
“Due to the products we carry that enable and enhance our customers’ experience in working from home, we believe GameStop is classified as essential retail and therefore is able to remain open during this time,” the memo reportedly reads.
Another question is:

2. Is the entertainment-industrial complex overbuilt?

The restaurant industry has for years been amazed at its own growth, and has described itself as overdeveloped and primed for a downsizing.

It has been lamented in the liberal media that the restaurants in NYC operate on razor-thin margins, and that they are in desperate need of bailouts.

But if they have operated all these years on razor-thin margins during this long economic expansion, that means that they should not exist in the first place.

https://www.bloomberg.com/news/articles/2020-03-19/eating-out-may-never-be-the-same-in-the-u-s-after-coronavirus
This is a grim prospect for restaurants that were already struggling with brutal competition for a stagnant pool of customers. To make matters worse, some chains have opted to expand aggressively in recent years, which has over-saturated the market and helped to fuel an increase in restaurant debt levels, increasing the risk factor for many companies.
Add rising wages to the mix — which are likely to continue climbing due to the sudden, urgent need for expanded sick leave and benefits — and the industry is poised for a shakeout.
‘Oversupply’
The growth in fast-food restaurants had already slowed to the lowest rate in at least 20 years in 2019 as companies start to curb the rapid growth of the previous years. Still, capacity still outstrips demand, according to Noah Shaffer, senior director at real estate services firm Confidant Asset Management.
“When you just look at what consumers can spend in a market and how many restaurants are already there serving it, you start seeing there is an oversupply,” Shaffer said. “And that happens in quite a few markets.”
Much of the expansion was financed with cheap debt as companies took advantage of low interest rates. Restaurants in the Russell 2000 Restaurants Index more than doubled their debt load as a share of earnings over the last five years, reaching a record high in 2019 after an accounting rule change required them to record more leases on their balance sheets. Even before the accounting change, the ratio had trended upwards in recent years.
“If they are pretty levered up already, they don’t have ability to tap into additional capital to sustain,” Shaffer said. “The risk and concern for some landlords is starting already.”
The issue of oversupply plagues the entire US economy.

The continued existence of problematic businesses that should have failed long ago is a result of extremely low interest rates.

This is reflected in mediocre productivity rates despite technological innovation.

https://fred.stlouisfed.org/series/MPU4910062

The stagnation of labor productivity seems even more pronounced in the UK.

[Chart: UK labour productivity index with trend rate — https://www.economicshelp.org/wp-content/uploads/2017/11/labour-prod-index-86-20-with-trend-rate.png]

It could be that the entertainment-industrial complex that includes the airlines was long in for a serious downsizing.

Like the rest of the entertainment-industrial complex, the aeronautical industry is simply overbuilt.
Going back a few years to 2016, one can see this in the restaurant industry.

The restaurant industry at the time seemed to be headed toward a downturn.

http://www.forbes.com/sites/maggiemcgrath/2016/07/26/is-the-restaurant-industry-headed-for-a-recession/#7e25464846fc
One of the biggest pieces of evidence Barish points to is a supply glut. Thanks to an influx of capital from 2010 to 2016, restaurant unit growth has gone back to 2007 levels.
Only this time there are even more smaller chains and independent/chef-driven concepts fighting for diners’ dollars, so supply is even higher than it was at the last peak in the cycle.

One can see the trend toward more independent restaurants with better-quality, locally sourced food colliding with the inevitable cyclical shakeup.
Alarmingly, recession in the restaurant business is an indicator of coming recession in general.
Westra doesn’t stop there; he even goes as far to suggest that a restaurant recession could be a harbinger for a U.S. recession in 2017.
“From our restaurant-industry only perspective, when industry category comps decelerate simultaneously by -2% to -3% (like in the second half of 2000, first half of 2007 and now in the second quarter of 2016), then the lower ‘new-normal’ same-store-sales trend has lasted for at least two years — typically the year-before and year-of a U.S. recession,” he writes. “Accordingly, today, we adopt a below-consensus two-year second half of 2016 through first half of 2018 industry-wide comp outlook of +0.50% while also incorporating a below-consensus outlook of relative pricing power as price-wars typically ensue during dining-out slowdowns.”
Perhaps the recession did not happen thanks to very low interest rates that allowed bad businesses to persist until an inevitable — and worse — recession took them down.

Googling “restaurant industry trends” in 2017, one of the results was “retail apocalypse“.

It has its own Wikipedia entry.

https://en.wikipedia.org/wiki/Retail_apocalypse#Factors

Three reasons for the decline of retail are:
  1. the rise of online shopping,
  2. the decline of malls, and
  3. the economic decline of the American middle class.
But there is another reason given as well.
[Another] major reported factor is the “restaurant renaissance,” a shift in consumer spending habits for their disposable cash from material purchases such as clothing towards dining out and travel.
The trend is buying experiences instead of buying things.

This is not a shift toward greater spirituality and inward growth, however.

It reflects the growth of social media.

Rather than show off their big new car — which is now a truck or SUV — people cultivate social status by eating out and traveling and sharing their pics on Instagram.

Even in 2017, there were doubts as to whether this trend could be sustained.

http://www.businessinsider.com/too-many-restaurants-sparks-industry-slump-2017-5
Concerns that there are simply too many places to eat out for chain restaurants to succeed have swept the industry. Executives at both Starbucks and Darden, Olive Garden’s parent company, have said that the US is “over-retailed.”
Yet, chains “keep on adding restaurants,” Fernandez says. Even though McDonald’s traffic dropped in 2016, the largest chain in the country opened more than 350 net new restaurants in the US last year. Starbucks’ US store count increased by 321 in 2016, despite the CEO’s concerns that there were too many retail and restaurant locations opened in the US. Even Chipotle, struggling to recover following an E. coli scandal in late 2015, opened 240 new restaurants last year.

The same is true with travel.
There is a bigger and bolder question that goes beyond whether an industry is overbuilt.

This is the question of whether it is becoming obsolete.

3. Is the current entertainment-industrial complex obsolete?

Obsolescence was happening in travel, as was noted in 2017.

https://www.marketplace.org/2017/03/09/after-years-growth-foreign-travel-us-drops/

Periods of crisis, such as wars, expose and accelerate change. In travel, the recession of 2008 saw several developments that hinted at future trends.

1) Japanese and American tourists who would otherwise have visited Hawaii remained in their own countries and visited Disneyland instead.

http://archive.fortune.com/2008/05/05/news/companies/simons_disney.fortune/index.htm

http://www.foxnews.com/printer_friendly_wires/2008Dec24/0,4675,ASJapanMickeyapossMagic,00.html
“In a way, the slumping economy works well for Tokyo Disneyland,” said Hiroshi Watanabe, an economist at the Tokyo-based Daiwa Research Institute. “Because of the recession, people have stopped buying cars and houses or going to Hawaii, and Tokyo Disneyland offers an affordable and pleasant alternative.”
2) Americans who would otherwise have gone to Disneyland took their kids to the movies instead.


3) Americans who would otherwise have gone to the movies invested instead in plasma TVs and home entertainment systems in order to save money over the long term.

http://articles.latimes.com/2009/jun/29/business/fi-tv-sales29

So everybody got downgraded a notch.

And this was a harbinger of the future.

These developments echo the concept of “disruptive innovation”, in which the cheaper, inferior technology gains a niche, improves, and then becomes the dominant technology.

The future of global tourism turns out to be … Netflix.

These developments echo those in the oil industry.

At one time, there was concern that oil production would peak and oil would then become prohibitively expensive.

In fact, British Petroleum predicted that oil production would peak in 2013.

BP later revised its forecast, stating that because of the 2008 recession, oil production would peak in 2017.

In fact, in terms of conventional oil, it does seem that oil production did peak.

However, the world had moved on from conventional oil to unconventional oil (fracking) and to natural gas and renewable energy sources.

The issue then becomes peak oil CONSUMPTION.

People will simply use less and less oil.

As they say in Saudi Arabia, the Stone Age did not end because the cavemen ran out of stones.

Our ancient ancestors found alternatives to stone just as we are developing alternatives to conventional oil (hence the Saudi fear of fracking in the USA).

Because of the coronavirus crisis, what is happening in the entertainment-industrial complex of restaurants, air travel, movie theaters and tourism is also taking place in the oil industry.

The growing obsolescence and transformation of these industries that would have taken 30 years has been compressed into a brief period of three weeks.

Moreover, this was all extremely predictable.

If one googled “peak oil and air travel” or “peak oil and tourism” a few years ago, one found only a couple of websites devoted to the topic.

One of those websites was that of consultants to the global tourist industry.

They claimed that peak oil was of obvious concern to these industries.

Yet, they stated, not one single entity in these industries — airplane manufacturers, airlines, hotels, and so forth — had done any research or had any thoughts on peak oil.

It’s not that these industries dismissed the possibility of peak oil.

They simply never thought about it.

If one googled “pandemics and tourism” or “pandemics and oil” in 2019, there would probably not be one single website dealing with the obvious possibility.

There are two popular critiques of the travel industry, and they are both moralistic.

The first is the “class warfare in the sky” theme: the elites travel in luxury while the masses are stuck in coach.

The obvious rejoinder to that is that up until relatively recently, most people did not travel by air.

Moreover, the few wealthy people who did travel by air in the 1950s and 1960s did so with less comfort and safety than the average air traveler today.

The other critique is that of “overtourism”.

Mass tourism to once-exotic locales is destroying those places and transforming local communities into Airbnbs.

Interestingly, these two critiques are contradictory.

The first critique insists that ordinary people are entitled to travel in more luxury.

The second critique touches on the fact that the problem is precisely ordinary people taking over places that — once upon a time — only wealthy tourists visited (e.g., Rome, Bali).

This contradiction is overlooked because the popular rhetoric blames corporations (airlines in the first, Airbnb in the second).

What these popular critiques overlook is that much of the entertainment-industrial complex was already showing signs of obsolescence.

4. Are airline bailouts immoral?

There is the issue of “moral hazard”.

Moral hazard arises when actors are shielded from the consequences of their risk-taking, so that virtue is punished and recklessness is rewarded.

First of all, the airline industry seems remarkably incompetent.

American Airlines in particular is a poster child of irresponsibility.

It might be rooted in over-confidence.
In 2014, having reduced competition through mergers and raised billions of dollars in new baggage-fee revenue, American began reaching stunning levels of financial success. In 2015, it posted a $7.6 billion profit — compared, for example, to profits of about $500 million in 2007 and less than $250 million in 2006. It would continue to earn billions in profit annually for the rest of the decade. “I don’t think we’re ever going to lose money again,” the company’s chief executive, Doug Parker, said in 2017.
There are plenty of things American could have done with all that money. It could have stored up its cash reserves for a future crisis, knowing that airlines regularly cycle through booms and busts. It might have tried to decisively settle its continuing contract disputes with pilots, flight attendants and mechanics. It might have invested heavily in better service quality to try to repair its longstanding reputation as the worst of the major carriers.
Instead, American blew most of its cash on a stock buyback spree. From 2014 to 2020, in an attempt to increase its earnings per share, American spent more than $15 billion buying back its own stock. It managed, despite the risk of the proverbial rainy day, to shrink its cash reserves. At the same time it was blowing cash on buybacks, American also began to borrow heavily to finance the purchase of new planes and the retrofitting of old planes to pack in more seats. As early as 2017 analysts warned of a risk of default should the economy deteriorate, but American kept borrowing. It has now accumulated a debt of nearly $30 billion, nearly five times the company’s current market value.
The same is true of airplane makers.

Boeing wants a bailout.

https://www.marketwatch.com/story/airlines-and-boeing-want-a-bailout-but-look-how-much-theyve-spent-on-stock-buybacks-2020-03-18
Boeing said March 17 that “a minimum of $60 billion in access to public and private liquidity, including loan guarantees” was appropriate for the aerospace-manufacturing industry.
First of all, the airplane manufacturers like Boeing are actually a part of the entertainment industry, just like their customers, the airlines.

Secondly, Boeing seems quite irresponsible and incompetent (e.g., the 737 MAX debacle).

More generally, it is important for companies, like individuals, to save a little cash for a rainy day.

But companies are under pressure to focus on raising stock value.

Companies therefore neglect building a stockpile of cash that may come in handy for dealing with the problems and opportunities that may arise in a future crisis.
Free cash flow and buybacks
Most investors know that cash flow is more important than earnings, because revenue can be booked, and profits shown, before a company actually receives payment. A company’s free cash flow is its remaining cash flow after planned capital expenditures.
Free cash flow can be used to pay for dividends, buy back shares, expand operations or invest in other improvements for the business.
Companies that built up hoards of cash, such as Berkshire Hathaway, have been criticized for doing so, because it lowers a company’s return on invested capital.
Then again, Berkshire CEO Warren Buffett has shown during down markets that the extra cash can be put to work by scooping up other companies’ shares at low prices, or making special, lucrative preferred-stock deals, such as the one Berkshire did with Goldman Sachs during the 2008 financial crisis. Berkshire had $125 billion in cash and short-term investments in U.S. Treasury bills as of Dec. 31.
Companies can be tempted to buy back their own shares rather than save cash.
Companies use free cash flow to repurchase shares for several reasons. If the share count is reduced, it boosts earnings per share. If a company is issuing a significant number of new shares as part of its executive-compensation packages, buybacks mitigate the dilution of other shareholders’ ownership percentages.
A company may buy back shares because its executives cannot think of any better use for the money (such as expansion, equipment replacement, new product or service development, etc.), or maybe because the executives and board members are overly fixated on quarterly earnings results and boosting the share price, rather than the long-term health of the business and its ability to navigate storms.
Analysts, investors and corporate executives often call buybacks a “return of capital” to shareholders. This isn’t necessarily the case if the share price declines, despite the buybacks, or it eventually becomes clear the company was underinvesting in its ability to deliver competitive products and services.
Like the airlines, Boeing imprudently spent most of its cash on stock buybacks.
As a group, the six airlines spent 96% of their free cash flow on stock buybacks over the past 10 full years through 2019.
Boeing’s free cash flow for 10 years totaled $58.37 billion, while the company spent $43.44 billion, or 74% of free cash flow, on stock repurchases.
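
As a back-of-the-envelope check on those figures, here is a minimal sketch in Python. The Boeing numbers are the ones quoted above; the earnings-per-share example uses purely hypothetical figures to show why buybacks flatter EPS.

```python
# Back-of-the-envelope arithmetic on buybacks. The Boeing figures are
# those quoted above; the EPS example below is hypothetical.

def buyback_share_of_fcf(buybacks: float, free_cash_flow: float) -> float:
    """Fraction of free cash flow spent repurchasing shares."""
    return buybacks / free_cash_flow

# Boeing, ten years through 2019 (billions of dollars)
print(f"{buyback_share_of_fcf(43.44, 58.37):.0%}")  # -> 74%

# Why buybacks boost earnings per share: same earnings, fewer shares.
# Hypothetical company: $1B in earnings, 500M shares outstanding.
earnings = 1_000_000_000
shares_before = 500_000_000
shares_after = 450_000_000  # after repurchasing 10% of the float

print(f"EPS: ${earnings / shares_before:.2f} -> ${earnings / shares_after:.2f}")
# EPS rises from $2.00 to about $2.22 with no change in the business.
```
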
After screwing up so badly, now Boeing feels entitled to take the cash of the American people.

The practical point of moral hazard is that rewarding irresponsibility encourages greater irresponsibility in the future.
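
The incentive can be put in simple expected-value terms. Here is a minimal sketch, with entirely hypothetical payoffs, of how an expected bailout backstop flips the rational strategy from prudence to recklessness.

```python
# Toy expected-value comparison (all payoffs hypothetical).
# A prudent firm keeps a cash buffer and earns a modest, safe return.
# A reckless firm spends the buffer on buybacks: bigger payoff in good
# times, ruin in a crisis, unless the state absorbs the loss.

p_crisis = 0.1  # assumed probability of a crisis in a given period

prudent = (1 - p_crisis) * 5 + p_crisis * 5               # +5 either way
reckless_unbacked = (1 - p_crisis) * 10 + p_crisis * -100
reckless_bailed_out = (1 - p_crisis) * 10 + p_crisis * 0  # state eats the loss

print(prudent)              # 5.0
print(reckless_unbacked)    # -1.0 -> recklessness doesn't pay
print(reckless_bailed_out)  # 9.0  -> with a backstop, recklessness pays best
```

Once the backstop is expected, the reckless strategy dominates, which is precisely why rewarding irresponsibility invites more of it.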

It might be that irrelevant, over-built, obsolete corporations now imagine themselves to be “too big to fail” because of the 2008 bank bailouts — and politicians think so, too.

This might mean that sometime in the future, there will be even bigger bailouts for corporations that are of even more dubious value.

It was noted that the 1960s counterculture was a response to the rise of the technocratic rule of experts in the government and in corporations — a technocracy that had roots in progressivism.

It was also mentioned earlier that a new form of progressivism emerged in the 1970s to challenge the liberal establishment consensus that had emerged in the New Deal era.

https://thebreakthrough.org/journal/issue-5/raiding-progress

But going back further, to the 1950s, there was a fringe conservative rebellion against FDR’s fusion of the corporation and the state, which had marginalized small-town small businessmen.

http://movies2.nytimes.com/books/first/p/perlstein-storm.html

Their objection was that the government was “picking winners”, and that they were the consistent losers.

https://financial-dictionary.thefreedictionary.com/Picking+Winners
The policy in which a government encourages certain sectors of an economy, or even particular companies. A government may pick winners by offering tax incentives, favorable regulation or even direct subsidies. Picking winners was a feature of post-World War II development in a number of countries.
Today, an echo of that conservative rebellion is found in Trump’s greatest supporters, who are affluent businessmen without college degrees rather than the small-town white working class.

Ironically, in 2020, it is Republicans who are bailing out big business and “throwing small business under the bus”.

Populism has been appropriated by the Republican establishment’s “socialism for the rich”, and the populists don’t see it.

If money went to citizens, then they could pick the winners.

There is a new emerging economy that neither the government nor corporate America understands.

Government support for corporations represents the bailing out of dinosaurs.

Old people run the world, and they do their best to sustain a system that no longer works, or that actually no longer exists.

In fact, the bulk of society lives in the past.

In 2016, at Trump rallies, they played Elvis songs, and at Clinton rallies they played the Rolling Stones.

Elvis and the Stones were once considered risque and rebellious, and this is what raises the pulse rate of an aging society.

There is shock that Jeff Bezos’s net worth has risen by $24 billion during the SARS2 crisis.

Amazon is now a vital resource in the American economy.

The future belongs to Amazon, and it needs no bailouts.

The real scandal is that trillions of dollars of taxpayer money is being plowed into companies that might very well be obsolete, and often were run incompetently and unethically.

It has been argued that these bailouts could have taken the form of a Universal Basic Income that Americans would have spent on goods and services that have a future.

There are all sorts of pragmatic arguments for and against a UBI.

But there is one argument against a UBI that has not been adequately voiced.

A Universal Basic Income is impossible because it goes against human nature.

People do not want an income, they want jobs in order to labor.

To appeal to its recipients (not to mention critics), an unconditional UBI would have to take the form of a conditional Guaranteed Minimum Income and would have to feel like a form of work.

In some sense, it is an issue of the “dignity of labor” versus becoming a virtual ward of the state and an object of pity.

But it is actually much more complicated than that, and in some ways contradicts it.

The need to labor can be understood in the distinction between labor, work and action.

To understand this distinction, one must go back to an earlier distinction between thinking and calculation.

Artificial intelligence is not genuine intelligence because at best machines can merely calculate, whereas only humans can truly think.

https://en.wikipedia.org/wiki/Hubert_Dreyfus%27s_views_on_artificial_intelligence#Knowing-how_vs._knowing-that:_the_primacy_of_intuition

There might be political implications to such a view.

Calculation would align with descriptions of instrumental rationality.

https://en.wikipedia.org/wiki/Instrumental_and_value_rationality
“Instrumental” and “value rationality” are classic labels philosopher and sociologist Max Weber coined for two kinds of human reasoning. Humans reason instrumentally by judging which factual means will “work” to achieve given ends. They reason value-rationally by judging which valued ends will be “right,” legitimate in continuous social situations.
Here, technocracy rears its head once again.

For someone wary of the potential for calculation to displace genuine thinking, hostility toward technocracy might be directed against administration and bureaucracy.

This hostility might therefore manifest itself in a rejection of centralized state socialism.

Alternatively, it might also renounce capitalism or even reject democracy when “democracy” is debased into forms of patronage.

Along these lines, a rejection of technocracy might evolve into anti-American attitudes, for instance, from the perspective of Marxism and, later, from the view of political Islam.

It might even express itself in anti-Jewish feeling, insofar as — in the European context, at least — Jews were perceived to be crafty, rootless merchants.

Here a suspicion of calculation is fused to a distrust of the nomadic and the “global”.

This conjoined loathing of the cosmopolitan and of calculation resonates with fascism.

In retrospect, however, fascism — with its glorification of technological power and persecution of intellectuals — seems like the ultimate mindless technocracy.

A disillusioned fascist intellectual might continue in his stubborn hostility to calculation and technocracy.

However, in his subtle rhetorical devices there might be hints that he came around to an appreciation of the nomadic.

He might recognize that the greatest thinkers were also wanderers.

He might write a book involving a long dialog between men who are taking a stroll and discussing the nature of thinking.

This is because real thinking is a lot like just going for a walk to nowhere in particular.

https://en.wikipedia.org/wiki/What_Is_Called_Thinking%3F

A critical response to this distinction between thinking and calculation might be the objection that thinking is inferior to action.

In fact, philosophy in its origin might be seen as an attempt to create a private realm of discussion and debate in the aftermath of Athenian democracy’s loss of legitimacy.

There seemed to be hope that the educated life could then inform political life.

http://classics.mit.edu/Plato/seventh_letter.html

Perhaps in the Christian tradition, in particular in a Lutheran mode, the contemplative life turned in on itself more completely, so that culture was defined as inward and spiritual.

This definition of culture stands opposed to the (French) sense of civilization as material and objective, involving laws, institutions, structures and infrastructure.

It also stands in contrast to the (British) sense of culture as elite “high culture”, or the contemporary relativistic, democratized and anthropological (American) sense of the term.

https://www.hup.harvard.edu/catalog.php?isbn=9780674004177

But there are moments when the great German tradition of critical thought turns against itself and criticizes itself for being too private and self-absorbed.

For example, the following article describes the German novelist Hermann Hesse’s writings as puritanical romantic rebellion against the world — and thus perfect literature for teenagers.

https://www.newyorker.com/magazine/2018/11/19/hermann-hesses-arrested-development

Yet Hesse’s final work, written during the rise of fascism, rejects the Germanic intellectual tradition’s withdrawal from public, political life, and the irrelevance that resulted.

https://en.wikipedia.org/wiki/The_Glass_Bead_Game

The same reappraisal of the traditional philosophical valuation of contemplation over citizenship happens in political philosophy.

https://en.wikipedia.org/wiki/The_Human_Condition
The Human Condition, first published in 1958, is Hannah Arendt’s account of how “human activities” should be and have been understood throughout Western history. Arendt is interested in the vita activa (active life) as contrasted with the vita contemplativa (contemplative life) and concerned that the debate over the relative status of the two has blinded us to important insights about the vita activa and the way in which it has changed since ancient times. She distinguishes three sorts of activity (labor, work, and action) and discusses how they have been affected by changes in Western history.
This does not mean that the active life is superior to contemplation.

In fact, all too often in the modern world, contemplation has been shunted aside in favor of activity.

Moreover, there are different forms of activity that, in a very German way, need to be carefully distinguished.
Arendt introduces the term vita activa (active life) by distinguishing it from vita contemplativa (contemplative life). Ancient philosophers insisted upon the superiority of the vita contemplativa, for which the vita activa merely provided necessities. Karl Marx flipped the hierarchy, claiming that the vita contemplativa is merely a superstructure on the fundamental basic life-processes of a society. Arendt’s thesis is that the concerns of the vita activa are neither superior nor inferior to those of the vita contemplativa, nor are they the same. The vita activa may be divided into three sorts of activities: labor, work and action.
The West traces its values back to ancient Athens, yet the ancient Greek world was very different from the modern world.

In ancient Greece, the household was the realm of physical necessity — of production and consumption — whereas the public realm was a realm of freedom and accomplishment.

But the public realm was for those few who could afford it or who were born into it.

This is because it was assumed that those who were still trapped in chains of necessity could never function as proper citizens.
The Public and the Private Realm
According to Arendt, ancient Greek life was divided between two realms: the public realm in which “action” was performed, and the private realm, site of the household ruled by its head. The mark of the private was not intimacy, as it is in modern times, but biological necessity. In the private realm, heads of households took care of needs for food, shelter, and sex. By contrast, the public realm was a realm of freedom from these biological necessities, a realm in which one could distinguish oneself through “great words and great deeds.” Property requirements for citizenship reflected the understanding that unless one was able to take care of one’s biological necessities, one could not be free from them and hence could not participate in the public realm as a free person among equals. Slaves and subordinated women were confined to the private realm where they met the biological necessities of the head of the household. The public realm naturally was accorded higher status than the private.
In the classical world, the public sphere was a realm of glory.

The rise of the Church as the new governing body of the empire complicated this.

The official mission of the Church was to promote the glory of God in the everlasting life of the individual.

Economic life was dominated by the major and minor estates of the nobility, each of which had its own military.
With the fall of the Roman Empire, the church took over the role of the public realm (though its otherworldly orientation gave it a character distinct from the previous public realm), and the feudal lords ran their lands and holdings as private realms.
Gradually, secular states arose to displace both the Church and the nobility.

The modern state was concerned with economic productivity, which funded state militaries, which extended state territory.
The modern period saw the rise of a third realm, the social realm. The social realm is concerned with providing for biological needs, but it does so at the level of the state.
https://en.wikipedia.org/wiki/Biopower

The modern state’s focus on administering economic productivity was an abomination by the standards of the traditional order.

The modern state violated the distinction between private economies of necessity, on the one hand, and public freedom and glory, on the other.
Arendt views the social realm as a threat to both the private and the public realm. In order to provide for the needs of everyone, it must invade the private sphere, and because it makes biological needs a public matter, it corrupts the realm of free action: There is no longer a realm free from necessity.
Another careful distinction needs to be made between labor and work.

The purpose of labor is purely the endless task of sustaining the life processes of the individual.

Arendt claims that her distinction between labor and work has been disregarded by philosophers throughout history even though it has been preserved in many European languages. Labor is human activity directed at meeting biological (and perhaps other) necessities for self-preservation and the reproduction of the species. Because these needs cannot be satisfied once and for all, labor never really reaches an end. Its fruits do not last long; they are quickly consumed, and more must always be produced. Labor is thus a cyclical, repeated process that carries with it a sense of futility.
In traditional societies, the upper classes define themselves by their dignified elevation above the need to labor.

In a modern society, everyone labors.

For example, the upper classes of the United States are tireless laborers putting in long weeks at the office.

Sometimes this is publicly disguised.

https://www.cnn.com/2020/03/27/entertainment/kim-kardashain-kourtney-kardashian-fight-season-18-kuwtk-trnd/index.html

It is notable that the Kardashian sisters pride themselves on their work ethic.

Their popularity is based on their self-presentation as the new “idle rich” aristocracy of semi-talented influencers.

The American dream was originally religious freedom, and it became democracy in the American Revolution, and it became homeownership in the 1950s.

At some point, the American dream became the California dream of wealth and fame acquired with little effort or ability.

But like ducks gliding along a pond, below the surface of the placid image of easy money is a well-hidden burning ambition and ceaseless labor.

An elite proud of its labors diverges from tradition.
In the ancient world, Arendt asserts, labor was contemptible not because it was what slaves did; rather, slaves were contemptible because they performed labor, a futile but necessary activity. In the modern world, not just slaves, but everyone has come to be defined by their labor: We are job-holders, and we must perform our jobs to meet our needs.
The existence of the laborer is purely subordinate to his labor.

In order to understand this, one must recognize the distinction between material and “spiritual” nourishment.

For example, in the Roman Empire, the upper classes were obligated to subsidize the gladiatorial games — “bread and circuses” — that kept the masses preoccupied.

The laborer’s sense of freedom is based largely on escapist entertainment (sports, dumbed-down religion).

Entertainment nourishes the laborer — but only insofar as entertainment sustains and empowers the laborer to continue in his labor, and nothing more.

Hence, the military provides its soldiers with retreats from combat in scheduled periods of “R & R” — “rest and recuperation” — so that they can be sent back into combat in strengthened condition.

Enthralled by the entertainment-industrial complex, laborers employed by corporations spend their earnings on escapist fantasias (superhero movies, cruise ships) run by corporations.

This is a bit like the migrant laborers in “The Grapes of Wrath” who spend what little they earn on meager supplies from the company store.

In the USA, a working-class family can earn an upper-middle class income (over $100K), but they have no savings because all is spent on status symbols and entertainment.

Prosperity only makes the system more insidious.

Psychologically, laborers need to labor in order to enjoy the fruits of their labor.

That is, without first exhausting themselves they become emotionally blocked and depressed and cannot enjoy their entertainments.

For example, in contemporary Russia, there have been waves of deaths due to drug abuse (from “bath salts”) among those who receive unemployment benefits for the first time in their lives.

This phenomenon was not seen earlier, in the period of economic collapse following the fall of the Soviet state, when there were few unemployment benefits.

The Russian unemployed now find themselves physically sustained but unable to expend their energies in labor.

Unable to labor, they cannot thereby enjoy entertainment, and so they become depressed.

People don’t want a basic income, they want jobs, which provide both income and exhaustion.

Hence, the existence of mindless patronage projects.

Laborers do not want to get paid for doing nothing, they want to get paid for hard labor even if the objective is manifestly absurd.

https://en.wikipedia.org/wiki/Gravina_Island_Bridge

Radical political romantics speak of a utopian time in the future, when machines will perform all labor and all people will be free to pursue a life of fulfillment (science, art, citizenship).

This runs against the grain of human nature.
Marx registers this modern idea in his assertion that man is animal laborans, a species that sets itself apart from the animals not by its thinking, but by its labor. But Marx then contradicts himself in foreseeing a day when production allows the proletariat to throw off the shackles of their oppressors and be free from labor entirely. By Marx’s own lights, this would mean they cease to be human. Arendt worries that if automation were to allow us to free ourselves from labor, freedom would be meaningless to us without the contrast with futile necessity that labor provides. Because we define ourselves as job-holders and have relegated everything outside of labor to the category of play and mere hobbies, our lives would become trivial to us without labor. Meanwhile, advances in production and the transformation of work into labor means that many things that were once to be lasting works are now mere disposable objects of consumption, “The solution…consists in treating all use objects as though they were consumer goods, so that a chair or a table is now consumed as rapidly as a dress and a dress used up almost as quickly as food.”
If the nature of labor is an endless cycle of laboring in order to consume in order to labor some more, the nature of work is linear.

Work has an end goal involving the creation of the artificial from nature.

Work, unlike labor, has a clearly defined beginning and end. It leaves behind a durable object, such as a tool, rather than an object for consumption. These durable objects become part of the world we live in. Work involves an element of violation or violence in which the worker interrupts nature in order to obtain and shape raw materials. For example, a tree is cut down to obtain wood, or the earth is mined to obtain metals. Work comprises the whole process, from the original idea for the object, to the obtaining of raw materials, to the finished product.
Work involves the kind of instrumental thinking that characterizes technocratic governance and calculation in general.

In evolutionary terms, instrumental thinking characterizes human violence as opposed to that of other primates, the aggression of which tends to be more reactive and emotional.

https://en.wikipedia.org/wiki/Prefrontal_cortex
The most typical psychological term for functions carried out by the prefrontal cortex area is executive function. Executive function relates to abilities to differentiate among conflicting thoughts, determine good and bad, better and best, same and different, future consequences of current activities, working toward a defined goal, prediction of outcomes, expectation based on actions, and social “control” (the ability to suppress urges that, if not suppressed, could lead to socially unacceptable outcomes).
The frontal cortex supports concrete rule learning.
Again, the modern world is unusual by traditional standards because even the ruling classes are understood to be laborers, not idle aristocrats.

Likewise, the instrumental mode of thought has become so prevalent in the modern world that the elites present themselves as super-productive workers from whom all good things derive.
The process of work is determined by the categories of means and end. Arendt thinks that thinking of ourselves primarily as workers leads to a sort of instrumental reasoning in which it is natural to think of everything as a potential means to some further end. Kant’s claim that humanity is an end in itself shows just how much this instrumental conception of reason has dominated our thinking. Utilitarianism, Arendt claims, is based on a failure to distinguish between “in order to” and “for the sake of.”
So much of the modern understanding of value that distinguishes between means and ends seems derived from capitalism — including radical anti-capitalist ideologies like Marxism.

The common definition of capitalism is that of a modern market economy.

https://en.wikipedia.org/wiki/Capitalism

However, a more narrow definition of capitalism would focus on the notion of capital.

Capital refers to something that holds intrinsic value but surrenders that value when it is used as an investment.

For example, in the eyes of a samurai, a samurai sword would both be of great personal value and have tremendous use value as a fighting instrument.

But if the samurai were to use his sword as collateral on a loan, then the sword would become a form of capital.

There is a double sleight of hand in the modern way of thinking.

First, use value eclipses and replaces inherent worth as the “intrinsic” form of valuation.

Second, exchange value displaces use value as the pragmatic form of valuation.
The homo faber mentality is further evident in the substitution of the notion of “use value” for “worth” in economic discourse, which marks the beginning of the disappearance of a notion of a kind of worth that is intrinsic, as opposed to value, which is relative to human demand or need. Although use objects are good examples of the products of work, artworks are perhaps the best examples, since they have the greatest durability of all objects. Since they are never used for anything (least of all labor), they don’t get worn down.
In a capitalist enterprise, profits are relentlessly plowed back into investments.

In a capitalist society, everything under the sun has a way of becoming commodified (e.g., the private lives of the Kardashians).

Whereas trade and commerce have always existed in human history, capitalism is properly understood as a modern phenomenon, the origin of which is debated (Protestantism?).

It is sometimes stated that the American economy since the New Deal is no longer a true capitalist economy.

That is, the USA is better understood as having a mixed economy in which government as a regulatory body looms large.

However, the expanded role of government has served to buttress and aid and abet capitalism.

The New Deal might therefore be seen as the metastasis of capitalism into the state, and not the reverse.

This might be reflected in the paradox that FDR might have been a traitor to his class, but he was the savior of capitalism.

The real divergence from capitalism might be the rise of a consumer economy in which people do not save and invest, they spend everything they have on status symbols and entertainment.

Likewise, companies no longer plow their profits into new investments; they use them to buy company stock, enriching shareholders, who reward the CEO financially.

Only Amazon, with its central role in the coronavirus era, is plowing its new wealth into its operations in the classic fashion of the true capitalist.

Amazon is investing in developing testing capabilities for coronavirus in its operating chain.

There is a tendency for the social sciences to identify various forms of capital: financial capital (money), social capital (networks, prestige), cultural capital (education).

The educational system in particular is understood as a marketplace where the various forms of capital are exchanged for one another.

https://en.wikipedia.org/wiki/Cultural_capital

One example of such an exchange was when Oprah Winfrey endorsed Barack Obama in the 2008 presidential election.

In doing so, she was said to have transformed a vast amount of her moral capital into political capital, and then donated it to Obama’s campaign.

This might be correct, but there was one step that came prior to the latter two steps of transformation and donation of capital.

Initially, Oprah did not possess moral capital, she possessed moral stature and credibility that had value in itself.

Only later did she — controversially — transform that ethical standing into moral capital and exchange it for political capital and then hand it over to Obama.

This is controversial, because in doing so, she somewhat devalued her public image.

Moreover, even though Obama won, Winfrey’s sacrifice might have been pointless, because anyone who would have heeded Oprah would have voted for Obama anyway.

A less happy example of an attempt to purchase political capital would be Michael Bloomberg’s futile exchange of financial capital for political capital in his 2020 presidential bid.

This might have been a less effective exchange because Bloomberg was not transforming any of the inherent esteem that he enjoys — largely limited to New York City — into political capital.

None of these political phenomena would be recognized by the social sciences.

This is because the social sciences do not recognize the value of inherent worth.

The social sciences do not perceive inherent worth because the educational system itself is a branch of capitalism.

A third type of activity is action, which is characterized by how individuals express their individuality.

However, the expression of the self might be neither conscious nor intentional on the individual’s part.
The third type of activity, action (which includes both speech and action), is the means by which humans disclose themselves to others, not that action is always consciously guiding such disclosure. Indeed, the self revealed in action is more than likely concealed from the person acting, revealed only in the story of her action. Action is the means by which we distinguish ourselves from others as unique and unexchangeable beings.
In fact, the actor Robert De Niro has said that people don’t express themselves, they hide themselves.

One trick for an actor would then be to play a character as if they were trying to hide themselves.

Of course, the actor himself — for example, Robert De Niro — is drawn to the profession of acting so he can hide himself.

Indeed, the actor Adam Driver avoids seeing himself in his own movies.

If he sees himself onscreen, he will freak out and run out of the room.

Perhaps what he sees is himself shining out from behind the disguise of the character he is playing.

Moreover, what he sees is a part of himself that he does not recognize.

So the task of the actor is to simultaneously play their character as well as play an embellished version of themselves that people will love.

In that vein, the actor George C. Scott said that the key to acting is to project one’s love of acting, so that the audience will love the actor.

But that projection of the love of acting is itself an exaggeration, or perhaps even a simulation (for example, in the case of Marlon Brando, who swore that he disliked acting).

Moreover, the best roles would be of characters who themselves love acting in public (for example, General George Patton).

Of course, it is not just in theater or “play acting” that human individuals distinguish and unconsciously express themselves, but in all action.

Whereas other species of animals engage in behavior, only humans possess action that discloses individuality to other individuals.

Insofar as politics is a realm of action, it is partly what makes us uniquely human.
With humans, unlike with other beings, there is not just a generic question of what we are, but of who each is individually. Action and speech are always between humans and directed toward them, and it generates human relationships. Diversity among the humans that see the action makes possible a sort of objectivity by letting an action be witnessed from different perspectives. Action has boundless consequences, often going far beyond what we could anticipate. The Greeks thought of the polis as a place where free people could live together so as to act.
The kind of self-expression that one expects to find in something like play can also be found in work.

For example, there is evidence that the bulky tools that neanderthals fabricated were technologically superior to what our homo sapiens ancestors used, or still use.

This might imply that the neanderthals possessed language.

https://www.sciencedaily.com/releases/2019/06/190626133802.htm
https://www.pnas.org/content/115/9/1959
Neanderthals made both stone-tipped wooden spears and hafted cutting or scraping tools, and they employed a variety of adhesives (15), which fleshes out the complexity of Neanderthal technology by documenting the presence of at least two additional classes of artifacts, each comprising at least three components. By this measure of complexity, the Neanderthals were making food-getting artifacts more complex than those of some recent hunter-gatherers (1).
For the later African Middle (and Later) Stone Age and the Upper Paleolithic, the computations underlying the artifacts are of comparable complexity to those of recent hunter-gatherers (1, 20). The artifacts, which include mechanical instruments and facilities, such as spear-throwers and even self-acting mechanical facilities or automata (e.g., snares/traps), require the computational complexity (and working memory capacity) of an unrestricted grammar or natural language (12, 18, 20).
The neanderthals might have possessed advanced toolmaking and language, but they did not seem to possess the artistic advances that marked homo sapiens, such as jewelry.

Moreover, the neanderthals did not seem to move on to develop more refined neolithic stone tools, such as polished axes.

https://en.wikipedia.org/wiki/Stone_tool#Neolithic_industries
Polishing increased the intrinsic mechanical strength of the axe. Polished stone axes were important for the widespread clearance of woods and forest during the Neolithic period, when crop and livestock farming developed on a large scale. They are distributed very widely and were traded over great distances since the best rock types were often very local. They also became venerated objects, and were frequently buried in long barrows or round barrows with their former owners.
Here a tool is becoming a work of art among our homo sapiens ancestors, much like cars in 1950s America and the iPhone in the 2000s.

The neanderthal might have had the intelligence to develop tools that were technically advanced, but those tools were not works of art that happened to be more useful.

It could be that life in Europe in that cold period was just too difficult to transcend mere labor and work and to develop to a greater degree a realm of symbolic action.

And to what extent is it still like that?

People who live in cold climates tend to be practical and focus on technology, which just might later take on an aspect of beauty.

(BMW commercial, Theo Jansen’s wind-powered sculptures, invented to push sand onto dunes and protect them from the rising sea)


https://youtu.be/wnVZyu6cDmg

Conversely, something that is beautiful might take on practical significance, such as the elliptical wing of the Spitfire fighter aircraft of WW2.

IIRC, it was originally researched in part because it was aesthetically pleasing, but eventually it was developed purely for practical considerations.

https://en.wikipedia.org/wiki/Supermarine_Spitfire

Likewise, people who live in warm, ancient civilizations can cultivate beauty, which might take on practical significance (it could save your life).

(BMW commercial, Italian wedding)



https://youtu.be/UlNvJVSKZEQ

"Exaptation" is when something is developed for one reason and gets repurposed for another.

https://en.wikipedia.org/wiki/Exaptation
Exaptation and the related term co-option describe a shift in the function of a trait during evolution. For example, a trait can evolve because it served one particular function, but subsequently it may come to serve another. Exaptations are common in both anatomy and behaviour.
Bird feathers are often cited as the classic case of a random genetic mutation serving one purpose, and then later being repurposed for another survival function.
Bird feathers are a classic example: initially they may have evolved for temperature regulation, but later were adapted for flight. Note here that when feathers were initially used to aid in flight they were doing so exaptively; however, since they have since been shaped by natural selection to improve flight, in their current state they are now best regarded as adaptations for flight. So it is with many structures that initially took on a function as exaptations, once molded for that new function they become adapted for that function.
Not mentioned is that ornate displays of plumage in male birds have also come to serve as signaling in mating rituals.

In fact, it is theorized that this was also the original purpose of feathers in dinosaurs before the need for insulation and, later, flight.

https://en.wikipedia.org/wiki/Sexual_selection_in_birds#Visual_signaling

Indeed, the issue of tool as instrument versus tool as jewelry goes far back in human evolution — and in evolution in general.

https://en.wikipedia.org/wiki/Sexual_selection
Sexual selection is a mode of natural selection in which members of one biological sex choose mates of the other sex to mate with (intersexual selection), and compete with members of the same sex for access to members of the opposite sex (intrasexual selection). These two forms of selection mean that some individuals have better reproductive success than others within a population, either because they are more attractive or prefer more attractive partners to produce offspring.
A classic case would be a peacock’s extravagant tail plumage, which has become established in the species because the (female) peahen finds it attractive.

This ran counter to Charles Darwin’s theory of natural selection, which was based on survival.

This frustrated Darwin.
The sight of a feather in a peacock‘s tail, whenever I gaze at it, makes me sick!
[Sexual selection] depends, not on a struggle for existence, but on a struggle between the males for possession of the females; the result is not death to the unsuccessful competitor, but few or no offspring.
For Darwin, there was no practical purpose for the exaggerated displays of ornamentation in the mating rituals of a species.

Therefore, Darwin concluded, this behavior is based on the capacity of intellectually advanced animals to recognize beauty.

This just might be the case in some animals such as the bowerbird, which typically builds an elaborate structure of twigs and/or collects groups of bright objects.
https://preview.redd.it/6fcuj7p8ocw11.jpg?width=960&crop=smart&auto=webp&s=6b43a6b7213bcd2f2f2d107c9611a2d7bad43605



https://www.youtube.com/embed/E1zmfTr2d4c

Along with impersonations, bowerbirds also sing and dance.



https://youtu.be/1XkPeN3AWIE

The attraction to beauty can take on a dysfunctional or maladaptive quality.

For example, there is a population of hybrid swordtail fish in a small pool of water in Mexico that exhibits some of the genetic problems that are expected in hybrid animals.

Yet the rate of melanoma in the fish is even higher than would be expected among hybrids.

It is speculated that the female fish are attracted to males who have that shiny black bead of skin cancer.

https://phys.org/news/2020-05-genetic-hybrid-dysfunction.html

But not all species in which the males make great displays of their exaggerated excellence are quite so intellectually advanced, as in the case of the rhinoceros beetle.

(photo of rhinoceros beetle)

In this case, the outward physical traits that distinguish males from females become more exaggerated over time within a species as the opposite sex consistently chooses them.

https://en.wikipedia.org/wiki/Secondary_sex_characteristic

Notably, these impractical exaggerated features are only desirable insofar as:
  1. there is no other information available to judge potential mates, and
  2. these exaggerated features do not collide with the need to survive (a tension sketched in the simulation below).
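
A minimal simulation can make the dynamic concrete. This is my own illustrative construction, not taken from the linked article: females prefer a longer “tail”, longer tails cost survival, and the population mean drifts upward anyway.

```python
import random

# Toy simulation of runaway trait exaggeration under mate choice.
# Illustrative construction only: males carry a "tail length" trait;
# longer tails win more matings but reduce survival, so mate choice
# drags the mean upward while the survival cost acts as a brake.

random.seed(42)
POP, GENS = 500, 100

tails = [random.gauss(1.0, 0.1) for _ in range(POP)]

for _ in range(GENS):
    offspring = []
    while len(offspring) < POP:
        sire = max(random.sample(tails, 3))      # females pick the longest of three
        survival = max(0.0, 1.0 - 0.02 * sire)   # longer tail, worse survival
        if random.random() < survival:
            offspring.append(sire + random.gauss(0.0, 0.05))  # inherit + mutate
    tails = offspring

print(f"mean tail length after {GENS} generations: {sum(tails) / POP:.2f}")
```

Starting from a mean of 1.0, the trait ratchets upward generation after generation; tightening the survival penalty (the 0.02) is what eventually halts the exaggeration.
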
The role of secondary sex characteristics in mating might find a parallel in the role of educational credentials in the modern job market.

Generations ago, bankers were the sons of bankers who went to Harvard and got mediocre grades.

Today, the world of finance is more open to talent.

However, this means that the Harvard degree has actually become more important.

This is because, in the absence of the establishment’s inbred “old boys’ club”, the educational credential is the equivalent of a passport telling the world who you are.

Unfortunately, the elite credential does not necessarily translate into real-world success.

Like they say, one’s IQ measures not intelligence, but rather one’s ability to do well on an IQ test.

Also, in the aftermath of a calamity, the more refined measures of success fall away.

When civilization collapses, the elite educational credential might not mean as much as practical skills.

For example, the stereotype of Jewish women is that they want to marry a nice doctor or lawyer, but when society collapses, suddenly the tough farmer seems like the real prize.

https://en.wikipedia.org/wiki/Defiance_(2008_film)

A sociopolitical parallel to exaggerated and useless secondary sex characteristics might be found in the concept of societal “decadence”.

Decadence properly understood is not decline but the period of barren opulence that precedes decline.

At the height of its powers and accomplishments, a civilization will simply exhaust itself and enter a period of self-satisfied albeit frustrated stasis.

Its central guiding cultural principles will lose their dynamism and creative energy, problems will cease to be solved, and people will passive-aggressively turn against one another.

Decadence manifests a bored, sterile, self-indulgent prosperity in which past struggles and glories are simulated in exaggerated form as a source of excitement.

In a democracy, during a period of decadence, political leaders tend to be actors, showmen, con artists, carnival barkers.

Like professional wrestling, political contests like elections take on the quality of staged contests run by a corrupt duopoly.

Eventually, a frustrated public turns away in disgust from the establishment and embraces … pseudo-extremists (pseudo-socialists, pseudo-fascists).

By this measure, the USA has been in a period of decadence since the 1980s.

There are three periods of American history along the lines of how American presidents related to their public image:
  1. In the first period, from Washington to Truman, American presidents were exceedingly image conscious, but they lived up to or exceeded the image that they projected.
  2. In the second period, from Eisenhower to Carter, American presidents were in their true character very different from the glamorous or folksy public image that they projected because they were far more serious, intelligent and substantial than that popular image.
  3. In the third period, from Reagan onward, style exists without substance.
Returning to the world of nature, it has been theorized that the exaggerated displays of the mating ritual do, in fact, serve a practical function.

Extremely healthy animals can get away with a big, fancy display that would normally hinder their struggle to survive.

Like humans who own expensive houses and cars, animals with exaggerated traits are showing off to potential mates that they can afford the expense.

(photo of long-tailed widowbird)
https://en.wikipedia.org/wiki/Handicap_principle
The handicap principle suggests that reliable signals must be costly to the signaler, costing the signaler something that could not be afforded by an individual with less of a particular trait. For example, in the case of sexual selection, the theory suggests that animals of greater biological fitness signal this status through handicapping behaviour or morphology that effectively lowers this quality. The central idea is that sexually selected traits function like conspicuous consumption, signalling the ability to afford to squander a resource. Receivers know that the signal indicates quality because inferior quality signallers cannot afford to produce such wastefully extravagant signals.
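
To make that logic concrete, here is a minimal toy model (my own construction, not from the entry quoted above): the benefit of a display grows linearly, its cost grows quadratically, and the cost is steeper for low-quality signalers, so each signaler’s best display scales with its true quality.

```python
# Toy model of honest signaling under the handicap principle.
# Illustrative construction only. Payoff = benefit - cost, where
# benefit = signal and cost = signal**2 / quality: the same display
# costs a low-quality signaler more than a high-quality one.

def best_signal(quality: float) -> float:
    """Signal maximizing signal - signal**2 / quality.

    Setting the derivative 1 - 2 * signal / quality to zero
    gives the optimum signal = quality / 2.
    """
    return quality / 2

for quality in (0.5, 1.0, 2.0, 4.0):
    signal = best_signal(quality)
    payoff = signal - signal**2 / quality
    print(f"quality {quality:>3}: best display {signal:.2f}, net payoff {payoff:.2f}")
```

Because faking a large display would cost a weak signaler more than the mating benefit returns, display sizes rank-order the signalers by true quality, and receivers can trust the signal.
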
The scientist and popular writer Jared Diamond claims that the handicap principle is the reason why humans engage in so much substance abuse.

That is, alcohol, tobacco and other drugs are portrayed in advertising and the media as sexy precisely because they cause health problems to which the super-healthy would be immune.

https://en.wikipedia.org/wiki/The_Third_Chimpanzee#Sexual_selection_(parts_two_and_three)

The handicap principle might be at work in terms of educational credentials.

The exaggerated uselessness of an academic credential is precisely what makes it attractive.

The financial cost and intellectual rigor of an otherwise useless pursuit of knowledge in the liberal arts serve as a signal of vitality to prospective employers.

However, in the context of the 21st century’s glorification of a “creative class”, it might even go further than that.

Perhaps an elite credential is most effective when it is itself cast as an impediment to a rebellious disruptiveness — an impediment that the credential holder has overcome.

For example, there is the spectacle of the Ivy-educated rebel in the world of business, politics and technology who ceaselessly advertises how he is out to disrupt the system that made him.

He or she will constantly tell the public how he or she had to un-brainwash himself or herself from his or her elite education, hindrance that it is to Thinking Outside The Box.

This, at least at first, serves to hide his or her true nature as a pretty typical bland technocrat groomed from an early age for the role he or she is playing (including in art and literature).

Also, this rebellious posturing is something mastered and on constant display by the faculty at elite universities (especially by young professors fresh out of elite PhD programs).

Geopolitically, nothing inspires the peoples of planet Earth more than seeing a country with unprecedented wealth and power that has as its leaders a series of charismatic posers.

Such leaders embody the promise of the American dream that literally anyone can make it in the USA.

Ironically, decadence is a sign of the success of a sociopolitical-economic system, at least in terms of accruing a material surplus.

That is, the insubstantiality of American political leaders since the 1980s is a sign of American prosperity and superiority.

There might be a parallel in the world of business.

There is a saying that one should invest in a company that would thrive even if it were run by an imbecile — because eventually it will be.

The handicap principle might suggest that when that company is eventually run by an imbecile and it does well nonetheless, its value on the stock market would soar.

That is, despite all of his or her terrible mistakes, the moronic leader would be a signal to investors of the company’s ability to transcend even its own poor management.

The point of all this is that the characteristics of action blend into labor and work.

What seems like a gratuitous display (action) can have a functional purpose (work).

Conversely, what seems useful (work) might simply be an adornment (action).

This intertwining of labor, work and action is partly functional.

That is, labor is the foundation of work, which in turn is the basis of action.

This kind of stacking of stages of human development is common.

For example, it is famously found in Maslow’s hierarchy of needs.

Maslow’s hierarchy of needs bases individual fulfillment on the satisfaction of social needs, which are predicated on the satisfaction of physiological survival.

https://en.wikipedia.org/wiki/Maslow%27s_hierarchy_of_needs

To reiterate, people don’t want a Universal Basic Income, they want jobs.

Again, without labor, the cycle of exhaustion and nourishment (entertainment, recreation) is broken and the laborer falls into a death spiral of depression and frustration.

Also, if labor ceases, then so do work and action because they are based on labor.

Perhaps one can find this in the long lunches that the French are famous for.

The classic French lunch is a three-course meal consumed over two hours.

The assumption is that such a substantial lunch is necessary to provide the laborer with enough nourishment to work even harder.

https://www.bbc.com/worklife/article/20170407-we-can-learn-a-lot-from-how-the-french-do-lunch

On the other hand, lunchtime in France is a time of leisure and an escape from the world of labor.

http://news.bbc.co.uk/2/hi/programmes/from_our_own_correspondent/6599675.stm

But this realm of leisure is not entertainment or nourishment which serves merely to support labor, but a genuine realm of freedom.

For the French, sitting around and arguing is a form of action.
Then we were back on the road to Donzy, to join Claudie and her customers for coffee.
At Claudie’s, an unemployed builder was heatedly debating with a local teacher whether France needed to end its 35-hour working week.
If it hadn’t been for their clothes, I couldn’t have said who was who. Both had been to the same village school, both could quote their philosophers and the history of the Fifth Republic with equal ease.
If, for the French, eating represents leisure, which is a type of freedom and a form of action, and if action entails the display of individuality to the group, then meals should be communal.
The French are also very keen on commensality [eating together]. According to the Crédoc consumer studies and research institute, 80% of meals are taken with other people. “In France meals are strongly associated with good company and sharing, which is undoubtedly less so in other countries,” says Loïc Bienassis, a researcher at the European Institute of Food History and Culture.
Americans prefer individualism over the French public display of individuality.

For Americans, a person should eat whatever they want to and whenever they want to.
“In the US food is regarded as an individual concern,” Fischler adds. “Everyone is different and everyone is free to make up their own mind and accept the consequences. It is an individualistic, contract-based model. If someone invites you to dinner, you can tell them you’re a vegetarian and no one will be offended. You’re entitled to eat differently. In France, on the contrary, food is a collective concern, almost a form of communion: the idea of sharing is a key part of the meal. So there are customs it’s hard to get round – people will take a poor view of someone who doesn’t partake of the main dish or only eats gluten-free foods. What counts most of all is conviviality.”
For the French, the American way of dining represents the mere brutish necessity of replenishing calories in order to do more labor — a slavish, animalistic existence.
Americans take a radically different approach. There is nothing sacred about meals: everyone eats at their own speed, depending on their appetite, outside constraints and timetable. As long ago as 1937 French writer Paul Morand was surprised to see New Yorkers lunching alone, in the street, “like in a stable”. US practice is so different from French ritual that it sometimes requires explanation. “There’s a secondary school in Toulouse which organises exchanges with young Americans,” says social anthropologist Jean-Pierre Poulain. “To avoid any misunderstandings teachers warn families before their children leave that the start of their stay will not be marked by an evening meal, as in France. When the young visitors arrive they are shown the fridge and told they can help themselves whenever they like.”
Americans see food in terms of physiological nutrition rather than in terms of group enjoyment, and so they manifest an unusual puritanical streak in refusing to enjoy it too much.
“In the US the dominant conception of food is nutritional,” Fischler explains. “Feeding oneself is above all a matter of making rational decisions to satisfy bodily needs. In contrast the French have a culinary conception of food, putting the emphasis on flavour and pleasure. In our surveys we asked French and American people to say what they associated with various words. When we suggested ‘chocolate cake’, the Americans thought of ‘guilt’, the French, ‘birthdays’.”
American and French attitudes toward food are rooted in religious traditions.

Catholicism is communal and realistic, with an “otherworldly aestheticism” (cathedrals); Protestantism is individualistic and idealistic, with a “worldly asceticism” (capitalism).
One of the reasons for the solidity of these customs is that they are firmly rooted in the past. “France has a Roman Catholic tradition which sustains a sensual, hedonistic relation to food,” says Pascal Ory, a professor at the Sorbonne in Paris. “In the Jura, for example, nuns in Château-Chalon took charge of promoting vin jaune. Catholicism, with its celebration of the Eucharist, helped develop a real culture of eating and drinking, with the emphasis on the collective, communal dimension of meals. This is not the case in countries with Anglo-Saxon roots, where Protestantism entertains a more Puritan relationship with food. The Kellogg brothers, for example, were Seventh-day Adventists and vegetarians who wanted to use their corn flakes to combat Falstaff-style breakfasts centring on meat and beer.”
Something that the French might not understand is the link between work and creativity in the American imagination.

If for the French lunchtime embodies a period of communal freedom and debate, for educated Americans since the 1960s, the workplace represents freedom and creativity (“Mad Men”).

This might be the latest perverse iteration of capitalist propaganda, but it needs to be taken into account in order to understand how the USA has come to diverge so far from tradition.

For the ancient Greeks, the public realm was the sphere of freedom, whereas in the post-industrial world, the home became an escape from the travails of the world.

In TV shows like “The Sopranos”, “Mad Men” and “Breaking Bad”, the protagonist has a work family and a family at home, but it is at work that they excel and transcend their limits.

This is not the only way in which the pyramid of labor, work and action reflected in Maslow’s hierarchy of needs can seem simplistic.

Again, labor consists of a cycle of exhaustion and physiological replenishment, and work involves aligning means and ends.

Action is quite different from labor or work.

Action is distinct from labor in that it transcends the cycle of futility, and distinct from work in that it is impractical and “useless”: its value is inherent rather than instrumental.

But perhaps the distinctiveness of action can be taken one step further.

Action might involve a certain self-destructiveness.

Perhaps one can find this programmed self-destructiveness in Greek tragedies.

For example, in “Oedipus Rex”, the hero is doomed from the outset, fated to live out a grim prophecy.

In the pursuit of his origins, he displays a willful arrogance toward those who seek to dissuade him from his quest.

And yet when his horrifying fate is revealed, he confesses that he had sensed it all along.

The tragic hero has a dangerous, doomed aura.

So does the rock star, and this doomed aura is the source of the rock star’s dark glamor.

This brings us back to Jared Diamond’s theory that the handicap principle is the reason why humans engage in so much substance abuse.

That is, humans supposedly smoke and drink because they want to advertise that they are so healthy they can engage in unhealthy practices without ill effect.

It just might be that Diamond is really on to something.

However, the psychological reality might be even darker than Diamond thinks.

Diamond spoke of cigarette advertising projecting the image of the macho cowboy.

But many consumer products oriented toward men, such as trucks, project a macho image without implying that the product is a dangerous substance.

Indeed, in the face of medical studies beginning in the 1940s that warned of the dangers of tobacco, tobacco companies tried to portray their product as wholesome.

https://www.cnn.com/2014/01/10/health/gallery/historic-cigarette-ads/index.html
The smoking cowboy campaign was originally intended to rebrand filtered cigarettes as a man’s cigarette.

https://en.wikipedia.org/wiki/Marlboro_Man
The Marlboro Man is a figure used in tobacco advertising campaigns for Marlboro cigarettes. In the United States, where the campaign originated, it was used from 1954 to 1999. The images initially featured rugged men portrayed in a variety of roles[1] but later primarily featured a rugged cowboy or cowboys, in picturesque wild terrain.[2] The advertisements were originally conceived as a way to popularize filtered cigarettes, which at the time were considered feminine.
The effort of the tobacco companies in the 1950s was to get men to smoke healthier cigarettes.

It was erroneously believed that the filtered cigarettes that women smoked were healthier.
Philip Morris & Co. (now Altria) had originally introduced the Marlboro brand as a woman’s cigarette in 1924. Starting in the early 1950s, the cigarette industry began to focus on promoting filtered cigarettes, as a response to the emerging scientific data about harmful effects of smoking.[10] Under the misconception that filtered cigarettes were safer, Marlboro, as well as other brands, started to be sold with filters.
However, filtered cigarettes, Marlboro in particular, were considered to be women’s cigarettes.[11] During market research in the 1950s, men indicated that while they would consider switching to a filtered cigarette, they were concerned about being seen smoking a cigarette marketed to women
It was a stunningly successful campaign.
The Marlboro advertising campaign, created by Leo Burnett Worldwide, is said to be one of the most brilliant advertisement campaigns of all time.[3] It transformed a feminine campaign, with the slogan “Mild as May”, into one that was masculine, in a matter of months.
Gradually since the 1950s, cigarettes have come to be perceived by the general public as unhealthy.

In the USA, the middle class and upper-middle class professionals now avoid cigarettes (although the top 1% of income earners supposedly smoke to a surprising extent).

Cigarettes have thus taken on an aura of danger.

But it could be that at some level, tobacco was always associated with death and, curiously, was also associated with sexuality.

The source of this association between tobacco and death, and thus with sexuality, would exist at a deep, unconscious level.

Human experience reflects two central principles — unruly, chaotic instinct versus individually distinct orderly forms.

These two forces were brought into balance in ancient Greek religion and art.

https://en.wikipedia.org/wiki/Apollonian_and_Dionysian
The Apollonian and Dionysian is a philosophical and literary concept and dichotomy/dialectic, based on Apollo and Dionysus in Greek mythology. Some Western philosophical and literary figures have invoked this dichotomy in critical and creative works, most notably Friedrich Nietzsche and later followers.
In Greek mythology, Apollo and Dionysus are both sons of Zeus. Apollo is the god of the sun, of rational thinking and order, and appeals to logic, prudence and purity.
Dionysus is the god of wine and dance, of irrationality and chaos, and appeals to emotions and instincts. The Ancient Greeks did not consider the two gods to be opposites or rivals, although they were often entwined by nature.
http://faculty.fiu.edu/~harrisk/Notes/Aesthetics/Apollonian-%20Dionysian%20Dichotomy.htm
“Apollonian” and “Dionysian” are terms used by Friedrich Nietzsche in The Birth of Tragedy to designate the two central principles in Greek culture. Nietzsche characterizes the differences this way:
The Apollonian:  analytic distinctions
All types of form or structure are Apollonian, thus, sculpture is the most Apollonian of the arts, since it relies entirely on form for its effect. Rational thought is also Apollonian since it is structured and makes distinctions.
The Dionysian: inability or unwillingness to make these distinctions; directly opposed to the Apollonian
Drunkenness and madness are Dionysian. All forms of enthusiasm and ecstasy are Dionysian. Music is the most Dionysian of the arts, since it appeals directly to man’s instinctive, chaotic emotions and not to his formally reasoning mind.
In Jungian psychology, the human “collective unconscious” is composed of a universal gallery of personified primordial images akin to an Apollonian realm of “shining forms”.

https://en.wikipedia.org/wiki/Jungian_archetypes
Jungian archetypes are defined as universal, archaic symbols and images that derive from the collective unconscious, as proposed by Carl Jung. They are the psychic counterpart of instinct. That is to say they are a kind of innate unspecific knowledge, derived from the sum total of human history, which prefigures and directs conscious behavior.
They are unclear underlying forms, or the archetypes-as-such, from which emerge images and motifs[4] such as the mother, the child, the trickster, and the flood among others. History, culture and personal context shape these manifest representations thereby giving them their specific content. These images and motifs are more precisely called archetypal images. However it is common for the term archetype to be used interchangeably to refer to both archetypes-as-such and archetypal images.
https://www.verywellmind.com/what-are-jungs-4-major-archetypes-2795439
Archetypes were a concept introduced by the Swiss psychiatrist Carl Jung, who believed that archetypes were models of people, behaviors, or personalities. Archetypes, he suggested, were inborn tendencies that play a role in influencing human behavior.
Jung believed that the human psyche was composed of three components: the ego, the personal unconscious, and the collective unconscious. According to Jung, the ego represents the conscious mind while the personal unconscious contains memories including those that have been suppressed. The collective unconscious is a unique component in that Jung believed that this part of the psyche served as a form of psychological inheritance. It contained all of the knowledge and experiences we share as a species.
In Jungian psychology, the archetypes represent universal patterns and images that are part of the collective unconscious. Jung believed that we inherit these archetypes much the way we inherit instinctive patterns of behavior.
Jung identified four major archetypes:
  • The Persona. The persona is how we present ourselves to the world. Over the course of development, children learn that they must behave in certain ways in order to fit in with society’s expectations and norms. The persona develops as a social mask to contain all of the primitive urges, impulses, and emotions that are not considered socially acceptable.
  • The Shadow. The shadow forms out of our attempts to adapt to cultural norms and expectations. It is this archetype that contains all of the things that are unacceptable not only to society, but also to one’s own personal morals and values. This archetype is often described as the darker side of the psyche, representing wildness, chaos, and the unknown.
  • The Anima or Animus. The anima is a feminine image in the male psyche, and the animus is a male image in the female psyche. These archetypal images are based upon both what is found in the collective and personal unconscious. The collective unconscious may contain notions about how women should behave while personal experience with wives, girlfriends, sisters, and mothers contribute to more personal images of women.
  • The Self. The self is an archetype that represents the unified unconsciousness and consciousness of an individual. Creating the self occurs through a process known as individuation, in which the various aspects of personality are integrated. The ego makes up the center of consciousness, but it is the self that lies at the center of personality, which encompasses not only consciousness, but also the ego and the unconscious mind.
Jung suggested that the number of existing archetypes was not static or fixed. Instead, many different archetypes may overlap or combine at any given time. The following are just a few of the various archetypes that Jung described:
  • The father: Authority figure; stern; powerful.
  • The mother: Nurturing; comforting.
  • The child: Longing for innocence; rebirth; salvation.
  • The wise old man: Guidance; knowledge; wisdom.
  • The hero: Champion; defender; rescuer.
  • The maiden: Innocence; desire; purity.
  • The trickster: Deceiver; liar; trouble-maker.
A structuralist approach to the archetypes that populate human mythologies understands them as manifestations of an underlying systematic “concrete science”.

That is, in a non-scientific mode of understanding, humans comprehend the world via concrete images that are nevertheless underpinned by a logic of classification.

Like the binary machine code that is the foundation on top of which a computer’s operating system runs, there is a universal “human spirit” that underlies manifold human cultures.
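
To make the computing analogy concrete, here is a minimal, purely illustrative Python sketch of the structuralist idea: oppositions as paired poles, with a mediating figure standing between the poles of an axis. The pairs and the mediator are drawn from the examples discussed below; nothing here comes from Lévi-Strauss himself.

```python
# Purely illustrative: a structuralist "table of oppositions" as data.
# The pairs and the mediator below are assumptions taken from the
# surrounding discussion, not from Levi-Strauss's own notation.

# Axes of paired opposites on which a mythology is built.
OPPOSITIONS = [
    ("raw", "cooked"),
    ("nature", "culture"),
    ("honey", "tobacco"),
    ("life", "death"),
]

# A mediator shares features with both poles of an axis.
MEDIATORS = {
    ("life", "death"): "carrion-eater (raven, coyote)",
}

def mediate(pole_a: str, pole_b: str) -> str:
    """Return the figure that resolves an opposition, if one exists."""
    return MEDIATORS.get((pole_a, pole_b), "unresolved")

for a, b in OPPOSITIONS:
    print(f"{a} / {b} -> {mediate(a, b)}")
```

The point of the toy is only that a small set of binary axes plus mediators can generate a large surface variety of myths, much as a small instruction set underlies a large operating system.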

https://www.nytimes.com/1973/06/03/archives/from-honey-to-ashes-introduction-to-a-science-of-mythology-2-by.html
A constellation of myths taken from neighboring South American societies are made the logical counters in a system which operates according to certain dialectical “laws” or principles of opposition constructed on “a multiplicity of axes” (honey/tobacco, male/female, raw/cooked, spouse/affine, literal sense/figurative sense, diachrony/synchrony, dry/wet, high/low, life/death). Such binary oppositions permeate the work of Lévi-Strauss, and have their parallels in the linguistics of De Saussure and of computer science.
Because of this binary nature of mythologies, mythical thought always progresses from the awareness of oppositions toward their resolution.

https://en.wikipedia.org/wiki/Claude_L%C3%A9vi-Strauss#The_structuralist_approach_to_myth
Thus, myths consist of:
  1. elements that oppose or contradict each other and
  2. other elements that “mediate”, or resolve, those oppositions.
The archetype of the trickster is a classic case.
Lévi-Strauss thinks the trickster of many Native American mythologies acts as a “mediator”. Lévi-Strauss’s argument hinges on two facts about the Native American trickster:
1) the trickster has a contradictory and unpredictable personality;
2) the trickster is almost always a raven or a coyote.
Lévi-Strauss argues that the raven and coyote “mediate” the opposition between life and death.

The relationship between agriculture and hunting is analogous to the opposition between life and death: agriculture is solely concerned with producing life (at least up until harvest time); hunting is concerned with producing death.
Furthermore, the relationship between herbivores and beasts of prey is analogous to the relationship between agriculture and hunting: like agriculture, herbivores are concerned with plants; like hunting, beasts of prey are concerned with catching meat.
Lévi-Strauss points out that the raven and coyote eat carrion and are therefore halfway between herbivores and beasts of prey: like beasts of prey, they eat meat; like herbivores, they don’t catch their food. Thus, he argues, “we have a mediating structure of the following type”:
[Figure: Lévi-Strauss’s mediating structure, with the carrion-eaters (raven, coyote) standing between herbivores and beasts of prey.]

One criticism of Lévi-Strauss’s analysis is that it does not appear to be capable of explaining why representations of the Trickster in other areas of the world make use of such animals as the spider and mantis.

One response to this criticism might be that the spider and the mantis are both famous as ambush predators who hunt by stealth and camouflage.

That is, they are opportunists rather than classic beasts of prey that engage in the chase.
In fact, one species of mantis is disguised as vegetation.

https://en.wikipedia.org/wiki/Ambush_predator
(orchid mantis)
The binary distinctions within mythologies that Lévi-Strauss observes are typically of a culinary nature.

This might represent a rather French interpretation that he imposes on native South American mythologies.

One classic binary distinction that he is famous for is between the motifs of raw food versus cooked food.

For Lévi-Strauss, in mythologies, the raw and the cooked symbolically represent the primary dualism of nature and culture, especially the transition from the former to the latter.

Metaphorically, the cooking of food represents the journey that humans take both as a species that discovered civilized life and as individuals in their maturation processes.

The binary distinction between life and death is represented by honey and tobacco, respectively.

https://www.nytimes.com/1973/06/03/archives/from-honey-to-ashes-introduction-to-a-science-of-mythology-2-by.html
“From Honey to Ashes” carries out further the culinary metaphor introduced in “The Raw and the Cooked.” This time, however, the myths concern honey and tobacco (ashes) which belong to either side of cooking: honey on the near-side as a naturally “cooked” yet raw product, and tobacco on the far-side as an over-cooked, literally burnt to ashes, by-product of culture. The continued use of the dichotomy of nature:culture/raw:cooked permits Lévi-Strauss to join the myths studied in both volumes by combining their elements.
Honey myths represent life and sexuality.
The “honey” myths (and there are dozens of variations) have as their central theme the abuse of a natural product highly valued by the South American Indians for its sweetness. Honey is sought after and greedily consumed by a “girl mad for honey” instead of being reserved as it should have been as a gift offering to her parents from her husband. Disjunctive, or anti-social, behavior signals a “honey” myth which symbolizes also eroticism, seduction and certain “back to nature” aspects of…
Tobacco represents the connection between the natural world and the supernatural realm.
“Tobacco” myths differ in that they are seen to create conjunctions between the profane world and the spirit world. Tobacco, a “food which has to be burnt before it is consumed,” establishes a communication between man and the supernatural, in the form of smoke, whereas the girl’s selfish consumption of honey is a “honeymoon” in every sense of the word: it stresses a sexual relationship between a man and a woman at the expense of communication between the larger social groups of which the bride and groom are members. Honey myths result finally in a regression from culture exemplified as affinal exchanges to nature exemplified by disruptive passion.
If human development consists of the progression from nature to culture, it also involves the path from life to death.

However, the movement from life to death also ironically reverses the normal course of human development, transforming it into a journey back into nature.

Moreover, the movement toward death as a negation of culture is also a move back into an untamed sexuality (e.g., myths about bypassing marriage).

This is what all those honey and tobacco myths are about.
Honey and tobacco have the same meanings as those other myths treated in “The Raw and the Cooked” but with one crucial difference. They reverse the previously established progression from nature to culture. Instead they depict degeneration, or movement away from culture back to nature, the consequence as it were, of not obeying the cooking instructions set forth in Vol. I. The myths in “From Honey to Ashes” therefore appear on the “wrong” side of those recounted in “The Raw and the Cooked.”
This brings us back to tobacco advertising and the sex appeal of cigarettes.

Jared Diamond claims that cigarette smoking and substance abuse have sex appeal because they reflect the handicap principle.

Again, excessive physical traits like the peacock’s plumage advertise how healthy an individual is, because only the healthiest males can survive with such a burden.

Unfortunately for this theory, although cigarettes were always regarded as sexy, they were not necessarily perceived as dangerous or insalubrious.

Again, in the USA, cigarettes were not regarded as unhealthy until medical research began to suggest otherwise in the 1940s.

In response, in the 1950s, cigarette companies attempted to make cigarettes more healthy by selling filtered cigarettes — which were associated with women — to men.

This advertising campaign involved rebranding filtered cigarettes as macho by associating them with cowboys.

However, at a deeper psychological level, because tobacco was always associated with ashes, it was associated with death.

In a complicated way, death was associated with a reversion back to nature and to a raw sexuality.
Jared Diamond is suggesting that cigarettes symbolize the overcoming of death, but cigarettes might actually represent the finality of death and the reversion to raw nature and sexuality.

The greater argument is about the significance of action.

If labor exists for nourishment and work to achieve goals, action is a gratuitous form of the display of individuality that might have the sexy aura of self-destructiveness.

The realm of action has a political economy of death.

A normal economy involves accumulation and parsimony.

In contrast, action involves a useless splurging of resources in the face of death.

Not all squandering is a form of action.

A shopping spree is a form of recreation; buying an expensive status symbol is an investment; and paying off a mortgage on a McMansion represents the attainment of a life goal.

In contrast, a samurai committing seppuku (harakiri) or ritual suicide through disembowelment can be a form of action that is also a kind of useless expenditure.

Whereas medieval vassals in Europe had rights and duties, the samurai had duties only, and little in the way of a right to protest.

The only form of protest to one’s lord was seppuku, and then only to protest that the lord had made a mistake against his own self-interest (seppuku was the ultimate selfless act of service).

In a sense, the samurai’s life consisted of labor/recreation and work/accomplishments, but his latitude for action properly understood was tightly constricted.

In the face of death, people engage in exorbitant expenditures that have dubious practical value — for example, creating and completing a “bucket list”.

Action is one such expenditure.

Creating new life — that is, having children — bypasses this sort of existential crisis and inhibits or occludes action.

John Kennedy wrote that “To have a child is to give fate a hostage.”

This is widely understood as a statement of frustrated ambition.

It was condensed from Sir Francis Bacon’s claim that family life impedes great enterprises, whether of virtue or of mischief.
He that hath wife and children hath given hostages to fortune, for they are impediments to great enterprises, either of virtue or mischief. Certainly the best works and of greatest merit for the public have proceeded from the unmarried or childless men, which both in affection and means have married and endowed the public.
This is also part of the logic behind George W. Bush’s push to create an “ownership society”.

https://en.wikipedia.org/wiki/Ownership_society
Ownership society is a slogan for a model of society promoted by former United States president George W. Bush. It takes as lead values personal responsibility, economic liberty, and the owning of property.
…if you own something, you have a vital stake in the future of our country. The more ownership there is in America, the more vitality there is in America, and the more people have a vital stake in the future of this country.” – President George W. Bush, June 17, 2004.
“We’re creating… an ownership society in this country, where more Americans than ever will be able to open up their door where they live and say, welcome to my house, welcome to my piece of property.” – President George W. Bush, October 2004.
The libertarian side of the ownership society is that everything should be privatized in order to promote economic efficiency and vitality.

In reality, the Bush “ownership society” agenda helped inflate the subprime mortgage bubble.

The conservative side of the idea is that if everyone owned homes and had children, they would embrace conservative values and stay out of trouble, and society would flourish.

On the one hand, a society where everyone would mind their own business and stay out of trouble and focus on earning and saving would indeed be a safe, prosperous society.

On the other hand, it would be an uncreative, stagnant, sterile kind of “flourishing”, much like the stereotype of Singapore as “Disneyland with the death penalty”.

In a conservative “ownership society” where people are invested in houses and family, there are fewer troublemakers — but there is also less action, less greatness and less creativity.

The connection between action and awareness of mortality might apply to the private realm and not just the public, political realm that was the original focus of Arendt’s work.

In discussions of Maslow’s hierarchy of needs, the satisfaction of physiological and social needs culminates in “self-actualization” — that is, achieving one’s full creative potential.

In the literature, self-actualization is usually understood in the most privatized terms, for example, in Maslow’s profile of the characteristics of self-actualizers.

For example, even the characteristic of “community feeling” (Gemeinschaftsgefühl) refers to emotion and identity, not political commitment.

https://en.wikipedia.org/wiki/Self-actualization#Characteristics_of_self-actualizers
  • Efficient perceptions of reality. Self-actualizers are able to judge situations correctly and honestly. They are very sensitive to the superficial and dishonest, and are free to see reality ‘as it is’.
  • Comfortable acceptance of self, others and nature. Self-actualizers accept their own human nature with all its flaws. The shortcomings of others and the contradictions of the human condition are accepted with humor and tolerance.
  • Reliant on own experiences and judgement. Independent, not reliant on culture and environment to form opinions and views.
  • Spontaneous and natural. True to oneself, rather than being how others want.
  • Task centering. Most of Maslow’s subjects had a mission to fulfill in life or some task or problem ‘beyond’ themselves (instead of outside themselves) to pursue. Humanitarians such as Albert Schweitzer are considered to have possessed this quality.
  • Autonomy. Self-actualizers are free from reliance on external authorities or other people. They tend to be resourceful and independent.[18]
  • Continued freshness of appreciation. The self-actualizer seems to constantly renew appreciation of life’s basic goods. A sunset or a flower will be experienced as intensely time after time as it was at first. There is an “innocence of vision”, like that of an artist or child.
  • Profound interpersonal relationships. The interpersonal relationships of self-actualizers are marked by deep loving bonds.
  • Comfort with solitude. Despite their satisfying relationships with others, self-actualizing people value solitude and are comfortable being alone.[19]
  • Non-hostile sense of humor. This refers to the ability to laugh at oneself.
  • Peak experiences. All of Maslow’s subjects reported the frequent occurrence of peak experiences (temporary moments of self-actualization). These occasions were marked by feelings of ecstasy, harmony, and deep meaning. Self-actualizers reported feeling at one with the universe, stronger and calmer than ever before, filled with light, beauty, goodness, and so forth.
  • Socially compassionate. Possessing humanity.
  • Few friends. Few close intimate friends rather than many perfunctory relationships.
  • Gemeinschaftsgefühl. According to Maslow, the self-actualizers possess “Gemeinschaftsgefühl”, which refers to “social interest, community feeling, or a sense of oneness with all humanity.”

Intensified awareness of mortality might not characterize only the realm of action, but also the realm of contemplation.

The most famous example might be Socrates’s assertion that philosophy is learning how to die.
In fact, perhaps action and contemplation involve not just a heightened awareness of mortality as a spur to action and thought, but self-destructiveness as well.

Socrates’s death by hemlock is yet another famous example.

What about the USA, where the realm of contemplation was always eclipsed by labor, work and action?

IIRC, the vision of W.E.B. Du Bois was that African Americans would become the intellectual class of the USA.

That task fell largely to white New Yorkers, who were predominantly Jewish, and often graduates of the impoverished public City College of New York.

https://en.wikipedia.org/wiki/The_New_York_Intellectuals

However, in the realm of art, fringe popular music took a turn toward intellectual respectability with the development of jazz into a recognized form of high art.

https://www.pbs.org/wgbh/cultureshock/beyond/jazz.html
The Devil’s Music features another jazz great of the century, composer and bandleader Duke Ellington, who created a sensation when he toured England in 1933. By the time Ellington hit the scene, classical musicians and music critics alike were analyzing jazz and declaring it a serious art form.
Jazz as a form of contemplation rather than entertainment brings with it the shadow of mortality and self-destructiveness.

Intensifying a sense of mortality would stimulate creativity and virtuosity.

Unfortunately for this thesis, in discussions of why jazz musicians in the 1930s turned toward heroin, many reasons are given, but an intensified sense of mortality is not one of them.

https://www.bbc.co.uk/sounds/play/b08k4s1k

One of the biggest reasons why jazz musicians used heroin was the example of their idol, Charlie Parker, who used heroin to relieve his many medical and emotional afflictions.

Charlie Parker has been called the “patient zero” of heroin use, a kind of super-spreader, because so many musicians emulated him.

If Charlie Parker had not been such a great musician or if he had not used heroin, there is the possibility that heroin might not have become so established in the USA.

There was a common myth that heroin use would inspire genius in a musician, even though the clear evidence was that heroin use would impair performance.

This myth that heroin stimulated artistic virtuosity is still alive and kicking.

https://nypost.com/2017/02/05/charlie-parkers-heroin-addiction-helped-make-him-a-genius/

The hypothesis here is that behind the myth was the deep psychological association of creative inspiration with self-destructiveness.

For example, for the past 150 years, it has been the very deadliness of absinthe that has made it the muse of so many artists and writers.

https://www.bbc.com/culture/article/20140109-absinthe-a-literary-muse

The point is that there is a primal connection between death and both action and contemplation.

The confrontation with death can be a spur to action, but when action in the face of death is futile, a melancholy, thoughtful stoic resignation takes hold.

Whereas action is a young man’s game involving the display of the self to others, contemplation involves the self examining itself in the mirror of the mind’s eye.

The greater point of all of this is that an unconditional Universal Basic Income violates various primordial relationships.
  • The enjoyment of entertainment and recreation is based on the exhaustion caused by labor, and so without labor and exhaustion, life becomes unbearable.
  • Work and action are based on a foundation of labor, and collapse without labor.
  • Relatedly, labor, work and action are intertwined, but ambiguously.
  • Thus, action involves a gratuitous display of excessive excellence that is of problematic practical value (and this has echoes in natural selection by mating).
  • Related to this, action has a “political economy of death”.
  • This might also be true of contemplation.
Rather than paying out an unconditional Universal Basic Income, the state would publicly compensate labor.

For example, the American response to the SARS2 crisis was to bail out corporations and provide stimulus checks to Americans.

The European response was to provide wage subsidies, with most of an employee’s wages coming from their government.

https://www.voanews.com/covid-19-pandemic/us-unemployment-soars-wage-subsidies-enable-european-workers-keep-their-jobs
The stark contrast between the unemployment figures in the U.S. and those in most European countries reflects the very different policy choices made by lawmakers in Washington and by those across Europe. By and large, governments across Europe have been subsidizing the wages of employees of private companies, in some cases effectively paying them to stay home.
“European governments are choosing to use their unemployment funds to keep workers attached to the firms that they’re working for through what are known as work-sharing, or short-term compensation programs,” said Robert E. Scott, senior economist and director of trade and manufacturing policy research at the Economic Policy Institute in Washington.
“That means that workers keep the benefits that they have associated with those jobs,” he said. “But it also, more importantly, keeps the workers attached to the firm and not unemployed.”
These programs serve two main purposes. The most immediate is that they guarantee that millions of people who could face financial disaster if they lost their stream of income are able to remain solvent. In the long term, the programs are designed to maintain the connection between workers and employers, making it easier and more likely for businesses to start back up once the crisis is past.
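
To make the contrast concrete, here is a back-of-the-envelope sketch. The wage and subsidy rate are hypothetical assumptions; the stimulus-check and enhanced-unemployment figures echo the actual 2020 American amounts, but the comparison as a whole is illustrative only.

```python
# Back-of-the-envelope comparison; all figures are illustrative.
monthly_wage = 3000.0              # hypothetical pre-crisis wage

# European-style short-time compensation: the state pays a share of
# the existing wage and the worker stays attached to the firm.
subsidy_rate = 0.80                # assumed subsidy share
european_income = monthly_wage * subsidy_rate      # 2400.0, job kept

# American-style response: a one-off check plus enhanced unemployment
# benefits (the 2020 figures), with the job relationship severed.
stimulus_check = 1200.0            # one-time payment
weekly_benefit = 600.0             # enhanced federal unemployment
american_income_month_1 = stimulus_check + 4 * weekly_benefit  # 3600.0

print(european_income, american_income_month_1)
```

On these assumed numbers the American worker actually receives more in the first month, which is consistent with the moral-hazard complaint discussed below, but the attachment to the firm is lost.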
In popular discourse, the American policy is controversial because of the issue of moral hazard.
Some bailed-out companies were irresponsible or obsolete, and the stimulus checks disincentivized people from working (which might be a good thing in a pandemic).

Also, rather than spend the checks and thus stimulate the economy, people are saving their money or plowing it into the stock market.

However, the point here is quite different from these mainstream concerns.

At some level, people have a psychological need to labor, and wage subsidies give people a sense that they are still workers (even if they are home).

Most obviously, the great mass of the public could be employed in avoiding the three Cs and engaging in the three Ts.

But how would this be done?

How would the government determine whether its citizens were avoiding confined areas, crowded places and close contact, and engaging in testing, tracing and treatment (isolation)?

Essentially, citizens would be compensated for engaging in a system of self-imposed surveillance.
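
What might such compensation look like in practice? A minimal sketch follows, assuming hypothetical thresholds and a weekly record assembled from phone data and testing logs; none of the field names or amounts comes from any actual program.

```python
# Hypothetical sketch of a conditional Guaranteed Minimum Income rule.
# Every threshold, amount, and field name below is an assumption made
# for illustration only.

from dataclasses import dataclass

@dataclass
class WeeklyRecord:
    tested_when_asked: bool        # complied with testing requests
    shared_trace_data: bool        # opted in to contact tracing
    crowd_exposure_minutes: int    # time spent in "three Cs" settings

BASE_PAYMENT = 300.0               # hypothetical weekly amount
EXPOSURE_LIMIT = 120               # hypothetical weekly allowance

def weekly_payment(record: WeeklyRecord) -> float:
    """Pay the full amount only if all precaution conditions are met."""
    if not (record.tested_when_asked and record.shared_trace_data):
        return 0.0
    if record.crowd_exposure_minutes > EXPOSURE_LIMIT:
        return 0.0
    return BASE_PAYMENT

print(weekly_payment(WeeklyRecord(True, True, 45)))   # 300.0
print(weekly_payment(WeeklyRecord(True, False, 45)))  # 0.0
```

The design point is that the payment is conditional, unlike a UBI: the money flows only while the verifiable precautions continue.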

First of all, that can be a tough pill to swallow in normal times.

Providing phone data to the government would be crucial in a program of financial rewards for taking precautions against SARS2.

People really do not want to give their location data to authorities.

They don’t mind it, however, when they imagine that their personal location data is being collected by a cute little robot-like “app”.

The public fails to realize several things about the harvesting of their personal data by smartphone apps:
  • Actual human beings at the corporations that make apps are looking at your data.
  • This data is sold legally to other corporations and to government entities (such as law enforcement), and sold illegally to individuals on the dark web.
https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html

All sorts of technology companies wanted to get into the automobile manufacturing business, all the better to track you and sell your data.

https://www.theatlantic.com/technology/archive/2016/03/self-driving-cars-and-the-looming-privacy-apocalypse/474600/

Tesla's business plan is all about collecting the information from its cars relating to building a map for autonomous vehicles.

https://www.cnbc.com/2020/05/07/why-kevin-oleary-wouldnt-have-given-musk-a-tesla-deal-on-shark-tank.html
“My son went to work for [Tesla as an intern] and convinced me that it wasn’t a car company I was investing in, it was a data technology company,” O’Leary says. “Sometimes you have to think out of the box.”
Analysts that have long supported Tesla have said the data the company collects from its customers is “one of the most under-valued areas of the business, which is significantly ahead of competitors,” as CNBC reported in February.
“Every mile driven globally, [Tesla’s] data gets smarter, the resolution better,” O’Leary said on CNBC’s “Halftime Report” on Wednesday. “This company is all about the future of autonomous driving.”
But this might be suggestive of where automobile companies in the future will generate revenue.
Car companies make relatively little from the sale of the cars themselves.

Rather, they now operate like banks, making their money by lending to their customers.

They may have no choice in the future but to become purveyors of their customers’ personal location data.

Second, to make things worse, these are not normal times.

Traditional conservatives are normally conscientious and strict about accepting personal responsibility.

Yet in the SARS2 crisis, these conservatives are avoiding personal precautions (masks) because, to them, such precautions have taken on the political and cultural overtones of an imposed “socialism”.

In this period of civil unrest, police forces are expressing an interest in using personal technology to contact trace protestors.

Because of this, progressives may grow wary of the three Ts of testing, tracing and treating (isolating) and neglect to avoid the three Cs of confined spaces, crowds and close contact.

People might be reimbursed for tracking themselves and sharing their information anonymously, much the way that they do when they post results of DNA tests.
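
One conceivable way to square reimbursement with anonymity, loosely in the spirit of the exposure-notification designs circulating in 2020: the phone publishes short-lived pseudonymous tokens derived from a secret key, rather than raw identity or location. The sketch below is a toy under those assumptions, not an actual protocol.

```python
# Illustrative sketch of anonymous self-tracking: the device derives
# short-lived pseudonymous tokens from a secret key, so observations
# can be matched later without publishing raw identity or location.
# This is a toy, not an actual exposure-notification protocol.

import hashlib
import secrets

def daily_key() -> bytes:
    """A fresh random secret, generated and kept on the device."""
    return secrets.token_bytes(16)

def rolling_token(key: bytes, interval: int) -> str:
    """Derive a pseudonym for a short time interval from the secret key."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).hexdigest()[:16]

key = daily_key()
tokens = [rolling_token(key, i) for i in range(3)]
print(tokens)  # unlinkable to the user unless the key is disclosed
```

Because only the device holds the key, the user could choose to disclose it (to claim a payment, say) without the authorities being able to reconstruct anyone else’s movements from the public tokens.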

As mentioned above, one complaint by conservatives and libertarians is that the American stimulus package incentivizes workers to skip work.

However, in the short term, that would be precisely what a society in lockdown should aim at.

This would be extended to paying those in treatment (isolation) for COVID-19 to use that time to retrain as part-time contact tracers.

Aside from collecting personal tracking information, there are other measures that can be taken to promote a system that financially rewards people for taking personal precautions.

For example, using government grants, the air-conditioning and ventilation systems of confined spaces like restaurants, bars and mass transit might be upgraded.

Also, these kinds of spaces might be repurposed for other uses, and owners could be funded to do so.
For example, could an underutilized restaurant be a good place to go for school, either with a teacher present or as a place to teleconference outside the home?

One thing to keep in mind is that these would not be temporary changes.

There is a good chance that if and when the SARS2 crisis ends, there will soon enough be a new — and potentially much worse — pandemic at hand.

So these might be timely investments.

One question is what to do about schools and universities.

Educational institutions are perceived by many as potential superspreaders.

South Korea suggests a model of what can be done with teenagers.

South Korea has compulsory military service for all young men that lasts almost 24 months.

SARS2 was largely kept under control in South Korea.

The South Korean military in particular has been highly effective in containing the SARS2 threat within its ranks.

A conditional Guaranteed Minimum Income could offer older high school students brief stints of military service during periods when the pandemic shuts schools.

The students would be isolated from their families for months at a time.

This might involve some education, but it would primarily focus on basic training.

These proposals are at least illustrative attempts to come up with creative ways to limit the impact of a pandemic.