Sunday, 26 May 2019

American life is improving for the lowest paid (Economist)

The Economist title: American life is improving for the lowest paid

The Economist subtitle: Come back capitalism, all is forgiven

Date of publishing: 16 May 2019


"Brad hooper quit his previous job at a grocery in Madison because his boss was “a little crazy”. The manager threatened to sack him and other cashiers for refusing orders to work longer than their agreed hours. Not long ago, Mr Hooper’s decision to walk out might have looked foolhardy. A long-haired navy veteran, he suffers from recurrent ill-health, including insomnia. He has no education beyond high school. Early this decade he was jobless for a year and recalls how back then, there were “a thousand people applying for every McDonald’s job”.

This time he struck lucky, finding much better work. Today he sells tobacco and cigarettes in a chain store for 32 hours a week. That leaves plenty of time for his passion, reading science fiction. And after years of low earnings he collects $13.90 an hour, almost double the state’s minimum rate and better than the grocer’s pay. His new employer has already bumped up his wages twice in 18 months. “It’s pretty good,” he says with a grin. What’s really rare, he adds, is his annual week of paid holiday. The firm also offers help with health insurance.

His improving fortunes reflect recent gains for many of America’s lowest-paid. Handwritten “help wanted” signs adorn windows of many cafés and shops in Madison. A few steps on from the cigarette shop is the city’s job centre, where a manager with little else to do points to a screen that tallies 98,678 unfilled vacancies across Wisconsin. In five years, he says, he has never seen such demand for labour. He says some employers now recruit from a vocational training centre for the disabled. Others tour prisons, signing up inmates to work immediately on their release.

Unemployment in Wisconsin is below 3%, which is a record. Across America it was last this low, at 3.6%, half a century ago. A tight labour market has been pushing up median pay for some time. Fewer unauthorised immigrants arriving in America may contribute to the squeeze, though this is disputed. Official figures show average hourly earnings rising by 3.2% on an annual basis. “Right now, part time, it seems like everyone is hiring. Every American who wants a job right now can get a job,” says another shop worker in Merrillville, in northern Indiana.

In any economic upturn the last group of workers to prosper are typically the poorest earners, such as low-skilled shop staff, food preparers, care-givers and temps. Their pay was walloped in the Great Recession a decade ago, and the recovery since has been unusually slow. Pay has leapt recently—with the lowest-paid enjoying faster gains than the better-off.

[Note LO: see article for graph on "United States, usual weekly nominal earnings of a full-time worker at the tenth percentile"]

The benefits are not equally spread. In Wisconsin, as in much of the country, more jobs are being created in urban areas and in services. Laura Dresser, a labour economist, points to a “very big racial inequality among workers”. Wages have been rising fastest for African-Americans, but poorer blacks, especially those with felony convictions, are also likelier to have fallen out of the formal labour market, so are not counted in unemployment figures.

The wage recovery is not only about markets. Policy matters too. Some states, typically Republican-run, have been reluctant to lift minimum wages above the federal level of $7.25 an hour. In Merrillville, a worker in a pet shop carries a Husky puppy to be inspected by a group of teenage girls. Staff are paid “a dollar or two above the minimum wage”, says his manager. Despite his 13 years’ employment, and over 40 hours’ toil each week, his pay and benefits amount to little. He calls occasional bonuses a “carrot at the end of the road”.

He could munch on bigger carrots in other states. Lawmakers in some states are more willing to lift minimum wages. Where they do, the incomes of the lowest-paid rise particularly fast. Thirteen states and the District of Columbia raised the minimum wage last year. (Some cities, like Chicago and New York, occasionally raise it too). Elise Gould of the Economic Policy Institute told Congress in March that, in states which put up minimum wages at least once in the five years to 2018, incomes for the poorest rose by an average of 13%. In the remaining states, by contrast, the poorest got a rise of 8.6% over the same period.

In neither case, however, do the increases amount to much better long-term prospects for the worst-off. By last year, the poorest 10% were still earning only a miserly 4.1% more per hour than they did (in real wages) 40 years ago. Median hourly pay for America’s workers was up a little more, by 14%.

One study in Wisconsin suggests that caretakers, for example, took home over $12 an hour by last year, so were only just getting back to their (real) average earnings achieved in 2010. Expansion at the bottom of the labour market “is finally pulling some wages up. But it’s certainly been much slower in this boom than any other,” argues Tim Smeeding, a poverty expert at the University of Wisconsin, in Madison. He describes “capital winning over labour” for several decades, and expects the trend to continue, given weak unions, more automation and other trends.

The poorest get some hard-to-measure benefits in addition to higher hourly pay. Mr Hooper is not alone in daring to walk away from an exploitative boss. More of the low-paid get a bit more say on how and when they toil. Many crave a reduction in the income volatility that afflicts them, since sudden swings in earnings are associated with poor mental health, high stress and worry over losing access to financial assistance or food stamps.

One study of 7,000 households, by Pew, found in 2015 that 92% of them would opt for lower average incomes, if earnings were predictable. Follow-up research late last year suggested the same trends are still present. Low- and middle-income households remain anxious about volatile earnings. Most have almost no savings. Many would struggle with a financial shock of just a few hundred dollars.

Lots of jobs that are being created are in or near flourishing cities like Madison, where low-paid workers are squeezed by high housing costs. Pew has estimated that 38% of all tenant households spend at least 30% of their income on rent. Living in more affordable places, such as Janesville, an hour south of Madison, may be an option for the lower-paid. But that means commuting to the city, or taking local jobs with less pay and fewer benefits. Few workers earning less than $12 an hour get health insurance from their employer, whereas most do so above that threshold.

Katherine Cramer, who studies the long-standing causes of simmering anger among poorer, rural Americans, says “resentment is worse than before”, despite the recent better wages. Rural folk complain that “it’s been like this for decades”, she says. A year or two catching up has not yet been enough to change their minds."

Source: https://www.economist.com/united-states/2019/05/18/american-life-is-improving-for-the-lowest-paid

Saturday, 25 May 2019

Owning a dog is influenced by genetic make-up (Medical Xpress)

Medical Xpress title: Owning a dog is influenced by genetic make-up

Date of publishing: 17 May 2019


"A team of Swedish and British scientists have studied the heritability of dog ownership using information from 35,035 twin pairs from the Swedish Twin Registry. The new study suggests that genetic variation explains more than half of the variation in dog ownership, implying that the choice of getting a dog is heavily influenced by an individual's genetic make-up.

Dogs were the first domesticated animal and have had a close relationship with humans for at least 15,000 years. Today, dogs are common pets and are considered to increase the well-being and health of their owners. The team compared the genetic make-up of twins (using the Swedish Twin Registry—the largest of its kind in the world) with dog ownership. The results are published for the first time in Scientific Reports. The goal was to determine whether dog ownership has a heritable component.

"We were surprised to see that a person's genetic make-up appears to be a significant influence in whether they own a dog. As such, these findings have major implications in several different fields related to understanding dog-human interaction throughout history and in modern times. Although dogs and other pets are common household members around the globe, little is known how they impact our daily life and health. Perhaps some people have a higher innate propensity to care for a pet than others," says Tove Fall, lead author of the study, and Professor in Molecular Epidemiology at the Department of Medical Sciences and the Science for Life Laboratory, Uppsala University.

Carri Westgarth, lecturer in human-animal interaction at the University of Liverpool and co-author of the study, adds, "These findings are important as they suggest that supposed health benefits of owning a dog reported in some studies may be partly explained by different genetics of the people studied."

Studying twins is a well-known method for disentangling the influences of environment and genes on biology and behaviour. Because identical twins share their entire genome, and non-identical twins on average share only half of the genetic variation, comparisons of the within-pair concordance of dog ownership between groups can reveal whether genetics play a role in owning a dog. The researchers found concordance rates of dog ownership to be much larger in identical twins than in non-identical ones—supporting the view that genetics does play a major role in the choice of owning a dog.
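
[Note LO: as a rough illustration of the twin-comparison logic described above - not the authors' actual liability-threshold model - the sketch below applies Falconer's classic approximation: heritability is roughly twice the difference between the identical-twin and non-identical-twin correlations. The correlations used are invented numbers, not figures from the Swedish Twin Registry study.]

```python
# Illustrative sketch only: Falconer's approximation of heritability
# from twin correlations. The correlations below are invented,
# not figures from the Swedish Twin Registry study.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Heritability estimated as twice the difference between
    monozygotic (identical) and dizygotic (non-identical) twin correlations."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical within-pair correlations for dog ownership
r_mz = 0.58  # identical twins share ~100% of their segregating genes
r_dz = 0.35  # non-identical twins share ~50% on average

print(f"Estimated heritability: {falconer_heritability(r_mz, r_dz):.2f}")  # 0.46, i.e. roughly half
```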

"These kinds of twin studies cannot tell us exactly which genes are involved, but at least demonstrate for the first time that genetics and environment play about equal roles in determining dog ownership. The next obvious step is to try to identify which genetic variants affect this choice and how they relate to personality traits and other factors such as allergy" says Patrik Magnusson, senior author of the study and Associate Professor in Epidemiology at the Department of Medical Epidemiology and Biostatistics at Karolinska Insitutet, Sweden and Head of the Swedish Twin Registry.

"The study has major implications for understanding the deep and enigmatic history of dog domestication" says zooarchaeologist and co-author Keith Dobney, chair of human palaeoecology in the Department of Archaeology, Classics and Egyptology at the University of Liverpool. "Decades of archaeological research have helped us construct a better picture of where and when dogs entered into the human world, but modern and ancient genetic data are now allowing us to directly explore why and how." "

Sources:

Complex life may only exist because of millions of years of groundwork by ancient fungi (Conversation)

The Conversation title: Complex life may only exist because of millions of years of groundwork by ancient fungi

Date of publishing: 22 May 2019


"Because of their delicate organic and decomposing nature, fossilised fungi are extremely rare. So rare, in fact, that a new discovery has just pushed back the earliest evidence of fungi by at least 500m years – doubling their age.

Until now, the oldest confirmed fungal fossils dated to around 450m years ago – about the same time that plants migrated from sea to land. One of the most famous fossilised fungi from this period is the Prototaxites, which could grow up to eight metres tall – leading to its misidentification for many years as a tree.

But previous examination of the fungal “molecular clock”, using DNA-based methods, suggested that fungi may have evolved much earlier, between 760m and 1.06 billion years ago. Extracted from Arctic Canadian shales, the newly discovered billion-year-old fossilised fungal spores and hyphae (long thin tubes) plug the gap in the fossil record and suggest that fungi may have occupied land well before plants.

The fungal fossils were found in rocks that were probably once part of a shallow-water estuary. Such environments are typically great for fungi thanks to nutrient-rich waters and the build-up of washed-up organic matter to feed on. The high salinity, high mineral and low oxygen content of these ancient coastal habitats also provided great conditions to perfectly preserve the tough chitin molecules embedded within fungal cell walls that otherwise would have decomposed.

While it’s not certain whether the newly-discovered ancient fungi actually lived within the estuary or were washed into the sediments from the land, they show many of the distinctive features you’d expect in modern terrestrial fungi. The germinating spores are clearly defined, as are the branching, thread-like tubes that help fungi explore their environment, named hyphae. Even the cell walls are distinctively fungal, being made up of two clear layers. In fact, if you didn’t know they were so old, you’d be hard-pressed to distinguish them from modern fungi.

Fungal forefathers

As you might imagine from their ancient origins, fungi have played a critical role in shaping Earth’s terrestrial biosphere over the last billion years. The first plants to emerge onto land 500m years ago formed intimate partnerships with fungi. Lacking roots, these early plants relied on their fungal partners to grow inside them and spread outwards into the primordial mineral soil. In a process known as biological weathering, fungal hyphae would secrete organic acids to dissolve rocks and extract nutrients held within. In return, the plants would transfer nutrients produced through photosynthesis to the fungi.

This exchange of resources between early plants and fungi powered the growth, evolution and diversification of Earth’s flora into ever more complex species, communities and ecosystems, and remains the norm today. Over 90% of land plants associate with a fungal partner of one type or another, and some are entirely dependent on fungal assistance to survive.

The symbiotic rise of land plants and their fungal partners also had dramatic effects on our atmosphere. Now with abundant access to mineral-based energy building blocks, plants evolved more efficient mechanisms for photosynthesis to capture this energy, for example through better control of the movement of carbon dioxide and water into and out of leaves. Over millions of years, this increased absorption of carbon dioxide produced a massive rise in oxygen concentrations, supporting the emergence of much larger, more complex animal life than the tiny insect-like life forms that previous oxygen levels could support.

From there, the evolutionary story is clear. But in showing that fungi probably arrived on land 500m years before plants, the new fossil evidence raises fundamental questions about the start of this symbiotic journey.

It was previously thought that plants made the transition to terrestrial life simultaneously with aquatic fungal partners, but the new discovery opens up the possibility that Earth’s lands were already being prepared for successful plant life for hundreds of millions of years. We know that, by dissolving mineral-rich rocks and secreting carbon-based organic acids, fungi were extremely important in converting barren lands into the fertile, carbon-rich soils we know today. It could be that the emergence of plant life was only made possible by aeons of groundwork by ancient fungal forefathers.

The outstanding challenge for scientists now is to resolve with certainty whether these ancient fungi were terrestrial in origin, and pinpoint their placement on the evolutionary tree of life. With the focus now on finding further fossil fungi, our understanding of the evolution of the early biosphere will make leaps and bounds.

What is already clear is that without fungi, we would not exist. Playing a vital role in the maintenance of healthy ecosystems across the planet, from the Antarctic deserts to the tropical rainforests, fungi underpin all life on Earth today. Now, it appears we may have another 500m years to thank them for."

Source: http://theconversation.com/complex-life-may-only-exist-because-of-millions-of-years-of-groundwork-by-ancient-fungi-117526

Friday, 24 May 2019

Burn-out & depression, a crisis of faith - not genetics

Before 2013, I was skeptical about people having a burn-out or depression. I could not imagine why someone got it, what it did to you, and how long it would take to heal. Now I know and it leaves me with a complex answer. A burn-out and subsequent depression is probably the most devastating experience that I know of - and yet, it changed my life for the better.

In my various blogs on this topic, I have labelled a burn-out and the subsequent depression as a crisis of faith. Faith does not necessarily have a religious connotation: you can also have faith in people (eg, friends) or the things you do (eg, work). Usually, we use a different term than faith: I believe in my friends, I believe in what I do.

I noticed the term "crisis of faith" in a 2006 NY Magazine article (see 14th paragraph). The article's "hidden" subtitle is also interesting: 
"Where Work Is a Religion, Work Burnout Is Its Crisis of Faith".

A burn-out results in an abrupt loss of willpower. You can still eat and sleep but you can't "do" things that directly relate to your burn-out (eg, driving, making notes). Your body just refuses to listen to the instructions from your mind. It resembles the so-called freezing behaviour.

This loss of willpower is connected to my concept of Faith, Beliefs & Willpower (my blogs). In this concept, Willpower is our operational level (ie, deeds, words), Beliefs is our mission level, and Faith is our energy level. When your energy is fully depleted, everything comes to a sudden stop. The latter moment is the burn-out, which is fully in line with my analogy.

The article below is not a surprise to me. It may, however, explain the various misguided articles claiming that burn-out is merely a Western disease. The title of a recent Guardian long-read article was very clear: "Busting the myth that depression doesn't affect people in poor countries" (my blog).

Perhaps, these scientists made a causality error. Possibly, changed or "damaged" genetics are a consequence rather than a cause of a burn-out and the subsequent depression. 

--------

The Atlantic title: A Waste of 1,000 Research Papers

The Atlantic subtitle: Decades of early research on the genetics of depression were built on nonexistent foundations. How did that happen?

Date of publishing: 17 May 2019


"In 1996, a group of European researchers found that a certain gene, called SLC6A4, might influence a person’s risk of depression.

It was a blockbuster discovery at the time. The team found that a less active version of the gene was more common among 454 people who had mood disorders than in 570 who did not. In theory, anyone who had this particular gene variant could be at higher risk for depression, and that finding, they said, might help in diagnosing such disorders, assessing suicidal behavior, or even predicting a person’s response to antidepressants.

Back then, tools for sequencing DNA weren’t as cheap or powerful as they are today. When researchers wanted to work out which genes might affect a disease or trait, they made educated guesses, and picked likely “candidate genes.” For depression, SLC6A4 seemed like a great candidate: It’s responsible for getting a chemical called serotonin into brain cells, and serotonin had already been linked to mood and depression. Over two decades, this one gene inspired at least 450 research papers.

But a new study—the biggest and most comprehensive of its kind yet—shows that this seemingly sturdy mountain of research is actually a house of cards, built on nonexistent foundations.

Richard Border from the University of Colorado at Boulder and his colleagues picked the 18 candidate genes that have been most commonly linked to depression—SLC6A4 chief among them. Using data from large groups of volunteers, ranging from 62,000 to 443,000 people, the team checked if any versions of these genes were more common among people with depression. “We didn’t find a smidge of evidence,” says Matthew Keller, who led the project.

Between them, these 18 genes have been the subject of more than 1,000 research papers, on depression alone. And for what? If the new study is right, these genes have nothing to do with depression. “This should be a real cautionary tale,” Keller adds. “How on Earth could we have spent 20 years and hundreds of millions of dollars studying pure noise?”

“What bothers me isn’t just that people said [the gene] mattered and it didn’t,” wrote the psychiatrist Scott Alexander in a widely shared blog post. “It’s that we built whole imaginary edifices on top of this idea of [it] mattering.” Researchers studied how SLC6A4 affects emotion centers in the brain, how its influence varies in different countries and demographics, and how it interacts with other genes. It’s as if they’d been “describing the life cycle of unicorns, what unicorns eat, all the different subspecies of unicorn, which cuts of unicorn meat are tastiest, and a blow-by-blow account of a wrestling match between unicorns and Bigfoot,” Alexander wrote.

Border and Keller’s study may be “bigger and better” than its predecessors, but “the results are not a surprise,” says Cathryn Lewis, a geneticist at King’s College London. Warnings about the SLC6A4/depression link have been sounded for years. When geneticists finally gained the power to cost-efficiently analyze entire genomes, they realized that most disorders and diseases are influenced by thousands of genes, each of which has a tiny effect. To reliably detect these minuscule effects, you need to compare hundreds of thousands of volunteers. By contrast, the candidate-gene studies of the 2000s looked at an average of 345 people! They couldn’t possibly have found effects as large as they did, using samples as small as they had. Those results must have been flukes—mirages produced by a lack of statistical power. That’s true for candidate-gene studies in many diseases, but Lewis says that other researchers “have moved on faster than we have in depression.”
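
[Note LO: a back-of-the-envelope power calculation helps to see why samples of a few hundred people could never detect realistic per-variant effects. The sketch below uses the standard Fisher z approximation; the effect sizes and significance thresholds are my own illustrative assumptions, not figures from the study.]

```python
# Back-of-the-envelope power calculation (Fisher z approximation).
# Effect sizes and thresholds are illustrative assumptions only.
import math
from scipy.stats import norm

def n_required(r: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Sample size needed to detect a correlation r with a two-sided test."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return math.ceil(((z_a + z_b) / math.atanh(r)) ** 2 + 3)

def detectable_r(n: int, alpha: float = 0.05, power: float = 0.80) -> float:
    """Smallest correlation reliably detectable with a sample of size n."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return math.tanh((z_a + z_b) / math.sqrt(n - 3))

print(round(detectable_r(345), 2))   # ~0.15: a 345-person study only picks up large effects
print(n_required(0.01, alpha=5e-8))  # ~400,000 people for a tiny effect at genome-wide significance
```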

Marcus Munafo from the University of Bristol remembers being impressed by the early SLC6A4 research. “It all seemed to fit together,” he says, “but when I started doing my own studies in this area, I began to realize how fragile the evidence was.” Sometimes the gene was linked to depression; sometimes it wasn’t. And crucially, the better the methods, the less likely he was to see such a link. When he and others finally did a large study in 2005—with 100,000 people rather than the 1,000 from the original 1996 paper—they got nothing.

“You would have thought that would have dampened enthusiasm for that particular candidate gene, but not at all,” he says. “Any evidence that the results might not be reliable was simply not what many people wanted to hear.” In fact, the pace at which SLC6A4/depression papers were published accelerated after 2005, and the total number of such papers quadrupled over the next decade. “We’re told that science self-corrects, but what the candidate gene literature demonstrates is that it often self-corrects very slowly, and very wastefully, even when the writing has been on the wall for a very long time,” Munafo adds.

Many fields of science, from psychology to cancer biology, have been dealing with similar problems: Entire lines of research may be based on faulty results. The reasons for this so-called “reproducibility crisis” are manifold. Sometimes, researchers futz with their data until they get something interesting, or retrofit their questions to match their answers. Other times, they selectively publish positive results while sweeping negative ones under the rug, creating a false impression of building evidence.

Beyond a few cases of outright misconduct, these practices are rarely done to deceive. They’re an almost inevitable product of an academic world that rewards scientists, above all else, for publishing papers in high-profile journals—journals that prefer flashy studies that make new discoveries over duller ones that check existing work. People are rewarded for being productive rather than being right, for building ever upward instead of checking the foundations. These incentives allow weak studies to be published. And once enough have amassed, they create a collective perception of strength that can be hard to pierce.

Terrie Moffitt from Duke University, who did early influential work on SLC6A4, notes that the candidate-gene approach has already been superseded by other methods. “The relative volume of candidate-gene studies is going way down, and is highly likely to be trivial indeed,” she says. Border and Keller disagree. Yes, they say, their geneticist colleagues have largely abandoned the approach, which is often seen as something of a historical embarrassment. “But we have colleagues in other sciences who had no idea that there was even any question about these genes, and are doing this research to this day,” Border says. “There’s not good communication between sub-fields.” (A few studies on SLC6A4 and depression have even emerged since their study was published in March.)

The goalposts can also change. In one particularly influential study from 2003, Avshalom Caspi, Moffitt, and others claimed that people with certain versions of SLC6A4 were more likely to become depressed after experiencing stressful life events. Their paper, which has been cited over 8,000 times, suggested that these genes have subtler influences, which only manifest in certain environments. And if bigger studies found that the genes had no influence, it’s probably because they weren’t accounting for the experiences of their volunteers.

Border and Keller have heard that argument before. So, in their study, they measured depression in many ways—diagnosis, severity, symptom count, episode count—and they accounted for environmental factors like childhood trauma, adulthood trauma, and socioeconomic adversity. It didn’t matter. No candidate gene influenced depression risk in any environment.

But Suzanne Vrshek-Schallhorn from the University of North Carolina at Greensboro says that Border’s team didn’t assess life experiences with enough precision. “I cannot emphasize enough how insufficient the measures of the environment used in this investigation were,” she says. “Even for measures that fall below gold-standard stress assessment approaches, they represent a new low.” By using overly simple yes-or-no questionnaires rather than more thorough interviews, the team may have completely obscured any relationships between genes and environments, Vrshek-Schallhorn claims. “We should not get starry-eyed about large sample sizes, when measure validity is compromised to achieve them. We need to emphasize both quality and quantity.”

But Border argues that even if there had been “catastrophic measurement error,” his results would stand. In simulations, even when he replaced half the depression diagnoses and half the records of personal trauma with coin flips, the study would have been large enough to detect the kinds of effects seen in the early candidate-gene papers.
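
[Note LO: the argument can be illustrated with a small simulation (my own sketch, not Border et al.'s actual analysis): even when half of the binary outcome labels are replaced by coin flips, an effect of the size claimed in early candidate-gene papers remains many standard errors away from zero in a biobank-scale sample. Sample size and effect size below are assumptions.]

```python
# Rough simulation of the measurement-error argument; an illustration,
# not Border et al.'s actual analysis. Sample and effect sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 400_000    # biobank-scale sample (assumption)
true_r = 0.10  # effect size of the order claimed in early candidate-gene papers

genotype = rng.normal(size=n)
liability = true_r * genotype + np.sqrt(1 - true_r**2) * rng.normal(size=n)
depressed = (liability > 1.0).astype(float)  # threshold gives ~16% "diagnosed"

# Replace half of the diagnoses with coin flips (catastrophic measurement error)
noisy = depressed.copy()
flip = rng.random(n) < 0.5
noisy[flip] = rng.integers(0, 2, size=flip.sum())

print(np.corrcoef(genotype, depressed)[0, 1])  # attenuated by the binary outcome, clearly non-zero
print(np.corrcoef(genotype, noisy)[0, 1])      # smaller still, yet far outside chance at this n
```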

Similar debates have played out in other fields. When one group of psychologists started trying to reproduce classic results in much larger studies, their peers argued that any failures might simply be due to differences between the new groups of volunteers and the originals. This excuse has eroded with time, but to Border, it feels familiar. “There’s an unwillingness to part with a previous hypothesis,” he says. “It’s hard to wrap your head around the fact that maybe you were on a wild goose chase for years.”

Keller worries that these problems will be used as ammunition to distrust science as a whole. “People ask, ‘Well, if scientists are publishing crap, why should we believe global warming and evolution,’” he says. “But there’s a real difference: Some people were skeptical about candidate genes even back in the 1990s. There was never unanimity or consensus in the way there is for human-made global warming and the theory of evolution.”

Nor, he says, should his work be taken to mean that genes don’t affect depression. They do, and with newer, bigger studies, researchers are finally working out which ones do. If anything, the sordid history of the candidate-gene approach propelled the development of better methods. “I feel like the field of psychiatric genetics felt really burned coming out of the candidate-gene era, and took strides to make sure it won’t happen again.” That includes sharing data openly, and setting standards for how large and powerful studies need to be.

Dorothy Bishop from the University of Oxford argues that institutions and funders that supported candidate-gene work in depression should also be asking themselves some hard questions. “They need to recognize that even those who think they are elite are not immune to poor reproducibility, which leads to a huge amount of waste,” she says.

“We have got to set up a system, or develop a culture, that rewards people for actually trying to do it right,” adds Keller. “Those who don’t learn from the past are doomed to repeat it.” "

Source: https://www.theatlantic.com/science/archive/2019/05/waste-1000-studies/589684/

Thursday, 23 May 2019

Talking vs listening

"He is a better talker than listener". I noticed this statement in a recent Dutch newspaper article and it kept bugging me. It took me a day to realize that this statement is exemplary of today's world: everybody seems better in talking than in listening. This makes me wonder why.

Throughout history, talkers (or writers) used to be teachers and listeners (or readers) used to be students. This process was about a transfer of knowledge. The Greek philosopher and mathematician Pythagoras (c.570–c.495 BC) once stated: “Be silent or let thy words be worth more than silence”. The proverb "silence is golden" might even date back to ancient Egypt.

It's tempting to argue that social media are accountable and responsible for this cultural change (ie, from listening to talking). Their platforms have (very) limited restrictions on the opinions of their users. However, people easily forget that Facebook has only existed since 2004 (Wiki). Nevertheless, its user volume may be more relevant than its time of existence.

In the 1988 movie The Dead Pool, Clint Eastwood a.k.a. Dirty Harry states: "Opinions are like assholes; everybody has one" (video). Long before him, Roman Emperor Marcus Aurelius (121 AD - 180 AD) a.k.a. the Philosopher wrote: "Everything we hear is an opinion, not a fact. Everything we see is a perspective, not the truth."

It feels safe to argue that listening has never been commonplace. The few people who didn't talk (much) were easily mistaken for wise. The latter is also the premise of the 1979 movie Being There. Nevertheless, listening has always been regarded as a virtue and (too much) talking as a vice.

Moreover, there's nothing new about "fake news" - apart from its (new) name. Previous names included urban legend, urban myth, and "broodje aap verhaal" (Dutch for monkey sandwich story). The urban part is interesting as this suggests that fake news thrives on volume. Spreading nonsense in rural areas will backfire on the source.

Possibly, extreme beliefs - and their opinions - found a way into the public domain following the surge of social media (eg, Facebook). The overrepresentation of such extreme beliefs on social media might be the root cause for our feeling that something has fundamentally changed in society. In the past, newspapers moderated the letters to the editor.

Social media are slowly accepting some accountability and responsibility for user contributions (eg, hate speech, live streaming of mass murders, foreign interference in domestic elections). As long as these companies are legally allowed to consider themselves IT platforms rather than media companies, nothing much will change.

Such a Shame (1984) by Talk Talk ft. Mark Hollis (1955-2019)


Note: all markings (bold, italic, underlining) by LO unless stated otherwise

Wednesday, 22 May 2019

Why do revolutions turn into autocracies?

For some time, I have been considering today's blog title. Yesterday's blog already put this question in a broader perspective. Still, it seems that any revolution leads to an autocracy (eg, Cuba, Nicaragua, Russia, Venezuela). At the other side of the political spectrum, coups d'état usually also turn into autocracies (eg, African continent).

Initially, I assumed that revolutions become "sacred" and need to be defended - at all cost. There is, however, a clear downside to celebrating revolutions because it reminds the people of their (political) power. I suppose this is a reason why the Russian President did not attend the centenary of the Russian Revolution of 1917 (eg, FT-2017).

Unlike political parties in parliamentary democracies, revolutionaries are individuals without a (shared) political program. Only the present is relevant. After a revolution, major differences of opinion will emerge between the many fathers of a victorious revolution.

The default playbook of any revolution then appears to include (i) a purge amongst the fathers of the revolution (eg, Che Guevara, Vladimir Lenin, Leon Trotsky), and then (ii) a Great Purge amongst the people (eg, Cambodia, China, Cuba, Iran, Russia, Turkey 2016-onwards).

Such a purge results in absolute power, albeit for a moment in history. A famous 1887 quote from John Dalberg-Acton a.k.a. Lord Acton on absolute power states:
"Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men, even when they exercise influence and not authority; still more when you superadd the tendency of the certainty of corruption by authority.”

It's no longer the revolution that needs to be defended but the absolute power resulting from it. Hence, absolute power is what turns revolutions - or coups d'état - into autocracies. Today's examples include: China, Cuba, Iran, Nicaragua, Russia, Turkey, Venezuela. Trump's USA is clearly longing for this absolute power (eg, Atlantic, NBC, NYT, WaPo).

The last step in the revolution's playbook is establishing dynasties in order to consolidate absolute power (eg, North Korea). This is where an autocracy turns into a pathocracy, "in which individuals with personality disorders (especially psychopathy) occupy positions of power and influence" (eg, pathocracy). This may explain Trump's uncanny friendship with Kim Jong-un.

Absolute (1984) by Scritti Politti 

Absolute, on power drive 
I need you so to keep me alive 
Absolute, I long for you


Note: all markings (bold, italic, underlining) by LO unless stated otherwise

Tuesday, 21 May 2019

Wealth inequality, revolution, nationalism and autocracy

Wealth inequality is a relatively new topic in my blogs. Nationalism has featured often in my topics. I'm concerned by the link between the two. Historically, these two were connected by revolutions (eg, Cuba 1956-1959, Russia 1917). Nowadays, they are connected in the voting booth, as with Trump 2016. The 2016 Russian interference is the "revolution" part.

UK wealth inequality (eg, Guardian, Observer), partly due to continued austerity, is a main underlying reason for the Brexit "revolution". Interestingly, the UK wealthy and the least privileged are both in favour of Brexit, although for (very) different reasons. Post-Brexit wealth inequality is likely to surge even further and should eventually cause massive social unrest.

Wealth inequality is often blamed on Globalism. This criticism is quite unfair because wealth inequality is even (much) higher in countries that favour (domestic) Nationalism, like China, Russia, Turkey, UK, and Trump's USA.

There's little doubt that Globalism increases wealth. The general - and valid - criticism of Globalism is about the distribution of that wealth, which is often skewed towards the haves and ignores the have-nots. Mainstream politicians often fail to appreciate the dangers of globalism (eg, loss of cultural identity, prioritizing migrants and refugees).

Similarly, there's little doubt about the merits of a shared cultural identity in Nationalism (eg, language, religion, values). Unfortunately, Nationalism is often abused and misused by politicians looking for (absolute) Power. Such politicians stir up cultural fears against foreigners, religious and other minorities in order to gain Power - and then change the rules.

The biggest threat of political Nationalism is its next step: establishing an autocracy by dismantling parliamentary democracy. The first victim is the independence of the legal system by appointing government approved judges. The next victims are media and press. Examples: China, Russia, Turkey, Trump's USA.

Hence, in my view, Globalism is ultimately less dangerous than Nationalism. The best situation would still be a hybrid of both, in order to find a decent balance between identity and power. Probably, this is why I like my country so much.

This week, the European Parliament election will start. Actually, I'm glad that European power is still in the hands of European nations rather than European voters.

"Democracy is a device that ensures we shall be governed no better than we deserve". A quote by George Bernard Shaw (1856-1950), Irish playwright.

Elected (1972) by Alice Cooper


Note: all markings (bold, italic, underlining) by LO unless stated otherwise

Monday, 20 May 2019

Superstition

I didn't watch the (semi) finals of the Eurovision Song Contest last week. I did listen to the radio but turned it off once the voting results became due. Neither do I watch the (semi) finals of my favourite sports teams. Why? I'm "afraid" that my favourites may lose because of my viewing. I know that superstition makes no sense at all but it's stronger than I am.

The latter remark suggests (to me) that superstition is outside our conscious domain. That leaves three other possibilities: subconscious, superconscious and unconscious. Please see my 2017 blog on the 4 levels of Consciousness - an integrated framework.

Given the parameters in my 7 August 2017 blog, superstition is in the "known knowns" domain. That leaves consciousness (ie, intelligence), unconsciousness (ie, complexes), and subconsciousness (ie, intuition). At least, we can disregard superconsciousness which is only in the "unknown unknowns" domain.

Consciousness is a very unlikely option because superstition is not rational, it's irrational. Initially, I assumed that superstition might be in our subconscious (ie, intuition). However, superstition is in the "now" and not in the future. That leaves unconsciousness as the most likely reason for superstition.

Although the classification of unconsciousness (ie, complexes) feels counterintuitive, complexes are fear related and so is superstition. Superstition might thus be defined as an irrational fear. Many fears are mostly rational (eg, heights, snakes, water), some are less rational (eg, elevators, town squares), and some are not rational (eg, superstition).

Apparently, "there is no single definition of superstition" (the Conversation). I do not agree with a "broadly defined" definition by Thought Co: "superstition is a belief in the supernatural" because a belief in the supernatural is not an irrational fear.

Recently, I read an eye-popping 2018 Aeon article: Evolution unleashed. Aeon-2018: "When researchers at Emory University in Atlanta trained mice to fear the smell of almonds (by pairing it with electric shocks), they found, to their consternation, that both the children and grandchildren of these mice were spontaneously afraid of the same smell."

I think, feel and believe that this scientific study explains why some people have certain irrational fears, which Carl Jung and Sigmund Freud called complexes, and which we commonly call superstition.

“If a black cat crosses your path, it signifies that the animal is going somewhere.” A quote by Groucho Marx (1890-1977), American comedian.

Superstition (1972) by Stevie Wonder

When you believe in things 
That you don't understand 
Then you suffer 
Superstition ain't the way


Note: all markings (bold, italic, underlining) by LO unless stated otherwise

Sunday, 19 May 2019

The Observer view on Britain’s scandalous wealth inequality (Observer)

Title of Observer editorial: The Observer view on Britain’s scandalous wealth inequality

Observer subtitle: Official figures mask the growing income disparities dividing Britain

Date of publishing: 19 May 2019


"Britain needs to wean itself off measures of inequality that disguise more than they reveal about the gap between rich and poor.

So says the Institute for Fiscal Studies, which last week used the occasion of its 50th anniversary to launch a five-year quest for better measures of inequality.

Too often, politicians ignore the growing disparity in wealth and the growing divide between the generations when they debate the subject, said the thinktank. They also ignore the widening gulf in transport provision, life expectancy and cultural amenities between the regions that could be covered by more comprehensive measures of inequality.

A narrow focus on traditional measures of income inequality has proved to be a deeply political act. Since 2010, Tory-led governments have brushed aside concerns that cuts to welfare and public services have increased inequality by using the single figure generated by the Gini coefficient, the widely recognised measure of the gap between the lowest and highest household incomes.

It is a method of defence that put an impregnable wall around George Osborne and has done much the same for his successor, Philip Hammond.

No matter how many times chancellors have been taunted with the desperate side-effects of austerity on one group or another, the Treasury has retorted that the Gini coefficient remains largely the same.

Even the IFS bowed to this argument; over the past decade, its reports often concluded that the overall effect of government policy, whatever the evidence to the contrary, has left inequality unchanged. The Gini coefficient ruled.

One reason the average has remained almost the same – and it has when measuring disposable incomes for more than 30 years – relates to the fracturing of society into large and not-so-large discrete groups whose fortunes have ebbed and flowed, often in the opposite direction.

For every working household on low incomes that saw their situation worsen over the past 10 years, the circumstances of a retired household improved. Whenever someone in the north-east secured a job, a worker in the south-east found an even better paid one, most likely in the financial services industry.

There are already guides to Britain’s wealth divide, including a Gini coefficient that shows the nation’s record £11.1tn of wealth is distributed far less equally than earnings or household income. The figure for wealth is almost twice as high as that for income. The Resolution Foundation, another thinktank that has examined this subject, says one factor making it harder to determine wealth inequality is the lack of information about the top 1%. Official figures show that one in 10 adults owns around half of the nation’s wealth, while the top 1% own 14%. But the figure could be nearer 20%.

This lack of information is a scandal. It is crucial to make sure the property, pensions and investments of the better-off are measured accurately when we know that the accumulation of wealth over the course of someone’s working life is a key driver of their opportunities and those of their children.

The IFS says its first reports will appear next year. It can’t be soon enough."

Source: https://amp.theguardian.com/commentisfree/2019/may/19/observer-view-on-britain-scandalous-wealth-inequality

How Much Money Do You Need to Be Wealthy in America? (Bloomberg)

Bloomberg title: How Much Money Do You Need to Be Wealthy in America?

Bloomberg subtitle: The exact amount can depend on how old you are.

Date of publishing: 13 May 2019


"Rich is relative.

Merely having a net worth of $1 million, it seems, doesn’t mean you’re wealthy. In Charles Schwab’s annual Modern Wealth Survey, the amount people said it took to be considered rich averaged out to $2.3 million. That, the company said, is “more than 20 times the actual median net worth of U.S. households.”

It’s also a very slight drop from the $2.4 million average in the two previous iterations of the survey.

The older one gets, the higher the bar goes, predictably. Among baby boomers (roughly age 55 to 73), the average net worth you need to be considered wealthy is $2.6 million, 35% higher than what millennials envision as the admission price to the plutocracy.

For someone to be deemed merely financially comfortable, the required net worth shrinks significantly. The average amount was $1.1 million, and only Generation Z (about age 9 to age 22, though Schwab’s sample was 18 to 22) cited a number below $1 million ($909,600, to be exact).

[Note LO: see article for graph of minimum wealth per age group]

The Schwab survey, which took a national sample of 1,000 Americans between the ages of 21 and 75, also revealed that the majority of Americans really crave real estate. More than 50% of respondents across generations said that if they got a $1 million windfall, they’d spend it, and the most popular purchase would be a place to live—particularly among millennials (roughly age 22 to 37).

Those millennials also took issue with the premise of the survey. More than three-quarters of them said their personal definition of wealth was really about the way they live their lives, rather than a discrete dollar amount.

Nevertheless, 60% of them aren’t all that worried, since they plan to be wealthy within one to 10 years. The survey results suggest an interesting strategy to help them get there—ignore their friends’ social media posts.

How’s that? Well, it seems virtual covetousness has taken on a life of its own for the digital generation. According to the survey, overspending because of what they see on social media (in tandem with the ease with which it takes your cash) was the largest “bad” influence on how they managed their money.

[Note LO: see article for graph on the timeline for becoming wealthy]

And the negative influence of social media on spending is only going to grow. In March, Instagram announced that it’s testing a shopping feature called Checkout that lets users buy things directly within the app, rather than being directed to a retailer’s website. So much for one-stop shopping. Now you won’t even have to stop.

With 59% of the Americans surveyed saying they live paycheck to paycheck, instant gratification comes with a high price. While a strong economy and low unemployment are helping consumers stay current on their debt payments, the largest U.S. banks are seeing losses on credit cards outpace those of auto and home loans at a rate not seen in at least 10 years.

And when the bottom does finally fall out, the last thing most Americans will be thinking of is whether they qualify as wealthy."

Britain risks heading to US levels of inequality, warns top economist (Guardian)

The Guardian title: Britain risks heading to US levels of inequality, warns top economist

The Guardian subtitle: Sir Angus Deaton says UK is at risk of extreme inequality in pay, wealth and health

Date of publishing: 14 May 2019


"Rising inequality in Britain risks putting the country on the same path as the US to become one of the most unequal nations on earth, according to a Nobel-prize winning economist.

Sir Angus Deaton is leading a landmark review of inequality in the UK amid fears that the country is at a tipping point due to a decade of stagnant pay growth for British workers. The Institute for Fiscal Studies thinktank, which is working with Deaton on the study, said the British-born economist would “point to the risk of the UK following the US” which has extreme inequality levels in pay, wealth and health.

Speaking to the Guardian at the launch of the study, he said: “There’s a real question about whether democratic capitalism is working, when it’s only working for part of the population.

“There are things where Britain is still doing a lot better [than the US]. What we have to do is to make sure the UK is inoculated from some of the horrors that have happened in the US.”

[Note LO: see article for graph on Gini coefficients in Europe, UK and USA]

His warning comes as analysis from the Trades Union Congress (TUC) showed that real wages in the finance sector had outstripped average salaries in the UK over the decade since the financial crisis. Earnings after inflation in the finance sector have grown by as much as £120 a week on average, compared with the average British worker still being about £17 a week worse off after taking account of rising living costs over the past decade.

Frances O’Grady, the TUC general secretary, said: “It’s not right that pay is racing ahead in the City when most working people are still worse off than a decade ago.”

Deaton said geographical inequality appeared to be a factor in the UK, with London benefiting disproportionately compared with other parts of the country.

“People really feel that not everybody is having a fair crack anymore,” the US-based economist said. “There’s a sense that if you live in one part of Britain away from the capital, lots of bad things are happening, while lots of good things are happening in the capital – and you don’t see why you should be left behind that way.”

Deaton, a professor at Princeton, won the Nobel prize in his field for work charting global developments in health, wellbeing and inequality in 2015.

The US is ranked on some measures among the most unequal of major nations. Pay for non-college-educated men has not risen for five decades, while mortality for less-educated white men and women in middle age has caused average life expectancy to fall for the past three years, something that has not happened for a century.

Launched amid growing international concern over inequality and the rise of more extreme political ideologies in several countries, the IFS Deaton Review will span five years and look at inequalities in areas such as income, wealth, health, social mobility and political participation.

[Note LO: see article for 2nd graph on richest 1% share in overall income in UK]

In a research paper accompanying the launch by IFS researchers Robert Joyce and Xiaowei Xu, figures show that “deaths of despair” in Britain have more than doubled among men since the early 1990s. This concept was coined by Deaton in an earlier study and refers to deaths from suicide and drug- and alcohol-related issues.

In a reflection of the pressure on certain groups in society as inequality grows across Britain, the number of such deaths per 100,000 adults has risen from about 30 to 61 for men and from 15 to 26 for women over the period. Deaton believes these figures are an early warning sign of the UK developing characteristics of inequality similar to the US.

Deaton warned that rising inequality was not a uniform phenomenon in the UK, judging by mortality statistics. “One part we do know is that it seems to be geographically unequal,” Deaton said, referring to deaths from suicide, drugs and alcohol. “Blackpool seems to be a hotspot and the north east, but not very much in London. So it may be that it’s geographical inequalities in health that are much more important here than in the US.”

On some measures, inequality in Britain has remained relatively steady over recent years, despite having rapidly accelerated in the 1980s.

Some economists point to the Gini coefficient – a sliding scale between 0% and 100% used by academics where a reading of 100% would indicate that one person received everything. The gauge has remained stable since the 1990s, although it rose slightly last year from 31.4% to 32.5%.
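
[Note LO: for readers unfamiliar with the measure, the short sketch below shows how a Gini coefficient can be computed from a list of incomes; the incomes are invented for illustration. 0% means everyone receives the same, 100% means one person receives everything.]

```python
# Minimal sketch: computing a Gini coefficient from a list of incomes.
# The incomes below are invented for illustration only.

def gini(incomes):
    """Gini coefficient: 0.0 = perfect equality, 1.0 = one person receives everything."""
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))  # rank-weighted sum on sorted data
    return (2 * weighted) / (n * total) - (n + 1) / n

equal = [30_000] * 5                                 # everyone earns the same
skewed = [10_000, 15_000, 20_000, 30_000, 500_000]   # most income held by one household

print(f"{gini(equal):.1%}")   # 0.0%
print(f"{gini(skewed):.1%}")  # ~69%, reflecting concentration at the top
```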

[Note LO: see article for 3rd graph on historical development of Britain's Gini coefficient]

However, the headline measurement of inequality masks significant differences for households in modern Britain, which Deaton said were important to consider.

According to the IFS paper, the richest 1% in Britain have seen the share of household income they receive almost triple in the last four decades, rising from 3% in the 1970s to about 8%. Average chief executive pay at FTSE 100 firms has risen to 145 times that of the average worker, from 47 times as recently as 1998.

Earnings in the lowest-earning working households have barely risen since the mid-1990s, compared with greater increases for higher-income groups.

A spokesman for the Treasury said: “Our policies are highly redistributive – this year the lowest income households will receive over £4 in public spending for every £1 they pay in tax, while the highest income households will contribute over £5 in tax for every £1 they receive in public spending. Income inequality is lower now than it was in 2010.”

John McDonnell, the shadow chancellor, said: “With inequality tearing apart the fabric of our society, I’m delighted it’s finally beginning to get the attention it deserves.

“Sir Angus is right to highlight the problems of stagnant wages and regional inequality as well as the importance of trade unions for addressing inequality.” "

Note LO: I have added some URLs for clarification purposes.

Source: https://www.theguardian.com/inequality/2019/may/14/britain-risks-heading-to-us-levels-of-inequality-warns-top-economist

Saturday, 18 May 2019

One in Seven Homes in Japan Is Empty (Bloomberg+Quartz)

Bloomberg title: One in Seven Homes in Japan Is Empty

Date of publishing: 29 April 2019

"A record 8.46 million Japanese homes are sitting vacant as builders keep adding stock in a country where the population is shrinking.

The number jumped by 260,000 in a twice-a-decade survey released by the government on Friday, reaching 13.6 percent of housing, the Nikkei Asian Review reported.

Many of the properties are for future sale or rental or vacation. However, some are abandoned, posing hazards, the news service reported. Vacancy rates were highest in a prefecture that’s home to the northern part of Mount Fuji, which is a popular area for holiday homes. However, more people moving from rural areas to metropolitan ones is also driving the increase, according to the news report.

Stashes of cash are also often discovered when these houses are taken down, the Nikkei Asian Review said. The equivalent of more than $200,000 was found at one Tokyo demolition site in 2018.

Still, the number of empty homes may be dwarfed elsewhere. A 2017 survey indicated a vacancy rate of about one-in-five urban dwellings in China. That translates to around 65 million homes, according to media reports."
--------

Quartz title: Over 13% of the homes in Japan are abandoned

Date of publishing: 28 April 2019

"Japan’s population is shrinking. Last year it fell by nearly 450,000 people. Not since records began in 1899 had so few babies been born (921,000). Before that, 2017 had also set a record. Meanwhile the number of people passing away last year set a post-war record. The figures are part of a larger pattern in which births have declined and deaths increased steadily for decades.

Less noticed is another alarming figure that’s been growing. According to the latest government statistics, the number of abandoned homes in Japan reached a record high of 8.5 million as of Oct. 1, 2018, up by 260,000 from five years earlier. As a proportion of total housing stock, abandoned homes reached 13.6%.

Some areas have been hit harder than others. Saitama, north of Tokyo, and tropical Okinawa had the lowest proportions of vacant homes. But the rate topped 20% in the Yamanashi and Wakayama prefectures.

Japan’s education ministry, meanwhile, has struggled with how to repurpose vacant school buildings. One became a building for curing meats, another an onsen (hot springs spa).

Little wonder Japan, long averse to immigration, is preparing to open its doors wider to foreigners to tackle a worker shortage. But even on that front, the numbers are coming up short: There simply aren’t enough educators to teach the newcomers Japanese."

Sources:

People with greater intellectual humility have superior general knowledge (BPS/BT)

BPS / BT title: People with greater intellectual humility have superior general knowledge

Big Think subtitle: Smart people are not afraid to say "I don't know."

Author: Dr. Christian Jarrett

Date of publishing: 3 April 2019 (BPS) / 15 May 2019 (BT)


"In the era of social media and rolling news there’s a constant pressure to be in the know, always on hand with an aperçus or two. Today intellectual humility therefore feels more important than ever – having the insight and honesty to hold your hands up and say you’re ignorant or inexpert about an issue.

Psychologists are responding by taking an increasing interest in intellectual humility, including investigating its consequences for learning and the thinking styles that support it. For a new paper in The Journal of Positive Psychology a team led by Elizabeth Krumrei-Mancuso have continued this endeavour, showing, among other things, that intellectual humility correlates with superior general knowledge. This is a logical outcome because, as the researchers write, “simply put, learning requires the humility to realise one has something to learn.”

Krumrei-Mancuso and her colleagues conducted five studies in all, attempting to find out more about the links between intellectual humility and knowledge acquisition; between intellectual humility and meta-knowledge (insight into one’s own knowledge); and lastly between intellectual humility and other thinking styles.

A strength and a weakness of the research is the use of two different measures of intellectual humility. Some studies involved a shorter questionnaire assessing being a “know-it-all” (through agreement or not with statements like “I know just about everything there is to know”) and intellectual openness (through agreement or not with statements like “I can learn from other people”); whereas other studies used a more recently developed, more comprehensive 22-item measure incorporating questions about cognitions, emotions and behaviours representative of intellectual humility (such as, being accepting of criticism of one’s important beliefs; being ready to change one’s mind; and respect for others’ viewpoints). This use of different measures makes for a more comprehensive, varied assessment of intellectual humility, but also impedes comparison between the studies.

The findings in relation to knowledge acquisition were mixed. While an online study involving 604 adults (and using the more comprehensive measure of intellectual humility) found the aforementioned link between greater intellectual humility and superior general knowledge, another involving college students (and the briefer intellectual humility questionnaire) found that those higher in intellectual humility achieved poorer grades. Perhaps the latter result arose because the higher-achieving students used their objectively higher grades to judge their intellectual ability as higher, not having had the chance yet in life to confront their intellectual fallibility (but as mentioned, the use of different measures across the studies complicates any interpretation of the mixed results).

In terms of insight, higher scorers in intellectual humility were less likely to claim knowledge they didn’t have (the researchers tested this by assessing participants’ willingness to claim familiarity with entirely fictitious facts that they couldn’t possibly know), and they also tended to underestimate their performance on a cognitive ability test.

Meanwhile, other thinking styles and constructs that correlated with greater intellectual humility included being more inclined to reflective thinking, having more “need for cognition” (enjoying thinking hard and problem solving), greater curiosity, and open-minded thinking. More intellectual humility was also associated with less “social vigilantism”, defined as seeing other people’s beliefs as inferior.

While the new findings “replicate and extend previous studies using different measures of intellectual humility”, it’s fair to say there remains a great deal we don’t yet know about intellectual humility. Perhaps most important is the lack of longitudinal research to establish causality – for instance, we don’t yet know if greater general knowledge and open-mindedness fosters intellectual humility, or if intellectual humility comes first, and promotes knowledge and curiosity. Most likely the causal associations between these constructs are complex and two-way, but at the moment, if we’re honest, we just don’t know."

Sources:

Skin Lightening: Africa's Multibillion Dollar Post-Colonial Hangover (Bright)

Bright title: Skin Lightening: Africa's Multibillion Dollar Post-Colonial Hangover

Bright subtitle: "One thing we cannot deny is that skin lightening has impacted Africans' individual and collective beauty standards; lighter skin is often perceived as a marker of superior beauty and economic status."

Date of publishing: 7 May 2019


"Growing up in Lagos, Nigeria, my mother’s hair salon housed many vivid memories. I recall how my eyes would tear up from the sting of menthol as I greased scalps. I remember my arms cramping from prepping hair extensions, or worse, undoing micro braids. (This was the 1990s. These days, we are more into Peruvian weaves, wigs, and crochet braids.)

I also remember eavesdropping on women swapping recommendations for skin lightening products. Some women gave directions to beauticians who were known for mixing special creams. Others would exchange homemade concoctions, like how combining certain products with moisturizer could mitigate the harshness of the chemicals, or how a certain egg-based shampoo made for effective lightening results. Sometimes code words like skin toning, brightening, or glowing would be used in place of the pejorative “bleaching.”

Thinking back, the question “what are you using?” was a common refrain in my youth.

Personally, I didn’t feel like I needed to be lighter, but I certainly didn’t want to get darker. Like so many Nigerian girls and women, I found myself avoiding the sun as much as I could, a habit that continued into my early adulthood. My older sister is very light skinned, and growing up, it was palpable how both men and women fawned over her. Somewhere in the depths of my subconscious, I too had equated lighter skin tone with beauty.

As I entered my early 20s, I began to interrogate beauty standards and those ideals started to lose their power. But still, despite all the work I’ve done to accept my natural color, when I walk into a salon to get my eyebrows waxed, someone inevitably recommends a product to, as they put it, “heighten my glow.”


Today, the global skin lightening industry is estimated to be in the multibillion dollar range. In Africa, Nigeria is the largest consumer of skin lightening products. While there is no substantial data on the use of skin lightening products around the world, a World Health Organization report claims that 77 percent of Nigerian women use them on a regular basis. Countries like Togo, South Africa, and Senegal are not lagging too far behind.

Skin lightening, however, is not limited to Africa. In 2017, according to Future Market Insights, Asia-Pacific made up more than half of the global market for skin lightening products, with China accounting for about 40 percent of sales, Japan 21 percent, and Korea 18 percent.

In Africa, there is no documented history of when skin lightening took off, but Yaba Blay, who teaches black body politics and gender politics at North Carolina Central University, believes that it began as African countries gained their independence.

In a 2018 interview with the online publication Byrdie, Blay says that white women have historically used their whiteness as a way to communicate purity. This belief was exported to Africa, and around the time of independence, skin lightening began “exploding.”

“These European countries [were] flooding their colonial places with their products and using whiteness as a way to sell the products,” she says. “People were attempting to gain some level of power and privilege that’s associated with whiteness.”

These attitudes have continued to this day. Make-up artist Teni Coco, in an Instagram post, spoke of her experience using lightening creams. “By the time I was 20, I had become a heavy skin bleacher,” she wrote. “At the time it felt almost normal, I felt like I looked more attractive. I craved so much to be lighter. I felt being black wasn’t beautiful enough. I guess the society we live in played a little role in my decision to bleach my skin.”

After some soul searching, Coco gave up bleaching her skin at 25. “How crazy it was for me to have believed that my black skin wasn’t beautiful, to have allowed myself to feel inadequate,” she reflected.

With attitudes as deeply ingrained as this, what would it take to get Nigerian women — and women across the continent — to stop bleaching their skin?

The debate around skin bleaching has recently resurfaced. Cameroonian singer Dencia launched a skincare line called Whitenicious in 2014 to much controversy. In 2018, the American reality television star Blac Chyna launched a skin lightening product under a brand called “Diamond Illuminating & Lightening Cream” here in Lagos, priced at $250 a jar. There was also the infamous Nivea advertising campaign, “White is Purity.” All of these campaigns elicited heavy social media criticism and think pieces that attributed the desire for lighter skin to self-hate for brown skin.

Feminist and publisher Bibi Bakare-Yusuf has done extensive research on skin bleaching among Nigerian women. She believes it is too reductive to blame this “post-colonial phenomena” simply on white supremacy. “In the context of Nigeria, at least when I started working on the research,” she says, “people were very disdainful of white skin.”

How crazy it was for me to have believed that my black skin wasn't beautiful, to have allowed myself to feel inadequate.

If there is a connection to low self-esteem or desire for whiteness, Bakare-Yusuf believes it happens on a subconscious level. “I think that when we constantly ascribe every single action that we [take] to desire for whiteness,” she says, “we are unnecessarily over-valorizing whiteness. People are not always conscious of what they do.” Bakare-Yusuf likens skin lightening to any kind of cosmetic procedure and believes that humans are always searching for “perfection” by tampering with their natural appearance.

From my childhood to date, television images and billboards have always been emblazoned with light-skinned faces and much of the media and popular culture are often complicit in the perpetuation of these ideals. Film and commercials director Tolu Ajayi says that in advertising, light skin is often viewed as an “aspirational look.”

“The aspirational look is people that look ‘upmarket,’” he says. “Looking upmarket, being of a [high] socio-economic class is associated with being light-skinned.” The implication is that if you are dark skinned, you spend much of your time working outdoors under the sun, which is associated with economic struggle. This is one possible reason for the disdainful gasps I received from the hair salon clients when I would come home from boarding school with a tan from spending long hours in the sun cutting grass and doing other chores.

Ajayi explains that fashion photographers have very little power to change the status quo because clients usually insist on using lighter-skinned women to market their products. And the images created by the advertising industry in Nigeria often do not represent the audience they are trying to communicate with. “There hasn’t been any real change. I think light-skinned people are still preferred,” he says. “There is an idea that they photograph better. Sometimes people believe clothes pop better on lighter skin.” This thinking even extends to children’s products, with mostly light-skinned babies dominating diaper adverts.


Television host and actress Folu Ogunkeye has experienced her share of rejection when auditioning for film and television roles as a dark-skinned woman. “What I have found in Nigeria is that leading roles are not readily available for dark-skinned actresses,” she explains. “Initially I had simply assumed that I wasn’t suited for the particular role for which I had auditioned, but then each time, the role was given to a lighter-skinned contemporary. After discussions behind the scenes with industry experts, it has been said outright that certain leading roles are simply not given to darker-skinned actresses because executives do not believe that audiences [want to] see darker women in romantic or leading lady roles.”

One of the seemingly oxymoronic aspects of skin lightening in Nigeria is the sense of shame and denial attached to using these products, particularly among elite women.

The aspirational look is people that look 'upmarket'. Looking upmarket, being of a [high] socio-economic class is associated with being light-skinned.

Bakare-Yusuf says that during her research, she found that many of them did not want to admit to lightening their skin because the act is often associated with those who are uneducated. Working-class women, meanwhile, “are not even trying to hide that they are bleaching.”

Many professional Nigerians also closely follow the discourse around white supremacy, and don’t want to be seen to have capitulated to its power. As a result, some opt for more discreet skin lightening products in the form of pills, chemical peels, and intravenous treatments offered by expensive dermatologists and high-end beauty spas.

One thing we cannot deny is that skin lightening has impacted Africans’ individual and collective beauty standards; lighter skin is often perceived as a marker of superior beauty and economic status.

Pragmatism and aesthetics aside, the dangers of skin lightening can be serious, ranging from chemical burns to skin cancer to kidney damage. I recall an aunt whom we as children cruelly nicknamed “crocodile skin” because of the scales that had formed around her face and neck. There were stories of people whose wounds would not heal after an injury due to the skin’s thinness from harsh chemicals. One season at my boarding school, a teenage girl fell asleep with a potent soap smeared across her face, only to wake up to burns and blisters. These stories highlight the desperate measures many women are willing to take to attain their ideal of beautiful skin.


A few African countries, like Kenya and Ghana, have attempted a crackdown on the importation and sale of certain skin lightening products, especially those containing chemicals like hydroquinone and mercury. More recently, Rwanda enforced a nationwide ban on skin bleaching products, leading to authorities removing creams and soaps from shelves across the country.

While the sentiment behind the bans is noble, most of them aren’t enforced effectively. In 2017, Ghana introduced a ban on the importation and sale of products containing the lightening agent hydroquinone. “But they are not going around Accra walking into shops, picking them off the shelves,” says Nana Agyemang-Asante, a Ghanaian journalist. “I know a shop where I buy my lotions from and I still see these products. Like everything else in my country, on paper, there is a ban but people who know where to get it, will get it.”

Ineffective bans could also create an underground market with unregulated products that could be far more dangerous. Nigerian doctor Ola Brown argues that banning skin lightening products does not work as long as lighter skin is still associated with beauty and success.

For me personally, banning these bleaching products may not stop their use completely — but it will hopefully inspire conversations that not only inform people about the dangers of skin lightening but that also encourage people to talk about the psychological toll of colorism.

Luckily, young people on the continent are starting those much needed conversations. Social media-savvy youth are using the internet to push back against a singular standard of beauty and in the process forcing brands and image creators to re-examine the “aspirational look.”

Nollywood actress Beverly Naya created the “50 Shades of Black” campaign to expand the visibility of other skin tones represented in mainstream media. The campaign scored a corporate partnership with the haircare brand Dark & Lovely, helping her message reach more people than she could ever have imagined, which has led her to become the face of Nivea’s current campaign. More recently, Naya made a documentary called “Skin” which, through interviews with many Nigerian women, addresses the deep-rooted issues around colorism. She says the “goal of the documentary is to empower people and to encourage them to love themselves as they are.”


Colorism is a complex and loaded notion that requires re-examining our cultural norms of beauty. This sort of long-term educational approach will take a lot of time and effort. But I think there is hope.

Just in the same way that the natural hair movement caused a decline in the sale of chemical hair relaxers, forcing beauty companies to create products for natural hair, or how black YouTubers forced the makeup industry to rethink its products and marketing, the same can happen to the skin lightening industry.

With education and awareness campaigns and a deliberate move to broaden the spectrum of the skin tones that we see on our television screens and billboards, the needle on colorism will eventually shift. However, while we wait for that change to happen, we need strict regulations to ensure the safety of skin products being sold in stores across the continent.

Now in my 30s, I am surprisingly asked about my skin regimen despite sporting a heavy tan from taking on swimming as a new hobby. I think this is because Nigerians’ perception of what is considered beautiful skin is becoming more expansive, and there is an increased awareness that beauty isn’t monolithic.

Recent shifts in how we see beauty, such as the body positivity and natural hair movements, as well as dark-skinned, Oscar-winning Kenyan actress Lupita Nyong’o becoming ambassador for French luxury cosmetics house Lancôme, are contributing to our gradual redefining of beauty. My hope is that one day in the near future, no woman in Nigeria will feel she has to lighten her skin to feel beautiful or improve her odds of success in life."