Tuesday, May 2, 2023

Rare and tragic cases of postpartum psychosis are bringing renewed attention to its risks and the need for greater awareness of psychosis after childbirth

Postpartum depression affects approximately 1 in 8 mothers in the U.S. Postpartum psychosis is far rarer, occurring in about 1 in every 500 deliveries. Justin Paget/Stone via Getty Images
Ziv E. Cohen, Cornell University

Lindsay Clancy, a labor and delivery nurse at the prestigious Massachusetts General Hospital in Boston, is the latest tragic and high-profile example of a mother allegedly taking the lives of her own three children.

On Jan. 24, 2023, Clancy allegedly strangled the children with an exercise band while her husband ran an errand. Clancy then slit her wrists, cut her neck and jumped from the second floor of their home. She has been hospitalized since, apparently paralyzed from the waist down following her suicide attempt.

At her arraignment, Clancy’s defense lawyer stated that she may have been suffering from an extreme form of postpartum depression called postpartum psychosis. Other women have made this claim, including Andrea Yates, a Texas woman who in 2001 drowned her five children in a bathtub. She was convicted of capital murder at her first trial, but after a successful appeal, she was found not guilty by reason of insanity in her second trial.

The Centers for Disease Control and Prevention estimate that 1 in 8 mothers, or approximately 12%, experience postpartum depression. Cases of parents killing children, in contrast, are exceedingly rare, with estimates of about 500 of these tragic events per year in the U.S.

Many people wonder whether a psychiatric condition, no matter how severe, could justify or explain the killing of innocent children, especially by their own mother.

As a clinical and forensic psychiatrist, I routinely treat patients after delivery for depression, and I have evaluated women accused of killing their children. The potentially fatal outcomes make it imperative to increase awareness and understanding of postpartum depression and psychosis.

Postpartum depression explained

It is important to make a distinction between “postpartum blues” and postpartum depression. Research shows that between 15% and 85% of women have “postpartum blues,” and the incidence peaks around the fifth day following delivery. Postpartum blues can include low mood, tearfulness, irritability and feeling overwhelmed. It is a normal, transient condition thought to be a result of the rapid drop in hormone levels following delivery.

True postpartum depression is more severe than postpartum blues. This term refers to when the patient is experiencing symptoms of a clinical depressive episode, also called “major depressive episode,” usually within the first month after delivery.

Postpartum depression is defined as experiencing two weeks or more of some or all of the following symptoms: depressed mood for most of the day, diminished interest or pleasure in most activities, weight loss, inability to sleep or excessive sleep, physical slowing or agitation, fatigue, poor concentration and, in severe cases, suicidal thoughts. The medical community estimates that postpartum depression is very common, with rates of 10% to 20% in the U.S., and the true numbers may be higher.

Baby blues are characterized by worries such as “Am I a good mom?” that typically pass within a few weeks after childbirth, whereas postpartum depression involves longer-lasting feelings of disconnectedness.

The onset and duration of postpartum depression can vary greatly. For some patients, the first weeks and months after delivery may go well or mood symptoms may be manageable, followed months later by a “crash.” For others, mood symptoms may begin during pregnancy and worsen after delivery.

Diagnosis can be difficult since the time of onset is variable and because some of the symptoms of depression are normal, temporary changes that occur after delivery. In addition, research shows that cultural factors can influence the reporting and development of postpartum depression, and some patients may not disclose symptoms due to guilt or shame.

Risk factors for postpartum depression

Some key risk factors for postpartum depression include a history of depression or mental illness prior to pregnancy, stressful life events during and after pregnancy, marital conflict and young maternal age.

New mothers are under a great deal of pressure – personal, familial and societal – to immediately bond with and love their children. The stress and burden of being a new parent, and the tasks that go along with this role, such as breastfeeding, often make bonding with the child a challenge. The patient may struggle with feelings of guilt and shame, which can delay or prevent seeking help.

While the physical causes of postpartum depression are not fully understood, researchers believe the condition is driven largely by hormone fluctuations during and especially after pregnancy. For example, research suggests that sex hormones like estrogen, which reach high levels during pregnancy and then fall precipitously after delivery, as well as hormones like oxytocin that are involved in lactation and maternal-baby bonding, likely play an important role. During and after pregnancy, the brain is on a hormonal roller coaster, and this can wreak havoc on mental health.

Postpartum depression treatments

For mild cases, psychotherapy alone may be sufficient to reduce the symptoms and gradually restore a sense of well-being. Approaches such as interpersonal psychotherapy and cognitive behavioral therapy have been shown to be helpful for those suffering with postpartum depression. Interpersonal psychotherapy, for example, focuses on improving interpersonal connections, while cognitive behavioral therapy focuses on correcting distorted thinking, such as believing that one is a “bad” parent.

The mainstay of treatment for postpartum depression is medication. Given the likely strong biological underpinnings of this condition, medication is thought to be helpful in restoring neurochemistry to alleviate symptoms, such as by raising brain levels of the neurotransmitter serotonin.

Breastfeeding patients may prefer psychological treatment to medication therapy since antidepressants can enter breast milk. To date, however, antidepressants do not appear to have an effect on the infant’s health or development.

How postpartum psychosis differs

Postpartum psychosis is a condition where maternal mental health is affected not just by depression, but by a break with reality.

The break with reality, called “psychosis,” generally includes seeing or hearing things that don’t exist – called hallucinations – having jumbled or disconnected thoughts, or holding fixed false beliefs, often of a bizarre or extremely implausible nature, such as the devil having entered one’s child. In the Andrea Yates case, for example, she said she believed that she was marked by Satan and that the only way to save her children from hell was to kill them. Some patients may experience auditory hallucinations, such as a powerful voice commanding suicide or an attack on the infant.

This condition is much less common than postpartum depression and is thought to occur in 1 in 500, or 0.2%, of deliveries in the U.S. Also, unlike postpartum depression, which can begin months after delivery, postpartum psychosis usually begins within the first three days following childbirth.

Due to the severe nature of these symptoms, their rapid onset and the frequent presence of thoughts of harming oneself or the baby, postpartum psychosis is considered a psychiatric emergency. It usually results in psychiatric hospitalization for the patient’s and the baby’s safety. In many cases, postpartum depression and its extreme form, postpartum psychosis, go undetected by loved ones and health care providers because of a reluctance to acknowledge that the patient may be a danger to themselves or the child.

What experts know about Clancy’s case

Lindsay Clancy reportedly suffered from anxiety about going back to work in September 2022, four to five months after giving birth to her third child. She was diagnosed with anxiety and prescribed anti-anxiety medications and antidepressants.

In December 2022, Clancy was evaluated at a women’s psychiatric clinic, where she was told she did not have postpartum depression. However, a short time later she told her husband she was having thoughts of harming herself and the children, and was admitted to a psychiatric hospital. She was discharged after a few days and reported that her suicidal thoughts had resolved. However, just a few days later, she allegedly strangled her three children.

If accurate, this timeline indicates how difficult it can be to diagnose possible postpartum depression and psychosis, and that symptoms may fluctuate on a daily or even hourly basis. Mothers may not always disclose symptoms due to guilt, shame or fear about how it could impact their family.

Clancy’s tragic story illustrates how important close mental health follow-up and treatment are for women suspected of having postpartum depression. And when suicidal thoughts or thoughts of harming the children are present, they must be treated as a potential psychiatric emergency.

Ziv E. Cohen, Clinical Assistant Professor of Psychiatry, Cornell University

This article is republished from The Conversation under a Creative Commons license.

Recharge for Summer Fun with a Sweet Superfood

(Family Features) Summer adventures can often take people just about anywhere, from down the street at the neighborhood pool to across the country on a family road trip. Wherever the action takes you this summer, remember to stay refreshed and energized with easy snacks that provide the nutrition you need.

For example, these Sweetpotato Summer Rolls offer a flavorful way to recharge after some time in the sun. Made with peanut butter, celery sticks and North Carolina Sweetpotatoes, they’re ideal for serving your family following a day of fun.

Classified as a “diabetes superfood” by the American Diabetes Association, sweetpotatoes are rich in vitamins, minerals, fiber and antioxidants, all of which are good for overall health. Plus, they offer natural sweetness without added sugar.

Consider these additional sweetpotato facts as you prepare for summer excitement.

Versatile

Sweetpotatoes are among the most versatile vegetables. Easy to add to a variety of recipes to enhance flavor and nutrition, they can be a key ingredient in dishes simple or elevated, sweet or savory. They can be cooked on the stove, baked, microwaved, grilled or slow cooked.

One-Word Spelling

“Sweetpotato” should be spelled as one word, even if you aren’t familiar with that spelling. In fact, the North Carolina Sweetpotato Commission deliberately spells it as one word (a practice adopted by the National Sweetpotato Collaborators in 1989) to help shippers, distributors, warehouse workers and consumers avoid confusing it with the white potato or the yam.

Shelf Life and Storage

Not only are sweetpotatoes abundant and found in just about any grocery store or farmers market, but they also have a long shelf life – up to 4 weeks if stored properly in a cool, dry, well-ventilated area away from heat sources.

Ideal for Athletes

Due to their high carbohydrate content, sweetpotatoes are a good choice both before and after exercise. With complex carbohydrates that provide sustained energy and antioxidants that help reduce inflammation and aid muscle repair, sweetpotatoes can support both endurance and recovery.

Find more summertime recipe ideas by visiting ncsweetpotatoes.com.

Watch video to see how to make this recipe!


Sweetpotato Summer Rolls

Recipe courtesy of the North Carolina Sweetpotato Commission
Servings: 4

  • 2 1/2    cups North Carolina Sweetpotatoes
  • 2          tablespoons olive oil
  • 2          teaspoons sesame seeds
  • 3          tablespoons maple syrup
  • salt, to taste
  • coarse pepper, to taste
  • 4          celery sticks
  • 1          red pepper
  • 2          tablespoons creamy peanut butter
  • 1/3       cup hot water
  • 1/4       cup soy cooking cream
  • 1          tablespoon soy sauce
  • 12        sheets rice paper (22-centimeter diameter)
  • 2          tablespoons chopped, roasted peanuts
  1. Peel sweetpotatoes and cut into 1-centimeter thick strips.
  2. In skillet, heat olive oil. Fry sweetpotato strips 3-4 minutes, turning occasionally; sprinkle with sesame seeds, deglaze with maple syrup and boil down briefly. Season with salt and pepper, to taste, and let cool.
  3. Wash celery and red pepper; cut into strips.
  4. Mix peanut butter with water, cream and soy sauce.
  5. Let rice paper sheets swell according to package instructions.
  6. Spread strips of sweetpotato, celery and red pepper on top half of one sheet rice paper. Drizzle with sauce. Fold lower half over strips then edges.
  7. Repeat with remaining rice paper sheets and ingredients. Sprinkle summer rolls with chopped peanuts. Serve with remaining sauce.
SOURCE:
North Carolina Sweetpotato Commission

Fighting climate change means taking laughing gas seriously

Agriculture researchers seek ways to reduce nitrous oxide’s impact on warming

As nations and industries try to cut greenhouse gas emissions to tackle climate change, agricultural practices are in the spotlight. There’s good reason for that: Agriculture accounts for 16 to 27 percent of human-caused climate-warming emissions, according to the Intergovernmental Panel on Climate Change (IPCC). But a large share of those emissions comes not from carbon dioxide, that familiar climate change villain, but from another gas altogether: nitrous oxide.

N2O, also known as laughing gas, does not get nearly the attention it deserves, says David Kanter, a nutrient pollution researcher at New York University and vice chair of the International Nitrogen Initiative, an organization focused on nitrogen pollution research and policy making. “It’s a forgotten greenhouse gas,” he says. Yet molecule for molecule, N2O is about 300 times as potent as carbon dioxide at heating the atmosphere. And like CO2, it is long-lived, spending an average of 114 years in the sky before disintegrating. It also depletes the ozone layer. In all, the climate impact of laughing gas is no joke. IPCC scientists have estimated that nitrous oxide comprises roughly 6 percent of greenhouse gas emissions, and about three-quarters of those N2O emissions come from agriculture.

But despite its important contribution to climate change, policy makers have not directly addressed N2O emissions. And the gas continues to accumulate. A 2020 review of nitrous oxide sources and sinks found that emissions rose 30 percent in the last four decades and are exceeding all but the highest potential emissions scenarios described by the IPCC. Agricultural soil — especially because of the globe’s heavy use of synthetic nitrogen fertilizer — is the principal culprit.

Today, scientists are looking at an array of ways to treat the soil or adjust farming practices to cut back on N2O production.

“Anything that can be done to improve fertilizer use efficiency would be big,” says Michael Castellano, an agroecologist and soil scientist at Iowa State University.

Nitrogen unbalanced

Humanity has tipped the Earth’s nitrogen cycle out of balance. Before the rise of modern agriculture, most plant-available nitrogen on farms came from compost, manure and nitrogen-fixing microbes that take nitrogen gas (N2) and convert it to ammonium, a soluble nutrient that plants can take up through their roots. That all changed in the early 1900s with the debut of the Haber-Bosch process that provided an industrial method to produce massive amounts of ammonia fertilizer.

This abundance of synthetic fertilizer has boosted crop yields and helped to feed people around the globe, but this surplus nitrate and ammonium comes with environmental costs. Producing ammonia fertilizer accounts for about 1 percent of all global energy use and 1.4 percent of CO2 emissions (the process requires heating nitrogen gas and subjecting it to pressures of up to 400 atmospheres, so it’s very energy-intensive). More importantly, the fertilizer drives increased emissions of nitrous oxide because farmers tend to apply the nitrogen to their fields in a few large batches during the year, and crops can’t use it all.

When plant roots don’t mop up that fertilizer, some of it runs off the field and pollutes waterways. What remains is consumed by a succession of soil microbes that convert the ammonia to nitrite, then nitrate and, finally, back to N2 gas. N2O is made as a by-product at a couple of points during this process.

Carefully dispensing fertilizer right when plants need it or finding ways to maintain yields with reduced nitrogen fertilizer would reduce these N2O emissions, and scientists are looking at various ways to do that. One strategy under investigation is to harness precision agriculture techniques that use remote sensing technology to determine where and when to add nitrogen to fields, and how much. Another is to use nitrification inhibitors, chemicals that suppress the ability of microbes to turn ammonia into nitrate, impeding the creation of N2O and keeping the nitrogen in the soil for plants to use over a longer span of time.

Widely adopting these two practices would reduce nitrous oxide emissions about 26 percent from their current trajectory by 2030, according to a 2018 estimate by researchers at the International Institute for Applied Systems Analysis in Austria. But the authors say it will take more than that to help meet greenhouse gas targets such as those set forth in the Paris climate agreement. So scientists are exploring additional strategies.

One potential method involves harnessing the ability of certain microbes to directly supply nitrogen to plants, much as nitrogen-fixing bacteria already do in partnership with beans, peanuts and other legumes. “There’s really a gold mine living in the soil,” says Isai Salas-González, coauthor of an article on the plant microbiome in the 2020 Annual Review of Microbiology and a computational biologist who recently completed a PhD at the University of North Carolina at Chapel Hill.

In that vein, since 2019 the company Pivot Bio has marketed a microbial product called Pivot Bio Proven that, they say, forms a symbiosis with crops’ roots after an inoculant is poured in the furrows where corn seeds are planted. (The company plans to release similar products for sorghum, wheat, barley and rice.) The microbes spoon-feed nitrogen a little at a time in exchange for sugars leaked by the plant, reducing the need for synthetic fertilizer, says Karsten Temme, CEO of Pivot Bio. 

Temme says that company scientists created the inoculant by isolating a strain of the bacterium Kosakonia sacchari that already had nitrogen-fixing capabilities in its genome, although the genes in question were not naturally active under field conditions. Using gene editing technology, the scientists were able to reactivate a set of 18 genes so that the enzyme nitrogenase is made even in the presence of synthetic fertilizer. “We coax them to start making this enzyme,” Temme says.

Steven Hall, a biogeochemist at Iowa State University, is now testing the product in large, dumpster-sized containers with corn growing in them. Researchers apply the inoculant, along with different amounts of synthetic fertilizer, to the soil and measure corn yields, nitrous oxide production and how much nitrate leaches from the base of the containers. Though results of the trial are not yet out, Hall says there’s “good initial support” for the hypothesis that the microbes reduce the need for fertilizer, thereby reducing nitrous oxide emissions.

But some soil scientists and microbiologists are skeptical of a quick microbial fix. “Biofertilizers” like these have had mixed success, depending on the soil and environment in which they are applied, says Tolu Mafa-Attoye, an environmental microbiology graduate student at the University of Guelph in Canada. In one field study of wheat, for example, inoculating the crops with beneficial microbes enhanced growth of the plants but only resulted in slightly greater yields. Unknowns abound, Mafa-Attoye’s Guelph colleagues wrote in February in Frontiers in Sustainable Food Systems — such as whether the microbes will negatively affect the soil ecology or be outcompeted by native microbes.

Instead of adding in a microbe, it may make more sense to encourage the growth of desirable microbes that already exist in the soil, says Caroline Orr, a microbiologist at Teesside University in the UK. She has found that cutting back on pesticide use led to a more diverse microbial community and a greater amount of natural nitrogen fixation. In addition, production of nitrous oxide is influenced by the availability of carbon, oxygen and nitrogen — and all are affected by adjusting fertilizer use, irrigation and plowing. 

Take tillage, for example. An analysis of more than 200 studies found that nitrous oxide emissions increased in the first 10 years after farmers stopped or cut back on plowing their land. But after that, emissions fell. Johan Six, a coauthor of the analysis and an agroecologist at ETH Zürich in Switzerland, thinks that’s because the soils start out in a heavily compacted state after years of equipment driving over them. Over time, though, the undisturbed soil forms a cookie-crumb-like structure that allows more air to flow in. And in high oxygen environments, microbes produce less nitrous oxide. Such no-till systems also result in more carbon storage because less plowing means reduced conversion of organic carbon to CO2 — providing an additional climate benefit.

It may even be possible for farmers to save money on fertilizer and water and reduce emissions, all while maintaining yields. In research on tomato farms in California’s Central Valley, Six found that study plots with reduced tillage and a drip irrigation system that slowly oozed nitrogen to plants — reducing how much of the nutrient pooled in the soil — lowered N2O emissions by 70 percent compared with conventionally managed plots. The farmer who implemented those changes was also compensated for his greenhouse gas reduction through the state’s cap-and-trade program. With the right incentives, persuading farmers to cut their emissions might not be that hard, says Six. 

In Missouri, farmer Andrew McCrea grows 2,000 acres of corn and soy in a no-till system. This year, he plans to trim back his fertilizer use and see if the Pivot Bio inoculant can keep his yields more or less the same. “I think all farmers certainly care about the soil,” he says. “If we can cut costs, that’s great too.”

And if policy makers turn to tackling nitrous oxide, there should be rippling benefits for all of us, says Kanter of New York University. Some of them could be more rapid and tangible than addressing climate change. The same measures that lower N2O levels also reduce local air and water pollution as well as biodiversity losses. “Those are things that people will see and feel immediately,” Kanter says, “within years as opposed to within decades or centuries.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Our bodies crave more food if we haven’t had enough protein, and this can lead to a vicious cycle — especially if we’re reaching for ultraprocessed instead of high-fiber whole foods

This story starts in an unusual place for an article about human nutrition: a cramped, humid and hot room somewhere in the Zoology building of the University of Oxford in England, filled with a couple hundred migratory locusts, each in its own plastic box.

It was there, in the late 1980s, that entomologists Stephen Simpson and David Raubenheimer began working together on a curious job: rearing these notoriously voracious insects, to try and find out whether they were picky eaters.

Every day, Simpson and Raubenheimer would weigh each locust and feed it precise amounts of powdered foods containing varying proportions of proteins and carbohydrates. To their surprise, the young scientists found that whatever food the insects were fed, they ended up eating almost exactly the same amount of protein.

In fact, locusts feeding on food that was low in protein ate so much extra in order to reach their protein target that they ended up overweight — not chubby on the outside, since their exoskeleton doesn’t allow for bulges, but chock-full of fat on the inside.

Inevitably, this made Simpson and Raubenheimer wonder whether something similar might be causing the documented rise in obesity among humans. Many studies had reported that even as our consumption of fats and carbohydrates increased, our consumption of protein did not.

Might it be that, like locusts, we are tricked into overeating, in our case by the irresistible, low-protein, ultraprocessed foods on the shelves of the stores where we do most of our foraging? That’s what Raubenheimer and Simpson, both now at the University of Sydney, argue in their recent book “Eat Like the Animals” and in an overview in the Annual Review of Nutrition.

Simpson took us through the reasoning and the data in an interview with Knowable Magazine. This conversation has been edited for length and clarity.

How does an entomologist end up studying nutrition in humans?

My interest in feeding behavior goes all the way back to my undergraduate years in Australia, where I was studying the food choices of sheep blowfly maggots, which hatch in the wool of sheep and eat the sheep alive. For my PhD, I took an opportunity at the University of London, England, to study appetite and food intake control in migratory locusts, which exist in two extreme forms — one solitary and one aggregating in swarms that create devastating plagues.

Since they had this reputation for being absolutely voracious, we surely did not expect them to have a lot of nuance in the way they control what they eat. But I started to explore whether they could sense the requirement for different nutrients and use it to regulate their intake. That led to experiments with artificial diets of different nutrient compositions, which showed that locusts have nutrient-specific appetites for protein and carbohydrate: Their food tastes different to them depending on what they need, and that enables them to balance their diets.

In 1987, I started working with David Raubenheimer at Oxford to find out what happens if you put locusts on a diet that forces different appetites to compete, by feeding the animals mixtures of proteins and carbohydrates in relative amounts that do not match their intake target. We made 25 different diets, measured how much the locusts ate, how quickly they developed, and how big they grew, and found that when protein and carbohydrate appetites compete, protein wins.

What that means is that if you put animals on a low-protein, high-carb diet, they’ll eat more calories to get that limiting protein, and they’ll end up obese. Likewise, if you put them on a high-protein, low-carb diet, they don’t need to eat as much to get to their protein target, and they end up losing weight. It was at that point that we knew we had discovered a powerful new way of looking at nutrition.

We started looking at lots of different species of insects, and found that they, too, had the capacity to regulate their intake of protein and carbohydrate, and that protein was often, but not always, the prioritized nutrient.

By now, we have studied species from cats, dogs and free-ranging primates to fish in aquaculture to slime molds to humans, in a variety of contexts — from understanding health and disease to optimizing animal feed to conservation biology.

You’ve found that the nutrient levels that animals aim for are the ones at which they grow, survive or reproduce best. Just by following their appetite, they eat exactly what they need. Why don’t we?

There are two possibilities. Either our biology is broken, or it still works but we’re in the wrong environment. What we’ve shown in our studies is the latter. What has happened is our appetites, which evolved in natural environments, have now been subjected to highly engineered food environments which have been designed, in many ways, to hack our biology, to subvert our appetites.

One of our favorite examples came from a study we did in Sydney. We confined people in a sleep center for three four-day periods and provided them with foods and menus which were varied and matched in palatability, but were all of the same nutrient composition for a given week.

We had a 25 percent protein week, a 15 percent protein week, and a 10 percent protein week, and the subjects didn’t know that was going on. As far as they were concerned, they were allowed to eat what they wanted, everything tasted equally good and there were lots of choices. But it turned out that during the low-protein week, people ate more, because their protein appetite would drive them to eat more calories, to try and get enough protein. They largely did this by increasing snacking between meals, and selectively on savory-flavored snacks.

We’ve subsequently discovered that when you’re low in protein, as is the case on a 10 percent protein diet, you have elevated levels of a hormone called FGF21, which is mainly released from the liver. What we’ve shown in mouse experiments and confirmed in humans is that FGF21 switches on savory-seeking behavior, which is a proxy for eating protein.

Now, if you have that response and the nearest savory thing is a bag of barbecue-flavored potato crisps, that’s a protein decoy. You’ll be misdirected to eat that, but you’ll not get any substantial amount of protein. You’ll remain protein-hungry, and you’ll have to eat more to satisfy that protein appetite. That means you’re accumulating excess calories, and that is precisely what happens to us in our modern food environment.

You argue that ultraprocessed foods are especially likely to make us consume too many calories. Why would that be so?

Over the last couple of years, population survey data have shown that the average person in the US, Australia or the UK gets more than half their calories from highly processed foods — in some cases it’s 90 percent or more. As the proportion of ultraprocessed food in the diet increases, protein intake remains largely the same, but energy intake goes up steeply because of the dilution of protein by the fats and carbs in these foods. So this protein appetite we discovered initially in locusts operates in us too. In our modern food environment, it drives us to overconsume energy, and that sets up a vicious cycle.

What we find is that as people become overweight, their metabolism becomes dysregulated. Their tissues become less responsive to insulin, which normally regulates protein metabolism. This makes protein metabolism less efficient, causing the body to break down lean tissues like muscle and bone and burn protein to produce energy.

That increases people’s protein target, so they’ll eat even more, put on more weight, become even more metabolically dysregulated, start craving more protein, and so on.

We’ve since taken that basic idea and used it in a paper at the end of last year to propose a new understanding of why women are prone to put on weight during menopause. That’s a period when protein breakdown rates go steeply upwards in bone and muscle because of the decline in reproductive hormones. And it is driving the same sort of outcome that I just described.

You also see it in aging, you see it in people who smoke, you see it with excess alcohol intake — these are all circumstances in which FGF21 goes up, protein appetite goes up, protein breakdown goes up, and you’ll end up in this sort of vicious cycle.

As an entomologist, how did you manage to convince colleagues in nutrition science this matters?

It’s just the accumulation of evidence. Last fall, we spoke at the Royal Society in London at a big obesity conference, and the response to our talk indicated to me that protein leverage is now accepted as one of the main, credible underlying explanations for obesity. Our evidence comes from pre-clinical studies, it comes from clinical studies, it comes from cohort studies, it comes from population-level analyses, it comes from deep mechanistic biology — it’s now unanswerably there. The remaining question is: Of the various influences that drive obesity, is protein appetite a main one? We think it probably is.

Why would protein be the strongest driver of our appetites? What would be the biological logic?

All three macronutrients — fat, carbs and protein — contain calories, so we can burn any of them to yield energy, and we can use any of them to make glucose, which is the preferred fuel for our cells and brain.

But only protein has nitrogen, which we need for many other purposes, from maintaining our cells to producing offspring. You don’t want to eat too little protein.

That leaves the question of why we don’t overeat it. Why do we eat fewer calories than we need on a high-protein diet, rather than eat excess protein? To us, that implied there is a cost to eating too much protein, and we set out to discover that cost in fruit flies. We designed a large experiment where we confined a thousand flies to one of 28 diets varying in the ratio of protein and carbohydrate, the two major macronutrients for flies. What we found was that flies lived longest on a lower-protein, high-carbohydrate diet, but laid most eggs on a higher-protein, lower-carbohydrate diet. A really-high-protein diet, finally, wasn’t better for either outcome.

That overturned a hundred years of thinking around restricting calories and aging: The dominant view was that reduced calories were what prolonged life, but our data showed that the type of calories matter, notably the ratio of protein to carbs. And it created quite a stir at the time — the paper came out in 2008.

We set out to do the same experiment in mice. To do that, we had to add fat as a third nutrient dimension to the dietary design. That involved an enormous study. We took more than 700 mice and put them on one of 25 different diets varying in the concentration and ratio of protein, carbohydrate and fat. It took 6 metric tons of experimental diet to run that study across the 3 or 4 years it took before the oldest mice died.

That was the first of a whole series of huge mouse experiments where we looked at different types of carbohydrate, different ratios of amino acids, and so on. The long and the short of it was that the mice lived longer on low-protein, high-carbohydrate diets, but reproduced better on high-protein, low-carbohydrate diets — very similar to the flies.

Importantly, the benefit of low protein was only realized when the carbohydrates were harder-to-digest complex carbohydrates like fiber and starch, not simple sugars. If you translate that into human populations and look across the world for human populations that live the longest, lo and behold they’re the ones on diets low in protein and high in healthy carbohydrates and fats, such as Mediterranean-style diets and the traditional Okinawan diet.

I’m sure they’re all very healthy, but how do people on these diets manage their appetites?

That’s a really interesting question. The Okinawans certainly are hungry for protein. In traditional Japanese cuisine, there is an almost religious prominence given to umami flavors, which are the signature of protein, the savory characteristic in foods. So that’s like a societal protein appetite.

The other question is: On a 10 percent protein diet like the Okinawan diet, why aren’t they all suffering obesity because they have to eat far more to get their protein? The answer is that the traditional diet is low in energy, and high in fiber. By eating more to try and attain their protein target, they get more fiber instead of more calories, until their stomach is full. That’s a crucial distinction with the modern industrialized food environment, which isn’t just low in protein, but also low in fiber — and high in fats and carbs.

If low protein and low fiber content are the main problem, would it help to just increase them in ultraprocessed foods? Or would that not be sufficient?

Science has already nudged the industry in that direction in a couple of ways that are not altogether helpful. The high-protein snack industry is a phenomenon which reflects this science. Their response was: We’ve got a new market now for high-protein bars. Whether or not that’s ultimately going to help the world’s waistline is less clear at the moment, as the food environment as a whole remains replete with low-protein, low-fiber, ultrapalatable processed foods.

The principal driver for reducing protein content in ultraprocessed foods was that protein is more expensive than fats and carbohydrates. It was cheaper to take some of the protein out and add a little more fat and carbs, particularly when you can make things taste fantastic by mixing sugar and fat and a bit of salt together.

Some of the big providers of lifestyle interventions have shifted towards increasing the percent protein in the diet. And of course, all of the commercially successful fad diets of recent decades have been high-protein diets. But none of them takes account of the fact that there’s potentially a cost to a higher-protein diet.

As we’ve shown originally in flies and mice, a higher protein-to-carbohydrate ratio than we need speeds up aging in our tissues. That being said, if you’re suffering obesity and diabetes, the benefits of a high-protein diet in terms of weight loss may outweigh the costs. It’s a matter of understanding the relative costs and benefits associated with different diet compositions, relating them to personal goals and breaking away from some of the crazy diet zealotry that goes on online and is promoted by many of the fad diet industries.

So you’d recommend eating more fiber and fewer carbs and fats rather than eating more protein? How does that affect your own choice of snacks outside of mealtimes?

I have a deep love of food, cooking, and even hunting and gathering — I’m a fisherman. But I’m as susceptible to the siren call of ultraprocessed foods and beverages as everyone else. These products have been designed to be irresistible, so I avoid them, except on occasions. They are not in the house or my shopping trolley.

As a family, we eat whole foods, plenty of fruits and vegetables, pulses, nuts and grains, as well as dairy and high-quality meat, fish and poultry. There are many ways to mix a nutritionally balanced and delicious diet without the use of apps or computer programs. After all, no species in the history of life on Earth ever needed those.

The trick is to take advantage of our evolved biology of appetite by creating an environment in which our appetites can guide us to a healthy and balanced diet. We need to help our appetites work for ourselves and our health, not the profits of the food and beverage industries.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Sudan’s conflict has its roots in three decades of elites fighting over oil and energy

The opening of a hydro-electric dam on the Nile River at Merowe, north of Khartoum, in 2009. Ashraf Shazly/AFP via Getty Images
Harry Verhoeven, Columbia University

Sudan stands on the brink of yet another civil war sparked by the deadly confrontation between the Sudan Armed Forces of General Abdelfatah El-Burhan and the Rapid Support Forces of Mohamed Hamdan Dagalo (“Hemedti”).

Much of the international news coverage has focused on the clashing ambitions of the two generals. Specifically, that differences over the integration of the paramilitary Rapid Support Forces into the regular army triggered the current conflict on April 15, 2023.

I am a professor teaching at Columbia University and my research focuses on the political economy of the Horn of Africa. A forthcoming paper of mine in the Journal of Modern African Studies details the strategic calculus of the Sudan Armed Forces in managing revolution and democratisation efforts, today as well as in past transitions. Drawing on this expertise, it is important to underline that three decades of contentious energy politics among rival elites forms a crucial background to today’s conflict.

The current conflict comes after a decade-long recession which has drastically lowered the living standards of Sudanese citizens as the state teetered on the brink of insolvency.

How energy has shaped Sudan’s violent political economy

Long gone are the heady days when Sudan emerged as one of Africa’s top oil producers. Close to 500,000 barrels were pumped every day by 2008. Average daily production in the last year has hovered around 70,000 barrels.

In the late 1990s, amid a devastating civil war, President Omar Al-Bashir’s military-Islamist regime announced that energy would help birth a new economy. It had already paved the way for this reality, ethnically cleansing the areas where oil would be extracted. The regime struck partnerships with Chinese, Indian and Malaysian national oil companies. Growing Asian demand was met with Sudanese crude.

Petrodollars poured in. The regime in power between 1989 and 2019 oversaw a boom. This enabled it to weather internal political crises, increase the budgets of its security agencies and to spend lavishly on infrastructure. Billions of dollars were channelled to the construction and expansion of several hydro-electric dams on the Nile and its tributaries.

These investments were intended to enable the irrigation of hundreds of thousands of hectares. Food crops and animal fodder were to be grown for Middle Eastern importers. Electricity consumption in urban centres was transformed; production in Sudan was boosted by thousands of megawatts. The regime spent more than US$10 billion on its dam programme. That’s a phenomenal sum and testament to its belief that the dams would become the centrepiece of Sudan’s modernised political economy.

South Sudan secedes

Then, in 2011, South Sudan seceded – along with three-quarters of Sudan’s oil reserves. This exposed the illusions on which these dreams of hydro-agricultural transformation rested. The regime lost half of its fiscal revenues and about two-thirds of its international payment capacity.

The economy shrank by 10%. Sudan was also plagued by power cuts as the dams proved very costly and produced much less than promised. Lavish fuel subsidies were maintained but as evidence shows, these disproportionately benefited select constituencies in Khartoum and failed to protect the poor.

As the regime sank ever deeper into economic crisis, its security agencies concentrated on accumulating the means they deemed essential to survive, and to compete with each other. Both the Sudan Armed Forces and Rapid Support Forces deepened their involvement in Sudan’s political economy. They took control of key commercial activities. These included meat processing, information and communication technology and gold smuggling.

Soaring fuel, food and fertiliser prices

This economic crisis fuelled a popular uprising which led to the overthrow of Al-Bashir. After the 2018-2019 revolution, the international community oversaw a power-sharing arrangement. This brought together Sudan Armed Forces, Rapid Support Forces and a civilian cabinet. Reforms were tabled to reduce spending on fuel imports and address the desperate economic situation.

However, the proposals for economic reform competed for government and international attention with calls to fast-track the “de-Islamisation” of Sudan, and to purge collaborators of the ousted regime from civil service ranks.

Inflationary pressures worsened as food and energy prices rose. This also strengthened a growing regional black market in which fuel, wheat, sesame and much else was illicitly traded across borders. At the same time, divisions grew in Sudan’s political establishment and among protesters in its streets.

The government’s efforts to push back against growing control of economic activities by the Sudan Armed Forces and Rapid Support Forces ultimately contributed to the October 2021 coup against Prime Minister Abdallah Hamdok.

Overlapping crises

The coup only deepened the crisis. So too did global supply shocks, such as those caused by the COVID-19 pandemic and the Russia-Ukraine conflict, which sent the prices of fuel, food and fertiliser skyrocketing globally, including in Sudan. Fertiliser prices increased by more than 400%. The state’s retreat from subsidising essential inputs for agricultural production, such as diesel and fertiliser, led farmers to drastically reduce their planting, further exacerbating the food production and affordability crunch.

Amid these overlapping energy, food and political crises, Sudan’s Armed Forces and Rapid Support Forces have been violently competing for control of the political economy’s remaining lucrative niches, such as key import-export channels. Both believe the survival of their respective institutions is essential to preventing the country from descending into total disintegration.

In view of such contradictions and complexity, there are no easy solutions to Sudan’s multiple crises. The political, economic and humanitarian situation is likely to worsen further.

A version of this article was first published by the Center on Global Energy Policy.

Harry Verhoeven, Senior Research Scholar at the Center on Global Energy Policy, Columbia University

This article is republished from The Conversation under a Creative Commons license. 

Generative AI is forcing people to rethink what it means to be authentic

Generative AI thrives on exploiting people’s reflexive assumptions of authenticity by producing material that looks like ‘the real thing.’ artpartner-images/The Image Bank via Getty Images
Victor R. Lee, Stanford University

It turns out that pop stars Drake and The Weeknd didn’t suddenly drop a new track that went viral on TikTok and YouTube in April 2023. The photograph that won an international photography competition that same month wasn’t a real photograph. And the image of Pope Francis sporting a Balenciaga jacket that appeared in March 2023? That was also a fake.

All were made with the help of generative AI, the new technology that can generate humanlike text, audio and images on demand through programs such as ChatGPT, Midjourney and Bard, among others.

There’s certainly something unsettling about the ease with which people can be duped by these fakes, and I see it as a harbinger of an authenticity crisis that raises some difficult questions.

How will voters know whether a video of a political candidate saying something offensive was real or generated by AI? Will people be willing to pay artists for their work when AI can create something visually stunning? Why follow certain authors when stories in their writing style will be freely circulating on the internet?

I’ve been seeing the anxiety play out all around me at Stanford University, where I’m a professor and also lead a large generative AI and education initiative.

With text, image, audio and video all becoming easier for anyone to produce through new generative AI tools, I believe people are going to need to reexamine and recalibrate how authenticity is judged in the first place.

Fortunately, social science offers some guidance.

The many faces of authenticity

Long before generative AI and ChatGPT rose to the fore, people had been probing what makes something feel authentic.

When a real estate agent is gushing over a property they are trying to sell you, are they being authentic or just trying to close the deal? Is that stylish acquaintance wearing authentic designer fashion or a mass-produced knock-off? As you mature, how do you discover your authentic self?

These are not just philosophical exercises. Neuroscience research has shown that believing a piece of art is authentic will activate the brain’s reward centers in ways that viewing something you’ve been told is a forgery won’t.

Authenticity also matters because it is a social glue that reinforces trust. Take the social media misinformation crisis, in which fake news has been inadvertently spread and authentic news decreed fake.

In short, authenticity matters, for both individuals and society as a whole.

But what actually makes something feel authentic?

Psychologist George Newman has explored this question in a series of studies. He found that there are three major dimensions of authenticity.

One of those is historical authenticity, or whether an object is truly from the time, place and person someone claims it to be. An actual painting made by Rembrandt would have historical authenticity; a modern forgery would not.

A second dimension of authenticity is the kind that plays out when, say, a restaurant in Japan offers exceptional and authentic Neapolitan pizza. Their pizza was not made in Naples or imported from Italy. The chef who prepared it may not have a drop of Italian blood in their veins. But the ingredients, appearance and taste may match really well with what tourists would expect to find at a great restaurant in Naples. Newman calls that categorical authenticity.

And finally, there is the authenticity that comes from our values and beliefs. This is the kind that many voters find wanting in politicians and elected leaders who say one thing but do another. It is what admissions officers look for in college essays.

In my own research, I’ve also seen that authenticity can relate to our expectations about what tools and activities are involved in creating things.

For example, when you see a piece of custom furniture that claims to be handmade, you probably assume that it wasn’t literally made by hand – that all sorts of modern tools were nonetheless used to cut, shape and attach each piece. Similarly, if an architect uses computer software to help draw up building plans, you still probably think of the product as legitimate and original. This is because there’s a general understanding that those tools are part of what it takes to make those products.

When a piece of furniture is advertised as handmade, we assume that tools were still involved. Arterra/Universal Images Group via Getty Images

In most of your quick judgments of authenticity, you don’t think much about these dimensions. But with generative AI, you will need to.

That’s because back when it took a lot of time to produce original new content, there was a general assumption that it required skill to create – that it only could have been made by skilled individuals putting in a lot of effort and acting with the best of intentions.

These are not safe assumptions anymore.

How to deal with the looming authenticity crisis

Generative AI thrives on exploiting people’s reliance on categorical authenticity by producing material that looks like “the real thing.”

So it’ll be important to disentangle historical and categorical authenticity in your own thinking. Just because a recording sounds exactly like Drake – that is, it fits the category expectations for Drake’s music – does not mean that Drake actually recorded it. The great essay that was turned in for a college writing class assignment may not actually be from a student laboring to craft sentences for hours on a word processor.

If it looks like a duck, walks like a duck and quacks like a duck, everyone will need to consider that it may not have actually hatched from an egg.

Also, it’ll be important for everyone to get up to speed on what these new generative AI tools really can and can’t do. I think this will involve ensuring that people learn about AI in schools and in the workplace, and having open conversations about how creative processes will change with AI being broadly available.

Writing papers for school in the future will not necessarily mean that students have to meticulously form each and every sentence; there are now tools that can help them think of ways to phrase their ideas. And creating an amazing picture won’t require exceptional hand-eye coordination or mastery of Adobe Photoshop and Adobe Illustrator.

Finally, in a world where AI operates as a tool, society is going to have to consider how to establish guardrails. These could take the form of regulations, or the creation of norms within certain fields for disclosing how and when AI has been used.

Does AI get credited as a co-author on writing? Is it disallowed on certain types of documents or for certain grade levels in school? Does entering a piece of art into a competition require a signed statement that the artist did not use AI to create their submission? Or does there need to be new, separate competitions that expressly invite AI-generated work?

These questions are tricky. It may be tempting to simply deem generative AI an unacceptable aid, in the same way that calculators are forbidden in some math classes.

However, sequestering new technology risks imposing arbitrary limits on human creative potential. Would the expressive power of images be what it is now if photography had been deemed an unfair use of technology? What if Pixar films were deemed ineligible for the Academy Awards because people thought computer animation tools undermined their authenticity?

The capabilities of generative AI have surprised many and will challenge everyone to think differently. But I believe humans can use AI to expand the boundaries of what is possible and create interesting, worthwhile – and, yes, authentic – works of art, writing and design.

Victor R. Lee, Associate Professor of Learning Sciences and Technology Design in Education, Stanford University

This article is republished from The Conversation under a Creative Commons license. 

Monday, May 1, 2023

Shelby Democrats Collect Diapers for Babies and Elders




May Issue 2023


Diapers are expensive, as any new parent or caretaker of an incontinent adult can attest. The Shelby County Democratic Party’s latest community service project, a Diaper and Wipes Drive led by Public Relations Committee Chair Leslie Tyus, helped families who struggle to obtain these necessary supplies. Disposable diapers and wipes are not the most glamorous of products, but they make a huge difference in the lives of people who depend upon them, no pun intended. And nothing is more stressful than running out of diapers when money is tight.

 

On a recent Saturday at the Community Room at Shelby County Services Building, members of the Democratic Party plus their friends converged on the site with bags and boxes and loaded car trunks. A total of 2,073 baby items, including diapers and packs of wipes, plus 493 adult care products, including incontinence garments and packs of adult-size wipes, were collected.

 

Shelby Baptist Association Ministry, Oak Mountain Ministries, and the Elder Justice Center of Alabama, an arm of the Middle Alabama Area Agency on Aging, received all the donated items and will distribute them to needy families and individuals county-wide.

The pandemic has immersed us faster and deeper in immersive communication technologies. It’s a disrupted, confusing, sometimes exhausting world — but shifting both the tech and our expectations might make it a better one.

I am sitting in a darkened room, listening to upbeat music of the type often used at tech conferences to make attendees feel they are part of Something Big, waiting in eager anticipation for a keynote speaker to appear.

Bang on time, virtual communication expert Jeremy Bailenson arrives on the digital stage. He is here at the American Psychological Association’s November meeting, via a videoconferencing app, to somewhat ironically talk about Zoom fatigue and ways to battle it. “In late March, like all of us, I was sheltered in place,” Bailenson tells his invisible tele-audience. “After a week of being on video calls for eight or nine hours a day, I was just exhausted.”

One of the pandemic’s many impacts was to throw everyone suddenly online — not just for business meetings but also for everything from birthday parties to schooling, romantic dates to science conferences. While the Internet thankfully has kept people connected during lockdowns, experiences haven’t been all good: There have been miscommunications, parties that fall flat, unengaged schoolkids.

Many found themselves tired, frustrated or feeling disconnected, with researchers left unsure as to exactly why and uncertain how best to tackle the problems. Sensing this research gap, Bailenson, director of Stanford University’s Virtual Human Interaction Lab, and colleagues quickly ramped up surveys to examine how people react to videoconferencing, and this February published a “Zoom Exhaustion & Fatigue Scale” to quantify people’s different types of exhaustion. They found that having frequent, long, rapid-fire meetings made people more tired; many felt cranky and needed some alone time to decompress.

This reality comes in contrast to the rosy views painted by many enthusiasts over the years about the promises of tech-mediated communication, which has evolved over recent decades from text-based chat to videoconferencing and the gathering of avatars in virtual landscapes. The dream is to create ever more immersive experiences that allow someone to feel they are really in a different place with another person, through techniques ranging from augmented reality (which projects data or images onto a real-life scene) to virtual reality (where users typically wear goggles to make them feel they are elsewhere) to full-blown systems that engage a user’s sense of touch and smell.

The vision was that we would all be sitting in holographic boardrooms by now; that university students would be blowing up virtual labs rather than physical ones; that people would feel as comfortable navigating virtual worlds and friendships as in-person realities. On the whole, this hasn’t yet come to pass. Highly immersive technologies have made inroads in niche applications like simulation training for sports and medicine, along with the video gaming industry — but they aren’t mainstream for everyday communication. The online environment Second Life, launched in 2003, offered a parallel online world as a companion space to the physical one; it saw monthly active users drop from a million in 2013 to half that in 2018. Google Glass, which aimed to provide augmented reality for wearers of a special camera-enabled pair of glasses, launched in 2013 mostly to widespread mockery.

As Zoom fatigue has highlighted, the road to more immersive technologies for communication isn’t always a smooth one. But experts across fields from education to communication, computer science and psychology agree that deeper immersion still holds great promise for making people feel more connected, and they are aiming to help navigate the bumpy road to its best adoption. “I hope that no pandemic ever happens again, but if it does, I hope we have better technologies than we have now,” says Fariba Mostajeran, a computer scientist who studies human-computer interaction and virtual reality at Hamburg University. “For people who live alone, it has been really hard not to be able to hug friends and family, to feel people. I’m not sure if we can achieve that 10 years from now, but I hope we can.”

For distanced communication to live up to its full potential, “there will need to be an evolution,” Bailenson writes me, “both on the technology and on the social norms.”

Sudden shift

It takes a while for societies to adapt to a new form of communication. When the telephone was first invented, no one knew how to answer it: Alexander Graham Bell suggested that the standard greeting should be “Ahoy.” This goes to show not just that social use of technology evolves, but also that the inventors of that technology are rarely in the driver’s seat.

Email has danced between being extremely casual and being as formal as letter-writing as perceptions, expectations and storage space have shifted. Texting, tweeting and social media platforms like Facebook and Snapchat are all experiencing their own evolutions, including the invention of emojis to help convey meaning and tone. Ever since prehistoric people started scratching on cave walls, humanity has experimented with the best ways to convey thoughts, facts and feelings.

Some of that optimization is based on the logistical advantages and disadvantages of different platforms, and some of it is anchored in our social expectations. Experience has taught us to expect business phone calls to be short and sharp, for example, whereas we expect real-life visits with family and friends to accommodate a slow exchange of information that may last days. Expectations for video calls are still in flux: Do you need to maintain eye contact, as you would for an in-person visit, or is it OK to check your email, as you might do in the anonymity of a darkened lecture hall?

Travel often demarcates an experience, focusing attention and solidifying work-life boundaries — whether it’s a flight to a conference or a daily commute to the office. As the online world has sliced those rituals away, people have experimented with “fake commutes” (a walk around the house or block) to trick themselves into a similarly targeted mindset.


But while the evolution of technology use is always ongoing, the pandemic threw it into warp speed. Zoom reported having 300 million daily meeting participants by June 2020, compared to 10 million in December 2019. Zoom itself hosted its annual Zoomtopia conference online-only for the first time in October 2020; it attracted more than 50,000 attendees, compared to about 500 in 2017.

Some might see this as evidence that the tech is, thankfully, ready to accommodate lockdown-related demands. But on the other side of the coin, people have been feeling exhausted and disrupted.

Visual creatures

Humans are adapted to detect a lot of visual signals during conversations: small twitches, micro facial expressions, acts like leaning into a conversation or pulling away. Based on work starting in the 1940s and 1950s, researchers have estimated that such physical signals made up 65 to 70 percent of the “social meaning” of a conversation. “Humans are pretty bad at interpreting meaning without the face,” says psychologist Rachael Jack of the University of Glasgow, coauthor of an overview of how to study the meaning embedded in facial expressions in the Annual Review of Psychology. “Phone conversations can be difficult to coordinate and understand the social messages.”

People often try, subconsciously, to translate the visual and physical cues we pick up on in real life to the screen. In virtual worlds that support full-bodied avatars that move around a constructed space, Bailenson’s work has shown that people tend to intuitively have their virtual representatives stand a certain distance from each other, for example, mimicking social patterns seen in real life. The closer avatars get, the more they avoid direct eye contact to compensate for invasion of privacy (just as people do, for example, in an elevator).

Yet many of the visual or physical signals get mixed or muddled. “It’s a firehose of nonverbal cues, yet none of them mean the thing our brains are trained to understand,” Bailenson said in his keynote. During videoconferencing, people are typically looking at their screens rather than their cameras, for example, giving a false impression to others about whether they are making eye contact or not. The stacking of multiple faces on a screen likewise gives a false sense of who is looking at whom (someone may glance to their left to grab their coffee, but on screen it looks like they’re glancing at a colleague).

And during a meeting, everyone is looking directly at everyone else. In physical space, by contrast, usually all eyes are on the speaker, leaving most of the audience in relative and relaxed anonymity. “It’s just a mind-blowing difference in the amount of eye contact,” Bailenson said; he estimates that it’s at least 10 times higher in virtual meetings than in person.

Research has shown that the feeling of being watched (even by a static picture of a pair of eyes) causes people to change their behavior; they act more as they believe they are expected to act, more diligently and responsibly. This sounds positive, but it also causes a hit to self-esteem, says Bailenson. In effect, the act of being in a meeting can become something of a performance, leaving the actor feeling drained.

For all these reasons, online video is only sometimes a good idea, experts say. “It’s all contextual,” says Michael Stefanone, a communications expert at the University at Buffalo. “The idea that everyone needs video is wrong.”

Research has shown that if people need to establish a new bond of trust between them (like new work colleagues or potential dating partners), then “richer” technologies (video, say, as opposed to text) are better. This means, says Stefanone, that video is important for people with no prior history — “zero-history groups” like him and me. Indeed, despite a series of emails exchanged prior to our conversation, I get a different impression of Stefanone over Zoom than I did before, as he wrangles his young daughter down for a nap while we chat. I instantly feel I know him a little; this makes it feel more natural to trust his expertise. “If you’re meeting someone for the first time, you look for cues of affection, of deception,” he says.

But once a relationship has been established, Stefanone says, visual cues become less important. (“Email from a stranger is a pretty lean experience. Email from my old friend from grade school is a very rich experience; I get a letter from them and I can hear their laughter even if I haven’t seen them in a long time.”) Visual cues can even become detrimental if the distracting downsides of the firehose effect, alongside privacy issues and the annoyance of even tiny delays in a video feed, outweigh the benefits. “If I have a class of 150 students, I don’t need to see them in their bedrooms,” says Stefanone. He laughs, “I eliminate my own video feed during meetings, because I find myself just staring at my hair.”

In addition to simply turning off video streams occasionally, Bailenson also supports another, high-tech solution: replacing visual feeds with an automated intelligent avatar.

The idea is that your face onscreen is replaced by a cartoon; an algorithm generates facial expressions and gestures that match your words and tone as you speak. If you turn off your camera and get up to make a cup of tea, your avatar stays professionally seated and continues to make appropriate gestures. (Bailenson demonstrates during his keynote, his avatar gesturing away as he talks: “You guys don’t know this but I’ve stood up…. I’m pacing, I’m stretching, I’m eating an apple.”) Bailenson was working with the company Loom.ai to develop this particular avatar plug-in for Zoom, but he says that specific project has since been dropped. “Someone else needs to build one,” he later tells me.

Such solutions could be good, says Jack, who studies facial communication cues, for teachers or lecturers who want visual feedback from their listeners to keep them motivated, without the unnecessary or misleading distractions that often come along with “real” images.

All together now

This highlights one of the benefits of virtual communication: If it can’t quite perfectly mimic real-life interaction, perhaps it can be better. “You take things out that you can’t take out in real life,” says Jack. “You can block people, for example.” The virtual landscape also offers the potential to involve more people in more activities that might otherwise be unavailable to them because of cost or location. Science conferences have seen massive increases in participation after being forced to move their events online. The American Physical Society meeting, for example, drew more than 7,200 registrants in 2020, compared with an average of 1,600 to 1,800 in earlier years.

In a November 2020 online gathering of the American Anthropological Association, anthropologist and conference chair Mayanthi Fernando extolled the virtues of virtual conferences in her opening speech, for boosting not just numbers but also the range of people attending. That included people from other disciplines, people who would otherwise be unable to attend due to childcare issues, and people — especially from the Global South — without the cash for in-person attendance. Videoconferencing technologies also tend to promote engagement, she noted, between people of different ages, languages, countries and ranks. “Zoom is a great leveler; everyone is in the same sized box,” she said. (The same meeting, however, suffered from “bombers” dropping offensive material into chat rooms.)

Technology also offers huge opportunity for broadening the scope and possibilities of education. EdX, one of the largest platforms for massive open online courses (MOOCs), started 2020 with 80 million enrollments; that went up to 100 million by May. Online courses are often based around prerecorded video lectures with text-based online chat, but there are other options too: The Open University in the UK, for example, hosts OpenSTEM Labs that allow students to remotely access real scanning electron microscopes, optical telescopes on Tenerife and a sandbox with a Mars rover replica.

There is great potential for online-based learning that isn’t yet being realized, says Stephen Harmon, interim executive director of the Center for 21st Century Universities at Georgia Tech. “I love technology,” says Harmon. “But the tech we use [for teaching] now, like BlueJeans or Zoom, they’re not built for education, they’re built for videoconferencing.” He hopes to see further development of teaching-tailored technologies that can monitor student engagement during classes or support in-class interaction within small groups. Platforms like Engage, for example, use immersive VR in an attempt to enhance a student’s experience during a virtual field trip or meeting.

Full immersion

For many developers the ultimate goal is still to create a seamless full-immersion experience — to make people feel like they’re “really there.” Bailenson’s Virtual Human Interaction Lab at Stanford is state of the art, with a pricey setup including goggles, speakers and a moveable floor. Participants in his VR experiments have been known to scream and run from encounters with virtual earthquakes and falling objects.

There are benefits to full immersion that go beyond the wow factor. Guido Makransky, an educational psychologist at the University of Copenhagen, says that virtual reality’s ability to increase a person’s sense of presence, and their agency, when compared to passive media like watching a video or reading a book, is extremely important for education. “Presence really creates interest,” he says. “Interest is really important.” Plenty of studies have also shown how experiencing life in another virtual body (of a different age, for example, or race) increases empathy, he says. Makransky is now working on a large study to examine how experiencing the pandemic in the body of a more vulnerable person helps to improve willingness to be vaccinated.

But VR also has limitations, especially for now. Makransky notes that the headsets can be bulky, and if the software isn’t well designed the VR can be distracting and add to a student’s “cognitive load.” Some people get “cyber sickness” — nausea akin to motion sickness caused by a mismatch between visual and physical motion cues. For now, the burdens and distractions of immersive VR can make it less effective at promoting learning than, for example, a simpler video experience.

Mostajeran, who looks primarily at uses of VR for health, found in a recent study that a slideshow of forest photos was more effective at reducing stress than an immersive VR forest jaunt. For now, she says, lower-immersion technology works as well as or better than immersive VR for calming patients. But, again, that may be just because VR technology is new, unfamiliar and imperfect. “When it’s not perfect, people fall back on what they trust,” she says.

All technology needs to surpass a certain level of convenience, cost and sophistication before it’s embraced — it was the same for video calling. Video phones go much further back than most people realize: In 1936, German post offices hosted a public video call service, and AT&T had a commercial product on the market around 1970. But these systems were expensive and clunky, and few people wanted to use them: They were too far ahead of their time to find a market.


Both Mostajeran and Makransky say they’re impressed with how much VR technologies have improved in recent years, getting lighter, less bulky and wireless. Makransky says he was surprised by how easy it was to find people who already own VR headsets and were happy to participate in his new vaccination study — 680 volunteers signed up in just a few weeks. As the technology improves and more people have access to it and get comfortable with it, the studies and applications are expected to boom.

Whether that will translate to everyone using immersive VR for social and business meetings, and when, is up for debate. “We just missed it by a year or two, I think,” said Bailenson optimistically after his keynote presentation.

For now, the researchers say, the best way to get the most from communication media is to be aware of what you’re trying to achieve with it and adapt accordingly. People in long-distance relationships, for example, get value out of letting their cameras run nonstop, letting their partners “be in the room” with them even while they cook, clean or watch TV. Others, in the business world, aim for a far more directed and efficient exchange of information. Video is good for some of these goals; audio-only is best for others.

“This has been a heck of an experiment,” says Stefanone about the last year of online engagement. For all the pitfalls of social media and online work, he adds, there are definitely upsides. He, for one, won’t be jumping on any planes when the pandemic ends — he has proved he can do his academic job effectively from home while also spending time with his daughter. But it’s hard to know where the technology will ultimately take us, he says. “The way people adapt never follows the route we expect.”

This article is part of Reset: The Science of Crisis & Recovery, an ongoing Knowable Magazine series exploring how the world is navigating the coronavirus pandemic, its consequences and the way forward. Reset is supported by a grant from the Alfred P. Sloan Foundation.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.