Friday, April 28, 2023

New antifungal medications are sorely needed


Better treatment for Candida auris, Aspergillus and other dangerous fungal pathogens is slow to come, even as rates of drug resistance rise. New therapies are in the pipeline, and hospital practices can help.

As rates of antibiotic resistance grow alarmingly among disease-causing bacteria, dangerous fungi also are evolving stronger defenses, with a lot less fanfare.

Every year, infections by molds and yeasts such as Aspergillus and Candida kill more than 1.5 million people globally, more than malaria and on a par with tuberculosis. And new drug-resistant strains are emerging, such as Candida auris, first detected in Japan in 2009 and since then reported on every continent except Antarctica. Between September 1, 2020, and August 31, 2021, the number of reported C. auris cases in the United States soared to over 1,100 in 21 states, up from 63 cases in four states from 2013 to 2016.

With Covid-19 cases stressing health care systems, changes in hospital infection control have given drug-resistant fungi a leg up, too. In 2019, the Centers for Disease Control and Prevention listed C. auris as an urgent threat; it was the first time the agency had done so for a pathogenic fungus. In December 2020, the CDC reported increased spread of C. auris during the pandemic.

Put simply, “fungal infections are a massive public health problem,” says Johanna Rhodes, a genomic epidemiologist of fungal infections at Imperial College London. There are few drugs to fight them, and the pipeline for development of new ones has been frustratingly slow.

Today, though, a few novel antifungals are moving through clinical trials and researchers are developing new approaches to drug discovery that may ultimately strengthen the antifungal arsenal. In the meantime, health care organizations are working on improved practices to help stall the development of resistance in these problematic microbes.

Few weapons, more victims

Fungal pathogens become life-threatening when they get inside the body, infecting the bloodstream and internal organs. Such invasive infections have become more common due to the evolution of drug resistance as well as life-saving medical advancements such as organ transplants and cancer therapies that have created a growing population of immunocompromised people. The armamentarium of drugs to fight them is limited — and dated.

The first antifungal for treating invasive infections, amphotericin B, came out in 1958, and works against a variety of fungi. A member of the antifungal class known as polyenes, amphotericin B binds to key molecules — ergosterols — and extracts them from the fungal cell membrane, thereby damaging the cell’s functions. The drug’s toxicity to patients limits its use.

Beginning in the late 1970s, doctors also had a new, less toxic class of antifungals to turn to: azoles, which prevent fungal cells from making ergosterol. Then, in the early 2000s, a third class, the echinocandins, was approved by the US Food and Drug Administration for medical use. These drugs act by blocking production of a carbohydrate called beta-D-glucan, a vital part of fungal cell walls.

Resistance to azoles slowly emerged in the 1990s, due in part to agriculture. The industry had begun using azole fungicides to protect crops from fungi such as Aspergillus fumigatus, a common mold, in the 1970s. Later, azole-resistant A. fumigatus infections in people began cropping up, becoming more common after 2003. People with no previous exposure to medical azoles were turning up with resistant infections, a telling sign that they had picked up A. fumigatus from the environment, for example from gardens or soil.

Medical use of antifungals has also pushed pathogens to evolve new defenses. Problems include the failure of patients to finish a course of drugs, as well as improper prescribing — for example starting antifungals in someone with an asymptomatic infection, prescribing the wrong drug or dose, or prescribing too long a course.

Physicians must also strike a delicate balance between preventing deadly infections in immunocompromised patients and trying to limit opportunities for fungi to evolve resistance. They often prescribe antifungals as a preventive measure in such patients, a practice that, though protective, also encourages resistant fungi if use is prolonged.

In hospitals, invasive fungal infections featuring drug resistance are increasingly problematic. C. auris infections, almost always acquired in health care facilities, increased by over 100 cases each year from 2017 to 2019, when 469 cases were reported, jumping to 746 cases in 2020, according to the CDC. And in the 12 months from September 2020 through August 2021 there were 1,156 reported cases. Catheters, intravenous lines and ventilators provide ample opportunities for pathogens to enter new hosts. “These are absolute highways for these environmental agents to get into the human body,” says Rodney Rohde, a microbiologist at Texas State University.

Covid-19-associated invasive fungal infections have cropped up too — most commonly pulmonary aspergillosis (generally Aspergillus fumigatus) but also black fungus (caused by soil fungi called mucormycetes) and infections with Candida, including C. auris. According to the CDC, overstretched health care facilities have struggled to uphold normal infection control procedures, such as cleaning medical equipment and rooms and screening for C. auris.

Today, 90 percent of C. auris samples from infected patients are resistant to at least one antifungal drug, typically fluconazole, and 30 percent are resistant to at least two. But during the pandemic, C. auris infections that are resistant to all antifungal drugs have also been detected — the first examples of pan-resistant C. auris transmission in US health care facilities.

People with invasive fungal infections “are very sick patients and we don’t have very good diagnostic tests. We don’t have very good treatment options,” says Jose Lopez-Ribot, a medical mycologist at the University of Texas at San Antonio. And this isn’t just a risk for people with compromised immunity. “Any of us, even the general public, can go in for a routine surgery and can end up sick in a hospital — that’s when you’re going to be at risk for these infections,” says Tom Chiller, chief of the CDC’s mycotic diseases branch. “You want there to be drugs available for you to use.”

Hospitals can help to prevent drug resistance

To keep existing drugs useful for as long as possible, hospitals need to adopt careful practices. “All hospitals have antimicrobial stewardship programs where usually an infectious-disease doctor, often with an infectious-disease pharmacist, will try to limit antibiotic use to situations where it’s strictly necessary,” says Stuart Levitz, an infectious-disease physician at UMass Memorial Medical Center, who wrote about fungal infections and immunity in the 2018 Annual Review of Immunology.

Such programs require early and accurate diagnoses and tracking of fungal infections, reports of antifungal use and feedback to physicians on their prescribing habits. Levitz, for example, helps to inform antifungal prescribing policies as part of his hospital’s stewardship efforts, determining which patients should receive them. His hospital’s microbiology lab is on the lookout for drug resistance and tracks its patterns within the hospital, while the clinical pharmacists track antifungal prescriptions — including patient numbers, dosages and drug costs.

Such stewardship attention to fungi has generally taken second place to bacteria in hospitals — but that is changing, Rhodes says: “We’re starting to see more antifungal stewardship programs avoid inappropriate use of antifungals, especially in immunocompromised patients.”

But though antifungal stewardship programs can cut costs and decrease antifungal use, which is crucial for preventing resistance, a review of stewardship programs at several hospitals found that they don’t in and of themselves decrease deaths, and thus don’t erase the need for better drugs.

Antifungal drugs lag behind

Even as rates of fungal infections and drug resistance are increasing, the speed of drug development is not. “We’re basically limited to three classes, and the spectrum of activity of each of the classes doesn’t cover the whole gamut of fungal infection,” says Lopez-Ribot. Although about 13.5 million people globally develop life-threatening fungal infections each year, the business incentive is lacking because physicians use the drugs for relatively few patients.

The pandemic has also drawn pharmaceutical companies toward vaccine and antiviral development, away from other work, Rhodes says. “Even prior to Covid, a lot of the big pharmaceutical companies had basically abandoned their antifungal drug discovery pipelines…. It’s a sorry state of affairs.”

Scientific challenges also hamper drug development. Fungi are eukaryotes — they have cells with nuclei — and so are biochemically far more similar to humans than bacteria are. This makes it harder to design drugs that won’t also harm a patient. Until recently, the Food and Drug Administration had approved only one new antifungal drug, and no new antifungal drug class, in the past two decades.

But today, researchers are testing a few new types of antifungal drugs that act in novel ways, and the FDA is prioritizing their approval process. “There were some smaller companies that have taken some of these drugs into clinical trials, and they’re looking very promising,” Chiller says.

For example, fosmanogepix from Amplyx Pharmaceuticals (recently acquired by Pfizer) showed some success in a small, Phase 2 clinical trial of 20 patients with Candida blood infections — 16 tested negative for Candida after two weeks and survived the infection. The drug acts by blocking a key fungal enzyme and so impedes the pathogen’s ability to stick to tissue surfaces in the body. Researchers are now recruiting 50 patients with invasive infections of Aspergillus and other molds to test the effectiveness of fosmanogepix in a Phase 2 trial.

Another company, F2G, has developed the antifungal olorofim, which targets an enzyme that fungi need to make some of the building blocks of DNA and RNA. Researchers are recruiting 200 participants with invasive fungal infections that aren’t responding to other treatments for a Phase 2 trial of the drug.

And in June 2021, the FDA approved an antifungal drug in a new class, ibrexafungerp from Scynexis, for vaginal yeast infections caused by Candida. The drug targets the same cell wall component as echinocandins do — beta-D-glucan — but it binds to a different site. There is hope that it will treat invasive infections, too.

In a Phase 3 trial — a larger clinical trial with controls that is the final step before the FDA approves a drug — researchers are recruiting 200 participants to test how well ibrexafungerp performs against severe invasive fungal infections that haven’t responded to other medications.

And researchers are still looking for new drugs. Lopez-Ribot, for example, is scouring libraries of chemical compounds that don’t kill fungi or stop their growth but disarm them so they can’t harm the human host. He works with Candida albicans, whose cells assemble into microbial mats — biofilms — that adhere firmly to surfaces, making them difficult to clear. In the body, they can also grow string-like filaments, a growth pattern associated with infection severity. His group is searching for molecules that rob C. albicans of biofilm- or filament-forming abilities, or both. One plus to this approach, he says, is that it doesn’t exert the same degree of evolutionary pressure for resistance as traditional antifungals.

Of the drugs in clinical trials, Lopez-Ribot says that the most advanced are not those with novel mechanisms but ones within existing classes. Pathogens may soon evolve resistance to these new iterations, but something is better than nothing.

“My philosophy is very simple,” he says. “We have so few, that any type of addition to the antifungal armamentarium should be welcome.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Parmesan Crusted Chicken


Not only is this Parmesan Crusted Chicken dish delicious, it is super simple to make.  For more meal recipe ideas, go to culinary.net.

Parmesan Crusted Chicken

  • 1/2 cup Hellmann's® or Best Foods® Real Mayonnaise
  • 1/4 cup grated Parmesan cheese
  • 4 boneless, skinless chicken breast halves (about 1 1/4 pounds)
  • 4 teaspoons Italian seasoned dry bread crumbs
  1. Preheat oven to 425°F.
  2. Combine mayonnaise with cheese in medium bowl. Arrange chicken on baking sheet. Evenly top with mayonnaise mixture, then sprinkle with bread crumbs.
  3. Bake 20 minutes or until chicken is thoroughly cooked.

SOURCE:
Culinary.net

How Climate Change Impacts Birds, Their Feeding Habits and How to Help from Home

(Joan Casanova) Bird feeding is a common practice in the United States, with more than 59 million Americans participating, according to the U.S. Fish & Wildlife Service. In addition to providing aesthetic and recreational benefits, bird feeding can have positive impacts on bird populations.

According to the National Audubon Society, birds provide important ecosystem services, such as pollination, pest control and seed dispersal. In fact, around 87% of flowering plants rely on animal pollinators, including birds, to reproduce and grow, according to a study published in “Science.” Birds also consume fruits and berries then spread the seeds, which helps maintain biodiversity and promotes the growth of new plants.

Because birds are considered good indicators of ecosystem health, changes in bird populations and behaviors can signal changes in the environment, such as pollution, habitat loss and climate change. As temperatures, weather patterns and ecosystems change, the availability of food for birds can be affected, which may alter their behavior.

Feeding birds can be a beneficial practice that helps them cope with climate change. Consider these benefits:

  • Supplemental Food: Bird feeders provide a supplemental source of food for birds when natural food sources may be scarce due to prolonged droughts or severe storms. Bird feeding can help birds maintain energy levels, especially during breeding or migration when nutritional needs are higher.
  • Range Shifts: Climate change can cause shifts in the distribution and abundance of bird species. Feeders can serve as “refuges” for birds, providing reliable food sources as they move in search of suitable habitats.
  • Behavioral Adaptations: Some species may alter their feeding behaviors due to changes in timing of insects hatching or plants flowering, which can affect the availability of natural food sources. Bird feeders can help bridge these gaps, providing a stable source of food when traditional sources are disrupted.

Feeders
To attract more birds this season, it’s important to offer quality feed in a variety of bird feeder types placed at different heights.

Traditional tube feeders are basic, all-purpose, must-have feeders that work well for finches, nuthatches and other small birds that cling. Made with state-of-the-art materials to prevent warping and discoloration, Cole’s Terrific Tube Feeder features a quick-clean removable base.

Simply push a button and the bottom of the feeder comes off for easy access. Rinse well with soapy water, submerge in a 9:1 water-bleach solution, rinse and dry. Then reattach the bottom; there’s no disassembly or assembly of multiple parts necessary. Regular cleaning of feeders is essential to prevent mold, germs and disease.

Another option, bowl feeders, can serve not only seeds, but also dried mealworms, fruit and suet in cake or kibble form. For example, Cole’s Bountiful Bowl Feeder comes with an adjustable dome cover you can raise or lower to protect from rain and prevent larger birds and squirrels from getting to the food.

Popular Foods
In addition to feeders, offering a variety of foods is vital for inviting different species to your backyard.

  • Birdseed: Not all birdseed is created equal. Look for quality blends without filler seeds like red millet and oats. All-natural seed, containing no chemicals or mineral oil, is safe and more appealing to birds. Consider researched, specially formulated options like all-natural black oil sunflower, Cole’s “Hot Meats” (sunflower meats infused with habanero chile peppers) or Special Feeder blend, which is packed with black oil sunflower, sunflower meats, black striped sunflower, raw peanuts, safflower and pecans.
  • Dried Mealworms: Full of energy, essential nutrients, fats and proteins, mealworms are a preferred food for adult songbirds. Dried mealworms are easy to feed, less messy and lack the “ick” factor of live worms.
  • Fresh Fruit: Apple and orange halves and chunks of banana are favorites for orioles and tanagers.  
  • No-Melt Suet: Perfect for insect-eating birds, high-fat food provides abundant calories and rich nutrition. 

Don’t forget, birds need water just as much as humans. Drinking water helps regulate body processes, improves metabolism and maintains health. Birds also use water for preening and bathing, and on hot days, standing in cool water or taking a quick splash can help them keep cool.

Find more solutions to bring birds to your backyard at ColesWildBird.com.

SOURCE:
Cole’s Wild Bird Products

Roles for robots

Robots are getting geared up for a variety of human health and social uses

Companions for the elderly

Robots could offer older people everything from companionship to health monitoring to assistance with walking. People talk to robots and even miss them when they’re not around. Some are concerned, however, that robots might take over human healthcare jobs, cause accidents or reduce person-to-person contacts.

Social skills for kids with autism

People with autism have difficulty interacting with other humans. Might a robot with simpler, more predictable behaviors help with the development and practice of social skills? While there’s lots of interest in this kind of robot, studies so far have been small, and yielded mixed results.

Remote-control helpers in the hospital

Telerobots are controlled by a human who isn’t present in person, such as a doctor performing rounds from afar. Hospital patients seem to like robots’ attentions just as much as in-person visits. Telerobots can also allow sick children to still attend school remotely, keeping up with the class academically and socially.

Exercise coaches after stroke

Wearable robots can help people who’ve had a stroke to exercise, improving their walking and hand and arm movements. While the robots are not meant to be social, researchers find that people are more likely to use one if it matches their personality — a quiet, nurturing robot for an introverted patient, for example.

Robo-teachers

Building robots in the classroom can encourage children’s technical skills and interests. Robots can also play the role of peer or teacher, encouraging or instructing students, for example in foreign-language study. One preliminary study suggested that humanoid robots boosted motivation, community and self-expression in low-income students.

Consumer guides

People like robots in malls and museums. In one study, more than 90 percent of shoppers wanted to see the robot again. In another, shoppers tended to perceive a bot as a “mascot” for the shopping center. Many even preferred the robot to a human. The reason? Robots don’t seem to judge; they treat everyone the same.

—Amber Dance

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Travelling abroad? Don’t be tempted to pay your way using your home currency

Catarina Belova/Shutterstock
Dirk Gerritsen, Utrecht University; Bora Lancee, Utrecht University, and Coen Rigtering, Utrecht University

Part of the joy of travelling comes from experiencing the unfamiliar – a different climate, culture or cuisine. But when it comes to paying for things abroad, we might feel more comfortable using the currency we are most familiar with, the one we use at home.

This has recently become a common – and expensive – option for tourists withdrawing money from cash machines, or paying electronically in shops and restaurants.

When a restaurant bill arrives, for example, foreign customers may be offered the choice on the card reader to pay in their home currency rather than the local one. This feature, known as “dynamic currency conversion” or “currency choice,” sounds appealing at first – a service which has done the hard work for you, converting the bill to a currency you understand, giving you a better idea of how much money you are spending.

But it comes at a price – as the fees charged for this convenience can be exorbitant. In fact, one study shows that the average fee applied to this kind of conversion is a whopping 7.6%, more than double the cost of paying in the local currency (usually between 1.5% and 3%).

So suppose a French traveller goes out for dinner in a British town, and the final bill comes to £88.43, the equivalent of €100. Paying in UK currency, which would then be converted to euros by the French diner’s bank, would lead to a payment of around €102. But using the dynamic currency conversion to pay the restaurant bill directly in euros would end up costing them €107.60.
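
To make the arithmetic concrete, here is a minimal Python sketch. The 2% home-bank fee and the 7.6% dynamic currency conversion fee are illustrative assumptions drawn from the ranges quoted above, not actual rates from any bank or provider.

    # Toy comparison of paying abroad in the local currency versus
    # dynamic currency conversion (DCC). Fee rates are illustrative
    # assumptions, not quotes from any bank or card provider.

    BILL_GBP = 88.43              # restaurant bill in pounds
    EUR_PER_GBP = 100 / 88.43     # mid-market rate, so the bill equals EUR 100

    BANK_FEE = 0.02               # assumed home-bank fee (1.5%-3% range above)
    DCC_FEE = 0.076               # assumed average DCC fee from the study cited

    pay_local = BILL_GBP * EUR_PER_GBP * (1 + BANK_FEE)   # ~EUR 102.00
    pay_dcc = BILL_GBP * EUR_PER_GBP * (1 + DCC_FEE)      # ~EUR 107.60

    print(f"Pay in pounds, home bank converts: EUR {pay_local:.2f}")
    print(f"Pay via DCC in euros:              EUR {pay_dcc:.2f}")
    print(f"Extra cost of DCC:                 EUR {pay_dcc - pay_local:.2f}")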

Despite the high fees, our research shows that more than half of international customers still choose to pay in their familiar home currency. The most obvious explanation for this is an understandable preference for the familiar when dealing with money abroad.

But it is also true that the fees are not explicitly shown to customers. That is, tourists may see the applied exchange rate, but they are not shown the hidden fees or how that exchange rate compares with others.

And while expensive for tourists, the currency choice “service” can be highly lucrative for those who operate it. The companies which provide dynamic currency conversion options earn significant conversion revenues – a portion of which is often shared with the business where the transaction takes place.

Sources indicate that extra revenues for retailers come to around 1% of the transaction value. We have also been told of well-known department stores training employees to actively encourage foreign customers to pay for purchases in their home currency.

Greater transparency

And despite the high conversion fees involved with dynamic currency conversion, most government regulators around the world have been hesitant to intervene. One possible reason for this is that regulation would be seen as potentially hitting the profits of local businesses.

The exception is the European Union (EU), which considers excessive transaction costs to be a barrier to the development of businesses and aims to protect European consumers.

The latest EU regulations (not yet enforced) aim to enhance transparency by including extra information about the costs of currency choice on card readers and ATMs.

This is a step in the right direction. But we would in fact encourage a reduction in the amount of information to make things simpler, so that customers are made aware purely of the percentage fee being added if they choose to pay in their own currency. We also think there should be maximum conversion charges to protect unaware customers from excessive fees.

With the continued growth of international travel, it is crucial to find ways to help people make informed financial decisions when dealing with exchange rates and making payments outside of their currency zone.

But for now, travellers are likely to spend more of their money abroad than they need to, because of something they intuitively feel will make a transaction simpler and less time consuming.

So if you’re on holiday or travelling for work, our advice is to decline the option of paying in your home currency and instead opt for the more reasonable conversion fees charged by your bank. Your travel experience could end up much cheaper if you do.

Dirk Gerritsen, Assistant Professor of Finance and Financial Markets, Utrecht University; Bora Lancee, Researcher, Utrecht University, and Coen Rigtering, Assistant Professor in Strategy and Organization, Utrecht University

This article is republished from The Conversation under a Creative Commons license. 

Why Kurt Vonnegut’s advice to college graduates still matters today

A generation told not to trust anyone over 30 nevertheless adored Vonnegut. Ulf Andersen/Getty Images
Susan Farrell, College of Charleston

Kurt Vonnegut didn’t deliver the famous “Wear Sunscreen” graduation speech published in the Chicago Tribune that was often mistakenly attributed to the celebrated author. But he could have.

Over his lifetime, he gave dozens of quirky commencement addresses. In those speeches, he made some preposterous claims. But they made people laugh and made them think. They were speeches the graduates remembered.

Having studied and written about Vonnegut for years, I wish he had been my commencement speaker. I graduated from Austin College, a small school in North Texas. I don’t even remember who gave my class’s graduation speech, much less a single word the speaker said. I suspect many others have had – and will have – similar experiences.

Young people, college students especially, loved Vonnegut. During the early and mid-1960s, he commanded an avid and devoted following on campuses before he had produced any bestsellers. Why was a middle-aged writer born in 1922 adored by a counterculture told not to trust anyone over 30? Why did he continue to appeal to younger generations until his death?

Their parents’ generation

Vonnegut, who died just before commencement season in 2007, was nearly 50 years old when his groundbreaking anti-war novel, “Slaughterhouse-Five,” was published in 1969.

A cultural touchstone, the novel changed the way Americans think and write about war. It helped usher in the postmodern style of literature with its playful, fragmented form, its insistence that reality is not objective and that history is not monolithic, and its self-reflection on its own status as art. Like Andy Warhol’s soup cans, “Slaughterhouse-Five,” with its jokes, drawings, risqué limericks and flying saucers, blurs the line between high and low culture.

Cited as one of the top novels of the 20th century, “Slaughterhouse-Five” has been transformed into film, theatrical plays, a graphic novel and visual art. It has inspired rock bands and musical interpretations. Vonnegut’s recurring refrain, “So it goes,” used 106 times in the novel, has entered the popular lexicon. The book has been banned, burned and censored.

In many ways, though, Vonnegut had more in common with the parents of the college students he addressed than with the students themselves. Father to six children – three of his own and three nephews who joined the family after his sister Alice and her husband died – Vonnegut had studied biochemistry at Cornell and had worked in corporate public relations. He continued to believe all his life in the civic virtues he learned as a student at Shortridge High School in Indianapolis.

He had the credibility of a World War II veteran, a member of what journalist Tom Brokaw would later call the “Greatest Generation.” Captured by the Germans during the Battle of the Bulge, he was sent to Dresden as a prisoner of war. There he was starved, beaten and put to work as a slave laborer. He survived the Allied firebombing of the city in February 1945 and was forced to help excavate hundreds of bodies of men, women and children who had been burned alive, suffocated and crushed to death.

Fool or philosopher?

If Vonnegut was, like the students’ fathers, a family man and a veteran, perhaps he also embodied the dad that students in 1969 dreamed their own fathers could be: funny, artistic, anti-establishment and anti-war.

Kurt Vonnegut at Bennington College in 1970. Bennington College Archive, CC BY-SA

Vonnegut had the look – sad, kind eyes under that mop of uncontrollable hair, the full droopy mustache. A photo taken just before he delivered a commencement address at Bennington College in 1970 shows him wearing a loud striped jacket, reading glasses tucked neatly in its pocket, with a cigarette dangling at his fingertips.

Looking like a cross between Albert Einstein and a carnival huckster, Vonnegut had his contradictions on full display.

Was he a clown or a wise man? A fool or a philosopher?

The literary establishment did not quite know what to make of Vonnegut, either. A writer frequently dismissed by critics for his flying saucers and space aliens, for the simplicity of his prose, for pandering to what one reviewer called the “minimally intelligent young,” he was also praised for his inventiveness, for his lively and playful language, for the depth of feeling behind the zaniness, and for advocating decency and kindness in a chaotic world.

A forceful defense of art

As the U.S. was fighting what most college students believed was an unjust and imperialist war in Vietnam, Vonnegut’s message struck home. He used his own experience in World War II to destroy any notion of a good war.

“For all the sublimity of the cause for which we fought, we surely created a Belsen of our own,” he lamented, referencing the Nazi concentration camp.

The military-industrial complex, he told the graduates at Bennington, treats people and their children and their cities like garbage. Instead, Americans should spend money on hospitals and housing and schools and Ferris wheels rather than on war machinery.

In the same speech, Vonnegut playfully urged young people to defy their professors and fancy educations by clinging to superstition and untruth, especially what he considered the most ridiculous lie of all – “that humanity is at the center of the universe, the fulfiller or the frustrater of the grandest dreams of God Almighty.”

Vonnegut conceded that the military was probably right about the “contemptibility of man in the vastness of the universe.” Still, he denied that contemptibility and begged students to deny it as well by creating art. Art puts human beings at the center of the universe, whether they belong there or not, allowing people to imagine and create a saner, kinder, more just world than the one we really live in.

The generations, he told students at the State University of New York at Fredonia, are not that far apart and do not want that much from each other. Older people want credit for having survived so long – and often imaginatively – under difficult conditions. Younger people want to be acknowledged and respected. He urged each group not to be so “intolerably stingy” about giving the other credit.

A strain of sorrow and pessimism underlies all of Vonnegut’s fiction, as well as his graduation speeches. He witnessed the worst that human beings could do to one another, and he made no secret about his fears for the future of a planet suffering from environmental degradation and a widening divide between the rich and the poor.

If Vonnegut were alive and giving commencement speeches today, he would be speaking to college students whose parents and even grandparents he may have addressed in the past. Today’s graduates have lived through the COVID-19 pandemic and are drowning in social media. They face high housing costs and financial instability and are more depressed and anxious than previous generations.

I’m sure he would give these students the advice he gave so often over the years: to focus, in the midst of chaos, on what makes life worth living, to recognize the joyful moments — maybe by listening to music or drinking a glass of lemonade in the shade — and to say out loud, as his Uncle Alex taught him, “If this isn’t nice, what is?”

Kurt Vonnegut delivers a lecture at Case Western Reserve University in 2004, three years before his death.

Susan Farrell, Professor of English, College of Charleston

This article is republished from The Conversation under a Creative Commons license. 

Thursday, April 27, 2023

Trendy office layouts. Performance reviews that crush morale. There’s plenty of evidence on how to get the best out of workers, but businesses often ignore it.

Alan Colquitt is a student of the ways people act in the workplace. In a corporate career that spanned more than 30 years, the industrial-organizational psychologist advised senior managers and human resources departments about how to manage talent — always striving to “fight the good fight,” he says, and applying scientific rigor to his job.

Should executives ask employees for hiring referrals? Colquitt would consult the research to see if that would bring in better candidates. How to get more women into senior management? Colquitt would dig into studies that revealed the reasons for the stubborn endurance of the glass ceiling.

And then he hit a ceiling of his own.

A Fortune 500 firm where he worked had put in place a compensation system that was making employees miserable. Colquitt hadn’t been the one who implemented the system, which gave better raises and bonuses to those who scored high on a five-point performance scale. But people complained to him about it, incessantly. He decided to push upper management for change.

True to his roots, Colquitt reviewed the published literature and combed through internal data to show higher-ups where, exactly, things were going wrong. The evidence led him to a stark conclusion: The firm’s performance assessments and pay structure were completely counterproductive, reducing the happiness of individual workers and hurting the enterprise as a whole.

Colquitt recommended that his employer scrap the system. The company’s CEO backed him, but many others in the organization, including heads of the human resources and compensation departments, pushed back hard on a total revamp.

Colquitt kept arguing. No dice. After a couple of years, exhausted and ready for a career shift anyway, he gave up. He left corporate life and became an affiliate research scientist at the Center for Effective Organizations at the University of Southern California. He began teaching, speaking and writing, and published a book — Next Generation Performance Management: The Triumph of Science Over Myth and Superstition — in 2017.

“I was pretty outraged by it all,” he says today. “What we do in organizations has very little relationship to what the science says we should do.”

Getting companies to pay attention to science and engage in so-called “evidence-based management” is a challenge that has been driving industrial-organizational psychologists nuts for the better part of 20 years. Whether it’s hiring staff or determining salaries or investing in technology, managers making high-stakes decisions have a vast scholarly literature at their disposal: studies conducted over more than a century, in labs and in the field, vetted through peer review, that show whether pay incentives drive internal motivation (often not); whether diversity training works (only under the right conditions); whether companies should get rid of performance ratings (yes, Colquitt would say); how to train effective teams; and more.

Executives love hard numbers, and they desperately want to know how to keep their best employees, how to make more widgets, how to be more creative. So you’d think they’d lap up the research. “It’s hard to find students in graduate school who don't hear the idea of evidence-based management and say, ‘Yes! Of course!’” says Neil Walshe, an organizational psychologist who teaches the approach at the University of San Francisco School of Management.

Except most companies don’t. Occasionally, a firm will make a splash — the poster child these days is Google, which gets kudos for its data-centric, research-based “People Operations” (a.k.a. human resources) department. But most executives would rather just copy another company’s proven ideas than do the hard work of assessing evidence relevant to their own circumstances. Managers falter, victims of inertia (“but we’ve always done things this way!”), confusion (“industrial-organizational what?”), even downright hostility to expertise.

Interest in evidence-based practices may get a boost as more and more companies start delving into data analytics the way Google has, observing their own operations and putting the information to use in thoughtful ways. Perhaps, proponents hope, managers who open their minds to analytics will also open their minds to other new ways of thinking, seeing the value in evidence.

Or maybe human nature will keep getting in the way.

“There are really good reasons why people don’t use evidence, and changing that is hard,” Colquitt says.

The science of business

Science-inspired ideas have been applied to business since at least 1911, when mechanical engineer-turned-management consultant Frederick Winslow Taylor’s Principles of Scientific Management applied insights from engineering to improve efficiency, arguing that “the best management is a true science, resting upon clearly defined laws [and] rules.”

Taylor worked with Bethlehem Steel to optimize the volume of pig iron a worker could load onto railroad cars in a single day. He studied “the tiring effect of heavy labor” and tasked a young assistant to look up “all that had been written on the subject in English, German and French.” He conducted experiments to figure out how much iron a man could consistently haul and through a process of analysis determined that a “first-class man,” with the right strength and pacing, should be able to manage 47 tons. He urged managers to move workers who couldn’t handle such a load into other roles.

In later decades, researchers studied industrial behavior with ever-increasing rigor, the field of industrial-organizational psychology was born, and academic work increasingly informed business practices, within human resources and without. During World War I, the military used assessments to place soldiers in jobs where they’d be most successful. In the 1920s and 1930s, a series of famous studies at Western Electric’s Hawthorne plant in Cicero, Illinois, influenced managers to pay attention to social interactions on teams.

Japan’s postwar economic boom was also built on research, including the 1950s-era innovations of statistician W. Edwards Deming, who focused on product quality, among other things, as a driver of business success. It didn’t take long for American companies to adopt “total quality management,” as the trend became known in the United States, when Japanese firms began to threaten American predominance.

With the start of the twenty-first century, another concept was percolating up through the industrial-organizational psychology ranks: an approach called “evidence-based management,” articulated and championed by a Carnegie Mellon professor named Denise Rousseau. She gave an impassioned speech on the subject at the annual meeting of the Academy of Management in Honolulu in 2005.

Rousseau had always assumed that companies paid attention to the research she and her colleagues so carefully produced, but slowly it began to dawn on her that that wasn’t the case. It was an epiphany that “blew my mind,” she says today. Managers rejected scientifically proven strategies and refused to abandon practices the literature didn’t support: things like paying executives outlandishly more than rank-and-file employees. Bosses made decisions based on gut feeling. They copied blue chip companies like General Electric and Coca-Cola, even when what those outfits did had little relevance. They chased trends.

Rousseau and other industrial-organizational psychologists thought this seemed like a colossal waste of time, effort and money. They saw a model for change in a movement called evidence-based medicine, which urges physicians to consider the best available external evidence when deciding how to treat patients. Increasingly, beginning in the 1990s, doctors were expected to lean on research, not just go with their guts.

Shouldn’t businesspeople, similarly, take stock of the work that psychologists had so carefully produced?

“It’s time to start an evidence-based movement in the ranks of managers,” exhorted Stanford Business School professors Jeffrey Pfeffer and Robert I. Sutton in a 2006 article in the Harvard Business Review, a popular read for executives. The pair tartly opined that if doctors practiced medicine the way managers practiced management, the morgues would be packed and the courts brimming over with malpractice lawsuits. Managers should “relentlessly seek new knowledge and insight” to hone key practices, the scholars said.

The idea took root. Researchers like Rousseau wanted companies to read their papers, to be sure — but they also urged critical thinking in a broader sense, in which everything from science to internal surveys to gut feelings is considered in a systematic way, following a six-step process. The concerns of evidence-based managers are “Why are we doing this? What is the problem we're trying to solve? How do we know the solution will solve the problem?” says Eric Barends, managing director of the nonprofit Center for Evidence-Based Management, an international network of experts.

Worried about revealing trade secrets, companies are loath to talk openly about their real-world experiences with evidence-based management. Still, the approach can yield results.

Take the case of one company that couldn’t retain software engineers. It asked Cheryl Paullin, a Minneapolis-based industrial-organizational psychologist who heads the talent management and analytics division of the Human Resources Research Organization, a nonprofit that advises HR departments, if it should raise salaries to keep programmers on board.

Reviewing the academic literature and a variety of metrics within and outside of the company, Paullin and her colleagues determined that programmers were leaving not because of pay but because they weren’t getting the training they wanted. “We were able to say: ‘Don’t try to keep them by focusing on their pay — that’s not what’s causing the problem,’” Paullin says.

In another case, documented by Barends for a forthcoming evidence-based management textbook, Ctrip, the largest travel agency in China, conducted a randomized trial to help determine if allowing call center employees to work from home would improve their individual performance (several academic studies suggested it would). The company chose 250 employees for the three-month-long experiment, assigning those with even-numbered birthdays to work at home and those with odd-numbered birthdays to work in the office. Remote workers increased their performance by 13.5 percent over their colleagues in the office and used fewer sick days, too. “Stunned” by the result, Ctrip’s CEO decided to adopt remote working for all call center employees.
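
The assignment rule in that experiment is simple enough to sketch in code. Below is a hypothetical Python reconstruction of the logic, with invented employee records and performance figures; it is not Ctrip’s actual system.

    # Hypothetical sketch of the Ctrip-style assignment: employees with
    # even-numbered birthdays work from home, odd-numbered ones stay in
    # the office. Records and performance figures are invented.

    from statistics import mean

    employees = [
        {"id": 1, "birthday_day": 14, "performance": 108.2},
        {"id": 2, "birthday_day": 3,  "performance": 97.5},
        {"id": 3, "birthday_day": 22, "performance": 112.9},
        {"id": 4, "birthday_day": 9,  "performance": 101.1},
    ]

    home = [e for e in employees if e["birthday_day"] % 2 == 0]
    office = [e for e in employees if e["birthday_day"] % 2 == 1]

    lift = (mean(e["performance"] for e in home)
            / mean(e["performance"] for e in office) - 1)
    print(f"Work-from-home performance lift: {lift:.1%}")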

Or there’s the far from straightforward problem of assigning salary ranges to different types of workers. To evaluate jobs and set pay, many companies still rely on outdated systems designed in the 1940s that assign higher salaries to people who are managers or in charge of budgets and give short shrift to newer sorts of jobs that are very valuable to twenty-first century firms — roles like project management or functions requiring expert skills and knowledge, says Philipp Schuch, a cofounder of Gradar.com, a Dusseldorf, Germany-based HR tech startup.

So Gradar is using an evidence-based approach to build a web-based job evaluation tool that it hopes will do better. It spent months studying existing systems to understand what criteria they used to grade jobs and derive pay scales, and then conducted a comprehensive literature search to come up with updated, requirements-based criteria that make more sense for today’s workplace: things like responsibility for key functions and projects, and not just people and organizational responsibility.

The company built a pilot system, then tested and retested it over and over to validate its results against established systems and other jobs-related data. Then they built an online system and tested and retested again (verification is a crucial part of an evidence-based approach). Today, more than five years after Schuch and colleagues began thinking about Gradar, the company is working with 100 medium to large companies around the world.

They range from auto parts manufacturers to theater companies to universities — “and it still works across all the different jobs. We get consistent results,” says Gradar cofounder Ralf Kuklik.

Why managers won’t commit

Schuch and Kuklik are believers in their tool — but they’re also realists. Schuch worked for years as head of compensation and benefits at large German companies and he’s been paying attention to what other companies do with evidence-based management.

“It’s not much, honestly,” he says.

That’s a common refrain.

“We’d love to see a commitment from a leader that says, ‘I expect our decisions about people and work and the organization to have evidence behind them,’” says John Boudreau, research director at the Center for Effective Organizations, housed in USC’s Marshall School of Business. “I don’t know that I have seen examples of that. Especially at the high level, the CEO level.”

“I’m a little baffled that it’s not more widespread,” says Jennifer Kurkoski, director of Google’s People Innovation Lab (PiLab), the internal research and development team behind the company’s People Operations department. “Companies spend billions on R&D, almost none of which is devoted to making people work better. It’s not something we understand yet. And we should.”

But there are many reasons why managers have been slow to embrace evidence-based management.

It’s a lot of work. Companies must spend a great deal of time, effort and money to assimilate research findings, or to test and validate new policies or systems. “Most people want to put things in place quickly, and get it done,” says Elaine Pulakos, president of PDRI, a Washington, DC-based talent management company. Executives, always with an eye on the bottom line and the next quarter’s results, often see this sort of research as overhead they can’t afford.

People fear change and risk. Even though an evidence-based management approach may ultimately yield better results, the perceived safer route is hewing to well-known “best practices” championed by other companies, and promoted by consultants who may or may not have done rigorous study. “People get enamored with something they can easily implement that someone else has tried before them,” says Pulakos. If Exxon Mobil or Google has scored with some initiative, she adds, “it makes it safe.” But maybe irrelevant, too.

Managers put more faith in intuition than they put in science. “We’re all experts on human behavior, right?” jokes organizational psychologist Ed Lawler, of USC’s Center for Effective Organizations. It’s an abiding sense that’s often flawed: Sometimes, industrial-organizational psychology research reveals that algorithms are better than people at particular tasks, such as initial screenings for new hires. But “people tend not to like findings that don’t present humans in a good light,” says Sara Rynes, an industrial-organizational psychologist at the Tippie College of Business at the University of Iowa.

Parsing the scientific literature can be hard. Managers, unlike doctors, aren’t required to have any kind of advanced training, and often can’t read a scholarly report or engage in the sort of statistical analysis needed to understand internal employee data. At the same time, academics catch flak for not making their findings more readable, or for publishing their work in prestigious journals that keep studies hidden away behind paywalls — pushing managers toward popular business books and articles that do an uneven job of presenting research correctly.

“It’s hard to find the research, and it’s hard to read, and it’s hard to interpret,” Colquitt says. “There are so many more channels to get information . . . it’s hard for leaders or HR professionals to sort the wheat from the chaff.”

Making matters worse — ironically — the very people who champion the science-based approach haven’t yet proved that it works with the kind of rigorous study that they would like. In that sense, “the evidence for evidence-based management is almost nonexistent,” admits Rob Briner, an industrial-organizational psychologist and scientific director of the Center for Evidence-Based Management.

In a paper published in the Annual Review of Organizational Psychology and Organizational Behavior in 2017, Rynes and co-author Jean Bartunek of Boston College examined 134 scholarly articles about evidence-based management. Most were essays and other pieces advocating or criticizing the approach, talking about how to teach it, and the like. Only about a fifth were empirical studies reporting research or reviewing such studies. The authors highlighted just a handful of those as “exemplary” — noting that many focused on small numbers of subjects and relied on self-reporting from managers for data, a method “known to be fraught with numerous biases and opportunities for error.”

“People want more evidence that when people use our studies it actually does something,” says Rynes.

As Google goes . . .

Advocates for evidence-based management think their approach may start looking more interesting to more people now that companies are embracing big data analytics: slicing and dicing truckloads of behavioral information, much of it collected through internal workplace computer systems, to dig up insights. Some of this information sits in databases; other bits are embedded in operational systems and can be mined.

Here, Google reigns supreme. It’s in the business of collecting and analyzing information, after all. Kurkoski’s team, heavy with PhDs, questions all kinds of assumptions about organizations. Then it consults the research, tries to find data within its own operations to shed light on the question, and tests new ways to solve problems. Questions like: “Do managers matter?” (yes, because the best ones boost job satisfaction among workers); “Why are women leaving our company?” (industry-standard, 12-week maternity and paternity leaves are too short); even “What shape of lunch table will get co-workers talking?” (a long one).

Kurkoski is close-lipped about a lot of what Google does — she won’t share how many people are on PiLab’s staff, for instance — but the company has earned a lot of attention for its work in the business and popular press. A 2016 article in the New York Times Magazine, for instance, detailed a 2012 initiative known as “Project Aristotle,” designed to figure out what made effective teams work and what made bad ones fizzle. The company ultimately homed in on “psychological safety” — how comfortable workers feel taking risks, a well-studied subject in the organizational psychology canon.

The brilliance of Google’s approach was the way it used science to encourage workers to talk about their feelings, one Google manager who went on to apply the findings told The Times. “By putting things like empathy and sensitivity into charts and data reports, it makes them easier to talk about,” he said.

Colquitt is among those who think the new rage for data analytics might spark renewed interest in evidence-based management — the operative word being “might.” He pounds out blog posts, stuffed with research citations, when the NFL decides to fine players who don’t stand for the national anthem, or when United Airlines toys with converting its bonus system into a winner-takes-all lottery. He’s digging deeper into the problem of performance management and pay.

And the fodder keeps coming. Studies that find open offices don’t, in fact, encourage conversation and collaboration. Studies that find employees resent the corporate fad of hot-desking — jumping from desk to desk instead of having a dedicated workspace, based on a notion that this will spark synergies and blue-sky thinking.

In one recent paper calling on industrial-organizational psychologists to put “an end to bad talent management,” Colquitt and his co-authors called out companies that fall for consultants promising to help them understand “the brain science of millennials” and other trendy topics, with little or no evidence for any of it.

“We needed to write about it and put these things to bed,” Colquitt says. He adds, as if a dark cloud is momentarily passing overhead: “But no one reads these papers anyway — so they won’t stay in bed long.”

Then it’s back to the talks and the blogs and the books, and fighting that good fight.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

How the body’s own defense cells can be turned into tiny, programmable assassins to battle cancers and other disorders

The immune system is an essential sentinel: It attacks foreign pathogens and destroys our own cells when they become a threat. It runs seamlessly most of the time, but it can also make mistakes, failing to root out cancer cells or waywardly attacking healthy tissues. The immune system’s power and perils have inspired scientists’ quickening efforts to genetically “hack” it, using viruses and other gene editing technology to endow existing immune cells with new abilities. The goal is to make cells that can be deployed like minuscule commandos to seek and destroy tumors, subdue inflammation and self-destruct on command.

There’s a long way to go before engineered immune cells achieve that level of precision. Yet the approach, part of a booming branch of medicine called immunotherapy, has already achieved some stunning successes. In cancer treatment, for example, white blood cells engineered to kill cancer cells — known as CAR T cells — have been shown to effectively treat some blood cancers, including the most common form of childhood leukemia. But such cells can also have dangerous — even deadly — side effects and aren’t yet effective against solid tumors such as colorectal and breast cancer. Researchers hope to eventually use engineered immune cells to treat a range of illnesses, not only cancers but also autoimmune diseases such as diabetes and neuroimmune disorders such as multiple sclerosis.

Synthetic biologist Wendell Lim of the University of California, San Francisco, studies how immune cells process information and make decisions, and how to harness those abilities for medicine. Writing recently in the Annual Review of Immunology with UCSF colleague Kole Roybal, he reviewed ongoing work to engineer immune cells for new therapies. Here, Lim discusses some of the biggest wins and failures in this rapidly advancing area of research.

This interview has been edited for length and clarity.

How did you become interested in synthetic biology?

My interest has always been trying to understand how cells make decisions. Essentially, living cells are a type of computer; they just don’t use electronic circuits — they use molecular circuits. They have this amazing capability to read what’s going on in their environment and read signals from other cells and use that information to make very complicated decisions.

One of the things that I saw as I began my career was that although there were many different cellular programs, a lot of the molecules being used and the ways that the circuits were linked together were very similar. I started to become interested not just in how one particular molecule or pathway works, but the logic of how molecular systems can be programmed in different ways. For example, when I was working on the structure and mechanism of signal transduction switch proteins — proteins that mediate communication both within and between cells — I was struck by how the different molecules that we studied used very similar conceptual mechanisms despite being completely different in detail.

Cells don’t just say, “I’m going to turn on this response because I see signal A.” They are usually monitoring many different signals and integrating that information with basically the equivalent of Boolean logic, where if inputs A, B and C are there, then it’s going to have a certain response, whereas if D and E are there it will do something different. That’s the beauty of living systems. They can react not in a simple way, but a nuanced and sophisticated way. It’s been very exciting to realize that we now understand the principles underlying these behaviors well enough that we can create cells that do useful things.
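
As a rough software analogy for the kind of Boolean integration Lim describes, a cell’s decision rule could be sketched like this (a toy model in Python; the signals A through E are placeholders, not real signaling pathways):

    # Toy Boolean model of a cell integrating multiple input signals.
    # Signal names a-e are placeholders, not real signaling pathways.

    def cell_response(a: bool, b: bool, c: bool, d: bool, e: bool) -> str:
        if a and b and c:      # one combination of inputs...
            return "response 1 (e.g., proliferate)"
        if d and e:            # ...a different combination, different output
            return "response 2 (e.g., secrete an immunosuppressive factor)"
        return "no response"

    print(cell_response(a=True, b=True, c=True, d=False, e=False))
    print(cell_response(a=False, b=False, c=False, d=True, e=True))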

What does it mean to “hack” immune cells?

The immune system is a pretty new thing, evolutionarily. It’s an incredible system that is still evolving, with cells that carry out a lot of different complex functions. The cells have different capabilities to sense things, whether it’s other cells that they should be talking to, or the ability to distinguish “foreign” from “self.” Different immune cells can launch killing responses or secrete new factors that are immunosuppressive and dial down the immune system itself. What we’re trying to do is to create a new kind of sensing-response system using the same parts, reconnected in a new way. If we are programming a T cell — a white blood cell that fights infection — to recognize some set of cancer antigens and then kill the cancer cells, that’s hacking — creating a new circuit that is good at detecting and treating cancer. By the same token, we could create a cell that would detect some tissue-specific signal associated with autoimmune disease and have that cell control the secretion of immunosuppressive factors.

Right now most studies using engineered T cells to address autoimmune disease are still in cell culture. There has been work in mice and humans transplanting native, immunosuppressive T cells, but these cells haven’t been engineered to seek specific targets or to reshape their behavior.

What would an ideal genetically engineered immune cell be able to do?

I’ll talk about cancer, because that’s the lead application. Immune cells are so powerful that in some cases they seem able to eliminate and cure cancer. We want them to be powerful, but of course the big danger is that if they attack any of our critical, normal tissues, that could be lethal. It’s really about combining control and precision with the effectiveness of the cells.

For the most part, the therapeutic cells that are in clinical trials now, or have been approved, are not extremely smart. They may simply include one receptor that is put on the cell; if that receptor recognizes the antigen it was designed to detect on a cancer cell, the T cell kills that cell. But we’re developing technologies to build more sophisticated sensing circuits that can detect two or three different inputs, which could also allow engineered cells to be eliminated or turned off if needed for safety.
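
As a rough illustration of the multi-input circuits Lim describes, here is a minimal Python sketch, assuming a hypothetical two-antigen AND gate plus a drug-controlled off switch. The antigen names and the switch are invented for illustration and do not describe any real or approved therapy.

```python
# Hypothetical sketch of a two-input "AND gate" T cell with a safety
# off switch. Antigen names and the drug switch are invented examples,
# not a description of any real or approved therapy.

from dataclasses import dataclass

@dataclass
class EngineeredTCell:
    safety_drug_present: bool = False  # clinician-controlled "off switch"

    def should_kill(self, target_antigens: set[str]) -> bool:
        """Kill only when both tumor antigens are seen and no off signal."""
        if self.safety_drug_present:
            return False  # the cell can be shut down for safety
        # Requiring BOTH antigens reduces cross-reaction with normal
        # tissue that displays only one of them.
        return {"antigen_1", "antigen_2"} <= target_antigens

cell = EngineeredTCell()
print(cell.should_kill({"antigen_1", "antigen_2"}))  # True: both present
print(cell.should_kill({"antigen_1"}))               # False: only one
cell.safety_drug_present = True
print(cell.should_kill({"antigen_1", "antigen_2"}))  # False: switched off
```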

Right now we mostly use viruses to put new genetic material into the cells. There is a payload limit for these viruses, but we can get on the order of two new sensors into the cell’s genome. How much genetic material we can insert is one of several bottlenecks, but it is possible that in the future the amounts we can insert will grow with Moore’s-law-like behavior. The genetic engineering tool CRISPR, for example, is certainly one of several exciting new ways to insert and integrate DNA. In the next couple of years, as we develop better ways to transfer genetic material into cells, we’ll find some diseases for which this added sophistication will make a huge difference.

How is cancer immunotherapy currently being performed in patients, and what’s the next frontier?

In the last several years there has been a big explosion of interest in engineering T cells to treat cancer. What’s done nowadays is to take a patient’s own immune cells, modify them and put them back into the patient — what is called an autologous transplant. People are working towards the possibility of more off-the-shelf therapeutic immune cells that could come from a universal donor. But we need to figure out a reliable way to modify the donor cells so that they are not rejected by the patient’s immune system.

What are some of the biggest successes in the field so far?

It’s been a huge success to have CAR T cells — T cells engineered to bind to proteins called antigens on cancer cells — that can treat certain blood cell cancers with a 70 to 80 percent success rate. The therapies being marketed by Novartis [tisagenlecleucel] and Kite Pharma [axicabtagene ciloleucel] for B-cell lymphoma have shown spectacular results — these are going to become the first-line therapies for these diseases. Clinical trials have also been reporting some great results with multiple myeloma, another blood cancer.

The biggest failures?

We’re seeing great results in blood cancers, but we haven’t really seen any significant results in solid cancers. There have been a number of mouse studies and some human clinical studies, but so far the results on solid tumors have been disappointing: nothing like the spectacular, reproducible responses we have seen in a few blood cancers. That’s where we need much more precision, because solid cancers have a lot of molecular antigens that look like those of normal tissue. I think that’s where a lot of the technology that we’re working on is going to make a difference.

The biggest failures are where there’s been some cross-reaction that’s been lethal, or the tumors have developed resistance to the engineered immune cells. But since we have such flexibility in how we program things, we can start trying to take these problems into account. I’m pretty optimistic — the tools that we’re developing are based on a very deep fundamental understanding of how cells work.

What about cost?

There’s been a lot of criticism about the cost of these therapies — the immunotherapies that were just approved cost about $300,000 to $500,000. But I think the cost will go down. The molecular parts and sensors that we’re developing are going to be reused in different cancers, so I’m optimistic that this kind of platform will lend itself to broad applications across many different diseases in a way that can bring costs down.

Are there other diseases where these cells could be useful?

There’s a lot of interest in autoimmune disorders such as Type 1 diabetes and severe dermatological autoimmune diseases such as pemphigus. We’re also starting to get interested in engineered cells that could address neuroinflammatory diseases such as multiple sclerosis. It’s hard to get a lot of drugs and biologics into the brain, but we do know that we can get T cells into the brain. So if we could target them to particular parts of the brain, to diseased tissues, it could be extremely powerful. The other things on the horizon are regeneration and repair with stem cells. The use of engineered cells for immunotherapy is really just at the beginning, and it will evolve over decades.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

South Korea, US presidents to meet in Washington – amid wary glances in the direction of Pyongyang, Beijing and Moscow

Running a few bilateral ideas up the flagpole. Daniel Slim/AFP via Getty Images
Sung-Yoon Lee, Tufts University

South Korean President Yoon Suk Yeol will meet his U.S. counterpart Joe Biden at the White House on April 26, 2023 – a rare state visit that comes as the two nations seek to confront common concerns.

The event is only the second state visit to the U.S. by a foreign head of state during the Biden administration, following a trip by President Emmanuel Macron of France in late 2022. That the White House handed the honor to Yoon, a relative political novice before taking office in May 2022, may come as a surprise to some foreign policy observers. Seoul does not carry the same clout in international politics as some other U.S. allies. It is an important economic partner, but so, too, are Japan, Germany, Canada and Mexico – all of which rank above South Korea in terms of overall U.S. trade.

Why, then, the pomp and ceremony for Yoon? As a scholar of Korean political history and U.S.-East Asia relations, I believe the answer can be found in three locations on the map and their respective governments: Pyongyang, Beijing and Moscow. The White House might well frame the meeting around the strengthening of ties between Seoul and Washington, but in reality the two leaders will want to send a message of unity in the face of saber-rattling – and worse – by North Korea, China and Russia.

A friendship forged in war

Washington and Seoul’s relationship was forged in the bloody crucible of the Korean War of 1950-53. For several decades, the alliance was lopsided, especially in the lean two decades following the armistice of 1953, when the South Korean subsistence economy was almost totally dependent on U.S. aid. But over the past two decades, South Korea has evened up the ledger, becoming a world leader in electronics, shipping, vehicles, arms and pop culture. The U.S.-South Korea alliance has developed into one based as much on economic interests as on diplomatic and strategic concerns.

Even the awkward issue of recent reports of alleged U.S. spying on the South Korean presidential office is not likely to dampen the show of friendliness expected to be on display during the bilateral meeting.

After all, Biden and Yoon have more serious matters to contend with. The state visit follows a year in which North Korea fired nearly 100 missiles into the skies in and around the Korean Peninsula, Russia brazenly invaded Ukraine, and China upped its rhetoric around the disputed island of Taiwan. And each will need addressing at the summit.

North Korean missiles

To South Korea, the threat from the isolationist state to its north is the most existential. Biden will likely underscore the U.S. commitment to the defense of South Korea against a nuclear-armed North Korea.

But the threat is not confined to imperiling the Korean Peninsula. North Korean leader Kim Jong Un’s intercontinental ballistic missiles now have the capability to hit the U.S. mainland. Such a development may be intended to draw Washington’s attention, but it has another consequence: aligning the existential threat that South Korea faces with that of the United States.

Growing apprehension in South Korea – where more than 70% of the public now favors a domestic nuclear weapons program rather than reliance on the country’s powerful ally – means that Yoon will seek U.S. reassurances that go beyond the rhetoric of “extended deterrence” and promises of an “ironclad” alliance.

North Korean leader Kim, having told the world last week that he is gearing up to launch a spy satellite into space, has also used the opportunity of Yoon’s U.S. visit to step up the country’s ballistic missile tests – a reminder to his two main adversaries that he can always make life difficult for them.

China’s regional push

That China and Russia continue to block any move at the U.N. Security Council to punish North Korea over its tests only emboldens Pyongyang.

But the threat posed by North Korea is not the only East Asian security concern for the U.S. or South Korea. The rise of China as an Indo-Pacific force – and a rival to Washington’s and Seoul’s economic and strategic interests – is another likely topic to come up in the White House meeting.

Indeed, Yoon may have foreshadowed U.S. and South Korean thinking on China with comments made to the Reuters news agency just days ago.

“The Taiwan issue is not simply an issue between China and Taiwan, but like the issue of North Korea, it is a global issue,” he said. Yoon may just have been echoing what he and Biden declared at the pair’s first summit in Seoul in May 2022 about the importance of preserving “peace and stability in the Taiwan Strait as an essential element in security and prosperity in the Indo-Pacific region.” But the remark drew howls of protest from officials in Beijing. And the fact that a South Korean leader should join the U.S. as it ups the rhetoric over Taiwan will likely be welcomed by Washington and, of course, Taipei.

It also comes on the back of efforts by Yoon to make amends with Japan – until now more a “friend of a friend” by way of the U.S., and one with which Seoul has long-festering wounds going back to the Japanese occupation of Korea.

South Korean President Yoon Suk Yeol and Japanese Prime Minister Fumio Kishida shake hands on March 16, 2023. Kiyoshi Ota/Pool Photo via AP

In March, Yoon visited Japanese Prime Minister Fumio Kishida – the first official bilateral meeting between the two countries’ leaders in 12 years.

Friendlier terms between Tokyo and Seoul – both democracies – serve Washington’s plans to counter the influence of autocracies in the region, forming a quasi-trilateral alliance structure.

Biden will be hoping to isolate China further through economic means. Yoon will visit Boston during his trip, underscoring the importance of collaboration in the biotech and high-tech industries. The visit comes as South Korea’s leading microchip producers, including Samsung and SK Hynix, face pressure from the U.S. to curtail their semiconductor business in China. Yoon will be seeking to promote U.S.-Korean joint investment in the semiconductor sector to offset the impact of reduced sales to the Chinese market.

Ukraine’s need for weapons

And then there is the war in Ukraine, which has loomed over diplomatic matters since Russia’s invasion.

In the past, South Korea has remained largely parochial on security issues, understandably, given the threat it faces on the peninsula. For example, no previous administration has even floated the notion of military support for the U.S. in the event of war in the Taiwan Strait.

Similarly, Seoul has provided only economic and humanitarian assistance to Ukraine, although it is the world’s eighth-biggest exporter of arms. But Yoon’s vision for his nation is that of a “global pivotal state” that places freedom, shared values and the international rules-based order at the heart of its foreign policy – and that opens up the possibility of further intervention.

If Biden is able to coax his guest to commit, discreetly, to supplying more weapons and ammunition to Ukraine, it will prove a win both for Yoon’s vision and for Biden’s.

State visits are by their nature ceremonial – and 2023 marks the 70th anniversary of the United States-Republic of Korea alliance. But as strategic and economic concerns converge, the future relationship between the two countries is being redefined by how the allies simultaneously confront geopolitical concerns on South Korea’s doorstep, in the wider region and in the world beyond.

Sung-Yoon Lee, Professor in Korean Studies, Tufts University

This article is republished from The Conversation under a Creative Commons license.