Saturday, April 29, 2023

Cannabis-derived products like delta-8 THC and delta-10 THC have flooded the US market – two immunologists explain the medicinal benefits and potential risks

Thousands of cannabis-derived products are now on the market. skodonnell/E+ via Getty Images
Prakash Nagarkatti, University of South Carolina and Mitzi Nagarkatti, University of South Carolina

These days you see signs for delta-8 THC, delta-10 THC and CBD, or cannabidiol, everywhere – at gas stations, convenience stores, vape shops and online. Many people are rightly wondering which of these compounds are legal, whether it is safe to consume them and which of their supposed medicinal benefits hold up to scientific scrutiny.

The rapid proliferation of cannabis products makes clear the need for the public to better understand what these compounds are derived from and what their true benefits and potential risks may be.

We are immunologists who have been studying the effects of marijuana cannabinoids on inflammation and cancer for more than two decades.

We see great promise in these products in medical applications. But we also have concerns about the fact that there are still many unknowns about their safety and their psychoactive properties.

Parsing the differences between marijuana and hemp

Cannabis sativa, the most common type of cannabis plant, has more than 100 compounds called cannabinoids.

The most well-studied cannabinoids extracted from the cannabis plant include delta-9-tetrahydrocannabinol, or delta-9 THC, which is psychoactive. A psychoactive compound is one that affects how the brain functions, thereby altering mood, awareness, thoughts, feelings or behavior. Delta-9 THC is the main cannabinoid responsible for the high associated with marijuana. CBD, in contrast, is non-psychoactive.

Marijuana and hemp are two different varieties of the cannabis plant. In the U.S., federal regulations stipulate that cannabis plants containing greater than 0.3% delta-9 THC should be classified as marijuana, while plants containing less should be classified as hemp. The marijuana grown today has high levels – from 10% to 30% – of delta-9 THC, while hemp plants contain 5% to 15% CBD.
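
Put in code terms, that legal distinction is a single threshold test. Here is a minimal, purely illustrative sketch – not a legal or analytical tool, and the sample values are invented:

```python
# Illustrative only: the Farm Bill's 0.3% delta-9 THC cutoff expressed
# as a threshold check. Sample values below are hypothetical.
THC_LIMIT_PERCENT = 0.3  # federal cutoff, by dry weight

def classify_cannabis(delta9_thc_percent: float) -> str:
    """Classify a cannabis plant as hemp or marijuana by delta-9 THC content."""
    return "marijuana" if delta9_thc_percent > THC_LIMIT_PERCENT else "hemp"

samples = {"plant A": 0.12, "plant B": 18.5}  # percent delta-9 THC, made up
for name, thc in samples.items():
    print(f"{name}: {thc}% delta-9 THC -> {classify_cannabis(thc)}")
```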

In 2018, the Food and Drug Administration approved the use of CBD extracted from the cannabis plant to treat epilepsy. In addition to being a source of CBD, hemp plants can be used commercially to develop a variety of other products such as textiles, paper, medicine, food, animal feed, biofuel, biodegradable plastic and construction material.

Recognizing the potential broad applications of hemp, when Congress passed the Agriculture Improvement Act, called the Farm Bill, in 2018, it removed hemp from the category of controlled substances. This made it legal to grow hemp.

When hemp-derived CBD saturated the market after passage of the Farm Bill, CBD manufacturers began harnessing their technical prowess to derive other forms of cannabinoids from CBD. This led to the emergence of delta-8 and delta-10 THC.

The chemical difference between delta-8, delta-9 and delta-10 THC is the position of a double bond on the chain of carbon atoms they structurally share. Delta-8 has this double bond on the eighth carbon atom of the chain, delta-9 on the ninth carbon atom, and delta-10 on the 10th carbon atom. These minor differences cause them to exert different levels of psychoactive effects.

Delta-9 THC is believed to be the primary cannabinoid that gives marijuana its psychoactive effects. Both CBD and marijuana have been shown in studies to be beneficial for various medicinal uses. About time/iStock via Getty Images Plus

The properties of delta-9 THC

Delta-9 THC, first isolated from the cannabis plant in 1964, was one of the earliest cannabinoids to be identified. Its highly psychoactive property stems from its ability to activate certain cannabinoid receptors, called CB1, in the brain. The CB1 receptor is like a lock that can be opened only by a specific key – in this case, delta-9 THC – allowing the compound to affect certain cell functions.

Delta-9 THC mimics the cannabinoids, called endocannabinoids, that our bodies naturally produce. Because delta-9 THC emulates the actions of endocannabinoids, it also affects the same brain functions they regulate, such as appetite, learning, memory, anxiety, depression, pain, sleep, mood, body temperature and immune responses.

The FDA approved delta-9 THC in 1985 to treat chemotherapy-induced nausea and vomiting in cancer patients and, in 1992, to stimulate appetite in HIV/AIDS patients.

The National Academy of Sciences has reported that cannabis is effective in alleviating chronic pain in adults and in improving muscle stiffness in patients with multiple sclerosis, an autoimmune disease. That report also suggested that cannabis may improve sleep outcomes and help with fibromyalgia, a medical condition in which patients complain of fatigue and pain throughout the body. In fact, a combination of delta-9 THC and CBD has been used to treat muscle stiffness and spasms in multiple sclerosis. This medicine, called Sativex, is approved in many countries but not yet in the U.S.

Delta-9 THC can also activate another type of cannabinoid receptor, called CB2, which is expressed mainly on immune cells. Studies from our laboratory have shown that delta-9 THC can suppress inflammation through the activation of CB2. This makes it highly effective in the treatment of autoimmune diseases like multiple sclerosis and colitis as well as inflammation of the lungs caused by bacterial toxins.

However, delta-9 THC has not been approved by the FDA for ailments such as pain, sleep disorders, fibromyalgia and autoimmune diseases. This has led people to self-medicate for these conditions, for which there are currently no effective pharmacological treatments.

Delta-8 THC, a chemical cousin of delta-9

Delta-8 THC is found in very small quantities in the cannabis plant. The delta-8 THC that is widely marketed in the U.S. is a derivative of hemp CBD.

Delta-8 THC binds to CB1 receptors less strongly than delta-9 THC, which is what makes it less psychoactive than delta-9 THC. People who seek delta-8 THC for medicinal benefits seem to prefer it over delta-9 THC because delta-8 THC does not cause them to get very high.

However, delta-8 THC binds to CB2 receptors with a similar strength as delta-9 THC. And because activation of CB2 plays a critical role in suppressing inflammation, delta-8 THC could potentially be preferable over delta-9 THC for treating inflammation, since it is less psychoactive.

There are no published clinical studies thus far on whether delta-8 THC can be used to treat clinical conditions – such as chemotherapy-induced nausea or appetite loss in HIV/AIDS – that are responsive to delta-9 THC. However, animal studies from our laboratory have shown that delta-8 THC is also effective in the treatment of multiple sclerosis.

The sale of delta-8 THC, especially in states where marijuana is illegal, has become highly controversial. Federal agencies consider all compounds isolated from marijuana, as well as synthetic forms similar to THC, to be Schedule I controlled substances, meaning they currently have no accepted medical use and have considerable potential for abuse.

However, hemp manufacturers argue that delta-8 THC should be legal because it is derived from CBD isolated from legally cultivated hemp plants.

In this California-based recreational and medical cannabis store, cannabis gummies are “easily” the most popular product.

The emergence of delta-10 THC

Delta-10 THC, another chemical cousin to delta-9 and delta-8, has recently entered the market.

Scientists do not yet know much about this new cannabinoid. Delta-10 THC is also derived from hemp CBD. People have anecdotally reported feeling euphoric and more focused after consuming delta-10 THC. Also, anecdotally, people who consume delta-10 THC say that it causes less of a high than delta-8 THC.

And virtually nothing is known about the medicinal properties of delta-10 THC. Yet it is being marketed in similar ways as the other more well-studied cannabinoids, with claims of an array of health benefits.

The future of cannabinoid derivatives

Research and clinical trials using marijuana or delta-9 THC to treat many medical conditions have been hampered by their classification as Schedule I substances. In addition, the psychoactive properties of marijuana and delta-9 THC produce side effects on brain function; the high associated with them makes some people feel sick, or they simply hate the sensation. This limits their usefulness in treating clinical disorders.

In contrast, we feel that delta-8 THC and delta-10 THC, as well as other cannabinoids that may be isolated from the cannabis plant or synthesized in the future, hold great promise. With their strong activity at CB2 receptors and their weaker psychoactive properties, we believe they offer new therapeutic opportunities to treat a variety of medical conditions.

Prakash Nagarkatti, Professor of Pathology, Microbiology and Immunology, University of South Carolina and Mitzi Nagarkatti, Professor of Pathology, Microbiology and Immunology, University of South Carolina

This article is republished from The Conversation under a Creative Commons license. 

A Fruity Sprinkle Surprise

(Culinary.net) To kids, birthday parties are a big deal and only happen once a year. From the decorations to their friends and all the sweet, delicious treats to devour, it can be an overwhelming amount of excitement and awe.

They receive gifts, get to have fun with their friends and family, and get to snack on treats they typically don’t have on a regular basis. This is part of what makes birthdays so fun.

It can be a lot of pressure for parents, though. You want everything to be perfect and fall in line with expectations, especially when it comes to the food and treats served to everyone that day.

At the next party you’re hosting, try this delightful Fruity Sprinkles Smoothie that fits the theme for nearly any colorful birthday bash.

It’s made with frozen blueberries, frozen strawberries and frozen mango for a healthier alternative to sugar-filled birthday cake. Topped with fluffy, fun whipped cream and mini sprinkles, it still provides a sweet, festive treat. Plus, this smoothie can be made in a matter of minutes using only one kitchen appliance for easy clean up.

To make it, blend frozen blueberries, frozen strawberries, frozen mango, milk and yogurt until well combined.

Pour the mixture into four smoothie glasses and garnish each with whipped cream and sprinkles to add some extra color.

It’s that easy to make and even better to enjoy while watching your kid make wonderful memories with friends and family.

Find more fun celebration recipes at Culinary.net.

If you made this recipe at home, use #MyCulinaryConnection on your favorite social network to share your work.


Fruity Sprinkles Smoothie

Servings: 4

  • 1 cup frozen blueberries
  • 2 cups frozen strawberries
  • 1 cup frozen mango
  • 1 1/2 cups milk
  • 1 carton (6 ounces) vanilla yogurt
  • whipped cream
  • sprinkles
  1. In blender, blend blueberries, strawberries, mango, milk and yogurt until combined.
  2. Pour smoothie into four glasses. Garnish with whipped cream and sprinkles.
SOURCE:
Culinary.net

The quest for autism’s causes, and what it reveals about all of us


The more researchers look, the more multifaceted the risk factors appear — and the more we learn about how the brain works and develops

As alarm grew over autism prevalence at the turn of this century, there was much public talk of a growing “epidemic.” That language has since softened, and it is now clear that many autistic people were there all along, their condition unrecognized until relatively recently.

But what is the cause? The emerging narrative today is that there is no single cause — rather, multiple factors, roughly sorted into the categories of genetics and environment, work together in complex ways. Because of this complexity and the hundreds of gene variants that have been implicated, developing human brains may follow many possible paths to arrive at a place on the autism spectrum.

And this may help explain something true about autism: It varies greatly from one person to the next.

As clinicians view it, autism involves communication deficits and formulaic, repetitive behaviors that present obstacles to establishing conventional relationships. The soft borders of that definition — where does communication difficulty cross over into communication deficit? — suggest blurred margins between people who are diagnosed with autism and those who approach, but never quite cross, the line into diagnostic territory.

Those who do have diagnoses display behaviors on a continuum of intensity. Their use of spoken language ranges from not speaking at all to being hyperverbal. They can have a unique interest in the finer details of window blinds or an intense but more socially tolerated fascination with dinosaurs. As with many human behaviors, each feature exists on a spectrum, and these spectra blend in a person to create what clinicians call autism.

By pinpointing risk-associated genes and uncovering their roles, studying the roots of autism also is providing new insights into the development of all human brains, autistic or not. Here is a taste of what we now know, and what we don’t, about autism’s causes — and what that search is teaching us about everybody’s neurology.

They know it when they see it

Despite the many and varied threads that may interweave to cause autism, the condition is largely identifiable. What clinicians are really saying when they diagnose autism, says James McPartland, a clinical psychologist at the Yale Child Study Center, is that they see a recognizable, if broadly defined, constellation of behaviors. “So really, there is something true about autism, and everyone who meets the diagnosis of autism shows these kinds of behaviors.”

At the same time, the subtle differences in how each autistic person manifests the telltale features make it highly individual, says Pauline Chaste, a child psychiatrist at Inserm U 894, the Centre de Psychiatrie et Neurosciences, in Paris. “We describe a specific behavior that exists — that kind of social impairment and rigidity. You can have more or less of it, but it definitely exists.”

The more or less of autism could trace, in part, to the types of gene variants that contribute to it in a given person. Some of these variants have a big effect by themselves, while others make tiny contributions, and any autistic person could have their own unique mix of both. One thing seems clear: Though there may be something true about autism, as McPartland puts it, the existence of “one true autism gene” or even one gene for each autism feature is unlikely.

Instead, there will be patterns of gene combinations and the results they produce, says epidemiologist Elise Robinson of the Harvard T.H. Chan School of Public Health and an associate member of the Broad Institute. People who have both autism and intellectual disability, for example, tend to have more big-effect gene mutations than people with autism alone.

Facial communication

Looking for these contributing gene variants isn’t simply an exercise in scientific curiosity or in finding potential targets for drug treatments. Because most of these genes direct how human brains develop and nerve cells communicate, learning about how they lead to autism can also reveal a lot about how everyone’s brain works.

For example, a key autism trait is atypical social behaviors, such as, sometimes, not focusing on “social” facial features like the eyes. Although the tendency to look into another person’s eyes seems like something we might learn simply from being around other people, autism research has revealed that genes underlie the instinct.

In a 2017 study, the authors first showed that identical twins are similar in how they look at a video with social content, such as faces. When viewing the same video, the identical twin pairs shifted their eyes with the same timing and focused on the same things far more than did two non-identical siblings or unrelated children. The fact that almost all twin pairs shared this tendency suggests solid genetic underpinnings for the behavior.

Having established a strong genetic contribution to this trait, the investigators, from Emory University and the Marcus Autism Center in Georgia and Washington University in St. Louis, then showed that the tendency to look at the eye and mouth areas of a human face is decreased in autistic children. They concluded that while not all of the inclination to look at certain parts of a face is genetic, much of it is.

Twin studies like this are powerful tools for evaluating how much genes dictate a feature, and such investigations reveal that the genetic contribution to autism is substantial. Autism also tends to cluster in non-twin family members: One in five infants who has an older sibling with autism also develops it.
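
One classic way such twin comparisons are turned into a number is Falconer's formula, which estimates heritability as twice the difference between the trait correlations of identical and fraternal twin pairs. A minimal sketch, with hypothetical correlation values chosen only for illustration:

```python
# Falconer's formula: heritability h^2 ~ 2 * (r_MZ - r_DZ), where r_MZ and
# r_DZ are trait correlations in identical (monozygotic) and fraternal
# (dizygotic) twin pairs. The values below are hypothetical.
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    return 2.0 * (r_mz - r_dz)

r_mz = 0.90  # hypothetical identical-twin correlation
r_dz = 0.50  # hypothetical fraternal-twin correlation
print(f"Estimated heritability: {falconer_heritability(r_mz, r_dz):.0%}")
```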

Genetic determinants

Overall, genetics accounts for about 70 to 80 percent of factors contributing to autism, says neurologist Daniel Geschwind, director of UCLA’s autism research and treatment center. By comparison, a condition like depression has an underlying genetic contribution of about 50 percent, he says. Alessandro Gozzi, neuroscientist and group leader at the Istituto Italiano di Tecnologia, weights the power of genes even more, placing the shared diagnosis rate between twins as high as 95 percent, depending on how strict the diagnostic boundaries are. But regardless of the precise value, he says that the “wide consensus” among autism researchers is that genetics is a powerful determinant of autism.

Going the next step — finding the specific genes involved — is a monumental task. It’s also one that yields dividends for understanding brain function more broadly.

Candidate gene variants are now very numerous, but a few stand out for their potential to exert a large effect. Chaste cites fragile X syndrome and Rett syndrome as examples — both are genetic conditions (termed syndromes because they are defined by a cluster of traits) that are tied to variants of a single gene or chromosome region and are closely associated with autism.

The gene linked to fragile X syndrome lies on the X chromosome. Its name, FMR1, is easily forgettable, but the effects of its variants are not. Studies on the causes of fragile X reveal that the protein this gene encodes, FMRP, acts as a cellular shuttle for RNA molecules that are crucial for nerve-cell communication and plasticity of connections in the brain. In people with fragile X, cells don’t produce the protein, or make very little of it. The FMR1 variants underlying fragile X are the most common known genetic cause of intellectual disability and are implicated in 1 to 6 percent of autism cases.

Like FMR1, the genetic changes involved in Rett syndrome also affect brain development. A gene called methyl CpG binding protein 2, or MECP2, oversees the activity of many brain-related genes, turning them off or on. Because of this pivotal role for MECP2, mutations that affect its function can lead to broad effects. Some of the resulting features look so much like autism that Rett syndrome was categorized as an autism spectrum disorder until 2013.

Other genetic syndromes also include autism as a feature. Some are caused by variants in a gene called SHANK3, which, like most genes implicated in autism, is involved in brain development and function. The protein that it encodes helps to coax nerve extensions to form and take shape so that a nerve cell can communicate with others. The SHANK3 protein also provides a physical scaffold for those cells to link up. Among people with mutations that prevent SHANK3 protein production, or who are missing the segment of chromosome 22 that contains the gene, most have autism or Phelan-McDermid syndrome, a condition that itself often includes autism.

Yet another syndrome arises from the loss or duplication of a chunk of chromosome 16. Researchers linked this chromosomal change to autism in studies comparing the DNA of people with and without the condition, singling out sequence alterations found only in autistic participants.

Despite their clear ties to autism, these syndromes are rare. “Collectively, they are found in about 5 percent of the total population of patients with autism,” Gozzi says. That leaves a great deal to explain.

Inheritance on a spectrum

So where do the other autistic people come from, genetically speaking? Robinson says that their genetics don’t fall neatly into one of two buckets – either a few genes with big effects or many genes with small effects. “It’s been well established at this point that it’s not either–or,” she says.

In fact, says Gozzi, varying combinations of big-effect mutations and lots of different, smaller-effect ones could explain the wide spectrum of differences observed among autistic people. The evidence supports such a range, he says: everything from a few heavy-hitting variations in some people, to an additive dose from many variants in others, and with overlap between the two patterns in still others.

Geschwind adds yet another layer of complexity: the role of the cellular environment that all the other gene variants in a person create, known as the background effect. For example, someone could have a mutation conferring high risk that is either enhanced or diminished by the background input from other genes not directly related to autism, to create a gradation of autism intensity.

Environmental influences

When researchers speak of environmental inputs to traits, diseases and disorders, they are referring to everything from pollutants in the air to subtle perturbations inside cells to cues from other cells. Finding such causative candidates for autism generally involves epidemiological studies that look for correlations between autism rates in a population and an environmental factor of interest.

These connections aren’t easy to locate. In the case of genes, if a study involves enough people, even rare genetic differences that make small contributions to autism can often be plucked from the pile. Not so for environmental influences if their effects are significant but small, says Robinson. Within those epidemiological studies, you have to be able to detect that slight signal and assess its power against the larger, background noise of lots of other variations in the cell, body or outside environment that you might not even be aware of and might not be relevant. “We don’t live in a simple, single-exposure world,” says Kristen Lyall, an epidemiologist at Drexel University in Philadelphia.

And even when a connection is made, its basis is still just math. That is certainly the first step in evaluating a link between an environmental factor and a condition such as autism: As one thing goes up, does the other follow? But two things that track together don’t necessarily share a biological association. (One of the silliest examples to illustrate how misleading correlation can be is how tightly the number of people killed by venomous spiders each year tracks with the number of letters in the winning word of the same year’s Scripps National Spelling Bee.)
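
The statistical trap is easy to reproduce. The sketch below – purely illustrative, using randomly generated data – searches many pairs of short, completely unrelated series and routinely turns up correlation coefficients that look impressive by chance alone:

```python
# Illustrative only: short, unrelated random series can produce seemingly
# strong Pearson correlations purely by chance.
import numpy as np

rng = np.random.default_rng(seed=0)
best = 0.0
for _ in range(1000):
    x = rng.normal(size=10)  # stand-in for, say, yearly spider-bite deaths
    y = rng.normal(size=10)  # stand-in for letters in winning spelling-bee words
    best = max(best, abs(np.corrcoef(x, y)[0, 1]))

print(f"Strongest |r| among 1,000 unrelated pairs: {best:.2f}")
```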

In the case of genetic studies, gene changes with tiny effects can still be considered plausible if their usual role relates to brain function in some way. Environmental factors aren’t as well catalogued, measured and tracked. But the better epidemiological studies do look for correlations with credible and pre-identified factors of interest (so, not Scripps Spelling Bee words).

For feasibility’s sake, work on environmental factors in autism has tended to focus on inputs that have broad effects on brain development. Robinson points to extreme preterm birth, which is related to many kinds of neurodevelopmental disorders — autism among them.

Eventually, studies can add up to connect dots and arrive at a plausible story of cause and effect. For example, along with preterm birth, air pollution also has been linked to autism risk. Another recent study found that when oil and power plants close down, preterm births in the region drop. It’s therefore a reasonable hypothesis that very preterm birth operates as an intermediate between air pollution exposure and autism.

Lyall believes that prenatal exposures to environmental pollutants that can behave like hormones are particularly strong candidates for involvement in autism risk. These chemicals, collectively known as endocrine-disrupting compounds, include pesticides and even heavy metals, and they are pretty much everywhere — in air, land, water, food and us.

Some research suggests, for example, that exposure to the endocrine disruptor mercury in air pollution raises autism odds. The studies are few and the data haven’t overwhelmingly shown increases in risk, Lyall acknowledges, “but I think that it’s an interesting and important area for future research given the lack of regulation around these chemicals, their ubiquity in the environment and their known adverse effects on broader neurodevelopment.”

Researchers have also homed in on plausible biological bases for a couple of other potential environmental effects. Gozzi points to animal studies, mostly in mice, that bolster human work linking autism in a child with prenatal exposure to a mother’s ramped-up immune responses as a result of infections. Again, Gozzi stresses that the findings are far from definitive, and most studies involving humans have focused on infections severe enough to require hospitalization.

Another unearthed link is to paternal age at conception: Studies find that autism risk increases with the age of the father, usually starting in the thirties or forties, although the age range and magnitude of the increase vary among different studies. The cells that give rise to sperm tend to accumulate new mutations over the years, so the sperm contain sequence changes that pass to offspring but aren’t present in the father’s own body cells. Some of these changes involve regions or genes already implicated in autism risk. Sperm also show changes in the chemical tagging of DNA that controls the activity of genes.

Establishing environmental cause unequivocally is almost impossible, because of ethical constraints. It’s one thing to examine blood or tissue samples for genetic variants that track with autism diagnoses. It’s another thing entirely to manipulate factors to see if they induce autism or not. No one’s going to deliberately infect a pregnant woman or have a group of men specifically delay fatherhood just to test how these factors influence autism odds.

Researchers instead are stuck finding correlations between these factors and then looking at available measures, such as changes in gene activity, accrual of mutations over the lifespan and studies of autism-like behavior in animal models. And as they look at these associations, they often make discoveries that are relevant beyond autism — ones that have now been extended to studies of schizophrenia, aging and even human evolution. The link between autism and having an older father, for example, has led to studies examining how changes in sperm over time affect brain development in later generations.

While most environmental candidates remain just that — candidates — Lyall says emphatically that one factor is out of the running: vaccines. “That’s pretty conclusively been shown to have no association with autism,” she says, noting the numerous large epidemiological studies that have reached that conclusion.

The settled vaccine question is a small point of clarity in an otherwise blurred landscape of autism cause-and-effect research. Every new finding seems to open up yet more pathways, some leading toward autism, and some toward broader revelations about the brain and how hormones, the immune system, the air we breathe and more add up to make their mark on neural development. The network of genetic and environmental factors that converge and diverge to produce autism may reflect not only the multiplicity of ways of being autistic — but also, more broadly, of being human.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

The Federal Reserve and the art of navigating a soft landing … when economic data sends mixed signals

‘Surely we can avoid an economic crash? We can, but don’t call me Shirley!’ Paramount Pictures/Fathom Events
Christopher Decker, University of Nebraska Omaha

With inflation easing and the U.S. economy cooling, is the Federal Reserve done raising interest rates? After all, gently bringing down the trajectory of prices without crashing the economy was the central bank’s objective when it began jacking up rates over a year ago.

Gross domestic product, the broadest measure of an economy’s output, expanded at an annual pace of a mere 1.1% in the first quarter, according to data released April 27, 2023 – down from 2.6% recorded in the final three months of 2022. And the latest consumer price data, from March, shows inflation slowing to 5% on an annualized basis, the least in about a year.
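
The “annual pace” in reports like these is a quarter-over-quarter growth rate compounded over four quarters. A minimal sketch of the arithmetic – the quarterly figure below is back-derived for illustration, not an official statistic:

```python
# Annualizing a quarterly growth rate by compounding it over four quarters.
def annualize(quarterly_rate: float) -> float:
    return (1.0 + quarterly_rate) ** 4 - 1.0

q_growth = 0.00274  # ~0.274% quarter-over-quarter, hypothetical
print(f"Annualized pace: {annualize(q_growth):.1%}")  # -> about 1.1%
```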

Unfortunately for consumers and businesses weary of soaring borrowing costs, the Fed’s not likely done hiking rates quite yet. Financial markets are predicting another quarter-point hike when the Fed meets for a two-day meeting that ends May 3, 2023. And there could be several more increases to come.

But this does raise another important question: With all the recent, often conflicting, data and narratives regarding inflation, bank failures and layoffs in the tech sector, is the Fed close to engineering the “soft landing” it’s been hoping for?

The economy zigs then zags

The GDP data is a mixed bag and provides some clues to the answer.

Overall, the recent GDP figures suggest a likely economic slowdown going forward, due largely to a drawdown in inventories – that is, rather than ordering new goods, companies are relying more on stuff currently in storage, likely in anticipation of a slowdown in consumption. And business investment declined 12.5% in the quarter.

At the same time, consumer spending, which represents about two-thirds of GDP, grew at a healthy 3.7% pace, and investment in equipment such as computers and robotics increased by 11.2% – though this category is quite volatile and could easily turn in subsequent quarters.

Other data also points to a slowdown, such as a decline in new orders for manufactured goods. This, combined with the drawdown in inventories in the GDP report, might suggest that businesses are anticipating a slowdown in demand for goods and services.

In the labor market, job gains have been strong – 334,000 over the past six months – but job openings have been declining. After peaking at about 12 million in March 2022, openings dropped to about 9.9 million as of February, according to the Bureau of Labor Statistics.

Inflation: Is it high or low?

In terms of inflation, we can also see conflicting numbers.

The headline consumer price index has indeed slowed steadily since peaking in June 2022 at 9.1%. But the core personal consumption expenditures index, the Fed’s preferred measure of inflation, has remained stubbornly elevated. The latest data, released on April 28, 2023, showed the index, which excludes volatile food and energy prices, was up 4.6% in March from a year earlier and has barely budged in months.

Meanwhile, wages, which when rising can have a strong upward push on prices, climbed at an annualized 5.1% in the first quarter, also according to data released on April 28. That’s down from the peak of 5.7% in the second quarter of 2022 but is still about the fastest pace of wage gains in at least two decades.

More hikes to come

So what might all this suggest about Fed actions on interest rates?

The next meeting is scheduled to end on May 3, with the market odds greatly favoring another 0.25 percentage point increase – which would be the 10th straight hike since March 2022.

With the inflation rate still well above the Fed’s target of about 2%, combined with continued job growth and a low unemployment rate, the central bank is likely not done ratcheting up rates. I agree with the market odds pricing in a quarter-point hike for the May meeting. Future data will guide any future rate increases beyond that.

The good news is that, I believe, the larger rate increases are well in the past.

Landing softly – or at least mildly

That brings us back to the big question: How close is the Fed to sticking a soft landing, in which the U.S. economy manages to tame inflation without a recession?

Sadly, it’s too early to tell. Labor markets can be very volatile, and political and international events – such as potential gridlock in debt ceiling talks or further escalation of the war in Ukraine – can turn things upside down. That said, we are looking at either a mild recession or a growth recession.

What’s the difference? A growth recession signals a weak economy, but not one weak enough to significantly drive up unemployment – and that’s preferable to even a mild recession, which would mean multiple quarterly drops in GDP and much higher unemployment.

We just don’t know which is more likely. What I think is true now, though, is that, barring any catastrophic and unpredictable events, a severe recession has been avoided.

Christopher Decker, Professor of Economics, University of Nebraska Omaha

This article is republished from The Conversation under a Creative Commons license. 

Criminologist Bruce Jacobs has spoken to carjackers in detail about their crimes. Here’s what he’s learned in two decades of study.

Almost as long as there have been cars, there have been carjackings — thefts of occupied automobiles committed through force, or threat of force. During Prohibition, shipments of alcohol were regularly intercepted by armed robbers, and other inventory-carrying commercial vehicles then became targets. Carjacking of personal vehicles became increasingly prominent in media reports in the 1990s after some high-profile incidents in which victims died during the robbery. The crime became a federal offense in 1992.

In recent years, reports of carjackings have increased in several cities. In Chicago, carjackings more than doubled in 2020 and continued to rise in 2021. Since 2019, carjackings in Philadelphia have more than tripled. Officials in New Orleans, Washington, DC, and Minneapolis have all reported similar spikes.

Bruce Jacobs, a criminologist at the University of Texas at Dallas, has extensively studied the crime. He started his research in the early 2000s, describing it as a natural progression from studying other street crimes in St. Louis, Missouri, where he had been researching illegal drug distribution, drug-related violence and robbery.

To understand the steps and motivations that drive carjackings, Jacobs and his collaborators used both crime-reporting data and interviews with active carjackers. Recently, Jacobs and Michael Cherbonneau, a criminologist at the University of North Florida, described insights into the scope and process of carjacking in the 2023 Annual Review of Criminology.

Knowable Magazine spoke with Jacobs to discuss what he’s learned about this crime and his takeaways for prevention. The following conversation has been edited for clarity and length.

You’ve been studying carjacking, off and on, for about 20 years now. What’s the most surprising thing you’ve learned?

I think maybe the most surprising thing is just the unpredictability of this crime. Other violent crimes usually have a certain pattern to them geographically or temporally. There may be some sort of interpersonal connection between the victim and the offender. Or, in stranger-on-stranger crimes, like robbery typically is, there are usually hotspots within a city that are more prone to experiencing those types of crimes.

With carjacking, it’s so spur-of-the-moment. The offender sees a vulnerable target and an opportunity to strike. It may not be in a so-called hotspot of a city. Really, anybody driving a car in public is potentially at risk. From the perspective of the victim, they may just be at the wrong place at the wrong time.

Why do people choose to carjack rather than steal an unoccupied car?

A lot of carjackers don’t like the ambiguity and the uncertainty of a potential victim coming out of their house or their business while they are stealing a car. Whereas with carjacking, the vehicle’s on, the keys are inside, the victim’s inside. It’s simply a matter of going up to them, displaying the weapon, telling them to get out of the car or throwing them out of the car, and taking the vehicle. It’s very quick. It’s very simple. That’s what some of our carjackers would say: It’s safer to carjack than to steal a vehicle off the street.

Cars today also have more security features than in the past. Does that also make carjacking a preferred option?

That’s what the evidence seems to show. Back in the day, you could break into an Oldsmobile or a Chevy, strip the ignition column, jam a screwdriver in there, and it starts in 30 seconds. With these modern cars, you can’t do that anymore. They require these chips and proximity readers. A lot of the electronics are much more advanced and not accessible to a thief with a screwdriver. So there does seem to be what might be called tactical displacement, where these offenders figure out, “If I identify the car I want, I’m just gonna take it by force. It’s already on and the keys are in it.”

Several major US cities have experienced dramatic rises in carjacking since the pandemic hit. What’s behind that trend?

First, the technology issue that we just talked about — it’s just getting harder and harder to steal cars off the street. Second, the pandemic, I think, played a large part as well. Due to the school shutdowns, younger at-risk offenders found themselves unsupervised with a lot of time on their hands. And the ubiquitous Covid mask allowed them substantially enhanced anonymity.

But it’s very difficult to do a year-to-year or city-to-city analysis because the data are not maintained that way on a federal level or even on a state or local level. Most jurisdictions don’t track carjacking separately from other forms of robbery. So we had to rely on reports from police officials who did track it in some of these cities.

To research carjacking, you’ve interviewed active offenders. How did you conduct these interviews?

Those active offenders were identified to us through a specially trained project field worker who I came to know over the years as part of my duties as a criminologist in St. Louis. He was an active offender himself. He had multiple and ongoing contacts with active offenders — not in jail or prison, but out on the street. He was trusted amongst the folks that he referred, and we had worked with him for many years.

So we relied on him to identify offenders and convey those respondents to us. Then we would interview them at length, through in-depth and semi-structured interviews, about why they did it, how they did it, where they did it, who they selected for targets.

What did you learn about decision-making and motives from your interviews?

The economic motives are probably primary — stealing the vehicle to chop it up for parts, or, not infrequently, we see these vehicles being stolen for their accessory items, like performance rims and high-end audio systems, which might be worth more than the car itself.

There are sometimes retaliatory motives where somebody is showboating their vehicle and driving in a way that is disrespectful to the would-be offender, and they'll just take it to teach a lesson. We’ve seen carjackings that are committed in the course of some other crime — for example, to escape. We’ve had carjackings done for thrills, especially among young offenders who are just looking for a rush.

There’s a variety of motives that energize this offense. It really depends on the offender and the situation that they’re in.

In terms of offender decision-making, it lines up with a lot of what we know about predatory violence more generally. Despite the opportunistic, spur-of-the-moment, crude way in which many of these offenses are carried out, there is a reasonable degree of calculation on the part of many of these offenders. There’s calculation in sizing up their targets, figuring out how to approach their target, figuring out the ideal place to commit these crimes to lower risk of detection, and then using force within the actual offense to maximize the likelihood of compliance.

It’s striking just how quickly these decisions are made. You’re talking about literally under a minute for most of the offenses to unfold and be done. I’ve reviewed thousands of police reports and video evidence of carjackings around the country. All that evidence seems to indicate most of these offenses are very, very quick.

You write that media accounts often give a skewed picture of carjacking. Why do you say that?

The carjackings that typically get reported in the media tend to be disproportionately violent, disproportionately graphic, because those kinds of stories generate eyeballs, and eyeballs generate ratings, which mean profit. You’ve got to be very careful not to suggest that those events are representative of the broader universe of carjackings.

Carjackings are very, very rarely fatal — in the tenths of a percent. And they very rarely involve serious victim injury: Only 1 percent of victims are hospitalized.

Based on your research, how can potential carjacking victims keep themselves safe?

Potential victims really have to educate themselves on being alert. It’s as simple as just being aware of your surroundings and people lurking on the periphery of your vehicle. There are certain points where you’re more vulnerable than others: when you get inside your vehicle, when you get outside of your vehicle, at traffic lights, at gas stations. That’s when you’re at highest risk. That’s what we’ve noticed not only in our interview-based research, but also from other researchers in the field, media sources and police sources.

Just being aware of those points, I think, can enhance victim safety. If you get that kind of gnawing feeling that something’s about to go down, I would listen to that sixth sense.

But can’t being super vigilant be exhausting and potentially cause us to view well-meaning strangers as threats?

There’s a balance. You don’t want to be paranoid. With paranoia, you can almost put the idea in the offender’s head. And we’ve seen that in our interviews, like, “Oh, I wasn't really even thinking about it, but this guy looked at me a certain way or looked paranoid or scared or afraid and then you know, the car was right there.” These crimes are so opportunistic and spur-of-the-moment that that can set them off.

You want to be able to be attuned to your surroundings so that you can react quickly if necessary. At least at certain times when you’re potentially vulnerable, just minimizing distractions that might undermine that situational awareness can help.

What about when the carjacking is already happening? What should victims do?

This is a difficult crime for a victim to manage. When you’re getting carjacked, you don’t know what’s happening. You might think you’re being abducted. And if the victim panics, then that can escalate really badly, quickly. If the vehicle’s on and the driver’s inside, the car can be both a weapon and a shield, so that can encourage resistance on the victim’s part. Even with a gun in your face, if you think you’re being abducted or about to be killed, you might just floor it to get away. That can potentially escalate the violence toward you as the victim. Offenders get mad when you’re non-compliant. If they have a gun, they’re liable to fire it. Ironically, it’s really the carjacker’s job to let you know: “Hey, I just want your car. Get out and you’re not gonna get hurt.” Sometimes that doesn’t happen.

It’s hard to give universal advice; it’s very situational. The general best advice is not to resist – give them what they want.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Friday, April 28, 2023

AI is exciting – and an ethical minefield: 4 essential reads on the risks and concerns about this technology

Who’s in control? John Lund/Stone via Getty Images
Molly Jackson, The Conversation

If you’re like me, you’ve spent a lot of time over the past few months trying to figure out what this AI thing is all about. Large-language models, generative AI, algorithmic bias – it’s a lot for the less tech-savvy of us to sort out, trying to make sense of the myriad headlines about artificial intelligence swirling about.

But understanding how AI works is just part of the dilemma. As a society, we’re also confronting concerns about its social, psychological and ethical effects. Here we spotlight articles about the deeper questions the AI revolution raises about bias and inequality, the learning process, its impact on jobs, and even the artistic process.

1. Ethical debt

When a company rushes software to market, it often accrues “technical debt”: the cost of having to fix bugs after a program is released, instead of ironing them out beforehand.

There are examples of this in AI as companies race ahead to compete with each other. More alarming, though, is “ethical debt,” when development teams haven’t considered possible social or ethical harms – how AI could replace human jobs, for example, or when algorithms end up reinforcing biases.

Casey Fiesler, a technology ethics expert at the University of Colorado Boulder, wrote that she’s “a technology optimist who thinks and prepares like a pessimist”: someone who puts in time speculating about what might go wrong.

That kind of speculation is an especially useful skill for technologists trying to envision consequences that might not impact them, Fiesler explained, but that could hurt “marginalized groups that are underrepresented” in tech fields. When it comes to ethical debt, she noted, “the people who incur it are rarely the people who pay for it in the end.”

2. Is anybody there?

AI programs’ abilities can give the impression that they are sentient, but they’re not, explained Nir Eisikovits, director of the Applied Ethics Center at the University of Massachusetts Boston. “ChatGPT and similar technologies are sophisticated sentence completion applications – nothing more, nothing less,” he wrote.

But saying AI isn’t conscious doesn’t mean it’s harmless.

“To me,” Eisikovits explained, “the pressing question is not whether machines are sentient but why it is so easy for us to imagine that they are.” Humans easily project human features onto just about anything, including technology. That tendency to anthropomorphize “points to real risks of psychological entanglement with technology,” according to Eisikovits, who studies AI’s impact on how people understand themselves.

People give names to boats and cars – and can get attached to AI, too. Yuichiro Chino/Moment via Getty Images

Considering how many people talk to their pets and cars, it shouldn’t be a surprise that chatbots can come to mean so much to people who engage with them. The next steps, though, are “strong guardrails” to prevent programs from taking advantage of that emotional connection.

3. Putting pen to paper

From the start, ChatGPT fueled parents’ and teachers’ fears about cheating. How could educators – or college admissions officers, for that matter – figure out if an essay was written by a human or a chatbot?

But AI sparks more fundamental questions about writing, according to Naomi Baron, an American University linguist who studies technology’s effects on language. AI’s potential threat to writing isn’t just about honesty, but about the ability to think itself.

American writer Flannery O'Connor sits with a copy of her novel ‘Wise Blood,’ published in 1952. Apic/Hulton Archive via Getty Images

Baron pointed to novelist Flannery O'Connor’s remark that “I write because I don’t know what I think until I read what I say.” In other words, writing isn’t just a way to put your thoughts on paper; it’s a process to help sort out your thoughts in the first place.

AI text generation can be a handy tool, Baron wrote, but “there’s a slippery slope between collaboration and encroachment.” As we wade into a world of more and more AI, it’s key to remember that “crafting written work should be a journey, not just a destination.”

4. The value of art

Generative AI programs don’t just produce text, but also complex images – which have even captured a prize or two. In theory, allowing AI to do nitty-gritty execution might free up human artists’ big-picture creativity.

Not so fast, said Eisikovits and Alec Stubbs, who is also a philosopher at the University of Massachusetts Boston. The finished object viewers appreciate is just part of the process we call “art.” For creator and appreciator alike, what makes art valuable is “the work of making something real and working through its details”: the struggle to turn ideas into something we can see.

Editor’s note: This story is a roundup of articles from The Conversation’s archives.

Molly Jackson, Religion and Ethics Editor, The Conversation

This article is republished from The Conversation under a Creative Commons license. 

Plasmonics brings the molecular world into sharper focus


People have been using metals to manipulate light for centuries. Now researchers are harnessing that ability to create powerful biosensors.

In the battle against breast cancer, the drug Herceptin is a steady ally, keeping some kinds of tumors at bay and helping people live longer. Yet nearly all cancers eventually develop resistance.

Until recently, the underlying cause of this resistance eluded researchers. But then chemist Wei Wang developed a technique to track how individual Herceptin molecules attach to cancer cells. He found that a protein in the membranes of the resistant cells was deforming the receptor molecules that Herceptin grabs onto, giving the drug no handhold.

This medical insight owes a debt to physics — specifically, the ability of metals to steer light on nanometer scales, a field of research known as plasmonics. By introducing cancer cells to Herceptin on one side of a gold wafer and watching changes in how light bounced off the other side, the researchers could see how the two parties — cancer cell and cancer drug — interacted, thus revealing a crucial mechanism of Herceptin resistance.

People have been using metals to manipulate the passage of light for centuries, though only recently has the phenomenon been used to understand diseases like cancer. Large sheets of shiny metal — a.k.a. “mirrors” — are like Do Not Enter signs for photons, reflecting back these light particles in mostly unchanged states. But microscopic metal flakes are different: They act more like traffic cops, allowing certain colors of light to pass through and blocking others.

One of the oldest examples of this phenomenon in action is a fourth century Roman chalice known as the Lycurgus Cup. Normally, the glass cup appears green and opaque. But if you illuminate the goblet from within, the glass glows a translucent red. That’s because nanometer-sized particles of gold and silver suspended in the glass reflect green light and allow red light to pass.

In creating the cup, Roman craftsmen had stumbled upon a synergy between electrons, metals and light that no one would understand for another 16 centuries: The electrons in some metals will resonate when tickled with just the right wavelength of light, which alters the path of the light itself.

Today, the field of plasmonics is flourishing. In the last 20 years or so, researchers have taken a much more deliberate approach to exploiting this behavior, creating tailored nanostructures that compress and manipulate light into volumes roughly the size of single molecules.

The ability to focus light on the nanoscale turns out to have scores of potential applications, says Caltech plasmonics pioneer Harry Atwater. The many uses are helping to solve problems in chemical sensing, data transfer, cancer therapy and navigation for self-driving cars. “That’s why this field is so exciting and why it’s been so compelling … it’s so interdisciplinary,” Atwater says.

Shedding light on biomolecules

One of the most successful applications of plasmonics is biosensing, wherein researchers try to detect the presence (or absence) of biologically relevant molecules. Normally, these are much too small to see with light, and though there are ways to tag them, these techniques are often expensive or cumbersome or alter the molecules in ways that hinder their study. Plasmonics offers an alternative by confining light to molecule-sized volumes. Under such circumstances, “what you can achieve is very strong interaction of light with matter,” says Hatice Altug, a researcher at the Federal Institute of Technology in Lausanne, Switzerland. And that makes the light very sensitive to changes in, or the presence of, individual molecules within those volumes. (For a more thorough review of how plasmonics helps with biosensing, see this 2018 paper in Chemical Reviews.)

Wang, of China’s Nanjing University, and colleagues wanted to bring this power to the field of drug development. Part of the process of designing new drugs is understanding how the molecules interact with cells in the body, but that typically requires monitoring the interaction one cell at a time, which is very labor intensive. If there were ways to see many drug molecules interacting with many cells simultaneously, Wang says, that could really speed things along.

One tried-and-true technique for tracking such interactions is electrochemistry — running an electrical current through a collection of molecules and cells. By tracking changes in the current, researchers can measure the rate at which molecules attach to and detach from the cells.
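
The attach-and-detach rates that electrochemistry measures are conventionally summarized by two constants, often written k_on and k_off, in a simple first-order binding model. A minimal sketch of that model follows; all parameter values are invented for illustration, not drawn from Wang’s experiments:

```python
# Toy first-order (Langmuir-type) binding model:
#   dB/dt = k_on * C * (B_max - B) - k_off * B
# B = bound drug, C = free drug concentration (held constant here).
# All parameter values are hypothetical.
k_on, k_off = 1e5, 1e-3   # association (1/M/s) and dissociation (1/s) rates
C, B_max = 1e-8, 1.0      # drug concentration (M), normalized binding capacity
dt, steps = 1.0, 600      # integrate in 1-second steps for 10 minutes

B = 0.0
for _ in range(steps):
    B += (k_on * C * (B_max - B) - k_off * B) * dt

equilibrium = k_on * C / (k_on * C + k_off)  # analytic steady-state occupancy
print(f"Occupancy after 10 min: {B:.2f} (steady state: {equilibrium:.2f})")
```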

But electrochemistry doesn’t allow researchers to pinpoint these interactions with any spatial precision. A researcher might measure electrical current flowing through some cells and know that it’s hindered by molecules attaching, but be blind to exactly where. In the case of Herceptin resistance, that would mean knowing that there’s been an overall change in how frequently Herceptin is sticking to cells and then breaking off, but not knowing where the change is happening.

Guided by the light

To get the precision they needed, Wang and colleagues created a plasmonics-based electrochemical imager. As he and others describe in the 2017 Annual Review of Analytical Chemistry, instead of just applying a voltage to some cells and tracking how the ensuing electrical current changes as molecules arrive and depart, they also measured how light was reflected and absorbed by a gold wafer to see just where the current got held up, and by how much.

It’s the same principle as the Lycurgus Cup. In this case, however, the light is aimed at one gold wafer, 50 nanometers thick. On one side of the wafer is a small box filled with an electrolyte solution in which scientists can place molecules and cells. A red light illuminates the other side, and a camera measures how much of this red light bounces off the wafer.

The researchers apply an oscillating voltage to the whole setup, and this sends current coursing through the solution, drawing electrons into and out of the gold. The shifting density of electrons in the gold changes how much light gets reflected to the camera. At some densities, the light bounces off, while at others the light energy triggers waves of electrons in the gold known as plasmons that skim its surface, much like ripples moving across a pond. As the voltage goes up and down, electrons scurry in and out of the gold, and the amount of reflected light pulses in sync.
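
A cartoon of that synchrony: treat the plasmon resonance as a dip in reflectance whose position tracks the electron density, which in turn follows the applied voltage. The toy model below – not real device physics, with every parameter invented – shows the reflected intensity at a fixed probe condition pulsing in step with a sinusoidal voltage:

```python
# Cartoon model only: a Lorentzian reflectance dip whose center is shifted
# by an oscillating voltage, so the intensity at a fixed probe pulses in sync.
import numpy as np

t = np.linspace(0.0, 2.0, 200)       # time, arbitrary units
voltage = np.sin(2 * np.pi * t)      # oscillating applied voltage
resonance = 0.5 * voltage            # resonance shift tracks electron density
probe, width = 0.3, 0.4              # fixed probe detuning and dip width

# Reflectance = 1 minus a Lorentzian dip centered at the shifted resonance.
reflectance = 1.0 - 0.8 * width**2 / ((probe - resonance) ** 2 + width**2)

print(f"Reflected intensity swings between {reflectance.min():.2f} and {reflectance.max():.2f}")
```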

Cell-molecule interactions are detectable because the setup is affected by the behavior of those molecules and cells in the fluid on the other side of the plate. Cells sitting close to the backside of the gold will hold back some of the current — casting an electrical “shadow” on the gold plate, pinpointing the cell’s location. Now if a molecule attaches to the cell’s membrane, the cell’s electrical properties will change, which will slightly brighten or dim its electrical shadow. This changes how light reflects off the metal at that spot, and the camera records the changes.

Wang's team used such an approach to figure out how Herceptin behaved differently around tumor cells that had developed resistance to the drug. They took resistant and nonresistant tumor cells and stuck them on the gold film. Then they added Herceptin to the electrolyte solution. Normally, Herceptin attaches to a protein in the cell membrane and thereby inhibits growth. This binding alters the plasmon wave of electrons and thus the light reflected off the metal.

But the researchers found that the reflections looked different near places on the gold layered with drug-resistant tumor cells. Some of the proteins in those cells let go of the Herceptin dozens of times faster than normal, which altered the electrical current through those cells. This, in turn, changed whether plasmons formed in the gold and modified the pattern of reflected light.

With a view to drug development, Wang says that he has now begun studies of neuron and heart cells, in particular the protein channels used to shuttle sodium and potassium ions in and out of the cells. These ion channels are “the windows of communication between the cell and its environment,” he says. “If you’re developing a new drug, you need to evaluate the potential side effects to the heart.”

It is slow, laborious work, Wang adds, but a plasmonics-based electrochemical imager could speed things up.

Plasmonics is also being applied to other kinds of imaging, delivering resolutions that traditional microscopes cannot achieve and acting as a highly sensitive biosensor for a range of applications. Researchers hope that these tabletop devices can be shrunk into handheld ones; Altug, for one, envisions a future with portable, low-cost, easy-to-use plasmonics-based drug detectors and other biosensors that don’t require trained researchers. They might even be attached to cellphones.

“You could use it in the field to monitor water, security in an airport or in resource-limited settings such as a third-world country,” she says. “We’re not quite there yet, but there is a big push.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Reaction to bronze sculpture of Coretta and Martin Luther King Jr. in Boston hasn’t been good – and that’s not bad for art that shatters conventions

The Coretta and Martin Luther King Jr. memorial sculpture at Boston Common is called ‘The Embrace.’ Lane Turner/The Boston Globe via Getty Images
Kristin Ann Hass, University of Michigan

As an acclaimed photographer and conceptual artist, Hank Willis Thomas has grown accustomed to criticism of his unconventional takes on art and identity.

But even Thomas had never experienced anything like the reaction to his latest sculpture, designed to commemorate the lives of Coretta and Martin Luther King Jr., two of the most revered civil rights leaders in modern American history.

Unveiled in January 2023, the sculpture consists of two sets of 20-foot-tall bronze arms that appear to float in midair, locked in an embrace. Visitors to the statue in Boston can also walk underneath it, into the space between the Kings’ arms.

It was in Boston, after all, that the two met and fell in love.

Despite the intended show of mutual affection between the Kings, many of the tweets shared on national news feeds after the unveiling were crude, mistaking the arms for other body parts.

Tweeters decried the work: “Disrespectful,” “Obscene,” “Phallic,” “Gross” and “Insulting.”

In the online magazine Compact, Seneca Scott, a labor union activist and cousin of Coretta Scott King, described the sculpture, titled “The Embrace,” as a “masturbatory metal homage to my legendary family members” and an insult to Black people everywhere.

As a scholar of visual culture, public memorials and race, I know these reactions to a new monument are not uncommon.

In fact, outrage is the common response.

Shattering the idea of a conventional memorial

“The Embrace” is unusual and was unveiled at a time of intense national debate about the public memorials of white men and the dismal histories of representing Black people and women.

Across the U.S., Confederate monuments and statues of Christopher Columbus and Teddy Roosevelt have been passionately defended – and have come tumbling down over the past 10 years.

This sculpture is both abstract and carefully detailed – the buttons on his coat and her jewelry are clearly articulated in bronze.

A Black man is embracing a Black woman as both of them are smiling.
Martin Luther King Jr. hugs his wife, Coretta, after he was awarded the Nobel Peace Prize in 1964. Bettmann/Getty Images

Many of the critics complained that rendering the beloved civil rights leaders as enormous floating arms did the Kings a terrible disservice.

One tweeter asked Thomas: “Why did you make it so complicated and confusing?”

Most memorials do their work with a few very familiar conventions – soldiers on horses, scantily clad buxom figures of liberty, and dignified men caught midstride, forever frozen in time.

“The Embrace” shattered those conventions – which partly explains the outrage.

In the past, the most respectful, most dignified way to represent a revered person was as fully dressed and standing tall.

“The Embrace” steps outside of memorial conventions, which is a particularly complicated thing to do when representing Black people and women.

Depicting Coretta Scott King without a whole body and without a face runs the risk of seeming to be part of a long practice of denying women the power and dignity of their male counterparts.

A Black man dressed in a dark suit is sitting on stairs made of stone.
Hank Willis Thomas, the artist who created ‘The Embrace,’ in Boston on June 14, 2022. Lane Turner/The Boston Globe via Getty Images

Most women found in public memorials are symbols of liberty, peace, justice – and at least partially naked.

They are beautiful and aspirational, and, most notably, not powerful actual people in the world.

According to Monument Lab, a public art and history nonprofit group, there are 11 times more monuments to mermaids than congresswomen in the United States.

The history of representing Black men in the United States is equally disturbing.

Figures of them are all too rare, and when they do appear, they are generic soldiers or, more often, bare-chested and kneeling, nameless or enslaved.

The artistic choice to depict Martin Luther King Jr. without a face, without an intact body and without the dignity of a straight back runs the risk of robbing him of the power he risked his life to wield as he carved out nonviolent protest in a racially hostile country.

An artist of Thomas’ caliber and experience knows he is taking those risks, and does so intentionally.

Initial reactions change over time

Some of the most beloved public art has been met with calls for a wrecking ball.

Lots of folks, for example, were very upset when the Vietnam Veterans Memorial was unveiled in 1982. One critic called the monument a “black gash of shame.”

“It is an unfortunate choice of memorial,” the New Republic wrote at the time. “Memorials are built to give context and, possibly, meaning to suffering that is otherwise incomprehensible. … To treat the Vietnam dead like the victims of some monstrous traffic accident is more than a disservice to history; it is a disservice to the memory of the 57,000.”

Designed by Maya Lin, the memorial has now become one of the most cherished pieces of public art in the U.S.

Even the Eiffel Tower was considered an eyesore by high-minded Paris art critics, some of whom described it as no more than a railroad bridge turned on its side when it was finished in 1889.

Thomas is no stranger to criticism. In fact, he embraces it.

“My belief,” he told Time magazine in a January 2023 interview, “is artists learn through critique. There’s things that we love that over time we get tired of, and there’s things that we’re not quite sure about at the beginning, but over time, we love.”

Such was the case in Philadelphia in 2017, when he unveiled his 8-foot-tall, 800-pound sculpture of an Afro pick topped with a clenched fist – the Black Power salute.

Officially called “All Power to All People,” the statue rests near Philadelphia City Hall on Thomas Paine Plaza and received initial rebukes but eventual praise.

Public art that has something to say

But one crucial idea is missing from most of the criticisms of “The Embrace.”

In my view, memorials and monuments are not actually made to mark a shared history or to maintain the status quo, as some have argued. It’s my belief that the people who build and design them have a point they want to make in the world.

A statue of arms and hands has a space underneath where visitors can walk.
Another view of ‘The Embrace’ shows the space underneath the statue. Craig F. Walker/The Boston Globe via Getty Images

Confederate memorial groups such as the United Daughters of the Confederacy had a vision when they erected monuments like the sculpture of Gen. Robert E. Lee riding atop his horse Traveller, unveiled in Richmond, Virginia, in 1890.

And Thomas had his vision for “The Embrace.”

The magic of memorials and monuments is that they seem natural and eternal in our landscape, but they are neither.

What Thomas does in “The Embrace” is ask us to see the Kings, simply yet powerfully, in a new light.

Kristin Ann Hass, Professor of American Culture, University of Michigan

This article is republished from The Conversation under a Creative Commons license.