Monday, April 10, 2023

How to deal with work stress — and actually recover from burnout

Mindfulness, detachment, selecting off-time activities with care: Here are evidence-based strategies to achieve healthy work-life balance

There’s job stress, and then there’s the crushing pressure paramedics went through during the first wave of the Covid-19 pandemic. The uncertainty, the dread, the constantly changing protocols, the shortages of personal protective equipment, the multiple calls to the same nursing home — it was almost too much for Kate Bergen of Manahawkin, New Jersey.

“It felt like everything was closing in around us,” Bergen says. “At some point I knew that I couldn’t take any more. Was I headed for a meltdown? Was I going to just walk off the job one day? I was getting very close to that point.”

Instead of quitting, Bergen found a calling. One day while waiting for the next emergency call, she took a picture of herself in her full PPE. The image inspired her to paint a self-portrait poster in the style of World War II icon Rosie the Riveter. The message: “We need you to stay home.”

It was the first in a series of “Rosie” posters of women first responders, an ongoing project that has helped Bergen calm her mind during her downtime. Ultimately, she says, the Rosies helped her withstand the stress of her job and allowed her to show up to work each day with new energy and focus. “They made it possible for me to keep going.”

While workers like Bergen are responding to emergency calls and saving lives, many of us are doing things like responding to emails and saving receipts from business trips. But even for people with jobs in offices, restaurants and factories, there’s an art and a science to making the most of downtime, says Sabine Sonnentag, a psychologist at the University of Mannheim in Germany. The right approach to non-work time can help prevent burnout, improve health and generally make life more livable. “When a job is stressful, recovery is needed,” says Sonnentag, who cowrote an article exploring the psychology of downtime in the 2021 issue of the Annual Review of Organizational Psychology and Organizational Behavior.

Workers everywhere are feeling frazzled, overwhelmed and ready for the weekend. With that backdrop, researchers are doing work of their own to better understand the potential benefits of recovery and the best ways to unwind. “Work recovery has become part of the national conversation on well-being,” says Andrew Bennett, a social scientist at Old Dominion University in Norfolk, Virginia. “There’s a growing awareness that we can’t just keep working ourselves to death.”

At a time when many people are rethinking their jobs (if they haven’t already quit), they should also be thinking about their quality of life away from work, Sonnentag says. “People should ask themselves, how much free time do I have and how much energy do I have for my free time? How do I want to continue my life?”

A weekend paradox

We can all use a chance to unplug and unwind, but here’s the rub: Recovery from work tends to be the most difficult and elusive for those who need it most. “We call it the ‘recovery paradox,’” Sonnentag says. “The odds are high that when a job is stressful, it’s difficult to have an excellent recovery.”

That paradox was underscored in a 2021 analysis that combined results from 198 separate studies of employees at work and at home. Workers with the most mentally and emotionally draining jobs were also the least likely to feel rested and rejuvenated during their off time. Interestingly, people with physically demanding jobs — construction workers, furniture movers and the like — had much less trouble winding down. The surest way to feel lousy after hours, it appears, is to think too hard at work.

Sonnentag authored a 2018 study published in Research in Organizational Behavior that helped to explain why the paradox is so hard to escape. People who were more stressed out at work tended to get less exercise and worse sleep, an ideal scenario for feeling less than great. In other words, stressful work can disrupt the very fundamentals of healthy living.

To help workers break out of that destructive loop, researchers are pondering both sides of the work/life cycle. As Sonnentag explains, certain tasks, obligations and workplace cultures make it especially hard to unwind when work is done. Time pressure, the feeling that one is constantly under the gun, is especially disruptive. Jobs in health care, where that time pressure often combines with life-and-death stakes, tend to be especially taxing. Working with customers can be exhausting too, Sonnentag says, partly because it takes a lot of focus and effort to act cheerful and friendly when you don’t always feel that way deep down, a task known as emotional labor.

The demands of work vary widely from one person to the next, and so do approaches to downtime. Recovery is highly individual, and different people will have different strategies. “We don’t have a single prescription,” Bennett says. Researchers have grouped approaches into broad categories, including “relaxation” and “mastery.” Relaxation, a concept that’s easier to grasp than it is to achieve, includes any activity that calms the body and mind, whether it’s walking through a park, reading a good book or watching a zombie hunter movie on Netflix. (Note: The latter may not be an ideal choice if your actual job involves hunting zombies.)

Mastery, meanwhile, can be achieved through any activity that challenges a person to be good (or at least passable) at a new skill. Just as painting helped Bergen cope with stress, workers can find relief in their accomplishments. “Anything associated with learning can be helpful,” Sonnentag says. “It could be some kind of sport or exercise. It can be something like learning a new language or trying new cuisines when cooking.” A 2019 study that followed 183 employees over 10 workdays found that people who achieved some sort of mastery during their off time were more energetic and enthusiastic the next morning.

For people who need a break, the “why” behind a particular activity can be as important as the “what.” A 2013 study that followed 74 workers for five days found that people who spent their off time with activities and tasks that they actually wanted to do — whatever they were — were more lively and energetic the next day than those who felt obligated or forced to do something.

Whether they’re relaxing or creating during their time away from the office, Bennett says stressed-out workers should strive to think about something other than their jobs, a process that psychologists call detachment. (The TV show Severance takes this concept to extremes.) It’s OK to have great ideas in the shower and regale your partner with office anecdotes, but research shows people with stressful jobs tend to be happier and healthier if they can achieve some mental and emotional distance from work.

The benefits of tuning out became clear in a 2018 report involving more than 26,000 employees in various lines of work, including judges, teachers, nurses and office workers. The analysis, coauthored by Bennett, found that detachment was a powerful buffer against work-related fatigue. Workers who said they were able to think about things other than work while at home were less worn out than their colleagues. On the other hand, workers who carried on-the-job thoughts throughout the day were more likely to feel exhausted.

Vacations can also help erase work stress and prevent burnout, to a point. Sonnentag coauthored a 2011 study that used questionnaires to track 131 teachers before and after vacations. The teachers returned to work feeling refreshed and engaged, but those benefits tended to fade after only a month. The post-vacation high was more fleeting for teachers with especially demanding jobs, but it lingered a bit longer for those who managed to fit relaxing leisure activities into their regular routine.

How much vacation is enough? That question is hard to answer, Sonnentag says. While many European workers expect and demand four- or five-week breaks, she says there’s no evidence that such long vacations offer any more chance for recovery than a vacation of one or two weeks. She does feel confident saying that most workers will need at least occasional breaks that are longer than just a weekend, especially if that weekend is largely eaten up by household chores and other non-work obligations.

Perhaps an extra day off each week would make a big difference. That’s the premise driving an ongoing four-day-workweek experiment involving 70 companies in the UK. The businesses, including banks, robotics manufacturers, and a fish and chips restaurant, are all expecting employees to maintain their productivity despite working one day less each week. The full results won’t be available until 2023, but early data suggest that the four-day workweek has decreased signs of burnout and stress while improving life satisfaction and feelings of work-life balance, reports Wen Fan, a sociologist at Boston College who is helping to conduct the experiment. “The results are very encouraging,” she says.

Fan says it’s too early to know if the employees and companies were able to stay as productive as ever during the experiment, but she notes that most jobs could be done more efficiently with a little extra planning and streamlining. “A lot of time is wasted on distractions and meetings that go on too long,” she says.

No matter how many days a week a person has to work, minibreaks during the day can help, too. A 2020 survey-based study involving 172 workers in the US found that subjects tended to be in better moods and were less emotionally exhausted toward the end of the workday if they had breaks that allowed them to briefly detach from work. The study also tracked mindfulness, the degree to which people are conscious of their present emotions and circumstances. They did this by asking the participants how much they agreed with statements such as “Today at work I was aware of different emotions that rose within me.” Employees who were the most mindful were also the most likely to truly check out and relax during their breaks from work.

A 2021 study of college students took a closer look at relaxation and exercise during work breaks. Those who tried progressive muscle relaxation, a low-stress activity that involves tensing and releasing muscles, reported more detachment during the break, while students who got their blood pumping on an exercise bike had more energy for the rest of their day. Study coauthor Jennifer Ragsdale, now a research psychologist at the National Institute for Occupational Safety and Health in Cincinnati, says that a better appreciation for the nuance of work breaks can help people choose the right approach for a given day. “If you need some sort of pick-me-up, you can walk round the building to get your energy going,” she says. “If you’re feeling overwhelmed, you can relax.”

As many people have discovered during the pandemic years, it can be challenging to fully check out from work when your living room is also your office. Speaking with at-home workers, Bennett has collected tips for separating work life and life life. Something as simple as wearing a collared shirt or other office attire during work hours and changing into casual wear at the end of the day can help establish boundaries, he says. Using a dedicated laptop for work and putting any work-related materials out of sight at the end of the day can also create much-needed distance.

Ragsdale says that technology can be both an escape and a tether. The same devices that help us play games, listen to podcasts or struggle with online word puzzles also make it possible to receive work emails and other reminders of life outside of the home. Ragsdale cowrote a 2021 commentary calling for more research into the impacts of cell phones on work recovery. “When you’re continuing to be exposed to work through your cell phone, it’s harder for that recovery process to unfold,” she says. The very sight of a work email can trigger thoughts that are just as stressful as the actual job, she adds.

Not many people can completely let go of their phones when they’re at home, but they can take steps to protect themselves from intrusive work pings. “You can adjust your settings in a way that makes your phone less appealing,” she says, including turning off notifications for things like email and Twitter.

Bergen can’t be away from her phone when she’s on call, but she can still feel like she’s in her own world when she’s working on a new “Rosie” painting. Psychologists may call it mastery, but for her it’s a validation and an escape. She has recently started painting women first responders who were on duty for both 9/11 and Covid. “I started out painting one thing for myself and it blossomed,” she says. “It’s turned into something beautiful.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Rising unemployment: economists sometimes say it’s good for the economy, but are they right?

Daragh O'Leary, University College Cork

Signs of a global economic downturn are growing by the day. Inflation is still going up, debt is up and interest rates are up, which means that projections for growth are down. Put simply, the proverbial something is close to hitting the fan.

Business closures and job losses are likely to become another hurdle for the global economy – and that points to rising unemployment. Yet, while most people would think of rising unemployment as a bad thing, some economists don’t entirely agree.

Economists have long pointed to a counterintuitive positive relationship between unemployment and entrepreneurship, born of the fact that people who lose their job often start businesses. This is often referred to within economic literature as necessity-based or push-factor entrepreneurship.

Where it gets tricky

There is certainly good evidence for the existence of this contradictory relationship. The graph below shows the rates of UK business creation in blue and unemployment in red. As you can see, unemployment started to increase during the global financial crisis of 2007-09 and business creation followed not long after.

[Graph: UK new business creation and unemployment rates, 2006-2020]

This relationship between business creation and unemployment has previously been used by some as a justification for cold social policies towards the unemployed on the rationale that “the market fixes itself” in the long run. They see business closures and job losses not as human miseries that require government help, but as necessary evils that are needed to reallocate the money, people and other resources back into the economy in more efficient ways.

But my latest research has found that rising unemployment is not quite the silver bullet for reigniting the economic engine that it’s cracked up to be. I looked at 148 regions across Europe from 2008 to 2017. Although I did find evidence that unemployment can stimulate business creation over time, this only seems to happen in higher performing regions within higher performing economies such as the Netherlands, Finland and Austria.

In lower performing regions within lower performing economies such as Bulgaria, Romania and Hungary, the relationship between unemployment and business creation actually appears to be negative. In other words, rather than inducing business creation, unemployment simply seems to lead to more unemployment.

The reason why higher performing regions in wealthier areas have a positive relationship between job losses and business creation is that they enjoy what are known as “urbanisation economies”. These are positive benefits derived from the scale and density of economic activity occurring within that area, including wider arrays of services, greater pools of customers and greater numbers of transactions relative to other areas of the economy.

For example, a firm located in a capital city like London will benefit from more abundant access to consumers, suppliers and lenders as well as larger labour pools. The higher population density in these areas also makes it more likely that firms and workers will learn faster as they observe the activities of their many neighbours. In more peripheral areas with fewer of these characteristics, the opposite is true. This is why unemployment affects different places differently.

What it means

One consequence is that economists need to stop explaining how economies perform differently based solely on national factors. And it’s not just unemployment where this becomes apparent. For example, Ireland’s longstanding low rate of corporation tax (12.5%) has been cited as a reason for its high foreign direct investment, which accounts for roughly 20% of private sector employment.

Yet while just over 43% of all Irish enterprises in 2020 were located in either Dublin or Cork, counties like Leitrim in the north accounted for fewer than 1% of enterprises. So while national measures can help induce entrepreneurship and increase the overall size of the pie, the pie is shared very unequally. Just as rising unemployment can benefit some areas while hindering others, the same is true of government interventions.

We therefore need to stop viewing the free market and government intervention as either wrong or right. In some contexts one is going to be more helpful, while in other contexts it will be the opposite. Recognising this reality would improve on much of the polarised debate in politics and economics, in which those on the right can come across as cold and ignorant, while those on the left can seem self-righteous and sanctimonious, viewing capitalism and markets as dirty words.

How does this apply to today’s gathering downturn? It would make sense for governments to prioritise supporting businesses in more peripheral regions, while leaving those in wealthier urban areas to fend for themselves.

The famous economist John Kenneth Galbraith gave what I believe to be one of the best pieces of commentary on this topic, saying:

Where the market works, I’m for that. Where government is necessary, I’m for that … I’m in favour of whatever works in the particular case.

If we are to survive this upcoming recession and get things going again, we are going to need to acknowledge that centralised “one-size-fits-all” policies won’t be useful everywhere. The solutions to economic recovery are in some cases government intervention and in others the free market, but not always one or the other.

Daragh O'Leary, PhD Researcher in Economics, University College Cork

This article is republished from The Conversation under a Creative Commons license. Read the original article.

River Cruises Offer Exploration, Comfort

Travel for pure enjoyment is on the rise, so this may be your year to plan the ultimate dream vacation.

According to Sports and Leisure Research, 80% of people surveyed believe a vacation does wonders for mental health and travel is a top spending priority in the coming year. The survey indicated travelers want to immerse themselves in unique experiences, including new cultures, foods and people.

For those who delight in exploring entire regions, one downfall can be the burden of packing and unpacking at each new destination. Taking your accommodations with you is a practical alternative for curious travelers.

River voyages, for example, allow travelers to unpack once and visit multiple destinations in one seamless journey, from major European cities to quaint towns and villages. These destination-focused journeys offer experienced travelers the opportunity to explore science, history and cuisine with culturally enriching itineraries on the world’s great waterways.

If an intimate, relaxed journey is your ideal getaway, you may want to consider the revolutionary Viking Longships. These state-of-the-art river ships are engineered with guests’ comfort and exploration in mind.

Sailing Europe’s storied rivers, the award-winning fleet of identical longships showcases innovative engineering, streamlined Scandinavian design and understated elegance. River ships are also small enough – hosting 190 guests – to dock in the heart of popular destinations, making it easy to explore.

The voyages range from 8-23 days with itineraries featuring Europe’s Rhine, Main, Danube, Seine, Rhône, Douro, Moselle, Elbe, Dordogne, Garonne and Gironde Rivers.

Known as travel experiences for “The Thinking Person,” each Viking journey includes a shore excursion in every port and an onboard and onshore enrichment program that provides deep immersion in the destination through performances of music and art, cooking demonstrations, informative port talks and carefully selected guest lecturers. Enjoy shore excursions that provide historical tours and visits to unique haunts where you can experience some of the local culture, regional foods and everyday life.

On a Viking Longship, you can expect to relax in spacious public areas, including wide-open sun decks with ever-changing views. The ships feature spacious staterooms in a variety of categories, including true two-room suites with full-size verandas.

Additional ship highlights include al fresco dining on an indoor-outdoor terrace and onboard amenities including a restaurant, bar, lounge and library. Inclusive fares that cover your port taxes and fees also mean you can enjoy beer, wine and soft drinks with onboard lunch and dinner; specialty coffees and teas; bottled water; ground transfers; and more.

Chart your next adventure at viking.com.
SOURCE:
Viking River Cruises

Unplugging asthmatic airways



New therapies that involve the removal of mucus in the lungs might be the best strategy to beat asthma

Blessing Azeke wrapped her cardigan around her body as another asthma attack set in. Provoked by cold air from an overhead fan in her law school classroom in Enugu, Nigeria, her lungs refused to let her breathe. The attack made Azeke so weak that she could hardly move on her own. She was rushed to the school’s clinic yet again.

For Azeke and more than 260 million other people with asthma worldwide, such attacks are a constant threat. Cold air, allergens and other triggers cause inflammation in their lungs, narrowing the air passages and increasing mucus production. Often, plugs of mucus block smaller airways completely, and this obstruction is a major cause of the nearly half-million deaths caused by asthma each year.

Coping with severe asthma like Azeke’s can be tricky because existing therapies don’t treat all facets of the disease. Anti-inflammatories such as corticosteroids reduce inflammation and swelling in the airways, but they don’t prevent excessive mucus secretion or clear existing mucus plugs from the lungs. And therapies targeted at clearing airway mucus do not reduce inflammation and barely reduce the overactive secretion of mucus or dissolve plugs in the airways.

Today, researchers are working on several promising new treatments to prevent or clear mucus plugs that may leave people with asthma breathing easier.

At the heart of the problem is mucus itself, a viscous mixture of water, cellular debris, salt, lipids and proteins that performs the crucial job of trapping foreign particles and ferrying them out of the lungs. The primary component of this fluid is a family of proteins known as mucins, which give mucus its gel-like thickness. In people with asthma, genetic changes in mucin proteins make the mucus thicker and harder to clear from the lungs.

When that happens, allergens, pollutants and pathogens can accumulate in the lungs, triggering inflammation that leads to further mucus secretion as the body works to rid itself of the threats. The result is the accumulation of airway-blocking mucus plugs.

Currently, doctors treat mucus plugs with inhalable medications such as bronchodilators to widen airways, corticosteroids to reduce inflammation to enable the easier flow of mucus, and drugs called mucolytics that break down the mucins themselves.

However, the only available mucolytic, known as N-acetylcysteine (NAC), is not very effective at breaking the bonds in mucin. And at high doses, it can cause cough and raise the risk for bacterial pneumonia and other adverse side effects. “It’s very rarely used because its activity is very, very weak and people have to take very high doses of it to get effects,” says Christopher Evans, a pulmonary scientist at the University of Colorado Anschutz Medical Campus who studies how airway mucins regulate respiratory health and diseases.

To address this shortcoming, Evans and others are trying to discover more effective mucolytics to dissolve mucus plugs. He and his team recently tested a different bond-breaking agent, similar to NAC but much more potent and effective at breaking up mucus plugs.

In their studies, the researchers exposed mice to a fungal allergen once a week for four weeks. This stimulated inflammation and mucus overproduction, mimicking a full-blown asthma attack. Then they treated the mice with a mucolytic agent known as tris(2-carboxyethyl)phosphine. The treatment improved mucus flow, the team found, allowing the asthmatic mice to clear mucus just as well as mice that had not been exposed to the allergen, with higher doses yielding better results.

Evans cautions that the bonds that hold mucins together are also found in other proteins, so the risk of side effects is high. Finding a drug that will break bonds only in mucins, he says, “is still pretty far from reality at this point.”

Clearing crystals

In a different approach to the problem, immunologist Helena Aegerter of Ghent University in Belgium and her colleagues are focusing on what they believe drives mucus overproduction in asthma: protein crystals called Charcot-Leyden crystals (CLCs) that form as byproducts of dead white blood cells called eosinophils. The presence of these crystals in mucus makes it thicker and more challenging to clear from airways.

Other researchers had already shown that these crystals induce inflammation in the lungs by recruiting immune cells. And earlier work by Aegerter and colleagues showed that the crystals enhance mucus production in mice with chronic asthma. Perhaps, she thought, dealing with the crystals could be the best way to avoid the formation of mucus plugs. “You can target mucus and you can target inflammation, but while you still have these crystals in the airways, they are always going to drive a vicious cycle of mucus production and inflammation,” says Aegerter, who coauthored an article on the pathology of asthma in the 2023 Annual Review of Pathology.

To address the crystals directly, Aegerter and her colleagues developed antibodies in llamas, then engineered them to more effectively attack the proteins in the crystals. They then tested them on mucus samples collected from people with asthma. The antibodies successfully dissolved the crystals by attaching to specific regions of the CLC proteins that hold the crystals together, the team reported in 2022. The antibodies also neutralized inflammatory reactions in mice. Based on these findings, the scientists are working on a drug that would have the same effect on people.

“Our strategy, now, is really to target the crystals at the heart of the mucus plug. And by getting rid of the crystals, hopefully, that will put a stop to all the mucus production and inflammation that happens around these airways,” says Aegerter. The approach could work not just for asthma, she says, but also for a variety of other inflammatory diseases involving mucus oversecretion, such as inflammation of the sinuses and some allergic reactions to fungal pathogens.

Taming the flow

In a third approach, pulmonologist Burton Dickey of the University of Texas MD Anderson Cancer Center is working to avert mucus plugs by preventing excess secretion of mucus. After 20 years of work on airway mucin secretions, Dickey’s team published a paper in 2022 identifying a protein, synaptotagmin 2 (Syt2), that is involved in the excessive mucus secretion experienced by people with asthma and other conditions.

Dickey and his team induced mucus overproduction in mice by exposing their airways to an inflammatory molecule called interleukin-13 (IL-13). In mice lacking the gene for Syt2, they found, IL-13 caused only normal mucus production in the mice’s lungs. In other words, it looked as if Syt2 was central to excess mucus production but had no role in normal mucus production, which is regulated by a different mechanism. That was promising: It suggested that a drug could be made to block excessive production only.

With this win under their belt, Dickey’s team next designed a molecule, which they called PEN-SP9-Cy3, that would block the action of Syt2 in inflamed lungs. When they tested this molecule on mice and on human cells in culture, they found that it significantly reduced the amount of mucins secreted.

One day, says Dickey, “we hope that when someone with a severe asthma attack comes to the emergency room they can breathe in our drug and it will prevent any further mucus plugging.”

In just six months during law school, Blessing Azeke had 10 emergency room visits. If any of these new approaches for treating asthma pan out, people like her can look forward to a future with fewer medical crises.

Editor’s note: The article was amended on February 16, 2023, to clarify that Christopher Evans works at the University of Colorado Anschutz Medical Campus.


This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Sunday, April 9, 2023

Celebrate Easter with Family-Friendly Fun

Gathering for Easter means bright decor and flavorful food with those you hold nearest, and adding some extra “egg-citement” to the holiday can come easily. Let your inner kid shine through with games and activities everyone can enjoy, whether it’s at the kitchen table, in the backyard or gathered around for story time.

Hunt for Easter Eggs
Likely the most popular Easter activity of all, hiding plastic eggs full of candy and goodies for kiddos to “hunt” brings plenty of smiles and giddiness. Whether your gatherings typically take place in a family member’s backyard or a local park, it’s an exciting way to get youngsters outdoors for a friendly (yet probably competitive) game.

Bake Desserts
Every holiday comes with its own flavorful traditions and Easter is no exception. From sweet, fresh, fruity desserts to chocolatey delights, baking your family’s favorites is a fun way to bring everyone together in the kitchen. Assigning specialized roles is an easy solution for ensuring all feel involved. Little ones can gather and organize ingredients while older kids measure cups, tablespoons and teaspoons to show off their math skills. Finally, adults can handle cutting and cooking so safety comes first.

Decorate Eggs
Keep the fun in the kitchen by using eggs (real or plastic) as the canvas for creativity. Dyes are a popular choice, but you can also paint or simply use markers to decorate to your heart’s desire. Add final touches with glitter, fabric or ribbons to truly make your creation your own.

Enjoy the Outdoors
Depending on where you live, Easter often presents opportunities to celebrate outdoors. Turn back the clock with kid-friendly classics like tag, hide and seek, backyard sports and more. The best part: These beloved games are meant for all ages, meaning everyone in the family can get in on the fun.

Pass Down Family Stories and Traditions
Whether your loved ones live down the street or across the country, holidays bring people together. These moments spent sharing meals, playing games and looking back on the past are a perfect opportunity for passing down stories and traditions, from studying the family tree to sharing the secrets to favorite recipes. Encouraging elders to share their experiences helps ensure traditions are passed from generation to generation and connects the past to the present and future.

Find more Easter “egg-tivities” to share with your loved ones at eLivingtoday.com.

Making computer chips act more like brain cells

Flexible organic circuits that mimic biological neurons could increase processing speed and might someday hook right into your head

The human brain is an amazing computing machine. Weighing only three pounds or so, it can process information a thousand times faster than the fastest supercomputer, store a thousand times more information than a powerful laptop, and do it all using no more energy than a 20-watt lightbulb.

Researchers are trying to replicate this success using soft, flexible organic materials that can operate like biological neurons and someday might even be able to interconnect with them. Eventually, soft “neuromorphic” computer chips could be implanted directly into the brain, allowing people to control an artificial arm or a computer monitor simply by thinking about it.

Like real neurons — but unlike conventional computer chips — these new devices can send and receive both chemical and electrical signals. “Your brain works with chemicals, with neurotransmitters like dopamine and serotonin. Our materials are able to interact electrochemically with them,” says Alberto Salleo, a materials scientist at Stanford University who wrote about the potential for organic neuromorphic devices in the 2021 Annual Review of Materials Research.

Salleo and other researchers have created electronic devices using these soft organic materials that can act like transistors (which amplify and switch electrical signals), memory cells (which store information) and other basic electronic components.

The work grows out of an increasing interest in neuromorphic computer circuits that mimic how human neural connections, or synapses, work. These circuits, whether made of silicon, metal or organic materials, work less like those in digital computers and more like the networks of neurons in the human brain.

Conventional digital computers work one step at a time, and their architecture creates a fundamental division between calculation and memory. This division means that ones and zeroes must be shuttled back and forth between locations on the computer processor, creating a bottleneck for speed and energy use.

The brain does things differently. An individual neuron receives signals from many other neurons, and all these signals together add up to affect the electrical state of the receiving neuron. In effect, each neuron serves as both a calculating device — integrating the value of all the signals it has received — and a memory device: storing the value of all of those combined signals as an infinitely variable analog value, rather than the zero-or-one of digital computers.

Researchers have developed a number of different “memristive” devices that mimic this ability. Running electric current through them changes their electrical resistance. Like biological neurons, these devices calculate by adding up the values of all the currents they have been exposed to. And they remember that sum through the value their resistance takes as a result.

A simple organic memristor, for example, might have two layers of electrically conducting materials. When a voltage is applied, electric current drives positively charged ions from one layer into the other, changing how easily the second layer will conduct electricity the next time it is exposed to an electric current. (See diagram.) “It’s a way of letting the physics do the computing,” says Matthew Marinella, a computer engineer at Arizona State University in Tempe who researches neuromorphic computing.

The technique also liberates the computer from strictly binary values. “When you have classical computer memory, it’s either a zero or a one. We make a memory that could be any value between zero and one. So you can tune it in an analog fashion,” Salleo says.
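The compute-and-remember behavior described above can be captured in a toy numerical sketch. Everything here — the `ToyMemristor` class, its parameters and its update rule — is invented purely for illustration; it is not a model of any real device’s physics:

```python
# Toy sketch of a memristive synapse: each voltage pulse nudges the
# device's conductance, so the device both "computes" (summing the
# pulses it has received) and "remembers" (storing that running total
# as an analog value between a minimum and a maximum).
# Illustrative only -- not the physics of any real memristor.

class ToyMemristor:
    def __init__(self, g_min=0.0, g_max=1.0, step=0.1):
        self.g = g_min          # conductance: the stored analog value
        self.g_min, self.g_max = g_min, g_max
        self.step = step        # how much one pulse changes the state

    def pulse(self, polarity=+1):
        """Apply one voltage pulse; +1 strengthens, -1 weakens the state."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))

    def read(self, v=0.1):
        """Small read voltage: current = conductance * voltage (Ohm's law)."""
        return self.g * v

m = ToyMemristor()
for _ in range(4):              # four strengthening pulses...
    m.pulse(+1)
m.pulse(-1)                     # ...and one weakening pulse
print(round(m.g, 2))            # prints 0.3 -- an analog state, not a 0 or 1
```

Note that the stored state can sit anywhere between `g_min` and `g_max`, which is the “tune it in an analog fashion” property Salleo describes, and that reading it back is just letting the physics (Ohm’s law, in this sketch) do the computing.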

At the moment, most memristors and related devices aren’t based on organic materials but use standard silicon chip technology. Some are even used commercially as a way of speeding up artificial intelligence programs. But organic components have the potential to do the job faster while using less energy, Salleo says. Better yet, they could be designed to integrate with your own brain. The materials are soft and flexible, and also have electrochemical properties that allow them to interact with biological neurons. 

For instance, Francesca Santoro, an electrical engineer now at RWTH Aachen University in Germany, is developing a polymer device that takes input from real cells and “learns” from it. In her device, the cells are separated from the artificial neuron by a small space, similar to the synapses that separate real neurons from one another. As the cells produce dopamine, a nerve-signaling chemical, the dopamine changes the electrical state of the artificial half of the device. The more dopamine the cells produce, the more the electrical state of the artificial neuron changes, just as you might see with two biological neurons. (See diagram.) “Our ultimate goal is really to design electronics which look like neurons and act like neurons,” Santoro says. 

The approach could offer a better way to use brain activity to drive prosthetics or computer monitors. Today’s systems use standard electronics, including electrodes that can pick up only broad patterns of electrical activity. And the equipment is bulky and requires external computers to operate.

Flexible, neuromorphic circuits could improve this in at least two ways. They would be capable of translating neural signals in a much more granular way, responding to signals from individual neurons. And the devices might also be able to handle some of the necessary computations themselves, Salleo says, which could save energy and boost processing speed.

Low-level, decentralized systems of this sort — with small, neuromorphic computers processing information as it is received by local sensors — are a promising avenue for neuromorphic computing, Salleo and Santoro say. “The fact that they so nicely resemble the electrical operation of neurons makes them ideal for physical and electrical coupling with neuronal tissue,” Santoro says, “and ultimately the brain.”


This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Paleogenomic research has expanded rapidly over the past two decades, igniting heated debate about handling remains. Who gives consent for study participants long gone — and who should speak for them today?

The 2022 Nobel Prize in physiology and medicine has brought fresh attention to paleogenomics, the sequencing of DNA of ancient specimens. Swedish geneticist Svante Pääbo won the coveted prize “for his discoveries concerning the genomes of extinct hominins and human evolution.” In addition to sequencing the Neanderthal genome and identifying a previously unknown early human called Denisova, Pääbo also found that genetic material of these now extinct hominins had mixed with that of our own Homo sapiens after our ancestors migrated from Africa some 70,000 years ago.

The study of ancient DNA has also shed light on other migrations, as well as the evolution of genes involved in regulating our immune system and the origin of our tolerance to lactose, among many other things. The research has also ignited ethical questions. Clinical research on living people requires the informed consent of participants and compliance with federal and institutional rules.

But what do you do when you’re studying the DNA of people who died a long time ago? That gets complicated, says anthropologist Alyssa Bader, coauthor of an article about ethics in human paleogenomics in the 2022 Annual Review of Genomics and Human Genetics.

“Consent takes on new meaning” when participants are no longer around to make their voices heard, Bader and colleagues write. Scientists instead must regulate themselves, and navigate the sometimes contradictory guidelines — some of which prioritize research outcomes; others, the wishes of descendants, even very distant ones, and local communities. There are no clear-cut, ironclad rules, says Bader, now at McGill University in Montreal, Canada: “We don’t necessarily have one unified field standard for ethics.”

Take, for example, research at Pueblo Bonito, a massive stone great house in Chaco Canyon in New Mexico, where a community thrived from 828 to 1126 AD under the rule of ancestral Puebloan peoples. In the late 1800s, archaeologists from the American Museum of Natural History started excavations there, unearthing more than 50,000 tools, ritual objects and other belongings, as well as the remains of 14 people. These human bones remained stored in boxes and drawers, allowing non-Indigenous researchers to study them. Recently, a research team extracted and analyzed their DNA. The study, published in 2017, suggested an exciting finding: The remains found in Pueblo Bonito once belonged to members of a matrilineal dynasty, and leadership at Chaco Canyon was likely passed through a female line that persisted for hundreds of years until the society collapsed.

But the research sparked fierce ethical discussions. Several anthropologists and geneticists, Bader included, criticized the study for its lack of tribal consultation — the Puebloan and Diné communities, who still live in the area, were not asked for permission to carry out the research. The critics also cited the dehumanizing language (such as “cranium 8” or “burial 14”) that authors used to describe the Pueblo ancestors and warned that the controversy would exacerbate feelings of distrust toward scientists.

Bader spoke with Knowable Magazine about what we can learn from research on ancient DNA and why considering the ethics around it is such an urgent task for the field. This conversation has been edited for length and clarity.

What is ancient DNA? And where can we find it?

Well, ancient DNA is the DNA that’s been preserved over hundreds or thousands of years. And it can be from humans, from animals, from plants, from microbes, viruses, bacteria.

An easy explanation would be “DNA from non-living beings.” We have DNA from woolly mammoths, we have ancient DNA from Neanderthals, and we have DNA from more recent human ancestors. So it’s a huge span.

If we’re talking about DNA from humans, we can get it from teeth, from bone, from hair. We find it from coprolites, which is poop. We can get it from something that someone chewed on. Any way that you would leave your DNA now as a living human could also potentially be preserved for the future as well.

With the development of next-generation sequencing technology, there has been a dramatic proliferation of research on ancient genomes from ancestral humans, from very few published before 2009 to more than 1,000 by 2017. What have we learned from peeking into the genomes of ancestral humans?

Oh, there’s a lot of different types of questions that we can address by looking at the DNA from human ancestors. We can see how closely or distantly related they were across continents, across time spans. We can see population movements. We can see how humans and their environments interacted.

But all of it, I think, boils down to just understanding a little bit more about what makes humans who we are now. Ancient DNA is simply using a genomic perspective to understand what things happened in the past to shape what humans are today.

This is something that you’ve also tried to do with your own research, right?

Yes. Part of my family is Tsimshian from southeast Alaska. I grew up in Washington state. But when I was a kid, and even now, one of the things that I really enjoy doing to maintain a close relationship with my family in Alaska is go up and spend the summer fishing. I just went fishing with my grandpa, my uncles and my dad last summer.

That influenced the research I do now, which is thinking about how traditional foods — such as salmon, in my family — shape people’s oral microbiome, the community of bacteria that are in our mouths. And there’s been research showing those bacteria can impact our health outside of the mouth. If they get out of balance, they can cause problems in other areas of your body. They can also support your health.

My research looks at the relationship between traditional food in Indigenous communities of the Pacific Northwest, mostly in Alaska and British Columbia, and how they can support the biological resilience and health of these communities. In short, understanding the way that our diet might be impacting our health on a microbial level.

And you’re also studying the oral microbiome of Tsimshian ancestors.

Yes, we’re comparing the microbes that we find in our ancestors’ mouths with microbes in descendant communities, and trying to answer what folks are eating now, what ancestors were eating in the past, how that stayed the same, how that’s changed through time, and then how that correlates with the microbiome.

When scientists study the DNA of living people, some sort of institutional committee reviews those projects to make sure they are carried out in an ethical manner. What happens when the people you study have been dead for a long time?

The idea of consent and what it means in the context of ancient DNA research is a big challenge in the field. Ancestors themselves don’t have a way to either consent to being part of research or to withhold their consent, the way that a living person who opts into genetic research can. We don’t have a good way to do that with ancestors.

There are a lot of different approaches that researchers take to that, though the one I advocate for, and model my own research practice after, is what we call community-collaborative research. Here, descendant communities stand in for the ancestors, and part of that is because data from ancestors can impact these modern communities.

In what way, exactly?

Well, we can’t really act as if ancestors just exist in this prehistoric or historic bubble and that understanding or learning new things about them doesn’t impact folks who are living now.

These things can tell us a lot about a specific group of ancestors, sure, but they might also be part of the history of living communities. For example, there are researchers looking at relatedness between communities, looking at population histories and migration and movements.

How do you approach your research on the field and with the communities involved?

My approach is about building the relationship with the community as research partners. So I’m not just approaching for permission.

For example, for one of the communities that I worked with, I went out there, introduced myself and had community meetings. I talked about my research expertise and the types of things I was interested in — but I also heard the kind of research that they were interested in. Then we were able to chat about what methods could be used to explore those mutual research interests and plan the project together. I got formal permission to go to the museum where their ancestors were, to be able to look at them and collect samples from them.

I provided updates about where we were in the research process. This was before Covid, so I went out every summer to provide updates. And then, when we started to get data from these analyses, I was interpreting that data with the community. Instead of me presenting it as, you know, “These are the results; this is what the science says,” I was like, “This is the data, this is how we generated it, this is how it’s often interpreted. How should we think about it in the context of community history and community-held knowledge?” That enhances the scientific outcomes.

Did it help that you’re Tsimshian and were familiar with community values, culture and traditions?

I think that the biggest influence that has had is that it’s shaped how I hold myself accountable to my community research partners. So when I’m doing the work and talking to people, I think: “If someone was approaching my family, how would I want them to be treated?” That has a big impact on the way I construct my research collaborations, and also on the way I have turned away from, or pushed away, some of the extractive processes where communities aren’t consulted or are treated as a resource, as something researchers use as they need.

You also mentioned that researchers may follow different approaches to these kinds of ethical issues. Can you talk about this apparent lack of consensus?

The thing about ethics is that they’re culturally constructed. Two different people might have different ideas of what is or is not ethical. And those ideas can also change over time. I think we see a little bit of that with research.

In the review article, we talk about there being some tension. Some folks really orient the research around stakeholders like local and descendant communities, and how it impacts them. You can also take the approach that research is done for the sake of knowledge, regardless of how or who it impacts.

So, depending on how you orient yourself around these perspectives, you might shift your research practices in a specific way. But we don’t have one set of rules or something that everyone is held accountable to. There are no formal consequences if you don’t abide by one of the ethical guidelines, some of which even conflict with one another.

I think there are benefits to there not being one concrete thing, because that means you can adapt to different situations. If you write one set of rules, that also creates limitations. But it also means that it’s difficult for folks to sometimes figure out what they should be doing.

What kind of mistakes have been made?

Particularly in the context of North America, the remains of Indigenous human ancestors have been taken from their communities and used as a resource for researchers. Sometimes communities knew about it and objected. Sometimes communities didn’t even know where their ancestors were, or what they were being used for.

These remains have been collected in museums, displayed in ways that communities didn’t approve of or felt were disrespectful. And in a museum context like that, non-Indigenous scientists didn’t necessarily have to go to a descendant community and ask for permission to do their research. This just continued a history of violence, harm and exploitation.

As ancient DNA came along, then those ancestors’ bones and remains became a source for genomic research. But we don’t want these harms, which came out of archaeological research more broadly, to continue to proliferate in genomic research. We want them to stop.

How can this community-centered approach you advocate for facilitate a more collaborative research?

Genomic data is just one form of information, right? If you think about what makes you as a person, your genes that come from your family and ancestry are one part of what makes you who you are. And I think the same is true for paleogenomic research. The genomes that we study using ancient DNA are one part of a really big story.

When you collaborate with communities and you include community-held knowledge or histories, that improves the narratives that we’re able to tell using genomic information. It can only improve things, because we have more depth, more perspective on the story that we’re trying to tell through genomes.

In my view, the people who should have the most voice in research are the people who potentially bear the most risk from research. Researchers can cause harm by taking samples from ancestors, excluding communities from giving permission, or excluding them from being involved in the research process.

In a deeply collaborative approach, communities are our partners. They’re not only giving consent for samples from ancestors to be taken, but also helping to shape the research questions. Maybe the methods. They are involved in interpreting the data. Or preparing results for publication. Of course, that all depends on how deeply a community does or does not want to be involved in the process.

For you, what does it mean to think ethically about ancient DNA research?

When I think about what may or may not be ethical, I try to think about the way that harm has happened in the past.

So, when I think about how I want to do my research now, I hold myself accountable to communities when I do my work. I don’t think about research as being just a value-neutral thing. I try to think about how my research will impact other people: who will benefit from it, and how I can prevent harms in doing it.

It’s this kind of restorative-justice approach where you say, “Folks were excluded in the past and we want to include them as much as possible now to heal that harm.” To me, that can be achieved by finding new ways to break down the barriers between who is being researched and who is doing the research.


This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

How the NRA evolved from backing a 1934 ban on machine guns to blocking nearly all firearm restrictions today

NRA conventiongoers, like these at the gun group’s 2018 big meeting, browse firearms exhibits. Loren Elliott/AFP via Getty Images
Robert Spitzer, State University of New York Cortland

The mass shootings at a Buffalo, New York, supermarket and an elementary school in Uvalde, Texas, just 10 days apart, are stirring the now-familiar national debate over guns seen after the tragic 2012 and 2018 school shootings in Newtown, Connecticut, and Parkland, Florida.

Inevitably, if also understandably, many Americans are blaming the National Rifle Association for thwarting stronger gun laws that might have prevented these two recent tragedies and many others. And despite the proximity in time and location to the Texas shooting, the NRA is proceeding with its plans to hold its annual convention in Houston on May 27-29, 2022. The featured speakers include former President Donald Trump and Sen. Ted Cruz, a Texas Republican.

After spending decades researching and writing about how and why the NRA came to hold such sway over national gun policies, I’ve seen this narrative take unexpected turns in the last few years that raise new questions about the organization’s reputation for invincibility.

People delivered boxes of petitions calling for stronger gun control rules to former Florida Gov. Rick Scott after the 2018 mass shooting in Parkland. AP Photo/Gerald Herbert

Three phases

The NRA’s more than 150-year history spans three distinct eras.

At first the group was mainly concerned with marksmanship. It later played a relatively constructive role regarding safety-minded gun ownership restrictions before turning into a rigid politicized force.

The NRA was formed in 1871 by two Civil War veterans from Northern states who had witnessed the typical soldier’s inability to handle guns.

The organization initially leaned on government support, which included subsidies for shooting matches and surplus weaponry. These freebies, which lasted until the 1970s, gave gun enthusiasts a powerful incentive to join the NRA.

The NRA played a role in fledgling political efforts to formulate state and national gun policy in the 1920s and 1930s after Prohibition-era liquor trafficking stoked gang warfare. It backed measures like requiring a permit to carry a gun and even a gun purchase waiting period.

And the NRA helped shape the National Firearms Act of 1934, with two of its leaders testifying before Congress at length regarding this landmark legislation. They supported, if grudgingly, its main provisions, such as restricting gangster weapons, which included a national registry for machine guns and sawed-off shotguns and taxing them heavily. But they opposed handgun registration, which was stripped out of the nation’s first significant national gun law.

Decades later, in the legislative battle held in the aftermath of President John F. Kennedy’s assassination and amid rising concerns about crime, the NRA opposed another national registry provision that would have applied to all firearms. Congress ultimately stripped it from the Gun Control Act of 1968.

Throughout this period, however, the NRA remained primarily focused on marksmanship, hunting and other recreational activities, although it did continue to voice opposition to new gun laws, especially to its membership.

NPR’s Ron Elving recounts the NRA’s history.

A sharp right turn

By the mid-1970s, a dissident group within the NRA believed that the organization was losing the national debate over guns by being too defensive and not political enough. The dispute erupted at the NRA’s 1977 annual convention, where the dissidents deposed the old guard.

From this point forward, the NRA became ever more political and strident in its defense of so-called “gun rights,” which it increasingly defined as nearly absolute under the Second Amendment.

One sign of how much the NRA had changed: The Second Amendment right to bear arms never came up in the 166 pages of congressional testimony regarding the 1934 gun law. Today, the organization treats those words as its mantra, constantly citing them.

And until the mid-1970s, the NRA supported waiting periods for handgun purchases. Since then, however, it has opposed them. It fought vehemently against the ultimately successful enactment of a five-business-day waiting period and background checks for handgun purchases in 1993.

The NRA’s influence hit a zenith during George W. Bush’s gun-friendly presidency, which embraced the group’s positions. Among other things, his administration let the ban on assault weapons expire, and it supported the NRA’s top legislative priority: enactment in 2005 of special liability protections for the gun industry, the Protection of Lawful Commerce in Arms Act.

People attending the National Rifle Association Leadership Forum in 2017 paid rapt attention to President Donald Trump’s address. AP Photo/Evan Vucci

Having a White House ally isn’t everything

Despite past successes, the NRA has suffered from a series of mostly self-inflicted blows that have precipitated an existential crisis for the organization.

Most significantly, a lawsuit filed by the New York attorney general in 2020 revealed extensive allegations of rampant cronyism, corruption, sweetheart deals and fraud. Partly as a result of these revelations, NRA membership has apparently declined to roughly 4.5 million, down from a high of about 5 million.

Despite this trend, however, the grassroots gun community is no less committed to its agenda of opposition to new gun laws. Indeed, the Pew Research Center’s findings in 2017 suggested that about 14 million people identify with the group. By any measure, that’s a small minority out of nearly 260 million U.S. voters.

But support for gun rights has become a litmus test for Republican conservatism and is baked into a major political party’s agenda. This laserlike focus on gun issues continues to enhance the NRA’s influence even when the organization faces turmoil. This means that the protection and advancement of gun rights are propelled by the broader conservative movement, so that the NRA no longer needs to carry the ball by itself.

Like Bush, Trump maintained a cozy relationship with the NRA. It was among his 2016 presidential bid’s most enthusiastic backers, contributing US$31 million to his presidential campaign.

When Trump directed the Justice Department to draft a rule banning bump stocks, and indicated his belated support for improving background checks for gun purchases after the Parkland shooting, he was sticking with NRA-approved positions. He also supported arming teachers, another NRA proposal.

Only one sliver of light emerged between the Trump administration and the NRA: his apparent willingness to consider raising the minimum age to buy assault weapons from 18 to 21 – which has not happened. In 2022, a year after Trump left office, 18-year-olds, including the gunmen allegedly responsible for the mass shootings in Uvalde and Buffalo, were able to legally purchase firearms.

In politics, victory usually belongs to whoever shows up. And by showing up, the NRA has managed to strangle every federal effort to restrict guns since the Newtown shooting.

Nevertheless, the NRA does not always win. At least 25 states had enacted their own new gun regulations within five years of that tragedy.

Supreme Court ruling’s repercussions

These latest mass shootings may stir gun safety supporters to mobilize public outrage and turn out voters favoring stricter firearm regulations during the 2022 midterm elections.

But there is a wild card: The Supreme Court will soon rule on New York State Rifle & Pistol Association v. Bruen, the most significant case regarding gun rights it has considered in years. It’s likely that the court will strike down a long-standing New York pistol permit law, broadening the right to carry guns in public across the United States.

Such a decision could galvanize gun safety supporters while also emboldening gun rights activists – making the debate about guns in America even more tumultuous.

This is an updated version of an article originally published on February 23, 2018.

Robert Spitzer, Distinguished Service Professor Emeritus of the Political Science Department, State University of New York Cortland

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Scientific highs and lows of cannabinoids

Hundreds of these cannabis-related chemicals now exist, both natural and synthetic, inspiring researchers in search of medical breakthroughs — and fueling a dangerous trend in recreational use

Editor’s note: Raphael Mechoulam passed away on March 9, 2023, at the age of 92.

The 1960s was a big decade for cannabis: Images of flower power, the summer of love and Woodstock wouldn’t be complete without a joint hanging from someone’s mouth. Yet in the early ’60s, scientists knew surprisingly little about the plant. When Raphael Mechoulam, then a young chemist in his 30s at Israel’s Weizmann Institute of Science, went looking for interesting natural products to investigate, he saw an enticing gap in knowledge about the hippie weed: The chemical structure of its active ingredients hadn’t been worked out.

Mechoulam set to work.

The first hurdle was simply getting hold of some cannabis, given that it was illegal. “I was lucky,” Mechoulam recounts in a personal chronicle of his life’s work, published this month in the Annual Review of Pharmacology and Toxicology. “The administrative head of my Institute knew a police officer. ... I just went to Police headquarters, had a cup of coffee with the policeman in charge of the storage of illicit drugs, and got 5 kg of confiscated hashish, presumably smuggled from Lebanon.”

By 1964, Mechoulam and his colleagues had determined, for the first time, the full structure of both delta-9-tetrahydrocannabinol, better known to the world as THC (responsible for marijuana’s psychoactive “high”) and cannabidiol, or CBD.

That chemistry coup opened the door for cannabis research. Over the following decades, researchers including Mechoulam would identify more than 140 active compounds, called cannabinoids, in the cannabis plant, and learn how to make many of them in the lab. Mechoulam helped to figure out that the human body produces its own natural versions of similar chemicals, called endocannabinoids, that can shape our mood and even our personality. And scientists have now made hundreds of novel synthetic cannabinoids, some more potent than anything found in nature.

Today, researchers are mining the huge number of known cannabinoids — old and new, found in plants or people, natural and synthetic — for possible pharmaceutical uses. But, at the same time, synthetic cannabinoids have become a hot trend in recreational drugs, with potentially devastating impacts.

For most of the synthetic cannabinoids made so far, the adverse effects generally outweigh their medical uses, says biologist João Pedro Silva of the University of Porto in Portugal, who studies the toxicology of substance abuse and coauthored a 2023 assessment of the pros and cons of these drugs in the Annual Review of Pharmacology and Toxicology. But, he adds, that doesn’t mean there aren’t better things to come.

Cannabis’s long medical history

Cannabis has been used for centuries for all manner of reasons, from squashing anxiety or pain to spurring appetite and salving seizures. In 2018, a cannabis-derived medicine — Epidiolex, consisting of purified CBD — was approved for controlling seizures in some patients. Some people with serious conditions, including schizophrenia, obsessive-compulsive disorder, Parkinson’s and cancer, self-medicate with cannabis in the belief that it will help them, and Mechoulam sees the promise. “There are a lot of papers on [these] diseases and the effects of cannabis (or individual cannabinoids) on them. Most are positive,” he tells Knowable Magazine.

That’s not to say cannabis use comes with zero risks. Silva points to research suggesting that daily cannabis users have a higher risk of developing psychotic disorders, depending on the potency of the cannabis; one paper showed a 3.2 to 5 times higher risk. Longtime chronic users can develop cannabinoid hyperemesis syndrome, characterized by frequent vomiting. Some public health experts worry about impaired driving, and some recreational forms of cannabis contain contaminants, like heavy metals, with nasty effects.

Finding medical applications for cannabinoids means understanding their pharmacology and balancing their pros and cons.

Mechoulam played a role in the early days of research into cannabis’s possible clinical uses. Based on anecdotal reports, stretching back to ancient times, of cannabis helping with seizures, he and his colleagues looked at the effects of THC and CBD on epilepsy. They started in mice and, since CBD showed no toxicity or side effects, moved on to people. In 1980, then at the Hebrew University of Jerusalem, Mechoulam co-published results from a tiny, 4.5-month trial of patients with epilepsy who weren’t being helped by existing drugs. The results seemed promising: Of the eight people taking CBD, four had almost no attacks throughout the study and three saw partial improvement. Only one patient wasn’t helped at all.

“We assumed that these results would be expanded by pharmaceutical companies, but nothing happened for over 30 years,” writes Mechoulam in his autobiographical article. It wasn’t until 2018 that the US Food and Drug Administration approved Epidiolex for treating epileptic seizures in people with certain rare and severe medical conditions. “Thousands of patients could have been helped over the four decades since our original publication,” writes Mechoulam.

Drug approval is a necessarily long process, but for cannabis there have been the additional hurdles of legal roadblocks, as well as the difficulty in obtaining patent protections for natural compounds. The latter makes it hard for a pharmaceutical company to financially justify expensive human trials and the lengthy FDA approval process.

In the United Nations’ 1961 Single Convention on Narcotic Drugs, cannabis was slotted into the most restrictive categories: Schedule I (highly addictive and liable to abuse) and its subgroup, Schedule IV (with limited, if any, medicinal uses). The UN removed cannabis from Schedule IV only in December 2020 and, although cannabis has been legalized or decriminalized in several countries and most US states, it still remains (controversially) on both the US’ and the UN’s Schedule I — the same category as heroin. The US cannabis research bill, passed into law in December 2022, is expected to ease some of the difficulties of working with cannabis and cannabinoids in the lab.

To date, the FDA has licensed only a handful of medicinal drugs based on cannabinoids, and so far they’re based only on THC and CBD. Alongside Epidiolex, the FDA has approved synthetic THC and a THC-like compound to fight nausea in patients undergoing chemotherapy and weight loss in patients with cancer or AIDS. But there are hints of many other possible uses. The National Institutes of Health registry of clinical trials lists hundreds of efforts underway around the world to study the effect of cannabinoids on autism, sleep, Huntington’s disease, pain management and more.

In recent years, says Mechoulam, interest has expanded beyond THC and CBD to other cannabis compounds such as cannabigerol (CBG), which Mechoulam and his colleague Yehiel Gaoni discovered back in 1964. His team has made derivatives of CBG that have anti-inflammatory and pain relief properties in mice (for example, reducing the pain felt in a swollen paw) and can prevent obesity in mice fed high-fat diets. A small clinical trial of the impacts of CBG on attention-deficit hyperactivity disorder is being undertaken this year. Mechoulam says that the methyl ester form of another chemical, cannabidiolic acid, also seems “very promising” — in rats, it can suppress nausea and anxiety and act as an antidepressant in an animal model of the mood disorder.

But if the laundry list of possible benefits of all the many cannabinoids is huge, the hard work has not yet been done to prove their utility. “It’s been very difficult to try and characterize the effects of all the different ones,” says Sam Craft, a psychology PhD student who studies cannabinoids at the University of Bath in the UK. “The science hasn’t really caught up with all of this yet.”

A natural version in our bodies

Part of the reason that cannabinoids have such far-reaching effects is because, as Mechoulam helped to discover, they’re part of natural human physiology.

In 1988, researchers reported the discovery of a cannabinoid receptor in rat brains, CB1 (researchers would later find another, CB2, and map them both throughout the human body). Mechoulam reasoned there wouldn’t be such a receptor unless the body was pumping out its own chemicals similar to plant cannabinoids, so he went hunting for them. He would drive to Tel Aviv to buy pig brains being sold for food, he remembers, and bring them back to the lab. He found two molecules with cannabinoid-like activity: anandamide (named after the Sanskrit word ananda for bliss) and 2-AG.

These endocannabinoids, as they’re termed, can alter our mood and affect our health without us ever going near a joint. Some speculate that endocannabinoids may be responsible, in part, for personality quirks, personality disorders or differences in temperament.

Animal and cell studies hint that modulating the endocannabinoid system could have a huge range of possible applications, in everything from obesity and diabetes to neurodegeneration, inflammatory diseases, gastrointestinal and skin issues, pain and cancer. Studies have reported that endocannabinoids or synthetic creations similar to the natural compounds can help mice recover from brain trauma, unblock arteries in rats, fight antibiotic-resistant bacteria in petri dishes and alleviate opiate addiction in rats. But the endocannabinoid system is complicated and not yet well understood; no one has yet administered endocannabinoids to people, leaving what Mechoulam sees as a gaping hole of knowledge, and a huge opportunity. “I believe that we are missing a lot,” he says.

“This is indeed an underexplored field of research,” agrees Silva, and it may one day lead to useful pharmaceuticals. For now, though, most clinical trials are focused on understanding the workings of endocannabinoids and their receptors in our bodies (including how everything from probiotics to yoga affects levels of the chemicals).

‘Toxic effects’ of synthetics

In the wake of the discovery of CB1 and CB2, many researchers focused on designing new synthetic molecules that would bind to these receptors even more strongly than plant cannabinoids do. Pharmaceutical companies have pursued such synthetic cannabinoids for decades, but so far, says Craft, without much success — and with some missteps. A drug called Rimonabant, which bound tightly to the CB1 receptor but acted in opposition to CB1’s usual effect, was approved in Europe and elsewhere (but not in the US) in the early 2000s to diminish appetite and thereby fight obesity. It was withdrawn worldwide in 2008 due to serious psychiatric side effects, including depression and suicidal thoughts.

Some of the synthetics invented originally by academics and drug companies have wound up in recreational drugs like Spice and K2. Such drugs have boomed and new chemical formulations keep popping up: Since 2008, 224 different ones have been spotted in Europe. These compounds, chemically tweaked to maximize psychoactive effects, can cause everything from headaches and paranoia to heart palpitations, liver failure and death. “They have very toxic effects,” says Craft.

For now, says Silva, there is scant evidence that existing synthetic cannabinoids are medicinally useful: As drug candidates have worked their way through the pipeline, adverse effects have tended to crop up. Because of that, he says, most pharmaceutical efforts to develop synthetic cannabinoids have been discontinued.

But that doesn’t mean all research has stopped; a synthetic cannabinoid called JWH-133, for example, is being investigated in rodents for its potential to reduce the size of breast cancer tumors. It’s possible to make tens of thousands of different chemical modifications to cannabinoids, and so, says Silva, “it is likely that some of these combinations may have therapeutic potential.” The endocannabinoid system is so important in the human body that there’s plenty of room to explore all kinds of medicinal angles. Mechoulam, for example, serves on the advisory board of the Israel-based company EPM, which is specifically aimed at developing medicines based on synthetic versions of cannabinoid compounds called cannabinoid acids.

With all this work underway on the chemistry of these compounds and their workings within the human body, Mechoulam, now 92, sees a coming explosion in understanding the physiology of the endocannabinoid system. And with that, he says, “I assume that we shall have a lot of new drugs.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.