Saturday, April 8, 2023

A Family Favorite in Just 5 Minutes

(Culinary.net) Running short on time from a busy schedule shouldn’t mean skipping out on your favorite desserts. In fact, it should be all the more reason to enjoy a sweet treat as a reward for all that hard work.

When you’re due for a bite into dark chocolate goodness, all it takes is a few minutes out of your day to make 5-Minute Dark Chocolate Cereal Bars. This quick and simple dessert makes it easy to celebrate the day’s accomplishments without added stress.

As a fun way for little ones to help in the kitchen, you can cook the butter, marshmallows, peanut butter and cereal together, then let the kiddos drizzle on the key ingredient: melted chocolate. All that’s left to do is cut and serve, or pack a few off to school and work for an afternoon treat.

Find more seasonal dessert recipes at Culinary.net.

If you made this recipe at home, use #MyCulinaryConnection on your favorite social network to share your work.

Watch the video to see how to make this recipe!


5-Minute Dark Chocolate Cereal Bars

Recipe adapted from ScrummyLane.com
  • 4 tablespoons butter
  • 10 ounces marshmallows
  • 1/2 cup peanut butter
  • 6 cups cereal
  • 4 ounces milk chocolate, melted
  • 4 ounces dark chocolate, melted
  1. Heat saucepan over low heat. Add butter, marshmallows and peanut butter; stir to combine. Add cereal; mix until coated.
  2. Line 9-by-13-inch pan with parchment paper. Add cereal mixture to pan.
  3. In bowl, mix milk chocolate and dark chocolate. Drizzle chocolate over cereal mixture; spread evenly then allow to cool.
  4. Cut into bars and serve.
SOURCE:
Culinary.net

About Us

Our site is always changing and growing. We build and update live, so you may see changes happening as you browse; just refresh and keep going. We are normal, average locals who want to bring information forward, connecting you with businesses and information here and around the world. We are more curious and seeking than opinionated, and that's a little of who we are. You may not agree with everything you read here, but we believe it is important to see what others around the world are writing, because it influences people. So if you're ever talking to someone who seems confident in their beliefs, maybe they are convinced by what they have read or been told. We urge you to do your own research before forming an opinion or reacting. We encourage everyone to THINK FOR THEMSELVES! BE FREE! BE OPEN! LEARN!

We can be challenging and competitive: we will agree with many things and will not agree with everyone, every article or every conversation. Remaining open to another point of view helps us keep growing. To your best life and ours, we love you.

    We are all here right now together in this World and we believe WE ARE ONE.  We all think differently and have different views, likes, and dislikes. We can all agree to disagree and LIVE TOGETHER!

Please enjoy the articles we present. Some may add to your knowledge and some may anger you, but in the end we hope they help make an even better world for all of us.

     Maybe you are close to home or far beyond our home.  The world has so much to offer and exploring it is one key to understanding.  

WE LOVE ALABAMA AND SHELBY COUNTY AND ALL SURROUNDING COMMUNITIES. WE BELIEVE WE HAVE AS MUCH TO OFFER AS ANYWHERE ELSE IN THE WORLD.

Email: team@shelbycountygazette.com

Docendo discimus (by teaching, we learn)
               


How heat pumps of the 1800s are becoming the technology of the future

Innovative thinking has done away with problems that long dogged the electric devices — and both scientists and environmentalists are excited about the possibilities

It was an engineering problem that had bugged Zhibin Yu for years — but now he had the perfect chance to fix it. Stuck at home during the first UK lockdown of the Covid-19 pandemic, the thermal engineer suddenly had all the time he needed to refine the efficiency of heat pumps: electrical devices that, as their name implies, move heat from the outdoors into people’s homes.

The pumps are much more efficient than gas heaters, but standard models that absorb heat from the air are prone to icing up, which greatly reduces their effectiveness.

Yu, who works at the University of Glasgow, UK, pondered the problem for weeks. He read paper after paper. And then he had an idea. Most heat pumps waste some of the heat that they generate — and if he could capture that waste heat and divert it, he realized, that could solve the defrosting issue and boost the pumps’ overall performance. “I suddenly found a solution to recover the heat,” he recalls. “That was really an amazing moment.”

Yu’s idea is one of several recent innovations that aim to make 200-year-old heat pump technology even more efficient than it already is, potentially opening the door for much greater adoption of heat pumps worldwide. To date, only about 10 percent of space heating requirements around the world are met by heat pumps, according to the International Energy Agency (IEA). But due to the current energy crisis and growing pressure to reduce fossil fuel consumption in order to combat climate change, these devices are arguably more crucial than ever.

Since his 2020 lockdown brainstorming, Yu and his colleagues have built a working prototype of a heat pump that stores leftover heat in a small water tank. In a paper published in the summer of 2022, they describe how their design helps the heat pump to use less energy. Plus, by separately rerouting some of this residual warmth to part of the heat pump exposed to cold air, the device can defrost itself when required, without having to pause heat supply to the house.

The idea relies on the very principle by which heat pumps operate: If you can seize heat, you can use it. What makes heat pumps special is the fact that instead of just generating heat, they also capture heat from the environment and move it into your house — eventually transferring that heat to radiators or forced-air heating systems, for instance. This is possible thanks to the refrigerant that flows around inside a heat pump. When the refrigerant encounters heat — even a tiny amount in the air on a cold day — it absorbs that modicum of warmth.

A compressor then forces the refrigerant to a higher pressure, which raises its temperature to the point where it can heat your house. It works because an increase of pressure pushes the refrigerant molecules closer together, increasing their motion. The refrigerant later expands again, cooling as it does so, and the cycle repeats. The entire cycle can run in reverse, too, allowing heat pumps to provide cooling when it’s hot in summer.

The magic of a heat pump is that it can move multiple kilowatt-hours of heat for each kWh of electricity it uses. Heat pump efficiencies are generally measured in terms of their coefficient of performance (COP). A COP of 3, for example, means 1 kWh of juice yields 3 kWh of warmth — that’s effectively 300 percent efficiency. The COP you get from your device can vary depending on the weather and other factors.
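
To make that arithmetic concrete, here is a minimal Python sketch; the function name and the figures are illustrative rather than taken from the article.

```python
# Illustrative arithmetic only: how COP relates electricity used to heat delivered.

def heat_delivered_kwh(electricity_kwh: float, cop: float) -> float:
    """Heat output equals electrical input multiplied by the COP."""
    return electricity_kwh * cop

# A heat pump with a COP of 3 turns 1 kWh of electricity into 3 kWh of heat,
# the "300 percent efficiency" described above.
print(heat_delivered_kwh(1.0, 3.0))  # 3.0

# A plain resistive heater (COP of roughly 1) delivers only 1 kWh.
print(heat_delivered_kwh(1.0, 1.0))  # 1.0
```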

It’s a powerful concept, but also an old one. The British mathematician, physicist and engineer Lord Kelvin proposed using heat pump systems for space heating way back in 1852. The first heat pump was designed and built a few years later and used industrially to heat brine in order to extract salt from the fluid. In the 1950s, members of the British Parliament discussed heat pumps when coal stocks were running low. And in the years following the 1973-74 oil crisis, heat pumps were touted as an alternative to fossil fuels for heating. “Hope rests with the future heat pump,” one commentator wrote in the 1977 Annual Review of Energy.

Now the world faces yet another reckoning over energy supplies. When Russia, one of the world’s biggest sources of natural gas, invaded Ukraine in February 2022, the price of gas soared — which in turn shoved heat pumps into the spotlight because with few exceptions they run on electricity, not gas. The same month, environmentalist Bill McKibben wrote a widely shared blog post titled “Heat pumps for peace and freedom” in which, referring to the Russian president, he argued that the US could “peacefully punch Putin in the kidneys” by rolling out heat pumps on a massive scale while lowering Americans’ dependence on fossil fuels. Heat pumps can draw power from domestic solar panels, for instance, or a power grid supplied predominantly by renewables.

Running the devices on green electricity can help to fight climate change, too, notes Karen Palmer, an economist and senior fellow at Resources for the Future, an independent research organization in Washington, DC, who coauthored an analysis of policies to enhance energy efficiency in the 2018 Annual Review of Resource Economics. “Moving towards greater use of electricity for energy needs in buildings is going to have to happen, absent a technology breakthrough in something else,” she says.

The IEA estimates that, globally, heat pumps have the potential to reduce carbon dioxide emissions by at least 500 million metric tons in 2030, equivalent to the annual CO2 emissions produced by all the cars in Europe today.

Despite their long history and potential virtues, heat pumps have struggled to become commonplace in some countries. One reason is cost: The devices are substantially more expensive than gas heating units and, because natural gas has remained relatively cheap for decades, homeowners have had little incentive to switch.

There has also long been a perception that heat pumps won’t work as well in cold climates, especially in poorly insulated houses that require a lot of heat. In the UK, for example, where houses tend to be rather drafty, some homeowners have long considered gas boilers a safer bet because they can supply hotter water (around 140 to 160 degrees Fahrenheit) to radiators, which makes it easier to heat up a room. By contrast, heat pumps tend to be most efficient when heating water to around 100 degrees Fahrenheit.

The cold-climate problem is arguably less of an issue than some think, however, given that there are multiple modern air source devices on the market that work well even when outside temperatures drop as low as minus 10 degrees Fahrenheit. Norway, for example, is considered one of the world leaders in heat pump deployment. Palmer has a heat pump in her US home, along with a furnace as backup. “If it gets really cold, we can rely on the furnace,” she says.

Innovations in heat pump design are leading to units that are even more efficient, better suited to houses with low levels of insulation and — potentially — cheaper, too. For example, Yu says his and his colleagues’ novel air source heat pump design could improve the COP by between 3 percent and 10 percent, while costing less than existing heat pump designs with comparable functionality. They are now looking to commercialize the technology.

Yu’s work is innovative, says Rick Greenough, an energy systems engineer now retired from De Montfort University in the UK. “I must admit this is a method I hadn’t actually thought of,” he says.

And there are plenty more ideas afoot. Greenough, for instance, has experimented with storing heat in the ground during warmer months, where it can be exploited by a heat pump when the weather turns cool. His design uses a circulating fluid to transfer excess heat from solar hot-water panels into shallow boreholes in the soil. That raises the temperature of the soil by around 22 degrees Fahrenheit, to a maximum of roughly 66 degrees Fahrenheit, he says. Then, in the winter, a heat pump can draw out some of this stored heat to run more efficiently when the air gets colder. This technology is already on the market, offered by some installers in the UK, notes Greenough.

But most current heat pumps still only generate relatively low output temperatures, so owners of drafty homes may need to take on the added cost of insulation when installing a heat pump. Fortunately, a solution may be emerging: high-temperature heat pumps.

“We said, ‘Hey, why not make a heat pump that can actually one-on-one replace a gas boiler without having to really, really thoroughly insulate your house?’” says Wouter Wolfswinkel, program manager for business development at Swedish energy firm Vattenfall, which manufactures heat pumps. Vattenfall and its Dutch subsidiary Feenstra have teamed up to develop a high-temperature heat pump, expected to debut in 2023.

In their design, they use CO2 as a refrigerant. But because the heat-pump system’s hot, high-pressure operating conditions prevent the gas from condensing or otherwise cooling down very easily, they had to find a way of reducing the refrigerant’s temperature in order for it to be able to absorb enough heat from the air once again when it returns to the start of the heat pump loop. To this end, they added a “buffer” to the system: a water tank where a layer of cooler water rests beneath hotter water above. The heat pump uses the lower layer of cooler water from the tank to adjust the temperature of the refrigerant as required. But it can also send the hotter water at the top of the tank out to radiators, at temperatures up to 185 degrees Fahrenheit.

The device is slightly less efficient than a conventional, lower temperature heat pump, Wolfswinkel acknowledges, offering a COP of around 265 percent versus 300 percent, depending on conditions. But that’s still better than a gas boiler (no more than 95 percent efficient), and as long as electricity prices aren’t significantly higher than gas prices, the high temperature heat pump could still be cheaper to run. Moreover, the higher temperature means that homeowners needn’t upgrade their insulation or upsize radiators right away, Wolfswinkel notes. This could help people make the transition to electrified heating more quickly.
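
As a rough, hedged illustration of that running-cost trade-off, the sketch below compares a gas boiler with the high-temperature heat pump using the efficiencies quoted above; the heat demand and energy prices are hypothetical placeholders, so only the break-even ratio is meaningful.

```python
# Rough running-cost comparison. The efficiencies come from the article
# (heat pump COP ~2.65, gas boiler <= 95 percent); the heat demand and
# prices are made-up placeholders.

def annual_cost(heat_demand_kwh: float, efficiency: float, price_per_kwh: float) -> float:
    """Energy bought = heat demand / efficiency (use the COP for a heat pump)."""
    return heat_demand_kwh / efficiency * price_per_kwh

heat_demand = 12_000                   # kWh of heat per year for a hypothetical home
gas_price, elec_price = 0.10, 0.25     # currency units per kWh, hypothetical

boiler = annual_cost(heat_demand, 0.95, gas_price)
heat_pump = annual_cost(heat_demand, 2.65, elec_price)
print(f"gas boiler: {boiler:.0f}, heat pump: {heat_pump:.0f}")

# With these placeholder prices the heat pump comes out cheaper; it loses
# only once electricity costs more than roughly 2.8 times the gas price
# (the ratio of the COP to the boiler efficiency).
```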

A key test was whether Dutch homeowners would go for it. As part of a pilot trial, Vattenfall and Feenstra installed the heat pump in 20 households of different sizes in the town of Heemskerk, not far from Amsterdam. After a few years of testing, in June 2022 they gave homeowners the option of taking back their old gas boiler, which they had kept in their homes, or of using the high temperature heat pump on a permanent basis. “All of them switched to the heat pump,” says Wolfswinkel.

In some situations, home-by-home installations of heat pumps might be less efficient than building one large system to serve a whole neighborhood. For about a decade, Star Renewable Energy, based in Glasgow, has been building district systems that draw warmth from a nearby river or sea inlet, including a district heating system connected to a Norwegian fjord. A Scandinavian fjord might not be the first thing that comes to mind if you say the word “heat” — but the water deep in the fjord actually holds a fairly steady temperature of 46 degrees Fahrenheit, which heat pumps can exploit.

Via a very long pipe, the district heating system draws in this water and uses it to heat the refrigerant, in this case ammonia. The refrigerant is then compressed to a much higher pressure — about 50 atmospheres — which raises its temperature to 250 degrees Fahrenheit. The hot refrigerant then passes its heat to water in the district heating loop, raising the temperature of that water to 195 degrees Fahrenheit. The sprawling system provides 85 percent of the hot water needed to heat buildings in the city of Drammen.

“That type of thing is very exciting,” says Greenough.

Not every home will be suitable for a heat pump. And not every budget can accommodate one, either. Yu himself says that the cost of replacing the gas boiler in his own home remains prohibitive. But it’s something he dreams of doing in the future. With ever-improving efficiencies, and rising sales in multiple countries, heat pumps are only getting harder for their detractors to dismiss. “Eventually,” says Yu, “I think everyone will switch to heat pumps.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews. Sign up for the newsletter.

Knowable Magazine | Annual Reviews

Gallium: The liquid metal that could transform soft electronics

Bend it. Stretch it. Use it to conduct electricity. Researchers are exploring a range of applications that harness the element’s unusual properties.

Every time you sit down with your phone in your back pocket, you’re reminded of a fundamental truth: Human bodies are soft and flexible. Electronics aren’t.

But soon there may be devices that can stretch, bend and even repair themselves when they’re damaged. By harnessing the unusual properties of a liquid metal called gallium, materials scientists aim to create a new generation of flexible devices for virtual reality interfaces, medical monitors, motion-sensing devices and more.

The goal is to take the functionality of electronics and make them softer, says Michael Dickey, a chemical engineer at North Carolina State University. “I mean, the body and other natural systems have figured out how to do it. So surely, we can do it.”

Bendable electronics can also be made with conventional metals. But solid metal can fatigue and break, and the more that’s added to a soft material, the more inflexible the material becomes. Liquid metals don’t have that problem, Dickey says — they can be bent, stretched and twisted with little or no damage.

Flexibility turns out to be just one of gallium’s useful properties. Since it’s a metal, it conducts heat and electricity easily. Unlike the better-known liquid metal mercury, it has low toxicity and low vapor pressure, so it doesn’t evaporate easily.

Gallium flows about as easily as water. But in air it also quickly forms a stiff outer oxide layer, allowing it to be easily formed into semisolid shapes. The surface tension, which is 10 times that of water, can even be varied by submerging the liquid metal in salt water and applying a voltage.

“I’m biased, so take this for what it’s worth. But I think this is one of the most interesting materials on the periodic table because it’s got so many unique properties,” says Dickey, coauthor of an overview of gallium in the 2021 Annual Review of Materials Research.

Interest in gallium lagged in the past, partly because of the unfair association with toxic mercury, and partly because its tendency to form an oxide layer was seen as a negative. But with increased interest in flexible and, especially, wearable electronics, many researchers are paying fresh attention.

To make bendable circuits with gallium, scientists form it into thin wires embedded between rubber or plastic sheets. These wires can connect tiny electronic devices such as computer chips, capacitors and antennas. The process creates a device that could wrap around an arm and be used to track an athlete’s motion, speed or vital signs, for instance, says Carmel Majidi, a mechanical engineer at Carnegie Mellon University.

These liquid metal wires and circuits can stand up to significant bending or twisting. As a demonstration, Dickey made earbud wires that can stretch up to eight times their original length without breaking. Other circuits can heal themselves when torn — when the edges are positioned against each other, the liquid metal flows back together.

Gallium circuits can also be printed and applied directly to the skin, like a temporary tattoo. The “ink” works like a conventional electrode, the kind that is used to monitor heart or brain activity, says Majidi, who made such a circuit by printing the metal onto a flexible material. The tattoos are more flexible and durable than existing electrodes, making them promising for long-term use.

The shape-shifting quality of the liquid metal opens up other potential uses. When the metal is squeezed, stretched and twisted, its shape changes, and the change in geometry also changes its electrical resistance. So running a small current through a mesh of gallium wires allows researchers to measure how the material is being twisted, stretched and pressed on.

This principle could be applied to create motion-sensing gloves for virtual reality: If a mesh of gallium wires were embedded inside a thin, soft film on the inside of the glove, a computer could detect the changes in resistance as the wearer moves their hand.
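
A minimal sketch of the underlying relationship, assuming the textbook formula R = ρL/A and a channel that conserves its volume as it stretches; the resistivity value and trace dimensions below are hypothetical placeholders, not figures from the researchers.

```python
# Sketch of why stretching a liquid-metal trace changes its resistance.
# Assumes the channel keeps a constant volume when stretched, so its length
# grows while its cross-section shrinks. Numbers are illustrative only.

RESISTIVITY = 2.9e-7  # ohm*m, roughly the order of gallium-based liquid alloys

def trace_resistance(length_m: float, area_m2: float, rho: float = RESISTIVITY) -> float:
    return rho * length_m / area_m2  # R = rho * L / A

def stretched_resistance(r0: float, stretch_ratio: float) -> float:
    # Constant volume: L -> L*s and A -> A/s, so R -> r0 * s**2
    return r0 * stretch_ratio ** 2

r0 = trace_resistance(0.05, 1e-8)        # 5 cm trace with a ~0.01 mm^2 cross-section
print(r0)                                # ~1.45 ohms at rest
print(stretched_resistance(r0, 1.5))     # a 50% stretch more than doubles the resistance
```

Measuring that resistance change across many traces in the mesh is what lets a computer infer how the glove, and the hand inside it, is moving.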

“You can use it to track your own body’s motion, or the forces that you’re in contact with, and then impart that information into whatever the virtual world is that you’re experiencing,” Majidi says.

This property even raises the possibility of machines that use what Dickey calls “soft logic” to operate. Rather than requiring computation, machines using soft logic have simple reactions based directly on changes in electrical resistance across the grid. They can be designed so that pushing, pulling or bending different parts of the grid activates different responses. As a demonstration, Dickey created a device that can turn motors or lights on and off depending entirely on where the material is pressed.

“There’s no semiconductors here. There’s no transistors, there’s no brain, it’s just based on the way the material is touched,” Dickey says.

Low-level tactile-based logic like this could be used to build responsiveness into devices, akin to building reflexes into soft robots — such reactions don’t require a complex “brain” to process information, but can react directly in response to environmental stimuli, changing color or thermal properties or redirecting electricity.

And that outer oxide layer that forms when gallium is exposed to air is now being taken advantage of.  The oxide layer allows the metal to hold its shape, and opens up all sorts of possibilities for patterning and fabrication. Tiny drops of gallium can be stacked high on top of one another. A drop of gallium can be dragged along a surface, leaving a thin trail of oxide that can be used as a circuit.

In addition, in water the oxide layer can be made to form and disappear by applying a tiny amount of voltage, causing the beads to form and collapse instantly. By switching back and forth, Dickey can make the beads move a weight up and down. With refinement, this property could form the basis of artificial muscles for robots, he says.

Dickey admits that the technology is still in its early stages, and that the work so far merely suggests how it could be commercialized. But gallium has so many interesting properties it’s bound to be useful in soft electronics and robotics, he says.

He compares the field with the early days of computing. Although the earliest experimental computers made with vacuum tubes and mechanical switches are crude by today’s standards, they established principles that gave rise to modern electronics.

Majidi says he also expects to see liquid metal used commercially in the near future.

“In the next few years, you’re going to see more and more of this transition of liquid metal technologies out in industry, in the marketplace,” he says. “It’s not really so much a technical bottleneck at this point. It’s about finding commercial applications and uses of liquid metal that actually do make a difference.”

Read in Spanish (Lea en español)

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Migrant deaths in Mexico put spotlight on US policy that shifted immigration enforcement south

Mourners gather outside a detention center in Ciudad Juarez. David Peinado/picture alliance via Getty Images
Raquel Aldana, University of California, Davis

The fire-related deaths of at least 39 migrants in a detention facility in Ciudad Juarez, just across the U.S. border with Mexico, will likely be found to have had several contributing factors.

There was the immediate cause of the blaze, the mattresses apparently set alight by desperate men in the center to protest their imminent deportation. And then there is the apparent role of guards, seen on video walking away from the blaze.

But as an expert on immigration policy, I believe there is another part of the tragedy that can’t be overlooked: the decadeslong immigration enforcement policies of the U.S. and Mexican governments that have seen the number of people kept in such facilities skyrocket.

In the aftermath of the fire, Felipe González Morales, the United Nations special rapporteur for human rights of migrants, commented on Twitter that the “extensive use of immigration detention leads to tragedies like this one.”

And the United States is a big part of that “extensive use” on both sides of the border.

Lengthy stays and fear of deportation

Today Mexico maintains a very large detention system. It comprises several dozen short- and long-term detention centers, housing more than 300,000 people in 2021.

By comparison, the U.S. immigration detention system is the world’s largest. It maintains 131 facilities, comprising government-owned Service Processing Centers, privately run Contract Detention Facilities, and a variety of other detention facilities, including prisons.

Mexico has laws in place that are supposed to guarantee that migrants in detention only endure brief stays and are afforded due process, such as access to lawyers and interpreters. The law also states that they should have adequate conditions, including access to education and health care.

But in reality, what migrants often face at these detention centers is poor sanitary conditions, overcrowding, lengthy stays and despair over the near certainty of deportation.

The fire in Ciudad Juárez was started after the migrants – men from Guatemala, Honduras, Venezuela, El Salvador, Colombia and Ecuador – learned that they were to be sent back to those nations, according to Mexican President Andrés Manuel López Obrador. Deportation would have ended their hopes of asylum in the U.S.

US immigration enforcement shifts south

Why Mexico was doing the deporting, not the U.S., has a great deal to do with how the two nations have collaborated to control illegal migration headed to the U.S., especially since the turn of the century. In the wake of the 9/11 terrorist attacks of 2001, U.S. authorities increasingly viewed immigration as a security issue – a pivot that affected not only U.S. domestic legislation on immigration but its bilateral relations with Mexico.

In 2006, Mexican President Felipe Calderón joined efforts with President George W. Bush on the Merida Initiative to wage a war on drugs in Mexico, build a “21st Century U.S.-Mexican border” and shift immigration enforcement into Mexican territory.

These efforts, supported by massive U.S. funding, continue today.

With this money, Mexico established naval bases on its rivers, security cordons and drone surveillance. It also set up mobile highway checkpoints and biometric screening at migrant detention centers, all with the goal of detecting, detaining and deporting largely Central American migrants attempting to reach the United States.

The intent was to shift U.S. immigration enforcement south of the border. In that respect, the policy has been successful. Figures from the Guatemalan Institute of Migration show that of the 171,882 U.S.-bound migrants deported to the Northern Triangle region of Central America – El Salvador, Honduras and Guatemala – in 2022, Mexico sent back 92,718, compared to the U.S.‘s 78,433.

Prevention through deterrence is not working

Mexico’s detentions and deportations have done little to stop the flow of migrants entering the country en route to the U.S.

Researchers at the University of Texas at Austin estimate that from 2018 to 2021, an annual average of 377,000 migrants entered Mexico from the Northern Triangle region. The vast majority were headed to the U.S. to escape violence, drought, natural disasters, corruption and extreme poverty.

Migrants are passing through Mexico in the thousands from multiple other countries as well, fleeing conditions in countries such as Haiti and Venezuela, as well as African nations.

Meanwhile, recent years have seen a toughening of border enforcement policies targeting asylum seekers at the U.S.-Mexico border. This started under the Trump administration but has been continued by President Joe Biden despite the Democrat’s campaign promises of a more “humane” immigration system.

Since 2019, Washington has adopted a series of policies that have either forced migrants presenting themselves at the U.S. southern border to apply for asylum while remaining in Mexico or expelled them back to their countries of origin.

This has created a bottleneck of hundreds of thousands of migrants at Mexico’s border towns and swelled the numbers entering detention facilities in Mexico.

By 2021, the number of immigration detainees in such centers had reached 307,679, nearly double what it had been in 2019.

As a result, many centers, including the one implicated in the fire, have suffered from overcrowding and deteriorating conditions. A 2021 report by the immigration research center Global Detention Project extensively documented how the conditions and practices of Mexico’s immigration centers had led to widespread protest by detained migrants. Rioting and protests have become more common, with incidents taking place at facilities in Tijuana and the southern city of Tapachula in recent months.

No end in sight

The tragedy in Ciudad Juárez is unlikely to affect the steady flow of migrants entering Mexico in the hope of making it north of the border. For many, the options to take a different path to safety in the U.S. are simply not there.

Only a few can apply for refugee status in the U.S. from abroad, and the waits are long. Biden’s “humanitarian parole” program – which allows entry to the U.S. for up to 30,000 people a month – is only an option for those living in a handful of nations. It is also being challenged in court. And for the lucky few who manage to file for U.S. asylum, denial rates remain high – 63% in 2021 – while immigration court backlogs mean that fewer cases are being decided. Only 8,349 asylum seekers were actually granted asylum by U.S. immigration judges in 2021.

Meanwhile, the Biden administration’s incoming “transit ban” will mean that anyone seeking asylum at the U.S. southern border from May 11, 2023, without having first applied for asylum en route will be rapidly deported, many to Mexico.

The likelihood is the policy will only worsen the migrant processing bottleneck in Mexico, and add pressure on the country’s already volatile detention facility system.

Raquel Aldana, Associate Vice Chancellor for Academic Diversity and Professor of Law, University of California, Davis

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Saliva: The next frontier in cancer detection

Scientists are finding tumor signals in spit that could be key to developing diagnostic tests for various types of cancer

3.14.2023

In the late 1950s, dentist and US Navy Capt. Kirk C. Hoerman, then a young man in his 30s, attempted to answer a bold question: Might the saliva of prostate cancer patients have different characteristics from that of healthy people? Could it contain traces of a disease that’s so far away from the mouth?

Without wasting more of their own saliva on elaborate discussion, Hoerman and his colleagues from the department of dental research at the Naval Training Center in Great Lakes, Illinois, got down to work. They analyzed samples from more than 200 patients and healthy controls, and found that the saliva of patients with untreated prostate cancer showed a significant increase in the levels of enzymes called acid phosphatases.

Writing in 1959 in the journal Cancer, the researchers then made a prescient reflection: that it may be valuable to observe discrete biochemical changes in tissues distant from the site of tumor origin.

More than 60 years later, the idea that saliva analysis can be used to detect different types of cancer is gaining traction in the scientific community. In the specialized literature, papers containing the keywords “diagnosis,” “cancer” and “saliva” grew more than tenfold over the past two decades, from 26 in 2001 to 117 in 2011, 183 in 2016 and 319 in 2021, according to the PubMed database, a search engine for biomedical research articles.

The appeal of this approach is obvious. Although cancer can be diagnosed through tissue biopsy, that requires trained physicians wielding long needles, scalpels, endoscopes or other tools to pry into the body to take samples. Liquid biopsy, which looks for traces of tumor components in fluids such as blood, urine, cerebrospinal fluid, semen or saliva, is a less invasive alternative. Of these, the simplest sample to collect is undoubtedly saliva.

The approach has already paid off: In 2021, the US Food and Drug Administration gave an innovative device designation to a saliva-based oral and throat cancer prediagnostic tool developed by the US company Viome. (Such designations are granted to novel medical devices that have the potential to provide more effective treatment or diagnosis of life-threatening diseases.) Based on artificial intelligence and machine learning, the tool analyzes a saliva sample for the activity of genes (in particular, messenger RNA) belonging to the bacterial community housed in the mouth. For unknown reasons, this community is modified when a tumor develops on the lips, tongue, throat or surrounding areas.

“For decades, saliva was considered a stepchild of blood,” says chemist Chamindie Punyadeera, who spent a decade working on Viome’s saliva diagnostic test. Now at Griffith University in Australia, she is lead author of a 2021 study describing the test’s development in NPJ Genomic Medicine. But that view of saliva as an afterthought could begin to change in the coming years as techniques to analyze it advance and a better understanding develops of what information it can hold. “Because saliva can be collected noninvasively, an empowered patient could take multiple samples and become a steward of his or her own diagnostic tests,” Punyadeera predicts.

The treasure contained in saliva

Every day, the salivary glands of an average adult produce between 500 and 1,500 milliliters of saliva to aid digestion and preserve oral health. In addition to enzymes, hormones, antibodies, inflammatory mediators, food debris and microorganisms, saliva has been found to contain traces of DNA and RNA or proteins from tumors.

“The goal of saliva diagnostics is to develop rapid, noninvasive detection of oral and systemic diseases,” write dental scientists Taichiro Nonaka of Louisiana State University and David T.W. Wong of the University of California, Los Angeles, in an article on saliva diagnostics published in the 2022 Annual Review of Analytical Chemistry. The field is developing rapidly due to the progress of “omics sciences” that analyze large collections of molecules involved in the functioning of an organism — such as genomics (genomes), proteomics (proteins) or metabolomics (metabolites) — as well as methods for analyzing large quantities of data. For example, the proteome of saliva — an exhaustive catalog of the proteins present in this fluid — is already available, and it is known that between 20 percent and 30 percent of the saliva proteome overlaps with that of blood.

But “the study of diagnostics through saliva is a relatively new field,” says Nonaka. It wasn’t until the last decade, he says, that it became known that salivary glands — parotid, submandibular, and sublingual, as well as other minor glands, in close proximity to blood vessels — transfer molecular information.

Today, in saliva — and also in blood — scientists are beginning to look for and find circulating tumor DNA (ctDNA), which is DNA that is shed from cancer cells when a tumor is present in the body. Multiple studies have identified biomarkers — such as proteins that are produced in higher quantities in cancer cells or genetic changes that occur in tumor cells — that could be used to detect tumors of the head and neck, breast, esophagus, lung, pancreas and ovary, as well as to monitor the patient’s response to therapies.

For example, in 2015 Chinese researchers published that the identification of two fragments of an RNA strand (microRNA) in saliva allowed the detection of malignant pancreatic cancer in 7 out of 10 patients with the disease. A more recent review of 14 studies involving more than 8,000 participants estimated that breast cancer patients were 2.58 times more likely to have certain saliva-detectable biomarkers — although 39 percent of the negative test results were in patients who actually had breast cancer. The research in the field is promising, but will require further prospective studies to determine its clinical applicability, Nonaka says.

“A great advantage of liquid biopsies is that they can sweep for up to 50 types of cancers in early stages at once, when they can be surgically treated or are candidates for short, targeted treatments,” says biologist Marina Simián, a researcher at Argentina’s National Scientific and Technical Research Council at the Nanosystems Institute of the National University of San Martín, in Buenos Aires. Simián is also cofounder of the company Oncoliq, which aims for the early detection of breast, prostate and other tumors from a blood sample.

“With today’s tools, very few organs are screened for cancer,” says Simián. Common screens include ones for prostate, breast, cervix, colon after the age of 50, and the lungs for those who have smoked heavily. Worldwide, she says, only half of the people eligible for these screens actually undergo them, and in many countries not even 10 percent do. The hope is to add many more tests that can be done on a single blood or saliva sample.

It is possible that in the future, testing of both blood and saliva will be the norm. Although there is still a long way to go, Nonaka believes that, except for oral cancers, saliva testing should most likely be supplemented with liquid biopsies in blood or urine, plus other parameters to increase sensitivity and practical utility.

In pursuit of exosomes

One particularly promising type of component to look for in saliva is the exosome. Exosomes are tiny lipid-wrapped vesicles that are present in almost all types of body fluids. They are transporters or messengers that travel from one cell to another — even to those in very distant organs. They carry a cargo of genetic material and proteins, which is taken up by a recipient cell in an organ and plays important roles in cell-to-cell signaling. But exosomes also have an important role in cancer. “They are key players,” says Punyadeera. Released by cancer cells, they pass into the blood and from there, can reach the salivary glands. The exosomes are thus dumped into the saliva, from which they can be collected.

Exosomes from tumor cells have a specific composition and are suspected of contributing to the spread of cancer to other organs or tissues. But from a diagnostic perspective, one of their main advantages is that they package and protect the cargo — in other words, they do not mix with the other components of saliva. In this way, they provide “more stable and accurate clinically relevant information for disease detection,” Nonaka explains.

For example, for squamous cell esophageal cancer, scientists have found two signatures or signals in salivary exosomes that allow detection of this disease with a sensitivity and specificity of more than 90 percent, in addition to providing guidance on prognosis and treatment, as reported in January 2022 in Molecular Cancer.

Factors such as the concentration or appearance of exosomes under the microscope can also be revealing. Patients with oral cancer, for example, have exosomes with different shapes and sizes than those found in healthy individuals.

However, the techniques available so far to isolate and study the exosome content of saliva are expensive and laborious. In response to this challenge, a new method known as electric field-induced release and measurement, or EFIRM, has emerged; it integrates electrochemical sensors and magnetic fields to elegantly capture minute amounts of circulating tumor DNA and other molecules — biomarkers — that indicate the presence of cancer. This technique has already shown encouraging results in the early detection of non-small cell lung cancer and could also be used to assess response to treatment.

The US company Liquid Diagnostic LLC, in which Wong has a stake, already offers this technology, having christened it Amperial and promising “the highest specificity and sensitivity for early stage cancers” and at “much lower cost.” Those most enthusiastic about the technology propose a world where a routine visit to the dentist saves lives and it is not necessary to draw blood to check if someone is ill. But experts agree that, for that dream to become a reality on a large scale, more studies are still needed.

“To achieve the translation of salivary biomarkers to the clinic, it is necessary, on the one hand, to develop standardized protocols and, on the other, to carry out large multicenter studies in which the influence of different confounding variables such as age, sex or lifestyle is analyzed,” says dental scientist Óscar Rapado González, of the Health Research Institute of Santiago de Compostela, in Spain, where he is investigating the use of saliva samples for the detection of head and neck cancers, as well as colorectal tumors.

The identification in saliva or other fluids of molecules directly or indirectly related to tumors has potential apart from early detection, says Rapado González. It might make it possible to assess individual risk of developing cancer, predict how a tumor will evolve or monitor the therapeutic response in a noninvasive way, allowing the development of personalized medicine.

“Undoubtedly,” Rapado González says, “more research in this field will drive progress toward the applicability of saliva in precision oncology in the coming years.”

Article translated by Debbie Ponchner

Read in Spanish (Lea en español)

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Deliver Maple Flavor in the Morning

(Culinary.net) It’s hard to beat a fresh, oven-baked breakfast to start the day, especially one loaded with sausage and eggs complemented by the sweetness of diced apples and maple syrup. This Maple Breakfast Braid delivers a tempting flavor combination perfect for a weekend morning with loved ones.

Find more breakfast recipes at Culinary.net.

Watch the video to see how to make this recipe!

Maple Breakfast Braid

  • 1 package (16 ounces) breakfast sausage
  • 1/4 cup maple syrup
  • 2 eggs, beaten
  • 1/2 cup green onions, sliced
  • 2 Granny Smith apples, peeled and diced
  • 1 1/2 cups dry herb stuffing mix
  • 1 package (17 1/4 ounces) frozen puff pastry, thawed
  • 2 egg whites
  • 1 teaspoon water
  1. Heat oven to 400° F.
  2. In large bowl, combine sausage, syrup, beaten eggs, green onions, diced apples and stuffing mix.
  3. Dust surface with flour; roll out pastry sheet to 12-by-18-inch rectangle. Transfer pastry to large baking sheet with parchment paper. Spoon half of sausage mixture down center of pastry.
  4. Make 3-inch cuts down sides of pastry. Fold one strip at a time, alternating sides. Fold both ends to seal in filling. In bowl, beat egg whites and water; brush over pastry.
  5. Repeat steps for second pastry sheet.
  6. Bake 25-30 minutes, or until brown, rotating pans after baking 15 minutes.
SOURCE:
Culinary.net

Not enough fish in the sea

The scientist who found a way to tally up global catches is an ocean advocate and a vocal critic of industrial fisheries. Now we have a treaty for the high seas — but does it go far enough?

3.22.2023

Daniel Pauly is one of the world’s most cited fisheries researchers — and someone who’s used to making waves. He has called for a ban of subsidies that promote overfishing, such as ones that make shipping fuel cheaper or keep market prices artificially high. And he has compared the global fishery industry to an unsustainable Ponzi scheme: Just as fraudsters subsist only by finding new investors to scam, he says, fisheries survive only by finding previously unexploited stocks to fish.

This February, Pauly and colleague Rashid Sumaila won the prestigious Tyler Prize for Environmental Achievement from the University of Southern California for their efforts championing ocean sustainability. The pair, for example, penned a petition to the United Nations this February in advance of an international meeting to hammer out a new high seas treaty, calling for those waters to be declared a massive UN Protected Area that bans commercial fishing. That treaty was finally forged on March 4 to broad acclaim; it includes a way to designate marine protected areas in the high seas, but it does not go nearly as far as the duo had asked in terms of marine wildlife protections.

Pauly is no stranger to conflict and hard times. In the wake of World War II, he recounts, his mother sent him from France to live with a Swiss couple for a few months, but they never sent him back. At the age of 16, he ran away to Germany, where he finished his studies and became a fisheries biologist. He worked in Ghana, Tanzania and then the Philippines for 15 years, and wound up in Canada where, in 1999, he founded the Sea Around Us initiative at the University of British Columbia in Vancouver. That project, described in the 2023 issue of the Annual Review of Marine Science, aims to quantify humanity’s impact on fisheries and seek the best ways forward.

Knowable Magazine spoke with Pauly about his work. This conversation has been edited for length and clarity.

How did the Sea Around Us project start?

In 1997, I was invited by the Pew Charitable Trusts (at the time, it was a foundation) to participate in a meeting along the lines of: What would you do if you had lots of money to answer the questions “How is the ocean doing?” and “How can we improve what we do to the ocean?” There were five or six heavyweights in oceanography, and me. They all said: We need more data (most scientists always say we need more data) and then in 20 years you can say something. I was the last to speak, and I said: This is nonsense. The fisheries are sampling the ocean for us. All we need to do is to document fisheries properly, and then we can extract information about the state of the ocean.

I got about $1.2 million a year, for 15 years, from Pew to do this. And that was the basis of all of our success. We documented the world fisheries, and we made this available to the world.

Today it’s more difficult, because we don’t have this kind of support. We have about one-third of the money, assembled from various sources.

How did you get the data to document fisheries?

Basically, every year, the FAO (Food and Agriculture Organization of the United Nations) publishes the data that they get from their member countries about their fisheries. But the catch data is incomplete.

For example, Canada doesn’t send the FAO the catch of their recreational fisheries, or the catch made by foreign fleets in its own waters, or discarded catch, or bycatch — fish caught by accident alongside the target species. In the US, only the catch made in their federal waters, further than three miles from the coast, is sent.

Many Pacific Island nations will not report the catch made by local fishers, including women on reefs, which often feeds their population. That’s up to 80 percent of their catch, which goes into cooking pots but is never reported. How do we know about it? We know because the World Health Organization publishes, for example, fish consumption. Many countries don’t import fish, so they must get it from their own waters.

There are a few countries where officials can get political mileage by saying their fisheries are doing well. For example, Myanmar has for years sent to FAO not their catch, but the projection for what they expect to catch next year. That was smoothly increasing: In 2008, they had a major hurricane that destroyed half their fishing fleet, but it didn’t show up in their FAO reports.

The Chinese statistics are as bad as can be.

We have to fix all this, fill in the gaps. We find data sometimes in anthropological papers, in gender studies, in health studies, anywhere that documents what people are eating and doing.

So, you're trawling all kinds of datasets to get this information.

Yes. We made an atlas — that effort involved 400 people, friends and colleagues throughout the world who helped us on a voluntary basis to reconstruct country-by-country data going back to 1950.

Some of our country estimates are going to be too high or too low. But we are confident that the total that we get by adding all these up is not far from the truth.

What is that total? What’s the total global catch?

We think it peaked around 1996 at 130 million metric tons. In the 1990s, there were still new stocks that we could exploit that had never been exploited. Then there came a time where the expansion hit a limit. It could not expand further, even though the fishing effort, the number of boats that we have and the time they spend at sea, is more today than in the 1990s.

That 130 million metric tons of fish provide about 5 percent of the calories needed for the world. And it is perhaps 20 percent of the global fish biomass at any time. In a sustainable, well-managed fishery, you can typically catch about 10 percent per year of the biomass that is there. So the global catch is excessive.
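
To make the arithmetic in that answer explicit, here is a quick sketch using only the figures Pauly gives; the implied biomass is a back-of-the-envelope inference, not a published estimate.

```python
# Making the interview's arithmetic explicit (illustrative, not a stock assessment).

peak_catch_t = 130e6            # metric tons: peak global catch, mid-1990s
catch_share_of_biomass = 0.20   # "perhaps 20 percent of the global fish biomass"

implied_biomass_t = peak_catch_t / catch_share_of_biomass   # ~650 million tons
sustainable_rate = 0.10         # ~10% of biomass per year in a well-managed fishery
sustainable_catch_t = implied_biomass_t * sustainable_rate  # ~65 million tons

print(f"Implied global fish biomass: {implied_biomass_t / 1e6:.0f} million tons")
print(f"Rule-of-thumb sustainable catch: {sustainable_catch_t / 1e6:.0f} million tons")
# The ~130-million-ton peak catch is roughly double that rule-of-thumb figure,
# which is the sense in which "the global catch is excessive".
```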

How much have the oceans suffered as a result?

There isn’t one single way to quantify this that everybody will agree on. But if you take the abundance of big fish, like sharks, they have been reduced by 70 percent. The biomass of exploited fish populations is going down virtually everywhere. People are shifting the population of the oceans to smaller fish.

There are very few indicators that are positive. One might be money. The decline in cod, for example, has allowed shrimp and crabs and lobsters to blossom. And you can make lots of money exploiting that; some people make more money than ever before. But fisheries money is distributed differently today than in the past. It’s now more in the hands of a few industrial boats, as opposed to widespread over the landscape.

Are there bright spots or exceptions?

The Alaska pollock fishery in the northeast Pacific is an anomalous fishery in that it is very well managed. This is one of the smartest fisheries, because they have managed to keep the biomass extremely high, and it’s stable. They don’t allow too many boats, so each boat makes a huge amount of money. It’s the most profitable fishery in the world.

I should mention that US fisheries, as a whole, are better managed than ones in most other countries.

What role does fish farming have to play in a sustainable ocean?

Fish farming sounds good. But right now, only Asia gets a net benefit from fish farming. About 60 percent of the world aquaculture is in China. The overwhelming majority of their seafood production is mollusks, such as clams, oysters and so on. These are invertebrates that you don’t need to feed; they feed themselves. That’s good aquaculture.

When we talk about aquaculture in the West, we usually mean salmon and other carnivorous fish that have to be fed with fish meal. You need about 3 to 4 kilos of small fish to produce 1 kilo of salmon. This kind of aquaculture consumes fish, it doesn’t produce them. In West Africa, for example, the sardines that people used to eat are now ground up for export as fish meal.

Marine protected areas (MPAs) — spots where fishing is legally restricted or banned entirely — are often promoted by conservationists but contested by the fishing industry. Do they work?

Yes. Basically, if you fish in a certain area, the population is reduced. If you overfish, the population is reduced even more. If you don’t fish, then the population bounces back. That’s what happens. There is no conceptual blockage to the notion that they should work, and every time somebody does a study of MPAs, they do work.

When you have a large MPA, you can see from space all the boats that are fishing right at the edge: It’s called fishing the line. The fishing industry will go to the edge of an MPA and catch lots of fish, and then they will go and complain about MPAs. It’s illogical.

In December 2022, more than 190 nations pledged to protect 30 percent of the land and 30 percent of the ocean by 2030 (the 30x30 campaign). How is that going?

I don’t know. It’s not only me who doesn’t know, because often people declare an MPA but they don’t actually protect anything there. It’s just a “paper park.”

For example, there is a system of marine protected areas in Europe called Natura 2000. Somebody has investigated them and found these marine protected areas in Europe are killing zones where the fishing pressure is stronger inside than outside. I’m not making that up. The French concept of a marine protected area is an area where you can fish really without restriction, including trawling. We have developed a “paper park” index, and found the worst 11 MPAs in the world.

The 30 percent target is a worthy goal. But many countries will meet the 30 percent requirement by lying.

You have suggested banning commercial fishing on the high seas — the areas beyond “exclusive economic zones” (EEZs), which extend 200 miles off national coastlines. Wouldn’t that have a huge impact on our fish supply?

The high seas make up 60 percent of the world ocean, but less than 10 percent of the fish that is caught worldwide. So this is an immense area, producing very little fish. It gets lots of attention because that’s where we catch tuna and squid and so on. But altogether, the high seas are a sideshow.

Notably, almost every species that is caught in the high seas also moves into EEZs, where they could be fished. Right now, tuna are caught by Japan, South Korea, Taiwan, China, Spain, and two or three more countries, and that’s it. Whereas if you could fish only in the EEZs, then maybe 50 countries could access tuna. You would bring much more equity to the world.

If we close the high sea to fishing, we would still have the same catch, globally. We would produce much less greenhouse gas if fishing boats weren’t going so far out. We would have less slavery at sea because all of this would be done within national jurisdictions. And the fish populations would be much more sustainable, because there would be such a large area where they could recover.

Are there signs that this will happen?

It is an idea that will take 20 to 30 years. But it’s like the idea of exclusive economic zones. That started with a few South American countries in the 1940s to 1960s, and it was laughed about at the time: The idea that a fleet could not fish in coastal waters without authorization and without paying a fee seemed completely absurd. And yet in 1982, the Convention on the Law of the Sea was ratified and EEZs became the norm of Planet Earth. And so one cannot expect this to be to be happening right away, but it is an idea whose time has come.

Delegates to the United Nations’ Intergovernmental Conference on Marine Biodiversity of Areas Beyond National Jurisdiction (BBNJ) just agreed this month, after years of discussion, on a treaty for the high seas. It establishes a way to set up MPAs in the high seas, but doesn’t go as far as you asked in your letter.

We did not at all expect our idea of banning all fishing in the high sea to be picked up in this round — the idea is far too new. What’s important is that we have a high seas treaty, which can and will be amended later.

If we were to do all the good things — ban all fishing in the high seas, protect 30 percent of the ocean in MPAs, ban harmful subsidies, enforce stricter fishing regulations and quotas — how long would it take fish stocks to bounce back?

The sea recovers very fast. Stocks recover very fast — faster than land: They can be rebuilt within 10 years.

But we also have climate change to face.

Yes. The fish will move, toward the poles. But the ocean is getting more and more problematic for fish because the oceans are being slowly deoxygenated. As waters warm, fish require more oxygen, but the water is capable of  holding less oxygen. It will be a mess if we don’t solve the problem of greenhouse gas emission: The ocean will be inimical to higher life forms — that is clear. There will be smaller fish in a few areas that will survive.

But it will not be the fishery that brings down our civilization. Our agricultural system will collapse: That is much more fragile. Russia’s war on Ukraine has illustrated how the failure of a few countries to export wheat can trigger famines.

So right now, overfishing is the biggest problem for fish. But even if we do all the right things about that…

If we don’t lick the problem of greenhouse gas emissions, that will all be in vain. Yeah.

That’s depressing.

It’s not depressing. We have agency, we can do things. We’re not prisoners. We can get out of this hole.

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.

Why Britain’s new CPTPP trade deal will not make up for Brexit

Terence Huw Edwards, Loughborough University and Mustapha Douch, The University of Edinburgh

The UK recently announced that it will join the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP), giving British businesses access to the 11 other members of the Indo-Pacific trade bloc and bringing its combined GDP to £11 trillion.

Some commentators have suggested the deal could make up for Brexit. It’s been called “a momentous economic and strategic moment” that “kills off any likelihood that it [the UK] will ever rejoin the EU customs union or single market”. Shanker Singham of think tank the Institute of Economic Affairs has even said: “it’s no exaggeration to say that CPTPP+UK is an equivalent economic power to the EU-28-UK”, comparing it to a trade deal between the UK and EU members.

UK business and trade secretary Kemi Badenoch echoed such sentiments, telling Times Radio:

We’ve left the EU so we need to look at what to do in order to grow the UK economy and not keep talking about a vote from seven years ago.

The problem with this fanfare is that the government’s own economic analysis of the benefits of joining this bloc is underwhelming. There is an estimated gain to the UK of 0.08% of GDP – just a 50th of the estimate from the OBR (Office for Budget Responsibility) of what Brexit has cost the UK economy to date. Even for those who are sceptical about models and forecasts, that is an enormous difference in magnitude.
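
That difference in magnitude is easy to check with a one-line calculation; the sketch below uses only the figures quoted in the paragraph above.

```python
# Scale comparison implied by the article's figures (illustrative only).

cptpp_gain_pct_gdp = 0.08   # government estimate of the GDP gain from joining CPTPP
ratio_to_brexit_cost = 50   # the gain is "just a 50th of" the OBR's Brexit estimate

implied_brexit_cost_pct_gdp = cptpp_gain_pct_gdp * ratio_to_brexit_cost
print(implied_brexit_cost_pct_gdp)  # 4.0 -- i.e. a Brexit cost of about 4% of GDP
```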

Of course, the CPTPP is expected to offer the UK some real gains. It certainly provides significant potential opportunities for some individual exporters. But the estimated gains for Britain overall are very small.

The main reason for this is that, apart from Japan, the major players of the global economy are not in the CPTPP. The US withdrew from the Trans Pacific Partnership (the CPTPP is what the remaining members formed without it). And China started negotiations to join in 2022, but current geopolitics now make its entry highly improbable. India was never involved.

In addition, the UK already has free trade agreements with nine out of the 11 members. The remaining two, Malaysia and Brunei, are controversial due to environmental threats from palm oil production to rainforests and orangutans.

Britain’s existing trade agreements with CPTPP members

[Table: existing British trade agreements with CPTPP members. Author provided, using GDP data from the World Bank and trade data from UN Comtrade.]

And despite the widespread public perception of the Asia-Pacific area as a hub of future growth, the performance and prospects of the CPTPP members are a mixed bag. The largest member, Japan, is arguably in long-term decline, as is Brunei, while just three members (Vietnam, Singapore and New Zealand) had average annual growth above 3% in the last decade.

Finally, distance really does matter in trade. All the CPTPP members are thousands of miles from the UK, which explains their relatively small shares in UK trade at present.

Some benefits of CPTPP

While all of these points pour cold water on the suggested gains, there are some potential benefits from the CPTPP agreement, which allows for mutual recognition of certain standards. This includes patents and some relaxation of sanitary and phytosanitary rules on food items.

However, agreements over standards will involve the UK submitting to international CPTPP courts on these issues. This sits uncomfortably with many of the “sovereignty” objections to the European Court of Justice in relation to Brexit (largely from many of those who have extolled the CPTPP). It’s also notable that out of the nine agreements with CPTPP members that existed before the UK signed this deal, all but two are rollovers of previous EU deals.

But a trade deal with the CPTPP is worth more to the UK than separate deals with each member due to requirements around “rules of origin”, which determine the national source of a product. When a product contains inputs from more than one country, a series of separate free trade agreements may not eliminate tariffs. But if all the relevant countries are members of a single free trade agreement, then rules of origin on inputs from other members cease to be a problem (although there might be some issues if some members do not police the requirements properly).

Not the ideal agreement

While these benefits should be recognised, we should also acknowledge that the CPTPP is not the ideal agreement for Britain. As stated above, distance really does matter in trade – this is overwhelmingly accepted by modern trade economists.

Research shows that the rate at which trade declines with distance has barely changed over more than a century. This might seem strange because transport costs have fallen over time. But, as transport and communications have improved, firms have outsourced much of their production to complex supply chains that often cross national borders many times, with “just-in-time” supply schedules to keep down the costs of holding large stocks.

This means that, while trade everywhere has grown, there is still a big premium for trading (many times) across borders between contiguous countries. It is exactly this type of trade which benefits most from big comprehensive trade agreements that simplify rules of origin and regulatory paperwork.

This suggests that, while some elements of the CPTPP offer benefits to the UK, it is unlikely to boost its trade in the way it does between countries around the Pacific Rim. For this sort of boost, the UK really needs to look towards its own neighbours. Of course, this is just the sort of agreement that Badenoch seems reluctant to discuss.

Terence Huw Edwards, Senior Lecturer in Economics, Loughborough University and Mustapha Douch, Assistant Professor in Economics, The University of Edinburgh

This article is republished from The Conversation under a Creative Commons license. Read the original article.