Stickleback Fish Reveal Insights into Animal Decision-Making – Neuroscience News

Summary: A new study provides significant insights into how animals, specifically three-spined stickleback fish, make decisions under competing demands. The study explored the fish's behavior during the breeding season, when males must simultaneously defend territory, court females, and care for offspring.

By exposing male sticklebacks to various stimuli and analyzing their behavioral responses and brain gene expression, the researchers discovered a complex interaction between territorial defense and courtship, with males generally prioritizing defense.

This research not only sheds light on animal decision-making processes but also suggests ancient mechanisms driving complex decision-making across many taxa.

Source: University of Illinois

How do animals make decisions when faced with competing demands, and how have decision-making processes evolved over time?

In a recent publication in Biology Letters, Tina Barbasch, a postdoctoral researcher at the Carl R. Woese Institute for Genomic Biology, and Alison Bell (GNDP), a professor in the Department of Ecology, Evolution and Behavior, explored these questions using three-spined stickleback fish (Gasterosteus aculeatus).

Whether you are in school, working, raising children, managing a social life, or just trying to relax for a moment, managing multiple responsibilities at once can quickly become overwhelming. You may find yourself wondering how much simpler life would be if you were a fish floating along a river or a hawk soaring through the sky.

Yet, animals also face the burdens of multitasking, whether it be searching for their next meal while avoiding becoming someone else's next meal or attracting a mate while defending their territory.

"During my PhD, I studied parental care in clownfish, and how they decide how much care to provide for their offspring," says Barbasch.

"This requires the integration of many sources of social and environmental information. Recently, I have become interested in understanding the mechanisms underlying how animals make decisions and integrate different sources of information."

Despite the importance of decision-making for an animal's fitness, the mechanisms that shape decision-making are not well understood. Stickleback are a powerful model for investigating these questions because of their complex life history and reproductive behavior.

During the breeding season, male sticklebacks establish territories to build nests to attract females. Males must simultaneously defend their territories from other males, court females that enter their territory with performative swimming motions, called zig-zags, and ultimately provide care for offspring if they can successfully court a female.

"This study was inspired by an experiment where we looked at brain gene expression in male three-spined stickleback during parental care or when defending their territory," explained Bell.

"We found that the same genes were involved in both experiments, but in opposite directions: genes turned on in one condition were turned off in the other. This idea that the brain might be using the same molecular machinery, but in opposite ways, could have major implications for the evolution of decision-making."

To explore the underlying molecular mechanisms of decision-making, Barbasch exposed male stickleback to one of three stimuli: a female stickleback (courtship treatment), another male stickleback (territorial intruder treatment), or both a male and a female stickleback (trade-off treatment).

Some male stickleback were left alone as a control. Aggressive behaviors (biting) and courtship behaviors (zig-zags) were quantified, and then the brains of the male stickleback were dissected to look at gene expression using RNA sequencing.

Barbasch found that, when faced with a trade-off, males generally prioritized territorial defense over courtship. There was also substantial variation across males in how they responded, suggesting that there might be different strategies that males employ when faced with a trade-off.

Furthermore, the gene expression results identified groups of genes that were differentially expressed across each of the experimental treatments relative to a control. Of particular interest are the genes that are only present in the trade-off treatment, because they suggest that males have a unique molecular response when faced with conflicting demands.

"We performed gene ontology analysis on these trade-off genes to look into what the identity and function of these genes might be," describes Barbasch. "Preliminary results suggest the trade-off genes may be related to the dopamine response pathway, which modulates reward and motivation in the brain, or neurogenesis, which is important for cognition."
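
Gene ontology analysis of this kind typically asks whether the genes in a list of interest share an annotation more often than chance would predict. As a rough illustration only, and not the study's actual pipeline, the sketch below runs a hypergeometric over-representation test in Python; the gene identifiers and annotation sets are entirely hypothetical.

```python
from scipy.stats import hypergeom

def go_enrichment(study_genes, population_genes, go_annotations):
    """Hypergeometric over-representation test for each GO term.

    study_genes      -- genes of interest (e.g. trade-off-specific genes)
    population_genes -- all genes assayed (the background)
    go_annotations   -- dict mapping GO term -> set of annotated genes
    Returns (term, overlap, p_value) tuples sorted by p-value.
    """
    N = len(population_genes)                 # background size
    n = len(study_genes)                      # size of the study set
    results = []
    for term, annotated in go_annotations.items():
        annotated = annotated & population_genes
        K = len(annotated)                    # background genes with this term
        k = len(annotated & study_genes)      # study genes with this term
        if k == 0:
            continue
        p = hypergeom.sf(k - 1, N, K, n)      # P(overlap >= k) by chance
        results.append((term, k, p))
    return sorted(results, key=lambda r: r[2])

# Toy run with made-up identifiers, not data from the study
background = {f"gene{i}" for i in range(1000)}
tradeoff_genes = {"gene1", "gene2", "gene3", "gene4"}
annotations = {
    "dopamine receptor signaling": {"gene1", "gene2", "gene3", "gene50"},
    "neurogenesis": {"gene4", "gene200", "gene300"},
}
print(go_enrichment(tradeoff_genes, background, annotations))
```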

Ultimately, these findings highlight the importance of exploring the molecular basis of animal behavior, as Bell outlines: "Animals are living really complicated lives, across many taxa. This suggests that the mechanisms that are driving complex decision-making are probably really ancient and animals have been managing complex decisions for a long time."

Barbasch's study also sets the foundation for a wide range of exciting follow-up studies. She has already started to explore the behavioral and molecular responses by stickleback to other trade-offs, including those involving predation risk, foraging, and parental care.

She also plans on expanding her molecular toolkit by quantifying gene expression in finer detail using single-cell RNA sequencing and weighted gene co-expression network analysis, which helps capture gene function by identifying networks of genes with related patterns of expression.
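
Weighted gene co-expression network analysis rests on a simple core idea: genes whose expression rises and falls together across samples get grouped into modules. The minimal sketch below, run on a synthetic expression matrix rather than any data from this work, shows that idea in miniature: correlate gene profiles, soft-threshold the correlations, and cluster the result into candidate modules.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical expression matrix: 200 genes x 30 samples, built from four
# latent "programs" so that groups of genes share expression patterns.
programs = rng.normal(size=(4, 30))
membership = rng.integers(0, 4, size=200)
expr = programs[membership] + 0.5 * rng.normal(size=(200, 30))

# 1. Pairwise correlation between gene expression profiles
corr = np.corrcoef(expr)

# 2. Soft-threshold the absolute correlations; this weighting step is what
#    gives weighted co-expression network analysis its name
adjacency = np.abs(corr) ** 6

# 3. Turn similarity into distance and cluster genes into candidate modules
dist = 1.0 - adjacency
condensed = dist[np.triu_indices_from(dist, k=1)]
tree = linkage(condensed, method="average")
modules = fcluster(tree, t=0.9, criterion="distance")

print(f"recovered {modules.max()} candidate modules (4 were simulated)")
```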

So, the next time you notice an animal doing something, think a bit deeper about their day-to-day life, and how they are finding a way to manage all their responsibilities.

Author: Nicholas Vasi
Source: University of Illinois
Contact: Nicholas Vasi, University of Illinois
Image: The image is credited to Neuroscience News

Original Research: Open access. "A distinct neurogenomic response to a trade-off between social challenge and opportunity in male sticklebacks (Gasterosteus aculeatus)" by Alison Bell et al. Biology Letters

Abstract

A distinct neurogenomic response to a trade-off between social challenge and opportunity in male sticklebacks (Gasterosteus aculeatus)

Animals frequently make adaptive decisions about what to prioritize when faced with multiple, competing demands simultaneously.

However, the proximate mechanisms of decision-making in the face of competing demands are not well understood.

We explored this question using brain transcriptomics in a classic model system: threespined sticklebacks, where males face conflict between courtship and territorial defence. We characterized the behaviour and brain gene expression profiles of males confronted by a trade-off between courtship and territorial defence by comparing them to males not confronted by this trade-off.

When faced with the trade-off, males behaviourally prioritized defence over courtship, and this decision was reflected in their brain gene expression profiles. A distinct set of genes and biological processes was recruited in the brain when males faced a trade-off and these responses were largely non-overlapping across two brain regions.

Combined, these results raise new questions about the interplay between the neural and molecular mechanisms involved in decision-making.


Mapping Ketamine’s Impact on the Brain – Neuroscience News

Summary: A study reveals that repeated use of ketamine leads to structural changes in the brain's dopamine system, emphasizing the need for targeted ketamine therapies.

The research suggests that specific brain regions should be addressed to minimize unintended effects on other dopamine areas. Repeated ketamine exposure decreases dopamine neurons linked to mood regulation and increases dopamine neurons related to metabolism and basic functions.

These findings may explain ketamine's potential in treating eating disorders and the dissociative behavioral effects observed in users. The study paves the way for improved ketamine applications in clinical settings.

Source: Columbia University

Ketamine, an anesthetic also known for its illicit use as a recreational drug, has undergone a thorough reputational rehabilitation in recent years as the medical establishment has begun to recognize its wide-ranging therapeutic effects.

The drug is increasingly used for a range of medical purposes, including as a painkiller alternative to opioids, and as a therapy for treatment-resistant depression.

In a new study published in the journal Cell Reports, Columbia biologists and biomedical engineers mapped ketamine's effects on the brains of mice, and found that repeated use over extended periods of time leads to widespread structural changes in the brain's dopamine system.

The findings bolster the case for developing ketamine therapies that target specific areas of the brain, rather than administering doses that wash the entire brain in ketamine.

"Instead of bathing the entire brain in ketamine, as most therapies now do, our whole-brain mapping data indicates that a safer approach would be to target specific parts of the brain with it, so as to minimize unintended effects on other dopamine regions of the brain," said Raju Tomer, the senior author of the paper.

The study found that repeated ketamine exposure leads to a decrease in dopamine neurons in regions of the midbrain that are linked to regulating mood, as well as an increase in dopamine neurons in the hypothalamus, which regulates the body's basic functions like metabolism and homeostasis.

The former finding, that ketamine decreases dopamine in the midbrain, may indicate why long-term abuse of ketamine could cause users to exhibit symptoms similar to those of schizophrenia, a psychiatric disorder.

The latter finding, that ketamine increases dopamine in the parts of the brain that regulate metabolism, on the other hand, may help explain why it shows promise in treating eating disorders.

The researchers' highly detailed data also enabled them to track how ketamine affects dopamine networks across the brain. They found that ketamine reduced the density of dopamine axons, or nerve fibers, in the areas of the brain responsible for our hearing and vision, while increasing dopamine axons in the brain's cognitive centers. These intriguing findings may help explain the dissociative behavioral effects observed in individuals exposed to ketamine.

"The restructuring of the brain's dopamine system that we see after repeated ketamine use may be linked to cognitive behavioral changes over time," said Malika Datta, a co-author of the paper.

Most studies of ketamine's effects on the brain to date have looked at the effects of acute exposure, that is, how one dose affects the brain in the immediate term. For this study, researchers examined repeated daily exposure over the course of up to ten days. Statistically significant alterations to the brain's dopamine makeup were detectable only after ten days of daily ketamine use.

The researchers assessed the effects of repeated exposure to the drug at two doses: one analogous to the dose used to model depression treatment in mice, and another closer to the dose that induces anesthesia. The drug's effects on the dopamine system were visible at both doses.

"The study is charting a new technological frontier in how to conduct high-resolution studies of the entire brain," said Yannan Chen, a co-author of the paper. It is the first successful attempt to map changes induced by chronic ketamine exposure at what is known as sub-cellular resolution, in other words, down to the level of seeing ketamine's effects on parts of individual cells.

Most sub-cellular studies of ketamine's effects conducted to date have been hypothesis-driven investigations of one area of the brain that researchers have targeted because they believed that it might play an important role in how the brain metabolizes the drug. This study is the first sub-cellular study to examine the entire brain without first forming such a hypothesis.

Bradley Miller, a Columbia psychiatrist and neuroscientist who focuses on depression, said: "Ketamine rapidly resolves depression in many patients with treatment-resistant depression, and it is being investigated for longer-term use to prevent the relapse of depression.

"This study reveals how ketamine rewires the brain with repeated use. This is an essential step for developing targeted treatments that effectively treat depression without some of the unwanted side effects of ketamine."

The research was supported by the National Institutes of Health (NIH) and the National Institute of Mental Health (NIMH). The paper's lead authors are Malika Datta and Yannan Chen, who completed their research in Raju Tomer's lab at Columbia. Datta is now a postdoctoral fellow at Yale.

"This study gives us a deeper brain-wide perspective of how ketamine functions that we hope will contribute to improved uses of this highly promising drug in various clinical settings, as well as help minimize its recreational abuse. More broadly, the study demonstrates that the same type of neurons located in different brain regions can be affected differently by the same drug," said Tomer.

Author: Christopher Shea
Source: Columbia University
Contact: Christopher Shea, Columbia University
Image: The image is credited to Neuroscience News

Original Research: The findings will be published in Cell Reports


The Growing Synergy of AI and Neuroscience in Decoding the Human Brain – Securities.io

Artificial intelligence (AI) has been the talk of the town lately, with chatbots like OpenAI's ChatGPT, Google's Bard, and Elon Musk's Grok gaining a lot of traction. However, AI isn't as new as these chatbots; interest in the field dates back to 1950, when scientist Alan Turing proposed a test of machine intelligence called the Imitation Game in his paper "Computing Machinery and Intelligence."

"Can machines think?" asks Turing in his paper, proposing what became known as the Turing Test, in which a human interrogator tries to distinguish between text responses from a computer and from a human.

Since then, advancements in technology have led to more sophisticated AI systems that have been used across different fields, including healthcare and the understanding and treatment of the most complex human organ, the brain.


Broadly speaking, AI systems reason, learn, and perform tasks commonly associated with human cognitive functions, such as identifying patterns and interpreting speech by processing massive amounts of data.

AI is basically a set of technologies that enable computers to perform a variety of advanced functions. The backbone of innovation in modern computing, AI encompasses a range of disciplines, including machine learning and deep learning.

These AI models, which simulate cognitive processes and aid in complex cognitive tasks such as language translation and image recognition, are modeled on biological neural networks, complex systems of interconnected neurons, and help 'train' machines to make sense of speech, images, and patterns.

The intricate and intelligent human brain has long challenged scientists seeking to unlock possibilities for human augmentation. However, while AI has been harnessed to create the likes of Apple's Siri, Amazon's Alexa, and IBM's Watson, the truly transformative impact will only be achieved when artificial neural networks are augmented by native human intelligence, the product of a long evolutionary history.

Although computers still can't match the complete flexibility of humans, there are programs that manage to execute specific tasks, with the scope of AI's applications expanding daily. This technological progress, coupled with advancements in science, has notably led to the utilization of AI in medical diagnosis and treatment.

By analyzing large amounts of patient data from multiple sources, AI helps healthcare providers build a complete picture of a patient's health, make more accurate predictions, and reach more informed decisions about patient care. This further helps detect potential health problems earlier, before they become life-threatening. Moreover, by using AI, healthcare providers can automate routine tasks, allowing them to focus on more complex patient care.


Groundbreaking research in neuroscience has led to the development of advanced brain imaging techniques, including MRI-based methods.

Concurrently, AI algorithms, particularly in machine learning and deep learning, have become more sophisticated, leading to an intersection of the two fields. This convergence is enabling scientists to analyze and understand brain data at an unprecedented scale.

The intersection of AI and neuroscience, the field focusing on the nervous system and brain, is particularly evident in the realm of data analysis. Presently, AI empowers scientists and researchers to map brain regions with unprecedented accuracy. This has been made possible by technological advancements in AI that allow intricate patterns in brain data to be classified and correlated. This collaboration has also paved the way for researchers to better comprehend neural pathways.

With the help of AI, medical diagnostics could improve in prediction accuracy, speed, and efficiency. AI-powered brain imaging studies have found subtle changes in brain structure that appear before clinical symptoms emerge, which has enormous potential for early detection and intervention and could revolutionize our approach to neurodegenerative disorders.

For instance, late last month, researchers leveraged AI to analyze specialized brain MRI scans of individuals with attention-deficit/hyperactivity disorder (ADHD). ADHD is a common disorder, with an estimated 5.7 million children and adolescents between the ages of 6 and 17 diagnosed with it in the US.

The disorder, whose rising prevalence some attribute to the influx of smartphones, can have a huge impact on a patient's quality of life, as children with ADHD tend to have trouble paying attention and regulating activity. Early diagnosis and intervention are key to managing it, but ADHD, as study co-author Justin Huynh said, "is extremely difficult to diagnose."

The study used fractional anisotropy (FA) values as input for training a deep-learning AI model to diagnose ADHD in a quantitative, objective diagnostic framework.
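
The article does not spell out the network architecture, so the following is only a generic illustration of the approach: a small feed-forward classifier trained on per-tract fractional anisotropy values. The data, dimensions and labels below are synthetic stand-ins, not anything from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Hypothetical data: one row per participant, one column per white-matter
# tract's mean fractional anisotropy (FA); labels are ADHD vs. control.
n_subjects, n_tracts = 300, 30
X = rng.normal(loc=0.45, scale=0.05, size=(n_subjects, n_tracts))
y = rng.integers(0, 2, size=n_subjects)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# A small feed-forward network standing in for the study's deep-learning model
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0))
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print("AUC on held-out subjects:", round(roc_auc_score(y_test, probs), 3))
```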

As we saw, by feeding algorithms massive datasets of brain scans and patient histories, researchers can train them to pick out subtle markers that humans may miss. This, in turn, increases diagnostic accuracy, resulting in earlier interventions and better patient outcomes.

Studying new brain-imaging technology to understand the secrets of brain science and then linking it with AI to simulate the brain is also a way to close the gap between AI and human intelligence. Already, there have been many advancements in brain-computer interfaces (BCIs) by companies like Neuralink. BCIs connect the brain directly to external devices, allowing disabled people to control prosthetics and interact with the world by thought alone, showcasing their potential for many scientific and practical applications.

This merger of human intelligence and AI could ultimately create 'superhumans,' but it needs computing models that integrate visual and natural language processing, just as the human brain does, for comprehensive communication. In this context, virtual assistants can address both simple and complex tasks, but machines need to learn to understand richer contexts to achieve human-like communication skills.

In healthcare, diagnostics involves evaluating medical conditions or diseases by analyzing symptoms, medical history, and test results. Its goal is to make use of tests such as imaging tests, blood tests, etc, to determine the cause of a medical problem and make an accurate diagnosis to provide effective treatment. In addition, diagnostics can be used to monitor the progress of a condition and assess the effectiveness of treatment.

The potential of AI in treatment is pretty compelling. Artificial intelligence can provide an analysis of a person's brain characteristics as well as their medical history, genetics, lifestyle data, and other factors, based on which it can offer personalized medicine. This way, AI promises tailored treatment plans that take into account the unique intricacies of each patient's brain.

By identifying unique, unbiased patterns in data, AI can potentially also discover new biomarkers or intervention methods. AI-based systems are faster and more efficient than manual processes and significantly reduce human errors.

A team of researchers recently used AI to predict the optimal method for synthesizing drug molecules. This method, according to the paper's lead author David Nippa, has the potential to reduce the number of required lab experiments significantly, thereby increasing both the efficiency and sustainability of chemical synthesis.

The AI model was trained on data from trustworthy scientific works and experiments from an automated lab and can successfully predict the position of borylation for any molecule and provide the optimal conditions for the chemical transformation. Already being used to identify positions in existing active ingredients where additional active groups can be introduced, this model will help in developing new and more effective variants of known drug active ingredients more quickly.

Now, let's take a look at some of the publicly traded companies in the medical sector that are making use of the technology.

This pharma giant has been investing in AI for biomedical data analysis and drug discovery and development. With a market cap of $223.48 bln, Novartis stock is currently trading at $98.27, up 8.17% this year. The company's trailing-twelve-month (TTM) revenue is $47.88 bln, with EPS (TTM) of 3.59, P/E (TTM) of 27.30, and ROE (TTM) of 14.94%. Meanwhile, the dividend yield is 3.57%.

The company has been integrating AI across its operations, including analyzing vast datasets covering public health records, prescription data, internal data, and medical insurance claims to identify potential trial patients to optimize clinical trial design. Using the AI tool has made enrolling patients in trials faster, cheaper, and more efficient, according to Novartis.

This research-based biopharmaceutical company has a market cap of $163.238 bln, and its shares are currently trading at $28.97, down 43.58% this year. The company's trailing-twelve-month (TTM) revenue is $68.53 bln, with EPS (TTM) of 1.82, P/E (TTM) of 15.88, and ROE (TTM) of 11.05%. Meanwhile, the dividend yield is 5.67%.

Pfizer has been showing a lot of interest in leveraging AI to enhance its drug discovery efforts. The company has partnered with many AI companies, such as CytoReason, Tempus, Gero, and Truveta. Meanwhile, to improve its oncology clinical trials, Pfizer signed a data-sharing agreement with oncology AI company Vysioneer, which also has an FDA-cleared AI-powered brain tumor auto-contouring solution called VBrain.

In addition to creating an ML research hub to build new predictive models and tools, Pfizer also partnered with Amazon Web Services, one of the largest cloud providers, to use cloud computing in drug discovery and manufacturing. This partnership proved particularly valuable during the COVID-19 pandemic in various aspects of the vaccine's development, from manufacturing to clinical trials.

This biopharmaceutical company has a market cap of $200.8 bln, and its shares are currently trading at $64.86, down 4.44% this year. The company's trailing-twelve-month (TTM) revenue is almost $45 bln, with EPS (TTM) of 1.89, P/E (TTM) of 34.29, and ROE (TTM) of 16.30%. Meanwhile, the dividend yield is 2.22%.

The Anglo-Swedish drugmaker has been investing in AI to analyze complex biological data for drug discovery and has been collaborating with AI companies to enhance their research capabilities. Most recently, AstraZeneca signed a deal worth up to $247 million with AI-based biologics drug discovery company Absci to design an antibody to fight cancer. The biologics firm makes use of generative AI to get optimal drug candidates based on traits such as affinity, manufacturability, and safety, among others.

Last month, AstraZeneca formed a health-technology unit dubbed Evinova to accelerate innovation and bring AI to clinical trials. The company has also gained early access to AI-driven 'digital twins' and signed an AI-powered drug discovery pact with Verge Genomics through its rare disease division, Alexion.

This AI-enabled drug discovery and development company has a market cap of $86.45 bln, and its shares are currently trading at $0.545, down 84.43% this year. The company's EPS (TTM) is 0.75, and P/E (TTM) is 0.72.

BenevolentAI is a clinical-stage company developing a treatment for atopic dermatitis as well as potential treatments for chronic diseases and cancer. It uses predictive AI algorithms to analyze and extract the needed insights from the available data and scientific literature. Back in May this year, as part of a strategic plan to position itself for a new era in AI, the company shared that it would reduce spending and free up net cash to increase its financial flexibility.

The company has established partnerships with other big pharmaceutical companies such as GSK and Novartis, while its collaboration with AstraZeneca focuses on developing drugs for fibrosis and chronic kidney disease. A few months ago, BenevolentAI also partnered with Merck KGaA to leverage its expertise in oncology and neuroinflammation and support the company's AI-driven drug discovery plans by focusing on finding viable small molecule candidates.

As we saw, AI has vast potential to enhance the diagnosis and treatment of brain diseases. It can even help predict brain disorders based on minor deviations from normal brain activity, leading to improved patient outcomes and a more efficient and effective healthcare system. However, it must be noted that this intersection of AI and the human brain is not without its ethical concerns and hence demands strict privacy safeguards.


Implant Shows Promise in Restoring Cognitive Function After Brain Injury – Neuroscience News

Summary: A groundbreaking study successfully restored cognitive function in patients with lasting impairments from traumatic brain injuries using deep-brain-stimulation devices.

This innovative technique targets the central lateral nucleus in the thalamus to reactivate neural pathways associated with attention and arousal.

The study's participants, who had suffered moderate to severe brain injuries, showed remarkable improvements in mental processing speed, concentration, and daily life activities.

These findings offer new hope for individuals struggling with the long-term effects of traumatic brain injuries.

Source: Stanford

In 2001, Gina Arata was in her final semester of college, planning to apply to law school, when she suffered a traumatic brain injury in a car accident. The injury so compromised her ability to focus that she struggled in a job sorting mail.

"I couldn't remember anything," said Arata, who lives in Modesto with her parents. "My left foot dropped, so I'd trip over things all the time. I was always in car accidents. And I had no filter; I'd get pissed off really easily."

Her parents learned about research being conducted at Stanford Medicine and reached out; Arata was accepted as a participant. In 2018, physicians surgically implanted a device deep inside her brain, then carefully calibrated the device's electrical activity to stimulate the networks the injury had subdued.

She noticed the difference immediately: When she was asked to list items in the produce aisle of a grocery store, she could rattle off fruits and vegetables. Then a researcher turned the device off, and she couldn't name any.

"Since the implant I haven't had any speeding tickets," Arata said. "I don't trip anymore. I can remember how much money is in my bank account. I wasn't able to read, but after the implant I bought a book, Where the Crawdads Sing, and loved it and remembered it. And I don't have that quick temper."

For Arata and four others, the experimental deep-brain-stimulation device restored, to different degrees, the cognitive abilities they had lost to brain injuries years before. The new technique, developed by Stanford Medicine researchers and collaborators from other institutions, is the first to show promise against the long-lasting impairments from moderate to severe traumatic brain injuries.

The results of the clinical trial will be published Dec. 4 in Nature Medicine.

Dimmed lights

More than 5 million Americans live with the lasting effects of moderate to severe traumatic brain injury: difficulty focusing, remembering and making decisions. Though many recover enough to live independently, their impairments prevent them from returning to school or work and from resuming their social lives.

"In general, there's very little in the way of treatment for these patients," said Jaimie Henderson, MD, professor of neurosurgery and co-senior author of the study.

But the fact that these patients had emerged from comas and recovered a fair amount of cognitive function suggested that the brain systems that support attention and arousal (the ability to stay awake, pay attention to a conversation, focus on a task) were relatively preserved.

These systems connect the thalamus, a relay station deep inside the brain, to points throughout the cortex, the brain's outer layer, which control higher cognitive functions.

"In these patients, those pathways are largely intact, but everything has been down-regulated," said Henderson, the John and Jene Blume-Robert and Ruth Halperin Professor. "It's as if the lights had been dimmed and there just wasn't enough electricity to turn them back up."

In particular, an area of the thalamus called the central lateral nucleus acts as a hub that regulates many aspects of consciousness.

"The central lateral nucleus is optimized to drive things broadly, but its vulnerability is that if you have a multifocal injury, it tends to take a greater hit because a hit can come from almost anywhere in the brain," said Nicholas Schiff, MD, a professor at Weill Cornell Medicine and co-senior author of the study.

The researchers hoped that precise electrical stimulation of the central lateral nucleus and its connections could reactivate these pathways, turning the lights back up.

Precise placement

In the trial, the researchers recruited five participants who had lasting cognitive impairments more than two years after moderate to severe traumatic brain injury. They were aged 22 to 60, with injuries sustained three to 18 years earlier.

The challenge was placing the stimulation device in exactly the right area, which varied from person to person. Each brain is shaped differently to begin with, and the injuries had led to further modifications.

"That's why we developed a number of tools to better define what that area was," Henderson said. The researchers created a virtual model of each brain that allowed them to pinpoint the location and level of stimulation that would activate the central lateral nucleus.

Guided by these models, Henderson surgically implanted the devices in the five participants.

"It's important to target the area precisely," he said. "If you're even a few millimeters off target, you're outside the effective zone."

A pioneering moment

After a two-week titration phase to optimize the stimulation, the participants spent 90 days with the device turned on for 12 hours a day.

Their progress was measured by a standard test of mental processing speed, called the trail-making test, which involves drawing lines connecting a jumble of letters and numbers.

"It's a very sensitive test of exactly the things that we're looking at: the ability to focus, concentrate and plan, and to do this in a way that is sensitive to time," Henderson said.

At the end of the 90-day treatment period, the participants had improved their speeds on the test, on average, by 32%, far exceeding the 10% the researchers had aimed for.

"The only surprising thing is it worked the way we predicted it would, which is not always a given," Henderson said.

For the participants and their families, the improvements were apparent in their daily lives. They resumed activities that had seemed impossible: reading books, watching TV shows, playing video games or finishing a homework assignment. They felt less fatigued and could get through the day without napping.

The therapy was so effective the researchers had trouble completing the last part of their study. They had planned a blinded withdrawal phase, in which half the participants would be randomly selected to have their devices turned off.

Two of the patients declined, unwilling to take that chance. Of the three who participated in the withdrawal phase, one was randomized to have their device turned off. After three weeks without stimulation, that participant performed 34% slower on the trail-making test.

The clinical trial is the first to target this region of the brain in patients with moderate to severe traumatic brain injury, and it offers hope for many who have plateaued in their recovery.

"This is a pioneering moment," Schiff said. "Our goal now is to try to take the systematic steps to make this a therapy. This is enough of a signal for us to make every effort."

Researchers from Weill Cornell Medicine, Spaulding Rehabilitation Hospital in Boston, Harvard Medical School, the University of Utah, the University of Florida, Vanderbilt University, the University of Washington, the University of Bordeaux and the Cleveland Clinic also contributed to the study.

Funding: The study was supported by funding from the National Institutes of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic.

Author: Nina Bai
Source: Stanford
Contact: Nina Bai, Stanford
Image: The image is credited to Neuroscience News

Original Research: Closed access. "Thalamic deep brain stimulation in traumatic brain injury: a phase 1, randomized feasibility study" by Jaimie Henderson et al. Nature Medicine

Abstract

Thalamic deep brain stimulation in traumatic brain injury: a phase 1, randomized feasibility study

Converging evidence indicates that impairments in executive function and information-processing speed limit quality of life and social reentry after moderate-to-severe traumatic brain injury (msTBI). These deficits reflect dysfunction of frontostriatal networks for which the central lateral (CL) nucleus of the thalamus is a critical node. The primary objective of this feasibility study was to test the safety and efficacy of deep brain stimulation within the CL and the associated medial dorsal tegmental (CL/DTTm) tract.

Six participants with msTBI, who were between 3 and 18 years post-injury, underwent surgery with electrode placement guided by imaging and subject-specific biophysical modeling to predict activation of the CL/DTTm tract. The primary efficacy measure was improvement in executive control indexed by processing speed on part B of the trail-making test.

All six participants were safely implanted. Five participants completed the study and one was withdrawn for protocol non-compliance. Processing speed on part B of the trail-making test improved 15% to 52% from baseline, exceeding the 10% benchmark for improvement in all five cases.

CL/DTTm deep brain stimulation can be safely applied and may improve executive control in patients with msTBI who are in the chronic phase of recovery.

ClinicalTrials.gov identifier: NCT02881151.


Dopamine’s Role in Learning from Rewards and Penalties – Neuroscience News

Summary: Dopamine, a neurotransmitter, plays a vital role in encoding both reward and punishment prediction errors in the human brain.

This study suggests that dopamine is essential for learning from both positive and negative experiences, enabling the brain to adapt behavior based on outcomes. Using electrochemical techniques and machine learning, scientists measured dopamine levels in real-time during a computer game involving rewards and penalties.

The findings shed light on the intricate role of dopamine in human behavior and could have implications for understanding psychiatric and neurological disorders.

Source: Wake Forest Baptist Medical Center

What happens in the human brain when we learn from positive and negative experiences? To help answer that question and better understand decision-making and human behavior, scientists are studying dopamine.

Dopamine is a neurotransmitter produced in the brain that serves as a chemical messenger, facilitating communication between nerve cells in the brain and the body. It is involved in functions such as movement, cognition and learning. While dopamine is most known for its association with positive emotions, scientists are also exploring its role in negative experiences.

Now, a new study from researchers at Wake Forest University School of Medicine published Dec. 1 in Science Advances shows that dopamine release in the human brain plays a crucial role in encoding both reward and punishment prediction errors.

This means that dopamine is involved in the process of learning from both positive and negative experiences, allowing the brain to adjust and adapt its behavior based on the outcomes of these experiences.

"Previously, research has shown that dopamine plays an important role in how animals learn from rewarding (and possibly punishing) experiences. But little work has been done to directly assess what dopamine does on fast timescales in the human brain," said Kenneth T. Kishida, Ph.D., associate professor of physiology and pharmacology and neurosurgery at Wake Forest University School of Medicine.

"This is the first study in humans to examine how dopamine encodes rewards and punishments and whether dopamine reflects an optimal teaching signal that is used in today's most advanced artificial intelligence research."
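
The "teaching signal" Kishida refers to is the prediction error at the heart of reinforcement learning: the difference between the outcome an agent receives and the outcome it expected. The minimal sketch below illustrates the idea with a Rescorla-Wagner-style value update; the learning rate and the sequence of outcomes are arbitrary illustrations, not parameters from the study.

```python
def update_value(expected, outcome, learning_rate=0.1):
    """One step of prediction-error learning (Rescorla-Wagner style).

    The prediction error is positive when the outcome is better than
    expected (reward) and negative when it is worse (punishment); this is
    the kind of teaching signal dopamine is hypothesized to carry.
    """
    prediction_error = outcome - expected
    new_expected = expected + learning_rate * prediction_error
    return new_expected, prediction_error

# Illustrative sequence of monetary outcomes (+1 = gain, -1 = loss)
value = 0.0
for outcome in [1, 1, -1, 1, -1, -1]:
    value, delta = update_value(value, outcome)
    print(f"outcome={outcome:+d}  prediction error={delta:+.2f}  value={value:+.2f}")
```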

For the study, researchers on Kishida's team utilized fast-scan cyclic voltammetry, an electrochemical technique, paired with machine learning, to detect and measure dopamine levels in real-time (i.e., 10 measurements per second). However, this method is challenging and can only be performed during invasive procedures such as deep-brain stimulation (DBS) brain surgery.

DBS is commonly employed to treat conditions such as Parkinson's disease, essential tremor, obsessive-compulsive disorder and epilepsy.

Kishida's team collaborated with Atrium Health Wake Forest Baptist neurosurgeons Stephen B. Tatter, M.D., and Adrian W. Laxton, M.D., who are also both faculty members in the Department of Neurosurgery at Wake Forest University School of Medicine, to insert a carbon fiber microelectrode deep into the brain of three participants at Atrium Health Wake Forest Baptist Medical Center who were scheduled to receive DBS to treat essential tremor.

While the participants were awake in the operating room, they played a simple computer game. As they played the game, dopamine measurements were taken in the striatum, a part of the brain that is important for cognition, decision-making, and coordinated movements.

During the game, participants' choices were either rewarded or punished with real monetary gains or losses. The game was divided into three stages in which participants learned from positive or negative feedback to make choices that maximized rewards and minimized penalties. Dopamine levels were measured continuously, once every 100 milliseconds, throughout each of the three stages of the game.
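
The article describes pairing fast-scan cyclic voltammetry with machine learning, but not the exact model, so the sketch below is only a generic stand-in: a penalized linear regression trained on hypothetical calibration sweeps with known dopamine concentrations, which could then be applied to new sweeps at roughly ten per second to yield a real-time estimate.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical calibration data: each row is one voltammetric sweep (current
# sampled at 1000 points of the voltage ramp) with a known dopamine
# concentration, e.g. from an in vitro flow cell.
n_sweeps, n_points = 500, 1000
sweeps = rng.normal(size=(n_sweeps, n_points))
concentration = rng.uniform(0, 500, size=n_sweeps)  # nM
# Inject a concentration-dependent signature so the model has something to learn
sweeps[:, 300:320] += concentration[:, None] * 0.01

X_train, X_test, y_train, y_test = train_test_split(
    sweeps, concentration, test_size=0.2, random_state=0)

# Penalized regression stands in for whatever supervised model maps sweep
# shape to concentration in the real pipeline.
model = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(X_train, y_train)
print("R^2 on held-out sweeps:", round(model.score(X_test, y_test), 3))
```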

"We found that dopamine not only plays a role in signaling both positive and negative experiences in the brain, but it seems to do so in a way that is optimal when trying to learn from those outcomes. What was also interesting is that it seems like there may be independent pathways in the brain that separately engage the dopamine system for rewarding versus punishing experiences.

"Our results reveal a surprising result that these two pathways may encode rewarding and punishing experiences on slightly shifted timescales separated by only 200 to 400 milliseconds in time," Kishida said.

Kishida believes that this level of understanding may lead to a better understanding of how the dopamine system is affected in humans with psychiatric and neurological disorders. Kishida said additional research is needed to understand how dopamine signaling is altered in psychiatric and neurological disorders.

"Traditionally, dopamine is often referred to as the pleasure neurotransmitter," Kishida said.

"However, our work provides evidence that this is not the way to think about dopamine. Instead, dopamine is a crucial part of a sophisticated system that teaches our brain and guides our behavior."

"That dopamine is also involved in teaching our brain about punishing experiences is an important discovery and may provide new directions in research to help us better understand the mechanisms underlying depression, addiction, and related psychiatric and neurological disorders."

Author: Kenneth T. Kishida
Source: Wake Forest Baptist Medical Center
Contact: Kenneth T. Kishida, Wake Forest Baptist Medical Center
Image: The image is credited to Neuroscience News

Original Research: Open access. "Sub-second fluctuations in extracellular dopamine encode reward and punishment prediction errors in humans" by Paul Sands et al. Science Advances

Abstract

Sub-second fluctuations in extracellular dopamine encode reward and punishment prediction errors in humans

In the mammalian brain, midbrain dopamine neuron activity is hypothesized to encode reward prediction errors that promote learning and guide behavior by causing rapid changes in dopamine levels in target brain regions.

This hypothesis (and alternatives regarding dopamine's role in punishment-learning) has limited direct evidence in humans. We report intracranial, subsecond measurements of dopamine release in human striatum, measured while volunteers (i.e., patients undergoing deep brain stimulation surgery) performed a probabilistic reward and punishment learning choice task designed to test whether dopamine release encodes only reward prediction errors or whether dopamine release may also encode adaptive punishment learning signals.

Results demonstrate that extracellular dopamine levels can encode both reward and punishment prediction errors within distinct time intervals via independent valence-specific pathways in the human brain.


Cannabis and Alcohol Co-use Impacts Adolescent Brain and Behavior – Neuroscience News

Summary: Recent studies reveal the effects of cannabis and alcohol co-use in adolescent rats, using voluntary consumption to mirror human patterns of use.

Rats voluntarily consumed THC-infused treats and alcohol, allowing researchers to observe changes in brain structure and behavior. Notably, co-use led to reduced synaptic plasticity in the prefrontal cortex, with effects more pronounced in female rats.

The studies aim to understand cognitive disruptions caused by drug use in adolescence and develop treatment approaches.

Source: University of Illinois

The increased legalization of cannabis over the past several years can potentially increase its co-use with alcohol. Concerningly, very few studies have looked at the effects of these two drugs when used in combination.

In a series of new studies, researchers at the University of Illinois Urbana-Champaign used rats to understand how brain structure and behavior can change when cannabis and alcohol are taken together.

Most researchers have studied the effects of either alcohol or THC (delta-9-tetrahydrocannabinol), the primary psychoactive drug in cannabis, alone. However, when people, especially adolescents, use these drugs, they often do so in tandem.

Even when researchers study the co-use of these drugs, it involves injecting the animals with the drugs, which does not mirror what happens in humans.

"It's rare that a person would have these drugs forced upon them. Also, other studies have shown that the effects of a drug are very different when an animal chooses to take it compared to when it is exposed against its will," said Lauren Carrica, a graduate student in the Gulley lab.

"Our study is unique because the rats have access to both these drugs and they choose to consume them."

The researchers used young male and female rats to mimic adolescence in humans. During feeding time, the animals were exposed to recreational doses (3 to 10 mg/kg) of THC coated on Fudge Brownie Goldfish Grahams and a sweetened 10% ethanol solution. The control group of rats was fed just the cookies and sweetened water in addition to their regular food.

"Training them to eat the drug was simple. We mimicked the timing that humans are more likely to take the drugs: at the end of the day. We did not deprive them of food or water. They were given an alcohol bottle in place of their water bottle during the access period and they preferred eating the cookies over their regular chow," said Nu-Chu Liang, an associate professor of psychology.

After 20 days of increasing THC doses, rats were drug-free as they grew into young adulthood. The researchers took blood samples from the rats and also tested their memories to see if the co-use of drugs had any effect.

Briefly, rats were required to remember the location of a target lever after a delay period that ranged from very short to very long. If they remembered the location and pressed the target lever, they earned a food reward. If they responded on the wrong lever, no food was delivered.
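
The trial logic of this delayed matching-to-position task is easy to sketch in code. The toy simulation below is purely illustrative; the delays and retention probabilities are invented, not values from the study, and it simply shows how reward depends on matching the remembered sample lever after a delay.

```python
import random

def dmtp_trial(p_remember):
    """Simulate one delayed matching-to-position trial.

    A sample lever (left or right) is presented, then retracted for a delay;
    the rat is rewarded only if it later presses the same lever.  p_remember
    is the chance the sample is retained across the delay (illustrative).
    """
    sample = random.choice(["left", "right"])
    remembered = random.random() < p_remember
    choice = sample if remembered else random.choice(["left", "right"])
    return choice == sample  # True means a food reward was earned

# Rough illustration: accuracy falls as the delay gets longer
for delay_s, p in [(0, 0.95), (5, 0.8), (20, 0.6)]:
    rewarded = sum(dmtp_trial(p) for _ in range(1000))
    print(f"delay {delay_s:>2} s: {rewarded / 10:.0f}% of trials rewarded")
```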

"The effects were more pronounced in females, and they had higher levels of chemicals that are produced when THC is broken down. Even so, the influence of THC on memory was modest," Carrica said.

"These volitional, low-to-moderate doses of alcohol, THC, or both drugs did not induce long-lasting, serious cognitive defects."

"The subtlety of these effects is not surprising because we have modeled how these drugs are taken in a social setting over a relatively short period of time," said Joshua Gulley (GNDP), a professor of psychology.

"Our results with the female rats are in agreement with other research that has shown that women who take edibles often have a different experience, which may be due to differences in how their bodies break down the drug."

In this first study, the researchers were unable to expose the rats to higher levels of THC because the rats would ignore the THC-laced cookies.

"When you gave them higher doses, some animals lost interest in the cookies, and it is unclear why. It's possible that they don't like the higher doses or there is something about the taste or smell that becomes aversive," Gulley said.

Although there were modest differences in behavior, the group still wanted to check whether anything had changed in the signaling pathways in the brain, especially at higher levels of THC. In the second paper they did so by injecting alcohol-drinking or non-drinking adolescent rats with THC doses ranging from 3 mg/kg to 20 mg/kg.

Similar to the first study, the injections and alcohol drinking were then stopped and the rats were tested once they reached early adulthood.

Just like humans, rat brains undergo significant changes during adolescence, particularly in the prefrontal cortex, which helps them adapt to changing environments. The neurons in the prefrontal cortex modify their connections, a process referred to as synaptic plasticity, from the end of adolescence into young adulthood, according to Gulley.

The researchers wanted to test whether drug exposure during adolescence could change the ability of the brain to undergo synaptic plasticity as an adult. Therefore, they sacrificed the rats and measured the electrical signals generated in the brain.

"We found that alcohol and THC together significantly reduced, and in some cases prevented, the ability of the prefrontal cortex in drug-exposed rats to undergo plasticity in the same way that the brains from control animals can," said Linyuan Shi, a graduate student in the Gulley lab.

"The effects were apparent in rats exposed to either drug alone, and they were most pronounced with co-exposure to both drugs. We also found the impaired plasticity was likely due to changes in signaling caused by gamma-aminobutyric acid, a chemical messenger in the brain.

"When we used a chemical that enhances GABA, it could rescue the deficits we saw in the animals that had been exposed to the drugs."

The researchers are now interested in understanding which neurons are involved in the response to the drugs.

"From these studies, and the work our group has done with methamphetamine, we know that drug exposure during adolescence has the ability to disrupt cognitive functioning by altering the development of neuronal signaling in the prefrontal cortex.

"Although different drugs influence the brain in different ways, they might have the same effect on the brain that can manifest as cognitive disruptions later in life," Gulley said.

"Our ultimate goal is to harness our knowledge of these changes to develop treatment approaches for reversing cognitive dysfunctions that are associated with long-term drug use and addiction."

Author: Nicholas Vasi
Source: University of Illinois
Contact: Nicholas Vasi, University of Illinois
Image: The image is credited to Neuroscience News

Original Research: Open access. "Effects of combined use of alcohol and delta-9-tetrahydrocannabinol on working memory in Long Evans rats" by Joshua Gulley et al. Behavioural Brain Research

Open access. "Effects of combined exposure to ethanol and delta-9-tetrahydrocannabinol during adolescence on synaptic plasticity in the prefrontal cortex of Long Evans rats" by Joshua Gulley et al. Neuropharmacology

Abstract

Effects of combined use of alcohol and delta-9-tetrahydrocannabinol on working memory in Long Evans rats

The increase in social acceptance and legalization of cannabis over the last several years is likely to increase the prevalence of its co-use with alcohol. In spite of this, the potential for effects unique to co-use of these drugs, especially in moderate doses, has been studied relatively infrequently.

We addressed this in the current study using a laboratory rat model of voluntary drug intake. Periadolescent male and female Long-Evans rats were allowed to orally self-administer ethanol, delta-9-tetrahydrocannabinol (THC), both drugs, or their vehicle controls from postnatal day (P) 30 to P47. They were subsequently trained and tested on an instrumental behavior task that assesses attention, working memory and behavioral flexibility.

Similar to previous work, consumption of THC reduced both ethanol and saccharin intake in both sexes.

Blood samples taken 14 h following the final self-administration session revealed that females had higher levels of the THC metabolite THC-COOH. There were modest effects of THC on our delayed matching to position (DMTP) task, with females exhibiting reduced performance compared to their control group or male, drug-using counterparts.

However, there were no significant effects of co-use of ethanol or THC on DMTP performance, and drug effects were also not apparent in the reversal learning phase of the task when non-matching to position was required as the correct response.

These findings are consistent with other published studies in rodent models showing that use of these drugs in low to moderate doses does not significantly impact memory or behavioral flexibility following a protracted abstinence period.

Abstract

Effects of combined exposure to ethanol and delta-9-tetrahydrocannabinol during adolescence on synaptic plasticity in the prefrontal cortex of Long Evans rats

Significant exposure to alcohol or cannabis during adolescence can induce lasting disruptions of neuronal signaling in brain regions that are later to mature, such as the medial prefrontal cortex (mPFC). Considerably less is known about the effects of alcohol and cannabis co-use, despite its common occurrence.

Here, we used male and female Long-Evans rats to investigate the effects of early-life exposure to ethanol, delta-9-tetrahydrocannabinol (THC), or their combination on high frequency stimulation (HFS)-induced plasticity in the prelimbic region of the mPFC.

Animals were injected daily from postnatal days 30 to 45 with vehicle or THC (escalating doses, 3 to 20 mg/kg) and allowed to drink vehicle (0.1% saccharin) or 10% ethanol immediately after each injection. In vitro brain slice electrophysiology was then used to record population responses of layer V neurons following HFS in layer II/III after 3 to 4 weeks of abstinence.

We found that THC exposure reduced body weight gains observed in ad libitum-fed rats, and reduced intake of saccharin and ethanol. Compared to controls, there was a significant reduction in HFS-induced long-term depression (LTD) in rats exposed to either drug alone, and an absence of LTD in rats exposed to the drug combination.

Bath application of indiplon or AR-A014418, which enhance GABAA receptor function or inhibit glycogen synthase kinase 3 (GSK3), respectively, suggested the effects of ethanol, THC or their combination were due in part to lasting adaptations in GABA and GSK3 signaling.

These results suggest the potential for long-lasting adaptations in mPFC output following co-exposure to alcohol and THC.


The Role of Protein Misfolding in Neurodegenerative Diseases – Neuroscience News

Summary: Neurodegenerative diseases share a common factor: protein misfolding and deposits in the brain. Misfolded proteins can lead to toxic activity or the loss of the protein's physiological function, causing damage to neurons.

Recent research explores the cross-seeding phenomenon, where misfolded proteins in one disease can induce the aggregation of others. The study specifically focuses on the interaction between the prion protein and TDP-43, shedding light on how they collaborate to impact neurodegenerative diseases.

Source: RUB

The causes of neurodegenerative diseases such as Alzheimer's disease, Parkinson's disease, frontotemporal dementia and prion diseases can be many and varied. But there is a common denominator, namely protein misfolding and the occurrence of protein deposits in the brain.

"Various approaches and models have shown that misfolded proteins play a crucial role in the disease process," says Jörg Tatzelt.

Still, there's an ongoing debate about the nature of the harmful protein species and how misfolded proteins selectively damage specific neurons.

Studies on genes associated with pathologies have revealed two basic mechanisms by which misfolded proteins can lead to neurodegeneration: Firstly, misfolding can cause the protein to acquire toxic activity. Secondly, the misfolding can lead to a loss of the physiological function of the protein, which impairs important physiological processes in the cell.

"The assumption used to be that every neurodegenerative disease was characterized by the misfolding of a specific protein," explains Jörg Tatzelt.

"However, it has since been shown that misfolded proteins that are produced more frequently in one disease can also induce the aggregation of other proteins, a mechanism referred to as cross-seeding."

The prion protein and TDP-43

TDP-43 (TAR DNA-binding protein 43) is a protein that helps to translate genetic information into specific proteins. It thus helps to maintain the protein balance in nerve cells. The clumping of TDP-43 in the cell is a characteristic feature in the brains of patients suffering from amyotrophic lateral sclerosis or frontotemporal dementia.

Misfolding of the prion protein triggers prion diseases such as Creutzfeldt-Jakob disease. All research findings to date indicate that the misfolded prion protein acquires toxic activity. However, the exact mechanisms by which disease-associated prion proteins trigger the death of nerve cells are only partially understood.

TDP-43 loses its physiological function through PrP-mediated cross-seeding

Using in vitro and cell culture approaches, animal models and brain samples from patients with Creutzfeldt-Jakob disease, the researchers showed that misfolded prion proteins can trigger the clumping and inactivation of TDP-43.

The prion proteins interact with TDP-43 in vitro and in cells, thus inducing the formation of TDP-43 aggregates in the cell. As a result, TDP-43-dependent splicing activity in the cell nucleus is significantly reduced, leading to altered protein expression.

Prion protein and TDP-43 are partners in crime in neurodegenerative diseases, so to speak, says Jörg Tatzelt.

An analysis of brain samples showed that in some Creutzfeldt-Jakob patients, TDP-43 aggregates were found alongside the prion protein deposits. This study has revealed a new mechanism of how disease-associated prion proteins can affect physiological signaling pathways through cross-seeding.

Author: Meike Driessen
Source: RUB
Contact: Meike Driessen, RUB
Image: The image is credited to Neuroscience News

Original Research: Closed access. "Cross-Seeding by Prion Protein Inactivates TDP-43" by Jörg Tatzelt et al. Brain

Abstract

Cross-Seeding by Prion Protein Inactivates TDP-43

A common pathological denominator of various neurodegenerative diseases is the accumulation of protein aggregates. Neurotoxic effects are caused by a loss of the physiological activity of the aggregating protein and/or a gain of toxic function of the misfolded protein conformers. In transmissible spongiform encephalopathies or prion diseases, neurodegeneration is caused by aberrantly folded isoforms of the prion protein (PrP).

However, it is poorly understood how pathogenic PrP conformers interfere with neuronal viability. Employing in vitro approaches, cell culture, animal models and patients' brain samples, we show that misfolded PrP can induce aggregation and inactivation of TAR DNA-binding protein-43 (TDP-43).

Purified PrP aggregates interact with TDP-43 in vitro and in cells and induce the conversion of soluble TDP-43 into non-dynamic protein assemblies. Similarly, mislocalized PrP conformers in the cytosol bind to and sequester TDP-43 in cytosolic aggregates.

As a consequence, TDP-43-dependent splicing activity in the nucleus is significantly decreased, leading to altered protein expression in cells with cytosolic PrP aggregates. Finally, we present evidence for cytosolic TDP-43 aggregates in neurons of transgenic flies expressing mammalian PrP and Creutzfeldt-Jakob disease patients.

Our study identified a novel mechanism of how aberrant PrP conformers impair physiological pathways by cross-seeding.

See the rest here:
The Role of Protein Misfolding in Neurodegenerative Diseases - Neuroscience News

Link Between Childhood Adversity and Muscle Dysmorphia in Youth – Neuroscience News

Summary: A new study reveals a significant association between adverse childhood experiences (ACEs) and symptoms of muscle dysmorphia in adolescents and young adults.

The research highlights how ACEs, such as domestic violence and emotional abuse, can lead to the pathological pursuit of muscularity as a coping mechanism. The study found that boys and young men who experienced five or more ACEs were particularly at risk for muscle dysmorphia symptoms.

The findings emphasize the importance of recognizing and addressing the impact of childhood trauma on mental health and body image.

Key Facts:

Source: University of Toronto

A new study published in Clinical Social Work Journal found that adolescents and young adults who experienced adverse childhood experiences (ACEs) before the age of 18 were significantly more likely to experience symptoms of muscle dysmorphia.

With previous research showing that more than half of North American children and adolescents experience at least one adverse childhood experience in their lifetime, these new findings highlight the need for greater awareness of how adverse experiences in childhood (such as domestic violence, emotional abuse, and sexual abuse) and muscle dysmorphia (the pathological pursuit of muscularity) are linked.

Those who experience adverse childhood experiences may engage in the pursuit of muscularity to compensate for experiences where they once felt inferior, small, and at risk, as well as to protect against future victimization, says lead author Kyle T. Ganson, PhD, MSW, an assistant professor at the University of Toronto's Factor-Inwentash Faculty of Social Work.

The experience of adverse childhood experiences may also increase body dissatisfaction, specifically muscle dissatisfaction, which is a key feature of muscle dysmorphia.

Previous studies have shown that adverse experiences in childhood can lead to harmful health effects. While prior research has demonstrated that adverse childhood experiences are highly common in people with eating disorders and body dysmorphic disorder, few studies have looked at the association between adverse childhood experiences and muscle dysmorphia.

The studys researchers analyzed data from over 900 adolescents and young adults who participated in the Canadian Study of Adolescent Health Behaviors. In total, 16% of participants who experienced five or more adverse childhood experiences were at clinical risk for muscle dysmorphia, underscoring the significant traumatic effects that such experiences can have on mental health and well-being.

Importantly, our study found that gender was an important factor in the relationship between adverse childhood experiences and muscle dysmorphia symptoms, says Ganson.

Boys and young men in the study who had experienced five or more adverse childhood experiences had significantly greater muscle dysmorphia symptoms when compared to girls and young women.

The authors note that boys and young men who experience adverse childhood experiences may feel that their masculinity was threatened by these experiences. Therefore, they engage in the pursuit of muscularity to demonstrate their adherence to masculine gender norms such as dominance, aggression, and power.

It is important for health care professionals to assess for symptoms of muscle dysmorphia, including muscle dissatisfaction and functional impairment related to exercise routines and body image, among young people who have experienced adverse childhood experiences, particularly boys and young men, concludes Ganson.

Author: Dale Duncan
Source: University of Toronto
Contact: Dale Duncan, University of Toronto
Image: The image is credited to Neuroscience News

Original Research: Closed access. "Adverse Childhood Experiences and Muscle Dysmorphia Symptomatology: Findings from a Sample of Canadian Adolescents and Young Adults" by Kyle T. Ganson et al. Clinical Social Work Journal

Abstract

Adverse Childhood Experiences and Muscle Dysmorphia Symptomatology: Findings from a Sample of Canadian Adolescents and Young Adults

Adverse childhood experiences (ACEs) are relatively common among the general population and have been shown to be associated with eating disorders and body dysmorphic disorder. It remains relatively unknown whether ACEs are associated with muscle dysmorphia.

The aim of this study was to investigate the association between ACEs and muscle dysmorphia symptomatology among a sample of Canadian adolescents and young adults. A community sample of 912 adolescents and young adults ages 16–30 years across Canada participated in this study.

Participants completed a 15-item measure of ACEs (categorized as 0, 1, 2, 3, 4, or 5 or more) and the Muscle Dysmorphic Disorder Inventory. Multiple linear regression analyses were utilized to determine the association between the number of ACEs experienced and muscle dysmorphia symptomatology.

Participants who experienced five or more ACEs, compared to those who had experienced no ACEs, had more symptoms of muscle dysmorphia, as well as more symptoms related to Appearance Intolerance and Functional Impairment.

There was no association between ACEs and Drive for Size symptoms. A greater proportion of participants who experienced five or more ACEs were at clinical risk for muscle dysmorphia than of those who experienced no ACEs (16.1% vs. 10.6%, p = .018).

Experiencing ACEs, particularly five or more, was significantly associated with muscle dysmorphia symptomatology, expanding prior research on eating disorders and body dysmorphic disorder. Social workers should consider screening for symptoms of muscle dysmorphia among adolescents and young adults who experience ACEs.
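For readers who want a concrete picture of the analysis described in the abstract above, the following is a minimal sketch of a multiple linear regression with the ACE count treated as a categorical predictor and the 0-ACE group as the reference level. The column names, the made-up data, and the age covariate are assumptions for illustration only; this is not the authors' code or dataset.

```python
# Illustrative sketch only: hypothetical column names and data, not the study's code.
# Models Muscle Dysmorphic Disorder Inventory (MDDI) scores as a function of an
# ACE-count category, with the "0 ACEs" group as the reference level.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per participant.
df = pd.DataFrame({
    "mddi_total":   [18, 25, 22, 31, 27, 35, 20, 29, 33, 40],
    "ace_category": ["0", "1", "2", "3", "4", "5+", "0", "2", "4", "5+"],
    "age":          [17, 19, 22, 18, 25, 21, 30, 16, 24, 28],
})

# ACE category as a factor with 0 ACEs as the reference; age is an assumed covariate.
model = smf.ols(
    "mddi_total ~ C(ace_category, Treatment(reference='0')) + age",
    data=df,
).fit()

print(model.summary())  # coefficient for each ACE category relative to the 0-ACE group
```

The coefficient on the "5+" category relative to the 0-ACE reference corresponds to the kind of contrast reported in the abstract; in the actual study the sample comprises 912 participants and the covariates may differ.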

Link:
Link Between Childhood Adversity and Muscle Dysmorphia in Youth - Neuroscience News

New neuroscience research upends traditional theories of early language learning in babies – PsyPost

New research suggests that babies primarily learn languages through rhythmic rather than phonetic information in their initial months. This finding challenges the conventional understanding of early language acquisition and emphasizes the significance of sing-song speech, like nursery rhymes, for babies. The study was published in Nature Communications.

Traditional theories have posited that phonetic information, the smallest sound elements of speech, forms the foundation of language learning. In language development, acquiring phonetic information means learning to produce and understand these different sounds, recognizing how they form words and convey meaning.

Infants were believed to learn these individual sound elements to construct words. However, recent findings from the University of Cambridge and Trinity College Dublin suggest a different approach to understanding how babies learn languages.

The new study was motivated by the desire to better understand how infants process speech in their first year of life, specifically focusing on the neural encoding of phonetic categories in continuous natural speech. Previous research in this field predominantly used behavioral methods and discrete stimuli, which limited insights into how infants perceive and process continuous speech. These traditional methods were often constrained to simple listening scenarios and few phonetic contrasts, which didn't fully represent natural speech conditions.

To address these gaps, the researchers used neural tracking measures to assess the neural encoding of the full phonetic feature inventory of continuous speech. This method allowed them to explore how infants brains process acoustic and phonetic information in a more naturalistic listening environment.

The study involved a group of 50 infants, monitored at four, seven, and eleven months of age. Each baby was full-term and without any diagnosed developmental disorders. The research team also included 22 adult participants for comparison, though data from five were later excluded.

In a carefully controlled environment, the infant participants were seated in a highchair, a meter away from their caregiver, inside a sound-proof chamber. The adults sat similarly in a normal chair. Each participant, whether infant or adult, was presented with eighteen nursery rhymes played via video recordings. These rhymes, sung or chanted by a native English speaker, were selected carefully to cover a range of phonetic features. The sounds were delivered at a consistent volume.

To capture how the infants' brains responded to these nursery rhymes, the researchers used a method called electroencephalography (EEG), which records patterns of brain activity. This technique is non-invasive and involved placing a soft cap with sensors on the infants' heads to measure their brainwaves.

The brainwave data were then analyzed using a sophisticated algorithm to decode the phonological information, allowing the researchers to create a readout of how the infants' brains were processing the different sounds in the nursery rhymes. This technique is significant because it moved beyond the traditional method of simply comparing reactions to individual sounds or syllables, allowing a more comprehensive understanding of how continuous speech is processed.
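The article does not spell out the algorithm, but decoding analyses of this kind typically rely on regularized linear models that relate time-lagged EEG to continuous stimulus features. The sketch below, run on synthetic data, illustrates that general idea with a backward (stimulus-reconstruction) ridge regression in scikit-learn; it is not the authors' pipeline, and every array shape, lag, and parameter is an assumption.

```python
# Illustrative sketch of a "backward" decoding model: reconstruct a continuous
# phonetic-feature time course from multi-channel EEG using time-lagged ridge
# regression. Synthetic data and parameters are assumptions, not the study's pipeline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels = 5000, 32

# Synthetic phonetic feature (e.g., presence of a nasal feature over time).
feature = rng.normal(size=n_samples)

# Synthetic EEG: each channel reflects the feature at some delay, plus noise.
eeg = np.stack(
    [np.roll(feature, delay) + rng.normal(scale=2.0, size=n_samples)
     for delay in rng.integers(1, 20, size=n_channels)],
    axis=1,
)

def lagged_design(x, max_lag):
    """Stack copies of the EEG shifted by 0..max_lag samples into one design matrix."""
    lags = [np.roll(x, -lag, axis=0) for lag in range(max_lag + 1)]
    X = np.concatenate(lags, axis=1)
    return X[:-max_lag]      # drop samples whose lags run past the end

max_lag = 20                 # roughly 300 ms of EEG per stimulus sample at an assumed 64 Hz
X = lagged_design(eeg, max_lag)
y = feature[:-max_lag]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, shuffle=False)
decoder = Ridge(alpha=100.0).fit(X_train, y_train)

# Correlation between reconstructed and true feature on held-out data is a
# common "neural tracking" score in this literature.
pred = decoder.predict(X_test)
print("decoding correlation:", np.corrcoef(pred, y_test)[0, 1])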

Contrary to what was previously thought, the researchers found that infants do not process individual speech sounds reliably until they are about seven months old. Even at eleven months, when many babies start to say their first words, the processing of these sounds is still sparse.

Furthermore, the study discovered that phonetic encoding in babies emerged gradually over the first year. The readout of brain activity showed that the processing of speech sounds in infants started with simpler sounds like labial and nasal ones, and this processing became more adult-like as they grew older.

Our research shows that the individual sounds of speech are not processed reliably until around seven months, even though most infants can recognize familiar words like bottle by this point, said study co-author Usha Goswami, a professor at the University of Cambridge. From then, individual speech sounds are still added in very slowly, too slowly to form the basis of language.

The current study is part of the BabyRhythm project, which is led by Goswami.

First author Giovanni Di Liberto, a professor at Trinity College Dublin, added: This is the first evidence we have of how brain activity relates to phonetic information changes over time in response to continuous speech.

The researchers propose that rhythmic speech, the pattern of stress and intonation in spoken language, is crucial for language learning in infants. They found that rhythmic speech information was processed by babies as early as two months old, and that this processing predicted later language outcomes.

The findings challenge traditional theories of language acquisition that emphasize the rapid learning of phonetic elements. Instead, the study suggests that the individual sounds of speech are not processed reliably until around seven months, and the addition of these sounds into language is a gradual process.

The study underscores the importance of parents talking and singing to their babies, using rhythmic speech patterns such as those found in nursery rhymes. This could significantly influence language outcomes, as rhythmic information serves as a framework for adding phonetic information.

We believe that speech rhythm information is the hidden glue underpinning the development of a well-functioning language system, said Goswami. Infants can use rhythmic information like a scaffold or skeleton to add phonetic information on to. For example, they might learn that the rhythm pattern of English words is typically strong-weak, as in daddy or mummy, with the stress on the first syllable. They can use this rhythm pattern to guess where one word ends and another begins when listening to natural speech.

Parents should talk and sing to their babies as much as possible or use infant-directed speech like nursery rhymes because it will make a difference to language outcome, she added.
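As a toy illustration of the rhythm-as-scaffold idea in Goswami's example, and nothing more than that, a strong-weak stress pattern can be used to hypothesize word boundaries in a continuous syllable stream. The syllables and stress marks below are invented for the example; this is not a model from the study.

```python
# Toy sketch of rhythm-based segmentation: assume English words tend to start with a
# stressed (strong) syllable, so hypothesize a word boundary before each stressed
# syllable in a continuous stream. Purely illustrative; not the study's model.

def segment_by_stress(syllables, stresses):
    """Group a syllable stream into candidate words, starting a new word at every
    stressed syllable ("S"); unstressed syllables ("w") attach to the current word."""
    words, current = [], []
    for syllable, stress in zip(syllables, stresses):
        if stress == "S" and current:
            words.append(current)
            current = []
        current.append(syllable)
    if current:
        words.append(current)
    return ["".join(word) for word in words]

# "daddy" and "mummy" both follow the strong-weak pattern, so the stream is cut
# at each stressed syllable (annotations are invented for this example).
syllables = ["da", "ddy", "mu", "mmy", "da", "ddy"]
stresses  = ["S",  "w",   "S",  "w",   "S",  "w"]
print(segment_by_stress(syllables, stresses))  # ['daddy', 'mummy', 'daddy']
```

Real infant segmentation is of course far richer, but the heuristic shows how a dominant strong-weak rhythm could, in principle, suggest where one word ends and the next begins.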

While this study offers valuable insights into infant language development, it's important to recognize its limitations. The research focused on a specific demographic: full-term infants without developmental disorders, mainly from a monolingual English-speaking environment. Future research could look into how infants from different linguistic and cultural backgrounds, or those with developmental challenges, process speech.

Additionally, the study opens up new avenues for exploring how early speech processing relates to language disorders, such as dyslexia. This could be particularly significant in understanding and potentially intervening in these conditions early in life.

The study, "Emergence of the cortical encoding of phonetic features in the first year of life", was authored by Giovanni M. Di Liberto, Adam Attaheri, Giorgia Cantisani, Richard B. Reilly, Áine Ní Choisdealbha, Sinead Rocha, Perrine Brusini, and Usha Goswami.

Go here to see the original:
New neuroscience research upends traditional theories of early language learning in babies - PsyPost