
UC Berkeley Partners with Tracmo to Support the Physical & Mental Health of Caregivers and Dementia Patients – PharmiWeb.com

Industry Background

Many previous studies, including a recently published paper in the Journal of Personality and Social Psychology by Dr. Robert Levenson and Dr. Kuan-Hua Chen at the University of California, Berkeley, have found that couples with higher relationship satisfaction show greater linkage in their physiological responses (for example, heart rate and skin conductance) during face-to-face interactions, which suggests a greater biological connection between partners. In addition, emerging evidence further suggests that being physically linked with a partner's physiological responses may have important implications for an individual's mental and physical health. For example, more recent findings from Dr. Levenson and Dr. Chen's group suggest that a couple's physiological linkage can predict their mental and physical health, in both healthy married couples and couples in which one person is the spousal caregiver of the other, who is diagnosed with a neurodegenerative disease.

Building upon this, the researchers wanted to better understand whether synchronicity of objective physiological indicators between dementia patients and their caregivers also reflects their influence on each other outside the laboratory, in real life. In one recent study, Dr. Levenson and Dr. Chen had 22 patients and their spousal caregivers wear wrist-mounted actigraphy monitors in their homes for seven days. They found that the more linked (particularly, the more synchronized) the patients' and caregivers' activity was, the less anxiety the caregivers reported experiencing.
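The kind of activity linkage described here can be approximated as the peak lagged correlation between two actigraphy traces. Below is a minimal sketch in Python; the function name, the synthetic hourly data, and the choice of peak Pearson correlation as the linkage metric are illustrative assumptions, not the study's actual method.

```python
import numpy as np

def activity_linkage(a, b, max_lag=12):
    """Peak Pearson correlation between two activity series across a
    window of lags (a hypothetical linkage metric for illustration)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            r = np.corrcoef(a[:lag], b[-lag:])[0, 1]
        elif lag > 0:
            r = np.corrcoef(a[lag:], b[:-lag])[0, 1]
        else:
            r = np.corrcoef(a, b)[0, 1]
        best = max(best, r)
    return best

# Two synthetic week-long hourly activity traces sharing a daily rhythm,
# with the caregiver's rhythm shifted by one hour
rng = np.random.default_rng(0)
t = np.arange(7 * 24)
patient = np.sin(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(t.size)
caregiver = np.sin(2 * np.pi * (t - 1) / 24) + 0.3 * rng.standard_normal(t.size)
print(round(activity_linkage(caregiver, patient), 2))
```

A linkage score near 1 indicates strongly synchronized (possibly lag-shifted) activity; scores near 0 indicate independent schedules.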

Understanding How Synchronicity Can Predict Physical and Mental Health

In all of the above studies, the linkage and relationship/health data were collected around the same time, so the researchers could not know whether greater linkage produced better relationship/health outcomes, vice versa, or both. In addition, research participants in these previous studies mostly lived in the San Francisco/Northern California area, so the researchers could not know whether the effects they found would generalize to couples living in other, more rural areas of the United States (e.g., Montana).

To address these issues, Dr. Levenson and Dr. Chen have recently launched a research project that aims to recruit 300 patients and their familial caregivers (600 participants in total) and study their activity linkage in their homes for six months. Over the study period, both patients and caregivers will wear the Tracmo CareActive Watch continuously, and caregivers will be monitored periodically for changes in their mental and physical health.

Overcoming Research Challenges with Tracmo CareActive Watch

To carry out their study, the researchers needed a wearable device that would let them collect daily, real-time data from their subjects with as little interference as possible. However, they found that traditional consumer wearable smart devices had many shortcomings in a research setting, including short battery life, inaccurate location data, and complicated operation. Tracmo's CareActive Watch provided the researchers with the solution.

"We selected the Tracmo CareActive Watch because of its long battery life, ability to provide near-continuous monitoring of wearers' movement and relative position in the home, and ready access to data uploaded to the cloud," according to Dr. Chen. "In our view, the CareActive Watch provided the optimal feature set for our needs, and the company's technical support was exemplary."

The Result

Dementia can progress rapidly, which makes both patients and caregivers anxious. Rapid progression also means that long-term follow-up is usually required, but medical resources are limited: the interval between clinic visits is often too long for doctors to observe changes. With the Tracmo CareActive Watch, objective data are collected and recorded at home in real time, helping doctors evaluate and diagnose. This is especially useful for cross-district and cross-country care, telemedicine, and virtual medical services.

"The Tracmo CareActive Watch's user-friendly app allowed our sample of elderly research participants to complete self-installation with minimum frustration," according to Dr. Levenson. "By having people with dementia and their familial caregivers both wear CareActive Watches, we are able to monitor and analyze longitudinal changes in movement and in-home location during individual activities and social interactions. These data are useful in helping us predict changes in the health and wellbeing of participants in our research studies."

The research goal is to use the various activity indicators to build behavioral-analysis models, and ultimately to predict behavioral changes and give early warning of potential mental-health risks for caregivers. The ongoing study has shown that wearing the CareActive Watch increases patients' and caregivers' self-awareness and improves their relationship through self-monitoring. It benefits not only dementia patients but also caregivers' mental health.


One-Two Punch – The UCSB Current

Drought is endemic to the American West along with heatwaves and intense wildfires. But scientists are only beginning to understand how the effects of multiple droughts can compound to affect forests differently than a single drought alone.

UC Santa Barbara forest ecologist Anna Trugman, along with her colleagues at the University of Utah, Stanford University and the U.S. Forest Service, investigated the effects of repeated, extreme droughts on various types of forests across the globe. They found that a variety of factors can increase and decrease a forest's resilience to subsequent droughts. However, the study, published in Nature Climate Change, concluded that successive droughts are generally increasingly detrimental to forests, even when each drought was no more extreme than the first.

Droughts usually leave individual trees more vulnerable to subsequent droughts. "Compounding extreme events can be really stressful on forests and trees," said Trugman, an assistant professor in the Department of Geography. She compares the experience to a person battling an illness: you'll be harder hit if you get sick again while you're still recovering.

That said, the case is not quite so clear cut. "Theoretically, responses to subsequent droughts could be quite varied depending on a wide range of tree-level and ecosystem-level factors," said lead author William Anderegg, an assistant professor at the University of Utah. So, while a drought may place a tree under considerable stress, it could also kill off some of its neighbors, leaving the survivors with less competition for water should arid conditions return.

Trugman and her colleagues used a variety of data sources to investigate this effect on a broad scale. Tree ring data spanning over 100 years enabled them to see how trees that survived an initial drought grew afterward. Data from the U.S. Forest Inventory and Analysis gave them access to metrics on tree mortality for more than 100,000 forest plots from 2000 through 2018. They combined these sources with satellite measurements of the water content in forest canopies.

Two clear trends emerged. "We found that generally trees seem to become more vulnerable to stress after multiple droughts, especially conifers," Anderegg said.

The second finding, the researchers believe, comes down to basic physiology. Conifers and their kin have different vascular systems than broadleaf trees, or angiosperms. As a result, they may sustain more damage in an initial drought and be at a disadvantage compared to angiosperms during subsequent periods of drought stress. The tree ring data bears this out, showing that conifers that survived a drought grew much more slowly, especially if another drought settled in.

"By contrast, angiosperms have much more flexible anatomy and physiology, and this seems to help them recover faster and more fully after initial droughts," Anderegg said.

Anderegg was particularly surprised by the impact repeated drought had on the Amazon Rainforest. "We tend to think of these forests as not very impacted by drought and, due to their high tree diversity, able to recover quickly," he said. "But our results indicate the Amazon has been hit hard by three very severe droughts in the past 15 years."

Forests are complex systems, and a variety of factors ultimately dictate how they respond to extreme events. "In terms of damage you need to not only think about it at the individual level, but at the forest level as well," said Trugman. So, although they will need time to recover from an extreme drought, surviving trees will face less competition for water resources than they had before. This could leave them in a better situation if drought returns to the area.

Whats more, natural selection will drive the forest as a whole to transition toward more resilient individuals, or even to more drought tolerant species overall. Repeated droughts affect forest pests and pathogens as well, and their response to these conditions will also influence how forests behave.

Scientists are still working to untangle the conditions under which each of these factors rises to the top. "This [study] provides a lot of motivation," said Trugman, "but I think the next pressing step is to get at the underlying mechanisms at a physiological level and ecological level."

Researchers can use these insights to improve computer models and make more accurate forecasts about the future of forests in a changing climate. "Climate change is going to bring more frequent droughts," Anderegg said, "so we have to understand and be able to forecast how forests will respond to multiple droughts."

These results are especially crucial in the western U.S., he added, "where we've had a number of major droughts in the past 20 years."


Why Is Eating at Night So Frowned upon in the Nutrition World? – The Great Courses Daily News

By Michael Ormsbee, PhD, Florida State University
Edited by Kate Findley and proofread by Angela Shoemaker, The Great Courses Daily
Research studies focusing on whether eating before bed causes weight gain have shown that quantity and quality of consumed food are the determining factors. Photo by Photographee.eu / Shutterstock

Nighttime Eating Enigmas

If you're like most people, you get hungry again before you go to bed at night. For a long time, it was believed that eating late at night before going to sleep was bad for your health and would automatically make you gain body fat.

However, we need to consider several things when the topic of nighttime eating is brought up. According to Professor Ormsbee, you should ask yourself where the recommendations are coming from and what the scientific evidence is.

Additionally, how is nighttime eating defined? Does this mean you should be monitoring what time you eat dinner? How does the nighttime meal size and composition influence your health?

"Nighttime feeding has been a major research focus in my lab at Florida State University," Professor Ormsbee said. "The original premise for starting this line of research stemmed from my glory days as a collegiate ice hockey player. I remember reading about the need to stop eating late at night to avoid gaining fat or feeling lousy."

Various media experts began to speak up about it, emphasizing that if you want to have a great body composition, you should avoid eating late at night (that is, after dinner) at all costs.

"The trouble was that I always ate before going to bed," Professor Ormsbee said. "I thought it would be good to help my body recover from practice and workouts so that I could become a better player. And just about everybody I knew who had low body fat and good muscle mass would not only eat at all times of the day, but would purposefully eat before bed and drink a protein shake or have some kind of protein before going to sleep."

How could people who ate before bed still have excellent body composition and perform well, when many in the media were telling us the opposite message? As it turns out, several factors influenced this public health message.

To understand why most people think that eating at night is bad, we need to dive into some research and physiology. An easy place to start would be with your circadian clock or circadian timing system.

We all have an internal clock that regulates our physiology with our daily behaviors and surrounding environment. For many of us, our typical circadian clock keeps us awake and active during the daytime hours and less active during the evening and night hours.

Because many of us are less active in the evening and late night hours, our clock is programmed to slow our internal system down at those times, too. However, does a less active system at night translate to fat gained if you eat at night? It depends.

Food we consume has two fates: it is either stored for later use or burned for energy. Since our biological clock slows things down at night, it seems obvious that food we eat at night is more likely to be stored than burned.

When you eat carbohydrates, the carbohydrates are broken down into smaller components like glucose that enter your blood. In response to the glucose, insulin is secreted from your pancreas to get the glucose into the cells. This effectively lowers blood glucose back to normal.

Research shows that for the same amount of glucose, greater amounts of insulin are required to remove it from the blood during the night as compared to the day. More insulin produced equals more storage at night.

If nighttime eating is not done often, there is likely no problem. However, if its repeated over time, chronically high insulin can lead to a desensitization of the insulin receptors and possibly lead to future problems with your glucose control.

Research has also shown that the energy cost of digesting and processing your food (that is, the thermic effect of food) goes down at night. That means that if you eat the same exact meal for breakfast as you do for dinner, you'll have a lower energy expenditure in the evening.

We also know that we do not feel as full when we eat foods toward the later part of the day, so there's a chance we'll eat more food. Digestion also slows at night: food eaten later in the day takes longer to be emptied from the stomach into the intestines.

In summary, when we eat at night, more insulin is needed; we feel less full, so we eat more; digestion is slower; and less energy is used to digest and process food. This suggests that our physiology at night favors storage when we eat toward the later part of the day.

This may not be ideal for body composition. Therefore, it is easy to see how the message quickly spread to avoid eating in the evening.

Michael Ormsbee is an Associate Professor in the Department of Nutrition, Food, and Exercise Sciences and Interim Director of the Institute of Sports Sciences and Medicine in the College of Human Sciences at Florida State University. He received his MS in Exercise Physiology from South Dakota State University and his PhD in Bioenergetics from East Carolina University.


Concussion Prevention: Sorting Through the Science to See What’s Sound – American Council on Science and Health

ByJames Smoliga, High Point University

As his helmet collided violently with his opponent's shoulder, Luke Kuechly looked like a life-size bobblehead doll. In an instant, the Carolina Panthers' star linebacker suffered yet another concussion. His season, and perhaps his career, was in jeopardy.

A few weeks earlier, Kuechly began wearing an experimental collar around his neck designed to protect his brain from within. The device, known as the Q-Collar and previously sold as NeuroShield, is designed to mimic the woodpeckers method of injury protection by keeping more blood inside the skull to create a bubble wrap effect around the brain.

So why didn't this nature-inspired safety equipment, which he apparently still wears, avert Luke Kuechly's 2017 concussion?

As a physiologist and sports medicine researcher, I study how the body responds to exercise and other stressors. I also study ways to prevent and treat sports injuries. As the public learns more about the potential long-term dangers of contact sports, including chronic traumatic encephalopathy (CTE), parents, athletes and sports organizations are desperate to find a quick fix to the concussion crisis. Unfortunately, I do not think there is an easy solution to make inherently high-risk sports safe.

The high altitude argument

Back in 2014, a friend told me about a study which reported that NFL players were 20-30 percent less likely to sustain a concussion in games played at higher altitudes. The researchers theorized that higher altitude caused a slight swelling in the brain, and consequently increased brain volume.

This tighter fit inside the skull would reduce brain slosh during impacts, reducing the likelihood of concussions. Since higher altitude seemed to protect the brain, they argued, it would be beneficial to replicate this tighter fit. The authors proposed this could be achieved by applying slight pressure on the neck's jugular veins to trap a bit more blood inside the brain. A few years earlier, a member of their research team had filed a patent for such a device: a jugular compression collar.

While those less familiar with physiology may have been persuaded by this fascinating-sounding explanation, my fellow researcher, Gerald Zavorsky, and I thought this idea was scientifically implausible. Most importantly, the study defined higher altitude as anything above a meager 600 feet above sea level, far too low to have any effect on brain volume. Essentially, our brain volume stays remarkably constant at high altitude, even when we may feel short of breath or lightheaded. In the Mile High City of Denver, which houses the highest NFL stadium in the country at 5,280 feet above sea level, you would be hard-pressed to experience even a minuscule swelling of the brain. At much higher elevations, however, there is actually an increased likelihood of brain swelling, which causes a life-threatening emergency called high altitude cerebral edema.

A game of chance

If altitude does not cause a protective increase in brain volume, then why were concussions reduced in NFL games played at greater than 600 feet above sea level? To answer this question, we examined the same publicly available NFL data set. The original study looked at data from two combined seasons (2012 and 2013), but we analyzed a few additional years. We confirmed that concussion rate was indeed statistically reduced at higher altitudes during the 2013 season, but not in the 2012 season. We dug deeper and found no connection between altitude and concussions in the 2014 or 2015 seasons. A separate study in college athletes showed concussions were even more likely at higher altitude.

Since the effect wasn't consistent, and repeatability is a major problem throughout science, we suspected the original linkages were due to random chance, a mathematical artifact of using a huge data set of nearly 1,500 gridiron giants literally butting heads with one another on a weekly basis. If that were the case, we might expect something completely arbitrary also to be associated with a reduced risk of concussion. And indeed, our analysis demonstrated that is true. It turns out that NFL teams with animal logos, such as the Miami Dolphins, also had a 20-30 percent reduced risk of concussion compared to teams without animal logos, such as the Pittsburgh Steelers, regardless of game altitude.
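The random-chance argument can be illustrated with a toy simulation: give every team an identical true concussion rate, split the teams arbitrarily (as a stand-in for "animal logo" vs. not), and count how often the split still shows a 20-30 percent gap in a single season. All numbers below (32 teams, 16 games, a 15 percent per-game rate) are invented for illustration and are not drawn from the actual NFL data set.

```python
import random

random.seed(1)

# Every team has the SAME true per-game concussion probability.
n_teams, games_per_team, true_rate = 32, 16, 0.15

def season_rate_ratio():
    """Simulate one season, split teams arbitrarily into two groups of
    16, and return the ratio of total concussions between the groups."""
    counts = [sum(random.random() < true_rate for _ in range(games_per_team))
              for _ in range(n_teams)]
    random.shuffle(counts)  # the split carries no real information
    group_a, group_b = counts[:16], counts[16:]
    return sum(group_a) / max(sum(group_b), 1)

# Count simulated seasons where the arbitrary split shows a gap of
# 20 percent or more in either direction (ratio <= 0.8 or >= 1.25).
extreme = sum(1 for _ in range(2000)
              if not 0.8 < season_rate_ratio() < 1.25)
print(f"{extreme / 2000:.0%} of seasons show a 20%+ gap by chance alone")
```

A sizable fraction of simulated seasons show such a gap despite identical underlying risk, which is why a one-season association should be checked against additional seasons, as the authors did.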

Based on our analysis, we concluded that random chance, not physiological response, explains why concussions were less likely at altitudes above 600 feet. Thus, an altitude-mimicking collar seems unjustified for preventing concussions.

The woodpecker theory

Supposedly, the Q-Collar also replicates how woodpeckers naturally protect themselves from headaches. According to company information, woodpeckers compress their jugular veins using their neck muscles to induce a tighter fit and reduce brain slosh. While this amazing-sounding mechanism is often presented as fact, it does not seem to be mentioned anywhere in over a century of scientific studies examining woodpeckers.

I thoroughly examined all of the woodpecker papers I could find, and then tracked down all of their references, and repeated the process. I discovered ornithology papers from the 1700s through cutting-edge engineering models of woodpecker biomechanics, but none mentioned jugular compression. Thus, it is not surprising that the company does not cite any scientific references to woodpecker literature.

Even if this mechanism does exist and has somehow been overlooked by woodpecker researchers, evolution gave the woodpecker numerous unique protective adaptations. I teamed up with a woodpecker researcher and published a summary of these mechanisms in October 2018. These include a specialized skull bone structure and a shock-absorbing beak. Woodpeckers even use very specific postures and movements to brace themselves, which helps dissipate force away from their brains. We concluded that these multiple protective mechanisms work in harmony, and they cannot be replicated by simply pushing on one's jugular vein.

New research suggests that woodpeckers may indeed experience brain injuries similar to those seen in humans. Regardless, the physics of woodpecker drumming are quite different from those of sports concussions, which generally happen with unpredictable timing and involve considerable head rotation. Despite its intuitive appeal, I believe that a woodpecker-mimicking collar is more pseudoscience than innovation.

Beyond sports concussions

As my colleagues and I have been debunking the scientific rationale for the Q-Collar, research examining the Q-Collar seems to have shifted from reducing the risk of concussions, or distinct events following a single hit, to a less tangible goal of reducing brain damage from repeated subconcussive impacts.

New research claims evidence of benefit, based on MRI data. As one article stated in 2016, the collar "may have provided a protective effect against brain microstructural changes after repetitive head impacts." An article published in October 2018, reporting on a small study, found that the brains of female soccer players who wore collars for a season seemingly showed no brain damage, while those who did not wear the collar did show small changes in some areas of the brain.

However, some other researchers have expressed concerns over the small numbers of subjects and the high dropout rates in similar studies about the collar. Some physicians have concluded that this evidence is not enough to suggest that it does protect the brain from injury and current promotional campaigns are potentially misleading. I also remain skeptical of these findings, since the clinical utility of this particular type of MRI data remains unclear, especially in relation to long-term health.

As the company aims for FDA approval and looks beyond sports applications, I fear that long-term brain health is being placed in equipment justified by misunderstandings of physiology, coincidental relationships, and yes, even what Ive concluded are incorrect claims about woodpeckers and other animals.

Some may argue that even if it does not work, there is no harm in adding an extra layer of protection. However, I believe this is a dangerous attitude. When athletes feel they are more protected, they have a false sense of extra safety and play more aggressively. This may actually increase risk of injury.

As Luke Kuechly and others can attest, even innovative-sounding equipment cannot stop concussions in contact sports. Unfortunately, we may not know if long-term brain damage can actually be limited by new technologies until it is too late.

*****

James Smoliga, Professor of Physiology, Department of Physical Therapy, High Point University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


CSUN Study Finds Land-Based Inputs Make Corals Vulnerable to Ocean Warming – CSUN Today

Danielle Becker (left) and CSUN marine biology professor Nyssa Silbiger (right) collect samples of Pocillopora acuta at one of the field sites in Moorea, French Polynesia. Photo by DM Barnas.

Land-based inputs, including nutrient and sediment loading, can adversely affect certain species of coral, making them more susceptible to the warmer seawater conditions brought on by climate change, according to a new study by California State University, Northridge marine biologists.

The study, by Danielle Becker, who graduated from the university in August with a master's degree in biology, and marine biology professor Nyssa Silbiger, offers insight into how human-driven stressors can affect corals' metabolic rates, including photosynthesis, respiration and calcification.

"Local and global anthropogenic, or human-caused, stressors are rapidly diminishing the biodiversity and structural complexity of coral reefs," Becker said. "Therefore, a better understanding of the ecological ramifications of warming and land-based inputs such as sedimentation and nutrient loading on coral reef ecosystems is necessary."

"Our findings helped us identify how nutrient input and sedimentation influenced coral physiology and their ability to function during thermal stress," she continued. "With this information, we can better understand how different metabolic rates of corals may change under human-driven stressors. Further, these results indicate that anthropogenic stressors on a local scale may make it even more difficult for corals to deal with global stressors, like climate change."

The study, "Nutrient and sediment loading affect multiple facets of coral functionality in a tropical branching coral," appeared last month in the Journal of Experimental Biology (doi:10.1242/jeb.225045).

"Our study estimated a range of responses in a branching coral species to nutrient and sediment loading," Silbiger said, "measuring the metabolism of individuals in the lab to percent cover on the reef." "While our current study shows that nutrient and sediment loading are generally detrimental to corals, other studies have conflicting results," she continued. "The different biological responses across studies highlight how many interacting variables, like water flow, depth and distance to shore, can affect a coral's response to nutrient loading."

Becker, working with Silbiger, began her study in fall 2018, during the first year of her masters program. She surveyed six sites in fringing coral reefs along the north shore of Moorea in French Polynesia that represented a gradient in nutrient and sediment concentrations. At each site, she measured the percent cover of numerous coral species, including Pocillopora acuta, a fast-growing branching coral that is typically more resistant to human disturbances. She also collected a variety of environmental samples, including the nutrient concentration in the water, tissue nitrogen content in macroalgae and sedimentation rates.

She then measured the photosynthesis, respiration and calcification rates of the corals at increasing temperatures to better understand their ability to perform during thermal stress, and the impact nutrient and sediment loading had on their performance.
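Thermal performance of this kind is often summarized by fitting a curve to rate-versus-temperature measurements and reading off the thermal optimum. The sketch below fits a simple quadratic; the data points are invented for illustration, and the study's actual curve-fitting approach may differ.

```python
import numpy as np

# Hypothetical rate measurements at increasing temperatures.
temps = np.array([26.0, 28.0, 30.0, 32.0, 34.0])  # seawater temperature, °C
photo = np.array([0.8, 1.1, 1.3, 1.2, 0.7])       # relative photosynthesis rate

# Fit rate ≈ a*T² + b*T + c; a downward-opening parabola is the
# simplest model of a performance peak.
a, b, c = np.polyfit(temps, photo, 2)

# The vertex of the parabola estimates the thermal optimum.
t_opt = -b / (2 * a)
print(f"estimated thermal optimum ≈ {t_opt:.1f} °C")
```

Comparing how the fitted optimum and peak rate shift between low- and high-loading sites is one way such data can reveal a reduced ability to tolerate thermal stress.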

Becker, now a first-year doctoral student at the University of Rhode Island, explained that nutrient and sediment loading can occur naturally, from sources such as fish excretion, nitrogen-fixing blue-green algae or submarine groundwater discharge, or from human-derived sources such as industrial or agricultural waste and run-off, deforestation, stormwater run-off, coastal development or household products.

"These various sources enter the water supply through run-off and drainage that can eventually make their way into our waterways and into ocean environments, especially along coastal cities or populations around the world," Becker said. "Land-based inputs (nutrients, sedimentation, toxins and pathogens) can enter coral reef ecosystems and cause disease and mortality, disrupt ecological functions, change dynamics and feeding behaviors, and prevent coral growth and reproduction."

Becker and Silbiger found that coral metabolic responses declined significantly with exposure to high nutrient concentrations and sedimentation rates, which may have contributed to a decline in Pocillopora acuta coral cover along the north shore fringing reefs of Moorea.

"To our knowledge, this is one of the first published studies that provides evidence for the influences of nutrient and sediment loading on coral thermal performance that encompasses multiple aspects of coral functionality," Becker said. "Our findings show that nutrient and sediment loading can have a range of effects on coral functionality, with increased levels of these stressors compromising corals' ability to withstand thermal stress and to perform the metabolic tasks necessary to thrive in their natural environments."

"Metabolic processes such as photosynthesis, respiration and calcification are important indicators of organismal health and are continuously being altered by organisms to adjust their physiological mechanisms in variable environments," she continued. "Our study shows that these processes are compromised along a nutrient and sedimentation gradient, which can have implications much larger than on an individual scale. Understanding how local-scale anthropogenic stressors influence the responses of corals to temperature can inform coral reef management."

Becker and Silbiger said their findings are relevant to coral reefs around the world.

"Nutrient pollution is common anywhere that there is agriculture, coastal development and sewage outfalls," Becker said. "French Polynesia has relatively low levels of anthropogenic nutrient and sediment loading compared to other areas of the world. If we're seeing these detrimental effects on a lesser scale, it is worth investigating these relationships in other reefs around the world. Our findings also provide valuable information for the use of thermal performance curves to further understand how organisms across environments may respond to local- and global-scale anthropogenic stressors in concert."



Exploring Eagle Hearing & Vision Capabilities To Reduce Risk At Wind Farms – CleanTechnica

Published on October 13th, 2020 | by U.S. Department of Energy

Purdue University (Purdue) and the University of Minnesota (UMN) are studying the visual and auditory capabilities of bald and golden eagles to help improve the effectiveness of deterrents used around wind energy facilities. Findings from this research, which is funded by the Wind Energy Technologies Office (WETO), will be made available to eagle deterrent technology developers.

Bald eagles were removed from the endangered species list in 2007 after a strong population recovery. Golden eagles were not listed, but both eagle species are federally protected under the Bald and Golden Eagle Protection Act (BGEPA), which prohibits the killing (or "take") of eagles, unless permitted. This act requires that wind energy developers and operators do everything they can to minimize risks to eagles through methods such as careful siting, deterrents, or sensors that monitor for incoming wildlife and shut down wind turbines if an eagle approaches.

One way to reduce risks is to develop technologies that produce sound or a visual cue to deter eagles from entering the airspace around wind turbines. To develop highly effective deterrents based on sound or visual stimuli to which eagles are most sensitive, Purdue University explored both eagle hearing and vision, whereas UMN researchers studied eagle hearing and identified possible surrogate species with hearing capabilities similar to bald and golden eagles.

The Purdue research team worked with seven raptor rehabilitation centers to evaluate eagle hearing and vision ranges. They found that both bald and golden eagles have a blind spot near the tops of their heads (Figure 1) that hinders the birds' ability to see a wind turbine ahead of them if looking downward (e.g., while hunting). This finding supports the need for a deterrent that is sufficiently alarming to an eagle to cause it to look up when hunting.

Figure 1. Visual field configurations of the golden eagle (left) and bald eagle (right). The Purdue University team found both species of eagles have a blind spot near the tops of their heads (bottom row). Illustration courtesy of Purdue University

The Purdue team also found that it is highly unlikely that golden or bald eagles can detect ultraviolet light. They identified candidate colors (blue/indigo and orange/red) that would be most visible to eagles against various backgrounds. Furthermore, golden eagles exhibited a higher proportion of stress-related behaviors to visual signals than to sound or light-plus-sound signals. Bald eagles showed a higher proportion of stress-related behavior to light-plus-sound signals. In other words, golden eagles are more likely to respond to visual signals, whereas bald eagles are more likely to respond to a combination of sight and sound. Both species showed some level of adaptation to stimuli over time, indicating the need for additional, randomized visual and auditory signal testing.

Purdue researchers concluded that the auditory systems of bald and golden eagles were sufficiently different to warrant species-specific deterrent signals: certain signal types would be good candidates for further testing with bald eagles, but deterrents for golden eagles should be complex tonal harmonics or modulated sounds that do not change very rapidly.

The UMN research team studied raptors admitted to the university's Raptor Center for treatment and worked with Sia: the Comanche Nation Ethno-Ornithological Initiative in Cyril, Oklahoma, to assess eagle hearing ranges. Once data were collected, they developed a suite of audio test signals and worked with eagles at the Raptor Center to evaluate which of the signals generated the strongest response.

Researchers found that eagles can hear over a frequency range of at least four octaves, centered on 2 kHz, which is roughly a B note on a piano, three octaves above middle C, with an upper limit between 6 kHz and 10 kHz at 80 decibels, and a lower limit that likely extends below 0.2 kHz.
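
The note-to-frequency mapping above can be checked with a little equal-temperament arithmetic. This is an illustrative sketch, not part of the study: the `note_freq` helper and the MIDI note numbers are standard music-theory conventions, and the band calculation simply takes two octaves either side of the 2 kHz center.

```python
def note_freq(midi_note, a4=440.0):
    # Equal-temperament frequency for a MIDI note number (A4 = 69).
    return a4 * 2 ** ((midi_note - 69) / 12)

# B6 (MIDI 95) sits roughly three octaves above middle C (C4, MIDI 60)
# and lands near the 2 kHz center frequency reported for eagle hearing.
b6 = note_freq(95)

# "At least four octaves centered on 2 kHz" spans two octaves either side
# of the center: 500 Hz to 8 kHz, consistent with the reported 6-10 kHz
# upper limit and a lower limit extending below that band's floor.
low, high = 2000 / 2**2, 2000 * 2**2

print(round(b6, 1), low, high)  # 1975.5 500.0 8000.0
```

Each octave doubles the frequency, so "four octaves centered on 2 kHz" is a minimum span; the measured lower limit below 0.2 kHz indicates the actual range is wider still.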

In addition to evaluating eagles' physiological responses to synthetic tones, the research team evaluated the auditory properties of eagle vocalizations to better understand how their vocal repertoire might be used in a deterrent. The findings suggest that companies designing eagle deterrents should consider varying frequency and volume patterns to achieve the strongest and least-habituated responses. They also recommend against broadcasting sound outside the observed responsive frequency band of bald and golden eagles to avoid contributing unnecessarily to existing sound-pollution levels.

After exploring the usefulness of red-tailed hawks as potential surrogate species for field testing auditory deterrents, the UMN research team concluded that the hawks' auditory systems are similar enough to those of bald and golden eagles that they may be used as surrogate species when testing new deterrent devices or signals.

This finding is important because of regulatory protections afforded eagles under the BGEPA. Being able to test on red-tailed hawks will provide a significant benefit to technology developers looking to test the usefulness of their systems prior to field trials. However, researchers noted that testing with eagles in a real-world environment, in addition to any testing on red-tailed hawks, will be critical to any deterrent validation study.

The final technical report on this research is pending publication. The University of Minnesota has published one article about their research in the Journal of Comparative Physiology.


Crop Biotechnology, physiology and translational genomics to feed and fuel the world – Newswise

Newswise, October 6, 2020: Accelerated crop improvement is needed to meet both global population growth and climate-change-generated stresses on crops. The "Crop Biotechnology, physiology and translational genomics to feed and fuel the world" symposium at the Translating Visionary Science to Practice ASA, CSSA, SSSA International Annual Meeting will address these topics.

The meeting is being held virtually, Nov. 9-13, 2020, and is hosted by the American Society of Agronomy, Crop Science Society of America and Soil Science Society of America. Media are invited; preregistration is required.

Presentations may be watched asynchronously, and there will be a scheduled Q&A time to speak with presenters during the meeting. Presentations will be available for online viewing for 90 days after the meeting for all registrants. For more information about the Translating Visionary Science to Practice 2020 meeting, visit https://www.acsmeetings.org/.

Media are invited to attend the conference. Pre-registration by Nov. 2, 2020 is required. Visit https://www.acsmeetings.org/media for registration information.

To speak with one of the scientists, contact Susan V. Fisk, 608-273-8091, sfisk@sciencesocieties.org, to arrange an interview.


NIH intramural researcher Dr. Harvey Alter wins 2020 Nobel Prize in Physiology or Medicine – National Institutes of Health

News Release

Monday, October 5, 2020

National Institutes of Health intramural researcher Harvey J. Alter, M.D., has won the 2020 Nobel Prize in Physiology or Medicine for his contributions to the discovery of the hepatitis C virus. Dr. Alter is a Senior Scholar at the NIH Clinical Center's Department of Transfusion Medicine and shares the award with Michael Houghton, Ph.D., University of Alberta, Canada, and Charles M. Rice, Ph.D., Rockefeller University, New York City.

The Royal Swedish Academy of Sciences said, "Prior to their work, the discovery of the Hepatitis A and B viruses had been critical steps forward, but the majority of blood-borne hepatitis cases remained unexplained. The discovery of Hepatitis C virus revealed the cause of the remaining cases of chronic hepatitis and made possible blood tests and new medicines that have saved millions of lives."

"I am overwhelmed at the moment, but so pleased that this originally obscure virus has proven to have such a large global impact," said Dr. Alter. "There are so many persons at NIH who advanced my research, but for now I can only thank NIH, itself, for creating the permissive and collaborative environment that supported these studies over the course of decades. I don't believe my contributions could have occurred anywhere else."

Dr. Alter's career at NIH has spanned more than 50 years, during which he focused his research on the occurrence of hepatitis in patients who had received blood transfusions. In the 1970s, despite the discovery of hepatitis B, Dr. Alter observed that a significant number of patients receiving blood transfusions still developed chronic hepatitis due to an unknown infectious agent. Dr. Alter and his colleagues showed that blood from these hepatitis patients could transmit the disease to chimpanzees, the only susceptible host besides humans. Subsequent studies also demonstrated that the unknown infectious agent had the characteristics of a virus. Alter's methodical investigations defined a new, distinct form of chronic viral hepatitis, which became known as non-A, non-B hepatitis. His work was instrumental in leading to the development of new diagnostic and therapeutic agents and providing the scientific basis for instituting blood donor screening programs that have decreased the incidence of transfusion-transmitted hepatitis to near zero.

"Harvey Alter is a scientist's scientist: smart, creative, dedicated, persistent, self-effacing, intensely dedicated to saving lives," said NIH Director Francis S. Collins, M.D., Ph.D. "His work to identify the nature of the hepatitis C virus has led to dramatic advances in protecting the blood supply from this very serious illness, and ultimately to the development of highly successful therapy."

Dr. Alter had focused on viral hepatitis even before his work on hepatitis C. In the 1960s, he co-discovered the Australia antigen, a key to detecting hepatitis B virus. Later, he spearheaded a project at the NIH Clinical Center that created a storehouse of blood samples used to uncover the causes and reduce the risk of transfusion-associated hepatitis. In 2000, Alter was awarded the prestigious Clinical Lasker Award. In 2002, he became the first NIH Clinical Center scientist elected to the National Academy of Sciences, and in that same year he was elected to the Institute of Medicine. In 2013, Dr. Alter was honored with the distinguished Canada Gairdner International Award.

"Harvey is known for a very sharp sense of humor, a tireless work ethic, and for treating everyone well," said James K. Gilman, M.D., chief executive officer of the NIH Clinical Center. "As a long-time military physician, I am grateful for what Harvey and his co-winners have done to make it possible to provide a safe blood supply to the men and women who serve the country in uniform."

Dr. Alter's co-recipient Dr. Rice has received continuous NIH funding totaling more than $67 million since 1987, primarily from NIH's National Institute of Allergy and Infectious Diseases.

For more on Drs. Alter, Houghton and Rices contributions to the discovery of the hepatitis C virus, visit the Royal Swedish Academy of Sciences site: https://www.nobelprize.org/prizes/medicine/2020/press-release/.

About the NIH Clinical Center: The NIH Clinical Center is the world's largest hospital entirely devoted to clinical research. It is a national resource that makes it possible to rapidly translate scientific observations and laboratory discoveries into new approaches for diagnosing, treating, and preventing disease. Over 1,600 clinical research studies are conducted at the NIH Clinical Center, including those focused on cancer, infectious diseases, blood disorders, heart disease, lung disease, alcoholism and drug abuse. For more information about the Clinical Center, visit https://clinicalcenter.nih.gov/index.html.

About the National Institutes of Health (NIH):NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

NIH…Turning Discovery Into Health

###


Exercise identified as key to halting progress from diabetes to heart disease – Study – Voxy

An international study led by the University of Otago has revealed how exercise can reduce the chance of diabetes leading on to heart disease.

The research has identified that exercise triggers the release of small sequences of genetic code in the heart called microRNA, which increase protein production to improve heart structure and function.

The study, 'Exercise regulates microRNAs to preserve coronary and cardiac function in the diabetic heart', has recently been published in the prestigious journal Circulation Research, one of the world's leading publications in the field of cardiovascular medicine.

Associate Professors Daryl Schwenke and Rajesh Katare, of Otago's Department of Physiology, found that specific microRNA are adversely altered in the early stages of diabetes. These altered microRNA can reliably predict the inevitable onset of heart disease. Associate Professor Katare believes this is a pivotal new development as microRNA can serve as a reliable early biomarker for heart disease in diabetes.

"We've proven that by using exercise as a treatment, we can increase good microRNA, and reduce bad microRNA from causing damage. Exercise effectively improves regulation of microRNA to prevent the onset and progression of heart disease," Associate Professor Schwenke says.

Along with highlighting the role of exercise in regulating microRNA, the study also shows that microRNA are a potential novel target for the therapeutic treatment of heart disease in people with chronic diabetes.

"By increasing the good microRNA using pharmacological drugs it is possible to effectively reduce heart disease in diabetic subjects. This approach is not solely reliant on exercise," Associate Professor Schwenke says.

Over 250,000 New Zealanders have diabetes, according to the Ministry of Health, which describes diabetes as a serious health challenge to the country.

Associate Professor Schwenke believes this research offers clear long-term benefits, both for the quality of life of diabetic patients with heart disease and for alleviating the economic burden associated with current treatment of diabetes.

"By understanding the physiological role of microRNA we can see without doubt the positive role of exercise in preventing diabetic heart disease," he adds.

The study is a collaboration between the University of Otago, Japans National Cardiovascular Research Institute, and the Synchrotron Radiation Research Institute, also of Japan.


Molly Thompson | School of Community Health Sciences – Nevada Today

Research interests

After nearly a decade of working in wildlife biology, I decided to shift my research focus from wildlife conservation and physiology to human health. Most wildlife problems today are symptoms of human behavior, and I believe that physical and emotional health are essential to altering destructive and dissociative human behaviors. My research interests are broad but include the interfaces of mental and physical health, topics related to traumatic experiences (e.g. denial, recovery and resiliency, and impacts on physical health), and alternative therapies that promote human health while fostering a healthier connection to the environment (e.g. gardening, creative arts rooted in upcycling, and environmentally based service learning).

2023 (anticipated) PhD in Public Health with specialization in Epidemiology

2017 MS in Wildlife Conservation, Virginia Tech

2010 BS in Biological Sciences, University of California, Chico
