Nobel-winning biologist on the most promising ways to stop ageing – New Scientist

ANTI-AGEING is big business. From books encouraging diets such as intermittent fasting to cosmetic creams to combat wrinkles, a multibillion-dollar industry has been built on promises to make us live longer and look younger. But how close are we really to extending our lifespan in a way that gives us extra years of healthy life?

Nobel prizewinner Venki Ramakrishnan, a molecular biologist and former president of the UK's Royal Society, is the latest to tackle this question. He has spent 25 years at the MRC Laboratory of Molecular Biology in Cambridge, UK, studying the ribosome, the molecular machine where our cells make proteins using the information encoded in our genes.

In his latest book, Why We Die: The new science of ageing and the quest for immortality, he goes on a journey around the cutting-edge biology of human ageing and asks whether it will be possible to extend our lifespan in the near future.

He talks to New Scientist about the recent breakthroughs in our knowledge of what causes ageing, how close we are to creating therapeutics to combat it, and the potential consequences if we succeed.

Graham Lawton: What inspired you to take a break from a hugely successful career researching how cells make proteins to write a book about ageing?

Venki Ramakrishnan: Two things. One is that the translation of genetic code into proteins affects almost every biological process, and it turns out to be central to many aspects of ageing.

The other reason is that we have worried about ageing and death ever since we…

How to better research the possible threats posed by AI-driven misuse of biology – Bulletin of the Atomic Scientists

Over the last few months, experts and lawmakers have become increasingly concerned that advances in artificial intelligence could help bad actors develop biological threats. But so far there have been no reported biological misuse examples involving AI or the AI-driven chatbots that have recently filled news headlines. This lack of real-world wrongdoing prevents direct evaluation of the changing threat landscape at the intersection of AI and biology.

Nonetheless, researchers have conducted experiments that aim to evaluate sub-components of biological threats, such as the ability to develop a plan for misuse or to obtain information that could enable it. Two recent efforts, by RAND Corporation and OpenAI, to understand how artificial intelligence could lower barriers to the development of biological weapons concluded that access to a large language model chatbot did not give users an edge in developing plans to misuse biology. But those findings are just one part of the story and should not be considered conclusive.

In any experimental research, study design influences results. Even if technically executed to perfection, all studies have limitations, and both reports dutifully acknowledge theirs. But given the extent of the limitations in the two recent experiments, the reports on them should be seen less as definitive insights and more as opportunities to shape future research, so policymakers and regulators can apply it to help identify and reduce potential risks of AI-driven misuse of biology.

The limitations of recent studies. In the RAND Corporation report, researchers detailed the use of red teaming to understand the impact of chatbots on the ability to develop a plan of biological misuse. The RAND researchers recruited 15 groups of three people to act as red-team "bad guys." Each of these groups was asked to come up with a plan to achieve one of four nefarious outcomes ("vignettes") using biology. All groups were allowed to access the internet. For each of the four vignettes, one red team was given access to an unspecified chatbot and another red team was given access to a different, also unspecified chatbot. When the authors published their final report and accompanying press release in January, they concluded that large language models do not increase the risk of a biological weapons attack by a non-state actor.

This conclusion may be an overstatement of their results, as their focus was specifically on the ability to generate a plan for biological misuse.

The other report was posted by the developers of ChatGPT, OpenAI. Instead of using small groups, OpenAI researchers had participants work individually to identify key pieces of information needed to carry out a specific defined scenario of biological misuse. The OpenAI team reached a conclusion similar to the RAND team's: GPT-4 provides "at most a mild uplift in biological threat creation accuracy." Like RAND's, this also may be an overstatement of results, as the experiment evaluated the ability to access information, not to actually create a biological threat.

The OpenAI report was met with mixed reactions, including skepticism and public critique of the statistical analysis performed. The core objection was the appropriateness of a correction applied during analysis that re-defined what constituted a statistically significant result. Without the correction, the results would have been statistically significant; that is to say, the use of the chatbot would have been judged to be a potential aid to those interested in creating biological threats.
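The effect of such a correction can be seen in a toy example. The p-values below are invented for illustration, not data from either study; the correction shown is a simple Bonferroni adjustment, one common way of re-defining the significance threshold when multiple outcomes are tested:

```python
# Hypothetical p-values for several outcome measures (made-up numbers).
p_values = [0.030, 0.20, 0.41, 0.08, 0.55]
alpha = 0.05
m = len(p_values)

# Without correction, any p-value below alpha counts as significant:
# here, the 0.030 result clears the 0.05 bar.
uncorrected_hits = [p for p in p_values if p < alpha]

# A Bonferroni correction divides alpha by the number of comparisons,
# so the same 0.030 result no longer clears the stricter 0.01 bar.
bonferroni_hits = [p for p in p_values if p < alpha / m]
```

The same data thus yields a "significant" or "non-significant" headline depending on an analysis choice made before the numbers are seen, which is exactly why the choice drew scrutiny.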

Regardless of their limitations, the OpenAI and RAND experiments highlight larger questions which, if addressed head-on, would enable future experiments to provide more valuable and actionable results about AI-related biological threats.

Is there more than statistical significance? In both experiments, third-party evaluators assigned numeric scores to the text-based participant responses. The researchers then evaluated if there was a statistically significant difference between those who had access to chatbots and those who did not. Neither research team found one. But typically, the ability to determine if a statistically significant difference exists largely depends on the number of data points; more data points allow for a smaller difference to be considered statistically significant. Therefore, if the researchers had many more participants, the same differences in score could have been statistically significant.
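The dependence on sample size can be sketched numerically. The score difference and spread below are invented, and the calculation assumes a simple two-sided, two-sample z-test with a known common standard deviation, which is a simplification of what either research team actually did:

```python
import math

def two_sample_z_pvalue(mean_diff, sd, n):
    """Two-sided p-value for a mean difference between two groups of
    size n each, assuming a known common standard deviation sd."""
    se = sd * math.sqrt(2.0 / n)             # standard error of the difference
    z = mean_diff / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability

# The same observed difference in mean score (0.5 points, SD 1.5)
# at two different per-group sample sizes (both illustrative):
p_small = two_sample_z_pvalue(0.5, 1.5, 15)
p_large = two_sample_z_pvalue(0.5, 1.5, 150)
```

With 15 participants per arm the difference is far from significant (p ≈ 0.36), while the identical difference with 150 per arm is highly significant (p ≈ 0.004), which is the point: a null result at small scale does not rule out a real, detectable effect.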

Reducing text to numbers can bring other challenges as well. In the RAND study, the teams, regardless of access to chatbots, did not generate any plans that were deemed likely to succeed. However, there may have been meaningful differences in why the plans were not likely to succeed, and systematically comparing the content of the responses could prove valuable in identifying mitigation measures.

In the OpenAI work, the goal of the participants was to identify a specific series of steps in a plan. However, if a participant missed an early step in the plan, all the remaining steps, even if correct, would not count towards their score. This meant that someone who made an error early on but identified all the remaining information correctly would score similarly to someone who identified no correct information at all. Again, researchers may gain insight from identifying patterns in which steps participants failed at, and why.
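The scoring issue can be made concrete with two hypothetical grading rules; neither is OpenAI's actual rubric, and the step labels are placeholders:

```python
def prefix_score(answer, key):
    """Count correct steps only up to the first mistake,
    mimicking a rubric where an early error zeroes out later credit."""
    score = 0
    for got, expected in zip(answer, key):
        if got != expected:
            break
        score += 1
    return score

def total_correct(answer, key):
    """Count every correct step, regardless of position."""
    return sum(g == e for g, e in zip(answer, key))

key = ["A", "B", "C", "D", "E"]
early_error = ["X", "B", "C", "D", "E"]  # one early mistake, rest correct
all_wrong   = ["X", "X", "X", "X", "X"]  # nothing correct
```

Under prefix scoring, `early_error` and `all_wrong` both score 0 and become indistinguishable, while counting total correct steps separates them (4 vs. 0) and preserves the pattern of where participants failed.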

Are the results generalizable? To inform an understanding of the threat landscape, conclusions must be generalizable across scenarios and chatbots. Future evaluators should be clear on which large language models are used (the RAND researchers were not). It would be helpful to understand if researchers achieve a similar answer with different models, or different answers with the same model. Knowing the specifics would also enable comparisons of results based on the characteristics of the chatbot used, enabling policymakers to understand if models with certain characteristics have unique capabilities and impact.

The OpenAI experiment used just one threat scenario. There is not much reason to believe that this one scenario is representative of all threat scenarios; the results may or may not generalize. There is a tradeoff in using one specific scenario; it becomes tenable for one or two people to evaluate 100 responses. On the other hand, the RAND work was much more open-ended as participant teams were given flexibility in how they decided to achieve their intended goal. This makes the results more generalizable, but required a more extensive evaluation procedure that involved many experts to sufficiently examine 15 diverse scenarios.

Are the results impacted by something else? Partway through their experiment, the RAND researchers enrolled a "black cell," a group with significant experience with large language models. The RAND researchers made this decision because they noticed that some of their study's red teams were struggling to bypass safety features of the chatbots. In the end, the black cell received an average score almost double that of the corresponding red teams. The black cell participants didn't need to rely only on their expertise with large language models; they were also adept at interpreting the academic literature about those models. This provided a valuable insight to the RAND researchers: "[t]he relative outperformance of the black cell illustrates that a greater source of variability appears to be red team composition, as opposed to LLM access." Simply put, it probably matters more who is on the team than whether the team has access to a large language model.

Moving forward. Despite their limitations, red teaming and benchmarking efforts remain valuable tools for understanding the impact of artificial intelligence on the deliberate biological threat landscape. Indeed, the National Institute of Standards and Technology's Artificial Intelligence Safety Institute Consortium, a part of the US Department of Commerce, currently has working groups focused on developing standards and guidelines for this type of research.

Outside of the technical design and execution of the experiments, challenges remain. The work comes with meaningful financial costs, including compensation of participants for their time (OpenAI pays $100 per hour to experts); of individuals to recruit participants, design experiments, administer the experiments, and analyze data; and of biosecurity experts to evaluate the responses. Therefore, it is important to consider who will fund this type of work in the future. Should artificial intelligence companies fund their own studies, a perceived conflict of interest will linger if the results are intended to inform governance or public perception of their models' risks. At the same time, funding directed to nonprofits like RAND Corporation or to academia does not inherently give researchers access to unreleased or modified models, like the version used in the OpenAI experiment. Future work should learn from these two reports and from the questions raised above.

The path toward more useful research on AI and biological threats is hardly free of obstacles. Employees at the National Institute of Standards and Technology have reportedly expressed outrage regarding the recent appointment of Paul Christiano, a former OpenAI researcher who has expressed concerns that AI could pose an existential threat to humanity, to a leadership role at the Artificial Intelligence Safety Institute. Employees are concerned that Christiano's personal beliefs about catastrophic and existential risk posed by AI broadly will affect his ability to maintain the National Institute of Standards and Technology's commitment to objectivity.

This internal unrest comes on the heels of reporting that the physical buildings that house the institute are falling apart. As Christiano looks to expand his staff, he will also need to compete against the salaries paid by tech companies. OpenAI, for example, is hiring for safety-related roles with the low end of the base salary exceeding the high end of the General Schedule payscale (federal salaries). It is unlikely that any relief will come from the 2024 federal budget, as lawmakers are expected to decrease the institute's budget from 2023 levels. But if the United States wants to remain a global leader in the development of artificial intelligence, it will need to make financial commitments to ensure that the work required to evaluate artificial intelligence is done right.

Fired Biology Professor Fights Back and Wins, Has a Message For Fellow Christians – CBN.com

A Texas college professor who said he was fired after university leaders reportedly found his teachings too religious has been reinstated to his position more than a year after being terminated.

Dr. Johnson Varkey and his attorneys at First Liberty Institute recently announced Varkey had won his adjunct professorial job back at St. Philips College in San Antonio, Texas, after being fired in 2023 for teaching standard principles about human biology and reproduction.

The announcement comes after a favorable settlement was reached with the Alamo Community College District; the school system voluntarily reinstated Varkey.

"I was so excited," the professor told CBN News after the announcement. "And thank the Lord for that outcome."

Varkey said he's grateful to First Liberty and to the Lord for helping him get back his position.

"I am excited to go back and teach," he said.

As CBN News previously reported, a biology professor for the past 20 years, Varkey consistently taught the same facts about the human reproductive system without any problems. But that changed last year, when he received a notice of dismissal.

"I was surprised and I was shocked because, you know, [I] never expected such a letter, or such an email, from the school, because I've been teaching at that school for the last 20 years without any complaints," he said.

Varkey believes his lessons on human biology and sex being determined by chromosomes X and Y sparked complaints leading to his dismissal.

"On the 12th of January [of 2023], I received an email from the vice president of the department of the school that they are doing an ethics violation investigation on me, so I responded to his email and asked him, 'What are the complaints?'" Varkey said. "So, what he said was the human resources will contact me."

The professor said that, although he asked about the complaints, he received no response from HR and didn't get a chance to defend himself before his firing.

"When I got the letter of termination, what the VP mentioned was that some of the complaints were offensive to the homosexuals and transgender," he said. "So, I presume that, very possibly, it is based on the human reproductive system [lesson], which I taught, which was in November [2022]."

Varkey and his attorney believe he was potentially unfairly targeted by some who disagreed with his views or outside work as a pastor.

First Liberty filed a complaint with the Equal Employment Opportunity Commission (EEOC), but before receiving an official response, St. Philips College issued the reinstatement.

Kayla Toney, associate counsel at First Liberty Institute, told CBN News her legal firm was outraged after initially finding out about Varkey's plight, noting the fact his speech had been targeted was deeply concerning.

"There were many First Amendment violations that we saw right away, first his religious exercise," Toney said. "He is a committed Christian; he's also a pastor in addition to his role as a biology professor. So, we think there was some religious targeting by students who knew he was a pastor and thought they could accuse him of something like religious preaching."

The attorney said her client was simply "teaching straight from the textbook and following those lesson plans that he had used for 19 years."

Beyond that, Toney cited academic freedom concerns at the center of the case, arguing professors like Varkey should be free to teach from textbooks without fear of reprisal.

"We also saw some due process concerns, that Dr. Varkey never had the opportunity to meet with the students who were upset, or learn what their complaints were, or have a conversation, even with his supervisor, which was part of the college's own procedures for a complaint or a termination," she said.

Varkey could return to the classroom as soon as this spring, with Toney noting the EEOC complaint will be dismissed as part of the settlement, provided he is able to teach again as planned.

The professor said he hopes his successful quest to fight for his job will inspire other Christians who might face similar issues and barriers.

"I would say, don't quit, because there are people very supportive, just like First Liberty," he said, urging people to be brave and take a stand. "Stand for the truth."

Understanding Reductionism and ID – Discovery Institute

Photo: Monarch butterfly, by liz west from Boxborough, MA [CC BY 2.0 ], via Wikimedia Commons.

The burgeoning field of systems biology, as defined by the National Institutes of Health (NIH),

"is an approach in biomedical research to understanding the larger picture, be it at the level of the organism, tissue, or cell, by putting its pieces together. It's in stark contrast to decades of reductionist biology, which involves taking the pieces apart."

I'm sure that statement is designed to make systems biology sound radical and exciting, and it succeeds. It's especially exciting for proponents of intelligent design, because ID theorists have been arguing against reductionism in biology for a long time.

But we need to be careful. We don't want to make an argument based on an equivocation. The word "reductionism" is thrown around a lot, but it can mean several different things. It's not as simple as saying, "Biologists are learning that reductionism is bad!"

As it turns out, the move away from reductionism in systems biology is significant for the ID debate, but not simply by word-association. So I want to take some time to suss out the different meanings of the word "reductionism" and what they have to do with intelligent design.

There are two kinds of reductionism that are relevant to this discussion: methodological reductionism and ontological reductionism. (For a third kind, epistemological reductionism, see this cartoon.) The opposing philosophies are, respectively, methodological antireductionism and ontological antireductionism. The terms are a bit eye-splitting, but they aren't difficult to understand.

Methodological reductionism is the idea that a thing can best be understood by breaking it down into its parts. The contrary philosophy, methodological antireductionism, says that a thing can be best understood by looking at it as a whole.

The opposing views are summed up nicely in a conversation between the wizards Saruman and Gandalf in The Lord of the Rings. Saruman shows Gandalf his new rainbow-colored outfit and tells him that he has decided to stop going by Saruman the White and go by Saruman of Many Colours instead.

"I liked white better," says Gandalf.

"White!" Saruman sneers. "It serves as a beginning. White cloth may be dyed. The white page can be overwritten; and the white light can be broken."

"In which case it is no longer white," says Gandalf. "And he that breaks a thing to find out what it is has left the path of wisdom."

Saruman is a methodological reductionist and Gandalf is a methodological antireductionist.

Methodological reductionism: "The white light can be broken."

Methodological antireductionism: "He that breaks a thing to find out what it is has left the path of wisdom."

Ontological reductionism, on the other hand, is not about the best way to study something, but rather about what that thing really is at the deepest level. Ontological reductionism says that a thing can be reduced to its most basic parts, and that's what it is, nothing more. According to this theory, a tree is a collection of cells, which in turn are collections of molecules, which are collections of atoms, which are collections of subatomic particles. So in the final analysis, a tree is a collection of subatomic particles.

This view, and its antithesis, is expressed in C. S. Lewis's The Voyage of the Dawn Treader. On an island near the edge of the world, the characters meet a being named Ramandu who claims to be a star.

"In our world," Eustace Scrubb objects, "a star is a huge ball of flaming gas."

"Even in your world, my son," replies Ramandu, "that is not what a star is but only what it is made of."

Eustace is an ontological reductionist and Ramandu is an ontological antireductionist. (And if Ramandu's statement seems mind-bending or baffling, that's because most of us were educated into ontological reductionism.)

Ontological reductionism: "A star is a huge ball of flaming gas."

Ontological antireductionism: "That is not what a star is but only what it is made of."

The field of systems biology is methodologically antireductionist. It does not have to be ontologically antireductionist. So, systems biologists do not necessarily reject materialism or physicalism. They do not have to believe in minds, or be willing to posit neo-Platonic souls of cabbages, or think the true meaning of a mushroom can only be found in its wholeness.

They have simply found it to be the case that looking at living organisms as complete systems yields better results than only taking them apart to focus on their bare components. Researchers are coming to realize that it is more productive to think about the plan of an organism than simply about its physical structure or components.

But this is important, because whether systems biologists always admit it or not, methodological antireductionism implies ontological antireductionism. Gandalf agrees with Ramandu, not Eustace.

That's not to say that ontological antireductionism logically follows from methodological antireductionism, or vice versa. In theory, you could have one without the other. But the success of methodological antireductionism fulfills a prediction of the hypothesis of ontological antireductionism.

That is: if there really is a plan, then you would naturally suppose that looking for a plan would turn out to be a great strategy, and that proceeding as if there were no plan would not be a great strategy. And that is the reality. It turns out that when you take a creature apart to see what its parts are, you see a bunch of parts; but when you take a step back and look for a plan, you find a plan.

Intelligent design is a sub-type of ontological antireductionism. To be exact, it is one way of answering the question: if a thing isn't just the sum of its parts, then what is it? ID proposes that (at least some) natural entities are more than the sum of their parts because they are ultimately an expression of an idea in a conscious mind. If this is true, then you would predict those entities to be best understood by grasping the idea behind them; you would try to see the scheme, the purpose, the outline, the plan.

The neo-Darwinian model, in contrast, does not inherently lead to this prediction, because the mechanism of natural selection and random variation is, by definition, an uncoordinated piling-up of useful features, whereas a plan is the coordination of useful features. (Michael Behe's three books and Marcos Eberlin's Foresight explore this idea in depth.)

This is not proof of the design hypothesis, but it is evidence for it. In fact, this sort of evidence is one of the pillars of the scientific method: the strength of a scientific hypothesis depends on its ability to make predictions that are borne out by investigation. Based on that criterion, the hypothesis of intelligent design is doing very well. The hypothesis of mindless evolution is not doing so well, because although mindless processes might generate great complexity, they do not make plans.

Some systems biologists may want to reject Saruman but stay with Eustace; to reap the practical benefits of methodological antireductionism while avoiding the philosophical costs. But they may find that stance difficult to maintain. An unwary systems biologist could easily drift over to Ramandu's Island, where the ID theorists are waiting.

All creatures great and small: Sequencing the blue whale and Etruscan shrew genomes – University of Wisconsin-Madison

Illustration: Beth Atkinson

Size doesn't matter when it comes to genome sequencing in the animal kingdom, as a team of researchers at the Morgridge Institute for Research recently illustrated when assembling the sequences for two new reference genomes: one from the world's largest mammal and one from one of the smallest.

The blue whale genome was published in the journal Molecular Biology and Evolution, and the Etruscan shrew genome was published in the journal Scientific Data.

Research models using animal cell cultures can help navigate big biological questions, but these tools are only useful when following the right map.

"The genome is a blueprint of an organism," says Yury Bukhman, first author of the published research and a computational biologist in the Ron Stewart Computational Group at the Morgridge Institute, an independent research organization that works in affiliation with the University of Wisconsin–Madison in emerging fields such as regenerative biology, metabolism, virology and biomedical imaging. "In order to manipulate cell cultures or measure things like gene expression, you need to know the genome of the species; it makes more research possible."

The Morgridge team's interest in the blue whale and the Etruscan shrew began with research on the biological mechanisms behind the developmental clock from James Thomson, emeritus director of regenerative biology at Morgridge and longtime professor of cell and regenerative biology in the UW School of Medicine and Public Health. It's generally understood that larger organisms take longer to develop from a fertilized egg to a full-grown adult than smaller creatures, but the reason why remains unknown.

"It's important just for fundamental biological knowledge from that perspective. How do you build such a large animal? How can it function?" says Bukhman.

Bukhman suggests that a practical application of this knowledge is in the emerging area of stem cell-based therapies. To heal an injury, stem cells must differentiate into specialized cell types of the relevant organ or tissue. The speed of this process is controlled by some of the same molecular mechanisms that underlie the developmental clock.

Understanding the genomes of the largest and smallest of mammals may also help unravel the biomedical mystery known as Peto's paradox: a curious phenomenon in which large mammals such as whales and elephants live longer and are less likely to develop cancer (often caused by DNA replication errors that occasionally happen during cell division), despite having a greater number of cells (and therefore more cell divisions) than smaller mammals like humans or mice.

Meanwhile, knowledge of the Etruscan shrew genome will enable new insights in the field of metabolism. The shrew has an extremely high surface-to-volume ratio and a fast metabolic rate. These high energy demands are a product of its tiny size (no bigger than a human thumb and weighing less than a penny), making it an interesting model for better understanding the regulation of metabolism.

The blue whale and Etruscan shrew genome projects are part of a large collaborative effort involving dozens of contributors from institutions across North America and several European countries, in conjunction with the Vertebrate Genomes Project.

The mission of the VGP is to assemble high-quality reference genomes for all living vertebrate species on Earth. This international consortium of researchers includes top experts in genome assembly and curation.

"The VGP has established a set of methods and criteria for producing a reference genome," Bukhman says. "Accuracy, contiguity, and completeness are three measures of quality."

Previous methods to sequence genomes used short-read technologies, which produce short lengths of DNA sequence, 150 to 300 base pairs long, called reads. Overlapping reads are then assembled into longer contiguous sequences, called contigs.

Contigs assembled from short reads tend to be relatively small compared to mammalian chromosomes. As a result, draft genomes reconstructed from such contigs tend to be very fragmented and have a lot of gaps.

Instead, the team used long-read sequencing, with reads around 10,000 base pairs in length; the principal advantage is longer contigs and fewer gaps.

"Then you can use other methods, such as optical mapping and Hi-C, to assemble contigs into bigger structures called scaffolds, and those can be as big as an entire chromosome," Bukhman explains.
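The reads-to-contigs step can be illustrated with a toy greedy merge. This is only a sketch: the sequences are invented, the overlaps are exact, and real assemblers such as those used by the VGP must also handle sequencing errors, repeats, and vastly larger data:

```python
def merge_overlapping(a, b, min_overlap=3):
    """Merge read b onto contig a if a suffix of a exactly matches
    a prefix of b of at least min_overlap bases; else return None."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

# Toy reads covering one region of a genome, already in order.
# Real short reads are 150-300 bp; long reads are ~10,000 bp.
reads = ["ATGGCGT", "GCGTACA", "ACATTGC"]

contig = reads[0]
for r in reads[1:]:
    merged = merge_overlapping(contig, r)
    if merged:
        contig = merged  # the contig grows as overlaps are found
```

The three 7-base reads merge into the single 14-base contig `ATGGCGTACATTGC`; longer reads mean longer shared overlaps and fewer places where no overlap can be found, which is why long-read assemblies have fewer gaps.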

The researchers also analyzed segmental duplications, large regions of duplicated sequence that often contain genes and can provide insight into evolutionary processes when compared to other species, either closely or distantly related.

They found that the blue whale had a large burst of segmental duplications in the recent past, with larger numbers of copies than the bottlenose dolphin and the vaquita (the world's smallest cetacean; cetaceans are the order of mammals including whales, dolphins and porpoises). While most of the copies of genes created this way are likely non-functional, or their function is still unknown, the team did identify several known genes.

One encodes the protein metallothionein, which is known to bind heavy metals and sequester their toxicity, a useful mechanism for large animals that accumulate heavy metals while living in the ocean.

A reference genome is also useful for species conservation. The blue whale was hunted almost to extinction in the first half of the 20th century. It is now protected by an international treaty and the populations are recovering.

"In the world's oceans, the blue whale is basically everywhere except for the high Arctic. So, if you have a reference genome, then you can make comparisons and can better understand the population structure of the different blue whale groups in different parts of the globe," Bukhman says. "The blue whale genome is highly heterozygous; there's still a lot of genetic diversity, which has important implications for conservation."

Which raises the question: how do you go about acquiring samples from a large, endangered creature that exists in the vastness of the oceans?

"The logistics posed several challenges, including the fact that blue whale sightings in our area are very rare and almost unpredictable," says Susanne Meyer, a research specialist at the University of California, Santa Barbara, who spent over a year coordinating the permits, personnel and resources needed to procure the samples.

Once their local whale-watching team determined the timing and coordinates of the whale sightings, they brought in licensed whale researcher Jeff K. Jacobsen to perform the whale biopsies using an approved standard cetacean skin biopsy technique, which involves a custom stainless steel biopsy tube fitted to a crossbow arrow.

The team acquired samples from four blue whales, which Meyer used to develop and expand fibroblasts in cell culture for the genome sequencing and further research use.

While the Etruscan shrew genome wasnt studied as extensively as the blue whale genome, the team reported an interesting finding.

"We found that there are relatively few segmental duplications in the shrew genome," Bukhman says, while emphasizing that this result does not necessarily correlate with the diminutive size of the shrew itself. "While shrews belong to a different mammalian order, some similarly small rodents have lots of segmental duplications, and the house mouse is kind of a champion in the sense that it has the most. So, it's not a matter of size."

As the Vertebrate Genomes Project makes strides in producing more high-quality reference genomes for all vertebrates, Bukhman is hopeful that contributions to those efforts will continue to advance biological research in the future.

These studies were supported by grants from the National Science Foundation (2046753, DBI2003635, DBI2146026, IIS2211598, DMS2151678, CMMI1825941 and MCB1925643) and National Institutes of Health (R01GM133840).

Read more:

All creatures great and small: Sequencing the blue whale and Etruscan shrew genomes - University of Wisconsin-Madison

Seeing Double: USU Biologist Carl Rothfels is Developing Novel Polyploid Phylogenetics Tools – Utah State University

Humans have 23 pairs of chromosomes and 46 total chromosomes. Half come from your mother and the other half come from your father. We're a diploid species, meaning most of our chromosomes come in matched sets.

Plants are a different story. Unlike in animals, polyploidy (having more than two sets of chromosomes) is very common among plants.

"Polyploidy," says Utah State University plant biologist Carl Rothfels, "is a dominant feature of existing plant species and appears to be a driver of plant diversity." Advances in genomics techniques, including CRISPR, are fueling study of the phenomenon, he says, yet polyploid phylogenetics, the study of the evolutionary history and relationships among and within polyploid plant groups, lags behind.

Rothfels was awarded a National Science Foundation Faculty Early Career Development Program (CAREER) grant to develop phylogenetic tools and apply them to the fern family Cystopteridaceae, including fragile fern and oak fern, commonly found in the Intermountain West.

"This plant family provides an excellent system for investigating big questions about polyploidy and its role in evolution and diversity," says Rothfels, director of USU's Intermountain Herbarium and associate professor in the Department of Biology and USU Ecology Center. "Is polyploidy an important generator of diversity and thus an engine of evolutionary success? Or is it an evolutionary dead end, forming new species that go extinct more quickly?"

Unraveling these mysteries is a formidable task, he says, involving collection of hard-to-get data and developing, from the ground up, complex analytical tools.

"When evolution happens in a purely branching way, you can construct a family tree," Rothfels says. "But polyploids mess up this system and make it harder because many polyploids are also hybrids, so their history is more of a reticulated network, a web of life instead of a tree of life."

He explains three primary steps involved in a polyploid phylogenetics study, none of which is simple and all of which present their own challenges.

The first step involves sequencing each copy of the target locus or set of loci present in a polyploid sample, and accurately reconstructing each distinct sequence from each subgenome.

The second step is to determine which subgenome each copy came from so that subgenome histories can be accurately reconstructed.

Step three, for which Rothfels is developing a mathematical model, requires inferring polyploid evolutionary histories, which twist and turn in unexpected paths.
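The second of these steps can be illustrated with a toy sketch in Python. The sequences, names, and functions below are hypothetical, and real studies use dedicated phasing and placement tools; the idea is simply that each phased sequence copy is assigned to the candidate parental subgenome whose reference it most resembles.

```python
# Toy illustration of subgenome assignment in a polyploid (hypothetical
# data; real pipelines use dedicated phasing and placement software).

def similarity(a: str, b: str) -> float:
    """Fraction of matching positions between two aligned sequences."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def assign_subgenomes(copies: dict, references: dict) -> dict:
    """Map each phased sequence copy to the best-matching parental reference."""
    assignment = {}
    for name, seq in copies.items():
        best = max(references, key=lambda ref: similarity(seq, references[ref]))
        assignment[name] = best
    return assignment

# Aligned toy sequences: a tetraploid with two phased copies of one locus,
# and reference sequences from two candidate diploid parents.
copies = {"copy_1": "ACGTACGTAA", "copy_2": "ACGAACGTTT"}
references = {"parent_A": "ACGTACGTAA", "parent_B": "ACGAACGTTA"}

print(assign_subgenomes(copies, references))
# copy_1 matches parent_A exactly; copy_2 is closer to parent_B
```

In practice this matching is done with probabilistic models over whole loci rather than raw identity, which is part of what makes the inference hard.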

In pursuit of these aims, Rothfels is enlisting data collection help from an army of students and citizen scientists through the online iNaturalist network, along with a newly established annual botany trip known as the Intermountain Botanical Foray.

"We held our first foray in June 2023, and the plan is to hold this yearly trip, open to plant fans of all walks of life, in a different Intermountain location each year," he says. "This past year, we visited the Desert Experimental Range in Millard County in southwestern Utah, where we logged more than 1,500 iNaturalist observations covering more than 200 species of plants."

In tandem with this effort, Rothfels is developing a curriculum in field botany to foster undergraduate learning, which he has introduced to students from USU Blanding who are participants in the yearly Native American Summer Mentorship Program.

"Our lab is working with USU's NASMP and MESAS (Mentoring and Encouraging Student Academic Success) programs to get students involved in the study of plants, including Indigenous knowledge," he says. "Our research is a collective effort, focused not just on scientific progress but on community building as well."

Visit link:

Seeing Double: USU Biologist Carl Rothfels is Developing Novel Polyploid Phylogenetics Tools - Utah State University

Department of Biology Special Seminar: Angela Hancock – The Hub at Johns Hopkins

Description

Angela Hancock, an independent group leader at the Max Planck Institute for Plant Breeding Research, will give a talk titled "Molecular Mechanisms of Adaptation to Novel Environments" for the Biology Department.

An organism's metabolism and growth are determined by a complex interplay of environmental signals and interacting molecular pathways. Angela Hancock's lab investigates how molecular response systems evolve in new and changing environments. They combine population genetics, functional genomics, computational modeling, and gene editing to deconstruct and reconstruct the molecular steps that enable adaptation to extreme environments.

This is a hybrid event; to attend virtually, use the Zoom link.

Read the original post:

Department of Biology Special Seminar: Angela Hancock - The Hub at Johns Hopkins

New Imaging Tool Advances Study of Lipid Biology – University of California San Diego

From fruit flies to humans, there are many, many different types and subtypes of lipids operating at the same time within any living organism. For example, the plasma portion of human blood is home to 600 different types of lipids. While we know that lipid molecules play myriad roles in health, aging and disease, researchers currently struggle to uncover the fine details of these roles, details which could unlock cures, extend the human healthspan, and solve mysteries of aging.

One big challenge is connected to the fact that multiple subtypes of lipids are often found within the very same cells. At the same time, there is no ideal tool for identifying and tracking the activity of specific lipids within individual cells. While there is a long list of existing techniques used to try to answer some of the toughest questions about the roles of specific lipid subtypes in individual cells and tissues, all current techniques have drawbacks.

Now, a study led by bioengineers at the University of California San Diego marks a significant step forward in this critical area of lipid research. The new work was published on February 21, 2024 in the journal Nature Communications. In this paper, the research team presents what they believe is the first method for distinguishing multiple lipid subtypes in cells and tissue samples by using nondestructive label-free optical imaging methods.

The new light-based imaging tool, called PRM-SRS, is a hyperspectral imaging platform. Its new capabilities are due, in part, to the fact that it integrates a Penalized Reference Matching (PRM) algorithm with Stimulated Raman Scattering (SRS) microscopy.
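The paper's exact algorithm is not reproduced here, but the general idea of matching measured spectra against references under a penalty can be sketched in Python. This is a minimal illustration with hypothetical reference spectra and function names, using ridge regularization as the penalty term, which may differ from the published method: each pixel's measured spectrum is decomposed into a weighted sum of known reference spectra.

```python
import numpy as np

def penalized_match(spectrum, refs, penalty=0.1):
    """Fit a pixel spectrum as a weighted sum of reference spectra.

    Solves min_w ||refs @ w - spectrum||^2 + penalty * ||w||^2,
    a ridge-penalized least squares with a closed-form solution.
    refs: (n_wavenumbers, n_references) matrix of reference spectra.
    """
    A = refs.T @ refs + penalty * np.eye(refs.shape[1])
    b = refs.T @ spectrum
    return np.linalg.solve(A, b)

# Two hypothetical lipid reference spectra over five wavenumber channels.
refs = np.array([
    [1.0, 0.0],
    [0.8, 0.1],
    [0.2, 0.9],
    [0.0, 1.0],
    [0.1, 0.3],
])

# A synthetic pixel: 70% of reference 0 plus 30% of reference 1.
pixel = refs @ np.array([0.7, 0.3])

weights = penalized_match(pixel, refs, penalty=1e-3)
print(weights.round(2))  # close to [0.7, 0.3]
```

The penalty stabilizes the fit when reference spectra overlap heavily, which is exactly the situation with chemically similar lipid subtypes.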

Use of this platform could profoundly deepen researchers' ability to understand the roles that different lipid subtypes are playing in a wide range of cells and tissues. This is particularly relevant because traditional imaging methods often fall short in capturing the intricate spatial distributions and metabolic dynamics of lipid subtypes, hampering efforts to unravel their significance in aging and various diseases.

"Our PRM-SRS platform represents a paradigm shift in lipid imaging, offering unparalleled capabilities in distinguishing lipid subtypes within complex biological environments," said UC San Diego bioengineering professor Lingyan Shi, the senior corresponding author. "As far as we know, we are presenting the first method for distinguishing multiple lipid subtypes in cells and tissue samples by using nondestructive label-free optical imaging methods." Shi was selected as an Alfred P. Sloan Research Fellow in 2023.

As described in the Nature Communications paper, the researchers harnessed PRM-SRS to visualize and identify distinct lipid subtypes across different organs and species including: high density lipoprotein particles in human kidneys; cholesterol-rich granule cells in mouse hippocampus; and subcellular distributions of two lipids (sphingosine and cardiolipin) in the human brain. In these demonstration cases, the PRM-SRS imaging platform unveiled unprecedented insights with enhanced chemical specificity (which is used to identify lipid subtypes) and subcellular resolution.

Multiplexed visualization: Unlike conventional methods, PRM-SRS enables the simultaneous visualization of multiple lipid subtypes from single label-free hyperspectral imaging (HSI) sets, expanding the scope of lipid research.

Diagnostic potential: Preliminary analyses of human kidney tissue samples suggest a potential application of PRM-SRS in diagnosing and prognosing renal diseases, such as offering non-invasive insights into dyslipidemia-associated conditions.

Neurological insights: By mapping lipid distributions in mouse and human brain tissues, PRM-SRS sheds light on the role of lipid metabolism in neurological disorders, paving the way for targeted therapeutic interventions.

Future directions: With its versatility and ease of implementation, PRM-SRS holds promise for diverse applications, from high-throughput studies to deep learning-enhanced imaging techniques, fostering a new era of multiplex cell and tissue imaging.

"Multi-molecular hyperspectral PRM-SRS microscopy," in Nature Communications: https://doi.org/10.1038/s41467-024-45576-6

This project is led by researchers in the Shu Chien-Gene Lay Department of Bioengineering at the UC San Diego Jacobs School of Engineering. This collaborative project includes important contributions from researchers from Northwestern University School of Medicine, the UC Irvine School of Medicine, Duke University's School of Medicine and Department of Biomedical Engineering, and Washington University in St. Louis.

Paper authors: Wenxu Zhang, Yajuan Li, Anthony A. Fung, Zhi Li, Hongje Jang, Honghao Zha, Xiaoping Chen, Fangyuan Gao, Jane Y. Wu, Huaxin Sheng, Junjie Yao, Dorota Skowronska-Krawczyk, Sanjay Jain, Lingyan Shi

The researchers report no conflict of interest.

Funding: UC San Diego startup funds, NIH R01GM149976, NIH U01AI167892, NIH 5R01NS111039, NIH R21NS125395, NIH U54DK134301, NIH U54HL165443, NIH U54CA132378, and a Hellman Fellow Award. We are grateful for the support of the Washington University Kidney Translational Research Center (KTRC) for kidney samples and the HuBMAP grant U54HL145608. We thank Dr. E. Bigio and Dr. M.-M. Mesulam from the Mesulam Center for Cognitive Neurology and Alzheimer's Disease (MCCNAD) for providing the de-identified autopsy brain samples; the MCCNAD is supported by NIH P30 AG013854. Work done in the D.S.K. laboratory is supported by an NEI P30 grant, P30EY034070-01, and in part by an unrestricted grant from Research to Prevent Blindness awarded to the Gavin Herbert Eye Institute.

This paper is also included in the NIH HuBMAP program's Nature series publication collection: Human BioMolecular Atlas Program (nature.com)

See original here:

New Imaging Tool Advances Study of Lipid Biology - University of California San Diego

Saving Biology With Blue Biotechnology – The Maritime Executive

The significance of the colour blue, representing the water bodies ("blue bodies") of the Earth, becomes evident when one considers how biologically unique our planet is. Water has been essential to countless ecological cycles and processes, in addition to providing the environments required for life to evolve and flourish.

However, the unrelenting quest of mankind for material gain and economic expansion has resulted in widespread pollution, habitat destruction, and mismanagement of water resources. From industrial discharge to plastic pollution, from overfishing to deforestation along waterways, the cumulative impacts of human actions have taken a heavy toll on our blue bodies. This article briefly summarizes the extent of the destruction inflicted upon these vital resources by human activities, as well as how blue biotechnology can serve as a transformative tool to heal aquatic resources and use them sustainably.

Earth's blue bodies, from ponds to oceans, serve as important sources of energy, food and health for living beings, and humans have been modifying them for thousands of years. Although these activities have been essential to the economic and social growth of humanity, they have also had a negative impact on the health of our blue bodies. In the majority of situations, water serves as the ultimate destination for the garbage, chemicals and other pollutants we release (Figure 1), and numerous studies have demonstrated that the rate of man-made water contamination has increased more than ever due to advances in industry and urbanization.

Figure 1 Anthropogenic causes of water pollution. The figure was created using free icons available from Flaticon at: http://www.flaticon.com

Many resources that assist humanity are found in blue bodies, and a sizable fraction of the world's population depends heavily on the ocean and coastline for its existence. It is to be noted that about 90% of the ocean's surface has been impacted by humans, which has resulted in a drastic decrease in marine biodiversity compared with 1970 levels. The ocean ecosystem's resilience to shocks, its potential to adjust to climate change, and its ability to fulfill its function as a global ecological and climate regulator are all being weakened by the loss of marine biodiversity. In addition, climate change research has revealed that, if greenhouse gas emissions continue to rise, the majority of marine species on Earth will be at risk of becoming extinct by the year 2100. Exploitation, dredging, trawling, and development of the coastline have also contributed to the loss of feeding and reproduction habitat in the marine ecosystem.

In recent years, several significant coral bleaching events have resulted in the disappearance of many corals due to water pollution, which raises the temperature and acidifies the water. This decrease in coral cover has resulted in a 60% decline in reef biodiversity and a negative impact on coastal populations. In summary, humans have been remarkably successful in exploiting the resources provided by the blue bodies; however, this has resulted in a multitude of natural disasters, including the extinction of numerous aquatic species and ecosystems, the decline of marine biodiversity, excessive or insufficient sedimentation of the sea, increasing coastal erosion, and so forth.

Water wars: an emerging reality?

It is widely believed that conflicts in the future will revolve around water. The sharing of international waterways is expected to give rise to these water wars, defined as armed confrontations between multiple nations over limited water supplies. While there is a persistent belief that water wars will remain a myth thanks to technological and resource advancements, statistical research and analyses indicate that this may not be the case. Several predictions suggest that between 2030 and 2050, the Earth's water distribution will change drastically, and the vast majority of the planet will not be able to replenish the water that mankind has consumed and contaminated. According to the World Population Clock 2024, the global population is expected to expand by 73 million people annually, a pace of about 0.91%.

But the amount of available fresh water is not increasing, and if we don't take the necessary actions, the same amount of water will have to be shared among more people. It is also important to remember that freshwater makes up only 3% of the world's water resources, and that a significant amount of it is extremely challenging for humans to access because it is locked up at the poles. Estimates suggest that mankind is presently consuming approximately half of the freshwater supply, and it is projected that this rate will grow within just a few decades. From the Lagash-Umma dispute over water and irrigation in ancient Sumeria in 2500 BC to Israel's retaliatory attacks on Gaza's water supplies in 2023, a total of 1,634 major conflicts have been recorded in the water conflict chronology database created by the Pacific Institute. This database itself serves as further evidence that water wars are a real issue rather than an imaginary one.

Blue Biotechnology

According to the Organization for Economic Cooperation and Development (OECD), blue biotechnology, or marine biotechnology, is defined as "the application of science and technology to living organisms from marine resources, as well as parts, products and models thereof, to alter living or non-living materials for the production of knowledge, goods and services." The primary focus of the current definition is on measures to enhance the accessibility of marine resources and how we can benefit from them. It is therefore essential to expand and redefine the term blue biotechnology as a field of biotechnology that uses technical advancements that can heal or restore the harm we have inflicted on the blue bodies, as well as assist us in utilizing them. Figure 2 summarizes the use of blue biotechnology for healing and sustainable resource utilization of blue bodies.

Blue biotechnology for healing our blue bodies

Blue bodies are the primary global recipient of contaminants, making water pollution a concerning and pressing issue. Various efficient biotech tools have proven useful in addressing water pollution (Figure 2). Utilizing the unique metabolic processes of bacteria, fungi, yeasts, microalgae, and microbial mats, bioremediation methods are a promising approach for cleaning up blue bodies. Bioremediation is carried out either through bioaugmentation, which introduces viable populations of microbes, or through biostimulation, which entails stimulating the native microbial population to biodegrade aquatic contaminants.

Furthermore, the potential of marine microorganisms for bioremediation has been enhanced with the aid of biotech tools. Genetic engineering approaches can modify the catabolic potential of organisms that are able to thrive and remain active in harsh environments or polluted areas. Using sensitive techniques such as the polymerase chain reaction (PCR), these microorganisms and their newly introduced catabolic genes can be tracked and even quantified. Remarkably, these bioremediation solutions are value-added, environmentally benign, and commercially viable.
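As an illustration of how such PCR-based quantification is commonly reported, here is a minimal sketch of the standard 2^-ΔΔCt fold-change calculation. The Ct values below are hypothetical, and this is the generic qPCR relative-quantification method rather than a protocol described in the article:

```python
def fold_change(ct_target_sample, ct_ref_sample,
                ct_target_control, ct_ref_control):
    """Relative abundance of a target gene via the 2^-ddCt method.

    Ct is the qPCR cycle threshold; lower Ct means more starting template.
    The target (e.g. an introduced catabolic gene) is normalized against
    a reference gene in both the treated sample and an untreated control.
    """
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical Ct values: the catabolic gene crosses threshold 3 cycles
# earlier (relative to the reference gene) in the bioaugmented sample
# than in the control, i.e. roughly an 8-fold enrichment.
print(fold_change(22.0, 18.0, 25.0, 18.0))  # 8.0
```

The method assumes near-perfect amplification efficiency (doubling per cycle); real assays calibrate this with standard curves.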

The blue bodies also serve as the carbon sink of our planet and have a vital role in resisting the effects of climate change. As per one estimate, since the onset of industrialization we have raised CO2 levels in the atmosphere by 50% and in the ocean's uppermost layer by 30%. This CO2 pollution puts thermal and chemical stress on our blue bodies, raising ocean temperatures and changing ocean chemistry, and adversely affecting marine biodiversity. Biological capture, the sequestration of carbon by microbes, has emerged as one of the most efficient and significant carbon sequestration techniques in the world today. This technique has also proven to be sustainable, cost-effective, and ecologically friendly. The potential of microalgae for metabolizing CO2 is 10-50 times greater than that of terrestrial plants.

Another biotech application is ocean fertilization, commonly referred to as ocean nutrition: the introduction of nutrients into the ocean to promote the growth of marine microorganisms. Ocean fertilization can promote phytoplankton development, which can sequester and store more CO2. Biotechnology is therefore an essential tool with enormous potential to repair the severe damage that humans have inflicted on the blue bodies. Because biotechnology is constantly advancing, we are becoming ever more capable of repairing the harm we have caused, even though challenges remain.

Blue biotechnology for sustainable use of aquatic resources

The discipline of blue biotechnology is a rapidly growing area of study that looks into the abundant biological resources present in our blue bodies for a range of uses in industry, science, and medicine. While humanity is benefiting from these resources, the declining rate of marine populations is alarming. Sustainable marine resource utilization can, however, be made possible through marine bioprospecting, defined as "the systematic inquiry for interesting and novel genes, metabolites, molecules, and organisms from the marine environment that might be useful to society and have economic potential for commercial product development." Currently, microorganisms account for almost 60% of the production of new marine natural products, and because of the wide range of possible genetic alterations of marine microbes, they are becoming more significant for sustainable blue biotechnology. Numerous nations, including the US, Japan and European countries, already have a thriving marine-based nutraceuticals business, and the market has grown significantly across the globe in the last ten years. In light of this, blue biotechnology has become crucial for meeting the growing demand for nutritious, high-quality seafood while conserving the diversity of marine resources.

Marine-derived therapeutics have gained huge importance in recent decades. For instance, sponges are known as a drug goldmine because of the enormous diversity and therapeutic potential of their secondary metabolites. Cytarabine, the first marine-derived anticancer drug to be produced for clinical use, was isolated from a sponge and is routinely used for the treatment of leukemia and lymphoma.

Due to the marine environment's relatively undiscovered biodiversity in comparison with the terrestrial environment, more medicinal compounds are now being isolated from the blue bodies.

Figure 2. Blue biotechnology (BT) for healing and sustainable resource utilization of Blue bodies. The figure was created using free icons available from Flaticon at: flaticon.com.

Moreover, the notion of a blue bioeconomy is becoming increasingly significant because of the enormous potential of marine resources to increase human well-being, and marine resource biotechnology has seen a steady increase in market-related applications in recent decades. Our ability to produce high-quality protein and financially significant materials for human welfare has improved because of the use of biotechnology tools in aquaculture, the commercial rearing of aquatic animals and plants under human intervention. Biotechnology is thus assisting in the resilience building and restoration of local marine populations, which will further enhance the overall conservation of the marine ecosystem. It also helps us promote a restoration culture, and safeguard and improve blue economy investments that depend on thriving marine ecosystems.

Concluding remark

Blue biotechnology has its foundations in the profound understanding that water is the cradle of all life. From pharmaceuticals to renewable energy, and from food security to environmental conservation, the applications of blue biotechnology are broad and far-reaching. Blue biotechnology is currently defined with an emphasis on using marine resources to advance humankind. However, it also provides viable strategies to mitigate anthropogenic impacts on our blue bodies. This sector thus offers opportunities both for the preservation of the environment and for the growth of humanity through the sustainable utilization of marine resources.

Furthermore, it is imperative to utilize blue biotech applications cautiously and effectively, while also imparting this knowledge to the younger generation. It would be beneficial to introduce biotechnological concepts into the school curriculum at very early levels, which would help build a foundational understanding and foster curiosity. Offering students the opportunity to engage in practical biotechnology experiments can deepen their learning and ignite their interest in the subject. In addition to teaching future generations about the ethical and regulatory boundaries governing biotechnological tools, it is crucial to emphasize their role in promoting sustainable resource utilization. This will instill an understanding of the importance of responsible innovation and adherence to regulations.

Thus, by acknowledging the equal importance of aquatic ecosystems alongside humanity, harnessing sustainable blue biotechnology can propel mankind's advancement while simultaneously preserving and restoring the blue bodies, in essence saving biology with biotechnology.

About the authors

Abhay H Pande is professor at the National Institute of Pharmaceutical Education & Research (NIPER), S.A.S Nagar, India, with more than 27 years of experience in biotechnology. This endeavor has resulted in an academic portfolio comprising numerous patents and articles in esteemed journals.

J Anakha is currently a doctoral researcher under the guidance of Professor Abhay H Pande at the NIPER, S.A.S Nagar, India. She holds a masters degree in Biochemistry and Molecular Biology from the Central University of Kerala.

Read more:

Saving Biology With Blue Biotechnology - The Maritime Executive

The Human Element: For Student Scientists, Learning to Place Biology in Social Context – Tufts Now

Last fall, 14 undergraduate biology majors gathered for three hours every Friday to grapple with the issues that have shaped their field over the centuries. Not so much the hypotheses and the lab work, but the human factors at play: the feelings, the interactions, and the biases, things rarely discussed as students study hard to become good scientists.

As the inaugural fellows in the year-long Civic Biology Fellowship program, they faced tough questions, learned new skills, and formed close bonds with the goal of making their field more equitable, welcoming, and better poised to collaborate with the communities it serves.

The course, taught by a team of faculty from several Tufts schools, was born of conversations among the biology department's Diversity, Equity, and Inclusion committee members, including Lauren Crowe, a lecturer in the Department of Biology. After it was formed in 2020, the committee wondered if there was room in the curriculum "to incorporate aspects of society and its relationship to biology," Crowe says.

Crowe reached out to longtime colleague, School of Dental Medicine Professor Jonathan Garlick, who is founding director of the Tufts Initiative in Civic Science and Dialogue Center. In addition to his lab-based work on stem cells and his research on health equity, Garlick focuses on civic science, which aims to strengthen connections between scientists and the public by making science more inclusive, diverse, and worthy of public trust. Those ideas are also reinforced in his role as director of science communications at Tufts Clinical and Translational Science Institute. But Garlick's previous civic science courses attracted mostly humanities students, Crowe says. "And we really wanted to bring in biology students."

They wondered how to excite biology students about community engagement; how to help them understand that their work affects the communities around them. They wanted to teach the students not to do harm by excluding underrepresented populations in their research.

"Scientists are generally trained to value academic knowledge over the experiential and cultural knowledge of community partners," Crowe says. "When biologists undervalue the lived experiences of those most affected by the research, they diminish their voices and exclude them as valid experts. And many times, biologists' research approaches reinforce their privilege."

The pandemic-era timing felt right to Garlick, who had researched the growing polarization around, and diminishing public faith in, scientific knowledge, as well as the broader awareness of the health inequities that impact communities of color.

Ultimately, Crowe and Garlick envisioned a one-year fellowship for second-, third-, and fourth-year biology majors. The first semester would feature classroom lectures, readings, case studies, and discussions. It would continue in an abridged form the second semester, augmented by an in-person internship "in community organizations that support equity-based work with underrepresented minority groups," Garlick says.

Students in the Civic Biology Fellowship do internships with organizations that support equity-based projects involving underrepresented communities. Above, Ada Yu, A26, meets with Chloe Yang, project manager of the ADAPT (Addressing Disparities in Asian Populations through Translational Research) Coalition at the start of Yu's internship. ADAPT is part of Tufts Clinical and Translational Science Institute. Photo: Jenna Schad

Their idea received funding through the provost's office. Crowe and Garlick recruited YouTube science communication expert and Friedman School of Nutrition Adjunct Professor Lara Hyde to co-facilitate with them, and put out a call for applications last spring. Fourteen students applied and were accepted into the Civic Biology program. The cohort specializes in a variety of biology topics and represents a diversity of backgrounds and life experiences, including religion, sexual orientation, and economic status.

The 14 fellows discussed topics such as de-colonizing science: placing historically marginalized populations at the center of biologists' work, to shift the focus away from exclusively Western approaches to knowledge. Facilitators aimed to broaden fellows' views by teaching them the social and historical contexts of research, and to introduce the spectrum of people who have contributed to our knowledge of biology, with the aim of including indigenous knowledge and creating more inclusive narratives.

They also talked about inclusive science communication and how diversity benefits science, and how white supremacy culture (the widespread ideology baked into society that whiteness holds value and that reinforces a racial hierarchy of power and control) has affected the field of biology.

During one class, Garlick and guest speaker Linda Hudson, assistant professor of public health and community medicine at Tufts University School of Medicine, asked the fellows how they would approach research involving a Native American community. In this case study, students were told, "You are not a person who is Native American-identified and the institution you are affiliated with has a history of missteps in clinical and research practice with the Native American community." They considered how best to take tissue samples from the people in the community and how to conduct a survey as they built trust with Native American parents and children.

"A lot of us were really stuck with this question," says Fellow Ayesha Lobo, A24, a dual-degree student who is majoring in biology through the School of Arts and Sciences, and fine arts through the School of the Museum of Fine Arts. "It's hard to say what's appropriate."

Though they were stuck, fellows could work through the question together within the carefully crafted safety of their classroom to gain a better understanding of the issues at play. The faculty followed the practice developed by Garlick known as the "dialogic classroom," where students feel safe enough to discuss uncomfortable topics even while disagreeing. Fellows made classroom agreements and drafted multiple codes of conduct for the class, says Lobo. "We spent a lot of time going over safety in the classroom because a lot of the discussion topics were pretty intense."

That process led to a closeness among the fellows, says Basil Hand, A25. "Being vulnerable was part of the class," they say. "I feel close with the people in this course in a way that I've never felt in a STEM course." Fellow Alice Rizkallah, A26, has a diverse group of friends, but grew particularly close with the Civic Biology class, whose discussions were unusually candid. Through them, she learned how deeply someone's upbringing shapes their perceptions, including perceptions of science.

Guest speakers talked about their work, identities, and the interplay between the two; their stories came up repeatedly throughout the semester's discussions. Carl Baty, executive director and cofounder of the nonprofit Rounding the Bases, spoke about the power of connection, Lobo says, as he talked about living with racism. "Carl's testimony in the classroom has come up again and again," Crowe says. Students had weekly readings and reflection prompts such as, "What is your role as a scientist?" and "What does it mean to be worthy of trust?"

Asking such big questions helps prepare students by teaching them cultural humility, inclusive communication, and other skills, which, in turn, equips them for complex leadership roles, competitive job markets, and building inclusive and diverse research teams in science laboratories, Garlick says. And by doing that, they could help individuals who feel disconnected from or distrustful of science feel more connected, trusting, and engaged. Crowe adds: "More than anything, these are trust-building skills, relationship skills, storytelling skills."

Students say the co-facilitators also taught these skills by example. Crowe, Garlick, and Hyde, who are white, addressed their own position, privilege, and power in relation to white supremacy culture. Garlick explained that he feels accountable as a scientist to do no harm, either intentionally or unintentionally, and that this requires training. Facilitators encouraged a stance of humility, and of understanding the importance of being quiet and listening, Hyde says.

Hand, Rizkallah, and Lobo say the fellowship has already shifted their ideas about biology and its place in society. "I think it's really important to integrate cultural humility into science, because a lot of us are pre-med or pre-dental," Lobo says. "We're going to be working with a lot of diverse patients."

They agree that the fellowship's tenets and approaches should be required learning for Tufts biology students. "I think this class is so necessary," Rizkallah says. "I have learned so much about the importance of being worthy of trust. It won't be a coworker's job or a patient's job to trust me later in life just because I'm a medical professional."

Continued here:

The Human Element: For Student Scientists, Learning to Place Biology in Social Context - Tufts Now