The iPhone Is a Part of Human Anatomy – NYU Washington Square News

Henry Cohen, Staff Writer April 17, 2017

Human dependency on the smartphone has become an accepted part of life in the first world. iPhone Separation Anxiety is a very real effect of being deprived of your smartphone for extended periods of time. Trivial as it may sound, not having your phone within reach can result in higher blood pressure, increased heart rate, worsened anxiety and poor cognitive performance, according to Psychology Today.

In a CBS News interview, addictive behavior psychologist Dr. Harris Stratyner said that many people subconsciously treat smartphones as an extension of their bodies. "We can literally feel almost as if we are disembodied from an extension of ourselves," Stratyner said. "We don't feel the same ability to be individuals that we are with our iPhone, because we have become so dependent on that being a part of our knowledge base." Smartphones have become a huge part of how American adults (as many as 77 percent, according to a 2017 Pew Research Center study) interact with the world. They perfect our perception of time, give us full access to the wealth of human knowledge that is the internet, remind us of appointments and let us communicate with anyone, anywhere, at any time; they can even tell us what the weather is going to be tomorrow at 3 p.m. In short, they are enhancements to our human abilities, manifested in a slim block of metal and glass.

It may be difficult to see the iPhone as a true part of the human anatomy, but it is no different from a prosthetic leg or glass eye. It is always at hand: not physically a part of us, but rarely apart from our person, in much the same way that a prosthetic leg can be removed but is a part of the body when it is attached. Both the leg and the smartphone make up for some deficiency in the person who uses them. In the case of the prosthetic, it is the lack of a leg; in the case of the phone, it is our inability to naturally perform tasks such as taking photos and playing music wherever we are.

Transhumanist thinkers like Zoltan Istvan and Daniel Dennett have long advocated for and predicted the rise of a new brand of humanity, one enhanced by technology such that we can effectively accelerate our own evolution. While some outspoken critics like Francis Fukuyama have decried the dangers of transhumanism, this process is clearly already underway. Is having all earthly knowledge at our fingertips comparable to having a superpower? What about a human who can participate in a dozen text conversations at once spanning hundreds of miles in an instant? The smartphone represents the first and most successful step towards an entirely new variety of human, one that is almost a different species from those that came before and is capable of anything.

Opinions expressed on the editorial pages are not necessarily those of WSN, and our publication of opinions is not an endorsement of them. A version of this article appeared in the Monday, April 17 print edition.


Anatomy of a Disney musical: Composer reveals process of creating iconic songs – ABC Online

Posted April 17, 2017 14:25:39

You probably don't know Alan Menken's name, but you do know his songs.

The acclaimed composer and songwriter is the hidden face behind some of Disney's most iconic musicals, including Beauty and the Beast, The Little Mermaid and Aladdin.

His body of work has earned him eight Oscars and 16 Golden Globes, and he's not done yet.

Mr Menken is currently in Australia for the premiere of the stage musical of Aladdin and will head home soon to work on the upcoming live action film version of the story.

So how does he create a Disney song? And what is the process for bringing a fledgling idea all the way to the big screen?

The first thing to know is that for most Disney animations, the musical dramatists (aka the songwriters) are brought in by the studio first.

"That's something people don't know," Mr Menken told ABC News Breakfast.

"Disney will say, 'We want to tell this story' and then we say, 'OK, how do we do it?'.

"Generally you go from the basic story, to the basic structure of telling the story, to what musical style you're going to use to tell the story with, where the songs lie in that structure, then one by one tackle those songs."

Once the musical dramatists have those key song ideas in place, the script and the storyboard are put together.

Then it's up to Mr Menken and the songwriters to work with the animation team to bring the songs to life.

The musical number A Whole New World was a hit in the 1992 animated Aladdin and went on to win the Academy Award for best original song, as well as the Grammy for song of the year.

When it came to creating that song, Mr Menken said the music came first, then the lyrics, and then finally a discussion was had about the visuals.

"We ask: Is it going to be a montage? Are they going to sing to each other? What are we seeing visually in it?"

"And it's a collaboration again with the animators about what is actually happening [on screen]."

According to Mr Menken, a good song should combine with good visuals to hold the film together.

He said that while the composer and animators almost always collaborated well, sometimes their focus would be in different areas.

"In animation sometimes they'll be concentrating on a visual they really want to do and we'll be concentrating on a plot point we really want to push forward in order to support a song and there can be a little bit of creative collaboration.

"It's almost never contentious.

"There are also times when I'll say, 'You know what? We need this score and this project needs this kind of song'.

"And then we'll say, 'OK we're going to write that song and then once that's done you look at that and come back about how you're going to alter the visuals'."

A range of live action remakes of the original films, incorporating some of the original songs, has come out in recent years, and many more are due for release, including Aladdin and The Lion King.

But after 50 years in the business, Mr Menken has learnt a key lesson.

Having written more than 40 musicals, he has developed a pretty good feel for the business.

"I invite opinions but I have a pretty good sense of when it's right," he said.

"But I learnt a long time ago never to get invested in any song that I write.

"Because it's not a matter of the quality, it's simply about the nature of the song and whether it really hits the sweet spot for the people you're collaborating with."

This professional distance means that despite the awards, Mr Menken tries not to celebrate or mourn any of his songs too much.

"For me, you just keep writing new ones and when people like it, hey be grateful for it," he said.

And when it comes to seeing his musical numbers on the big screen for the first time?

"I feel good. I feel like I've done my job."

It's all about hitting that sweet spot, according to Mr Menken, even if he can't always predict what that will be.

"I've had songs that I thought were kind of a dumb song, but it just hit the sweet spot," he said.

"Like there was a song in [2010 animated film] Tangled called I've Got a Dream.

"People love that song, but to me it's like, 'Oh my God it's a dumb song, but it works'."

More recently, Mr Menken has been tasked with writing new songs for classic Disney stories.

He collaborated on this year's live action Beauty and the Beast and wrote three new songs that fleshed out the new adaptation.

He said he wasn't upset if some fans of the original didn't immediately warm to the additions.

"I know that it will grow over time and the movie holds together," he said.

"And if the movie holds together the songs are doing their job.

"I liken myself to being an architect. I design structures that others will build and live in."



Why do some of us find it easier to forgive? Neuroscience sheds light – Medical News Today

Whether we condemn the villain in a movie or feel that somebody has wronged us personally, many of us make moral judgments on a daily basis. From a neuropsychological viewpoint, the act of judging a moral situation is incredibly complex and has a lot to do with intentionality - did the perpetrator really mean to do those awful things? What happens in our brain when we know that whoever caused the harm did so unintentionally? New research investigates the neuroanatomical basis of forgiveness.

The new study examines the role of a brain area called the anterior superior temporal sulcus (aSTS) in forgiving those who make unintentional mistakes.

The researchers were led by Giorgia Silani from the University of Vienna in Austria, and the study was carried out in collaboration with scientists from Trieste University in Italy and Boston College in Massachusetts. The findings were recently published in the journal Scientific Reports.

As the authors explain, making a mature moral judgment about a wrongful act involves not only considering the damage done, but also the perpetrator's intention and mental state. When there is a clear contradiction between the two, however, intention seems to take precedence over the result of the action.

Indrajeet Patil, the study's primary author, details this further and puts the new research into context:

"Behavioural studies have already shown that when the intention and outcome of an action are conflicting, as in the case of sometimes serious accidental harm, people tend to focus mainly on the intentions when formulating a judgment. And this is more or less a universal feature of mature moral judgments across cultures," Patil explains.

"To date, however, very few studies have taken on this issue from an anatomical point of view, to gain an understanding of whether differences in the volume and structure of certain areas of the brain might explain variations in moral judgment. This research attempted to explore precisely this aspect."

To do this, the researchers asked 50 participants to complete a moral judgment task. The volunteers were presented with 36 unique stories and four potential outcomes for each of them.

Each scenario comprised four parts: some background information; a so-called foreshadowing segment, in which it was suggested that the outcome would be either neutral or harmful; information on the neutral or intentionally harmful mental state of the agent; and, finally, the consequence, which revealed the agent's action and the resulting outcome.

Participants read each story and were asked to give their moral judgment by answering questions regarding "acceptability" and "blame." Namely, the participants were asked: "How morally acceptable was [the agent]'s behavior?" and "How much blame does [the agent] deserve?" The volunteers gave answers based on a scale from 1 to 7.

In addition, the participants' brain anatomy was analyzed using voxel-based morphometry - a neuroimaging technique that allows for a holistic examination of structural brain differences while simultaneously preserving a high degree of brain region specificity.

The researchers also used neuroimaging to localize the neural areas responsible for the so-called theory of mind (ToM). ToM, or "mentalizing," is a person's ability to correctly attribute mental states - such as beliefs, intentions, and desires - to others based on their behavior. Mentalizing also refers to the person's ability to explain and predict other people's behavior based on these inferences.

The results revealed a connection between differences in the severity of moral judgments about unintentional harm and the volume of the left aSTS brain region.

More specifically, the more developed the aSTS was, the less blame was attributed to the wrongdoers. "The greater the gray matter volume [in this area], the less accidental harm-doers are condemned," the authors write.

Patil further explains the findings:

"The aSTS was already known to be involved in the ability to represent the mental states (thoughts, beliefs, desires, etc.) of others. According to our conclusions, individuals with more gray matter at aSTS are better able to represent the mental state of those responsible for actions and thus comprehend the unintentional nature of the harm. In expressing judgment they are thus able to focus on this latter aspect and give it priority over the especially unpleasant consequences of the action. For this reason, ultimately, they are less inclined to condemn it severely."

This study opens up new avenues for neuroscientific research. Patil and colleagues recommend that further studies use more realistic contexts to study moral judgments, as well as more demographically diverse study samples.



This Week in Neuroscience News 4/16/17 – ReliaWire

More brain stimulation news came this week when researchers at the University of Zurich pinpointed the brain mechanism that regulates decisions between honesty and self-interest. Using transcranial direct current stimulation, they could even increase honest behavior.

The work highlights a deliberation process between honesty and self-interest in the right dorsolateral prefrontal cortex (rDLPFC).

Christian Ruff, UZH Professor of Neuroeconomics, said:

This finding suggests that the stimulation mainly reduced cheating in participants who actually experienced a moral conflict, but did not influence the decision-making process in those who were committed to maximizing their earnings. These brain processes could lie at the heart of individual differences and possibly pathologies of honest behavior.

When researchers applied transcranial direct current stimulation over a region in the right dorsolateral prefrontal cortex during a dice rolling task, participants were less likely to cheat. However, the number of consistent cheaters remained the same. (Michel André Maréchal et al., Increasing honesty in humans with noninvasive brain stimulation)

Another story featuring the prefrontal cortex showed that its neurons helped teach the hippocampus to process memories. The research looked at memory flexibility and interference, the mechanisms by which the brain interprets events and anticipates their likely outcomes.

The study was by Matthew Shapiro, PhD, from Icahn School of Medicine at Mount Sinai. The results suggest that neurons in the medial prefrontal cortex instruct hippocampal neurons to learn rules which differentiate memory-based predictions in otherwise identical situations. The mechanisms revealed could improve understanding of psychiatric conditions, such as schizophrenia, that involve hippocampal and prefrontal cortex interactions.

NIH-funded research involving 446 children reported that precision medicine could offer insight into differences in treatment response among patients with childhood absence epilepsy. Childhood absence epilepsy (CAE) is the most common form of pediatric epilepsy.

The results suggest knowledge of specific gene variants in children with CAE may help predict what drugs would work best for them. For example, two specific forms of the calcium channel genes appeared more often in children for whom ethosuximide did not work. Two other variants of the calcium channel genes were found in children for whom lamotrigine did work, but one form of the drug transporter gene was associated with a continuation of seizures. (Glauser TA et al. Pharmacogenetics of Antiepileptic Drug Efficacy in Childhood Absence Epilepsy. Annals of Neurology. March 25, 2017)

A new study published this week in the Proceedings of the National Academy of Sciences hones our understanding of a uniquely human skill: the ability to instantaneously assess a new environment and get oriented thanks to visual cues.

Whereas humans can look at a complex landscape like a mountain vista and almost immediately orient themselves to navigate its multiple regions over long distances, other mammals such as rodents orient gradually, using physical cues that build up over time, like approaching and sniffing a wall.

The way humans navigate their surroundings and understand their relative position includes an environment-dependent scaling mechanism, an adaptive coordinate system with differences from other mammals, according to the study led by researchers at The University of Texas at Austin.

"Our research, based on human data, redefines the fundamental properties of the internal coordinate system," said Zoltan Nadasdy, lead author of the study and an adjunct assistant professor in the university's Department of Psychology.

"Dysfunction in this system causes memory problems and disorientation, such as we see in Alzheimer's disease and age-related decline. So, it's vital that we continue to further our understanding of this part of the brain," he said.

By measuring brain activity in the entorhinal cortex, the researchers identified three previously unknown traits of the system. (Zoltan Nadasdy et al., Context-dependent spatially periodic activity in the human entorhinal cortex)

A pair of preclinical studies suggests that silencing the SCA2 gene using antisense oligonucleotide therapy may help prevent neurological symptoms associated with spinocerebellar ataxia type 2 and amyotrophic lateral sclerosis.

Finally, in Sweden, Karolinska Institute researchers report a method to force astrocytes to transmute into dopamine neurons that work like normal midbrain dopaminergic neurons. The finding could be the first step in an alternative therapeutic approach for Parkinson's disease.



Why Human Behavior is Hurting Honey Bees – Entomology Today

The Varroa destructor mite (shown above attached to a bee) is a widespread parasite of European honey bees (Apis mellifera). Poor management practices have enabled the spread of V. destructor and other bee pathogens, an Australian bee researcher argues. (Photo credit: Stephen Ausmus, USDA Agricultural Research Service, Bugwood.org)

In the search for answers to the complex health problems and colony losses experienced by honey bees in recent years, it may be time for professionals and hobbyists in the beekeeping industry to look in the mirror.

In a research essay published last week in the Journal of Economic Entomology, Robert Owen argues that human activity is a key driver in the spread of pathogens afflicting the European honey bee (Apis mellifera) and recommends a series of collective actions necessary to stem their spread. While some research seeks a magic bullet solution to honey bee maladies such as Colony Collapse Disorder, many of the problems are caused by human action and can only be mitigated by changes in human behavior, Owen says.

Owen is author of The Australian Beekeeping Handbook, owner of a beekeeping supply company, and a Ph.D. candidate at the Centre of Excellence for Biosecurity Risk Analysis at the University of Melbourne. In his essay in the Journal of Economic Entomology, he outlines an array of human-driven factors that have enabled the spread of honey bee pathogens.

Owen also offers several suggestions for changes in human behavior to improve honey bee health.

"The problems facing honeybees today are complex and will not be easy to mitigate," says Owen. "The role of inappropriate human action in the spread of pathogens and the resulting high numbers of colony losses needs to be brought to the fore of management and policy decisions if we are to reduce colony losses to acceptable levels."



New physiology major announced for fall 2017 – The Aquinas

Annie Kennedy Staff Writer

The University is introducing a new major in physiology in the fall of 2017. The major, housed in the biology department, is open to current and incoming first-year students.

Photo courtesy of Wikimedia Commons. A new physiology major will be available for students at The University for the upcoming fall semester.

The major is also open to current exercise science first-year students. Because they take anatomy and physiology as first-year students, they will take general biology I and II in their sophomore year.

Students have already switched to the major. Stephanie Nativo, a first-year student, switched to prepare herself to become a physician assistant.

"I feel great knowing that I am one of the first students to graduate with this major and hope that it will inspire more students to follow their passions," she said.

Terrence E. Sweeney, Ph.D., is the program director, chair of the biology department and the person who brought the major to The University. He first proposed the physiology major in June 2015, and then he introduced it to the department in the fall 2015 semester. Last month, the new major was approved by the department, the dean and the faculty senate. Additional faculty who helped develop the major include Gary Kwiecinski, Ph.D., Matthew Socha, Ph.D., Maria Squire, Ph.D. and Robert Waldek, Ph.D.

Sweeney explained that there are not many schools in the Northeast that offer a physiology major. He believes that this new major will help draw students to The University who may not have considered this school in the first place.

An article by Erik J. Henriksen, Ph.D., published in the review journal Physiology, discussed the growth of physiology majors at other universities in the country.

"The growth of these physiology programs has far exceeded the increases observed in overall undergraduate enrollments at these institutions," Henriksen wrote.

Students who are not interested or who cannot switch their major to physiology can still take general physiology because the department plans to continue to offer many sections of this popular course. For those who are concerned that this major is too focused on physiology and will not provide enough of a background in biology, Sweeney explained that they have addressed this concern in two ways.

First, physiology majors will take general biology as first-year students, which will provide them with a broad background in all of biology. Second, they will be encouraged to take a broad variety of electives in their junior and senior years. Additionally, Sweeney noted that students who do not want this specificity will be encouraged to consider the biology major because the physiology major is designed for students who want a more specific approach to this field.

Students in the major begin their college career the same way as many other science majors: by taking general biology and general chemistry courses. In their sophomore year, students take advanced human anatomy and physiology I and II, and in their junior year, they take cellular and integrative physiology with lab. Additionally, in their junior year, students take a seminar designed to introduce students to the latest techniques used in physiology research. Finally, in the spring of their junior year and in their senior year, students take 12 credits of physiology electives in three domains: molecular and cellular physiology, systems physiology and comparative physiology.

If anyone has any questions about this major, he or she is encouraged to contact Sweeney at terrence.sweeney@scranton.edu.


Is Neuroscience Limited by Tools or Ideas? – Scientific American

Intricate, symmetric patterns, in tiles and stucco, cover the walls and ceilings of Alhambra, the red fort, the dreamlike castle of the medieval Moorish kings of Andalusia. Seemingly endless in variety, the two-dimensionally periodic patterns are nevertheless governed by the mathematical principles of group theory and can be classified into a finite number of types: precisely seventeen, as shown by Russian crystallographer Evgraf Fedorov. The artists of medieval Andalusia are unlikely to have been aware of the mathematics of space groups, and Fedorov was unaware of the art of Alhambra. The two worlds met in the 1943 PhD thesis of Swiss astronomer Edith Alice Müller, who counted eleven of the seventeen planar groups in the adornments of the palace (more have been counted since). All seventeen space groups can also be found in the periodic patterns of Japanese wallpaper.

Without conscious intent or explicit knowledge, the creations of artists across cultures at different times nevertheless had to conform to the constraints of periodicity in two-dimensional Euclidean space, and were thus subject to mathematically precise theory. Does the same apply to the "endless forms most beautiful" created by the biological evolutionary process? Are there theoretical principles, ideally ones which may be formulated in mathematical terms, underlying the bewildering complexity of biological phenomena? Without the guidance of such principles, we are only generating ever larger digital butterfly collections with ever better tools. In a recent article, Krakauer and colleagues argue that by marginalizing ethology, the study of adaptive behaviors of animals in their natural settings, modern neuroscience has lost a key theoretical framework. The conceptual framework of ethology contains in it the seeds of a future mathematical theory that might unify neurobiological complexity, as Fedorov's theory of wallpaper groups unified the patterns of the Alhambra.

The contemporary lack of ethological analysis is part of a larger deficit. Darwin's theory of natural selection, arguably the most important theoretical framework in biology, is prominent by its absence in modern neuroscience. Darwin's theory has two main tenets: the unguided generation of heritable variation, and the selection of such variation by an environmental niche to produce adaptive traits in organisms. The general role of the animal brain is to enable adaptive behaviors. It is reasonable to argue that a study of these adaptive behaviors (natural behaviors) should guide the experimental study of brain circuits. Indeed, this was the premise of the field of ethology developed by Tinbergen, Lorenz and von Frisch in the mid-twentieth century. The observational field studies of core natural behaviors such as mating, aggression and critical-period learning by ethologists enabled the subsequent elucidation of the underlying neural circuitry by neuroethologists.

In contrast to this empirical method of observing a freely behaving animal in its adaptive niche (natural settings) is the controlled experimental approach developed by Pavlov and Skinner to study conditioned behaviors, and psychophysical tests developed by experimental psychologists to characterize perception. This approach draws inspiration from physics, with its emphasis on isolating a system from external influences. The animal is placed in a controlled environment, subjected to simple stimuli and highly constrained in its behavior (e.g., forced to choose between two alternatives). The majority of contemporary neuroscientific studies use the controlled experimental approach to behavior, with ethological analysis taking a back seat. Krakauer et al. argue that all of the emphasis on tool building and gathering large neural data sets, while neglecting ethological grounding, has led the field astray.

The rationale of the current approach is that detailed recordings of neural activity (neural correlates) associated with behavior, and interventions in the behavior by suitable circuit manipulations, go beyond mere description of behavior and therefore provide greater explanatory power. Krakauer et al. challenge this school of thought and argue that neither method is fruitful without first understanding natural behaviors in their own right, to set a theoretical context and guide experimental design. The tools to record and manipulate neural activity cannot substitute for ethological analysis, and may even impede progress by providing a false narrative of causal-interventionist explanation.

Misplaced concreteness in recording and manipulating neural activity can lead to the mereological fallacy, which incorrectly attributes to a part of a system a property of the whole system. The authors point to the popular mirror neurons as an example. Mirror neurons show the same activity when a primate performs a task as when the primate observes a different actor performing the task. However, this partial match between neural activities does not by itself imply any similarity of psychological state between the observer and the actor. It would therefore be a conceptual error to use the activity of the mirror neurons as an interchangeable proxy for the psychological state. Krakauer et al. hold that such an error is prevalent in the literature.

Generally, it is impossible to obtain a complete system-wide measurement of neural activity. Even the best current efforts to measure the activity of thousands of neurons fall far short of recording the electrical activity of entire nervous systems, including all of the axons, dendrites and chemical messages. There is no escape from the need to generalize from partial neural observations. These generalizations are fragile and may not provide any insight into the adaptive behaviors unless the experiments are carefully designed, taking those behaviors explicitly into account. Ignoring Darwin is not a good recipe for success in gaining biological understanding. Conversely, the authors draw upon studies of bradykinesia in Parkinson's disease, sound localization in barn owls, navigation in electric fish and motor learning to show that ethologically informed experimental design coupled with neural activity measurements and perturbations can lead to better insight.

The call to re-focus on natural behaviors is timely but not really controversial. However, Krakauer et al. proceed to make stronger claims regarding behaviors as emergent phenomena that cannot even in principle be explained in neural terms. Here they are on shakier ground. Quasi-mystical claims regarding emergence in biology are endemic in the literature and uncomfortably echo discarded notions of Cartesian dualism and Bergsonian vitalism. In support of their argument, Krakauer et al. refer to the collective behavior of flocks of birds, which exhibit large-scale spatiotemporal patterns (murmurations) not obvious from the behavior of one or a few birds. The fallacy of the argument is starkly evident in an amplifying commentary in The Atlantic on Krakauer's article, where it is noted that the patterns can be reproduced in simple models of flocking with elementary rules dictating the flight behavior of individual birds in the context of their neighbors. This is in keeping with innumerable studies throughout the twentieth century: it has been repeatedly observed that seemingly complex patterns can be explained by simple, local rules.

These exercises demonstrate that complex collective behavior of systems can indeed be explained by simple rules of interactions between the elements of the system. Snatching defeat from the jaws of victory, the Atlantic commentary concludes that you would never have been able to predict the latter (i.e., the flocking patterns) from the former (the simple rules). But this was precisely what was done by the computer models cited: the flocking patterns were predicted by the simple rules! Perhaps what is implied is that the outcome of the model is not obvious in a subjective sense: i.e., we may not be able to do the math in our heads to connect the dots between the interaction rules and the collective behaviors (though this can be disputed; one can indeed build the relevant intuition using appropriate theoretical, paper-and-pencil calculations of a pre-computer-age, nineteenth-century variety). However, that is a statement about our subjective feelings about the topic and has no bearing on the in-principle question as to whether simple interaction rules lead to complex macroscopic behaviors. We now understand that they can. Working out the connections between the microscopic details and macroscopic behaviors may be practically challenging, but much theoretical progress has been made on this topic, and no in-principle explanatory gap exists between the microscopic and the macroscopic. Leaving aside the canard of emergence, Krakauer et al. have hit upon a central issue that bears amplification. The problem with the mechanistic-reductionist explanation of nervous systems is not that there is an in-principle gap between microscopic neuronal details and macroscopic behaviors (emergence), but that this style of explanation is largely divorced from Darwin's theory of natural selection. This is particularly evident in the lack of niche-adaptive behaviors in driving experimental design, as pointed out by Krakauer et al.
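The flocking case can be made concrete with a minimal Boids-style sketch. This is an illustrative toy, not the specific model cited in the Atlantic commentary, and every rule weight and radius below is an arbitrary choice for demonstration: each simulated bird steers by three purely local rules (align with neighbors, move toward their center, avoid crowding), and a global order parameter, the flock's polarization, can be tracked to watch collective structure arise from rules that nowhere mention the flock.

```python
import numpy as np

def simulate_boids(n=50, steps=200, seed=0, neighbor_radius=0.2,
                   separation_radius=0.05, alignment_w=0.05,
                   cohesion_w=0.01, separation_w=0.05, speed=0.01):
    """Minimal Boids sketch on a unit torus. Each bird follows three
    local rules (align, cohere, separate); no rule refers to the flock
    as a whole. Returns final positions and per-step polarization,
    which is 1.0 when all headings agree and near 0 when random."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 1.0, (n, 2))
    angle = rng.uniform(0.0, 2.0 * np.pi, n)
    vel = speed * np.column_stack([np.cos(angle), np.sin(angle)])
    polarization = []
    for _ in range(steps):
        disp = pos[None, :, :] - pos[:, None, :]  # disp[i, j] points i -> j
        disp -= np.round(disp)                    # shortest vector on the torus
        dist = np.linalg.norm(disp, axis=2)
        np.fill_diagonal(dist, np.inf)            # a bird is not its own neighbor
        for i in range(n):
            js = np.where(dist[i] < neighbor_radius)[0]
            if js.size == 0:
                continue
            steer = alignment_w * (vel[js].mean(axis=0) - vel[i])    # align headings
            steer += cohesion_w * disp[i, js].mean(axis=0)           # drift toward neighbors
            close = js[dist[i, js] < separation_radius]
            if close.size:
                steer -= separation_w * disp[i, close].mean(axis=0)  # avoid crowding
            vel[i] += steer
            vel[i] *= speed / (np.linalg.norm(vel[i]) + 1e-12)       # keep constant speed
        pos = (pos + vel) % 1.0
        polarization.append(float(np.linalg.norm(vel.mean(axis=0)) / speed))
    return pos, polarization

pos, pol = simulate_boids()
```

With random initial headings the polarization typically starts near 1/sqrt(n) and rises as local alignment propagates through overlapping neighborhoods; the point of the exercise is that the macroscopic pattern is computed from, and hence predicted by, the microscopic rules alone.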
As is customary in the neuroscience literature, in contrasting the how (mechanistic) style of explanation with the why (adaptive) style of explanation of behavior, Krakauer et al. invoke David Marr's computational level of analysis and Tinbergen's ultimate causes. Marr defines three levels of analysis: computation (the problem to be solved), algorithm (the rules) and implementation (the physical substrate). Tinbergen's analysis of behavior is separated into proximate or mechanistic explanations and ultimate or adaptive explanations. However, one might as well go directly to Darwin, since the context is broader than that of computational explanations or ethology, and originates in a fundamental tension between the biological and physical sciences.

Questions regarding function (in the English dictionary sense of purpose) belong exclusively to the biological domain. The exorcism of teleological considerations was central to modern physics; an explanation such as "the purpose of the sun is to give light" has no place in a physics textbook. Yet a statement with the same epistemological status, that "the function of hemoglobin is to transport oxygen," would be completely uncontroversial in a biology textbook. This cognitive dissonance between the status of teleological explanations in the two sciences has historical roots. Aristotle's biological teleology stood in contrast with Democritus' physics-style atomism. The teleology-atomism contrast in understanding nature is not special to classical Greek philosophy and occurs, for example, in classical Hindu philosophy. The role of function in the dictionary sense of purpose continues to be debated in the contemporary philosophy of biology. The working neuroscientist may regard these philosophical discussions as a waste of time (or worse, as crypto-vitalism). However, as the recent controversy over defining DNA function in the ENCODE project shows, lack of agreement about function has practical consequences for the scientific community.

A more satisfactory treatment of function could dispel much of the theoretical confusion in understanding brain complexity. Coherent conceptual accounts already exist. Card-carrying biologists like Ernst Mayr have distinguished between cosmic teleology, corresponding to an inherent purposefulness of Nature that has no place in modern science, and teleonomy, or apparent purposefulness instantiated in genetic programs evolved through natural selection. Animal behavior within the lifetime of an individual is highly purposeful, executing programmed behaviors adapted to ecological niches. The program of instructions or the genetic code itself of course changes over evolutionary time scales. Developmental and adult plasticity of the nervous system does not fundamentally negate the existence of species-specific adaptive behaviors; indeed, plasticity itself is an evolved species-specific mechanism (as is illustrated by the convergent evolution of vocal learning in multiple taxa including humans and songbirds).

Fragments of a theory of design that deals squarely with teleonomic issues exist, including the ethological considerations and computationalist accounts referred to in Krakauer's article. However, without a more robust, mathematically sound and conceptually coherent theoretical enterprise that has better explanatory power and provides guidance to experimental design, we are likely to be staring for a long time at the intricate patterns of neurobiological wallpaper without uncovering the underlying simplicities.

What is the way forward? Fedorov discovered the mathematics of space groups governing the patterns of the Alhambra by studying crystals rather than by visiting the palace. It is possible that the underlying mathematical principles that govern apparently purposeful biological systems have their own intrinsic logic and may be discovered independently in a different domain. This is indeed the hope of researchers in the field of modern Machine Learning, who aim to discover the abstract principles of intelligence in a technological context largely removed from neuroscience. Human engineers, in trying to solve problems that often resemble those that animal nervous systems may have encountered in their adaptive niches, have come up with mathematically principled theoretical frameworks. These engineering theories classically include the three Cs (Communications, Computation and Control), to which one should add Statistics or Machine Learning. These theories are taught in different departments in universities, but the modern context of interconnected systems and distributed networks has also brought the disciplines together into a mix that is ripe for connecting to neuroscience.

In terms of engineering metaphors in neuroscience, the computer has dominated, as can be seen from the discussions in Krakauer's article. This may be a mistake: while the computer is no doubt the most popular textbook metaphor for the brain, theories of computation as substantiated by Turing's model or von Neumann's computer architecture separating processors from memory have been singularly unsuccessful in providing biological insight into brain function or experimental guidance to the practicing neuroscientist. This may also provide a simple explanation for the negative results obtained in the recent study by Jonas and Kording, where standard analysis methods used by neuroscientists were unsuccessful in shedding light on the architecture of a computer programmed to play a video game.

This study has led to much self-flagellation, but the neuroscience data analysis methods have actually been quite successful in exploring a different engineering metaphor for nervous systems, namely signal and image processing, usually studied in the context of communications or control. Paradigmatic of this success is our understanding of the primate visual system, understanding that has now borne fruit in a multi-billion-dollar Machine Vision industry. If the neuroscience data analysis methods fail in understanding a von Neumann computer architecture separating processor from memory and using precise elements, it is not such a big deal, since no one expects the brain to conform to that architecture anyway. It is telling that the modern advances in Machine Learning have come from an abandonment of the digital, rule-based AI methods of traditional computer science, and an adoption of analog methods based on linear algebra and probability theory, more in the domain of statisticians, physicists and control theorists. Calling for interdisciplinary research is a cliché, but the theoretical framework we need for neuroscience is unlikely to be based in an existing academic department.

Modern neuroscience needs pluralism not only in the epistemological levels of analysis, as Krakauer et al. call for, but also in the diversity of species studied. The biological counterpart to engineering theorizing is the comparative method, which looks at a broad range of species across taxa to find cross-cutting principles. The comparative method has been in decline for decades, under pressure from the expansion of studies of a few model organisms, particularly those suited for translational medical research. The tool-building drive has pushed this decline further: we now study the visual system of the mouse not because vision is a primary niche-adaptation for this species (the ethological dictum known as Krogh's principle), but simply because elaborate genetic tools are available.

We cannot brute-force our way through the complexities of nervous systems. There is no doubt that we need better tools, but the best tool that we have for the problem perhaps resides in our own craniums. If there are no deep theoretical principles to be found in the study of animal nervous systems, then we are doomed to cataloguing the great variety of detail that is characteristic of biology, and tools will dominate. The hope is that underlying the endless and beautiful forms produced by the struggle for existence are mathematically quantifiable simplicities, fearful symmetries as it were. Then ideas will win the day.

View original post here:
Is Neuroscience Limited by Tools or Ideas? - Scientific American

Daeyeol Lee named the Duberg Professor of Neuroscience – Yale News

Daeyeol Lee, newly named as the Dorys McConnell Duberg Professor of Neuroscience, focuses his research on the brain mechanisms of decision-making, in particular the role of the prefrontal cortex and basal ganglia in reinforcement learning and economic choices.

The long-term goal of research in Lee's laboratory is to understand how appropriate behaviors are chosen and their outcomes are evaluated by the neural networks in the cerebral cortex and basal ganglia of the brain. The laboratory also investigates how the brain combines various abstract quantities, such as time, probability, and magnitude, to optimize our decision strategies. Research in his laboratory is highly interdisciplinary and capitalizes on insights from formal theories of economics and reinforcement learning, as well as computational neuroscience of neural coding and behavioral studies of decision-making. Lee also develops novel behavioral paradigms that can probe the core processes of decision-making. Combined with the use of multi-electrode recording systems, this research seeks to unravel the biological basis of willful actions.

Lee graduated from Seoul National University (Korea) with a degree in economics and earned his Ph.D. in neuroscience from the University of Illinois at Urbana-Champaign. He then received postdoctoral training in neurophysiology at the University of Minnesota. Lee held faculty positions at Wake Forest University School of Medicine and the University of Rochester before coming to Yale in 2006 as associate professor of neurobiology. In addition to his new appointment, he also serves as professor of psychology and of psychiatry.

Lee is the author of the book Birth of Intelligence and has published over 80 original research articles, including several papers in Science, Nature Neuroscience, and Neuron. He has received the Fellowship for Prominent Collegians from the Korea Foundation for Advanced Studies, a university fellowship from the University of Illinois, and the James S. McDonnell Foundation Cognitive Neuroscience Grant. His research has been funded by the National Institutes of Health continuously since 1999.

See original here:
Daeyeol Lee named the Duberg Professor of Neuroscience - Yale News

Yale College creates new neuroscience major – Yale News

Yale College undergraduates for the first time can choose neuroscience as a major. The new major was developed through a joint effort by the Department of Molecular, Cellular and Developmental Biology (MCDB) and the Department of Psychology.

Neuroscience aims to understand how the brain produces behavior, with the goals of advancing human understanding, improving physical and mental health, and optimizing performance. This entails a highly interdisciplinary effort that spans molecules to minds.

"MCDB and Psychology worked closely together to create this major because we want our students to have broadly integrative and rigorous training in neuroscience, only possible through our joint curriculum," said Marvin Chun, the Richard M. Colgate Professor of Psychology and professor of neuroscience in the Yale School of Medicine, who helped spearhead the effort.

"There has been a strong interest among students and faculty for a major in neuroscience, which is also the subject of several federal research initiatives," said Damon Clark, assistant professor of MCDB and of physics, who helped lead the collaboration in MCDB.

"Yale has an excellent neuroscience graduate program, and this new course of study builds on Yale's strengths to offer an undergraduate degree in neuroscience," Clark said.

Neuroscience majors will be admitted via application, and an unofficial course description and requirements are available here.

Qualifying students may receive a B.S. or B.A. in neuroscience as early as 2017-2018.

Excerpt from:
Yale College creates new neuroscience major - Yale News

The Landscape of Neuroscience 2006 – 2015 – Discover Magazine (blog)

How has neuroscience changed over the past decade? In a new paper, Hong Kong researchers Andy Wai Kan Yeung and colleagues take a look at brain science using the tools of citation analysis.

Yeung et al. extracted data from 2006-2015 from Web of Science and Journal Citation Reports (JCR), which track publications and citations. All journals that the JCR classifies in the Neurosciences category were included.

The first change Yeung et al. noticed was that the number of published neuroscience papers has been growing steadily, although keep in mind that the increasing volume of papers is a phenomenon not limited to neuroscience.

Looking at which kinds of papers received the most citations, Yeung et al. noticed a shift towards the more psychological and behavioural side of brain science. The Web of Science "Psychology" category went from #6 in terms of citations in 2006 up to #1 in 2015, while "Behavioral sciences" went from #3 to #2. The more biological areas of neuroscience, such as "Physiology" and "Biochemistry, molecular biology," declined in terms of citations. A sign of the times?

A breakdown of papers by the national affiliations of the authors reveals the growth of Chinese neuroscience over the 2006 to 2015 period. While just 3% of papers had at least one author based in China in 2006, by 2015 this had risen to over 11%. China has overtaken Germany, the UK, Japan, and other countries such that China is now #2 on the world neuroscience authorship list.

Finally, Yeung et al. tracked the impact factor (average citations per paper) of ten core neuroscience journals over time. This reveals little change from 2006 to 2015 although the venerable Journal of Neuroscience (established 1981) has lost some ground to Neuroimage (founded 1992).

Overall, this is an interesting little paper. The results don't contain any big surprises, but it's nice to be able to see where neuroscience is going.

Yeung AW, Goto TK, & Leung WK (2017). The Changing Landscape of Neuroscience Research, 2006-2015: A Bibliometric Study. Frontiers in Neuroscience, 11 PMID: 28377687

More:
The Landscape of Neuroscience 2006 - 2015 - Discover Magazine (blog)