Nature Genetics Paper Demonstrates How Inclusion of Bionano Genomics’ Next-Generation Mapping is Essential to … – Yahoo Finance

SAN DIEGO, March 09, 2017 (GLOBE NEWSWIRE) -- Bionano Genomics, Inc., a company focused on genome structure analysis, today highlighted results from a study demonstrating how combining genomic sequencing and mapping technologies, including Bionano's next-generation mapping (NGM), produced the most continuous de novo mammalian assembly to date, of the domestic goat (Capra hircus). The paper, "Single-molecule sequencing and chromatin conformation capture enable de novo reference assembly of the domestic goat genome," was published online in advance of print in the March 6, 2017 issue of Nature Genetics (DOI:10.1038/ng.3802).

The paper described the use of single-molecule real-time long-read sequencing for contig formation (PacBio's RSII), followed by scaffolding using chromatin interaction mapping (Phase Genomics' Hi-C) and optical mapping data (Bionano Genomics' Irys). The resulting assembly was polished with paired-end short-read sequencing for sequence accuracy (Illumina's HiSeq).

Hi-C has been gaining popularity for scaffolding de novo assembled genomes. By cross-linking remote regions of chromosomes, Hi-C has the potential to make scaffolds reaching chromosome arm lengths possible. However, this cross-linking does not always enable the order and orientation of the contigs to be determined correctly. In the scaffolding of the goat genome described here, Hi-C introduced seven times as many errors as Bionano (21 vs. 3).

Since Bionano NGM is the only non-sequence-based scaffolding method, it is the only technology capable of correcting errors inherent to both short-read and long-read sequencing. Bionano genome maps are created completely de novo, and therefore provide an entirely independent assembly with which to compare the sequence contigs. The Nature Genetics paper describes how Bionano's genome maps were able to correct 36 mis-assemblies in the PacBio contigs. Furthermore, Bionano allowed for precise sizing of 79% of the remaining gaps, which is not possible with Hi-C.

Erik Holmlin, Ph.D., CEO of Bionano, commented, "This paper follows the recent release of multiple reference-quality human genomes, including the Chinese and Korean reference genomes, all of which included Bionano data to create the most contiguous and accurate assemblies possible. Since the time this study was conducted over 18 months ago, Bionano has significantly advanced its scaffolding tools, and with the recently released Saphyr instrument, the cost to map a mammalian genome has decreased six-fold. This publication serves as important validation of NGM as a complementary genomic tool to combine with next-generation sequencing to reveal the highly informative native structure of chromosomes."

Bionano's hybrid scaffold pipeline within the Bionano Access analysis and visualization software allows users to combine two genome maps, each created by labeling a different sequence motif, with an NGS assembly. This combination typically doubles the contiguity achieved using one genome map, incorporates about 50% more contigs into the assembly, and provides an additional de novo assembly to be used in error correction. Using Saphyr, two mammalian genome maps can be created in a single run, for a total reagent and consumables cost of less than $2,000, a fraction of the sequencing and scaffolding cost for the entire project described in this paper, which approached $100,000.

The generated assembly, ARS1, represents an approximate 400-fold improvement in genome continuity due to properly assembled gaps compared to the previously published goat assembly, and better resolves the full structure of large repeats longer than 1 kilobase (kb). ARS1 comprises just 31 scaffolds and 649 gaps covering 30 of the 31 haploid, acrocentric goat chromosomes (excluding only the Y chromosome), which compares favorably to the current human reference (GRCh38), which has 24 scaffolds, 169 unplaced or un-localized scaffolds, and 832 gaps in the primary assembly.

The researchers concluded that the assembly strategy using multiple complementary technologies achieved superior continuity and accuracy, is cost-effective compared to past finishing approaches, and sets a new standard for mammalian genome assembly quality.

The Bionano Access software, which includes two-enzyme hybrid scaffolding and related scaffold editing, will be released as a free download later this month.


About Bionano Genomics

Bionano Genomics, Inc. provides the Irys and Saphyr systems for next-generation mapping (NGM), which is the leading solution in physical genome mapping. NGM offers customers whole genome analysis tools that reveal true genome structure, enabling researchers to capture what's missing in their data to advance human, plant and animal genomic research. NGM uses NanoChannel arrays to image DNA at the single-molecule level with average single-molecule lengths of about 350,000 base pairs, which leads the genomics industry. The long-range genomic information obtained with NGM detects and deciphers structural variations (SVs), which are large, complex DNA segments involving repeats that are often missed by sequencing technologies and which are a leading cause of inaccurate and incomplete genome assembly.

As a stand-alone tool, NGM enables the accurate detection of SVs, many of which have been shown to be associated with human disease as well as complex traits in plants and animals. As a complementary tool to next-generation sequencing (NGS), NGM integrates with sequence assemblies to create contiguous hybrid scaffolds for reference-quality genome assemblies that reveal the highly informative native structure of the chromosome. NGM also provides the additional ability to verify, correct and improve a NGS-generated genome assembly.

Only Bionano provides long-range genomic information with the cost-efficiency and high throughput to keep up with advances in NGS.

NGM has been adopted by a growing number of leading institutions around the world, including: National Cancer Institute (NCI), National Institutes of Health (NIH), Wellcome Trust Sanger Institute, BGI, Garvan Institute, Salk Institute, Mount Sinai and Washington University. Investors in the Company include Domain Associates, Legend Capital, Novartis Venture Fund and Monashee Investment Management.

For more information, please visit http://www.BionanoGenomics.com.

Notes: Bionano Genomics is a trademark of Bionano Genomics, Inc. Any other names of actual companies, organizations, entities, products or services may be the trademarks of their respective owners.


Historic Lotamore House is rejuvenated as a fertility clinic – Irish Examiner

A multi-million euro private medical investment at Cork's historic Lotamore House has just come to completion, after an 18-month-plus gestation.

Lotamore House, in Tivoli, Cork, which has been transformed to become the Waterstone Clinic.

Now set to employ 55, the Waterstone Clinic (previously known as the Cork Fertility Centre) has just been completed as a 13,000 sq ft centre of excellence at the 210-year-old Lotamore House in Cork.

The classical, villa-style building was sold in 2013 to Dr John and Susan Waterstone for an unconfirmed €800,000, having had a chequered recent past under previous ownerships.

It was controversially and briefly occupied by a protest group, the Rodolphus Allen Private Family Trust, after the property was taken over by receivers Deloitte from the previous private owner, Sidney McElhinney, who had plans to turn it into a 90-bed nursing home.

Lotamore House had previously sold for over €3m, on 11 acres by the Tivoli dual carriageway, and other previous uses of the grand-era villa included offices for a computer firm, as well as offices for the Irish Sweepstakes in the mid-1900s.

It had operated too as a luxury guesthouse for many years, hosting judges on the circuit, among other guests.

It featured on TV news during the short-lived occupation, until gardaí moved a caravan off its grounds.

And a proposal to document Lotamore House's transition to a 21st-century fertility clinic was pitched to RTÉ by a production company, GoodLookingFilms, but the broadcaster didn't commission the series, which promised to mix medical science and embryo technology with Grand Designs.

Private family owners included the Hacketts, the Ronayne Mahonys, the Cudmores, the Lunhams and the Huguenot merchant family, the Perriers.

Now, claiming to be the most advanced fertility unit in the country, Lotamore is set to play a role in creating new families, out of a building with three centuries of Cork history.

The Waterstone Clinic previously operated on College Road, Cork, with clinics also in Waterford, Limerick, and Dublin, on Leeson St.

At Lotamore, it has grown its lab space fivefold, to 1,500 sq ft of high-tech lab with the latest embryology technology, and the building also accommodates five scan rooms (up from two), five consultation rooms, five recovery rooms, three masturbatoriums, two theatres, a reception, etc.

Procedures are on a day-visit basis, with no overnight facilities.

"Lotamore House is a historic 18th century Cork building, and we have sympathetically refurbished and restored it, preserving its fine period details while incorporating modern facilities and comforts."

"We have endeavoured to make a visit to Lotamore House as stress-free as possible for patients, with generous parking, spacious waiting areas and an interior design that maximises privacy," said founder Dr John Waterstone, who will host Lotamore's first seminar post-opening on March 23.



Should Naturalism Define Science? (RJS) – Patheos (blog)

Methodological naturalism. For most scientists this is a foregone conclusion; a scientist studies nature and looks for natural cause and effect. Among Christians the term is often viewed as a cop-out, giving away the farm by ruling divine action out of bounds. Many atheists view the term as indicative of a failure to face facts and admit that there is nothing but the natural world. Which view is closest to yours?

Jim Stump, in his recent book Science and Christianity: An Introduction to the Issues, digs into the concept of methodological naturalism. His first point (as a good philosopher) is that methodological naturalism is not an easy concept to define. Well, methodological isn't terribly hard to grasp. Methodological is contrasted with metaphysical or ontological naturalism. The emphasis in methodological naturalism is on the method of doing science rather than on the existence or nonexistence of anything beyond the natural world. All scientists can approach their work as methodological naturalists no matter what views they hold concerning the ultimate shape of reality: Christian, atheist, Hindu, Buddhist, or whatever. For Jim, the hard term is natural. What counts or doesn't count as natural? Most definitions are, or seem, circular. Natural phenomena are those that are investigated by natural means obeying natural laws.

The trouble with adopting methodological naturalism is that it seems we have to predetermine what counts as natural. And that will inescapably involve metaphysical notions and values that are not properly scientific by the standards of methodological naturalism. In that case, our metaphysics is going to affect our science, so long as we are committed to science as explanatory. (pp. 71-72)

Commenters on this blog have occasionally suggested that methodological naturalism is metaphysical naturalism in disguise because it simply rules out everything else. Certainly some who favor intelligent design feel this danger. Let's not worry about defining natural at this time and move on to look at the nature and practice of methodological naturalism.

Practice of Science. It is relatively easy to see how the practice of chemistry and physics; geology and agriculture; genetics and embryology, along with many other disciplines and subdisciplines, can be approached through the lens of methodological naturalism. We look for and confine ourselves to the study of the interactions between atoms and molecules, even subatomic particles, the interaction of light and gravity with matter, and the laws that describe these interactions.

Problems may arise when scientists in these fields look to grand unifying theories. Jim brings some of Alvin Plantingas work into the discussion.

There is something to be said for recognizing disciplinary boundaries. Michael Ruse compares methodological naturalism to going to a doctor and expecting not to be given any political advice. The doctor may have very strong political views, but it would be inappropriate for him or her to disseminate them in that context. So, too, the scientist ought not to disseminate religious views, as they are not relevant to the task at hand. But Plantinga counters that in assessing grand scientific theories we will necessarily cross disciplinary lines in order to use all that we know that is relevant to the question. For the Christian, he thinks this properly allows the use of biblical revelation in assessing whether something like the theory of common ancestry is a correct explanation. And he believes that can be called Augustinian, or theistic, science. (p. 76)

In part this is because, historically speaking, what counts as natural is a moving target. I think Plantinga has an important point concerning grand theories but (big but) he is completely off-base in applying his concern to the question of common ancestry. Evolution and common descent are natural scientific questions with methodological naturalism an appropriate approach even for devout Christians. Before digging a little deeper into places where methodological naturalism should be held lightly we will look briefly at reasons for retaining methodological naturalism.

Retaining Methodological Naturalism. In the natural sciences (biology, chemistry, physics, geology, astronomy, climatology, meteorology, ...) the reasons given for abandoning methodological naturalism are always gap arguments. Jim does not put it quite this bluntly, but after reading quite widely, this is the clear conclusion. I have not yet found an argument that is not based on a possibly temporary state of ignorance. Protestations to the contrary are emotional rather than evidence-based.

Inserting supernatural agency or events into explanations has a fairly poor track record historically. Science has been remarkably successful at figuring out the causes of phenomena that were once explained by supernatural agents, from thunder and solar eclipses to disease and epilepsy. Of course that doesn't mean that science will be able to figure out everything in the future. But it should give us pause before thinking we've found some phenomenon for which there will never be any scientific explanation. To do otherwise would be to inhibit scientific investigation. Take the example of how the first living cell came about. Scientists don't have very promising models right now for how that could have happened through natural means. (pp. 77-78)

Both Alvin Plantinga (Jim cites a couple of articles written in 1996 and 1997) and Stephen Meyer (Signature in the Cell) suggest that this should allow us to draw the conclusion that the best explanation is that here we have a place where God acted as an intelligent agent. Jim notes: "But should we call it the best scientific explanation we have at present if we say 'and then a miracle happened and there was life'? It seems more in keeping with our present usage to say, 'At present we have no scientific explanation for that phenomenon.'" (p. 78) To insert a supernatural act of God here is to insert God into a gap in our knowledge. If the gap fills, where is God? Of course God is responsible for the origin of life, just as he is responsible for the weather and the formation of a babe in the womb; but it isn't either God or science. It is God and science. As a Christian I am convinced that as scientists we study God's ordained and sustained creation. Perhaps there are places where there will never be a satisfactory scientific explanation, but it is unwise to draw this conclusion about any individual proposal.

When is methodological naturalism troublesome? Here I leave Jim's chapter and give my own view. Methodological naturalism is troublesome when we step away from the impersonal (chemistry, physics, ...) and move to the personal. If there is a God who interacts with his creation, methodological naturalism will give the wrong result in these instances.

Methodological naturalism applied to the study of history will guarantee that we never find God active in history. Methodological naturalism would require us to accept that dead people never come back to life without some yet unknown scientific mechanism for rejuvenation. Methodological naturalism would require us to propose a natural explanation for every act of Jesus, from walking on water to stilling the storm, healing the lame, blind, and deaf, and feeding the multitudes. For many the natural explanation is that these never happened; they are tall tales. But the incarnation is a very personal act. If the Christian God exists, methodological naturalism won't get to the truth. N. T. Wright makes this argument in his book The Resurrection of the Son of God. If we don't eliminate the possibility of resurrection, then the Resurrection makes good sense. Many scholars today, of course, simply eliminate the possibility and look for natural explanations.

I will suggest that another place where methodological naturalism fails is in some areas of the social sciences. Humans and human social constructs are shaped by interpersonal interactions. The plasticity of the human brain means that we are shaped and formed not only by nature, i.e. our genes, but also through community, our social environment. Ideas change people. If there is a God who interacts with his people, his presence and interaction will change people. Natural explanations, ignoring the supernatural, i.e. God, will never get to the complete truth. Here is a case where the a priori move to eliminate God from consideration will limit understanding, if there is a God. This isn't miraculous, but neither is it natural, because God isn't natural.

Methodological naturalism is troublesome when it shapes our grand theories of being. However, it is counterproductive, and can be destructive to faith, to insist on gaps in impersonal processes and insert divine as opposed to natural cause.

What do you think?

Is methodological naturalism a useful approach?

What are the limits, if any, to methodological naturalism?

If you wish to contact me directly you may do so at rjs4mail[at]att.net

If interested you can subscribe to a full text feed of my posts at Musings on Science and Theology.


Want To Play Inside A Human Cell? Genius Games To Launch A Kickstarter Campaign That Will Let You – Yahoo Finance

ST. LOUIS, March 9, 2017 /PRNewswire/ --There are many mysteries unfolding every second at the cellular level inside the human body, and the board game scientists at Genius Games want to help you unravel them in their latest hard-science based strategy game, Cytosis: A Cell Biology Game. It is the first game in the world to be designed totally around the actual structure and dynamics of the human cell, and promises a vividly visual, tactile, and interactive tour of this essential building block of life.

The creators of award-winning biology and chemistry games (as well as children's books) will take players inside a human cell in Cytosis: A Cell Biology Game, where they will compete to build enzymes, hormones and receptors and fend off attacking viruses! In 2016, Cytosis: A Cell Biology Game was the highest rated game during the prestigious Stonemaier Game Design Day, a day dedicated to play-testing prototypes, game design and idea exchange within the gaming community.

Well-known inside the scientific, teaching and gaming communities, the company is thrilled to announce their upcoming Kickstarter campaign for Cytosis: A Cell Biology Game, the latest addition to its growing product portfolio of games that have been heralded by both the gaming and scientific communities. In the company's latest board game, science becomes fun and learning becomes addictive when taught by the team at Genius Games. The company has had six other successful crowdfunded projects come to market. The Kickstarter campaign for Cytosis: A Cell Biology Game aims to raise $14,500. Donors pledging $39 or more will receive a copy of the game. Find more details on Cytosis: A Cell Biology Game here, and to reserve a copy, click the "Support This Project" button.

"We are excited to return to Kickstarter to seek funding for our latest board game venture, Cytosis: A Cell Biology Game. People familiar with our other products will find the same level of quality and creativity that they've come to expect from us," noted John Coveyou, founder and director of Genius Games. "Traditionally games are only meant for entertainment and school is where you go to learn. At Genius Games we have always felt that you can make learning fun. That is our mission, to develop games that are not only a blast to play, but that also simultaneously demystify intimidating science concepts. And for a cool behind-the-scenes look into the design, and launch of the game on Kickstarter, check out my new YouTube documentary series, A Kickstarter Launch Story."

"Cytosis is a really well-designed hard science game. The worker placement works extremely well. Great game!" raved Paul Salomon, an expert Board Game Geek reviewer.

In Cytosis: A Cell Biology Game, players compete to build enzymes, hormones and receptors to fend off attacking viruses inside a human cell. The player with the most Health Points at the end of the game wins!

Players utilize the available organelles within the cell to collect cellular resources such as mRNA from the Nucleus, Lipids from the Smooth E.R., ATP from the Mitochondria, or transport Carbohydrates into the cell via endocytosis through the Plasma Membrane. Players may also utilize the organelles to translate mRNA into Proteins (either on the Free Ribosome in the Cytoplasm, or in the Rough E.R.) or add glucose or lipid tags to their hormones or hormone receptors in the Golgi Apparatus. Players score Health Points when they complete any of the Hormone, Receptor or Enzyme cards. For 2-5 players, ages 10 and up. The game will be available nationwide in August for $44.99 MSRP.


About Genius Games

St. Louis, Missouri-based Genius Games was founded by John Coveyou in 2014. Genius Games is a game design company that publishes high-quality tabletop games that are both entertaining and educational. For more information, please visit https://gotgeniusgames.com/.

About John Coveyou

John Coveyou, creator and designer of Cytosis: A Cell Biology Game, nearly dropped out of school in his teen years and spent a stint living out of his car. However, after serving in the military, he pursued his love of science at Washington University, earning his Master's in Energy, Environmental and Chemical Engineering. He later quit his posh engineering job to launch Genius Games in 2014. He now teaches courses on Game Design and Crowdfunding at Webster University in St. Louis along with running Genius Games full time.

Cytosis is the sixth science-based game created by Coveyou. His previous Kickstarter campaigns were wildly successful! Coveyou also published My First Science Textbooks and Science Wide Open, science storybooks for kids, which are still the fourth and sixth most funded children's book campaigns to date on Kickstarter.

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/want-to-play-inside-a-human-cell-genius-games-to-launch-a-kickstarter-campaign-that-will-let-you-300420868.html


UG/PG admission begins at AMU, Aligarh: Check out the details – India Today

The Aligarh Muslim University (AMU), Aligarh has released an admission notification inviting applications from interested, eligible candidates to apply for admission to its various programmes offered under various specialisations for the academic session 2017.

BA programme: The candidates interested in applying for this programme should have passed the senior secondary school or equivalent examination with at least 50 per cent marks in aggregate, with English and three subjects from: accountancy, Arabic, banking, biology, biotechnology, business organisation, business studies, chemistry, commerce, computer science, economics, education, English, fine arts, geography, Hindi, history, home science, Islamic studies, mathematics, Persian, philosophy, physical health education, physics, political science/civics, psychology, Sanskrit, sociology, statistics, Urdu and modern Indian languages (Bengali, Tamil, Telugu, Malayalam, Marathi, Punjabi and Kashmiri).

MTech programme: The candidates interested in applying for this programme should have pursued BTech or its equivalent examination, in the relevant branch of study, with not less than 60 per cent marks in aggregate or its equivalent CPI/CGPA/NAG.

MSc programme: The candidates interested in applying for this programme should have pursued BSc with biochemistry/biosciences/life sciences/medical biochemistry/clinical biochemistry as main, with two of the following subsidiary subjects: zoology/botany/ chemistry/biotechnology or BSc with biochemistry/biosciences/clinical biochemistry/ medical biochemistry, as one of the subjects of equal value along with any two of the optional subjects i.e. zoology, botany/chemistry/biotechnology.


The candidates will be selected on the basis of departmental test conducted by the university.

The candidates are required to apply at the official website.

The last date of submission of the online application form for the MSc (agriculture)/LLM programme is April 10.

The last date of submission of the online application form for the MBBS/BDS programme is June 15.

The last date of submission of the online application form for the MA/MTech/MCom programmes is April 17.

The last date of submission of the online application form for the LLM/BRTT/MSc programme is April 18.

The last date of submission of the online application form for the MA/BFA programmes is April 12.

The last date of submission of the online application form for the BA (Hons)/MPEd programmes is April 19.




Transcrypt: Anatomy of a Python to JavaScript Compiler – InfoQ.com


Featuring a diversity of programming languages, backend technology offers the right tool for any kind of job. At the frontend, however, it's one size fits all: JavaScript. Someone with only a hammer will have to treat anything like a nail. One attempt to break open this restricted world is represented by the growing set of source to source compilers that target JavaScript. Such compilers are available for languages as diverse as Scala, C++, Ruby, and Python. The Transcrypt Python to JavaScript compiler is a relatively new open source project, aiming at executing Python 3.6 at JavaScript speed, with comparable file sizes.

For a tool like this to offer an attractive alternative to everyday web development in JavaScript, at least the following three demands have to be met:

1. Pages and applications built with it must match the look, feel and performance of native JavaScript development.
2. It must integrate seamlessly with the existing JavaScript ecosystem and with developers' existing skills and debugging workflows.
3. It must protect long-term investments in code and skills, and stay open to changing needs.

To be successful, all aspects of these three requirements have to be met. Different compilers strike a different balance between them, but no viable compiler for every day production use can neglect any of them. For Transcrypt, each of the above three points has led to certain design decisions.

Demand 1:

Look and feel of web sites and web applications are directly connected to the underlying JavaScript libraries used, so to have exactly the same look and feel, a site or application should use exactly the same libraries.

Although fast connections may hide the differences, achieving the same page load time, even on mobile devices running on public networks, mandates having roughly the same code size. This rules out downloading a compiler, virtual machine or large runtime at each new page load.

Achieving the same startup time as pages utilizing native JavaScript is only possible if the code is statically precompiled to JavaScript on the server. The larger the amount of code needed for a certain page, the more obvious the difference becomes.

To have the same sustained speed, the generated JavaScript must be efficient. Since JavaScript virtual machines are highly optimized for common coding patterns, the generated JavaScript should be similar to handwritten JavaScript, rather than emulating a stack machine or any other low level abstraction.

Demand 2:

To allow seamless access to any JavaScript library, Python and JavaScript have to use unified data formats, a unified calling model, and a unified object model. The latter requires the JavaScript prototype-based single inheritance mechanism to somehow gel with Python's class-based multiple inheritance. Note that the recent addition of the keyword 'class' to JavaScript has no impact on the need to bridge this fundamental difference.

To enable efficient debugging, things like setting breakpoints and single stepping through code have to be done on the source level. In other words: source maps are necessary. Whenever a problem is encountered it must be possible to inspect and comprehend the generated JavaScript to pinpoint exactly what's going on. To this end, the generated JavaScript should be isomorphic to the Python source code.

The ability to capitalize on existing skills means that the source code has to be pure Python, not some syntactic variation. A robust way to achieve this is to use Python's native parser. The same holds for semantics, a requirement that poses practical problems and requires introduction of compiler directives to maintain runtime efficiency.

Demand 3:

Continuity is needed to protect investments in client side Python code, requiring continued availability of client side Python compilers with both good conformance and good performance. Striking the right balance between these two is the most critical part of designing a compiler.

Continued availability of trained Python developers is sufficiently warranted by the fact that Python has been the number 1 language taught in introductory computer science courses for three consecutive years now. On the backend it is used for every conceivable branch of computing. All these developers, used to designing large, long lived systems rather than insulated, short lived pieces of frontend script code, become available to browser programming if it is done in Python.

With regard to productivity, many developers that have made the switch from a different programming language to Python agree that it has significantly increased their output while retaining runtime performance. The latter is due to the fact that libraries used by Python applications for time critical operations like numerical processing and 3D graphics usually compile to native machine code.

The last point, openness to changed needs, means that modularity and flexibility have to be supported at every level. The presence, right from the start, of class-based OO with multiple inheritance and a sophisticated module and package mechanism has contributed to this. In addition, the possibility to use named and default parameters allows developers to change call signatures at a late stage without breaking existing code.
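As a small illustration of that last point (a hypothetical example, not taken from the article), a default parameter lets a Python call signature grow late in a project without breaking existing call sites:

```python
def area(width, height, scale=1.0):
    """Compute a scaled area.

    'scale' is imagined as a parameter added after the fact; because it
    has a default, call sites written before the change keep working.
    """
    return width * height * scale

print(area(2, 3))              # legacy positional call, unaffected
print(area(2, 3, scale=2.0))   # new capability, opted into by keyword
```

Named arguments make the opt-in explicit at the call site, which is what keeps such signature changes safe in large codebases.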

Conformance versus performance: language convergence to the rescue

Many Python constructs closely match JavaScript constructs, especially when translating to newer versions of JavaScript. There's a clear convergence between both languages. Specifically, more and more elements of Python make their way into JavaScript: for ... of ..., classes (in a limited form), modules, destructuring assignment and argument spreading. Since constructs like for ... of ... are highly optimized on modern JavaScript virtual machines, it's advantageous to translate such Python constructs to closely matching JavaScript constructs. Such isomorphic translation will result in code that can benefit from optimizations in the target language. It will also result in JavaScript code that is easy to read and debug.

Although with Transcrypt, through the presence of source maps, most debugging will take place by stepping through Python rather than JavaScript code, a tool should not conceal but rather reveal the underlying technology, granting the developer full access to 'what's actually going on'. This is even more desirable since native JavaScript code can be inserted at any point in the Python source, using a compiler directive.

The isomorphism between Python and the JavaScript code generated by Transcrypt is illustrated in the article by a fragment using multiple inheritance, together with its JavaScript translation.
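The original fragment and its JavaScript translation are not reproduced here; a minimal Python sketch of the kind of multiple inheritance Transcrypt supports might look like:

```python
class Vehicle:
    def __init__(self, wheels):
        self.wheels = wheels

class Electric:
    def power_source(self):
        return 'battery'

# Multiple inheritance: ElectricCar combines both base classes.
class ElectricCar(Vehicle, Electric):
    def __init__(self):
        Vehicle.__init__(self, 4)

car = ElectricCar()
```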

Striving for isomorphic translation has limitations, rooted in subtle but sometimes hard to overcome differences between the two languages. Whereas Python allows lists to be concatenated with the + operator, isomorphic use of this operator in JavaScript results in both lists being converted to strings and then glued together. Of course a + b could be translated to __add__ (a, b), but since the types of a and b are determined at runtime, this would result in a function call and dynamic type inspection code being generated for something as simple as 1 + 1, leading to bad performance for computations in inner loops. Another example is Python's interpretation of 'truthiness'. The boolean value of an empty list is true in JavaScript but False in Python. Dealing with this globally in an application would require every if-statement to feature a conversion, since in the Python construct if a: it cannot be predicted whether a holds a boolean or something else, like a list. So if a: would have to be translated to if (__istrue__ (a)), again resulting in slow performance if used in inner loops.
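Both differences are easy to see from the Python side (in JavaScript, [1, 2] + [3] evaluates to the string '1,23', and an empty array is truthy):

```python
# Python: + concatenates lists. In JavaScript the same operator would
# coerce both operands to strings and glue them together ('1,2' + '3').
combined = [1, 2] + [3]

# Python: an empty list is falsy. In JavaScript, Boolean([]) is true.
empty_is_truthy = bool([])
```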

In Transcrypt, compiler directives embedded in the code (pragmas) are used to control compilation of such constructs locally. This enables writing matrix computations using standard mathematical notation, like M4 = (M1 + M2) * M3, while not generating any overhead for something like perimeter = 2 * pi * radius. Syntactically, pragmas are just calls to the __pragma__ function, executed at compile time rather than at run time. Importing a stub module containing def __pragma__ (directive, parameters): pass allows this code to run on CPython as well, without modification. Alternatively, pragmas can be placed in comments.

Unifying the type system while avoiding name clashes

Another fundamental design choice for Transcrypt was to unify the Python and the JavaScript type system, rather than have them live next to each other, converting between them on the fly. Data conversion costs time and increases target code size as well as memory use. It burdens the garbage collector and makes interaction between Python code and JavaScript libraries cumbersome.

So the decision was made to embrace the JavaScript world, rather than to create a parallel universe. A simple example of this is the following code using the Plotly.js library:
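The original fragment is not reproduced here; the dictionary literals below are an illustrative sketch, not the article's code, showing Python dicts and a list comprehension being used where JavaScript would use object and array literals:

```python
# Data for Plotly.js, built with plain Python literals. Under Transcrypt
# these dicts ARE JavaScript objects, so they could be handed to the
# library directly, e.g.: Plotly.newPlot('chart_div', [trace], layout)
xs = list(range(10))
trace = {
    'x': xs,
    'y': [x * x for x in xs],   # list comprehension, which JavaScript lacks
    'type': 'scatter',
}
layout = {'title': 'Parabola'}
```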

Apart from the pragma that allows the quotes to be left out of dictionary keys, which is optional and only used for convenience, the code looks a lot like comparable JavaScript code. Note the (optional) use of list comprehensions, a facility JavaScript still lacks. The fact that Python dictionary literals are mapped to JavaScript object literals is of no concern to the developer, who can use the Plotly JavaScript documentation while writing Python code. No conversion is done behind the scenes: a Transcrypt dict IS a JavaScript object, in all cases.

In unifying the type systems, name clashes occur. Python and JavaScript strings both have a split() method, but their semantics have important differences. There are many such clashes and, since both Python and JavaScript are evolving, future clashes are to be expected.

To deal with these, Transcrypt supports the notion of aliases. Whenever .split is used in Python, it is translated to .py_split, a JavaScript function having Python split semantics. In native JavaScript code, split refers to the native JavaScript split function, as it should. However, the native JavaScript split method can also be called from Python, where it is named js_split. While many such predefined aliases are available in Transcrypt, the developer can define new aliases and undefine existing ones. In this way any name clash resulting from the unified type system can be resolved without runtime penalty, since aliases do their work at compile time.
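The semantic difference is visible in plain Python, because str.split(' ') happens to behave like JavaScript's split(' '): it keeps empty fragments instead of collapsing runs of whitespace.

```python
s = ' a  b '

# Python semantics (what Transcrypt emits as py_split): no argument
# means split on runs of whitespace, dropping empty strings.
py_style = s.split()        # ['a', 'b']

# JavaScript-like semantics (reachable from Transcrypt as js_split):
# splitting on a literal ' ' keeps the empty fragments.
js_style = s.split(' ')     # ['', 'a', '', 'b', '']
```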

Aliases also allow generation of any JavaScript identifier from a Python identifier. An example is the $ character, which is allowed as part of a name in JavaScript but forbidden in Python. Transcrypt strictly conforms to Python syntax, since it uses the native CPython parser. A piece of code using JQuery may look as follows:
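The original fragment is not reproduced here; in the following sketch the alias name S, the element id and the handler name are assumptions, following the alias pattern described in Transcrypt's documentation:

```python
def __pragma__(directive, *parameters):   # stub so this also runs on CPython
    pass

__pragma__('alias', 'S', '$')   # every S in the Python source becomes $ in JavaScript

def clear_output():
    # Under Transcrypt this line is emitted as: $('#output').empty()
    S('#output').empty()
```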

Since Transcrypt uses compilation rather than interpretation, imports have to be decided upon compile time, to allow joined minification and shipment of all modules involved. To this end C-style conditional compilation is supported, as can be seen in the following code fragment:
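A sketch of the idea (the symbol name and directive spellings are assumptions based on Transcrypt's documented ifdef mechanism; Transcrypt resolves the branch at compile time, whereas under CPython with the stub both branches simply execute and the last definition wins):

```python
def __pragma__(directive, *parameters):   # stub for CPython
    pass

__pragma__('ifdef', 'legacy')             # hypothetical compile-time symbol
def get_items(d):
    return list(d.items())
__pragma__('else')
def get_items(d):
    return [*d.items()]                   # newer spread syntax
__pragma__('endif')
```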

The same mechanism is used in the Transcrypt runtime to switch between JavaScript 5 and JavaScript 6 code.

In this way optimizations in newer JavaScript versions are taken into account while retaining backward compatibility. In some cases, the possibility for optimization is preferred over isomorphism.

Some optimizations are optional, such as the possibility to activate call caching, resulting in repeated calls to inherited methods being done directly, rather than through the prototype chain.

Static versus dynamic typing: Scripting languages growing mature

There has been a resurgence in appreciation of the benefits of static typing, with TypeScript being the best known example. In Python, as opposed to JavaScript, static typing syntax is an integral part of the language and supported by the native parser. Type checking itself, however, is left to third party tools, most notably mypy, a project by Jukka Lehtosalo with regular contributions from Python's initiator, Guido van Rossum. To enable efficient use of mypy in Transcrypt, the Transcrypt team contributed a lightweight API to the project that makes it possible to activate mypy from another Python application without going through the operating system. Although mypy is still under development, it already catches an impressive number of typing errors at compile time. Static type checking is optional and can be activated locally by inserting standard type annotations. A trivial example of the use of such annotations is the mypy in-process API itself:
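The original listing is not reproduced here. The in-process API's run function (in mypy.api) returns a (normal_report, error_report, exit_status) triple; its annotated signature can be sketched as follows, with a stand-in body rather than mypy's implementation:

```python
from typing import List, Tuple

def run(args: List[str]) -> Tuple[str, str, int]:
    """Stand-in with the same annotated signature as mypy.api.run:
    takes mypy command-line arguments, returns
    (normal_report, error_report, exit_status)."""
    return ('', '', 0)
```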

As illustrated by the example, static typing can be applied where appropriate, in this case in the signature of the run function, since that is the part of the API module that can be seen from the outside by other developers. If anyone misinterprets the parameter types or the return type of the API, mypy will generate a clear error message, referring to the file and line number where the mismatch occurs.

The concept of dynamic typing remains central to languages like Python and JavaScript, because it allows for flexible data structures and helps to reduce the amount of source code needed to perform a certain task. Source code size is important, because to understand and maintain source code, the first thing that has to happen is to read through it. In that sense, 100 kB of Python source code offers a direct advantage over 300 kB of C++ source that has the same functionality, but without the hard to read type definitions using templates, explicit type inspection and conversion code, overloaded constructors and other overloaded methods, abstract base classes to deal with polymorphic data structures and type dependent branching.

For small scripts well below 100kB source code and written by one person, dynamic typing seems to only have advantages. Very little planning and design are needed; everything just falls into place while programming. But when applications grow larger and are no longer built by individuals but by teams, the balance changes. For such applications, featuring more than roughly 200kB source code, the lack of compile time type checking has the following consequences:

An interface featuring even one parameter that may refer to a complex, dynamically typed object structure cannot be considered sufficiently stable to warrant separation of concerns. While this type of 'who did what, why and when' programming accounts for tremendous flexibility, it also accounts for design decisions being postponed until the very last moment, impacting large amounts of already written code and requiring extensive modifications.

The 'coupling and cohesion' paradigm applies. It's OK for modules to have strong coupling of design decisions on the inside, but between modules there should preferably be loose coupling: a design decision to change the inner workings of one module should not influence the others. In general, this leads to rules of thumb for the choice between dynamic and static typing.

So while the current surge in static typing may seem like a regression, it isn't. Dynamic typing has earned its place and it won't go away. The opposite is also true: even a traditionally statically typed language like C# has absorbed dynamic typing concepts. But with the complexity of applications written in languages like JavaScript and Python growing, effective modularization, cooperation and unit validation strategies gain importance. Scripting languages are coming of age.

Why choose Python over JavaScript on the client?

Due to the immense popularity of programming for the web, JavaScript has drawn lots of attention and investment. There are clear advantages in having the same language on the client and on the server. An important advantage is that it becomes possible to move code from server to client in a late stage, when an application is upscaled.

Another advantage is unity of concept, allowing developers to work both on the front end and the back end without constantly switching between technologies. The desirability of decreasing the conceptual distance between the client and server parts of an application has resulted in the popularity of a platform like Node.js. But at the same time, it carries the risk of expanding the 'one size fits all' reality of current web client programming to the server. JavaScript is considered a good enough language by many. Recent versions finally start to support features like class-based OO (be it in the form of a thin varnish over its prototyping guts), modules and namespaces. With the advent of TypeScript, the use of strict typing is possible, though incorporating it in the language standard is probably some years away.

But even with these features, JavaScript isn't going to be the one language to end all languages. A camel may resemble a horse designed by a committee, but it never becomes one. What the browser language market needs, in fact what any free market needs, is diversity. It means that the right tool can be picked for the job at hand. Hammers for nails, and screwdrivers for screws. Python was designed with clean, concise readability in mind right from the start. The value of that shouldn't be underestimated.

JavaScript will probably be the choice of the masses in programming the client for a long time to come. But for those who consider the alternative, what matters to continuity is the momentum behind a language, as opposed to any particular implementation of that language. So the most important choice is not which implementation to use, but which language to choose. In that light Python is an effective and safe choice. Python has a huge mindshare, and there's a growing number of browser implementations for it, approaching the gold standard of CPython ever more closely while retaining performance.

While new implementations may supersede existing ones, this process is guided by a centrally guarded consensus over what the Python language should entail. Switching to another implementation will always be easier than switching to the next JavaScript library hype or preprocessor with proprietary syntax to deal with its shortcomings. Looking at the situation in the well-established server world, it is to be expected that multiple client side Python implementations will continue to exist side by side in healthy competition. The winner here is the language itself: Python in the browser is there to stay.

Jacques de Hooge MSc is a C++/Python developer living in Rotterdam, the Netherlands. After graduating from the Delft University of Technology, department of Information Theory, he started his own company, GEATEC engineering, specializing in realtime controls, scientific computation, oil and gas prospecting and medical imaging. He is a part-time teacher at the Rotterdam University of Applied Sciences, where he teaches C++, Python, image processing, artificial intelligence, robotics, realtime embedded systems and linear algebra. Currently he is developing cardiological research software for the Erasmus University in Rotterdam. He is also the initiator and lead designer of the Transcrypt open source project.

Visit link:
Transcrypt: Anatomy of a Python to JavaScript Compiler - InfoQ.com

Scott Foley’s ‘dead head’ freaks out his wife on Grey’s Anatomy – TV3.ie

9th Mar 17 | Entertainment News

Scott Foley's actress wife freaked out when she saw her first 'dead body' on Grey's Anatomy, because her husband was looking up at her.

Marika Dominczyk has started work on the medical drama that once featured her man as Henry Burton, and she didn't realise the show's prop team recycle Scott's dead head whenever they need a corpse.

"They not so kindly killed me off," he recalls, "but to do so, they made a full prosthesis of my head, and those things are expensive to make, so they don't make a bunch of them.

"Every time they have a dead body or a cadaver laying on a table, it's my head... The first time she had no idea; they didn't tell her... She was like, 'Oh God!'"

Marika, who plays lesbian Dr. Eliza Minnick on the current season of the show, has just returned to acting after taking time off to focus on being a mum to her three kids with Foley. She previously featured in TV drama North Shore and played Bernadette in The 40-Year-Old Virgin.

"She spent seven years raising our children and now that she had the opportunity to go back to work she was really chomping at the bit," Foley tells Access Hollywood Live. "This part came along and she's knocking it out the park... She looks great in a doctor's coat."

But he's not looking forward to sitting down with his wife's TV lover, Jessica Capshaw, and her husband now the old friends are kissing on TV.

"We've known socially Jessica Capshaw and her husband Christopher Gavigan for years, so it was a little strange for them," he explains. "I don't think we've had the chance to talk about it yet. That'll be an interesting conversation."

WENN Newsdesk 2017

Link:
Scott Foley's 'dead head' freaks out his wife on Grey's Anatomy - TV3.ie

McGill ranked world’s 3rd best university for study of Anatomy & Physiology – McGill Newsroom

McGill University is the world's third-best university for the study of Anatomy & Physiology, behind only the Universities of Oxford and Cambridge, according to the 2017 QS World University Rankings by Subject.

The seventh edition of QS Quacquarelli Symonds's analysis of subject-specific university performance, released today, lists the world's best universities for the study of 46 different subjects. Anatomy & Physiology is one of four new subject categories introduced in this year's listing.

"We are extremely pleased to rank among the world's top three universities in the study of anatomy and physiology," said David Eidelman, Vice-Principal of Health Affairs and Dean of Medicine at McGill. "This is a direct outcome of the quality of our academics and staff in these departments, who I congratulate for their stellar and hard work on behalf of our students. I am also gratified to see McGill's rankings rise this year in the medicine and pharmacology categories."

McGill's ranking in the Medicine subject category rose to 22nd this year from 27th in 2016. In Pharmacology, McGill moved up to the 31st spot from 37th a year ago.

Another standout performance came in the Engineering Mineral and Mining category, with McGill rising to a tie for sixth place globally this year from 13th place last year. "We are very proud to be ranked so highly along with our counterparts in other Canadian institutions," said Jim Nicell, McGill's Dean of Engineering. "The mining industry is an essential part of the economy of Canada, so we must always do our best to stay at the forefront in our teaching and research in support of this sector."

More broadly, McGill is listed this year in the top 50 in 7 of 10 subjects in Arts & Humanities, 3 of 6 subjects in Engineering & Technology, 7 of 9 in Life Sciences & Medicine, 6 of 7 in Natural Sciences, and 9 of 14 in Social Sciences & Management.

The full QS World University Rankings by Subject tables, along with the full methodology, are available on the QS website.

Read the original post:
McGill ranked world's 3rd best university for study of Anatomy & Physiology - McGill Newsroom

Behind the scenes of cardiac physiology at Salisbury District Hospital – Salisbury Journal

CARDIAC physiologists carry out a range of investigations for people with potential, or known, heart problems.

Their work includes performing electrocardiograms (ECGs), putting people on treadmills to evaluate the heart's response to exercise, performing cardiac ultrasound scans, checking and programming implanted pacemakers and implantable defibrillators, and monitoring blood pressure and heart rhythm while stents are fitted in the cardiac catheter lab.

Claire Murray, a cardiac physiologist at Salisbury District Hospital, said: "We are involved with any investigation relating to your heart. We see everybody from babies to the elderly; if patients have a problem with their heart, we are involved in their care."

"Cardiac physiology is one of more than 20 healthcare sciences. They involve the life sciences, such as microbiology, histology and genetics; the physical sciences, like medical engineering, medical physics and nuclear medicine; and then the physiological sciences, with people going into audiology and neurophysiology," Claire says.

"Cardiac physiology is very patient-focused, which is what appealed to me.

"Some of the other healthcare sciences are much more in the lab or using physics.

"Cardiology is always changing; there's a lot of investment into cardiac health care and there's always research going on and new developments.

"We use a lot of technology and equipment and there is always something to learn, which I really enjoy.

"You get that feeling of really making a difference; if someone comes in with a heart rate of 20 and has a pacemaker fitted, they're instantly better. That's so satisfying."

Procedures generally take between 10 minutes and half an hour, with a cardiac physiologist writing up a report after analysing the results. "We are quite autonomous in our working," Claire says. "For example, after doing an ultrasound, we would create a technical report on what we have found, which goes back to the clinician who's asked us to do it, and they will then prescribe medication or further procedures."

The most common route into cardiac physiology is a BSc in clinical physiology. Once complete, you are a healthcare science practitioner, becoming a scientist after completing the three-year Scientist Training Programme (STP).

"People are educated to different levels within the scientific banner; people have come in with degrees in medical engineering or microbiology, and a lot of people have masters or PhDs," Claire says.

"As healthcare scientists, our careers have come across quite convoluted paths to get to where we are currently in the healthcare science programme.

"Today, everyone will be doing science and maths as A-levels, but after that point it can be very split. A lot of our scientist programmes are done as part of a national recruitment process to train people.

"Previously, as a trainee cardiac physiologist, you would be employed by a hospital and do your training over four years, which involved being hospital-based and going to university on block release, but now you come out of university with qualifications and then look for a job."

Claire's own route into the profession involved going straight from GCSEs into a two-year regional training programme.

"I started in 1990," she said. "The first eight months was spent in audiology, neurophysiology, respiratory physiology and cardiology. After that I chose which one, and as I just loved cardiology, I spent the rest of the two years on that, doing a BTEC in medical physics and physiological measurements."

Claire has been at SDH for 15 years and is one of 12 cardiac physiologists. For any students considering a career in cardiac physiology, "you have to be of scientific mind and enjoy the sciences," she says.

"It's important to like working with people; you need to be prepared to talk to anybody, have excellent communication skills, as we often have to explain complicated information to our patients or their carers, and have an interest in technology, because everything we do is with medical equipment.

"It's also about being able to keep calm in stressful situations and work as a team.

"From a practical aspect, we don't tend to work shifts, although some of our role includes being on call from home, so it is very appealing from that point of view."

Read the original post:
Behind the scenes of cardiac physiology at Salisbury District Hospital - Salisbury Journal

How MLB teams are using neuroscience to try to gain a competitive advantage – CBSSports.com

Long before Moneyball came along, the game of baseball embraced numbers and statistical analysis. Every single team in baseball has a stats department. The ways and means have certainly changed, but the goal remains the same: use numbers to find a competitive advantage.

These days, sophisticated tracking systems like PitchFX and Statcast record pretty much everything that happens on the field: how much a pitch moves, how quickly an outfielder takes his first step, how hard a catcher throws down to second base on a stolen base attempt. You name it, and there's a number for it.

R.J. Anderson recently spoke to several folks within baseball to get an idea of how teams are acquiring and using their data in the post-Moneyball era. There are, surprisingly, many outside vendors involved. This nugget from Anderson's piece stood out to me:

"Right now, we have teams out there, who, when they evaluate a player, they're taking their 2017 schedule, they are prototyping the opposing pitcher array -- perhaps, if they're really sophisticated, even assuming what the seventh, eighth, and ninth innings look like against those teams -- and they're simulating a batter's performance, a prospective acquisition, his performance against that pitching opposition, in those ballparks," Gennaro said. "Because they're not looking at his stats, they're looking at his exit velocity and his launch angle.

"If you hit the ball 86 mph to, let's call it, straight-away right field at Yankee Stadium at a 32-degree launch angle, depending on the wind, that probably drops into the first couple of rows. If you hit that same velocity and launch angle at AT&T Park, Hunter Pence is taking two steps in to field it. So, all of those things are being incorporated into the analytics of the most sophisticated teams."

That's wild. Teams are essentially modeling an entire season using all the available data to get an idea of how a player may perform for them. These are still human beings, of course, so nothing is 100 percent predictive, but if you have the data available, why wouldn't you use it to try to forecast performance?

The days of targeting players with high on-base percentages because the rest of the league is undervaluing them are long, long gone. Now teams are using information in more complex ways to gain an advantage. One method is neuroscience, the science of brain function and reaction.

Baseball players are human beings. You can't ignore the human element. Everyone's brain works differently, and it's impossible to quantify that variable. That hasn't stopped teams from trying, though. A few teams started exploring neuroscience years ago, and more and more clubs have caught on since.

The Red Sox were among the first clubs to buy into neuroscience, back when Theo Epstein was running the show. As Brian Costa of the Wall Street Journal explained in 2014, the Red Sox partnered with a company called NeuroScouting that developed software to measure and improve a player's reaction ability. From Costa:

Nonetheless, teams' interest in the neuroscience of hitting is only growing. What began as a training tool for the Red Sox has also become a scouting device. Before each amateur draft, the Red Sox assess hitting prospects in part based on how well they score on the NeuroScouting games.

Mookie Betts, Boston's fifth-round draft pick in 2011, recalled meeting with a Red Sox scout in an empty classroom one day during his lunch period at a Tennessee high school. At the scout's request, he completed a series of games on a laptop. "I was thinking, 'What does this have to do with baseball?'" Betts said. "I guess I did pretty well, since he kept on pursuing me."

Betts, 21, said the daily NeuroScouting drills he did in the minors helped put him on a fast track to the majors. "It gets your brain going," he said. In 43 games through Thursday, his .363 on-base percentage ranked second among major-league rookies behind Chicago White Sox star Jose Abreu.

Teams are, quite literally, attempting to measure how quickly a player's brain reads a situation and reacts with a decision. Grading out well in neuroscouting doesn't guarantee success, the same way having a pretty swing or a 95 mph fastball doesn't guarantee success. It's one tool in the shed, and the more tools available to you, the more likely you are to get the job done successfully.

Original post:
How MLB teams are using neuroscience to try to gain a competitive advantage - CBSSports.com