Saturday, 30 August 2014

Flapping baby birds give clues to origin of flight

Date:
August 28, 2014
Source:
University of California - Berkeley
Summary:
The origin of flight is a contentious issue: some argue that tree-climbing dinosaurs learned to fly in order to avoid hard falls. Others favor the story that theropod dinosaurs ran along the ground and pumped their forelimbs to gain lift, eventually taking off. New evidence showing the early development of aerial righting in birds favors the tree-dweller hypothesis.

How did the earliest birds take wing? Did they fall from trees and learn to flap their forelimbs to avoid crashing? Or did they run along the ground and pump their "arms" to get aloft?
The answer is buried 150 million years in the past, but a new University of California, Berkeley, study provides a fresh piece of evidence -- birds have an innate ability to maneuver in midair, a talent that could have helped their ancestors learn to fly rather than fall from a perch.
The study looked at how baby birds, in this case chukar partridges, pheasant-like game birds from Eurasia, react when they fall upside down.
The researchers, Dennis Evangelista, now a postdoctoral researcher at the University of North Carolina, Chapel Hill, and Robert Dudley, UC Berkeley professor of integrative biology, found that even ungainly, day-old baby birds successfully use their flapping wings to right themselves when they fall from a nest, a skill that improves with age until they become coordinated and graceful flyers.
"From day one, post-hatching, 25 percent of these birds can basically roll in midair and land on their feet when you drop them," said Dudley, who also is affiliated with the Smithsonian Tropical Research Institute in Balboa, Panama. "This suggests that even rudimentary wings can serve a very useful aerodynamic purpose."
Flapping and rolling
The nestlings right themselves by pumping their wings asymmetrically to flip or roll. By nine days after hatching, 100 percent of the birds in the study had developed coordinated or symmetric flapping, plus body pitch control to right themselves.
"These abilities develop very quickly after hatching, and occur before other previously described uses of the wings, such as for weight support during wing-assisted incline running," said Evangelista, who emphasized that no chukar chicks were injured in the process. "The results highlight the importance of maneuvering and control in development and evolution of flight in birds."
The researchers' study appeared Aug. 27 in the online journal Biology Letters, published by the Royal Society.
Dudley has argued for a decade that midair maneuverability preceded the development of flapping flight and allowed the ancestors of today's birds to effectively use their forelimbs as rudimentary wings. The new study shows that aerial righting using uncoordinated, asymmetric wing flapping is a very early development.
Righting behavior probably evolved because "nobody wants to be upside down, and it's particularly dangerous if you're falling in midair," Dudley said. "But once animals without wings have this innate aerial righting behavior, when wings came along it became easier, quicker and more efficient."
Dudley noted that some scientists hypothesize that true powered flight originated in the theropod dinosaurs, the ancestors to birds, when they used symmetric wing flapping while running up an incline, a behavior known as wing-assisted incline running, or WAIR. WAIR proponents argue that the wings assist running by providing lift, like the spoiler on a race car, and that the ability to steer or maneuver is absent early in evolution.
Falling, gliding and flying
Such activity has never been regularly observed in nature, however, and Dudley favors the scenario that flight developed in tree-dwelling animals falling and eventually evolving the ability to glide and fly. He has documented many ways that animals in the wild, from lizards and lemurs to ants, use various parts of their bodies to avoid hard landings on the ground. Practically every animal that has been tested is able to turn upright, and a great many, even ones that do not look like fliers, have some ability to steer or maneuver in the air.
Contrary to WAIR, maneuvering is very important at all stages of flight evolution and must have been present early, Evangelista said. Seeing it develop first in very young chicks indirectly supports this idea.
"Symmetric flapping while running is certainly one possible context in which rudimentary wings could have been used, but it kicks in rather late in development relative to asymmetric flapping," Dudley added. "This experiment illustrates that there is a much broader range of aerodynamic capacity available for animals with these tiny, tiny wings than has been previously realized."
The researchers also tested the young chicks to see if they flapped their wings while running up an incline. None did.

NASA's Spitzer Space Telescope witnesses asteroid smashup

Date:
August 28, 2014
Source:
NASA/Jet Propulsion Laboratory
Summary:
NASA's Spitzer Space Telescope has spotted an eruption of dust around a young star, possibly the result of a smashup between large asteroids. This type of collision can eventually lead to the formation of planets.
Image: This artist's concept shows the immediate aftermath of a large asteroid impact around NGC 2547-ID8, a 35-million-year-old sun-like star thought to be forming rocky planets.

NASA's Spitzer Space Telescope has spotted an eruption of dust around a young star, possibly the result of a smashup between large asteroids. This type of collision can eventually lead to the formation of planets.
Scientists had been regularly tracking the star, called NGC 2547-ID8, when it surged with a huge amount of fresh dust between August 2012 and January 2013.
"We think two big asteroids crashed into each other, creating a huge cloud of grains the size of very fine sand, which are now smashing themselves into smithereens and slowly leaking away from the star," said lead author and graduate student Huan Meng of the University of Arizona, Tucson.
While dusty aftermaths of suspected asteroid collisions have been observed by Spitzer before, this is the first time scientists have collected data before and after a planetary system smashup. The viewing offers a glimpse into the violent process of making rocky planets like ours.
Rocky planets begin life as dusty material circling around young stars. The material clumps together to form asteroids that ram into each other. Although the asteroids often are destroyed, some grow over time and transform into proto-planets. After about 100 million years, the objects mature into full-grown, terrestrial planets. Our moon is thought to have formed from a giant impact between proto-Earth and a Mars-size object.
In the new study, Spitzer set its heat-seeking infrared eyes on the dusty star NGC 2547-ID8, which is about 35 million years old and lies 1,200 light-years away in the Vela constellation. Previous observations had already recorded variations in the amount of dust around the star, hinting at possible ongoing asteroid collisions. In hope of witnessing an even larger impact, which is a key step in the birth of a terrestrial planet, the astronomers turned to Spitzer to observe the star regularly. Beginning in May 2012, the telescope began watching the star, sometimes daily.
A dramatic change in the star came during a time when Spitzer had to point away from NGC 2547-ID8 because our sun was in the way. When Spitzer started observing the star again five months later, the team was shocked by the data they received.
"We not only witnessed what appears to be the wreckage of a huge smashup, but have been able to track how it is changing -- the signal is fading as the cloud destroys itself by grinding its grains down so they escape from the star," said Kate Su of the University of Arizona and co-author on the study. "Spitzer is the best telescope for monitoring stars regularly and precisely for small changes in infrared light over months and even years."
A very thick cloud of dusty debris now orbits the star in the zone where rocky planets form. As the scientists observe the star system, the infrared signal from this cloud varies based on what is visible from Earth. For example, when the elongated cloud is facing us, more of its surface area is exposed and the signal is greater. When the head or the tail of the cloud is in view, less infrared light is observed. By studying the infrared oscillations, the team is gathering first-of-its-kind data on the detailed process and outcome of collisions that create rocky planets like Earth.
"We are watching rocky planet formation happen right in front of us," said George Rieke, a University of Arizona co-author of the new study. "This is a unique chance to study this process in near real-time."
The team is continuing to keep an eye on the star with Spitzer. They will see how long the elevated dust levels persist, which will help them calculate how often such events happen around this and other stars. And they might see another smashup while Spitzer looks on.
The results of this study are posted online Thursday in the journal Science.
NASA's Jet Propulsion Laboratory in Pasadena, California, manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate in Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Spacecraft operations are based at Lockheed Martin Space Systems Company in Littleton, Colorado. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.

How the zebrafish gets its stripes: Uncovering how beautiful color patterns can develop in animals

Date:
August 28, 2014
Source:
Max-Planck-Gesellschaft
Summary:
The zebrafish, a small fresh water fish, owes its name to a striking pattern of blue stripes alternating with golden stripes. Three major pigment cell types, black cells, reflective silvery cells, and yellow cells emerge during growth in the skin of the tiny juvenile fish and arrange as a multi-layered mosaic to compose the characteristic color pattern. While it was known that all three cell types have to interact to form proper stripes, the embryonic origin of the pigment cells that develop the stripes of the adult fish has remained a mystery up to now. Scientists have now discovered how these cells arise and behave to form the 'zebra' pattern.

The zebrafish, a small fresh water fish, owes its name to a striking pattern of blue stripes alternating with golden stripes. Three major pigment cell types, black cells, reflective silvery cells, and yellow cells emerge during growth in the skin of the tiny juvenile fish and arrange as a multilayered mosaic to compose the characteristic colour pattern. While it was known that all three cell types have to interact to form proper stripes, the embryonic origin of the pigment cells that develop the stripes of the adult fish has remained a mystery up to now. Scientists of the Max Planck Institute for Developmental Biology in Tübingen have now discovered how these cells arise and behave to form the 'zebra' pattern. Their work may help to understand the development and evolution of the great diversity of striking patterns in the animal world.
Beauty in the living world amazes poets, philosophers and scientists alike. Nobel prize laureate Christiane Nüsslein-Volhard, Director of the Department for Genetics at the Max Planck Institute for Developmental Biology, has long been fascinated by the biology behind the colour patterns displayed by animals. Her group uses zebrafish as a model organism to study the genetic basis of animal development.
New research by Nüsslein-Volhard's laboratory published in Science shows that the yellow cells undergo dramatic changes in cell shape to tint the stripe pattern of zebrafish. "We were surprised to observe such cell behaviours, as these were totally unexpected from what we knew about colour pattern formation," says Prateek Mahalwar, first author of the study. The study builds on previous work from the laboratory, published in June this year in Nature Cell Biology (NCB), tracing the cell behaviour of silvery and black cells. Both studies describe diligent experiments to uncover the cellular events during stripe pattern formation. Individual juvenile fish carrying fluorescently labelled pigment cell precursors were imaged every day for up to three weeks to chart out the cellular behaviours. This enabled the scientists to trace the multiplication, migration and spreading of individual cells and their progeny over the entire patterning process of stripe formation in the living and growing animal. "We had to develop a very gentle procedure to be able to observe individual fish repeatedly over long periods of time. So we used a state-of-the-art microscope which allowed us to reduce the adverse effects of fluorescence illumination to a minimum," says Ajeet Singh, first author of the earlier NCB study.
Surprisingly, the analysis revealed that the three cell types reach the skin by completely different routes: A pluripotent cell population situated at the dorsal side of the embryo gives rise to larval yellow cells, which cover the skin of the embryo. These cells begin to multiply at the onset of metamorphosis, when the fish is about two to three weeks old. However, the black and silvery cells come from a small set of stem cells associated with nerve nodes located close to the spinal cord in each segment. The black cells reach the skin by migrating along the segmental nerves to appear in the stripe region, whereas the silvery cells pass through the longitudinal cleft that separates the musculature and then multiply and spread in the skin.
Brigitte Walderich, a co-author of the Science paper, who performed cell transplantations to trace the origin of yellow cells, explains: "My attempt was to create small clusters of fluorescently labelled cells in the embryo which could be followed during larval and juvenile stages to unravel growth and behaviour of the yellow cells. We were surprised to discover that they divide and multiply as differentiated cells to cover the skin of the fish before the silvery and black cells arrive to form the stripes."
A striking observation is that both the silvery and yellow cells are able to switch cell shape and colour, depending on their location. The yellow cells compact to closely cover the dense silvery cells forming the light stripe, colouring it golden, and acquire a loose stellate shape over the black cells of the stripes. The silvery cells thinly spread over the stripe region, giving it a blue tint. They switch shape again at a distance into the dense form to aggregate, forming a new light stripe. These cell behaviours create a series of alternating light and dark stripes. The precise superposition of the dense form of silvery and yellow cells in the light stripe, and the loose silvery and yellow cells superimposed over the black cells in the stripe cause the striking contrast between the golden and blue coloration of the pattern.
The authors speculate that variations on these cell behaviours could be at play in generating the great diversity of colour patterns in fish. "These findings inform our way of thinking about colour pattern formation in other fish, but also in animals which are not accessible to direct observation during development such as peacocks, tigers and zebras," says Nüsslein-Volhard -- wondering how her cats got their stripes.

Astronomy: Radio telescopes settle controversy over distance to Pleiades

Date:
August 28, 2014
Source:
National Radio Astronomy Observatory
Summary:
A worldwide network of radio telescopes measured the distance to the famous star cluster the Pleiades to an accuracy within 1 percent. The result resolved a controversy raised by a satellite's measurement that now is shown to be wrong. The incorrect measurement had challenged standard models of star formation and evolution.
Image: With the parallax technique, astronomers observe an object from opposite ends of Earth's orbit around the Sun to precisely measure its distance.
Astronomers have used a worldwide network of radio telescopes to resolve a controversy over the distance to a famous star cluster -- a controversy that posed a potential challenge to scientists' basic understanding of how stars form and evolve. The new work shows that the measurement made by a cosmic-mapping research satellite was wrong.
The astronomers studied the Pleiades, the famous "Seven Sisters" star cluster in the constellation Taurus, easily seen in the winter sky. The cluster includes hundreds of young, hot stars formed about 100 million years ago. As a nearby example of such young clusters, the Pleiades have served as a key "cosmic laboratory" for refining scientists' understanding of how similar clusters form. In addition, astronomers have used the measured physical characteristics of Pleiades stars as a tool for estimating the distance to other, more distant, clusters.
Until the 1990s, the consensus was that the Pleiades are about 430 light-years from Earth. However, the European satellite Hipparcos, launched in 1989 to precisely measure the positions and distances of thousands of stars, produced a distance measurement of only about 390 light-years.
"That may not seem like a huge difference, but, in order to fit the physical characteristics of the Pleiades stars, it challenged our general understanding of how stars form and evolve," said Carl Melis, of the University of California, San Diego. "To fit the Hipparcos distance measurement, some astronomers even suggested that some type of new and unknown physics had to be at work in such young stars," he added.
To solve the problem, Melis and his colleagues used a global network of radio telescopes to make the most accurate possible distance measurement. The network included the Very Long Baseline Array (VLBA), a system of 10 radio telescopes ranging from Hawaii to the Virgin Islands; the Robert C. Byrd Green Bank Telescope in West Virginia; the 1,000-foot-diameter William E. Gordon Telescope of the Arecibo Observatory in Puerto Rico; and the Effelsberg Radio Telescope in Germany.
"Using these telescopes working together, we had the equivalent of a telescope the size of the Earth," said Amy Miouduszewski, of the National Radio Astronomy Observatory (NRAO). "That gave us the ability to make extremely accurate position measurements -- the equivalent of measuring the thickness of a quarter in Los Angeles as seen from New York," she added.
The astronomers used this system to observe several Pleiades stars over about a year and a half to precisely measure the apparent shift in each star's position caused by Earth's orbital motion around the Sun. Seen from opposite ends of the Earth's orbit, a star appears to move slightly against the backdrop of more-distant cosmic objects. Called parallax, the technique is the most accurate distance-measuring method astronomers have, and relies on simple trigonometry.
The result of their work is a distance to the Pleiades of 443 light-years, accurate, the astronomers said, to within one percent. This is the most accurate and precise measurement yet made of the Pleiades distance.
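As a rough illustration of the trigonometry involved (a back-of-the-envelope sketch, not the team's actual analysis code), the distance follows from the annual parallax angle via the standard small-angle relation p[arcseconds] = 1 / d[parsecs]; at 443 light-years the expected parallax is only a few thousandths of an arcsecond, which is why a continent-spanning radio array was needed:

    LY_PER_PARSEC = 3.2616  # light-years in one parsec

    def parallax_mas(distance_ly):
        # Small-angle relation: parallax (arcseconds) = 1 / distance (parsecs).
        distance_pc = distance_ly / LY_PER_PARSEC
        return 1000.0 / distance_pc  # convert arcseconds to milliarcseconds

    distance_ly = 443.0  # the new radio-telescope distance to the Pleiades
    p = parallax_mas(distance_ly)
    print(f"parallax at {distance_ly} ly: {p:.2f} mas")            # roughly 7.4 mas
    print(f"1% distance accuracy needs ~{0.01 * p:.2f} mas precision")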
"This is a relief," Melis said, because the newly-measured distance is close enough to the pre-Hipparcos distance that the standard scientific models of star formation accurately represent the stars in the Pleiades.
"The question now is what happened to Hipparcos?" Melis said. Over four years of operation, the spacecraft measured distances to 118,000 stars. The cause of its error in measuring the distance to the Pleiades is unknown. Another spacecraft, Gaia, launched in December of 2013, will use similar technology to measure distances of about one billion stars.
"Radio-telescope systems such as the one we used for the Pleiades will provide a crucial cross-check to insure the accuracy of Gaia's measurements," said Mark Reid, of the Harvard-Smithsonian Center for Astrophysics.
Many ancient cultures, including Native Americans, used the Pleiades as a test of vision. The more Pleiades stars one can discern -- typically five to nine -- the better one's vision.
"Now we've used a system that provides modern astronomy's sharpest 'vision' to solve a longstanding scientific debate about the Pleiades themselves," said Melis.
Melis, Mioduszewski, and Reid worked with John Stauffer of the Spitzer Science Center, and Geoffrey Bower of the Academia Sinica Institute of Astronomy and Astrophysics. The scientists published their findings in the 29 August issue of the journal Science.
The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

Genomic sequencing reveals mutations, insights into 2014 Ebola outbreak

Date:
August 28, 2014
Source:
Broad Institute of MIT and Harvard
Summary:
In response to an ongoing, unprecedented outbreak of Ebola virus disease in West Africa, a team of researchers has rapidly sequenced and analyzed 99 Ebola virus genomes. Their findings could have important implications for rapid field diagnostic tests.

In response to an ongoing, unprecedented outbreak of Ebola virus disease (EVD) in West Africa, a team of researchers from the Broad Institute and Harvard University, in collaboration with the Sierra Leone Ministry of Health and Sanitation and researchers across institutions and continents, has rapidly sequenced and analyzed 99 Ebola virus genomes. Their findings could have important implications for rapid field diagnostic tests. The team reports its results online in the journal Science.
For the current study, researchers sequenced 99 Ebola virus genomes collected from 78 patients diagnosed with Ebola in Sierra Leone during the first 24 days of the outbreak (a portion of the patients contributed samples more than once, allowing researchers a clearer view into how the virus can change in a single individual over the course of infection). The team found more than 300 genetic changes that make the 2014 Ebola virus genomes distinct from the viral genomes tied to previous Ebola outbreaks. They also found sequence variations indicating that, from the samples sequenced, the EVD outbreak started from a single introduction into humans, subsequently spreading from person to person over many months.
The variations they identified were frequently in regions of the genome encoding proteins. Some of the genetic variation detected in these studies may affect the primers (starting points for DNA synthesis) used in PCR-based diagnostic tests, emphasizing the importance of genomic surveillance and the need for vigilance. To accelerate response efforts, the research team released the full-length sequences on National Center for Biotechnology Information's (NCBI's) DNA sequence database in advance of publication, making these data available to the global scientific community.
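To see why accumulating mutations matter for PCR-based diagnostics, consider a toy comparison between a primer and its binding site: even a single mismatch where the primer anneals can weaken amplification. The sequences below are invented for illustration only and are not actual Ebola diagnostic primers.

    def count_mismatches(primer, binding_site):
        # Count base mismatches between a primer and a same-length binding site.
        if len(primer) != len(binding_site):
            raise ValueError("primer and binding site must be the same length")
        return sum(1 for p, t in zip(primer, binding_site) if p != t)

    # Hypothetical sequences for illustration only.
    primer         = "ATGGACAGGCTTCCA"
    reference_site = "ATGGACAGGCTTCCA"   # binding site in an older reference genome
    outbreak_site  = "ATGGATAGGCTTCCA"   # same site carrying one new mutation

    print(count_mismatches(primer, reference_site))  # 0 -- primer matches perfectly
    print(count_mismatches(primer, outbreak_site))   # 1 -- mismatch may weaken the assay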
"By making the data immediately available to the community, we hope to accelerate response efforts," said co-senior author Pardis Sabeti, a senior associate member at the Broad Institute and an associate professor at Harvard University. "Upon releasing our first batch of Ebola sequences in June, some of the world's leading epidemic specialists contacted us, and many of them are now also actively working on the data. We were honored and encouraged. A spirit of international and multidisciplinary collaboration is needed to quickly shed light on the ongoing outbreak."
The 2014 Zaire ebolavirus (EBOV) outbreak is unprecedented both in its size and in its emergence in multiple populated areas. Previous outbreaks had been localized mostly to sparsely populated regions of Middle Africa, with the largest outbreak in 1976 reporting 318 cases. The 2014 outbreak has manifested in the more densely-populated West Africa, and since it was first reported in Guinea in March 2014, 2,240 cases have been reported with 1,229 deaths (as of August 19).
Augustine Goba, Director of the Lassa Laboratory at the Kenema Government Hospital and a co-first author of the paper, identified the first Ebola virus disease case in Sierra Leone using PCR-based diagnostics. "We established surveillance for Ebola well ahead of the disease's spread into Sierra Leone and began retrospective screening for the disease on samples as far back as January of this year," said Goba. "This was possible because of our long-standing work to diagnose and study another deadly disease, Lassa fever. We could thus identify cases and trace the Ebola virus spread as soon as it entered our country."
The research team increased the amount of genomic data available on the Ebola virus fourfold and used the technique of "deep sequencing" on all available samples. Deep sequencing is sequencing done enough times to generate high confidence in the results. In this study, researchers sequenced at a depth of 2,000 times on average for each Ebola genome to get an extremely close-up view of the virus genomes from 78 patients. This high-resolution view allowed the team to detect multiple mutations that alter protein sequences -- potential targets for future diagnostics, vaccines, and therapies.
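The quoted depth is simply the average number of reads covering each position in the genome. A minimal sketch of that arithmetic, assuming a rough ~19-kilobase Ebola genome and hypothetical read counts (not figures from the study):

    def mean_coverage(num_reads, read_length, genome_length):
        # Average depth of coverage = total sequenced bases / genome length.
        return num_reads * read_length / genome_length

    EBOLA_GENOME_BP = 19_000        # approximate Ebola virus genome size (~19 kb)
    reads, read_len = 380_000, 100  # hypothetical read count and read length

    depth = mean_coverage(reads, read_len, EBOLA_GENOME_BP)
    print(f"~{depth:.0f}x mean coverage")  # ~2000x, the average depth cited in the study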
The Ebola strains responsible for the current outbreak likely have a common ancestor, dating back to the very first recorded outbreak in 1976. The researchers also traced the transmission path and evolutionary relationships of the samples, revealing that the lineage responsible for the current outbreak diverged from the Middle African version of the virus within the last ten years and spread from Guinea to Sierra Leone by 12 people who had attended the same funeral.
The team's catalog of 395 mutations (over 340 that distinguish the current outbreak from previous ones, and over 50 within the West African outbreak) may serve as a starting point for other research groups. "We've uncovered more than 300 genetic clues about what sets this outbreak apart from previous outbreaks," said Stephen Gire, a research scientist in the Sabeti lab at the Broad Institute and Harvard. "Although we don't know whether these differences are related to the severity of the current outbreak, by sharing these data with the research community, we hope to speed up our understanding of this epidemic and support global efforts to contain it."
"There is an extraordinary battle still ahead, and we have lost many friends and colleagues already like our good friend and colleague Dr. Humarr Khan, a co-senior author here," said Sabeti. "By providing this data to the research community immediately and demonstrating that transparency and partnership is one way we hope to honor Humarr's legacy. We are all in this fight together."
This work was supported by the Common Fund and the National Institute of Allergy and Infectious Diseases of the National Institutes of Health, Department of Health and Human Services, as well as by the National Science Foundation, the European Union Seventh Framework Programme, the World Bank, and the Natural Environment Research Council.
Other researchers who contributed to this work include Augustine Goba, Kristian G. Andersen, Rachel S. G. Sealfon, Daniel J. Park, Lansana Kanneh, Simbirie Jalloh, Mambu Momoh, Mohamed Fullah, Gytis Dudas, Shirlee Wohl, Lina M. Moses, Nathan L. Yozwiak, Sarah Winnicki, Christian B. Matranga, Christine M. Malboeuf, James Qu, Adrianne D. Gladden, Stephen F. Schaffner, Xiao Yang, Pan-Pan Jiang, Mahan Nekoui, Andres Colubri, Moinya Ruth Coomber, Mbalu Fonnie, Alex Moigboi, Michael Gbakie, Fatima K. Kamara, Veronica Tucker, Edwin Konuwa, Sidiki Saffa, Josephine Sellu, Abdul Azziz Jalloh, Alice Kovoma, James Koninga, Ibrahim Mustapha, Kandeh Kargbo, Momoh Foday, Mohamed Yillah, Franklyn Kanneh, Willie Robert, James L. B. Massally, Sinéad B. Chapman, James Bochicchio, Cheryl Murphy, Chad Nusbaum, Sarah Young, Bruce W. Birren, Donald S.Grant, John S. Scheiffelin, Eric S. Lander, Christian Happi, Sahr M. Gevao, Andreas Gnirke, Andrew Rambaut, Robert F. Garry, and S. Humarr Khan.

Prehistoric migrations: DNA study unravels the settlement history of the New World Arctic

Date:
August 28, 2014
Source:
University of Copenhagen
Summary:
A new DNA study unravels the settlement history of the New World Arctic. We know people have lived in the New World Arctic for about 5,000 years. Archaeological evidence clearly shows that a variety of cultures survived the harsh climate in Alaska, Canada and Greenland for thousands of years. Despite this, there are several unanswered questions about these people.

We know people have lived in the New World Arctic for about 5,000 years. Archaeological evidence clearly shows that a variety of cultures survived the harsh climate in Alaska, Canada and Greenland for thousands of years. Despite this, there are several unanswered questions about these people: Where did they come from? Did they come in several waves? When did they arrive? Who are their descendants? And who can call themselves the indigenous peoples of the Arctic? We can now answer some of these questions, thanks to a comprehensive DNA study of current and former inhabitants of Greenland, Arctic Canada, Alaska, the Aleutian Islands and Siberia, conducted by an international team headed by the Centre for GeoGenetics at the Natural History Museum of Denmark, University of Copenhagen.
The results have just been published in the scientific journal Science.
Image: Looking for ancient human remains in northern Greenland.
The North American Arctic was one of the last major regions to be settled by modern humans. This happened when people crossed the Bering Strait from Siberia and wandered into a new world. While the area has long been well researched by archaeologists, little is known of its genetic prehistory. In this study, researchers show that the Paleo-Eskimo, who lived in the Arctic from about 5,000 years ago until about 700 years ago, represented a distinct wave of migration, separate from both Native Americans -- who crossed the Bering Strait much earlier -- and the Inuit, who came from Siberia to the Arctic several thousand years after the Paleo-Eskimos.
"Our genetic studies show that, in reality, the Paleo-Eskimos -- representing one single group -- were the first people in the Arctic, and they survived without outside contact for over 4,000 years," says Lundbeck Foundation Professor Eske Willerslev from Centre for GeoGenetics at the Natural History Museum, University of Copenhagen, who headed the study.
"Our study also shows that the Paleo-Eskimos, after surviving in near-isolation in the harsh Arctic environment for more than 4,000 years, disappeared around 700 years ago -- about the same time when the ancestors of modern-day Inuit spread eastward from Alaska," adds Dr. Maanasa Raghavan of Centre for GeoGenetics and lead author of the article.
Migration pulses into the Americas
Image: Greenlandic Inuit from the 1930s pictured in their traditional boats (umiaq), used for hunting and transportation.
In the archaeological literature, distinctions are drawn between the different cultural units in the Arctic in the period up to the rise of the Thule culture, which replaced all previous Arctic cultures and is the source of today's Inuit in Alaska, Canada and Greenland. The earlier cultures included the Saqqaq or Pre-Dorset and Dorset, comprising the Paleo-Eskimo tradition, with the Dorset being further divided into three phases. All of these had distinctive cultural, lifestyle and subsistence traits as seen in the archaeological record. There were also several periods during which the Arctic was devoid of human settlement. These facts have further raised questions regarding the possibility of several waves of migration from Siberia to Alaska, or perhaps Native Americans migrating north during the first 4,000 years of the Arctic being inhabited.
"Our study shows that, genetically, all of the different Paleo-Eskimo cultures belonged to the same group of people. On the other hand, they are not closely related to the Thule culture, and we see no indication of assimilation between the two groups. We have also ascertained that the Paleo-Eskimos were not descendants of the Native Americans. The genetics reveals that there must have been at least three separate pulses of migration from Siberia into the Americas and the Arctic. First came the ancestors of today's Native Americans, then came the Paleo-Eskimos, and finally the ancestors of today's Inuit," says Eske Willerslev.
Genetics and archaeology
The genetic study underpins some archaeological findings, but not all of them.
It rejects the speculation that the Paleo-Eskimos represented several different peoples, including Native Americans, or that they are direct ancestors of today's Inuit. Also rejected are the theories that the Greenlanders on the east coast or the Canadian Sadlermiut, from Southampton Island in Hudson Bay, who died out as late as 1902-03, were surviving groups of Dorset people. Genetics shows that these groups were Inuit who had developed Dorset-like cultural traits.
The study clearly shows that the diversity of tools and ways of life over time, which in archaeology is often interpreted as a result of migration, does not in fact necessarily reflect influx of new people. The Paleo-Eskimos lived in near-isolation for more than 4,000 years, and during this time their culture developed in such diverse ways that it has led some to interpret them as different peoples.
"Essentially, we have two consecutive waves of genetically distinct groups entering the New World Arctic and giving rise to three discrete cultural units. Through this study, we are able to address the question of cultural versus genetic continuity in one of the most challenging environments that modern humans have successfully settled, and present a comprehensive picture of how the Arctic was peopled," says Dr. Raghavan.
The first inhabitants
The study was unable to establish why the disappearance of the Paleo-Eskimos coincided with the ancestors of the Inuit beginning to colonise the Arctic. There is no doubt that the Inuit ancestors -- who crossed the Bering Strait about 1,000 years ago and reached Greenland around 700 years ago -- were technologically superior.
The Inuit's own myths tell stories of a people before them, which in all likelihood refer to the Paleo-Eskimos. In the myths, they are referred to as the 'Tunit' or 'Sivullirmiut', which means "the first inhabitants." According to these myths they were giants, who were taller and stronger than the Inuit, but easily frightened from their settlements by the newcomers.
Co-author Dr. William Fitzhugh from the Arctic Studies Centre at the Smithsonian Institution says: "Ever since the discovery of a Paleo-Eskimo culture in the North American Arctic in 1925, archaeologists have been mystified by their relationship with the Thule culture ancestors of the modern Inuit. Paleo-Eskimo culture was replaced rapidly around AD 1300-1400, their only traces being references to 'Tunit' in Inuit mythology and adoption of some elements of Dorset technology. This new genomic research settles outstanding issues in Arctic archaeology that have been debated for nearly a century, finding that Paleo-Eskimo and Neo-Eskimo people were genetically distinct, with separate origins in Eastern Siberia, and the Paleo-Eskimo remained isolated in the Eastern Arctic for thousands of years with no significant mixing with each other or with American Indians, Norse, or other Europeans."

New research reveals how wild rabbits were genetically transformed into tame rabbits

Date:
August 28, 2014
Source:
Uppsala University
Summary:
The genetic changes that transformed wild animals into domesticated forms have long been a mystery. An international team of scientists has now made a breakthrough by showing that many genes controlling the development of the brain and the nervous system were particularly important for rabbit domestication. The study gives answers to many genetic questions.

The genetic changes that transformed wild animals into domesticated forms have long been a mystery. An international team of scientists has now made a breakthrough by showing that many genes controlling the development of the brain and the nervous system were particularly important for rabbit domestication. The study is published today in Science and gives answers to many genetic questions.
The domestication of animals and plants, a prerequisite for the development of agriculture, is one of the most important technological revolutions during human history. Domestication of animals started as early as 9,000 to 15,000 years ago and initially involved dogs, cattle, sheep, goats and pigs. The rabbit was domesticated much later, about 1,400 years ago, at monasteries in southern France. It has been claimed that rabbits were domesticated because the Catholic Church had declared that young rabbits were not considered meat, but fish, and could therefore be eaten during Lent! When domestication occurred, the wild ancestor, the European rabbit (Oryctolagus cuniculus), was confined to the Iberian Peninsula and southern France.
"There are several reasons why the rabbit is an outstanding model for genetic studies of domestication: its domestication was relatively recent, we know where it happened, and this region is still densely populated with wild rabbits," explains Miguel Carneiro, from CIBIO/Inbio-University of Porto, one of the leading authors on the paper. "Wild rabbits also serve as an excellent model for genetic studies of the early stages of species formation, as shown in an accompanying study we publish today in PLoS Genetics," adds Miguel Carneiro.
The scientists first sequenced the entire genome of one domestic rabbit to develop a reference genome assembly. Then they resequenced entire genomes of domestic rabbits representing six different breeds and wild rabbits sampled at 14 different places across the Iberian Peninsula and southern France.
"No previous study on animal domestication has involved such a careful examination of genetic variation in the wild ancestral species. This allowed us to pinpoint the genetic changes that have occurred during rabbit domestication," says Leif Andersson, Uppsala University, Swedish University of Agricultural Sciences and Texas A&M University.
In contrast to domestic rabbits, wild rabbits have a very strong flight response because they are hunted by eagles, hawks, foxes and humans, and therefore must be very alert and reactive to survive in the wild. In fact, Charles Darwin wrote in On the Origin of Species that "…no animal is more difficult to tame than the young of the wild rabbit; scarcely any animal is tamer than the young of the tame rabbit." Darwin used domestic animals as a proof-of-principle that it is possible to change phenotypes by selection. The scientists involved in the current study have now been able to reveal the genetic basis for this remarkable change in behaviour and the study has given important new insights about the domestication process.
"Rabbit domestication has primarily occurred by altering the frequencies of gene variants that were already present in the wild ancestor. Our data shows that domestication primarily involved small changes in many genes and not drastic changes in a few genes," states Kerstin Lindblad-Toh, co-senior author, Director of Vertebrate Genome Biology at the Broad Institute of MIT and Harvard, professor at Uppsala University and Co-Director of Science for Life Laboratory.
The team observed very few examples where a gene variant common in domestic rabbits had completely replaced the gene variant present in wild rabbits; it was rather shifts in frequencies of those variants that were favoured in domestic rabbits.
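A minimal sketch of what such a "shift rather than replacement" signal looks like, using hypothetical allele counts rather than the study's real data:

    def allele_frequency(alt_count, total_alleles):
        # Frequency of the domestication-favoured allele at one site.
        return alt_count / total_alleles

    # Hypothetical counts (favoured alleles observed / chromosomes sampled).
    sites = {
        "site_A": {"wild": (4, 100), "domestic": (78, 100)},    # large shift, not fixed
        "site_B": {"wild": (30, 100), "domestic": (35, 100)},   # little change
        "site_C": {"wild": (0, 100), "domestic": (100, 100)},   # complete replacement (rare, per the study)
    }

    for name, counts in sites.items():
        p_wild = allele_frequency(*counts["wild"])
        p_dom = allele_frequency(*counts["domestic"])
        replaced = p_dom == 1.0 and p_wild < 1.0  # wild variant completely lost in domestics
        print(f"{name}: wild={p_wild:.2f} domestic={p_dom:.2f} "
              f"shift={p_dom - p_wild:+.2f} replaced={replaced}")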
"An interesting consequence of this is that if you release domestic rabbits into the wild, there is an opportunity for back selection at those genes that have been altered during domestication, because the 'wild-type' variant has rarely been completely lost. In fact, this is what we plan to study next," comments Miguel Carneiro.
The scientists found no example where a gene has been inactivated during rabbit domestication, and there were many more changes in the non-coding part of the genome than in the parts that code for protein.
"The results we have are very clear; the difference between a wild and a tame rabbit is not which genes they carry but how their genes are regulated, i.e. when and how much of each gene is used in different cells," explains Miguel Carneiro.
The study also revealed which genes have been altered during domestication. The researchers were amazed by the strong enrichment of genes involved in the development of the brain and the nervous system among the genes particularly targeted during domestication.
"But that of course makes perfect sense in relation to the drastic changes in behaviour between wild and domestic rabbits," concludes Kerstin Lindblad-Toh.
The study shows that the wild rabbit is a highly polymorphic species that carries gene variants that were favourable during domestication, and that the accumulation of many small changes led to the inhibition of the strong flight response -- one of the most prominent phenotypic changes in the evolution of the domestic rabbit.
"We predict that a similar process has occurred in other domestic animals and that we will not find a few specific 'domestication genes' that were critical for domestication. It is very likely that a similar diversity of gene variants affecting the brain and the nervous system occurs in the human population and that contributes to differences in personality and behaviour," says Leif Andersson.

Paleontology: Oldest representative of a weird arthropod group

Date:
August 28, 2014
Source:
Ludwig-Maximilians-Universität München
Summary:
Biologists have assigned a number of 435-million-year-old fossils to a new genus of predatory arthropods. These animals lived in shallow marine habitats and were far less eye-catching than related forms found in Jurassic strata.
Image: Thylacares brandonensis.

LMU biologists have assigned a number of 435-million-year-old fossils to a new genus of predatory arthropods. These animals lived in shallow marine habitats and were far less eye-catching than related forms found in Jurassic strata.
Before they sank to the bottom of their shallow marine habitat and were fossilized some 435 million years ago, these arthropods preyed on other denizens of the Silurian seas -- although they were not exactly inconspicuous, possessing a bivalved carapace and multiple abdominal limbs. A group of researchers including LMU biologist Carolin Haug recently recognized these fossils as the oldest representatives yet discovered of an enigmatic and now extinct class of arthropods known as Thylacocephala, and assigned them to the new species Thylacares brandonensis. "Where exactly the thylacocephalans belong among the arthropods is still a matter of intense debate," Haug says, but the new specimens shed light on the phylogenetic affinities of this problematic group of animals.
According to the authors of the study just published, certain aspects of the anatomy of T. brandonensis, together with the results of a detailed investigation of more recent specimens attributable to the group, support the hypothesis that the thylacocephalans belong among the crustaceans. Moreover, the anatomy of their posterior appendages and an analysis of the organization of their muscles with the aid of fluorescence microscopy strongly suggest that they can be interpreted as a sister group of the so-called Remipedia. Remipedians, which were first described in the 1980s, are blind crustaceans found in flooded limestone caves in coastal settings in the tropics. These cave systems are typically connected to the sea via subsurface channels and are also open to the surface further inland.
"The main reason why it has been so difficult to work out the precise systematic position of thylacocephalans is that their morphology is so bizarre," says Haug. "For a long time, researchers couldn't even agree which end was anterior and which posterior." Most of the specimens so far described come from strata of Jurassic age, and are therefore 200-250 million years younger than the new species. Representatives of the group typically have unusually large compound eyes and are equipped with several paired and anteriorly located raptorial appendages, which are almost as long as the rest of the animal. This combination of characters strongly suggests that they were adapted to a predatory lifestyle. "Actually the eyes were initially not recognized as such, and instead were interpreted as stomach pouches by some researchers," says Haug. By comparison with its spectacular descendants, the new species T. brandonensis can be described as modest and unassuming. "Representatives of this thylacocephalan species have a more 'normal' morphology," says Haug, "their eyes are smaller and the raptorial appendages are shorter."
The authors of the new study therefore conclude that, like more recent representatives of the group, T. brandonensis also earned its living as a predator, but was less highly specialized than the later forms. Consequently, the morphological specializations seen in the latter probably emerged in the course of the further evolution of the class. "It is quite possible that the extreme degree of specialization seen in specimens from the Jurassic proved to be an evolutionary dead end," Haug suggests, "for at the close of the Cretaceous, at a time when many other groups of animals disappeared from the fossil record, Thylacocephala also became extinct." However, we now know that these predators had previously enjoyed a successful career that lasted for more than 350 million years.

From nose to knee: Engineered cartilage regenerates joints

Date:
August 28, 2014
Source:
Universität Basel
Summary:
Human articular cartilage defects can be treated with nasal septum cells. Researchers now report that cells taken from the nasal septum are able to adapt to the environment of the knee joint and can thus repair articular cartilage defects. The nasal cartilage cells' ability to self-renew and adapt to the joint environment is associated with the expression of so-called HOX genes.
Image: Articular cartilage replaced: MRI of the defect tissue site before (left) and four months after (right) transplantation.

Human articular cartilage defects can be treated with nasal septum cells. Researchers at the University and the University Hospital of Basel report that cells taken from the nasal septum are able to adapt to the environment of the knee joint and can thus repair articular cartilage defects. The nasal cartilage cells' ability to self-renew and adapt to the joint environment is associated with the expression of so-called HOX genes. The scientific journal Science Translational Medicine has published the research results together with the report of the first treated patients.
Cartilage lesions in joints often appear in older people as a result of degenerative processes. However, they also regularly affect younger people after injuries and accidents. Such defects are difficult to repair and often require complicated surgery and long rehabilitation times. A new treatment option has now been presented by a research team led by Prof. Ivan Martin, professor for tissue engineering, and Prof. Marcel Jakob, Head of Traumatology, from the Department of Biomedicine at the University and the University Hospital of Basel: Nasal cartilage cells can replace cartilage cells in joints.
Cartilage cells from the nasal septum (nasal chondrocytes) have a distinct capacity to generate new cartilage tissue after their expansion in culture. In an ongoing clinical study, the researchers have so far taken small biopsies (6 millimeters in diameter) from the nasal septum of seven out of 25 patients below the age of 55 years and then isolated the cartilage cells. They cultured and multiplied the cells and then applied them to a scaffold in order to engineer a cartilage graft measuring 30 x 40 millimeters. A few weeks later they removed the damaged cartilage tissue of the patients' knees and replaced it with the engineered and tailored tissue from the nose. In a previous clinical study conducted in cooperation with plastic surgeons and using the same method, the researchers from Basel recently succeeded in reconstructing nasal wings affected by tumors.
Surprising Adaptation
The scientists working with first author Dr. Karoliina Pelttari were especially surprised by the fact that, in the animal model with goats, the implanted nasal cartilage cells were compatible with the knee joint environment even though the two cell types have different origins. During embryonic development, nasal septum cells develop from the neuroectodermal germ layer, which also forms the nervous system; their self-renewal capacity is attributed to their lack of expression of some homeobox (HOX) genes. In contrast, these HOX genes are expressed in articular cartilage cells, which are formed in the mesodermal germ layer of the embryo.
"The findings from the basic research and the preclinical studies on the properties of nasal cartilage cells and the resulting engineered transplants have opened up the possibility to investigate an innovative clinical treatment of cartilage damage," says Prof. Ivan Martin about the results. It has already previously been shown that the human nasal cells' capacity to grow and form new cartilage is conserved with age. Meaning, that also older people could benefit from this new method, as well as patients with large cartilage defects. While the primary target of the ongoing clinical study at the University Hospital of Basel is to confirm the safety and feasibility of cartilage grafts engineered from nasal cells when transplanted into joint, the clinical effectiveness assessed until now is highly promising.

Inside the teenage brain: New studies explain risky behavior

Date:
August 27, 2014
Source:
Florida State University
Summary:
It’s common knowledge that teenage boys seem predisposed to risky behaviors. Now, a series of new studies is shedding light on specific brain mechanisms that help to explain what might be going on inside juvenile male brains.

It's common knowledge that teenage boys seem predisposed to risky behaviors. Now, a series of new studies is shedding light on specific brain mechanisms that help to explain what might be going on inside juvenile male brains.
Florida State University College of Medicine Neuroscientist Pradeep Bhide brought together some of the world's foremost researchers in a quest to explain why teenagers -- boys, in particular -- often behave erratically.
The result is a series of 19 studies that approached the question from multiple scientific domains, including psychology, neurochemistry, brain imaging, clinical neuroscience and neurobiology. The studies are published in a special volume of Developmental Neuroscience, "Teenage Brains: Think Different?"
"Psychologists, psychiatrists, educators, neuroscientists, criminal justice professionals and parents are engaged in a daily struggle to understand and solve the enigma of teenage risky behaviors," Bhide said. "Such behaviors impact not only the teenagers who obviously put themselves at serious and lasting risk but also families and societies in general.
"The emotional and economic burdens of such behaviors are quite huge. The research described in this book offers clues to what may cause such maladaptive behaviors and how one may be able to devise methods of countering, avoiding or modifying these behaviors."
An example of findings published in the book that provide new insights about the inner workings of a teenage boy's brain:
• Unlike children or adults, teenage boys show enhanced activity in the part of the brain that controls emotions when confronted with a threat. Magnetic resonance scanner readings in one study revealed that the level of activity in the limbic brain of adolescent males reacting to threat, even when they've been told not to respond to it, was strikingly different from that in adult men.
• Using brain activity measurements, another team of researchers found that teenage boys were mostly immune to the threat of punishment but hypersensitive to the possibility of large gains from gambling. The results question the effectiveness of punishment as a deterrent for risky or deviant behavior in adolescent boys.
• Another study demonstrated that a molecule known to be vital in developing fear of dangerous situations is less active in adolescent male brains. These findings point towards neurochemical differences between teenage and adult brains, which may underlie the complex behaviors exhibited by teenagers.
"The new studies illustrate the neurobiological basis of some of the more unusual but well-known behaviors exhibited by our teenagers," Bhide said. "Stress, hormonal changes, complexities of psycho-social environment and peer-pressure all contribute to the challenges of assimilation faced by teenagers.
"These studies attempt to isolate, examine and understand some of these potential causes of a teenager's complex conundrum. The research sheds light on how we may be able to better interact with teenagers at home or outside the home, how to design educational strategies and how best to treat or modify a teenager's maladaptive behavior."
Bhide conceived and edited "Teenage Brains: Think Different?" His co-editors were Barry Kasofsky and B.J. Casey, both of Weill Medical College at Cornell University. The book was published by Karger Medical and Scientific Publisher of Basel, Switzerland. More information on the book can be found at: http://www.karger.com/Book/Home/261996
The table of contents to the special journal volume can be found at: http://www.karger.com/Journal/Issue/261977

Electric current to brain boosts memory: May help treat memory disorders from stroke, Alzheimer's, brain injury

Date:
August 28, 2014
Source:
Northwestern University
Summary:
Stimulating a region in the brain via non-invasive delivery of electrical current using magnetic pulses, called Transcranial Magnetic Stimulation, improves memory. The discovery opens a new field of possibilities for treating memory impairments caused by conditions such as stroke, early-stage Alzheimer's disease, traumatic brain injury, cardiac arrest and the memory problems that occur in healthy aging.


Stimulating a particular region in the brain via non-invasive delivery of electrical current using magnetic pulses, called Transcranial Magnetic Stimulation, improves memory, reports a new Northwestern Medicine® study.
The discovery opens a new field of possibilities for treating memory impairments caused by conditions such as stroke, early-stage Alzheimer's disease, traumatic brain injury, cardiac arrest and the memory problems that occur in healthy aging.
"We show for the first time that you can specifically change memory functions of the brain in adults without surgery or drugs, which have not proven effective," said senior author Joel Voss, assistant professor of medical social sciences at Northwestern University Feinberg School of Medicine. "This noninvasive stimulation improves the ability to learn new things. It has tremendous potential for treating memory disorders."
The study will be published August 29 in Science.
The study also is the first to demonstrate that remembering events requires a collection of many brain regions to work in concert with a key memory structure called the hippocampus -- similar to a symphony orchestra. The electrical stimulation is like giving the brain regions a more talented conductor so they play in closer synchrony.
"It's like we replaced their normal conductor with Muti," Voss said, referring to Riccardo Muti, the music director of the renowned Chicago Symphony Orchestra. "The brain regions played together better after the stimulation."
The approach also has potential for treating mental disorders such as schizophrenia in which these brain regions and the hippocampus are out of sync with each other, affecting memory and cognition.
TMS Boosts Memory
The Northwestern study is the first to show TMS improves memory long after treatment. In the past, TMS has been used in a limited way to temporarily change brain function to improve performance during a test, for example, making someone push a button slightly faster while the brain is being stimulated. The study shows that TMS can be used to improve memory for events at least 24 hours after the stimulation is given.
Finding the Sweet Spot
It isn't possible to directly stimulate the hippocampus with TMS because it's too deep in the brain for the magnetic fields to penetrate. So, using an MRI scan, Voss and colleagues identified a superficial brain region a mere centimeter from the surface of the skull with high connectivity to the hippocampus. He wanted to see if directing the stimulation to this spot would in turn stimulate the hippocampus. It did.
"I was astonished to see that it worked so specifically," Voss said.
When TMS was used to stimulate this spot, regions in the brain involved with the hippocampus became more synchronized with each other, as indicated by data taken while subjects were inside an MRI machine, which records the blood flow in the brain as an indirect measure of neuronal activity.
The more those regions worked together due to the stimulation, the better people were able to learn new information.
How the Study Worked
Scientists recruited 16 healthy adults ages 21 to 40. Each had a detailed anatomical image taken of his or her brain as well as 10 minutes of recording brain activity while lying quietly inside an MRI scanner. Doing this allowed the researchers to identify each person's network of brain structures that are involved in memory and well connected to the hippocampus. The structures are slightly different in each person and may vary in location by as much as a few centimeters.
"To properly target the stimulation, we had to identify the structures in each person's brain space because everyone's brain is different," Voss said.
Each participant then underwent a memory test, consisting of a set of arbitrary associations between faces and words that they were asked to learn and remember. After establishing their baseline ability to perform on this memory task, participants received brain stimulation 20 minutes a day for five consecutive days.
During the week they also received additional MRI scans and tests of their ability to remember new sets of arbitrary word and face pairings to see how their memory changed as a result of the stimulation. Then, at least 24 hours after the final stimulation, they were tested again.
At least one week later, the same experiment was repeated but with a sham (placebo) stimulation. The order of real stimulation and placebo portions of the study was reversed for half of the participants, and they weren't told which was which.
Both groups performed better on memory tests as a result of the brain stimulation. It took three days of stimulation before they improved.
"They remembered more face-word pairings after the stimulation than before, which means their learning ability improved," Voss said. "That didn't happen for the placebo condition or in another control experiment with additional subjects."
In addition, the MRI showed the stimulation caused the brain regions to become more synchronized with each other and the hippocampus. The greater the improvement in the synchronicity or connectivity between specific parts of the network, the better the performance on the memory test. "The more certain brain regions worked together because of the stimulation, the more people were able to learn face-word pairings," Voss said.
Using TMS to stimulate memory has multiple advantages, noted first author Jane Wang, a postdoctoral fellow in Voss's lab at Feinberg. "No medication could be as specific as TMS for these memory networks," Wang said. "There are a lot of different targets and it's not easy to come up with any one receptor that's involved in memory."
The Future
"This opens up a whole new area for treatment studies where we will try to see if we can improve function in people who really need it," Voss said.
His current study was with people who had normal memory, in whom he wouldn't expect to see a big improvement because their brains are already working effectively.
"But for a person with brain damage or a memory disorder, those networks are disrupted so even a small change could translate into gains in their function," Voss said.
In an upcoming trial, Voss will study the electrical stimulation's effect on people with early-stage memory loss.
Voss cautioned that years of research are needed to determine whether this approach is safe or effective for patients with Alzheimer's disease or similar disorders of memory.