Saturday, 23 August 2014

The Doctor Can See You Now: High-tech health gadgets to watch for

We’ve all seen ads for Fitbits, Jawbones, Gearfits – perhaps even tracked our own health with a specialized home glucometer or blood pressure cuff.
These devices collect data that can help motivate an individual and track progress, but they don’t tell us much beyond that. The next wave of wearable health technology has far more advanced biosensors that can collect new data, teach us what data is most valuable – and maybe even change the way we practice medicine.
Wearable health devices are just one niche in the rapidly growing field of digital health. Future wearable biosensors may be as small as a patch worn on the skin or an ear bud or even a tiny fiber sewn into clothing – and these tools can gather new data in real-time with patients in the real world, tracking things we couldn’t accurately measure outside a hospital until now.
This data not only will provide useful information for the patient and his or her doctor, but could also lead to huge gains in precision medicine, an emerging field that aims to integrate data from molecular, clinical, population and other research to create treatments that are more predictive, preventive and precise.
“We need to get to a world where individuals are using digital health devices to collect accurate, detailed data about themselves and that data is available for clinical trials as well as to their clinicians for helping them maintain wellness or managing disease,” says Michael Blum, MD, director of UCSF’s Center for Digital Health Innovation (CDHI), which collaborates across health care and industry to create, implement and validate digital health technologies. One of its biggest collaborations is with Samsung, a partnership that’s launched the UCSF-Samsung Digital Health Innovation Lab at Mission Bay.
Blum, who also serves as chief medical information officer of the UCSF Medical Center, says the goal is for wearables being developed now to someday be able to seamlessly “provide patient data into larger databases that can be accessed for clinical care and that multiple researchers can have access to in order to create new understandings.
“When we have access to these large, rich data sources, we will likely see new patterns and relationships that will lead to the development of new, non-traditional ‘vital signs.’”
For all of the wearable health devices being developed, Blum says, it’s vital that they are validated: Do the sensors accurately measure the things they’re designed to measure? Does wearing the device and knowing this information lead to changes in treatment or behavior?  Does it generate better outcomes for the patients? Did we uncover new health data points that could be more important to measure for a certain disease?
For example, Aenor Sawyer, MD, an orthopaedic surgeon and associate director of CDHI, says one key area where monitoring could really help inform doctors – and patients – is when a patient is released from the ICU or after a serious surgery.
“Imagine a patient who’s been closely monitored, then when he’s discharged, we don’t have any oversight, and we don’t have any vital signs being taken. These patients might benefit from some closer scrutiny. We know that certain things can happen in those windows that would be nice to have some way to track it,” says Sawyer.
A number of wearable health devices being developed now could help close that critical gap. And the data that future wearables gather will teach us new key indicators for health we haven’t even thought of yet, Sawyer says.

Here are five exciting new wearable health gadgets on the horizon:

Track what gets you stressed.



The next generation of wristbands will have far more accurate biosensors that can measure specific health indicators such as blood pressure, heart rate, oxygen saturation and body temperature. The devices will be able to send data to your doctor, and could help researchers measure how different medicines or behavior changes are affecting patient health. For example, Samsung has partnered with UCSF to develop the Simband, which will measure heart rate, blood pressure, temperature, oxygen level and even signs of stress. Simband is also a reference platform that allows other companies to develop sensors that will integrate into it, allowing for a community of developers to create the ecosystem of sensors and products that will be critical to this nascent market.
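To make the idea of a sensor reference platform more concrete, here is a minimal, purely hypothetical sketch in Python of what a plug-in interface for third-party sensor modules could look like. It is not the actual Simband SDK, and every class and method name below is invented for illustration only.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from time import time


@dataclass
class Reading:
    """A single timestamped measurement from one sensor module."""
    sensor: str       # e.g. "skin_temperature"
    value: float      # measurement in the unit declared by the module
    unit: str         # e.g. "bpm", "mmHg", "degC"
    timestamp: float  # seconds since the epoch


class SensorModule(ABC):
    """Interface a third-party module would implement so the wristband
    platform can poll it alongside the built-in sensors."""

    name: str
    unit: str

    @abstractmethod
    def sample(self) -> float:
        """Return the current raw measurement from the hardware."""

    def read(self) -> Reading:
        return Reading(self.name, self.sample(), self.unit, time())


class SkinTemperatureSensor(SensorModule):
    """Toy example of a vendor-supplied module."""
    name = "skin_temperature"
    unit = "degC"

    def sample(self) -> float:
        return 33.7  # a real module would talk to its sensor hardware here


if __name__ == "__main__":
    modules = [SkinTemperatureSensor()]   # the platform manages a list of modules
    for module in modules:
        print(module.read())              # uniform access regardless of vendor
```

The point of such an interface is simply that the platform owner defines the contract once, and any number of vendors can ship modules that satisfy it.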

Take part in a sleep study – in the comfort of your own bed.


In the sleep lab, researchers hook patients up to complex machines and sensors to measure motion, heart rate and rhythm, respiratory rate and rhythm, and oxygen and carbon dioxide saturation. Soon tiny, non-invasive biosensors could gather this data while you sleep in your own bed, and transmit the information to a central database.



Know how your elderly mother is doing from hundreds of miles away.

A combination of biosensors can measure movement and heart and respiration rates. They could be calibrated to an individual’s patterns to alert caretakers when something is amiss. Knowing that an elderly relative is not going to the refrigerator, leaving the house, or calling friends and family could provide early clues to a brewing illness that could be easily managed with early intervention, but might be devastating if left unchecked.
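As a rough illustration of what calibrating to an individual’s own patterns might look like (a minimal sketch in Python, not any particular product’s algorithm), daily activity counts can be compared against that person’s baseline and an alert raised when they fall unusually far below it:

```python
from statistics import mean, stdev

def unusually_quiet(baseline_days, today_count, z_threshold=2.0):
    """Flag a day whose activity count (fridge openings, outings, calls)
    falls unusually far below this person's own baseline."""
    mu = mean(baseline_days)
    sigma = stdev(baseline_days)
    if sigma == 0:
        return today_count < mu
    return (today_count - mu) / sigma < -z_threshold  # True -> notify caretaker

# Two weeks of baseline daily activity counts, then one very quiet day
baseline = [14, 16, 15, 13, 17, 15, 14, 16, 15, 14, 13, 16, 15, 14]
print(unusually_quiet(baseline, today_count=5))  # True: worth a check-in call
```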



Let the doctor monitor your heart in real time.

For anyone who has worn a Holter monitor to check for irregular heart rhythms, a Vital Connect patch is a big upgrade. Instead of having to go to the doctor’s office to pick up a cumbersome device, wear it for weeks, then go back to the doctor’s office to return it and wait for it to be analyzed, the data from the patch is uploaded to a cloud-based system via the Internet, and the doctor can be alerted if there are any signs of danger.

Measure your vitals while listening to music.

The ear is an excellent spot on the body to measure physical signals such as motion, heart rate and blood pressure. Several companies are exploring making a new high-tech ear bud that can measure heart rate, temperature and respiration rate using photoplethysmography, or PPG, which measures changes in blood flow by shining a light on the skin and measuring how it scatters off blood vessels (this is often done in hospitals with a device that fits over your fingertip).
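As a rough sketch of how a heart rate could be pulled out of a PPG waveform (simple peak counting on synthetic data in Python; real ear-bud firmware is assumed to be far more sophisticated):

```python
import numpy as np

def heart_rate_from_ppg(signal, fs, min_gap_s=0.5):
    """Estimate heart rate in beats per minute from a PPG trace by finding
    local maxima above the mean, at least `min_gap_s` seconds apart."""
    threshold = signal.mean()
    min_gap = int(min_gap_s * fs)
    peaks, last = 0, -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1] and i - last >= min_gap):
            peaks += 1
            last = i
    return 60.0 * peaks * fs / len(signal)

# Synthetic 10-second, 100 Hz trace with a 72-beats-per-minute pulse
fs = 100
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)
print(round(heart_rate_from_ppg(ppg, fs)))  # roughly 72
```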

Proteins critical to wound healing identified

Mice missing two important proteins of the vascular system develop normally and appear healthy in adulthood, as long as they don’t become injured. If they do, their wounds don’t heal properly, a new study shows.
The research, at Washington University School of Medicine in St. Louis, may have implications for treating diseases involving abnormal blood vessel growth, such as the impaired wound healing often seen in diabetes and the loss of vision caused by macular degeneration.
The study appears Aug. 18 in the Proceedings of the National Academy of Sciences (PNAS) online early edition.
The paper’s senior author, David M. Ornitz, MD, PhD, the Alumni Endowed Professor of Developmental Biology, studies a group of proteins known as fibroblast growth factors, or the FGF family of proteins. FGF proteins are signaling molecules that play broad roles in embryonic development, tissue maintenance, and wound healing. They interact with specific receptor molecules, FGFRs, located on the surface of many types of cells in the body.
When an organ is injured, the healing process involves the growth of new blood vessels. Since the cells lining the interior of blood vessels and blood cells themselves are important for developing new vasculature, Ornitz and his colleagues asked what would happen if they turned off signaling of the FGFR1 and FGFR2 proteins, two major mediators of the FGF signal that are present in the cells that line blood vessels. Their strategy differed from past studies, which shut down this signaling more broadly.
“The first thing we noticed — and we were rather surprised by this — was that the mice were completely normal,” Ornitz said. “They were running around and lived to a ripe old age. We did genetic tests to make sure they actually lacked these proteins. But when we challenged these mice, we saw that they healed from a skin injury more slowly than their normal littermates, and we found that the density of blood vessels surrounding the injury site was significantly decreased.”
With collaborator and co-senior author Rajendra S. Apte, MD, PhD, the Paul A. Cibis Distinguished Professor of Ophthalmology and Visual Sciences, the investigators also looked at the eyes. Like any other organ, new blood vessels grow in the eye in response to disease or injury. But unlike the rest of the body, new blood vessels are not desired here, since they bleed, cause scar tissue formation and block light to the retina, causing vision loss.
The new work suggests that increasing FGF signaling in the body might help improve wound healing by increasing new blood vessel growth following an injury, especially in those who have trouble healing, such as patients with diabetes-related foot ulcers. Ornitz pointed out that human FGF2 is already in clinical use as a topical spray in Japan for foot ulcers and similar wound healing purposes.
Conversely, inhibiting these pathways in the eye might help patients with age-related macular degeneration or diabetic retinopathy. Such patients grow new blood vessels in response to these diseased or injured states, but the new vessels only serve to obscure vision, not help heal an abnormal eye.
And since the research suggests these FGF pathways are not involved with normal development and tissue maintenance, any treatments boosting or inhibiting these signals would likely not affect healthy tissue.
“That’s an important point,” said Apte, who treats patients at Barnes-Jewish Hospital. “In diabetes, the normal blood vessels of the retina become fragile because the disease affects them. With any targeted therapy, we worry about damaging the normal vessels. But our work suggests that inhibiting FGF signaling in the eye may prevent this abnormal response without harming normal vessels.”

New Satellite Data Will Help Farmers Facing Drought


About 60 percent of California is experiencing “exceptional drought,” the U.S. Drought Monitor’s most dire classification. The agency issued the same warning to Texas and the southeastern United States in 2012. California’s last two winters have been among the driest since records began in 1879. Without enough water in the soil, seeds can’t sprout roots, leaves can’t perform photosynthesis, and agriculture can’t be sustained.
Currently, there is no ground- or satellite-based global network monitoring soil moisture at a local level. Farmers, scientists and resource managers can place sensors in the ground, but these only provide spot measurements and are rare across some critical agricultural areas in Africa, Asia and Latin America. The European Space Agency’s Soil Moisture and Ocean Salinity mission measures soil moisture at a resolution of 31 miles (50 kilometers), but because soil moisture can vary on a much smaller scale, its data are most useful in broad forecasts.
Enter NASA’s Soil Moisture Active Passive (SMAP) satellite. The mission, scheduled to launch this winter, will collect the kind of local data agricultural and water managers worldwide need.
SMAP uses two microwave instruments to monitor the top 2 inches (5 centimeters) of soil on Earth’s surface. Together, the instruments create soil moisture estimates with a resolution of about 6 miles (9 kilometers), mapping the entire globe every two or three days. Although this resolution cannot show how soil moisture might vary within a single field, it will give the most detailed maps yet made.
“Agricultural drought occurs when the demand for water for crop production exceeds available water supplies from precipitation, surface water and sustainable withdrawals from groundwater,” said Forrest Melton, a research scientist in the Ecological Forecasting Lab at NASA Ames Research Center in Moffett Field, California.
“Based on snowpack and precipitation data in California, by March we had a pretty good idea that by summer we’d be in a severe agricultural drought,” Melton added. “But irrigation in parts of India, the Middle East and other regions relies heavily on the pumping of groundwater during some or all of the year.” Underground water resources are hard to estimate, so farmers who rely on groundwater have fewer indicators of approaching shortfalls than those whose irrigation comes partially from rain or snowmelt. For these parts of the world where farmers have little data available to help them understand current conditions, SMAP’s measurements could fill a significant void.
Some farmers handle drought by changing irrigation patterns. Others delay planting or harvesting to give plants their best shot at success. Currently, schedule modifications are based mostly on growers’ observations and experience. SMAP’s data will provide an objective assessment of soil moisture to help with their management strategy.
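A rough sketch of how gridded soil-moisture estimates like SMAP’s could feed such a decision (the threshold, grid values and rule below are invented purely for illustration):

```python
import numpy as np

# Hypothetical 3 x 3 block of ~9 km soil-moisture cells over a farm district,
# in cubic meters of water per cubic meter of soil.
soil_moisture = np.array([
    [0.12, 0.09, 0.15],
    [0.09, 0.11, 0.14],
    [0.08, 0.13, 0.16],
])

STRESS_THRESHOLD = 0.10  # assumed level below which rain-fed crops struggle

dry_fraction = np.mean(soil_moisture < STRESS_THRESHOLD)
print(f"{dry_fraction:.0%} of cells below the stress threshold")
if dry_fraction > 0.25:
    print("Consider delaying planting or adjusting the irrigation schedule")
```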
“If farmers of rain-fed crops know soil moisture, they can schedule their planting to maximize crop yield,” said Narendra Das, a water and carbon cycle scientist on SMAP’s science team at NASA’s Jet Propulsion Laboratory in Pasadena, California. “SMAP can assist in predicting how dramatic drought will be, and then its data can help farmers plan their recovery from drought.”
“Scientists see tremendous potential in SMAP,” Melton said. “It is not going to provide field-level information, but it will give very useful new regional observations of soil moisture conditions, which will be important for drought monitoring and a wide range of applications related to agriculture. Having the ability provided by SMAP to continuously map soil moisture conditions over large areas will be a major advance.”

Secrets of how worms wriggle uncovered

An engineer at the University of Liverpool has found how worms move around, despite not needing a brain to coordinate the movements of the body.
Dr Paolo Paoletti, alongside his colleague at Harvard, Professor L Mahadevan, has developed a mathematical model for earthworms and insect larvae which challenges the traditional view of how these soft bodied animals get around.
The most widely accepted model is that of the central pattern generator (CPG) which states that the central brain of these creatures generates rhythmic contraction and extension waves along the body. However, this doesn’t account for the fact that some of these invertebrates can move along even when their ventral nerve cord is cut.
Instead, Dr Paoletti and Professor Mahadevan hypothesised that there is a far greater role for the body’s mechanical properties and the local nerves which react to the surface that the animal is travelling across.
Dr Paoletti said: “When we analyse humans running there is clearly local control over movements as by the time nerve signals travel from the foot to the brain and back again, you will have taken three steps – and would otherwise probably have fallen over.”
“We see much the same in these soft bodied animals. Rather than generating a constant wave of contraction and expansion, their movement is controlled and influenced by the contours of the surface they are moving across.”
Dr Paoletti, from the School of Engineering, and Professor Mahadevan created a mathematical and computational theory to understand this, then tested it under a range of conditions using simulated worms of different masses. They now believe that this new model could be of use in robotics.
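The authors’ model itself is a continuum-mechanics one, but the core idea, that a travelling contraction wave can emerge from purely local, reflex-like rules with no central pattern generator, can be shown with a toy simulation (entirely schematic, not the paper’s equations):

```python
# Toy peristalsis: a chain of segments, each contracted (1) or extended (0).
# The only rule is local: a segment contracts when the segment behind it is
# contracted and it is itself extended. No central controller issues a wave,
# yet a contraction wave travels from tail to head.
N_SEGMENTS, STEPS = 10, 11
segments = [0] * N_SEGMENTS
segments[0] = 1  # a single contraction starts at the tail

for _ in range(STEPS):
    print("".join("#" if s else "-" for s in segments))
    nxt = [0] * N_SEGMENTS
    for i in range(1, N_SEGMENTS):
        nxt[i] = 1 if segments[i - 1] and not segments[i] else 0
    segments = nxt
```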
He said: “Replicating the movement of animals in robots is very difficult and often involves the use of many sensors. This new model avoids using sophisticated sensors and control strategies, and could be used to improve robots used for entering confined spaces or which have to deal with difficult terrain.”

First Indirect Evidence of So-Far Undetected Strange Baryons

New supercomputing calculations provide the first evidence that particles predicted by the theory of quark-gluon interactions but never before observed are being produced in heavy-ion collisions at the Relativistic Heavy Ion Collider (RHIC), a facility that is dedicated to studying nuclear physics. These heavy strange baryons, containing at least one strange quark, still cannot be observed directly, but instead make their presence known by lowering the temperature at which other strange baryons “freeze out” from the quark-gluon plasma (QGP) discovered and created at RHIC, a U.S. Department of Energy (DOE) Office of Science user facility located at DOE’s Brookhaven National Laboratory.
RHIC is one of just two places in the world where scientists can create and study a primordial soup of unbound quarks and gluons—akin to what existed in the early universe some 14 billion years ago. The research is helping to unravel how these building blocks of matter became bound into hadrons, particles composed of two or three quarks held together by gluons, the carriers of nature’s strongest force.
“Baryons, which are hadrons made of three quarks, make up almost all the matter we see in the universe today,” said Brookhaven theoretical physicist Swagato Mukherjee, a co-author on a paper describing the new results in Physical Review Letters. “The theory that tells us how this matter forms—including the protons and neutrons that make up the nuclei of atoms—also predicts the existence of many different baryons, including some that are very heavy and short-lived, containing one or more heavy ‘strange’ quarks. Now we have indirect evidence from our calculations and comparisons with experimental data at RHIC that these predicted higher mass states of strange baryons do exist,” he said.
Added Berndt Mueller, Associate Laboratory Director for Nuclear and Particle Physics at Brookhaven, “This finding is particularly remarkable because strange quarks were one of the early signatures of the formation of the primordial quark-gluon plasma. Now we’re using this QGP signature as a tool to discover previously unknown baryons that emerge from the QGP and could not be produced otherwise.”

Freezing point depression and supercomputing calculations

The evidence comes from an effect on the thermodynamic properties of the matter nuclear physicists can detect coming out of collisions at RHIC. Specifically, the scientists observe certain more-common strange baryons (omega baryons, cascade baryons, lambda baryons) “freezing out” of RHIC’s quark-gluon plasma at a lower temperature than would be expected if the predicted extra-heavy strange baryons didn’t exist.
“It’s similar to the way table salt lowers the freezing point of liquid water,” said Mukherjee. “These ‘invisible’ hadrons are like salt molecules floating around in the hot gas of hadrons, making other particles freeze out at a lower temperature than they would if the ‘salt’ wasn’t there.”
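One standard way to make that picture quantitative is a hadron resonance gas description, in which every hadron species contributes its own partial pressure. In the Boltzmann approximation this takes the generic textbook form (shown here only as a reference formula; the paper’s lattice QCD analysis is more involved):

```latex
\frac{P_i}{T^4} \approx \frac{g_i}{2\pi^2}\left(\frac{m_i}{T}\right)^{2}
K_2\!\left(\frac{m_i}{T}\right)
\exp\!\left(\frac{B_i\,\mu_B + S_i\,\mu_S + Q_i\,\mu_Q}{T}\right),
\qquad
\frac{P}{T^4} = \sum_i \frac{P_i}{T^4}
```

where g_i and m_i are the degeneracy and mass of species i, K_2 is a modified Bessel function, B_i, S_i and Q_i are its baryon number, strangeness and electric charge, and the sum runs over all hadrons and their antiparticles. Adding heavy, as-yet-unobserved strange baryons to the sum is the mathematical counterpart of adding the ‘salt’: extra partial pressure and extra strangeness fluctuations, which push the inferred freeze-out temperature of the observable strange baryons downward.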
To see the evidence, the scientists performed calculations using lattice QCD, a technique that uses points on an imaginary four-dimensional lattice (three spatial dimensions plus time) to represent the positions of quarks and gluons, and complex mathematical equations to calculate interactions among them, as described by the theory of quantum chromodynamics (QCD).
“The calculations tell you where you have bound or unbound quarks, depending on the temperature,” Mukherjee said.
The scientists were specifically looking for fluctuations of conserved baryon number and strangeness and exploring how the calculations fit with the observed RHIC measurements at a wide range of energies.
The calculations show that including the predicted but “experimentally uncharted” strange baryons fits the data better, providing the first evidence that these so-far unobserved particles exist and exert their effect on the freeze-out temperature of the observable particles.
These findings are helping physicists quantitatively plot the points on the phase diagram that maps out the different phases of nuclear matter, including hadrons and quark-gluon plasma, and the transitions between them under various conditions of temperature and density.

“To accurately plot points on the phase diagram, you have to know what the contents are on the bound-state, hadron side of the transition line—even if you haven’t seen them,” Mukherjee said. “We’ve found that the higher mass states of strange baryons affect the production of ground states that we can observe. And the line where we see the ordinary matter moves to a lower temperature because of the multitude of higher states that we can’t see.”

Exporting US coal to Asia could drop emissions 21 percent

Superior energy efficiency of South Korean plants, and choice of replacement fuels in US, are key to success

Under the right scenario, exporting U.S. coal to power plants in South Korea could lead to a 21 percent drop in greenhouse gas emissions compared to burning the fossil fuel at plants in the United States, according to a new Duke University-led study.
“Despite the large amount of emissions produced by shipping the coal such a long distance, our analysis shows that the total emissions would drop because of the superior energy efficiency of South Korea’s newer coal-fired power plants,” said Dalia Patiño-Echeverri, assistant professor of energy systems and public policy at Duke.
For the reduction to occur, U.S. plants would need to replace the exported coal with natural gas. And in South Korea, the imported coal must replace other coal as the power source. However, if imported U.S. coal were to replace natural gas or nuclear generation in Korea, the emissions produced per unit of electricity generated would increase, Patiño-Echeverri said.
“This significant difference in results highlights the importance of analyzing domestic energy policies in the context of the global systems they affect,” Patiño-Echeverri said.
Stricter emissions requirements on coal-fired power plants, together with low natural gas prices, have contributed to a recent decline in the use of coal for electricity generation in the United States, she said. Faced with a shrinking domestic market, many coal companies are taking advantage of a growing export market. U.S. coal exports hit an all-time high in 2012, fueled largely by demand in Asia. U.S. coal exports to Asian countries have tripled since 2009.
Patiño-Echeverri and her colleagues published their findings this month in the peer-reviewed journal Environmental Science & Technology.
To conduct their analysis, they performed lifecycle air-emissions and economic assessments of two scenarios: a business-as-usual scenario in which the coal continues to be burned domestically for power generation at power plants in the U.S. Northwest after they have been retrofitted to meet EPA emissions standards, and an export scenario in which the coal is shipped to South Korea. For the export scenario, they focused on the Morrow Pacific Project being planned in Oregon by Ambre Energy. Under the project, Ambre would ship 8.8 million tons of Powder River Basin coal each year to Asian markets using rails, river barges and ocean vessels.
In the export scenario, emissions of “equivalent carbon dioxide” — a scientific measure of the coal emissions’ total global warming potential over a 100-year period — dropped 21 percent.
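For readers unfamiliar with the metric, CO2-equivalent is obtained by weighting each greenhouse gas emitted over the life cycle by its 100-year global warming potential (GWP) and summing; schematically,

```latex
\mathrm{CO_2e} = \sum_{g} m_g \times \mathrm{GWP}_{100}(g)
```

where m_g is the mass of gas g emitted and GWP_100 is its warming effect over a century relative to the same mass of CO2 (1 by definition for CO2, several tens for methane and roughly three hundred for nitrous oxide under commonly used IPCC values).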
Other harmful emissions, including sulfur dioxide, nitrogen oxide and particulate matter, dropped similarly.
“In addition to these benefits, our analysis shows that the export scenario would generate more than $25 billion in direct and indirect economic activity in the United States,” Patiño-Echeverri said. “It would also directly or indirectly create nearly $6 billion in total employee compensation, $742 million in new tax revenues, and roughly $4.7 billion in profits for all sectors involved.”
Promising though these results are, “it’s too early to give the export scenario an unequivocal green light,” she said.
Further studies are needed to assess the export scenario’s full environmental impacts, including water use, land use, the loss or degradation of vital fish and wildlife habitats, and risks associated with extraction and wastewater disposal of U.S. shale gas deposits. And there’s still some fine tuning to do on the economic end.
Patiño-Echeverri said the team’s projections are limited in precision because the Morrow Pacific Project is still in the permitting stage and many of its operational and financial details are not yet known. As more specific information about the project is released, calculations can be updated to present a clearer picture of the impacts the project may have on the U.S. energy system and global environmental conditions.
“It’s important to note that this is just one scenario. The export of coal to different markets, under different conditions, might yield very different results,” Patiño-Echeverri said. “Our work does not provide a carte blanche for all energy export projects, but it does give us a framework for comparing their impacts and making smarter economic and environmental policy decisions.”
Support for the study came from the Center for Climate and Energy Decision Making (SES-0949710), which is funded by the National Science Foundation.
Patiño-Echeverri is Gendell Assistant Professor of Energy Systems and Public Policy at Duke’s Nicholas School of the Environment.

She conducted the study with Barrett Bohnengel, a 2013 master’s degree graduate of both Duke and the University of North Carolina at Chapel Hill, and Joule Bergerson, assistant professor of chemical and petroleum engineering at the University of Calgary.

Why global warming is taking a break

The average temperature on Earth has barely risen over the past 16 years. ETH researchers have now found out why. And they believe that global warming is likely to continue again soon.
Global warming is currently taking a break: whereas global temperatures rose drastically into the late 1990s, the global average temperature has risen only slightly since 1998 – surprising, considering scientific climate models predicted considerable warming due to rising greenhouse gas emissions. Climate sceptics used this apparent contradiction to question climate change per se – or at least the potential harm caused by greenhouse gases – as well as the validity of the climate models. Meanwhile, the majority of climate researchers continued to emphasise that the short-term ‘warming hiatus’ could largely be explained on the basis of current scientific understanding and did not contradict longer term warming.
Researchers have been looking into the possible causes of the warming hiatus over the past few years. For the first time, Reto Knutti, Professor of Climate Physics at ETH Zurich, has systematically examined all current hypotheses together with a colleague. In a study published in the latest issue of the journal Nature Geoscience, the researchers conclude that two important factors are equally responsible for the hiatus.

El Niño warmed the Earth

One of the important reasons is natural climate fluctuations, of which the weather phenomena El Niño and La Niña in the Pacific are the most important and well known. “1998 was a strong El Niño year, which is why it was so warm that year,” says Knutti. In contrast, the counter-phenomenon La Niña has made the past few years cooler than they would otherwise have been.
Although climate models generally take such fluctuations into account, it is impossible to predict the year in which these phenomena will emerge, says the climate physicist. To clarify, he uses the stock market as an analogy: “When pension funds invest the pension capital in shares, they expect to generate a profit in the long term.” At the same time, they are aware that their investments are exposed to price fluctuations and that performance can also be negative in the short term. However, what finance specialists and climate scientists and their models are not able to predict is when exactly a short-term economic downturn or a La Niña year will occur.

Longer solar cycles

According to the study, the second important reason for the warming hiatus is that solar irradiance has been weaker than predicted in the past few years. This is because the identified fluctuations in the intensity of solar irradiance are unusual at present: whereas the so-called sunspot cycles each lasted eleven years in the past, for unknown reasons the last period of weak solar irradiance lasted 13 years. Furthermore, several volcanic eruptions, such as Eyjafjallajökull in Iceland in 2010, have increased the concentration of floating particles (aerosol) in the atmosphere, which has further weakened the solar irradiance arriving at the Earth’s surface.
The scientists drew their conclusions from corrective calculations of climate models. In all climate simulations, they looked for periods in which the El Niño/La Niña patterns corresponded to the measured data from the years 1997 to 2012. With a combination of over 20 periods found, they were able to arrive at a realistic estimate of the influence of El Niño and La Niña. They also retroactively applied in the model calculations the actual measured values for solar activity and aerosol concentration in the Earth’s atmosphere. Model calculations corrected in this way match the measured temperature data much more closely.
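A schematic sketch of that segment-selection step (not the authors’ code; the series are synthetic and the matching criterion is a simple correlation, just to illustrate picking model periods whose El Niño/La Niña sequence lines up with observations):

```python
import numpy as np

def best_matching_segments(model_enso, observed_enso, n_keep=20):
    """Slide the observed ENSO-index window along a long model run and return
    the start indices of the segments that correlate best with it."""
    w = len(observed_enso)
    scores = []
    for start in range(len(model_enso) - w + 1):
        r = np.corrcoef(model_enso[start:start + w], observed_enso)[0, 1]
        scores.append((r, start))
    scores.sort(reverse=True)
    return [start for _, start in scores[:n_keep]]

# Synthetic stand-ins: a 500-year model ENSO index, a 16-year observed record
rng = np.random.default_rng(0)
model_enso = rng.standard_normal(500)
observed_enso = rng.standard_normal(16)
print(best_matching_segments(model_enso, observed_enso, n_keep=5))
```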

Incomplete measured data

The discrepancy between the climate models and measured data over the past 16 years cannot solely be attributed to the fact that these models predict too much warming, says Knutti. The interpretation of the official measured data should also be critically scrutinised. According to Knutti, the measured data is likely to be too low, since the global average temperature is only estimated using values obtained from weather stations on the ground, and these do not exist everywhere on Earth. From satellite data, for example, scientists know that the Arctic region in particular has become warmer over the past few years, but because there are no weather stations in that area, this strong warming is barely reflected in the station-based record. As a result, the specified average temperature is too low.
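A tiny numerical illustration of that coverage effect, using entirely made-up numbers: if the fastest-warming region has no stations, the station-only average understates the true global trend.

```python
# Invented warming trends (degrees C per decade) and area fractions for three
# zones; the Arctic zone warms fastest but has no ground stations.
zones = [
    {"name": "tropics",       "area": 0.50, "trend": 0.10, "has_stations": True},
    {"name": "mid-latitudes", "area": 0.42, "trend": 0.15, "has_stations": True},
    {"name": "arctic",        "area": 0.08, "trend": 0.60, "has_stations": False},
]

true_mean = sum(z["area"] * z["trend"] for z in zones)
covered = [z for z in zones if z["has_stations"]]
station_mean = sum(z["area"] * z["trend"] for z in covered) / sum(z["area"] for z in covered)

print(f"true global trend:     {true_mean:.3f} C per decade")
print(f"station-only estimate: {station_mean:.3f} C per decade")  # biased low
```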
Last year, British and Canadian researchers proposed an alternative temperature curve with higher values, in which they incorporated estimated temperatures from satellite data for regions with no weather stations. If the model data is corrected downwards, as suggested by the ETH researchers, and the measurement data is corrected upwards, as suggested by the British and Canadian researchers, then the model and actual observations are very similar.

Severe infections with hospitalization after prostate biopsy rising in Sweden


Risk of urinary tract infections after prostate biopsy highest in men with prior infections or significant comorbidities, report Swedish researchers in The Journal of Urology


Transrectal ultrasound guided biopsy is the gold standard for detecting prostate cancer, but international reports have suggested that the risks associated with the procedure are increasing. In a new nationwide population-based study, Swedish researchers found that six percent of men filled a prescription for antibiotics for a urinary tract infection within 30 days after having a prostate biopsy, with a twofold increase in hospital admissions over five years, reports The Journal of Urology®.
Earlier studies reported serious adverse events after prostate biopsy including febrile urinary tract infection and urosepsis in one to four percent of men, despite the use of prophylactic antibiotics. There have also been reports that chronic conditions such as diabetes, benign prostatic hyperplasia (BPH), and a history of urinary tract infection increase the risk of infections.
To estimate the incidence of infection after prostate biopsy and assess risk factors for infection and 90-day mortality in Sweden, researchers looked at records of more than 51,000 men registered in the Swedish Prostate Cancer database who underwent transrectal ultrasound guided prostate biopsy between 2006 and 2011. They also compiled data from the National Prostate Cancer Register (NPCR) of Sweden, which captures more than 96 percent of all newly diagnosed prostate cancers in the country.
“We aimed to estimate the frequency and severity of infectious complications in men diagnosed with prostate cancer after prostate biopsy by examining how many men filled prescriptions for antibiotics related to urinary tract infections, rates of hospitalization within 30 days, and death due to infection,” says lead investigator Karl-Johan Lundström, MD, Department of Surgical and Perioperative Sciences, Urology, Andrology, Umeå University, Östersund, Sweden. “We also capitalized upon the unique nationwide cross-linked health care databases in Sweden to perform a more comprehensive evaluation of potential risk factors for infectious complications,” he adds.
Of the men who filled a prescription for urinary tract antibiotics within 30 days of biopsy, 54 percent filled the prescription in the first week after biopsy. One percent of men were hospitalized with a urinary tract infection.
Between 2006 and 2011 the number of men obtaining an antibiotic prescription after biopsy decreased, whereas the number who were hospitalized increased. No significant increase in 90-day mortality was observed, however.
The strongest risk factors for an antibiotic prescription were multiple comorbidities, particularly diabetes, and prior infection. Overall, approximately two percent of the men had a urinary tract infection during the six months before biopsy.
“Our data show that severe infections with hospitalization after prostate biopsy are increasing in Sweden. The rate of hospital admission increased twofold during this five-year period. However, the risk of dying of an infection after prostate biopsy is very low,” observes Dr. Lundström. “The risk of post-biopsy infection is highest among men with a history of urinary tract infections and those with significant comorbidities. The increasing risk of hospitalization is concerning and highlights the importance of carefully evaluating the indications for biopsy especially in men at increased risk of infection,” he concludes.