Wednesday, January 20, 2010

Hot off the presses! Jan 21 Nature

The Jan 21 issue of Nature is now up on Pubget (About Nature): if you're at a subscribing institution, just click the latest-issue link on the home page. (Note you'll only be able to get all the PDFs in the issue if your institution subscribes to Pubget.)

Latest Articles Include:

  • Climate of suspicion
    - Nature 463(7279):269 (2010)
    With climate-change sceptics waiting to pounce on any scientific uncertainties, researchers need a sophisticated strategy for communication.
  • Ten years of synergy
    - Nature 463(7279):269 (2010)
    Contributions to and from basic science are the part of synthetic biology that most deserves celebration.
  • Self-inflicted damage
    - Nature 463(7279):270 (2010)
    The autocratic actions of an institute's founder could destroy a centre of excellence for brain research.
  • Microbiology: Life in the lost city
    - Nature 463(7279):272 (2010)
  • Geophysics: Synthetic sky light
    - Nature 463(7279):272 (2010)
  • Chemistry: Chase acid, solve maze
    - Nature 463(7279):272 (2010)
  • Astrophysics: Dusty galaxy
    - Nature 463(7279):272 (2010)
  • Evolutionary biology: How girls go solo
    - Nature 463(7279):272 (2010)
  • Biochemistry: Designer label
    - Nature 463(7279):272 (2010)
  • Ecology: Asocial invaders
    - Nature 463(7279):273 (2010)
  • Geoscience: Blowin' in the wind
    - Nature 463(7279):273 (2010)
  • Evolutionary biology: Sperm signals
    - Nature 463(7279):273 (2010)
  • Neuropharmacology: Beating depression
    - Nature 463(7279):273 (2010)
  • Journal club
    - Nature 463(7279):273 (2010)
  • News briefing: 21 January 2010
    - Nature 463(7279):274 (2010)
    The week in science. This article is best viewed as a PDF. The US Food and Drug Administration (FDA) has altered its position on bisphenol A, announcing on 15 January that it has "some concern" about the potential effects of the chemical on the health of fetuses and young children. "'Some concern' means that we need to know more," said FDA commissioner Margaret Hamburg. The agency will conduct further studies over the next 18–24 months, she added. In August 2008, the FDA ruled the chemical safe, to criticism from researchers including the agency's own external science board. At a pan-African conference in Dakar, Senegal, on 12 January, scientists launched the African Physical Society, a continent-wide professional association of physicists. The organization hopes to boost Africa's bid for the enormous Square Kilometre Array radio telescope; a decision on whether Australia or Africa will host the array is expected by the end of 2012. For-profit publishers are among a panel of experts calling for US federally funded research papers to be made open access. The 14-member group, which included information researchers, librarians and academic administrators, reported its findings on 12 January. The report, requested by the House Committee on Science and Technology, says that all research-funding agencies should develop explicit public-access policies, probably including an embargo period of up to a year between publication and public access. The World Health Organization (WHO) on 14 January fended off criticism from the media and politicians that the influence of vaccine manufacturers led it to hype the H1N1 flu pandemic. To call the pandemic fake "is both wrong and is irresponsible", said Keiji Fukuda, special adviser on pandemic influenza to the WHO director-general.
Earlier in the week, the organization said that there would be an independent examination of its handling of the outbreak once the pandemic is declared to be over. The Heterodyne Instrument for the Far Infrared (HIFI) spectrometer on the European Space Agency's Herschel Space Observatory is working again. Damage to the instrument's controls — possibly due to a collision with a cosmic-ray particle — had stopped it working in August 2009. HIFI will scan gas clouds swirling between stars, to understand star formation. Researchers at Baylor College of Medicine in Houston, Texas, face greater scrutiny of their grants after the US National Institutes of Health (NIH) found the institution was not complying with the agency's financial conflict-of-interest rules. In a 14 January letter, NIH director Francis Collins confirmed that Baylor failed to inform the NIH about drugmaker payments disclosed by an NIH-funded cardiologist at the college. The agency has asked Baylor to review financial disclosures from 2004 to the present and wants specific assurance that new grants comply with NIH policy. Baylor says that it is recrafting its policies to meet NIH standards. See go.nature.com/wot5Vv for more. US geologists hope to gather data next week from the magnitude-7.0 quake that struck near Port-au-Prince in Haiti on 12 January and killed tens of thousands of people. See page 276 for more. A proposed long-distance neutrino beam was granted initial approval by the US Department of Energy on 8 January, project officials said last week. The Long Baseline Neutrino Experiment would generate neutrinos at the Fermi National Accelerator Laboratory in Batavia, Illinois. The beam would strike detectors 1,000 kilometres away at DUSEL, a planned lab in an abandoned mine in South Dakota. 
Nine of the eleven warmest years since records began in 1880 occurred during the past decade, according to a 15 January report by the National Climatic Data Center in Asheville, North Carolina, part of the US National Oceanic and Atmospheric Administration (NOAA). On the basis of data released by NOAA, combined land and ocean surface temperatures put 2009 tied for fifth warmest on record; preliminary data from NASA's Goddard Institute for Space Studies in New York have it tied as second warmest. GlaxoSmithKline (GSK) announced measures on 20 January to promote an 'open innovation' strategy for getting medicines to poor countries. As well as creating knowledge-sharing collaborations, GSK is allowing up to 60 scientists to work on their own projects at an 'open lab', based at the company's facility in Tres Cantos, Spain, supported by a US$8-million investment. And GSK has made public the details of 13,500 compounds that screening suggests might inhibit the malaria parasite. See go.nature.com/SjJdm6 for more. One of the earliest manuscripts to record Isaac Newton's tale that a falling apple set him thinking about gravity — perhaps the most enduring anecdote of scientific discovery — is now online in its original form. Britain's Royal Society has digitized, among other works, the pages of William Stukeley's 1752 Memoirs of Sir Isaac Newton's Life — previously available online only as raw text (see go.nature.com/AlWLCZ). In the book (pictured), Stukeley, a contemporary of Newton's, relates the tale from a 1726 conversation under some apple trees: "he told me, he was just in the same situation, as when formerly, the notion of gravitation came into his mind". University of Tehran particle physicist Masoud Alimohammadi was assassinated on 12 January, leaving Iranian academics in fear of further attacks. See page 279 for analysis.
As Nature went to press, the French government was expected to appoint Alain Fuchs as chief executive of the country's basic-research agency, the CNRS. Fuchs, a chemist, currently heads the institute Chimie ParisTech. The CNRS job will combine the existing posts of president and director-general, whose terms of office expire this month. Fuchs's appointment comes at a crucial time for the CNRS, as the government moves to shift research power and funding away from national research agencies to universities. Geneticist and biochemist Marshall Nirenberg, who shared the 1968 Nobel Prize in Physiology or Medicine for his work on deciphering the three-letter genetic code, died aged 82 on 15 January. He worked at the US National Institutes of Health (NIH) from 1957 until his death, and was the first intramural NIH scientist to win a Nobel prize. The European Research Council on 14 January announced grants for 236 research leaders across Europe, totalling around €515 million (US$740 million) in funding. This is the second 'advanced grants' competition for the funding body, a pan-European initiative established in 2007 to support research on the basis of excellence. Its management is currently undergoing reform. The preprint server arXiv.org, the popular repository for physics papers, is seeking donations to keep it running. Since 2001, arXiv has been run by the Cornell University Library in Ithaca, New York. But the library says that it can no longer afford the US$400,000 annual operating costs. This week, it is expected to ask the top 200 institutional users of arXiv to make voluntary contributions. The repository says that it remains committed to keeping its content free for researchers. "We were fighting the 2009 H1N1 flu with vaccine technology from the 1950s," said Kathleen Sebelius, secretary of the US Department of Health and Human Services (HHS), in December 2009. Her view highlights frustrations with the slow process of making flu vaccines in chicken eggs. 
Increased investment in vaccine research and development has spurred competition to replace the technique. Prospective candidates include growing flu vaccines in cultures of mammalian, insect, plant, bacterial or fungal cells. In the past five years, the HHS alone has awarded nearly US$1.8 billion to major pharmaceutical companies to accelerate cell-based flu vaccine development, according to analysis conducted for Nature by consultants PharmaVentures in Oxford, UK (see chart). So far only Swiss firm Novartis and Baxter of Deerfield, Illinois, have European approval for mammalian-cell flu vaccines, and the United States has not yet approved any cell-based flu vaccines. In November 2009, the US Food and Drug Administration (FDA) ruled that a technology developed by Protein Sciences, a biopharmaceutical company in Meriden, Connecticut, needed more data. The company needs to complete more safety studies before the FDA will give the green light to the caterpillar-cell manufacturing process. The 50th anniversary of the deepest-ever manned ocean dive. In 1960, Jacques Piccard and Don Walsh took the bathyscaphe Trieste almost 11,000 metres to the bottom of the Mariana Trench. The Arctic Frontiers conference in Tromsø, Norway, discusses research, governance and sustainable development in the Arctic. → www.arctic-frontiers.com The Royal Society in London hosts a meeting on the detection of extraterrestrial life, and what its consequences would be for science and society. → go.nature.com/ThxodX The United Nations Educational, Scientific and Cultural Organization (UNESCO) holds a conference on biodiversity, focusing on policy priorities. → go.nature.com/dymHtz The revised cost for acquiring and displaying one of the space shuttles after the programme winds up this year. Discovery has already been claimed by the National Air and Space Museum in Washington DC; Endeavour and Atlantis are still up for grabs.
The Bulletin of the Atomic Scientists has nudged back its 'doomsday clock' — on which midnight symbolizes the end of civilization due to threats such as nuclear war. It is now a minute further back than in 2007.
  • Glacier estimate is on thin ice
    - Nature 463(7279):276 (2010)
    IPCC may modify its Himalayan melting forecasts. When might the majestic Himalaya glaciers disappear for good? Hundreds of millions of people rely on water from the Himalayas' mighty glaciers, which experts agree are shrinking as a result of rising global temperatures. But a claim that all of the ice could be gone by 2035 — enshrined in the most recent report from the Intergovernmental Panel on Climate Change (IPCC) — has come under fire from, among others, a coordinating lead author of the IPCC chapter that uses the questionable figure. The dispute highlights the fact that the panel sometimes relies on 'grey' or unrefereed literature. IPCC chairman Rajendra Pachauri says that the panel is investigating whether its report needs to be modified — which, if it were to happen, would be highly unusual. A hasty retreat. At issue is a statement in the portion of the 2007 IPCC report1 compiled by its working group on impacts, adaptation and vulnerability. It says that "glaciers in the Himalaya are receding faster than in any other part of the world and, if the present rate continues, the likelihood of them disappearing by the year 2035 and perhaps sooner is very high if the Earth keeps warming at the current rate". The source cited was a 2005 overview from the conservation group WWF's Nepal Program2, which, in turn, refers to non-refereed findings by glaciologist Syed Iqbal Hasnain, a senior fellow at The Energy and Resources Institute in New Delhi. Hasnain recently told the magazine New Scientist that his initial conclusions, contained in a 1999 report by the Working Group on Himalayan Glaciology of the International Commission on Snow and Ice, were "speculative". Nature could not reach him for comment. Satellite observations and in situ measurements do suggest that many of the more than 45,000 glaciers in the Himalayan and Tibetan region are losing mass.
But given the observed rate of decline so far, many experts doubt that even small glaciers will melt completely before the end of the century. "The IPCC's statement is wrong and misleading," says Andreas Schild, director-general of the International Centre for Integrated Mountain Development in Kathmandu, Nepal. "It was pretty clear early on that this was an error awaiting correction," adds Michael Zemp, a glaciologist with the World Glacier Monitoring Service in Zurich, Switzerland. The loudest charges, however, have come from Murari Lal, director of the Climate, Energy and Sustainable Development Analysis Centre in Ghaziabad, who served as coordinating lead author for the Asia chapter in the working group report. He says that his team followed proper IPCC procedures for using non-refereed studies, which require chapter teams to review the quality of such sources before citing results. The WWF report seemed credible, he says, but he admits that the team should have looked more carefully at the secondary sources to which it refers. Even so, Lal argues that Hasnain, rather than the IPCC reviewers, is to blame for coming up with and continuing to talk about a speculative date. "The findings would have been of major significance to the whole region," Lal says. IPCC representatives say that the bottom line of the Asia chapter remains the same. "There is no scientific doubt on the rapid melting of the glaciers in the Himalayas," says Pachauri, although they are very unlikely to disappear during the next few decades. The section also includes other, smaller errors that are drawing less attention. The chapter attributes to the WWF report, for instance, a related but less drastic estimate that the total area of the Himalayan glaciers could shrink from the present 500,000 square kilometres to 100,000 square kilometres by 2035.
The WWF publication gives no such number. Christopher Field, who is overseeing the impacts working group for the next full IPCC assessment report, says that the team will carefully consider the "extremely important" future of the Himalayan glaciers. By 2014, when the next working group report is due, it should be possible to assess Himalayan glacier retreat "in a way that accurately characterizes the risks of water insecurity and glacial lake outburst flooding", says Field, of the Carnegie Institution for Science in Stanford, California. The IPCC will continue to use a combination of peer-reviewed studies and carefully selected grey literature for its next full report. But Field says the incident shows that the IPCC has an extra responsibility to thoroughly assess the quality of the underlying work. Meanwhile, lingering uncertainty over glacier retreat has prompted India and other countries to put more emphasis on glaciological research. India's environment minister, Jairam Ramesh, has called the shrinking Himalayan ice a matter of national security — even though a report he commissioned last year found little evidence of drastic retreat due to climate change3. Pachauri has challenged that finding as "unsubstantiated". Settling the issue of glacier retreat gained urgency last year with the publication of several papers4,5 based on data from the GRACE gravity-sensing satellites, which highlighted the problem of groundwater depletion in India. As they shrink, the glaciers are expected to add melt water to Himalayan rivers. But if the glaciers disappear altogether, run-off to the headwaters of ten major rivers, including the Indus and the Ganges, will drop markedly. Still, it is unclear whether or when this may happen. Himalayan glaciers, says glaciologist Michael Bishop of the University of Nebraska in Omaha, behave very differently in different places. "Sweeping conclusions," he says, "just don't hold water."
    References
    1. Cruz, R. V. et al. in Climate Change 2007: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (eds Parry, M. L., Canziani, O. F., Palutikof, J. P., van der Linden, P. J. & Hanson, C. E.) 469-506 (Cambridge Univ. Press, 2007); available at www.ipcc.ch/publications_and_data/ar4/wg2/en/ch10s10-6-2.html
    2. WWF Nepal Program. An Overview of Glaciers, Glacier Retreat, and Subsequent Impacts in Nepal, India and China (2005); available at http://assets.panda.org/downloads/himalayaglaciersreport2005.pdf
    3. Raina, V. Himalayan Glaciers: A State-of-Art Review of Glacial Studies, Glacial Retreat and Climate Change (2009); available at http://moef.nic.in/downloads/public-information/MoEF%20Discussion%20Paper%20_him.pdf
    4. Rodell, M., Velicogna, I. & Famiglietti, J. Nature 460, 999-1002 (2009).
    5. Tiwari, V. M., Wahr, J. & Swenson, S. Geophys. Res. Lett. doi:10.1029/2009GL039401 (2009).
  • Geologists to evaluate future Haiti risks
    - Nature 463(7279):276 (2010)
    Hunt for survey markers may reveal crucial data. US geologists hope to arrive in Haiti next week to pick through the rubble of the earthquake that struck on 12 January, killing tens of thousands of people. The scientists will hunt for survey markers that could help them better understand the geology of what happened — and perhaps determine where future risk lies. The stainless-steel pins, usually set in concrete bases, are crucial landmarks for measuring earth movements as small as 1 millimetre. To date, the array of 30 devices in Haiti, and 40 in the Dominican Republic — which shares the island of Hispaniola with Haiti — has yielded the best analysis yet of the local earthquake risk. Finding them could allow researchers to better estimate the likelihood of future fault movements. Major earthquakes are rare in the region. But in 2008, a team led by Eric Calais, a geophysicist at Purdue University in West Lafayette, Indiana, reported at a Caribbean geology conference that the geodetic markers revealed a dangerous strain build-up along Haiti's Enriquillo fault — enough to produce a magnitude-7.2 quake. Last week's quake, on that fault, was a magnitude 7.0. Hispaniola sits on the rim of the Caribbean tectonic plate (see map). To the northeast, the North American plate pushes under the Caribbean plate, driving it westwards along two parallel faults: the Enriquillo fault on the southern side of the island and the Septentrional fault along the north shore. These faults periodically lock, build up strain, then release it in earthquakes. Major quakes have not struck the Enriquillo fault area since 1860. Calais's team passed its warnings on to the Haitian government, but even developed nations would struggle to set up proper earthquake preparedness in the course of just two years. For now, the focus is on helping to assess immediate geological hazards, such as landslides, and gathering data for future studies of seismic risk.
Calais will be going to the island with Paul Mann, a geologist at the University of Texas at Austin who has described the Enriquillo fault (P. Mann et al. Tectonophysics 246, 1–69; 1995). They will be working with Haitian colleagues in the bureau of mines and energy to take Global Positioning System (GPS) measurements from as many of the geodetic markers as they can, to see how much the fault slipped at different points along its length. "These benchmarks are extremely important, representing years of data," says Calais. Mann will be looking for surface signs of the fault rupture — called mole tracks because they look like the swell along the surface sometimes produced by the burrowing mammals. Researchers can plug that information into a model to calculate where strain has now built up along the fault and where future quakes might strike. Seismologists from the universities of Nice and Brest in France will be coming with portable seismometers. UNAVCO, a non-profit consortium in Boulder, Colorado, has provided ten additional GPS receivers to deploy, and more may be coming from other sources, says Calais. And the US Geological Survey hopes to send in a rapid-response team, working with the US Agency for International Development. Meanwhile, other researchers are trying to get insight into the quake from afar. At the University of Miami in Florida, Tim Dixon and Falk Amelung are looking to see whether space-borne radar interferometers, such as that aboard Japan's Advanced Land Observing Satellite, detected deformation of the terrain before the quake. The satellite is expected to pass over Haiti again this week. Yet all acknowledge that the science will do little immediate good unless countries are able to incorporate the findings into future preparedness plans.
  • Genomics boosts brain-cancer work
    - Nature 463(7279):278 (2010)
    Few diseases are more intractable than brain cancer. Many factors make it difficult to treat, including the sensitivity of the brain, the complexity of the disease — catch-all diagnoses can obscure differences between tumour types — and the small number of patients, which limits the power of clinical trials and makes them less attractive to large drug companies.
  • Iranian academics fear more killings
    - Nature 463(7279):279 (2010)
    Concern grows in the wake of particle physicist's death. Iran's scientific community is reeling after the assassination on 12 January of Masoud Alimohammadi, a particle physicist at the University of Tehran. Alimohammadi was killed by a bomb as he got into his car to go to work. "Everyone is worried that this may be only the start, and that there may be more killings of academics to come," one researcher says. Nature interviewed half a dozen scientists in Iran who knew Alimohammadi, all of whom requested anonymity. They are mystified as to why he was singled out. "I could expect that some influential political figure be assassinated, but not him," says one. Like many intellectuals in Iran, he was politically engaged, but far from being a political activist, the researchers say. Iran's President Mahmoud Ahmadinejad and supreme leader Ayatollah Ali Khamenei have said that the killing was perpetrated by the country's "enemies" and was designed to hamper its scientific and technological progress. State media portrayed Alimohammadi as a "martyr" (pictured) and a "committed revolutionary professor". Scientists in Iran hotly contest the official picture of Alimohammadi as a supporter of the Ahmadinejad regime. They say that he opposed both the current regime and the violent crackdown on protests that followed the disputed presidential elections last June. They also question the regime's implication that Alimohammadi was involved in Iran's nuclear programme, making him a target. Alimohammadi supported the 1979 Islamic Revolution from the outset and had links in the past with the Islamic Revolutionary Guard Corps. But he opposed the hardline crackdown on student demonstrations in 1999. Last year, he was among hundreds of academics who signed a petition endorsing Mir Hossein Mousavi, the reformist presidential candidate.
Alimohammadi was also an organizer and first signatory of a 4 January letter by 88 academics at Tehran University to Ayatollah Khamenei, protesting against the regime's post-election repression and attacks on universities and students. Friends say that he did not endorse overthrow of the regime, but was "keen to find solutions for a way out of the crisis". The scientists also say that, to their knowledge, Alimohammadi had nothing to do with Iran's nuclear efforts or any military programme. Although the regime last week described him as a nuclear scientist, he was a theoretical particle physicist; his PhD was on string theory, and he then moved on to the quantum effects of gravity and gauge theories, and more recently to research on dark energy and modified Newtonian dynamics. "I can see no reason why or how Iran's military or nuclear programmes could benefit from Alimohammadi's expertise," says Moshe Paz-Pasternak, a physicist at Tel Aviv University in Israel who worked with Alimohammadi on the Middle East synchrotron SESAME (see 'Physicist was part of 'science for peace' project'). All of which leaves the killer's identity and motives a mystery. The regime has blamed Israel, the United States and "their lackeys", as well as various dissident groups. Others speculate that hardliners in the regime itself might have staged the killing as a warning to opposition supporters. "If Alimohammadi was murdered by hardliners, then the message would be clear: that they are willing even to assassinate well-known and well-respected academics," says materials scientist Muhammad Sahimi of the University of Southern California in Los Angeles, "and that other academics must think twice before they participate in political activities to support Mousavi". Many Iranian researchers say they would like the international scientific community to speak out and condemn the assassination.
  • The scientific diplomat
    - Nature 463(7279):281 (2010)
    AAAS president Peter Agre talks to Nature about his recent visits to Cuba and North Korea. A physician and Nobel-prizewinning chemist at Johns Hopkins University in Baltimore, Maryland, Peter Agre (pictured) has never shied away from politics. In 2007 he briefly considered running for the US Senate; in 2009 he became president of the American Association for the Advancement of Science (AAAS) and, with the association's Center for Science Diplomacy, began to engage with some of his country's bitterest political enemies. Over a shaky mobile-phone line from rural Zambia, where he is assisting efforts to combat malaria, Agre spoke to Nature about his recent visits to Cuba and North Korea. We decide on countries where we think we could do better than our elected officials. We choose obvious nations, where we have everything to gain and nothing to lose, just to see how we can help them to develop peaceful science. In Cuba, we visited the Center for Genetic Engineering and Biotechnology, the Finlay Institute for vaccine research and production, the University of Havana, the Latin American School of Medicine and the Cuban Academy of Sciences, all in Havana. North Korea granted visits to all the institutions that we asked to see, including the State Academy of Sciences' biology and biotechnology branches, Kim Chaek University of Technology in Pyongyang and Pyongyang University of Science and Technology. I have to be careful not to oversell. We met with officials, not with graduate students to watch them do experiments. The goal is to make it interesting so that we can continue the exchange. There was some scepticism on their part. They have watched US policy change, warm up and cool down over previous presidential terms. Both countries are trying hard to boost science, but the level is difficult to establish. Both had beautiful computing centres, as good as I see at most US universities. In Cuba, they had excellent English and were highly motivated.
They seem to have had large success in raising vaccines to infectious diseases. They also seem to have excellent public-health networks, and disease prevention is a priority. I sense that the [US trade] embargo has strongly inhibited sophisticated science. North Korea has tremendous intellectual horsepower and could develop fast. Stuart Thorson [who directs an information-technology collaboration between Syracuse University in New York and Kim Chaek University of Technology] says that undergraduate teams performed well in international undergraduate computer-programming competitions. The level of science is variable. They seem to have large success in computer-information systems; we toured some biological labs that were modest. Central heating was lacking. The DPRK [Democratic People's Republic of Korea] is a poor country, and capital investment in laboratories and basic amenities is needed. There was some talk of stem-cell and cloning research, but I'm not aware of whether they are achieving success. We would need to see more. It cannot help that they have such limited access to outside information. We couldn't run the streets. And we avoided controversial topics. In both countries a few private opinions were shared, and their viewpoints seemed very reasonable. In North Korea, four individuals from the State Academy of Sciences accompanied us everywhere outside the hotel. Some people ask if we felt safe in North Korea; I wish I could feel that safe in Baltimore. Exchanges were free, and there were discussions of family and personal interests. The fear of Americans could easily evaporate with more positive contacts. Their isolation is really extreme, and most of our science contacts admitted that they had never met Americans before. In Cuba, a tour-group leader and interpreters travelled with us, but we were not restricted. Free exchanges and visits with shopkeepers and others in restaurants occurred. We stayed away from controversial issues. 
The next step with Cuba will be more visits — Cubans to the United States and Americans to Cuba. Increased interactions in Europe or other locations should also be expected. Until the US government changes its policy towards Cuba, we need to be careful that we remain focused on the science and do not delve into the policy arena without a portfolio for doing so. Areas of future exchanges may include public health, disaster preparations and agriculture. With regard to North Korea, the next step will occur when the State Academy of Sciences approves an agreement to pursue collaborations. More visits to North Korea and to the United States, and contacts during activities in other countries, are anticipated. We cannot rush North Korea any more than we can rush the US government. Areas of future exchanges may include pharmaceutical manufacture, digital-library access, information systems and agriculture. Syria, Rwanda and Myanmar. [David Baltimore, Agre's predecessor as AAAS president, made exploratory visits to Syria and Rwanda, and follow-up exchanges are under way. A mission to Myanmar is planned for 2010.] I imagine it is like the situation 30 years ago when China was still isolated. Now it's hard to imagine life without China. And science could be the path to political peace. In the 1950s, it helped a lot to have scientists in the Soviet Union and the United States talking to each other.
  • 'Big science' spurs collaborative trend
    - Nature 463(7279):282 (2010)
Complicated projects mean that science is becoming more globalized. It has never been a more fruitful time for collaborations with foreign scientists, and the European Union (EU) is leading the pack. Spurred by funding policies, half of EU research articles had international co-authors in 2007, more than twice the level of two decades ago, according to a major report released last week by the US National Science Foundation. The EU level of international co-authorship is about twice that of the United States, Japan and India, but levels in these countries are rising — a sign of the continued allure of working across borders. "The phenomenon is across disciplines," says Loet Leydesdorff, a science-metrics expert at the University of Amsterdam. "You can find it everywhere." András Schubert, editor of the journal Scientometrics and a researcher at the Institute for Science Policy Research in Budapest, Hungary, says that the rise in collaborations is partly out of necessity, corresponding with the rise of 'big science'. Many scientific endeavours — whether colliding particles or sequencing genomes — have become more complicated, requiring the money and labour of many nations. But he says that collaborations have also emerged because of heightened possibilities: the Internet allows like-minded scientists to find each other, and dramatic drops in communications costs ease long-distance interactions. And there is a reward: studies of citation counts show that internationally co-authored papers have better visibility (O. Persson et al. Scientometrics 60, 421–432; 2004). "Scientists are motivated by vanity," says Schubert. "International collaboration is a way to propagate ideas in wider and wider circles." 
Caroline Wagner, a research scientist at George Washington University in Washington DC and US chief executive of consulting company Science-Metrix in Alexandria, Virginia, notes that international collaborations offer additional flexibility. Whereas local collaborations sometimes persist past the point of usefulness because of social or academic obligations, international ones can be cultivated and dropped more freely. The collaborative trend is true across scientific disciplines, although some fields have a greater tendency for it. Particle physicists and astronomers collaborate often, for example, because they congregate at shared facilities such as particle colliders and observatories. Mathematicians, by contrast, tend historically towards solitude and lag behind other disciplines in the level of co-authorship — although, Wagner says, collaborations are rising there, too. The level of collaboration also varies by country, as shown by the report's 'international collaboration index' (see table). Scandinavians are much more apt to collaborate with each other than, for example, the French are with the Germans, according to the index. "There are overlays of history, and political reasons, as to why collaborations emerge," says Wagner. Yet even below-average penchants for partnership (shown by values less than 1) have risen over the past decade — apparently boosted by policies embedded in European Framework funding schemes that require collaboration. The United States tends to score below average because many researchers form within-country collaborations, says Rolf Lehming, director of the programme that produces the National Science Foundation report, Science and Engineering Indicators 2010. 
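The report's exact formula for the collaboration index is not reproduced here, but indices of this kind are typically an observed-to-expected ratio: a value of 1 means two countries co-author exactly as often as their overall international activity predicts. A minimal sketch of that assumed formulation, with invented numbers:

```python
def collaboration_index(coauthored, intl_a, intl_b, intl_total):
    """Observed vs expected co-authorship between countries A and B.

    coauthored:  papers with authors from both A and B
    intl_a/b:    each country's total internationally co-authored papers
    intl_total:  all internationally co-authored papers in the dataset

    Values above 1 mean A and B partner more often than their overall
    activity predicts; values below 1, less often.
    """
    expected = intl_a * intl_b / intl_total
    return coauthored / expected

# Hypothetical counts: a pair with a strong mutual link scores above 1.
print(collaboration_index(200, 1_000, 2_000, 20_000))  # 2.0
```

The normalization is what lets the index compare, say, Scandinavian pairs against a France–Germany pair despite very different absolute output.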
Relatively high index values for Asian country pairs suggest the birth of an intra-Asian zone of scientific collaboration (see 'The rise of Asia') — with the exception of collaborations between China and India, which dropped in the past decade, perhaps as they formed links with other countries. But Wagner says that these national differences will become less meaningful as individual scientists enter a globalized science system that is open, undirected and ultimately more efficient: "What we really have is a breakdown of national concepts and national systems."
  • Bulgarian science reform attacked
    - Nature 463(7279):283 (2010)
Researchers say law wouldn't fix nation's higher-education system. Three years after it joined the European Union, Bulgaria is working to improve its ranking as one of the region's lowest overall performers in science. But a proposed law meant to improve research and universities is meeting protests from the scientists themselves. The law would dismantle the central Higher Attestation Commission, which awards advanced degrees and oversees academic appointments. Instead, universities would be responsible for awarding their own higher degrees, as happens elsewhere in Europe. Some researchers charge that the move would eliminate quality control of PhD and postdoctoral work, particularly in the universities that have sprung up recently. Bulgaria had just four universities in 1990, plus a handful of medical and engineering schools. Now there are 53 universities, serving a population of 7.5 million. Last week, an action group called the Civil Movement for Support of Bulgarian Science and Education presented parliament with a list of demands for changes to the proposed law. They include setting up a system to govern university accreditation before allowing the institutions to award their own higher degrees. "We agree that this centralized system is archaic, and that universities should be independent," says Oleg Yordanov, a physicist at the Institute of Electronics in Sofia who is involved with the group. "But the commission needs to be replaced with a system that guarantees a minimum quality of academic achievement." The University of Sofia and the Bulgarian Academy of Sciences, which together account for 90% of peer-reviewed publications in Bulgaria, have also registered concerns with the government. Bulgaria's minister of science and education, Sergey Ignatov, an Egyptologist, declined to comment. 
Mathematician Emil Horozov of the University of Sofia says that it would be an "enormous problem" if universities were to give out their own higher degrees, as the agency that accredited the new universities is part of a system that brought "our country to ruin". Horozov took over this month as head of Bulgaria's granting agency, the National Research Funds, and he says that he intends to introduce reforms there as well, such as involving foreign reviewers and making the granting procedures fully transparent. The law passed its first reading in parliament on 18 December and will go through a second round of discussion and a vote in the next few weeks.
  • Climate: The real holes in climate science
    - Nature 463(7279):284 (2010)
Like any other field, research on climate change has some fundamental gaps, although not the ones typically claimed by sceptics. Quirin Schiermeier takes a hard look at some of the biggest problem areas. The e-mails leaked from the University of East Anglia's Climatic Research Unit (CRU) in November handed climate-change denialists an early Christmas present. Amid the more than 1,000 messages were several controversial comments that — taken out of context — seemingly indicate that climate scientists have been hiding a mound of dirty laundry from the public. A fuller reading of the e-mails from CRU in Norwich, UK, does show a sobering amount of rude behaviour and verbal faux pas, but nothing that challenges the scientific consensus of climate change. Still, the incident provides a good opportunity to point out that — as in any active field of inquiry — there are some major gaps in the understanding of climate science. In its most recent report in 2007, the Intergovernmental Panel on Climate Change (IPCC) highlighted 54 'key uncertainties' that complicate climate science. Such a declaration of unresolved problems could hardly be called 'hidden'. And some of these — such as uncertainties in measurements of past temperatures — have received considerable discussion in the media. But other gaps in the science are less well known beyond the field's circle of specialists. Such holes do not undermine the fundamental conclusion that humans are warming the climate, which is based on the extreme rate of twentieth-century temperature change and the inability of climate models to simulate such warming without including the role of greenhouse-gas pollution. The uncertainties do, however, hamper efforts to plan for the future. And unlike the myths regularly trotted out by climate-change denialists (see 'Enduring climate myths'), some of the outstanding problems may mean that future changes could be worse than currently projected. 
Researchers say it is difficult to talk openly about holes in understanding. "Of course there are gaps in our knowledge about Earth's climate system and its components, and yes, nothing has been made clear enough to the public," says Gavin Schmidt, a climate modeller at NASA's Goddard Institute for Space Studies in New York and one of the moderators and contributors to the influential RealClimate blog. "But this climate of suspicion we're working in is insane. It's really drowning our ability to soberly communicate gaps in our science when some people cry 'fraud' and 'misconduct' for the slightest reasons." Nature has singled out four areas — regional climate forecasts, precipitation forecasts, aerosols and palaeoclimate data — that some say deserve greater open discussion, both within scientific circles and in the public sphere. Regional climate prediction The sad truth of climate science is that the most crucial information is the least reliable. To plan for the future, people need to know how their local conditions will change, not how the average global temperature will climb. Yet researchers are still struggling to develop tools to accurately forecast climate changes for the twenty-first century at the local and regional level. The basic tools used to simulate Earth's climate are general circulation models (GCMs), which represent physical processes in the global atmosphere, oceans, ice sheets and on the land's surface. Such models generally have a resolution of about 1–3° in latitude and longitude — too coarse to offer much guidance to people. So climate scientists simulate regional changes by zooming in on global models — using the same equations, but solving them for a much larger number of grid points in particular locations. However, increasing the resolution in this way can lead to problems. 
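To see why a 1–3° grid is "too coarse", it helps to convert the spacing to kilometres: a degree of latitude spans roughly 111 km, and a degree of longitude shrinks with the cosine of latitude. This back-of-envelope helper is purely illustrative:

```python
import math

def grid_cell_km(res_deg, lat_deg):
    """Approximate footprint of a res_deg x res_deg grid cell at a latitude.

    Uses ~111 km per degree of latitude; east-west spacing shrinks by
    cos(latitude). Returns (north-south km, east-west km).
    """
    km_per_deg = 111.0
    ns = res_deg * km_per_deg
    ew = res_deg * km_per_deg * math.cos(math.radians(lat_deg))
    return ns, ew

# A 2-degree cell at 45 degrees north spans roughly 222 km by 157 km,
# far larger than most cities, valleys or mountain ranges.
print(grid_cell_km(2, 45))
```

At that scale a single cell averages over whole weather systems, which is why regional detail has to come from downscaling rather than from the GCM grid itself.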
Zooming in from GCMs bears the risk of blowing up any inherent weakness of the 'mother' model. If the model does a poor job of simulating certain atmospheric patterns, those errors will be compounded at the regional level. Most experts are therefore cautious when asked to make regional predictions. "Our current climate models are just not up to informed decision-making at the resolution of most countries," says Leonard Smith, a statistician and climate analyst at the London School of Economics and Political Science. "You need to be very circumspect about the added value of downscaling to regional impacts," agrees Hans von Storch, a climate modeller at the GKSS Institute for Coastal Research in Geesthacht, Germany, who has recently contributed to a regional climate assessment of the Hamburg metropolitan region. If the simulations project future changes in line with the trends already observed, von Storch has more confidence in them. But if researchers run the same model, or an ensemble of models, multiple times and the results diverge from each other or from the observed trends, he cautions, "planners should handle them with kid gloves. Whenever possible, they'd rather wait to spend big money on adaptation projects until there is more certainty about the things to come." Downscaled climate models face particular uncertainty problems in regions with complex topography, such as where mountains form a wall between two climatically different plains. Another potential source of error comes from projections concerning future greenhouse-gas emissions, which vary depending on assumptions about economic developments. Climate simulations for Europe for the end of the current century suggest warming (top) of more than 3 °C relative to the end of the twentieth century; precipitation projections (bottom) indicate drying of southern Europe and wetter conditions in northern Europe (IPCC, Climate Change 2007: The Physical Science Basis, Ch. 11; 2007).
All the problems, however, do not make regional simulations worthless, as long as their limitations are understood. They are already being used by planners at the local and national levels (see graphs, right). Simulations remain an important tool for understanding processes, such as changes in river flow, that global models just cannot resolve, says Jonathan Overpeck, a climate researcher at the University of Arizona in Tucson. Overpeck is part of a research team that is using statistical techniques to narrow down divergent model projections of how much average water flow in the Colorado River will decrease by 2050. Researchers hope that by improving how they simulate climate variables such as cloud coverage and sea surface temperatures, they will further reduce the uncertainties in regional forecasts, making them even more useful for policy-makers. Precipitation Rising global temperatures over the next few decades are likely to increase evaporation and accelerate the global hydrological cycle — a change that will dry subtropical areas and increase precipitation at higher latitudes. These trends are already being observed, and almost all climate models used to simulate global warming show a continuation of this general pattern1. Projections of precipitation change for 2090–99: blue indicates increases in precipitation and brown denotes drying; white represents areas of uncertainty, where less than two-thirds of models agreed on whether precipitation would increase or decrease; stippled areas indicate where 90% of the models agreed on the sign of the change (IPCC, Climate Change 2007: The Physical Science Basis, Ch. 11; 2007). Unfortunately, when it comes to precipitation, that is about all the models agree on. The different simulations used by the IPCC in its 2007 assessment offer wildly diverging pictures of snow and rainfall in the future (see graphic, right). 
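The agreement rule behind the IPCC map (white where fewer than two-thirds of models agree on the sign of change, stippling where at least 90% do) can be sketched per grid cell. The thresholds come from the figure caption; the implementation and the model numbers below are illustrative assumptions:

```python
import numpy as np

def classify_agreement(projections, robust=0.9, uncertain=2 / 3):
    """Label grid cells by how many models agree on the sign of change.

    projections: array of shape (n_models, n_cells), projected
    precipitation change per model per cell. Returns 'stippled' where
    at least `robust` of models share the majority sign, 'white' where
    fewer than `uncertain` do, and 'plain' in between.
    """
    signs = np.sign(projections)
    frac_positive = (signs > 0).mean(axis=0)
    majority = np.maximum(frac_positive, 1 - frac_positive)
    return np.where(majority >= robust, "stippled",
                    np.where(majority < uncertain, "white", "plain"))

# 10 hypothetical models, 3 cells: unanimous wetting, a 7-3 split,
# and a 4-6 split that falls below the two-thirds threshold.
changes = np.array([
    [+1, +1, +1], [+1, +1, +1], [+1, +1, +1], [+1, +1, +1],
    [+1, +1, -1], [+1, +1, -1], [+1, +1, -1], [+1, -1, -1],
    [+1, -1, -1], [+1, -1, -1],
])
print(classify_agreement(changes))  # ['stippled' 'plain' 'white']
```

Only the sign of change enters the rule, which is why models can "agree" on a map even when they disagree wildly about the magnitude.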
The situation is particularly bad for winter precipitation, generally the most important in replenishing water supplies. The IPCC simulations failed to provide any robust projection of how winter precipitation will change at the end of the current century for large parts of all continents2. Even worse, climate models seemingly underestimate how much precipitation has changed already — further reducing confidence in their ability to project future changes. A 2007 study3, published too late to be included in the last IPCC report, found that precipitation changes in the twentieth century bore the clear imprint of human influence, including drying in the Northern Hemisphere tropics and subtropics. But the actual changes were larger than estimated from models — a finding that concerns researchers. "If the models do systematically underestimate precipitation changes, that would be bad news", because the existing forecasts would already cause substantial problems, says Gabriele Hegerl, a climate-system scientist at the University of Edinburgh, UK, and a co-author on the paper. "This is, alas, a very significant uncertainty," she says. Climate scientists think that a main weakness of their models is their limited ability to simulate vertical air movement, such as convection in the tropics that lifts humid air into the atmosphere. The same problem can trip up the models for areas near steep mountain ranges. The simulations may also lose accuracy because scientists do not completely understand how natural and anthropogenic aerosol particles in the atmosphere influence clouds. Data on past precipitation patterns around the globe could help modellers to solve some of these issues, but such measurements are scant in many areas. "We really don't know natural variability that well, particularly in the tropics," says Hegerl. 
The uncertainties about future precipitation make it difficult for decision-makers to plan, particularly in arid regions such as the Sahel in Africa and southwestern North America. 'Mega-droughts' lasting several decades have struck these areas in the past and are expected to happen again. But the models in use today do a poor job of simulating such long-lasting droughts. "That's pretty worrying," says Overpeck. Increasing the resolution of models will not be enough to resolve the convective processes that lead to precipitation. To forecast precipitation more accurately, researchers are trying, among other things, to improve the simulation of key climate variables such as the formation and dynamics of clouds. Furthermore, high-resolution satellite observations are increasingly being used to validate and improve model realism. Aerosols Atmospheric aerosols — airborne liquid or solid particles — are a source of great uncertainty in climate science. Despite decades of intense research, scientists must still resort to using huge error bars when assessing how particles such as sulphates, black carbon, sea salt and dust affect temperature and rainfall. Overall, it is thought that aerosols cool climate by blocking sunlight, but the estimates of this effect vary by an order of magnitude, with the top end exceeding the warming power of all the carbon dioxide added to the atmosphere by humans. One of the biggest problems is lack of data. "We don't know what's in the air," says Schmidt. "This means a major uncertainty over key processes driving past and future climate." To measure aerosols in the sky, satellite and ground-based sensors detect the scattering and absorption of solar radiation. But researchers lack enough of this kind of data to complete a picture of aerosols across the globe. And a complex set of coordinated experiments is required to determine how aerosols alter climate processes. 
Some aerosols, such as black carbon, absorb sunlight and produce a warming effect that might also inhibit rainfall. Other particles, such as sulphates, exert a cooling influence by reflecting sunlight. The net effect of aerosol pollution on global temperature is not well established. And various studies have produced conflicting conclusions over whether global aerosol pollution is increasing or decreasing. The relationship between aerosols and clouds adds another layer of complication. Before a cloud can produce rain or snow, raindrops or ice particles must form, and aerosols often serve as the nuclei for condensation. But although some aerosols enhance cloudiness, others seem to reduce it. Aerosols could also have a tremendous impact on temperatures by altering the formation and lifetime of low-level clouds, which reflect sunlight and cool the planet's surface. The thin white lines show how aerosols from ship exhausts brighten clouds over the Atlantic Ocean (Image: J. Descloitres, MODIS/NASA/GSFC). Scientists have yet to untangle the interplay between pollution, clouds, precipitation and temperature. However, NASA's Glory satellite, an aerosol and solar-irradiance monitoring mission scheduled for launch in October, will provide some greatly anticipated data. Still, atmospheric researchers say that ground-based sensors capable of determining the abundance and composition of aerosols in the atmosphere are needed just as much. The tree-ring controversy Many of the e-mails leaked from the CRU computers came from a particular group of climate researchers who work on reconstructing temperature variations over time. The e-mails revealed them discussing some of the uncertainties in centuries worth of climate information gleaned from tree rings and other sources. Records of thermometer measurements over the past 150 years show a sharp temperature rise during recent decades that cannot be explained by any natural pattern. 
It is most likely to have been caused by anthropogenic greenhouse-gas emissions. But reliable thermometer records from before 1850 are scarce, and researchers must find other ways to reveal earlier temperature trends. Palaeoclimatology relies on records culled from sources such as tree rings, coral reefs, lake sediments, stalagmites, glacial movements and historical accounts. As trees grow, for example, they develop annual rings whose thickness reflects temperature and rainfall. Proxies such as these provide most knowledge of past climate fluctuations, such as the Medieval Warm Period from about 800 to 1300 and the Little Ice Age, centred on the year 1700. Northern Hemisphere temperature estimates (IPCC, Climate Change 2007: The Physical Science Basis, Ch. 6; 2007). When proxy records for the Northern Hemisphere are stitched together, they show a pattern resembling a hockey stick, with temperatures during the late twentieth century rising substantially above the long-term mean. This type of work was pioneered in 1998 by Michael Mann, a climate researcher then at the University of Virginia in Charlottesville, and his co-authors4. In a subsequent publication5, they concluded that the decade of the 1990s was probably the warmest decade, and 1998 the warmest year, in at least a millennium. That work figured prominently in the 2001 assessment by the IPCC. But the use and interpretation of such proxy records has generated considerable controversy. One notable critic, Stephen McIntyre, a retired Canadian mining consultant and editor of the Climate Audit blog, has spent much of the past decade challenging the work of Mann and other scientists whose e-mails were leaked. McIntyre has doggedly attacked the proxy records6, particularly the statistics used to analyse tree-ring data. 
Many scientists are tired of the criticisms, and the IPCC concluded that it is "likely" that the second half of the twentieth century was the warmest 50-year period in the Northern Hemisphere during the past 1,300 years. But legitimate questions remain about palaeoclimate proxies, according to the IPCC7. Climate scientists are worried in particular about tree-ring data from a few northern sites. By examining temperature measurements from nearby, researchers know that tree growth at these locations tracked atmospheric temperatures for much of the twentieth century and then diverged from the actual temperatures during recent decades. It may be that when temperatures exceed a certain threshold, tree growth responds differently. The 'divergence' issue also made an appearance in the CRU affair. In the most frequently quoted of the CRU e-mails, the former director of the centre, Phil Jones, mentioned a 'trick' — namely using actual observations of late-twentieth-century temperatures instead of tree-ring data — to 'hide the decline' in the response of trees to the warming temperatures. On the surface, Jones's phrasing seems damning. Indeed, a graph of Northern Hemisphere temperature produced for the World Meteorological Organization in 2000 with Jones's help fails to make clear that instrumental records from the nineteenth and twentieth centuries were spliced onto proxy data for the past millennium because of the divergence issue. The figure did, however, contain clear references to papers that discussed the divergence issue. "They show what was, at the time, the best estimate of how temperatures evolved over time," says Hegerl. "However, with hindsight, they could have been a bit clearer about how this was done, given the high profile that figures like this can have." 
Aside from the issue of clarity, the decision to exclude the tree-ring records that diverge from the instrumental data makes sense, says Thomas Stocker, co-chair of the IPCC's working group on the physical basis of climate change. The tree-ring divergence problem is restricted to a few high-latitude regions in the Northern Hemisphere and is not ubiquitous even there, he says. Still, the divergence issue remains a source of debate within the scientific community. "I'm worried about what causes the divergence," says Hegerl. "As long as we don't understand why they diverge, we can't be sure that they accurately represent the past." So improving the usefulness of proxies will require a better understanding of how different species of trees grow and respond to climate change. Another outstanding problem in proxy research is the large range of uncertainty for temperatures from before about 1500. Studies published in 2004 (ref. 8) and 2005 (ref. 9), based on a combination of proxies of different resolution, suggest that fluctuations in global temperature during the past millennium may have been larger than initially thought. However, these studies still show late-twentieth-century warming to be unprecedented, says von Storch. And the most recent decade was warmer still. Even with ongoing questions about the proxy data, the IPCC's key statement — that most of the warming since the mid-twentieth century is "very likely" to be due to human-caused increases in greenhouse-gas concentration — remains solid because it rests on multiple lines of evidence from different teams examining many aspects of the climate system, says Susan Solomon, the former co-chair of the IPCC team that produced the 2007 physical science report and a climate researcher with the US National Oceanic and Atmospheric Administration in Boulder, Colorado. 
"The IPCC's team of scientists," she says, "would not have said that warming is unequivocal based on a single line of evidence — even if it came from Moses himself."
References
1. Meehl, G. A. et al. in Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (eds Solomon, S. et al.) Ch. 10, 760-789 (Cambridge Univ. Press, 2007).
2. IPCC Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge Univ. Press, 2007).
3. Zhang, X. et al. Nature 448, 461-465 (2007).
4. Mann, M. E., Bradley, R. S. & Hughes, M. K. Nature 392, 779-787 (1998).
5. Mann, M. E., Bradley, R. S. & Hughes, M. K. Geophys. Res. Lett. 26, 759-762 (1999).
6. McIntyre, S. & McKitrick, R. Geophys. Res. Lett. 32, L03710 (2005).
7. Jansen, E. et al. in Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (eds Solomon, S. et al.) Ch. 6, 466-475 (Cambridge Univ. Press, 2007).
8. von Storch, H. et al. Science 306, 679-682 (2004).
9. Moberg, A., Sonechkin, D. M., Holmgren, K., Datsenko, N. M. & Karlén, W. Nature 433, 613-617 (2005).
  • Bioengineering: Five hard truths for synthetic biology
    - Nature 463(7279):288 (2010)
Can engineering approaches tame the complexity of living systems? Roberta Kwok explores five challenges for the field and how they might be resolved. To read some accounts of synthetic biology, the ability to manipulate life seems restricted only by the imagination. Researchers might soon program cells to produce vast quantities of biofuel from renewable sources, or to sense the presence of toxins, or to release precise quantities of insulin as a body needs it — all visions inspired by the idea that biologists can extend genetic engineering to be more like the engineering of hardware. The formula: characterize the genetic sequences that perform needed functions (the 'parts'), combine the parts into devices to achieve more complex functions, then insert the devices into cells. As all life is based on roughly the same genetic code, synthetic biology could provide a toolbox of reusable genetic components — biological versions of transistors and switches — to be plugged into circuits at will. Such analogies don't capture the daunting knowledge gap when it comes to how life works, however. "There are very few molecular operations that you understand in the way that you understand a wrench or a screwdriver or a transistor," says Rob Carlson, a principal at the engineering, consulting and design company Biodesic in Seattle, Washington. And the difficulties multiply as the networks get larger, limiting the ability to design more complex systems. A 2009 review1 showed that although the number of published synthetic biological circuits has risen over the past few years, the complexity of those circuits — or the number of regulatory parts they use — has begun to flatten out. Challenges loom at every step in the process, from the characterization of parts to the design and construction of systems. 
"There's a lot of biology that gets in the way of the engineering," says Christina Agapakis, a graduate student doing synthetic-biology research at Harvard Medical School in Boston, Massachusetts. But difficult biology is not enough to deter the field's practitioners, who are already addressing the five key challenges. Many of the parts are undefined A biological part can be anything from a DNA sequence that encodes a specific protein to a promoter, a sequence that facilitates the expression of a gene. The problem is that many parts have not been characterized well. They haven't always been tested to show what they do, and even when they have, their performance can change with different cell types or under different laboratory conditions. The Registry of Standard Biological Parts, which is housed at the Massachusetts Institute of Technology in Cambridge, for example, has more than 5,000 parts available to order, but does not guarantee their quality, says director Randy Rettberg. Most have been sent in by undergraduates participating in the International Genetically Engineered Machine (iGEM) competition, an annual event that started in 2004. In it, students use parts from a 'kit' or develop new ones to design a synthetic biological system. But many competitors do not have the time to characterize the parts thoroughly. THE HYPE: The 'parts' work like Lego. Images such as these, which ran in The New Yorker (left) and Wired, portray synthetic biology as simple design and construction. The truth is that many of the parts are not well characterized, or work unpredictably in different configurations and conditions (Images: J. Swart; M. Knowles). While trying to optimize lactose fermentation in microbes, an iGEM team from the University of Pavia in Italy tested several promoters from the registry by placing them in Escherichia coli, a standard laboratory bacterium. Most of the promoters tested by the team worked, but some had little documentation, and one showed no activity. 
    About 1,500 registry parts have been confirmed as working by someone other than the person who deposited them, and 50 have reportedly failed, says Rettberg. 'Issues' have been reported for roughly another 200 parts, and it is unclear how many of the remaining parts have been tested. The registry has been stepping up efforts to improve quality by curating the collection, encouraging contributors to include documentation on part function and performance, and sequencing the DNA of part samples to make sure they match their descriptions, says Rettberg.

    Meanwhile, synthetic biologists Adam Arkin and Jay Keasling at the University of California, Berkeley, and Drew Endy at Stanford University in Stanford, California, are launching a new effort, tentatively called BIOFAB, to professionally develop and characterize new and existing parts. Late last year the team was awarded US$1.4 million by the National Science Foundation and is hiring staff, says Arkin. Endy, moreover, has proposed methods to reduce some of the variability in measurements from different labs. By measuring promoter activity relative to a reference promoter, rather than looking at absolute activity, Endy's team found that it could eliminate half the variation arising from experimental conditions and instruments2.

    Measurements are tricky to standardize, however. In mammalian cells, for example, genes introduced into a cell integrate unpredictably into the cell's genome, and neighbouring regions often affect expression, says Martin Fussenegger, a synthetic biologist at the Swiss Federal Institute of Technology (ETH) Zurich. "This is the type of complexity that is very difficult to capture by standardized characterization," he says.

    The circuitry is unpredictable

    Even if the function of each part is known, the parts may not work as expected when put together, says Keasling.
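The relative-measurement idea described above can be sketched in a few lines: if every lab reports promoter activity as a ratio to a shared reference promoter, a constant instrument or condition-dependent scale factor cancels out. The promoter names and numbers below are invented for illustration, not taken from the study.

```python
# Hypothetical absolute measurements of the same two promoters in two
# labs whose instruments differ by a constant gain factor of 2.
lab_a = {"pMockA": 120.0, "pMockB": 60.0}   # arbitrary units per minute
lab_b = {"pMockA": 240.0, "pMockB": 120.0}  # same promoters, 2x gain

def to_relative(rates, reference):
    """Express each promoter's activity relative to a reference promoter,
    cancelling instrument- and condition-dependent scale factors."""
    return {name: rate / reference for name, rate in rates.items()}

# Each lab normalizes by its own reading of the shared reference promoter:
rel_a = to_relative(lab_a, reference=40.0)   # lab A's reference reading
rel_b = to_relative(lab_b, reference=80.0)   # lab B's reference reading
assert rel_a == rel_b  # relative activities agree across the two labs
```

This only removes multiplicative variation; effects that differ per promoter (as in the mammalian-cell integration example) survive the normalization, which is Fussenegger's point.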
    Synthetic biologists are often caught in a laborious process of trial and error, unlike the more predictable design procedures found in other modern engineering disciplines. "We are still like the Wright Brothers, putting pieces of wood and paper together," says Luis Serrano, a systems biologist at the Centre for Genomic Regulation in Barcelona, Spain. "You fly one thing and it crashes. You try another thing and maybe it flies a bit better."

    [THE HYPE: Cells can simply be rewired. The magazines Scientific American (top) and IEEE Spectrum portrayed synthetic biology as being similar to microchip design or electrical wiring. Although computational modelling may help scientists to predict cell behaviour, the cell is a complex, variable, evolving operating system, very different from electronics. Images: Slim Films; H. Campbell]

    Bioengineer Jim Collins and his colleagues at Boston University in Massachusetts crashed a lot when implementing a system called a toggle switch in yeast. His lab had built one roughly ten years ago in E. coli3: the team wanted to make cells express one gene — call it gene A — and then prompt them with a chemical signal to turn off A and express another gene, B. But the cells refused to express B continuously; they always shifted back to expressing A. The problem, says Collins, was that the promoters controlling the two genes were not balanced, so A overpowered B. It took about three years of tweaking to make the system work, he says.

    Computer modelling could help reduce this guesswork. In a 2009 study4, Collins and his colleagues created several slightly different versions of two promoters. They used one version of each to create a genetic timer, a system that causes cells to switch from expressing one gene to another after a certain lag time. They then tested the timer, fed the results back into a computational model and predicted how timers built from other versions would behave.
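Why promoter balance matters can be seen in a minimal mutual-repression model. This is a deterministic caricature with invented parameters, not the published design: with matched promoter strengths the model holds whichever state it starts in, but when promoter A is much stronger, the 'B on' state decays back to A, which is the failure mode Collins describes.

```python
def settle(u, v, a_u, a_v, n=2.0, dt=0.01, steps=20000):
    """Integrate a two-gene mutual-repression toggle to steady state (Euler).

    u, v: levels of proteins A and B; a_u, a_v: promoter strengths.
    Each protein represses the other's promoter (Hill coefficient n).
    """
    for _ in range(steps):
        du = a_u / (1.0 + v ** n) - u
        dv = a_v / (1.0 + u ** n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

# Balanced promoters: the circuit is bistable and remembers its state.
a_high, b_low = settle(3.0, 0.1, 10.0, 10.0)   # start "A on" -> stays A on
a_low, b_high = settle(0.1, 3.0, 10.0, 10.0)   # start "B on" -> stays B on

# Unbalanced promoters (A ten times stronger): even a "B on" start
# relaxes back to expressing A.
u_end, v_end = settle(0.1, 3.0, 10.0, 1.0)
```

With the balanced parameters the symmetric fixed point is unstable and two stable states coexist; weakening one promoter removes the B-dominant state entirely, so no amount of induction keeps B on.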
    Using such modelling techniques, researchers could optimize computationally rather than test every version of a network, says Collins. But designs might not have to work perfectly: imperfect ones can be refined using a process called directed evolution, says Frances Arnold, a chemical engineer at the California Institute of Technology in Pasadena. Directed evolution involves mutating DNA sequences, screening their performance, selecting the best candidates and repeating the process until the system is optimized. Arnold's lab, for instance, is using the technique to evolve enzymes involved in biofuel production.

    The complexity is unwieldy

    As circuits get larger, the process of constructing and testing them becomes more daunting. A system developed by Keasling's team5, which uses about a dozen genes to produce a precursor of the antimalarial compound artemisinin in microbes, is perhaps the field's most cited success story. Keasling estimates that it has taken roughly 150 person-years of work, including uncovering genes involved in the pathway and developing or refining parts to control their expression. For example, the researchers had to test many part variants before they found a configuration that sufficiently increased production of an enzyme needed to consume a toxic intermediate molecule. "People don't even think about tackling those projects because it takes too much time and money," says Reshma Shetty, co-founder of the start-up firm Ginkgo BioWorks in Boston, Massachusetts.

    To relieve similar bottlenecks, Ginkgo is developing an automated process to combine genetic parts. The parts have pre-defined flanking sequences, dictated by a set of rules called the BioBrick standard, and can be assembled by robots. At Berkeley, synthetic biologist J. Christopher Anderson and his colleagues are developing a system that lets bacteria do the work. Engineered E. coli cells, called 'assembler' cells, are being equipped with enzymes that can cut and stitch together DNA parts. Other E. coli cells, engineered to act as 'selection' cells, will sort out the completed products from the leftover parts. The team plans to use virus-like particles called phagemids to ferry the DNA from the assembler to the selection cells. Anderson says that the system could shorten the time needed for one BioBrick assembly stage from two days to three hours.

    Many parts are incompatible

    Once constructed and placed into cells, synthetic genetic circuits can have unintended effects on their host. Chris Voigt, a synthetic biologist at the University of California, San Francisco, ran into this problem while he was a postdoc at Berkeley in 2003. Voigt had assembled genetic parts, mainly from the bacterium Bacillus subtilis, into a switch system that was supposed to turn on expression of certain genes in response to a chemical stimulus. He wanted to study the system independently of B. subtilis' other genetic networks, so he put the circuit into E. coli — but it didn't work. "You looked under the microscope and the cells were sick," says Voigt. "One day it would do one thing, and another day it would do another thing." He eventually saw in the literature that one of the circuit's parts dramatically disrupted E. coli's natural gene expression. "There was nothing wrong with the design of the circuit," he says. "It was just that one part was not compatible."

    Synthetic biologist Lingchong You at Duke University in Durham, North Carolina, and his colleagues found that even a simple circuit, comprising a foreign gene that promoted its own expression, could trigger complex behaviour in host cells6. When activated in E. coli, the circuit slowed down the cells' growth, which in turn slowed dilution of the gene's protein product.
    This led to a phenomenon called bistability: some cells expressed the gene, whereas others did not.

    To lessen unexpected interactions, researchers are developing 'orthogonal' systems that operate independently of the cell's natural machinery. Synthetic biologist Jason Chin of the Medical Research Council Laboratory of Molecular Biology in Cambridge, UK, and his colleagues have created a protein-production system in E. coli that is separate from the cell's built-in system7. To transcribe DNA into RNA, the team uses a polymerase enzyme that recognizes genes only if they have a specific promoter sequence that is not present in the cell's natural genes. Similarly, the system's orthogonal 'O-ribosomes', which translate RNA into protein, can read only 'O-mRNA' containing a specific sequence, and O-mRNA is unreadable by natural ribosomes. A parallel system gives biologists the freedom to tweak components without disrupting the machinery needed for the cell to survive, says Chin. For example, his team has stripped down the DNA sequence encoding part of the O-ribosome to speed up production, which allows the cell to boot up protein manufacture more quickly, he says.

    Another solution is to physically isolate the synthetic network from the rest of the cell. Wendell Lim, a synthetic biologist at the University of California, San Francisco, is experimenting with the creation of membrane-bound compartments that would insulate the genetic circuits. Lim's team is working in yeast, but similar principles could be applied to bacterial cells, he says.

    Variability crashes the system

    Synthetic biologists must also ensure that circuits function reliably. Molecular activities inside cells are prone to random fluctuations, or noise. Variation in growth conditions can also affect behaviour. And over the long term, randomly arising genetic mutations can kill a circuit's function altogether.
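The intrinsic noise described here can be illustrated with a toy stochastic simulation (my sketch, not from any of the cited papers): identical 'cells' running the same constitutive gene with the same rate constants still end up with different protein counts, because individual production and degradation events occur at random times.

```python
import random

def simulate_cell(rng, production=10.0, decay=0.1, t_end=50.0):
    """Gillespie-style simulation of a birth-death process for protein count."""
    n, t = 0, 0.0
    while True:
        total_rate = production + decay * n
        t += rng.expovariate(total_rate)  # waiting time to the next event
        if t > t_end:
            return n
        if rng.random() < production / total_rate:
            n += 1  # a protein molecule is produced
        else:
            n -= 1  # a protein molecule is degraded

rng = random.Random(1)
counts = [simulate_cell(rng) for _ in range(20)]
# The population mean sits near production/decay = 100 molecules,
# but individual cells scatter around it.
```

Averaging the rates into a deterministic equation would predict every cell at exactly 100 molecules; the event-by-event simulation is what recovers the cell-to-cell spread that Elowitz observed.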
    Michael Elowitz, a synthetic biologist at the California Institute of Technology in Pasadena, observed the cell's capacity for randomness about ten years ago when his team built a genetic oscillator8. The system contained three genes whose interactions caused the production of a fluorescent protein to go up and down, making cells blink on and off. However, not all cells responded the same way. Some were brighter, and some were dimmer; some blinked faster, others slower; and some cells skipped a cycle altogether.

    [THE HYPE: Cells can simply be rewired. Images: R. Page/ETC Group; issue 1 of The Adventures in Synthetic Biology. Story: Drew Endy & Isadora Deese; art: Chuck Wadey]

    Elowitz says that the differences might have arisen for multiple reasons. A cell can express genes in bursts rather than in a steady stream. Cells may also contain varying amounts of mRNA and protein-production machinery, such as polymerase enzymes and ribosomes. Furthermore, the number of copies of the genetic circuit in a cell can fluctuate over time.

    Jeff Hasty, a synthetic biologist at the University of California, San Diego, and his colleagues described an oscillator with more consistent behaviour9 in 2008. Using a different circuit design and microfluidic devices that allowed fine control of growth conditions, the team made nearly every monitored cell blink at the same rate — though not in sync. And in this issue of Nature (see page 326)10, Hasty's team reports the ability to synchronize the blinking by relying on cell–cell communication.

    But Hasty says that rather than trying to eliminate noise, researchers could use it to their advantage. He notes that in physics, noise can sometimes make a signal easier to detect. "I don't think you can beat it, so I think you ought to try to use it," he says. For example, noise could allow some cells to respond to the environment differently from others, enabling the population to hedge its bets, says Elowitz.

    Meanwhile, geneticist George Church at Harvard Medical School in Boston, Massachusetts, is exploring ways to make a bacterial strain more stable. Church says that this might be achieved by introducing more accurate DNA-replication machinery, changing genome sites to make them less prone to mutation and putting extra copies of the genome into cells. Although stability may not be a serious issue for simple systems, it will become important as more components are assembled, he says.

    Time to deliver?

    Despite the challenges, synthetic biologists have made progress. Researchers have recently developed devices that allow E. coli to count events such as the number of times the cells have divided, and to detect light and dark edges. And some systems have advanced from bacteria to more complex cells. The field is also gaining legitimacy, with a new synthetic-biology centre at Imperial College London and a programme at Harvard University's recently launched Wyss Institute for Biologically Inspired Engineering in Boston. The time has come for synthetic biologists to develop more real-world applications, says Fussenegger. "The field has had its hype phase," he says. "Now it needs to deliver."

    Keasling's artemisinin precursor system is approaching commercial reality, with Paris-based pharmaceutical company Sanofi-Aventis aiming to have the product available at an industrial scale by 2012. And several companies are pursuing biofuel production via engineered microbes. But most applications will take time. As the cost of DNA synthesis continues to drop and more people begin to tinker with biological parts, the field could progress faster, says Carlson.
"It's a question of whether the complexity of biology yields to that kind of an effort." Roberta Kwok is a freelance writer in the San FranciscoBay Area. * References * Purnick, P. E. M. & Weiss, R.Nature Rev. Mol. Cell Biol.10, 410-422 (2009). * Kelly, J. R.et al. J. Biol. Engineer.3, 4 (2009). * Gardner, T. S. , Cantor, C. R. & Collins, J. J.Nature403, 339-342 (2000). * Ellis, T. , Wang, X. & Collins, J. J.Nature Biotechnol.27, 465-471 (2009). * Ro, D.-K.et al. Nature440, 940-943 (2006). * Tan, C. , Marguet, P. & You, L.Nature Chem. Biol.5, 842-848 (2009). * An, W. & Chin, J. W.Proc. Natl Acad. Sci. USA106, 8477-8482 (2009). * Elowitz, M. B. & Leibler, S.Nature403, 335-338 (2000). * Stricker, J.et al. Nature456, 516-519 (2008). * Danino, T. , Mondragón-Palomino, O. , Tsimring, L. & Hasty, J.Nature463, 326-330 (2010). There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements.
  • World view: Wild goose chase
    - Nature 463(7279):291 (2010)
    Is your work any good? Academic researchers will be asked this question with increasing frequency in this new age of austerity. Next month, the Higher Education Funding Council for England (HEFCE) releases its final plans for the Research Excellence Framework (REF).
  • Advocacy for carbon capture and storage could arouse distrust
    - Nature 463(7279):293 (2010)
    In addition to industry voices such as Gert Jan Kramer and Martin Haigh (Nature 462, 568–569; 2009), many academic experts are promoting CO2 capture and storage (CCS). But advocacy by academics could be ill-advised.
  • Activists should be consulted in animal testing decisions
    - Nature 463(7279):293 (2010)
    You argue, in an Editorial discussing a university's decision to cancel a primate-research project (Nature 462, 699; 2009), that such decisions should be guided by consultation between administrators, researchers and members of university communities.
  • Conservation work is incomplete without cryptic biodiversity
    - Nature 463(7279):293 (2010)
    You focus attention on biological diversity, nature conservation and the effects of climate warming in your special issue on biodiversity (19 November 2009). 'Cryptic' biodiversity is also crucial, because it helps natural ecosystems to continue functioning and habitats to bounce back in response to environmental change.
  • Geothermal energy stuck between a rock and a hot place
    - Nature 463(7279):293 (2010)
    In his Opinion article, Domenico Giardini (Nature 462, 848–849; 2009) calls for a better understanding of earthquake risk in pursuing deep geothermal energy using an enhanced geothermal system (EGS). However, earthquakes are only part of the problem in trying to tap Earth's internal heat as an alternative clean-energy source.
  • Correction
    - Nature 463(7279):293 (2010)
    While J. P.
  • A route to more tractable expert advice
    - Nature 463(7279):294 (2010)
    There are mathematically advanced ways to weigh and pool scientific advice. They should be used more to quantify uncertainty and improve decision-making, says Willy Aspinall.
  • Fixing the communications failure
    - Nature 463(7279):296 (2010)
    People's grasp of scientific debates can improve if communicators build on the fact that cultural values influence what and whom we believe, says Dan Kahan.
  • Vision of a personal genomics future
    - Nature 463(7279):298 (2010)
    The director of the US National Institutes of Health, Francis Collins, calls for a revolution in personalized medicine. Such advances should be shared beyond the developed world, says Abdallah S. Daar.
  • The bootleggers' legacy
    - Nature 463(7279):299 (2010)
    As Deborah Blum describes vividly in The Poisoner's Handbook, the catalyst for the birth of forensic medicine in the United States was Prohibition. Most people think of the nationwide banning of the consumption of alcohol in the 1920s as merely a failed attempt at social engineering.
  • Turin's criminology museum
    - Nature 463(7279):300 (2010)
    These days Cesare Lombroso is often considered a figure of fun, a flamboyant crazy who insisted that a criminal — or, indeed, a genius — could be recognized by the cut of his jaw or the slope of his forehead. The self-styled anthropological criminologist might not pass muster as a rigorous scientist today, but his ideas were influential.
  • Culture dish: Doom-laden Sundance
    - Nature 463(7279):300 (2010)
    The Sundance Film Festival in Park City, Utah, sets the agenda for independent cinema. At this year's festival, which runs until 31 January, science-related films are most concerned with disaster scenarios, both real and imagined.
  • Synthetic biology: Synchronized bacterial clocks
    - Nature 463(7279):301 (2010)
    By synchronizing clocks, humans make more efficient use of their time and orchestrate their activities in different places. Bacteria have now been engineered that similarly coordinate their molecular timepieces.
  • Materials science: Membrane magic
    - Nature 463(7279):302 (2010)
    The use of magnetic fields to assemble particles into membranes provides a powerful tool for exploring the physics of self-assembly and a practical method for synthesizing functional materials.
  • Genetics: Decoding a national treasure
    - Nature 463(7279):303 (2010)
    The giant-panda genome is the first reported de novo assembly of a large mammalian genome achieved using next-generation sequencing methods. The feat reflects a trend towards ever-decreasing genome-sequencing costs.
  • Asteroids: Stripped on passing by Earth
    - Nature 463(7279):305 (2010)
    Asteroids are weakly bound piles of rubble, and if one comes close to Earth, tides can cause the object to undergo landslides and structural rearrangement. The outcome of this encounter is a body with meteorite-like colours.
  • Evolutionary biology: New take on the Red Queen
    - Nature 463(7279):306 (2010)
    Biologists have assumed that natural selection shapes larger patterns of evolution through interactions such as competition and predation. These patterns may instead be determined by rare, stochastic speciation.
  • Atmospheric chemistry: More ozone over North America
    - Nature 463(7279):307 (2010)
    Springtime ozone levels in the lower atmosphere over western North America are rising. The source of this pollution may be Asia, a finding that reaffirms the need for international air-quality control.
  • Cell biology: How cilia beat
    - Nature 463(7279):308 (2010)
    Physics provides new approaches to difficult biological problems: a plausible mathematical model of how cilia and flagella beat has been formulated, but it needs to be subjected to rigorous experimental tests.
  • Correction
    - Nature 463(7279):309 (2010)
    In the obituary of Vitaly Ginzburg by Malcolm Longair (Nature 462, 996; 2009), editorial intervention introduced the statement that Gorky University was "in what is now Yekaterinburg". That should have read "in what is now Nizhny Novgorod".
  • Essentiality of FASII pathway for Staphylococcus aureus
    - Nature 463(7279):E3 (2010)
    Arising from: S. Brinster et al. Nature 458, 83–86 (2009); Brinster et al. reply. Recently, Brinster et al.1 suggested that type II fatty-acid biosynthesis (FASII) is not a suitable antibacterial target for Gram-positive pathogens because they use fatty acids directly from host serum rather than de novo synthesis. Their findings, if confirmed, are relevant for further scientific and financial investments in the development of new drugs targeting FASII. We present here in vitro and in vivo data demonstrating that their observations do not hold for Staphylococcus aureus, a major Gram-positive pathogen causing several human infections. The observed differences among Gram-positive pathogens in FASII reflect heterogeneity either in fatty-acid synthesis or in the capacity for fatty-acid uptake from the environment.
  • Brinster et al. reply
    - Nature 463(7279):E4 (2010)
    Replying to: W. Balemans et al. Nature 462, doi:10.1038/nature08668 (2009). Our studies led us to conclude that growth of major Gram-positive pathogens, including Staphylococcus aureus, is not inhibited by FASII-targeted antibiotics in septicaemic infection, owing to compensation by serum fatty acids1. The comments of Balemans et al.2 challenge the generality of our results, mainly on the basis of their own work, which is aimed at developing FabI inhibitors for treatment of S. aureus infections. Their allusion to the documented use of FASII inhibitors to treat mycobacterial infections is misleading. Mycobacteria were not considered in our study, because (1) their main route of pathogenesis is not sepsis, and (2) they require mycolic acids for normal growth, which are lacking in serum. The results we present here further reinforce the conclusions of our article.
  • The sequence and de novo assembly of the giant panda genome
    Li R Fan W Tian G Zhu H He L Cai J Huang Q Cai Q Li B Bai Y Zhang Z Zhang Y Wang W Li J Wei F Li H Jian M Li J Zhang Z Nielsen R Li D Gu W Yang Z Xuan Z Ryder OA Leung FC Zhou Y Cao J Sun X Fu Y Fang X Guo X Wang B Hou R Shen F Mu B Ni P Lin R Qian W Wang G Yu C Nie W Wang J Wu Z Liang H Min J Wu Q Cheng S Ruan J Wang M Shi Z Wen M Liu B Ren X Zheng H Dong D Cook K Shan G Zhang H Kosiol C Xie X Lu Z Zheng H Li Y Steiner CC Lam TT Lin S Zhang Q Li G Tian J Gong T Liu H Zhang D Fang L Ye C Zhang J Hu W Xu A Ren Y Zhang G Bruford MW Li Q Ma L Guo Y An N Hu Y Zheng Y Shi Y Li Z Liu Q Chen Y Zhao J Qu N Zhao S Tian F Wang X Wang H Xu L Liu X Vinar T Wang Y Lam TW Yiu SM Liu S Zhang H Li D Huang Y Wang X Yang G Jiang Z Wang J Qin N Li L Li J Bolund L Kristiansen K Wong GK Olson M Zhang X Li S Yang H Wang J Wang J - Nature 463(7279):311 (2010)
    Using next-generation sequencing technology alone, we have successfully generated and assembled a draft sequence of the giant panda genome. The assembled contigs (2.25 gigabases (Gb)) cover approximately 94% of the whole genome, and the remaining gaps (0.05 Gb) seem to contain carnivore-specific repeats and tandem repeats. Comparisons with the dog and human showed that the panda genome has a lower divergence rate. The assessment of panda genes potentially underlying some of its unique traits indicated that its bamboo diet might be more dependent on its gut microbiome than its own genetic composition. We also identified more than 2.7 million heterozygous single nucleotide polymorphisms in the diploid genome. Our data and analyses provide a foundation for promoting mammalian genetic research, and demonstrate the feasibility for using next-generation sequencing technologies for accurate, cost-effective and rapid de novo assembly of large eukaryotic genomes.
  • The transcriptional network for mesenchymal transformation of brain tumours
    Carro MS Lim WK Alvarez MJ Bollo RJ Zhao X Snyder EY Sulman EP Anne SL Doetsch F Colman H Lasorella A Aldape K Califano A Iavarone A - Nature 463(7279):318 (2010)
    The inference of transcriptional networks that regulate transitions into physiological or pathological cellular states remains a central challenge in systems biology. A mesenchymal phenotype is the hallmark of tumour aggressiveness in human malignant glioma, but the regulatory programs responsible for implementing the associated molecular signature are largely unknown. Here we show that reverse-engineering and an unbiased interrogation of a glioma-specific regulatory network reveal the transcriptional module that activates expression of mesenchymal genes in malignant glioma. Two transcription factors (C/EBPβ and STAT3) emerge as synergistic initiators and master regulators of mesenchymal transformation. Ectopic co-expression of C/EBPβ and STAT3 reprograms neural stem cells along the aberrant mesenchymal lineage, whereas elimination of the two factors in glioma cells leads to collapse of the mesenchymal signature and reduces tumour aggressiveness. In human glioma, expression of C/EBPβ and STAT3 correlates with mesenchymal differentiation and predicts poor clinical outcome. These results show that the activation of a small regulatory module is necessary and sufficient to initiate and maintain an aberrant phenotypic state in cancer cells.
  • A synchronized quorum of genetic clocks
    - Nature 463(7279):326 (2010)
    The engineering of genetic circuits with predictive functionality in living cells represents a defining focus of the expanding field of synthetic biology. This focus was elegantly set in motion a decade ago with the design and construction of a genetic toggle switch and an oscillator, with subsequent highlights that have included circuits capable of pattern generation, noise shaping, edge detection and event counting. Here we describe an engineered gene network with global intercellular coupling that is capable of generating synchronized oscillations in a growing population of cells. Using microfluidic devices tailored for cellular populations at differing length scales, we investigate the collective synchronization properties along with spatiotemporal waves occurring at millimetre scales. We use computational modelling to describe quantitatively the observed dependence of the period and amplitude of the bulk oscillations on the flow rate. The synchronized genetic clock sets the stage for the use of microbes in the creation of a macroscopic biosensor with an oscillatory output. Furthermore, it provides a specific model system for the generation of a mechanistic description of emergent coordinated behaviour at the colony level.
  • Earth encounters as the origin of fresh surfaces on near-Earth asteroids
    - Nature 463(7279):331 (2010)
    Telescopic measurements of asteroids' colours rarely match laboratory reflectance spectra of meteorites owing to a 'space weathering'1, 2 process that rapidly3 reddens asteroid surfaces in less than 10^6 years. 'Unweathered' asteroids (those having spectra matching the most commonly falling ordinary chondrite meteorites), however, are seen among small bodies the orbits of which cross inside Mars and the Earth. Various explanations have been proposed for the origin of these fresh surface colours, ranging from collisions4 to planetary encounters5. Less reddened asteroids seem to cross most deeply into the terrestrial planet region, strengthening6 the evidence for the planetary-encounter theory5, but encounter details within 10^6 years remain to be shown. Here we report that asteroids displaying unweathered spectra (so-called 'Q-types'7) have experienced orbital intersections closer than the Earth–Moon distance within the past 5 × 10^5 years. These Q-type asteroids are not currently found among asteroids showing no evidence of recent close planetary encounters. Our results substantiate previous work5: tidal stress8, strong enough to disturb and expose unweathered surface grains, is the most likely dominant short-term asteroid resurfacing process. Although the seismology details are yet to be worked out, the identification of rapid physical processes that can produce both fresh and weathered asteroid surfaces resolves the decades-long9 puzzle of the difference in colour of asteroids and meteorites.
  • Strong crystal size effect on deformation twinning
    - Nature 463(7279):335 (2010)
    Deformation twinning1, 2, 3, 4, 5, 6 in crystals is a highly coherent inelastic shearing process that controls the mechanical behaviour of many materials, but its origin and spatio-temporal features are shrouded in mystery. Using micro-compression and in situ nano-compression experiments, here we find that the stress required for deformation twinning increases drastically with decreasing sample size of a titanium alloy single crystal7, 8, until the sample size is reduced to one micrometre, below which the deformation twinning is entirely replaced by less correlated, ordinary dislocation plasticity. Accompanying the transition in deformation mechanism, the maximum flow stress of the submicrometre-sized pillars was observed to saturate at a value close to titanium's ideal strength9, 10. We develop a 'stimulated slip' model to explain the strong size dependence of deformation twinning. The sample size in transition is relatively large and easily accessible in experiments, making our understanding of size dependence11, 12, 13, 14, 15, 16, 17 relevant for applications.
  • High-water-content mouldable hydrogels by mixing clay and a dendritic molecular binder
    - Nature 463(7279):339 (2010)
    With the world's focus on reducing our dependency on fossil-fuel energy, the scientific community can investigate new plastic materials that are much less dependent on petroleum than are conventional plastics. Given increasing environmental issues, the idea of replacing plastics with water-based gels, so-called hydrogels, seems reasonable. Here we report that water and clay (2–3 per cent by mass), when mixed with a very small proportion (<0.4 per cent by mass) of organic components, quickly form a transparent hydrogel. This material can be moulded into shape-persistent, free-standing objects owing to its exceptionally great mechanical strength, and rapidly and completely self-heals when damaged. Furthermore, it preserves biologically active proteins for catalysis. So far1 no other hydrogels, including conventional ones formed by mixing polymeric cations and anions2, 3 or polysaccharides and borax4, have been reported to possess all these features. Notably, this material is formed only by non-covalent forces resulting from the specific design of a telechelic dendritic macromolecule with multiple adhesive termini for binding to clay.
  • Increasing springtime ozone mixing ratios in the free troposphere over western North America
    - Nature 463(7279):344 (2010)
    In the lowermost layer of the atmosphere—the troposphere—ozone is an important source of the hydroxyl radical, an oxidant that breaks down most pollutants and some greenhouse gases1. High concentrations of tropospheric ozone are toxic, however, and have a detrimental effect on human health and ecosystem productivity1. Moreover, tropospheric ozone itself acts as an effective greenhouse gas2. Much of the present tropospheric ozone burden is a consequence of anthropogenic emissions of ozone precursors3 resulting in widespread increases in ozone concentrations since the late 1800s3, 4, 5, 6, 7. At present, east Asia has the fastest-growing ozone precursor emissions8. Much of the springtime east Asian pollution is exported eastwards towards western North America9. Despite evidence that the exported Asian pollution produces ozone10, no previous study has found a significant increase in free tropospheric ozone concentrations above the western USA since measurements began in the late 1970s5, 11, 12. Here we compile springtime ozone measurements from many different platforms across western North America. We show a strong increase in springtime ozone mixing ratios during 1995–2008 and we have some additional evidence that a similar rate of increase in ozone mixing ratio has occurred since 1984. We find that the rate of increase in ozone mixing ratio is greatest when measurements are more heavily influenced by direct transport from Asia. Our result agrees with previous modelling studies, which indicate that global ozone concentrations should be increasing during the early part of the twenty-first century as a result of increasing precursor emissions, especially at northern mid-latitudes13, with western North America being particularly sensitive to rising Asian emissions14. We suggest that the observed increase in springtime background ozone mixing ratio may hinder the USA's compliance with its ozone air quality standard.
  • Phylogenies reveal new interpretation of speciation and the Red Queen
    Venditti C Meade A Pagel M - Nature 463(7279):349 (2010)
The Red Queen1 describes a view of nature in which species continually evolve but do not become better adapted. It is one of the more distinctive metaphors of evolutionary biology, but no test of its claim that speciation occurs at a constant rate2 has ever been made against competing models that can predict virtually identical outcomes, nor has any mechanism been proposed that could cause the constant-rate phenomenon. Here we use 101 phylogenies of animal, plant and fungal taxa to test the constant-rate claim against four competing models. Phylogenetic branch lengths record the amount of time or evolutionary change between successive events of speciation. The models predict the distribution of these lengths by specifying how factors combine to bring about speciation, or by describing how rates of speciation vary throughout a tree. We find that the hypotheses that speciation follows the accumulation of many small events that act either multiplicatively or additively found support in 8% and none of the trees, respectively. A further 8% of trees hinted that the probability of speciation changes according to the amount of divergence from the ancestral species, and 6% suggested speciation rates vary among taxa. By comparison, 78% of the trees fit the simplest model in which new species emerge from single events, each rare but individually sufficient to cause speciation. This model predicts a constant rate of speciation, and provides a new interpretation of the Red Queen: the metaphor of species losing a race against a deteriorating environment is replaced by a view linking speciation to rare stochastic events that cause reproductive isolation. Attempts to understand species-radiations3 or why some groups have more or fewer species should look to the size of the catalogue of potential causes of speciation shared by a group of closely related organisms rather than to how those causes combine.
  • Mutational robustness can facilitate adaptation
    - Nature 463(7279):353 (2010)
Robustness seems to be the opposite of evolvability. If phenotypes are robust against mutation, we might expect that a population will have difficulty adapting to an environmental change, as several studies have suggested1, 2, 3, 4. However, other studies contend that robust organisms are more adaptable5, 6, 7, 8. A quantitative understanding of the relationship between robustness and evolvability will help resolve these conflicting reports and will clarify outstanding problems in molecular and experimental evolution, evolutionary developmental biology and protein engineering. Here we demonstrate, using a general population genetics model, that mutational robustness can either impede or facilitate adaptation, depending on the population size, the mutation rate and the structure of the fitness landscape. In particular, neutral diversity in a robust population can accelerate adaptation as long as the number of phenotypes accessible to an individual by mutation is smaller than the total number of phenotypes in the fitness landscape. These results provide a quantitative resolution to a significant ambiguity in evolutionary theory.
  • Prejudice and truth about the effect of testosterone on human bargaining behaviour
    Eisenegger C Naef M Snozzi R Heinrichs M Fehr E - Nature 463(7279):356 (2010)
Both biosociological and psychological models, as well as animal research, suggest that testosterone has a key role in social interactions1, 2, 3, 4, 5, 6, 7. Evidence from animal studies in rodents shows that testosterone causes aggressive behaviour towards conspecifics7. Folk wisdom generalizes and adapts these findings to humans, suggesting that testosterone induces antisocial, egoistic, or even aggressive human behaviours. However, many researchers have questioned this folk hypothesis1, 2, 3, 4, 5, 6, arguing that testosterone is primarily involved in status-related behaviours in challenging social interactions, but causal evidence that discriminates between these views is sparse. Here we show that the sublingual administration of a single dose of testosterone in women causes a substantial increase in fair bargaining behaviour, thereby reducing bargaining conflicts and increasing the efficiency of social interactions. However, subjects who believed that they received testosterone—regardless of whether they actually received it or not—behaved much more unfairly than those who believed that they were treated with placebo. Thus, the folk hypothesis seems to generate a strong negative association between subjects' beliefs and the fairness of their offers, even though testosterone administration actually causes a substantial increase in the frequency of fair bargaining offers in our experiment.
  • Systematic sequencing of renal carcinoma reveals inactivation of histone modifying genes
    - Nature 463(7279):360 (2010)
Clear cell renal cell carcinoma (ccRCC) is the most common form of adult kidney cancer, characterized by the presence of inactivating mutations in the VHL gene in most cases1, 2, and by infrequent somatic mutations in known cancer genes. To determine further the genetics of ccRCC, we have sequenced 101 cases through 3,544 protein-coding genes. Here we report the identification of inactivating mutations in two genes encoding enzymes involved in histone modification—SETD2, a histone H3 lysine 36 methyltransferase, and JARID1C (also known as KDM5C), a histone H3 lysine 4 demethylase—as well as mutations in the histone H3 lysine 27 demethylase, UTX (KDM6A), that we recently reported3. The results highlight the role of mutations in components of the chromatin modification machinery in human cancer. Furthermore, NF2 mutations were found in non-VHL mutated ccRCC, and several other probable cancer genes were identified. These results indicate that substantial genetic heterogeneity exists in a cancer type dominated by mutations in a single gene, and that systematic screens will be key to fully determining the somatic genetic architecture of cancer.
  • HnRNP proteins controlled by c-Myc deregulate pyruvate kinase mRNA splicing in cancer
    David CJ Chen M Assanah M Canoll P Manley JL - Nature 463(7279):364 (2010)
When oxygen is abundant, quiescent cells efficiently extract energy from glucose primarily by oxidative phosphorylation, whereas under the same conditions tumour cells consume glucose more avidly, converting it to lactate. This long-observed phenomenon is known as aerobic glycolysis1, and is important for cell growth2, 3. Because aerobic glycolysis is only useful to growing cells, it is tightly regulated in a proliferation-linked manner4. In mammals, this is partly achieved through control of pyruvate kinase isoform expression. The embryonic pyruvate kinase isoform, PKM2, is almost universally re-expressed in cancer2, and promotes aerobic glycolysis, whereas the adult isoform, PKM1, promotes oxidative phosphorylation2. These two isoforms result from mutually exclusive alternative splicing of the PKM pre-mRNA, reflecting inclusion of either exon 9 (PKM1) or exon 10 (PKM2). Here we show that three heterogeneous nuclear ribonucleoprotein (hnRNP) proteins, polypyrimidine tract binding protein (PTB, also known as hnRNPI), hnRNPA1 and hnRNPA2, bind repressively to sequences flanking exon 9, resulting in exon 10 inclusion. We also demonstrate that the oncogenic transcription factor c-Myc upregulates transcription of PTB, hnRNPA1 and hnRNPA2, ensuring a high PKM2/PKM1 ratio. Establishing a relevance to cancer, we show that human gliomas overexpress c-Myc, PTB, hnRNPA1 and hnRNPA2 in a manner that correlates with PKM2 expression. Our results thus define a pathway that regulates an alternative splicing event required for tumour cell proliferation.
  • FOXO-dependent regulation of innate immune homeostasis
    - Nature 463(7279):369 (2010)
The innate immune system represents an ancient host defence mechanism that protects against invading microorganisms. An important class of immune effector molecules that fight pathogen infections is the antimicrobial peptides (AMPs), which are produced in plants and animals1. In Drosophila, the induction of AMPs in response to infection is regulated through the activation of the evolutionarily conserved Toll and immune deficiency (IMD) pathways2. Here we show that AMP activation can be achieved independently of these immunoregulatory pathways by the transcription factor FOXO, a key regulator of stress resistance, metabolism and ageing. In non-infected animals, AMP genes are activated in response to nuclear FOXO activity when induced by starvation, using insulin signalling mutants, or by applying small molecule inhibitors. AMP induction is lost in foxo null mutants but enhanced when FOXO is overexpressed. Expression of AMP genes in response to FOXO activity can also be triggered in animals unable to respond to immune challenges due to defects in both the Toll and IMD pathways. Molecular experiments at the Drosomycin promoter indicate that FOXO directly binds to its regulatory region, thereby inducing its transcription. In vivo studies in Drosophila, but also studies in human lung, gut, kidney and skin cells indicate that a FOXO-dependent regulation of AMPs is evolutionarily conserved. Our results indicate a new mechanism of cross-regulation of metabolism and innate immunity by which AMP genes can be activated under normal physiological conditions in response to the oscillating energy status of cells and tissues. This regulation seems to be independent of the pathogen-responsive innate immunity pathways whose activation is often associated with tissue damage and repair. The sparse production of AMPs in epithelial tissues in response to FOXO may help to modulate the defence reaction without harming the host tissues, in particular when animals are suffering from energy shortage or stress.
  • Transcriptional role of cyclin D1 in development revealed by a genetic–proteomic screen
    - Nature 463(7279):374 (2010)
Cyclin D1 belongs to the core cell cycle machinery, and it is frequently overexpressed in human cancers1, 2. The full repertoire of cyclin D1 functions in normal development and oncogenesis is unclear at present. Here we developed Flag- and haemagglutinin-tagged cyclin D1 knock-in mouse strains that allowed a high-throughput mass spectrometry approach to search for cyclin D1-binding proteins in different mouse organs. In addition to cell cycle partners, we observed several proteins involved in transcription. Genome-wide location analyses (chromatin immunoprecipitation coupled to DNA microarray; ChIP-chip) showed that during mouse development cyclin D1 occupies promoters of abundantly expressed genes. In particular, we found that in developing mouse retinas—an organ that critically requires cyclin D1 function3, 4—cyclin D1 binds the upstream regulatory region of the Notch1 gene, where it serves to recruit CREB binding protein (CBP) histone acetyltransferase. Genetic ablation of cyclin D1 resulted in decreased CBP recruitment, decreased histone acetylation of the Notch1 promoter region, and led to decreased levels of the Notch1 transcript and protein in cyclin D1-null (Ccnd1-/-) retinas. Transduction of an activated allele of Notch1 into Ccnd1-/- retinas increased proliferation of retinal progenitor cells, indicating that upregulation of Notch1 signalling alleviates the phenotype of cyclin D1 deficiency. These studies show that in addition to its well-established cell cycle roles, cyclin D1 has an in vivo transcriptional function in mouse development. Our approach, which we term 'genetic–proteomic', can be used to study the in vivo function of essentially any protein.
  • Mechanism of folding chamber closure in a group II chaperonin
    - Nature 463(7279):379 (2010)
Group II chaperonins are essential mediators of cellular protein folding in eukaryotes and archaea. These oligomeric protein machines, ~1 megadalton, consist of two back-to-back rings encompassing a central cavity that accommodates polypeptide substrates1, 2, 3. Chaperonin-mediated protein folding is critically dependent on the closure of a built-in lid4, 5, which is triggered by ATP hydrolysis6. The structural rearrangements and molecular events leading to lid closure are still unknown. Here we report four single particle cryo-electron microscopy (cryo-EM) structures of Mm-cpn, an archaeal group II chaperonin5, 7, in the nucleotide-free (open) and nucleotide-induced (closed) states. The 4.3 Å resolution of the closed conformation allowed building of the first ever atomic model directly from the single particle cryo-EM density map, in which we were able to visualize the nucleotide and more than 70% of the side chains. The model of the open conformation was obtained by using deformable elastic network modelling with the 8 Å resolution open-state cryo-EM density restraints. Together, the open and closed structures show how local conformational changes triggered by ATP hydrolysis lead to an alteration of intersubunit contacts within and across the rings, ultimately causing a rocking motion that closes the ring. Our analyses show that there is an intricate and unforeseen set of interactions controlling allosteric communication and inter-ring signalling, driving the conformational cycle of group II chaperonins. Beyond this, we anticipate that our methodology of combining single particle cryo-EM and computational modelling will become a powerful tool in the determination of atomic details involved in the dynamic processes of macromolecular machines in solution.
  • Direct inhibition of the NOTCH transcription factor complex
    - Nature 463(7279):384 (2010)
    Nature 462, 182–188 (2009) In the print issue of this Article, text from the last line of the Figure 4 legend, defining the scale bar lengths, is inadvertently missing. This sentence should read "Scale bars, 50 μm".
  • Thickness and Clapeyron slope of the post-perovskite boundary
    - Nature 463(7279):384 (2010)
    Nature 462, 782–785 (2009) In this Letter, the accepted date was incorrectly listed as 19 August 2009. The correct accepted date is 21 October 2009. Also, in the Acknowledgements, APS was defined incorrectly. The correct definition is the Advanced Photon Source.
  • Cyclical DNA methylation of a transcriptionally active promoter
    - Nature 463(7279):384 (2010)
    Nature 452, 45–50 (2008) Errors and inappropriate manipulations were made in the assembly and processing of images in this Article. This affected Figure 2b, d, and some of the Supplementary Figures. These errors have been rectified using original data, and the correct Figure 2 is shown below. The results and conclusions of the paper are not affected by correction of the figures. We apologize for the mistakes made, which included the overuse of Photoshop contrast levels to generate a uniform white background, and errors in figure assembly.
  • FGF signalling during embryo development regulates cilia length in diverse epithelia
    - Nature 463(7279):384 (2010)
Nature 458, 651–654 (2009) In Figure 4 of this Letter, panel b (fgfr1 MO, sox17 in situ) was inadvertently duplicated in panel a (WT uninj., sox17 in situ) during figure revision and assembly. The corrected Figure panels are shown below. This does not alter the data indicating that sox17 expression is not affected in fgfr1 MO, or any of the conclusions of the work presented in the manuscript.
  • Strange machine
    - Nature 463(7279):392 (2010)
    A fair exchange?
