Latest Articles Include:
- Data's shameful neglect
- Nature 461(7261):145 (2009)
Research cannot flourish if data are not preserved and made accessible. All concerned must act accordingly.
- A step too far?
- Nature 461(7261):145-146 (2009)
The Obama administration must fund human space flight adequately, or stop speaking of 'exploration'.
- Overrated ratings
- Nature 461(7261):146 (2009)
Criteria for 'green buildings' need to make energy performance a priority — as do universities.
- Animal communication: Warning wings
- Nature 461(7261):148 (2009)
- Atmospheric chemistry: Ozone's winners and losers
- Nature 461(7261):148 (2009)
- Physics: Magnetic monopoles
- Nature 461(7261):148 (2009)
- Computational biology: A new protein subdivision
- Nature 461(7261):148 (2009)
- Microbial evolution: Cholera gene swap
- Nature 461(7261):148-149 (2009)
- Chemistry: Going for gold
- Nature 461(7261):149 (2009)
- Evolution and development: Genes in the mirror
- Nature 461(7261):149 (2009)
- Neuroscience: Fear net
- Nature 461(7261):149 (2009)
- Genetics: Why Y knots
- Nature 461(7261):149 (2009)
- Journal club
- Nature 461(7261):149 (2009)
- News briefing
- Nature 461(7261):150-151 (2009)
The week in science.

NASA's human space-flight programme does not have enough money to fulfil its vision of building a Moon base or sending astronauts to Mars. The conclusion was due to be delivered to US President Barack Obama this week in a report from an expert panel led by former aerospace executive Norman Augustine. The report outlines a range of alternatives for NASA, including sustaining the International Space Station beyond its scheduled de-orbit in 2016, and cancelling the Ares I rocket that is being developed to carry astronauts to the Moon. For more, see page 153.

Delegates representing 155 nations at the World Climate Conference in Geneva agreed on 3 September to set up a global framework for climate services, providing long-term forecasts to users ranging from national governments to individual farmers. Over the next four months, a task force set up by the World Meteorological Organization will work out the practicalities of the service. But some countries are baulking at the suggestion that they will need to supply the service with data, citing issues such as national security or commercial interests that would prevent disclosure. For more, see page 159.

The United Nations' World Economic and Social Survey 2009 estimated last week that the developing world would need between US$500 billion and $600 billion annually from rich nations — around 1% of their GDP — to shift to cleaner energy and adapt to global warming. Even this amount, well above previous estimates, is dwarfed by a Chinese economic analysis. Environmental economists at Renmin University in Beijing suggest that if emissions in China are to peak by 2030, up to $438 billion will have to be spent each year in that country alone.

Biomedical research collaborations between Europe and China need greater ethical oversight, according to BIONET, a panel that examines projects between the regions. At a meeting in London on 2–4 September, it recommended that a joint advisory body be set up to offer advice and monitor research practices in order to stamp out unregulated stem-cell therapies and prevent participants in clinical trials being exploited. For more, see page 157.

India's greenhouse-gas emissions will triple by 2031 but nevertheless will probably still be below the world per-capita average for 2005, said Jairam Ramesh, the country's environment minister, on 2 September. Citing five independent studies, he said emissions would rise from today's 1.2 billion tonnes to between 4 billion and 7.3 billion tonnes of carbon dioxide equivalents — or between 2.77 and 5 tonnes per capita.

With flu season looming in the Northern Hemisphere, a small clinical trial by Novartis indicated that just one dose of its pandemic H1N1 vaccine was sufficient to provoke an adequate immune response — if accompanied by an adjuvant, or booster chemical. If confirmed, this single-dose requirement would effectively double the amount of vaccine available, as two doses have been assumed necessary. Chinese company Sinovac said last month that a single shot of its vaccine also worked; it received a production licence from China's government last week.

The sequencing firm Complete Genomics of Mountain View, California, will not meet its goal this year of sequencing 1,000 human genomes for US$5,000 each. On 9 September, the company said it had sequenced just 14 genomes, for customers such as US drug giant Pfizer and the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama.
Complete Genomics announced on 24 August that it was delaying its commercial launch by six months to January 2010, owing to fundraising difficulties.

India's patent office has rejected claims from US companies Gilead and Tibotec for patents on their respective HIV drugs, tenofovir and darunavir. The decision opens the way for India to supply cheaper generic versions of the medicines, both to its own population and to other countries where the drugs are not patented. It is the latest in a string of legal victories for Cipla, India's largest generic drug maker, which refused to sign up to a condition-bound licence on tenofovir that Gilead offered to generics manufacturers in 2006.

Concern rose last week that China might further restrict exports of rare earth elements, based on leaked details of a draft plan from the nation's Ministry of Industry and Information Technology. Wang Caifeng, deputy director-general of the ministry's Department of Raw Material Industry, told a mining conference in Beijing on 3 September that the policy was still under review, but insisted there would be no outright ban on exports of elements such as dysprosium and terbium. China produces more than 90% of the world's rare earth elements, which are used as catalysts and in high-tech magnets, hybrid car batteries, wind turbines and mobile phones.

Leading mass-spectrometer manufacturer AB SCIEX has been bought for US$1.1 billion by the scientific and medical technology company Danaher of Washington DC. AB SCIEX was jointly owned by life-sciences companies Life Technologies of Carlsbad, California, and MDS of Mississauga, Ontario.

The US Department of Energy has released nearly $500 million in direct cash subsidies for wind-energy developers, marking the first payment under a new stimulus programme intended to revive renewable-energy markets. The money will go to companies developing ten wind projects in six states; another $3 million went to a pair of solar projects. The cash payments are in lieu of tax incentives that have been in place for more than a decade. Energy developers previously secured up-front financing from banks, which then took advantage of the tax credits over time. That financing dried up following the global economic downturn, contributing to a sharp decline in new wind-energy projects. The US wind industry, previously hoping to install as much as 10 gigawatts of capacity in 2009, is now expecting around 6.5 gigawatts. But the London-based consultancy New Energy Finance is forecasting growth next year, with installation of between 8 and 10 gigawatts of capacity. The energy department expects the new programme to provide around $3 billion in funding, enabling projects valued at between $10 billion and $14 billion.

The International Centre for Radio Astronomy Research opened last week in Perth, Australia, at a cost of Aus$100 million (US$85 million). The centre, largely funded by Curtin University of Technology and the University of Western Australia, both in Perth, is expected to help Australia's bid to host the Aus$2.5-billion Square Kilometre Array radio telescope. A decision on whether Australia or its rival South Africa will host the array is expected in 2012.

A flagship US$17-million geothermal-energy project has been halted after encountering problems at its northern California drilling site. AltaRock Energy of Sausalito, California, is backed by a $6.25-million grant from the US Department of Energy and venture funding from Google.
It aims to harness geothermal energy by cracking bedrock at the bottom of a deep well and pumping water through the cracks to generate steam. The company did not immediately provide details about the drilling problems, but is due to file a report with federal agencies.

Californian firefighters saved the historic Mount Wilson observatory from a massive arson fire that created huge palls of smoke and blackened more than 600 square kilometres in the mountains above Los Angeles. Astronomer Edwin Hubble used the 100-inch (2.5-metre) Hooker telescope on Mount Wilson in the 1920s to confirm that the Milky Way is one galaxy among many and that the Universe is expanding.

This year's Balzan Prize winners, announced on 7 September, include two top scientists. Michael Grätzel, of the Federal Polytechnic School of Lausanne (EPFL), Switzerland, was rewarded for his development of the dye-sensitized solar cell. Brenda Milner, of McGill University in Montreal, Canada, received the prize for her research on the role of the hippocampus in the formation of memories. Awarded by the International Balzan Prize Foundation, the prize aims to bolster "initiatives in the cause of humanity, peace and brotherhood". The foundation, which is based in Milan, Italy, will give each of the winners 1 million Swiss francs (US$944,000), half of which must be devoted to future research.

The week ahead:
The European Planetary Science Congress holds its fourth annual meeting in Potsdam, Germany. → http://meetings.copernicus.org/epsc2009
The International Atomic Energy Agency holds its annual general conference in Vienna. → http://www.iaea.org/About/Policy/GC
Individual genomes' role in research and clinical medicine is the theme of Personal Genomes, the second meeting on the topic hosted by Cold Spring Harbor Laboratory, New York. → http://meetings.cshl.edu/meetings/person09.shtml

Sound bite: Michael Zammit Cutajar of the United Nations Framework Convention on Climate Change, chairman of a working group deliberating the text of a draft climate treaty for debate at December's Copenhagen summit, describes the group's slow progress. (Reuters)

Number crunch: US$2.3 billion, the record-setting sum paid by Pfizer to settle allegations that it had illegally marketed drugs and paid kickbacks to physicians.
- Cash crisis could ground NASA rocket
- Nature 461(7261):153 (2009)
Crewed missions to the Moon are under threat, warns an expert panel. Current projects such as NASA's Ares I rocket could be cancelled in favour of commercial space flights.

A committee of aerospace engineers and scientists was poised to deliver its grim assessment of NASA's human space-flight programme to US President Barack Obama on 8 September. The panel's report will outline the stark choices Obama will face, which could include cancelling a new system of Moon-bound rockets and all but giving up on exploring space beyond the low Earth orbit of the International Space Station (ISS).

"The bottom line is, they concluded that there's not enough money in the current budget to do anything useful in human space flight," says Marcia Smith, president of the Space and Technology Policy Group, a consultancy based in Arlington, Virginia, and former director of the Space Studies Board at the US National Research Council.

In May, Obama ordered the committee to review the current space policy set by former president George W. Bush, with its "vision" of building a Moon base as a prelude to sending people to Mars. The committee was tasked with assessing new scenarios — including using the ISS past its scheduled de-orbit in 2016 — while keeping to strict budget guidelines. Led by former aerospace executive Norman Augustine, the ten-member committee has not yet released its report, but public discussions this summer have made some of the options clear.

Given the budget constraints, the choices weren't pretty. In Obama's 2010 budget request, NASA's exploration programme, known as Constellation, would receive about US$6 billion per year — about $1 billion less than Bush asked for in his 2009 budget, and several billion less than what was slated in previous budgets (see chart). "The Bush budget stressed the system, but the Obama budget, if left as is, breaks it," says Scott Pace, director of the Space Policy Institute at George Washington University in Washington DC. One analysis by the committee showed that if the current plan and budget are kept, astronauts won't even leave low Earth orbit until 2028.

So the panel looked at alternatives, narrowing down some 3,000 permutations to just a handful for presidential digestion. In several scenarios, the Ares I rocket — one of two needed to take cargo and astronauts to the Moon — would be cancelled. Instead, money would be poured into commercial space companies, such as Space Exploration Technologies of Hawthorne, California, and Orbital Sciences in Dulles, Virginia, which are already trying to build rockets to take cargo to the ISS. But the committee also seems inclined to support commercial rockets that could ferry people into space, says Smith.

Former NASA administrator Michael Griffin says there are risks not just in making a crewed commercial rocket a reality, but also in ceding the capability for space travel — traditionally held by the US government — to the private sector. "I am not a fan of attempts to rely on such a capability before it actually exists," says Griffin, now a professor of aerospace engineering at the University of Alabama in Huntsville. He says he would also be disappointed if Ares I were cancelled, not so much for the $6 billion that has already been spent on the rocket and its Orion crew capsule, but because he still believes that Ares I is the cheapest way to get past low Earth orbit when paired with its heavy-lift launch companion Ares V.
The system, he says, "has the sole failure of costing more than President Obama was willing to provide in the budget".

The committee found that extensive human exploration of the Moon and a direct trip to Mars are not feasible. With a little budgetary leeway, and with the Ares I money put into developing an alternative heavy-lift rocket, the committee determined that there could eventually be a 'deep space' option. Such possibilities could include visits to asteroids, flybys of the Moon and planets, and trips to Lagrangian points — the gravity wells in the Earth–Sun system where some telescopes are situated.

The committee found many ways to extend the operations of the ISS to 2020 in order to satisfy international agreements. What is not obvious is whether, after spending $2.5 billion a year to service the ISS in coming years, there would be money for much else. "I dislike pretending that we have goals that are far-reaching and frontier-oriented when we're not willing to set aside money to achieve them," says Griffin.

Obama's 2010 budget guidance did include the caveat that additional money could be requested for the programme pending the Augustine committee's report. Congress, which is working to set those spending figures this autumn, has scheduled hearings on the report for mid-September. So although the committee's job will soon be over, some tough decisions — whether to argue for more money, or to accept a more limited programme — are still in store. "The more difficult job is going to be on the president's desk," says Smith.
- How green is your campus?
- Nature 461(7261):154-155 (2009)
On a typically muggy day in late August, some 1,300 incoming freshmen and their parents gathered for orientation weekend at Emory University, near downtown Atlanta, Georgia. Here, in the heart of the conservative Deep South, the students received their first lesson of the school year.
- Export-control laws worry academics
- Nature 461(7261):156 (2009)
Academics in the United States are hoping that pending legislation and a presidentially mandated review could provide long-sought relief from export laws they believe hamper international scientific cooperation and research. The defence and aerospace industries have long struggled with the seemingly Byzantine nature of export-control regulations, as has NASA, which has sought exemptions to cover its work on the International Space Station.
- Ethics scrutiny needed for Chinese–European projects
- Nature 461(7261):157 (2009)
Biomedical research collaborations between Europe and China need greater ethical oversight to combat unregulated stem-cell therapies and prevent the exploitation of clinical-trial participants. That's the message from a group of bioethics experts who are part of the Chinese–European BIONET project, a partnership set up to examine scientific collaborations between the regions.
- Toxicity testing gets a makeover
- Nature 461(7261):158 (2009)
Europe aims to make chemical-exposure studies more predictive while using fewer animals. By 2013, the European cosmetics industry will phase out animal testing.

The European Commission has revealed details of a major new research programme to develop a modern, high-throughput approach to repeat-dose toxicity testing. Pressure to launch such an effort arose because the commission had drafted conflicting pieces of legislation, which demanded more extensive safety testing of chemicals while also requiring less use of animals in those tests. The programme, says the commission, will help to reconcile these goals. "Faster, cheaper and more reliable alternative methods will contribute to increased safety" while reducing the use of animals, says a commission communiqué issued in Rome last week at the World Congress on Alternatives and Animal Use in the Life Sciences, where the €25-million (US$36-million) programme was presented.

Two items of European legislation present particular dilemmas to industry. One is the 2006 Registration, Evaluation, Authorisation and Restriction of Chemical Substances (REACH) directive, which requires retrospective testing of chemicals that are being marketed, to a point that many think overburdens existing testing capacities (see Nature 460, 1065; 2009). The other is the 2003 amendment to the 1976 cosmetics directive, which phases out all testing of cosmetic ingredients on animals by 2013. The legislation also applies to imported products marketed in Europe.

Now, in the first agreement of its kind, industry will match the commission's funds through Colipa, the consortium of Europe's cosmetics, toiletry and perfumery industries based in Brussels. The total €50-million pot represents the largest-ever injection of money into the development of alternative toxicity testing. The cosmetics industry is not particularly happy about coughing up the money when the chemicals industry is not doing the same. "Of course it is not fair," says one top representative of a cosmetics company, speaking on condition of anonymity. "But the legislation itself is not fair — the science is not there."

No one expects the new programme to be more than a modest start to the massive effort needed to rapidly and reliably test, with minimal animal use, for all possible adverse consequences of prolonged exposure to chemicals. "It will take 10 or 20 years before this is going to be translated," says meeting co-organizer Thomas Hartung, director of the Johns Hopkins University Center for Alternatives to Animal Testing in Baltimore, Maryland. For instance, determining whether long-term exposure to a chemical causes cancer or neurological disease without using animals is much harder than the nearly completed work of replacing animals in single-exposure toxicity work. "You can't just go with a single endpoint — you have to know how the whole system works," says toxicologist Horst Spielmann of the Federal Institute for Risk Assessment in Berlin.

Advanced technology
The commission's call for projects intends to incorporate expertise in five areas not widely used in traditional toxicology. These include developing methods to reliably generate other types of human cells from stem cells, and developing cellular devices that simulate organs such as the heart, lungs or kidney. Other areas include systems biology and computational modelling. Each area will be tackled by a single consortium of researchers.
"We want to concentrate the money on the minimum number of labs who can do the work needed," says Jürgen Buesing, the commission official in charge of the programme. Stem-cell researcher Jürgen Hescheler from the University of Cologne in Germany is one of those intending to apply for funding through the initiative. "The programme puts toxicology on a new basis and brings it into the right species: the human," he says. A US initiative — the Tox21 programme coordinated by the Environmental Protection Agency and the National Institutes of Health — is also taking a high-throughput, systems approach to toxicology. With $22 million for this year alone, it too aims to increase the predictive value of toxicity tests while reducing animal use, and is prioritizing chemicals most in need of testing. "It is critical that Tox21, and data generated in other countries, are used in Europe so that there is no duplication," says Spielmann, who is running a project under Europe's seventh framework programme for research to ensure just that. In the meantime, scientists at the Rome meeting said that steps must be taken now to reduce the unnecessary use of animals. Bennard van Ravenzwaay, head of toxicology at the German chemicals giant BASF in Ludwigshafen, says that tests should be abandoned if they add negligible predictive value to the battery of experiments already required by regulatory agencies. Such checks include the two-generation test for reproductive toxicology, in which the second generation uses many animals without providing useful information; the mouse cancer test, which provides negligible additional information beyond the rat cancer test; and developmental neurotoxicity checks. ADVERTISEMENT Regulatory authorities can also engage in "intelligent toxicity testing strategies" to reduce the number of chemicals that need full testing, says Kees van Leeuwen of TNO, the Netherlands' applied research organization in Zeist. "We can reduce which chemicals may not need a full battery of testing, by optimizing the use of information from similar chemicals," he says. Buesing says that national agencies and industry should be prepared to extend funding of alternative methods in toxicology in the near future. "Otherwise," he says, "our €50 million will have been wasted." There are currently no comments. - World climate services framework agreed
- Nature 461(7261):159 (2009)
A global framework to supply on-demand climate predictions to governments, businesses and individuals is moving closer to reality. On 3 September, delegates representing 155 nations at the World Climate Conference in Geneva, Switzerland, agreed that a body should be established to supply such 'climate services' to users ranging from national governments to individual farmers.
- Correction
- Nature 461(7261):159 (2009)
The News Feature 'Last chance clinic' (Nature 460, 1071–1075; 2009) inadvertently located Massachusetts General Hospital in Cambridge. It is in Boston.
- Data sharing: Empty archives
- Nature 461(7261):160-163 (2009)
Most researchers agree that open access to data is the scientific ideal, so what is stopping it happening? Bryn Nelson investigates why many researchers choose not to share.

In 2003, the University of Rochester in New York launched a digital archive designed to preserve and share dissertations, preprints, working papers, photographs, music scores — just about any kind of digital data the university's investigators could produce. Six months of research and marketing had convinced the university that a publicly accessible online archive would be well received. At the time of the launch, the university librarians were worried that a flood of uploaded data might swamp the available storage space.

Six years later, the US$200,000 repository lies mostly empty. Researchers had been very supportive of the archive idea, recalls Susan Gibbons, vice-provost and dean of the university's River Campus Libraries — especially as the alternative was to keep on scattering their data and dissertations across an ever-proliferating array of unintegrated computers and websites. "So we spent all this money, we spent all this time, we got the software up and running, and then we said, 'OK, here it is. We're ready. Give us your stuff'," she says. "And that's where we hit the wall." When the time came, scientists couldn't find their data, or didn't understand how to use the archive, or lamented that they just didn't have any more hours left in the day to spend on this business. As Gibbons and anthropologist Nancy Fried Foster observed in their 2005 postmortem1, "The phrase 'if you build it, they will come' does not yet apply to IRs [institutional repositories]."

A similar reality check has greeted other data-sharing efforts. Most researchers happily embrace the idea of sharing. It opens up observations to independent scrutiny, fosters new collaborations and encourages further discoveries in old data sets (see pages 168 and 171). But in practice those advantages often fail to outweigh researchers' concerns. What will keep work from being scooped, poached or misused? What rights will the scientists have to relinquish? Where will they get the hours and money to find and format everything?

Some communities have been quite open to sharing, and their repositories are bulging with data. Physicists, mathematicians and computer scientists use http://arXiv.org, operated by Cornell University in Ithaca, New York; the International Council for Science's World Data System holds data for fields such as geophysics and biodiversity; and molecular biologists use the Protein Data Bank, GenBank and dozens of other sites. The astronomy community has the International Virtual Observatory Alliance, geoscientists and environmental researchers have Germany's Publishing Network for Geoscientific & Environmental Data (PANGAEA), and the Dryad repository recently launched in North Carolina for ecology and evolution research.

But those discipline-specific successes are the exception rather than the rule in science. All too many observations lie isolated and forgotten on personal hard drives and CDs, trapped by technical, legal and cultural barriers — a problem that open-data advocates are only just beginning to solve. One of those advocates is Mark Parsons at the National Snow and Ice Data Center at the University of Colorado in Boulder.
Parsons manages a global programme to preserve and organize the data produced by the International Polar Year (IPY) that ran from March 2007 to March 2009 and included an estimated 50,000 collaborators from more than 60 countries. The IPY policy calls for data to be made available fully, freely, openly and on the shortest feasible timescale. "Part of what is driving that is the rapidness of change in the poles," says Parsons. "If we're going to wait five years for data to be released, the Arctic is going to be a completely different place."

Reality bites
But reality is forcing a longer timescale. As soon as they began implementing the data policy, Parsons and his team encountered a staggering diversity of incoming information, as well as wide variations in the culture of data sharing. Fields such as atmospheric science and oceanography, Parsons says, have well-developed traditions of free and open access, and robust databases. But fields such as wildlife ecology and many of the social sciences do not. "What we discovered was that this infrastructure to share the data doesn't really exist, so we need to start creating that," Parsons says. But his programme lacks the resources required to create that infrastructure on a large scale. So the team has resorted to preserving as much data as it can. It has delegated much of that job to national coordinators, or "data wranglers", as Parsons calls them, who contact investigators and "get the data branded and put in the IPY corral".

One of the most successful data-wrangling countries has been Sweden, which formed a subcommittee to correct its early lag in collecting and then received national funding for its own IPY data archive. National coordinator Håkan Olsson, a specialist in remote sensing at the Swedish University of Agricultural Sciences in Umeå, says that the country's archive is helping to house data from smaller, independent projects that would never reach large international databanks. Nevertheless, he says, many Swedish researchers still don't archive their data, or don't put data in formats that make them easily searchable and retrievable. He faults the funding agencies too. "Unlike some other countries," he says, "the research councils in Sweden do not yet have a practice to grant funds with the condition that data from the project is sent to a data centre."

Even when wranglers can identify the data, it is not always obvious where the data should go. For example, says Parsons, "you would think that any snow and ice data would go into the National Snow and Ice Data Centre". But the centre's funding is generally tied to specific data streams, he says, which means it can find itself in the position of accepting glacial data from a programme it has money for, while being forced to turn away similar glacial data from programmes where it does not. Despite the launch earlier this year of the Paris-based Polar Information Commons to make polar data more accessible, Parsons says that, with all the "naive assumptions", the lack of planning and other unanticipated obstacles, properly managing the IPY data will require another decade of work.

In other fields, however, the main barriers to data sharing are concerns about quantity and quality.
The US National Science Foundation's (NSF's) Laser Interferometer Gravitational-Wave Observatory (LIGO), for example, uses giant detectors in Louisiana and Washington to search for gravitational waves that might indicate the presence of rare phenomena such as colliding black holes or merging stars. LIGO is also working with the Virgo consortium, which operates a similar detector near Pisa, Italy. Neither team has detected the signal they are looking for yet — but that's not surprising: gravitational waves are expected to be extraordinarily faint. The key to detecting them is to eliminate every possible source of spurious vibration in the detectors, whether from seismic events, electrical storms, road traffic or even from the surf on distant beaches. It requires what Szabolcs Márka, a physicist at Columbia University in New York and the university's lead scientist for LIGO, calls "a really paranoid monitoring of the environment".

The question of what data should be shared has provoked strong debate within the LIGO and Virgo teams. Should they open up all their terabytes of data to outside scientists, including the torrents of environmental data? Or should they release just the cleaned-up data stream most likely to reveal a gravitational wave? Would naive outsiders fail to process the raw data adequately, leading to premature announcement of gravitational-wave 'discoveries' that would hurt everyone's credibility? Or would the extra eyes bring fresh perspective to the search? "I'm torn," says Márka, who says that the precise terms of data sharing are being negotiated with the project's funders. "We don't just have to analyse the data, we need to make sure the data are right."

How data should be shared is also a substantial problem. A prime example is the issue of data standards: the conventions that spell out exactly how the digital information is formatted, and exactly how the contextual information (metadata) is listed. In some disciplines it is comparatively easy to agree on standards, says Clifford Lynch, executive director of the Coalition for Networked Information based in Washington DC, which represents academia on data and networking issues. "If you look at something like the sequencing of a genome, there's a whole lot of tacit stuff that's already settled," he says. "Sequencing one genome is very similar to sequencing another." But for other groups — say, environmental scientists trying to understand the spread of a pollutant — the choice of common standards is far less obvious. The all-too-frequent result is fragmented and often mutually incomprehensible scientific information. And that, in turn, stifles innovation, says James Boyle, a law professor at Duke University in Durham, North Carolina, and a founding board member of Creative Commons, a non-profit organization that supports creative content sharing.

Always somebody smarter
"Researchers generally create their own formats because they believe that they know how their users want to use the data," says Boyle. But there are roughly a billion people with Internet access, he says, "and at least one of them has a smarter idea about what to do with your content than you do". For example, web users are using applications such as Google Earth to plot the spread of pandemics2 or to collect information on the effects of climate change. All that is needed, says Boyle, are common languages and formats for data.
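To make the idea of common languages and formats concrete, here is a minimal sketch, in Python, of the kind of self-describing record that data standards aim for. Everything in it — the dataset, the field names and the units — is invented for illustration, not any repository's actual schema:

    # A minimal sketch of a self-describing metadata record. The dataset,
    # field names and units below are hypothetical illustrations, not any
    # repository's actual schema.
    import json

    record = {
        "title": "Surface ozone measurements, rural site A",  # hypothetical dataset
        "creator": "J. Doe",
        "created": "2009-03-01",
        "format": "text/csv",
        "variables": [
            {"name": "ozone_concentration", "unit": "ppb"},   # units stated explicitly
            {"name": "timestamp", "unit": "ISO 8601"},
        ],
        "licence": "CC0",
    }

    # Serializing to a language-neutral format such as JSON means any consumer
    # can parse the record without knowing the producer's tools.
    print(json.dumps(record, indent=2))

The point is not the particular fields but that both the data format and the contextual metadata are stated explicitly enough for a stranger's software to interpret the numbers.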
Perhaps not surprisingly, data-sharing advocates say, the power to prod researchers towards openness and consistency rests largely with those who have always had the most clout in science: the funding agencies, which can demand data sharing in return for support; the scientific societies, which can establish it as a precedent; and the journals, which can make sharing a condition of publication. The trick is to wield that power effectively. The NSF, for example, has funded ground-breaking research into digital archiving, search and networking technologies. But its data-sharing policies for standard research grants have come under fire for being scattered and ad hoc; they are often stipulated on a per-project basis.

Gibbons says she is especially disappointed with a 2003 mandate by the US National Institutes of Health (NIH), which could have dramatically changed the culture of data sharing. The mandate does require a data-sharing plan for any grant worth $500,000 or more in direct annual costs, or an explanation of why sharing isn't possible. But details about how to make the data available were so vague, says Gibbons, that researchers soon stopped paying attention, content to sit back until someone got in trouble for not playing by the rules. Officials at the NIH Office of Extramural Research reply that the data-sharing policy's 'vagueness' is, in fact, flexibility, an attempt to avoid forcing every research programme into a one-size-fits-all straitjacket. They note that the policy also recognizes that there may be valid reasons for not sharing, including concerns about patient privacy and informed consent.

The chicken or the egg?
Nonetheless, until data sharing becomes a requirement for every grant, says Daniel Gardner, a physiologist and biophysicist at the Weill Medical College of Cornell University, "people aren't going to do it in as widespread of a way as we would like". Right now, he says, "you can't ask large numbers of people to do it, because it's a lot of work and because in many cases the databases don't exist for it. So there is kind of a chicken and egg problem here." One solution would be for agencies to invest in the infrastructure necessary to meet their archiving requirements. That can be difficult to arrange, says Boyle. "Infrastructure is the thing that we always fail to fund because it's kind of everybody's problem, and therefore it's nobody's problem."

Yet some agencies have been pioneers in this area. One often-cited example is the Wellcome Trust, the largest non-governmental UK funder of biomedical research. Since 1992, its Sanger Institute near Cambridge has been developing and housing some of the world's leading databases in genomics, proteomics and other areas. Another prominent example is the NIH's National Library of Medicine, which in 1988 established the National Center for Biotechnology Information (NCBI) to manage its own collection of molecular biology databases, including the GenBank repository. James Ostell, chief of the NCBI's Information Engineering Branch, likes to show a colour-coded timeline of contributions to GenBank since its founding in 1982 — a progression that dramatizes the fast-evolving history of genetic sequencing. Ostell points out thick waves of colours flowing from the left side of the chart.
Representing traditional sequence divisions such as viruses, rodents, primates, plants and bacteria, they dominated GenBank's contents for years. Other sequences, produced by faster techniques, began to put in appearances in the mid 1990s. Then in late 2001 a sudden surge of green, representing DNA snippets derived from whole-genome shotgun sequencing, quickly took over. By 2006, the green accounted for more than half of the database's contents.

Keeping up with ever-shifting technology has created its own set of challenges, says Ostell. "Nobody has infinite resources. And storing electronic information over time is a dynamic process. If you try to look at a file that you wrote with a word processor 20 years ago, good luck." In the same way, if a data set isn't readable by the latest version of a database, it isn't usable. So an archive may well have to choose between tossing old data out, and paying to preserve the out-of-date software required to make sense of them.

Even more challenging are the legal minefields surrounding personal data and privacy. The need to protect human subjects has led to starkly different approaches. Some projects openly share data, whereas others require researchers to navigate a labyrinthine approval process before granting access. The NCBI has tried to build such requirements into its newer databases. A case in point is its database of Genotype and Phenotype (dbGaP), which archives and distributes the results of genome-wide association studies, medical DNA sequencing, molecular diagnostic assays and almost anything else that relates people's traits and behaviours to their genetic makeup. The dbGaP allows open access to summaries and other forms of information that have been stripped of personal identifiers. But it grants controlled access to personal health information only after a researcher has been approved by a formal review committee.

Novel meaning
Such measures can be cumbersome, says Ostell. Yet the benefits of sharing far outweigh the costs. Some of GenBank's early sequences, for example, included genes from yeast and Escherichia coli labelled as DNA repair enzymes. Years later, researchers studying human colon cancer made a link between mutations in patients and those same enzymes3. "If you just did a literature search, you would never make that connection," Ostell says. "But when you search on the basis of their genes, suddenly you connect meaning in a way that's novel, which is the basis of discovery."

Sharing is obviously easier when the expectations are clear, and many scientists point to a 1996 meeting in Bermuda as a defining moment for genomics. At the meeting, leaders working on the Human Genome Project hammered out a set of agreements known as the Bermuda principles. Chief among them was the stipulation that sequences longer than 1,000 base pairs be made publicly available, preferably within 24 hours. The Bermuda principles, in turn, built on the foundations laid a decade earlier by the editors of journals such as Nucleic Acids Research, who spurred the early development of GenBank and other genomic repositories by requiring researchers to deposit their data there as a precondition for publishing.
Newer journals, such as the open-access Public Library of Science journals, have made publication contingent on making the data "freely available without restriction, provided that appropriate attribution is given and that suitable mechanisms exist for sharing the data used in a manuscript". The journal Neuroinformatics devoted its September 2008 issue to data sharing through the NIH Neuroscience Information Framework. Ecological Archives publishes appendices, supplements and data — related to studies appearing in other ecology journals — which include the metadata needed to interpret them. (Nature journals require authors "to make materials, data and associated protocols promptly available to readers without preconditions".)

Yet the journals' power to compel data sharing and scientific culture change is not absolute. In March 2009, for example, the journal Epidemiology felt able to call only for a "small step" towards more openness. "We invite our authors to share their data and computer code when the burden is minimal," said an editorial4 in that issue. "We believe that data sharing is a matter of time," says Miguel Hernán, an epidemiologist at Harvard University and a co-author of the editorial. But prematurely forcing a sharing requirement on authors "would be suicidal", he warns, especially with unresolved concerns over patient confidentiality. They would simply submit their papers somewhere else.

Another issue facing journals and data banks is how to ensure proper citations for data sets. "The one thing that people clearly care about in the sciences is attribution," says Boyle. Without an agreed-on way of assigning credit for original data falling beyond the parameters of a publication, however, it's no wonder that scientists are reluctant to share: their hard work may never be recognized by their employers or by granting agencies. Worse yet, it could be poached or scooped. This is one place that technology might help, says Boyle. He points to a music site associated with Creative Commons known as ccMixter, in which users can upload an a cappella chorus, a bass line, a trumpet solo or other musical samples. Users are free to remix the samples into new tracks. But when they do, the program automatically keeps a continuous credit record. So why not implement a similar system that would add a link back to a database every time a researcher repurposed some data? It wouldn't necessarily solve the problem of scooping, Boyle says, "but it aligns the social incentives with the individual incentives". It could also provide a feasible way for universities or funding agencies to track the value of a researcher's data (a sketch of such a credit chain appears below).

International agreement
Other Creative Commons tools are already making their way into international scientific agreements. In May, for example, Creative Commons' CC0 licence was endorsed by participants at a meeting in Rome on resource and data sharing within the mouse functional genomics community. The licence, which allows its users to "waive all copyrights and related or neighbouring rights" and thereby share more of their work, has been translated into dozens of languages. As welcome as such developments are, however, Boyle points out that the creation of the legal and technical infrastructure to accommodate researchers' data-sharing concerns is a huge task, and should not be left solely to non-profit organizations and individual universities. Nor should it be left to the funding agencies' grant-by-grant allocations for data sharing.
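As a thought experiment, the ccMixter-style credit chain that Boyle describes might look something like the following minimal Python sketch. It is purely illustrative: the class, the DOI-like handles and the names are invented, and a real system would need persistent identifiers and tamper-resistant records.

    # A purely hypothetical sketch of an automatic credit chain for data,
    # in the spirit of ccMixter's remix records. All identifiers are invented.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Dataset:
        identifier: str                      # e.g. a DOI-like handle (hypothetical)
        creator: str
        derived_from: List["Dataset"] = field(default_factory=list)

        def credit_chain(self) -> List[str]:
            # Walk the derivation links and collect everyone owed attribution.
            credits = ["%s (%s)" % (self.identifier, self.creator)]
            for source in self.derived_from:
                credits.extend(source.credit_chain())
            return credits

    original = Dataset("doi:10.0000/ice-cores-2007", "A. Researcher")
    remix = Dataset("doi:10.0000/ice-model-2009", "B. Modeller",
                    derived_from=[original])

    # Reusing a dataset automatically preserves the link back to its source,
    # giving employers and funders a way to see how often data are reused.
    print(remix.credit_chain())

Every derived data set would then carry its ancestry with it, so credit accrues to the original depositor without manual bookkeeping, which is the alignment of social and individual incentives that Boyle describes.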
Building that legal and technical infrastructure will require major government investments, starting with demonstration projects to explore how sharing can best be done. "What we need is a working example that you can point to," he says. If William Michener has his way, a virtual data centre funded by the NSF and hosted by his university will be one of those examples. DataONE (Data Observation Network for Earth) exists only on paper, but a five-year, $20-million grant through the NSF's DataNet programme will help to turn it into an open-access database focusing on biology, ecology and environmental science data. Four other $20-million archives are planned under DataNet's first phase.

Michener, director of e-science initiatives for University Libraries at the University of New Mexico, Albuquerque, and a leader of DataONE, says that the archive is designed to accommodate many of the orphan data sets that have yet to find a home, and will target resource-strapped colleges, field stations, and individual or small teams of scientists. In the longer term, the DataONE consortium, which encompasses two dozen partner institutions in the United States, the United Kingdom, South Africa, Australia and Taiwan, will explore business models that could sustain the archive well beyond its initial grant and potential five-year renewal. Among the plans under consideration are a fee-for-service setup, a membership requirement for participating entities and the solicitation of external grants for education and outreach.

DataONE's success, however, may depend on overcoming the same ambivalence among researchers that has bedevilled the University of Rochester and other builders of public databases. Although a strategy is still being worked out, Michener envisions a combination of workshops, seminars, websites and other educational tools to help clarify the how and why of sharing. But one archive can only do so much. Larger efforts will be required to tackle what Michener sees as the overriding challenge: "Changing the culture of science from one where publications were viewed as the primary product of the scientific enterprise to one that also equally values data." Without that cultural shift, says Gibbons, many digital archives are likely to remain little more than stacks of empty shelves.

References
1. Foster, N. F. & Gibbons, S. D-Lib Magazine doi:10.1045/january2005-foster (2005).
2. http://www.nature.com/avianflu/google-earth/index.html
3. Marra, G. & Boland, C. R. Gastroenterol. Clin. North Am. 25, 755–772 (1996).
4. Hernán, M. A. & Wilcox, A. J. Epidemiology 20, 167–168 (2009).
- Evolution: Mouth to mouth
- Nature 461(7261): (2009)
- Choking on carbon emissions from Greek academic paperwork
- Nature 461(7261):167 (2009)
Selection processes for academic jobs are notoriously open to criticism, but in Greece they have the additional drawback of leaving a hefty carbon footprint. Typically, selection committees for research institutes require applicants for a senior post to submit 11 paper copies of each of their publications (the Greeks' expansive view of publication sometimes includes texts of oral presentations) as well as of their birth certificate, national identity card (both sides), transcripts, translations of foreign degrees, and military and police reports.
- Evolution pioneers: celebrating Lamarck at 200, Darwin 215
- Nature 461(7261):167 (2009)
I take issue with the contention that Erasmus Darwin, the grandfather of Charles, tackled evolution only in poetic terms, as implied by Dan Graur and colleagues in their insightful Book Review ('In retrospect: Lamarck's treatise at 200' Nature 460, 688–689; 2009). Erasmus Darwin's most important contributions to evolutionary thought will be found in the very unpoetic prose of the first volume of his major medical and zoological treatise, Zoonomia, published in 1794.
- Evolution pioneers: Lamarck's reputation saved by his zoology
- Nature 461(7261):167 (2009)
Work by Lamarck scholars over the past 20 years calls into question some of the assertions made by Dan Graur and his colleagues in their Book Review (Nature 460, 688–689; 2009). For example, far from being universally scorned, Jean Baptiste Lamarck became known as 'the French Linnaeus' during the 1820s.
- Religious belief and the history of science
- Nature 461(7261):167 (2009)
I am concerned that the survey responses expressed in Gene Russo's Prospects article 'Balancing belief and bioscience' are irrelevant to gauging the influence of religion on the development of scientists (Nature 460, 654; 2009). Many of the great scientists renowned for developing entire scientific fields or theories were religious.
- Prepublication data sharing
- Nature 461(7261):168-170 (2009)
Rapid release of prepublication data has served the field of genomics well. Attendees at a workshop in Toronto recommend extending the practice to other biological data sets.
- Post-publication sharing of data and tools
- Nature 461(7261):171-173 (2009)
Despite existing guidelines on access to data and bioresources, good practice is not widespread. A meeting of mouse researchers in Rome proposes ways to promote a culture of sharing.
- Call for a climate culture shift
- Nature 461(7261):174-175 (2009)
A new book describes the rapid reshaping of human priorities needed to save the planet from global warming. Some of that change is already under way at the community level, explains Robert Costanza.
- The wider lessons for finance
- Nature 461(7261):175-176 (2009)
One of the unintended effects of the near-collapse of the world economy is the creation of a market for scientific advice to the banking sector. Senior officials at the Bank of England, for example, are consulting the theoretical ecologist and former Royal Society president Robert May, whose research interests include modelling ecosystem collapses and the spread of infectious diseases.
- How Spain redrew the world
- Nature 461(7261):176 (2009)
In the autumn of 1571, Juan López de Velasco, an ambitious legal scholar with one eye on the heavens, accepted the coveted position of chief cosmographer and chronicler to Philip II, the King of Spain. Velasco received a salary hike and a trunk filled with invaluable documents collected by his predecessor.
- Sex determination: Birds do it with a Z gene
- Nature 461(7261):177-178 (2009)
The gene that determines sex in birds has eluded scientists for a decade. Now this all-important locus is revealed as a gene on the Z chromosome known for its proclivity for determining sex in all kinds of animals.
- Nanotechnology: A gentle jackhammer
- Nature 461(7261):178-179 (2009)
A futuristic method of data storage depends on the 'write–read' action of a multitude of tiny silicon tips. The concept of dynamic superlubricity offers a way to avoid the wear that would otherwise cripple them.
- Early Earth: Oxygen for heavy-metal fans
- Nature 461(7261):179-181 (2009)
Chromium isotopes provide an eyebrow-raising history of oxygenation of Earth's atmosphere. Not least, it seems that oxygen might have all but disappeared half a billion years after its initial rise.
- 50 & 100 years ago
- Nature 461(7261):180 (2009)
My Philosophical Development. By Bertrand Russell — All those whose study of philosophy is grounded in the empirical tradition regard Lord Russell as the greatest living philosopher ... Although one should not neglect other influences ... there is no doubt that the main responsibility for the present state of philosophy lies squarely on Russell's shoulders ... There are few philosophers in history who have written important philosophical works almost continuously for fifty years: Russell has added to the immense debt we owe him by now giving us a full-scale account of his philosophical development, written with all the clarity, verve and wit we are accustomed to expect from anything he writes.
- Cell biology: Sent by the scent of death
- Nature 461(7261):181-182 (2009)
Dying cells release 'find-me' factors that attract professional scavenger cells to engulf and digest them. These cellular invitations to dine can take unexpected forms.
- Materials chemistry: Catalysts made thinner
- Nature 461(7261):182-183 (2009)
Thinner can be better, at least for the industrially useful catalysts known as zeolites. A technique that allows single layers of zeolites to assemble from solution opens up a plethora of practical applications.
- Developmental biology: Instructions writ in blood
- Nature 461(7261):183-184 (2009)
It seems that growth factors may instruct blood-cell progenitors to develop into specific mature cell types, actively determining lineage choice. But is this reductionist view of cell fate overly simplistic?
- Defining mechanisms that regulate RNA polymerase II transcription in vivo
- Nature 461(7261):186-192 (2009)
In the eukaryotic genome, the thousands of genes that encode messenger RNA are transcribed by a molecular machine called RNA polymerase II. Analysing the distribution and status of RNA polymerase II across a genome has provided crucial insights into the long-standing mysteries of transcription and its regulation. These studies identify points in the transcription cycle where RNA polymerase II accumulates after encountering a rate-limiting step. When coupled with genome-wide mapping of transcription factors, these approaches identify key regulatory steps and factors and, importantly, provide an understanding of the mechanistic generalities, as well as the rich diversities, of gene regulation.
- The logic of chromatin architecture and remodelling at promoters
- Nature 461(7261):193-198 (2009)
The regulation of gene transcription involves a dynamic balance between packaging regulatory sequences into chromatin and allowing transcriptional regulators access to these sequences. Access is restricted by the nucleosomes, but these can be repositioned or ejected by enzymes known as nucleosome remodellers. In addition, the DNA sequence can impart stiffness or curvature to the DNA, thereby affecting the position of nucleosomes on the DNA, influencing particular promoter 'architectures'. Recent genome-wide studies in yeast suggest that constitutive and regulated genes have architectures that differ in terms of nucleosome position, turnover, remodelling requirements and transcriptional noise.
- Genomic views of distant-acting enhancers
- Nature 461(7261):199-205 (2009)
In contrast to protein-coding sequences, the significance of variation in non-coding DNA in human disease has been minimally explored. A great number of recent genome-wide association studies suggest that non-coding variation is a significant risk factor for common disorders, but the mechanisms by which this variation contributes to disease remain largely obscure. Distant-acting transcriptional enhancers — a major category of functional non-coding DNA — are involved in many developmental and disease-relevant processes. Genome-wide approaches to their discovery and functional characterization are now available and provide a growing knowledge base for the systematic exploration of their role in human biology and disease susceptibility.
- Implications of chimaeric non-co-linear transcripts
- Nature 461(7261):206-211 (2009)
Deep sequencing of 'transcriptomes' — the collection of all RNA transcripts produced at a given time — from worms to humans reveals that some transcripts are composed of sequence segments that are not co-linear, with pieces of sequence coming from distant regions of DNA, even different chromosomes. Some of these 'chimaeric' transcripts are formed by genetic rearrangements, but others arise during post-transcriptional events. The 'trans-splicing' process in lower eukaryotes is well understood, but events in higher eukaryotes are not. The existence of such chimaeric RNAs has far-reaching implications for the potential information content of genomes and the way it is arranged.
- Chromosome crosstalk in three dimensions
- Nature 461(7261):212-217 (2009)
The genome forms extensive and dynamic physical interactions with itself in the form of chromosome loops and bridges, thus exploring the three-dimensional space of the nucleus. It is now possible to examine these interactions at the molecular level, and we have gained glimpses of their functional implications. Chromosomal interactions can contribute to the silencing and activation of genes within the three-dimensional context of the nuclear architecture. Technical advances in detecting these interactions contribute to our understanding of the functional organization of the genome, as well as its adaptive plasticity in response to environmental changes during development and disease. - Molecular networks as sensors and drivers of common human diseases
- Nature 461(7261):218-223 (2009)
The molecular biology revolution led to an intense focus on the study of interactions between DNA, RNA and protein biosynthesis in order to develop a more comprehensive understanding of the cell. One consequence of this focus was a reduced attention to whole-system physiology, making it difficult to link molecular biology to clinical medicine. Equipped with the tools emerging from the genomics revolution, we are now in a position to link molecular states to physiological ones through the reverse engineering of molecular networks that sense DNA and environmental perturbations and, as a result, drive variations in physiological states associated with disease. - Co-translational mRNA decay in Saccharomyces cerevisiae
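As a toy illustration of what 'reverse engineering' can mean at its simplest, one can threshold correlations between gene expression profiles measured across many perturbations to propose network edges; the data and cut-off below are invented, and published methods (for example, Bayesian networks that integrate genotype data) go well beyond this:

```python
import numpy as np

rng = np.random.default_rng(2)
samples, genes = 50, 6
expr = rng.standard_normal((samples, genes))
expr[:, 1] = expr[:, 0] + 0.3 * rng.standard_normal(samples)  # a co-regulated pair

corr = np.corrcoef(expr, rowvar=False)                 # gene-by-gene correlation
adjacency = (np.abs(corr) > 0.7) & ~np.eye(genes, dtype=bool)

# Report each proposed edge once (upper triangle only):
for i, j in zip(*np.nonzero(np.triu(adjacency))):
    print(f"gene{i} -- gene{j} (r = {corr[i, j]:.2f})")
```
- Co-translational mRNA decay in Saccharomyces cerevisiae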
- Nature 461(7261):225-229 (2009)
The rates of RNA decay and transcription determine the steady-state levels of all messenger RNA and both can be subject to regulation. Although the details of transcriptional regulation are becoming increasingly understood, the mechanism(s) controlling mRNA decay remain unclear. In yeast, a major pathway of mRNA decay begins with deadenylation followed by decapping and 5'–3' exonuclease digestion. Importantly, it is hypothesized that ribosomes must be removed from mRNA before transcripts are destroyed. Contrary to this prediction, here we show that decay takes place while mRNAs are associated with actively translating ribosomes. The data indicate that dissociation of ribosomes from mRNA is not a prerequisite for decay and we suggest that the 5'–3' polarity of mRNA degradation has evolved to ensure that the last translocating ribosome can complete translation. - An RNA-dependent RNA polymerase formed by TERT and the RMRP RNA
- Nature 461(7261):230-235 (2009)
Constitutive expression of telomerase in human cells prevents the onset of senescence and crisis by maintaining telomere homeostasis. However, accumulating evidence suggests that the human telomerase reverse transcriptase catalytic subunit (TERT) contributes to cell physiology independently of its ability to elongate telomeres. Here we show that TERT interacts with the RNA component of mitochondrial RNA processing endoribonuclease (RMRP), a gene that is mutated in the inherited pleiotropic syndrome cartilage–hair hypoplasia. Human TERT and RMRP form a distinct ribonucleoprotein complex that has RNA-dependent RNA polymerase (RdRP) activity and produces double-stranded RNAs that can be processed into small interfering RNA in a Dicer (also known as DICER1)-dependent manner. These observations identify a mammalian RdRP composed of TERT in complex with RMRP. - The global distribution of pure anorthosite on the Moon
- Nature 461(7261):236-240 (2009)
It has been thought that the lunar highland crust was formed by the crystallization and flotation of plagioclase from a global magma ocean1, 2, although the actual generation mechanisms are still debated2, 3. The composition of the lunar highland crust is therefore important for understanding the formation of such a magma ocean and the subsequent evolution of the Moon. The Multiband Imager4 on the Selenological and Engineering Explorer (SELENE)5 has a high spatial resolution and optimized spectral coverage, which should allow a clear view of the composition of the lunar crust. Here we report the global distribution of rocks of high plagioclase abundance (approaching 100 vol.%), using an unambiguous plagioclase absorption band recorded by the SELENE Multiband Imager. If the upper crust indeed consists of nearly 100 vol.% plagioclase, this is significantly higher than previous estimates of 82–92 vol.% (refs 2, 6, 7), providing a valuable constraint on models of lunar magma ocean evolution.
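Mapping a mineral from multiband reflectance usually comes down to a continuum-removed band-depth measurement of its diagnostic absorption (for plagioclase, a feature near 1.25 micrometres). A minimal sketch, with invented wavelengths and reflectances rather than the Multiband Imager's actual bands or calibration:

```python
import numpy as np

def band_depth(wl, refl, shoulders=(0, -1), band=None):
    """Depth = 1 - R_band / R_continuum, with a straight-line continuum
    drawn between two shoulder wavelengths outside the absorption."""
    wl, refl = np.asarray(wl, float), np.asarray(refl, float)
    i = np.argmin(refl) if band is None else band        # band centre
    w0, w1 = wl[shoulders[0]], wl[shoulders[1]]
    r0, r1 = refl[shoulders[0]], refl[shoulders[1]]
    continuum = r0 + (r1 - r0) * (wl[i] - w0) / (w1 - w0)
    return 1.0 - refl[i] / continuum

wl   = [1000, 1150, 1250, 1350, 1550]     # nm, hypothetical bands
refl = [0.32, 0.28, 0.22, 0.27, 0.33]     # deep feature near 1250 nm
print(round(band_depth(wl, refl), 3))      # larger depth -> more plagioclase-like
```
- Coherent optical pulse sequencer for quantum applications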
- Nature 461(7261):241-245 (2009)
The bandwidth and versatility of optical devices have revolutionized information technology systems and communication networks. Precise and arbitrary control of an optical field that preserves optical coherence is an important requisite for many proposed photonic technologies. For quantum information applications1, 2, a device that allows storage and on-demand retrieval of arbitrary quantum states of light would form an ideal quantum optical memory. Recently, significant progress has been made in implementing atomic quantum memories using electromagnetically induced transparency, photon echo spectroscopy, off-resonance Raman spectroscopy and other atom–light interaction processes. Storage of both single-photon3, 4 and bright-optical-field5, 6 quantum states has been successfully demonstrated. Here we present a coherent optical memory based on photon echoes induced through controlled reversible inhomogeneous broadening. Our scheme allows storage of multiple pulses of light within a chosen frequency bandwidth, and stored pulses can be recalled in arbitrary order with any chosen delay between each recalled pulse. Furthermore, pulses can be time-compressed, time-stretched or split into multiple smaller pulses and recalled in several pieces at chosen times. Although our experimental results are so far limited to classical light pulses, our technique should enable the construction of an optical random-access memory for time-bin quantum information, and have potential applications in quantum information processing. - Stable single-unit-cell nanosheets of zeolite MFI as active and long-lived catalysts
- Nature 461(7261):246-249 (2009)
Zeolites—microporous crystalline aluminosilicates—are widely used in petrochemistry and fine-chemical synthesis1, 2, 3 because strong acid sites within their uniform micropores enable size- and shape-selective catalysis. But the very presence of the micropores, with aperture diameters below 1 nm, often goes hand-in-hand with diffusion limitations3, 4, 5 that adversely affect catalytic activity. The problem can be overcome by reducing the thickness of the zeolite crystals, which shortens diffusion path lengths and thus improves molecular diffusion4, 5. This has been realized by synthesizing zeolite nanocrystals6, by exfoliating layered zeolites7, 8, 9, and by introducing mesopores into the microporous material through templating strategies10, 11, 12, 13, 14, 15, 16, 17 or demetallation processes18, 19, 20, 21, 22. But except for exfoliation, none of these strategies has produced 'ultrathin' zeolites with thicknesses below 5 nm. Here we show that appropriately designed bifunctional surfactants can direct the formation of zeolite structures on the mesoporous and microporous length scales simultaneously and thus yield MFI (ZSM-5, one of the most important catalysts in the petrochemical industry) zeolite nanosheets that are only 2 nm thick, which corresponds to the b-axis dimension of a single MFI unit cell. The large number of acid sites on the external surface of these zeolites renders them highly active for the catalytic conversion of large organic molecules, and the reduced crystal thickness facilitates diffusion and thereby dramatically suppresses catalyst deactivation through coke deposition during methanol-to-gasoline conversion. We expect that our synthesis approach could be applied to other zeolites to improve their performance in a range of important catalytic applications. - Fluctuations in Precambrian atmospheric oxygenation recorded by chromium isotopes
- Nature 461(7261):250-253 (2009)
Geochemical data1, 2, 3, 4 suggest that oxygenation of the Earth's atmosphere occurred in two broad steps. The first rise in atmospheric oxygen is thought to have occurred between 2.45 and 2.2 Gyr ago1, 5, leading to a significant increase in atmospheric oxygen concentrations and concomitant oxygenation of the shallow surface ocean. The second increase in atmospheric oxygen appears to have taken place in distinct stages during the late Neoproterozoic era (800–542 Myr ago)3, 4, ultimately leading to oxygenation of the deep ocean 580 Myr ago3, but details of the evolution of atmospheric oxygenation remain uncertain. Here we use chromium (Cr) stable isotopes from banded iron formations (BIFs) to track the presence of Cr(VI) in Precambrian oceans, providing a time-resolved picture of the oxygenation history of the Earth's atmosphere–hydrosphere system. The geochemical behaviour of Cr is highly sensitive to the redox state of the surface environment because oxidative weathering processes produce the oxidized hexavalent [Cr(VI)] form. Oxidation of reduced trivalent [Cr(III)] chromium on land is accompanied by an isotopic fractionation, leading to enrichment of the mobile hexavalent form in the heavier isotope. Our fractionated Cr isotope data indicate the accumulation of Cr(VI) in ocean surface waters 2.8 to 2.6 Gyr ago and a likely transient elevation in atmospheric and surface ocean oxygenation before the first great rise of oxygen 2.45–2.2 Gyr ago (the Great Oxidation Event)1, 5. In 1.88-Gyr-old BIFs we find that Cr isotopes are not fractionated, indicating a decline in atmospheric oxygen. Our findings suggest that the Great Oxidation Event did not lead to a unidirectional stepwise increase in atmospheric oxygen. In the late Neoproterozoic, we observe strong positive fractionations in Cr isotopes (δ53Cr up to +4.9‰), providing independent support for increased surface oxygenation at that time, which may have stimulated rapid evolution of macroscopic multicellular life3, 4, 6.
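The δ53Cr values quoted here follow the standard delta notation for stable isotopes: the per mil deviation of a sample's 53Cr/52Cr ratio from that of a reference standard (NIST SRM 979 for chromium). A minimal worked example, with an invented sample ratio chosen to land near the maximum fractionation reported here:

```python
# Approximate 53Cr/52Cr of the NIST SRM 979 reference standard,
# from natural isotope abundances (53Cr ~9.50%, 52Cr ~83.79%).
SRM979_53CR_52CR = 0.11339

def delta53cr(sample_ratio: float, standard_ratio: float = SRM979_53CR_52CR) -> float:
    """Return delta-53Cr in per mil relative to the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A hypothetical sample enriched in the heavy isotope by oxidative weathering:
print(round(delta53cr(0.11395), 1))  # ~ +4.9 per mil
```
- The importance of niches for the maintenance of species diversity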
- Nature 461(7261):254-257 (2009)
Ecological communities characteristically contain a wide diversity of species with important functional, economic and aesthetic value. Ecologists have long questioned how this diversity is maintained1, 2, 3. Classic theory shows that stable coexistence requires competitors to differ in their niches4, 5, 6; this has motivated numerous investigations of ecological differences presumed to maintain diversity3, 6, 7, 8. That niche differences are key to coexistence, however, has recently been challenged by the neutral theory of biodiversity, which explains coexistence with the equivalence of competitors9. The ensuing controversy has motivated calls for a better understanding of the collective importance of niche differences for the diversity observed in ecological communities10, 11. Here we integrate theory and experimentation to show that niche differences collectively stabilize the dynamics of experimental communities of serpentine annual plants. We used field-parameterized population models to develop a null expectation for community dynamics without the stabilizing effects of niche differences. The population growth rates predicted by this null model varied by several orders of magnitude between species, which is sufficient for rapid competitive exclusion. Moreover, after two generations of community change in the field, Shannon diversity was over 50 per cent greater in communities stabilized by niche differences relative to those exhibiting dynamics predicted by the null model. Finally, in an experiment manipulating species' relative abundances, population growth rates increased when species became rare—the demographic signature of niche differences. Our work thus provides strong evidence that species differences have a critical role in stabilizing species diversity.
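The contrast between stabilized and null dynamics can be caricatured in a few lines: in an annual-plant competition model, making intraspecific competition stronger than interspecific competition (a niche difference) preserves Shannon diversity, whereas equal competition coefficients (the null case) let the fastest-growing species exclude the rest. All parameter values below are invented, not the field-parameterized ones used in the study:

```python
import numpy as np

def shannon(abundances):
    """Shannon diversity H = -sum(p * ln p) over species present."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

def step(n, lam, alpha):
    """One generation of a Beverton-Holt annual-plant model:
    n_i(t+1) = lam_i * n_i / (1 + sum_j alpha_ij * n_j)."""
    return lam * n / (1.0 + alpha @ n)

rng = np.random.default_rng(0)
S = 4
lam = rng.uniform(2.0, 8.0, S)          # hypothetical per-capita growth rates
n0 = np.full(S, 10.0)

# Stabilized: intraspecific competition exceeds interspecific (niche differences).
alpha_niche = np.full((S, S), 0.02) + np.diag(np.full(S, 0.08))
# Null: every species limits every other equally (no niche differences).
alpha_null = np.full((S, S), 0.05)

for alpha, label in [(alpha_niche, "with niches"), (alpha_null, "null")]:
    n = n0.copy()
    for _ in range(50):
        n = step(n, lam, alpha)
    print(label, "H =", round(shannon(n), 2))  # niches keep H high; null -> ~0
```
- Photosystem I gene cassettes are present in marine virus genomes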
- Nature 461(7261):258-262 (2009)
Cyanobacteria of the Synechococcus and Prochlorococcus genera are important contributors to photosynthetic productivity in the open oceans1, 2, 3. Recently, core photosystem II (PSII) genes were identified in cyanophages and proposed to function in photosynthesis and in increasing viral fitness by supplementing the host production of these proteins4, 5, 6, 7. Here we show evidence for the presence of photosystem I (PSI) genes in the genomes of viruses that infect these marine cyanobacteria, using pre-existing metagenomic data from the global ocean sampling expedition8 as well as from viral biomes9. The seven cyanobacterial core PSI genes identified in this study, psaA, B, C, D, E, K and a unique J and F fusion, form a cluster in cyanophage genomes, suggestive of selection for a distinct function in the virus life cycle. The existence of this PSI cluster was confirmed with overlapping and long polymerase chain reaction on environmental DNA from the Northern Line Islands. Potentially, the seven proteins encoded by the viral genes are sufficient to form an intact monomeric PSI complex. Projection of viral predicted peptides on the cyanobacterial PSI crystal structure10 suggested that the viral PSI components might provide a unique way of funnelling reducing power from respiratory and other electron transfer chains to the PSI. - Changes of mind in decision-making
- Nature 461(7261):263-266 (2009)
A decision is a commitment to a proposition or plan of action based on evidence and the expected costs and benefits associated with the outcome. Progress in a variety of fields has led to a quantitative understanding of the mechanisms that evaluate evidence and reach a decision1, 2, 3. Several formalisms propose that a representation of noisy evidence is evaluated against a criterion to produce a decision4, 5, 6, 7, 8. Without additional evidence, however, these formalisms fail to explain why a decision-maker would change their mind. Here we extend a model, developed to account for both the timing and the accuracy of the initial decision9, to explain subsequent changes of mind. Subjects made decisions about a noisy visual stimulus, which they indicated by moving a handle. Although they received no additional information after initiating their movement, their hand trajectories betrayed a change of mind in some trials. We propose that noisy evidence is accumulated over time until it reaches a criterion level, or bound, which determines the initial decision, and that the brain exploits information that is in the processing pipeline when the initial decision is made to subsequently either reverse or reaffirm the initial decision. The model explains the frequency of changes of mind as well as their dependence on task difficulty and on whether the initial decision was accurate or erroneous. The theoretical and experimental findings advance the understanding of decision-making to the highly flexible and cognitive acts of vacillation and self-correction.
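The core mechanism is straightforward to simulate: accumulate noisy evidence to a bound, then keep integrating a short 'pipeline' of late-arriving evidence and reverse the choice if it crosses back. The sketch below uses illustrative parameters, not the authors' fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)

def trial(drift, bound=1.0, change_bound=0.0, dt=0.005, sigma=1.0,
          pipeline_steps=60):
    """Bounded evidence accumulation with a possible change of mind.

    Evidence accumulates until it hits +bound or -bound (initial decision).
    Evidence already in the processing pipeline keeps arriving afterwards;
    if the accumulator ends up across change_bound on the other side of
    zero, the initial decision is reversed. All values are illustrative.
    """
    x = 0.0
    while abs(x) < bound:
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    initial = np.sign(x)
    for _ in range(pipeline_steps):          # late, in-pipeline evidence
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    final = -initial if np.sign(x) != initial and abs(x) > change_bound else initial
    return initial, final

trials = [trial(drift=0.5) for _ in range(2000)]
changes = sum(i != f for i, f in trials)
print(f"changes of mind: {changes / len(trials):.1%}")  # a small minority of trials
```
- The avian Z-linked gene DMRT1 is required for male sex determination in the chicken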
- Nature 461(7261):267-271 (2009)
Sex in birds is chromosomally based, as in mammals, but the sex chromosomes are different and the mechanism of avian sex determination has been a long-standing mystery1, 2, 3. In the chicken and all other birds, the homogametic sex is male (ZZ) and the heterogametic sex is female (ZW). Two hypotheses have been proposed for the mechanism of avian sex determination. The W (female) chromosome may carry a dominant-acting ovary determinant4, 5, 6. Alternatively, the dosage of a Z-linked gene may mediate sex determination, two doses being required for male development (ZZ)7, 8. A strong candidate avian sex-determinant under the dosage hypothesis is the conserved Z-linked gene, DMRT1 (doublesex and mab-3-related transcription factor 1)9, 10, 11. Here we used RNA interference (RNAi) to knock down DMRT1 in early chicken embryos. Reduction of DMRT1 protein expression in ovo leads to feminization of the embryonic gonads in genetically male (ZZ) embryos. Affected males show partial sex reversal, characterized by feminization of the gonads. The feminized left gonad shows female-like histology, disorganized testis cords and a decline in the testicular marker, SOX9. The ovarian marker, aromatase, is ectopically activated. The feminized right gonad shows a more variable loss of DMRT1 and ectopic aromatase activation, suggesting differential sensitivity to DMRT1 between left and right gonads. Germ cells also show a female pattern of distribution in the feminized male gonads. These results indicate that DMRT1 is required for testis determination in the chicken. Our data support the Z dosage hypothesis for avian sex determination. - Targeted capture and massively parallel sequencing of 12 human exomes
- Nature 461(7261):272-276 (2009)
Genome-wide association studies suggest that common genetic variants explain only a modest fraction of heritable risk for common diseases, raising the question of whether rare variants account for a significant fraction of unexplained heritability1, 2. Although DNA sequencing costs have fallen markedly3, they remain far from what is necessary for rare and novel variants to be routinely identified at a genome-wide scale in large cohorts. We have therefore sought to develop second-generation methods for targeted sequencing of all protein-coding regions ('exomes'), to reduce costs while enriching for discovery of highly penetrant variants. Here we report on the targeted capture and massively parallel sequencing of the exomes of 12 humans. These include eight HapMap individuals representing three populations4, and four unrelated individuals with a rare dominantly inherited disorder, Freeman–Sheldon syndrome (FSS)5. We demonstrate the sensitive and specific identification of rare and common variants in over 300 megabases of coding sequence. Using FSS as a proof-of-concept, we show that candidate genes for Mendelian disorders can be identified by exome sequencing of a small number of unrelated, affected individuals. This strategy may be extendable to diseases with more complex genetics through larger sample sizes and appropriate weighting of non-synonymous variants by predicted functional impact.
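The candidate-gene logic used for the proof of concept reduces, at its simplest, to a set intersection: keep genes that carry a variant absent from control exomes in every affected individual. The variant names below are invented for illustration (although MYH3 is indeed the gene mutated in FSS):

```python
controls = {"GENE_A:c.100G>A", "GENE_B:c.55T>C"}       # e.g. variants seen in HapMap exomes

affected = [
    {"GENE_A:c.100G>A", "MYH3:c.2015T>C"},
    {"MYH3:c.2015T>C", "GENE_C:c.7del"},
    {"MYH3:c.602A>G", "GENE_B:c.55T>C"},
    {"MYH3:c.2015T>C"},
]

def gene(variant: str) -> str:
    return variant.split(":")[0]

# Genes with at least one control-filtered variant in every affected exome:
candidates = set.intersection(
    *({gene(v) for v in exome if v not in controls} for exome in affected)
)
print(candidates)  # {'MYH3'}
```
- Modification of CO2 avoidance behaviour in Drosophila by inhibitory odorants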
- Nature 461(7261):277-281 (2009)
The fruitfly Drosophila melanogaster exhibits a robust and innate olfactory-based avoidance behaviour to CO2, a component of odour emitted from stressed flies1. Specialized neurons in the antenna and a dedicated neuronal circuit in the higher olfactory system mediate CO2 detection and avoidance1, 2. However, fruitflies need to overcome this avoidance response in some environments that contain CO2, such as ripening fruits and fermenting yeast, which are essential food sources. Very little is known about the molecular and neuronal basis of this unique, context-dependent modification of innate olfactory avoidance behaviour. Here we identify a new class of odorants present in food that directly inhibit CO2-sensitive neurons in the antenna. Using an in vivo expression system we establish that the odorants act on the Gr21a/Gr63a CO2 receptor3. The presence of these odorants significantly and specifically reduces CO2-mediated avoidance behaviour, as well as avoidance mediated by 'Drosophila stress odour'. We propose a model in which behavioural avoidance to CO2 is directly influenced by inhibitory interactions of the novel odours with CO2 receptors. Furthermore, we observe differences in the temporal dynamics of inhibition: the effect of one of these odorants lasts several minutes beyond the initial exposure. Notably, animals that have been briefly pre-exposed to this odorant do not respond to the CO2 avoidance cue even after the odorant is no longer present. We also show that related odorants are effective inhibitors of the CO2 response in Culex mosquitoes that transmit West Nile fever and filariasis. Our findings have broader implications in highlighting the important role of inhibitory odorants in olfactory coding, and in their potential to disrupt CO2-mediated host-seeking behaviour in disease-carrying insects like mosquitoes. - Nucleotides released by apoptotic cells act as a find-me signal to promote phagocytic clearance
- Nature 461(7261):282-286 (2009)
Phagocytic removal of apoptotic cells occurs efficiently in vivo such that even in tissues with significant apoptosis, very few apoptotic cells are detectable1. This is thought to be due to the release of 'find-me' signals by apoptotic cells that recruit motile phagocytes such as monocytes, macrophages and dendritic cells, leading to the prompt clearance of the dying cells2. However, the identity and in vivo relevance of such find-me signals are not well understood. Here, through several lines of evidence, we identify extracellular nucleotides as a critical apoptotic cell find-me signal. We demonstrate the caspase-dependent release of ATP and UTP (in equimolar quantities) during the early stages of apoptosis by primary thymocytes and cell lines. Purified nucleotides at these concentrations were sufficient to induce monocyte recruitment comparable to that of apoptotic cell supernatants. Enzymatic removal of ATP and UTP (by apyrase or the expression of ectopic CD39) abrogated the ability of apoptotic cell supernatants to recruit monocytes in vitro and in vivo. We then identified the ATP/UTP receptor P2Y2 as a critical sensor of nucleotides released by apoptotic cells using RNA interference-mediated depletion studies in monocytes, and macrophages from P2Y2-null mice3. The relevance of nucleotides in apoptotic cell clearance in vivo was revealed by two approaches. First, in a murine air-pouch model, apoptotic cell supernatants induced a threefold greater recruitment of monocytes and macrophages than supernatants from healthy cells did; this recruitment was abolished by depletion of nucleotides and was significantly decreased in P2Y2-/- (also known as P2ry2-/-) mice. Second, clearance of apoptotic thymocytes was significantly impaired by either depletion of nucleotides or interference with P2Y receptor function (by pharmacological inhibition or in P2Y2-/- mice). These results identify nucleotides as a critical find-me cue released by apoptotic cells to promote P2Y2-dependent recruitment of phagocytes, and provide evidence for a clear relationship between a find-me signal and efficient corpse clearance in vivo. - ErbB2 resembles an autoinhibited invertebrate epidermal growth factor receptor
- Nature 461(7261):287-291 (2009)
The orphan receptor tyrosine kinase ErbB2 (also known as HER2 or Neu) transforms cells when overexpressed1, and it is an important therapeutic target in human cancer2, 3. Structural studies4, 5 have suggested that the oncogenic (and ligand-independent) signalling properties of ErbB2 result from the absence of a key intramolecular 'tether' in the extracellular region that autoinhibits other human ErbB receptors, including the epidermal growth factor (EGF) receptor6. Although ErbB2 is unique among the four human ErbB receptors6, 7, here we show that it is the closest structural relative of the single EGF receptor family member in Drosophila melanogaster (dEGFR). Genetic and biochemical data show that dEGFR is tightly regulated by growth factor ligands8, yet a crystal structure shows that it, too, lacks the intramolecular tether seen in human EGFR, ErbB3 and ErbB4. Instead, a distinct set of autoinhibitory interdomain interactions hold unliganded dEGFR in an inactive state. All of these interactions are maintained (and even extended) in ErbB2, arguing against the suggestion that ErbB2 lacks autoinhibition. We therefore suggest that normal and pathogenic ErbB2 signalling may be regulated by ligands in the same way as dEGFR. Our findings have important implications for ErbB2 regulation in human cancer, and for developing therapeutic approaches that target novel aspects of this orphan receptor. - Structure of the BK potassium channel in a lipid membrane from electron cryomicroscopy
- Nature 461(7261):292-295 (2009)
A long-sought goal in structural biology has been the imaging of membrane proteins in their membrane environments. This goal has been achieved with electron crystallography1 in those special cases where a protein forms highly ordered arrays in lipid bilayers. It has also been achieved by NMR methods1 in proteins up to 50 kilodaltons (kDa) in size, although milligram quantities of protein and isotopic labelling are required. For structural analysis of large soluble proteins in microgram quantities, an increasingly powerful method that does not require crystallization is single-particle reconstruction from electron microscopy of cryogenically cooled samples (electron cryomicroscopy (cryo-EM))2. Here we report the first single-particle cryo-EM study of a membrane protein, the human large-conductance calcium- and voltage-activated potassium channel3 (BK), in a lipid environment. The new method is called random spherically constrained (RSC) single-particle reconstruction. BK channels, members of the six-transmembrane-segment (6TM) ion channel family, were reconstituted at low density into lipid vesicles (liposomes), and their function was verified by a potassium flux assay. Vesicles were also frozen in vitreous ice and imaged in an electron microscope. From images of 8,400 individual protein particles, a three-dimensional (3D) reconstruction of the BK channel and its membrane environment was obtained at a resolution of 1.7–2.0 nm. Because it does not require the formation of crystals, the RSC approach promises to be useful in the structural study of many other membrane proteins as well. - MicroRNA-mediated switching of chromatin-remodelling complexes in neural development
- Nature 461(7261):296 (2009)
Nature 460, 642–646 (2009). In the print issue of this Letter, Fig. 3 was incorrectly printed as a black-and-white image; the corrected colour figure appears with the erratum. - The pet
- Nature 461(7261):304 (2009)
An exercise in control.