Friday, April 9, 2010

Hot off the presses! Apr 08 Nature

The Apr 08 issue of Nature is now up on Pubget (About Nature): if you're at a subscribing institution, just click the latest-issue link on the home page. (Note you'll only be able to get all the PDFs in the issue if your institution subscribes to Pubget.)

Latest Articles Include:

  • Open sesame
    - Nature 464(7290):813 (2010)
    Government influence favouring enhanced openness is rightly diversifying practices in science publishing.
  • Learning in the wild
    - Nature 464(7290):813 (2010)
    Much of what people know about science is learned informally. Education policy-makers should take note.
  • Attention Canadian mentors
    - Nature 464(7290):814 (2010)
    Since they were launched in 2005, Nature's awards for mentoring in science have rewarded outstanding research mentors in Britain, Germany, Japan, Australia and South Africa. The competition is held within one country each year, in the belief that mentoring reflects not just notions of good scientific practice and creativity that are universal, but also scientific traditions and cultures that are, at least to a degree, national. (For details of past competitions, see http://go.nature.com/Rccbo4. For our guide to outstanding mentoring, see Nature 447, 791–797; 2007.) This year's competition is taking place in Canada. Two prizes of Can$10,000 (US$9,900) will be awarded, one for a mid-career mentor and one for lifetime achievement in mentoring. Nominations are now open, with a closing date of 30 June 2010. The prizes will be awarded at the Canadian Association for Graduate Studies annual meeting in Toronto, Ontario, in November. Contenders may nominate themselves or be nominated by colleagues and ex-colleagues. Nominations for a candidate must include independent testimonials from at least five researchers who have been mentored by the nominee, not all over the same period. Full details and nomination forms can be found at http://go.nature.com/CKbeC4. We look forward to hearing about Canada's outstanding mentors.
  • Wildlife biology: Pitch shifter
    - Nature 464(7290):816 (2010)
  • Palaeoecology: Ancient tree nursery
    - Nature 464(7290):816 (2010)
  • Cancer biology: Brain tumour trigger
    - Nature 464(7290):816 (2010)
  • Chemical sensing: Bomb detector sewn up
    - Nature 464(7290):816 (2010)
  • Immunology: Secret to superinfection
    - Nature 464(7290):816 (2010)
  • Cancer detection: Tracking roving cancer cells
    - Nature 464(7290):817 (2010)
  • Plant biology: Seeking enlightenment
    - Nature 464(7290):817 (2010)
  • Animal behaviour: Tortoise see, tortoise do
    - Nature 464(7290):817 (2010)
  • Neurobiology: Entangled diseases
    - Nature 464(7290):817 (2010)
  • Journal club
    Kappe CO - Nature 464(7290):817 (2010)
  • News briefing: 8 April 2010
    - Nature 464(7290):818 (2010)
    The week in science.

US President Barack Obama announced plans on 31 March to expand offshore oil and gas drilling, part of an effort to establish middle ground as the administration seeks votes on Senate climate legislation. The plans would prevent drilling along the west coast and halt a particularly controversial project in Alaska's Bristol Bay, but open up vast tracts along the eastern seaboard. Last week, the administration also finalized its greenhouse-gas standards for vehicles and announced that greenhouse-gas permits would be required for major industrial sources by January.

The British government has announced that it will create a huge marine reserve around the Chagos islands, an archipelago of more than 50 islands in the British Indian Ocean Territory. The protected biodiversity hot spot covers more than half a million square kilometres of ocean, and will include a 'no-take' reserve where all commercial fishing is banned. The declaration has angered the Mauritian government, which has claims on the territory, and the former inhabitants of the islands, who were expelled four decades ago and are still campaigning for their right to return.

On 31 March, the drug company Pfizer began to make public its payments to physicians and other health professionals for speaking and consulting on its behalf, and for conducting clinical trials of its drugs. Pfizer said it paid out US$35 million in the last six months of 2009. It was required to post much of the data by an agreement settling a US government investigation into the company's promotion of its drugs for off-label use. GlaxoSmithKline, Merck and Eli Lilly already publicly report physician payments; this will be mandatory from 2013, under US health-care reform law.

Greenhouse-gas emissions from around 11,000 factories and power plants under the 27-nation European Union (EU) trading scheme fell by 11% in 2009, according to preliminary, incomplete data released on 1 April. The fall — due to the recession — meant that the EU handed out an excess of 60.6 million carbon credits (free permits to emit a tonne of carbon dioxide), which can be retained for future trading. Steel and cement industries have amassed the greatest surplus. Tighter permit caps are expected from 2013 in the scheme's next phase.

Wind-energy companies have struck a compromise with the UK Ministry of Defence, which was blocking the development of five wind farms on England's east coast. The ministry had previously opposed these projects because spinning turbine blades can confuse air-defence radar (see Nature 451, 746; 2008). But under an agreement announced on 31 March, wind developers will pay part of the roughly $15 million cost for a replacement radar at Trimingham, Norfolk, which can discriminate between wind turbines and aircraft. The project would supply 3 gigawatts of wind power.

Brushing off what may have been a seasonal blip late last year, worldwide venture-capital investment in the clean-technology sector in the first quarter of 2010 has continued its recovery from the economic downturn. At US$1.9 billion, it is now back to levels seen at the beginning of 2008, according to data released on 31 March by analysts the Cleantech Group and Deloitte. The sector includes renewable-energy generation and storage, waste and water treatment, and materials and infrastructure for greater energy efficiency.
Electric-vehicle companies led the charge, with Better Place — a company based in Palo Alto, California, that builds infrastructure for electric-vehicle networks — raising $350 million in a January funding round. The total number of venture-capital deals rose to an all-time high of 180, but early-stage investment rounds showed little increase — suggesting that venture-capital investors are interested in maintaining existing portfolios rather than striking out with new companies, says Cleantech's president, Sheeraz Haji. He goes on to say that "private capital may not be as dependent on government stimulus as some in the industry have feared", noting that only one of the top ten deals was with a company backed by government stimulus funds — compared with four of ten in the third quarter of 2009.

Twenty-two clinics around the world that offer patients experimental adult stem-cell treatments have been surveyed by the International Cellular Medicine Society, based in Salem, Oregon. The study, released on 2 April, provides information about working clinics — such as their cell processing and implantation techniques — although it does not rank them. The society has also established a registry to track the health of people who undergo stem-cell therapy. See go.nature.com/ZrAHkc for more.

The UK government has approved an earmarked £97.4 million (US$148 million) to expand the country's Diamond synchrotron in Harwell, Oxfordshire; the facility will get 10 extra beamlines by 2017, taking its total to 32. In addition, on 30 March the country's Natural Environment Research Council announced that it had commissioned a £75-million replacement vessel for its ageing research ship, the RRS Discovery. The order, due mid-2013, was postponed in March 2009 — frustrating British marine scientists — because of rising costs due to exchange rate fluctuations.

The effects of the recession have prompted Arizona State University in Tempe to withdraw from a much-heralded medical school partnership designed to boost biotechnology research in Phoenix (see Nature 446, 971–972; 2007). The University of Arizona in Tucson has agreed to take control of the jointly developed Phoenix medical school, now in its third year of operation. The agreement awaits formal approval by the Arizona Board of Regents at a meeting due to be held on 1 May. Both universities have been hit in recent years by more than US$100 million each in state budget cuts.

Financial donors to the Consultative Group on International Agricultural Research (CGIAR) want changes to the group's plans for reshaping its research programme. The global network of 15 agricultural research centres, which focuses on improving agriculture in developing countries, hopes to increase its budget from about US$500 million to $1 billion in 5–10 years. Donors who may provide that money voiced concerns at a conference last week in Montpellier, France. They want the proposed reform process to be accelerated and for the CGIAR to focus on well-defined problems rather than on broad themes. See go.nature.com/ZhGM3A for more.

Linking offshore wind farms together with an undersea cable down the US east coast could produce a reliable supply of grid electricity, according to a study published on 5 April (W. Kempton et al. Proc. Natl Acad. Sci. USA doi:10.1073/pnas.0909075107; 2010). Researchers from the University of Delaware in Newark studied 5 years of wind data from 11 meteorological stations.
They conclude that if wind generators were electrically connected, fluctuations at each site could be smoothed out so that the total power provided changes slowly and never drops to zero.

Physicists have started to gather experimental data from the world's most powerful particle accelerator. On 30 March, the Large Hadron Collider (LHC), located outside Geneva, Switzerland, began colliding protons at energies of 7 teraelectronvolts — more than three times the collision energy of the Tevatron in Batavia, Illinois. Almost immediately, the four main detectors around the machine's 27-kilometre ring began recording data from the collisions. Researchers hope that the data will provide evidence of the Higgs boson — a key part of the mechanism that creates mass — among other discoveries. The start-up comes nearly 18 months after a major accident sidelined the LHC for more than a year (see Nature 463, 1008–1009; 2010).

British science writer Simon Singh has won a key appeal in his court battle with the British Chiropractic Association (BCA). The 1 April ruling is of wider significance as it could establish greater legal protection for others wanting to debate scientific or medical issues. The BCA is suing Singh over an article he wrote in The Guardian newspaper in 2008; the appeal judgment means that he is able to use the defence of 'fair comment' under British libel law. The BCA may appeal the ruling. See go.nature.com/EQFfg3 for more.

In London, the Royal Institution of Great Britain faces a showdown meeting: its members will vote on whether to oust the venerable body's council, which in January forced out director Susan Greenfield.

US President Barack Obama hosts a global summit on nuclear security in Washington DC. The meeting follows a review of the United States' nuclear policy.

Weather, water and climate services in Africa are under the spotlight at the First Conference of Ministers Responsible for Meteorology in Africa, in Nairobi, Kenya. → go.nature.com/HV9gSh

Two annual meetings will see debate on a federal rule that allows Native American tribes to reclaim ancient bones found near their lands (see Nature 464, 662; 2010). The American Association of Physical Anthropologists meets in Albuquerque, New Mexico. → go.nature.com/7hMvFJ And the Society for American Archaeology meets in St Louis, Missouri. → go.nature.com/QdkT9N

Last week's appeal judgement in the court case of science writer Simon Singh (see 'People') quoted with approval a statement from a 1994 US libel action (Underwager v. Salter 22 Fed. 3d 730; 1994).

Shen Neng 1: the Chinese coal carrier slammed into Australia's Great Barrier Reef on 3 April, spilling around 2 tonnes of oil and destroying coral.
  • Telescope arrays give fine view of stars
    Hand E - Nature 464(7290):820 (2010)
    Optical interferometry is no longer on the fringe of astronomy.

Overlooking Los Angeles, six small domes nestle amid the pine trees atop Mount Wilson. Individually, the 1-metre telescopes inside those buildings have no chance of competing with the biggest ground and space telescopes. But collectively, the Mount Wilson telescopes are producing some of the sharpest images ever made. Spread in a Y-shaped array across the top of the mountain, the telescopes are part of the Center for High Angular Resolution Astronomy (CHARA). The light from each one is funnelled through vacuum tubes to a central shed, where it is combined in a process called interferometry. Merging the light beams from the widely separated domes gives CHARA a resolving power, or sharpness, equivalent to a single telescope with a 330-metre mirror. That's more than 50 times better than the Hubble Space Telescope's resolution, allowing CHARA to see details on the surfaces of stars where other telescopes just see blurry blobs of light.

Radio astronomers have relied on interferometry for more than half a century, but optical astronomers have lagged behind. Now, optical interferometry has come of age. Several observatories are producing strong scientific results, including one reported on page 870 by researchers using the CHARA array [1]. At the end of last year, the team imaged a disk of dust almost as wide as the Solar System as it crept in front of a large, old star and blotted out its light. This was the first direct image of an eclipsing binary system that has puzzled astronomers for more than a century. "This is moving us into a realm that radio astronomy has been able to enjoy for decades," says Robert Stencel, an astronomer at the University of Denver in Colorado and a co-author on the paper.

From the start, radio astronomers have enjoyed several advantages over optical observers. Earth's atmosphere doesn't blur radio waves as it does the shorter wavelengths of light. Moreover, radio signals gathered at separate dishes can be digitized, transmitted electronically, then recombined into an interference pattern — the basis of a high-resolution image. This ease of handling has allowed radio astronomers to amalgamate data from dishes all over the globe, creating virtual arrays with baselines as wide as Earth itself. But with optical interferometry, astronomers must intertwine the faint light beams in real time by routing them through tunnels with nanometre-level precision. They also have to counteract the effects of atmospheric blurring using a complex technology called adaptive optics. And because many optical arrays use relatively small telescopes, they have trouble gathering enough light to study anything but bright stars nearby.

Technical advances

Even with those constraints, optical interferometry has yielded new insights about stars, such as how binary systems swap mass and how stars bulge when they spin. Now, astronomers are pushing the technique by combining light from more than two telescopes. Multiple beams not only make data collection more efficient — more photons are caught and used — they also provide cross-checks on the data, making it easier to build up an image from the interference pattern.
CHARA first demonstrated [2] a four-beam combiner in 2007, and next year it plans to try for a record six beams at once. The advances are turning once-difficult experiments into more routine operations, opening up the process to astronomers who are not experts in interferometry. "Now we're getting more general users," says Françoise Delplancke, head of the interferometry group for the European Southern Observatory's Very Large Telescope Interferometer (VLTI) in Chile. The number of science papers based on optical interferometry has surged as well, from 9 in 1999 to 56 last year. The VLTI, the focus of European support, is responsible for about half of those.

The support for US facilities is more fragmented. CHARA is a university-run operation supported by the National Science Foundation. A potential rival, the Magdalena Ridge Observatory in New Mexico, has run into delays because of funding problems. A NASA-supported interferometer involving the twin 10-metre Keck telescopes in Hawaii was supposed to achieve VLTI-like capabilities with the addition of four to six small 'outrigger' telescopes. But the auxiliary project was derailed in 2006 over environmental and cultural concerns about building new telescopes on the Mauna Kea summit.

Yet not everyone has given up on Mauna Kea, which holds the largest concentration of huge telescopes on Earth. Guy Perrin, an astronomer at the Paris Observatory and principal investigator for the Optical Hawaiian Array for Nano-radian Astronomy (OHANA), is connecting the seven large telescopes at the summit into an array with a baseline of 800 metres. As a proof of principle, Perrin has already combined light from the two Keck telescopes via inconspicuous optical fibres, which would obviate the need to connect the telescopes with tunnels [3]. Reached by telephone atop the summit last week, Perrin was busy implementing a second stage — a fibre-optic link to connect the Gemini North telescope to the Canada-France-Hawaii Telescope. Down in Chile, at the VLTI, he is helping to develop integrated optics, which would combine beams efficiently on tiny silicon chips rather than in large, complicated rooms. Although the technological hurdles to the OHANA project are still high, Perrin says that a bigger problem could be getting all the Mauna Kea observatories to simultaneously offer up their telescope time — a precious and fiercely guarded resource. "It will be easier to convince the communities that are behind the telescopes," says Perrin, "if we first demonstrate that interferometry is a big player in science today." (A rough diffraction-limit check of the resolution figures quoted above for CHARA follows the references.)

References
1. Kloppenborg, B. et al. Nature 464, 870-872 (2010).
2. Monnier, J. D. et al. Science 317, 342-345 (2007).
3. Perrin, G. et al. Science 311, 194 (2006).
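A back-of-envelope check of the resolution figures quoted above, added here for illustration (the observing wavelengths are assumptions, not values given in the article): the diffraction-limited resolution of an interferometer with baseline $B$ is roughly $\theta \approx \lambda / B$, and that of a single mirror of diameter $D$ is $\theta \approx 1.22\,\lambda / D$. For CHARA's 330-m equivalent baseline observing in the near-infrared ($\lambda \approx 1.6\,\mu\mathrm{m}$), $\theta \approx 1.6\times10^{-6} / 330 \approx 5\times10^{-9}$ rad, about 1 milliarcsecond. For Hubble's 2.4-m mirror in visible light ($\lambda \approx 0.55\,\mu\mathrm{m}$), $\theta \approx 1.22 \times 0.55\times10^{-6} / 2.4 \approx 2.8\times10^{-7}$ rad, about 58 milliarcseconds. The ratio, roughly 50–60, is consistent with the "more than 50 times better" comparison quoted in the article.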
  • Charities warm to climate
    Thompson Osuri L - Nature 464(7290):821 (2010)
    Philanthropic support for climate-change issues tripled in 2008. Global steps to battle climate change might have faltered, but philanthropic institutions in the United States have swung into action, more than tripling their support for climate-related causes in 2008. Donations jumped from the 2007 total of US$240 million to $897 million in 2008 (see 'Climate concern'), according to a report from the Foundation Center, an organization that supports philanthropies, in New York.

The funding is going to a range of activities, including efforts to reduce greenhouse-gas emissions and to prepare cities for warmer temperatures and higher sea levels. Foundation money is also supporting academic researchers studying the effects of climate change and ways to reduce pollution. In 2008, for example, the Rockefeller Foundation in New York gave a grant to Stanford University in California for studies on how agriculture could adapt to a changing climate. The ClimateWorks Foundation in San Francisco, California, is supporting research around the world, including a grant to Wang Lan, a materials scientist at the China Building Materials Academy in Beijing, who is working to reduce greenhouse-gas emissions from cement production.

The vast majority of the increase in 2008 came from the William and Flora Hewlett Foundation in Menlo Park, California, which gave a total of $549 million. Hewlett's donations included a one-time contribution of $500 million to ClimateWorks, which aims to help countries limit carbon dioxide concentrations in the atmosphere to less than 450 parts per million. Many other foundations also bumped up their spending. All told, 267 foundations other than Hewlett distributed 1,578 grants for climate change, representing a 45% increase in their giving compared with 2007, according to the Foundation Center report, which is entitled Climate Change: The U.S. Foundation Response.

A generational change may account for part of the sudden generosity. Baby boomers are showing more concern about climate change than previous generations did, says Rachel Leon, executive director of the Environmental Grantmakers Association in New York, a trade group of environmentally focused foundations. These people are now starting to set up their own foundations with a strong emphasis on climate change. The efforts of the foundations pale next to commercial investment in clean energy — $173 billion in 2008 and $162 billion last year, according to market analysts Bloomberg New Energy Finance in London. But foundations can fund projects regardless of their potential pay-off, says Ethan Zindler, the company's head of US research. "They view it as a social imperative," he says.

ClimateWorks, for example, collaborates with smaller foundations around the world on projects including the development of vehicle-fuel standards in India and appliance standards in China. Other efforts aim to help developing countries adapt to change. Under a five-year, $70-million commitment in 2007, Rockefeller established the Asian Cities Climate Change Resilience Network, which focuses on helping smaller cities, such as Surat in India, to make growth decisions that help them survive a shifting climate. "We are not really an environmental foundation but a poverty-reduction foundation. But we see a connection between them," says Cristina Rumbaitis del Rio, an associate director at the Rockefeller Foundation. None of the foundations contacted by Nature would say what it plans to give in 2010.
Because Hewlett will not repeat its $500-million, one-time donation, the total foundation support for climate-related causes is likely to drop from its 2008 high, but Steven Lawrence, the director of research for the Foundation Center and the author of the new report, expects funding this year to surpass the 2007 amount. "My expectation is to continue to see growth in giving."
  • US seeks to make science free for all
    Butler D - Nature 464(7290):822 (2010)
    Moves to make research funded by the US government available to everyone could mark a turning point in a publishing revolution. Declan Butler reports.

The push to open up scientific knowledge to all looks set to go into overdrive. Over the past decade, the accessibility offered by the Internet has transformed science publishing. Several efforts have already tried to harness the web's power to make research papers available for free. Now two parallel efforts from the US government could see almost all federally funded research made available in free, publicly accessible repositories.

Traditional science publishing relies on institutions and libraries buying subscriptions and site licences to academic journals. Some 'open-access' publishers, such as the non-profit Public Library of Science (PLoS), make papers free to readers immediately and try to cover the costs of peer review and publication by charging authors a fee. But author-pays business models are still in their infancy, and the papers they produce account for only a fraction of the literature. The US government and many other research funders are largely taking a different tack — one that can instantly make huge numbers of scientific articles publicly available after a certain delay. Increasingly, they are making it a condition of funding that when scientists publish in a peer-reviewed subscription journal they must place a copy of their paper in a free, publicly accessible database. Such archives, however, mostly contain the authors' final version of the manuscript rather than the published version of record available on the publisher's website.

The argument that everyone should have free access to the fruits of taxpayer-funded research has proved popular with lawmakers keen to reap the benefits of investment in science. And distributing results as widely as possible is predicted to produce socioeconomic gains, such as helping doctors keep up with medical research. "The notion of open government and open access has taken a firm hold," says John Hawley, executive director of the American Society for Clinical Investigation in Ann Arbor, Michigan. "If that means public-access mandates, so be it."

Public access was boosted in late 2007, when the US Congress passed a bill making it compulsory for scientists funded by the National Institutes of Health (NIH) to deposit their papers in the agency's PubMed Central archive within 12 months of publication. The agency had introduced a voluntary policy in 2005, but the idea flopped when scientists showed little interest in depositing their articles. Since the measure became compulsory, submissions to PubMed Central and use of the archive have skyrocketed (see 'Where freedom grows'). PubMed Central now holds nearly 2 million articles, and on a typical weekday some 420,000 users between them download about 750,000 articles. In recent years similar mandates have been imposed by research funders in other countries, including the Wellcome Trust — Britain's largest research charity — all the UK government's research councils and the European Research Council. In the United States, two recent proposals could see a policy similar to that of the NIH soon cover most federally funded research.
The Federal Research Public Access Act (FRPAA), a bill reintroduced in the Senate in June last year by Joseph Lieberman (Independent, Connecticut) and John Cornyn (Republican, Texas), would apply to all research funded by federal agencies with annual research budgets of more than $100 million, with a few exceptions such as classified research. The House could consider the bill within months. Meanwhile, a six-week public consultation on whether and how public-access policies might be implemented ended on 21 January. Organized by the White House's Office of Science and Technology Policy (OSTP), the consultation has sparked intense speculation that President Barack Obama might soon sign an executive order bringing a policy covering similar ground to the FRPAA into force. That order might also dispense with the $100-million budget cap, but, being an executive order, it would be more vulnerable than a federal law to being overturned by a future administration.

Fledgling model

The various public initiatives enjoy wide support among leaders of research agencies, universities, libraries and research charities. A broad consensus on the need to enable public access to all US federal research emerged in a report published in January by the Scholarly Publishing Roundtable, a panel of librarians, academic leaders and publishers convened last June by the OSTP and the House Committee on Science and Technology. The report recommended that archiving policies should not damage commercial and not-for-profit scholarly publishing businesses. As with the NIH mandate, it says that publishers should be allowed to delay archiving an article for several months or more after it is published, so that they don't lose business from their paying subscribers.

Some publishers aren't satisfied. One panel member, YoungSuk Chi, vice-chairman and managing director of global academic and customer relations for Amsterdam-based Elsevier, dissented from the report, saying that it supports "an overly expansive role of government and advocates approaches to the business of scholarly publishing that I believe are overly prescriptive". In a joint statement to the OSTP, the Association of American Publishers (AAP) and the Washington DC Principles Coalition for Free Access to Science — which represents society publishers — slammed NIH-style mandates as "a means for facilitating international piracy", saying that they would "damage the very institutions that researchers, the public and government itself rely on to peer review, publish, disseminate and preserve scientific information". The statement argued that the government should instead make research results available as summaries, reports and data.

Many of these organizations' members, however, already have policies allowing scientists to deposit their own versions of manuscripts in free public archives, and some allow them to post a copy of the final published version. Many journals, including Nature, also help authors fulfil institutional mandates by depositing articles in PubMed Central on the authors' behalf. Allan Adler, the AAP's vice-president of government and legal affairs, says that its message is being heard in Washington and that he expects the two US proposals to "get more careful consideration than did the NIH mandate". One member of the AAP has explicitly distanced itself from the organization's stance, however.
Mike Rossner, executive director of Rockefeller University Press in New York, wrote to Bart Gordon (Democrat, Tennessee), chairman of the House Committee on Science and Technology, on 31 March saying: "We strongly support the efforts of the federal government, such as the NIH mandate and the Federal Research Public Access Act, to provide public access to the results of federally funded research." Mark Patterson, director of publishing at PLoS's European office in Cambridge, UK, says that although the roundtable's proposals would "significantly improve" access, they don't go far enough. He argues that bills such as the FRPAA should specifically support models in which authors' fees allow articles to become freely available the moment they are published.

For now, mandates seem to be the tool of choice for governments and funders to engineer greater public access, whereas the author-pays method remains a fledgling business model. Publishers such as PLoS and the for-profit BioMed Central, which in 2008 was bought by international publisher Springer, based in Germany, have only recently shown that their author-pays model can be sustainable for at least some forms of journal (see 'Opening up'). But the model has proved unable to generate the investment needed for highly selective journals or for those that provide substantial amounts of editorial added value, such as reviews. A growing number of funders are paying author fees on behalf of the scientists they support, but this approach is still far from becoming mainstream.

In a bid to change that, five large US research centres, including Harvard University and the Massachusetts Institute of Technology, both in Cambridge, Massachusetts, launched the Compact for Open-Access Publishing Equity in September 2009 to encourage more funders and institutions to pay author fees. This could "reduce the risk to publishers of moving to an open-access business model", says Stuart Shieber, who heads Harvard's Office for Scholarly Communication and is one of the drivers behind the initiative. Matthew Cockerill, managing director of BioMed Central, welcomes the move. "The Compact members are actively thinking about how to bring about a sustainable change in how their scholarly output is communicated, and are beginning to set up the necessary funding channels to facilitate this," he says.

Hybrid vigour

Three more institutions, including Columbia University in New York, signed up to the Compact last December. The funds created by the Compact's founding institutions are small, however, and researchers have so far been slow to tap into them. But some fear that the Compact's policies could slow the transition to greater open access because they explicitly discourage paying author fees to 'hybrid journals'. These subscription journals — such as The EMBO Journal, published by Nature Publishing Group — give authors the option to pay a fee to make an individual article open access. Shieber says he is open to revising the policy, but adds that it is motivated by a belief that scarce author fees should go first to pure open-access journals. He also notes concerns that some subscription journals are charging open-access fees while also making money from subscriptions. To ease those worries, some publishers, including Oxford University Press and Nature Publishing Group, modify the subscription prices of hybrid journals in response to open-access uptake.
ADVERTISEMENT "The hybrid model is far less risky than betting on a full author-pays business model," says Philip Davis, a graduate student in science publishing at Cornell University in Ithaca, New York. He argues that hybrid journals are a key mechanism to allow subscription-based journals to move to greater open access without jeopardizing their viability. "I'd much prefer a transition in business models, and most hybrid publishing models allow for this transition." One problem is that little research has been done to explore how a transition to greater open access would best be designed, says Mark McCabe, an economist at the University of Michigan in Ann Arbor. "An ideal future does not consist of only open-access journals, but rather a mix of open-access, subscription-based and perhaps hybrid journals," he says. Patrick Labelle, a librarian at the University of Ottawa, Canada, which is a member of the Compact, is convinced that open access will win out over conventional scholarly publishing. "The rapid pace that we have seen in the past few years by institutions, granting agencies, publishers and researchers is indicative that change is upon us," he says. "Open access will, one day, prevail over traditional publishing models." There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements.
  • What's in a name? Fly world is abuzz
    Dalton R - Nature 464(7290):825 (2010)
    Proposed reorganization of Drosophila fruitfly genus might throw out its most celebrated member.

The star subject of genetic research — the Drosophila melanogaster fruitfly — may lose its name. This is an anticipated repercussion of a decision last week by the London-based International Commission on Zoological Nomenclature. It had spent more than two years debating a petition that would have protected the hallowed name while opening the way to a major reorganization of the Drosophila genus, which includes at least 1,450 species. The commission, which oversees the naming of all species, rejected the petition, setting the stage for a likely renaming of D. melanogaster and hundreds of related species.

Among biologists who study various fruitfly species to link genes to traits, the 1 April ruling was no joke. "Oh my God," says Therese Markow, a geneticist at the University of California, San Diego, who was reached in the Sonoran Desert, where she was collecting fruitflies. Markow, who is director of the university's Drosophila Species Stock Center, added that extensive name changes could "wreak havoc" in the Drosophila literature and databases.

The naming debate began when a US scientist filed a petition with the commission to designate D. melanogaster as the Drosophila type species — the accepted standard of the genus (see Nature 457, 368; 2009). Kim van der Linde, an ecologist at Florida State University in Tallahassee, wanted to ensure that the name D. melanogaster would not change if the genus were divided, as she and other scientists advocate. The genus is extremely large, and genetic data suggest that some of its member species are more closely related to flies outside the genus than they are to other Drosophila species. In the end, the commission voted 23 to 4 to reject van der Linde's petition. The designated type species will continue to be Drosophila funebris, described in 1787 by Johann Fabricius. But the proposal forced the taxonomic world to face the possibility that the genus in its present form may be untenable.

In their written opinions, commission members gave several reasons for voting against the new proposal. Many called it premature because the science about the organization of the Drosophila genus remains unsettled. Others sought to limit the naming disruptions that would occur if the genus were split. Drosophila melanogaster fits within a subgenus called Sophophora, which includes some 350 members. Splitting this group off to form a new genus would require fewer renamings than would be needed if D. melanogaster became the type species for Drosophila. In that case, roughly 1,100 species would be pushed off into new genera. "It was very difficult for the commissioners," says Ellinor Michel, the commission's executive secretary. "It was a question of celebrity, as everyone knows D. melanogaster."

If a researcher were to use current data to publish a revision of the Drosophila genus, D. melanogaster would probably become Sophophora melanogaster. Van der Linde says that if she and her co-authors from the petition can agree, they may present the case for the change. "Something needs to happen," she says. But even if the celebrity fly is renamed, Michel noted, it may still be referred to by its original name.
  • Animals thrive without oxygen at sea bottom
    Fang J - Nature 464(7290):825 (2010)
    Creatures found where only microbes and viruses were thought to survive.

Living exclusively oxygen-free was thought to be a lifestyle open only to viruses and single-celled microorganisms. A group of Italian and Danish researchers has now found three species of multicellular animal, or metazoan, that apparently spend their entire lives in oxygen-starved waters in a basin at the bottom of the Mediterranean Sea. The discovery "opens a whole new realm to metazoans that we thought was off limits", says Lisa Levin, a biological oceanographer at Scripps Institution of Oceanography in La Jolla, California.

Roberto Danovaro from the Polytechnic University of Marche in Ancona, Italy, and his colleagues pulled up the animals during three research cruises off the south coast of Greece. The species, which have not yet been named, belong to a phylum of tiny bottom-dwellers called Loricifera. Measuring less than 1 millimetre long, they live at a depth of more than 3,000 metres in the anoxic sediments of the Atalante basin, a place so little explored that Danovaro likens his team's sampling to "going to the Moon to collect rocks". Researchers have previously found multicellular animals living in anoxic environments, but Danovaro says that it was never clear whether those animals were permanent residents. The new loriciferans, which he and his team reported this week (R. Danovaro et al. BMC Biol. doi:10.1186/1741-7007-8-30; 2010), seem to "reproduce and live all their life in anoxic conditions", he says.

The researchers identified an adaptation that helps these loriciferans to survive in their environment. Instead of mitochondria, which rely on oxygen, the creatures have organelles that resemble hydrogenosomes, which some single-celled organisms use to produce energy-storing molecules anaerobically. Angelika Brandt, a deep-sea biologist at Germany's Zoological Museum in Hamburg, says that the work by Danovaro's group is "highly significant". The discovery of metazoans living without mitochondria and oxygen, she says, suggests that animals can occupy niches that once seemed too extreme.
  • Correction
    - Nature 464(7290):825 (2010)
    The News Feature 'The human race' (Nature 464, 668–669; 2010) misspelt the name of the architect of whole-genome shotgun sequencing. It should be Gene Myers. This error has been corrected online in the HTML and PDF versions of this story.

  • Ford M - Nature 464(7290):826 (2010)
    The explosion in commercial archaeology has brought a flood of information. The problem now is figuring out how to find and use this unpublished literature, reports Matt Ford.

Archaeologists are used to gathering data by scratching in the dirt. But when Richard Bradley set out to write a new prehistory of Britain in 2004, he unearthed his most important finds while wearing sandals and a sweater rather than work boots and a hard hat. Bradley is one of a growing number of academics in the United Kingdom who are doing their digging in the masses of unpublished 'grey literature' generated when commercial archaeologists are brought in to excavate before any sort of construction. Bradley, a professor at the University of Reading, travelled around the country, visiting the offices of contract archaeological teams and local planning officials. There, he unearthed dozens of reports showing that settlements in England had remained strong during the Bronze Age and had not suffered a population crash, as academics had long thought. "I became aware that what I was teaching would be out of date without looking at the grey literature," says Bradley.

For the past 20 years, Britain has been at the centre of a revolution in the funding and practice of archaeology. The shift was spurred by a 1990 change in policy that requires local governments to consider how construction projects will affect archaeological remains. That policy has essentially forced public and private entities to pay for archaeological assessments before they start laying a road, constructing an office building or engaging in other projects that disturb the ground. In many ways the law has achieved its aim, helping to preserve relics that otherwise would have been destroyed. But at the same time, it has created problems for academics, who have struggled to keep up with the avalanche of new data, which some argue are hard to access. Similar concerns have emerged in other countries that have enacted equivalent laws. But it's in the crowded British Isles — with their densely packed archaeological record and rapid pace of development — where the effect has been particularly profound. "There is such a vast body of untapped stuff out there," says Barry Cunliffe, an emeritus professor of European archaeology at the University of Oxford. "This means there is a hold-up in academic development and the way in which the public are able to understand and appreciate archaeology."

The contractors disagree. Commercial work now accounts for 93% of the archaeological research done in the United Kingdom, and academics must take note of the data generated by contract units, says Kenneth Aitchison, head of projects and professional development at the Institute for Archaeologists, the body representing commercial archaeologists in Britain. "This is the mainstream," he says. "The ones who complain are missing out." Academic archaeologists are used to a system in which researchers conduct excavations and then publish their observations in monographs and journal articles, which are then available in libraries. But now the results of most excavations get written up for clients and local government planners, and are then held in private offices or local government buildings.

Skeleton in the closet?

Some academics, such as Bradley, are thankful for the new source of data and have responded by working more closely with commercial units.
But others argue that fundamental changes are necessary to prevent market forces from letting down the archaeology. "Where there is a real limitation is that the reports aren't necessarily publicly available," says Cunliffe. "It ought to be made mandatory that all these reports should be made available to the public. Sometimes a unit may say 'I'm sorry, my client is not prepared to make such and such a report public'." Cunliffe says that his research, such as that on Iron Age Britain, has been affected by difficulties obtaining grey literature. Because these reports are not held in libraries, they are unavailable as inter-library loans, making it necessary to travel to read them. "To go through all the records in all the units across the country would have taken years to do and just wasn't feasible in the context of writing a book," he says. But that is where an increasing amount of archaeological information is stored (see 'Careers in ruins'). Statistics are limited, but it is estimated that in 2003–04, private developers sponsored the vast majority of UK archaeology, spending £144 million (US$220 million), compared to around £19 million spent by the central government and the European Union, and around £25 million by local governments [1].

Knowledge is power

Cunliffe's views are not unique. Another vocal critic has been Gary Lock, also at the University of Oxford. He wrote in the magazine British Archaeology in 2008 that while he was studying part of Oxfordshire, several developer-funded projects had been carried out nearby and he had been unable to access the reports. "Archaeological information is being treated as a commodity to which developers control access," he wrote [2]. Even if academics can locate reports, that doesn't necessarily resolve all their concerns. "I know that some bits of grey literature I have seen are barely worth the paper they are printed on," says Cunliffe. Commercial reports contain broadly the same sort of data as academic reports — interpretations of features, finds and chronology. But the reports are composed in response to planning requirements, as opposed to research questions. There have also been complaints about the quality of some commercial archaeological work.

Representatives of that industry, however, argue that commercial units are advancing archaeology and that academics must keep pace. The grey literature "does what it is supposed to do, and it is essentially accessible", says Aitchison. Aitchison argues that the issue is not one of access, but rather of awareness, attitude and understanding. The major archaeological contractors — including Oxford Archaeology and Wessex Archaeology in Salisbury — are run as charitable trusts with educational aims. They forge links with universities and are working to get their grey literature online, says Aitchison. More material is published on the Internet than ever before, and the situation has improved massively in the past few years, he adds.

Laws requiring developers to fund archaeology operate to varying extents across Europe and North America, with similar debates taking place wherever they are applied. "I reject the concept that grey literature is unpublished," says Deni Seymour, who worked in US contract archaeology for more than 25 years and for a decade co-ran the Lone Mountain Archaeological Services in Albuquerque, New Mexico. Grey literature, she says, "is no less available than many obscure journals and master's theses".
According to Seymour, "some academics and mainstream researchers think they are above looking at contract work and they don't value it. In reality, I think many of them are five or more years behind with respect to new discoveries, concepts, method and theory." Although opinions are strong on both sides about working practices and management issues, a consensus is emerging that grey literature contains important information and must be used. "It is time for all scholars to engage their colleagues more widely and meaningfully, and to bridge the divide between the academic and the professional sectors," says Seymour, echoing many of her colleagues in and outside universities. "A lot of the best work is coming out of commercial units now — a lot of the worst is as well." Ireland is showcasing a potential way to make the grey literature more easily available. A government-sponsored programme called Irish National Strategic Archaeological Research funds projects to synthesize grey data. In England, the Archaeological Data Service in York and Bournemouth University's Archaeological Investigations Project are working to put grey literature online. But finding money in the teeth of a recession, with UK universities already facing massive cuts, seems unlikely. Academics who have dug into the grey literature say it can transform ideas about the past. Bradley's work, for example, turned the standard view of late Bronze Age Britain and Ireland on its head. In the late 1980s, academics had concluded that the population in the British Isles had dropped markedly during the late Bronze Age. But since then, professional archaeologists have unearthed so many settlements from that period that "no one mentions a population decline any more", says Bradley. The other Roman Britain Michael Fulford, one of Bradley's colleagues at the University of Reading, has been piloting a study of the grey literature about Roman Britain, with similarly exciting results. "We've almost found 'another Roman Britain'," he says, "one that we would have never seen without developer-funded archaeology." Previously British Roman archaeology had tended to be biased towards excavating high-status sites such as villas, as these were what researchers had chosen to investigate. But commercial excavations happen wherever developers are planning to break ground, and so provide a wider sampling of the past. ADVERTISEMENT By embarking on a "massive photocopying campaign", Fulford assimilated huge amounts of data, representing a massive increase in both the number and type of sites now known. His study revealed the other side of Roman society. The low-status rural settlements showed how indigenous communities coexisted with Roman invaders, by keeping much of their vernacular architecture, but furnishing their homes with Roman manufactured goods. "A lot of the best work is coming out of commercial units now — a lot of the worst is as well, but you can say that about universities, quite frankly," says Fulford. He advises PhD students who want to keep their hand in fieldwork that they might be better off working in commercial archaeology because it often involves large projects that are properly funded. "A lot of my contemporaries feel disenfranchised, but then that's too bad," says Fulford. "Despite the difficulties, we have to adapt to an archaeological record that is massively expanded and, at its best, of far better quality than has been achieved by academics, who are often very part-time fieldworkers." * References * Hinton, P. 
& Jennings, D. in Quality Management in Archaeology (eds Willems, W. J. H. & van den Dries, M. H.) 100-112 (Oxbow, 2007). * Lock, G.Brit. Archaeol.101, 36-37 (July/August 2008). There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements.
  • Protein folding: The dark side of proteins
    Schnabel J - Nature 464(7290):828 (2010)
    Almost every human protein has segments that can form amyloids, the sticky aggregates known for their role in disease. Yet cells have evolved some elaborate defences, finds Jim Schnabel.

Of all the ways that proteins can go bad, becoming an amyloid is surely one of the worst. In this state, sticky elements within proteins emerge and seed the growth of sometimes deadly fibrils. Amyloids riddle the brain in Alzheimer's disease and Creutzfeldt–Jakob disease. But until recently it has seemed that this corrupt state could threaten only a tiny fraction of proteins. Research is now hinting at a more unsettling picture. In work reported in February, a team led by David Eisenberg at the University of California, Los Angeles, sifted through tens of thousands of proteins looking for segments with the peculiar stickiness needed to form amyloid [1]. They found, says Eisenberg, that "effectively all complex proteins have these short segments that, if exposed and flexible enough, are capable of triggering amyloid formation".

Not all proteins form amyloids, however. The 'amylome', as Eisenberg calls it, is restricted because most proteins hide these sticky segments out of harm's way or otherwise keep their stickiness under control. His results and other work suggest that evolution treats amyloids as a fundamental threat. Amyloids have been found in some of the most common age-related diseases, and there is evidence that ageing itself makes some amyloid accumulation inevitable. It now seems as though the human body is perched precariously above an amyloidal abyss. "The amyloid state is more like the default state of a protein, and in the absence of specific protective mechanisms, many of our proteins could fall into it," says Chris Dobson, a structural biologist at the University of Cambridge, UK. Several laboratories are now trying to find ways to supplement or boost these protective mechanisms, in the hope of treating or preventing a host of amyloid-linked diseases. "Advances in understanding amyloids could lead to a powerful new class of medicines for many age-related conditions," says Sam Gandy, a neurobiologist and clinician at Mount Sinai School of Medicine in New York.

Fibrils abound

The recent work on amyloids has partially confirmed a prediction made 75 years ago by the British biophysicist William Astbury. Proteins start as linear chains of amino acids, but most then fold into complex, three-dimensional, 'globular' shapes. Astbury proposed that almost any globular protein could be made to form dysfunctional fibrils by damaging — or 'denaturing' — it with heat or chemicals. By the 1980s, researchers had come to understand that these artificially induced fibrils had the same peculiar structure seen in disease-linked amyloids, such as the amyloid-β deposits in the brains of people with Alzheimer's disease. But the wider potential of proteins to naturally form this basic structure was not seen right away. "The previous paradigm was that the whole protein unfolded and then refolded into a fibrous structure," says Eisenberg. By 1999, it was clear that numerous proteins could be made to form amyloids. Dobson proposed that unfolding exposes an essential stickiness in a protein's backbone of amino-acid chains [2].
Researchers were also linking more and more amyloid-forming proteins to disease, including tau proteins in Alzheimer's disease, α-synuclein in Parkinson's disease, polyglutamine in Huntington's disease, prion protein in Creutzfeldt–Jakob disease and amylin in type 2 diabetes3. Eisenberg and his colleagues studied such proteins using fibril-forming assays and X-ray diffraction techniques and found that their tendency to form amyloids came from specific segments within them4. These segments are typically about six amino acids long, and can be exposed when a protein partly unfolds. These 'amyloidogenic' segments, Eisenberg's team found, have a self-complementary 'steric zipper' structure that lets them mesh very tightly with an identical segment exposed on another protein5. Several of these segments are needed to seed, or nucleate, an amyloid. Segments stack atop one another to form sheets, two of which zip together to form the spine of the fibril. As it grows, the fibril is fringed by the remnants of the segments' host proteins. Eventually, this sprouting fibril breaks to form two smaller fibrils, each of which will grow from both ends again — and so on. "The nucleation event may be rare," Eisenberg says, "but once it starts, you can see how it would spread." In their study1, Eisenberg's team used a computer algorithm to determine when any short protein segment has sufficient steric-zipper-forming potential, based on its predicted three-dimensional structure. After calibrating against known amyloid segments, the team applied the algorithm to the genomes of human, budding yeast and the bacterium Escherichia coli and found that about 15% of the short segments coded by genes in these organisms had this property. "At that rate most proteins contain at least several of these amyloid-prone segments," says Eisenberg. The work helps to clarify in a rigorous way why denaturing a protein often pushes it into the amyloid state, says Jeffery Kelly, a structural biologist and amyloid expert at the Scripps Research Institute in La Jolla, California. "It gives us a better idea of why some proteins have to partially unfold before they can start forming amyloids." Eisenberg, Dobson and others have speculated that the self-complementary stickiness of these short segments might have made them useful building blocks in the earliest stages of life on Earth. Moreover, reports have started to emerge of proteins that function normally in the amyloid state, for example some pituitary hormones6. "We know by now of over two dozen native amyloids, so this state is clearly used by biology in a functional way as well as a dysfunctional way," says Eisenberg. [Figure: Protein segments with a 'steric zipper' structure mesh tightly to form the spine of amyloid fibrils. Image: M. R. Sawaya.] Even so, says Kelly, these native amyloids "are all highly regulated" by, for example, being tucked away inside membrane-bound compartments called vesicles. "That's why biology can use them and not suffer the consequences." Most modern proteins fold into globular structures. But their folding patterns are so complex that they couldn't have evolved by accident. "If you had a machine that could generate protein sequences randomly, you would only rarely get one that can remain stable in the globular, soluble state," Dobson says. Underlying that stability are a variety of evolved mechanisms. When proteins are first synthesized and start to fold, 'chaperone' proteins and related molecules are there to guard against amyloid formation.
Other systems are in place to recognize, sequester and destroy amyloids when they do form. The native folded state offers its own strong protection. Eisenberg's group examined more than 12,000 proteins whose folded, three-dimensional structures are already known. They found that 95% of the predicted amyloid-prone segments within them are buried within the structures of their host proteins, and that those that are exposed are too twisted and inflexible to zip up with partner segments1. "It seems that most proteins have evolved to fold in a way that effectively conceals their amyloid-prone segments," says Eisenberg. So it may have been unnecessary for evolution to get rid of the segments outright. Wear and tear. Yet all these safeguards amount to a defence line that will inevitably be breached. Some mutations and toxins, and the cellular wear and tear associated with ageing, can result in proteins that are less well folded and less protected by chaperoning and disposal mechanisms — and thus more liable to become amyloids. "The 40 or 50 amyloid-associated diseases we've found so far are probably only the ones in which our proteins are the most vulnerable," says Dobson. "If we were to live longer, we might have to contend with more of these conditions." By the same token, even a subtle hindrance of amyloidogenesis with drugs might have a major effect on disease and even on ageing in general. "If we could just enhance the natural protective mechanisms that stabilize a protein," says Dobson, "we could take it back over to the side of the line where it's soluble and stable." Amyloids may not be the prime causes of all the diseases in which they have been found, but, typically, some by-product of the amyloid process is suspected. In Alzheimer's disease, many scientists now believe that small and still-soluble forms of amyloid are the most toxic to brain cells. By contrast, the larger, insoluble fibrils "might even be protective to the extent that they sequester more toxic forms", says Dobson. The general hope is that by preventing or slowing the initial cascade of amyloid formation, the true 'toxic species' of amyloid will be stopped at its source. One anti-amyloid strategy is to use small molecules as extra chaperones to lower the probability that a protein will expose its amyloidogenic segments. FoldRx, a biotech company based in Cambridge, Massachusetts and founded by Kelly and Susan Lindquist of the Massachusetts Institute of Technology in Cambridge, recently demonstrated this principle in a clinical trial against familial amyloid polyneuropathy, a fatal neurodegenerative disease. Eisenberg says that this strategy is unlikely to work well against most amyloid-prone disease proteins, such as amyloid-β, because they are typically too small to stay tightly folded. "For those I think there would be no hope of stabilizing the native structure, because they don't have one," he says. Instead, his group is trying to develop compounds to 'cap' the steric zippers of amyloid fibrils, slowing down their formation in the hope that innate clearance mechanisms can then keep up. A third strategy is to boost the activity of these clearance mechanisms — which, according to work by Kelly's lab, include enzymes that specifically disaggregate amyloids7.
"There's a group of 500–600 genes that protect us when we're young, even if we've been so unlucky as to inherit, for example, a predisposing Parkinson's or Alzheimer's mutation," he says. Finding ways to rejuvenate that system "is what almost our whole lab is working on these days," says Kelly. Jim Schnabel is a freelance writer based in Maryland. * References * Goldschmidt, L. , Teng, P. K. , Riek, R. & Eisenberg, D.Proc. Natl Acad. Sci. USA107, 3487-3492 (2010). * Dobson, C. M.Trends Biochem. Sci.24, 329-332 (1999). * Aguzzi, A.Nature459, 924-925 (2009). * Balbirnie, M. , Grothe, R. & Eisenberg, D. S.Proc. Natl Acad. Sci. USA98, 2375-2380 (2001). * Nelson, R.et al. Nature435, 773-778 (2005). * Maji, S. K.et al. Science325, 328-332 (2009). * Murray, A. N. , Solomon, J. P. , Wang, Y. J. , Balch, W. E. & Kelly, J. W.Prot. Sci.19, 836-846 (2010). There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements.
  • PhD: routine technical work of sequencing is no substitute
    Chen Y - Nature 464(7290):831 (2010)
    You ask in an Editorial whether the Beijing Genomics Institute's strategy of tackling a huge research project by using a large team of new graduates could eventually displace a PhD degree as a model for the training and development of future scientists (Nature464, 7; 2010).
  • PhD: still necessary for independent research leaders
    Li Y - Nature 464(7290):831 (2010)
    Your Editorial poses the question of whether research scientists really need to have a PhD (Nature464, 7; 2010).
  • PhD: time and effort invested foster scientific maturity
    Chatterji BP - Nature 464(7290):831 (2010)
    I question the adequacy of the training in research methods being acquired by the apprentice researchers at China's Beijing Genomics Institute (Nature464, 7; 2010).
  • Food security requires genetic advances to increase farm yields
    Leegood RC Evans JR Furbank RT - Nature 464(7290):831 (2010)
    J. R.
  • How is the Global Green New Deal going?
    Barbier E - Nature 464(7290):832 (2010)
    China and South Korea have invested heavily in environmental stimulus projects. Other G20 countries need to deliver on their sustainability promises to save both the planet and the economy, says Edward Barbier.
  • Correction
    - Nature 464(7290):833 (2010)
    In 'Stop laser uranium enrichment' (F. Slakey & L. R. Cohen Nature464, 32–33; 2010), the article should state that the average fuel cost for nuclear power in 2007 was US$0.0045 per kilowatt-hour (not $0.045 per kilowatt-hour). The other numbers and calculations are correct as printed.
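For a sense of scale, the corrected figure shifts the implied annual fuel bill of a large reactor by a factor of ten. A minimal arithmetic sketch, in which the 1-GW plant size and 90% capacity factor are illustrative assumptions rather than numbers from the correction:

```python
# Order-of-magnitude check on the corrected nuclear fuel cost.
# The 1-GW plant size and 90% capacity factor are illustrative assumptions.

fuel_cost_corrected = 0.0045   # US$ per kWh (corrected value)
fuel_cost_misprint = 0.045     # US$ per kWh (as originally printed)

plant_capacity_kw = 1_000_000  # a nominal 1-GW(e) plant
capacity_factor = 0.9
kwh_per_year = plant_capacity_kw * capacity_factor * 8760

print(f"Annual fuel cost at $0.0045/kWh: ${fuel_cost_corrected * kwh_per_year:,.0f}")
print(f"Annual fuel cost at $0.045/kWh:  ${fuel_cost_misprint * kwh_per_year:,.0f}")
# The tenfold difference is why the decimal place matters for the
# economics discussed in the original Commentary.
```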
  • Why music moves us
    - Nature 464(7290):834 (2010)
    Daniel J. Levitin enjoys a book that explains how rhythm, pitch and timbre are combined, and why the most delightful compositions balance predictability and surprise.
  • Once more into the animal mind
    - Nature 464(7290):835 (2010)
    The study of the mental lives of animals — comparative cognition — is relatively new. Although Charles Darwin suggested in the nineteenth century that mental, as well as morphological, characteristics are subject to natural selection, the study of animal cognition did not take off until the 1970s, the offspring of a partnership between the fields of comparative psychology and animal behaviour.
  • Smithsonian on human origins
    - Nature 464(7290):836 (2010)
    A permanent exhibition exploring what it means to be human opened last month at the Smithsonian National Museum of Natural History in Washington DC. The US$20.
  • Books in brief
    - Nature 464(7290):836 (2010)
    Talent does not stem from our genes alone, argues science writer David Shenk in The Genius in All of Us (Doubleday, 2010). Favouring nurture over nature, he examines the science of genetics, cognition and human development and concludes that top performers are moulded by hard work and circumstance, not their biological blueprint.
  • Correction
    - Nature 464(7290):836 (2010)
    In the picture caption of the Book Review 'Two views of collapse' (Nature463, 880–881; 2010), we wrongly stated that Chaco Canyon's society existed in the first century. It should have read 'eleventh century'.

  • Sonnenburg JL - Nature 464(7290):837 (2010)
    Without the trillions of microbes that inhabit our gut, we can't fully benefit from the components of our diet. But cultural differences in diet may, in part, dictate what food our gut microbiota can digest.

  • Brown TA - Nature 464(7290):838 (2010)
    The sequencing of ancient DNA is generating dramatic results. The sequence from a bone fragment has revealed the existence of an unknown type of extinct human ancestor that lived in Asia 40,000 years ago.

  • Wootton RC Demello AJ - Nature 464(7290):839 (2010)
    Microfluidic devices have many applications in chemistry and biology, but practical hitches associated with their use are often overlooked. One such device that optimizes catalysts tackles these issues head-on.

  • Alfaro M Santini F - Nature 464(7290):840 (2010)
    According to an innovative exercise in 'morphospace analysis', modern fish owe their stunning diversity in part to an ecological cleaning of the slate by the mass extinction at the end of the Cretaceous.

  • Guinan E - Nature 464(7290):842 (2010)
    For more than a century, the binary star system ϵ Aurigae has been a riddle, wrapped in a mystery, inside an enigma. But no more — the system's previously inferred but unseen disk of dust has been detected.

  • Del Grosso SJ - Nature 464(7290):843 (2010)
    Most emissions of nitrous oxide from semi-arid, temperate grasslands occur during the spring thaw. The effects that grazing has on plant litter and snow cover dramatically reduce these seasonal emissions.

  • Workman P Travers J - Nature 464(7290):844 (2010)
    Some cancer cells that become tolerant to a drug remain resistant even after its withdrawal, yet these cells eventually become sensitive to the drug again. The underlying molecular mechanism is unusual.
  • Quantum spin liquid emerging in two-dimensional correlated Dirac fermions
    Meng ZY Lang TC Wessel S Assaad FF Muramatsu A - Nature 464(7290):847 (2010)
    At sufficiently low temperatures, condensed-matter systems tend to develop order. A notable exception to this behaviour is the case of quantum spin liquids, in which quantum fluctuations prevent a transition to an ordered state down to the lowest temperatures. There have now been tentative observations of such states in some two-dimensional organic compounds, yet quantum spin liquids remain elusive in microscopic two-dimensional models that are relevant to experiments. Here we show, by means of large-scale quantum Monte Carlo simulations of correlated fermions on a honeycomb lattice (a structure realized in, for example, graphene), that a quantum spin liquid emerges between the state described by massless Dirac fermions and an antiferromagnetically ordered Mott insulator. This unexpected quantum-disordered state is found to be a short-range resonating valence-bond liquid, akin to the one proposed for high-temperature superconductors: the possibility of unconventional superconductivity through doping therefore arises in our system. We foresee the experimental realization of this model system using ultra-cold atoms, or group IV elements arranged in honeycomb lattices.
  • Bone progenitor dysfunction induces myelodysplasia and secondary leukaemia
    Raaijmakers MH Mukherjee S Guo S Zhang S Kobayashi T Schoonmaker JA Ebert BL Al-Shahrour F Hasserjian RP Scadden EO Aung Z Matza M Merkenschlager M Lin C Rommens JM Scadden DT - Nature 464(7290):852 (2010)
    Mesenchymal cells contribute to the 'stroma' of most normal and malignant tissues, with specific mesenchymal cells participating in the regulatory niches of stem cells. By examining how mesenchymal osteolineage cells modulate haematopoiesis, here we show that deletion of Dicer1 specifically in mouse osteoprogenitors, but not in mature osteoblasts, disrupts the integrity of haematopoiesis. Myelodysplasia resulted and acute myelogenous leukaemia emerged that had acquired several genetic abnormalities while having intact Dicer1. Examining gene expression altered in osteoprogenitors as a result of Dicer1 deletion showed reduced expression of Sbds, the gene mutated in Shwachman–Bodian–Diamond syndrome—a human bone marrow failure and leukaemia predisposition condition. Deletion of Sbds in mouse osteoprogenitors induced bone marrow dysfunction with myelodysplasia. Therefore, perturbation of specific mesenchymal subsets of stromal cells can disorder differentiation, proliferation and apoptosis of heterologous cells, and disrupt tissue homeostasis. Furthermore, primary stromal dysfunction can result in secondary neoplastic disease, supporting the concept of niche-induced oncogenesis.
  • Zscan4 regulates telomere elongation and genomic stability in ES cells
    Zalzman M Falco G Sharova LV Nishiyama A Thomas M Lee SL Stagg CA Hoang HG Yang HT Indig FE Wersto RP Ko MS - Nature 464(7290):858 (2010)
    Exceptional genomic stability is one of the hallmarks of mouse embryonic stem (ES) cells. However, the genes contributing to this stability remain obscure. We previously identified Zscan4 as a specific marker for two-cell embryo and ES cells. Here we show that Zscan4 is involved in telomere maintenance and long-term genomic stability in ES cells. Only 5% of ES cells express Zscan4 at a given time, but nearly all ES cells activate Zscan4 at least once during nine passages. The transient Zscan4-positive state is associated with rapid telomere extension by telomere recombination and upregulation of meiosis-specific homologous recombination genes, which encode proteins that are colocalized with ZSCAN4 on telomeres. Furthermore, Zscan4 knockdown shortens telomeres, increases karyotype abnormalities and spontaneous sister chromatid exchange, and slows down cell proliferation until reaching crisis by passage eight. Together, our data show a unique mode of genome maintenance in ES cells.
  • Molecular mechanism of multivesicular body biogenesis by ESCRT complexes
    Wollert T Hurley JH - Nature 464(7290):864 (2010)
    When internalized receptors and other cargo are destined for lysosomal degradation, they are ubiquitinated and sorted by the endosomal sorting complex required for transport (ESCRT) complexes 0, I, II and III into multivesicular bodies. Multivesicular bodies are formed when cargo-rich patches of the limiting membrane of endosomes bud inwards by an unknown mechanism and are then cleaved to yield cargo-bearing intralumenal vesicles. The biogenesis of multivesicular bodies was reconstituted and visualized using giant unilamellar vesicles, fluorescent ESCRT-0, -I, -II and -III complexes, and a membrane-tethered fluorescent ubiquitin fusion as a model cargo. Here we show that ESCRT-0 forms domains of clustered cargo but does not deform membranes. ESCRT-I and ESCRT-II in combination deform the membrane into buds, in which cargo is confined. ESCRT-I and ESCRT-II localize to the bud necks, and recruit ESCRT-0-ubiquitin domains to the buds. ESCRT-III subunits localize to the bud neck and efficiently cleave the buds to form intralumenal vesicles. Intralumenal vesicles produced in this reaction contain the model cargo but are devoid of ESCRTs. The observations explain how the ESCRTs direct membrane budding and scission from the cytoplasmic side of the bud without being consumed in the reaction.
  • Infrared images of the transiting disk in the ϵ Aurigae system
    Kloppenborg B Stencel R Monnier JD Schaefer G Zhao M Baron F McAlister H Ten Brummelaar T Che X Farrington C Pedretti E Sallave-Goldfinger PJ Sturmann J Sturmann L Thureau N Turner N Carroll SM - Nature 464(7290):870 (2010)
    Epsilon Aurigae (ϵ Aur) is a visually bright, eclipsing binary star system with a period of 27.1 years. The cause of each 18-month-long eclipse has been a subject of controversy for nearly 190 years1 because the companion has hitherto been undetectable. The orbital elements imply that the opaque object has roughly the same mass as the visible component, which for much of the last century was thought to be an F-type supergiant star with a mass of ~15M⊙ (M⊙, mass of the Sun). The high mass-to-luminosity ratio of the hidden object was originally explained by supposing it to be a hyperextended infrared star2 or, later, a black hole3 with an accretion disk, although the preferred interpretation was as a disk of opaque material4, 5 at a temperature of ~500 K, tilted to the line of sight6, 7 and with a central opening8. Recent work implies that the system consists of a low-mass (2.2M⊙–3.3M⊙) visible F-type star, with a disk at 550 K that enshrouds a single B5V-type star9. Here we report interferometric images that show the eclipsing body moving in front of the F star. The body is an opaque disk and appears tilted as predicted7. Adopting a mass of 5.9M⊙ for the B star, we derive a mass of ~(3.6 ± 0.7)M⊙ for the F star. The disk mass is dynamically negligible; we estimate it to contain ~0.07M⊕ (M⊕, mass of the Earth) if it consists purely of dust.
  • 'Memristive' switches enable 'stateful' logic operations via material implication
    Borghetti J Snider GS Kuekes PJ Yang JJ Stewart DR Williams RS - Nature 464(7290):873 (2010)
    The authors of the International Technology Roadmap for Semiconductors1—the industry consensus set of goals established for advancing silicon integrated circuit technology—have challenged the computing research community to find new physical state variables (other than charge or voltage), new devices, and new architectures that offer memory and logic functions1, 2, 3, 4, 5, 6 beyond those available with standard transistors. Recently, ultra-dense resistive memory arrays built from various two-terminal semiconductor or insulator thin film devices have been demonstrated7, 8, 9, 10, 11, 12. Among these, bipolar voltage-actuated switches have been identified as physical realizations of 'memristors' or memristive devices, combining the electrical properties of a memory element and a resistor13, 14. Such devices were first hypothesized by Chua in 1971 (ref. 15), and are characterized by one or more state variables16 that define the resistance of the switch depending upon its voltage history. Here we show that this family of nonlinear dynamical memory devices can also be used for logic operations: we demonstrate that they can execute material implication (IMP), which is a fundamental Boolean logic operation on two variables p and q such that p IMP q is equivalent to (NOT p) OR q. Incorporated within an appropriate circuit17, 18, memristive switches can thus perform 'stateful' logic operations for which the same devices serve simultaneously as gates (logic) and latches19 (memory) that use resistance instead of voltage or charge as the physical state variable.
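Material implication itself is easy to sanity-check in software: p IMP q equals (NOT p) OR q, and richer gates can be composed from IMP plus a FALSE (clear) operation, which is the kind of composition such stateful circuits exploit. The sketch below verifies the Boolean identities on plain bits only; it does not model the devices, drive voltages or resistance states described in the paper.

```python
# Truth-table check of material implication and a NAND built from
# IMP plus FALSE. This models only the Boolean algebra, not the
# memristive devices or drive voltages described in the paper.

def IMP(p, q):
    """Material implication: p IMP q == (NOT p) OR q."""
    return int((not p) or q)

def NAND_via_imp(p, q):
    """Compose NAND from IMP and a cleared working bit s (s = FALSE)."""
    s = 0            # FALSE: clear the working bit
    s = IMP(p, s)    # s becomes NOT p
    s = IMP(q, s)    # s becomes (NOT q) OR (NOT p) == NAND(p, q)
    return s

if __name__ == "__main__":
    print(" p q | p IMP q | NAND via IMP")
    for p in (0, 1):
        for q in (0, 1):
            print(f" {p} {q} |    {IMP(p, q)}    |      {NAND_via_imp(p, q)}")
```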
  • Dislocation nucleation governed softening and maximum strength in nano-twinned metals
    Li X Wei Y Lu L Lu K Gao H - Nature 464(7290):877 (2010)
    In conventional metals, there is plenty of space for dislocations—line defects whose motion results in permanent material deformation—to multiply, so that the metal strengths are controlled by dislocation interactions with grain boundaries1, 2 and other obstacles3, 4. For nanostructured materials, in contrast, dislocation multiplication is severely confined by the nanometre-scale geometries so that continued plasticity can be expected to be source-controlled. Nano-grained polycrystalline materials were found to be strong but brittle5, 6, 7, 8, 9, because both nucleation and motion of dislocations are effectively suppressed by the nanoscale crystallites. Here we report a dislocation-nucleation-controlled mechanism in nano-twinned metals10, 11 in which there are plenty of dislocation nucleation sites but dislocation motion is not confined. We show that dislocation nucleation governs the strength of such materials, resulting in their softening below a critical twin thickness. Large-scale molecular dynamics simulations and a kinetic theory of dislocation nucleation in nano-twinned metals show that there exists a transition in deformation mechanism, occurring at a critical twin-boundary spacing for which strength is maximized. At this point, the classical Hall–Petch type of strengthening due to dislocation pile-up and cutting through twin planes switches to a dislocation-nucleation-controlled softening mechanism with twin-boundary migration resulting from nucleation and motion of partial dislocations parallel to the twin planes. Most previous studies12, 13 did not consider a sufficient range of twin thickness and therefore missed this strength-softening regime. The simulations indicate that the critical twin-boundary spacing for the onset of softening in nano-twinned copper and the maximum strength depend on the grain size: the smaller the grain size, the smaller the critical twin-boundary spacing, and the higher the maximum strength of the material.
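The strength maximum described in this abstract can be pictured with a toy model: a Hall-Petch-like branch that strengthens as twin-boundary spacing shrinks, capped by a nucleation-controlled branch that weakens at very small spacings, with the operative strength set by the weaker mechanism. The functional forms and constants below are invented for illustration and are not the expressions derived in the paper.

```python
import numpy as np

# Toy picture of strength vs twin-boundary spacing (lambda, in nm).
# Both branches and all constants are illustrative, not from the paper.

lam = np.linspace(2, 100, 500)             # twin-boundary spacing, nm

sigma0, k_hp = 0.2, 2.0                    # arbitrary units
hall_petch = sigma0 + k_hp / np.sqrt(lam)  # pile-up-controlled strengthening

a, b = 0.15, 0.05
nucleation = a + b * lam                   # nucleation-controlled branch,
                                           # weaker at very small spacings

strength = np.minimum(hall_petch, nucleation)  # weakest mechanism wins
lam_critical = lam[np.argmax(strength)]

print(f"Toy critical twin spacing ~ {lam_critical:.1f} nm "
      f"(peak strength {strength.max():.2f} arb. units)")
```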
  • Grazing-induced reduction of natural nitrous oxide release from continental steppe
    Wolf B Zheng X Brüggemann N Chen W Dannenmann M Han X Sutton MA Wu H Yao Z Butterbach-Bahl K - Nature 464(7290):881 (2010)
    Atmospheric concentrations of the greenhouse gas nitrous oxide (N2O) have increased significantly since pre-industrial times owing to anthropogenic perturbation of the global nitrogen cycle1, 2, with animal production being one of the main contributors3. Grasslands cover about 20 per cent of the temperate land surface of the Earth and are widely used as pasture. It has been suggested that high animal stocking rates and the resulting elevated nitrogen input increase N2O emissions4, 5, 6, 7. Internationally agreed methods to upscale the effect of increased livestock numbers on N2O emissions are based directly on per capita nitrogen inputs8. However, measurements of grassland N2O fluxes are often performed over short time periods9, with low time resolution and mostly during the growing season. In consequence, our understanding of the daily and seasonal dynamics of grassland N2O fluxes remains limited. Here we report year-round N2O flux measurements with high and low temporal resolution at ten steppe grassland sites in Inner Mongolia, China. We show that short-lived pulses of N2O emission during spring thaw dominate the annual N2O budget at our study sites. The N2O emission pulses are highest in ungrazed steppe and decrease with increasing stocking rate, suggesting that grazing decreases rather than increases N2O emissions. Our results show that the stimulatory effect of higher stocking rates on nitrogen cycling4, 7 and, hence, on N2O emission is more than offset by the effects of a parallel reduction in microbial biomass, inorganic nitrogen production and wintertime water retention. By neglecting these freeze–thaw interactions, existing approaches may have systematically overestimated N2O emissions over the last century for semi-arid, cool temperate grasslands by up to 72 per cent.
  • Seismic evidence for widespread western-US deep-crustal deformation caused by extension
    Moschetti MP Ritzwoller MH Lin F Yang Y - Nature 464(7290):885 (2010)
    Laboratory experiments have established that many of the materials comprising the Earth are strongly anisotropic in terms of seismic-wave speeds1. Observations of azimuthal2, 3 and radial4, 5 anisotropy in the upper mantle are attributed to the lattice-preferred orientation of olivine caused by the shear strains associated with deformation, and provide some of the most direct evidence for deformation and flow within the Earth's interior. Although observations of crustal radial anisotropy would improve our understanding of crustal deformation and flow patterns resulting from tectonic processes, large-scale observations have been limited to regions of particularly thick crust6. Here we show that observations from ambient noise tomography in the western United States reveal strong deep (middle to lower)-crustal radial anisotropy that is confined mainly to the geological provinces that have undergone significant extension during the Cenozoic Era (since ~65 Myr ago)7, 8. The coincidence of crustal radial anisotropy with the extensional provinces of the western United States suggests that the radial anisotropy results from the lattice-preferred orientation of anisotropic crustal minerals caused by extensional deformation. These observations also provide support for the hypothesis that the deep crust within these regions has undergone widespread and relatively uniform strain in response to crustal thinning and extension9, 10, 11.
  • Hierarchical group dynamics in pigeon flocks
    Nagy M Akos Z Biro D Vicsek T - Nature 464(7290):890 (2010)
    Animals that travel together in groups display a variety of fascinating motion patterns thought to be the result of delicate local interactions among group members1, 2, 3. Although the most informative way of investigating and interpreting collective movement phenomena would be afforded by the collection of high-resolution spatiotemporal data from moving individuals, such data are scarce4, 5, 6, 7 and are virtually non-existent for long-distance group motion within a natural setting because of the associated technological difficulties8. Here we present results of experiments in which track logs of homing pigeons flying in flocks of up to 10 individuals have been obtained by high-resolution lightweight GPS devices and analysed using a variety of correlation functions inspired by approaches common in statistical physics. We find a well-defined hierarchy among flock members from data concerning leading roles in pairwise interactions, defined on the basis of characteristic delay times between birds' directional choices. The average spatial position of a pigeon within the flock strongly correlates with its place in the hierarchy, and birds respond more quickly to conspecifics perceived primarily through the left eye—both results revealing differential roles for birds that assume different positions with respect to flock-mates. From an evolutionary perspective, our results suggest that hierarchical organization of group flight may be more efficient than an egalitarian one, at least for those flock sizes that permit regular pairwise interactions among group members, during which leader–follower relationships are consistently manifested.
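The leadership analysis in the pigeon study rests on a simple idea: for each pair of birds, correlate their flight directions while sliding one track in time, and take the delay that maximizes the correlation as the leader-follower lag. The sketch below applies that idea to synthetic headings; the published analysis of the GPS tracks is considerably more elaborate.

```python
import numpy as np

# Minimal directional-correlation-with-delay sketch on synthetic headings.
# A positive best_delay means bird A's direction changes precede bird B's.

rng = np.random.default_rng(0)

def directional_correlation(theta_a, theta_b, delay):
    """Mean cosine of heading difference with bird B shifted by `delay` samples."""
    if delay > 0:
        a, b = theta_a[:-delay], theta_b[delay:]
    elif delay < 0:
        a, b = theta_a[-delay:], theta_b[:delay]
    else:
        a, b = theta_a, theta_b
    return np.mean(np.cos(a - b))

def best_delay(theta_a, theta_b, max_delay=20):
    delays = list(range(-max_delay, max_delay + 1))
    corrs = [directional_correlation(theta_a, theta_b, d) for d in delays]
    return delays[int(np.argmax(corrs))]

# Synthetic example: bird B copies bird A's heading 5 samples later, plus noise.
theta_a = np.cumsum(rng.normal(0, 0.1, 1000))               # leader's heading (rad)
theta_b = np.roll(theta_a, 5) + rng.normal(0, 0.05, 1000)   # follower, lagged

print("estimated leader-follower delay:", best_delay(theta_a, theta_b), "samples")
```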
  • The complete mitochondrial DNA genome of an unknown hominin from southern Siberia
    Krause J Fu Q Good JM Viola B Shunkov MV Derevianko AP Pääbo S - Nature 464(7290):894 (2010)
    With the exception of Neanderthals, from which DNA sequences of numerous individuals have now been determined1, the number and genetic relationships of other hominin lineages are largely unknown. Here we report a complete mitochondrial (mt) DNA sequence retrieved from a bone excavated in 2008 in Denisova Cave in the Altai Mountains in southern Siberia. It represents a hitherto unknown type of hominin mtDNA that shares a common ancestor with anatomically modern human and Neanderthal mtDNAs about 1.0 million years ago. This indicates that it derives from a hominin migration out of Africa distinct from that of the ancestors of Neanderthals and of modern humans. The stratigraphy of the cave where the bone was found suggests that the Denisova hominin lived close in time and space with Neanderthals as well as with modern humans2, 3, 4.
  • Genome-wide SNP and haplotype analyses reveal a rich history underlying dog domestication
    Vonholdt BM Pollinger JP Lohmueller KE Han E Parker HG Quignon P Degenhardt JD Boyko AR Earl DA Auton A Reynolds A Bryc K Brisbin A Knowles JC Mosher DS Spady TC Elkahloun A Geffen E Pilot M Jedrzejewski W Greco C Randi E Bannasch D Wilton A Shearman J Musiani M Cargill M Jones PG Qian Z Huang W Ding ZL Zhang YP Bustamante CD Ostrander EA Novembre J Wayne RK - Nature 464(7290):898 (2010)
    Advances in genome technology have facilitated a new understanding of the historical and genetic processes crucial to rapid phenotypic evolution under domestication1, 2. To understand the process of dog diversification better, we conducted an extensive genome-wide survey of more than 48,000 single nucleotide polymorphisms in dogs and their wild progenitor, the grey wolf. Here we show that dog breeds share a higher proportion of multi-locus haplotypes unique to grey wolves from the Middle East, indicating that they are a dominant source of genetic diversity for dogs rather than wolves from east Asia, as suggested by mitochondrial DNA sequence data3. Furthermore, we find a surprising correspondence between genetic and phenotypic/functional breed groupings but there are exceptions that suggest phenotypic diversification depended in part on the repeated crossing of individuals with novel phenotypes. Our results show that Middle Eastern wolves were a critical source of genome diversity, although interbreeding with local wolf populations clearly occurred elsewhere in the early history of specific lineages. More recently, the evolution of modern dog breeds seems to have been an iterative process that drew on a limited genetic toolkit to create remarkable phenotypic diversity.
  • Human memory strength is predicted by theta-frequency phase-locking of single neurons
    Rutishauser U Ross IB Mamelak AN Schuman EM - Nature 464(7290):903 (2010)
    Learning from novel experiences is a major task of the central nervous system. In mammals, the medial temporal lobe is crucial for this rapid form of learning1. The modification of synapses and neuronal circuits through plasticity is thought to underlie memory formation2. The induction of synaptic plasticity is favoured by coordinated action-potential timing across populations of neurons3. Such coordinated activity of neural populations can give rise to oscillations of different frequencies, recorded in local field potentials. Brain oscillations in the theta frequency range (3–8 Hz) are often associated with the favourable induction of synaptic plasticity as well as behavioural memory4. Here we report the activity of single neurons recorded together with the local field potential in humans engaged in a learning task. We show that successful memory formation in humans is predicted by a tight coordination of spike timing with the local theta oscillation. More stereotyped spiking predicts better memory, as indicated by higher retrieval confidence reported by subjects. These findings provide a link between the known modulation of theta oscillations by many memory-modulating behaviours and circuit mechanisms of plasticity.
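One common way to quantify the spike-theta coordination described here is to band-pass the field potential in the theta range, extract its instantaneous phase with a Hilbert transform, read off the phase at each spike time, and take the mean resultant length of those phases (1 means perfect locking, 0 none). The sketch below does this on synthetic data and is an illustration of the measure, not the paper's analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative spike-LFP phase-locking measure on synthetic data.

fs = 1000.0                         # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)        # 10 s of "recording"
rng = np.random.default_rng(1)

# Synthetic LFP: 6-Hz theta plus noise; synthetic spikes biased to theta troughs.
lfp = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.normal(size=t.size)
spike_times = t[(np.sin(2 * np.pi * 6 * t) < -0.95) & (rng.random(t.size) < 0.05)]

# Band-pass 3-8 Hz and extract the instantaneous phase.
b, a = butter(3, [3 / (fs / 2), 8 / (fs / 2)], btype="band")
theta_phase = np.angle(hilbert(filtfilt(b, a, lfp)))

# Phase at each spike, then mean resultant length (phase-locking strength).
spike_phases = theta_phase[np.round(spike_times * fs).astype(int)]
plv = np.abs(np.mean(np.exp(1j * spike_phases)))

print(f"{spike_phases.size} spikes, phase-locking value = {plv:.2f}")
```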
  • Transfer of carbohydrate-active enzymes from marine bacteria to Japanese gut microbiota
    Hehemann JH Correc G Barbeyron T Helbert W Czjzek M Michel G - Nature 464(7290):908 (2010)
    Gut microbes supply the human body with energy from dietary polysaccharides through carbohydrate active enzymes, or CAZymes1, which are absent in the human genome. These enzymes target polysaccharides from terrestrial plants that dominated diet throughout human evolution2. The array of CAZymes in gut microbes is highly diverse, exemplified by the human gut symbiont Bacteroides thetaiotaomicron3, which contains 261 glycoside hydrolases and polysaccharide lyases, as well as 208 homologues of susC and susD-genes coding for two outer membrane proteins involved in starch utilization1, 4. A fundamental question that, to our knowledge, has yet to be addressed is how this diversity evolved by acquiring new genes from microbes living outside the gut. Here we characterize the first porphyranases from a member of the marine Bacteroidetes, Zobellia galactanivorans, active on the sulphated polysaccharide porphyran from marine red algae of the genus Porphyra. Furthermore, we show tha t genes coding for these porphyranases, agarases and associated proteins have been transferred to the gut bacterium Bacteroides plebeius isolated from Japanese individuals5. Our comparative gut metagenome analyses show that porphyranases and agarases are frequent in the Japanese population6 and that they are absent in metagenome data7 from North American individuals. Seaweeds make an important contribution to the daily diet in Japan (14.2 g per person per day)8, and Porphyra spp. (nori) is the most important nutritional seaweed, traditionally used to prepare sushi9, 10. This indicates that seaweeds with associated marine bacteria may have been the route by which these novel CAZymes were acquired in human gut bacteria, and that contact with non-sterile food may be a general factor in CAZyme diversity in human gut microbes.
  • MONOPTEROS controls embryonic root initiation by regulating a mobile transcription factor
    Schlereth A Möller B Liu W Kientz M Flipse J Rademacher EH Schmid M Jürgens G Weijers D - Nature 464(7290):913 (2010)
    Acquisition of cell identity in plants relies strongly on positional information1, hence cell–cell communication and inductive signalling are instrumental for developmental patterning. During Arabidopsis embryogenesis, an extra-embryonic cell is specified to become the founder cell of the primary root meristem, hypophysis, in response to signals from adjacent embryonic cells2. The auxin-dependent transcription factor MONOPTEROS (MP) drives hypophysis specification by promoting transport of the hormone auxin from the embryo to the hypophysis precursor. However, auxin accumulation is not sufficient for hypophysis specification, indicating that additional MP-dependent signals are required3. Here we describe the microarray-based isolation of MP target genes that mediate signalling from embryo to hypophysis. Of three direct transcriptional target genes, TARGET OF MP 5 (TMO5) and TMO7 encode basic helix–loop–helix (bHLH) transcription factors that are expressed in the hypophysis-adjacent embryo cells, and are required and partially sufficient for MP-dependent root initiation. Importantly, the small TMO7 transcription factor moves from its site of synthesis in the embryo to the hypophysis precursor, thus representing a novel MP-dependent intercellular signal in embryonic root specification.
  • Vascular endothelial growth factor B controls endothelial fatty acid uptake
    Hagberg CE Falkevall A Wang X Larsson E Huusko J Nilsson I van Meeteren LA Samen E Lu L Vanwildemeersch M Klar J Genove G Pietras K Stone-Elander S Claesson-Welsh L Ylä-Herttuala S Lindahl P Eriksson U - Nature 464(7290):917 (2010)
    The vascular endothelial growth factors (VEGFs) are major angiogenic regulators and are involved in several aspects of endothelial cell physiology1. However, the detailed role of VEGF-B in blood vessel function has remained unclear2, 3. Here we show that VEGF-B has an unexpected role in endothelial targeting of lipids to peripheral tissues. Dietary lipids present in circulation have to be transported through the vascular endothelium to be metabolized by tissue cells, a mechanism that is poorly understood4. Bioinformatic analysis showed that Vegfb was tightly co-expressed with nuclear-encoded mitochondrial genes across a large variety of physiological conditions in mice, pointing to a role for VEGF-B in metabolism. VEGF-B specifically controlled endothelial uptake of fatty acids via transcriptional regulation of vascular fatty acid transport proteins. As a consequence, Vegfb-/- mice showed less uptake and accumulation of lipids in muscle, heart and brown adipose tissue, and instead shunted lipids to white adipose tissue. This regulation was mediated by VEGF receptor 1 and neuropilin 1 expressed by the endothelium. The co-expression of VEGF-B and mitochondrial proteins introduces a novel regulatory mechanism, whereby endothelial lipid uptake and mitochondrial lipid use are tightly coordinated. The involvement of VEGF-B in lipid uptake may open up the possibility for novel strategies to modulate pathological lipid accumulation in diabetes, obesity and cardiovascular diseases.
  • Chromatin signature of embryonic pluripotency is established during genome activation
    Vastenhouw NL Zhang Y Woods IG Imam F Regev A Liu XS Rinn J Schier AF - Nature 464(7290):922 (2010)
    After fertilization the embryonic genome is inactive until transcription is initiated during the maternal–zygotic transition1, 2, 3. This transition coincides with the formation of pluripotent cells, which in mammals can be used to generate embryonic stem cells. To study the changes in chromatin structure that accompany pluripotency and genome activation, we mapped the genomic locations of histone H3 molecules bearing lysine trimethylation modifications before and after the maternal–zygotic transition in zebrafish. Histone H3 lysine 27 trimethylation (H3K27me3), which is repressive, and H3K4me3, which is activating, were not detected before the transition. After genome activation, more than 80% of genes were marked by H3K4me3, including many inactive developmental regulatory genes that were also marked by H3K27me3. Sequential chromatin immunoprecipitation demonstrated that the same promoter regions had both trimethylation marks. Such bivalent chromatin domains also exist in embryonic stem cells and are thought to poise genes for activation while keeping them repressed4, 5, 6, 7, 8. Furthermore, we found many inactive genes that were uniquely marked by H3K4me3. Despite this activating modification, these monovalent genes were neither expressed nor stably bound by RNA polymerase II. Inspection of published data sets revealed similar monovalent domains in embryonic stem cells. Moreover, H3K4me3 marks could form in the absence of both sequence-specific transcriptional activators and stable association of RNA polymerase II, as indicated by the analysis of an inducible transgene. These results indicate that bivalent and monovalent domains might poise embryonic genes for activation and that the chromatin profile associated with pluripotency is established during the maternal–zygotic transition.
  • Proviral silencing in embryonic stem cells requires the histone methyltransferase ESET
    Matsui T Leung D Miyashita H Maksakova IA Miyachi H Kimura H Tachibana M Lorincz MC Shinkai Y - Nature 464(7290):927 (2010)
    Endogenous retroviruses (ERVs), retrovirus-like elements with long terminal repeats, are widely dispersed in the euchromatic compartment in mammalian cells, comprising ~10% of the mouse genome1. These parasitic elements are responsible for >10% of spontaneous mutations2. Whereas DNA methylation has an important role in proviral silencing in somatic and germ-lineage cells3, 4, 5, an additional DNA-methylation-independent pathway also functions in embryonal carcinoma and embryonic stem (ES) cells to inhibit transcription of the exogenous gammaretrovirus murine leukaemia virus (MLV)6, 7, 8. Notably, a recent genome-wide study revealed that ERVs are also marked by histone H3 lysine 9 trimethylation (H3K9me3) and H4K20me3 in ES cells but not in mouse embryonic fibroblasts9. Here we show that the H3K9 methyltransferase ESET and the co-repressor KAP110, 11 are required for H3K9me3 and silencing of endogenous and introduced retroviruses specifically in mouse ES cells. Furthermore, whereas ESET enzymatic activity is crucial for HP1 binding and efficient proviral silencing, the H4K20 methyltransferases Suv420h1 and Suv420h2 are dispensable for silencing. Notably, in DNA methyltransferase triple knockout (Dnmt1-/-Dnmt3a-/-Dnmt3b-/-) mouse ES cells, ESET and KAP1 binding and ESET-mediated H3K9me3 are maintained and ERVs are minimally derepressed. We propose that a DNA-methylation-independent pathway involving KAP1 and ESET/ESET-mediated H3K9me3 is required for proviral silencing during the period early in embryogenesis when DNA methylation is dynamically reprogrammed.
  • The kinetics of two-dimensional TCR and pMHC interactions determine T-cell responsiveness
    Huang J Zarnitsyna VI Liu B Edwards LJ Jiang N Evavold BD Zhu C - Nature 464(7290):932 (2010)
    The T-cell receptor (TCR) interacts with peptide-major histocompatibility complexes (pMHC) to discriminate pathogens from self-antigens and trigger adaptive immune responses. Direct physical contact is required between the T cell and the antigen-presenting cell for cross-junctional binding where the TCR and pMHC are anchored on two-dimensional (2D) membranes of the apposing cells1. Despite their 2D nature, TCR–pMHC binding kinetics have only been analysed three-dimensionally (3D) with a varying degree of correlation with the T-cell responsiveness2, 3, 4. Here we use two mechanical assays5, 6 to show high 2D affinities between a TCR and its antigenic pMHC driven by rapid on-rates. Compared to their 3D counterparts, 2D affinities and on-rates of the TCR for a panel of pMHC ligands possess far broader dynamic ranges that match that of their corresponding T-cell responses. The best 3D predictor of response is the off-rate, with agonist pMHC dissociating the slowest2, 3, 4. In contrast, 2D off-rates are up to 8,300-fold faster, with the agonist pMHC dissociating the fastest. Our 2D data suggest rapid antigen sampling by T cells and serial engagement of a few agonist pMHCs by TCRs in a large self pMHC background. Thus, the cellular environment amplifies the intrinsic TCR–pMHC binding to generate broad affinities and rapid kinetics that determine T-cell responsiveness.
  • Double Holliday junctions are intermediates of DNA break repair
    Bzymek M Thayer NH Oh SD Kleckner N Hunter N - Nature 464(7290):937 (2010)
    Repair of DNA double-strand breaks (DSBs) by homologous recombination is crucial for cell proliferation and tumour suppression. However, despite its importance, the molecular intermediates of mitotic DSB repair remain undefined. The double Holliday junction (DHJ), presupposed to be the central intermediate for more than 25 years1, has only been identified during meiotic recombination2. Moreover, evidence has accumulated for alternative, DHJ-independent mechanisms3, 4, 5, 6, raising the possibility that DHJs are not formed during DSB repair in mitotically cycling cells. Here we identify intermediates of DSB repair by using a budding-yeast assay system designed to mimic physiological DSB repair. This system uses diploid cells and provides the possibility for allelic recombination either between sister chromatids or between homologues, as well as direct comparison with meiotic recombination at the same locus. In mitotically cycling cells, we detect inter-homologue joint molecule (JM) intermediates whose strand composition and size are identical to those of the canonical DHJ structures observed in meiosis2. However, in contrast to meiosis, JMs between sister chromatids form in preference to those between homologues. Moreover, JMs seem to represent a minor pathway of DSB repair in mitotic cells, being detected at about tenfold lower levels (per DSB) than during meiotic recombination. Thus, although DHJs are identified as intermediates of DSB-promoted recombination in both mitotic and meiotic cells, their formation is distinctly regulated according to the specific dictates of the two cellular programs.
  • Homotypic fusion of ER membranes requires the dynamin-like GTPase Atlastin
    - Nature 464(7290):942 (2010)
    Nature460, 978–983 (2009) In Figure 5i of this Article, the GTPase activity data contains a calculation error that occurred during the conversion of 2-amino-6-mercapto-7-methylpurine absorbance to μM phosphate. As a result, the activity is overstated. The corrected Figure 5i with the appropriate y axis is shown below. This change does not affect any of the conclusions of the work.
  • A canine distemper virus epidemic in Serengeti lions (Panthera leo)
    - Nature 464(7290):942 (2010)
    Nature379, 441–445 (1996) In this Letter, the received and accepted dates for the manuscript were incorrectly listed as being in 1994, instead of 1995. The correct dates are: received 2 October; accepted 24 November 1995.
  • Transient FTY720 treatment promotes immune-mediated clearance of a chronic viral infection
    Premenko-Lanier M Moseley NB Pruett ST Romagnoli PA Altman JD - Nature 464(7290):942 (2010)
    Nature454, 894–898 (2008) The authors wish to retract this Letter on the grounds that they have been unable to repeat key observations of the paper. Using currently available stocks of virus (LCMV clone 13) and drug (FTY720), they no longer observe that transient drug treatment of infected mice leads to immune-mediated clearance of virus*. Although the authors will continue to investigate the matter, seeking to identify parameters that enable them to repeat the results they reported, they request that Nature retract the paper and regret any adverse consequences that may have resulted from the paper's publication.
  • Memory sticks
    - Nature 464(7290):948 (2010)
    Total recall.

No comments: