Wednesday, March 31, 2010

Hot off the presses! Apr 01 Nature

The Apr 01 issue of Nature is now up on Pubget (About Nature): if you're at a subscribing institution, just click the latest link on the home page. (Note you'll only be able to get all the PDFs in the issue if your institution subscribes to Pubget.)

Latest Articles Include:

  • The human genome at ten
    - Nature 464(7289):649 (2010)
    Nearly a decade on from the completion of the draft sequence of the human genome, researchers should work with the same intensity and focus to apply the results to health.
  • A new row to hoe
    - Nature 464(7289):650 (2010)
    The time is right to revitalize US agricultural research.
  • Climate science: No solar fix
    - Nature 464(7289):652 (2010)
  • Genomics: DNA packaging unravelled
    - Nature 464(7289):652 (2010)
  • Addiction: Junk-food junkies
    - Nature 464(7289):652 (2010)
  • Photonics: Carbon light catcher
    - Nature 464(7289):652 (2010)
  • Neurodevelopment: Baby talk
    - Nature 464(7289):652 (2010)
  • Metabolism: Fat from fructose
    - Nature 464(7289):653 (2010)
  • Ecology: Mothers stress kids out
    - Nature 464(7289):653 (2010)
  • Nanotechnology: Small salt superconducts
    - Nature 464(7289):653 (2010)
  • Neuropsychology: Morality of murder
    - Nature 464(7289):653 (2010)
  • Journal club
    - Nature 464(7289):653 (2010)
  • News briefing: 1 April 2010
    - Nature 464(7289):654 (2010)
    The week in science.
Some of North America's coastal waters were last week designated 'emission control areas'. This means that from August 2012, ships entering those waters will have to cap their emissions of particulate matter and of nitrogen and sulphur oxides — for example, by switching to cleaner diesel fuels. The International Maritime Organization announced the move on 26 March, following a proposal from the United States and Canada to improve coastal and inland air quality.
Climate scientists should make available all the data and methodologies — including raw data and computer codes — that support their work, a UK cross-party parliamentary committee has recommended. The House of Commons science and technology committee report, published on 31 March, looked into the disclosure of climate data revealed by e-mails leaked from the Climatic Research Unit (CRU) at the University of East Anglia in Norwich. It criticized the university for supporting "the culture at CRU of resisting disclosure of information", but found there was no attempt to subvert the peer-review process.
Science advisers to the British government "should not act to undermine mutual trust", according to a set of principles for independent science advice released last week. Formal terms of engagement were called for after the sacking of drugs adviser David Nutt. An early draft had suggested that politicians and science advisers should "reach a shared position" — but that controversial clause was removed after consultation with researchers.
The French government has indefinitely postponed a tax on carbon dioxide emissions after a massive defeat in regional elections. The proposed €17 (US$23) per tonne tax had been due to enter into force on 1 July. France would have been the largest economy to have adopted such a measure, and it was a key part of President Nicolas Sarkozy's environmental plans. Conservative parliamentarians and industrialists feared that the tax would harm the competitiveness of French firms if implemented only in France, and argued that it should be imposed at European Union level.
The United States' leading position in nanotechnology is "threatened by several aggressively investing competitors" such as China, South Korea and the European Union. The verdict came from the President's Council of Advisors on Science and Technology (PCAST) in its 25 March review of the US National Nanotechnology Initiative. Although US public and private nanotech investment grew by 18% a year between 2003 and 2008 to reach US$5.7 billion, the rest of the world's investment grew at 27% annually. And China now applies for more nanotech patents than the United States, although it holds fewer overall.
Ahead of national elections expected in May, the UK government's annual budget, announced on 24 March, revealed nothing of spending plans for science after 2011 — when researchers fear cuts. Chancellor Alistair Darling announced a one-off sum of £270 million (US$407 million) to create 20,000 new university places, a £2-billion public–private green investment fund and support through 'technology and innovation centres' for commercializing research. Prime Minister Gordon Brown later confirmed an expected £250 million for a biomedical research centre in London, and pledged to appoint a minister for life sciences if he is re-elected.
India's government has approved a 60-billion-rupee (US$1.3-billion), 10-year project to establish a nationwide high-speed data communication network. The National Knowledge Network, cleared on 25 March, would link together about 1,500 scientific and educational institutions, allowing scientists and students to share computing facilities, set up virtual classrooms and collaborate on research. Its core physical infrastructure should be completed in two years.
The US Food and Drug Administration, which has had chronic staff recruitment and retention problems, needs to develop a strategic human capital plan, according to a government report issued on 23 March. The Government Accountability Office, the investigative arm of the US Congress, considered a 2009 survey of more than 300 agency managers, of whom 80% said they would welcome extra staff, and just 36% saw agency progress keeping pace with the scientific advances necessary to regulate products.
The pace of deforestation around the world has slowed in the last decade, but it is still alarmingly high, according to a report released on 23 March by the Food and Agriculture Organization of the United Nations. On average, around 13 million hectares of forests were converted to agricultural land or lost through natural causes every year between 2000 and 2010. That compares with 16 million hectares annually in the 1990s. The net annual reduction in forest area over 2000–2010 was 5.2 million hectares (an area about the size of Costa Rica), down from 8.3 million hectares per year in 1990–2000. South America and Africa faced the biggest net losses, but Asia registered a net annual gain, mainly because of large-scale afforestation programmes. The figures come from the 'Global Forest Resources Assessment', a study released every 5 years that now covers 233 countries and territories.
Evolutionary geneticist Francisco Ayala has won this year's £1-million (US$1.5-million) Templeton Prize, an award for those "affirming life's spiritual dimension" that has recently focused on the overlap between science and religion. Ayala, a former president of the American Association for the Advancement of Science, was also ordained as a Dominican priest and is currently professor of biology and philosophy at the University of California, Irvine. Some researchers condemned the US National Academy of Sciences for hosting the award announcement.
In a ruling that has far-reaching implications for gene patenting, a federal judge in New York on 29 March struck down seven patents on genes associated with breast and ovarian cancer. The patents, on the BRCA1 and BRCA2 genes, are held by Myriad Genetics and the University of Utah Research Foundation, both based in Salt Lake City. The judge ruled that the patents were "improperly granted" because they were related to isolated DNA that is a product of nature, and therefore not patentable. The lawsuit was filed in May 2009 by the Public Patent Foundation and the American Civil Liberties Union, both in New York. Myriad says it will appeal the ruling. See go.nature.com/uoSndN for more.
Pharmaceutical giant Pfizer fraudulently marketed an anti-seizure drug, a US federal jury ruled last week. The firm, based in New York, has been ordered to pay US$142 million in damages to the Kaiser Foundation Health Plan and Kaiser Foundation Hospitals, based in Oakland, California. The jury determined that Pfizer had illegally marketed the drug gabapentin (Neurontin) for unapproved indications, including migraines, neuropathic pain and bipolar disorder. An analysis late last year suggested that the company published only those trials of off-label uses that had positive outcomes (S. S. Vedula et al. N. Engl. J. Med. 361, 1963–1971; 2009).
China has become the world's leading clean-energy investor, putting US$34.6 billion into the sector in 2009 — almost double the $18.6 billion invested by the United States, according to a report from the Pew Charitable Trusts in Washington DC. Using data collated by Bloomberg New Energy Finance, the study also shows that Spain invested the highest percentage (0.74%) of its gross domestic product in clean energy, topping the table of investment intensity in the sector.
Scotland's bid to become the 'Saudi Arabia of marine energy' was boosted on 16 March, when ten off-shore sites were leased to firms hoping to build up to 1.2 gigawatts of wave and tidal power by 2020. That doesn't include tidal barrages, which trap huge volumes of water in the manner of hydroelectric dams, and would be a massive amount of capacity for the sector, which has a long way to go before it makes a dent in renewable-energy generation. Wave-bobbing machines and underwater turbines together contribute fewer than 10 megawatts of capacity worldwide. By March this year, only 15 of 116 tidal and wave companies tracked by Redfield Consulting — a renewable-energy consultancy in Monymusk, UK — had designs suitable for commercial application. Britain, Australia, Canada, Norway and South Korea lead interest and investment in the sector, and their coastlines have potential. Theoretically, the Pentland Firth — a strait between the Orkney Islands and northeast Scotland — could generate an average of 4 gigawatts, estimates Ian Bryden, a marine-energy expert at the University of Edinburgh, UK. That's roughly equivalent to four coal-fired power stations. But, Bryden adds, the industry talks of having 2 gigawatts of combined wave and tidal capacity worldwide by 2020 — a target he calls "ambitious".
The week ahead: NASA looks back half a century to 1 April 1960, when the first successful weather satellite, TIROS-1, was launched from Cape Canaveral, Florida (go.nature.com/ZhrtTo). The European Space Agency is set to launch CryoSat-2, which will monitor variations in the extent and thickness of polar ice; see page 658 for more (go.nature.com/M5nOT8). In Prague, the United States and Russia will sign a treaty (agreed last week) to cut their stores of long-range nuclear warheads by around 30%.
News maker: John Tate, winner of the 2010 Abel Prize.
Number crunch: growth in prescription drug sales for Teva Pharmaceuticals during the 12 months to September 2009. The generics firm, based in Israel, was the fastest-growing drugs company in that period (source: Reuters, IMS Health).
  • Geoengineers get the fear
    - Nature 464(7289):656 (2010)
    Researchers fail to come up with clear guidelines for experiments that change the planet's climate.
Seeding clouds with sea salt could control our climate.
"Be very careful." The warning, from Robert Socolow, a climate researcher at Princeton University in New Jersey, came at the end of a meeting last week that aimed to thrash out guidelines for the nascent field of geoengineering. The discipline aims to use global-scale efforts to control the climate and mitigate the worst effects of anthropogenic warming — but the techniques used could also have far-reaching, unintended consequences.
Socolow presented more than 175 experts from a range of disciplines with a list of their own nightmares, collected over meals and cocktails during the course of an often contentious week. As he rattled through the scenarios, he highlighted the legal, moral and ethical quandaries of geoengineering. In one, a single country unilaterally pumps aerosols into the stratosphere to block the Sun's rays and preserve — or perhaps create — a climate of its own liking. In another, climate policies result in a world full of forest plantations that are created solely to store the greatest possible amount of carbon, with no regard for preserving biodiversity. Or what if the very possibility of using geoengineering to mitigate climate change gives political leaders cover to say that greenhouse gases aren't a problem?
The morning after Socolow's sobering talk, the conference's scientific organizing committee released a summary statement, based on attendees' comments, that endorsed geoengineering research as a viable way of avoiding possibly catastrophic global warming. But participants came up short on their stated goal of formulating a set of guidelines and principles for scientists working in the field, and conference organizers promised further work on these in the coming weeks. Instead, it was Socolow's cautionary note that resonated as participants departed the beachside Asilomar Conference Center near Monterey, California. "We're scared, and nothing brings people together like fear," says Jane Long, associate director for energy and environment at Lawrence Livermore National Laboratory in California.
Organizers modelled the conference on a gathering at the same location 35 years ago, when eminent biologists established influential guidelines on experiments in the budding field of genetic engineering. Despite disagreement on when — or indeed whether — the technologies should be used, says Long, participants generally agreed on the need to identify a responsible way forwards for geoengineering research. "It's a moral imperative to search for solutions," she adds. But it was evident from the beginning that the much broader field of geoengineering would not yield to simple principles as quickly as had genetics.
Oranges, Porsches and whales
The term geoengineering covers everything from mundane methods for increasing carbon storage in plants, soils and oceans to futuristic 'solar-radiation management' techniques — for example, creating haze in the stratosphere to act as a cheap layer of sunscreen. And that diverse definition is a problem, says David Keith, a geoengineering researcher at the University of Calgary, Alberta. "People aren't discussing apples and oranges, they are talking about apples and oranges and Porsches and whales and moons," he says.
Testing solar-radiation management techniques on a global scale is particularly daunting, given that detecting changes in the climate system caused by geoengineering would be nearly as difficult as measuring global warming itself. It could take years to determine the main effects and decades to sort out any number of smaller impacts (see Nature 463, 426–427; 2010). Some fear that stratospheric aerosols could thin the ozone layer or shift global precipitation patterns.
Keith is developing a method to use aircraft to release fine sulphur particles that will stay aloft for years in the stratosphere. He says that there should be a way to conduct small-scale experiments that test this kind of technology without perturbing the global climate. But any larger experiments, in which the goal is to effect even small shifts in incoming solar energy at the global scale, should require authorization from a high-level international body, he says.
Granger Morgan, an engineer at Carnegie Mellon University in Pittsburgh, Pennsylvania, proposed creating an assessment for field tests based on physical characteristics, such as experiment duration and the predicted reduction in warming. If researchers exceeded specified thresholds, international governance would have to authorize further experiments (a purely illustrative sketch of such a tiered rule follows this entry).
Another cadre of researchers is pushing a more benign technology that involves seeding clouds with sea salt to increase their brightness. This brightness could be turned on and off in a matter of days, because the clouds would disperse quickly once the seeding was stopped. The technique could be focused on regional problems such as disappearing Arctic sea ice, say advocates, who suggest that a research programme could be presented to the intergovernmental Arctic Council for approval.
"You can't build a wall around the Arctic climate," counters Alan Robock, a climatologist at Rutgers University in New Brunswick, New Jersey. He fears that some of his colleagues are pushing forwards too quickly in their hunt for a climate fix, although he was pleased with the conference's final statement. The participants agreed that what's now needed is a broader discussion of geoengineering issues within civil society and government. In the meantime, other organizations, including the UK Royal Society and TWAS, the academy of sciences for the developing world, based in Trieste, Italy, are planning their own joint effort to face the fears about geoengineering — and find a way forwards.
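The tiered review Morgan describes can be pictured with a small, purely illustrative rule. The tier names and threshold values below are invented for this sketch (the article gives no numbers), so treat it as a cartoon of the idea, not the actual proposal.

```python
# Purely illustrative sketch of a tiered review rule for geoengineering field
# tests, in the spirit of the assessment based on physical characteristics
# (duration, predicted radiative effect) described above. All thresholds and
# tier names are invented for illustration; the article specifies none.

def review_tier(duration_days: float, forcing_reduction_wm2: float) -> str:
    """Return a hypothetical review tier for a proposed field experiment."""
    if duration_days <= 30 and forcing_reduction_wm2 <= 0.001:
        return "self-governed (local, negligible climate signal)"
    if duration_days <= 365 and forcing_reduction_wm2 <= 0.01:
        return "national regulator review"
    return "international authorization required"

if __name__ == "__main__":
    # A brief, tiny aerosol release vs. a sustained regional-scale trial.
    print(review_tier(duration_days=7, forcing_reduction_wm2=1e-4))
    print(review_tier(duration_days=730, forcing_reduction_wm2=0.05))
```

The point of such a rule is simply that the review burden scales with how long an experiment runs and how large a climate signal it is predicted to produce.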
  • River reveals chilling tracks of ancient flood
    - Nature 464(7289):657 (2010)
    Water from melting ice sheet took unexpected route to the ocean.
The Younger Dryas flood 13,000 years ago could have emptied into the Arctic Ocean through the Mackenzie River delta.
A thousand years after the last ice age ended, the Northern Hemisphere was plunged back into glacial conditions. For 20 years, scientists have blamed a vast flood of meltwater for causing this 'Younger Dryas' cooling, 13,000 years ago. Picking through evidence from Canada's Mackenzie River, geologists now believe they have found traces of this flood, revealing that cold water from North America's dwindling ice sheet poured into the Arctic Ocean, from where it ultimately disrupted climate-warming currents in the Atlantic. The researchers scoured tumbled boulders and gravel terraces along the Mackenzie River for signs of the meltwater's passage. The flood "would solve a big problem if it actually happened", says oceanographer Wally Broecker of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, who was not part of the team.
On page 740, the geologists present evidence confirming that the flood occurred (J. B. Murton et al. Nature 464, 740–743; 2010). But their findings raise questions about exactly how the flood chilled the planet. Many researchers thought the water would have poured down what is now the St Lawrence River into the North Atlantic Ocean, where the currents form a sensitive climate trigger. Instead, the Mackenzie River route would have funnelled the flood into the Arctic Ocean (see map).
The Younger Dryas was named after the Arctic wild flower Dryas octopetala that spread across Scandinavia as the big chill set in. At its onset, temperatures in northern Europe suddenly dropped 10 °C or more in decades, and tundra replaced the forest that had been regaining its hold on the land. Broecker suggested in 1989 that the rapid climate shift was caused by a slowdown of surface currents in the Atlantic Ocean, which carry warm water north from the Equator to high latitudes (W. S. Broecker et al. Nature 341, 318–321; 1989). The currents are part of the 'thermohaline' ocean circulation, which is driven as the cold and salty — hence dense — waters of the far North Atlantic sink, drawing warmer surface waters north. Broecker proposed that the circulation was disrupted by a surge of fresh water that overflowed from Lake Agassiz, a vast meltwater reservoir that had accumulated behind the retreating Laurentide Ice Sheet in the area of today's Great Lakes. The fresh water would have reduced the salinity of the surface waters, stopping them from sinking (a simple density calculation after this entry illustrates the effect).
The theory is widely accepted. However, scientists never found geological evidence of the assumed flood pathway down the St Lawrence River into the North Atlantic, or along a possible alternative route southwards through the Mississippi basin. Now it is clear why: the flood did occur; it just took a different route. The team, led by Julian Murton of the University of Sussex in Brighton, UK, dated sand, gravel and boulders from eroded surfaces in the Athabasca Valley and the Mackenzie River delta in northwestern Canada. The shapes of the geological features there suggest that the region had two major glacial outburst floods, the first of which coincides with the onset of the Younger Dryas.
If the western margins of the Laurentide Ice Sheet lay just slightly east of their assumed location, several thousand cubic kilometres of water would have been able to flood into the Arctic Ocean.
"Geomorphic observations and chronology clearly indicate a northwestern flood route down the Mackenzie valley," says James Teller, a geologist at the University of Manitoba in Winnipeg, Canada, who took part in the study. But he thinks that the route raises questions about the climatic effects of the Lake Agassiz spill. "We're pretty sure that the water, had it flooded the northern Atlantic, would have been capable of slowing the thermohaline ocean circulation and produce the Younger Dryas cooling," he says. "The question is whether it could have done the same in the Arctic Ocean."
Broecker, however, says that the Arctic flood is just what his theory needed. He says that flood waters heading down the St Lawrence River might not have affected the thermohaline circulation anyway, because the sinking takes place far to the north, near Greenland. A pulse of fresh water into the Arctic, however, would ultimately have flowed into the North Atlantic and pulled the climate trigger there. "There's no way for that water to go out of the Arctic without going into the Atlantic," he says.
Quirin Schiermeier, with additional reporting by Richard Monastersky
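A minimal sketch of the density argument above, assuming a linearized equation of state with typical textbook values for the thermal expansion and haline contraction coefficients: it simply shows that a modest freshening lightens surface water more than a comparable cooling densifies it, which is why a freshwater pulse can cap the water column and stall the sinking.

```python
# Minimal sketch of why a freshwater pulse can shut down deep-water formation.
# With a linearized equation of state, a few-PSU drop in salinity lowers
# surface density by more than a few degrees of cooling raises it, so the
# water no longer sinks. Coefficients are approximate textbook values.

RHO0 = 1027.0    # reference seawater density, kg/m^3
ALPHA = 2.0e-4   # thermal expansion coefficient, 1/degC (approximate)
BETA = 7.6e-4    # haline contraction coefficient, 1/PSU (approximate)

def density_anomaly(d_temp_c: float, d_sal_psu: float) -> float:
    """Change in density (kg/m^3) for small temperature and salinity changes."""
    return RHO0 * (-ALPHA * d_temp_c + BETA * d_sal_psu)

if __name__ == "__main__":
    # Cooling by 2 degC makes surface water denser (favours sinking)...
    print(f"cooling by 2 degC:   {density_anomaly(-2.0, 0.0):+.2f} kg/m^3")
    # ...but freshening by 2 PSU makes it lighter by roughly three times as
    # much, so the surface layer floats and the overturning stalls.
    print(f"freshening by 2 PSU: {density_anomaly(0.0, -2.0):+.2f} kg/m^3")
```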
  • Space probe set to size up polar ice
    - Nature 464(7289):658 (2010)
    Europe's ice-monitoring project gets a second chance after 2005 launch mishap.
Almost five years after watching a launch failure destroy their ice-measuring satellite, Europe's polar researchers are ready to try again. For scientists hoping to understand how polar ice is reacting to climate change — and how sea levels may rise as a result — the stakes are higher than ever. CryoSat-2, the satellite's second incarnation, is set to lift off on 8 April from a launch pad in Kazakhstan, aboard a converted Soviet missile. Technical problems with the rocket have already delayed the launch, which was originally scheduled for February. "I hope this time around probability is on our side," says Duncan Wingham, CryoSat-2's principal scientist, who will watch the launch from the European Space Operations Centre of the European Space Agency (ESA) in Darmstadt, Germany.
ESA's original mission to measure changes in ice sheets and sea ice in Earth's polar regions was lost on 8 October 2005, when a software problem caused the commercial launch rocket to fail. By now, other satellites that monitor Earth's ice are either ageing or malfunctioning (see 'Keeping tabs on Earth's ice'). NASA's ICESat, for example, has lost the use of its key sensors, and stopped returning data last year; its successor will not be launched before 2015. "CryoSat-2 gives us a new pair of eyes on what is happening to Earth's ice," says Robert Bindschadler, a glaciologist and chief scientist at NASA's Hydrospheric and Biospheric Sciences Laboratory in Greenbelt, Maryland. "The changes in the cryosphere are providing the most unequivocal evidence that we are changing our planet in ways that should concern us all."
Centimetre accuracy
After the loss of CryoSat, ESA's 18 member states decided, in 2006, to rebuild the €135-million (US$182-million) satellite. The new probe is a copy of the original, but with redundancy in all of its critical instruments. Its main sensor is a sophisticated radar altimeter designed to measure the distance between the satellite and Earth's surface with such high precision that it will estimate ice thickness with an accuracy of a few centimetres (a rough sketch of how such range measurements translate into sea-ice thickness follows this entry). If everything goes according to plan, the radar will be turned on and will start to collect data on the thickness of glaciers and ice sheets just three days post-launch.
When CryoSat was first conceived more than a decade ago, its main objective was to determine whether Earth's large ice sheets were losing mass at all. Since then, measurements from other satellites have shown that the ice sheets on both Antarctica and Greenland are shrinking. But the error bars on the observations are high, and scientists cannot confidently forecast the future of the ice. Some forecasts, based on the relationship between temperature and sea-level changes during the twentieth century, suggest that the growing rate of ice loss could cause global sea levels to rise by 0.5–1.4 metres above those of 1990 by the end of this century (S. Rahmstorf Science 315, 368–370; 2007). But, rather than behaving like rivers, the coastal glaciers that drain the ice sheets may behave more like lines of traffic, starting and stopping. To make predictions, scientists need to understand how these glaciers behave and what governs their movements.
With the current limited sweep of observations, "our predictive ability is not very good", says Ian Howat, a glaciologist at Ohio State University in Columbus, who studies the dynamics of ice sheets and their response to climate change. Because of their orbital orientations, current satellites can survey only around 10% of the coastal areas of Antarctica and Greenland. CryoSat-2 should help fill the gap. Reaching latitudes of 88°, it will provide a view of all the key coastal areas.
Coastal retreat
"The coasts are where the action is and where we need the best data," says Eric Rignot, an Earth scientist at the University of California, Irvine, who studies ice-sheet dynamics in Greenland and Antarctica. Getting a closer view of steep, narrow glacier channels along coastlines should help scientists to determine how glaciers respond to warmer temperatures.
The new information will ultimately feed into important policy decisions. Sea-level rise is one of the more manageable aspects of climate change because people can migrate away from coasts, says Bindschadler. "But the scientific information must be solid enough to justify policy decisions," he says, adding that, "CryoSat-2 data will play strongly in the science research that will predict future sea level."
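A rough sketch of how an altimeter range measurement becomes an ice-thickness estimate, assuming the standard hydrostatic (Archimedes) conversion from freeboard to thickness and typical density values. Real CryoSat-2 processing involves many more corrections (snow loading, radar penetration, sea-surface height), so this is illustrative only.

```python
# Minimal sketch of the hydrostatic conversion from freeboard (the height of
# the ice surface above local sea level, which is what an altimeter can
# measure) to total sea-ice thickness. Density values are typical
# assumptions; operational processing adds several further corrections.

RHO_WATER = 1024.0  # kg/m^3
RHO_ICE = 917.0     # kg/m^3
RHO_SNOW = 300.0    # kg/m^3

def ice_thickness(freeboard_m: float, snow_depth_m: float = 0.0) -> float:
    """Sea-ice thickness from freeboard via hydrostatic equilibrium."""
    return (RHO_WATER * freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - RHO_ICE)

if __name__ == "__main__":
    # 30 cm of freeboard corresponds to roughly 2.9 m of ice...
    print(f"{ice_thickness(0.30):.2f} m")
    # ...and a 3 cm freeboard error alone shifts the estimate by ~0.3 m,
    # which is why centimetre-level range precision matters so much.
    print(f"{ice_thickness(0.33) - ice_thickness(0.30):.2f} m")
```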
  • Synching Europe's big science facilities
    - Nature 464(7289):659 (2010)
    Momentum grows for body to coordinate the continent's research infrastructure.
The ALBA Synchrotron Light Facility near Barcelona opened last week.
Europe's busiest big science facilities, such as powerful neutron sources and synchrotrons, are centres of international collaboration — but there is precious little coordination to ensure that they are adequately funded, or that underused or moribund facilities are wound down. To tackle this problem, the head of the committee charged with drawing up Europe's priority list of such facilities is calling for a new independent body to help manage their cash flow across the continent.
The new body would not manage facilities directly, says Carlo Rizzuto, chair of the European Strategy Forum on Research Infrastructures, but would be "more like the conductor of an orchestra", bringing greater coherence to funding decisions. Speaking at the 6th European Conference on Research Infrastructures, held in Barcelona, Spain, on 23–24 March, Rizzuto said that, if created, the new body could resemble the European Research Council — the pan-European funding organization that allocates scientific grants on the basis of excellence — directing European Union (EU) cash to facilities in which the best research is being carried out. Recommendations and conclusions from the conference will be discussed at the next meeting of EU research ministers in Brussels on 25–26 May.
John Wood, chair of the European Research Area Board, which advises the European Commission, said that he supports the creation of an independent body. But he added that it would need to have the political clout to increase EU infrastructure spending significantly. The EU's member countries together spend around €10 billion to €15 billion (US$14 billion to $20 billion) per year on running research facilities. Because the annual operating costs can be around 10% of the price of construction, they can exceed the initial investment within a decade. But the EU itself currently contributes just €250 million per year, or around 2.5% of the running costs of European facilities (see the short arithmetic check after this entry). That figure is "too small to drive better integration of research infrastructure and should be at least ten times higher", Rizzuto told Nature.
Because individual governments pay the bills, the locations of new facilities (see 'Big beasts') are generally decided through political horse-trading, and the host nations make the key decisions on funding levels and whether to maintain or shut a facility. Rizzuto wants a further €1 billion to €2 billion from the EU for running infrastructures in the next European research initiative — the eighth Framework programme.
Getting that extra cash will not be easy, but it will be "indispensable" for managing Europe's facilities better, Rizzuto says, and would also help to make them truly open to scientists with the best proposals, wherever they are in the world. Currently, scientists based in the country hosting the facility tend to have a greater share of the access. "It's not only how you build the facility and how you run it, but also the people that use it," Wood said. But there is a balance to be struck — host nations that put up most of the cash will expect privileged access, he added.
The new body could also advise on difficult decisions. "Infrastructures that are obsolete or not well managed could be closed down," says Rizzuto. He adds that, of the 400–600 small and medium-sized scientific facilities in Europe, he thinks that around 200 are poorly managed.
"By closing them or making them more efficient, we could save more than €2 billion in operating costs," Rizzuto told Nature. Wood agrees. There are many old telescopes operating, he said. "But who has shut down a telescope so far?" This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements.
  • Japan plans nuclear power expansion
    - Nature 464(7289):661 (2010)
    Proposal for eight new reactors and nuclear fuel reprocessing faces public opposition.
The Monju prototype reactor is set to restart.
Like most countries that embraced nuclear power decades ago, Japan has soured on the technology in recent years. But prompted by worries about climate and energy security, the country's industry ministry last week placed a big bet on a rapid expansion of its nuclear power capability. When the draft energy plan is finalized and signed by the Japanese cabinet in June, it will stand as a roadmap for the country's new government, which campaigned on a platform of reducing carbon emissions by 25% below 1990 levels by 2020 — a promise that is unpopular with the business community. But despite the government's nuclear ambitions, individual reactors will still need approval from local authorities, which is far from certain.
Japan relies on imports for more than 80% of its total energy needs; the plan aims to reduce that figure to just 30% by 2030. "With the balance of energy demand changing dramatically we really have to think about energy security," says Ken Sasaji, director of the ministry's energy planning office.
Japan already has 54 reactors with a total generating capacity of 49 gigawatts, accounting for about a quarter of its electricity supplies (see 'Japan's energy mix'). But following a series of accidents between 1997 and 2007, growing public resistance meant that only five reactors were built in the past decade. The new plan proposes building eight reactors by 2020 to supply an additional 11.4 gigawatts of electricity.
To ensure that those reactors have fuel, Japan forged a nuclear-energy deal in March with Kazakhstan, which holds the world's second-largest uranium reserves and mines about 20% of the world's uranium ore, making it the world's biggest producer. Japan has promised to supply nuclear-energy technology to Kazakhstan in return for a stable supply of uranium. And last week, Itochu, a Tokyo-based trading company backed by the government, bought a 15% stake in Kalahari Minerals, headquartered in London, which is developing a large uranium mine in Namibia. The mine is expected to begin producing more than 5,000 tonnes of uranium per year in 2013 — roughly 10% of the total uranium mined around the world in 2008.
Japan is also counting on its nuclear recycling programme, which recently started after years of failed efforts to convince local residents of its necessity and safety (see Nature 440, 138; 2006). In December 2009, a reactor on the southern island of Kyushu started burning mixed oxide fuel, made by mixing uranium with plutonium from spent fuel. And in February, the Japanese Nuclear Safety Commission gave its approval for a restart of the Monju fast-breeder test reactor in Tsuruga, which will use some of the neutrons generated during the fission process to turn non-fissile uranium isotopes into plutonium that can be extracted from the spent fuel.
There are also plans to squeeze extra energy from the country's existing reactors, some of which are around 40 years old. At a 19 March meeting of the US–Japan Nuclear Energy Steering Committee in Washington DC, the partners agreed to collaborate on studies aimed at extending the life of old reactors. But the Japanese government will face a struggle to secure public acceptance of its nuclear ambitions, which are open for public comment until 7 April.
Confidence in nuclear power was shaken in 2007 when a magnitude-6.8 earthquake caused a shutdown of the Kashiwazaki-Kariwa plant in Niigata after radioactive cooling water leaked into the sea (see Nature 448, 392–393; 2007). And fresh objections are being raised about Monju. After decades of experimentation, most countries with significant nuclear capabilities have given up on fast-breeder technology, partly because of safety concerns. Monju itself has been closed since 1995, when leaking coolant damaged the plant, and a cover-up attempt damaged the plant's reputation.
With safety and earthquake-resistance tests completed in February, the Japan Atomic Energy Agency, which runs Monju, now only needs the local Fukui government to sign on. On 11 March, however, 29 scientists opposed to restarting Monju released a letter on the Citizens' Nuclear Information Center website claiming that checks of key pipes have been inadequate and that the current reactor set-up does not serve as a useful prototype for future fast-breeder reactors. The group argues that because Monju's construction costs were five times greater than those of a conventional reactor, a full-scale plant would have to be very different from the Monju prototype to be commercially viable.
Japan's situation contrasts with that of its neighbour, China, where more than 20 reactors are under construction and face little public opposition. China aims to reach at least 70 gigawatts of nuclear power by 2020. For Japan, eight new reactors over the next decade will be a struggle, says Takuyuki Kawauchi of the industry ministry's nuclear-energy policy division. "We can't just start putting reactors wherever we want," he says. "We have to get the understanding of the local residents, and that takes time."
  • Rule poses threat to museum bones
    - Nature 464(7289):662 (2010)
    Law change will allow Native American tribes to reclaim ancient bones found close to their lands.
Thousands of ancient human remains held in US museums have not been culturally identified.
Deep in the bowels of dozens of US museums lie caches of unidentified ancient human bones that hold vital clues to the history of the continent's earliest inhabitants. But many Native Americans believe that the remains should be returned to them, often for reburial or destruction. A federal rule unveiled on 15 March could give Native Americans a way to claim these bones — and some researchers fear that this could empty museum collections.
The final rule, due to take effect on 14 May, amends the 1990 Native American Graves Protection and Repatriation Act (NAGPRA), which set out steps to correct a history of insensitive handling of bones and funerary objects. The law was a compromise, balancing native rights and those of all Americans who might benefit from scientific study of the remains. US institutions were required to complete and publish by 1995 inventories of their Native American remains. If tribes could trace the remains to their ancestors or show some other cultural affiliation, they could claim the material from the inventories. Those specimens determined to be not culturally affiliated remained at institutions.
Following years of pressure from Native American groups, the new rule would give them the right to claim specimens without a cultural link if they had been found close to tribes' historic lands. "This is a major departure, going way beyond the intent of the original law," says John O'Shea, a curator at the University of Michigan Museum of Anthropology in Ann Arbor, which has about 1,400 specimens considered culturally unaffiliated. Overall, there are more than 124,000 culturally unidentified ancient human remains in US institutions; although estimates vary widely, at least 15% of these could be affected by the new rule.
Dennis O'Rourke, a population geneticist at the University of Utah in Salt Lake City and president of the American Association of Physical Anthropology, argues that the loss to science would be greater than ever before, because new techniques allowing the extraction of DNA from increasingly ancient bones have boosted the scientific value of such specimens (see Nature 464, 472–473; 2010). But the National NAGPRA office, the division of the US Department of the Interior that administers the law, says that the rule is in keeping with the intent of the 1990 act. Sherry Hutt, programme manager of the NAGPRA office, says that scientists have had sufficient time to study specimens that have been held for decades. "Holding the remains in perpetuity" isn't appropriate, she says.
Local links
Most scientists say that geographical connections between remains and current tribes may be meaningless, pointing out that the early peoples of the Americas travelled far and wide, as a recent study identifying migration from Siberia to Greenland shows (M. Rasmussen et al. Nature 463, 757–762; 2010). "Geographical proximity is not a great way to define a relationship," says O'Rourke. The rule, published in the US Federal Register, is open for comment for 60 days — but it will be enacted once that period is over.
Ryan Seidemann, a Louisiana state assistant attorney general based in Baton Rouge who is familiar with NAGPRA, called the rule's enactment a form of "guerrilla tactics" that ignores scientists' concerns. Although Native American tribes are hopeful that the rule will enable them to recover more of their ancestors' bones, they point out that related funerary objects are not covered by it. "As the rule now stands, it won't work," says Mervin Wright Jr, chair of the Pyramid Lake Paiute tribe in Nixon, Nevada. "It is offensive to not include the objects, which for us are a traditional part of the burial." He expects that tribes will lobby to change the rule to address this point.
Some museums — including the American Museum of Natural History in New York, the Field Museum in Chicago, Illinois, and the Peabody Museum of Archaeology and Ethnology at Harvard University in Cambridge, Massachusetts — are discussing whether they will challenge the rule. The issue could have the same import as the long legal fight to study the 9,000-year-old Kennewick man skeleton against Native American wishes (see Nature 436, 10; 2005). In 2004, scientists won that court battle, affirming the principle that bones would be returned only to culturally related tribes.
Anthropologists and archaeologists are also gearing up to debate the rule. Discussions have already been scheduled for the annual meeting of the American Association of Physical Anthropology, which starts on 14 April in Albuquerque, New Mexico, and the Society for American Archaeology meeting, which begins on the same day in St Louis, Missouri.
  • Gene flaw found in induced stem cells
    - Nature 464(7289):663 (2010)
    Key difference between reprogrammed adult mouse cells and embryonic stem cells discovered.
Mouse stem cells can be used to create neurons for the screening of drugs.
Stem-cell researchers have puzzled over why reprogrammed cells taken from adult tissues are often slower to divide and much less robust than their embryo-derived counterparts. Now, a team has discovered the key genetic difference between embryonic and adult-derived stem cells in mice. If confirmed in humans, the finding could help clinicians to select only the heartiest stem cells for therapeutic applications and disease modelling.
Induced pluripotent stem (iPS) cells are created by reprogramming adult cells, and outwardly seem indistinguishable from embryonic stem (ES) cells. Both cell types are pluripotent — they can form any tissue in the body. Yet subtle distinctions abound. Last month, for example, Su-Chun Zhang and his colleagues at the University of Wisconsin–Madison compared the ability of both types of pluripotent cell to form human neurons in a laboratory setting, and found that iPS cells did so with markedly lower efficiency than ES cells (B.-Y. Hu et al. Proc. Natl Acad. Sci. USA 107, 4335–4340; 2010). Last year, researchers also reported consistent differences in gene expression between the two cell types (M. H. Chin et al. Cell Stem Cell 5, 111–123; 2009). However, because scientists have always obtained iPS and ES cells from different sources — in general, iPS cells are derived from skin samples taken during biopsies and ES cells from excess embryos from fertility clinics — it was impossible to tell whether the discrepancies could be chalked up to the unique biology of the cells or the genetics of the underlying tissue.
Silence please
A team led by Konrad Hochedlinger at Massachusetts General Hospital in Boston has now derived iPS and ES cells with identical DNA. The iPS cells were less efficient than the ES cells at incorporating into chimeric mice — a standard test of pluripotency, or 'stemness'. The team added the stem cells into embryos from mice of a different colour; once each mouse matures, the colouring of its coat reveals how much the stem cells contributed to forming its tissue.
When the scientists compared genome-wide expression patterns between the two cell types, they discovered that a small stretch of DNA on the long arm of chromosome 12 displayed significantly different gene activity. In this region, two genes and a slew of tiny regulatory sequences called microRNAs were consistently activated in the ES cells and silenced in the iPS cells, regardless of whether the reprogrammed cells came originally from skin, brain, blood or other tissue. Although the function of the key genes is unknown, this region is usually silenced in mouse sperm cells and activated in other types of cell, so reprogramming might somehow mimic the silencing process, the authors speculate. "This is an important step towards identifying the differences that may exist in those imperfectly reprogrammed cells," says Sheng Ding, a stem-cell researcher at the Scripps Research Institute in La Jolla, California.
The discovery raises the possibility that human iPS cells carry similar silenced sequences that make them less effective than ES cells, according to team member Matthias Stadtfeld, also from the Massachusetts General Hospital, who presented the work at a meeting of the New York Academy of Sciences on 23 March. "It points towards the possibility that hot spots for epigenetic abnormalities exist also in human iPS cells," he says. "A profound abnormality like that could confound results obtained with patient-specific iPS cells."
John Hambor, director of stem-cell-based drug discovery at Cell Therapy Group, a consultancy based in Madison, Connecticut, cautions that although the iPS cells in the experiment did not meet the strictest criteria of stemness — they did not introduce significant colouring into chimeric mice — they may still have been able to form many types of tissue, something the researchers did not explicitly test. Stadtfeld agrees, noting that the silenced genes "might not matter for tissues in which [such] genes have no role".
Although findings in mice don't always apply to humans, if a similar gene signature is found in human cells, it could help researchers to identify which iPS cells to avoid using, and which stand the best chance of producing the desired tissue. Hochedlinger's team has therefore begun to look at human ES and iPS cells in search of similar gene-activity patterns to those they found in mice.
  • Correction
    - Nature 464(7289):663 (2010)
    A News Briefing (Nature 464, 330; 2010) wrongly stated that the Intergovernmental Panel on Climate Change had appointed the InterAcademy Council to review its procedures. The council was invited by the United Nations and the World Meteorological Organization to conduct the review.
  • Human genome at ten: Life is complicated
    - Nature 464(7289):664 (2010)
    The more biologists look, the more complexity there seems to be. Erika Check Hayden asks if there's a way to make life simpler.
Not that long ago, biology was considered by many to be a simple science, a pursuit of expedition, observation and experimentation. At the dawn of the twentieth century, while Albert Einstein and Max Planck were writing mathematical equations that distilled the fundamental physics of the Universe, a biologist was winning the Nobel prize for describing how to make dogs drool on command. The molecular revolution that dawned with the discovery of the structure of DNA in 1953 changed all that, making biology more quantitative and respectable, and promising to unravel the mysteries behind everything from evolution to disease origins. The human genome sequence, drafted ten years ago, promised to go even further, helping scientists trace ancestry, decipher the marks of evolution and find the molecular underpinnings of disease, guiding the way to more accurate diagnosis and targeted, personalized treatments. The genome promised to lay bare the blueprint of human biology.
That hasn't happened, of course, at least not yet. In some respects, sequencing has provided clarification. Before the Human Genome Project began, biologists guessed that the genome could contain as many as 100,000 genes that code for proteins. The true number, it turns out, is closer to 21,000, and biologists now know what many of those genes are. But at the same time, the genome sequence did what biological discoveries have done for decades. It opened the door to a vast labyrinth of new questions.
Few predicted, for example, that sequencing the genome would undermine the primacy of genes by unveiling whole new classes of elements — sequences that make RNA or have a regulatory role without coding for proteins. Non-coding DNA is crucial to biology, yet knowing that it is there hasn't made it any easier to understand what it does. "We fooled ourselves into thinking the genome was going to be a transparent blueprint, but it's not," says Mel Greaves, a cell biologist at the Institute of Cancer Research in Sutton, UK.
Instead, as sequencing and other new technologies spew forth data, the complexity of biology has seemed to grow by orders of magnitude. Delving into it has been like zooming into a Mandelbrot set — a space that is determined by a simple equation, but that reveals ever more intricate patterns as one peers closer at its boundary. With the ability to access or assay almost any bit of information, biologists are now struggling with a very big question: can one ever truly know an organism — or even a cell, an organelle or a molecular pathway — down to the finest level of detail?
Imagine a perfect knowledge of inputs, outputs and the myriad interacting variables, enabling a predictive model. How tantalizing this notion is depends somewhat on the scientist; some say it is enough to understand the basic principles that govern life, whereas others are compelled to reach for an answer to the next question, unfazed by the ever increasing intricacies. "It seems like we're climbing a mountain that keeps getting higher and higher," says Jennifer Doudna, a biochemist at the University of California, Berkeley. "The more we know, the more we realize there is to know."
Web-like networks
Biologists have seen promises of simplicity before.
The regulation of gene expression, for example, seemed more or less solved 50 years ago. In 1961, French biologists François Jacob and Jacques Monod proposed the idea that 'regulator' proteins bind to DNA to control the expression of genes. Five years later, American biochemist Walter Gilbert confirmed this model by discovering the lac repressor protein, which binds to DNA to control lactose metabolism in Escherichia coli bacteria1. For the rest of the twentieth century, scientists expanded on the details of the model, but they were confident that they understood the basics. "The crux of regulation," says the 1997 genetics textbook Genes VI (Oxford Univ. Press), "is that a regulator gene codes for a regulator protein that controls transcription by binding to particular site(s) on DNA."
Just one decade of post-genome biology has exploded that view. Biology's new glimpse at a universe of non-coding DNA — what used to be called 'junk' DNA — has been fascinating and befuddling. Researchers from an international collaborative project called the Encyclopedia of DNA Elements (ENCODE) showed that in a selected portion of the genome containing just a few per cent of protein-coding sequence, between 74% and 93% of DNA was transcribed into RNA2. Much non-coding DNA has a regulatory role; small RNAs of different varieties seem to control gene expression at the level of both DNA and RNA transcripts in ways that are still only beginning to become clear. "Just the sheer existence of these exotic regulators suggests that our understanding about the most basic things — such as how a cell turns on and off — is incredibly naive," says Joshua Plotkin, a mathematical biologist at the University of Pennsylvania in Philadelphia.
Even for a single molecule, vast swathes of messy complexity arise. The protein p53, for example, was first discovered in 1979, and despite initially being misjudged as a cancer promoter, it soon gained notoriety as a tumour suppressor — a 'guardian of the genome' that stifles cancer growth by condemning genetically damaged cells to death. Few proteins have been studied more than p53, and it even commands its own meetings. Yet the p53 story has turned out to be immensely more complex than it seemed at first.
In 1990, several labs found that p53 binds directly to DNA to control transcription, supporting the traditional Jacob–Monod model of gene regulation. But as researchers broadened their understanding of gene regulation, they found more facets to p53. Just last year, Japanese researchers reported3 that p53 helps to process several varieties of small RNA that keep cell growth in check, revealing a mechanism by which the protein exerts its tumour-suppressing power.
Even before that, it was clear that p53 sat at the centre of a dynamic network of protein, chemical and genetic interactions. Researchers now know that p53 binds to thousands of sites in DNA, and some of these sites are thousands of base pairs away from any genes. It influences cell growth, death and structure and DNA repair. It also binds to numerous other proteins, which can modify its activity, and these protein–protein interactions can be tuned by the addition of chemical modifiers, such as phosphates and methyl groups. Through a process known as alternative splicing, p53 can take nine different forms, each of which has its own activities and chemical modifiers. Biologists are now realizing that p53 is also involved in processes beyond cancer, such as fertility and very early embryonic development.
In fact, it seems wilfully ignorant to try to understand p53 on its own. Instead, biologists have shifted to studying the p53 network, as depicted in cartoons containing boxes, circles and arrows meant to symbolize its maze of interactions.
Data deluge
The p53 story is just one example of how biologists' understanding has been reshaped, thanks to genomic-era technologies. Knowing the sequence of p53 allows computational biologists to search the genome for sequences where the protein might bind, or to predict positions where other proteins or chemical modifications might attach to the protein. That has expanded the universe of known protein interactions — and has dismantled old ideas about signalling 'pathways', in which proteins such as p53 would trigger a defined set of downstream consequences. "When we started out, the idea was that signalling pathways were fairly simple and linear," says Tony Pawson, a cell biologist at the University of Toronto in Ontario. "Now, we appreciate that the signalling information in cells is organized through networks of information rather than simple discrete pathways. It's infinitely more complex."
The data deluge following the Human Genome Project is undoubtedly part of the problem. Knowing what any biological part is doing has become much more difficult, because modern, high-throughput technologies have granted tremendous power to collect data. Gone are the days when cloning and characterizing a gene would garner a paper in a high-impact journal. Now teams would have to sequence an entire human genome, or several, and compare them. Unfortunately, say some, such impressive feats don't always bring meaningful biological insights. "In many cases you've got high-throughput projects going on, but much of the biology is still occurring on a small scale," says James Collins, a bioengineer at Boston University in Massachusetts. "We've made the mistake of equating the gathering of information with a corresponding increase in insight and understanding."
A new discipline — systems biology — was supposed to help scientists make sense of the complexity. The hope was that by cataloguing all the interactions in the p53 network, or in a cell, or between a group of cells, then plugging them into a computational model, biologists would glean insights about how biological systems behaved. In the heady post-genome years, systems biologists started a long list of projects built on this strategy, attempting to model pieces of biology such as the yeast cell, E. coli, the liver and even the 'virtual human'. So far, all these attempts have run up against the same roadblock: there is no way to gather all the relevant data about each interaction included in the model.
A bug in the system
In many cases, the models themselves quickly become so complex that they are unlikely to reveal insights about the system, degenerating instead into mazes of interactions that are simply exercises in cataloguing. In retrospect, it was probably unrealistic to expect that charting out the biological interactions at a systems level would reveal systems-level properties, when many of the mechanisms and principles governing inter- and intracellular behaviour are still a mystery, says Leonid Kruglyak, a geneticist at Princeton University in New Jersey. He draws a comparison to physics: imagine building a particle accelerator such as the Large Hadron Collider without knowing anything about the underlying theories of quantum mechanics, quantum chromodynamics or relativity.
"You would have all this stuff in your detector, and you would have no idea how to think about it, because it would involve processes that you didn't understand at all," says Kruglyak. "There is a certain amount of naivety to the idea that for any process — be it biology or weather prediction or anything else — you can simply take very large amounts of data and run a data-mining program and understand what is going on i! n a generic way." This doesn't mean that biologists are stuck peering ever deeper into a Mandelbrot set without any way of making sense of it. Some biologists say that taking smarter systems approaches has empowered their fields, revealing overarching biological rules. "Biology is entering a period where the science can be underlaid by explanatory and predictive principles, rather than little bits of causality swimming in a sea of phenomenology," says Eric Davidson, a developmental biologist at the California Institute of Technology in Pasadena. Such progress has not come from top–down analyses — the sort that try to arrive at insights by dumping a list of parts into a model and hoping that clarity will emerge from chaos. Rather, insights have come when scientists systematically analyse the components of processes that are easily manipulated in the laboratory — largely in model organisms. They're still using a systems approach, but focusing it through a more traditional, bottom–up lens. Davidson points to the example of how gene regulation works during development to specify the construction of the body. His group has spent almost a decade dissecting sea-urchin development by systematically knocking out the expression of each of the transcription factors — regulatory proteins that control the expression of genes — in the cells that develop into skeleton. By observing how the loss of each gene affects development, and measuring how each 'knockout' affects the expression of every other transcription factor, Davidson's group has constructed a map of how these transcription factors work together to build the animal's skeleton4. The map builds on the Jacob–Monod principle that regulation depends on interactions between regulatory proteins and DNA. Yet it includes all of these regulatory interactions and then attempts to draw from them common guiding principles that can be applied to other developing organisms. For example, transcription factors encoded in the urchin embryo's genome are first activated by maternal proteins. These embryonic factors, which are active for only a short time, trigger downstream transcription factors that interact in a positive feedback circuit to switch each other on permanently. Like the sea urchin, other organisms from fruitflies to humans organize development into 'modules' of genes, the interactions of which are largely isolated from one another, allowing evolution to tweak each module without compromising the integrity of the whole process. Development, in other words, follows similar rules in different species. "The fundamental idea that the genomic regulatory system underlies all the events of development of the body plan, and that changes in it probably underlie the evolution of body plans, is a basic principle of biology that we didn't have before," says Davidson. That's a big step forwards from 1963, when Davidson started his first lab. Back then, he says, most theories of development were "manifestly useless". 
Davidson calls his work "a proof of principle that you can understand everything about the system that you want to understand if you get hold of its moving parts". He credits the Human Genome Project with pushing individual biologists more in the direction of understanding systems, rather than staying stuck in the details, focused on a single gene, protein or other player in those systems. First, it enabled the sequencing of model-organism genomes, such as that of the sea urchin, and the identification of all the transcription factors active in development. And second, it brought new types of biologists, such as computational biologists, into science, he says. The eye of the beholder So how is it that Davidson sees simplicity and order emerging where many other biologists see increasing disarray? Often, complexity seems to lie in the eye of the beholder. Researchers who work on model systems, for instance, can manipulate those systems in ways that are off-limits to those who study human biology, arriving at more definitive answers. And there are basic philosophical differences in the way scientists think about biology. "It's people who complicate things," says Randy Schekman, a cell and molecular biologist at the University of California, Berkeley. "I've seen enough scientists to know that some people are simplifiers and others are dividers." Although the former will glean big-picture principles from select examples, the latter will invariably get bogged down in the details of the examples themselves. Mark Johnston, a yeast geneticist at the University of Colorado School of Medicine in Denver, admits to being a generalizer. He used to make the tongue-in-cheek prediction that the budding yeast Saccharomyces cerevisiae would be "solved" by 2007, when every gene and every interaction would have been characterized. He has since written more seriously that this feat will be accomplished within the next few decades5. Like Davidson, he points out that many aspects of yeast life, such as the basics of DNA synthesis and repair, are essentially understood. Scientists already know what about two-thirds of the organism's 5,800 genes do, and the remaining genes will be characterized soon enough, Johnston says. He works on the glucose-sensing pathway, and says he will be satisfied that he understands it when he can quantitatively describe the interactions in the pathway — a difficult but not impossible task. Not everyone agrees. James Haber, a molecular biologist at Brandeis University in Waltham, Massachusetts, says it is hard to argue that the understanding of fundamental processes won't be enriched within 20–30 years. "Whether this progress will result in these processes being 'solved' may be a matter of semantics," he says, "but some questions — such as how chromosomes are arranged in the nucleus — are just beginning to be explored." Johnston argues that it is neither possible nor necessary to arrive at the quantitative understanding that he hopes to achieve for the glucose-sensing pathway for every other system in yeast. "You have to decide what level of understanding you're satisfied with, and some people respond that they're not satisfied at any level — that we have to keep going," he says. This gulf between simplifiers and dividers isn't just a matter of curiosity for armchair philosophers. It plays out every day as study sections and peer reviewers decide which approach
to science is worth funding and publishing. And it bears on the ultimate question in biology: will we ever understand it all? The edge of the universe Some, such as Hiroaki Kitano, a systems biologist at the Systems Biology Institute in Tokyo, point out that systems seem to grow more complex only because we continue to learn about them. "Biology is a defined system," he says, "and in time, we will have a fairly good understanding of what the system is about." Others demur, arguing that biologists will never know everything. And it may not matter terribly that they don't. Bert Vogelstein, a cancer-genomics researcher at Johns Hopkins University in Baltimore, Maryland, has watched first-hand as complexity dashed one of the biggest hopes of the genome era: that knowing the sequence of healthy and diseased genomes would allow researchers to find the genetic glitches that cause disease, paving the way for new treatments. Cancer, like other common diseases, is much more complicated than researchers hoped. By sequencing the genomes of cancer cells, for example, researchers now know that an individual patient's cancer has about 50 genetic mutations, but that they differ between individuals. So the search for drug targets that might help many patients has shifted away from individual genes and towards drugs that might interfere in networks common to many cancers. Even if we never understand biology completely, Vogelstein says, we can understand enough to interfere with the disease. "Humans are really good at being able to take a bit of knowledge and use it to great advantage," Vogelstein adds. "It's important not to wait until we understand everything, because that's going to be a long time away." Indeed, drugs that influence those bafflingly complex signal-transduction pathways are among the most promising classes of new medicines being used to treat cancer. And medicines targeting the still-mysterious small RNAs are already in clinical trials to treat viral infections, cancer and macular degeneration, the leading cause of untreatable blindness in wealthy nations. The complexity explosion, therefore, does not spell an end to progress. And that is a relief to many researchers who celebrate complexity rather than wring their hands over it. Mina Bissell, a cancer researcher at the Lawrence Berkeley National Laboratory in California, says that during the Human Genome Project, she was driven to despair by predictions that all the mysteries would be solved. "Famous people would get up and say, 'We will understand everything after this'," she says. "Biology is complex, and that is part of its beauty." She need not worry, however; the beautiful patterns of biology's Mandelbrot-like intricacy show few signs of resolving. References: 1. Gilbert, W. & Muller-Hill, B. Proc. Natl Acad. Sci. USA 56, 1891–1898 (1966). 2. The ENCODE Project Consortium. Nature 447, 799–816 (2007). 3. Suzuki, H. I. et al. Nature 460, 529–533 (2009). 4. Oliveri, P., Tu, Q. & Davidson, E. H. Proc. Natl Acad. Sci. USA 105, 5955–5962 (2008). 5. Fields, S. & Johnston, M. Science 307, 5717 (2005).
  • Human genome at ten: The human race
    - Nature 464(7289):668 (2010)
    What was it like to participate in the fastest, fiercest research race in biology? Alison Abbott talks to some of the genome competitors about the rivalries and obstacles they faced then — and now. In many people's minds, May 1998 marked the real start of the race to sequence the human genome. In that month, Craig Venter announced that his upstart company, Celera Genomics in Rockville, Maryland, would sequence the genome within two years. The publicly funded Human Genome Project, which had been plodding along until that point, had a competitor — and each side assembled and prepped its team. The shotgunner Venter was willing to flout convention, and he recruited Gene Myers to help him. As a mathematician at the University of Arizona in Tucson, Myers had developed a technique for blasting a genome to pieces and reassembling the sequenced debris. But he despaired of ever using this 'whole-genome shotgun sequencing' method on the human genome. The field was signed up en bloc to sequencing the genome piece by consecutive piece to avoid gaps, and Myers's algorithms had been scorned for being error-prone and unworkable. At Celera, Myers never felt he was on the 'wrong side'. He arrived before the computers and furniture did, yet little more than a year later the group had lined up most of the 120-million-base-pair genome of the fruitfly Drosophila melanogaster (E. W. Myers et al. Science 287, 2196–2204; 2000), proving that the shotgun technique could work. The human genome came next. Myers still feels sore about his early rejection — "it hurt deeply" — and expresses a gleeful triumph that the technique is now standard in genomics. The academic world was hypocritical, he says. It castigated him for pushing the technique and joining industry, then sneaked him job offers at the first inkling that he might have been right. When Myers left Celera in 2002, he was looking for a new direction. He eventually found it in neuroinformatics, a field that provides its own computational challenges. Advances in microscopy combined with sophisticated genetic techniques now make it possible to observe how individual neurons behave when genes are turned on and off. Doing this across an entire mouse brain allows biologists to observe development in unprecedented molecular detail — if, that is, they can make sense of the vast numbers of high-resolution images. Myers is tackling this data challenge at the Janelia Farm Research Campus in Ashburn, Virginia. "Sequences: been there, done that," Myers says. "Cell-resolution models of nervous systems or developing organisms: daunting but looking more and more doable." The mega-manager The huge sequencing effort of the Human Genome Project was biology's first foray into the world of 'big science'. It required big money, and a level of teamwork that came as a major sociological shock to participating scientists. These were the problems with which Jane Rogers had to contend as manager of the Human Genome Project for the Wellcome Trust Sanger Institute near Cambridge, UK. In 1998, Rogers was part of a small posse of senior scientists from Sanger who persuaded governors of the Wellcome Trust to inject more momentum into the project by doubling the Sanger centre's budget so that it could sequence a full one-third of the genome.
The trust's senior administrator, Michael Morgan, revealed the decision to scientists at that year's genome meeting at Cold Spring Harbor Laboratory in New York. The scientists were demoralized by Venter's recent announcement that he was entering the race, and Morgan's news brought the crowd to its feet. "It was an incredible moment, seeing everyone stand up," Rogers says. "We felt we had saved the day." Back home, Rogers had to cajole and coerce scientists who were used to working in their own small groups into working together on a central project, using standardized methods and procedures. There were emotional moments, she concedes with some diplomacy. Rogers, one of very few women involved at a high level in the Human Genome Project, developed a taste for big science. After finishing the major sequencing, the Sanger Institute reverted to principal-investigator-led research groups focused on the genomics of human health. But Rogers set about lobbying the UK Biotechnology and Biological Sciences Research Council for funds to establish a centre for sequencing plant, animal and microbial genomes. She now heads the council's Genome Analysis Centre in Norwich, UK, which opened last year — a management challenge that, for her, matches the buzz of the Human Genome Project. The patent pioneer Millman believed he'd landed in patent-attorney heaven when he joined Celera as head of intellectual property in 1999. It was Millman's task to work out which of the company's intended products — the human genome sequence, its constituent genes, and the software and algorithms to analyse it — could be patented. In earlier days, Millman had been a street artist, performing outrageous feats of escapology in his free time. Life at Celera turned out to be similarly challenging. He enjoyed the buzz of testifying in front of Congress with Venter, helping to shape the US patent office's policies in gene patenting. Academics scorned Venter for making a business out of the human genome, but Millman remembers that although Venter "revelled in his bad-boy image, he didn't always act like he really believed in patents and he didn't make my life easy". Millman found himself caught between Venter's academic principles and his business drive, and thought that the company could have pursued patents more aggressively. In the end, Millman patented 150 genes and proteins that were considered likely drug targets, a handful of 'SNP' patterns linked to disease, and technologies linked to shotgun gene sequencing, none of which Celera fully exploited. Frustrated, Millman says that when he left the company in 2002, he didn't want to hear the suffix '-omics' ever again. He obviously changed his mind. Millman has since been involved in start-up companies that are pursuing other hot new biotechnologies, including, in 2004, Alnylam Pharmaceuticals in Cambridge, Massachusetts, which has led the way in RNA-interference technologies for regulating genes. In his current position at the venture-capital company MPM Capital in Boston, Massachusetts, he has invested in firms exploring epigenetics and stem cells. Gene patenting, however, remains controversial, even though patents are no longer granted for sequences alone and now require information about a gene's function and utility. Millman still sports his colourful clothes and his red ponytail. Occasionally he yearns to don his straitjacket and ride his unicycle across a tightrope, but, these days, he resists.
The freedom fighter Whenever Celera put out a bullish press release to reassure shareholders that it was winning the race, John Sulston went on television to explain that, actually, it wasn't. "I was a reluctant media star," he recalls. Sulston never worked directly on the human genome, but his work sequencing that of the nematode worm at the Sanger Institute paved the way for the Human Genome Project — and he became one of its most righteous political and scientific champions. Sulston fought to ensure that sequence data were released daily into the public domain, helping to establish principles at a 1996 strategy meeting on human-genome sequencing in Bermuda that are still largely followed by the genomics community. And he put the kibosh on a compromise with Celera, proposed in 1999, because the company was not prepared to release data early enough to satisfy the public effort's principles. In retrospect, Sulston still thinks it was right to fight. "Otherwise the biological databases that we have today would have collapsed — everything could have ended up in the hands of an American corporation. The race made for a crazy and irrational time." Yet his battles over the ownership of biology haven't stopped. Now emeritus at the Sanger Institute, he is a part-time faculty member at the University of Manchester's Institute for Science, Ethics and Innovation, which is engaging patent attorneys in heated debate about ownership issues in biology, such as the extent to which donors of biological material deserve compensation. Sulston thinks that biology should be open to exploitation by businesses, but that better checks are needed to stop basic researchers from becoming secretive. The diplomatic coder When Taylor moved to Japan from the United States in 1998, he was a molecular geneticist in need of employment. Taking a chance, he presented himself as a bioinformatics expert to the RIKEN Genomic Sciences Research Complex in Yokohama, newly created to allow Japan to contribute to the Human Genome Project. Then he started reading up like crazy. The centre was collaborating on chromosome 21 with another Japanese group and two German teams. He soon found himself as the centre's English-speaking representative at its meetings, and experienced the occasionally sharp edge of international tensions. The Japanese side was not well organized at first, he says, and sequenced some parts of the genome assigned to its partners. He recalls a meeting at the Sanger Institute when one of the Germans, beside himself with anger, shouted that by doing so the Japanese had wasted German taxpayers' money. Once the Japanese groups hit their stride, they bid for the unassigned chromosomes 11 and 18. The researchers flew over to Washington University in St Louis to negotiate with the rival US contingent. "We stepped off the plane and went straight into a three-hour meeting where no one even offered us a glass of water," Taylor remembers. After some fairly hostile bargaining, they came away with a compromise — the long arm of chromosome 11, the short arm of 18 and no dinner invitation. "It was crazy to split the chromosomes that way, but at least I got two Nature papers," he jokes. Taylor, now a recognized bioinformatician, works at the RIKEN Advanced Science Institute that replaced the former genome centre. His group has shrunk from 70 to 20 people.
One of his main projects is with the International Human Microbiome Consortium, developing software for analysing the hundreds of microbial species in the intestines of healthy Japanese people. But these and other international efforts cannot rival the Human Genome Project, says Taylor, who calls it "a once-in-a-lifetime project, something the likes of which we probably won't see again. Not that we all wouldn't mind working like that together again. I'd jump at the opportunity."
  • Human genome at ten: The sequence explosion
    - Nature 464(7289):670 (2010)
    This article needs to be viewed as a PDF. At the time of the announcement of the first drafts of the human genome in 2000, there were 8 billion base pairs of sequence in the three main databases for 'finished' sequence: GenBank, run by the US National Center for Biotechnology Information; the DNA Data Bank of Japan; and the European Molecular Biology Laboratory (EMBL) Nucleotide Sequence Database. The databases share their data regularly as part of the International Nucleotide Sequence Database Collaboration (INSDC). In the subsequent first post-genome decade, they have added another 270 billion bases to the collection of finished sequence, doubling the size of the database roughly every 18 months. But this number is dwarfed by the amount of raw sequence that has been created and stored by researchers around the world in the Trace Archive and Sequence Read Archive (SRA).
  • World view: Missing weapons
    - Nature 464(7289):672 (2010)
    The US defence department should be at the centre of the nation's energy policy, says Daniel Sarewitz. When Jeffrey Marqusee looks at the US Department of Defense (DOD), what he sees is not history's most fearsome war machine, but a gigantic test-bed for advanced environmental technologies. Marqusee runs the Pentagon's environmental-technology programmes, and he likes to tell anyone who will listen that the DOD's infrastructure includes 500 fixed installations (some the size and complexity of small cities), 546,000 buildings and other structures and 160,000 non-tactical vehicles. Combine these numbers with the fact that no institution on Earth has anything close to the DOD's buying power and technical capabilities, and it's hard not to conclude, as Marqusee does, that the Pentagon has the capacity to become the world's most important weapon in the fight to reduce greenhouse-gas emissions. Consider buildings. They account for about 38% of the nation's greenhouse-gas emissions. Technologies exist that could greatly improve their efficiency, but little overall progress has been made since the 1980s, despite an array of government incentives and regulations, and popular voluntary programmes such as the LEED (Leadership in Energy and Environmental Design) system. To Marqusee, the problem is largely organizational. For example, in the civilian world, the performance of a building's heating, ventilating and air-conditioning system is impossible to optimize. The components are built by manufacturing firms, integrated into the building plan by engineers and architects, purchased by a developer, installed by a contractor, used by tenants and maintained by a service company. Each player has different and often conflicting goals and interests. And, over time, the building's interior evolves to accommodate different and perhaps unanticipated users and uses. End-to-end innovation The DOD provides a setting to tame this chaos. Marqusee's programmes are funding demonstration projects on zero-energy housing units and advanced energy-management systems that continually minimize building energy costs and consumption. "Demonstration is a crucial role for us, to create confidence in new technologies," he says. "But what makes our role particularly powerful is that we work both sides of the equation, not just the manufacturers but also the customers. They trust us, and we understand their needs and constraints." The DOD sets the technical specifications, demonstrates the technologies, then buys, uses and maintains them. Unlike in the civilian world, this means that the building (or power plant, airfield, vehicle fleet, etc.) can be managed over its lifetime for specific performance goals such as maximum energy efficiency. Moreover, the DOD's market is big enough — US$23 billion this fiscal year for construction and facilities maintenance alone — to stimulate healthy competition between the private-sector firms that manufacture the advanced technologies. Firms see a big, reliable, long-term customer, giving them the impetus to innovate while also giving private-sector consumers confidence in the technologies. Confidence in turn drives consumption and improvements in performance and cost for both government and private customers. This innovation chain is neither utopian nor theoretical. 
Many, if not most, of the major waves of technological innovation over the past 60 years have had at their core the performance needs and purchasing power of the US military: telecommunications, information and computer technology, advanced materials, satellites, aircraft and jet engines, robotics and human-performance enhancement. What's true for the DOD's buildings is true for its electricity grids, vehicles, aircraft, ships and combat supply lines. Technological leadership underpins the agency's approach to ensuring national security, and security is increasingly recognized as being integrally related to energy. According to the department's 2010 Quadrennial Defense Review: "Energy efficiency can serve as a force multiplier, because it increases the range and endurance of forces in the field and can reduce the number of combat forces diverted to protect energy supply lines … The department is increasing its use of renewable energy supplies and reducing energy demand to improve operational effectiveness, reduce greenhouse gas emissions … and protect the department from energy price fluctuations." National security, climate change and energy economics are convergent rationales that provide the DOD with a potentially huge institutional advantage over other energy innovators. A litre of petrol transported along highly vulnerable supply lines to Afghanistan costs an average of about $100. Enhancing the energy independence of forward-base operations in combat zones — to save lives and money — is thus a powerful short-term incentive for energy-technology innovation in everything from building insulation to fuel efficiency for jeeps, tanks and jets, to renewable power generation and storage. The price at which new technologies make economic and strategic sense is enormously higher than what the energy market — or any plausible cap-and-trade or energy tax scheme — would allow. This means that the DOD is well positioned to aggressively invest in energy technologies that have little economic logic outside the military context, a situation that in the past has often led to rapid innovation and reduced costs for civilian applications. The visible hand Marqusee believes that a five-year commitment to demonstration projects aimed at reducing energy use, fuel costs and emissions on fixed installations in the United States could move a broad array of technologies off the drawing board and into widespread use in the military's enormous infrastructure, a key step towards proliferation in the commercial marketplace. But Marqusee's brand of thinking must proliferate as well. Congress and the administration of President Barack Obama have lavished budget increases and new programmes on the Department of Energy in the hope that it can accelerate US energy innovation. But despite the energy department's capabilities in research and development, it lacks the institutional attributes that have made the defence department the dynamo of global technological innovation for the better part of a century. Most importantly, the DOD's ability to carry out its mission depends critically on the performance of the technologies that it purchases. It is a discerning customer — with a giant development and procurement budget. The Pentagon, it turns out, is the only institution in the United States with the scale, structure and mandate to start an energy-technology revolution.
Daniel Sarewitz, co-director of the Consortium for Science, Policy and Outcomes at Arizona State University, is based in Washington DC. dsarewitz@gmail.com
  • Sceptics and deniers of climate change not to be confused
    - Nature 464(7289):673 (2010)
    Climate-change denial could have disastrous consequences, if it delays global action to cut carbon emissions. Denialism is gaining popularity because people have difficulty differentiating deniers' twisted arguments from the legitimate concerns of genuine sceptics.
  • Fishermen contribute to protection of marine reserves
    - Nature 464(7289):673 (2010)
    Fisheries benefit from protected marine areas, as eggs, larvae and adult fish spill over into adjacent fishing grounds. But reserves should benefit fishermen too (Nature463, 1007; 2010).
  • Public database for HIV drug resistance in southern Africa
    - Nature 464(7289):673 (2010)
    The Opinion article by S. Karim and Q.
  • Has the revolution arrived?
    - Nature 464(7289):674 (2010)
    Looking back over the past decade of human genomics, Francis Collins finds five key lessons for the future of personalized medicine — for technology, policy, partnerships and pharmacogenomics.
  • Multiple personal genomes await
    - Nature 464(7289):676 (2010)
    Genomic data will soon become a commodity; the next challenge — linking human genetic variation with physiology and disease — will be as great as the one genomicists faced a decade ago, says J. Craig Venter.
  • Point: Hypotheses first
    - Nature 464(7289):678 (2010)
    There is little to show for all the time and money invested in genomic studies of cancer, says Robert Weinberg — and the approach is undermining tried-and-tested ways of doing, and of building, science. This Opinion piece is part of a linked pair; see also Counterpoint: Data first by Todd Golub.
  • Counterpoint: Data first
    - Nature 464(7289):679 (2010)
    Large, unbiased genomic surveys are taking cancer therapeutics in directions that could never have been predicted by traditional molecular biology, says Todd Golub. This Opinion piece is part of a linked pair; see also Point: Hypotheses first by Robert Weinberg.
  • A reality check for personalized medicine
    - Nature 464(7289):680 (2010)
    Bringing genetic information into health care is welcome but its utility in the clinic needs to be rigorously reviewed, caution Muin J. Khoury, James Evans and Wylie Burke.
  • How ocean stirring affects climate
    - Nature 464(7289):681 (2010)
    Wally Broecker is one of the great pioneers of palaeoclimatology, the study of past climate changes in Earth's history. He introduced the term global warming and, in the 1980s, proposed that the global ocean-circulation system, which he dubbed the Great Ocean Conveyor, tends to 'flip-flop' between radically different yet stable states.
  • Lost curve hits a nerve
    - Nature 464(7289):681 (2010)
    Hermann von Helmholtz (1821–1894) was a towering figure of the European Enlightenment, a physiologist and accomplished draftsman with the soul of a Prussian physicist. He conducted his research with rigorous mathematical precision, investigating his biological preparations by adapting whichever industrial-revolution technologies he saw fit.
  • Books in brief
    - Nature 464(7289):682 (2010)
    When the biotech revolution comes, we may turn to guidebooks to advise us on which genes we should delete to enhance our intelligence, how we might regenerate a limb or how we should interact with our clone. In their quirky guide to the future of biotechnology, How To Defeat Your Own Clone (Random House, 2010), bioengineers Kyle Kurpinski and Terry Johnson convey with simplicity and humour the science behind stem cells, genetic variation and bioenhancements.
  • Cell biology forum: Genome-wide view of mitosis
    - Nature 464(7289):684 (2010)
    Despite our rapidly growing knowledge about the human genome, we do not know all of the genes required for some of the most basic functions of life. To start to fill this gap we developed a high-throughput phenotypic screening platform combining potent gene silencing by RNA interference, time-lapse microscopy and computational image processing. We carried out a genome-wide phenotypic profiling of each of the ~21,000 human protein-coding genes by two-day live imaging of fluorescently labelled chromosomes. Phenotypes were scored quantitatively by computational image processing, which allowed us to identify hundreds of human genes involved in diverse biological functions including cell division, migration and survival. As part of the Mitocheck consortium, this study provides an in-depth analysis of cell division phenotypes and makes the entire high-content data set available as a resource to the community.
  • Quantum mechanics: The surf is up
    - Nature 464(7289):685 (2010)
    Researchers have long wanted to be able to control macroscopic mechanical objects in their smallest possible state of motion. Success in achieving that goal heralds a new generation of quantum experiments.
  • Stem cells: Skin regeneration and repair
    - Nature 464(7289):686 (2010)
    Different types of stem cell maintain the skin's epidermis and contribute to its healing after damage. The identity of a stem-cell type that gives rise to different epidermal-cell lineages has just been revealed.
  • Early Earth: Faint young Sun redux
    - Nature 464(7289):687 (2010)
    Given that the Sun was dimmer in its youth, our planet should have been frozen over for much of its early history. That it evidently wasn't is a puzzle that continues to engage the attention of Earth scientists.
  • Drug discovery: Fat-free proteins kill parasites
    - Nature 464(7289):689 (2010)
    The addition of a fatty acid to certain proteins is vital for the survival of protozoa that cause sleeping sickness and of their mammalian hosts. Compounds that target this process in the protozoa are now reported.
  • Physiology: There is no single p
    - Nature 464(7289):691 (2010)
    Why metabolic rates do not vary in direct proportion to body mass has long been the subject of debate. Progress has been made with the realization that no universal scaling exponent can be applied to them.
  • 50 & 100 years ago
    - Nature 464(7289):693 (2010)
    When the oldest British general genetical journal left the country with its editor, only one such journal remained ... Not only was it a time when the pressure on space for publication was rapidly increasing in all fields, but it was also a time when the belated recognition by British universities that the Americans were right in recognizing genetics as an essential part of biology was beginning to have results. One journal of general genetics could not be enough for a country such as Britain ... A new journal, Genetical Research, has now been launched to meet this need ... It includes a number of important papers among which should be noted especially Pritchard's discussion of recombination, and Waddington's experiments on canalizing selection.
  • Exotic matter: Another dimension for anyons
    - Nature 464(7289):693 (2010)
    Non-Abelian anyons are hypothesized particles that, if found, could form the basis of a fault-tolerant quantum computer. The theoretical finding that they may turn up in three dimensions comes as a surprise.
  • Astrophysics: Cosmic acceleration confirmed
    - Nature 464(7289):694 (2010)
    If you are still trying to get to grips with the idea that the Universe's expansion is speeding up, check out Schrabback and colleagues' scrutiny of the largest continuous area ever imaged with the Hubble Space Telescope — the COSMOS field (T. Schrabback et al. Astron. Astrophys
  • Obituary: Joanne Simpson (1923–2010)
    - Nature 464(7289):696 (2010)
    Meteorologist who brought the study of clouds to the forefront of Earth science.
  • FTO effect on energy demand versus food intake
    - Nature 464(7289):E1 (2010)
    Arising from: J. Fischer et al. Nature 458, 894–898 (2009); Fischer et al. reply An intronic single nucleotide polymorphism (SNP) (rs9939609) close to the fat mass and obesity associated gene (FTO) was the first SNP to be discovered with common variants linked to body mass index1; at least seven studies in humans have implicated this SNP with variations in food intake and satiety2, 3, 4, 5, 6, 7, 8, and four studies have rejected an effect on energy expenditure normalized for body weight2, 5, 6, 8. Fischer et al.9 recently constructed a mouse in which the homologous Fto gene was inactivated (Fto-/-) and showed that these mice were protected from obesity. This observation strongly implicates the effects of the intronic SNP rs9939609 as arising due to an effect on the closest gene (FTO). However, the suggested mechanism underlying this effect in mice was opposite to that in humans. The Fto-/- mice showed no significant differences in food intake relative to wild-type littermates9 but had an elevated metabolic rate. The apparent contrasting effects of the gene in humans and mice are worthy of closer investigation.
  • Fischer et al. reply
    - Nature 464(7289):E2 (2010)
    Replying to: J. R. Speakman Nature 464, 10.1038/nature08807 (2010) The human studies on FTO reported an association of an intronic single nucleotide polymorphism (SNP) with obesity. Our report of mice with the targeted inactivation of the Fto gene demonstrated a direct role of Fto in energy homeostasis1. We have shown that the absence of Fto protein results in leanness and that Fto deficiency affects energy homeostasis. Speakman2 argues that the experiments performed in mice1 conflict with results in humans carrying the FTO risk allele, who present with hyperphagia and increased caloric intake3, 4, 5, 6, 7, 8, 9.
  • Quantum ground state and single-phonon control of a mechanical resonator
    O'Connell AD Hofheinz M Ansmann M Bialczak RC Lenander M Lucero E Neeley M Sank D Wang H Weides M Wenner J Martinis JM Cleland AN - Nature 464(7289):697 (2010)
    Quantum mechanics provides a highly accurate description of a wide variety of physical systems. However, a demonstration that quantum mechanics applies equally to macroscopic mechanical systems has been a long-standing challenge, hindered by the difficulty of cooling a mechanical mode to its quantum ground state. The temperatures required are typically far below those attainable with standard cryogenic methods, so significant effort has been devoted to developing alternative cooling techniques. Once in the ground state, quantum-limited measurements must then be demonstrated. Here, using conventional cryogenic refrigeration, we show that we can cool a mechanical mode to its quantum ground state by using a microwave-frequency mechanical oscillator—a 'quantum drum'—coupled to a quantum bit, which is used to measure the quantum state of the resonator. We further show that we can controllably create single quantum excitations (phonons) in the resonator, thus taking the first steps to complete quantum control of a mechanical system.
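A rough sense of why a microwave-frequency 'quantum drum' plus conventional dilution-refrigerator temperatures is enough to reach the ground state comes from the Bose–Einstein thermal occupancy of a mode at frequency \omega and temperature T (a textbook relation, with illustrative round numbers chosen for this sketch rather than taken from the paper):

\bar{n} = \frac{1}{e^{\hbar\omega / k_{B} T} - 1}

For a 6 GHz mode, \hbar\omega/k_{B} \approx 0.29 K, so at T \approx 25 mK the exponent is about 11.5 and \bar{n} \approx e^{-11.5} \approx 10^{-5}: the mode is almost certainly in its ground state. A 1 MHz mechanical mode at the same temperature, by contrast, would still hold roughly 500 thermal phonons, which is why pushing the mechanical frequency up into the microwave range matters so much.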
  • Origins and functional impact of copy number variation in the human genome
    Conrad DF Pinto D Redon R Feuk L Gokcumen O Zhang Y Aerts J Andrews TD Barnes C Campbell P Fitzgerald T Hu M Ihm CH Kristiansson K Macarthur DG Macdonald JR Onyiah I Pang AW Robson S Stirrups K Valsesia A Walter K Wei J The Wellcome Trust Case Control Consortium Tyler-Smith C Carter NP Lee C Scherer SW Hurles ME - Nature 464(7289):704 (2010)
    Structural variations of DNA greater than 1 kilobase in size account for most bases that vary among human genomes, but are still relatively under-ascertained. Here we use tiling oligonucleotide microarrays, comprising 42 million probes, to generate a comprehensive map of 11,700 copy number variations (CNVs) greater than 443 base pairs, of which most (8,599) have been validated independently. For 4,978 of these CNVs, we generated reference genotypes from 450 individuals of European, African or East Asian ancestry. The predominant mutational mechanisms differ among CNV size classes. Retrotransposition has duplicated and inserted some coding and non-coding DNA segments randomly around the genome. Furthermore, by correlation with known trait-associated single nucleotide polymorphisms (SNPs), we identified 30 loci with CNVs that are candidates for influencing disease susceptibility. Despite this, having assessed the completeness of our map and the patterns of linkage disequilibrium between CNVs and SNPs, we conclude that, for complex traits, the heritability void left by genome-wide association studies will not be accounted for by common CNVs.
  • Genome-wide association study of CNVs in 16,000 cases of eight common diseases and 3,000 shared controls
    - Nature 464(7289):713 (2010)
    Copy number variants (CNVs) account for a major proportion of human genetic polymorphism and have been predicted to have an important role in genetic susceptibility to common disease. To address this we undertook a large, direct genome-wide study of association between CNVs and eight common human diseases. Using a purpose-designed array we typed ~19,000 individuals into distinct copy-number classes at 3,432 polymorphic CNVs, including an estimated ~50% of all common CNVs larger than 500 base pairs. We identified several biological artefacts that lead to false-positive associations, including systematic CNV differences between DNAs derived from blood and cell lines. Association testing and follow-up replication analyses confirmed three loci where CNVs were associated with disease—IRGM for Crohn's disease, HLA for Crohn's disease, rheumatoid arthritis and type 1 diabetes, and TSPAN8 for type 2 diabetes—although in each case the locus had previously been identified in single nucleotide polymorphism (SNP)-based studies, reflecting our observation that most common CNVs that are well-typed on our array are well tagged by SNPs and so have been indirectly explored through SNP studies. We conclude that common CNVs that can be typed on existing platforms are unlikely to contribute greatly to the genetic basis of common human diseases.
  • Phenotypic profiling of the human genome by time-lapse microscopy reveals cell division genes
    - Nature 464(7289):721 (2010)
    Despite our rapidly growing knowledge about the human genome, we do not know all of the genes required for some of the most basic functions of life. To start to fill this gap we developed a high-throughput phenotypic screening platform combining potent gene silencing by RNA interference, time-lapse microscopy and computational image processing. We carried out a genome-wide phenotypic profiling of each of the ~21,000 human protein-coding genes by two-day live imaging of fluorescently labelled chromosomes. Phenotypes were scored quantitatively by computational image processing, which allowed us to identify hundreds of human genes involved in diverse biological functions including cell division, migration and survival. As part of the Mitocheck consortium, this study provides an in-depth analysis of cell division phenotypes and makes the entire high-content data set available as a resource to the community.
  • N-myristoyltransferase inhibitors as new leads to treat sleeping sickness
    - Nature 464(7289):728 (2010)
    African sleeping sickness or human African trypanosomiasis, caused by Trypanosoma brucei spp., is responsible for ~30,000 deaths each year. Available treatments for this disease are poor, with unacceptable efficacy and safety profiles, particularly in the late stage of the disease when the parasite has infected the central nervous system. Here we report the validation of a molecular target and the discovery of associated lead compounds with the potential to address this lack of suitable treatments. Inhibition of this target—T. brucei N-myristoyltransferase—leads to rapid killing of trypanosomes both in vitro and in vivo and cures trypanosomiasis in mice. These high-affinity inhibitors bind into the peptide substrate pocket of the enzyme and inhibit protein N-myristoylation in trypanosomes. The compounds identified have promising pharmaceutical properties and represent an opportunity to develop oral drugs to treat this devastating disease. Our studies validate T. brucei N-myristoyltransferase as a promising therapeutic target for human African trypanosomiasis.
  • Intense star formation within resolved compact regions in a galaxy at z = 2.3
    Swinbank AM Smail I Longmore S Harris AI Baker AJ De Breuck C Richard J Edge AC Ivison RJ Blundell R Coppin KE Cox P Gurwell M Hainline LJ Krips M Lundgren A Neri R Siana B Siringo G Stark DP Wilner D Younger JD - Nature 464(7289):733 (2010)
    Massive galaxies in the early Universe have been shown to be forming stars at surprisingly high rates1, 2, 3. Prominent examples are dust-obscured galaxies which are luminous when observed at sub-millimetre wavelengths and which may be forming stars at a rate of 1,000 solar masses (M⊙) per year4, 5, 6, 7. These intense bursts of star formation are believed to be driven by mergers between gas-rich galaxies8, 9. Probing the properties of individual star-forming regions within these galaxies, however, is beyond the spatial resolution and sensitivity of even the largest telescopes at present. Here we report observations of the sub-millimetre galaxy SMMJ2135-0102 at redshift z = 2.3259, which has been gravitationally magnified by a factor of 32 by a massive foreground galaxy cluster lens. This magnification, when combined with high-resolution sub-millimetre imaging, resolves the star-forming regions at a linear scale of only 100 parsecs. We find that the luminosity densities of these star-forming regions are comparable to the dense cores of giant molecular clouds in the local Universe, but they are about a hundred times larger and 10^7 times more luminous. Although vigorously star-forming, the underlying physics of the star-formation processes at z ≈ 2 appears to be similar to that seen in local galaxies, although the energetics are unlike anything found in the present-day Universe.
  • Generation of electron beams carrying orbital angular momentum
    - Nature 464(7289):737 (2010)
    All forms of waves can contain phase singularities1, 2, 3, 4. In the case of optical waves, a light beam with a phase singularity carries orbital angular momentum, and such beams have found a range of applications in optical manipulation, quantum information and astronomy3, 4, 5, 6, 7, 8, 9. Here we report the generation of an electron beam with a phase singularity propagating in free space, which we achieve by passing a plane electron wave through a spiral phase plate constructed naturally from a stack of graphite thin films. The interference pattern between the final beam and a plane electron wave in a transmission electron microscope shows the 'Y'-like defect pattern characteristic of a beam carrying a phase singularity with a topological charge equal to one. This fundamentally new electron degree of freedom could find application in a number of research areas, as is the case for polarized electron beams.
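For readers unfamiliar with phase singularities, the defining property of such a beam can be written down compactly (standard vortex-beam notation, not specific to this paper): the transverse wavefunction carries an azimuthal phase winding

\psi(r,\phi,z) \propto A(r,z)\, e^{i\ell\phi},

where the integer \ell is the topological charge. Each electron then carries orbital angular momentum \ell\hbar about the beam axis, and the intensity vanishes on the axis because the phase is undefined at r = 0. Interfering such a beam with a plane wave produces the forked, 'Y'-shaped fringe dislocation described in the abstract, with one extra fringe corresponding to \ell = 1.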
  • Identification of Younger Dryas outburst flood path from Lake Agassiz to the Arctic Ocean
    - Nature 464(7289):740 (2010)
    The melting Laurentide Ice Sheet discharged thousands of cubic kilometres of fresh water each year into surrounding oceans, at times suppressing the Atlantic meridional overturning circulation and triggering abrupt climate change1, 2, 3, 4. Understanding the physical mechanisms leading to events such as the Younger Dryas cold interval requires identification of the paths and timing of the freshwater discharges. Although Broecker et al. hypothesized in 1989 that an outburst from glacial Lake Agassiz triggered the Younger Dryas1, specific evidence has so far proved elusive, leading Broecker to conclude in 2006 that "our inability to identify the path taken by the flood is disconcerting"2. Here we identify the missing flood path—evident from gravels and a regional erosion surface—running through the Mackenzie River system in the Canadian Arctic Coastal Plain. Our modelling of the isostatically adjusted surface in the upstream Fort McMurray region, and a slight revision of the ice margin at this time, allows Lake Agassiz to spill into the Mackenzie drainage basin. From optically stimulated luminescence dating we have determined the approximate age of this Mackenzie River flood into the Arctic Ocean to be shortly after 13,000 years ago, near the start of the Younger Dryas. We attribute to this flood a boulder terrace near Fort McMurray with calibrated radiocarbon dates of over 11,500 years ago. A large flood into the Arctic Ocean at the start of the Younger Dryas leads us to reject the widespread view that Agassiz overflow at this time was solely eastward into the North Atlantic Ocean.
  • No climate paradox under the faint early Sun
    - Nature 464(7289):744 (2010)
    Environmental niches in which life first emerged and later evolved on the Earth have undergone dramatic changes in response to evolving tectonic/geochemical cycles and to biologic interventions1, 2, 3, as well as increases in the Sun's luminosity of about 25 to 30 per cent over the Earth's history4. It has been inferred that the greenhouse effect of atmospheric CO2 and/or CH4 compensated for the lower solar luminosity and dictated an Archaean climate in which liquid water was stable in the hydrosphere5, 6, 7, 8. Here we demonstrate, however, that the mineralogy of Archaean sediments, particularly the ubiquitous presence of mixed-valence Fe(II–III) oxides (magnetite) in banded iron formations9 is inconsistent with such high concentrations of greenhouse gases and the metabolic constraints of extant methanogens. Prompted by this, and the absence of geologic evidence for very high greenhouse-gas concentrations10, 11, 12, 13, we hypothesize that a lower albedo on the Earth, owing to considerably less continental area and to the lack of biologically induced cloud condensation nuclei14, made an important contribution to moderating surface temperature in the Archaean eon. Our model calculations suggest that the lower albedo of the early Earth provided environmental conditions above the freezing point of water, thus alleviating the need for extreme greenhouse-gas concentrations to satisfy the faint early Sun paradox.
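The albedo argument can be made concrete with the standard zero-dimensional energy balance (a textbook relation used here only as an illustrative sketch, not the authors' model): a planet receiving solar flux S with Bond albedo A has effective emission temperature

T_{e} = \left[\frac{S(1-A)}{4\sigma}\right]^{1/4}.

With present-day values, S \approx 1361 W m^-2 and A \approx 0.3, this gives T_e \approx 255 K. Dimming the Sun by 25% (S \approx 1020 W m^-2) at the same albedo drops T_e to about 237 K, but lowering the albedo to, say, A \approx 0.15 (rough numbers chosen purely for the sketch) brings it back to roughly 249 K, recovering about two-thirds of the deficit before any greenhouse forcing is invoked.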
  • Hominins on Flores, Indonesia, by one million years ago
    Brumm A Jensen GM van den Bergh GD Morwood MJ Kurniawan I Aziz F Storey M - Nature 464(7289):748 (2010)
    Previous excavations at Mata Menge and Boa Lesa in the Soa Basin of Flores, Indonesia, recovered stone artefacts in association with fossilized remains of the large-bodied Stegodon florensis florensis1, 2, 3, 4, 5, 6, 7, 8, 9. Zircon fission-track ages from these sites indicated that hominins had colonized the island by 0.88 ± 0.07 million years (Myr) ago6. Here we describe the contents, context and age of Wolo Sege, a recently discovered archaeological site in the Soa Basin that has in situ stone artefacts and that lies stratigraphically below Mata Menge and immediately above the basement breccias of the basin. We show using 40Ar/39Ar dating that an ignimbrite overlying the artefact layers at Wolo Sege was erupted 1.02 ± 0.02 Myr ago, providing a new minimum age for hominins on Flores. This predates the disappearance from the Soa Basin of 'pygmy' Stegodon sondaari and Geochelone spp. (giant tortoise), as evident at the nearby site of Tangi Talo, which has been dated to 0.90 ± 0.07 Myr ago10. It now seems that this extirpation or possible extinction event and the associated faunal turnover were the result of natural processes rather than the arrival of hominins9. It also appears that the volcanic and fluvio-lacustrine deposits infilling the Soa Basin may not be old enough to register the initial arrival of hominins on the island.
  • Curvature in metabolic scaling
    - Nature 464(7289):753 (2010)
    For more than three-quarters of a century it has been assumed1 that basal metabolic rate increases as body mass raised to some power p. However, there is no broad consensus regarding the value of p: whereas many studies have asserted that p is 3/4 (refs 1–4; 'Kleiber's law'), some have argued that it is 2/3 (refs 5–7), and others have found that it varies depending on factors like environment and taxonomy6, 8, 9, 10, 11, 12, 13, 14, 15, 16. Here we show that the relationship between mass and metabolic rate has convex curvature on a logarithmic scale, and is therefore not a pure power law, even after accounting for body temperature. This finding has several consequences. First, it provides an explanation for the puzzling variability in estimates of p, settling a long-standing debate. Second, it constitutes a stringent test for theories of metabolic scaling. A widely debated model17 based on vascular system architecture fails this test, and we suggest modifications that could bring it into compliance with the observed curvature. Third, it raises the intriguing question of whether the scaling relation limits body size.
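Operationally, 'convex curvature on a logarithmic scale' means that the data reject the straight line implied by a pure power law in favour of a model with a quadratic term in log mass (a generic formulation for illustration; the authors' analysis also accounts for body temperature):

\log B = \beta_{0} + \beta_{1}\log M \quad \text{(pure power law, exponent } p = \beta_{1}\text{)}
\log B = \beta_{0} + \beta_{1}\log M + \beta_{2}(\log M)^{2} \quad \text{(curved; convex if } \beta_{2} > 0\text{)}

A significantly positive \beta_{2} means the local scaling exponent, d\log B/d\log M = \beta_{1} + 2\beta_{2}\log M, itself increases with body mass, which is why different mass ranges yield different apparent exponents and no single value of p fits the whole range.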
  • The genome of a songbird
    - Nature 464(7289):757 (2010)
    The zebra finch is an important model organism in several fields1, 2 with unique relevance to human neuroscience3, 4. Like other songbirds, the zebra finch communicates through learned vocalizations, an ability otherwise documented only in humans and a few other animals and lacking in the chicken5—the only bird with a sequenced genome until now6. Here we present a structural, functional and comparative analysis of the genome sequence of the zebra finch (Taeniopygia guttata), which is a songbird belonging to the large avian order Passeriformes7. We find that the overall structures of the genomes are similar in zebra finch and chicken, but they differ in many intrachromosomal rearrangements, lineage-specific gene family expansions, the number of long-terminal-repeat-based retrotransposons, and mechanisms of sex chromosome dosage compensation. We show that song behaviour engages gene regulatory networks in the zebra finch brain, altering the expression of long non-coding RNAs, microRNAs, transcription factors and their targets. We also show evidence for rapid molecular evolution in the songbird lineage of genes that are regulated during song experience. These results indicate an active involvement of the genome in neural processes underlying vocal communication and identify potential genetic substrates for the evolution and regulation of this behaviour.
  • Impaired hippocampal–prefrontal synchrony in a genetic mouse model of schizophrenia
    - Nature 464(7289):763 (2010)
    Abnormalities in functional connectivity between brain areas have been postulated as an important pathophysiological mechanism underlying schizophrenia1, 2. In particular, macroscopic measurements of brain activity in patients suggest that functional connectivity between the frontal and temporal lobes may be altered3, 4. However, it remains unclear whether such dysconnectivity relates to the aetiology of the illness, and how it is manifested in the activity of neural circuits. Because schizophrenia has a strong genetic component5, animal models of genetic risk factors are likely to aid our understanding of the pathogenesis and pathophysiology of the disease. Here we study Df(16)A+/– mice, which model a microdeletion on human chromosome 22 (22q11.2) that constitutes one of the largest known genetic risk factors for schizophrenia6. To examine functional connectivity in these mice, we measured the synchronization of neural activity between the hippocampus and the prefrontal cortex during the performance of a task requiring working memory, which is one of the cognitive functions disrupted in the disease. In wild-type mice, hippocampal–prefrontal synchrony increased during working memory performance, consistent with previous reports in rats7. Df(16)A+/– mice, which are impaired in the acquisition of the task, showed drastically reduced synchrony, measured both by phase-locking of prefrontal cells to hippocampal theta oscillations and by coherence of prefrontal and hippocampal local field potentials. Furthermore, the magnitude of hippocampal–prefrontal coherence at the onset of training could be used to predict the time it took the Df(16)A+/– mice to learn the task, and increased more slowly during task acquisition. These data suggest how the deficits in functional connectivity observed in patients with schizophrenia may be realized at the single-neuron level. Our findings further suggest that impaired long-range synchrony of neural activity is one consequence of the 22q11.2 deletion and may be a fundamental component of the pathophysiology underlying schizophrenia.
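    Coherence between hippocampal and prefrontal local field potentials is a standard spectral measure of synchrony. A minimal sketch of how such a measure might be computed on two LFP traces, using synthetic signals that share a theta-band (~8 Hz) component (illustrative only, not the authors' analysis pipeline):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs = 1000                      # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)   # 60 s of simulated recording

theta = np.sin(2 * np.pi * 8 * t)                   # shared theta-band drive
hpc = theta + 0.5 * rng.normal(size=t.size)         # "hippocampal" LFP
pfc = 0.6 * theta + 0.5 * rng.normal(size=t.size)   # "prefrontal" LFP

f, cxy = coherence(hpc, pfc, fs=fs, nperseg=2048)

theta_band = (f >= 4) & (f <= 12)
print(f"mean theta-band (4-12 Hz) coherence: {cxy[theta_band].mean():.2f}")
# Lower values of this kind of measure during working memory would indicate
# reduced hippocampal-prefrontal synchrony, as reported for Df(16)A+/- mice.
```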
  • Understanding mechanisms underlying human gene expression variation with RNA sequencing
    Pickrell JK Marioni JC Pai AA Degner JF Engelhardt BE Nkadori E Veyrieras JB Stephens M Gilad Y Pritchard JK - Nature 464(7289):768 (2010)
    Understanding the genetic mechanisms underlying natural variation in gene expression is a central goal of both medical and evolutionary genetics, and studies of expression quantitative trait loci (eQTLs) have become an important tool for achieving this goal1. Although all eQTL studies so far have assayed messenger RNA levels using expression microarrays, recent advances in RNA sequencing enable the analysis of transcript variation at unprecedented resolution. We sequenced RNA from 69 lymphoblastoid cell lines derived from unrelated Nigerian individuals that have been extensively genotyped by the International HapMap Project2. By pooling data from all individuals, we generated a map of the transcriptional landscape of these cells, identifying extensive use of unannotated untranslated regions and more than 100 new putative protein-coding exons. Using the genotypes from the HapMap project, we identified more than a thousand genes at which genetic variation influences overall expression levels or splicing. We demonstrate that eQTLs near genes generally act by a mechanism involving allele-specific expression, and that variation that influences the inclusion of an exon is enriched within and near the consensus splice sites. Our results illustrate the power of high-throughput sequencing for the joint analysis of variation in transcription, splicing and allele-specific expression across individuals.
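    At its core, an eQTL test of the kind described here regresses a gene's expression level on genotype at a nearby variant (coded as 0/1/2 copies of the alternative allele), repeated over many gene–SNP pairs with multiple-testing correction. A bare-bones sketch for a single pair (illustrative only; the study's actual model and normalization are more involved):

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)

n = 69                                   # individuals, as in the study
genotype = rng.integers(0, 3, size=n)    # 0/1/2 copies of the alternative allele
# Simulated log expression with a modest additive genotype effect
expression = 5.0 + 0.4 * genotype + rng.normal(0, 0.5, size=n)

fit = linregress(genotype, expression)
print(f"effect size per allele: {fit.slope:.2f}, p-value: {fit.pvalue:.2e}")
# Repeating this test for all variants near each gene, then correcting for the
# number of tests performed, yields the set of candidate eQTLs.
```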
  • Transcriptome genetics using second generation sequencing in a Caucasian population
    Montgomery SB Sammeth M Gutierrez-Arcelus M Lach RP Ingle C Nisbett J Guigo R Dermitzakis ET - Nature 464(7289):773 (2010)
    Gene expression is an important phenotype that informs about genetic and environmental effects on cellular state. Many studies have previously identified genetic variants for gene expression phenotypes using custom and commercially available microarrays1, 2, 3, 4, 5. Second generation sequencing technologies are now providing unprecedented access to the fine structure of the transcriptome6, 7, 8, 9, 10, 11, 12, 13, 14. We have sequenced the mRNA fraction of the transcriptome in 60 extended HapMap individuals of European descent and have combined these data with genetic variants from the HapMap3 project15. We have quantified exon abundance based on read depth and have also developed methods to quantify whole transcript abundance. We have found that approximately 10 million reads of sequencing can provide access to the same dynamic range as arrays with better quantification of alternative and highly abundant transcripts. Correlation with SNPs (single nucleotide polymorphisms) leads to a larger discovery of eQTLs (expression quantitative trait loci) than with arrays. We also detect a substantial number of variants that influence the structure of mature transcripts, indicating variants responsible for alternative splicing. Finally, measures of allele-specific expression allowed the identification of rare eQTLs and allelic differences in transcript structure. This analysis shows that high throughput sequencing technologies reveal new properties of genetic effects on the transcriptome and allow the exploration of genetic effects in cellular processes.
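    The allele-specific expression measure mentioned at the end compares, at a heterozygous site, how many RNA-seq reads carry each allele; a marked departure from a 50/50 split points to a cis-acting regulatory difference. A minimal sketch of such a test (assuming, for illustration, a simple binomial null with no mapping bias, which real analyses must correct for):

```python
from scipy.stats import binomtest  # requires scipy >= 1.7

# Read counts over a heterozygous coding variant in one individual (made-up numbers)
ref_reads, alt_reads = 84, 36
total = ref_reads + alt_reads

result = binomtest(ref_reads, total, p=0.5)   # null: both alleles equally expressed
print(f"reference-allele fraction: {ref_reads / total:.2f}, p-value: {result.pvalue:.3g}")
# A significant imbalance is evidence of allele-specific expression, which can
# expose rare regulatory variants missed by across-individual eQTL mapping.
```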
  • Identification of two evolutionarily conserved genes regulating processing of engulfed apoptotic cells
    Kinchen JM Ravichandran KS - Nature 464(7289):778 (2010)
    Engulfment of apoptotic cells occurs throughout life in multicellular organisms. Impaired apoptotic cell clearance (due to defective recognition, internalization or degradation) results in autoimmune disease1, 2. One fundamental challenge in understanding how defects in corpse removal translate into diseased states is the identification of critical components orchestrating the different stages of engulfment. Here we use genetic, cell biological and molecular studies in Caenorhabditis elegans and mammalian cells to identify SAND-1 and its partner CCZ-1 as new factors in corpse removal. In worms deficient in either sand-1 or ccz-1, apoptotic cells are internalized and the phagosomes recruit the small GTPase RAB-5 but fail to progress to the subsequent RAB-7(+) stage. The mammalian orthologues of SAND-1, namely Mon1a and Mon1b, were similarly required for phagosome maturation. Mechanistically, Mon1 interacts with GTP-bound Rab5, identifying Mon1 as a previously unrecognized Rab5 effector. Moreover, a Mon1–Ccz1 complex (but not either protein alone) could bind Rab7 and could also influence Rab7 activation, suggesting Mon1–Ccz1 as an important link in progression from the Rab5-positive stage to the Rab7-positive stage of phagosome maturation. Taken together, these data identify SAND-1 (Mon1) and CCZ-1 (Ccz1) as critical and evolutionarily conserved components regulating the processing of ingested apoptotic cell corpses.
  • Spatial control of EGF receptor activation by reversible dimerization on living cells
    Chung I Akita R Vandlen R Toomre D Schlessinger J Mellman I - Nature 464(7289):783 (2010)
    Epidermal growth factor receptor (EGFR) is a type I receptor tyrosine kinase, the deregulation of which has been implicated in a variety of human carcinomas1, 2, 3, 4. EGFR signalling is preceded by receptor dimerization, typically thought to result from a ligand-induced conformational change in the ectodomain that exposes a loop (dimerization arm) required for receptor association. Ligand binding may also trigger allosteric changes in the cytoplasmic domain of the receptor that is crucial for signalling5, 6, 7. Despite these insights, ensemble-averaging approaches have not determined the precise mechanism of receptor activation in situ. Using quantum-dot-based optical tracking of single molecules8, 9, 10, 11 combined with a novel time-dependent diffusivity analysis, here we present the dimerization dynamics of individual EGFRs on living cells. Before ligand addition, EGFRs spontaneously formed finite-lifetime dimers kinetically stabilized by their dimerization arms12, 13, 14. The dimers were primed both for ligand binding and for signalling, such that after EGF addition they rapidly showed a very slow diffusivity state that correlated with activation. Although the kinetic stability of unliganded dimers was in principle sufficient for EGF-independent activation, ligand binding was still required for signalling. Interestingly, dimers were enriched in the cell periphery in an actin- and receptor-expression-dependent fashion, resulting in a peripheral enhancement of EGF-induced signalling that may enable polarized responses to growth factors.
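    The "time-dependent diffusivity analysis" builds on the standard single-particle-tracking quantity, the mean squared displacement (MSD), whose initial slope gives the diffusion coefficient (MSD ≈ 4Dτ in two dimensions). A minimal sketch on a simulated trajectory (illustrative only, not the authors' estimator):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a 2D Brownian trajectory: D = 0.1 um^2/s, 20 ms frames, 500 frames
D, dt, n = 0.1, 0.02, 500
steps = rng.normal(0, np.sqrt(2 * D * dt), size=(n, 2))
xy = np.cumsum(steps, axis=0)

def msd(traj, max_lag=20):
    """Time-averaged mean squared displacement for lags 1..max_lag frames."""
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

lags = np.arange(1, 21) * dt
D_est = np.polyfit(lags, msd(xy), 1)[0] / 4   # slope / 4 for 2D diffusion
print(f"estimated D: {D_est:.3f} um^2/s (simulated: {D} um^2/s)")
# Repeating such an estimate over sliding time windows is one way to distinguish
# a fast freely diffusing state from the very slow state associated with activation.
```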
  • NINJA connects the co-repressor TOPLESS to jasmonate signalling
    - Nature 464(7289):788 (2010)
    Jasmonoyl-isoleucine (JA-Ile) is a plant hormone that regulates a broad array of plant defence and developmental processes1, 2, 3, 4, 5. JA-Ile-responsive gene expression is regulated by the transcriptional activator MYC2 that interacts physically with the jasmonate ZIM-domain (JAZ) repressor proteins. On perception of JA-Ile, JAZ proteins are degraded and JA-Ile-dependent gene expression is activated6, 7. The molecular mechanisms by which JAZ proteins repress gene expression remain unknown. Here we show that the Arabidopsis JAZ proteins recruit the Groucho/Tup1-type co-repressor TOPLESS (TPL)8 and TPL-related proteins (TPRs) through a previously uncharacterized adaptor protein, designated Novel Interactor of JAZ (NINJA). NINJA acts as a transcriptional repressor whose activity is mediated by a functional TPL-binding EAR repression motif. Accordingly, both NINJA and TPL proteins function as negative regulators of jasmonate responses. Our results point to TPL proteins as general co-repressors that affect multiple signalling pathways through the interaction with specific adaptor proteins. This new insight reveals how stress-related and growth-related signalling cascades use common molecular mechanisms to regulate gene expression in plants.
  • Phosphorylation of histone H3T6 by PKCβI controls demethylation at histone H3K4
    Metzger E Imhof A Patel D Kahl P Hoffmeyer K Friedrichs N Müller JM Greschik H Kirfel J Ji S Kunowska N Beisenherz-Huss C Günther T Buettner R Schüle R - Nature 464(7289):792 (2010)
    Demethylation at distinct lysine residues in histone H3 by lysine-specific demethylase 1 (LSD1) causes either gene repression or activation1, 2. As a component of co-repressor complexes, LSD1 contributes to target gene repression by removing mono- and dimethyl marks from lysine 4 of histone H3 (H3K4)1, 3. In contrast, during androgen receptor (AR)-activated gene expression, LSD1 removes mono- and dimethyl marks from lysine 9 of histone H3 (H3K9)2. Yet, the mechanisms that control this dual specificity of demethylation are unknown. Here we show that phosphorylation of histone H3 at threonine 6 (H3T6) by protein kinase C beta I (PKCβI, also known as PRKCβ) is the key event that prevents LSD1 from demethylating H3K4 during AR-dependent gene activation. In vitro, histone H3 peptides methylated at lysine 4 and phosphorylated at threonine 6 are no longer LSD1 substrates. In vivo, PKCβI co-localizes with AR and LSD1 on target gene promoters and phosphorylates H3T6 after androgen-induced gene expression. RNA interference (RNAi)-mediated knockdown of PKCβI abrogates H3T6 phosphorylation, enhances demethylation at H3K4, and inhibits AR-dependent transcription. Activation of PKCβI requires androgen-dependent recruitment of the gatekeeper kinase protein kinase C (PKC)-related kinase 1 (PRK1)4. Notably, increased levels of PKCβI and phosphorylated H3T6 (H3T6ph) positively correlate with high Gleason scores of prostate carcinomas, and inhibition of PKCβI blocks AR-induced tumour cell proliferation in vitro and cancer progression of tumour xenografts in vivo. Together, our data establish that androgen-dependent kinase signalling leads to the writing of the new chromatin mark H3T6ph, which in consequence prevents removal of active methyl marks from H3K4 during AR-stimulated gene expression.
  • The balance scale
    - Nature 464(7289):804 (2010)
    A matter of life and death.
