Latest Articles Include:
- Nuclear ambition
- Nature 464(7292):1103 (2010)
The US weapons labs need to develop a twenty-first-century vision of deterrence — one that does not include making new bombs.
- The weight of evidence
- Nature 464(7292):1103 (2010)
Better chemical-control legislation is a good start, but scientific reform should parallel legal reform.
- Time for libel-law reform
- Nature 464(7292):1104 (2010)
Simon Singh's recent libel result is a victory for science, but the real fight lies ahead.
- Planetary science: Moon grab
- Nature 464(7292):1106 (2010)
- Neuroscience: Relief from pain
- Nature 464(7292):1106 (2010)
- Biomaterials: Electronics on the brain
- Nature 464(7292):1106 (2010)
- Cultural evolution: High fidelity
- Nature 464(7292):1106 (2010)
- Cell biology: Toxin tackle
- Nature 464(7292):1106 (2010)
- Chemistry: Plumbing carbon rings
- Nature 464(7292):1106 (2010)
- Cancer biology: Cells combat chemo
- Nature 464(7292):1107 (2010)
- Neuroscience: Sharpening social skills
- Nature 464(7292):1107 (2010)
- Climate change: Fewer, taller, fiercer
- Nature 464(7292):1107 (2010)
- Immunology: Inflammatory good guys
- Nature 464(7292):1107 (2010)
- Journal club
- Nature 464(7292):1107 (2010)
- News briefing: 22 April 2010
- Nature 464(7292):1108 (2010)
The week in science.

Construction of ITER, the multibillion-dollar fusion experiment in St Paul-lez-Durance, France, is poised to kick into high gear. On 13 April, Fusion for Energy (F4E), the European Union's organization administering the project from Barcelona, Spain, announced that it had signed a €150-million (US$202-million) contract for architecture and engineering. A consortium of four contractors from France, Spain and the United Kingdom will be responsible for the detailed designs of the reactor's buildings. Excavation work for the main reactor building will start in May or June, with completion tentatively scheduled for late 2019.

The British Chiropractic Association (BCA) ended its libel claim against the science writer Simon Singh on 15 April, for a piece he had written for The Guardian newspaper. The news came two weeks after a court ruling went in Singh's favour (see go.nature.com/EQFfg3), overturning a previous decision on Singh's April 2008 article. The recent ruling meant that Singh would have been able to use a defence commonly referred to as 'fair comment' if the BCA had continued with the case. See go.nature.com/oV7qku for more.

An inquiry has upheld the integrity of research by the 'climategate' scientists at the University of East Anglia, UK. Headed by geologist Ron Oxburgh, former rector of Imperial College London, the inquiry was one of a number established after e-mails sent by scientists at the university's Climatic Research Unit were leaked. This inquiry, set up by the university, considered several allegations, including that data in research papers had been manipulated to support predetermined conclusions on climate change. It cleared the scientists of any malpractice, but expressed surprise that few professional statisticians were involved in the work. See go.nature.com/XNggKb for more.

The US Department of the Interior has requested a scientific review of the possible ecological impact of drilling for oil and gas in the Beaufort and Chukchi seas in the Arctic. Environmentalists raised concerns last year that drilling could affect wildlife in the region, including walrus and beluga whales. The review, by scientists at the US Geological Survey, will be completed by 1 October 2010.

Amyris Biotechnologies, one of the leading start-up firms deploying the tools of synthetic biology in the biofuels field, filed plans to go public with the US Securities and Exchange Commission on 16 April. Based in Emeryville, California, and co-founded by Jay Keasling, a bioengineer at the University of California, Berkeley, the firm has engineered strains of yeast to produce hydrocarbon fuels and other chemicals from sugarcane feedstocks. It plans to seek US$100 million in its initial public offering; no date for this has been set.

US chemical regulation looks set for an overhaul with the introduction of the draft Safe Chemicals Act. If passed, the new legislation will require health-and-safety information to be provided for all chemicals, and will shift the burden of proof of safety to the manufacturers rather than the regulators, in a similar way to the European Union's REACH legislation. The act, introduced in both houses of Congress on 15 April, will replace the ageing Toxic Substances Control Act (see Nature 463, 599; 2010).
Genetically engineered crops offer significant environmental and economic advantages over non-transgenic varieties, according to a report published on 13 April by the US National Research Council. Introduced in 1996, transgenic crops now make up more than 80% of soya bean, maize (corn) and cotton grown in the United States — or about half the nation's cropland. According to the report, farmers who grow Bt crops, which are engineered to produce pest-killing toxins from the bacterium Bacillus thuringiensis, use less insecticide. Increased planting of herbicide-tolerant crops may also have reduced the use of many herbicides that linger in soil and waterways, while increasing the use of glyphosate, a herbicide thought to be less harmful to the environment. Farmers growing transgenic crops are more likely to practise 'conservation' tillage, which reduces soil erosion. They have also seen economic benefits, with lower production costs due to decreased insecticide and pesticide use and higher yields in some cases. But the report warns that the risks of genetic engineering may multiply as the technology is applied to more crops and calls for further research — including on the growing resistance of weeds to glyphosate. See go.nature.com/ddcpba for more.

Estimates for the number of microbial species in the world's oceans have jumped massively. When the International Census of Marine Microbes (ICoMM) kicked off in 2003, microbiologists had identified 6,000 kinds of microbe and predicted that they might find as many as 600,000. But the latest analyses indicate that the oceans are home to at least 20 million types, including acantharians — protists with skeletons made of strontium sulphate crystals. "The results just blow the wheels off all estimates of microbial diversity," says ICoMM leader Mitch Sogin of the Marine Biological Laboratory at Woods Hole, Massachusetts. See go.nature.com/Pnstez for more.

Scientists at the University of California, Davis, last week unveiled BioTorrents, a website that enables people to share scientific data, including genome sequences that are currently held in large repositories such as GenBank. The website (www.biotorrents.net) uses BitTorrent's peer-to-peer file-sharing technology, which can distribute large data sets to multiple users by splitting the data into small chunks. Although data piracy could be a concern, the authors argue in their paper (PLoS ONE 5, e10071; 2010) that BioTorrents could speed data sharing between large international collaborations. See go.nature.com/ubozh8 for more.

A 55-year-old clinical-trials network needs a major overhaul, according to a report by the Institute of Medicine, the Washington DC-based health arm of the National Academies. The Clinical Trials Cooperative Group Program, funded by the National Cancer Institute, enrols 25,000 patients in cancer trials run by 14,000 researchers at 3,100 institutions each year. Trials typically take at least two years to get off the ground, the report says, and funding only covers about half the costs, leaving investigators to seek out the difference from other sources.

Indian space agency ISRO's first test flight of a home-made cryogenic engine — powered by fuels that are liquid at very low temperatures — ended in failure on 15 April, dumping its GSAT-4 communications satellite into the Indian Ocean.
The failure is likely to push back the planned 2012–13 launch of India's unmanned Chandrayaan-2 lunar mission and further communications satellites.

Around four tonnes of Roman lead were transferred on 14 April from a museum on the island of Sardinia to Italy's particle-physics laboratory at Gran Sasso on the mainland. Once intended to become ammunition for Roman soldiers' slingshots, the ingots, discovered by a diver in 1988, will now be used to shield the CUORE (Cryogenic Underground Observatory for Rare Events) detector, which seeks to nail down the mass of neutrinos. See go.nature.com/FtvAhs for more.

The Icelandic volcano Eyjafjallajökull, which has been erupting since 21 March, has spewed vast amounts of ash up to 5 kilometres into the atmosphere. Magma from the eruption has found a route to the surface from under a glacier. Flights across Europe have been grounded because of fears over potential engine damage caused by the silica ash cloud issuing from the fissure. Meteorologists are modelling where the cloud will move, using satellite and wind-speed data. The emissions are so small, however, that climate experts don't anticipate any climate effects.

The Hubble Space Telescope was launched 20 years ago on this day. See Nature's online special for a retrospective, slideshow and stories from our archive. www.nature.com/hubble

About 13,000 scientists are expected at Experimental Biology 2010 in Anaheim, California. The conference includes lectures and posters from fields such as anatomy, biochemistry and pharmacology. go.nature.com/ErrFZe

The Cambridge Healthtech Institute's Drug Discovery Chemistry conference is held in San Diego, California, with programmes on antibacterial drug development and protein–protein interactions as drug targets. www.drugdiscoverychemistry.com

A symposium hosted by the Zoological Society of London examines the link between the conservation of biodiversity and reductions in poverty. go.nature.com/jr2rLC

"I just have to say pretty bluntly here: we've been there before." US President Barack Obama tells Florida's Kennedy Space Center on 15 April that sending astronauts to the Moon is so last century (see go.nature.com/zWdf2W for more).
- A shot in the arm for cancer vaccines?
- Nature 464(7292):1110 (2010)
Researchers anxiously await a decision by US regulators on a controversial cancer therapy.

In the early 1990s, immunologist Edgar Engleman of California's Stanford University School of Medicine thought he had discovered a way to treat cancer using a vaccine that harnessed the body's immune cells. He co-founded a company — later named Dendreon — in 1992 to develop the vaccine, predicting that it would reach patients within a few years. "We were so naïve," he says. "We didn't know what to expect."

Now, after some 20 years of successes and setbacks, Dendreon's prostate-cancer vaccine Provenge (sipuleucel-T) may finally be nearing the market; the US Food and Drug Administration (FDA) is expected to reach a decision on its approval by 1 May. If the vaccine is approved, it will mark a turning point for the field of therapeutic cancer vaccines, an approach that seemed promising but developed a disappointing reputation after several high-profile failures in clinical trials. Although a few vaccines have been licensed for use in other countries, none has broken through to the US market. An FDA-approved vaccine, says Theresa Whiteside, an immunologist at the University of Pittsburgh School of Medicine in Pennsylvania, "would sort of legitimize the field". It would also offer a potential new treatment for patients with advanced prostate cancer, which killed more than 28,000 US men in 2008.

Provenge is much more complex than familiar vaccines against viruses, such as measles or human papilloma virus, the cause of most cervical cancers. The vaccine is tailor-made for each patient by harvesting his dendritic cells — a type of immune cell — and exposing them to a cancer-associated protein called prostatic acid phosphatase. Once infused back into the patient, the exposed cells should trigger an immune assault on tumour cells.

The vaccine seemed to be on the cusp of approval three years ago, after an FDA advisory committee determined that it was both safe and effective for use in advanced prostate cancer. But the FDA had lingering concerns, noting that Dendreon's phase III clinical trials were relatively small. What's more, although Provenge extended the lifespan of men with advanced prostate cancer, it did not slow tumour growth, the endpoint that those trials were designed to address. In a highly controversial decision, the agency ordered Dendreon, headquartered in Seattle, Washington, to complete a further large clinical trial in 500 patients, this time designating overall survival as the trial's endpoint. The results of that trial, released last April, showed that Provenge lengthened the survival of patients with late-stage prostate cancer by four months.

Provenge has won the fervent support of patients with prostate cancer as well as their advocates, and the company's rising stock price (see 'Wild ride for a cancer treatment') is evidence that investors are optimistic about the vaccine's approval. However, it promises to be expensive. Dendreon has not yet set a price, but some analysts estimate that it will cost up to US$100,000 per patient. Meanwhile, many researchers are reserving judgement on its efficacy until Dendreon publishes the results of its latest clinical trial in a peer-reviewed journal.

There are also questions over exactly how the vaccine works.
Provenge is a relatively crude mixture of different cell types, including the dendritic cells that should stimulate the immune response. "It would be nice to know exactly what's in there and what these other cells are contributing," says Nina Bhardwaj, an immunologist at New York University's Langone Medical Center in New York. Nevertheless, given that Provenge relies on nearly 20-year-old technology, the vaccine's performance is impressive, says Bhardwaj.

Many first-generation cancer vaccines such as PANVAC, a pancreatic cancer vaccine, were deemed safe but failed to demonstrate that they significantly slowed the progression of cancer. Because cancer-associated antigens — such as those used in Provenge — are also found at low levels in healthy tissue, their ability to trigger a powerful immune response may be blunted. A second generation of vaccines, designed to provoke a stronger immune response, is under development, with some scientists now focusing on antigens that are found only on tumour cells. One of the first vaccines to use this approach targets a mutant protein called EGFRvIII that is found in glioblastoma, an aggressive brain cancer. The vaccine is being jointly developed by drugs giant Pfizer, based in New York, and Celldex, a biotechnology firm headquartered in Needham, Massachusetts.

Over the past decade, researchers have reached a deeper understanding of how tumours actively suppress immune responses in their immediate environment, which can dampen responses to cancer vaccines. To overcome this, some therapies currently in development combine the vaccine with chemotherapies that are designed to counteract this immune suppression. For example, a Seattle-based biotechnology company called Oncothyreon has developed a cancer vaccine called Stimuvax that is administered in combination with the drug cyclophosphamide. The compound inhibits immune cells called T-regulatory cells, which block immune responses to the body's own molecules. Compounds that modulate the immune response could have unwanted side effects, however. A patient in a clinical trial of Stimuvax involving high doses of cyclophosphamide developed an acute inflammation of the brain, which caused the FDA to put all Stimuvax trials on hold.

A clean safety profile is crucial if cancer-vaccine developers are to improve a vaccine's performance in clinical trials. To date, most of these trials have enrolled patients who are in the advanced stages of cancer, which may have limited the trials' effectiveness because such individuals may not be able to mount an effective immune response. Now that such vaccines have been established as safe in phase II trials, clinicians are more willing to test them in healthier patients. An ongoing large trial of a lung cancer vaccine by London-based pharmaceutical firm GlaxoSmithKline, for example, is enrolling patients at an earlier stage of the disease.

For some in the field, the struggle to create effective cancer vaccines conjures up memories of the long battle to develop antibody-based therapies, which are now a mainstay of the biotechnology industry. There, too, a series of clinical-trial failures initially soured the field's reputation, recalls Thomas Davis, chief medical officer at Celldex. In the early 1990s, when Davis worked to develop rituximab — a monoclonal antibody used to treat autoimmune disorders and some cancers — he recalls that researchers in the field learned to be resilient.
"We realized you just have to test a lot of drugs to find one that works," he says, "and it's the same for a cancer vaccine." There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements. - No gain from brain training
- Nature 464(7292):1111 (2010)
Computerized mental workouts don't boost mental skills, study claims.

The largest trial to date of 'brain-training' computer games suggests that people who use the software to boost their mental skills are likely to be disappointed. The study, a collaboration between British researchers and the BBC Lab UK website, recruited viewers of the BBC science programme Bang Goes the Theory to practise a series of online tasks for a minimum of ten minutes a day, three times a week, for six weeks. In one group, the tasks focused on reasoning, planning and problem-solving abilities — skills correlated with general intelligence. A second group was trained on mental functions targeted by commercial brain-training programs — short-term memory, attention, visuospatial abilities and maths. A third group, the control subjects, simply used the Internet to find answers to obscure questions.

A total of 11,430 volunteers aged from 18 to 60 completed the study, and although they improved on the tasks, the researchers believe that none of the groups boosted their performance on tests measuring general cognitive abilities such as memory, reasoning and learning. "There were absolutely no transfer effects" from the training tasks to more general tests of cognition, says Adrian Owen, a neuroscientist at the Medical Research Council (MRC) Cognition and Brain Sciences Unit in Cambridge, UK, who led the study. "I think the expectation that practising a broad range of cognitive tasks to get yourself smarter is completely unsupported."

It's unlikely that the study, published online in Nature this week1, will quell the brain-training debate. "I really worry about this study — I think it's flawed," says Peter Snyder, a neurologist who studies ageing at Brown University's Alpert Medical School in Providence, Rhode Island. Snyder agrees that data supporting the efficacy of brain training are sparse. Although some earlier studies — such as one2 funded by Posit Science, a brain-training software company in San Francisco, California — showed modest effects, Snyder recently published a meta-analysis that found little benefit3. But he says that most commercial programs are aimed at adults well over 60 who fear that their memory and mental sharpness are slipping. "You have to compare apples to apples," says Snyder. An older test group, he adds, would have a lower mean starting score and more variability in performance, leaving more room for training to cause meaningful improvement. "You may have more of an ability to see an effect if you're not trying to create a supernormal effect in a healthy person," he says.

Indeed, the subjects in this study were a self-selected group "who would have had a natural inclination to play this sort of game", says David Moore, director of the MRC Institute of Hearing Research in Nottingham, UK, and a founder of MindWeavers, a company in Oxford, UK, selling the brain-training program MindFit. Moore and Snyder add that the training time may not have been long enough. Subjects completed an average of 24 sessions — at ten minutes a session, that's just four hours of training, says Snyder. "Four hours of testing over six weeks isn't a lot to create meaningful change." Brain-training exercises such as treatments for lazy eye or some post-stroke training regimens require more time to work, says Moore. Owen counters that several similar studies have used a six-week training period.
Although the average number of sessions in his trial was 24, the actual number ranged from two to "some real diehards doing it several hundred times", he says, and he saw no difference in performance between the extremes. "There is no psychological theory that could account for [no effects at all] for six weeks, and then suddenly at week 22 an effect," he says. Owen concedes that his findings don't necessarily mean that training in young children or elderly patients is pointless. But "the evidence is not strong", he says. "And someone needs to go and test it."

References
1. Owen, A. M. et al. Nature advance online publication doi:10.1038/nature09042 (20 April 2010).
2. Smith, G. E. et al. J. Am. Geriatr. Soc. 57, 594–603 (2009).
3. Papp, K. V., Walsh, S. J. & Snyder, P. J. Alzheimers Dement. 5, 50–60 (2009).
- Portrait of a year-old pandemic
- Nature 464(7292):1112 (2010)
'Swine flu' isn't over yet, but it already holds lessons for the future.

One year ago this month, the world watched with trepidation as a novel influenza A virus, to which the global population had little or no immunity, emerged in Mexico and the United States. In the weeks that followed, the H1N1 'swine flu' virus spread rapidly to countries worldwide, leading the World Health Organization (WHO) on 11 June 2009 to officially declare the first flu pandemic in more than 40 years. Nature looks at the lessons learnt from H1N1, and how they will help scientists and health authorities to handle the next flu pandemic.

The 2009 pandemic was not the killer of 1918. "Most people were less likely to get infected than in previous pandemics, less likely to get sick if they did, and less likely to die if they got sick," says Marc Lipsitch, an epidemiologist at Harvard School of Public Health in Boston, Massachusetts. But with the pandemic still playing out, it may be years before we get a reliable estimate of how many deaths it has caused.

Research published last month in the journal PLoS Currents Influenza, by Cécile Viboud of the US National Institutes of Health (NIH) in Bethesda, Maryland, and her colleagues, suggests that the first waves may have been more severe than is widely perceived1. Viboud and her team tested different approaches for estimating flu mortality in the United States, with the most conservative estimates based on medical records that officially reported flu as the cause of death. This is likely to be an underestimate: many flu deaths are not recorded as such, with death often ascribed instead to an underlying condition, such as heart disease or diabetes.

This conservative approach estimated US pandemic flu deaths at around 7,500–12,000: less than half the number caused annually in the United States by seasonal influenza H1N1 and influenza B viruses. But this method also revealed that the number of life years lost was around a quarter more than usual, because the 2009 pandemic deaths were skewed towards younger ages than seasonal flu (see 'Deaths and years of life lost from influenza'). Under a less conservative estimate, based on comparing overall mortality during the pandemic with mortality over the same period in previous years, excess deaths numbered 44,100, surpassing those of a typical flu season. Years of life lost were three to four times higher than in a virulent H3N2 season and five times higher than years of life lost to seasonal H1N1 and B viruses — of the same order as the 1968 flu pandemic.

Whereas seasonal flu hits mainly the very old and the very young, the 2009 pandemic was different in that those most affected were older children and young adults (see 'Which age groups were affected most by H1N1?'). Research published in January 2010 in the journal BMC Infectious Diseases reported that more than three-quarters of cases occurred in people younger than 30, with a peak in the group aged 10–19 years2. Seroprevalence studies, which monitor antibodies that react with the virus, suggest an explanation for this. Serum samples taken in England before the start of the pandemic show that older people had stronger antibody reactions to H1N1 than young people, probably as a result of previous exposure to strains with similarities to the new virus3. Because the pandemic virus outcompeted seasonal flu strains, the usual annual flu season never developed, so older people were spared a bad A/H3N2 flu season, which normally takes a heavy toll on the elderly.
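The gap between those mortality figures comes down to method. Below is a minimal sketch, in Python, of the excess-mortality and years-of-life-lost arithmetic described above; every number in it is hypothetical and chosen only to illustrate the logic, not taken from the study.

    # Illustrative only: hypothetical age bands, death counts and remaining
    # life expectancies -- not the figures from the study discussed above.
    age_bands = ["0-17", "18-64", "65+"]
    observed_deaths = [900, 6200, 2100]    # deaths recorded during the pandemic period
    baseline_deaths = [300, 1800, 5400]    # average deaths in the same weeks of earlier years
    remaining_years = [70, 35, 10]         # rough remaining life expectancy per band

    total_excess = 0
    total_yll = 0
    for band, obs, base, life in zip(age_bands, observed_deaths,
                                     baseline_deaths, remaining_years):
        excess = max(obs - base, 0)        # excess deaths attributed to the pandemic
        yll = excess * life                # years of life lost, weighted towards the young
        total_excess += excess
        total_yll += yll
        print(f"{band:>6}: excess deaths {excess:5d}, years of life lost {yll:7d}")

    print(f"Total excess deaths: {total_excess}")
    print(f"Total years of life lost: {total_yll}")

Because pandemic deaths fell disproportionately on the young, the years-of-life-lost total can exceed a typical season's even when the raw death count does not, which is the pattern seen under the study's more conservative estimate.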
Probably. In past pandemics, infection and illness have come in waves over a period of several years, and later waves are often more severe. "We are in a pandemic period of 2 to 5 years and must continue to keep our guard up," says Lone Simonsen, a flu expert at the Research and Policy for Infectious Disease Dynamics (RAPIDD) programme, a collaboration between the NIH Fogarty International Center and the science and technology directorate of the US Department of Homeland Security.

In the longer term, the H1N1 virus looks set to establish itself as the dominant seasonal flu strain. Already, almost all new flu cases currently being detected are pandemic H1N1 (see 'H1N1 rose to dominate other flu strains'), although some seasonal influenza B virus continues to co-circulate, particularly in Asia. As more people acquire immunity to the pandemic virus, its virulence will settle to the level of seasonal flu, which causes new infections only when the genetics of the virus change annually. This winter saw little flu overall in the Northern Hemisphere. Outbreaks of pandemic flu are currently occurring in some of the tropical zones of the Americas, west and east Africa and southeast Asia, in particular Thailand and Singapore, but at low levels. With winter approaching in the Southern Hemisphere, the question of whether a new pandemic wave will hit countries there will soon be answered.

H1N1 offered a stark reminder that current techniques for making a flu vaccine take too long: around six months from the identification of the new virus to production of any sizeable vaccine quantities. Substantial amounts of vaccine against the pandemic virus became available only around last October, after the first wave had already passed during the winter in Australia (see 'Vaccines arrived too late') and in other Southern Hemisphere countries, and weeks into an autumn wave in most Northern Hemisphere countries.

Better surveillance is also needed. Although researchers could gain some idea of the progress of the pandemic from the number of cases diagnosed and from proxy figures — such as hospitalizations and the number of people reporting influenza-like illnesses — health agencies and scientists were generally slow to implement the gold standard: seroprevalence studies that measure levels of flu antibodies in blood serum (see Nature 462, 398–399; 2009). Part of the problem was logistical: at the start of the pandemic, most labs were overwhelmed with diagnostic samples that needed immediate processing to meet pressing public-health demands. But a lack of advance planning often hampered efforts as well.

Good seroprevalence data are crucial to making informed policy decisions. A key measurement of the severity of a pandemic is the case-fatality rate, or how often infection is fatal. Getting a handle on this requires precise estimates of how many people have been infected, but it was not until around last September — five months into the pandemic — that epidemiologists began to get such data.
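As a back-of-envelope illustration of why seroprevalence matters, here is a short hypothetical calculation in Python; the population, case and death numbers are invented for the example and are not estimates for any real country.

    # Hypothetical numbers, purely to illustrate the case-fatality arithmetic.
    population = 50_000_000        # population of an imaginary country
    confirmed_cases = 400_000      # laboratory-confirmed infections
    deaths = 2_000                 # deaths attributed to pandemic flu

    # A fatality rate based only on confirmed cases overstates severity,
    # because most infections never reach a laboratory.
    cfr_confirmed = deaths / confirmed_cases

    # If a serosurvey finds that, say, 12% of the population carries antibodies
    # above the pre-pandemic baseline, the implied number of infections is far larger...
    seroprevalence = 0.12
    estimated_infections = population * seroprevalence

    # ...and the fatality rate per infection is correspondingly smaller.
    cfr_serology = deaths / estimated_infections

    print(f"Fatality rate from confirmed cases: {cfr_confirmed:.2%}")
    print(f"Fatality rate from serosurvey:      {cfr_serology:.4%}")

The severity estimate can shift by an order of magnitude depending on the denominator used, which is why the late arrival of seroprevalence data made early policy decisions so difficult.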
Clinical research during the pandemic — studies of the best drug regimes, for example — also lagged. In several countries, such as the United Kingdom and Australia, medical research councils introduced calls for proposals at record speed. But compared with epidemiologists and virologists, clinical researchers were generally slow to respond, in part because many were tied up in the frontline pandemic response. Anne Kelso, director of the WHO Collaborating Centre for Reference and Research on Influenza in Melbourne, Australia, points to another problem: "Delays in obtaining ethical clearance was another impediment for some, with the result that the epidemic was almost over before some studies could begin."

References
1. Viboud, C., Miller, M., Olson, D., Osterholm, M. & Simonsen, L. PLoS Curr. Influenza RRN1153 (2010). Available at http://go.nature.com/DSfm6h.
2. Reichert, T., Chowell, G., Nishiura, H., Christensen, R. A. & McCullers, J. A. BMC Infect. Dis. 10, 5 (2010).
3. Miller, E. et al. Lancet 375, 1100–1108 (2010).
- Undersea project delivers data flood
- Nature 464(7292):1115 (2010)
Sea-floor observatory in the Pacific Ocean to provide terabytes of data.

VICTORIA, CANADA
Results are pouring in from an ambitious project that has wired the floor of the northeast Pacific Ocean with an array of cameras, seismometers, chemical sensors and more. The challenge won't be getting good data, but rather handling the vast quantities of it, project scientists reported last week at their first post-launch meeting in Victoria, Canada.

The Can$145-million (US$145-million) project, called NEPTUNE Canada (North-East Pacific Time-Series Undersea Networked Experiments), has laid 800 kilometres of cable to transmit power and data, and established five 'nodes' that act like giant, 13-tonne plug-in points for scientific instrumentation, lying up to 2.6 kilometres beneath the waves. The network spans the Juan de Fuca plate, which sits between the Pacific and the North American plates and hosts earthquakes and tsunamis, giant clams and whale pods, along with hydrothermal vents and frozen methane deposits.

NEPTUNE Canada is the leading effort to wire a wide region of ocean floor with multiple sensors, making real-time, interactive data streams freely available to anyone online. It is breaking ground for similar networks elsewhere, including Japan (the Advanced Real-time Earth Monitoring Network in the Area, or ARENA) and Europe (the European Sea Floor Observatory Network, or ESONET). Scientists want to use NEPTUNE Canada to study how different systems interact, answering questions such as whether earthquakes trigger methane release, and how climate change is affecting the ocean.

The project was launched in December 2009 after a decade of work. Three-quarters of its 78 instruments are delivering data at present, and, despite some teething problems, project scientists are delighted by the progress. "I'm flabbergasted," says Mairi Best, associate director of science for the project at the University of Victoria. "Everyone was painfully aware of how many pieces are involved, how many ways it could go wrong."

The original vision for NEPTUNE called for 3,000 kilometres of cable off the coast of Canada and the United States. But the US contribution has been held up, having only recently received funding as part of the 2009 economic recovery package. US project managers now plan to install 800 kilometres of cable and several nodes that should be up and running by 2014. "It is a shame that the time-lag has split the project," says Brian Bornhold, a marine geologist and NEPTUNE Canada project scientist at the University of Victoria. But the two countries still plan to work closely together. "The Canadian success is absolutely fantastic," says John Delaney, an oceanographer at the University of Washington, Seattle, who helped to dream up the idea for NEPTUNE and is working on the US project. He thinks that such vast ocean monitoring systems will become more common in the future: "This is the first of them; it is by no means the last."

Bit torrent
NEPTUNE Canada is now sending about 0.1 terabytes (10^12 bytes) of data back to shore every day, and should ramp up to about 60 terabytes per year. This may be small scale for, say, particle physicists — the Large Hadron Collider is expected to generate an annual 15 petabytes (10^15 bytes) of data — but it marks a substantial change for oceanographers, who are used to obtaining isolated bursts of data from week-long cruises, and then spending a year analysing the results.
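To put those figures side by side, a quick unit-conversion sketch in Python, using the round numbers quoted above:

    # Rough scale comparison using the round figures quoted in the text.
    neptune_daily_tb = 0.1                          # about 0.1 terabytes per day at present
    neptune_yearly_tb_now = neptune_daily_tb * 365  # roughly 36 terabytes per year
    neptune_yearly_tb_target = 60                   # the planned ramp-up

    lhc_yearly_tb = 15 * 1000                       # 15 petabytes per year = 15,000 terabytes

    print(f"NEPTUNE now:    ~{neptune_yearly_tb_now:.0f} TB/year")
    print(f"NEPTUNE target: ~{neptune_yearly_tb_target} TB/year")
    print(f"LHC:            ~{lhc_yearly_tb:,} TB/year, "
          f"about {lhc_yearly_tb / neptune_yearly_tb_target:.0f} times NEPTUNE's target")

Small by particle-physics standards, but a continuous stream rather than the isolated bursts oceanographers are used to.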
"Half of our staff are in data management," says Best. In terms of data handling, "everyone is outside of their comfort zones", she says. "If they don't analyse it, someone else will." The fact that the data are freely available online should spur the scientists in charge of each instrument to make the most of the results, says Benoît Pirenne, who heads NEPTUNE's data management and was previously in charge of archiving astronomical data for the European Southern Observatory. "If they don't analyse it, someone else will," he says. Teams are looking for innovative ways of crunching through the data, including getting the public to help watch the vast quantities of video archive and highlight noteworthy events. Early results from NEPTUNE Canada include seismometer readings from the Chilean earthquake in February, and bottom-pressure sensor results that tracked the small tsunami waves it generated. The project also has a crawler called Wally (named after the robot in the 2008 film WALL·E) that is investigating methane beds by remote operation from Germany. The results could help to improve estimates of how much methane, a potent greenhouse gas, is being released from oceans. To get renewed government funding, the project will have to prove its worth by generating demand, says Martin Taylor, chief executive of the Ocean Networks Canada, the agency managing NEPTUNE Canada. So far, thousands of users in 71 countries have signed up for free access to the data, says Best. The project hopes to raise funds by charging companies to test new oceanographic instruments on the NEPTUNE network, and to sell parts of their data-management system. The first maintenance expedition, planned for May this year, will show how well the instruments are holding up to the high pressure and salt water. "We're learning a lot," says Pirenne, "which is another way of saying that things are breaking." There are currently no comments. This is a public forum. Please keep to our Community Guidelines. You can be controversial, but please don't get personal or offensive and do keep it brief. Remember our threads are for feedback and discussion - not for publishing papers, press releases or advertisements. - Crucial data on REACH not disclosed
- Nature 464(7292):1116 (2010)
Disagreement flares up over the number of animals required to implement Europe's chemical-safety law.

A leading toxicologist has accused the research arm of the European Commission of suppressing a scientific study that could have changed the course of chemical safety legislation in Europe. The study predicted that millions more animals than the commission had estimated would be needed to test chemicals under REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals), sometimes dubbed the most complex legislation in the history of the European Union (EU).

"It is my strong belief that some decisions on REACH would have been different in the light of these data," says Thomas Hartung, former head of the European Centre for the Validation of Alternative Methods (ECVAM) in Ispra, Italy, who led the study. ECVAM is part of the Joint Research Centre (JRC), the commission service that carries out research to inform European policies.

The allegations came to light in a commission-led inquiry into whether Hartung, now a toxicologist at Johns Hopkins University in Baltimore, Maryland, broke commission rules by publishing an Opinion piece on REACH in Nature last year without asking the JRC's permission (see Nature 460, 1080–1081; 2009). Hartung now claims that the JRC tried to keep his research under wraps just as REACH was in the final, politically sensitive stages of formulation.

In a statement to Nature, issued on behalf of all JRC and commission officials involved, the commission unequivocally denies any wrongdoing. "The European Commission robustly refutes any suggestion that it has misled anyone or sought to conceal information over the effects of the REACH legislation on animal testing. Such suggestions are unfounded."

REACH became law in June 2007. It stipulates that all chemicals sold in the EU in annual quantities of more than one tonne (at least 30,000 compounds) must be registered, along with toxicity data, by 2018. Back in 2005, when REACH was taking shape, it was already clear that more stringent chemical-safety testing would need many more tests on laboratory animals. Some politicians were deeply concerned about the ethics of this; others worried about the financial costs, or whether there were enough laboratories to conduct the tests. For example, Gunter Verheugen, who was then the commission's vice-president and responsible for enterprise and industry, said in November 2005 that REACH would not be ethically viable if it required excessive additional use of animals. A spokesman for the Brussels-based European Parliament Intergroup on the Welfare and Conservation of Animals, a cross-party group of parliament members that promotes awareness on animal-welfare issues, says that if the commission had cited higher estimates of animal use, there would have been greater pressure to ensure more funding for alternative testing methods.

The JRC had already produced two such estimates, in 2003 and 2004. The conclusion was that, in a best-case scenario, REACH would require 2.1 million to 3.9 million animals by 2018. These figures were cited by the JRC and commission throughout the development of REACH, and are still found on its website.
But the studies were not peer reviewed, and were widely criticized by Hartung and other toxicologists for not taking into account the offspring produced during reproductive toxicology tests over two generations of animals, as required by the EU's lab animals directive at that time.

Hartung was commissioned by the JRC to come up with a new estimate. In September 2006, he delivered a verdict that had been peer reviewed by a panel of academic and industry experts, suggesting that a minimum of 8 million to 9 million animals would be required. This was in line with two other independent, peer-reviewed studies conducted on the impact of REACH. A 2001 study by researchers at the Institute for Environment and Health at the University of Leicester, UK, and another by toxicologists at the Federal Institute for Risk Assessment in Berlin (BfR), in 2004, found that the legislation would require between 7.5 million and 10 million animals. Although these studies were available in the literature, greater weight was given to the commission's estimates during debates on REACH. But Manfred Liebsch, a toxicologist at the BfR who contributed to the study, told Nature that the JRC's original lower estimates were "unrealistic" and that Hartung's 2006 estimate was more credible.

The differences between these estimates were partly due to varying degrees of optimism over the use of non-animal methods such as computer models of biological response to chemicals. Hartung took part in another JRC study on this matter, collaborating with two other toxicologists including Martin Barratt, who runs a predictive toxicology consultancy in Bedford, UK. The study, completed in October 2006, found that around 50% of chemicals under REACH could be tested by computer simulation. The result contrasted with the JRC's 2003 study, which said that up to 92% of some key toxicology tests could use such methods. Barratt says the JRC did not give clearance to publish the study: "I have no idea why they said we could not publish it." The research remains unpublished, and the commission did not respond to Nature's questions on this matter.

Margin for error
About a month after delivering his study estimating total animal use, Hartung was growing impatient. He e-mailed his superior, Elke Anklam, director of the JRC's Institute for Health and Consumer Protection (IHCP), on 27 October 2006, calling for his study to be published so that it could inform the legislation. "I consider it embarrassing, how the outdated numbers are continuously referred to," he wrote. He e-mailed Anklam again on 8 November 2006 with a letter for the JRC's then acting director general, Roland Schenkel, making the same point. Hartung says he could not make the study public himself at that time, as he was bound by the commission's confidentiality rules.

The research had still not been published when the European Parliament approved REACH on 13 December 2006. A commission briefing document on REACH released on the same day made no reference to Hartung's study and quoted the lower estimates of animal use. On 16 January 2007, Hartung met with Anklam, Schenkel and advisers to then research commissioner Janez Potočnik to discuss the REACH animal-use estimates, among other topics. Hartung claims that, before the meeting, Schenkel told him to "downplay the discrepancies" between the three JRC studies. Hartung says he complied with the request. The commission, speaking to Nature on Schenkel's behalf, would not comment on this conversation.
Hartung says that a summary, rather than a full version of his study, was published on the IHCP website in January 2007 (bearing the date October 2006), but was removed several months later.

In its statement to Nature, the commission said: "No research was 'suppressed'. The Commission's assessments are never the view of one individual, but are arrived at by thorough internal processes of peer review and quality assurance, followed in this case.

"These figures have been adapted over time and in line with evolving circumstances. This process has been transparent at all times," it adds. "The different estimation (nine million) in the 2006 report from the 2003 and 2004 reports derives mainly from the new requirement (stemming from the REACH regulation as adopted) to count estimates of offspring in reproductive toxicity tests." Any such estimates contain a "margin for error in both directions". The commission says that animal use will decrease over time, in part because it has invested €200 million (US$270 million) in non-animal methods over the past 20 years.

Matter of opinion
The Nature Opinion article that prompted the inquiry into Hartung claimed that generating REACH data could require up to 54 million animals and cost almost €10 billion over the next decade, rendering the legislation unfeasible. This figure took into account the projected growth of the EU and the chemical industry, and also assumed that two-generation reproductive toxicology tests would still be used. At a 24 March hearing, Hartung defended himself by claiming that the analysis was not based on any data collected while he was at the JRC. The commission says it will not comment on the inquiry.

Responding to Hartung's Nature article, the European Chemicals Agency (ECHA) in Helsinki, set up in 2007 to manage REACH, agreed that estimates of 2.6 million animals were too low, noting instead the BfR estimate of 9 million animals. A commission statement released at the same time also quotes the 9-million figure. Speaking to Nature, Jukka Malm, director of assessment at the ECHA, said the agency was "not aware" of Hartung's 2006 study, adding that it had independently concluded that 9 million animals was the most reliable estimate.

Although Liebsch supports Hartung's work, he worries that the controversy is distracting attention from work led by the Organisation for Economic Co-operation and Development (OECD) to finalize guidelines on a reproductive toxicity test that would use only one generation rather than two, potentially halving the number of animals required and making the legislation feasible (see Nature 463, 142–143; 2010). The OECD must agree on the guidelines before December 2010 so that they can be included in the tests put forward by chemical manufacturers to meet REACH's requirements. A decision on Hartung's case is pending.
- Environmental science: New life for the Dead Sea?
- Nature 464(7292):1118 (2010)
A conduit from the Red Sea could restore the disappearing Dead Sea and slake the region's thirst. But such a massive engineering project could have untold effects, reports Josie Glausiusz.

Standing at the rocky shore of the Dead Sea, Itay Reznik raises his arms as high as they will go. Suspended in the air about a metre-and-a-half above his fingertips is a dock to nowhere. "When I started my PhD in 2007, we could sail from this dock," says Reznik, a graduate student in geology at Israel's Ben-Gurion University of the Negev in Beer-Sheva. Now the dock dangles more than three metres above the water.

This super-salty lake on the border between Israel and Jordan is the lowest spot on Earth's surface, and it is getting lower each year. Over the past 50 years, the water level has dropped by almost 30 metres; recently the loss has accelerated to an average of 1.2 metres per year. The Dead Sea's surface area has shrunk by almost one-third over the past 100 years. In this desert region, more water evaporates from the sea than enters it. The Jordan river once fed the sea with 1.3 billion cubic metres of fresh water per year, but that has shrivelled to less than 100 million cubic metres, most of which consists of agricultural run-off and sewage. Israel, Syria and the Kingdom of Jordan take the river's water for drinking and agriculture. At the southern end of the sea, Israel's Dead Sea Works and Jordan's Arab Potash Company exacerbate the problem by evaporating the mineral-rich water to extract potash and magnesium.

Without action, the Dead Sea will continue to shrink. But a proposal being evaluated by the World Bank could revive the lake with a 180-kilometre-long conduit carrying water from the Red Sea 400 metres downhill to the Dead Sea through a canal, pipeline or some combination of the two. The water's flow would generate electricity to run a desalination plant, providing drinking water for local people — as much as 850 million cubic metres of water annually, the equivalent of just under half of Israel's current consumption (see 'Saving the Dead Sea').

The concept of a Red–Dead canal goes back to 1664, when Athanasius Kircher, a German Jesuit scholar, envisioned it as part of a regional network of transportation canals. Similar schemes have been revived and abandoned over the years, most notably following the 1973 energy crisis, when Israel considered building a hydropower plant on a canal linking the Mediterranean and the Dead Sea. Lately, however, the Red–Dead plan has gained momentum, mainly because of Jordan's desperate need for drinking water, and because of a desire by Israel, Jordan and the Palestinian Authority to collaborate on a 'peace conduit'. The three governments have developed shared goals for the project and in 2005 they jointly asked the World Bank to investigate its feasibility and environmental impacts. "It's the only place where Israel, Jordan and the Palestinian Authority are publicly working on a project together," says Alex McPhail, lead water and sanitation specialist at the World Bank, who is overseeing the bank's study programme of the plan.
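For a feel of the energy side of the scheme, here is a rough, purely illustrative calculation in Python of the gross hydropower available from a 400-metre drop; the flow figure matches the upper intake volume discussed below, and the density and efficiency values are generic assumptions rather than project numbers.

    # Rough gross hydropower potential of the proposed Red-Dead drop.
    # Illustrative only: the real balance depends on the final design, and much
    # of the energy would be consumed by desalination and pumping.
    SECONDS_PER_YEAR = 365 * 24 * 3600

    annual_volume_m3 = 2e9                 # up to 2 billion cubic metres per year
    flow_m3_per_s = annual_volume_m3 / SECONDS_PER_YEAR   # about 63 m^3/s

    head_m = 400                           # drop from the Red Sea to the Dead Sea
    density_kg_m3 = 1025                   # roughly, for seawater
    g = 9.81                               # m/s^2
    efficiency = 0.85                      # assumed turbine and generator efficiency

    power_w = density_kg_m3 * g * flow_m3_per_s * head_m * efficiency
    print(f"Flow: {flow_m3_per_s:.0f} m^3/s, gross power: about {power_w / 1e6:.0f} MW")

A couple of hundred megawatts of gross output is the scale implied by those inputs; whether that would cover the desalination and pumping demands is one of the questions the feasibility studies are meant to answer.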
Environmentalists aren't so keen. Friends of the Earth Middle East (FoEME), an Israeli–Jordanian–Palestinian advocacy group with branches in Tel Aviv, Amman and Bethlehem, has questioned the environmental effect of the conduit, which would cost billions of dollars to build. The intake pipe would draw up to 2 billion cubic metres of water each year from the Red Sea's Gulf of Aqaba, which would have an unknown effect on the sea's 1,000 or so species of fish and 110 species of reef-building coral. And the conduit would run through the Arava Valley, a haven for rare gazelles, hyrax and hares. The valley is also lined with a seismically active fault that could damage the water system. Most significantly, environmentalists argue that this expensive and potentially harmful project is unnecessary, and that Israel and Jordan could at least partially restore the River Jordan by conserving more of their water resources.

Reznik and other researchers in the region are busy resolving these and other issues. Their results will feed into the World Bank's final report in 2011, which will help to determine whether Israel, Jordan and the Palestinian Authority pursue the project, and whether it will attract funding.

The decline of the Dead Sea is obvious on a drive through the region with Reznik and his PhD supervisor, Jiwchar Ganor, a geologist at Ben-Gurion. At Ein Fashkha, a nature reserve and freshwater spring on the northwestern shore, small placards mark the shore line in 1968 and 1984. The first is now about two kilometres from the lake; the second sits forlornly beside a set of crumbling stone 'stairs to the sea', which is now nowhere to be seen.

A muddy mess
The lake's retreat has left a wasteland of exposed sediment, which is so salty that few plants can grow. Freshwater springs once fed oases of palm trees and other plants along the former lake shore, but the springs' outlets have migrated downhill into the muddy zone along the current shore. Environmentalists worry that the decline of the oases will harm migrating birds that stop in the region to fatten up before crossing the Sahara Desert. Infiltration of fresh water has also dissolved salt layers in the sediments, leading to the formation of some 3,000 sinkholes.

The retreat has also taken a toll on people because the Dead Sea is a tourist destination and home to several farming communities. Its retreat has halted efforts to build hotels and other amenities, and the sinkholes have undermined roads and bridges, and harmed agriculture.

The Dead Sea will probably never vanish completely — as its surface shrinks, its salinity increases and evaporation slows. "The Dead Sea will not die," says Ittai Gavrieli, acting director of the Geological Survey of Israel in Jerusalem, who is leading a series of modelling studies on the impact of the proposed conduit. But if nothing changes, the lake is likely to drop a further 100–150 metres from its current level of 423 metres below sea level, Gavrieli says.

Aside from arresting further ecological damage, the conduit could provide crucial help for the Kingdom of Jordan, says Mousa Jama'ani, secretary-general of the Jordan Valley Authority in Amman. Jordan is one of the world's poorest countries in terms of freshwater resources. The Gulf States are even more parched, but they use their oil for electricity production to power the desalination of seawater. "Here in Jordan, there is no oil, also no water," Jama'ani says. A desalination plant on the Red–Dead conduit would supply some of that badly needed drinking water to Jordan.
"The population here increases and increases, water resources are limited, and demands increase," says Jama'ani. "What we can do? The government has a responsibility to the people." The desalination plant would also supply clean water to the Palestinians, says Shaddad Attili, head of the Palestinian Water Authority in Ramallah. Palestinian water sources are limited to a mountain aquifer beneath the West Bank (part of which borders the Dead Sea) and a coastal aquifer — heavily contaminated with seawater and sewage — that supplies Gaza. "Palestinians haven't had access to the Jordan river basin since 1967," Attili says. Nor are they permitted to develop the northwestern shore of the Dead Sea. Attili believes that the agreement to cooperate on the Red–Dead project with Israel is a big achievement — a view reiterated by Uri Shor, spokesman for the Israeli Water Authority. Israel obtains about one-third of its water from the Sea of Galilee and most of the rest from underground aquifers; it also desalinates some 165 million cubic metres of sea and brackish water per year, about 9% of the country's annual consumption of 1.8 billion cubic metres. "Such a project is a platform for international co-operation, and therefore it's in our interest as well," Shor says. In 2008, with US$16.7 million in aid donated by eight countries, including the United States, France and Sweden, the World Bank launched a study programme to examine the feasibility of constructing the Red Sea–Dead Sea Water Conveyance and its social and environmental impact. The programme has released a series of interim reports1,2 over the past 18 months, examining such factors as the best route for the conduit (most likely through Jordanian territory) the form it will take (canal, tunnel, pipeline or some combination thereof) the type of intake on the Red Sea, where to site the pumping stations, desalination plant and hydropower facility, and the allocation of desalinated water. The World Bank also recently started two studies looking at the impact of the conduit on both the Red Sea and the Dead Sea. Last October, under pressure from FoEME, the World Bank initiated a 'study of alternatives', conducted by a trio of British, Jordanian and Israeli experts to examine other options, such as building a water pipeline from Turkey or restoring the flow of the River Jordan. One important factor is the risk of earthquakes. The Dead Sea Fault, an active seismic zone, runs the length of the Arava Valley and forms the border of two tectonic plates. According to the interim feasibility study1 commissioned by the World Bank, there is high risk of a major earthquake occurring within the operating life of the project, endangering pumping stations and the desalination plant. Gavrieli says that such damage could be forestalled if geologists can identify the fault location and engineers design the conduit and related facilities well enough. "If you know where the fault line is, you prepare for it; you reinforce it, you allow for some flexibility," he says. The World Bank reports also examine other potential hazards of the project. For example, pumping water from the Gulf of Aqaba could change currents, damaging corals or sea grasses. Sea water could leak from the conduit and contaminate groundwater. It could harm wildlife and archaeological sites in the Arava Valley, including ancient settlements, aqueducts and reservoirs, copper smelting sites and cemeteries. 
The Arava ecosystem potentially faces further threat from a plan by the Israeli real estate billionaire Yitzhak Tshuva to build a 'valley of peace' along the conduit, a Las Vegas-style city filled with parks, lakes, waterfalls, hotels and a botanical garden. Extreme inhabitants The Dead Sea itself could be transformed, with unknown long-term consequences. Despite its name, the sea is home to a variety of microorganisms, including a salt-tolerant unicellular green alga called Dunaliella and red Archaea from the family Halobacteriaceae. Dunaliella thrives when the sea is slightly diluted, as in rainy years such as 1992, when the lake level rose by two metres. A new conduit could also stimulate growth of the alga because the Dead Sea is much saltier than either the Red Sea or brine from a desalination plant. This dock was level with the Dead Sea in 2007 (photo: J. Glausiusz). Experiments conducted at the Dead Sea Works by Gavrieli and Aharon Oren, a microbiologist at the Hebrew University of Jerusalem, show that such an influx of less-salty water would trigger algal blooms. The effect would be enhanced by the addition of phosphate-based fertilizer, which enters the Dead Sea from the River Jordan3. A bloom of Dunaliella would in turn feed Archaea that could turn the sea red, says Gavrieli. Whether that is a problem is a matter of opinion. "What's the big deal if the Dead Sea is red?" Gavrieli asks. "What's worse, having it drop, or having it red?" On the other hand, a change in the Dead Sea's chemistry might turn the surface water white. Currently, the sea is supersaturated with gypsum, a form of calcium sulphate, which barely precipitates because the kinetics of the reaction are too slow4. But gypsum will precipitate into white crystals if Red Sea water, which contains ten times more sulphate, is pumped into the Dead Sea as reject brine from a desalination plant. In a ten-cubic-metre tank at the Dead Sea Works, Reznik, Ganor and Gavrieli have mixed equal parts of Dead Sea brine and desalination reject brine, and seen little white gypsum crystals bob on the surface. The large-scale, long-term consequences of such a change are unknown, says Gavrieli. Gypsum might sink to the lake bed, or it might form crystals that remain suspended in the upper water layer, turning it milky. A film of crystals on the surface could make the Dead Sea more reflective and slow evaporation; but if they remain in the top of the water column, the gypsum clumps could scatter light within the Dead Sea, raising its temperature and increasing evaporation. The Dead Sea Works and the Arab Potash Company are important players in the fate of the sea, as their evaporation ponds account for some 30–40% of the water-level decline. According to the Dead Sea Works, the ponds help the region by preserving the southern end of the Dead Sea, which dried up in 1977. They also provide employment for some 1,000 workers and support a large complex of hotels at Ein Bokek. But Gidon Bromberg, Israeli director of FoEME, is not impressed. The potash companies, he says, could switch to extracting minerals by forcing Dead Sea water through membranes under high pressure. That would take more money and energy, but it would cause significantly less water to evaporate.
A set of studies recently completed by FoEME suggests that countries in the region could restore 400 million to 600 million cubic metres of water per year to the River Jordan, at a lower cost than desalination, by managing demand through measures such as switching to compost or vacuum toilets, or flushing toilets with 'grey' water recycled from the shower. FoEME claims that more sustainable farming practices could also conserve water. Working together In the end, it will be the need for desalinated water that is most likely to drive the conduit's construction. Despite the considerable political tensions, all three governments need to cooperate, says Attili. "We don't have another choice," he says, because future generations will depend on new sources of water. At the moment, the future looks uncertain in the desert beside the Dead Sea. At the end of a long, dusty day, Ganor and Reznik drive high above the lake to a rock bearing a faint red line, which marks the water level measured by British surveyors between 1900 and 1913. The Dead Sea now lies 35 metres below. If nothing is done, the situation will only get worse, but a Red–Dead conduit would carry with it some real risks. The decision to stop the sea's decline, says Gavrieli, "is a matter of choosing between bad and worse. But the question is, what is bad and what is worse?" Josie Glausiusz is a freelance journalist in New York City. * References * Red Sea–Dead Sea Water Conveyance Study Program Feasibility Study. Options Screening and Evaluation Report. Executive Summary (Coyne et Bellier, 2009). * Red Sea–Dead Sea Water Conveyance Study, Environmental and Social Assessment. Preliminary Scoping Report. December 2008 (ERM/BRL/Eco Consult, 2008). * Oren, A. et al. J. Mar. Syst. 46, 121–131 (2004). * Reznik, I. J., Gavrieli, I. & Ganor, J. Geochim. Cosmochim. Acta 73, 6218–6230 (2009). - Toxicology: The big test for bisphenol A
- Nature 464(7292):1122 (2010)
After years of wrangling over the chemical's toxicity, researchers are charting a new way forward. Brendan Borrell investigates how the debate has reshaped environmental-health studies. In her 25 years of research, Gail Prins, a reproductive physiologist at the University of Illinois in Chicago, had got used to doing science her way. But when her experiments started to question the safety of bisphenol A (BPA), a chemical found in thousands of consumer products from food-can linings to baby bottles, she found her work under a new level of scrutiny. The experience was unnerving, she says. "I feel I do solid science." Even federal evaluators in the United States agreed that her work was suitable for informing decisions about BPA's safety — at least at first. A study published in early 2006, for example, helped explain how early exposure to BPA could increase rats' susceptibility to prostate cancer1. The work complemented a growing body of research suggesting that the chemical posed several developmental and cancer risks (see 'Hazard warning'). That December, when a panel on reproductive health drafted a report on BPA for the US National Toxicology Program (NTP), it determined that Prins's study "makes important contributions and is suitable for the evaluation process". But the following year, the NTP's final report discounted Prins's study. The chemical industry had stepped in to make its views heard. BPA grosses some US$6 billion a year for the five companies that produce it in the United States. Steven Hentges, who works on BPA for the American Chemistry Council, the industry trade group, wrote a 93-page letter to the NTP panel on 2 February 2007, detailing what he perceived as flaws in a slew of studies coming out of academic laboratories. (Source: http://cerhr.niehs.nih.gov/chemicals/bisphenol/bisphenol.pdf; picture: W. Eberhart/Getty) Prins's study came under attack for injecting the chemical under the rats' skin, rather than giving it orally, as humans would generally be exposed. Hentges found flaws in 60 of the roughly 80 studies that the panel found to be "adequate", "useful" or "suitable" for evaluating BPA's reproductive and developmental effects. Following Hentges's critique, the percentage of non-industry-funded studies deemed adequate for informing policy dropped from 70% to 30%. Most of those that remained found the chemical to be safe. Three years on, the debate over BPA's potential for harm is unresolved. Canada and Denmark have banned the chemical's use in baby bottles, toys and other products for infants. And manufacturers and retailers worldwide have begun to limit its use in response to mounting consumer concerns. For researchers, though, the issue exposes a growing gulf between basic research and the regimented world of toxicity testing. "They are very different systems that serve different purposes," says Richard Denison, a chemical-risk analyst in the Washington DC office of the Environmental Defense Fund, a non-governmental organization. Scientists such as Prins were among the first to highlight concerns about ubiquitous environmental toxins, but because they were not specifically aiming to test toxicity, they were easy targets for industry. Consequently, the dispute over BPA has had an unexpected outcome: it is shaping the way studies will be funded and conducted in academic labs.
To bridge the divide, the US National Institute of Environmental Health Sciences (NIEHS) in Research Triangle Park, North Carolina, is bringing researchers together to foster closer collaboration and a more rigorous, integrated body of research on BPA that can compete on an equal footing with the industry-sponsored studies. Endocrinologist Frederick vom Saal of the University of Missouri-Columbia, a pioneer in the study of BPA and one of those funded by the new NIEHS programme, calls it the "BPA master experiment". The new approach is already being adopted for other types of toxicity investigation. "If we didn't start doing business a little differently," says Jerry Heindel, the BPA programme manager at the NIEHS, "we might not have the answers we need." Toxic science The gap between the methods used in industry and those of basic researchers can be traced back to 1993, when vom Saal and his colleagues showed how organisms can be exquisitely sensitive to tiny amounts of hormone-like chemicals during development. These chemicals bind to the same receptors as hormones such as oestrogen and thus mimic their effect, potentially disrupting development. Vom Saal and his colleagues christened them 'endocrine disruptors'2. It was no secret that BPA was a potential endocrine disruptor. Scientists explored its effects on fertility in the 1930s, because of its similarities to oestrogen. It was abandoned for more powerful chemicals and ultimately found a use in making shatterproof polycarbonate plastic and epoxy resins used to coat metals. The US Food and Drug Administration (FDA) approved its use under food-additive regulations in the early 1960s. In the late 1990s and early 2000s, researchers began to amass evidence that BPA leaching from such products was acting as an endocrine disruptor, even if the mechanism remained unclear. Ana Soto, a cell biologist at Tufts University School of Medicine in Boston, found that low doses of BPA alter the development of mammary glands in mice and could lead to cancer3. Other researchers found a link in mice between BPA and hyperactivity and a heightened sensitivity to illegal drugs. In Prins's work, BPA had an effect on rat prostates similar to that of an injection of oestradiol, the body's main oestrogen. A mechanism for BPA's action was taking shape. For Prins's control animals, injected with corn oil, the ageing process naturally silenced the expression of a gene linked to the development of cancer. Animals injected with corn oil and BPA, however, had the gene locked in the 'on' position and were more than twice as likely to develop pre-cancerous lesions. Researchers have learned that the chemical is working on an epigenetic level — modifying gene expression, but not sequence, over a long period. Yet the academic goal of such work — to uncover and explore biological mechanisms — was quite different from that of guideline studies designed to evaluate chemical safety. Therein lies the root of the contention. Hentges's letter found bones to pick with the dozens of studies it attacked. Some used sample sizes that were too small; some had only looked at a single dose level; some had failed to carry studies through to the measurement of disease or dysfunction, stopping at surrogate endpoints. Some criticisms were method-specific. For example, Hentges said that an enzyme-based assay to detect concentrations of BPA in blood was not specific to BPA and could overestimate its levels.
Academic researchers had also identified similar methodological problems in the published BPA literature, but they generally did not regard them as fatal flaws. A common general criticism from Hentges, however, was that none of these studies were conducted according to Good Laboratory Practice (GLP), part of the testing guidelines developed by regulators around the world, outlining basic standards for equipment calibration and the storage of raw data. In general, when called on by federal regulators to test the safety of a substance, the chemical industry has relied on private labs such as RTI International in Research Triangle Park to carry out guideline studies using GLP. Academic researchers rarely conduct such studies, but in their deliberations about chemical safety, federal agencies are expected to examine all the evidence, GLP or not. Hentges says non-GLP studies should be given less weight. "There's certainly some role for academic studies in generating hypotheses, but they don't provide what regulators need to draw conclusions." But compared with scientific protocols, which evolve continuously, guideline standards advance in fits and starts, because adding new procedures requires a lengthy period of comment, revision and validation. The US Environmental Protection Agency (EPA) set limits for acceptable human exposure to BPA in the late 1980s. It set up a programme on endocrine disruption in 1998, but it took until October 2009 for methods to be sufficiently agreed on to request a first round of tests, and some say the tests are still inadequate. "This was the most dysfunctional thing I've ever sat on," says biologist Theo Colborn, who has been on the programme committee for its duration and runs the Endocrine Disruption Exchange in Paonia, Colorado. In the early days, she says, the science of endocrine disruption wasn't ready for standardized testing, but today, although the science is stronger, the tests the EPA is requesting are inadequate. "A chemical like BPA could easily be missed in the assays they have selected," she says. Once the stakeholders agree on a procedure, it must be validated, which is, in a sense, a substitute for replication. Validation involves multiple contract laboratories performing the same procedure and coming back with a consistent result. But according to Thomas Zoeller, an endocrinologist at the University of Massachusetts, Amherst, well-established and reproducible scientific techniques may have trouble getting validated when contract laboratories can't perform the procedures. Inappropriate tools At the EPA's request, Zoeller reviewed the raw data from three contract labs asked to measure thyroid hormone levels, and found that they could not conduct radioimmune assays that have been available for more than 40 years. "The problem is the assay is difficult," says Rochelle Tyl, a toxicologist at RTI International, which was part of the validation process. "If experienced labs can't run the assay, how can we put it as a guideline?" That's exactly the point, Zoeller and other critics argue: contract labs may not be able to apply the appropriate tools. Academic scientists studying BPA are convinced that their techniques are better for understanding the dangers of oestrogen mimics. Prins, one of only a handful of researchers to respond, replied to Hentges's charges in her own comment to the NTP panel.
In her letter, she noted that "by selectively eliminating data collected from non-oral routes of administration, the committee has introduced a significant bias in the process". She says that the key factor was the level of BPA circulating in the bloodstream of her rats, estimated to be about 10 parts per billion and consistent with amounts generally found in humans. Such calculations aren't even required in guideline studies. "Most academic scientists have quality-control standards that are above GLP standards," says Prins, who does run a GLP-compliant clinical lab. But getting certified is expensive, time-consuming and generally unnecessary for research goals. Zoeller, vom Saal, and 34 other researchers published a commentary in Environmental Health Perspectives last year, arguing that regulatory agencies should give no greater weight to GLP-guideline studies than to rigorous, replicated peer-reviewed research5. Zoeller admits it is a tough balance. "Regulatory agencies shouldn't have to run around every time some academic lab finds something, somewhere with some esoteric technology," he says. "What we have to figure out is how to invest modern science in regulatory toxicology and make it work." A new direction In 2008, in an attempt to close the holes poked in academic research on BPA, the NIEHS made plans to direct US$30 million to BPA research in 2010 and 2011, including $14 million in stimulus funds. In the past, the agency would typically put out a request for grant applications in a general area, fund researchers in different areas, and let them go off and conduct their research. This time, the NIEHS was going to do things differently. "We may fund the best research by the best investigators, but doing that doesn't guarantee we'll fill all the data gaps that need to be filled," says Heindel. Gail Prins now delivers bisphenol A to mouse pups orally rather than injecting it under the skin (photo: B. Vogelzang). So last October, more than 40 scientists working on BPA, including Prins, arrived at the agency's campus in Research Triangle Park. The meeting alerted researchers to technical issues in working with BPA, and also allowed them to reach a consensus on certain study protocols. To counter concerns about the assays used to detect BPA levels in blood, Antonia Calafat, an analytical chemist at the Centers for Disease Control and Prevention in Atlanta, will do the lion's share of chemical analysis using mass spectrometry, calibrated with a BPA isotope not found in the environment. At the meeting, Calafat also advised researchers to include 'blanks' of distilled BPA-free water to prove that experiments were free from contamination, and to test food to ensure it contains minimal levels of plant-based oestrogens. Researchers also discussed the use of positive controls, such as the hormone oestradiol, to confirm that experiments that fail to find a response to BPA are actually working. The NIEHS is also encouraging researchers to use consistent BPA doses across labs, to measure additional variables such as markers related to diabetes, and to exchange tissues. Soto, for example, studies female development, but she will raise male and female mice during her experiments. She will send the males' prostates to Prins in Chicago.
In addition, the FDA's National Center for Toxicological Research in Jefferson, Arkansas, will run animal studies as a GLP backbone for the entire project, making tissues and animals available to external grantees. To help grantees take advantage of these collaborations, the NIEHS offered them supplements of up to $50,000 per year. Heindel says that the agency has already adopted the same approach for a cohort of grantees beginning studies of nanomaterial safety this spring. "Let's learn from the lessons of BPA and start developing collaborations and interactions to move the field right from the start," he says. Hentges says he is "encouraged that the research will be moving towards studies that will really be helpful to assess the safety of BPA". Although many of the funded researchers were quick to praise the programme, it also means that they will sometimes be doing things they don't necessarily agree with. For example, vom Saal takes issue with dropping the enzyme-based assay for BPA concentrations. The test costs several dollars, compared with the $150 for the more sophisticated methods the NIEHS now requires. He says that enzyme-based assays are only problematic for researchers not familiar with their nuances. However, he is willing to adopt the more expensive technique if it means producing unassailable results. "If you are going to spend millions of dollars on new research, then it's best not to open it up to criticism," he says. Prins, like vom Saal, disputes the relevance of administering the chemical orally as opposed to under the skin, but they all agreed in October to conduct some of their experiments by feeding BPA to animals, as the chemical industry has insisted. "I can't particularly say I like people telling me how to conduct my research," Prins says. "But it's very important that things are done consistently between investigators if we are going to move the field forward." Brendan Borrell is a freelance writer in New York City. * References * Ho, S.-M. et al. Cancer Res. 66, 5624–5632 (2006). * Colborn, T., vom Saal, F. S. & Soto, A. M. Environ. Health Perspect. 101, 378–384 (1993). * Markey, C. M. et al. Biol. Reprod. 65, 1215–1223 (2001). * Suzuki, T. et al. Neuroscience 117, 639–644 (2003). * Myers, J. P. et al. Environ. Health Perspect. 117, 309–315 (2009). - Climate policy: role of scientists in public advocacy
- Nature 464(7292):1125 (2010)
In his review of my book Science as a Contact Sport — a personal retrospective account of the development of climate science and policy covering 40 years — Roger Pielke Jr misrepresents my position on advocacy (Nature 464, 352–353; 2010). Pielke fairly represents my decades-old argument that scientists should avoid policy prescriptions.
- Nature 464(7292):1125 (2010)
In his Review of books by James Hansen and Stephen Schneider (Nature 464, 352–353; 2010), Roger Pielke Jr misguidedly implies that it is undemocratic for climate scientists to call for action against climate change in the name of science. The normative assumptions underlying climate-change policy recommendations receive much less public attention than the scientific facts that fuel policy deliberation.
- Nature 464(7292):1125 (2010)
The improvements you describe in university rating systems are welcome (Nature 464, 7–8 and Nature 464, 16–17; 2010), but they are not notably geared to the interests of the students. We need broader ratings that clearly indicate the practical advantages of studying at a particular university.
- Nature 464(7292):1125 (2010)
Your News story describing offers of research collaboration by China to Africa (Nature 464, 477; 2010) conveys an overall negative impression of what I believe to be a successful initiative. For example, you raise doubts about the calibre of students from Africa who train as scientists in China.
- Nature 464(7292):1125 (2010)
Every two weeks a language becomes extinct, according to the United Nations Permanent Forum on Indigenous Issues (see http://go.nature.com/RLfdzx).
- Nature 464(7292):1126 (2010)
Current national emissions targets can't limit global warming to 2 °C, calculate Joeri Rogelj, Malte Meinshausen and colleagues — they might even lock the world into exceeding 3 °C warming. - Building life from the bottom up
- Nature 464(7292):1129 (2010)
Engineering biological systems and organisms is a costly team effort and may be incompatible with an open-source regulatory environment, finds Michael A. Goldman. - Why twins age differently
- Nature 464(7292):1130 (2010)
Some signs of ageing in animals originate from accumulated damage to the genome, proteins or corrupted cell components, reflecting a decline in bodily maintenance. Others arise not from this primary damage, but through damage-limitation mechanisms that are provoked by cellular malfunctions.
- Nature 464(7292):1131 (2010)
Seen from a distance, some of the wild lynx of Colorado seem to have three ears. As the book Living Through the End of Nature explains, behind their trademark pointed ears, the lynx sport an antenna wire that is connected to a radio collar. - Quantum measurement: A condensate's main squeeze
- Nature 464(7292):1133 (2010)
Entanglement between particles permits the quantum uncertainty in one variable to be reduced at the cost of increasing that in another. Condensates are an ideal system in which this technique can be studied. - Neuroscience: Signals far and away
- Nature 464(7292):1134 (2010)
The neocortex of the mammalian brain mediates functions such as sensory perception and ultimately consciousness and language. The spread of local signals across large distances in this brain region has now been clarified. - Chemistry: Not just any old anion
- Nature 464(7292):1136 (2010)
Unlike its neighbours on the right-hand side of the periodic table, boron barely forms an anion. A new trick has been established that allows it to do so, enabling a highly unusual complex to be prepared. - Materials science: A cloak of liquidity
- Nature 464(7292):1137 (2010)
Droplets of a liquid alloy on a silicon surface can rearrange the surface atoms so that they mimic the short-range ordering of atoms in the alloy. Remarkably, this effect inhibits freezing of the droplets. - 50 & 100 years ago
- Nature 464(7292):1138 (2010)
This article describes a model which imitates certain aspects of morphogenesis and maintenance in a developing embryo. The model consists of a number of artificial 'cells' (constructed from radio parts) which can stimulate and inhibit one another by means of connexions which are made through a switchboard. - Infectious disease: Listeria does it again
- Nature 464(7292):1138 (2010)
Proteins are synthesized by ribosomes, and then commonly undergo further modifications. A new example of how these host-cell processes can be subverted by a pathogenic bacterium has come to light.
- Nature 464(7292):1140 (2010)
The ability to perceive Earth's magnetic field, which at one time was dismissed as a physical impossibility, is now known to exist in diverse animals. The receptors for the magnetic sense remain elusive. But it seems that at least two underlying mechanisms exist — sometimes in the same organism. - Generation of a novel wing colour pattern by the Wingless morphogen
Werner T Koshikawa S Williams TM Carroll SB - Nature 464(7292):1143 (2010)
The complex, geometric colour patterns of many animal bodies have important roles in behaviour and ecology. The generation of certain patterns has been the subject of considerable theoretical exploration; however, very little is known about the actual mechanisms underlying colour pattern formation or evolution. Here we have investigated the generation and evolution of the complex, spotted wing pattern of Drosophila guttifera. We show that wing spots are induced by the Wingless morphogen, which is expressed at many discrete sites that are specified by pre-existing positional information that governs the development of wing structures. Furthermore, we demonstrate that the elaborate spot pattern evolved from simpler schemes by co-option of Wingless expression at new sites. This example of a complex design developing and evolving by the layering of new patterns on pre-patterns is likely to be a general theme in other animals.
Thorel F Népote V Avril I Kohno K Desgraz R Chera S Herrera PL - Nature 464(7292):1149 (2010)
Pancreatic insulin-producing β-cells have a long lifespan, such that in healthy conditions they replicate little during a lifetime. Nevertheless, they show increased self-duplication after increased metabolic demand or after injury (that is, β-cell loss). It is not known whether adult mammals can differentiate (regenerate) new β-cells after extreme, total β-cell loss, as in diabetes. This would indicate differentiation from precursors or another heterologous (non-β-cell) source. Here we show β-cell regeneration in a transgenic model of diphtheria-toxin-induced acute selective near-total β-cell ablation. If given insulin, the mice survived and showed β-cell mass augmentation with time. Lineage-tracing to label the glucagon-producing α-cells before β-cell ablation tracked large fractions of regenerated β-cells as deriving from α-cells, revealing a previously disregarded degree of pancreatic cell plasticity. Such inter-endocrine spontaneous adult cell conversion could be harnessed towards methods of producing β-cells for diabetes therapies, either in differentiation settings in vitro or in induced regeneration.
- Nature 464(7292):1155 (2010)
The cerebral cortex constructs a coherent representation of the world by integrating distinct features of the sensory environment. Although these features are processed vertically across cortical layers, horizontal projections interconnecting neighbouring cortical domains allow these features to be processed in a context-dependent manner. Despite the wealth of physiological and psychophysical studies addressing the function of horizontal projections, how they coordinate activity among cortical domains remains poorly understood. We addressed this question by selectively activating horizontal projection neurons in mouse somatosensory cortex, and determined how the resulting spatial pattern of excitation and inhibition affects cortical activity. We found that horizontal projections suppress superficial layers while simultaneously activating deeper cortical output layers. This layer-specific modulation does not result from a spatial separation of excitation and inhibition, but from a layer-specific ratio between these two opposing conductances. Through this mechanism, cortical domains exploit horizontal projections to compete for cortical space.
- Nature 464(7292):1161 (2010)
The nearby extrasolar planet GJ 436b—which has been labelled as a 'hot Neptune'—reveals itself by the dimming of light as it crosses in front of and behind its parent star as seen from Earth. Respectively known as the primary transit and secondary eclipse, the former constrains the planet's radius and mass1, 2, and the latter constrains the planet's temperature3, 4 and, with measurements at multiple wavelengths, its atmospheric composition. Previous work5 using transmission spectroscopy failed to detect the 1.4-μm water vapour band, leaving the planet's atmospheric composition poorly constrained. Here we report the detection of planetary thermal emission from the dayside of GJ 436b at multiple infrared wavelengths during the secondary eclipse. The best-fit compositional models contain a high CO abundance and a substantial methane (CH4) deficiency relative to thermochemical equilibrium models6 for the predicted hydrogen-dominated atmosphere7, 8. Moreover, we report the presence of some H2O and traces of CO2. Because CH4 is expected to be the dominant carbon-bearing species, disequilibrium processes such as vertical mixing9 and polymerization of methane10 into substances such as ethylene may be required to explain the hot Neptune's small CH4-to-CO ratio, which is at least 10^5 times smaller than predicted6.
Gross C Zibold T Nicklas E Estève J Oberthaler MK - Nature 464(7292):1165 (2010)
Interference is fundamental to wave dynamics and quantum mechanics. The quantum wave properties of particles are exploited in metrology using atom interferometers, allowing for high-precision inertia measurements1, 2. Furthermore, the state-of-the-art time standard is based on an interferometric technique known as Ramsey spectroscopy. However, the precision of an interferometer is limited by classical statistics owing to the finite number of atoms used to deduce the quantity of interest3. Here we show experimentally that the classical precision limit can be surpassed using nonlinear atom interferometry with a Bose–Einstein condensate. Controlled interactions between the atoms lead to non-classical entangled states within the interferometer; this represents an alternative approach to the use of non-classical input states4, 5, 6, 7, 8. Extending quantum interferometry9 to the regime of large atom number, we find that phase sensitivity is enhanced by 15 per cent relative to that in an ideal classical measurement. Our nonlinear atomic beam splitter follows the 'one-axis-twisting' scheme10 and implements interaction control using a narrow Feshbach resonance. We perform noise tomography of the quantum state within the interferometer and detect coherent spin squeezing with a squeezing factor of -8.2 dB (refs 11–15). The results provide information on the many-particle quantum state, and imply the entanglement of 170 atoms16.
Riedel MF Böhi P Li Y Hänsch TW Sinatra A Treutlein P - Nature 464(7292):1170 (2010)
Atom chips provide a versatile quantum laboratory for experiments with ultracold atomic gases1. They have been used in diverse experiments involving low-dimensional quantum gases2, cavity quantum electrodynamics3, atom–surface interactions4, 5, and chip-based atomic clocks6 and interferometers7, 8. However, a severe limitation of atom chips is that techniques to control atomic interactions and to generate entanglement have not been experimentally available so far. Such techniques enable chip-based studies of entangled many-body systems and are a key prerequisite for atom chip applications in quantum simulations9, quantum information processing10 and quantum metrology11. Here we report the experimental generation of multi-particle entanglement on an atom chip by controlling elastic collisional interactions with a state-dependent potential12. We use this technique to generate spin-squeezed states of a two-component Bose–Einstein condensate13; such states are a useful resource for quantum metrology. The observed reduction in spin noise of -3.7 ± 0.4 dB, combined with the spin coherence, implies four-partite entanglement between the condensate atoms14; this could be used to improve an interferometric measurement by -2.5 ± 0.6 dB over the standard quantum limit15. Our data show good agreement with a dynamical multi-mode simulation16 and allow us to reconstruct the Wigner function17 of the spin-squeezed condensate. The techniques reported here could be directly applied to chip-based atomic clocks, currently under development18.
- Nature 464(7292):1174 (2010)
The phenomenon of supercooling in metals—that is, the preservation of a disordered, fluid phase in a metastable state well below the melting point1—has led to speculation that local atomic structure configurations of dense, symmetric, but non-periodic packing act as the main barrier for crystal nucleation2, 3. For liquids in contact with solids, crystalline surfaces induce layering of the adjacent atoms in the liquid4, 5 and may prevent or lower supercooling6. This seed effect is supposed to depend on the local lateral order adopted in the last atomic layers of the liquid in contact with the crystal. Although it has been suggested that there might be a direct coupling between surface-induced lateral order and supercooling6, no experimental observation of such lateral ordering at interfaces is available6. Here we report supercooling in gold-silicon (AuSi) eutectic droplets, enhanced by a Au-induced (6 × 6) reconstruction of the Si(111) substrate. In situ X-ray scattering and ab initio molecular dynamics reveal that pentagonal atomic arrangements of Au atoms at this interface favour a lateral-ordering stabilization process of the liquid phase. This interface-enhanced stabilization of the liquid state shows the importance of the solid–liquid interaction for the structure of the adjacent liquid layers. Such processes are important for present and future technologies, as fluidity and crystallization play a key part in soldering and casting, as well as in processing and controlling chemical reactions for microfluidic devices or during the vapour–liquid–solid growth of semiconductor nanowires.
- Nature 464(7292):1178 (2010)
The production of artificial fertilizers, fossil fuel use and leguminous agriculture worldwide has increased the amount of reactive nitrogen in the natural environment by an order of magnitude since the Industrial Revolution1. This reorganization of the nitrogen cycle has led to an increase in food production2, but increasingly causes a number of environmental problems1, 3. One such problem is the accumulation of nitrate in both freshwater and coastal marine ecosystems. Here we establish that ecosystem nitrate accrual exhibits consistent and negative nonlinear correlations with organic carbon availability along a hydrologic continuum from soils, through freshwater systems and coastal margins, to the open ocean. The trend also prevails in ecosystems subject to substantial human alteration. Across this diversity of environments, we find evidence that resource stoichiometry (organic carbon:nitrate) strongly influences nitrate accumulation by regulating a suite of microbial processes that couple dissolved organic carbon and nitrate cycling. With the help of a meta-analysis we show that heterotrophic microbes maintain low nitrate concentrations when organic carbon:nitrate ratios match the stoichiometric demands of microbial anabolism. When resource ratios drop below the minimum carbon:nitrogen ratio of microbial biomass4, however, the onset of carbon limitation appears to drive rapid nitrate accrual, which may then be further enhanced by nitrification. At low organic carbon:nitrate ratios, denitrification appears to constrain the extent of nitrate accretion, once organic carbon and nitrate availability approach the 1:1 stoichiometry5 of this catabolic process. Collectively, these microbial processes express themselves on local to global scales by restricting the threshold ratios underlying nitrate accrual to a constrained stoichiometric window. Our findings indicate that ecological stoichiometry can help explain the fate of nitrate across disparate environments and in the face of human disturbance.
Komiyama T Sato TR O'Connor DH Zhang YX Huber D Hooks BM Gabitto M Svoboda K - Nature 464(7292):1182 (2010)
Cortical neurons form specific circuits1, but the functional structure of this microarchitecture and its relation to behaviour are poorly understood. Two-photon calcium imaging can monitor activity of spatially defined neuronal ensembles in the mammalian cortex2, 3, 4, 5. Here we applied this technique to the motor cortex of mice performing a choice behaviour. Head-fixed mice were trained to lick in response to one of two odours, and to withhold licking for the other odour6, 7. Mice routinely showed significant learning within the first behavioural session and across sessions. Microstimulation8, 9 and trans-synaptic tracing10, 11 identified two non-overlapping candidate tongue motor cortical areas. Inactivating either area impaired voluntary licking. Imaging in layer 2/3 showed neurons with diverse response types in both areas. Activity in approximately half of the imaged neurons distinguished trial types associated with different actions. Many neurons showed modulation coinciding with or preceding the action, consistent with their involvement in motor control. Neurons with different response types were spatially intermingled. Nearby neurons (within ~150 μm) showed pronounced coincident activity. These temporal correlations increased with learning within and across behavioural sessions, specifically for neuron pairs with similar response types. We propose that correlated activity in specific ensembles of functionally related neurons is a signature of learning-related circuit plasticity. Our findings reveal a fine-scale and dynamic organization of the frontal cortex that probably underlies flexible behaviour.
Zheng W Zhao H Mancera E Steinmetz LM Snyder M - Nature 464(7292):1187 (2010)
Variation in transcriptional regulation is thought to be a major cause of phenotypic diversity1, 2. Although widespread differences in gene expression among individuals of a species have been observed3, 4, 5, 6, 7, 8, studies to examine the variability of transcription factor binding on a global scale have not been performed, and thus the extent and underlying genetic basis of transcription factor binding diversity is unknown. By mapping differences in transcription factor binding among individuals, here we present the genetic basis of such variation on a genome-wide scale. Whole-genome Ste12-binding profiles were determined using chromatin immunoprecipitation coupled with DNA sequencing in pheromone-treated cells of 43 segregants of a cross between two highly diverged yeast strains and their parental lines. We identified extensive Ste12-binding variation among individuals, and mapped underlying cis- and trans-acting loci responsible for such variation. We showed that most transcription factor binding variation is cis-linked, and that many variations are associated with polymorphisms residing in the binding motifs of Ste12 as well as those of several proposed Ste12 cofactors. We also identified two trans-factors, AMN1 and FLO8, that modulate Ste12 binding to promoters of more than ten genes under α-factor treatment. Neither of these two genes was previously known to regulate Ste12, and we suggest that they may be mediators of gene activity and phenotypic diversity. Ste12 binding strongly correlates with gene expression for more than 200 genes, indicating that binding variation is functional. Many of the variable-bound genes are involved in cell wall organization and biogenesis. Overall, these studies identified genetic regulators of molecular diversity among individuals and provide new insights into mechanisms of gene regulation.
- Nature 464(7292):1192 (2010)
During infection, pathogenic bacteria manipulate the host cell in various ways to allow their own replication, propagation and escape from host immune responses. Post-translational modifications are unique mechanisms that allow cells to rapidly, locally and specifically modify activity or interactions of key proteins. Some of these modifications, including phosphorylation and ubiquitylation1, 2, can be induced by pathogens. However, the effects of pathogenic bacteria on SUMOylation, an essential post-translational modification in eukaryotic cells3, remain largely unknown. Here we show that infection with Listeria monocytogenes leads to a decrease in the levels of cellular SUMO-conjugated proteins. This event is triggered by the bacterial virulence factor listeriolysin O (LLO), which induces a proteasome-independent degradation of Ubc9, an essential enzyme of the SUMOylation machinery, and a proteasome-dependent degradation of some SUMOylated proteins. The effect of LLO on Ubc9 is dependent on the pore-forming capacity of the toxin and is shared by other bacterial pore-forming toxins like perfringolysin O (PFO) and pneumolysin (PLY). Ubc9 degradation was also observed in vivo in infected mice. Furthermore, we show that SUMO overexpression impairs bacterial infection. Together, our results reveal that Listeria, and probably other pathogens, dampen the host response by decreasing the SUMOylation level of proteins critical for infection.
Nicoli S Standley C Walker P Hurlstone A Fogarty KE Lawson ND - Nature 464(7292):1196 (2010)
Within the circulatory system, blood flow regulates vascular remodelling1, stimulates blood stem cell formation2, and has a role in the pathology of vascular disease3. During vertebrate embryogenesis, vascular patterning is initially guided by conserved genetic pathways that act before circulation4. Subsequently, endothelial cells must incorporate the mechanosensory stimulus of blood flow with these early signals to shape the embryonic vascular system4. However, few details are known about how these signals are integrated during development. To investigate this process, we focused on the aortic arch (AA) blood vessels, which are known to remodel in response to blood flow1. By using two-photon imaging of live zebrafish embryos, we observe that flow is essential for angiogenesis during AA development. We further find that angiogenic sprouting of AA vessels requires a flow-induced genetic pathway in which the mechano-sensitive zinc finger transcription factor klf2a5, 6, 7 induces expression of an endothelial-specific microRNA, mir-126, to activate Vegf signalling. Taken together, our work describes a novel genetic mechanism in which a microRNA facilitates integration of a physiological stimulus with growth factor signalling in endothelial cells to guide angiogenesis.
de Calignon A Fox LM Pitstick R Carlson GA Bacskai BJ Spires-Jones TL Hyman BT - Nature 464(7292):1201 (2010)
Studies of post-mortem tissue have shown that the location of fibrillar tau deposits, called neurofibrillary tangles (NFT), matches closely with regions of massive neuronal death1, 2, severe cytological abnormalities3, and markers of caspase activation and apoptosis4, 5, 6, leading to the idea that tangles cause neurodegeneration in Alzheimer's disease and tau-related frontotemporal dementia. However, using in vivo multiphoton imaging to observe tangles and activation of executioner caspases in living tau transgenic mice (Tg4510 strain), we find the opposite: caspase activation occurs first, and precedes tangle formation by hours to days. New tangles form within a day. After a new tangle forms, the neuron remains alive and caspase activity seems to be suppressed. Similarly, introduction of wild-type 4-repeat tau (tau-4R) into wild-type animals triggered caspase activation, tau truncation and tau aggregation. Adeno-associated virus-mediated expression of a construct mimicking caspase-cleaved tau into wild-type mice led to the appearance of intracellular aggregates, tangle-related conformational- and phospho-epitopes, and the recruitment of full-length endogenous tau to the aggregates. On the basis of these data, we propose a new model in which caspase activation cleaves tau to initiate tangle formation, then truncated tau recruits normal tau to misfold and form tangles. Because tangle-bearing neurons are long-lived, we suggest that tangles are 'off pathway' to acute neuronal death. Soluble tau species, rather than fibrillar tau, may be the critical toxic moiety underlying neurodegeneration.
Han Z Niu T Chang J Lei X Zhao M Wang Q Cheng W Wang J Feng Y Chai J - Nature 464(7292):1205 (2010)
Recent studies1, 2, 3, 4, 5 have unequivocally associated the fat mass and obesity-associated (FTO) gene with the risk of obesity. In vitro FTO protein is an AlkB-like DNA/RNA demethylase with a strong preference for 3-methylthymidine (3-meT) in single-stranded DNA or 3-methyluracil (3-meU) in single-stranded RNA6, 7, 8. Here we report the crystal structure of FTO in complex with the mononucleotide 3-meT. FTO comprises an amino-terminal AlkB-like domain and a carboxy-terminal domain with a novel fold. Biochemical assays show that these two domains interact with each other, which is required for FTO catalytic activity. In contrast with the structures of other AlkB members, FTO possesses an extra loop covering one side of the conserved jelly-roll motif. Structural comparison shows that this loop selectively competes with the unmethylated strand of the DNA duplex for binding to FTO, suggesting that it has an important role in FTO selection against double-stranded nucleic acids. The ability of FTO to distinguish 3-meT or 3-meU from other nucleotides is conferred by its hydrogen-bonding interaction with the two carbonyl oxygen atoms in 3-meT or 3-meU. Taken together, these results provide a structural basis for understanding FTO substrate-specificity, and serve as a foundation for the rational design of FTO inhibitors.
Iwai M Takizawa K Tokutsu R Okamuro A Takahashi Y Minagawa J - Nature 464(7292):1210 (2010)
Photosynthetic light reactions establish electron flow in the chloroplast's thylakoid membranes, leading to the production of the ATP and NADPH that participate in carbon fixation. Two modes of electron flow exist—linear electron flow (LEF) from water to NADP+ via photosystem (PS) II and PSI in series1 and cyclic electron flow (CEF) around PSI (ref. 2). Although CEF is essential for satisfying the varying demand for ATP, the exact molecule(s) and operational site are as yet unclear. In the green alga Chlamydomonas reinhardtii, the electron flow shifts from LEF to CEF on preferential excitation of PSII (ref. 3), which is brought about by an energy balancing mechanism between PSII and PSI (state transitions4). Here, we isolated a protein supercomplex composed of PSI with its own light-harvesting complex (LHCI), the PSII light-harvesting complex (LHCII), the cytochrome b6f complex (Cyt bf), ferredoxin (Fd)-NADPH oxidoreductase (FNR), and the integral membrane protein PGRL1 (ref. 5) from C. reinhardtii cells under PSII-favouring conditions. Spectroscopic analyses indicated that on illumination, reducing equivalents from downstream of PSI were transferred to Cyt bf, whereas oxidised PSI was re-reduced by reducing equivalents from Cyt bf, indicating that this supercomplex is engaged in CEF (Supplementary Fig. 1). Thus, formation and dissociation of the PSI–LHCI–LHCII–FNR–Cyt bf–PGRL1 supercomplex not only controlled the energy balance of the two photosystems, but also switched the mode of photosynthetic electron flow.
Roberts SA Strande N Burkhalter MD Strom C Havener JM Hasty P Ramsden DA - Nature 464(7292):1214 (2010)
Mammalian cells require non-homologous end joining (NHEJ) for the efficient repair of chromosomal DNA double-strand breaks1. A key feature of biological sources of strand breaks is associated nucleotide damage, including base loss (abasic or apurinic/apyrimidinic (AP) sites)2. At single-strand breaks, 5′-terminal abasic sites are excised by the 5′-deoxyribose-5-phosphate (5′-dRP) lyase activity of DNA polymerase β (pol β)3, 4, 5, 6: here we show, in vitro and in cells, that accurate and efficient repair by NHEJ of double-strand breaks with such damage similarly requires 5′-dRP/AP lyase activity. Classically defined NHEJ is moreover uniquely effective at coupling this end-cleaning step to joining in cells, helping to distinguish this pathway from otherwise robust alternative NHEJ pathways. The NHEJ factor Ku can be identified as an effective 5′-dRP/AP lyase. In a similar manner to other lyases7, Ku nicks DNA 3′ of an abasic site by a mechanism involving a Schiff-base covalent intermediate with the abasic site. We show by using cell extracts that Ku is essential for the efficient removal of AP sites near double-strand breaks and, consistent with this result, that joining of such breaks is specifically decreased in cells complemented with a lyase-attenuated Ku mutant. Ku had previously been presumed only to recognize ends and recruit other factors that process ends; our data support an unexpected direct role for Ku in end-processing steps as well.
Schröder GF Levitt M Brunger AT - Nature 464(7292):1218 (2010)
X-ray diffraction plays a pivotal role in the understanding of biological systems by revealing atomic structures of proteins, nucleic acids and their complexes, with much recent interest in very large assemblies like the ribosome. As crystals of such large assemblies often diffract weakly (resolution worse than 4 Å), we need methods that work at such low resolution. In macromolecular assemblies, some of the components may be known at high resolution, whereas others are unknown: current refinement methods fail as they require a high-resolution starting structure for the entire complex1. Determining the structure of such complexes, which are often of key biological importance, should be possible in principle as the number of independent diffraction intensities at a resolution better than 5 Å generally exceeds the number of degrees of freedom. Here we introduce a method that adds specific information from known homologous structures but allows global and local deformations of these homology models. Our approach uses the observation that local protein structure tends to be conserved as sequence and function evolve. Cross-validation with Rfree (the free R-factor) determines the optimum deformation and influence of the homology model. For test cases at 3.5–5 Å resolution with known structures at high resolution, our method gives significant improvements over conventional refinement in the model as monitored by coordinate accuracy, the definition of secondary structure and the quality of electron density maps. For re-refinements of a representative set of 19 low-resolution crystal structures from the Protein Data Bank, we find similar improvements. Thus, a structure derived from low-resolution diffraction data can have quality similar to a high-resolution structure. Our method is applicable to the study of weakly diffracting crystals using X-ray micro-diffraction2 as well as data from new X-ray light sources3. Use of homology information is not restricted to X-ray crystallography and cryo-electron microscopy: as optical imaging advances to subnanometre resolution4, 5, it can use similar tools.
- Nature 464(7292):1225 (2010)
The scientific community now seems convinced that small RNAs will become therapies, if new tools can help these large molecules to make it safely into cells. Monya Baker reports. - RNA interference: From tools to therapies
- Nature 464(7292):1225 (2010)
It's not only clinical researchers who are looking for the best ways to get synthetic oligonucleotides into cells in living mammals. If the delivery problem were solved, says Phillip Sharp, who studies RNA interference (RNAi) at the Massachusetts Institute of Technology (MIT) in Cambridge, the ability to use animals to study physiology and disease would expand dramatically. - RNA interference: MicroRNAs as biomarkers
- Nature 464(7292):1227 (2010)
In the 1990s, Carlo Croce, then director of the Kimmel Cancer Center in Philadelphia, Pennsylvania, was hunting for genes involved in chronic lymphocytic leukaemia. The disease was consistently associated with a lesion in chromosome 13, and so, back before the human genome was sequenced, his lab determined the identity of the nucleotides in an 800-kilobase stretch from the deleted region and began searching for protein-coding genes. - RNA interference: Table of suppliers
- Nature 464(7292):1229 (2010)
Table 1 - Corrective action
- Nature 464(7292):1238 (2010)
Accidents will happen.