Sunday, November 20, 2011
OPERA experiment reports anomaly in flight time of neutrinos from CERN to Gran Sasso
UPDATE 18 November 2011
Following the OPERA collaboration's presentation at CERN on 23 September, inviting scrutiny of their neutrino time-of-flight measurement from the broader particle physics community, the collaboration has rechecked many aspects of its analysis and taken into account valuable suggestions from a wide range of sources. One key test was to repeat the measurement with very short beam pulses from CERN. This allowed the extraction time of the protons that ultimately give rise to the neutrino beam to be measured more precisely.
The beam sent from CERN consisted of pulses three nanoseconds long separated by up to 524 nanoseconds. Some 20 clean neutrino events were measured at the Gran Sasso Laboratory, and precisely associated with the pulse leaving CERN. This test confirms the accuracy of OPERA's timing measurement, ruling out one potential source of systematic error. The new measurements do not change the initial conclusion. Nevertheless, the observed anomaly in the neutrinos' time of flight from CERN to Gran Sasso still needs further scrutiny and independent measurement before it can be refuted or confirmed.
On 17 November, the collaboration submitted a paper on this measurement to the peer-reviewed Journal of High Energy Physics (JHEP). This paper is also available on the Inspire website.
Geneva, 23 September 2011. The OPERA experiment, which observes a neutrino beam from CERN 730 km away at Italy’s INFN Gran Sasso Laboratory, will present new results in a seminar at CERN this afternoon at 16:00 CEST. The seminar will be webcast at http://webcast.cern.ch. Journalists wishing to ask questions may do so via twitter using the hash tag #nuquestions, or via the usual CERN press office channels.
The OPERA result is based on the observation of over 15,000 neutrino events measured at Gran Sasso, and appears to indicate that the neutrinos travel at a velocity 20 parts per million above the speed of light, nature’s cosmic speed limit. Given the potential far-reaching consequences of such a result, independent measurements are needed before the effect can either be refuted or firmly established. This is why the OPERA collaboration has decided to open the result to broader scrutiny. The collaboration’s result is available on the preprint server arxiv.org: http://arxiv.org/abs/1109.4897.
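To put the quoted figures on a common scale, here is a minimal back-of-the-envelope sketch (not the collaboration's analysis, which involves many corrections): it converts the 20 parts-per-million velocity excess into the corresponding early-arrival time over the 730 km baseline.

```python
# Back-of-the-envelope check of the quoted anomaly (illustrative only;
# the collaboration's full analysis involves many additional corrections).
c = 299_792_458.0          # speed of light, m/s
baseline = 730_000.0       # approximate CERN -> Gran Sasso distance, m

t_light = baseline / c                  # flight time at exactly c
dv_over_v = 20e-6                       # "20 parts per million" above c
early_arrival = t_light * dv_over_v     # time saved by the faster neutrinos

print(f"flight time at c : {t_light * 1e3:.3f} ms")        # ~2.435 ms
print(f"early arrival    : {early_arrival * 1e9:.0f} ns")  # ~49 ns
```

The scale this gives, a few tens of nanoseconds against a flight time of roughly 2.4 milliseconds, is why the sub-10-nanosecond timing accuracy described below is essential.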
The OPERA measurement is at odds with well-established laws of nature, though science frequently progresses by overthrowing established paradigms. For this reason, many searches have been made for deviations from Einstein’s theory of relativity, so far without finding any such evidence. The strong constraints arising from these observations make an interpretation of the OPERA measurement in terms of a modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.
“This result comes as a complete surprise,” said OPERA spokesperson, Antonio Ereditato of the University of Bern. “After many months of studies and cross checks we have not found any instrumental effect that could explain the result of the measurement. While OPERA researchers will continue their studies, we are also looking forward to independent measurements to fully assess the nature of this observation.”
“When an experiment finds an apparently unbelievable result and can find no artefact of the measurement to account for it, it’s normal procedure to invite broader scrutiny, and this is exactly what the OPERA collaboration is doing, it’s good scientific practice,” said CERN Research Director Sergio Bertolucci. “If this measurement is confirmed, it might change our view of physics, but we need to be sure that there are no other, more mundane, explanations. That will require independent measurements.”
In order to perform this study, the OPERA Collaboration teamed up with experts in metrology from CERN and other institutions to perform a series of high precision measurements of the distance between the source and the detector, and of the neutrinos’ time of flight. The distance between the origin of the neutrino beam and OPERA was measured with an uncertainty of 20 cm over the 730 km travel path. The neutrinos’ time of flight was determined with an accuracy of less than 10 nanoseconds by using sophisticated instruments including advanced GPS systems and atomic clocks. The time response of all elements of the CNGS beam line and of the OPERA detector has also been measured with great precision.
"We have established synchronization between CERN and Gran Sasso that gives us nanosecond accuracy, and we’ve measured the distance between the two sites to 20 centimetres,” said Dario Autiero, the CNRS researcher who will give this afternoon’s seminar. “Although our measurements have low systematic uncertainty and high statistical accuracy, and we place great confidence in our results, we’re looking forward to comparing them with those from other experiments."
“The potential impact on science is too large to draw immediate conclusions or attempt physics interpretations. My first reaction is that the neutrino is still surprising us with its mysteries,” said Ereditato. “Today’s seminar is intended to invite scrutiny from the broader particle physics community.”
The OPERA experiment was inaugurated in 2006, with the main goal of studying the rare transformation (oscillation) of muon neutrinos into tau neutrinos. The first such event was observed in 2010, demonstrating the experiment’s unique ability to detect the elusive signal of tau neutrinos.
Holocaust-era database of 'Joint' now searchable online
For the first time in its history, the American Jewish Joint Distribution Committee (JDC) is making a collection of its historic records and photographs from the Holocaust period available online. The website enables the public, especially Holocaust survivors and their families, to perform searches for themselves or others they know on a database of more than 500,000 names, and to view and identify photos from 14 countries where JDC operated during and after the war. This will help JDC - also widely known as 'The Joint' - fill in the blanks about its impact during this tumultuous time in Jewish history.
"I cannot express the profoundly deep connection I felt to my past and now to JDC when out of nowhere my young face popped up on the screen," said Claus Hirsch (pictured below as a child and recently), a German-born Shanghai Ghetto survivor who found a photo of himself in the ghetto during his search on the system. Hirsch's family was helped by JDC in China during the war, and he found two lists on which his family members' names appear. Hirsch now lives in Manhattan.
The website will allow users to search the names database compiled from historic documents and JDC client lists from operations in Barcelona, Shanghai, Kobe, Vilna, Australia, South America, and the JDC Emigration Service in Vienna and Munich. A group of volunteer genealogists helped the JDC Global Archives create the database, and they are adding new names each week. JDC's website is being launched at a time when a number of leading organizations and museums are making newly-digitized Holocaust era records available online, allowing broad public access for the first time ever. "For six decades, the vast majority of this data has been available only to professional researchers," said JDC CEO Steven Schwager. "Now, thanks to technology, survivors and their descendants can directly engage with our shared history."
Users can also explore and identify people they know in photo galleries of 1,500 photos from Austria, Belgium, China, Cyprus, Czechoslovakia, the Dominican Republic, France, Germany, Italy, Japan, Morocco, Lithuania, Portugal, and Spain. JDC is also inviting the public to tag photos and to share their JDC stories from this period in history. JDC was responsible for caring for hundreds of thousands of Jews in places from Cuba to Portugal during and after the Second World War. "Whether you were a little Jewish child we aided in Barcelona or one of the Jews we supported in Displaced Persons camps after the war, by putting faces, names, and stories together, you will benefit generations to come," said Schwager.
With tens of thousands of documents and photographs from the Holocaust era drawn from JDC collections in New York and Jerusalem, this website aims to add personal stories to JDC's vast international archive. Every year, hundreds of Holocaust survivors, genealogists, academics, filmmakers, and journalists conduct research in the JDC Global Archives. JDC will launch its Global Archives website in spring 2011 and will make available huge collections of newly-digitized documents and its significant photo collection from the organization's founding in 1914.
The American Jewish Joint Distribution Committee is the world's leading Jewish humanitarian assistance organization. It works in more than 70 countries to alleviate hunger and hardship, rescue Jews in danger, create lasting connections to Jewish life, and provide immediate relief and long-term development support for victims of natural and man-made disasters. (From: World Jewish Congress)
http://archives.jdc.org/sharedlegacy/
International Delegation of Jewish Parliamentarians Convenes in Jerusalem
(JERUSALEM – June 28, 2011) A delegation of 55 Jewish lawmakers from 22 nations gathered in Jerusalem this week for a Consultation of the International Council of Jewish Parliamentarians (ICJP), organized by the World Jewish Congress. Coming at a time of unprecedented diplomatic challenges for the Jewish people and the State of Israel, the consultation focused on developing common strategies to combat anti-Semitism and the assault on Israel as the nation-state of the Jewish people.
In a hearing in the Knesset moderated by ICJP Chairman Congressman Gary Ackerman (D-NY), numerous Parliamentarians described the increasing challenges felt by politicians in national capitals across the world. Viviane Teitelbaum-Hirsch, a Jewish Member of Parliament from Belgium, said that in her legislative body, the term “Israel” was viewed as a “dirty word,” and that overcoming those stereotypes represents “a very lonely fight.”
Dan Diker, Secretary General of the World Jewish Congress said that creating a common voice among Jewish Parliamentarians is one of the most important public diplomacy objectives for Israel and the Jewish world. “The coming months will be ones of unprecedented challenge for all those looking to defend the interests of the Jewish State,” he said. “Ensuring that these lawmakers can maximize their influence to support Israel and world Jewry’s basic rights will therefore be critical in overcoming the many obstacles that we know lie ahead.”
The gathering allowed the lawmakers to meet with their Israeli counterparts in the Knesset and other public officials and included a visit to the protest tent of Gilad Shalit, whose captivity under Hamas entered its fifth year earlier in the week. Speaking with Noam Shalit, Gilad’s father, Congressman Ackerman said, “It is wholly unacceptable that Gilad is being held as a political ploy in a world which should be peace-loving and just.”
Mr. Diker said in the name of the World Jewish Congress leadership and its President, Ronald Lauder, that pressure must be brought to bear not on Hamas alone. “The Palestinian Authority as a whole and Mahmoud Abbas as its leader must be held accountable for the continued captivity of Gilad, which is an outright international war crime,” he said. “We call upon the international community to intensify all possible efforts to secure his release, as every day that passes without his return to his family is a day too long.”
Thursday, November 17, 2011
Conservatives Ignore Regions on Gun Registry
OTTAWA - Churchill MP Niki Ashton has criticized the Federal Conservatives for bringing in legislation on the gun registry that ignores the differing views on the registry held in different regions of Canada.
Ms. Ashton said that the bill the Conservatives introduced to end the registry contained a surprise provision to destroy the records. She pointed out that Quebec has indicated it will establish its own registry when the federal registry is eliminated, and that access to the federal records is critical to that plan.
The NDP MP said that provinces like Manitoba have made it clear they oppose the registry and have no plans to establish a provincial version. "They have made it clear the registry does not meet the needs of their region," said Ashton. "It is ironic that the Conservatives have said the registry was a waste of money. Now they are forcing Quebec to pay even more money to put in place a provincial registry."
Ashton said the Tory bill is another example of the heavy-handed approach of the Harper government.
Over 400 persons at Gerry Sklavounos’ brunch and nomination
(Picture: Mr. Gerry Sklavounos with Mr. Jean Charest)
Montréal, November 15, 2011
Over 400 people were present, last Saturday, at the annual brunch and convention of Laurier-Dorion M.N.A. and Chairman of the Committee on Health and Social Services, Mr. Gerry Sklavounos, held in the presence of Liberal Party leader and Premier of Québec, Mr. Jean Charest.
Several elected officials and dignitaries were also present, namely Mr. Gilles Deguire, Mayor of the Montréal-Nord borough, Mrs. Mary Deros, City Councillor for Park-Extension, the Honourable Eleni Bakopanos and the Consul General of Greece, Mr. Thanos Kafopoulos.
“Laurier-Dorion is red. Laurier-Dorion is Liberal!” exclaimed the M.N.A., who thanked the Premier for his much-appreciated presence and also thanked his constituents and Liberal supporters for their continued friendship, unwavering support and renewed trust in him as candidate for the next elections. “Over the last four years, I have been attentive to the needs of my citizens and have represented them with honour at the National Assembly. I am extremely proud of the great projects that we’ve accomplished in Laurier-Dorion thanks to the excellent collaboration between my team and our different partners in the riding”, added Mr. Sklavounos.
"Gerry is an energetic young M.N.A. who invests much of his time to serve his citizens. In order to pursue our great projects and to maintain the quality of our health care and education services, it is necessary that Quebec continue to create wealth. That is why we are the government that prioritizes the economy. Among others, we have put in place the Plan Nord, an ambitious project that will provide economic benefits in all regions of Quebec. In these critical times, we made the right decision by investing in infrastructure in order to maintain and to create jobs. All these actions have only one goal: to enrich Quebec and all Quebecers”, declared
Deforestation causes cooling in Northern US, Canada
The impact of deforestation on global warming varies with latitude, according to new research from a team of scientists representing 20 institutions from around the world. The surprising finding, which researchers say calls for new climate-monitoring strategies, will be published in the Nov. 17 issue of the journal Nature.
"It depends where the deforestation is," said UC Davis atmospheric science Professor Kyaw Tha Paw U, a study co-author. "It could have some cooling effects at the regional scale, at higher latitudes, but there's no indication deforestation is cooling lower latitudes, and in fact may actually cause warming."
"Because surface station observations are made in grassy fields with biophysical properties of cleared land, they do not accurately represent the state of climate for 30 percent of the terrestrial surface covered by forests," the study says.
Paw U and his colleagues found that deforestation in the boreal region, north of 45 degrees latitude, results in a net cooling effect. While cutting down trees releases carbon into the atmosphere, it also increases an area's albedo, or reflection of sunlight. Surface temperatures in open, non-forested, high-latitude areas were cooler because these surfaces reflected the sun's rays, while nearby forested areas absorbed the sun's heat. At night, without the albedo effect, open land continued to cool faster than forests, which force warm turbulent air from aloft to the ground.
"People are debating whether afforestation is a good idea in high latitudes," said Xuhui Lee, the study's principal investigator and professor of meteorology at the Yale School of Forestry & Environmental Studies. "If you plant trees you sequester carbon, which is a benefit to the climate system. At the same time, if you plant trees you warm the landscape because trees are darker compared to other vegetation types. So they absorb solar radiation."
Paw U emphasized that the findings should not be viewed as a "green light" to cut down forests in high latitudes. "The intent is to clarify where we can see these regional effects using actual temperature measurements," he said. "Besides absorbing carbon dioxide, forest ecosystems have a number of other valuable qualities, even if at certain latitudes they may be warmer than open areas."
The researchers calculated that north of Minnesota, or above 45 degrees latitude, deforestation was associated with an average temperature decrease of 1.5 degrees Fahrenheit. On the other hand, deforestation south of North Carolina, or below 35 degrees latitude, appeared to cause warming. Statistically insignificant cooling occurred between these two latitudes.
The researchers collected temperature data from a network of specialized weather stations in forests ranging from Florida to Manitoba and compared results with nearby stations situated in open grassy areas that were used as a proxy for deforested land.
"The cooling effect is linear with latitude, so the farther north you go, the cooler you get with deforestation," said Lee.
David Hollinger, a scientist with the USDA Forest Service and study co-author, said, "Another way to look at the results is that the climate cooling benefits of planting forests is compounded as you move toward the tropics."
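To make the latitude dependence concrete, the sketch below encodes the trend reported above as a rough piecewise-linear function. The 1.5 degree Fahrenheit average cooling north of 45 degrees is taken from the article; the warming magnitude south of 35 degrees is a placeholder assumption chosen only to illustrate the sign change, since the article does not quote it.

```python
# Illustrative-only sketch of the latitude dependence described in the study:
# net cooling from deforestation north of 45 N (about -1.5 F on average),
# net warming south of 35 N, and a statistically insignificant effect between.
# The warming value south of 35 N is an assumed placeholder, not a study figure.

def deforestation_temperature_effect_F(latitude_deg):
    """Very rough piecewise-linear picture of the reported trend."""
    if latitude_deg >= 45:
        return -1.5                       # average cooling reported north of 45 N
    if latitude_deg <= 35:
        return +1.0                       # assumed warming magnitude (illustrative)
    # linear transition through the "insignificant" band between 35 N and 45 N
    return 1.0 + (latitude_deg - 35) * (-1.5 - 1.0) / (45 - 35)

for lat in (30, 35, 40, 45, 50, 55):
    print(lat, round(deforestation_temperature_effect_F(lat), 2))
```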
Archeologists investigate Ice Age hominins' adaptability to climate change
TEMPE, Ariz. – Computational modeling that examines evidence of how hominin groups evolved culturally and biologically in response to climate change during the last Ice Age also bears new insights into the extinction of Neanderthals. Details of the complex modeling experiments conducted at Arizona State University and the University of Colorado Denver will be published in the December issue of the journal Human Ecology, available online Nov. 17.
"To better understand human ecology, and especially how human culture and biology co-evolved among hunter-gatherers in the Late Pleistocene of Western Eurasia (ca. 128,000-11,500 years ago) we designed theoretical and methodological frameworks that incorporated feedback across three evolutionary systems: biological, cultural and environmental," said Michael Barton, a pioneer in the area of archaeological applications of computational modeling at Arizona State University.
"One scientifically interesting result of this research, which studied culturally and environmentally driven changes in land-use behaviors, is that it shows how Neanderthals could have disappeared not because they were somehow less fit than all other hominins who existed during the last glaciation, but because they were as behaviorally sophisticated as modern humans," said Barton, who is lead author of the published findings.
The paper "Modeling Human Ecodynamics and Biocultural Interactions in the Late Pleistocene of Western Eurasia" is co-authored by Julien Riel-Salvatore, an assistant professor of anthropology at the University of Colorado Denver; John Martin "Marty" Anderies, an associate professor of computational social science at ASU in the School of Human Evolution and Social Change and the School of Sustainability; and Gabriel Popescu, an anthropology doctoral student in the School of Human Evolution and Social Change at ASU.
"It's been long believed that Neanderthals were outcompeted by fitter modern humans and they could not adapt," said Riel-Salvatore. "We are changing the main narrative. Neanderthals were just as adaptable and in many ways, simply victims of their own success."
The interdisciplinary team of researchers used archeological data to track behavioral changes in Western Eurasia over a period of 100,000 years and showed that human mobility increased over time, probably in response to environmental change. According to Barton, the last Ice Age saw hunter-gatherers, including both Neanderthals and the ancestors of modern humans, range more widely across Eurasia searching for food during a major shift in the Earth's climate.
The scientists utilized computer modeling to explore the evolutionary consequences of those changes, including how changes in the movements of Neanderthals and modern humans caused them to interact – and interbreed – more often.
According to Riel-Salvatore, the study offered further evidence that Neanderthals were more flexible and resourceful than previously assumed.
"Neanderthals had proven that they could roll with the punches and when they met the more numerous modern humans, they adapted again," Riel-Salvatore said. "But modern humans probably saw the Neanderthals as possible mates. As a result, over time, the Neanderthals died out as a physically recognizable population."
To reach their conclusion, the researchers ran a computer program for the equivalent of 1,500 generations showing that as Neanderthals and modern humans expanded their yearly ranges, the Neanderthals were slowly absorbed by more numerous modern humans until they had disappeared as a recognizable population.
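The following toy sketch is not the coupled biocultural and land-use model described in the paper; it is a drastically simplified mixing simulation, with assumed population sizes, that only illustrates the mechanism this paragraph describes: run for 1,500 generations, a freely interbreeding minority disappears as a recognizable group while its genetic contribution persists.

```python
import random

# Toy sketch only: a drastically simplified mixing model, NOT the coupled
# biocultural and land-use model used in the study. It illustrates how a small
# population that freely interbreeds with a much larger one is "absorbed":
# it vanishes as a recognizable group while leaving a persistent genetic trace.

random.seed(0)
pop_size = 2_000             # assumed total hominin population (illustrative)
minority_fraction = 0.10     # assumed initial Neanderthal fraction (illustrative)

# each individual carries a "Neanderthal ancestry" fraction between 0 and 1
ancestry = [1.0] * int(pop_size * minority_fraction) + \
           [0.0] * int(pop_size * (1 - minority_fraction))

for generation in range(1500):             # 1,500 generations, as in the article
    ancestry = [0.5 * (random.choice(ancestry) + random.choice(ancestry))
                for _ in range(pop_size)]  # offspring average their parents' ancestry

recognizable = sum(1 for a in ancestry if a > 0.9)   # still "fully" Neanderthal
print(f"recognizably Neanderthal individuals after 1,500 generations: {recognizable}")
print(f"average Neanderthal ancestry in the merged population: {sum(ancestry)/pop_size:.1%}")
```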
"We tested the modeling results against the empirical archaeological record and found that there is evidence that Neanderthals, and moderns, did adapt their behaviors in the way in which we modeled," explained Barton. "Moreover, the modeling predicts the kind of low-level genetic admixture of Neanderthal genes that are being found in the newest genetic studies just now being published.
"In other words, successful behavioral adaptations to severe environmental conditions made Neanderthals, and other non-moderns about whom we know little, vulnerable to biological extinction, but at the same time, ensured they made a genetic contribution to modern populations," Barton said.
The authors noted that "the methods we illustrate here offer a robust, new framework in which researchers can begin to examine the effects that such invisible characteristics could have on the observable record."
"The kind of modeling we did in this research is very new in paleoanthropology, as is the continental scope of the archaeological analysis we used to test the model results," noted Barton.
"However, such computational modeling can refine our understanding of long-term human impact on the environment that can help inform land-use decisions for our future," said Barton, who also is co-director of ASU's Center for Social Dynamics and Complexity, which leverages the emerging field of complex systems to foster interdisciplinary research on fundamental questions of social life.
The research presented in Human Ecology was supported in part by the National Science Foundation, a Fulbright Senior Research Fellowship and a Fulbright Graduate Student Fellowship.
Saturday, November 12, 2011
Nuclear Experts Discuss IAEA Operational Safety Reviews
4 November 2011 | Senior nuclear experts today offered several recommendations on how the International Atomic Energy Agency (IAEA) can further develop its operational safety review services. The IAEA hosted a technical meeting on the Evaluation of Effectiveness of Operational Safety Review Services and their Future Evolution at the Agency's headquarters in Vienna from 1 to 4 November 2011.
Representatives from nuclear regulatory bodies, nuclear utilities, nuclear power plants and technical support organisations from 19 IAEA Member States and the World Association of Nuclear Operators (WANO) took part in the meeting. It provided a platform for the exchange of information, experience and lessons learned from the operational safety review missions performed during 2008-2011. The meeting also included discussion of expectations for the future evolution of these services.
"This week's meeting demonstrated the response of the IAEA's Member States to the lessons learned from the Fukushima accident. Nations must constantly strive to improve their nuclear safety practices, and the IAEA review services provide an excellent tool to assess their progress," said Miroslav Lipar, head of the IAEA's Operational Safety Section.
The IAEA's operational safety review services assess the operational safety performance of nuclear power plants by conducting peer reviews using the requirements of IAEA Safety Standards. The longest running safety review service, the Operational Safety Review Team (OSART) programme, was established in 1982 and has provided advice and assistance to Member States in 165 missions to enhance the safety of nuclear power plants during commissioning and operation. Other review services available in the area of operations evaluate operating experience feedback, safe long-term operation and safety culture.
The IAEA Action Plan on Nuclear Safety includes actions focused on strengthening the existing IAEA peer reviews by incorporating lessons learned and improving their effectiveness. This week's meeting provided several recommendations to the IAEA on how to modify the scope and methodology of the OSART missions to reflect the lessons learned from this year's accident at Japan's Fukushima Daiichi Nuclear Power Plant.
The most important recommendation was to introduce Severe Accident Management as a separate review area in the standard OSART scope. The meeting endorsed the integration of the different types of operational safety services under the umbrella of OSART, to improve the effectiveness of using available resources and harmonising the methodology of these services. In addition, several ideas on how to improve the efficiency of OSART missions in identifying safety significant issues were endorsed by the meeting.
The meeting considered that the IAEA operational safety review services, and in particular the OSART programme, were effective in supporting the enhancement of the safety of nuclear power plants during both commissioning and operation. The recommendations and improvements endorsed by the meeting in light of the Fukushima accident and a review of the services are intended to support further enhancement of the safety of nuclear power plants worldwide.
IAEA Remediation Mission to Japan Concludes
14 October 2011 | A team of international experts today completed a preliminary assessment of the strategy and plans being considered by the Japanese authorities to remediate the areas outside the Fukushima Dai-ichi Nuclear Power Plant site that are reported to have elevated levels of radiation.
The IAEA dispatched the mission to Japan on 7 October following a request from the country's Government. The mission, comprising 12 international and IAEA experts from several countries, visited numerous locations in the Fukushima Prefecture and conducted meetings in Tokyo and Fukushima with Japanese officials from several Ministries and institutions.
"The meetings held and visits made by the team over the last eight days gave us a first-hand appreciation of the extraordinary efforts and dedication on the part of Japanese people in their effort to remediate the areas affected by elevated levels of radiation in the Fukushima Prefecture," says Mr. Juan Carlos Lentijo, Team Leader and General Director for Radiation Protection at Spain's nuclear regulatory authority. "As Japan continues its current remediation efforts, it is our belief that this work will bring relief to the populations who are affected by the consequences of the nuclear accident at the Fukushima Dai-ichi nuclear power plant."
In a Preliminary Summary Report delivered to Japanese authorities today, the team prepared a set of conclusions including, though not limited to, the following:
Japan developed an efficient program for remediation - allocating the necessary legal, financial and technological resources to bring relief to the people affected by the accident, with priority being given to children. The Team was impressed with the strong commitment to the remediation effort from all institutions and parties involved, including the public;
Japan has also taken practical measures to inform the public and involve residents and local institutions in the process of defining its remediation strategy;
Japan is advised to avoid classifying removed materials that do not warrant special radiation protection measures as "radioactive waste";
Japan is advised to consider explaining to the public the importance of focusing on radiation doses that may actually be received by people rather than on data indicating contamination levels; and
Japan is encouraged to continue its remediation efforts. In doing so, Japan is encouraged to take into account the advice provided by the Mission. The IAEA stands ready to support Japan as it considers new and appropriate criteria for such activities.
The authorities and local residents in Japan fully assisted the IAEA international team in its endeavor to conclude its mission successfully.
"The team also appreciates the openness with which our discussions were conducted and the high level of cooperation and access we were granted by Japan," says Mr. Lentijo. "This was an invaluable opportunity for us to learn from this important decontamination initiative. We would like to continue our support to Japan in this very challenging task. We look forward to sharing our findings with the international community."
The final report of the mission will be presented to the Government of Japan in the next month.
Background
The accident at the Fukushima Dai-ichi Nuclear Power Plant has led to elevated levels of radiation over large areas. The Government of Japan has been formulating a strategy and plans to implement countermeasures to remediate these areas.
The IAEA organized an International Fact Finding Expert Mission of the Fukushima Dai-ichi Nuclear Power Plant Accident Following the Great East Japan Earthquake and Tsunami, which took place between 24 May and 2 June 2011.
The mission concluded today is a follow-up to the fact-finding mission held earlier in the year and an essential component of the IAEA's Nuclear Safety Action Plan, approved by the IAEA Board of Governors on 13 September and endorsed by all 151 Member States at the recent IAEA General Conference in September 2011. The Action Plan defines a programme of work to strengthen the global nuclear safety framework.
The international expert mission to Japan on environmental remediation was held between 7 and 15 October 2011.
International Nuclear Officials Discuss IAEA Peer Reviews of Nuclear Safety Regulations
28 October 2011 | Washington DC -- Senior nuclear regulators today concluded a Workshop on the Lessons Learned from the IAEA Integrated Regulatory Review Service (IRRS) Missions. The U.S. Nuclear Regulatory Commission (NRC) hosted the workshop, in cooperation with the International Atomic Energy Agency, in Washington, DC, from 26 to 28 October 2011. About 60 senior regulators from 22 IAEA Member States took part in this workshop.
The IRRS programme is an international peer review service offered by the IAEA to its Member States to provide an objective evaluation of their nuclear safety regulatory framework. The review is based on the internationally recognized IAEA Safety Standards.
"The United States Nuclear Regulatory Commission was pleased to host the IAEA's IRRS meeting this week. The discussions over the past three days have provided an important opportunity for regulators from many countries to come together to strengthen the international peer review process," said U.S. NRC Chairman Gregory B. Jaczko. "Especially after the Fukushima Daiichi accident, the global community recognizes that IRRS missions fill a vital role in strengthening nuclear safety and security programs around the world, and we are proud to be a part of this important effort."
The IAEA Action Plan on Nuclear Safety includes actions focused on strengthening the existing IAEA peer reviews, incorporating lessons learned and improving their effectiveness. The workshop provided a platform for the exchange of information, experience and lessons learned from the IRRS missions, as well as expectations for the IRRS programme for the near future. Further improvements in the planning and implementation of the IRRS missions in the longer term were discussed. A strong commitment of all relevant national authorities to the IRRS programme was identified as a key element of an effective regulatory framework.
The conclusions of the workshop will be issued in November 2011 and the main results will be reported to the IAEA Board of Governors meeting in November.
"The strong support expressed by senior regulators for the IAEA peer reviews of the nuclear regulatory framework and their concrete proposals for improvement will contribute significantly to the effective implementation of the IAEA Nuclear Safety Action Plan," said Denis Flory, IAEA Deputy Director General for Nuclear Safety and Security. "There was a general recognition that these peer reviews provide national nuclear regulators with an objective view of their strengths and weaknesses and contribute to the continuous strengthening of nuclear safety."
European Commission and IAEA Celebrate 30 Years of Co-operation on Nuclear Safeguards
14 October 2011 | Today the European Commission and the International Atomic Energy Agency (IAEA) celebrate 30 years of cooperation in the safeguarding of nuclear materials and facilities. This anniversary is marked by an event at the IAEA Headquarters in Vienna. The Joint Research Centre (JRC) of the European Commission has provided scientific and technical support to the work of IAEA since 1981, with over 100 scientists and technicians working on more than 25 projects. The anniversary is also an opportunity for both parties to plan their future joint activities.
"Nuclear safety and security are absolute priorities for the EU and in this context expertise on nuclear safeguards is extremely important for global security," says Dominique Ristori, Director General of the Joint Research Centre. "The JRC is constantly at work on state-of-the-art technologies for nuclear safeguards and training of nuclear inspectors to stay ahead of the evolving challenges, in its long-standing cooperation in support of the Agency's mission."
"The JRC has provided us with vital scientific and technical support which has helped us to implement safeguards more effectively," said Herman Nackaerts, Deputy Director General for Safeguards at the IAEA. "This has had a positive impact on the security of all the citizens of the European Union and beyond."
An important chapter in the collaboration between the two organisations is training: high-quality training programmes are provided by the JRC for the next generation of IAEA and EURATOM Inspectors. Other examples of cooperation include special tools to improve environmental particle analysis, a 3D laser-based verification system of nuclear facilities, new nuclear reference materials, and secure sealing for underwater nuclear spent fuel assemblies.
Future cooperation between the JRC and IAEA will be in line with the new priorities of the IAEA to further increase the effectiveness and efficiency of safeguards, through a customized approach increasingly focused at the national level. This also involves the support of the European Commission in establishing the new IAEA Safeguards Laboratory in Seibersdorf, Austria.
Friday, November 11, 2011
Plasma etching pushes the limits of a shrinking world
Plasma etching, essential for semiconductor device fabrication in the nanoelectronics age, confronts the fundamental limits of physics and chemistry
Plasma etching (using an ionized gas to carve tiny components on silicon wafers) has long enabled the perpetuation of Moore's Law -- the observation that the number of transistors that can be squeezed into an integrated circuit doubles about every two years. Without the compensating capabilities of plasma etching, Moore's Law would have faltered around 1980 with transistor sizes at about 1 micron (a human hair is approximately 40-50 microns in diameter). Today, etch compensation helps create devices that are smaller than 20 nanometers (a nanometer is 1,000 times smaller than a micron).
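As a rough, illustrative calculation (not a figure from Lam Research), the feature sizes quoted above can be turned into the density scaling that underlies Moore's Law, treating the number of devices per unit area as proportional to the inverse square of the minimum feature size:

```python
import math

# Rough arithmetic behind the figures in the paragraph (illustrative only).
feature_1980 = 1000.0   # ~1 micron, expressed in nanometres
feature_today = 20.0    # "smaller than 20 nanometers"

linear_shrink = feature_1980 / feature_today   # ~50x smaller features
area_density_gain = linear_shrink ** 2         # ~2,500x more devices per unit area
doublings = math.log2(area_density_gain)       # ~11 density doublings

print(f"linear shrink     : {linear_shrink:.0f}x")
print(f"density gain      : {area_density_gain:.0f}x")
print(f"implied doublings : {doublings:.1f}")
```

Feature scaling alone accounts for roughly eleven density doublings between those two numbers; growth in die size and design improvements have provided the rest of the increase in transistor counts over the same period.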
Now more than ever, plasma etch technology is used to extend semiconductor device fabrication into the nanoelectronics age -- and technologists at Lam Research are developing techniques for the manufacture of even smaller, faster, and more densely packed multi-functionality chips. The question now is how much smaller and faster can the semiconductor industry go? The answer has much to do with plasma etch technology.
One of the most critical steps of semiconductor manufacturing, plasma etching creates finely delineated features in the conductive and dielectric (insulating) layers on integrated circuits. Plasma etch techniques can also compensate for limitations in lithography, the optical process that develops the "template" for creating nanoelectronic structures on silicon wafers. Transistors and other components are now so small that lithography can no longer produce templates with the necessary precision to pack millions of transistors onto small integrated circuits. While researchers are working on new lithography technology (extreme ultraviolet or EUV) to overcome this limitation, plasma etching is used to compensate for lithography's imperfections by filling in gaps and smoothing out edges of the tiny components on the chip (Figure 1). Plasma etching also enables other techniques that extend current lithography capabilities, including double patterning (a method of overlaying two patterns to achieve the original design) and directly shrinking structures smaller than the template dimensions.
Yet, plasma etching itself is now facing the fundamental limits imposed by the basic laws of physics and chemistry. Because etching is involved in forming the critical structures of every semiconductor device, Lam Research technologists are learning to better control the behavior of the various components of the plasma (a gaseous mixture of charged and neutral particles) during the etching process. The ultimate goal would be to selectively etch one layer of atoms at a time (atomic-layer etching or ALE), without disturbing the bulk of the material underneath.
Over the next 5 years, improving plasma etch technology will be key to extending Moore's Law further and manufacturing the next generation of consumer electronics devices.
Tokamak experiments come clean about impurity transport
A fusion reactor operates best when the hot plasma inside it consists only of fusion fuel (hydrogen's heavy isotopes, deuterium and tritium), much as a car runs best with a clean engine. But fusion fuel reactions at the heart of magnetic fusion reactors also create leftovers—helium "ash." The buildup of this helium ash and other impurities can cool the hot plasma and reduce fusion power. Research at the MIT Plasma Science and Fusion Center is providing new insight into the transport of these impurities in fusion plasmas in an effort to improve on the natural impurity exhaust process, producing cleaner plasmas and higher fusion power.
On the Alcator C-Mod tokamak at MIT, researchers are using a novel set of plasma diagnostics and advanced computer simulations to better understand the physical processes that can either flush out impurities or allow them to stay. All fusion plasmas contain intrinsic impurities introduced by the unintentional interaction of very hot plasma with the reactor walls and by the fusion reactions themselves. To study these phenomena, the scientists introduce a known source of impurities at a level small enough not to adversely affect the plasma's performance. This is achieved using a high-powered, pulsed laser to knock impurity atoms off a coated glass slide directly into the plasma edge. Once inside the plasma, the impurity is ionized and heated by the plasma and begins to emit soft x-ray radiation, which is observed by a new high-resolution spectrometer that allows the impurities to be tracked as they are transported by plasma turbulence.
"It is not enough to simply observe results in existing experiments," says MIT graduate student Nathan Howard. "We also need to develop high resolution computer models to predict how impurities will behave in future larger, hotter fusion reactors. The process is much like developing accurate long-range weather forecasts."
The MIT scientists are developing and testing new computer programs which run on some of the world's fastest supercomputers. A single case can take up to 250,000 CPU hours to complete. For comparison, this is roughly equivalent to letting a home computer run for about 15 years. The latest simulations connect the behavior of small turbulent eddies and ripples in the plasma to new measurements showing the movement of impurities into and out of the plasma.
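A quick unit conversion shows how the comparison in the paragraph works; the core count of the notional home computer is an assumption for illustration, not a figure from MIT:

```python
# Sanity check of the comparison in the text (illustrative; the "home computer"
# core count below is an assumption, not a number from the article).
cpu_hours = 250_000
cores_in_home_pc = 2                     # assumed dual-core desktop
hours_per_year = 24 * 365

wall_clock_years = cpu_hours / cores_in_home_pc / hours_per_year
print(f"{wall_clock_years:.1f} years")   # ~14 years, consistent with "about 15"
```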
According to Dr. Martin Greenwald, Nathan's thesis advisor, "This work represents an important first step in gaining confidence in our ability to predict and control impurity transport in tokamaks."
Researching graphene nanoelectronics for a post-silicon world
Rensselaer Polytechnic Institute researchers use supercomputer to study effects of stacking graphene nanoribbons
Troy, N.Y. – Copper's days are numbered, and a new study at Rensselaer Polytechnic Institute could hasten the downfall of the ubiquitous metal in smart phones, tablet computers, and nearly all electronics. This is good news for technophiles who are seeking smaller, faster devices.
As new generations of computer chips continue to shrink in size, so do the copper pathways that transport electricity and information around the labyrinth of transistors and components. When these pathways—called interconnects—grow smaller, they become less efficient, consume more power, and are more prone to permanent failure.
To overcome this hurdle, industry and academia are vigorously researching new candidates to succeed traditional copper as the material of choice for interconnects on computer chips. One promising candidate is graphene, an atom-thick sheet of carbon atoms arranged like a nanoscale chicken-wire fence. Prized by researchers for its unique properties, graphene is essentially a single layer of the graphite found commonly in our pencils or the charcoal we burn on our barbeques.
Led by Rensselaer Professor Saroj Nayak, a team of researchers discovered they could enhance the ability of graphene to transmit electricity by stacking several thin graphene ribbons on top of one another. The study, published in the journal ACS Nano, brings industry closer to realizing graphene nanoelectronics and naming graphene as the heir apparent to copper.
"Graphene shows enormous potential for use in interconnects, and stacking up graphene shows a viable way to mass produce these structures," said Nayak, a professor in the Department of Physics, Applied Physics, and Astronomy at Rensselaer. "Cooper's limitations are apparent, as increasingly smaller copper interconnects suffer from sluggish electron flows that results in hotter, less reliable devices. Our new study makes a case for the possibility that stacks of graphene ribbons could have what it takes to be used as interconnects in integrated circuits."
The study, based on large-scale quantum simulations, was conducted using the Rensselaer Computational Center for Nanotechnology Innovations (CCNI), one of the world's most powerful university-based supercomputers.
Copper interconnects suffer from a variety of unwanted problems, which grow more prominent as the size of the interconnects shrinks. Electrons travel through the copper nanowires sluggishly and generate intense heat. As a result, the electrons "drag" atoms of copper around with them. These misplaced atoms increase the copper wire's electrical resistance and degrade the wire's ability to transport electrons. This means fewer electrons are able to pass through the copper successfully, and the energy they lose along the way is dissipated as heat. This heat can have negative effects on both a computer chip's speed and performance.
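To see why shrinking hurts, here is a minimal sketch using the classical wire-resistance formula R = ρL/A with bulk copper values; real nanoscale wires behave even worse, because the effective resistivity itself climbs once the wire width approaches the electron mean free path in copper (roughly 40 nm), which is the sluggish behaviour described above.

```python
# Illustrative only: the classical wire-resistance formula R = rho * L / A,
# evaluated for progressively narrower copper interconnects of equal length.
# Real nanoscale wires are worse than this, because the effective resistivity
# rises as the width approaches the electron mean free path (~40 nm in copper).

rho_bulk = 1.68e-8        # bulk copper resistivity, ohm*m (room temperature)
length = 1e-6             # 1 micron of interconnect, in metres

for width_nm in (100, 45, 22):
    width = width_nm * 1e-9
    area = width * width                  # assume a square cross-section
    resistance = rho_bulk * length / area
    print(f"{width_nm:>3} nm wide: {resistance:6.1f} ohms per micron of wire")
```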
It is generally accepted that a quality replacement for traditional copper must be discovered and perfected in the next five to 10 years in order to further perpetuate Moore's Law—an industry mantra that states the number of transistors on a computer chip, and thus the chip's speed, should double every 18 to 24 months.
Nayak's recent work, published in the journal ACS Nano, is titled "Effect of Layer Stacking on the Electronic Structure of Graphene Nanoribbons." When cut into nanoribbons, graphene is known to exhibit a band gap—an energy gap between the valence and conduction bands—which is an unattractive property for interconnects. The new study shows that stacking the graphene nanoribbons on top of each other, however, could significantly shrink this band gap. The study may be viewed online at: http://dx.doi.org/10.1021/nn200941u
"The optimal thickness is a stack of four to six layers of graphene," said Neerav Kharche, first author of the study and a computational scientist at CCNI. "Stacking more layers beyond this thickness doesn't reduce the band gap any further."
The end destination, Nayak said, is to one day manufacture microprocessors—both the interconnects and the transistors—entirely out of graphene. This game-changing goal, called monolithic integration, would mean the end of the long era of copper interconnects and silicon transistors.
"Such an advance is likely still many years into the future, but it will certainly revolutionize the way nearly all computers and electronics are designed and manufactured," Nayak said.
Along with Nayak and Kharche, contributors to this study were: former Rensselaer physics graduate student Yu Zhou; Swastik Kar, former Rensselaer physics research assistant professor; and Kevin P. O'Brien of Intel Corporation.
This research was supported in part by the New York State Interconnect Focus Center at Rensselaer; the Semiconductor Research Corporation; and the National Science Foundation (NSF) Division of Electrical, Communications, and Cyber Systems, and a generous gift from a donor who wished to remain anonymous. Computational resources were partly funded by Rensselaer and New York state through CCNI; and by NSF through nanoHUB.org.
Exploring the last white spot on Earth
Grenoble, 10 November 2011 – Scientists will soon be exploring matter at temperatures and pressures so extreme that the conditions can only be produced for microseconds using powerful pulsed lasers. Matter in such states is present in the Earth’s liquid iron core, 2500 kilometres beneath the surface, and also in elusive “warm dense matter” inside large planets like Jupiter. A new X-ray beamline ID24 at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, allows a new level of exploration of the last white spot on our globe: the centre of the Earth.
The layers of the Earth: the thin upper crust, the viscous upper and lower mantle, the liquid core and the solid inner core.
We know surprisingly little about the interior of the Earth. The pressure at the centre can be calculated accurately from the propagation of earthquake waves; it is about three and a half million times atmospheric pressure. The temperature at the centre of the Earth, however, is unknown, but is thought to be roughly as hot as the surface of the sun.
ID24, which was inaugurated today, opens new fields of science by making it possible to observe many rapid processes, like in a time-lapse film sequence, whether it is the laser-heating of iron to 10,000 degrees, charge reactions in new batteries or catalysts cleaning pollutants. It is the first of eight new beamlines to be built within the ESRF Upgrade Programme, a 180 million Euro investment over eight years, to maintain the world-leading role of the ESRF. ID24 extends the existing capabilities at the ESRF in X-ray absorption spectroscopy to sample volumes twenty times smaller and time resolutions one thousand times better than in the past.
“Scientists can use several other synchrotrons notably in Japan and the U.S for fast X-ray absorption spectroscopy, but it is the microsecond time resolution for single shot acquisition coupled to the micrometre sized spot that makes ID24 unique worldwide," says Sakura Pascarelli, scientist in charge of ID24. "The rebuilt ID24 sets the ESRF apart, and even before the first users have arrived, I am being asked to share our technology.”
Illustration of a diamond anvil cell used to compress microscopic samples to pressures of 3 Mbar and more (black arrows represent the direction of compressive force). Pulsed laser beams heat the sample from both sides, and an X-ray beam (orange) probes the states of matter at extreme conditions. Image credit: ESRF/Format Editions.
The Earth’s interior is literally inaccessible, and today it is easier to reach Mars than to visit even the base of the Earth’s thin crust. Scientists can, however, reproduce the extreme pressure and temperature of a planet’s interior in the laboratory, using diamond anvil cells to squeeze a material and, once it is under pressure, heat it with short, intense laser pulses. Yet these samples are no bigger than a speck of dust and remain stable at high temperatures only for a very short time, measured in microseconds.
Thanks to new technologies employed at ID24, scientists can now study what happens at extreme conditions, for example when materials undergo a fast chemical reaction or at what temperature a mineral will melt in the interior of a planet. Germanium micro strip detectors enable measurements to be made sequentially and very rapidly (a million per second) in order not to miss any detail. A stable, microscopic X-ray beam means that measurements can also be made in two dimensions by scanning across a sample to obtain a map instead of only at a single point. A powerful infrared spectrometer complements the X-ray detectors for the study of chemical reactions under industrial processing conditions.
A catalytic cell with a sample heated under in situ conditions for analysis with a beam of X-rays. Image credit: ESRF/B. Gorges.
Today, geologists want to know whether a chemical reaction exists between the Earth’s mostly liquid core and the rocky mantle surrounding it. They would like to know the melting temperature of materials other than iron that might be present in the Earth’s core in order to make better models for how the core—which produces the Earth’s magnetic field—works and to understand why the magnetic field changes over time and why periodically in Earth’s history it has disappeared and reversed.
We know even less about the warm dense matter believed to exist in the cores of larger planets, for example Jupiter, which should be even hotter and denser. It can be produced in the laboratory using extremely powerful laser shock pulses to compress and heat a sample. The dream of revealing the secrets of the electronic and local structure in this state of matter with X-rays is now becoming reality, as ID24 allows sample volumes 10,000 times smaller than those at high-power laser facilities to be studied, making these experiments possible at the synchrotron using table-top lasers.
The ID24 beamline works like an active probe rather than a passive detector, firing an intense beam of X-rays at a sample. The technique used is called X-ray absorption spectroscopy, and it involves the element-specific absorption of X-rays by the atoms in a material. From these data, not only the abundance of an element can be deduced, but also its chemical state, which other atoms or elements are in its immediate neighbourhood, and even how far apart they are. In short, a complete picture is obtained of the sample at the atomic scale.
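The element-specific absorption described here follows the Beer-Lambert law, I = I0 exp(-μ(E)x). The short sketch below shows how incident and transmitted intensities are turned into the absorption coefficient whose fine structure encodes the chemical and structural information; all numbers are made-up placeholders, not ID24 data.

```python
import math

# Minimal sketch of the quantity measured in X-ray absorption spectroscopy:
# the energy-dependent absorption coefficient mu(E), obtained from incident
# and transmitted intensities via the Beer-Lambert law I = I0 * exp(-mu * x).
# All intensities below are made-up placeholder values, not ID24 data.

def absorption_coefficient(I0, I, thickness_m):
    """mu(E) in 1/m from incident intensity I0 and transmitted intensity I."""
    return math.log(I0 / I) / thickness_m

sample_thickness = 5e-6                    # 5 micrometre sample (assumed)
measurements = [                           # (energy in eV, I0, I) -- placeholders
    (7100, 1.00e6, 6.1e5),
    (7112, 1.00e6, 2.4e5),                 # absorption jumps at an element's edge
    (7150, 1.00e6, 2.9e5),
]

for energy, I0, I in measurements:
    mu = absorption_coefficient(I0, I, sample_thickness)
    print(f"{energy} eV: mu = {mu:.2e} 1/m")
```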
ID24 has just successfully completed first tests with X-ray beams. Testing will continue over the coming weeks, and the beamline will be open for users from around the world as of May 2012. The date for the inauguration on 10 November 2011 was chosen to coincide with the autumn meeting of the ESRF’s Science Advisory Committee of external experts who played a key role in selecting the science case for ID24 and the other Upgrade Beamlines.
“ID24 opens uncharted territories of scientific exploration, as will the seven other beamlines of the ESRF Upgrade Programme. The economic crisis has hit our budgets hard, and it is no small feat to deliver new opportunities for research and industrial innovation under these circumstances”, says Harald Reichert, ESRF Director of Research. “I wish to congratulate the project team for extraordinary achievements, and I look forward to seeing some extraordinary new science.”
Wednesday, November 9, 2011
Astrobiologists Discover “Sweet Spots” for the Formation of Complex Organic Molecules in the Galaxy
Scientists within the New York Center for Astrobiology at Rensselaer Polytechnic Institute have compiled years of research to help locate areas in outer space that have extreme potential for complex organic molecule formation. The scientists searched for methanol, a key ingredient in the synthesis of organic molecules that could lead to life. Their results have implications for determining the origins of molecules that spark life in the cosmos.
The findings will be published in the Nov. 20 edition of The Astrophysical Journal in a paper titled “Observational constraints on methanol production in interstellar and preplanetary ices.” The work is a collaboration between researchers at Rensselaer, NASA Ames Research Center, the SETI Institute, and Ohio State University.
“Methanol formation is the major chemical pathway to complex organic molecules in interstellar space,” said the lead researcher of the study and director of the NASA-funded center, Douglas Whittet of Rensselaer. If scientists can identify regions where conditions are right for rich methanol production, they will be better able to understand where and how the complex organic molecules needed to create life are formed. In other words, follow the methanol and you may be able to follow the chemistry that leads to life.
Using powerful telescopes on Earth, scientists have observed large concentrations of simple molecules such as carbon monoxide in the clouds that give birth to new stars. In order to make more complex organic molecules, hydrogen needs to enter the chemical process. The best way for this chemistry to occur is on the surfaces of tiny dust grains in space, according to Whittet. In the right conditions, carbon monoxide on the surface of interstellar dust can react at low temperatures with hydrogen to create methanol (CH3OH). Methanol then serves as an important steppingstone to formation of the much more complex organic molecules that are required to create life. Scientists have known that methanol is out there, but to date there has been limited detail on where it is most readily produced.
What Whittet and his collaborators have discovered is that methanol is most abundant around a very small number of newly formed stars. Not all young stars reach such potential for organic chemistry. In fact, the range in methanol concentration varies from negligible amounts in some regions of the interstellar medium to approximately 30 percent of the ices around a handful of newly formed stars. They also discovered methanol for the first time in low concentrations (1 to 2 percent) in the cold clouds that will eventually give birth to new stars.
The scientists conclude in the paper that there is a “sweet spot” in the physical conditions surrounding some stars that accounts for the large discrepancy in methanol formation across the galaxy. The complexity of the chemistry depends on how fast certain molecules reach the dust grains surrounding new stars, according to Whittet. The rate at which molecules accumulate on the grains can result in either an organic boom or a chemical dead end.
“If the carbon monoxide molecules build up too quickly on the surfaces of the dust grains, they don’t get the opportunity to react and form more complex molecules. Instead, the molecules get buried in the ices and add up to a lot of dead weight,” Whittet said. “If the buildup is too slow, the opportunities for reaction are also much lower.”
This means that under the right conditions, the dust surrounding certain stars could hold greater potential for life than most of its siblings. The presence of high concentrations of methanol could essentially jumpstart the process to create life on the planets formed around certain stars.
The scientists also compared their results with methanol concentrations in comets to determine a baseline of methanol production in our own solar system.
“Comets are time capsules,” Whittet said. “Comets can preserve the early history of our solar system because they contain material that hasn’t changed since the solar system was formed.” As such, the scientists could look at the concentrations of methanol in comets to determine the amount of methanol that was in our solar system at its birth.
What they found was that methanol concentrations at the birth of our solar system were actually closer to the average of what they saw elsewhere in interstellar space. Methanol concentrations in our solar system were fairly low, at only a few percent, compared to some of the other methanol-dense areas in the galaxy observed by Whittet and his colleagues.
“This means that our solar system wasn’t particularly lucky and didn’t have the large amounts of methanol that we see around some other stars in the galaxy,” Whittet said.
“But, it was obviously enough for us to be here.”
The results suggest that there could be solar systems out there that were even luckier in the biological game than we were, according to Whittet. As we look deeper into the cosmos, we may eventually be able to determine what a solar system bursting with methanol can do.
The New York Center for Astrobiology
Based within the School of Science at Rensselaer Polytechnic Institute in Troy, N.Y., the New York Center for Astrobiology is devoted to investigating the origins of life on Earth and the conditions that lead to formation of habitable planets in our own and other solar systems. Supported by NASA, the $7 million center is a member of NASA’s Astrobiology Institute (NAI), and is a partnership between Rensselaer and the University at Albany, Syracuse University, the University of Arizona, and the University of North Dakota. Researchers and students within the center seek to understand the chemical, physical, and geological conditions of early Earth that set the stage for life on our planet. They also look beyond our home planet to investigate whether the processes that prepared the Earth for life could be replicated elsewhere — on Mars and other bodies in our solar system, for example, and on planets orbiting other stars.
Monday, November 7, 2011
LHC proton run for 2011 reaches successful conclusion
Geneva, 31 October 2011. After some 180 days of running and four hundred trillion (4×10^14) proton-proton collisions, the LHC’s 2011 proton run came to an end at 5.15pm yesterday evening. For the second year running, the LHC team has largely surpassed its operational objectives, steadily increasing the rate at which the LHC has delivered data to the experiments.
At the beginning of the year’s run, the objective for the LHC was to deliver a quantity of data known to physicists as one inverse femtobarn during the course of 2011. The first inverse femtobarn came on 17 June, setting the experiments up well for the major physics conferences of the summer and requiring the 2011 data objective to be revised upwards to five inverse femtobarns. That milestone was passed by 18 October, with the grand total for the year being almost six inverse femtobarns delivered to each of the two general-purpose experiments ATLAS and CMS.
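To make the quoted totals concrete, here is a minimal back-of-envelope sketch: integrated luminosity multiplied by a process's cross section gives the expected number of events, and the quoted 4×10^14 collisions over roughly 180 days imply an average collision rate. The cross-section value and the rounded luminosity below are illustrative assumptions, not figures from the press release.

```python
# Illustrative sketch only; the cross section and rounded luminosity are
# assumed values for demonstration, not numbers from the press release.

FB_TO_PB = 1000.0  # 1 inverse femtobarn = 1000 inverse picobarns

def expected_events(cross_section_pb, integrated_lumi_invfb):
    """N = sigma * L_int: expected event count for a given process."""
    return cross_section_pb * integrated_lumi_invfb * FB_TO_PB

lumi_2011_invfb = 6.0      # "almost six inverse femtobarns" delivered to ATLAS/CMS
sigma_example_pb = 17.0    # assumed, order-of-magnitude cross section for an example process

print(f"Expected events for the example process: {expected_events(sigma_example_pb, lumi_2011_invfb):.0f}")

# Average collision rate implied by the quoted run totals:
collisions = 4e14
run_days = 180
print(f"Average collision rate: {collisions / (run_days * 86400):.2e} per second")
```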
“At the end of this year’s proton running, the LHC is reaching cruising speed,” said CERN’s Director for Accelerators and Technology, Steve Myers. “To put things in context, the present data production rate is a factor of 4 million higher than in the first run in 2010 and a factor of 30 higher than at the beginning of 2011.”
Physics highlights from this year’s proton running include closing down the space available for the long-sought Higgs and supersymmetric particles to hide in, putting the Standard Model of particle physics through increasingly gruelling tests, and advancing our understanding of the primordial universe.
“It has been a remarkable and exciting year for the whole LHC scientific community, in particular for our students and post-docs from all over the world. We have made a huge number of measurements of the Standard Model and accessed unexplored territory in searches for new physics. In particular, we have constrained the Higgs particle to the light end of its possible mass range, if it exists at all,” said ATLAS Spokesperson Fabiola Gianotti. “This is where both theory and experimental data expected it would be, but it’s the hardest mass range to study.”
“Looking back at this fantastic year I have the impression of living in a sort of a dream,” said CMS Spokesperson Guido Tonelli. “We have produced tens of new measurements and constrained significantly the space available for models of new physics and the best is still to come. As we speak hundreds of young scientists are still analysing the huge amount of data accumulated so far; we’ll soon have new results and, maybe, something important to say on the Standard Model Higgs Boson.”
“We’ve got from the LHC the amount of data we dreamt of at the beginning of the year and our results are putting the Standard Model of particle physics through a very tough test,” said LHCb Spokesperson Pierluigi Campana. “So far, it has come through with flying colours, but thanks to the great performance of the LHC, we are reaching levels of sensitivity where we can see beyond the Standard Model. The researchers, especially the young ones, are experiencing great excitement, looking forward to new physics.”
Over the coming days and weeks, the LHC experiments will be analysing the full 2011 data set to home in further on new physics. However, while it is possible that new physics may emerge, it is equally likely that the full 10 inverse femtobarns initially foreseen for 2011 and 2012 will be required.
As in 2010, the LHC is now being prepared for four weeks of lead-ion running, but in a new development this year, the world’s largest particle accelerator will also attempt to demonstrate that large can also be agile by colliding protons with lead ions in two dedicated periods of machine development. If successful, these tests will lead to a new strand of LHC operation, using protons to probe the internal structure of the much more massive lead ions.
This is important for the lead-ion programme, whose goal is to study quark-gluon plasma, the primordial soup of particles from which the ordinary matter of today’s visible universe evolved.
“Smashing lead ions together allows us to produce and study tiny pieces of primordial soup,” said ALICE Spokesperson Paolo Giubellino, “but as any good cook will tell you, to understand a recipe fully, it’s vital to understand the ingredients, and in the case of quark-gluon plasma, this is what proton-lead ion collisions could bring.”
NASA in Final Preparations for Nov. 8 Asteroid Flyby
NASA scientists will be tracking asteroid 2005 YU55 with antennas of the agency's Deep Space Network at Goldstone, Calif., as the space rock safely flies past Earth slightly closer than the moon's orbit on Nov. 8. Scientists are treating the flyby of the 1,300-foot-wide (400-meter) asteroid as a science target of opportunity - allowing instruments on "spacecraft Earth" to scan it during the close pass.
Tracking of the aircraft carrier-sized asteroid will begin at 9:30 a.m. local time (PDT) on Nov. 4, using the massive 70-meter (230-foot) Deep Space Network antenna, and last for about two hours. The asteroid will continue to be tracked by Goldstone for at least four hours each day from Nov. 6 through Nov. 10. Radar observations from the Arecibo Planetary Radar Facility in Puerto Rico will begin on Nov. 8, the same day the asteroid will make its closest approach to Earth at 3:28 p.m. PST.
The trajectory of asteroid 2005 YU55 is well understood. At the point of closest approach, it will be no closer than 201,700 miles (324,600 kilometers) or 0.85 the distance from the moon to Earth. The gravitational influence of the asteroid will have no detectable effect on anything here on Earth, including our planet's tides or tectonic plates. Although 2005 YU55 is in an orbit that regularly brings it to the vicinity of Earth (and Venus and Mars), the 2011 encounter with Earth is the closest this space rock has come for at least the last 200 years.
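As a quick consistency check of the quoted geometry, the sketch below divides the closest-approach distance from the text by the mean Earth-Moon distance; the lunar distance of roughly 384,400 km is a standard figure assumed here, not a number taken from the release.

```python
# Back-of-envelope check of the flyby geometry; the mean lunar distance is an
# assumed standard value, not taken from the press release.

closest_approach_km = 324_600
mean_lunar_distance_km = 384_400

ratio = closest_approach_km / mean_lunar_distance_km
print(f"Closest approach: {ratio:.2f} lunar distances")  # ~0.84, consistent with the quoted 0.85
```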
During tracking, scientists will use the Goldstone and Arecibo antennas to bounce radio waves off the space rock. Radar echoes returned from 2005 YU55 will be collected and analyzed. NASA scientists hope to obtain images of the asteroid from Goldstone as fine as about 7 feet (2 meters) per pixel. This should reveal a wealth of detail about the asteroid's surface features, shape, dimensions and other physical properties (see "Radar Love" - http://www.jpl.nasa.gov/news/news.cfm?release=2006-00a ).
Arecibo radar observations of asteroid 2005 YU55 made in 2010 show it to be approximately spherical in shape. It is slowly spinning, with a rotation period of about 18 hours. The asteroid's surface is darker than charcoal at optical wavelengths. Amateur astronomers who want to get a glimpse at YU55 will need a telescope with an aperture of 6 inches (15 centimeters) or larger.
The last time a space rock this large came this close to Earth was in 1976, although astronomers did not know about the flyby at the time. The next known approach of an asteroid this large will be in 2028.
NASA detects, tracks and characterizes asteroids and comets passing close to Earth using both ground- and space-based telescopes. The Near-Earth Object Observations Program, commonly called "Spaceguard," discovers these objects, characterizes a subset of them, and plots their orbits to determine if any could be potentially hazardous to our planet.
NASA's Jet Propulsion Laboratory manages the Near-Earth Object Program Office for NASA's Science Mission Directorate in Washington. JPL is a division of the California Institute of Technology in Pasadena.
More information about asteroids and near-Earth objects is at: http://www.jpl.nasa.gov/asteroidwatch .
More information about asteroid radar research is at: http://echo.jpl.nasa.gov/ .
More information about the Deep Space Network is at: http://deepspace.jpl.nasa.gov/dsn .
NASA Telescopes Help Solve Ancient Supernova Mystery
PASADENA, Calif. -- A mystery that began nearly 2,000 years ago, when Chinese astronomers witnessed what would turn out to be an exploding star in the sky, has been solved. New infrared observations from NASA's Spitzer Space Telescope and Wide-field Infrared Survey Explorer, or WISE, reveal how the first supernova ever recorded occurred and how its shattered remains ultimately spread out to great distances.
The findings show that the stellar explosion took place in a hollowed-out cavity, allowing material expelled by the star to travel much faster and farther than it would have otherwise.
"This supernova remnant got really big, really fast," said Brian J. Williams, an astronomer at North Carolina State University in Raleigh. Williams is lead author of a new study detailing the findings online in the Astrophysical Journal. "It's two to three times bigger than we would expect for a supernova that was witnessed exploding nearly 2,000 years ago. Now, we've been able to finally pinpoint the cause."
A new image of the supernova, known as RCW 86, is online at http://go.nasa.gov/pnv6Oy .
In 185 A.D., Chinese astronomers noted a "guest star" that mysteriously appeared in the sky and stayed for about 8 months. By the 1960s, scientists had determined that the mysterious object was the first documented supernova. Later, they pinpointed RCW 86 as a supernova remnant located about 8,000 light-years away. But a puzzle persisted. The star's spherical remains are larger than expected. If they could be seen in the sky today in infrared light, they'd take up more space than our full moon.
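A rough small-angle estimate gives a sense of scale: if the remnant would appear at least as large as the full moon (about half a degree across, an assumed figure) at the quoted distance of roughly 8,000 light-years, its physical diameter follows directly. This is an order-of-magnitude sketch, not a value from the study.

```python
import math

# Small-angle estimate only; the moon's ~0.5 degree apparent diameter is an
# assumed figure, and the result is a rough lower bound, not a study value.

distance_ly = 8000
angular_diameter_deg = 0.5  # approximate apparent diameter of the full moon

diameter_ly = distance_ly * math.radians(angular_diameter_deg)
print(f"Implied remnant diameter: at least ~{diameter_ly:.0f} light-years")  # ~70 ly
```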
The solution arrived through new infrared observations made with Spitzer and WISE, and previous data from NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton Observatory.
The findings reveal that the event is a "Type Ia" supernova, created by the relatively peaceful death of a star like our sun, which then shrank into a dense star called a white dwarf. The white dwarf is thought to have later blown up in a supernova after siphoning matter, or fuel, from a nearby star.
"A white dwarf is like a smoking cinder from a burnt-out fire," Williams said. "If you pour gasoline on it, it will explode."
The observations also show for the first time that a white dwarf can create a cavity around it before blowing up in a Type Ia event. A cavity would explain why the remains of RCW 86 are so big. When the explosion occurred, the ejected material would have traveled unimpeded by gas and dust and spread out quickly.
Spitzer and WISE allowed the team to measure the temperature of the dust making up the RCW 86 remnant at about minus 325 degrees Fahrenheit, or minus 200 degrees Celsius. They then calculated how much gas must be present within the remnant to heat the dust to those temperatures. The results point to a low-density environment for much of the life of the remnant, essentially a cavity.
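For readers checking the units, the Fahrenheit and Celsius figures quoted above are the same temperature converted and rounded; a one-line arithmetic check:

```python
# Simple unit-conversion check of the quoted dust temperature.
t_f = -325.0
t_c = (t_f - 32.0) * 5.0 / 9.0
print(f"{t_f} F = {t_c:.0f} C")  # about -198 C, consistent with the quoted -200 C
```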
Scientists initially suspected that RCW 86 was the result of a core-collapse supernova, the most powerful type of stellar blast. They had seen hints of a cavity around the remnant, and, at that time, such cavities were only associated with core-collapse supernovae. In those events, massive stars blow material away from them before they blow up, carving out holes around them.
But other evidence argued against a core-collapse supernova. X-ray data from Chandra and XMM-Newton indicated that the object consisted of high amounts of iron, a telltale sign of a Type Ia blast. Together with the infrared observations, a picture of a Type Ia explosion into a cavity emerged.
"Modern astronomers unveiled one secret of a two-millennia-old cosmic mystery only to reveal another," said Bill Danchi, Spitzer and WISE program scientist at NASA Headquarters in Washington. "Now, with multiple observatories extending our senses in space, we can fully appreciate the remarkable physics behind this star's death throes, yet still be as in awe of the cosmos as the ancient astronomers."
NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA. For more information about Spitzer, visit http://spitzer.caltech.edu/ and http://www.nasa.gov/spitzer .
JPL managed and operated WISE for NASA's Science Mission Directorate. The spacecraft was put into hibernation mode after it scanned the entire sky twice, completing its main objectives. Edward Wright is the principal investigator and is at UCLA. The mission was selected competitively under NASA's Explorers Program managed by the agency's Goddard Space Flight Center in Greenbelt, Md. The science instrument was built by the Space Dynamics Laboratory in Logan, Utah. The spacecraft was built by Ball Aerospace & Technologies Corp. in Boulder, Colo. Science operations and data processing take place at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA. More information is online at http://www.nasa.gov/wise and http://wise.astro.ucla.edu and http://www.jpl.nasa.gov/wise .
NASA's Fermi Finds Youngest Millisecond Pulsar, 100 Pulsars to Date
WASHINGTON -- An international team of scientists using NASA's Fermi Gamma-ray Space Telescope has discovered a surprisingly powerful millisecond pulsar that challenges existing theories about how these objects form.
At the same time, another team has located nine new gamma-ray pulsars in Fermi data, using improved analytical techniques.
A pulsar is a type of neutron star that emits electromagnetic energy at periodic intervals. A neutron star is the closest thing to a black hole that astronomers can observe directly, crushing half a million times more mass than Earth into a sphere no larger than a city. This matter is so compressed that even a teaspoonful weighs as much as Mount Everest.
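These comparisons can be checked to order of magnitude with round numbers. In the sketch below, the Earth mass, the roughly 10-kilometre "city-sized" radius, and the teaspoon volume are all assumed values, not figures from the article.

```python
import math

# Order-of-magnitude check of the density claim; all inputs are assumed round
# numbers, not figures from the article.

earth_mass_kg = 5.97e24
star_mass_kg = 5e5 * earth_mass_kg   # "half a million times more mass than Earth"
radius_m = 10_000                    # "a sphere no larger than a city" -> ~10 km radius assumed

volume_m3 = (4.0 / 3.0) * math.pi * radius_m**3
density = star_mass_kg / volume_m3   # ~7e17 kg per cubic metre

teaspoon_m3 = 5e-6                   # ~5 millilitres, assumed
print(f"Mean density: {density:.1e} kg/m^3")
print(f"Teaspoon mass: {density * teaspoon_m3:.1e} kg")  # trillions of kg, i.e. mountain-scale
```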
"With this new batch of pulsars, Fermi now has detected more than 100, which is an exciting milestone when you consider that, before Fermi's launch in 2008, only seven of them were known to emit gamma rays," said Pablo Saz Parkinson, an astrophysicist at the Santa Cruz Institute for Particle Physics at the University of California Santa Cruz, and a co-author on two papers detailing the findings.
One group of pulsars combines incredible density with extreme rotation. The fastest of these so-called millisecond pulsars whirls at 43,000 revolutions per minute.
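Converting the quoted spin rate to a rotation period is a one-line arithmetic check (not a figure from the article):

```python
# Convert the quoted 43,000 revolutions per minute to a rotation period.
rpm = 43_000
period_ms = 60.0 / rpm * 1000.0
print(f"Rotation period: {period_ms:.1f} ms")  # ~1.4 ms, hence "millisecond pulsar"
```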
Millisecond pulsars are thought to achieve such speeds because they are gravitationally bound in binary systems with normal stars. During part of their stellar lives, gas flows from the normal star to the pulsar. Over time, the impact of this falling gas gradually spins up the pulsar's rotation.
The strong magnetic fields and rapid rotation of pulsars cause them to emit powerful beams of energy, from radio waves to gamma rays. Because these beams carry away rotational energy, the pulsar gradually slows down again once the transfer of gas from its companion ends.
Typically, millisecond pulsars are around a billion years old. However, in the Nov. 3 issue of Science, the Fermi team reveals a bright, energetic millisecond pulsar only 25 million years old.
The object, named PSR J1823−3021A, lies within NGC 6624, a spherical collection of ancient stars called a globular cluster, one of about 160 similar objects that orbit our galaxy. The cluster is about 10 billion years old and lies about 27,000 light-years away toward the constellation Sagittarius.
Fermi's Large Area Telescope (LAT) showed that eleven globular clusters emit gamma rays, the cumulative emission of dozens of millisecond pulsars too faint for even Fermi to detect individually. But that's not the case for NGC 6624.
"It's amazing that all of the gamma rays we see from this cluster are coming from a single object. It must have formed recently based on how rapidly it's emitting energy. It's a bit like finding a screaming baby in a quiet retirement home," said Paulo Freire, the study's lead author, at the Max Planck Institute for Radio Astronomy in Bonn, Germany.
J1823−3021A had previously been identified as a pulsar through its radio emission. The nine new pulsars, by contrast, were found directly in the gamma-ray data: none are millisecond pulsars, and only one was later found to emit radio waves.
Despite its sensitivity, Fermi's LAT may detect only one gamma ray for every 100,000 rotations of some of these faint pulsars. Yet new analysis techniques applied to the precise position and arrival time of photons collected by the LAT since 2008 were able to identify them.
"We adapted methods originally devised for studying gravitational waves to the problem of finding gamma-ray pulsars, and we were quickly rewarded," said Bruce Allen, director of the Max Planck Institute for Gravitational Physics in Hannover, Germany. Allen co-authored a paper on the discoveries that was published online today in The Astrophysical Journal.
Allen also directs the Einstein@Home project, a distributed computing effort that uses downtime on computers of volunteers to process astronomical data. In July, the project extended the search for gamma-ray pulsars to the general public by including Fermi LAT data in the work processed by Einstein@Home users.
NASA's Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership. It is managed by NASA's Goddard Space Flight Center in Greenbelt, Md. It was developed in collaboration with the U.S. Department of Energy, with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.