
Louvre accelerator gets an upgrade
by Jennifer Toes (CERN)


New AGLAE multi-detector with 5 SDD PIXE detectors, 1 HPGe PIGE detector and an optical fiber for IBIL. The annular PIPS E/RBS detector is located inside the beamline, around the beam. (Image: © C2RMF – AGLAE V. Fournier)

Whilst the Parisian Louvre museum may be best known as home to some of the world’s most revered and priceless art and antiquities, in the field of high-energy physics it sits in close proximity to one of the premier sites for the use of accelerators in cultural heritage.

The Accélérateur Grand Louvre d'Analyse Elémentaire (AGLAE) is part of the French Ministry of Culture’s Centre for Research and Restoration of Museums of France (C2RMF). The accelerator serves more than 1200 French museums and assists in multiple national and international research projects.

Researchers at C2RMF are able to study objects using ion-beam techniques: proton-induced X-ray emission (PIXE), proton-induced gamma-ray emission (PIGE), Rutherford backscattering spectrometry (RBS) and ion beam induced luminescence (IBIL).

AGLAE’s unique position allows its beamtime to be dedicated entirely to cultural heritage work; the beam can be used to answer questions on the provenance, composition, authentication and degradation of objects made of stone, metal, glass and ceramics.

In an effort to increase the capability of the AGLAE facilities, an upgrade of the accelerator is ongoing. The “New AGLAE” will include a multi-detector system and will allow for systematic imaging and the automation of the beamline.

New particle analysis techniques will minimise the risk of damage to the objects under study – a crucial concern in cultural heritage studies.

“The New AGLAE project is part of the development, upgrading and optimization of the beamline since its settlement in the Louvre premises in 1988 for its specific applications to art objects with their proper constraints,” says Claire Pacheco, leader of the AGLAE research group.

Five silicon drift detectors (SDDs) have replaced the former two Si(Li) detectors, providing larger solid angles and enabling the study of more fragile materials.

The area of interest on the object is scanned by combining a vertical magnetic deflection of the beam of up to 500 µm with a horizontal mechanical translation of the target. List-mode acquisition, coupled with this scanning, enables systematic chemical imaging.
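To illustrate the principle, here is a minimal sketch of how list-mode events could be binned into elemental maps. It is not AGLAE’s actual software; the energy windows, scan ranges and array sizes are invented for illustration.

```python
# Sketch: turning list-mode PIXE events into 2-D elemental maps.
# Each event records the stage position (x), the beam deflection (y)
# and the detected X-ray energy.
import numpy as np

# Hypothetical region-of-interest energy windows (keV) for a few elements.
ROIS = {"Ca": (3.6, 3.8), "Fe": (6.3, 6.5), "Cu": (7.9, 8.1)}

def build_maps(events, nx=200, ny=50, x_range=(0.0, 20.0), y_range=(0.0, 0.5)):
    """events: iterable of (x_mm, y_mm, energy_keV) tuples."""
    maps = {el: np.zeros((ny, nx)) for el in ROIS}
    for x, y, e in events:
        ix = int((x - x_range[0]) / (x_range[1] - x_range[0]) * nx)
        iy = int((y - y_range[0]) / (y_range[1] - y_range[0]) * ny)
        if not (0 <= ix < nx and 0 <= iy < ny):
            continue  # event outside the scanned area
        for el, (lo, hi) in ROIS.items():
            if lo <= e <= hi:
                maps[el][iy, ix] += 1  # one more characteristic X-ray count
    return maps
```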

Increasing the hours of beamtime is crucial to meeting the growing demand for work with AGLAE. A more constant, automated beamline would not only allow further study of the museum collections, but would also accommodate the increasing number of project proposals submitted to the C2RMF.

As such, a call for bids to upgrade the beamline was opened in 2014, and the contract was later awarded to Thales.

The Thales proposal aims to provide control of the Terminal Voltage via a digital system which is to be integrated directly into the industrial automated machine. In addition, the alpha zone will house two 90° and two 45° magnets, and a quadrupole triplet enabling the beam to be more stable in energy and position. Finally, a customised human-machine interface (HMI) will be installed for operation and maintenance of the machine.

The installation and testing of the New AGLAE is due to be completed and ready for users by July 2017. A call for proposals opens twice a year, and European user groups can be financially supported by the European Commission through the IPERION CH programme.

Commenting on the future of the facility, Claire Pacheco says: “We are looking forward to welcoming the French and European users at the New AGLAE facility and to showing them its new capabilities.”

-

Claire Pacheco presented a seminar organised by the CERN Knowledge Transfer (KT) Department in January 2017.


Accelerator Education goes further
by Sabrina El Yacoubi (CERN) & Graeme Burt (University of Lancaster/The Cockcroft Institute) 


Participants at accelerator workshop (Image: QUASAR Group) 

“Knowledge belongs to mankind, not to scientists,” said Fabiola Gianotti, CERN Director-General, at the 2017 World Economic Forum. Nowadays the scientific community better understands the need for public engagement: demonstrating its work to the public through education, outreach, policy and many other activities is one of its main responsibilities alongside its scientific duties.

Like many other institutes and universities, CERN and the Cockcroft Institute have strengthened their education and outreach activities by providing two new educational programmes for different audiences.

The Cockcroft Institute is launching an exciting new education programme of lectures on accelerator science and technology, to be delivered via webcast and video archives. This will provide a new free resource for the worldwide accelerator community, as a supplement to existing accelerator schools.

The programme provides both a general introduction to the subject for non-technical audiences and education on more advanced topics, serving as a quick refresher for experienced staff where a traditional accelerator school may not be available. All course videos and slides are free to view, and anyone in our community is strongly encouraged to use them. The resource should also act as an inspiration for other institutions to consider similar training initiatives.

The Cockcroft Institute is a UK-based collaboration between Daresbury Laboratory and several UK universities (Lancaster, Liverpool, Manchester and Strathclyde) to provide training for the next generation of accelerator scientists and engineers required to develop and optimise future accelerator facilities and light sources. The institute has been very successful in these efforts, initiating a large number of international training networks, such as DITANET, oPAC, OMA and AVA, as well as providing an in-house lecture series for the institute’s postgraduate students. The latter includes a comprehensive set of training courses which provides all PhD students at the institute with a broad education in accelerator science outside of their own specific discipline. An online provision was also added to accommodate the large numbers of students based at overseas laboratories for at least part of their PhD work.

The lectures are primarily delivered by academic staff and accelerator experts from the stakeholder universities and the UK’s Science and Technology Facilities Council (STFC). Some external lecturers also complement the programme in areas where in-house expertise is not available. This online resource now also benefits the wider accelerator community and thus closes an existing training gap identified by a number of studies, such as the EU-funded TIARA project. More information and all course material can be found on the Cockcroft Institute’s website, and you can also follow the institute on Facebook and Twitter to receive the latest news about new lectures and short courses.

In parallel, CERN is trying to reach the whole population of the organization and beyond with a large spectrum of courses and training. The CERN Accelerator School, established in 1983, holds courses on particle accelerators and colliders every year in one of the CERN Member States, with the aim of transmitting and sharing knowledge. Complementary to this school, a yearly lecture series on “Introduction to Particle Accelerators” (AXEL) is held for technicians who operate accelerators or whose work is closely linked to them; the lectures are also open to engineers and physicists interested in the subject.

However, CERN has gone further with education and outreach by hosting a new lecture titled “Accelerators explained for everyone – without Maths”. The lecture originated with Rende Steerenberg, Head of the Operations Group in the Beams Department, who saw the need for a lecture open to everyone, with no prior knowledge of accelerators required.

The objective is to widen the audience and to give a general overview of the CERN accelerator complex. The lecture is open to everybody willing to gain a basic knowledge of how the beam is shared between the LHC and all the other experiments, the LHC cycle, injection and extraction of particles, guiding and accelerating particles, energy, basic beam diagnostic tools and performance limitations. Without diving into mathematical formulas and concepts, Rende Steerenberg reaches the public by choosing images, comparisons and equivalents from our daily lives to increase public understanding of basic scientific facts and concepts.


Example slide featuring a diagram of the CERN accelerator complex (Credit: CERN)

Five lectures have been given so far, and a few more are planned for 2017. They are open to all personnel at CERN and are given in both French and English.

In the current climate, education and training are crucial aspects of most research and development projects, helping to ensure that future generations of scientists are well prepared. Indeed, the TIARA project conducted a series of surveys and produced a document containing suggestions, based on the results, for how to improve accelerator education and training. These included actions such as the development of training lectures or the provision of scholarships and accelerator schools.

In a similar vein, the ARIES project, due to begin in May 2017, includes a task dedicated to outreach, education and training. ARIES will develop an e-learning course aimed at undergraduate students to deliver an introduction to accelerator science, engineering and technology. 

A partnership-mentorship approach for global access to radiation therapy
by Virginia Greco (CERN)

An electron linac for conventional treatments with X-rays and electrons. Photo from the Clinic of Génolier (Credit: Max Brice, CERN)

In November 2016, CERN hosted a Workshop on Design Characteristics of a Novel Linear Accelerator for Challenging Environments, organized by Norman Coleman and David Pistenmaa from the International Cancer Experts Corps (ICEC) in collaboration with Manjit Dosanjh from CERN.

Participation in the event was by invitation only, reserved for about 70 internationally recognized experts in various fields related to radiotherapy for cancer treatment. They met to define a strategy for increasing access to radiotherapy for a larger number of people and to discuss possible solutions for geographical areas that present economic and technological challenges, as well as quickly changing political situations.

The idea of designing affordable equipment and developing sustainable infrastructures for delivering radiation treatment for cancer in countries that lack resources and expertise is a core mission of ICEC. Established in 2013 as a non-governmental organization, ICEC has set itself up as a sustainable international mentoring network of cancer professionals, whose aim is to establish partnership projects in low- and middle-income countries, as well as in isolated indigenous communities of all countries, oriented towards facilitating access to radiotherapy and improving the quality of the treatment offered.

This will be achieved by encouraging and supporting initiatives of local groups, providing mentorship and training, and guiding them through a number of steps to be completed in order to be recognized as high-standard cancer care centres.

Leading experts from key international organisations, research institutes, universities, hospitals and companies producing equipment for conventional X-ray and particle therapy took the stage in turn at the workshop to share their knowledge and expertise and to discuss needs, goals and possible solutions. The key topics of discussion were the technology to be employed, sustainability, and training.

An essential step that ICEC and the collaborating experts have to accomplish is the design of a linear accelerator, and the associated instrumentation needed to deliver radiotherapy, that can be operated in places where general infrastructure is poor or lacking, where power outages and water-supply fluctuations can occur, and where climatic conditions might be harsh.

The ideal facility should have a modular structure, so that it can be easily shipped, assembled in situ, upgraded and repaired. To be easily operated, the equipment also needs an intuitive, accessible interface, like that of a smartphone, even though the underlying technology is highly advanced.

A critical issue also discussed at the meeting at CERN was the sustainability of the treatment system after its installation. Specialized technical staff are required to maintain the equipment and repair it promptly if needed, relying on the availability of standard spare parts and on replacement procedures that will be developed to make maintenance as easy as possible.

Difficulties of travel and communication also have to be taken into account. As a consequence, these centres have to be modelled on the philosophy of a space station, where astronauts have spare components available and can easily replace faulty parts like Lego pieces, with remote guidance.

The participants in the workshop agreed that training is fundamental to making this ambitious project possible. ICEC’s strategy consists of setting up a team of mentors to guide local groups throughout the various phases of the programme. In this way, each centre located in a region with cancer treatment disparities and insufficient resources that aims to implement radiotherapy would be associated with a centre in a resource-rich country, and would eventually become a reference centre for other local groups willing to undertake a similar path.

Professionals in oncology, radiotherapy and radiobiology, medical physicists, as well as nurses and ancillary staff, will have to be identified in order to ensure assistance to remote locations. After completing their regular academic training, the personnel of the remote centre would be mentored and trained by ICEC’s experts through face-to-face lectures, periodic on-site visits and consultations via video link. This would ensure that, at a later stage, they would in turn be able to train future employees.

At the end of two intense days of debate and exchange of ideas, the participants had a more precise picture of needs, limits and priorities, as well as plenty of input for further reflection. As a follow-up, working groups will be established to address different aspects of the problem, and a date for another global meeting will be fixed. An editorial board will write a report of the workshop, which will also be submitted for publication in medical journals.

The report emerging from the workshop will be publicised in various media and journals in order to highlight the initiative and build further momentum.


Accelerator Fault Tracking at CERN
by Chris Roderick (CERN)


During Run 2 the LHC achieved an outstanding performance (Image: CERN CDS)

CERN’s Accelerator Fault Tracking (AFT) system aims to facilitate answering questions like: “Why are we not doing Physics when we should be?” and “What can we do to increase machine availability?”

People have tracked faults for many years, using numerous diverse, distributed and unrelated systems. As a result, and despite a lot of effort, it has been difficult to get a clear and consistent overview of what is going on, where the problems are, how long they last and what their impact is. This is particularly true for the LHC, where faults may induce long recovery times after being fixed.

The AFT project was launched in February 2014 as a collaboration between the Controls and Operations groups, with stakeholders from the LHC Availability Working Group (AWG).

The project was initially divided into three phases. The first phase was completed on time, ahead of the LHC restart after Long Shutdown 1 (2013–2014), and delivered the means to achieve consistent and coherent data capture for the LHC from an operational perspective. The second phase has been in progress during 2015–16, working on detailed fault classification and analysis for equipment groups. The third phase (pending) foresees extended integration with other systems, e.g. asset management tracking, to enable predictive failure analysis and the planning of preventive maintenance operations.

AFT helps various teams from around CERN, and output from the Web application regularly features in various machine coordination and operations meetings. Furthermore, the AWG and various equipment group representatives are using AFT data and statistics to analyse the performance of their systems and target areas for improvement – as presented at various conferences and workshops [1] [2], and summarized in regular AWG reports [3] [4] [5].

If a picture is worth a thousand words, then take a look at the AFT cardiogram (Figure 1), which displays LHC faults occurring in 2016 between Technical Stops 1 and 2, together with the machine activity data.

Figure 1: LHC faults between 2016 Technical Stops 1 and 2

AFT allows relationships between faults to be represented, such as child faults (shown in pink on the cardiogram) and faults blocking the resolution of another fault. With such data it is possible to analyse availability from different perspectives: raw system downtime, impact on machine availability (accounting for faults occurring in the shadow of ongoing faults) and root-cause analysis (assigning child-fault downtime to parent faults). Figure 2 shows an example of such a comparison, for a specific sub-domain of systems displayed in the AFT Web application.
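To make the three perspectives concrete, here is a minimal sketch of how such downtime accounting could be computed from a list of fault intervals. This is purely illustrative and not the actual AFT implementation; the data model and function names are invented.

```python
# Sketch: three ways of accounting for fault downtime (hours).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Fault:
    system: str
    start: float                        # hours
    end: float
    parent: Optional["Fault"] = None    # set for child faults

def raw_downtime(faults: List[Fault]) -> float:
    """Sum of individual fault durations, double-counting overlaps."""
    return sum(f.end - f.start for f in faults)

def availability_impact(faults: List[Fault]) -> float:
    """Merge overlapping intervals: time spent in the shadow of an
    ongoing fault is not counted twice."""
    merged = []
    for f in sorted(faults, key=lambda f: f.start):
        if merged and f.start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], f.end)
        else:
            merged.append([f.start, f.end])
    return sum(end - start for start, end in merged)

def root_cause_downtime(faults: List[Fault]) -> dict:
    """Assign each child fault's downtime to its root parent's system."""
    totals: dict = {}
    for f in faults:
        root = f
        while root.parent is not None:
            root = root.parent
        totals[root.system] = totals.get(root.system, 0.0) + (f.end - f.start)
    return totals
```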

Figure 2: Comparison of fault time from different perspectives for LHC technical services

Other functionality includes fault searching, and data export with a workflow for fault follow-up by different experts. As with most data-centric systems, the value of the infrastructure and tools is governed by the quality of the data, so the role of the AWG – which meets regularly to ensure the completeness and correctness of the AFT data – should not be underestimated.

The technologies involved are a database to persist fault data, a Java server with REST APIs for data exchange with the Operations team’s e-logbooks (and potentially other systems), and a dedicated Web application for data editing, visualization and analysis (shown in the screenshots above).

The AFT system has been designed to be non-LHC-specific, and is therefore able to cater for fault tracking for other accelerators if so desired. Following the success of AFT for the LHC during 2015, in 2016 the CERN Machine Advisory Committee proposed that AFT be used for CERN’s Injector Complex. As such, work started in late 2016 to prepare for AFT usage in the Injector Complex from the start of 2017 operation at the end of March.


References

[1] 7th Evian Workshop, 2016, https://indico.cern.ch/event/578001

[2] LHC Performance Workshop (Chamonix 2017), https://indico.cern.ch/event/580313/

[3] LHC Availability 2016: Restart to Technical Stop 1, CERN-ACC-NOTE-2016-0047, http://cds.cern.ch/record/2195706?ln=en

[4] LHC Availability 2016: Technical Stop 1 to Technical Stop 2, CERN-ACC-NOTE-2016-0066, http://cds.cern.ch/record/2235082?ln=en

[5] LHC Availability 2016: Technical Stop 2 to Technical Stop 3, CERN-ACC-NOTE-2016-0065, http://cds.cern.ch/record/2235079?ln=en


Interview with Prof. John Womersley, Director of ESS
by Panos Charitos (CERN)


Professor John Womersley (Image: ESS)

Accelerating News Editor-in-Chief Panos Charitos sat down with Professor John Womersley, Director General of the European Spallation Source (ESS), to discuss his experience at ESS and the future of European infrastructures and projects.

PC: What are the main challenges in your new mandate as Director General of ESS?

JW: The European Spallation Source (ESS) is one of the world’s largest scientific facilities and as such presents many interesting challenges. Scientists, staff, partner institutions and countries across Europe have come together to build what will be the world's leading neutron source for research on materials and life sciences. ESS will provide neutron beams up to 100 times brighter than those of existing facilities, and this calls for the development of state-of-the-art technologies.

The limitations of reactor technology have long been known and there is a consensus that accelerator-driven spallation sources are the next step forward. With an improved source there is also the need for ESS to develop increasingly sophisticated instruments and detectors.

All these developments are taking place on a greenfield site in Lund, Sweden, and everything has to be built from scratch. ESS is not part of an existing laboratory, so we now have to develop the infrastructure and also recruit the staff who will operate the neutron source once it is running.

 Visualisation of the European Spallation Source (ESS) in Lund (Image: ESS)

ESS is receiving in-kind contributions from almost 100 different partner institutes and suppliers from around Europe. The large amount of in-kind contributions also poses a significant integration challenge that adds to the complexity of the project. Over the past year, instrument design has advanced rapidly, with scope-setting, engineering, and the establishment of each instrument’s budget and schedule.

One way to think of the challenge is that it is like putting the ATLAS or CMS detectors together and integrating the different subdetectors that are designed and built by different international teams of physicists. This also presents us with a great opportunity to build a truly international laboratory on a site that is very hospitable and very welcoming to researchers and partners from all over the world.

PC: What are the main advantages of ESS that attract new members?

JW: There are important material science communities in many countries across Europe. Let me note that material science is important as it addresses many of the big challenges that lie ahead in the 21st century, including energy sustainability, healthcare and climate change. Further developments require new materials with unique properties, and neutron scattering is an excellent way to explore and monitor the properties of these materials at the molecular and atomic level, thus allowing for the development of new materials.

(Credit: ESS)

ESS will go way beyond what is currently available in terms of neutron flux and instrumentation capability. ESS builds on an existing vision in Europe that dates back at least 50 years, but it is a facility that offers vastly expanded capabilities.

PC: What do you bring from your previous experience as CEO of STFC to this challenging role?

JW: I think that my background in particle physics gave me invaluable experience in building large-scale projects and managing them in a collaborative way: a lot of different laboratories coordinating to build different pieces of instrumentation and integrate them in a single project. In my view, the particle physics community has an excellent track record of delivering collaborative projects on time and on budget.

Moreover, from my role at STFC comes an appreciation of the multi- and interdisciplinary aspects that are common at ESS. One could think of ESS as applying cutting-edge accelerator technologies (using superconducting RF cavities conceived for future particle physics accelerators) to address challenges in engineering, biophysics and healthcare. Under my leadership, STFC developed a very good track record for making the case for Big Science to all the stakeholder groups involved.

Last but not least, through the years I have always kept an eye on science communication and advocacy, which are very important for ESS but also for other laboratories around the world. Stakeholders in large-scale scientific projects need to be continually reminded of their value and importance.

PC: Why do you think it is important to continue investing in large-scale research infrastructures?

JW: I think there are many reasons. First of all, the open questions in science, whether in fundamental physics, astronomy or engineering, require that we develop new instruments and push existing technologies further.

If we want to progress in science we need scientific infrastructures that offer new capabilities beyond our present horizon. This means investing resources in new, large-scale research facilities. They take a lot of time to design and build, and they require both human and financial resources, which is why we need to build big collaborations to succeed in these efforts.

Particle physics has been working this way for many decades, while other fields like biomedical research are now starting to form large collaborations and become accustomed to this new way of doing fundamental research.

To make these large-scale research infrastructures sustainable, I think it is important to recognize some of the risks and challenges linked to the size of these big projects. First of all, there is typically a long time from concept to realization, and thus we should ensure that students and post-doctoral researchers have plenty of working opportunities during the different stages of a project. Secondly, it is crucial to ensure that scientists develop new skills and learn to work in large collaborative schemes, especially younger scientists, who can easily feel lost in a big collaboration. It is important to keep all the collaborators motivated about a project and also give them space to develop new skills that may help them in their career paths.

Another important point about large-scale research infrastructures is that they offer a physical space to meet and interact with your colleagues. Though we live in an internet-connected world with many opportunities for instantaneous communication, it is much more fruitful if you can share solutions in a collaborative way that includes both physical meetings and digital communication.

All in all, it is important to continue investing in large-scale research infrastructures since they are clusters of innovation, incubators of collaboration and the way to make progress towards tackling the biggest scientific challenges.

PC: How important is it to identify the stakeholders in large-scale projects from an early stage, and what is the role of ESFRI?

JW: It is absolutely critical to understand the stakeholder environment, since these big scientific projects require investments beyond what a single funding agency or research laboratory can do.

Typically they require some form of national decision-making, either at the level of a funding agency or some form of governmental agency. In that sense, scientists need to be connected with the decision makers who have been entrusted with budgets at that level. Decision makers are often not scientists, or they come from a different discipline, making it harder to communicate your scientific case. On top of that, they always need to compare different research priorities before allocating the available budget. This is why it is important to be very strong in communicating not only the hard scientific case but also the benefits that stem from fundamental research.

This has been one of the main challenges for ESFRI. We tried to bring representatives of governments together around the table and set up a roadmap process to identify the main research challenges in different research areas. The goal was to set common priorities for the European research area and progress them more efficiently.

In ESFRI, we have tried to address not just the scientific relevance of a given project but also its readiness, meaning the maturity of the project. Our aim was twofold: to educate governments and funding agencies about the scientific priorities, but also to educate scientists about what funding agencies would like to see, in particular the need for a very clear project plan. Scientists should identify sources of funding, but also evaluate the impact that their research has in their own field, along with its interdisciplinary impact and the socio-economic benefits that stem from fundamental research.

ESFRI is not a funding agency with its own budget, but it offers a certain level of advocacy, presenting to governments the future scientific opportunities and investments. At the same time, we provided feedback to the scientific community (especially in cases where we thought that a project was not mature enough). I hope that our work contributed to making ESFRI a rigorous body from which both the scientific community and funding agencies can benefit.

PC: Do you think that more and more scientists have to prove the practical application of their research?

JW: This is a discussion that has gone on for many years and I remember that even when I studied physics as an undergraduate there was much debate about applied versus pure research. Today I think that this discussion has moved on in a positive direction. In my view there is no strict distinction between research carried out to answer fundamental questions and research carried out to answer some practical and perhaps pressing problems. These are different aspects of research that reflect different timescales.

Scientists should make a constant effort to communicate the basic questions of their research. It is unreasonable to expect decision makers and the public to provide funding without discussing your research and its possible outputs. From my experience, the public and politicians are open to understanding the value of research, including the training opportunities it provides for young people.

All of these broader aspects need to be included. Let me add that scientists working in "purer" research shouldn’t worry about the difficulty of discussing very abstract or very technical issues. On the contrary, it should be seen as a big opportunity for some of the ongoing exciting research projects to talk about the impact that their results have had, including their socio-economic benefits. We have a wide range of impacts and application stories coming from HEP: from aerospace technology, whose financial benefits are easy to comprehend, to research into the fundamental mechanisms of biology, and all the way up to gravitational waves, which may never be applied but where the technology developed and the interest in science created have high value.

The discovery of the Higgs boson generated thousands of newspaper stories and a high level of interest in science, inspiring possibly millions of people around the world to visit a science museum or watch a documentary. That's a major socio-economic impact.

All in all, it is important to be enthusiastic about communicating our research to different stakeholders, including the public, fellow scientists and journalists. This broader engagement of stakeholders is the right way to think, instead of trapping ourselves in the false choice between basic and applied research. In times of economic difficulty we need to invest more in education, training and innovation, because this is how our economy can improve and lead to a brighter future.

PC: How do you see the transformation of the European research area?

JW: ESS offers a European-scale solution to a problem pointed out by hundreds of scientists in national research communities. Many countries are decommissioning research reactors that supported research with neutron beams, and by pooling resources into a single new project the scale and capabilities of that project can be much larger. However, each of these countries needs to learn and adopt new ways of working based on international collaboration. If you like, it is a shift from quantity to quality: a shift from having many competing research centres to investing in a research centre that is better and more diverse in nature. We need to learn to share resources and collaborate more. This is happening nowadays across Europe and is a major shift common to many research areas.

The main challenge in delivering this change is that it is happening without a European budget for research infrastructures. H2020 has a significant budget overall, but only a small fraction of it goes to research infrastructures. Regarding ESS, about 1% has come from the EU budget – the rest comes from member states. So we need to create these collaborative projects by bringing together national funding. This is part of the way the ERA is structured and is also a strength. It means that governments make a deliberate decision to be part of these projects, and they don’t feel that their funding will be lost or that they have lost oversight of how it is used.

Presently, as the 9th Framework Programme is being designed, there are big questions about whether research infrastructures will take a bigger role. I hope they do, as they are major investments. I welcome the funding within H2020 for activities like design studies for future research infrastructures, but more is needed in my opinion.

PC: What do you think about the present landscape and the future of high-energy physics?

JW: I think there are three very attractive features of today's landscape that we need to remember and communicate. We have just discovered the Higgs particle and we need to study it in detail and understand it further. We have a machine, the LHC, that can be upgraded substantially and will give us the tools needed to study the Higgs boson with better statistics and higher precision.

Moreover, astroparticle searches for dark matter will be complemented by collider searches. We are more and more convinced of the existence of dark matter as we accumulate more cosmological and astrophysical results, but we still have no idea what it actually is.

Finally, neutrino oscillations are a constant and very tangible reminder that there is physics beyond the Standard Model. Nature has been kind enough to give us three neutrino flavours with large mixing angles, so the next generation of long-baseline experiments will be able to further explore the nature of neutrinos, their mass hierarchy, and probe CP violation.

The one thing which is missing is a credible plan for a new collider. So it is appropriate to explore future opportunities like a Future Circular Collider at CERN. Going sufficiently beyond the current energy scales opens great opportunities for the field; that's how particle physics has worked for many decades. In the past, many large exploratory projects in other areas similarly had no guarantee of new discoveries, but they offered deeper insights into scientific theories and opened new prospects. We need to communicate clearly the opportunities presented by a large-scale research infrastructure, while also explaining that part of any technological R&D is to ensure the affordable construction and sustainable operation of such an infrastructure.

CESSAMag delivering impact
by Livia Lapadatescu (CERN)

The main objective of the FP7-CESSAMag (CERN-EC Support for SESAME Magnets) project was to support the construction of the SESAME light source in the Middle East. With a financial contribution from the EC, CERN’s task was to deliver the magnetic system and its powering scheme for the SESAME main accelerator ring, as well as to support the training of SESAME staff. Completed at the end of 2016, the project fulfilled or exceeded all its objectives.

Scientific and technical impact of CESSAMag

Section of the SESAME Main Accelerator Ring (Image credit: CERN)

Building upon SESAME studies, CESSAMag finalized the requirements and design and produced the engineering and technical drawings of the SESAME magnets and powering scheme. The first main result of CESSAMag is the production of design reports on the combined-function bending magnets, on the quadrupole magnets (long and short), on the sextupole magnets with their auxiliary corrector windings, and on the powering scheme. These design and engineering study reports were used as background for the technical specifications needed for tendering and can serve as a reference for the construction of similar light sources.

During the tendering process, CERN made a special effort to place orders not only with experienced European companies, but also with companies based in some of the SESAME Members (Cyprus, Israel, Pakistan, Turkey) that had no previous experience in accelerator components (except for Israel) but demonstrated potential and motivation. This was achieved through effective knowledge transfer from CERN and generated potential commercial impact in the companies trained.

All magnets successfully passed the acceptance tests at either ALBA-CELLS or CERN and their measured field quality and reproducibility from magnet to magnet are excellent, making them a reference for similar synchrotrons. Therefore, a key result of CESSAMag is the string of magnets forming the SESAME storage ring, composed of: 

  • 16 combined function bending magnets (dipole + quadrupole)

  • 64 quadrupoles of two types: 32 long focusing and 32 short defocusing quadrupoles

  • 64 sextupole/correctors

CESSAMag also contributed to the production of an improved magnet powering scheme: rather than procuring power supplies adapted to each kind of magnet, CERN proposed another approach, based on light-source standards (PSI), which allows individual powering of the quadrupoles and, by standardizing interfaces, simplifies maintenance through plug-and-play modules. With this strategy, SESAME benefits from a powering scheme that is more powerful, flexible and robust than initially foreseen.

Following the decision to procure some components from companies based in the SESAME Members, and thanks to the in-kind contribution of Pakistan, which offered the assembly of 50% of the sextupoles, CESSAMag managed to deliver a more powerful and complete magnetic system and to reduce the financial share that SESAME was due to contribute to the project.

Finally, CESSAMag contributed to the magnet integration and commissioning, with the goal of putting SESAME fully in control of the equipment delivered by CERN.

The first beam circulated in the SESAME main accelerator ring on 11 January 2017, and it was stored and accumulated up to 20 mA in mid-February. The next steps are ramping the beam and completing the RF stations, with the final acceleration assessment expected before the end of the summer. The inauguration ceremony of the SESAME light source will take place in mid-May, with high-ranking officials from SESAME Members and Observers expected to attend. The first user experiments are foreseen to start in Q3.

Political and social impact of CESSAMag

A significant aspect showcasing the socio-economic impact of CESSAMag is the knowledge transfer to companies from SESAME Members and the training of SESAME staff. The training given to staff, engineers and companies from SESAME Members amounts to about 90 person-months, and the CERN personnel effort in training and knowledge transfer amounts to 16 person-months.

In the context of CESSAMag, international collaborations and agreements were established between CERN and SESAME, and between CERN and ALBA-CELLS; implementation agreements were formed with PAEC (Pakistan), TAEK (Turkey) and ILSF (Iran), and an informal collaboration was set up with the IAEA, which provided financial support for training and experts’ visits between CERN and SESAME. These collaborations and agreements illustrate the international and science diplomacy dimensions of the project.

Furthermore, the European Union acknowledged the science diplomacy impact of CESSAMag and took further steps in support of SESAME. Since 2015, the EU has been an Observer in the SESAME Council, and the EC decided to further support the training of SESAME users and staff in the framework of OPEN SESAME (Opening Synchrotron Light for Experimental Science and Applications in the Middle East), an H2020 “Policy and international cooperation measures for research infrastructures” project.

*****

This article is based on the Project Final Report.  

Laser technology to help take the LHC to the next level
by Panos Charitos (CERN)

Jointly developed by researchers from the University of Dundee and the Science and Technology Facilities Council (STFC), the technology – which is known as LESS (Laser Engineered Surface Structures) – could increase the range of experiments possible on the LHC by helping to clear the so-called “electron cloud”: a cloud of negative particles which can degrade the performance of the primary proton beams that circulate in the accelerator.

Laser-engineered surface structures (Image credit: STFC Daresbury Laboratory)

Removing this electron cloud will expand the range of experiments that the LHC, the world’s largest particle collider, can carry out. Professor Amin Abdolvand, chair of functional materials and photonics at Dundee University, said: “Large particle accelerators such as the Large Hadron Collider suffer from a fundamental limitation known as the ‘electron cloud’.

“This cloud of negative particles under certain conditions may degrade the performance of the primary proton beams that circulate in the accelerator, which is central to its core experiments.

“Current efforts to limit these effects involve applying composite metal or amorphous carbon coatings to the inner surfaces of the LHC vacuum chambers. These are expensive and time consuming processes that are implemented under vacuum.”

Tests have shown that it is possible to reformulate the surface of the metals in the LHC vacuum chambers to a design that under a microscope resembles the type of sound padding seen in music studios. The surface can trap electrons, keeping the chambers clear of the cloud.

The image shows the metal before the laser treatment (top) and afterwards (bottom) where one can see the characteristic pattern that resembles the type of sound padding (Image credit: Dundee University)

Future upgrades of the LHC that will double the intensity of the beams – resulting in a denser electron cloud – as well as studies for future circular high-intensity and high-energy colliders could profit from this technique. The LESS method, which uses lasers to manipulate the surface of metals, could effectively reduce the electron cloud, allowing for more powerful beams.

Professor Lucio Rossi, project leader of the High Luminosity LHC, said: “If successful, this method will allow us to remove fundamental limitations of the LHC and reach the parameters which are needed for the high-luminosity upgrade in an easier and less expensive way. This will boost the experimental programme by increasing the number of collisions in the LHC by a factor over the present machine configuration.”

Michael Benedikt, head of the Future Circular Collider Study at CERN, said: “The LESS solution could be easily integrated in the design of future high-intensity proton accelerators; the method is scalable from small samples to kilometre-long beam lines.”

(*Front page image credit: Joshua Valcarel)

Optimized first energy stage for CLIC at 380 GeV
by Daniel Schulte & Philipp Roloff (CERN)


  The CTF3 test facility at CERN, which has demonstrated CLIC’s novel two-beam acceleration technology (Image credit: Maximilien Brice)

In the post-LHC era, one of CERN’s potential options for the next flagship accelerator is an electron–positron collider at the high-energy frontier: the Compact Linear Collider (CLIC).

In August 2016 the CLIC collaboration, which consists of 75 institutes, published an updated baseline scenario. This scenario starts with a first energy stage at 380 GeV centre-of-mass energy, followed by a second stage at around 1.5 TeV and a final stage at 3 TeV.

Prior to the discovery of the Higgs boson, the CLIC conceptual design report (CDR) focused on the design of the 3 TeV stage and documented the viability of the technology required for this energy; lower-energy stages were considered in much less detail.

With the information obtained from the Higgs discovery, the optimum energy choice for the first stage was also studied. The physics programme has been evaluated, including detailed studies of realistic detector configurations. The choice of 380 GeV would allow detailed measurements of the Higgs boson and the top quark.

To optimize the CLIC accelerator, a systematic design approach has been developed and used to explore a large range of configurations for the RF structures of the main linac. For each structure design, the luminosity performance, power consumption and total cost of the CLIC complex are calculated.
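As a rough illustration of this kind of systematic exploration (and not the collaboration’s actual optimization code; the parameters, ranges and figures of merit below are invented), one can imagine a simple scan over candidate structure designs:

```python
# Toy scan over hypothetical RF-structure configurations: evaluate each
# design's figures of merit and keep the best luminosity-per-cost design.
import itertools

def evaluate(aperture_mm, gradient_mv_per_m, cells):
    """Toy models of the three figures of merit for one configuration."""
    luminosity = 0.1 * aperture_mm + 0.01 * gradient_mv_per_m  # arbitrary
    power_mw = 0.5 * gradient_mv_per_m + 0.02 * cells          # arbitrary
    cost_meur = 2000 + 10 * cells + 5 * gradient_mv_per_m      # arbitrary
    return luminosity, power_mw, cost_meur

best_score, best_design = float("-inf"), None
for a, g, n in itertools.product(
        [2.5, 3.0, 3.5],    # aperture (mm)
        [70, 72, 75, 80],   # accelerating gradient (MV/m)
        [24, 26, 28]):      # cells per structure
    lumi, power, cost = evaluate(a, g, n)
    score = lumi / cost     # one possible ranking: luminosity per unit cost
    if score > best_score:
        best_score, best_design = score, (a, g, n)

print("best design (aperture, gradient, cells):", best_design)
```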

For the first stage, different accelerating structures operating at a somewhat lower accelerating gradient of 72 MV/m will be used to reach the luminosity goal. This design will have a cost and power consumption similar to those of earlier CERN projects, such as the LHC with its injectors, whilst ensuring that the cost of the higher-energy stages is not inflated. The design should also be flexible enough to take advantage of projected improvements in RF technology during the construction and operation of the first stage.

In order to upgrade to higher energies, the structures optimized for 380 GeV will be moved to the beginning of the new linear accelerator and the remaining space filled with structures optimized for 3 TeV operation. The RF pulse length of 244 ns is kept the same at all stages to avoid major modifications to the drive-beam generation scheme.

Data taking at the three energy stages is foreseen to last for periods of seven, five and six years, respectively. The stages are separated by two upgrade periods of two years, meaning that the overall three-stage CLIC programme would last for 22 years from the start of operation. The duration of each stage is derived from integrated luminosity targets of 500 fb⁻¹ at 380 GeV, 1.5 ab⁻¹ at 1.5 TeV and 3 ab⁻¹ at 3 TeV.

Overview of the CLIC layout at 3 TeV, showing combiner rings (CR), delay loop, damping ring (DR), pre-damping ring (PDR), bunch compressor (BC) and beam delivery system (BDS). The red and green squares represent beam dumps. (Image Credit: CLIC collaboration). 

Further improvements are being pursued via an intense R&D programme. For instance, the CLIC study recently proposed a novel design for klystrons that could increase efficiency significantly. In addition, permanent magnets that are tunable enough to replace the normal-conducting magnets are being developed, as they could reduce power consumption even further.

The goal is to develop a detailed design of both the accelerator and detector in time for the update of the European Strategy for Particle Physics towards the end of the decade.

*A version of this article appeared in the November 2016 issue of CERN Courier.

Triplet magnets program progressing on both sides of the Atlantic
by G. Ambrosio, P. Ferracin, E. Todesco (CERN)

The Nb3Sn 150 mm aperture quadrupoles MQXF, to be installed in the inner triplets around ATLAS and CMS in 2024–25, are entering a critical phase: the first two 1.5-m-long models have been manufactured and tested since the beginning of this year.

This magnet development programme, carried out as a joint effort between CERN and US LARP, foresees the construction and testing of five 1.5-m-long models to validate the design and fine-tune the assembly features during 2014–17.

These magnets rely on an aluminium shell and a bladder-and-key structure, allowing easy and fast disassembly and precise tuning of the coil pre-stress. Mechanics is a critical part of the design of these large-aperture quadrupoles, which feature an 11.4 T peak field in the coils (50% larger than the peak field in the LHC dipoles operating at 6.5 TeV).

The first model, MQXFS1, was assembled in the U.S. with two CERN coils and two LARP coils, and was confirmed to fulfil performance requirements in April 2016 (see Figure 1). The performance requirements included a) reaching the ultimate current (8% higher than the nominal current of 16.4 kA), and b) reaching nominal current after a thermal cycle with at most one quench.
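The quoted thresholds are easy to reproduce; the short check below is a sketch, with the acceptance criteria paraphrased from this article rather than taken from an official specification.

```python
# Reference currents quoted in the article.
NOMINAL_KA = 16.4
ULTIMATE_KA = 1.08 * NOMINAL_KA   # 8% above nominal, i.e. ~17.7 kA

def meets_requirements(training_ka, post_cycle_ka):
    """training_ka: quench currents before the thermal cycle (kA);
    post_cycle_ka: quench currents after the thermal cycle (kA)."""
    # a) the magnet must reach ultimate current during training
    reached_ultimate = max(training_ka) >= ULTIMATE_KA
    # b) nominal current must be reached after the thermal cycle,
    #    with at most one quench below nominal
    below_nominal = [q for q in post_cycle_ka if q < NOMINAL_KA]
    return reached_ultimate and len(below_nominal) <= 1

print(f"ultimate current = {ULTIMATE_KA:.1f} kA")  # ~17.7 kA
```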

Figure 1: Training of MQXFS1: quenches (markers), nominal and ultimate current (solid lines) and short sample limit (dotted line). (Credit: HL-LHC WP3 collaboration)

The memory after thermal cycle has outperformed expectations by exceeding ultimate current in the first quench after the thermal cycle. However, training has been slower than expected, reaching nominal current after nine quenches. After this first cycle of testing, the transverse pre-stress in the magnet was increased by 30%, to ensure a better support to the coils.

In October 2016, the second assembly was tested at FNAL, reaching 18.8 kA, which is 15% more than the nominal current and close to 90% of the maximum theoretical performance of the magnet. Some detraining has been observed sporadically, reducing the magnet performance but keeping it always well above the nominal current.

At 4.2 K the magnet shows the ability to reach the same current, thus demonstrating a considerable temperature margin, meaning the magnet should tolerate local heating.

The second model, MQXFS3 (MQXFS2 has been postponed to 2017), was tested at CERN in October 2016, using a novel test station (HFM) planned to be used for the Fresca II dipole.

The magnet reached nominal current with nine quenches, like MQXFS1, but reached a current only 4% above nominal after 20 quenches. Significantly larger detraining than in MQXFS1 was observed, pushing the magnet performance well below nominal (15.0 kA).

Nonetheless, the maximum performance of 17.2 kA was recovered after ramp-rate tests. In addition, the 4.2 K test shows the same performance as reached at 2.1 K, again demonstrating a considerable temperature margin.

Training of MQXFS3: quenches (markers), nominal and ultimate current (solid lines) and short sample limit (dotted line). (Credit: HL-LHC WP3 collaboration)

Work is now focused on understanding the relationship between the quenches and the mechanical structure. As the quenches are mainly located in the coil heads, the longitudinal preload will be increased. Further testing after the thermal cycle is expected by the end of the year, and three additional models are foreseen in 2017.

The programme will run in parallel with the development of the long coils (4.2 m in the US and 7.15 m at CERN) required for the full-size magnets.

“The short model program is a fundamental tool to master the design and construction of superconducting magnets, and it is even more important for a novel technology such as Nb3Sn,” says L. Bottura, leader of the CERN Magnets, Superconductors and Cryostats group. “If needed, we will prolong the short model program to improve our understanding and to reduce the risks in the construction of the prototypes and of the series.”

Interview with Jon Butterworth
by Panos Charitos (CERN)


Prof. Jon Butterworth (University College London) is Head of the UCL Physics Department and a member of the ATLAS Collaboration (Image: Macleans.ca)

Panos Charitos of Accelerating News sat down with Jon Butterworth, Head of the Physics Department at University College London (UCL) and author of the book “Smashing Physics: The Inside Story of the Hunt for the Higgs”, to discuss his work. We covered his involvement in one of the most important physics discoveries, the present landscape in high-energy physics, the plans for future colliders, and the ongoing R&D efforts that inspire technological innovation and could lead to ground-breaking science in the course of this century.

PC: What is your view on the latest results from the LHC and other experiments presented earlier this summer at ICHEP 2016?

JB: From the point of view of the experimentalist, the LHC has done incredible work, offering a significant leap in energy scale. The fact that the 750 GeV bump was not confirmed caused some disappointment, but this doesn’t mean that our search for new physics has come to an end; we have just started scratching the surface.

Perhaps one could compare the situation with the first flight over a newly discovered island, where new physics may lie. We first fly at 30,000 ft., which is what we did in 2015, and then at 10,000 ft., where we may see signs of a new civilization. However, discovering nothing unexpected does not mean that there is no new physics on the ground. We just have to land carefully and explore the territory in detail.

On the one hand, it would have been great to have a breakthrough discovery announced at ICHEP 2016; on the other hand, the fact that the accelerator and detectors are doing so well means that we experimentalists have a lot of work to do.

It seems strange that nothing has appeared yet, but the next discovery may be just around the corner, and there might be something to discover at higher energies. I would like to see the theoretical net cast a little wider. In any case, I am looking forward to the next three years of more data with higher precision.

PC: Do we need a new way of interpreting experimental results given the success of the Standard Model?

JB: Presently we are experiencing a strange situation, because the Standard Model of particle physics — so complete and consistent that every calculation fits new data with remarkable accuracy, not to mention the fantastic success of the Higgs discovery — leaves a number of questions open. It does not explain dark matter, nor what caused the observed matter–antimatter asymmetry; both are fundamental problems that challenge our present understanding of nature.

In other words, the more closely we look at the Standard Model, the more surprised we are at its success. Looking at the latest results, I think that a large part of the motivation for theories postulating new physics tied to electroweak symmetry breaking is becoming slightly less compelling.

So to answer your question, I think that there might be more to it than we thought, and maybe approaching it from a different angle will reveal answers to some of the open questions. Maybe the Standard Model is even more wonderful than it appears.

PC: To what extent should the concept of naturalness continue to inform our research?

JB: We know that at LHC energies special things happen in physics. The force carriers of the weak interaction – the W and Z bosons – have masses in this energy range, and we have discovered a Higgs boson with a mass lying in this energy range.

However, from our theory, the Higgs mass gets lots of big quantum corrections, positive and negative, which cancel each other out in an apparently miraculous way for the Higgs mass to be where we see it. Such an exact cancellation of terms seems too strange to be merely a coincidence of the model. In this context, naturalness is the assumption that the parameters in a theory should be of order unity, and should not have to be fantastically fine-tuned in order to make the theory work.

Supersymmetry tries to answer this question by avoiding the need for fine-tuning. It does so by introducing a new partner for every existing particle, contributing with the opposite sign and thus accounting for the cancellations that we observe. However, though it is conceptually a beautiful theory, there is as yet no experimental evidence to confirm it.

The concept of naturalness boils down to the so-called hierarchy problem and is related to the fact that we have different energy scales: the QCD scale, the electroweak scale and, at very high energies, the Planck scale. The electroweak scale is closely linked to the mass of the Higgs boson, but we still don’t know why the Higgs boson has a mass at this energy scale or how to deal with the quantum corrections predicted by the theory. Theories like supersymmetry are introduced to cancel those corrections and thus make it more natural to have this mass. Usually a lower-than-expected energy scale for the mass of a particle, as in the case of the pion mass, is due to an approximate symmetry. In the case of the electroweak scale, the approximate symmetry would be supersymmetry, which would keep the Higgs mass where we see it.

To conclude, naturalness presents an interesting problem in modern physics, one which becomes very pressing in light of recent LHC data. The motivation for and significance of naturalness in quantum field theory is a hotly contested topic that we need to rethink. It is a concept which I think may evolve, rather than guide, as we get more data from the LHC and other experiments. On a personal note, I think that we have other reasons to believe that the Standard Model is not the whole story, with dark matter being one of the main motivations for future research.

PC: How important is our understanding of gravity for answering some of the open questions?

JB: Presently the best theory we have for the description of gravity is general relativity, which explains the geometry and development of the universe on macroscopic scales. Quantum field theory, in the Standard Model of particle physics, describes the other three fundamental forces and the universe of the very small.

However, at very high energies their spheres of applicability – the very large and the very small – overlap, and the theories conflict. Both cannot be valid, and it seems that we still lack a more profound understanding.

We face a great anomaly, which is the absence of any treatment of gravity on the same footing as the other forces. There is a hierarchy problem in gravity being so ridiculously weak compared to the other forces, while the same applies to the masses of particles like neutrinos, which are extremely small compared to those of other particles. These two apparently unrelated observations may be linked, and they could mark a radical shift in our understanding of nature, as well as lead to rethinking or rephrasing some of the so-called open questions.

PC: How could we decide about the next step in particle physics research?

JB: We need to understand the scale at which new physics may exist. Before committing my scientific career, I would like to know that there is an energy scale after which physics is not the same. In the case of the LHC — although there are still many ongoing searches — we knew that it could answer whether the Standard Model Higgs boson exists. We need a similarly well-posed question about the new leap in energy.

In the meantime, I think it is important to work on R&D to make future high-energy accelerators cost-effective, as well as to diversify our experiments until we find a clue of new physics and think about how we could probe it. I hope that this would be within the reach of a 100 TeV machine, and I would love to work in this direction to explore the physics options presented by such a machine. However, I think we still have more to learn from the LHC, as well as from some precision experiments and from astrophysics.

PC: Do you think that maybe we should also reconsider the speculative character of science?

JB: I never believed that there is a hard divide between exploratory and theoretically driven science. I think any good large-scale project would be based on a mix of the two. We had a huge theoretical motivation with the Higgs at the LHC, but we also pursued, and still pursue, an exploratory aspect. One of my favourite plots is the charged-current and neutral-current cross sections in deep inelastic scattering from HERA. You could see the weak and electromagnetic forces coming together around 100 GeV — that is a real change in high-energy physics that we knew the LHC could probe. This is motivated partly by theory and partly by experiment.

The bigger and longer-term a project is, the stronger its motivation has to be. For a small project you can take a long-shot and come up with a high-reward, high-risk plan. There is, however, a trade-off between doing a large number of these experiments and constructing a large accelerator, since resources, including physicists who can work on such projects, are not infinite. This balance of large and small experiments should be examined case by case given also the long lead times for these projects.

Finally, one should bear in mind that we live in a kind of ecosystem in which it is important to advance our R&D efforts for new technologies. New developments have a strong impact, even when not directly applied to fundamental physics, including the development of new accelerators, high-field magnets and the fast computing needed to process data from future detectors.

PC: Do you think that nowadays there is a strong complementarity between research in HEP and in astrophysics?

JB: I am chair of a department that is home to a very strong astrophysics and cosmology group. I find their combination of theoretical motivation and exploration-driven science very interesting. Much of astronomy is pure exploration — going to Pluto is not about fundamental physics but about investigating the solar system. Of course, studying cosmology and trying to understand dark matter or dark energy and how the Universe evolved is closely linked to the fundamental questions that particle physics tries to answer. Some of our undergraduate students found an exoplanet, and another group found a supernova. I slightly envy them. It might not be a fundamental breakthrough in the theory of supernovae, but they discovered something new that lies out there.

PC: Finally, I would like to discuss your motivation for communicating science and what the personal reward is.

JB: I have always enjoyed writing something other than scientific papers. As a field, being able to explain our work to a non-scientific audience is just as important as publishing in peer-reviewed journals, in my opinion – though not everyone has to do both! We live in a complex society and people often cannot differentiate between fiction and fact. As our lives are heavily based on science and technology, we need scientists to engage with society and discuss their work with people. Not to mention that it can be great fun as well.

