CERN Accelerating science

Interview with Jon Butterworth
by Panos Charitos (CERN)


Prof Jon Butterworth (University College London) is Head of the UCL Physics Department and a member of the ATLAS Collaboration (Image: Macleans.ca)

Panos Charitos of Accelerating News sat down with Jon Butterworth, Head of the Physics Department at University College London (UCL) and author of the book “Smashing Physics: The Inside Story of the Hunt for the Higgs”, to discuss his work. We covered his involvement in one of the most important physics discoveries, the present landscape in high-energy physics, the plans for future colliders, and the ongoing R&D efforts that inspire technological innovation and could lead to ground-breaking science in the course of this century.

PC: What is your view on the latest results from the LHC and other experiments presented earlier this summer at ICHEP 2016?

JB: From the point of view of an experimentalist, the LHC has done incredible work, offering a significant leap in energy scale. The fact that the 750 GeV bump was not confirmed caused some disappointment, but this doesn’t mean that our search for new physics has come to an end; we have only just started scratching the surface.

Perhaps one could compare the situation with the first flight over a newly discovered island, where new physics may lie. We first fly at 30,000 ft., which is what we did in 2015, and then at 10,000 ft., where we may see signs of a new civilization. However, discovering nothing unexpected does not mean that there is no new physics on the ground. We just have to land carefully and explore the territory in detail.

On the one hand, it would be great to have a breakthrough discovery announced at ICHEP 2016, but on the other hand, the fact that the accelerator and detectors are doing so well means that we experimentalists have a lot of work to do.

It seems strange that nothing has appeared yet, but the next discovery may be just around the corner and there might be something to discover at higher energies. I would like to see the theoretical net cast a little wider. In any case, I am looking forward to the next three years of more data with higher precision.

PC: Do we need a new way of interpreting experimental results given the success of the Standard Model?

JB: Presently we are in a strange situation, because the Standard Model of particle physics, so complete and consistent that every calculation fits new data with remarkable accuracy (not to mention the fantastic success of the Higgs discovery), leaves a number of questions open. It does not explain dark matter, nor what caused the observed matter–antimatter asymmetry; both are fundamental problems that challenge our present understanding of nature.

In other words, the more closely we look at the Standard Model, the more surprised we are at its success. Looking at the latest results, I think that a large part of the motivation for theories postulating new physics tied to electroweak symmetry breaking is becoming slightly less attractive.

So to answer your question, I think that there might be more to it than we thought, and maybe approaching it from a different angle will reveal answers to some of the open questions. Maybe the Standard Model is even more wonderful than it appears.

PC: To what extent should the concept of naturalness continue to inform our research?

JB: We know that at LHC energies special things happen in physics. The force carriers of the weak interaction – the W and Z bosons – have masses in this energy range, and we have discovered a Higgs boson with a mass lying in this range as well.

However, in our theory the Higgs mass receives many large quantum corrections, positive and negative, which cancel each other out in an apparently miraculous way for the Higgs mass to be where we see it. Such an exact cancellation of terms seems too strange to be merely a coincidence of the model. In this context, naturalness is the assumption that the parameters in a theory should be of order unity, and should not have to be fantastically fine-tuned in order to make the theory work.
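Schematically, the cancellation Butterworth describes can be written as follows (a standard textbook illustration, not a formula from the interview; here y_t is the top Yukawa coupling and Λ is the cutoff scale up to which the Standard Model is assumed valid):

```latex
m_H^2 \;=\; m_{H,\text{bare}}^2 + \delta m_H^2,
\qquad
\delta m_H^2 \;\sim\; -\frac{3\, y_t^2}{8\pi^2}\,\Lambda^2 .
```

For Λ anywhere near the Planck scale, the bare term must cancel the correction to dozens of decimal places to leave the observed Higgs mass near 125 GeV; this is the fine-tuning that naturalness arguments object to.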

Supersymmetry tries to answer this question by avoiding fine-tuning. It does so by introducing a new particle for every existing one, whose contributions enter with the opposite sign, thus accounting for the cancellations that we observe. However, though it is conceptually a beautiful theory, there is as yet no experimental evidence to confirm it.

The concept of naturalness boils down to the so-called hierarchy problem and is related to the fact that we have several different energy scales: the QCD scale, the electroweak scale and the Planck scale at very high energies. The electroweak scale is closely linked to the mass of the Higgs boson, but we still don’t know why the Higgs boson has a mass at this energy scale or how to deal with the quantum corrections predicted by the theory. Theories like supersymmetry are introduced to cancel those corrections and thus make it more natural to have this mass. Usually a lower-than-expected energy scale for the mass of a particle, as in the case of the pion mass, is due to an approximate symmetry. In the case of the electroweak scale, the approximate symmetry would be supersymmetry, which would keep the Higgs mass where we see it without fine-tuning.

To conclude, naturalness presents an interesting problem in modern physics, one which becomes very pressing in light of recent LHC data. The motivation for and significance of naturalness in quantum field theory is a hotly contested topic that we need to rethink: a concept that I think may evolve, rather than guide us, as we get more data from the LHC and other experiments. On a personal note, I think that we have other reasons to believe that the Standard Model is not the whole story, with dark matter being one of the main motivations for future research.

PC: How important is our understanding of gravity for answering some of the open questions?

JB: Presently the best theory we have for the description of gravity is general relativity, which explains the geometry and evolution of the universe on macroscopic scales. Quantum field theory, in the Standard Model of particle physics, describes the other three fundamental forces and the universe of the very small.

However, at very high energies their spheres of applicability - the very large and the very small - overlap, and the theories conflict. They cannot both be valid, and it seems that we still lack a more profound understanding.

We face a great anomaly, which is the absence of any treatment of gravity on the same footing as the other forces. There is a hierarchy problem in gravity being so ridiculously weak compared to the other forces, while something similar applies to the masses of particles like neutrinos, which are extremely small compared to those of other particles. These two apparently unrelated observations may be linked, and could mark a radical shift in our understanding of nature, as well as lead us to rethink or rephrase some of the so-called open questions.

PC: How could we decide about the next step in particle physics research?

JB: We need to understand the scale at which new physics may exist. Before committing my scientific career, I would like to know that there is an energy scale beyond which physics is not the same. In the case of the LHC – although there are still many ongoing searches – we knew that it could answer whether the Standard Model Higgs boson exists. We need a similarly well-posed question for the next leap in energy.

In the meantime, I think it is important to work on R&D to make future high-energy accelerators cost-effective, as well as to diversify our experiments until we find a clue to new physics and think about how we could probe it. I hope that this would be within the reach of a 100 TeV machine, and I would love to work in this direction to explore the physics opportunities presented by such a machine. However, I think we still have much to learn from the LHC, as well as from precision experiments and astrophysics.

PC: Do you think that maybe we should also reconsider the speculative character of science?

JB: I have never believed that there is a hard divide between exploratory and theoretically driven science. I think any good large-scale project is based on a mix of the two. We had a huge theoretical motivation with the Higgs at the LHC, but we also pursued, and still pursue, an exploratory programme. One of my favourite plots shows the charged-current and neutral-current cross sections in deep inelastic scattering from HERA. You can see the weak and electromagnetic forces coming together around 100 GeV – a real change in high-energy physics that we knew the LHC could probe. This is motivated partly by theory and partly by experiment.

The bigger and longer-term a project is, the stronger its motivation has to be. For a small project you can take a long shot and come up with a high-risk, high-reward plan. There is, however, a trade-off between doing a large number of these experiments and constructing a large accelerator, since resources, including the physicists who can work on such projects, are not infinite. This balance of large and small experiments should be examined case by case, given also the long lead times of these projects.

Finally, one should bear in mind that we live in a kind of ecosystem in which it is important to advance our R&D efforts on new technologies. New developments have a strong impact even when not directly applied to fundamental physics, including the development of new accelerators, high-field magnets and the fast computing needed to process data from future detectors.

PC: Do you think that nowadays there is a strong complementarity between research in HEP and in astrophysics?

JB: I am chair of a department that is home to a very strong astrophysics and cosmology group. I find their combination of theoretical motivation and exploration-driven science very interesting. Much of astronomy is pure exploration – going to Pluto is not about fundamental physics but about investigating the solar system. Of course, studying cosmology and trying to understand dark matter or dark energy and how the Universe evolved is closely linked to the fundamental questions that particle physics tries to answer. Some of our undergraduate students found an exoplanet, and another group found a supernova. I slightly envy them. It might not be a fundamental breakthrough in the theory of supernovae, but they discovered something new that lies out there.

PC: Finally, I would like to discuss your motivation for communicating science and the personal reward it brings.

JB: I have always enjoyed writing something other than scientific papers. As a field, being able to explain our work to a non-scientific audience is, in my opinion, just as important as publishing in peer-reviewed journals – though not everyone has to do both! We live in a complex society and people often cannot differentiate between fiction and fact. As our lives are heavily based on science and technology, we need scientists to engage with society and discuss their work with the public. Not to mention that it can be great fun as well.

 

Carlos Moedas on the importance of SESAME as a model for science diplomacy
by Livia Lapadatescu (CERN)

EU Commissioner for Research, Science and Innovation, Carlos Moedas, during his visit to SESAME in Jordan, April 2015. (Image credit: 2015-2016 CERN) 

At the 28th SESAME Council held in May 2016 in the premises of the European Commission, the EU Commissioner for Research, Science and Innovation, Carlos Moedas, gave an introductory talk on SESAME as an example of cooperation in the Middle East through science diplomacy.

SESAME, as a model of scientific cooperation in the Middle East, is part of the European Union's priority to ensure that European research and innovation are “Open to the World”, and was an inspiration for the book Open Innovation, Open Science, Open to the World – a vision for Europe, published by DG Research and Innovation in May 2016. Science diplomacy has been one of Commissioner Moedas' priorities, and three science diplomacy pillars have been set up: (i) building bridges and improving international relations; (ii) addressing global challenges through sound scientific advice; (iii) embracing globalisation through enhanced STI cooperation. Carlos Moedas cited the FP7 CESSAMag project as an example and a trigger: his visit to CERN and the CESSAMag laboratory in January 2015 marked the beginning of the first pillar, with SESAME bridging divides in the Middle East.

In the framework of this first pillar, historic agreements were signed associating countries such as Ukraine and Tunisia (2015) and Armenia and Georgia (2016) to Horizon 2020. Another example of a bridge-building activity is the PRIMA initiative (Partnership for Research and Innovation in the Mediterranean Area), a cooperation in the Mediterranean region that brings neighbours at odds together to work out how to ensure the sustainable provision of vital resources such as water and food.

In the context of the second pillar, a high-level group of seven scientific experts for scientific advice on specific policy issues in Europe was set up. In addition, a Science4Refugees programme was launched to help refugees with a science background find suitable jobs in universities and research institutions in the EU.

With respect to the third pillar, progress has been made towards the creation of a Global Research Area, building on the development of a Common Research Area for the EU, Latin America and the Caribbean. This is manifested by the decision of the 28 EU member states to make scientific papers freely available by 2020, and by the setting up of co-funding mechanisms with China and Mexico.

To conclude, the EC Commissioner informed participants that €2M had been earmarked for SESAME in the 2016-2017 Horizon 2020 Work Programme, and stressed that he has become emotionally involved in this project and would continue to be the SESAME Ambassador and an advocate of scientific cooperation in the Middle East through SESAME.

A revolutionary mini-accelerator
by Panos Charitos (CERN)



A glimpse into the accelerator structures of the world’s smallest accelerator (Credit: CERN)

CERN is the home of the 27-kilometre Large Hadron Collider (LHC), which searches for new discoveries by colliding protons at extraordinarily high energies. The unprecedented energy levels led to the discovery of the Higgs boson, the last missing piece in the Standard Model, and now open a new chapter in fundamental physics. The development of such complex machines rests on the advancement of novel technologies and invaluable know-how, which can be capitalised on in fields outside particle physics.

Sometimes working on the largest accelerators gives ideas on how to build the smallest ones; the construction of the world’s smallest Radio Frequency Quadrupole (RFQ) for proton acceleration, completed in September, provides one of the most successful examples. This miniature machine is a linear accelerator (linac) consisting of four sections, each only 130 mm in diameter, operating at a frequency of 750 MHz, for a total length of 2 metres. It can accelerate low-intensity proton beams of a few hundred μA up to an energy of 5 MeV.
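As a rough consistency check (a sketch using only the figures quoted above, not any CERN specification), the RFQ's implied average accelerating gradient follows directly from its length and output energy:

```python
# Average accelerating gradient implied by the mini-RFQ figures
# quoted in the article: 2 m total length, 5 MeV output energy.

length_m = 2.0     # total RFQ length (m)
energy_mev = 5.0   # proton kinetic energy at the output (MeV)

gradient = energy_mev / length_m  # average energy gain per metre
print(f"Average gradient: {gradient:.1f} MeV/m")
```

This is consistent with the 2.5 MeV per metre figure quoted further down for the high-frequency design.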

It should be noted that the mini RFQ cannot be used for the large colliders needed for fundamental research, since it cannot achieve high peak currents. Its small size and low current are, however, exactly what make this design ideal for a wide range of medical and industrial applications.

Maurizio Vretenar (CERN), head of the LINAC4 project and coordinator of the design and construction of the mini accelerator, said: “The challenge to develop this miniature accelerator came from a spin-off company that aims to take advantage of the knowledge and infrastructure of CERN in building new accelerators. The main idea was that a mini-RFQ makes a much more efficient injector for a compact proton-therapy linac than a cyclotron does. The linac-based facility under development will permit a more precise 3D scanning of tumours than is possible with other proton therapy machines or conventional radiotherapy.”

Vretenar explained: “Reaching high frequencies is particularly challenging, but it is the only way to build compact accelerators. For proton linacs at CERN, we started with the 200 MHz LINAC2 at the end of the 1970s, and since then we have almost doubled the frequency to 350 MHz for the recently commissioned LINAC4. With the new LINAC4 we will be able to double the beam intensity in the LHC injectors, thus significantly contributing to an increase of the LHC luminosity.” He continued: “The idea of constructing a smaller accelerator that could produce low-intensity beams for medical purposes has been a long-standing technological challenge. It dates back to the 1990s, when it seemed almost impossible to build such a small RFQ.”

The rich experience that the CERN team gained from the design and development of LINAC4 made a new miniature RFQ seem more plausible. The main challenge was to double the operating frequency, resulting in more accelerating cells and a shorter length, but at the same time leading to a very demanding beam-optics design and RF resonator. With the high-frequency RFQ, the team more than doubled the accelerating gradient (2.5 MeV/metre instead of 1 MeV/metre for the LINAC4 RFQ) and halved the construction cost per metre.

The way to higher frequencies was opened by a new beam-dynamics approach developed by Alessandra Lombardi, who now follows the testing and commissioning of the RFQ at ADAM’s premises. The next challenges were the tuning of RFQs that are long with respect to the wavelength, and the machining and brazing of RFQ parts of unprecedentedly small size.

The design and construction of the RFQ relied on a sophisticated mechanical approach defined by Serge Mathot and on a detailed definition of the resonator properties and tuning strategy by Alexej Grudiev (BE).

Thanks to the collaborative spirit and passionate work of the CERN people involved in this project, the team recently completed the brand-new mini accelerator. The four modules that make up the final accelerator were constructed entirely in CERN's workshops, in less than two years, by a small but enthusiastic team. The fact that what they were building could help treat thousands of patients gave everyone extra motivation. As Serge Mathot explains: “The construction was a very delicate procedure, given the need for high precision and the geometry of each module. Thanks to the experience and skills we gained from our previous work on the cavities for LINAC4, we successfully met the challenges of this project.”

Serge Mathot in front of one of the four modules (Credit: CERN)

The technological breakthrough achieved by the team behind the mini-accelerator has attracted interest from industry, in the first instance from A.D.A.M. SA, which stands for Applications of Detectors and Accelerators to Medicine, a Geneva-based spin-off company from CERN, and from its parent company Advanced Oncotherapy in the United Kingdom. "Behind every innovative aspect of this accelerator, there is unique CERN intellectual property and know-how", says David Mazur from CERN's Knowledge Transfer Group, "and we have concluded a license agreement with A.D.A.M. SA which enables them to commercialize such accelerators in the field of proton therapy, based on our IP".

The mini accelerator was delivered to the ADAM test facility last September and is presently being commissioned. It is more modular, more compact and cheaper than its “big brothers”. Its small size and light weight mean that the mini-RFQ could become the key element of proton therapy systems but also of systems able to produce radioactive isotopes on-site in hospitals.


The mini accelerator (RFQ) installed in the ADAM test stand (Credit: ADAM)

The team that developed the mini-RFQ foresees many other potential applications. Medical ones include the acceleration of alpha particles for advanced radiotherapy techniques that may be the new frontier in the treatment of cancer; industrial ones include analysing the quality of surfaces or tracing aerosol pollution.

Also, the small size of the new accelerator means that it can be easily transported, which would be particularly useful for the surface analysis of archaeological materials or artworks presently exhibited in museums around the world, using the proton-induced X-ray emission (PIXE) analytical technique. Indeed, this new generation of mini accelerators has great potential and could find numerous applications in many fields. The mini-RFQ offers another example of the societal benefits stemming from fundamental research.

Higher energies for ISOLDE's radioactive ion beams
By Athena Papageorgiou Koufidou (CERN)


HIE-ISOLDE cryomodule with five copper RF cavities and one solenoid magnet assembled at the SM18 clean room. (Image: Maximilien Brice, CERN​)

On 28 September, the members of the ISOLDE collaboration and major stakeholders came together in a well-deserved celebration. The first phase of the facility’s high energy and intensity upgrade (HIE-ISOLDE) is now complete and a promising future is in sight as experiments started on 9 September.

ISOLDE is the oldest facility still in operation at CERN and one of the most successful. It currently occupies a leading position in the field of radioactive ion research, producing the largest range of isotopes worldwide (over 1300 isotopes of more than 70 elements), which are used in multiple fields of physics: nuclear and atomic physics, astrophysics and fundamental interactions. A key element of ISOLDE’s success is the wealth of technical expertise it has accumulated over the decades, especially in the construction of target‑ion source units. The secret to the facility’s longevity, however, is its vibrant international collaboration and its ability to adapt to the changing physics landscape.

An impressive team is behind HIE-ISOLDE, comprising leading physicists, engineers and other experts in accelerator and beam technologies. Another essential ingredient of the workforce is the group of early-stage Marie Curie researchers, who acquire valuable skills in advanced accelerator technologies, reflecting ISOLDE's commitment to training the next generation of experts.

Taking beam energy and intensity to new heights

The production of radioactive ion beams at ISOLDE begins when a high‑energy proton beam from the PS Booster hits the facility’s target, resulting in a wide variety of reaction products. These are ionised in a surface, plasma or laser ion source and separated according to mass, producing the beam of the preferred element. An RFQ cooler and buncher lowers the temperature of the radioactive beam, thus significantly reducing emittances and energy spreads. The beam is then delivered to the low-energy experimental stations or charge‑bred and post‑accelerated at the REX accelerator.

The energy upgrade of the facility entails the construction of a superconducting linear accelerator (HIE-linac) to increase the energy of radioactive ion beams, a high energy beam transfer line to bring the beam to the experiments, as well as new beam diagnostic tools. The intensity upgrade aims to improve the target and ion source, the mass separators and charge breeder.

HIE-linac takes advantage of many cutting‑edge cryogenics and radiofrequency technologies that were originally developed for the LHC. It is equipped with superconducting radiofrequency cavities made of copper coated with niobium and operating at 101.29 MHz. They are cooled by liquid helium at 4.5 K in ultra‑high vacuum conditions. In the first phase of the energy upgrade, two high‑beta cryomodules, each containing five cavities and one superconducting solenoid magnet, were coupled to REX-linac and commissioned, thus increasing the energy to 5.5 MeV per nucleon. Two more cryomodules with the same configuration will be added in the second phase, allowing beams to be accelerated to 10 MeV per nucleon; one is currently in the SM18 clean room, awaiting installation in 2017, and the other is scheduled to be assembled and installed in 2018. In the third and final phase, two low-beta cryomodules, containing six cavities and two solenoids each, will be manufactured and installed to replace the 7-gap and 9-gap normal conducting structures of REX, allowing beams to be decelerated to 0.3 MeV per nucleon.
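To put the quoted energies per nucleon in perspective, here is a minimal sketch of the corresponding beam velocities, using the standard relativistic relation (the 931.494 MeV rest energy per nucleon is the usual atomic-mass-unit value, an assumption not stated in the article):

```python
import math

MU_C2_MEV = 931.494  # rest energy per nucleon (atomic mass unit), MeV

def beta(t_mev_per_u: float) -> float:
    """Beam velocity v/c for a kinetic energy T per nucleon (MeV/u)."""
    gamma = 1.0 + t_mev_per_u / MU_C2_MEV  # Lorentz factor
    return math.sqrt(1.0 - 1.0 / gamma**2)

# The three energies quoted for the HIE-ISOLDE phases
for t in (0.3, 5.5, 10.0):
    print(f"{t:5.1f} MeV/u -> v/c = {beta(t):.3f}")
```

Even at 10 MeV per nucleon the beam moves at only about 15% of the speed of light, which is the modest velocity range the low-beta and high-beta cavity families are designed around.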

The tunnel at HIE-ISOLDE now contains two cryomodules – a unique set-up that marks the end of phase one of the HIE-ISOLDE installation. By spring 2018 the project will have four cryomodules installed and will be able to reach energies of up to 10 MeV/u. (Image credit: Erwin Siesling/CERN)

After post-acceleration in HIE-linac, radioactive ions enter the high‑energy beam transfer line (HEBT), which is specially designed to preserve emittances. Then, the beam is delivered to the different experimental stations through one of two beam lines that have been in operation since 2015. A third one will be installed in early 2017.

The PS Booster upgrade and the operation of Linac 4 after LS2 are expected to increase the primary proton beam intensity at ISOLDE to 6.7 μA, allowing more exotic isotopes to be produced and more precise measurements to be obtained. However, the new experimental conditions create a set of challenges that necessitate ISOLDE’s intensity upgrade. Higher radiation levels limit the lifetime of the target, so options for new target materials, with a focus on radiation resistance, are being explored, while materials presently in use undergo extensive radiation tests. The laser ion source (RILIS) has also been upgraded, improving selectivity and introducing new ionisation schemes. Finally, the improvement of the mass separators will reduce isobaric contamination.

HIE-ISOLDE is currently the only next-generation radioactive beam facility available in Europe (SPIRAL-2 and SPES are still under construction) and the most advanced isotope separation on-line (ISOL) facility in the world.

New physics opportunities

HIE-ISOLDE creates a wealth of opportunities for research in many aspects of nuclear physics, astrophysics, as well as solid state physics, because it can produce a wide variety of exotic nuclei at different energies. The upgrade was welcomed by the international nuclear physics community and is in line with the recommendations of the Nuclear Physics European Collaboration Committee. Over thirty experiments have already been approved and are now at the preparation stage.

Nuclear physics

Scientists have been studying the atomic nucleus for more than 100 years, starting with Ernest Rutherford in 1911, yet many open questions remain: What is the nature of nucleonic matter? What happens if we change the energy, momentum, or temperature of the nucleus? Studying radioactive ion beams allows researchers to dig deeper into these questions, as radioactive nuclei often behave differently than stable ones and can reveal aspects of nuclear behaviour that their stable counterparts cannot. Accelerating these exotic nuclei to higher energies provides new physics possibilities, matching the innovative theoretical developments of the field. Many of the approved experiments plan to use Coulomb excitation, including studies of the physics of super-heavy nuclei, which could reveal the next magic numbers in very heavy systems. Other experiments will investigate transfer reactions, which may allow physicists to unravel the evolution of the structure of the nucleus’s energy levels, also known as its ‘shell structure’.

Nuclear astrophysics

HIE-ISOLDE also paves the way for advances in nuclear astrophysics, a field that explores the abundance of chemical elements in the Universe. Hydrogen and helium, which were produced seconds after the Big Bang, comprise 74% and 24% of ordinary matter in the Universe, while most other elements were created inside stars much later. Astrophysicists have extensively studied how elements up to the iron region are produced, but the processes by which nuclear reactions produced elements with a higher atomic number remain largely a mystery.

Although we know that these heavy elements were created by stellar explosions and nuclear processes in stars, matching specific events to the observed distribution patterns poses a considerable challenge. The higher intensity, reduced emittance and possibility of beam deceleration at HIE-ISOLDE will enable astrophysics experiments to shed light on this problem. Some research teams plan to investigate neutron-rich nuclei that form in the crust of neutron stars, while others will study the proton-capture process that occurs during X-ray bursts or explosions of white dwarfs, research the production of chemical elements in the collapsed core of supernovae and address the problem of lithium-7 abundance in the Universe.

Solid state physics

The solid state programme at ISOLDE encompasses materials science, biophysics and biochemistry, complementing nuclear physics research. It will greatly profit from the high-purity, high-intensity ion beams of HIE-ISOLDE, as well as from the modernisation of the facility. Such research can have considerable social benefits too, because it yields a wide range of applications, from nanomaterials and superconductors to advances in cancer diagnosis and therapy.

A flying start for HIE-ISOLDE

On 9 September, the first exotic beam at HIE-ISOLDE marked the start of operations for the new facility. The experiment investigated charge states of tin isotopes, using transfer reactions and Coulomb excitation of a 110Sn26+ beam, post‑accelerated to 4.5 MeV per nucleon. Besides demonstrating the experimental capabilities of the upgraded facility, this successful first run validated the technical choices of the HIE‑ISOLDE team and provided a fitting reward for eight years of rigorous R&D efforts.

Almost half a century after the first ion beams bombarded the ISOLDE target, the facility is thriving and, thanks to the energy and intensity upgrade, continues to create new opportunities for radioactive ion research. The upgrade team and the users are now looking forward to an exciting, intense period.

 

From biomedical applications to nuclear astrophysics, physicists at CERN’s nuclear physics facility, ISOLDE, are probing the structure of matter. To stay at the cutting edge of technology and science, further development was needed. Now, eight years after the start of the HIE-ISOLDE project, a new accelerator is in place, taking nuclear physics at CERN to higher energies. With physicists setting their sights on even higher energies of 10 MeV per nucleon, at four times the intensity, more HIE-ISOLDE accelerating cavities and beamlines will be commissioned in the years to come.

You can find more information about ISOLDE here.

 
 
Speaking at the opening ceremony of the EuroScience Open Forum 2016, Carlos Moedas, the European Commissioner for Research, Science and Innovation, offered his vision of science and of the Commission's role in policies and programmes. In his inspiring talk he explained the principles that should guide research and emphasised the need for scientists to engage more with citizens in solving research challenges.
 
In the new “Republic of Letters”, science plays a key role in advancing our fundamental knowledge of nature, but also in answering some of the most pressing problems that Europe faces in the 21st century. However, to realise this new “Republic of Letters”, scientists have to regain the trust of citizens and engage them throughout their research. To this end, Commissioner Moedas emphasised the need for open access, with open data deemed the “default mode” for the next set of Horizon 2020 calls.
 
By continuing to invest in research, Europe can strengthen its place in the global research area, continue to lead innovation and secure its global competitiveness. The development of science and technology is necessary for sustainable development and guarantees a brighter future for our societies. In an inspiring speech, given at the EuroScience Open Forum conference in July 2016, Carlos Moedas described the rise of the new Republic of Letters:
 
“If you were a European intellectual during the Enlightenment, the chances are you were a citizen of the Republic of Letters, a community of scholars and literary figures that included the likes of Benjamin Franklin, Goethe and Voltaire.
 
In Voltaire's correspondence alone, there were nearly 19,000 letters. Voltaire wrote most often to his contemporaries in France, but he also wrote to many others in Germany, Italy, Russia and Switzerland. Across Europe, as universities began publishing academic journals, as royal societies provided patronage to the natural sciences, and as new ideas spread from the salons of the nobles to the coffee houses of the bourgeois, the blueprint for modern science was formed.
 
Within the Republic of Letters, natural philosophers shared and critiqued each other's ideas. They sent articles and pamphlets to one another and worked towards the expansion of their community, by introducing each other and increasing their networks of correspondence. This was a community that transcended national borders, that experimented and debated across disciplines, and that pursued progress and societal advancement by means of rationalism.
 
But, though open-minded and meritocratic for the times, the Republic of Letters was a small and privileged community that few people had the means to access. The public was excited by the scientific discoveries of the age, but could play no active role in the process. The Republic of Letters was open science for the few.
 
By the 19th century, the abundance of new areas of scientific exploration required an overall term for 'men of science' and the word 'scientist' emerged. The industrial revolution and urbanisation had brought science into the public consciousness. National governments were funding science. School children were mastering the rudiments of physics, chemistry and biology in schools and books on science became bestsellers among increasingly literate populations.
 
Science was now discussed in the laboratory and the lecture hall. Science had succeeded in reaching the professional classes, who could marvel at great exhibitions in their leisure time. So the 19th century enabled more people to take part in science, but, for the most part, science was still closed to ordinary people.
 
The 20th century was about nations. Individual nations conquered Everest, achieved space flight and navigated to the poles. Science was defined by one nation's sprint to the finish line after the other, and scientific institutions and their funding were organised accordingly. Science was a matter of national pride and national security. More people were attending university than ever before and broadcasting had brought science into people's living rooms. But still, the public remained an audience to be instructed, rather than an active participant in the scientific debate.
 
In the 21st century, science can no longer be distant to the public. It requires public support to succeed. I think of it in terms of a triangle between the public, scientists and data, with the public firmly at the centre.
 
It is my view that we are entering a new era of global and open science. This will return us to some of the founding principles of science. So the 21st century is not about one nation's sprint to the finish line.
 
As I said, in the 18th century the Republic of Letters was open science for the few. The 21st century will become the Republic of Letters for the many. Rather than being an elite activity, concentrated in a few countries in Europe, 21st century science will involve tens of thousands of scientists working collaboratively across the globe.
 
Equally as important, the relationship with the general public will define science. Because, unlike in the past, each of us now commands more information in our pockets than any scientist could ever read in their lifetime. This information overload requires public trust in scientists to determine fact from fiction. Trust that will be built on the integrity and objectivity of scientists, and that will depend on good communication. Therefore, the persistent historical division between the "intellectual" and the "non-intellectual", which I described earlier, is one that every scientist and every politician should be worried about.
 
Though globalisation provides the international integration that makes it possible for countries to work together on global challenges, such as climate change and migration, in its current form it has fallen short of benefitting the majority of people.
 
A scientist can explain how renewable energy can help to combat climate change, but how does that help someone who cannot afford to heat their home? A politician can explain the net benefits of migration, but how does that help someone who cannot get a doctor's appointment?
 
The current lack of public and political engagement in fact-based decision-making even has people asking, have we entered a "post-factual" era of democracy? One in which the public identifies with populist rhetoric and decisions are made based on fears and assumptions, because people feel science and politics have left them behind.
 
So what do we do about this? How do we build trust? How can we be clear and transparent? How do we ensure progress in this triangle of the public, scientists and data? I believe many of the answers lie in open science. Open access to data needs trust and transparency. Public acceptance requires research integrity and citizen science brings scientists closer to people.
 
Let's start with open access to data and research integrity.
 
The future of our knowledge economy will rely on public access to data, so that 1) the European public can take part in the scientific debate and 2) the public can directly access scientific evidence on the issues they care about.
 
You have to show how data can change lives. Recently in San Francisco, a deep learning system trained on medical data detected cancer in more cases than cancer experts did.
 
But with greater availability of scientific data, comes the need to ensure the integrity of what is being shared. The public needs to know that research results are not falsified, fabricated or plagiarised.
 
This is why we're putting more focus on research integrity in Horizon 2020 model grant agreements. And today, I can announce that the grant agreements for Horizon 2020 have been updated. They will include clearer rules on research integrity, making sure that all researchers and research institutions know their obligations.
 
Citizen science.
 
We also need to find ways for the European public to take part in the processes behind scientific discovery 1) to help decide the priorities for public research funding and 2) so the European scientific community can crowdsource solutions with the volume and diversity to provide new insights.
 
Take, for example, the potential of gaming to help scientists multiply the number of brains working on a single problem at any given time.
 
Five years ago, gamers famously resolved the structure of an enzyme that causes an Aids-like disease in monkeys. Scientists had been working on the problem for over a decade. By using an online puzzle game, gamers solved the structure in just three weeks.
 
So, to ensure Europe leads the way on open science, I can announce that, from today, the Commission has made open data the default for all Horizon 2020 projects. Moreover, we have now approved the next set of calls under Horizon 2020. Fifty calls, worth around 8.5 billion euro in 2017, in areas ranging from food security, to smart cities, to understanding migration.
 
For all projects funded by these calls, we will expect the data generated to be open access.
 
In addition, I am currently working with colleagues in the Commission on our proposed revisions to EU copyright law. The aim is to introduce a research exception in copyright that will apply across all Member States, and which will provide a predictable legal framework for Text and Data Mining.
 
The trends towards open science and open data are not something we can stop, so we should lead change, rather than adapt to it later.

Of course, talking about Horizon 2020 here in the UK, I know that there is a great deal of uncertainty about what the future holds. I have heard concerns about British organisations being dropped from EU projects. There are concerns about staff from other EU member states still being able to work in British research institutions.

I wish I could give you all the answers, but for now I can make two clear statements: First, for as long as the UK is a member of the European Union, EU law continues to apply and the UK retains all rights and obligations of a Member State. This of course includes the full eligibility for funding under Horizon 2020. Second, Horizon 2020 projects will continue to be evaluated based on merit and not on nationality. So I urge the European scientific community to continue to choose their project partners on the basis of excellence.

I would like to conclude with this message: By continuing to allow the gap between public perception and scientific ambition to increase, we risk, at best, apathy and, at worst, complete distrust at a crucial juncture.
 
Europe should not only be part of a Global Research Area that embraces open science, we should lead the way to this new Global Research Area.
 
Following the agreement by EU science ministers in May, Europe is the first region of the world to make open access the norm for all scientific publications, and now the largest research funding programme in the world to introduce open data as a default for all projects.
 
So let's create a new Republic of Letters: one that is inclusive, one that values its people as much as progress and one that restores trust and confidence in science.”

Antimatter research boosted by EU funding
by Alexandra Welsch

In 1928, British physicist Paul Dirac wrote down an equation that combined quantum theory and special relativity to describe the behaviour of an electron moving at a relativistic speed. The equation posed a problem, as it could have two solutions: one for an electron with positive energy, and one for an electron with negative energy.

Today we know that for every particle that exists, so does a corresponding antiparticle, exactly matching the particle but with opposite charge. When matter and antimatter come into contact, they annihilate each other – disappearing in a flash of energy. In theory, the big bang should have created equal amounts of matter and antimatter. So why is there far more matter than antimatter in the universe?

 

The AVA project logo

This question, along with a number of other equally fundamental questions about the laws of nature, is being addressed at CERN’s unique Antiproton Decelerator facility. To support efforts to answer these questions, the Accelerators Validating Antimatter (AVA) project was created and has been selected for funding by the European Union.

Professor Carsten Welsch, Head of the University of Liverpool’s Department of Physics, who is based at the Cockcroft Institute and coordinates AVA, said: “Antimatter experiments are at the cutting edge of science, but they are very difficult to realize. This year the new Extra Low Energy Antiproton ring (ELENA) is being commissioned at CERN and will be a critical upgrade to the unique AD facility. In addition, there are also exciting long-term prospects through opportunities a future low energy antimatter facility might provide as part of the FAIR research centre in Germany.”


A gas jet experiment. (Image courtesy of University of Liverpool/Cockcroft Institute)

AVA is an Innovative Training Network within the H2020 Marie Skłodowska-Curie actions which will enable an interdisciplinary and cross-sector programme of antimatter research. The network brings together much of the European expertise in this research area, joining five universities, eight national and international research centres, and 13 industry partners.

Within AVA, the project partners will carry out research across three scientific work packages. These cover facility design and optimization, advanced beam diagnostics, and novel low energy antimatter experiments. A total of 15 Fellows will be recruited and will become part of larger scientific teams. A structured combination of local and network-wide training will also be offered within AVA. This includes hands-on training at accelerator facilities, as well as an international training programme consisting of schools, topical workshops and conferences that will be open to all Fellows, as well as to the wider scientific community.

The network will recruit its Fellows to start in spring/summer 2017. The deadline for applications is 31 January 2017. More information about the project can be found on the project home page: http://www.ava-project.eu.

 

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 721559.

 

Editorial Issue 16

Message from the Editor-in-Chief

In this issue of Accelerating News we bring you lots of great stories, including the first design concept for the FCC-ee main dipoles, exciting news on the 11T project for the HL-LHC, the development of HTS Tl-based coatings for the FCC beam screens, a new installation for Linac4, a summary of the latest accelerator energy-efficiency workshop, a feature on the arrival of the SESAME magnets in Jordan, and much more.



After serving as Editor-in-Chief for almost four years, I also say farewell as I begin a new endeavor. I would like to give my very special thanks to all my co-editors and pass the baton to Panos Charitos. I wish him and his team all the best stories to share with you in the upcoming issues!

 

Professor Tommaso Valletti (Imperial College London, Tor Vergata) has been appointed as the new Chief Economist of the European Commission’s DG Competition. He will take up his duties on 1 September 2016. Tommaso Valletti currently holds teaching positions at both Imperial College Business School and the University of Rome "Tor Vergata", and is also the Academic Director at the Centre for Regulation in Europe (CERRE) in Brussels. In this interview we discussed the challenges of his new role and the importance of fundamental research for boosting Europe's competitiveness.

P.C. First of all, I would like to ask you about your background: how did you decide to move from engineering to economics and finance?

T.V. I studied engineering at the Politecnico di Torino, one of the two top engineering schools in Italy. I grew up in Torino and, being a good student, enrolling there was a natural choice. While studying engineering, I also attended some economics courses and found them fascinating. My progression into social sciences was actually even more complicated. I graduated from the five-year engineering course when I was 22, but my biggest love back then was (and still is now) music. After graduation, I happened to have the chance of joining a chamber music ensemble, and I ended up travelling the world and giving concerts for the next three years. However, at the age of 25 I wanted to find a more stable lifestyle, and that’s when economics really entered my life. I came to London to do a MSc and PhD in Economics at the LSE. I have not left economics ever since.

P.C. Do you think that your training as an engineer helped you in your later studies? 

T.V. Very much so. During the first few years of my economics studies, the foundation courses in engineering proved to be especially helpful in understanding the mathematics which provide the basis for many formalisations in modern economics (mathematics was not used as much by classical economists, such as Adam Smith, David Ricardo, or Karl Marx). Applied engineering courses were also helpful, but I guess I will simply never have the chance of applying them!

P.C. What are your main research interests?

T.V. My area of specialization is in the field of Industrial Organisation, with particular reference to competition and regulation of telecommunications markets. I want to understand the functioning of markets, the causes of market failures, and possible remedies to fix them. Why telecoms? I followed that direction almost by accident, since, when I started my PhD in the mid-1990s and wanted to find an industry to study, I was curious about mobile phones, which were then just taking off (although, in truly academic style, I owned my first cell phone only a decade later). Jokes apart, telecommunications markets, and network industries in general, are very interesting to economists: think of the widespread barriers to entry, network effects, as well as fast technological progress. This combination poses formidable challenges to economic analysis and raises a number of interesting and novel questions. My specialization, though, extends to other industries as well, including platform markets, retailing, insurance, public procurement, pharmaceuticals, broadcasting, sport leagues, and transportation.

My approach and set of skills encompass both theory and empirical work. I rely on the application of game theoretical tools to the analysis of markets (game theory is the branch of economics ‘invented’ by John Nash, the famous protagonist of ‘A Beautiful Mind’). My empirical work largely consists of policy evaluation based on statistical analysis of micro data.

P.C. What is the role of competition and regulation (given your role as Academic Director of CERRE) in the modern economy? How easy is it to find a balance between the two (if one is needed in the first place)?

T.V. Competition is one of the most powerful engines for growth and it works great in most cases. Competition means that the best ideas can emerge and that talent is rewarded. Still, it would be foolish to conclude that we should always leave it to the market. Competition works under some conditions. If those conditions are not met, then the reverse is also true: competition fails to deliver good outcomes. That’s true every time there are externalities (think of education, or health), public goods (defence or research), or market power involved. The last case is when there is no competition after all, and markets do not achieve their wonders. This last aspect is what I study, but one has to make sure that interventions are appropriate – else you may achieve even worse results.

P.C. You have been appointed EC's Chief Economist for Competition. What do you think are the main keys to keep an economy competitive in the global environment?

T.V. One of the most important economic challenges for any democratic government is enabling businesses to operate freely, whilst also ensuring fairness and value for end consumers and taxpayers. This goes to the heart of many recent political and economic issues – whether regulating banks to prevent mis-selling of insurance products or stepping in to ‘save’ some industries. The role of the Chief Economist – and of its team of very talented PhDs! – has to be understood in this context.

We work in three areas. First, we look at large mergers that have a European dimension. Our task is to allow those concentrations that achieve efficiency gains that can be passed on to consumers, and instead block those mergers whose ultimate goal is to increase prices. Recently, for instance, the Commission blocked a proposed merger between mobile operators in the UK. Second, there is antitrust enforcement. This means checking that companies with market power do not abuse it, the key word here being abuse. A case in point is the ongoing Google case. There is nothing wrong in itself with Google being the largest search engine in Europe (and they do a great job, as we all know). However, we want to make sure that Google does not use that market power to stifle competition and innovation, for instance by leveraging its power from search to other markets where Google is also present and where it may give unfair advantages to its own products.

The last aspect of our work has to do with State Aid. Governments with short political cycles have a tendency to bow to pressures and spend taxpayers’ money to save failing firms, under the threat – say – of layoffs. Think of the billions of euros that have been given to the banking sector. Or, as an Italian, I am all too well aware of the money squandered on a company like Alitalia. Under the new regulations, governments will have to convince us of the robustness of the economic market failure they are trying to fix with their intervention; otherwise it risks being totally unfair to other players in the market, as well as to citizens at large. I think these three functions are very important to keep our economy competitive. But I start the new job on 1 September, so this is my thinking ‘in theory’, while ‘in practice’ I will have to learn how to do it.

P.C. What are the main challenges for EC economy in an age of turbulence like the one we are going through? 

T.V. Creating a culture of innovation is, I believe, one of the main challenges. In this respect, key issues are attracting and retaining talent, which requires clear career perspectives, and fostering mobility via practices that encompass well-organised recruitment procedures, the accompanying social security measures and adequate professional incentives.  In addition to providing the foundation for our independence, scientific and technological innovation is a precondition for global competitiveness and the key to successful partnerships.

In this framework, an additional challenge is to build on the well-documented public interest in science, using all forms of communication to foster dialogue and interest, to serve the dual purpose of enhancing public understanding of science and securing the future workforce for Europe’s R&D undertakings. In particular, a dedicated effort to improve science education in Europe’s primary and secondary schools is crucially important.

There are, of course, other significant challenges that we have to address, such as people’s fears, migration flows, and the discrepancy between facts and people’s perception of those facts. The distribution of income is also an aspect that we have neglected for too long, and this has fostered tensions. Although I acknowledge that the European message has failed to some extent, I believe that Europe can greatly benefit from global collaboration by building and securing its own competence in science and technology and by being ready and able to act as a single entity to reap the full benefits. For example, scientists from around the world have been working together for more than fifty years since the formation of the European Organization for Nuclear Research, CERN, and this continues with the High Luminosity upgrade that will run up to 2035 and with studies for a post-LHC large-scale research infrastructure. Now, in many areas, the work is done and the facilities offered by these organisations ensure that Europe leads the world.

P.C. Do you think that fundamental research can play a role in sustaining EC's competitiveness?

T.V. It is evident that science as a whole has a significant role to play in support of a society which has placed knowledge creation at the heart of its vision. The capability to successfully bring innovation to a market is a determinant of the global competitiveness of nations. It all starts from fundamental research. We should strive to keep the flow of ideas open and resist any effort to become more insular – a very difficult task in an age of polarisation like the present. Our job is to diffuse ideas, let them be understood, and share the results of research. This raises an issue of appropriability: the ‘public good’ nature of basic research. If we do not assign Intellectual Property over it, incentives to supply it will be weak. Instead, if we do assign IP, the ‘open’ nature of research is lost. That’s why public funding of fundamental research is really key.

For Europe to become and remain a world leader, it is also essential to have available a large body of highly educated and experienced S&T personnel with access to sophisticated and technologically advanced facilities and infrastructures. Current demographic, financial, and cultural trends pose a challenge to provide the intellectual, cultural and financial incentives to retain European researchers and to attract non-European researchers in the research and development areas considered to be of high priority. This is another reason why in my view the present studies for future large-scale research infrastructures are particularly topical.

P.C. CERN, ESA, ESS and ITER are but a few examples where Europe is leading international collaborations in research. Do you think that more effort should be placed in measuring and understanding the impact and role that these organizations play in maintaining the competitiveness of the European Research Area? 

T.V. In my view, knowledge, education, technology, and innovation are Europe's main strengths and the foundation for growth and employment. Our aim is to implement policies that encourage young researchers to develop their research ideas and provide them with the tools that will allow them to think how these ideas can be transformed into marketable products and services. Support for frontier research, as well as for interdisciplinary collaborative research in new and promising fields, can lead to greater European competitiveness, employment, and prosperity.

The examples that you mentioned, as well as the European Research Council, showcase the importance of providing support for basic research of the highest quality. Basic or fundamental research — which explores certain scientific areas without specific applications in mind — brings enormous benefits to society and is one of the most cost-effective investments governments can make, because it opens up new scientific horizons.

At the same time, I think it is important to support small and medium-sized enterprises (SMEs) that invest in fundamental technologies, to help them become leading and competitive market participants in the global environment. This means creating spaces that bring research centres and academia closer to SMEs, but also simplifying some processes and introducing new tools where needed.

I would also like to add that one shouldn’t be too obsessed with metrics either! If you measure my ability by my teaching, then perhaps you will neglect the impact of my research, and vice versa. So one has to be careful when introducing new indexes and interpreting them to draw conclusions. In terms of accountability and transparency, however, my answer is simply that by all means we should intensify our efforts. This can and should be done by experts.

P.C. Having a background as a mechanical engineer and as an economist, but also being a Professor in one of the world's most prestigious business schools, I would like to ask you, what do you think about the relation between academia and the industry?

T.V. Partnerships between academia and industry are fundamental, both to ensure that research finds useful applications and, often, to make sure it is funded appropriately. At the same time, it is especially important to maintain the integrity and independence of researchers. It is not easy to find a balance in a period when government funding for research is being reduced – a really worrying trend.

Here I can speak directly only about my own environment; I am less sure about elsewhere in the EU as I am not there. Government funding is going away, and the trend is awful. Top universities such as Imperial College can compensate by increasing students’ fees. But is this fair? I don’t think so. Meanwhile, industry can provide much-needed funds through the commercialisation of ideas or technology transfer; incentives, if they are set right, can do wonders.

Nevertheless, we should also provide some space to pure thinking and try to avoid bureaucratisation. It is important to keep the essence in sight: academics want to do top research; they do not want to fill forms. And let’s always be open to the world, to new ideas, and to the best students wherever they come from. They are our future.

Highlights from IPAC 2016
by Gianluigi Arduini, Yannis Papaphilippou and Rogelio Tomas Garcia


Attendees of the 7th International Particle Accelerator Conference, IPAC 2016

The 7th International Particle Accelerator Conference, IPAC’16, was held in Busan, Korea from May 8-13, 2016. This is the major yearly event in the accelerator science field, gathering scientists, engineers, students and industrial partners. The conference attracted more than 1200 participants from 37 countries.

The scientific programme started on Sunday afternoon with the traditional Student Poster Session, where the new generation of physicists and engineers presented their work and interacted with colleagues from all the major accelerator laboratories. The programme continued with oral and poster presentations covering colliders, hadron accelerators, photon sources, novel acceleration techniques, beam dynamics, beam instrumentation, accelerator technology and applications of accelerators.

Lepton and hadron colliders continue to push their performance in the service of particle physics research. The commissioning of the SuperKEKB collider is progressing steadily after a major upgrade to increase its luminosity. The LHC is back in action after a major shutdown and has successfully restarted operation at the unprecedented energy of 6.5 TeV. At RHIC, electron lenses have been used to enhance performance by compensating the beam-beam effect.

The ring light source upgrade programmes are in full swing, led by MAX IV, the first synchrotron light source based on a multi-bend achromat cell. Novel optimization techniques based on genetic algorithms are paving the way towards horizontal emittances at the diffraction limit, coupled with new and challenging magnet technology. PACMAN, a Marie Curie network based at CERN, is pushing the limits of technology in the domain of component alignment, training qualified engineers and creating synergies between laboratories, universities and industry.

Superconducting accelerating cavities based on Nb3Sn have been shown, for the first time, to outperform Nb cavities, defining the next generation of superconducting RF.

On the high-beam-power front, the SNS has reached 1.4 MW, and a study for doubling the power at 1.3 GeV has been launched. The RCS of J-PARC is being continuously improved towards routine 1 MW operation, while the main 50 GeV ring is delivering a 400 kW beam for long-baseline neutrino experiments.

Over a dozen contributions focused on the development of linear optics measurement and correction techniques and the understanding of their limitations. Investigations at NSLS-II and the ESRF compared a variety of techniques based on orbit-response and turn-by-turn BPM measurements. Both studies conclude that the agreement between the measured beta functions stays at the 1% level. This opens the door for light sources to replace the traditional and lengthy orbit-response techniques with fast turn-by-turn-based optics corrections.

A special session on engagement with industry took place on Wednesday afternoon, where the strategy for the High Luminosity LHC was presented. A two-way process is essential for success within business constraints: industry must understand the project’s needs, and the project must understand industry’s capabilities.

Four accelerator prizes were awarded during the conference. The Xie Jialin Prize for outstanding work in the accelerator field was awarded to Derek Lowenstein, BNL, USA, “for his many years of leadership in the AGS Booster and BNL RHIC”. The Nishikawa Tetsuji Prize for a recent, significant, original contribution to the accelerator field was awarded to Gwo-Huei Luo, NSRRC, Taiwan, “for his leading role in the management, construction, and commissioning of the Taiwan Photon Source (TPS)”. Sam Posen, Fermilab, USA, received the Hogil Kim Prize, for a recent, significant, original contribution to the accelerator field by an individual in the early part of his or her career, “for the development of Nb3Sn film coated superconducting RF cavities”. The Mark Oliphant Prize was awarded to Spencer Jake Gessner, SLAC, USA, for his PhD thesis work on the demonstration of the hollow-channel plasma wakefield accelerator.

The conference ended with a plenary session on hadron accelerators, where the status of the European Spallation Source (ESS) accelerator project was presented: a linear accelerator of unprecedented power (5 MW), bringing protons to 2 GeV through superconducting RF cavities. ESS is well into construction and the accelerator project is progressing according to plan towards first beam in June 2019.

IPAC’16 was organized under the auspices of the Asian Committee for Future Accelerators (ACFA), the European Physical Society Accelerator Group (EPS-AG) and the American Physical Society Division of Physics of Beams (APS-DPB). The large number of participants and the enthusiasm shown in Busan confirm the strong mandate for the IPAC series from the worldwide accelerator community. The eighth IPAC will return to Europe, taking place in Copenhagen, Denmark, on 14-19 May 2017.

FP7 CESSAMag: building industrial relations with SESAME members
By Jean-Pierre Koutchouk & Livia Lapadatescu (CERN)

The goal of SESAME is to join forces in the Middle East and beyond to build a synchrotron light source of excellence, attracting scientists from the region, including those already working at similar facilities around the world. In line with this international endeavour, the CERN engineers involved in the FP7 CESSAMag project in support of SESAME considered procuring some components from companies based in the SESAME Members. With extensive experience in carrying out quality control when buying from industry, CERN could take on the challenge of procuring from companies without prior accelerator experience by providing appropriate knowledge transfer.


CESSAMag magnets being installed at SESAME. (Image credit: SESAME)

Identifying potential companies from SESAME Members was, however, a challenge. In Cyprus, for example, a company was found through the Chamber of Commerce, while a Turkish company was identified at an industrial fair. Neither company had previous experience in producing accelerator components, but, after an assessment, both were found to have the potential to do so. By placing a pilot order first, CERN could test the components and ensure that they were produced to the highest standards for SESAME (the components, in particular the magnets, were highly customized for SESAME, as is the case for most accelerators, and were not off-the-shelf products). In the end, the Turkish company built the quadrupole coils, while the Cypriot company assembled half of the sextupoles (33 magnets). The assembly of the other half was donated by Pakistan, which sent an expression of interest to assemble the sextupoles in exchange for knowledge transfer from CERN. Except for the dipole, the power supplies of the quadrupoles and sextupoles were bought from a well-established power-supply company in Israel.

Another challenge in this industrial endeavour was the logistics of shipping the various components, involving in particular maritime transport to Pakistan, Cyprus and Jordan. Besides proper conditioning of the crates to avoid damage, the various transit times, including customs clearance, had to be taken into account to meet the overall schedule.


Map of CESSAMag partners

Along with the industrial return for the companies themselves (some of which are now interested in participating in CERN calls for tenders or in continuing to produce accelerator components), the Members involved have also welcomed the industrial collaboration with CERN.

The strategy chosen by CESSAMag to build industrial ties with SESAME Members, in addition to purchasing accelerator components from experienced companies in Europe, is yet another example of the science diplomacy advanced by the project.
