How Automation Enables Critical Work at CERN

The European Organization for Nuclear Research, known as CERN, operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco-Swiss border and has 23 member states.

Since the Large Hadron Collider (LHC) started making news in 2008, CERN has become a household name – and not just among science enthusiasts. Responsible for a network of half a dozen particle accelerators, CERN has been providing the world with cutting-edge science and research since it was founded in the 1950s. The discovery of the Higgs boson, popularly called the “God particle,” at CERN led to theorists Peter Higgs and François Englert receiving the Nobel Prize in Physics in 2013.

CERN employs about 2,500 scientific, technical, and administrative staff members and hosts about 12,000 users, all of which requires an impressive IT infrastructure. On top of supporting its users, CERN deals with an incredible amount of data: in 2017, it passed the 200 petabyte milestone. By then, its accelerators and experiments had recorded data equivalent to 680 years of continuous full HD video.

“Our performance is much better. We have time … to dedicate to actual things that are important, to implement [updates], to change things that we are seeing in the monitoring that are not working.”


At-Scale Automation Is a Must

CERN needed a way to monitor daily operations and identify problems in real time.

CERN has grown so large that it simply cannot function without automation. In 2011, CERN adopted Puppet to manage its growing infrastructure. With Puppet, the CERN IT team now automates more than 40,000 managed nodes, using over 100 virtual servers to provide the necessary infrastructure support.
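At this scale, configuration is expressed as code: a class written once is enforced on every node that includes it. A minimal sketch of what such a Puppet manifest might look like – the class, package, and host names here are illustrative assumptions, not CERN's actual code:

```puppet
# Hypothetical example: keep a monitoring agent installed,
# configured, and running on every node that includes this class.
class profile::monitoring_agent (
  String $collector_host = 'collector.example.org',
) {
  package { 'monitoring-agent':
    ensure => installed,
  }

  file { '/etc/monitoring-agent/agent.conf':
    ensure  => file,
    content => "collector = ${collector_host}\n",
    require => Package['monitoring-agent'],
    notify  => Service['monitoring-agent'],
  }

  service { 'monitoring-agent':
    ensure => running,
    enable => true,
  }
}
```

Because Puppet converges each node toward this declared state on every run, applying one class like this across tens of thousands of machines is how a small team keeps an entire fleet consistent.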

Keeping the scientists happy and the experiments running smoothly is one of the priorities for IT at CERN. This presents unique challenges, however. Research projects at CERN deal with staggering amounts of data, and processing and analyzing it all demands massive compute resources. The LHC accelerates protons through its 27 km circular tunnel to 299,792,455 m/s – just 3 m/s shy of the speed of light – and slams them together at energies surpassing one trillion electron volts. Up to 40 million particle collisions can happen every second, and all of that data needs to be analyzed.

Everyday Automation Lowers Costs, Improves Monitoring

To monitor this infrastructure, CERN combines Puppet agents, Puppet servers, and PuppetDB in workflows that collect data through two source pipelines. The data flows through processing pipelines (Kafka, Logstash) and analytics software (Elasticsearch) into data storage (InfluxDB, HDFS), with frontend components (Grafana, Kibana, and Jupyter) on top for visualization and exploration.
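A stack like this is commonly composed in Puppet using the roles-and-profiles pattern: each component gets a profile class, and a role class bundles the profiles a given machine needs. A hedged sketch of that idea, with hypothetical module, class, and host names (not CERN's actual code):

```puppet
# Hypothetical role assembling a monitoring stack like the one described.
# Each profile would typically wrap a community module for its component.
class role::monitoring_frontend {
  include profile::base           # common baseline for every node
  include profile::kafka_broker   # data processing pipeline
  include profile::elasticsearch  # analytics backend
  include profile::grafana        # dashboard frontend
}

node 'monit-frontend-01.example.org' {
  include role::monitoring_frontend
}
```

The appeal of this pattern is that swapping or upgrading one component means editing one profile, while the role – and every node assigned to it – stays unchanged.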

Puppet automates the provisioning and decommissioning of thousands of virtual servers every day at CERN. With this level of automation, the number of services each administrator can manage has increased substantially. Though the size of the IT department at CERN has remained stable, the team has been able to enhance IT functionality, improve monitoring, and offer more services.

Automation has also reduced the number of support calls received by the IT team. In 2011, before adopting Puppet, the system administration team dealt with 60,000 tickets. That number has since fallen to just a few hundred. Automation enables the team to continuously introduce service changes while minimizing service disruption, which frees up time to develop better software for the physics community and build more resources for CERN researchers.


The results:

  • Readable data for configuration management of 40,000 nodes
  • Usable code extracted and documented from 350 catalogs per minute
  • Reduced latency

At Puppet, we are thrilled to know that our automation technology helps make the most important physics research in the world possible.

See for yourself how Puppet can help you scale automation. 

Try Puppet