Press Release

Cosmology Machine re-creates the Universe

By SpaceRef Editor
July 30, 2001

The past, present and future of the universe are about to be revealed in unprecedented detail by Britain’s biggest academic supercomputer, the Cosmology Machine, based at the University of Durham.

Trade and Industry Secretary Patricia Hewitt launched the “time machine” on its first simulation program today (31 July) when she switched on the £1.4 million state-of-the-art installation at the University’s Physics Department.

The Cosmology Machine takes data from billions of observations of the behaviour of stars, gases, galaxies and the mysterious dark matter throughout the universe and then calculates, at ultra-high speed, how galaxies and solar systems formed and evolved. By simulating virtual universes under different theories of cosmic evolution, it can test which ideas come closest to explaining the real universe.

The gigantic new facility – manufactured by Sun Microsystems and supplied by Esteem Systems plc – has been installed at Durham with the help of £652,000 from the Joint Research Equipment Initiative. The JREI was set up by the DTI’s Office of Science and Technology, the Higher Education Funding Council for England (Hefce) and the research councils – in this case, the Particle Physics and Astronomy Research Council (PPARC) – to provide strategic investment in key scientific infrastructure for research of international quality.

The funding forms part of £18 million worth of special strategic investment in Durham science by the DTI and the research and funding councils over the past two years.

The supercomputer is operated by the Institute for Computational Cosmology (ICC), part of the Ogden Centre for Fundamental Physics now being developed at Durham. Its breathtaking capacity for calculations will set new standards in science that could also help other areas of research. The supercomputer:

– is called the Cosmology Machine. Its engine room is an integrated cluster of 128 UltraSPARC III processors and a 24-processor Sun Fire server. It is the largest computer in academic research in the UK and one of the 10 largest in the UK as a whole.

– can perform 10 billion arithmetic operations in a second. This number of operations would take a numerate individual about a million years of continuous calculation to complete. Alternatively, if all of Earth’s six billion inhabitants were proficient at arithmetic, it would take them about two hours to carry out the same number of operations that the supercomputer can carry out in a single second.

– has a total of 112 Gigabytes of RAM and 7 Terabytes of data storage. (A Terabyte is more than a million million bytes.) This is the equivalent of nearly 11,000 CD-ROMs. It could hold the contents of the 10 million books that make up the British Library collection and still have plenty of space left over.
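The comparisons above can be checked with simple arithmetic. The sketch below reproduces them approximately, under the assumption (not stated in the release) that a human calculator completes about one arithmetic operation per hour and that a CD-ROM holds about 650 megabytes:

```python
# Back-of-envelope check of the press release's figures. The human rate
# of one operation per hour is an assumption chosen to illustrate the
# comparison, not a specification from the release.

machine_ops_per_sec = 10e9      # 10 billion arithmetic operations per second
human_ops_per_hour = 1          # assumed rate for a careful human calculator
population = 6e9                # Earth's population in 2001

# One person, working continuously:
hours_single = machine_ops_per_sec / human_ops_per_hour
years_single = hours_single / (24 * 365.25)
print(f"one person: about {years_single / 1e6:.1f} million years")

# Everyone on Earth working in parallel:
hours_everyone = machine_ops_per_sec / (population * human_ops_per_hour)
print(f"all of humanity: about {hours_everyone:.1f} hours")

# Storage: 7 Terabytes versus 650-megabyte CD-ROMs
terabyte = 1e12                 # "more than a million million bytes"
cds = 7 * terabyte / 650e6
print(f"7 TB is roughly {cds:,.0f} CD-ROMs")
```

Under these assumptions the figures come out at roughly 1.1 million years, 1.7 hours and 10,800 CD-ROMs, in line with the rounded numbers quoted above.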

Vice-Chancellor Sir Kenneth Calman said: “This is a fascinating and important branch of physics. I am delighted that my colleagues in Durham have established the expertise and quality to take a lead in advancing the frontiers of knowledge even further.”

Professor Carlos Frenk, Director of the ICC, said: “The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer on how to make artificial universes which can be compared to astronomical observations. It is truly remarkable that all that is required to emulate the Universe are the same laws of Physics, such as gravity, that govern everyday events on Earth.”

Chief Executive of PPARC, Professor Ian Halliday, said: “This is a stunning resource for astronomical research in Britain. It will enable consortium members in the UK, Germany, Canada and the USA to perform cosmological calculations of unprecedented size and detail. We are poised to confront one of the grandest challenges of science: the understanding of how our universe was created and how it evolved to its present state.”

The Durham Institute is a leading international centre for research into the origin and evolution of the universe and is the UK base of the “Virgo consortium for cosmological simulations”, a collaboration of about 30 researchers in the UK, Germany, Canada and the USA. Research ranges from the formation of the first objects in the universe, to the physics of the great clusters of galaxies. Long-term goals are to understand the formation of structures in the universe, to establish the identity and properties of the dark matter that dominates the dynamics of the universe, to determine the parameters of our world model, and to relate the Big Bang theory to astronomical observations.

Notes for Editors

Further information from:
Prof Michael Pennington, Head of Physics
Durham University
0191 374 2158

1. There are three linked developments under way in Physics at the University of Durham, representing a major UK investment in science within a strong international network:
– The Ogden Centre for Fundamental Physics, named after the benefactor businessman and Durham physics graduate Dr Peter Ogden. It contains two research institutes:
– the Institute for Computational Cosmology
– the Institute for Particle Physics Phenomenology
The switching on of the Cosmology Machine by Patricia Hewitt marks the formal launch of the ICC and the beginning of the new Centre. A new building for the Centre is under construction and due for completion in the summer of 2002.

2. JREI was specifically designed to help purchase equipment for advanced research programmes. It was funded by the four UK higher education funding bodies, the Office of Science and Technology (OST) and the Research Councils. Research teams who receive support also forge partnerships with industrial bodies in order to further research. It was replaced early in 2001 by the Science Research Investment Fund (SRIF).

3. A more detailed scientific note by Professor Carlos Frenk is attached. It may be reproduced as a signed article or used for more extensive quotes.

University of Durham : news background note

31 July 2001

Cosmic architecture: building the Universe

by Professor Carlos Frenk
Director of the Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, UK

Cosmology confronts some of the most fundamental questions in the whole of science. How and when did our universe begin? What is it made of? How did it acquire its current appearance? How will it end? These are all questions that have preoccupied mankind since the beginning of civilisation, but which only relatively recently have become accessible to established scientific methodology. Recent advances suggest that these and related questions will be answered in the next few years through a combination of astronomical observations and computer simulations.

The enormous progress in cosmological studies over the past decade has been driven by the timely coincidence of new theoretical ideas and technological innovation. In the past 40 years, astronomers have gathered incontrovertible evidence that our universe began about 10 billion years ago in a hot, dense phase – the Big Bang – and that most of its material content today consists of invisible “dark matter”, very likely made up of exotic elementary particles produced in the earliest stages of the Big Bang. The radiation generated by the primordial fireball is detected today as a background of microwaves and this provides a direct window to the early universe. In 1992, the COBE satellite discovered tiny ripples in this radiation, the fossil record of primordial irregularities which, over 10 billion years of cosmic evolution, have been amplified by the gravity of the dark matter to produce the rich variety of structures seen today in large galaxy surveys. Precise measurements show that the properties of these irregularities agree remarkably well with the predictions of the “inflationary” theory of the early universe. Also during the past decade, astronomers have discovered galaxies in their early phases of formation which have many of the properties that theorists had predicted. We are currently on the verge of major breakthroughs which will resolve once and for all issues of such fundamental importance as the geometry of the universe, the nature of the cosmic dark matter and the origin of galaxies.

Every day observatories on the ground and in space peer into the cosmos, collecting huge amounts of astronomical information. Cosmological data are unique because the finite light travel time implies that objects are observed not as they are now but as they were at some time in the distant past, which depends on how far away the object is. Using the fundamental laws of Physics, computer simulations are able to recreate the evolution of the universe, thus providing the means for connecting objects or “events” observed at widely different cosmic epochs. On the scales of galaxies and clusters, the evolution is complex and involves not only gravitational interactions, but also gasdynamic and radiative effects associated with the gas that ultimately ends up in the stars that make up the galaxies. In spite of this apparent complexity, the problem is much better posed than most computational problems in Physics or Biology: the initial conditions are known precisely, both from the ripples in the microwave background radiation and from early universe Physics. Starting from such initial conditions, cosmological simulations follow the coupled evolution of dark matter and gas to the present day.

Modern computer simulations recreate the major events which have shaped our Universe:
– the formation of the primordial plasma
– its irradiation by the earliest quasars and stars
– the motion of primordial hydrogen gas clouds and their accretion onto spinning dark matter clumps
– the growth of dark matter halos and the galaxies within them by repeated mergers of substructures
– the emergence of spiral galaxies like the Milky Way
– the formation of great aggregates of galaxies like the Coma cluster.
The output of a simulation is a virtual universe over which scientists have control; the input values of fundamental parameters and the underlying assumptions about the nature of the dark matter can be changed at will and new virtual universes created. A detailed comparison of the virtual universes with the real one reveals the model assumptions and parameter values that best describe our Universe.

Cosmological simulations present a formidable computational challenge not only because of the intrinsic complexity of the problem, but also because of the huge range of scales involved. The processes that lead to the formation of an individual star operate on a length scale at least one hundred million times smaller than the size of the largest galaxy structures seen in the universe. To overcome these problems, cosmologists have devised clever algorithms to calculate efficiently the evolution of a collisionless N-body system (the dark matter) as well as novel approaches to fluid dynamics. Many of these techniques have applications to a broad range of problems in other disciplines.
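The collisionless N-body technique mentioned above can be sketched in a few lines. The example below integrates self-gravitating particles by direct pairwise summation with a kick-drift-kick leapfrog scheme; all names, units and values are invented for illustration, and production codes of the era used far more sophisticated tree and mesh algorithms to avoid the O(N²) cost shown here:

```python
import numpy as np

G = 1.0           # gravitational constant in arbitrary code units
SOFTENING = 0.05  # softening length to avoid singular close encounters

def accelerations(pos, mass):
    """Pairwise gravitational accelerations by direct summation, O(N^2)."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # r_j - r_i
    dist2 = (diff ** 2).sum(axis=2) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                          # no self-force
    weighted = diff * inv_r3[:, :, np.newaxis] * mass[np.newaxis, :, np.newaxis]
    return G * weighted.sum(axis=1)

def evolve(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc           # half kick
        pos += dt * vel                 # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc           # half kick
    return pos, vel

# Gravitational collapse of a small random "clump" of equal-mass particles
rng = np.random.default_rng(42)
pos = rng.standard_normal((64, 3))
vel = np.zeros((64, 3))
mass = np.full(64, 1.0 / 64)
pos, vel = evolve(pos, vel, mass, dt=0.01, steps=200)
```

Because the pairwise forces are antisymmetric, the total momentum of the system is conserved to machine precision, which is a standard sanity check on integrators of this kind.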

In spite of the spectacular achievements of the past two decades, even the largest supercomputers today are still too small to recreate our universe in the detail required for a full interpretation of astronomical data. For example, the largest cosmological calculation ever carried out, the “Hubble volume” simulation performed by the Virgo consortium using the 750-processor Cray-T3E supercomputer in Munich, used a record-breaking one billion particles to follow the evolution of dark matter over practically the entire visible universe. However, galaxy clusters could only be resolved with 1000 particles and individual galaxies not at all. Similarly, the largest simulation of the dark halo of our own galaxy (also carried out by the Virgo consortium) took several months of continuous calculation on a 64-processor Cray-T3E at Edinburgh, but did not resolve the crucial inner parts of the galaxy where processes that probe the nature of the dark matter occur.

The new supercomputer at the Institute for Computational Cosmology in Durham will increase the computing power available to the Virgo consortium tenfold. It is the largest supercomputer for academic research in Britain and one of the largest in Europe. It will enable consortium scientists in the UK, Germany, Canada and the USA to perform cosmological calculations of unprecedented size and detail. In close connection with astronomical data collected with a new generation of giant telescopes and space observatories, these calculations will confront one of the grandest challenges of science: the understanding of how our universe was created and how it evolved to its present state.

Professor Carlos S. Frenk
Tel: +44 (0)191-374-2141


Mr Keith Seacroft
University of Durham
0191 374 2946
