How are we to assess the distance between basic research and the essential technologies of the modern age? Are we building the bridge that will unite the two domains, or is the gulf between them growing wider by the day? Reconciling the interested parties in any definitive way remains difficult, as each side can furnish multiple examples to support its perspective. Perhaps the best illumination comes from a retrospective approach, one that highlights pertinent discoveries and, in doing so, clears up some of the fog surrounding the debate.
The role of science at the birth of the industrial revolution is often overstated; it could be said that thermodynamics was born of the steam engine rather than the other way around. Seeking to answer questions about the operation of heat engines, the French physicist Sadi Carnot published his groundbreaking memoir in 1824 under the title Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance (“Reflections on the Motive Power of Fire”). It contains a preliminary outline of the second law of thermodynamics and laid the foundation for future advances in the field. The publication in 1864 of “A Dynamical Theory of the Electromagnetic Field” by the Scottish scientist James Clerk Maxwell reversed the trend, making science the mother of invention and tipping the scales firmly toward research as the wellspring of future innovation. The agreement of his results convinced Maxwell that light and magnetism are affections of the same substance. Working further on the problem, he was able to show that waves of oscillating electric and magnetic fields travel through empty space at a speed that could be predicted from simple electrical experiments: the speed of light. Nevertheless, Maxwell's revolutionary theory languished in relative obscurity until after his death, when Hertz confirmed his findings through a series of experiments in 1887. The electromagnetic waves predicted a century and a half ago reverberate through time and continue to bathe our society in the warmth of their considerable glow.
Since Albert Einstein introduced it to the world in 1905, the Special Theory of Relativity, along with its 1916 expansion to include gravitation in the General Theory, has remained stubbornly conceptual in nature. Empirical examples, such as the relativistic contribution to the anomalous precession of Mercury's perihelion, or the fact that matter deflects light, as observed in the deviation of light rays passing close to a star, are the exception rather than the rule. The popular view of physics often conforms to the stereotype of a shadowy world of laboratories where scientists labor monastically to find the key that will unlock the secrets of the universe, and it is understandably difficult for the layperson to grasp what all the fuss is about.
To bring the matter closer to home, and to understand the practical consequences of relativity, it helps to observe the way it plays out in our daily lives. GPS allows us to find our position on the earth to an accuracy of about one meter (and it goes without saying that military applications can reach an even greater degree of precision). A GPS receiver determines its position by comparing the timing of the electromagnetic signals it receives from at least four satellites (usually 6 to 12 are in view) against the known position of each satellite: three satellites fix the position, and a fourth resolves the offset of the receiver's own clock. Light travels roughly one meter every three nanoseconds, which means the timing measurements must be accurate to a few times 10⁻⁹ seconds. Ordinary timekeeping devices have no chance of approaching this degree of precision, so each GPS satellite carries an atomic clock that keeps time to about a nanosecond by measuring the frequency of radiation emitted by an atom when it makes a transition between two energy states (the second is today defined as the duration of 9,192,631,770 cycles of the radiation corresponding to the transition between two energy levels of the caesium-133 atom). Contrary to popular belief, GPS satellites are not geostationary, and because of the speed at which they orbit (about 14,000 km/h) Special Relativity predicts that the on-board atomic clocks should fall behind clocks on the ground by about 7 microseconds per day, the time dilation effect of their relative motion. Furthermore, the satellites orbit at about 20,000 km above the surface of the earth, where the curvature of spacetime due to the Earth's mass is less pronounced than at the surface. General Relativity predicts that the satellite clocks will appear to tick faster than those on the ground by about 45 microseconds per day.
The combination of these two relativistic effects means that the clocks on board each satellite tick faster than identical clocks on the ground by about 38 microseconds per day, the amount of time it takes light to travel almost a dozen kilometers. If these relativistic effects were not taken into account, a navigational fix based on GPS would be utterly useless.
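These two corrections can be checked with a back-of-the-envelope calculation. The sketch below is not from the article: it uses standard textbook values for the physical constants and the leading-order formulas for velocity time dilation and gravitational frequency shift.

```python
# Back-of-the-envelope check of the two relativistic corrections for a GPS
# satellite clock; all constants are standard textbook values.
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24     # mass of the Earth, kg
c = 2.998e8      # speed of light, m/s
R = 6.371e6      # radius of the Earth, m
h = 2.02e7       # satellite altitude, m (about 20,200 km)
v = 3.89e3       # orbital speed, m/s (about 14,000 km/h)
DAY = 86_400     # seconds per day

r = R + h
# Special Relativity: a moving clock runs slow by the fraction v^2 / (2 c^2).
sr_us = -(v**2 / (2 * c**2)) * DAY * 1e6          # microseconds per day
# General Relativity: a clock higher in Earth's potential runs fast by the
# fraction (G M / c^2) * (1/R - 1/r).
gr_us = (G * M / c**2) * (1/R - 1/r) * DAY * 1e6  # microseconds per day

print(f"special relativity : {sr_us:+.1f} us/day")   # about -7 us/day
print(f"general relativity : {gr_us:+.1f} us/day")   # about +46 us/day
print(f"net effect         : {sr_us + gr_us:+.1f} us/day")
```

Multiplying the net drift by the speed of light gives the ranging error it would cause, on the order of ten kilometers per day, which is why the satellite clocks are deliberately compensated.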
In academia, quantum theories of matter have long had a reputation for being too esoteric and philosophical to be anything other than a specialist subject, an unrealistic course of study for the battalions of engineering students entering institutions of higher learning each year. This was true even in the fifties, when transistor technologies were revolutionizing the future of electronics and the budding field of solid-state and condensed-matter physics was demonstrating the practical value of quantum mechanics and electrodynamics, and little has changed since. That the laser was likewise the result of pure research into the possibility of stimulated emission, published in a 1917 paper by Albert Einstein on the quantum theory of radiation, was easily overlooked. Who but the physicist at work in his laboratory would find the slightest use for a sudden burst of coherent light? The answer has been provided by the plethora of commercial applications developed in the latter half of the 20th century, ranging from CD and DVD technologies to surgical and precision cutting instruments, passing by way of the supermarket check-out counter with the development of bar-code scanners. Kastler and Brossel's ideas about optical pumping, in which atoms are excited by light to a higher energy level, played an important part in this development and were swiftly advanced by Townes and others. From communications and information technologies to nuclear energy, quantum mechanics has been an essential tool for solving the most challenging engineering problems.
The linear model of science, which places theoretical research at the disposal of practical necessity much as a doctor's mortar and pestle is placed at the disposal of the patient, has been updated and largely replaced by a more interactive model of give and take between the pure and applied worlds. The exponential increase in the number of components contained on a single chip has become known as Moore's Law. While not really a law at all, more an uncannily accurate observation of what is technologically possible, it predicts the doubling of the number of transistors per cm² roughly every 18 months. As device dimensions shrink, advances will come more from tweaks to system design than from technical innovation, and when transistors become so small that quantum effects take over, we will have arrived at the ultimate single-electron transistor. At this point, will silicon cede its primacy to quantum computing? In a conventional computer, information is stored and manipulated as sequences of binary digits, or bits, denoted 1 or 0: each bit is a switch that is either on or off. A quantum bit, by contrast, can exist in a superposition of states, both on and off at the same time. Unfortunately, these qubits are notoriously difficult to put to work, because useful computation requires entangling many of them, and the real challenge lies in preventing the decoherence that results from unavoidable fluctuations in control parameters.
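The difference between a classical bit and a qubit can be illustrated in a few lines of plain Python. This is a toy state-vector sketch under simplifying assumptions, not any real quantum SDK; the names `ket0` and `hadamard` are illustrative only.

```python
import math

# A qubit is a 2-component complex vector; |0> and |1> play the role of the
# classical "off" and "on" switch positions.
ket0 = [1 + 0j, 0 + 0j]

def hadamard(state):
    """Hadamard gate: sends |0> to the equal superposition (|0> + |1>)/sqrt(2)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

psi = hadamard(ket0)                      # superposition: "on and off at once"
probs = [abs(amp) ** 2 for amp in psi]    # Born rule: measurement probabilities

print(probs)  # each outcome, 0 or 1, occurs with probability 0.5
```

A measurement collapses the superposition to a single classical outcome; decoherence is, in effect, the environment performing that measurement uninvited.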
In the give and take between basic scientific research and modern technological development, the time it takes for information to pass between the two has been dramatically shortened. One example: the 1988 discovery of giant magnetoresistance by Albert Fert (Orsay, France) and Peter Grünberg (Jülich, Germany) earned the pair the 2007 Nobel Prize in Physics, and the phenomenon is at the heart of the read-head technologies found almost universally in the hard drives of our computers. Essentially, it is a quantum effect linked to the spin of electrons, in which electrical resistance changes sharply with an applied magnetic field. Going beyond conventional conduction based on electrical charge alone, it has created an entirely new discipline called spintronics.
Despite its long history, physics at its most fundamental level has continued to push against the limits of what is possible and remains surprisingly open to innovation. During the post-war boom years, important progress was made at the conceptual level, allowing physicists to gain a more profound understanding of the world around them. Explorations of the nuclear and electromagnetic interactions revealed new symmetries, symmetry being understood as the property of a structure that remains unchanged when a certain operation is performed on it. A general feature of these ‘gauge’ theories is that none of the fundamental fields that change under a gauge transformation can be directly measured.
History is still being written: the search is on for the Higgs boson, and for possible supersymmetric partners, to supply the missing pieces of the Standard Model, a 40-year-old mathematical framework linking all the known particles and all the fundamental forces of nature except gravity. Above a certain critical temperature all particles are massless except the Higgs boson, and the Large Hadron Collider at CERN has continued to push the limits of what is possible since coming on-line almost two years ago. Described as the largest and most complex machine ever constructed, it is really starting to hit its stride as its power mounts.
The techniques required for modern experimental high-energy physics are varied and complex and have often led to technological advances, in the development of superconducting magnets for example, and, lest we forget, in the pioneering work of CERN scientist Tim Berners-Lee in creating the World Wide Web, originally conceived to meet the demand for automatic information sharing between scientists working in universities and institutes all over the world. A simultaneous trend has been the effort to resolve the incompatibility between Einstein's theory of gravity and quantum mechanics with some rather audacious theories, in which space is no longer limited to the three dimensions that our senses and measuring apparatus can perceive. These theories would be extraordinarily difficult to confirm, because doing so would require accelerators far more powerful than any now conceivable. Theory building, while it creates a framework for thought, can be extremely hazardous and is never a substitute for experiment and observation.
Nevertheless, contemporary observations by the astrophysics community have continued to demonstrate how necessary it is to make progress on these questions. Cosmology has become an experimental science, particularly since the observation of fluctuations in the cosmic microwave background radiation that bathes the universe. One of the biggest challenges is that, despite huge advances, there are still shortcomings in the tools used to observe these phenomena, whether from satellites or from the ground, by radio astronomy or by telescopes exploiting advances in adaptive optics. And it has to be admitted that most of reality appears to be missing, and the universe remains as difficult as ever to understand. “Normal” matter represents roughly 3% of the known universe. About one quarter is made up of dark matter, which shrouds the galaxies in giant bubbles called haloes and whose composition is completely unknown. Recent observations, including those of the Hubble Space Telescope, indicating that the expansion of the universe is accelerating imply the existence of an additional 70% of dark energy. This has created an almost existential crisis: when physicists have tried to calculate how much energy the quantum fluctuations of empty space should contribute, the answer has come out wrong by the incomprehensibly large factor of 10¹²⁰. Thus the mystery continues, and will probably remain until physicists one day unify quantum mechanics with gravity.
The recent trend toward cross-fertilization between the different scientific fields has resulted in a less cloistered marketplace of ideas. Physics has a long history of laying the theoretical foundations for the advancement of technology; a few examples include powerful medical imaging devices: X-ray machines, MRI, magneto-encephalography, and positron emission tomography (PET). In recent years the relationship between biologists and physicists has been strengthened and exchanges between the two disciplines have become more and more frequent. For our purposes we need not trouble ourselves over the reasons this development has taken place, but it is useful to demonstrate the results of these collaborative efforts by listing some shared challenges: processing the massive amounts of data produced by DNA sequencing; analyzing the structure of proteins with the aid of synchrotron facilities (these produce radiation across a broad range of wavelengths, allowing scientists to discern the atomic structure of a variety of materials); and realizing the possibilities for studying molecular biology with scanning tunneling microscopes. All these examples demonstrate the logic of the current interdisciplinary approach to innovation. Further evidence of this happy marriage is provided by instruments such as the NeuroSpin facility, recently acquired by the French Atomic Energy Commission, which aims to push the current limits of MRI as far as possible, and diffusion MRI, invented by D. Le Bihan. Both represent new and powerful approaches to the study of brain anatomy and function, an incredible challenge for scientists given the complexity of neural networks and the staggering number of synapses that connect them.
As we look back over the massive achievements ushered in over the course of the 20th century, one cannot help but reflect that this golden age of physics, resting on the twin pillars of Relativity and quantum mechanics, has also been tinged with a gnawing sense of dread. The two atomic bombs dropped on Japan to bring an abrupt end to the Second World War, and the devastation they unleashed, serve as all the reminder we need of the sometimes terrible possibilities created through physics. Considering the challenges that must be overcome to continue the progress made over the last century, no small measure of political will is required to minimize the future risks to which populations around the globe could be exposed. The primacy of science as a source of solutions to so many of the world's ills must not be forgotten as we rush headlong into the 21st century. How else can we reduce or contain carbon emissions, find the means to feed the estimated 9 billion people expected by the year 2050, continue the fight against poverty, provide adequate healthcare, and preserve the Earth as we know it?
Basic research and its relationship to modern technology, ParisTech Review, November 24th, 2010.