Ever since the electronic properties of silicon were discovered in the United States in the late 1930s, one fact has been well known: a new material can change the world. Perhaps because it weighs less, is sturdier, offers better thermal and acoustic performance, lasts longer, or simplifies production and assembly. Every now and then, the scientific community announces a new miracle material poised to succeed silicon, and the 2010s are already abuzz with announcements about a very serious candidate: graphene, a two-dimensional crystal consisting of a single layer of carbon atoms, which is credited with exceptional potential.
Countries endowed with great research institutions know it all too well. To stay in the race for innovation in the 21st century, they need to be involved, and successful, on four fronts, summarized by the acronym "NBIC": nanotechnology, biotechnology, information technology, and cognitive sciences. Nanotechnology returned to the spotlight in early 2010 following a particularly heated public debate in France, one that emphasized ethical and environmental concerns over any assessment of the potential of this new scientific frontier. So what is the real promise of nanotechnology? What should we make of the astronomical profits that certain big American consulting firms promised industrialists ready to embark on the adventure? In short, is nanomania here to stay?