It has been a notable phenomenon for a considerable time that almost every major university invents its own nano-imaging techniques.
Usually, this results in a particular piece of technology with limited applications that does not necessarily become an industry standard. Although it may produce interesting results and demonstrate alternative options, that does not guarantee it will end up relevant. While it is quite worthwhile to take a close look at the strengths and benefits of individual approaches, extolling their virtues may have to be postponed for years, if not decades in certain cases, until the consensus of market forces has articulated a clear preference along with the reasons for it.
That said, one needs to take into consideration that all technology, even what has become the industry standard, is provisional, and its continued development is cross-fertilized by alternative approaches. In the longer term, there is no room in technology development for ‘not invented here,’ the attitude of ignoring third-party solutions because of their external origins. With few exceptions, looking sideways to leverage other people’s work benefits all further R&D, since it avoids reinventing the wheel while highlighting potential for improvements.
While electron microscopy (Ernst Ruska and Max Knoll, 1931, ~50 nm resolution) opened the door to imaging the molecular dimension, the scanning tunneling microscope (Gerd Binnig and Heinrich Rohrer,[1] 1981, ~0.1–0.01 nm resolution) enabled imaging and manipulating individual atoms in a sample. Electron microscopy has since been refined into deterministic electron ptychography at atomic resolution. Ruska as well as Binnig and Rohrer received the 1986 Nobel Prize in Physics, Ruska for his fundamental work in electron optics and Binnig and Rohrer for their design of the scanning tunneling microscope.
The next leap came with scanning transmission x-ray microscopy (STXM), a special case of which is ptychography, a form of diffractive imaging based on the inverse computation of scattered intensity data. The name derives from the Greek ptyx, a fold or layer, as in diptychon, triptychon, polyptychon. First conceived by Walter Hoppe in the late 1960s,[2] the method was subsequently developed for applications in both the x-ray and visible spectrum, improving resolution by more than a factor of three so that it can, in principle, reach wavelength-scale resolution. Even at typical resolutions of 0.24 nm, its image quality is improved over standard scanning tunneling microscopy, which makes it useful at the nanoscale.
Its principal limitation was, until recently, the need to avoid vibrations in the x-ray microscope. A sample is scanned through a narrow, coherent x-ray beam that is generated by a synchrotron and defined by a small aperture. Smart algorithms based on Fourier transforms replace optical or magnetic lenses.
Or, as John Rodenburg
put it,
“We measure diffraction patterns rather than images. What we record is equivalent to the strength of the electron, X-ray or light waves which have been scattered by the object – this is called their intensity. However, to make an image, we need to know when the peaks and troughs of the waves arrive at the detector – this is called their phase. The key breakthrough has been to develop a way to calculate the phase of the waves from their intensity alone. Once we have this, we can work out backwards what the waves were scattered from: that is, we can form an aberration-free image of the object, which is much better than can be achieved with a normal lens.”
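To make the idea concrete, the following is a minimal sketch of phase retrieval in the iterative error-reduction style that Rodenburg’s remark points to. It is an illustration only, not the code of any actual instrument: it assumes NumPy, and the support mask (prior knowledge of where the object can be non-zero) and the random starting phases are illustrative choices.

```python
import numpy as np

def retrieve_phase(measured_amplitude, support, n_iter=200, seed=0):
    """Iterative error-reduction phase retrieval (Gerchberg-Saxton style).

    measured_amplitude: square root of the recorded diffraction
                        intensity (magnitudes only; phases are unknown).
    support:            boolean mask of where the object may be
                        non-zero in real space (a priori knowledge).
    Returns a complex-valued real-space estimate of the object.
    """
    rng = np.random.default_rng(seed)
    # Start from the measured magnitudes with random phases attached.
    phases = np.exp(2j * np.pi * rng.random(measured_amplitude.shape))
    field = np.fft.ifft2(measured_amplitude * phases)

    for _ in range(n_iter):
        # Real-space constraint: the object vanishes outside its support.
        field = np.where(support, field, 0)
        # Propagate to the detector plane (far field).
        diffraction = np.fft.fft2(field)
        # Fourier-space constraint: keep the computed phase but force
        # the magnitudes to match what was actually measured.
        diffraction = measured_amplitude * np.exp(1j * np.angle(diffraction))
        field = np.fft.ifft2(diffraction)
    return field
```

Each pass enforces what is known in both domains – the measured intensities at the detector and the spatial extent of the object – until the unknown phases converge.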
As a 2013 study conducted jointly by Switzerland’s Paul Scherrer Institute (PSI) and the Technical University of Munich showed, progress in imaging and metrology increasingly correlates with sophisticated control and comprehensive characterization of wave fields.
This technology makes it possible to image an entire class of specimens that previously could not be observed particularly well. Not only can remaining vibrations of the x-ray microscope be compensated for by purely mathematical and statistical methods, yielding much higher image quality, but ptychography also makes it possible to characterize fluctuations within the specimen itself, even when they occur faster than the acquisition of individual frames. It may become possible, for instance, to determine changes in the magnetization of individual bits in high-density magnetic storage media.
Qualitative image improvements accomplished by this technology are
notable:
Computer simulation enables testing of the diffraction images composed by the system’s algorithms, allowing both instrumentation effects and effects within the specimen to be modeled. This matters because it verifies that the specimen and its dynamics are accurately reflected in the algorithmic images.
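A minimal sketch of such a forward simulation may help; it assumes NumPy, and the function and variable names are illustrative rather than taken from any published code. Each position of the probe across the object produces one far-field diffraction pattern in which only the intensity survives:

```python
import numpy as np

def simulate_diffraction(obj, probe, positions):
    """Toy forward model of a ptychographic scan: at each scan position
    the coherent probe illuminates a patch of the object's complex
    transmission function, and the detector records only the far-field
    intensity of the resulting exit wave.
    """
    ph, pw = probe.shape
    patterns = []
    for r, c in positions:
        exit_wave = probe * obj[r:r + ph, c:c + pw]
        far_field = np.fft.fftshift(np.fft.fft2(exit_wave))
        patterns.append(np.abs(far_field) ** 2)  # phases are lost here
    return patterns

# Hypothetical usage: a 256x256 phase object, a 64x64 flat probe, and
# an overlapping grid of scan positions.
rng = np.random.default_rng(1)
obj = np.exp(1j * rng.random((256, 256)))
probe = np.ones((64, 64), dtype=complex)
positions = [(r, c) for r in range(0, 192, 32) for c in range(0, 192, 32)]
patterns = simulate_diffraction(obj, probe, positions)
```

Feeding such synthetic patterns back into the reconstruction algorithm allows one to check that both instrument effects and specimen effects are recovered correctly.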
3D images may be generated by repeated scans of effectively two-dimensional samples at different tilt angles. The PSI/TU Munich method renders high-resolution images of mixed states within the sample. These may include quantum mixtures or fast, stationary stochastic processes such as vibrations, switching, or steady flows, which can generally be described as low-rank mixed states. Such sample dynamics are often the very objective of an experiment.
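The low-rank mixed-state picture can be sketched in the same spirit (again assuming NumPy, with names of my own choosing, not the PSI/TU Munich reconstruction code): a process faster than a single exposure is modeled as an incoherent, weighted sum over a few modes of the specimen.

```python
import numpy as np

def mixed_state_intensity(probe, object_modes, weights):
    """Low-rank mixed-state forward model: dynamics faster than one
    exposure appear as an *incoherent* sum over a few object modes,
    each weighted by the fraction of the exposure it occupies.
    Intensities add; the underlying complex fields do not.
    """
    intensity = np.zeros(probe.shape)
    for mode, weight in zip(object_modes, weights):
        far_field = np.fft.fftshift(np.fft.fft2(probe * mode))
        intensity += weight * np.abs(far_field) ** 2
    return intensity
```

Because only a handful of modes carry significant weight, the mixture is of low rank, and in principle the reconstruction can recover the modes along with the static structure.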
[1] Nanotechnology – and I personally – owe Heinrich Rohrer an immense debt of gratitude. The passing in May 2013 of this disciple of Wolfgang Pauli and Paul Scherrer, and an IBM Fellow, was an immense loss. IBM’s Binnig and Rohrer Nanotechnology Center in Rüschlikon, Zurich, is named after both physicists.
[2] Walter Hoppe, “Towards three-dimensional ‘electron microscopy’ at atomic resolution,” Die Naturwissenschaften 61 (6) (1974), 239–249.