I was sitting in the back of the lecture theatre at the Microwave Photonics Conference in 2010, peering up front to see if there were any closer seats. No luck – the hall was full for the plenary talks, and Prof. Bahram Jalali took to the podium to give his talk on “Photonic time-stretch: From world’s fastest digitizer to the world’s fastest camera”.
It was confusing – this was a conference on the fusion of microwave engineering and photonics, with most of the shop talk revolving around linearity of photodetectors and optical links. Why was Prof. Jalali talking about cameras? The world’s fastest camera?
In a rush of hazy memories from my PhD reading list, I recalled a citation from IEEE Transactions on Microwave Theory and Techniques: “Photonic time stretch and its application to analog-to-digital conversion” by Coppinger, Bhushan and Jalali in 1999 (1). This early publication really demonstrated how optical phenomena could be used to leap over bottlenecks in other domains.
The trick was in the title itself – “Photonic time stretch”. The researchers would create a “chirped” pulse of light, where the colours in the pulse were separated in time by an effect called dispersion. It turns out this is extraordinarily simple to do – just pass a normal pulse of light through a long length of optical fiber, which is cheap and has very little loss, and your pulse’s reds and blues would start separating in time.
This chirped pulse might not be very long, but it was long enough to imprint a very fast electronic signal onto it, so that the beginning of the electronic signal sat on the leading colours of the pulse, and the end of the signal sat on the trailing colours.
It’s all a bit tricky to get right, but once it was working, the next part was magic: pass this modulated pulse through even more dispersion, spreading it out even further in time, pulling the spectral components further apart. The optical pulse has now been stretched out, but so has the electronic signal riding on it, allowing the happy user to digitize it with a slower, cheaper ADC.
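The bookkeeping behind this is simple to sketch. A minimal back-of-the-envelope model (my own illustration, with made-up fiber parameters, not the authors' numbers): dispersion turns optical bandwidth into temporal width as T = D · L · Δλ, so a second, longer fiber spool multiplies the duration of the signal, and the stretch factor tells you how much slower your ADC can be.

```python
# A minimal sketch (not the authors' code) of time-stretch bookkeeping.
# Hypothetical numbers: D is fiber dispersion in ps/(nm*km), L in km,
# and the pulse spans `bandwidth_nm` of optical bandwidth.

def chirp_duration_ps(D_ps_nm_km, length_km, bandwidth_nm):
    """Temporal width of a chirped pulse after dispersion: T = D * L * delta-lambda."""
    return D_ps_nm_km * length_km * bandwidth_nm

# First spool: creates the chirp the electronic signal is imprinted on.
T1 = chirp_duration_ps(D_ps_nm_km=17, length_km=5, bandwidth_nm=10)    # 850 ps
# Second spool: stretches the modulated pulse even further.
T2 = T1 + chirp_duration_ps(D_ps_nm_km=17, length_km=45, bandwidth_nm=10)

stretch_factor = T2 / T1              # here: (5 + 45) / 5 = 10x
adc_rate_gsps = 50 / stretch_factor   # a 50 GS/s signal now needs only 5 GS/s

print(stretch_factor)   # 10.0
print(adc_rate_gsps)    # 5.0
```

With matched fiber types, the stretch factor reduces to (L1 + L2) / L1, which is why simply adding cheap spooled fiber buys you a proportionally slower ADC.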
A nice bit of work in 1999, but a difficult pitch: arguing that a cheaper ADC is worth all this optical technology. In the original paper, the authors state that “a time-stretch preprocessor can revolutionize A/D conversion”, but I didn’t think much of it, nor did I buy into it.
That’s a great example of judging technology before it has a chance to hit its stride – eleven years later, I watched Bahram Jalali give a stunning talk about how time-stretching has allowed his research team to build extremely advanced cameras, able to capture dynamic events on an unprecedented scale.
From the letter in Nature (Goda et al. 2009, 2), the team was able to turn this whole concept on its head and come up with a device that used the phenomenon of dispersion to accomplish something remarkable.
In microscopy, 2D images are taken through an optical system, then detected by a CCD or CMOS camera. These cameras use an array of detectors to measure a 2D scene, but they are limited in refresh rate and in their ability to capture faint scenes. The most advanced CCD cameras operate at around a 1 MHz refresh rate, and need refrigeration to make that happen.
Goda et al. set out to overcome this limitation: they start with a narrow laser pulse, and use a 2D disperser to spread the pulse in two dimensions. If you think of the pulse as containing a whole range of colours, the 2D disperser pushes the red components to the upper left and the blue components to the lower right, with all the space in between covered by the middle colours. This bundle of colours is then projected onto an object, or through a microscope, to take a snapshot of a 2D scene.
When this image comes back into the camera, how do you measure these spread-out colours without a detector array? Just as in the 1999 experiment, they use optical fiber dispersion to pull these colours apart in time, which makes measuring each 2D ‘pixel’ as simple as using a single photodiode and an oscilloscope.
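The two mappings above can be sketched in a few lines. This is a toy model with invented parameters (a 4×4 grid and hypothetical dispersion figures, not the Nature paper's optics): the 2D disperser assigns each wavelength channel a pixel position, fiber dispersion assigns it an arrival time, and the single-photodiode trace is then sliced back up into an image.

```python
# Toy sketch of serial time-encoded imaging: wavelength -> (x, y) via the
# 2D disperser, wavelength -> arrival time via fiber dispersion, so one
# photodiode trace encodes a whole 2D frame. All parameters are hypothetical.

GRID = 4                      # pretend 4x4 image: 16 wavelength channels
D_PS_PER_NM = 1000            # assumed total fiber dispersion (ps per nm)
CHANNEL_SPACING_NM = 0.1      # assumed spacing between wavelength channels

def channel_to_pixel(ch):
    """The 2D disperser rasters the wavelengths across the scene row by row."""
    return ch % GRID, ch // GRID            # (x, y)

def channel_to_time_ps(ch):
    """Fiber dispersion turns wavelength offset into arrival-time offset."""
    return ch * CHANNEL_SPACING_NM * D_PS_PER_NM

# A single-photodiode 'trace': one sample per channel, in arrival order.
trace = [0.0] * (GRID * GRID)
trace[5] = 0.8                # a bright spot encoded on channel 5

# Reconstruct the frame: each time bin is one pixel.
image = [[0.0] * GRID for _ in range(GRID)]
for ch, value in enumerate(trace):
    x, y = channel_to_pixel(ch)
    image[y][x] = value

print(channel_to_time_ps(5))  # 500.0 ps after the first channel
print(image[1][1])            # 0.8 -- the bright spot lands at pixel (1, 1)
```

The point of the sketch is the serialization: because each colour arrives at its own time, the oscilloscope trace *is* the raster scan, with no detector array anywhere.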
This camera operates at 6.1 MHz, an incredible improvement over typical camera systems, and was used to capture extremely fast events. As an example, they captured images of a hole being created by a high-power laser, with frames captured roughly every 150 nanoseconds.
The media caught wind of this work under the headline “the world’s fastest camera”, and suddenly major outlets such as the BBC picked up the story. In most cases, this would be a one-off, forgotten after the initial rush of press releases and interviews.
However, I love how the core concept of chromatic dispersion in optical fiber led the way to ever more ambitious experiments, pushing the idea as far as it can go. In 2012, the world’s fastest camera made it into Time Magazine, used to detect cancer cells at a rate of 100,000 cells per second. That’s an application the general public loves to hear about: researchers solving problems, finding cancer – with lasers.
1 – Coppinger, Bhushan and Jalali, “Photonic time stretch and its application to analog-to-digital conversion”, IEEE Transactions on Microwave Theory and Techniques, 1999
2 – Goda et al., “Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena”, Nature, Apr. 2009 (http://www.nature.com/nature/journal/v458/n7242/full/nature07980.html)
Cibby Pulikkaseril has been a Photonics Society member since 2005. He completed his B.Sc. (EE) at the University of Alberta, Canada, an M.Eng (EE) at McGill University, Canada, and then a Ph.D. in microwave photonics at the University of Sydney, Australia. He is currently a technical project leader for novel optical instrumentation at Finisar Australia, Sydney.