Centauri Dreams reports on what may be a big step towards one of the greatest astronomical instrument breakthroughs since the invention of the telescope. This could be a genuine advance in interferometry.
The longest optical telescope baselines today are 437 meters. The researchers are proposing changes that would allow 7-kilometer baselines. Longer baselines (placing the telescopes farther apart) enable higher resolution, provided the telescopes can still work together.
An interferometer essentially combines the light of several different telescopes, all in the same phase, so it adds together “constructively”, or coherently, to create an image via a rather complex mathematical process called a Fourier transform (no need to go into detail, but suffice to say it works). We wind up with detail, or angular resolution, equivalent to that of a telescope spanning the distance between the two telescopes. In other words, it’s like having a single telescope with an aperture equal to the distance, or “baseline”, between the two. Combining several telescopes creates more baselines, which in effect fill in more detail in the virtual single telescope’s “diluted aperture”. The equation for the number of baselines is n(n − 1)/2, where n is the number of telescopes. If you have 30 telescopes, this gives an impressive 435 baselines, with angular resolution orders of magnitude beyond the biggest single telescope. So far so easy? Wrong.
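The baseline count is just the number of unique telescope pairs. A quick sketch of the arithmetic:

```python
def baseline_count(n_telescopes: int) -> int:
    """Number of unique telescope pairs (baselines): n(n - 1)/2."""
    return n_telescopes * (n_telescopes - 1) // 2

print(baseline_count(2))   # 2 telescopes -> 1 baseline
print(baseline_count(30))  # 30 telescopes -> 435 baselines
```

Each pair contributes one baseline, so the count grows roughly with the square of the number of telescopes.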
The problem is the coherent mixing of the individual wavelengths of light. It must be accurate to a tiny fraction of a wavelength, which for optical light means a few billionths of a metre. Worse still, how do you arrange for the light, each signal at a slightly different phase, to be mixed from telescopes a large distance apart?
The Advance in Interferometry
What the researchers are advocating is heterodyne interferometry, an old-fashioned idea, again like interferometry itself. Basically, it involves generating an electrical signal as close in frequency as possible to the light entering the telescope, then mixing it with the incoming light to produce an “intermediate frequency” signal. This signal still holds the phase information of the incoming light, but in a stable electrical proxy that can be converted back to the original source light and mixed with the signals from the other telescopes in the interferometer to create an image. This avoids most of the complex, light-losing “optical train”.
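The principle can be sketched numerically. This toy example uses made-up audio-range frequencies rather than optical ones (the real system mixes starlight with a laser frequency comb), but it shows the key point: multiplying the incoming signal by a local oscillator produces a low “intermediate” frequency that carries the original phase.

```python
import numpy as np

# Toy sketch of heterodyne down-conversion. Frequencies are illustrative
# (Hz, not optical THz); the principle is the same.
f_sig, f_lo = 1000.0, 990.0      # incoming signal and local-oscillator frequencies
phase = 0.7                      # phase carried by the incoming light (rad)
dt = 1e-4                        # 10 kHz sampling
t = np.arange(0, 1.0, dt)

signal = np.cos(2 * np.pi * f_sig * t + phase)
lo = np.cos(2 * np.pi * f_lo * t)
mixed = signal * lo              # mixer output: sum + difference frequencies

# The difference ("intermediate") frequency survives a low-pass filter.
# Here we just pick the strongest component below 100 Hz in the spectrum.
spectrum = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(t.size, d=dt)
low = freqs < 100.0
peak = np.argmax(np.abs(spectrum[low]))
f_if = freqs[low][peak]                   # intermediate frequency = f_sig - f_lo
phase_if = np.angle(spectrum[low][peak])  # original phase, preserved
print(f_if, round(phase_if, 2))  # 10.0 0.7
```

The 10 Hz beat note is a faithful, much-slower copy of the incoming signal’s phase, which is exactly what makes the signal easy to transport and combine electronically.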
Arxiv – A Dispersed Heterodyne Design for the Planet Formation Imager
The Planet Formation Imager (PFI) is a future world facility that will image the process of planetary formation. It will have an angular resolution and sensitivity sufficient to resolve sub-Hill sphere structures around newly formed giant planets orbiting solar-type stars in nearby star formation regions. We present one concept for this design consisting of twenty-seven or more 4m telescopes with kilometric baselines feeding a mid-infrared spectrograph where starlight is mixed with a frequency-comb laser. Fringe tracking will be undertaken in H-band using a fiber-fed direct detection interferometer, meaning that all beam transport is done by communications band fibers. Although heterodyne interferometry typically has lower signal-to-noise than direct detection interferometry, it has an advantage for imaging fields of view with many resolution elements, because the signal in direct detection has to be split many ways while the signal in heterodyne interferometry can be amplified prior to combining every baseline pair. We compare the performance and cost envelope of this design to a comparable direct-detection design.
Extend the heterodyne concept to eliminate beam-combiner and delay-line losses, and the loss of light approaches that of a radio interferometer. Imagine what could be seen with optical baselines as long as 7 kilometers in a 30-telescope array configuration.
An example 30-telescope array configuration, with maximum baseline 7 km, minimum baseline 82 m and excellent imaging and baseline-bootstrapping properties.
A heterodyne design is competitive with direct detection designs for detecting this thermal radiation and should be considered as one of the options for PFI. This design has a low-order adaptive optics system creating a collimated beam which is mixed with a laser frequency comb. This mixed signal is then fed into a spectrograph that disperses it onto a linear array of individual detectors, ideally one for each polarisation. We begin by describing the overall interferometer architecture and sensitivity, then move on to plausible designs for the individual components.
At a central N-band wavelength of 11.5 µm, the angular resolution requirement (taken to be λ/B) corresponds to maximum baseline of 7 km. Allowing for some super-resolution might enable baselines to be reduced to 3 km, but not without a reduction in sensitivity due to possible contamination by the disk emission as discussed above. The imaging requirement for PFI is essential because complex structures in the disk could mask planetary signals if ambiguous modelling of interferometric data are needed. The minimum number of baselines is roughly the number of resolution elements across the final image – 400 in our case. This drives PFI to consider a large number of telescopes – of order 30 or more, noting that the number of baselines NB = NT (NT − 1)/2 with NT the number of telescopes. This number of telescopes is inadequate for snapshot imaging, and requires earth rotation synthesis to get the full ∼400 x 400 resolution element image. Snapshot imaging fidelity is arguably not required, as the smallest timescale required to be resolved is ∼ half the rotation period of the proto-Jupiter, or a few hours. Given the need for ground-based observations sensitive to atmospheric conditions to avoid high air-masses, it is also important that earth rotation synthesis can be achieved in only a few hours of observing and not a full night. Taken together, these requirements mean that a 3-arm spiral or a Y-shaped array with a format similar to the VLA is close to optimal.
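The λ/B resolution figure quoted above is easy to check. At the 11.5 µm central wavelength with a 7 km maximum baseline:

```python
import math

wavelength = 11.5e-6   # central N-band wavelength (m)
baseline = 7000.0      # maximum baseline (m)

theta = wavelength / baseline       # angular resolution lambda/B, in radians
mas = math.degrees(theta) * 3.6e6   # radians -> degrees -> milliarcseconds
print(f"{mas:.2f} mas")  # 0.34 mas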
The largest clearly identifiable cost for the array is the telescopes themselves. Thirty telescopes of 4 meter diameter are comparable to the Giant Magellan Telescope (GMT) in total collecting area. Given the scaling laws of van Belle et al.,5 one might expect the cost of the telescope component for many 4 m telescopes to be smaller by a factor of ∼2 than GMT, especially given that the telescopes only have to be designed once and constructed many times. In any case, as long as the ratio between instrumentation and telescope costs is not so different for PFI as for extremely large telescopes, a total construction cost in the range $500 million to $1 billion USD appears plausible. The lower limit on the cost envelope for PFI is the moving mass (structural steel), glass (area) and building (high-end commercial space) costs, which is of order $0.1 billion USD – this is of course unrealistically small.
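The collecting-area comparison is simple arithmetic. This back-of-envelope check assumes GMT’s seven 8.4 m mirror segments and ignores obscuration and segment gaps:

```python
import math

# Back-of-envelope collecting-area comparison (ignores central obscuration
# and the off-axis shape of GMT's outer segments).
pfi_area = 30 * math.pi * (4.0 / 2) ** 2   # thirty 4 m telescopes
gmt_area = 7 * math.pi * (8.4 / 2) ** 2    # GMT: seven 8.4 m mirror segments
print(round(pfi_area), round(gmt_area))    # ~377 m^2 vs ~388 m^2
```

The two totals are within a few percent of each other, which is why the GMT budget is a natural cost yardstick for PFI.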
They have demonstrated that heterodyne interferometry is a plausible competitor to direct detection interferometry for the Planet Formation Imager concept. Critical to the design is mixing starlight with a frequency comb laser in each telescope, dispersing the light in a high-resolution spectrograph and detecting the light mixed with each line of the frequency comb individually. Fringe tracking using direct detection interferometry is still required, but this is significantly simplified by transporting the beams from the telescopes to the delay lines by fiber. The overall cost of the project is likely comparable to the low-end of ELT budgets.
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.