It seems there’s always a catch when optical processing enters the mix of systems that are traditionally digital or analog.
I haven’t been keeping up with the tech, but in the distant past I worked in a lab that developed novel methods of channelizing radio-frequency spectra for detecting and analyzing RF signals. One method used optical techniques.
Laser light was passed through an acousto-optic lithium niobate crystal (called a Bragg cell) modulated by wideband RF energy applied to the crystal via a transducer. Sent through a series of lenses, the diffracted light produced a channelized image that was the Fourier transform of the wideband input RF spectrum; essentially at the speed of light.
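To make that concrete, here's a minimal numerical sketch (Python/NumPy), not the original system: the Fourier-transform lens effectively takes the power spectrum of whatever RF is inside the cell's time aperture, and the linear CCD samples that pattern as a fixed set of frequency channels. Every parameter value below (aperture time, tone frequencies, pixel count) is made up purely for illustration.

    import numpy as np

    # Hypothetical numbers, for illustration only.
    fs = 2e9            # sample rate used to model the RF drive (Hz)
    aperture = 2e-6     # acoustic time aperture of the cell (s)
    t = np.arange(0, aperture, 1/fs)

    # Wideband input: two RF tones of different strengths plus noise.
    rf = (1.0 * np.cos(2*np.pi*300e6*t) +
          0.1 * np.cos(2*np.pi*750e6*t) +
          0.01 * np.random.randn(t.size))

    # The acoustic wave carries this signal across the laser beam; the
    # Fourier-transform lens yields an intensity pattern proportional to
    # the power spectrum of the windowed signal.
    window = np.hanning(t.size)     # stand-in for the illuminated aperture profile
    intensity = np.abs(np.fft.rfft(rf * window))**2

    # A linear CCD samples that pattern with a finite number of pixels,
    # i.e. a fixed number of frequency channels.
    n_pixels = 512
    edges = np.linspace(0, intensity.size, n_pixels + 1).astype(int)
    ccd = np.array([intensity[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])

    freqs = np.linspace(0, fs/2, n_pixels)
    print("strongest channel near", freqs[np.argmax(ccd)]/1e6, "MHz")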
We got the physical size down to that of a shoebox, containing the optical bench, HeNe laser, series of lenses, and target linear CCD array. Over time, for a different project with different specs, that was reduced to a small ceramic substrate in a micro-min package using a laser diode for stimulus.
While speed-of-light processing sounds impressive, the catch in this case was imaging the channelized light onto a linear CCD array and then reading that information out with sufficient dynamic range (not great, due to photon/dark noise), digitizing it, and then doing something with it, all of which happened at more pedestrian rates.
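For a feel of why the readout limited things, here's a back-of-the-envelope dynamic-range estimate; the detector numbers are hypothetical, not those of the actual hardware. The usable range is roughly the full-well capacity over the combined read and dark noise.

    import math

    # Hypothetical detector parameters, just to show the arithmetic.
    full_well  = 100_000   # electrons a pixel can hold before saturating
    read_noise = 50        # electrons rms (older CCD readout)
    dark_rate  = 1_000     # dark-current electrons per second per pixel
    t_int      = 0.01      # integration time (s)

    dark_e  = dark_rate * t_int
    noise_e = math.sqrt(read_noise**2 + dark_e)   # dark shot noise ~ sqrt(dark electrons)
    dr_db   = 20 * math.log10(full_well / noise_e)
    print(f"usable dynamic range ~ {dr_db:.0f} dB")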
Today, with far less noisy CMOS image sensors, the dynamic range would be far greater and thus more useful.
But… digital signal processing in custom silicon, particularly the fast multipliers needed for FFT spectral analysis, pretty much makes that moot. Indeed, when I left that company there was a group in the process of developing in-house custom digital FFT and correlator chips for processing wideband RF spectra with far greater dynamic range. And optical signal processing (at least at that time) was no longer interesting.