MIT Develops 6G Photonic Processor for Nanosecond Signal Processing
Researchers at MIT have developed a new photonic processor aimed at accelerating 6G wireless applications. One of the primary challenges was integrating machine-learning computation into optical hardware. Davis, a lead researcher, explained: "Current machine-learning frameworks cannot be mapped directly onto our photonic processors, so we had to design hardware-aware computational structures, carefully engineering the device's physical response to implement specific computational functions."

In tests on signal classification, the photonic neural network achieved 85% accuracy on a single measurement. With repeated measurements, accuracy quickly climbed above 99%, and the MAFT-ONN (Modulated Amplitude Fourier Transform Optical Neural Network) completed the entire computation in only about 120 nanoseconds. "Measurement time trades off against accuracy: because the MAFT-ONN is so fast, the longer you measure, the higher the accuracy," Davis added.

Today's most advanced digital frequency devices perform machine-learning computations at microsecond speeds, whereas optical systems can operate at nanosecond or even picosecond speeds. This speed advantage makes the new photonic processor a promising candidate for future 6G wireless signal processing. Looking ahead, the researchers plan to use multiplexing to perform more complex computations and scale up the MAFT-ONN. They also hope to extend the system to deeper learning architectures, eventually running models such as Transformers or large language models (LLMs). Original link: MIT News
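The accuracy gain from repeated measurements can be illustrated with a simple simulation. The sketch below is not MIT's actual method; it assumes each analog measurement is an independent noisy binary classification that is correct about 85% of the time, and shows how a majority vote over repeated shots pushes accuracy above 99%. All names and parameters here are hypothetical.

```python
import random

def classify_once(true_label, noise=0.15):
    """One noisy inference: returns the true label with probability 1 - noise.
    (Illustrative stand-in for a single analog photonic measurement.)"""
    if random.random() < noise:
        return 1 - true_label  # flip to the wrong binary label
    return true_label

def classify_repeated(true_label, n_repeats, noise=0.15):
    """Majority vote over repeated measurements; averaging independent
    per-shot noise drives the error rate down as n_repeats grows."""
    votes = sum(classify_once(true_label, noise) for _ in range(n_repeats))
    return 1 if votes * 2 > n_repeats else 0  # odd n_repeats avoids ties

def accuracy(n_repeats, trials=5000, noise=0.15):
    """Empirical accuracy over many trials with the true label fixed to 1."""
    random.seed(0)
    correct = sum(classify_repeated(1, n_repeats, noise) == 1
                  for _ in range(trials))
    return correct / trials

single = accuracy(1)     # roughly 85% per single measurement
repeated = accuracy(21)  # majority vote over 21 shots exceeds 99%
```

Because each shot takes only nanoseconds on photonic hardware, even dozens of repeats keep total latency far below what microsecond-scale digital devices need for a single inference.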
