
Review

Nanxi Li*, Chong Pei Ho, I-Ting Wang, Prakash Pitchappa, Yuan Hsing Fu, Yao Zhu and Lennon Yao Ting Lee

Spectral imaging and spectral LIDAR systems: moving toward compact nanophotonics-based sensing

https://doi.org/10.1515/nanoph-2020-0625

Received November 25, 2020; accepted January 16, 2021; published online February 12, 2021

Abstract: With the emerging trend of big data and internet-of-things, sensors with compact size, low cost and robust performance are highly desirable. Spectral imaging and spectral LIDAR systems enable measurement of spectral and 3D information of the ambient environment. These systems have been widely applied in different areas including environmental monitoring, autonomous driving, biomedical imaging, biometric identification, archaeology and art conservation. In this review, modern applications of state-of-the-art spectral imaging and spectral LIDAR systems in the past decade are summarized and presented. Furthermore, the progress in the development of compact spectral imaging and LIDAR sensing systems based on nanophotonics technology is reviewed, covering the latest research works on subwavelength-scale nanostructure-based functional devices for spectral imaging and on optical frequency comb-based LIDAR sensing. These compact systems will drive the translation of spectral imaging and LIDAR sensing from table-top toward portable solutions for consumer electronics applications. In addition, future perspectives on nanophotonics-based spectral imaging and LIDAR sensing are also presented.

Keywords: LIDAR; nanophotonics; sensor; spectral imaging.

1 Introduction

Optical imaging and sensing systems are key components in industrial automation and consumer electronics. The wide distribution of these sensing systems enables data generation to meet the emerging global trend of big data and internet-of-things [1]. In order to sense the spectral information of an object, spectral imaging technology has been developed and widely applied. A spectral imager collects 2-dimensional (2D) images of the object at different wavelengths and forms an imaging stack. The stack therefore constitutes a data cube, and each pixel location holds the spectral information of the corresponding point on the object. From this spectral information, the material or chemical composition of the object can be determined. Depending on the number of spectral bands within the stack or data cube, spectral imaging can be subcategorized into multispectral and hyperspectral imaging, which typically contain 3–10 bands and dozens to hundreds of bands, respectively [2]. Spectral imaging technology, which is able to obtain both spatial and spectral information, was originally applied in Earth remote sensing [3]. Currently, it is widely utilized in both remote and indoor sensing, ranging from Earth observation and geo-information studies [4–7] to optical sorting and pharmaceutical analysis [8, 9].
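The stack-of-images picture above can be made concrete with a short sketch. The cube dimensions, band centers and random frames below are purely illustrative stand-ins for real imager output:

```python
import numpy as np

# Hypothetical multispectral stack: 2D frames taken at 5 wavelengths.
height, width = 64, 64
wavelengths_nm = [450, 550, 650, 850, 1550]  # illustrative band centers

# A real imager would supply these frames; here they are synthetic.
frames = [np.random.rand(height, width) for _ in wavelengths_nm]

# Stacking the per-wavelength frames along a third axis forms the data cube.
cube = np.stack(frames, axis=-1)   # shape: (height, width, n_bands)

# At any pixel (row, col), the cube holds that pixel's spectrum:
row, col = 10, 20
spectrum = cube[row, col, :]       # one intensity value per band

print(cube.shape)                  # (64, 64, 5)
```

A multispectral cube of this kind has a handful of bands along the last axis; a hyperspectral cube is the same structure with dozens to hundreds of bands.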

In addition to spectral information, light detection and ranging (LIDAR) technology provides an effective solution for sensing the 3-dimensional (3D) information of an object. A LIDAR system primarily consists of a light source and a detector. By tracking the signal reflected from an object in the ambient environment, the location and velocity of the object can be obtained. The location information can then be used to reconstruct a 3D image of the object. LIDAR technology has been widely used in advanced driver-assistance systems (ADAS), autonomous driving and 3D sensing, and has become the eyes of robots and cars for sensing the ambient environment. LIDAR technology has also been combined with the aforementioned spectral imaging technology to realize spectral

*Corresponding author: Nanxi Li, Institute of Microelectronics, A*STAR (Agency for Science, Technology and Research), 2 Fusionopolis Way, Singapore 138634, Singapore. E-mail: [email protected]. https://orcid.org/0000-0002-0524-0949
Chong Pei Ho, I-Ting Wang, Prakash Pitchappa, Yuan Hsing Fu, Yao Zhu and Lennon Yao Ting Lee, Institute of Microelectronics, A*STAR (Agency for Science, Technology and Research), 2 Fusionopolis Way, Singapore 138634, Singapore. https://orcid.org/0000-0002-7691-0196 (Y.H. Fu)

Nanophotonics 2021; 10(5): 1437–1467

Open Access. © 2021 Nanxi Li et al., published by De Gruyter. This work is licensed under the Creative Commons Attribution 4.0 International License.


LIDAR sensing systems [10–13]. Such a system can be used to determine the shape as well as the material composition of objects, as different materials have unique reflectance spectra. For example, the spectral reflectances of various plant species [14], gravel grain sizes [15] and asphalt surfaces [16] differ, and these targets can hence be distinguished by a multispectral imaging system.
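The ranging principle underlying these systems is simple to state: a pulse is emitted, the echo is timed, and the distance is half the round-trip time multiplied by the speed of light. A minimal pulsed time-of-flight sketch follows; the numbers are illustrative, and real LIDAR architectures vary (e.g., frequency-modulated continuous-wave systems measure range differently):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_tof(round_trip_s: float) -> float:
    """Target distance from the measured pulse round-trip time: d = c * t / 2."""
    return C * round_trip_s / 2.0

def radial_velocity(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Radial velocity estimated from two successive range measurements."""
    return (d2_m - d1_m) / dt_s

# An echo arriving 1 microsecond after emission puts the target ~150 m away.
d = range_from_tof(1e-6)
print(round(d, 1))  # 149.9
```

Repeating the measurement and differencing successive ranges, as in `radial_velocity`, is how the velocity information mentioned above can be recovered from location data alone.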

Modern applications of spectral imaging and spectral LIDAR systems include environmental monitoring [3, 10, 11, 17], autonomous driving [18–20], biomedical imaging [2, 21, 22], biometric identification [23, 24], and archaeology and art conservation [25, 26], as illustrated in the left panel of Figure 1. These applications are enabled by the current state-of-the-art spectral imaging and spectral LIDAR systems. There is also a growing trend to make these systems more compact, lighter and less power-hungry. Nanophotonics technology, with the capability to provide chip-scale high-performance functional devices, has been exploited to meet this emerging trend [27–29]. Comprehensive reviews on spectral imaging technologies and their

applications have been reported before [2, 22, 25, 30–32]. However, a progress report on nanophotonics-based spectral imaging and LIDAR sensing systems is lacking. In this review, we summarize recent research works on spectral imaging and spectral LIDAR systems, including nanophotonics-based sensing systems. The modern applications of the current state-of-the-art spectral imaging and spectral LIDAR systems are presented in Section 2, together with a summary table categorizing the research works of the past decade by application, sensing mechanism, sensor type and working wavelength. Following that, in Section 3, recent developments in nanophotonics-based spectral imaging and LIDAR sensing systems are reviewed, with a summary table organized by nanostructured material, sensing mechanism, application and wavelength. Finally, Section 4 provides a summary of the review and an outlook on future research directions in spectral imaging and LIDAR sensing systems. The overall content is illustrated in Figure 1.

Figure 1: Overview of spectral imaging and spectral LIDAR systems, applications and future outlook.
Left panel: Modern applications chart of the state-of-the-art spectral imaging and spectral LIDAR sensing systems. Inset images: (top left) 2-dimensional (2D) multispectral images of an urban area, adapted with permission from the study by Morsy et al. [11]. Licensed under a Creative Commons Attribution. (top and bottom right) A point cloud captured by a line-scanning LIDAR system and a schematic of the LIDAR measurement setup, both adapted from the study by Taher [18] with permission. Copyright Josef Taher, Finnish Geospatial Research Institute FGI. (bottom left and middle) Schematic of a multispectral facial recognition system setup and light source, both adapted with permission from the study by Steiner et al. [23]. Licensed under a Creative Commons Attribution. Middle panel: Nanophotonics-based sensing systems. Inset images: (top and middle) Scanning electron microscopy (SEM) images of the fabricated color filters and an optical image of color filters integrated with a detector array, both adapted with permission from the study by Shah et al. [33]. Licensed under a Creative Commons Attribution. (bottom) Schematic of a dual-comb-based LIDAR system, adapted from the study by Trocha et al. [34]. Reprinted with permission from AAAS. Right panel: Outlook of future development work for compact spectral imaging and LIDAR sensing systems.

1438 N. Li et al.: Spectral imaging and spectral LIDAR systems


2 Modern applications of the state-of-the-art spectral imaging and spectral LIDAR sensing systems

In this section, the modern applications using the state-of-the-art spectral imaging and spectral LIDAR systems are reviewed and presented. The following subsections are categorized based on the main application areas. Subsections 2.1 and 2.2 focus on remote or outdoor sensing, while Subsections 2.3–2.5 cover close-range or indoor sensing. All the research works reviewed in this section were published in the past 10 years; they are summarized in Table 1.

2.1 Environment monitoring

Environment monitoring was the first application area to adopt spectral imaging solutions [3]. Over the past decade, with the advancement and wide application of LIDAR systems, multispectral LIDAR technology has been implemented for environment monitoring as well. For example, in the study by Hopkinson et al. [10], an airborne LIDAR system (Teledyne Optech) is implemented for the characterization and classification of forest environments. In addition to the conventional 1064 nm single-wavelength LIDAR system, the 1550 and 532 nm wavelengths are also used for multispectral LIDAR sensing. Such a sensing system provides improvements in land surface classification and vertical foliage partitioning. Furthermore, multispectral LIDAR has also been used for urban area classification, as reported in the studies by Morsy et al. and Huo et al. [11, 12]. In these reports, commercially available multispectral LIDAR sensors from Teledyne Optech and RIEGL Laser Measurement Systems, covering from the visible (532 nm) to the short wavelength infrared (SWIR) (1550 nm), are employed to generate multispectral LIDAR data. Different approaches are applied to classify areas (e.g., grass, roads, trees and buildings) within the urban area. In the study by Morsy et al. [11], normalized difference vegetation index (NDVI) computation is conducted for point-based classification of the multispectral LIDAR data. In Figure 2(a), the left and right panels show the 2D and 3D views of the classified LIDAR points, respectively. The figures are based on NDVI computation using the recorded intensities at the 532 and 1064 nm wavelengths, which gives an overall accuracy of 92.7%.
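The point-based classification step can be illustrated with a small sketch. The normalized-difference form below is the standard NDVI definition applied to the 1064 nm (NIR) and 532 nm (green) channel intensities; the sample intensities and the 0.4 decision threshold are hypothetical, not values from [11]:

```python
import numpy as np

def ndvi(nir, vis):
    """Normalized difference vegetation index from two intensity channels."""
    nir = np.asarray(nir, dtype=float)
    vis = np.asarray(vis, dtype=float)
    return (nir - vis) / (nir + vis)

# Hypothetical recorded intensities at 1064 nm (NIR) and 532 nm (green)
# for four LIDAR points.
i_1064 = np.array([0.80, 0.30, 0.75, 0.10])
i_532  = np.array([0.20, 0.25, 0.15, 0.30])
scores = ndvi(i_1064, i_532)

# Vegetation reflects strongly in the NIR, so a high NDVI suggests grass/trees.
labels = np.where(scores > 0.4, "vegetation", "non-vegetation")
print(scores.round(2).tolist())  # [0.6, 0.09, 0.67, -0.5]
print(labels.tolist())
```

In the actual study, per-point labels of this kind were further grouped into classes such as grass, roads, trees and buildings.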

As an alternative to the multispectral LIDAR approach, in the study by Jurado et al. [35], a more cost-effective method, photogrammetry, is used to construct 3D images of olive trees. A high-resolution camera is mounted on an unmanned aerial vehicle (UAV) to take multispectral images, which are then reconstructed into 3D images. The multispectral images and RGB point clouds are fused to study an olive orchard. The methodology is illustrated in the scheme shown in Figure 2(b). It starts with the 3D reconstruction of both the RGB and multispectral images as the first step. Following that, reflectance maps are generated from the multispectral images (step two). These reflectance maps are used to enrich the 3D reconstructed images after an alignment process, as shown in the third and fourth steps. After that, each olive tree is segmented for morphological information extraction and temporal analysis. In addition to the airborne sensors mentioned above, spaceborne sensors have also recently been implemented for multispectral sensing. In the study by Torres et al. [36], sensors mounted on a satellite capture multispectral images covering the visible to mid-infrared (MIR) wavelength range for earthquake vulnerability estimation.

One more point worth mentioning is that multispectral images of the environment can also be used for military and mineral mapping purposes. In the military, the spectral imaging system provides information on the 3D land cover of the battlefield [44], and the spectral information facilitates the detection of targets in varying degrees of camouflage [45]. In mineral mapping, the spectral information enables identification of various mineral materials from airborne hyperspectral images [5–7].

2.2 Autonomous driving

Currently, LIDAR systems are widely used for autonomous driving. Most commercial LIDAR systems for autonomous driving are based on a single wavelength, which may not be as reliable as multispectral systems, since the environment can sometimes absorb strongly at that single working wavelength. Also, many machine learning methods provide more accurate predictions when the input data is consistent, without variations [18]. Multispectral LIDAR, which is based on the reflection from the object surface at several wavelengths including the IR, will not show large variations under different ambient conditions such as



Table 1: Summary of the current state-of-the-art spectral imaging and spectral LIDAR sensing systems for modern applications. (Several numeric values and reference/year entries were lost in extraction and are left blank.)

| Application | Sensing mechanism | Sensor | Wavelength | Reference/year |
|---|---|---|---|---|
| Environment monitoring (forest) | Multispectral LIDAR | Aquarius ( nm), Gemini ( nm), Orion C ( nm), Titan (532, 1064 and 1550 nm) | 532, 1064 and 1550 nm | [10]/ |
| Environment monitoring (urban area classification) | Multispectral LIDAR | Optech Titan | Channel 1 = 1550 nm; Channel 2 = 1064 nm; Channel 3 = 532 nm | [11]/, [12]/ |
| Environment monitoring (precision agriculture) | Reconstruction of 3D model from multispectral and RGB images | Multispectral: Parrot Sequoia ( × ); RGB: Sony Alpha 7R III ( megapixels) | Multispectral: green ( – nm), red ( – nm), red-edge ( – nm), near infrared (NIR) ( – nm) | [35]/ |
| Environment monitoring (earthquake vulnerability estimation) | Multispectral imaging | Landsat 8 Operational Land Imager (OLI) and Landsat 8 Thermal Infrared Sensor (TIRS) | Visible, NIR, SWIR and MIR | [36]/ |
| Environment monitoring (aquatic ecosystem) | Hyperspectral LIDAR | Scheimpflug LIDAR with 2D array charge-coupled device (CCD) detector | – nm | []/ |
| Autonomous driving (asphalt road, gravel road, highway, parking lot prediction) | Multispectral LIDAR | RIEGL VUX-1HA (1550 nm LIDAR), RIEGL miniVUX-1UAV (905 nm LIDAR) | 905 and 1550 nm | [18]/ |
| Autonomous driving (object detection in traffic scenes, e.g., bike, car) | Multispectral imaging | RGB, NIR, MIR and far infrared (FIR) cameras | Visible, NIR, MIR, FIR | [19]/ |
| Autonomous driving (object detection, drivable region detection, depth estimation) | Multispectral imaging and single-wavelength LIDAR | RGB/thermal camera; RGB stereo; LIDAR | Visible, long-wavelength infrared (LWIR) | [20]/ |
| Biomedical imaging (brain tumor delineation) | Multispectral optoacoustic tomography imaging | Multispectral optoacoustic tomography scanner inVision 256-TF, iThera Medical GmbH | NIR: – nm | [37]/ |
| Biomedical imaging (Alzheimer's disease visualization) | Multispectral optoacoustic tomography imaging | Multispectral optoacoustic tomography scanner inVision -echo system, iThera Medical GmbH | NIR: – nm | [38]/ |
| Biomedical imaging (liver-tumor inspection) | Multispectral fluorescence imaging | Visible and NIR-I: Camware , PCO AG; NIR-II: LightField , Teledyne Princeton Instruments | Visible; NIR-I: 700–900 nm; NIR-II: 1000–1700 nm | [21]/ |
| Biometric identification (skin detection and facial recognition) | Multispectral imaging | Light-emitting diode (LED) light source; SWIR camera with InGaAs sensor | SWIR (935, 1060, 1300 and 1550 nm) | [23]/ |
| Biometric identification (facial recognition) | Multispectral imaging | Two quartz tungsten halogen lamps as light source; complementary metal-oxide-semiconductor (CMOS) camera | Visible to NIR (nine bands, 530–1000 nm) | [24]/ |
| Biometric identification (iris recognition) | Multispectral imaging | LED array, SONY CCD camera | NIR (700, 780 and 850 nm) | [39]/ |
| Biometric identification (fingerprint recognition) | Multispectral imaging | LED light source, InGaAs sensor | SWIR ( , , , and nm) | [40]/ |
| Biometric identification (palm recognition) | Multispectral imaging | LED light source, CCD camera | Visible ( , , and nm) and NIR ( nm) | []/ |
| Archaeology and art conservation (painting analysis) | Hyperspectral imaging | SPECIM hyperspectral (HS-XX-VE) CCD camera | – nm | []/ |
| Archaeology and art conservation (Islamic paper characterization) | Hyperspectral imaging | GILDEN Photonics hyperspectral imaging scanner | – nm | []/ |
| Archaeology and art conservation | Hyperspectral imaging | SPECIM IQ hyperspectral camera | – nm | []/ |



illumination. In addition, multispectral sensing systems can also provide material information through the spectral fingerprints of different materials. A typical multispectral LIDAR road image is shown in Figure 3(a), captured at the 1550 and 905 nm wavelengths using a RIEGL VUX-1HA and a RIEGL miniVUX-1UAV line-scanning LIDAR, respectively [18]. Within the image, the two-lane road can be clearly seen, with details including road markers, road shoulders and trees around the road.

The schematic of the multispectral imaging setup is shown in Figure 3(b). The multispectral LIDAR system, together with an imaging system, is mounted on top of a vehicle. The multispectral road data are collected for road-area semantic segmentation. Different road areas, including asphalt road, gravel road, highway and parking lot, can be correctly predicted. The predicted areas are overlaid on top of the multispectral LIDAR images and compared with the ground truth, as shown in Figure 3(c).

Additionally, a thermal camera is used as a secondary vision sensor in the study by Choi et al. [20], bringing the advantage of capturing road images regardless of the daylight illumination condition. The integration of a 3D LIDAR (Velodyne HDL-32E) and a GPS/IMU (global positioning system/inertial measurement unit) into the same sensor system enables capture of depth and location information, respectively. Furthermore, in the study by Takumi et al. [19], multispectral images covering the visible, near infrared (NIR), MIR and far infrared (FIR) wavelength ranges are collected. These images are used for object detection in traffic scenes, including bikes, cars and pedestrians. It has been found that

Figure 2: Airborne multispectral environment sensing and monitoring.
(a) 2D and 3D multispectral images for urban area classification. These images are based on normalized difference vegetation index (NDVI) computation using the recorded intensities at the 532 and 1064 nm wavelengths. (b) RGB and multispectral images of an olive orchard and the imaging process flow for olive tree analysis. (a) and (b) are adapted with permission from the studies by Morsy et al. [11] and Jurado et al. [35], respectively. Both are licensed under a Creative Commons Attribution.



the images in different spectral ranges are suitable for detecting different classes of objects. Hence, the advantage of multispectral imaging for diversified object detection has been demonstrated in that work.

2.3 Biomedical spectral imaging and sensing

As mentioned in Section 1, a spectral imaging system can capture an image stack at different wavelengths and obtain the spectral information, including reflectance and transmittance, for each pixel in the image from the data cube. Such information can be used to monitor changes in biosamples that cannot be observed using traditional gray-scale or RGB imaging techniques [2]. The principle is based on the spectral signatures of different biosamples, which originate from the interaction between the multiwavelength electromagnetic waves and the

biomolecules. In order to obtain the spectral information from biosamples, four main scanning approaches are used: whiskbroom (spatial scan on both axes), pushbroom (spatial scan on one axis), staring (spectral scan) and snapshot (no scan). The study by Li et al. [2] presents a good summary and comparison of the advantages and disadvantages of these scanning approaches.
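Among these, the pushbroom approach is perhaps the easiest to picture in code: each exposure returns one spatial line resolved across all spectral bands, and scanning along the second spatial axis accumulates the full cube. A synthetic sketch, in which the `read_line` readout and all dimensions are stand-ins for a real sensor:

```python
import numpy as np

# Pushbroom acquisition: one exposure = one spatial line x all spectral bands,
# i.e. a 2D frame of shape (width, n_bands). Scanning builds the data cube.
width, n_bands, n_lines = 32, 8, 16

def read_line(i):
    # Stand-in for the sensor readout of scan line i (synthetic data).
    rng = np.random.default_rng(i)
    return rng.random((width, n_bands))

# Accumulate the scanned lines into the cube.
cube = np.stack([read_line(i) for i in range(n_lines)], axis=0)
print(cube.shape)  # (16, 32, 8): (scan axis, line axis, spectral axis)
```

A whiskbroom sensor would instead read a single pixel's spectrum per exposure (scanning both spatial axes), while a snapshot sensor captures the whole cube in one exposure.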

In medical applications, spectral imaging is mainly used for disease diagnosis and surgical guidance [22]. For disease diagnosis, multispectral optoacoustic tomography (MSOT) is an emerging technology, enabled by the development of NIR high-speed tunable lasers [46]. It provides in-depth high-resolution imaging and spectral information of tissue molecules. A recent example reported in the study by Neuschmelting et al. [37] uses MSOT for brain tumor delineation. The stability of the nanostar contrast agent is also confirmed through the MSOT spectra in the NIR wavelength regime. Besides brain tumors, MSOT has also recently been applied to the visualization of Alzheimer's disease in

Figure 3: Multispectral LIDAR system applied for autonomous driving.
(a) A point cloud captured by 1550 and 905 nm line-scanning LIDAR systems RIEGL VUX-1HA and RIEGL miniVUX-1UAV, respectively. (b) Schematic of the LIDAR measurement setup, showing a multispectral LIDAR system, a visible camera, and a GPS mounted on top of the vehicle for data collection. (c) Road area prediction examples in comparison with the ground truth. (a)–(c) are adapted from the study by Taher [18] with permission. Copyright Josef Taher, Finnish Geospatial Research Institute FGI.



the mouse brain [38] and pathophysiological progression [47]. Furthermore, a hyperspectral endoscopy system has been developed [48] that enables image distortion compensation for flexible endoscopy in clinical use. High spatial and spectral resolutions have been achieved under free-hand motion.

For surgical guidance, a recent example is reported in the study by Hu et al. [21]. In this work, multispectral fluorescence imaging in the visible, NIR-I (700–900 nm) and NIR-II (1000–1700 nm) wavelength ranges is used to monitor in-human liver-tumor surgery. The work shows that the NIR-II wavelength regime provides tumor detection with higher sensitivity, signal distinction ratio and detection rate compared with the traditional NIR-I regime. Also, with the development of artificial intelligence and machine learning, these technologies have recently been applied in hyperspectral imaging systems for precise cancer tumor detection during surgical operations [49, 50]. For example, in the study by Fabelo et al. [49], a classification method for a hyperspectral imaging system has been developed to accurately determine the boundaries of a brain tumor during surgery, helping the surgeon avoid excising normal brain tissue or unintentionally leaving residual tumor.

2.4 Biometric sensor systems

Biometric sensors have drawn increased attention due to their wide applications, from homeland security to consumer electronics. A multispectral biometric system enables the capture of biometric data under different illumination levels, with antispoofing functionality and resistance to weather and environmental changes. These biometric data are typically taken from the face, finger, palm or iris, followed by pattern recognition on 2D images captured in different spectral bands.

Within a multispectral facial recognition system, the wavelength range plays a significant role. Figure 4(a) shows the remission spectra of different skin types and spoofing mask materials from the visible to the NIR wavelength regime. It can be observed that human skin has relatively low remission at wavelengths beyond the visible range [23]. Hence, such a wavelength range can be used to sense and distinguish skin from other materials for presentation attack detection. Also, different skin colors have very similar remission across the wavelengths between 900 and 1600 nm. Therefore, facial recognition operating in this wavelength range is not affected by skin tone. Based on the remission (or reflection) data mentioned

earlier, in the study by Steiner et al. [23], multispectral imaging in the SWIR has been applied for face recognition and skin authentication. The images at wavelengths of 935, 1060, 1300 and 1550 nm are used simultaneously for antispoofing purposes. The schematic of the imaging system is illustrated in Figure 4(b). Light-emitting diodes (LEDs) of different wavelengths are distributed on the illumination panel around the sensor camera. These LEDs are programmed by a microcontroller to switch on and off in chronological order, with only one wavelength switched on at any given time. An SWIR camera (with an InGaAs sensor) is placed in the center of the LED array to collect the light reflected from the face. The images captured by the camera are transmitted to a personal computer and processed to compose a multispectral image stack. The image processing steps include nonlinear correction, motion compensation, distortion correction and normalization.

A part of the images taken for facial recognition is illustrated in Figure 4(c) and (d). In Figure 4(c), the first and second rows are facial images taken at visible and SWIR wavelengths, respectively. Comparing the images in these two rows, it can be observed that the SWIR images are insensitive to different skin tones, due to the similar remission spectra in the SWIR wavelength region shown in Figure 4(a). Furthermore, Figure 4(d) shows a human face wearing a 3D-printed mask acting as a presentation attack. The SWIR image can clearly distinguish the human skin from the printed mask, illustrating the antispoofing capability of the SWIR wavelength regime.
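The underlying idea, that skin remission drops sharply in parts of the SWIR while many mask materials stay bright, lends itself to a simple per-pixel feature. The sketch below is illustrative only: the normalized-difference feature, the sample remission values and the 0.5 threshold are assumptions, not the classifier of [23], although the 1060/1550 nm band pair matches the system's wavelengths:

```python
import numpy as np

def skin_score(r_1060, r_1550):
    """Normalized remission difference between two SWIR bands (illustrative)."""
    r_1060 = np.asarray(r_1060, dtype=float)
    r_1550 = np.asarray(r_1550, dtype=float)
    return (r_1060 - r_1550) / (r_1060 + r_1550 + 1e-9)

# Hypothetical remission values for two pixels: [skin, printed mask].
# Skin absorbs strongly near 1550 nm (water absorption); the mask material
# is assumed to reflect similarly at both wavelengths.
pixels_1060 = np.array([0.45, 0.50])
pixels_1550 = np.array([0.05, 0.48])

s = skin_score(pixels_1060, pixels_1550)
is_skin = s > 0.5  # illustrative threshold
print(is_skin.tolist())  # [True, False]
```

A real system would learn such a decision boundary from the full four-band stack rather than fix it by hand.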

Furthermore, in the study by Vetrekar et al. [24], a low-cost multispectral facial recognition system has been demonstrated. The system consists of a filter wheel with nine different bands covering from 530 nm up to 1000 nm, mounted in front of a complementary metal-oxide-semiconductor (CMOS) camera. The multispectral images taken from the CMOS camera are then fused using image fusion techniques, including wavelet decomposition, averaging, and then the inverse wavelet transform. The multispectral nature of the system reduces the illumination effect compared with a single-spectral system. An additional noteworthy point on facial recognition is that the COVID-19 outbreak will further boost the need for facial recognition technology due to its contactless detection scheme, which effectively addresses hygiene and infection-related issues.
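The fusion pipeline (wavelet decomposition, a coefficient-combination rule, inverse transform) can be sketched with a single-level Haar transform. Note that averaging every coefficient would, by linearity, reduce to plain pixel averaging, so the sketch below averages the approximation band and keeps the largest-magnitude detail coefficients, a common variant rather than the exact rule of [24]; the random 8 × 8 images are stand-ins for co-registered band images:

```python
import numpy as np

def haar2d(x):
    """One-level 2D Haar transform (orthonormal); x must have even sides."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2)  # LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Inverse of haar2d."""
    h, w = LL.shape
    x = np.empty((2 * h, 2 * w))
    x[0::2, 0::2] = (LL + LH + HL + HH) / 2
    x[0::2, 1::2] = (LL - LH + HL - HH) / 2
    x[1::2, 0::2] = (LL + LH - HL - HH) / 2
    x[1::2, 1::2] = (LL - LH - HL + HH) / 2
    return x

def fuse(bands):
    """Fuse co-registered band images: average the approximation (LL)
    coefficients, keep the largest-magnitude detail coefficients."""
    coeffs = [haar2d(b) for b in bands]
    LL = np.mean([c[0] for c in coeffs], axis=0)
    details = []
    for k in (1, 2, 3):
        stack = np.stack([c[k] for c in coeffs])
        idx = np.abs(stack).argmax(axis=0)
        details.append(np.take_along_axis(stack, idx[None], axis=0)[0])
    return ihaar2d(LL, *details)

rng = np.random.default_rng(0)
bands = [rng.random((8, 8)) for _ in range(3)]  # synthetic stand-in images
fused = fuse(bands)
print(fused.shape)  # (8, 8)
```

Keeping the strongest detail coefficients preserves edges from whichever band images them most sharply, which is the usual motivation for fusing in the wavelet domain instead of simply averaging pixels.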

Besides facial recognition, the fingerprint sensor is one of the most widely deployed sensors for biometric identification. Different physical mechanisms have been implemented to capture fingerprint information,



including optical imaging, capacitive imaging and ultrasonic sensing [51–54]. However, most systems have detection issues under various circumstances, such as wet or dry fingers, poor contact and susceptibility to spoofing. The multispectral fingerprint system addresses these issues

effectively. It is able to capture images at different optical wavelengths and collect data on both the surface and the subsurface, owing to the fact that different wavelengths have different penetration depths within the finger skin. The subsurface information can also tell whether the

Figure 4: Multispectral imaging applied for facial recognition, showing the advantage of antispoofing. (a) Remission spectra of different skin types and spoofing mask materials from visible to NIR wavelengths. (b) Schematic of the multispectral facial recognition system setup, including LED arrays at four different wavelengths (935, 1060, 1300 and 1550 nm) as the light source (right panel), and a short wavelength infrared (SWIR) camera for image capture. (c) Facial images captured at visible (first row) and SWIR (second row) wavelengths for different skin types. SWIR images are insensitive to different skin tones. (d) Facial images with a printed mask at visible (top) and SWIR (bottom) wavelengths. The mask material can be clearly distinguished in the SWIR image. (a)–(d) are adapted with permission from the study by Steiner et al. [23]. Licensed under a Creative Commons Attribution.


fingerprint is from a real finger or a fake one carrying only 2D information. The working principle of multispectral fingerprint imaging has been presented in the study by Rowe et al. [55]. A commercial multispectral fingerprint sensing product (J110 MSI) based on this working principle has also been presented in the same study. The commercial product has four LEDs at 430, 530, 630 nm and white light. There is also an embedded processor within the system for data processing.

Furthermore, due to COVID-19, as mentioned earlier in the facial recognition part, touchless fingerprint imaging systems will be highly attractive, since they help prevent the spread of disease through surfaces shared by multiple users, such as lift buttons. In the study by Hussein et al. [40], a novel touchless fingerprint capture device has been introduced, using multispectral SWIR imaging and laser speckle contrast imaging for sensing and presentation attack detection.

For iris recognition, in the study by Zhang et al. [39], a multispectral imaging system has been introduced. It extends the traditional 850 nm wavelength for iris recognition to shorter wavelengths, at which pigments can be used as another source of iris texture. The schematic of the system is illustrated in the left panel of Figure 5(a). It contains a capture unit, an illumination unit, an interaction unit and a control unit. The data collection process is illustrated in the photograph provided in the right panel of Figure 5(a). The system is able to capture the multispectral images in 2–3 s. The captured multispectral images at 700, 780 and 850 nm are shown in Figure 5(b). These images are fused to form the final measurement image, including all the pigment information obtained from the three different wavelengths.

Palmprint is also a unique biometric characteristic, which can be applied to authentication systems. Zhang et al. [41] proposed an online multispectral palmprint system for real-time authentication. Since different bands contain different texture information, combining the bands enables detection with a reduced error rate and antispoofing functionality.

2.5 Archaeology and art conservation

Spectral imaging has been used as a novel and noninvasive method for archaeology and art conservation since the 1990s [25]. Besides the 2D spatial information, it is able to obtain the spectral information of an object such as an antique, art painting or manuscript, and hence reveal the historical and hidden information of the object. A comprehensive review of multispectral and hyperspectral imaging systems applied in archaeology and art conservation was reported in 2012 [25]. Here we review the most updated work in the past decade.

Recently, studies on a compact hyperspectral camera from SPECIM for imaging of artwork have been conducted [43, 56]. In the study by Picollo et al. [43], the new commercial hyperspectral camera working from 400 to 1000 nm has been used for inspection of artworks. The camera, with a size of 207 × 91 × 126 mm³, has been used to analyze an indoor painting (a 19th century canvas painting), an outdoor painting (Sant'Antonino cloister at the Museum of San Marco, Florence), and a manuscript (a 15th century Florentine illuminated book in Florence). This proves the capability of the hyperspectral camera to operate effectively under different environmental conditions. Pigment identification has been achieved through the spectral angle mapper procedure embedded in the camera software. Furthermore, in the study by Daniel et al. [26], a hyperspectral camera working in the same wavelength range has been used to analyze the paintings of Goya in a Spanish museum in Zaragoza. Restored zones are revealed in the infrared hyperspectral image. Pigment identification has also been demonstrated in that work.
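The spectral angle mapper mentioned above classifies each pixel by the angle between its spectrum and a library of reference spectra; the smaller the angle, the better the match, independent of illumination intensity. A minimal sketch (the pigment library here is purely illustrative, not from the cited works):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against rounding error

def classify_pixel(pixel, library):
    """Return the name of the library spectrum with the smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))
```

Because the angle ignores overall scaling, a pixel that is simply a brighter version of a reference spectrum still maps to that reference.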

Moving beyond 1000 nm wavelength, in the study by Mahgoub et al. [42], a pushbroom hyperspectral imaging system (GILDEN Photonics) working in 1000–2500 nm has been applied to investigate Islamic paper. A calibration model has been built for quantitative analysis. The starch within the Islamic paper has been identified, and the cellulose degree of polymerization has been quantified, which provides information on the conservation condition of the paper. Also, in the study by Cucci et al. [57], NIR hyperspectral images of a painting by Zanobi Machiavelli have been obtained from 1000 to 1700 nm. From the NIR image, the restored areas can be found, which are not observable in the visible-wavelength image. From the reflectance spectra, it is found that gypsum has been used as the ground layer for the painting (preparatory drawing).

3 Nanophotonics-based spectral imaging and LIDAR sensing systems

Although there are many modern applications of the state-of-the-art spectral imaging and spectral LIDAR systems mentioned in the previous section, most of these systems are still bulky, heavy and consume high power. Hence, there is an enormous demand for compact and low-cost


sensing systems. Nanophotonics technology [58], which is based on light–matter interaction at nanoscale dimensions, provides an ideal solution. Numerous compact optics and photonics functional devices have been demonstrated using CMOS-compatible fabrication processes [59–65]. In the past decade, research works on nanophotonics-based spectral imaging and LIDAR systems have also accelerated [28, 29, 66–69]. Various compact devices have been developed for proof-of-concept demonstrations in this field. In this section, we review these research works on nanophotonics-based spectral imaging and LIDAR sensing systems. Subsection

Figure 5: Multispectral iris recognition system and data fusion process. (a) Schematic drawing of the multispectral iris capture device (left) and an optical image showing the data collection process (right). (b) Fusing process of multispectral iris images at 700, 780 and 850 nm, including all the pigment information within the iris. (a)–(b) are adapted with permission from Springer Nature: Multispectral Biometrics, by Zhang et al. [39]. Copyright 2016.


3.1 is mainly focused on the spectral imaging systems that have been demonstrated in the past decade. Subsection 3.2 is focused on the most recent research works on nanophotonics-based LIDAR systems using an integrated frequency comb or supercontinuum as the light source. Also, the reviewed works are categorized based on material, structure, sensing mechanism and working wavelength, as listed in Table 2.

3.1 Spectral imaging systems

3.1.1 Metasurface-based lens and reflectors

Flat optics or metasurfaces [80, 81], which can be formed by a single layer of subwavelength-scale nanostructures, have drawn a lot of research interest in the field of nanophotonics. They work based on the scattering of light by the

Table 2: Summary of nanophotonics-based spectral imaging and LIDAR sensing systems.

Material and structure | Sensing mechanism | Application | Wavelength | Reference/year
Elliptical amorphous silicon (a-Si) nanobars on fused silica substrate | Hyperspectral imaging | Immunoglobulin G (IgG) biomolecule detection and sensing with high sensitivity | – nm | []/
TiO2-based metasurface | Multispectral imaging | Chiral beetle multispectral imaging using a single metalens to resolve the chirality | , , and nm | []/
Periodic silver nanowires | Multispectral imaging | Tunable color filter with polarization-dependent transmission for color imaging | – nm | []/
a-Si nanoposts with rectangular cross section on silica substrate | Hyperspectral imaging | Hyperspectral imaging with compact size and light weight | – nm | []/
Periodic silicon pillars with photonic crystal structure | Hyperspectral imaging | CMOS-compatible, low cost and compact hyperspectral imaging system | – nm | []/
Bayer color filter array | Multispectral imaging | Multispectral imaging for early stage pressure ulcer detection | , , , and nm | []/
SiN-AlN-Ag multilayer stack to form Bayer color filter array | Multispectral imaging | Color image using metal-dielectric filter patterned in Bayer array on CMOS image sensor (CIS) | – nm | []/
Microscale plate-like SiN structure | Multispectral imaging | Color imaging using near-field deflection-based color splitting with minimal signal loss | – nm | []/
Si nanowire | Multispectral imaging | All-silicon multispectral imaging system at visible and NIR wavelengths | – nm | []/
Periodic circular holes on gold (Au) layer | Multispectral imaging | Adaptive multispectral imaging using plasmonic spectral filter array working in LWIR | – μm | []/
Nanohole arrays in an Au film | Multispectral imaging | Multispectral imaging of methylene blue for transmission-imaging and leaf for reflection-imaging | – nm | []/
Periodic circular holes on Al thin film | Multispectral imaging | Multispectral imaging using plasmonic spectral filter array integrated with CMOS-based imager | – nm | []/
Elliptical and circular hole arrays on Al thin film | Multispectral imaging | Low-photon multispectral imaging | RGB: , , and nm | []/
Pixelated Si-based metasurface with zigzag array structure | Imaging-based spectroscopy | Biosensing for protein A/G | .–. μm | []/
Silica wedge disk resonator | Dual-comb based time of flight (ToF) LIDAR | Distance measurement with high accuracy | – nm | []/
SiN microring resonator pumped by erbium-doped fiber amplifier (EDFA) | Dual-comb based LIDAR | Distance measurement at high speed and accuracy | – nm | []/
SiN microring resonator pumped by EDFA | Frequency-comb based FMCW LIDAR | Distance and velocity measurement along a line | – nm | []/
Terahertz quantum cascade lasers | Dual-comb based hyperspectral imaging | Bio-imaging | ., ., and . THz | []/


nanostructures. These nanostructures, also called nanoantennas, can be patterned to achieve a designed spectral response or phase profile, thereby enabling various functional devices such as lenses [82, 83], spectral filters [84, 85], wave plates [86, 87], beam deflectors [88–90] and point cloud generators [91]. For a metalens, when the phase of the scattered light from the nanoantennas follows the hyperboloidal profile below, the scattered light focuses at one point [81, 82]:

φ = (2π/λ)·(√(x² + y² + f²) − f)   (1)

where λ is the wavelength in free space and f is the focal length of the metalens. For reflectors, the angle of the reflected light follows the generalized Snell's law for reflection [80]:

sin(θr) − sin(θi) = (λ/(2π·ni))·(dφ/dx)   (2)

where θr and θi are the reflection angle and incident angle, respectively, ni is the refractive index of the medium, and dφ/dx is the gradient of the phase discontinuity along the reflection interface. Such a phase discontinuity can be engineered to achieve the designed reflection angle of the optical beam. The integration of metasurface devices with active layers enables active tuning and control of optics [92–96]. Furthermore, the metasurface can also be engineered to achieve a desired dispersion [97]. Metasurface-based achromatic optical devices have been demonstrated [98–103], which can be applied for multispectral imaging. In the study by Khorasaninejad et al. [70], a single metasurface-based lens (metalens) has been used to replace a sophisticated ensemble of optical components to achieve simultaneous imaging in two opposite circular polarization states. The schematic of the setup is shown in Figure 6(a), illustrating the imaging principle: light with different circular polarizations from the object is focused by the multispectral metalens at different physical locations for imaging purposes. The multispectral images of a chiral beetle (Chrysina gloriosa), which is known for high reflectivity for only left-circularly polarized light, are also illustrated in Figure 6(b). These images are obtained by using LEDs of red, green and blue colors together with a band pass filter at each wavelength. The compact multispectral imaging system reported in the study by Khorasaninejad et al. [70] should be able to obtain the helicity and spectral information from other biosamples as well. An additional point worth mentioning is that, besides the spectral information, the study by Khorasaninejad et al. [70] also illustrates that extra information about the sensing object can be obtained through the polarization of light. Hence, polarization provides one more degree of freedom in imaging in addition to the spectral information. Full-Stokes polarization imaging with a compact optical system enabled by metasurfaces has later been demonstrated [104, 105], in which additional information, including the mechanical stress of the sensing object and the texture of reflecting surfaces, is also revealed. A comprehensive review on recent advances in metasurface-based polarization detection has been published in the study by Intaravanne and Chen [106].
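Both design relations above are straightforward to evaluate numerically. The sketch below computes the hyperboloidal metalens phase of Eq. (1) and the reflection angle from the generalized Snell's law of Eq. (2); the wavelength, focal length, and phase gradient are made-up example values, not taken from any cited work.

```python
import numpy as np

WAVELENGTH = 940e-9  # assumed design wavelength (m)
FOCAL = 200e-6       # assumed focal length (m)

def metalens_phase(x, y, wavelength=WAVELENGTH, f=FOCAL):
    """Hyperboloidal phase profile of Eq. (1): (2*pi/lambda) * (sqrt(x^2+y^2+f^2) - f)."""
    return (2 * np.pi / wavelength) * (np.sqrt(x**2 + y**2 + f**2) - f)

def reflection_angle(theta_i, dphi_dx, wavelength=WAVELENGTH, n_i=1.0):
    """Generalized Snell's law for reflection, Eq. (2), solved for theta_r."""
    s = np.sin(theta_i) + wavelength / (2 * np.pi * n_i) * dphi_dx
    return np.arcsin(s)  # only valid while |s| <= 1 (propagating reflected beam)
```

With a zero phase gradient the law reduces to ordinary specular reflection (θr = θi); the phase is zero at the lens center, as Eq. (1) requires.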

Also, enabled by the capability of dispersion control with metasurfaces, in the study by Faraji-Dana et al. [68], a line-scanned hyperspectral imager with a single-layer metasurface, patterned by a single-step lithography process on a glass substrate, has been demonstrated, with the schematic shown in the left panel of Figure 6(c). The imaging system is based on a compact folded metasurface platform [107]. The light from the object enters the system from an aperture at the top, reflects between the metasurfaces and gold mirrors, and finally exits from the transmissive metasurface at the bottom for image forming. The imaging system is designed to disperse light of different wavelengths in the vertical direction. Light with different incident angles along the horizontal direction is focused horizontally at the detector array. The Caltech logo has been used for imaging as a proof-of-concept demonstration, with the simplified setup schematic shown in the right panel of Figure 6(c). The inset shows the colored Caltech logo with wavelength increasing from bottom (750 nm) to top (850 nm). The imaging results are illustrated in Figure 6(d). The left panel shows the intensity profile obtained by the metasurface hyperspectral imager (M-HSI) along cut A and cut B. The result is benchmarked against the one obtained by a tunable laser (TL), and shows a good match. The intensities at the two wavelengths 770 and 810 nm obtained by the M-HSI are shown in the right panel of Figure 6(d). They are also compared with the results obtained by the TL, again showing a good match.

3.1.2 Spectral filters integrated with photodetector array or CMOS camera

Besides the metasurface-based lenses and reflectors used in the earlier works, spectral filters made from flat optics can also be placed on top of a photodetector (PD) or a CMOS image sensor (CIS) for spectral imaging. A recent work integrating a spectral filter with a PD array has been reported in the study by Shah et al. [33]. In this work, a plasmonic metasurface-based color filter with elliptical and circular nanoholes is defined in a thin aluminum (Al) layer. To achieve different resonance wavelengths in the visible range, the dimensions of these subwavelength-scale nanoholes are varied, as shown in Figure 7(a), where the


scanning electron microscopy (SEM) images of the nanostructures are illustrated. The inset of the figure shows the micrograph of the color filter. The filter array is patterned by single-step electron beam lithography. Once the fabrication is completed, the filter array is integrated with a 64 × 64 single photon avalanche photodetector (SPAD) array through a flip-chip bonding process. Hence, the imaging system has the capability of counting at the single photon level. The optical images of the system are illustrated in Figure 7(b). Each filter covers one of the 64 × 64 pixels of the SPAD array, with red, green and blue colors randomly distributed. Each color has an approximately equal share of 33.33%. The active imaging system utilizes a supercontinuum tunable laser source, whose schematic is shown in Figure 7(c). As a proof-of-concept demonstration, the color image of a sample target taken by a conventional camera and the reconstructed image obtained from the multispectral imaging system are illustrated in the left and right panels of Figure 7(d), respectively. The sensing system demonstrated in this work can find applications in LIDAR-based 3D imaging.

A recent demonstration of the integration of a dielectric metasurface-based spectral filter with a CIS is also reported in the study by Yesilkoy et al. [67]. In this work, a

Figure 6: Metasurface-based lens and reflectors for spectral imaging. (a) Top panel: schematic illustration of the multispectral metalens imaging principle: the light from the object with different circular polarizations is focused at different locations by the multispectral chiral lens. (b) The beetle (Chrysina gloriosa) images formed by using red, green and blue LED illumination together with a band pass filter at each wavelength. (a) and (b) are adapted with permission from the study by Khorasaninejad et al. [70]. Direct link: https://pubs.acs.org/doi/10.1021/acs.nanolett.6b01897. Further permissions related to the material excerpted should be directed to the ACS. (c) Left panel: the schematic of the hyperspectral imaging system: the light from the sample enters the system through the aperture at the top, then is reflected between the metasurfaces and gold mirrors, and exits through the transmissive metasurface at the bottom. On the detector array, light with different incident angles is focused along the horizontal direction, and light with different colors is focused along the vertical direction. Right panel: the simplified schematic of the system for imaging the object. The inset shows the object, a Caltech logo with mixed colors, whose wavelength increases from bottom (750 nm) to top (850 nm). (d) Left panel: measured intensity profile captured by the photodetector (PD) across cut A and cut B by the metasurface hyperspectral imager (M-HSI). The M-HSI imaging result is benchmarked with the intensity profile obtained using a tunable laser (TL). Right panel: measured intensity at two wavelengths (770 and 810 nm) by the M-HSI benchmarked with the one obtained using the TL. (c) and (d) are adapted with permission from the study by Faraji-Dana et al. [68]. Direct link: https://pubs.acs.org/doi/full/10.1021/acsphotonics.9b00744. Further permissions related to the material excerpted should be directed to the ACS.


biomolecule sensor has been demonstrated based on hyperspectral images taken by a high quality factor dielectric metasurface integrated on a CIS. The sensing mechanism is illustrated in Figure 8(a). A tunable narrow-band laser source, formed by a tunable filter coupled to a supercontinuum light source, is used to illuminate the metasurface with immunoglobulin G (IgG) solutions. The spectral information of each pixel can be obtained from the hyperspectral data cube, as shown in the bottom right panel of Figure 8(a). The resonance wavelength shift of the metasurface induced by the IgG molecules can be obtained by comparing the spectral information to the reference without the IgG molecules. A higher IgG biomolecule concentration contributes to a larger refractive index change and hence a larger resonance wavelength shift of the metasurface. In this way, the concentration information of the IgG can be obtained. Figure 8(b) shows the schematic of the bioassay for the IgG biomolecule binding/immobilization process. The mean resonance wavelength shifts with respect to different IgG concentrations have been plotted in Figure 8(c).
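The per-pixel resonance extraction from a hyperspectral data cube can be sketched as below. For simplicity the resonance is taken as the wavelength of peak response per pixel (the actual work processes the full spectral line shape), and all names are illustrative:

```python
import numpy as np

def resonance_map(cube, wavelengths):
    """Per-pixel resonance wavelength from an (H, W, N) hyperspectral data cube."""
    return wavelengths[np.argmax(cube, axis=-1)]

def mean_resonance_shift(sample_cube, ref_cube, wavelengths):
    """Mean shift of the resonance map relative to the no-biosample reference."""
    shift = resonance_map(sample_cube, wavelengths) - resonance_map(ref_cube, wavelengths)
    return shift.mean()
```

A positive mean shift indicates a red-shift of the metasurface resonance, which in the scheme above maps to a higher analyte concentration.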

The CIS mentioned above has been widely applied due to its advantages, including compact size, low cost, low power consumption and ease of integration with other CMOS-based functional devices. Rather than using the CIS and spectral filter as separate components, nanophotonics-based spectral filters can be integrated with a CCD or CIS, either by attaching them onto the image sensor [27, 29, 73, 76] or by direct patterning on the image sensor [72, 108–110]. The compact integrated system is a suitable platform for spectral imaging. In the study by Park and Crozier [27], a compact multispectral imaging system has been demonstrated by using color filters formed by vertical silicon nanowires. The color spectrum is varied by the nanowire diameter. The nanostructures are patterned by a single-step electron beam lithography process. The nanowires, embedded in polydimethylsiloxane (PDMS), are attached to a Si-based image sensor, with the schematic and optical image shown in Figure 9(a) and (b), respectively. The zoomed-in image of the fabricated nanowire filter array is included at the bottom of Figure 9(b). In the imaging system, there are five channels in the visible range and three channels in the infrared (IR) wavelength range. In Figure 9(c), the left and right panels show the image of a Macbeth color chart obtained using a conventional camera and three visible-range channels of this multispectral imaging system, respectively. The colors show a good match. Furthermore, the advantage of the IR channels is demonstrated using the experimental setup shown in Figure 9(d). A donut-shaped object is placed at the back of a black screen (glass painted with black ink),

Figure 7: Metasurface-based plasmonic color filters integrated with a single photon avalanche photodetector (SPAD) array for spectral imaging. (a) Scanning electron microscopy (SEM) images of the fabricated color filters, with insets illustrating the micrographs of the blue-, green- and red-colored filters. (b) Optical image of the color filters integrated with the SPAD array. (c) Schematic of the imaging system, including the supercontinuum tunable laser as the light source. (d) Left panel: sample target used for multispectral imaging. Right panel: reconstructed multispectral image of the sample target. (a)–(d) are adapted with permission from the study by Shah et al. [33]. Licensed under a Creative Commons Attribution.


which is opaque at visible wavelengths but transparent in the IR range. The images obtained by the system in the visible and IR wavelength ranges are shown in the middle and right panels of Figure 9(e), respectively. From the IR image, the donut-shaped object can be observed. For comparison, the image taken by a conventional camera in the visible

Figure 8: Dielectric metasurface-based hyperspectral imaging for ultrasensitive biomolecule detection. (a) Schematic of the dielectric metasurface-integrated CIS for hyperspectral imaging-based biosensing. A narrow-band tunable laser source is used for illumination. The CMOS camera captures an image for each wavelength and the data form a hyperspectral data cube. For each pixel, the resonance wavelength can be obtained from the processed spectral information. Biosensing is achieved by comparing the resonance map of the metasurface with biomolecules and the reference resonance map without the biosample. (b) Schematic showing the immobilization process of the biomolecules for sensing purposes. (c) Mean resonance shift with respect to the average number of IgG molecules. (a)–(c) are adapted with permission from Springer Nature, Nature Photonics [67]. Ultrasensitive hyperspectral imaging and biodetection enabled by dielectric metasurfaces, Yesilkoy et al. Copyright 2019.


wavelength range is shown in the left panel of Figure 9(e), where the cross-shaped object in front of the screen can be clearly observed, while the donut-shaped object at the back of the screen is hardly seen.

Besides the abovementioned metasurface-based device integration with image sensors, in the study by Yokogawa et al. [111], a plasmonic-based color filter array designed for CIS has also been demonstrated, which has potential applications in spectral imaging. The color filters are formed by hole arrays on a 150-nm-thick Al film to work in the visible wavelength range. Based on the same material platform, in the study by Burgos et al. [76], the plasmonic color filter array has been integrated with a CIS to demonstrate plasmonic-based full-color imaging functionality. The schematic of the CIS with the RGB plasmonic-based filters on top is shown in the left panel of Figure 10(a). The SEM image of the filters, the optical image of the filter array on a quartz substrate, and the integrated CIS are included in the right panel of Figure 10(a). The Al nanostructure is patterned using single-step electron beam lithography

Figure 9: Si nanowire-based spectral filter integrated with a CCD image sensor for multispectral imaging in the visible and infrared (IR) wavelength ranges. (a) Schematic of the multispectral imaging system, with the inset showing the Si nanowire structure as the spectral filter. (b) Optical image of the spectral filter mounted on the CCD image sensor, with a zoomed-in image of the filter area in the bottom panel. The inset of the bottom panel shows the magnified image of the filter array. (c) Image of a Macbeth color chart taken by a conventional color camera (left panel) in comparison with the image taken by the nanowire-based multispectral imaging system (right panel). (d) Schematic of the imaging setup using white light and an IR LED as light sources to demonstrate the advantage of multispectral imaging. (e) Images taken by a conventional camera (left), the nanowire-based imaging system in the visible wavelength range (middle), and the nanowire-based imaging system in the IR wavelength range (right). The donut-shaped object at the back of the black-ink-painted glass is invisible or hard to observe in the visible-wavelength image, but can be observed in the IR-wavelength image. (a)–(e) are adapted with permission from the study by Park and Crozier [27]. Licensed under a Creative Commons Attribution.


followed by a lift-off process on a quartz substrate. The fabricated structure is then integrated with the CIS through a contact process. The reconstructed image of a 24-patch Macbeth color chart obtained from the integrated system is shown in the right panel of Figure 10(b), showing a good match with the image taken by a conventional CMOS camera shown in the left panel of Figure 10(b).

Also, using the same contacting/assembly approach, photonic crystals have been implemented as spectral filters for spectral imaging. In the study by Wang et al. [29], a compact on-chip spectrometer with hyperspectral imaging functionality has been demonstrated, whose schematic is shown in Figure 10(c). The photonic crystal array is fabricated and then attached on top of the CMOS sensor array. The photonic crystal dimensions of each slab are varied to achieve different resonance frequencies. The spectrum of the light source can hence be reconstructed. Figure 10(d) shows reconstructed optical spectra plotted in blue circles, well matched with the reference ground-truth spectra plotted in red solid lines. The hyperspectral functionality has also been demonstrated, as shown in Figure 10(e). Two numbers, "5" and "9", are illuminated by light sources at 610 and 670 nm, respectively. The target on the screen is a superposition of "5" and "9" encoded using these two wavelengths. The hyperspectral image stack shown in the right panel of Figure 10(e) is able to distinguish the two numbers at different wavelengths, which is not possible using a conventional RGB camera. An additional note from the authors of this work is that the pixel number in this hyperspectral imager is limited by the

Figure 10: Metallic and dielectric nanophotonic spectral filters integrated with a CMOS sensor for spectral imaging. (a) Schematic of the hole array-based RGB spectral filter integrated with a CIS, with insets showing the SEM image of the filter array, optical images of the filter array patterned on a quartz substrate, and the optical image of the integrated system. (b) Image of a Macbeth color chart taken by a conventional CMOS camera (left panel) and the plasmonic-based CMOS camera (right panel) for comparison. (a)–(b) are adapted with permission from the study by Burgos et al. [76]. Copyright © 2013 American Chemical Society. (c) Schematic of the microspectrometer consisting of a photonic-crystal array integrated with a CMOS sensor array. (d) Measured optical spectrum from a narrow-band light source centered at 581–584 nm by the integrated spectrometer. The measured results plotted in blue circles are benchmarked with the ground-truth data plotted in red solid lines, showing a good match. (e) Left panel: hyperspectral imaging setup using the photonic-crystal-based spectral imaging system. The target on the screen is a superposition of "5" and "9" encoded using different wavelengths. Right panel: the captured images at different wavelengths, where "5" and "9" can be distinguished at wavelengths of 610 and 670 nm, respectively. (c)–(e) are adapted with permission from the study by Wang et al. [29]. Licensed under a Creative Commons Attribution.


photonic crystal area size patterned by electron beam lithography. This limitation can be overcome by using high-throughput photolithography patterning technology [112, 113]. In the study by Stewart et al. [114], ultraviolet (UV) photolithography has been used to pattern large-area silver (Ag) nanocubes on a gold (Au) layer. The nanostructure dimensions have been varied to form a plasmonic-based filter array. The resonance wavelength covers from 580 nm up to 1125 nm. The device has also been patterned to form a color image of a rainbow lorikeet. Also, in the studies by Xu et al. [115, 116], band pass filters have been patterned using 12-inch 193-nm deep-UV immersion lithography followed by an inductively coupled plasma etching process. The filters work in both the SWIR and visible wavelength ranges. The fabrication process is CMOS-compatible, which enables these filters to be integrated with the CIS through either monolithic integration or a packaging process.
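The filter-array spectrometers discussed above share one reconstruction principle: the measured filter outputs m relate to the unknown spectrum s through the filter transmission matrix T (m ≈ T·s), and s is recovered by regularized least squares. The sketch below uses synthetic Gaussian filter responses; every number in it is illustrative, not taken from the cited works:

```python
import numpy as np

def reconstruct_spectrum(m, T, reg=1e-3):
    """Tikhonov-regularized least squares: argmin_s ||T s - m||^2 + reg*||s||^2."""
    n = T.shape[1]
    return np.linalg.solve(T.T @ T + reg * np.eye(n), T.T @ m)

# Synthetic example: 36 Gaussian filters spanning 400-700 nm on a 100-point grid.
wavelengths = np.linspace(400.0, 700.0, 100)
centers = np.linspace(400.0, 700.0, 36)
T = np.exp(-(((wavelengths[None, :] - centers[:, None]) / 15.0) ** 2))
true_spectrum = np.exp(-(((wavelengths - 550.0) / 30.0) ** 2))
estimate = reconstruct_spectrum(T @ true_spectrum, T)
```

The regularization term keeps the inversion stable when the filter responses overlap heavily, at the cost of slightly smoothing sharp spectral features.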

Similar to the contacting/assembly approach, the study by Qi et al. [71] reports a hand-held multispectral imaging device for biomedical applications. A TiO2-Al2O3-Ag multilayer structure has been fabricated on a glass substrate to form a Bayer color filter array. The mosaic filter is then laminated on a CIS [117, 118] as a multispectral imaging device for early stage pressure ulcer detection. Furthermore, different from the contacting approach mentioned earlier, in the study by Chen et al. [108], an Al-based plasmonic color filter array is directly patterned on the top surface of a CIS, using electron beam lithography followed by a dry etching process. This direct patterning approach reduces the complexity of system integration and packaging.

3.1.3 Novel standalone spectral filters for spectral imaging

In the previous subsections, the nanostructured spectral filters integrated with PD arrays or CISs are covered. In the meanwhile, there are also novel standalone spectral filters demonstrated recently, which are suitable for spectral imaging [28, 75, 119–123]. For example, in the study by Najiminaini et al. [75], nanohole arrays in a gold film have been used as spectral filters for a snapshot multispectral imager. The schematic of the filter is shown in Figure 11(a). In the schematic, the mosaic filter is composed of a 4 × 4 block array with each block consisting of a 3 × 3 nanohole array. The resonance wavelength of the nanostructure is adjusted through the period variation of the hole array. The resonance cavity is formed beneath the hole through a wet-etching process. These etched holes are shown in the SEM image as the inset of Figure 11(a). The multispectral images are taken using the setup shown in Figure 11(b) in both transmission and reflection modes. The mosaic filter with 20 × 20 block array has been implemented in front of a CMOS camera mounted with an objective lens for imaging purpose. The multispectral transmission images of methylene blue (MB+) with different concentrations are illustrated in Figure 11(c). From the spectral band of 662–714 nm, it can be observed that the image intensity decreases as the MB+ solution concentration increases. The low intensity at the spectral band of 788–832 nm is due to the existence of a low pass filter (<800 nm) within the imaging test system.
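The observed intensity drop with increasing MB+ concentration is what the Beer–Lambert law predicts: transmitted intensity falls off exponentially with absorber concentration and path length. A small numerical sketch; the molar absorptivity and path length below are illustrative values, not taken from [75]:

```python
import numpy as np

def transmittance(eps, conc_molar, path_cm):
    """Beer-Lambert law: absorbance A = eps*c*l, transmittance T = 10**(-A)."""
    return 10.0 ** (-eps * conc_molar * path_cm)

# Illustrative (hypothetical) values near the MB+ absorption peak:
eps = 7.5e4                                # molar absorptivity, L/(mol*cm)
concs = np.array([10e-6, 30e-6, 50e-6])    # 10, 30, 50 uM as in Figure 11(c)
T = transmittance(eps, concs, path_cm=1.0)
# Transmittance decreases monotonically as concentration increases
```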

Another example is reported in the study by Xu et al. [120]: a plasmonic-based color filter array in transmission mode has been demonstrated by using a metal-insulator-metal structure. The plasmonic nanoresonators are formed by a 100-nm-thick ZnSe layer sandwiched between two Al layers with 40 nm thickness on an MgF2 substrate. Also, in the study by Duempelmann et al. [28], a compact plasmonic-based color filter formed by periodic silver nanowires has been designed and used for multispectral imaging. The filter is implemented in front of a camera, with the setup schematic shown in Figure 12(a). The spectral filter is polarization sensitive, and the transmission spectrum can be tuned by rotating the filter orientation/polarization, which is different from the conventional spectral scanning approach. This approach brings the advantages of lower fabrication cost and more compactness. The spectra of some fruits have been included in the bottom part of Figure 12(a). These spectra are used to reconstruct the multispectral image as shown in the same figure. It can be observed that the colors of the fruits have been well reconstructed. The spectrum recording capability of the system is verified by measuring and recording the laser light spectrum as shown in the inset of Figure 12(a). Alternatively, colloidal quantum dots have been used as broadband color filters within a spectrometer working in the visible wavelength range [124]. The compact system is able to capture data for reconstruction of the optical spectrum.
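Filter-based spectrometers of this kind recover the spectrum computationally: each filter state i yields one reading m_i = Σ_j T_ij s_j, and the spectrum s is estimated from the calibrated transmission matrix T, commonly via regularized least squares. The sketch below shows this generic reconstruction step; it is not the specific algorithm used in [28] or [124]:

```python
import numpy as np

def reconstruct_spectrum(T: np.ndarray, m: np.ndarray, reg: float = 1e-6) -> np.ndarray:
    """Tikhonov-regularized least squares: argmin_s ||T s - m||^2 + reg*||s||^2.
    T: (n_filters, n_channels) calibrated transmissions; m: (n_filters,) readings."""
    n = T.shape[1]
    return np.linalg.solve(T.T @ T + reg * np.eye(n), T.T @ m)

# Synthetic self-check: random filter bank, known spectrum, noiseless readings
rng = np.random.default_rng(0)
T = rng.uniform(0.0, 1.0, size=(40, 20))   # 40 filter states, 20 spectral channels
s_true = rng.uniform(0.0, 1.0, size=20)
s_est = reconstruct_spectrum(T, T @ s_true)
```

In practice T comes from a calibration step, and measurement noise calls for stronger regularization or nonnegativity constraints on s.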

An additional note worth mentioning is that an integrated multispectral sensor with the PD working as a tunable filter has also been demonstrated in the study by Jovanov et al. [125]. The spectral responsivity of the sensor can be actively tuned by varying the reverse bias on the vertical diode structure. The schematic of the sensor with a reflecting layer is shown in Figure 12(b). The measured spectral responsivities of the vertically integrated sensor with the interlayer under different reverse biases are shown in Figure 12(c). Also, in the study by Mauser et al. [126], a tunable PD has been demonstrated using subwavelength-scale thermoelectric nanoresonators for wavelength-selective absorption. Its absorption wavelength covers from the visible to MIR range.

Figure 11: Multispectral imaging using nanohole array. (a) Schematic of mosaic filter formed by nanohole array (NHA). Inset shows the SEM image of the resonance cavity formed by etching through the gold layer. Resonance wavelength variation is achieved by adjusting the NHA period. (b) Multispectral imaging setup working in both transmission and reflection modes. NHA-based mosaic filters with 20 × 20 blocks are implemented in front of the CMOS camera. (c) Raw images of NHA and multispectral images of methylene blue (MB+) under different concentrations (10, 30, and 50 μM). (a)–(c) are adapted with permission from [75]. Licensed under a Creative Commons Attribution license.

Going beyond the visible and NIR wavelength regimes, the MIR wavelength regime is interesting in scientific research since the fingerprints of most chemical molecules fall into this wavelength range. Plasmonic-based multispectral filters have been demonstrated using hole arrays in an Au layer [74, 127]. In the study by Jang et al. [74], an MIR multispectral imaging system has been demonstrated, using a plasmonic color filter array. The reconstruction of irradiance from a blackbody IR source and a metal ring object has been achieved. Also, in the study by Tittl et al. [77], a pixelated metasurface-based filter array has been used in an imaging system to detect the molecular fingerprint in the MIR wavelength regime. The pixelated metasurface formed by dielectric nanostructures is designed to have resonance peaks located in the MIR. The biosample (protein A/G) attached to the metasurface, as shown in Figure 12(d), will have absorption at its fingerprint wavelength. By comparing the reflectance difference between the metasurface with and without biosample, the protein absorption signature can be obtained, as illustrated in Figure 12(d) bottom left panel. A spectral integration process enables the translation from the absorption signature into a 2D absorption map as a molecular barcode of the biosample, as illustrated in the bottom right panel of Figure 12(d). A possible configuration to integrate the metasurface spectral filter with a broadband IR detector array has also been provided in the study by Tittl et al. [77].
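One plausible reading of the spectral-integration step: for each metapixel, the reflectance difference (without minus with analyte) is integrated over wavelength, and the resulting scalars form the 2D absorption map. A toy sketch under that assumption; the array shapes and wavelength grid are hypothetical, not taken from [77]:

```python
import numpy as np

def absorption_map(R_ref, R_sample, wl_nm):
    """Integrate the reflectance difference (no-analyte minus analyte) over
    wavelength for each metapixel, via the trapezoidal rule."""
    dR = R_ref - R_sample                       # (rows, cols, n_wl)
    dw = np.diff(wl_nm)
    return np.sum(0.5 * (dR[..., 1:] + dR[..., :-1]) * dw, axis=-1)

# Toy 2x2 metapixel array, 5 wavelength samples over a 100 nm band
wl = np.linspace(1500.0, 1600.0, 5)
R_ref = np.full((2, 2, 5), 0.9)
R_sample = np.full((2, 2, 5), 0.8)   # uniform 0.1 absorption dip
barcode = absorption_map(R_ref, R_sample, wl)   # 0.1 integrated over 100 nm
```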

3.2 Frequency comb/supercontinuum-based LIDAR sensors

There has been a lot of recent research work and progress in ultrafast photonics, nonlinear optics and optical soliton generation [128–137]. The optical frequency comb, which leverages nonlinear optical effects (e.g., Kerr effect and self-phase modulation), has also drawn a lot of attention. The frequency comb generation originates from the nonlinear interaction between a high-intensity optical field and material, which can also be a platform for optical soliton generation [138–142]. The optical frequency comb is a powerful tool in precision measurement [143], spectroscopy [144], optical frequency synthesis [145–147], distance measurement [148], microwave photonics [141, 149] and many other applications [150–152]. The photonic-integrated optical frequency comb, which is able to provide a compact solution to a sensing system, has drawn a lot of research interest [153–157]. Recently, the optical frequency comb generated by a compact chip-scale optical resonator has been used as the light source for LIDAR systems [34, 69, 78]. The microresonators, which generally have high optical Q values for nonlinear optics generation, are fabricated by state-of-the-art nanofabrication technology [158], and can also be integrated with other photonic components such as optical phased arrays to enable beam steering [159–161].

Figure 12: Standalone spectral filters for spectral imaging system. (a) Schematic of the setup for multispectral imaging using a tunable plasmonic-based color filter in front of an imaging camera. The filter property is tuned by polarization angle. Multispectral image and the spectra recorded at selected areas are included at the bottom part. (a) is adapted with permission from the study by Duempelmann et al. [28]. Copyright © 2017, American Chemical Society. (b) Cross section of vertically integrated tunable optical sensor. (c) Active tuning of sensor spectral responsivity by increasing reverse bias voltage. (b)–(c) are adapted with permission from the study by Jovanov et al. [125]. Copyright © 2018, American Chemical Society. (d) Protein absorption spectral measurement and spatial mapping. The absorption spectrum is obtained by taking the difference between the normalized reflectance spectra with and without protein A/G. The spectral integration enables the translation from absorption signature into 2D absorption map as molecular barcode of protein (bottom right panel). (d) is adapted from the study by Tittl et al. [77]. Reprinted with permission from AAAS.

A frequency-modulated continuous wave (FMCW) LIDAR has been used to demonstrate the simultaneous measurement of multiple points along a line in the study by Riemensberger et al. [69]. The optical frequency comb, which is generated through four-wave mixing (FWM) in a high-Q silicon nitride (Si3N4) microring resonator, has been used as the light source for the measurement. The output signal frequency modulation is achieved through the pump modulation. The line is formed through the spatial spread of the frequency comb signal by using a dispersive optical component. The parallel measurement of velocity and distance along the line has been demonstrated, with concept schematic and working principle illustrated in Figure 13(a) top and bottom panels, respectively. Within the concept schematic, the modulation signal from an arbitrary function generator (AFG) is modulated onto the pump source through an electro-optic modulator (EOM). The modulated pump is coupled into the Si3N4 microring for frequency comb generation through FWM. The generated frequency comb lines are spread spatially by a dispersive optical component to achieve parallel sensing. The reflected signal containing the distance and velocity information of the object enters a circulator (CIRC) to beat with the original signal. The beat signal is then split into separate comb lines through a demultiplexer (DEMUX), and converted into the electrical signal through a PD for further signal processing to obtain the distance and velocity information at each point along the line. The plots of the modulated signal and the Doppler-shifted reflected signal have also been illustrated in Figure 13(a). The formulas for calculating the distance (D) and velocity (V) [69] are provided in Figure 13(a) and are also listed below:

D = (cT / 4B) · (f_u + f_d) / 2    (3)

V = (c / (2 f_c cos θ)) · (f_u − f_d) / 2    (4)

where T is the modulation period, B is the bandwidth, and c is the speed of light. f_u and f_d represent the beat frequencies for the upward and downward laser scan, respectively; both have taken the Doppler shift into account. f_c is the optical carrier frequency, and θ is the angle between the object moving direction and the reflected optical beam propagation direction. Based on the ranging formula mentioned above, a 3D image of the "EPFL" logo has been formed by sweeping the line using a scanning mirror, with the setup schematic shown in Figure 13(b) left panel. The measurement target is formed by two pieces of paper placed with a certain distance in between. The front paper has been partially cut with the "EPFL" logo pattern. The 3D image (distance profile) of the "EPFL" logo pattern is illustrated in Figure 13(b) right panel, as a proof-of-concept demonstration.
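Equations (3) and (4) translate directly into code. A short sketch evaluating distance and velocity from the up- and down-scan beat frequencies; the sweep parameters below are illustrative, not those of [69]:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fmcw_distance(f_up, f_down, T, B):
    """Eq. (3): D = (c*T / (4*B)) * (f_up + f_down) / 2."""
    return (C * T / (4.0 * B)) * (f_up + f_down) / 2.0

def fmcw_velocity(f_up, f_down, f_c, theta_rad):
    """Eq. (4): V = (c / (2*f_c*cos(theta))) * (f_up - f_down) / 2."""
    return (C / (2.0 * f_c * math.cos(theta_rad))) * (f_up - f_down) / 2.0

# Illustrative check: stationary target at 10 m (f_up == f_down, no Doppler)
T_mod, B = 10e-6, 1e9                  # 10 us modulation period, 1 GHz bandwidth
f_beat = 4.0 * B * 10.0 / (C * T_mod)  # beat frequency for D = 10 m, Eq. (3) inverted
D = fmcw_distance(f_beat, f_beat, T_mod, B)          # recovers 10 m
V = fmcw_velocity(f_beat, f_beat, C / 1.55e-6, 0.0)  # 0 m/s (no Doppler shift)
```

For a moving target, f_u and f_d split symmetrically around the range-only beat frequency, which is why their mean gives distance and their difference gives velocity.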

In addition, an alternative modulation mechanism, which makes use of the piezoelectric effect of AlN and the acousto-optic coupling on a nanophotonics-integrated system [162–164], has been implemented for optical signal modulation in the study by Liu et al. [139]. It provides an integrated modulation scheme to demonstrate an FMCW LIDAR system. Besides FMCW LIDAR, a time-of-flight (ToF)-based LIDAR system is demonstrated to obtain 3D images from the distance information in [165]. In this system, a spatial dispersion component is placed after the multispectral light source to achieve fast spatial scanning (MHz rate) by tuning the frequency/wavelength of the source, which is a fiber-based supercontinuum source cascaded with an arrayed waveguide grating. A photonic-integrated supercontinuum light source [166, 167] can also be used in the system to replace the fiber-based version.

The LIDAR systems discussed above use either a single frequency comb or a supercontinuum source. Dual-comb LIDAR systems have also been demonstrated recently. In the study by Suh and Vahala [78], a dual-comb-based ToF LIDAR system has been used for distance measurement with 200 nm precision. The ToF measurement setup is illustrated in Figure 14(a). The pump, amplified by an erbium-doped fiber amplifier (EDFA), is split 50/50 to couple into the microdisk resonator in the clockwise (CW) and counter-clockwise (CCW) directions. To reach soliton status, an acousto-optic modulator (AOM) and a polarization controller (PC) are used in each arm to tune the pump and to ensure efficient pump coupling into the microdisk resonator, respectively. After the microdisk resonator, the residual pump is attenuated by a fiber Bragg grating (FBG). The spectra of the optical and electrical signals are monitored by an optical spectrum analyzer (OSA) and an electric signal analyzer (ESA), respectively. The generated soliton is stabilized through a servo feedback loop. In this configuration, the dual combs, generated by the same microresonator and formed by the CW and CCW propagating solitons, are shown in Figure 14(b) top and bottom panels, respectively. For the ToF LIDAR measurement, the CW soliton signal is split by a 50/50 splitter, as indicated by the orange and green colored dashed arrows in Figure 14(a). These two signals are combined with the CCW soliton signal, as indicated by the blue dashed arrow, to generate an interferogram. The two measured distances with respect to time, using the CW and CCW frequency combs to probe the target, are illustrated in Figure 14(c) left and right panels, respectively. The distance difference between the two measurements is 16.02 μm, and can be used to determine the absolute range. Histograms of the range measurements are plotted with Gaussian fitting.
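The dual-comb ToF principle behind such measurements: two combs with repetition rates f_r and f_r + Δf_r produce an interferogram whose time axis is magnified by f_r/Δf_r, so a delay measured in laboratory time maps back to a true time of flight τ = t_lab · (Δf_r/f_r), and range follows as D = cτ/2, ambiguous modulo c/(2f_r). A generic sketch of this arithmetic; the parameter values are illustrative, not those of [78]:

```python
C = 299_792_458.0  # speed of light, m/s

def dual_comb_distance(t_lab, f_rep, delta_f_rep):
    """Map a lab-time interferogram delay to range: the dual-comb sampling
    magnifies time by f_rep/delta_f_rep, so tau = t_lab * delta_f_rep / f_rep
    and D = c * tau / 2."""
    return C * (t_lab * delta_f_rep / f_rep) / 2.0

# Illustrative numbers: 10 GHz comb with a 1 MHz repetition-rate offset
f_rep, df = 10e9, 1e6
ambiguity_m = C / (2.0 * f_rep)      # range repeats every ~15 mm
tau_true = 2.0 * 0.005 / C           # round-trip delay of a target at 5 mm
t_lab = tau_true * (f_rep / df)      # magnified delay observed in lab time
D = dual_comb_distance(t_lab, f_rep, df)   # recovers 0.005 m
```

Probing with both the CW and CCW combs, as in [78], is one way to resolve the c/(2f_r) range ambiguity.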

Figure 13: Optical frequency comb-based FMCW LIDAR for parallel sensing: principle, setup, and measurement result. (a) Top panel: conceptual schematic of the parallel FMCW LIDAR setup. The pump source is modulated by an electro-optic modulator (EOM). The modulation signal is provided by an arbitrary function generator (AFG). The modulated pump signal is injected into the Si3N4 microring for frequency comb generation. The inset shows the SEM image of the integrated Si3N4 microring structure. Frequency comb lines are spread spatially by a dispersive optical component for parallel sensing along a line. The reflected signal enters a circulator (CIRC) to beat with the original signal. The beat signal is split into separate comb lines through a demultiplexer (DEMUX), and converted into the electrical signal through a PD. Further signal processing is carried out to obtain the distance and velocity information at each point along the line. Bottom panel: principle of FMCW LIDAR with each optical frequency comb line modulated for parallel sensing. (b) Left panel: experimental setup for parallel distance measurement. An external scanning mirror is used for vertical scanning of the frequency comb signal. The target is formed by two pieces of paper. The front paper has been partially cut with the "EPFL" logo. Right panel: 3D image plot based on the distance profile measured from the target. (a) and (b) are adapted with permission from Springer Nature, Nature [69], Massively parallel coherent laser ranging using a soliton microcomb, by Riemensberger et al. Copyright 2020.

Figure 14: Dual comb-based LIDAR sensing: measurement setup and results. (a) Experimental setup for dual-comb generation and LIDAR measurement. A pump laser source is amplified by an erbium-doped fiber amplifier (EDFA), followed by a 50/50 splitter to couple into two arms to pump the microdisk resonator in the clockwise (CW) and counter-clockwise (CCW) directions. An acousto-optic modulator (AOM) and a polarization controller (PC) are used in each arm to tune the pump for soliton generation and to ensure efficient coupling into the microdisk resonator, respectively. A fiber Bragg grating (FBG) is used to attenuate the residual pump. The optical signal in the spectrum and time domains is monitored by an optical spectrum analyzer (OSA) and an oscilloscope, respectively. The electrical signal is monitored in the spectrum domain by an electric signal analyzer (ESA). A servo feedback loop is used to lock the pump to the microdisk resonator and hence to stabilize the comb. The CW soliton signal is split into two by a 50/50 splitter for distance measurement, as indicated by the orange and green colored dashed arrows. These two signals are combined with the CCW soliton signal (blue dashed arrow) to generate an interferogram. (b) Optical spectra of the CW and CCW solitons. (c) Range data with respect to time using the CW and CCW solitons to probe the target. A difference of 16.02 μm gives the absolute range. Histograms of the range measurements are plotted with Gaussian fitting, and the standard deviations (SD) are calculated. (a)–(c) are adapted from the study by Suh and Vahala [78]. Reprinted with permission from AAAS. (d) Left: schematic setup of dual-comb-based ultrafast LIDAR measurement. DKS: dissipative Kerr soliton. Middle: optical image of the static bullet. Right: measured bullet profile using the dual-DKS comb in dynamic status, benchmarked with the profile measured using swept-source optical coherence tomography (OCT) in static status. (d) is adapted from the study by Trocha et al. [34]. Reprinted with permission from AAAS.

Also, in the study by Trocha et al. [34], a dual-comb LIDAR has been implemented for distance measurement with high accuracy and speed. The measurement accuracy reaches 188 nm. The distance acquisition rate achieved is 96.4 MHz, which is contributed by the difference in line spacing of the signal and local oscillator combs. As a proof-of-concept demonstration, the system, using dissipative Kerr solitons (DKS) as the probe signal, has been applied to measure the profile of a bullet flying from an air gun with a speed of 150 m/s. The measurement setup, optical image of the static bullet, and the measured bullet profile are shown in Figure 14(d) left, middle and right panels, respectively. The measured flying bullet profile using the dual-DKS comb as probe is benchmarked against the static profile measured using a swept-source optical coherence tomography (OCT) system, as illustrated in Figure 14(d) right panel. The slight discrepancy at the end of the bullet is contributed by the corrugation of the bullet at the end, as can be visualized in Figure 14(d) middle panel.

Based on the abovementioned literature, it can be observed that in nanophotonics-based frequency combs applied in LIDAR, the multispectral property of the light source has been used for parallel measurement of the velocity and distance information along a line with a single comb source [69], or for high-precision measurement and high-speed ranging with a dual comb source [34, 78]. To the best of our knowledge, there is no work on a nanophotonics-based frequency comb source applied in a LIDAR system to obtain the spectral information of the sensing object. Based on the fact that the nanophotonics-based integrated frequency comb has compact size and low power consumption [142, 168], and also considering the multi-wavelength nature of the frequency comb, we anticipate that nanophotonics-based frequency combs can be applied in LIDAR systems for compact multispectral sensing in the near future. A side note worth mentioning is that a hyperspectral imaging system has been reported using a terahertz frequency comb generated by a chip-scale quantum cascade laser [79]. It demonstrates spectral imaging of chemicals including glucose, lactose and histidine captured at multiple frequencies, showing the potential for future chip-scale biomedical applications.

4 Summary and outlook

To sum up, the recent progress on spectral imaging and spectral LIDAR systems has been presented. The modern applications of state-of-the-art systems in the past decade have been reviewed from the system implementation perspective. The modern application areas are categorized as environmental monitoring, autonomous driving, biomedical imaging, biometric identification, archaeology and art conservation. Furthermore, the recent research works on nanophotonics-based spectral imaging and LIDAR sensing systems have been summarized and presented. The progress to make the systems compact has been reviewed from the system-level research and development perspective. The recent research works in the past decade are mainly based on subwavelength-scale nanostructures for spectral imaging, and chip-scale integrated optical frequency combs for LIDAR systems. The nanostructures are engineered to achieve achromatic functional devices or narrow-band spectral filters for spectral imaging. Also, high-Q microring resonators are used to generate optical frequency combs for LIDAR systems to sense with high precision and at a fast rate. This review provides a summary of spectral imaging and spectral LIDAR systems, covering from the modern applications of the state-of-the-art systems to the development of the compact nanophotonics-based sensing systems.

In the near future, research and development efforts can be focused on the following three aspects to facilitate the miniaturization of multispectral/hyperspectral imaging and LIDAR systems. Firstly, with the demonstration of various metasurface-based functional devices, especially the spectral filters [169–175], it is expected to have more system-level integration of these devices (e.g., monolithic integration with CIS) to achieve compact spectral imaging systems. The monolithic fabrication process can be developed based on the mature CMOS fabrication technology. More functional optical components using the materials on the existing CMOS-compatible fabrication line, including Si [51, 176], SiO2 [177], SiN [178] and AlN [179, 180], are expected to be demonstrated. More large-area metasurface-based optical components are expected to be realized by deep-UV photolithography for mass manufacturing [113]. Furthermore, metasurface-based devices in combination with micro-electromechanical systems (MEMS) or flexible materials [83, 181–183] will enable compact imaging systems with active tuning capability.

Secondly, within the current nanophotonics-based LIDAR systems [34, 69, 78], the only chip-scale component is the optical frequency comb source. Hence, it is expected to integrate the rest of the optical components, including optical phased arrays [159] and photodetectors [184], with the light source [142, 168] to realize a fully integrated comb-based coherent LIDAR system with high frame rate. The frequency comb-based optical signal synthesizer reported in the study by Spencer et al. [145] is a remarkable achievement in this direction. Furthermore, as mentioned earlier, a compact nanophotonics-based spectral LIDAR system to obtain the spectral information of the object can be developed. Such a compact system should be able to drive the 3D spectral imaging and sensing application from table-top toward portable devices for consumer electronics. Also, a LIDAR operation wavelength around 1.55 μm will be preferred since it is located in the eye-safe wavelength region. This wavelength has an advantage in facial recognition since it is insensitive to human skin tones, and can distinguish different materials such as plastic and resin [23], which is vital for antispoofing purposes.

Last but not least, based on the CMOS fabrication and integration platform, more functional devices based on multiphysics coupling mechanisms can be developed to be integrated with the imaging and sensing systems. One example is the optical modulator or nonmagnetic isolator based on the acousto-optic coupling mechanism leveraging the piezoelectric effect of AlN on nanophotonics-based integrated platforms [162–164]. AlN is a suitable material for such multiphysics coupling since it not only provides the piezoelectric effect, but also exhibits excellent optical properties, including a wide transparency window from UV to MIR [153] and significant second- and third-order nonlinear optical effects [185–188]. Also, AlN can be deposited and patterned on Si substrates using the CMOS-compatible fabrication process, which ensures mass production of the devices at low cost [189, 190]. Another example is an optical switch based on an optoelectro-mechanical effect reported in the study by Haffner et al. [191]. The demonstrated optical devices have the potential to enable reprogrammable optical systems at large scale.

List of abbreviations

ADAS  advanced driver-assistance systems
AFG  arbitrary function generator
AOM  acousto-optic modulator
CCD  charge-coupled device
CCW  counter-clockwise
CIRC  circulator
CIS  CMOS image sensor
CMOS  complementary metal-oxide-semiconductor
CW  clockwise
DEMUX  demultiplexer
DKS  dissipative Kerr soliton
EDFA  erbium-doped fiber amplifier
EOM  electro-optic modulator
ESA  electric signal analyzer
FBG  fiber Bragg grating
FIR  far infrared
FMCW  frequency-modulated continuous wave
FWM  four-wave mixing
GPS  global positioning system
IgG  immunoglobulin G
IMU  inertial measurement unit
IR  infrared
LED  light-emitting diode
LIDAR  light detection and ranging
LWIR  long-wavelength infrared
MB+  methylene blue
MEMS  microelectromechanical systems
MIR  mid infrared
MSOT  multispectral optoacoustic tomography
NHA  nanohole array
NIR  near infrared
OCT  optical coherence tomography
OSA  optical spectrum analyzer
PC  polarization controller
PD  photodetector
PDMS  polydimethylsiloxane
Q  quality factor
RGB  red, green, blue
SD  standard deviation
SEM  scanning electron microscopy
SPAD  single-photon avalanche photodetector
SWIR  short-wavelength infrared
ToF  time of flight
UV  ultraviolet

Acknowledgment: The authors thank Dr. Chen Liu and Dr. Elaine Cristina Schubert Barretto for the valuable discussions.

Author statement: All the authors have accepted responsibility for the content of this review and approved submission.

Research funding: This work is supported by the Agency for Science, Technology and Research, Singapore, IAF-PP A19B3a0008 and IAF-PP A1789a0024.

Conflict of interest statement: The authors declare no conflict of interest.

References

[1] P. G. Hancke, D. B. Silva, and H. P. Gerhard, Jr., "The role of advanced sensing in smart cities," Sensors, vol. 13, no. 1, 2013.

[2] Q. Li, X. He, Y. Wang, H. Liu, D. Xu, and F. Guo, "Review of spectral imaging technology in biomedical engineering: achievements and challenges," J. Biomed. Opt., vol. 18, no. 10, pp. 1–29, 2013.

[3] A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, "Imaging spectrometry for earth remote sensing," Science, vol. 228, no. 4704, p. 1147, 1985.

[4] J. Transon, R. d'Andrimont, A. Maugnard, and P. Defourny, "Survey of hyperspectral earth observation applications from space in the sentinel-2 context," Rem. Sens., vol. 10, no. 2, p. 157, 2018.

[5] S. Weksler, O. Rozenstein, and E. Ben-Dor, "Mapping surface quartz content in sand dunes covered by biological soil crusts using airborne hyperspectral images in the longwave infrared region," Minerals, vol. 8, no. 8, p. 318, 2018.

[6] M. K. Tripathi and H. Govil, "Evaluation of AVIRIS-NG hyperspectral images for mineral identification and mapping," Heliyon, vol. 5, no. 11, p. e02931, 2019.



[7] H. Eichstaedt, T. Tsedenbaljir, R. Kahnt, et al., “Quantitativeestimation of clay minerals in airborne hyperspectral data usinga calibration field,” J. Appl. Rem. Sens., vol. 14, no. 3, pp. 1–17,2020.

[8] A. Picon, A. Bereciartua, J. Echazarra, O. Ghita, P. F. Whelan, andP. M. Iriondo, “Real-time hyperspectral processing for automaticnonferrous material sorting,” J. Electron. Imag., vol. 21, no. 1,pp. 1–10, 2012.

[9] L.M. Kandpal, J. Tewari, N. Gopinathan, P. Boulas, andB.-K. Cho,“In-process control assay of pharmaceutical microtablets usinghyperspectral imaging coupled with multivariate analysis,”Anal. Chem., vol. 88, no. 22, pp. 11055–11061, 2016.

[10] C. Hopkinson, L. Chasmer, C. Gynan, C. Mahoney, and M. Sitar,“Multisensor and multispectral LiDAR characterization andclassification of a forest environment,” Can. J. Rem. Sens.,vol. 42, no. 5, pp. 501–520, 2016.

[11] S. Morsy, A. Shaker, and A. El-Rabbany, “Multispectral LiDARdata for land cover classification of urban areas,” Sensors,vol. 17, no. 5, p. 958, 2017.

[12] L.-Z. Huo, C. A. Silva, C. Klauberg, et al., “Supervised spatialclassification of multispectral LiDAR data in urban areas,” PloSOne, vol. 13, no. 10, p. e0206185, 2018.

[13] G. Zhao, M. Ljungholm, E. Malmqvist, et al., “Inelastichyperspectral lidar for profiling aquatic ecosystems,” LaserPhotonics Rev., vol. 10, no. 5, pp. 807–813, 2016.

[14] P. S. Roy, “Spectral reflectance characteristics of vegetation andtheir use in estimating productive potential,” Proc. Plant Sci.,vol. 99, no. 1, pp. 59–81, 1989.

[15] A. L. Pisello, G. Pignatta, V. L. Castaldo, and F. Cotana,“Experimental analysis of natural gravel covering as cool roofingand cool pavement,” Sustainability, vol. 6, no. 8, 2014, https://doi.org/10.3390/su6084706.

[16] E. Puttonen, J. Suomalainen, T. Hakala, and J. Peltoniemi,“Measurement of reflectance properties of asphalt surfaces andtheir usability as reference targets for aerial photos,” IEEE Trans.Geosci. Rem. Sens., vol. 47, no. 7, pp. 2330–2339, 2009.

[17] J. F. Bell, S. W. Squyres, R. E. Arvidson, et al., “Pancammultispectral imaging results from the opportunity rover atmeridiani planum,” Science, vol. 306, no. 5702, p. 1703, 2004.

[18] J. Taher, Deep Learning for Road Area Semantic Segmentation inMultispectral Lidar Data, Master Thesis, Aalto University, 2019.

[19] K. Takumi, K. Watanabe, Q. Ha, A. Tejero-De-Pablos, Y. Ushiku,and T. Harada, “Multispectral object detection for autonomousvehicles,” Proc. Themat. Workshop ACM Multimed., vol. 2017,pp. 35–43, 2017.

[20] Y. Choi, N. Kim, S. Hwang, et al., “KAISTmulti-spectral day/nightdata set for autonomous and assisted driving,” IEEE Trans. Intell.Transport. Syst., vol. 19, no. 3, pp. 934–948, 2018.

[21] Z. Hu, C. Fang, B. Li, et al., “First-in-human liver-tumour surgeryguided by multispectral fluorescence imaging in the visible andnear-infrared-I/II windows,” Nat. Biomed. Eng., vol. 4, no. 3,pp. 259–271, 2020.

[22] G. Lu and B. Fei, “Medical hyperspectral imaging: A review,”J. Biomed. Opt., vol. 19, no. 1, pp. 1–24, 2014.

[23] H. Steiner, S. Sporrer, A. Kolb, and N. Jung, “Design of an activemultispectral SWIR camera system for skin detection and faceverification,” J. Sens., vol. 2016, p. 9682453, 2015.

[24] N. T. Vetrekar, R. Raghavendra, and R. S. Gad, “Low-cost multi-spectral face imaging for robust face recognition,” in 2016 IEEE

International Conference on Imaging Systems and Techniques(IST), Chania, Greece, IEEE, 2016, pp. 324–329.

[25] H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A, vol. 106, no. 2, pp. 309–323, 2012.

[26] F. Daniel, A. Mounier, J. Pérez-Arantegui, et al., “Hyperspectral imaging applied to the analysis of Goya paintings in the Museum of Zaragoza (Spain),” Microchem. J., vol. 126, pp. 113–120, 2016.

[27] H. Park and K. B. Crozier, “Multispectral imaging with vertical silicon nanowires,” Sci. Rep., vol. 3, no. 1, p. 2460, 2013.

[28] L. Duempelmann, B. Gallinet, and L. Novotny, “Multispectral imaging with tunable plasmonic filters,” ACS Photonics, vol. 4, no. 2, pp. 236–241, 2017.

[29] Z. Wang, S. Yi, A. Chen, et al., “Single-shot on-chip spectral sensors based on photonic crystal slabs,” Nat. Commun., vol. 10, no. 1, p. 1020, 2019.

[30] N. A. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng., vol. 52, no. 9, pp. 1–23, 2013.

[31] M. J. Khan, H. S. Khan, A. Yousaf, K. Khurshid, and A. Abbas, “Modern trends in hyperspectral image analysis: A review,” IEEE Access, vol. 6, pp. 14118–14129, 2018.

[32] H. Liu, B. Bruning, T. Garnett, and B. Berger, “Hyperspectral imaging and 3D technologies for plant phenotyping: from satellite to close-range sensing,” Comput. Electron. Agric., vol. 175, p. 105621, 2020.

[33] Y. D. Shah, P. W. R. Connolly, J. P. Grant, et al., “Ultralow-light-level color image reconstruction using high-efficiency plasmonic metasurface mosaic filters,” Optica, vol. 7, no. 6, pp. 632–639, 2020.

[34] P. Trocha, M. Karpov, D. Ganin, et al., “Ultrafast optical ranging using microresonator soliton frequency combs,” Science, vol. 359, no. 6378, p. 887, 2018.

[35] J. M. Jurado, L. Ortega, J. J. Cubillas, and F. R. Feito, “Multispectral mapping on 3D models and multi-temporal monitoring for individual characterization of olive trees,” Rem. Sens., vol. 12, no. 7, 2020, https://doi.org/10.3390/rs12071106.

[36] Y. Torres, J. J. Arranz, J. M. Gaspar-Escribano, et al., “Integration of LiDAR and multispectral images for rapid exposure and earthquake vulnerability estimation. Application in Lorca, Spain,” Int. J. Appl. Earth Obs., vol. 81, pp. 161–175, 2019.

[37] V. Neuschmelting, S. Harmsen, N. Beziere, et al., “Dual-modality surface-enhanced resonance Raman scattering and multispectral optoacoustic tomography nanoparticle approach for brain tumor delineation,” Small, vol. 14, no. 23, p. 1800740, 2018.

[38] S.-J. Park, C. J. H. Ho, S. Arai, A. Samanta, M. Olivo, and Y.-T. Chang, “Visualizing Alzheimer’s disease mouse brain with multispectral optoacoustic tomography using a fluorescent probe, CDnir7,” Sci. Rep., vol. 9, no. 1, p. 12052, 2019.

[39] D. Zhang, Z. Guo, and Y. Gong, “Multispectral iris acquisition system,” in Multispectral Biometrics: Systems and Applications, D. Zhang, Z. Guo, and Y. Gong, Eds., Springer International Publishing, 2016, pp. 39–62.

[40] M. E. Hussein, L. Spinoulas, F. Xiong, and W. Abd-Almageed, “Fingerprint presentation attack detection using a novel multi-spectral capture device and patch-based convolutional neural networks,” in 2018 IEEE International Workshop on Information Forensics and Security (WIFS), Hong Kong, IEEE, 2018, pp. 1–8.

[41] D. Zhang, Z. Guo, and Y. Gong, “An online system of multispectral palmprint verification,” in Multispectral Biometrics: Systems and Applications, D. Zhang, Z. Guo, and Y. Gong, Eds., Springer International Publishing, 2016, pp. 117–137.

[42] H. Mahgoub, H. Chen, J. Gilchrist, T. Fearn, and M. Strlic, “Quantitative chemical near-infrared hyperspectral imaging of Islamic paper,” in ICOM-CC 18th Triennial Conference Preprints (International Council of Museums), Copenhagen, ICOM Committee for Conservation, 2017, p. 1606.

[43] M. Picollo, A. Casini, C. Cucci, J. Jussila, M. Poggesi, and L. Stefani, “A new compact VNIR hyperspectral imaging system for non-invasive analysis in the Fine Art and architecture fields,” Proc. e Rep., pp. 69–74, 2018, https://doi.org/10.36253/978-88-6453-707-8.16.

[44] M. Shimoni, R. Haelterman, and C. Perneel, “Hyperspectral imaging for military and security applications: combining myriad processing and sensing techniques,” IEEE Trans. Geosci. Rem. Sens., vol. 7, no. 2, pp. 101–117, 2019.

[45] X. Briottet, Y. Boucher, A. Dimmeler, et al., “Military applications of hyperspectral imagery,” in Targets and Backgrounds XII: Characterization and Representation, vol. 6239, Orlando, Florida, United States, Proc. SPIE, 2006, p. 62390B.

[46] V. Gujrati, A. Mishra, and V. Ntziachristos, “Molecular imaging probes for multi-spectral optoacoustic tomography,” Chem. Commun., vol. 53, no. 34, pp. 4653–4672, 2017.

[47] X. Ai, Z. Wang, H. Cheong, et al., “Multispectral optoacoustic imaging of dynamic redox correlation and pathophysiological progression utilizing upconversion nanoprobes,” Nat. Commun., vol. 10, no. 1, p. 1087, 2019.

[48] J. Yoon, J. Joseph, D. J. Waterhouse, et al., “A clinically translatable hyperspectral endoscopy (HySE) system for imaging the gastrointestinal tract,” Nat. Commun., vol. 10, no. 1, p. 1902, 2019.

[49] H. Fabelo, S. Ortega, D. Ravi, et al., “Spatio-spectral classification of hyperspectral images for brain cancer detection during surgical operations,” PLoS One, vol. 13, no. 3, p. e0193721, 2018.

[50] H. Fabelo, M. Halicek, S. Ortega, et al., “Surgical aid visualization system for glioblastoma tumor identification based on deep learning and in-vivo hyperspectral images of human patients,” in Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, vol. 10951, San Diego, California, United States, Proc. SPIE, 2019, p. 1095110.

[51] T. Hu, Q. Zhong, N. Li, et al., “CMOS-compatible a-Si metalenses on a 12-inch glass wafer for fingerprint imaging,” Nanophotonics, vol. 9, no. 4, 2020, https://doi.org/10.1515/nanoph-2019-0470.

[52] B. A. Ganji and M. S. Nateri, “A high sensitive MEMS capacitive fingerprint sensor using slotted membrane,” Microsyst. Technol., vol. 19, no. 1, pp. 121–129, 2013.

[53] H. Hassan and H.-W. Kim, “CMOS capacitive fingerprint sensor based on differential sensing circuit with noise cancellation,” Sensors, vol. 18, no. 7, 2018, https://doi.org/10.3390/s18072200.

[54] X. Jiang, Y. Lu, H.-Y. Tang, et al., “Monolithic ultrasound fingerprint sensor,” Microsyst. Nanoeng., vol. 3, no. 1, p. 17059, 2017.

[55] R. K. Rowe, K. A. Nixon, and P. W. Butler, “Multispectral fingerprint image acquisition,” in Advances in Biometrics, Springer, 2008, pp. 3–23.

[56] C. Cucci, A. Casini, L. Stefani, M. Picollo, and J. Jussila, “Bridging research with innovative products: A compact hyperspectral camera for investigating artworks: A feasibility study,” in Optics for Arts, Architecture, and Archaeology VI, vol. 10331, Munich, Germany, Proc. SPIE, 2017, p. 1033106.

[57] C. Cucci, J. K. Delaney, and M. Picollo, “Reflectance hyperspectral imaging for investigation of works of art: old master paintings and illuminated manuscripts,” Acc. Chem. Res., vol. 49, no. 10, pp. 2070–2079, 2016.

[58] A. F. Koenderink, A. Alù, and A. Polman, “Nanophotonics: shrinking light-based technology,” Science, vol. 348, no. 6234, p. 516, 2015.

[59] C. Sorace-Agaskar, P. T. Callahan, K. Shtyrkova, et al., “Integrated mode-locked lasers in a CMOS-compatible silicon photonic platform,” in CLEO: 2015, OSA Technical Digest (Online), San Jose, California, United States, Optical Society of America, 2015, p. SM2I.5.

[60] K. Shtyrkova, P. T. Callahan, N. Li, et al., “Integrated CMOS-compatible Q-switched mode-locked lasers at 1900 nm with an on-chip artificial saturable absorber,” Opt. Express, vol. 27, no. 3, pp. 3542–3556, 2019.

[61] N. Li, M. Xin, Z. Su, et al., “A silicon photonic data link with a monolithic erbium-doped laser,” Sci. Rep., vol. 10, no. 1, p. 1114, 2020.

[62] N. Li, Purnawirman, E. S. Magden, G. Singh, et al., “Wavelength division multiplexed light source monolithically integrated on a silicon photonics platform,” Opt. Lett., vol. 42, no. 9, pp. 1772–1775, 2017.

[63] Z. Su, N. Li, H. C. Frankis, et al., “High-Q-factor Al2O3 micro-trench cavities integrated with silicon nitride waveguides on silicon,” Opt. Express, vol. 26, no. 9, pp. 11161–11170, 2018.

[64] E. S. Magden, N. Li, M. Raval, et al., “Transmissive silicon photonic dichroic filters with spectrally selective waveguides,” Nat. Commun., vol. 9, no. 1, p. 3009, 2018.

[65] N. Li, Z. Su, Purnawirman, et al., “Athermal synchronization of laser source with WDM filter in a silicon photonics platform,” Appl. Phys. Lett., vol. 110, no. 21, p. 211105, 2017.

[66] Q. Chen, X. Hu, L. Wen, Y. Yu, and D. R. S. Cumming, “Nanophotonic image sensors,” Small, vol. 12, no. 36, pp. 4922–4935, 2016.

[67] F. Yesilkoy, E. R. Arvelo, Y. Jahani, et al., “Ultrasensitive hyperspectral imaging and biodetection enabled by dielectric metasurfaces,” Nat. Photonics, vol. 13, no. 6, pp. 390–396, 2019.

[68] M. Faraji-Dana, E. Arbabi, H. Kwon, et al., “Hyperspectral imager with folded metasurface optics,” ACS Photonics, vol. 6, no. 8, pp. 2161–2167, 2019.

[69] J. Riemensberger, A. Lukashchuk, M. Karpov, et al., “Massively parallel coherent laser ranging using a soliton microcomb,” Nature, vol. 581, no. 7807, pp. 164–170, 2020.

[70] M. Khorasaninejad, W. T. Chen, A. Y. Zhu, et al., “Multispectral chiral imaging with a metalens,” Nano Lett., vol. 16, no. 7, pp. 4595–4600, 2016.

[71] H. Qi, L. Kong, C. Wang, and L. Miao, “A hand-held mosaicked multispectral imaging device for early stage pressure ulcer detection,” J. Med. Syst., vol. 35, no. 5, pp. 895–904, 2011.

[72] L. Frey, P. Parrein, J. Raby, et al., “Color filters including infrared cut-off integrated on CMOS image sensor,” Opt. Express, vol. 19, no. 14, pp. 13073–13080, 2011.


[73] S. Nishiwaki, T. Nakamura, M. Hiramoto, T. Fujii, and M. Suzuki, “Efficient colour splitters for high-pixel-density image sensors,” Nat. Photonics, vol. 7, no. 3, pp. 240–246, 2013.

[74] W.-Y. Jang, Z. Ku, J. Jeon, et al., “Experimental demonstration of adaptive infrared multispectral imaging using plasmonic filter array,” Sci. Rep., vol. 6, no. 1, p. 34876, 2016.

[75] M. Najiminaini, F. Vasefi, B. Kaminska, and J. J. L. Carson, “Nanohole-array-based device for 2D snapshot multispectral imaging,” Sci. Rep., vol. 3, no. 1, p. 2589, 2013.

[76] S. P. Burgos, S. Yokogawa, and H. A. Atwater, “Color imaging via nearest neighbor hole coupling in plasmonic color filters integrated onto a complementary metal-oxide semiconductor image sensor,” ACS Nano, vol. 7, no. 11, pp. 10038–10047, 2013.

[77] A. Tittl, A. Leitis, M. Liu, et al., “Imaging-based molecular barcoding with pixelated dielectric metasurfaces,” Science, vol. 360, no. 6393, p. 1105, 2018.

[78] M.-G. Suh and K. J. Vahala, “Soliton microcomb range measurement,” Science, vol. 359, no. 6378, p. 884, 2018.

[79] L. A. Sterczewski, J. Westberg, Y. Yang, et al., “Terahertz hyperspectral imaging with dual chip-scale combs,” Optica, vol. 6, no. 6, pp. 766–771, 2019.

[80] N. Yu, P. Genevet, M. A. Kats, et al., “Light propagation with phase discontinuities: generalized laws of reflection and refraction,” Science, vol. 334, no. 6054, p. 333, 2011.

[81] N. Yu and F. Capasso, “Flat optics with designer metasurfaces,” Nat. Mater., vol. 13, no. 2, pp. 139–150, 2014.

[82] F. Aieta, P. Genevet, M. A. Kats, et al., “Aberration-free ultrathin flat lenses and axicons at telecom wavelengths based on plasmonic metasurfaces,” Nano Lett., vol. 12, no. 9, pp. 4932–4936, 2012.

[83] S. Zhang, M.-H. Kim, F. Aieta, et al., “High efficiency near diffraction-limited mid-infrared flat lenses based on metasurface reflectarrays,” Opt. Express, vol. 24, no. 16, pp. 18024–18034, 2016.

[84] I. Koirala, S.-S. Lee, and D.-Y. Choi, “Highly transmissive subtractive color filters based on an all-dielectric metasurface incorporating TiO2 nanopillars,” Opt. Express, vol. 26, no. 14, pp. 18320–18330, 2018.

[85] W. Yue, S. Gao, S.-S. Lee, E.-S. Kim, and D.-Y. Choi, “Highly reflective subtractive color filters capitalizing on a silicon metasurface integrated with nanostructured aluminum mirrors,” Laser Photonics Rev., vol. 11, no. 3, p. 1600285, 2017.

[86] N. Yu, F. Aieta, P. Genevet, M. A. Kats, Z. Gaburro, and F. Capasso, “A broadband, background-free quarter-wave plate based on plasmonic metasurfaces,” Nano Lett., vol. 12, no. 12, pp. 6328–6333, 2012.

[87] Z. H. Jiang, L. Lin, D. Ma, et al., “Broadband and wide field-of-view plasmonic metasurface-enabled waveplates,” Sci. Rep., vol. 4, p. 7511, 2014.

[88] Y. F. Yu, A. Y. Zhu, R. Paniagua-Domínguez, Y. H. Fu, B. Luk’yanchuk, and A. I. Kuznetsov, “High-transmission dielectric metasurface with 2π phase control at visible wavelengths,” Laser Photonics Rev., vol. 9, no. 4, pp. 412–418, 2015.

[89] Z. Zhou, J. Li, R. Su, et al., “Efficient silicon metasurfaces for visible light,” ACS Photonics, vol. 4, no. 3, pp. 544–551, 2017.

[90] K. Chen, J. Deng, N. Zhou, et al., “2π-space uniform-backscattering metasurfaces enabled with geometric phase and magnetic resonance in visible light,” Opt. Express, vol. 28, no. 8, pp. 12331–12341, 2020.

[91] Z. Li, Q. Dai, M. Q. Mehmood, et al., “Full-space cloud of random points with a scrambling metasurface,” Light Sci. Appl., vol. 7, no. 1, p. 63, 2018.

[92] Y.-W. Huang, H. W. H. Lee, R. Sokhoyan, et al., “Gate-tunable conducting oxide metasurfaces,” Nano Lett., vol. 16, no. 9, pp. 5319–5325, 2016.

[93] G. Kafaie Shirmanesh, R. Sokhoyan, R. A. Pala, and H. A. Atwater, “Dual-gated active metasurface at 1550 nm with wide (>300°) phase tunability,” Nano Lett., vol. 18, no. 5, pp. 2957–2963, 2018.

[94] P. C. Wu, R. A. Pala, G. Kafaie Shirmanesh, et al., “Dynamic beam steering with all-dielectric electro-optic III–V multiple-quantum-well metasurfaces,” Nat. Commun., vol. 10, no. 1, p. 3654, 2019.

[95] E. Khaidarov, Z. Liu, R. Paniagua-Domínguez, et al., “Control of LED emission with functional dielectric metasurfaces,” Laser Photonics Rev., vol. 14, no. 1, p. 1900235, 2020.

[96] Y.-Y. Xie, P.-N. Ni, Q.-H. Wang, et al., “Metasurface-integrated vertical cavity surface-emitting lasers for programmable directional lasing emissions,” Nat. Nanotechnol., vol. 15, no. 2, pp. 125–130, 2020.

[97] W. T. Chen, A. Y. Zhu, and F. Capasso, “Flat optics with dispersion-engineered metasurfaces,” Nat. Rev. Mater., vol. 5, no. 8, pp. 604–620, 2020.

[98] Z. Shi, M. Khorasaninejad, Y.-W. Huang, et al., “Single-layer metasurface with controllable multiwavelength functions,” Nano Lett., vol. 18, no. 4, pp. 2420–2427, 2018.

[99] S. Wang, P. C. Wu, V.-C. Su, et al., “Broadband achromatic optical metasurface devices,” Nat. Commun., vol. 8, no. 1, p. 187, 2017.

[100] F. Aieta, M. A. Kats, P. Genevet, and F. Capasso, “Multiwavelength achromatic metasurfaces by dispersive phase compensation,” Science, vol. 347, no. 6228, pp. 1342–1345, 2015.

[101] W. T. Chen, A. Y. Zhu, V. Sanjeev, et al., “A broadband achromatic metalens for focusing and imaging in the visible,” Nat. Nanotechnol., vol. 13, no. 3, pp. 220–226, 2018.

[102] R. C. Devlin, M. Khorasaninejad, W. T. Chen, J. Oh, and F. Capasso, “Broadband high-efficiency dielectric metasurfaces for the visible spectrum,” Proc. Natl. Acad. Sci. U.S.A., vol. 113, no. 38, p. 10473, 2016.

[103] S. Wang, P. C. Wu, V.-C. Su, et al., “A broadband achromatic metalens in the visible,” Nat. Nanotechnol., vol. 13, no. 3, pp. 227–232, 2018.

[104] E. Arbabi, S. M. Kamali, A. Arbabi, and A. Faraon, “Full-Stokes imaging polarimetry using dielectric metasurfaces,” ACS Photonics, vol. 5, no. 8, pp. 3132–3140, 2018.

[105] N. A. Rubin, G. D’Aversa, P. Chevalier, Z. Shi, W. T. Chen, and F. Capasso, “Matrix Fourier optics enables a compact full-Stokes polarization camera,” Science, vol. 365, no. 6448, p. eaax1839, 2019.

[106] Y. Intaravanne and X. Chen, “Recent advances in optical metasurfaces for polarization detection and engineered polarization profiles,” Nanophotonics, vol. 9, no. 5, pp. 1003–1014, 2020.

[107] M. Faraji-Dana, E. Arbabi, A. Arbabi, S. M. Kamali, H. Kwon, and A. Faraon, “Compact folded metasurface spectrometer,” Nat. Commun., vol. 9, no. 1, p. 4196, 2018.


[108] Q. Chen, D. Das, D. Chitnis, et al., “A CMOS image sensor integrated with plasmonic colour filters,” Plasmonics, vol. 7, no. 4, pp. 695–699, 2012.

[109] Q. Chen, D. Chitnis, K. Walls, T. D. Drysdale, S. Collins, and D. R. S. Cumming, “CMOS photodetectors integrated with plasmonic color filters,” IEEE Photonics Technol. Lett., vol. 24, no. 3, pp. 197–199, 2012.

[110] Y.-T. Yoon, S.-S. Lee, and B.-S. Lee, “Nano-patterned visible wavelength filter integrated with an image sensor exploiting a 90-nm CMOS process,” Photonics Nanostruct.: Fundam. Appl., vol. 10, no. 1, pp. 54–59, 2012.

[111] S. Yokogawa, S. P. Burgos, and H. A. Atwater, “Plasmonic color filters for CMOS image sensor applications,” Nano Lett., vol. 12, no. 8, pp. 4349–4354, 2012.

[112] N. Li, H. Y. Fu, Y. Dong, et al., “Large-area pixelated metasurface beam deflector on a 12-inch glass wafer for random point generation,” Nanophotonics, vol. 8, no. 10, p. 1855, 2019.

[113] N. Li, Z. Xu, Y. Dong, et al., “Large-area metasurface on CMOS-compatible fabrication platform: driving flat optics from lab to fab,” Nanophotonics, vol. 9, no. 10, pp. 3071–3087, 2020.

[114] J. W. Stewart, G. M. Akselrod, D. R. Smith, and M. H. Mikkelsen, “Toward multispectral imaging with colloidal metasurface pixels,” Adv. Mater., vol. 29, no. 6, p. 1602971, 2017.

[115] Z. Xu, Y. Dong, C.-K. Tseng, et al., “CMOS-compatible all-Si metasurface polarizing bandpass filters on 12-inch wafers,” Opt. Express, vol. 27, no. 18, pp. 26060–26069, 2019.

[116] Z. Xu, N. Li, Y. Dong, et al., “Metasurface-based subtractive color filter fabricated on a 12-inch glass wafer using CMOS platform,” Photonics Res., vol. 9, no. 1, pp. 13–20, 2020.

[117] L. Kong, S. Stephen, M. G. Duckworth, et al., “Handheld erythema and bruise detector,” in Medical Imaging 2008: Computer-Aided Diagnosis, vol. 6915, San Diego, California, United States, Proc. SPIE, 2008, p. 69153K.

[118] L. Kong, D. Yi, S. H. Sprigle, et al., “Single sensor that outputs narrowband multispectral images,” J. Biomed. Opt., vol. 15, no. 1, pp. 1–3, 2010.

[119] J. Berzinš, S. Fasold, T. Pertsch, S. M. B. Bäumer, and F. Setzpfandt, “Submicrometer nanostructure-based RGB filters for CMOS image sensors,” ACS Photonics, vol. 6, no. 4, pp. 1018–1025, 2019.

[120] T. Xu, Y.-K. Wu, X. Luo, and L. J. Guo, “Plasmonic nanoresonators for high-resolution colour filtering and spectral imaging,” Nat. Commun., vol. 1, no. 1, p. 59, 2010.

[121] D. Fleischman, L. A. Sweatlock, H. Murakami, and H. Atwater, “Hyper-selective plasmonic color filters,” Opt. Express, vol. 25, no. 22, pp. 27386–27395, 2017.

[122] I. J. H. McCrindle, J. P. Grant, L. C. P. Gouveia, and D. R. S. Cumming, “Infrared plasmonic filters integrated with an optical and terahertz multi-spectral material,” Phys. Status Solidi, vol. 212, no. 8, pp. 1625–1633, 2015.

[123] J. Grant, I. J. H. McCrindle, C. Li, and D. R. S. Cumming, “Multispectral metamaterial absorber,” Opt. Lett., vol. 39, no. 5, pp. 1227–1230, 2014.

[124] J. Bao and M. G. Bawendi, “A colloidal quantum dot spectrometer,” Nature, vol. 523, no. 7558, pp. 67–70, 2015.

[125] V. Jovanov, H. Stiebig, and D. Knipp, “Tunable multispectral color sensor with plasmonic reflector,” ACS Photonics, vol. 5, no. 2, pp. 378–383, 2018.

[126] K. W. Mauser, S. Kim, S. Mitrovic, et al., “Resonant thermoelectric nanophotonics,” Nat. Nanotechnol., vol. 12, no. 8, pp. 770–775, 2017.

[127] A. Wang and Y. Dan, “Mid-infrared plasmonic multispectral filters,” Sci. Rep., vol. 8, no. 1, p. 11257, 2018.

[128] J. Feng, X. Li, Z. Shi, et al., “2D ductile transition metal chalcogenides (TMCs): novel high-performance Ag2S nanosheets for ultrafast photonics,” Adv. Opt. Mater., vol. 8, no. 6, p. 1901762, 2020.

[129] T. Feng, X. Li, P. Guo, Y. Zhang, J. Liu, and H. Zhang, “MXene: two dimensional inorganic compounds, for generation of bound state soliton pulses in nonlinear optical system,” Nanophotonics, vol. 9, no. 8, pp. 2505–2513, 2020.

[130] D. Zhang, C. Zhang, X. Li, and A. Qyyum, “Layered iron pyrite for ultrafast photonics application,” Nanophotonics, vol. 9, no. 8, pp. 2515–2522, 2020.

[131] P. Guo, X. Li, T. Feng, Y. Zhang, and W. Xu, “Few-layer bismuthene for coexistence of harmonic and dual wavelength in a mode-locked fiber laser,” ACS Appl. Mater. Interfaces, vol. 12, no. 28, pp. 31757–31763, 2020.

[132] J. Feng, X. Li, G. Zhu, and Q. J. Wang, “Emerging high-performance SnS/CdS nanoflower heterojunction for ultrafast photonics,” ACS Appl. Mater. Interfaces, vol. 12, no. 38, pp. 43098–43105, 2020.

[133] Y. Zhao, W. Wang, X. Li, et al., “Functional porous MOF-derived CuO octahedra for harmonic soliton molecule pulses generation,” ACS Photonics, vol. 7, no. 9, pp. 2440–2447, 2020.

[134] T. Feng, X. Li, T. Chai, et al., “Bismuthene nanosheets for 1 μm multipulse generation,” Langmuir, vol. 36, no. 1, pp. 3–8, 2020.

[135] Y. Zhang, X. Li, A. Qyyum, et al., “PbS nanoparticles for ultrashort pulse generation in optical communication region,” Part. Part. Syst. Char., vol. 35, no. 11, p. 1800341, 2018.

[136] Y. Wang, Y. Chen, X. Li, et al., “Optical-intensity modulator with InSb nanosheets,” Appl. Mater. Today, vol. 21, p. 100852, 2020.

[137] X. Li, J. Feng, W. Mao, F. Yin, and J. Jiang, “Emerging uniform Cu2O nanocubes for 251st harmonic ultrashort pulse generation,” J. Mater. Chem. C, vol. 8, no. 41, pp. 14386–14392, 2020.

[138] Z. Gong, A. Bruch, M. Shen, et al., “High-fidelity cavity soliton generation in crystalline AlN micro-ring resonators,” Opt. Lett., vol. 43, no. 18, pp. 4366–4369, 2018.

[139] J. Liu, H. Tian, E. Lucas, et al., “Monolithic piezoelectric control of soliton microcombs,” Nature, vol. 583, no. 7816, pp. 385–390, 2020.

[140] A. W. Bruch, X. Liu, Z. Gong, et al., “Pockels soliton microcomb,” Nat. Photonics, vol. 15, pp. 21–27, 2021.

[141] J. Liu, E. Lucas, A. S. Raja, et al., “Photonic microwave generation in the X- and K-band using integrated soliton microcombs,” Nat. Photonics, vol. 14, no. 8, pp. 486–491, 2020.

[142] B. Shen, L. Chang, J. Liu, et al., “Integrated turnkey soliton microcombs,” Nature, vol. 582, no. 7812, pp. 365–369, 2020.

[143] Th. Udem, R. Holzwarth, and T. W. Hänsch, “Optical frequency metrology,” Nature, vol. 416, no. 6877, pp. 233–237, 2002.

[144] A. Schliesser, N. Picqué, and T. W. Hänsch, “Mid-infrared frequency combs,” Nat. Photonics, vol. 6, no. 7, pp. 440–449, 2012.


[145] D. T. Spencer, T. Drake, T. C. Briles, et al., “An optical-frequency synthesizer using integrated photonics,” Nature, vol. 557, no. 7703, pp. 81–85, 2018.

[146] N. Singh, M. Xin, N. Li, et al., “Silicon photonics optical frequency synthesizer,” Laser Photonics Rev., vol. 14, p. 1900449, 2020.

[147] M. Xin, N. Li, N. Singh, et al., “Optical frequency synthesizer with an integrated erbium tunable laser,” Light Sci. Appl., vol. 8, no. 1, p. 122, 2019.

[148] Y. Na, C.-G. Jeon, C. Ahn, et al., “Ultrafast, sub-nanometre-precision and multifunctional time-of-flight detection,” Nat. Photonics, vol. 14, no. 6, pp. 355–360, 2020.

[149] E. Lucas, P. Brochard, R. Bouchand, S. Schilt, T. Südmeyer, and T. J. Kippenberg, “Ultralow-noise photonic microwave synthesis using a soliton microcomb-based transfer oscillator,” Nat. Commun., vol. 11, no. 1, p. 374, 2020.

[150] S. A. Diddams, “The evolving optical frequency comb [Invited],” J. Opt. Soc. Am. B, vol. 27, no. 11, pp. B51–B62, 2010.

[151] T. Fortier and E. Baumann, “20 years of developments in optical frequency comb technology and applications,” Commun. Phys., vol. 2, no. 1, p. 153, 2019.

[152] S. A. Diddams, K. Vahala, and T. Udem, “Optical frequency combs: coherently uniting the electromagnetic spectrum,” Science, vol. 369, no. 6501, p. eaay3676, 2020.

[153] A. L. Gaeta, M. Lipson, and T. J. Kippenberg, “Photonic-chip-based frequency combs,” Nat. Photonics, vol. 13, no. 3, pp. 158–169, 2019.

[154] E. Obrzud, M. Rainer, A. Harutyunyan, et al., “A microphotonic astrocomb,” Nat. Photonics, vol. 13, no. 1, pp. 31–35, 2019.

[155] M. Kues, C. Reimer, J. M. Lukens, et al., “Quantum optical microcombs,” Nat. Photonics, vol. 13, no. 3, pp. 170–179, 2019.

[156] T. Herr, K. Hartinger, J. Riemensberger, et al., “Universal formation dynamics and noise of Kerr-frequency combs in microresonators,” Nat. Photonics, vol. 6, no. 7, pp. 480–487, 2012.

[157] P. Del’Haye, A. Schliesser, O. Arcizet, T. Wilken, R. Holzwarth, and T. J. Kippenberg, “Optical frequency comb generation from a monolithic microresonator,” Nature, vol. 450, no. 7173, pp. 1214–1217, 2007.

[158] M. H. P. Pfeiffer, A. Kordts, V. Brasch, et al., “Photonic Damascene process for integrated high-Q microresonator based nonlinear photonics,” Optica, vol. 3, no. 1, pp. 20–25, 2016.

[159] J. Sun, E. Timurdogan, A. Yaacobi, E. S. Hosseini, and M. R. Watts, “Large-scale nanophotonic phased array,” Nature, vol. 493, no. 7431, pp. 195–199, 2013.

[160] C. V. Poulton, M. J. Byrd, M. Raval, et al., “Large-scale silicon nitride nanophotonic phased arrays at infrared and visible wavelengths,” Opt. Lett., vol. 42, no. 1, pp. 21–24, 2017.

[161] J. Notaros, N. Li, C. V. Poulton, et al., “CMOS-compatible optical phased array powered by a monolithically-integrated erbium laser,” J. Lightwave Technol., vol. 37, no. 24, pp. 5982–5987, 2019.

[162] S. A. Tadesse and M. Li, “Sub-optical wavelength acoustic wave modulation of integrated photonic resonators at microwave frequencies,” Nat. Commun., vol. 5, no. 1, p. 5402, 2014.

[163] D. B. Sohn, S. Kim, and G. Bahl, “Time-reversal symmetry breaking with acoustic pumping of nanophotonic circuits,” Nat. Photonics, vol. 12, no. 2, pp. 91–97, 2018.

[164] H. Tian, J. Liu, B. Dong, et al., “Hybrid integrated photonics using bulk acoustic resonators,” Nat. Commun., vol. 11, no. 1, p. 3073, 2020.

[165] Y. Jiang, S. Karpf, and B. Jalali, “Time-stretch LiDAR as a spectrally scanned time-of-flight ranging camera,” Nat. Photonics, vol. 14, no. 1, pp. 14–18, 2020.

[166] N. Singh, M. Xin, D. Vermeulen, et al., “Octave-spanning coherent supercontinuum generation in silicon on insulator from 1.06 μm to beyond 2.4 μm,” Light Sci. Appl., vol. 7, p. 17131, 2018.

[167] N. Singh, D. Vermeulen, A. Ruocco, et al., “Supercontinuum generation in varying dispersion and birefringent silicon waveguide,” Opt. Express, vol. 27, no. 22, pp. 31698–31712, 2019.

[168] B. Stern, X. Ji, Y. Okawachi, A. L. Gaeta, and M. Lipson, “Battery-operated integrated frequency comb generator,” Nature, vol. 562, no. 7727, pp. 401–405, 2018.

[169] Z. Dong, J. Ho, Y. F. Yu, et al., “Printing beyond sRGB color gamut by mimicking silicon nanostructures in free-space,” Nano Lett., vol. 17, no. 12, pp. 7620–7628, 2017.

[170] J. Proust, F. Bedu, B. Gallas, I. Ozerov, and N. Bonod, “All-dielectric colored metasurfaces with silicon Mie resonators,” ACS Nano, vol. 10, no. 8, pp. 7761–7767, 2016.

[171] M. Miyata, H. Hatada, and J. Takahara, “Full-color subwavelength printing with gap-plasmonic optical antennas,” Nano Lett., vol. 16, no. 5, pp. 3166–3172, 2016.

[172] S. J. Tan, L. Zhang, D. Zhu, et al., “Plasmonic color palettes for photorealistic printing with aluminum nanostructures,” Nano Lett., vol. 14, no. 7, pp. 4023–4029, 2014.

[173] A. Kristensen, J. K. W. Yang, S. I. Bozhevolnyi, et al., “Plasmonic colour generation,” Nat. Rev. Mater., vol. 2, no. 1, p. 16088, 2016.

[174] T. Lee, J. Jang, H. Jeong, and J. Rho, “Plasmonic- and dielectric-based structural coloring: from fundamentals to practical applications,” Nano Converg., vol. 5, no. 1, p. 1, 2018.

[175] Y. Dong, Z. Xu, N. Li, et al., “Si metasurface half-wave plates demonstrated on a 12-inch CMOS platform,” Nanophotonics, vol. 9, no. 1, pp. 149–157, 2019.

[176] Q. Zhong, Y. Li, T. Hu, et al., “1550 nm-wavelength metalens demonstrated on 12-inch Si CMOS platform,” in 2019 IEEE 16th International Conference on Group IV Photonics (GFP), Singapore, IEEE, 2019, pp. 1–2.

[177] J.-S. Park, S. Zhang, A. She, et al., “All-glass, large metalens at visible wavelength using deep-ultraviolet projection lithography,” Nano Lett., vol. 19, no. 12, pp. 8673–8682, 2019.

[178] S. Colburn, A. Zhan, E. Bayati, et al., “Broadband transparent and CMOS-compatible flat optics with silicon nitride metasurfaces [Invited],” Opt. Mater. Express, vol. 8, no. 8, pp. 2330–2344, 2018.

[179] L. Guo, Z. Hu, R. Wan, et al., “Design of aluminum nitride metalens for broadband ultraviolet incidence routing,” Nanophotonics, vol. 8, no. 1, pp. 171–180, 2019.

[180] Z. Hu, L. Long, R. Wan, et al., “Ultrawide bandgap AlN metasurfaces for ultraviolet focusing and routing,” Opt. Lett., vol. 45, no. 13, pp. 3466–3469, 2020.

[181] E. Arbabi, A. Arbabi, S. M. Kamali, Y. Horie, M. Faraji-Dana, and A. Faraon, “MEMS-tunable dielectric metasurface lens,” Nat. Commun., vol. 9, no. 1, p. 812, 2018.

[182] T. Roy, S. Zhang, I. W. Jung, M. Troccoli, F. Capasso, and D. Lopez, “Dynamic metasurface lens based on MEMS technology,” APL Photonics, vol. 3, no. 2, p. 021302, 2018.


[183] A. She, S. Zhang, S. Shian, D. R. Clarke, and F. Capasso, “Adaptive metalenses with simultaneous electrical control of focal length, astigmatism, and shift,” Sci. Adv., vol. 4, no. 2, p. eaap9957, 2018.

[184] M. J. Byrd, E. Timurdogan, Z. Su, et al., “Mode-evolution-based coupler for high saturation power Ge-on-Si photodetectors,” Opt. Lett., vol. 42, no. 4, pp. 851–854, 2017.

[185] C. Xiong, W. H. P. Pernice, and H. X. Tang, “Low-loss, silicon integrated, aluminum nitride photonic circuits and their use for electro-optic signal processing,” Nano Lett., vol. 12, no. 7, pp. 3562–3568, 2012.

[186] C. Xiong, W. H. P. Pernice, X. Sun, C. Schuck, K. Y. Fong, and H. X. Tang, “Aluminum nitride as a new material for chip-scale optomechanics and nonlinear optics,” New J. Phys., vol. 14, no. 9, p. 095014, 2012.

[187] H. Jung, R. Stoll, X. Guo, D. Fischer, and H. X. Tang, “Green, red, and IR frequency comb line generation from single IR pump in AlN microring resonator,” Optica, vol. 1, no. 6, pp. 396–399, 2014.

[188] H. Jung and H. X. Tang, “Aluminum nitride as nonlinear optical material for on-chip frequency comb generation and frequency conversion,” Nanophotonics, vol. 5, no. 2, pp. 263–271, 2016.

[189] S. Zhu and G.-Q. Lo, “Aluminum nitride electro-optic phase shifter for backend integration on silicon,” Opt. Express, vol. 24, no. 12, pp. 12501–12506, 2016.

[190] S. Zhu, Q. Zhong, T. Hu, et al., “Aluminum nitride ultralow loss waveguides and push-pull electro-optic modulators for near infrared and visible integrated photonics,” in Optical Fiber Communication Conference (OFC) 2019, OSA Technical Digest (Optical Society of America), San Diego, California, United States, OSA Publishing, 2019, p. W2A.11.

[191] C. Haffner, A. Joerg, M. Doderer, et al., “Nano–opto-electro-mechanical switches operated at CMOS-level voltages,” Science, vol. 366, no. 6467, p. 860, 2019.
