
Bare finger 3D air-touch system with embedded multiwavelength optical sensor arrays for mobile 3D displays

Guo-Zhen Wang
Yi-Pai Huang (SID Senior Member)
Tian-Sheuan Chang

Abstract — A camera-free 3D air-touch system was proposed. Hovering, air swiping, and 3D gestures for further interaction with the floated 3D images on the mobile display were demonstrated. By embedding multiwavelength optical sensors into the display pixels and adding multiwavelength angular-scanning illuminators on the edge of the display, the flat panel can sense images reflected by a bare finger at different heights. In addition, the three-axis (x, y, z) information of the reflected image of the fingertip can be calculated. Finally, the proposed 3D air-touch system was successfully demonstrated on a 4-inch mobile 3D display.

Received 05/05/13; accepted 10/09/13.
G-Z. Wang and T-S. Chang are with the Department of Electronics Engineering and Institute of Electronics, National Chiao Tung University, Taiwan, China.
Y-P. Huang is with the Department of Photonics and Display Institute, National Chiao Tung University, Taiwan, China; e-mail: [email protected].
© Copyright 2014 Society for Information Display 1071-0922/14/2109-0190$1.00.

Keywords — 3D interactive, air touch, embedded optical sensors, mobile 3D display.

DOI # 10.1002/jsid.190

1 Introduction

The widespread adoption of smartphones and tablets has accelerated the transformation of user interfaces and has paved the way for multi-touch technologies at the beginning of the 21st century.1 Meanwhile, 3D technologies2–5 have dramatically changed the user experience. Today's 3D display systems can provide new advantages to end-users and are able to support an autostereoscopic, no-glasses, 3D experience with significantly enhanced image quality over earlier technology. Furthermore, numerous 3D interactive systems for providing friendlier and more intuitive user interfaces have been proposed for 3D display applications.

Currently, there are two categories of 3D interactive systems6,7: machine-based and camera-based. Machine-based systems8 involve devices worn by the user for motion detection and support vibration feedback. For instance, the Haptic Workstation™ can detect six axes of hand movement through data gloves9 and can render force feedback on the wrists. Although machine-based systems have force-feedback functions, they are often considered inconvenient because of the bulk and weight of the devices.

For camera-based systems,10 3D information (x, y, z) can be calculated by one of a variety of camera technologies. For instance, the popular Wii and Kinect game consoles11,12 are able to detect relative 3D position by using infrared (IR) cameras with corresponding IR light sources.13 However, these camera-based systems are limited by their fields of view and by factors that prevent them from detecting an object's proximity to the display. These factors prevent them from being integrated into mobile devices such as smart phones, tablets, or laptops. Additionally, high-resolution data are necessary for a camera-based system to calculate 3D position, and the resolution is proportional to the size of the camera's charge-coupled device; this factor also impedes integration into portable devices.

For the integration of these technologies into mobile devices, embedded optical sensors that can be integrated within the display pixels were proposed to keep the system light and thin. Embedding the optical sensors onto a thin-film transistor array substrate was first proposed for 2D touch applications by W. D. Boer et al. in 2003.14 As shown in Fig. 1, where the black matrix does not block the sensor, a photocurrent is generated when the sensor receives external light. Thus, the embedded optical sensor-based system becomes a kind of 2D touch technology.15

To extend the embedded optical sensor-based system to a 3D interface, light-pen touch was proposed.16–18 However, an additional light pen is necessary for the light-pen touch method. To achieve a more intuitive and friendlier user interface, a 3D air-touch system that allows users to 'touch and interact' with the virtual stereo images on a mobile 3D display was proposed, as shown in Fig. 2. This scheme is described in this paper.

2 Structure

To achieve the air-touch function, we proposed an embedded multiwavelength optical sensor-based system with added multiwavelength angular-scanning illuminators on the display sides, as shown in Fig. 3. The proposed air-touch system is composed of a traditional display, multiwavelength embedded optical sensors, an IR backlight, and multiwavelength angular-scanning illuminators.




FIGURE 1 — Schematic structure of an embedded optical sensor on a thin-film transistor substrate, and a depiction of the sensed image.

FIGURE 2 — Schematic of the embedded optical sensor panel for different 3D interactive applications.

FIGURE 3 — Cross-section of the bare-finger 3D interactive system in (a) infrared (IR) backlight illuminating mode (for x, y determination) and (b) IR angular-scanning mode (for z determination).

For calculating the 2D (x and y) position of the fingertip, a planar IR backlight was designed to pass through the whole panel so that the light can be reflected by the fingertip for detection, as shown in Fig. 3(a). For calculating the depth (z) information, a multiwavelength angular-scanning light bar, which can be placed at the edge of the panel, was proposed, as illustrated in Fig. 3(b).

However, scanning the depth (z) information temporally with a single wavelength limits the sensing rate of the system.


Thus, to increase the sensing rate for real-time operation, we propose a multiwavelength sensing method that reduces the capturing time by using parallel-processing concepts. Before describing this 'multi-wavelength' sensing, the timing diagram (Fig. 4) of the bare finger 3D interactive system with 'single-wavelength' sensing must be established. At first, the IR backlight and the IR angular-scanning illuminators are synchronized with the embedded optical sensors. During the first sensing frame (Frame 0), the IR backlight passes through the panel twice (out to the fingertip and back after reflection), and the optical sensors capture the reflected light for detecting the 2D position (x and y) of the fingertip. The working time (tx,y) for this detection is t. Then, during the following sensing frames (Frame 1 to Frame n), the IR angular-scanning illuminators on the two sides of the panel emit single-wavelength light at different tilt angles sequentially; these sensing frames correspond to tilt angles θ to n × θ. The working time (tz) for z determination is n × t. By analyzing the accumulated intensity of each frame, the scanning angle that yields the maximum reflectance from the fingertip can be found. With the two axes (x and y) and the scanning angle (θ), the depth (z) information of the fingertip can be calculated. Finally, the three-axis (x, y, z) information of the fingertip is obtained over a total working time of (1 + n) × t.
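As an illustration of this single-wavelength scheme (not part of the original paper), the sketch below accumulates the intensity of each angular-scanning frame, picks the tilt angle with the maximum reflected intensity, and reports the total working time (1 + n) × t. The frame data, tilt angles, and frame time are hypothetical placeholders rather than values from the experiments.

```python
import numpy as np

def peak_scan_angle(scan_frames, tilt_angles_deg):
    """Return the tilt angle whose frame has the largest accumulated intensity,
    i.e. the angle giving the strongest fingertip reflection."""
    totals = [float(np.asarray(frame, dtype=float).sum()) for frame in scan_frames]
    return tilt_angles_deg[int(np.argmax(totals))]

# Hypothetical example: n = 8 angular-scanning frames from a 16 x 16 sensor array.
rng = np.random.default_rng(0)
n, t = 8, 1.0 / 60.0                                  # assumed frame count and frame time (s)
tilt_angles = [10 * (k + 1) for k in range(n)]        # theta, 2*theta, ..., n*theta (theta = 10 deg)
frames = [rng.random((16, 16)) * 0.1 for _ in range(n)]
frames[3] += 1.0                                      # pretend the strongest reflection occurs at frame 4

print("peak reflectance at", peak_scan_angle(frames, tilt_angles), "deg")
print("total working time:", (1 + n) * t, "s")        # one (x, y) frame plus n depth frames
```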

By utilizing the parallel-processing concept, the sensed images can be captured simultaneously rather than sequentially. The timing diagram of the bare finger 3D interactive system with multiwavelength sensing is shown in Fig. 5.

FIGURE 4 — Timing diagram of the bare finger 3D interactive system with 'single-wavelength' embedded optical sensors and single-wavelength sequential illuminators.

The 2D-axis working time (tx,y) for multiwavelength sensing is the same as that for single-wavelength sensing. However, the depth-sensing working time (tz) of the multiwavelength method for z determination is n/2 × t owing to parallel processing: the depth-sensing step can be operated by the red and blue subpixels at the same time, so the total depth-sensing working time is reduced by half. In the following experiment, the proposed multiwavelength sensing is demonstrated with IR and deep-blue wavelengths to minimize interruption of human vision. For this purpose, the optical sensors were embedded in the red and blue subpixels, and the reflected images of the IR and deep-blue light are filtered by the R and B color filters, respectively.
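A minimal sketch of the timing budget, assuming n scan angles and a fixed frame time t: with a single wavelength the depth frames are captured one angle at a time, while the multiwavelength scheme lets the IR and deep-blue channels capture two angles per frame, halving tz. The numbers below are assumed examples only.

```python
def working_time(n_scan_angles, frame_time, parallel_channels=1):
    """Total sensing time per 3D position: one frame for (x, y) plus the depth-scan
    frames, with the scan angles divided among the parallel wavelength channels."""
    t_xy = frame_time
    t_z = -(-n_scan_angles // parallel_channels) * frame_time   # ceiling division
    return t_xy + t_z

n, t = 8, 1.0 / 60.0                                    # assumed: 8 tilt angles at a 60 Hz sensing rate
print("single wavelength :", round(working_time(n, t, 1), 4), "s")   # (1 + n) * t
print("IR + deep blue    :", round(working_time(n, t, 2), 4), "s")   # (1 + n/2) * t
```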

3 Algorithm

Based on the bare finger 3D air-touch system with multiwavelength embedded optical sensors and multiwavelength sequential illuminators, the proposed algorithm can be used to calculate the 3D (x, y, z) position of the fingertip. The flow chart of the proposed algorithm is shown in Fig. 6. At first, the raw data are retrieved from the embedded optical sensors. To decrease noise from the environment and the system, a noise-suppression stage, including denoising and debackgrounding, is adopted.
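The paper does not spell out the denoising and debackgrounding operations, so the sketch below shows one plausible form under simple assumptions: subtract a background frame captured with no finger present, clip negative values, and zero out pixels below a small noise floor. The frame sizes and the noise-floor value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def suppress_noise(raw_frame, background_frame, noise_floor=0.05):
    """Debackground, then denoise, a sensed frame (intensities assumed normalized to 0..1)."""
    cleaned = np.asarray(raw_frame, dtype=float) - np.asarray(background_frame, dtype=float)
    cleaned = np.clip(cleaned, 0.0, None)     # remove negative residue left by the subtraction
    cleaned[cleaned < noise_floor] = 0.0      # suppress small sensor/environment fluctuations
    return cleaned

# Hypothetical 8 x 8 frames: a fingertip reflection on top of ambient background light.
rng = np.random.default_rng(1)
background = rng.random((8, 8)) * 0.2
raw = background + rng.random((8, 8)) * 0.03
raw[3:5, 4:6] += 0.6                          # the fingertip reflection
print(suppress_noise(raw, background).round(2))
```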




FIGURE 5 — Timing diagram of the bare finger 3D interactive system with 'multi-wavelength' embedded optical sensors and multiwavelength sequential illuminators.

Next, the intensity of the image captured under the IR backlight is accumulated. When the accumulated value is larger than a touch threshold value, which was determined by experimental calibration, the object (fingertip) is sensed as touching the panel; the full search method can then be used to detect the 2-axis (x and y) position of the touch point. If the accumulated value is smaller than the touch threshold value, the object (fingertip) is treated as hovering over the panel, and the proposed bare finger 3D (x, y, and z) touch algorithm is used to calculate the 3-axis (x, y, and z) information. The details of the proposed bare finger 3D (x, y, and z) touch algorithm are described in the following.
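The sketch below mirrors this decision, under assumed values: accumulate the IR-backlight frame and compare the sum with a calibrated touch threshold to decide between the touch branch and the hover branch. The threshold and the example frames are placeholders; the paper determines the threshold experimentally.

```python
import numpy as np

TOUCH_THRESHOLD = 50.0   # assumed value; the paper calibrates this threshold experimentally

def touch_or_hover(ir_backlight_frame):
    """Accumulate the sensed IR-backlight frame and compare it with the touch threshold:
    above the threshold the fingertip is touching the panel, below it the fingertip hovers."""
    accumulated = float(np.sum(ir_backlight_frame))
    return "touch" if accumulated > TOUCH_THRESHOLD else "hover"

# Hypothetical frames: a strong, broad reflection when the finger presses the panel,
# a weak one when the finger hovers a few centimetres above it.
touching_frame = np.full((8, 8), 1.0)
hovering_frame = np.full((8, 8), 0.2)
print(touch_or_hover(touching_frame), touch_or_hover(hovering_frame))   # -> touch hover
```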

The 'full search' method (Fig. 7) is used to calculate the 2-axis (x, y) position of the fingertip from the captured image, which is reflected by the IR backlight. The full search method uses a filter to cover the image and sums up the intensity under the filter in order to find the position of maximum accumulated intensity.
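A minimal sketch of such a full search, assuming a fixed filter size and a synthetic sensed frame: slide the window over every position, accumulate the intensity it covers, and return the centre of the best window as the fingertip's (x, y).

```python
import numpy as np

def full_search_xy(image, win=3):
    """Slide a win x win filter over the whole image and return the centre (column, row)
    of the window with the maximum accumulated intensity."""
    img = np.asarray(image, dtype=float)
    rows, cols = img.shape
    best_sum, best_pos = -1.0, (0, 0)
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            window_sum = img[r:r + win, c:c + win].sum()   # intensity covered by the filter
            if window_sum > best_sum:
                best_sum, best_pos = window_sum, (c + win // 2, r + win // 2)
    return best_pos

# Hypothetical sensed frame with a fingertip reflection centred near (x=10, y=4).
frame = np.zeros((16, 16))
frame[3:6, 9:12] = 1.0
print(full_search_xy(frame))   # -> (10, 4)
```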


Then, the group of images from the IR angular-scanning illuminators is processed using a 'region-based' algorithm to obtain the depth (z) value of the hovering fingertip, as shown in Fig. 8.

The 'region-based' algorithm classifies the hover region broadly into three regions: region-1 (overlapping-central), region-2 (overlapping-wings), and region-3 (non-overlapping). In region-1 (overlapping-central), the fingertip is at the center position of the display. In other words, the fingertip is reflected by the two side illuminators with the same scanning angle (θ).


FIGURE 6 — Flow chart of the bare finger 3D (x, y, and z) touch algorithm.

Therefore, the intensity versus scanning angle curve has only one peak, corresponding to the angle (θ) that yields the maximum reflectance of the fingertip. The scanning angle (θ) and the 2D (x, y) position of the fingertip are then used to calculate the depth (z) information of the fingertip by a simple trigonometric relationship (z = x × tan θ). In region-2 (overlapping-wings), the fingertip is reflected by the two side illuminators with different scanning angles (θ1 and θ2). Therefore, the intensity versus scanning angle curve has two peaks. To obtain higher accuracy, the maximum peak (θ1), which corresponds to the illuminator closest to the fingertip, is used to calculate the depth (z) information. In region-3 (non-overlapping), there is only one peak. This condition implies that the fingertip cannot be reflected by the illuminator on the nearer side even at its maximum tilt angle (θ1). In other words, only the illuminator on the far side, with the smaller scanning angle (θ2), is used to calculate the depth (z) information.
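The sketch below illustrates this depth estimate using the trigonometric relationship z = x × tan θ quoted above: the per-angle accumulated intensities from the left and right illuminators are peak-searched, and the peak choice follows the three-region rule (same angle at the centre, the stronger/nearer peak in the overlapping wings, the only contributing far-side peak in the non-overlapping region). The function name, intensity curves, and distances are illustrative assumptions, not the authors' implementation.

```python
import math

def depth_from_angular_scan(left_curve, right_curve, angles_deg, x_from_left, x_from_right):
    """left_curve / right_curve: accumulated reflected intensity per scanning angle for the
    left- and right-edge illuminators; x_from_*: lateral distance (mm) of the detected (x, y)
    position from each illuminator edge.  Returns the estimated depth z in mm."""
    peak_l = max(range(len(angles_deg)), key=lambda i: left_curve[i])
    peak_r = max(range(len(angles_deg)), key=lambda i: right_curve[i])

    # Region 1 (overlapping-central): both illuminators peak at the same scanning angle.
    if peak_l == peak_r:
        return x_from_left * math.tan(math.radians(angles_deg[peak_l]))

    # Regions 2 and 3: the peaks differ, so use the stronger peak -- the illuminator nearer
    # the fingertip in region 2, and the only contributing (far-side) one in region 3.
    if left_curve[peak_l] >= right_curve[peak_r]:
        return x_from_left * math.tan(math.radians(angles_deg[peak_l]))
    return x_from_right * math.tan(math.radians(angles_deg[peak_r]))

# Hypothetical intensity-versus-angle curves for 8 scan angles (10..80 deg),
# with the fingertip 20 mm from the left edge and 60 mm from the right edge.
angles = [10 * k for k in range(1, 9)]
left  = [0.1, 0.2, 0.9, 0.3, 0.1, 0.1, 0.0, 0.0]   # strong peak at 30 deg
right = [0.0, 0.1, 0.2, 0.3, 0.4, 0.2, 0.1, 0.0]   # weaker peak at 50 deg
print(round(depth_from_angular_scan(left, right, angles, 20.0, 60.0), 1), "mm")
```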

FIGURE 7 — Full search method for 2D (x, y) position.

4 Experimental results

To verify the proposed concept, our multiwavelength sensing system was implemented on a 4-inch mobile display (Fig. 9), in which the optical sensors were embedded under the red and blue color filters. Then, we set up different conditions to verify the accuracy of the 3D (x, y, z) positions of the fingertip on this 4-inch mobile display.

In the following, the experimental results for (x, y, z) accuracy, the detectable-height limitation of the current system, and the depth resolution are discussed. First, the fingertip was placed at different (x, y) coordinates to analyze the (x, y) accuracy. The (x, y) detection has an error of less than 2 mm when the fingertip is at the center and at the four corners of the 4-inch mobile display (Fig. 10).

To analyze the depth (z) accuracy, the fingertip was raised from 1 to 3 cm over different working regions, as shown in Fig. 11. The error of the depth (z) value was smaller than 3 mm. Therefore, in our experiments, the maximum errors in the (x, y)-plane and along the z-axis were 2 mm and 3 mm, respectively. However, the detectable height was limited to 3 cm because of the sensitivity of the embedded photosensors. As shown in Fig. 12, the reflected IR intensity decreased to almost zero when the height approached 4 cm. The low sensitivity occurred because the embedded photosensor was made of a Si-based material. A Ge-based photosensor, which is more sensitive to the IR wavelength, is under development; if the sensitivity can be further improved, the detectable height can also be increased. Finally, the increment of the scanning angle is strongly related to the resolution of the depth value. As shown in Fig. 13, the resolution of z increases when the increment of the tilt angle decreases. However, a smaller increment of the scanning angle requires additional sensing frames to calculate one fingertip position. The sensing rate also depends on the signal-to-noise ratio. With a Ge-based photosensor in an optimized circuit design, the signal-to-noise ratio can be improved to achieve a higher sensing rate and thus enhance the resolution of the depth value.
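To illustrate the relationship between the tilt-angle increment and the depth resolution shown in Fig. 13, the short sketch below evaluates how much z = x × tan θ changes for one angular step at a given lateral distance; the distances and angles are assumed examples, not measured values from the paper.

```python
import math

def depth_step(x_mm, theta_deg, dtheta_deg):
    """Change in z = x * tan(theta) produced by one scanning-angle increment, i.e. the
    coarsest depth difference that two neighbouring tilt angles can distinguish."""
    return x_mm * (math.tan(math.radians(theta_deg + dtheta_deg))
                   - math.tan(math.radians(theta_deg)))

# Assumed example: fingertip 20 mm from the illuminator edge, around a 30 deg scan angle.
for increment in (10.0, 5.0, 2.5):
    print(f"tilt-angle increment {increment:4.1f} deg -> depth step ~ {depth_step(20.0, 30.0, increment):.1f} mm")
```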

To further verify the 3D gesture application, we tested the original system with a single wavelength and the proposed system with multiple wavelengths. In an ideal system, the gesture could be fully reconstructed at an infinite sampling rate.




FIGURE 8 — Flow chart of the region-based algorithm.

FIGURE 9 — Experimental platform with red and blue optical sensors.

However, in the original single-wavelength system, the sampling rate was only 3 points per second; thus, the gesture recognized by the original system was far from the real case. In contrast, in the proposed system with IR and deep-blue sensors, the capturing rate was twice that of the single-wavelength system; thus, the recognized gesture was much closer to the real S-curve, as depicted in Fig. 14. Accordingly, a real-time 3D air-touch system can be achieved through the proposed multiwavelength sensing system.


FIGURE 11 — Accuracy of the depth (z) value of the object at different working ranges.

FIGURE 12 — Measurement results for the reflected intensity of a fingertip at different depths.

FIGURE 13 — Relationship between different increments of tilt angle and accuracy.

FIGURE 14 — 3D tracking comparison between the single-wavelength and the multiwavelength results.

FIGURE 10 — Accuracy of the (x, y) coordinates at different x/y positions.

5 Conclusion

In conclusion, we have presented a camera-free 3D air-touch system with bare-finger interaction. It detects hovering, swiping, and virtual 'touching' of the 3D images. By embedding the multiwavelength optical sensors in the display pixels and adding multiwavelength angular-scanning illuminators on the edge of the display, the flat panel can sense images reflected by the fingertip. From the sensed images, the 3D (x, y, z) position of the fingertip can be calculated without complex image processing. The experimental results exhibit precise 2D position (x, y) detection at the pixel level and a linear response to depth (z) from 0 to 3 cm. The detected depth range can be further increased by improving the sensitivity of the optical sensors. The 3D gesture sensing was also verified. Because current mobile 3D displays can provide only a few centimeters of image depth, the proposed system, with its 0–3 cm working range, is applicable to near-field air touch on mobile 3D displays. To summarize, the proposed approach works with a single bare finger. However, operation with multiple bare fingers will fail because of the occlusion effect, in which a blocked fingertip cannot reflect IR light. To overcome the occlusion issue and achieve 3D multiple-bare-finger functionality, interpolation and motion-vector methods may be adopted in the future.

Acknowledgments

The authors would like to acknowledge financial support from the National Science Council (Grant No. NSC101-2221-E-009-120-MY3) of the Republic of China and hardware support from AU Optronics.

Guo-Zhen Wang received an M.S. degree from the Display Institute at National Chiao Tung University (NCTU), Hsinchu, Taiwan, R.O.C., in 2008 and is currently working toward a Ph.D. at the Department of Electronics Engineering, NCTU. His current research involves developing 3D interaction systems and focuses on image processing and computer architecture technologies.

Yi-Pai Huang received his B.S. degree from National Cheng Kung University in 1999 and earned a Ph.D. in Electro-Optical Engineering from NCTU in 2004. He is currently a full-time Associate Professor in the Department of Photonics and Display Institute, NCTU, Taiwan. He was also a visiting Associate Professor at Cornell University from 2011 to 2012. Additionally, he is the chairman of the SID Taipei Chapter, the Chair of the SID Applied Vision subcommittee, and a senior member of SID. His expertise includes 3D displays and interactive technologies, display optics and color science, and micro-optics. In these fields, he has published over 150 international journal and conference papers (including 60 SID conference papers and 17 invited talks) and has been granted 40 patents, with another 51 currently publicly available. In addition, he has received the SID's distinguished paper award three times (2001, 2004, and 2009). Other important awards include the 2012 National Youth Innovator Award of the Ministry of Economic Affairs, the 2011 Taiwan National Award of Academia Inventors, the 2010 Advantech Young Professor Award, the 2009 Journal-SID Best Paper of the Year award, and the 2005 Golden Dissertation Award from the Acer Foundation.

Tian-Sheuan Chang (S'93–M'06–SM'07) received B.S., M.S., and Ph.D. degrees in electronic engineering from NCTU, Hsinchu, Taiwan, in 1993, 1995, and 1999, respectively. From 2000 to 2004, he was a Deputy Manager with Global Unichip Corporation, Hsinchu, Taiwan. In 2004, he joined the Department of Electronics Engineering, NCTU, where he is currently a Professor. In 2009, he was a visiting scholar at the Interuniversity Microelectronics Centre, Belgium. His current research interests include system-on-a-chip design, very-large-scale integration signal processing, and computer architecture. Dr. Chang received the Excellent Young Electrical Engineer award from the Chinese Institute of Electrical Engineering in 2007, and the Outstanding Young Scholar award from the Taiwan IC Design Society in 2010. He has been actively involved in many international conferences, either as part of an organizing committee or as a technical program committee member. He is currently an Editorial Board Member of the IEEE Transactions on Circuits and Systems for Video Technology.

References

1 K. Bredies et al., AVI '08: Proceedings of the Working Conference on Advanced Visual Interfaces, 466–469 (2008).

2 M. Tsuboi et al., "Design conditions for attractive reality in mobile-type 3-D display," J. Soc. Inf. Disp. 18, 698–703 (2010). DOI: 10.1889/JSID18.9.698.

3 Y. P. Huang and C. W. Chen, "Superzone Fresnel Liquid Crystal Lens for Temporal Scanning Auto-stereoscopic Display," J. Disp. Technol. 8, 650–655 (2012). DOI: 10.1109/JDT.2012.2212695.

4 Y. P. Huang et al., "2-D/3-D switchable autostereoscopic display with multi-electrically driven liquid-crystal (MeD-LC) lenses," J. Soc. Inf. Disp. 18, 642–646 (2010). DOI: 10.1889/JSID18.9.642.

5 Y. P. Huang et al., "Autostereoscopic 3D Display with Scanning Multi-electrode Driven Liquid Crystal (MeD-LC) Lens," J. 3D Res. 1, 39–42 (2010). DOI: 10.1007/3DRes.01(2010)5.

6 C. Hand, "A Survey of 3D Interaction Techniques," Comput. Graph. Forum 16, 269–281 (1997).

7 C. Brown et al., "A system LCD with integrated 3-dimensional input device," SID Symp. Digest 41, 453–456 (2010).

8 R. Ott et al., "Advanced virtual reality technologies for surveillance and security applications," Proceedings of the 2006 ACM International Conference on Virtual Reality Continuum and Its Applications, 163–170 (2006).

9 K. N. Tarchanidis and J. N. Lygouras, "Data glove with a force sensor," IEEE Trans. Instrum. Meas. 52, 984–989 (2003).

10 A. Olwal et al., "An immaterial, dual-sided display system with 3D interaction," Virtual Reality Conference, 279–280 (2006).

11 T. Schlömer et al., "Gesture recognition with a Wii controller," Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, 11–14 (2008).

12 B. Lange et al., "Initial usability assessment of off-the-shelf video game consoles for clinical game-based motor rehabilitation," Phys. Ther. Rev. 14, 355–363 (2009).

13 T. Shiratori and J. K. Hodgins, "Accelerometer-based User Interfaces for the Control of a Physically Simulated Character," ACM SIGGRAPH Asia, 123 (2008).

14 W. den Boer et al., "Active matrix LCD with integrated optical touch screen," SID Symp. Digest 34, 1494–1497 (2003).

15 A. Abileah et al., "Integrated Optical Touch Panel in a 14.1 AMLCD," SID Symp. Digest 35, 1544–1547 (2004).

16 H. Y. Tung et al., "Multi-user and Multi-touch System for 3D-interactive Display," SID Symp. Digest 42, 1834–1837 (2011).


17 G. Z. Wang et al., "A Virtual Touched 3D Interactive Display with Embedded Optical Sensor Array for 5-axis (x, y, z, θ, ϕ) Detection," SID Symp. Digest 42, 737–740 (2011).

18 Y. P. Huang et al., "Three dimensional virtual touch display system for multi-user applications," IEEE/OSA J. Disp. Technol. (early access, 2013).