
Otmar Hilliges | Curriculum Vitae
Advanced Interactive Technologies Lab – ETH Zurich, Switzerland

Phone: +41 44 632 39 56 • Email: [email protected] • Web: http://ait.ethz.ch/

Personal
Born in Munich, Germany on July 3rd, 1979. Nationality: German.

Research Interests
My research interests include all aspects of Human-Computer Interaction. In particular, post-desktop user interfaces, mobile interaction, augmented and virtual reality, human-robot interaction, input sensing technologies and algorithms, gesture recognition and human activity recognition.

Education
2005–2009 PhD in Computer Science, LMU Munich, Germany. Grade: 1.0/1.0 – “summa cum laude”.

Committee: A. Butz (Advisor), S. Izadi, A. Wilson, S. Carpendale.
1999–2004 MSc in Computer Science, TU Munich, Germany. Grade: 1.0/1.0 – “summa cum laude”.

Thesis Advisor: G. Klinker. Finalist, “Werner von Siemens Excellence Award”.
1998 Abitur. Erasmus-Grasser-Gymnasium, Munich, Germany.

Academic Positions
2013–present Assistant Professor in Computer Science (Tenure Track)

ETH Zurich, Department of Computer Science. I lead the AIT Lab (http://ait.ethz.ch).
2012–2013 Researcher, Microsoft Research, Cambridge, UK. Interactive 3D Technologies Group.
2010–2011 Postdoc Researcher, Microsoft Research, Cambridge, UK. Sensors and Devices Group.

Grants and Other Funding
2017–2021 “OPTINT: Optimization-based Design of Interactive Technologies”.

Funding: €1.5M. ERC Starting Grant.
2017–2019 “Human-Centric-Flight II: End-user Design of High-level Robotic Behavior”.

Funding: CHF 180K. Microsoft Research Grant.
2015–2018 “Deformation and Motion Modeling using Modular, Sensor-based Input Devices”.

Funding share: CHF 250K. Swiss National Science Foundation (SNF).
2015–2018 “UFO: Semi-Autonomous Aerial Vehicles for Augmented Reality, Human-Computer Interaction and Remote Collaboration”. Funding: CHF 375K. Swiss National Science Foundation (SNF).

2014–2017 “Human-centric flight: Micro Aerial Vehicles for Interaction, Videography and 3D Reconstruction”. Funding share: CHF 255K. Microsoft Research Grant.

2014–2016 “Gesture Recognition Algorithms Using High-Speed, Wide Field-of-View, Short Range Radar for Mobile and Wearable Computing”. Funding: CHF 220K. Google Inc. Sponsored Research Agreement.

2014–2016 “MAV In Context: Exploring Immersive Virtual Environments Through Micro Aerial Vehicles”. Funding: CHF 200K. ETH post-doctoral fellowship (Fabrizio Pece).


Awards
2014 Best paper award - IEEE IROS ’14.

“Environment-independent Formation Flight for Micro Aerial Vehicles”
2014 Best paper award - ACM SIGCHI ’14

“Type–Hover–Swipe in 96 Bytes: A Motion Sensing Mechanical Keyboard”
2012 Honorable mention best technote - ACM SIGCHI ’12

“Shake’n’Sense: Reducing Structured Light Interference when Multiple Depth Cameras Overlap”
2012 Best demo award runner-up - ACM UIST ’12

“Digits: Freehand 3D Interactions Anywhere Using a Wrist-worn Gloveless Sensor”
2012 Best paper award - Pervasive ’12

“Interactive Environment-Aware Handheld Projectors for Pervasive Computing Spaces”
2011 Best paper award - IEEE ISMAR ’11

“KinectFusion: Real-Time Dense Surface Mapping and Tracking”
2010 Best paper award - ACM CSCW ’10

“Opening up the Family Archive”
2008 Best paper award - ACM UIST ’08

“Bringing Physics to the Surface”

Research Group
Current PhD students

2015–present Emre Aksan. “Machine Learning for interactive technologies”.
2015–present Stefan Stevsic. “End-user design of robotic behavior”.
2015–present Christoph Gebhardt. “Computational design of interactive technologies”.
2015–present Benjamin Hepp. “Trajectory planning for resource-efficient 3D reconstruction”.
2014–present Jie Song. “Human activity and input recognition”.

Winner, Swisscom ICT thesis award (CHF 10K).
2014–present Tobias Nägeli. “Human-robot interaction”.

Winner, Qualcomm Innovation Fellowship (CHF 10K).

Current postdoctoral researchers

2014–present Fabrizio Pece, PhD from UCL, London. ETH / Marie Curie COFUND Fellow.

PhD committee member

Nicolai Ranieri (ETHZ), Petri Tanskanen (ETHZ), Gabor Sörös (ETHZ).

Past PhD students

David Kim, Newcastle University (2010–2013), with Prof. Patrick Olivier. Now researcher at Microsoft Research.

Professional Activities
Program committee member

- ACM SIGCHI 2013, 2014, 2015, 2016, 2017
- ACM UIST 2013, 2014, 2016
- ACM NordiCHI 2016
- ACM TEI 2009, 2014
- ACM MUM 2013
- ACM UbiComp 2013
- IEEE ISMAR 2013
- IEEE 3DV 2012, 2013
- ACM ITS 2010

Conference organizing committee member

- Keynote chair ACM UIST 2013, 2014
- Video co-chair ACM UbiComp 2013
- Demo co-chair ACM UIST 2010, 2011
- SV co-chair ACM ITS 2010


Reviewer

I routinely review papers for ACM CHI, UIST, SIGGRAPH, ITS, TEI, UbiComp, IEEE ISMAR, 3DV, IEEE IROS and ICRA, as well as for many journals, including ACM ToG, ACM ToCHI, IJHCS, IEEE ToSMC and IEEE JVR.

Organized courses and tutorials

- Co-organizer of the annual “Summer School on Computational Interaction” (2015 Glasgow, UK; 2016 Aalto, FI; 2017 Zurich, CH).

- Organizer of the Dagstuhl seminar “Computational Interactivity”, 2017.

Invited talks, conference presentations and seminars

I regularly give invited talks at many internationally renowned academic institutions. Since joining ETH (2013), these have included NYU (USA), the University of Tokyo and the Nara Institute of Technology (Japan), TU Graz (Austria), TU Munich (Germany), FH Hagenberg (Austria), Microsoft Research and Google Research. I am an invited panelist at ACM SUI ’16.

Memberships
I am a member of ACM SIGCHI, ACM UIST, ACM SIGGRAPH and the IEEE Computer Society.

Teaching
ETH Zurich, Switzerland

Fall 2016 Human Computer Interaction (4 ECTS, 50% teaching, 80–90 students)
Visual Computing (8 ECTS, 50% teaching, 80–90 students)
Seminar: ML for Interactive Systems and Advanced Programming Tools (2 ECTS, 50% teaching, 11 students)

Spring 2016 Parallel Programming (7 ECTS, 50% teaching, 392 students)
User Interface Engineering (4 ECTS, 100% teaching, 45 students)

Fall 2015 Human Computer Interaction (4 ECTS, 50% teaching, 88 students)

Spring 2015 Seminar: Distributed Systems Seminar (“Smart Environments”) (2 ECTS, 50% teaching, 13 students)
Parallel Programming (7 ECTS, 50% teaching, 315 students)
User Interface Engineering (4 ECTS, 100% teaching, 30 students)

Fall 2014 Human Computer Interaction (4 ECTS, 33% teaching, 64 students)

Spring 2014 Seminar: Distributed Systems Seminar (“Smart Environments”) (2 ECTS, 50% teaching, 13 students)
Parallel Programming (7 ECTS, 50% teaching, 262 students)

Fall 2013 User Interface Engineering (4 ECTS, 100% teaching, 17 students)

Spring 2013 Seminar: Distributed Systems (“Interaction in Intelligent Environments”) (2 ECTS, 50% teaching, 11 students)

Master’s and Bachelor’s theses

I have supervised 16 Master’s and 8 Bachelor’s theses since 2013.

Previous teaching experience (LMU Munich)

Teaching assistant: Computer Graphics, Information Visualization, HCI, Image Processing
Seminars: Interactive Tabletops
Courses taught: 3D Graphics, Information Visualization


Theses: Supervision of multiple Master’s and Bachelor’s theses, sometimes in collaboration with industry partners (BMW, MAN, Siemens).

Publications
In my area, peer-reviewed conference publications are the primary outlet for current research. ACM SIGCHI and ACM UIST are the premier venues for Human-Computer Interaction research, with an average acceptance rate of 20%. An up-to-date list of publications can be found at: http://ait.ethz.ch/publications.

Most important publications: This is a purely subjective list of my three most important papers. They were selected because they illustrate the two main goals of my work: first, I attempt to push the state of the art in input recognition algorithmically; second, I aim to alter accepted limitations on user experiences by demonstrating entirely novel forms of user interaction through algorithmic design of interactive technologies. Other papers may have received more citations or have had impact in other ways.

- In-Air Gestures [CF11] – Highly efficient algorithms for gesture recognition on mobile devices, currently achieving best-in-class accuracy and enabling novel forms of interaction.

- DefSense [CF2] – Computational design of functional, flexible, 3D-printed input devices.
- HoloDesk [CF14] – Mixed-reality system allowing users to manipulate 3D content using uninstrumented freehand interactions.

Journal publications

[J1] Oliver Glauser, Alex Ma, Daniele Panozzo, Alec Jacobson, Otmar Hilliges, and Olga Sorkine-Hornung. “Rig Animation with a Tangible and Modular Input Device”. In: ACM Transactions on Graphics (July 2016).

[J2] Vittorio Megaro, Bernhard Thomaszewski, Maurizio Nitti, Otmar Hilliges, Markus Gross, and Stelian Coros. “Interactive Design of 3D-printable Robotic Creatures”. In: ACM Transactions on Graphics 34.6 (Oct. 2015), 216:1–216:9.

[J3] Alec Jacobson, Daniele Panozzo, Oliver Glauser, Cédric Pradalier, Otmar Hilliges, and Olga Sorkine-Hornung. “Tangible and modular input device for character articulation”. In: ACM Transactions on Graphics 33.4 (July 2014), pp. 1–12.

[J4] Johannes Schöning, Jonathan Hook, Nima Motamedi, Patrick Olivier, Florian Echtler, Peter Brandl, Laurence Muller, Florian Daiber, Otmar Hilliges, Markus Loechtefeld, et al. “Building interactive multi-touch surfaces”. In: Journal of Graphics, GPU, and Game Tools 14.3 (2009), pp. 35–55.

[J5] Lucia Terrenghi, Otmar Hilliges, and Andreas Butz. “Kitchen Stories: Sharing Recipes with the Living Cookbook”. In: Personal Ubiquitous Computing 11.5 (June 2007), pp. 409–414.

Peer-reviewed full-length conference publications

[CF1] Christoph Gebhardt, Benjamin Hepp, Tobias Naegeli, Stefan Stevsic, and Otmar Hilliges. “Airways: Optimization-based Interactive Design of High-Level Quadrotor Behavior”. In: SIGCHI Conference on Human Factors in Computing Systems. CHI ’16. San Jose, CA: ACM, Apr. 2016.

[CF2] Benjamin Hepp, Moritz Baecher, Fabrizio Pece, Bernhard Thomaszewski, Paul Kry, Bernd Bickel, and Otmar Hilliges. “DefSense: Computational Design of Customized Deformable Input Devices”. In: SIGCHI Conference on Human Factors in Computing Systems. CHI ’16. San Jose, CA: ACM, Apr. 2016.

[CF3] Benjamin Hepp, Tobias Naegeli, and Otmar Hilliges. “Omni-directional Person Tracking on a Flying Robot using Occlusion-robust Ultra-Wideband Signals”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE IROS). IEEE. Oct. 2016.

[CF4] Nicolas de Palézieux, Tobias Naegeli, and Otmar Hilliges. “Duo-VIO: Fast, Light-weight, Stereo Inertial Odometry”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE IROS). IEEE. Oct. 2016.

[CF5] Jie Song, Saiwen Wang, Jamie Lien, Ivan Poupyrev, and Otmar Hilliges. “Interacting with Soli: Exploring Fine-Grained Dynamic Gesture Recognition in the Radio-Frequency Spectrum”. In: ACM Symposium on User Interface Software and Technology (ACM UIST). ACM. Oct. 2016.

[CF6] Wang Yifan, Jie Song, Limin Wang, and Otmar Hilliges. “Two-Stream SR-CNNs for Action Recognition in Videos”. In: British Machine Vision Conference. BMVC ’16. Sept. 2016.

[CF7] Jibin Ou, Martin Vechev, and Otmar Hilliges. “An Interactive System for Data Structure Development”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’15. Seoul, South Korea: ACM, Apr. 2015.


[CF8] Gábor Sörös, Stephan Semmler, Luc Humair, and Otmar Hilliges. “Fast Blur Removal for Wearable QR Code Scanners”. In: Proceedings of the International Symposium on Wearable Computers (ACM ISWC). ACM. Sept. 2015.

[CF9] Petri Tanskanen, Tobias Naegeli, Marc Pollefeys, and Otmar Hilliges. “Semi-Direct EKF-based Monocular Visual-Inertial Odometry”. In: Proceedings of Intelligent Robots and Systems (IEEE IROS). IEEE. Sept. 2015.

[CF10] Tobias Nägeli, Christian Conte, Alexander Domahidi, Manfred Morari, and Otmar Hilliges. “Environment-independent Formation Flight for Micro Aerial Vehicles”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014). Chicago, IL, USA: IEEE Press, 2014.

[CF11] Jie Song, Gábor Sörös, Fabrizio Pece, Sean Ryan Fanello, Shahram Izadi, Cem Keskin, and Otmar Hilliges. “In-air Gestures Around Unmodified Mobile Devices”. In: Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology. UIST ’14. Honolulu, Hawaii, USA: ACM, 2014, pp. 319–329.

[CF12] Stuart Taylor, Cem Keskin, Otmar Hilliges, Shahram Izadi, and John Helmes. “Type-hover-swipe in 96 Bytes: A Motion Sensing Mechanical Keyboard”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’14. Toronto, Ontario, Canada: ACM, 2014, pp. 1695–1704.

[CF13] Dustin Freeman, Otmar Hilliges, Abigail Sellen, Kenton O’Hara, Shahram Izadi, and Kenneth Wood. “The Role of Physical Controllers in Motion Video Gaming”. In: Proceedings of the Designing Interactive Systems Conference. DIS ’12. Newcastle Upon Tyne, United Kingdom: ACM, 2012, pp. 701–710.

[CF14] Otmar Hilliges, David Kim, Shahram Izadi, Malte Weiss, and Andrew Wilson. “HoloDesk: Direct 3D Interactions with a Situated See-through Display”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’12. Austin, Texas, USA: ACM, 2012, pp. 2421–2430.

[CF15] David Kim, Otmar Hilliges, Shahram Izadi, Alex D. Butler, Jiawen Chen, Iason Oikonomidis, and Patrick Olivier. “Digits: Freehand 3D Interactions Anywhere Using a Wrist-worn Gloveless Sensor”. In: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology. UIST ’12. Cambridge, Massachusetts, USA: ACM, 2012, pp. 167–176.

[CF16] David Kirk, Shahram Izadi, Otmar Hilliges, Richard Banks, Stuart Taylor, and Abigail Sellen. “At Home with Surface Computing?” In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’12. Austin, Texas, USA: ACM, 2012, pp. 159–168.

[CF17] David Molyneaux, Shahram Izadi, David Kim, Otmar Hilliges, Steve Hodges, Xiang Cao, Alex Butler, and Hans Gellersen. “Interactive Environment-aware Handheld Projectors for Pervasive Computing Spaces”. In: Proceedings of the 10th International Conference on Pervasive Computing. Pervasive ’12. Newcastle, UK: Springer-Verlag, 2012, pp. 197–215.

[CF18] Andrew Wilson, Hrvoje Benko, Shahram Izadi, and Otmar Hilliges. “Steerable Augmented Reality with the Beamatron”. In: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology. UIST ’12. Cambridge, Massachusetts, USA: ACM, 2012, pp. 413–422.

[CF19] Alex Butler, Otmar Hilliges, Shahram Izadi, Steve Hodges, David Molyneaux, David Kim, and Danny Kong. “Vermeer: Direct Interaction with a 360° Viewable 3D Display”. In: Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology. UIST ’11. Santa Barbara, California, USA: ACM, 2011, pp. 569–576.

[CF20] Shahram Izadi, David Kim, Otmar Hilliges, David Molyneaux, Richard Newcombe, Pushmeet Kohli, Jamie Shotton, Steve Hodges, Dustin Freeman, Andrew Davison, and Andrew Fitzgibbon. “KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera”. In: Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology. UIST ’11. Santa Barbara, California, USA: ACM, 2011, pp. 559–568.

[CF21] Richard A. Newcombe, Shahram Izadi, Otmar Hilliges, David Molyneaux, David Kim, Andrew J. Davison, Pushmeet Kohli, Jamie Shotton, Steve Hodges, and Andrew Fitzgibbon. “KinectFusion: Real-time Dense Surface Mapping and Tracking”. In: Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality. ISMAR ’11. Washington, DC, USA: IEEE Computer Society, 2011, pp. 127–136.

[CF22] David S. Kirk, Shahram Izadi, Abigail Sellen, Stuart Taylor, Richard Banks, and Otmar Hilliges. “Opening Up the Family Archive”. In: Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work. CSCW ’10. Savannah, Georgia, USA: ACM, 2010, pp. 261–270.

[CF23] Mark Hancock, Otmar Hilliges, Christopher Collins, Dominikus Baur, and Sheelagh Carpendale. “Exploring Tangible and Direct Touch Interfaces for Manipulating 2D and 3D Information on a Digital Table”. In: Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces. ITS ’09. Banff, Alberta, Canada: ACM, 2009, pp. 77–84.


[CF24] Otmar Hilliges, Shahram Izadi, Andrew D. Wilson, Steve Hodges, Armando Garcia-Mendoza, and Andreas Butz. “Interactions in the Air: Adding Further Depth to Interactive Tabletops”. In: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology. UIST ’09. Victoria, BC, Canada: ACM, 2009, pp. 139–148.

[CF25] Dominikus Baur, Otmar Hilliges, and Andreas Butz. “Flux: Enhancing photo organization through interaction and automation”. In: SG ’08: Proceedings of the 9th International Symposium on Smart Graphics. Springer, 2008, pp. 216–223.

[CF26] Lucia Terrenghi, David Kirk, Hendrik Richter, Sebastian Krämer, Otmar Hilliges, and Andreas Butz. “Physical Handles at the Interactive Surface: Exploring Tangibility and Its Benefits”. In: Proceedings of the Working Conference on Advanced Visual Interfaces. AVI ’08. Napoli, Italy: ACM, 2008, pp. 138–145.

[CF27] Andrew D. Wilson, Shahram Izadi, Otmar Hilliges, Armando Garcia-Mendoza, and David Kirk. “Bringing Physics to the Surface”. In: Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology. UIST ’08. Monterey, CA, USA: ACM, 2008, pp. 67–76.

[CF28] Sebastian Boring, Manuela Altendorfer, Gregor Broll, Otmar Hilliges, and Andreas Butz. “Shoot & copy: phonecam-based information transfer from public displays onto mobile phones”. In: Proceedings of the 4th international conference on mobile technology, applications, and systems (Mobility ’07). ACM. 2007, pp. 24–31.

[CF29] Sebastian Boring, Otmar Hilliges, and Andreas Butz. “A Wall-Sized Focus Plus Context Display”. In: Annual IEEE International Conference on Pervasive Computing and Communications (PerCom ’07). IEEE, 2007, pp. 161–170.

[CF30] Otmar Hilliges, Dominikus Baur, and Andreas Butz. “Photohelix: Browsing, Sorting and Sharing Digital Photo Collection”. In: Proceedings of the 2nd IEEE International Workshop on Horizontal Interactive Human-Computer Systems. TABLETOP 2007. 2007.

[CF31] Otmar Hilliges, Peter Kunath, Alexey Pryakhin, Andreas Butz, and Hans-Peter Kriegel. “Browsing and Sorting Digital Pictures using Automatic Image Classification and Quality Analysis”. In: Human-Computer Interaction. LNCS. Springer, 2007, pp. 882–891.

[CF32] Otmar Hilliges, Lucia Terrenghi, Sebastian Boring, David Kim, Hendrik Richter, and Andreas Butz. “Designing for Collaborative Creative Problem Solving”. In: Proceedings of the 6th ACM SIGCHI Conference on Creativity & Cognition. C&C ’07. Washington, DC, USA: ACM, 2007, pp. 137–146.

[CF33] Otmar Hilliges, Phillipp Holzer, Rene Klüber, and Andreas Butz. “AudioRadar: A metaphorical visualization for the navigation of large music collections”. In: Smart Graphics. Springer, 2006, pp. 82–92.

Refereed short papers

[CS1] D. Asenov, O. Hilliges, and P. Müller. “The Effect of Richer Visualizations on Code Comprehension”. In: SIGCHI Conference on Human Factors in Computing Systems. CHI ’16. San Jose, CA: ACM, Apr. 2016.

[CS2] Jie Song, Fabrizio Pece, Marion Koelle, and Otmar Hilliges. “Joint Estimation of 3D Hand Position and Gestures from Monocular Video for Mobile Interaction”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’15. Seoul, South Korea: ACM, Apr. 2015.

[CS3] D. Alex Butler, Shahram Izadi, Otmar Hilliges, David Molyneaux, Steve Hodges, and David Kim. “Shake’n’Sense: Reducing Interference for Overlapping Structured Light Depth Cameras”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’12. Austin, Texas, USA: ACM, 2012, pp. 1933–1936.

[CS4] Otmar Hilliges and David Stanley Kirk. “Getting Sidetracked: Display Design and Occasioning Photo-talk with the Photohelix”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’09. Boston, MA, USA: ACM, 2009, pp. 1733–1736.

[CS5] Otmar Hilliges, David Kim, and S. Izadi. “Creating malleable interactive surfaces using liquid displacement sensing”. In: 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems, ITS. Oct. 2008, pp. 157–160.

[CS6] Otmar Hilliges, Christian Sandor, and Gudrun Klinker. “Interactive Prototyping for Ubiquitous Augmented Reality User Interfaces”. In: Proceedings of the 11th International Conference on Intelligent User Interfaces. ACM Press, 2006, pp. 285–287.

Book contributions

[BC1] Otmar Hilliges, Andreas Butz, Shahram Izadi, and Andrew D Wilson. “Interaction on the Tabletop: Bringing the Physical to the Digital”. In: Tabletops – Horizontal Interactive Displays. Ed. by Christian Müller-Thomfelde. Springer, 2010, pp. 189–221.


Workshop papers and abstracts

[A1] Oliver Glauser, Alex Ma, Daniele Panozzo, Alec Jacobson, Otmar Hilliges, and Olga Sorkine-Hornung. “Rig Animation with a Tangible and Modular Input Device”. In: ACM, Oct. 2016.

[A2] Alec Jacobson, Daniele Panozzo, Oliver Glauser, Cedric Pradalier, Otmar Hilliges, and Olga Sorkine-Hornung. “Tangible and modular input device for character articulation”. In: Proceedings of the adjunct publication of the 27th annual ACM symposium on User interface software and technology. UIST ’14 Adjunct. New York, New York, USA: ACM Press, Oct. 2014, pp. 45–46.

[A3] Alec Jacobson, Daniele Panozzo, Oliver Glauser, Cédric Pradalier, Otmar Hilliges, and Olga Sorkine-Hornung. “Tangible and modular input device for character articulation”. In: ACM SIGGRAPH 2014 Emerging Technologies. New York, New York, USA: ACM Press, July 2014, pp. 1–1.

[A4] Shahram Izadi, Richard A Newcombe, David Kim, Otmar Hilliges, et al. “KinectFusion: real-time dynamic 3D surface reconstruction and interaction”. In: SIGGRAPH ’11: ACM SIGGRAPH 2011 Talks. New York, NY, USA: ACM, 2011, p. 1.

[A5] Andreas Butz, Otmar Hilliges, Lucia Terrenghi, and Dominikus Baur. “Hybrid Widgets on an Interactive Tabletop”. In: Ubicomp ’07: Adjunct Proceedings. 2007.

[A6] Otmar Hilliges. “Informed Browsing: Scaling Up Co-Experienced Access to Digital Media”. In: Doctoral symposium of the 20th ACM UIST, Newport, RI, USA. 2007.

[A7] Otmar Hilliges and Lucia Terrenghi. “Overcoming mode-changes on multi-user large displays with bimanual interaction”. In: MU3I Workshop on Multi-User and Ubiquitous User Interfaces (IUI Workshops). ACM, 2006, pp. 23–31.

[A8] Martin Bauer, Otmar Hilliges, Asa MacWilliams, Christian Sandor, et al. “Integrating Studierstube and DWARF”. In: Int. Workshop on Software Technology for Augmented Reality Systems (STARS 2003). 2003.

Invited publications

[IN1] Richard A. Newcombe, Shahram Izadi, Otmar Hilliges, David Molyneaux, David Kim, Andrew J. Davison, Pushmeet Kohli, Jamie Shotton, Steve Hodges, and Andrew Fitzgibbon. “KinectFusion: Real-time Dense Surface Mapping and Tracking”. In: Commun. ACM (to appear) (2016).

Tech reports

[TR1] Johannes Schöning, Peter Brandl, Florian Daiber, Florian Echtler, Otmar Hilliges, et al. Multi-Touch Surfaces: A Technical Guide. Tech. rep. Institute for Geoinformatics, University of Münster, 2008.

Theses

[T1] Otmar Hilliges. “Bringing the Physical to the Digital: A New Model for Tabletop Interaction”. PhD thesis. Ludwig-Maximilians-Universität München, 2009.

[T2] Otmar Hilliges. “Interaction Management for Ubiquitous Augmented Reality User Interfaces”. Master’s thesis. Technische Universität München (TUM), Munich, Germany, 2004.

Patents granted

[P1] Detection of body and props. US Patent 8,660,303. 2014.
[P2] Gesture recognition techniques. US Patent 8,760,395. 2014.
[P3] Human body pose estimation. US Patent 8,638,985. 2014.
[P4] Mobile camera localization using depth maps. US Patent 8,711,206. 2014.
[P5] Learning image processing tasks from scene reconstructions. US Patent US8971612B2. 2013.
[P6] Moving object segmentation using depth images. US Patent 8,401,225. 2013.
[P7] Physics simulation-based interaction for surface computing. US Patent 8,502,795. 2013.
[P8] Real-time camera tracking using depth maps. US Patent 8,401,242. 2013.
[P9] Tabletop display providing multiple views to users. US Patent 8,502,816. 2013.
[P10] Three-dimensional environment reconstruction. US Patent 8,587,583. 2013.
[P11] Using a three-dimensional environment model in gameplay. US Patent 8,570,320. 2013.
[P12] Generating computer models of 3d objects. US Patent US9053571B2. 2012.

Patents pending. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

[PA1] Grasping virtual objects in augmented reality. US Patent App. 13/653,968. 2014.


[PA2] In-air gestures around unmodified mobile devices. EP Patent App. 14/184134.6. 2014.
[PA3] Using photometric stereo for 3D environment modeling. US Patent App. 13/729,324. 2014.
[PA4] Wearable sensor for tracking articulated body-parts. US Patent App. 13/644,701. 2014.
[PA5] Distributed asynchronous localization and mapping for augmented reality. US Patent App. 13/152,220. 2012.
[PA6] Reducing interference between multiple infrared depth cameras. US Patent App. 13/017,518. 2012.
[PA7] Three-dimensional user interaction. US Patent App. 12/939,891. 2012.
[PA8] User interaction in augmented reality. US Patent App. 12/940,383. 2012.
[PA9] Pointing device with independently movable portions. US Patent App. 12/485,543. 2010.
[PA10] Surface Computer User Interaction. US Patent App. 12/485,499. 2010.
[PA11] Interactive surface computer with switchable diffuser. US Patent App. 12/040,629. 2009.

Bibliometric indicators. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Citations: 4611, h-index: 26 (Google Scholar, Jul 2016)


Otmar Hilliges | Research Statement
Advanced Interactive Technologies Lab – ETH Zurich, Switzerland

Phone: +41 44 632 39 56 • Email: [email protected] • Web: http://ait.ethz.ch/

Research Overview
I conduct research in the area of Human-Computer Interaction (HCI). My research aims to push the boundaries of what humans can do with computers and how they interact with them. So far we have seen three dominant waves in computer interfaces: purely text-based interfaces (e.g., the command line, 1960s), graphical interfaces based on mouse and keyboard (1980s), and direct-touch interfaces on mobile phones and tablet computers (2000s). As digital technology moves further away from the desktop setting, it is becoming increasingly clear that traditional interfaces are no longer adequate means for interaction and that the traditional computing paradigm will be replaced or complemented by new forms of interaction. If we turn to the natural world, all living beings perceive and interact with their surroundings via often elaborate sensory, processing and output 'systems'. The premise of my research is then: What if every man-made object in the world had similar capabilities? What if every inanimate object in the world had input, processing and output capabilities? And as a consequence, how do we make a world full of smart objects usable and useful for humans? And finally, who will design, program and manufacture these objects? This lets us envision a future where smart garments that record vital statistics about an athlete's performance are designed directly by the athlete's coach; where a medical professional can design smart assistive technologies that measure how frequently and how well a patient performs rehabilitation exercises, and intervene if necessary; where a teacher makes individualized smart toys that support every child in their specific learning challenge.

The basic premise for such a future is already in place. Electronics are becoming ever smaller and cheaper. Manufacturing and prototyping technologies such as printed electronics and 3D printing are becoming more diverse, flexible, powerful and accessible. Hence, it is conceivable that in the future every object may be able to sense, process and respond to its surroundings. Designing user interface technologies in this context raises several interesting and difficult challenges. First, we already know that future devices will be mobile, will have ample computational power and will utilize several sensing capabilities – such as cameras, inertial measurement units and physiological sensors – to collect data about all aspects of our daily lives. Second, we will most likely not use a single interface paradigm, but will interact with a diverse set of devices and interfaces such as wearable computers, head-worn displays, smart buildings and robots. Third, while in the past computing devices were used only by trained experts with sound technical knowledge, this no longer holds. Today everybody uses computing devices, and the expectation is that they are usable without any prior training or deep understanding of the technical aspects. We therefore have to design for users with vastly different backgrounds. For example, a doctor is mostly interested in helping a patient, and an athlete in performing at their best, yet both often use computing devices to assist them in their primary task; it is therefore important that such devices are designed with the user's intent in mind.

Hence, the design of intuitive and easy-to-use interactive technologies for diverse application domains and user demographics is one of the biggest challenges in computer science. However, the current UI design approach was developed in the era of the PC, where the graphical user interface (GUI) could be considered in isolation from the underlying hardware. In contrast, I argue that modern interface design has to consider the entire stack, from low-level sensing hardware all the way to the graphical user interface. For example, creating devices that sense user input via embedded sensing elements – such as a customized input device to control a virtual character – requires reasoning about (i) the choice of sensing technology, (ii) the placement of sensing elements, and (iii) appropriate signal processing and input recognition algorithms. These aspects are inseparably intertwined with usability. Ultimately, the user experience of a custom input device critically depends on (a) its interaction fidelity, i.e., the types of input it can recognize, and (b) how well user inputs can be recognized; hence the sensing aspects cannot be separated from UI design.

Postdoctoral Research. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

My research (2007–2013) prior to joining ETH focused on enabling freehand interactions on large touchscreens [CS5, CF24, CF27, CF32] and, more recently, with 3D augmented reality content in stationary [CF14] and mobile [CF15] contexts. It has had significant uptake in the community (more than 700 combined citations). Furthermore, the work discussed in [CF14, CF15] had significant industrial impact, as it was instrumental in launching

9/13

Page 10: Otmar Hilliges – Curriculum Vitae...Awards 2014 Best paper award - IEEE IROS’14. “Environment-independentFormationFlightforMicroAerialVehicles” 2014 Best paper award - …

Microsoft's HoloLens1 project. Similarly, joint work on real-time dense surface reconstruction for Augmented Reality [CF17, CF21, CF20] has been cited 2142 times, and the underlying algorithms have been adapted to run on the HoloLens headset. The technologies developed and, equally important, the insights gained from this work form the foundations for my research.

Research at ETH. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Since March 2013, one particular area of interest has been the development of novel input devices alongside advanced input recognition algorithms that extract semantic meaning from low-level sensor data. For example, we are researching methods to create hybrid interfaces – interfaces that bring the qualities of natural user interfaces (NUIs) to the domain of productivity and knowledge work, which requires high precision and input bandwidth. We have developed a new type of augmented mechanical keyboard that senses rich and expressive motion gestures performed both on and directly above the device (see Figure 1). Custom electronics integrated into the keyboard allow for high-speed sensing, albeit at low spatial resolution, and a novel machine learning algorithm detects a large set of motion gestures [CF12]. The paper received best paper honors at SIGCHI, one of the premier outlets for HCI research.
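The recognition step described above – classifying motion gestures from low-resolution sensor frames – can be illustrated with a deliberately simplified stand-in. The sketch below uses a nearest-centroid classifier over flattened frames of toy sensor data; the actual system uses a more sophisticated learned classifier, and all names, frame sizes and gesture classes here are illustrative assumptions, not details from [CF12].

```python
import numpy as np

rng = np.random.default_rng(1)

def make_gesture(kind, frames=10, cells=16):
    # Toy sensor data: each gesture class has a distinct activation pattern
    # across a short sequence of low-resolution frames.
    base = np.zeros((frames, cells))
    if kind == "swipe":
        for f in range(frames):
            base[f, f % cells] = 1.0  # activation sweeps across cells
    else:  # "hover"
        base[:, cells // 2] = 1.0     # activation stays in one place
    return (base + 0.05 * rng.normal(size=base.shape)).ravel()

# "Train" by averaging noisy examples of each class into a centroid.
train = {k: np.mean([make_gesture(k) for _ in range(20)], axis=0)
         for k in ("swipe", "hover")}

def classify(sample):
    # Assign the class whose centroid is closest in Euclidean distance.
    return min(train, key=lambda k: np.linalg.norm(sample - train[k]))

predicted = classify(make_gesture("swipe"))
```

In a real pipeline the hand-crafted centroids would be replaced by a trained classifier operating on features of the frame sequence, but the structure – frames in, gesture label out – is the same.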

Figure 1: Our work enables fast, easy, low-effort gestures and smooth transitions back to typing, where the user's hands always remain in the "home position" [CF12]. Gestures recognized by a RF-based classifier allow users to combine typing and gesturing to (A) navigate documents, (B) switch tasks, and (C+D) control complex applications that require frequent mode changes.

Related work has looked into developing custom, modular and tangible input devices for 3D character control [J3, J1]. Going even further, we have recently developed a novel optimization-based algorithm for the design and fabrication of customized, deformable input devices that are capable of continuously sensing their deformation [CF2]. We embed piezoresistive sensing elements into flexible 3D-printed objects (see Figure 2). These sensing elements are then utilized to recover rich and natural user interactions at runtime. Designing such objects manually is a challenging problem for all but the simplest geometries and deformations. Our method simultaneously optimizes the internal routing of the sensing elements and computes a mapping from low-level sensor readings to user-specified outputs so as to minimize reconstruction error.
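The final step of the method – fitting a mapping from low-level sensor readings to user-specified outputs that minimizes reconstruction error – can be sketched, in its simplest linear form, as an ordinary least-squares fit. Everything below (dimensions, noise level, variable names) is an illustrative assumption, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: each column pairs a raw sensor reading vector (8 sensing
# elements) with the designer-specified output (3 deformation parameters),
# as captured during the example deformations.
n_sensors, n_outputs, n_examples = 8, 3, 50
true_map = rng.normal(size=(n_outputs, n_sensors))
readings = rng.normal(size=(n_sensors, n_examples))
outputs = true_map @ readings + 0.01 * rng.normal(size=(n_outputs, n_examples))

# Fit W minimizing ||W R - Y||_F^2 (ordinary least squares).
W, *_ = np.linalg.lstsq(readings.T, outputs.T, rcond=None)
W = W.T

# At runtime, a new reading maps directly to an estimated deformation.
estimate = W @ readings[:, 0]
error = np.linalg.norm(estimate - outputs[:, 0])
```

The actual method additionally optimizes where the sensing elements are routed, which changes what `readings` contains; the least-squares fit above only stands in for the mapping step.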

Figure 2: Design and fabrication of custom input devices (from left to right): the designer creates a set of example deformations (1) and roughly indicates where to place internal sensors. Our optimization algorithm then refines the sensor placement to maximize reconstruction accuracy (2). We fabricate our designs by inserting piezoresistive wires in between 3D-printed body parts (3), then calibrate using motion capture (4). Our customized flexible input devices can be used in a variety of applications, e.g., to animate digital characters (5).

With mobile devices becoming ever smaller, the question of how to interact with information is becoming more pressing. Addressing this challenge, we are developing a number of novel means of input that complement the touchscreen of current mobile devices. In [CF11] we developed sophisticated algorithms that extend the interaction space around mobile devices by detecting rich gestures performed behind or in front of the screen. The technique uses only the built-in RGB camera, recognizes a wide range of gestures robustly, copes with user variation and varying lighting conditions, and runs entirely on mobile devices (see Figure 3). More recently, we have extended this work to jointly recognize gestures and estimate the metric depth of hands for 3D interaction from 2D imagery alone [CS2]. This enables spatial interactions with small, body-worn devices where rich 3D input is desired but the use of conventional depth sensors is prohibitive.

1 http://www.microsoft.com/microsoft-hololens/en-us


Figure 3: Touch input is expressive but can occlude large parts of the screen (A). We propose [CF11] a machine-learning-based algorithm for gesture recognition, expanding the interaction space around the mobile device (B) and adding in-air gestures and hand-part tracking (D) to commodity off-the-shelf mobile devices, relying only on the device's camera (with no hardware modifications). We demonstrate a number of compelling interactive scenarios, including bi-manual input to mapping and gaming applications (C+D). The algorithm runs in real time and can even be used on ultra-mobile devices such as smartwatches (E).

In addition to the above contributions to interface design and input recognition, I have started a new direction of research in the last couple of years. Robotics and other cyber-physical systems have advanced rapidly in recent years; we are now very close to robotics becoming a mainstream, end-user facing technology. A future in which humans and physically actuated, intelligent machines share the same environment is therefore conceivable. Following from this premise, the question of how non-expert users will interact with such cyber-physical systems in a natural, intuitive and safe manner is becoming increasingly pressing. Our work in this space has focused on robotic perception, allowing for safe navigation very close to humans, and on programming interfaces that allow novice users to create complex robotic behavior.

We have developed environment-independent methods for the localization of individual quadcopters [CF9, CF3, CF4] and swarms of robots [CF10] in indoor and outdoor environments, in particular in those shared with humans. These are building blocks for more advanced application scenarios in which mobile robots have awareness of humans.

Figure 4: Interactive computational design of quadrotor trajectories: (A) user interface to specify keyframes and the dynamics of quadrotor flight. (B) An optimization algorithm generates feasible trajectories and (C) a 3D preview allows the user to quickly iterate on them. (D) The final motion plan can be flown by real quadrotors. The tool enables a number of compelling use cases such as (B) robotic light-painting, aerial racing and (D) aerial videography.

As robots enter the consumer market, the question of how to specify robotic behavior is becoming increasingly urgent. In recent work [CF1] we proposed a computational design tool that allows end-users to design advanced quadcopter behavior, with applications in aerial displays, gaming and aerial videography. Our algorithm allows novice users to create quadcopter-based use cases without requiring deep knowledge of either quadcopter control or the underlying constraints of the target domain. To achieve this goal we propose a fast optimization-based method that generates control inputs bounded to be within the physical limits of the robotic platform, so that the resulting plans can be flown in the real world (see Figure 4). Furthermore, the method incorporates high-level human objectives into the planning of flight trajectories. A set of easy-to-use 2D and 3D design tools allows for quick specification and editing of trajectories as well as for intuitive exploration of the resulting solution space.
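One core idea – generating trajectories that respect the physical limits of the platform – can be sketched with a toy example: interpolate user keyframes, then uniformly stretch the timing until the sampled speed stays within an assumed velocity limit. The real method solves a richer optimization over the quadrotor's full dynamics; every name and number below is illustrative, not from [CF1].

```python
import numpy as np

# User-specified keyframes (x, y positions in meters) and their timings.
keyframes = np.array([[0.0, 0.0], [2.0, 1.0], [4.0, 0.0]])
times = np.array([0.0, 1.0, 2.0])
v_max = 1.5  # assumed platform velocity limit (m/s)

def max_speed(times, keyframes, samples=200):
    # Piecewise-linear interpolation stands in for the optimized smooth
    # trajectory; finite differences estimate the speed along it.
    t = np.linspace(times[0], times[-1], samples)
    x = np.interp(t, times, keyframes[:, 0])
    y = np.interp(t, times, keyframes[:, 1])
    v = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    return v.max()

# Uniformly time-scale the plan until it satisfies the velocity constraint,
# so the resulting motion plan is flyable on the real platform.
scale = 1.0
while max_speed(times * scale, keyframes) > v_max:
    scale *= 1.1

feasible_times = times * scale
```

Time-scaling is only one feasibility mechanism; the actual tool also respects acceleration and actuator bounds and keeps the timing close to the user's intent.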

Future Research Directions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

In my future research I want to further strengthen my academic approach and continue to push the boundaries of how we use technology, ultimately creating novel user-interaction technologies and redefining user experiences. As outlined in the research overview, I am active in three separate but strongly interwoven sub-areas of HCI research, and I will continue to work on these three main threads: (i) novel input devices and interfaces for high-precision, high-bandwidth input, (ii) design approaches that accelerate the development of interactive technologies, and (iii) novel forms of human-robot interaction. While this is a diverse set of topics, they build upon each other, and results from one area often feed back into the others. These are challenging, fruitful and important areas to work in; ultimately, the kinds of technologies we develop will shape how we experience technology daily and at large scale.


I am particularly interested in how we can bridge the increasing disconnect between technology-driven and user-experience-driven design of interactive technologies. I want to increase my focus on new forms of development processes that allow UX designers to continue to approach this difficult problem from the user perspective, without having to abstract away the increasingly complex technical challenges associated with next-generation interface technologies (e.g., sensing-based proactive systems, robotics and smart artificial assistants). The key problem to solve will be how to let designers quickly experiment with very different designs – and hence leverage their creativity – while giving them the support necessary to understand and develop complex functional prototypes and systems. To this end I will continue to develop intelligent design tools for interactive technologies [CF2, J1, CF1] and to embed these into the traditional user-centered design process via designer-in-the-loop approaches, in which the algorithm computes many possible design alternatives while the designer guides the algorithm's search process and ultimately selects the most promising configurations.

I will also continue to evaluate and refine these methods by implementing and testing a variety of challenging, real-world application scenarios, ranging from custom general-purpose interfaces to personalized wearable interfaces for rehabilitation, which I plan to test with real designers, domain experts and end-users. Insights gathered from these deployments will help guide the technical research and refine the methodology. Finally, these use cases have the potential to impact the design of mainstream user interfaces, and hence improve the everyday experience of society at large, and may deliver medical and assistive technologies to many patients who currently have no access to them due to cost and lack of expertise, thereby potentially improving the lives of millions.


Otmar Hilliges | Teaching Statement

Advanced Interactive Technologies Lab – ETH Zurich, Switzerland
Phone: +41 44 632 39 56 • Email: [email protected] • Web: http://ait.ethz.ch/

Teaching Statement
I enjoy teaching and the challenge of finding ways to help students acquire deep technical knowledge and a solid understanding of a subject. I believe that it is important for students to study theoretical foundations, yet also understand how to apply these techniques to solve practical problems, and finally to be able to discuss and argue their choices when solving problems.

Whenever I teach a class, I try to lay the foundation for independent and informed problem solving through thorough preparation and explanation of fundamental theories. Furthermore, the knowledge being conveyed needs to be made tangible through concrete examples and connections to real-world implications. I am convinced that 'doing' is the most promising path to learning; therefore I always assign students practical and realistic tasks that have to be solved independently – alone or in small groups. Finally, I believe that being able to reason about one's choices is the best way to improve one's chances of succeeding in professional life. To train reasoning skills and the ability to accept and articulate criticism, every student in my graduate-level courses has to present his or her work to the group and explain key decisions at least once per course.

Furthermore, I truly believe that real learning only happens when students are engaged. To achieve this engagement I always invest extra effort in designing in-class activities that allow students to gain new viewpoints and insights into a particular topic. For example, in my "User Interface Engineering" course I dedicate significant time to a peer-review simulation in which students read, review and discuss academic papers related to the class content. Over an entire lecture we then run a full program-committee simulation, so that students have to think about, discuss and eventually agree upon what makes for a good paper and what does not. I believe this exercise provides valuable lessons in independent and critical thinking and in reading and writing skills, and, not least, provides a direct connection between the course materials and state-of-the-art research. Since introducing this activity into the curriculum, student engagement – and consequently learning outcomes and teaching evaluations – have increased significantly. I now embed similar activities in all courses and seminars that I am involved in.

A full list of teaching activities and syllabi can be found at http://ait.inf.ethz.ch/teaching/. In particular, I currently teach the following courses:

User Interface Engineering, Graduate Course, 30 Students, 4 ECTS: This is an advanced course on the technical aspects of HCI, including sensor hardware and algorithms for input recognition and semantic analysis of user input. I developed this course from scratch, as no textbook on this subject is available and no comparable course is taught elsewhere.
Parallel Programming, Undergraduate Course, 300 Students, 7 ECTS: This is an introductory first-year course that teaches the basics of programming and parallelism in modern software development.
Human Computer Interaction, Undergraduate Course, 88 Students, 4 ECTS: This course introduces students to the basic aspects of Human-Computer Interaction, including the user-centered design process, experimental design and analysis, as well as interaction techniques and paradigms.

Looking into the future, I would be interested in teaching undergraduate computer science courses of all kinds as well as graduate courses in applied computer science such as human-computer interaction, computer graphics and computer vision. I would also like to design an applied machine-learning course with a particular focus on HCI-relevant topics.
