Hi, I am Lung-Pan. I am a PhD student with Prof. Patrick Baudisch at Hasso Plattner Institute. My research focuses on virtual reality, specifically haptics and mobile technology; my recent work aims to bring immersive haptic experiences to people. During my master's, I worked with Prof. Mike Y. Chen in the Mobile HCI Lab at National Taiwan University on novel interactions and input/output technologies.
TurkDeck: Physical Virtual Reality Based on People
TurkDeck is an immersive virtual reality system that reproduces not only what users see and hear, but also what they feel. TurkDeck allows creating arbitrarily large virtual worlds in a finite space and with a finite set of physical props. The key idea behind TurkDeck is that it creates these physical representations on the fly by having a group of human workers present and operate the props only when and where the user can actually reach them. TurkDeck manages these so-called “human actuators” by displaying visual instructions that tell them when and where to place props and how to actuate them. Published at UIST 2015.
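As a rough illustration of this scheduling idea, here is a minimal Python sketch; the Prop class, reach radius, and instruction format are simplifying assumptions for illustration, not TurkDeck's actual implementation:

    import math
    from dataclasses import dataclass

    REACH_RADIUS = 1.5  # meters the user can reach; an assumed value

    @dataclass
    class Prop:
        name: str
        position: tuple   # (x, y) in room coordinates
        placed: bool = False

    def schedule(user_pos, props, free_workers):
        """Return (worker, instruction) pairs for props the user will reach soon."""
        jobs = []
        for prop in props:
            if prop.placed or not free_workers:
                continue
            if math.dist(user_pos, prop.position) < 2 * REACH_RADIUS:
                worker = free_workers.pop()
                jobs.append((worker, f"place {prop.name} at {prop.position}"))
                prop.placed = True
        return jobs

    # One worker is told to stage the 'door' prop before the user arrives.
    print(schedule((0, 0), [Prop("door", (2.0, 1.0))], ["worker-1"]))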
Level-Ups: Motorized Stilts that Simulate Stair Steps
We present “Level-Ups”, computer-controlled stilts that allow virtual reality users to experience walking up and down steps. Each Level-Up unit is a self-contained device worn like a boot; its main functional element is an actuation mechanism mounted to the bottom of the boot that extends vertically. Unlike traditional solutions that are integrated with locomotion devices, Level-Ups allow users to walk around freely (“real-walking”). Published at CHI 2015.
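The core control idea can be sketched in a few lines of Python; the terrain function and actuator range below are invented for illustration, not the actual Level-Ups firmware:

    MAX_EXTENSION_CM = 25.0  # hypothetical actuator travel

    def terrain_height(x, y):
        """Virtual step height (cm) at a floor position; a stub for illustration."""
        return 10.0 if x > 1.0 else 0.0  # one 10 cm step at x = 1 m

    def stilt_target(foot_x, foot_y):
        """Extension (cm) so the foot lands at the virtual surface height."""
        return max(0.0, min(MAX_EXTENSION_CM, terrain_height(foot_x, foot_y)))

    print(stilt_target(1.5, 0.0))  # -> 10.0: extend to simulate stepping up
    print(stilt_target(0.5, 0.0))  # -> 0.0: flat ground, stilts retracted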
Haptic Turk: a Motion Platform Based on People
We present Haptic Turk, a different approach to motion platforms that is light and mobile. The key idea is to replace motors and mechanical components with humans. Every Haptic Turk setup consists of a player who is supported by one or more human actuators. The player enjoys an interactive experience, such as a flight simulation. The motion in the player's experience is generated by the actuators, who manually lift, tilt, and push the player's limbs or torso. Published at CHI 2014.
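To give a flavor of how simulated motion could be turned into cues for the human actuators, here is a Python sketch; the four-corner layout, threshold, and cue vocabulary are my own simplifications, not the actual Haptic Turk instruction system:

    def actuator_cues(pitch_deg, roll_deg, threshold=5.0):
        """Map the simulation's pitch/roll to up/down cues for four actuators."""
        cues = {}
        for name, sign_p, sign_r in [("front-left", 1, 1), ("front-right", 1, -1),
                                     ("back-left", -1, 1), ("back-right", -1, -1)]:
            level = sign_p * pitch_deg + sign_r * roll_deg
            if level > threshold:
                cues[name] = "lift"
            elif level < -threshold:
                cues[name] = "lower"
            else:
                cues[name] = "hold"
        return cues

    # Nose-up pitch: the front pair lifts while the back pair lowers.
    print(actuator_cues(pitch_deg=10, roll_deg=0))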
iGrasp: Grasp-Based Adaptive Keyboard for Mobile Devices
We present iGrasp, which automatically adapts the layout and position of virtual keyboards based on how and where users are grasping the devices without requiring explicit user input. Our prototype uses 46 capacitive sensors positioned along the sides of an iPad to sense users’ grasps, and supports two types of grasp-based automatic adaptation: layout switching and continuous positioning. Published at CHI 2013.
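A minimal sketch of how such side sensors could drive a split keyboard, assuming an even 23-sensors-per-side layout and made-up readings (the real classifier is more involved):

    N_PER_SIDE = 23  # assuming the 46 sensors split evenly, one strip per side

    def contact_center(readings):
        """Weighted average index of the activated sensors; None if not gripped."""
        total = sum(readings)
        if total < 1e-6:
            return None
        return sum(i * v for i, v in enumerate(readings)) / total

    def keyboard_anchors(left_strip, right_strip):
        """Normalized (0..1) vertical anchors for the two keyboard halves."""
        anchors = {}
        for side, strip in (("left", left_strip), ("right", right_strip)):
            c = contact_center(strip)
            if c is not None:
                anchors[side] = c / (N_PER_SIDE - 1)
        return anchors

    # A grip near the bottom of both strips places both halves low on screen.
    grip = [0.0] * 18 + [0.2, 0.8, 1.0, 0.8, 0.2]
    print(keyboard_anchors(grip, grip))  # {'left': ~0.91, 'right': ~0.91}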
iRotateGrasp: Automatic Screen Rotation based on Grasp of Mobile Devices
iRotateGrasp automatically rotates the screens of mobile devices to match users' viewing orientations based on how users are grasping the devices. It rotates screens correctly across different postures and device orientations without explicit user input. Our insight is that users' grasps are consistent within each viewing orientation, but differ significantly between orientations. We implemented several prototypes that sense users' grasps and classify them into viewing orientations. Published as a UIST 2012 demo and a CHI 2013 short paper.
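The classification step can be sketched as nearest-neighbor matching against per-orientation grasp templates; the sensor vectors below are made up and much shorter than a real prototype's:

    import math

    def classify(grasp, templates):
        """Return the orientation whose template is closest to the sensor vector."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(templates, key=lambda label: dist(grasp, templates[label]))

    templates = {
        "portrait":       [1.0, 0.9, 0.0, 0.0],
        "landscape-left": [0.0, 0.0, 1.0, 0.9],
    }
    print(classify([0.9, 1.0, 0.1, 0.0], templates))  # -> portrait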
iRotate: Automatic Screen Rotation based on Face Orientation
Current approaches to automatic screen rotation are based on gravity and device orientation. Our survey shows that most users have experienced auto-rotation that leads to an incorrect viewing orientation. iRotate solves this problem by automatically rotating the screens of mobile devices to match users' face orientations, using the front camera and face detection. Published at CHI 2012.
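Once a face angle is available, the rotation decision itself is simple; a Python sketch, assuming some front-camera face detector supplies the roll angle:

    def screen_rotation(face_roll_deg):
        """Face roll relative to the device (0 = upright) -> UI rotation in degrees."""
        return (round(face_roll_deg / 90.0) * 90) % 360

    for angle in (5, -80, 175):
        print(angle, "->", screen_rotation(angle))  # 5 -> 0, -80 -> 270, 175 -> 180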
TUIC: Enabling Tangible Interaction on Capacitive Multi-touch Displays
TUIC enables tangible interaction on capacitive multi-touch devices, such as the iPad, iPhone, and multi-touch displays, without requiring any hardware modifications. TUIC simulates finger touches on capacitive displays using passive materials and active modulation circuits embedded inside tangible objects, and can be used alongside multi-touch gestures. After the system recognizes the pattern of a TUIC object, users can manipulate it by translating and rotating it on the surface to control the corresponding virtual object. Published at CHI 2011.
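A sketch of the recognition-and-tracking loop for a passive three-point tag; the tag geometry, tolerance, and names are invented for illustration:

    import math

    TAGS = {"knob": sorted([30.0, 40.0, 50.0])}  # side lengths in screen points

    def identify(touches, tolerance=3.0):
        """Match three touch points against known tags by pairwise distances."""
        a, b, c = touches
        sides = sorted([math.dist(a, b), math.dist(b, c), math.dist(a, c)])
        for name, ref in TAGS.items():
            if all(abs(s - r) < tolerance for s, r in zip(sides, ref)):
                return name
        return None

    def orientation(touches):
        """Angle between two extremal points, a simple proxy for tag rotation."""
        pts = sorted(touches)
        (x1, y1), (x2, y2) = pts[0], pts[-1]
        return math.degrees(math.atan2(y2 - y1, x2 - x1))

    touches = [(0.0, 0.0), (30.0, 0.0), (30.0, 40.0)]
    print(identify(touches), orientation(touches))  # -> knob 53.13...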

TurkDeck: Physical Virtual Reality Based on People
Lung-Pan Cheng, Thijs Roumen, Hannes Rantzsch, Sven Köhler, Patrick Schmidt, Robert Kovacs, Johannes Jasper, Jonas Kemper and Patrick Baudisch
In Proc. UIST 2015, pp. 417-426.

Level-Ups: Motorized Stilts that Simulate Stair Steps
Dominik Schmidt, Robert Kovacs, Vikram Mehta, Udayan Umapathi, Sven Köhler, Lung-Pan Cheng and Patrick Baudisch
In Proc. CHI 2015, pp. 2157-2160.

Haptic Turk: a Motion Platform Based on People
Lung-Pan Cheng, Patrick Lühne, Pedro Lopes, Christoph Sterz and Patrick Baudisch
In Proc. CHI 2014, pp. 3463-3472.

iGrasp: Grasp-Based Adaptive Keyboard for Mobile Devices
Lung-Pan Cheng, Kate Hsiao, Andrew Liu and Mike Y. Chen
In Proc. CHI 2013, pp. 3037-3046.

iRotateGrasp: Automatic Screen Rotation based on Grasp of Mobile Devices
Lung-Pan Cheng, Meng-Han Lee, Che-Yang Wu, Fang-I Hsiao, Yen-Ting Liu, Hsiang-Sheng Liang, Yi-Ching Chiu, Ming-Sui Lee and Mike Y. Chen
In Proc. CHI 2013, pp. 3051-3054, and in Adjunct Proc. UIST 2012, pp. 15-16.

iRotate: Automatic Screen Rotation based on Face Orientation
Lung-Pan Cheng, Kate Hsiao, Andrew Liu and Mike Y. Chen
In Proc. CHI 2012, pp. 2203-2210.

TUIC: Enabling Tangible Interaction on Capacitive Multi-touch Displays
Neng-Hao Yu, Li-Wei Chan, Seng Yong Lau, Sung-Sheng Tsai, I-Chun Hsiao, Dian-Je Tsai, Fang-I Hsiao, Lung-Pan Cheng, Mike Y. Chen, Polly Huang and Yi-Ping Hung
In Proc. CHI 2011, pp. 2995-3004, and in Adjunct Proc. UIST 2010, pp. 457-458.

[full pdf] [one page]

Research Interest

Human-Computer Interaction; large-scale haptics in VR; sensing techniques; mobile interactions

Education

Ph.D., Human-Computer Interaction, Hasso Plattner Institute, Germany
advisor: Prof. Patrick Baudisch, 11/2012 - present

M.S., Computer Science, National Taiwan University, Taiwan
advisor: Prof. Mike Y. Chen, 09/2011 - 11/2012 (1.5 years)

B.S., Computer Science, National Chiao Tung University, Taiwan
rank 3/56, 09/2006 - 01/2010 (3.5 years)

Work/Internships

Interaction Architecture Intern, Apple Inc., 10/2014 - 03/2015
prototyped tracking devices and 3D UI

Software Developer, Wantoto Inc., 07/2011 - 07/2012
developed iOS apps and web applications on Google Cloud

Chief Counselor, R.O.C. Army, Compulsory Military Service, 08/2010 - 07/2011

Network Testing Engineer Intern, Network Benchmarking Lab, 07/2008 - 07/2010
tested switches and routers by simulating different protocols on Spirent SmartBits®

Awards and scholarships

Studying Abroad Scholarship (US$ 32,000), Ministry of Education, Taiwan, 2015
Conference Grant (NT$ 40,000), National Science Council, Taiwan, Oct. 2012
Conference Grant (NT$ 40,000), Outstanding Scholar Foundation, Taiwan, May 2012
1st place (NT$ 20,000), Wargame Competition, Hacks In Taiwan, Taiwan, 2012
1st place (NT$ 300,000), Chung Hua Telecom Mobile Apps Competition, Taiwan, 2010
Lin Hsiung Chen Scholarship (NT$ 100,000), Taiwan, 2009
(GPA in the top 50 of all university students in Taiwan)
TSMC Scholarship (NT$ 100,000), Taiwan, 2008
(GPA in the top 3 of EECS students in NCTU)
4x Academic Achievement Awards (NT$ 6,000), NCTU, Taiwan, 2007-2009
(GPA in the top 5% in a class of 56 students)
4x Core Curriculum Awards (NT$ 6,000), NCTU, Taiwan, 2007-2009
(top 5% in OS, Algorithms, Assembly Language and Linear Algebra courses)

Proficiency

Programming languages: Objective-C, C/C++, C#, JavaScript, Python, PHP, SQL

Unity 3D app and plugin development (Windows, OS X, iOS)

iOS app and tweak (jailbroken app) development, external device communication

Hardware prototyping (Arduino, Processing, PCB design, soldering, laser cutting)

Tracking systems (OptiTrack, Razer Hydra, IMU data processing)

Cloud app development (Node.js, MongoDB, jQuery)