Adaptive Typing

Adaptive Typing

A Study of Users’ Mental Models to Achieve Intuitive Typing

Sheng Kai Tang
User Experience Design, ASUS

Introduction

Dual-screen laptops are an emerging trend, which makes typing on an LCD screen that displays dynamic content, rather than on a fixed physical keyboard, a new design possibility. However, the current approach of simply rendering a static keyboard on the screen wastes the potential of a touch screen that can both sense and display. In this research, we design an “Adaptive Typing” mechanism that actively detects the user’s palms resting on the screen and automatically predicts key locations. The mechanism also provides minimal visual cues to assist novice users. To realize this idea, we built an experimental desk to collect and observe users’ mental models and hand ergonomics while typing. As shown in the picture, the upper camera captures data about keys and fingers, while the lower camera captures data about the palms through a transparent panel in which a conventional keyboard and a touchpad are embedded. We believe that by overlapping the image data from the upper and lower cameras, the hidden relationships between the palms and typing can be revealed.
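
The sketch below illustrates the basic idea of anchoring predicted key locations to the detected palm positions. All offset values and the hands’ coordinate system are assumptions for illustration; the real offsets would come from the experimental-desk recordings described above.

```python
# Minimal sketch (hypothetical layout values): predict key centers from the
# detected palm-rest positions so the virtual keyboard follows the hands
# instead of staying at a fixed location on the screen.

from dataclasses import dataclass

@dataclass
class Point:
    x: float  # millimetres in the touch-surface coordinate system
    y: float

# Assumed offsets (mm) from each palm centroid to the resting fingertips;
# real values would be measured from the upper/lower camera data.
LEFT_HOME_OFFSETS = {"A": (-45, -70), "S": (-25, -75), "D": (-5, -78), "F": (15, -75)}
RIGHT_HOME_OFFSETS = {"J": (-15, -75), "K": (5, -78), "L": (25, -75), ";": (45, -70)}

def predict_home_row(left_palm: Point, right_palm: Point) -> dict:
    """Anchor the home-row keys to wherever the palms are currently resting."""
    keys = {}
    for label, (dx, dy) in LEFT_HOME_OFFSETS.items():
        keys[label] = Point(left_palm.x + dx, left_palm.y + dy)
    for label, (dx, dy) in RIGHT_HOME_OFFSETS.items():
        keys[label] = Point(right_palm.x + dx, right_palm.y + dy)
    return keys

if __name__ == "__main__":
    predicted = predict_home_row(Point(120, 200), Point(260, 205))
    for label, p in predicted.items():
        print(f"{label}: ({p.x:.0f}, {p.y:.0f})")
```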


Florabot

Florabot

Swarm Robots for the 2010 Taipei International Flora Expo

Sheng Kai Tang, Patrick Chiu, Hunter Luo, Parks Tzeng
User Experience Design, ASUS

Introduction

In this project, we design and build 438 flower robots for the 2010 Taipei International Flora Expo. Our goal is to equip each individual with the same minimal rules yet achieve highly intelligent collective behavior; we therefore adopt the idea of “Self-Organization” to realize “Swarm Intelligence”. Each “Florabot” has an MCU, IR transceivers, a tri-color LED, a motor, and a stretch structure. The IR transceiver on top of the flower senses the presence of a visitor. Once a visitor is detected, the tri-color LED in the head changes color and the stretch structure modifies the head’s size according to how close the visitor is. The Florabot also propagates the visitor’s presence to its neighbors through the IR transceivers at its base, and the neighbors react further based on the received information.
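
The following sketch shows one way such a per-robot rule could look. The thresholds, colors, and the attenuation factor for relayed signals are invented for illustration; the actual firmware runs on the MCU with the hardware described above.

```python
# Minimal sketch of a per-robot rule, assuming a normalized proximity reading
# (0 = nobody nearby, 1 = visitor very close) and hypothetical thresholds/colors.

def local_update(proximity: float, neighbor_signal: float) -> dict:
    """One self-organization step: react to the stronger of the direct reading
    and the attenuated signal relayed by neighboring Florabots."""
    stimulus = max(proximity, 0.6 * neighbor_signal)  # relayed info is weakened

    if stimulus > 0.8:
        color, bloom = (255, 0, 0), 1.0     # visitor very close: red, fully open
    elif stimulus > 0.4:
        color, bloom = (255, 160, 0), 0.6   # medium distance: orange, half open
    else:
        color, bloom = (0, 80, 255), 0.2    # idle: blue, mostly closed

    return {
        "led_rgb": color,       # target for the tri-color LED
        "bloom": bloom,         # motor target for the stretch structure
        "broadcast": stimulus,  # re-transmitted to neighbors via the base IR
    }

# Example: a robot that senses nothing directly but hears an excited neighbor.
print(local_update(proximity=0.0, neighbor_signal=0.9))
```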


Virtual Mouse

Virtual Mouse

Development of a Proximity-Based Gestural Pointing Device

Sheng Kai Tang
User Experience Design, ASUS

Introduction

What if the affordance of a device is removed entirely? Without physical and visual cues, can users still perform tasks as usual? To explore these questions, we propose a new pointing device called “Virtual Mouse”. Unlike a conventional computer mouse, the Virtual Mouse has no physical form for users to manipulate. Instead, by moving a hand and tapping on the desk, users can control the mouse cursor and trigger button functions intuitively. Technically, we implement a proximity-sensing bar and a pattern recognition algorithm to achieve these goals. The proximity-sensing bar, consisting of ten IR transceivers, collects digital signals representing the contour of a nearby hand. The pattern recognition algorithm and a state machine then recognize the collected signal patterns and their transitions. Finally, these patterns and transitions are mapped to cursor movements and button functions.
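
A minimal sketch of how such a pattern-and-transition mapping could work is shown below. The specific rules (centroid of active channels drives horizontal movement; a brief disappear/reappear of the hand counts as a tap) are assumptions for illustration, not the project’s actual recognition algorithm.

```python
# Hypothetical mapping from the ten-channel IR pattern to cursor motion and clicks.

def centroid(bits):
    """Center of the active IR channels, or None when no hand is present."""
    active = [i for i, b in enumerate(bits) if b]
    return sum(active) / len(active) if active else None

class VirtualMouse:
    def __init__(self):
        self.prev_centroid = None
        self.absent_frames = 0

    def update(self, bits):
        """Consume one 10-bit frame; return (dx, clicked)."""
        c = centroid(bits)
        dx, clicked = 0.0, False
        if c is None:
            self.absent_frames += 1
        else:
            # A brief absence followed by reappearance is read as a tap.
            clicked = 0 < self.absent_frames <= 3
            if self.prev_centroid is not None:
                dx = c - self.prev_centroid
            self.absent_frames = 0
        self.prev_centroid = c
        return dx, clicked

mouse = VirtualMouse()
for frame in ([0,0,1,1,1,0,0,0,0,0], [0,0,0,1,1,1,0,0,0,0],
              [0]*10, [0,0,0,1,1,1,0,0,0,0]):
    print(mouse.update(frame))
```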

Publication

Tang, S.K., Tzeng, W.C., Chiu, K.C., Luo, W.W., Lin, S.T., and Liu, Y.P.: 2011, Virtual Mouse: A Low Cost Proximity Based Gestural Pointing Device. HCI International 2011.

Tang, S.K., Tzeng, W.C., Chiu, K.C., Luo, W.W., Lin, S.T., and Liu, Y.P.: 2011, Virtual Mouse: A Low Cost Proximity Based Gestural Pointing Device. TEI2011 Workshop.

 

Tangible Slider

Tangible Slider

A Capacitive Touch Slider Enabling Intuitive Sidebar Manipulation and Control

Sheng Kai Tang
User Experience Design, ASUS

Introduction

The sidebar is a widely used component in most window-based applications; here it is defined as a container where application shortcuts are located. Currently, a hidden sidebar is called out by pointing the cursor at a hot area or hot edge, after which users point at a shortcut icon and click to trigger it. In this project, we replace this conventional interaction model with a “Tangible Touch Slider” to increase usability. The Tangible Touch Slider is a horizontal touch-sensitive area with lighting feedback on the C part of a laptop (the keyboard deck). Touching the area calls out the related sidebar on the screen, sliding along it switches among shortcut icons, and releasing triggers the selected shortcut. In a preliminary test based on GOMS, this new interaction model saves up to 1.5 seconds, especially when users are doing typing-oriented work.
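
The rough Keystroke-Level Model comparison below shows where a saving of that magnitude can come from. It uses the commonly cited KLM operator estimates (H = homing 0.4 s, P = pointing 1.1 s, B = button press/release 0.1 s, M = mental preparation 1.35 s); the operator sequences themselves are assumptions made for illustration, not the sequences measured in the project’s preliminary test.

```python
# Back-of-the-envelope KLM comparison with assumed operator sequences.

KLM = {"H": 0.4, "P": 1.1, "B": 0.1, "M": 1.35}

def estimate(sequence: str) -> float:
    """Sum the operator times for a sequence such as 'HMPPBBH'."""
    return sum(KLM[op] for op in sequence)

# Cursor-based sidebar while typing: home to the mouse, point at the hot edge,
# point at the shortcut, click, then home back to the keyboard.
cursor_model = "HMPPBBH"
# Tangible slider while typing: home to the slider, slide to the icon, release.
slider_model = "HMPB"

t_cursor, t_slider = estimate(cursor_model), estimate(slider_model)
print(f"cursor sidebar ≈ {t_cursor:.2f} s, tangible slider ≈ {t_slider:.2f} s, "
      f"difference ≈ {t_cursor - t_slider:.2f} s")
```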


Seamless Mobility

Seamless Mobility

Realizing “Interface Everywhere” by Object Recognition and Micro-Projection Technologies

Sheng Kai Tang, Patrick Chiu, Hunter Luo and Parks Tzeng
User Experience Design, ASUS

Introduction

How to bring the conventional Graphical User Interface off the screen and into physical space has been a popular research topic for more than a decade. Most approaches focus on discovering and improving the underlying technologies of Augmented Reality: researchers develop high-quality sensing and recognition techniques and even package them into toolkits for public use. These foundations allow designers, especially in the consumer product industry, to rethink and generate diverse applications. We therefore adopt the Tangible User Interface concept proposed by Professor Hiroshi Ishii and the reacTIVision toolkit to demonstrate the idea of a seamless interface for future consumer products. In our preliminary working demonstration shown at COMPUTEX 2009, users can intuitively perceive and manipulate tangible and graphical interfaces at the same time, with physical and virtual information highly integrated.
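
As a conceptual sketch of the pipeline: reacTIVision tracks fiducial markers attached to physical objects and streams their id, position, and angle over the TUIO protocol; a client of that stream can then map each recognized object to a projected interface panel. The handler and panel registry below are hypothetical illustrations, not part of the reacTIVision toolkit’s API.

```python
# Hypothetical client-side handler for tracked fiducial markers.

PROJECTED_PANELS = {
    12: "music player controls",
    34: "photo browser",
    56: "system settings",
}

def on_fiducial_update(marker_id: int, x: float, y: float, angle: float) -> None:
    """Called for each tracked marker (x, y normalized to the projection area)."""
    panel = PROJECTED_PANELS.get(marker_id)
    if panel is None:
        return  # unknown object: project nothing
    # A micro-projector would render the panel next to the physical object,
    # rotated to match how the user has placed it.
    print(f"project '{panel}' at ({x:.2f}, {y:.2f}), rotated {angle:.0f} deg")

on_fiducial_update(34, 0.62, 0.41, 15.0)
```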


Sneak Peeking Bars

Sneak Peeking Bars

Detecting Eye Position to Control the Presence of Sidebars

Sheng Kai Tang
User Experience Design, ASUS

Introduction

The sidebar is an essential control for window-based applications. Most sidebars have an auto-hide mechanism to spare more window space, and users bring out a hidden sidebar by moving the mouse cursor to the edge where it is hidden. In this project, we propose a new way of bringing out sidebars, which we call “sneak peeking sidebars”. First, we place all sneak peeking sidebars outside the frame of the window, so users cannot see them while looking at the window straight on. However, when users move their head slightly to peek at the edges of the window, the sidebars hidden outside it come easily into view. The sneak peeking mechanism is based on the idea of “dynamic perspective”, widely adopted in interactive computer graphics to create convincing 3D scenes: the system actively detects the user’s eye position and regenerates the perspective view of the scene accordingly.
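
The sketch below works through the parallax geometry with assumed dimensions: the window content is treated as a plane a few centimetres behind the physical screen, so a lateral eye movement shifts the visible region and exposes a sidebar parked just outside the window frame. All distances are illustrative, not values from the project.

```python
# Dynamic-perspective parallax sketch (units in centimetres, assumed geometry).

def visible_sidebar_width(eye_offset_x: float,
                          eye_to_screen: float = 50.0,
                          content_depth: float = 5.0,
                          sidebar_width: float = 4.0) -> float:
    """How much of the off-window sidebar the user can currently see."""
    # Parallax shift of the content plane relative to the screen plane.
    shift = eye_offset_x * content_depth / (eye_to_screen + content_depth)
    return max(0.0, min(sidebar_width, shift))

for offset in (0.0, 10.0, 30.0, 60.0):
    print(f"eye offset {offset:4.0f} cm -> sidebar visible "
          f"{visible_sidebar_width(offset):.1f} cm")
```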


Calligraphic Brush

Calligraphic Brush

An Intuitive Tangible User Interface for Interactive Algorithmic Design

Sheng Kai Tang
User Experience Design, ASUS

Introduction

The development of better User Interfaces (UI) and Tangible User Interfaces (TUI) for 3D modeling has continued for decades. With the growing popularity of free-form styles achieved by algorithmic methods, existing UI/TUI solutions for CAD are becoming insufficient. Setting aside the steep learning curve of algorithmic design, which requires a solid background in mathematics and programming, the common drawback is a lack of interactivity: all actions rely heavily on mental translation and experimental trial and error. In this research, we realize the idea of interactive algorithmic design by developing a tangible calligraphic brush, with which designers can intuitively adopt an algorithmic methodology to achieve highly creative results.
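
To make the idea of interactive algorithmic design concrete, the sketch below shows a hypothetical mapping from brush pose to an algorithmic surface: each sampled pose contributes one cross-section of a swept form, so the designer steers the algorithm by drawing rather than by editing code or equations. The pose parameters and the radius/twist mapping are invented for illustration.

```python
# Hypothetical brush-pose-to-geometry mapping for interactive algorithmic design.

import math

def cross_section(x, y, pressure, tilt_deg, sides=6):
    """Polygon cross-section whose radius follows pressure and whose rotation
    follows the brush tilt (all inputs are hypothetical sensor readings)."""
    radius = 2.0 + 8.0 * pressure            # harder press -> wider section
    twist = math.radians(tilt_deg)
    return [(x + radius * math.cos(twist + 2 * math.pi * k / sides),
             y + radius * math.sin(twist + 2 * math.pi * k / sides))
            for k in range(sides)]

# A short simulated stroke: (x, y, pressure, tilt)
stroke = [(0, 0, 0.2, 0), (5, 2, 0.6, 15), (10, 3, 0.9, 30), (15, 3, 0.4, 45)]
sections = [cross_section(*pose) for pose in stroke]
print(f"{len(sections)} cross-sections, first vertex: {sections[0][0]}")
```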

Publication

Tang, S.K. and Tang, W.Y.: 2009, Calligraphic Brush: An Intuitive Tangible User Interface for Interactive Algorithmic Design. In Proceedings of the Fourteenth Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2009).
<Best Paper Award in CAADRIA 2009>

Adaptive Mouse

Adaptive Mouse

Toward a Discovery of Formal and Functional Adaptabilities

Sheng Kai Tang
User Experience Design, ASUS
Computational Design Program, CMU

Introduction

The Adaptive Mouse (AM) is built from a smart material that is deformable and capable of recognizing its own deformation. The deformation provides a comfortable, ergonomic shape for users’ diverse hand grips, and the smart material can dynamically activate any area of its surface to serve as the conventional buttons and scroll wheel. The prediction of these active areas is based on the recognition of the user’s hand grip. To work with the AM, users simply hold it in whatever grip is comfortable and preferred; moving their index and middle fingers intuitively will then trigger the related button functions, and users can freely move the mouse while always receiving accurate cursor feedback.
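
The sketch below illustrates the grip-to-button-placement idea with invented sensor values: a crude classifier guesses the grip style from how the pressure is distributed between palm and fingertips, and the zones under the index and middle fingers are then activated as the left and right buttons for that grip. The grip templates and zone coordinates are assumptions, not the AM’s actual recognition method.

```python
# Hypothetical grip classification and button-zone activation.

GRIP_TEMPLATES = {
    # grip name -> expected fraction of total pressure on (palm, fingertips)
    "palm_grip":      (0.7, 0.3),
    "fingertip_grip": (0.2, 0.8),
}

def classify_grip(palm_pressure: float, fingertip_pressure: float) -> str:
    """Pick the template closest to the observed pressure distribution."""
    total = (palm_pressure + fingertip_pressure) or 1.0
    observed = (palm_pressure / total, fingertip_pressure / total)
    return min(GRIP_TEMPLATES,
               key=lambda g: sum((a - b) ** 2
                                 for a, b in zip(GRIP_TEMPLATES[g], observed)))

def active_button_zones(grip: str) -> dict:
    """Hypothetical zone centers (mm) on the mouse body for each grip."""
    if grip == "palm_grip":
        return {"left_button": (15, 60), "right_button": (35, 60)}
    return {"left_button": (10, 45), "right_button": (28, 45)}

grip = classify_grip(palm_pressure=8.0, fingertip_pressure=2.5)
print(grip, active_button_zones(grip))
```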

Publication

Tang, S.K. and Tang, W.Y.: 2010, Adaptive Mouse: A Computer Mouse Achieving Form-Function Synchronization. In Proceedings of CHI 2010, pp. 2785-2792.