I am a haptics researcher with several years of academic and industrial experience. My efforts are directed towards connecting
fundamental haptics research with applications suited to industrial settings, and thus towards designing novel
and compelling applications and products. I have been an active member of the mainstream haptics
community and am now extending my involvement into the HCI sector.
My research interests are summarized as follows: [Research Statement - pdf]

Haptics Science:

The fundamental basis of my research is understanding the science of touch perception. This includes understanding proximal stimuli, sensory mechanisms,
neural pathways, mental representations, and cognitive processes. I use rigorous psychophysics to characterize the dynamic range and limitations of perception
and to develop mathematical and scientific models that relate physical stimulation to perception. My key interests are in force, tactile, vibrotactile,
shape, texture, and friction perception. My overall goal is to regenerate realistic and compelling haptic effects using digital technologies.
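
As an illustration of the kind of model meant here (two standard textbook relations, not the specific models developed in my work),
Weber's law and Stevens' power law relate physical intensity to perceived magnitude:

    \frac{\Delta I}{I} = k_w \qquad \text{(Weber's law)}
    \psi(I) = c \, I^{\alpha} \qquad \text{(Stevens' power law)}

where I is the physical stimulus intensity, \Delta I the just-noticeable difference, \psi(I) the perceived magnitude,
and k_w, c, \alpha are empirically fitted constants.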
 

Actuation Technologies:

One major limitation of present haptic technology is the unavailability of rich, multidimensional, and scalable actuators. My interest is to explore novel actuation
advancements for producing a broad range of sensations on the skin. One such technology is TeslaTouch, which I co-invented with a team of researchers at Disney Research.
My investigations include smart materials (such as shape memory alloys, elastomers, ferrofluids, and piezoelectric materials), novel nano-scale actuators,
and ultrasonic actuators that enable new actuation and interaction schemes. These actuators require unique control techniques for high-resolution haptic displays.

Sensory Substitution:

One interesting and broadly useful research direction is to develop communication solutions for hearing- and visually-impaired individuals. Presenting speech and
acoustic features to deaf users, and images and printed materials to blind users, can substantially improve their quality of life and help them communicate easily in everyday
settings. These tools not only enable better academic training programs and interaction tools for sensory-impaired individuals but also produce unique and
efficient experiences for sighted and hearing users. My work explores the skin as a viable communication medium.

Rehabilitation, Prosthetics & Therapy:

Stroke and other motor disabilities can be treated effectively by carefully controlling the force feedback provided to patients. My previous research in human motor
control, motor adaptation, and skill acquisition provided recipes for therapeutic and rehabilitation methods. These studies highlighted important performance measures
for improved human motor control. I investigate performance metrics that quantify movement deficiencies in disabled patients and explore robotic control strategies for
manual training and skill acquisition.

Sensory Illusions:

Sensory illusions in touch produce unique sensations and are rarely explored. Carefully controlling the spatio-temporal parameters of illusions on the skin not only
generates novel user experiences but also overcomes challenges posed by present haptic technology. For example, in my prior research I combined such
illusions to develop simple and economical solutions for rendering high-resolution haptic sensations using a limited number of actuators placed on the skin. Similarly,
illusions can elicit sensations of depth and clicks on flat panels. My interest is to explore sensory illusions and develop scientific models that reliably and robustly
produce them.

Sensory Feedback & Augmentation:

Carefully controlled haptic feedback is essential for its acceptance by users and ultimately for its success in the consumer market. It must account for the
target application as well as accommodate human perception. I explore novel ways to provide haptic feedback in gesture-based interactions, handheld and
tracking devices, communication and computational platforms, and the design of tools, products, and furniture. Another way to develop engaging and absorbing
experiences is to augment interactive applications with haptic feedback. Adding a haptics layer to images, plots, and illustrations creates unique modes of human
interaction and data representation. My aim is to explore effective and efficient solutions for providing haptic feedback to users.

Authoring Tools & Software:

Authoring tools give artists and designers the freedom and accessibility to design novel and realistic haptic experiences for users. Moreover, users can utilize
these tools to share content with others, for example over the internet. My interest is to develop authoring tools and graphical editing environments, which
add another design dimension for richer content and improved user experiences.

Standardization & Ownership:

Haptics in the mainstream consumer market requires standardization and ownership of content. Due to the large variety of haptic actuation technologies
and contact sites, a mechanism must be established that incorporates not only the sensory characteristics of the skin but also the frequency-related variations
of the actuation technology. My interests are in perception-based models for standardizing haptic content and in haptic watermarking techniques.
An emerging direction is broadcasting haptic content for mainstream media.

Location-Based Applications:

Finally, I invest my efforts in investigating and utilizing haptic feedback in a variety of environments, such as homes, classrooms, restaurants,
hospitals, recreational facilities, theaters, social networks, transportation, workplaces, and parks, as well as for entertainment purposes.
 
 
Free Air Haptics (@ Disney Research, Pittsburgh 2010-13.)
 
 
 
Tactile Rendering of 3D Features on Flat Surfaces (@ Disney Research, Pittsburgh 2010-13.)
 
 
Surround Haptics Displays (@ Disney Research, Pittsburgh 2010-13.)
 
Today's entertainment technologies incorporate rich and highly immersive visual and audio effects for their
audiences. Augmenting these effects with high-resolution, multidimensional haptic feedback
significantly enhances the experience, leading to a deeper sense of immersion and believability. In this project,
I investigate and model tactile feedback, specifically sensory illusions, and recreate a wide range of sensations
directly on the skin using sophisticated algorithms and hardware. The haptics architecture is integrated with
entertainment and media content, such as movies, games, the internet, toys, and handheld gadgets, to provide
users with a variety of sensations and sensory information through the skin. A wide variety of embodiments and
applications are covered in this project. Visit http://www.surroundhaptics.com/ for more details.
 
 
 
 
 
TeslaTouch (@ Disney Research, Pittsburgh 2010-11.)
 
TeslaTouch is a new technology for enhancing touch interfaces with tactile feedback. The technology is based on
the electrovibration principle, uses no moving parts, and provides a wide variety of tactile feedback to fingers
moving across a touch surface. When combined with an interactive display and touch input, it enables the design of
a wide variety of interfaces that allow the user to feel virtual elements through touch. The physical principles, perception
and psychophysics of electrovibration, and control parameters are investigated. Applications are incorporated in tabletop
and handheld versions, and in a wide range of situations for both normal and impaired individuals.
Visit http://www.teslatouch.com/ for more details.
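
For background, a first-order approximation commonly used in the electrovibration literature (and not a description of
TeslaTouch's internal model) treats the finger and the insulated electrode as a parallel-plate capacitor, so the attractive
force, and hence the modulated friction, grows with the square of the applied voltage:

    F_e(t) \approx \frac{\varepsilon_0 \varepsilon_r A \, V(t)^2}{2 d^2}, \qquad
    F_{\text{friction}}(t) \approx \mu \big( F_n + F_e(t) \big)

where A is the contact area, d the insulator thickness, V the applied voltage, F_n the finger's normal force, and \mu the
friction coefficient. Because the force depends on V^2, a sinusoidal drive produces a friction modulation at twice the
drive frequency.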
 
 
 
Psychophysical modeling of sensory illusions (@ Disney Research, Pittsburgh 2010-11.)
 
Scientific models of sensory illusions are created using psychophysical techniques. The general idea is to create a
high-resolution tactile feedback display using a limited set of actuators. Some well-known illusions, such as saltation,
phantom sensations, and apparent haptic motion, are modeled by controlling parameters such as the location,
intensity, and timing of vibratory points. Evaluations showed that the illusions could be controlled robustly and reliably
in normal settings, and that a wide range of realistic sensations could be produced that resembled interactions with
real objects on the skin.
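
A minimal sketch of this kind of parameter control, using the energy-based amplitude interpolation commonly used to
place a phantom sensation between two physical actuators (function and variable names are illustrative, not the
project's actual code):

    import numpy as np

    def phantom_amplitudes(beta, a_v):
        """Energy-based interpolation for a phantom (funneling) sensation.

        beta : normalized virtual position in [0, 1] between physical
               actuator 1 (beta = 0) and actuator 2 (beta = 1).
        a_v  : desired perceived intensity of the virtual actuator.

        Returns drive amplitudes (a1, a2) whose summed vibration energy
        equals that of the virtual point: a1**2 + a2**2 == a_v**2.
        """
        a1 = np.sqrt(1.0 - beta) * a_v
        a2 = np.sqrt(beta) * a_v
        return a1, a2

    # Example: a virtual point one third of the way from actuator 1 to 2.
    print(phantom_amplitudes(1.0 / 3.0, a_v=1.0))

Sweeping beta over time is one way to approximate smooth apparent motion between the two actuator sites.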

 
Improving accessibility using electrovibrations (@ Disney Research, Pittsburgh 2010-11.)
 
My team and I have recently started investigating electrovibration to create situational awareness and improved interaction
for blind users. We are interested in using the TeslaTouch technology as a potential assistive device for the blind. Preliminary
studies and interviews have been exciting and promising, and have directed our attention towards the main challenges. We are focused on
improving perception-based software and the representation of pictures, photos, charts, shapes, etc. for blind users.
 
Stay tuned for more information in the near future.
 

 
 
Improving gaming experiences with haptic feedback (@ Disney Research, Pittsburgh 2010-11.)
 
I am leading a team to develop a tactile feedback architecture that enhances gaming and movie experiences. The system
is composed of a tactile surface pad mounted on a chair or embedded in a vest, controller hardware, and software for
integrating multimodal content. We developed two 3D games and augmented them with high-resolution haptics.
Preliminary subjective evaluations showed that these games were effective in enhancing user experiences, leading users
towards a deeper sense of immersion and believability.
 
Stay tuned for more information in the near future.
 
 
Tactual communication of speech (@ Purdue University 2005-07.)
 
The objective was to develop a tactual communication system that provides speech information to hearing-impaired individuals
through the skin. A three-channel tactual display, the TACTUATOR, was developed that presented multidimensional tactual
signals (both vibrational and motional cues) via the fingerpads of the middle finger, the index finger, and the thumb. A unique
2-DOF digital controller was proposed that compensated for the frequency response of the hardware actuator as well as that
of human tactual sensitivity, preserving each spectral component of the reference speech signal in terms of the sensitivity
of the skin. A speech-to-touch coding scheme was formulated that extracted rich features from the acoustic signal and
presented them as multidimensional tactual signals. Both vowels and consonants were successfully transmitted through the
skin, indicating that the skin can convey phoneme-level information using its rich sensory capabilities.
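
A minimal sketch of the compensation idea in a simplified single-channel form (hypothetical names; the actual 2-DOF
controller is not reproduced here):

    import numpy as np

    def compensate(ref_mag, actuator_mag, threshold_mag):
        """Per-frequency compensation of a reference spectrum.

        ref_mag       : linear magnitudes of the reference signal's components.
        actuator_mag  : measured frequency response magnitude of the actuator.
        threshold_mag : absolute detection threshold of the skin at the same
                        frequencies.

        Returns the drive magnitudes (hardware response inverted) and each
        component's sensation level in dB SL relative to the threshold curve.
        """
        drive_mag = ref_mag / actuator_mag
        sl_db = 20.0 * np.log10(ref_mag / threshold_mag)
        return drive_mag, sl_db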
 
 
Control of tactual and vibrotactile devices (@ Purdue University 2004.)
 
It has long been debated how to standardize vibrotaction for industrial and research use. Depending on the actuation technology, a
reference input signal is distorted by the frequency-dependent signature of that technology. During my graduate program, my adviser
Hong Tan and I proposed a novel standardization method that accounted not only for the response characteristics of
the actuating hardware but also for frequency-dependent human sensitivity. The main challenge was to preserve the spectral
components of the broadband reference signal that excited the hardware and then passed into the skin. We primarily focused on the
absolute detection threshold curve across stimulating frequencies on the skin. This function is the baseline of perception and can be
considered the zero datum line of human sensitivity. In a series of experiments we showed that subjective responses were
matched best when this function was taken into account. We also proposed an impedance-based function, obtained by measuring the mechanical
impedance of the skin, and suggested that such a function could be incorporated as the impedance of a human user interacting with a
mechanical device. In summary, we highlighted design methodologies for creating haptic content for both displacement- and
force-feedback haptic interfaces.
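
In standard psychophysical notation (a definition added here for clarity, not a formula quoted from the original work),
expressing a component's intensity relative to the detection threshold curve gives its sensation level:

    \mathrm{SL}(f) = 20 \log_{10} \frac{A(f)}{A_{\mathrm{th}}(f)} \;\; \text{[dB SL]}

where A(f) is the amplitude delivered to the skin at frequency f and A_th(f) is the absolute detection threshold at that
frequency; the threshold curve thus serves as the zero datum line mentioned above.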
 
 
Mechanical impedance of the hand in different holding postures (@ Purdue University 2006-07.)

Mechanical impedance is an important physical property of materials and structures, determining the relationship between
applied forces and the resulting deformations. In the perceptual domain, this measure relates the displacement of the skin at the
point of stimulation to the forces applied to the skin. It is an important measure that connects the two main categories of haptic
display schemes, namely force-feedback and displacement-feedback devices. A large body of psychophysical data relates perception
to the displacement of the skin (or its derivatives, such as velocity and acceleration), and there is a need to connect this data with
the more advanced force-feedback devices instead of re-establishing perceptual rules through more rigorous testing. In this study,
we measured psychophysical absolute detection thresholds as a function of frequency for two types of common tools: a stylus-shaped
tool and a ball-shaped tool. We measured both displacement and force values at the threshold levels, and determined the mechanical
impedance of each holding posture at threshold and at two distinct suprathreshold levels (10 and 20 dB SL, sensation levels). The
displacement thresholds for the two tools were not significantly different (p > 0.05) across the tested frequencies, but the force
thresholds and mechanical impedances of the two tools were different (p < 0.05). Both detection thresholds and mechanical impedance
varied with frequency, but the mechanical impedance was not affected by the intensity of stimulation. The coupling between the finger
skin and the hand exhibits an intricate mix of dynamic characteristics: spring-like behavior at low frequencies, viscous-like behavior
in the mid-frequency range, and purely inertial behavior at high frequencies (above about 200 Hz). The implications of these findings
are discussed in terms of device design, haptic rendering, and the physical-perceptual configuration of the sensory system of the hand.
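
For reference (standard definitions, added for clarity), mechanical impedance is the ratio of force to velocity in the frequency
domain, and the three idealized elements correspond to:

    Z(j\omega) = \frac{F(j\omega)}{V(j\omega)}, \qquad
    Z_{\text{spring}} = \frac{k}{j\omega}, \quad
    Z_{\text{damper}} = b, \quad
    Z_{\text{mass}} = j\omega m

so a spring-dominated response falls with frequency, a viscous response is flat, and an inertia-dominated response rises with
frequency, matching the low-, mid-, and high-frequency regimes described above.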
 
 
Frequency and amplitude discrimination thresholds (@ Purdue University 2005.)

We determined frequency and amplitude discrimination thresholds over a broad kinesthetic-cutaneous continuum using a high-
resolution, multidimensional haptic device, the TACTUATOR. These thresholds are necessary for determining
the resolution of human perception across the vibrotactile spectrum. We measured the thresholds both in isolated conditions, in which
only the test signal was presented, and in masked conditions, in which an interrupting (masker) signal was also presented. In the isolated
condition, frequency Weber fractions ranged from 0.13 to 0.38, and amplitude discrimination thresholds ranged from 1.65 to 2.98 dB. In the
masking conditions, average frequency Weber fractions rose to 0.53 and average amplitude thresholds rose to 3.68 dB. In general,
thresholds were largest when the energy of the masker signal was largest. Although the frequency and amplitude thresholds generally
increased in the presence of masking stimuli, there was an indication of channel independence for low- and high-frequency target stimuli.
These data were critical for determining the ability of the hand to decode broadband spectral information, such as that derived from
a speech signal. The discrimination thresholds of the hand, and consequently of the skin, were inferior to those of the ear; thus
direct transmission of speech as vibration would not be effective. Based on these results and prior investigations, we developed a
multidimensional speech-to-touch coding algorithm.
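
For clarity, the reported quantities follow the standard definitions (the 100 Hz base frequency in the example is hypothetical,
not taken from the study):

    \mathrm{WF} = \frac{\Delta f}{f}, \qquad
    \Delta L = 20 \log_{10} \frac{A + \Delta A}{A} \;\; \text{[dB]}

so a frequency Weber fraction of 0.13 at a 100 Hz base frequency, for example, corresponds to a just-noticeable difference of
about 13 Hz.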
 
 
 
 
 
Expertise-based performance measures in rhythmic tasks (@ Rice University 2008-09.)
 
During my post-doctoral work, my colleagues and I introduced and validated quantitative performance measures for a rhythmic target-hitting
task. These measures were derived from a detailed analysis of human performance during a month-long training experiment
in which participants learned to operate a 2-DOF haptic interface in a virtual environment while executing a manual control task. The
motivation for this analysis was to determine measures of participant performance that captured the key skills of
the task. The analysis indicated that two quantitative measures, (i) trajectory error (spatial) and (ii) input frequency
(temporal), captured these key skills; both showed strong correlations with the task objective of maximizing target hits.
The performance trends were further explored by grouping participants by expertise and examining their training trends in
terms of these measures. In future work, these measures will be used as inputs to a haptic guidance scheme that adjusts its control
gains based on a real-time assessment of human performance. Such guidance schemes will be incorporated into virtual training
environments for developing manual skills in domains such as surgery, physical therapy, and sports.
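
A minimal sketch of how two such measures could be computed (hypothetical definitions for illustration; the exact formulations
used in the study are not reproduced here):

    import numpy as np

    def trajectory_error(actual, reference):
        """RMS deviation of the executed trajectory from a reference path.

        actual, reference : arrays of shape (samples, 2) for a planar task.
        """
        return np.sqrt(np.mean(np.sum((actual - reference) ** 2, axis=1)))

    def input_frequency(command, dt):
        """Dominant frequency (Hz) of the operator's input, via the FFT peak."""
        spectrum = np.abs(np.fft.rfft(command - np.mean(command)))
        freqs = np.fft.rfftfreq(len(command), d=dt)
        return freqs[np.argmax(spectrum)]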
 
 
Natural frequency Weber fractions in resonant manual tasks (@ Rice University 2007-08.)
 
It has been shown that humans use combined feedforward and feedback control strategies when manipulating external dynamic
systems and when exciting virtual dynamic systems at resonance, and that they can tune their control parameters in response to
changing natural frequencies. We determined the discrimination thresholds (just noticeable differences, JNDs) for the natural
frequency of such resonant dynamic systems. Weber fractions (WF, %) were reported for the discrimination of 1, 2, 4, and 8 Hz
natural frequencies for virtual systems that were either passively or actively excited along a single axis of motion. The average WF
for natural frequency ranged from 4% to 8.5% for the 1, 2, and 4 Hz systems, and was about 20% for the 8 Hz system. Results indicated
that sensory feedback modality had a significant effect on WF during passive perception, but no significant effect in the active
perception case. The data also suggest that discrimination sensitivity is not significantly affected by excitation mode. Finally,
results for systems with equivalent natural frequencies but different spring stiffnesses indicate that participants do not
discriminate natural frequency based on the maximum perceived force magnitude.
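
For context (a standard relation, added for clarity), the natural frequency of a second-order mass-spring system and the
corresponding Weber fraction are

    \omega_n = \sqrt{\frac{k}{m}}, \qquad \mathrm{WF} = \frac{\Delta \omega_n}{\omega_n}

so two systems can share the same natural frequency while differing in stiffness k, provided the mass m is scaled in proportion;
this is the comparison behind the final result above.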
 
 
 
 
 
Effects of dynamic parameters in manual control tasks (@ Rice University 2008-09.)
 
Recent findings have shown that humans adapt their internal control model to account for the changing dynamics of the systems they
manipulate. We explored the effects of magnitude and phase cues on human motor adaptation in a manual rhythmic task. Participants
excited virtual second-order systems at resonance via a two-degree-of-freedom haptic interface, under visual and visual+haptic feedback
conditions. We then changed the virtual system parameters and observed the resulting motor adaptation in catch trials. Results showed
that, first, humans adapt to a nominal virtual system's resonant frequency fairly quickly. Second, participants shifted to higher and lower
natural frequencies during catch trials regardless of feedback modality and force cues. Third, participants detected changes in natural
frequency when gain, magnitude, and phase cues were manipulated independently. The persistent ability of participants to perform
system identification of the dynamic systems they control, regardless of the cue conveyed, demonstrates human versatility in manual
control situations. We intend to further investigate human motor adaptation and the time course of adaptation
in order to improve the efficacy of shared control methodologies for training and rehabilitation in haptic virtual environments.
 
 
 
 
Tactile respiratory management system (@ Rice University 2008.)
 
Gated radiation therapy is used to treat cancer patients with tumors in the upper torso, where the location of the tumor varies
with breathing. In this technique, the radiation beam targeting the tumor is turned on only during specific portions of the breathing
cycle, when the tumor is in line with the beam orientation. In lung cancer patients in particular, the magnitude of lung motion is
highly non-uniform during free breathing. Such irregularity in respiratory motion results in elongated treatment times, increased cost,
and additional inconvenience to the patients. To reduce the cost and duration of the therapy, regularity in both the frequency and
amplitude of breathing motion must be maintained during radiotherapy. Visual and auditory feedback had been used to reduce
inconsistency in breathing patterns, but these modalities proved ineffective due to the increased cognitive stress that patients
experienced during therapy. I led a team of students that proposed and implemented a biofeedback mechanism to assist cancer
patients in controlling their breathing patterns during therapy.
 
 
Psychophysical analysis of vibrations on the forearm (@ Rice University 2009.)
 
I led a team to develop psychophysical models of the forearm, including the information transmission capacity of the skin of the
dorsal forearm under vibrotactile stimulation. Participants were able to correctly identify 3-4 vibrators mounted along the length
of the forearm, corresponding to about 2 bits of information transfer (IT). The relation between the physical spacing of the vibrators
and the corresponding mental mapping was determined. The analysis showed that physical spacing maps linearly to perceived
space along the length of the forearm, with the skin close to the elbow and wrist slightly compressing the mapping. The results of the
study were subsequently used to map acoustic features derived from nonsense consonant-vowel-consonant syllables.
Participants with normal hearing were able to discriminate phonemes at the beginning of the segment quite reliably with very little training.
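
A minimal sketch of the information-transfer estimate commonly used in such absolute identification studies (the study's exact
analysis pipeline is not reproduced here):

    import numpy as np

    def information_transfer(confusion):
        """Maximum-likelihood estimate of information transfer (bits) from a
        stimulus-response confusion matrix of raw counts."""
        n = confusion.sum()
        p_joint = confusion / n
        p_stim = p_joint.sum(axis=1, keepdims=True)
        p_resp = p_joint.sum(axis=0, keepdims=True)
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = p_joint * np.log2(p_joint / (p_stim * p_resp))
        return np.nansum(terms)

    # Perfect identification of 4 equally likely sites gives log2(4) = 2 bits,
    # consistent with the figure quoted above.
    print(information_transfer(np.eye(4) * 25.0))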
 
 
 
 
 
Psychophysical model of handlebar interfaces (@ Purdue University 2006.)
 
I led a team to develop perception-based quantitative models relating broadband vibrations transmitted through the handlebar
interface to subjective ratings. Our goal was to determine design changes to the handlebar and its mounting mechanisms for a
comfortable motorcycle riding experience while holding the steering handlebar. In a set of well-designed studies, we predicted
subjective ratings of broadband vibrations and suggested design changes based on these models. Our studies showed that the
perception-based models predicted the subjective response better than the standardized ISO-defined weighting functions.
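
For comparison, the ISO-style baseline referred to above is typically a frequency-weighted r.m.s. acceleration of the general form
(stated here from my understanding of the standards, not quoted from the study)

    a_w = \sqrt{\sum_i \big( W_i \, a_i \big)^2}

where a_i is the r.m.s. acceleration in the i-th frequency band and W_i the standardized weighting factor for that band; the
perception-based models above were evaluated against this kind of fixed weighting.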
 
 
 
Design of a variable speed mechanism (@ Purdue University  2006.)
 
As a side project during my teaching assistantship, I investigated, with Professor Gordon Pennock, the kinematics of an adjustable
six-bar linkage in which the rotation of the input crank is converted into oscillation of the output link. This single-degree-of-freedom
planar linkage was used as a variable-speed transmission mechanism in which the input crank rotated at a constant speed and the output
link carried an overrunning clutch mounted on the output shaft. The analysis used a novel technique in which kinematic coefficients were
first obtained with respect to an independent variable; kinematic inversion was then used to express the kinematic coefficients with
respect to the input variable of the linkage. This technique decoupled the position equations and provided additional insight into the
geometry of the adjustable linkage. The angle through which the output link oscillates for each revolution of the input crank could be
adjusted by a control arm, which allowed a fixed pivot to be temporarily released and moved along a circular arc about a permanent
ground pivot. The technique showed how the kinematic analysis results could be used, in a straightforward manner, to redesign the
control arm and related mechanisms.
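
For reference (standard definitions of kinematic coefficients in planar linkage analysis, added for clarity), the kinematic
coefficient of link j with respect to the input angle \theta_2 is

    \theta_j' = \frac{d\theta_j}{d\theta_2}, \qquad
    \omega_j = \theta_j' \, \dot{\theta}_2, \qquad
    \alpha_j = \theta_j'' \, \dot{\theta}_2^{\,2} + \theta_j' \, \ddot{\theta}_2

so the coefficients capture the geometry of the linkage independently of the input speed, which is what allows the position
equations to be decoupled as described above.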