- Authors : Prajwal Kuchchangi, Aniket Raj
- Paper ID : IJERTCONV1IS04039
- Volume & Issue : NCRTICE – 2013 (Volume 1 – Issue 04)
- Published (First Online): 30-07-2018
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
HAPTICS: THE VIRTUAL REALITY
PRAJWAL KUCHCHANGI, BMSCE, prajwal.kuchchangi@gmail.com
ANIKET RAJ, BMSCE, aniketp2@gmail.com
ABSTRACT
Haptic interfaces are designed to allow humans to touch virtual objects as if they were real. Unfortunately, virtual surface models currently require extensive hand tuning and do not feel authentic, which limits the usefulness and applicability of such systems. The proposed approach of haptography seeks to address this deficiency by basing models on haptic data recorded from real interactions between a human and a target object. The studio haptographer uses a fully instrumented stylus to tap, press and stroke an item in a controlled environment while a computer system records positions, orientations, velocities, accelerations and forces. The point-and-touch haptographer carries a simple instrumented stylus around during daily life, using it to capture interesting haptic properties of items in the real world. Recorded data is distilled into a haptograph, the haptic impression of an object or surface patch, including properties such as local shape, stiffness, friction and texture. Finally, the feel of the probed object is recreated via a haptic interface by accounting for the device's natural dynamics and focusing on the feedback of high-frequency accelerations.
INTRODUCTION
When you contact things in your surroundings through a tool, you can feel a rich array of haptic cues that reveal the shape, stiffness, friction, texture and other characteristics of the object you are touching. For example, the vibrations and forces experienced by your hand as you stroke a piece of paper or travel over its edge with a pen are distinctly different from those generated by touching a curved rubber surface or tapping on a wooden table. Humans are amazingly adept at eliciting and interpreting haptic feedback during interactions with physical objects, naturally leveraging this wealth of information to guide both exploratory and dexterous manipulation.
Haptic interfaces seek to extend the normal reach of the human hand to enable interaction with virtual objects. Often taking the form of a lightweight, backdrivable robotic arm (figure 1), haptic interfaces measure the motion of the human hand and map it into the virtual environment. The control computer uses specialized algorithms to determine when and how the user is touching objects in the virtual environment; when contact occurs, the system employs the device's actuators (often DC motors) to provide the user with force and vibration feedback.
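To make this loop concrete, the following sketch shows a minimal penetration-based (penalty) rendering cycle in Python: read the tool position, check for contact with a virtual wall, and command a proportional restoring force. The device class and its methods are hypothetical stand-ins for a real haptic interface driver, and the numbers are illustrative only.

```python
import time

# Hypothetical stand-in for a real haptic device driver; a real stylus-type
# interface would expose similar position-read and force-command calls.
class HapticDevice:
    def __init__(self):
        self.z = 0.02          # stylus height above the virtual surface, in metres

    def read_position(self):   # would poll encoders on real hardware
        self.z -= 0.0005       # pretend the user pushes slowly downward
        return self.z

    def command_force(self, fz):
        print(f"z = {self.z * 1000:6.2f} mm  ->  commanded force = {fz:5.2f} N")

def render_virtual_wall(device, k=500.0, steps=60, dt=0.001):
    """Penetration-based (penalty) rendering of a wall at z = 0.

    Each servo cycle: measure the stylus position, check whether it has
    penetrated the virtual surface, and if so push back with a spring
    force proportional to the penetration depth.
    """
    for _ in range(steps):
        z = device.read_position()
        penetration = max(0.0, -z)    # how far the tool is inside the wall
        force = k * penetration       # proportional (spring) force law
        device.command_force(force)
        time.sleep(dt)                # roughly a 1 kHz servo loop on real systems

if __name__ == "__main__":
    render_virtual_wall(HapticDevice())
```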
When accompanied by a graphical rendering of the virtual world, such haptic systems are well suited to training individuals to perform skilled tasks such as surgery and dentistry, presenting spatial data such as land topography or part geometry, creating and manipulating three-dimensional shapes for sculpture or design, and enhancing educational and entertainment media. Throughout this rich variety of endeavors, haptic feedback provides a powerful communication channel for the expanding field of human-computer interaction.
HAPTIC REALISM
In order to effectively meet the needs of these diverse applications and to support new ones, haptic interfaces must be able to compellingly imitate the feel of a vast variety of physical interactions. Currently, the geometry of objects in a virtual environment may be based on real-world data, but haptic properties like stiffness, friction and texture are almost always programmed by hand via simple parametric relationships that are only loosely based on the behavior of physical objects. The time-consuming, subjective nature of this tuning process does not extend well to the creation of haptic environments that contain a broad assortment of simulated objects, such as virtual antique shops or a haptically augmented material database.
Furthermore, virtual objects rendered via standard haptic contact algorithms generally struggle to provide convincing feedback to the user. This claim is difficult to substantiate because few researchers are willing to compare the realism of their virtual environments with that of the real objects being simulated. In one of the few studies that does address this issue, subjects blindly tapped on a variety of real and virtual wood samples and rated their realism on a scale from 1 to 7; a piece of soft foam and the virtual surface rendered via standard penetration-based feedback both received about a 2 out of 7, indicating that neither is an adequate stand-in for real wood (6). In general, hard contacts, textured surfaces and vibratory or stick-slip interactions are particularly difficult to capture and reproduce, as the common proportional force feedback algorithm is restricted to low stiffness and smoothly changing forces (1). It is only by incorporating measurements of real-world interactions that virtual environment designers have begun to approach the realism of natural physical interactions, e.g., (10, 3, 6), where existing measurement-based modeling techniques for haptic rendering are summarized.
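The low-stiffness limitation mentioned above can be made concrete with a commonly cited sampled-data passivity rule of thumb (in the spirit of Colgate's virtual-wall analysis): the stiffness a proportional wall can render without generating energy is limited by the device's inherent damping b and the servo period T, roughly K <= 2b/T. The sketch below evaluates this bound for hypothetical device parameters; real systems gain extra margin from the user's grasp and from device friction, while quantization and Coulomb friction (see the Diolaiti et al. reference) complicate the picture further.

```python
def max_passive_stiffness(damping_b, servo_period_T):
    """Approximate passivity bound for a sampled proportional virtual wall:
    K <= 2 * b / T, where b is the device's inherent viscous damping (N*s/m)
    and T is the haptic servo period (s). Illustrative rule of thumb only."""
    return 2.0 * damping_b / servo_period_T

# Hypothetical device parameters, chosen only for illustration.
b = 0.01    # N*s/m of natural damping in a light, backdrivable linkage
T = 0.001   # 1 kHz haptic servo rate
print(f"Max passively renderable stiffness ~ {max_passive_stiffness(b, T):.0f} N/m")
# Only about 20 N/m here, orders of magnitude below the stiffness of real
# wood, which is one reason penetration-based walls tend to feel soft.
```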
HAPTOGRAPHY
The proposed approach of haptography aims to improve the authenticity of virtual touch by basing haptic models on multi-sensory recordings of the real experiences being emulated. Despite their ubiquitous importance in human life, we currently lack a formal method for describing and analyzing haptic experiences with everyday objects. In contrast, consider the human mastery of visual stimuli: to create a lifelike image of an observed scene, we no longer need to start with a blank canvas, painting each element by hand. Instead we use a sophisticated measuring system (a camera lens) to capture the desired light pattern and project it onto a sensitive medium (either film or a digital image sensor). The latent image is then converted to a storable, portable record, a negative or an image file, using chemical or signal processing. Finally we create realistic copies of the original stimulus (paper-based prints, screen-based images) for others to see. Haptography research seeks to understand and control haptic interactions to this same level of excellence by combining three distinct yet interwoven threads of enquiry: What sensations do humans experience when touching real objects? What dynamic relationships optimally characterize the feel of real touch-based interactions? And how can a haptic interface best recreate these interactions for a human user?
HAPTIC SENSATION
Haptography begins by creating a sensor system that can detect the perceptually relevant attributes of haptic interactions. Unlike vision, the human sense of touch is distributed throughout the body; it interacts heavily with the motor system and merges the output of at least seven different biological sensory mechanisms. Most previous efforts to record haptic interactions have relied entirely on position and force, but these two variables cannot fully characterize dynamic haptic interactions, especially considering the human hand's sensitivity to high-frequency vibrations. Furthermore, these prior modeling efforts probed real objects with a specialized mechatronic system or an industrial robot, which has dynamic properties (mass, stiffness, damping) that are different from those of the human hand. Accurately capturing the acceleration transients that result from tapping on a hard object or stroking a textured surface requires the probe to be held the same way a human holds it.
Haptography entails thoroughly sensorizing real tools and environments to capture haptic interactions in their entirety, mimicking a human's afferent nervous system. Haptography sensor suites measure mechanical quantities such as pressure, position, orientation, velocity, acceleration, strain, contact force and contact torque, as well as user characteristics such as grip force, hand position and muscle activation. This stream of local signals can be augmented with audio and video recordings of the interaction to facilitate further analysis. Instrumenting a stylus, a handheld surgical needle driver or a vehicle's steering wheel in this manner will improve one's understanding of haptic interaction and allow for sensorized comparison between simulation and reality.
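As a concrete picture of what such a sensor suite might log, the sketch below defines one possible per-sample record containing the mechanical quantities listed above plus grip force, and fills it with synthetic data at a 1 kHz rate. The field names and data layout are assumptions for illustration, not an established haptography file format.

```python
from dataclasses import dataclass
from typing import List, Tuple
import math

Vec3 = Tuple[float, float, float]

@dataclass
class HapticSample:
    """One time step of a haptography recording. Field names are illustrative."""
    t: float            # time stamp (s)
    position: Vec3      # stylus tip position (m)
    orientation: Vec3   # tool orientation, e.g. roll/pitch/yaw (rad)
    velocity: Vec3      # tip velocity (m/s)
    acceleration: Vec3  # high-bandwidth tip acceleration (m/s^2)
    force: Vec3         # measured contact force (N)
    torque: Vec3        # measured contact torque (N*m)
    grip_force: float   # user's grip force on the tool handle (N)

def record_stroke(duration_s=0.01, rate_hz=1000.0) -> List[HapticSample]:
    """Collect synthetic samples at a fixed rate; on real hardware each
    field would come from the corresponding sensor in the suite."""
    samples = []
    dt = 1.0 / rate_hz
    for i in range(int(duration_s * rate_hz)):
        t = i * dt
        buzz = 0.5 * math.sin(2 * math.pi * 200.0 * t)   # fake texture vibration
        samples.append(HapticSample(
            t=t,
            position=(0.001 * t, 0.0, 0.0),
            orientation=(0.0, 0.0, 0.0),
            velocity=(0.001, 0.0, 0.0),
            acceleration=(buzz, 0.0, 0.0),
            force=(0.2, 0.0, 1.0),
            torque=(0.0, 0.0, 0.0),
            grip_force=2.5))
    return samples

log = record_stroke()
print(f"recorded {len(log)} samples; first acceleration = {log[0].acceleration}")
```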
HAPTIC DISTILLATION
The second main thrust of haptography is to distill selected sensor readings into a meaningful representation of the haptic interaction that created them. While the human eye and a camera can acquire information about all parts of a scene simultaneously, the human hand can interact with only a small subset of its environment at each point in time. Thus, time-varying sensation data must be mapped to the appropriate environmental location and state during processing to determine the underlying principles governing the interaction. This step highlights an additional benefit of recording haptic data from an interaction driven by a human rather than a robot: the human haptographer will naturally explore the areas of the object that are most haptically interesting, providing rich data streams for analysis.
Consider the sample interaction of probing a static three-dimensional environment such as a book, mouse pad and coffee cup sitting on a desktop. Haptic distillation requires the development of algorithms that can automatically reconstruct its haptic impression (including geometry, stiffness, transient response, texture and friction) from several minutes of data from the haptography sensor suite. This goal can be framed as a problem in hybrid system identification, where we seek to identify the transitions between three dynamic modes (free-space motion, pressing into a surface, and sliding along a surface) as well as the dynamics that govern each distinct contact mode. Analyzing real haptic interactions in this manner will provide a succinct representation of haptic experience and a quantitative metric for evaluating simulated environment renderings.
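A toy version of this distillation step is sketched below: recorded samples are labeled as free-space, pressing, or sliding using simple force and speed thresholds, and a contact stiffness is then fit by least squares over the pressing samples. The thresholds and synthetic data are illustrative; the text frames the full problem as hybrid system identification rather than fixed-threshold segmentation.

```python
def segment_modes(force_n, tangential_speed, f_contact=0.05, v_slide=0.01):
    """Label each sample of a recording as 'free', 'press', or 'slide'.
    Thresholds are illustrative placeholders."""
    modes = []
    for f, v in zip(force_n, tangential_speed):
        if f < f_contact:
            modes.append("free")
        elif v < v_slide:
            modes.append("press")
        else:
            modes.append("slide")
    return modes

def fit_stiffness(force_n, penetration, modes):
    """Least-squares fit of F ~= k * d over the 'press' samples."""
    num = sum(f * d for f, d, m in zip(force_n, penetration, modes) if m == "press")
    den = sum(d * d for d, m in zip(penetration, modes) if m == "press")
    return num / den if den > 0 else 0.0

# Tiny synthetic recording: approach, press into the surface, then slide.
force       = [0.0, 0.0, 0.4, 0.9, 1.3, 1.2, 1.1]          # normal force (N)
penetration = [0.0, 0.0, 0.4e-3, 0.9e-3, 1.3e-3, 1.2e-3, 1.1e-3]  # depth (m)
speed       = [0.02, 0.01, 0.0, 0.0, 0.0, 0.03, 0.04]      # tangential speed (m/s)

modes = segment_modes(force, speed)
print(modes)
print(f"estimated stiffness ~ {fit_stiffness(force, penetration, modes):.0f} N/m")
```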
HAPTIC RENDERING
The final component of haptography is to reproduce stored interactions for a human user via a haptic interface. As with photography, this final step should accurately convey the stored stimuli without distortion or degradation. The main challenge in accurately controlling a touch-based interaction is that the important mechanical relationships are strongly affected by the dynamics of both the haptic device and the user's hand. Most existing systems neglect these influences, presuming that a haptic device can render any dynamic relationship for which it is programmed.
The goal of haptic recreation can be approached from two complementary perspectives: algorithm development and electromechanical device design. The dynamic properties of a haptic interface can be obtained via specialized system identification techniques and then accounted for in subsequent rendering of virtual objects. For example, a haptic interface's ability to create high-frequency fingertip accelerations can be quantified by applying swept sine-wave current commands to the motor while the stylus is held by a user; conditioning desired acceleration transients with the inverse of the obtained model enables the creation of accurate virtual acceleration transients. Though undesirable mechanical behaviors can be adjusted somewhat through software, the best strategy for improving haptic rendering is often to improve the device itself.
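The following sketch illustrates the inverse-model idea for acceleration transients. It assumes a magnitude response from motor current to stylus acceleration has already been identified from swept-sine data (the identified_gain function and all numbers are invented for illustration), and scales a decaying-sinusoid transient, a waveform commonly used for event-based tapping feedback, by the inverse of that gain.

```python
import math

def identified_gain(freq_hz):
    """Hypothetical magnitude response from motor current to stylus
    acceleration, as might be fit from swept-sine identification data.
    Shape and numbers are invented for illustration, in (m/s^2) per A."""
    f_res = 120.0   # pretend structural resonance of the device (Hz)
    return 8.0 / (1.0 + ((freq_hz - f_res) / 80.0) ** 2)

def compensated_command(target_accel_peak, freq_hz, decay, duration, dt=0.0005):
    """Pre-scale a decaying-sinusoid acceleration transient by the inverse
    of the identified gain so the stylus reproduces the desired peak
    acceleration despite the device's own dynamics."""
    amp = target_accel_peak / identified_gain(freq_hz)   # required current amplitude
    n = int(duration / dt)
    return [amp * math.exp(-decay * i * dt) * math.sin(2 * math.pi * freq_hz * i * dt)
            for i in range(n)]

current = compensated_command(target_accel_peak=30.0, freq_hz=250.0,
                              decay=60.0, duration=0.04)
print(f"peak motor current command ~ {max(current):.3f} A over {len(current)} samples")
```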
In summary, the proposed technique of haptography has multiple applications, including medicine, dentistry, automotive engineering, hazardous material handling, education and art; its ultimate goal is to generate virtual objects that are indistinguishable from their real counterparts, enabling authentic simulation and efficient training for an abundance of tasks involving hand-object interaction.
APPLICATIONS
Partial / Temporary Paralysis Treatment through Physiotherapy
The haptic arm is programmed using LabVIEW for a particular movement, and haptic feedback enables the user to practice the movements; if the user makes angular mistakes, the arm automatically corrects them. The vibrating arm band, called Ghost, detects a user's arm motion and buzzes to give feedback. It works by pre-loading the wristband's software with the right pattern of muscle movements, or even an exact swing. The pattern includes certain waypoints, which indicate where your swing or stroke is going. The device then guides the user through the motions. Sensors detect the joints flexing and twisting, and vibrations and sounds indicate whether the user is doing it right or wrong. With each try, the user's muscle memory gets better, but there is no coach demonstrating how to do it and no hours of video to sit through. It could help vision-impaired as well as fully sighted athletes.
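A minimal sketch of the waypoint-checking logic described above is given below. The joint-angle values, tolerance, and function names are hypothetical stand-ins for whatever the wristband's firmware and the LabVIEW program actually use.

```python
def check_motion(measured_angles, waypoint_angles, tolerance_deg=5.0):
    """Compare the user's joint angle at each waypoint against the
    pre-loaded reference pattern and decide when to buzz.
    Threshold and naming are illustrative, not the Ghost firmware."""
    feedback = []
    for i, (measured, target) in enumerate(zip(measured_angles, waypoint_angles)):
        error = abs(measured - target)
        feedback.append((i, error, "buzz" if error > tolerance_deg else "ok"))
    return feedback

# Reference swing stored in the wristband vs. one practice attempt (degrees).
reference = [10.0, 35.0, 70.0, 95.0, 120.0]
attempt   = [12.0, 33.0, 78.0, 96.0, 131.0]

for idx, err, signal in check_motion(attempt, reference):
    print(f"waypoint {idx}: error {err:4.1f} deg -> {signal}")
```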
Video games
Haptic feedback is commonly used in arcade games, especially racing video games. In 1976, Sega's motorbike game Moto-Cross, also known as Fonz, was the first game to use haptic feedback, causing the handlebars to vibrate during a collision with another vehicle. Tatsumi's TX-1 introduced force feedback to car driving games in 1983.
Simple haptic devices are common in the form of game controllers, joysticks and steering wheels. Early implementations were provided through optional components, such as the Nintendo 64 controller's Rumble Pak. Many newer-generation console controllers and joysticks feature built-in feedback devices, including Sony's DualShock technology. Some automobile steering wheel controllers, for example, are programmed to provide a "feel" of the road: as the user makes a turn or accelerates, the steering wheel responds by resisting turns or slipping out of control.
Mobile devices
Tactile haptic feedback is becoming common in cellular devices. Handset manufacturers like Nokia, LG and Motorola include different types of haptic technologies in their devices; in most cases, this takes the form of vibration in response to touch. Alpine Electronics uses a haptic feedback technology named PulseTouch on many of their touch-screen car navigation and stereo units. The Nexus One features haptic feedback, according to its specifications.
In February 2013, Apple was awarded a patent for a more accurate haptic feedback system suitable for multi-touch surfaces. Apple's U.S. Patent No. 8,378,797 for a "Method and apparatus for localization of haptic feedback" describes a system where at least two actuators are positioned beneath a multi-touch input device to provide vibratory feedback when a user makes contact with the unit. More specifically, the patent provides for one actuator to induce a feedback vibration, while at least one other actuator creates a second vibration that suppresses the first from propagating to unwanted regions of the device, thereby "localizing" the haptic experience. While the patent gives the example of a "virtual keyboard," the language specifically notes that the invention can be applied to any multi-touch interface.
Virtual reality
Haptics are gaining widespread acceptance as a key part of virtual reality systems, adding the sense of touch to previously visual-only solutions. Most of these solutions use stylus-based haptic rendering, where the user interfaces to the virtual world via a tool or stylus, giving a form of interaction that is computationally realistic on today's hardware. Systems are being developed to use haptic interfaces for 3D modeling and design, intended to give artists a virtual experience of real interactive modeling. Researchers from the University of Tokyo have developed 3D holograms that can be "touched" through haptic feedback, using "acoustic radiation" to create a pressure sensation on a user's hands.
Medicine
Haptic interfaces for medical simulation may prove especially useful for training in minimally invasive procedures such as laparoscopy and interventional radiology, as well as for performing remote surgery. A particular advantage of this type of work is that surgeons can perform more operations of a similar type with less fatigue. It is well documented that a surgeon who performs more procedures of a given kind will have statistically better outcomes for his or her patients. Haptic interfaces are also used in rehabilitation robotics.
In ophthalmology, haptic refers to supporting springs, two of which hold an artificial lens within the lens capsule after the surgical removal of cataracts.
A Virtual Haptic Back (VHB) was successfully integrated into the curriculum at the Ohio University College of Osteopathic Medicine. Research indicates that the VHB is a significant teaching aid in palpatory diagnosis (detection of medical problems via touch). The VHB simulates the contour and stiffness of human backs, which are palpated with two haptic interfaces (SensAble Technologies PHANTOM 3.0). Haptics have also been applied in the field of prosthetics and orthotics, where research has been underway to provide essential feedback from a prosthetic limb to its wearer.
REFERENCES
- N. Diolaiti, G. Niemeyer, F. Barbagli, J. K. Salisbury and C. Melchiorri. The effect of quantization and Coulomb friction on the stability of haptic rendering.
- J. P. Fiene, K. J. Kuchenbecker and G. Niemeyer. Event-based haptic tapping with grip force compensation. In Proc. IEEE Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages 117-123, March 2006.
- S. Greenish, V. Hayward, V. Chial, A. M. Okamura and T. Steffen. Measurement, analysis and display of haptic signals during surgical cutting. Presence, 11(6):626-651, 2002.
- V. Hayward and O. R. Astley. Performance measures for haptic interfaces. In G. Giralt and G. Hirzinger, editors, Robotics Research: The 7th International Symposium, pages 195-207. Springer Verlag, 1996.
- K. J. Kuchenbecker. Characterizing and Controlling the High-Frequency Dynamics of Haptic Devices. PhD thesis, Stanford University, Department of Mechanical Engineering.
- K. J. Kuchenbecker, J. P. Fiene and G. Niemeyer. Improving contact realism through event-based haptic feedback. IEEE Transactions on Visualization and Computer Graphics.
- K. MacLean. The Haptic Camera: A technique for characterizing and playing back haptic properties of real environments. In Proc. ASME International Mechanical Engineering Congress and Exposition.
- A. M. Okamura, K. J. Kuchenbecker and M. Mahvash. Haptic Rendering: Algorithms and Applications, chapter 22: Measurement-Based Modeling for Haptic Display.