Virtual Reality Glove
Baird Eutsler, Jared Gaertner, Evan Pollino, Nina Robinson, Mequanint Moges
Electrical Engineering Technology Department
University of Houston
firstname.lastname@example.org, email@example.com, firstname.lastname@example.org, email@example.com,
Sai Tadimeti, Kavya Yerrabandi
Electrical and Computer Systems Engineering Department
University of Houston
Abstract

The objective of the virtual reality glove is to empower consumers with an innovative, high-quality virtual reality experience and to advance the way they interact with computers. The idea was implemented by creating a glove fitted with motion sensors that can be used as a peripheral with virtual reality consoles to produce a replica of a hand within the virtual world. We produced a prototype for a product that is both well made and affordable to the everyday consumer. Today's market contains two classifications of virtual reality hardware and software: on the lower end are cheap products that are made poorly and come across as toys; on the higher end are industrial products that are very expensive and specialized. The goal of the project was to create a product that meets these two opposites in the middle. We accomplished this by offloading a majority of the work to software, reducing the amount of hardware needed.
Introduction

New trends in consumer electronics are allowing startups to develop innovative technologies for immersive virtual reality experiences. Cheap microcontrollers are now used in head-mounted displays, which were classically implemented using expensive discrete hardware and difficult-to-manufacture lenses. There are many YouTube videos showing users' reactions to virtual roller coasters or scary video games. In some of these videos, users are so immersed that they point at objects that bystanders not wearing the headset cannot see. Those same users, however, have reported feeling sick after using the Oculus Rift [1] head-mounted display for ten to thirty minutes. Oculus Virtual Reality has committed to eliminating this motion sickness by launch, but one could argue that the root cause of the sickness is a lack of total immersion; some players have reported feeling disembodied when they look down and cannot see their bodies or their hands. A virtual reality glove addresses the latter problem and enhances the player's immersion. The goal of the project was to create a glove peripheral for use with any virtual reality head-mounted display. The glove must provide one-to-one control of a "hand" in the virtual reality environment.
Another objective was to address a gap in the current Virtual Reality (VR) market: devices are either extremely expensive (tens of thousands of dollars) and advanced, or cheap and imprecise. The goal was to prototype a glove that was both economical and accurate. With affordability in mind, the decision was made to utilize inexpensive hardware by offloading complexity into software. For instance, all signal linearization, normalization, and calculations were done in code. This approach ensured that the cost of the final product was minimal.

Proceedings of the 2015 ASEE Gulf-Southwest Annual Conference, Organized by The University of Texas at San Antonio. Copyright © 2015, American Society for Engineering Education.
The primary requirement for the prototype was to exhibit one-to-one motion with low latency, on the order of 40 milliseconds. This figure was derived from the standard cinematic frame rate: 24 frames per second translates to 41.66 milliseconds between frames [2].
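The latency budget above follows directly from the frame interval: at 24 fps, each frame lasts 1000/24 ≈ 41.7 ms. A quick sketch of the arithmetic:

```python
# Latency budget derived from the cinematic frame rate.
CINEMA_FPS = 24

def frame_interval_ms(fps: float) -> float:
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

budget = frame_interval_ms(CINEMA_FPS)
print(f"{budget:.2f} ms")  # 41.67 ms
```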
Other requirements included modularity in interfacing with other virtual reality equipment, such as monitors and head-mounted displays. In addition, compliance with standard on-board drivers such as the Universal Serial Bus (USB) Human Interface Device (HID) protocol was a major requirement for the glove to be functional across various platforms.
The glove prototype tracked the absolute position and orientation of a hand as well as each finger's movement. To streamline development, the glove was built using several existing systems, including the Unity gaming engine, the Oculus Rift virtual reality head-mounted display, the Razer Hydra [3] PC Gaming Motion Sensing Controller, and the Arduino microcontroller platform. Several different types of sensors were considered for detecting finger movements, including bend sensors and inertial sensors, but bend sensors were ultimately chosen because of their simplicity.
The glove prototype was designed based on the block diagram shown in Figure 1:
Figure 1. System block diagram

The VR Glove system is powered by the +5V rail of the USB connection, with the exception of the Oculus Rift.
This gives the system a current budget of 220 mA. The goal for the system is an overall latency of 20 ms, from a user's hand or finger movement to its appearance on screen.

The bend sensors are 2.2 inches long and are sourced from Spectra Symbol. The relationship between the sensor's bend angle and its resistance is approximately linear, which simplified many design considerations. The sensors obtained cover a resistance range of 22-70 kΩ over bend angles ranging from 75-180 degrees.

The Arduino Micro was selected as the processing backbone of the system, which generates the virtual representation of the hand. The Razer Hydra is the component used to track the absolute position of the virtual reality hand. The gaming controllers and the base station each contain three magnetic coils. These coils work in tandem with the Razer Hydra's amplification circuitry, digital signal processor, and positioning algorithm to translate field data into position and orientation data. This provides truly three-dimensional interaction with the game world, allowing more spatial control than is possible with an ordinary mouse. The technology also allows full six-degree-of-freedom movement tracking while eliminating the need for a line of sight to operate.

Unity is the game development ecosystem that ties all the components together. It is a powerful rendering engine, fully integrated with a complete set of intuitive tools and rapid workflows for creating interactive 3D and 2D content. Model assets for use in Unity were created in a separate application, Blender (Figure 2). The hand model comprises 1012 vertices, 1990 edges, and 983 faces. A 1024 × 1024 bitmap of a hand was used to texture the model.
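The nearly linear resistance-angle relationship lets a simple interpolation recover the knuckle angle from a sensor reading. A minimal sketch, assuming resistance is lowest (22 kΩ) when the finger is straight (180°) and highest (70 kΩ) at the 75° extreme, which matches typical Spectra Symbol flex sensors (the paper does not state which endpoint maps where):

```python
# Linear resistance-to-angle mapping for the bend sensors.
# Assumed endpoints (consistent with typical Spectra Symbol flex sensors):
# 22 kΩ when the knuckle is straight (180°), 70 kΩ when fully bent (75°).
R_STRAIGHT, ANGLE_STRAIGHT = 22_000.0, 180.0
R_BENT, ANGLE_BENT = 70_000.0, 75.0

def bend_angle(resistance_ohms: float) -> float:
    """Interpolate knuckle angle (degrees) from sensor resistance."""
    t = (resistance_ohms - R_STRAIGHT) / (R_BENT - R_STRAIGHT)
    return ANGLE_STRAIGHT + t * (ANGLE_BENT - ANGLE_STRAIGHT)

print(bend_angle(22_000))  # 180.0
print(bend_angle(46_000))  # midpoint of the range -> 127.5
```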
In constructing the hand, research was performed into how the human hand is bound to its skeleton, and virtual bones modeled on the real human skeleton were created in simplified form. Once the mesh was created, collision boxes were created and attached to the bones so that the hand could interact with other objects in the environment. In addition to the hand, a simple table and a rubber ball were also created.
Figure 2. Blender Development Environment
The engine allowed data from all the components to be consolidated without worrying about low-level implementation details.
The Razer Hydra and the Oculus Rift communicate with Unity through plugins. The challenge in the project was to create a similar plugin that would send the glove's data to Unity in a convenient format. This was originally attempted using low-level USB drivers, but that approach proved extremely difficult. Instead, the plugin was implemented using Uniduino, which allowed the microcontroller to communicate directly with Unity without diving into low-level USB drivers. Once the environment is created, the user dons the Oculus Rift to immerse themselves in the digital experience. The Oculus Rift is the head-mounted display that provides the virtual reality world used as the environment for the 3D hand. It has a 1280 × 800 pixel resolution and 1,000 Hz head-tracking capability.
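Uniduino communicates with the Arduino over a serial link using the Firmata protocol. As an illustration of the wire format only (not Uniduino's actual source), a Firmata analog report is a three-byte packet: a command byte `0xE0 | pin`, followed by the 10-bit ADC value split into two 7-bit bytes, LSB first:

```python
ANALOG_MESSAGE = 0xE0  # Firmata command nibble for analog pin reports

def parse_analog_message(packet: bytes) -> tuple[int, int]:
    """Decode a 3-byte Firmata analog message into (pin, 10-bit ADC value)."""
    command, lsb, msb = packet
    assert command & 0xF0 == ANALOG_MESSAGE, "not an analog message"
    pin = command & 0x0F
    value = (lsb & 0x7F) | ((msb & 0x7F) << 7)  # 7-bit data bytes, LSB first
    return pin, value

# An ADC reading of 612 on analog pin 2 arrives as three bytes:
print(parse_analog_message(bytes([0xE2, 612 & 0x7F, 612 >> 7])))  # (2, 612)
```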
Sensor Selection

For full finger articulation, the available choices for sensors were resistive bend sensors and a combination of an accelerometer, a gyroscope, and a magnetometer known as an Inertial Measurement Unit (IMU). The IMU would provide the angular acceleration of the joint, from which the amount of bend (angular position) must be calculated by a double integration over time. Not only does a double-integration algorithm consume significant memory on the microcontroller, it also introduces errors into the calculation. These errors require additional filtering in software to yield acceptable data [4-7].
The magnetometer readings from the IMU would also be affected by the strong magnetic fields generated by the Razer Hydra, which we used for absolute positioning. Moreover, the amount of forward kinematics required to determine the position of each knuckle would consume more memory and induce additional latency [8-10].
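The drift problem can be seen numerically: even a tiny constant accelerometer bias, integrated twice, yields a position error that grows quadratically with time. A sketch with hypothetical numbers:

```python
def double_integrate(accel_samples, dt):
    """Naively integrate acceleration twice to get position."""
    velocity = position = 0.0
    for a in accel_samples:
        velocity += a * dt
        position += velocity * dt
    return position

dt = 0.01    # 100 Hz sampling (hypothetical rate)
bias = 0.01  # a mere 0.01 m/s^2 constant sensor bias
for seconds in (1, 10, 60):
    samples = [bias] * int(seconds / dt)
    # Position error grows roughly as (1/2) * bias * t^2.
    print(seconds, double_integrate(samples, dt))
```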
Figure 3. Resistance (Ω) vs. bend angle (degrees) for the chosen bend sensors

The bend sensor outputs (Figure 3) were nearly linear over the range of 75-180 degrees, which covers the widest range of bend angles of any individual knuckle on the hand.
It was also evident through forward kinematics that, given the amount of bend on the middle knuckle of any finger, the bend on the top knuckle can be accurately calculated, while the bottom knuckle's motion is entirely independent of the middle knuckle's [7]. Moreover, the bend sensors gave reliable output even without an instrumentation amplifier. All these advantages made bend sensors a better choice than the IMU combination.
Here, θt is the bend angle at the top knuckle, θm is the bend angle at the middle knuckle. The angle measurements were made using a goniometer.
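As an illustration of this coupling only, a common biomechanical rule of thumb (assumed here, not the authors' measured relation, which is not reproduced in this text) is that the top (DIP) knuckle bends roughly two-thirds as far as the middle (PIP) knuckle:

```python
# Hypothetical forward-kinematics coupling between knuckles.
# Assumption (common rule of thumb, NOT the paper's fitted equation):
# the top (DIP) knuckle bends roughly 2/3 as far as the middle (PIP) knuckle.
STRAIGHT = 180.0  # straight finger, matching the sensor's angle convention

def top_knuckle_angle(theta_m: float, coupling: float = 2.0 / 3.0) -> float:
    """Estimate top-knuckle angle from the measured middle-knuckle angle."""
    bend_m = STRAIGHT - theta_m          # how far the middle knuckle is bent
    return STRAIGHT - coupling * bend_m  # top knuckle bends a fraction of that

print(top_knuckle_angle(180.0))  # straight finger -> 180.0
print(top_knuckle_angle(90.0))   # 90 degrees of middle bend -> about 120
```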
The Unity gaming engine was programmed to allow customization to each user's unique hand movement. The flowchart of the program is shown in Figure 5:
Figure 5. Program Flowchart
First, the data from the ADC is captured asynchronously; Unity receives packets of information that correspond to each analog channel's value. This information is stored in memory. When Unity is ready to update a frame for rendering, it uses these values to perform calculations. Next, the data is linearized using the linearization equation, ensuring that all subsequent calculations operate on linear values.
Next, the data is normalized to a value between 0 and 1 using calibration data. A value of 0 indicates that the finger is fully closed, and a value of 1 indicates that it is fully open. The calibration data is captured in this step: if the user wishes to calibrate an open pose or a closed pose, a variable is set which corresponds to the respective position. This is done inline so that calibration does not require multiple separate readings. Next, the normalized data is mapped to the hand model. The angles on the hand model are determined before simulation and are values found for typical hands. A normalized value of 0 maps to a fully closed finger, and a normalized value of 1 maps to a fully extended finger. Finally, the angle is applied to the hand model. The angles need only be applied to the local rotation of each digit, since the hand model was carefully designed to bend naturally about the Z axis. This process is repeated on each frame update, which typically occurs 60 times per second.
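The normalization and mapping steps can be sketched as follows; the calibration readings and the 90°/0° model angles below are hypothetical "typical hand" values, not figures from the paper:

```python
def normalize(raw: float, closed_raw: float, open_raw: float) -> float:
    """Map a linearized sensor reading onto [0, 1] using calibration data:
    0 = finger fully closed, 1 = finger fully open."""
    t = (raw - closed_raw) / (open_raw - closed_raw)
    return min(max(t, 0.0), 1.0)  # clamp readings outside the calibrated range

def to_model_angle(norm: float, closed_deg: float = 90.0,
                   open_deg: float = 0.0) -> float:
    """Map the normalized value to a local Z rotation on the hand model.
    The 90-degree/0-degree endpoints are hypothetical typical-hand values."""
    return closed_deg + norm * (open_deg - closed_deg)

# Example frame update: a reading halfway between the calibrated poses.
n = normalize(512.0, closed_raw=300.0, open_raw=724.0)
print(n, to_model_angle(n))  # 0.5 45.0
```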
The final prototype of the glove is shown in Figure 6.
The bend sensors are securely placed in the pockets stitched on the finger compartments of the glove.
To prevent the signals from the bend sensors from shorting, the sensors were wrapped in insulating plastic tubing before being sewn into the pockets.
The circuit shown in Figure 7 is used to obtain reliable output from the bend sensors. It consists of the bend sensors connected to the analog pins of the Arduino through voltage dividers, so that a change in resistance from a bend corresponds to a change in analog voltage. This circuit was soldered onto a custom Printed Circuit Board (PCB) to avoid loose contacts and to keep the electronics attached to the glove as a single unit.
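The divider arithmetic can be sketched as follows; the 27 kΩ fixed-resistor value and the measurement direction are assumptions for illustration, not values from the paper:

```python
# Voltage-divider reading for a bend sensor on an Arduino analog pin.
VCC = 5.0            # USB +5V rail
R_FIXED = 27_000.0   # fixed leg of the divider (hypothetical value)
ADC_MAX = 1023       # Arduino's 10-bit ADC

def divider_voltage(r_bend: float) -> float:
    """Voltage at the analog pin, measured across the fixed resistor."""
    return VCC * R_FIXED / (R_FIXED + r_bend)

def adc_count(r_bend: float) -> int:
    """10-bit ADC code corresponding to a given bend-sensor resistance."""
    return round(divider_voltage(r_bend) / VCC * ADC_MAX)

print(adc_count(22_000))  # straight finger: low sensor resistance, high count
print(adc_count(70_000))  # fully bent: high sensor resistance, low count
```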
Results

The VR glove prototype was able to respond to small changes in orientation and bend with a reaction time of less than 42 milliseconds. This, when combined with high-quality gaming devices like the Razer Hydra and the Oculus Rift, allows the gamer to be fully immersed in the cinematic experience of the game with seamless frame transitions.

Several issues relating to USB device drivers on Windows 8 were encountered. When the glove was connected, the Unity simulation would frequently cause the Blue Screen of Death (BSOD) with an error message. The problem was traced to the USB driver in Windows: when asynchronous operations occur on different threads, there is a chance of a race condition in the native USB driver, which causes the NT kernel to crash. The problem was alleviated by switching to Windows 7, which crashed less frequently because it is not as optimized for multithreading as Windows 8. For cross-platform compatibility, it will be necessary to integrate all of the USB inputs into a single USB cable.