
Integration with the Tactile Interface

In order to obtain and manage the digital representation that will be projected by the chosen HMD, dedicated software is needed. For this purpose, we have decided to use the Unity 3D tool developed by Unity Technologies [1], integrated with the AR extension Vuforia developed by Qualcomm Inc. [2]. This software tool processes the 3D model data in order to provide the HMD with the correct image to project. This operation requires a tracking system based on a frame of reference placed in the real environment.

In AR, tracking is the technology that allows finding the relative position and orientation between the user's point of view and the fixed coordinate reference system in the environment. The estimation of the user's point of view is necessary for the graphic algorithms of the AR application to correctly render the virtual part and, consequently, to achieve a proper alignment between the real world and the augmented content. Among the different tracking systems, we have chosen the simplest and most commonly used one, namely the vision-based tracking technique, and specifically the marker-based approach. Vision-based tracking provides the pose by processing images coming from the live video stream of a camera, which is already integrated in most HMDs. The marker-based approach relies on a physical marker located in the real world, which is acquired by the camera. The marker is the fixed frame of reference that allows the software to determine the position and orientation of the virtual object, enabling the correct image projection. Unity and Vuforia are able to use simple planar pictures as markers.
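To make the marker-based tracking step more concrete, the following minimal Python sketch estimates a pose from the four corners of a planar marker detected in a camera frame. It is only an illustration of the general technique, not the Vuforia implementation: the marker size, camera intrinsics and detected pixel coordinates are made-up values, and OpenCV's solvePnP stands in for the actual tracker.

# Minimal sketch of marker-based pose estimation (illustrative, not Vuforia):
# given the four marker corners detected in a camera frame, recover the camera
# pose relative to the marker frame of reference.
import numpy as np
import cv2

MARKER_SIZE = 0.10  # marker side length in metres (assumed)

# 3D corner coordinates expressed in the marker frame (z = 0 plane)
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

def estimate_pose(corners_px, camera_matrix, dist_coeffs):
    """Return rotation matrix and translation of the camera w.r.t. the marker."""
    ok, rvec, tvec = cv2.solvePnP(object_points, corners_px,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # axis-angle vector to 3x3 rotation matrix
    return R, tvec

# Example with made-up intrinsics and corner detections
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
corners_px = np.array([[300.0, 200.0], [380.0, 205.0],
                       [375.0, 285.0], [298.0, 280.0]], dtype=np.float64)
R, t = estimate_pose(corners_px, camera_matrix, dist_coeffs)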

In order to provide Unity 3D with the data needed for the visual rendering, a communication between Matlab and Unity is needed. Once the control algorithm explained in Sect. 6.5 had been programmed, a data export protocol was developed. Indeed, the first phases of the Matlab algorithm consist of importing/defining the surface data and selecting the cutting plane needed to obtain the trajectory that will be represented by means of the tactile interface. All the surface data are therefore processed and stored by the Matlab software, and the geometry can be exported in the *.obj file format, which is directly imported into Unity 3D. This process is schematically shown in Fig. 8.2.
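As an illustration of the export step, the short Python sketch below writes a triangulated surface to a Wavefront *.obj file of the kind that Unity 3D imports directly. It is a simplified stand-in for the Matlab export routine; the vertex and face arrays are placeholder data.

# Minimal sketch of the geometry export step (illustrative): write a
# triangulated surface to a Wavefront *.obj file readable by Unity 3D.
import numpy as np

def export_obj(path, vertices, faces):
    """vertices: (N, 3) array of xyz; faces: (M, 3) array of 0-based indices."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x:.6f} {y:.6f} {z:.6f}\n")
        for a, b, c in faces:
            # OBJ face indices are 1-based
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")

# Example: a single quad split into two triangles
vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
faces = np.array([[0, 1, 2], [0, 2, 3]])
export_obj("surface.obj", vertices, faces)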

Before performing the export process, the position of the 3D model of the surface needs some adjustments. The marker, which defines the frame of reference, is placed on the base of the station, but the surface has to be represented congruently with the strip, whose frame of reference is located at the centre of the rails. For this reason, in order to guarantee a correct visualisation, Matlab performs a transformation that moves the surface from the marker frame of reference to the strip frame of reference. After that, it is possible to perform the export process ensuring the correct Augmented Reality visualisation. Figure 8.3 shows the integration between the tactile interface and the Augmented Reality visualisation system.
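The frame change can be sketched as a single homogeneous transformation applied to the surface vertices before export. The rotation and translation used below are assumed, illustrative values, not the actual offsets between the marker on the station base and the strip frame at the centre of the rails.

# Minimal sketch of the frame change applied before export (assumed values):
# move surface vertices from the marker frame to the strip frame.
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def change_frame(vertices, T_marker_to_strip):
    """Apply the homogeneous transform to an (N, 3) array of vertices."""
    homogeneous = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (T_marker_to_strip @ homogeneous.T).T[:, :3]

# Example: the strip frame is assumed to be offset 0.20 m along x and
# 0.05 m along z with respect to the marker frame, with no rotation.
T = make_transform(np.eye(3), [0.20, 0.0, 0.05])
surface_strip = change_frame(np.array([[0.0, 0.0, 0.0], [0.1, 0.1, 0.0]]), T)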


Fig. 8.2 Augmented reality data workflow


Fig. 8.3 Integration between the tactile interface and the Augmented Reality visualisation system, using as rendered surface a section of a design product, specifically a lamp designed by Artemide S.p.A.

References

1. Unity 3D. http://www.unity3d.com
2. Vuforia. https://www.qualcomm.com
 