
Tactile Display for Virtual 3D Shape Rendering


The product design process is based on a sequence of phases in which the concept of the shape of a product is typically represented through a digital 3D model, and often also by means of a corresponding physical prototype. The digital model allows designers to perform the visual evaluation of the shape, while the physical model is used to better evaluate the aesthetic characteristics of the product, e.g. its dimensions and proportions, by touching and interacting with it. If the new shape, either in its digital or physical form, does not satisfy the designers, it has to be modified. A modification of the digital model requires a new physical prototype of the shape for further evaluation. Conversely, a modification of the physical prototype requires a consequent update of the digital model, which can be performed by remodelling the shape or by using techniques such as reverse engineering. Design and evaluation activities are typically cyclical, repeated many times before reaching the optimal and desired shape. This reiteration leads to an increase in development time and, consequently, in the overall product development cost.

It would therefore be far more efficient and effective if the two kinds of evaluation could be performed at the same time, instead of at two distinct moments and with different means. Today's computer-based tools do not allow us to perform the visual and tactile evaluations simultaneously.

The aim of this research work is to develop a novel system for the simultaneous visual and tactile rendering of product shapes, allowing designers to both touch and see new product shapes as early as the conceptual development phase.

The proposed system for visual and tactile shape rendering consists of a tactile display able to represent the shape of a product in the real environment, where it can be explored naturally through free-hand interaction. It allows designers to explore the rendered surface through continuous touch of curves lying on the product shape. Ideally, the designer selects curves on the shape surface, which can be considered style features of the shape, and evaluates the aesthetic quality of these curves by manual exploration. In order to physically represent the selected curves, a flexible surface is modelled by means of servo-actuated modules controlling a physical deforming strip. The device is designed to be portable, low cost, modular and high performing in terms of the types of shapes that can be represented. The developed tactile display can be used effectively when integrated with an augmented reality system, which renders the visual shape on top of the tactile haptic strip. This allows a simultaneous representation of the visual and tactile properties of a shape.
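To make the idea concrete, the core geometric step, turning a selected curve into bending commands for a chain of servo-actuated modules, can be sketched as follows. This is a hypothetical simplification for illustration only, not the control software of the actual device: it assumes a planar curve sampled at the (fixed) module pitch of the strip and computes the relative rotation each joint must apply.

```python
import math

def strip_angles(points):
    """Relative bending angle (radians) at each interior joint of a
    strip whose modules sit at the given planar sample points.

    points -- list of (x, y) samples along the target curve, assumed
    to be spaced at the fixed pitch of the strip's modules.
    """
    angles = []
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        # Heading of the strip segments entering and leaving joint i.
        h_in = math.atan2(y1 - y0, x1 - x0)
        h_out = math.atan2(y2 - y1, x2 - x1)
        # Relative rotation the servo at joint i must apply,
        # wrapped into (-pi, pi].
        d = h_out - h_in
        angles.append(math.atan2(math.sin(d), math.cos(d)))
    return angles

# A straight segment needs no bending at any joint.
flat = strip_angles([(0, 0), (1, 0), (2, 0), (3, 0)])

# A circular arc sampled at equal steps bends uniformly.
arc = strip_angles([(math.cos(math.radians(t)), math.sin(math.radians(t)))
                    for t in (0, 30, 60, 90)])
```

For the straight strip every joint angle is zero, while the circular arc produces the same angle (here 30 degrees) at every joint, reflecting its constant curvature; a real device would additionally clamp these angles to the servos' mechanical range.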

The developed tactile display has been compared with similar devices currently available both on the market and in research labs. In addition, preliminary tests have been performed with a group of designers. Both the comparison and the testing session achieved positive and satisfactory results, which highlighted the high innovative potential of the system. The tactile display offers several benefits when used in the initial conceptual phases of product design. Designers will be able to change the shape of a product according to the tactile evaluation, before a physical prototype is developed. This will reduce the number of physical prototypes needed and, consequently, both the cost and the overall time of the product development process. Moreover, designers may improve their creativity during product shape conception, since they will have the chance to optimise the design-evaluation process by evaluating visual and tactile properties at the same time.

The research described in this book is taken from the Ph.D. thesis of Alessandro Mansutti with the supervision and support of the other authors.

Milan, Italy Alessandro Mansutti

July 2016
