Haptics tool creates realistic virtual textures

Researchers at the USC Viterbi School of Engineering say they have created a user-driven haptic texture search that can generate virtual textures that feel like those in the real world.
By Rich Pell

In addition to sight and sound, tactile sensation is an essential part of how humans perceive reality, and haptics, devices that produce precisely tuned vibrations to mimic the sensation of touch, are a way to bring that third sense to life. However, the researchers say, humans are highly particular about whether or not something feels “right,” and current data-driven haptics models are not always successful at realistically recreating real-world textures.

While data-driven texture modeling and rendering have pushed the limits of realism in haptics, the researchers say, the lack of haptic texture databases, the difficulty of interpolating and extending models, and the complexity of real textures prevent data-driven methods from capturing a wide variety of textures or from customizing models to specific output hardware or user needs. To address this, the researchers developed a preference-driven model that uses humans’ ability to distinguish fine details of texture as a tool to give the virtual counterparts a “tune-up.”

“We ask users to compare their feeling between the real texture and the virtual texture,” says Shihan Lu, a USC Viterbi Ph.D. student in computer science. “The model then iteratively updates a virtual texture so that the virtual texture can match the real one in the end.”

The idea, say the researchers, drew inspiration from the art application Picbreeder, which can generate images based on a user’s preference over and over until it reaches the desired result.

“We thought, what if we could do that for textures?” says Matthew Fontaine, also a USC Viterbi Ph.D. student in computer science.

Using this preference-driven model, the user is first given a real texture, and the model randomly generates three virtual textures using dozens of variables, from which the user can then pick the one that feels the most similar to the real thing. Over time, the search adjusts its distribution of these variables as it gets closer and closer to what the user prefers.
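The article does not spell out the search algorithm, but the loop it describes resembles a simple distribution-based evolutionary search. The following is a minimal sketch under that assumption; the parameter count, the update rule, and the user_pick_best callback are hypothetical stand-ins, with the callback playing the role of the human in the loop.

```python
import numpy as np

# Hypothetical sketch: N_PARAMS stands in for the "dozens of variables"
# the article says describe each virtual texture.
N_PARAMS = 24
rng = np.random.default_rng(0)

def search_texture(user_pick_best, n_rounds=20):
    """Iteratively narrow a Gaussian over texture parameters.

    user_pick_best(candidates) returns the index of the candidate that
    feels closest to the real texture; here it is a placeholder for
    the human making that comparison.
    """
    mean = rng.uniform(0.0, 1.0, N_PARAMS)  # current best guess
    std = 0.3                               # exploration width
    for _ in range(n_rounds):
        # Sample three virtual textures around the current estimate.
        candidates = [np.clip(mean + rng.normal(0, std, N_PARAMS), 0, 1)
                      for _ in range(3)]
        chosen = candidates[user_pick_best(candidates)]
        # Move the distribution toward the preferred texture and shrink
        # it, so later rounds refine rather than explore.
        mean = 0.5 * mean + 0.5 * chosen
        std *= 0.9
    return mean

# Stand-in "user": prefers candidates closest to a hidden target texture.
target = rng.uniform(0.0, 1.0, N_PARAMS)
pick = lambda cands: int(np.argmin([np.linalg.norm(c - target) for c in cands]))
print(search_texture(pick))
```

Sampling only three candidates per round keeps the comparison task manageable for a human, and the shrinking exploration width mirrors the article’s description of a search that adjusts its distribution as it closes in on the user’s preference.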

This method, say the researchers, has an advantage over directly recording and “playing back” textures, as there’s always a gap between what the computer reads and what we feel.

“You’re measuring parameters of exactly how they feel it, rather than just mimicking what we can record,” says Fontaine. “There’s going to be some error in how you recorded that texture, to how you play it back.”

The only thing the user has to do is choose which texture matches best and adjust the amount of friction using a simple slider. Friction is essential to how we perceive textures, say the researchers, and its perception can vary from person to person.
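As a small illustration of that design split, here is a sketch assuming friction is a single scalar the user sets directly with the slider while the vibration parameters come from the search; the dictionary layout and value ranges are assumptions, not the authors’ data format.

```python
import numpy as np

# Assumed layout: friction is not searched but set directly by the user,
# then attached to the searched vibration parameters.
def apply_friction(vibration_params, slider_value):
    """Combine searched vibration parameters with a user-chosen
    friction level in [0, 1]."""
    return {"vibration": np.asarray(vibration_params, dtype=float),
            "friction": float(np.clip(slider_value, 0.0, 1.0))}

# Example: a placeholder parameter vector plus a mid-range slider setting.
texture = apply_friction(np.full(24, 0.5), slider_value=0.4)
print(texture["friction"])
```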

The work comes just in time for an emerging market for accurate virtual textures: everything from video games to fashion design is integrating haptic technology, and existing databases of virtual textures can be improved through this user-preference method.

“There is a growing popularity of the haptic device in video games and fashion design and surgery simulation,” says Lu. “Even at home, we’ve started to see users with those (haptic) devices that are becoming as popular as the laptop. For example, with first-person video games, it will make them feel like they’re really interacting with their environment.”

The texture-search model also allows someone to take a virtual texture from a database, like the University of Pennsylvania’s Haptic Texture Toolkit, and refine it until they get the result they want.

“You can use the previous virtual textures searched by others, and then based on those, you can then continue tuning it,” says Lu. “You don’t have to search from scratch every time.”
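In code terms this amounts to warm-starting the search: begin from a previously tuned parameter vector with a narrow exploration width rather than from a random guess. A hedged sketch, reusing the loop structure assumed above (the starting texture, round count, and width are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def refine_texture(start_params, user_pick_best, n_rounds=10, std=0.1):
    """Continue tuning from an existing virtual texture instead of
    searching from scratch: same candidate-and-pick loop, but the
    search starts at start_params with a small exploration width."""
    mean = np.asarray(start_params, dtype=float)
    for _ in range(n_rounds):
        candidates = [np.clip(mean + rng.normal(0, std, mean.size), 0, 1)
                      for _ in range(3)]
        mean = 0.5 * mean + 0.5 * candidates[user_pick_best(candidates)]
        std *= 0.9  # later rounds refine rather than explore
    return mean

# Demo with a stand-in "user" who prefers candidates near a hidden target.
target = rng.uniform(0.0, 1.0, 24)
downloaded = rng.uniform(0.0, 1.0, 24)  # e.g. a texture fetched from a database
pick = lambda cands: int(np.argmin([np.linalg.norm(c - target) for c in cands]))
print(refine_texture(downloaded, pick))
```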

This especially comes in handy for virtual textures that are used in training for dentistry or surgery, which need to be extremely accurate, says Lu.

“Surgical training is definitely a huge area that requires very realistic textures and tactile feedback. Fashion design also requires a lot of precision in texture in development, before they go and fabricate it.”

In the future, say the researchers, real textures may not even be required for the model. The way certain things in our lives feel is so intuitive that fine-tuning a texture to match that memory is something we can do inherently just by looking at a photo, without having the real texture for reference in front of us.

“When we see a table, we can imagine how the table will feel once we touch it,” says Lu. “Using this prior knowledge we have of the surface, you can just provide visual feedback to the users, and it allows them to choose what matches.”

For more, see “Preference-Driven Texture Modeling Through Interactive Generation and Search.”
