With Rory Clark, Orestis Georgiou.

Git Repositories

TexNet: Image Texture Prediction Model

TexNet is a machine learning model that enables the automatic creation of mid-air haptic feedback from visual image textures. It operates by extracting statistical measures of the variance in pixel co-occurrences within different types of image texture. These measures are then used to predict perceptual ratings along several visual texture dimensions for a given image.
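As a rough illustration of this kind of pipeline (not the published TexNet code), the sketch below computes grey-level co-occurrence statistics with scikit-image and feeds them to an off-the-shelf regressor; the specific feature set and model choice are assumptions for illustration:

```python
# Illustrative sketch of co-occurrence feature extraction and perceptual
# rating prediction; the feature set and regressor are assumptions, not
# the published TexNet implementation.
import numpy as np
from skimage import io, color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestRegressor

def cooccurrence_features(path):
    """Statistical measures of pixel co-occurrences in a texture image."""
    img = io.imread(path)
    if img.ndim == 3:                      # collapse RGB to greyscale
        img = color.rgb2gray(img)
    grey = img_as_ubyte(img)
    glcm = graycomatrix(grey,
                        distances=[1, 2, 4],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# X: features per training image; y: crowd-sourced perceptual ratings
# (e.g. roughness) gathered via Mechanical Turk.
# X = np.stack([cooccurrence_features(p) for p in image_paths])
# model = RandomForestRegressor().fit(X, y)
# prediction = model.predict([cooccurrence_features("query.png")])
```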

Images from an open-source image texture database were presented via Amazon’s Mechanical Turk, where users visually assessed the underlying textural qualities and provided ratings for them. Having established a method for predicting these visual perceptual judgements, the predicted values were then converted into a mid-air haptic sensation using a linear mapping. Finally, a user study was conducted to assess the accuracy of the output sensations.
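A minimal sketch of such a linear mapping, assuming the model outputs a roughness score in [0, 1] that drives the modulation frequency and intensity of a haptic focal point (the parameter ranges here are hypothetical, not those used in the study):

```python
# Hypothetical linear mapping from a predicted perceptual rating to
# mid-air haptic parameters; the ranges are assumptions for illustration.
def roughness_to_haptics(roughness, f_range=(20.0, 200.0),
                         i_range=(0.4, 1.0)):
    """Map a roughness score in [0, 1] to (frequency_hz, intensity)."""
    r = min(max(roughness, 0.0), 1.0)       # clamp the prediction
    frequency = f_range[0] + r * (f_range[1] - f_range[0])
    intensity = i_range[0] + r * (i_range[1] - i_range[0])
    return frequency, intensity
```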


Intensity Modulation from Image Displacement Maps

This open-source textures project is part of an internal research project exploring the possibility of rendering varying surface textures using an ultrasonic mid-air haptic (UH) device. The work currently enables the generation of haptic textures from images by modulating haptic sensation intensity according to the greyscale values of an image’s displacement map. From this information, the roughness and bumpiness of a texture can be effectively presented using ultrasonic mid-air haptics. In addition, the visual and haptic feedback are directly linked, which ensures congruency for the user whilst exploring a visuo-haptic texture.
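As a minimal sketch of this idea (the names and 8-bit greyscale assumption are ours, not the demo project’s API), the snippet below samples a displacement map at the tracked hand’s normalised surface coordinates and returns a value that can scale the output intensity of a haptic focal point:

```python
# Illustrative sketch: modulate haptic intensity by the greyscale value
# of a displacement map under the tracked hand position.
from PIL import Image

class DisplacementTexture:
    def __init__(self, path):
        self.map = Image.open(path).convert("L")  # 8-bit greyscale

    def intensity_at(self, u, v):
        """Sample intensity in [0, 1] at normalised texture coords (u, v)."""
        x = int(u * (self.map.width - 1))
        y = int(v * (self.map.height - 1))
        return self.map.getpixel((x, y)) / 255.0

# Each frame: project the tracked hand position onto the textured surface
# to obtain (u, v), then scale the focal point's output by intensity_at(u, v),
# keeping the visual and haptic feedback in step.
```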

Code: TexNet GitHub Repository and Haptic Textures Unity Demo Project


Conference Papers

Incorporating the Perception of Visual Roughness into the Design of Mid-Air Haptic Textures

How can we connect visual and haptic feedback to create more congruent stimuli, and how can we automate this to improve the design process?


PDF: Incorporating the Perception of Visual Roughness into the Design of Mid-Air Haptic Textures