Exhibitor Profile: CDE

By CDE

Published in: Exhibitor Profile

Date: 6/4/17

Based at Bath and Bournemouth Universities, the Centre for Digital Entertainment (CDE) is an EPSRC Centre for Doctoral Training funding doctoral researchers. In this blog we get to know the team behind the project.

Our researchers work closely with, and in, some of the most exciting high-tech companies in the world: not for a token short-term placement, but fully embedded, usually for three years. Our company partners include not only animation, VFX and games companies but also companies that apply these skills in other areas such as healthcare, heritage and defence. Founded in 2009, CDE is the national Doctoral Training Centre in these sectors, with £20 million of investment. We are a big and supportive CDE family working alongside some of the most outstanding companies in the world.

The EngD is a four-year doctoral programme at the same research level as a PhD, but with an additional package of training and practical experience delivered via an industry placement; to reflect this, students are also called Research Engineers. The first year is spent at one of the two universities, Bath or Bournemouth, taking a taught course that prepares students for research in CDE's sectors. During years two, three and four, students undertake research in a company.

Contact: Denise Cooke
Tel: 01225384646
Email: d.cooke@bath.ac.uk

CDE Research Engineers and Virtual Reality
Owen O’Neill

Analysis of Commercial Virtual Reality for Upper Limb Stroke Impairment
Salisbury NHS Trust
In the UK, over 150,000 people experience a first-time stroke each year. Stroke is the most common cause of severe adult disability, and data from 2012 suggest there are around 1.1 million survivors in the UK living with a wide spectrum of physical and psychological disabilities, including loss of language, memory and use of limbs, and depression. The impetus of stroke research is to explore avenues that may improve treatment, patient care and preventative measures.

Our research is focused on critically evaluating low-cost, easily accessible commercial technologies and how these may, or may not, be of therapeutic value in the neuromuscular rehabilitation of patients. Recent work in this area has identified technologies such as the Nintendo Wii, the Xbox Kinect, and augmented and virtual reality as potentially exciting avenues of research.

Zack Lyons

Virtual Therapy 
Designability and Brain Injury Rehabilitation Trust
My research involves using interactive computational simulations to deliver meaningful benefits to people with acquired brain injuries. It will contribute to the science base on human-agent interaction, as well as to research on Human-Computer Interaction in mental health. I am currently carrying out exploratory work with the intention of articulating design goals to inform future development of simulations. The envisioned emphasis of the project is in exploring the unique dynamics of the three-way interaction between clients, clinicians and the machine.

Hashim Yaqub

Virtual Reality and Maritime Defence Training, BMT Defence Services

My research involves evaluating the effectiveness of consumer off-the-shelf HMDs in virtual maritime training. I am based at BMT Defence Services in Bath, with a department that specialises in developing training solutions for naval platforms. My work involves using new consumer VR technologies such as the Oculus Rift in training scenarios including platform familiarisation, maintenance, and emergency procedures. My projects are currently extending to multiple users interacting with each other in the same environment via these devices.

Ifigenia Mavridou

Emotion stimulation and recognition, Emteq

I am interested in emotion stimulation and the identification of methods for emotion recognition in Virtual Reality (VR) environments. At Emteq, I am working towards enhancing human-computer interaction using emotional states as an input modality, by assisting the development of a facial sensing platform that measures emotions through facial gestures and biometric responses.

Emotion stimulation is related to engagement and "presence" in games and VR. These factors can assist in the creation of immersive experiences, as well as the efficient content design of a VR product in terms of replayability. The acquisition and analysis of physiological signals and facial expressions play an important role in my studies towards evaluating and measuring the dimensions of affect and their relation to cognitive processes such as attention and memory. For my studies I will run a sequence of user-behaviour experiments in VR conditions to explore emotion stimulation, identification and recognition.

I have recently returned from LA, where I presented a research demo at IEEE VR 2017. I spent three days during the conference presenting the "FACETEQ Interface Demo for Emotion Expression in VR" in collaboration with Emteq, my host company during the EngD.

Daniel Finnegan

Personalized Spatial Audio Content for VR Applications
CAMERA and CDE Alumni

When viewing any 3D experience, be it a movie, a game, or interactive virtual reality (VR), objects don't just jump out of the screen. They whirl around you and come to life through sound. Hear the crackle of the fire just outside of view. Hear the ice puck shoot from one end of the rink to the other, bounce off the wall, then slide under your feet. All of this is amplified through sound.

Most people have heard of 5.1 or 7.1 surround sound setups. These add a third dimension to the sound of an experience by using multiple speakers configured in specific ways. This works fine for 3D movies, but most VR applications use headphones to get that immersive kick. What if we could deliver audio that moves around the VR world in 3D with just a pair of headphones?

This is possible using a technique known as binaural rendering. The theory is simple: we have only two ears, which is just like having a two-channel stereo setup, yet we can hear sounds above, below, in front, and behind us. We can do this because our ears have evolved to listen for ‘spatial cues’ in the sound that let us know where it is coming from. By understanding these cues, we can virtually render any sound we want over headphones and give it the spatial property of a real-world sound.
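As a rough, illustrative sketch (not the project's actual code), the two strongest spatial cues are the interaural time difference (ITD, the sound reaching one ear slightly before the other) and the interaural level difference (ILD, the far ear hearing it slightly quieter). A minimal binaural pan based on these two cues, using Woodworth's spherical-head approximation for the ITD, might look like this; the head radius, maximum ILD, and sample rate are illustrative assumptions:

```python
import numpy as np

def binaural_pan(mono, azimuth_deg, sr=44100, head_radius=0.0875, c=343.0):
    """Crudely spatialize a mono signal over headphones using two cues:
    ITD (Woodworth's spherical-head model) and a simple azimuth-scaled ILD."""
    az = np.radians(azimuth_deg)                 # 0 = straight ahead, +90 = hard right
    itd = (head_radius / c) * (abs(az) + np.sin(abs(az)))  # near-ear time lead (s)
    delay = int(round(itd * sr))                 # far-ear delay, in samples
    gain_near = 1.0
    gain_far = 10 ** (-6 * abs(np.sin(az)) / 20)  # up to ~6 dB quieter at the far ear
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)]  # delayed copy
    if azimuth_deg >= 0:                         # source on the right: right ear is near
        left, right = gain_far * far, gain_near * near
    else:
        left, right = gain_near * near, gain_far * far
    return np.stack([left, right], axis=1)       # shape (n_samples, 2)

# 0.1 s of a 440 Hz tone, panned 60 degrees to the right
tone = np.sin(2 * np.pi * 440 * np.arange(4410) / 44100)
stereo = binaural_pan(tone, azimuth_deg=60)
```

Real binaural rendering convolves the signal with measured head-related transfer functions rather than applying a single delay and gain, but the sketch shows why only two channels are needed.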

This raises a question: how do we capture these spatial cues? Anechoic chambers are carefully designed rooms that absorb over 90% of the reverberation and acoustic reflections. This is so we can capture the sound in its purest form, without any interference from the environment. By analyzing sound in an anechoic chamber and how a person’s ears hear it, we can extract the cues. However, this is very time consuming: it can take hours and requires not just an anechoic chamber but also specialized equipment.

We are solving this problem by studying how physical body shape (head size, outer ear shape, etc.) interacts with the spatial cues in sound, using advances in computer vision, acoustic simulation, and machine learning. With computer vision we can create 3D models of ourselves from nothing more than a set of images; taking pictures requires no specialized hardware. From these 3D models we can then synthesize the spatial cues for a given person and render 3D audio without the laborious capture method.

Aris Michailidis

Immersion in video games
Sony Interactive Entertainment Europe

Immersion is the psycho-cognitive state that mediates the interaction of an individual with an activity. My research is dedicated to uncovering the brain correlates of immersion in video games using electroencephalography. A game's capacity to instil immersion is said to be a significant indicator of its success; however, we have yet to determine what exactly happens while a player is immersed, whether this state truly contributes to increased performance, and what developers can do to maintain it. For this purpose, I have developed a custom virtual reality game of the tower defence genre, designed for PlayStation VR, and we will also employ machine learning to detect the sustenance and loss of this state.
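To give a flavour of how EEG feeds a machine-learning pipeline (this is a generic illustration, not the project's actual method), a common first step is extracting the power of classic frequency bands from short windows of the signal; band powers then become features for a classifier. The sample rate, window length, and synthetic signal below are all assumptions for the sketch:

```python
import numpy as np

def band_power(eeg, sr, lo, hi):
    """Average power of one EEG channel within [lo, hi] Hz, via the FFT."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / sr)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)   # simple periodogram
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

# Synthetic 2-second window at 256 Hz: a strong 10 Hz (alpha-band)
# oscillation plus noise, standing in for a real recording.
sr = 256
t = np.arange(2 * sr) / sr
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

alpha = band_power(eeg, sr, 8, 12)    # bands often linked to attention/engagement
beta = band_power(eeg, sr, 13, 30)
```

A classifier trained on such per-window features (alpha, beta, their ratios, and so on) across many players is one plausible route to detecting the onset and loss of an immersed state.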


The project is in collaboration with Sony Interactive Entertainment Europe.
