An interview with tech innovation leader Danny Grant
Danny Grant, PhD, is a founding partner and Chief Technology Officer at Innovobot. With over 20 years’ experience in R&D management and a PhD in robotics, he is an expert in control systems, mechatronics, robotics and haptics, and a prolific inventor with over 250 patents to his name. Formerly VP Research at Immersion Corporation, he was the lead inventor of the PlayStation 5 DualSense controller.
The PlayStation 5 is getting great reviews, and the controller in particular. What is so unique about the new design?
The DualSense controller has a number of innovative features. On the haptics side, the controller has been completely redone. There are really two haptics systems in that gamepad. The first is an improvement on something that will be familiar to gamers: rumble technology. In this controller there are two motors with much higher fidelity, able to produce a much wider range of sensations with lower latency. The end result is that you get haptic effects that are more in tune with the audio and visuals, and more representative of the different things you could feel. So you have the vibration-style sensations that are familiar to everyone, but you can also have very short, crisp haptic effects, like the passing of a puck or the firing of a machine gun. That’s the first haptics system in the controller.
But the second system targets a completely different modality in the hand. It’s not vibration based; it’s kinesthetic, which means it can apply forces to physically move the user’s fingers on the triggers. The controller contains adaptive triggers: the user can press down on a trigger and feel a tension, or the trigger can even push back on the finger. Where this is useful is, say, when a user is firing a gun: they can pull through the trigger and feel the click of the gun being fired or, alternatively, the clink of an empty barrel. Another example would be racing games, where you can feel acceleration or braking through the triggers.
Bringing these two haptics systems together is, I think, the most exciting feature of the new PlayStation. Because we have both vibration and kinesthetics, we have a much richer haptic expression. And beyond that, we can now do spatial haptic effects. What I mean by that is that because you have multiple haptic touchpoints (the whole device vibrating, plus the left and right triggers), you can create a sense of movement occurring throughout the controller. A good example would be a game where the user is casting a spell: you can feel the spell gathering strength and building up in the controller, then feel it shoot out of the fingertips by activating the triggers.
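To make that spell-casting example concrete, here is a minimal sketch of how a game might sequence such a spatial effect across the two haptics systems. The Controller class and its set_rumble and pulse_trigger methods are hypothetical placeholders for illustration, not Sony’s actual SDK.

import time

class Controller:
    """Hypothetical stand-in for a DualSense-style gamepad interface."""
    def set_rumble(self, low_freq: float, high_freq: float) -> None:
        """Drive the body vibration actuators (0.0 to 1.0 intensity)."""
        pass
    def pulse_trigger(self, side: str, strength: float, duration_ms: int) -> None:
        """Fire a short kinesthetic pulse on the 'left' or 'right' trigger."""
        pass

def cast_spell(pad: Controller, charge_time_s: float = 1.5) -> None:
    steps = 30
    for i in range(steps):
        # Ramp the body vibration up as the spell "gathers strength".
        level = (i + 1) / steps
        pad.set_rumble(low_freq=level, high_freq=level * 0.5)
        time.sleep(charge_time_s / steps)
    pad.set_rumble(0.0, 0.0)
    # Release: crisp pulses on both triggers, so the effect feels like it
    # leaves the controller through the fingertips.
    pad.pulse_trigger("left", strength=1.0, duration_ms=40)
    pad.pulse_trigger("right", strength=1.0, duration_ms=40)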
You were effectively the lead inventor of the new controller, were you not?
Like any other project of this type, it was, of course, a team effort. But yes, I was responsible for coming up with the concept and the overall design that incorporates those two haptics systems into a gamepad. I led the development of the early prototypes, and then the demonstrations to many people in the gaming industry.
The team I was leading at Immersion built several early prototypes that are functionally similar to what the DualSense controller has today. Then a lot of the time was spent creating cost-reduced designs that could hit the right price point, which is one of the big challenges in haptics and gaming hardware development.
For readers who might not know what haptics is, could you give us a quick overview?
Sure. Haptics is really the programmable sense of touch. So just as you have graphics and audio that are connected to vision and hearing, with haptics, you have the ability to create a digital touch sensation.
Gaming is probably the domain where the idea of haptics is most familiar to people, but moving beyond that, there are common applications in other domains as well. Joysticks for tele-operated robots are a good example. Say you have a robot operating somewhere and that robot has a gripper that it’s using to pick things up. Well, haptics in a joystick would allow you to feel the sensations that the robot would be experiencing.
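As a rough illustration of how that works, the sketch below shows a simple force-feedback loop: force measured at the robot’s gripper is scaled and replayed as resistance on the operator’s joystick. The Gripper and Joystick classes and their methods are hypothetical placeholders, not any particular vendor’s API.

import time

class Gripper:
    """Hypothetical interface to the remote robot's gripper."""
    def read_grip_force(self) -> float:
        """Contact force (in newtons) currently sensed at the gripper."""
        return 0.0

class Joystick:
    """Hypothetical interface to a force-feedback joystick."""
    def apply_resistance(self, force: float) -> None:
        """Command the joystick motor to resist the operator's hand."""
        pass

def feedback_loop(gripper: Gripper, stick: Joystick,
                  scale: float = 0.2, rate_hz: float = 500.0) -> None:
    period = 1.0 / rate_hz
    while True:
        # Map the remote contact force onto the operator's hand,
        # scaled down so it stays comfortable and stable.
        stick.apply_resistance(scale * gripper.read_grip_force())
        time.sleep(period)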
The biggest haptics market right now is in the mobile phone space. Apple has clearly fully embraced haptics now, and so has Android. Most of that has been in relation to buttons on the touchscreen for keyboards, but there are all sorts of new user interface mechanisms made possible by haptics: sliders, spinning wheels, the list goes on.
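As a simple illustration of one of those UI mechanisms, the sketch below adds a haptic “detent” to an on-screen spinning wheel: each time the wheel crosses a step, the phone fires a short, crisp pulse so the user feels the ticks. The vibrate_ms function is a placeholder standing in for whatever transient-effect call the platform’s haptics API provides; it is not a real API.

STEP_ANGLE = 15.0  # degrees of rotation between detents on the wheel

def vibrate_ms(duration_ms: int, amplitude: float) -> None:
    """Placeholder for the platform's haptics call (a short transient pulse)."""
    pass

class HapticWheel:
    def __init__(self) -> None:
        self.last_step = 0

    def on_scroll(self, angle_deg: float) -> None:
        step = int(angle_deg // STEP_ANGLE)
        if step != self.last_step:
            # A short, crisp pulse feels like a mechanical click rather than a buzz.
            vibrate_ms(duration_ms=10, amplitude=0.8)
            self.last_step = step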
What are some of the more exciting trends or innovations you’re seeing in the field of haptics?
We’re seeing a lot of innovation pertaining to touchscreens. When the transition was made from physical keys to touchscreens, people kind of lost the sense of touch, the connection with the device. A good example is the seatback screen on an airplane, where there is no haptic component: you find yourself tapping away without any sensory cue that tells you the device is responding. The technology to remedy that is now here.
Another very exciting area for haptics right now is virtual reality. In VR, the video and audio are already completely immersive, but when you don’t have the sense of touch, you can quickly lose that experience of immersion. Once all three elements of graphics, audio and haptics are present, you can quickly forget you’re in a virtual world.
A lot of haptics is still very much handheld-device based, but there has been some exciting progress on exoskeletons that you can put on a user so that they can start feeling the pressure or weight of picking up virtual objects, or when interacting with virtual characters. That’s a really interesting domain with a lot of exciting possibilities.
Given that Innovobot is focused on Tech for Good, can you talk a bit about how haptic technologies can be used to make the world a better place?
There are countless examples. The area of automotive safety comes to mind: with haptics it is easier to use the increasing number of touchscreens in automobiles while keeping your eyes on the road. Surgical simulation is another area where haptics plays a vital role. There are already medical simulators where doctors can get the sensation that they are interacting with a human body during a simulated procedure. The benefits to society there are immediate and obvious.
Then there’s the idea of remote surgery. The idea that you could be here, in Canada, but performing a surgery in Africa by controlling a robot remotely and getting the full sensory feedback that the robot is getting, that’s a very exciting and motivating prospect.
About the Author:
Yuri Mytko is the Director of Marketing at Innovobot and at Innovotive, the firm’s advisory services arm, where he offers clients solutions to their marketing and communications-related challenges.