Here at Novatech, we've worked with an incredibly wide range of hardware over the years, from all manner of devices for the home and office to complex aviation simulators and virtual reality headsets with near human-eye clarity. But the most exciting advances we see almost always come from companies building new tech for deployment in the Defence and Government Service sectors, and more often than not, that new tech is designed for training and simulation.
Anyone who has ever worked in training and simulation will appreciate that the best solutions are those that remould the very way we learn: solutions that can be tailored to each individual trainee, and that can assess the efficacy of the training methods being employed, whether those methods are complex systems, software, or innate to the hardware itself. Tools and functions such as VBS4's real-time playback and analysis of scenarios, or Varjo's eye tracking and first-time guidance, are great examples of how the training and simulation scene is evolving to improve the ways we measure human performance. But VR Electronics Ltd. have taken this a step further with their fast-developing TESLASUIT and TESLAGLOVE.
We spoke to VP for Global Partnerships, Paul Nickeas, and Co-Founders Dimitri Mikhalchuk and Denis Dybski to find out more about their developments with TESLASUIT, how it can improve current training and simulation, and what the future of wearable tech may hold for the world of human performance training.
Eloquently put by Paul, the TESLASUIT is "a human-to-digital interface" - a full-body suit made up of three independent systems: haptics, motion capture, and biometry.
These three systems combine to create a heightened user experience in training and simulations, in real-world, virtual or mixed reality scenarios. Through haptic feedback delivered by medical-grade electrical muscle stimulation (EMS) and transcutaneous electrical nerve stimulation (TENS), the TESLASUIT is able to "foster 360-degree awareness and engage muscle memory by triggering a neural response." Using EMS and TENS to simulate haptic sensations in this way also means having the ability to actively trigger muscle contractions. In practical terms, by delivering low-level stimulation to specific areas of the body, it is possible to help correct posture and stance, allowing users to adjust their technique more quickly, improving reflexes and accelerating skill mastery.
These systems can even be used to deliver impact simulation, ranging anywhere from the patter of individual rain drops, to full body blows that can simulate shockwaves from explosions.
"The beauty of our approach to haptics is that, with electrodes covering 95% of muscle mass, and with 80 channels, we can generate a signal at any frequency between any two points," said Dimitri. "That gives us the widest possible range and variation of haptic feedback. From the type of subtle feedback registered essentially subconsciously by the brain, to full body motion and response."
Body-based haptics is only one element of touch feedback in simulation, though; the other is tactile feedback: stimulation that occurs as the direct result of user input, indicating a successful interaction. This is key to elevating simulation realism, and to helping users train motor skills more accurately.
"The best virtual "interface" is no interface - no special controls specific to virtual simulations, just the actual skill at hand."
Tactile feedback is traditionally associated with vibration feedback in controllers and smart devices. But for virtual simulation and training, this isn't enough. With the TESLAGLOVE, tactile is combined with haptic feedback, allowing for a user to not only confirm that an interaction has been successfully registered, but to be consciously aware that the interaction is continuing, or else has ceased - in other words, if you pick up an item in the simulation, you'll feel it in your hand as you hold it, manipulate it, or use it to interact with other items, until you put it down again.
It goes beyond just being able to realistically interact with items too, with the TESLAGLOVE making use of an "active" force feedback system, meaning resistance can be applied in a far broader range of scenarios compared to a "passive" approach, such as with tension cables.
Denis explained that "by having an active system using an exoskeleton which sits on top of the hand, it not only allows users to feel the shape of an object that they interact with in VR or XR, but it can actually move the fingers using servomotors that can control flexion and extension. What this allows for is scenarios where users can be guided by way of actually positioning the fingers and generating touch sensation to either imply objects that a user cannot physically or virtually interact with, or else aid in teaching muscle memory to users with little or no experience of a task for which they are training."
Of course, with the ability to manipulate muscle tension, flexion and extension, user safety might be a concern for some adopters - but rest assured, the TESLASUIT suite of products also offers high-fidelity calibration software.
"It gives you a level of personalisation to the user experience like nothing that we've ever come up with before."
This software not only allows users to perfectly tailor the suit and gloves to each individual, but also to save those calibrations to a profile system, making the equipment incredibly versatile and easy to use when shared between multiple trainees.
As for motion capture and biometry, these systems offer an unparalleled insight into user performance, acting primarily as tools for assessment and review. Integrated skeletal and 3D kinematic inertial sensors allow for accurate motion capture, and whilst the TESLASUIT is capable of relaying emotional state, stress level, and other key health indicators (collated from a combination of electrocardiography and electrodermal activity, also known as galvanic skin response), the TESLAGLOVE provides additional biometry sensors which make it possible to read things like blood oxygenation (oxygen levels in the blood), blood pressure variation, and photoplethysmography (PPG).
This biometric data can be collected and assessed, allowing users to not only measure an individual's progress, but also to review the actual effectiveness of the training itself. This is a vital part of any training programme, but is especially important in the Public Safety and Government Service sectors where flawless execution is a must, both for the trainees and their instructors. With the TESLASUIT suite of products, it becomes possible to put trainees in a 'Safe-to-fail' training environment, real or synthetic, providing the vital feedback loop between training, measurement and understanding.
"Using our biometrics alongside AI and Deep Learning, we can identify how far along individuals are in any training process, and have it alter and adapt the scenario to meet the level at which the trainee is performing... Combining these various systems gives us invaluable data. Everything is timestamped right down to the micro-second, allowing us to see the correlation between the intent, the action and the outcome, giving us the ability to infer what needs to be changed in order to achieve that outcome without error. That can then be learned by anyone using the suit, to gradually imprint the action or movement into our subconscious so that we no longer have to actively think or concentrate on what it is we are doing; we just do it. And we do it correctly, perfectly - every time, the first time."
5G will usher in a new age of wireless connectivity that far surpasses anything we've known to date, and it's already being trialled in the current TESLASUITs. Besides offering an alternative that could quickly come to replace existing systems such as MILES, integrating wireless capabilities that harness the power of 5G means even more versatility and applications in field operations.
"With future 5G capabilities, all of the systems we've covered could be integrated with uniforms and PPE, meaning the suits will be able to offer things like heart-rate monitoring - essentially proof-of-life - and climate control to mitigate external temperature affection," said Denis. "They could even provide the ability to identify whether the wearer is standing, or has fallen prone - the sort of tools and systems that will be incredibly useful for the likes of first responders, and especially so in low-visibility environments."
"We'll also be able to integrate tiny micro-cameras too," Dimitri added. "Sometimes a plan of a building doesn't reflect how it has changed over time, what current tenants or owners have done to alter the floorplan from those original documents. Other times a building may be damaged before you arrive, so those same plans won't be able to determine whether you can access certain areas or not; these micro-cameras will be able to map out the wearer's surroundings, providing this vital information in real-time. And they aren't just limited to our own visual spectrum either - they can use thermal vision and various other sensory array so that operators will have a much clearer, much better understanding of what's going on, regardless of how dark, dusty, smoky or misty it is."
VR Electronics also aim to make developing software for both the suit and glove as easy and as flexible as possible, providing the full suite of software for designing, and full integration of their suits with existing and new technologies.
"We're nationally compatible with all the major engines and we've also provisioned for an Open API which can connect to anything proprietary," Dimitri commented. "We're even platform agnostic, so any headset you throw at us, the TESLASUIT just works with it, no problem."
TESLASUIT STUDIO, the SDK for TESLA wearables, offers full control over the various suit and glove systems, and is compatible with many industry-standard development tools and engines, such as Unity, Unreal, CryEngine, Android, and even VBS, for which a specialised variant of the suit is currently in development, aiming to offer a full feature set within that engine.
The engineers and developers at TESLASUIT also understand that not every user will be familiar with all the new data these suits will offer them, nor how to analyse and draw conclusions from this information. As such, they are also "working on some new smart software - a set of add-ons which will help translate, analyse and correlate biometric data into meaningful data points, offering insight to users with limited or no existing medical knowledge. This way they can better understand the wealth of data they're receiving, and as such, infer more accurately how individuals are really responding at the biological level."
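Turning raw biometric streams into "meaningful data points" typically means deriving standard physiological metrics from them. As a hedged illustration only (this is textbook signal processing, not TESLASUIT's actual smart software), here is how heart rate and RMSSD - a common heart-rate-variability measure often used as a stress proxy - can be derived from raw ECG R-R intervals:

```python
# Illustrative example: deriving plain-language metrics (heart rate, HRV)
# from raw R-R intervals, the kind of translation the smart add-ons aim
# to do for users without medical training.
from math import sqrt

def heart_rate_bpm(rr_intervals_ms: list[float]) -> float:
    """Mean heart rate from R-R intervals given in milliseconds."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60_000.0 / mean_rr

def rmssd(rr_intervals_ms: list[float]) -> float:
    """Root mean square of successive R-R differences; lower values are
    commonly read as higher physiological stress or fatigue."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800.0, 810.0, 790.0, 805.0]  # example resting trace, in milliseconds
print(round(heart_rate_bpm(rr), 1))  # 74.9
```

A non-specialist instructor can act on "heart rate 75 bpm, HRV trending down" far more readily than on a raw electrocardiogram trace, which is the gap this kind of software closes.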
With suits offering data collection capabilities at such an in-depth and individual level, and offering such a wide range of software compatibility, it's understandable that some might be sceptical of how safe their data really is. But Dimitri confirmed definitively that "we don't collect any of the data ourselves. Whoever owns the suit owns the data," so privacy should be of no concern.
The TESLASUIT suite of products is undoubtedly impressive, and certainly makes us incredibly excited for the future of wearable tech. Between the extensive list of benefits it offers to the training and simulation scene, and the implications of future integration with 5G, it really does feel like we're moving into the realms of what was once considered science fiction.
We don't know exactly what the future will look like in 10 years' time, but we're fairly certain that the likes of TESLASUIT will still be there. And who knows, maybe when we look back at this post, we'll be wearing one...
Find out more about TESLASUIT at their CES2021 Event.
Images copyright TitanIM
Novatech supply custom, purpose-built hardware to some of the biggest names in the Security, Aviation, Defence and Marine industries.
If you have a project you'd like to discuss, please contact our dedicated Simulation and Training team using the form below, or call us on 02392 322500.
Posted in Training & Simulation
Author - Danny Adams
Published on 12 Jan 2021
Last updated on 11 Jan 2021