
What you need to know about Enflux. 

 

    Company

    Where is the company based?
    Hayward, CA.

    Do you generate your own content? Where does content come from?
    We generate some of our own content in-house; we made the VR dancing game. We are also partnering with top VR developers to integrate their experiences with EnfluxVR clothes and create a content package of VR experiences.

    How are you funded?
    We went through Y Combinator (Winter 2016), followed by $1.3M in angel funding. Lead investors are Chinese investors and Harvard Business School angel investors.

    Applications

    What can you use the clothes for?
    - Virtual reality games: the clothes give you full-body presence in VR. Social, sports, dance, adventure, and shooter games.
    - Sports and fitness apps
    - Health care

    Clothing

    How much does it cost?
    Shirt or pants alone: $199. Shirt and pants together: $349.

    What apps are available today?
    Our VR dancing game, Virtual Village People. Try it out at our booth!

    What does the suit come with?
    - Shirt and pants
    - Software development kit (Unity plug-in SDK)

    Software

    What user software and capabilities are available?
    Now: Unity on Windows desktop. Late 2016: Unity on Windows and Android. 2017: Unreal, Maya, 3ds Max, and iOS.

    Do you do inverse kinematics?
    No, we do forward kinematics, which is much more reliable than inverse kinematics. (See the sketch at the end of this section.)

    Are forward kinematics done in the application layer, the API, or the hardware?
    Currently they are done in the API. They are moving onto the firmware in the suit.

    Does the SDK give access to both raw data and filtered, corrected data?
    You can have access to either.

    Describe the Unity plug-in SDK.
    - Drag-and-drop interface to move Enflux-enabled characters into your Unity environment
    - Real-time filtered orientation angles (roll, pitch, yaw) from each IMU (20ms latency, 99% accuracy)
    - Functionality to connect data from the clothing to the Enflux-provided rigged humanoid model
    - Preloaded Enflux mesh and texture
    - Instructions on how to rig your own mesh and texture to the Enflux-provided humanoid model
    - Instructions on how to connect sensors to your own rigged humanoid model
    - Transmits to a Bluetooth dongle in the desktop
    - Tested on Windows 10 64-bit, Unity 5, and JRE 8.0

    How often does it read data?
    250 Hz.

    What range does the product work over?
    Bluetooth, up to 40 feet for real-time transmission. Data is stored on board the suit for up to 30 minutes if you are out of range.

    When will you have iOS? Why not now?
    We are targeting early 2017. It's Bluetooth-related: we wrote a custom Bluetooth package to receive the data from our sensors on Android devices, and we would have to write a similar one for iOS. It's doable, but we expect it would require a solid month of development time on our end to get it up and running with all bugs worked out, and we want to make sure the SDKs on Windows and Android are well supported before we move on to iOS.

    Do you support spatial mobility / positional tracking?
    For now, we get this from the HTC Vive and Oculus headsets. The suit itself does not do positional tracking yet.

    Can we attach custom IMUs to the suit?
    We have a Unity plug-in SDK (see the Developers section of www.enfluxvr.com). If you're in Unity, you should be able to integrate any custom IMUs you have.

    Can you provide raw accelerometer data (plus gyro if possible)?
    Yes. You get either the raw data or the filtered data.

    Do you handle synchronization?
    Yes. We process data from all ten sensors and time-synchronize them.

    What systems do you integrate with?
    The Unity plug-in SDK now. It works with the HTC Vive and Oculus, and will also work with Android, Samsung Gear VR, and iOS.

    What handles the processing of the data?
    The computer or phone for now. We are moving processing onboard to the clothing.

    Any white papers, data sheets, etc.?
    No, not yet.

    Is there potential for the system to be wired, e.g. for connection to a computer?
    It operates via Bluetooth, so there is no need for wires.
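    The two recurring technical points above, forward kinematics and per-IMU orientation angles, fit together roughly as in the Unity sketch below. This is a minimal illustration, not the actual Enflux SDK API: IOrientationSource, PoseDriverSketch, and the segment names are hypothetical stand-ins.

        using UnityEngine;

        // Hypothetical stand-in for the SDK's data source. The real plug-in
        // exposes filtered roll/pitch/yaw per IMU; its exact API differs.
        public interface IOrientationSource
        {
            // Filtered (roll, pitch, yaw) in degrees for a named body segment.
            Vector3 GetRollPitchYaw(string segment);
        }

        public class PoseDriverSketch : MonoBehaviour
        {
            public Transform chest;      // bones of a rigged humanoid model
            public Transform upperArmL;
            public Transform forearmL;

            public IOrientationSource Sensors { get; set; } // wired up elsewhere

            void Update()
            {
                if (Sensors == null) return;
                // Forward kinematics: each IMU reports the orientation of the
                // segment it sits on, so each bone is rotated directly. No
                // solver has to infer joint angles from an end-effector
                // position, which is why it is more reliable than inverse
                // kinematics.
                Apply(chest, "chest");
                Apply(upperArmL, "upperArmL");
                Apply(forearmL, "forearmL");
            }

            void Apply(Transform bone, string segment)
            {
                Vector3 rpy = Sensors.GetRollPitchYaw(segment);
                // Unity's Euler arguments are (x, y, z) = (pitch, yaw, roll).
                bone.rotation = Quaternion.Euler(rpy.y, rpy.z, rpy.x);
            }
        }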

    Drift & Errors

    At what rate do absolute positioning (skeletal position) errors accumulate over time?
    We have an adaptive Kalman filter that updates over time, which reduces the accumulation of errors. We do still have issues with permanent magnetic distortions.

    Any mechanisms to counter drift?
    The same adaptive Kalman filter: because it updates over time, it keeps errors from accumulating. (A generic sketch of the idea follows below.)

    Have you tested drift over long periods of time, like 15-20 minutes?
    Yes. When we are clear of magnetic distortions, we have no detectable drift over time.
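    To make the drift-correction idea concrete, here is a minimal one-axis Kalman-style filter: the gyro is integrated in the prediction step (where drift would otherwise accumulate), then the estimate is pulled toward an absolute reference angle, such as one derived from the accelerometer and magnetometer. This is a generic textbook sketch, not Enflux's actual adaptive filter; the class and its noise parameters are illustrative assumptions.

        // One-axis Kalman-style filter for a single orientation angle.
        public class AngleKalman1D
        {
            private double angle;       // filtered angle estimate (degrees)
            private double p = 1.0;     // variance of the estimate
            private readonly double q;  // process noise: how fast the gyro drifts
            private readonly double r;  // measurement noise of the reference angle

            public AngleKalman1D(double processNoise, double measurementNoise)
            {
                q = processNoise;
                r = measurementNoise;
            }

            // gyroRate in deg/s, referenceAngle in degrees, dt in seconds.
            public double Update(double gyroRate, double referenceAngle, double dt)
            {
                // Predict: dead-reckon from the gyro. Uncertainty grows with
                // time, which is exactly where uncorrected drift comes from.
                angle += gyroRate * dt;
                p += q * dt;

                // Correct: blend in the absolute reference, weighted by how
                // much we trust it relative to the current estimate.
                double k = p / (p + r);               // Kalman gain
                angle += k * (referenceAngle - angle);
                p *= 1.0 - k;
                return angle;
            }
        }

    A magnetically distorted reference angle would bias the correction step, which is why permanent magnetic distortions remain an issue even with such a filter.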

    Calibration

    What types of calibration are done?
    How long does calibration take?
    Is there a calibration process? If so, at what layer is it done, and where is the calibration stored: the application, the API, or the hardware?

    Scale

    How would the system scale with multiple users? Is there a limit on users in the same area, or does the update rate decrease with the number of users?
    Since we are only transmitting a small amount of data, the system scales well with multiple users. (A rough bandwidth estimate follows below.)

    Latency?
    Latency is ~60ms right now; an upgrade in the next 1-2 months will bring it down to 15ms.

    Do you do hand tracking?
    We do not. We make a shirt and pants; the hand is modeled as an extension of the wrist. The best hand-tracking gloves in development that we've seen are ManusVR, if you're looking to integrate those as well!

    What separates you from XSens?
    We have the advantage in price, ease of use, and mobility. XSens starts at $12,000; Enflux shirt and pants DevKits are $349. Our setup is very simple: literally put on a shirt and pants. And ours connects to any Bluetooth-enabled device. The tradeoff is that we lose a bit of accuracy versus XSens: they are 99.9% accurate, we are 99%.
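    As a back-of-the-envelope check of the "small amount of data" claim: the FAQ states ten sensors sampled at 250 Hz with three orientation angles each. Assuming each angle is sent as a 32-bit float (an assumption, not a published figure), the per-user stream is tiny.

        using System;

        class BandwidthSketch
        {
            static void Main()
            {
                const int sensors = 10;        // ten IMUs per shirt-and-pants set
                const int anglesPerSample = 3; // roll, pitch, yaw
                const int bytesPerAngle = 4;   // assumed 32-bit float per angle
                const int hz = 250;            // sample rate from the FAQ

                int bytesPerSecond = sensors * anglesPerSample * bytesPerAngle * hz;
                Console.WriteLine($"~{bytesPerSecond / 1000.0} KB/s per user"); // ~30 KB/s
            }
        }

    At roughly 30 KB/s per user before any framing overhead, each user's stream is a small fraction of typical Bluetooth throughput, which is consistent with the scaling answer above.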