Home / Frequently Asked Questions

Thank you for your interest in Enflux! Below are answers to the questions we most frequently receive about our product’s capabilities and how to get the most out of it.

More information about Enflux clothing can be found on our Product Page.

More information about Enflux software plug-ins and SDKs can be found on our Developers Page.

If you’ve read through the FAQ below and have a question that is not covered, reach out to us at [email protected] and we’ll answer within 48 hours.

Clothing and hardware products

How many sensors are in the suit?

Eleven in total: 5 in the shirt, 5 in the pants, and 1 in the headband. See the clothing diagram on our product page here for more info.

How easy are the clothes to get in and out of?

It really is just putting on a tight shirt and pair of pants, so about two to five minutes. You do need to make sure the IMUs sit correctly on your body, so expect to make minor adjustments to the arms and legs after putting on the clothes. You can see a step-by-step walkthrough of putting on the clothes here.

Is Enflux Clothing washable?

It certainly is! Our sensor mesh network can be removed from the clothing, allowing you to throw the garment in the washing machine on a delicate cycle with cold water. Our video here goes through the whole process.

What size should I get?

We have a detailed sizing chart here to make sure that you receive the best fitting garment. The garment needs to be flush against the skin to ensure the most accurate tracking possible.

Do you have plans for gloves and bands to capture movement of the hands, feet, and head?

Yes. We have straps and bands for the hands, feet, and head in development.

Is there potential for the system to be wired, e.g. for connection to a computer?

At this time, we do not have support for a wired connection. The system transmits data at 66 Hz. If you are looking for faster data transmission and have special requests, feel free to reach out to [email protected] and ask.

How do I replace a sensor if it’s not responding?

If a sensor is not responding or is transmitting incorrectly, make sure that the suit has been charged. If it has been charged and is still not responding, recalibrate the sensor. If problems still persist, reach out to us at [email protected] and we can resolve the situation from there.

Motion Data and Accuracy

What is the accuracy for each sensor position (including orientation and mounting errors)?

Each IMU sensor is accurate to ±2 degrees (out of 360). Incorrect sensor placement on the body reduces visual accuracy for real-time animated characters. For real-time animation, we have an automatic step called “alignment” that compensates for reasonable differences in sensor placement on the body.

How do you calculate the data?

Each sensor directly samples accelerometer, gyroscope, and magnetometer readings close to 1,000 times per second. Acceleration is in m/s^2, gyroscope rate is in rad/s, and the magnetometer reading is normalized to the Earth’s magnetic field. The accelerometer, magnetometer, and gyroscope data are passed through Kalman filters to obtain an orientation quaternion for each sensor, which we compute 125 times per second. In our Unity SDK, the X/Y/Z positions of limbs on an animated character are derived from these quaternions by our solver algorithm.
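A full quaternion Kalman filter is beyond the scope of a FAQ, but the core idea of fusing gyroscope and accelerometer readings can be sketched with a one-axis complementary filter. This is an illustrative simplification, not our production filter, and the function name and parameters are hypothetical:

```python
import math

def complementary_filter(samples, dt, alpha=0.98):
    """Fuse gyro rate (rad/s) and an accelerometer pitch estimate (rad).

    Integrating the gyro is smooth but drifts over time; the accelerometer
    is noisy but drift-free, so a small weight (1 - alpha) on it slowly
    pulls the integrated angle back toward truth.
    samples: iterable of (gyro_rate, (ax, az)) tuples.
    Returns the filtered angle after each sample."""
    angle = 0.0
    history = []
    for gyro_rate, (ax, az) in samples:
        accel_angle = math.atan2(ax, az)  # pitch implied by gravity vector
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
        history.append(angle)
    return history
```

With alpha = 0.98 the angle tracks fast gyro motion, while the accelerometer term cancels out slow gyro drift over many samples.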

How stable are the readings over time?

The readings are very stable when no dense metal or magnets are nearby, with no drift visible to the human eye for hours at a time. Our Kalman filter self-corrects over time.

Is calibration required before each use?

We recommend recalibrating if it has been a long time since the last use, or if a different person is using the suit than the one it was last calibrated for. Here is a one-minute video describing the calibration process in Unity.

Do you use any mechanisms to counter drift?

We use an adaptive Kalman filter that updates over time, which reduces the accumulation of errors. We have tested drift over long periods of time, and when clear of magnetic distortions, we encounter no detectable drift.

Software Features

Do you support Spatial Mobility/Positional Tracking?

For now, we get this from HTC Vive and Oculus. The suit itself does not do positional tracking yet.

How do you handle suit to user alignment?

Suit-to-user alignment happens while the user stands in a known pose, at which point an alignment factor is calculated.
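As an illustration of the idea (a sketch of a common approach, not necessarily the exact Enflux implementation): an alignment offset can be computed as the quaternion that maps a sensor's measured orientation in the known pose onto its expected orientation, and then applied to every subsequent reading.

```python
def q_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    # Conjugate; for unit quaternions this is the inverse rotation.
    w, x, y, z = q
    return (w, -x, -y, -z)

def alignment_offset(q_expected, q_measured):
    """Offset satisfying q_offset * q_measured == q_expected,
    captured once while the user holds the known pose."""
    return q_mul(q_expected, q_conj(q_measured))

def apply_alignment(q_offset, q_live):
    # Correct a live sensor reading with the stored offset.
    return q_mul(q_offset, q_live)
```

Capturing the offset once per sensor in the known pose, then left-multiplying every live reading, compensates for how the garment happens to sit on that particular body.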

Does the SDK include any software for tailoring the model to specific user body parameters?

Features such as scaling a 3D character’s rig size to the user are in our software roadmap.

What is the range that the product works over?

The range of the suit is within 30 feet of your device’s Bluetooth receiver for real-time transmission.

Do you handle synchronization?

Yes. We process data from all sensors and time-synchronize them.

How long can I record for?

As long as you want while within Bluetooth range of your computer. If you move out of range, the sensor module stays active and tries to reconnect to your device.

What kind of data can be pulled out of the software in terms of the kinematics of movement?

Our Unity SDK provides filtered sensor orientations in quaternions. Our C API provides raw data such as accelerometer, gyroscope, and magnetometer measurements at a slower polling rate.

Do you do inverse kinematics?

We use a fusion of forward and inverse kinematics for real-time animated characters. Forward kinematics is used whenever possible for the most accurate results.
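To illustrate what forward kinematics means here, a deliberately simplified planar sketch (not our actual solver): each joint position is found by walking outward along the limb chain and accumulating each segment's rotated offset.

```python
import math

def forward_kinematics_2d(origin, bones):
    """Joint positions of a planar limb chain, proximal to distal.

    bones: list of (absolute_angle_radians, segment_length) pairs,
    e.g. upper arm then forearm. Each joint's position is the previous
    joint plus the segment vector rotated to its absolute angle."""
    joints = [origin]
    x, y = origin
    for angle, length in bones:
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        joints.append((x, y))
    return joints
```

Because each sensor directly reports its segment's orientation, positions computed this way follow the measured rotations exactly; inverse kinematics, by contrast, solves backwards from a target position and is only needed when a position constraint (such as a tracked hand) must be met.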

What file formats can I export the suit data as?

We have two software integrations for exporting the data transmitted by the suit:

Blender: records and exports FBX and .blend files
Unity: records and exports .enfl files

Future Platform and Software Integrations

Are there plans to integrate with Maya, 3ds Max, iClone, and other motion capture software?

We currently integrate directly into Blender and Unity. You can use exported FBX animations from Blender with Maya, MotionBuilder, iClone, and Unreal.

When will you have iOS?

We are targeting an iOS release for the middle of 2018.

Can I import into Unreal Engine?

You can record FBX files in Blender, export them, and then import into Unreal.

Are there plans for Mac support?

We are targeting the middle of 2018 for Mac support.

Shipping and Returns

When do you plan to deliver?

We started shipping suits to developers in July 2016. We are fulfilling orders on a rolling basis. The next wave of Enflux suits is shipping at the beginning of September 2017.

Do you ship globally and what is the cost?

We do! When you order our product here, shipping location and cost are all handled online. Please allow for extra shipping time when expecting your order.

Can I return the suit?

We accept returns if the suit is still in working condition. The suit also carries a 12-month warranty: if a sensor stops working, we will replace the entire sensor network for free.