Replies: 9 comments 5 replies
-
CC @lrapetti @claudia-lat @S-Dafarra @traversaro @DanielePucci
-
Hi @torydebra Thanks for opening this discussion.
Let me give a brief overview of our infrastructure, which can hopefully help you understand how it can be useful for your purpose. The wearables library and the human-dynamics-estimation project are the two main components of the infrastructure.

Wearables library

Thanks to the work done by @diegoferigo and @lucaTagliapietra, the wearables library provides general sensor interfaces that can be implemented to represent a set or combination of sensors constituting a wearable device. An example of such a wearable device is the XsensSuit. We have been primarily working with the whole-body Xsens suits, i.e. Xsens Link (the wired system) and Xsens Awinda (the wireless system), which comprise 17 distributed IMUs (Reference). However, instead of using Xsens MVN Motion Capture Studio to set up, configure, and run the whole-body suit, we opted to use the Xsens SDK (version 2018.03) and developed the XSensMVN library to perform the setup and configuration and to access the sensor data from the system. The XsensSuit wearable device accesses the various sensors' data through the XSensMVN library, and implements the relevant sensor interfaces of the wearables library. The encapsulated sensors data from the
An example of a single wearable data point from the
So, to summarize in bullet points:
The different devices involved in this pipeline are launched through this yarprobotinterface configuration file. We refer to this part of our infrastructure as the PRODUCER layer, as indicated in the schema picture, because this layer consists of all the components that produce the
NOTE: We run this part of the infrastructure on a Windows machine, as the Xsens SDK is not supported on Ubuntu.

Human Dynamics Estimation

We refer to this part of our infrastructure as the CONSUMER layer, as indicated in the schema picture, because this layer consists of all the components that consume the
The first device at the CONSUMER layer is the IWearRemapper, which reads the
At this layer we organize the components mainly into three categories:
Similar to the producer layer, the different devices involved in this consumer layer are launched through a yarprobotinterface configuration file.
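For readers unfamiliar with yarprobotinterface, such a configuration file is an XML document that lists the devices to instantiate and how to attach them. The fragment below is a hypothetical, heavily abbreviated sketch of the shape of such a file: the device types, names, and parameters are illustrative, not the actual project configuration.

```xml
<!-- Hypothetical sketch of a yarprobotinterface configuration in the
     style used by the producer/consumer layers. Device types and
     parameter names are illustrative, not the actual project files. -->
<robot name="HumanWearableProducer" build="0" portprefix="/Wearables">
    <!-- Device wrapping the hardware driver that exposes the sensor data -->
    <device type="xsens_suit" name="XsensSuitDevice">
        <param name="suit-config">FullBody</param>
    </device>
    <!-- Wrapper device publishing the wearable data on a YARP port -->
    <device type="iwear_wrapper" name="XsensSuitWrapper">
        <param name="dataPortName">/XsensSuit/WearableData/data:o</param>
        <action phase="startup" level="5" type="attach">
            <paramlist name="networks">
                <elem name="XsensSuitLabel">XsensSuitDevice</elem>
            </paramlist>
        </action>
    </device>
</robot>
```

The actual files linked above are the authoritative reference; this sketch only shows the device/wrapper/attach structure they share.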
I hope the above description gives a better picture of our infrastructure. Having said that, the first check that would be very useful is to see how your hardware can be configured through the Xsens SDK. In our case, when we use the whole-body Xsens Awinda suit, we use these configuration options. Once you have identified that, we can start a discussion on what modifications are needed in the XSensMVN library to access the data from your sensors.
-
Thanks @Yeshasvitvs for your overview. Things are becoming clearer. I will list some questions:
I am not sure about this. What exactly is the SDK you are speaking about? If it is the firmware+software that works as a driver for each device and provides the data to the system, this is available on Linux, at least for the Awindas. You can see it on the download page under the Awinda box; here are the release notes.
<!-- XsensSuit configuration to be used. Available values are:-->
<!-- FullBody - FullBodyNoSternum - FullBodyNoHands - FullBodyNoSternumNoHands -->
<!-- LowerBody - UpperBody - UpperBodyNoSternum - UpperBodyNoHands - UpperBodyNoSternumNoHands -->
<!-- FullBodyNoShoulders - SingleDevice - PelvisSternum - LowerBodyPlus -->
<param name="suit-config">FullBody</param>

What exactly is each configuration? How many sensors does each configuration need? Is there a configuration that uses 3 Awinda IMUs for each arm, to track only the arms? What is the minimum number of sensors needed for the Human State Provider to track an arm?
-
I think that the wearables repo is using the so-called "Xsens MVN SDK" that exposes the XME API, which has since been discontinued by Xsens; you can probably find it among the old downloads on the Xsens website or by asking Xsens. Probably @Yeshasvitvs or @lrapetti know more about this. As far as I know, that SDK is different from the MTw one, and it only runs on Windows. For more info, feel free to check the code in https://github.com/robotology/wearables/blob/master/XSensMVN/src/XSensMVNDriverImpl.cpp .
The code is open source, so feel free to check it! In particular, that parameter is read in https://github.com/robotology/wearables/blob/77362fc692bb9549b168b10c1200c473637bd59e/devices/XsensSuit/src/XsensSuit.cpp#L661 and passed to the Xsens SDK in https://github.com/robotology/wearables/blob/77362fc692bb9549b168b10c1200c473637bd59e/devices/XsensSuit/src/XsensSuit.cpp#L777 .
-
As of the last time we spoke to a support representative at Xsens (September 2020), it is no longer available from their website download section. You need to contact them to get the SDK.

UPDATE
The calibration is done at the Producer layer, through a YARP RPC port opened by the IXsensMVNControlWrapper, which implements the IXsensMVNControl interface.
-
Thanks for all the replies, I don't have enough thumbs 👍 for everyone :D. I may have found the MVN Studio Developer Toolkit available for download. It is under the "tool" section of the MVN Analyze/Anymate download box here. It is an exe file that contains some folders with C++ code. No readme or any other kind of docs is present, so I am not sure what it is, but it may be the SDK you are speaking about. Following the chat we had yesterday:
-
@torydebra the link is not correct, please update it
@torydebra, @lrapetti, and @Yeshasvitvs had a quick chat yesterday to better understand what @torydebra is looking for. We gathered that @torydebra has a setup using https://github.com/qleonardolp/xsens_mtw_driver-release to get data from each individual Xsens sensor, and is looking for a way to combine the sensor measurements to perform human arm tracking. So, we explained the inverse kinematics algorithms used for whole-body human motion tracking that are part of HumanStateProvider, in particular the global IK based on the iDynTree inverse kinematics.
@torydebra the link you pointed here corresponds to a commit, please be sure to use either the master or devel branch.
As an alternative plan, we suggested checking with Xsens support to learn how the hardware @torydebra has can be run with the latest SDK, which hopefully also comes with example application code for configuration and data logging from the Xsens Awinda system.
-
Hi all, your work is great, but it may be too extensive for our simpler purposes. So we are moving to a different solution, but we will keep an eye on your work in any case for the future. Thanks @Yeshasvitvs and all of you for the nice support!
-
Sure @torydebra, thanks for the update
-
Hi, the Internet brought me here.
I am Davide, an IIT PhD student working on teleoperation. I got in contact with some of you by email.
I am looking for a framework to perform some human motion tracking, and I was wondering about the feasibility of doing this with your framework.
What we have are 6x XSens MTw Awinda trackers (link). As you might know, the XSens firmware already provides time synchronization and data from each sensor (orientation (Euler/quaternion), angular velocity, acceleration, and magnetic field).
What I am looking for is a way to combine the data from the multiple devices attached to the human body to track human motions. We would like to teleoperate the robot starting from the operator's end-effector position, velocity, and acceleration (still deciding whether all of these are necessary). We are starting with a single operator arm, but we soon want to use the other arm as well (3 trackers per arm?). In the future we may also add the legs.
There is XSens software, MVN Analyze (link), but it is only available for Windows, and we would like to run everything on a single Linux (Ubuntu) machine.
I saw that your software already supports the XSens suit and has an interface for ROS, which are the things we are looking for.
Thanks for your work and for any support!
fyi @liesrock @ntsagarakis