This week I spent quite a bit of time talking to people about digital twins that include skeletons and robots. For skeletons I’ve been working from real data in static JSON files – not yet resident in a time-series database – but for robots I’ve just been relying on simulated movements. Until today, that is.
Josh Cameron, an Autodesk Research colleague in Toronto, sent through a video he took of a robot while streaming its data to our time-series back-end. This helped me interpret the data (reasonably) correctly, at least for a first pass. You’ll notice the virtual robot is a different model from the physical one, but that’s a relatively minor detail at this stage.
This is a significant milestone for Dasher 360: we’re now able to display the current state of a robot in the Forge viewer, matched to its real-world equivalent at a given moment in time. There’s still quite a bit of plumbing work needed for this to happen – each joint has a unique sensor in the…
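To give a sense of the kind of plumbing involved, here’s a minimal sketch of how a single joint reading might drive the viewer-side model. To be clear, the function names and the joint structure below are my assumptions for illustration – they’re not Dasher 360’s actual implementation – but applying rotations to a joint’s fragments via the viewer’s fragment proxies is a common way to articulate a model in the Forge viewer.

```ts
// Rough sketch (not Dasher 360's actual code): drive one robot joint in the
// Forge viewer from a sensor reading, using the viewer's fragment proxies.
// Assumes the Forge viewer bundle is loaded, which provides the THREE global.
declare const THREE: any;

interface Joint {
  fragIds: number[]; // fragments that make up this link of the robot
  pivot: any;        // THREE.Vector3 – joint centre in model space
  axis: any;         // THREE.Vector3 – normalised rotation axis
  lastAngle: number; // last angle (radians) already applied in the viewer
}

function applyJointReading(viewer: any, joint: Joint, angle: number): void {
  // Anim transforms accumulate, so only apply the change since the last update.
  const delta = angle - joint.lastAngle;
  const quat = new THREE.Quaternion().setFromAxisAngle(joint.axis, delta);

  for (const fragId of joint.fragIds) {
    const frag = viewer.impl.getFragmentProxy(viewer.model, fragId);
    frag.getAnimTransform();

    // Rotate the fragment's offset about the joint pivot...
    const pos = new THREE.Vector3().copy(frag.position).sub(joint.pivot);
    pos.applyQuaternion(quat);
    pos.add(joint.pivot);
    frag.position = pos;

    // ...and compose the rotation with the fragment's current orientation.
    frag.quaternion.multiplyQuaternions(quat, frag.quaternion);
    frag.updateAnimTransform();
  }

  joint.lastAngle = angle;
  viewer.impl.sceneUpdated(true); // ask the viewer to redraw
}
```

In practice you’d call something like this for each joint whenever a new sample arrives from the time-series back-end – or when scrubbing to a particular timestamp – which is exactly the kind of joint-to-sensor mapping the plumbing work is about.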