It would be really cool to have some motion tracking capability within LumaFusion.
Just a thought: why not use the built-in capabilities of iOS, like we see with Animoji, in the Clips app, and so on?
So instead of tracking an object live through the camera, you could track an object (a face, some text, or similar) in a video on the timeline.
That would be really awesome, and it might not be too much work if iOS's own APIs could be used for this?
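For anyone curious, here's a rough sketch of what that could look like using Apple's Vision framework. This is just my own hypothetical example of the general approach (seed a tracker with a face detection, then follow it frame by frame), not anything from LumaFusion itself; pulling frames out of a timeline clip (e.g. with AVAssetReader) is assumed to happen elsewhere:

```swift
import Vision
import CoreVideo

/// Hypothetical example: track a face across the frames of a clip
/// using Vision's object-tracking requests.
final class FaceTracker {
    private var sequenceHandler = VNSequenceRequestHandler()
    private var trackingRequest: VNTrackObjectRequest?

    /// Detect a face in the first frame to seed the tracker.
    func seed(with firstFrame: CVPixelBuffer) throws {
        let detectRequest = VNDetectFaceRectanglesRequest()
        try sequenceHandler.perform([detectRequest], on: firstFrame)
        guard let face = detectRequest.results?.first else { return }
        trackingRequest = VNTrackObjectRequest(detectedObjectObservation: face)
    }

    /// Track the seeded face in a subsequent frame; returns its bounding
    /// box in Vision's normalized coordinates, or nil if tracking is lost.
    func track(in frame: CVPixelBuffer) throws -> CGRect? {
        guard let request = trackingRequest else { return nil }
        try sequenceHandler.perform([request], on: frame)
        guard let observation = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        // Feed the latest observation back in so the tracker follows motion.
        trackingRequest = VNTrackObjectRequest(detectedObjectObservation: observation)
        return observation.boundingBox
    }
}
```

The returned bounding box per frame could then drive a keyframed position on a title or cropped clip, which is basically the motion-tracking feature being asked for.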
It would be a win-win: a win for the LumaTouch developers, and a win for LumaFusion users.