AI Vision for Time-in-Motion
Vision AI for High-Touch Workflow Improvement
Subtask management enables high-touch, ergonomically intensive workflows to be optimised.
Time-in-motion studies are a longstanding tool for designing manufacturing processes, but the traditional approach is tedious and manual.
MotionLogic™ automates this process, enabling the optimisation of high-touch industrial workflows.
We transform video into structured datasets through Scene Understanding, and our Event Builder interface lets you define customisable events.
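The structured dataset can be pictured as a table of timed, labelled events. The sketch below is purely illustrative and assumes hypothetical field names (camera_id, event_type, start_s) and a hypothetical event definition format; it is not MotionLogic's actual schema or Event Builder API.

```python
from dataclasses import dataclass

@dataclass
class WorkflowEvent:
    """One row of a structured dataset derived from video (illustrative)."""
    camera_id: str    # which camera observed the event
    operator_id: str  # pseudonymised operator identifier
    event_type: str   # e.g. "tool_on", "repositioning", "wait"
    start_s: float    # event start, seconds from shift start
    end_s: float      # event end, seconds from shift start

    @property
    def duration_s(self) -> float:
        return self.end_s - self.start_s

# A customisable event defined through an Event Builder style interface
# might map a label to detection criteria (entirely hypothetical):
custom_events = {
    "ladder_climb": {"detector": "pose", "condition": "on_ladder"},
    "welder_arc_on": {"detector": "brightness", "condition": "arc_flash"},
}
```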
Transforming Video into Comprehensible Reports
Our Scene Understanding technology transforms video into structured datasets. We present these as comprehensible reports, which can be used to inform Lean initiatives.
Hypothetical Case Study
A reduction in waiting and excess motion lowers the COGS of an ergonomically intensive workflow, yielding margin enhancement.
Inference Capability
Tool-On Time
Visual Inference of Tool-On Time (e.g. Welder Arc-On)
Repositioning Time
Visual Inference of Repositioning Time (e.g. Time spent climbing a ladder)
Reloading Time
Visual Inference of Reloading Time (e.g. Time spent reloading a tool)
Preparation Time
Visual Inference of Preparation Time (e.g. Time spent preparing a tool)
Wait Time
Visual Inference of Wait Time (e.g. Time spent waiting for materials to arrive)
Spaghetti Analysis
Visual Inference of time spent moving between workstations
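To show how such categorised time segments roll up into a time-in-motion report, here is a minimal, hypothetical sketch; the category names and durations are illustrative inputs, not output from MotionLogic.

```python
from collections import defaultdict

# Hypothetical labelled segments inferred from video: (category, seconds).
segments = [
    ("tool_on", 312.0),
    ("repositioning", 95.5),
    ("reloading", 48.0),
    ("preparation", 120.0),
    ("wait", 210.0),
    ("between_workstations", 64.5),  # spaghetti analysis
]

# Roll segments up into per-category totals.
totals = defaultdict(float)
for category, duration in segments:
    totals[category] += duration

observed = sum(totals.values())

print(f"Observed time: {observed:.0f} s")
for category, seconds in sorted(totals.items(), key=lambda kv: -kv[1]):
    share = 100 * seconds / observed
    print(f"  {category:<22} {seconds:7.1f} s  ({share:4.1f}%)")
print(f"Tool-on (value-adding) share: {100 * totals['tool_on'] / observed:.1f}%")
```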
Pixel Privacy
Traversal has developed camera technology to preserve the privacy of individuals in visual scenes.
We pseudonymise our data and remove individually identifying features, including by blurring faces.
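As an illustration of the face-blurring step only, a common approach using OpenCV is sketched below; the file name and detector choice are assumptions, and this is not Traversal's actual privacy pipeline.

```python
import cv2

def blur_faces(frame, face_cascade):
    """Blur detected faces in a single video frame (illustrative only)."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        region = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur removes identifying facial detail.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return frame

# Load OpenCV's bundled frontal-face detector.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture("workcell.mp4")  # hypothetical input file
while True:
    ok, frame = capture.read()
    if not ok:
        break
    frame = blur_faces(frame, cascade)
    # ...downstream scene-understanding steps would consume `frame` here.
capture.release()
```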