Niologic performed a detailed analysis of the network traffic within the warehouse system to fully understand the workflow. The network data exchanged between subsystems was automatically exported to Google Cloud Storage for analysis and then transformed with Python on Google Cloud Dataflow.
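As a minimal sketch of the transformation step, the function below parses one exported network-traffic record into a flat, BigQuery-ready row; such a function could serve as the per-element logic of a Dataflow (Apache Beam) `ParDo`. All field names (`robot_id`, `type`, `payload`, `ts`) are illustrative assumptions, not the actual export schema.

```python
import json
from datetime import datetime, timezone

def transform_record(raw_line):
    """Parse one exported network-traffic record (a JSON line) into a
    flat dict suitable for loading into a BigQuery table.
    Field names are hypothetical, not the real export schema."""
    msg = json.loads(raw_line)
    return {
        "robot_id": msg["robot_id"],
        "message_type": msg["type"],
        # Payload size is useful for spotting communication inefficiencies.
        "payload_bytes": len(msg.get("payload", "")),
        # Normalize the epoch timestamp to an ISO-8601 UTC string.
        "sent_at": datetime.fromtimestamp(msg["ts"], tz=timezone.utc).isoformat(),
    }
```

Keeping the transform a pure function of one record makes it trivially parallelizable, which is what allows Dataflow to scale it across workers.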
After processing, the transformed data was stored in Google BigQuery for further analysis and model training. The analysis revealed a number of bugs in the communication between the robots and the control plane (the warehouse management system). Throughput could be maximized by resolving multiple communication inefficiencies and by improving error handling.
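One common pattern for the kind of error handling mentioned above is retrying transient communication failures with exponential backoff instead of dropping the message. The sketch below illustrates the idea; `send_with_retry` and its parameters are hypothetical and do not reflect the warehouse system's actual API.

```python
import time

def send_with_retry(send, message, max_attempts=4, base_delay=0.1):
    """Deliver a message to the control plane, retrying transient
    failures with exponential backoff (0.1s, 0.2s, 0.4s, ...).
    `send` is any callable that raises ConnectionError on failure.
    Names and signature are illustrative assumptions."""
    for attempt in range(max_attempts):
        try:
            return send(message)
        except ConnectionError:
            # Give up only after the final attempt.
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

Backoff of this kind smooths out load spikes on the control plane, which is one way dropped or duplicated messages between robots and the warehouse management system can be reduced.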
Furthermore, some items in the warehouse are classified as not pickable by the robots (due to their physical dimensions or image-recognition limitations) and therefore require human interaction. Niologic successfully trained a model, using Google AutoML as well as BigQuery combined with external libraries, to predict the probability of an item being picked successfully by learning from historic picking events.
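To make the learning task concrete, the toy sketch below fits a logistic-regression model on historic picking events using only the standard library. It is a stand-in for illustration only, not the AutoML or BigQuery model actually trained; the feature layout (numeric item attributes plus a 0/1 pick outcome) is an assumption.

```python
import math

def _sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_pick_model(events, lr=0.5, epochs=500):
    """Fit a tiny logistic-regression model with stochastic gradient
    descent. Each event is (features, picked): `features` is a list of
    numeric item attributes (e.g. normalized dimensions) and `picked`
    is 1 if the robot picked the item successfully, else 0."""
    n_features = len(events[0][0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, y in events:
            p = _sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def pick_probability(model, features):
    """Predicted probability that an item with `features` is pickable."""
    w, b = model
    return _sigmoid(sum(wi * xi for wi, xi in zip(w, features)) + b)
```

In practice the same idea maps directly onto a managed service: the historic events already live in BigQuery, so a classifier over them can be trained there (or via AutoML) rather than hand-rolled as above.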