
A Step-by-Step Guide to the Robot Arm Demo

Cora Zhang 30 May 2022

In this article:

  • Prerequisites
  • Data Collection
  • Data Segmentation
  • Building the Model
  • Model Performance & Live Classification

In industrial environments, different machine operating behaviors can be detected using machine learning models in conjunction with various sensors. For example, the operational condition of a machine may be monitored using an IMU sensor and the machine’s vibration signature, which changes when the machine deviates from normal operation or encounters a specific fault condition. Similarly, a machine’s operational condition may be detected using its motor’s current signature.
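To make this concrete, below is a minimal Python sketch of a vibration-signature check of the kind described above. The RMS feature, baseline, and tolerance are illustrative assumptions for this sketch only, not part of Qeexo AutoML.

```python
# Minimal sketch: flagging abnormal machine vibration from IMU data.
# The baseline statistics and tolerance are illustrative assumptions.
import numpy as np

def vibration_rms(window: np.ndarray) -> float:
    """Root-mean-square of a vibration window: a simple 'signature'."""
    return float(np.sqrt(np.mean(window ** 2)))

def is_abnormal(window: np.ndarray, baseline_rms: float, tolerance: float = 0.3) -> bool:
    """Flag the machine when its vibration RMS drifts far from the healthy baseline."""
    return abs(vibration_rms(window) - baseline_rms) > tolerance * baseline_rms

# Example: learn a baseline from healthy operation, then check a live window.
healthy = np.random.normal(0.0, 1.0, 2048)            # stand-in for normal vibration
baseline = vibration_rms(healthy)
print(is_abnormal(np.random.normal(0.0, 1.6, 2048), baseline))  # likely True
```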

Assume we are operating a smart warehouse optimized for an e-commerce company. In the warehouse, we employ several “intelligent robot movers” to help us move objects from spot to spot. In this demonstration, we used a miniaturized “intelligent robot mover” powered by Qeexo AutoML to determine whether the robot has gripped an object.

This blog is intended to show you how to use Qeexo AutoML to build your own “intelligent robot mover” from end to end, including data collection, data segmentation, model training and evaluation, and live testing.

Please enjoy!

Prerequisites

Set Up

  1. Assemble the robot arm by following the assembly video guide.
  2. Connect the current sensor
    1. Remove the cover from the gripper motor by unscrewing the four screws securing it to the body of the motor.
    2. Put back the screws after removing the cover. This ensures that the motor’s gears do not get misaligned.
  3. Connect the current sensor to the gripper motor
    1. When you remove the motor cover, you will see a red wire within the purple boxed area, which looks similar to the wire in the blue boxed area in the image below. De-solder the red wire you find in the purple boxed area.
    2. Cut a piece of red wire from the previously purchased wires. Connect one end of the wire to the ‘IP+’ terminal on the current sensor module (shown as circle 1 in the image below), then solder the other end of the red wire to the motor (shown as circle 2 in the image below).
    3. Cut a piece of black wire from the previously purchased wires. Connect one end of the wire to the ‘IP-’ terminal on the current sensor module (shown as circle 1 in the image below), then solder the other end of the black wire to the PCB next to the motor (shown as circle 2 in the image below).
    4. While other combinations of connections are possible, it is best to stick to a common convention across our demo setups to ensure consistency in the data we collect.
  4. Connect the current sensor to the STWIN
    1. Connect the current sensor to the STWIN board
    2. Connect the STWIN to your laptop
Make sure the STWIN has previously been installed and set up on your laptop. If not, refer to this link.

Data Collection

  • Create a project with the STWIN (MCU) using multi-class classification
  • Collecting data
    • Click ‘Collect training data’ on the ‘Data’ page.
    • Step 1 – Create new environment
      • We assigned ‘office’ as the name of the new environment since the data was collected in our office.
    • Step 2 – Sensor configuration
      • Make sure to build an environment containing ONLY the current sensor, sampled at 1850 Hz.
    • Step 3 – Collect 2 datasets
      • What data to collect?
        • We are building a multi-class model with 3 classes: ‘open’, ‘object’, and ‘no_object’. This model is meant to detect the state of the gripper and whether the gripper has gripped an object.
        • We collected 2 datasets:
          • Collection label: OBJECT
            Duration: 600 seconds
            We collected 600 seconds of data of the gripper repeatedly gripping a wooden block and releasing it.
          • Collection label: NO_OBJ
            Duration: 600 seconds
            We collected 600 seconds of data of the gripper repeatedly gripping and releasing without an object.
          • Below is a screenshot of what each of the datasets looks like. (A short sketch for sanity-checking these recordings follows this list.)
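To give a concrete sense of the collected data, here is a minimal Python sketch for loading and sanity-checking the two recordings. It assumes each dataset was exported as a CSV with a single ‘current’ column sampled at 1850 Hz; the file and column names are hypothetical, not Qeexo AutoML’s actual export format.

```python
# Minimal sketch: sanity-check the two recordings.
# Assumes hypothetical CSV exports with a 'current' column sampled at 1850 Hz.
import pandas as pd

FS = 1850  # current-sensor sampling rate used in this demo (Hz)

for path, label in [("object.csv", "OBJECT"), ("no_obj.csv", "NO_OBJ")]:
    df = pd.read_csv(path)
    seconds = len(df) / FS
    print(f"{label}: {len(df)} samples (~{seconds:.0f} s), "
          f"mean current {df['current'].mean():.3f}")
```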

Data Segmentation

Before we segment the data, it is necessary that we understand it. Below is a zoomed-in view of the two datasets annotated with their respective classes; this is how we intend to segment our data:

  • OBJECT
    For dataset “OBJECT”, we segmented the data into 2 classes, labeled “OPEN” (blue areas) and “OBJECT” (yellow areas).
  • NO_OBJ
    For dataset “NO_OBJ”, we segmented the data into 2 classes, labeled “OPEN” (blue areas) and “NO_OBJECT” (red areas). (A small sketch of this slicing step follows the list.)
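Conceptually, segmentation slices each recording into labeled time intervals. Below is a minimal Python sketch of that slicing, assuming interval boundaries (in seconds) were noted while inspecting the plots; the file name and interval values are placeholders, not the actual annotations.

```python
# Minimal sketch: slice a recording into labeled pieces by time interval.
# The CSV file name and the interval annotations below are placeholders.
import pandas as pd

FS = 1850  # samples per second

def segment(df: pd.DataFrame, intervals):
    """Return (label, samples) pieces cut from the recording by time."""
    return [(label, df.iloc[int(start * FS):int(end * FS)])
            for start, end, label in intervals]

object_run = pd.read_csv("object.csv")  # hypothetical export of dataset "OBJECT"
annotations = [(0.0, 2.1, "OPEN"), (2.1, 4.3, "OBJECT"), (4.3, 6.4, "OPEN")]
for label, piece in segment(object_run, annotations):
    print(label, len(piece), "samples")
```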

Building the Model

  1. We selected the 2 datasets mentioned above, along with their segmentations.
  2. We selected “Automatic Sensor and Feature Group Selection”. By selecting Automatic Selection, Qeexo AutoML will evaluate different combinations of features and choose the best performing feature group for you.
  3. For “Inference Settings”, we chose “Enter Manually” and set the “INSTANCE LENGTH” to 2,000 ms and the “CLASSIFICATION INTERVAL” to 200 ms.
    A Classification Interval of 200 ms means that the software will make a prediction once every 200 ms, while an Instance Length of 2,000 ms means the software will use 2,000 ms of incoming data to make each prediction. We set the Instance Length to 2,000 ms because the gripper takes about 2 seconds to finish each movement (for example, opening or closing). Making the Instance Length large enough ensures that a single classification covers the whole movement. (See the sketch after this list for what these two settings amount to.)
  4. Lastly, we selected two algorithms to train: “Gradient Boosting Machine (GBM)” and “Random Forest (RF)”. Note that we previously trained all available algorithms on the platform, and these two were the best in terms of training time, model size, and latency.
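To illustrate what these settings amount to, here is a minimal Python sketch using scikit-learn. The hand-rolled features and synthetic data are assumptions for illustration only; Qeexo AutoML’s actual feature groups and training pipeline are more sophisticated and not shown here.

```python
# Minimal sketch: windowing arithmetic plus GBM/RF training and
# sliding-window classification. Features and data are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

FS = 1850                      # current-sensor sampling rate (Hz)
INSTANCE = int(2.000 * FS)     # 2,000 ms instance length -> 3,700 samples
INTERVAL = int(0.200 * FS)     # 200 ms classification interval -> 370 samples

def features(window: np.ndarray) -> np.ndarray:
    """Toy feature vector for one instance: mean, std, RMS, peak-to-peak."""
    return np.array([window.mean(), window.std(),
                     np.sqrt(np.mean(window ** 2)), np.ptp(window)])

def windows(signal: np.ndarray):
    """Yield one 2,000 ms instance every 200 ms (overlapping sliding window)."""
    for end in range(INSTANCE, len(signal) + 1, INTERVAL):
        yield signal[end - INSTANCE:end]

# Train on pre-segmented, labeled pieces (synthetic stand-ins here).
rng = np.random.default_rng(0)
X = np.vstack([features(rng.normal(m, 1.0, INSTANCE))
               for m in (0, 1, 2) for _ in range(30)])
y = np.repeat(["OPEN", "OBJECT", "NO_OBJECT"], 30)
gbm = GradientBoostingClassifier().fit(X, y)
rf = RandomForestClassifier().fit(X, y)

# Live-style classification: one prediction every 200 ms of incoming data.
stream = rng.normal(1.0, 1.0, 10 * FS)   # 10 s of stand-in current data
for w in windows(stream):
    print(gbm.predict(features(w).reshape(1, -1))[0])
```

At 1,850 Hz, a 2,000 ms instance is 3,700 samples and a 200 ms interval is 370 samples, so consecutive instances overlap by roughly 90%.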

Model Performance & Live Classification

Model Performance

Below is a summary of our chosen models’ performance. We flashed the GBM model to hardware for live classification.

Live Classification

  • Push the GBM model to hardware by clicking the arrow button under “PUSH TO HARDWARE” to flash the model to the STWIN.
  • Click Live test
  • Operate the gripper
  • Watch a video of the demo at this link.

Conclusion

From the image under the ‘Model Performance’ section, we can see that with AutoML’s sensor and feature selection enabled, we ended up with two well-performing, high-accuracy machine learning models.

Finally, we flashed the compiled binary back to the sensor (i.e., the STWIN) and used AutoML’s live classification feature to check whether the classifier produces the expected output. As shown in the video, the final model performs very well and can accurately recognize whether the robot has an object within its gripper.


Qeexo and Bosch Enable Developers to Quickly Build and Deploy Machine-Learning Algorithms to Bosch AI-Enabled Sensors

Qeexo / Bosch Sensortec GmbH 26 May 2022

Machine learning algorithms created using Qeexo’s AutoML can now be deployed on Arduino Nicla Sense ME with Bosch BHI260AP and BME688 sensors

May 25, 2022

Qeexo, developer of Qeexo AutoML, and Bosch Sensortec GmbH, a technology leader in MEMS sensing solutions, today announced that machine learning algorithms created using Qeexo’s AutoML can now be deployed on the Arduino Nicla Sense ME with Bosch BHI260AP and BME688 sensors. Qeexo AutoML is an automated machine-learning (ML) platform that accelerates the development of tinyML models for the Edge.

Bosch’s BHI260AP self-learning AI sensor with integrated IMU, and BME688, a 4-in-1 gas sensor with AI, significantly reduce overall system power consumption while supporting a wide range of applications for different segments of the IoT market.

Using Qeexo AutoML, machine learning (ML) models that would otherwise run on the host processor can be deployed in and executed by the BHI260AP and BME688. These highly efficient ML models, which overcome traditional die-size-imposed limits on computational power and memory size, extend to applications that transform and improve lives. For example, they can be used for monitoring environmental parameters, including humidity and Air Quality Index (AQI), and for capturing information embedded in motion, from person-down detection systems to fitness apps that check posture. Devices running these models typically last longer between charges while still providing actionable information.

“Qeexo’s collaboration with Bosch enables application developers to quickly build and deploy machine learning algorithms on Bosch’s AI integrated sensors,” said Sang Won Lee, CEO of Qeexo. “Machine learning solutions running on Bosch’s AI integrated sensors are lightweight and do not consume MCU cycles or additional system resources as seen with traditional embedded ML.”

“Bosch Sensortec and Qeexo are collaborating on machine learning solutions for smart sensors and sensor nodes. We are excited to see more applications made possible by combining the smart sensors BHI260AP and BME688 from Bosch Sensortec and AutoML from Qeexo,” said Dr. Stefan Finkbeiner, CEO at Bosch Sensortec.

About Qeexo

Qeexo is the first company to automate end-to-end machine learning for embedded edge devices (Cortex M0-M4 class). Our one-click, fully automated Qeexo AutoML platform allows customers to leverage sensor data to rapidly build machine learning solutions for highly constrained environments, with applications in industrial, IoT, wearables, automotive, mobile, and more. Over 300 million devices worldwide are equipped with AI built on Qeexo AutoML. Solutions built with Qeexo AutoML deliver high performance and are optimized for ultra-low latency, ultra-low power consumption, and an incredibly small memory footprint.

About Bosch Sensortec GmbH

Bosch Sensortec GmbH is a fully owned subsidiary of Robert Bosch GmbH dedicated to the world of consumer electronics, offering a complete portfolio of micro-electro-mechanical systems (MEMS) based sensors and solutions that enable mobile devices to feel and sense the world around them. Bosch Sensortec develops and markets a broad portfolio of MEMS sensors, solutions, and systems for applications in smartphones, tablets, wearable devices, and various products within the Internet of Things (IoT).

https://www.eejournal.com/industry_news/qeexo-and-bosch-enable-developers-to-quickly-build-and-deploy-machine-learning-algorithms-to-bosch-ai-enabled-sensors/