
How Sensor Data is Powering ML and AI in Next-Generation Robotics

From traditional industrial robotic systems to today’s latest collaborative robots (or “cobots”), robots rely on sensors that generate increasingly massive volumes of highly varied data.

This data can help build better machine learning (ML) and artificial intelligence (AI) models that robots rely on to become “autonomous,” making real-time decisions and navigating in dynamic real-world environments. Industrial robots are typically placed in “caged” environments; a human entering that environment stops robot movement for safety reasons. But limiting human/robot collaboration prevents the realization of many benefits. Robots with autonomous capabilities would enable the safe and productive co-existence of humans and robots.

Sensing and intelligent perception are important in robotic applications because the effective performance of robotic systems – particularly ML/AI systems – greatly depends on the performance of the sensors that provide these systems with critical data. Today’s wide range of increasingly sophisticated and accurate sensors, combined with systems that can fuse all of this sensor data together, is giving robots increasingly good perception and awareness.

Content Summary

The growth of AI
Pushing AI processing for robotics to the edge
The decentralized AI model
Cobots
Logistics robots
Last-mile delivery robots
Time-of-flight (ToF) optical sensors
Temperature and humidity sensors
Ultrasonic sensors
Vibration sensors
mmWave sensors
TI solutions handle the entire AI robotics signal chain

The growth of AI

Robotic automation has been a revolutionary technology in the manufacturing sector for some time, yet the integration of AI into robotics is poised to transform the industry over the next few years. According to research from MarketsandMarkets, the robotic automation technology market will grow from about $270 million in 2016 to just under $4.9 billion by 2023.

Figure 1. The different senses of a robot.

Consulting firm Accenture’s Institute for High Performance researched the impact of AI in 12 developed economies. The study found that AI could double annual economic growth rates by 2035, improve labor productivity by as much as 40%, and change the nature of work by spawning a new hybrid relationship between human and machine, driving creativity, innovation and growth in collaboration with AI-driven robots.

This paper explores some of today’s key trends in robotics and automation and the most important technologies that will tie AI to the data that it needs to be intelligent. It will also address how Texas Instruments (TI) sensors are being used (and fused) into AI systems, and how TI offers a broad portfolio of sensor and data-processing components that covers the entire AI system signal chain for robotics.

Pushing AI processing for robotics to the edge

There are two main parts of ML: training and inference, which can be executed on completely different processing platforms. The training side usually occurs offline on desktops or in the cloud, and entails feeding large data sets into a neural network; neither real-time performance nor power is an issue during this phase. The result of the training phase is a trained AI system that, when deployed, can perform a specific task, such as inspecting a bottle on an assembly line, counting and tracking people within a room, or determining whether a bill is counterfeit.
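
As a concrete, if simplified, illustration of this split, the sketch below trains a tiny classifier offline, saves the resulting weights, and then loads them for inference, mirroring the bottle-inspection example. The data, model and file name are hypothetical placeholders, not any particular toolchain.

```python
# Minimal sketch of the train/deploy split, using NumPy only.
# All names and the toy data are hypothetical illustrations.
import numpy as np

# ---- Training phase (offline, desktop/cloud): fit a tiny linear classifier ----
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))          # e.g., 4 features extracted from bottle images
y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0).astype(float)  # synthetic labels

w = np.zeros(4)
for _ in range(500):                     # plain gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (p - y) / len(y)

np.save("model_weights.npy", w)          # "deploy" the trained parameters

# ---- Inference phase (on the edge, real time): load weights and classify ----
w_edge = np.load("model_weights.npy")
sample = rng.normal(size=4)              # features from one new bottle
is_defective = 1.0 / (1.0 + np.exp(-(sample @ w_edge))) > 0.5
print("reject" if is_defective else "pass")
```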

But in order for AI to fulfill its promise in many industries, the fusion of sensor data that happens during inference (the part that executes the trained ML algorithm) must happen in near-real time. Thus, designers need to put ML and deep-learning models at the edge, deploying the inference into an embedded system.

For example, a cobot is built to work in close collaboration with humans. It relies on data from proximity sensors as well as vision sensors to ensure that it protects the humans around it from harm while supporting them in challenging activities. All of this data needs to be processed in real time, yet the cloud is not fast enough for the real-time, low-latency response that the cobot needs. To address this bottleneck, today’s advanced AI systems are pushed to the edge, which in the case of robots means on board.
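
A minimal sketch of that pattern might look like the following, where two sensor modalities are fused conservatively inside a deadline-checked control loop. The sensor stubs, the fusion rule and the 10 ms budget are illustrative assumptions, not a real cobot stack.

```python
# Hedged sketch of on-board, real-time sensor fusion for a cobot.
import time

CYCLE_BUDGET_S = 0.010          # hypothetical 10 ms control-loop deadline

def fused_obstacle_distance(proximity_m: float, vision_m: float) -> float:
    # Conservative fusion: trust whichever modality sees the obstacle closer.
    return min(proximity_m, vision_m)

def control_cycle(read_proximity, read_vision):
    start = time.monotonic()
    d = fused_obstacle_distance(read_proximity(), read_vision())
    elapsed = time.monotonic() - start
    assert elapsed < CYCLE_BUDGET_S, "missed real-time deadline"
    return d

# Stub sensor reads standing in for real drivers:
print(control_cycle(lambda: 0.9, lambda: 1.1))  # -> 0.9 (proximity wins)
```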

“Humans can’t generally get near traditional industrial robots while they are operating without peril.”

The decentralized AI model

This decentralized AI model relies on highly integrated processors that have:

  • A rich peripheral set for interfacing to various sensors.
  • High-performance processing capability to run machine-vision algorithms.
  • A way to accelerate deep learning inference.

All of these capabilities also have to work efficiently, with a relatively low-power and small-size footprint, in order to exist at the edge.

Power- and size-optimized “inference engines” are increasingly available as ML grows in popularity. These engines are specialized hardware offerings aimed specifically at performing ML inference.

An integrated system-on-chip (SoC) is often a good choice in the embedded space because, in addition to housing various processing elements capable of running deep learning inference, an SoC integrates many of the components necessary to cover the entire embedded application. Some integrated SoCs include display, graphics, video acceleration and industrial networking capabilities, enabling a single-chip solution that does more than just run ML/AI.

Sitara AM57x processors are good examples of processors running AI at the edge. These processors have multiple high-speed peripherals for interfacing to multiple sensors – like video, time of flight (ToF), light detection and ranging (LIDAR) and millimeter-wave (mmWave) sensors – as well as dedicated hardware in the form of C66x digital signal processor cores and embedded vision engine subsystems to accelerate AI algorithms and deep learning inference.

Let’s look at some of the top robotic trends today.

Cobots

Humans can’t generally get near traditional industrial robots while they are operating without peril. Cobots are, in contrast, designed to operate safely alongside humans, moving slowly and gracefully.

As defined in ISO/TS 15066, a collaborative robot is a robot that can be used in a collaborative operation, meaning that the robot and humans work concurrently within a defined workspace for production (this excludes robot-plus-robot systems, or humans and robots sharing a space but operating at different times). Defining and deploying cobots therefore means foreseeing potential collisions between physical portions of the robot (or virtual extensions such as a laser) and the operators.

Figure 2. Cobots working alongside humans in a factory environment.

This makes sensors that can determine the operator’s exact position and velocity all the more important.

Cobot makers must implement a high level of environmental sensing and redundancy into robot systems to quickly detect and prevent possible collisions. Integrated sensors connected to a control unit will sense an impending collision between a robot arm and a human or other object, and the control unit will turn the robot off immediately. If any sensor or its electronic circuit fails, the robot also turns off.
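
In pseudocode terms, that fail-safe policy reduces to something like the sketch below; the distance threshold, the sensor representation and the return strings are hypothetical.

```python
# Hedged sketch of the fail-safe shutdown logic described above.
SAFE_DISTANCE_M = 0.5   # hypothetical minimum clearance

def control_step(proximity_readings_m):
    """Stop the robot if any sensor reports an impending collision,
    or if any sensor reading is invalid (treated as a sensor fault)."""
    for d in proximity_readings_m:
        if d is None or d != d or d < 0:   # missing/NaN/invalid -> fail safe
            return "STOP: sensor fault"
        if d < SAFE_DISTANCE_M:
            return "STOP: obstacle too close"
    return "RUN"

print(control_step([1.2, 0.8, 0.3]))   # -> STOP: obstacle too close
print(control_step([1.2, None, 2.0]))  # -> STOP: sensor fault
```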

As cobots become more capable in demanding industrial environments, manufacturers will increasingly add them to the factory floor, particularly those manufacturers with strict return-on-investment objectives and a desire to improve product cycle times.

Logistics robots

Logistics robots are mobile units that operate in environments where people may or may not be present, such as warehouses, distribution centers, ports or campuses. Logistics robots fetch goods and bring them to a packing station, or transport goods from one building of a company site to another; some are capable of picking and packing goods as well. These robots typically move within a particular environment and need sensors for localization, mapping and to prevent collisions (especially with humans).

Until recently, most logistics robots used predefined routes; they are now capable of adjusting their navigation based on the location of other robots, humans and packages. Ultrasonic, infrared and LIDAR sensing are all enabling technologies.

Because of the robot’s mobility, the control unit is located on board, often with wireless communication to a central remote controller. Logistics robots are now adopting advanced technologies such as ML logic, human-machine collaboration and environmental analysis.

Rising labor costs and stringent government regulations are contributing to the higher adoption of logistics robots. Their popularity is also rising because of decreases in the cost of equipment and components like sensors, and in the cost of (and time required for) integration. The global logistics robots market is expected to grow at a compound annual growth rate of more than 28% from 2018 to 2022, according to market research by Technavio.

Last-mile delivery robots

In a product’s journey from warehouse shelf to customer doorstep, the “last mile” of delivery is the final step of the process: the point at which the package finally arrives at the buyer’s door. In addition to being key to customer satisfaction, last-mile delivery is expensive and time-consuming.

Last-mile delivery costs are a substantial percentage of the total shipping cost: 53% overall. As such, making the last mile of delivery more efficient has become a focus for developing and implementing new robotics technologies that can drive process improvements and increased efficiency.

Time-of-flight (ToF) optical sensors

These sensors rely on the principle of ToF and use a photodiode (a single sensor element or an array) along with active illumination to measure distance. The light waves reflected from obstacles are compared with the transmitted wave to measure the delay, which in turn represents distance. This data then helps create a 3D map of the object.
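
The underlying arithmetic is simple: distance is the speed of light times half the round-trip delay, as the short worked example below illustrates.

```python
# Worked example of the ToF relationship: d = c * t / 2.
C = 299_792_458.0             # speed of light, m/s

def tof_distance_m(delay_s: float) -> float:
    return C * delay_s / 2.0  # divide by 2: light travels out and back

# A 6.67 ns round-trip delay corresponds to roughly 1 m:
print(f"{tof_distance_m(6.67e-9):.3f} m")   # ~1.000 m
```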

TI’s ToF chipsets support sensing that goes beyond proximity detection to enable next-generation machine vision. The chipsets allow for maximum flexibility to customize designs for robot vision and other applications, with tools that include an evaluation module and a highly configurable camera development kit, which provides a 3D location of each pixel for accurate depth maps that aid customization.

Discrete solutions leverage leading-edge topologies and semiconductor technologies such as time-to-digital converters and gallium nitride (GaN), as demonstrated in the LIDAR Pulsed Time of Flight Reference Design and the Nanosecond Laser Driver Reference Design for LIDAR.

A 3D ToF sensor like TI’s OPT8320 enables robots to determine the exact angle of a screw and then fine-tune the screwdriver so that screws consistently align without human intervention. A ToF-based analog front end like the OPT3101 can help identify the distance of a robotic arm to a target and help in accurate positioning. For higher-resolution 3D sensing, flexible structured lighting – as enabled with DLP® technology and demonstrated in the 3D Machine Vision Reference Design Based on AM572x Processor with DLP Structured Light – can help bring resolutions to micrometers or below.

“Vision sensors may not work if the robot is blinded by a bright light or finds itself in a very dark environment.”

Temperature and humidity sensors

Many robots need to measure the temperature and sometimes the humidity of both their environment and their components – including motors and main AI motherboards – to ensure that they are operating in safe ranges. This is especially important for robots, because when a motor is under a heavy load, it can draw a lot of power and heat up. Accurate temperature monitoring protects motors, while better temperature accuracy enables motors to be driven harder before hitting safety margin limits. In addition, just about every other sensor is sensitive to temperature and benefits from thermal compensation. By knowing the temperature, you can correct for the temperature drift of other sensors to get more accurate measurements.

In equatorial factories and tropical climate zones, temperature and humidity sensors can predict dew points for electronic system protection and predictive maintenance.
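
One common way to estimate the dew point from temperature and relative humidity is the Magnus approximation, sketched below. The constants shown are one widely used parameter set, not values taken from any particular TI device.

```python
# Hedged sketch of dew-point prediction via the Magnus approximation
# (a = 17.62, b = 243.12 °C is a common constant pair; accuracy is
# roughly a few tenths of a degree over ordinary factory conditions).
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# At 30 °C and 80% RH, condensation starts near 26 °C:
print(f"{dew_point_c(30.0, 80.0):.1f} °C")
```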

Ultrasonic sensors

Vision sensors may not work if the robot is blinded by a bright light or finds itself in a very dark environment. By transmitting ultrasonic waves and listening for echoes that reflect back from objects (similar to how bats maneuver), ultrasonic sensors perform excellently in dark or bright conditions, overcoming the limitations of optical sensors.

Ultrasonic sensing is a low-cost, slower-speed alternative to radar for robots that don’t need to reach high speeds. Ultrasonic sensing is more reliable than optical ToF for obstacle avoidance, as ultrasonic sensing is not affected by the amount of available light reflected off of obstacles. For example, ultrasonic sensing provides the ability to sense glass or other transparent surfaces because it uses sound waves rather than light to detect objects.
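
The ranging arithmetic parallels optical ToF, except that the speed of sound, which varies with air temperature, replaces the speed of light. The small sketch below uses a common linear approximation for the speed of sound in air; the sample delay is illustrative.

```python
# Sketch of ultrasonic echo ranging: distance from the round-trip echo
# time, with the speed of sound compensated for air temperature
# (tying in with the temperature sensing discussed earlier).
def speed_of_sound_mps(temp_c: float) -> float:
    return 331.3 + 0.606 * temp_c      # linear approximation for air

def ultrasonic_distance_m(echo_delay_s: float, temp_c: float = 20.0) -> float:
    return speed_of_sound_mps(temp_c) * echo_delay_s / 2.0

# A 5.8 ms echo at 20 °C is roughly 1 m away:
print(f"{ultrasonic_distance_m(5.8e-3):.2f} m")
```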

Vibration sensors

Industrial vibration sensing is a crucial part of the condition monitoring necessary for predictive maintenance. Integrated electronics piezoelectric (IEPE) sensors are the most common vibration sensors used in industrial environments.

Vibration sensors enable a robot to detect whether some of its mechanics are damaged or worn, facilitating preventive maintenance before operations become endangered. Using AI/ML can take the accuracy of these predictions to the next level.
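
A common approach, sketched below with synthetic signals and a hypothetical alarm threshold, is to compare the current vibration spectrum against a healthy baseline and flag new spectral peaks.

```python
# Hedged sketch of FFT-based condition monitoring on synthetic data.
import numpy as np

FS = 10_000                                     # sample rate, Hz
t = np.arange(FS) / FS                          # one second of samples

def spectrum(signal):
    return np.abs(np.fft.rfft(signal)) / len(signal)

healthy = np.sin(2 * np.pi * 50 * t)            # 50 Hz rotation fundamental
worn = healthy + 0.4 * np.sin(2 * np.pi * 137 * t)  # new bearing-defect tone

baseline = spectrum(healthy)
current = spectrum(worn)
excess = current - baseline
peak_bin = int(np.argmax(excess))
if excess[peak_bin] > 0.05:                     # hypothetical alarm threshold
    freq = peak_bin * FS / len(t)               # bin spacing is FS/N = 1 Hz here
    print(f"anomalous vibration near {freq:.0f} Hz -> schedule maintenance")
```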

mmWave sensors

mmWave sensors use radio waves and their echoes to determine the direction and distance of a moving object by measuring three components: velocity, angle and range. This enables robots to take more predictive actions based on how fast objects are approaching the sensor. Radar sensors deliver excellent performance in the dark and can sense through materials like drywall, plastic and glass.

Complementary metal-oxide semiconductor (CMOS) mmWave radar sensors enable the highly accurate measurement of not only the distance of objects in their field of view, but also the relative velocities of any obstacles.
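
In a typical frequency-modulated continuous-wave (FMCW) radar, for example, range falls out of the beat frequency between the transmitted and received chirps. The sketch below uses illustrative chirp parameters rather than the specifications of any particular sensor.

```python
# Hedged sketch of FMCW range measurement: the beat frequency between
# transmitted and received chirps is proportional to distance.
C = 3.0e8                 # speed of light, m/s
BANDWIDTH = 4.0e9         # chirp sweep bandwidth, Hz (illustrative)
CHIRP_TIME = 40e-6        # chirp duration, s (illustrative)
SLOPE = BANDWIDTH / CHIRP_TIME   # sweep slope, Hz per second

def range_from_beat_m(beat_freq_hz: float) -> float:
    # f_beat = slope * (2R / c)  =>  R = c * f_beat / (2 * slope)
    return C * beat_freq_hz / (2.0 * SLOPE)

print(f"{range_from_beat_m(2.0e6):.2f} m")   # a 2 MHz beat -> ~3 m
```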

TI’s highly integrated single-chip mmWave radar sensors are small and lightweight, and enable real-time processing to occur at the sensor edge, often removing the need for additional processors. mmWave technology enables designs that are three times smaller and half the weight of miniature LIDAR range finders.

This lowers bill-of-materials costs, makes for a small sensor, and reduces the processing (measured in millions of instructions per second) needed from the central controller compared to vision-based systems. Mounted directly behind enclosure plastics without external lenses, apertures or sensor surfaces, the sensors are extremely rugged and can meet the Ingress Protection 69K standard.

mmWave sensors were initially expensive and large, and required multiple discrete components. Now, however, driven by TI’s integration of radio-frequency, processing and memory resources onto a single monolithic CMOS die, it is reasonable to say that mmWave sensors will complement or displace established sensing technologies in robotics over the coming years.

More advanced radar sensing systems can ensure very accurate odometry through the addition of an inertial measurement unit, sometimes augmented with GPS. mmWave sensors can supply additional odometry information for robots that traverse uneven terrain or experience a lot of chassis pitch and yaw by sending chirp signals toward the ground and measuring the Doppler shift of the return signal.
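
A simplified version of that computation, with illustrative geometry and numbers rather than real device parameters, might look like this:

```python
# Hedged sketch of Doppler-based odometry: a ground-pointed chirp's
# Doppler shift gives radial speed, projected onto the travel axis and
# integrated into distance. All numbers are illustrative.
import math

F_CARRIER = 77e9          # mmWave carrier, Hz
C = 3.0e8                 # speed of light, m/s
TILT_DEG = 45.0           # assumed sensor tilt toward the ground

def ground_speed_mps(doppler_shift_hz: float) -> float:
    radial = doppler_shift_hz * C / (2.0 * F_CARRIER)  # v_r = f_d * c / (2 f0)
    return radial / math.cos(math.radians(TILT_DEG))   # project onto travel axis

# Integrate speed samples (10 Hz update rate) into an odometer reading:
doppler_samples_hz = [363, 360, 366, 358]              # measured shifts
distance_m = sum(ground_speed_mps(f) * 0.1 for f in doppler_samples_hz)
print(f"~{distance_m:.2f} m traveled")
```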

Figure 3. Robotic arm using mmWave sensors.

TI solutions handle the entire AI robotics signal chain

The signal chain for adaptive, self-learning AI robotic systems requires a fusion of diverse sensor data in real time. A cobot’s sensors are in some ways like the five human senses, all of which are critical for fully autonomous operation. Each sense uses different parts of the brain and different amounts of the brain’s processing; vision, for example, requires more brain power than hearing or smelling.

Analogously, robots will have more and more sensors connected to the AI and ML systems running inside them. The key challenge for AI robotic system manufacturers is getting multiple AI systems to run and communicate together, forming a hybrid ML system driven by data from heterogeneous sensors.

Robot developers depend on advanced integrated circuit solutions to minimize the headaches of circuit design and certification, speeding the development of products that they can deliver to industrial customers quickly. The ICs that enable advances in industrial robots must provide precise sensing, high-speed sensor signal conversion, fast computation/signal processing for real-time response, and high-speed communications. ICs also enable high-efficiency, small-form-factor power supplies in conjunction with advanced semiconductors like GaN field-effect transistors. New ICs also bring new standards to the industry, such as single twisted-pair Ethernet and power over single twisted pair, which reduce cabling complexity and enhance reliability.

TI can provide everything from the sensors to the processors needed for next-generation robotics, with a broad product and solutions portfolio that spans the entire AI robotics signal chain. From sensor input to actuator or motor output, from individual equipment units to factory-level control and beyond, TI solutions handle the signal chain as well as the processing and power required for robotic applications. Products include features like reinforced isolation and are tested and qualified for use in harsh industrial environments.

By Matthieu Chevrier, Texas Instruments
