Artificial intelligence will turn cobots from collaborators into true partners

By Mark Patrick, Mouser Electronics

It’ll be a true game-changer when cobots can be taught what to do without complex programming. It’ll be even more exciting when they can learn independently, based on their experiences. For that we turn to artificial intelligence (AI), and there are already examples today of some advanced cobots showing what’s possible through learning and autonomy.

Human-Like Learning Qualities

We’ve always collectively imagined robots with a human-like brain, making their own decisions. Cobots could make this happen: optimising what they’re doing in real time, based on what their sensors are telling them, and, when they encounter a situation they don’t know how to handle, asking a human and acting on verbal instructions rather than waiting to be reprogrammed. AI can help achieve this kind of autonomy and more in cobots.

Of course, some highly autonomous robots are already out there, but these typically cost tens to hundreds of millions of dollars. For similarly capable cobots to become a viable option for industry, the price needs to fall to a fraction of that, and this is where the growing cobotics industry and academia are working hard.

AI: Start Small and Expand

AI is a broad area, but for cobotics the first focus is machine learning, whereby a cobot gradually learns from its experiences to improve its abilities. This branch of AI uses algorithms that let the cobot make predictions, which in turn allow it to make its own choices.
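As a minimal sketch of the idea, consider a cobot that learns from past grasp attempts which grip force is likely to succeed. The data, feature names and values here are hypothetical, and a real system would learn from far more sensor data than this.

```python
# Hypothetical sketch: a cobot learns from past grasp attempts which
# grip force is likely to succeed for a given object size and weight.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [object_width_mm, object_weight_g, grip_force_n]
# Label: 1 if the grasp succeeded, 0 if the object slipped.
past_attempts = np.array([
    [40, 120, 5.0],
    [40, 120, 2.0],
    [80, 300, 8.0],
    [80, 300, 3.0],
])
outcomes = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(past_attempts, outcomes)

# Predict the success probability of a new grasp before attempting it.
candidate = np.array([[60, 200, 6.0]])
print("Predicted success probability:", model.predict_proba(candidate)[0, 1])
```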

Perception is another must-have: the machine needs to use data from its sensors to produce a ‘vision’ of its surroundings, to which it can then respond. This is essential in ensuring cobots can operate safely alongside humans.
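A common way to turn raw sensor data into that kind of ‘vision’ is an occupancy grid: a map of which patches of space around the robot are blocked. The sketch below is a deliberately simplified, hypothetical version; the grid size, cell size and safety-zone logic are illustrative assumptions, not any particular cobot’s implementation.

```python
# Hypothetical sketch: fuse range-sensor detections into a 2D occupancy
# grid, the cobot's simplified "vision" of its surroundings.
import numpy as np

GRID_SIZE = 20        # 20 x 20 cells
CELL_SIZE_M = 0.1     # each cell covers 10 cm x 10 cm

grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=bool)

def mark_detection(x_m: float, y_m: float) -> None:
    """Mark the cell containing a detected obstacle as occupied."""
    col = int(x_m / CELL_SIZE_M)
    row = int(y_m / CELL_SIZE_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row, col] = True

def safety_zone_clear(radius_cells: int = 3) -> bool:
    """Check that nothing occupies a square zone around the cobot,
    assumed here to sit at the grid centre."""
    c = GRID_SIZE // 2
    zone = grid[c - radius_cells:c + radius_cells + 1,
                c - radius_cells:c + radius_cells + 1]
    return not zone.any()

mark_detection(1.1, 0.9)   # e.g. a person detected about a metre away
print("Safe to move at full speed:", safety_zone_clear())
```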

Cobots’ basic motions and object-handling don’t necessarily require AI, but if the robot needs to be able to navigate autonomously, localise, map or plan a journey, sensor-enabled AI becomes essential.
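To make the planning step concrete, the sketch below finds a route across an occupancy grid with a breadth-first search. This is a simplified stand-in for the far more capable planners real cobots use; the grid and obstacle layout are invented for illustration.

```python
# Hypothetical sketch: breadth-first route planning on an occupancy grid,
# a much-simplified stand-in for the planners real cobots use.
from collections import deque

def plan_route(occupied, start, goal):
    """Return a list of (row, col) cells from start to goal, avoiding
    occupied cells, or None if no route exists."""
    rows, cols = len(occupied), len(occupied[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and not occupied[nr][nc] and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable, e.g. a corridor is blocked

occupied = [[False] * 5 for _ in range(5)]
occupied[2][1] = occupied[2][2] = occupied[2][3] = True  # a wall mid-grid
print(plan_route(occupied, (0, 0), (4, 4)))
```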

Another area where AI plays a major role is natural language processing, which will mean cobots can converse with and learn from human operators. Basic verbalisation in cobots is already a reality, but there’s more to be done, drawing on several strands of AI.
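At its most basic, that verbal interface can be little more than keyword matching on a transcribed utterance, as the hypothetical sketch below shows. The command names and vocabulary are invented, and real systems draw on much richer language models.

```python
# Hypothetical sketch: a keyword-based command parser, the most basic
# form of a natural language interface for a cobot.
COMMANDS = {
    "pick": "PICK_UP_OBJECT",
    "place": "PLACE_OBJECT",
    "stop": "EMERGENCY_STOP",
    "resume": "RESUME_TASK",
}

def parse_instruction(utterance: str) -> str:
    """Map a spoken instruction (already transcribed to text) onto a
    cobot action, asking for clarification when nothing matches."""
    for word in utterance.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]
    return "ASK_OPERATOR_TO_REPHRASE"

print(parse_instruction("Please pick that bracket up"))  # PICK_UP_OBJECT
print(parse_instruction("Put everything down now"))      # ASK_OPERATOR_TO_REPHRASE
```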

Artificial neural networks will play an important role in the future as well. These aim to enable sophisticated learning without having to program the robot – much like the way the human brain operates. Artificial neural networks are extremely complex, with the main goal being to enable robots’ motors to respond appropriately to different inputs from their sensors, even when their surroundings change.
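The sketch below shows the shape of the idea at its smallest: a tiny feedforward network mapping sensor readings to motor commands. The layer sizes and sensor names are arbitrary assumptions, and the weights here are random rather than trained.

```python
# Hypothetical sketch: a tiny feedforward network mapping sensor readings
# to motor commands. Real cobot controllers are far larger and are
# trained rather than randomly initialised.
import numpy as np

rng = np.random.default_rng(0)

# Three sensor inputs -> five hidden units -> two motor outputs.
w1 = rng.normal(size=(3, 5))
w2 = rng.normal(size=(5, 2))

def motor_commands(sensors: np.ndarray) -> np.ndarray:
    """Forward pass: tanh hidden layer, linear output layer."""
    hidden = np.tanh(sensors @ w1)
    return hidden @ w2

# e.g. [distance_to_object_m, joint_angle_rad, gripper_force_n]
print(motor_commands(np.array([0.4, 1.2, 3.0])))
```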

Lastly, there’s deep learning, the most sophisticated form of machine learning, which is effectively a multi-layered (hence ‘deep’) neural network, also inspired by the human brain. Deep learning uses a series of trainable layers to recognise objects, with each layer building on the features extracted by the one before. It offers the potential to make developing and using algorithms straightforward, although the amount of processing power and data it requires means it’s a future aspiration in cobotics, rather than something achievable in the short term.
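As an illustration of what ‘multi-layered’ means in practice, here is a small image-recognition network defined with PyTorch. The layer sizes and the ten object classes are arbitrary choices for the sketch, and training such a model is where the heavy data and compute demands mentioned above come in.

```python
# Hypothetical sketch: a small multi-layer ("deep") network for object
# recognition, defined with PyTorch. Production models have many more
# layers and need large datasets and significant compute to train.
import torch
from torch import nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # learn low-level edges
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # learn part-like features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 10),                  # scores for 10 object classes
)

# A batch of one 64x64 RGB camera frame (random stand-in data).
frame = torch.randn(1, 3, 64, 64)
print(model(frame).shape)  # torch.Size([1, 10])
```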

State-Of-The-Art Cobotics Examples

Carnegie Mellon University’s Robotics Institute has developed autonomous mobile service robots that can move around the institute’s office buildings, navigating hallways, lifts and open spaces to deliver goods. They can transfer loads between themselves and communicate to optimise their delivery routes. This includes notifying one another if routes are blocked (so others can dynamically reroute to avoid them) or if a door to an office is closed (so another cobot due to make a delivery there knows to postpone the drop-off until the office occupant is back). They can even ask humans for help.

Areas where people are continually moving around or where furniture often gets shifted, such as a restaurant or hospital, are handled using an algorithm based on Episodic non-Markov Localisation (EnML). This lets the cobot make assumptions about the objects around it without having to store the large volumes of data that static mapping would require. Moreover, the cobot needn’t store a full history of everything it’s ever observed or done.
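The sketch below is a loose illustration of that episodic idea only, not the EnML algorithm itself: the robot keeps a bounded window of recent observations and discards the rest, rather than storing its full history. The class and its parameters are invented for illustration.

```python
# Loose illustration of the episodic idea (not the EnML algorithm itself):
# keep only a bounded window of recent observations instead of the
# robot's full history.
from collections import deque

class EpisodicMemory:
    def __init__(self, max_len: int = 100):
        self.episode = deque(maxlen=max_len)  # hard bound on stored data

    def observe(self, reading) -> None:
        self.episode.append(reading)

    def start_new_episode(self) -> None:
        """Called when the robot re-localises confidently; earlier
        observations no longer constrain its pose estimate."""
        self.episode.clear()

memory = EpisodicMemory()
for scan_id in range(250):
    memory.observe(("laser_scan", scan_id))
print("Stored observations:", len(memory.episode))  # capped at 100
```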

Elsewhere, the German Fraunhofer Institute for Computer Graphics Research has produced a cobot capable of independently scanning parts and then printing them in 3D in real time. The robot moves its scanning arm around the component, using algorithms to produce a 3D image. It then verifies the accuracy of the scan and prints the part. None of this requires programming, manual training or computer-aided design tooling.

Exciting Future for Cobots

These examples highlight what’s already possible, and as AI develops further, exciting and potentially revolutionary cobotic capabilities will follow, most likely one step at a time. The result will be cobots that aren’t simply collaborators but true partners.
