May 3, 2022

By Keith Shaw

One of the biggest challenges for robot developers is creating dexterous grippers, or “hands,” that can perform tasks the way the human hand does. While advances in the space allow many grippers to grab objects like flat boxes or packages, several object types remain difficult for robots, such as slippery objects or those with variable weights.

Australia-based startup Contactile is hoping to change that through its development of new tactile sensor technology that incorporates friction and weight sensing, allowing grippers to better judge and grab objects and apply the proper amount of force for the task. The company has announced a $2.5 million seed round to develop its sensors, led by Silicon Valley’s True Ventures, with participation from Flying Fox Ventures, Radar Ventures and UNSW Founders (a program launched by UNSW Sydney).

The company’s technology can measure 3D forces and torques, partial slip and friction – capabilities that other sensors and robotic grippers often lack. “To hold an object without letting it slip from our grasp, we squeeze it with our fingers,” said Heba Khamis, Ph.D., a co-founder of Contactile and the company’s CEO. “When an object is heavier, we squeeze harder, and when an object is slippery, we squeeze harder. People know how hard to squeeze an object because they can feel the weight and the slipperiness by touching the object. But robots haven’t been able to feel these things before now.”
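The squeeze-harder behavior Khamis describes can be thought of as a feedback loop: raise grip force whenever incipient slip is sensed, and hold it otherwise. The sketch below illustrates that idea in Python; the `read_slip` and `set_force` interfaces, thresholds and gains are hypothetical placeholders, not Contactile’s actual API.

```python
# Hypothetical sketch of a slip-triggered grip-force controller.
# The sensor/gripper callables and all constants are illustrative,
# not Contactile's real interface.

def control_grip(read_slip, set_force, f_min=1.0, f_max=20.0,
                 gain=1.5, steps=100):
    """Increase grip force each time partial slip is sensed; hold otherwise."""
    force = f_min
    for _ in range(steps):
        slip = read_slip()           # fraction of contact area slipping, 0..1
        if slip > 0.05:              # incipient slip detected
            force = min(force * gain, f_max)  # squeeze harder, capped
        set_force(force)             # command the gripper
    return force
```

Because the loop reacts to measured slip rather than a pre-programmed force, the same controller works for a heavy object and a slippery one: either condition shows up as slip, and the grip tightens until it stops.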

Contactile was founded by Khamis along with Benjamin Xia, its chief technology officer, and Stephen Redmond, an associate professor at UNSW Sydney. The three co-founders have been developing tactile sensors for eight years while studying, teaching and conducting research in electrical, software and biomedical engineering. Their work spans human tactile physiology, artificial tactile sensing, signal processing, machine learning and control.

Khamis said robots currently on the market can pick up and place objects, but they often need to be precisely programmed for each object they encounter – new objects cannot be picked up without new and advanced programming. This makes them less human-like and incapable of complex manipulation, such as opening jars, turning door handles or tasks requiring even greater dexterity.

The company hopes to use the technology to solve near-term problems in markets such as warehouse fulfillment and retail, but it is also looking at gripping problems in agriculture, healthcare and even space – for example, robots that build infrastructure on the Moon or Mars, or perform satellite repair tasks. Contactile said it will use the funding for additional research and development and to begin executing pilot programs with partners.

For additional details, visit the Contactile website.

Exclusive: Robotics-World recently spoke with Khamis about how the technology came about and how the company plans to accelerate the development of its sensors.

Robotics-World: Congratulations on the funding. Talk about how this development began and how long it took before you figured out that this was something that could benefit robotics.

Khamis: We’ve been developing tactile sensors since 2014. One of the other co-founders and I were working on a cross-disciplinary project in human tactile physiology. We were both engineers, working with a human physiologist who was trying to understand how people sense friction. In that project, we were doing a lot of biomechanics to understand how the finger deforms. We were looking at bio-signal processing of the neural signals coming from the tactile receptors in the finger and understanding what is actually being signaled by those receptors. We observed a bunch of things that basically inspired the development of artificial tactile sensors. The mechanical design of the sensor that we currently make is based on how the finger deforms in response to different slipperiness in the surface it’s contacting. So it’s very much bio-inspired. By 2019 we had demonstrated that it works in a laboratory setting, and that’s when we started to spin the technology out of the university and commercialize it.

R-W: What have previous grippers been doing in terms of their holding and grabbing ability? We assume that they can still hold and grab items, but if the object is slippery or heavier than expected, it might drop or slip, correct?

Khamis: Even now, tactile sensing hasn’t taken off yet in terms of large-scale application use. If you think of where robots are being used, it’s mostly in manufacturing, with very repetitive, very structured processes. The position of the object is known well in advance. The size and shape of the packaging – everything is known. What they tend to do in those situations is customize the gripper for that particular object, and it’s never going to pick up anything else. For example, if it’s made to pick up a AA battery, that’s all it’s ever going to do. Even if you put a AAA battery in its path, which is about the same but slightly smaller, it cannot handle it, because the gripper has been customized for one particular object.

What we believe is going to happen with vision, AI, and tactile sensors is that robots are going to be much smarter – they’re going to be much more reactive. So you remove that need for everything to be so structured, and you can take those robots out of the very structured environment in manufacturing and put them into less structured environments that we haven’t been able to automate because the tools weren’t there. So we’re talking about things like agriculture, medicine, space, defense – all those applications where there are lots of unknowns and you really need to sense the environment and then act according to those sensations.

R-W: You mentioned space applications. Can you give some examples of how tactile sensing could help robots in space or on other planets?

Khamis: In the short term, these can be used for space assembly and maintenance. Many satellites that are in orbit at the moment cannot be maintained once they go up. If anything goes wrong with them, they just become space junk. Most people don’t realize that space junk is becoming a big problem, because the more space junk we have, the more likely our in-service satellites are to collide with something. If we could actually service satellites when they do have a problem, that’s a big area for robots – obviously we’re not going to send people up every time there’s a problem with a satellite.

Longer term is building infrastructure on the Moon and Mars. In particular on Mars, we’re not going to send people to start building infrastructure. Given that you have to account for every single cubic inch and every single kilogram of payload you send to Mars, because of the huge fuel implications, how do you send machines that give you the most bang for your buck? If you’re sending a single machine that does a single task, that’s probably not going to be particularly feasible. But if you can send up a robot that can use its hands like people do, and then constantly upgrade what it’s doing up there, that will be a much more valuable tool, because a single robot can actually build the infrastructure.

R-W: How does this technology allow robots to move beyond just picking applications?

Khamis: What tactile sensing also enables is not just picking up objects the robot doesn’t recognize, but also manipulating them – and that’s the really exciting part. We don’t just pick up with our hands. We manipulate tools, we turn door handles, we open jar lids. We hold hammers and pens – we do all of these tasks with the same tool we have, our hands. Manufacturing tends to be all about just picking things up at the moment, but there are all these other tasks that are still very manual.

Engine assembly, for example, is still very much a manual process. There are lots of things to tighten and lots of things to maneuver. Or imagine a gas pipeline out in the middle of the desert, and you need to relieve the pressure. You do that by opening a valve, but maybe the valve has been sitting in the sun, or it may be very rusted, and it might not have been opened for 10 years. You have no idea how much friction there is in that valve. It’s a very expensive process to send someone out into the middle of the desert to open this valve and come back. So how do you do this using a robot? You can’t take a picture of the valve and know how much force you need to open it.

R-W: Are these sensors developed for a specific gripper type, or can they be used on multiple grippers?

Khamis: The fundamental technology is the sensors, and we’ve been producing sensors that we demonstrate on a two-finger gripper. But those sensors can be fitted onto a three-, four- or five-finger gripper, or a fully functional hand. It’s very repurposable and amenable to different form factors. The sensing range is also very easy to customize, whether you need it to be similar to a human hand or you need it to pick up shipping containers – the scale is completely customizable.
