Researchers Develop A Cleaning Robot That Can Arrange Objects In Their Right Places

Princeton University researchers have created a robot named TidyBot that can effectively clean and organize rooms. TidyBot picks up objects and puts them in their appropriate places, and it has demonstrated its capabilities by sorting laundry, identifying recyclables, and correctly disposing of household items. It can even put toys away in drawers and toss empty drink cans into the proper garbage bin.

Robots learn user preferences to deliver customized assistance

Researchers noted in a press release that a robot needs to learn user preferences in order to customize its physical assistance and apply those preferences in future scenarios. In the current study, the researchers focused on personalizing household cleanup with robots that tidy rooms by putting objects in their right places.

Researchers stated that the main challenge for robots is finding the right location for every object, since individual preferences differ with personal taste and cultural background. They sought to develop systems that can learn these preferences from only a few examples of previous interactions. To test the approach, Princeton's School of Engineering built TidyBot, a mobile robot that successfully put away 85% of objects in real-world test scenarios.

In the study, the researchers demonstrate that robots can combine perception and language-based planning with the capabilities of large language models (LLMs). By doing so, the robot can infer generalized user preferences that carry over to future interactions. The approach adapts quickly and reaches 91.2% accuracy on previously unseen objects in the researchers' benchmark dataset.
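The article does not reproduce the researchers' code, but the general idea of using an LLM to generalize a few observed placements into reusable tidying rules can be sketched as follows. This is an illustrative sketch only: llm_complete is a hypothetical stand-in for whatever text-completion LLM API is available, and the example objects and receptacles are invented.

```python
# Illustrative sketch (not the authors' code): summarize a user's tidying
# preferences from a few example placements, then apply them to new objects.
# `llm_complete` is a hypothetical helper for any text-completion LLM.

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Plug in your preferred LLM API here.")

# A handful of observed placements provided by the user (invented examples).
examples = [
    ("yellow shirt", "laundry basket"),
    ("dark socks", "laundry basket"),
    ("soda can", "recycling bin"),
    ("toy car", "toy drawer"),
]

def summarize_preferences(examples) -> str:
    """Ask the LLM to compress example placements into short general rules."""
    lines = "\n".join(f"- {obj} -> {place}" for obj, place in examples)
    prompt = (
        "Here is where a user puts objects when tidying up:\n"
        f"{lines}\n"
        "Summarize these preferences as short general rules, one per line."
    )
    return llm_complete(prompt)

def place_object(obj: str, rules: str, receptacles: list[str]) -> str:
    """Ask the LLM to pick a receptacle for an unseen object using the rules."""
    prompt = (
        f"Rules:\n{rules}\n"
        f"Available receptacles: {', '.join(receptacles)}\n"
        f"Object: {obj}\n"
        "Answer with the single best receptacle name."
    )
    return llm_complete(prompt).strip()
```

Because the rules are stored as plain text, the same summary can be reused in future tidying sessions without retraining any model.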

Robots being trained to have a sense of touch

Scientists at Carnegie Mellon University are developing a system called ReSkin that lets robots use touch to sense and differentiate between delicate materials such as thin layers of cloth. The work aims to improve robots' ability to handle tasks like grabbing a glass or folding towels, which have previously been challenging for them.

According to David Held, the head of the Robots Perceiving and Doing (R-Pad) Lab, humans instinctively use touch to make sure they are in the right position to grab something. This tactile sensing is highly valuable even though we rarely acknowledge it consciously.
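ReSkin's actual sensor interface is not described in the article; the sketch below only illustrates the general idea of touch-guided grasping, with read_tactile_signal and close_gripper_by as hypothetical placeholders for a real sensor driver and gripper controller.

```python
# Illustrative sketch (not ReSkin's API): close a gripper in small steps
# until a tactile reading indicates contact, so delicate items are not crushed.
# `read_tactile_signal` and `close_gripper_by` are hypothetical stand-ins.

CONTACT_THRESHOLD = 0.05   # assumed normalized tactile signal level
STEP_MM = 0.5              # close the gripper in small increments

def read_tactile_signal() -> float:
    raise NotImplementedError("Replace with the tactile sensor driver.")

def close_gripper_by(step_mm: float) -> None:
    raise NotImplementedError("Replace with the gripper controller.")

def grasp_until_contact(max_steps: int = 100) -> bool:
    """Close the gripper step by step, stopping at the first sensed contact."""
    for _ in range(max_steps):
        if read_tactile_signal() > CONTACT_THRESHOLD:
            return True   # contact detected: stop before squeezing further
        close_gripper_by(STEP_MM)
    return False          # no contact detected within the step budget
```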
