Alexandra Ion and Violet Han are part of a team using AI to turn everyday objects into proactive personal assistants.

CMU Researchers Use AI To Turn Everyday Objects Into Proactive Assistants

Media Inquiries
Aaron Aupperlee
School of Computer Science

Carnegie Mellon University researchers have developed an AI system that enables everyday objects to anticipate people’s needs and move to assist them, opening new possibilities for seamless human-computer interaction in daily life.

A stapler slides across a desk to meet a waiting hand, or a knife edges out of the way just before someone leans against a countertop. It sounds like magic, but researchers in CMU's Human-Computer Interaction Institute (HCII) are combining AI and robotic mobility to give everyday objects this kind of foresight.

Using large language models (LLMs) and wheeled robotic platforms, researchers have transformed ordinary items — like mugs, plates or utensils — into assistants that can observe human behavior, predict interventions and move across horizontal surfaces to help humans at just the right time.

"Our goal is to create adaptive systems for physical interaction that are unobtrusive, meaning they blend into our lives while still dynamically adapting to our needs," said Alexandra Ion, an HCII assistant professor who leads the Interactive Structures Lab. "We classify this work as unobtrusive because the user does not ask the objects to perform any tasks. Instead, the objects sense what the user needs and perform the tasks themselves."

The Interactive Structures Lab uses computer vision and LLMs to reason about a person's goals and predict what they may do or need next. A ceiling-mounted camera senses the environment and tracks the position of objects. The system then translates what the camera sees into a text-based description of the scene. Next, an LLM uses this description to infer what the person's goals may be and which actions would help them most. Finally, the system transfers the predicted actions to the item. This process allows for seamless help with everyday tasks like cooking, organizing, office work and more.
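The sense-describe-infer-act loop above can be sketched in a few lines of code. This is a hypothetical illustration, not the researchers' implementation: the object names, coordinates, and the trivial nearest-object heuristic standing in for the LLM's reasoning are all assumptions made so the sketch stays self-contained and runnable.

```python
# Illustrative sketch of a sense -> describe -> infer -> act pipeline.
# All names and the rule-based stand-in for the LLM are hypothetical.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """An object tracked by the ceiling-mounted camera (positions in metres)."""
    name: str
    x: float
    y: float

def describe_scene(objects, hand):
    """Translate tracked positions into the text description an LLM would read."""
    parts = [f"{o.name} at ({o.x:.2f}, {o.y:.2f})" for o in objects]
    parts.append(f"user's hand at ({hand[0]:.2f}, {hand[1]:.2f})")
    return "; ".join(parts)

def infer_action(scene_description, hand, objects):
    """Stand-in for the LLM's reasoning step.

    The real system prompts an LLM with the scene description to infer the
    user's goal; here a simple heuristic (move the nearest object toward the
    hand) substitutes for that reasoning so the sketch executes as written.
    """
    nearest = min(objects,
                  key=lambda o: (o.x - hand[0]) ** 2 + (o.y - hand[1]) ** 2)
    return {"object": nearest.name, "target": hand}

# Example frame: a stapler near the user's hand and a mug across the table.
objects = [TrackedObject("stapler", 0.60, 0.20), TrackedObject("mug", 0.10, 0.90)]
hand = (0.55, 0.25)
action = infer_action(describe_scene(objects, hand), hand, objects)
print(action)  # the stapler, being nearest, is sent toward the hand
```

The key design point the article describes is the text bottleneck: by converting camera tracking into natural language, the system can hand the prediction problem to a general-purpose LLM instead of training a task-specific model.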

"We chose to enhance everyday objects because users already trust them. By advancing the objects' capabilities, we hope to increase that trust," said Violet Han, an HCII Ph.D. student working with Ion.

Ion and her team have started studying ways to expand the scope of unobtrusive physical AI to other parts of homes and offices.

"Imagine, for example, you come home with a bag of groceries. A shelf automatically folds out from the wall and you can set the bag down while you're taking off your coat," Ion said during her episode of the School of Computer Science's "Does Compute" podcast. "The idea is that we develop and study technology that seamlessly integrates into our daily lives and is so well assimilated that it becomes almost invisible, yet is consistently bringing us new functionality."

The Interactive Structures Lab aims to create intuitive physical interfaces that bring safe, reliable physical assistance into homes, hospitals, factories and other spaces. The team's work in unobtrusive physical AI was accepted to the 2025 ACM Symposium on User Interface Software and Technology, held recently in Busan, South Korea.

To learn more about the research, visit the Interactive Structures Lab project website.

Work That Matters

Researchers at CMU are working on real-world solutions to the biggest challenges.

Read more about the latest discoveries.
