
Teaching Machines to Feel the World: TOUCH Lab Redefines Sensing


Fig 1. TOUCH Lab team, from left: Aaron Benjamin Alcuitas, Vence Jumar Sasing, Dr. Aaron Raymond See, Lanz Benedict De Guzman, and Thad Ja. Credit: TCA

Tactile sensing is rapidly emerging as a critical foundation for next-generation applications across consumer electronics, robotics, automation, healthcare, manufacturing, and immersive technologies such as VR/AR and telepresence. According to Exactitude Consultancy and Global Market Insights, the global tactile sensing market is projected to surpass USD 30.7 billion by 2030 and approach USD 43.9 billion by 2032.

Amid this rising wave of sensory innovation, an international team at National Chin-Yi University of Technology (NCUT) in Taichung is pushing the boundaries of how machines perceive the physical world. Their breakthrough project, "AI-Driven TOUCH System – Digitizing Touch," earned the Gold Medal in the AI Application category at the 2025 Best AI Awards hosted by Taiwan's Ministry of Economic Affairs. Their goal is ambitious: to digitize touch and, in doing so, teach machines how to "feel."

"Our original mission was to help the visually impaired better perceive their surroundings," says Dr. Aaron Raymond See, founder of TOUCH Lab, project lead, and associate professor at NCUT, Department of Electronic Engineering. "Most assistive technologies focus on vision or hearing. We chose to help machines understand touch—the most intuitive yet and underutilized human sense in AI today."

TOUCH Lab's journey began in 2021 with early prototypes of tactile teaching aids and haptic gloves. Over time, the team developed Omnisense, a visual-tactile sensing system that mimics the integrative capabilities of human skin. It simultaneously detects pressure, temperature, texture, and shape—unlike competing technologies such as GelSight, DIGIT or TacTip, which typically focus on isolated variables.

"Human skin doesn't separate functions," Dr. See explains. "We wanted our sensor to work the same way—fusing multiple tactile cues into one unified system, just like the way our fingertips perceive the world."

Tactile sensing, Dr. See notes, fills critical gaps left by vision and audio-based systems. In precision manufacturing, for example, engineers still rely on touch to verify metal coatings or fabric softness. "You can't tell if a blade is sharp just by looking at it. You need to feel it. That's the kind of insight tactile data provides—and it's currently missing from most machine systems."

The TOUCH system is already being evaluated for diverse applications, from robotic manipulators and automated quality control to medical rehabilitation and assistive devices for people with visual impairments. The team has also partnered with National Cheng Kung University Hospital for clinical testing of wearable medical devices.

"We're in active discussions with Siemens and other industrial partners," Raymond shares. "After multiple technical iterations and are now in the second phase of commercialization. We're also conducting durability testing under extreme conditions—high temperatures, oily surfaces, dusty environments—to prepare the system for deployment in heavy manufacturing."

Looking ahead, Dr. See says the lab's short-term priority for 2025–2027 is to refine the sensor material, making it thinner and lighter for use in medical wearables, where comfort and responsiveness are paramount. The long-term goal for 2027–2030 is to miniaturize the entire sensing module for integration into robotic arms and end-effectors that perform high-precision tasks.

"Our ultimate goal is to make tactile sensing a standard feature in future machines," says Dr. See. "Just like cameras and accelerometers became essential in smartphones, touch sensors will be essential in robotics."

For the team, the Best AI Award provided more than just funding and exposure—it validated a vision that was once considered niche. "It motivates our students, sustains our lab, and gives us the platform to go further," Dr. See reflects.

Winning the award was not just about showcasing innovation; it was a test of the team's resilience and synergy. "The students faced enormous pressure during the competition, but that's exactly the kind of environment they'll encounter in the real world," Dr. See said. "They didn't just refine the tech; they learned how to adapt quickly and collaborate under stress."

With students and researchers from electronics, computer science, mechanical engineering, and design, the team's cross-disciplinary structure enabled rapid integration and validation. "We've been an international team from the start, with collaborators across the Philippines, Taiwan, Malaysia, and Germany," Dr. See says. "Beyond sensors, we're also exploring assistive technologies such as leg rehabilitation tools. Our ultimate goal is to bring tangible improvements to elderly care and rehabilitation."

While today's AI is dominated by visual and auditory systems, Dr. See believes tactile intelligence remains a largely untapped blue ocean with transformative potential. "Touch is the most subtle, intuitive, and instinctive of all senses," he says. "If AI can capture and replicate that, it unlocks a new dimension of value."

From simulating fabric textures in e-commerce to enabling remote medical diagnostics and high-precision industrial inspections, TOUCH Lab's sensor is moving beyond the lab and into real-world use. "We don't want this to remain an academic prototype. Our vision is for machines to feel, not just calculate. And that changes everything," Dr. See concludes.

TOUCH Lab won the Gold Award and NT$1,000,000 in the International Group AI Applications Category at the 2025 Best AI Awards. Now it's your chance to shine—bring your innovation to the world and apply for the 2026 Best AI Awards! With global tracks open for both AI Applications and IC Design, students and companies worldwide can compete for the grand prize of up to USD 30,000. The deadline is March 16, so don't miss out. For more details, join the online orientation on February 11 and sign up today.