
Tactile Sensing: Touch Sensors for Robot Manipulation

Explore tactile sensing technology, from resistive and capacitive sensors to vision-based tactile, and its applications in soft object grasping.

Nguyễn Anh Tuấn · March 17, 2026 · 8 min read

Why Do Robots Need Tactile Sensors?

Tactile sensing is the missing piece for robot manipulation to achieve dexterity like a human hand. Think about it: you can pick up an egg without crushing it, feel whether fabric is smooth or rough — all thanks to thousands of mechanoreceptors on your skin. Current robots rely mainly on cameras (vision) and force/torque sensors, but lack information at the contact point: surface shape, slip, and force distribution.

This article synthesizes modern tactile sensing technologies, compares pros/cons, and shows how to integrate them into a ROS 2 robot system.

Tactile sensor mounted on robot gripper

Tactile Sensor Classification

1. Resistive (Resistance-based)

Principle: A conductive material changes resistance when compressed; measuring the resistance change lets you infer the applied force.

Examples: FSR (Force Sensing Resistor), Velostat-based sensors

Pros: Cheap, easy to manufacture, can build large arrays.

Cons: High hysteresis, drift over time, low resolution.

# Read an FSR array via ADC (Arduino streams readings over serial)
import serial  # pip install pyserial
import struct

class FSRReader:
    def __init__(self, port='/dev/ttyUSB0', n_sensors=16):
        self.ser = serial.Serial(port, 115200, timeout=0.1)
        self.n_sensors = n_sensors

    def read_force_array(self):
        """Read force array from 4x4 FSR grid."""
        self.ser.write(b'R')  # Request reading
        data = self.ser.read(self.n_sensors * 2)
        if len(data) == self.n_sensors * 2:
            values = struct.unpack(f'<{self.n_sensors}H', data)
            # Convert ADC value (0-1023) to Newton
            forces = [self.adc_to_newton(v) for v in values]
            return forces
        return None

    @staticmethod
    def adc_to_newton(adc_value, v_ref=3.3, r_ref=10000):
        """Convert ADC reading to force (Newton) based on the FSR datasheet."""
        if adc_value < 10:
            return 0.0  # Below noise floor: no contact
        voltage = adc_value * v_ref / 1023.0
        resistance = r_ref * (v_ref / voltage - 1)
        if resistance <= 0:
            return 20.0  # ADC saturated: clamp to the sensor's rated maximum
        # Rough approximation from the FSR 402 datasheet curve
        force = 1.0 / (resistance / 1000.0)
        return min(force, 20.0)  # Cap at 20 N

2. Capacitive (Capacitance-based)

Principle: Two parallel plates separated by a soft dielectric layer. When compressed, the gap decreases and the capacitance increases.

Examples: BioTac (SynTouch), DigiTacts

Pros: High sensitivity, low drift, can measure both normal and shear force.

Cons: Susceptible to EMI, complex readout circuit, expensive.
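The parallel-plate relation makes the principle concrete: C = ε₀εᵣA/d, so the measured capacitance gives the plate gap d, and a spring model of the dielectric converts gap change to normal force. A minimal sketch with illustrative parameters (not from any specific sensor):

```python
# Illustrative single-taxel capacitive model (hypothetical parameters)
EPS0 = 8.854e-12  # Vacuum permittivity (F/m)

def force_from_capacitance(c_measured, eps_r=3.0, area=1e-4, d0=1e-3, k=5000.0):
    """Estimate normal force from measured capacitance.

    C = eps0 * eps_r * A / d  ->  d = eps0 * eps_r * A / C
    F = k * (d0 - d)          (linear-spring model of the soft dielectric)
    """
    d = EPS0 * eps_r * area / c_measured
    return k * (d0 - d)

# At the rest gap d0, the inferred force is (numerically) zero
c_rest = EPS0 * 3.0 * 1e-4 / 1e-3
print(force_from_capacitance(c_rest))

# Halving the gap doubles the capacitance and yields a positive force
c_pressed = EPS0 * 3.0 * 1e-4 / 0.5e-3
print(force_from_capacitance(c_pressed))
```

Real readout circuits measure capacitance indirectly (e.g. via an RC oscillator), but the geometric relation above is the core of the transduction.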

3. Piezoelectric

Principle: Piezoelectric material (PVDF, PZT) generates voltage when deformed.

Pros: Extremely fast response (microseconds), excellent for detecting vibration and slip.

Cons: Only measures force changes (dynamic), not static force. Better for slip detection than continuous force measurement.
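Because piezo elements respond only to change, a common slip detector watches the short-term variance of the signal: steady pressure reads near zero, while slip shows up as a high-frequency burst. A minimal sketch on a synthetic signal (the window size and threshold are illustrative, not from a datasheet):

```python
import numpy as np

def detect_slip(signal, window=16, threshold=0.05):
    """Flag samples where the moving standard deviation exceeds a threshold.

    Piezo output is dynamic-only, so a quiet signal means stable contact
    and a high-variance burst suggests the object is slipping.
    """
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        flags[i] = np.std(signal[i - window:i]) > threshold
    return flags

# Synthetic test: quiet sensor noise with a vibration burst in the middle
rng = np.random.default_rng(0)
sig = 0.001 * rng.standard_normal(300)
sig[150:180] += 0.2 * np.sin(np.linspace(0, 30 * np.pi, 30))  # Slip burst
flags = detect_slip(sig)
print(flags[:150].any(), flags[150:200].any())  # Expect: False True
```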

4. Vision-based Tactile (GelSight)

Principle: An internal camera observes a soft gel surface with a reflective coating. When an object presses into the gel, the camera captures the deformation, from which the 3D contact shape and force distribution are reconstructed.

Examples: GelSight, DIGIT, GelSlim, TacTip

Pros: Extremely high spatial resolution (sub-mm), measure geometry + force + texture simultaneously.

Cons: Larger form factor, speed depends on camera framerate, requires image processing.
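The first image-processing step can be as simple as differencing the current frame against a reference frame of the unloaded gel to segment the contact region. A minimal NumPy sketch using synthetic images in place of real GelSight frames:

```python
import numpy as np

def contact_mask(reference, frame, threshold=10):
    """Pixels that changed more than `threshold` since the unloaded
    reference frame are treated as the contact region."""
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return diff.max(axis=-1) > threshold  # Max over color channels

# Synthetic example: flat reference, a circular "press" in the frame
ref = np.full((64, 64, 3), 120, dtype=np.uint8)
frame = ref.copy()
yy, xx = np.mgrid[:64, :64]
press = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2
frame[press] = 160  # Gel brightens where the object presses in

mask = contact_mask(ref, frame)
print(mask.sum() == press.sum())  # Recovered contact area matches the press
```

Real pipelines go further (photometric stereo for depth, marker tracking for shear), but they build on exactly this reference-vs-deformed comparison.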

GelSight sensor for robot manipulation

Technology Comparison Table

| Property          | Resistive      | Capacitive      | Piezoelectric   | Vision-based        |
|-------------------|----------------|-----------------|-----------------|---------------------|
| Resolution        | Low (5-10 mm)  | Medium (1-3 mm) | Medium (2-5 mm) | High (<0.1 mm)      |
| Speed             | ~100 Hz        | ~1 kHz          | >10 kHz         | 30-60 Hz            |
| Measures shear    | No             | Yes             | Limited         | Yes                 |
| Measures geometry | No             | No              | No              | Yes                 |
| Cost              | Very low       | High            | Medium          | Medium              |
| Size              | Thin           | Thin            | Thin            | Bulky               |
| Main use          | Basic grasping | Dexterous hands | Slip detection  | Research, precision |

ROS 2 Integration

Custom Message for Tactile Data

# tactile_msgs/msg/TactileArray.msg
# (Create custom message in ROS 2 package)

# Header
std_msgs/Header header

# Sensor metadata
string sensor_type          # "fsr", "capacitive", "gelsight"
uint32 rows                 # Rows of taxels
uint32 cols                 # Columns of taxels

# Force data (row-major order)
float32[] forces            # Normal force per taxel (Newton)
float32[] shear_x           # Shear force X (if available)
float32[] shear_y           # Shear force Y (if available)

# Contact detection
bool in_contact             # Currently in contact?
float32 total_force         # Total force (Newton)
geometry_msgs/Point center_of_pressure  # Pressure center

Publisher Node for Tactile Sensor

import rclpy
from rclpy.node import Node
from std_msgs.msg import Header
import numpy as np

# Custom message from the tactile_msgs package defined above
from tactile_msgs.msg import TactileArray
# FSRReader comes from the resistive-sensor example earlier in this article

class TactileSensorNode(Node):
    def __init__(self):
        super().__init__('tactile_sensor')

        # Publisher
        self.pub = self.create_publisher(
            TactileArray, '/tactile/left_finger', 10
        )

        # Sensor reader
        self.sensor = FSRReader(port='/dev/ttyUSB0', n_sensors=16)

        # Timer: 50Hz
        self.timer = self.create_timer(0.02, self.publish_tactile)

        self.get_logger().info('Tactile sensor node started (50Hz)')

    def publish_tactile(self):
        forces = self.sensor.read_force_array()
        if forces is None:
            return

        msg = TactileArray()
        msg.header = Header()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'left_finger_link'

        msg.sensor_type = 'fsr'
        msg.rows = 4
        msg.cols = 4
        msg.forces = [float(f) for f in forces]

        # Compute contact info
        total = sum(forces)
        msg.in_contact = total > 0.1  # Threshold 0.1N
        msg.total_force = float(total)

        # Center of pressure
        if msg.in_contact:
            force_grid = np.array(forces).reshape(4, 4)
            rows_idx, cols_idx = np.meshgrid(
                range(4), range(4), indexing='ij'
            )
            msg.center_of_pressure.x = float(
                np.sum(cols_idx * force_grid) / total
            )
            msg.center_of_pressure.y = float(
                np.sum(rows_idx * force_grid) / total
            )

        self.pub.publish(msg)

def main():
    rclpy.init()
    node = TactileSensorNode()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()

Deep Learning for Tactile Sensing

CNN on GelSight Images

Vision-based tactile sensors like GelSight produce images — and CNNs handle images very well. Common tasks:

  1. Contact geometry estimation: Reconstruct 3D shape from tactile image
  2. Slip detection: Detect object slipping before it falls
  3. Material classification: Classify material (metal, plastic, fabric...)
  4. Grasp stability prediction: Predict if grasp is stable

import torch
import torch.nn as nn
import torchvision.transforms as transforms

class TactileCNN(nn.Module):
    """
    CNN classifying grasp stability from GelSight image.
    Input: 224x224 RGB tactile image
    Output: probability of stable grasp
    """
    def __init__(self):
        super().__init__()

        self.features = nn.Sequential(
            # Block 1
            nn.Conv2d(3, 32, kernel_size=5, padding=2),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),

            # Block 2
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(),
            nn.MaxPool2d(2),

            # Block 3
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((7, 7)),
        )

        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 7 * 7, 256),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(256, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = self.features(x)
        x = self.classifier(x)
        return x

# Training pipeline
def train_tactile_model(train_loader, epochs=50, lr=1e-3):
    model = TactileCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.BCELoss()

    for epoch in range(epochs):
        model.train()
        total_loss = 0
        correct = 0
        total = 0

        for images, labels in train_loader:
            optimizer.zero_grad()
            outputs = model(images).squeeze()
            loss = criterion(outputs, labels.float())
            loss.backward()
            optimizer.step()

            total_loss += loss.item()
            predicted = (outputs > 0.5).long()
            correct += (predicted == labels).sum().item()
            total += labels.size(0)

        acc = correct / total * 100
        print(f'Epoch {epoch+1}/{epochs} — '
              f'Loss: {total_loss/len(train_loader):.4f}, '
              f'Acc: {acc:.1f}%')

    return model

Transfer Learning with Pre-trained Models

An effective technique when data is limited: take a ResNet pre-trained on ImageNet and fine-tune it on tactile images. Even though the domains are completely different, low-level features (edges, textures) still transfer.

import torchvision.models as models

def create_tactile_classifier(num_classes=2):
    """Fine-tune ResNet18 for tactile classification."""
    # Newer torchvision versions use `weights=` instead of `pretrained=True`
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

    # Freeze early layers
    for param in list(model.parameters())[:-20]:
        param.requires_grad = False

    # Replace classifier head
    model.fc = nn.Sequential(
        nn.Linear(512, 128),
        nn.ReLU(),
        nn.Dropout(0.3),
        nn.Linear(128, num_classes),
    )
    return model

Recent Research

Notable recent work in tactile sensing:

  • Sparsh (Meta, 2024) — Self-supervised touch representations for vision-based tactile sensing. Paper: arxiv.org/abs/2410.24090

  • GelSight Modular Design (Agarwal et al., 2025) — Modular GelSight design for easy customization per application. Paper: arxiv.org/abs/2504.14739

  • Vision-based Tactile Sensor Survey (Li et al., 2025) — Comprehensive classification of VBTS by principle. Paper: arxiv.org/abs/2509.02478

  • Tactile-GAT (2024) — Graph Attention Networks instead of CNN for tactile classification. Paper: nature.com/articles/s41598-024-78764-x

  • Learning Robust Grasping (2024) — Adaptive grasping policy based on tactile feedback, generalizes well to diverse objects. Paper: arxiv.org/abs/2411.08499

Deep learning processing sensor data for robots

Application: Grasping Soft Objects

Challenges

Grasping soft objects (vegetables, fruit, bread, soft electronics) is much harder than grasping rigid ones:

  • Deformation: Object changes shape when compressed
  • Slip: Smooth surfaces slip easily
  • Damage: Excessive squeezing damages object

Solution: Tactile-based Grasp Controller

import time

import numpy as np

class TactileGraspController:
    """
    Controller for grasping soft objects based on tactile feedback.
    Increase force gradually until stable contact detected.
    """
    def __init__(self, gripper, tactile_sensor):
        self.gripper = gripper
        self.sensor = tactile_sensor

        # Parameters
        self.force_increment = 0.1    # Newton per step
        self.max_force = 5.0          # Maximum Newton
        self.stable_threshold = 0.5   # Minimum Newton
        self.slip_threshold = 0.3     # Slip detection threshold

    def adaptive_grasp(self, target_force=2.0):
        """
        Adaptive grasping: increase force gradually, stop when stable.
        """
        current_force = 0.0

        while current_force < self.max_force:
            # Increase gripper force
            current_force += self.force_increment
            self.gripper.set_force(current_force)

            # Wait for stabilization (50ms)
            import time
            time.sleep(0.05)

            # Read tactile
            tactile_data = self.sensor.read_force_array()
            if tactile_data is None:
                continue  # Skip failed readings
            total_contact = sum(tactile_data)

            # Check if enough force
            if total_contact >= target_force:
                print(f'Stable grasp at {current_force:.1f}N '
                      f'(contact: {total_contact:.1f}N)')
                return True

        print('WARNING: Max force reached without stable grasp')
        return False

    def monitor_slip(self, callback_on_slip):
        """
        Monitor slip continuously.
        When slip detected, call callback to increase force.
        """
        prev_forces = None

        while True:
            forces = self.sensor.read_force_array()
            if forces is None:
                continue  # Skip failed readings
            if prev_forces is not None:
                # Compute force change rate
                delta = np.array(forces) - np.array(prev_forces)
                change_rate = np.linalg.norm(delta)

                if change_rate > self.slip_threshold:
                    print(f'Slip detected! Change rate: {change_rate:.2f}')
                    callback_on_slip()

            prev_forces = forces
            time.sleep(0.02)  # 50 Hz
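The slip metric in `monitor_slip` is just the L2 norm of the frame-to-frame force delta, so it can be sanity-checked without hardware. A standalone sketch using the same illustrative 0.3 N threshold as the controller:

```python
import numpy as np

SLIP_THRESHOLD = 0.3  # Same illustrative threshold as the controller above

def change_rate(prev_forces, forces):
    """Frame-to-frame force change, as used for slip detection above."""
    return np.linalg.norm(np.array(forces) - np.array(prev_forces))

# Small, uniform fluctuations: stable contact
steady = change_rate([1.0, 1.0, 1.0, 1.0], [1.02, 0.98, 1.01, 1.0])
# Large redistribution of force across taxels: likely slip
slipping = change_rate([1.0, 1.0, 1.0, 1.0], [0.4, 0.5, 1.3, 0.9])
print(steady < SLIP_THRESHOLD, slipping > SLIP_THRESHOLD)  # Expect: True True
```

In practice the threshold must be tuned per sensor and sampling rate, since noise floor and taxel count both scale this norm.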

Future Directions

Tactile sensing is developing rapidly with notable trends:

  • Multimodal fusion: Combine vision + tactile + proprioception. Robot "sees" object first, "feels" to confirm, adjusts grasp force real-time.
  • Large-scale tactile pre-training: Like LLM pre-training on text, models like Sparsh pre-train on large tactile datasets, then fine-tune for specific tasks.
  • Soft robotics integration: Sensors integrated directly into gripper soft materials, instead of external mounting.
  • Sim-to-real transfer: Simulate tactile in simulation (TACTO, Taxim) then transfer model to real robot, reducing data collection cost.

To understand manipulation fundamentals, start with inverse kinematics to control gripper position precisely. Then add tactile sensing so the gripper can "feel" objects.


Nguyễn Anh Tuấn

Robotics & AI Engineer. Building VnRobo — sharing knowledge about robot learning, VLA models, and automation.
