
Tactile Sensing: Touch Sensors for Robot Manipulation

Explore tactile sensing technology, from resistive and capacitive sensors to vision-based tactile, and its applications in soft-object grasping.

Nguyen Anh Tuan · March 17, 2026 · 8 min read

Why Do Robots Need Tactile Sensors?

Tactile sensing is the missing piece for robot manipulation to achieve dexterity like a human hand. Think about it: you can pick up an egg without crushing it, feel whether fabric is smooth or rough — all thanks to thousands of mechanoreceptors on your skin. Current robots rely mainly on cameras (vision) and force/torque sensors, but lack information at the contact point: surface shape, slip, and force distribution.

This article synthesizes modern tactile sensing technologies, compares pros/cons, and shows how to integrate them into a ROS 2 robot system.

Tactile sensor mounted on robot gripper

Tactile Sensor Classification

1. Resistive (Resistance-based)

Principle: When compressed, conductive material changes resistance. Measure resistance change to infer force.

Examples: FSR (Force Sensing Resistor), Velostat-based sensors

Pros: Cheap, easy to manufacture, can build large arrays.

Cons: High hysteresis, drift over time, low resolution.

# Read FSR sensor via ADC (Arduino + Python Serial)
import serial
import struct

class FSRReader:
    def __init__(self, port='/dev/ttyUSB0', n_sensors=16):
        self.ser = serial.Serial(port, 115200, timeout=0.1)
        self.n_sensors = n_sensors

    def read_force_array(self):
        """Read force array from 4x4 FSR grid."""
        self.ser.write(b'R')  # Request reading
        data = self.ser.read(self.n_sensors * 2)
        if len(data) == self.n_sensors * 2:
            values = struct.unpack(f'<{self.n_sensors}H', data)
            # Convert ADC value (0-1023) to Newton
            forces = [self.adc_to_newton(v) for v in values]
            return forces
        return None

    @staticmethod
    def adc_to_newton(adc_value, v_ref=3.3, r_ref=10000):
        """Convert ADC to force (Newton) based on FSR datasheet."""
        if adc_value < 10:
            return 0.0
        voltage = adc_value * v_ref / 1023.0
        resistance = r_ref * (v_ref / voltage - 1)
        # Approximation from FSR 402 datasheet
        force = 1.0 / (resistance / 1000.0)
        return min(force, 20.0)  # Cap at 20N

2. Capacitive (Capacitance-based)

Principle: Two parallel plates separated by a soft dielectric layer. When compressed, the gap decreases and the capacitance increases.

Examples: BioTac (SynTouch), DigiTacts

Pros: High sensitivity, low drift, can measure both normal and shear force.

Cons: Susceptible to EMI, complex readout circuit, expensive.
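The parallel-plate relation C = εA/d behind this principle can be sketched numerically. The geometry and permittivity below are illustrative values, not taken from any real sensor's datasheet:

```python
# Parallel-plate model for one capacitive taxel: C = eps * A / d.
# Area, relative permittivity, and rest gap are assumed example values.

EPS_0 = 8.854e-12   # Vacuum permittivity (F/m)
EPS_R = 3.0         # Relative permittivity of the soft dielectric (assumed)
AREA = 4e-6         # Taxel plate area: 2 mm x 2 mm (assumed)
D_REST = 100e-6     # Dielectric thickness at rest: 100 um (assumed)

def capacitance(gap_m):
    """Capacitance (F) of one taxel at a given plate gap."""
    return EPS_0 * EPS_R * AREA / gap_m

def gap_from_capacitance(c_farad):
    """Invert the model: infer the plate gap from a measured capacitance."""
    return EPS_0 * EPS_R * AREA / c_farad

# Pressing the taxel from 100 um down to 80 um raises capacitance by 25%.
c_rest = capacitance(D_REST)
c_pressed = capacitance(80e-6)
print(f'{c_pressed / c_rest:.2f}x')   # 1.25x
```

Since C scales with 1/d, small compressions produce large relative capacitance changes, which is where the high sensitivity comes from.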

3. Piezoelectric

Principle: Piezoelectric material (PVDF, PZT) generates voltage when deformed.

Pros: Extremely fast response (microseconds), excellent for detecting vibration and slip.

Cons: Only measures force changes (dynamic), not static force. Better for slip detection than continuous force measurement.
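Because the signal is purely dynamic, slip shows up as a high-frequency burst on top of the slow force ramp. A minimal detector might use a first difference as a crude high-pass filter and threshold its windowed RMS; the window length and threshold here are made-up values for illustration:

```python
import numpy as np

def detect_slip(signal, window=32, threshold=0.05):
    """
    Crude slip detector for a piezoelectric (dynamic) tactile channel.
    A first difference suppresses the slow component; its RMS over the
    most recent window measures vibration energy.
    """
    hp = np.diff(signal)                       # cheap high-pass filter
    if len(hp) < window:
        return False
    rms = np.sqrt(np.mean(hp[-window:] ** 2))  # recent vibration energy
    return rms > threshold

# Steady contact: a slowly rising force ramp, no vibration.
t = np.linspace(0, 1, 256)
steady = 0.5 * t
# Slip event: the same ramp plus a high-frequency burst near the end.
slipping = steady + 0.2 * np.sin(2 * np.pi * 400 * t) * (t > 0.8)
```

Real systems often run this at tens of kHz, which is exactly the regime where piezoelectric sensors outperform the alternatives.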

4. Vision-based Tactile (GelSight)

Principle: A camera inside the sensor looks at a soft gel surface with a reflective coating. When an object presses into the gel, the camera captures the deformation, from which the 3D shape and force distribution are reconstructed.

Examples: GelSight, DIGIT, GelSlim, TacTip

Pros: Extremely high spatial resolution (sub-mm), measure geometry + force + texture simultaneously.

Cons: Larger form factor, speed depends on camera framerate, requires image processing.
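Before any 3D reconstruction, the contact region can be localized by simply differencing the current frame against a no-contact reference. A numpy-only sketch; the threshold is an assumed 8-bit intensity value, and real pipelines reconstruct depth with calibrated photometric stereo:

```python
import numpy as np

def contact_mask(ref_frame, cur_frame, diff_threshold=12):
    """
    Localize contact on a vision-based tactile sensor (GelSight/DIGIT style)
    by differencing against a no-contact reference frame.
    diff_threshold is in 8-bit intensity units (assumed value).
    """
    diff = np.abs(cur_frame.astype(np.int16) - ref_frame.astype(np.int16))
    if diff.ndim == 3:            # RGB input: take the largest channel change
        diff = diff.max(axis=-1)
    return diff > diff_threshold

def contact_centroid(mask):
    """Pixel centroid (row, col) of the contact region, or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

This kind of mask is also a cheap way to gate the downstream CNN: only run inference when the sensor actually reports contact.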

GelSight sensor for robot manipulation

Technology Comparison Table

| Property          | Resistive      | Capacitive      | Piezoelectric   | Vision-based        |
|-------------------|----------------|-----------------|-----------------|---------------------|
| Resolution        | Low (5-10 mm)  | Medium (1-3 mm) | Medium (2-5 mm) | High (<0.1 mm)      |
| Speed             | ~100 Hz        | ~1 kHz          | >10 kHz         | 30-60 Hz            |
| Measures shear    | No             | Yes             | Limited         | Yes                 |
| Measures geometry | No             | No              | No              | Yes                 |
| Cost              | Very low       | High            | Medium          | Medium              |
| Size              | Thin           | Thin            | Thin            | Bulky               |
| Main use          | Basic grasping | Dexterous hands | Slip detection  | Research, precision |

ROS 2 Integration

Custom Message for Tactile Data

# tactile_msgs/msg/TactileArray.msg
# (Create custom message in ROS 2 package)

# Header
std_msgs/Header header

# Sensor metadata
string sensor_type          # "fsr", "capacitive", "gelsight"
uint32 rows                 # Rows of taxels
uint32 cols                 # Columns of taxels

# Force data (row-major order)
float32[] forces            # Normal force per taxel (Newton)
float32[] shear_x           # Shear force X (if available)
float32[] shear_y           # Shear force Y (if available)

# Contact detection
bool in_contact             # Currently in contact?
float32 total_force         # Total force (Newton)
geometry_msgs/Point center_of_pressure  # Pressure center
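The `center_of_pressure` field is just the force-weighted centroid of the taxel grid. A standalone sketch in taxel-index units (multiply by the taxel pitch, which depends on your hardware, to get metric coordinates):

```python
import numpy as np

def center_of_pressure(force_grid):
    """
    Force-weighted centroid of a 2D taxel grid, in taxel-index units.
    Returns None when the total force is (near) zero.
    """
    force_grid = np.asarray(force_grid, dtype=float)
    total = force_grid.sum()
    if total < 1e-6:
        return None
    rows, cols = np.meshgrid(
        np.arange(force_grid.shape[0]),
        np.arange(force_grid.shape[1]),
        indexing='ij',
    )
    return (float((rows * force_grid).sum() / total),
            float((cols * force_grid).sum() / total))

# All force on taxel (1, 3) -> the CoP sits exactly there.
grid = np.zeros((4, 4))
grid[1, 3] = 2.0
print(center_of_pressure(grid))   # (1.0, 3.0)
```

Tracking how this point drifts over time is a simple indicator of in-hand object rotation or incipient slip.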

Publisher Node for Tactile Sensor

import rclpy
from rclpy.node import Node
from std_msgs.msg import Header
import numpy as np

from tactile_msgs.msg import TactileArray  # custom message defined above
# FSRReader is the serial reader class from the Resistive section

class TactileSensorNode(Node):
    def __init__(self):
        super().__init__('tactile_sensor')

        # Publisher
        self.pub = self.create_publisher(
            TactileArray, '/tactile/left_finger', 10
        )

        # Sensor reader
        self.sensor = FSRReader(port='/dev/ttyUSB0', n_sensors=16)

        # Timer: 50Hz
        self.timer = self.create_timer(0.02, self.publish_tactile)

        self.get_logger().info('Tactile sensor node started (50Hz)')

    def publish_tactile(self):
        forces = self.sensor.read_force_array()
        if forces is None:
            return

        msg = TactileArray()
        msg.header = Header()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'left_finger_link'

        msg.sensor_type = 'fsr'
        msg.rows = 4
        msg.cols = 4
        msg.forces = [float(f) for f in forces]

        # Compute contact info
        total = sum(forces)
        msg.in_contact = total > 0.1  # Threshold 0.1N
        msg.total_force = float(total)

        # Center of pressure
        if msg.in_contact:
            force_grid = np.array(forces).reshape(4, 4)
            rows_idx, cols_idx = np.meshgrid(
                range(4), range(4), indexing='ij'
            )
            msg.center_of_pressure.x = float(
                np.sum(cols_idx * force_grid) / total
            )
            msg.center_of_pressure.y = float(
                np.sum(rows_idx * force_grid) / total
            )

        self.pub.publish(msg)

def main():
    rclpy.init()
    node = TactileSensorNode()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()

Deep Learning for Tactile Sensing

CNN on GelSight Images

Vision-based tactile sensors like GelSight produce images — and CNNs handle images very well. Common tasks:

  1. Contact geometry estimation: Reconstruct 3D shape from tactile image
  2. Slip detection: Detect object slipping before it falls
  3. Material classification: Classify material (metal, plastic, fabric...)
  4. Grasp stability prediction: Predict if grasp is stable

A compact CNN for the grasp stability task, for example:

import torch
import torch.nn as nn
import torchvision.transforms as transforms

class TactileCNN(nn.Module):
    """
    CNN classifying grasp stability from GelSight image.
    Input: 224x224 RGB tactile image
    Output: probability of stable grasp
    """
    def __init__(self):
        super().__init__()

        self.features = nn.Sequential(
            # Block 1
            nn.Conv2d(3, 32, kernel_size=5, padding=2),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),

            # Block 2
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(),
            nn.MaxPool2d(2),

            # Block 3
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((7, 7)),
        )

        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 7 * 7, 256),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(256, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = self.features(x)
        x = self.classifier(x)
        return x

# Training pipeline
def train_tactile_model(train_loader, epochs=50, lr=1e-3):
    model = TactileCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.BCELoss()

    for epoch in range(epochs):
        model.train()
        total_loss = 0
        correct = 0
        total = 0

        for images, labels in train_loader:
            optimizer.zero_grad()
            outputs = model(images).squeeze()
            loss = criterion(outputs, labels.float())
            loss.backward()
            optimizer.step()

            total_loss += loss.item()
            predicted = (outputs > 0.5).long()
            correct += (predicted == labels).sum().item()
            total += labels.size(0)

        acc = correct / total * 100
        print(f'Epoch {epoch+1}/{epochs} — '
              f'Loss: {total_loss/len(train_loader):.4f}, '
              f'Acc: {acc:.1f}%')

    return model

Transfer Learning with Pre-trained Models

An effective technique with limited data: take a ResNet pre-trained on ImageNet and fine-tune it on tactile images. Though the domains are completely different, low-level features (edges, textures) still transfer.

import torchvision.models as models

def create_tactile_classifier(num_classes=2):
    """Fine-tune ResNet18 for tactile classification."""
    # weights= replaces the deprecated pretrained=True (torchvision >= 0.13)
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

    # Freeze early layers
    for param in list(model.parameters())[:-20]:
        param.requires_grad = False

    # Replace classifier head
    model.fc = nn.Sequential(
        nn.Linear(512, 128),
        nn.ReLU(),
        nn.Dropout(0.3),
        nn.Linear(128, num_classes),
    )
    return model


Application: Grasping Soft Objects

Challenges

Grasping soft objects (vegetables, fruit, bread, soft electronics) is much harder than grasping rigid objects: the contact geometry changes as the object deforms, and too much force damages it, so the gripper must regulate force from tactile feedback rather than position alone.

Solution: Tactile-based Grasp Controller

import time

import numpy as np

class TactileGraspController:
    """
    Controller for grasping soft objects based on tactile feedback.
    Increases force gradually until stable contact is detected.
    """
    def __init__(self, gripper, tactile_sensor):
        self.gripper = gripper
        self.sensor = tactile_sensor

        # Parameters
        self.force_increment = 0.1    # Newton per step
        self.max_force = 5.0          # Maximum Newton
        self.stable_threshold = 0.5   # Minimum Newton for stable contact
        self.slip_threshold = 0.3     # Slip detection threshold

    def adaptive_grasp(self, target_force=2.0):
        """
        Adaptive grasping: increase force gradually, stop when stable.
        """
        current_force = 0.0

        while current_force < self.max_force:
            # Increase gripper force
            current_force += self.force_increment
            self.gripper.set_force(current_force)

            # Wait for stabilization (50 ms)
            time.sleep(0.05)

            # Read tactile; skip this step on a failed read
            tactile_data = self.sensor.read_force_array()
            if tactile_data is None:
                continue
            total_contact = sum(tactile_data)

            # Check if enough force
            if total_contact >= target_force:
                print(f'Stable grasp at {current_force:.1f}N '
                      f'(contact: {total_contact:.1f}N)')
                return True

        print('WARNING: Max force reached without stable grasp')
        return False

    def monitor_slip(self, callback_on_slip):
        """
        Monitor slip continuously.
        When slip detected, call callback to increase force.
        """
        prev_forces = None

        while True:
            forces = self.sensor.read_force_array()
            if prev_forces is not None:
                # Compute force change rate
                delta = np.array(forces) - np.array(prev_forces)
                change_rate = np.linalg.norm(delta)

                if change_rate > self.slip_threshold:
                    print(f'Slip detected! Change rate: {change_rate:.2f}')
                    callback_on_slip()

            prev_forces = forces
            time.sleep(0.02)  # 50 Hz

Future Directions

Tactile sensing is a rapidly developing field.

To understand manipulation fundamentals, start with inverse kinematics to control gripper position precisely. Then add tactile sensing so gripper can "feel" objects.

Related Posts

IROS 2026: Navigation and manipulation papers worth following
Analysis of standout papers on autonomous navigation and manipulation, ahead of IROS 2026 Pittsburgh.
April 2, 2026 · 7 min read

Sim-to-Real Transfer: Train in simulation, run in the real world
Techniques for transferring models from simulation to real robots: domain randomization, system identification, and best practices.
April 1, 2026 · 12 min read

IROS 2026 Preview: What to expect
IROS 2026 Pittsburgh: a preview of the workshops, competitions, and leading navigation and manipulation research.
March 30, 2026 · 7 min read