Sensor Taxonomy: Proprioception vs. Exteroception

Robot sensors divide into two fundamental categories. Understanding this split is the first step in building your sensor suite because each category answers a different question: "What is the robot doing?" versus "What is the world doing?"

Proprioceptive sensors measure the robot's internal state -- joint angles, joint velocities, motor currents, end-effector forces. They tell the controller where the robot is and how much effort it is exerting. Without proprioception, no closed-loop control is possible.

Exteroceptive sensors measure the external environment -- object positions, surface geometry, proximity, temperature, sound. They enable perception-driven behaviors like obstacle avoidance, object grasping, and scene understanding.

Category | Sensor Type | Measures | Typical Update Rate | Price Range
---------|-------------|----------|---------------------|------------
Proprioceptive | Encoder (incremental) | Joint angle, velocity | 1-10 kHz | $15-$200
Proprioceptive | Encoder (absolute) | Joint angle (no homing) | 1-10 kHz | $50-$500
Proprioceptive | IMU | Orientation, angular rate, acceleration | 100-8000 Hz | $2-$2,000
Proprioceptive | Force/Torque (F/T) | 6-axis wrench at wrist | 100-7000 Hz | $1,500-$12,000
Proprioceptive | Motor current sensor | Torque estimate (via current) | 1-10 kHz | Built-in
Exteroceptive | RGB Camera | 2D color image | 30-160 Hz | $50-$2,000
Exteroceptive | Depth Camera (RGB-D) | 3D point cloud + color | 30-90 Hz | $200-$1,500
Exteroceptive | 2D LiDAR | Planar distance scan | 5-40 Hz | $100-$1,200
Exteroceptive | 3D LiDAR | 3D point cloud (long range) | 10-20 Hz | $400-$75,000
Exteroceptive | Tactile Sensor | Contact pressure, slip, texture | 30-1000 Hz | $200-$5,000
Exteroceptive | Proximity (ToF/IR) | Distance to nearest object | 50-100 Hz | $5-$50

Encoders: The Foundation of Robot Control

Every servo-driven robot joint requires an encoder. Encoders convert mechanical rotation into digital position data that the motor controller uses for closed-loop position and velocity control. Without accurate encoders, no control algorithm can function.

Incremental vs. Absolute Encoders

Incremental encoders output pulse trains (A/B quadrature signals) as the shaft rotates. The controller counts pulses from a known reference (home position) to determine angle. After power loss, the robot must re-home -- driving each joint to a limit switch or index pulse. Common choices:

  • Broadcom AEDT-9810 (optical): 5000 CPR (20,000 counts/rev after quadrature decoding), $45. Standard for research arms.
  • CUI AMT102 (capacitive): Configurable 48-4096 CPR, $30. Compact, no alignment needed, good for custom builds like OpenArm.
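Converting incremental-encoder counts to a joint angle is a matter of scaling by counts per revolution (4x the CPR after quadrature decoding). A minimal sketch; the function name and example values are illustrative:

```python
import math

def counts_to_angle_rad(counts: int, cpr: int) -> float:
    """Joint angle in radians from the homed reference.

    Assumes 4x quadrature decoding, so counts/rev = 4 * CPR.
    """
    counts_per_rev = 4 * cpr
    return 2 * math.pi * counts / counts_per_rev

# AMT102 configured to 2048 CPR -> 8192 counts/rev;
# 2048 counts is a quarter revolution (pi/2 rad)
angle = counts_to_angle_rad(2048, 2048)
```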

Absolute encoders report the true shaft angle immediately on power-up, eliminating homing. They use either optical code disks or magnetic Hall-effect sensing. More expensive but essential for collaborative robots where the arm could be manually moved while powered off.

  • RLS Orbis (magnetic, rotary): 14-bit (16,384 positions per revolution = 0.022 degree resolution), SPI/SSI output, $85. Used in many cobot joints.
  • US Digital MA3 (magnetic): 12-bit, analog or PWM output, $55. Simple integration for prototype arms.
  • Renishaw RESOLUTE (optical, linear/rotary): 32-bit, 1 nm resolution, $800+. Industrial-grade for CNC and high-precision robotics.
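The resolution figures above follow directly from bit depth: an n-bit absolute encoder resolves 360/2^n degrees per count. A quick sketch of the arithmetic:

```python
def encoder_resolution_deg(bits: int) -> float:
    """Angular resolution in degrees of an n-bit absolute encoder."""
    return 360.0 / (2 ** bits)

# 14-bit Orbis: 360 / 16384 ~= 0.022 degrees per count
res = encoder_resolution_deg(14)
```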

Decision rule: Use absolute encoders for any arm that will be deployed in a collaborative or unattended setting. Use incremental encoders when cost is the primary constraint and homing on startup is acceptable (research prototypes, competition robots).

Inertial Measurement Units (IMUs)

An IMU combines accelerometers and gyroscopes (and sometimes magnetometers) in a single package. In robotics, IMUs serve three roles: base orientation estimation for legged/mobile robots, vibration monitoring for predictive maintenance, and complementary sensing for SLAM when fused with visual or LiDAR odometry.

IMU Comparison Table

Model | Accel Range | Gyro Range | Gyro Bias Stability | ODR (max) | Interface | Price | Best For
------|-------------|------------|---------------------|-----------|-----------|-------|---------
Bosch BMI088 | ±24g | ±2000°/s | ~2°/hr | 1600 Hz | SPI/I2C | $8 | Drones, legged robots
TDK ICM-42688-P | ±16g | ±2000°/s | ~1.5°/hr | 8000 Hz | SPI/I2C | $6 | VIO/SLAM, high-rate control
Analog Devices ADIS16470 | ±40g | ±2000°/s | ~0.36°/hr | 2000 Hz | SPI | $180 | Industrial robots, precise dead-reckoning
VectorNav VN-100 | ±16g | ±2000°/s | ~0.5°/hr | 800 Hz | SPI/UART/USB | $800 | Navigation-grade AHRS, AGVs
Unitree G1 built-in | ±16g | ±2000°/s | ~3°/hr | 400 Hz | SDK | Included | Humanoid balance control

Key specification: gyro bias stability. This measures how much the gyroscope output drifts over time with no rotation. A ~2°/hr bias means the orientation estimate will be off by roughly 2 degrees after an hour of stationary operation. For mobile robots running SLAM, aim for <1°/hr. For arm-mounted vibration monitoring, the $8 BMI088 is sufficient.
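Gyro drift is why orientation estimation fuses the gyro with the accelerometer's gravity reference: the gyro is fast but drifts, the accelerometer is noisy but drift-free. A minimal complementary-filter sketch for pitch (the blend factor alpha and the axis convention are assumptions, not any vendor's algorithm):

```python
import math

def complementary_pitch(pitch, gyro_y, ax, az, dt, alpha=0.98):
    """One complementary-filter step for pitch (radians).

    gyro_y: pitch rate in rad/s; ax, az: accelerometer readings in m/s^2.
    """
    gyro_pitch = pitch + gyro_y * dt       # integrate angular rate (drifts)
    accel_pitch = math.atan2(-ax, az)      # gravity-referenced pitch (noisy)
    # Trust the gyro short-term, let the accelerometer correct slowly
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```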

The Unitree G1 humanoid (available for lease from SVRC at $2,500/mo) includes a built-in IMU accessible via the Unitree SDK that provides pre-filtered orientation quaternions for balance control.

Force/Torque Sensors

A 6-axis force/torque (F/T) sensor mounted between the robot's wrist flange and the gripper measures interaction forces and torques in all directions. This enables force-controlled tasks like polishing (constant force), peg-in-hole insertion (compliance), and safe human interaction (force limiting).

F/T Sensor Comparison

Sensor | Force Range (Fx/Fy) | Torque Range (Tz) | Resolution | Rate | Interface | Price
-------|---------------------|-------------------|------------|------|-----------|------
ATI Gamma | ±65 N | ±5 Nm | 1/80 N | 7000 Hz | EtherCAT/Ethernet | $8,000-$12,000
ATI Mini45 | ±290 N | ±10 Nm | 1/16 N | 7000 Hz | EtherCAT/Ethernet | $6,000-$9,000
OnRobot HEX-E | ±200 N | ±10 Nm | 0.2 N | 100 Hz | USB/Tool I/O | $3,500-$5,000
Bota SensONE | ±1000 N | ±40 Nm | 0.3 N | 800 Hz | EtherCAT/USB | $2,500-$4,000
Robotous RFT40 | ±200 N | ±6 Nm | 0.1 N | 1000 Hz | RS485/USB | $1,500-$2,500

ATI Gamma is the gold standard in research labs. Its 7 kHz sample rate and sub-Newton resolution enable high-bandwidth force control loops. However, it requires a dedicated NetBox DAQ ($2,000+) and an EtherCAT master, adding integration complexity.

OnRobot HEX-E is significantly easier to integrate -- it plugs directly into UR, ABB, and Fanuc cobots via built-in tool I/O adapters. The 100 Hz sample rate is sufficient for assembly and polishing tasks but too slow for high-bandwidth impedance control.

Budget alternative: If your application only needs collision detection (not force control), use motor current-based torque estimation. The OpenArm 101 provides joint torque estimates from motor current at no additional sensor cost, sufficient for simple contact detection and gravity compensation.
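A current-based collision check can be sketched as follows: joint torque is estimated as tau ≈ K_t x gear ratio x motor current and compared with the expected gravity torque; a large residual indicates unexpected contact. The motor constant, gear ratio, and threshold below are illustrative assumptions, not OpenArm values:

```python
def collision_detected(i_motor_amps, tau_gravity_nm,
                       k_t=0.07, gear_ratio=9.0, threshold_nm=2.0):
    """True if estimated joint torque deviates from the gravity model.

    k_t: motor torque constant (Nm/A); losses and friction are ignored.
    """
    tau_est = k_t * gear_ratio * i_motor_amps   # torque estimate from current
    residual = tau_est - tau_gravity_nm         # torque the model can't explain
    return abs(residual) > threshold_nm
```

In practice the threshold must sit above the friction and model error of the joint, which is why current-based detection suits coarse contact detection rather than fine force control.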

Vision Sensors: Cameras, Depth, and LiDAR

Vision is the richest exteroceptive modality. For a deep dive into camera selection, see our dedicated Robot Vision Systems Guide. Here is a summary of the decision framework:

Camera Selection by Application

  • Manipulation / data collection: GigE Vision camera (Basler ace2 a2A1920, $650) with hardware trigger for deterministic frame delivery. See our Camera Setup for Teleoperation guide for the full 3-camera configuration.
  • Mobile robot navigation: Stereo camera (Intel RealSense D455, $350) or 2D LiDAR (RPLIDAR A1, $100) for SLAM. Add a 3D LiDAR (Ouster OS0-128, $3,500) for outdoor or high-speed environments.
  • Quality inspection: Machine vision camera (FLIR Blackfly S, $500-$1,500) with fixed focal length lens and ring light for consistent imaging conditions.
  • Budget research: USB webcam (Logitech BRIO, $200) is acceptable for single-camera setups at <15 fps where latency jitter is tolerable.

Depth Sensing Technologies

Structured light (original Microsoft Kinect, Intel RealSense SR305): Projects a known IR pattern and computes depth from its distortion. Works indoors, struggles in sunlight and with reflective surfaces. Range 0.3-3 m, suited to manipulation tasks. (Note that the popular RealSense D435 is technically active IR stereo: it projects a pattern to aid stereo matching rather than decoding the pattern directly.)

Time-of-flight (ToF) (Azure Kinect DK ToF mode, PMD Flexx2): Measures photon round-trip time. Faster than structured light, less susceptible to texture-less surfaces. Range up to 5 m indoors.

Stereo vision (RealSense D455, ZED 2i): Two calibrated cameras compute disparity. Works outdoors, degrades on texture-less surfaces. Range up to 20 m with a 12 cm baseline.
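The disparity-to-depth relation behind stereo cameras is Z = f * B / d, with focal length f in pixels, baseline B in meters, and disparity d in pixels. A minimal sketch with illustrative numbers (the 9.5 cm baseline roughly matches the D455):

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in meters from stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")   # no match, or point at infinity
    return focal_px * baseline_m / disparity_px

# f = 640 px, B = 0.095 m, d = 8 px -> Z = 7.6 m
z = depth_from_disparity(640, 0.095, 8)
```

The formula also explains why depth error grows with range: at large Z, a one-pixel disparity error corresponds to meters of depth, which is why long-range stereo needs a wide baseline.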

LiDAR (Velodyne VLP-16, Ouster OS0): Laser-based time-of-flight with mechanical or solid-state scanning. Range 10-100+ m, works in all lighting. Essential for outdoor mobile robots and autonomous vehicles.

Tactile Sensors for Manipulation

Tactile sensors are the fastest-growing sensor category in robotics. They provide contact information that cameras cannot: grip force magnitude, contact geometry at sub-millimeter resolution, slip detection, and surface texture classification. For detailed product comparisons, see our Tactile Sensor Comparison guide.

When to Add Tactile Sensing

  • Required: Handling fragile objects (fruit, electronics, glass), deformable objects (fabric, cables), or objects with uncertain weight/friction.
  • Recommended: Any dexterous manipulation task where grip success rate needs to exceed 95%. Tactile feedback enables reactive grasp adjustment that vision alone cannot provide.
  • Optional: Simple pick-and-place of rigid, known objects where a parallel-jaw gripper with a single binary contact switch is sufficient.

SVRC stocks Paxini tactile sensor arrays that integrate with the OpenArm and Orca Hand. Paxini sensors provide 16x16 taxel arrays at 100 Hz with ROS2 drivers, priced at approximately $800 per fingertip unit. Contact our team for evaluation units.

Sensor Fusion: Combining Multiple Modalities

No single sensor provides complete state information. Sensor fusion combines data from multiple sensors to produce more accurate and robust estimates than any individual sensor alone. The two most common frameworks in robotics are the Kalman filter and the particle filter.

Extended Kalman Filter (EKF) Basics

The EKF is the workhorse of sensor fusion in robotics. It maintains a probabilistic estimate of the robot state (position, velocity, orientation) and updates it whenever a new sensor measurement arrives. The key idea: each sensor has different noise characteristics, and the EKF optimally weights them based on their reliability.

A typical mobile robot EKF fuses:

  • IMU (prediction step): High rate (200+ Hz), provides angular velocity and acceleration. Integrates to estimate position/orientation but drifts over seconds.
  • Wheel odometry (update step): 50-100 Hz, provides velocity. Accurate short-term but suffers from wheel slip.
  • Visual odometry or LiDAR scan matching (update step): 10-30 Hz, provides absolute position correction. Eliminates drift but computationally expensive and intermittent.
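The weighting idea behind the EKF is easiest to see in the scalar Kalman update: the gain K is the ratio of state uncertainty to total uncertainty, so a noisy sensor (large measurement variance R) moves the estimate less. A minimal 1-D sketch (names and the scalar simplification are ours):

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P: prior state estimate and its variance.
    z, R: measurement and its variance.
    """
    K = P / (P + R)            # gain: how much to trust the measurement
    x_new = x + K * (z - x)    # correct by the weighted innovation
    P_new = (1 - K) * P        # uncertainty shrinks after the update
    return x_new, P_new
```

With equal prior and measurement variance the estimate lands halfway between prediction and measurement; as R grows, the update approaches a no-op. The EKF applies the same logic with matrices and a linearized motion model.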

In ROS2, the standard sensor fusion package is robot_localization, which implements both EKF and UKF (Unscented Kalman Filter) nodes:

# Launch EKF node for IMU + odometry fusion
ros2 launch robot_localization ekf.launch.py
# Configure in ekf.yaml:
# odom0: /wheel/odometry   (x, y, yaw velocity)
# imu0:  /imu/data          (roll, pitch, yaw, angular vel, linear accel)
# odom0_config: [false, false, false,   # x, y, z position
#                false, false, false,   # roll, pitch, yaw
#                true,  true,  false,   # x, y, z velocity
#                false, false, true,    # roll, pitch, yaw velocity
#                false, false, false]   # x, y, z acceleration

Sensor Fusion for Manipulation

For arm manipulation, sensor fusion often combines:

  • Joint encoders + F/T sensor: Encoder positions feed forward kinematics; F/T sensor provides wrench at the end-effector. Together they enable impedance control -- the robot behaves like a spring-damper system, compliant in force while tracking position.
  • Vision + tactile: Camera detects object pose for approach; tactile sensor provides closed-loop feedback during grasp and manipulation. This is the architecture used in SVRC's data collection pipelines for learning contact-rich policies.
  • Vision + F/T + proprioception: The full multi-modal stack for complex tasks like cable routing, food preparation, or assembly. Requires careful timestamp synchronization -- see our camera setup guide for synchronization methods.
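The encoder + F/T pairing in the first bullet reduces, in one dimension, to an impedance law: the commanded force makes the joint behave like a spring-damper around a target position. A 1-DOF sketch with illustrative gains (not any controller's actual implementation):

```python
def impedance_force(x, x_dot, x_des, k=200.0, d=20.0):
    """Commanded force for 1-DOF impedance behavior.

    k: virtual stiffness (N/m), d: virtual damping (Ns/m).
    The F/T sensor closes the loop by verifying the achieved wrench.
    """
    return k * (x_des - x) - d * x_dot   # spring toward target, plus damping
```

Lowering k makes the arm compliant (it yields under contact); raising it makes the arm track position stiffly, which is the tradeoff tuned per task in insertion and polishing.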

Sensor Selection by Application

Use the following decision matrix to determine which sensors your specific application requires. "Required" means the application will not function without it. "Recommended" means significant performance improvement. "Optional" means nice-to-have.

Application | Encoders | IMU | F/T | Camera | LiDAR | Tactile | Est. Sensor Budget
------------|----------|-----|-----|--------|-------|---------|-------------------
Pick-and-place (rigid) | Required | -- | Optional | Required | -- | -- | $500-$1,500
Assembly/insertion | Required | -- | Required | Recommended | -- | Recommended | $3,000-$8,000
Deformable handling | Required | -- | Required | Required | -- | Required | $5,000-$12,000
Indoor mobile robot | Required | Required | -- | Recommended | Required | -- | $500-$2,000
Outdoor AMR / delivery | Required | Required | -- | Required | Required | -- | $4,000-$15,000
Humanoid (Unitree G1) | Built-in | Built-in | Recommended | Built-in | Optional | Recommended | $1,000-$5,000 (add-ons)
Data collection (IL) | Required | Optional | Recommended | Required | -- | Recommended | $2,000-$6,000

ROS2 Sensor Integration Quick Reference

Most sensors publish data on standardized ROS2 message types. Knowing which message type to expect simplifies integration and lets you use existing visualization and processing tools.

Sensor | ROS2 Message Type | Typical Topic Name | Key Driver Package
-------|-------------------|--------------------|-------------------
RGB Camera | sensor_msgs/Image | /camera/color/image_raw | usb_cam, pylon_ros2
Depth Camera | sensor_msgs/PointCloud2 | /camera/depth/points | realsense2_camera
IMU | sensor_msgs/Imu | /imu/data | imu_filter_madgwick
F/T Sensor | geometry_msgs/WrenchStamped | /ft_sensor/wrench | ros2_ati_netft
2D LiDAR | sensor_msgs/LaserScan | /scan | rplidar_ros, sllidar_ros2
3D LiDAR | sensor_msgs/PointCloud2 | /lidar/points | ouster_ros, velodyne
Joint State | sensor_msgs/JointState | /joint_states | ros2_control
Tactile Array | sensor_msgs/Image (pressure map) | /tactile/pressure_image | paxini_ros2