Humanoid Robot Glossary
Learn essential humanoid robotics terminology and technical concepts. From actuators and DOF to SLAM and AI, understand the technology powering the next generation of humanoid robots.
A
Actuator
A mechanical component that converts energy into motion, enabling a robot to move its joints and limbs.
Actuators are the "muscles" of a robot. They can be electric motors, hydraulic cylinders, or pneumatic devices that create controlled movement in response to control signals.
AI (Artificial Intelligence)
Computer systems that can perform tasks typically requiring human intelligence, such as visual perception, speech recognition, and decision-making.
In humanoid robots, AI enables autonomous behavior, learning from experience, natural language processing, and adaptive responses to changing environments.
Autonomous
Operating independently without human control or intervention.
Autonomous robots can perceive their environment, make decisions, and execute actions without constant human guidance, though they may still operate within defined parameters.
C
Center of Mass (COM)
The point where the total mass of a robot can be considered to be concentrated.
Understanding and controlling the center of mass is crucial for maintaining balance in humanoid robots, especially during dynamic movements like walking or reaching.
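As a minimal sketch, if a robot is modeled as a set of point-mass links, its COM is the mass-weighted average of the link positions (the `center_of_mass` helper and the two-link values below are illustrative, not from any specific robot):

```python
def center_of_mass(masses, positions):
    """Mass-weighted average of 3D link positions (masses in kg, positions in metres)."""
    total = sum(masses)
    return tuple(
        sum(m * p[axis] for m, p in zip(masses, positions)) / total
        for axis in range(3)
    )

# Illustrative two-link body: 3 kg torso at z = 0.5 m, 1 kg head at z = 1.5 m
com = center_of_mass([3.0, 1.0], [(0, 0, 0.5), (0, 0, 1.5)])
# The COM height lands at 0.75 m, pulled toward the heavier link
```

A balance controller keeps the ground projection of this point inside the support polygon formed by the feet.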
Cobot
Robot designed to work safely alongside humans in shared spaces without barriers.
Humanoid cobots have safety features like force limiting and collision detection, enabling direct human collaboration.
Compliance
Ability to adjust force and position in response to contact with objects or people.
Critical for safe human interaction and handling delicate objects without damage through adaptive force control.
Computer Vision
Technology that enables robots to derive meaningful information from digital images or videos.
Computer vision allows humanoid robots to identify objects, navigate spaces, recognize faces, read text, and understand their visual environment in real-time.
D
Depth Camera (RGB-D)
A camera that captures both color images and depth information for each pixel.
Essential for humanoid robots to understand 3D space, measure distances to objects, and manipulate items safely in their environment.
Dexterity
Skill and precision in hand and finger movements for complex manipulation.
High dexterity enables humanoid robots to perform intricate tasks like tying knots, using tools, or assembling small components.
Digital Twin
Virtual replica of a physical robot used for simulation, testing, and monitoring.
Enables testing software updates, training scenarios, and troubleshooting without risking real hardware or disrupting operations.
DOF (Degrees of Freedom)
The number of independent ways a robot or robot joint can move.
A human arm has 7 DOF (shoulder: 3, elbow: 1, wrist: 3). More DOF generally means more flexibility and dexterity, but also more complexity in control.
Dynamic Balance
Maintaining stability during fast or unpredictable movements.
Allows humanoids to recover from pushes, walk on moving surfaces, or perform dynamic actions like running and jumping.
E
Embodied AI
AI that learns and operates through physical interaction with the real world.
Unlike software-only AI, embodied AI in humanoids learns through movement, touch, and direct environmental interaction.
Encoder
Device that converts mechanical motion into digital signals for precise position tracking.
Encoders provide continuous feedback on joint positions, enabling accurate movement control and closed-loop positioning in robotic systems.
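The conversion from raw encoder counts to a joint angle is a simple ratio of counts to resolution; a sketch (the 4096 counts-per-revolution resolution is an assumed example value):

```python
def counts_to_angle(counts, counts_per_rev=4096):
    """Convert encoder counts to a joint angle in degrees (wraps at one revolution)."""
    return (counts % counts_per_rev) * 360.0 / counts_per_rev

counts_to_angle(1024)  # a quarter revolution: 90.0 degrees
```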
End Effector
The device at the end of a robotic arm designed to interact with the environment.
End effectors include grippers, hands, tools, or sensors. In humanoid robots, these are often multi-fingered hands capable of complex manipulation tasks.
F
Facial Recognition
Identifying and distinguishing individual human faces for personalized interaction.
Allows robots to recognize users, personalize interactions, and enhance security through computer vision-based identity verification.
Footstep Planning
Determining where and when to place each foot during bipedal locomotion.
Critical for navigating uneven terrain, stairs, or cluttered environments by computing safe, stable foot placement sequences.
Force/Torque Sensor
Sensors that measure forces and torques applied at robot joints or end-effectors.
Enable safe human interaction and delicate object manipulation by detecting contact forces, preventing damage to objects or people.
Form Factor
Physical size, shape, and configuration of a robot body and structure.
Humanoid form factors range from child-sized service robots to full-size industrial units, affecting capabilities and applications.
Foundation Model
Large AI model trained on vast data that can be adapted for multiple robotic tasks.
Modern humanoids use foundation models like GPT for natural language understanding, reasoning, and task planning.
G
Gait
The pattern of movement during locomotion.
Humanoid robots must develop stable gaits for walking, running, or climbing stairs. Different gaits optimize for speed, stability, or energy efficiency.
Gesture Recognition
Ability to understand and respond to human body language and hand signals.
Enables natural, intuitive human-robot communication without voice commands or physical controllers through visual interpretation.
Gyroscope
A sensor that measures angular velocity and orientation.
Gyroscopes help humanoid robots maintain balance by detecting tilting or rotation, allowing the control system to make rapid corrections.
H
Haptic Feedback
Technology that creates tactile sensations, providing a sense of touch.
In teleoperation, haptic feedback lets a human operator feel the pressure and resistance a robot's hands encounter; combined with tactile sensors that give the robot its own sense of touch, it enables delicate manipulation of objects without damaging them.
Hydraulic Actuator
Actuator powered by pressurized fluid for high-force applications.
Used in larger humanoids such as the hydraulic versions of Boston Dynamics' Atlas for powerful, dynamic movements requiring high force-to-weight ratios.
I
Imitation Learning
Machine learning approach where robots learn by observing and copying human demonstrations.
Allows robots to quickly learn new tasks from human examples rather than requiring explicit programming or lengthy trial-and-error.
IMU (Inertial Measurement Unit)
A sensor that measures acceleration, angular velocity, and sometimes magnetic field.
IMUs are essential for humanoid robot balance and navigation, providing real-time data about the robot's orientation and movement in 3D space.
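One common way to estimate tilt from IMU data is a complementary filter, which blends the gyroscope's fast but drifting integrated rate with the accelerometer's noisy but drift-free gravity reading. A one-step sketch (the function name and the 0.98 blend factor are illustrative):

```python
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One update step: blend integrated gyro rate with accelerometer tilt (radians)."""
    ax, ay, az = accel
    # Pitch implied by the gravity vector measured by the accelerometer
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Trust the gyro for fast changes, the accelerometer for the long-term average
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

Running this at each control tick gives a tilt estimate stable enough for balance corrections.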
Inverse Kinematics
Mathematical process of calculating joint angles needed to achieve a desired end-effector position.
When a robot needs to reach a specific point in space, inverse kinematics determines how each joint must move to get there, solving complex geometric problems.
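For a simple planar two-link arm the problem has a closed-form solution via the law of cosines; a sketch of the elbow-down case (function name and link lengths are illustrative):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic IK for a planar 2-link arm; returns (shoulder, elbow) in radians."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow
```

Full humanoid arms with 7 DOF have no unique closed form; redundancy is typically resolved numerically, optimizing secondary criteria like joint limits or elbow posture.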
IP Rating
Standard (Ingress Protection) rating of a robot's resistance to dust and water intrusion (e.g., IP54, IP67).
Higher IP ratings mean robots can work in harsher environments including outdoor, industrial, or wet conditions.
L
LiDAR (Light Detection and Ranging)
A sensing method that uses laser pulses to measure distances and create 3D maps of the environment.
LiDAR enables robots to build detailed spatial maps, detect obstacles, and navigate complex environments with high precision, even in low light.
Localization
The process of determining a robot's position within its environment.
Accurate localization is crucial for navigation. Robots use various sensors (cameras, LiDAR, IMU) and algorithms to track their position relative to a map or landmarks.
M
Machine Learning
A subset of AI where systems learn and improve from experience without explicit programming.
Machine learning enables robots to recognize patterns, adapt to new situations, and improve performance over time through training on large datasets.
Manipulation
The ability to physically interact with and control objects in the environment.
Manipulation tasks range from grasping and placing objects to using tools, requiring coordination of vision, touch sensors, and motor control.
Motion Planning
The computational process of finding a path for a robot to move from one configuration to another.
Motion planning algorithms must avoid obstacles, maintain balance, and optimize for factors like speed, energy efficiency, or smoothness of movement.
Multimodal Interaction
Communicating through multiple channels including speech, gesture, facial expression, and touch.
Creates more natural human-robot interaction than single-mode communication by combining voice, vision, and physical interfaces.
O
Object Recognition
Identifying and classifying objects in the robot's environment through computer vision.
Essential for task performance: distinguishing mugs from cups, doors from walls, or tools from random objects in the workspace.
Odometry
Using sensor data to estimate change in position over time.
Wheel odometry tracks rotations to estimate distance traveled. Visual odometry uses camera images. Both help robots track their movement between position updates.
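A sketch of differential-drive wheel odometry, dead-reckoning the pose from how far each wheel has travelled (the midpoint-heading approximation and the function name are illustrative):

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckon a planar pose (metres, radians) from left/right wheel travel."""
    d = (d_left + d_right) / 2                 # distance moved by the robot centre
    dtheta = (d_right - d_left) / wheel_base   # change in heading
    # Integrate using the heading at the midpoint of the step
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta
```

Because each step's error compounds, odometry alone drifts; it is usually fused with IMU or visual data and corrected by localization against a map.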
P
Path Planning
Finding a collision-free route from a starting point to a goal location.
Path planning algorithms like A* or RRT compute efficient routes while avoiding obstacles, considering robot dimensions, and optimizing for various constraints.
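A minimal A* sketch on a 4-connected occupancy grid, using a Manhattan-distance heuristic (the grid representation and function name are illustrative):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start)]          # heap of (f = g + h, g, node)
    g_cost = {start: 0}
    parent = {start: None}
    closed = set()
    while open_set:
        _, g, node = heapq.heappop(open_set)
        if node in closed:
            continue                           # stale heap entry, skip
        closed.add(node)
        if node == goal:                       # rebuild the path by walking parents
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                parent[nxt] = node
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt))
    return None                                # no path exists
```

Sampling-based planners like RRT take a different approach, growing a random tree through continuous space, which scales better to high-DOF arms than grid search.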
Payload
The maximum weight a robot can carry or manipulate effectively.
Payload capacity is a key specification for humanoid robots, determining what tasks they can perform, from carrying boxes to operating tools.
PID Controller
A control algorithm that adjusts outputs based on Proportional, Integral, and Derivative calculations.
PID controllers are fundamental in robotics for maintaining desired positions, velocities, or forces by continuously correcting for errors.
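The update law is compact enough to sketch directly; each term corrects the error in a different way: proportional to its current size, integral to its accumulated history, derivative to its rate of change (gains below are illustrative, not tuned):

```python
class PID:
    """Minimal PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. a pure proportional joint-position controller
joint_pid = PID(kp=1.0, ki=0.0, kd=0.0)
```

Production controllers add refinements such as integral anti-windup and derivative filtering, omitted here for brevity.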
R
Reinforcement Learning
Machine learning technique where robots learn through trial and error with reward feedback.
Used to teach complex behaviors like walking, grasping, or navigating obstacles without explicit programming of every movement.
ROS (Robot Operating System)
An open-source framework providing tools and libraries for robot software development.
ROS is widely used in robotics research and development, offering standardized ways to handle sensors, actuators, communications, and control algorithms.
Runtime
Duration a robot can operate continuously on a single battery charge.
Critical specification for commercial applications; most humanoids run 2-8 hours depending on activity level and battery capacity.
S
Safety Certification
Compliance with safety standards (ISO, CE, UL) for safe operation around humans.
Required for commercial deployment; ensures force limits, emergency stops, collision detection, and safe failure modes.
Sensor Fusion
Combining data from multiple sensors to produce more accurate and reliable information.
By integrating data from cameras, IMUs, LiDAR, and other sensors, robots can overcome individual sensor limitations and build robust understanding of their environment.
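The simplest fusion rule for two independent estimates of the same quantity weights each by the inverse of its variance, so the noisier sensor counts for less; a sketch (function name is illustrative):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent scalar estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)     # fused estimate is more certain than either input
    return fused, fused_var

# Equally noisy LiDAR and camera range estimates of 2 m and 4 m fuse to 3 m
fuse(2.0, 1.0, 4.0, 1.0)
```

Kalman filters generalize this same idea to vectors evolving over time, and are the standard workhorse for fusing IMU, odometry, and vision data.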
Servo
A motorized device that provides precise control of angular or linear position.
Servos are common in robot joints, offering accurate position control with feedback. They enable the precise movements needed for walking, reaching, and manipulation.
Sim-to-Real Transfer
Training robots in simulation, then applying learned behaviors to physical robots.
Accelerates development by testing millions of scenarios safely in virtual environments before deploying to real hardware.
SLAM (Simultaneous Localization and Mapping)
A technique for building a map of an unknown environment while tracking the robot's location within it.
SLAM is essential for autonomous navigation in new spaces, allowing robots to explore while creating spatial maps they can use for future navigation.
T
Tactile Sensing
Detecting touch, pressure, texture, and temperature through artificial skin sensors.
Advanced humanoids use tactile arrays for human-like touch sensitivity, enabling delicate manipulation and safe interaction.
Teleoperation
Remote control of a robot by a human operator.
Teleoperation allows humans to control robots from a distance, useful for dangerous environments, training, or tasks requiring human judgment.
Torque
Rotational force that causes an object to rotate around an axis.
Joint torque determines how much force a robot can apply. Higher torque enables lifting heavier objects or moving more quickly, but requires more powerful actuators.
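A quick back-of-the-envelope check: the torque a shoulder joint must hold to support a mass at the end of a horizontal outstretched arm is mass × gravity × arm length (function name and values are illustrative):

```python
def holding_torque(mass_kg, arm_length_m, g=9.81):
    """Static torque (N·m) to hold a point mass at the end of a horizontal arm."""
    return mass_kg * g * arm_length_m

# Holding 2 kg at the end of a 0.5 m arm requires about 9.8 N·m
holding_torque(2.0, 0.5)
```

Dynamic motion adds inertial terms on top of this static load, which is why actuator torque ratings substantially exceed the static requirement.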
Trajectory
The path that a robot or robot part follows through space over time.
Trajectory planning ensures smooth, controlled movement from start to goal, considering velocity, acceleration, and avoiding sudden jerky motions.
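A common building block is a cubic polynomial between two positions with zero velocity at both ends, which guarantees a smooth start and stop; a sketch for a single joint (function name and values are illustrative):

```python
def cubic_trajectory(q0, qf, T):
    """Cubic position profile from q0 to qf over T seconds, zero start/end velocity."""
    a2 = 3.0 * (qf - q0) / T**2
    a3 = -2.0 * (qf - q0) / T**3

    def position(t):
        return q0 + a2 * t * t + a3 * t**3

    return position

# Move a joint from 0 to 1 rad in 2 s; sample the profile each control tick
pos = cubic_trajectory(0.0, 1.0, 2.0)
```

Higher-order (quintic) polynomials additionally constrain acceleration at the endpoints, avoiding torque spikes at the start and end of the move.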