Conversation

@iory (Owner) commented Jan 6, 2026

Summary

Add a new CLI command skr generate-robot-class that automatically generates Python robot classes from URDF files using geometric analysis.

Features

  • Automatically detects kinematic chains (arms, legs, head, torso)
  • Calculates gripper TCP positions from mesh geometry
  • Computes gripper orientation (Y-axis aligned with opening direction)
  • Supports ROS package:// URLs
  • Configurable pattern matching via PatternConfig class

Usage

# Generate robot class and print to stdout
skr generate-robot-class robot.urdf

# Save to file
skr generate-robot-class robot.urdf --output MyRobot.py

# Show detected kinematic groups
skr generate-robot-class robot.urdf --show-groups

Implementation Details

  • Uses mesh vertex analysis for fingertip position estimation
  • Geometric symmetry detection for gripper fingers
  • Deterministic Y-axis direction calculation for consistent orientations
  • Prefers visual meshes over collision meshes for accurate TCP estimation

No LLM or API keys are required; the command uses only URDF structure and geometry.
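
For a sense of the output, a generated class might look roughly like the sketch below. This is illustrative only; the actual output, link names (r_gripper_link), attribute names (rarm_end_coords), and offsets depend on what the geometric analysis detects for the given robot.

from skrobot.coordinates import CascadedCoords
from skrobot.models.urdf import RobotModelFromURDF


class MyRobot(RobotModelFromURDF):
    """Hypothetical sketch of generated output."""

    @property
    def default_urdf_path(self):
        # package:// form is emitted when the URDF lives inside a ROS package
        return 'package://my_robot_description/urdf/my_robot.urdf'

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # End-effector frame attached to the detected gripper link;
        # the pos value here is a placeholder, not a computed TCP.
        self.rarm_end_coords = CascadedCoords(
            parent=self.r_gripper_link,
            pos=[0.0, 0.0, 0.15],
            name='rarm_end_coords')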

iory added 13 commits January 6, 2026 15:26
Add robot_class_generator.py that generates Python robot classes
from URDF geometry without requiring LLM or API keys.

Key features:
- Automatic kinematic chain detection using NetworkX graph analysis
- Geometry-based limb type detection (arm, leg, head, torso)
- TCP estimation from gripper fingertip midpoint or mesh extent
- Tool frame and hand link detection from link structure
- Support for single-arm and dual-arm robots
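
As a rough illustration of the graph-based detection (not the module's actual code), a minimal chain finder over a skrobot model could look like this, assuming the joint_list, parent_link, and child_link attributes:

import networkx as nx


def kinematic_chains(robot_model):
    """Return root-to-leaf chains of link names (illustrative sketch)."""
    G = nx.DiGraph()
    for joint in robot_model.joint_list:
        G.add_edge(joint.parent_link.name, joint.child_link.name)
    root = next(n for n, deg in G.in_degree() if deg == 0)
    leaves = [n for n, deg in G.out_degree() if deg == 0]
    # Each root-to-leaf path is a candidate limb (arm, leg, head, torso).
    return [nx.shortest_path(G, root, leaf) for leaf in leaves]
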
Move robot class generator from examples to skr CLI command.
Update README.md and docs/source/cli.rst with usage documentation.
Automatically detect if URDF is inside a ROS package and generate
default_urdf_path with package:// format instead of absolute path.
Resolve package:// URLs using resolve_filepath before loading URDF.
This allows generated robot classes with package:// default_urdf_path
to work correctly when loaded from within a ROS package directory.
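
The idea behind package:// resolution can be sketched as follows; this is a generic illustration, not skrobot's resolve_filepath implementation, and the helper name and search strategy are assumptions:

import os


def resolve_package_url(url, search_dirs):
    """Map 'package://<pkg>/<rel>' to a filesystem path (sketch)."""
    prefix = 'package://'
    if not url.startswith(prefix):
        return url
    pkg, _, rel = url[len(prefix):].partition('/')
    for base in search_dirs:
        for root, dirs, files in os.walk(base):
            # A ROS package root is a directory containing package.xml.
            if os.path.basename(root) == pkg and 'package.xml' in files:
                return os.path.join(root, rel)
    return None
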
- Remove 'gripper_link' from tool_frame patterns (was too broad)
- Add _find_symmetric_gripper_midpoint() for geometry-based detection
- Detect gripper fingers by finding symmetric child links (same parent,
  opposite x/y coordinates) instead of relying on naming patterns
- Support package:// URLs in URDF.load
- Calculate gripper midpoint using _get_fingertip_position() instead
  of link.worldpos() for more accurate TCP estimation
- Skip _calculate_gripper_tcp_offset if offset already set by
  symmetric gripper detection to avoid overwriting correct values
Mesh-based fingertip detection doesn't work well for grippers with
opposing fingers (different directions). Use link origin midpoint
instead for symmetric gripper pairs.
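
A minimal sketch of that rule, assuming skrobot's worldpos(), copy_worldcoords(), and inverse_transform_vector() APIs (the function and argument names here are hypothetical):

def gripper_tcp_from_finger_origins(finger_a, finger_b, gripper_link):
    """Midpoint of the two finger-link origins, in the gripper link frame."""
    mid_world = 0.5 * (finger_a.worldpos() + finger_b.worldpos())
    # Express the world-frame midpoint relative to the gripper link.
    return gripper_link.copy_worldcoords().inverse_transform_vector(mid_world)
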
Skip _convert_to_ros_package_path if input is already a package:// URL
to avoid corrupting the path.
Calculate end_coords orientation for symmetric gripper links so that:
- x-axis points in gripper forward direction (from parent to midpoint)
- y-axis points in gripper opening direction (between fingers)
- z-axis is computed as x cross y

The rotation is output as RPY angles in radians in the generated
CascadedCoords initialization.
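
In matrix terms the construction amounts to the following sketch (standard Z-Y-X Euler extraction shown for completeness; the generator may use skrobot's own conversion helpers, and gimbal lock is ignored here):

import numpy as np


def gripper_rotation_and_rpy(x_axis, y_axis):
    """Build a rotation from forward (x) and opening (y) directions (sketch)."""
    x = x_axis / np.linalg.norm(x_axis)
    y = y_axis - np.dot(y_axis, x) * x   # make y orthogonal to x
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)                   # z-axis is x cross y
    R = np.column_stack([x, y, z])       # columns are the frame axes
    # RPY for R = Rz(yaw) @ Ry(pitch) @ Rx(roll), in radians
    roll = np.arctan2(R[2, 1], R[2, 2])
    pitch = -np.arcsin(R[2, 0])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return R, (roll, pitch, yaw)
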
Based on feedback, this commit makes the following improvements:

1. Add PatternConfig class for externalizing all "magic words"
   - Users can customize patterns for non-standard naming conventions
   - Patterns can be extended or overridden via constructor
   - force_groups option to bypass auto-detection for specific groups

2. Use pre-compiled regex for pattern matching
   - Improves performance by compiling patterns once
   - All pattern matching now uses PatternConfig.matches()

3. Switch mesh priority for TCP estimation
   - Visual mesh now preferred over collision mesh
   - Visual meshes have more accurate vertex positions for fingertips
   - Collision meshes may be simplified convex hulls

4. Add documentation about geometric detection limitations
   - Y-coordinate based left/right detection assumes T-pose
   - Notes added to docstrings about potential failure cases

Example usage:
    config = PatternConfig(
        patterns={'right_arm': ['RA_', 'right_arm_j']},
        force_groups={'head': ['neck_link', 'head_link']}
    )
    generate_robot_class_from_geometry(robot, config=config)
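
One plausible shape of such a config with pre-compiled patterns (a sketch, not the actual class; the default patterns shown are invented for illustration):

import re


class PatternConfig:
    DEFAULT_PATTERNS = {
        'right_arm': ['right_arm', r'^r_arm'],
        'gripper': ['gripper', 'finger', 'hand'],
    }

    def __init__(self, patterns=None, force_groups=None):
        merged = dict(self.DEFAULT_PATTERNS)
        merged.update(patterns or {})
        # Compile every pattern once so repeated matching stays cheap.
        self._compiled = {
            key: [re.compile(p, re.IGNORECASE) for p in pats]
            for key, pats in merged.items()
        }
        self.force_groups = dict(force_groups or {})

    def matches(self, key, name):
        """True if name matches any pattern registered under key."""
        return any(r.search(name) for r in self._compiled.get(key, []))
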
The Y-axis direction was dependent on the iteration order of child
links, which is non-deterministic. This caused the gripper orientation
to be inconsistent between runs.

Fix: Ensure Y-axis always points toward the positive direction of the
axis with the largest absolute component. This makes the orientation
calculation deterministic regardless of iteration order.
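
The rule itself is only a few lines; a sketch of the described behavior:

import numpy as np


def canonicalize_y_axis(y_axis):
    """Flip y so its largest-magnitude component is positive."""
    y = np.asarray(y_axis, dtype=float)
    if y[np.argmax(np.abs(y))] < 0:
        y = -y   # result no longer depends on child-link iteration order
    return y
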
When finger link origins are located at the parent origin (like Fetch),
use fingertip mesh positions to determine the approach direction instead
of falling back to an arbitrary axis.

This fixes incorrect gripper orientations for robots like Fetch where
the finger links' origins are at [0, ±y, 0] relative to gripper_link,
resulting in a midpoint at the origin.
Copilot AI (Contributor) left a comment

Pull request overview

This PR adds a new CLI command skr generate-robot-class that automatically generates Python robot classes from URDF files using geometric analysis, without requiring an LLM or API keys.

Key Changes

  • Implements geometry-based kinematic chain detection (arms, legs, head, torso) using pattern matching and URDF structure analysis
  • Calculates gripper TCP positions and orientations from mesh geometry using vertex analysis and symmetry detection
  • Adds ROS package:// URL resolution support to the URDF loader

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 5 comments.

Summary per file

  • skrobot/urdf/robot_class_generator.py: New 1698-line module implementing pattern-based kinematic chain detection, gripper geometry analysis, and Python class code generation
  • skrobot/utils/urdf.py: Adds package:// URL handling to the URDF.load() method for ROS package path resolution
  • skrobot/apps/generate_robot_class.py: New CLI application providing a command-line interface for robot class generation, with options for output, class name, and group visualization
  • skrobot/apps/cli.py: Registers the new generate-robot-class command in the CLI app registry
  • skrobot/urdf/__init__.py: Exports the two main API functions for robot class generation
  • docs/source/cli.rst: Documents the new CLI command with usage examples
  • README.md: Adds example usage of the new CLI command



# Check if mesh extends significantly more in one direction
# (asymmetric = likely end effector direction)
(bounds[0] + bounds[1]) / 2
Copilot AI Jan 6, 2026

The variable 'center' is calculated but never used. This computation should be removed.

Suggested change
(bounds[0] + bounds[1]) / 2

Comment on lines 387 to 392
x_symmetric = (abs(pos1[0] + pos2[0]) < 0.01
               and abs(pos1[0]) > 0.005)
y_symmetric = (abs(pos1[1] + pos2[1]) < 0.01
               and abs(pos1[1]) > 0.005)
# z should be similar
z_similar = abs(pos1[2] - pos2[2]) < 0.02
Copilot AI Jan 6, 2026

Magic numbers should be extracted as named constants for better maintainability. The values 0.01, 0.005, and 0.02 represent geometric tolerance thresholds for symmetry detection and should be defined as module-level constants with descriptive names like SYMMETRY_POSITION_TOLERANCE, MIN_POSITION_OFFSET, and Z_SIMILARITY_TOLERANCE.

Comment on lines 767 to 816
def _estimate_tcp_from_mesh(link):
    """Estimate TCP offset from link mesh extent.

    For robots without gripper models, estimate the TCP position
    based on the mesh geometry of the end effector link.

    Parameters
    ----------
    link : Link
        The end effector link.

    Returns
    -------
    list or None
        Position offset [x, y, z] in link's local frame.
    """
    mesh = _get_link_mesh(link)
    if mesh is None:
        return None

    bounds = mesh.bounds  # [[min_x, min_y, min_z], [max_x, max_y, max_z]]
    extents = mesh.extents

    # Find the primary axis (longest dimension)
    primary_axis = np.argmax(extents)

    # Check if mesh extends significantly more in one direction
    # (asymmetric = likely end effector direction)
    (bounds[0] + bounds[1]) / 2
    min_dist = abs(bounds[0][primary_axis])
    max_dist = abs(bounds[1][primary_axis])

    # If asymmetric (one side extends more), use that as TCP direction
    asymmetry_ratio = max(min_dist, max_dist) / (min(min_dist, max_dist) + 1e-6)

    if asymmetry_ratio > 2.0:  # Significant asymmetry
        # TCP is at the extended end
        if min_dist > max_dist:
            # Extends in negative direction
            tcp_pos = np.zeros(3)
            tcp_pos[primary_axis] = bounds[0][primary_axis]
        else:
            # Extends in positive direction
            tcp_pos = np.zeros(3)
            tcp_pos[primary_axis] = bounds[1][primary_axis]

        tcp_pos = [round(v, 6) if abs(v) > 1e-6 else 0.0 for v in tcp_pos]
        return tcp_pos

    return None
Copilot AI Jan 6, 2026

Multiple magic numbers (1e-6, 0.1, 2.0) are used throughout this function for geometric thresholds. These should be defined as named constants at the module level for better maintainability and to make the geometric logic more self-documenting.

Comment on lines 978 to 1020
if movable_joint_count >= 5:
    has_lr_pattern = (right_arm_count >= 2 or left_arm_count >= 2
                      or right_leg_count >= 2 or left_leg_count >= 2)
    if not has_lr_pattern:
        # This is likely a serial chain manipulator
        single_arm_count = movable_joint_count

counts = {
    'right_arm': right_arm_count,
    'left_arm': left_arm_count,
    'right_leg': right_leg_count,
    'left_leg': left_leg_count,
    'head': head_count,
    'torso': torso_count,
    'arm': single_arm_count,
    'gripper': gripper_count,
}

# Use Y coordinate for geometric left/right detection
# WARNING: This assumes the robot is in T-pose or similar standard pose.
# For robots with non-standard initial poses, this detection may fail.
y_threshold = 0.05
if tip_y_coord is not None:
    has_arm_pattern = (right_arm_count >= 1 or left_arm_count >= 1
                       or single_arm_count >= 2)
    has_leg_pattern = right_leg_count >= 1 or left_leg_count >= 1

    strong_y_threshold = 0.1

    if has_arm_pattern or has_leg_pattern:
        if tip_y_coord > y_threshold:
            boost = 4 if tip_y_coord > strong_y_threshold else 2
            if has_arm_pattern:
                counts['left_arm'] += boost
            if has_leg_pattern:
                counts['left_leg'] += boost
        elif tip_y_coord < -y_threshold:
            boost = 4 if tip_y_coord < -strong_y_threshold else 2
            if has_arm_pattern:
                counts['right_arm'] += boost
            if has_leg_pattern:
                counts['right_leg'] += boost

Copilot AI Jan 6, 2026

Magic numbers (0.05, 0.1, 5, 2, 4) are used for Y-coordinate thresholds and geometric detection without explanation. These should be defined as named constants with descriptive names that explain their purpose (e.g., Y_THRESHOLD_WEAK, Y_THRESHOLD_STRONG, MIN_ARM_JOINTS, etc.).

    'rot': None,
}

max_depth = 5
Copilot AI Jan 6, 2026

The magic number 5 (max_depth) should be defined as a named constant like MAX_SEARCH_DEPTH to improve code readability and maintainability.

- Add module-level constants for geometric detection thresholds
- Remove unused center calculation in _estimate_tcp_from_mesh
- Replace hardcoded values with descriptive constants:
  - SYMMETRY_POSITION_TOLERANCE, MIN_POSITION_OFFSET, Z_SIMILARITY_TOLERANCE
  - MAX_SEARCH_DEPTH, POSITION_EPSILON, FINGERTIP_RATIO, ASYMMETRY_RATIO
  - Y_THRESHOLD_WEAK, Y_THRESHOLD_STRONG, MIN_ARM_JOINTS
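
Collected from the snippets quoted in this review, the constants presumably look roughly like this (names and values as cited above; treat this as an approximation of the merged module, not a verbatim excerpt):

# Symmetric finger detection
SYMMETRY_POSITION_TOLERANCE = 0.01  # |x1 + x2| (or y) below this => mirrored
MIN_POSITION_OFFSET = 0.005         # fingers must sit at least this far off-axis
Z_SIMILARITY_TOLERANCE = 0.02       # finger origins must share a similar z

# Mesh analysis
POSITION_EPSILON = 1e-6             # near-zero guard for norms and offsets
FINGERTIP_RATIO = 0.1               # top fraction of vertices treated as fingertip
ASYMMETRY_RATIO = 2.0               # extent ratio marking a clear TCP direction

# Limb classification
Y_THRESHOLD_WEAK = 0.05             # |y| of a chain tip hinting at left/right
Y_THRESHOLD_STRONG = 0.1            # |y| giving a stronger left/right boost
MIN_ARM_JOINTS = 5                  # movable joints needed to treat a chain as an arm
MAX_SEARCH_DEPTH = 5                # how deep to walk when searching link structure
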
Copilot AI (Contributor) left a comment

Pull request overview

Copilot reviewed 7 out of 7 changed files in this pull request and generated 19 comments.



Comment on lines +464 to +469
# x and y are parallel, use perpendicular
x_axis_local = np.array([0.0, 0.0, 1.0])
x_axis_local = x_axis_local - np.dot(
    x_axis_local, y_axis_local) * y_axis_local
x_axis_local = x_axis_local / np.linalg.norm(
    x_axis_local)
Copilot AI Jan 6, 2026

The function _find_symmetric_gripper_midpoint has inconsistent handling of the normalization check. On line 439, it checks if the norm is greater than POSITION_EPSILON before normalizing midpoint_local, but on lines 461-469, it performs a similar check on x_axis_local. However, the fallback logic when x_norm is too small (lines 464-469) may produce a non-orthogonal result if the fallback vector [0, 0, 1] happens to be parallel to y_axis_local. Consider adding a check for this edge case or using a more robust orthogonalization method.

Suggested change
# x and y are parallel, use perpendicular
x_axis_local = np.array([0.0, 0.0, 1.0])
x_axis_local = x_axis_local - np.dot(
    x_axis_local, y_axis_local) * y_axis_local
x_axis_local = x_axis_local / np.linalg.norm(
    x_axis_local)

# x and y are parallel or x is near-zero; choose a
# fallback axis that is not parallel to y.
candidate = np.array([0.0, 0.0, 1.0])
if abs(np.dot(candidate, y_axis_local)) > 1.0 - POSITION_EPSILON:
    # If Z is parallel to y, fall back to X axis instead.
    candidate = np.array([1.0, 0.0, 0.0])
x_axis_local = candidate - np.dot(
    candidate, y_axis_local) * y_axis_local
x_norm = np.linalg.norm(x_axis_local)
if x_norm > POSITION_EPSILON:
    x_axis_local = x_axis_local / x_norm
else:
    # Degenerate case: use the candidate as-is.
    x_axis_local = candidate

Comment on lines +57 to +87
class PatternConfig:
    """Configuration for link/joint name pattern matching.

    This class externalizes all "magic words" used for detecting limb types,
    end-effector links, and other robot parts. Users can customize patterns
    for robots with non-standard naming conventions.

    Parameters
    ----------
    patterns : dict, optional
        Dictionary of pattern lists to override defaults. Keys include:
        - 'right_arm', 'left_arm', 'right_leg', 'left_leg'
        - 'head', 'torso', 'gripper', 'finger'
        - 'tool_frame', 'hand_link', 'arm_tip_priority'
    force_groups : dict, optional
        Force specific links into groups, bypassing auto-detection.
        Example: {'head': ['neck_link', 'head_link']}

    Examples
    --------
    >>> config = PatternConfig(
    ...     patterns={
    ...         'right_arm': ['RA_', 'right_arm_j'],
    ...         'left_arm': ['LA_', 'left_arm_j'],
    ...     },
    ...     force_groups={
    ...         'head': ['neck_link', 'head_link'],
    ...     }
    ... )
    >>> generate_robot_class_from_geometry(robot, config=config)
    """
Copilot AI Jan 6, 2026

The docstring states "This class externalizes all 'magic words'" but some of the pattern categories like 'arm_tip_priority' are not clearly documented in the Parameters section. The docstring should list all available pattern keys that users can customize, not just show examples.

Comment on lines +1030 to +1041
if tip_y_coord > Y_THRESHOLD_WEAK:
    boost = 4 if tip_y_coord > Y_THRESHOLD_STRONG else 2
    if has_arm_pattern:
        counts['left_arm'] += boost
    if has_leg_pattern:
        counts['left_leg'] += boost
elif tip_y_coord < -Y_THRESHOLD_WEAK:
    boost = 4 if tip_y_coord < -Y_THRESHOLD_STRONG else 2
    if has_arm_pattern:
        counts['right_arm'] += boost
    if has_leg_pattern:
        counts['right_leg'] += boost
Copilot AI Jan 6, 2026

The magic numbers Y_THRESHOLD_WEAK (0.05) and Y_THRESHOLD_STRONG (0.1) used in lines 1030 and 1031 are well-documented as constants at the top of the file. However, the boost values (2 and 4) used in lines 1031, 1033, 1035, 1037, 1039, and 1041 are hardcoded without explanation. These magic numbers should be extracted as named constants with documentation explaining their purpose in the limb detection scoring system.

Comment on lines +1675 to +1693
viewer_code = f'''
if __name__ == '__main__':
    from skrobot.model import Axis
    from skrobot.viewers import PyrenderViewer

    robot = {class_name}()
    viewer = PyrenderViewer()
    viewer.add(robot)
{axis_code}
    viewer.show()

    while viewer.is_active:
        viewer.redraw()
'''
Copilot AI Jan 6, 2026

The viewer_code template (lines 1675-1693) generates a main block that will fail if the robot has no detected groups. The axis_code string would be empty, but more critically, if there are no groups at all, the viewer would show just an empty robot. Consider adding a check or comment in the generated code to handle this edge case gracefully.

init_lines.append(f"        self.{end_coords_attr} = CascadedCoords(")
init_lines.append(f"            parent=self.{parent_attr},")
init_lines.append(f"            name='{end_coords_name}')")

Copilot AI Jan 6, 2026

In the generated __init__ method, if all groups have no offset or rotation, the method body will be empty after the super().__init__() call (lines 1538-1584). While Python allows this, it would be cleaner to add a docstring or comment when there are no end_coords to initialize.

Suggested change
# If no end-effector coordinates were generated, add a comment for clarity
if not any("CascadedCoords(" in line for line in init_lines):
    init_lines.append("        # No end-effector coordinates to initialize")


# Mesh analysis thresholds
POSITION_EPSILON = 1e-6 # Threshold for near-zero position values
FINGERTIP_RATIO = 0.1 # Top percentage of vertices considered as fingertip
Copilot AI Jan 6, 2026

The constant FINGERTIP_RATIO is defined as 0.1 (10%), but its usage on line 713 selects vertices with values greater than the threshold, which means it's selecting the top 10% of vertices along the primary axis. The naming might be clearer if it were called FINGERTIP_PERCENTILE or TOP_VERTICES_RATIO to better communicate that it represents a percentile threshold rather than a direct ratio.

Suggested change
FINGERTIP_RATIO = 0.1 # Top percentage of vertices considered as fingertip
FINGERTIP_RATIO = 0.1 # Fraction (0.0–1.0) defining the top percentage of vertices (e.g., top 10%) along the primary axis considered as fingertip region

Comment on lines +1247 to +1248
for desc in descendants:
    if any(p in desc.lower() for p in ['gripper', 'finger', 'hand']):
Copilot AI Jan 6, 2026

The pattern matching in _find_gripper_chains (line 1248) uses a hardcoded list ['gripper', 'finger', 'hand'] instead of using the PatternConfig. This bypasses the configurable pattern system and makes it impossible for users to customize gripper detection. Consider using config.matches('gripper', desc) instead.

Comment on lines +1239 to +1248
def _find_gripper_chains(G, arm_tip, link_map):
    """Find gripper chain starting from arm tip."""
    try:
        descendants = nx.descendants(G, arm_tip)
    except nx.NetworkXError:
        return None

    gripper_links = []
    for desc in descendants:
        if any(p in desc.lower() for p in ['gripper', 'finger', 'hand']):
Copilot AI Jan 6, 2026

The function _find_gripper_chains is missing the config parameter, which prevents it from using the configurable pattern matching system. This function should accept a config parameter and pass it to pattern matching operations.

Suggested change
def _find_gripper_chains(G, arm_tip, link_map):
    """Find gripper chain starting from arm tip."""
    try:
        descendants = nx.descendants(G, arm_tip)
    except nx.NetworkXError:
        return None
    gripper_links = []
    for desc in descendants:
        if any(p in desc.lower() for p in ['gripper', 'finger', 'hand']):

def _find_gripper_chains(G, arm_tip, link_map, config=None):
    """Find gripper chain starting from arm tip."""
    try:
        descendants = nx.descendants(G, arm_tip)
    except nx.NetworkXError:
        return None
    # Determine name patterns for identifying gripper-related links.
    # By default, use the existing hard-coded substrings to preserve behavior.
    name_patterns = ['gripper', 'finger', 'hand']
    if config is not None:
        patterns_from_config = None
        # Support both dict-like and attribute-style config objects.
        get_method = getattr(config, 'get', None)
        if callable(get_method):
            try:
                patterns_from_config = (
                    config.get('gripper_name_patterns')
                    or config.get('gripper_patterns')
                )
            except Exception:
                patterns_from_config = None
        else:
            patterns_from_config = getattr(config, 'gripper_name_patterns', None)
            if patterns_from_config is None:
                patterns_from_config = getattr(config, 'gripper_patterns', None)
        if patterns_from_config:
            # Ensure we have a list-like collection of patterns.
            name_patterns = list(patterns_from_config)
    gripper_links = []
    for desc in descendants:
        name_lower = desc.lower()
        matched = False
        for pattern in name_patterns:
            if isinstance(pattern, str):
                # Case-insensitive substring match for string patterns.
                if pattern.lower() in name_lower:
                    matched = True
                    break
            else:
                # Allow regex-like objects with a .search method.
                search = getattr(pattern, 'search', None)
                if callable(search):
                    try:
                        if search(desc):
                            matched = True
                            break
                    except Exception:
                        # Ignore invalid patterns and continue with others.
                        continue
        if matched:

resolved_path = resolve_filepath(os.getcwd(), file_obj)
if resolved_path is None:
    raise ValueError(
        'Could not resolve package path: {}'.format(file_obj))
Copilot AI Jan 6, 2026

The error message on line 3415 could be more helpful by providing guidance on how to resolve the issue. Consider adding information about setting up ROS environment variables or checking package installation.

Suggested change
    'Could not resolve package path: {}'.format(file_obj))
    ('Could not resolve package path: {}. Ensure that your ROS '
     'environment variables (e.g., ROS_PACKAGE_PATH) are correctly '
     'set and that the referenced package is installed and '
     'discoverable.').format(file_obj))

parser.add_argument(
    'input_urdfpath',
    type=str,
    help='Input URDF path')
Copilot AI Jan 6, 2026

The CLI help text 'Input URDF path' on line 20 could be more descriptive. Consider changing it to 'Path to input URDF file (supports package:// URLs)' to inform users about the package:// URL support that was added in this PR.

Suggested change
help='Input URDF path')
help='Path to input URDF file (supports package:// URLs)')

iory merged commit 21902bf into main on Jan 6, 2026
25 checks passed
iory deleted the model branch on January 6, 2026 at 10:58