
System Architecture

Architecture


[Figure: System architecture diagram]

Note: Arrow direction describes primary information flow, that is, information that causes subsequent action in the system. This does not mean systems at the end of an arrow cannot provide feedback. For example, when Task Execution commands motion to a particular EEF pose, it may still receive feedback from Motion Planning on the state of the motion's execution, its success, and the resulting robot pose.

Component Connectivity


[Figure: Component connectivity diagram]

Explanation

- RGB-D camera mounted on the roof with sufficient FOV of the workspace and peripherals
Pose Estimation
- Object detection provides image segmentation of the target object
  - Passes the segmented, depth-registered image to object pose estimation
  - Notifies task execution of an incoming target
  - This output is a continuous stream for as long as the object is still being detected
- Object pose estimation uses camera calibration values, object templates, and depth readings to estimate the object pose
  - Passes the estimated, timestamped pose to object pose prediction
  - This output is a continuous stream for as long as the object is still being detected
- Object pose prediction predicts the object pose on the conveyor belt at a certain horizon into the future, based on the pose at detection and the conveyor speed
  - Passes the predicted pose to task execution
  - This predicted pose is constantly updated and streamed for as long as the object is still being detected
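The prediction step above can be sketched as a simple constant-speed extrapolation along the belt's travel axis. This is a minimal illustration, not the actual implementation: the `Pose2D` type, the assumption that the belt moves along +x, and the assumption that the object does not rotate relative to the belt are all mine.

```python
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float       # metres, along the belt's travel direction (assumed +x)
    y: float       # metres, across the belt
    theta: float   # yaw, radians
    stamp: float   # seconds

def predict_pose(measured: Pose2D, belt_speed: float, horizon: float) -> Pose2D:
    """Extrapolate the measured object pose `horizon` seconds ahead,
    assuming the belt carries the object at constant speed along +x
    and the object does not rotate relative to the belt."""
    return Pose2D(
        x=measured.x + belt_speed * horizon,
        y=measured.y,
        theta=measured.theta,
        stamp=measured.stamp + horizon,
    )
```

In practice this would be re-run on every new pose estimate, so the stream of predictions tracks any drift in the measured pose.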
HRI Estimation
- Coworker detection provides image segmentation of the coworker
  - Passes the segmented, depth-registered image to handover pose estimation and the safety system
- Handover pose estimation determines an ergonomic gripper pose based on the coworker's relative location and orientation
  - Passes the estimated pose to task execution
  - This estimated pose is constantly updated and streamed for as long as the coworker is still being detected
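One way to picture the handover pose computation: place the gripper a fixed, comfortable distance in front of the coworker and yaw it back to face them. A minimal planar sketch, where the offset distance and the frame conventions are assumptions for illustration only:

```python
import math

def handover_gripper_pose(coworker_xy, coworker_yaw, offset=0.45):
    """Place the gripper `offset` metres in front of the coworker
    (along their facing direction) and yaw the gripper back toward
    them, presenting the object within comfortable reach."""
    cx, cy = coworker_xy
    gx = cx + offset * math.cos(coworker_yaw)
    gy = cy + offset * math.sin(coworker_yaw)
    gripper_yaw = coworker_yaw + math.pi  # face back toward the coworker
    return (gx, gy, gripper_yaw)
```

A real implementation would work in 3D, account for the coworker's height and handedness, and filter the stream of detections before commanding motion.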
Task Execution
- Handles the overall behavioral logic and recovery actions
- Responds to the presence of incoming objects, determines via high-level heuristics whether the arm will be able to pick in time, and if not, pauses the conveyor
- Translates predicted object poses into motion goals, namely a gripper pose and a gripper action, and monitors their execution
- Translates estimated handover poses into motion goals, namely a gripper pose, and monitors their execution
- Utilizes the speaker to provide verbal updates on actions, particularly alerting the coworker to a handover
- Toggles the handover control mode in the arm controller, whereby a detected tug triggers the gripper to open
- Responds to user interface commands
- Responds to the safety system pause
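The "pick in time" heuristic can be as simple as comparing the arm's estimated motion time against the time the belt will take to carry the object out of reach. A sketch under assumed names and values; the margin, the 1D reach model, and the callback are illustrative, not the actual logic:

```python
def can_pick_in_time(object_x, belt_speed, reach_limit_x,
                     est_motion_time, margin=0.2):
    """Feasible if the estimated motion time (plus a safety margin)
    is shorter than the time until the object passes the far edge
    of the arm's reachable region along the belt."""
    time_to_exit = (reach_limit_x - object_x) / belt_speed
    return est_motion_time + margin < time_to_exit

def on_incoming_object(object_x, belt_speed, reach_limit_x,
                       est_motion_time, pause_conveyor):
    """If the pick is infeasible, pause the conveyor via the
    supplied callback rather than attempting a doomed motion."""
    if not can_pick_in_time(object_x, belt_speed, reach_limit_x,
                            est_motion_time):
        pause_conveyor()
```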
Pick and place
- Motion planning converts target gripper poses into joint-space trajectories
  - Executes these trajectories by passing them to the arm controller
  - Provides low-level recoveries for retries
- Arm controller executes the above trajectories
  - Uses momentum-based control for safe interaction with humans and objects
  - Provides full telemetry to motion planning and tug detection
- Tug detection detects EEF tugs and triggers gripper release
  - Uses estimated EEF dynamics to detect tugs at the EEF
  - Commands the gripper to open when a tug is identified
- Gripper controller opens and closes the gripper
  - Controls the pneumatics to apply firm but non-damaging pressure to the object
  - Senses drops via integrated sensors
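The tug-detection step above might be realized as a thresholded, debounced check on the estimated external force at the EEF. A minimal sketch: the force threshold, debounce window, and gripper callback are assumed values and names, not the real tuning:

```python
class TugDetector:
    """Flags a tug when the estimated external force at the EEF stays
    above a threshold for a debounce window, then commands the gripper
    to open via the supplied callback."""

    def __init__(self, gripper_open, force_threshold=15.0, debounce=0.15):
        self.gripper_open = gripper_open        # callback into the gripper controller
        self.force_threshold = force_threshold  # newtons (illustrative)
        self.debounce = debounce                # seconds above threshold before firing
        self._above_since = None

    def update(self, est_external_force, now):
        """Call at the controller rate with the current force estimate
        and timestamp; returns True on the tick a tug is declared."""
        if est_external_force < self.force_threshold:
            self._above_since = None            # force dropped: reset the window
            return False
        if self._above_since is None:
            self._above_since = now             # start of a sustained pull
        if now - self._above_since >= self.debounce:
            self.gripper_open()
            self._above_since = None
            return True
        return False
```

Debouncing keeps momentary contacts or estimation noise from releasing the object prematurely; only a sustained pull opens the gripper.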
Safety System
- Includes HW E-stops that shut off power to actuated systems
- Can pause the system
- Can provide a SW E-stop that puts the arm into high-compliance mode and opens the gripper
- Consumes coworker detection and engages the pause or SW E-stop based on proximity and calculated risk
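At its simplest, the proximity-based escalation above maps coworker distance onto a tiered response. A sketch with illustrative radii (the real system would also weigh velocity and calculated risk, not distance alone):

```python
def safety_action(coworker_distance, pause_radius=1.5, estop_radius=0.5):
    """Tiered safety response from coworker proximity:
    inside the inner radius, issue the SW E-stop (high compliance,
    open gripper); inside the outer radius, pause; otherwise run."""
    if coworker_distance <= estop_radius:
        return "sw_estop"
    if coworker_distance <= pause_radius:
        return "pause"
    return "run"
```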
User Interface
- Allows for start, stop, cancel, resume, restart, and scheduling of operation
- Provides updates on the current action and operational state
- Provides information on safety status and allows for clearing of errors
Calibration
- Commands the arm to calibration poses through motion planning
- Consumes RGB-D data and arm telemetry to run the hand-eye calibration procedure
- Runs the camera-checkerboard calibration procedure
