Contest Coordinator: Richard Chrisman
Event Overview: A two-member team builds its robot and arm mechanism prior to the competition. During the competition, there are two separate but related challenges. The first is a demonstration of proficiency in five specific skill test challenges. The second is a simulated urban search and rescue mission in which teams traverse a course and locate, secure, and properly dispose of ordnance. Both challenges require teams to demonstrate proficiencies such as remotely operating the robot via camera, navigating, manipulating the arm mechanism to collect simulated ordnance, traversing various types of terrain, and communicating between driver and spotter. Each team performs one round of the five skill trials and one round of the simulated mission to locate and dispose of two ordnance items. In both challenges, teams operate under time constraints to complete the objective. The challenge breakdown is as follows. Note: See the Appendix of the 2023 Urban Search and Rescue Challenge Team Guide for technical details of each skill challenge.
The five identified skills challenge areas are:
- Arm Mechanism Skill Challenge:
a. Teams will demonstrate how effectively they can open mailboxes and remove ordnance at three levels of increasing difficulty.
- Navigation Skill Challenge:
a. Teams will demonstrate basic navigation skills while carrying an ordnance item, driving to specified areas of the field.
- Drive Chassis Skill Challenge:
a. Teams will navigate multilevel terrain challenges that test the engineering of their chassis and overall robot design. (Examples include smooth ramps, rough ramps, a teeter-totter, and a debris field.)
- Camera POV Skill Challenge:
a. Teams will demonstrate their Tele-Op/remote-control driving proficiency by navigating through a complex tunnel using only POV (point of view) information transmitted from an onboard camera.
- Communication and Collaboration Skill Challenge:
a. Teams will demonstrate communication and collaboration skills by navigating a course using only directions from a spotter. This simulates a potential hardware failure on the robot, where the driver must depend solely on information from the spotter to complete the challenge.