CENTERSTAGE Java Autonomous Programs

As mentioned on the CENTERSTAGE autonomous programs page, there are lots of points to score in the autonomous period. That page links mostly to tutorial pages for writing programs in Blocks. This page describes some of the Java sample programs that relate to autonomous programs and provides other example autonomous programs.

There are more Java sample programs than Blocks sample programs, including some that are already autonomous programs or are easily adapted to autonomous. The program names link to the External Samples folder, but these programs are also available in OnBot Java.

  • RobotAutoDriveByEncoder_Linear.java – This OpMode illustrates the concept of driving a path based on encoder counts. It has the robot drive forward for 48 inches, spin right for 12 inches, then drive backward for 24 inches. Just what you want for driving a tank-drive robot.
  • RobotAutoDriveByGyro.java – This OpMode illustrates the concept of driving an autonomous path based on Gyro (IMU) heading and encoder counts. Similar to the previous program, but adds gyroscopic control to the driving. Useful for driving long distances, e.g. from the front starting position to the backstage area in CENTERSTAGE.
  • RobotAutoDriveByTime_Linear.java – This OpMode illustrates the concept of driving a path based on time. The simplest form of driving, useful mainly if you do not have encoders on your motors.
  • RobotAutoDriveToAprilTagOmni.java – This OpMode illustrates using a camera to locate and drive towards a specific AprilTag. The code assumes a Holonomic (Mecanum or X Drive) Robot. A very useful example that drives the robot to a set position and orientation in front of a target April Tag. There are April Tags on the front wall and the backdrop that can be used. See the TFOD_BlueFrontAprilTag.java program below for an example program that uses both the front wall April Tag and a backdrop April Tag.
  • RobotAutoDriveToAprilTagTank.java – This OpMode illustrates using a camera to locate and drive towards a specific AprilTag. The code assumes a basic two-wheel (Tank) Robot Drivetrain.
  • RobotAutoDriveToLine_Linear.java – This OpMode illustrates the concept of driving up to a tape line and then stopping. Uses a color sensor to detect a white tape line. This could be adapted to detect the red/blue spike marks, which would be more useful in the CENTERSTAGE game.

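Driving by encoder counts, as in RobotAutoDriveByEncoder_Linear, comes down to converting a distance in inches into a motor encoder target. Here is a minimal sketch of that conversion; the motor and wheel constants below are assumptions, so substitute the values for your own hardware:

```java
// Sketch of the inches-to-encoder-ticks math used for encoder driving.
// All constants are assumed example values, not taken from the SDK sample.
public class EncoderMath {
    static final double COUNTS_PER_MOTOR_REV = 537.7; // e.g. a goBILDA 312 RPM motor (assumed)
    static final double DRIVE_GEAR_REDUCTION = 1.0;   // external gearing, if any (assumed)
    static final double WHEEL_DIAMETER_INCHES = 4.0;  // assumed wheel size
    static final double COUNTS_PER_INCH =
            (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) / (WHEEL_DIAMETER_INCHES * Math.PI);

    // Target encoder position for a move of the given length in inches.
    public static int targetCounts(double inches) {
        return (int) (inches * COUNTS_PER_INCH);
    }

    public static void main(String[] args) {
        // Encoder target for the 48-inch forward leg of the sample path.
        System.out.println(targetCounts(48));
    }
}
```

In the real OpMode this target is passed to the motor with setTargetPosition before running to position.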
Here are some CENTERSTAGE Java sample programs that implement autonomous programs. They can be found in the CENTERSTAGE Samples GitHub repository.

  • AutoPixel1.java – This program attempts to recognize the pixel on a spike mark, place a purple pixel on that mark, and then place a yellow pixel in the corresponding area on the backdrop. It is a very rough program, cobbled together from various Java sample programs such as ConceptScanServo.java, RobotAutoDriveToAprilTagOmni.java and the Concept IMU, TensorFlow, and April Tag programs.

  • AutoPixelFront.java – This program attempts to recognize the pixel on a spike mark, place a purple pixel on that mark, and then place a yellow pixel in the corresponding area on the backdrop. It uses TensorFlow to detect a Pixel, April Tags to detect the backdrop area, and the IMU for some turns and for driving towards backstage.

  • TFOD_BlueFrontAprilTag.java – An example CENTERSTAGE autonomous program that incorporates TensorFlow Team Prop detection, IMU-controlled driving, use of the front wall April Tag to drive to position, IMU driving to the backstage area, April Tag driving to line up on the correct backdrop area, and finally a touch sensor to stop when moving up to the backdrop prior to placing the yellow pixel. Mecanum-wheel robots can’t use motor encoders effectively, so this program uses the IMU and April Tags to control the robot.
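As a concrete illustration of the Team Prop detection step, here is a hypothetical helper that maps a TensorFlow recognition’s horizontal position in the camera frame to a spike mark. The frame width and the one-third boundaries are assumptions you would tune for your own camera mounting:

```java
// Hypothetical spike-mark classifier. The FTC SDK reports each TensorFlow
// recognition's pixel coordinates; this classifies by the detection's
// horizontal center. Frame width and boundaries are assumed values.
public class SpikeMarkChooser {
    public enum SpikeMark { LEFT, CENTER, RIGHT }

    static final double FRAME_WIDTH = 640; // assumed webcam resolution

    public static SpikeMark classify(double detectionCenterX) {
        if (detectionCenterX < FRAME_WIDTH / 3) return SpikeMark.LEFT;
        if (detectionCenterX < 2 * FRAME_WIDTH / 3) return SpikeMark.CENTER;
        return SpikeMark.RIGHT;
    }
}
```

If no Team Prop is detected at all, a program would typically fall back to the spike mark that is hardest for the camera to see.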

The TFOD_BlueFrontAprilTag program uses the IMU to control direction on moves and turns. A useful enhancement would be to add a color sensor to detect the spike marks when dropping off the purple pixel. After it drops off the pixel, the robot turns to face the front wall April Tag. It then uses an aprilTagDrive function (derived from the RobotAutoDriveToAprilTag programs) to strafe to the A2 tile. The robot then does an IMU turn to face backstage and drives forward using the IMU. Once near the backstage area it turns to roughly face the backdrop, then uses the aprilTagDrive function to line up in front of the backdrop April Tag. The robot moves sideways slightly to line up the arm, then drives forward until the touch sensor indicates it has touched the backdrop. Finally the robot places the yellow pixel and parks in the corner.
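The IMU-controlled turns and straight drives above all rely on computing a heading error and turning it into a proportional steering correction. A simplified sketch, modeled loosely on the RobotAutoDriveByGyro sample (the gain value is an assumption you would tune on your robot):

```java
// Minimal sketch of IMU heading correction for straight-line driving.
// getSteeringCorrection is a simplified, hypothetical version of the
// logic in RobotAutoDriveByGyro; the gain is an assumed tuning value.
public class HeadingMath {
    // Wrap an angle error into -180..+180 degrees so the robot
    // always turns the short way toward the target heading.
    public static double normalize(double degrees) {
        while (degrees > 180) degrees -= 360;
        while (degrees <= -180) degrees += 360;
        return degrees;
    }

    // Proportional steering correction, clipped to +/-1 motor power.
    public static double getSteeringCorrection(double targetHeading,
                                               double currentHeading,
                                               double gain) {
        double error = normalize(targetHeading - currentHeading);
        return Math.max(-1.0, Math.min(1.0, error * gain));
    }

    public static void main(String[] args) {
        // Robot has drifted 10 degrees right of a 90-degree target heading.
        System.out.println(getSteeringCorrection(90, 80, 0.02));
    }
}
```

In a drive loop the correction is added to one side’s power and subtracted from the other, keeping the robot on heading while it moves.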

The aprilTagDrive function is based on the RobotAutoDriveToAprilTag programs, but allows you to set the target Yaw and Bearing values instead of assuming you want them to be zero. With a non-zero Yaw target the robot can be positioned to the left or right of the April Tag (with the webcam still pointing at the tag). A non-zero Bearing target would also position the robot to the left or right, but with the webcam facing the perimeter wall (so the robot shouldn’t be too close to the April Tag or the webcam might lose sight of it).
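The proportional control inside such an aprilTagDrive function might look like the following sketch. It is modeled on RobotAutoDriveToAprilTagOmni but accepts non-zero targets; the gain and clip constants are assumptions, not exact SDK values:

```java
// Hypothetical core of an aprilTagDrive-style function: given the tag's
// reported range, bearing and yaw plus the targets you want, compute
// drive/strafe/turn powers for a holonomic drivetrain. Gains and clip
// limits are assumed tuning values.
public class AprilTagDriveMath {
    static final double SPEED_GAIN = 0.02, STRAFE_GAIN = 0.015, TURN_GAIN = 0.01;
    static final double MAX_AUTO_SPEED = 0.5, MAX_AUTO_STRAFE = 0.5, MAX_AUTO_TURN = 0.3;

    static double clip(double v, double limit) {
        return Math.max(-limit, Math.min(limit, v));
    }

    // Unlike the stock sample, targetBearing and targetYaw need not be
    // zero, which lets the robot park offset from the tag.
    public static double[] powers(double range, double bearing, double yaw,
                                  double targetRange, double targetBearing, double targetYaw) {
        double rangeError = range - targetRange;
        double headingError = bearing - targetBearing;
        double yawError = yaw - targetYaw;
        double drive = clip(rangeError * SPEED_GAIN, MAX_AUTO_SPEED);
        double turn = clip(headingError * TURN_GAIN, MAX_AUTO_TURN);
        double strafe = clip(-yawError * STRAFE_GAIN, MAX_AUTO_STRAFE);
        return new double[] {drive, strafe, turn};
    }
}
```

The loop repeats this calculation every cycle until all three errors are small, so the robot converges on the chosen offset pose rather than stopping dead center.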

Getting Help

It is often possible to use Google (or another search engine) to get help or solve problems. There are lots of resources online. If you’re still stuck, you can ask for help here.