ITD_demo1 – Java AprilTag Navigation in AUTO

This is a tutorial/demonstration on creating a Java autonomous program for the INTO THE DEEP game. It is loosely based on the RobotAutoDriveByAprilTagOmni Java sample program. It’s intended mostly as an example of one way of navigating around the field in autonomous. It may or may not suit your needs, but might serve as a starting point.

INTO THE DEEP Program Tutorials

The Problem

One problem with mecanum-drive robots is their lack of control in autonomous movement. Motor encoders are of little use because the rollers allow the wheels to slip. Driving by time is not very accurate either, and varies as the battery runs down.

Solution AprilTags

The robot can use the on-field AprilTags to determine its position on the field. All you need is a basic webcam, which should have come in your robot’s control and communications kit (with the driver hub and gamepad). ftc-docs has some information about webcams here.

AprilTag is a popular camera-based technology, using a scanned image similar to a QR Code. The FTC SDK provides functions that let you determine the position and orientation of a given AprilTag. The INTO THE DEEP field contains six AprilTags conveniently positioned on all four walls of the field. A robot is never more than four feet from an AprilTag, and these tags are large enough they can be seen from the other side of the field.

This program also makes use of the control hub’s inertial measurement unit (IMU) to point the robot accurately and to attempt to drive the robot straight in a given direction. IMU-based turns should be quite accurate. Driving straight by time with mecanum wheels can be inaccurate because one or more wheels might slip on their rollers while the others do not. The IMU should compensate somewhat for this when trying to drive straight.

This is a program for StudicaBot, a demonstration robot made from the Studica Robotics FTC kit. Initially built for the CENTERSTAGE game, it will be modified for the INTO THE DEEP game. The following video shows the robot before being modified, running this program. The robot will push two samples into the net zone and park in the observation zone.

StudicaBot running the ITD_demo1 program

ITD_demo1.java

The Java code for ITD_demo1 can be found in this GitHub repository.

If you look at the ITD_demo1.java program you’ll see similarities to the RobotAutoDriveByAprilTagOmni.java sample program. You can create that program in OnBot Java or view it here.

Some functions are similar:

  • moveRobot – I added a sleep(10) call to allow the robot to apply the given motor power for a short while before looping and calculating the next moveRobot call (see the sketch after this list).
  • initVisionPortal is similar to initAprilTag – initializes the Vision Portal with an AprilTag processor. initVisionPortal has comments to show where you might add webcam calibration information or camera resolution.
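
For reference, here is a rough sketch of what these two functions might look like, following the sample program’s pattern. The drive motor names, the webcam configuration name "Webcam 1", and the 640×480 resolution are assumptions, not values taken from ITD_demo1.

        // Sketch of a moveRobot-style method (drive motor names assumed).
        public void moveRobot(double x, double y, double yaw) {
            // Calculate wheel powers for a mecanum drive.
            double leftFrontPower  = x - y - yaw;
            double rightFrontPower = x + y + yaw;
            double leftBackPower   = x + y - yaw;
            double rightBackPower  = x - y + yaw;

            // Normalize wheel powers so none exceeds 1.0.
            double max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
            max = Math.max(max, Math.abs(leftBackPower));
            max = Math.max(max, Math.abs(rightBackPower));
            if (max > 1.0) {
                leftFrontPower  /= max;
                rightFrontPower /= max;
                leftBackPower   /= max;
                rightBackPower  /= max;
            }

            // Send the powers to the wheels.
            leftFrontDrive.setPower(leftFrontPower);
            rightFrontDrive.setPower(rightFrontPower);
            leftBackDrive.setPower(leftBackPower);
            rightBackDrive.setPower(rightBackPower);

            sleep(10);   // let the powers apply briefly before the caller recalculates
        }

        // Sketch of an initVisionPortal-style method ("Webcam 1" is an assumed configuration name).
        // Uses org.firstinspires.ftc.vision.VisionPortal, org.firstinspires.ftc.vision.apriltag.AprilTagProcessor,
        // org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName, and android.util.Size.
        private void initVisionPortal() {
            aprilTag = new AprilTagProcessor.Builder()
                    // .setLensIntrinsics(fx, fy, cx, cy)      // optional: webcam calibration values
                    .build();

            visionPortal = new VisionPortal.Builder()
                    .setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
                    .setCameraResolution(new Size(640, 480))   // optional: pick a calibrated resolution
                    .addProcessor(aprilTag)
                    .build();
        }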

Some functions are new:

  • aprilTagDrive – moves the robot so that it ends up at a target range (distance) from an AprilTag, at a target bearing to the tag, and with the tag at a target yaw. This is a complicated function derived from the AprilTag driving logic in the sample program, but with the gamepad logic removed and with non-zero target bearing and yaw values allowed. See below for details of this function.
  • imuTurn – turns the robot to the indicated heading (a rough sketch of imuTurn and imuMove follows this list). Note: the IMU reports negative values for clockwise turns/headings.
  • imuMove – drives the robot forward given a motor power level, a heading, and a time value. The IMU keeps the robot on the indicated heading, while motor power and time control how far the robot drives. This isn’t very precise, but we correct for that by using AprilTags for most of the movements; imuMove is only used for short movements.
  • displayTelemetry – displays the current “step” of the program and AprilTag information on the driver station.
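
Here is a rough sketch of how imuTurn and imuMove might be implemented on top of moveRobot, using the SDK’s IMU calls. The gain values, the thresholds, and the fact that heading wrap-around near ±180° is ignored are all assumptions, not ITD_demo1’s actual code.

        // Sketch of an imuTurn-style method: proportional turn toward a target heading.
        void imuTurn(double targetHeading) {
            final double TURN_GAIN = 0.02;                    // assumed gain value
            ElapsedTime timer = new ElapsedTime();
            while (opModeIsActive() && timer.milliseconds() < 3000) {
                double heading = imu.getRobotYawPitchRollAngles().getYaw(AngleUnit.DEGREES);
                double error = targetHeading - heading;       // wrap-around near ±180 degrees not handled here
                if (Math.abs(error) < 2) break;               // "close enough" test
                double turn = Range.clip(error * TURN_GAIN, -0.3, 0.3);
                moveRobot(0, 0, turn);                        // turn in place
            }
            moveRobot(0, 0, 0);                               // stop
        }

        // Sketch of an imuMove-style method: drive forward for a time, correcting toward a heading.
        void imuMove(double power, double targetHeading, double milliseconds) {
            final double HEADING_GAIN = 0.02;                 // assumed gain value
            ElapsedTime timer = new ElapsedTime();
            while (opModeIsActive() && timer.milliseconds() < milliseconds) {
                double heading = imu.getRobotYawPitchRollAngles().getYaw(AngleUnit.DEGREES);
                double correction = Range.clip((targetHeading - heading) * HEADING_GAIN, -0.3, 0.3);
                moveRobot(power, 0, correction);              // drive forward while correcting heading
            }
            moveRobot(0, 0, 0);                               // stop
        }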

The top of the ITD_demo1.java program is very similar to RobotAutoDriveByAprilTagOmni and uses the same variables to control the AprilTag driving. ITD_demo1 adds initialization of the IMU; check out the RobotAutoDriveByGyro sample program for another example that uses the IMU.

        // init IMU
        imu = hardwareMap.get(IMU.class, "imu");
        double targetYaw = 0;
        /* The next two lines define the hub's mounting orientation.
         * The default orientation is a hub mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
         * On this robot the USB port points LEFT, so the USB direction below differs from that default.
         */
        RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
        RevHubOrientationOnRobot.UsbFacingDirection  usbDirection  = RevHubOrientationOnRobot.UsbFacingDirection.LEFT;
        RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);

        // Now initialize the IMU with this mounting orientation
        // Note: if you choose two conflicting directions, this initialization will cause a code exception.
        imu.initialize(new IMU.Parameters(orientationOnRobot));
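
Once the IMU is initialized, the current heading can be read at any time with the standard SDK call shown below. The resetYaw line is optional and my own assumption, not necessarily what ITD_demo1 does.

        // Read the robot's current heading (yaw) in degrees; clockwise turns report negative values.
        double heading = imu.getRobotYawPitchRollAngles().getYaw(AngleUnit.DEGREES);

        // imu.resetYaw();   // optionally zero the heading at the start of the OpMode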

I’ve removed any gamepad logic that was in the RobotAutoDriveByAprilTagOmni program. This is a pure autonomous program.

The main part of the program is a linear, step-by-step progression of calls that move the robot around the field.

            // push sample into net zone, -10 to move away from wall, then drive forward.
            currentStep = 1;
            imuTurn(-10.0);             // turn to -10 which is a clockwise rotation
            imuMove(0.4, -10.0, 800);   // drive by time 800ms using -10 heading and 0.4 motor power.
            
            // turn so we can see tag 16
            currentStep = 2;
            imuTurn(-45.0);
            
            // align on April Tag 16 - red front wall, the robot will strafe to the right.
            currentStep = 3;
            aprilTagDrive(DESIRED_TAG_ID,20.0,0.0,0.0);

            // strafe right pointing at tag 16, lined up on a second sample
            currentStep = 4;
            aprilTagDrive(DESIRED_TAG_ID,28.0,0.0,-50.0);

Technically, Step 1 is actually two steps; I could have renumbered the steps, but imuTurn(-10) is such a short step. currentStep is a variable that indicates which step the program is on. It is displayed by the displayTelemetry function, which is called by the functions that move the robot (a sketch of displayTelemetry follows the step descriptions below).

  • Step 1 turns away from the wall using imuTurn, then pushes the sample to the edge of the net zone using imuMove.
    • I don’t use imuMove to actually do the turn since that would likely result in some sort of curved movement. The robot would try driving straight but would be corrected to the new heading.
  • Step 2 is a clockwise turn so the webcam can see AprilTag 16 on the front wall.
  • Step 3 uses aprilTagDrive to line up directly in front of the April Tag at a distance of 20″.
  • Step 4 shows using aprilTagDrive with a non-zero yaw value: the robot moves to a 28″ distance, but the yaw target causes the robot to strafe right so that the AprilTag ends up at a -50° yaw. This happens to line up the robot so that it will be able to push a sample into the net zone.
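
Because each step updates currentStep, displayTelemetry can report progress on the driver station along with the latest AprilTag values. A minimal sketch of what such a method might contain (the exact fields and layout in ITD_demo1 may differ):

        // Sketch of a displayTelemetry-style method (fields shown are assumptions).
        void displayTelemetry(AprilTagDetection desiredTag) {
            telemetry.addData("Step", currentStep);
            if (desiredTag != null) {
                telemetry.addData("Tag", desiredTag.id);
                telemetry.addData("Range",   "%.1f inches",  desiredTag.ftcPose.range);
                telemetry.addData("Bearing", "%.1f degrees", desiredTag.ftcPose.bearing);
                telemetry.addData("Yaw",     "%.1f degrees", desiredTag.ftcPose.yaw);
            } else {
                telemetry.addData("Tag", "not detected");
            }
            telemetry.update();
        }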

The main program continues with calls to those three functions: push that sample into the net zone, then back up and line up on AprilTag 16, then use AprilTag 16 to back away from the front wall, moving about halfway toward the back of the field. The robot then spins using the IMU so that the webcam ends up pointing at AprilTag 14, and finally uses AprilTag 14 to get closer and strafe to the right so that it ends up in the observation zone.
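
For illustration only, that remainder of the sequence might look something like the following. The step numbers, headings, distances, and yaw values here are placeholders to show the call pattern, not the actual numbers from ITD_demo1.

            // Hypothetical continuation -- the values below are placeholders, not ITD_demo1's actual numbers.
            currentStep = 5;
            imuMove(0.4, -50.0, 600);                        // push the second sample into the net zone

            currentStep = 6;
            aprilTagDrive(DESIRED_TAG_ID, 20.0, 0.0, 0.0);   // back up and line up on tag 16 again

            currentStep = 7;
            aprilTagDrive(DESIRED_TAG_ID, 60.0, 0.0, 0.0);   // use tag 16 to back away toward mid-field

            currentStep = 8;
            imuTurn(135.0);                                  // spin so the webcam points at tag 14

            currentStep = 9;
            aprilTagDrive(14, 24.0, 0.0, 40.0);              // approach tag 14, strafing into the observation zone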

aprilTagDrive

The workhorse of the program is the function aprilTagDrive. It makes use of the Bearing, Range, and Yaw values provided by the AprilTag SDK functions.

  • public void aprilTagDrive(int targetTag, double targetRange, double targetBearing, double targetYaw)
  • targetTag is the ID of the target April Tag. e.g. 16 is the ID of the tag on the red side of the front wall
  • targetRange is the distance in inches from the tag that we want to end up at
  • targetBearing is the heading of the robot with respect to the tag, in degrees; e.g. if zero, we are pointing directly at the tag. Normally we want to point at the tag or risk the tag not being in the webcam’s field of view, but you might wish to line up facing an AprilTag while offset slightly to the left or right.
  • targetYaw is the orientation of the tag with respect to the robot in degrees. e.g. if the target yaw is -45 degrees, then the robot is clockwise from the tag at a 45 degree angle.

By setting a target range, bearing and yaw you can position yourself quite accurately assuming you can see an AprilTag and are under six feet from it.

The main part of aprilTagDrive is a while loop. We test opModeIsActive() to ensure we stop the program if the driver presses the stop button on the driver station. We test a done variable which will be set true when the reported AprilTag values are “close enough”. And finally we check elapsed time and exit if we’ve exceeded the timeout threshold.

  • while ( opModeIsActive() && !done && runtime.milliseconds() < timeout )

Our “close enough” determination is the following if test. We set done = true if all three tests pass. We test using Math.abs since we could be plus or minus the indicated target. Our “close enough” test is that the range is less than 3 inches from the target and the heading and yaw values are less than 5 degrees off. This is sufficient for moving around the field and parking, but probably would not be sufficient if you were trying to pick up a sample from one of the spike marks.

  • if ((Math.abs(rangeError) < 3) && (Math.abs(headingError) < 5) && (Math.abs(yawError) < 5))
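
Putting those pieces together, the body of aprilTagDrive might look roughly like the sketch below. It reuses the gain-and-clip pattern (SPEED_GAIN, STRAFE_GAIN, TURN_GAIN and the MAX_AUTO limits) from the sample program; the timeout value and the overall structure shown here are my assumptions, not ITD_demo1’s exact code.

        // Rough sketch of aprilTagDrive's main loop (structure and timeout are assumptions).
        boolean done = false;
        double timeout = 5000;                  // milliseconds, assumed value
        ElapsedTime runtime = new ElapsedTime();

        while (opModeIsActive() && !done && runtime.milliseconds() < timeout) {
            // Look for the requested tag in the current detections.
            AprilTagDetection desiredTag = null;
            for (AprilTagDetection detection : aprilTag.getDetections()) {
                if ((detection.metadata != null) && (detection.id == targetTag)) {
                    desiredTag = detection;
                    break;
                }
            }

            if (desiredTag == null) {
                moveRobot(0, 0, 0);             // no good detection this pass: stop and try again
                continue;
            }

            // Errors relative to the requested range, bearing, and yaw.
            double rangeError   = desiredTag.ftcPose.range   - targetRange;
            double headingError = desiredTag.ftcPose.bearing - targetBearing;
            double yawError     = desiredTag.ftcPose.yaw     - targetYaw;

            if ((Math.abs(rangeError) < 3) && (Math.abs(headingError) < 5) && (Math.abs(yawError) < 5)) {
                done = true;                    // close enough to the target position
            } else {
                double drive  = Range.clip(rangeError   * SPEED_GAIN,  -MAX_AUTO_SPEED,  MAX_AUTO_SPEED);
                double turn   = Range.clip(headingError * TURN_GAIN,   -MAX_AUTO_TURN,   MAX_AUTO_TURN);
                double strafe = Range.clip(-yawError    * STRAFE_GAIN, -MAX_AUTO_STRAFE, MAX_AUTO_STRAFE);
                moveRobot(drive, strafe, turn);
            }
            displayTelemetry(desiredTag);
        }
        moveRobot(0, 0, 0);                     // stop the robot when the move is finished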

Issues about using aprilTagDrive

  • imuMove, imuTurn, and aprilTagDrive all have “close enough” tests where they exit when the reported value is “close enough” to the target. You’ll have to decide what is “close enough”. The issue is that eventually the differences are small enough that the calculated power values are not enough to get the robot to move. You can sometimes hear the motors straining but the robot doesn’t move. If you didn’t have a timeout, this is where the program would pause and never finish.
  • You can increase the various GAIN or MAX_AUTO constants used in the motor power calculation so the robot will still move when the difference is small, but then the robot might start twitching back and forth as it over-corrects. You need to find a balance based on how heavy your robot is and the exact design of your drive base and wheel characteristics.
  • A timeout value is used in these functions because it’s possible the robot never reaches the target value(s). Ideally you set your GAIN values and “close enough” tests such that the robot normally doesn’t stall and time out during a move.
  • You might have noticed in the video that the robot sometimes seems to pause and/or jerk. That is likely because the program didn’t get a good AprilTag detection and so applied zero motor power. Adjusting exposure, gain, or decimation might help with that (see the sketch after this list).
  • This program doesn’t account for camera position on the robot. In fact, the camera on StudicaBot is on the front left corner of the robot, so if you lined up in front of an AprilTag you’d actually be lining up the camera. This must be taken into account when determining where and at what angles you want the robot to move.
  • The robot needs to “see” the AprilTag in order to drive using a tag. A second webcam might help. If StudicaBot had another webcam pointing backwards, it would not have needed to spin at the mid-point of its movement from the front of the field to the back. Check out the ConceptAprilTagSwitchableCameras sample program to see how to use two webcams and switch between them as needed.
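
For the detection drop-outs mentioned above, one setting worth knowing about is the AprilTag processor’s decimation value: a lower value detects tags from farther away at the cost of CPU time, while a higher value processes frames faster but only finds closer tags. The value shown is just an example to tune for your own robot.

        // Decimation trades detection range for processing speed.
        aprilTag.setDecimation(2);   // example value -- tune for your camera and typical tag distances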

Cautions about using AprilTags

  • The webcam must see the entire tag, so any obstruction of even part of the tag means the robot can’t “see” the tag. The INTO THE DEEP submersible structure will interfere with seeing tags on the other side of the field.
  • AprilTag processing is somewhat sensitive to lighting conditions, glare, and robot movement.
  • Check out the setManualExposure function in the RobotAutoDriveToAprilTagOmni sample. It’s probably a good idea to reduce the exposure time and increase the webcam gain so that there is less blur in the webcam image, thereby improving AprilTag detection and evaluation. The faster you want the robot to go while still detecting AprilTags, the shorter the exposure needs to be to reduce motion blur (a condensed sketch follows this list).
  • I’ve had good results just relying on the auto exposure ability of the Logitech C920 webcam, but I haven’t pushed the robot to go very fast.
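
Below is a condensed sketch of the manual exposure and gain settings that setManualExposure applies in the sample program. The exposure and gain numbers are examples only, and the sample also waits for the camera to reach the STREAMING state before changing these controls.

        // Reduce exposure and raise gain to cut motion blur (values are examples).
        ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
        exposureControl.setMode(ExposureControl.Mode.Manual);
        exposureControl.setExposure(6, TimeUnit.MILLISECONDS);

        GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
        gainControl.setGain(250);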

Next Steps

  1. Test how accurate your webcam is with AprilTags. Make use of the ConceptAprilTag sample program. I would start with the defaults and compare the reported Range values with the distance measured by a tape measure from the centre of the tag to the webcam (a simple telemetry check is sketched after this list). If the difference between the measured and reported range is under an inch with the robot about 3 feet from the tag, that’s probably close enough. Check the values with the robot at an angle to the tag and/or with the tag not in the centre of the image. Don’t stress about actual errors too much; what matters is the consistency of the robot’s movement. If the robot always ends up a little too close to the tag, you can always compensate by setting a larger target distance.
  2. If needed, calibrate your webcam. You probably don’t need to do this if using a Logitech C270. But apparently there are multiple versions of the Logitech C920 and not all are detected by the FTC SDK. This should increase the accuracy of the reported results.
  3. Check out AprilTag Localization. Rather than determining how far away an AprilTag is, you can also use AprilTags to determine the position of the robot on the field. This works because each AprilTag is in a known position on the field, so if you know how far away a specific tag is and in what direction, you can calculate the robot’s position on the field.
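
For the accuracy test in step 1, a simple telemetry loop like the one below (modeled on ConceptAprilTag) lets you compare the reported range against a tape measure while the robot sits still.

        // Stream detections and compare ftcPose.range to a tape-measure reading.
        while (opModeIsActive()) {
            for (AprilTagDetection detection : aprilTag.getDetections()) {
                if (detection.metadata != null) {
                    telemetry.addData("Tag " + detection.id,
                            "range %.1f in, bearing %.1f deg, yaw %.1f deg",
                            detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.yaw);
                }
            }
            telemetry.update();
            sleep(20);
        }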