RobotAutoDriveToAprilTagTank_Blocks Sample Program

This OpMode illustrates using a camera to locate and drive towards a specific AprilTag. The code assumes a basic two-wheel (Tank) Robot Drivetrain. For an introduction to AprilTags, see the ftc-docs link below.

This is a Blocks version of the Java sample program RobotAutoDriveToAprilTagTank.java.

When an AprilTag in the Tag Library is detected, the SDK provides location and orientation of the tag, relative to the camera. This information is provided in the “ftcPose” member of the returned “detection”, and is explained in the ftc-docs page linked below.

The driving goal of this program is to rotate to keep the tag centered in the camera while driving towards the tag to achieve the desired distance. To reduce any motion blur (which will interrupt the detection process), the camera exposure is reduced to a very low value (5ms). You can experiment with different exposure and gain values to find the settings that work best for your camera.

The code assumes a Robot Configuration with motors named left_drive and right_drive. The motor directions must be set so a positive power goes forward on both wheels. This sample assumes that the default AprilTag Library (usually for the current season) is being loaded, so you should choose to approach a valid tag ID (usually starting at 0).

  • For CENTERSTAGE, a valid tag ID ranges from 1 to 10.

Under manual control, the left stick will move forward/back, and the right stick will rotate the robot. This is called POV Joystick mode, different from Tank Drive (where each joystick controls a wheel). Manually drive the robot until Target data is displayed on the Driver Station.
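
In the Java sample, POV mode reads the joysticks roughly like this (a sketch; the scale factors are the sample's defaults for smoother manual driving):

    // POV Mode: left stick for forward/back, right stick to rotate.
    drive = -gamepad1.left_stick_y / 2.0;   // reduce drive rate to 50%
    turn  = -gamepad1.right_stick_x / 4.0;  // reduce turn rate to 25%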

Press and hold the Left Bumper to enable the automatic “Drive to target” mode. Release the Left Bumper to return to manual driving mode.

Under “Drive To Target” mode, the robot has two goals:

  1. Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
  2. Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward.)

You can customize the program by adjusting the following values:

  • Set the DESIRED_TAG_ID to the tag’s ID number to make that tag the target.
  • Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
  • Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.

The above text is mostly copied from the comments in the Java sample program. Below I’ll show you the key points of the Blocks version of this program, focusing on the use of April Tags: specifically, the initAprilTag function and the use of AprilTagDetection blocks. There is also a setManualExposure function, which is used to reduce motion blur, since we want to detect April Tags while the robot is moving.

Download RobotAutoDriveToAprilTagTank_Blocks

Here’s a link to the complete program on GitHub.

Copy the program by right-clicking on the RobotAutoDriveToAprilTagTank_Blocks link and downloading the .blk file. Save it to your robot and open the program in Blocks. It requires a webcam configured as Webcam 1 and a robot with two drive motors named left_drive and right_drive.

initAprilTag

This function is used to configure a Vision Portal and set up an April Tag processor. It starts by creating an April Tag processor object.
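
In the Java version, this step is a single builder call (the Blocks program uses the corresponding AprilTagProcessor.Builder block):

    // Create the AprilTag processor with default settings.
    aprilTag = new AprilTagProcessor.Builder().build();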

The initAprilTag function also lets you set the decimation, a value that controls how much down-sampling is used. This lets you trade off detection range for detection rate: the higher the decimation value, the faster the detection rate, but the shorter the distance at which tags can be detected. The Java program has these comments about decimation:

    // eg: Some typical detection data using a Logitech C920 WebCam
    // Decimation = 1 ..  Detect 2" Tag from 10 feet away at 10 Frames per second
    // Decimation = 2 ..  Detect 2" Tag from 6  feet away at 22 Frames per second
    // Decimation = 3 ..  Detect 2" Tag from 4  feet away at 30 Frames Per Second
    // Decimation = 3 ..  Detect 5" Tag from 10 feet away at 30 Frames Per Second
    // Note: Decimation can be changed on-the-fly to adapt during a match.
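
In Java, decimation is set directly on the processor after it is created; Blocks provides an equivalent setDecimation block:

    // Higher decimation means a higher frame rate but a shorter detection range.
    aprilTag.setDecimation(2);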

With decimation set to 2, and using a Logitech C270 webcam at 640×480 resolution, the robot was able to see the small 2″ April Tags along the front wall from 7′ away at 14 frames per second. The 5″ April Tag could be seen from 10′ away at 14.6 frames per second.

Finally, initAprilTag calls a builder to create a Vision Portal object, set the webcam to use, and set the Vision Portal to use the April Tag processor we just created.
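
For reference, here is roughly what that builder sequence looks like in the Java sample when using a webcam configured as Webcam 1:

    // Build the Vision Portal, using the webcam and the AprilTag processor.
    visionPortal = new VisionPortal.Builder()
        .setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
        .addProcessor(aprilTag)
        .build();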

In the Java code, there’s a call to use the back phone camera. I wasn’t able to locate a Block for this, but perhaps I would need to be using a phone as a Robot Controller in order to see that block. If you’re using a phone, you’ll want to set USE_WEBCAM to false and set the camera to the BACK built-in camera, assuming the back camera is facing in the correct direction.

    visionPortal = new VisionPortal.Builder()
        .setCamera(BuiltinCameraDirection.BACK)
        .addProcessor(aprilTag)
        .build();

AprilTagProcessor.getDetections

This block is called to get the current list of April Tag detections. It returns a LIST of all the detections found. The program can then look for the target Tag ID and retrieve the position and orientation details for that April Tag detection.

After getting the list, a FOR EACH ITEM loop checks for the DESIRED_TAG_ID. If DESIRED_TAG_ID = -1, the first suitable tag is used as the target. The FOR loop sets a detection variable containing the current April Tag detection from the list, and each detection is examined. Detections with null metadata are reported via telemetry but otherwise ignored; such tags are not part of the internal tag library. If the April Tag’s ID matches the DESIRED_TAG_ID (or DESIRED_TAG_ID = -1), then we set targetFound = true, set desiredTag to the detection we found, and break out of the loop. Tags that are detected but not desired are reported via telemetry as skipped.
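
For reference, the Java sample’s version of this loop looks roughly like this:

    boolean targetFound = false;
    AprilTagDetection desiredTag = null;

    // Step through the list of detected tags and look for a matching ID.
    List<AprilTagDetection> currentDetections = aprilTag.getDetections();
    for (AprilTagDetection detection : currentDetections) {
        if (detection.metadata != null) {
            // A matching ID (or any library tag, if DESIRED_TAG_ID < 0) becomes the target.
            if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
                targetFound = true;
                desiredTag = detection;
                break;  // don't look any further
            } else {
                telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
            }
        } else {
            // Tags with null metadata are not in the tag library.
            telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
        }
    }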

Range and Bearing

Now that we have found a target, the program can get the range to the April Tag and the direction, or bearing, to the April Tag. These values are reported via Telemetry to the driver station, but they are also used to control driving. A rangeError value is calculated from the AprilTagDetection.ftcPose.range value and is used to drive forward (or backward) until the robot gets close to the DESIRED_DISTANCE. The AprilTagDetection.ftcPose.bearing value is used directly as a heading error, since the bearing would be zero if we were pointing directly at the April Tag.
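
In the Java sample, the two error terms come straight from ftcPose:

    // Distance left to travel, and how far off-axis the tag is.
    double rangeError   = desiredTag.ftcPose.range - DESIRED_DISTANCE;
    double headingError = desiredTag.ftcPose.bearing;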

Drive / Turn Calculation

The rangeError and headingError values are now used to calculate drive and turn values that will control motor movement. The error values are multiplied by the GAIN values, which reduces the power that might be used, and the results are further constrained to lie between the MAX_AUTO limits for speed and turning.
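
Here is a sketch of that calculation based on the Java sample. The wheel-power mixing follows the sample’s moveRobot helper; leftDrive and rightDrive are the assumed motor variables, and the signs depend on your motor directions:

    // Scale the errors by the gains, then clamp to the MAX_AUTO limits.
    drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
    turn  = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN);

    // Mix drive and turn into wheel powers, normalizing if either exceeds 1.0.
    double leftPower  = drive - turn;
    double rightPower = drive + turn;
    double max = Math.max(Math.abs(leftPower), Math.abs(rightPower));
    if (max > 1.0) {
        leftPower  /= max;
        rightPower /= max;
    }
    leftDrive.setPower(leftPower);
    rightDrive.setPower(rightPower);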

The result is that the robot will turn towards the target April Tag and then start driving toward it, usually at about MAX_AUTO_SPEED. As the robot gets closer it will slow down and eventually stop. Typically it stops within a couple of inches of the DESIRED_DISTANCE, but it likely never reaches exactly the target distance, because the rangeError times SPEED_GAIN value becomes too low to move the robot any further.

You can experiment with different GAIN and MAX_AUTO values. The default values are well behaved, but not very fast. You can make the robot drive and turn faster, but the movement can become twitchy, and the robot can overcorrect and drive a bit erratically.

setManualExposure

Although setting exposure and gain is not strictly required to use April Tags, in this program we manually reduce the exposure and increase the gain on the webcam. This helps to reduce motion blur in the images so that the April Tag processor has a good image to work with. Note that it’s only possible to set exposure and gain for webcams.
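
Here is an abridged sketch of the Java sample’s setManualExposure helper; it waits for the camera to start streaming, switches to manual exposure mode, and then applies the requested exposure and gain:

    private void setManualExposure(int exposureMS, int gain) {
        if (visionPortal == null) return;

        // Wait for the camera to be open and streaming before changing controls.
        while (!isStopRequested()
                && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
            sleep(20);
        }

        // Switch the webcam to manual exposure mode, then set exposure and gain.
        ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
        if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
            exposureControl.setMode(ExposureControl.Mode.Manual);
            sleep(50);
        }
        exposureControl.setExposure((long) exposureMS, TimeUnit.MILLISECONDS);
        sleep(20);

        GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
        gainControl.setGain(gain);
        sleep(20);
    }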

Pushbot in Action

Here’s a video of the robot driving under Drive to Target mode from the backstage area all the way to the front wall of the field.

Next Steps

Calibrate your webcam. To use April Tag processing to get accurate distance values, you need to initialize the webcam with camera lens “intrinsic” values. If you are using a Logitech C270 webcam, the SDK should be automatically set up. Some models of the Logitech C920 are automatically supported, but other models require manual calibration.

Related to calibrating your webcam, you can test how accurate your April Tag processing is using this sample program. Drive the robot around the field using the gamepad and point it at a target April Tag from different distances. Then use a tape measure to see how accurate the reported Range value is, measuring from the webcam lens to the center of the April Tag. If the lens is well calibrated, the reported Range will be within an inch of the actual measured distance. This includes using the program on the 5″ April Tag targets on the front wall from ten feet away.

Note: the April Tag processor does require the whole April Tag to be in view, so the rigging, the trusses, and other robots can get in the way and prevent April Tag detection.

You could use this sample code to line up on the backdrop and drive towards the small April Tag that corresponds to the randomized spike mark.

Check out the other Blocks sample programs:

Blocks Sample Programs

Getting Help

It is often possible to use Google (or another search engine) to get help or solve problems, and there are lots of resources online. If you’re still stuck, you can ask for help here.