RobotAutoDriveByGyro_Blocks Sample Program

In the RobotAutoDriveByEncoder_Blocks program we used motor encoders to move the robot around. That’s better than time-based movement, but the wheels can slip a bit when making turns. If you make a number of turns you may find that the robot is not as close to where you want it to be at the end.

There’s another Java sample program called RobotAutoDriveByGyro_Linear.java that we can use to make a Blocks sample program for even more accurate robot movement.

Note: as of the 2024 INTO THE DEEP season these programs are now included as Blocks sample programs that you can just copy when creating a new Blocks program.

This is a tutorial on creating your first FIRST Tech Challenge (FTC) autonomous programs for the CENTERSTAGE game.

Visit the FTC docs page on Creating Op Modes in blocks if you need help getting started and to create your first driver controlled program.

The Java program has a driveStraight function that uses motor encoders. That function and the turnToHeading function use the internal Inertial Measurement Unit (IMU) sensor in the Rev Control Hub. The program uses those functions to create the following movement path for the robot:

  • Drive forward for 24 inches
  • Spin in place, 45 degrees clockwise
  • Drive forward 17 inches
  • Spin in place, 90 degrees counter clockwise
  • Drive forward 17 inches
  • Spin in place, 45 degrees clockwise
  • Drive Backward for 48 inches
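As a sanity check, the movement list above can be dead-reckoned in plain Python (not Blocks — just the geometry, assuming perfect driving) to confirm the path brings the robot back to roughly its starting spot:

```python
import math

def simulate(path):
    """Dead-reckon a list of ("drive", inches) and ("turn", degrees) steps.
    Heading 0 points straight ahead; counter-clockwise turns are positive,
    matching the IMU Yaw convention used later in this program."""
    x = y = heading = 0.0
    for kind, value in path:
        if kind == "turn":
            heading += value
        else:  # "drive": positive distance is forward, negative is backward
            rad = math.radians(heading)
            x += -math.sin(rad) * value
            y += math.cos(rad) * value
    return x, y, heading

# The seven steps listed above (clockwise turns are negative degrees)
path = [("drive", 24), ("turn", -45), ("drive", 17),
        ("turn", 90), ("drive", 17), ("turn", -45),
        ("drive", -48)]
```

Running simulate(path) ends at about (0, 0.04) inches: each 17-inch diagonal contributes 17 × cos 45° ≈ 12.02 inches of net forward travel, so the final 48-inch backup almost exactly retraces the 24 + 24.04 inches of forward movement.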

This program combines motor encoders and the IMU to move the robot around. It has various Blocks functions that move the robot according to input parameters, and then calls those functions to perform the required movements. These types of controlled movements will help when moving from the front of the field to the backstage area in the autonomous period.

Prerequisites/Assumptions

This tutorial assumes you’ve got some familiarity with Blocks and that you have a robot configuration in your driver station. Ideally you’ve completed a Blocks tutorial. This tutorial doesn’t explain how to program in blocks.

This tutorial assumes you have a robot with two driving wheels, normally referred to as a tank drive or pushbot drive, and that you are using a Rev Control Hub.

Your robot must have encoders on the motors, and an encoder cable must connect each motor to the Rev Control Hub.

A webcam is not required for this program.

It might help to try the SensorIMUOrthogonal_Blocks program first.

The RobotAutoDriveByGyro_Linear.java program is a long program. We’re not going to create that step by step. Instead, please copy the .blk file from the Pushbot GitHub repository and then I’ll explain how the program works. An easier to understand program that focuses just on the IMU is the SensorIMUOrthogonal_Blocks program.

What is an IMU?

An inertial measurement unit (IMU) is an electronic device that measures and reports a body’s specific force, angular rate, and sometimes the orientation. In FIRST Tech Challenge, the Rev Control Hub contains an IMU and the FTC SDK provides functions that let you get the orientation of the hub as the robot moves around. For this program we will use the IMU as a Gyroscope.

Note: there is an easier program for exploring use of the IMU. Check out the SensorIMUOrthogonal_Blocks sample program as it just uses the IMU to display orientation values on the driver station.

Copy RobotAutoDriveByGyro_Blocks.blk

The Blocks programs for this robot are in the Pushbot GitHub repository. Assuming your robot has left and right motors configured with the names left_drive and right_drive, you should be able to easily use this sample program.

Right click the RobotAutoDriveByGyro_Blocks.blk file on GitHub and select Download.

You can then connect to your robot and upload the file, giving the program the name RobotAutoDriveByGyro_Blocks. Open this program in Blocks and you’ll see it’s a very long program (click here to see the .png file of the program).

I’ll go over the important parts of this program, focusing on using the IMU. But first you need to initialize variables for the encoders. Set COUNTS_PER_MOTOR_REV to the encoder counts per revolution value for your motors and WHEEL_DIAMETER_INCHES to your wheel size. These values should be the same as what you used in the RobotAutoDriveByEncoder_Blocks program if you’ve created that program.
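The arithmetic those variables feed into is the same counts-per-inch conversion used in the encoder sample. Here is a minimal sketch in plain Python — the motor and wheel numbers are assumed examples, not your hardware’s values:

```python
import math

COUNTS_PER_MOTOR_REV = 537.7   # example value (a goBILDA 312 RPM motor); use your motor's spec
DRIVE_GEAR_REDUCTION = 1.0     # external gearing; 1.0 if the wheel is direct-driven
WHEEL_DIAMETER_INCHES = 4.0    # example wheel size

# One wheel revolution covers pi * diameter inches of ground
COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) / (WHEEL_DIAMETER_INCHES * math.pi)

def target_counts(distance_inches):
    """Encoder target for a driveStraight-style move of the given distance."""
    return int(distance_inches * COUNTS_PER_INCH)
```

With these example numbers, one inch of travel is about 42.8 encoder counts, so a 24-inch move targets roughly 1027 counts.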

Then there are some driving/turning values that you can probably leave as is, but could change if needed. DRIVE_SPEED and TURN_SPEED could be increased, but you’ll have more control at lower speeds. HEADING_THRESHOLD controls how close the heading must get to the target before moving to the next step. Requiring more accuracy (a smaller number) could make the turn take longer to settle into its final position. You could increase this value to speed things up, but a value of 1 seems to work fine.

Each of these variables has comments that you can see by clicking on the ? in the circle of each value.

The final block initializes the IMU in the control hub. The IMU will take a couple of seconds to initialize. A key part of that initialization is telling the SDK how your control hub is mounted on the robot. This lets the SDK provide you with the correct orientation values later.

One interesting thing in this program is a Repeat loop during the Initialization period of the program, prior to pressing the start button. This demonstrates that you can have your program provide feedback to the driver station as the robot is being set up.

The above Telemetry shows using the yawPitchRollAngles block to get the current orientation of the robot, in this case asking only for the Yaw value. Assuming the IMU was initialized with the correct orientation, then the Yaw value represents the robot’s heading with zero degrees pointing forward at the point where the robot was initialized. Yaw values should INCREASE as the robot is rotated Counter Clockwise. With this program, you can test this by rotating the robot and checking what Yaw value is displayed.
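One subtlety with Yaw values is that they wrap around at ±180 degrees, so the shortest rotation between two headings is not always the plain difference. A small Python sketch (an illustration, not the Blocks code) of the usual wrap-around handling:

```python
def heading_error(target, current):
    """Shortest signed rotation from current to target, in degrees.
    Positive means turn counter-clockwise, matching the IMU Yaw convention."""
    return (target - current + 180.0) % 360.0 - 180.0
```

For example, going from a heading of 170 to -170 is a 20-degree counter-clockwise turn, not a 340-degree clockwise one.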

Once the Start button is pressed, the program Resets the IMU Yaw value so that Zero is the direction the robot is facing.

The program then goes right into the movement steps of this program. It makes use of three functions:

  • driveStraight – this is an encoder based move function where you supply a speed, a distance in inches and a heading value. The robot will use encoders to drive the distance, but will correct the direction of the driving to maintain the heading value. To move backwards, set a negative distance value. Note: the robot should already be facing the heading you supply, otherwise it will execute a heading correction immediately.
  • turnToHeading – this is a pure IMU function that turns the robot to face the indicated direction. You supply a speed value and a heading value. The speed value controls the motor power level, so you can have faster turns if you want. Note that the robot uses IMU coordinates where negative headings are a clockwise turn. It helps to put comments in your code to indicate what the robot is trying to do at each step in the program. See the sample below.
  • holdHeading – this is another IMU function. You provide a heading and a timeout value and the robot attempts to hold that heading. It might not be strictly needed in all cases, especially if your moves and turns are not fast. But with fast movements or turns you probably want to have the robot pause, even for half a second, and correct any heading errors. This sample program always calls holdHeading after a turnToHeading call.

The above code shows how those functions would be used in an actual program. In this case the robot drives forward 24″ using driveStraight, then turns clockwise by passing -45 degrees to turnToHeading, then calls holdHeading to pause for half a second on the new heading to ensure it has stabilized.
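The way a turnToHeading-style loop converges can be simulated in a few lines of plain Python. This is an illustration only — the gain, the speed cap, and the toy robot response model are assumed values, not the Blocks program’s:

```python
HEADING_THRESHOLD = 1.0   # degrees, matching the program's default
TURN_SPEED = 0.2          # cap on motor power during the turn
P_TURN_GAIN = 0.02        # assumed proportional gain for illustration

def turn_to_heading_sim(current, target, deg_per_step_at_full_power=30.0):
    """Loop until the heading error falls inside HEADING_THRESHOLD, the same
    exit test turnToHeading uses, but against a toy robot model."""
    for _ in range(1000):                       # safety bound instead of a real timeout
        error = (target - current + 180.0) % 360.0 - 180.0
        if abs(error) <= HEADING_THRESHOLD:
            break
        turn = max(-TURN_SPEED, min(TURN_SPEED, error * P_TURN_GAIN))
        current += turn * deg_per_step_at_full_power  # toy response: power -> degrees turned
    return current
```

Large errors run at the TURN_SPEED cap; as the error shrinks, the proportional term takes over and the loop eases into the threshold instead of overshooting.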

Those are the three high level functions in this program, but there are several helper functions that they use. If you decide to use the high level functions, you’ll also have to copy the low level functions.

  • moveRobot – this function takes a driveSpeed parameter and a turn parameter and uses those to set the motor power levels. The driveStraight function uses moveRobot and supplies a turn parameter based on the IMU heading, which applies slightly more power to one wheel and less to the other so the robot adjusts course as it drives.
  • getSteeringCorrection – this function takes a desiredHeading and a proportionalGain parameter. It is used by driveStraight to get the turn value needed for moveRobot. The current heading from the IMU is compared to the desiredHeading and the difference is multiplied by the proportionalGain parameter (a small number) so that we get a small turn value that we can use to adjust course. A larger proportionalGain parameter causes quicker adjustments, but too large a value can cause the robot to twitch back and forth.
  • getHeading – a simple function that returns the IMU Yaw value.
  • sendTelemetry – this function has a parameter called straight which should be a true or false value. The three main functions call this function to display telemetry. The driveStraight function sets the straight parameter to true, while the other main functions set the value to false. If straight=true, additional telemetry blocks are called to display the target and current encoder values on the driver station.
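The interaction between getSteeringCorrection and moveRobot can be sketched in plain Python. Again this is an illustration, not the Blocks code — the wrap-around and clipping details are assumptions:

```python
def get_steering_correction(desired_heading, current_heading, proportional_gain):
    """Heading error (wrapped to +/-180 degrees) scaled down to a small turn value."""
    error = (desired_heading - current_heading + 180.0) % 360.0 - 180.0
    return error * proportional_gain

def move_robot(drive, turn):
    """Mix a forward speed and a turn value into left/right motor powers.
    A positive turn speeds up the right wheel, steering counter-clockwise."""
    left = drive - turn
    right = drive + turn
    biggest = max(abs(left), abs(right))
    if biggest > 1.0:               # keep both powers in the legal -1..1 range
        left /= biggest
        right /= biggest
    return left, right
```

If the robot drifts 5 degrees clockwise of a 0-degree target (current heading -5) with a gain of 0.03, the correction is +0.15, so the right wheel gets a bit more power and the robot eases back on course.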

You should save the program and try running it. The robot should drive forward, make a couple of turns, and then back up to where it started.

RobotAutoDriveByGyro_Blocks robot run

Next Steps

Try using these functions to move around the field. Here is a program called RobotAutoDriveByGyro_Demo that uses the same driveStraight, turnToHeading and holdHeading functions to move around the Centerstage field. It starts in the front blue starting location and parks backstage. The robot moves as follows:

  • drive forward such that the purple pixel is placed on the center spike mark. A real autonomous program should first try and detect which spike mark had the pixel or team prop. For this demo, just assume it’s in the center.
  • backup
  • turn 90 degrees counter clockwise to face backstage
  • drive forward to the backstage area driving under the rigging and stopping in the rear blue starting tile
  • make a 45 degree clockwise turn
  • drive forward far enough so that the next turn will align the robot in front of the center backdrop April Tag (note: we are not using a webcam in this program)
  • turn 45 degrees to face the backdrop center area
  • drive forward and touch the backdrop. This is where the robot could deploy the yellow pixel, though this robot cannot do that.
  • the robot then backs up, turns clockwise, drives forward a bit, then turns to face the rear wall, then drives forward to park in the backstage area beside the backdrop.

RobotAutoDriveByGyro_Demo robot run

I ran this program a few times to demonstrate that it’s quite accurate: even after all those moves and turns, the robot stops within a few inches of where it started.

The robot’s right rear wheel ended up within a couple of inches even after all those moves and turns.

One issue I ran into is that all that driving took too long and the autonomous period would end before the robot finished moving. So some of the longer driveStraight commands have 0.2 added to the DRIVE_SPEED value so the robot moves a little faster.

You can see the .png image of the RobotAutoDriveByGyro_Demo program on GitHub. You can find the .blk file for the RobotAutoDriveByGyro_Demo program on GitHub in the Pushbot repository. This program was copied from RobotAutoDriveByGyro_Blocks, but the driving commands move the robot around the Centerstage field.

Check out the other Blocks sample programs:

Blocks Sample Programs

Vision Processing

To really score more points in the Centerstage autonomous period the robot probably needs to use TensorFlow and April Tags. TensorFlow and April Tag Blocks programs are coming soon.

Getting Help

It is often possible to use Google (or other search engine) to get help or solve problems. There are lots of resources online. If you’re still stuck you can ask for help here.