Code Planning For The New Season

Tags: control
Personhours: 0
By Mahesh

Task: Plan changes to our codebase for the new season.

This year's game saw a significant boost to the importance of the Control Award, which now sits above even the Motivate and Design Awards in the order of advancement. It is therefore crucial to plan out the changes we want to bring to our codebase and the new technologies we intend to use, in hopes of benefiting from the award's higher importance this year.

Firstly, the very beginning of the autonomous period requires the robot to navigate to one of three target zones, specified by the number of rings placed on the field: either 0, 1, or 4. Since the drivers will not be able to choose an OpMode corresponding to each path, we will have to implement a vision pipeline to determine which of the three configurations the field is in. This can be done with OpenCV, using an adaptive threshold and blob detection to differentiate one ring from four through the height of the detected color blob.
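As a rough illustration, a minimal sketch of such a pipeline might look like the following, assuming the EasyOpenCV library commonly used with the FTC SDK. The threshold value, minimum blob area, and aspect-ratio cutoff are placeholders that would need tuning against real footage of the ring stack.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import org.openftc.easyopencv.OpenCvPipeline;

import java.util.ArrayList;
import java.util.List;

public class RingStackPipeline extends OpenCvPipeline {
    public enum StackHeight { ZERO, ONE, FOUR }

    private volatile StackHeight latest = StackHeight.ZERO;

    private final Mat ycrcb = new Mat();
    private final Mat cb = new Mat();
    private final Mat mask = new Mat();

    @Override
    public Mat processFrame(Mat input) {
        // Orange rings stand out as low values in the Cb channel of YCrCb.
        Imgproc.cvtColor(input, ycrcb, Imgproc.COLOR_RGB2YCrCb);
        Core.extractChannel(ycrcb, cb, 2);

        // Isolate orange pixels; the cutoff of 100 is a placeholder to tune.
        Imgproc.threshold(cb, mask, 100, 255, Imgproc.THRESH_BINARY_INV);

        // Treat the largest contour in the mask as the ring stack.
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        Rect best = null;
        for (MatOfPoint c : contours) {
            Rect r = Imgproc.boundingRect(c);
            if (best == null || r.area() > best.area()) best = r;
        }

        // Classify by blob height relative to width: one ring is a short,
        // wide blob, while a stack of four is nearly as tall as it is wide.
        if (best == null || best.area() < 50) {
            latest = StackHeight.ZERO;
        } else if ((double) best.height / best.width < 0.5) {
            latest = StackHeight.ONE;
        } else {
            latest = StackHeight.FOUR;
        }
        return input;
    }

    public StackHeight getStackHeight() { return latest; }
}
```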

Secondly, Ultimate Goal's disk-throwing aspect opens up the opportunity for automatic shooting and aligning mechanisms, and the software to go along with them. If a robot can use the vision target placed above the upper tower goal, or otherwise localize itself, projectile motion models can be used to analyze the disk's trajectory and calculate the launch angle and velocity necessary to score into the goal from a given distance on the field. Such automation would save drivers the hassle of aligning and aiming for the baskets, allowing them to focus on more complex strategy and improving cycle time. Vuforia and OpenCV pipelines can be used to determine the robot's location on the field from the vision target's orientation and size in the camera's field of view. If OpenCV were also used to automate intake of disks on the ground, then most gameplay could be automated, although this isn't as simple a task as it may seem.
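As a starting point for that model, the standard drag-free projectile equations already let us solve for launch speed as a function of distance, goal height, and launch angle. The sketch below is exactly that back-of-the-envelope calculation; the geometry numbers in the example are illustrative placeholders, not measurements of our robot.

```java
// Drag-free projectile model: solves y = x*tan(theta) - g*x^2 / (2*v^2*cos^2(theta))
// for the launch speed v needed to pass through the goal opening.
public class LaunchSolver {
    private static final double G = 9.81; // gravitational acceleration, m/s^2

    /**
     * Required launch speed (m/s) to hit a target at horizontal distance x (m)
     * and height h (m) above the launcher, firing at angle theta (radians).
     */
    public static double requiredSpeed(double x, double h, double theta) {
        double denom = 2.0 * Math.pow(Math.cos(theta), 2) * (x * Math.tan(theta) - h);
        if (denom <= 0) {
            // The straight line at this angle never reaches the target height.
            throw new IllegalArgumentException("Target unreachable at this angle");
        }
        return Math.sqrt(G * x * x / denom);
    }

    public static void main(String[] args) {
        // Placeholder example: goal ~0.9 m above the launcher, robot 2.5 m
        // away, launching at 35 degrees.
        double v = requiredSpeed(2.5, 0.9, Math.toRadians(35));
        System.out.printf("Launch speed: %.2f m/s%n", v);
    }
}
```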

Aside from the webcam, another localization technique we dabbled with this summer was GPS. We were able to get ±4 cm accuracy when using a specific type of GPS receiver, allowing our test robot to trace out different paths of waypoints, including our team number, the DPRG logo, etc. If this level of accuracy proves viable in game, GPS could be considered an option for localization as opposed to the odometry sensors many teams currently employ. Another sensor worth noting is the PMW3901 optical flow sensor, which acts similarly to a computer mouse in that its readings can be translated into a horizontal and vertical velocity, giving us more insight into the position of our robot. Regardless of how many different gadgets and sensors we may use, an important part of the code for this year's game will be automatic scoring, no matter how we choose to implement it.
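To make the optical-flow idea concrete, dead-reckoning from the sensor's per-frame motion deltas could be sketched as below. The counts-to-meters scale factor and the code supplying the raw deltas are assumptions: the scale depends on mounting height and has to be calibrated, and the actual driver interface may differ.

```java
// Dead-reckoning sketch for an optical-flow sensor such as the PMW3901.
// The sensor reports per-frame pixel deltas (like a mouse); converting them
// to field displacement requires a calibrated counts-to-meters scale.
public class FlowOdometry {
    private final double metersPerCount; // calibrated scale factor (assumed)
    private double x, y;                 // accumulated field position, meters

    public FlowOdometry(double metersPerCount) {
        this.metersPerCount = metersPerCount;
    }

    /** Integrate one reading, rotated into field coordinates by heading (rad). */
    public void update(int dxCounts, int dyCounts, double headingRad) {
        double dx = dxCounts * metersPerCount;
        double dy = dyCounts * metersPerCount;
        // Rotate robot-relative motion into the field frame before summing.
        x += dx * Math.cos(headingRad) - dy * Math.sin(headingRad);
        y += dx * Math.sin(headingRad) + dy * Math.cos(headingRad);
    }

    public double getX() { return x; }
    public double getY() { return y; }
}
```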

As always, we hope to keep this year's codebase more organized than the last, and efforts have already been taken to refactor parts of it to be more readable and easily workable. Additionally, although it is not strictly necessary, we could use multithreading to separate, for example, hardware reads and OpenCV pipelines into their own threads. However, our current state machine already gives us an asynchronous structure which emulates multithreading fairly well, so implementing this would give us only a slight performance boost, mainly when running OpenCV pipelines.
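If we do experiment with threading, one low-risk pattern is to move bulk hardware reads onto a background thread and let the state machine consume the most recently cached values. The sketch below illustrates the idea; the Snapshot fields and the Supplier wrapping the actual hardware calls are illustrative stand-ins rather than our real API.

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

/** Illustrative bundle of one loop's sensor values; fields are placeholders. */
class Snapshot {
    final double leftTicks, rightTicks, headingRad;
    Snapshot(double leftTicks, double rightTicks, double headingRad) {
        this.leftTicks = leftTicks;
        this.rightTicks = rightTicks;
        this.headingRad = headingRad;
    }
}

public class HardwareReadThread extends Thread {
    private final Supplier<Snapshot> reader;        // wraps the actual bulk hardware read
    private final AtomicReference<Snapshot> latest; // last completed read, shared safely
    private volatile boolean running = true;

    public HardwareReadThread(Supplier<Snapshot> reader, Snapshot initial) {
        this.reader = reader;
        this.latest = new AtomicReference<>(initial);
    }

    @Override
    public void run() {
        // Continuously refresh the cached snapshot so the main loop never blocks on I/O.
        while (running) {
            latest.set(reader.get());
        }
    }

    public Snapshot getLatest() { return latest.get(); }

    public void shutdown() { running = false; }
}
```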

Next Steps:

The most immediate changes to our codebase should be continued refactoring along with implementing the bare bones of our teleop and autonomous routines. Next, we should work on a vision pipeline to classify the field's configuration at the start of the game, enabling the robot to deliver the wobble goal to the A, B, or C drop zone. Afterwards, the physics calculations and vision pipelines necessary to auto-shoot into the baskets can be built, starting with a mathematical model of the projectile.

Date | October 17, 2020