I apologize for not keeping up with the blog. It has been a busy couple of weeks, but I have made a lot of good progress! Let’s get into the details.
This past month has been primarily focused on kyberlib. Kyberlib is the backend library for Team 6502 that makes all of our other work easier. It provides functions for common tasks, examples of how to build certain mechanisms, and a number of other useful utilities.
The first major upgrade has been an entire rework of the motor library. Previous years had started work on motor control classes, but I don't think the effort got very far, and it was never tested. I took the initial premise and got it fully working.
First, there is KBasicMotorController. This class is the parent of all motor controllers. It can drive any motor via voltage, percent output, or by following another motor. It exposes open variables to set inversion and brake mode. It is both Sendable and Debuggable, meaning that voltage information can easily be sent to, and set from, the Dashboard. There are currently examples for both WPI's SpeedControllers and CTRE's VictorSPX.
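To make the idea concrete, here is a minimal self-contained sketch of what a parent motor-controller class like this can look like. This is not kyberlib's actual source: the class name, the 12 V nominal-battery assumption, and the follower mechanics are all illustrative assumptions.

```kotlin
// A sketch of the KBasicMotorController idea, NOT kyberlib's real implementation.
// Hardware-specific subclasses only have to implement writeOutput().
abstract class BasicMotorSketch {
    var inverted = false   // flip the output direction
    var brakeMode = false  // whether the motor resists motion when idle
    private val followers = mutableListOf<BasicMotorSketch>()

    /** Commanded output as a fraction of battery voltage, clamped to [-1, 1]. */
    var percent: Double = 0.0
        set(value) {
            field = value.coerceIn(-1.0, 1.0)
            val out = if (inverted) -field else field
            writeOutput(out)
            followers.forEach { it.percent = field }  // followers mirror the leader
        }

    /** Convenience: command a raw voltage, assuming a nominal 12 V battery. */
    var voltage: Double
        get() = percent * 12.0
        set(value) { percent = value / 12.0 }

    /** Make this controller copy every output [leader] receives. */
    fun follow(leader: BasicMotorSketch) { leader.followers.add(this) }

    /** Hardware-specific write, implemented per motor controller type. */
    protected abstract fun writeOutput(percent: Double)
}
```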
In addition to KBasicMotorController, there is the more advanced KMotorController. This class wraps any motor that has an encoder. Encoders are devices that sense how fast a motor is spinning. With this angular velocity, we can tweak our voltages to drive the motor at the speed we want. Tweaking these voltages in an optimal way is a very math-heavy topic known as control theory. KMotorController abstracts away most of the math while still allowing the flexibility to tune PIDs, add feedforwards, or even add your own custom control algorithm. KMotorController also provides getters and setters for important values like position, velocity, and setpoint. It handles linear-to-rotational conversion so that you can command a linear velocity (i.e. 2 meters per second instead of 2 radians per second). Like KBasicMotorController, KMotorController has full support for debugging and dashboard widgets.
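As a taste of the math being abstracted away, here is a self-contained sketch of a feedforward-plus-proportional velocity loop and the linear-to-angular conversion. The gains, class name, and wheel radius are made up for illustration; kyberlib's actual controller is richer than this.

```kotlin
// A sketch of the control-theory math a motor controller class can hide.
// kV is the feedforward gain (volts per rad/s), kP the feedback gain.
import kotlin.math.abs

class VelocityController(
    private val kP: Double,          // proportional gain on velocity error
    private val kV: Double,          // feedforward: volts per (rad/s) requested
    private val wheelRadius: Double  // meters, for linear <-> angular conversion
) {
    /** Convert a linear velocity (m/s) at the wheel surface to angular (rad/s). */
    fun linearToAngular(metersPerSecond: Double) = metersPerSecond / wheelRadius

    /** One iteration: feedforward toward the setpoint, feedback on the error. */
    fun calculate(setpointRadPerSec: Double, measuredRadPerSec: Double): Double {
        val feedforward = kV * setpointRadPerSec
        val feedback = kP * (setpointRadPerSec - measuredRadPerSec)
        return feedforward + feedback
    }
}
```

The user asks for "2 meters per second"; `linearToAngular` turns that into the rad/s setpoint the loop actually regulates.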
One of the coolest features added into WPILIB in the past couple of years is simulation. This feature allows you to run the robot code on your laptop and simulate the effects of your robot code on a virtual robot. Using this new feature, however, was a bit clunky. A lot of the code needed to be specifically written to allow for the simulation to work. One of the nicest features I added into kyberlib is clean, hidden support for the simulation. All sensors (ie gyroscopes) and motors will automatically detect if they are in a simulation and will automatically adjust. The end result is that the coder is able to write code for a typical robot, and then seamlessly run the simulation with no adjustments.
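The pattern behind this kind of seamless simulation support can be sketched like so. This is an assumption about the design, not kyberlib's actual code: in real robot code the check would be WPILib's `RobotBase.isSimulation()`, but here a plain flag stands in so the sketch runs anywhere.

```kotlin
// A sketch of hiding simulation behind a common interface (assumed design).
// User code only ever sees EncoderBackend, never which backend it got.
interface EncoderBackend {
    val velocity: Double  // rad/s
}

class HardwareEncoder : EncoderBackend {
    // Would read the real sensor over CAN; stubbed out in this sketch.
    override val velocity: Double get() = error("no hardware in this sketch")
}

class SimulatedEncoder : EncoderBackend {
    // A physics model would update this each loop instead of real hardware.
    override var velocity = 0.0
}

/** Pick the backend once at construction; the rest of the code never branches. */
fun makeEncoder(isSimulation: Boolean): EncoderBackend =
    if (isSimulation) SimulatedEncoder() else HardwareEncoder()
```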
In a previous post I talked about implementing my pathing and navigation algorithms. These have now been moved out of the main robot code and into kyberlib. This means that all of the code I have been working on for this robot can easily be reused in future robots. Additionally, I implemented a TravelingSalesman class to find the fastest path through a series of waypoints. Despite not really needing it, I got bored and found some really fascinating optimizations.
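To show the shape of the problem, here is one simple way to order waypoints: a greedy nearest-neighbor tour. kyberlib's TravelingSalesman class and its optimizations may well do something smarter; this sketch and its names are only illustrative.

```kotlin
// Greedy nearest-neighbor tour: from each point, go to the closest unvisited one.
// Fast and simple, though not guaranteed optimal.
import kotlin.math.hypot

data class Waypoint(val x: Double, val y: Double)

fun dist(a: Waypoint, b: Waypoint) = hypot(a.x - b.x, a.y - b.y)

/** Visit every waypoint starting from [start], always taking the nearest next. */
fun nearestNeighborTour(start: Waypoint, points: List<Waypoint>): List<Waypoint> {
    val remaining = points.toMutableList()
    val tour = mutableListOf(start)
    var current = start
    while (remaining.isNotEmpty()) {
        val next = remaining.minByOrNull { dist(current, it) }!!
        remaining.remove(next)
        tour.add(next)
        current = next
    }
    return tour
}
```

A common follow-up optimization is 2-opt: repeatedly un-cross pairs of tour edges until no swap shortens the route.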
In order to test the usability of the new libraries and to instruct future coders, I added a package of simple out-of-the-box mechanisms. Included are a flywheel using state-space control, a basic elevator, and drivetrains of the 3 most common types (differential, mecanum, and swerve).
Other Small Additions
- Rewrote KTrajectory to inherit from the WPI trajectory and have all of the constructors from TrajectoryGenerator. Trajectories can now be saved to and loaded from files.
- Created a Debuggable interface that allows classes to easily debug important values to console, DriverStation, or Dashboard.
- Reworked how CommandManager functions.
- Did some testing on ways to improve our units library (typing variables is a pain). Also allowed units to retain dimensions across multiplication and division.
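Retaining dimensions across multiplication and division can be sketched by carrying an exponent for each base unit on every quantity. This is an assumed design for illustration, not our actual units library.

```kotlin
// A sketch of dimension-carrying units (assumed design, not kyberlib's library).
// Each quantity tracks exponents for meters and seconds; multiplying quantities
// adds the exponents, dividing subtracts them.
data class Quantity(val value: Double, val meters: Int, val seconds: Int) {
    operator fun times(other: Quantity) =
        Quantity(value * other.value, meters + other.meters, seconds + other.seconds)
    operator fun div(other: Quantity) =
        Quantity(value / other.value, meters - other.meters, seconds - other.seconds)
}

fun meters(v: Double) = Quantity(v, 1, 0)
fun seconds(v: Double) = Quantity(v, 0, 1)
```

So `meters(10.0) / seconds(2.0)` yields a quantity with dimensions m¹ s⁻¹, i.e. a velocity, without any per-combination classes.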
The other major half of the project I have been working on is implementing our vision localization system. We are doing this through an algorithm called UcoSlam. This process was far harder than anticipated, so strap in for a tale.
Implementing this had two primary difficulties. First, the camera was attached to the robot, but I could not run vision on the robot without severely slowing down its processing. Second, the UcoSlam algorithm is written in C++ while the rest of my code is written in Kotlin (JVM).
My first thought was to have the robot send the stream back through WPI's built-in CameraServer and then read it from my computer. The issue with this approach was that there was no way to interact with the CameraServer without setting up the WPILIB libraries for the project. I have zero experience doing this, and I believe there would have been build conflicts between the two dependency managers (CMake and Gradle).
The second approach was to read the CameraServer from Kotlin and then call the important UcoSlam functions from there. However, JNI (calling C++ from Java/Kotlin) is awful, and I gave up on this approach.
My present solution is very improvised and needs work. Currently, the robot sends images to my computer. My computer reads those images and writes them to a file. The algorithm then reads this file and writes its output back to the intermediary program. This method is slow and causes a lot of errors. I recently discovered more about how the CameraServer works and am hoping to find a way to feed the images directly to UcoSlam.
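For the curious, the shape of that file handoff looks roughly like this. The real pipeline, file names, and image format are different; this sketch only shows the round trip and why polling on files is fragile.

```kotlin
// A sketch of a file-based handoff between a JVM intermediary and a native
// process (illustrative only; the actual paths and formats differ).
import java.nio.file.Files
import java.nio.file.Path

/** Intermediary side: dump the latest frame where the C++ process can see it. */
fun writeFrame(dir: Path, frame: ByteArray): Path {
    val file = dir.resolve("frame.bin")
    Files.write(file, frame)
    return file
}

/** Poll for the algorithm's result file until it appears, or give up. */
fun readResult(dir: Path, attempts: Int = 50, delayMs: Long = 10): ByteArray? {
    val file = dir.resolve("result.bin")
    repeat(attempts) {
        if (Files.exists(file)) return Files.readAllBytes(file)
        Thread.sleep(delayMs)
    }
    return null  // timed out: one reason this approach is slow and error-prone
}
```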