FRC Programming with LimelightLib (WPILib Java & C++)
- Java: https://github.com/LimelightVision/limelightlib-wpijava
- JavaDocs: https://limelightlib-wpijava-reference.limelightvision.io
- C++: https://github.com/LimelightVision/limelightlib-wpicpp
- C++ Docs: https://limelightlib-wpicpp-reference.limelightvision.io
Usage
This is a single-file library.
Java: Copy the LimelightHelpers.java file from the latest release (https://github.com/LimelightVision/limelightlib-wpijava/releases) into your Java project's "robot" folder.
C++: Copy the LimelightHelpers.h file from the latest release (https://github.com/LimelightVision/limelightlib-wpicpp/releases) into your C++ project's include folder and #include "LimelightHelpers.h" where needed.
You don't need to create any objects for your Limelights; the library exposes simple, functional (static) methods to maximize ease of use and reliability.
Tips for Success
- Start simple! Many successful FRC teams use basic approaches effectively. For example, Team 2056 in 2024 used a standard 90FPS color pipeline instead of neural networks for game piece tracking.
- Think about what you actually need: do you need full field localization, or would simple target centering work (e.g., double driveSpeed = LimelightHelpers.getTX("") * 0.03;)?
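As a concrete sketch of that tip, here is proportional aiming from tx alone. The class name, gain, and clamp limit are illustrative tuning assumptions, not part of LimelightLib; on a robot you would pass LimelightHelpers.getTX("") in as txDegrees and send the result to your drivetrain's rotation input.

```java
// Minimal proportional-aiming sketch. KP and MAX_SPEED are assumed
// tuning values -- start small and tune on your robot.
public class AimSketch {
    static final double KP = 0.03;        // proportional gain (assumed)
    static final double MAX_SPEED = 0.5;  // cap on the turn command (assumed)

    // Convert a horizontal offset in degrees into a clamped turn command.
    public static double turnCommand(double txDegrees) {
        double cmd = txDegrees * KP;
        return Math.max(-MAX_SPEED, Math.min(MAX_SPEED, cmd));
    }
}
```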
Key Concepts
1. Basic Usage
Every method accepts a Limelight name parameter. Pass an empty string ("") to use the default name, "limelight":
- Java
- C++
// Basic targeting data
double tx = LimelightHelpers.getTX(""); // Horizontal offset from crosshair to target in degrees
double ty = LimelightHelpers.getTY(""); // Vertical offset from crosshair to target in degrees
double ta = LimelightHelpers.getTA(""); // Target area (0% to 100% of image)
boolean hasTarget = LimelightHelpers.getTV(""); // Do you have a valid target?
double txnc = LimelightHelpers.getTXNC(""); // Horizontal offset from principal pixel/point to target in degrees
double tync = LimelightHelpers.getTYNC(""); // Vertical offset from principal pixel/point to target in degrees
#include "LimelightHelpers.h"
// Basic targeting data
double tx = LimelightHelpers::getTX(""); // Horizontal offset from crosshair to target in degrees
double ty = LimelightHelpers::getTY(""); // Vertical offset from crosshair to target in degrees
double ta = LimelightHelpers::getTA(""); // Target area (0% to 100% of image)
bool hasTarget = LimelightHelpers::getTV(""); // Do you have a valid target?
double txnc = LimelightHelpers::getTXNC(""); // Horizontal offset from principal pixel/point to target in degrees
double tync = LimelightHelpers::getTYNC(""); // Vertical offset from principal pixel/point to target in degrees
2. Pipeline Management
Pipelines are like instantly-swappable programs that change how Limelight processes images. You can configure up to 10 different pipelines in the web interface:
- Java
- C++
// Switch to pipeline 0
LimelightHelpers.setPipelineIndex("", 0);
// Switch to pipeline 0
LimelightHelpers::setPipelineIndex("", 0);
3. LED Control
For Limelights with bright illumination LEDs, you can control the LEDs for different situations:
- Java
- C++
// Let the current pipeline control the LEDs
LimelightHelpers.setLEDMode_PipelineControl("");
// Force LEDs on/off/blink
LimelightHelpers.setLEDMode_ForceOn("");
LimelightHelpers.setLEDMode_ForceOff("");
LimelightHelpers.setLEDMode_ForceBlink("");
// Let the current pipeline control the LEDs
LimelightHelpers::setLEDMode_PipelineControl("");
// Force LEDs on/off/blink
LimelightHelpers::setLEDMode_ForceOn("");
LimelightHelpers::setLEDMode_ForceOff("");
LimelightHelpers::setLEDMode_ForceBlink("");
4. Field Localization With MegaTag
(See the MegaTag documentation and the Swerve Pose Estimation Tutorial for more details.)
- Java
- C++
// In your periodic function:
LimelightHelpers.PoseEstimate limelightMeasurement = LimelightHelpers.getBotPoseEstimate_wpiBlue("limelight");
if (limelightMeasurement.tagCount >= 2) { // Only trust measurement if we see multiple tags
    m_poseEstimator.setVisionMeasurementStdDevs(VecBuilder.fill(0.7, 0.7, 9999999));
    m_poseEstimator.addVisionMeasurement(
        limelightMeasurement.pose,
        limelightMeasurement.timestampSeconds
    );
}
// In your periodic function:
LimelightHelpers::PoseEstimate limelightMeasurement = LimelightHelpers::getBotPoseEstimate_wpiBlue("limelight");
if (limelightMeasurement.tagCount >= 2) { // Only trust measurement if we see multiple tags
    m_poseEstimator.SetVisionMeasurementStdDevs({0.7, 0.7, 9999999});
    m_poseEstimator.AddVisionMeasurement(
        limelightMeasurement.pose,
        limelightMeasurement.timestampSeconds
    );
}
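The tagCount >= 2 gate above can be generalized: many teams also scale their trust in a measurement by tag count and distance. This standalone sketch is an illustrative heuristic, not LimelightLib API; the class name, thresholds, and standard-deviation values are assumptions to tune. The returned array would be fed to setVisionMeasurementStdDevs, or the measurement skipped on null.

```java
// Heuristic vision-trust sketch (all thresholds assumed, tune per robot).
public class VisionStdDevs {
    // Returns {xStdDev, yStdDev, thetaStdDev} for the pose estimator,
    // or null to reject the measurement entirely.
    public static double[] forMeasurement(int tagCount, double avgTagDistMeters) {
        if (tagCount == 0) return null;               // nothing to fuse
        if (tagCount == 1 && avgTagDistMeters > 4.0)  // single far tag: too noisy
            return null;
        double xy = (tagCount >= 2) ? 0.7 : 1.5;      // trust multi-tag solves more
        return new double[] { xy, xy, 9999999.0 };    // effectively ignore vision heading
    }
}
```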
5. Field Localization With MegaTag2
(See the MegaTag2 documentation and the Swerve Pose Estimation Tutorial for more details.)
MegaTag2 enhances localization accuracy by fusing robot orientation data with vision. By providing gyro data, you help MegaTag2 constrain the localization problem and get excellent results even when only a single tag is visible:
- Java
- C++
// First, tell Limelight your robot's current orientation
double robotYaw = m_gyro.getYaw();
LimelightHelpers.SetRobotOrientation("", robotYaw, 0.0, 0.0, 0.0, 0.0, 0.0);
// Get the pose estimate
LimelightHelpers.PoseEstimate limelightMeasurement = LimelightHelpers.getBotPoseEstimate_wpiBlue_MegaTag2("");
// Add it to your pose estimator
m_poseEstimator.setVisionMeasurementStdDevs(VecBuilder.fill(0.5, 0.5, 9999999));
m_poseEstimator.addVisionMeasurement(
    limelightMeasurement.pose,
    limelightMeasurement.timestampSeconds
);
// First, tell Limelight your robot's current orientation
double robotYaw = m_gyro.GetYaw();
LimelightHelpers::SetRobotOrientation("", robotYaw, 0.0, 0.0, 0.0, 0.0, 0.0);
// Get the pose estimate
LimelightHelpers::PoseEstimate limelightMeasurement = LimelightHelpers::getBotPoseEstimate_wpiBlue_MegaTag2("");
// Add it to your pose estimator
m_poseEstimator.SetVisionMeasurementStdDevs({0.5, 0.5, 9999999});
m_poseEstimator.AddVisionMeasurement(
    limelightMeasurement.pose,
    limelightMeasurement.timestampSeconds
);
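MegaTag2 relies on the orientation you feed it, so estimates can degrade while the robot spins quickly. A common pattern is to drop vision updates above an angular-velocity limit. This sketch is illustrative only; the class name and the rejection threshold are assumptions to tune, and yawRateDegPerSec would come from your gyro.

```java
// Gate MegaTag2 measurements by angular velocity (threshold assumed).
public class MegaTag2Gate {
    static final double MAX_YAW_RATE_DPS = 720.0; // reject updates above this (assumed)

    public static boolean shouldAccept(int tagCount, double yawRateDegPerSec) {
        if (tagCount == 0) return false;                       // no tags seen
        return Math.abs(yawRateDegPerSec) <= MAX_YAW_RATE_DPS; // stable enough to trust
    }
}
```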
6. Special AprilTag Functionality
- Java
- C++
// Set a custom crop window for improved performance (-1 to 1 for each value)
LimelightHelpers.setCropWindow("", -0.5, 0.5, -0.5, 0.5);
// Change the camera pose relative to robot center (x forward, y left, z up, degrees)
LimelightHelpers.setCameraPose_RobotSpace("",
0.5, // Forward offset (meters)
0.0, // Side offset (meters)
0.5, // Height offset (meters)
0.0, // Roll (degrees)
30.0, // Pitch (degrees)
0.0 // Yaw (degrees)
);
// Set AprilTag offset tracking point (meters)
LimelightHelpers.setFiducial3DOffset("",
0.0, // Forward offset
0.0, // Side offset
0.5 // Height offset
);
// Configure AprilTag detection
LimelightHelpers.SetFiducialIDFiltersOverride("", new int[]{1, 2, 3, 4}); // Only track these tag IDs
LimelightHelpers.SetFiducialDownscalingOverride("", 2.0f); // Process at half resolution
// Adjust keystone crop window (-0.95 to 0.95 for both horizontal and vertical)
LimelightHelpers.setKeystone("", 0.1, -0.05);
// Set a custom crop window for improved performance (-1 to 1 for each value)
LimelightHelpers::setCropWindow("", -0.5, 0.5, -0.5, 0.5);
// Change the camera pose relative to robot center (x forward, y left, z up, degrees)
LimelightHelpers::setCameraPose_RobotSpace("",
0.5, // Forward offset (meters)
0.0, // Side offset (meters)
0.5, // Height offset (meters)
0.0, // Roll (degrees)
30.0, // Pitch (degrees)
0.0 // Yaw (degrees)
);
// Set AprilTag offset tracking point (meters)
LimelightHelpers::setFiducial3DOffset("",
0.0, // Forward offset
0.0, // Side offset
0.5 // Height offset
);
// Configure AprilTag detection
LimelightHelpers::SetFiducialIDFiltersOverride("", std::vector<int>{1, 2, 3, 4}); // Only track these tag IDs
LimelightHelpers::SetFiducialDownscalingOverride("", 2.0f); // Process at half resolution
// Adjust keystone crop window (-0.95 to 0.95 for both horizontal and vertical)
LimelightHelpers::setKeystone("", 0.1, -0.05);
7. Python Integration
Communicate with custom Python SnapScripts:
- Java
- C++
// Send data to Python
double[] dataToSend = {1.0, 2.0, 3.0};
LimelightHelpers.setPythonScriptData("", dataToSend);
// Get data from Python
double[] dataFromPython = LimelightHelpers.getPythonScriptData("");
// Send data to Python
std::vector<double> dataToSend = {1.0, 2.0, 3.0};
LimelightHelpers::setPythonScriptData("", dataToSend);
// Get data from Python
std::vector<double> dataFromPython = LimelightHelpers::getPythonScriptData("");
8. Getting Detailed Results From NetworkTables (RawTargets)
For maximum performance, you can access raw target data directly from NetworkTables, bypassing JSON parsing:
- Java
- C++
// Get raw AprilTag/Fiducial data
RawFiducial[] fiducials = LimelightHelpers.getRawFiducials("");
for (RawFiducial fiducial : fiducials) {
    int id = fiducial.id; // Tag ID
    double txnc = fiducial.txnc; // X offset (no crosshair)
    double tync = fiducial.tync; // Y offset (no crosshair)
    double ta = fiducial.ta; // Target area
    double distToCamera = fiducial.distToCamera; // Distance to camera
    double distToRobot = fiducial.distToRobot; // Distance to robot
    double ambiguity = fiducial.ambiguity; // Tag pose ambiguity
}
// Get raw neural detector results
RawDetection[] detections = LimelightHelpers.getRawDetections("");
for (RawDetection detection : detections) {
    int classID = detection.classId;
    double txnc = detection.txnc;
    double tync = detection.tync;
    double ta = detection.ta;
    // Access corner coordinates if needed
    double corner0X = detection.corner0_X;
    double corner0Y = detection.corner0_Y;
    // ... corners 1-3 available similarly
}
// Get raw AprilTag/Fiducial data
std::vector<LimelightHelpers::RawFiducial> fiducials = LimelightHelpers::getRawFiducials("");
for (const LimelightHelpers::RawFiducial& fiducial : fiducials) {
    int id = fiducial.id; // Tag ID
    double txnc = fiducial.txnc; // X offset (no crosshair)
    double tync = fiducial.tync; // Y offset (no crosshair)
    double ta = fiducial.ta; // Target area
    double distToCamera = fiducial.distToCamera; // Distance to camera
    double distToRobot = fiducial.distToRobot; // Distance to robot
    double ambiguity = fiducial.ambiguity; // Tag pose ambiguity
}
// Get raw neural detector results
std::vector<LimelightHelpers::RawDetection> detections = LimelightHelpers::getRawDetections("");
for (const LimelightHelpers::RawDetection& detection : detections) {
    int classID = detection.classId;
    double txnc = detection.txnc;
    double tync = detection.tync;
    double ta = detection.ta;
    // Access corner coordinates if needed
    double corner0X = detection.corner0_X;
    double corner0Y = detection.corner0_Y;
    // ... corners 1-3 available similarly
}
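Raw fiducials are often filtered before use, for example keeping only the lowest-ambiguity tag below a cutoff. This standalone sketch mirrors the relevant RawFiducial fields with a local class; the Fid type, the cutoff value, and the selection policy are illustrative assumptions, not LimelightLib API.

```java
// Sketch: pick the most trustworthy fiducial from raw results.
public class FiducialPick {
    // Stand-in for the fields we care about from RawFiducial.
    public static class Fid {
        public final int id;
        public final double ambiguity;
        public Fid(int id, double ambiguity) { this.id = id; this.ambiguity = ambiguity; }
    }

    // Returns the index of the lowest-ambiguity fiducial strictly below
    // maxAmbiguity, or -1 if none qualifies.
    public static int bestIndex(Fid[] fids, double maxAmbiguity) {
        int best = -1;
        double bestAmb = maxAmbiguity;
        for (int i = 0; i < fids.length; i++) {
            if (fids[i].ambiguity < bestAmb) {
                bestAmb = fids[i].ambiguity;
                best = i;
            }
        }
        return best;
    }
}
```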
9. Limelight 4 IMU Integration
Limelight 4 has a built-in IMU that can be used with MegaTag2 for improved localization during rotation:
- Java
- C++
// Seed the internal IMU with your external gyro (call while disabled)
LimelightHelpers.SetIMUMode("", 1);
// Switch to internal IMU with external assist when enabled
LimelightHelpers.SetIMUMode("", 4);
LimelightHelpers.SetIMUAssistAlpha("", 0.001); // Adjust correction strength
// Seed the internal IMU with your external gyro (call while disabled)
LimelightHelpers::SetIMUMode("", 1);
// Switch to internal IMU with external assist when enabled
LimelightHelpers::SetIMUMode("", 4);
LimelightHelpers::SetIMUAssistAlpha("", 0.001); // Adjust correction strength
See the MegaTag2 documentation for a full explanation of IMU modes.
10. Capturing Video with Rewind (LL4 Only)
Control the Rewind feature programmatically:
- Java
- C++
// Enable/disable rewind recording
LimelightHelpers.setRewindEnabled("", true);
// Capture the last N seconds of video after an event
LimelightHelpers.triggerRewindCapture("", 30.0); // Capture last 30 seconds
// Enable/disable rewind recording
LimelightHelpers::setRewindEnabled("", true);
// Capture the last N seconds of video after an event
LimelightHelpers::triggerRewindCapture("", 30.0); // Capture last 30 seconds
11. Capturing Snapshots
Trigger snapshot capture programmatically:
- Java
- C++
// Trigger a snapshot (rate-limited to once per 10 frames)
LimelightHelpers.triggerSnapshot("");
// Trigger a snapshot (rate-limited to once per 10 frames)
LimelightHelpers::triggerSnapshot("");