
Complete NetworkTables API

Limelight OS features a NetworkTables 3 client that auto-connects to the NetworkTables 4 server running on FRC robots, based on the team number / ID configured in the settings UI.

Basic Targeting Data

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDouble(0);

to retrieve this data:

key | type | description
--- | --- | ---
tv | int | 1 if a valid target exists, 0 if no valid targets exist
tx | double | Horizontal offset from crosshair to target (LL1: -27 to 27 degrees / LL2: -29.8 to 29.8 degrees)
ty | double | Vertical offset from crosshair to target (LL1: -20.5 to 20.5 degrees / LL2: -24.85 to 24.85 degrees)
txnc | double | Horizontal offset from principal pixel to target
tync | double | Vertical offset from principal pixel to target
ta | double | Target area (0% of image to 100% of image)
tl | double | The pipeline's latency contribution (ms). Add to "cl" to get total latency.
cl | double | Capture pipeline latency (ms). Time from the end of the exposure of the middle row of the sensor to the beginning of the tracking pipeline.
tshort | double | Sidelength of the shortest side of the fitted bounding box (pixels)
tlong | double | Sidelength of the longest side of the fitted bounding box (pixels)
thor | double | Horizontal sidelength of the rough bounding box (0 - 320 pixels)
tvert | double | Vertical sidelength of the rough bounding box (0 - 320 pixels)
getpipe | int | True active pipeline index of the camera (0 .. 9)
json | string | Full JSON dump of targeting results
tclass | string | Class name of the primary neural detector result or neural classifier result
tc | doubleArray | Average HSV color underneath the crosshair region (3x3 pixel region) as a NumberArray
hb | double | Heartbeat value. Increases once per frame, resets at 2 billion
hw | doubleArray | HW metrics [fps, cpu temp, ram usage, temp]
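As a minimal sketch of how the table above is typically consumed, the snippet below gates on "tv" and combines "tl" and "cl" into a total latency. It is plain Java with hard-coded sample values standing in for live NetworkTables reads; the helper names are illustrative, not part of the API.

```java
// Sketch: interpreting basic targeting data. The sample values below would
// normally come from getEntry("tv") / getEntry("tl") / getEntry("cl").
public class BasicTargeting {
    // tv is published as a number; treat 1 (or greater) as a valid target.
    static boolean hasValidTarget(double tv) {
        return tv >= 1.0;
    }

    // tl and cl are both in milliseconds; total latency is their sum,
    // converted here to seconds for use in timestamp math.
    static double totalLatencySeconds(double tlMs, double clMs) {
        return (tlMs + clMs) / 1000.0;
    }

    public static void main(String[] args) {
        double tv = 1.0, tl = 11.5, cl = 12.5; // placeholder sample values
        if (hasValidTarget(tv)) {
            System.out.println("total latency (s) = " + totalLatencySeconds(tl, cl));
        }
    }
}
```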

AprilTag and 3D Data

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDoubleArray(new double[6]);

to retrieve this data:

key | type | description
--- | --- | ---
botpose | doubleArray | Robot transform in field-space. Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_wpiblue | doubleArray | Robot transform in field-space (blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_wpired | doubleArray | Robot transform in field-space (red driverstation WPILIB origin). Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb | doubleArray | Robot transform in field-space (MegaTag2). Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb_wpiblue | doubleArray | Robot transform in field-space (MegaTag2) (blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb_wpired | doubleArray | Robot transform in field-space (MegaTag2) (red driverstation WPILIB origin). Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
camerapose_targetspace | doubleArray | 3D transform of the camera in the coordinate system of the primary in-view AprilTag (array of 6) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
targetpose_cameraspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the camera (array of 6) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
targetpose_robotspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the robot (array of 6) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
botpose_targetspace | doubleArray | 3D transform of the robot in the coordinate system of the primary in-view AprilTag (array of 6) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
camerapose_robotspace | doubleArray | 3D transform of the camera in the coordinate system of the robot (array of 6)
tid | int | ID of the primary in-view AprilTag
priorityid | int (setter) | SET the required ID for tx/ty targeting. Ignore other targets. Does not affect localization
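The botpose arrays share the index layout described above: translation, rotation, total latency, tag count, tag span, average tag distance, average tag area. A small sketch of unpacking that layout into named fields, using a hard-coded sample array in place of getEntry("botpose").getDoubleArray(new double[11]); the class and field names are illustrative:

```java
// Sketch: unpacking a botpose-style array. Index layout per the table above:
// [x, y, z, roll, pitch, yaw, totalLatency, tagCount, tagSpan,
//  avgTagDist, avgTagArea].
public class BotPose {
    final double x, y, z;          // meters
    final double roll, pitch, yaw; // degrees
    final double latencyMs;        // cl + tl
    final int tagCount;            // number of tags used in the solve

    BotPose(double[] a) {
        if (a.length < 11) throw new IllegalArgumentException("expected 11 entries");
        x = a[0]; y = a[1]; z = a[2];
        roll = a[3]; pitch = a[4]; yaw = a[5];
        latencyMs = a[6];
        tagCount = (int) a[7];
    }

    public static void main(String[] args) {
        // Placeholder sample standing in for a live NetworkTables read.
        double[] sample = {1.2, 3.4, 0.0, 0.0, 0.0, 45.0, 24.0, 2.0, 1.1, 2.5, 0.3};
        BotPose p = new BotPose(sample);
        System.out.println("x=" + p.x + " yaw=" + p.yaw + " tags=" + p.tagCount);
    }
}
```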

Camera Controls

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").setNumber(<value>);

to set this data:

ledMode: Sets limelight's LED state
- 0: use the LED Mode set in the current pipeline
- 1: force off
- 2: force blink
- 3: force on

camMode: Sets limelight's operation mode
- 0: Vision processor
- 1: Driver camera (increases exposure, disables vision processing)

pipeline: Sets limelight's current pipeline
- 0 .. 9: Select pipeline 0 .. 9

stream: Sets limelight's streaming mode
- 0: Standard - side-by-side streams if a webcam is attached to Limelight
- 1: PiP Main - the secondary camera stream is placed in the lower-right corner of the primary camera stream
- 2: PiP Secondary - the primary camera stream is placed in the lower-right corner of the secondary camera stream

snapshot: Allows users to take snapshots during a match
- 0: Reset snapshot mode
- 1: Take exactly one snapshot

crop: (Array) Sets the crop rectangle. The pipeline must utilize the default crop rectangle in the web interface. The array must have exactly 4 entries.
- [0]: X0 - min or max X value of the crop rectangle (-1 to 1)
- [1]: X1 - min or max X value of the crop rectangle (-1 to 1)
- [2]: Y0 - min or max Y value of the crop rectangle (-1 to 1)
- [3]: Y1 - min or max Y value of the crop rectangle (-1 to 1)

camerapose_robotspace_set: (Array) Set the camera's pose in the coordinate system of the robot.

priorityid: SET the required ID for tx/ty targeting. Ignore other targets. Does not affect localization.

robot_orientation_set: (Array) SET robot orientation and angular velocities in degrees and degrees per second [yaw, yawrate, pitch, pitchrate, roll, rollrate]

fiducial_id_filters_set: (Array) Override valid fiducial IDs for localization
For example, the following selects the full image as the crop rectangle:

double[] cropValues = new double[4];
cropValues[0] = -1.0;
cropValues[1] = 1.0;
cropValues[2] = -1.0;
cropValues[3] = 1.0;
NetworkTableInstance.getDefault().getTable("limelight").getEntry("crop").setDoubleArray(cropValues);
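The robot_orientation_set array follows the layout listed above: [yaw, yawrate, pitch, pitchrate, roll, rollrate] in degrees and degrees per second. A sketch of building that array with a small helper (the helper name is illustrative); the result would be written with the same setDoubleArray pattern as the crop example:

```java
// Sketch: building the robot_orientation_set array.
// Layout per the table above: [yaw, yawRate, pitch, pitchRate, roll, rollRate]
// (degrees and degrees per second).
public class OrientationArray {
    static double[] build(double yawDeg, double yawRateDps) {
        // Assumption for this sketch: only yaw is supplied; the remaining
        // pitch/roll entries are left at zero.
        return new double[] {yawDeg, yawRateDps, 0.0, 0.0, 0.0, 0.0};
    }

    public static void main(String[] args) {
        double[] data = build(90.0, 0.0); // placeholder gyro reading
        System.out.println(java.util.Arrays.toString(data));
    }
}
```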

Python

Python scripts allow for arbitrary inbound and outbound data.

key | description
--- | ---
llpython | NumberArray sent by python scripts. This is accessible from robot code.
llrobot | NumberArray sent by the robot. This is accessible from python scripts.

Raw Contours

Corners:

Enable "send contours" in the "Output" tab to stream corner coordinates:

tcornxy | Number array of corner coordinates [x0, y0, x1, y1, ...]

Raw Targets:

Limelight posts three raw contours to NetworkTables that are not influenced by your grouping mode. That is, they are filtered with your pipeline parameters, but never grouped. X and Y are returned in normalized screen space (-1 to 1) rather than degrees.

rawtargets | [txnc, tync, ta, txnc2, tync2, ta2, ...]

Raw Fiducials:

Get all valid (unfiltered) fiducials

rawfiducials | [id, txnc, tync, ta, distToCamera, distToRobot, ambiguity, id2, ...]
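Since rawfiducials is a flat array with seven entries per tag, it can be split into per-tag records. A sketch, with a hard-coded sample standing in for getEntry("rawfiducials").getDoubleArray(new double[0]); the record and method names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: splitting the flat rawfiducials array into per-tag records.
// Each fiducial occupies 7 consecutive entries:
// [id, txnc, tync, ta, distToCamera, distToRobot, ambiguity].
public class RawFiducials {
    record Fiducial(int id, double txnc, double tync, double ta,
                    double distToCamera, double distToRobot, double ambiguity) {}

    static List<Fiducial> parse(double[] raw) {
        List<Fiducial> out = new ArrayList<>();
        for (int i = 0; i + 7 <= raw.length; i += 7) {
            out.add(new Fiducial((int) raw[i], raw[i + 1], raw[i + 2], raw[i + 3],
                                 raw[i + 4], raw[i + 5], raw[i + 6]));
        }
        return out;
    }

    public static void main(String[] args) {
        // Placeholder sample: two fiducials, ids 7 and 3.
        double[] sample = {7, 0.1, -0.2, 1.5, 2.0, 2.3, 0.05,
                           3, -0.4, 0.0, 0.8, 3.1, 3.4, 0.12};
        for (Fiducial f : parse(sample)) {
            System.out.println("tag " + f.id() + " ambiguity " + f.ambiguity());
        }
    }
}
```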

Raw Crosshairs:

If you are using raw targeting data, you can still utilize your calibrated crosshairs:

key | description
--- | ---
cx0 | Crosshair A X in normalized screen space
cy0 | Crosshair A Y in normalized screen space
cx1 | Crosshair B X in normalized screen space
cy1 | Crosshair B Y in normalized screen space
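As one plausible use (an assumption for illustration, not an API guarantee): since raw targets and calibrated crosshairs are both reported in normalized screen space, subtracting the crosshair position recovers a crosshair-relative offset. Sample numbers stand in for live NetworkTables reads:

```java
// Sketch: offsetting a raw target by a calibrated crosshair. Both the raw
// target (txnc-style) and the crosshair (cx0/cy0) are in normalized screen
// space (-1 to 1).
public class CrosshairOffset {
    static double[] relativeToCrosshair(double txnc, double tync,
                                        double cx, double cy) {
        return new double[] {txnc - cx, tync - cy};
    }

    public static void main(String[] args) {
        // Placeholder values for a raw target and Crosshair A.
        double[] off = relativeToCrosshair(0.25, -0.10, 0.05, 0.00);
        System.out.println("dx=" + off[0] + " dy=" + off[1]);
    }
}
```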