
NetworkTables API

Limelight OS features a NetworkTables 4 Client. It auto-connects to the NetworkTables 4 Server running on FRC Robots based on the Team Number / ID configured in the Settings UI.

All data is published to a table that matches the device name (e.g. "limelight"). If a hostname / nickname is assigned to your camera, the table name will match the full limelight name (e.g. "limelight-top").
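For example, reading from a camera nicknamed "top" would use the "limelight-top" table. A minimal sketch; the WPILib imports shown here apply to all of the Java snippets below:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTable limelightTable = NetworkTableInstance.getDefault().getTable("limelight-top");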

LimelightLib WPIJava and LimelightLib WPICPP interact with Limelight devices via NetworkTables.

Basic Targeting Data

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDouble(0);

to retrieve this data:

tv (int): 1 if a valid target exists, 0 if no valid targets exist
tx (double): Horizontal Offset From Crosshair To Target (LL1: -27 degrees to 27 degrees / LL2: -29.8 to 29.8 degrees)
ty (double): Vertical Offset From Crosshair To Target (LL1: -20.5 degrees to 20.5 degrees / LL2: -24.85 to 24.85 degrees)
txnc (double): Horizontal Offset From Principal Pixel To Target (degrees)
tync (double): Vertical Offset From Principal Pixel To Target (degrees)
ta (double): Target Area (0% of image to 100% of image)
tl (double): The pipeline's latency contribution (ms). Add to "cl" to get total latency.
cl (double): Capture pipeline latency (ms). Time between the end of the exposure of the middle row of the sensor and the beginning of the tracking pipeline.
t2d (doubleArray): Array containing several values for matched-timestamp statistics: [targetValid, targetCount, targetLatency, captureLatency, tx, ty, txnc, tync, ta, tid, targetClassIndexDetector, targetClassIndexClassifier, targetLongSidePixels, targetShortSidePixels, targetHorizontalExtentPixels, targetVerticalExtentPixels, targetSkewDegrees]
getpipe (int): True active pipeline index of the camera (0 .. 9)
getpipetype (string): Pipeline Type, e.g. "pipe_color"
json (string): Full JSON dump of targeting results. Must be enabled per-pipeline in the "Output" tab.
tclass (string): Class name of the primary neural detector result or neural classifier result
tc (doubleArray): Average HSV color underneath the crosshair region (3x3 pixel region) as a NumberArray
hb (double): Heartbeat value. Increases once per frame, resets at 2 billion.
hw (doubleArray): HW metrics [fps, cpu temp, ram usage, temp]
crosshairs (doubleArray): 2D Crosshairs [cx0, cy0, cx1, cy1]
tcclass (string): Name of the classifier pipeline's computed class
tdclass (string): Name of the detector pipeline's primary detection
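
For reference, a minimal sketch that reads several of these entries in a robot loop (assuming the default "limelight" table name):

NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

boolean hasTarget = table.getEntry("tv").getDouble(0) == 1;   // valid target present?
double tx = table.getEntry("tx").getDouble(0);                // horizontal offset (degrees)
double ty = table.getEntry("ty").getDouble(0);                // vertical offset (degrees)
double ta = table.getEntry("ta").getDouble(0);                // target area (% of image)
double totalLatencyMs = table.getEntry("tl").getDouble(0)
                      + table.getEntry("cl").getDouble(0);    // pipeline + capture latency (ms)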

AprilTag and 3D Data

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDoubleArray(new double[6]);

to retrieve this data:

botpose (doubleArray): Robot transform in field-space. Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_wpiblue (doubleArray): Robot transform in field-space (blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_wpired (doubleArray): Robot transform in field-space (red driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb (doubleArray): Robot transform in field-space (MegaTag2). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb_wpiblue (doubleArray): Robot transform in field-space (MegaTag2) (blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_orb_wpired (doubleArray): Robot transform in field-space (MegaTag2) (red driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
camerapose_targetspace (doubleArray): 3D transform of the camera in the coordinate system of the primary in-view AprilTag (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
targetpose_cameraspace (doubleArray): 3D transform of the primary in-view AprilTag in the coordinate system of the camera (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
targetpose_robotspace (doubleArray): 3D transform of the primary in-view AprilTag in the coordinate system of the robot (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
botpose_targetspace (doubleArray): 3D transform of the robot in the coordinate system of the primary in-view AprilTag (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees)
camerapose_robotspace (doubleArray): 3D transform of the camera in the coordinate system of the robot (array (6))
tid (int): ID of the primary in-view AprilTag
stddevs (doubleArray): MegaTag Standard Deviations [MT1x, MT1y, MT1z, MT1roll, MT1pitch, MT1yaw, MT2x, MT2y, MT2z, MT2roll, MT2pitch, MT2yaw]
camerapose_robotspace_set (doubleArray): SET the camera's pose in the coordinate system of the robot
priorityid (int): SET the required ID for tx/ty targeting. Ignore other targets. Does not affect localization.
robot_orientation_set (doubleArray): SET Robot Orientation and angular velocities in degrees and degrees per second [yaw, yawrate, pitch, pitchrate, roll, rollrate]
fiducial_id_filters_set (doubleArray): Override valid fiducial IDs for localization (array)
fiducial_offset_set (doubleArray): SET the 3D Point of Interest Offset [x, y, z]
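
A sketch of consuming one of the MegaTag2 field-space poses, using the array layout documented above. The gyro yaw value is a placeholder for your own IMU reading; MegaTag2 relies on the orientation you publish through robot_orientation_set:

NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

// Publish the robot's yaw (and optionally angular rates) every loop.
double gyroYawDegrees = 0.0; // placeholder: substitute your IMU's yaw here
table.getEntry("robot_orientation_set")
     .setDoubleArray(new double[] {gyroYawDegrees, 0, 0, 0, 0, 0});

// botpose_orb_wpiblue layout: [x, y, z, roll, pitch, yaw, latency, tag count,
// tag span, avg tag distance, avg tag area]
double[] botpose = table.getEntry("botpose_orb_wpiblue").getDoubleArray(new double[0]);
if (botpose.length >= 11 && (int) botpose[7] > 0) {
  double xMeters = botpose[0];
  double yMeters = botpose[1];
  double yawDegrees = botpose[5];
  double totalLatencyMs = botpose[6];
}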

Camera Controls

Use the following code:

NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").setNumber(<value>);

to set this data:

ledMode: Sets limelight's LED state
  0: use the LED Mode set in the current pipeline
  1: force off
  2: force blink
  3: force on

pipeline: Sets limelight's current pipeline
  0 .. 9: Select pipeline 0 .. 9

stream: Sets limelight's streaming mode
  0: Standard - Side-by-side streams if a webcam is attached to Limelight
  1: PiP Main - The secondary camera stream is placed in the lower-right corner of the primary camera stream
  2: PiP Secondary - The primary camera stream is placed in the lower-right corner of the secondary camera stream

crop (doubleArray): Sets the crop rectangle. The pipeline must utilize the default crop rectangle in the web interface. The array must have exactly 4 entries.
  [0]: X0 - Min or Max X value of crop rectangle (-1 to 1)
  [1]: X1 - Min or Max X value of crop rectangle (-1 to 1)
  [2]: Y0 - Min or Max Y value of crop rectangle (-1 to 1)
  [3]: Y1 - Min or Max Y value of crop rectangle (-1 to 1)

double[] cropValues = new double[4];
cropValues[0] = -1.0;
cropValues[1] = 1.0;
cropValues[2] = -1.0;
cropValues[3] = 1.0;
NetworkTableInstance.getDefault().getTable("limelight").getEntry("crop").setDoubleArray(cropValues);

Python

Python scripts allow for arbitrary inbound and outbound data.

llpython: NumberArray sent by python scripts. This is accessible within robot code.
llrobot: NumberArray sent by the robot. This is accessible within python SnapScripts.
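
A sketch of the round trip from robot code; the array contents and length are entirely up to your script:

NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

// Send arbitrary data to the python script.
table.getEntry("llrobot").setDoubleArray(new double[] {1.0, 2.0, 3.0});

// Read back whatever the python script published.
double[] fromPython = table.getEntry("llpython").getDoubleArray(new double[0]);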

Raw Data

Corners:

Enable "send contours" in the "Output" tab to stream corner coordinates:

tcornxy: Number array of corner coordinates [x0, y0, x1, y1, ...]
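
The flat array can be unpacked in (x, y) pairs, for example:

double[] corners = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("tcornxy").getDoubleArray(new double[0]);
for (int i = 0; i + 1 < corners.length; i += 2) {
  double x = corners[i];
  double y = corners[i + 1];
  // use the corner coordinate...
}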

Raw Targets:

Limelight posts three raw contours to NetworkTables that are not influenced by your grouping mode. That is, they are filtered with your pipeline parameters, but never grouped. X and Y are returned in normalized screen space (-1 to 1) rather than degrees.

rawtargets: [txnc, tync, ta, txnc2, tync2, ta2, ...]
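
Each raw target contributes three values, so the array can be walked in strides of 3 (a sketch):

double[] rawTargets = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("rawtargets").getDoubleArray(new double[0]);
for (int i = 0; i + 2 < rawTargets.length; i += 3) {
  double txnc = rawTargets[i];     // normalized horizontal offset (-1 to 1)
  double tync = rawTargets[i + 1]; // normalized vertical offset (-1 to 1)
  double ta = rawTargets[i + 2];   // area (% of image)
}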

Raw Fiducials:

Get all valid (unfiltered) fiducials

rawfiducials: [id, txnc, tync, ta, distToCamera, distToRobot, ambiguity, id2, ...]
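
Each fiducial contributes seven values, so the array can be walked in strides of 7 (a sketch):

double[] rawFiducials = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("rawfiducials").getDoubleArray(new double[0]);
for (int i = 0; i + 6 < rawFiducials.length; i += 7) {
  int id = (int) rawFiducials[i];
  double txnc = rawFiducials[i + 1];
  double tync = rawFiducials[i + 2];
  double ta = rawFiducials[i + 3];
  double distToCamera = rawFiducials[i + 4];
  double distToRobot = rawFiducials[i + 5];
  double ambiguity = rawFiducials[i + 6];
}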

Raw Detections:

Get all valid (unfiltered) neural detection results

rawdetections: [id, txnc, tync, ta, corner0x, corner0y, corner1x, corner1y, corner2x, corner2y, corner3x, corner3y, id2, ...]
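
Each detection contributes twelve values (class id, offsets, area, and four corner pairs), so the array can be walked in strides of 12 (a sketch):

double[] rawDetections = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("rawdetections").getDoubleArray(new double[0]);
for (int i = 0; i + 11 < rawDetections.length; i += 12) {
  int classId = (int) rawDetections[i];
  double txnc = rawDetections[i + 1];
  double tync = rawDetections[i + 2];
  double ta = rawDetections[i + 3];
  // Four corner (x, y) pairs follow at indices i + 4 .. i + 11.
}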

Raw Barcodes:

Get all valid (unfiltered) barcode results

rawbarcodes: string array of barcode data
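
Barcode results come back as strings rather than numbers, e.g.:

String[] barcodes = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("rawbarcodes").getStringArray(new String[0]);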