Scores API

You can submit scores and annotations to Latitude programmatically, enabling custom quality signals from your own code, user feedback systems, or external evaluation pipelines.

Custom Scores

Submit custom scores through the scores endpoint:
POST /v1/organizations/:organizationId/projects/:projectId/scores
Each score accepts the following fields:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| traceId | string | Yes | The trace to attach the score to |
| value | number | Yes | Normalized score between 0 and 1 |
| passed | boolean | Yes | Pass/fail verdict |
| feedback | string | Yes | Human-readable explanation of the verdict |
| source_id | string | Yes | Your custom source identifier (e.g., "user-satisfaction", "task-completion") |
| spanId | string | No | Attach to a specific span within the trace |
| sessionId | string | No | Associate with a session |
| metadata | object | No | Arbitrary JSON metadata |
Scores submitted through this endpoint are automatically categorized as custom scores.

Example

curl -X POST \
  https://api.latitude.so/v1/organizations/org_123/projects/proj_456/scores \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "traceId": "abc123def456",
    "value": 0.2,
    "passed": false,
    "feedback": "User reported the answer was incorrect. The recommended product was discontinued",
    "source_id": "user-feedback"
  }'
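The same request can be sent from application code. Below is a minimal Python sketch using only the standard library; the helper names `build_score_payload` and `submit_score` are our own illustrations, not part of an official SDK:

```python
import json
import urllib.request

API_BASE = "https://api.latitude.so/v1"

def build_score_payload(trace_id, value, passed, feedback, source_id, **optional):
    """Build and validate a score payload before sending it."""
    if not 0 <= value <= 1:
        raise ValueError("value must be a normalized score between 0 and 1")
    payload = {
        "traceId": trace_id,
        "value": value,
        "passed": passed,
        "feedback": feedback,
        "source_id": source_id,
    }
    payload.update(optional)  # optional fields: spanId, sessionId, metadata
    return payload

def submit_score(org_id, project_id, api_key, payload):
    """POST the payload to the scores endpoint and return the parsed response."""
    url = f"{API_BASE}/organizations/{org_id}/projects/{project_id}/scores"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Validating `value` client-side avoids a round trip for out-of-range scores.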

Use Cases

  • User satisfaction ratings: Convert thumbs up/down or star ratings into scores
  • Task completion metrics: Track whether the agent’s output led to a successful outcome
  • Business KPIs: Conversion rates, resolution rates, or other downstream metrics
  • External validation: Results from your own evaluation pipeline or third-party tools
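For the first use case, raw feedback signals have to be mapped onto the normalized 0–1 `value` field. One possible mapping (the helper names, the 1–5 star scale, and the pass threshold of 4 are illustrative choices, not prescribed by the API):

```python
def thumbs_to_score(thumbs_up: bool) -> dict:
    """Convert a binary thumbs up/down signal into score fields."""
    return {
        "value": 1.0 if thumbs_up else 0.0,
        "passed": thumbs_up,
        "feedback": "User gave a thumbs up" if thumbs_up else "User gave a thumbs down",
        "source_id": "user-satisfaction",
    }

def stars_to_score(stars: int, pass_threshold: int = 4) -> dict:
    """Convert a 1-5 star rating into score fields."""
    if not 1 <= stars <= 5:
        raise ValueError("stars must be between 1 and 5")
    return {
        "value": (stars - 1) / 4,  # map 1..5 linearly onto 0..1
        "passed": stars >= pass_threshold,
        "feedback": f"User rated the response {stars}/5 stars",
        "source_id": "user-satisfaction",
    }
```

The resulting dict can be merged with a `traceId` and submitted to the scores endpoint.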

Annotations API

Submit human annotations through the dedicated annotations endpoint:
POST /v1/organizations/:organizationId/projects/:projectId/annotations
Use this endpoint when building your own annotation or feedback UI outside of Latitude’s web interface. Annotations submitted through this API appear alongside annotations created in the Latitude UI.

Annotations support the same fields as custom scores, plus optional anchor fields for message-level or text-range annotations:
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| traceId | string | Yes | The trace being annotated |
| value | number | Yes | Normalized score between 0 and 1 |
| passed | boolean | Yes | Pass/fail verdict |
| feedback | string | Yes | The reviewer’s feedback text |
| issueId | string | No | Link to an existing issue |
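Building the annotation body mirrors the scores payload; optional fields such as `issueId` should be omitted rather than sent as null. A minimal sketch (the helper name is our own):

```python
def build_annotation_payload(trace_id, value, passed, feedback, issue_id=None):
    """Build an annotation payload, omitting optional fields when unset."""
    if not 0 <= value <= 1:
        raise ValueError("value must be a normalized score between 0 and 1")
    payload = {
        "traceId": trace_id,
        "value": value,
        "passed": passed,
        "feedback": feedback,
    }
    if issue_id is not None:
        payload["issueId"] = issue_id  # link to an existing issue
    return payload
```

The payload is then POSTed to the annotations endpoint with the same headers as the scores example above.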

How Scores Feed the System

Once submitted, custom scores and annotations flow through the same reliability pipeline as internally generated scores:
  1. Issue discovery: Failed scores automatically enter the discovery pipeline, where Latitude clusters similar failures into issues
  2. Analytics: Finalized scores appear in time-series dashboards
  3. Alignment: Annotation scores are compared against evaluation scores for the same traces to compute alignment metrics
Custom scores and annotations are first-class citizens. They appear alongside evaluation-generated scores in all dashboards, filters, and analytics views.
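Latitude computes alignment internally; purely as an illustration of the idea behind step 3, an agreement rate between human and evaluation pass/fail verdicts on shared traces could be sketched as follows (the function and metric here are our own, not the exact computation Latitude performs):

```python
def agreement_rate(annotation_verdicts: dict, evaluation_verdicts: dict) -> float:
    """Fraction of shared traces where the human annotation verdict
    matches the evaluation verdict.

    Both arguments map traceId -> passed (bool). Illustrative only.
    """
    shared = annotation_verdicts.keys() & evaluation_verdicts.keys()
    if not shared:
        raise ValueError("no shared traces to compare")
    agree = sum(annotation_verdicts[t] == evaluation_verdicts[t] for t in shared)
    return agree / len(shared)
```

A low agreement rate suggests the automated evaluation disagrees with human reviewers and may need tuning.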

Next Steps