# Observability

## What You'll Learn

How to use OpenTelemetry and logging to monitor Foyle.
## Honeycomb

To configure Foyle to use Honeycomb as a backend, set the telemetry.honeycomb.apiKeyFile
configuration option to the path of a file containing your Honeycomb API key:

```shell
foyle config set telemetry.honeycomb.apiKeyFile=/path/to/apikey
```
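The referenced file is just a plain-text file containing the key. As a minimal sketch, assuming the key is in the `HONEYCOMB_API_KEY` environment variable (the variable name and path below are examples, not Foyle requirements; use whatever path you pass to foyle config set):

```shell
# Store the Honeycomb API key in a file readable only by your user.
# The path and HONEYCOMB_API_KEY variable are illustrative examples.
mkdir -p "$HOME/.config/foyle"
printf '%s' "${HONEYCOMB_API_KEY:-example-key}" > "$HOME/.config/foyle/honeycomb-apikey"
chmod 600 "$HOME/.config/foyle/honeycomb-apikey"
```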
## Google Cloud Logging

If you are running Foyle locally but want to stream logs to Cloud Logging, you can edit the logging stanza in your
config as follows:

```yaml
logging:
  sinks:
    - json: true
      path: gcplogs:///projects/${PROJECT}/logs/foyle
```

Remember to substitute your actual project for ${PROJECT}.
In the Google Cloud Console you can find the logs using the query:

```
logName = "projects/${PROJECT}/logs/foyle"
```
Foyle always logs to JSON files; streaming to Google Cloud Logging additionally gives you a convenient way to query and view those logs.
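If you prefer the command line, the same query works with the gcloud CLI. A sketch, assuming gcloud is installed and authenticated (the project id below is a placeholder):

```shell
# Build the Cloud Logging filter for Foyle's logs.
PROJECT=my-gcp-project   # placeholder: substitute your actual project
FILTER="logName=\"projects/${PROJECT}/logs/foyle\""
echo "$FILTER"
# Read the ten most recent entries (requires an authenticated gcloud):
# gcloud logging read "$FILTER" --project="$PROJECT" --limit=10 --format=json
```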
## Logging to Standard Error

If you want to log to standard error, set the logging stanza in your config as follows:

```yaml
logging:
  sinks:
    - json: false
      path: stderr
```
# 1 - Monitoring AI Quality

How to monitor the quality of AI outputs

## What You'll Learn

- How to observe the AI to understand why it generated the answers it did
## What was the actual prompt and response?

A good place to start when trying to understand the AI's responses is to look at the actual prompt and response from the LLM that produced the cell.
You can fetch the request and response as follows:

- Get the log for a given cell.
- From the cell log, get the traceId of the AI generation request.

```shell
CELLID=01J7KQPBYCT9VM2KFBY48JC7J0
export TRACEID=$(curl -s -X POST http://localhost:8877/api/foyle.logs.LogsService/GetBlockLog -H "Content-Type: application/json" -d "{\"id\": \"${CELLID}\"}" | jq -r .blockLog.genTraceId)
echo TRACEID=$TRACEID
```
- Given the traceId, you can fetch the request and response from the logs.

```shell
# --fail makes curl return a non-zero exit code on HTTP errors,
# so the check below actually catches failed requests
curl -s --fail -o /tmp/response.json -X POST http://localhost:8877/api/foyle.logs.LogsService/GetLLMLogs -H "Content-Type: application/json" -d "{\"traceId\": \"${TRACEID}\"}"
CODE="$?"
if [ $CODE -ne 0 ]; then
  echo "Error occurred while fetching LLM logs"
  exit $CODE
fi
```
- You can view an HTML rendering of the prompt and response.
- If you disable interactive mode for the cell, then VSCode will render the HTML response inline.
- Note: there appears to be a bug right now in the HTML rendering that introduces a bunch of extra newlines relative to what's in the actual markdown in the JSON request.

```shell
jq -r '.requestHtml' /tmp/response.json > /tmp/request.html
cat /tmp/request.html
jq -r '.responseHtml' /tmp/response.json > /tmp/response.html
cat /tmp/response.html
```
- To view the JSON versions of the actual request and response:

```shell
jq -r '.requestJson' /tmp/response.json | jq .
jq -r '.responseJson' /tmp/response.json | jq '.messages[0].content[0].text'
```
- You can print the raw markdown of the prompt as follows (the inner jq -r unescapes the JSON string so the markdown prints with its original newlines):

```shell
jq -r '.requestJson' /tmp/response.json | jq -r '.messages[0].content[0].text'
jq -r '.responseJson' /tmp/response.json | jq .
```
# 2 - Monitoring Foyle With Honeycomb

How to monitor Foyle with Honeycomb

## What You'll Learn

- How to use OpenTelemetry and Honeycomb to monitor Foyle

## Setup

```shell
foyle config set telemetry.honeycomb.apiKeyFile=/path/to/apikey
```
## Download Honeycomb CLI

- hccli is an unofficial CLI for Honeycomb.
- It is being developed to support using Honeycomb with Foyle.

```shell
TAG=$(curl -s https://api.github.com/repos/jlewi/hccli/releases/latest | jq -r '.tag_name')
# Remove the leading v because it's not part of the binary name
TAGNOV=${TAG#v}
OS=$(uname -s | tr '[:upper:]' '[:lower:]')
ARCH=$(uname -m)
echo latest tag is $TAG
echo OS is $OS
echo Arch is $ARCH
LINK=https://github.com/jlewi/hccli/releases/download/${TAG}/hccli_${TAGNOV}_${OS}_${ARCH}
echo Downloading $LINK
wget $LINK -O /tmp/hccli
```
Move the hccli binary to a directory in your PATH:

```shell
chmod a+rx /tmp/hccli
sudo mv /tmp/hccli /usr/local/bin/hccli
```

On Darwin, remove the quarantine attribute so macOS will allow the binary to run:

```shell
sudo xattr -d com.apple.quarantine /usr/local/bin/hccli
```
- In order for hccli to generate links to the Honeycomb UI, it needs to know the base URL of your environment.
- You can get this by looking at the URL in your browser when you are logged into Honeycomb.
- It is typically something like https://ui.honeycomb.io/${TEAM}/environments/${ENVIRONMENT}
- You can set the base URL in your config:

```shell
hccli config set baseURL=https://ui.honeycomb.io/${TEAM}/environments/${ENVIRONMENT}/
```

- You can check your configuration by running the get command.
## Measure Acceptance Rate

- To measure the utility of Foyle, we can look at how often Foyle's suggestions are accepted.
- When a suggestion is accepted, Foyle sends a LogEvent of type ACCEPTED.
- This creates an OTEL trace with a span named LogEvent and the attribute eventType == ACCEPTED.
- We can use the query below to count accepted suggestions:
```shell
QUERY='{
  "calculations": [
    {"op": "COUNT", "alias": "Event_Count"}
  ],
  "filters": [
    {"column": "name", "op": "=", "value": "LogEvent"},
    {"column": "eventType", "op": "=", "value": "ACCEPTED"}
  ],
  "time_range": 86400,
  "order_by": [{"op": "COUNT", "order": "descending"}]
}'
hccli querytourl --dataset foyle --query "$QUERY" --open
```
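The query above counts only accepted events; to turn that count into a rate you also need the total number of suggestions. One way to get both numbers at once (a sketch using the same query format) is to drop the eventType filter and break down by eventType, then divide the ACCEPTED count by the total:

```shell
# Count LogEvent spans grouped by eventType over the last day; the
# acceptance rate is the ACCEPTED count divided by the total count.
QUERY='{
  "calculations": [
    {"op": "COUNT", "alias": "Event_Count"}
  ],
  "filters": [
    {"column": "name", "op": "=", "value": "LogEvent"}
  ],
  "breakdowns": ["eventType"],
  "time_range": 86400,
  "order_by": [{"op": "COUNT", "order": "descending"}]
}'
# Sanity-check that the query is valid JSON before sending it:
echo "$QUERY" | jq -e . > /dev/null && echo "query ok"
```

Pass $QUERY to hccli exactly as before: hccli querytourl --dataset foyle --query "$QUERY" --open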
## Token Count Usage

- The cost of LLMs depends on the number of input and output tokens.
- You can use the query below to look at token usage:
```shell
QUERY='{
  "calculations": [
    {"op": "COUNT", "alias": "LLM_Calls_Per_Cell"},
    {"op": "SUM", "column": "llm.input_tokens", "alias": "Input_Tokens_Per_Cell"},
    {"op": "SUM", "column": "llm.output_tokens", "alias": "Output_Tokens_Per_Cell"}
  ],
  "filters": [
    {"column": "name", "op": "=", "value": "Complete"}
  ],
  "breakdowns": ["trace.trace_id"],
  "time_range": 86400,
  "order_by": [{"op": "COUNT", "order": "descending"}]
}'
hccli querytourl --dataset foyle --query "$QUERY" --open
```
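Once you have the token sums from the query, converting them to a dollar estimate is simple arithmetic. A sketch, with hypothetical token counts and per-token prices (substitute your model's actual pricing):

```shell
# Estimate LLM cost from token counts. All numbers below are
# illustrative examples, not real Foyle measurements or prices.
INPUT_TOKENS=150000
OUTPUT_TOKENS=40000
INPUT_PRICE_PER_1M=3.00    # example: $3 per 1M input tokens
OUTPUT_PRICE_PER_1M=15.00  # example: $15 per 1M output tokens
COST=$(awk -v i="$INPUT_TOKENS" -v o="$OUTPUT_TOKENS" \
           -v pi="$INPUT_PRICE_PER_1M" -v po="$OUTPUT_PRICE_PER_1M" \
           'BEGIN { printf "%.2f", (i*pi + o*po)/1000000 }')
echo "estimated cost: \$${COST}"
```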