Stream org-level audit logs for SIEM
in progress
A
Adam Harvey
While self-service audit capability for an end-user through the UI is a very nice feature, it would be great to allow for dynamic log shipping or to make the audit logs queryable through the REST API, so they can be fed into a Security Information and Event Management (SIEM) solution (Splunk, another logging tool, etc.).
In the fallout of the CircleCI security incident announced in early January 2023 (rotate secrets), the ability to quickly pull this data and compare it to data we already had from other systems would have made security triage significantly faster and easier.
H
Harsh Verma
Hi Henna,
What happens if the connection to AWS fails? Are there retries, and is there an alerting mechanism if the logs stop streaming?
H
Henna Abbas
Harsh Verma: You will see a notification in the UI that the connection has been disconnected, along with a last-streamed timestamp. In addition, you can manually retrieve audit logs.
H
Henna Abbas
Hi All,
This feature is in progress. Below is how we plan on addressing audit log streaming. Feel free to share your feedback, and let us know whether this does or does not meet your needs:
Problem: Manual audit log extraction lacks real-time visibility and complicates compliance reporting.
Solution: Automated streaming of audit logs directly to your AWS S3 buckets.
Simple Setup (see linked images below)
Step 1: Connect to AWS
Step 2: Verify Connection
Step 3: Monitor
Key Benefits:
Real-time security visibility with logs delivered in under 1 hour
Long-term compliance storage in your infrastructure
SIEM integration (Splunk, DataDog, Rapid7)
Enhanced security with OIDC authentication
Technical Highlights:
Comprehensive event coverage (authentication, configuration changes, pipeline executions)
AWS S3 server-side encryption with JSON format
Compatible with Splunk, DataDog, and Rapid7
OIDC authentication support
99.9% streaming uptime and reliability
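To illustrate how streamed logs could plug into a SIEM pipeline, here is a minimal sketch of filtering JSON-lines audit events by category before forwarding. The field names (`timestamp`, `actor`, `action`) and the JSON-lines layout are assumptions for illustration; the actual schema is defined by the product.

```python
import json

def security_events(jsonl: str, prefixes: tuple = ("auth.", "config.")):
    """Yield audit events whose action matches the given category prefixes.

    Assumes each line is one JSON object with an "action" field; the real
    event schema may differ.
    """
    for line in jsonl.splitlines():
        if not line.strip():
            continue
        event = json.loads(line)
        if event.get("action", "").startswith(prefixes):
            yield event

# Hypothetical sample payload, three events across two categories.
sample = "\n".join([
    json.dumps({"timestamp": "2025-07-16T09:52:30Z", "actor": "alice", "action": "auth.login"}),
    json.dumps({"timestamp": "2025-07-16T09:53:02Z", "actor": "bob", "action": "pipeline.run"}),
    json.dumps({"timestamp": "2025-07-16T09:54:10Z", "actor": "carol", "action": "config.update"}),
])
matches = list(security_events(sample))  # keeps the auth.* and config.* events
```

A real integration would read the objects from S3 and hand matching events to the SIEM's ingest API; the filtering step is the same.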
This post was marked as in progress
This post was marked as under review
A
Arkadiy Tetelman
Works for us - thank you! Please have the AWS connection use role assumption rather than requiring IAM users.
H
Henna Abbas
Arkadiy Tetelman: Absolutely! We'll set up the AWS connection using role assumption. No IAM users will be required, making things more secure and easier to manage.
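For readers wiring this up, role assumption with OIDC typically means attaching a federated trust policy to the IAM role the stream will assume. The snippet below builds such a policy document; the account ID and OIDC provider host are placeholders, and the exact issuer and audience values would come from the product's setup instructions.

```python
import json

ACCOUNT_ID = "123456789012"            # assumption: your AWS account ID
OIDC_PROVIDER = "oidc.example.com"     # assumption: the vendor's OIDC issuer host

# Standard IAM trust policy for sts:AssumeRoleWithWebIdentity via a
# registered OIDC identity provider; no long-lived IAM user credentials.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::{ACCOUNT_ID}:oidc-provider/{OIDC_PROVIDER}"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {f"{OIDC_PROVIDER}:aud": "sts.amazonaws.com"}
            },
        }
    ],
}
policy_json = json.dumps(trust_policy, indent=2)
```

The resulting JSON would be supplied as the role's trust relationship when creating it (e.g. via the IAM console or `aws iam create-role --assume-role-policy-document`).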
Patrick Marques
Can you share a bit more about the S3 object/"folder" structure? Is it a flat structure, or is it something like yyyy/mm/dd?
It should work for us as well
Patrick Marques
Can you share a bit more about "99.9% streaming uptime and reliability"? Do you have any mechanism to recover from failures or deal with gaps?
As a side note, you have reliability twice: "99.9% streaming reliability" and "99.9% streaming uptime and reliability".
H
Henna Abbas
Patrick Marques: We plan on using a fixed schema for the S3 object/folder structure rather than allowing customers to name their own. The structure follows this format:
year=%d/month=%d/day=%d/hour=%d/{timestamp}_{file_sha1}
For example, files would be organized like 'year=2025/month=07/day=16/hour=09/2025-07-16T09:52:30_a1b2c3d4e5f6...' where the timestamp is in ISO 8601 format (YYYY-MM-DDTHH:MM:SS) in UTC time, and the SHA1 is a unique hash of the file.
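The key scheme above can be sketched in a few lines. This is a minimal illustration of the stated format, not the vendor's implementation; note the example key zero-pads month, day, and hour, so the sketch does the same.

```python
import hashlib
from datetime import datetime, timezone

def s3_audit_key(event_time: datetime, payload: bytes) -> str:
    """Build the fixed key: year=.../month=.../day=.../hour=.../{timestamp}_{file_sha1}."""
    prefix = event_time.strftime("year=%Y/month=%m/day=%d/hour=%H")
    ts = event_time.strftime("%Y-%m-%dT%H:%M:%S")   # ISO 8601, UTC
    file_sha1 = hashlib.sha1(payload).hexdigest()   # unique hash of the file contents
    return f"{prefix}/{ts}_{file_sha1}"

key = s3_audit_key(
    datetime(2025, 7, 16, 9, 52, 30, tzinfo=timezone.utc),
    b'{"event": "example"}',
)
# key -> "year=2025/month=07/day=16/hour=09/2025-07-16T09:52:30_<40-char sha1>"
```

One nice property of this fixed hour-partitioned layout is that SIEM collectors and Athena-style queries can prune by prefix (e.g. everything under `year=2025/month=07/day=16/`) without listing the whole bucket.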