This project demonstrates an end-to-end runtime detection pipeline in Google Cloud that transforms raw Kubernetes security alerts into structured, actionable incident reports using AI.
It combines:
- Infrastructure as Code (Terraform)
- Runtime detection with Falco
- Event-driven architecture via Google Cloud Pub/Sub
- AI-driven analysis using Google Agent Development Kit
The goal is not just detection, but operationalizing runtime alerts for incident response.
Runtime security tools generate technically accurate alerts, but:
- Alerts are often low-context
- Analysts manually translate detections into reports
- MITRE mapping and response guidance are rarely automated
This project explores whether AI can reduce that interpretation gap.
- Terraform provisions:
  - GKE cluster
  - IAM roles (least privilege)
  - Pub/Sub topic and subscription
- Falco runs inside the cluster:
  - Detects suspicious runtime activity
  - Sends alerts to Pub/Sub via Falcosidekick
- Python subscriber service:
  - Consumes Pub/Sub alerts
  - Sends structured alert data to an AI agent (ADK)
  - Receives enriched analysis
- Output:
  - Markdown-formatted incident report
  - Human-readable explanation
  - MITRE ATT&CK mapping
  - Investigation & remediation steps
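The subscriber loop described above can be sketched roughly as follows. This is a minimal sketch, not the repo's actual `main.py`: the subscription name `falco-alerts-sub`, the summarized fields, and the `print` standing in for the ADK agent call are all illustrative assumptions; only the Falco alert keys (`rule`, `priority`, `output`, `output_fields`) follow Falco's standard JSON output.

```python
import json


def summarize_alert(raw: bytes) -> dict:
    """Reduce a Falco alert (JSON bytes from Pub/Sub) to the fields the agent needs."""
    alert = json.loads(raw)
    fields = alert.get("output_fields", {})
    return {
        "rule": alert.get("rule", "unknown"),
        "priority": alert.get("priority", "unknown"),
        "output": alert.get("output", ""),
        # output_fields carries container/process context in Falco's JSON output
        "container": fields.get("container.name", "n/a"),
        "process": fields.get("proc.cmdline", "n/a"),
    }


def main() -> None:
    # Pub/Sub wiring; requires google-cloud-pubsub and valid credentials.
    # Call main() to start the streaming pull (blocks until interrupted).
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Hypothetical names; use the project ID and subscription from the Terraform outputs.
    sub_path = subscriber.subscription_path("awesome-treat-443517-s1", "falco-alerts-sub")

    def callback(message):
        summary = summarize_alert(message.data)
        # The real service forwards `summary` to the ADK agent here.
        print(summary)
        message.ack()

    subscriber.subscribe(sub_path, callback=callback).result()
```

The parsing is kept separate from the Pub/Sub wiring so it can be tested without GCP credentials.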
Simulated scenario:
- A container accesses sensitive files (`/etc/shadow`)
- Falco triggers a detection
- The event is pushed to Pub/Sub
- The AI agent produces:
  - Behavioral explanation
  - Threat interpretation
  - MITRE technique mapping
  - Suggested IR actions
  - Risk summary
The result is an AI-assisted incident report instead of a raw rule match.
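To make the enrichment step concrete, here is a toy sketch of the report shape: a static lookup stands in for the agent's MITRE reasoning, and the rule name, mapping, and suggested actions are illustrative (though "Read sensitive file untrusted" is a default Falco rule and T1003.008 is the ATT&CK technique covering `/etc/shadow` access).

```python
# Illustrative only: in the real pipeline the ADK agent produces this mapping.
MITRE_MAP = {
    "Read sensitive file untrusted": (
        "T1003.008",
        "OS Credential Dumping: /etc/passwd and /etc/shadow",
    ),
}


def build_report(rule: str, output: str) -> str:
    """Render a markdown incident report from a Falco rule name and alert output."""
    technique_id, technique_name = MITRE_MAP.get(rule, ("TXXXX", "Unmapped"))
    return "\n".join([
        f"## Incident Report: {rule}",
        f"**Falco output:** {output}",
        f"**MITRE ATT&CK:** {technique_id} ({technique_name})",
        "**Suggested actions:** isolate the pod, review image provenance, rotate exposed credentials.",
    ])
```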
This project explores:
- Detection Engineering + AI convergence
- Cloud-native runtime monitoring
- Event-driven security automation
- AI-assisted incident triage
It is a prototype of an “AI Incident Coordinator” for Kubernetes environments.
- Terraform installed.
- gcloud CLI installed and initialized.
- Authenticate with Google Cloud:

  ```shell
  gcloud auth application-default login
  gcloud auth login
  gcloud config set project awesome-treat-443517-s1
  ```
- Configure variables:
  - Copy `terraform.tfvars.example` to `terraform.tfvars` and edit it with the required values, or set variables directly in the file.
  - Ensure the `.env` file contains the correct base64-encoded GCP service account credentials.
- Initialize Terraform:

  ```shell
  terraform init
  ```

- Review the planned changes:

  ```shell
  terraform plan
  ```

- (Optional) Save the planned changes:

  ```shell
  terraform plan -out tf.plan
  terraform show -no-color tf.plan > tf.plan.txt
  ```

- Apply the configuration:

  ```shell
  terraform apply -auto-approve
  ```
This will:
- Create a GKE cluster and node pool.
- Create a Pub/Sub topic and subscription.
- Create service accounts for Falcosidekick (publisher) and Python subscriber (listener).
- Set up IAM bindings for least-privilege access.
- Deploy Falco and Falcosidekick to the cluster using the Helm provider.
- Configure kubectl context to interact with the new cluster:

  After apply, set up your `kubectl` context to manage the new GKE cluster:

  ```shell
  gcloud container clusters get-credentials gcp-falco --zone=europe-central2-a --project=awesome-treat-443517-s1
  ```
Test:
- Trigger a Falco rule/detection:

  ```shell
  kubectl exec -it $(kubectl get pods --selector=app=nginx -o name) -- /bin/bash -c "cat /etc/shadow"
  ```

- Check that the rule was triggered:

  ```shell
  kubectl logs -n falco -l app.kubernetes.io/name=falco
  ```

- Check for export errors in GCP:

  ```shell
  kubectl logs -n falco -l app.kubernetes.io/name=falcosidekick
  ```

- Note: If there are log errors or issues with service account creation, grant the publisher role:

  ```shell
  gcloud projects add-iam-policy-binding awesome-treat-443517-s1 \
    --member="serviceAccount:falcosidekick-pubsub-publisher@awesome-treat-443517-s1.iam.gserviceaccount.com" \
    --role="roles/pubsub.publisher"
  ```
To remove all resources created by this Terraform project, run:

```shell
terraform destroy -auto-approve
```

This will delete the GKE cluster, Pub/Sub topics, IAM roles, and all other resources managed by Terraform.
Note: To verify that resources are deleted, use the following `gcloud` commands:
- List Pub/Sub topics:

  ```shell
  gcloud pubsub topics list
  ```

- Delete a Pub/Sub topic:

  ```shell
  gcloud pubsub topics delete <TOPIC_NAME>
  ```

- List Pub/Sub subscriptions:

  ```shell
  gcloud pubsub subscriptions list
  ```

- Delete a Pub/Sub subscription:

  ```shell
  gcloud pubsub subscriptions delete <SUBSCRIPTION_NAME>
  ```

- List IAM service accounts:

  ```shell
  gcloud iam service-accounts list
  ```

- Delete a service account:

  ```shell
  gcloud iam service-accounts delete <SERVICE_ACCOUNT_EMAIL>
  ```

- List GKE clusters:

  ```shell
  gcloud container clusters list
  ```

- Delete a GKE cluster:

  ```shell
  gcloud container clusters delete <CLUSTER_NAME> --zone=<ZONE> --project=<PROJECT_ID>
  ```
- All resources are created with unique names to avoid collisions.
- IAM permissions are set so only Falcosidekick can publish to the Pub/Sub topic (`roles/pubsub.publisher`), and only the Python subscriber can read from it (`roles/pubsub.subscriber`).
- IAM permissions also grant the Python subscriber access to Gemini/Vertex AI (`roles/aiplatform.user`).
- All service account keys are created and passed to the relevant components automatically by Terraform.
- Outputs will show the created service account emails and resource names for use in your applications.
To authorize the Python subscriber (`main.py`) to access GCP Pub/Sub, set the environment variable in your shell:

```shell
export GOOGLE_APPLICATION_CREDENTIALS="falcosidekick-subscriber-key.json"
```

Or in Python:

```python
import os

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "falcosidekick-subscriber-key.json"
```

The key file (`falcosidekick-subscriber-key.json`) is created automatically by Terraform and corresponds to the `falcosidekick-subscriber` service account.