# Adding Setup and Teardown Stages to Your Kusho E2E Tests
If you're already running Kusho E2E tests in GitHub Actions using tags and want to add setup and teardown stages that pass data between them, here's how to do it.
## How It Works

When you run Kusho E2E tests with a volume mounted (`-v $(pwd)/output:/app/output`), all execution-related data is automatically written to `execution_data.json` inside the mounted directory. This file contains everything about the test run: variables, request/response data, assertion results, and more.

By uploading this file as a GitHub Actions artifact, you make it available to subsequent stages in your workflow. Later stages can download the artifact, parse the JSON file to extract specific values (like user IDs, tokens, or any data from API responses), and pass those values as variables to the next set of tests.

This creates a data flow: Setup → Artifact → Main Tests → Artifact → Teardown, where each stage can access and use data produced by previous stages.
## What You Need to Change

Your current workflow probably looks like this:

```yaml
jobs:
  e2e-tests:
    runs-on: ubuntu-latest
    steps:
      - name: Run E2E Tests
        run: |
          docker run --rm \
            -e BASE_URL="https://api.example.com" \
            -e ENVIRONMENT_ID="2" \
            -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
            -e CI_COMMIT_SHA="${{ github.sha }}" \
            -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
            -e E2E_TEST_SUITE_TAGS="smoke,regression" \
            -e E2E_PROFILE_TAGS="staging" \
            public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
```
To add setup and teardown, you'll need to:

- Add volume mounting to capture execution data
- Upload/download artifacts to pass data between stages
- Use `jq` to extract variables from the execution data (this is just a suggestion; you can use anything else to parse the execution data file and extract data from it)
- Pass variables to subsequent stages using the `VARIABLES` or `VARIABLE_*` env vars
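If you'd rather not use jq, the same "recursive search, first match" lookup is only a few lines in any language. Here's a minimal Python sketch; the `find_first` helper and the sample data are illustrative, not part of Kusho:

```python
import json

def find_first(node, key):
    """Depth-first search for the first occurrence of `key` anywhere
    in a nested dict/list structure (the analogue of jq's `..`
    combined with `head -1`)."""
    if isinstance(node, dict):
        if key in node:
            return node[key]
        for value in node.values():
            found = find_first(value, key)
            if found is not None:
                return found
    elif isinstance(node, list):
        for item in node:
            found = find_first(item, key)
            if found is not None:
                return found
    return None

# Hypothetical excerpt mirroring the execution_data.json shape
data = json.loads('{"test_suites": [{"variables": {"user_id": "usr_12345"}}]}')
print(find_first(data["test_suites"][0], "user_id"))  # usr_12345
```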
## Step 1: Add Setup Stage

The setup stage looks exactly like your main test stage, but with a few additions:

- ✅ Add volume mounting: `-v $(pwd)/output:/app/output`
- ✅ Upload the artifact after execution
- ✅ Use the tags (e.g. `setup`) that you added to your setup-related workflows on KushoAI
```yaml
jobs:
  setup:
    runs-on: ubuntu-latest
    steps:
      - name: Run Setup Tests
        run: |
          # New: the -v volume mount captures execution data, and
          # E2E_TEST_SUITE_TAGS now selects the setup workflows
          docker run --rm \
            -v $(pwd)/output:/app/output \
            -e BASE_URL="https://api.example.com" \
            -e ENVIRONMENT_ID="2" \
            -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
            -e CI_COMMIT_SHA="${{ github.sha }}" \
            -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
            -e E2E_TEST_SUITE_TAGS="setup" \
            -e E2E_PROFILE_TAGS="staging" \
            public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
      - name: Upload execution data  # ← New step
        uses: actions/upload-artifact@v4
        with:
          name: setup-execution-data
          path: output/execution_data.json
          retention-days: 1
```
What this does:

- The volume mount makes the execution data file available on your runner
- The artifact upload makes it available to other jobs in the workflow
- The `setup` tag runs only setup-related workflows
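One optional hardening step (not required by Kusho): sanity-check the file before uploading it, so a missing volume mount fails in the setup job instead of surfacing as a confusing error in a later stage. A small Python sketch; `validate_execution_data` is illustrative, and the path matches the mount used above:

```python
import json
from pathlib import Path

def validate_execution_data(path):
    """Load an execution_data.json and return its test-suite count.
    A missing file raises FileNotFoundError and malformed JSON raises
    JSONDecodeError, failing the CI step right where the problem is."""
    data = json.loads(Path(path).read_text())
    return len(data.get("test_suites", []))
```

In the workflow this would be a one-line step between the test run and the artifact upload that calls `validate_execution_data("output/execution_data.json")` and prints the count.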
## Step 2: Update Main Test Stage

Your main test stage now needs to:

- Download the setup execution data
- Extract variables using `jq` (or some other script of your preference)
- Pass them to your Kusho tests
```yaml
main-tests:
  runs-on: ubuntu-latest
  needs: setup  # ← Wait for setup to complete
  steps:
    - name: Download setup execution data  # ← New step
      uses: actions/download-artifact@v4
      with:
        name: setup-execution-data
        path: ./setup-data
    - name: Extract variables from setup  # ← New step
      id: extract
      run: |
        # Extract variables using jq (jq is pre-installed on GitHub runners)
        USER_ID=$(jq -r '.test_suites[0] | .. | .response?.data?["user_id"]? // empty' ./setup-data/execution_data.json | head -1)
        API_TOKEN=$(jq -r '.test_suites[0] | .. | .response?.headers?["api_token"]? // empty' ./setup-data/execution_data.json | head -1)
        echo "user_id=$USER_ID" >> $GITHUB_OUTPUT
        echo "api_token=$API_TOKEN" >> $GITHUB_OUTPUT
    - name: Run Main E2E Tests  # ← Updated step
      run: |
        # New: the volume mount (for teardown) and the VARIABLES env var
        docker run --rm \
          -v $(pwd)/output:/app/output \
          -e BASE_URL="https://api.example.com" \
          -e ENVIRONMENT_ID="2" \
          -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
          -e CI_COMMIT_SHA="${{ github.sha }}" \
          -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
          -e E2E_TEST_SUITE_TAGS="smoke,regression" \
          -e E2E_PROFILE_TAGS="staging" \
          -e VARIABLES="user_id:${{ steps.extract.outputs.user_id }},api_token:${{ steps.extract.outputs.api_token }}" \
          public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
    - name: Upload main test execution data  # ← New step
      uses: actions/upload-artifact@v4
      with:
        name: main-test-execution-data
        path: output/execution_data.json
        retention-days: 1
```
What this does:

- Downloads the artifact from the setup stage
- Uses `jq` to extract specific variables (like `user_id` and `api_token`)
- Passes them using `VARIABLES` in comma-separated format: `key1:value1,key2:value2`
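Because `VARIABLES` is a flat `key:value,key2:value2` string, values that themselves contain `,` or `:` would be ambiguous (an assumption about the parser, but worth guarding). A small Python helper can build the string and fail fast on such values; `build_variables` is illustrative, not a Kusho API:

```python
def build_variables(pairs):
    """Build the comma-separated VARIABLES string, rejecting values
    that contain the reserved separators (the flat format shows no
    escaping mechanism)."""
    parts = []
    for key, value in pairs.items():
        if "," in value or ":" in value:
            raise ValueError(f"{key} contains a reserved separator: {value!r}")
        parts.append(f"{key}:{value}")
    return ",".join(parts)

print(build_variables({"user_id": "usr_123", "api_token": "tok_abc"}))
# user_id:usr_123,api_token:tok_abc
```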
## Step 3: Add Teardown Stage

The teardown stage cleans up resources using data from both setup and main tests:

```yaml
teardown:
  runs-on: ubuntu-latest
  needs: [setup, main-tests]  # ← Wait for both
  if: always()  # ← Run even if tests fail
  steps:
    - name: Download setup data
      uses: actions/download-artifact@v4
      with:
        name: setup-execution-data
        path: ./setup-data
    - name: Download main test data
      uses: actions/download-artifact@v4
      with:
        name: main-test-execution-data
        path: ./main-data
    - name: Extract cleanup variables
      id: extract
      run: |
        # Get user_id from setup
        USER_ID=$(jq -r '.test_suites[0] | .. | .response?.data?["user_id"]? // empty' ./setup-data/execution_data.json | head -1)
        # Get order_id from main tests
        ORDER_ID=$(jq -r '.test_suites[0] | .. | .response?.data?["order_id"]? // empty' ./main-data/execution_data.json | head -1)
        echo "user_id=$USER_ID" >> $GITHUB_OUTPUT
        echo "order_id=$ORDER_ID" >> $GITHUB_OUTPUT
    - name: Run Teardown Tests
      run: |
        docker run --rm \
          -e BASE_URL="https://api.example.com" \
          -e ENVIRONMENT_ID="2" \
          -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
          -e CI_COMMIT_SHA="${{ github.sha }}" \
          -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
          -e E2E_TEST_SUITE_TAGS="teardown" \
          -e E2E_PROFILE_TAGS="staging" \
          -e VARIABLES="user_id:${{ steps.extract.outputs.user_id }},order_id:${{ steps.extract.outputs.order_id }}" \
          public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
```
What this does:

- Downloads artifacts from both the setup and main test stages
- Extracts variables from each
- Uses them to clean up resources (delete user, cancel order, etc.)
- Runs even if tests fail (thanks to `if: always()`)
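One consequence of `if: always()` worth planning for: if the main job fails before its upload step, its artifact may not exist, and a hard failure in the extract step would then skip cleanup entirely. A Python sketch of a fallback-friendly lookup (`safe_extract` is illustrative, not part of Kusho):

```python
import json
from pathlib import Path

def safe_extract(path, key, default=""):
    """Return the first suite-level variable named `key` from an
    execution_data.json, or `default` when the file is missing or
    the key absent - keeps teardown running even if an earlier
    stage never produced its artifact."""
    p = Path(path)
    if not p.exists():
        return default
    data = json.loads(p.read_text())
    for suite in data.get("test_suites", []):
        value = suite.get("variables", {}).get(key)
        if value is not None:
            return value
    return default
```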
## Using jq to Extract Data

`jq` is pre-installed on all GitHub runners, so you can use it directly. Here are common patterns:

### Extract from variables (simple)

```bash
# Extract a variable from the first test suite
jq -r '.test_suites[0].variables.user_id' execution_data.json
```
### Extract from anywhere in execution data (recursive search)

```bash
# Find user_id anywhere in the first test suite
jq -r '.test_suites[0] | .. | .variables?["user_id"]? // empty' execution_data.json | head -1

# Find from response headers
jq -r '.test_suites[0] | .. | .headers?["Api-Key-Pre-Run"]? // empty' execution_data.json | head -1

# Find from response body
jq -r '.test_suites[0] | .. | .response?.data?.user_id? // empty' execution_data.json | head -1
```
### Extract multiple values at once

```bash
USER_ID=$(jq -r '.test_suites[0] | .. | .variables?["user_id"]? // empty' execution_data.json | head -1)
ORDER_ID=$(jq -r '.test_suites[0] | .. | .variables?["order_id"]? // empty' execution_data.json | head -1)
SESSION_TOKEN=$(jq -r '.test_suites[0] | .. | .variables?["session_token"]? // empty' execution_data.json | head -1)
```

The `| head -1` is important: it takes only the first match when using recursive search (`..`).
## Variable Passing Options

You have two ways to pass variables to Kusho:

### Option 1: Comma-separated (VARIABLES)

Best for passing multiple variables at once:

```bash
-e VARIABLES="user_id:usr_123,api_token:tok_abc,order_id:ord_456"
```
### Option 2: Individual env vars (VARIABLE_*)

Best for secrets or when you want explicit control:

```bash
-e VARIABLE_user_id="usr_123" \
-e VARIABLE_api_token="tok_abc" \
-e VARIABLE_order_id="ord_456"
```

You can mix both! Individual `VARIABLE_*` env vars take priority over `VARIABLES`.
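To make the precedence concrete, here's a small Python model of how the two mechanisms combine - a sketch of the documented behaviour, not the runner's actual implementation:

```python
def resolve_variables(env):
    """Apply the comma-separated VARIABLES string first, then let any
    VARIABLE_* env vars override individual keys (matching the
    documented priority)."""
    merged = {}
    for pair in filter(None, env.get("VARIABLES", "").split(",")):
        key, _, value = pair.partition(":")
        merged[key] = value
    for name, value in env.items():
        if name.startswith("VARIABLE_"):
            merged[name[len("VARIABLE_"):]] = value
    return merged

env = {"VARIABLES": "user_id:usr_123,api_token:tok_abc",
       "VARIABLE_api_token": "tok_override"}
print(resolve_variables(env))
# {'user_id': 'usr_123', 'api_token': 'tok_override'}
```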
## Complete Example Workflow

Here's a full workflow showing all three stages with tag-based execution:

```yaml
name: E2E Tests with Setup and Teardown

on: [push]

env:
  BASE_URL: "https://api.example.com"
  ENVIRONMENT_ID: "2"

jobs:
  setup:
    runs-on: ubuntu-latest
    steps:
      - name: Run Setup Tests
        run: |
          docker run --rm \
            -v $(pwd)/output:/app/output \
            -e BASE_URL="${{ env.BASE_URL }}" \
            -e ENVIRONMENT_ID="${{ env.ENVIRONMENT_ID }}" \
            -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
            -e CI_COMMIT_SHA="${{ github.sha }}" \
            -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
            -e E2E_TEST_SUITE_TAGS="setup" \
            -e E2E_PROFILE_TAGS="staging" \
            public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
      - name: Upload execution data
        uses: actions/upload-artifact@v4
        with:
          name: setup-execution-data
          path: output/execution_data.json
          retention-days: 1

  main-tests:
    runs-on: ubuntu-latest
    needs: setup
    steps:
      - name: Download setup data
        uses: actions/download-artifact@v4
        with:
          name: setup-execution-data
          path: ./setup-data
      - name: Extract setup variables
        id: extract
        run: |
          USER_ID=$(jq -r '.test_suites[0] | .. | .variables?["user_id"]? // empty' ./setup-data/execution_data.json | head -1)
          API_TOKEN=$(jq -r '.test_suites[0] | .. | .variables?["api_token"]? // empty' ./setup-data/execution_data.json | head -1)
          echo "Extracted user_id: $USER_ID"
          echo "Extracted api_token: ${API_TOKEN:0:10}..."
          echo "user_id=$USER_ID" >> $GITHUB_OUTPUT
          echo "api_token=$API_TOKEN" >> $GITHUB_OUTPUT
      - name: Run Main Tests
        run: |
          docker run --rm \
            -v $(pwd)/output:/app/output \
            -e BASE_URL="${{ env.BASE_URL }}" \
            -e ENVIRONMENT_ID="${{ env.ENVIRONMENT_ID }}" \
            -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
            -e CI_COMMIT_SHA="${{ github.sha }}" \
            -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
            -e E2E_TEST_SUITE_TAGS="smoke,regression" \
            -e E2E_PROFILE_TAGS="staging" \
            -e VARIABLES="user_id:${{ steps.extract.outputs.user_id }},api_token:${{ steps.extract.outputs.api_token }}" \
            public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
      - name: Upload main test data
        uses: actions/upload-artifact@v4
        with:
          name: main-test-execution-data
          path: output/execution_data.json
          retention-days: 1

  teardown:
    runs-on: ubuntu-latest
    needs: [setup, main-tests]
    if: always()
    steps:
      - name: Download all execution data
        uses: actions/download-artifact@v4
        with:
          pattern: '*-execution-data'
          path: ./data
      - name: Extract cleanup variables
        id: extract
        run: |
          USER_ID=$(jq -r '.test_suites[0] | .. | .variables?["user_id"]? // empty' ./data/setup-execution-data/execution_data.json | head -1)
          ORDER_ID=$(jq -r '.test_suites[0] | .. | .variables?["order_id"]? // empty' ./data/main-test-execution-data/execution_data.json | head -1)
          echo "Cleaning up: user_id=$USER_ID, order_id=$ORDER_ID"
          echo "user_id=$USER_ID" >> $GITHUB_OUTPUT
          echo "order_id=$ORDER_ID" >> $GITHUB_OUTPUT
      - name: Run Teardown Tests
        run: |
          docker run --rm \
            -e BASE_URL="${{ env.BASE_URL }}" \
            -e ENVIRONMENT_ID="${{ env.ENVIRONMENT_ID }}" \
            -e API_KEY="${{ secrets.KUSHO_API_KEY }}" \
            -e CI_COMMIT_SHA="${{ github.sha }}" \
            -e CI_COMMIT_MESSAGE="${{ github.event.head_commit.message }}" \
            -e E2E_TEST_SUITE_TAGS="teardown" \
            -e E2E_PROFILE_TAGS="staging" \
            -e VARIABLES="user_id:${{ steps.extract.outputs.user_id }},order_id:${{ steps.extract.outputs.order_id }}" \
            public.ecr.aws/y5g4u6y7/kusho-test-runner:latest
```
## Example: Extracting from Different Locations

Here are real-world examples of extracting data from different parts of the execution data:

### From variables (most common)

```bash
# User ID created during setup
USER_ID=$(jq -r '.test_suites[0].variables.user_id' ./setup-data/execution_data.json)
```

### From response body

```bash
# Order ID from an API response
ORDER_ID=$(jq -r '.test_suites[0] | .. | .response?.data?.order_id? // empty' ./main-data/execution_data.json | head -1)
```

### From response headers

```bash
# Session token from a response header
SESSION_TOKEN=$(jq -r '.test_suites[0] | .. | .response?.headers?["X-Session-Token"]? // empty' ./setup-data/execution_data.json | head -1)
```

### From request headers (for verification)

```bash
# API key that was sent in the request
API_KEY=$(jq -r '.test_suites[0] | .. | .request?.headers?["Api-Key-Pre-Run"]? // empty' ./setup-data/execution_data.json | head -1)
```
## Key Points to Remember

- Always add the volume mount `-v $(pwd)/output:/app/output` to get the execution data file
- Upload artifacts after each stage that produces data you need later
- Download artifacts at the start of stages that need data from previous stages
- Use `jq` to extract values (it's pre-installed, no setup needed)
- Use `| head -1` to get just the first match when using recursive search (`..`)
- Pass variables using the `-e VARIABLES="key:value,key2:value2"` format
- Use `if: always()` so teardown runs even when tests fail
- Use tags to organize your test suites: `setup`, `smoke`, `regression`, `teardown`
## Execution Data Structure

Here's the structure of the `execution_data.json` file that gets generated. This will help you understand what data is available for extraction.

### For Tag-Based Execution (Multiple Test Suites)
```json
{
  "execution_timestamp": "2025-12-24T10:25:34.601095Z",
  "execution_type": "tags",
  "tags": {
    "test_suite_tags": ["setup", "smoke"],
    "profile_tags": ["staging"]
  },
  "overall_status": "PASS",
  "summary": {
    "total_test_suites": 2,
    "passed_test_suites": 2,
    "failed_test_suites": 0,
    "total_combinations": 5,
    "passed_combinations": 5,
    "failed_combinations": 0,
    "total_tests": 15,
    "passed_tests": 14,
    "failed_tests": 1
  },
  "test_suites": [
    {
      "test_suite_uuid": "abc-123-def-456",
      "execution_profile_uuid": "profile-789",
      "workflow": {
        "uuid": "workflow-uuid",
        "name": "User Registration Flow"
      },
      "execution_profile": {
        "profile_name": "Staging Environment"
      },
      "status": "PASS",
      "summary": {
        "total_combinations": 2,
        "passed_combinations": 2,
        "failed_combinations": 0,
        "total_tests": 6,
        "passed_tests": 6,
        "failed_tests": 0
      },
      "variables": {
        "user_id": "usr_12345",
        "api_token": "tok_abcdef123456",
        "email": "test.user@example.com",
        "base_url": "https://api.staging.example.com"
      },
      "combination_results": [
        {
          "combination_id": 1,
          "combination_passed": true,
          "steps": [
            {
              "test_suite_id": 1001,
              "test_suite_name": "Create User",
              "test_case_id": 5001,
              "test_case_desc": "Create user with valid data",
              "method": "POST",
              "url": "https://api.staging.example.com/users",
              "status_code": 201,
              "assertion_status": "pass",
              "response_time": 245,
              "error": null,
              "executed_data": {
                "request": {
                  "method": "POST",
                  "url": "https://api.staging.example.com/users",
                  "headers": {
                    "Content-Type": "application/json",
                    "X-API-Key": "staging_key_xyz",
                    "Api-Key-Pre-Run": "value_from_pre_run_script"
                  },
                  "params": {},
                  "data": {
                    "firstName": "John",
                    "lastName": "Doe",
                    "email": "john.doe@example.com"
                  }
                },
                "response": {
                  "headers": {
                    "Content-Type": "application/json",
                    "X-Request-Id": "req_abc123"
                  },
                  "data": {
                    "success": true,
                    "user": {
                      "id": "usr_12345",
                      "firstName": "John",
                      "lastName": "Doe",
                      "email": "john.doe@example.com",
                      "createdAt": "2025-12-24T10:25:15Z"
                    },
                    "token": "tok_abcdef123456"
                  },
                  "status": 201,
                  "statusText": "Created"
                },
                "assertions": [
                  {
                    "assertion": "expect(response.statusCode).to.equal(201);",
                    "status": true,
                    "message": "Assertion passed",
                    "assertion_desc": "Status code should be 201"
                  },
                  {
                    "assertion": "expect(response.response.user).to.have.property('id');",
                    "status": true,
                    "message": "Assertion passed",
                    "assertion_desc": "Response should contain user ID"
                  }
                ],
                "error": null,
                "responseTime": 245
              }
            },
            {
              "test_suite_id": 1002,
              "test_suite_name": "Get User",
              "test_case_id": 5002,
              "test_case_desc": "Retrieve created user",
              "method": "GET",
              "url": "https://api.staging.example.com/users/usr_12345",
              "status_code": 200,
              "assertion_status": "pass",
              "response_time": 123,
              "error": null,
              "executed_data": {
                "request": {
                  "method": "GET",
                  "url": "https://api.staging.example.com/users/usr_12345",
                  "headers": {
                    "Authorization": "Bearer tok_abcdef123456",
                    "Content-Type": "application/json"
                  }
                },
                "response": {
                  "headers": {
                    "Content-Type": "application/json"
                  },
                  "data": {
                    "id": "usr_12345",
                    "firstName": "John",
                    "lastName": "Doe",
                    "email": "john.doe@example.com"
                  },
                  "status": 200,
                  "statusText": "OK"
                },
                "assertions": [
                  {
                    "assertion": "expect(response.statusCode).to.equal(200);",
                    "status": true,
                    "message": "Assertion passed",
                    "assertion_desc": "Status code should be 200"
                  }
                ],
                "error": null,
                "responseTime": 123
              }
            }
          ]
        }
      ]
    },
    {
      "test_suite_uuid": "xyz-789-ghi-012",
      "workflow": {
        "name": "Order Processing Flow"
      },
      "status": "PASS",
      "variables": {
        "order_id": "ord_67890",
        "user_id": "usr_12345"
      },
      "combination_results": [
        {
          "combination_id": 1,
          "combination_passed": true,
          "steps": [
            {
              "test_suite_name": "Create Order",
              "executed_data": {
                "response": {
                  "data": {
                    "order_id": "ord_67890",
                    "status": "pending"
                  }
                }
              }
            }
          ]
        }
      ]
    }
  ]
}
```
### For Single Test Suite Execution
```json
{
  "execution_timestamp": "2025-12-24T10:25:34.601095Z",
  "execution_type": "single",
  "test_suite_uuid": "abc-123-def-456",
  "execution_profile_uuid": "profile-789",
  "workflow": {
    "uuid": "workflow-uuid",
    "name": "User Registration Flow"
  },
  "execution_profile": {
    "profile_name": "Staging Environment"
  },
  "status": "PASS",
  "summary": {
    "total_combinations": 2,
    "passed_combinations": 2,
    "failed_combinations": 0,
    "total_tests": 6,
    "passed_tests": 6,
    "failed_tests": 0
  },
  "variables": {
    "user_id": "usr_12345",
    "api_token": "tok_abcdef123456",
    "email": "test.user@example.com"
  },
  "combination_results": [
    {
      "combination_id": 1,
      "combination_passed": true,
      "steps": [
        {
          "test_suite_name": "Create User",
          "executed_data": {
            "request": {
              "method": "POST",
              "headers": {
                "Api-Key-Pre-Run": "value_from_pre_run"
              }
            },
            "response": {
              "data": {
                "user": {
                  "id": "usr_12345"
                }
              }
            }
          }
        }
      ]
    }
  ]
}
```
## Key Data Locations for Extraction

The recursive descent operator (`..`) lets you search for values anywhere within a nested structure without knowing the exact path. The most useful locations, as shown in the structures above, are `variables` (suite-level variables), `executed_data.response.data` (response bodies), and `executed_data.response.headers` / `executed_data.request.headers` (response and request headers).

That's it! Your workflow will now have a proper setup → test → teardown flow with data passing between stages. 🎉