# Report Structure

![Example Report](./img-5.png)

## Overview
The Kusho test report follows a structured format designed to provide clear visibility into API test execution results. This document outlines the key components and organization of the test reports.

## Report Components

### 1. Header Summary
The report begins with a summary table containing the following metrics:

| Field        | Description                                  |
|--------------|----------------------------------------------|
| # Total      | Total number of test cases in the suite.     |
| # Executed   | Number of tests that were executed.          |
| # Passed     | Number of successful test executions.        |
| # Failed     | Number of failed test executions.            |
| Time (ms)    | Total execution time in milliseconds.        |
| Executed At  | Timestamp of execution in UTC format.        |
| Executed By  | Email address of the test executor.          |
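The header metrics can be modeled as a simple record. A minimal sketch in Python, using illustrative field names (the actual report schema may differ):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReportSummary:
    """Header metrics of a test report; field names are illustrative."""
    total: int             # Total
    executed: int          # Executed
    passed: int            # Passed
    failed: int            # Failed
    time_ms: int           # Time (ms)
    executed_at: datetime  # Executed At (UTC)
    executed_by: str       # Executed By (email)

    def pass_rate(self) -> float:
        """Fraction of executed tests that passed (0.0 if none ran)."""
        return self.passed / self.executed if self.executed else 0.0

summary = ReportSummary(
    total=12, executed=12, passed=11, failed=1,
    time_ms=4380,
    executed_at=datetime(2024, 5, 1, 10, 30, tzinfo=timezone.utc),
    executed_by="qa@example.com",
)
print(f"{summary.pass_rate():.0%}")  # → 92%
```

Deriving the pass rate from Executed rather than Total keeps the metric meaningful when some tests are skipped.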

### 2. Test Suite Information
Following the header, the report displays the test suite name and purpose, e.g., "Test suite - Database Content Retrieval".

### 3. Detailed Test Results Table
The main body of the report consists of a detailed table with the following columns:

| Column            | Description                                 |
|-------------------|---------------------------------------------|
| No.               | Sequential test case number.               |
| UUID              | Unique identifier for each test case.      |
| Test Case         | Description of the test scenario.          |
| Result            | Test outcome (PASS/FAIL).                  |
| Status Code       | HTTP response code received.               |
| Failed Assertions | Details of any failed assertions (N/A for passed tests).|
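Each row of the results table maps naturally to a record with one field per column. A hedged sketch, again with hypothetical names, including a helper that renders a row in the table layout above:

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    """One row of the detailed results table; names are illustrative."""
    number: int
    uuid: str
    test_case: str
    result: str                       # "PASS" or "FAIL"
    status_code: int                  # HTTP response code
    failed_assertions: str = "N/A"    # "N/A" for passed tests

def to_markdown_row(r: TestResult) -> str:
    """Render a result as a Markdown table row."""
    return (f"| {r.number} | {r.uuid} | {r.test_case} "
            f"| {r.result} | {r.status_code} | {r.failed_assertions} |")

row = TestResult(1, "a1b2c3d4", "GET /items returns 200", "PASS", 200)
print(to_markdown_row(row))
# → | 1 | a1b2c3d4 | GET /items returns 200 | PASS | 200 | N/A |
```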

## Result Indicators

### Status Codes
- Common HTTP status codes in the Status Code column:
  - 200: OK (the request succeeded).
  - 401: Unauthorized (missing or invalid credentials).
  - 404: Not Found.
  - 500: Internal Server Error.

### Visual Indicators
- PASS results are highlighted in green.
- FAIL results, if any, are highlighted distinctly so failures stand out at a glance.
- N/A in the Failed Assertions column indicates a successful test.

## Best Practices

### Report Generation
1. Generate reports after each test suite execution.
2. Include a timestamp and test suite name in the report filename.
3. Store reports in a designated directory structure.
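Steps 2 and 3 can be combined into a small naming helper. The following sketch assumes a hypothetical `reports/<suite>/` directory layout and an `.html` report format; adjust both to your setup:

```python
from datetime import datetime, timezone
from pathlib import Path
import re

def report_path(suite_name: str, executed_at: datetime) -> Path:
    """Build a report path that embeds the suite name and UTC timestamp."""
    # Slugify the suite name so it is filesystem-safe.
    slug = re.sub(r"[^A-Za-z0-9]+", "-", suite_name).strip("-").lower()
    # Compact ISO-8601-style UTC stamp keeps filenames sortable.
    stamp = executed_at.strftime("%Y%m%dT%H%M%SZ")
    return Path("reports") / slug / f"{slug}_{stamp}.html"

p = report_path("Database Content Retrieval",
                datetime(2024, 5, 1, 10, 30, 0, tzinfo=timezone.utc))
print(p.as_posix())
# → reports/database-content-retrieval/database-content-retrieval_20240501T103000Z.html
```

Sorting such filenames lexicographically also sorts them chronologically, which simplifies finding the latest run.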

### Report Analysis
1. Review the summary metrics first.
2. Investigate any failed tests immediately.
3. Document unusual patterns or recurring issues.
4. Track execution times for performance monitoring.
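The first-pass review described above can be automated as a quick triage function. A minimal sketch, assuming summary keys mirroring the header fields and an illustrative execution-time budget:

```python
def analyze(summary: dict) -> list[str]:
    """Return human-readable findings from report summary metrics.

    Keys mirror the report header fields; the 60 s budget is illustrative.
    """
    findings = []
    if summary["failed"] > 0:
        findings.append(f"{summary['failed']} test(s) failed - investigate immediately")
    if summary["executed"] < summary["total"]:
        skipped = summary["total"] - summary["executed"]
        findings.append(f"{skipped} test(s) were not executed")
    if summary["time_ms"] > 60_000:  # illustrative performance budget
        findings.append("suite exceeded the 60 s execution-time budget")
    return findings

print(analyze({"total": 12, "executed": 12, "failed": 1, "time_ms": 4380}))
# → ['1 test(s) failed - investigate immediately']
```

Running such a check after every suite execution turns the manual review steps into a repeatable gate.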
