## Overview

### What Is a Processor?

A processor is a core component of the Aptos Indexer that handles blockchain transaction processing. It validates, transforms, and stores transactions into a database, enabling downstream applications like analytics, indexing, and querying. Testing the processor ensures that all transactions are correctly handled, maintaining data accuracy and consistency.
### What Are We Testing With This?
- Transaction correctness: Ensure that each transaction is processed and stored accurately.
- Schema consistency: Verify that the database schema is correctly set up and maintained throughout the tests.
### General Flow of How Processor Testing Works

- Prepare testing transactions (refer to the prior documentation).
- Update dependencies as needed.
- Import new transactions.
- Write test cases.
- Generate the expected database output and validate.
- Merge.
## Prerequisites

Key considerations:

- Each test runs in an isolated environment using a PostgreSQL container to prevent interference.
- Proper handling of versions ensures transactions are processed and validated in the correct order.
- Validation logic must detect changes or issues by comparing processor output with the expected baseline.
- Ensure Docker is running for PostgreSQL container support:
  - Set up the Docker engine/daemon on your machine.
  - Start Docker if it is not running.
- Identify the transactions to test:
  - Use imported transactions or write your own custom Move scripts to generate test transactions. Refer to the Importing Transaction Guide and the Generating Transactions Using Move Scripts Guide for detailed instructions.
- Import the necessary modules, for example:

```rust
use aptos_indexer_testing_framework::{
    database::{PostgresTestDatabase, TestDatabase},
    sdk_test_context::SdkTestContext,
};
```
## Steps to Write a Test

### 1. Set Up the Test Environment

Before setting up the test environment, it's important to understand the configurations being used in this step.
#### What Are These Configurations?

- `generate_file_flag`: Determines whether the test runs in "diff mode." In diff mode, the system compares the actual output from the processor with the expected output and highlights differences. Use this mode when validating changes or debugging discrepancies.
- `custom_output_path`: An optional configuration that specifies a custom path where the expected database output will be stored. If not provided, the test uses the default path defined by `DEFAULT_OUTPUT_FOLDER`.
- `DEFAULT_OUTPUT_FOLDER`: A constant defining the default folder where the system stores test output files (for example, `"sdk_expected_db_output_files"`). Modify this value in your configuration if you prefer a different default directory.
```rust
let (generate_file_flag, custom_output_path) = get_test_config();
let output_path = custom_output_path
    .unwrap_or_else(|| format!("{}/imported_mainnet_txns", DEFAULT_OUTPUT_FOLDER));

// Set up the DB; replace as needed.
let mut db = PostgresTestDatabase::new();
db.setup().await.unwrap();

// Replace with your test transaction constant(s).
let mut test_context = SdkTestContext::new(&[CONST_VARIABLE_OF_YOUR_TEST_TRANSACTION]);
if test_context.init_mock_grpc().await.is_err() {
    panic!("Failed to initialize mock gRPC");
}
```
Explanation of each component:

- `get_test_config()`: Fetches the test configurations (`generate_file_flag` and `custom_output_path`). Modify or extend this function if you want to support additional custom flags or configurations.
- `output_path`: Combines `DEFAULT_OUTPUT_FOLDER` with the subfolder `imported_mainnet_txns` when no `custom_output_path` is specified. This ensures all output files are stored in a predictable location.
- `PostgresTestDatabase::new()`: Creates a new PostgreSQL database instance for testing. This database is isolated, ensuring no interference with production or other test environments.
- `SdkTestContext::new()`: Initializes the test context with the transaction(s) you want to test. Replace `CONST_VARIABLE_OF_YOUR_TEST_TRANSACTION` with the appropriate variable or constant representing the transaction(s) to be tested.
- `init_mock_grpc()`: Initializes a mock gRPC service for the test. This allows the processor to consume transactions without interacting with live blockchain data.
### 2. Configure the Processor
```rust
let db_url = db.get_db_url();
let transaction_stream_config = test_context.create_transaction_stream_config();

let postgres_config = PostgresConfig {
    connection_string: db_url.to_string(),
    db_pool_size: 100,
};
let db_config = DbConfig::PostgresConfig(postgres_config);

let default_processor_config = DefaultProcessorConfig {
    per_table_chunk_sizes: AHashMap::new(),
    channel_size: 100,
    deprecated_tables: HashSet::new(),
};
let processor_config = ProcessorConfig::DefaultProcessor(default_processor_config);
let processor_name = processor_config.name();
```
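Step 3 below passes an `indexer_processor_config` to the processor constructor. Assuming the SDK exposes an `IndexerProcessorConfig` struct that bundles the pieces configured above, its assembly might look roughly like the following sketch (the struct and field names are illustrative — verify them against your SDK version):

```rust
// Hypothetical assembly — check IndexerProcessorConfig in your version of the
// processor SDK for the actual fields it requires.
let indexer_processor_config = IndexerProcessorConfig {
    processor_config,
    transaction_stream_config,
    db_config,
    // ... any remaining fields (e.g. backfill settings) your SDK requires
};
```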
### 3. Create the Processor
```rust
let processor = DefaultProcessor::new(indexer_processor_config)
    .await
    .expect("Failed to create processor");
```

Note: Replace `DefaultProcessor` with the processor you are testing.
### 4. Set Up a Query

Set up a query that loads data from the local database so it can be compared with the expected results (see the example loading function).
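The shape of such a loading function might look like the following sketch (hypothetical — `load_data`, the tables it queries, and its return type should match whatever your processor's schema and validation need):

```rust
// Hypothetical sketch — adapt the queried tables to your processor's schema.
fn load_data(
    conn: &mut PgConnection,
    txn_versions: Vec<i64>,
) -> anyhow::Result<HashMap<String, Value>> {
    // For each table the processor writes to, query the rows at `txn_versions`,
    // serialize them to JSON, and key the result map by table name so the
    // diff covers every table.
    // ...
}
```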
### 5. Set Up the Test Context `run` Function

Use the `test_context.run()` function to execute the processor, validate outputs using your query, and optionally generate database output files:
```rust
let txn_versions: Vec<i64> = test_context
    .get_test_transaction_versions()
    .into_iter()
    .map(|v| v as i64)
    .collect();

let db_values = test_context
    .run(
        &processor,
        generate_file_flag,
        output_path.clone(),
        custom_file_name,
        move || {
            let mut conn = PgConnection::establish(&db_url).unwrap_or_else(|e| {
                eprintln!("[ERROR] Failed to establish DB connection: {:?}", e);
                panic!("Failed to establish DB connection: {:?}", e);
            });

            let db_values = match load_data(&mut conn, txn_versions.clone()) {
                Ok(db_data) => db_data,
                Err(e) => {
                    eprintln!("[ERROR] Failed to load data: {}", e);
                    return Err(e);
                },
            };

            if db_values.is_empty() {
                eprintln!("[WARNING] No data found for versions: {:?}", txn_versions);
            }

            Ok(db_values)
        },
    )
    .await
    .expect("Failed to run processor test");
```
### 6. Run the Processor Test

Once your test is ready, run the following command to generate the expected output for validation:

```shell
cargo test sdk_tests -- generate-output
```

Arguments:

- `generate-output`: A custom flag indicating that expected outputs should be generated.
- `output-path`: An optional argument specifying where the database output is written.

The expected database output will be saved in the specified `output_path`, or in `sdk_expected_db_output_files` by default.
## FAQ

### What Types of Tests Does It Support?

- Database schema output diff.

### What Is `TestContext`?

`TestContext` is a struct that manages:

- `transaction_batches`: A collection of transaction batches.
- `postgres_container`: A PostgreSQL container for test isolation.

It initializes and manages the database and transaction context for tests.
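Based on the fields described above, the struct can be pictured roughly as follows (field types are illustrative only — consult the testing framework source for the real definition):

```rust
// Illustrative shape only, not the framework's actual code.
pub struct TestContext {
    transaction_batches: Vec<Transaction>,    // the transactions under test
    postgres_container: PostgresTestDatabase, // isolated Postgres for this test
}
```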
### What Does `TestContext.run` Do?
This function executes the processor, applies validation logic, and optionally generates output files.
Key Features:
- Flexible Validation: Accepts a user-provided verification function.
- Multi-Table Support: Handles data across multiple tables.
- Retries: Uses exponential backoff and timeout for retries.
- Optional File Generation: Controlled by a flag.
Example signature:

```rust
pub async fn run<F>(
    &mut self,
    processor: &impl ProcessorTrait,
    txn_version: u64,
    generate_files: bool,             // Flag to control file generation
    output_path: String,              // Output path
    custom_file_name: Option<String>, // Custom file name
    verification_f: F,                // Verification function
) -> anyhow::Result<HashMap<String, Value>>
where
    // ...
```
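The framework's actual retry code is not shown here, but the general technique named above — exponential backoff with a bounded number of attempts — can be sketched in a self-contained form (this is an illustration of the pattern, not the framework's implementation):

```rust
use std::{thread, time::Duration};

/// Retry `op` until it succeeds or `max_attempts` is exhausted,
/// doubling the delay between attempts (exponential backoff).
fn retry_with_backoff<T, E>(
    mut op: impl FnMut() -> Result<T, E>,
    max_attempts: u32,
    initial_delay: Duration,
) -> Result<T, E> {
    let mut delay = initial_delay;
    let mut last_err = None;
    for _ in 0..max_attempts {
        match op() {
            Ok(value) => return Ok(value),
            Err(e) => {
                last_err = Some(e);
                thread::sleep(delay);
                delay *= 2; // back off exponentially before the next attempt
            }
        }
    }
    Err(last_err.expect("max_attempts must be greater than zero"))
}
```

In the processor test, the retried operation would be the verification closure, which fails until the processor has committed the expected rows.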
### How to Generate Expected DB Output?

Run the following command:

```shell
cargo test sdk_tests -- --nocapture generate-output
```

Supported test arguments:

- `generate-output`
- `output_path`
## Troubleshooting and Tips

- Isolate tests: Use Docker containers for database isolation.
- Handle non-deterministic fields: Use helpers like `remove_inserted_at` to clean up timestamps before validation.
- Enable debugging: Use `eprintln!` for detailed error logging.
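The `remove_inserted_at` helper lives in the framework, but the underlying idea — recursively stripping a non-deterministic column before comparison — can be shown with a self-contained sketch over a toy JSON type (standing in for the JSON values the real helper operates on):

```rust
use std::collections::BTreeMap;

/// A tiny JSON-like value, defined here only so the example is self-contained.
#[derive(Debug, PartialEq)]
enum Json {
    Str(String),
    Arr(Vec<Json>),
    Obj(BTreeMap<String, Json>),
}

/// Recursively drop a non-deterministic field (e.g. "inserted_at") so that
/// snapshot comparisons only see stable columns.
fn remove_field(value: &mut Json, field: &str) {
    match value {
        Json::Obj(map) => {
            map.remove(field);
            for v in map.values_mut() {
                remove_field(v, field);
            }
        }
        Json::Arr(items) => {
            for v in items {
                remove_field(v, field);
            }
        }
        Json::Str(_) => {}
    }
}
```

Run this over both the actual and expected rows before diffing, so timestamp churn never produces spurious failures.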
### How to Debug Test Failures?

Run the following command to get detailed logs:

```shell
cargo test sdk_tests -- --nocapture
```
## Additional Notes

- Adapting to other databases:
  - Replace PostgreSQL-specific code with code for the database you intend to use (e.g., MySQL).
  - Update schema initialization and query methods.
- Docker installation: Follow the Docker setup guide.
- Referencing existing tests:
  - Example: Event Processor Tests.