[Update] AWS Transform for mainframe has been expanded: I tried the automatic test generation feature #AWSreInvent

2025.12.05


I'm Iwasa.

In a previous article, I used AWS Transform (formerly the Amazon Q Developer transformation capability) to create specifications from mainframe application code and to modernize that code.

https://dev.classmethod.jp/articles/q-developer-transformation-mainframe/

At AWS re:Invent 2025, an expansion of AWS Transform's mainframe capabilities was announced, similar in approach to the existing .NET support.

https://aws.amazon.com/about-aws/whats-new/2025/12/transform-mainframe-application-reimagining/

For mainframes, two new features were added: AI-powered architecture reimagining and automatic test script creation.
In this article, I'll demonstrate the test generation for COBOL code.

Creating a Job

I'll skip the basic steps for starting AWS Transform; they are similar to the .NET and Custom job types.

https://dev.classmethod.jp/articles/transform-ai-agent-full-stack-windows-modernization/

https://dev.classmethod.jp/articles/transform-net-transformation-developer-experience/

When creating a new job, you can specify the job type - here, select "Mainframe Modernization".

[Screenshot: selecting the "Mainframe Modernization" job type]

Next, you can choose from several job plan templates within the mainframe modernization category.
I entered 6 since I wanted to generate tests.

[Screenshot: job plan template selection]

This expands the job plan in the side menu.
The job plan flow is as follows:

[Screenshot: job plan flow]

It analyzes the application code and then creates test cases and scripts.
The first step in the job plan is "Kickoff," which handles the S3 buckets for input/output objects and establishes connections with your AWS account.

[Screenshot: Kickoff step]

The application code needs to be zipped and uploaded to an S3 bucket.
Individual source files won't work, and unlike the .NET job type, Git repositories aren't supported. A minimal upload sketch follows the screenshot below.

[Screenshot: specifying the S3 location of the zipped application code]
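If you prefer to script this step, here is a minimal sketch using boto3, assuming a local directory of COBOL sources and a bucket you have already created. The directory, bucket, and key names are placeholders, not values from the actual job.

```python
import zipfile
from pathlib import Path

import boto3

SOURCE_DIR = Path("cobol-app")        # hypothetical local directory of COBOL sources
ARCHIVE = Path("cobol-app.zip")
BUCKET = "my-transform-input-bucket"  # hypothetical input bucket
KEY = "input/cobol-app.zip"

# Zip the whole source tree; the job expects a single archive, not loose files.
with zipfile.ZipFile(ARCHIVE, "w", zipfile.ZIP_DEFLATED) as zf:
    for path in SOURCE_DIR.rglob("*"):
        if path.is_file():
            zf.write(path, path.relative_to(SOURCE_DIR))

# Upload the archive to the S3 location referenced in the job's Kickoff step.
boto3.client("s3").upload_file(str(ARCHIVE), BUCKET, KEY)
print(f"Uploaded s3://{BUCKET}/{KEY}")
```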

Application Analysis

Once the S3 bucket is configured and the application is uploaded, the process begins with code analysis and business logic extraction.

[Screenshot: code analysis and business logic extraction steps]

First, you specify the target files for the application.

[Screenshot: selecting the target files]

After code analysis, there's a step to extract business logic.
You decide the scope (application level, file level) and select the target files.

[Screenshot: business logic extraction settings]

Next, you break down the domains.
You define domains in advance and assign logic to them. For this demonstration, I created a generic domain and placed all extraction targets in it.

[Screenshots: defining a domain and assigning the extracted logic to it]

Creating Test Cases

Now we move to test case creation.
First is the test plan creation.
The system automatically defines the test scope and other parameters, so you can review each file as you go and continue once everything looks good.

[Screenshots: reviewing the generated test plan]

Next, it also creates scripts to prepare test data.
You select the target test cases and specify the S3 path where the test data generation script templates are stored. Templates can be downloaded from the "download samples" link; in this case, I uploaded them to S3 as is.

[Screenshot: test data script generation settings]

Once the data collection script is generated, you should review it. By default, the preview may fail because of S3 CORS restrictions, so you may need to adjust your S3 bucket's CORS settings; a minimal sketch follows.
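For reference, here is a minimal sketch of loosening the bucket's CORS settings with boto3 so the browser-based console can preview the object. The bucket name and the wildcard origin are assumptions made to keep the example short; in practice, restrict AllowedOrigins to the actual console domain.

```python
import boto3

BUCKET = "my-transform-input-bucket"  # hypothetical bucket holding the generated scripts

# Assumed CORS rule: allow browser GET/HEAD requests for object preview.
# "*" is used only to keep the sketch short; narrow AllowedOrigins in real environments.
cors_configuration = {
    "CORSRules": [
        {
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedOrigins": ["*"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

boto3.client("s3").put_bucket_cors(Bucket=BUCKET, CORSConfiguration=cors_configuration)
```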

[Screenshots: previewing and reviewing the generated test data script]

Finally, it generates test automation scripts, completing the process.

[Screenshot: test automation script generation]

In the end, everything from execution scripts to test cases and test data is output to the specified S3 bucket.
You can then use these for testing.
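As a reference for pulling those deliverables down, here is a minimal boto3 sketch that downloads everything under an output prefix. The bucket and prefix names are placeholders for whatever you configured in the job, not values produced by AWS Transform itself.

```python
from pathlib import Path

import boto3

BUCKET = "my-transform-output-bucket"  # hypothetical output bucket configured for the job
PREFIX = "test-generation/"            # hypothetical output prefix
DEST = Path("transform-output")

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Walk every object under the prefix and mirror it into a local directory.
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):  # skip folder marker objects
            continue
        target = DEST / key
        target.parent.mkdir(parents=True, exist_ok=True)
        s3.download_file(BUCKET, key, str(target))
        print(f"downloaded s3://{BUCKET}/{key}")
```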

Conclusion

Today I tried out the automatic test generation feature that was newly added to AWS Transform for mainframe.

Unlike the recent Windows full-stack experience, this workflow takes its input code from an S3 bucket and writes its deliverables back to an S3 bucket, which reduces the risk of touching existing environments or repositories and makes it easier to try out.
Also, while AWS Transform sometimes runs into errors or takes a long time, I found this feature relatively quick and easy to use.
