6 min read
Using AWS Step Functions For SAP Commerce Cloud Build And Deployment
22 November 2022
Eduardo Picado
Introduction
SAP Commerce Cloud V2 (SAP CCv2) is an e-commerce solution built mainly for large enterprises with advanced B2B, B2C and B2B2C use cases. It can be customised to manage even the most complex catalogues, products and configurations, enabling personalised omnichannel experiences.
Unfortunately, the usual process to bring SAP CCv2 code from version control to an application server is manual. The user must connect to SAP Cloud Portal and execute the following actions:
- Select a branch and trigger a build.
- Wait for the build completion.
- Select a target environment and start the build deployment.
- Wait for the deployment completion.
Builds created outside the SAP Cloud Portal cannot be deployed to a cloud instance. However, SAP CCv2 does provide a REST build API that allows us to automate this process.
In this blog, I will walk you through how my team and I created a single-command system to build and deploy to SAP CCv2 instances using AWS Step Functions. It is a convenient solution for developers that can also be integrated into CI/CD pipelines.
What is AWS Step Functions?
Before moving on to our solution, I think it is important to understand what AWS Step Functions does and why it proved so effective.
AWS Step Functions is a workflow service that developers can use to automate and orchestrate IT and business processes. Workflows manage failures and retries, parallelisation, service integrations and observability, which made this service a perfect fit for our application.
From Build to Deployment: A Solution Overview
Let’s look at the relevant SAP Build API methods that we used in this solution and their primary input and outputs:
| Method | Input | Output |
| --- | --- | --- |
| createBuild() | The Source Branch | A Build Code |
| getBuild() | The Build Code | A Build Status: BUILDING, DONE, FAILED |
| createDeployment() | The Build Code, The Environment Code | A Deployment Code |
| getDeployment() | The Deployment Code | A Deployment Status: PROGRESS, DONE, FAILED |
We have listed the methods in the same order in which the corresponding actions are performed in the Cloud Portal.
The createBuild and createDeployment methods are only called once, while their corresponding polling methods, getBuild and getDeployment, need to be called repeatedly until they return a final state: DONE or FAILED.
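For illustration, here is a minimal Python sketch (not part of the actual solution) of that create-then-poll pattern; it is exactly this loop that the state machine takes over, with its Wait and Choice states replacing the sleep and the status checks. The create_build and get_build callables stand in for the API wrappers shown later in this post:
# Illustrative sketch of the create-then-poll pattern that the state machine automates.
# create_build and get_build stand in for the API wrappers shown later in this post.
import time
from typing import Callable, Tuple

def build_and_wait(create_build: Callable[[], str],
                   get_build: Callable[[str], str],
                   poll_seconds: int = 120) -> Tuple[str, str]:
    build_code = create_build()           # createBuild is called only once
    while True:
        status = get_build(build_code)    # getBuild is polled repeatedly
        if status in ("DONE", "FAILED"):  # stop on a final state
            return build_code, status
        time.sleep(poll_seconds)          # wait before polling again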
We used AWS Step Functions to create a state machine that orchestrates those API calls, which we encapsulated in AWS Lambdas.
The state machine can easily be triggered:
- Manually by a developer, using the AWS Management Console, AWS CLI, or AWS plugins for VS Code, IntelliJ, etc.
- By a continuous integration process, for example, from CodePipeline or Jenkins

The Architecture of the Solution
Prerequisites of the Solution
- An AWS account and an AWS user or role with sufficient permissions to create the necessary resources.
- Access to AWS Step Functions and AWS Lambda.
- An SAP account with access to the SAP Cloud Portal and permission to use the build APIs for an application.
Creating AWS Lambdas
We started by creating AWS Lambda functions to call the SAP API methods: one Lambda function for each API method.
For example, to call the createBuild API method, we created the below createBuild Lambda in Python:
# Lambda function to call Commerce API createBuild
import requests
from requests.structures import CaseInsensitiveDict

def handler(event, context):
    # Read the build parameters from the state machine input
    token = event['token']
    subscription = event['subscription']
    application_code = event.get("applicationCode", "commerce-cloud")
    branch = event.get("branch", "develop")
    name = event['buildName']
    # Build the request for the SAP Commerce Cloud build API
    url = "https://portalrotapi.hana.ondemand.com/v2/subscriptions/" + subscription + "/builds"
    payload = {"applicationCode": application_code, "branch": branch, "name": name}
    headers = CaseInsensitiveDict()
    headers["Accept"] = "application/json"
    headers["Content-Type"] = "application/json"
    headers["Authorization"] = "Bearer " + token
    # Call the SAP API and return its response to the state machine
    response = requests.post(url, headers=headers, json=payload)
    return {
        'statusCode': response.status_code,
        'body': response.json()
    }
The code for getBuild, createDeployment and getDeployment is very similar to the createBuild code written above. Each of these Lambdas will:
- Read the Lambda input
- Create a payload for the call to the SAP API
- Call the SAP API and return its response
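As a rough illustration, here is what our getBuild Lambda might look like. This is a sketch rather than the exact code we ran, and it assumes the status endpoint follows the pattern /v2/subscriptions/{subscription}/builds/{buildCode}:
# Lambda function to call Commerce API getBuild (illustrative sketch)
import requests
from requests.structures import CaseInsensitiveDict

def handler(event, context):
    token = event['token']
    subscription = event['subscription']
    # The build code is taken from the Create Build task result (see the state machine definition below)
    build_code = event['CreateBuildResult']['buildCode']
    # Assumed endpoint pattern for reading a single build
    url = "https://portalrotapi.hana.ondemand.com/v2/subscriptions/" + subscription + "/builds/" + build_code
    headers = CaseInsensitiveDict()
    headers["Accept"] = "application/json"
    headers["Authorization"] = "Bearer " + token
    response = requests.get(url, headers=headers)
    return {
        'statusCode': response.status_code,
        'body': response.json()
    }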
We also added a Lambda layer to make the requests library available to these functions.
Creating a State Machine
Now that we have the four Lambdas, we have to orchestrate them.
We created a Standard (default) state machine because our executions are expected to last longer than the five minutes allowed for Express Workflows.
A state machine has states and transitions. Our transitions were pretty straightforward. We used the four state types explained in the table below.
| State type | State definition | State instances created |
| --- | --- | --- |
| Task | Represents a single unit of work; in our case, invoking our Lambdas. | Create Build, Get Build, Create Deployment, Get Deployment, Get Final Deployment Status |
| Choice | Checks the result of previous states and adds branching logic. | Build Complete?, Deployment Complete? |
| Wait | Delays the state machine from continuing for a specified time. | Wait for Build, Wait for Deployment |
| Fail | Stops the execution of the state machine and marks it as a failure, unless it is caught by a Catch block. | Build Failed, Deployment Failed |
Defining the States of Our State Machine
We didn’t need to use Parallel, Map and Succeed states.
Input and Output Processing
A Step Functions state machine execution accepts JSON text as input and passes that input to the first state in the workflow. Each state then passes its JSON output as input to the next state.
Our Lambdas also had JSON input and output, as they returned the actual SAP API responses. To filter and control the flow of JSON from one state to another, and before and after each Lambda call, the Amazon States Language (ASL), the JSON-based language used to define state machines, provides the following fields:
- InputPath
- OutputPath
- ResultPath
- Parameters
- ResultSelector
To keep the process simple, I combined the original state machine input with the filtered response of each Lambda/API call. A final successful payload would look like this:
{
  "subscription": "",
  "token": "",
  "applicationCode": "commerce-cloud",
  "branch": "develop",
  "buildName": "develop-latest-test",
  "databaseUpdateMode": "NONE",
  "environmentCode": "D1",
  "strategy": "ROLLING_UPDATE",
  "CreateBuildResult": {
    "statusCode": 201,
    "buildCode": "20220110.8"
  },
  "GetBuildResult": {
    "statusCode": 200,
    "buildStatus": "SUCCESS"
  },
  "CreateDeploymentResult": {
    "statusCode": 201,
    "deploymentCode": "48680"
  },
  "GetDeploymentResult": {
    "statusCode": 200,
    "deploymentStatus": "SUCCESS"
  }
}
Approaches to Create a State Machine
AWS Workflow Studio
The new AWS Workflow Studio, which is available on the AWS Console, is a visual way to create serverless workflows in Step Functions. To create state machines using the Workflow Studio, you can simply drag the states and connect them to implement the transitions.

Creating a Visual Workflow in AWS Workflow Studio
Amazon States Language (ASL)
You can also use ASL, which allows you to describe your state machine directly as JSON; the console then generates a visual diagram from your definition.

Creating a Visual Workflow through AWS States Language
AWS Cloud Development Kit (CDK)
If the above approaches do not work for you, you can also use the AWS Cloud Development Kit. I personally prefer using the CDK because it allows you to create efficient applications in the cloud with the power of a programming language. With the CDK, you can define the entire infrastructure — in this case, Lambdas and the state machine — in your favourite programming language.
Here’s what it looks like in TypeScript:
import * as cdk from 'aws-cdk-lib';
import {aws_lambda as lambda, Duration} from 'aws-cdk-lib';
import {aws_stepfunctions as sfn} from 'aws-cdk-lib';
import {aws_stepfunctions_tasks as tasks} from "aws-cdk-lib";

export class AwsSapCommerceDeploymentCdkStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // =====================================================================================
    // Build the AWS Lambda Function layer
    // =====================================================================================
    const layer = new lambda.LayerVersion(this, 'requests', {
      code: lambda.Code.fromAsset('requestslayer'),
      compatibleRuntimes: [lambda.Runtime.PYTHON_3_9],
      license: 'Apache-2.0',
      description: 'A layer to enable the requests library in our Lambdas',
    });

    // Lambdas:
    // =====================================================================================
    // Build the AWS Lambda Function getBuild
    // =====================================================================================
    const getBuildFn = new lambda.Function(this, 'getBuildFunction', {
      code: lambda.Code.fromAsset('getbuildlambda'),
      runtime: lambda.Runtime.PYTHON_3_9,
      handler: 'index.handler',
      timeout: cdk.Duration.seconds(30),
      memorySize: 128,
      layers: [layer],
    });

    // =====================================================================================
    // Build the AWS Lambda Function createBuild
    // =====================================================================================
    const createBuildFn = new lambda.Function(this, 'createBuildFunction', {
      code: lambda.Code.fromAsset('createbuildlambda'),
      runtime: lambda.Runtime.PYTHON_3_9,
      handler: 'index.handler',
      timeout: cdk.Duration.seconds(30),
      memorySize: 128,
      layers: [layer],
    });

    // =====================================================================================
    // Build the AWS Lambda Function createDeployment
    // =====================================================================================
    const createDeploymentFn = new lambda.Function(this, 'createDeploymentFunction', {
      code: lambda.Code.fromAsset('createdeploymentlambda'),
      runtime: lambda.Runtime.PYTHON_3_9,
      handler: 'index.handler',
      timeout: cdk.Duration.seconds(30),
      memorySize: 128,
      layers: [layer],
    });

    // =====================================================================================
    // Build the AWS Lambda Function getDeployment
    // =====================================================================================
    const getDeploymentFn = new lambda.Function(this, 'getDeploymentFunction', {
      code: lambda.Code.fromAsset('getdeploymentlambda'),
      runtime: lambda.Runtime.PYTHON_3_9,
      handler: 'index.handler',
      timeout: cdk.Duration.seconds(30),
      memorySize: 128,
      layers: [layer],
    });

    // Tasks:
    // Create build
    const createBuild = new tasks.LambdaInvoke(this, 'Create Build', {
      lambdaFunction: createBuildFn,
      resultSelector: {
        statusCode: sfn.JsonPath.stringAt('$.Payload.statusCode'),
        buildCode: sfn.JsonPath.stringAt('$.Payload.body.code'),
      },
      resultPath: "$.CreateBuildResult",
    });

    // Get build
    const getBuild = new tasks.LambdaInvoke(this, 'Get Build', {
      lambdaFunction: getBuildFn,
      resultSelector: {
        statusCode: sfn.JsonPath.stringAt('$.Payload.statusCode'),
        buildStatus: sfn.JsonPath.stringAt('$.Payload.body.status')
      },
      resultPath: "$.GetBuildResult",
    });

    // Create deployment
    const createDeployment = new tasks.LambdaInvoke(this, 'Create Deployment', {
      lambdaFunction: createDeploymentFn,
      resultSelector: {
        statusCode: sfn.JsonPath.stringAt('$.Payload.statusCode'),
        deploymentCode: sfn.JsonPath.stringAt('$.Payload.body.code')
      },
      resultPath: "$.CreateDeploymentResult",
    });

    // Get deployment
    const getDeployment = new tasks.LambdaInvoke(this, 'Get Deployment', {
      lambdaFunction: getDeploymentFn,
      resultSelector: {
        statusCode: sfn.JsonPath.stringAt('$.Payload.statusCode'),
        deploymentStatus: sfn.JsonPath.stringAt('$.Payload.body.status')
      },
      resultPath: "$.GetDeploymentResult",
    });

    // Get final deployment status
    const getFinalDeploymentStatus = new tasks.LambdaInvoke(this, 'Get Final Deployment Status', {
      lambdaFunction: getDeploymentFn,
      outputPath: "$.Payload"
    });

    // *Wait*:
    // Wait for build
    const waitForBuild = new sfn.Wait(this, 'Wait for Build', {
      time: sfn.WaitTime.duration(Duration.seconds(120)),
    });

    // Wait for deployment
    const waitForDeployment = new sfn.Wait(this, 'Wait for Deployment', {
      time: sfn.WaitTime.duration(Duration.seconds(120)),
    });

    // *Fail*:
    // Build failed
    const buildFailed = new sfn.Fail(this, 'Build Failed', {
      cause: 'Build Process Failed',
      error: 'DescribeJob returned FAILED',
    });

    // Deployment failed
    const deploymentFailed = new sfn.Fail(this, 'Deployment Failed', {
      cause: 'Deployment Process Failed',
      error: 'DescribeJob returned FAILED',
    });

    // *Choice*: inline in definitions
    // Build complete?
    // Deployment complete?
    const definition = createBuild
      .next(waitForBuild)
      .next(getBuild)
      .next(new sfn.Choice(this, 'Build Complete?')
        .when(sfn.Condition.numberGreaterThanEquals('$.GetBuildResult.statusCode', 400), buildFailed)
        .when(sfn.Condition.stringEquals('$.GetBuildResult.buildStatus', 'SUCCESS'), createDeployment)
        .otherwise(waitForBuild));

    createDeployment
      .next(waitForDeployment)
      .next(getDeployment)
      .next(new sfn.Choice(this, 'Deployment Complete?')
        .when(sfn.Condition.numberGreaterThanEquals('$.GetDeploymentResult.statusCode', 400), deploymentFailed)
        .when(sfn.Condition.stringEquals('$.GetDeploymentResult.deploymentStatus', 'DEPLOYED'), getFinalDeploymentStatus)
        .otherwise(waitForDeployment));

    new sfn.StateMachine(this, 'StateMachine', {
      definition,
      timeout: Duration.minutes(45),
      stateMachineName: "SapCommerceDeploymentStateMachine",
    });
  }
}
Testing Your Code
You can trigger your state machine — using the AWS Console, for example — with a payload like this:
{
  "subscription": "$SUBSCRIPTION",
  "token": "$API_TOKEN",
  "applicationCode": "commerce-cloud",
  "branch": "develop",
  "buildName": "develop-latest",
  "databaseUpdateMode": "NONE",
  "environmentCode": "d1",
  "strategy": "ROLLING_UPDATE"
}
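If you prefer to script the trigger, a minimal Python sketch using boto3 could look like the following; the state machine ARN and the payload values are placeholders:
# Start an execution of the state machine with the build/deployment payload (illustrative sketch)
import json
import boto3

sfn_client = boto3.client("stepfunctions")

payload = {
    "subscription": "$SUBSCRIPTION",  # placeholder values, as in the example above
    "token": "$API_TOKEN",
    "applicationCode": "commerce-cloud",
    "branch": "develop",
    "buildName": "develop-latest",
    "databaseUpdateMode": "NONE",
    "environmentCode": "d1",
    "strategy": "ROLLING_UPDATE",
}

response = sfn_client.start_execution(
    stateMachineArn="arn:aws:states:<region>:<account-id>:stateMachine:SapCommerceDeploymentStateMachine",
    input=json.dumps(payload),
)
print(response["executionArn"])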
Monitoring the Execution
In the AWS Step Functions console, you can use the Graph Inspector to monitor the execution in real time, including each state’s input and output.
You also have the Execution event history, which shows the details of each event.
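The same information can also be pulled programmatically. As a small sketch, assuming you kept the executionArn returned when the execution was started:
# Fetch the most recent events of an execution (illustrative sketch)
import boto3

sfn_client = boto3.client("stepfunctions")

history = sfn_client.get_execution_history(
    executionArn="<execution-arn>",  # placeholder: the ARN returned by start_execution
    maxResults=20,
    reverseOrder=True,               # newest events first
)
for event in history["events"]:
    print(event["id"], event["type"])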

Areas of Improvement
While working on the project, we also identified some ideas to improve the tool and make it more efficient:
- Add notifications (via email, Slack, etc.)
- Enhance security (remove the API token from the input; see the sketch after this list)
- Display build progress: use SAP API methods that give build/deployment progress percentage and make it available in the state machine
- Add option to execute parallel deployment to multiple environments, using Parallel state
- Integrate the tool into a CI/CD pipeline
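On the security point, one possible approach (a sketch of an idea, not something we implemented) is to have each Lambda read the API token from AWS Secrets Manager instead of receiving it in the state machine input:
# Read the SAP API token from AWS Secrets Manager instead of the state machine input
# (illustrative sketch; the secret name "sap/commerce/api-token" is a made-up example)
import boto3

secrets_client = boto3.client("secretsmanager")

def get_api_token():
    secret = secrets_client.get_secret_value(SecretId="sap/commerce/api-token")
    return secret["SecretString"]
The Lambdas would then need IAM permission to read that secret, and the token field could be dropped from the input payload entirely.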
Conclusion
In this article, we have explored how to develop and use a tool that provides a single-command interface to build and deploy to an SAP CCv2 instance by leveraging AWS Step Functions.
AWS Step Functions workflows are flexible and easy to debug. The solution saves us manual effort and makes getting SAP CCv2 code to an application server quicker and more straightforward. We hope it will be as helpful for you as it has been for our team.