In this article, we explore a Proof-of-Concept (POC) solution designed to efficiently capture and direct events from SAP Commerce, a leading e-commerce platform, to external systems.
In our implementation, these systems are microservices in AWS. We built this solution by combining cloud-native tools, architectural patterns, and the event-driven capabilities of SAP Commerce.
In a dynamic e-commerce environment, critical actions — such as order placement or product creation — are constantly occurring. These actions, in turn, trigger events that serve multiple purposes, like initiating logistics operations, updating inventory, populating analytics dashboards, or sending out notifications. SAP Commerce can propagate events via webhooks, a versatile feature that calls out to external endpoints upon the occurrence of configured events.
The challenge is capturing these events efficiently and directing them to the appropriate services for diverse processing. This demands a solution that is scalable, resilient, and cost-effective.
Our solution starts with SAP Commerce, which triggers events based on actions occurring within the platform. It offers two main options to transmit events: Webhooks and the BTP Kyma Runtime. Although Kyma provides a more comprehensive framework, supporting hundreds of pre-defined events and custom ones, we opted for Webhooks due to their simplicity and ease of setup. Importantly, the proposed AWS architecture remains fully compatible with a Kyma solution.
We selected three sample events to be sent to AWS: order-created, product-created, and user-sign-in. Configuring these Webhooks at runtime requires no code changes and can be performed using the Backoffice Integration UI Tool, ImpEx files, or the Webhook Service Meta API. For our solution, we used ImpEx files.
Each Commerce Webhook was set up for a distinct integration object (UserIO, OrderIO, ProductIO) with the EventType as ItemSavedEvent. Optional Groovy script filters were paired with these webhooks to refine the data being captured. The AWS API Gateway endpoint serves as the Webhook destination, providing a bridge to the AWS ecosystem.
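As an illustration, a webhook registration of this kind can be expressed in ImpEx roughly as follows. This is a sketch, not our actual configuration: the IDs and the API Gateway URL are placeholders, the credential setup is omitted, and the exact attribute set should be verified against the SAP Commerce webhook documentation.

```impex
# Placeholder IDs and URL -- illustrative only; credentials omitted.
INSERT_UPDATE DestinationTarget; id[unique = true]
; awsWebhooks

INSERT_UPDATE ConsumedDestination; id[unique = true]; url; destinationTarget(id)[unique = true]
; aws-event-gateway; https://<api-id>.execute-api.<region>.amazonaws.com/prod/events; awsWebhooks

INSERT_UPDATE WebhookConfiguration; integrationObject(code)[unique = true]; destination(id)[unique = true]; eventType[default = de.hybris.platform.webhookservices.event.ItemSavedEvent]
; OrderIO; aws-event-gateway
```

Analogous blocks register the UserIO and ProductIO webhooks against the same destination.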
SAP Commerce events trigger a Lambda function via the API Gateway. Acting as an event processor, this function inspects each event and publishes it to an SNS topic. It also logs the event in a DynamoDB table for record-keeping.
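A minimal sketch of this processor Lambda is shown below. The topic ARN and table name are placeholders, and it assumes the CloudEvents attributes arrive as HTTP headers (binary content mode), which API Gateway presents in lower case.

```python
import json

# Hypothetical names -- replace with your own topic ARN and table name.
SNS_TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:commerce-events"
TABLE_NAME = "commerce-event-log"


def extract_ce_type(event):
    """Pull the CloudEvents type from the webhook's HTTP headers."""
    headers = event.get("headers") or {}
    return headers.get("ce-type", "unknown")


def handler(event, context):
    # boto3 is always available in the Lambda runtime; it is imported lazily
    # here so the module can also be loaded without AWS dependencies.
    import boto3

    ce_type = extract_ce_type(event)
    body = event.get("body") or "{}"

    # Publish to the topic, carrying ce-type as a message attribute so that
    # SQS subscription filter policies can match on it.
    boto3.client("sns").publish(
        TopicArn=SNS_TOPIC_ARN,
        Message=body,
        MessageAttributes={
            "ce-type": {"DataType": "String", "StringValue": ce_type}
        },
    )

    # Record the raw event in DynamoDB for record-keeping.
    boto3.resource("dynamodb").Table(TABLE_NAME).put_item(
        Item={"id": context.aws_request_id, "ceType": ce_type, "payload": body}
    )
    return {"statusCode": 202}
```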
In our implementation, the subscribers of our SNS topic are three SQS queues, each linked to a backend service. Each queue uses a subscription filter policy to receive only events with specific message attributes. Because SAP Commerce supports the CloudEvents specification, we filter on the “ce-type” attribute, for instance, “ce-type”: “sap.cx.commerce.OrderIO.Created.v1”. The queues buffer load for their consumers, providing persistence, managing retries, and allowing each subscriber to scale independently.
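The filter policies can be sketched as simple JSON documents keyed on the “ce-type” attribute. Note that only the order-created value is quoted above; the product and user values here follow the same naming pattern but are assumptions, as are the queue names.

```python
import json


def filter_policy(ce_types):
    """Build an SNS subscription filter policy matching the given
    CloudEvents types on the "ce-type" message attribute."""
    return {"ce-type": list(ce_types)}


# Only the order-created string is quoted in the article; the other two are
# hypothetical values following the same naming pattern.
ORDER_CREATED = "sap.cx.commerce.OrderIO.Created.v1"
PRODUCT_CREATED = "sap.cx.commerce.ProductIO.Created.v1"
USER_SIGN_IN = "sap.cx.commerce.UserIO.Created.v1"

POLICIES = {
    "fulfillment-queue": filter_policy([ORDER_CREATED]),
    "notification-queue": filter_policy([ORDER_CREATED, PRODUCT_CREATED]),
    "analytics-queue": filter_policy([ORDER_CREATED, USER_SIGN_IN]),
}


def apply_policy(sns_client, subscription_arn, policy):
    """Attach a filter policy to an existing SNS->SQS subscription."""
    sns_client.set_subscription_attributes(
        SubscriptionArn=subscription_arn,
        AttributeName="FilterPolicy",
        AttributeValue=json.dumps(policy),
    )
```

`apply_policy` takes the client as an argument, so the same code works with a real `boto3.client("sns")` or a stub in tests.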
Subsequently, we implemented three Lambda-based microservices that poll their respective SQS queues and process the events. The Analytics Service is triggered by order-created and user-sign-in events and feeds CloudWatch dashboards for real-time monitoring. The Notification Service, on the other hand, is activated by order-created and product-created events and leverages Amazon SES to dispatch the corresponding email notifications. Lastly, the mock Fulfillment Service receives only the order-created events, which it logs.
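The mock Fulfillment Service can be sketched as the following SQS-triggered handler. It assumes raw message delivery is enabled on the SNS subscription, so each SQS record body is the original webhook payload rather than the SNS envelope.

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def parse_records(sqs_event):
    """Yield the commerce event payload from each SQS record.

    Assumes raw message delivery on the SNS->SQS subscription, so the
    record body is the webhook payload itself."""
    for record in sqs_event.get("Records", []):
        yield json.loads(record["body"])


def handler(event, context):
    # Mock Fulfillment Service: log each order-created event it receives.
    for payload in parse_records(event):
        logger.info("Fulfilling order event: %s", json.dumps(payload))
    return {"processed": len(event.get("Records", []))}
```

The Analytics and Notification services follow the same polling pattern, differing only in what they do with each payload.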
To enable comprehensive tracing of the solution, we integrated AWS X-Ray, providing a detailed view of each event’s journey from the API Gateway to the final Lambda function.
For defining our infrastructure as code, we leveraged the AWS Serverless Application Model (SAM), including AWS SAM Pipelines to create a Bitbucket-integrated CI/CD pipeline via AWS CodePipeline.
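The wiring between a queue and its consumer function is concise in a SAM template. The excerpt below is illustrative only; resource names, runtime, and batch size are assumptions, not our exact template.

```yaml
# Illustrative SAM excerpt -- names, runtime, and batch size are assumptions.
Resources:
  FulfillmentQueue:
    Type: AWS::SQS::Queue

  FulfillmentFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: fulfillment.handler
      Runtime: python3.12
      Tracing: Active          # emit AWS X-Ray segments for this function
      Events:
        FromQueue:
          Type: SQS
          Properties:
            Queue: !GetAtt FulfillmentQueue.Arn
            BatchSize: 10
```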
This architecture offers several advantages:
- Scalability: Each component (API Gateway, Lambda, SNS, SQS) can independently scale based on load.
- Flexibility: Changing which events each service receives only requires updating the SNS subscription filter policies.
- Resilience: Failures or slowdowns in one area (like a processing Lambda) won’t impact the overall system. Events continue to be published, stored in the SQS queues, and processed by other Lambdas.
- Cost-effectiveness: With AWS’s pay-as-you-go model, costs are lower during periods of low event volume.
This architecture is a compelling choice for any system that must efficiently route a variety of event types to different AWS services in a scalable, flexible, and cost-effective manner. It is particularly well-suited to SAP Commerce, whose built-in webhooks let the platform’s wide range of events be captured and processed in AWS.