AWS SNS Now Supports Batch Publish in One Request

AWS has finally announced support for batch publishing to an SNS topic. This is a long overdue feature that developers, myself included, have been requesting for quite a while. Learn more about it in this article.

AWS quietly announced a new quality-of-life improvement for those of you using SNS, or Simple Notification Service.

Starting today, SNS now supports batch message publishing. Instead of having to make 10 separate API requests to publish 10 different messages, developers can now publish all messages to SNS in a single request.

Why Is This Feature Important?

In my mind, the biggest reason this announcement is important is the prevention of partial failure scenarios.

Imagine a scenario where an application needs to notify consumers of multiple events that all changed as part of a single process. This could be the result of an API call to a function, or something else entirely.

In this example, the application would need to iterate over all the messages it wants to publish and send them to the SNS topic one by one. But what happens if one, or even several, of them fail?

The application will need to gracefully handle this partial failure, either by internally spinning and retrying the publish, or by rolling back the transaction.

Regardless of how it's done, it's inconvenient and requires redundant application-level handling to detect and address the scenario.
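For illustration, here's roughly what that one-message-at-a-time approach looks like with the AWS SDK for Java. The payloads list, snsClient, topicArn, and the retryOrRollback helper are all placeholders for whatever your application happens to do today:

// Hypothetical pre-batch approach: publish each message individually.
for (String payload : payloads) {
    try {
        snsClient.publish(new PublishRequest(topicArn, payload));
    } catch (Exception e) {
        // Partial failure: earlier messages are already published, this one is not.
        // The application has to retry, roll back, or otherwise reconcile on its own.
        retryOrRollback(payload, e);
    }
}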

Batch publishing largely avoids this problem. Not only do you get a latency boost from making a single request, you also greatly reduce the opportunity for partial failure scenarios and can simplify that pesky retry code that exists in many applications.

Additionally, there is a cost benefit to using batch publish. AWS has noted that a batch publish request costs the same as an API request containing only one message. That means if you have 100 messages and were previously making a separate publish call for each of them, you can potentially cut the cost you're paying on the publish side by 10x!

It's also important to note that this feature works for both Standard and FIFO topics, so there's no limitation based on the topic type you're using; I'll show what a FIFO entry looks like below.

Example Code

Here’s a great example of this new feature, pulled from this AWS blog post announcing it (also a great read). Using Java, the code looks like the below:

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import com.amazonaws.services.sns.model.PublishBatchRequest;
import com.amazonaws.services.sns.model.PublishBatchRequestEntry;
import com.amazonaws.services.sns.model.PublishBatchResult;

private static final String MESSAGE_BATCH_ID_PREFIX = "server1234-batch-id-";

// Build 10 batch entries, each with a unique Id within the batch and a message payload.
List<PublishBatchRequestEntry> entries = IntStream.range(0, 10)
    .mapToObj(i -> new PublishBatchRequestEntry()
        .withId(MESSAGE_BATCH_ID_PREFIX + i)
        .withMessage(YOUR_PAYLOAD_HERE))
    .collect(Collectors.toList());

// Publish all 10 messages to the topic in a single request.
PublishBatchRequest request = new PublishBatchRequest()
    .withTopicArn(topicArn)
    .withPublishBatchRequestEntries(entries);
PublishBatchResult response = snsClient.publishBatch(request);

Note that you need to provide an Id field for each message. This identifier needs to be unique within the batch, and it gives SNS a way to report back on each individual message in the response.
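For example, assuming the SDK v1 model classes behave the way I'd expect, you could use that Id to check which entries made it through and which didn't:

// Walk the per-entry results returned by publishBatch.
response.getSuccessful().forEach(ok ->
    System.out.println("Entry " + ok.getId() + " published as message " + ok.getMessageId()));

response.getFailed().forEach(err ->
    System.err.println("Entry " + err.getId() + " failed: " + err.getCode() + " - " + err.getMessage()));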

Other than that, the request looks very similar to a normal SNS publish request in Java. The only major difference is that we pass a collection of entries into the PublishBatchRequestEntries field.
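As mentioned earlier, batch publish also works with FIFO topics. For those, each entry needs the usual FIFO attributes as well; here's a minimal sketch, where the values are placeholders and the deduplication ID can be omitted if content-based deduplication is enabled on the topic:

// Sketch of a single batch entry destined for a FIFO topic.
PublishBatchRequestEntry fifoEntry = new PublishBatchRequestEntry()
    .withId("fifo-batch-id-1")
    .withMessage(YOUR_PAYLOAD_HERE)
    .withMessageGroupId("order-12345")          // messages in the same group are delivered in order
    .withMessageDeduplicationId("event-67890"); // required unless content-based deduplication is enabled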

Limitations

The only major limitation is that PublishBatch supports at most 10 messages per request. That's a big improvement over the previous limit of 1, but I hope the SNS team raises it in the future to deal with emerging use cases.
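If you have more than 10 messages, you'll still need to split them into batches yourself. A rough sketch of how that might look, reusing the entries list from the example above:

// PublishBatch accepts at most 10 entries, so chunk larger lists into batches of 10.
int batchSize = 10;
for (int start = 0; start < entries.size(); start += batchSize) {
    List<PublishBatchRequestEntry> batch =
        entries.subList(start, Math.min(start + batchSize, entries.size()));

    snsClient.publishBatch(new PublishBatchRequest()
        .withTopicArn(topicArn)
        .withPublishBatchRequestEntries(batch));
}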

Also of note is that the normal TPS limits for SNS publish still apply. Specifically, users can publish at most 30,000 transactions per second (TPS). If each batch contains 10 messages, you can publish 30,000 * 10 = 300,000 messages per second using this new feature. Previously, SNS only supported 30,000 messages per second. That's a 10x improvement!

If you’re interested in learning more, I highly suggest checking out this great blog post by the AWS team right here.
