Amazon S3 Batch Operations introduces performance improvements

Amazon S3 Batch Operations now completes jobs up to 10x faster, at a scale of up to 20 billion objects per job, helping you accelerate large-scale storage operations.

With S3 Batch Operations, you can perform operations at scale such as copying objects between staging and production buckets, tagging objects for S3 Lifecycle management, or computing object checksums to verify the content of stored datasets. S3 Batch Operations now pre-processes objects, executes jobs, and generates completion reports up to 10x faster for jobs processing millions of objects, with no additional configuration or cost. To get started, create a job in the AWS Management Console and specify the operation type as well as filters such as bucket, prefix, or creation date. S3 automatically generates the object list, creates an AWS Identity and Access Management (IAM) role with permission policies as needed, then initiates the job.
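
The same kind of job can also be created programmatically. The following is a minimal sketch using the AWS SDK for Python (boto3) rather than the console: it tags objects selected by a prefix and creation-date filter and lets S3 generate the object list. The account ID, bucket names, prefix, tag values, and IAM role ARN are placeholders, and unlike the console flow described above, the API call supplies its own pre-existing IAM role.

```python
# Sketch only: create an S3 Batch Operations tagging job via boto3.
# All account IDs, ARNs, bucket names, and prefixes below are placeholders.
import uuid
from datetime import datetime, timezone

import boto3

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId="111122223333",                                   # placeholder account
    ConfirmationRequired=False,                                 # start without manual confirmation
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/batch-ops-role",    # placeholder IAM role
    ClientRequestToken=str(uuid.uuid4()),                       # idempotency token
    # Operation: tag matching objects, e.g. to drive an S3 Lifecycle rule.
    Operation={
        "S3PutObjectTagging": {
            "TagSet": [{"Key": "retention", "Value": "archive"}]
        }
    },
    # Let S3 generate the object list from filters (bucket, prefix,
    # creation date) instead of supplying a manifest file.
    ManifestGenerator={
        "S3JobManifestGenerator": {
            "SourceBucket": "arn:aws:s3:::amzn-s3-demo-bucket",  # placeholder bucket
            "EnableManifestOutput": False,
            "Filter": {
                "CreatedBefore": datetime(2025, 1, 1, tzinfo=timezone.utc),
                "KeyNameConstraint": {"MatchAnyPrefix": ["staging/"]},
            },
        }
    },
    # Completion report written when the job finishes.
    Report={
        "Bucket": "arn:aws:s3:::amzn-s3-demo-reports-bucket",    # placeholder bucket
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-reports",
        "ReportScope": "AllTasks",
    },
)
print("Created job:", response["JobId"])
```

Job progress can then be checked with the S3 console or the describe_job API, and the completion report lands in the report bucket once the job finishes.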

S3 Batch Operations performance improvements are available in all AWS Regions, except for AWS China Regions and AWS GovCloud (US) Regions. For pricing information, please visit the Management & Insights tab of the Amazon S3 pricing page. To learn more about S3 Batch Operations, visit the overview page and documentation.

Source: Amazon Web Services
