Amazon S3 introduces a new way to verify the content of stored datasets

Amazon S3 provides a new way to verify the content of stored datasets. Using S3 Batch Operations, you can efficiently verify billions of objects and automatically generate integrity reports to prove that your datasets remain intact over time. This capability works with any object stored in S3, regardless of storage class or object size, and without the need to restore or download data. Whether you’re verifying objects for data preservation, accuracy checks, or compliance requirements, you can reduce the cost, time, and effort required.

With S3 Batch Operations, you can create a compute checksum job for your objects. To get started, provide a list of objects (called a manifest) or specify the bucket with filters like prefix or suffix. Then choose “Compute checksum” as the operation type and select from supported algorithms including SHA-1, SHA-256, CRC32, CRC32C, CRC64, and MD5. When the job completes, you receive a detailed report with checksum information for all processed objects. You can use this report for compliance or audit purposes. This capability complements S3’s built-in validation, letting you independently verify your stored data any time.
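As a rough illustration of the manifest-plus-operation flow described above, the sketch below assembles the parameters for an S3 Batch Operations job with boto3's `create_job` call. The account ID, role ARN, bucket names, and ETag are placeholders, and the exact `Operation` key for the compute checksum operation is an assumption here; check the S3 Control API reference for the authoritative shape.

```python
def build_compute_checksum_job(account_id, role_arn, manifest_arn,
                               manifest_etag, report_bucket_arn):
    """Assemble create_job parameters for a compute-checksum batch job.

    All identifiers passed in are placeholders; the 'Operation' payload
    below is an assumed shape, not confirmed against the S3 Control API.
    """
    return {
        "AccountId": account_id,
        "ConfirmationRequired": False,
        # Assumed operation name and algorithm field -- verify against
        # the S3 Control API reference before using.
        "Operation": {"S3ComputeObjectChecksum": {"ChecksumAlgorithm": "SHA256"}},
        "Manifest": {
            # CSV manifest listing one bucket/key pair per line.
            "Spec": {
                "Format": "S3BatchOperations_CSV_20180820",
                "Fields": ["Bucket", "Key"],
            },
            "Location": {"ObjectArn": manifest_arn, "ETag": manifest_etag},
        },
        "Report": {
            # Completion report with per-object checksum results.
            "Bucket": report_bucket_arn,
            "Format": "Report_CSV_20180820",
            "Enabled": True,
            "ReportScope": "AllTasks",
        },
        "Priority": 10,
        "RoleArn": role_arn,
    }

params = build_compute_checksum_job(
    "111122223333",
    "arn:aws:iam::111122223333:role/batch-ops-role",
    "arn:aws:s3:::example-bucket/manifest.csv",
    "example-manifest-etag",
    "arn:aws:s3:::example-report-bucket",
)
# The job would then be submitted with:
#   boto3.client("s3control").create_job(**params)
```

Alternatively, the console flow described above (choosing “Compute checksum” and an algorithm) produces the same kind of job without writing any code.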

The new compute checksum operation is available in all AWS Regions. For pricing details, visit the S3 pricing page. To learn more, visit the Amazon S3 User Guide.


Source: Amazon Web Services


