Amazon S3 provides a new way to verify the content of stored datasets. Using S3 Batch Operations, you can efficiently verify billions of objects and automatically generate integrity reports that prove your datasets remain intact over time. This capability works with any object stored in S3, regardless of storage class or object size, and does not require restoring or downloading data. Whether you’re verifying objects for data preservation, accuracy checks, or compliance requirements, you can reduce the cost, time, and effort required.
With S3 Batch Operations, you can create a compute checksum job for your objects. To get started, provide a list of objects (called a manifest) or specify a bucket with filters such as prefix or suffix. Then choose “Compute checksum” as the operation type and select from the supported algorithms: SHA-1, SHA-256, CRC32, CRC32C, CRC64, and MD5. When the job completes, you receive a detailed report with checksum information for all processed objects, which you can use for compliance or audit purposes. This capability complements S3’s built-in validation, letting you independently verify your stored data at any time.
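If you hold a local copy of an object, you can cross-check it against the checksums in a completed job report. The sketch below is illustrative only, not part of the S3 API: it computes the subset of the supported algorithms available in Python's standard library (SHA-1, SHA-256, MD5, and CRC32; CRC32C and CRC64 require third-party packages), and it compares hex digests for simplicity, whereas S3 reports typically encode checksum values in base64.

```python
import hashlib
import zlib

def compute_checksums(data: bytes) -> dict:
    """Compute the checksums supported by the compute checksum operation
    that are available in the Python standard library.

    CRC32C and CRC64 are omitted: they need third-party packages.
    Values are returned as hex strings; S3 reports typically use base64.
    """
    return {
        "SHA1": hashlib.sha1(data).hexdigest(),
        "SHA256": hashlib.sha256(data).hexdigest(),
        "MD5": hashlib.md5(data).hexdigest(),
        # Mask to 32 bits for consistent output across Python versions.
        "CRC32": format(zlib.crc32(data) & 0xFFFFFFFF, "08x"),
    }

# Hypothetical usage: compare a local copy against a value taken from
# the job's completion report (the report value shown here is the
# well-known SHA-256 of the sample bytes, for illustration).
data = b"hello world"
local = compute_checksums(data)
reported_sha256 = "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9"
assert local["SHA256"] == reported_sha256
```

A real workflow would parse object keys and checksum values out of the CSV report the job writes, then compare each against a locally computed digest in the same encoding.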
The new compute checksum operation is now available in all AWS Regions. For pricing details, visit the S3 pricing page. To learn more, visit the S3 User Guide.
Source: Amazon Web Services