AWS S3 Now Stores AI Vectors Natively
3 Dec
Summary
- AWS S3 now offers native vector storage and search.
- Service scales to 2 billion vectors per index, 20 trillion per bucket.
- Potential to cut the cost of storing and querying vectors by up to 90%.

Amazon Web Services (AWS) has announced the general availability of Amazon S3 Vectors, which integrates native vector storage and similarity search directly into its S3 object storage service. The move aims to simplify AI infrastructure by eliminating the need for a separate vector database in many applications. Capacity has grown dramatically since the preview: the service now supports up to 2 billion vectors per index and 20 trillion vectors per bucket.
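For teams evaluating the service, the workflow is exposed through the standard AWS SDKs. The sketch below uses Python and boto3 to create a vector index, insert an embedding, and run a similarity query. The `s3vectors` client and parameter names follow AWS's published examples, but the bucket name, index name, dimension, and embedding values here are placeholders, and the exact request shapes should be checked against the current SDK documentation.

```python
import boto3

# Minimal sketch: bucket/index names, dimension, and embeddings are placeholders.
s3vectors = boto3.client("s3vectors", region_name="us-east-1")

# Create an index in an existing vector bucket; dimension and distance metric
# must match the embedding model you use (real models typically produce
# hundreds to thousands of dimensions, not 3).
s3vectors.create_index(
    vectorBucketName="example-vector-bucket",
    indexName="docs-index",
    dataType="float32",
    dimension=3,
    distanceMetric="cosine",
)

# Insert vectors: each entry carries a key, raw float32 data, and optional
# metadata that can be used for filtering at query time.
s3vectors.put_vectors(
    vectorBucketName="example-vector-bucket",
    indexName="docs-index",
    vectors=[
        {
            "key": "doc-001",
            "data": {"float32": [0.12, 0.56, 0.33]},
            "metadata": {"source": "handbook"},
        },
    ],
)

# Similarity query: return the closest vectors with distances and metadata.
response = s3vectors.query_vectors(
    vectorBucketName="example-vector-bucket",
    indexName="docs-index",
    queryVector={"float32": [0.10, 0.58, 0.31]},
    topK=5,
    returnDistance=True,
    returnMetadata=True,
)
for match in response["vectors"]:
    print(match["key"], match.get("distance"))
```

This put/query pattern is the kind of call a semantic-search or agent-memory workload would make directly against S3, subject to the latency characteristics discussed below.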
AWS highlights that S3 Vectors could reduce the total cost of storing and querying vectors by up to 90% compared with specialized vector database solutions. The company positions S3 Vectors as a complementary tier rather than a direct replacement: it suits workloads that can tolerate query latencies of around 100 milliseconds, such as semantic search and agent memory, while latency-sensitive applications are still steered toward dedicated engines such as Amazon OpenSearch Service.
The launch gives enterprise architects a tiered performance framework, allowing them to match vector storage to specific workload requirements. The approach mirrors the evolution of tabular data in data lakes, where specialized databases coexist with cost-effective object storage. AWS plans further performance and scale improvements for S3 Vectors, signaling its growing importance in the AI data landscape.




