## Overview
Go beyond simple file lookups. This agent analyzes your Amazon S3 data to answer complex questions like "Why are storage costs increasing?" or "Which files are consuming the most storage?"
**Capabilities:**
- Analyze storage usage patterns and identify optimization opportunities
- Surface access patterns and identify rarely-used files
- Compare storage costs across buckets and file types
- Identify data lifecycle issues and cleanup opportunities
- Synthesize insights from file metadata and access logs
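Under the hood, these analyses reduce to aggregating object metadata. Below is a minimal sketch of the first two capabilities, assuming object records shaped like entries in an S3 `ListObjectsV2` response; the sample data and the `summarize` helper are hypothetical, not part of the agent's actual implementation:

```python
from collections import defaultdict
from datetime import datetime, timedelta, timezone

# Hypothetical object metadata, shaped like entries from an S3
# ListObjectsV2 response (Key, Size, LastModified, StorageClass).
SAMPLE_OBJECTS = [
    {"Key": "logs/app-2023.log", "Size": 500 * 1024**2,
     "LastModified": datetime.now(timezone.utc) - timedelta(days=200),
     "StorageClass": "STANDARD"},
    {"Key": "media/video.mp4", "Size": 2 * 1024**3,
     "LastModified": datetime.now(timezone.utc) - timedelta(days=10),
     "StorageClass": "STANDARD"},
]

def summarize(objects, stale_days=90):
    """Aggregate bytes by storage class and flag rarely-touched objects."""
    by_class = defaultdict(int)
    stale = []
    cutoff = datetime.now(timezone.utc) - timedelta(days=stale_days)
    for obj in objects:
        by_class[obj["StorageClass"]] += obj["Size"]
        if obj["LastModified"] < cutoff:
            stale.append(obj["Key"])
    return dict(by_class), stale

totals, stale_keys = summarize(SAMPLE_OBJECTS)
print(totals)       # bytes stored per storage class
print(stale_keys)   # candidates for lifecycle transition or deletion
```

The same pattern extends to per-bucket cost comparisons: group by bucket instead of storage class and multiply totals by a per-tier rate.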
[
  {
    "type": "user",
    "content": "Why are our S3 storage costs 45% higher this month? We need to identify what's driving the increase."
  },
  {
    "type": "thinking",
    "content": "Analyzing S3 storage patterns...",
    "thinkingSteps": [
      "Retrieving bucket usage data from last 60 days...",
      "Calculating storage by bucket and file type...",
      "Identifying large files and access patterns...",
      "Comparing to previous month's costs..."
    ]
  },
  {
    "type": "assistant",
    "content": "I analyzed your S3 storage data and found the drivers of the cost increase:\n\n<widget_chart name=\"Storage by Bucket\">{\"type\":\"pie\",\"title\":\"Storage Distribution\",\"data\":{\"labels\":[\"logs-archive\",\"backups-prod\",\"media-assets\",\"other\"],\"datasets\":[{\"data\":[2.4,1.8,3.2,0.8],\"backgroundColor\":[\"#ef4444\",\"#f59e0b\",\"#3b82f6\",\"#10b981\"]}]}}</widget_chart>\n\n## Cost Breakdown\n\n| Bucket | Storage | Growth | Cost Impact | Issue |\n|--------|---------|--------|-------------|-------|\n| logs-archive | 2.4 TB | +118% | $48/month | 78% of files >90 days old |\n| backups-prod | 1.8 TB | +25% | $36/month | 450 GB duplicates |\n| media-assets | 3.2 TB | +8% | $64/month | 34% inactive files |\n| other | 0.8 TB | +5% | $16/month | Normal growth |\n\n## Root Causes\n\n**Primary Driver:** The **logs-archive** bucket grew 118% (1.1 TB → 2.4 TB). The log retention policy isn't being enforced: 78% of files are older than 90 days (retention policy: 30 days). This accounts for $48/month in unnecessary storage costs.\n\n**Secondary Issue:** **backups-prod** holds 450 GB of **duplicate backups** from failed cleanup jobs. The backup rotation script failed 3 times this month, leaving old backups in place.\n\n**Inefficiency Pattern:** 34% of files in **media-assets** haven't been accessed in 180+ days but remain in the Standard tier; moving them to Glacier would cut their storage cost by 68%.\n\n**Recommendations:**\n1. Enforce the log retention policy - delete files older than 30 days, potential $48/month savings\n2. Fix the backup rotation script and remove duplicates, potential $18/month savings\n3. Move inactive media to Glacier, potential $12/month savings\n\n<ENTITY_ANNOTATION id=\"BUCKET-logs-archive\" title=\"Logs Archive Bucket\" integrationType=\"AMAZON_S3\" />\n\n<CTA label=\"View Storage Details\" action=\"Show detailed storage breakdown by bucket\" />\n<CTA label=\"Generate Cleanup Plan\" action=\"Create a storage optimization action plan\" />"
  }
]
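The growth and cost figures in the sample response follow from simple month-over-month arithmetic. A minimal sanity-check sketch; the previous-month sizes and the $20/TB blended Standard-tier rate are assumptions back-derived from the table, not real billing data:

```python
def growth_pct(prev_tb: float, curr_tb: float) -> int:
    """Percent growth from previous month to current month, rounded."""
    return round((curr_tb - prev_tb) / prev_tb * 100)

RATE_PER_TB = 20  # assumed blended $/TB-month, inferred from the table

buckets = {                      # name: (previous TB, current TB)
    "logs-archive": (1.1, 2.4),  # previous size given in the example
    "backups-prod": (1.44, 1.8), # previous size back-derived from +25%
}

for name, (prev, curr) in buckets.items():
    print(f"{name}: {growth_pct(prev, curr):+d}% growth, "
          f"${curr * RATE_PER_TB:.0f}/month")
# logs-archive: +118% growth, $48/month
# backups-prod: +25% growth, $36/month
```

This matches the +118% / $48 and +25% / $36 rows in the example table.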