AWS Launches Strands Agents Open Source SDK for AI Development
AWS introduces Strands Agents, an open source SDK designed to simplify AI agent development with a model-driven approach, enabling rapid deployment from local testing to production.
Amazon Web Services (AWS) has announced the launch of Strands Agents, an open source SDK that streamlines the creation and deployment of AI agents. Designed with a model-driven approach, Strands allows developers to build and run AI agents with minimal code, scaling from simple to complex use cases. The SDK is already in use by multiple AWS teams, including Amazon Q Developer, AWS Glue, and VPC Reachability Analyzer.
Key Features of Strands Agents
- Simplified Development: Unlike traditional frameworks that require developers to define complex workflows by hand, Strands relies on state-of-the-art models to handle planning, tool execution, and reflection autonomously.
- Flexible Model Support: Works with models from Amazon Bedrock, Anthropic, and Meta, with local models via Ollama, and with many other providers through LiteLLM.
- Tool Integration: Supports thousands of published tools through the Model Context Protocol (MCP), as well as custom tools written as plain Python functions (see the sketch after this list).
- Production-Ready: Includes deployment toolkits for AWS Lambda, Fargate, and EC2, with observability via OpenTelemetry.
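The snippet below is a minimal sketch of what this model-driven approach looks like in code, based on the SDK's documented Agent and tool interfaces; the Bedrock model identifier and the word_count tool are illustrative and not part of the announcement.

```python
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count the number of words in a piece of text."""
    return len(text.split())

# The model ID below is illustrative; by default Strands targets Amazon Bedrock.
agent = Agent(
    model="us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    system_prompt="You are a concise writing assistant.",
    tools=[word_count],
)

# The model plans, decides when to call word_count, and returns a final answer.
agent("How many words are in this sentence?")
```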

Community and Collaboration
Strands Agents has garnered support from industry leaders like Accenture, Anthropic, and Meta, with contributions including API integrations for Claude and Llama models. Developers are encouraged to join the project on GitHub.
Example Use Case: Naming Assistant
```python
from mcp import StdioServerParameters, stdio_client
from strands import Agent
from strands.tools.mcp import MCPClient
from strands_tools import http_request

NAMING_SYSTEM_PROMPT = """
You are an assistant that helps name open source projects.
"""

# Connect to an MCP server over stdio; the server command below is illustrative.
mcp_client = MCPClient(lambda: stdio_client(
    StdioServerParameters(command="uvx", args=["fastdomaincheck-mcp-server"])
))

with mcp_client:
    # Combine the MCP server's tools with the pre-built http_request tool.
    tools = mcp_client.list_tools_sync() + [http_request]
    naming_agent = Agent(system_prompt=NAMING_SYSTEM_PROMPT, tools=tools)
    naming_agent("Name an AI agent project.")
```
Production Architectures
Strands supports diverse deployment models:
- Local Execution: Ideal for CLI tools.
- API-Based: Deploy agents behind APIs using AWS services such as Lambda, Fargate, or EC2 (a minimal handler sketch follows this list).
- Isolated Tool Execution: Run tools in separate environments (e.g., Lambda).
- Hybrid Approach: Combine client-side and backend tools.
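To give a sense of the API-based pattern, here is a minimal sketch of an AWS Lambda handler that fronts a Strands agent; the handler name, event shape, and system prompt are assumptions for illustration rather than an official deployment template.

```python
# Sketch of an API-based deployment: a Lambda handler that forwards the
# request body's prompt to a Strands agent. The event structure assumed here
# (an API Gateway proxy event with a JSON "prompt" field) is illustrative.
import json

from strands import Agent

agent = Agent(system_prompt="You are a helpful assistant.")

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    result = agent(body.get("prompt", "Hello"))
    return {"statusCode": 200, "body": str(result)}
```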

For more details, visit the Strands documentation.
About the Author

David Chen
AI Startup Analyst
Senior analyst focusing on the AI startup ecosystem, with 11 years of venture capital and startup analysis experience. A former member of Sequoia Capital's AI investment team, he now works as an independent analyst writing AI startup and investment analysis articles for Forbes, Harvard Business Review, and other publications.