Aman Panjwani
8 min read

Building an Agent with Microsoft Agent Framework and Microsoft 365 Agents SDK

Microsoft Agent Framework · Microsoft 365 Agents SDK · MCP · Azure AI Foundry

AI Agents are quickly becoming part of real-world workflows, from internal copilots to external support bots. But building them properly still takes effort: managing state, handling tools, streaming responses, and integrating with our app or channel.

Microsoft Agent Framework solves this by giving developers a structured way to build intelligent agents using Python or .NET. It supports tools, planners, memory, and orchestration logic: everything needed to build agents that think and act.

The Microsoft 365 Agents SDK adds the runtime layer to bring those agents into real environments like Teams, Microsoft 365 Copilot, or our own app. It takes care of channel-specific behaviour, activity handling, and state management, while letting us choose our AI models and services.

Together, these two tools give us:

  • Full control over agent behaviour and logic
  • Channel ready architecture for Microsoft 365 experiences
  • Flexibility to use any LLM or service (like Azure OpenAI, OpenAI, etc.)

1.0 What we'll build:

In this walkthrough, we'll build an agent using:

  • Microsoft Agent Framework for the core logic and planning.
  • The Model Context Protocol (MCP) from Microsoft Learn to power responses.
  • The Microsoft 365 Agents SDK to structure agent communication.

We'll test the full setup using Azure Bot Service, showing how our agent handles real input, replies, and tool-driven reasoning.

The result: a ready-to-extend agent we can plug into Microsoft Teams or Microsoft 365 Copilot, built with production-grade components.

2.0 Prerequisites

Before building the agent, make sure we have:

  • Python 3.10 or higher
  • Access to an Azure subscription with Contributor rights
  • A language model deployed in Azure AI Foundry

That's all we need for now; we'll handle the rest step by step.

3.0 Setting up the Local Environment

Now that prerequisites are in place, let's set up our development environment to build and run the agent.
This will include creating a Python project, installing required packages, and preparing the folder structure.

3.1 Create Project and Virtual Environment

Create a new folder and set up a virtual environment:

bash
mkdir my-agent-project
cd my-agent-project
python -m venv .venv

Activate the environment:

  • on macOS/Linux:
bash
source .venv/bin/activate
  • On Windows:
bash
.venv\Scripts\activate

3.2 Define Required Packages

Create a requirements.txt file in the project root with the following content:

txt
python-dotenv
aiohttp
microsoft-agents-hosting-aiohttp
microsoft-agents-hosting-core
microsoft-agents-authentication-msal
microsoft-agents-activity
agent-framework
azure-identity

3.3 Install Dependencies

Now install all required packages:

bash
pip install --upgrade pip
pip install -r requirements.txt

3.4 Project Structure and Environment Configuration

After installing the packages, set up our project with the following layout:

bash
m365-agentframework-agent/
├── src/
│   ├── __init__.py
│   ├── agent.py
│   ├── agent_framework_agent.py
│   ├── main.py
│   └── start_server.py
├── .env
├── requirements.txt
└── .venv/
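If we prefer not to create the files by hand, the layout above can be scaffolded with a few lines of `pathlib` (a convenience sketch, not part of the walkthrough itself; run it from the parent directory of the project):

```python
from pathlib import Path

# Scaffold the project layout; creating the files manually works just as well
root = Path("m365-agentframework-agent")
src = root / "src"
src.mkdir(parents=True, exist_ok=True)

for name in ("__init__.py", "agent.py", "agent_framework_agent.py",
             "main.py", "start_server.py"):
    (src / name).touch()

(root / ".env").touch()
(root / "requirements.txt").touch()

print(sorted(p.name for p in src.iterdir()))
```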

3.5 Create the .env File

Create a .env file in the root directory to store our environment variables securely.

bash
AZURE_OPENAI_ENDPOINT=https://<our-resource-name>.openai.azure.com/
AZURE_OPENAI_API_KEY=<our-api-key>
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-5.1

These will be loaded using dotenv or read directly into the agent setup logic.
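As a quick sanity check, the values can be read back with plain os.environ; in the project itself, python-dotenv's load_dotenv() populates them from .env. The fallback values below are placeholders for illustration only:

```python
import os

# Placeholder fallbacks so the snippet runs standalone; real values come from .env
os.environ.setdefault("AZURE_OPENAI_ENDPOINT", "https://example.openai.azure.com/")
os.environ.setdefault("AZURE_OPENAI_DEPLOYMENT_NAME", "gpt-5.1")

endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]
deployment = os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"]
api_key = os.environ.get("AZURE_OPENAI_API_KEY")  # may be absent if using azure-identity

print(endpoint, deployment)
```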

4.0 Writing the Agent Code

With the project structure in place, let's walk through how each file contributes to the agent's behaviour and how the flow works from start to finish.

4.1 File Overview & Code Flow

1. main.py - Entry Point

This is where execution begins.
It sets up logging for the Microsoft Agents SDK and starts the server using our configured agent app.

  • Loads our SDK config and runtime
  • Passes everything to start_server() which boots the web server
python
import logging

ms_agents_logger = logging.getLogger("microsoft_agents")
ms_agents_logger.addHandler(logging.StreamHandler())
ms_agents_logger.setLevel(logging.INFO)

from .agent import AGENT_APP, CONNECTION_MANAGER
from .start_server import start_server

start_server(
    agent_application=AGENT_APP,
    auth_configuration=None,
)
2. start_server.py - Starts the AIOHTTP Server

This defines and launches the web server that listens for incoming requests.

  • Sets up the POST /api/messages route, which Azure Bot Service will call
  • Optionally includes auth middleware (not used here)
  • Binds the agent_application and adapter to request context
python
from os import environ

from microsoft_agents.hosting.core import AgentApplication, AgentAuthConfiguration
from microsoft_agents.hosting.aiohttp import (
    start_agent_process,
    jwt_authorization_middleware,
    CloudAdapter,
)
from aiohttp.web import Request, Response, Application, run_app


def start_server(
    agent_application: AgentApplication,
    auth_configuration: AgentAuthConfiguration | None,
):
    async def entry_point(req: Request) -> Response:
        agent: AgentApplication = req.app["agent_app"]
        adapter: CloudAdapter = req.app["adapter"]
        return await start_agent_process(
            req,
            agent,
            adapter,
        )

    # Enable middleware ONLY if auth is configured
    middlewares = [jwt_authorization_middleware] if auth_configuration else []
    app = Application(middlewares=middlewares)

    # Only set config if we have auth
    if auth_configuration:
        app["agent_configuration"] = auth_configuration

    app["agent_app"] = agent_application
    app["adapter"] = agent_application.adapter

    # POST route for Bot Framework
    app.router.add_post("/api/messages", entry_point)

    # Optional GET route for diagnostics
    async def diag(_):
        return Response(text="OK", status=200)

    app.router.add_get("/api/messages", diag)

    run_app(app, host="localhost", port=environ.get("PORT", 3978))
3. agent.py - Message Routing + Agent SDK Setup

This file:

  • Loads Microsoft 365 Agents SDK config from environment
  • Initializes CloudAdapter, MemoryStorage, Authorization and AgentApplication
  • Registers:
    • on_members_added handler: sends welcome message
    • on_message handler: passes user input to our agent and sends back a response
    • on_error handler: logs and reports errors cleanly

This is the core of how messages get routed through the agent pipeline.
python
import sys
import traceback
from os import environ

from dotenv import load_dotenv

from microsoft_agents.hosting.aiohttp import CloudAdapter
from microsoft_agents.hosting.core import (
    Authorization,
    AgentApplication,
    TurnState,
    TurnContext,
    MemoryStorage,
)
from microsoft_agents.authentication.msal import MsalConnectionManager
from microsoft_agents.activity import load_configuration_from_env

from .agent_framework_agent import MicrosoftAgentFrameworkAgent

load_dotenv()

agents_sdk_config = load_configuration_from_env(environ)

STORAGE = MemoryStorage()
CONNECTION_MANAGER = MsalConnectionManager(**agents_sdk_config)
ADAPTER = CloudAdapter(connection_manager=CONNECTION_MANAGER)
AUTHORIZATION = Authorization(STORAGE, CONNECTION_MANAGER, **agents_sdk_config)

AGENT_APP = AgentApplication[TurnState](
    storage=STORAGE,
    adapter=ADAPTER,
    authorization=AUTHORIZATION,
    **agents_sdk_config,
)

mf_agent = MicrosoftAgentFrameworkAgent()


@AGENT_APP.conversation_update("membersAdded")
async def on_members_added(context: TurnContext, _state: TurnState):
    await context.send_activity(
        "Welcome! I’m connected to Microsoft Agent Framework and Microsoft Learn.\n"
        "Ask me something about Azure or Microsoft 365."
    )
    return True


@AGENT_APP.activity("message")
async def on_message(context: TurnContext, _state: TurnState):
    user_text = (context.activity.text or "").strip()
    if not user_text:
        await context.send_activity("I didn’t see any text in your message.")
        return

    try:
        answer = await mf_agent.invoke(user_text)
        await context.send_activity(answer)
    except Exception as e:
        print(f"[agent_framework error] {e}", file=sys.stderr)
        traceback.print_exc()
        await context.send_activity("I hit an error calling the Agent Framework backend.")


@AGENT_APP.error
async def on_error(context: TurnContext, error: Exception):
    print(f"\n [on_turn_error] unhandled error: {error}", file=sys.stderr)
    traceback.print_exc()
    await context.send_activity("The bot encountered an error or bug.")
4. agent_framework_agent.py - Defines our Agent

This is the actual agent implementation using Microsoft Agent Framework.

  • Uses AzureOpenAIChatClient to call Azure OpenAI
  • Registers the Hosted MCP Tool, which lets the agent pull from Microsoft Learn's structured API
  • Exposes an invoke() method to run the agent with input text and return the response
python
import os

from dotenv import load_dotenv

from agent_framework import ChatAgent, HostedMCPTool
from agent_framework.azure import AzureOpenAIChatClient

load_dotenv()


class MicrosoftAgentFrameworkAgent:
    """
    Wraps an Agent Framework ChatAgent using Azure OpenAI and Microsoft Learn MCP.
    """

    def __init__(self):
        self.chat_client = AzureOpenAIChatClient(
            endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            deployment_name=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],
            api_key=os.environ.get("AZURE_OPENAI_API_KEY"),
        )

        self.agent = ChatAgent(
            name="MicrosoftAgentFrameworkAgent",
            description=(
                "An agent that answers questions using Microsoft Learn and Azure OpenAI."
            ),
            chat_client=self.chat_client,
            tools=[
                HostedMCPTool(
                    name="Microsoft Learn MCP",
                    url="https://learn.microsoft.com/api/mcp",
                    approval_mode="never_require",
                )
            ],
        )

    async def invoke(self, message: str) -> str:
        """Non-streaming call into the agent."""
        result = await self.agent.run(message)
        return result.text
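Because invoke() is a coroutine, any caller outside the SDK's turn handlers needs an event loop. Here is a minimal sketch of the call pattern, using a stand-in class so it runs without Azure credentials or the agent-framework package:

```python
import asyncio


class StubAgent:
    """Stand-in for MicrosoftAgentFrameworkAgent: echoes input instead of calling Azure OpenAI."""

    async def invoke(self, message: str) -> str:
        return f"echo: {message}"


async def main() -> str:
    agent = StubAgent()  # in the real project: MicrosoftAgentFrameworkAgent()
    return await agent.invoke("What is Azure Cosmos DB?")


answer = asyncio.run(main())
print(answer)  # echo: What is Azure Cosmos DB?
```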

Once we've pasted the full code into each file and our .env is configured, we're ready to run the agent. But wait, we still have one more step to do.

5.0 Testing the Agent with Azure Bot Service

To interact with our local agent from Microsoft's Web Chat Interface, we need to expose it to the internet securely. That's where Azure Bot Service and Dev Tunnel come in.

5.1 Why Azure Bot Service?

Azure Bot Service acts as the gateway between our bot and external channels like Web Chat, Teams or Microsoft 365 Copilot.
It listens for incoming messages, sends them to our bot endpoint, and returns the response to the user, just like a message router.
We'll use its built-in Test in Web Chat tool to validate that our locally running agent is working as expected.
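Concretely, each Web Chat message reaches our agent as a JSON Activity POSTed to /api/messages. The sketch below shows an illustrative shape of a "message" activity using only the standard library; the field values are made up, and real payloads carry more fields (ids, timestamps, channelData):

```python
import json

# Illustrative Bot Framework "message" activity; values are placeholders,
# not captured from a real trace
sample_activity = {
    "type": "message",
    "text": "What is Azure Cosmos DB?",
    "from": {"id": "user-1"},
    "recipient": {"id": "my-bot"},
    "conversation": {"id": "conv-1"},
    "serviceUrl": "https://example.botframework.com/",
}

body = json.dumps(sample_activity)   # what goes over the wire
parsed = json.loads(body)            # what our entry_point handler receives
print(parsed["type"], parsed["text"])
```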

5.2 Why do we need Dev Tunnel?

Since our agent runs locally on localhost:3978, Azure can't reach it directly. We need a public, secure HTTPS endpoint that forwards traffic to our local machine.
Dev Tunnels gives us exactly that. It creates a temporary, encrypted tunnel with a public URL, allowing Azure Bot Service to talk to our agent while we develop and test.

Official Guide and steps to install: Get Started with Dev Tunnel

5.3 Starting the Tunnel

We run this command to expose port 3978 to the internet with anonymous access:

bash
devtunnel host -p 3978 --allow-anonymous

Once the tunnel is active, we'll get a URL like:

bash
https://<tunnel-id>.devtunnels.ms

We'll use this in the next step.

5.4 Set Up and Configure the Bot Messaging Endpoint

Now that we have a public tunnel, we need to set up an Azure Bot that can receive and forward messages to our local agent.

Create the Azure Bot Resource

  1. Go to the Azure Portal
  2. Create a resource and search for Azure Bot
  3. Select it, fill in the basic fields, and click Review + Create.
  4. Go to the resource
  5. In the left menu, select Settings -> Configuration.
  6. Replace the messaging endpoint with our tunnel URL:
bash
https://<our-tunnel-id>.devtunnels.ms/api/messages
  7. Click Save.


This tells Azure to forward all user messages to our local agent via the tunnel.

5.5 Test in Web Chat

With everything wired up, it's time to validate that the agent is working.

  1. Go to the Azure Bot Resource in the Azure Portal.
  2. Select Test in Web Chat
  3. You should see: “Welcome! I’m connected to Microsoft Agent Framework and Microsoft Learn.”


Start chatting. Try prompts like:

txt
What is Azure Cosmos DB?
How do I set up Microsoft Entra ID?

Our agent should respond using Azure OpenAI + Microsoft Learn's MCP tool.

6.0 Recap and What's next

We've now built and tested a fully functional AI agent using:

  • Microsoft Agent Framework to handle reasoning and tool usage.
  • Hosted MCP Tool to query Microsoft Learn Content.
  • Microsoft 365 Agents SDK to manage state, activity flow, and channel communication.
  • Azure Bot Service + Dev Tunnel to interact with the agent via Web Chat.

Everything runs locally, with secure, real-time responses powered by Azure OpenAI.

What's Next

This same agent can now be extended to real-world enterprise environments:

  • Microsoft Teams: The messaging interface remains the same; we can plug this agent directly into a Teams bot registration and deploy it to internal users.
  • Microsoft 365 Copilot: With the Microsoft 365 Agents SDK, this agent is already structured to be Copilot-ready. We can host it in the same runtime and expose it via a Graph-integrated experience.

We've already built the core logic. Deploying this to channels is just a packaging and endpoint configuration step.

