mcp-BirdNET-Pi-server

A Model Context Protocol (MCP) server for BirdNET-Pi integration. Provides tools to query bird detection data, statistics, audio recordings, activity patterns, and reports from a local BirdNET-Pi installation.

Features

- Query bird detections filtered by date range and species
- Aggregate detection statistics for a time period
- Retrieve the audio recording for an individual detection
- Hourly activity patterns for a given day
- Detection reports over a date range

Requirements

- Python 3
- A local BirdNET-Pi installation (or exported detection data under data/)

Installation

pip install -r requirements.txt

Or with uv:

uv pip install -r requirements.txt

Configuration

Environment variables:

- MCP_TRANSPORT: transport to use, stdio (default) or http
- MCP_HTTP_HOST: host to bind when using the HTTP transport
- MCP_HTTP_PORT: port to bind when using the HTTP transport

Running the Server

stdio (default)

python server.py

Streamable HTTP

MCP_TRANSPORT=http python server.py

Or with custom host/port:

MCP_TRANSPORT=http MCP_HTTP_HOST=0.0.0.0 MCP_HTTP_PORT=8000 python server.py

Docker

docker build -t mcp-birdnet-pi-server .
docker run -p 8000:8000 -v /path/to/data:/app/data mcp-birdnet-pi-server

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "birdnet": {
      "command": "python",
      "args": ["/absolute/path/to/mcp-BirdNET-Pi-server/server.py"]
    }
  }
}

Available Tools

get_detections

Get bird detections filtered by date range and optional species.
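As a sketch of the filtering this tool performs, assuming a hypothetical record shape with `date` and `species` fields (the actual BirdNET-Pi schema may differ):

```python
from datetime import date

def filter_detections(detections, start, end, species=None):
    """Return detections whose date falls in [start, end],
    optionally restricted to a single species.

    Field names ("date", "species") are illustrative assumptions."""
    out = []
    for d in detections:
        when = date.fromisoformat(d["date"])
        if start <= when <= end and (species is None or d["species"] == species):
            out.append(d)
    return out
```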

get_stats

Get aggregate detection statistics for a time period.
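The kind of aggregation involved can be sketched as follows; the `species` field name is an assumption about the data shape, not the real schema:

```python
from collections import Counter

def detection_stats(detections):
    """Aggregate total detections, per-species counts, and species count.

    A minimal sketch; the real tool may expose different statistics."""
    counts = Counter(d["species"] for d in detections)
    return {
        "total": sum(counts.values()),
        "by_species": dict(counts),
        "unique_species": len(counts),
    }
```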

get_audio

Retrieve the audio recording for a specific bird detection.

get_activity

Get hourly bird activity patterns for a specific day.
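Hourly activity can be computed as a 24-bin histogram over detection timestamps. The sketch below assumes each record carries an ISO-format `timestamp` field, which is an assumption about the data shape:

```python
from collections import Counter
from datetime import datetime

def hourly_activity(detections, day):
    """Return a 24-element list: detections per hour for the given day."""
    hours = Counter()
    for d in detections:
        ts = datetime.fromisoformat(d["timestamp"])
        if ts.date() == day:
            hours[ts.hour] += 1
    return [hours.get(h, 0) for h in range(24)]
```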

generate_report

Generate a detection report for a date range.

Testing

python -m pytest test_server.py -v

Directory Structure

mcp-BirdNET-Pi-server/
├── birdnet/
│   ├── __init__.py
│   ├── config.py
│   ├── functions.py
│   └── utils.py
├── data/
│   └── detections.json
├── server.py
├── test_server.py
├── Dockerfile
├── requirements.txt
└── README.md

License

MIT


Appendix: MCP in Practice (Code Execution, Tool Scale, and Safety)

Last updated: 2026-03-23

Why This Appendix Exists

Model Context Protocol (MCP) is still one of the most useful interoperability layers for tools and agents. The tradeoff is that large MCP servers can expose many tools, and naive tool-calling can flood context windows with schemas, tool chatter, and irrelevant call traces.

In practice, “more tools” is not always “better outcomes.” Tool surface area must be paired with execution patterns that keep token use bounded and behavior predictable.

The Shift to Code Execution / Code Mode

Recent workflows increasingly move complex orchestration out of chat context and into code execution loops. This reduces repetitive schema tokens and makes tool usage auditable and testable.
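The pattern can be sketched in miniature: the model emits a small program against a narrow tool API, the host executes it, and only the final summary re-enters the chat context. All names here (run_tool, execute_model_code) are illustrative, not a real MCP SDK API, and a production host would sandbox execution far more aggressively:

```python
def run_tool(name, args):
    # Stand-in for an MCP tool call; a real host would dispatch to a server.
    fake_data = {"get_detections": [{"species": "Parus major"}] * 120}
    return fake_data.get(name, [])

def execute_model_code(source, tool_fn):
    # Execute model-generated orchestration code in a restricted namespace.
    scope = {"run_tool": tool_fn, "result": None}
    exec(source, scope)
    return scope["result"]

model_code = """
dets = run_tool("get_detections", {"start": "2025-05-01", "end": "2025-05-31"})
# Aggregate locally; only this small summary goes back into chat context,
# not 120 raw records plus repeated tool schemas.
result = {"total": len(dets)}
"""
summary = execute_model_code(model_code, run_tool)
```

The point of the sketch: the 120 raw records never touch the context window; the model sees only the one-line summary.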

Core reading:

Client Fit Guide (Short Version)

Prompt Injection: Risks, Impact, and Mitigations

Prompt injection remains an open security problem for tool-using agents. It is manageable, but not “solved.”

Primary risks:

- Malicious instructions embedded in tool output (web pages, file contents, metadata) that steer the agent
- Data exfiltration through tool calls the agent is induced to make
- Unauthorized or destructive actions performed with the agent's credentials

Mitigation baseline:

- Least-privilege tool scopes and explicit allowlists
- Human confirmation for destructive or irreversible actions
- Keeping untrusted content clearly separated from instructions in context
- Logging and auditing every tool call

Treat every tool output as untrusted input unless explicitly verified.
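One minimal form of that treatment is to validate tool output against an expected shape before it reaches the model. The field whitelist and patterns below are illustrative assumptions, not a complete defense:

```python
import re

# Fields we expect in a detection record; everything else is dropped.
ALLOWED_FIELDS = {"species", "date", "confidence"}

# Crude heuristic for instruction-like text smuggled into data fields.
SUSPICIOUS = re.compile(r"ignore (all|previous) instructions|system prompt", re.I)

def sanitize_tool_output(records):
    """Keep only whitelisted fields and drop records whose values look like
    injected instructions rather than detection data."""
    clean = []
    for rec in records:
        kept = {k: v for k, v in rec.items() if k in ALLOWED_FIELDS}
        if any(isinstance(v, str) and SUSPICIOUS.search(v) for v in kept.values()):
            continue
        clean.append(kept)
    return clean
```

Pattern matching alone cannot stop a determined injection; it only raises the bar, which is why the structural mitigations above (least privilege, confirmation, separation) remain the baseline.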