
McpServerDotNet

.NET 10 · C# 14 · License: MIT

McpServerDotNet is a .NET 10 Model Context Protocol server for local developer automation. It exposes a workspace-aware filesystem surface, MCP resources and prompts, an activity-routing layer for Codex-style workflows, and optional execution, SSH, web, and Ollama-backed local inference tooling.

This is not a generic scaffold anymore. The repository already contains a working stdio MCP host, real tool/resource/prompt routing, safety gates around higher-risk operations, optional local Ollama integration, and setup scripts for clients such as LM Studio and VS Code.

Who this is for

  • Developers building or operating local MCP workflows from .NET.
  • LM Studio or GitHub Copilot users who want a repo-aware local MCP server.
  • Teams that want a Clean Architecture MCP host they can extend with their own tools, resources, prompts, and infrastructure adapters.

Implemented capabilities

Always available

  • Stdio MCP host with JSON-RPC lifecycle handling for initialize, ping, shutdown, exit, tools, resources, and prompts.
  • Workspace-aware filesystem tools for read, write, append, copy, move, delete, metadata, and directory listing.
  • Workspace tools for root switching, project-folder switching, status, and bounded code-review snapshots.
  • MCP resources for file text, directory listings, metadata, recursive tree snapshots, and recent workspace changes.
  • MCP prompts for file summarization and grounded directory review.
  • Activity tools that classify requests and build bounded local-model context packets without calling a model directly.
  • Clean Architecture split across Host, Protocol, Application, Infrastructure, Contracts, and Domain projects.
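As an illustration of the stdio lifecycle, a standard MCP session starts with an `initialize` request like the one below (field values are representative examples, not this server's actual responses); the server replies with its capabilities, and the client then sends a `notifications/initialized` notification before calling tools:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```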

Conditionally enabled

  • shell.exec for non-interactive local command execution under an explicit allow/deny policy.
  • inference.local_* tools for Ollama-backed summarization, planning, code review, and completion.
  • ssh.execute and ssh.write_text for profile-based remote automation.
  • web.fetch_url and web.search for allowlisted outbound web access.

Client integration

  • LM Studio installer and diagnostics scripts.
  • VS Code user-level MCP config installer.
  • Packable dotnet tool host via McpServer.Host.
  • Optional Ollama-backed local inference path for summarization, planning, code review, and bounded completion.

Solution layout

| Project | Responsibility |
| --- | --- |
| src/McpServer.Host | Host bootstrap, configuration, Autofac registrations, stdio loop, logging. |
| src/McpServer.Protocol | MCP lifecycle, session state, JSON-RPC dispatch, tool/resource/prompt routing. |
| src/McpServer.Application | Tool, resource, prompt, activity, and application abstractions. |
| src/McpServer.Infrastructure | Filesystem, path policy, process execution, Ollama, SSH, web, logging implementations. |
| src/McpServer.Contracts | MCP and JSON-RPC DTOs. |
| src/McpServer.Domain | Reserved domain layer for future business rules. |
| tests/* | Unit and stdio integration coverage. |

Default tool surface

Filesystem

  • fs.write_text
  • fs.append_text
  • fs.read_file
  • fs.read_text
  • fs.get_metadata
  • fs.list_directory
  • fs.create_directory
  • fs.move_path
  • fs.copy_path
  • fs.delete_path

Workspace and review

  • workspace.set_root
  • workspace.open
  • workspace.select_folder
  • workspace.status
  • workspace.inspect

Activity orchestration

  • activity.route
  • activity.schemas.list
  • activity.context.preview
  • activity.run

Resources

  • file:///workspace/...
  • dir:///workspace/...
  • filemeta:///workspace/...
  • tree:///project
  • tree:///workspace
  • changes:///project
  • changes:///workspace
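For example, a client could read one of these resources with a standard `resources/read` request (the file path below is hypothetical):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "resources/read",
  "params": { "uri": "file:///workspace/README.md" }
}
```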

Prompts

  • prompt.summarize_file
  • prompt.review_directory
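A prompt is retrieved with a standard `prompts/get` request; the argument name and path below are illustrative assumptions, not taken from this server's schema:

```json
{
  "jsonrpc": "2.0",
  "id": 8,
  "method": "prompts/get",
  "params": {
    "name": "prompt.summarize_file",
    "arguments": { "path": "src/McpServer.Host/Program.cs" }
  }
}
```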

Safety model

Higher-risk capabilities are off by default.

  • Shell execution is disabled unless explicitly enabled and constrained by command policy.
  • SSH tooling is disabled unless named profiles are configured.
  • Web access is disabled unless enabled with an allowlist.
  • Workspace paths are normalized and validated against allowed roots before file access.
  • Destructive file operations go through dedicated policy checks and per-path locking.
  • Logs are written to logs/ so stdio MCP traffic stays clean.

See SECURITY.md and docs/workspace-configuration.md for the operational details.

Run locally

Restore and build the solution:

dotnet restore .\McpServer.slnx
dotnet build .\McpServer.slnx -c Release --no-restore -v minimal

Run the stdio host:

dotnet run --project .\src\McpServer.Host\McpServer.Host.csproj -c Release

If you want the server to operate on this repository instead of the application-owned default workspace, set an explicit root first:

$env:MCPSERVER__WORKSPACE__ROOTPATH = "D:\2026 Projects\McpServerRepo"
$env:MCPSERVER__WORKSPACE__ALLOWEDROOTS__0 = "D:\2026 Projects\McpServerRepo"

By default, Shell, Ssh, WebAccess, and Ollama are disabled in src/McpServer.Host/appsettings.json.
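A minimal sketch of what those feature flags might look like; the section and key names are inferred from the `McpServer:Ollama:Enabled` configuration path and the `MCPSERVER__*` environment variables, so check the shipped appsettings.json for the authoritative shape:

```json
{
  "McpServer": {
    "Shell": { "Enabled": false },
    "Ssh": { "Enabled": false },
    "WebAccess": { "Enabled": false },
    "Ollama": { "Enabled": false }
  }
}
```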

Client setup

LM Studio

The repository includes helper scripts for installing and validating LM Studio MCP configuration:

Typical install:

.\scripts\install-lmstudio-mcp.ps1 -WorkspaceRoot 'D:\2026 Projects\McpServerRepo'

VS Code / Copilot-style MCP clients

Install a user-level mcp.json entry with:

.\scripts\install-vscode-mcp.ps1

This avoids checking a machine-specific .mcp.json into the repository.
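The installed entry will look roughly like the sketch below; the server name and project path are illustrative placeholders, and the script writes the real values for your machine:

```json
{
  "servers": {
    "mcpserver-dotnet": {
      "type": "stdio",
      "command": "dotnet",
      "args": [
        "run",
        "--project",
        "path/to/McpServerDotNet/src/McpServer.Host/McpServer.Host.csproj",
        "-c",
        "Release"
      ]
    }
  }
}
```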

Local Ollama path

If you want local-model assistance through MCP, enable McpServer:Ollama and use the inference.local_* tools described below.

The repository includes dedicated setup documentation for this path.

Local inference workflows

When McpServer:Ollama:Enabled is turned on, the server exposes:

  • inference.local_status
  • inference.local_complete
  • inference.local_summarize
  • inference.local_code_review
  • inference.local_plan

The activity tools are designed to work with this model-assisted flow:

  1. Call workspace.status.
  2. Call activity.route with the user request.
  3. Optionally call activity.context.preview.
  4. Call activity.run to get a bounded context packet plus a structured-output schema.
  5. Send that payload to the local model yourself.

The server prepares the model call payload; it does not invoke the model for activity.run.
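Step 4 above corresponds to a standard MCP `tools/call` request; the argument name and request text below are illustrative assumptions:

```json
{
  "jsonrpc": "2.0",
  "id": 12,
  "method": "tools/call",
  "params": {
    "name": "activity.run",
    "arguments": { "request": "Summarize the recent changes in src/" }
  }
}
```

The result is the bounded context packet and schema; forwarding it to the local model remains the client's responsibility.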

Remote and web automation

When enabled in configuration, the server can expose:

  • shell.exec for local non-interactive commands.
  • ssh.execute and ssh.write_text for remote profile-based automation.
  • web.fetch_url and web.search for allowlisted HTTP/HTTPS retrieval.

Command-style tools prefer explicit command plus args input and reject shell-control operators in raw command text.
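A sketch of what such a call could look like, assuming a `command`/`args` argument shape (the exact argument names are not confirmed by this README); a raw command string containing operators like `&&` or `|` would be rejected under this policy:

```json
{
  "jsonrpc": "2.0",
  "id": 21,
  "method": "tools/call",
  "params": {
    "name": "shell.exec",
    "arguments": { "command": "dotnet", "args": ["--info"] }
  }
}
```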

Packaging

src/McpServer.Host is packable as a dotnet tool:

  • Package ID: McpServer.Host
  • Command name: mcpserver

Pack locally with:

dotnet pack .\src\McpServer.Host\McpServer.Host.csproj -c Release

Validation

The repo-level baseline is:

dotnet restore .\McpServer.slnx
dotnet build .\McpServer.slnx -c Release --no-restore -v minimal
dotnet test .\tests\McpServer.UnitTests\McpServer.UnitTests.csproj -c Release --no-build -v minimal
dotnet test .\tests\McpServer.IntegrationTests\McpServer.IntegrationTests.csproj -c Release --no-build -v minimal

There is also an inference smoke-test harness for exercising registered MCP tools through an OpenAI-compatible endpoint:

pwsh -File .\scripts\Invoke-InferenceToolSmokeTest.ps1

Documentation

Additional documentation lives in docs/, including docs/workspace-configuration.md.

License

This project is licensed under the MIT License. See LICENSE.

Quick setup

Installation guide for this server.

Install command (package not yet published):

git clone https://github.com/haxxornulled/McpServerDotNet

Manual install: see the README for detailed setup instructions and any additional dependencies.

Cursor configuration (mcp.json)

After cloning, point Cursor at the stdio host (adjust the project path for your checkout):

{
  "mcpServers": {
    "haxxornulled-mcpserverdotnet": {
      "command": "dotnet",
      "args": [
        "run",
        "--project",
        "path/to/McpServerDotNet/src/McpServer.Host/McpServer.Host.csproj",
        "-c",
        "Release"
      ]
    }
  }
}