# McpServerDotNet

Workspace-aware .NET 10 MCP server with filesystem, activity routing, optional shell, SSH, web, and local inference tools.
McpServerDotNet is a .NET 10 Model Context Protocol server for local developer automation. It exposes a workspace-aware filesystem surface, MCP resources and prompts, an activity-routing layer for Codex-style workflows, and optional execution, SSH, web, and Ollama-backed local inference tooling.
This is not a generic scaffold anymore. The repository already contains a working stdio MCP host, real tool/resource/prompt routing, safety gates around higher-risk operations, optional local Ollama integration, and setup scripts for clients such as LM Studio and VS Code.
## Who this is for
- Developers building or operating local MCP workflows from .NET.
- LM Studio or GitHub Copilot users who want a repo-aware local MCP server.
- Teams that want a Clean Architecture MCP host they can extend with their own tools, resources, prompts, and infrastructure adapters.
## Implemented capabilities

### Always available
- Stdio MCP host with JSON-RPC lifecycle handling for `initialize`, `ping`, `shutdown`, `exit`, tools, resources, and prompts.
- Workspace-aware filesystem tools for read, write, append, copy, move, delete, metadata, and directory listing.
- Workspace tools for root switching, project-folder switching, status, and bounded code-review snapshots.
- MCP resources for file text, directory listings, metadata, recursive tree snapshots, and recent workspace changes.
- MCP prompts for file summarization and grounded directory review.
- Activity tools that classify requests and build bounded local-model context packets without calling a model directly.
- Clean Architecture split across Host, Protocol, Application, Infrastructure, Contracts, and Domain projects.
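The lifecycle handling above can be sketched from the client side as JSON-RPC 2.0 messages written to the server's stdin. This is a minimal sketch, assuming newline-delimited framing; the `protocolVersion` string and `clientInfo` values are illustrative assumptions, while the method names come from the list above:

```python
import json

def make_request(request_id, method, params=None):
    """Build one JSON-RPC 2.0 request line for a newline-delimited stdio transport."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# Lifecycle methods named in this README: initialize, ping, shutdown, exit.
init_line = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # assumption: an MCP spec revision string
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
})
ping_line = make_request(2, "ping")
```

A client would write `init_line` first, read the response, then issue further requests such as `ping` or tool calls on the same stream.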
### Conditionally enabled

- `shell.exec` for non-interactive local command execution under an explicit allow/deny policy.
- `inference.local_*` tools for Ollama-backed summarization, planning, code review, and completion.
- `ssh.execute` and `ssh.write_text` for profile-based remote automation.
- `web.fetch_url` and `web.search` for allowlisted outbound web access.
### Client integration
- LM Studio installer and diagnostics scripts.
- VS Code user-level MCP config installer.
- Packable `dotnet` tool host via `McpServer.Host`.
- Optional Ollama-backed local inference path for summarization, planning, code review, and bounded completion.
## Solution layout
| Project | Responsibility |
| --- | --- |
| src/McpServer.Host | Host bootstrap, configuration, Autofac registrations, stdio loop, logging. |
| src/McpServer.Protocol | MCP lifecycle, session state, JSON-RPC dispatch, tool/resource/prompt routing. |
| src/McpServer.Application | Tool, resource, prompt, activity, and application abstractions. |
| src/McpServer.Infrastructure | Filesystem, path policy, process execution, Ollama, SSH, web, logging implementations. |
| src/McpServer.Contracts | MCP and JSON-RPC DTOs. |
| src/McpServer.Domain | Reserved domain layer for future business rules. |
| tests/* | Unit and stdio integration coverage. |
## Default tool surface

### Filesystem

- `fs.write_text`
- `fs.append_text`
- `fs.read_file`
- `fs.read_text`
- `fs.get_metadata`
- `fs.list_directory`
- `fs.create_directory`
- `fs.move_path`
- `fs.copy_path`
- `fs.delete_path`
### Workspace and review

- `workspace.set_root`
- `workspace.open`
- `workspace.select_folder`
- `workspace.status`
- `workspace.inspect`
### Activity orchestration

- `activity.route`
- `activity.schemas.list`
- `activity.context.preview`
- `activity.run`
### Resources

- `file:///workspace/...`
- `dir:///workspace/...`
- `filemeta:///workspace/...`
- `tree:///project`
- `tree:///workspace`
- `changes:///project`
- `changes:///workspace`
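For illustration, a client fetches one of these resources through the standard MCP `resources/read` method. A minimal request sketch follows; the concrete `README.md` URI below is hypothetical, while the `file:///workspace/...` scheme is from the list above:

```python
import json

def read_resource_request(request_id, uri):
    """Build an MCP resources/read request; the method takes a single 'uri' parameter."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    }

req = read_resource_request(7, "file:///workspace/README.md")  # hypothetical target
wire = json.dumps(req)
```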
### Prompts

- `prompt.summarize_file`
- `prompt.review_directory`
## Safety model
Higher-risk capabilities are off by default.
- Shell execution is disabled unless explicitly enabled and constrained by command policy.
- SSH tooling is disabled unless named profiles are configured.
- Web access is disabled unless enabled with an allowlist.
- Workspace paths are normalized and validated against allowed roots before file access.
- Destructive file operations go through dedicated policy checks and per-path locking.
- Logs are written to `logs/` so stdio MCP traffic stays clean.
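The path-validation rule above can be sketched in a few lines. This is an illustrative check mirroring the described policy, not the server's actual C# implementation; the `/srv/workspace` root is a made-up example:

```python
from pathlib import Path

def is_within_allowed_roots(candidate: str, allowed_roots: list[str]) -> bool:
    """Normalize a path (collapsing '..' segments) and require it to fall under an allowed root."""
    resolved = Path(candidate).resolve()
    for root in allowed_roots:
        try:
            resolved.relative_to(Path(root).resolve())
            return True
        except ValueError:
            continue  # not under this root; try the next one
    return False

roots = ["/srv/workspace"]  # hypothetical allowed root
```

The important property is that validation happens after normalization, so `..` traversal cannot escape an allowed root.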
See `SECURITY.md` and `docs/workspace-configuration.md` for the operational details.
## Run locally

Restore and build the solution:

```powershell
dotnet restore .\McpServer.slnx
dotnet build .\McpServer.slnx -c Release --no-restore -v minimal
```
Run the stdio host:

```powershell
dotnet run --project .\src\McpServer.Host\McpServer.Host.csproj -c Release
```
If you want the server to operate on this repository instead of the application-owned default workspace, set an explicit root first:

```powershell
$env:MCPSERVER__WORKSPACE__ROOTPATH = "D:\2026 Projects\McpServerRepo"
$env:MCPSERVER__WORKSPACE__ALLOWEDROOTS__0 = "D:\2026 Projects\McpServerRepo"
```
By default, `Shell`, `Ssh`, `WebAccess`, and `Ollama` are disabled in `src/McpServer.Host/appsettings.json`.
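Equivalently, the workspace root can be pinned in `src/McpServer.Host/appsettings.json`. The key hierarchy below is inferred from the environment-variable names above (standard .NET configuration maps `__` to the `:` section separator), so treat the exact shape as an assumption:

```json
{
  "McpServer": {
    "Workspace": {
      "RootPath": "D:\\2026 Projects\\McpServerRepo",
      "AllowedRoots": [
        "D:\\2026 Projects\\McpServerRepo"
      ]
    }
  }
}
```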
## Client setup

### LM Studio

The repository includes helper scripts for installing and validating LM Studio MCP configuration:

- `scripts/install-lmstudio-mcp.ps1`
- `scripts/Disable-LmStudioSandbox.ps1`
- `scripts/Enable-LmStudioAgentMode.ps1`
- `scripts/Test-LmStudioAgentMode.ps1`
- `scripts/Test-LmStudioWorkspaceAccess.ps1`
Typical install:

```powershell
.\scripts\install-lmstudio-mcp.ps1 -WorkspaceRoot 'D:\2026 Projects\McpServerRepo'
```
### VS Code / Copilot-style MCP clients

Install a user-level `mcp.json` entry with:

```powershell
.\scripts\install-vscode-mcp.ps1
```
This avoids checking a machine-specific `.mcp.json` into the repository.
### Local Ollama path

If you want local-model assistance through MCP, enable `McpServer:Ollama` and use the `inference.local_*` tools described below. Dedicated setup details are in the Ollama Local Inference Tools document listed under Documentation.
## Local inference workflows

When `McpServer:Ollama:Enabled` is turned on, the server exposes:

- `inference.local_status`
- `inference.local_complete`
- `inference.local_summarize`
- `inference.local_code_review`
- `inference.local_plan`
The activity tools are designed to work with this model-assisted flow:

1. Call `workspace.status`.
2. Call `activity.route` with the user request.
3. Optionally call `activity.context.preview`.
4. Call `activity.run` to get a bounded context packet plus a structured-output schema.
5. Send that payload to the local model yourself.
The server prepares the model call payload; it does not invoke the model for `activity.run`.
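The client-side handoff can be sketched as the sequence of MCP `tools/call` requests a client would send. The tool names come from this README; the `request` argument key and the empty argument objects are illustrative assumptions, not the server's documented schemas:

```python
import json

def tool_call(request_id, name, arguments):
    """Build an MCP tools/call request (the standard tool-invocation method)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

user_request = "Summarize recent changes in src/"
steps = [
    tool_call(1, "workspace.status", {}),
    tool_call(2, "activity.route", {"request": user_request}),    # 'request' key is hypothetical
    tool_call(3, "activity.context.preview", {}),                 # optional step
    tool_call(4, "activity.run", {"request": user_request}),      # yields context packet + schema
]
wire = "".join(json.dumps(s) + "\n" for s in steps)
```

The result of the final call is what the client forwards to the local model; the server itself stops at preparing that payload.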
## Remote and web automation

When enabled in configuration, the server can expose:

- `shell.exec` for local non-interactive commands.
- `ssh.execute` and `ssh.write_text` for remote profile-based automation.
- `web.fetch_url` and `web.search` for allowlisted HTTP/HTTPS retrieval.
Command-style tools prefer an explicit command plus an args list and reject shell-control operators in raw command text.
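A minimal sketch of that rejection rule, as an illustration of the described policy rather than the server's actual code (the exact operator set is an assumption):

```python
# Shell-control characters that must never appear in a raw command name,
# because they would only have meaning if the text were handed to a shell.
FORBIDDEN = set(";&|`<>$(){}")

def validate_command(command: str, args: list[str]) -> bool:
    """Accept only an explicit program name plus an args list; reject shell operators."""
    if any(ch in FORBIDDEN for ch in command):
        return False
    # args are passed through exec-style, never re-joined into a shell string
    return all("\n" not in a for a in args)
```

The safety comes from the calling convention: because arguments are passed as a list and never re-quoted into a shell string, operators in argument values stay inert.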
## Packaging

`src/McpServer.Host` is packable as a `dotnet` tool:

- Package ID: `McpServer.Host`
- Command name: `mcpserver`

Pack locally with:

```powershell
dotnet pack .\src\McpServer.Host\McpServer.Host.csproj -c Release
```
## Validation

The repo-level baseline is:

```powershell
dotnet restore .\McpServer.slnx
dotnet build .\McpServer.slnx -c Release --no-restore -v minimal
dotnet test .\tests\McpServer.UnitTests\McpServer.UnitTests.csproj -c Release --no-build -v minimal
dotnet test .\tests\McpServer.IntegrationTests\McpServer.IntegrationTests.csproj -c Release --no-build -v minimal
```
There is also an inference smoke-test harness for exercising registered MCP tools through an OpenAI-compatible endpoint:

```powershell
pwsh -File .\scripts\Invoke-InferenceToolSmokeTest.ps1
```
## Documentation
- Architecture
- Method Summary
- Workspace Configuration
- Ollama Local Inference Tools
- Codex / VS Code MCP Setup
- Changelog
## License

This project is licensed under the MIT License. See `LICENSE`.