Zerox.LSAI

Language Server for AI — Project Overview

What It Is

LSAI is an MCP server that provides token-efficient semantic code analysis to AI assistants. It wraps Roslyn (C#) and LSP servers (Python, TypeScript, JavaScript, Java) behind a unified plugin architecture, delivering rich code intelligence (search, usages, hierarchy, callers, outline, etc.) in compressed formats that use 82–86% fewer tokens than raw LSP JSON.
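As a rough illustration of where those savings come from, here is a hypothetical sketch (not LSAI's actual formatter API — the `GrepOutput` name is borrowed from the format list below, but the types and function are invented) of collapsing a verbose LSP response into one grep-style line per result:

```typescript
// Hypothetical sketch: turn a raw LSP "find references" payload into
// grep-style path:line:col lines. Types and function names are illustrative.

interface LspLocation {
  uri: string;
  range: {
    start: { line: number; character: number };
    end: { line: number; character: number };
  };
}

// Raw LSP JSON spends dozens of tokens per hit on structure alone.
const raw: LspLocation[] = [
  { uri: "file:///src/OrderService.cs",
    range: { start: { line: 41, character: 8 }, end: { line: 41, character: 19 } } },
  { uri: "file:///src/OrderController.cs",
    range: { start: { line: 12, character: 4 }, end: { line: 12, character: 15 } } },
];

// Grep-style output: one short line per hit. LSP positions are 0-based,
// so we shift to the 1-based convention grep users expect.
function toGrepOutput(locations: LspLocation[]): string {
  return locations
    .map((l) => `${l.uri.replace("file://", "")}:${l.range.start.line + 1}:${l.range.start.character + 1}`)
    .join("\n");
}

console.log(toGrepOutput(raw));
// /src/OrderService.cs:42:9
// /src/OrderController.cs:13:5
```

Dropping the JSON envelope and keeping only path, line, and column is the general mechanism behind the 82–86% token reduction.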

It includes a web management console (Angular + PrimeNG) for managing repositories, monitoring usage, and configuring workspaces. Everything is packaged in a single Docker image.

18 MCP Tools · 5 Languages · 6 Output Formats · 617 Tests

Architecture

graph LR
    AI["🤖 AI Assistant"] -->|"MCP Protocol\nHTTP"| Server
    User["🌐 Browser"] --> SPA

    subgraph LSAI["Zerox.LSAI Server — ASP.NET"]
        Server["MCP Host\n18 Tools + Formatters"]
        WS["Workspace\nSession Manager"]
        DB[("SQLite")]
        SPA["Angular Console\nPrimeNG"]
        Server --> WS
        Server --> DB
    end

    subgraph Plugins["Plugins — runtime loaded"]
        Roslyn["Plugin.Roslyn\nC# via Roslyn APIs"]
        LSP["Plugin.Lsp\nStreamJsonRpc Client"]
    end

    WS --> Roslyn
    WS --> LSP

    subgraph LSPServers["External LSP Servers — managed processes"]
        Pyright["Pyright\nPython"]
        TSLS["typescript-language-server\nTS + JS"]
        JDTLS["Eclipse JDTLS\nJava"]
    end

    LSP -->|"stdin/stdout"| Pyright
    LSP -->|"stdin/stdout"| TSLS
    LSP -->|"stdin/stdout"| JDTLS
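
For context on the left edge of the diagram, this is roughly what an MCP `tools/call` request over the HTTP transport looks like. This is a sketch: the `lsai_usages` tool name, its arguments, and the endpoint path are illustrative assumptions (the document only confirms the `lsai_*` naming via the `lsai_source` tool), not LSAI's actual API.

```typescript
// Sketch of an MCP "tools/call" request, which is JSON-RPC 2.0 over HTTP.
// Tool name, arguments, and endpoint below are hypothetical examples.

const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "lsai_usages", // hypothetical tool following the lsai_* prefix
    arguments: { symbol: "OrderService.Submit", format: "GrepOutput" },
  },
};

// Over the HTTP transport the assistant POSTs this as JSON, e.g.:
// await fetch("http://localhost:8080/mcp", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(request),
// });

console.log(JSON.stringify(request));
```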
  

Development Time Estimate

How long would it take a very senior .NET/Angular developer to build this from scratch, writing all the code by hand?

Breakdown

| # | Phase | Days | Notes |
|---|-------|------|-------|
| 0 | Research + POC Spike | 3 | MCP protocol spec, Roslyn APIs, LSP protocol, validate feasibility with a minimal spike |
| 1 | Environment + Scaffolding | 1 | .NET 10, Node.js, Python, Java SDKs, git repo, solution structure, Directory.Build.props, CI basics |
| 2 | Core Contracts | 2 | ILsaiPlugin, DTOs, enums, output format interfaces, operation params/results |
| 3 | Plugin Infrastructure | 3 | AssemblyLoadContext isolation, plugin discovery via deps.json convention, validation, DI registration |
| 4 | Server + MCP Transport | 3 | ASP.NET host, MCP SDK integration, 18 tool definitions, workspace session management, Kestrel config |
| 5 | Output Formatters | 2 | 6 formats (CompactText, CompactTextVerbose, TurboCompact, CompilerOutput, GrepOutput, LanguageSyntax) |
| 6 | Roslyn Plugin (C#) | 6 | MSBuild workspace loading, 11 operations. Roslyn API has sharp edges |
| 7 | LSP Bridge Plugin | 10 | StreamJsonRpc client, LSP handshake, capabilities, process lifecycle, 4 language configs. Hardest part |
| 8 | Web Console — Backend | 4 | SQLite + EF Core, migrations, RepositoryManager, UsageTracker, REST API |
| 9 | Web Console — Frontend | 5 | Angular 21, PrimeNG, Dashboard, Repositories page, Languages page, API client generation |
| 10 | Docker Packaging | 3 | 3-stage Dockerfile, all LSP servers, compose files, Kestrel binding, port strategy |
| 11 | Testing | 5 | 617 unit tests, E2E per language, token efficiency benchmarks, cross-platform validation |
| 12 | Bug Fixing + Polish | 4 | Path resolution, LSP edge cases, cross-platform issues, NTFS case sensitivity traps |
| | **TOTAL** | **51** | ~10.5 weeks (~2.5 months) |

Scenario Comparison

| Scenario | Multiplier | Total |
|----------|-----------|-------|
| Optimistic (everything clicks, no surprises) | 1x | ~51 working days (~2.5 months) |
| Realistic (LSP servers misbehave, rabbit holes, 61K lines by hand) | 1.8x | ~90 days (~4.5 months) |
| Pessimistic (OmniSharp deadlock, JDTLS quirks, cross-platform bugs) | 2.5x | ~125 days (~6 months) |
| Actual result with AI (coding assistant, ~95% AI-written) | 0.18x | 9 days |

Where The Time Really Goes

The LSP Bridge Plugin (Phase 7) dominates because every external server (Pyright, typescript-language-server, Eclipse JDTLS) has its own handshake, capability set, and process-lifecycle quirks, most of them undocumented, and each of the four language configurations has to be validated against a real running server.

The Roslyn Plugin (Phase 6) is the second time sink: MSBuild workspace loading is fragile, and the Roslyn API's sharp edges mean each of its 11 operations takes longer to get right than it appears.
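
To make the Phase 7 difficulty concrete, here is a minimal sketch of the base-protocol framing an LSP bridge must speak over each server's stdin/stdout. The `Content-Length` framing comes from the LSP specification; the `initialize` message contents below are illustrative, not LSAI's actual payload.

```typescript
// Frame a JSON-RPC message per the LSP base protocol: a Content-Length
// header (in bytes, not characters) followed by \r\n\r\n and the body.
function frame(message: object): string {
  const body = JSON.stringify(message);
  const length = new TextEncoder().encode(body).length; // byte length, per spec
  return `Content-Length: ${length}\r\n\r\n${body}`;
}

// Illustrative first message of the LSP handshake; the server replies
// (framed the same way on its stdout) with its capabilities.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: { processId: null, rootUri: "file:///repo", capabilities: {} },
};

console.log(frame(initialize));
```

Getting the framing right is the easy part; the ten-day estimate comes from everything layered on top of it — readiness detection, capability negotiation, and keeping child processes alive.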


Actual Development Timeline (with AI Coding Assistant)

The project was built with an AI coding assistant writing ~95% of the code, directed by a senior developer who designed the architecture and made all decisions. The human role: architecture, decision-making, prompting, reviewing AI output, and manual testing.

94 Commits · 440 Files · 61,712 Lines Added · 9 Working Days

Day-by-Day Breakdown

| Day | Commits | What Was Built | Hours | Phase (manual est.) |
|-----|---------|----------------|-------|---------------------|
| Day 1 (evening) | 9 | POC results, project scaffolding, core contracts, plugin loader | ~3–4h | Phases 0+1+2+3 |
| Day 2 | 26 | Output formatters, MCP server + 12 tools, Roslyn Tier 1 (7 ops), Roslyn Tier 2+3 (4 ops), HTTP transport, workspace sessions, E2E tests (24), publish/distribution, README, gap analysis | ~12–14h | Phases 3+4+5+6 + Features 8–11 |
| Day 3 | 10 | Bug fixes (search, diagnostics, deps), close all gaps, 3-tier output formats, OmniSharp research | ~8–10h | Phase 5 v2 + Feature 12 |
| Day 4 | 3 | LSP Bridge Plugin for Python (OmniSharp-based), 7 critical live-testing fixes | ~4–5h | Phase 7 start |
| Day 5 | 2 | Multi-language configs (TS, JS, Java), didOpen per file, language auto-detection | ~4–5h | Phase 7 continued |
| Day 6 | 4 | WSL/Linux cross-platform porting, native LSP servers, jdtls investigation, path resolution bugs | ~8h | Phase 7 + cross-platform |
| Day 7 | 5 | OmniSharp deadlock root cause found, StreamJsonRpc full rewrite (27 files, +2161/−1075 lines), deterministic readiness | ~10h | Phase 7 deepest debugging |
| Day 8 | 17 | LSP capabilities fix, E2E 15/15, token efficiency benchmarks (8 tests), Docker 3-stage build, deploy scripts | ~12h | Phase 7 finish + Phase 10 |
| Rest day | 0 | | 0h | |
| Day 9 | 18 | Project rename, lsai_source tool, REST API (controllers + services + 121 tests), SQLite/EF Core migration, Angular 21 scaffold, PrimeNG, Dashboard + Repositories + Languages pages, SPA hosting, Docker update | ~12–14h | Phases 8+9+10 |
| **TOTAL** | **94** | | **~73–80h** | **~9–10 working days** |

Estimate vs Reality

The comparison uses the realistic estimate (1.8x) as the baseline, since the optimistic 51-day figure assumes zero surprises, which never happens with undocumented LSP server quirks.

| Phase | Realistic estimate (manual) | Actual (with AI) | Speedup |
|-------|----------------------------|------------------|---------|
| Research + Scaffolding | ~7 days | ~0.3 days | 23x |
| Core Contracts | ~4 days | ~0.2 days | 20x |
| Plugin Infrastructure | ~5 days | ~0.2 days | 25x |
| Server + MCP Transport | ~5 days | ~0.3 days | 17x |
| Output Formatters | ~4 days | ~0.5 days | 8x |
| Roslyn Plugin (C#) | ~11 days | ~0.5 days | 22x |
| LSP Bridge Plugin | ~18 days | ~4.5 days | 4x |
| Web Console — Backend | ~7 days | ~0.5 days | 14x |
| Web Console — Frontend | ~9 days | ~0.5 days | 18x |
| Docker | ~5 days | ~0.5 days | 10x |
| Testing | ~9 days | distributed across days | n/a |
| Bug Fixing + Polish | ~7 days | ~1.5 days | 5x |
| **TOTAL** | **~90 days (~4.5 months)** | **9 days** | **10x** |

Key Observations

  1. Boilerplate-heavy phases (contracts, DI, server scaffold) = 10–15x speedup. AI excels at generating repetitive, pattern-based code. Core contracts + plugin loader + server host were done in a single evening + morning.
  2. LSP debugging = only a ~4x speedup, the lowest of any phase. This was the one area where AI couldn't shortcut the pain: each LSP server has undocumented quirks, and OmniSharp had an architectural deadlock that required reading library source code, forming hypotheses, and testing them. This phase alone consumed ~45% of total development time.
  3. The OmniSharp → StreamJsonRpc rewrite (27 files) happened in a single day. This is where AI coding shines: large-scale mechanical refactoring with clear patterns. A human would need 3–5 days for the same rewrite.
  4. Day 2 was superhuman. 26 commits covering 6+ estimated phases in one day. This kind of output is physically impossible without AI assistance — not because of typing speed, but because of the sheer volume of correct, tested code produced.
  5. Total human effort was ~73–80 hours over 9 days. The “human” work was primarily: architecture decisions, prompting, reviewing AI output, manual MCP testing, debugging LSP server behavior, and directing the AI to fix issues. Actual keystrokes-to-write-code was minimal.