Show HN: Mcp2cli – One CLI for every API, 96-99% fewer tokens than native MCP

78 points by @knowsuchagency | March 9th, 2026 at 5:18am

@jancurn

March 9th, 2026 at 8:53am

Cool, adding this to my list of MCP CLIs:

  - https://github.com/apify/mcpc
  - https://github.com/chrishayuk/mcp-cli
  - https://github.com/wong2/mcp-cli
  - https://github.com/f/mcptools
  - https://github.com/adhikasp/mcp-client-cli
  - https://github.com/thellimist/clihub
  - https://github.com/EstebanForge/mcp-cli-ent
  - https://github.com/knowsuchagency/mcp2cli
  - https://github.com/philschmid/mcp-cli
  - https://github.com/steipete/mcporter
  - https://github.com/mattzcarey/cloudflare-mcp
  - https://github.com/assimelha/cmcp

@Doublon

March 9th, 2026 at 7:54am

We had `curl`, HTTP and OpenAPI specs, but we created MCP. Now we're wrapping MCP into CLIs...

@acchow

March 9th, 2026 at 9:50am

> Every MCP server injects its full tool schemas into context on every turn

I consider this a bug. I'm sure the chat clients will fix this soon enough.

Something like: on each turn, a subagent searches available MCP tools for anything relevant. Usually, nothing helpful will be found and the regular chat continues without any MCP context added.
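A minimal sketch of that per-turn filtering, using simple keyword overlap as a stand-in for the subagent's relevance search (the tool names, descriptions, and `select_tools` helper are all invented for illustration; a real client would likely use embeddings or an actual subagent):

```python
# Illustrative sketch: score each tool summary against the user's message
# and inject only relevant ones, instead of every schema on every turn.
STOPWORDS = {"the", "a", "an", "and", "for", "to", "of", "in", "on", "is"}

# Hypothetical tool registry: name -> one-line description.
TOOLS = {
    "weather_forecast": "Get the temperature and rain forecast for a city.",
    "create_invoice": "Create a billing invoice for a customer account.",
    "search_code": "Search a code repository for a symbol or string.",
}

def select_tools(user_message: str, tools: dict[str, str]) -> dict[str, str]:
    """Keep only tools whose description shares content words with the message."""
    words = set(user_message.lower().split()) - STOPWORDS
    selected = {}
    for name, description in tools.items():
        if words & (set(description.lower().split()) - STOPWORDS):
            selected[name] = description
    return selected

# A weather question matches one tool; small talk matches none,
# so most turns add no MCP context at all.
print(select_tools("what's the rain forecast for Paris?", TOOLS))
print(select_tools("thanks, that's all for now", TOOLS))
```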

@rakamotog

March 9th, 2026 at 10:22am

For a typical B2B SaaS use case (non-technical employees), MCP is working great since it allows people to work in chat interfaces (ChatGPT, Claude). They will not move to terminal UXs anytime soon.

So I don't see why a typical productivity app would build a CLI rather than an MCP. Am I missing anything?

@stephantul

March 9th, 2026 at 7:15am

Tokens saved should not be your north star metric. You should be able to show that tool call performance is maintained while consuming fewer tokens. I have no idea whether that is the case here.

As an aside: this is a cool idea, but the prose in the readme and the above post seems to be fully generated, so who knows whether it is actually true.

@benvan

March 9th, 2026 at 8:08am

Nice project! I've been working on something very similar here https://github.com/max-hq/max

It works by schematising the upstream API, synchronising the data locally, and layering a common query language on top, so the longer-term goals are more about avoiding API limits and escaping the confines of the MCP query feature set, i.e. token savings on reading the data itself (in many cases, savings can be upwards of thousands of times fewer tokens).

Looking forward to trying this out!

@DieErde

March 9th, 2026 at 7:44am

Why is the concept of "MCP" needed at all? Wouldn't a single tool - web access - be enough? Then you can prompt:

    Tell me the hottest day in Paris in the
    coming 7 days. You can find useful tools
    at www.weatherforadventurers.com/tools
And then the tools URL can simply return a list of URLs in plain text, like

    /tool/forecast?city=berlin&day=2026-03-09 (Returns highest temp and rain probability for the given day in the given city)
each of which returns the data in plain text.
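The discovery flow above could be sketched roughly like this, parsing the hypothetical plain-text index format from the example (the URL format and `parse_tool_index` helper are illustrative, not a real service or library):

```python
import re

def parse_tool_index(text: str) -> list[tuple[str, str]]:
    """Parse lines like '/tool/forecast?city=... (Returns ...)' into
    (url, description) pairs an agent could reason over."""
    tools = []
    for line in text.splitlines():
        # URL is the first whitespace-free token; description sits in parens.
        match = re.match(r"\s*(\S+)\s+\((.+)\)\s*$", line)
        if match:
            tools.append((match.group(1), match.group(2)))
    return tools

# The plain-text index from the comment's example.
index = ("/tool/forecast?city=berlin&day=2026-03-09 "
         "(Returns highest temp and rain probability for the given day "
         "in the given city)")
print(parse_tool_index(index))
```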

What additional benefits does MCP bring to the table?

@tern

March 9th, 2026 at 8:43am

There are a handful of these. I've been using this one: https://github.com/smart-mcp-proxy/mcpproxy-go

@nwyin

March 9th, 2026 at 7:14am

cool!

anthropic mentions MCPs eating up context and solutions here: https://www.anthropic.com/engineering/code-execution-with-mc...

I built one specifically for Cognition's DeepWiki (https://crates.io/crates/dw2md) -- but it's rather narrow. Something more general like this clearly has more utility.

@Intermernet

March 9th, 2026 at 8:57am

I may be showing my ignorance here, but wouldn't the ideal situation be for the service to use the same number of tokens no matter what client sent the query?

If the service is using more tokens to produce the same output from the same query, but over a different protocol, then the service is a scam.

@ekianjo

March 9th, 2026 at 10:17am

Doubtful that a 16-token summary is the same as the JSON tool description that uses 10x more tokens. The JSON will describe parameters at greater length, and that probably has some positive impact on accuracy.

@jofzar

March 9th, 2026 at 8:11am

How is this the 5th one of these I have seen this week? Is everyone just trying to make the same thing?

@ejoubaud

March 9th, 2026 at 8:33am

How does this differ from mcporter? https://github.com/steipete/mcporter/

@philipp-gayret

March 9th, 2026 at 7:24am

Someone had to do it. MCP in Bash would make these composable, which I think is the strongest benefit for high-capability agents like Claude, Cursor and the like, which can write Bash better than I can. I haven't gotten into MCP since its early release because of the issues you named. Nice work!

@silverwind

March 9th, 2026 at 7:42am

How would the LLM exactly discover such unknown CLI commands?

@jkisiel

March 9th, 2026 at 8:01am

How is it different from 'mcporter', already included in e.g. openclaw?

@Ozzie_osman

March 9th, 2026 at 7:50am

I kind of feel like it might be better to go from CLI to MCP.

@tuananh

March 9th, 2026 at 7:53am

MCP just needs to add dynamic tool discovery and lazy-load the schemas; that would solve this token problem, right?
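One way that lazy loading could look, sketched in Python (the registry class, tool names, and schemas are made up for illustration; a real client would fetch the full schema from the MCP server on demand rather than build it locally):

```python
class LazyToolRegistry:
    """Context carries only one-line summaries; full schemas load on demand."""

    def __init__(self):
        # Short summaries: cheap enough to inject on every turn.
        self._summaries = {
            "forecast": "Daily high temperature and rain probability for a city",
        }
        # Full schemas: fetched just-in-time, never injected wholesale.
        self._schema_cache: dict[str, dict] = {}

    def list_tools(self) -> dict[str, str]:
        """What the model sees each turn: names and summaries only."""
        return dict(self._summaries)

    def get_schema(self, name: str) -> dict:
        """Load the full parameter schema only when a call is imminent."""
        if name not in self._schema_cache:
            # Stand-in for a round trip to the MCP server.
            self._schema_cache[name] = {
                "name": name,
                "parameters": {
                    "city": {"type": "string"},
                    "day": {"type": "string", "format": "date"},
                },
            }
        return self._schema_cache[name]

registry = LazyToolRegistry()
print(registry.list_tools())            # summaries only, every turn
print(registry.get_schema("forecast"))  # full schema only when needed
```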

@rvz

March 9th, 2026 at 7:51am

MCP itself is a flawed standard to begin with, as I said before [0], and it wraps around an API from the start.

You might as well directly create a CLI tool that works with the AI agents which does an API call to the service anyway.

[0] https://news.ycombinator.com/item?id=44479406


@liminal-dev

March 9th, 2026 at 7:31am

This post and the project README are obviously generated slop, which personally makes me completely skip the project altogether, even if it works.

If you want humans to spend time reading your prose, then spend time actually writing it.