cahaya11 hours ago
Nice, but can somebody tell me whether this performs better than my simple Postgres MCP setup using npx? My current setup has the LLM search my local Postgres in multiple steps. I guess this Pgmcp does those steps in the background and returns only the final result to the LLM calling the MCP tool?
Codex:

```toml
[mcp_servers.postgresMCP]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-postgres", "postgresql://user:password@localhost:5432/db"]
```

Cursor:

```json
"postgresMCP": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-postgres",
    "postgresql://user:password@localhost:5432/db"
  ]
},
```
With my setup I can easily switch between LLMs.
oulipo210 hours ago
nice! is there a way for the agent to know about its own queries / resource usage?
e.g. the agent could actively monitor the memory/CPU/time usage of a query and cancel it if it's taking too long?
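One way a server could enforce this (a sketch of the general pattern, not pgmcp's actual behavior; the function name `run_with_budget` is made up): run each query under a wall-clock budget and give up on overrun. Against a real Postgres backend you would instead `SET statement_timeout` on the session, or poll `pg_stat_activity` and call `pg_cancel_backend(pid)`.

```python
import concurrent.futures
import time

def run_with_budget(query_fn, timeout_s):
    """Run a query function in a worker thread; abandon it if it
    exceeds the time budget. (Real servers must also kill the
    backend query, e.g. via pg_cancel_backend.)"""
    ex = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    fut = ex.submit(query_fn)
    try:
        return ("ok", fut.result(timeout=timeout_s))
    except concurrent.futures.TimeoutError:
        fut.cancel()  # best effort; the worker may still be running
        return ("timeout", None)
    finally:
        ex.shutdown(wait=False)

print(run_with_budget(lambda: 2 + 2, 1.0))           # ("ok", 4)
print(run_with_budget(lambda: time.sleep(3), 0.1))   # ("timeout", None)
```

Memory/CPU accounting would need server-side hooks (e.g. `pg_stat_activity` plus OS-level metrics), which a plain timeout does not give you.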
coderinsan3 hours ago
How does this protect against lethal-trifecta attacks like the ones described at tramlines.io/blog?
freakynit19 hours ago
Shameless plug: I built a desktop app that does the exact same thing, but with any data file you throw at it. CSV, JSON, Excel, and Parquet are all supported. And processing happens locally, without your files being uploaded to an LLM provider.
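The local-processing idea can be sketched in a few lines (this is an illustration of the pattern, not the app's implementation; the file contents and table name here are made up): load a CSV into an in-memory SQLite database and run SQL over it, so the raw file never leaves the machine and only the small result would be handed to an LLM.

```python
import csv
import io
import sqlite3

# Stand-in for a local CSV file; in practice you would open(path).
csv_data = io.StringIO("city,pop\nOslo,700000\nBergen,290000\n")

# Load the rows into an in-memory SQLite table -- nothing is uploaded.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (city TEXT, pop INTEGER)")
rows = [(r["city"], int(r["pop"])) for r in csv.DictReader(csv_data)]
conn.executemany("INSERT INTO cities VALUES (?, ?)", rows)

# Query locally; only this result would ever be shared.
result = conn.execute("SELECT city FROM cities WHERE pop > 500000").fetchall()
print(result)  # [('Oslo',)]
```

Excel and Parquet need third-party readers (e.g. openpyxl, pyarrow), but the same load-locally, query-locally shape applies.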
mistrial9a day ago
recently posted https://news.ycombinator.com/item?id=43520953
foskop17 hours ago
Different project