Hacker News

__natty__
PgAdmin 4 9.13 with AI Assistant Panel pgadmin.org

panzia · an hour ago

Yeah, no thanks. I switched to dbeaver already anyway, because pgadmin was picky about which postgres versions it could connect to. It was too much of a hassle to set up a new version from source back when I tried. With dbeaver I just run ./dbeaver from the extracted .tar.gz. dbeaver is also not a web interface, but a real desktop application (Java, though).

chaz6 · 3 hours ago

When I got the update I looked through the settings and there appears to be no way to disable it. I do not want AI anywhere near my database. At least I only use it for testing/staging, so I hopefully won't have to worry about it wrecking production.

ziml77 · 3 hours ago

What's the danger? It can see the schemas to help it generate the queries but it can't run anything on its own. Also you have to give the application credentials to an AI provider for the feature to work. So, you can just not do that.

adamas · 3 hours ago

You don't need a list of potential dangers to not want non-deterministic features in an application.

imjared · 3 hours ago

The docs suggest that you can set the default provider to "None" to disable AI features: https://www.pgadmin.org/docs/pgadmin4/9.13/preferences.html#...

smartbit · 3 hours ago

Note: AI features must be enabled in the server configuration

  LLM_ENABLED = True 
in config.py for these preferences to be available.
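For reference, pgAdmin reads overrides from a `config_local.py` alongside `config.py`, so a server-side hard-disable could look like this (the flag name is taken from the comment above; treat it as a sketch, not verified against every version):

```python
# config_local.py -- pgAdmin loads this after config.py, so settings
# here override the defaults without editing config.py itself.

# Disable the AI assistant server-wide (flag name per the parent comment).
LLM_ENABLED = False
```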

OptionOfT · an hour ago

I did not enable this and yet I got the panel in the UI.

zenmac · 3 hours ago

It is nice that they have the default set to "None". However, having this feature in pgAdmin at all is a distraction from the project.

If it is just calling an API anyway, then I don't want to have this in my db admin tool. It also exposes more surface area for potential data leakage.

bensyverson · 3 hours ago

Worth pointing out that Postgres is perfectly usable without an admin dashboard at all.

[deleted] · 3 hours ago

lateforwork · 32 minutes ago

Did you miss this:

"This feature requires an AI provider to be configured in Preferences > AI."

And then you have to supply an API key (see here https://www.pgedge.com/blog/ai-features-in-pgadmin-configura... )

You don't get AI for free!

[deleted] · 3 hours ago

rubicon33 · 2 hours ago

What do you do in production?

vavkamil · 3 hours ago

Quick fix based on https://github.com/pgadmin-org/pgadmin4/issues/9696#issuecom...

Click on the "Reset layout" button in the query tool (located in the top right corner), and it will move the "AI Assistant" tab to the right. Now, when you query a table, it will default to the Query tab as always.

msavara · an hour ago

No thank you. One of the worst ads for python that exists. The only one worse than pgAdmin is Windows 11.

aitchnyu · 3 hours ago

Might as well get to choose our own AI subscription for our tools. I always hated the sparkle icons in MongoDB Compass (db browsing tool), CloudWatch (logs), etc., which are wired to a useless model. So I always chose to write Python scripts to query Postgres and other DBs and render pretty tables to the CLI.
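A minimal version of that script-based approach might look like this (the table formatter is plain Python; the connection string, query, and `psycopg2` usage in the main block are illustrative assumptions):

```python
def format_table(headers, rows):
    """Render rows as a simple aligned ASCII table."""
    cells = [headers] + [[str(c) for c in row] for row in rows]
    widths = [max(len(r[i]) for r in cells) for i in range(len(headers))]
    line = lambda r: " | ".join(c.ljust(w) for c, w in zip(r, widths))
    sep = "-+-".join("-" * w for w in widths)
    return "\n".join([line(headers), sep] + [line(r) for r in cells[1:]])

if __name__ == "__main__":
    # Illustrative only: dsn and query are placeholders for your own setup.
    import psycopg2  # third-party Postgres driver, assumed installed
    conn = psycopg2.connect("dbname=mydb user=me")
    with conn.cursor() as cur:
        cur.execute("SELECT id, email FROM users LIMIT 10")
        headers = [d[0] for d in cur.description]
        print(format_table(headers, cur.fetchall()))
```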

zbentley · 2 hours ago

Eh, as someone generally on the skeptical end of the spectrum for a lot of AI-assisted ops tasks, exploratory query generation is a great use case for it.

I’m highly proficient in code, only average at SQL, and am routinely tasked to answer one-off questions or prototype reporting queries against highly complex schemas of thousands of tables (owned by multiple teams and changing all the time, with wildly insufficient shared DAO libraries or code APIs for constructing novel queries). My skill breakdown and situation aren’t optimal, certainly, but they aren’t uncommon either.

In that context, being able to ask "write a query that returns the last ten addresses of each of the highest-spending customers, but only if those addresses are in the historical shipment system and are residences, not businesses" is a huge time-saver. Sure, I could figure out the schemas of the ten tables involved in those queries and write those joins by hand, slowly. But that would take time and, depending on how the data changes, the approach might get stale fast.

stuaxo · 2 hours ago

If I can use this with a local LLM it could be useful.

zbentley · 2 hours ago

Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.
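A sketch of what that could look like against a local model (the Ollama endpoint, model name, and prompt wording are all assumptions; only the prompt builder is shown in full):

```python
import json
import urllib.request

def build_prompt(schema_ddl: str, question: str) -> str:
    """Pack the schema DDL and a natural-language question into one prompt."""
    return (
        "You are a PostgreSQL assistant. Given this schema:\n\n"
        f"{schema_ddl}\n\n"
        f"Write a single SQL query answering: {question}\n"
        "Return only SQL."
    )

def ask_local_model(prompt: str, model: str = "qwen2.5-coder:3b") -> str:
    # Assumes an Ollama server on its default local port; both the port
    # and the model name are placeholders for whatever you run locally.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```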

[deleted] · 3 hours ago

naranha · 2 hours ago

The only interface that works efficiently for me with LLMs is the chatbot interface. I'd rather copy and paste snippets into the chat box than have IDEs and other tools guess what I might want to ask the AI.

The first thing I do with these integrations is look for how I can remove them.
