Engineering

We built a PlanetScale plugin for ChatGPT

Have a conversation with your database

This past weekend, we were fortunate to attend the Cerebral Valley OpenAI hackathon at Shack15, where we were challenged to build a ChatGPT plugin in 24 hours using OpenAI's new plugin architecture.

We are PlanetScale users, so we decided to build a plugin that talks to a PlanetScale database. The flow we wanted for the plugin was:

1. Ask a question.

2. Have ChatGPT interpret the question and generate a SQL query.

3. Retrieve the results of the query and interpret them.

After some pondering and experimentation, we found the plugin architecture extremely powerful. This post aims to speed-run through that initial thinking toward a working model of how ChatGPT plugins work and how you can build for them.

Selecting the PlanetScale plugin in the ChatGPT UI

The plugin architecture uses a well-known JSON file (served at /.well-known/ai-plugin.json) to define a path to an OpenAPI definition. As the plugin developer, you control the OpenAPI file; this file describes your API to GPT. The methods you describe here are entirely up to you. While this may seem counterintuitive at first, ultimately, you can behave just as you would when sharing an OpenAPI descriptor with a human developer. Whatever you put in this file is then made available to GPT.

What you are doing is teaching ChatGPT to use your API. It's the same as sending another developer your OpenAPI file, but in this case, it's an AI.
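
To make that concrete, here is a minimal sketch of serving the well-known file. The field names follow OpenAI's plugin manifest format; the FastAPI framework and every URL below are illustrative assumptions, not necessarily what our repo does.

```python
# Hedged sketch: serve the well-known manifest that ChatGPT crawls.
# The framework (FastAPI) and all URLs here are placeholders.
from fastapi import FastAPI

app = FastAPI()

@app.get("/.well-known/ai-plugin.json")
def plugin_manifest() -> dict:
    return {
        "schema_version": "v1",
        "name_for_human": "PlanetScale",
        "name_for_model": "planetscale",
        "description_for_human": "Have a conversation with a PlanetScale database.",
        "description_for_model": (
            "Answer questions about a PlanetScale MySQL database. "
            "Generate MySQL-dialect SQL, send it to the query endpoint, "
            "and interpret the rows that come back for the user."
        ),
        "auth": {"type": "none"},
        # ChatGPT follows this link to learn the API itself.
        "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
        "logo_url": "https://example.com/logo.png",
        "contact_email": "support@example.com",
        "legal_info_url": "https://example.com/legal",
    }
```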

In the following screenshots, ChatGPT uses our OpenAPI file, which describes PlanetScale methods to access and control a database. We hooked it up to a sample database with just a couple of tables:

We didn't tell it anything about this database. It only knows that it needs to use MySQL dialect queries.
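
That MySQL hint lives entirely in the OpenAPI description of the query endpoint. Continuing the FastAPI sketch above (FastAPI generates the OpenAPI file from this route metadata), something like the following would be enough; the exact wording in our spec may differ:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()  # in the real service this is the same app as in the manifest sketch

class QueryRequest(BaseModel):
    query: str  # SQL generated by ChatGPT from the user's question

@app.post(
    "/query",
    summary="Run a SQL query against the connected PlanetScale database",
    description=(
        "Accepts a single SQL statement written in MySQL dialect and "
        "returns the matching rows as JSON. No schema is provided up front."
    ),
)
def run_query(body: QueryRequest) -> dict:
    # Execution against PlanetScale is sketched further down.
    return {"rows": []}
```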

This was a big a-ha moment for us. Once we described the result set to ChatGPT in a general form, it could translate that result set back into natural language.

The same strength that makes LLMs so good at generating code works in reverse: they are a fantastic tool for describing data, and they can do it with no up-front concrete knowledge of the schema.

Under the hood, it's as if you had Copilot write a SQL query from your natural language. Our plugin executes that query, assembles the result, and returns it to you in that same natural language.
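
Here is a hedged sketch of that execution step. PlanetScale speaks the MySQL wire protocol, so any MySQL driver will do; the pymysql driver, the environment variable names, and the CA path below are assumptions rather than exactly what our repo uses.

```python
import os
import pymysql

def execute_query(sql: str) -> list[dict]:
    # Connect to PlanetScale over TLS using ordinary MySQL credentials.
    conn = pymysql.connect(
        host=os.environ["PLANETSCALE_HOST"],
        user=os.environ["PLANETSCALE_USER"],
        password=os.environ["PLANETSCALE_PASSWORD"],
        database=os.environ["PLANETSCALE_DATABASE"],
        ssl={"ca": "/etc/ssl/cert.pem"},  # CA bundle path is system-dependent
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            return list(cur.fetchall())  # one dict per row, keyed by column name
    finally:
        conn.close()
```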

At this point, we were itching to test the plugin with actual data. So we loaded in IMDb:

ChatGPT querying IMDb. Again, no prior knowledge of schema.

It picked it up without missing a beat.

Now, it shouldn’t surprise anyone that more complex queries were challenging for the AI. Even in the face of query failure, though, what it did next left us speechless:

ChatGPT recovering from a query error.

Here, you see the bot recovering intelligently from a query failure. It notices that a failure occurred and then reasons its way through to a solution anyway.

Because the bot interprets the results rather than just returning them raw, it can evaluate whether the returned data actually matches the original question. If the data did not match what we were looking for, it would try rewriting the query.
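
That recovery loop works because the plugin doesn't have to be clever about failures: it can simply hand the MySQL error text back to ChatGPT as ordinary response content and let the model reason about it. A hedged sketch of that wrapper, reusing execute_query from the snippet above:

```python
import pymysql

def run_with_feedback(sql: str) -> dict:
    try:
        return {"rows": execute_query(sql)}
    except pymysql.MySQLError as err:
        # Returning the error as data (rather than a bare 500) lets the model
        # read the message, rewrite the query, and call the endpoint again.
        return {"error": str(err)}
```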

On top of these casually paradigm-shifting capabilities, the bot was even polite in refusing to write to the database, a restraint we had taught it out of an abundance of caution. Although it declines to run write queries (as instructed), it still offers a helpful auto-generated query in the proper dialect, ready for a simple copy-and-paste:

ChatGPT suggesting a write query, because it was told it can't write.
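
We taught it that restraint through the plugin description, and a server-side check is the natural backstop. Here is a hedged sketch of what such a guard could look like; whether our repo enforces it exactly this way is an implementation detail that may differ.

```python
# Hedged sketch of a read-only guard: refuse to execute writes, but echo the
# generated SQL back so the user can copy-and-paste it if they really want it.
READ_ONLY_PREFIXES = ("select", "show", "describe", "explain")

def guard_readonly(sql: str) -> dict | None:
    statement = sql.strip().lower()
    if statement.startswith(READ_ONLY_PREFIXES):
        return None  # safe to execute
    return {
        "error": "Write queries are disabled for this plugin.",
        "suggested_query": sql,
    }
```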

Another powerful use was asking the bot to write about the data. It would retrieve results from the database and interleave them with prose.

We then asked the bot to get really nerdy and write data models in various languages. Competent with code, GPT-3.5 dutifully complied:

ChatGPT generates a Hibernate model from a table

This works with nearly any language, just as Copilot does:

ChatGPT: Appreciator of Ruby and Java

Useful. But let's take it a step further. We asked GPT to create an entire CRUD app from our database, and it did:

AI is here

What should be evident by now is that we are witnessing a tectonic shift in AI technology. As software engineers, we already work daily with tools like Copilot, and these tools have quickly become invaluable for sustaining an innovation velocity that defies market downturns and stands in defense of the best of humanity. The act of asking questions and exploring the database through text felt playful. Imagine the insights you could find in your data if your feedback loop were fun and conversational.

As great as that is, the sheer magnitude of creative time and energy waiting to be freed up outside of software engineering dramatically outweighs any other technical optimization that any of us in the room this weekend had yet seen in our lives.

It's open source

The repo for the plugin is open source. You can find it on my GitHub.

Using the plugin

ChatGPT plugins are added by domain. The service crawls the well-known plugin file and adds the plugin to your chat session. To use the PlanetScale plugin, just drop in the domain "planescale.ai".

The plugin is hard-coded against a sample database for the moment, but we will be adding login via PlanetScale's OAuth API soon. File an issue on the repo if you want to help or if you need it for your own database!

Stack

We were able to build everything you see above in about 12 hours, using:

Special thanks

We want to extend a warm thank-you to our hosts: Jorn at Shack15, Ivan and the team at Cerebral Valley, OpenAI, the volunteers from this weekend, and our teammates (Felipe Recalde, Ryan Lewis, Barrett Williams, and Damien O'Hara and Tyler Porras, who both advised!).
