Push it to the Limit EP. 1—GPT Engineer
Integrating a large, complex database on Supabase with GPT Engineer
Disclaimer: I have not been paid to write this post, nor have I received any financial incentive. In this series I pay for all the products with my own money to provide a neutral and honest evaluation.
I’m starting a series called Push it to the Limit, where I evaluate generative AI coding tools and share techniques and strategies for unlocking a new way to develop software. Along the way I will share the lessons I learn, my prompts, and the approaches that work.
We start with GPT Engineer (https://gptengineer.app/) by Lovable Labs. Over the past few weeks I’ve used it extensively to generate several front-end React codebases with good results, and I’m struck by the obvious implications this tool will have for front-end positions in the near future. With only a minimal working knowledge of React, I’ve been able to generate functional, clean React code using nothing but prompts, pulling off in a matter of hours something that would otherwise have taken me days if not weeks. In fact, no working knowledge of React or coding at all is needed to achieve the same results.
Naturally curious and impatient, I will cover the basics of using GPT Engineer in another post and instead dive right into a current problem of mine: marrying my PostgreSQL database hosted on Supabase to my GPT Engineer-generated front-end code.
For very basic use cases, like CRUD operations on tables with RLS turned off, integrating with GPT Engineer was straightforward by following the documentation. I’ve also discovered the ability to generate a Supabase Auth UI that makes sign-up and log-in a breeze; I will revisit this in another post, as it is very exciting.
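To give a sense of what this looks like in practice, here is a minimal sketch of the kind of supabase-js CRUD calls the generated front-end ends up making. The projects table, its columns, and the credentials are hypothetical placeholders; the URL and anon key come from your own Supabase project settings.

```typescript
import { createClient } from '@supabase/supabase-js'

// Hypothetical project URL and anon key; use your own from the Supabase dashboard.
const supabase = createClient('https://your-project.supabase.co', 'your-anon-key')

async function crudExamples() {
  // Read all rows from a hypothetical "projects" table
  const { data: projects, error } = await supabase.from('projects').select('*')

  // Insert a new row and return the inserted record
  const { data: created } = await supabase
    .from('projects')
    .insert({ title: 'New project' })
    .select()

  // Update a row by id
  await supabase.from('projects').update({ title: 'Renamed project' }).eq('id', 1)

  // Delete a row by id
  await supabase.from('projects').delete().eq('id', 1)
}
```

With RLS off, these calls work with just the anon key, which is why the simple cases integrate so painlessly.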
In general, integrating your existing Supabase database with GPT Engineer is not only possible but easy, with one caveat: the more complex and numerous your tables, the more challenging it becomes and the more creative you have to be with your prompts, as GPT Engineer is still early in its capabilities.
In particular, the database I’m currently using, for a freelance website I’m building, has roughly 40~50 tables with various relationships. While it was easy to integrate a database with only a dozen tables via the GPT Engineer UI, it is currently not possible to push anything bigger or more complex.
I took a look at how the Supabase integration with GPT Engineer works, and it’s very clever. Supabase automatically generates an API via PostgREST (https://docs.postgrest.org/en/v12/), and GPT Engineer generates Supabase client code that connects to it. There appears to be a context limit that is hit as the complexity and depth of the database grow.
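As an illustration, the generated client code leans on PostgREST’s ability to embed related rows in a single request. Here is a sketch, assuming hypothetical projects and clients tables linked by a foreign key, of what such a query looks like:

```typescript
import { createClient } from '@supabase/supabase-js'

// Same hypothetical client setup as in the earlier sketch.
const supabase = createClient('https://your-project.supabase.co', 'your-anon-key')

async function fetchProjectsWithClients() {
  // One request: PostgREST embeds the related rows behind the clients(...) syntax,
  // following the foreign key from projects to clients.
  const { data, error } = await supabase
    .from('projects')
    .select('id, title, status, clients(name, email)')
    .order('created_at', { ascending: false })

  if (error) throw error
  return data
}
```

Every table, column, and relationship the front-end touches has to be described to the model so it can emit queries like this, which is presumably where a large, deeply related schema runs into the context limit.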
Now I begin the task of circumventing this limitation by getting creative with prompts. In Part 2, I will lay out my approach and report the results!
We are at the bleeding edge of a new way to develop software, and I’m excited to bring you along on the journey with me!