Notion to Supabase Database Migration Writeup

Yogi Salomo Manguntang Pratama

I am currently building a toy project, lab-scholarships.kr, a website that aggregates scholarship information from university laboratories in Korea. Because the project is still in the MVP/PoC phase, I used Notion as its database. I picked Notion because it provides a CRUD API out of the box, has a convenient interface (Notion is, after all, a user-facing platform), and I don't need to maintain any database instances myself.

However, this approach is not really suitable for functionality beyond simply viewing the data. The main limitation I can think of is the lack of foreign keys and indexes in Notion's database feature. Since I plan to develop the toy project further, I needed to migrate the database to a more scalable option.

After some consideration, I decided to go with Supabase. Supabase provides a hosted PostgreSQL database with built-in tools to manage the data. Aside from the database, Supabase also offers other services such as authentication and Lambda-like serverless functions, which I don't need yet but I'm sure will come in handy later in the project's development. And the most important reason: Supabase has a generous free plan!

Because the website is already operating with some production data, I needed to migrate that data to the new Supabase PostgreSQL database. In this blog post I share my experience in doing so.

Steps to Migrate the Database

The steps I took to migrate the database from Notion to Supabase were actually very simple:

  1. Export the Notion Database as a CSV file

Notion lets you export a database to a CSV file: click the ellipsis (...) button at the top right of your database's view page and select Export.

notion-export.png

  2. Import the CSV file to the database

After that, you can import the CSV file you received from Notion into your Supabase database: go to the view page of the table you want to import into, click the green Insert button, and select Import data from CSV.

supabase-import.png

The tricky part was the adjustments I needed to make so that the data from Notion would be compatible with the SQL database before importing it. The adjustments were as follows:

For enum data I used Notion's select, because it makes it easy to recall values that have been set before. However, when exported, these values are converted to plain strings, while I want them stored as an integer type so I can manage the enumeration in the server code. So I converted the data with a simple =SWITCH function in a spreadsheet.

spreadsheet-switch.png
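As a sketch of that conversion in code (the status labels and integer codes here are made up, not the site's real values), the =SWITCH logic is just a lookup table:

```typescript
// Hypothetical select labels and their integer codes; the real
// column and values on lab-scholarships.kr differ.
const STATUS_CODES: Record<string, number> = {
  Open: 0,
  Closed: 1,
  Upcoming: 2,
};

// Mirrors the spreadsheet =SWITCH: map a Notion select label
// to the integer code used by the server code.
function toEnumCode(label: string): number {
  const code = STATUS_CODES[label.trim()];
  if (code === undefined) {
    throw new Error(`Unknown select value: ${label}`);
  }
  return code;
}
```

Failing loudly on an unknown label is deliberate: a silent default would let a typo in the Notion data slip into the migrated rows unnoticed.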

I used Notion's multi-select for fields that have multiple values, like a lab's research fields. However, when exported, the multi-select values are emitted as-is, i.e. Field A, Field B, while Supabase needs the array brackets for the values to be valid, i.e. [Field A, Field B]. For this one I just added the brackets manually to the CSV before importing, although looking back, prepending and appending those characters with a spreadsheet or a simple script would have been trivial and I should have done so.
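A minimal sketch of that bracket-adding step as a script, assuming the multi-select column arrives as a plain comma-separated string like the export described above:

```typescript
// Wrap an exported Notion multi-select cell ("Field A, Field B")
// in the bracketed array form ("[Field A, Field B]") expected
// by the Supabase CSV importer for the array column.
function toArrayLiteral(cell: string): string {
  const trimmed = cell.trim();
  if (trimmed === "") return "[]"; // empty multi-select -> empty array
  if (trimmed.startsWith("[")) return trimmed; // already converted
  return `[${trimmed}]`;
}
```

In a real run you would map this function over the multi-select column of each parsed CSV row before writing the file back out.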

Moreover, because I want all existing links to keep working (the URL format is /lab/:id for a lab's detail page), I overrode the IDs that were automatically generated by the PostgreSQL database. Again, there might have been a smarter way to do it, but I couldn't be bothered, so I just retrieved the old IDs manually and copy-pasted them one by one.
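A hedged sketch of what a script for that step could have looked like, assuming (hypothetically) that each CSV row has a name column to match on and that the old export carries the original id:

```typescript
// A CSV row parsed into a column-name -> value map.
type Row = Record<string, string>;

// Match new rows to old rows by the (assumed) "name" column and
// carry the old "id" over, instead of copy-pasting by hand.
function carryOverIds(newRows: Row[], oldRows: Row[]): Row[] {
  const oldIdByName = new Map(
    oldRows.map((r) => [r.name, r.id] as [string, string])
  );
  return newRows.map((row) => {
    const oldId = oldIdByName.get(row.name);
    if (oldId === undefined) {
      throw new Error(`No old ID found for "${row.name}"`);
    }
    return { ...row, id: oldId };
  });
}
```

One caveat with explicit IDs: after inserting them, PostgreSQL's ID sequence still needs to be advanced past the highest imported value, or the next auto-generated insert can collide.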

Supabase Initial Configuration

After the data was successfully migrated to Supabase, it was not immediately accessible by the website's server. Several hours of troubleshooting taught me that you have to do some extra configuration when using a schema other than public on Supabase.

I created a new schema for the website's database because, in my understanding, that's the best practice, and because I plan to use the same database for another project (the free plan limits you to one project/database). Custom schemas need to be configured before you can access them from the client. The configuration I had to do was as follows:

By default, schemas other than public are not exposed to Supabase's API, so you have to expose them first by adding your schema's name to the Exposed schemas list. The form is accessible in Project Settings > Configuration > API > API Settings > Exposed schemas, and this is how it should look after you have added your schema:

exposed-schema.png

Moreover, Supabase has a very restrictive default policy for tables in schemas other than public: by default, no table is accessible without authentication. This is a very good security practice, but it might not be intuitive if you haven't managed an SQL database at this level.

Since the scholarship data needs to be accessible by the public (non-logged-in users), I needed to create a policy that allows public access. The policy can be created in Authentication > Configuration > Policies > Create Policy, and here's a sample value for public access on SELECT queries:

create policy "$POLICY_NAME"
on "$SCHEMA_NAME"."$TABLE_NAME"
as PERMISSIVE
for SELECT
to public
using (
  true
);

After doing the steps above, your Supabase client should be able to access your custom schema. However, you need to provide your schema's name when initializing the client, which you can do by passing the following option to the createClient function:

import { createClient } from "@supabase/supabase-js";

const supabase = createClient(SUPABASE_URL, SUPABASE_KEY, {
  db: { schema: "YOUR_SCHEMA_NAME" },
});

Closing

The migration was a really interesting experience. In the process I relearned some SQL concepts that I hadn't dealt with in a while and learned more about Supabase's features. Supabase has been convincing so far; I might use it again for another project and might even go for the paid plan, as opposed to managing my own cloud environment for personal projects.

The migration process was a success and the website is operating normally with the new database. Hopefully this can be a motivation and momentum for me to continue building lab-scholarships.kr!