This guide is the answer to one question: "Can I just export my base44 code and run it elsewhere?" The short version is no, and this page explains exactly why and what to do about it.
If you are still on the fence about leaving, read when to leave base44 first. If you have decided to leave and want to know what target to migrate to, see Next.js + Supabase or Vercel. This page covers the export step in detail, which is the first phase of any migration.
What the base44 code export actually is
The export is a one-click operation in base44's settings that pushes your project to a GitHub repository you authorize. It has been officially in beta for over six months as of May 2026, and it is one of the most-cited frustrations in discussions of base44's vendor lock-in problem.
When export works, you get:
- A GitHub repo with your React frontend code
- The Tailwind config and design tokens
- Your backend function source files (Deno-flavored TypeScript)
- A `schema.json` file describing each entity and its fields
- A `package.json` with declared dependencies
- A README pointing back at base44's docs
When export does not work — and it sometimes does not — you get a partial repo missing backend functions, or no repo at all because the export silently failed. This failure mode is well documented on feedback.base44.com and was echoed in Nocode.mba's review: "GitHub export remains in beta...feature shipping too fast, suggesting stability concerns."
Who can export
| Plan | Can export? |
|---|---|
| Free | No |
| Starter | Yes (with rate limits) |
| Pro | Yes |
| Business | Yes |
| Enterprise | Yes |
If you are on the free tier, you must upgrade to at least Starter to export. This is itself a vendor-lock-in mechanism — you cannot leave without paying for one month of the next tier up.
Step-by-step: how to export
1. Verify you are on a paid plan
Settings → Billing. If you see "Free", upgrade to Starter at minimum. The upgrade is instant; you can downgrade after one month.
2. Connect your GitHub account
Settings → Integrations → GitHub. Authorize base44 to create a repository in your account or organization. base44 requests the repo scope, which gives it write access to all your repos. We recommend creating a dedicated GitHub user or organization just for the export, then granting access only to the relevant team members afterward.
3. Trigger the export
Settings → Code Export → Export to GitHub. Confirm the target repo name. Wait. The export takes thirty seconds to ten minutes depending on app size.
If the export fails, base44's UI gives you a vague error. Common causes:
- Repo name conflict. A repo with that name already exists. Pick a different name or delete the conflicting repo.
- GitHub auth expired. Re-authorize the GitHub integration.
- App in inconsistent state. If your app was mid-AI-build when you tried to export, base44 sometimes refuses. Wait for the build to complete, then retry.
- Backend functions missing. The most common silent failure: the export completes, but `backend/functions/` is empty. Re-run the export; it usually succeeds the second time.
4. Clone the repo locally
```shell
git clone git@github.com:yourorg/your-base44-app.git
cd your-base44-app
ls -la
```
You should see roughly this structure:
```
your-base44-app/
├── README.md
├── package.json
├── vite.config.ts
├── tailwind.config.ts
├── tsconfig.json
├── schema.json
├── src/
│   ├── components/
│   ├── pages/
│   ├── integrations/   ← @base44/sdk adapter
│   ├── lib/
│   └── main.tsx
└── backend/
    └── functions/      ← server-side functions (sometimes missing)
```
If `backend/functions/` is empty or missing, your backend functions did not export. Re-run the export from base44.
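That completeness check can be scripted so it runs on every fresh clone. A minimal sketch, assuming the directory layout shown above:

```shell
# Warn if the export is missing backend functions (the common silent failure).
if [ -z "$(ls -A backend/functions 2>/dev/null)" ]; then
  echo "WARNING: backend/functions is empty or missing; re-run the export from base44"
else
  echo "backend functions present: $(ls backend/functions | wc -l) file(s)"
fi
```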
5. Try to run it (and watch it fail)
```shell
npm install
npm run dev
```
The dev server starts. The UI renders. Then you click anything that fetches data, and you see something like:
```
[base44/sdk] Authentication failed: invalid app_id
Error: Cannot read properties of undefined (reading 'find')
    at Dashboard (src/pages/Dashboard.tsx:14:42)
```
This is expected. The SDK only authenticates against the base44 platform. The exported code cannot run standalone.
What is in the export, line by line
src/components/
Pure React components, mostly. JSX, Tailwind classes, hooks, props. These are the most portable part of the export. ~80–95% can move to a new framework with light edits.
src/pages/
Page-level components, usually one per route. These reference @base44/sdk heavily. Every base44.entities.X.find() and base44.functions.Y() call has to be rewritten when you migrate.
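As a sketch of what one such rewrite looks like, here is a single data call moved from the base44 SDK to a Supabase-style query builder. The `listProjects` wrapper plus the table and column names are illustrative, and the client is typed structurally so the function can be unit-tested with a stub (supabase-js matches this shape):

```typescript
// Before (base44 SDK, as it appears in the exported pages):
//   const projects = await base44.entities.Project.find({ ownerId: user.id });
// After (Supabase-style query builder):
type Row = Record<string, unknown>;

// Structural type covering the slice of the client this function uses,
// so tests can pass a stub instead of a live database connection.
export interface DbClient {
  from(table: string): {
    select(columns: string): {
      eq(column: string, value: unknown): Promise<{ data: Row[] | null; error: Error | null }>;
    };
  };
}

export async function listProjects(db: DbClient, ownerId: string): Promise<Row[]> {
  const { data, error } = await db.from("projects").select("*").eq("owner_id", ownerId);
  if (error) throw error;
  return data ?? [];
}
```

Rewriting calls behind small functions like this, instead of inline in each page, means the next backend swap touches one file rather than every route.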
src/integrations/
The SDK adapter layer. Often a base44Client.ts file that initializes the SDK with your app_id. This whole folder gets deleted in a real migration; you replace it with a Supabase client, a Postgres client, or whatever your new backend uses.
backend/functions/
Your backend function source. These are Deno-flavored TypeScript files that run on base44's server-side runtime. The function bodies port mostly cleanly to Supabase Edge Functions or Next.js Route Handlers — same Deno-or-Node patterns. Replace base44.entities.X calls inside with calls to your new database client.
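A minimal sketch of a ported function, assuming the fetch-style `(request) => Response` handler shape shared by Supabase Edge Functions and Next.js Route Handlers. The `countTasks` query is injected in place of the original `base44.entities` call, and all names here are hypothetical; the request is typed structurally so the handler can be tested without a server:

```typescript
// Minimal structural type: anything with a json() method (a real Request qualifies).
type JsonRequest = { json(): Promise<any> };

// Build a handler with the database query injected, replacing the
// base44.entities call that lived inside the original function body.
export function makeHandler(countTasks: (projectId: string) => Promise<number>) {
  return async (req: JsonRequest): Promise<Response> => {
    const { projectId } = await req.json();
    if (typeof projectId !== "string") {
      return new Response(JSON.stringify({ error: "projectId required" }), { status: 400 });
    }
    const count = await countTasks(projectId);
    return new Response(JSON.stringify({ count }), {
      headers: { "Content-Type": "application/json" },
    });
  };
}
```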
schema.json
The most useful single file in the export. It describes every entity, every field, and every field's type and constraints. You use this as the source of truth when generating your new SQL DDL or Prisma schema.
Example shape:
```json
{
  "entities": {
    "Project": {
      "fields": {
        "name": { "type": "string", "required": true },
        "ownerId": { "type": "userRef", "required": true },
        "status": { "type": "enum", "values": ["draft", "active", "archived"] },
        "createdAt": { "type": "datetime", "default": "now" }
      },
      "permissions": {
        "read": "owner",
        "write": "owner"
      }
    }
  }
}
```
Translating this to Postgres DDL:
```sql
create table projects (
  id uuid primary key default gen_random_uuid(),
  name text not null,
  owner_id uuid not null references auth.users(id) on delete cascade,
  status text not null default 'draft' check (status in ('draft','active','archived')),
  created_at timestamptz not null default now()
);

alter table projects enable row level security;

create policy "owner_can_read" on projects for select using (auth.uid() = owner_id);
create policy "owner_can_write" on projects for all using (auth.uid() = owner_id);
```
This is forty percent of the work of any migration. The schema is the most stable surface; build it carefully.
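That translation can be partially automated. Below is a minimal sketch of a per-field type-mapping helper driven by `schema.json`; the type names follow the example above (the full base44 type list is not documented here), the camelCase-to-snake_case convention matches the DDL shown, and anything unrecognized falls back to `text`:

```typescript
// One field entry from schema.json, per the example shape above.
type Field = { type: string; required?: boolean; values?: string[]; default?: string };

// Emit one Postgres column definition for a schema.json field.
export function columnDdl(name: string, f: Field): string {
  const col = name.replace(/([A-Z])/g, "_$1").toLowerCase(); // camelCase -> snake_case
  let sqlType: string;
  switch (f.type) {
    case "string":   sqlType = "text"; break;
    case "number":   sqlType = "numeric"; break;
    case "boolean":  sqlType = "boolean"; break;
    case "datetime": sqlType = "timestamptz"; break;
    case "userRef":  sqlType = "uuid references auth.users(id)"; break;
    case "enum":
      sqlType = `text check (${col} in (${(f.values ?? []).map((v) => `'${v}'`).join(",")}))`;
      break;
    default:         sqlType = "text"; // unknown type: safest fallback, flag for review
  }
  const notNull = f.required ? " not null" : "";
  const def = f.default === "now" ? " default now()" : "";
  return `${col} ${sqlType}${notNull}${def}`;
}
```

Run this over every entity in `schema.json` and you get most of each `create table` body; review the output by hand before applying it, especially references and defaults.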
What is NOT in the export
This is the part nobody tells you upfront.
| What's missing | Why it matters |
|---|---|
| Database rows | You have to export data separately, per entity, via base44's data export or SDK pagination |
| Password hashes | You cannot migrate user sessions; every user must reset their password on the new platform |
| `@base44/sdk` source | The SDK is closed-source. You cannot self-host it. You replace it. |
| Platform-managed auth flows | OAuth client IDs, magic-link templates, session cookies — all live on base44 servers |
| Scheduled task definitions | Whatever crons or scheduled prompts you set up are not in the export |
| Webhook endpoint configs | The URLs are documented in the export but the routing is platform-managed |
| Storage buckets | Your uploaded files live on base44's storage. You re-upload to your new storage |
| Logs and analytics | Your historical logs do not export |
| Custom domain config | Re-configure on the new host |
A common mistake is to clone the repo, run `npm install`, see partial signs of life, and assume the rest is fifteen minutes of work. The rest is two to three months of work.
How to export your data
The code export does not include data. You need a separate step.
Option A: base44's data export (per entity)
Settings → Data → Export. Download a CSV per entity. This works for small datasets, but it has a row limit (around 50,000 rows per export as of 2026) and handles relationships poorly.
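A quick way to spot truncated exports is to count rows per CSV and flag anything at or above the cap. A sketch, assuming the CSVs land in an `export/` directory and each has a single header row:

```shell
# Flag CSVs that may have hit the per-export row cap (~50,000 rows as of 2026).
for f in export/*.csv; do
  [ -e "$f" ] || continue            # no CSVs found; glob did not match
  rows=$(( $(wc -l < "$f") - 1 ))    # subtract the header row
  echo "$f: $rows rows"
  if [ "$rows" -ge 50000 ]; then
    echo "  WARNING: $f may be truncated at the export limit"
  fi
done
```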
Option B: SDK-based pagination script
For large datasets, write a Node script that paginates through every entity using the SDK and dumps to JSON.
```typescript
// scripts/dump-base44.ts
import { mkdirSync, writeFileSync } from "node:fs";
import { createClient } from "@base44/sdk";

const b44 = createClient({ appId: process.env.BASE44_APP_ID! });

async function dumpEntity(name: string) {
  const all: unknown[] = [];
  let cursor: string | undefined = undefined;
  // Page through the entity 500 rows at a time until the cursor runs out.
  while (true) {
    const page = await b44.entities[name].find({ limit: 500, cursor });
    all.push(...page.items);
    if (!page.nextCursor) break;
    cursor = page.nextCursor;
  }
  mkdirSync("export", { recursive: true });
  writeFileSync(`export/${name}.json`, JSON.stringify(all, null, 2));
  console.log(`${name}: ${all.length} rows`);
}

await Promise.all(["users", "projects", "tasks"].map(dumpEntity));
```
Run from a machine that is authenticated to base44 (use your API token). This is the safest way to export large or relational datasets.
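On the import side, the dumped JSON can be backfilled into the new database in batches. A minimal sketch with the insert function injected (for example, a wrapper around your new database client), so the batching logic stays testable and each request stays under typical payload limits:

```typescript
// Insert rows in fixed-size batches via an injected insert function;
// returns the total number of rows handed to the inserter.
export async function backfill(
  rows: Record<string, unknown>[],
  insertBatch: (batch: Record<string, unknown>[]) => Promise<void>,
  batchSize = 500,
): Promise<number> {
  let inserted = 0;
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    await insertBatch(batch); // sequential on purpose: keeps ordering and avoids rate limits
    inserted += batch.length;
  }
  return inserted;
}
```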
How long the export remains useful
The export is a snapshot. The moment you take it, it starts going stale.
If you plan to migrate, the right pattern is:
- Day 0: Take the export. Note the snapshot time.
- Days 1–N: Migration work happens. Do not edit the base44 app during this period unless you are doing critical bug fixes.
- Cutover day: Take a final data export. Diff against the snapshot to find new rows. Backfill those into the new system.
- Cutover hour: Lock base44 read-only. Final data sync. DNS swap.
If you keep editing both sides during the migration, you create a merge problem you cannot solve cleanly. Pick one source of truth at any given moment.
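The cutover-day diff can be as simple as comparing the two exports on a primary key. A sketch, assuming every entity has a stable `id` field:

```typescript
// Return the rows present in the final export but not in the snapshot,
// i.e. the rows created while migration work was in progress.
export function newRows<T extends { id: string }>(snapshot: T[], final: T[]): T[] {
  const seen = new Set(snapshot.map((r) => r.id));
  return final.filter((r) => !seen.has(r.id));
}
```

Note this catches inserts only; if rows can be updated during the migration window, you also need to compare an updated-at timestamp or lock base44 read-only earlier.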
Common pitfalls with the export
1. Re-running export and overwriting your migration work. Once you have started rewriting the exported code, do not re-export from base44 — it overwrites your repo. Branch and merge if you need a fresh export later.
2. Trusting npm install success. It will succeed. The app will still not work. Verify by clicking actual data-loading routes, not just landing pages.
3. Forgetting schema.json. It is the most useful file in the export. Read it carefully and use it as the spec for your new schema.
4. Exporting on free tier and being told no. Upgrade to Starter for one month, export, then downgrade.
5. Backend functions missing. Common silent failure. Re-run export. If still missing, contact base44 support. They are slow but eventually fix the export.
6. Treating the export as a complete escape. It is not. It is the first ten percent of a migration. Plan for the other ninety.
What to do with the export, in order
- Clone it. Verify you have backend functions. Re-export if not.
- Read `schema.json`. Understand your data model.
- Pick a migration target (Next.js + Supabase, Vercel, self-hosted, Replit, Lovable, Bubble, or Firebase).
- Stand up the new backend. Translate the schema. Backfill data.
- Rewrite SDK calls in the exported frontend, or rebuild from scratch in the new framework.
- Cut over.
The export is a tool. The migration is the work.
Want help with the export?
We will run the export for you, audit what came through, identify the rebuild scope, and quote the migration. Free thirty-minute call.
Book a free migration assessment
Related migrations
- Base44 to Next.js + Supabase — most common destination after the export.
- Base44 to Vercel — frontend-first migration with your choice of backend.
- When to leave base44 — decision framework if you are still evaluating.