What's happening
You decided you want to leave Base44, or at least keep the option open. You exported your project to GitHub. You cloned it locally. You ran npm install. You ran npm run dev. Everything immediately fails because every backend file imports @base44/sdk and calls it for database operations that no longer have a backend.
A user on shipper.now wrote: "The backend code is strongly dependent on Base44-sdk database calls; exported code won't function elsewhere without significant redevelopment." Nocode.mba's review of Base44 noted the same: "It's unclear how you'd migrate the database if you later move off the platform."
The marketing claim is "you own your code." The technical reality is that you own a codebase that runs only on Base44. The SDK is a platform-specific runtime, not a portable library, and the export does not include the server-side machinery the SDK depends on. You did not get a portable application. You got a snapshot of source code that needs an SDK that only Base44 hosts.
Why this happens
Base44 generates code that is convenient to write inside the platform and inconvenient to run anywhere else. The convenience comes from the SDK collapsing database, auth, file storage, and async messaging into a single import surface. You write base44.collection("orders").create({...}) and it works.
The inconvenience comes from what that line of code actually does. It calls a hosted Base44 endpoint that knows your project's schema, RLS policies, auth context, and storage layout. The SDK is a thin client; the heavy lifting happens on the platform. Outside the platform, the SDK has nowhere to call.
Three structural decisions deepen the lock-in.
The SDK is the only data-access path. Base44 does not expose your database via a portable interface like a Postgres connection string. You cannot point your app at a regular Postgres URL and bypass the SDK. Even if you could, the schema and RLS are managed in Base44's UI rather than in versioned migrations.
Auth is platform-specific. User sessions, permission checks, and SSO are all SDK-mediated. Your code reads base44.auth.user.id everywhere. There is no portable equivalent that maps cleanly to Supabase Auth, Auth0, or Clerk without an adapter layer.
Backend functions assume the Base44 runtime. Functions are Deno code that imports the SDK for any data work. The Deno runtime itself is portable; the SDK calls inside those functions are not.
The platform's GitHub export is in beta and improving, but as of May 2026 it produces a snapshot, not a runnable artifact for any non-Base44 environment. Independent reviews have repeatedly noted: "GitHub export remains in beta...feature shipping too fast, suggesting stability concerns" (Nocode.mba).
Sources: shipper.now/export-code-base44/, nocode.mba/articles/base44-review, feedback.base44.com posts on export and migration limitations.
How to reproduce
- Open your Base44 project in the editor.
- Export to GitHub (requires a paid plan that supports export).
- Clone the resulting repo locally.
- Run grep -r "@base44" src/ --include="*.ts" --include="*.tsx" --include="*.js" | wc -l to count SDK call sites.
- Run npm install.
- Try to start the project: npm run dev or equivalent.
- Observe one or more of: missing peer dependencies, runtime errors on the first SDK call, build errors from the SDK trying to resolve platform-specific imports.
- Pick any backend function file. Read it. Note that nearly every line either calls the SDK directly or references types the SDK provides. There is no portable layer between your business logic and the platform.
Step-by-step fix
The fix is to introduce a data-access layer in your code that wraps every SDK call. Once every business-logic file imports your wrapper instead of the SDK, you can swap the wrapper's implementation when you migrate. Do this work while still on Base44.
1. Audit your SDK usage
Run a grep and categorize call sites by capability.
grep -rn "@base44" src/ \
--include="*.ts" --include="*.tsx" \
| tee sdk-usage.txt
Categorize each line as one of: data (collection reads/writes), auth (user/session/permission), storage (files), realtime (subscriptions), function (function invocations). Most projects have 60-80 percent in data, 15-25 percent in auth, and the remainder split among the rest.
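The categorization pass can be scripted. The sketch below buckets each grep hit by capability keywords; the keyword-to-category regexes are heuristics I am assuming, not an official Base44 API list, so adjust them to what your sdk-usage.txt actually contains.

```typescript
// Bucket SDK call sites (lines from sdk-usage.txt) by capability.
// The regex rules are assumptions; tune them against your own grep output.
type Category = "data" | "auth" | "storage" | "realtime" | "function" | "unknown";

const rules: Array<[Category, RegExp]> = [
  ["auth", /\.auth\b/],
  ["storage", /\.storage\b|\.files\b/],
  ["realtime", /\.subscribe\b|\.realtime\b/],
  ["function", /\.functions?\b|\.invoke\b/],
  ["data", /\.collection\(/], // checked last: most lines are data calls
];

function categorize(line: string): Category {
  for (const [cat, re] of rules) if (re.test(line)) return cat;
  return "unknown";
}

// Example grep output lines (hypothetical paths and calls, for illustration).
const hits = [
  'src/orders.ts:12: base44.collection("orders").list()',
  "src/session.ts:3: base44.auth.user.id",
  "src/upload.ts:7: base44.storage.upload(file)",
];

const counts: Record<Category, number> = {
  data: 0, auth: 0, storage: 0, realtime: 0, function: 0, unknown: 0,
};
for (const h of hits) counts[categorize(h)]++;
console.log(counts);
```

Feed it the real sdk-usage.txt (split on newlines) and the counts tell you how much of the refactor is data versus auth versus the rest.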
2. Define a portable data interface
Create src/lib/data.ts with a generic interface that does not mention Base44.
// src/lib/data.ts
export interface DataLayer {
  list<T>(collection: string, filter?: Record<string, unknown>): Promise<T[]>;
  get<T>(collection: string, id: string): Promise<T | null>;
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  update<T>(collection: string, id: string, patch: Partial<T>): Promise<T>;
  remove(collection: string, id: string): Promise<void>;
}

export interface AuthLayer {
  currentUserId(): Promise<string | null>;
  currentUserRoles(): Promise<string[]>;
}
This is the contract your business logic will depend on. It mentions no platform.
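To see why the contract is enough, here is a sketch of business logic written against DataLayer plus an in-memory adapter standing in for any real backend. The Order type, markShipped function, and memoryAdapter are illustrative inventions, not part of the export; the point is that the logic compiles and runs with no platform SDK in sight.

```typescript
// Same contract as src/lib/data.ts, repeated here so the sketch is self-contained.
interface DataLayer {
  list<T>(collection: string, filter?: Record<string, unknown>): Promise<T[]>;
  get<T>(collection: string, id: string): Promise<T | null>;
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  update<T>(collection: string, id: string, patch: Partial<T>): Promise<T>;
  remove(collection: string, id: string): Promise<void>;
}

interface Order { id: string; status: string }

// Business logic depends only on the contract, never on a platform SDK.
async function markShipped(db: DataLayer, orderId: string): Promise<Order> {
  return db.update<Order>("orders", orderId, { status: "shipped" });
}

// In-memory adapter: useful for tests, and proof the logic is portable.
// Assumes every payload carries an id field; a real adapter would not.
function memoryAdapter(): DataLayer {
  const store = new Map<string, Map<string, any>>();
  const table = (c: string) => store.get(c) ?? store.set(c, new Map()).get(c)!;
  return {
    list: async (c) => [...table(c).values()],
    get: async (c, id) => table(c).get(id) ?? null,
    create: async (c, payload: any) => { table(c).set(payload.id, payload); return payload; },
    update: async (c, id, patch) => {
      const row = { ...table(c).get(id), ...patch };
      table(c).set(id, row);
      return row;
    },
    remove: async (c, id) => { table(c).delete(id); },
  };
}
```

The same markShipped runs unchanged against the Base44 adapter today and the replacement adapter after migration.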
3. Implement the Base44 adapter
Create src/lib/data.base44.ts that satisfies the interface using the SDK.
// src/lib/data.base44.ts
import { base44 } from "@base44/sdk";
import type { DataLayer, AuthLayer } from "./data";

export const base44Data: DataLayer = {
  list: async (collection, filter) => {
    return base44.collection(collection).list({ where: filter });
  },
  get: async (collection, id) => {
    return base44.collection(collection).get(id);
  },
  create: async (collection, payload) => {
    return base44.collection(collection).create(payload);
  },
  update: async (collection, id, patch) => {
    return base44.collection(collection).update(id, patch);
  },
  remove: async (collection, id) => {
    await base44.collection(collection).delete(id);
  },
};

export const base44Auth: AuthLayer = {
  currentUserId: async () => base44.auth.user?.id ?? null,
  currentUserRoles: async () => base44.auth.user?.roles ?? [],
};
4. Wire the adapter through a single entry point
// src/lib/data.index.ts
import { base44Data, base44Auth } from "./data.base44";
export const data = base44Data;
export const auth = base44Auth;
Now every business-logic file imports from data.index.ts instead of @base44/sdk. When you migrate, you swap the contents of data.index.ts to point at the Supabase or custom adapter instead.
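A toy sketch makes the single-seam property concrete. Both adapters below are stand-ins I made up for illustration; in the real project they would be the Base44 adapter and its replacement, and only the one assignment line would change on migration day.

```typescript
// Toy sketch of the single-seam swap. Both adapters are stand-ins;
// in the real project they live in data.base44.ts and data.supabase.ts.
interface AuthLayer {
  currentUserId(): Promise<string | null>;
}

const base44Auth: AuthLayer = {
  currentUserId: async () => "user-from-base44", // stand-in for the SDK call
};

const supabaseAuth: AuthLayer = {
  currentUserId: async () => "user-from-supabase", // stand-in for the replacement
};

// data.index.ts is the only file that names a platform.
// Migration day: change this one assignment.
const auth: AuthLayer = base44Auth;

// Every business-logic file imports { auth } and never learns which platform answered.
async function greet(): Promise<string> {
  const id = await auth.currentUserId();
  return id ? `hello ${id}` : "hello guest";
}
```

Flip the assignment to supabaseAuth and greet, and every other caller, retargets without a diff.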
5. Refactor incrementally, file by file
Pick one component or function at a time. Replace import { base44 } from "@base44/sdk" with import { data, auth } from "@/lib/data.index". Replace SDK calls with their wrapper equivalents. Test in Base44 (the wrapper still calls the SDK underneath, so behavior is unchanged). Commit. Move to the next file.
This works on Base44 today. The wrapper adds zero functionality and zero overhead. Its only purpose is to give you a single seam to cut on migration day.
6. Capture schema and RLS as code
Base44 manages schema and RLS in the UI. Export them manually.
- For every collection: write a SQL CREATE TABLE statement that matches the Base44 schema. Save it in migrations/.
- For every RLS policy: write the equivalent CREATE POLICY statement targeting your migration platform's RLS syntax. Save it next to the migration.
- For every storage bucket: document its name, ACL, and contents.
Without this, you can export code but cannot rebuild the database it depends on.
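As a shape to copy, here is a hypothetical migration for an orders collection, written for Postgres-style RLS (auth.uid() is Supabase's helper; other targets use different session functions). The columns and policy are invented; derive the real ones from your Base44 schema screen.

```sql
-- migrations/001_orders.sql (hypothetical; mirror your actual Base44 schema)
CREATE TABLE orders (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  owner_id uuid NOT NULL,
  status text NOT NULL DEFAULT 'new',
  created_at timestamptz NOT NULL DEFAULT now()
);

ALTER TABLE orders ENABLE ROW LEVEL SECURITY;

-- Postgres/Supabase-style policy: owners can read only their own rows.
CREATE POLICY orders_owner_read ON orders
  FOR SELECT USING (owner_id = auth.uid());
```

One file per collection, committed next to the code, turns the UI-managed schema into something a migration tool can replay.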
7. Plan the cutover
When the wrapper is fully adopted, build a Supabase (or alternative) implementation of the same interface. Run both adapters side by side in a staging environment. Run your test suite against both. When parity is acceptable, swap data.index.ts to point at the new adapter and deploy.
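Parity can be checked mechanically: run an identical scenario through both adapters and diff the results. In the sketch below both adapters are in-memory stand-ins I invented so it runs anywhere; in staging you would pass the Base44 adapter and the new one.

```typescript
// Minimal slice of the DataLayer contract, enough for a parity scenario.
interface DataLayer {
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  get<T>(collection: string, id: string): Promise<T | null>;
}

// Stand-in factory; real code would import base44Data and the new adapter.
// Assumes payloads carry an id field.
function memoryAdapter(): DataLayer {
  const store = new Map<string, any>();
  return {
    create: async (c, p: any) => { store.set(`${c}:${p.id}`, p); return p; },
    get: async (c, id) => store.get(`${c}:${id}`) ?? null,
  };
}

// Run the same scenario through both adapters and compare what comes back.
async function parityCheck(a: DataLayer, b: DataLayer): Promise<boolean> {
  for (const db of [a, b]) {
    await db.create("orders", { id: "o1", status: "new" });
  }
  const ra = await a.get("orders", "o1");
  const rb = await b.get("orders", "o1");
  return JSON.stringify(ra) === JSON.stringify(rb);
}
```

Grow the scenario until it covers every call-site category from your audit; a clean run on every scenario is your green light to swap data.index.ts.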
This is when you stop paying Base44.
DIY vs hire decision
DIY this if: Your project has fewer than 30 components, you have an engineer on the team comfortable with refactoring, and you can spread the wrapper migration over 4-8 weeks of part-time work.
Hire help if: Your project is mid-sized or larger, you need to maintain feature velocity during the decoupling, or you are decoupling under deadline pressure (e.g., a Base44 outage or pricing change forced your hand). Decoupling and migration are usually best done as one project. Our migration service includes the wrapper refactor, the schema/RLS code-up, the target-platform implementation, and a verified cutover with no data loss.
Need to escape the lock-in?
The fastest, lowest-risk path off Base44 is a managed migration: we ship the data-access wrapper, port the schema and RLS, build the target platform, run parity tests, and execute the cutover with verified zero data loss. Small migrations from $6,000.
Related problems
- No SLA — your app is one outage from down — the strategic reason the lock-in matters.
- Data loss after returning to your app — the data-layer fragility you inherit from the SDK and want to leave behind.
- SSO bypass and auth vulnerabilities — auth is the second-hardest layer to decouple after data.