BASE44DEVS

FIX · PLATFORM · HIGH

Break Base44 Vendor Lock-In Caused by SDK Dependency

Base44 vendor lock-in is real: exported backend code is tightly coupled to @base44/sdk for database, auth, and storage calls, and the SDK is not portable. Decouple before migrating: introduce a thin data-access layer that wraps every SDK call, replace the SDK import with a generic interface, then port the implementation to Supabase, Firebase, or a custom backend once the layer is stable.

Last verified
2026-05-01
Category
PLATFORM
Difficulty
HARD
DIY possible
YES

What's happening

You decided you want to leave Base44, or at least keep the option open. You exported your project to GitHub. You cloned it locally. You ran npm install. You ran npm run dev. Everything immediately fails because every backend file imports @base44/sdk and calls it for database operations that no longer have a backend.

A user on shipper.now wrote: "The backend code is strongly dependent on Base44-sdk database calls; exported code won't function elsewhere without significant redevelopment." Nocode.mba's review of Base44 noted the same: "It's unclear how you'd migrate the database if you later move off the platform."

The marketing claim is "you own your code." The technical reality is that you own a codebase that runs only on Base44. The SDK is a platform-specific runtime, not a portable library, and the export does not include the server-side machinery the SDK depends on. You did not get a portable application. You got a snapshot of source code that needs an SDK that only Base44 hosts.

Why this happens

Base44 generates code that is convenient to write inside the platform and inconvenient to run anywhere else. The convenience comes from the SDK collapsing database, auth, file storage, and async messaging into a single import surface. You write base44.collection("orders").create({...}) and it works.

The inconvenience comes from what that line of code actually does. It calls a hosted Base44 endpoint that knows your project's schema, RLS policies, auth context, and storage layout. The SDK is a thin client; the heavy lifting happens on the platform. Outside the platform, the SDK has nowhere to call.

Three structural decisions deepen the lock-in.

The SDK is the only data-access path. Base44 does not expose your database via a portable interface like a Postgres connection string. You cannot point your app at a regular Postgres URL and bypass the SDK. Even if you could, the schema and RLS are managed in Base44's UI rather than in versioned migrations.

Auth is platform-specific. User sessions, permission checks, and SSO are all SDK-mediated. Your code reads base44.auth.user.id everywhere. There is no portable equivalent that maps cleanly to Supabase Auth, Auth0, or Clerk without an adapter layer.

Backend functions assume Base44 runtime. Functions are Deno code that imports the SDK for any data work. Outside Base44, the Deno runtime alone is portable but the SDK calls inside those functions are not.

The platform's GitHub export is in beta and improving, but as of May 2026 it produces a snapshot, not a runnable artifact for any non-Base44 environment. Independent reviews have repeatedly noted: "GitHub export remains in beta...feature shipping too fast, suggesting stability concerns" (Nocode.mba).

Sources: shipper.now/export-code-base44/, nocode.mba/articles/base44-review, feedback.base44.com posts on export and migration limitations.

How to reproduce

  1. Open your Base44 project in the editor.
  2. Export to GitHub (requires a paid plan that supports export).
  3. Clone the resulting repo locally.
  4. Run grep -r "@base44" src/ --include="*.ts" --include="*.tsx" --include="*.js" | wc -l to count SDK call sites.
  5. Run npm install.
  6. Try to start the project: npm run dev or equivalent.
  7. Observe one or more of: missing peer dependencies, runtime errors on first SDK call, build errors from the SDK trying to resolve platform-specific imports.
  8. Pick any backend function file. Read it. Note that nearly every line either calls the SDK directly or references types the SDK provides. There is no portable layer between your business logic and the platform.

Step-by-step fix

The fix is to introduce a data-access layer in your code that wraps every SDK call. Once every business-logic file imports your wrapper instead of the SDK, you can swap the wrapper's implementation when you migrate. Do this work while still on Base44.

1. Audit your SDK usage

Run a grep and categorize call sites by capability.

grep -rn "@base44" src/ \
  --include="*.ts" --include="*.tsx" --include="*.js" \
  | tee sdk-usage.txt

Categorize each line as one of: data (collection reads/writes), auth (user/session/permission), storage (files), realtime (subscriptions), function (function invocations). Most projects have 60-80 percent in data, 15-25 percent in auth, and the remainder split among the rest.
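
The bucketing can be scripted rather than done by hand. Below is a rough sketch that tallies each line of sdk-usage.txt by capability. The regexes are assumptions: they guess at typical SDK surface names (collection, auth, storage, subscribe, functions) and will need adjusting to match your actual call sites.

```typescript
// categorize-sdk-usage.ts -- rough first pass over sdk-usage.txt.
// The patterns below are guesses at common SDK surface names; tune them
// against a sample of your own grep output before trusting the totals.

type Capability = "data" | "auth" | "storage" | "realtime" | "function" | "other";

// First matching rule wins, so order broader patterns later.
const RULES: Array<[Capability, RegExp]> = [
  ["data", /\.collection\(/],
  ["auth", /\.auth\b/],
  ["storage", /\.storage\b|\.files\b/],
  ["realtime", /\.subscribe\(|\.realtime\b/],
  ["function", /\.functions?\.|\.invoke\(/],
];

export function categorize(line: string): Capability {
  for (const [cap, re] of RULES) {
    if (re.test(line)) return cap;
  }
  return "other";
}

export function tally(lines: string[]): Record<Capability, number> {
  const counts: Record<Capability, number> = {
    data: 0, auth: 0, storage: 0, realtime: 0, function: 0, other: 0,
  };
  for (const line of lines) counts[categorize(line)]++;
  return counts;
}

// Usage (illustrative):
//   tally(fs.readFileSync("sdk-usage.txt", "utf8").split("\n"))
```

Lines that land in "other" are the ones worth reading by hand; they are usually type imports or an SDK capability the rules above missed.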

2. Define a portable data interface

Create src/lib/data.ts with a generic interface that does not mention Base44.

// src/lib/data.ts
export interface DataLayer {
  list<T>(collection: string, filter?: Record<string, unknown>): Promise<T[]>;
  get<T>(collection: string, id: string): Promise<T | null>;
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  update<T>(collection: string, id: string, patch: Partial<T>): Promise<T>;
  remove(collection: string, id: string): Promise<void>;
}

export interface AuthLayer {
  currentUserId(): Promise<string | null>;
  currentUserRoles(): Promise<string[]>;
}

This is the contract your business logic will depend on. It mentions no platform.
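
To make the seam concrete, here is a hypothetical business-logic function written against DataLayer only, plus a throwaway in-memory adapter. The Order shape, the orders collection, and memoryData are illustrative inventions, not part of any Base44 API; the interface is re-declared here so the sketch stands alone, where in the repo you would import it from src/lib/data.

```typescript
// Hypothetical business logic that depends only on the portable interface.
// DataLayer is re-declared for a self-contained sketch; in the repo,
// import it from src/lib/data instead.

interface DataLayer {
  list<T>(collection: string, filter?: Record<string, unknown>): Promise<T[]>;
  get<T>(collection: string, id: string): Promise<T | null>;
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  update<T>(collection: string, id: string, patch: Partial<T>): Promise<T>;
  remove(collection: string, id: string): Promise<void>;
}

interface Order {
  id: string;
  status: "pending" | "shipped";
}

// The business rule knows nothing about Base44, Supabase, or anything else.
export async function markShipped(db: DataLayer, orderId: string): Promise<Order> {
  const order = await db.get<Order>("orders", orderId);
  if (!order) throw new Error(`order ${orderId} not found`);
  return db.update<Order>("orders", orderId, { status: "shipped" });
}

// A throwaway in-memory adapter: enough to unit-test the logic above and
// to prove it has no platform dependency. Rows are untyped on purpose.
export function memoryData(): DataLayer {
  const store = new Map<string, Map<string, any>>();
  let nextId = 1;
  const bucket = (c: string) => {
    let b = store.get(c);
    if (!b) store.set(c, (b = new Map()));
    return b;
  };
  return {
    list: async (c: string, filter?: Record<string, unknown>) =>
      Array.from(bucket(c).values()).filter((row) =>
        Object.entries(filter ?? {}).every(([k, v]) => row[k] === v)
      ),
    get: async (c: string, id: string) => bucket(c).get(id) ?? null,
    create: async (c: string, payload: any) => {
      const row = { id: String(nextId++), ...payload };
      bucket(c).set(row.id, row);
      return row;
    },
    update: async (c: string, id: string, patch: any) => {
      const row = { ...bucket(c).get(id), ...patch };
      bucket(c).set(id, row);
      return row;
    },
    remove: async (c: string, id: string) => {
      bucket(c).delete(id);
    },
  };
}
```

Running markShipped against memoryData in a unit test today, and against your target platform's adapter after migration, is exactly the property the wrapper buys you.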

3. Implement the Base44 adapter

Create src/lib/data.base44.ts that satisfies the interface using the SDK.

// src/lib/data.base44.ts
import { base44 } from "@base44/sdk";
import type { DataLayer, AuthLayer } from "./data";

export const base44Data: DataLayer = {
  list: async (collection, filter) => {
    return base44.collection(collection).list({ where: filter });
  },
  get: async (collection, id) => {
    return base44.collection(collection).get(id);
  },
  create: async (collection, payload) => {
    return base44.collection(collection).create(payload);
  },
  update: async (collection, id, patch) => {
    return base44.collection(collection).update(id, patch);
  },
  remove: async (collection, id) => {
    await base44.collection(collection).delete(id);
  },
};

export const base44Auth: AuthLayer = {
  currentUserId: async () => base44.auth.user?.id ?? null,
  currentUserRoles: async () => base44.auth.user?.roles ?? [],
};

4. Wire the adapter through a single entry point

// src/lib/data.index.ts
import { base44Data, base44Auth } from "./data.base44";

export const data = base44Data;
export const auth = base44Auth;

Now every business-logic file imports from data.index.ts instead of @base44/sdk. When you migrate, you swap the contents of data.index.ts to point at the Supabase or custom adapter instead.
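
If you want to rehearse the swap before committing to it, the entry point can choose the adapter at startup instead of hard-coding one. This is a sketch under assumptions: the DATA_BACKEND variable name, the adapter keys, and a supabaseData adapter that does not exist yet are all illustrative.

```typescript
// pick-adapter.ts -- sketch: select the active adapter at startup so a
// staging environment can exercise the new backend while production stays
// on Base44. The env var name and adapter keys are assumptions.

export function pickAdapter<T>(
  adapters: Record<string, T>,
  requested: string | undefined,
  fallback: string
): T {
  const key = requested ?? fallback;
  const adapter = adapters[key];
  if (!adapter) {
    const known = Object.keys(adapters).join(", ");
    throw new Error(`unknown backend "${key}" (known: ${known})`);
  }
  return adapter;
}

// Intended usage in src/lib/data.index.ts (names are illustrative):
//
//   export const data = pickAdapter(
//     { base44: base44Data, supabase: supabaseData },
//     process.env.DATA_BACKEND,
//     "base44",
//   );
```

Failing loudly on an unknown backend name is deliberate: a silent fallback during cutover week is how you end up writing to the wrong database.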

5. Refactor incrementally, file by file

Pick one component or function at a time. Replace import { base44 } from "@base44/sdk" with import { data, auth } from "@/lib/data.index". Replace SDK calls with their wrapper equivalents. Test in Base44 (the wrapper still calls the SDK underneath, so behavior is unchanged). Commit. Move to the next file.

This works on Base44 today. The wrapper adds no functionality and negligible overhead. Its only purpose is to give you a single seam to cut on migration day.

6. Capture schema and RLS as code

Base44 manages schema and RLS in the UI. Export them manually.

  • For every collection: write a SQL CREATE TABLE statement that matches the Base44 schema. Save it in migrations/.
  • For every RLS policy: write the equivalent CREATE POLICY statement in your target platform's RLS syntax. Save it next to the migration.
  • For every storage bucket: document its name, ACL, and contents.

Without this, you can export code but cannot rebuild the database it depends on.
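
If you would rather not hand-write every statement, a small generator keeps the hand-captured schema in one typed place. Everything below is an assumption to adapt: the field-type names, the Postgres type mappings, and the uuid primary-key convention are illustrative, not anything Base44 exports.

```typescript
// gen-migration.ts -- sketch: turn a hand-written collection description
// into a Postgres CREATE TABLE statement to save under migrations/.
// Type names, SQL mappings, and the id convention are assumptions.

type FieldType = "text" | "number" | "boolean" | "timestamp" | "json";

const SQL_TYPES: Record<FieldType, string> = {
  text: "text",
  number: "numeric",
  boolean: "boolean",
  timestamp: "timestamptz",
  json: "jsonb",
};

export interface CollectionSpec {
  name: string;
  fields: Record<string, { type: FieldType; required?: boolean }>;
}

export function createTableSql(spec: CollectionSpec): string {
  const cols = [
    // Assumed convention: uuid primary key, generated server-side.
    "  id uuid primary key default gen_random_uuid()",
    ...Object.entries(spec.fields).map(
      ([name, f]) =>
        `  ${name} ${SQL_TYPES[f.type]}${f.required ? " not null" : ""}`
    ),
  ];
  return `create table ${spec.name} (\n${cols.join(",\n")}\n);`;
}
```

Writing the spec objects forces you to read every collection in the Base44 UI once, carefully, which is most of the value of this step.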

7. Plan the cutover

When the wrapper is fully adopted, build a Supabase (or alternative) implementation of the same interface. Run both adapters side by side in a staging environment. Run your test suite against both. When parity is acceptable, swap data.index.ts to point at the new adapter and deploy.
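
A minimal parity harness can be sketched as: run one scripted workload against both adapters and compare the normalized end states. The interface is re-declared so the sketch stands alone; the workload and the id-stripping normalization are illustrative and should grow to cover every capability category from the step 1 audit.

```typescript
// parity-check.ts -- sketch of a side-by-side parity test for two DataLayer
// adapters (e.g. the Base44 adapter and its replacement). DataLayer is
// re-declared for self-containment; in the repo, import it from src/lib/data.

interface DataLayer {
  list<T>(collection: string, filter?: Record<string, unknown>): Promise<T[]>;
  get<T>(collection: string, id: string): Promise<T | null>;
  create<T>(collection: string, payload: Partial<T>): Promise<T>;
  update<T>(collection: string, id: string, patch: Partial<T>): Promise<T>;
  remove(collection: string, id: string): Promise<void>;
}

interface OrderRow {
  id: string;
  sku: string;
  qty: number;
}

// Run the same scripted operations and return the observable end state,
// stripping platform-generated ids before comparison.
async function workload(db: DataLayer): Promise<unknown> {
  const created = await db.create<OrderRow>("orders", { sku: "ABC-1", qty: 1 });
  await db.update<OrderRow>("orders", created.id, { qty: 2 });
  const rows = await db.list<OrderRow>("orders");
  return rows.map(({ sku, qty }) => ({ sku, qty }));
}

export async function parityCheck(a: DataLayer, b: DataLayer): Promise<boolean> {
  const [resultA, resultB] = [await workload(a), await workload(b)];
  return JSON.stringify(resultA) === JSON.stringify(resultB);
}
```

When parityCheck holds in staging across the full workload, the swap in data.index.ts is a one-line change rather than a leap of faith.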

This is when you stop paying Base44.

DIY vs hire decision

DIY this if: Your project has fewer than 30 components, you have an engineer on the team comfortable with refactoring, and you can spread the wrapper migration over 4-8 weeks of part-time work.

Hire help if: Your project is mid-sized or larger, you need to maintain feature velocity during the decoupling, or you are decoupling under deadline pressure (e.g., a Base44 outage or pricing change forced your hand). Decoupling and migration are usually best done as one project. Our migration service includes the wrapper refactor, the schema/RLS code-up, the target-platform implementation, and a verified cutover with no data loss.

Need to escape the lock-in?

The fastest, lowest-risk path off Base44 is a managed migration: we ship the data-access wrapper, port the schema and RLS, build the target platform, run parity tests, and execute the cutover with verified zero data loss. Small migrations from $6,000.

Start a migration engagement

QUERIES

Frequently asked questions

Q.01 Can I just export to GitHub and deploy elsewhere?
A.01

Not without a rewrite. The exported code imports @base44/sdk and calls it for every database read, write, auth check, and storage operation. Outside Base44 the SDK has no backend to talk to, so you will hit runtime errors immediately. Export is a starting point for migration, not a complete migration. Plan for a minimum of 40-60 hours of refactoring on a small project, more on a complex one.

Q.02 How tightly is my code coupled to the SDK?
A.02

Run grep for @base44 in your exported codebase and count the matches. That is the rough number of call sites you need to refactor; moderately sized apps typically show 80-300. Each one binds your code to Base44's data shape, auth model, and SDK error semantics. Frontend code is easier to decouple than backend; backend functions are deeply tied to the SDK for almost every operation.

Q.03 Should I decouple while still on the platform or only after exporting?
A.03

Before. Decouple while you are still on Base44, with the SDK still working. This gives you a working baseline at every step. If you wait until after export, you are debugging migration and decoupling simultaneously, which is the worst time to figure out which problem you are looking at. Build the abstraction layer in place, verify it on Base44, then swap implementations when you migrate.

Q.04 Is GitHub export enough by itself?
A.04

No. GitHub export is still in beta as of May 2026 and ships incomplete artifacts on complex projects. Backend functions, RLS policies, and database schema bindings often do not export cleanly. Use the export as a snapshot of your code, not as a complete project. You will need to manually capture schema, policies, and any function configuration that did not make it into the repo.

Q.05 What does decoupling cost in time and money if I hire it out?
A.05

For a typical mid-sized project (50-100 components, 10-20 backend functions), expect 60-120 engineering hours to fully decouple. At blended rates this is roughly $6,000-$15,000. We package this as a migration engagement — small migration $6,000, medium $12,000 — because in practice the decoupling and the actual move are best done as one project rather than sequentially.

NEXT STEP

Need this fix shipped this week?

Book a free 15-minute call or order a $497 audit. We will respond within one business day.