Conversation


AmanVarshney01 commented Dec 2, 2025

Summary by CodeRabbit

  • New Features

    • Programmatic DB creation API and packaged ESM CLI with TypeScript types.
    • Region listing and automatic nearest-region suggestion via geolocation.
  • User-facing Fixes

    • Claim flow now includes workspace and database IDs and links to the correct console.
    • Connect UI simplified to a Prisma-first flow with separate copy actions for connection strings.
  • Refactor

    • Modernized build, packaging, tests (migrated to TypeScript) and CI workflows.



coderabbitai bot commented Dec 2, 2025

Walkthrough

Monolithic JS CLI and tests replaced by a TypeScript-based create-db package: new src modules (CLI, API, types, analytics, database, regions, geolocation, env reader), ESM/dist packaging and build config, migrated tests to TS, CLI bin rewires, workflow updates, and related claim-db-worker adjustments.

Changes

Cohort / File(s) Summary
TypeScript source & API
create-db/src/index.ts, create-db/src/cli.ts, create-db/src/types.ts, create-db/src/analytics.ts, create-db/src/database.ts, create-db/src/geolocation.ts, create-db/src/regions.ts, create-db/src/env-utils.ts
Added full TypeScript implementation exposing programmatic create() and regions(), createDbCli() CLI initializer, analytics batching/flush, createDatabaseCore, geolocation/region utilities, env reader, types and guards, and related helpers.
Build, packaging & configs
create-db/package.json, create-db/tsconfig.json, create-db/tsdown.config.ts, create-db/vitest.config.ts
Introduced ESM/dist packaging with exports map, type: module, dist-based CLI entry, build/dev/typecheck scripts, tsdown build config, TypeScript config, and Vitest settings; updated dependencies/devDependencies.
Tests migrated to TypeScript
Added: create-db/__tests__/create.test.ts, create-db/__tests__/regions.test.ts, create-db/__tests__/utils.test.ts
Removed: create-db/__tests__/create.test.js, create-db/__tests__/regions.test.js, create-db/__tests__/utils.test.js
Replaced JS tests with TS equivalents covering CLI (default/json/regions), regions API and RegionSchema, and type-guard utilities; tests invoke the built CLI (dist/cli.mjs) and include timeouts and env merging.
Removed legacy JS CLI
create-db/index.js
Deleted legacy monolithic Node CLI implementation (interactive prompts, previous analytics client, region detection, parsing, and multi-mode outputs).
CLI package shims / delegates
create-pg/cli.js, create-postgres/cli.js, create-pg/index.js, create-postgres/index.js, create-pg/package.json, create-postgres/package.json, create-pg/index.d.ts, create-postgres/index.d.ts
Added small CLI launchers delegating to create-db/cli, converted index modules to re-export create-db, updated package.json entries and typings to surface CLI and module exports.
Workflows
.github/workflows/tests.yml, .github/workflows/release.yml, .github/workflows/preview.yml (deleted)
Added create-db build step and env vars in tests workflow; replaced/expanded release workflow to support previews and multi-mode releases; removed previous preview workflow.
claim-db-worker updates
claim-db-worker/app/api/auth/callback/route.ts, claim-db-worker/app/success/page.tsx, claim-db-worker/app/web/connect/page.tsx, claim-db-worker/lib/project-transfer.ts, claim-db-worker/lib/response-utils.ts, claim-db-worker/lib/prismaSchemaEditor/defaultSchema.ts, claim-db-worker/package.json
Auth callback now fetches project/workspace/database IDs and includes them in analytics and redirect; success page computes console URL from query; connect UI unified and copy behavior adjusted; transferProject returns parsed transferResponse; removed datasource URL line from default Prisma schema; bumped some deps.
Utilities & misc edits
create-db-worker/src/index.ts, schema-api-routes/src/routes/schema/pull.ts, .vscode/settings.json, removed JS test files under create-db/__tests__/*.js
Removed a debug log, removed datasource URL from schema template, cleared a VSCode color setting, and deleted legacy JS tests.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

  • Pay extra attention to:
    • create-db/src/index.ts and create-db/src/database.ts (API integration, analytics flows, error branches, output shapes).
    • Packaging/exports in create-db/package.json and tsdown.config.ts (ESM exports, CLI bin alignment).
    • Tests that invoke dist/cli.mjs — ensure build output matches runtime expectations.
    • claim-db-worker auth callback / response-utils changes (redirect/query payload and analytics).
    • CI workflows (release/preview) for correct triggers and publish steps.

Pre-merge checks

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 23.81%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped: CodeRabbit’s high-level summary is enabled.
  • Title check ✅ Passed — The title 'feat: add programmatic api' accurately describes the main change: the PR introduces a programmatic API alongside the existing CLI, with new functions such as create() and regions() and a comprehensive type system in create-db/src/.



cloudflare-workers-and-pages bot commented Dec 2, 2025

Deploying with Cloudflare Workers

The latest updates on your project.

Status: ✅ Deployment successful! (View logs)
Name: claim-db-worker
Latest commit: a1e549e
Updated (UTC): Dec 10 2025, 12:00 PM

coderabbitai bot left a comment

Actionable comments posted: 8

🧹 Nitpick comments (7)
create-db/tsconfig.json (1)

1-15: Minor formatting inconsistency: mixed tabs and spaces.

Lines 3-12 use tabs for indentation while lines 13-14 use spaces. Consider normalizing to consistent indentation throughout.

 {
   "compilerOptions": {
-		"target": "ES2022",
-		"module": "ESNext",
-		"moduleResolution": "bundler",
-		"esModuleInterop": true,
-		"verbatimModuleSyntax": true,
-		"strict": true,
-		"skipLibCheck": true,
-		"outDir": "dist",
-		"types": ["node"]
-	},
+    "target": "ES2022",
+    "module": "ESNext",
+    "moduleResolution": "bundler",
+    "esModuleInterop": true,
+    "verbatimModuleSyntax": true,
+    "strict": true,
+    "skipLibCheck": true,
+    "outDir": "dist",
+    "types": ["node"]
+  },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "dist", "__tests__"]
 }
create-db/tsdown.config.ts (1)

10-12: Apply shebang banner only to the CLI entry.

The outputOptions.banner currently applies the shebang (#!/usr/bin/env node) to all entry outputs, including dist/index.mjs. While Node.js strips the shebang before parsing, it should be applied only to executable CLI files per best practices. Some toolchain components may also struggle with shebangs in library modules.

Use outputOptions as a function to conditionally apply the banner only when building cli.ts:

outputOptions: (output) => {
  if (output.fileName?.includes('cli')) {
    output.banner = "#!/usr/bin/env node";
  }
  return output;
}
create-db/__tests__/regions.test.ts (1)

4-21: Tests rely on external API without mocking.

These tests make real network requests to the regions API. While this validates end-to-end behavior, it makes tests flaky if the API is unavailable. Consider adding unit tests with mocked fetch for deterministic testing, keeping these as integration tests.

create-db/src/index.ts (4)

229-238: No timeout on database creation fetch request.

Unlike analytics (5s timeout), the createDatabaseCore fetch has no timeout. A slow/hanging API response could block indefinitely.

+    const controller = new AbortController();
+    const timeoutId = setTimeout(() => controller.abort(), 30000);
+
     const resp = await fetch(`${CREATE_DB_WORKER_URL}/create`, {
         method: "POST",
         headers: { "Content-Type": "application/json" },
         body: JSON.stringify({
             region,
             name,
             utm_source: getCommandName(),
             userAgent,
         }),
+        signal: controller.signal,
     });
+
+    clearTimeout(timeoutId);

54-83: pendingAnalytics array grows unbounded.

Promises are pushed to pendingAnalytics but never removed after flushAnalytics(). In a long-running process using the programmatic API, this could cause a memory leak.

 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
     if (pendingAnalytics.length === 0) return;
     await Promise.race([
         Promise.all(pendingAnalytics),
         new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
     ]);
+    pendingAnalytics.length = 0;
 }

575-579: Programmatic create() doesn't validate region.

Unlike the CLI handler which calls validateRegion(), the programmatic API accepts any RegionId without checking if that region is currently available.

Consider validating against available regions or documenting that the region must be valid:

 export async function create(
     options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+    if (options?.region) {
+        await validateRegion(options.region);
+    }
     return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
 }

549-556: Hardcoded version string.

Consider reading version from package.json to avoid version drift.
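One possible sketch: resolve the version from the manifest at runtime rather than hardcoding it. `readPkgVersion` and the resolution path in the comment are illustrative names, assuming the built module sits one directory below package.json:

```typescript
import { readFileSync } from "node:fs";

// Read the version from a package.json at the given path so a CLI
// banner can never drift from the published manifest.
function readPkgVersion(pkgJsonPath: string): string {
    const pkg = JSON.parse(readFileSync(pkgJsonPath, "utf8")) as { version?: string };
    if (!pkg.version) throw new Error(`no "version" field in ${pkgJsonPath}`);
    return pkg.version;
}

// In create-db/src/index.ts this might resolve the package's own manifest:
//   const version = readPkgVersion(
//       new URL("../package.json", import.meta.url).pathname);
```

A build-time define (e.g. replacing a placeholder during the tsdown build) would avoid shipping package.json reads in the bundle, at the cost of extra build config.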

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 082dd39 and 6824f9f.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (14)
  • create-db/__tests__/create.test.js (0 hunks)
  • create-db/__tests__/create.test.ts (1 hunks)
  • create-db/__tests__/regions.test.js (0 hunks)
  • create-db/__tests__/regions.test.ts (1 hunks)
  • create-db/__tests__/utils.test.js (0 hunks)
  • create-db/__tests__/utils.test.ts (1 hunks)
  • create-db/index.js (0 hunks)
  • create-db/package.json (2 hunks)
  • create-db/src/cli.ts (1 hunks)
  • create-db/src/index.ts (1 hunks)
  • create-db/src/types.ts (1 hunks)
  • create-db/tsconfig.json (1 hunks)
  • create-db/tsdown.config.ts (1 hunks)
  • create-db/vitest.config.ts (1 hunks)
💤 Files with no reviewable changes (4)
  • create-db/__tests__/utils.test.js
  • create-db/__tests__/regions.test.js
  • create-db/__tests__/create.test.js
  • create-db/index.js
🧰 Additional context used
🧬 Code graph analysis (4)
create-db/src/cli.ts (1)
create-db/src/index.ts (1)
  • createDbCli (549-556)
create-db/__tests__/utils.test.ts (1)
create-db/src/types.ts (3)
  • CreateDatabaseResult (59-59)
  • isDatabaseError (61-65)
  • isDatabaseSuccess (67-71)
create-db/__tests__/regions.test.ts (2)
create-db/src/index.ts (2)
  • regions (592-594)
  • RegionSchema (34-34)
create-db/src/types.ts (1)
  • RegionSchema (3-10)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
  • RegionId (12-12)
  • RegionCoordinates (28-31)
  • UserLocation (14-21)
  • GeoLocationResponse (111-118)
  • Region (33-37)
  • RegionsResponse (124-124)
  • CreateDatabaseResult (59-59)
  • ApiResponse (99-104)
  • RegionSchema (3-10)
  • ProgrammaticCreateOptions (126-129)
🪛 GitHub Actions: Tests
create-db/__tests__/create.test.ts

[error] 24-24: AssertionError: expected '' to contain 'Database created successfully!'


[error] 29-29: SyntaxError: Unexpected end of JSON input


[error] 37-37: AssertionError: expected '' to contain 'Available Prisma Postgres regions'

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (9)
create-db/src/cli.ts (1)

1-3: LGTM!

Clean CLI entrypoint that correctly delegates to the createDbCli() API. The .js extension in the import is appropriate for ESM TypeScript with moduleResolution: "bundler".

create-db/vitest.config.ts (1)

1-7: LGTM!

Clean Vitest configuration that correctly targets the TypeScript test files under __tests__/.

create-db/package.json (2)

5-17: Well-structured ESM package exports.

The exports map correctly defines both the library entry point (.) with types and the CLI entry (./cli). The setup properly supports modern ESM consumption patterns.


35-48: Build and publish workflow looks solid.

Good use of prepublishOnly to ensure the package is built before publishing. The bin entries correctly point to the built CLI artifact.

create-db/__tests__/regions.test.ts (1)

23-35: LGTM!

Good coverage of RegionSchema validation including valid region IDs, invalid strings, empty strings, and type coercion rejection.

create-db/__tests__/utils.test.ts (1)

8-35: LGTM!

Type guard tests correctly verify both positive and negative cases with representative payloads matching the DatabaseResult and DatabaseError interfaces.

create-db/src/types.ts (3)

1-12: LGTM!

Clean Zod schema definition with proper type inference for RegionId.


39-71: LGTM!

Well-designed discriminated union with success as the discriminator and correctly implemented type guards.


23-26: Remove unused PartialUserLocation interface.

This interface is defined but has no usages in the codebase. Consider removing it to keep the types clean.

coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (3)
create-db/src/index.ts (3)

416-420: Use isCancel() instead of === null for cancellation check.

As flagged in past reviews, @clack/prompts' select() returns a cancel symbol when cancelled, not null. You must use the isCancel() helper to properly detect cancellation.

Apply this diff to fix the cancellation check:

-                if (selectedRegion === null) {
+                if (isCancel(selectedRegion)) {
                     cancel(pc.red("Operation cancelled."));
                     await flushAnalytics();
                     process.exit(0);
                 }

446-453: Add return after --env output to prevent fall-through.

As flagged in past reviews, after outputting the DATABASE_URL format (lines 451-452), the code falls through to line 455 which shows the interactive intro. The function must return after the --env output.

Apply this diff to add the missing return:

                 console.log(`DATABASE_URL="${result.connectionString}"`);
                 console.error(`\n# Claim your database at: ${result.claimUrl}`);
+                return;
             }

469-473: Use isCancel() instead of === null for cancellation check.

Same issue as line 416 - @clack/prompts' select() returns a cancel symbol, not null. Use the isCancel() helper.

Apply this diff to fix the cancellation check:

-                if (selectedRegion === null) {
+                if (isCancel(selectedRegion)) {
                     cancel(pc.red("Operation cancelled."));
                     await flushAnalytics();
                     process.exit(0);
                 }
🧹 Nitpick comments (1)
create-db/src/index.ts (1)

573-577: Consider validating region in programmatic API.

The create() function doesn't validate the region before calling createDatabaseCore(). While the API will reject invalid regions, validating upfront would provide better error messages and avoid unnecessary network calls.

Apply this diff to add region validation:

 export async function create(
     options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+    if (options?.region) {
+        await validateRegion(options.region);
+    }
     return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
 }
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 6824f9f and 209691d.

📒 Files selected for processing (3)
  • .github/workflows/tests.yml (1 hunks)
  • create-db/__tests__/create.test.ts (1 hunks)
  • create-db/src/index.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • create-db/__tests__/create.test.ts
🔇 Additional comments (2)
.github/workflows/tests.yml (2)

44-46: Build step correctly positioned before test execution.

The new build step is properly placed in the workflow sequence and necessary for the TypeScript-to-ESM compilation required by this PR.


53-54: These environment variables have fallback defaults and are already documented.

The code in create-db/src/index.ts defines default values for both CREATE_DB_WORKER_URL and CLAIM_DB_WORKER_URL:

  • CREATE_DB_WORKER_URL defaults to "https://create-db-temp.prisma.io"
  • CLAIM_DB_WORKER_URL defaults to "https://create-db.prisma.io"

If the GitHub Actions secrets are not configured, these variables will use the hardcoded defaults rather than being empty strings. The README.md already documents these environment variables and their purpose, including example values for local development. The workflow implementation is correct—the secrets allow overriding the defaults for CI if needed, but the tests will run successfully with the default endpoints if secrets are unconfigured.

Likely an incorrect or invalid review comment.

coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (1)
create-db/src/index.ts (1)

404-453: Add return after --env output to avoid falling through into interactive mode

When input.env is true (and input.json is false), the handler prints DATABASE_URL and the claim URL but then falls through out of the if (input.json || input.env) block into the interactive flow below. That can cause a second database to be created and an unexpected prompt even though the user requested non‑interactive .env output.

Add a return after the console.error to terminate the handler in --env mode:

             if (!result.success) {
                 console.error(result.message);
                 process.exit(1);
             }

             console.log(`DATABASE_URL="${result.connectionString}"`);
             console.error(`\n# Claim your database at: ${result.claimUrl}`);
+            return;
         }

Note: when both --json and --env are supplied, JSON currently wins and .env output is skipped because of the earlier if (input.json) { ... return; } – if that’s not intentional, you may want to enforce mutual exclusivity or emit both in a defined order.

🧹 Nitpick comments (6)
create-db/src/index.ts (6)

54-91: Avoid unbounded growth of pendingAnalytics in long‑lived processes

sendAnalytics keeps appending promises to pendingAnalytics, and flushAnalytics never clears the array. In a long‑running process (e.g. using the programmatic API repeatedly), this can accumulate resolved promises and re‑await them on every flush.

You can snapshot and clear the queue before waiting:

 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
-    if (pendingAnalytics.length === 0) return;
-    await Promise.race([
-        Promise.all(pendingAnalytics),
-        new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
-    ]);
+    if (pendingAnalytics.length === 0) return;
+
+    const toWait = pendingAnalytics.splice(0, pendingAnalytics.length);
+
+    await Promise.race([
+        Promise.all(toWait),
+        new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
+    ]);
 }

36-38: Consider moving dotenv side‑effect out of the library entrypoint

Calling dotenv.config() at module load time couples importing create-db (for the programmatic API) to reading and mutating process.env. That’s convenient for the CLI, but can be surprising when the package is used as a library inside other apps.

If you want a cleaner library surface, consider moving dotenv.config() into the actual CLI entry (e.g. cli.ts / bin script) and letting host applications decide when/how to load env files.


155-174: Avoid re‑parsing .env now that dotenv.config() is already used

readUserEnvFile manually parses .env, even though dotenv.config() has already populated process.env. That duplicates parsing logic and can subtly diverge from dotenv’s behavior.

Given you only need a couple of keys, you could simplify to reading from process.env (which will already include .env content) and drop the custom parser, or at least gate this helper behind a clear reason (e.g. “read only from file, ignore real env”).
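A sketch of the process.env-based variant — `readActorEnv` and the `ActorEnv` shape are illustrative names; only the two variable names come from the review:

```typescript
// Since dotenv.config() has already merged .env into process.env,
// the actor values can be read without a second hand-rolled parser.
interface ActorEnv {
    name?: string;
    project?: string;
}

function readActorEnv(env: NodeJS.ProcessEnv = process.env): ActorEnv {
    return {
        name: env.PRISMA_ACTOR_NAME?.trim() || undefined,
        project: env.PRISMA_ACTOR_PROJECT?.trim() || undefined,
    };
}
```

Passing the env object as a parameter keeps the helper trivially testable without mutating the real process.env.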


221-338: Defensive handling for non‑2xx responses without error payload (optional)

createDatabaseCore special‑cases 429 and otherwise relies on the worker always returning either valid JSON with an error field or a non‑JSON body (caught as invalid_json). If the API ever returns a non‑2xx JSON response without error (e.g. a bare {} with status 500), this path would be treated as success.

If you want extra robustness, consider also checking !resp.ok before returning success, e.g.:

-    if (result.error) {
+    if (!resp.ok || result.error) {
         // existing error handling...

This keeps current behavior but guards against unexpected wire formats.


532-544: Optional: reuse checkOnline() for the regions command for consistent UX

The regions sub‑command currently calls getRegions() directly. If the API is unreachable or returns a non‑OK response, users may see a thrown error stack rather than the friendly “Cannot reach Prisma Postgres API server” messaging you implemented in checkOnline().

For consistency with create, consider:

     regions: os
         .meta({ description: "List available Prisma Postgres regions" })
         .handler(async (): Promise<void> => {
-            const regions = await getRegions();
+            await checkOnline();
+            const regions = await getRegions();

573-592: Programmatic API reuses core logic but never flushes analytics (design choice)

create() and regions() nicely reuse createDatabaseCore and getRegions, giving a clean programmatic surface. However, createDatabaseCore sends analytics and relies on flushAnalytics() being called by the CLI; the programmatic path never flushes, and callers have no way to opt out or flush explicitly.

If this API is intended for embedding in other apps/services, consider one of:

  • Disabling analytics for programmatic calls (e.g. flag parameter).
  • Exposing a documented flushAnalytics() export for hosts that care.
  • Injecting an analytics sink so library users can stub/disable it in tests.
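A minimal sketch of the last option — an injectable sink so library hosts can disable or capture analytics. All names here (`AnalyticsSink`, `setAnalyticsSink`, `track`) are illustrative, not the package's actual API:

```typescript
type AnalyticsEvent = { name: string; props?: Record<string, unknown> };

// Anything that can receive an event: a real HTTP client, a test
// collector, or a no-op for callers who opt out of analytics.
interface AnalyticsSink {
    send(event: AnalyticsEvent): Promise<void>;
}

const noopSink: AnalyticsSink = { send: async () => {} };

let sink: AnalyticsSink = noopSink;

function setAnalyticsSink(next: AnalyticsSink): void {
    sink = next;
}

// Library code calls track(); the active sink decides what happens.
async function track(event: AnalyticsEvent): Promise<void> {
    await sink.send(event);
}
```

Tests can then swap in a collecting sink and assert on the captured events, with no network and no flush timing to reason about.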
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 209691d and 99989a1.

📒 Files selected for processing (1)
  • create-db/src/index.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
  • RegionId (12-12)
  • RegionCoordinates (28-31)
  • UserLocation (14-21)
  • GeoLocationResponse (111-118)
  • Region (33-37)
  • RegionsResponse (124-124)
  • CreateDatabaseResult (59-59)
  • ApiResponse (99-104)
  • RegionSchema (3-10)
  • ProgrammaticCreateOptions (126-129)
🔇 Additional comments (2)
create-db/src/index.ts (2)

100-153: Location → nearest‑region helpers look correct and defensive

The detectUserLocation + getRegionClosestToLocation combo handles network failures and malformed/partial coordinates gracefully (null + NaN checks) while doing a proper Haversine distance calculation over the known REGION_COORDINATES. This is a solid, side‑effect‑free helper pair.


176-219: Online check and region fetching/validation are straightforward and consistent

checkOnline, getRegions, and validateRegion form a clear, reusable layer: health‑check first, then fetch, then validate against the live region list with a helpful error message. Error paths exit with a non‑zero code and flush analytics, which is appropriate for a CLI.

coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (2)
create-db/src/index.ts (2)

100-121: Add timeout to geolocation fetch to prevent CLI hangs.

Unlike the analytics calls, detectUserLocation blocks the main flow. If ipapi.co is slow or unresponsive, the CLI will hang indefinitely. Consider adding an AbortController with a reasonable timeout (e.g., 3-5 seconds).

 async function detectUserLocation(): Promise<UserLocation | null> {
     try {
+        const controller = new AbortController();
+        const timer = setTimeout(() => controller.abort(), 3000);
         const response = await fetch("https://ipapi.co/json/", {
             method: "GET",
             headers: { "User-Agent": "create-db-cli/1.0" },
+            signal: controller.signal,
         });
+        clearTimeout(timer);

         if (!response.ok) return null;

176-206: Consider adding timeouts to API fetches for resilience.

checkOnline() and getRegions() lack timeouts. While they have error handling, a hung connection will block indefinitely without feedback to the user. Adding a 10-second timeout would improve resilience.
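A shared wrapper along those lines, assuming Node 18+ global fetch (`fetchWithTimeout` is an illustrative name; the 10-second default mirrors the suggestion above):

```typescript
// Wrap fetch with an AbortController-backed timeout so a hung
// connection fails fast instead of blocking the CLI indefinitely.
async function fetchWithTimeout(
    url: string,
    init: RequestInit = {},
    timeoutMs = 10_000,
): Promise<Response> {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
        return await fetch(url, { ...init, signal: controller.signal });
    } finally {
        clearTimeout(timer); // avoid keeping the event loop alive
    }
}
```

checkOnline(), getRegions(), and the geolocation lookup could all route through one such helper, each with a timeout suited to how critical the call is.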

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 99989a1 and 1d96c16.

📒 Files selected for processing (1)
  • create-db/src/index.ts (1 hunks)
🔇 Additional comments (7)
create-db/src/index.ts (7)

1-34: Imports and type exports look good.

The isCancel import is now included, addressing the previous review feedback. The type exports provide a clean public API surface.


36-52: Configuration setup is appropriate.

The quiet: true option correctly suppresses warnings for missing .env files. Region coordinates are appropriately hardcoded for geolocation calculations.


123-153: Haversine distance calculation is correctly implemented.

The function properly handles type coercion and edge cases, returning null for invalid coordinates.


155-174: Basic .env parsing is acceptable for this use case.

The simple parser handles the expected PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT variables. More complex edge cases (multiline values, escaped quotes) aren't handled, but this is fine given the limited scope.


221-338: Database creation logic is well-structured with comprehensive error handling.

The function properly handles rate limiting, JSON parse errors, and API errors. Connection string construction and analytics tracking are correctly implemented.


340-591: Router and CLI handlers are correctly implemented.

The cancellation checks now properly use isCancel(), and the control flow for JSON/env output modes includes proper returns. The interactive region selection logic is sound.


610-628: Programmatic API exports are clean and well-documented.

The create() and regions() functions provide a clean programmatic interface with helpful JSDoc examples.

coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (2)
create-db/src/index.ts (2)

86-92: pendingAnalytics array is never cleared after flushing.

After flushAnalytics() completes, the array still contains all settled promises, causing unbounded growth on repeated calls.

Apply this diff to clear the array after flushing:

 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
     if (pendingAnalytics.length === 0) return;
+    const toFlush = [...pendingAnalytics];
+    pendingAnalytics.length = 0;
     await Promise.race([
-        Promise.all(pendingAnalytics),
+        Promise.all(toFlush),
         new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
     ]);
 }

594-594: Remove unused caller variable.

caller is created but never used anywhere in the file.

-const caller = createRouterClient(router, { context: {} });
🧹 Nitpick comments (4)
create-db/src/index.ts (4)

101-122: Consider adding a timeout to the geolocation request.

detectUserLocation lacks a timeout, so a slow or unresponsive ipapi.co could cause the CLI to hang. Consider using AbortController similar to sendAnalytics.

 async function detectUserLocation(): Promise<UserLocation | null> {
     try {
+        const controller = new AbortController();
+        const timer = setTimeout(() => controller.abort(), 3000);
         const response = await fetch("https://ipapi.co/json/", {
             method: "GET",
             headers: { "User-Agent": "create-db-cli/1.0" },
+            signal: controller.signal,
         });
+        clearTimeout(timer);

         if (!response.ok) return null;

457-488: Consider handling the case where envPath is a directory.

The code appends to envPath without checking if it's a valid file path. If a user passes a directory path, appendFileSync will fail with a cryptic error.

             try {
                     const targetEnvPath = envPath!;
+                    const stat = fs.existsSync(targetEnvPath) ? fs.statSync(targetEnvPath) : null;
+                    if (stat?.isDirectory()) {
+                        console.error(pc.red(`Cannot write to directory: ${targetEnvPath}. Please specify a file path.`));
+                        process.exit(1);
+                    }
                     const lines = [

589-589: Hardcoded version string may drift from package.json.

Consider importing the version from package.json or using a build-time replacement to keep them in sync.


618-622: Consider validating the region in the programmatic API.

The create function doesn't validate the region option before calling createDatabaseCore. An invalid region will result in an API error from the backend rather than a clear client-side validation error.

 export async function create(
     options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+    const region = options?.region || "us-east-1";
+    if (options?.region) {
+        await validateRegion(region);
+    }
-    return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
+    return createDatabaseCore(region, options?.userAgent);
 }

Alternatively, if you want to allow the backend to handle validation, document this behavior in the JSDoc.

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1d96c16 and 550b435.

📒 Files selected for processing (1)
  • create-db/src/index.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
  • RegionId (12-12)
  • RegionCoordinates (28-31)
  • UserLocation (14-21)
  • GeoLocationResponse (111-118)
  • Region (33-37)
  • RegionsResponse (124-124)
  • CreateDatabaseResult (59-59)
  • ApiResponse (99-104)
  • RegionSchema (3-10)
  • ProgrammaticCreateOptions (126-129)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
create-db/src/index.ts (5)

1-35: LGTM!

Imports and re-exports are well-organized. The isCancel import from @clack/prompts is correctly included as addressed from previous feedback.


37-53: LGTM!

Configuration setup with sensible defaults and well-defined region coordinates matching the RegionSchema enum.


177-207: LGTM!

Online check and region fetching are appropriately implemented. The health check provides clear user feedback on connection issues.


222-339: LGTM!

createDatabaseCore handles rate limiting, JSON parsing errors, and API errors robustly with appropriate analytics tracking. Connection string construction properly URL-encodes credentials.


570-583: LGTM!

The regions handler is clean and straightforward. Error propagation to the CLI framework is acceptable.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 9

🧹 Nitpick comments (2)
.github/workflows/publish.yml (2)

65-76: Extract duplicated build logic into a reusable workflow.

The build steps (checking for a build script and running pnpm run build) are duplicated between the build job and the experimental-release job. Consider extracting them into a reusable workflow to reduce maintenance burden.

Create .github/workflows/build-packages.yml as a reusable workflow, then call it from both jobs:

# .github/workflows/build-packages.yml
name: Build Packages

on:
  workflow_call:

jobs:
  build-packages:
    runs-on: ubuntu-latest
    steps:
      - name: 🔨 Build packages
        run: |
          for pkg in create-db create-pg create-postgres; do
            if [ -f "$pkg/package.json" ]; then
              cd "$pkg"
              if grep -q '"build"' package.json; then
                echo "Building $pkg..."
                pnpm run build || echo "Build skipped for $pkg (no build script)"
              fi
              cd - >/dev/null
            fi
          done

Then in publish.yml, call the reusable workflow after setup steps in both build and experimental-release jobs.

Also applies to: 219-230


65-76: Extract duplicated build logic into a reusable workflow.

The build step (checking for a build script and running pnpm run build) is duplicated between the build and experimental-release jobs. Create a reusable workflow to reduce maintenance burden and ensure consistency.

Also applies to: 219-230

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 550b435 and 3d5dd13.

📒 Files selected for processing (3)
  • .github/workflows/preview.yml (0 hunks)
  • .github/workflows/publish.yml (1 hunks)
  • .github/workflows/release.yml (0 hunks)
💤 Files with no reviewable changes (2)
  • .github/workflows/preview.yml
  • .github/workflows/release.yml
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/publish.yml

215-215: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details

(expression)


260-260: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (5)
.github/workflows/publish.yml (5)

21-22: Workflow permissions set restrictively at the top level.

The default permissions are contents: read, which is appropriate. Child jobs explicitly elevate permissions only where needed (release jobs need contents: write, pull-requests: write, id-token: write). This follows least-privilege principle well.


12-12: Concurrency control prevents simultaneous workflow runs.

The concurrency group is scoped to workflow + ref, which prevents duplicate runs for the same branch/PR. This is appropriate.


21-22: Permission model appropriately follows least-privilege principle.

Top-level permissions are read-only by default. Jobs that need elevated permissions (contents: write, pull-requests: write, id-token: write) explicitly declare them. This is a security best practice.

Also applies to: 94-97


12-12: Concurrency control prevents simultaneous runs.

Concurrency group scoped to workflow + ref appropriately prevents duplicate executions on the same branch/PR.


154-159: The auto-generation code doesn't execute in normal operation.

The condition if [ -z "$(ls -A .changeset 2>/dev/null)" ] checks whether the .changeset directory is empty. Since .changeset/config.json and .changeset/README.md are committed to the repository and always present, the directory is never empty. The defensive code block never runs in normal circumstances.

The architectural concern about auto-generated changesets is theoretically sound, but it doesn't apply to this codebase in its current state. The condition that would trigger the auto-generation (an empty .changeset directory) should not occur during normal workflow execution.

Likely an incorrect or invalid review comment.

Comment on lines +154 to +159
- name: 📝 Ensure Changeset Exists
run: |
if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
echo "No changeset found. Creating a default one..."
pnpm changeset add --empty --message "chore(release): auto-generated changeset"
fi

⚠️ Potential issue | 🟠 Major

Changeset fallback creates generic entries that may mask missing release notes.

Creating an empty changeset with auto-generated messages could hide real version bumps without proper documentation of what changed. This should either fail explicitly or require manual intervention to ensure releases are properly documented.

Consider failing explicitly when changesets are missing:

             - name: 📝 Ensure Changeset Exists
               run: |
-                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-                    echo "No changeset found. Creating a default one..."
-                    pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-                  fi
+                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+                    echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+                    exit 1
+                  fi
🤖 Prompt for AI Agents
.github/workflows/publish.yml around lines 154-159: the current step
auto-creates an empty changeset which can mask missing release notes; change
this step to fail the workflow if .changeset is empty (exit non-zero with a
clear message instructing the author to add a proper changeset) or gate
auto-generation behind an explicit opt-in flag (e.g., only create an auto
changeset when a repo secret/env var is set), and update the run script to emit
the failure message and exit 1 when the directory is empty unless the opt-in
flag is present.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (4)
.github/workflows/publish.yml (4)

85-91: ⚠️ Fix silent publish failure handling to prevent incomplete releases.

Line 88 uses || echo to suppress publish failures, allowing the loop to continue even if all retries fail. In a monorepo, a single failed package should block the entire workflow. This was flagged in prior reviews and remains unresolved.

Replace the error suppression with an explicit failure:

-                    pnpm publish --access public --no-git-checks || echo "Publish failed again for $pkg"
+                    if ! pnpm publish --access public --no-git-checks; then
+                      echo "❌ Publish failed for $pkg after retry"
+                      exit 1
+                    fi

93-99: ⚠️ Replace automatic changeset fallback with explicit failure.

Auto-generating an empty changeset can mask missing release notes and hide real version bumps. This was flagged in prior reviews.

Fail explicitly if changesets are missing:

             - name: 📝 Ensure Changeset Exists
               run: |
-                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-                    echo "No changeset found. Creating a default one..."
-                    pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-                  fi
+                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+                    echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+                    exit 1
+                  fi

152-156: 🔴 Fix script injection vulnerability in preview tag creation.

Line 154 directly interpolates github.event.pull_request.head.ref into a shell script. Although tr sanitizes forward slashes, it does not prevent shell metacharacter injection (e.g., $, backticks, ;, |). An attacker can craft a malicious branch name to execute arbitrary commands. GitHub's security documentation requires untrusted context values to be passed via environment variables. This was flagged in prior reviews.

Apply this diff:

             - name: 🔖 Create unique preview tag
+              env:
+                HEAD_REF: ${{ github.event.pull_request.head.ref }}
               run: |
-                  SAFE_REF=$(echo "${{ github.event.pull_request.head.ref }}" | tr '/' '-')
+                  SAFE_REF=$(echo "${HEAD_REF}" | tr '/' '-')
                   echo "PRE_TAG=pr${{ github.event.number }}-${SAFE_REF}-${{ github.run_id }}" >> $GITHUB_ENV

191-224: ⚠️ Upgrade actions/github-script from v6 to v8.

Line 192 uses actions/github-script@v6, which is outdated and runs on Node 16. The latest stable version is v8.0.0 (released September 2025), which provides current Node.js runtime support and critical updates. This was flagged in prior reviews.

Update the action reference:

-              uses: actions/github-script@v6
+              uses: actions/github-script@v8
🧹 Nitpick comments (1)
.github/workflows/publish.yml (1)

166-171: Clean up redundant environment variable reassignments.

Lines 166–171 reassign CREATE_DB_WORKER_URL and CLAIM_DB_WORKER_URL after they are already declared in the step's env block (lines 160–163). This is redundant; remove the inline reassignments.

               run: |
-                  # Resolve URLs with fallback
-                  CREATE_DB_WORKER_URL="${{ secrets.CREATE_DB_WORKER_URL }}"
-                  CLAIM_DB_WORKER_URL="${{ secrets.CLAIM_DB_WORKER_URL }}"
-
-                  # Persist for next steps
-                  echo "CREATE_DB_WORKER_URL=$CREATE_DB_WORKER_URL" >> $GITHUB_ENV
-                  echo "CLAIM_DB_WORKER_URL=$CLAIM_DB_WORKER_URL" >> $GITHUB_ENV
-
                   echo "Using CREATE_DB_WORKER_URL=$CREATE_DB_WORKER_URL"
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3d5dd13 and 3a9d744.

📒 Files selected for processing (1)
  • .github/workflows/publish.yml (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/publish.yml

153-153: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details

(expression)


192-192: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
.github/workflows/publish.yml (5)

1-27: ✅ Workflow triggers and environment setup look good.

Concurrency, event triggers, and secret/workspace env variables are properly configured.


29-64: ✅ Release job setup is correct.

Node/pnpm configuration and permissions are appropriate.


100-108: ✅ Changesets PR preparation is correct.


110-145: ✅ Preview job setup is correct.


146-151: ✅ README copying logic is straightforward.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (3)
.github/workflows/release.yml (3)

93-98: Changeset fallback creates generic entries that may mask missing release notes.

Creating an empty changeset with auto-generated messages could hide real version bumps without proper documentation of what changed. This should either fail explicitly or require manual intervention to ensure releases are properly documented.

Consider failing explicitly when changesets are missing:

             - name: 📝 Ensure Changeset Exists
               run: |
-                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-                    echo "No changeset found. Creating a default one..."
-                    pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-                  fi
+                  if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+                    echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+                    exit 1
+                  fi

85-89: Hard-fail on publish failures to prevent silent deployment failures.

Line 88 silently suppresses the second publish retry with || echo "...", allowing the workflow to continue and report success even if all packages fail to publish. This is particularly dangerous in a multi-package loop—failed releases are masked entirely.

Remove the error suppression:

                    # First try to publish
                    if ! pnpm publish --access public --no-git-checks; then
                      echo "Publish failed, trying to bump version and retry..."
                      npm version patch --no-git-tag-version
-                     pnpm publish --access public --no-git-checks || echo "Publish failed again for $pkg"
+                     pnpm publish --access public --no-git-checks
                    fi

Alternatively, track failed packages and exit with a non-zero code if any publish fails.


191-192: Upgrade actions/github-script from v6 to v8.

Version 6 is outdated. The latest version is v8.0.0 and should be used instead.

-              uses: actions/github-script@v6
+              uses: actions/github-script@v8
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3a9d744 and 3dbbffd.

📒 Files selected for processing (1)
  • .github/workflows/release.yml (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/release.yml

153-153: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details

(expression)


192-192: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker

@github-actions
Copy link

github-actions bot commented Dec 9, 2025

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr67-overhaul-19905444027:

npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67

Worker URLs
• Create-DB Worker: https://create-db-temp.prisma.io
• Claim-DB Worker: https://create-db.prisma.io

These will live as long as this PR exists under tag pr67-overhaul-19905444027.

aidankmcalister and others added 4 commits December 9, 2025 09:18
feats and fixes: changes to structure, console log cleanups, geolocation detection, console link, and prisma 7 changes
@github-actions
Copy link

github-actions bot commented Dec 9, 2025

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr67-overhaul-20068192317:

npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67

Worker URLs
• Create-DB Worker: https://create-db-temp.prisma.io
• Claim-DB Worker: https://create-db.prisma.io

These will live as long as this PR exists under tag pr67-overhaul-20068192317.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

♻️ Duplicate comments (1)
create-db/src/index.ts (1)

359-359: Remove unused caller variable.

This variable is created but never referenced. Per past review comments, this is dead code.

Apply this diff:

-const caller = createRouterClient(router, { context: {} });
-
🧹 Nitpick comments (14)
create-db/src/env-utils.ts (1)

1-23: Basic .env parsing looks correct; consider aligning with dotenv semantics.

This implementation handles the common KEY=VALUE cases and ignores comments/blank lines, which is fine for simple use. Two optional improvements:

  • Handle Windows line endings by splitting on /\r?\n/ instead of "\n".
  • If you need full .env parity (exports, escaping, multi-line quirks), consider delegating to dotenv.parse(fs.readFileSync(envPath)) so CLI and other tooling stay behavior‑compatible.
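
To illustrate the first point, here is a minimal parser sketch (not the code under review) that splits on /\r?\n/ so Windows-style line endings don't leak a trailing \r into values, and strips simple surrounding quotes:

```typescript
// Minimal .env parser sketch: handles CRLF line endings, skips comments and
// blank lines, and unwraps single/double-quoted values. Not a full dotenv
// replacement (no exports, escapes, or multi-line values).
export function parseEnv(content: string): Record<string, string> {
  const result: Record<string, string> = {};
  for (const rawLine of content.split(/\r?\n/)) {
    const line = rawLine.trim();
    if (!line || line.startsWith("#")) continue;
    const eq = line.indexOf("=");
    if (eq === -1) continue; // no KEY=VALUE shape, skip
    const key = line.slice(0, eq).trim();
    let value = line.slice(eq + 1).trim();
    const quoted =
      value.length >= 2 &&
      ((value.startsWith('"') && value.endsWith('"')) ||
        (value.startsWith("'") && value.endsWith("'")));
    if (quoted) value = value.slice(1, -1);
    result[key] = value;
  }
  return result;
}
```

For anything beyond this, delegating to dotenv.parse keeps behavior aligned with the wider ecosystem.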
claim-db-worker/lib/project-transfer.ts (1)

7-12: Return-shape extension is reasonable; tighten transferResponse typing.

The new parsing logic includes transferResponse (parsed JSON, or { rawResponse } as a fallback) on success and null on failure, making the API more informative without changing control flow. To keep type safety:

  • Prefer transferResponse: unknown (or a dedicated TransferResponse type/union) instead of any, so call sites must intentionally narrow before use.
  • If consumers care about the HTTP status on success as well, consider adding status in the success object for symmetry.

Also applies to: 31-52

create-db/src/analytics.ts (1)

1-39: Non-blocking analytics are good; clear pendingAnalytics and document property key style.

The overall pattern (fire‑and‑forget with a short abort and optional flush) is solid, but there are two follow‑ups worth addressing:

  1. Clear pendingAnalytics after a flush to avoid growth over time.
    Right now, resolved promises stay in the array forever. For a long‑lived programmatic API this can leak memory and make subsequent flushes keep waiting on already‑flushed events. You can snapshot and clear:
 export async function flushAnalytics(maxWaitMs = 500): Promise<void> {
-  if (pendingAnalytics.length === 0) return;
-  await Promise.race([
-    Promise.all(pendingAnalytics),
+  if (pendingAnalytics.length === 0) return;
+
+  const toFlush = pendingAnalytics.splice(0, pendingAnalytics.length);
+
+  await Promise.race([
+    Promise.all(toFlush),
     new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
   ]);
 }
  2. Clarify analytics property key conventions.
    properties is a generic bag, and you always inject distinct_id. To keep consistency with prior analytics in this repo, it’s helpful to document (e.g. in a JSDoc on sendAnalytics) that custom property keys passed via properties should follow the existing kebab‑case convention (e.g. "user-agent", "error-type"), even though distinct_id itself is a special case key. Based on learnings, this avoids duplicate attributes downstream.
create-db/vitest.config.ts (1)

1-7: Vitest config matches the new TypeScript test layout; consider future patterns.

The config will pick up all .test.ts files under __tests__, which aligns with the TS tests added in this PR. If you later add React/component tests (.test.tsx) or place tests alongside source files, you may want to expand include accordingly and re-run the suite to confirm discovery.

claim-db-worker/app/api/auth/callback/route.ts (1)

117-138: projectData is stored but never used.

The variable projectData from validateProject(projectID) is captured but not referenced anywhere in the success path. Either remove the assignment or use projectData instead of making another fetch for project details on lines 147-155.

-    // Validate project exists and get project data
-    let projectData;
+    // Validate project exists
     try {
-      projectData = await validateProject(projectID);
+      await validateProject(projectID);
     } catch (error) {
create-db/__tests__/create.test.ts (2)

27-41: Minor redundancy in output concatenation.

Line 36 re-concatenates result.stdout + result.stderr when result.all is already computed in the helper (line 22). Consider using just result.all.

     expect(result.exitCode).toBe(0);
     // clack outputs to stderr, so check all output
-    const allOutput = result.all || result.stdout + result.stderr;
+    const allOutput = result.all;
     if (!allOutput.includes("Database created successfully!")) {

65-79: Same redundancy as default command test.

Line 74 has the same redundant fallback. Otherwise, the test properly validates the regions command output.

     expect(result.exitCode).toBe(0);
     // clack outputs to stderr
-    const allOutput = result.all || result.stdout + result.stderr;
+    const allOutput = result.all;
create-db/src/regions.ts (1)

21-33: Consider adding a timeout to the fetch call.

The fetch call has no timeout, which could cause the CLI to hang indefinitely if the server is unresponsive. Consider using AbortController with a timeout similar to the pattern in sendAnalytics.

 export async function getRegions(workerUrl: string): Promise<Region[]> {
-  const res = await fetch(`${workerUrl}/regions`);
+  const controller = new AbortController();
+  const timer = setTimeout(() => controller.abort(), 10000);
+
+  try {
+    const res = await fetch(`${workerUrl}/regions`, {
+      signal: controller.signal,
+    });
 
-  if (!res.ok) {
-    throw new Error(
-      `Failed to fetch regions. Status: ${res.status} ${res.statusText}`
-    );
-  }
+    if (!res.ok) {
+      throw new Error(
+        `Failed to fetch regions. Status: ${res.status} ${res.statusText}`
+      );
+    }
 
-  const data = (await res.json()) as RegionsResponse;
-  const regions: Region[] = Array.isArray(data) ? data : (data.data ?? []);
-  return regions.filter((region) => region.status === "available");
+    const data = (await res.json()) as RegionsResponse;
+    const regions: Region[] = Array.isArray(data) ? data : (data.data ?? []);
+    return regions.filter((region) => region.status === "available");
+  } finally {
+    clearTimeout(timer);
+  }
 }
claim-db-worker/app/web/connect/page.tsx (2)

86-95: Password masking replaces the entire string, including the protocol prefix.

The regex /./g matches every character, so the whole connection string is masked, including postgresql://. Consider preserving the protocol prefix for clarity.

     let processedConnectionString = connectionString;
     if (!showPassword) {
-      processedConnectionString = connectionString.replace(/./g, "•");
+      // Preserve protocol prefix for clarity
+      const match = connectionString.match(/^(postgresql:\/\/)/);
+      const prefix = match ? match[1] : "";
+      processedConnectionString = prefix + connectionString.slice(prefix.length).replace(/./g, "•");
     }

140-154: Remove commented-out code.

The connection type toggle is fully commented out. If this functionality is no longer needed, remove it entirely to keep the codebase clean.

create-db/src/database.ts (2)

22-31: Consider adding a timeout to the fetch call.

The main database creation request has no timeout, which could cause the CLI to hang if the server is unresponsive. This is especially important for a CLI tool where users expect responsive behavior.

+  const controller = new AbortController();
+  const timer = setTimeout(() => controller.abort(), 30000);
+
-  const resp = await fetch(`${createDbWorkerUrl}/create`, {
+  const resp = await fetch(`${createDbWorkerUrl}/create`, {
     method: "POST",
     headers: { "Content-Type": "application/json" },
     body: JSON.stringify({
       region,
       name,
       utm_source: getCommandName(),
       userAgent,
     }),
+    signal: controller.signal,
   });
+  clearTimeout(timer);

Note: You'll also need to handle AbortError in a try-catch and return an appropriate timeout error response.
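
The AbortError handling mentioned in the note could look like the following sketch (a standalone wrapper under assumed names, not the shape of createDatabaseCore itself; requires Node 18+ for the global fetch):

```typescript
// Minimal sketch: enforce a timeout on fetch and map an aborted request to a
// structured error instead of an unhandled rejection. Names are illustrative.
export async function fetchWithTimeout(
  url: string,
  init: RequestInit = {},
  timeoutMs = 30_000
): Promise<{ ok: true; response: Response } | { ok: false; error: string }> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const response = await fetch(url, { ...init, signal: controller.signal });
    return { ok: true, response };
  } catch (err) {
    // Node's fetch rejects with an error named "AbortError" when the signal fires
    if ((err as { name?: string } | null)?.name === "AbortError") {
      return { ok: false, error: `Request timed out after ${timeoutMs}ms` };
    }
    return { ok: false, error: err instanceof Error ? err.message : String(err) };
  } finally {
    clearTimeout(timer);
  }
}
```

The caller can then turn the { ok: false } branch into the CLI's normal error output rather than crashing on an uncaught AbortError.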


90-96: Complex conditional for connection details extraction.

The logic for extracting directConnDetails from two different response shapes is correct but could benefit from a brief comment explaining the two API response formats being handled.

+  // Handle both wrapped (data.database) and unwrapped (databases[]) API response shapes
   const apiKeys = database?.apiKeys;
   const directConnDetails = result.data
     ? apiKeys?.[0]?.directConnection
     : result.databases?.[0]?.apiKeys?.[0]?.ppgDirectConnection;
create-db/src/geolocation.ts (1)

11-22: Consider using environment variables for test location override.

The current pattern requires modifying source code to enable test mode. Consider reading from process.env.TEST_LOCATION to allow runtime configuration without code changes.

Example refactor:

const TEST_LOCATION: { latitude: number; longitude: number } | null =
  process.env.TEST_LOCATION && TEST_LOCATIONS[process.env.TEST_LOCATION as keyof typeof TEST_LOCATIONS]
    ? TEST_LOCATIONS[process.env.TEST_LOCATION as keyof typeof TEST_LOCATIONS]
    : null;
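
A self-contained version of that idea might look like this (the location table and names here are illustrative coordinates, not the repo's TEST_LOCATIONS map; the lookup key would come from process.env.TEST_LOCATION at the call site):

```typescript
// Illustrative env-driven override: map a test-location name to coordinates,
// returning null when the override is unset or unknown.
const TEST_LOCATIONS = {
  tokyo: { latitude: 35.68, longitude: 139.69 },
  frankfurt: { latitude: 50.11, longitude: 8.68 },
} as const;

export function resolveTestLocation(
  name: string | undefined
): { latitude: number; longitude: number } | null {
  if (!name) return null;
  return TEST_LOCATIONS[name as keyof typeof TEST_LOCATIONS] ?? null;
}
```

Falling back to null keeps the normal geolocation path in control whenever the override is absent or misspelled.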
create-db/src/index.ts (1)

383-390: Consider validating region parameter in programmatic API.

The create() function doesn't validate the region parameter before calling the API. Adding validation would provide faster, more helpful error messages for invalid regions.

Apply this diff to add validation:

 export async function create(
   options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+  if (options?.region) {
+    await validateRegionWithUrl(options.region);
+  }
   return createDatabaseCoreWithUrl(
     options?.region || "us-east-1",
     options?.userAgent
   );
 }

Alternatively, if you prefer to keep the programmatic API simple and not throw errors, you could document that invalid regions will be caught by the API layer.

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3dbbffd and 8582fbf.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (21)
  • .vscode/settings.json (0 hunks)
  • claim-db-worker/app/api/auth/callback/route.ts (2 hunks)
  • claim-db-worker/app/success/page.tsx (2 hunks)
  • claim-db-worker/app/web/connect/page.tsx (6 hunks)
  • claim-db-worker/lib/prismaSchemaEditor/defaultSchema.ts (0 hunks)
  • claim-db-worker/lib/project-transfer.ts (2 hunks)
  • claim-db-worker/lib/response-utils.ts (1 hunks)
  • claim-db-worker/package.json (1 hunks)
  • create-db-worker/src/index.ts (1 hunks)
  • create-db/__tests__/create.test.ts (1 hunks)
  • create-db/package.json (2 hunks)
  • create-db/src/analytics.ts (1 hunks)
  • create-db/src/cli.ts (1 hunks)
  • create-db/src/database.ts (1 hunks)
  • create-db/src/env-utils.ts (1 hunks)
  • create-db/src/geolocation.ts (1 hunks)
  • create-db/src/index.ts (1 hunks)
  • create-db/src/regions.ts (1 hunks)
  • create-db/src/types.ts (1 hunks)
  • create-db/vitest.config.ts (1 hunks)
  • schema-api-routes/src/routes/schema/pull.ts (0 hunks)
💤 Files with no reviewable changes (3)
  • claim-db-worker/lib/prismaSchemaEditor/defaultSchema.ts
  • .vscode/settings.json
  • schema-api-routes/src/routes/schema/pull.ts
✅ Files skipped from review due to trivial changes (1)
  • create-db-worker/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • create-db/src/cli.ts
  • create-db/src/types.ts
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-08-27T16:39:21.271Z
Learnt from: aidankmcalister
Repo: prisma/create-db PR: 48
File: create-db/index.js:423-431
Timestamp: 2025-08-27T16:39:21.271Z
Learning: In the prisma/create-db project, analytics property keys should use the existing kebab-case convention (e.g., "user-agent", "error-type", "status-code") rather than snake_case, to maintain consistency with existing analytics data and avoid duplicate attributes.

Applied to files:

  • create-db/src/analytics.ts
🧬 Code graph analysis (5)
create-db/src/index.ts (6)
create-db/src/analytics.ts (2)
  • sendAnalytics (3-31)
  • flushAnalytics (33-39)
create-db/src/regions.ts (3)
  • checkOnline (4-19)
  • getRegions (21-33)
  • validateRegion (35-49)
create-db/src/database.ts (2)
  • createDatabaseCore (12-135)
  • getCommandName (5-10)
create-db/src/types.ts (5)
  • RegionSchema (3-10)
  • RegionId (12-12)
  • ProgrammaticCreateOptions (126-129)
  • CreateDatabaseResult (59-59)
  • Region (33-37)
create-db/src/env-utils.ts (1)
  • readUserEnvFile (4-23)
create-db/src/geolocation.ts (2)
  • detectUserLocation (67-119)
  • getRegionClosestToLocation (125-173)
create-db/src/regions.ts (2)
create-db-worker/src/index.ts (1)
  • fetch (15-195)
create-db/src/types.ts (2)
  • Region (33-37)
  • RegionsResponse (124-124)
create-db/src/database.ts (2)
create-db/src/types.ts (2)
  • CreateDatabaseResult (59-59)
  • ApiResponse (99-104)
create-db/src/analytics.ts (1)
  • sendAnalytics (3-31)
create-db/src/geolocation.ts (1)
create-db/src/types.ts (3)
  • RegionCoordinates (28-31)
  • UserLocation (14-21)
  • GeoLocationResponse (111-118)
claim-db-worker/app/api/auth/callback/route.ts (2)
claim-db-worker/lib/auth-utils.ts (1)
  • validateProject (35-57)
claim-db-worker/lib/response-utils.ts (1)
  • redirectToSuccess (27-42)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (21)
claim-db-worker/package.json (1)

29-31: Validate Prisma and React upgrades against the app runtime.

Bumping Prisma to 7.x and React/React DOM to 19.1.2 is a non-trivial change; the manifest side looks fine, but please double-check migration notes and run the app/tests to catch any subtle breakages (especially Prisma schema/Client and React 19 behavior changes).

claim-db-worker/lib/response-utils.ts (1)

27-41: Success redirect now carries workspace and database IDs; looks good.

The additional workspaceId and databaseId parameters are correctly threaded into the query string via URLSearchParams, and getBaseUrl keeps redirects host-relative. This should integrate cleanly with the updated success page logic.

claim-db-worker/app/success/page.tsx (1)

11-17: Dynamic console URL construction is safe and matches the new IDs.

Using workspaceId, projectID, and databaseId from the query string to build consoleUrl, with a fallback to the console root when any are missing, is a sensible upgrade. The anchor now always points to either a specific database or a safe default URL.

Also applies to: 32-35

claim-db-worker/app/api/auth/callback/route.ts (1)

188-195: LGTM!

The ID prefix stripping and redirect logic is clean. The sanitization of projectID, workspaceId, and databaseId aligns with the updated redirectToSuccess signature.

create-db/__tests__/create.test.ts (2)

9-24: LGTM!

The runCli helper properly combines stdout and stderr to handle clack's output behavior, uses reject: false with manual exit code checking, and includes appropriate timeouts. This addresses the previous review feedback.


43-63: LGTM!

The JSON flag test properly validates exit code before parsing, checks for valid JSON format, and verifies the expected properties. Good defensive checks with trimmed.match(/^\s*\{/).

create-db/src/regions.ts (2)

4-19: LGTM!

The checkOnline function properly handles network errors with user-friendly colored output and re-throws for upstream handling. The error messaging includes helpful status page link.


35-49: LGTM!

The validateRegion function provides clear error messaging with available options when validation fails. The implementation is clean and straightforward.

claim-db-worker/app/web/connect/page.tsx (1)

26-69: LGTM!

The unified buildSteps array is well-structured and provides clear, sequential instructions for Prisma ORM setup with the adapter pattern. The code example properly demonstrates the @prisma/adapter-pg usage.

create-db/src/database.ts (2)

5-10: LGTM!

The getCommandName function cleanly determines the CLI command context from process.argv[1], with a sensible default fallback to "create-db".


125-135: LGTM!

The successful return object is well-structured with all necessary fields. The analytics emission for successful creation is appropriately placed before the return.

create-db/src/geolocation.ts (3)

24-31: LGTM!

The region coordinates are correctly mapped to their geographic locations.


41-61: LGTM!

The Haversine formula implementation is mathematically correct and properly calculates great-circle distances.
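For reference, the great-circle distance being described can be sketched as a standalone Haversine function. This is an illustrative reconstruction (function name and units are assumptions: kilometres, mean Earth radius 6371 km), not the project's exact implementation:

```typescript
// Great-circle distance between two points given in decimal degrees.
function haversineKm(
  lat1: number,
  lon1: number,
  lat2: number,
  lon2: number
): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

A closest-region search then reduces to taking the minimum of `haversineKm` over the region coordinate table.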


125-173: LGTM!

The function properly validates input, handles both numeric and string coordinate types, and correctly determines the closest region using Haversine distance calculation.

create-db/src/index.ts (7)

1-49: LGTM!

Imports are complete (including isCancel from @clack/prompts), exports provide a clean public API, and dotenv configuration is appropriate.


51-88: LGTM!

The wrapper functions provide a clean abstraction that avoids threading worker URLs through every function call.


159-248: LGTM!

The JSON/env output path correctly:

  • Uses isCancel() for cancellation detection (line 174)
  • Returns after output to prevent fall-through to interactive mode (line 247)
  • Handles env file writing with proper newline management
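The newline management praised above can be sketched as a pure helper (the name `appendEnvLine` is illustrative; the actual code writes through the filesystem):

```typescript
// Appends KEY="value" to existing .env content, inserting a separating
// newline only when the file doesn't already end with one, and always
// leaving a trailing newline.
function appendEnvLine(existing: string, key: string, value: string): string {
  const body =
    existing.length > 0 && !existing.endsWith("\n")
      ? existing + "\n"
      : existing;
  return `${body}${key}="${value}"\n`;
}
```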

250-333: LGTM!

The interactive CLI path correctly:

  • Uses isCancel() for cancellation detection (line 264)
  • Provides clear user feedback with spinner and formatted output
  • Properly handles both success and error cases

335-347: LGTM!

The regions route handler is clean and provides properly formatted output.


350-357: LGTM!

The CLI initialization is properly configured with name, version, and description.


403-405: LGTM!

The programmatic regions() API is clean and straightforward.

Comment on lines +146 to +177
// Fetch project details with user's token to get workspace ID
const projectDetailsRes = await fetch(
  `https://api.prisma.io/v1/projects/${projectID}`,
  {
    headers: {
      Authorization: `Bearer ${tokenData.access_token}`,
      "Content-Type": "application/json",
    },
  }
);
const projectDetails = (await projectDetailsRes.json()) as {
  data?: { workspace?: { id?: string } };
};
const workspaceId = (projectDetails.data?.workspace?.id ?? "").replace(
  /^wksp_/,
  ""
);

// Fetch databases to get database ID
const databasesRes = await fetch(
  `https://api.prisma.io/v1/projects/${projectID}/databases`,
  {
    headers: {
      Authorization: `Bearer ${tokenData.access_token}`,
      "Content-Type": "application/json",
    },
  }
);
const databases = (await databasesRes.json()) as {
  data?: Array<{ id?: string }>;
};
const databaseId = (databases.data?.[0]?.id ?? "").replace(/^db_/, "");

⚠️ Potential issue | 🟠 Major

Missing error handling for project details and databases fetch.

Both projectDetailsRes and databasesRes are used without checking .ok status. If these requests fail, the code will proceed with potentially malformed data or throw on JSON parsing.

       // Fetch project details with user's token to get workspace ID
       const projectDetailsRes = await fetch(
         `https://api.prisma.io/v1/projects/${projectID}`,
         {
           headers: {
             Authorization: `Bearer ${tokenData.access_token}`,
             "Content-Type": "application/json",
           },
         }
       );
+      if (!projectDetailsRes.ok) {
+        console.error(`Failed to fetch project details: ${projectDetailsRes.status}`);
+      }
       const projectDetails = (await projectDetailsRes.json()) as {
         data?: { workspace?: { id?: string } };
       };
       const workspaceId = (projectDetails.data?.workspace?.id ?? "").replace(
         /^wksp_/,
         ""
       );

       // Fetch databases to get database ID
       const databasesRes = await fetch(
         `https://api.prisma.io/v1/projects/${projectID}/databases`,
         {
           headers: {
             Authorization: `Bearer ${tokenData.access_token}`,
             "Content-Type": "application/json",
           },
         }
       );
+      if (!databasesRes.ok) {
+        console.error(`Failed to fetch databases: ${databasesRes.status}`);
+      }
       const databases = (await databasesRes.json()) as {
         data?: Array<{ id?: string }>;
       };

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In claim-db-worker/app/api/auth/callback/route.ts around lines 146 to 177, the
code fetches project details and databases but does not check response.ok before
calling .json(), which can lead to parsing errors or using invalid data; update
both fetches to check response.ok (and optionally response.status), handle
non-2xx responses by logging the error and throwing or returning an appropriate
error response (including the status and text/body for diagnostics), and only
proceed to parse JSON and extract workspaceId/databaseId when the responses are
ok.
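One way to centralise the suggested `.ok` check is a small parsing helper that refuses to parse non-2xx responses. This is a sketch under assumptions (the helper name `parseJsonOrThrow` is hypothetical; it is not code from the route):

```typescript
// Minimal structural type satisfied by fetch's Response.
interface JsonResponseLike {
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}

// Throws on non-2xx responses instead of parsing a potentially
// malformed or error body as JSON.
async function parseJsonOrThrow<T>(
  res: JsonResponseLike,
  label: string
): Promise<T> {
  if (!res.ok) {
    throw new Error(`${label} request failed with status ${res.status}`);
  }
  return (await res.json()) as T;
}
```

Since `fetch`'s `Response` satisfies this shape, the callback could call `parseJsonOrThrow(projectDetailsRes, "project details")` and let the route's surrounding error handling produce the diagnostic response.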

Comment on lines +206 to +226
<button
  className={`flex items-center justify-center w-12 h-12 border border-subtle rounded-md transition-colors ${
    copiedAccel
      ? "text-green-400 border-green-400"
      : "text-muted hover:text-white"
  }`}
  onClick={() =>
    handleCopyConnectionString(
      dbInfo.connectionString,
      setCopiedAccel
    )
  }
  title="Copy accelerate connection string"
  disabled={!connectionString}
>
  {copiedAccel ? (
    <Check className="h-5 w-5" />
  ) : (
    <Zap className="h-5 w-5" />
  )}
</button>

⚠️ Potential issue | 🟡 Minor

Disabled condition uses wrong connection string variable.

Line 219 checks !connectionString (the direct connection string) to disable the accelerate copy button, but the button copies dbInfo.connectionString. These could have different availability states.

           onClick={() =>
             handleCopyConnectionString(
               dbInfo.connectionString,
               setCopiedAccel
             )
           }
           title="Copy accelerate connection string"
-          disabled={!connectionString}
+          disabled={!dbInfo.connectionString}
         >
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 <button
   className={`flex items-center justify-center w-12 h-12 border border-subtle rounded-md transition-colors ${
     copiedAccel
       ? "text-green-400 border-green-400"
       : "text-muted hover:text-white"
   }`}
   onClick={() =>
     handleCopyConnectionString(
       dbInfo.connectionString,
       setCopiedAccel
     )
   }
   title="Copy accelerate connection string"
-  disabled={!connectionString}
+  disabled={!dbInfo.connectionString}
 >
   {copiedAccel ? (
     <Check className="h-5 w-5" />
   ) : (
     <Zap className="h-5 w-5" />
   )}
 </button>
🤖 Prompt for AI Agents
In claim-db-worker/app/web/connect/page.tsx around lines 206 to 226, the
accelerate-copy button's disabled prop checks the wrong variable (it uses
!connectionString) while the button copies dbInfo.connectionString; change the
disabled check to use !dbInfo.connectionString (or !dbInfo?.connectionString if
dbInfo can be undefined) so the button is disabled based on the actual value
being copied, and keep the onClick and title as-is.

Comment on lines +5 to 18
"type": "module",
"exports": {
".": {
"types": "./dist/index.d.mts",
"import": "./dist/index.mjs"
},
"./cli": {
"import": "./dist/cli.mjs"
}
},
"main": "./dist/index.mjs",
"module": "./dist/index.mjs",
"types": "./dist/index.d.mts",
"author": "prisma",

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

cd create-db && git log --oneline -20 -- package.json

Repository: prisma/create-db

Length of output: 116


🏁 Script executed:

cd create-db && cat package.json | grep -A2 '"version"'

Repository: prisma/create-db

Length of output: 268


🏁 Script executed:

cd create-db && git show HEAD~5:package.json 2>/dev/null | head -30 || git show HEAD~10:package.json 2>/dev/null | head -30 || echo "Could not retrieve earlier version"

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

find create-db -name "CHANGELOG*" -o -name "HISTORY*" | xargs cat 2>/dev/null || echo "No changelog found"

Repository: prisma/create-db

Length of output: 420


🏁 Script executed:

cd create-db && git diff HEAD~5..HEAD -- package.json 2>/dev/null | head -100 || echo "Could not diff recent versions"

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && cat package.json | grep -A5 '"bin"'

Repository: prisma/create-db

Length of output: 200


🏁 Script executed:

cd create-db && git log --all --oneline --decorate | head -20

Repository: prisma/create-db

Length of output: 954


🏁 Script executed:

cd create-db && git show 8582fbf:package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A2 || echo "Could not retrieve merge commit package.json"

Repository: prisma/create-db

Length of output: 255


🏁 Script executed:

cd create-db && find . -path ./node_modules -prune -o -name ".changeset" -type d -print

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && ls -la .changeset/ 2>/dev/null | head -20 || echo "No .changeset directory"

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && git show main:package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A2 -B1 || echo "Could not retrieve main branch package.json"

Repository: prisma/create-db

Length of output: 314


🏁 Script executed:

cd create-db && git log --oneline --all -- package.json | head -10

Repository: prisma/create-db

Length of output: 378


🏁 Script executed:

cd create-db && git diff main...8582fbf -- package.json | head -80

Repository: prisma/create-db

Length of output: 96


🏁 Script executed:

cd create-db && git show 082dd39:create-db/package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A10 || echo "Could not retrieve main branch create-db/package.json"

Repository: prisma/create-db

Length of output: 889


🏁 Script executed:

cd create-db && git log --oneline --all -- create-db/package.json | head -10

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && git diff 082dd39 8582fbf -- create-db/package.json

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && git log --all --oneline --decorate -- create-db/CHANGELOG.md | head -5

Repository: prisma/create-db

Length of output: 42


🏁 Script executed:

cd create-db && git show 8582fbf:create-db/CHANGELOG.md 2>/dev/null | head -50

Repository: prisma/create-db

Length of output: 420


Entrypoint restructuring to dist/ is a breaking change—requires major version bump or compatibility layer.

This PR changes the primary entrypoints from index.js to ./dist/index.mjs, removes the direct root-level files, and updates bin entries to point to ./dist/cli.mjs. Consumers importing from the root directory or using direct index.js paths will break. Since this is being released as v1.1.0 (minor bump), either:

  • Bump to v2.0.0 to properly signal the breaking change, or
  • Provide a CJS/ESM compatibility layer (e.g., keep index.js as a thin wrapper, or add a "cjs" export target).
🤖 Prompt for AI Agents
In create-db/package.json around lines 5 to 18, the package export and
entrypoint changes point root imports and bin entries at ./dist/*.mjs which is a
breaking change for existing CJS consumers; either (A) bump the package version
to 2.0.0 to signal the breaking change, or (B) restore backwards compatibility
by adding a CJS/compatibility layer: keep a root-level index.js that
requires/exports the ESM bundle (or add a "cjs" export target in "exports"), and
update bin shims to point to a small CJS wrapper that loads ./dist/cli.mjs so
existing imports and CLI users continue to work; implement one of these fixes
and update package.json version/exports/bin accordingly.
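Option (B) could look roughly like the exports map below. This is a sketch only: the field names follow Node's conditional-exports convention, but the `./dist/index.cjs` bundle is an assumption and would have to be produced by the build (e.g. an additional CJS target in the tsdown config):

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.mts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    },
    "./cli": {
      "import": "./dist/cli.mjs"
    }
  },
  "main": "./dist/index.cjs"
}
```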

Comment on lines +110 to +116
const connectionString =
  directConnDetails && directHost
    ? `postgresql://${directUser}:${directPass}@${directHost}${directPort}/${directDbName}?sslmode=require`
    : null;

const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
const expiryDate = new Date(Date.now() + 24 * 60 * 60 * 1000);
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

⚠️ Potential issue | 🟡 Minor

Potential empty projectId in claim URL.

If both result.data?.id and result.id are undefined (line 91), projectId defaults to an empty string, resulting in a malformed claimUrl like ...?projectID=&utm_source=.... Consider validating projectId before constructing the URL.

+  if (!projectId) {
+    void sendAnalytics(
+      "create_db:database_creation_failed",
+      { region, "error-type": "missing_project_id" },
+      runId,
+      createDbWorkerUrl
+    );
+    return {
+      success: false,
+      error: "missing_project_id",
+      message: "Server response missing project ID.",
+      status: resp.status,
+    };
+  }
+
   const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
const connectionString =
directConnDetails && directHost
? `postgresql://${directUser}:${directPass}@${directHost}${directPort}/${directDbName}?sslmode=require`
: null;
const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
const expiryDate = new Date(Date.now() + 24 * 60 * 60 * 1000);
const connectionString =
directConnDetails && directHost
? `postgresql://${directUser}:${directPass}@${directHost}${directPort}/${directDbName}?sslmode=require`
: null;
if (!projectId) {
void sendAnalytics(
"create_db:database_creation_failed",
{ region, "error-type": "missing_project_id" },
runId,
createDbWorkerUrl
);
return {
success: false,
error: "missing_project_id",
message: "Server response missing project ID.",
status: resp.status,
};
}
const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
const expiryDate = new Date(Date.now() + 24 * 60 * 60 * 1000);
🤖 Prompt for AI Agents
In create-db/src/database.ts around lines 110-116, projectId may be empty which
creates a malformed claimUrl; before building the URL validate that projectId is
a non-empty string (result.data?.id || result.id) and if falsy either throw or
return an error/early-exit so you don't construct the claim URL, otherwise build
the URL using encodeURIComponent for projectId and userAgent/getCommandName to
ensure safe query params; ensure downstream code handles a null/undefined
claimUrl if you choose to skip creating it.
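The prompt's suggestion can be sketched as a small builder that returns `null` instead of emitting a malformed URL, with `URLSearchParams` handling the encoding (the helper name `buildClaimUrl` is hypothetical; the real code inlines the template string):

```typescript
// Returns null when projectId is missing so callers never construct
// a claim URL like "...?projectID=&utm_source=...".
function buildClaimUrl(
  base: string,
  projectId: string | undefined,
  source: string
): string | null {
  if (!projectId) return null;
  const params = new URLSearchParams({
    projectID: projectId,
    utm_source: source,
    utm_medium: "cli",
  });
  return `${base}/claim?${params.toString()}`;
}
```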

Comment on lines +67 to +119
export async function detectUserLocation(): Promise<UserLocation | null> {
  // FOR TESTING: Return test location if configured
  if (TEST_LOCATION !== null) {
    return {
      country: "TEST",
      continent: "TEST",
      city: "Test City",
      region: "Test Region",
      latitude: TEST_LOCATION.latitude,
      longitude: TEST_LOCATION.longitude,
    };
  }

  // Real geolocation via IP API
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 3000);

  try {
    const response = await fetch("https://ipapi.co/json/", {
      method: "GET",
      headers: { "User-Agent": "create-db-cli/1.0" },
      signal: controller.signal,
    });

    if (!response.ok) return null;

    const data = (await response.json()) as GeoLocationResponse;

    // Validate that we have valid coordinates
    if (
      typeof data.latitude !== "number" ||
      typeof data.longitude !== "number" ||
      !Number.isFinite(data.latitude) ||
      !Number.isFinite(data.longitude)
    ) {
      return null;
    }

    return {
      country: data.country_code,
      continent: data.continent_code,
      city: data.city,
      region: data.region,
      latitude: data.latitude,
      longitude: data.longitude,
    };
  } catch {
    // Return null on any error (timeout, network, etc.)
    return null;
  } finally {
    clearTimeout(timeout);
  }
}

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

ipapi.co free API rate limits SLA reliability

💡 Result:

  • Free tier rate limit: 1,000 requests per day (≈30,000/month). (ipapi.co)
  • Notes on free-tier behavior: intended for testing/development, not production; requests may be further restricted by heuristics (no sign‑up, no usage alerts/monitoring on free). (ipapi.co)
  • Support / SLA: free plan has no support and is explicitly “not for production use” — ipapi does not promise an SLA for the free tier; paid plans include support and higher limits. (ipapi.co)
  • Reliability / status reporting: ipapi publishes a status page with incident history and uptime metrics; check it for recent operational incidents. (status page and history available on their site). (ipapi.co)

If you want, I can pull the most recent status/uptime numbers right now.

Citations:


🏁 Script executed:

# Check for any documentation about external dependencies in the project
find . -type f \( -name "README.md" -o -name "*.md" -o -name "package.json" -o -name "requirements.txt" \) -exec grep -l "ipapi\|geolocation\|rate.limit\|dependency" {} \;

Repository: prisma/create-db

Length of output: 100


🏁 Script executed:

# Look for any comments or docs about API alternatives in the geolocation file
cat -n create-db/src/geolocation.ts | head -20

Repository: prisma/create-db

Length of output: 983


🏁 Script executed:

# Check if there's any environment configuration for API selection
rg -i "geolocation|ipapi|api.key|api.url" --type typescript --type json

Repository: prisma/create-db

Length of output: 94


🏁 Script executed:

# Check if there's any discussion of alternatives or paid tiers in the codebase
rg -i "alternative|paid|production|rate.limit" create-db/src/

Repository: prisma/create-db

Length of output: 213


🏁 Script executed:

# Look at the full geolocation.ts file to understand project scope
wc -l create-db/src/geolocation.ts

Repository: prisma/create-db

Length of output: 92


🏁 Script executed:

# Check the main README for project scope
head -50 README.md

Repository: prisma/create-db

Length of output: 2048


Do not use ipapi.co's free tier in production — it is explicitly not recommended for production use.

The code relies on ipapi.co's free API, which is limited to 1,000 requests per day (about 30,000 per month) and which ipapi.co explicitly documents as unsuitable for production. The free tier also has no SLA or support. For a production CLI tool, this cap will be hit quickly as usage scales.

Either upgrade to ipapi.co's paid plan, switch to a production-ready alternative (e.g., MaxMind GeoIP2, IP2Location), or document this as a known limitation. Given the project already handles rate limit errors elsewhere (see database.ts), this dependency warrants the same attention.

🤖 Prompt for AI Agents
In create-db/src/geolocation.ts around lines 67 to 119, the code uses ipapi.co's
free tier which is unsuitable for production; update the implementation so the
CLI does not rely on the free ipapi.co service by default — either (1) add
configuration/env support to use a production-grade provider (e.g., MaxMind
GeoIP2, IP2Location, or a paid ipapi plan) and wire provider selection into
detectUserLocation, (2) implement a clear fallback and rate-limit/error handling
consistent with database.ts (detect and surface rate-limit errors, retry/backoff
or fallback to a secondary provider), or (3) if keeping ipapi as an optional
fallback, explicitly document the limitation and require an opt-in API key/env
var for production usage; ensure logs/errors include provider and error details
and that timeouts/rate-limit responses are handled the same way as other project
network calls.
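One shape for option (2) above is a provider chain that tries each locator in order and treats failures the same way the existing code treats them: swallow and move on. This is an illustrative sketch (type and function names are assumptions, not project code):

```typescript
type Coordinates = { latitude: number; longitude: number };
type Locator = () => Promise<Coordinates | null>;

// Returns the first provider result that yields coordinates; thrown
// errors (rate limits, timeouts) and null results fall through to the
// next provider, and null is returned when every provider fails.
async function firstLocation(
  providers: Locator[]
): Promise<Coordinates | null> {
  for (const provider of providers) {
    try {
      const loc = await provider();
      if (loc) return loc;
    } catch {
      // try the next provider
    }
  }
  return null;
}
```

`detectUserLocation` could then be one entry in the list, with a secondary provider or a static default behind it.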

@github-actions
Copy link

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr67-overhaul-20097603007:

npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67

Worker URLs
• Create-DB Worker: https://create-db-temp.prisma.io
• Claim-DB Worker: https://create-db.prisma.io

These will live as long as this PR exists under tag pr67-overhaul-20097603007.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
create-pg/cli.js (1)

1-2: Add error handling for the dynamic import.

The dynamic import() returns a promise that is not awaited or handled. If the create-db/cli module fails to load or execute, the error may not be properly surfaced to the user.

Consider adding explicit error handling:

 #!/usr/bin/env node
-import("create-db/cli");
+import("create-db/cli").catch((err) => {
+  console.error("Failed to load create-db CLI:", err);
+  process.exit(1);
+});
create-postgres/cli.js (1)

1-2: Add error handling for the dynamic import.

The dynamic import() returns a promise that is not awaited or handled. If the create-db/cli module fails to load or execute, the error may not be properly surfaced to the user.

Consider adding explicit error handling:

 #!/usr/bin/env node
-import("create-db/cli");
+import("create-db/cli").catch((err) => {
+  console.error("Failed to load create-db CLI:", err);
+  process.exit(1);
+});
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8582fbf and dc2e303.

📒 Files selected for processing (6)
  • create-pg/cli.js (1 hunks)
  • create-pg/index.js (1 hunks)
  • create-pg/package.json (2 hunks)
  • create-postgres/cli.js (1 hunks)
  • create-postgres/index.js (1 hunks)
  • create-postgres/package.json (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (7)
create-postgres/index.js (1)

1-1: Clean delegation to create-db programmatic API.

The re-export pattern enables create-postgres to expose the programmatic API from create-db without duplication, maintaining consistency with the create-pg package structure.

create-pg/package.json (2)

32-35: README.md excluded from published package.

The files array no longer includes README.md, which means the README won't be distributed with the npm package. This impacts documentation availability for users who install the package.

Consider adding README.md back to the files array:

  "files": [
    "cli.js",
-    "index.js"
+    "index.js",
+    "README.md"
  ]

Likely an incorrect or invalid review comment.


23-25: This change is appropriate for a CLI-only package and does not introduce breaking changes.

The create-pg package is explicitly designed and documented as a CLI tool (see README, installation via npx create-pg@latest, and all usage examples showing only CLI commands). The removal of the "main" field aligns with this CLI-first positioning. While index.js exists in the published files, it is not documented in the README or mentioned as part of the public API, and no library usage patterns exist in the codebase. Users who consume this package do so exclusively via npx as a CLI tool.

create-postgres/cli.js (1)

2-2: The create-db/cli module is properly configured and exists.

The create-db package has the correct exports configuration in package.json, mapping "./cli" to "./dist/cli.mjs". The corresponding source file exists at create-db/src/cli.ts and is built via the tsdown build system. The import statement is valid.

create-pg/cli.js (1)

2-2: The create-db/cli export is properly configured in the create-db package and the module is valid. No changes required.

create-pg/index.js (1)

1-1: Clean delegation to create-db programmatic API.

The re-export pattern enables create-pg to expose the programmatic API from create-db without duplication. The create-db package exports a complete set of functions (create, regions) and types (CreateDatabaseResult, DatabaseResult, Region, etc.) that are appropriately re-exported through this pattern, providing consumers with a consistent interface across both packages.

create-postgres/package.json (1)

23-25: No action needed — the package is designed correctly with no breaking changes.

The "main" field is not present in the current package.json and was not part of the HEAD revision. The package includes an index.js that re-exports from "create-db" (export * from "create-db"), allowing library consumption via named imports. The removal or absence of "main" is not a breaking change because:

  1. Modern ESM packages (this package has "type": "module") do not require a "main" field
  2. Named exports remain accessible via the included index.js
  3. No imports of create-postgres exist in the codebase
  4. The "bin" field provides the intended CLI entry point

The current structure is intentional (commit: "fix: entry points for both alias clis").

Likely an incorrect or invalid review comment.

Comment on lines 32 to 36
"files": [
"index.js",
"README.md"
"cli.js",
"index.js"
]
}
}

⚠️ Potential issue | 🟡 Minor

README.md excluded from published package.

The files array no longer includes README.md, which means the README won't be distributed with the npm package. This impacts documentation availability for users who install the package.

Consider adding README.md back to the files array:

  "files": [
    "cli.js",
-    "index.js"
+    "index.js",
+    "README.md"
  ]
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"files": [
"index.js",
"README.md"
"cli.js",
"index.js"
]
}
}
"files": [
"cli.js",
"index.js",
"README.md"
]
}
🤖 Prompt for AI Agents
In create-postgres/package.json around lines 32 to 36, the package "files" array
currently only lists "cli.js" and "index.js", so README.md will be excluded from
published packages; to fix, add "README.md" (matching its filename/casing) to
the "files" array in package.json so the README is included when publishing,
then validate by running npm pack or npm publish --dry-run to confirm the file
is present in the package.

@github-actions
Copy link

Preview CLIs & Workers are live!

Test the CLIs locally under tag pr67-overhaul-20097793889:

npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67

Worker URLs
• Create-DB Worker: https://create-db-temp.prisma.io
• Claim-DB Worker: https://create-db.prisma.io

These will live as long as this PR exists under tag pr67-overhaul-20097793889.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (1)
create-postgres/package.json (1)

44-49: README.md still excluded from published package

files only includes cli.js, index.js, and index.d.ts, so the README will not be part of the npm tarball. Consider adding README.md to improve the installed package’s documentation.

🧹 Nitpick comments (1)
create-pg/package.json (1)

44-49: Consider including README.md in the published files

files currently only ships cli.js, index.js, and index.d.ts, so consumers won’t get the README in the installed package. Including it improves offline docs and ecosystem tooling (e.g. package viewers).

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between dc2e303 and a1e549e.

📒 Files selected for processing (4)
  • create-pg/index.d.ts (1 hunks)
  • create-pg/package.json (2 hunks)
  • create-postgres/index.d.ts (1 hunks)
  • create-postgres/package.json (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
create-pg/index.d.ts (1)

1-2: Type re-export from create-db looks correct

Delegating the entire type surface to create-db keeps create-pg’s typings aligned with the core package and matches the runtime re-export pattern.

create-postgres/index.d.ts (1)

1-2: Consistent re-export of typings from create-db

This mirrors the JS re-export behavior and ensures create-postgres exposes the same programmatic API typings as create-db.

create-pg/package.json (1)

23-37: ESM entrypoints and exports map look coherent

type: "module" plus main/module pointing to index.js, types to index.d.ts, an exports map for "." and "./cli", and bin targeting ./cli.js form a consistent surface for both programmatic API and CLI.

create-postgres/package.json (2)

5-5: Author metadata update is appropriate

Setting "author": "prisma" is consistent with the repository ownership.


23-37: Programmatic/CLI entry configuration is consistent

main/module/types plus the exports map and bin: "create-postgres": "./cli.js" give a clear separation between the library entry (index.js / index.d.ts) and the CLI entry (cli.js).
