feat: add programmatic api #67
base: main
Conversation
Walkthrough

The monolithic JS CLI and tests are replaced by a TypeScript-based create-db package: new src modules (CLI, API, types, analytics, database, regions, geolocation, env reader), ESM/dist packaging and build config, migrated tests to TS, CLI bin rewires, workflow updates, and related claim-db-worker adjustments.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Possibly related PRs
Pre-merge checks

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Deploying with

| Status | Name | Latest Commit | Preview URL | Updated (UTC) |
|---|---|---|---|---|
| ✅ Deployment successful! View logs | claim-db-worker | a1e549e | Commit Preview URL / Branch Preview URL | Dec 10 2025, 12:00 PM |
Actionable comments posted: 8
🧹 Nitpick comments (7)
create-db/tsconfig.json (1)
1-15: Minor formatting inconsistency: mixed tabs and spaces.

Lines 3-12 use tabs for indentation while lines 13-14 use spaces. Consider normalizing to consistent indentation throughout.
```diff
 {
   "compilerOptions": {
-    "target": "ES2022",
-    "module": "ESNext",
-    "moduleResolution": "bundler",
-    "esModuleInterop": true,
-    "verbatimModuleSyntax": true,
-    "strict": true,
-    "skipLibCheck": true,
-    "outDir": "dist",
-    "types": ["node"]
-  },
+    "target": "ES2022",
+    "module": "ESNext",
+    "moduleResolution": "bundler",
+    "esModuleInterop": true,
+    "verbatimModuleSyntax": true,
+    "strict": true,
+    "skipLibCheck": true,
+    "outDir": "dist",
+    "types": ["node"]
+  },
   "include": ["src/**/*"],
   "exclude": ["node_modules", "dist", "__tests__"]
 }
```

create-db/tsdown.config.ts (1)
10-12: Apply shebang banner only to the CLI entry.

The `outputOptions.banner` currently applies the shebang (`#!/usr/bin/env node`) to all entry outputs, including `dist/index.mjs`. While Node.js strips the shebang before parsing, it should be applied only to executable CLI files per best practices. Some toolchain components may also struggle with shebangs in library modules.

Use `outputOptions` as a function to conditionally apply the banner only when building `cli.ts`:

```ts
outputOptions: (output) => {
  if (output.fileName?.includes('cli')) {
    output.banner = "#!/usr/bin/env node";
  }
  return output;
}
```

create-db/__tests__/regions.test.ts (1)
4-21: Tests rely on external API without mocking.

These tests make real network requests to the regions API. While this validates end-to-end behavior, it makes tests flaky if the API is unavailable. Consider adding unit tests with mocked fetch for deterministic testing, keeping these as integration tests.
create-db/src/index.ts (4)
229-238: No timeout on database creation fetch request.

Unlike analytics (5s timeout), the `createDatabaseCore` fetch has no timeout. A slow/hanging API response could block indefinitely.

```diff
+  const controller = new AbortController();
+  const timeoutId = setTimeout(() => controller.abort(), 30000);
+
   const resp = await fetch(`${CREATE_DB_WORKER_URL}/create`, {
     method: "POST",
     headers: { "Content-Type": "application/json" },
     body: JSON.stringify({
       region,
       name,
       utm_source: getCommandName(),
       userAgent,
     }),
+    signal: controller.signal,
   });
+
+  clearTimeout(timeoutId);
```
54-83: `pendingAnalytics` array grows unbounded.

Promises are pushed to `pendingAnalytics` but never removed after `flushAnalytics()`. In a long-running process using the programmatic API, this could cause a memory leak.

```diff
 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
   if (pendingAnalytics.length === 0) return;
   await Promise.race([
     Promise.all(pendingAnalytics),
     new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
   ]);
+  pendingAnalytics.length = 0;
 }
```
575-579: Programmatic `create()` doesn't validate region.

Unlike the CLI handler which calls `validateRegion()`, the programmatic API accepts any `RegionId` without checking if that region is currently available.

Consider validating against available regions or documenting that the region must be valid:

```diff
 export async function create(
   options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+  if (options?.region) {
+    await validateRegion(options.region);
+  }
   return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
 }
```
549-556: Hardcoded version string.

Consider reading version from `package.json` to avoid version drift.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (14)
- create-db/__tests__/create.test.js (0 hunks)
- create-db/__tests__/create.test.ts (1 hunks)
- create-db/__tests__/regions.test.js (0 hunks)
- create-db/__tests__/regions.test.ts (1 hunks)
- create-db/__tests__/utils.test.js (0 hunks)
- create-db/__tests__/utils.test.ts (1 hunks)
- create-db/index.js (0 hunks)
- create-db/package.json (2 hunks)
- create-db/src/cli.ts (1 hunks)
- create-db/src/index.ts (1 hunks)
- create-db/src/types.ts (1 hunks)
- create-db/tsconfig.json (1 hunks)
- create-db/tsdown.config.ts (1 hunks)
- create-db/vitest.config.ts (1 hunks)
💤 Files with no reviewable changes (4)
- create-db/__tests__/utils.test.js
- create-db/__tests__/regions.test.js
- create-db/__tests__/create.test.js
- create-db/index.js
🧰 Additional context used
🧬 Code graph analysis (4)
create-db/src/cli.ts (1)
create-db/src/index.ts (1)
- createDbCli (549-556)
create-db/__tests__/utils.test.ts (1)
create-db/src/types.ts (3)
- CreateDatabaseResult (59-59)
- isDatabaseError (61-65)
- isDatabaseSuccess (67-71)
create-db/__tests__/regions.test.ts (2)
create-db/src/index.ts (2)
- regions (592-594)
- RegionSchema (34-34)

create-db/src/types.ts (1)
- RegionSchema (3-10)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
- RegionId (12-12)
- RegionCoordinates (28-31)
- UserLocation (14-21)
- GeoLocationResponse (111-118)
- Region (33-37)
- RegionsResponse (124-124)
- CreateDatabaseResult (59-59)
- ApiResponse (99-104)
- RegionSchema (3-10)
- ProgrammaticCreateOptions (126-129)
🪛 GitHub Actions: Tests
create-db/__tests__/create.test.ts
[error] 24-24: AssertionError: expected '' to contain 'Database created successfully!'
[error] 29-29: SyntaxError: Unexpected end of JSON input
[error] 37-37: AssertionError: expected '' to contain 'Available Prisma Postgres regions'
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (9)
create-db/src/cli.ts (1)
1-3: LGTM!

Clean CLI entrypoint that correctly delegates to the `createDbCli()` API. The `.js` extension in the import is appropriate for ESM TypeScript with `moduleResolution: "bundler"`.

create-db/vitest.config.ts (1)
1-7: LGTM!

Clean Vitest configuration that correctly targets the TypeScript test files under `__tests__/`.

create-db/package.json (2)
5-17: Well-structured ESM package exports.

The exports map correctly defines both the library entry point (`.`) with types and the CLI entry (`./cli`). The setup properly supports modern ESM consumption patterns.
35-48: Build and publish workflow looks solid.

Good use of `prepublishOnly` to ensure the package is built before publishing. The bin entries correctly point to the built CLI artifact.

create-db/__tests__/regions.test.ts (1)
23-35: LGTM!

Good coverage of `RegionSchema` validation including valid region IDs, invalid strings, empty strings, and type coercion rejection.

create-db/__tests__/utils.test.ts (1)
8-35: LGTM!

Type guard tests correctly verify both positive and negative cases with representative payloads matching the `DatabaseResult` and `DatabaseError` interfaces.

create-db/src/types.ts (3)
1-12: LGTM!

Clean Zod schema definition with proper type inference for `RegionId`.
39-71: LGTM!

Well-designed discriminated union with `success` as the discriminator and correctly implemented type guards.
23-26: Remove unused `PartialUserLocation` interface.

This interface is defined but has no usages in the codebase. Consider removing it to keep the types clean.
Actionable comments posted: 1
♻️ Duplicate comments (3)
create-db/src/index.ts (3)
416-420: Use `isCancel()` instead of `=== null` for cancellation check.

As flagged in past reviews, `@clack/prompts`' `select()` returns a cancel symbol when cancelled, not `null`. You must use the `isCancel()` helper to properly detect cancellation.

Apply this diff to fix the cancellation check:

```diff
-  if (selectedRegion === null) {
+  if (isCancel(selectedRegion)) {
     cancel(pc.red("Operation cancelled."));
     await flushAnalytics();
     process.exit(0);
   }
```
446-453: Add `return` after `--env` output to prevent fall-through.

As flagged in past reviews, after outputting the `DATABASE_URL` format (lines 451-452), the code falls through to line 455 which shows the interactive intro. The function must return after the `--env` output.

Apply this diff to add the missing return:

```diff
   console.log(`DATABASE_URL="${result.connectionString}"`);
   console.error(`\n# Claim your database at: ${result.claimUrl}`);
+  return;
 }
```
469-473: Use `isCancel()` instead of `=== null` for cancellation check.

Same issue as line 416: `@clack/prompts`' `select()` returns a cancel symbol, not `null`. Use the `isCancel()` helper.

Apply this diff to fix the cancellation check:

```diff
-  if (selectedRegion === null) {
+  if (isCancel(selectedRegion)) {
     cancel(pc.red("Operation cancelled."));
     await flushAnalytics();
     process.exit(0);
   }
```
🧹 Nitpick comments (1)
create-db/src/index.ts (1)
573-577: Consider validating region in programmatic API.

The `create()` function doesn't validate the region before calling `createDatabaseCore()`. While the API will reject invalid regions, validating upfront would provide better error messages and avoid unnecessary network calls.

Apply this diff to add region validation:

```diff
 export async function create(
   options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+  if (options?.region) {
+    await validateRegion(options.region);
+  }
   return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
 }
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- .github/workflows/tests.yml (1 hunks)
- create-db/__tests__/create.test.ts (1 hunks)
- create-db/src/index.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- create-db/__tests__/create.test.ts
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (2)
.github/workflows/tests.yml (2)
44-46: Build step correctly positioned before test execution.

The new build step is properly placed in the workflow sequence and necessary for the TypeScript-to-ESM compilation required by this PR.
53-54: These environment variables have fallback defaults and are already documented.

The code in `create-db/src/index.ts` defines default values for both `CREATE_DB_WORKER_URL` and `CLAIM_DB_WORKER_URL`:

- `CREATE_DB_WORKER_URL` defaults to `"https://create-db-temp.prisma.io"`
- `CLAIM_DB_WORKER_URL` defaults to `"https://create-db.prisma.io"`

If the GitHub Actions secrets are not configured, these variables will use the hardcoded defaults rather than being empty strings. The README.md already documents these environment variables and their purpose, including example values for local development. The workflow implementation is correct: the secrets allow overriding the defaults for CI if needed, but the tests will run successfully with the default endpoints if secrets are unconfigured.
Likely an incorrect or invalid review comment.
Actionable comments posted: 0
♻️ Duplicate comments (1)
create-db/src/index.ts (1)
404-453: Add `return` after `--env` output to avoid falling through into interactive mode.

When `input.env` is true (and `input.json` is false), the handler prints `DATABASE_URL` and the claim URL but then falls through out of the `if (input.json || input.env)` block into the interactive flow below. That can cause a second database to be created and an unexpected prompt even though the user requested non-interactive `.env` output.

Add a `return` after the `console.error` to terminate the handler in `--env` mode:

```diff
   if (!result.success) {
     console.error(result.message);
     process.exit(1);
   }

   console.log(`DATABASE_URL="${result.connectionString}"`);
   console.error(`\n# Claim your database at: ${result.claimUrl}`);
+  return;
 }
```

Note: when both `--json` and `--env` are supplied, JSON currently wins and `.env` output is skipped because of the earlier `if (input.json) { ... return; }`. If that's not intentional, you may want to enforce mutual exclusivity or emit both in a defined order.
🧹 Nitpick comments (6)
create-db/src/index.ts (6)
54-91: Avoid unbounded growth of `pendingAnalytics` in long-lived processes.

`sendAnalytics` keeps appending promises to `pendingAnalytics`, and `flushAnalytics` never clears the array. In a long-running process (e.g. using the programmatic API repeatedly), this can accumulate resolved promises and re-await them on every flush.

You can snapshot and clear the queue before waiting:

```diff
 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
-  if (pendingAnalytics.length === 0) return;
-  await Promise.race([
-    Promise.all(pendingAnalytics),
-    new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
-  ]);
+  if (pendingAnalytics.length === 0) return;
+
+  const toWait = pendingAnalytics.splice(0, pendingAnalytics.length);
+
+  await Promise.race([
+    Promise.all(toWait),
+    new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
+  ]);
 }
```
36-38: Consider moving the `dotenv` side-effect out of the library entrypoint.

Calling `dotenv.config()` at module load time couples importing `create-db` (for the programmatic API) to reading and mutating `process.env`. That's convenient for the CLI, but can be surprising when the package is used as a library inside other apps.

If you want a cleaner library surface, consider moving `dotenv.config()` into the actual CLI entry (e.g. `cli.ts` / bin script) and letting host applications decide when/how to load env files.
155-174: Avoid re-parsing `.env` now that `dotenv.config()` is already used.

`readUserEnvFile` manually parses `.env`, even though `dotenv.config()` has already populated `process.env`. That duplicates parsing logic and can subtly diverge from dotenv's behavior.

Given you only need a couple of keys, you could simplify to reading from `process.env` (which will already include `.env` content) and drop the custom parser, or at least gate this helper behind a clear reason (e.g. "read only from file, ignore real env").
221-338: Defensive handling for non-2xx responses without `error` payload (optional).

`createDatabaseCore` special-cases 429 and otherwise relies on the worker always returning either valid JSON with an `error` field or a non-JSON body (caught as `invalid_json`). If the API ever returns a non-2xx JSON response without `error` (e.g. a bare `{}` with status 500), this path would be treated as success.

If you want extra robustness, consider also checking `!resp.ok` before returning success, e.g.:

```diff
-  if (result.error) {
+  if (!resp.ok || result.error) {
     // existing error handling...
```

This keeps current behavior but guards against unexpected wire formats.
532-544: Optional: reuse `checkOnline()` for the `regions` command for consistent UX.

The `regions` sub-command currently calls `getRegions()` directly. If the API is unreachable or returns a non-OK response, users may see a thrown error stack rather than the friendly "Cannot reach Prisma Postgres API server" messaging you implemented in `checkOnline()`.

For consistency with `create`, consider:

```diff
 regions: os
   .meta({ description: "List available Prisma Postgres regions" })
   .handler(async (): Promise<void> => {
-    const regions = await getRegions();
+    await checkOnline();
+    const regions = await getRegions();
```
573-592: Programmatic API reuses core logic but never flushes analytics (design choice).

`create()` and `regions()` nicely reuse `createDatabaseCore` and `getRegions`, giving a clean programmatic surface. However, `createDatabaseCore` sends analytics and relies on `flushAnalytics()` being called by the CLI; the programmatic path never flushes, and callers have no way to opt out or flush explicitly.

If this API is intended for embedding in other apps/services, consider one of:

- Disabling analytics for programmatic calls (e.g. flag parameter).
- Exposing a documented `flushAnalytics()` export for hosts that care.
- Injecting an analytics sink so library users can stub/disable it in tests.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- create-db/src/index.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
- RegionId (12-12)
- RegionCoordinates (28-31)
- UserLocation (14-21)
- GeoLocationResponse (111-118)
- Region (33-37)
- RegionsResponse (124-124)
- CreateDatabaseResult (59-59)
- ApiResponse (99-104)
- RegionSchema (3-10)
- ProgrammaticCreateOptions (126-129)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (2)
create-db/src/index.ts (2)
100-153: Location → nearest-region helpers look correct and defensive.

The `detectUserLocation` + `getRegionClosestToLocation` combo handles network failures and malformed/partial coordinates gracefully (null + NaN checks) while doing a proper Haversine distance calculation over the known `REGION_COORDINATES`. This is a solid, side-effect-free helper pair.
176-219: Online check and region fetching/validation are straightforward and consistent.

`checkOnline`, `getRegions`, and `validateRegion` form a clear, reusable layer: health-check first, then fetch, then validate against the live region list with a helpful error message. Error paths exit with a non-zero code and flush analytics, which is appropriate for a CLI.
Actionable comments posted: 2
🧹 Nitpick comments (2)
create-db/src/index.ts (2)
100-121: Add timeout to geolocation fetch to prevent CLI hangs.

Unlike the analytics calls, `detectUserLocation` blocks the main flow. If `ipapi.co` is slow or unresponsive, the CLI will hang indefinitely. Consider adding an `AbortController` with a reasonable timeout (e.g., 3-5 seconds).

```diff
 async function detectUserLocation(): Promise<UserLocation | null> {
   try {
+    const controller = new AbortController();
+    const timer = setTimeout(() => controller.abort(), 3000);
     const response = await fetch("https://ipapi.co/json/", {
       method: "GET",
       headers: { "User-Agent": "create-db-cli/1.0" },
+      signal: controller.signal,
     });
+    clearTimeout(timer);

     if (!response.ok) return null;
```
176-206: Consider adding timeouts to API fetches for resilience.

`checkOnline()` and `getRegions()` lack timeouts. While they have error handling, a hung connection will block indefinitely without feedback to the user. Adding a 10-second timeout would improve resilience.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- create-db/src/index.ts (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (7)
create-db/src/index.ts (7)
1-34: Imports and type exports look good.

The `isCancel` import is now included, addressing the previous review feedback. The type exports provide a clean public API surface.
36-52: Configuration setup is appropriate.

The `quiet: true` option correctly suppresses warnings for missing `.env` files. Region coordinates are appropriately hardcoded for geolocation calculations.
123-153: Haversine distance calculation is correctly implemented.

The function properly handles type coercion and edge cases, returning `null` for invalid coordinates.
155-174: Basic .env parsing is acceptable for this use case.

The simple parser handles the expected `PRISMA_ACTOR_NAME` and `PRISMA_ACTOR_PROJECT` variables. More complex edge cases (multiline values, escaped quotes) aren't handled, but this is fine given the limited scope.
221-338: Database creation logic is well-structured with comprehensive error handling.

The function properly handles rate limiting, JSON parse errors, and API errors. Connection string construction and analytics tracking are correctly implemented.
340-591: Router and CLI handlers are correctly implemented.

The cancellation checks now properly use `isCancel()`, and the control flow for JSON/env output modes includes proper returns. The interactive region selection logic is sound.
610-628: Programmatic API exports are clean and well-documented.

The `create()` and `regions()` functions provide a clean programmatic interface with helpful JSDoc examples.
Actionable comments posted: 0
♻️ Duplicate comments (2)
create-db/src/index.ts (2)
86-92: `pendingAnalytics` array is never cleared after flushing.

After `flushAnalytics()` completes, the array still contains all settled promises, causing unbounded growth on repeated calls.

Apply this diff to clear the array after flushing:

```diff
 async function flushAnalytics(maxWaitMs = 500): Promise<void> {
   if (pendingAnalytics.length === 0) return;
+  const toFlush = [...pendingAnalytics];
+  pendingAnalytics.length = 0;
   await Promise.race([
-    Promise.all(pendingAnalytics),
+    Promise.all(toFlush),
     new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
   ]);
 }
```
594-594: Remove unused `caller` variable.

`caller` is created but never used anywhere in the file.

```diff
-const caller = createRouterClient(router, { context: {} });
```
🧹 Nitpick comments (4)
create-db/src/index.ts (4)
101-122: Consider adding a timeout to the geolocation request.

`detectUserLocation` lacks a timeout, so a slow or unresponsive `ipapi.co` could cause the CLI to hang. Consider using `AbortController` similar to `sendAnalytics`.

```diff
 async function detectUserLocation(): Promise<UserLocation | null> {
   try {
+    const controller = new AbortController();
+    const timer = setTimeout(() => controller.abort(), 3000);
     const response = await fetch("https://ipapi.co/json/", {
       method: "GET",
       headers: { "User-Agent": "create-db-cli/1.0" },
+      signal: controller.signal,
     });
+    clearTimeout(timer);

     if (!response.ok) return null;
```
457-488: Consider handling the case where `envPath` is a directory.

The code appends to `envPath` without checking if it's a valid file path. If a user passes a directory path, `appendFileSync` will fail with a cryptic error.

```diff
 try {
   const targetEnvPath = envPath!;
+  const stat = fs.existsSync(targetEnvPath) ? fs.statSync(targetEnvPath) : null;
+  if (stat?.isDirectory()) {
+    console.error(pc.red(`Cannot write to directory: ${targetEnvPath}. Please specify a file path.`));
+    process.exit(1);
+  }
   const lines = [
```
589-589: Hardcoded version string may drift from `package.json`.

Consider importing the version from `package.json` or using a build-time replacement to keep them in sync.
618-622: Consider validating the region in the programmatic API.

The `create` function doesn't validate the `region` option before calling `createDatabaseCore`. An invalid region will result in an API error from the backend rather than a clear client-side validation error.

```diff
 export async function create(
   options?: ProgrammaticCreateOptions
 ): Promise<CreateDatabaseResult> {
+  const region = options?.region || "us-east-1";
+  if (options?.region) {
+    await validateRegion(region);
+  }
-  return createDatabaseCore(options?.region || "us-east-1", options?.userAgent);
+  return createDatabaseCore(region, options?.userAgent);
 }
```

Alternatively, if you want to allow the backend to handle validation, document this behavior in the JSDoc.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- create-db/src/index.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/src/index.ts (1)
create-db/src/types.ts (10)
- RegionId (12-12)
- RegionCoordinates (28-31)
- UserLocation (14-21)
- GeoLocationResponse (111-118)
- Region (33-37)
- RegionsResponse (124-124)
- CreateDatabaseResult (59-59)
- ApiResponse (99-104)
- RegionSchema (3-10)
- ProgrammaticCreateOptions (126-129)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
create-db/src/index.ts (5)
1-35: LGTM!

Imports and re-exports are well-organized. The `isCancel` import from `@clack/prompts` is correctly included as addressed from previous feedback.
37-53: LGTM!

Configuration setup with sensible defaults and well-defined region coordinates matching the `RegionSchema` enum.
177-207: LGTM!

Online check and region fetching are appropriately implemented. The health check provides clear user feedback on connection issues.
222-339: LGTM!

`createDatabaseCore` handles rate limiting, JSON parsing errors, and API errors robustly with appropriate analytics tracking. Connection string construction properly URL-encodes credentials.
570-583: LGTM!

The regions handler is clean and straightforward. Error propagation to the CLI framework is acceptable.
Actionable comments posted: 9
🧹 Nitpick comments (2)
.github/workflows/publish.yml (2)
65-76: Extract duplicated build logic into a reusable workflow.

The build steps (checking for a build script and running `pnpm run build`) are duplicated between the `build` job and `experimental-release` job. Consider extracting this into a reusable workflow to reduce maintenance burden.

Create `.github/workflows/build-packages.yml` as a reusable workflow, then call it from both jobs:

```yaml
# .github/workflows/build-packages.yml
name: Build Packages
on:
  workflow_call:
jobs:
  build-packages:
    runs-on: ubuntu-latest
    steps:
      - name: 🔨 Build packages
        run: |
          for pkg in create-db create-pg create-postgres; do
            if [ -f "$pkg/package.json" ]; then
              cd "$pkg"
              if grep -q '"build"' package.json; then
                echo "Building $pkg..."
                pnpm run build || echo "Build skipped for $pkg (no build script)"
              fi
              cd - >/dev/null
            fi
          done
```

Then in `publish.yml`, call the reusable workflow after the setup steps in both the `build` and `experimental-release` jobs.

Also applies to: 219-230
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
.github/workflows/preview.yml(0 hunks).github/workflows/publish.yml(1 hunks).github/workflows/release.yml(0 hunks)
💤 Files with no reviewable changes (2)
- .github/workflows/preview.yml
- .github/workflows/release.yml
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/publish.yml
215-215: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details
(expression)
260-260: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: create-db-worker
- GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (5)
.github/workflows/publish.yml (5)
21-22: Workflow permissions set restrictively at the top level.

The default permissions are `contents: read`, which is appropriate. Child jobs explicitly elevate permissions only where needed (release jobs need `contents: write`, `pull-requests: write`, `id-token: write`). This follows the least-privilege principle well.
12-12: Concurrency control prevents simultaneous workflow runs.

The concurrency group is scoped to workflow + ref, which prevents duplicate runs for the same branch/PR. This is appropriate.
21-22: Permission model appropriately follows the least-privilege principle.

Top-level permissions are read-only by default. Jobs that need elevated permissions (`contents: write`, `pull-requests: write`, `id-token: write`) explicitly declare them. This is a security best practice.

Also applies to: 94-97
12-12: Concurrency control prevents simultaneous runs.

Concurrency group scoped to workflow + ref appropriately prevents duplicate executions on the same branch/PR.
154-159: The auto-generation code doesn't execute in normal operation.

The condition `if [ -z "$(ls -A .changeset 2>/dev/null)" ]` checks whether the `.changeset` directory is empty. Since `.changeset/config.json` and `.changeset/README.md` are committed to the repository and always present, the directory is never empty. The defensive code block never runs in normal circumstances.

The architectural concern about auto-generated changesets is theoretically sound, but it doesn't apply to this codebase in its current state. The condition that would trigger the auto-generation (an empty `.changeset` directory) should not occur during normal workflow execution.

Likely an incorrect or invalid review comment.
```yaml
- name: 📝 Ensure Changeset Exists
  run: |
    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
      echo "No changeset found. Creating a default one..."
      pnpm changeset add --empty --message "chore(release): auto-generated changeset"
    fi
```
Changeset fallback creates generic entries that may mask missing release notes.
Creating an empty changeset with auto-generated messages could hide real version bumps without proper documentation of what changed. This should either fail explicitly or require manual intervention to ensure releases are properly documented.
Consider failing explicitly when changesets are missing:
```diff
 - name: 📝 Ensure Changeset Exists
   run: |
-    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-      echo "No changeset found. Creating a default one..."
-      pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-    fi
+    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+      echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+      exit 1
+    fi
```

🤖 Prompt for AI Agents
Actionable comments posted: 0
♻️ Duplicate comments (4)
.github/workflows/publish.yml (4)
85-91: ⚠️ Fix silent publish failure handling to prevent incomplete releases. Line 88 uses `|| echo` to suppress publish failures, allowing the loop to continue even if all retries fail. In a monorepo, a single failed package should block the entire workflow. This was flagged in prior reviews and remains unresolved.

Replace the error suppression with an explicit failure:

```diff
-  pnpm publish --access public --no-git-checks || echo "Publish failed again for $pkg"
+  if ! pnpm publish --access public --no-git-checks; then
+    echo "❌ Publish failed for $pkg after retry"
+    exit 1
+  fi
```
93-99: ⚠️ Replace automatic changeset fallback with explicit failure. Auto-generating an empty changeset can mask missing release notes and hide real version bumps. This was flagged in prior reviews.

Fail explicitly if changesets are missing:

```diff
 - name: 📝 Ensure Changeset Exists
   run: |
-    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-      echo "No changeset found. Creating a default one..."
-      pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-    fi
+    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+      echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+      exit 1
+    fi
```
152-156: 🔴 Fix script injection vulnerability in preview tag creation. Line 154 directly interpolates `github.event.pull_request.head.ref` into a shell script. Although `tr` sanitizes forward slashes, it does not prevent shell metacharacter injection (e.g., `$`, backticks, `;`, `|`). An attacker can craft a malicious branch name to execute arbitrary commands. GitHub's security documentation requires untrusted context values to be passed via environment variables. This was flagged in prior reviews.

Apply this diff:

```diff
 - name: 🔖 Create unique preview tag
+  env:
+    HEAD_REF: ${{ github.event.pull_request.head.ref }}
   run: |
-    SAFE_REF=$(echo "${{ github.event.pull_request.head.ref }}" | tr '/' '-')
+    SAFE_REF=$(echo "${HEAD_REF}" | tr '/' '-')
     echo "PRE_TAG=pr${{ github.event.number }}-${SAFE_REF}-${{ github.run_id }}" >> $GITHUB_ENV
```
191-224: ⚠️ Upgrade `actions/github-script` from v6 to v8. Line 192 uses `actions/github-script@v6`, which is outdated and runs on Node 16. The latest stable version is v8.0.0 (released September 2025), which provides current Node.js runtime support and critical updates. This was flagged in prior reviews.

Update the action reference:

```diff
-  uses: actions/github-script@v6
+  uses: actions/github-script@v8
```
🧹 Nitpick comments (1)
.github/workflows/publish.yml (1)
166-171: Clean up redundant environment variable reassignments. Lines 166–171 reassign `CREATE_DB_WORKER_URL` and `CLAIM_DB_WORKER_URL` after they are already declared in the step's `env` block (lines 160–163). This is redundant; remove the inline reassignments.

```diff
   run: |
-    # Resolve URLs with fallback
-    CREATE_DB_WORKER_URL="${{ secrets.CREATE_DB_WORKER_URL }}"
-    CLAIM_DB_WORKER_URL="${{ secrets.CLAIM_DB_WORKER_URL }}"
-
     # Persist for next steps
     echo "CREATE_DB_WORKER_URL=$CREATE_DB_WORKER_URL" >> $GITHUB_ENV
     echo "CLAIM_DB_WORKER_URL=$CLAIM_DB_WORKER_URL" >> $GITHUB_ENV
     echo "Using CREATE_DB_WORKER_URL=$CREATE_DB_WORKER_URL"
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
`.github/workflows/publish.yml` (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/publish.yml
153-153: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details
(expression)
192-192: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
.github/workflows/publish.yml (5)
1-27: ✅ Workflow triggers and environment setup look good. Concurrency, event triggers, and secret/workspace env variables are properly configured.

29-64: ✅ Release job setup is correct. Node/pnpm configuration and permissions are appropriate.
100-108: ✅ Changesets PR preparation is correct.
110-145: ✅ Preview job setup is correct.
146-151: ✅ README copying logic is straightforward.
Actionable comments posted: 1
♻️ Duplicate comments (3)
.github/workflows/release.yml (3)
93-98: Changeset fallback creates generic entries that may mask missing release notes. Creating an empty changeset with auto-generated messages could hide real version bumps without proper documentation of what changed. This should either fail explicitly or require manual intervention to ensure releases are properly documented.

Consider failing explicitly when changesets are missing:

```diff
 - name: 📝 Ensure Changeset Exists
   run: |
-    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
-      echo "No changeset found. Creating a default one..."
-      pnpm changeset add --empty --message "chore(release): auto-generated changeset"
-    fi
+    if [ -z "$(ls -A .changeset 2>/dev/null)" ]; then
+      echo "❌ No changeset found. Please run 'pnpm changeset' and commit your changes."
+      exit 1
+    fi
```
85-89: Hard-fail on publish failures to prevent silent deployment failures. Line 88 silently suppresses the second publish retry with `|| echo "..."`, allowing the workflow to continue and report success even if all packages fail to publish. This is particularly dangerous in a multi-package loop: failed releases are masked entirely.

Remove the error suppression:

```diff
 # First try to publish
 if ! pnpm publish --access public --no-git-checks; then
   echo "Publish failed, trying to bump version and retry..."
   npm version patch --no-git-tag-version
-  pnpm publish --access public --no-git-checks || echo "Publish failed again for $pkg"
+  pnpm publish --access public --no-git-checks
 fi
```

Alternatively, track failed packages and exit with a non-zero code if any publish fails.
191-192: Upgrade `actions/github-script` from v6 to v8. Version 6 is outdated. The latest version is v8.0.0 and should be used instead.

```diff
-  uses: actions/github-script@v6
+  uses: actions/github-script@v8
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
`.github/workflows/release.yml` (1 hunks)
🧰 Additional context used
🪛 actionlint (1.7.9)
.github/workflows/release.yml
153-153: "github.event.pull_request.head.ref" is potentially untrusted. avoid using it directly in inline scripts. instead, pass it through an environment variable. see https://docs.github.com/en/actions/reference/security/secure-use#good-practices-for-mitigating-script-injection-attacks for more details
(expression)
192-192: the runner of "actions/github-script@v6" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
✅ Preview CLIs & Workers are live! Test the CLIs locally under tag `pr67`:

```
npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67
```

Worker URLs
feats and fixes: changes to structure, console log cleanups, geolocation detection, console link, and prisma 7 changes
Actionable comments posted: 5
♻️ Duplicate comments (1)
create-db/src/index.ts (1)
359-359: Remove unused `caller` variable. This variable is created but never referenced. Per past review comments, this is dead code.

Apply this diff:

```diff
-const caller = createRouterClient(router, { context: {} });
-
```
🧹 Nitpick comments (14)
create-db/src/env-utils.ts (1)
1-23: Basic .env parsing looks correct; consider aligning with `dotenv` semantics. This implementation handles the common `KEY=VALUE` cases and ignores comments/blank lines, which is fine for simple use. Two optional improvements:

- Handle Windows line endings by splitting on `/\r?\n/` instead of `"\n"`.
- If you need full `.env` parity (exports, escaping, multi-line quirks), consider delegating to `dotenv.parse(fs.readFileSync(envPath))` so CLI and other tooling stay behavior-compatible.

claim-db-worker/lib/project-transfer.ts (1)
7-12: Return-shape extension is reasonable; tighten `transferResponse` typing. The new parsing logic and inclusion of `transferResponse` (JSON or `{ rawResponse }`) on success, and `null` on failure, make the API more informative without changing control flow. To keep type safety:

- Prefer `transferResponse: unknown` (or a dedicated `TransferResponse` type/union) instead of `any`, so call sites must intentionally narrow before use.
- If consumers care about the HTTP status on success as well, consider adding `status` in the success object for symmetry.

Also applies to: 31-52
create-db/src/analytics.ts (1)
1-39: Non-blocking analytics are good; clear `pendingAnalytics` and document property key style. The overall pattern (fire-and-forget with a short abort and optional flush) is solid, but there are two follow-ups worth addressing:

- Clear `pendingAnalytics` after a flush to avoid growth over time. Right now, resolved promises stay in the array forever. For a long-lived programmatic API this can leak memory and make subsequent flushes keep waiting on already-flushed events. You can snapshot and clear:

```diff
 export async function flushAnalytics(maxWaitMs = 500): Promise<void> {
   if (pendingAnalytics.length === 0) return;
+
+  const toFlush = pendingAnalytics.splice(0, pendingAnalytics.length);
+
   await Promise.race([
-    Promise.all(pendingAnalytics),
+    Promise.all(toFlush),
     new Promise<void>((resolve) => setTimeout(resolve, maxWaitMs)),
   ]);
 }
```

- Clarify analytics property key conventions. `properties` is a generic bag, and you always inject `distinct_id`. To keep consistency with prior analytics in this repo, it's helpful to document (e.g. in a JSDoc on `sendAnalytics`) that custom property keys passed via `properties` should follow the existing kebab-case convention (e.g. `"user-agent"`, `"error-type"`), even though `distinct_id` itself is a special-case key. Based on learnings, this avoids duplicate attributes downstream.

create-db/vitest.config.ts (1)
1-7: Vitest config matches the new TypeScript test layout; consider future patterns. The config will pick up all `.test.ts` files under `__tests__`, which aligns with the TS tests added in this PR. If you later add React/component tests (`.test.tsx`) or place tests alongside source files, you may want to expand `include` accordingly and re-run the suite to confirm discovery.

claim-db-worker/app/api/auth/callback/route.ts (1)
117-138: `projectData` is stored but never used. The variable `projectData` from `validateProject(projectID)` is captured but not referenced anywhere in the success path. Either remove the variable assignment or use `projectData` instead of making another fetch to get project details on lines 147-155.

```diff
-  // Validate project exists and get project data
-  let projectData;
+  // Validate project exists
   try {
-    projectData = await validateProject(projectID);
+    await validateProject(projectID);
   } catch (error) {
```

create-db/__tests__/create.test.ts (2)
27-41: Minor redundancy in output concatenation. Line 36 re-concatenates `result.stdout + result.stderr` when `result.all` is already computed in the helper (line 22). Consider using just `result.all`:

```diff
 expect(result.exitCode).toBe(0);
 // clack outputs to stderr, so check all output
-const allOutput = result.all || result.stdout + result.stderr;
+const allOutput = result.all;
 if (!allOutput.includes("Database created successfully!")) {
```
65-79: Same redundancy as default command test. Line 74 has the same redundant fallback. Otherwise, the test properly validates the regions command output.

```diff
 expect(result.exitCode).toBe(0);
 // clack outputs to stderr
-const allOutput = result.all || result.stdout + result.stderr;
+const allOutput = result.all;
```

create-db/src/regions.ts (1)
21-33: Consider adding a timeout to the fetch call. The `fetch` call has no timeout, which could cause the CLI to hang indefinitely if the server is unresponsive. Consider using `AbortController` with a timeout, similar to the pattern in `sendAnalytics`:

```diff
 export async function getRegions(workerUrl: string): Promise<Region[]> {
-  const res = await fetch(`${workerUrl}/regions`);
+  const controller = new AbortController();
+  const timer = setTimeout(() => controller.abort(), 10000);
+
+  try {
+    const res = await fetch(`${workerUrl}/regions`, {
+      signal: controller.signal,
+    });

-  if (!res.ok) {
-    throw new Error(
-      `Failed to fetch regions. Status: ${res.status} ${res.statusText}`
-    );
-  }
+    if (!res.ok) {
+      throw new Error(
+        `Failed to fetch regions. Status: ${res.status} ${res.statusText}`
+      );
+    }

-  const data = (await res.json()) as RegionsResponse;
-  const regions: Region[] = Array.isArray(data) ? data : (data.data ?? []);
-  return regions.filter((region) => region.status === "available");
+    const data = (await res.json()) as RegionsResponse;
+    const regions: Region[] = Array.isArray(data) ? data : (data.data ?? []);
+    return regions.filter((region) => region.status === "available");
+  } finally {
+    clearTimeout(timer);
+  }
 }
```
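The AbortController pattern above generalizes into a small reusable helper. A minimal sketch (the `fetchWithTimeout` name and the 10-second default are illustrative, not part of the PR; requires Node 18+ for the global `fetch`):

```typescript
// Sketch: a fetch wrapper that aborts after a timeout so a hung server
// cannot stall the CLI indefinitely. Node 18+ (global fetch/AbortController).
export async function fetchWithTimeout(
  url: string,
  timeoutMs = 10_000,
  init?: Parameters<typeof fetch>[1]
) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await fetch(url, { ...init, signal: controller.signal });
  } finally {
    // Always clear the timer so the process can exit promptly.
    clearTimeout(timer);
  }
}
```

Both `getRegions` and the `/create` request discussed below could share such a helper instead of repeating the controller/timer boilerplate.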
claim-db-worker/app/web/connect/page.tsx (2)

86-95: Password masking replaces entire string including protocol prefix. The regex `/./g` matches every character, masking the entire connection string including `postgresql://`. Consider preserving the protocol prefix for clarity:

```diff
 let processedConnectionString = connectionString;
 if (!showPassword) {
-  processedConnectionString = connectionString.replace(/./g, "•");
+  // Preserve protocol prefix for clarity
+  const match = connectionString.match(/^(postgresql:\/\/)/);
+  const prefix = match ? match[1] : "";
+  processedConnectionString =
+    prefix + connectionString.slice(prefix.length).replace(/./g, "•");
 }
```
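The prefix-preserving masking can also be factored into a pure function, which is easier to unit-test than inline JSX logic. A sketch (the `maskConnectionString` helper is hypothetical, not in the PR):

```typescript
// Sketch: mask a connection string while keeping the protocol scheme
// (e.g. "postgresql://") readable. Pure function, trivially testable.
export function maskConnectionString(connectionString: string): string {
  // Match a URI scheme followed by "://"; fall back to masking everything.
  const match = connectionString.match(/^([a-z][a-z0-9+.-]*:\/\/)/i);
  const prefix = match ? match[1] : "";
  const masked = "•".repeat(connectionString.length - prefix.length);
  return prefix + masked;
}
```

The component could then call this helper when `showPassword` is false.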
140-154: Remove commented-out code. The connection type toggle is fully commented out. If this functionality is no longer needed, remove it entirely to keep the codebase clean.

create-db/src/database.ts (2)
create-db/src/database.ts (2)
22-31: Consider adding a timeout to the fetch call. The main database creation request has no timeout, which could cause the CLI to hang if the server is unresponsive. This is especially important for a CLI tool, where users expect responsive behavior.

```diff
+  const controller = new AbortController();
+  const timer = setTimeout(() => controller.abort(), 30000);
+
   const resp = await fetch(`${createDbWorkerUrl}/create`, {
     method: "POST",
     headers: { "Content-Type": "application/json" },
     body: JSON.stringify({
       region,
       name,
       utm_source: getCommandName(),
       userAgent,
     }),
+    signal: controller.signal,
   });
+  clearTimeout(timer);
```

Note: You'll also need to handle `AbortError` in a try-catch and return an appropriate timeout error response.
90-96: Complex conditional for connection details extraction. The logic for extracting `directConnDetails` from two different response shapes is correct but could benefit from a brief comment explaining the two API response formats being handled:

```diff
+  // Handle both wrapped (data.database) and unwrapped (databases[]) API response shapes
   const apiKeys = database?.apiKeys;
   const directConnDetails = result.data
     ? apiKeys?.[0]?.directConnection
     : result.databases?.[0]?.apiKeys?.[0]?.ppgDirectConnection;
```

create-db/src/geolocation.ts (1)
11-22: Consider using environment variables for test location override. The current pattern requires modifying source code to enable test mode. Consider reading from `process.env.TEST_LOCATION` to allow runtime configuration without code changes. Example refactor:

```typescript
const TEST_LOCATION: { latitude: number; longitude: number } | null =
  process.env.TEST_LOCATION &&
  TEST_LOCATIONS[process.env.TEST_LOCATION as keyof typeof TEST_LOCATIONS]
    ? TEST_LOCATIONS[process.env.TEST_LOCATION as keyof typeof TEST_LOCATIONS]
    : null;
```
383-390: Consider validating region parameter in programmatic API.The
create()function doesn't validate theregionparameter before calling the API. Adding validation would provide faster, more helpful error messages for invalid regions.Apply this diff to add validation:
export async function create( options?: ProgrammaticCreateOptions ): Promise<CreateDatabaseResult> { + if (options?.region) { + await validateRegionWithUrl(options.region); + } return createDatabaseCoreWithUrl( options?.region || "us-east-1", options?.userAgent ); }Alternatively, if you prefer to keep the programmatic API simple and not throw errors, you could document that invalid regions will be caught by the API layer.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (21)
- `.vscode/settings.json` (0 hunks)
- `claim-db-worker/app/api/auth/callback/route.ts` (2 hunks)
- `claim-db-worker/app/success/page.tsx` (2 hunks)
- `claim-db-worker/app/web/connect/page.tsx` (6 hunks)
- `claim-db-worker/lib/prismaSchemaEditor/defaultSchema.ts` (0 hunks)
- `claim-db-worker/lib/project-transfer.ts` (2 hunks)
- `claim-db-worker/lib/response-utils.ts` (1 hunks)
- `claim-db-worker/package.json` (1 hunks)
- `create-db-worker/src/index.ts` (1 hunks)
- `create-db/__tests__/create.test.ts` (1 hunks)
- `create-db/package.json` (2 hunks)
- `create-db/src/analytics.ts` (1 hunks)
- `create-db/src/cli.ts` (1 hunks)
- `create-db/src/database.ts` (1 hunks)
- `create-db/src/env-utils.ts` (1 hunks)
- `create-db/src/geolocation.ts` (1 hunks)
- `create-db/src/index.ts` (1 hunks)
- `create-db/src/regions.ts` (1 hunks)
- `create-db/src/types.ts` (1 hunks)
- `create-db/vitest.config.ts` (1 hunks)
- `schema-api-routes/src/routes/schema/pull.ts` (0 hunks)
💤 Files with no reviewable changes (3)
- claim-db-worker/lib/prismaSchemaEditor/defaultSchema.ts
- .vscode/settings.json
- schema-api-routes/src/routes/schema/pull.ts
✅ Files skipped from review due to trivial changes (1)
- create-db-worker/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- create-db/src/cli.ts
- create-db/src/types.ts
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-08-27T16:39:21.271Z
Learnt from: aidankmcalister
Repo: prisma/create-db PR: 48
File: create-db/index.js:423-431
Timestamp: 2025-08-27T16:39:21.271Z
Learning: In the prisma/create-db project, analytics property keys should use the existing kebab-case convention (e.g., "user-agent", "error-type", "status-code") rather than snake_case, to maintain consistency with existing analytics data and avoid duplicate attributes.
Applied to files:
create-db/src/analytics.ts
🧬 Code graph analysis (5)
create-db/src/index.ts (6)
- create-db/src/analytics.ts (2): `sendAnalytics` (3-31), `flushAnalytics` (33-39)
- create-db/src/regions.ts (3): `checkOnline` (4-19), `getRegions` (21-33), `validateRegion` (35-49)
- create-db/src/database.ts (2): `createDatabaseCore` (12-135), `getCommandName` (5-10)
- create-db/src/types.ts (5): `RegionSchema` (3-10), `RegionId` (12-12), `ProgrammaticCreateOptions` (126-129), `CreateDatabaseResult` (59-59), `Region` (33-37)
- create-db/src/env-utils.ts (1): `readUserEnvFile` (4-23)
- create-db/src/geolocation.ts (2): `detectUserLocation` (67-119), `getRegionClosestToLocation` (125-173)
create-db/src/regions.ts (2)

- create-db-worker/src/index.ts (1): `fetch` (15-195)
- create-db/src/types.ts (2): `Region` (33-37), `RegionsResponse` (124-124)

create-db/src/database.ts (2)

- create-db/src/types.ts (2): `CreateDatabaseResult` (59-59), `ApiResponse` (99-104)
- create-db/src/analytics.ts (1): `sendAnalytics` (3-31)

create-db/src/geolocation.ts (1)

- create-db/src/types.ts (3): `RegionCoordinates` (28-31), `UserLocation` (14-21), `GeoLocationResponse` (111-118)

claim-db-worker/app/api/auth/callback/route.ts (2)

- claim-db-worker/lib/auth-utils.ts (1): `validateProject` (35-57)
- claim-db-worker/lib/response-utils.ts (1): `redirectToSuccess` (27-42)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (21)
claim-db-worker/package.json (1)
29-31: Validate Prisma and React upgrades against the app runtime. Bumping Prisma to 7.x and React/React DOM to 19.1.2 is a non-trivial change; the manifest side looks fine, but please double-check migration notes and run the app/tests to catch any subtle breakages (especially Prisma schema/Client and React 19 behavior changes).
claim-db-worker/lib/response-utils.ts (1)
27-41: Success redirect now carries workspace and database IDs; looks good. The additional `workspaceId` and `databaseId` parameters are correctly threaded into the query string via `URLSearchParams`, and `getBaseUrl` keeps redirects host-relative. This should integrate cleanly with the updated success page logic.

claim-db-worker/app/success/page.tsx (2)
11-17: Dynamic console URL construction is safe and matches the new IDs. Using `workspaceId`, `projectID`, and `databaseId` from the query string to build `consoleUrl`, with a fallback to the console root when any are missing, is a sensible upgrade. The anchor now always points to either a specific database or a safe default URL.

Also applies to: 32-35
claim-db-worker/app/api/auth/callback/route.ts (1)
188-195: LGTM! The ID prefix stripping and redirect logic is clean. The sanitization of `projectID`, `workspaceId`, and `databaseId` aligns with the updated `redirectToSuccess` signature.

create-db/__tests__/create.test.ts (2)
9-24: LGTM! The `runCli` helper properly combines stdout and stderr to handle clack's output behavior, uses `reject: false` with manual exit code checking, and includes appropriate timeouts. This addresses the previous review feedback.
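The idea behind such a helper can be sketched dependency-free with `node:child_process` instead of execa (the `runCommand` name is illustrative; the PR's actual helper may differ):

```typescript
import { spawnSync } from "node:child_process";

// Sketch: run a CLI without throwing on non-zero exit, and expose the
// combined output so assertions don't care whether the tool wrote to
// stdout or stderr (clack-based CLIs write prompts to stderr).
export function runCommand(cmd: string, args: string[], timeoutMs = 60_000) {
  const result = spawnSync(cmd, args, {
    encoding: "utf8",
    timeout: timeoutMs,
  });
  return {
    exitCode: result.status ?? -1,
    stdout: result.stdout ?? "",
    stderr: result.stderr ?? "",
    // Combined stream, the equivalent of execa's `all: true` option.
    all: (result.stdout ?? "") + (result.stderr ?? ""),
  };
}
```

Checking `exitCode` manually (rather than letting the runner throw) keeps failure output visible in test assertions.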
43-63: LGTM! The JSON flag test properly validates exit code before parsing, checks for valid JSON format, and verifies the expected properties. Good defensive checks with `trimmed.match(/^\s*\{/)`.

create-db/src/regions.ts (2)
4-19: LGTM! The `checkOnline` function properly handles network errors with user-friendly colored output and re-throws for upstream handling. The error messaging includes a helpful status page link.

35-49: LGTM! The `validateRegion` function provides clear error messaging with available options when validation fails. The implementation is clean and straightforward.

claim-db-worker/app/web/connect/page.tsx (1)
26-69: LGTM! The unified `buildSteps` array is well-structured and provides clear, sequential instructions for Prisma ORM setup with the adapter pattern. The code example properly demonstrates the `@prisma/adapter-pg` usage.

create-db/src/database.ts (2)
5-10: LGTM! The `getCommandName` function cleanly determines the CLI command context from `process.argv[1]`, with a sensible default fallback to `"create-db"`.
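A sketch of that approach, assuming the behavior described above (basename of `argv[1]` with a `"create-db"` fallback; the PR's exact matching logic may differ). The bin names `create-db`, `create-pg`, and `create-postgres` come from the preview comment earlier in the thread:

```typescript
import { basename } from "node:path";

// Sketch: derive the invoked command name from process.argv[1],
// falling back to "create-db" when nothing recognizable is found.
export function commandNameFrom(argv1: string | undefined): string {
  if (!argv1) return "create-db";
  const name = basename(argv1).replace(/\.(m?js|ts)$/, "");
  const known = ["create-db", "create-pg", "create-postgres"];
  return known.includes(name) ? name : "create-db";
}
```

This keeps the `utm_source` value sent to the worker consistent regardless of which bin alias invoked the CLI.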
125-135: LGTM! The successful return object is well-structured with all necessary fields. The analytics emission for successful creation is appropriately placed before the return.
create-db/src/geolocation.ts (3)
24-31: LGTM! The region coordinates are correctly mapped to their geographic locations.

41-61: LGTM! The Haversine formula implementation is mathematically correct and properly calculates great-circle distances.
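For reference, the standard Haversine formula the comment refers to looks like this (a generic implementation, not copied from the PR):

```typescript
// Great-circle distance between two (lat, lon) points via the
// Haversine formula. Returns kilometres using the mean Earth radius.
export function haversineKm(
  lat1: number,
  lon1: number,
  lat2: number,
  lon2: number
): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

Picking the region with the smallest such distance to the user's detected location is what `getRegionClosestToLocation` does.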
125-173: LGTM! The function properly validates input, handles both numeric and string coordinate types, and correctly determines the closest region using Haversine distance calculation.
create-db/src/index.ts (7)
1-49: LGTM! Imports are complete (including `isCancel` from `@clack/prompts`), exports provide a clean public API, and dotenv configuration is appropriate.
51-88: LGTM! The wrapper functions provide a clean abstraction that avoids threading worker URLs through every function call.
159-248: LGTM! The JSON/env output path correctly:

- Uses `isCancel()` for cancellation detection (line 174)
- Returns after output to prevent fall-through to interactive mode (line 247)
- Handles env file writing with proper newline management

250-333: LGTM! The interactive CLI path correctly:

- Uses `isCancel()` for cancellation detection (line 264)
- Provides clear user feedback with spinner and formatted output
- Properly handles both success and error cases
335-347: LGTM! The regions route handler is clean and provides properly formatted output.
350-357: LGTM! The CLI initialization is properly configured with name, version, and description.
403-405: LGTM! The programmatic `regions()` API is clean and straightforward.
```typescript
// Fetch project details with user's token to get workspace ID
const projectDetailsRes = await fetch(
  `https://api.prisma.io/v1/projects/${projectID}`,
  {
    headers: {
      Authorization: `Bearer ${tokenData.access_token}`,
      "Content-Type": "application/json",
    },
  }
);
const projectDetails = (await projectDetailsRes.json()) as {
  data?: { workspace?: { id?: string } };
};
const workspaceId = (projectDetails.data?.workspace?.id ?? "").replace(
  /^wksp_/,
  ""
);

// Fetch databases to get database ID
const databasesRes = await fetch(
  `https://api.prisma.io/v1/projects/${projectID}/databases`,
  {
    headers: {
      Authorization: `Bearer ${tokenData.access_token}`,
      "Content-Type": "application/json",
    },
  }
);
const databases = (await databasesRes.json()) as {
  data?: Array<{ id?: string }>;
};
const databaseId = (databases.data?.[0]?.id ?? "").replace(/^db_/, "");
```
Missing error handling for project details and databases fetch.
Both `projectDetailsRes` and `databasesRes` are used without checking `.ok` status. If these requests fail, the code will proceed with potentially malformed data or throw on JSON parsing.
```diff
 // Fetch project details with user's token to get workspace ID
 const projectDetailsRes = await fetch(
   `https://api.prisma.io/v1/projects/${projectID}`,
   {
     headers: {
       Authorization: `Bearer ${tokenData.access_token}`,
       "Content-Type": "application/json",
     },
   }
 );
+if (!projectDetailsRes.ok) {
+  console.error(`Failed to fetch project details: ${projectDetailsRes.status}`);
+}
 const projectDetails = (await projectDetailsRes.json()) as {
   data?: { workspace?: { id?: string } };
 };
 const workspaceId = (projectDetails.data?.workspace?.id ?? "").replace(
   /^wksp_/,
   ""
 );

 // Fetch databases to get database ID
 const databasesRes = await fetch(
   `https://api.prisma.io/v1/projects/${projectID}/databases`,
   {
     headers: {
       Authorization: `Bearer ${tokenData.access_token}`,
       "Content-Type": "application/json",
     },
   }
 );
+if (!databasesRes.ok) {
+  console.error(`Failed to fetch databases: ${databasesRes.status}`);
+}
 const databases = (await databasesRes.json()) as {
   data?: Array<{ id?: string }>;
 };
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In claim-db-worker/app/api/auth/callback/route.ts around lines 146 to 177, the
code fetches project details and databases but does not check response.ok before
calling .json(), which can lead to parsing errors or using invalid data; update
both fetches to check response.ok (and optionally response.status), handle
non-2xx responses by logging the error and throwing or returning an appropriate
error response (including the status and text/body for diagnostics), and only
proceed to parse JSON and extract workspaceId/databaseId when the responses are
ok.
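The `.ok`-before-`.json()` pattern recommended above can be wrapped once and reused for both requests. A minimal sketch (the `fetchJsonOrThrow` helper is hypothetical, not part of the PR; requires Node 18+ for the global `fetch`):

```typescript
// Sketch: fetch JSON and fail loudly on non-2xx responses, so callers
// never parse or use a body from a failed request.
export async function fetchJsonOrThrow<T>(
  url: string,
  init?: Parameters<typeof fetch>[1]
): Promise<T> {
  const res = await fetch(url, init);
  if (!res.ok) {
    // Include status and body text for diagnostics, as the review suggests.
    const body = await res.text().catch(() => "");
    throw new Error(`Request to ${url} failed: ${res.status} ${body}`);
  }
  return (await res.json()) as T;
}
```

The callback route could then wrap both the project-details and databases requests in a single try-catch and return one well-formed error response.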
```tsx
<button
  className={`flex items-center justify-center w-12 h-12 border border-subtle rounded-md transition-colors ${
    copiedAccel
      ? "text-green-400 border-green-400"
      : "text-muted hover:text-white"
  }`}
  onClick={() =>
    handleCopyConnectionString(
      dbInfo.connectionString,
      setCopiedAccel
    )
  }
  title="Copy accelerate connection string"
  disabled={!connectionString}
>
  {copiedAccel ? (
    <Check className="h-5 w-5" />
  ) : (
    <Zap className="h-5 w-5" />
  )}
</button>
```
Disabled condition uses wrong connection string variable.
Line 219 checks `!connectionString` (the direct connection string) to disable the accelerate copy button, but the button copies `dbInfo.connectionString`. These could have different availability states.
```diff
   onClick={() =>
     handleCopyConnectionString(
       dbInfo.connectionString,
       setCopiedAccel
     )
   }
   title="Copy accelerate connection string"
-  disabled={!connectionString}
+  disabled={!dbInfo.connectionString}
 >
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```tsx
<button
  className={`flex items-center justify-center w-12 h-12 border border-subtle rounded-md transition-colors ${
    copiedAccel
      ? "text-green-400 border-green-400"
      : "text-muted hover:text-white"
  }`}
  onClick={() =>
    handleCopyConnectionString(
      dbInfo.connectionString,
      setCopiedAccel
    )
  }
  title="Copy accelerate connection string"
  disabled={!dbInfo.connectionString}
>
  {copiedAccel ? (
    <Check className="h-5 w-5" />
  ) : (
    <Zap className="h-5 w-5" />
  )}
</button>
```
```json
"type": "module",
"exports": {
  ".": {
    "types": "./dist/index.d.mts",
    "import": "./dist/index.mjs"
  },
  "./cli": {
    "import": "./dist/cli.mjs"
  }
},
"main": "./dist/index.mjs",
"module": "./dist/index.mjs",
"types": "./dist/index.d.mts",
"author": "prisma",
```
🧩 Analysis chain
🏁 Scripts executed (repository: prisma/create-db):

```shell
cd create-db && git log --oneline -20 -- package.json
cd create-db && cat package.json | grep -A2 '"version"'
cd create-db && git show HEAD~5:package.json 2>/dev/null | head -30 || git show HEAD~10:package.json 2>/dev/null | head -30 || echo "Could not retrieve earlier version"
find create-db -name "CHANGELOG*" -o -name "HISTORY*" | xargs cat 2>/dev/null || echo "No changelog found"
cd create-db && git diff HEAD~5..HEAD -- package.json 2>/dev/null | head -100 || echo "Could not diff recent versions"
cd create-db && cat package.json | grep -A5 '"bin"'
cd create-db && git log --all --oneline --decorate | head -20
cd create-db && git show 8582fbf:package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A2 || echo "Could not retrieve merge commit package.json"
cd create-db && find . -path ./node_modules -prune -o -name ".changeset" -type d -print
cd create-db && ls -la .changeset/ 2>/dev/null | head -20 || echo "No .changeset directory"
cd create-db && git show main:package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A2 -B1 || echo "Could not retrieve main branch package.json"
cd create-db && git log --oneline --all -- package.json | head -10
cd create-db && git diff main...8582fbf -- package.json | head -80
cd create-db && git show 082dd39:create-db/package.json 2>/dev/null | grep -E '"version"|"type"|"main"|"module"|"exports"' -A10 || echo "Could not retrieve main branch create-db/package.json"
cd create-db && git log --oneline --all -- create-db/package.json | head -10
cd create-db && git diff 082dd39 8582fbf -- create-db/package.json
cd create-db && git log --all --oneline --decorate -- create-db/CHANGELOG.md | head -5
cd create-db && git show 8582fbf:create-db/CHANGELOG.md 2>/dev/null | head -50
```
Entrypoint restructuring to dist/ is a breaking change—requires major version bump or compatibility layer.
This PR changes the primary entrypoints from index.js to ./dist/index.mjs, removes the direct root-level files, and updates bin entries to point to ./dist/cli.mjs. Consumers importing from the root directory or using direct index.js paths will break. Since this is being released as v1.1.0 (minor bump), either:
- Bump to v2.0.0 to properly signal the breaking change, or
- Provide a CJS/ESM compatibility layer (e.g., keep `index.js` as a thin wrapper, or add a `"cjs"` export target).
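A sketch of the compatibility-layer option applied to the exports map above (the `"require"` condition and `./dist/index.cjs` target are assumptions; the PR as reviewed ships ESM only):

```json
"exports": {
  ".": {
    "types": "./dist/index.d.mts",
    "import": "./dist/index.mjs",
    "require": "./dist/index.cjs"
  },
  "./cli": {
    "import": "./dist/cli.mjs"
  }
}
```

With a `require` condition like this, and assuming the build is configured to emit a CJS bundle alongside the ESM one, `require("create-db")` keeps resolving for existing CJS consumers while ESM consumers continue to get `./dist/index.mjs`.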
🤖 Prompt for AI Agents
In create-db/package.json around lines 5 to 18, the package export and
entrypoint changes point root imports and bin entries at ./dist/*.mjs which is a
breaking change for existing CJS consumers; either (A) bump the package version
to 2.0.0 to signal the breaking change, or (B) restore backwards compatibility
by adding a CJS/compatibility layer: keep a root-level index.js that
requires/exports the ESM bundle (or add a "cjs" export target in "exports"), and
update bin shims to point to a small CJS wrapper that loads ./dist/cli.mjs so
existing imports and CLI users continue to work; implement one of these fixes
and update package.json version/exports/bin accordingly.
```ts
const connectionString =
  directConnDetails && directHost
    ? `postgresql://${directUser}:${directPass}@${directHost}${directPort}/${directDbName}?sslmode=require`
    : null;

const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
const expiryDate = new Date(Date.now() + 24 * 60 * 60 * 1000);
```
Potential empty projectId in claim URL.
If both result.data?.id and result.id are undefined (line 91), projectId defaults to an empty string, resulting in a malformed claimUrl like ...?projectID=&utm_source=.... Consider validating projectId before constructing the URL.
```diff
+  if (!projectId) {
+    void sendAnalytics(
+      "create_db:database_creation_failed",
+      { region, "error-type": "missing_project_id" },
+      runId,
+      createDbWorkerUrl
+    );
+    return {
+      success: false,
+      error: "missing_project_id",
+      message: "Server response missing project ID.",
+      status: resp.status,
+    };
+  }
+
   const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
```

📝 Committable suggestion

```ts
const connectionString =
  directConnDetails && directHost
    ? `postgresql://${directUser}:${directPass}@${directHost}${directPort}/${directDbName}?sslmode=require`
    : null;

if (!projectId) {
  void sendAnalytics(
    "create_db:database_creation_failed",
    { region, "error-type": "missing_project_id" },
    runId,
    createDbWorkerUrl
  );
  return {
    success: false,
    error: "missing_project_id",
    message: "Server response missing project ID.",
    status: resp.status,
  };
}

const claimUrl = `${claimDbWorkerUrl}/claim?projectID=${projectId}&utm_source=${userAgent || getCommandName()}&utm_medium=cli`;
const expiryDate = new Date(Date.now() + 24 * 60 * 60 * 1000);
```
🤖 Prompt for AI Agents
In create-db/src/database.ts around lines 110-116, projectId may be empty which
creates a malformed claimUrl; before building the URL validate that projectId is
a non-empty string (result.data?.id || result.id) and if falsy either throw or
return an error/early-exit so you don't construct the claim URL, otherwise build
the URL using encodeURIComponent for projectId and userAgent/getCommandName to
ensure safe query params; ensure downstream code handles a null/undefined
claimUrl if you choose to skip creating it.
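The guard described above can be sketched as a small pure helper (the function name and the fallback source string are illustrative, not from the PR; `sendAnalytics` and the surrounding handler are omitted):

```javascript
// Hypothetical helper: validate the project ID and encode query params
// before building the claim URL, instead of emitting ...?projectID=&...
function buildClaimUrl(claimDbWorkerUrl, projectId, userAgent) {
  if (typeof projectId !== "string" || projectId.length === 0) {
    // Caller can turn this into a structured error result.
    return null;
  }
  const source = encodeURIComponent(userAgent || "create-db");
  return (
    `${claimDbWorkerUrl}/claim` +
    `?projectID=${encodeURIComponent(projectId)}` +
    `&utm_source=${source}&utm_medium=cli`
  );
}

// An empty ID short-circuits; a spaced ID is percent-encoded.
console.log(buildClaimUrl("https://claim.example", "", "cli/1.0"));
console.log(buildClaimUrl("https://claim.example", "abc 123", "cli/1.0"));
```

Returning `null` (rather than a malformed URL) lets the caller branch into the same error-result shape the suggestion above uses.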
```ts
export async function detectUserLocation(): Promise<UserLocation | null> {
  // FOR TESTING: Return test location if configured
  if (TEST_LOCATION !== null) {
    return {
      country: "TEST",
      continent: "TEST",
      city: "Test City",
      region: "Test Region",
      latitude: TEST_LOCATION.latitude,
      longitude: TEST_LOCATION.longitude,
    };
  }

  // Real geolocation via IP API
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 3000);

  try {
    const response = await fetch("https://ipapi.co/json/", {
      method: "GET",
      headers: { "User-Agent": "create-db-cli/1.0" },
      signal: controller.signal,
    });

    if (!response.ok) return null;

    const data = (await response.json()) as GeoLocationResponse;

    // Validate that we have valid coordinates
    if (
      typeof data.latitude !== "number" ||
      typeof data.longitude !== "number" ||
      !Number.isFinite(data.latitude) ||
      !Number.isFinite(data.longitude)
    ) {
      return null;
    }

    return {
      country: data.country_code,
      continent: data.continent_code,
      city: data.city,
      region: data.region,
      latitude: data.latitude,
      longitude: data.longitude,
    };
  } catch {
    // Return null on any error (timeout, network, etc.)
    return null;
  } finally {
    clearTimeout(timeout);
  }
}
```
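The timeout shape used in `detectUserLocation` (`AbortController` plus `setTimeout`, with `clearTimeout` in `finally`) can be isolated into a generic helper. This is a sketch under the same swallow-errors policy, not code from this PR:

```javascript
// Run an abortable task with a deadline, mirroring the fetch-with-timeout
// pattern: abort after `ms`, swallow failures, always clear the timer.
async function withDeadline(taskFn, ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await taskFn(controller.signal);
  } catch {
    return null; // timeout, network error, etc. all map to null
  } finally {
    clearTimeout(timer);
  }
}

// Example task that respects the signal, the way fetch does.
function sleepUnlessAborted(ms, signal) {
  return new Promise((resolve, reject) => {
    const t = setTimeout(() => resolve("done"), ms);
    signal.addEventListener("abort", () => {
      clearTimeout(t);
      reject(new Error("aborted"));
    });
  });
}
```

Keeping `clearTimeout` in `finally` matters: without it, a fast response would leave a live timer holding the event loop open for the rest of the 3-second window.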
🧩 Analysis chain
🌐 Web query:
ipapi.co free API rate limits SLA reliability
💡 Result:
- Free tier rate limit: 1,000 requests per day (≈30,000/month). (ipapi.co)
- Notes on free-tier behavior: intended for testing/development, not production; requests may be further restricted by heuristics (no sign-up, no usage alerts/monitoring on free). (ipapi.co)
- Support / SLA: free plan has no support and is explicitly "not for production use" — ipapi does not promise an SLA for the free tier; paid plans include support and higher limits. (ipapi.co)
- Reliability / status reporting: ipapi publishes a status page with incident history and uptime metrics; check it for recent operational incidents. (ipapi.co)

Citations:
- https://ipapi.co/
- https://ipapi.co/free/
- https://ipapi.co/status/
🏁 Scripts executed (repository: prisma/create-db):

```shell
# Check for any documentation about external dependencies in the project
find . -type f \( -name "README.md" -o -name "*.md" -o -name "package.json" -o -name "requirements.txt" \) -exec grep -l "ipapi\|geolocation\|rate.limit\|dependency" {} \;

# Look for any comments or docs about API alternatives in the geolocation file
cat -n create-db/src/geolocation.ts | head -20

# Check if there's any environment configuration for API selection
rg -i "geolocation|ipapi|api.key|api.url" --type typescript --type json

# Check if there's any discussion of alternatives or paid tiers in the codebase
rg -i "alternative|paid|production|rate.limit" create-db/src/

# Look at the full geolocation.ts file to understand project scope
wc -l create-db/src/geolocation.ts

# Check the main README for project scope
head -50 README.md
```
Do not use ipapi.co's free tier in production — it is explicitly not recommended for production use.
The code relies on ipapi.co's free API with only 1,000 requests per day (≈30,000 per month), which ipapi.co explicitly documents as unsuitable for production. The free tier also has no SLA or support. For a production CLI tool, this will hit rate limits quickly as usage scales.
Either upgrade to ipapi.co's paid plan, switch to a production-ready alternative (e.g., MaxMind GeoIP2, IP2Location), or document this as a known limitation. Given the project already handles rate limit errors elsewhere (see database.ts), this dependency warrants the same attention.
🤖 Prompt for AI Agents
In create-db/src/geolocation.ts around lines 67 to 119, the code uses ipapi.co's
free tier which is unsuitable for production; update the implementation so the
CLI does not rely on the free ipapi.co service by default — either (1) add
configuration/env support to use a production-grade provider (e.g., MaxMind
GeoIP2, IP2Location, or a paid ipapi plan) and wire provider selection into
detectUserLocation, (2) implement a clear fallback and rate-limit/error handling
consistent with database.ts (detect and surface rate-limit errors, retry/backoff
or fallback to a secondary provider), or (3) if keeping ipapi as an optional
fallback, explicitly document the limitation and require an opt-in API key/env
var for production usage; ensure logs/errors include provider and error details
and that timeouts/rate-limit responses are handled the same way as other project
network calls.
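Option (3), making the free endpoint an explicit opt-in, can be sketched as pure provider selection (the env var names and the `configured`/`free`/`disabled` result shape are invented for illustration):

```javascript
// Hypothetical provider selection: prefer a keyed production provider,
// allow the rate-limited free endpoint only on explicit opt-in, and
// otherwise disable geolocation so the CLI falls back to its defaults.
function pickGeoProvider(env) {
  if (env.GEO_PROVIDER_URL && env.GEO_PROVIDER_KEY) {
    return { kind: "configured", url: env.GEO_PROVIDER_URL };
  }
  if (env.ALLOW_FREE_GEO === "1") {
    return { kind: "free", url: "https://ipapi.co/json/" };
  }
  return { kind: "disabled", url: null };
}
```

`detectUserLocation` would then return `null` immediately for `disabled`, which matches its existing behavior on timeout or network failure.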
✅ Preview CLIs & Workers are live! Test the CLIs locally under tag `pr67`:

```shell
npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67
```

Worker URLs
Actionable comments posted: 1
🧹 Nitpick comments (2)
create-pg/cli.js (1)

1-2: Add error handling for the dynamic import. The dynamic `import()` returns a promise that is not awaited or handled. If the `create-db/cli` module fails to load or execute, the error may not be properly surfaced to the user. Consider adding explicit error handling:

```diff
 #!/usr/bin/env node
-import("create-db/cli");
+import("create-db/cli").catch((err) => {
+  console.error("Failed to load create-db CLI:", err);
+  process.exit(1);
+});
```

create-postgres/cli.js (1)

1-2: Add error handling for the dynamic import. The dynamic `import()` returns a promise that is not awaited or handled. If the `create-db/cli` module fails to load or execute, the error may not be properly surfaced to the user. Consider adding explicit error handling:

```diff
 #!/usr/bin/env node
-import("create-db/cli");
+import("create-db/cli").catch((err) => {
+  console.error("Failed to load create-db CLI:", err);
+  process.exit(1);
+});
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- create-pg/cli.js (1 hunks)
- create-pg/index.js (1 hunks)
- create-pg/package.json (2 hunks)
- create-postgres/cli.js (1 hunks)
- create-postgres/index.js (1 hunks)
- create-postgres/package.json (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (7)
create-postgres/index.js (1)

1-1: Clean delegation to create-db programmatic API. The re-export pattern enables create-postgres to expose the programmatic API from create-db without duplication, maintaining consistency with the create-pg package structure.

create-pg/package.json (2)

32-35: README.md excluded from published package. The `files` array no longer includes `README.md`, which means the README won't be distributed with the npm package. This impacts documentation availability for users who install the package. Consider adding `README.md` back to the `files` array:

```diff
 "files": [
   "cli.js",
-  "index.js"
+  "index.js",
+  "README.md"
 ]
```

Likely an incorrect or invalid review comment.

23-25: This change is appropriate for a CLI-only package and does not introduce breaking changes. The create-pg package is explicitly designed and documented as a CLI tool (see README, installation via `npx create-pg@latest`, and all usage examples showing only CLI commands). The removal of the `"main"` field aligns with this CLI-first positioning. While `index.js` exists in the published files, it is not documented in the README or mentioned as part of the public API, and no library usage patterns exist in the codebase. Users who consume this package do so exclusively via `npx` as a CLI tool.

create-postgres/cli.js (1)

2-2: The `create-db/cli` module is properly configured and exists. The create-db package has the correct exports configuration in package.json, mapping `"./cli"` to `"./dist/cli.mjs"`. The corresponding source file exists at `create-db/src/cli.ts` and is built via the tsdown build system. The import statement is valid.

create-pg/cli.js (1)

2-2: The `create-db/cli` export is properly configured in the create-db package and the module is valid. No changes required.

create-pg/index.js (1)

1-1: Clean delegation to create-db programmatic API. The re-export pattern enables create-pg to expose the programmatic API from create-db without duplication. The create-db package exports a complete set of functions (`create`, `regions`) and types (`CreateDatabaseResult`, `DatabaseResult`, `Region`, etc.) that are appropriately re-exported through this pattern, providing consumers with a consistent interface across both packages.

create-postgres/package.json (1)

23-25: No action needed — the package is designed correctly with no breaking changes. The "main" field is not present in the current package.json and was not part of the HEAD revision. The package includes an `index.js` that re-exports from "create-db" (`export * from "create-db"`), allowing library consumption via named imports. The removal or absence of "main" is not a breaking change because:

- Modern ESM packages (this package has `"type": "module"`) do not require a "main" field
- Named exports remain accessible via the included index.js
- No imports of create-postgres exist in the codebase
- The "bin" field provides the intended CLI entry point

The current structure is intentional (commit: "fix: entry points for both alias clis").

Likely an incorrect or invalid review comment.
```diff
   "files": [
-    "index.js",
-    "README.md"
+    "cli.js",
+    "index.js"
   ]
 }
```
README.md excluded from published package.
The files array no longer includes README.md, which means the README won't be distributed with the npm package. This impacts documentation availability for users who install the package.
Consider adding README.md back to the files array:
```diff
 "files": [
   "cli.js",
-  "index.js"
+  "index.js",
+  "README.md"
 ]
```

📝 Committable suggestion
```json
  "files": [
    "cli.js",
    "index.js",
    "README.md"
  ]
}
```
🤖 Prompt for AI Agents
In create-postgres/package.json around lines 32 to 36, the package "files" array
currently only lists "cli.js" and "index.js", so README.md will be excluded from
published packages; to fix, add "README.md" (matching its filename/casing) to
the "files" array in package.json so the README is included when publishing,
then validate by running npm pack or npm publish --dry-run to confirm the file
is present in the package.
✅ Preview CLIs & Workers are live! Test the CLIs locally under tag `pr67`:

```shell
npx create-db@pr67
npx create-pg@pr67
npx create-postgres@pr67
```

Worker URLs
Actionable comments posted: 0
♻️ Duplicate comments (1)
create-postgres/package.json (1)

44-49: README.md still excluded from published package. `files` only includes `cli.js`, `index.js`, and `index.d.ts`, so the README will not be part of the npm tarball. Consider adding `README.md` to improve the installed package's documentation.

🧹 Nitpick comments (1)

create-pg/package.json (1)

44-49: Consider including `README.md` in the published files. `files` currently only ships `cli.js`, `index.js`, and `index.d.ts`, so consumers won't get the README in the installed package. Including it improves offline docs and ecosystem tooling (e.g. package viewers).
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- create-pg/index.d.ts (1 hunks)
- create-pg/package.json (2 hunks)
- create-postgres/index.d.ts (1 hunks)
- create-postgres/package.json (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Workers Builds: claim-db-worker
- GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (5)
create-pg/index.d.ts (1)

1-2: Type re-export from create-db looks correct. Delegating the entire type surface to create-db keeps create-pg's typings aligned with the core package and matches the runtime re-export pattern.

create-postgres/index.d.ts (1)

1-2: Consistent re-export of typings from create-db. This mirrors the JS re-export behavior and ensures create-postgres exposes the same programmatic API typings as create-db.

create-pg/package.json (1)

23-37: ESM entrypoints and exports map look coherent. `type: "module"` plus `main`/`module` pointing to `index.js`, `types` to `index.d.ts`, an exports map for `"."` and `"./cli"`, and `bin` targeting `./cli.js` form a consistent surface for both programmatic API and CLI.

create-postgres/package.json (2)

5-5: Author metadata update is appropriate. Setting `"author": "prisma"` is consistent with the repository ownership.

23-37: Programmatic/CLI entry configuration is consistent. `main`/`module`/`types` plus the exports map and `bin: "create-postgres": "./cli.js"` give a clear separation between the library entry (`index.js`/`index.d.ts`) and the CLI entry (`cli.js`).
Summary by CodeRabbit
New Features
User-facing Fixes
Refactor
✏️ Tip: You can customize this high-level summary in your review settings.