perf(seed): run rebuilds in parallel, add perf logs #4334
base: main
Changes from 1 commit
@@ -11,25 +11,24 @@ import { RebuildOptions, buildRebuildOptions } from './common';
  * @param options - Optional options for how rebuild should be done.
  */
 export async function rebuildR4SearchParameters(options?: Partial<RebuildOptions>): Promise<void> {
-  const finalOptions = buildRebuildOptions(options);
+  const rebuildOptions = buildRebuildOptions(options);
   const client = getDatabasePool();
   await client.query('DELETE FROM "SearchParameter" WHERE "projectId" = $1', [r4ProjectId]);

   const systemRepo = getSystemRepo();

-  if (finalOptions.parallel) {
-    const promises = [];
-    for (const filename of SEARCH_PARAMETER_BUNDLE_FILES) {
-      for (const entry of readJson(filename).entry as BundleEntry[]) {
-        promises.push(createParameter(systemRepo, entry.resource as SearchParameter));
-      }
-    }
+  const promises = [];
+  for (const filename of SEARCH_PARAMETER_BUNDLE_FILES) {
+    for (const entry of readJson(filename).entry as BundleEntry[]) {
+      promises.push(createParameter(systemRepo, entry.resource as SearchParameter));
+    }
+  }
+
+  if (rebuildOptions.parallel) {
     await Promise.all(promises);
Review comment: Consider always building the array, and then using:

    if (finalOptions.parallel) {
      await Promise.all(promises);
    } else {
      for (const promise of promises) {
        await promise;
      }
    }

That way we don't need to worry about the logic getting out of sync.

Reply: Good point 👍

Reply: Actually, this won't work. Filling the array in the serial case doesn't execute the promises serially; it only waits for them serially. We have to create each promise inside the loop where it is awaited, or we unintentionally parallelize them.

Reply: I don't know if it's worth introducing a new dependency, but p-limit seems like a pretty elegant and simple solution to abstract away the max concurrency that we want to allow and avoid the …
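A sketch of how that p-limit suggestion might look here, reusing the identifiers from the diff; this snippet is illustrative and not part of the PR:

    import pLimit from 'p-limit';

    // Allow at most 10 createParameter() calls in flight at once. limit()
    // returns a promise immediately, but defers invoking the wrapped
    // function until a concurrency slot frees up.
    const limit = pLimit(10);

    const promises: Promise<void>[] = [];
    for (const filename of SEARCH_PARAMETER_BUNDLE_FILES) {
      for (const entry of readJson(filename).entry as BundleEntry[]) {
        promises.push(limit(() => createParameter(systemRepo, entry.resource as SearchParameter)));
      }
    }
    await Promise.all(promises);

With this shape, serial mode is just pLimit(1), so the parallel and serial branches could share one code path.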
   } else {
-    for (const filename of SEARCH_PARAMETER_BUNDLE_FILES) {
-      for (const entry of readJson(filename).entry as BundleEntry[]) {
-        await createParameter(systemRepo, entry.resource as SearchParameter);
-      }
+    for (const promise of promises) {
+      await promise;
     }
   }
 }
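As the thread above points out, the serial branch in this commit still kicks off all the work eagerly: a promise starts executing when it is created, not when it is awaited. A minimal self-contained sketch of the difference, where doWork is a hypothetical stand-in for createParameter:

    import { setTimeout as sleep } from 'node:timers/promises';

    // Hypothetical stand-in for createParameter().
    async function doWork(id: number): Promise<void> {
      console.log(`start ${id}`);
      await sleep(100);
      console.log(`done ${id}`);
    }

    async function main(): Promise<void> {
      // Unintentionally parallel: all three "start" lines log immediately,
      // because each doWork() call begins running as soon as it is invoked.
      // The loop below only serializes the waiting, not the work.
      const promises = [1, 2, 3].map((id) => doWork(id));
      for (const promise of promises) {
        await promise;
      }

      // Actually serial: each promise is created inside the loop that
      // awaits it, so "start 2" cannot appear before "done 1".
      for (const id of [1, 2, 3]) {
        await doWork(id);
      }
    }

    main().catch(console.error);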
Review comment: lol, curious how Postgres handles this
Reply: We're capped at 10 concurrent connections by the pool. I considered trying to raise it to see what happens, but it seemed to run fine on my local machine as is.
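For context on that cap, assuming the pool behind getDatabasePool() is node-postgres (an assumption; the thread doesn't say): pg pools default to 10 clients, and queries beyond that queue for a free connection instead of reaching Postgres concurrently. An illustrative config, not Medplum's actual one:

    import { Pool } from 'pg';

    // Illustrative only. node-postgres defaults to max 10 pooled clients;
    // an 11th concurrent query waits in the pool's internal queue until a
    // connection is released.
    const pool = new Pool({
      connectionString: process.env.DATABASE_URL,
      max: 10, // the default; raise to allow more concurrent queries
    });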