This is more of a help request than an issue. I would like an example of (or help with) reading/parsing files in batches, similar to what @loaders.gl provides. For example, with loaders.gl we can do this:
import { parseInBatches } from '@loaders.gl/core';

// `loaders` is the array of loaders.gl loaders for the expected file type,
// and `batches` is the array the parsed rows are collected into.
const batchIterator = await parseInBatches(file, loaders, {
  worker: true,        // parse on a worker thread
  batchSize: 4000,     // rows per batch
  batchDebounceMs: 50,
  metadata: true,
});
for await (const batch of batchIterator) {
  for (let i = 0; i < batch.data.length; i += 1) {
    batches.push(batch.data[i] as never);
  }
}
This way the file gets loaded and parsed without blocking the main thread. I have been exploring plain JS web workers, so my solution would be something like the following, calling the worker from the file-upload event:
// worker.js — readGeoParquet comes from the GeoParquet WASM bindings
import { tableFromIPC } from 'apache-arrow';

onmessage = function (event) {
  // event.data is the uploaded file's ArrayBuffer
  const wasmTable = readGeoParquet(new Uint8Array(event.data));
  // Round-trip through an Arrow IPC stream to get a JS Arrow table
  const jsTable = tableFromIPC(wasmTable.intoTable().intoIPCStream());
};
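To complete the picture, here is a rough sketch of the main-thread side I have in mind. It assumes the worker posts the Arrow IPC bytes back (a JS Arrow Table itself doesn't survive structured cloning), and the worker file name is just a placeholder:

import { tableFromIPC } from 'apache-arrow';

// Placeholder path; adjust to however the bundler handles worker files.
const worker = new Worker(new URL('./geoparquet.worker.js', import.meta.url), {
  type: 'module',
});

// Assumed reply: the Arrow IPC stream bytes (Uint8Array) from the worker,
// turned back into a JS Arrow table here on the main thread.
worker.onmessage = (event: MessageEvent<Uint8Array>) => {
  const jsTable = tableFromIPC(event.data);
  console.log('parsed rows:', jsTable.numRows);
};

// File-upload handler: hand the raw bytes to the worker, transferring the
// buffer so it isn't copied.
async function onFileUpload(file: File) {
  const buffer = await file.arrayBuffer();
  worker.postMessage(buffer, [buffer]);
}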
I'm not sure yet whether that's the right approach. I'm also confused about the earcutWorkerPoolSize and earcutWorkerUrl layer options and whether they would be a more effective way to solve this, so more information about them would be helpful.
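For reference, this is roughly how I imagine those options being wired up; the GeoArrowSolidPolygonLayer, the "geometry" column name, and the worker URL are my assumptions, so please correct me if the props belong somewhere else:

import { GeoArrowSolidPolygonLayer } from '@geoarrow/deck.gl-layers';
import type { Table } from 'apache-arrow';

// jsTable is the Arrow table produced by the worker above.
function makePolygonLayer(jsTable: Table) {
  return new GeoArrowSolidPolygonLayer({
    id: 'geoparquet-polygons',
    data: jsTable,
    getPolygon: jsTable.getChild('geometry')!,
    getFillColor: [0, 100, 120, 160],
    // My (unverified) understanding: triangulation is offloaded to a pool of
    // earcut web workers, controlled by these two props.
    earcutWorkerPoolSize: 4,
    earcutWorkerUrl: '/workers/earcut-worker.js', // placeholder URL to the earcut worker bundle
  });
}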
Thanks