Version

latest

Current behavior

As I understand it, context.db.insert and context.db.update put changes into memory and then flush them to the DB later. At least, that is what I expect is happening, since those function calls are very fast.

However, context.db.delete takes much longer.

I have a use case where two events encapsulate multiple other events. The way I handle it is with a DB record that exists only for this temporary state: when the closing event fires, I delete the record.

But right now I am using a trick, since I found out that updating is faster than deleting:
```ts
import { ponder } from "ponder:registry";
import { currentClean } from "ponder:schema";

/**
 * Handles Mangrove cleaning events by tracking clean start and completion.
 *
 * During cleaning:
 * 1. CleanStart event creates a record of the cleaning operation
 * 2. CleanComplete event removes the record when cleaning is done
 *
 * The clean record is used by other handlers to determine if offers
 * should be modified during cleaning operations.
 */

/**
 * Event handler for Mangrove:CleanStart events.
 * Creates a record of the cleaning operation with chain ID and block details.
 */
ponder.on("Mangrove:CleanStart", async ({ event, context }) => {
  await context.db
    .insert(currentClean)
    .values({
      chainId: context.network.chainId,
      block: event.block.number,
      eventIndex: event.log.logIndex,
    })
    .onConflictDoUpdate({
      eventIndex: event.log.logIndex,
    });
});

/**
 * Event handler for Mangrove:CleanComplete events.
 * Removes the cleaning operation record when complete.
 */
ponder.on("Mangrove:CleanComplete", async ({ event, context }) => {
  // Instead of deleting the record, we clear the event index.
  // This is because the delete operation hits the DB directly, whereas an
  // update only creates a local diff until the next flush.
  // This reduces this handler's time from 0.3ms to 0.003ms.
  await context.db
    .update(currentClean, {
      chainId: context.network.chainId,
      block: event.block.number,
    })
    .set({
      eventIndex: null,
    });
});
```
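For comparison, here is the straightforward delete-based handler I would prefer to write. A minimal sketch, assuming context.db.delete accepts the same composite key that update does:

```ts
import { ponder } from "ponder:registry";
import { currentClean } from "ponder:schema";

/**
 * The delete-based version of the CleanComplete handler.
 * This call takes ~0.3ms because the delete goes to the DB
 * instead of staying in the in-memory buffer until the next flush.
 */
ponder.on("Mangrove:CleanComplete", async ({ event, context }) => {
  await context.db.delete(currentClean, {
    chainId: context.network.chainId,
    block: event.block.number,
  });
});
```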
The problem with the update trick is that it grows my DB size even though I don't really want it to.
Expected behavior
I can think of two solutions:

1. The delete method could act as a diff the way insert and update do, or there could be a better pattern for my use case.
2. There could be something like the transient storage in the EVM, which is much simpler and is deleted automatically after every transaction (or block) if needed. It could be synchronous, or at least as fast as the insert and update ops (or even faster), and it could be limited in size since it could live completely in memory. A sketch of what I mean follows this list.
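To make the second idea concrete, here is a hypothetical sketch, not an existing Ponder API: the transient state is just a module-level in-memory map, and the currentCleans map and cleanKey helper are names I made up for illustration. The obvious catch is that a plain map is not crash-safe or reorg-aware, which is why a built-in transient store would be nicer:

```ts
import { ponder } from "ponder:registry";

// Hypothetical illustration only: not an existing Ponder API.
// A module-level map plays the role of "transient storage". It lives
// purely in memory, so writes and deletes are synchronous and no DB
// row is ever created or cleaned up.
const currentCleans = new Map<string, { eventIndex: number }>();

// Made-up composite key helper, mirroring the (chainId, block) key above.
const cleanKey = (chainId: number, block: bigint) => `${chainId}:${block}`;

ponder.on("Mangrove:CleanStart", async ({ event, context }) => {
  currentCleans.set(cleanKey(context.network.chainId, event.block.number), {
    eventIndex: event.log.logIndex,
  });
});

ponder.on("Mangrove:CleanComplete", async ({ event, context }) => {
  // Synchronous, in-memory delete: nothing touches the DB.
  currentCleans.delete(cleanKey(context.network.chainId, event.block.number));
});
```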
Steps to reproduce
No response
Link to repository
No response
Anything else?
No response