Replies: 3 comments
-
I don't have any time to assist, but I think you could do that. Alternatively, you could also clean up processed items in the …
-
I'm running into the same issue. I'm using a high-volume Stripe account with ~120,000 rows in the `webhook_calls` table, and the duplicate check runs this query:

```sql
select exists(
    select *
    from `webhook_calls`
    where `name` = 'stripe'
      and json_unquote(json_extract(`payload`, '$."id"')) = 'evt_1LwOtxEkNJHWF29kseHoKuqd'
) as `exists`
```

Apparently you can use stored generated columns as indexes for JSON values. But I think a better solution would simply be to store the event ID in a separate, indexed column.
-
Just had the same issue. Simply using a JSON column without an index does not work so well in this case :) I solved it with this migration:
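(The migration itself did not survive here. As a rough sketch of what such a migration could look like, assuming MySQL 5.7+ and Laravel's schema builder, with `event_id` as a hypothetical column name:)

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('webhook_calls', function (Blueprint $table) {
            // Stored generated column extracted from the JSON payload,
            // so MySQL can build a regular index on the Stripe event ID.
            $table->string('event_id')
                ->nullable()
                ->storedAs("json_unquote(json_extract(`payload`, '$.\"id\"'))");
            $table->index('event_id');
        });
    }

    public function down(): void
    {
        Schema::table('webhook_calls', function (Blueprint $table) {
            $table->dropIndex(['event_id']);
            $table->dropColumn('event_id');
        });
    }
};
```

A `virtualAs()` column would also work on MySQL, since secondary indexes can be built on virtual generated columns too; `storedAs()` trades some write-time cost for cheaper reads.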
-
My app processes thousands of Stripe calls per day. As soon as the `webhook_calls` table goes above 200,000 records, the database starts to die (technical term 😝). As the columns are JSON, I thought about potentially setting up virtual indexes, but I need to look into this more.

I'm wondering if there is something I'm missing which could prevent this? I could override the `StripeWebhookProfile` and add `payload->id` as its own column with an index to speed things up, any thoughts? I appreciate any help.

Thanks
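(For reference, a minimal sketch of such an override, assuming an indexed `event_id` column already exists on `webhook_calls` and implementing the package's `WebhookProfile` interface directly; the class name is made up:)

```php
<?php

use Illuminate\Http\Request;
use Spatie\WebhookClient\Models\WebhookCall;
use Spatie\WebhookClient\WebhookProfile\WebhookProfile;

class IndexedStripeWebhookProfile implements WebhookProfile
{
    public function shouldProcess(Request $request): bool
    {
        // Deduplicate against the indexed event_id column instead of
        // extracting the ID from the JSON payload on every lookup.
        return ! WebhookCall::where('name', 'stripe')
            ->where('event_id', $request->input('id'))
            ->exists();
    }
}
```

The profile would then be registered in the webhook config in place of the default one.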