This post is not really an issue with this project or its code. It is intended as an insight into my use case and the obstacles I am dealing with, and also to help anyone else who wants to use it in the same or a similar way as I do (extracting challenges for my hotspots).
I was having problems when querying the transactions table by witness address. Since that value sits in an array inside the jsonb fields column, queries took a long time.
Query time for the transactions table by challengee address was OK, and it got much better when I added an index on ((fields #> '{path,0}') ->> 'challengee'). I had no luck trying to index the witness gateway.
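For reference, this is roughly what that index looks like as a statement (a sketch only; the index name is made up, and the example query assumes you are looking up poc receipts by challengee the same way I do):

-- Expression index on the challengee of the first path element.
CREATE INDEX IF NOT EXISTS transactions_challengee_idx
    ON transactions (((fields #> '{path,0}') ->> 'challengee'));

-- A query like this can then use the index instead of scanning the jsonb:
SELECT *
  FROM transactions
 WHERE type = 'poc_receipts_v1'
   AND (fields #> '{path,0}') ->> 'challengee' = '<your hotspot address>';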
I hoped the filters would also filter the transactions table, which would reduce database size. They do not. I guess that is not what this ETL was designed for, and I am OK with it. I still love this project!
Since I don't know Rust well enough to add transaction filtering to this code, I added a trigger to the database that prevents inserting rows that do not contain data for my hotspots. It uses the filters table, but the values must be gateway addresses.
Here is the SQL code for the trigger:
CREATE OR REPLACE FUNCTION filter_transactions()
RETURNS trigger AS
$BODY$
DECLARE
    i          json;
    pass       boolean := false;
    fieldsjson json    := NEW.fields #> '{path,0}';
BEGIN
    -- Only poc_receipts_v1 rows are inspected; every other transaction
    -- type keeps pass = false and is therefore dropped as well.
    IF NEW.type::text LIKE 'poc_receipts_v1' THEN
        -- Keep the row if the challengee is one of our gateways ...
        IF (SELECT value FROM filters WHERE value LIKE fieldsjson ->> 'challengee') IS NULL THEN
            -- ... otherwise keep it if any witness gateway is one of ours.
            FOR i IN SELECT * FROM json_array_elements(fieldsjson -> 'witnesses')
            LOOP
                IF (SELECT value FROM filters WHERE value LIKE i ->> 'gateway') IS NOT NULL THEN
                    pass := true;
                END IF;
            END LOOP;
        ELSE
            pass := true;
        END IF;
    END IF;

    -- Returning NULL from a BEFORE trigger skips the insert.
    IF pass = false THEN
        RETURN NULL;
    ELSE
        RETURN NEW;
    END IF;
END;
$BODY$
LANGUAGE plpgsql;

CREATE TRIGGER filter BEFORE INSERT ON transactions
    FOR EACH ROW
    EXECUTE PROCEDURE filter_transactions();
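For completeness, the trigger only checks the value column of the filters table (as the queries above show), so the gateway addresses of your hotspots need to be in there. A minimal sketch, assuming value is the only column you need to populate and using a placeholder address:

-- Add the b58 gateway address of each hotspot you want to keep.
INSERT INTO filters (value)
VALUES ('<your hotspot gateway address>');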