
Commit cc9c2bd

Update README, .env and related to allow for easier onramping
1 parent 2cc053f · commit cc9c2bd

6 files changed: +58 −158 lines

.dockerignore

+1 −1

@@ -1,2 +1,2 @@
-postgres/
+mounts/
 node_modules

.env.example

+14

@@ -0,0 +1,14 @@
+POSTGRES_USER=scraper
+POSTGRES_PASSWORD=scraper
+POSTGRES_DB=events
+POSTGRES_PORT=5432
+
+RPC_URL_ETHEREUM=
+RPC_URL_BSC=
+RPC_URL_POLYGON=
+RPC_URL_AVALANCHE=
+RPC_URL_FANTOM=
+RPC_URL_CELO=
+RPC_URL_OPTIMISM=
+RPC_URL_ARBITRUM=
+

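The template pairs with the reworked `docker-compose.yml`: the `POSTGRES_*` defaults already match what the compose file expects, so onboarding only really requires filling in RPC endpoints. A minimal sketch of that first step (the provider URL is a placeholder, not something shipped in this commit):

```sh
# Copy the committed template; the Postgres defaults (scraper/scraper/events/5432)
# already line up with docker-compose.yml.
cp .env.example .env

# Then fill in RPC_URL_* only for the chains you plan to scrape, e.g.
#   RPC_URL_ETHEREUM=https://eth-mainnet.example.com/<your-key>   (placeholder)
"${EDITOR:-vi}" .env
```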
.gitignore

+1 −1

@@ -1,5 +1,5 @@
 .env
-postgres
+mounts/
 node_modules
 lib
 .vscode

README.md

+12 −112

@@ -1,124 +1,24 @@
 # 0x-event-pipeline
 
-A node.js app that was originally designed for pulling 0x staking events info, but now expanded to all other 0x related events.
+EVM Blockchain Scraper, mainly for 0x Protocol Events and some extra useful Events.
 
 ## Getting started
 
-Test locally:
+Run locally:
 
-- Step 1
-Rename the `.env.exmaple` file to `.env`, or create a new `.env` file. Add the required env variables (see below for configuration details)
-
-- Step 2
-Set up the database variables in `docker-compose.yml` file to the desired database destination.
-
-```
-$ docker-compose up # get postgres up
-```
-
-- Step 3 Test build & Debug
-
-```
-$ yarn install
-$ yarn build
-```
-
-- Step 4 Build migrations
-If there are new tables to be created, or schema changes, you will need to create migration files first:
-
-```
-yarn migrate:create -n <YourMigrationName>
-```
-
-Modify the migration file in `migrations/` folder with necessary changes.
-
-Run migration:
-
-```
-$ yarn migrate:run
-
-```
-
-To revert migration:
+1. Copy the `.env.example` file to `.env`. Add the `RPC_URL` for the chain(s) you are going to run
 
+2. Start Postgres
+```sh
+docker-compose up -d # get postgres up
 ```
-$ yarn migrate:revert
 
+3. Build the Scraper images
+```sh
+docker-compose build
 ```
 
-- Step 5
-Start the scraper:
-
-```
-$ yarn start
+4. Start the scraper(s)
+```sh
+docker-compose up event-pipeline-ethereum # change the chain name
 ```
-
-## Configuration
-
-### Environment variables:
-
-**Required**
-
-`ETHEREUM_RPC_URL` - The RPC URL to use. Must match `CHAIN_ID`.
-
-`CHAIN_ID` - The EVM chain id.
-
-`EP_DEPLOYMENT_BLOCK` - The block on which the proxy contract was deployed
-
-`SCHEMA` - The schema to use to store events in the DB
-
-**Optional**
-
-`POSTGRES_URI` - The full postgres URI to connect to. Defaults to local development.
-
-`START_BLOCK_OFFSET` - How many blocks before the current block to search for events, allowing for updates to previously scraped events that may be in orphaned blocks.
-
-`MAX_BLOCKS_TO_PULL` - The maximum number of blocks to pull at once.
-
-`MAX_BLOCKS_TO_SEARCH` - The maximum number of blocks to search for events at once.
-
-`BLOCK_FINALITY_THRESHOLD` - How many blocks before the current block to end the search, allowing you to limit your event scrape to blocks that are relatively more settled.
-
-`MINUTES_BETWEEN_RUNS` - How long to wait between scrapes.
-
-`SHOULD_SYNCHRONIZE` - Whether typeorm should synchronize with the database from `POSTGRES_URI`.
-
-`STAKING_DEPLOYMENT_BLOCK` - The block on which the staking contract was deployed
-
-`STAKING_POOLS_JSON_URL` - The source for the JSON mapping of staking pools to UUIDs (for grabbing metadata about pools). Defaults to the 0x staking pool registry GitHub repo.
-
-`STAKING_POOLS_METADATA_JSON_URL` - The source for the JSON mapping of UUIDs to metadata. Defaults to the 0x staking pool registry GitHub repo.
-
-`BASE_GITHUB_LOGO_URL` - The base URL for grabbing logos for staking pools. Defaults to the 0x staking pool registry GitHub repo.
-
-## Database snapshots
-
-When running the app on a new database it can take a long time to find new events depending on how much time has passed since the contracts were deployed. There are options to dump and restore data from other sources using `pg_dump` ([Documentation](https://www.postgresql.org/docs/9.6/app-pgdump.html)) and `pg_restore` ([Documentation](https://www.postgresql.org/docs/9.2/app-pgrestore.html)). Some examples are outlined below.
-
-These examples will require `postgresql` to be installed.
-
-```
-$ brew install postgresql
-```
-
-### Getting data from another database
-
-If you know of another database that contains up-to-date data, you can `pg_dump` data from the relevant schemas from that database by running:
-
-```
-$ pg_dump -h <host> -U <user> -p <port> --schema staking --schema events --data-only --file events.dump --format=c <database name>
-```
-
-To save a `pg_dump` archive file named `events.dump`. The command will prompt you for the password.
-
-### Restoring data from a pg_dump
-
-If you have access to a `.dump` file you can `pg_restore` data from that file into another database.
-
-To restore data into the default development database that is spun up by `docker-compose up`, you can run:
-
-```
-$ pg_restore --data-only --dbname events --host localhost --port 5432 -U user events.dump
-```
-
-Assuming you have access to an `events.dump` file. The command will prompt you for the password.

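Taken together, the new README collapses onboarding into four commands. A sketch of the whole flow on a fresh checkout (scoping step 2 to the `postgres` service is an assumption added here for clarity; the README itself simply runs `docker-compose up -d`):

```sh
cp .env.example .env                        # 1. set RPC_URL_* for the chains you need
docker-compose up -d postgres               # 2. start Postgres (postgres-init.sql seeds the schemas)
docker-compose build                        # 3. build the scraper images
docker-compose up event-pipeline-ethereum   # 4. run one scraper; swap the chain suffix for others
```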
docker-compose.yml

+22 −44

@@ -3,14 +3,15 @@ services:
   postgres:
     image: postgres:13.4
     environment:
-      - POSTGRES_USER=api
-      - POSTGRES_PASSWORD=api
-      - POSTGRES_DB=events
-      - POSTGRES_PORT=5432
+      POSTGRES_USER: '${POSTGRES_USER}'
+      POSTGRES_PASSWORD: '${POSTGRES_PASSWORD}'
+      POSTGRES_DB: '${POSTGRES_DB}'
+      POSTGRES_PORT: ${POSTGRES_PORT}
     # persist the postgres data to disk so we don't lose it
     # on rebuilds.
     volumes:
-      - ./postgres:/var/lib/postgresql/data
+      - ./mounts/postgres:/var/lib/postgresql/data
+      - ./postgres-init.sql:/docker-entrypoint-initdb.d/postgres-init.sql
     ports:
       - '5432:5432'
     command: ["postgres", "-c", "log_statement=all", "-c", "log_destination=stderr"]
@@ -23,9 +24,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL}'
+      ETHEREUM_RPC_URL: '${RPC_URL_ETHEREUM}'
       CHAIN_ID: '1'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events'
       EP_DEPLOYMENT_BLOCK: 10247094
       MAX_BLOCKS_TO_SEARCH: 1000
@@ -66,9 +67,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_BSC}'
+      ETHEREUM_RPC_URL: '${RPC_URL_BSC}'
       CHAIN_ID: '56'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_bsc'
       EP_DEPLOYMENT_BLOCK: 5375047
       MAX_BLOCKS_TO_SEARCH: 2000
@@ -96,9 +97,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
    environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_POLYGON}'
+      ETHEREUM_RPC_URL: '${RPC_URL_POLYGON}'
       CHAIN_ID: '137'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_polygon'
       ENABLE_PROMETHEUS_METRICS: "true"
       EP_DEPLOYMENT_BLOCK: 14391480
@@ -134,9 +135,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_AVALANCHE}'
+      ETHEREUM_RPC_URL: '${RPC_URL_AVALANCHE}'
       CHAIN_ID: '43114'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_avalanche'
       ENABLE_PROMETHEUS_METRICS: "true"
       EP_DEPLOYMENT_BLOCK: 3601700
@@ -162,9 +163,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_FANTOM}'
+      ETHEREUM_RPC_URL: '${RPC_URL_FANTOM}'
       CHAIN_ID: '250'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_fantom'
       EP_ADDRESS: "0xDEF189DeAEF76E379df891899eb5A00a94cBC250"
       ENABLE_PROMETHEUS_METRICS: "true"
@@ -193,9 +194,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_CELO}'
+      ETHEREUM_RPC_URL: '${RPC_URL_CELO}'
       CHAIN_ID: '42220'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_celo'
       ENABLE_PROMETHEUS_METRICS: "true"
       EP_DEPLOYMENT_BLOCK: 9350111
@@ -213,9 +214,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_OPTIMISM}'
+      ETHEREUM_RPC_URL: '${RPC_URL_OPTIMISM}'
       CHAIN_ID: '10'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_optimism'
       EP_DEPLOYMENT_BLOCK: 1691335
       MAX_BLOCKS_TO_SEARCH: 1000
@@ -232,29 +233,6 @@ services:
       FEAT_NFT: "true"
       NFT_FEATURE_START_BLOCK: 4214981
 
-  event-pipeline-ropsten:
-    depends_on:
-      - postgres
-    build:
-      context: .
-      dockerfile: Dockerfile.dev
-    restart: always
-    environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_ROPSTEN}'
-      CHAIN_ID: '3'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
-      SCHEMA: 'events_ropsten'
-      EP_DEPLOYMENT_BLOCK: 8075130
-      MAX_BLOCKS_TO_SEARCH: 1000
-      MAX_BLOCKS_TO_PULL: 1000
-      FEAT_RFQ_EVENT: "true"
-      FEAT_LIMIT_ORDERS: "true"
-      V4_NATIVE_FILL_START_BLOCK: 10125228
-      FEAT_OTC_ORDERS: "true"
-      OTC_ORDERS_FEATURE_START_BLOCK: 10857214
-      FEAT_NFT: "true"
-      NFT_FEATURE_START_BLOCK: 11849825
-
   event-pipeline-arbitrum:
     depends_on:
       - postgres
@@ -263,9 +241,9 @@ services:
       dockerfile: Dockerfile.dev
     restart: always
     environment:
-      ETHEREUM_RPC_URL: '${ETHEREUM_RPC_URL_ARBITRUM}'
+      ETHEREUM_RPC_URL: '${RPC_URL_ARBITRUM}'
       CHAIN_ID: '42161'
-      POSTGRES_URI: 'postgres://api:api@postgres/events'
+      POSTGRES_URI: 'postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres/${POSTGRES_DB}'
       SCHEMA: 'events_arbitrum'
       EP_DEPLOYMENT_BLOCK: 4050733
       MAX_BLOCKS_TO_SEARCH: 1000

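Since the credentials and RPC endpoints are now interpolated from `.env` rather than hard-coded, a quick way to confirm the substitution resolves before starting anything (docker-compose reads `.env` from the project root by default) is:

```sh
# Render the compose file with .env applied and spot-check the interpolated values.
docker-compose config | grep -E 'POSTGRES_URI|ETHEREUM_RPC_URL'
```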
postgres-init.sql

+8

@@ -0,0 +1,8 @@
+CREATE SCHEMA events;
+CREATE SCHEMA events_bsc;
+CREATE SCHEMA events_polygon;
+CREATE SCHEMA events_fantom;
+CREATE SCHEMA events_avalanche;
+CREATE SCHEMA events_celo;
+CREATE SCHEMA events_optimism;
+CREATE SCHEMA events_arbitrum;

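Because the script is mounted into `docker-entrypoint-initdb.d`, the postgres image only runs it when the data directory (`./mounts/postgres`) is empty, i.e. on first start. One way to verify the schemas were created, assuming the defaults from `.env.example`:

```sh
# List schemas inside the running postgres service; expects the scraper/events
# defaults from .env.example.
docker-compose exec postgres psql -U scraper -d events -c '\dn'
```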