How to handle TOO_MANY_ROWS_OR_BYTES?

I have this query where I'm trying to get:

  1. All swap prices in the very first block after liquidity is added
  2. The price in every block over a timeframe of up to 1 year (if a swap occurred in that block)

But of course, for some tokens the result exceeds the server's row limit:

"message": "clickhouse return status 500 [500 Internal Server Error] response Code: 396. DB::Exception: Limit for result exceeded, max rows: 25.00 thousand, current rows: 65.41 thousand. (TOO_MANY_ROWS_OR_BYTES) (version 23.7.5.30 (official build))\n",

Is there a way to structure the call differently to achieve my goal?

{
	EVM(dataset: archive, network: eth) {
		DEXTrades(
			limitBy: { by: Block_Time, count: 2500000 }
			limit: { count: 2500000 }
			orderBy: { ascending: Block_Time }
			where: {
				Trade: {
					Buy: {
						Currency: {
							SmartContract: {
								is: "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2"
							}
						}
					}
					Sell: {
						Currency: {
							SmartContract: {
								is: "0x8390a1DA07E376ef7aDd4Be859BA74Fb83aA02D5"
							}
						}
					}
				}
			}
		) {
			Trade {
				Buy {
					Price
					Currency {
						SmartContract
						Name
					}
				}
				Sell {
					Currency {
						Name
					}
				}
			}
			Block {
				Time
				Number
			}
			Transaction {
				Hash
			}
		}
	}
}

This criterion is used incorrectly:

limitBy: {by: Block_Time, count: 2500000}

limitBy limits results by the field you mention; in this case it asks for up to 2,500,000 records for each block timestamp. That is too much data to query in one go.
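
A minimal sketch of one way to restructure it, assuming the goal is at most one trade per block: key limitBy on the block number with count: 1 and keep the overall limit under the 25 thousand row cap. Block_Number as a limitBy field is an assumption here; use whichever field the schema actually exposes for the block height.

{
	EVM(dataset: archive, network: eth) {
		DEXTrades(
			# at most one trade per block (Block_Number as a limitBy field is an assumption)
			limitBy: { by: Block_Number, count: 1 }
			# stay under the 25 thousand row server cap
			limit: { count: 25000 }
			orderBy: { ascending: Block_Time }
			where: {
				Trade: {
					Buy: { Currency: { SmartContract: { is: "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2" } } }
					Sell: { Currency: { SmartContract: { is: "0x8390a1DA07E376ef7aDd4Be859BA74Fb83aA02D5" } } }
				}
			}
		) {
			Trade {
				Buy {
					Price
				}
			}
			Block {
				Time
				Number
			}
		}
	}
}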


Thank you

So now it's down from 65 thousand to 32 thousand rows, but it still exceeds the limit of 25 thousand.

So there's no way to fetch the all-time high (ATH) price of an LP pair in one call?

No, you have to query the API over smaller time ranges or block ranges and then aggregate the data on your local machine.
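
For example, here is a sketch of one slice of such a query, assuming the where clause accepts a Block Time filter with since/till bounds; the dates are placeholders. Run it once per window (for example per month), shift the window, and repeat until the whole period is covered.

{
	EVM(dataset: archive, network: eth) {
		DEXTrades(
			limit: { count: 25000 }
			orderBy: { ascending: Block_Time }
			where: {
				# one slice of the full period; shift since/till and repeat (placeholder dates)
				Block: { Time: { since: "2023-01-01T00:00:00Z", till: "2023-02-01T00:00:00Z" } }
				Trade: {
					Buy: { Currency: { SmartContract: { is: "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2" } } }
					Sell: { Currency: { SmartContract: { is: "0x8390a1DA07E376ef7aDd4Be859BA74Fb83aA02D5" } } }
				}
			}
		) {
			Trade {
				Buy {
					Price
				}
			}
			Block {
				Time
				Number
			}
		}
	}
}

The all-time high is then simply the largest Buy Price seen across all slices, computed on your side.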
