java - BigQuery - deciphering 'quota exceeded' message -
I'm running into this message, and I'm not clear on which of the many quotas I'm exceeding. The process has:
- 80 threads (spread on 8 machines)
- < 50 records / insert
- ~5 KB / record
- 1 sec delay / insert
- inserts into ~100 different tables (depending on the specific record; records for the same table are grouped together)
To me this is:
- < max row size (1 MB)
- < max rows / second (100k / table, 1M / project)
- < max rows / request (~500)
- < max bytes / second (100 MB)
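A back-of-the-envelope check of the numbers above (a sketch; the constants simply mirror the setup described in the question):

```java
// Sanity-check the streaming-style throughput limits quoted above.
public class QuotaMath {
    static final int THREADS = 80;                     // 80 threads across 8 machines
    static final int ROWS_PER_INSERT = 50;             // < 50 records / insert
    static final int BYTES_PER_ROW = 5 * 1024;         // ~5 KB / record
    static final int INSERTS_PER_SEC_PER_THREAD = 1;   // 1 sec delay / insert

    static long rowsPerSecond() {
        return (long) THREADS * ROWS_PER_INSERT * INSERTS_PER_SEC_PER_THREAD;
    }

    static long bytesPerSecond() {
        return rowsPerSecond() * BYTES_PER_ROW;
    }

    public static void main(String[] args) {
        System.out.println("rows/sec:  " + rowsPerSecond());  // 4000 - well under 100k/table
        System.out.println("bytes/sec: " + bytesPerSecond()); // ~20 MB - under 100 MB
    }
}
```

So the per-second throughput limits really do look fine; the math in the question checks out.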
I'm looking at the output of `bq --project <proj name> ls -j -a`, which gives me the jobs and their success/fail status. Here are the results using `bq --project <proj name> show -j <jobid>`.
The error output has these lines:
"status": { "errorresult": { "location": "load_job", "message": "quota exceeded: project exceeded quota imports per project. more information, see https://cloud.google.com/bigquery/troubleshooting-errors", "reason": "quotaexceeded" }, "errors": [ { "location": "load_job", "message": "quota exceeded: project exceeded quota imports per project. more information, see https://cloud.google.com/bigquery/troubleshooting-errors", "reason": "quotaexceeded" } ],
Any suggestions on where else to look? Am I doing the math wrong? Is there perhaps a better way to organize the threads / data?
This looks related to the load-job quotas:
- Daily limit: 1,000 load jobs per table per day (including failures), 10,000 load jobs per project per day (including failures)
Row and cell size limits:
- CSV: 2 MB (row and cell size)
- JSON: 2 MB (row size)
- Avro: 16 MB (block size)
- Maximum size per load job: 12 TB across all input files for CSV and JSON
- Maximum number of files per load job: 10,000
Most likely it's the daily limit you're hitting.
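To see why the daily load-job limit is the likely culprit: the error says "load_job", so if every insert in the setup above is issued as a separate load job, the job count explodes (a sketch, reusing the numbers from the question):

```java
// If each insert is a separate load job, how fast is the daily quota burned?
public class LoadJobMath {
    static final int THREADS = 80;                   // 80 threads across 8 machines
    static final int JOBS_PER_SEC_PER_THREAD = 1;    // one insert (= one load job) per second
    static final int PROJECT_DAILY_LIMIT = 10_000;   // load jobs per project per day

    static long jobsPerDay() {
        // 86,400 seconds in a day
        return (long) THREADS * JOBS_PER_SEC_PER_THREAD * 86_400;
    }

    static long secondsUntilQuotaHit() {
        // project-wide limit divided by the aggregate job rate
        return PROJECT_DAILY_LIMIT / ((long) THREADS * JOBS_PER_SEC_PER_THREAD);
    }

    public static void main(String[] args) {
        System.out.println("load jobs/day:        " + jobsPerDay());           // 6,912,000
        System.out.println("quota exhausted after " + secondsUntilQuotaHit() + " s"); // 125 s
    }
}
```

At 80 load jobs per second the 10,000/day project quota is gone in about two minutes. The fix is either to batch far more records into far fewer load jobs, or to switch to the streaming API (`tabledata.insertAll`), which is governed by the streaming limits quoted in the question rather than the load-job quota.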