ckan - DataPusher MemoryError when pushing large files to the DataStore


CKAN version if known (or site URL):
latest

Please describe the expected behavior:
Large files can be uploaded to the DataStore via the DataPusher.

Please describe the actual behavior:
Uploading normally sized files works.

However, when I try to upload large CSV files (750 MB, which is still not that large) to the DataStore via the DataPusher, I get this error:

_"job "push_to_datastore (trigger: runtriggernow, run = true, next run at: none)" raised exception traceback (most recent call last):   file "/usr/local/lib/python2.7/dist-packages/apscheduler/scheduler.py", line 512, in _run_job     retval = job.func(*job.args, **job.kwargs)   file "/usr/lib/ckan/datapusher/src/datapusher/datapusher/jobs.py", line 364, in push_to_datastore     f = cstringio.stringio(response.read())   file "/usr/lib/python2.7/socket.py", line 359, in read     return buf.getvalue() memoryerror"_ 

I increased the allowed memory/size limits in datapusher_settings.py, in the CKAN config file, and in the Nginx proxy cache settings, but I don't see what's wrong.

More details of what I changed (consolidated snippets after the list):

ckan.max_resource_size = 5120 in production.ini

MAX_CONTENT_LENGTH = 1024000000 in https://github.com/ckan/datapusher/blob/master/datapusher/jobs.py#L28

The cache size via `proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=cache:30m max_size=250m;`, plus `client_max_body_size 10000m;`, in the Nginx conf file.
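For reference, this is roughly how those settings sit in their respective files; the values are the ones listed above, but the file locations and surrounding context are assumptions about a typical install, not a copy of my exact files:

```
# /etc/ckan/default/production.ini (assumed standard CKAN config location)
ckan.max_resource_size = 5120

# Nginx site config (inside the http/server blocks of a typical reverse-proxy setup)
proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=cache:30m max_size=250m;
client_max_body_size 10000m;
```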

I really need this to work. Thanks in advance, twinko5

