Description
It may be that your "simple map/reduce" doesn't terminate. Liberal use
of the log() function will likely help you.
Did you try calling the view without the reduce step to check that at
least the map works (pass reduce=false as a query parameter)? Did you
take rereduce into account in your reduce function?
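A common mistake is a reduce function that ignores the rereduce flag. Here is a minimal sketch of an instance-count reduce that handles both phases; in a real design document only the function body would appear as the "reduce" field, but it is written as a standalone, named function (countReduce is a hypothetical name for illustration) so it can be exercised directly:

```javascript
// Minimal sketch of an instance-count reduce that handles rereduce.
function countReduce(keys, values, rereduce) {
  if (rereduce) {
    // values are partial counts produced by earlier reduce passes:
    // they must be summed, not counted, or the total will be wrong.
    return values.reduce(function (a, b) { return a + b; }, 0);
  }
  // First pass: values are raw map emissions, so counting them suffices.
  return values.length;
}

// First-pass reduce over three emitted values:
console.log(countReduce(null, [1, 1, 1], false)); // 3
// Rereduce over two partial counts:
console.log(countReduce(null, [3, 4], true));     // 7
```

If the rereduce branch counted instead of summing, totals would be wrong as soon as CouchDB split the reduction across b-tree nodes, which is exactly the kind of bug that only shows up on larger views.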
As for load, many people, myself included, have DBs thousands of times
bigger than yours, both in size and document count.
On Mon, Aug 2, 2010 at 9:51 AM, Serge in Darkness bolter.fire@gmail.com wrote:
I'm trying to get view results from CouchDB and encounter what seems
like a timeout - at the end of the file I see:

Server: MochiWeb/1.0 (Any of you quaids got a smint?)
Date: Mon, 02 Aug 2010 09:37:45 GMT
Content-Type: text/plain;charset=utf-8
Content-Length: 85

{"error":"EXIT","reason":"{noproc,{gen_server,call,[<0.64.0>,{pread_bin,1401793}]}}"}
I'm using curl to access views and see that every download takes exactly
1 minute 40 seconds, and it ends with this error regardless of the file
size (it's around 600 kB). So I think it's related to some timeout on
serving JSON. I've increased os_process_timeout and set reduce_limit to
false in the config, and it doesn't help.

What's happening? I thought CouchDB could handle very big datasets. The
database is only 2.5 MB, and the view is a simple map/reduce instance
count. CouchDB is 1.0.