Description
I'm working on moving as much of my server workload over to Functions as I can. There is one user-facing HTTP GET in particular that requires parsing a bunch of images and returning some data. On the web server this can be done pretty quickly (a Parallel.For loop to the rescue), even when multiple users call the service concurrently, but it requires a pretty beefy server to do the work.
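For context, the per-request work on the server looks roughly like this pattern: parse each image in parallel and collect the results. A minimal sketch in Python (the original is C# with Parallel.For; `parse_image` and `parse_all` are hypothetical stand-ins for the real per-image work):

```python
from concurrent.futures import ThreadPoolExecutor

def parse_image(image_bytes):
    # Placeholder for the real per-image parsing work.
    return len(image_bytes)

def parse_all(images):
    # Fan the per-image work out across a thread pool,
    # analogous to a Parallel.For over the images.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(parse_image, images))
```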
When I moved this over to Functions, I broke it up so that one function (F(a)) spawns multiple functions to do the image work (F(b)), which then return their results to F(a) (and then to the customer).
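The fan-out/fan-in shape of that can be sketched as follows (illustrative Python only, not the actual Functions code; `f_a` and `f_b` are stand-ins for the two functions):

```python
from concurrent.futures import ThreadPoolExecutor

def f_b(image):
    # Stand-in for one F(b) invocation doing the image work.
    return {"image": image, "parsed": True}

def f_a(images):
    # F(a): fan out one F(b) per image, then gather all the
    # results before responding to the customer.
    with ThreadPoolExecutor(max_workers=len(images) or 1) as pool:
        futures = [pool.submit(f_b, img) for img in images]
        return [f.result() for f in futures]
```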
The issue I'm noticing is that when two customers call F(a) nearly concurrently, the first call returns fairly quickly, but the second call takes the first call's time plus the time to run the function for itself.
Poking through the timings, this is what I'm seeing in my test cases:
Call 1 to F(a):
2017-04-12T21:27:30.663 Function started (Id=d28fd004-6ba8-49d5-8990-4a9bb5249bc2)
2017-04-12T21:27:36.821 Function completed (Success, Id=d28fd004-6ba8-49d5-8990-4a9bb5249bc2)
Call 2 (initiated about 100 ms after Call 1):
2017-04-12T21:27:37.114 Function started (Id=389a6ccb-396c-4e85-8eb8-4d757b6c1bd5)
2017-04-12T21:27:43.136 Function completed (Success, Id=389a6ccb-396c-4e85-8eb8-4d757b6c1bd5)
I'd expect that Call 2, since it started 100 ms later, would have its function start roughly 100 ms later and run in parallel with the first. Adding more calls just continues this serialization.
Here is what it looks like from the customer's point of view:
Is this expected, and is it possible to force functions to run in parallel?
I should note that when F(a) starts multiple F(b) invocations, all of the F(b) calls do run in parallel.