Closed
5.0 introduces a per-node limit on the rate of inline script compilations, which should help catch the anti-pattern of embedding script parameters in the scripts themselves. I wonder whether it is worth adding a master-only limit on the rate of index creation, to catch situations where a misconfigured input system ends up creating thousands of indexes in quick succession. Such a rate limit would make index creation fail with a useful error message, producing back pressure in any upstream queueing system. I think this would be better than simply creating thousands of indexes as fast as we can.
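To make the idea concrete, here is a minimal sketch of what such a limit might look like: a token-bucket rate limiter consulted before each index creation, which rejects the request with an actionable error once the rate is exceeded. All names and parameters here are illustrative assumptions, not actual Elasticsearch APIs or settings.

```python
import time


class IndexCreationRateLimiter:
    """Hypothetical token-bucket limiter for index-creation requests.

    The bucket holds up to `max_per_minute` tokens and refills
    continuously; each index creation consumes one token.
    """

    def __init__(self, max_per_minute=30):
        self.capacity = float(max_per_minute)
        self.tokens = self.capacity
        self.refill_per_sec = max_per_minute / 60.0
        self.last = time.monotonic()

    def try_acquire(self):
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


def create_index(limiter, name):
    """Illustrative create-index path guarded by the limiter."""
    if not limiter.try_acquire():
        # Failing fast with a useful message lets clients back off and
        # retry, which produces back pressure in any upstream queue
        # instead of silently creating thousands of indexes.
        raise RuntimeError(
            f"index [{name}] not created: index-creation rate limit "
            "exceeded; check for a misconfigured input system creating "
            "an index per event"
        )
    return f"created [{name}]"
```

The point of the sketch is the failure mode: the request is rejected immediately rather than queued, so a runaway input system sees errors after the first burst instead of the cluster absorbing every creation as fast as it can.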
Is this a good idea or a horrible idea?