Closed
Description
In the module urllib.request, the function getproxies_environment (https://github.com/python/cpython/blob/main/Lib/urllib/request.py#L2493)
iterates over all environment variables twice.
I ran a test (a reproduction sketch follows the list below):
- create 2,500 environment variables
- call getproxies_environment() and time it
- result: getproxies_environment() takes more than 50 ms, which is significant overhead
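A minimal sketch of the kind of benchmark described above. The variable names and sizes here are assumptions for illustration, and the measured time will of course vary by machine:

```python
import os
import timeit
from urllib.request import getproxies_environment

# Populate the environment with 2,500 dummy variables to simulate a
# process with a very large environment (names/values are made up).
for i in range(2500):
    os.environ[f"DUMMY_VAR_{i}"] = "x" * 64

# Average the cost of a single call over 100 runs.
elapsed = timeit.timeit(getproxies_environment, number=100) / 100
print(f"getproxies_environment() took {elapsed * 1000:.2f} ms per call")
```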
So if a machine has many environment variables (e.g. 10,000), packages that depend on urllib.request.getproxies_environment (as requests does) will be very slow.
Are 10,000 environment variables realistic?
Yes. For example, in a Kubernetes cluster, the kubelet may inject a set of environment variables for every Service (which can number 10,000 or even 100,000) for the purpose of service discovery.
So would it be acceptable for getproxies_environment not to iterate over all environment variables?
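To illustrate the idea, here is a rough sketch of a direct-lookup approach (not the stdlib implementation, and not a drop-in replacement): it probes only a fixed list of well-known proxy variable names instead of scanning the whole environment. The scheme list is an assumption, and a real fix would still need to handle arbitrary `<scheme>_proxy` names and the REQUEST_METHOD/CGI safeguard that the current implementation provides:

```python
import os

# Assumed set of common proxy schemes; the real function accepts any
# variable ending in "_proxy", which this sketch does not cover.
_PROXY_SCHEMES = ("http", "https", "ftp", "all", "no")

def getproxies_environment_direct():
    """Look up only known proxy variables instead of scanning os.environ."""
    proxies = {}
    for scheme in _PROXY_SCHEMES:
        # Prefer the lowercase form, then fall back to the uppercase form.
        value = (os.environ.get(f"{scheme}_proxy")
                 or os.environ.get(f"{scheme.upper()}_PROXY"))
        if value:
            proxies[scheme] = value
    return proxies

print(getproxies_environment_direct())
```

This trades the generality of the suffix scan for a constant number of dictionary lookups, which is why it stays fast regardless of how many environment variables exist.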