The performance in this use case is extremely sensitive to chunking. Can you share how your data are chunked on disk?
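One way to answer "how are the data chunked on disk?" is to inspect the file with h5py, assuming the wind file is NetCDF-4/HDF5 (the thread does not say which format is used). The file name, variable name `u100`, and the chunk shape below are all hypothetical stand-ins:

```python
# Hedged sketch: inspecting per-variable chunking in an HDF5/NetCDF-4 file.
# The dataset built here is a small stand-in; the chunk shape (24, 10, 10)
# is hypothetical, not taken from the thread.
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), "demo.nc")

# Create a stand-in file with an explicitly chunked variable.
with h5py.File(path, "w") as f:
    f.create_dataset("u100", shape=(8760, 10, 10), dtype="f4",
                     chunks=(24, 10, 10))  # hypothetical chunk shape

# Reopen and report how the variable is chunked on disk.
with h5py.File(path, "r") as f:
    chunks = f["u100"].chunks
print(chunks)
```

The same information is also visible from the command line with `ncdump -hs file.nc` (look for the `_ChunkSizes` attribute).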
I want to perform 8760 look-ups (one per hour of the year) for a single lat/lon combination in under a second, from a 43.82 GB file of wind data containing:
The best time we achieved for a single-year look-up was 16 seconds for both the u100 and v100 wind-speed vectors at 100 m. We want a sub-second look-up for the whole year, since this file read will happen on every user request in our API.
Output:
I would be very thankful for any help!
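The core issue the reply points at can be illustrated with a toy example. The sketch below uses a plain numpy memmap as a stand-in for the real file (the actual format and library are not stated in the thread): when the time axis is the fastest-varying axis on disk, reading all 8760 hourly values for one grid cell touches a single contiguous run of bytes instead of 8760 scattered locations.

```python
# Toy illustration of why on-disk layout/chunking dominates per-point
# time-series reads. Sizes are tiny stand-ins for the real grid.
import os
import tempfile

import numpy as np

ntime, nlat, nlon = 8760, 10, 10  # hypothetical toy sizes

path = os.path.join(tempfile.mkdtemp(), "wind.dat")
data = np.arange(ntime * nlat * nlon, dtype=np.float32).reshape(nlat, nlon, ntime)
data.tofile(path)  # stored with time as the fastest-varying (contiguous) axis

# Reading one full year at a single (lat, lon) is now one contiguous read.
mm = np.memmap(path, dtype=np.float32, mode="r", shape=(nlat, nlon, ntime))
series = np.array(mm[3, 4])  # all 8760 hourly values for one grid cell
print(series.shape)
```

If the real file is NetCDF/HDF5 or Zarr, the analogous fix is to rechunk it so each chunk spans the full time dimension for a small lat/lon tile (e.g. chunks of `(8760, 8, 8)`), which turns a per-point yearly read into a handful of chunk fetches.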