Description
Original comment by @droberts195:
@dimitris-athanasiou tested scroll VS search_after on a @dolaru's qa 6-node cluster (though those instances are quite small, t2.medium)
- in this scenario data was pulled from a 5-shard index
- ~15M docs
- the scroll version took exactly 2 min 45 sec every single time
- the search_after version took ~3 min 3 sec on average
- that’s roughly a 10% slowdown with search_after
However, search_after does have benefits for ML, such as not being at risk of broken scroll contexts.
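
For context, here is a minimal sketch of the two access patterns being compared, assuming the elasticsearch-py 8.x client; the index name `ml-test`, the `timestamp` sort field, and the page size are placeholders for illustration, not taken from the benchmark above.

```python
# Sketch of scroll vs search_after pagination.
# Assumptions: elasticsearch-py 8.x, a local cluster, an index "ml-test"
# with a sortable "timestamp" field; none of these come from the benchmark.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
INDEX = "ml-test"   # hypothetical index name
PAGE = 1000


def fetch_with_scroll():
    """Scroll keeps a server-side context alive between pages."""
    resp = es.search(index=INDEX, scroll="2m", size=PAGE, query={"match_all": {}})
    try:
        while resp["hits"]["hits"]:
            for hit in resp["hits"]["hits"]:
                yield hit
            # Each call renews the scroll context; if it expires or the node
            # holding it is lost, the scroll is "broken" and iteration fails.
            resp = es.scroll(scroll_id=resp["_scroll_id"], scroll="2m")
    finally:
        es.clear_scroll(scroll_id=resp["_scroll_id"])


def fetch_with_search_after():
    """search_after is stateless: each page is an ordinary search that
    resumes from the sort values of the last hit of the previous page."""
    search_after = None
    while True:
        kwargs = {
            "size": PAGE,
            "query": {"match_all": {}},
            # Sort must be deterministic; if "timestamp" is not unique, a
            # tiebreaker field (or a point-in-time with _shard_doc) is needed.
            "sort": [{"timestamp": "asc"}],
        }
        if search_after is not None:
            kwargs["search_after"] = search_after
        resp = es.search(index=INDEX, **kwargs)
        hits = resp["hits"]["hits"]
        if not hits:
            break
        for hit in hits:
            yield hit
        search_after = hits[-1]["sort"]
```

The practical difference for ML datafeeds is that the scroll version fails outright if the scroll context expires or is lost, whereas the search_after version holds no server-side state and can simply re-run the last page.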