Description
I should like to see enhancements in the MLJ ecosystem that allow models to work with out-of-memory data, and wonder if DataLoaders might be a good tool here. The main issue, as far as I can tell, is around observation resampling, which is the basis of performance evaluation, and by corollary, hyper-parameter optimization - meta-algorithms that can be applied to all current MLJ models.
So, I wondered if it would be possible for this package to implement `getindex(::DataLoader, indxs)`, for `indxs isa Union{Colon,Integer,AbstractVector{<:Integer}}`, returning an object with the same interface. This could be a new `SubDataLoader` object, but in any case it would be important for the original `eltype` to be knowable (assuming `eltype` is implemented for the original object, or you add it as a type parameter).
Since the `DataLoader` type already requires the implementation of the "random access" `getobs` method, this looks quite doable to me.
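For concreteness, here is a rough sketch of the kind of slicing behaviour I have in mind. The names `MyDataLoader` and `SubDataLoader` are placeholders, not this package's actual types, and the `getobs`/`nobs` methods below are simplified stand-ins for the real observation-access interface:

```julia
# Hypothetical sketch only -- `MyDataLoader` and `SubDataLoader` are stand-ins,
# not the real DataLoaders.jl types. The point is just to show `getindex`
# returning an object that supports the same observation interface.

struct MyDataLoader{T}
    data::T              # any container supporting getobs/nobs-style access
end

struct SubDataLoader{T}
    parent::MyDataLoader{T}
    indices::Vector{Int} # the resampled observation indices
end

# observation access for the toy loader (here: rows of a matrix)
getobs(dl::MyDataLoader, i) = dl.data[i, :]
nobs(dl::MyDataLoader) = size(dl.data, 1)

# the proposed slicing behaviour:
Base.getindex(dl::MyDataLoader, idxs::AbstractVector{<:Integer}) =
    SubDataLoader(dl, collect(idxs))
Base.getindex(dl::MyDataLoader, ::Colon) = SubDataLoader(dl, collect(1:nobs(dl)))
Base.getindex(dl::MyDataLoader, i::Integer) = SubDataLoader(dl, [i])

# the sub-loader exposes the same interface, re-indexing lazily
getobs(s::SubDataLoader, i) = getobs(s.parent, s.indices[i])
nobs(s::SubDataLoader) = length(s.indices)

# usage: a resampling strategy could then slice without materializing anything
loader = MyDataLoader(rand(100, 3))
train, test = loader[1:80], loader[81:100]
```

With something like this, MLJ's resampling machinery could pass around the sliced objects exactly as it does ordinary tables or arrays, and no copy of the underlying data would be needed.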
I realize that for large datasets (the main use case for `DataLoaders`) resampling is often a simple holdout. However, because `Holdout` is implemented as a special case of more general resampling strategies (`CV`, etc.), it would be rather messy to add `DataLoader` support for just that case without the slicing feature.
Would there be any sympathy among current developers for such an enhancement? Perhaps there is an alternative solution to my issue?
BTW, I don't really see how the suggestion in the docs to apply observation resampling before data is wrapped in a `DataLoader` could really work effectively in the MLJ context, as the idea is that resampling should remain completely automated. (It also seems from the documentation that this requires bringing the data into memory...?) But maybe I'm missing something there.