Description
For large models (~13K parameters), the following procedure takes an incredible amount of memory:

```r
library(magrittr)

temp_rds_file <- tempfile(fileext = ".rds")
cmdstanr_fit$save_object(file = temp_rds_file)
standalone_fit <- readRDS(temp_rds_file)
```
When loaded into memory, the `cmdstanr_fit` object is around ~8 GB, and the operation peaks at 54 GB.
This could be solved by handling the `.csv` draws on-disk using DuckDB (perhaps toggled with a parameter?).
Here is an example of CSV handling with DuckDB:
https://stackoverflow.com/questions/77797976/working-with-large-csv-file-using-duckdb-or-arrow-in-r
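To illustrate, a minimal sketch of the on-disk approach using the `duckdb` and `DBI` R packages. The file name `output-1.csv` stands in for a CmdStan output file and is an assumption, as is the `lp__` column; note that CmdStan CSVs contain `#` comment lines, which may need to be stripped before DuckDB can parse the file:

```r
# Sketch: query a large draws CSV on-disk with DuckDB instead of
# materialising it in R's heap. Assumes 'output-1.csv' is a CmdStan
# output file with its '#' comment lines already removed
# (e.g. via `grep -v '^#'`).
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())

# Register the CSV as a lazy view; nothing is loaded into memory yet.
dbExecute(con, "
  CREATE VIEW draws AS
  SELECT * FROM read_csv_auto('output-1.csv')
")

# Aggregations run in DuckDB's streaming engine, not in R,
# so peak memory stays far below the size of the draws.
mean_lp <- dbGetQuery(con, 'SELECT avg("lp__") AS mean_lp FROM draws')

dbDisconnect(con, shutdown = TRUE)
```

A `save_object()` parameter could plausibly switch between the current in-memory `readRDS()` path and a view like this over the fit's CSV files.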