When dumping a large table it would be nice if the tool wrote the CSV file incrementally, so that, say, every 1000 dumped rows get appended to the file and external tools can already process the partial data.
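Roughly what I have in mind, as a minimal illustration (this is not sqlmap code; the function and constant names are made up for the example):

```python
import csv

CHUNK_SIZE = 1000  # illustrative: flush to disk every 1000 dumped rows

def dump_incrementally(rows, path):
    """Append dumped rows to a CSV file in chunks so partial data is readable."""
    buffer = []
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for row in rows:
            buffer.append(row)
            if len(buffer) >= CHUNK_SIZE:
                writer.writerows(buffer)
                f.flush()          # make the partial dump visible to external tools
                buffer.clear()
        if buffer:                 # write whatever is left at the end of the run
            writer.writerows(buffer)
            f.flush()
```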
Just to leave a meaningful comment here: sqlmap post-processes the data at the end of the run (e.g. in case of hash cracking, the cracked password's plaintext is appended to the retrieved entry). This basically means that you can't expect the content of an every-<x>-rows CSV file to be in the same format as the final output file. If you could live with some raw CSV, I could do something about it.
Just as a follow-up: I ended up using the --start/--stop options to force sqlmap to dump the contents in chunks. It's not optimal, but it's very close to it (I can live with that).
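For anyone else hitting this, a rough sketch of that workaround: loop over row ranges and call sqlmap once per batch with --start/--stop, so each batch can be post-processed before the next one finishes. The target URL, database/table names, and total row count below are placeholders.

```python
import subprocess

ROWS_PER_BATCH = 1000   # rows dumped per sqlmap invocation
TOTAL_ROWS = 50000      # placeholder: known or estimated table size

# Placeholder target and table names -- replace with the real ones.
BASE_CMD = [
    "sqlmap", "-u", "http://target.example/vuln.php?id=1",
    "-D", "testdb", "-T", "users", "--dump", "--batch",
]

for start in range(1, TOTAL_ROWS + 1, ROWS_PER_BATCH):
    stop = start + ROWS_PER_BATCH - 1
    # Dump only rows [start, stop] in this run; external tooling can
    # process each batch from sqlmap's output directory as it lands.
    subprocess.run(BASE_CMD + [f"--start={start}", f"--stop={stop}"], check=True)
```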