Closed
Description
Error:
2024-06-06T04:57:33.665329Z ERROR async_openai::error: failed deserialization of: {
"id": "file-cYrRWFomydf8Ng9gvqs4zWBD",
"object": "vector_store.file",
"usage_bytes": 0,
"created_at": 1717649854,
"vector_store_id": "vs_gN12Na14YvsPhjPPxS91WX3b",
"status": "in_progress",
"last_error": null,
"chunking_strategy": {
"type": "static",
"static": {
"max_chunk_size_tokens": 800,
"chunk_overlap_tokens": 400
}
}
}
Failing code:
client
    .files()
    .create(CreateFileRequest {
        file: FileInput::from_vec_u8("meoww.txt".into(), memory.clone().into_bytes()),
        purpose: FilePurpose::Assistants,
    })
    .await
    .expect("Failed to upload memory as file!");
Looks like the issue is here:
/// Static Chunking Strategy
#[derive(Clone, Serialize, Debug, Deserialize, PartialEq, Default)]
pub struct StaticChunkingStrategy {
    /// The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`.
    max_chunk_size_tokens: u16,
    /// The number of tokens that overlap between chunks. The default value is `400`.
    ///
    /// Note that the overlap must not exceed half of `max_chunk_size_tokens`.
    chunk_overlap_tokens: u16,
}
I'm new to Rust, but it seems like the max_chunk_size_tokens and chunk_overlap_tokens fields may need to be pub?