Fix errors and warnings #5
base: main
Conversation
Really appreciate all the small changes with clippy, and for updating and fixing a bunch of small stuff.

How does mca compare to fastanvil in performance?
I quickly tried adding fastanvil to my existing benchmark against mca_parser (here). The benchmark is fairly simple: it just parses the region and gets the decompressed data from the 0,0 chunk.
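For reference, the shape of such a micro-benchmark can be sketched with a std-only timing harness. The real benchmark uses the mca/fastanvil crates, which aren't shown here; `parse_region` below is a hypothetical stand-in for "parse the region and decompress chunk (0, 0)":

```rust
use std::time::Instant;

// Hypothetical stand-in for the real parse-and-decompress step.
fn parse_region(data: &[u8]) -> Vec<u8> {
    // Placeholder work so the harness has something to measure.
    data.iter().map(|b| b.wrapping_add(1)).collect()
}

fn main() {
    let data = vec![0u8; 1 << 20]; // pretend region file (1 MiB of zeros)

    let iterations: u32 = 100;
    let start = Instant::now();
    for _ in 0..iterations {
        // black_box keeps the optimizer from eliding the measured work.
        std::hint::black_box(parse_region(std::hint::black_box(&data)));
    }
    let elapsed = start.elapsed();
    println!("avg per iteration: {:?}", elapsed / iterations);
}
```

In practice a harness like criterion handles warm-up and statistical noise for you; this is just the bare idea.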
I was finally able to test Sculk. Here's the code I'm using:

```rust
use anyhow::Result;
use mca::RegionReader;
use sculk::chunk::Chunk;
use tokio::{fs::File, io::AsyncReadExt};
use tokio_stream::{wrappers::ReadDirStream, StreamExt};

const PATH: &str = "...";

#[tokio::main]
async fn main() -> Result<()> {
    let read_dir = tokio::fs::read_dir(PATH).await?;
    let mut read_dir_stream = ReadDirStream::new(read_dir);

    while let Some(Ok(entry)) = read_dir_stream.next().await {
        let path = entry.path();
        let mut data = Vec::new();
        let mut file = File::open(&path).await?;
        file.read_to_end(&mut data).await?;
        println!("Path: {path:?}");

        let region = RegionReader::new(&data)?;
        for x in 0..32 {
            for z in 0..32 {
                let Some(raw_chunk) = region.get_chunk(x, z)? else {
                    println!("1");
                    continue;
                };
                let bytes = raw_chunk.decompress()?;
                let chunk = Chunk::from_bytes(&bytes)?;
                for section in chunk.sections {
                    let Some(block_states) = section.block_states else {
                        println!("3");
                        continue;
                    };
                    for palette in block_states.palette {
                        println!("{palette:?}");
                    }
                }
            }
        }
    }
    Ok(())
}
```
Is that from 2b2t lol? Anyhow, that error means the structures field is missing. My test region files from the same DataVersion (3955) always include a "structures" tag for every chunk, while your samples never have it. I guess that's a byproduct of upgrading old chunks into newer ones (instead of them being generated in newer versions from the start). Changing `pub structures: Structures,` into `pub structures: Option<Structures>,` in chunk/mod.rs would fix that, since apparently the tag just doesn't exist in these weird old-to-new chunks.
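The suggested `Option` change can be sketched like this. These are hypothetical, simplified stand-ins for sculk's types, not its real definitions:

```rust
// Hypothetical, simplified stand-ins for sculk's actual types.
#[derive(Debug)]
struct Structures {
    references: Vec<String>,
}

#[derive(Debug)]
struct Chunk {
    // Optional, since chunks upgraded from old versions may lack the tag.
    structures: Option<Structures>,
}

// Consumers then handle both cases explicitly instead of failing to parse.
fn describe(chunk: &Chunk) -> String {
    match &chunk.structures {
        Some(s) => format!("structures: {:?}", s.references),
        None => "no structures tag (upgraded chunk?)".to_string(),
    }
}

fn main() {
    let upgraded = Chunk { structures: None };
    let fresh = Chunk {
        structures: Some(Structures {
            references: vec!["minecraft:village".to_string()],
        }),
    };
    println!("{}", describe(&upgraded));
    println!("{}", describe(&fresh));
}
```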
Yeah, it's from the new world download from a few days ago. EDIT: Spoke too soon.
I'm not sure how to find the NBT to add to BrushableBlock. EDIT: This seems to be the last error; it finishes scanning without problems. I did have a decompression error once, but I can't replicate it anymore.
You can find the NBT data on the wiki here; this is from suspicious sand (but gravel would also work). Under "Data Values" and then "Block Data" you get all the data it has. If these small errors are all that came up from scanning a good real-world example, I'm pretty damn happy with that.
I got the error again
EDIT: I've just realized that BrushableBlock already exists as SuspiciousBlock, but do I make a generic BlockEntityKind::BrushableBlock/SuspiciousBlock? |
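One way to sketch the "generic variant" idea from the question above (hypothetical names and shapes, not sculk's real API): a single variant covers both block IDs, since suspicious sand and suspicious gravel share the same brushable-block NBT layout.

```rust
// Hypothetical sketch: `minecraft:suspicious_sand` and
// `minecraft:suspicious_gravel` map to one shared variant.
#[derive(Debug, PartialEq)]
struct SuspiciousBlock {
    // Which of the two block IDs this block entity belongs to.
    id: String,
}

#[derive(Debug, PartialEq)]
enum BlockEntityKind {
    SuspiciousBlock(SuspiciousBlock),
    // ...other variants...
}

fn parse_kind(id: &str) -> Option<BlockEntityKind> {
    match id {
        "minecraft:suspicious_sand" | "minecraft:suspicious_gravel" => Some(
            BlockEntityKind::SuspiciousBlock(SuspiciousBlock { id: id.to_string() }),
        ),
        _ => None,
    }
}

fn main() {
    // Both IDs resolve to the same variant.
    println!("{:?}", parse_kind("minecraft:suspicious_sand"));
    println!("{:?}", parse_kind("minecraft:suspicious_gravel"));
}
```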
This reverts commit e1e02ab.
Oh right, the suspicious block; yeah, that looks fine. And about the zlib error
I'm able to scan every chunk without failure; it seems to be random when it happens. I'll see if I can loop the program and find out whether it's the same chunk/region every time or not.
I ran it a few times and got these files, but when I rerun it on those files they work fine, so I think it's just random when it fails.
That is really weird that it's random. The only lines of code that touch zlib are

```rust
miniz_oxide::deflate::compress_to_vec_zlib(&data, 4)
// and
miniz_oxide::inflate::decompress_to_vec_zlib(&data)?
```

so it's likely something weird with your files, because it feels weird for this to just happen randomly. If you really wanted to, you could try and pull
After updating, I got a slightly different error message, but it's still not consistent.
I can just try again if it fails; chances are it won't fail twice. I found it: chunk.block_entities :)
Sure, retrying a failed chunk probably works for 99% of cases, but it working sometimes and not others is really annoying. I wonder, if you ran it with newly generated chunks, whether it would ever fail then, because I don't personally recall ever getting any of these errors. And just maybe it is the old-to-new chunks, but even then it's weird that it's random.
I'll generate a new large vanilla world with Chunky while I'm gone today and see if it happens on that. EDIT: It seems double-decompressing doesn't work; maybe the data coming from RegionReader is randomly invalid?
I myself have no intention of working on newer versions.
That doesn't sound right; how/when is a chunk ever double-compressed in the first place?
I mean retrying the decompress after it fails.
I tried with new 1.20.4 regions and it also happens, but the same code and regions work flawlessly for my friend, so I'm inclined to believe it's my CPU/RAM corrupting data. If I retry get_chunk and decompress when decompress fails, it usually works after a number of attempts, but the chunks following it also fail at the same rate; eventually it clears up when it reaches the next region. I also tried without the sculk code, which halves the load on my CPU and makes it entirely IO-bound, and then it almost never fails, only at a much, much lower rate (I added 10 TB to my SSD's reads before it failed). So this seems to be instability in my computer under heavy CPU load. This issue isn't sculk-related, and I think this PR is complete for now; if I decide to add 1.21.4 support I'll make a new PR.
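The retry-on-failure approach described above can be sketched as a small generic helper. This is a pure-std sketch; mca's actual error types and chunk APIs are not shown:

```rust
// Generic retry helper: run a fallible operation up to `max_attempts`
// times, returning the first success or the last error seen.
fn retry<T, E>(max_attempts: usize, mut op: impl FnMut() -> Result<T, E>) -> Result<T, E> {
    let mut last_err = None;
    for _ in 0..max_attempts {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) => last_err = Some(e),
        }
    }
    // Panics only if max_attempts == 0, which would be a caller bug.
    Err(last_err.expect("max_attempts must be > 0"))
}

fn main() {
    // Simulate a flaky decompression that fails twice, then succeeds.
    let mut failures_left = 2;
    let result = retry(5, || {
        if failures_left > 0 {
            failures_left -= 1;
            Err("transient decompress error")
        } else {
            Ok(vec![1u8, 2, 3])
        }
    });
    println!("{result:?}");
}
```

As noted in the thread, this papers over the symptom rather than the cause; if the underlying data is being corrupted by hardware, no amount of retrying makes the result trustworthy.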