
Failure on writing large files #67

Open
@emillindq

Description

Hi and thank you for an amazing project!

I've noticed that master is broken when performance-testing by writing to a large file. The same problem also exists in v1.0.0, but it takes much longer to appear.

My scenario is the following:

  • 16 GB SD card
  • MBR with one 1 GB partition
  • Formatted on Ubuntu using fdisk and mkfs.ext4
  • Card inserted into a board running an ST MCU over the SDIO interface
  • Mounted by calling ext4_mbr_scan, ext4_mkfs_read_info, ext4_device_register, ext4_mount, ext4_recover, and ext4_journal_start
  • Opened a file with ext4_fopen2, then called ext4_fwrite continuously with a 32768-byte buffer

It works fine for a while, but after about 20 MB written, the block-device write function gets called with an address outside the MCU's SRAM. After a little digging, I found that size wraps around here, because block_size * fblock_count becomes larger than size.

lwext4/src/ext4.c

Line 1952 in 58bcf89

size -= block_size * fblock_count;

I've tried with the provided ext4_mbr_write and ext4_mkfs instead, but the problem is the same.
