
Item too large for tree #44

Open
Scorg opened this issue Apr 9, 2022 · 9 comments

Comments

@Scorg

Scorg commented Apr 9, 2022

Basically the title. Sorry for the lack of context; I don't know what else to add.

@maharmstone
Owner

It'd help if you pasted the console output here.

@Scorg
Author

Scorg commented Apr 9, 2022

Unfortunately I lost it, but there wasn't much. The program calculated checksums (the counter stopped at 1600% completion) and the next line was just "Item too large for tree." I also manually converted the drive, so I probably won't be able to reproduce it 1:1.

Could you give some pointers as to why this can happen? I'm not very knowledgeable about btrfs, and there aren't many comments in the code.

Thanks
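
For background: in btrfs, every metadata item (an inode item, a directory entry, inline file data, a run of checksums, and so on) has to fit inside a single leaf block of a metadata tree, so a converter that builds an item whose payload exceeds roughly the node size minus the leaf and item headers has nowhere to put it. The snippet below is a minimal sketch of that constraint, assuming the default 16 KiB node size and the documented on-disk header sizes; it is illustrative only, not the actual ntfs2btrfs code.

// Minimal sketch of the btrfs item-size constraint (assumed defaults;
// not the actual ntfs2btrfs source). A leaf of nodesize bytes holds a
// fixed block header plus, for each item, a 25-byte item header and the
// item payload, so no single payload can exceed
// nodesize - leaf_header - item_header.
#include <cstdint>
#include <stdexcept>
#include <string>

constexpr uint32_t nodesize        = 16384; // default btrfs node size
constexpr uint32_t leaf_header_len = 101;   // on-disk btrfs_header size
constexpr uint32_t item_header_len = 25;    // on-disk btrfs_item size

static void check_item_fits(uint32_t payload_len) {
    constexpr uint32_t max_payload = nodesize - leaf_header_len - item_header_len;
    if (payload_len > max_payload)
        throw std::runtime_error("item payload of " + std::to_string(payload_len) +
                                 " bytes exceeds " + std::to_string(max_payload) +
                                 " bytes - too large for one tree leaf");
}

int main() {
    check_item_fits(300);   // a typical inode item easily fits
    check_item_fits(20000); // an oversized payload throws
}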

@FelicitusNeko

FelicitusNeko commented Oct 4, 2022

Here's an output with this line:

Not using compression.
Using SHA256 for checksums.
\$RECYCLE.BIN\[snipped Windows user ID]\$R6RCKKF.png: Skipping compressed ADS :Zone.Identifier
\$RECYCLE.BIN\[snipped Windows user ID]\$RCMFLDY.png: Skipping compressed ADS :Zone.Identifier
\$RECYCLE.BIN\[snipped Windows user ID]\$RCNJ5R5.png: Skipping compressed ADS :Zone.Identifier
Processing inode 139414 / 139414 (100.0%)
Mapped 109453 inodes directly.
Rewrote 273 inodes.
Inlined 8702 inodes.
Updating directory sizes
Calculating checksums 537159476 / 537159476 (100.0%)
Item too large for tree.

The volume still shows as NTFS, even after disconnecting/reconnecting it.

@maharmstone
Owner

Thanks @FelicitusNeko. Does this still happen if you choose CRC32 instead of SHA256?
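
(Presumably the thinking behind this question: btrfs keeps one checksum per 4 KiB sector, packed together into checksum items, so a 32-byte SHA256 checksum fills a single item eight times faster than a 4-byte CRC32C. Rough back-of-the-envelope arithmetic below, reusing the assumed sizes from the sketch above; the exact cut-offs in the real format differ, this only shows the ratio. As the later comments show, the same error occurred with CRC32C as well.)

// Back-of-the-envelope comparison of how quickly CRC32C vs SHA256
// checksums fill one item (assumed sizes, not taken from the
// ntfs2btrfs source): one checksum per 4 KiB sector, packed into a
// single item payload.
#include <cstdint>
#include <cstdio>
#include <initializer_list>

int main() {
    constexpr uint32_t max_payload = 16384 - 101 - 25; // as in the sketch above
    constexpr uint32_t sector      = 4096;

    for (uint32_t csum_len : {4u /* CRC32C */, 32u /* SHA256 */}) {
        uint64_t sectors_per_item = max_payload / csum_len;
        uint64_t bytes_covered    = sectors_per_item * sector;
        std::printf("%2u-byte checksum: one item covers about %llu MiB of data\n",
                    (unsigned)csum_len, (unsigned long long)(bytes_covered >> 20));
    }
}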

@FelicitusNeko

I'll have to give that a try. However, it's a decently large volume, and the process took almost 24 hours the first time around, so it might be a couple of days before I can start that up again.

@FelicitusNeko

I have given it a try with CRC32C. The output was identical to the above, except for "Using CRC32C for checksums."

@FelicitusNeko

FelicitusNeko commented Oct 7, 2022

I've tried a few things:

  • Removing some of the largest files
  • Removing some of the files with the longest path lengths
  • Reducing the total number of files in general (mostly by deleting some redundant duplicates or compressing into 7z archives)
  • Defragmenting and consolidating free space

Nothing has worked so far, and "Item too large for tree" is too ambiguous an error message for me to know whether there's anything I can do about it. :/

@clo-yunhee

clo-yunhee commented Oct 4, 2023

I know this issue is old, but I'm running into the same problem here.

@barkoder

barkoder commented Jan 12, 2024

The NTFS volume is unmounted.

Then I did

$ ntfs2btrfs /dev/sdd1

I got

Using Zstd compression.
Using CRC32C for checksums.
Processing inode 232297 / 232297 (100.0%)
Mapped 184148 inodes directly.
Rewrote 0 inodes.
Inlined 30767 inodes.
Updating directory sizes
Calculating checksums 144091000 / 72045922 (200.0%)
Item too large for tree.

Also, for some reason, the checksum progress counter reached 200%.

$ ntfs2btrfs --version
ntfs2btrfs 20230501
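
(A progress figure of almost exactly 200% usually means the numerator and denominator are counted in different units, or that each unit is counted twice. Purely as a hypothetical illustration with made-up numbers, not taken from the ntfs2btrfs source: if the total were computed in 8 KiB clusters but progress advanced once per 4 KiB sector, the counter would land on 200%.)

// Hypothetical illustration of a ~200% progress counter: the total is
// counted in 8 KiB clusters while progress advances per 4 KiB sector.
// All numbers are made up; this is not taken from the ntfs2btrfs source.
#include <cstdint>
#include <cstdio>

int main() {
    constexpr uint64_t cluster    = 8192;
    constexpr uint64_t sector     = 4096;
    constexpr uint64_t data_bytes = 590'200'000'000; // hypothetical data size

    uint64_t total    = data_bytes / cluster; // denominator in clusters
    uint64_t progress = data_bytes / sector;  // numerator in sectors

    std::printf("Calculating checksums %llu / %llu (%.1f%%)\n",
                (unsigned long long)progress, (unsigned long long)total,
                100.0 * (double)progress / (double)total); // prints 200.0%
}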
