
Issue in multiple appends in one transaction #1946


Open
1 of 3 tasks
dor-bernstein opened this issue Apr 23, 2025 · 4 comments · May be fixed by #1961

Comments

@dor-bernstein

Apache Iceberg version

None

Please describe the bug 🐞

Hey,
I have a large Arrow table that I want to append to a partitioned Iceberg table.
I'm working locally with Docker, using tabulario/iceberg-rest:1.6.0 as my REST catalog.
To avoid OOMs, I split the Arrow table into chunks. With regular appends everything works as expected, but I want to append all the data in a single transaction. This is the code that does that:

with table.transaction() as tx:
    for offset in range(0, data.num_rows, MAX_APPEND_CHUNK_SIZE):
        data_slice = data.slice(offset, MAX_APPEND_CHUNK_SIZE)
        logger.info(f'Writing batch of {data_slice.num_rows} with offset {offset} to table {table.name()}')
        tx.append(data_slice)
    tx.commit_transaction()

The table is empty and was created in a different task.
I get the following error: CommitFailedException: Requirement failed: branch main was created concurrently.
When retrying, I get: pyiceberg.exceptions.CommitFailedException: CommitFailedException: Requirement failed: branch main has changed: expected id 4547037169132709864 != 132570956257248456.

Any help would be appreciated,
Thanks!

Willingness to contribute

  • I can contribute a fix for this bug independently
  • I would be willing to contribute a fix for this bug with guidance from the Iceberg community
  • I cannot contribute a fix for this bug at this time
@Fokko
Contributor

Fokko commented May 2, 2025

Hey @dor-bernstein, thanks for raising this. Do you have the full stack trace by any chance?

I see in the code that you probably commit twice:

with table.transaction() as tx:
    for offset in range(0, data.num_rows, MAX_APPEND_CHUNK_SIZE):
        data_slice = data.slice(offset, MAX_APPEND_CHUNK_SIZE)
        logger.info(f'Writing batch of {data_slice.num_rows} with offset {offset} to table {table.name()}')
        tx.append(data_slice)
    tx.commit_transaction()  # <-- this will commit
# <-- leaving the context manager will also trigger a commit
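For reference, a minimal sketch of the same loop without the double commit, assuming the explicit tx.commit_transaction() is simply dropped and the context manager is left to commit exactly once on exit (MAX_APPEND_CHUNK_SIZE and logger are the names from the snippet above):

with table.transaction() as tx:
    for offset in range(0, data.num_rows, MAX_APPEND_CHUNK_SIZE):
        data_slice = data.slice(offset, MAX_APPEND_CHUNK_SIZE)
        logger.info(f'Writing batch of {data_slice.num_rows} with offset {offset} to table {table.name()}')
        tx.append(data_slice)
# no explicit commit here -- exiting the context manager performs the single commit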

Please let me know, it would be great to get this fixed 🚀

Fokko added a commit to Fokko/iceberg-python that referenced this issue May 2, 2025
Fokko linked a pull request May 2, 2025 that will close this issue
@dor-bernstein
Author

@Fokko Can I leave the context manager without triggering a commit? I'd prefer to be more explicit.
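For the sake of discussion, a minimal sketch of the fully explicit style, assuming a Transaction can be driven without a with block and committed manually (not verified against the current PyIceberg API; names reuse the snippet above):

tx = table.transaction()
for offset in range(0, data.num_rows, MAX_APPEND_CHUNK_SIZE):
    data_slice = data.slice(offset, MAX_APPEND_CHUNK_SIZE)
    tx.append(data_slice)
# single, explicit commit; no context manager involved
tx.commit_transaction()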

@Fokko
Contributor

Fokko commented May 6, 2025

Hey @dor-bernstein, if there are any updates left when you exit the context manager, it will still try to process them.

#1961 will also allow you to commit within the context manager. Once that PR is in, the code above will pass without any error. Let me know what you think.

@dor-bernstein
Author

@Fokko looks good, thanks for handling this so quickly!
