feat: deploying large contracts (loader + blob support) #1472
Open
segfault-magnet wants to merge 35 commits into `master` from `feat/chunked_contract_deploy_w_blobs`
Conversation
Br1ght0ne requested changes on Jul 25, 2024:
Nothing but nits/typos - exceptional work. Still woozy after that asm, but I've never seen an asm block so well commented. LFG!
Br1ght0ne added a commit to FuelLabs/fuel-specs that referenced this pull request on Jul 25, 2024:
Found while reviewing FuelLabs/fuels-rs#1472.
Co-authored-by: Oleksii Filonenko <[email protected]>
Br1ght0ne previously approved these changes on Jul 27, 2024
Salka1988 previously approved these changes on Jul 28, 2024
MujkicA previously approved these changes on Jul 29, 2024
MujkicA previously approved these changes on Jul 31, 2024
Salka1988 previously approved these changes on Jul 31, 2024
hal3e previously approved these changes on Jul 31, 2024
Br1ght0ne previously approved these changes on Aug 2, 2024
closes: #1470
Let me know what you think of the interface. I haven't written docs or examples yet; I'll add them once we finalize the design.

The PR is blocked until the work in `fuel-core` and `fuel-vm` lands.

Highest-level API:

This will split the contract into blobs of 100 kB each, submit all of them, then craft a loader contract that will, when deployed and called, load all the blobs into memory and jump to the start of that memory, effectively running the `huge_contract`. I've run all our e2e tests while forcing contract deployment to happen via blobs and couldn't find any issue with contract behavior.
Lower-level API

If the user wants total control over how the contract is deployed, they can generate a loader contract from the ids of blobs they previously submitted and deploy it:
Estimating max blob size

This is a bit tricky. Blob sizes are limited by two things: the maximum transaction size and the maximum transaction gas (both part of the consensus parameters). A blob can be as big as you want as long as the overall tx respects these two global limits.

That means properly estimating the maximum blob a user can send is tricky. Funding the tx with 2 coins instead of 1 changes the answer; using a predicate instead of a coin changes it as well. Basically, whatever impacts the tx's size or gas usage will impact how big the blob can be.

Say you put in a blob of 20 kB and fund the tx; it turns out you have size and gas to spare. You increase the blob a bit, but now a single coin can no longer cover the cost and you need two. The new blob size is no longer acceptable, since adding that coin took up some space. You remove some bytes, which lets you increase the blob again, but then you need two coins once more, and so on.
As a start, and until we figure out something more robust, a crude estimation is available:

Here the theoretical maximum is calculated as `max_tx_size - size_of_a_blob_tx`, so it accounts only for the storage limitation. e2e tests showed that staying under 95% of the max tx size works without issues; the remaining 5% leaves room for the space that inputs and witnesses might take. The docstrings (and future docs) explain the nature of this estimation to the user.
. So this accounts only for the storage limitation. e2e tests showed that keeping under 95% of the max tx size works without issues. The remainder 5% is room for space inputs and witnesses might take. The docstrings (and future docs) explain the nature of this estimation to the user.Checklist