feat: add dummy circuit to final extraction for indexing Data with no provable extraction (#415)
Conversation
mp2-v1/src/final_extraction/api.rs
Outdated
```rust
}
/// Instantiate inputs for the dummy circuit dealing with no provable extraction case
pub fn new_no_provable_input<PrimaryIndex: PartialEq + Eq + Default + Clone + Debug>(
    block_number: U256,
```
Can we rename this "primary index"?
And shouldn't this be of type PrimaryIndex? Or why do we have a primary index generic in that case?
Renamed block_number -> primary_index, block_hash -> root_of_trust, prev_block_hash -> prev_root_of_trust. Regarding the PrimaryIndex generic: I mostly introduced it to satisfy the trait bound for the CellsCollection provided as input to represent the rows of the table. It required some additional bounds, but I have also changed the type of primary_index to PrimaryIndex; given that we have the generic type, I agree it makes sense to use it for the primary_index input parameter as well.
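The renamed interface can be sketched roughly as below. Note this is a simplified stand-in, not the crate's real API: the type aliases, the `NoProvableInput` struct, and the free-function form are all hypothetical, introduced only to make the sketch self-contained.

```rust
use std::fmt::Debug;

// Hypothetical stand-ins for the crate's actual types.
type OffChainRootOfTrust = [u8; 32];
type HashOutput = [u8; 32];

#[derive(Debug, PartialEq)]
struct NoProvableInput<PrimaryIndex> {
    primary_index: PrimaryIndex,
    root_of_trust: OffChainRootOfTrust,
    prev_root_of_trust: HashOutput,
}

/// Sketch of the renamed constructor: `block_number` -> `primary_index`
/// (now of the generic `PrimaryIndex` type), `block_hash` -> `root_of_trust`,
/// `prev_block_hash` -> `prev_root_of_trust`.
fn new_no_provable_input<PrimaryIndex: PartialEq + Eq + Default + Clone + Debug>(
    primary_index: PrimaryIndex,
    root_of_trust: OffChainRootOfTrust,
    prev_root_of_trust: HashOutput,
) -> NoProvableInput<PrimaryIndex> {
    NoProvableInput { primary_index, root_of_trust, prev_root_of_trust }
}
```

With the generic in place, any index type satisfying the bounds (e.g. `u64`) can be passed directly as the primary index.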
mp2-v1/src/final_extraction/api.rs
Outdated
```rust
pub fn new_no_provable_input<PrimaryIndex: PartialEq + Eq + Default + Clone + Debug>(
    block_number: U256,
    block_hash: OffChainRootOfTrust,
    prev_block_hash: HashOutput,
```
Do we have to provide one? If it's not equal to the previous insertion, will it fail? Couldn't we simply put a dummy value inside the API?
Yes, if we don't provide here the block_hash used for the previous primary index, a constraint in the IVC proof will fail. I think it would be simpler and cleaner to provide the previous root of trust here rather than having to handle that constraint differently in the IVC proof, no?
As agreed in DM, in commit 2f71b25 I changed the interface of this method to take as input the previous IVC proof rather than the previous block hash, so that we can fetch the previous block hash from the proof internally.
mp2-v1/src/values_extraction/mod.rs
Outdated
```rust
/// Compute the identifier for a column of a table containing off-chain data
pub fn identifier_offchain_column(table_name: &str, column_name: &str) -> u64 {
    let inputs: Vec<F> = vec![table_name, column_name]
```
Should we put a DST in, just because? Like OFFCHAIN || table_name || col_name?
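The suggested domain-separation tag can be illustrated with a minimal sketch. The real code hashes field elements (`Vec<F>`) with the circuit's hash function; std's `DefaultHasher` is used here purely as a stand-in to show the `OFFCHAIN || table_name || column_name` input layout, so the constant name and the hash are assumptions.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hypothetical DST constant; the point is only that off-chain column
/// identifiers are computed in their own domain.
const OFFCHAIN_DST: &str = "OFFCHAIN";

fn identifier_offchain_column(table_name: &str, column_name: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    // Prefix with the DST before the table and column names.
    OFFCHAIN_DST.hash(&mut hasher);
    table_name.hash(&mut hasher);
    column_name.hash(&mut hasher);
    hasher.finish()
}
```

The DST guarantees that an off-chain column identifier cannot collide with an identifier derived from the same strings in another context.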
```rust
    ColumnMetadata::NoPrimaryKey(REMAINING_COLUMN_NAMES[0].to_string()),
    ColumnMetadata::NoPrimaryKey(REMAINING_COLUMN_NAMES[1].to_string()),
];
let mut off_chain_data_args = OffChainDataArgs::new(
```
Could we name it OffchainTable?
Renamed to OffChainTableArgs in commit 4a513aa to keep naming consistent with the other table types.
```rust
// there is no previous primary index, so we generate previous hash at random
HashOutput::from(array::from_fn(|_| rng.gen()))
```
Even if there is, could we avoid relying on this?
This has been removed after the API changes in commit 2f71b25.
```rust
prev_hash,
primary_index: self.last_update_epoch,
rows,
primary_key_columns: self.primary_key_column_ids(),
```
This is confusing, there should only be ONE primary key column, no?
Not necessarily, a primary key may be composed of multiple columns: think of mapping of mappings tables, where the primary key is given by the keys of both the outer and the inner mappings. To clarify, primary key != primary index: the primary key is the set of columns that uniquely identifies a row, while the primary index is just an index we build on a column (but it isn't necessarily unique for each row). I added a comment to ColumnMetadata of OffchainTableArgs data structure to clarify the meaning of primary key columns.
Ahhh gotcha, ok. @delehef, something we discussed this morning.
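The primary key vs primary index distinction can be sketched with a simplified `ColumnMetadata` (this enum and helper are illustrative approximations of the crate's data structure, not its real definition):

```rust
/// Illustrative sketch: the primary key is the set of columns that uniquely
/// identifies a row and may span several columns (e.g. a mapping of mappings
/// is keyed by both the outer and the inner mapping keys), while the primary
/// index is a single indexed column that need not be unique per row.
#[derive(Debug, Clone, PartialEq, Eq)]
enum ColumnMetadata {
    /// Column belonging to the (possibly multi-column) primary key.
    PrimaryKey(String),
    /// Regular column, not part of the primary key.
    NoPrimaryKey(String),
}

/// Collect the names of all primary key columns, in declaration order.
fn primary_key_columns(columns: &[ColumnMetadata]) -> Vec<String> {
    columns
        .iter()
        .filter_map(|c| match c {
            ColumnMetadata::PrimaryKey(name) => Some(name.clone()),
            ColumnMetadata::NoPrimaryKey(_) => None,
        })
        .collect()
}
```

For a mapping-of-mappings table, both the outer and the inner mapping key columns would be tagged `PrimaryKey`, so the helper returns two names.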
Zyouell left a comment:
Looks pretty good to me! I had one or two comments but they are really more of a personal taste thing than actual needs, so I've still mentioned them but will leave it up to you to decide whether they are good ideas or not!
mp2-v1/src/values_extraction/mod.rs
Outdated
```rust
}

/// Compute the row unique data using the set of column values provided as input
pub fn row_unique_data<'a, I: Iterator<Item = &'a [u8]>>(columns: I) -> HashOutput {
```
I believe if you changed this to row_unique_data<I: IntoIterator<Item = &[u8]>> and then changed it to let packed_columns = columns.into_iter(). ..., it would make it so that you can pass anything to this function as long as it iterates over slices (and also gets rid of the lifetime), which would mean we wouldn't have to turn everything into vec![stuff].into_iter() when we call this function above.
Done in commit 0cf3693, though it didn't get rid of the lifetime, but that's ok I guess.
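The bound change can be sketched as below. As noted in the reply, the lifetime on the item type remains; the body is a stand-in (std's `DefaultHasher` returning `u64` instead of the real packing and `HashOutput`), so only the signature shape is the point here.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// With `IntoIterator` instead of `Iterator`, callers can pass a `Vec`,
/// an array, or anything else that can be turned into an iterator over
/// byte slices, without spelling out `.into_iter()` at each call site.
fn row_unique_data<'a, I: IntoIterator<Item = &'a [u8]>>(columns: I) -> u64 {
    let mut hasher = DefaultHasher::new();
    for column in columns.into_iter() {
        column.hash(&mut hasher);
    }
    hasher.finish()
}
```

Both a `Vec` of slices and a plain array now work directly, and produce the same result for the same column values.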
```diff
 let initial_epoch = table.row.initial_epoch().await;
 // choose query bounds outside of the range [initial_epoch, last_epoch]
-let min_block = 0;
+let min_block = max(0, initial_epoch - 2) as usize;
```
This seems pointless to me, as a usize can't be smaller than zero.
initial_epoch is a signed integer, so subtracting 2 could yield a negative value; that's why I added the max call. Does it make sense?
Oh yeah, of course, you are converting to usize after the call to max, my bad.
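The clamp-then-cast ordering that resolved this thread can be checked with a tiny example (using `i64` as an assumed stand-in for the signed epoch type):

```rust
use std::cmp::max;

/// `initial_epoch` is signed, so `initial_epoch - 2` can go negative;
/// clamping with `max` happens *before* the cast to `usize`. Casting
/// first would wrap a negative value into a huge unsigned number.
fn min_block(initial_epoch: i64) -> usize {
    max(0, initial_epoch - 2) as usize
}
```

For instance, an initial epoch of 1 yields 0 (the clamp fires), while an initial epoch of 10 yields 8.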
```rust
| TableSource::MappingStruct(_, _)
| TableSource::MappingOfSingleValueMappings(_)
| TableSource::MappingOfStructMappings(_) => query_mapping(ctx, &table, &t).await?,
TableSource::OffChain(_) => {
```
As an alternative solution to tables having to keep track of where they came from (because in my opinion once a table is a "table" it shouldn't care about where it came from), could we make query_table_with_no_consecutive_blocks() return an Option<QueryCookin> and then just handle the None case in the query_mapping function?
Done in commit 0cf3693, nice suggestion, thanks.
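The pattern suggested here, a helper returning `Option` with the `None` case handled inside the single query path, can be sketched generically. Everything below is hypothetical: the function names, the string-based "query", and the fallback are illustrations of the Option-handling shape only, not the repo's actual query cooking code.

```rust
/// Hypothetical helper: returns `None` when no non-consecutive block
/// range exists (e.g. a degenerate epoch range), instead of requiring
/// the caller to branch on the table's origin.
fn cook_non_consecutive_query(epoch_range: (i64, i64)) -> Option<String> {
    let (min, max) = epoch_range;
    (max > min).then(|| format!("SELECT ... WHERE epoch < {min} OR epoch > {max}"))
}

/// Hypothetical caller: the `None` case is absorbed here, so the table
/// itself no longer needs to remember where it came from.
fn query_mapping(epoch_range: (i64, i64)) -> String {
    cook_non_consecutive_query(epoch_range).unwrap_or_else(|| "SELECT ...".to_string())
}
```

The design point is that origin-specific knowledge lives in one `Option`-returning helper rather than leaking into a match on the table source.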
Merged 9d292a7 into feat/new_extraction_features
The related document: Indexing Data with No Provable Extraction
Summary

- `compute_table_row_digest` and `merge_table_row_digests` for table row digest computation.
- `run_no_provable_test_cases` to wrap the original test cases to run the final extraction and check with the DB result directly (without values extraction).