
Document / Add an example of RowFilter usage #9096

@alamb

Description

Is your feature request related to a problem or challenge? Please describe what you are trying to do.
This has come up several times, most recently on the arrow mailing list:

https://lists.apache.org/thread/5kg3q0y4cqzl16q6vrvkxlw0yxmk4241

Discussing how to expose dictionary data may lead to multiple overlapping
considerations, long discussions, and perhaps format and API changes. So we
hope there could be some loophole or small change that could potentially
unblock such an optimization without going into a large design/API space.
For instance:

  1. Can we introduce a hint to ParquetReader which will produce
    DictionaryArray for the given column instead of a concrete array
    (StringViewArray in our case)?
  2. When doing late materialization, maybe we can extend ArrowPredicate,
    so that it first instructs Parquet reader that it wants to get encoded
    dictionaries first, and once they are supplied, return another predicate
    that will be applied to encoded data. E.g., "x = some_value" translates to
    "x_encoded = index".

@tustvold pointed out:

What you are requesting is already supported in parquet-rs. In
particular if you request a UTF8 or Binary DictionaryArray for the
column it will decode the column preserving the dictionary encoding. You
can override the embedded arrow schema, if any, using
ArrowReaderOptions::with_schema [1]. Provided you don't read RecordBatch
across row groups and therefore across dictionaries, which the async
reader doesn't, this should never materialize the dictionary. FWIW the
ViewArray decoders will also preserve the dictionary encoding,
however, the dictionary encoded nature is less explicit in the resulting
arrays.

As for using integer comparisons to optimise dictionary filtering, you
should be able to construct an ArrowPredicate that computes the filter
for the dictionary values, caches this for future use, e.g. using ptr_eq
to detect when the dictionary changes, and then filters based on
dictionary keys.

The RowFilter API exists and can evaluate predicates while decoding, but it has no examples:
https://docs.rs/parquet/latest/parquet/arrow/arrow_reader/type.ParquetRecordBatchReaderBuilder.html#method.with_row_filter

Describe the solution you'd like
I would like these features to be better documented:

  1. An example of using with_row_filter, with a link to the https://arrow.apache.org/blog/2025/12/11/parquet-late-materialization-deep-dive/ blog post

Bonus points for a second example that shows how to evaluate predicates on "encoded data", which:

  1. Reads the data as dictionary / StringView (see Add an example of preserving dictionary encoding when reading parquet, #9095)
  2. Uses ptr_eq as described above to reuse the dictionary

Describe alternatives you've considered

Additional context

Metadata

Labels

- documentation: Improvements or additions to documentation
- enhancement: Any new improvement worthy of an entry in the changelog
- parquet: Changes to the parquet crate
