0.3.0
This release adds new functionality and incorporates many of the latest changes in the biodivine libraries. Unlike the transition from 0.1.0 to 0.2.0, the transition to 0.3.0 should not introduce any breaking changes.
Complete list of changes:
- Added bindings for HCTL and attractor property classification (a usage sketch is included at the end of these notes):
  - `run_hctl_classification`, `run_attractor_classification`: Compute an HCTL or attractor classification map.
  - `save_class_archive`, `load_class_archive`: Store/load classification results for later use or GUI analysis.
  - `get_model_assertions`, `get_model_properties`: Read HCTL assertions/properties from an annotated `.aeon` model.
- Added bindings for HCTL model checking (see the sketches at the end of these notes):
  - The `HctlTreeNode` class is used to represent and manipulate HCTL formulas at runtime.
  - `get_extended_stg`: Extends a `SymbolicAsyncGraph` with a collection of symbolic variables necessary for model checking.
  - `model_check`, `model_check_multiple`: Basic model checking.
  - `model_check_extended`, `model_check_extended_multiple`: Model checking, but with "extended" propositions that can map to arbitrary subsets of states.
  - `mc_analysis`, `mc_analysis_multiple`: Run model checking with progress printing, sanitization, etc.
- Fixed #13, so `BddValuation` and `BddPartialValuation` are now actually iterable objects (see the sketch at the end of these notes).
- Added `BddVariableSet::transfer_from`, `SymbolicContext::transfer_from`, and `SymbolicAsyncGraph::transfer_from`/`SymbolicAsyncGraph::transfer_colors_from`/`SymbolicAsyncGraph::transfer_vertices_from`. These methods translate a `Bdd` or a symbolic set between two symbolic domains (see the sketch at the end of these notes).
- Added `BooleanNetwork::inline_variable` and `FnUpdate::substitute_variable`, which can be used to implement network reduction.
- Added three new example workflow notebooks:
  - `classification.ipynb`: Classification of network instantiations based on HCTL and attractor properties.
  - `file-formats.ipynb`: A quick overview of supported file formats and their properties/limitations.
  - `model-checking.ipynb`: Model checking of HCTL properties on (partially specified) Boolean networks.
- Examples for the network reduction process will be added later, once we resolve a few remaining issues in the Rust backend.
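
The following sketches illustrate how the new bindings might be used. They are based only on the descriptions above, so every signature (argument order, path vs. object arguments, return values) should be treated as an assumption rather than the documented API.

First, a minimal classification sketch: read the annotations from a hypothetical annotated `.aeon` model, compute the HCTL classification map, archive it, and load it again later.

```python
from biodivine_aeon import (
    run_hctl_classification,
    save_class_archive,
    load_class_archive,
    get_model_assertions,
    get_model_properties,
)

# Hypothetical annotated model with embedded HCTL assertions/properties.
model_path = "my_annotated_model.aeon"

# Inspect the annotations (assumed: a path in, a list of formula strings out).
print(get_model_assertions(model_path))
print(get_model_properties(model_path))

# Compute the HCTL classification map and archive it for later use or GUI analysis.
classification = run_hctl_classification(model_path)
save_class_archive("classification.zip", classification)

# ...later, possibly in a different session or tool:
classification = load_class_archive("classification.zip")
print(classification)
```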
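
Next, a model-checking sketch. Here, `get_extended_stg` is assumed to take a network plus the number of HCTL state variables, and `model_check` a graph plus a formula string; loading the network through `BooleanNetwork.from_aeon` is also an assumption.

```python
from biodivine_aeon import BooleanNetwork, get_extended_stg, model_check

# A tiny toy network in the .aeon format.
bn = BooleanNetwork.from_aeon("""
a -> b
b -| a
$a: !b
$b: a
""")

# Extend the symbolic transition graph with enough symbolic variables for the
# single HCTL state variable {x} used in the formula below.
stg = get_extended_stg(bn, 1)

# States that can reach themselves again, i.e. states lying on a cycle.
result = model_check(stg, "3{x}: @{x}: AG EF {x}")

# `result` should be a symbolic set of (colour, state) pairs; printing it is
# the simplest way to inspect it in this sketch.
print(result)
```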
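
A small sketch of the #13 fix. Only the final loop is the point; the way the `Bdd` and its satisfying valuation are obtained here (the `BddVariableSet` constructor, `eval_expression`, `sat_witness`) is an assumption and may not match the released bindings.

```python
from biodivine_aeon import BddVariableSet

# Hypothetical setup: build a small Bdd and pick one satisfying valuation.
variables = BddVariableSet(["a", "b", "c"])
bdd = variables.eval_expression("a & (b | !c)")
valuation = bdd.sat_witness()

# Thanks to #13, BddValuation (and BddPartialValuation) now behave as ordinary
# Python iterables, so they work with `for`, `list()`, `zip()`, and so on.
for value in valuation:
    print(value)
print(list(valuation))
```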
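
Finally, a sketch of the new `transfer_*` methods, reusing the model-checking setup above: the goal is to translate a result computed in the extended symbolic encoding back into the encoding of a plain `SymbolicAsyncGraph`. The constructor of `SymbolicAsyncGraph` and the argument order of `transfer_from` (the set first, its source graph second) are assumptions.

```python
from biodivine_aeon import (
    BooleanNetwork,
    SymbolicAsyncGraph,
    get_extended_stg,
    model_check,
)

bn = BooleanNetwork.from_aeon("""
a -> b
b -| a
$a: !b
$b: a
""")

plain = SymbolicAsyncGraph(bn)        # plain symbolic graph over the network
extended = get_extended_stg(bn, 1)    # extended graph with extra symbolic variables

result = model_check(extended, "3{x}: @{x}: AG EF {x}")

# Translate the result from the extended encoding into the plain encoding.
# (Assumed to fail or return nothing if the two encodings are incompatible.)
plain_result = plain.transfer_from(result, extended)
print(plain_result)
```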