- Fix bug in `NoiseAugmentation` constructor (#183)
- Support selection of AD backends via DifferentiationInterface.jl (#167)
- For gradient-based XAI methods, an AD backend must now be manually loaded. To keep using the default Zygote backend, this simply requires adding `using Zygote` to your code. (#177)
- `Gradient`, `InputTimesGradient` and `GradCAM` analyzers now have an additional `backend` field and type parameter. (#167)
- Update XAIBase interface to v4. This adds a field to the `Explanation` return type and removes the `add_batch_dim` keyword argument. Refer to the XAIBase.jl changelog for more information. (#174)
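With the new backend selection, keeping the default Zygote backend only requires loading Zygote before constructing an analyzer. A minimal sketch, where the model and input are placeholders for illustration:

```julia
using ExplainableAI
using Zygote  # loads the default Zygote AD backend

# Hypothetical toy model and input, for illustration only
model = x -> x .^ 2
analyzer = Gradient(model)            # analyzer now carries a `backend` field
expl = analyze(rand(Float32, 4), analyzer)
```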
This release removes the automatic reexport of heatmapping functionality. Users are now required to manually load VisionHeatmaps.jl and/or TextHeatmaps.jl.
This reduces the maintenance burden for new heatmapping features and the number of dependencies for users who don't need heatmapping functionality.
- Removed reexport of heatmapping functionality by updating XAIBase dependency to v3.0.0 (#162)
- Added `GradCAM` analyzer (#155). Try it with VisionHeatmaps.jl's new `heatmap_overlay` feature.
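Since heatmapping is no longer reexported, it must be loaded explicitly alongside the analyzer. A sketch, assuming a vision model; `model` and `input` are placeholders:

```julia
using ExplainableAI, Zygote
using VisionHeatmaps  # no longer reexported; must be loaded manually

# Placeholders: `model` is a vision model, `input` a matching image batch
expl = analyze(input, Gradient(model))
heatmap(expl)  # `heatmap` is now provided by VisionHeatmaps.jl
```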
This release moves the core interface (`Explanation`, `heatmap`, `analyze`)
into a separate package called XAIBase.jl.
Developers can make use of the XAIBase.jl interface
to quickly implement or prototype new methods without having to write boilerplate code.
As announced with version v0.6.2, this is the first release without LRP, which has been moved to a separate package called RelevancePropagation.jl. This separation is enabled by the new common XAIBase.jl interface.
- Move core interface into XAIBase.jl package (#154)
  - Renamed `Explanation` field `neuron_selection` to `output_selection`
  - Added `Explanation` field `heatmap` for heatmapping presets
  - Renamed
- Move LRP into RelevancePropagation.jl (#157)
- Remove ImageNet preprocessing utilities (#159)
- Partially move documentation into the Julia-XAI ecosystem documentation
This is the first release of ExplainableAI.jl as part of the Julia-XAI organization (#149) and the last minor release that includes LRP before it is moved to its own separate package.
- Add Concept Relevance Propagation analyzer `CRP` (#146, #148)
- Add option to process heatmaps batch-wise using keyword argument `process_batch=true` (#146, #148)
- Remove `FlatRule` on dense layers from `EpsilonPlusFlat` and `EpsilonAlpha2Beta1Flat` composite presets (#147)
This release brings GPU support to all analyzers.
Support LRP on GPUs (#142, #140)
Support gradient analyzers on GPUs (#144)
Make Tullio optional dependency using package extensions (#141)
Document GPU support (#145)
This release brings a large refactor of LRP analyzers,
supporting nested "dataflow layers" from Flux.jl such as `Chain` and `Parallel` layers.
This enables LRP on more complex model architectures like ResNets.
Because these new features require a breaking release, we've used the occasion to clean up the API. Since the number of changes is large, this changelog has been split between changes to LRP analyzers and more general changes to the package.
Breaking changes:
- Remove all unicode characters from user-facing API (#107)
  - `EpsilonRule`: argument `epsilon` replaces `ϵ`
  - `GammaRule`: argument `gamma` replaces `γ`
  - `AlphaBetaRule`: arguments `alpha` and `beta` replace `α`, `β`
- Rename `LRP` analyzer keyword argument `is_flat=false` to `flatten=true` (#119)
- Remove `check_model`, replaced by non-exported `check_lrp_compat` (#119)
- Replace `layerwise_relevances` field of `Explanation` return type by optional named tuple `extras`. Access layerwise relevances via `extras.layerwise_relevances`. (#126)
- Remove composite `LastNTypeRule` (#119)
- Rename composite primitives to avoid confusion with LRP rules (#130)
  - rename `*Rule` to `*Map`
  - rename `*TypeRule` to `*TypeMap`
  - rename
Breaking changes to commonly extended internal functions:
- Internal `lrp!` rule calls require extra argument `layer` (#119)
- Pre-allocate modified layers, replacing `modify_param!` with `modify_parameters` (#102)
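Under the new scheme, custom rules return a modified copy of each parameter instead of mutating it in place. A hypothetical sketch, assuming the `AbstractLRPRule` supertype and the `modify_parameters` extension point; `MyGammaRule` and its scaling factor are made up for illustration:

```julia
using ExplainableAI

# Hypothetical custom LRP rule
struct MyGammaRule <: AbstractLRPRule end

# Return a modified copy of a parameter array `p` instead of mutating it;
# here we boost positive weights, similar in spirit to `GammaRule`
ExplainableAI.modify_parameters(::MyGammaRule, p) = p + 0.25f0 * max.(0, p)
```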
New features and enhancements:
- Support nested Flux `Chain`s (#119)
- Support `Parallel` layers (#135, #138)
- Support `BatchNorm` layers (#129, #134)
- Add `GeneralizedGammaRule` (#109)
- Support nested indexing in composite primitive `LayerMap` (#131)
- Pre-allocate modified layers in `LRP` analyzer field `modified_layers` (#119)
- Set LRP output relevance to one (#128)
- `lrp!` rule calls require extra argument `layer`, avoiding copies of unmodified layers (#119)
- Performance fixes for LRP rules, reducing the number of generated pullback functions (#106, #108)
- Simplify LRP analyzer (#112, #119)
- Simplify LRP model checks (#110, #119)
- Improve type stability of LRP rules
Documentation:
Update documentation, adding pages on model preparation, composites, custom LRP rules, developer documentation and a separate API reference for LRP analyzers (#137, #105)
Package maintenance:
Breaking changes:
- Rename `Explanation` field `attribution` to `val` (#136)
Documentation:
Package maintenance:
- Compatibility with Flux.jl v0.14 (#116)
- Drop dependency on LinearAlgebra.jl and PrettyTables.jl (#119)
- Add Aqua.jl tests (#125)
- Drop Flux v0.12 due to compatibility issues in `preactivation` (#99)
- Ignore bias in `WSquareRule`
- Faster `FlatRule` on Dense layers (#96)
- Faster `WSquareRule` on Dense layers (#98)
- Update rule tests and references
This release brings bugfixes and usability features:
- Add pretty printing of LRP analyzers, summarizing how layers and rules are matched up (#89)
- Add LRP support for `ConvTranspose` and `CrossCor` layers
- Add equations of LRP rules to docstrings
Bugfixes:
- Fix bug affecting `AlphaBetaRule`, `ZPlusRule` and `ZBoxRule`, where mutating the layer modified Zygote pullbacks (#92)
- Fix bug in `FlatRule` bias (#92)
- Fix input modification for `FlatRule` and `WSquareRule` (#93)
Big feature release that adds LRP composites and presets:
- Add LRP `Composite` and composite primitives (#84)
- Add LRP composite presets (#87)
- Add LRP `ZPlusRule` (#88)
- Export union-types of Flux layers for easy definition of LRP composites
- Improvements to docstrings and documentation
- Add `test/Project.toml` with compat entries for test dependencies (#87)
This release temporarily adds ImageNet pre-processing utilities. This enables users to apply XAI methods to pretrained vision models from Metalhead.jl. Note that this functionality will be deprecated once matching functionality is available in either Metalhead.jl or MLDatasets.jl.
- Add ImageNet preprocessing utility `preprocess_imagenet` (#80)
- Change default `heatmap` color scheme to `seismic`
- Updated README with the JuliaCon 2022 talk and examples on VGG16
Small bugfix release addressing a bug in v0.5.0.
Version of ExplainableAI.jl shown in the JuliaCon 2022 talk.
- Fix bug in `FlatRule` (#77)
Breaking release that refactors the internals of LRP analyzers and adds several rules.
List of breaking changes:
- Introduce compatibility checks for LRP rule & layer combinations using `check_compat(rule, layer)` (#75)
- Applying `GammaRule` and `ZBoxRule` on a layer without weights and biases will now throw an error (#75)
- In-place updating `modify_layer!(rule, layer)` replaces `modify_layer(rule, layer)` (#73)
- In-place updating `modify_param!(rule, param)` replaces `modify_params(rule, W, b)` (#73)
- Removed named LRP constructors `LRPZero`, `LRPEpsilon`, `LRPGamma` (#75)
Added new LRP rules:
Bug fixes:
Performance improvements:
- Replace LRP gradient computation with VJP using `Zygote.pullback` (#72)
- Faster `GammaRule`
Changes:
- Update heatmapping normalizer, using ColorScheme's `get`. Breaking due to renaming `normalize` to ColorScheme's `rangescale`. (#57)
- Rename `InputAugmentation` to `NoiseAugmentation`. (#65)
- `GammaRule` and `EpsilonRule` now use default arguments instead of keyword arguments, removing the need for users to type unicode symbols. (#70)
- `ZBoxRule` now requires parameters `low` and `high` instead of computing them from the input. (#69)
- Add `IntegratedGradients` analyzer. (#65)
- Add `InterpolationAugmentation` wrapper. (#65)
- Allow any type of `Sampleable` in `NoiseAugmentation`. (#65)
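Since any `Sampleable` is now accepted, noise distributions from Distributions.jl can be plugged in directly. A sketch under assumptions: `model` and `input` are placeholders, and the constructor argument order (analyzer, number of samples, distribution) is assumed rather than taken from the source:

```julia
using ExplainableAI
using Distributions  # provides Sampleable distributions such as Normal

# Placeholders: `model` is a Flux model, `input` a matching input array.
# Assumed signature: NoiseAugmentation(analyzer, n_samples, distribution)
analyzer = NoiseAugmentation(Gradient(model), 50, Normal(0.0f0, 0.1f0))
expl = analyze(input, analyzer)
```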
Performance improvements: