
Conversation

@iksnagreb (Contributor)

Cherry-picked and slightly extended a commit from #901, independent of the attention-specific changes there. Related to fastmachinelearning/qonnx#143 and the generalized activation function support.

This is probably still rather sketchy, but at least it tries to check
the data layout annotation. For now, this seems to be enough to get the
thresholds of multi-head attention right, IF qonnx properly annotates
the 3D layouts.
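
For context, a minimal sketch (not the code from this PR) of what checking the layout annotation can look like via qonnx's `ModelWrapper` API; the helper name `find_channel_axis` is made up for illustration:

```python
# Sketch only: locate the channel axis of a tensor from its data layout
# annotation before applying per-channel thresholds. Relies on qonnx's
# ModelWrapper.get_tensor_layout, which returns the layout as a list of
# dimension labels (e.g. ["N", "C"] or ["N", "H", "W", "C"]), or None
# if no annotation is present.
from qonnx.core.modelwrapper import ModelWrapper


def find_channel_axis(model: ModelWrapper, tensor_name: str) -> int:
    layout = model.get_tensor_layout(tensor_name)
    if layout is None:
        # No annotation present: fall back to assuming channels-last.
        return -1
    if "C" not in layout:
        raise ValueError(
            f"Layout of {tensor_name} has no channel dimension: {layout}"
        )
    return layout.index("C")
```

For a 3D tensor annotated as `["N", "W", "C"]` this yields axis 2; if qonnx leaves the 3D layout unannotated, the channels-last fallback kicks in, which is where the "IF" above matters.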