Enabling XNNPACK with Raspberry Pi Zero/W #60282
Hi @samveen, as per the TFLite documentation, support for XNNPACK is disabled for ARMv6 since there is no NEON support. Have you observed the same behaviour with the TF nightly build? Thanks.
As can be seen in the log output, the XNNPACK sub-build overrides the user-supplied values of CMAKE_C_FLAGS/CMAKE_CXX_FLAGS with its own architecture flags, no matter what those values are. From the XNNPACK readme, it's clear that there is a subset of XNNPACK that is usable on the Raspberry Pi Zero/Zero W. However, what is not clear is whether tflite can use just that subset of XNNPACK when it is built for the Zero/Zero W. I'll try to build XNNPACK on the Pi Zero as per the instructions and get back to you on the build state.
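For reference, the standalone build I plan to try is roughly the following (a sketch based on my reading of the XNNPACK README; the exact CMake option names are assumptions):

```sh
# Native XNNPACK build on the Pi Zero (ARMv6 + VFP).
git clone https://github.com/google/XNNPACK.git
cd XNNPACK
mkdir build && cd build
# XNNPACK_BUILD_TESTS / XNNPACK_BUILD_BENCHMARKS are assumed option names;
# skipping tests and benchmarks to save build time on the Zero.
cmake .. \
  -DCMAKE_C_FLAGS="-march=armv6 -mfpu=vfp -mfloat-abi=hard" \
  -DCMAKE_CXX_FLAGS="-march=armv6 -mfpu=vfp -mfloat-abi=hard" \
  -DXNNPACK_BUILD_TESTS=OFF \
  -DXNNPACK_BUILD_BENCHMARKS=OFF
cmake --build . -j1
```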
@samveen Thanks for the information. @sachinprasadhs Could you please look into this issue? Thanks.
@pjpratik @sachinprasadhs I've created an issue against XNNPACK with a lot more detail regarding RPi0 builds - google/XNNPACK#4701 - giving the issues I've faced, the build process I followed, and details of my native build environment. Hopefully that gives more insight into the underlying issues.
Hi @samveen, thanks for raising this issue. Are you aware of the migration to LiteRT? This transition is aimed at enhancing our project's capabilities and providing improved support and focus for our users. As we believe this issue is still relevant to LiteRT, we are moving your issue there. Please follow progress here: google-ai-edge/LiteRT#177. Let us know if you have any questions. Thanks.
Closing this as the issue is now being tracked at google-ai-edge/LiteRT#177.
Issue Type
Build/Install
Have you reproduced the bug with TF nightly?
No
Source
source
Tensorflow Version
2.12.0
Custom Code
No
OS Platform and Distribution
Linux Raspberry Pi OS 32-bit (Debian bullseye)
Mobile device
Raspberry Pi Zero W
Python version
3.9.2
Bazel version
cmake 3.18.4
GCC/Compiler version
GNU c++ (Raspbian 10.2.1-6+rpi1) 10.2.1 20210110
CUDA/cuDNN version
NA
GPU model and memory
NA
Current Behaviour?
The tf-lite build instructions for the Raspberry Pi Zero/Zero W state that the following should be part of the CFLAGS/CXXFLAGS:
-march=armv6 -mfpu=vfp -mfloat-abi=hard -funsafe-math-optimizations
As per the XNNPACK README, XNNPACK supports running on ARMv6 with VFP, which is exactly the Raspberry Pi Zero W. However, all build instructions for the Raspberry Pi Zero request explicitly disabling xnnpack. Given the documented xnnpack support for the RPi0, I tried to build tf-lite with xnnpack enabled.
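The build I attempted was roughly the following (a sketch of a native build on the Zero W; the -DTFLITE_ENABLE_XNNPACK option name and the source layout are my assumptions about the current CMake build):

```sh
# Flags recommended by the tf-lite build instructions for the Pi Zero/Zero W.
ARMCC_FLAGS="-march=armv6 -mfpu=vfp -mfloat-abi=hard -funsafe-math-optimizations"
mkdir tflite_build && cd tflite_build
# Same invocation as the documented one, except XNNPACK is switched on.
cmake ../tensorflow/lite \
  -DCMAKE_C_FLAGS="${ARMCC_FLAGS}" \
  -DCMAKE_CXX_FLAGS="${ARMCC_FLAGS}" \
  -DTFLITE_ENABLE_XNNPACK=ON
cmake --build . -j1
```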
When the xnnpack sub-build is enabled, the following conflicting CFLAGS are added to its compiler invocations, regardless of the user-supplied flags:
-marm -march=armv8.2-a+dotprod -mfpu=neon-fp-armv8
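One way to see which -march values actually reach the xnnpack objects is to export compile commands and grep them (a sketch; it assumes the xnnpack targets are configured as part of the same CMake project, so they appear in compile_commands.json):

```sh
cmake ../tensorflow/lite \
  -DCMAKE_EXPORT_COMPILE_COMMANDS=ON \
  -DCMAKE_C_FLAGS="${ARMCC_FLAGS}" \
  -DCMAKE_CXX_FLAGS="${ARMCC_FLAGS}" \
  -DTFLITE_ENABLE_XNNPACK=ON
# Count how many compile commands carry each -march value.
grep -o -- '-march=[^ "]*' compile_commands.json | sort | uniq -c
```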
Please document/extend the cmake and build instructions to allow tf-lite to build correctly with xnnpack enabled for the Raspberry Pi Zero/Zero W.
Standalone code to reproduce the issue
Relevant log output