tflite build changes #688

Open · wants to merge 22 commits into master
8 changes: 6 additions & 2 deletions .gitignore
@@ -16,7 +16,7 @@ venv*/
*.tar.gz
/VARIANT

# Docs API reference
docs/api_reference.md

### Cmake auto tools
@@ -137,4 +137,8 @@ dkms.conf
.idea_modules/

# docs site
site/

# docker remnants
*.iid
*.cid
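
For context, `*.iid` and `*.cid` are the image- and container-ID files Docker writes when asked to; a minimal sketch of how such remnants get created (file names illustrative):

```sh
# Write the built image's ID to build.iid and a container's ID to run.cid;
# these are the transient files the new .gitignore rules cover.
docker build --iidfile build.iid -t scratch-image .
docker run --cidfile run.cid scratch-image true
```
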
22 changes: 17 additions & 5 deletions CMakeLists.txt
@@ -138,14 +138,25 @@ ENDIF()
#----------------------------------------------------------------------------------------------

IF(BUILD_TFLITE)
FIND_LIBRARY(TFLITE_LIBRARIES_1 NAMES tensorflow-lite
FIND_LIBRARY(TFLITE_LIBRARIES_1 NAMES tensorflowlite
PATHS ${depsAbs}/libtensorflow-lite/lib)
FIND_LIBRARY(TFLITE_LIBRARIES_2 NAMES benchmark-lib.a
IF (${DEVICE} STREQUAL "gpu")
FIND_LIBRARY(TFLITE_LIBRARIES_2 NAMES tensorflowlite_gpu_delegate
PATHS ${depsAbs}/libtensorflow-lite/lib)
SET(TFLITE_LIBRARIES ${TFLITE_LIBRARIES_1} ${TFLITE_LIBRARIES_2})
MESSAGE(STATUS "Found TensorFlow Lite Libraries: \"${TFLITE_LIBRARIES}\"")
IF (NOT APPLE)
FIND_LIBRARY(OPENGL_LIBRARIES NAMES GL
PATHS /usr/lib/${MACH}-linux-gnu)
FIND_LIBRARY(EGL_LIBRARIES NAMES EGL
PATHS /usr/lib/${MACH}-linux-gnu)
ELSE()
MESSAGE(FATAL_ERROR "The TensorFlow Lite GPU backend cannot be built on Apple machines.")
ENDIF()
ENDIF()
SET(TFLITE_LIBRARIES ${TFLITE_LIBRARIES_1} ${TFLITE_LIBRARIES_2} ${OPENGL_LIBRARIES} ${EGL_LIBRARIES})
IF (NOT TFLITE_LIBRARIES)
MESSAGE(FATAL_ERROR "Could not find TensorFlow Lite")
ELSE()
MESSAGE(STATUS "Found TensorFlow Lite Libraries: \"${TFLITE_LIBRARIES}\"")
ENDIF()
IF (${DEVICE} STREQUAL "gpu")
ADD_DEFINITIONS(-DRAI_TFLITE_USE_CUDA)
@@ -202,6 +213,7 @@ ENDIF()

ADD_SUBDIRECTORY(src)
ADD_SUBDIRECTORY(tests/module)

ADD_LIBRARY(redisai SHARED $<TARGET_OBJECTS:redisai_obj>)

TARGET_LINK_LIBRARIES(redisai ${CMAKE_DL_LIBS})
@@ -322,4 +334,4 @@ if(PACKAGE_UNIT_TESTS)
enable_testing()
include(GoogleTest)
add_subdirectory(tests/unit)
endif()
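
For illustration, a configure step that exercises the new TFLite logic above might look like this (a sketch, assuming a Linux GPU host; normally `make -C opt` drives this, but `BUILD_TFLITE`, `DEVICE`, and `MACH` are the variables referenced in this diff):

```sh
# MACH feeds the /usr/lib/${MACH}-linux-gnu search paths used to locate GL/EGL.
mkdir -p build && cd build
cmake -DBUILD_TFLITE=ON -DDEVICE=gpu -DMACH=$(uname -m) ..
```
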
4 changes: 3 additions & 1 deletion README.md
@@ -69,7 +69,7 @@ redis-cli

## Building

You should obtain the module's source code and its submodules using git like so:

```sh
git clone --recursive https://github.com/RedisAI/RedisAI
@@ -96,6 +96,8 @@ ALL=1 make -C opt clean build

Note: in order to use the PyTorch backend on Linux, at least `gcc 4.9.2` is required.

[See this document](docs/developer-backends.md) for building AI backends.

### Running the server

You will need a redis-server version 5.0.7 or greater. This should be
19 changes: 19 additions & 0 deletions docs/developer-backends.md
@@ -0,0 +1,19 @@
# RedisAI dependency builds

Platform dependency build systems are located in this folder. Dependencies are pre-built and published to S3; to do so, they ultimately rely on running **make build publish** in a given directory, as sketched below. The goal is for this to hold on all target platforms (x86_64, arm64), though at this time it is only true for tensorflowlite.
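
As a usage sketch (assuming the targets behave as described, and using the tflite directory from this PR):

```sh
# Build the dependency artifact, then push it to S3.
cd opt/build/tflite
make build publish
```
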

## Background

Items are built in Docker images for the target platform whenever possible. When that is not feasible (e.g. a future planned macOS build), items are built on dedicated hardware. Each build has its own design wrinkles, but the ideal is to build a base Docker image (see the [automata repository](https://github.com/redislabsmodules/automata)) that serves as the build-system base for the dependency itself. A Docker image is built from that base image, accepting externalized variables such as the dependency version, and compilation of the external requirements takes place in a build file mounted inside the image.
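
A hedged sketch of that mount-based flow, using the tflite build script from this PR (the builder image name and tag are hypothetical; the real images live under the redisfab Docker Hub account mentioned below):

```sh
# Run the base build image with the dependency's build script mounted in;
# the arguments mirror the ARCH/VERSION/PLATFORM signature used in Dockerfile.x64.
docker run --rm \
    -v "$(pwd)/opt/build/tflite/build.sh:/build/build.sh:ro" \
    redisfab/bazelbuilder:latest \
    bash /build/build.sh x86_64 2.4.0 cpu
```
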

Ideally, a per-platform Dockerfile (e.g. Dockerfile.x64, Dockerfile.arm) will exist in the underlying folder, assuming building within Docker is tenable.

--------------

## tensorflowlite (tflite)

### arm64

The arm build of tflite currently occurs on **Jetson arm devices** only, as portions of the Jetson device's root filesystem are mounted during the build: the device links /usr/lib entries to /etc/alternatives, which in turn point to /usr/local/cuda (itself a symlink to /usr/local/cuda-10.2), so building directly on the device is the current approach.

The *build_arm* target in the [Makefile](Makefile) describes the process in detail. The code to build the base docker build image can be found in the [automata repository](https://github.com/RedisLabsModules/automata/tree/master/dockers/buildsystem/bazelbuilder). The *bazelbuilder* image is published to the [redisfab dockerhub repositories](https://hub.docker.com/r/redisfab/).
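
Reproducing that build then reduces to invoking the target on the device itself; a sketch, assuming a Jetson host with the repository checked out:

```sh
# On the Jetson device: drive the on-device tflite build.
# The build_arm target mounts the needed parts of the root filesystem itself.
cd opt/build/tflite
make build_arm
```
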
2 changes: 1 addition & 1 deletion docs/developer.md
@@ -106,7 +106,7 @@ Within the `backends` folder you will find the implementation code required to
* **ONNX**: `onnxruntime.h` and `onnxruntime.c` exporting the functions to register the ONNXRuntime backend

## Building and Testing
You can compile and build the module from its source code - refer to the [Building and Running section](quickstart.md#building-and-running) of the Quickstart page for instructions on how to do that.
You can compile and build the module from its source code - refer to the [Building and Running section](quickstart.md#building-and-running) of the Quickstart page for instructions on how to do that, or view the detailed instructions on [building backends](developer-backends.md).

**Running Tests**

16 changes: 7 additions & 9 deletions get_deps.sh
@@ -145,7 +145,7 @@ fi # WITH_TF

################################################################################# LIBTFLITE

TFLITE_VERSION="2.0.0"
TFLITE_VERSION="2.4.1"

if [[ $WITH_TFLITE != 0 ]]; then
[[ $FORCE == 1 ]] && rm -rf $LIBTFLITE
@@ -156,26 +156,24 @@ if [[ $WITH_TFLITE != 0 ]]; then
LIBTF_URL_BASE=https://s3.amazonaws.com/redismodules/tensorflow
if [[ $OS == linux ]]; then
TFLITE_OS="linux"
# if [[ $GPU != 1 ]]; then
# TFLITE_BUILD="cpu"
# else
# TFLITE_BUILD="gpu"
# fi
if [[ $GPU != 1 ]]; then
TFLITE_PLATFORM="cpu"
else
TFLITE_PLATFORM="cuda"
fi

if [[ $ARCH == x64 ]]; then
TFLITE_ARCH=x86_64
elif [[ $ARCH == arm64v8 ]]; then
TFLITE_ARCH=arm64
elif [[ $ARCH == arm32v7 ]]; then
TFLITE_ARCH=arm
fi
elif [[ $OS == macos ]]; then
TFLITE_OS=darwin
# TFLITE_BUILD=cpu
TFLITE_ARCH=x86_64
fi

LIBTFLITE_ARCHIVE=libtensorflowlite-${TFLITE_OS}-${TFLITE_ARCH}-${TFLITE_VERSION}.tar.gz
LIBTFLITE_ARCHIVE=libtensorflowlite-${TFLITE_OS}-${TFLITE_PLATFORM}-${TFLITE_ARCH}-${TFLITE_VERSION}.tar.gz

[[ ! -f $LIBTFLITE_ARCHIVE || $FORCE == 1 ]] && wget -q $LIBTF_URL_BASE/$LIBTFLITE_ARCHIVE
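
As a worked example of the new naming scheme, a Linux CUDA x86_64 build resolves the variables above to the following fetch:

```sh
# OS=linux, GPU=1, ARCH=x64, TFLITE_VERSION=2.4.1 yields:
wget -q https://s3.amazonaws.com/redismodules/tensorflow/libtensorflowlite-linux-cuda-x86_64-2.4.1.tar.gz
```
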

3 changes: 2 additions & 1 deletion opt/Makefile
@@ -135,7 +135,8 @@ CMAKE_FLAGS += \
-DUSE_COVERAGE=$(USE_COVERAGE) \
-DUSE_PROFILE=$(USE_PROFILE) \
-DREDISAI_GIT_SHA=\"$(GIT_SHA)\" \
-DDEVICE=$(DEVICE)
-DDEVICE=$(DEVICE) \
-DMACH=$(shell uname -m)

ifeq ($(WITH_TF),0)
CMAKE_FLAGS += -DBUILD_TF=off
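
For reference, the value handed to the new `-DMACH` flag is just the machine hardware name; on a typical PC host:

```sh
uname -m    # prints: x86_64
# CMake will then probe /usr/lib/x86_64-linux-gnu for the GL and EGL libraries.
```
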
49 changes: 27 additions & 22 deletions opt/build/tflite/Dockerfile.x64
100755 → 100644
@@ -1,29 +1,34 @@
ARG BAZEL_VERSION=3.1.0
ARG TFLITE_ARCH=x86_64

ARG OS=debian:buster
ARG OS=redisfab/ubuntu1804-${TFLITE_ARCH}-bazel${BAZEL_VERSION}

ARG FTLITE_VER=2.0.0
# cuda | cpu
ARG REDISAI_PLATFORM=cuda

ARG TFLITE_VERSION=2.4.0

#----------------------------------------------------------------------------------------------
FROM ${OS}

ARG FTLITE_VER

WORKDIR /build

RUN set -e ;\
apt-get -qq update ;\
apt-get -q install -y git ca-certificates curl wget unzip python3 ;\
apt-get -q install -y git build-essential zlib1g-dev

RUN git clone --single-branch --branch v${FTLITE_VER} --depth 1 https://github.com/tensorflow/tensorflow.git

ADD ./opt/build/tflite/build /build/
ADD ./opt/readies/ /build/readies/
ADD ./opt/build/tflite/collect.py /build/

RUN set -e ;\
cd tensorflow/tensorflow/lite/tools/make ;\
./download_dependencies.sh ;\
./build_lib.sh

RUN ./collect.py --version ${FTLITE_VER} --dest /build/dest
ARG TFLITE_VERSION
ARG TFLITE_ARCH
ARG REDISAI_PLATFORM

ADD ./opt/build/tflite /tflite

RUN apt-get -qq update && apt-get install -yqq python3
RUN apt-get install -qqy git \
unzip \
wget \
curl \
build-essential \
zlib1g-dev \
libegl1-mesa-dev \
libgles2-mesa-dev \
python3-distutils \
python3-numpy
RUN ln -s /usr/bin/python3 /usr/bin/python
WORKDIR /tflite
RUN bash build.sh ${TFLITE_ARCH} ${TFLITE_VERSION} ${REDISAI_PLATFORM}
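
Putting the new arguments together, the image could be built from the repository root like so (a sketch; the tag is hypothetical, and omitted args fall back to the defaults declared above):

```sh
# The build context must be the repo root so ADD ./opt/build/tflite resolves.
docker build -f opt/build/tflite/Dockerfile.x64 \
    --build-arg TFLITE_VERSION=2.4.0 \
    --build-arg TFLITE_ARCH=x86_64 \
    --build-arg REDISAI_PLATFORM=cpu \
    -t redisai-tflite-x64 .
```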