
Conversation


@zycz zycz commented Dec 19, 2025

applications: gesture_recognition: Add gesture_recognition application

Add gesture_recognition application as it is in:
https://github.com/Neuton-tinyML/neuton-nordic-thingy53-ble-remotecontrol
on commit: b2063f2ae42fa8aead6b3c7df477b3cda7caac29

Application was developed by Raman Rusak.

zycz added 4 commits December 19, 2025 14:48
Add gesture_recognition application as it is in:
 https://github.com/Neuton-tinyML/neuton-nordic-thingy53-ble-remotecontrol
 on commit: b2063f2ae42fa8aead6b3c7df477b3cda7caac29

 Application was developed by Raman Rusak.

JIRA: NCSDK-36990

Signed-off-by: Jan Zyczkowski <[email protected]>
Fix application for NCS 3.2.0.
Remove duplicated USB stack.
Leave only USB_DEVICE_STACK_NEXT, which is enabled
for thingy53 by default.

JIRA: NCSDK-36990

Signed-off-by: Jan Zyczkowski [email protected]
JIRA: NCSDK-36990

Signed-off-by: Jan Zyczkowski [email protected]
Use the EdgeAI API and library from the repository instead of a copy
 kept in the application directory.
 Remove the copy of the API and library from the application directory.

JIRA: NCSDK-36990

Signed-off-by: Jan Zyczkowski [email protected]
Author

zycz commented Dec 19, 2025

I still have several issues which need to be discussed:

  • What license should the gesture recognition application have?
  • Should this repo and the gesture recognition application meet the NCS coding style standards?
  • When adding the gesture recognition application, should the full commit history be kept, or is it OK to push the sample as a single commit?
  • Should all DKs use physical sensors even if there is no existing shield that can support them? In that case, it would require us and future customers to connect sensors to the DKs with cables. Plans for supported DKs: nRF54L15, nRF54H20, nRF54LM20, Thingy53

@@ -0,0 +1,24 @@
Standard: Cpp11
Member

please use .clang-format from root of this repository
https://github.com/nrfconnect/sdk-edge-ai/blob/main/.clang-format


![nordic-thingy-kit-img](resources/nordic_thingy.jpg)

## Setup Software Environment <div id='setup-sw-env'/>
Member

I think that since this application is now part of the add-on, the Setup Software Environment and Setup Firmware Project sections should be reworked to match the add-on documentation.

CONFIG_BT_SMP=y
CONFIG_BT_L2CAP_TX_BUF_COUNT=5
CONFIG_BT_PERIPHERAL=y
CONFIG_BT_DEVICE_NAME="Neuton NRF RemoteControl"
Member

change CONFIG_BT_DEVICE_NAME to something like "nRF Edge AI Remote Control", or please consult with @wbober for a proper name
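A possible prj.conf fragment, with the device name used only as a placeholder until the naming is agreed:

```
# Placeholder name, pending confirmation from @wbober
CONFIG_BT_DEVICE_NAME="nRF Edge AI Remote Control"
```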

@@ -0,0 +1,16 @@
sample:
description: Hello World sample, the simplest Zephyr
Member

Should match the provided application description :) I believe it is currently the Hello World template
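A rough sketch of what the metadata could look like once it describes this application (name and wording are only suggestions):

```yaml
sample:
  name: Gesture recognition
  description: Gesture recognition application using the Edge AI library,
    acting as a Bluetooth LE remote control
```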

@@ -0,0 +1,39 @@
/* 2023-06-09T11:22:25Z */

/* ----------------------------------------------------------------------
Member

please change this copyright according to the add-on license and the file tag format
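For illustration, the usual NCS-style file header looks like the block below; the exact SPDX tag depends on the license question raised in the discussion, so treat it as an assumption:

```c
/*
 * Copyright (c) 2025 Nordic Semiconductor ASA
 *
 * SPDX-License-Identifier: LicenseRef-Nordic-5-Clause
 */
```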

Member

rrusak commented Dec 23, 2025

> I still have several issues which need to be discussed:
>
>   • What license should the gesture recognition application have?
>   • Should this repo and the gesture recognition application meet the NCS coding style standards?
>   • When adding the gesture recognition application, should the full commit history be kept, or is it OK to push the sample as a single commit?
>   • Should all DKs use physical sensors even if there is no existing shield that can support them? In that case, it would require us and future customers to connect sensors to the DKs with cables. Plans for supported DKs: nRF54L15, nRF54H20, nRF54LM20, Thingy53

My point of view:

  • The license should be the same as the samples
  • Yes, I think we should follow the same style
  • I don't think it's worth keeping the entire commit history from the public repository, since some of the commits predate the Neuton acquisition and rebranding, and there is history that shouldn't be exposed
  • I want to clarify that the model was trained on this board with this sensor; it depends on the physical orientation of the sensor in space, and using this model on other boards will require adapting the sensor axes and sensor settings for the model to work.
    It is also possible that due to the different form factors of the boards, the gestures a person makes may differ in their trajectory and pattern, and model recognition may be worse.

Author

zycz commented Dec 23, 2025

> > I still have several issues which need to be discussed:
> >
> >   • What license should the gesture recognition application have?
> >   • Should this repo and the gesture recognition application meet the NCS coding style standards?
> >   • When adding the gesture recognition application, should the full commit history be kept, or is it OK to push the sample as a single commit?
> >   • Should all DKs use physical sensors even if there is no existing shield that can support them? In that case, it would require us and future customers to connect sensors to the DKs with cables. Plans for supported DKs: nRF54L15, nRF54H20, nRF54LM20, Thingy53
>
> My point of view:
>
>   • The license should be the same as the samples
>   • Yes, I think we should follow the same style
>   • I don't think it's worth keeping the entire commit history from the public repository, since some of the commits predate the Neuton acquisition and rebranding, and there is history that shouldn't be exposed
>   • I want to clarify that the model was trained on this board with this sensor; it depends on the physical orientation of the sensor in space, and using this model on other boards will require adapting the sensor axes and sensor settings for the model to work.
>     It is also possible that due to the different form factors of the boards, the gestures a person makes may differ in their trajectory and pattern, and model recognition may be worse.

Thanks for your input.

You are right that different hardware might not work correctly with a model trained on data acquired from the Thingy53. Does that mean we should provide simulated data for all the other DKs? An alternative I see would be to have a different model for each hardware shape (I guess there is not much difference between the DKs).

I'm trying to get the app working on the nRF54H20 with a shield. I guess once I manage to do that, we will see how big the issue is and can then decide how to proceed.

#include <zephyr/bluetooth/uuid.h>
#include <zephyr/settings/settings.h>

//////////////////////////////////////////////////////////////////////////////
Member

This style of sectioning is not used in the nRF Connect SDK


static void input_ccc_changed(const struct bt_gatt_attr* attr, uint16_t value)
{
printk("Input CCCD %s\n", value == BT_GATT_CCC_NOTIFY ? "enabled" : "disabled");
Member

Use the logging subsystem (LOG) instead of printk
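A minimal sketch of the change, assuming a hypothetical module name `hids` and that CONFIG_LOG is enabled in the project configuration:

```c
#include <zephyr/bluetooth/gatt.h>
#include <zephyr/logging/log.h>

/* Register this file as a logging module; name and level are placeholders */
LOG_MODULE_REGISTER(hids, LOG_LEVEL_INF);

static void input_ccc_changed(const struct bt_gatt_attr *attr, uint16_t value)
{
	/* LOG_INF goes through the logging subsystem instead of printk */
	LOG_INF("Input CCCD %s", value == BT_GATT_CCC_NOTIFY ? "enabled" : "disabled");
}
```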

#define KEY_MEDIA_PREV_TRACK ( 1 << 4 )
#define KEY_MEDIA_NEXT_TRACK ( 1 << 5 )

#if CONFIG_SAMPLE_BT_USE_AUTHENTICATION
Member

This CONFIG is not present in this sample
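If the guard is meant to stay, the symbol could be declared in the application's own Kconfig file, roughly like this (the name is taken from the guard; the prompt text and help are assumptions); otherwise the #if block can simply be dropped:

```
config SAMPLE_BT_USE_AUTHENTICATION
	bool "Use authenticated pairing for the Bluetooth HID service"
	help
	  When enabled, the sample requires authenticated pairing before
	  HID reports can be sent over Bluetooth LE.
```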

#define KEY_ARROW_LEFT (0x50)
#define KEY_ARROW_RIGHT (0x4F)
#define KEY_F5 (0x3E)
#define KEY_ESP (0x29)
Member

Is this a typo? Should it be ESC?
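For reference, 0x29 is the HID usage ID for Keyboard ESCAPE in the HID Usage Tables, so the define presumably should read:

```c
#define KEY_ESC (0x29)
```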

Member

Do we need a BSP? We target only Nordic chips

[CLASS_LABEL_ROTATION_LEFT] = "ROTATION LEFT"
};

static const uint8_t LABELS_CNT = sizeof(LABEL_VS_NAME) / sizeof(LABEL_VS_NAME[0]);
Member

ARRAY_SIZE, same below
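A sketch of the suggested change, using the helper from zephyr/sys/util.h (array name as in the snippet above):

```c
#include <zephyr/sys/util.h>

static const uint8_t LABELS_CNT = ARRAY_SIZE(LABEL_VS_NAME);
```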


//////////////////////////////////////////////////////////////////////////////

static const char* get_name_by_target_(uint8_t predicted_target)
Member

I think these functions can be placed where they are declared; there is no need to split declaration from definition.

@@ -0,0 +1,22 @@
#
# Copyright (c) 2024 Nordic Semiconductor
Member

correct year in every file

cmake_minimum_required(VERSION 3.20.0)

find_package(Zephyr REQUIRED HINTS $ENV{ZEPHYR_BASE})
project(nrf_edgeai_thingy53)
Member

gesture_recognition
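That is, a one-line change in CMakeLists.txt:

```cmake
project(gesture_recognition)
```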

@@ -0,0 +1,17 @@
// To get started, press Ctrl+Space to bring up the completion menu and view the available nodes.
Member

This comment is not needed; the file should be in the boards directory.
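For reference, the Zephyr build system picks up board-specific overlays automatically when they live under boards/ and are named after the board target; a possible layout for this application (board names are assumptions based on the DK plans mentioned above):

```
applications/gesture_recognition/
├── app.overlay                           # common overlay, if one is needed
└── boards/
    ├── thingy53_nrf5340_cpuapp.overlay
    └── nrf54h20dk_nrf54h20_cpuapp.overlay
```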
