milesAraya edited this page Mar 10, 2025 · 15 revisions

Frequently Asked Questions

This section summarises fixes for installation and runtime errors that have been submitted as tickets or found by us or by users. If you find a problem, please create an Issue and we will try to get back to you as soon as possible.

Installation

Docker

Native

Mac

Windows

  • Situation: Installing in developer mode with Poetry; building the wheel for datrie (pyproject.toml) does not run successfully.
  • Error log:

Failed to build datrie

ERROR: Failed to build one or more wheels

  • Cause: On Windows, the datrie package requires Microsoft Visual C++ 14.0 or greater to build.
  • Solution: Go to the Microsoft Visual C++ Build Tools page and install the Build Tools. During installation, make sure to select the "Desktop development with C++" workload.

Linux

Workflow

Question: Why does running OptiNiSt take so long the first time? After that, it is much faster.

  • Answer: The first time a node runs, a new conda environment is created for it. For CaImAn or Suite2p this takes a long time, as these toolboxes are installed from scratch. On subsequent runs the cached conda environment is reused, so workflows run much faster.

Error: func.py died with Signals.SIGKILL: 9

  • Situation: When using Docker, CaImAn is executed in a workflow (may occur with other nodes too)
  • Error log: python /app/.snakemake/scripts/xxxxxxxxxxx.func.py' died with <Signals.SIGKILL: 9>.
  • Cause: Probably the workflow execution process ran out of memory.
  • Related: Issue 344
  • Solution: Increase the memory available to Docker (Docker Desktop → Settings → Resources → Memory), or reduce the size of the input data.
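
If you launch OptiNiSt via docker-compose, a per-container memory limit can also be raised. The fragment below is only a sketch: the service name `studio` is an assumption, so match it to the service actually defined in your compose file.

```yaml
# Hypothetical docker-compose override raising the container memory limit.
# "studio" is a placeholder service name -- use the one in your compose file.
services:
  studio:
    deploy:
      resources:
        limits:
          memory: 16g
```

Note that Docker Desktop's global memory allowance (Settings → Resources) still caps whatever the compose file requests.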

Error: pickle data was truncated

  • Situation: When a workflow is executed whose node output data (.pkl) is very large (e.g., a large image processed with CaImAn)
  • Error log:

File "/app/studio/app/common/core/utils/pickle_handler.py", line 13, in read

return pickle.load(f)

_pickle.UnpicklingError: pickle data was truncated

  • Cause: When the size of the .pkl file exceeds several GB (possibly 4 GB), an exception is suspected to occur during the pickle read.
  • Related:
  • Solution: No solution at this time.
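
Although the root cause is not confirmed, one known contributor to truncated-pickle errors is that pickle protocols below 4 cannot frame a single payload larger than 4 GiB. A minimal sketch, assuming the result is written with the standard `pickle` module, is to pass an explicit `protocol=4` (or `pickle.HIGHEST_PROTOCOL`) when dumping:

```python
import io
import pickle

# Protocols < 4 cannot serialise a single byte payload over 4 GiB, which can
# surface on read as "pickle data was truncated". Protocol 4 (the default
# since Python 3.8) removes that limit. The dict below is a small stand-in
# for a large node result.
data = {"roi_traces": [float(i) for i in range(1000)]}

buf = io.BytesIO()
pickle.dump(data, buf, protocol=4)   # explicit protocol >= 4
buf.seek(0)
restored = pickle.load(buf)
print(restored == data)              # round-trip succeeds
```

This does not help if the file was truncated because the writing process was killed mid-write (e.g., by the SIGKILL/out-of-memory issue above), so check memory first.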

Error: func.py died with Signals.SIGSEGV: 11

  • Situation: With Docker, suite2p freezes
  • Error log:

set -euo pipefail; python /app/.snakemake/scripts/XXXXXXXXXX.func.py' died with <Signals.SIGSEGV: 11>.\n',

  • Cause: An incompatibility between Apple Silicon and certain functions. Rosetta mostly resolves this, but some issues remain.
  • Related: Issue 365
  • Solution:
    • Fixed by an update to use miniforge with channel_priority flexible. See here in the documentation
    • Turning Rosetta on or off in Docker may also help (in some cases turning it on worked, in others turning it off). In Docker Desktop, go to the Virtual Machine Options settings and toggle the Rosetta option.
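
A related workaround on Apple Silicon is to pin the container platform explicitly, so Docker does not mix native arm64 and emulated amd64 images; whether native or emulated behaves better varies by node. The compose fragment below is an assumption (placeholder service name `studio`), not part of the official setup:

```yaml
# Hypothetical fragment forcing x86_64 emulation on Apple Silicon;
# remove it to run the native arm64 image instead.
services:
  studio:
    platform: linux/amd64
```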

Error: LCCD fails with short videos

  • Situation: LCCD is not robust for videos with fewer than 300 time points.
  • Error log:
  • Cause:
  • Related: Issue 408
  • Solution:

Error: OptiNiSt does not currently accept 2-channel data correctly

  • Situation:
  • Error log:
  • Cause:
  • Related: Issue 411
  • Solution:

Error: snakemake.exceptions.LockException

  • Situation: This occurs when Snakemake detects that another process might be using the same working directory, e.g. when selecting RUN while processing is still ongoing.
  • Error log:

ERROR - smk_status_logger.py

raise snakemake.exceptions.LockException()
snakemake.exceptions.LockException: Error: Directory cannot be locked. Please make sure that no other Snakemake process is trying to create the same files in the following directory:
/app
If you are sure that no other instances of snakemake are running on this directory, the remaining lock was likely caused by a kill signal or a power loss. It can be removed with the --unlock argument.

  • Cause: Multiple Snakemake processes using the same working directory.
  • Related:
  • Solution: Wait for Snakemake to finish processing and saving data, and/or create a new workflow and RUN ALL.
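
As the error message itself notes, a lock left behind by a killed process can be cleared with `snakemake --unlock`, run in the working directory (/app in the Docker setup). The sketch below simulates what that does on a throwaway directory, since Snakemake keeps its lock files under `.snakemake/locks`; on a real installation prefer running `snakemake --unlock` itself, and only after confirming no workflow is executing.

```shell
# Simulation on a scratch directory; for OptiNiSt, replace WORKDIR with the
# real workflow directory (/app in Docker) and use `snakemake --unlock`.
WORKDIR=$(mktemp -d)
mkdir -p "$WORKDIR/.snakemake/locks"
touch "$WORKDIR/.snakemake/locks/0.input.lock"   # stale lock left by a crash
rm -rf "$WORKDIR/.snakemake/locks"               # what --unlock cleans up
[ ! -d "$WORKDIR/.snakemake/locks" ] && echo "locks cleared"
```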

Error: No Snakemake process found.

  • Situation: Importing a workflow and selecting RUN
  • Error log: No Snakemake process found.
  • Cause: The selected workflow has no processed data. RUN checks the Snakemake workflow record to determine whether any parameters have changed; an imported workflow has no file to check, causing this error.
  • Solution: Select RUN ALL to rerun analysis.

Visualize

Record

Adding custom nodes

Multi-user mode