discoverable axes index lookup #14

Open · wants to merge 6 commits into master
72 changes: 72 additions & 0 deletions doc/axes.md
@@ -0,0 +1,72 @@
# Axes lookup data structure

```
\
├── ".axes"
│ ├── "some2DVector"
│ │ ├── attributes.json {"dataType":"int32","compression":{"type":"raw"},"blockSize":[2,2],"dimensions":[2,2]}
│ ├── "a3DVector"
│ │ ├── attributes.json {"dataType":"int32","compression":{"type":"raw"},"blockSize":[2,2,2],"dimensions":[2,2,2]}
│ ┊
├── ...
```
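As an illustration only (not part of this PR's diff), such a lookup dataset could be written with the plain N5 core API along the following lines. The container path is a placeholder, and the block contents follow the 2D F-order example explained further down:

```java
import org.janelia.saalfeldlab.n5.DataType;
import org.janelia.saalfeldlab.n5.DatasetAttributes;
import org.janelia.saalfeldlab.n5.IntArrayDataBlock;
import org.janelia.saalfeldlab.n5.N5FSWriter;
import org.janelia.saalfeldlab.n5.N5Writer;
import org.janelia.saalfeldlab.n5.RawCompression;

public class CreateAxesLookupExample {

	public static void main(final String... args) throws Exception {

		final N5Writer n5 = new N5FSWriter("/tmp/example.n5"); // placeholder container

		// a single 2x2 block, int32, raw compression, matching the attributes.json above
		final String path = ".axes/some2DVector";
		n5.createDataset(path, new long[]{2, 2}, new int[]{2, 2}, DataType.INT32, new RawCompression());

		// lookup values of the 2D F-order example explained below: -1 0 1 -1
		final DatasetAttributes attributes = n5.getDatasetAttributes(path);
		n5.writeBlock(path, attributes, new IntArrayDataBlock(new int[]{2, 2}, new long[]{0, 0}, new int[]{-1, 0, 1, -1}));
	}
}
```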
Collaborator:

I don't really like that .axes is in the root of the container. This makes datasets less portable (i.e. I can't just copy the dataset itself; I also have to remember to copy the corresponding dataset in .axes). Also, what if a dataset with the same name already exists in .axes of the target container?

We could also store .axes inside the actual dataset, e.g.

├── "dataset"
│   ├── "attributes.json"
│   ├── ".axes"
│   │   ├── "attributes.json"

That would make datasets more portable and you would not need any global container information to copy a dataset.

Contributor Author:

Then we cannot use the dataset loader to resolve the ordering, because there cannot be a dataset inside a dataset (e.g. in HDF5). I agree that it makes portability harder, but that is only an issue if this feature is used, which it currently isn't.


Each "vector" dataset is a 2^n size single block dataset that enumerates the axes indices. The 2D axes lookup for ImgLib2 vectors (F-order) looks like this:

```
-1 0 -> -1 0 1 -1
1 -1
```
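A sketch of how such a block can be generated, inferred from the examples in this document rather than taken from the PR's code: fill a flat array of 2^n entries with -1 and store the named index of each axis at the F-order offset of the unit vector along that axis, i.e. at offset 2^d for axis d.

```java
import java.util.Arrays;

// Flat contents of a 2^n single-block axes lookup for a given axis permutation,
// flattened in F-order (first axis fastest) as in the example above.
// axesLookupBlock(new int[]{0, 1})    yields {-1, 0, 1, -1}
// axesLookupBlock(new int[]{2, 1, 0}) yields the 3D numpy lookup shown further down
static int[] axesLookupBlock(final int[] axes) {

	final int n = axes.length;
	final int[] block = new int[1 << n]; // 2^n entries
	Arrays.fill(block, -1);              // every non-unit position is -1
	for (int d = 0; d < n; ++d)
		block[1 << d] = axes[d];         // the unit vector along axis d has F-order offset 2^d
	return block;
}
```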
Collaborator:

Is there a more detailed specification of what

-1  0
 1 -1

and

-1 0 1 -1

mean? I have a hard time understanding how they are generated.

Collaborator:

Looking at the source code helps a lot (thanks @igorpisarev for pointing out the correct names), so maybe doc/axes.md could be more specific about what exactly is happening to create tensors like

-1  0
 1 -1

Contributor Author:

Agree, but I currently have no better idea. Will work on it.


The 3D axes lookup for numpy vectors (C-order) looks like this:

```
-1 2 -> -1 2 1 -1 0 -1 -1 -1
1 -1

0 -1
-1 -1
```

A tensor- and vector-aware API can then load the index lookup as a tensor and, for each of its axes, read the value at the first positive coordinate along that axis, using the API's own axis indexing scheme.

This creates a named permutation for vectors.
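A hedged sketch of that read side, using ImgLib2 types as an illustration rather than the method names of this PR: for each axis, step to coordinate 1 along that axis and read the stored index.

```java
import net.imglib2.RandomAccess;
import net.imglib2.RandomAccessibleInterval;
import net.imglib2.type.numeric.integer.IntType;

// Derive the named permutation from a loaded lookup tensor (assumed zero-min,
// e.g. as returned by N5Utils.open): the value at the position that is 1 along
// axis d and 0 elsewhere is the named index of axis d.
static int[] axesPermutation(final RandomAccessibleInterval<IntType> lookup) {

	final int n = lookup.numDimensions();
	final int[] axes = new int[n];
	final RandomAccess<IntType> access = lookup.randomAccess();
	for (int d = 0; d < n; ++d) {
		access.setPosition(new long[n]); // back to the origin
		access.setPosition(1, d);        // first positive coordinate along axis d
		axes[d] = access.get().getInteger();
	}
	return axes;
}
```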

The ImgLib2 bindings offer API methods to create and use such lookups. Naturally, the mapping is defined as coming from ImgLib2 F-order:

Collaborator:

I could not find createAxes or any of the readVectorAttribute/writeVectorAttribute methods in n5-imglib2. Are they supposed to be part of n5-imglib2 or do they live in n5?

$ ag 'createAxes'
doc/axes.md
40:createAxes(
$ ag '(read|write).*Vect'
doc/axes.md
44:readDoubleVectorAttribute(
49:writeDoubleVectorAttribute(
55:readLongVectorAttribute(
60:writeLongVectorAttribute(

Contributor:

It seems that the actual naming is a bit different; they are called setAxes(), getAxes(), getDoubleVector(), and setVector() in the code.

Collaborator:

Thanks @igorpisarev

Contributor Author:

Point taken.

```java
createAxes(
		String axesName,
		int[] axes);

readDoubleVectorAttribute(
		N5Reader n5,
		String groupName,
		String attributeName,
		int[] axes);

writeDoubleVectorAttribute(
		double[] vector,
		N5Writer n5,
		String groupName,
		String attributeName,
		int[] axes);

readLongVectorAttribute(
		N5Reader n5,
		String groupName,
		String attributeName,
		int[] axes);

writeLongVectorAttribute(
		long[] vector,
		N5Writer n5,
		String groupName,
		String attributeName,
		int[] axes);
```
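For illustration only, a sketch of my own of what such a vector-attribute read could amount to. The helper name mirrors the listing above (the review comments note that the actual code uses different names such as getDoubleVector), and the direction in which the permutation is applied is an assumption:

```java
import java.io.IOException;

import org.janelia.saalfeldlab.n5.N5Reader;

// Read the raw attribute, then reorder its components with the named
// permutation obtained from the axes lookup (assumed direction: the component
// of local axis d comes from named index axes[d]).
static double[] readDoubleVectorAttribute(
		final N5Reader n5,
		final String groupName,
		final String attributeName,
		final int[] axes) throws IOException {

	final double[] stored = n5.getAttribute(groupName, attributeName, double[].class);
	final double[] vector = new double[stored.length];
	for (int d = 0; d < axes.length; ++d)
		vector[d] = stored[axes[d]];
	return vector;
}
```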






13 changes: 5 additions & 8 deletions pom.xml
@@ -7,13 +7,13 @@
<parent>
<groupId>org.scijava</groupId>
<artifactId>pom-scijava</artifactId>
<version>26.0.0</version>
<version>27.0.1</version>
<relativePath />
</parent>

<groupId>org.janelia.saalfeldlab</groupId>
<artifactId>n5-imglib2</artifactId>
<version>3.4.2-SNAPSHOT</version>
<version>3.5.0-SNAPSHOT</version>

<name>N5 ImgLib2 Bindings</name>
<description>Open N5 datasets as ImgLib2 RandomAccessibles, and write ImgLib2 RandomAccessibles as/ into N5 datasets.</description>
@@ -58,12 +58,9 @@
<package-name>org.janelia.saalfeldlab</package-name>

<license.licenseName>bsd_2</license.licenseName>
<license.projectName>N5 Cache Loader</license.projectName>
<license.projectName>N5 ImgLib2</license.projectName>
<license.organizationName>Saalfeld Lab</license.organizationName>
<license.copyrightOwners>Philipp Hanslovsky, Stephan Saalfeld</license.copyrightOwners>

<!-- ImgLib2 RealTransform - https://github.com/imglib/imglib2-realtransform -->
<imglib2-realtransform.version>2.2.0</imglib2-realtransform.version>
</properties>
<developers>
<developer>
@@ -132,13 +129,13 @@
<dependency>
<groupId>org.janelia.saalfeldlab</groupId>
<artifactId>n5</artifactId>
<version>2.1.0</version>
<version>2.1.3</version>
</dependency>

<dependency>
<groupId>net.imglib2</groupId>
<artifactId>imglib2-label-multisets</artifactId>
<version>0.8.1</version>
<version>0.9.0</version>
</dependency>

<dependency>
4 changes: 2 additions & 2 deletions scripts/open-and-show.bsh
@@ -2,8 +2,8 @@ import org.janelia.saalfeldlab.n5.*;
import org.janelia.saalfeldlab.n5.imglib2.*;
import net.imglib2.img.display.imagej.*;
n5 = new N5FSReader ( "/nrs/saalfeld/lauritzen/02/workspace.n5" );
pre = N5Utils.open(new N5FSReader("/nrs/saalfeld/lauritzen/02/workspace.n5"), "/predictions_it150000_pre_and_post-v0.1/pre_dist/s3");
pre = N5.open(new N5FSReader("/nrs/saalfeld/lauritzen/02/workspace.n5"), "/predictions_it150000_pre_and_post-v0.1/pre_dist/s3");
ImageJFunctions.show(pre);
img = N5Utils.open(new N5FSReader("/nrs/saalfeld/lauritzen/02/workspace.n5"), "/predictions_it150000_pre_and_post-v0.1/post_dist/s3");
img = N5.open(new N5FSReader("/nrs/saalfeld/lauritzen/02/workspace.n5"), "/predictions_it150000_pre_and_post-v0.1/post_dist/s3");
ImageJFunctions.show(img);
