Allow renaming datasets & dataset with duplicate names #8075

Open
wants to merge 110 commits into base: master
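
At its core, this PR switches webknossos from addressing a dataset by the pair (organization, dataset name) to addressing it by its ObjectId, which is what makes renaming and duplicate display names possible. The following is a minimal sketch of that lookup pattern, using hypothetical stand-in types modeled on the controller diffs further down; the real code validates the id and resolves the dataset inside webknossos' Fox pipeline rather than a plain Future.

```scala
import scala.concurrent.Future

// Sketch only: minimal stand-ins for com.scalableminds.util.objectid.ObjectId and the
// dataset DAO, showing the lookup pattern the controller diffs below switch to.
object DatasetAddressingSketch {

  final case class ObjectId(id: String)
  object ObjectId {
    // Hypothetical simplification of ObjectId.fromString: accept 24-character lowercase hex strings.
    def fromString(s: String): Option[ObjectId] =
      Option.when(s.length == 24 && s.forall(c => c.isDigit || ('a' to 'f').contains(c)))(ObjectId(s))
  }

  final case class Dataset(_id: ObjectId, name: String, directoryName: String)

  trait DatasetDAO {
    // Lookup by id replaces findOneByNameAndOrganization(datasetName, organizationId).
    def findOne(id: ObjectId): Future[Option[Dataset]]
  }

  def resolveDataset(datasetId: String, dao: DatasetDAO): Future[Option[Dataset]] =
    ObjectId.fromString(datasetId) match {
      case Some(validated) => dao.findOne(validated)
      case None            => Future.successful(None)
    }
}
```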

Commits (110)
5ffa251
WIP: Adjust schema to allow duplicate dataset names & implement new u…
MichaelBuessemeyer Sep 12, 2024
d34bdeb
reimplement proper dataset name checking route (still keep leave away…
MichaelBuessemeyer Sep 16, 2024
49432a9
WIP: implement wk core backend routes to only use datasetId and no or…
MichaelBuessemeyer Sep 17, 2024
5d3f8bd
WIP: finish using dataset id in wk core backend and dataspath in data…
MichaelBuessemeyer Sep 18, 2024
0c111cb
WIP: Fix backend compilation
MichaelBuessemeyer Sep 18, 2024
d8ad983
Fix backend compilation
Sep 19, 2024
10724e7
WIP: Adapt frontend to new api
Sep 19, 2024
1a8349a
WIP: adapt frontend to new routes
Sep 20, 2024
6b54a7b
WIP: Adjust frontend to newest api
Sep 20, 2024
394c023
first kinda working version
Sep 20, 2024
62bbd75
Try update schema and evolution
Sep 25, 2024
f74a7f1
fix evolution & add first version of reversion (needs to be tested)
Sep 25, 2024
3a203a8
fix frontend tests
Sep 25, 2024
a65c7fb
format backend
Sep 30, 2024
2234b5b
fix dataSets.csv
Oct 1, 2024
3570545
fix e2e tests
Oct 2, 2024
cbf4470
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 2, 2024
50e918a
format backend
Oct 2, 2024
90990b5
fix frontend
Oct 2, 2024
1b0aec3
remove occurences of displayName access / variables in context of a d…
Oct 7, 2024
7a79ea4
fixed verion routes
Oct 7, 2024
5443912
fix reserveUploadRoute
Oct 7, 2024
8224f6a
rename orga_name in jobs to orga_id
Oct 8, 2024
35d62a9
format code
Oct 8, 2024
1b6ff5c
fix finishUploadRoute
Oct 8, 2024
e320a6c
allow duplicate names when uploading a new dataset
Oct 8, 2024
72715f1
fix job list view
Oct 8, 2024
d5a7fdc
fix some datastore requests
Oct 9, 2024
7be52d4
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 9, 2024
24ddfd3
further minor fixes
Oct 9, 2024
503f114
make add remote dataset path a post request as it always creates a ne…
Oct 9, 2024
3b41f86
WIP: replace missed code parts where dataset address was still wrong …
Oct 9, 2024
e0ecf8b
WIP: replace missed code parts where dataset address was still wrong …
Oct 9, 2024
852df00
WIP: adapt annotation upload & task upload to use datasetId
Oct 10, 2024
0d6b2e7
WIP: adjust backend part of task upload to use new dataset addressing
Oct 11, 2024
f4c16e3
Finish adapting task & annotation upload to new format
Oct 11, 2024
d85fc5a
Fix inserting dataset into database
Oct 11, 2024
6c8663e
fix nml annotation upload
Oct 14, 2024
e9b7c25
format backend
Oct 14, 2024
486ae67
add hint about new parameter datasetId to csv / bulk task upload
Oct 14, 2024
dbcf67c
Move task api routes to a separate file in frontend
Oct 14, 2024
403515f
add datasetName and datasetId to returned tasks
Oct 14, 2024
a24c65c
add missing task api routes file (frontend)
Oct 14, 2024
7312d7e
adapt frontend to new task return type
Oct 14, 2024
6a4219e
remove unused imports
Oct 14, 2024
a0cc4fa
fix frontend tests
Oct 14, 2024
590e57b
add datasetId to nml output and readd datasetName to nml parsing for …
Oct 14, 2024
a9ed622
add dataset id to frontend nml serialization
Oct 14, 2024
b5aea43
fix parsing dataset id from nml in backend
Oct 14, 2024
fe87533
fix nml backend tests
Oct 14, 2024
cce67e1
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 14, 2024
1319266
fix typing
Oct 14, 2024
4dff08a
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 14, 2024
cbd88cd
remove logging statement
Oct 15, 2024
0bb457e
fix frontend dataset cache by using the dataset id as the identifier
Oct 15, 2024
3683359
send dataset path as datasource.id.name to frontend
Oct 15, 2024
0f9fd7d
remove unused code
Oct 15, 2024
7831aed
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 22, 2024
5d1dc48
fix pervious merge with newest master
Oct 23, 2024
1823c55
fix evolution and reversion
Oct 23, 2024
4641152
remove objectid from UploadedVolumeLayer and delete SkeletonTracingWi…
Oct 23, 2024
8266187
use new notion like urls
Oct 23, 2024
3fdf60b
rename datasetPath to datasetDirectoryName
Oct 24, 2024
66f8f82
fix backend tests
Oct 24, 2024
6c2d08a
delete DatasetURLParser, rename package of ObjectId to objectid, upda…
Oct 25, 2024
36d7384
small clean up, fix dataset public writes, fix dataset table highligh…
Oct 25, 2024
46dc2d8
fix e2e tests
Oct 25, 2024
4ac6c66
make datastore dataset update route http put method
Oct 29, 2024
3353e9d
make datastore dataset update route http put method
Oct 29, 2024
3a7fddb
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Oct 29, 2024
1f66225
rename datasetParsedId to datasetIdValidated
Oct 29, 2024
3b38320
bump schema version after merge
fm3 Oct 30, 2024
6bf65f3
Merge branch 'master' into allow-dataset-renaming
fm3 Nov 4, 2024
9a80688
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Nov 5, 2024
39af9c2
removeexplicit invalid dataset id message when parsing a datasetid fr…
Nov 5, 2024
beae252
remove overwriting orga name and overwriting dataset name from anntoa…
Nov 5, 2024
51177e1
WIP apply PR feedback
Nov 5, 2024
d97b59c
remove unused method
Nov 6, 2024
f911204
rely on datasetId in processing of taskcreation routes
Nov 6, 2024
44a438e
apply some more review feedback
Nov 6, 2024
0020c50
cleanup unused implicits
Nov 6, 2024
2aaea60
make link generation for convert_to_wkw and compute_mesh_file backwar…
Nov 6, 2024
c61a845
adjust unfinished uploads to display correct dataset name in upload view
Nov 6, 2024
050ae05
send datasource id to compose dataset route (not dataset id)
Nov 7, 2024
ef08f0e
Merge branch 'master' into allow-dataset-renaming
MichaelBuessemeyer Nov 7, 2024
bbb4091
Merge branch 'allow-dataset-renaming' of github.com:scalableminds/web…
Nov 7, 2024
891f90e
WIP apply review feedback
Nov 8, 2024
fb88b21
Finish refactoring nml backend parsing
Nov 11, 2024
78e39a6
ifx nml typing
Nov 11, 2024
17f26b5
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Nov 11, 2024
a2e13af
fix nml upload
Nov 12, 2024
0368b45
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Nov 12, 2024
d90ab6c
apply frontend pr review feedback
Nov 13, 2024
3c95536
apply pr frontend feedback
Nov 14, 2024
519f7b9
add new e2e test to check dataset disambiguation
Nov 14, 2024
50331f7
re-add backwards compatibility for legacy dataset links without organ…
Nov 14, 2024
9fe0f5b
change screenshot test dataset id retrieval to be a test.before
Nov 14, 2024
4c8319d
remove outdated comment
Nov 14, 2024
3d6a5e1
Merge branch 'master' of github.com:scalableminds/webknossos into all…
Nov 14, 2024
ae41a2f
temp disable upload test
Nov 14, 2024
fd04148
fix linting
Nov 14, 2024
aa8a897
fix datasetId expected length
Nov 14, 2024
fc546d2
replace failure fox by returnError fox, fix json error msg
fm3 Nov 14, 2024
818878b
fix upload test
Nov 15, 2024
e3f80fa
remove debug logging from dataset upload test
Nov 15, 2024
1d5ad73
remove debug logs
Nov 15, 2024
b7bf68e
Merge branch 'master' into allow-dataset-renaming
MichaelBuessemeyer Nov 15, 2024
17239f0
format backend
Nov 15, 2024
8fd8979
apply pr various pr feedback
Nov 15, 2024
b5fce21
Merge branch 'allow-dataset-renaming' of github.com:scalableminds/web…
Nov 15, 2024
1 change: 1 addition & 0 deletions MIGRATIONS.unreleased.md
@@ -9,3 +9,4 @@ User-facing changes are documented in the [changelog](CHANGELOG.released.md).
[Commits](https://github.com/scalableminds/webknossos/compare/24.11.1...HEAD)

### Postgres Evolutions:
- [124-decouple-dataset-directory-from-name](conf/evolutions/124-decouple-dataset-directory-from-name)
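
The referenced evolution is what decouples the user-facing dataset name from the directory on disk. A rough illustration of the resulting record shape follows; apart from `directoryName` (introduced by the commit "rename datasetPath to datasetDirectoryName"), the field names and constraints here are assumptions, not the actual schema.

```scala
// Illustrative sketch only, not the actual webknossos schema or model classes.
// The point of evolution 124: `name` becomes a renamable display name that may be
// duplicated, while `directoryName` (formerly the dataset name/path) keeps identifying
// the folder on the datastore and is unaffected by renames.
final case class DatasetRowSketch(
    _id: String,            // ObjectId hex string; the canonical address used by the new routes
    _organization: String,
    name: String,           // display name: renamable, duplicates allowed
    directoryName: String   // folder name on disk: fixed once the dataset exists
)
```
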
11 changes: 5 additions & 6 deletions app/controllers/AiModelController.scala
@@ -11,7 +11,7 @@ import play.api.libs.json.{Json, OFormat}
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
import play.silhouette.api.Silhouette
import security.WkEnv
import utils.ObjectId
import com.scalableminds.util.objectid.ObjectId

import javax.inject.Inject
import scala.concurrent.ExecutionContext
@@ -40,7 +40,7 @@ object RunTrainingParameters {

case class RunInferenceParameters(annotationId: Option[ObjectId],
aiModelId: ObjectId,
datasetName: String,
datasetDirectoryName: String,
colorLayerName: String,
boundingBox: String,
newDatasetName: String,
@@ -143,7 +143,7 @@ class AiModelController @Inject()(
jobCommand = JobCommand.train_model
commandArgs = Json.obj(
"training_annotations" -> Json.toJson(trainingAnnotations),
"organization_name" -> organization._id,
"organization_id" -> organization._id,
"model_id" -> modelId,
"custom_workflow_provided_by_user" -> request.body.workflowYaml
)
@@ -173,15 +173,14 @@
for {
_ <- userService.assertIsSuperUser(request.identity)
organization <- organizationDAO.findOne(request.identity._organization)
dataset <- datasetDAO.findOneByNameAndOrganization(request.body.datasetName, organization._id)
dataset <- datasetDAO.findOneByDirectoryNameAndOrganization(request.body.datasetDirectoryName, organization._id)
dataStore <- dataStoreDAO.findOneByName(dataset._dataStore) ?~> "dataStore.notFound"
_ <- aiModelDAO.findOne(request.body.aiModelId) ?~> "aiModel.notFound"
_ <- datasetService.assertValidDatasetName(request.body.newDatasetName)
_ <- datasetService.assertNewDatasetName(request.body.newDatasetName, organization._id)
jobCommand = JobCommand.infer_with_model
boundingBox <- BoundingBox.fromLiteral(request.body.boundingBox).toFox
commandArgs = Json.obj(
"organization_name" -> organization._id,
"organization_id" -> organization._id,
"dataset_name" -> dataset.name,
"color_layer_name" -> request.body.colorLayerName,
"bounding_box" -> boundingBox.toLiteral,
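
For callers of the inference endpoint, the visible change in the diff above is the request field rename from `datasetName` to `datasetDirectoryName`, plus the job argument switch from `organization_name` to `organization_id`. Below is a self-contained sketch of the new request shape; the class is a trimmed, hypothetical mirror of `RunInferenceParameters`, not the real one.

```scala
import play.api.libs.json.{Json, OFormat}

// Trimmed-down stand-in for RunInferenceParameters; the real class also carries
// ObjectId-typed fields such as annotationId and aiModelId.
final case class InferenceRequestSketch(datasetDirectoryName: String, // was `datasetName` before this PR
                                        colorLayerName: String,
                                        boundingBox: String,
                                        newDatasetName: String)

object InferenceRequestSketch {
  implicit val format: OFormat[InferenceRequestSketch] = Json.format[InferenceRequestSketch]
}

object InferenceRequestExample extends App {
  // The dataset is referenced by its on-disk directory name; `newDatasetName`
  // is the (renamable) display name for the dataset produced by the job.
  val body = Json.obj(
    "datasetDirectoryName" -> "my_dataset_directory",
    "colorLayerName" -> "color",
    "boundingBox" -> "0,0,0,512,512,512",
    "newDatasetName" -> "my_inferred_dataset"
  )
  println(body.validate[InferenceRequestSketch]) // JsSuccess(InferenceRequestSketch(...))
}
```
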
26 changes: 8 additions & 18 deletions app/controllers/AnnotationController.scala
@@ -4,6 +4,7 @@ import org.apache.pekko.util.Timeout
import play.silhouette.api.Silhouette
import com.scalableminds.util.accesscontext.{DBAccessContext, GlobalAccessContext}
import com.scalableminds.util.geometry.BoundingBox
import com.scalableminds.util.objectid.ObjectId
import com.scalableminds.util.time.Instant
import com.scalableminds.util.tools.{Fox, FoxImplicits}
import com.scalableminds.webknossos.datastore.models.annotation.AnnotationLayerType.AnnotationLayerType
@@ -34,7 +35,7 @@ import play.api.libs.json._
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
import security.{URLSharing, UserAwareRequestLogging, WkEnv}
import telemetry.SlackNotificationService
import utils.{ObjectId, WkConf}
import utils.WkConf

import javax.inject.Inject
import scala.concurrent.ExecutionContext
@@ -242,15 +243,11 @@ class AnnotationController @Inject()(
} yield result
}

def createExplorational(organizationId: String, datasetName: String): Action[List[AnnotationLayerParameters]] =
def createExplorational(datasetId: String): Action[List[AnnotationLayerParameters]] =
sil.SecuredAction.async(validateJson[List[AnnotationLayerParameters]]) { implicit request =>
for {
organization <- organizationDAO.findOne(organizationId)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organizationId) ~> NOT_FOUND
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
datasetIdValidated <- ObjectId.fromString(datasetId)
dataset <- datasetDAO.findOne(datasetIdValidated) ?~> Messages("dataset.notFound", datasetIdValidated) ~> NOT_FOUND
annotation <- annotationService.createExplorationalFor(
request.identity,
dataset._id,
@@ -262,19 +259,12 @@ class AnnotationController @Inject()(
} yield JsonOk(json)
}

def getSandbox(organization: String,
datasetName: String,
typ: String,
sharingToken: Option[String]): Action[AnyContent] =
def getSandbox(datasetId: String, typ: String, sharingToken: Option[String]): Action[AnyContent] =
sil.UserAwareAction.async { implicit request =>
val ctx = URLSharing.fallbackTokenAccessContext(sharingToken) // users with dataset sharing token may also get a sandbox annotation
for {
organization <- organizationDAO.findOne(organization)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
organization) ~> NOT_FOUND
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id)(ctx) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
datasetIdValidated <- ObjectId.fromString(datasetId)
dataset <- datasetDAO.findOne(datasetIdValidated)(ctx) ?~> Messages("dataset.notFound", datasetIdValidated) ~> NOT_FOUND
tracingType <- TracingType.fromString(typ).toFox
_ <- bool2Fox(tracingType == TracingType.skeleton) ?~> "annotation.sandbox.skeletonOnly"
annotation = Annotation(
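
The commit history also mentions re-added backwards compatibility for legacy dataset links and "notion-like" URLs. The sketch below is purely hypothetical and only illustrates the idea of redirecting an old name-based link to a new id-based address; the actual URL format and helpers in the PR may differ.

```scala
// Hypothetical sketch, not webknossos code: resolve a legacy, name-based dataset link and
// redirect it to an id-based URL in the "<name>-<id>" style the commit messages hint at.
object LegacyDatasetLinkSketch {

  final case class DatasetRef(id: String, currentName: String)

  def legacyRedirectPath(organizationId: String,
                         legacyDatasetName: String,
                         lookupByLegacyName: (String, String) => Option[DatasetRef]): Option[String] =
    lookupByLegacyName(organizationId, legacyDatasetName).map { ref =>
      // Assumed URL shape; the PR's actual route format may differ.
      s"/datasets/${ref.currentName}-${ref.id}"
    }

  def main(args: Array[String]): Unit = {
    // Example with an in-memory lookup standing in for the database query.
    val demoLookup: (String, String) => Option[DatasetRef] =
      (_, name) => if (name == "my_dataset") Some(DatasetRef("66f1a2b3c4d5e6f7a8b9c0d1", "my_dataset")) else None
    println(legacyRedirectPath("sample_organization", "my_dataset", demoLookup))
    // prints: Some(/datasets/my_dataset-66f1a2b3c4d5e6f7a8b9c0d1)
  }
}
```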