diff --git a/docs/source/technical_tutorials.rst b/docs/source/technical_tutorials.rst index 45fc6821..785fb9cd 100644 --- a/docs/source/technical_tutorials.rst +++ b/docs/source/technical_tutorials.rst @@ -5,6 +5,7 @@ Technical Tutorials :maxdepth: 2 + technical_tutorials/dps_tutorial/dps_tutorial_demo.ipynb technical_tutorials/searching.rst technical_tutorials/visualizing.rst technical_tutorials/accessing.rst diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/alg-reg-2.png b/docs/source/technical_tutorials/dps_tutorial/_static/alg-reg-2.png new file mode 100644 index 00000000..bcac0f75 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/alg-reg-2.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/dps_tutorial_git_repo.png b/docs/source/technical_tutorials/dps_tutorial/_static/dps_tutorial_git_repo.png new file mode 100644 index 00000000..79f740fb Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/dps_tutorial_git_repo.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_1.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_1.png new file mode 100644 index 00000000..ae50899d Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_1.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_2.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_2.png new file mode 100644 index 00000000..93728072 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_2.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_3.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_3.png new file mode 100644 index 00000000..f1c92165 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_3.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_4.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_4.png new file mode 100644 index 00000000..86efea76 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_4.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_5.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_5.png new file mode 100644 index 00000000..d979b8ba Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_5.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_6.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_6.png new file mode 100644 index 00000000..62bc4059 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_register_6.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_1.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_1.png new file mode 100644 index 00000000..fdb446e0 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_1.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_2.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_2.png new file mode 100644 index 00000000..a9516f6d Binary 
files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_submit_2.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_1.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_1.png new file mode 100644 index 00000000..81335c4b Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_1.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_2.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_2.png new file mode 100644 index 00000000..09a24f6e Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_2.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_3.png b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_3.png new file mode 100644 index 00000000..be70ad88 Binary files /dev/null and b/docs/source/technical_tutorials/dps_tutorial/_static/tutorial_view_3.png differ diff --git a/docs/source/technical_tutorials/dps_tutorial/dps_tutorial_demo.ipynb b/docs/source/technical_tutorials/dps_tutorial/dps_tutorial_demo.ipynb new file mode 100644 index 00000000..1b72d081 --- /dev/null +++ b/docs/source/technical_tutorials/dps_tutorial/dps_tutorial_demo.ipynb @@ -0,0 +1,541 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Data Processing System (DPS) Tutorial A to Z" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Authors: Sujen Shah and Rob Tapella\n", + "\n", + "Review and Editing: Chuck Daniels and Rajat Shinde\n", + "\n", + "Date: April, 2024\n", + "\n", + "Description: This DPS tutorial is intended to demonstrate the steps needed to create, register, run, monitor and view outputs of algorithm jobs run at scale. It includes a template repository with various files needed to set up and run a job. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Importing and Installing Packages" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Additional package installation will be included inline, and environment-configuration to support a custom conda environment in DPS is a part of the tutorial below." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Before Starting\n", + "\n", + "- This tutorial assumes that you have at least run through the [Getting Started Guide](../../getting_started/getting_started.ipynb) and have set up your MAAP account.\n", + "- This tutorial is made for the Application Development Environment (ADE) \"Basic Stable\" workspace v3.1.4 or later (February 2024 or later).\n", + "- This also assumes that you are familiar with using [Github with MAAP](../../system_reference_guide/work_with_git.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## An Overview of How DPS Works" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The MAAP DPS (Data Processing System) lets you run an Algorithm at scale, in the cloud. It is different from running a Jupyter notebook in that it is non-interactive. Therefore, when migrating a notebook to the DPS it needs to be altered to take all of the input parameters at once as a command-line program. Even if you start with a command-line script, there are best practices to follow when preparing it for the DPS. 
Once that is done, the Algorithm can be Registered in a way that allows you to execute analysis jobs using a web-based form, where each field is defined during the Registration process.\n", + "\n", + "The DPS will then be able to take a set of inputs and process them at scale. You do not need to worry about scheduling or scaling your jobs. You can also run your DPS Algorithm using the maap.py Python library in a loop to execute a batch of jobs (e.g., 1000 input files and one job per input file). The DPS will manage the compute resources to handle your request, and you can easily find all of your outputs as each individual job completes." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Overview of this Tutorial" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "- Run and Monitor an example Algorithm using the Jobs UI, to preview what we will accomplish\n", + "- Clone the demo Algorithm\n", + "- Edit and test your Algorithm code to make sure that it is working in its original form\n", + "- Prepare the Algorithm for DPS by setting up the runtime arguments and pre-run environment set-up\n", + "- Register the Algorithm with the Algorithm UI\n", + "- Run and Monitor the Algorithm using the Jobs UI\n", + "- View the outputs and errors from your run" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Run and Monitor an example Algorithm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you are not familiar with running jobs in the DPS, please try running through the [Jobs UI guide](../../system_reference_guide/jobsui.ipynb) and the [Getting Started Guide](../../getting_started/running_at_scale.ipynb).\n", + "\n", + "This is helpful because Registering an Algorithm for DPS builds the user interface used to Run a Job. By familiarizing yourself with the process of running a Job, you will find the Registration process more intuitive." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Clone the Demo Algorithm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will use an example Python-based algorithm for this tutorial. First we need to get the demo code into a Jupyter workspace.\n", + "\n", + "1. For this tutorial, please use a Basic Stable workspace (v3.1.4 or later). \n", + "2. Clone the Github repository at https://github.com/MAAP-Project/dps_tutorial. For the sake of this tutorial, we will assume that the clone is placed into a folder called `algorithms` in our home folder (i.e., `~/algorithms`).\n", + "```\n", + "mkdir ~/algorithms\n", + "cd ~/algorithms\n", + "git clone https://github.com/MAAP-Project/dps_tutorial.git\n", + "```\n", + "\n", + "An algorithm needs to be in a public code repository in order to register it with the DPS. For this tutorial we will use the `gdal_wrapper` algorithm folder inside the `dps_tutorial` repository code. 
Now that you have the demonstration code, take a look at the files that are there, and then we will be ready to perform a few test runs before Registering it.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Anatomy of the `gdal_wrapper` algorithm folder in the `dps_tutorial` repo:\n", + "\n", + "- `README.md` to describe the algorithm in Github\n", + "- `build-env.sh`: a shell script that is executed before the algorithm is run; it is used to set up any custom programming libraries used in the algorithm (i.e., a custom conda environment)\n", + "- `environment.yml`: a configuration file used by conda to add any custom libraries; this is used by `build-env.sh`\n", + "- `gdal_wrapper.py`: a Python script that contains the logic of the algorithm\n", + "- `run_gdal.sh`: a shell script that DPS will execute when a run is requested. It calls any relevant Python files with the required inputs\n", + "\n", + "![DPS Tutorial Git repository overview](_static/dps_tutorial_git_repo.png)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Edit and Test your Code" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once you have an algorithm such as the `gdal_wrapper`, test it to make sure that it is running properly. If it runs properly in a Jupyter Terminal window, you are one step closer to registering and running your algorithm in the DPS.\n", + "\n", + "Typically a Jupyter Notebook is run interactively. A DPS algorithm will take all inputs up-front, do the processing, and produce output files. The `gdal_wrapper` script is already set up like a DPS algorithm. Some aspects to note:\n", + "\n", + "- **Python argparse**: Using a library like [argparse](https://docs.python.org/3/library/argparse.html) to accept input parameters helps to make the code more readable and easier to debug when working locally. It also makes it easy to provide a user-friendly command-line interface. \n", + "\n", + "- **Logging**: It is a good practice to write out a log file during the execution of your code, making it easier to debug. When running on the DPS, any logs written to the `output` directory or written to `stdout` or `stderr` will automatically be preserved. \n",
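+ "\n", + "As a rough sketch of this pattern, a minimal argparse-based command-line interface for a script like `gdal_wrapper.py` might look like the block below. This is a simplified illustration, not the actual script from the repository:\n", + "```python\n", + "import argparse\n", + "\n", + "def main() -> None:\n", + "    parser = argparse.ArgumentParser(description=\"Reduce the size of a GeoTIFF with GDAL.\")\n", + "    parser.add_argument(\"--input_file\", required=True, help=\"path to the input GeoTIFF\")\n", + "    parser.add_argument(\"--output_file\", required=True, help=\"path for the reduced output GeoTIFF\")\n", + "    parser.add_argument(\"--outsize\", type=int, default=30,\n", + "                        help=\"output size as a percentage of the input (1-100)\")\n", + "    args = parser.parse_args()\n", + "    # Echo the parsed arguments; the real script calls GDAL here\n", + "    print(f\"Reducing {args.input_file} to {args.outsize}% -> {args.output_file}\")\n", + "\n", + "if __name__ == \"__main__\":\n", + "    main()\n", + "```\n", + "A script structured this way also documents itself: `python gdal_wrapper.py --help` prints the expected parameters, which are the same names you will declare as inputs when registering the algorithm.\n",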
+ "\n", + "### Testing the Python script locally within the workspace\n", + "\n", + "Before registering your algorithm you can test it locally to catch common errors related to input parsing and storing output. To test your algorithm locally before registration, follow the steps below:\n", + "\n", + "- Deactivate the current Python virtual environment and activate the pre-installed conda environment (for the Basic Stable workspace, it is `vanilla`)\n", + "```\n", + "conda deactivate\n", + "conda activate vanilla\n", + "```\n", + "- Make sure that your runtime conda environment is set up. To do this, run `build-env.sh` in the `gdal_wrapper` folder.\n", + "```\n", + "cd ~/algorithms\n", + "cd dps_tutorial/gdal_wrapper\n", + "./build-env.sh\n", + "```\n", + "- Make a new temporary directory outside your current repository. This is where you will do a test run of your algorithm with some inputs and outputs.\n", + "```\n", + "mkdir ~/algorithms/dps_test_run\n", + "cd ~/algorithms/dps_test_run\n", + "```\n", + "- Create a directory called `input` and place in it any input files required by your algorithm. Here you will need to download any input files up-front, and then run your script. Also make an `output` folder. Once you have registered your algorithm and run it in DPS, the input file can be passed in as a runtime parameter and DPS will download it automatically as part of the run.\n", + "```\n", + "mkdir input\n", + "mkdir output\n", + "```\n", + "```\n", + "# ls -F\n", + "input/ output/\n", + "```\n", + "- You will need a test GeoTIFF file as input. If you do not have one, go to the folder where you'd like to download the example file (assuming you're in the `dps_test_run` folder as above, `cd input`) and use the following AWS command (NOTE: if this step fails, it is likely that you are either in a Basic Stable workspace version prior to v3.1.4, or you do not have the `vanilla` conda environment activated):\n", + "```\n", + "cd input\n", + "```\n", + "```\n", + "aws s3 cp s3://nasa-maap-data-store/file-staging/nasa-map/ESACCI_Biomass_L4_AGB_V4_100m_2020/S40E160_ESACCI-BIOMASS-L4-AGB-MERGED-100m-2020-fv4.0.tif .\n", + "```\n", + "- Once the input directory is set up, `cd ..` back to the `dps_test_run` directory and call your run script. This is the script you would like the DPS to run when your code is executing on the cloud. Note: this assumes you have already built the environment with the required custom packages, as specified in your algorithm's `build-env.sh`. We did this above.\n", + "```\n", + "cd ~/algorithms/dps_test_run\n", + "```\n", + "```\n", + "python ../dps_tutorial/gdal_wrapper/gdal_wrapper.py --input_file input/S40E160_ESACCI-BIOMASS-L4-AGB-MERGED-100m-2020-fv4.0.tif --output_file output/TestFile-30.tif --outsize 30\n", + "```\n", + "- Successful output of the script should look like this:\n", + "```\n", + "Installed GDAL Version: 3.6.1\n", + "b'Input file size is 2400, 3600\\n0...10...20...30...40...50...60...70...80...90...100 - done.\\n'\n", + "```\n", + "\n", + "And if you look in your output folder, you will see your output file:\n", + "```\n", + "# ls output/\n", + "TestFile-30.tif\n", + "```\n", + "\n", + "- If all desired outputs are created, you are one step closer to successfully running your algorithm on the DPS: your core application can be run from the command line in non-interactive mode.\n", + "- The next step is to prepare the Algorithm to be run in DPS.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare the Algorithm for DPS" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once your scripts are working locally, make sure that they will also work in DPS.\n", + "\n", + "The `gdal_wrapper` files are already prepared for DPS. When running from DPS we will have the `run_gdal.sh` shell script do the execution, so we should test it first:\n", + "```\n", + "cd ~/algorithms/dps_test_run\n", + "```\n", + "```\n", + "../dps_tutorial/gdal_wrapper/run_gdal.sh output_from_shell.tif 25\n", + "```\n", + "\n", + "This should write a message to STDOUT similar to the one you saw when executing `gdal_wrapper.py` above. You should see an output file that is slightly smaller than the original test run, because we chose a smaller value for the `outsize` parameter.\n", + "\n", + "The shell script is wrapping the Python script that we ran above. Note that the parameters are different: in the shell script we read all the input files from `input/` and write outputs into the `output/` folder. 
If you read the shell script, you will see how this happens.\n", + "```\n", + "# ../dps_tutorial/gdal_wrapper/run_gdal.sh output_from_shell.tif 25\n", + "Installed GDAL Version: 3.7.0\n", + "b'Input file size is 11250, 11250\\n0...10...20...30...40...50...60...70...80...90...100 - done.\\n'\n", + "\n", + "# ls -alF *\n", + "input:\n", + "total 7696\n", + "drwxr-xr-x 2 root root 6144 Feb 28 15:52 ./\n", + "drwxr-xr-x 4 root root 6144 Feb 28 15:52 ../\n", + "-rw-r--r-- 1 root root 7869236 Jun 7 2023 S40E160_ESACCI-BIOMASS-L4-AGB-MERGED-100m-2020-fv4.0.tif\n", + "\n", + "output:\n", + "total 20488\n", + "drwxr-xr-x 2 root root 6144 Feb 28 15:54 ./\n", + "drwxr-xr-x 4 root root 6144 Feb 28 15:52 ../\n", + "-rw-r--r-- 1 root root 22804732 Feb 28 15:53 TestFile-30.tif\n", + "-rw-r--r-- 1 root root 15834792 Feb 28 15:54 output_from_shell.tif\n", + "```\n", + "\n", + "Some important things to note:\n", + "\n", + "File: `build-env.sh`\n", + "\n", + "- This script is used for installing custom packages using pip or conda. \n", + "- It can take a conda environment definition from `environment.yml`.\n", + "- Current practice is to update the default environment for the Container that you're using.\n", + "\n", + "File: `run_gdal.sh`\n", + "\n", + "- `run_gdal.sh` is a bash script that calls the `gdal_wrapper.py` algorithm: make sure that for DPS you have inputs and outputs in the right places. If you look at `run_gdal.sh` you will see that it reads all the files from the `input/` folder and writes to the `output/` folder.\n", + "- It sets the correct Python environment for your code to run.\n", + "- The best way to execute your algorithm with a custom environment is to use `conda run`, as shown in this script (`conda run --live-stream --name dps_tutorial python ${basedir}/gdal_wrapper.py --input_file ${INPUT_FILENAME} --output_file output/${OUTPUT_FILENAME} --outsize ${REDUCTION_SIZE}`).\n",
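+ "\n", + "Putting those pieces together, the core pattern of a DPS-style run script looks roughly like the sketch below. This is a simplified illustration (for example, it assumes exactly one staged input file); see the actual `run_gdal.sh` in the repository for the real script:\n", + "```bash\n", + "#!/bin/bash\n", + "# Resolve the directory containing this script; the DPS working directory is elsewhere\n", + "basedir=$(dirname \"$(readlink -f \"$0\")\")\n", + "\n", + "# DPS stages file inputs into input/ and preserves anything written to output/\n", + "mkdir -p output\n", + "INPUT_FILENAME=$(ls input/ | head -n 1)  # simplified: assumes a single staged input\n", + "OUTPUT_FILENAME=$1\n", + "REDUCTION_SIZE=$2\n", + "\n", + "# Run the algorithm inside the custom conda environment created by build-env.sh\n", + "conda run --live-stream --name dps_tutorial python \"${basedir}/gdal_wrapper.py\" \\\\\n", + "    --input_file \"input/${INPUT_FILENAME}\" \\\\\n", + "    --output_file \"output/${OUTPUT_FILENAME}\" \\\\\n", + "    --outsize \"${REDUCTION_SIZE}\"\n", + "```\n",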
+ "\n", + "Run your scripts as if DPS is executing them:\n", + "\n", + "- activate the default conda environment, in this case `conda activate vanilla`\n", + "- run `build-env.sh` to create or update your custom environment\n", + "- run `run_gdal.sh` to execute your algorithm using the custom environment\n", + "\n", + "Future topics (several of these are addressed in the FAQ at the end of this tutorial):\n", + "\n", + "- What happens with input and output in DPS\n", + "- How does file management happen?\n", + "- Relative paths vs. absolute paths for input/output\n", + "- Mimicking what happens on DPS (basedir)\n", + "- How the wrapper `run_gdal.sh` script needs to manage the input files the way that your Python script requires them (e.g., passing a single file at a time vs. multiple files at once)\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Register the Algorithm with DPS using the Algorithm UI" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "1. When you are registering your own algorithm, make sure that all your changes are committed and pushed into Github; the registration process pulls the code from Github. In this case, we will simply use the existing demonstration repo.\n", + "2. Open up [Launcher: Register Algorithm](../../getting_started/running_at_scale.ipynb#Register-an-Algorithm) (the same as the Register Algorithm option from the Command Palette).\n", + "3. Fill in the fields as described below.\n", + "\n", + "#### First, fill in the public code repository information:\n", + "![Code repo information](_static/tutorial_register_1.png)\n", + "\n", + "- The Repository URL is the .git URL. In the demo repository, it is\n", + "```\n", + "https://github.com/MAAP-Project/dps_tutorial.git\n", + "```\n", + "- Repository Branch is used as a version when this algorithm is registered. For your test it is likely `main`.\n", + "- The Run and Build Commands must be the full paths of the scripts that the DPS will use to build and execute the algorithm. Typically these will be `repository_name/script_name.sh`. In this case we have a run command:\n", + "```\n", + "dps_tutorial/gdal_wrapper/run_gdal.sh\n", + "```\n", + "and a build command:\n", + "```\n", + "dps_tutorial/gdal_wrapper/build-env.sh\n", + "```\n", + "\n", + "#### Then fill in the rest of the algorithm information:\n", + "![Algorithm information](_static/tutorial_register_2.png)\n", + "\n", + "- The Algorithm Name will be the unique identifier for the algorithm in the MAAP system. It can be whatever you want. \n", + "- Algorithm Description is additional free-form text to describe what this algorithm does.\n", + "- Disk Space is the minimum amount of space you expect to need, including all inputs, scratch space, and outputs; it gives the DPS an approximation to help optimize the run.\n", + "- Resource Allocation is a dropdown selection with options for the memory and CPU of the cloud compute you expect to need.\n", + "- The Container URL is the URL of the Stack (workspace image environment) you are using as a base for the algorithm. The user interface will pre-fill this with the Container of your current workspace; if this is the correct base workspace for the Algorithm (i.e., you successfully ran the Algorithm in a Terminal without requiring a custom base Container), then you can leave it as is. In this example we use: `mas.maap-project.org/root/maap-workspaces/base_images/vanilla:main`\n", + "See [the Getting Started guide](../../getting_started/running_at_scale.ipynb#Container-URLs) for more information on Containers.\n", + "\n", + "#### Finally, fill in the input section:\n", + "- There are File Inputs and Positional Inputs (command-line parameters that adjust how the algorithm runs). In our example we have a File Input called `input_file` and two Positional Inputs: an output file called `output_file` and a parameter called `outsize` describing how much file-size reduction we want. For each input you can add a Description and a Default Value, and mark whether it's required or optional.\n", + "\n", + "![Algorithm-Inputs information](_static/tutorial_register_3.png)\n", + "\n", + "4. Press Register, and a popup dialog will appear with a link to view the progress of the registration (copy the link and open it in a new browser tab). If you close this modal, you can look for your Algorithm build at https://repo.maap-project.org/root/register-job-hysds-v4/-/jobs/\n", + "![Register confirmation popup information](_static/tutorial_register_4.png)\n", + "\n", + "In order to continue, your Algorithm must be successfully registered. The link from the modal dialog leads to a page with a progress log. If registration is successful and complete, the bottom of the log will look something like this:\n", + "![Registration process log complete](_static/tutorial_register_5.png)\n", + "\n", + "If the registration process failed, the log will look like the screenshot below. 
In this case you will need to either ask for help, or scroll back up through the log to see what went wrong.\n", + "![Registration process log failed](_static/tutorial_register_6.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Running and Monitoring the Algorithm with the Jobs UI" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once your Algorithm has been fully registered, you can try running a Job.\n", + "\n", + "1. Open the Launcher and select the [Submit Jobs](../../getting_started/running_at_scale.ipynb#Run-the-Algorithm-as-a-Job-and-Monitor-it) icon.\n", + "2. Run the job:\n", + "- Choose the Algorithm you just registered using the dropdown menu.\n", + "- The Job Tag can be empty or any comma-separated list of short terms that you would like to associate with your job. This will help you sort and filter the job list later.\n", + "- The Resource is likely to be the same as the one you chose when registering the Algorithm. For the tutorial it can be the smallest one (8 GB).\n", + "- The input file can be any GeoTIFF file that is accessible by the system. For example, you can browse the [MAAP STAC](https://stac-browser.maap-project.org/collections/ESACCI_Biomass_L4_AGB_V4_100m?.language=en) and find a GeoTIFF such as\n", + "```\n", + "s3://nasa-maap-data-store/file-staging/nasa-map/ESACCI_Biomass_L4_AGB_V4_100m_2020/S40E160_ESACCI-BIOMASS-L4-AGB-MERGED-100m-2020-fv4.0.tif\n", + "```\n", + "- The output file can have any name. It should end with .tif because it will also be a GeoTIFF.\n", + "- Outsize is a number from 1 to 100.\n", + "![Submit Job Page](_static/tutorial_submit_1.png)\n", + "\n", + "3. Submit the job and go back to the View tab.\n", + "- When you submit, you should see a notification in the bottom right, and a note just under the Submit button, saying that it was successful.\n", + "![Submit Success Notification](_static/tutorial_submit_2.png)\n", + "- At that point, open the Launcher and select the View Jobs icon.\n", + "\n", + "4. Using the View Jobs tab, you can observe the progress of your job while it runs, and see its status (completed or failed) when it finishes.\n", + "![View Jobs Page](_static/tutorial_view_1.png)\n", + "- Selecting a row in the table (top panel) shows the Job Details (bottom panel).\n", + "- The status should go from queued, to running, to completed or failed.\n", + "- Check the Inputs and Outputs sections of the Job Details.\n", + "\n", + "5. From the Outputs section, you can copy the path of your output file starting with `dps_output` and find it by going to your `~/my-private-bucket` folder and then following the remainder of the path. \n", + "![Copy the Path information](_static/tutorial_view_2.png)\n", + "\n", + "In that folder you will see some JSON files with metadata about the job and the data, as well as the output file (your .tif file).\n", + "![cd to path using Terminal](_static/tutorial_view_3.png)\n", + "\n", + "You can download the output files by browsing to them in the Jupyter file panel and selecting Download from the contextual menu (right-click)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Running and Monitoring using the HySDS Jobs UI (Figaro)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This will be described in a future update. HySDS is the data processing system used to run the jobs. It has a full web application that NASA missions use to monitor jobs and data outputs. If you would like to beta-test this UI with MAAP, please contact Sujen or George." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Registering and Running the Algorithm using maap.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This will be described in detail in a future update; a brief sketch follows below. Often larger batch jobs are run from Python Notebooks rather than the GUI." + ] + },
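+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As a rough, illustrative sketch, batch submission with maap.py can look like the block below. The algorithm name and queue are placeholders standing in for whatever you chose at registration time, and you should check the maap.py documentation for the exact `submitJob` signature in your environment:\n", + "```python\n", + "from maap.maap import MAAP\n", + "\n", + "maap = MAAP(maap_host=\"api.maap-project.org\")\n", + "\n", + "# One job per input file: the batch pattern described in the DPS overview above\n", + "input_files = [\n", + "    \"s3://nasa-maap-data-store/file-staging/nasa-map/ESACCI_Biomass_L4_AGB_V4_100m_2020/S40E160_ESACCI-BIOMASS-L4-AGB-MERGED-100m-2020-fv4.0.tif\",\n", + "    # ... more GeoTIFF URLs ...\n", + "]\n", + "\n", + "jobs = []\n", + "for i, url in enumerate(input_files):\n", + "    job = maap.submitJob(\n", + "        identifier=f\"dps-tutorial-batch-{i}\",  # free-form tag, like the Job Tag in the UI\n", + "        algo_id=\"gdal_wrapper_demo\",           # placeholder: the Algorithm Name you registered\n", + "        version=\"main\",                        # the repository branch used at registration\n", + "        queue=\"maap-dps-worker-8gb\",           # placeholder: matches the resource you registered with\n", + "        input_file=url,                        # algorithm inputs, passed by their registered names\n", + "        output_file=f\"reduced-{i}.tif\",\n", + "        outsize=\"30\",\n", + "    )\n", + "    jobs.append(job)\n", + "```" + ] + },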
+ { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Getting the Outputs of the Job" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Output folder\n", + "\n", + "The DPS treats a folder named `output` specially. Any files stored in this folder will be preserved and uploaded to S3 after the algorithm run is complete. The location of this output depends on factors like the algorithm name, the time of the run, and tags. This output folder can be viewed within your workspace under the `my-private-bucket/dps_output` directory.\n", + "\n", + "The output directory is created relative to the script specified as the `run_command` at the time of registration. So to access the directory, simply do something like this in your run script:\n", + "```\n", + "mkdir -p output\n", + "```\n", + "\n", + "#### Stderr & Stdout\n", + "\n", + "By default, anything written to the stderr and stdout pipes will be stored in files called `_stderr` and `_stdout` and placed in your output directory.\n", + "\n", + "#### Logfiles\n", + "\n", + "DPS does not automatically store any logfiles written by your algorithm; if you would like them to be preserved, make sure to write them to the `output` directory." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Frequently Asked Questions (FAQ)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "### What happens with input and output in DPS?\n", + "\n", + "Since DPS jobs run on a machine in the cloud, your local workspace directories are not available. It is important to pass any files required as inputs to your algorithm using the `File` parameter type. \n", + "Any outputs that need to be saved should be placed in a directory called `output`. \n", + "\n", + "When a parameter is registered as a file input, DPS downloads the corresponding value provided by the user as a file and places it in a directory called `input`. \n", + "It is important to note that if the value provided for a parameter marked as a file during registration is not a valid URL, DPS will report an error.\n", + "\n", + "Note: Both the `input` and `output` directories are relative to your run script. \n", + " \n", + "### How does file management happen?\n", + "\n", + "File management, i.e. downloading the files required as input and storing output files on S3, is taken care of by the DPS. To locate the files created as output from your job, look in the `/projects/my-private-bucket/dps_output` directory on your workspace and navigate to the algorithm type and time of run. \n", + "\n", + "You can also construct the output path of your files by looking at the job info on the Jobs UI, or by running\n", + "```\n", + "maap.getJobResult('job_id')\n", + "```\n", + "\n" + ] + },
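+ { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "For example, a minimal sketch of looking up a finished job's outputs from a notebook might look like this (the job ID below is a made-up placeholder, and the exact shape of the returned result is described in the maap.py documentation):\n", + "```python\n", + "from maap.maap import MAAP\n", + "\n", + "maap = MAAP(maap_host=\"api.maap-project.org\")\n", + "\n", + "# Placeholder job ID; use the ID shown for your job in the Jobs UI\n", + "result = maap.getJobResult(\"a1b2c3d4-0000-0000-0000-000000000000\")\n", + "print(result)  # lists the locations of the job's output files\n", + "```" + ] + },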
\n", + "Using \n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "is_executing": true + }, + "outputs": [], + "source": [ + "#basedir=$(dirname \"$(readlink -f \"$0\")\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "allows you to get the absolute path of the directory in which your run script is present allowing you to set any other executable paths as required by your algorithm. \n", + "\n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "gedi_subset", + "language": "python", + "name": "gedi_subset" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.3" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +}