New version of the post-processing agent for automated reduction and cataloging.
For the old version of the post-processing agent, see https://github.com/mantidproject/autoreduce
A configuration file must be placed in `/etc/autoreduce/post_processing.conf`.
The `configuration/post_process_consumer.conf.development` file is a good starting
point for configuration. Here are the entries to pay attention to:
```json
{
    "brokers": [["localhost", 61613]],
    "amq_user": "",
    "amq_pwd": "",
    "sw_dir": "/opt/postprocessing",
    "python_dir": "/opt/postprocessing/postprocessing",
    "start_script": "python",
    "task_script": "PostProcessAdmin.py",
    "task_script_queue_arg": "-q",
    "task_script_data_arg": "-d",
    "log_file": "/opt/postprocessing/log/postprocessing.log",
    "communication_only": 1,
    "jobs_per_instrument": 2
}
```
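Since the configuration is plain JSON, it can be sanity-checked with Python's standard `json` module before starting the agent. The sketch below is illustrative, not part of the agent itself; the `check_config` helper and its key list (taken from the example above) are assumptions you should adapt to your installation.

```python
import json

# Keys taken from the example configuration above; extend as needed.
REQUIRED_KEYS = {"brokers", "amq_user", "amq_pwd", "sw_dir", "log_file"}

def check_config(path="/etc/autoreduce/post_processing.conf"):
    """Parse the configuration file and report any missing required keys."""
    with open(path) as handle:
        config = json.load(handle)  # raises ValueError on malformed JSON
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise KeyError("missing configuration entries: %s" % sorted(missing))
    return config
```

Running this once after editing the file catches malformed JSON and forgotten entries before the agent is launched.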
- The ActiveMQ server settings must be set by replacing `localhost` above with the proper address, and `"amq_user"` and `"amq_pwd"` must be filled in.
- If `"jobs_per_instrument"` is set to an integer greater than zero, no more than that number of jobs will run on a given node for a given instrument. Set `"jobs_per_instrument"` to zero to turn this feature off. If this feature is used, you must add the following to `activemq.xml`:
```xml
<broker xmlns="http://activemq.apache.org/schema/core" brokerName="localhost" ... schedulerSupport="true">
  ...
  <plugins>
    <redeliveryPlugin fallbackToDeadLetter="true" sendToDlqIfMaxRetriesExceeded="true">
      <redeliveryPolicyMap>
        <redeliveryPolicyMap>
          <defaultEntry>
            <redeliveryPolicy maximumRedeliveries="4" initialRedeliveryDelay="5000" redeliveryDelay="10000"/>
          </defaultEntry>
        </redeliveryPolicyMap>
      </redeliveryPolicyMap>
    </redeliveryPlugin>
  </plugins>
</broker>
```
The Post-Processing Agent will terminate a post-processing task that exceeds either the time limit or the memory usage limit. The limits and the time interval between checks are configurable in the configuration file. The same check interval applies to both limits.

```json
{
    "system_mem_limit_perc": 70.0,
    "mem_check_interval_sec": 0.2,
    "task_time_limit_minutes": 60.0
}
```
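To illustrate how these settings interact, the sketch below shows the kind of check the agent performs every `mem_check_interval_sec` seconds. The `exceeds_limits` helper is hypothetical and not the agent's actual code:

```python
def exceeds_limits(elapsed_minutes, system_mem_percent, limits):
    """Return the reason a task should be terminated, or None.

    `limits` is a dict with the keys shown in the configuration above.
    Both conditions are polled at the same interval
    (`mem_check_interval_sec`); this is an illustrative helper only.
    """
    if elapsed_minutes > limits["task_time_limit_minutes"]:
        return "time limit exceeded"
    if system_mem_percent > limits["system_mem_limit_perc"]:
        return "memory limit exceeded"
    return None
```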
- Don't forget to set your Mantid user's properties to send output logs to stdout:

```
logging.channels.consoleChannel.class=StdoutChannel
```
The Post-Processing Agent handles cataloging of raw and reduced data files in ONCat (https://oncat.ornl.gov/) by calling scripts hosted on the analysis cluster.
Create the configuration file and edit it according to your installation:

```sh
cd configuration
cp post_process_consumer.conf.development /etc/autoreduce/post_processing.conf
```
To run, simply call

```sh
python [installation path]/queueProcessor.py
```
The pixi environment for running `queueProcessor.py` and the tests locally is defined in `pyproject.toml`. Create and activate the pixi environment for development:

```sh
pixi install
pixi shell
```
For developers working on both `post_processing_agent` and `plot_publisher` simultaneously, you may want to use an editable install of `plot_publisher`. This is already configured in `pyproject.toml` to install from the git repository.
The tests for this project are all written using pytest.

```sh
pixi run test
```
You can also run specific test suites:

```sh
pixi run test-unit         # Unit tests only
pixi run test-integration  # Integration tests only
pixi run test-cov          # Tests with coverage
```
The integration tests require ActiveMQ and the queue processor to be running; they are skipped automatically if ActiveMQ is not running. This can be achieved using the provided docker-compose file:

```sh
docker compose -f tests/integration/docker-compose.yml up -d --build
```

then run

```sh
pixi run test-integration
```

after which you can stop docker with

```sh
docker compose -f tests/integration/docker-compose.yml down
```
Manual tests can be executed as

```sh
pixi run python scripts/mantidpython.py /SNS/HYP/shared/auto_reduction/reduce_HYS.py [HYS nexus file] [Output Dir]
```

or

```sh
pixi run python scripts/mantidpython.py tests/reduce_CONDA.py [Data file] [Output dir]
```

as an example of how to activate a specific conda environment for reduction.
```sh
docker build --tag postprocessing .
docker run --network host postprocessing
```

- Update the version number in `SPECS/postprocessing.spec` and `pyproject.toml` and commit the change to `main`.
- Create a new tag and create a release from the tag (see the three-dots menu for the tag at https://github.com/neutrons/post_processing_agent/tags).
- Build the RPM using `make build/rpm` and upload the `.rpm` and `.srpm` files as release assets to the GitHub release page.