[slurm-users] Odd behavior/bug with --array env SLURM_ARRAY_TASK_MAX
loris.bennett at fu-berlin.de
Mon Feb 19 23:54:35 MST 2018
Robert Anderson <rea at sr.unh.edu> writes:
> While working on an example python slurm job script I found the
> environment variable SLURM_ARRAY_TASK_MAX was not set to the expected
> value when a step is defined.
> Below is a minimal test of a 10-element array job with a step value of 5. When
> a step value is defined the SLURM_ARRAY_TASK_MAX is set to the maximum
> value that slurm will provide as a SLURM_ARRAY_TASK_ID, not the
> actual expected "array max value".
> Current behavior loses the only hook to the real "array max" value.
> I can think of no reason why the current behavior would be preferred
> over my expected value.
> Am I missing something?
[snip (40 lines)]
I think the logic is that SLURM_ARRAY_TASK_MAX refers to the actual
array produced, regardless of exactly how that was done. So taking the
example from the "--array" section of the 'sbatch' manpage:
For example, "--array=0-15:4" is equivalent to "--array=0,4,8,12".
the array produced is (0, 4, 8, 12) regardless of exactly how it was
created, either "--array=0-15:4" or, say, "--array=0-12:4".
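To make that concrete, here is a small sketch (my own illustration, not Slurm source) of how the observed value appears to be derived, assuming SLURM_ARRAY_TASK_MAX reports the maximum of the task IDs actually created rather than the upper bound written in the range:

```python
# Sketch: parse a simple sbatch --array spec like "0-15:4" or "0,4,8,12"
# and compute the task IDs Slurm would actually run.
def array_tasks(spec):
    tasks = []
    for part in spec.split(","):
        step = 1
        if ":" in part:
            part, step_str = part.split(":")
            step = int(step_str)
        if "-" in part:
            lo, hi = (int(x) for x in part.split("-"))
            # The upper bound is inclusive, but only IDs reachable
            # by the step are created.
            tasks.extend(range(lo, hi + 1, step))
        else:
            tasks.append(int(part))
    return tasks

def task_max(spec):
    # Observed behavior: the max of the IDs that will actually run,
    # not the literal upper bound of the range.
    return max(array_tasks(spec))

# "0-15:4" produces (0, 4, 8, 12), so the max is 12, not 15.
print(array_tasks("0-15:4"))  # [0, 4, 8, 12]
print(task_max("0-15:4"))     # 12
```

Under this reading, a spec like "1-10:5" runs tasks 1 and 6, so SLURM_ARRAY_TASK_MAX would be 6 even though the range was written up to 10.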
This definition of SLURM_ARRAY_TASK_MAX seems like it would definitely
be useful. What is the use case for what you expected?
Dr. Loris Bennett (Mr.)
ZEDAT, Freie Universität Berlin Email loris.bennett at fu-berlin.de