[slurm-users] Running an MPI job across two partitions
chris at csamuel.org
Wed Mar 25 05:46:38 UTC 2020
On 23/3/20 8:32 am, CB wrote:
> I've looked at the heterogeneous job support but it creates two-separate
Yes, but the web page does say:
# By default, the applications launched by a single execution of
# the srun command (even for different components of the
# heterogeneous job) are combined into one MPI_COMM_WORLD with
# non-overlapping task IDs.
So it _should_ work.
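
As a hedged sketch (partition names and resource counts are placeholders, not from the original thread), a heterogeneous batch job with a single srun spanning both components might look like:

```shell
#!/bin/bash
# First heterogeneous component: 2 nodes from partition "partA"
#SBATCH --partition=partA --nodes=2 --ntasks-per-node=4
#SBATCH hetjob
# Second heterogeneous component: 1 node from partition "partB"
#SBATCH --partition=partB --nodes=1 --ntasks-per-node=8

# One srun across het groups 0 and 1; by default the launched
# tasks are combined into a single MPI_COMM_WORLD with
# non-overlapping task IDs.
srun --het-group=0,1 ./my_mpi_app
```

Whether this behaves as documented will depend on the Slurm version and the MPI stack in use, so treat it as a starting point rather than a verified recipe.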
I know there are issues with Cray systems & hetjobs at the moment, but I
suspect that's not likely to concern you.
All the best,
Chris Samuel : http://www.csamuel.org/ : Berkeley, CA, USA