[slurm-users] Interlocking / Concurrent runner
Florian Lohoff
f at zz.de
Tue Oct 22 08:53:31 UTC 2019
Hi,
I am using Slurm as a single-node job batching system. Slurm is perfect
for that use case and has worked flawlessly for a couple of years. Lately
I have been shuffling jobs around, so that jobs which take much longer
run only daily, while other jobs run more frequently.
A question I had: is there a way to lock jobs so that they do not
run multiple times concurrently? Or better: I have a list of jobs with heavy
dependencies, and I'd like to run this job list again once all
of them have completed.
I could then create a lock, plus a cleanup job that removes the
lock and depends on all the other jobs I queue in this batch.
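For illustration, the lock/cleanup scheme above could be sketched with
Slurm's standard job dependencies (the script names job_a.sh, job_b.sh,
and cleanup.sh are hypothetical; --parsable and --dependency are regular
sbatch options):

```shell
# Submit the batch and capture the job IDs (--parsable prints only the ID).
jid1=$(sbatch --parsable job_a.sh)
jid2=$(sbatch --parsable job_b.sh)

# The cleanup job starts only after all jobs above completed successfully
# (afterok); it could remove the lock and requeue the whole list.
sbatch --dependency=afterok:${jid1}:${jid2} cleanup.sh
```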
Currently I have something like this in my cron scripts: it looks
at the job queue, and if it finds matching jobs it does not queue
new ones.
# Still jobs running? Then do not queue a new batch.
if squeue -l | egrep -q "osm.*RUNNING"; then
    exit 0
fi
So I run the cron job much more often than I can actually process all of
the data. This feels like a bit of a hack to me.
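On the locking part of the question, Slurm's singleton dependency may
already cover the "do not run multiple times" case: a job submitted with
--dependency=singleton begins only after all previously submitted jobs
with the same job name and user have terminated. A sketch (the job name
osm-import is hypothetical):

```shell
# At most one "osm-import" job per user runs at a time; later
# submissions queue behind earlier ones instead of running concurrently.
sbatch --job-name=osm-import --dependency=singleton import.sh
```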
Flo
--
Florian Lohoff f at zz.de
UTF-8 Test: The 🐈 ran after a 🐁, but the 🐁 ran away