If you would like the high-watermark memory utilization after the job completes, https://github.com/NCAR/peak_memusage is a great tool. Of course, it has the limitation that you need to know you want that information *before* starting the job, which may or may not be a problem for your use case.
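For what it's worth, a minimal job-script sketch of how one might wrap the solver with peak_memusage. The `module load` line is an assumption about how the tool is installed at your site; adjust the path or invocation to match your installation:

```shell
#!/bin/bash
#SBATCH -n 256
# Sketch: wrap each launched rank with peak_memusage so a high-water-mark
# memory report is emitted when the job finishes.
module load peak_memusage   # assumption: provided as an environment module at your site
srun peak_memusage ./solver inputfile
```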
On Fri, Feb 9, 2024 at 10:07 AM Gerhard Strangar via slurm-users <slurm-users@lists.schedmd.com> wrote:
Hello,
I'm wondering if there's a way to tell how much memory my job is using per node. I'm doing
#SBATCH -n 256
srun solver inputfile
When I run sacct -o maxvmsize, the result is apparently the maximum VSZ of the largest solver process, not the maximum of the sum of them all (unlike when calling mpirun instead). When I run sstat -o TresUsageInMax, I get the memory summed over all nodes in use. Can I get the maximum VSZ per node?
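Slurm's accounting fields don't directly expose a per-node sum, so as a workaround sketch (not a Slurm-native feature) one could run a small sampler once per node alongside the job and sum VmSize/VmRSS over the solver processes on that node via /proc. The process name "solver" is taken from the script above; everything else here is an assumption about how you would deploy it:

```python
#!/usr/bin/env python3
"""Sketch: sum VmSize/VmRSS (kB) over all processes with a given command
name by scanning /proc. Run one instance per node (e.g. via a separate
job step) to get a per-node total at the moment of sampling."""
import os
import sys

def mem_totals(comm_name):
    """Return (total VmSize kB, total VmRSS kB) for processes named comm_name."""
    vsz = rss = 0
    for pid in os.listdir("/proc"):
        if not pid.isdigit():
            continue
        try:
            with open(f"/proc/{pid}/comm") as f:
                if f.read().strip() != comm_name:
                    continue
            with open(f"/proc/{pid}/status") as f:
                for line in f:
                    if line.startswith("VmSize:"):
                        vsz += int(line.split()[1])  # kB
                    elif line.startswith("VmRSS:"):
                        rss += int(line.split()[1])  # kB
        except (FileNotFoundError, ProcessLookupError, PermissionError):
            continue  # process exited (or is unreadable) mid-scan
    return vsz, rss

if __name__ == "__main__":
    name = sys.argv[1] if len(sys.argv) > 1 else "solver"  # "solver" from the batch script
    v, r = mem_totals(name)
    print(f"{os.uname().nodename}: {name} VSZ {v} kB, RSS {r} kB")
```

Sampling in a loop and keeping the running maximum would approximate the per-node high-water mark; it misses short spikes between samples, which is the usual caveat with polling-based monitors.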
Gerhard
--
slurm-users mailing list -- slurm-users@lists.schedmd.com
To unsubscribe send an email to slurm-users-leave@lists.schedmd.com