<html><head></head><body><div>Hi Steve,</div><div>We are running a JupyterHub that uses our cluster as a backend. You need to be able to submit jobs from the hub node, i.e. you need to configure Slurm there and have the Slurm binaries installed; you don't need to run slurmd. We use the batchspawner to launch the jobs on the cluster.</div><div>Regards</div><div>magnus</div><div><br></div><div><br></div><div>On Sat, 2023-08-26 at 23:17 -0700, Steven Swanson wrote:</div><blockquote type="cite" style="margin:0 0 0 .8ex; border-left:2px #729fcf solid;padding-left:1ex"><div dir="ltr"><div>Can I submit jobs from a computer/docker container that is not part of the slurm cluster?</div><div><br></div>I'm trying to set up slurm as the backend for a system with a Jupyter Notebook-based front end.<div><br></div><div>The jupyter notebooks are running in containers managed by JupyterHub, which is a mostly turnkey system for providing docker containers that users can access via jupyter.</div><div><br></div><div>I would like the jupyter containers to be able to submit jobs to slurm, but making them part of the cluster doesn't seem to make sense because:</div><div><br></div><div>1. They are dynamically created and don't have known hostnames.</div><div>2. They aren't supposed to run jobs.</div><div><br></div><div>Is there a way to do this?
I tried just running slurmd in the jupyter containers, but it complained about not being able to figure out its name (I think because the container's hostname is not listed in slurm.conf).</div><div><br></div><div>My fallback solution is to use ssh to connect to the slurm head node and run jobs there, but that seems kludgy.</div><div><br></div><div><div><div dir="ltr" class="gmail_signature"><div dir="ltr">-steve</div></div></div></div></div></blockquote><div><br></div><div><span><pre>-- <br></pre><div><b><span lang="DE">Magnus Hagdorn</span></b></div><div><span lang="DE">Charité – Universitätsmedizin Berlin</span></div><div><span lang="DE">Geschäftsbereich IT | Scientific Computing</span></div><div><span style="color: black;">Campus Charité </span><span lang="DE" style="color: black;">Virchow Klinikum</span></div><div><span lang="DE" style="color: black;">Forum 4</span><span style="color: black;"> | Ebene 02 | Raum </span><span lang="DE" style="color: black;">2.020</span></div><div><span style="color: black;">Augustenburger Platz 1</span></div><div><span style="color: black;">13353 Berlin</span></div><div><span style="color: black;"><span style="color: rgb(5, 99, 193);">magnus.hagdorn@charite.de</span></span></div><div><span lang="DE" style="color: black;"><a href="https://www.charite.de/"><span lang="EN-US" style="color: rgb(5, 99, 193);">https://www.charite.de</span></a></span></div><div><span lang="EN-US" style="color: black;">HPC Helpdesk: <a 
href="mailto:sc-hpc-helpdesk@charite.de"><span style="color: rgb(5, 99, 193);">sc-hpc-helpdesk@charite.de</span></a></span></div></span></div></body></html>