[slurm-users] Dell <> GPU compatibility matrix?

Fulcomer, Samuel samuel_fulcomer at brown.edu
Thu Oct 27 15:55:10 UTC 2022

The NVIDIA A10 would probably work. Check the Dell specs for the card lengths
it can accommodate. It's also passively cooled, so you'd need to ensure
there's good airflow through the card. The proof would be installing a card
and watching the temperature while you run apps on it. It's 150W, so not
that hot.
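For the temperature check above, a minimal sketch: polling the card once per
second with the stock nvidia-smi utility while your workload runs (this
assumes the NVIDIA driver and nvidia-smi are already installed on the host).

```shell
# Poll GPU index, name, temperature, power draw, and utilization every
# second; run this in one terminal while the workload runs in another.
nvidia-smi --query-gpu=index,name,temperature.gpu,power.draw,utilization.gpu \
           --format=csv -l 1
```

If the temperature climbs toward the card's slowdown threshold under
sustained load, the chassis airflow isn't adequate for a passive card.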

On Thu, Oct 27, 2022 at 11:03 AM Chip Seraphine <cseraphine at drwholdings.com> wrote:

> We have a cluster of 1U dells (R640s and R650s) and we’ve been asked to
> install GPUs in them, specifically NVIDIA Teslas with at least 24GB RAM, so
> I’m trying to select the right card.  In the past I’ve used Tesla T4s on
> similar hardware, but those are limited to 16GB.   I know most of the
> really high-end GPUs won’t physically fit in a 1U server.
> Does anyone know of a resource that will tell me which models of GPUs
> (specifically Teslas) do or do not fit in various Dell boxes?   It seems
> like both NVIDIA and Dell would be motivated to provide such a
> compatibility list, but so far I’ve been unable to find one for this sort
> of enterprise equipment (although they abound for desktops and consumer
> cards).
> --
> Chip Seraphine