Access to the Cluster

Access to the HPC cluster is available for scientific computing linked to a chair or research project at the TU Berlin. The cluster is not intended for personal projects or educational purposes (e.g. exercises or labs).

Prerequisites

The prerequisite for access is a TUB account, which is issued to all members of the university at the start of employment via the provisioning procedure.

For cluster access, the responsible role administrator must create a team for each department, in which all access permissions are maintained. The group function ("Gruppenfunktion") must be activated for this team.

After the group name (assigned when adding the group function) has been reported to it-support@tu-berlin.de with the subject 'HPC:Team assignment', the members of the team are entered in the SLURM database. Synchronisation is carried out by automated scripts and can therefore take up to an hour.

Students who need the cluster for their thesis are granted access via the chair at which the thesis is supervised.

Login

Login to the cluster is done via SSH (built in on Linux/macOS) using the virtual address gateway.hpc.tu-berlin.de. This address is only reachable from the TUB subnets. The TUB account is used as the user name.

Within the TUB networks, login is possible as follows:

ssh <TUB-Account>@gateway.hpc.tu-berlin.de

Outside the TUB networks, the connection is established either via a VPN tunnel (Cisco AnyConnect) or via the SSH jump host:

ssh <TUB-Account>@sshgate.tu-berlin.de 
<TUB-Account>@sshgate.tu-berlin.de's password:

-bash-4.2$ ssh gateway.hpc.tu-berlin.de
<TUB-Account>@gateway.hpc.tu-berlin.de's password:

-bash-4.2$
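
With OpenSSH 7.3 or newer, the two steps can be combined into a single command using the ProxyJump option:

ssh -J <TUB-Account>@sshgate.tu-berlin.de <TUB-Account>@gateway.hpc.tu-berlin.de

The jump host can also be configured permanently in ~/.ssh/config; the host alias hpc below is an arbitrary example:

Host hpc
    HostName gateway.hpc.tu-berlin.de
    User <TUB-Account>
    ProxyJump <TUB-Account>@sshgate.tu-berlin.de

Afterwards, ssh hpc is sufficient.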

Windows workstations

Windows Subsystem for Linux (WSL) or additional software is required for access under Windows. For example, PuTTY or MobaXterm can be used.

Access to the cluster storage space

You can access your own files located on the cluster storage (BeeGFS) using scp or rsync. This makes it possible, for example, to copy files directly to your own computer or from there to the cluster. For direct access, i.e. without making a copy, sshfs can be used.
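
For example (a sketch; the file and directory names are placeholders, and remote paths without a leading slash are relative to your home directory on the cluster):

# copy a single file from the cluster to the current directory
scp <TUB-Account>@gateway.hpc.tu-berlin.de:results.csv .

# synchronise a local directory to the cluster
rsync -av ./data/ <TUB-Account>@gateway.hpc.tu-berlin.de:data/

# mount the cluster home directory locally, work on it, then unmount (Linux)
mkdir -p ~/hpc
sshfs <TUB-Account>@gateway.hpc.tu-berlin.de: ~/hpc
fusermount -u ~/hpc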

Under Windows, an SFTP client can be used to transfer files.
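
Alternatively, the interactive sftp client that ships with OpenSSH (e.g. inside WSL) can be used; the file names below are placeholders:

sftp <TUB-Account>@gateway.hpc.tu-berlin.de
sftp> get results.csv
sftp> put input.dat
sftp> quit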