ANSYS RSM Cluster (ARC) Job Submission from Rescale Desktops

This tutorial outlines how to set up a Rescale ARC cluster running the ANSYS RSM Cluster scheduler, which allows job submissions from Rescale Desktops to Linux compute nodes.

Limitations

  • Due to cloud provider compatibility, ARC is available only on certain core types
  • ARC will not automatically shut down the provisioned Linux compute cluster, so you must terminate it manually. To be safe, set a wall time on the provisioned cluster
  • ARC transmits input and output data over the network. Jobs with large input or output files may take longer than expected due to the time required to transfer them (see the sketch after this list for a rough estimate)
  • ARC is supported on ANSYS versions 19.0 and higher
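
To get a feel for how much the transfer limitation matters, the short sketch below estimates transfer time from file size and network throughput. The 50 MB/s default throughput is purely an illustrative assumption, not a measured property of any Rescale cluster; measure your own environment for realistic numbers.

```python
# Rough estimate of ARC file-transfer overhead.
# The throughput value is an illustrative assumption.

def transfer_time_seconds(size_gb: float, throughput_mb_s: float = 50.0) -> float:
    """Approximate time to move size_gb of data at throughput_mb_s MB/s."""
    return size_gb * 1024 / throughput_mb_s

if __name__ == "__main__":
    for size_gb in (1, 10, 100):
        minutes = transfer_time_seconds(size_gb) / 60
        print(f"{size_gb:>4} GB -> ~{minutes:.1f} min at 50 MB/s")
```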

This tutorial presents an example using an ANSYS Fluent Workbench project. To obtain the Workbench project file (.wbpz) for the tutorial, click the Import Workbench Project button below, then click the Save option in the top right corner of the job submission page to keep a copy of this file in your Rescale cloud files.

The steps are as follows:

In order to submit a job to an ARC cluster from Workbench, you first need to spin up a compute cluster in the form of a Rescale job, and then send the job to that running cluster.

First, click on the + New Job icon in the top left-hand corner of the dashboard. Name the job, jump to the Software Settings page, and select ANSYS RSM Cluster (ARC). For this tutorial, we will select Version 19.1. No input files are needed here.

arc-cluster-software


You can leave the default command in the Command window. Select the appropriate license option below it.

Next, proceed to the Hardware Settings page and select the hardware configuration you would like to run your simulation on. Also set a suitable wall time so that the ARC cluster terminates after the simulation completes.

Once done, hit Submit. Your cluster will take a few minutes to spin up and configure itself. Use the live-tailing section to monitor the process_output.log file. Once the log looks like the one in the screenshot below and the cluster_config.areg file is present, your cluster is ready to receive jobs.

Three pieces of information from this step are required for subsequent steps: the submit host, the user name, and the password.

Click on process_output.log on the job status page to obtain this information.


log-message


For this tutorial, the ARC cluster login details obtained from process_output.log are shown below (a sketch for extracting these values from a saved copy of the log follows the list):

  • Submit host: ip-10-23-2-142
  • User name: udev-brian_cCZzXS
  • Password: cCzXS
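
If you prefer not to copy these values by hand, the minimal sketch below pulls them out of a downloaded copy of process_output.log. The label wording matched by the regular expressions is an assumption based on the list above ("submit host", "user name", "password"); adjust the patterns to match what your log actually prints.

```python
# Minimal sketch: extract ARC login details from a saved copy of
# process_output.log. The label wording matched here is an assumption
# based on the tutorial's example values; tweak for your actual log.
import re

FIELDS = {
    "submit_host": re.compile(r"submit host\s*[:=]?\s*(\S+)", re.IGNORECASE),
    "user_name": re.compile(r"user name\s*[:=]?\s*(\S+)", re.IGNORECASE),
    "password": re.compile(r"password\s*[:=]?\s*(\S+)", re.IGNORECASE),
}

def parse_arc_details(log_path: str) -> dict:
    """Return whichever of the three fields appear in the log."""
    details = {}
    with open(log_path) as log:
        for line in log:
            for key, pattern in FIELDS.items():
                match = pattern.search(line)
                if match and key not in details:
                    details[key] = match.group(1)
    return details

if __name__ == "__main__":
    print(parse_arc_details("process_output.log"))
```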

We will now set up a job on the Rescale platform and attach the Workbench archive file. The purpose of this step is to make the archive available on the Rescale Desktop.

First, click on the + New Job icon in the top left-hand corner of the dashboard and name the job. Click on Upload from this computer, browse to the location where the Workbench archive is saved, select the file, and click Open.

Jump to the Software Settings page, and select Bring Your Own Software as your software of choice.

attach-wb-file-software

You can leave the default hardware selection. Save the job.

Go to the Desktops tab at the top and click + New Custom Desktop. Under the Add Software tab, select ANSYS Fluent Desktop and enter your license information. We will use Version 19.1 for this tutorial.

Under Add Jobs, select the job created in the previous step.

Once the server has started and the loading grid has disappeared, click Connect > Connect using In-Browser Desktop. You will be taken to the remote desktop. Double-click the ANSYS Workbench shortcut icon on the desktop to launch ANSYS Workbench.

  • Go to Start Menu > View all Programs > Open RSM Configuration
  • Once RSM is open, click on + to add a new HPC resource

On the HPC Resource tab, provide a name for the HPC resource. Enter the submit host using the information printed in process_output.log on your cluster. Select linux-64 from the drop-down list. Click Apply. A quick way to confirm that the desktop can reach the submit host is sketched below.
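
If the Apply step fails, it can help to rule out basic network problems first. The sketch below attempts a plain TCP connection from the desktop to the submit host. The port number is an assumption (9212 is often cited as the RSM launcher default); substitute whatever port your cluster actually listens on.

```python
# Quick TCP reachability probe from the Rescale Desktop to the ARC
# submit host. The port is an assumption (9212 is often cited as the
# RSM launcher default); substitute your cluster's actual port.
import socket

def can_reach(host: str, port: int = 9212, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Submit host name from the tutorial's process_output.log
    print(can_reach("ip-10-23-2-142"))
```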

On the File Management tab, select the RSM internal file transfer method and specify $HOME/work/shared as your staging directory path on the cluster. Click Apply. On a Rescale cluster, the shared directory is the location you have permission to write files to, so specifying it as the staging directory provides a location that is both visible to and writable by the RSM client machine. A small check for this is sketched below.
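
As a sanity check that the staging path really is writable, you could run something like the sketch below on a cluster node (for example, as part of the job's command). It simply writes and removes a probe file in $HOME/work/shared.

```python
# Sanity check, meant to run on a Linux cluster node: verify that the
# RSM staging directory exists and is writable by the current user.
import os

staging = os.path.expandvars("$HOME/work/shared")
probe = os.path.join(staging, ".rsm_write_test")

try:
    with open(probe, "w") as f:
        f.write("ok")
    os.remove(probe)
    print(f"{staging} is writable")
except OSError as err:
    print(f"{staging} is NOT writable: {err}")
```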

On the Queue tab, click the Import/Refresh HPC Queues button. You should be prompted to enter credentials for your compute cluster. Fill in the credentials using the information from process_output.log.

After your credentials are verified, you should see a Default and a Local queue appear. Click Apply. The Default queue is the one that utilizes all nodes of your cluster.

Open ANSYS Workbench on the Rescale Desktop. Click on File and Restore Archive. Browse to Desktop > attached_jobs, open the job folder, and browse to the Workbench archive file. Click on the file and click Open. Once the project is open, right-click on Parameters and click on Properties. This will open the Properties window.

Make the selections in the Properties window as shown below.

Click on Save to save all the settings. Next, right-click on Parameters and click on Update All Design Points. You will notice that the status bar below is updated. Click on Monitor Jobs to see the status of the simulation.

The Job Monitor window is displayed. Each row in the top window represents a design point. By clicking on a row, the details are displayed in the Details window below.

Once all the design points are solved, double-click on Parameters in the ANSYS Workbench window and the results will be displayed.

Open ANSYS Workbench on a Rescale Desktop. Click on File and Restore Archive. Browse to Desktop > attached_jobs, open the job folder, and browse to the Workbench archive file. Save the project file and directory; we strongly recommend saving such files within the Z:/work directory.

The image below shows a sample Project schematic. As you can see:

  • System A (500 lb load) was already solved with 4 distributed cores
  • System B (buckling analysis) depends on System A and is not yet solved
  • System C (800 lb load) shares the same model as System A but is not yet solved
  • System D (1000 lb load) shares the same model as System A but is not yet solved

For this example, we have an ANSYS ARC cluster with 8 cores running. We want to solve systems B, C, and D with 4 cores each, so at most two of the three solves can run at the same time while the third waits in the queue (see the sketch below).
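
The core arithmetic is simple but worth spelling out; the toy calculation below shows why only two of the three 4-core solves run concurrently on an 8-core cluster. The actual scheduling is, of course, handled by the ARC queue.

```python
# Back-of-the-envelope core accounting for the example above.
total_cores = 8
cores_per_job = 4
systems = ["B", "C", "D"]

concurrent = total_cores // cores_per_job  # jobs that fit at once
queued = max(len(systems) - concurrent, 0)
print(f"{concurrent} solves run concurrently; {queued} waits in the queue.")
```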

Open the Mechanical GUI of System B by clicking on the Setup cell of System B. As we can see in the model tree on the left side, all systems are shown. Click on Tools > Solve Process Settings.

In the Solve Process Settings window, click on Add Queue and give it a name such as 4-core-queue. The newly created queue is highlighted, and we can select the Default RSM queue that we configured earlier in the RSM Configuration window.

  • RSM Queue: Default (the ARC cluster queue created and configured earlier)
  • Job Name: Mechanical
  • License: ANSYS Mechanical Enterprise Solver

Note that for multiple simultaneous submissions of ANSYS Mechanical in batch, you need enough license seats (e.g., the meba license feature); each concurrent solve typically consumes a seat.

Now, click Advanced, and select 4 cores as the Max number of utilized cores. Press OK to close Advanced Properties, and once again press OK to close Solve Process Settings.

To submit, highlight Solution (A6) in the model tree. Then, click on the drop-down arrow next to Solve, as seen in the image below, and select the RSM queue that we created earlier.

It is important not to click the Solve button itself, as that will not use any RSM queue and will solve locally instead.

The job is now submitted to the ARC Cluster and can be monitored in the Job Monitor window of Workbench. Each row in the top window represents a submission. By clicking on a row, the details are displayed in the Details window.

Finally, when the Job Monitor shows a job as completed, and while the ARC cluster is still connected, make sure to use Get Results to populate the results back into the Workbench project on the Rescale Desktop.