
ANSYS Remote Solve Manager User's Guide

ANSYS, Inc.
Southpointe
275 Technology Drive
Canonsburg, PA 15317
[email protected]
http://www.ansys.com
(T) 724-746-3304
(F) 724-514-9494

ANSYS Release 15.0
November 2013

ANSYS, Inc. is certified to ISO 9001:2008.

Copyright and Trademark Information

© 2013 SAS IP, Inc. All rights reserved. Unauthorized use, distribution or duplication is prohibited.

ANSYS, ANSYS Workbench, Ansoft, AUTODYN, EKM, Engineering Knowledge Manager, CFX, FLUENT, HFSS and any
and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks or
trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries. ICEM CFD is a trademark used
by ANSYS, Inc. under license. CFX is a trademark of Sony Corporation in Japan. All other brand, product, service
and feature names or trademarks are the property of their respective owners.

Disclaimer Notice

THIS ANSYS SOFTWARE PRODUCT AND PROGRAM DOCUMENTATION INCLUDE TRADE SECRETS AND ARE CONFID-
ENTIAL AND PROPRIETARY PRODUCTS OF ANSYS, INC., ITS SUBSIDIARIES, OR LICENSORS. The software products
and documentation are furnished by ANSYS, Inc., its subsidiaries, or affiliates under a software license agreement
that contains provisions concerning non-disclosure, copying, length and nature of use, compliance with exporting
laws, warranties, disclaimers, limitations of liability, and remedies, and other provisions. The software products
and documentation may be used, disclosed, transferred, or copied only in accordance with the terms and conditions
of that software license agreement.

ANSYS, Inc. is certified to ISO 9001:2008.

U.S. Government Rights

For U.S. Government users, except as specifically granted by the ANSYS, Inc. software license agreement, the use,
duplication, or disclosure by the United States Government is subject to restrictions stated in the ANSYS, Inc.
software license agreement and FAR 12.212 (for non-DOD licenses).

Third-Party Software

See the legal information in the product help files for the complete Legal Notice for ANSYS proprietary software
and third-party software. If you are unable to access the Legal Notice, please contact ANSYS, Inc.

Published in the U.S.A.


Table of Contents
1. Overview ................................................................................................................................................. 1
1.1. RSM Roles and Terminology .............................................................................................................. 1
1.2. Typical RSM Workflows ...................................................................................................................... 2
1.3. File Handling .................................................................................................................................... 4
1.4. RSM Integration with ANSYS Client Applications ................................................................................ 5
1.4.1. RSM Supported Solvers ............................................................................................................ 5
1.4.2. RSM Integration with Workbench ............................................................................................. 5
2. Installation and Configuration ............................................................................................................... 7
2.1. Software Installation ......................................................................................................................... 7
2.1.1. Installing a Standalone RSM Package ........................................................................................ 7
2.1.2. Uninstalling RSM ...................................................................................................................... 8
2.2. Using the ANSYS Remote Solve Manager Setup Wizard ...................................................................... 8
2.3. RSM Service Installation and Configuration ...................................................................................... 10
2.3.1. Installing and Configuring RSM Services for Windows ............................................................. 10
2.3.1.1. Installing RSM Services for Windows .............................................................................. 10
2.3.2. Installing and Configuring RSM Services for Linux ................................................................... 12
2.3.2.1. Configuring RSM to Use a Remote Computing Mode for Linux ........................................ 12
2.3.2.2. Installing RSM Services for Linux .................................................................................... 13
2.3.2.2.1. Starting RSM Services Manually for Linux ............................................................... 13
2.3.2.2.1.1. Manually Running RSM Service Scripts for Linux ............................................ 14
2.3.2.2.1.2. Manually Uninstalling RSM Services for Linux ................................................ 15
2.3.2.2.2. Starting RSM Services Automatically at Boot Time for Linux ................................... 15
2.3.2.2.2.1. Installing RSM Automatic Startup (Daemon) Services for Linux ...................... 15
2.3.2.2.2.2. Working with RSM Automatic Startup (Daemon) Services for Linux ................ 17
2.3.2.2.2.3. Uninstalling RSM Automatic Startup (Daemon) Services for Linux .................. 17
2.3.2.3. Additional Linux Considerations .................................................................................... 18
2.3.3. Configuring a Multi-User Manager or Compute Server ............................................................ 19
2.3.4. Configuring RSM for a Remote Computing Environment ......................................................... 19
2.3.4.1. Adding a Remote Connection to a Manager ................................................................... 20
2.3.4.2. Adding a Remote Connection to a Compute Server ........................................................ 20
2.3.4.3. Configuring Computers with Multiple Network Interface Cards (NIC) .............................. 20
2.4. Setting Up RSM File Transfers .......................................................................................................... 21
2.4.1. Operating System File Transfer Utilizing Network Shares ......................................................... 22
2.4.1.1. Windows-to-Windows File Transfer ................................................................................. 23
2.4.1.2. Linux-to-Linux File Transfer ............................................................................................ 24
2.4.1.3. Windows-to-Linux File Transfer ...................................................................................... 24
2.4.1.4. Verifying OS Copy File Transfers ...................................................................................... 26
2.4.2. Eliminating File Transfers by Utilizing a Common Network Share ............................................. 26
2.4.3. Native RSM File Transfer .......................................................................................................... 28
2.4.4. SSH File Transfer ..................................................................................................................... 28
2.4.5. Custom Client Integration ...................................................................................................... 28
2.5. Accessing the RSM Configuration File .............................................................................................. 28
3. User Interface ........................................................................................................................................ 31
3.1. Main Window ................................................................................................................................. 31
3.2. Menu Bar ........................................................................................................................................ 32
3.3. Toolbar ........................................................................................................................................... 33
3.4. Tree View ........................................................................................................................................ 34
3.5. List View ......................................................................................................................................... 36
3.6. Status Bar ....................................................................................................................................... 38
3.7. Job Log View .................................................................................................................................. 38

3.8. Options Dialog Box ......................................................................................................................... 40


3.9. Desktop Alert ................................................................................................................................. 40
3.10. Accounts Dialog ............................................................................................................................ 41
3.11. RSM Notification Icon and Context Menu ....................................................................................... 42
4. User Accounts and Passwords ............................................................................................................... 45
4.1. Adding a Primary Account .............................................................................................................. 46
4.2. Adding Alternate Accounts ............................................................................................................. 47
4.3. Working with Account Passwords .................................................................................................... 48
4.4. Manually Running the Password Application ................................................................................... 49
4.5. Configuring Linux Accounts When Using SSH .................................................................................. 50
5. Administration ...................................................................................................................................... 51
5.1. Automating Administrative Tasks with the RSM Setup Wizard .......................................................... 51
5.2. Working with RSM Administration Scripts ........................................................................................ 52
5.3. Creating a Queue ............................................................................................................................ 53
5.4. Modifying Manager Properties ........................................................................................................ 54
5.5. Adding a Compute Server ............................................................................................................... 55
5.5.1. Compute Server Properties Dialog: General Tab ...................................................................... 57
5.5.2. Compute Server Properties Dialog: Cluster Tab ........................................................................ 63
5.5.3. Compute Server Properties Dialog: SSH Tab ............................................................................ 67
5.6. Testing a Compute Server ............................................................................................................... 70
6. Customizing RSM .................................................................................................................................. 73
6.1. Understanding RSM Custom Architecture ........................................................................................ 73
6.1.1. Job Templates ........................................................................................................................ 73
6.1.2. Code Templates ..................................................................................................................... 73
6.1.3. Job Scripts ............................................................................................................................. 74
6.1.4. HPC Commands File ............................................................................................................... 74
6.2. Custom Cluster Integration Setup .................................................................................................... 75
6.2.1. Customizing Server-Side Integration ....................................................................................... 76
6.2.1.1. Configuring RSM to Use Cluster-Specific Code Template ................................................. 76
6.2.1.2. Creating Copies of Standard Cluster Code Using Custom Cluster Keyword ...................... 78
6.2.1.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type .......................... 79
6.2.1.4. Modifying Cluster-Specific HPC Commands File .............................................................. 80
6.2.2. Customizing Client-Side Integration ....................................................................................... 81
6.2.2.1. Configuring RSM to Use Cluster-Specific Code Template on the Client Machine ............... 82
6.2.2.2. Creating Copies of Sample Code Using Custom Client Keyword ...................................... 84
6.2.2.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type .......................... 84
6.2.2.4. Modifying Cluster-Specific HPC Commands File .............................................................. 85
6.2.3. Configuring File Transfer by OS Type and Network Share Availability ........................................ 86
6.2.3.1. Windows Client to Windows Cluster .............................................................................. 87
6.2.3.1.1. Windows-to-Windows, Staging Visible ................................................................... 87
6.2.3.1.2. Windows-to-Windows, Staging Not Visible ............................................................. 87
6.2.3.2. Windows Client to Linux Cluster .................................................................................... 87
6.2.3.2.1. Windows-to-Linux, Staging Visible ......................................................................... 88
6.2.3.2.2. Windows-to-Linux, Staging Not Visible .................................................................. 88
6.2.3.3. Linux Client to Linux Cluster .......................................................................................... 89
6.2.3.3.1. Linux-to-Linux, Staging Visible ............................................................................... 89
6.2.3.3.2. Linux-to-Linux, Staging Not Visible ........................................................................ 89
6.3. Writing Custom Code for RSM Integration ........................................................................................ 89
6.3.1. Parsing of the Commands Output ........................................................................................... 90
6.3.1.1. Commands Output in the RSM Job Log .......................................................................... 90
6.3.1.2. Error Handling ............................................................................................................... 90
6.3.1.3. Debugging .................................................................................................................... 91


6.3.2. Customizable Commands ....................................................................................................... 91


6.3.2.1. Submit Command ......................................................................................................... 91
6.3.2.2. Status Command ........................................................................................................... 92
6.3.2.3. Cancel Command .......................................................................................................... 92
6.3.2.4. Transfer Command ........................................................................................................ 92
6.3.2.5. Cleanup Command ........................................................................................................ 93
6.3.3. Custom Integration Environment Variables ............................................................................. 93
6.3.3.1. Environment Variables Set by Customer ......................................................................... 94
6.3.3.2. Environment Variables Set by RSM ................................................................................. 95
6.3.4. Providing Client Custom Information for Job Submission ........................................................ 96
6.3.4.1. Defining the Environment Variable on the Client ............................................................ 97
6.3.4.2. Passing the Environment Variable to the Compute Server ............................................... 97
6.3.4.3. Verify the Custom Information on the Cluster ................................................................. 98
7. Troubleshooting .................................................................................................................................... 99
A. ANSYS Inc. Remote Solve Manager Setup Wizard .................................................................................... 103
A.1. Overview of the RSM Setup Wizard ............................................................................................... 103
A.2. Prerequisites for the RSM Setup Wizard ......................................................................................... 105
A.3. Running the RSM Setup Wizard ..................................................................................................... 107
A.3.1. Step 1: Start RSM Services and Define RSM Privileges ............................................................ 107
A.3.2. Step 2: Configure RSM .......................................................................................................... 108
A.3.3. Step 3: Test Your RSM Configuration ...................................................................................... 109
A.4. Troubleshooting in the Wizard ...................................................................................................... 110
B. Integrating Windows with Linux using SSH/SCP ..................................................................................... 113
B.1. Configure PuTTY SSH .................................................................................................................... 114
B.2. Add a Compute Server .................................................................................................................. 117
C. Integrating RSM with a Linux Platform LSF, PBS, or SGE (UGE) Cluster ...................................................... 121
C.1. Add a Linux Submission Host as a Compute Server ........................................................................ 121
C.2. Complete the Configuration ......................................................................................................... 125
C.3. Additional Cluster Details .............................................................................................................. 125
D. Integrating RSM with a Windows Platform LSF Cluster ............................................................................ 127
D.1. Add the LSF Submission Host as a Compute Server ....................................................................... 127
D.2. Complete the Configuration ......................................................................................................... 130
D.3. Additional Cluster Details ............................................................................................................. 130
E. Integrating RSM with a Microsoft HPC Cluster ......................................................................................... 133
E.1. Configure RSM on the HPC Head Node .......................................................................................... 133
E.2. Add the HPC Head Node as a Compute Server ............................................................................... 134
E.3. Complete the Configuration .......................................................................................................... 136
E.4. Additional HPC Details .................................................................................................................. 136
Glossary ................................................................................................................................................... 139
Index ........................................................................................................................................................ 143

Chapter 1: RSM Overview
The Remote Solve Manager (RSM) is a job queuing system that distributes tasks requiring computing
resources. RSM enables a task to be run in background mode on the local machine, sent to a remote
machine for processing, or broken into a series of jobs for parallel processing across a variety of
computers.

Computers with RSM installed are configured to manage jobs using three primary services: the RSM
Client service, the Solve Manager service (typically shortened to “Manager”), and the Compute Server
service. You use the RSM Client interface to manage jobs.

RSM Clients submit jobs to a queue, and the Manager dispatches these jobs to idle Compute Servers
that run the submitted jobs. These services and their capabilities are explained in RSM Roles and
Terminology (p. 1).

The following topics are discussed in this overview:


1.1. RSM Roles and Terminology
1.2. Typical RSM Workflows
1.3. File Handling
1.4. RSM Integration with ANSYS Client Applications

1.1. RSM Roles and Terminology


The following terms are essential to understanding RSM uses and capabilities:

Job
A job consists of a job template, a job script, and a processing task submitted from a client application
such as ANSYS Workbench. The job template is an XML file that specifies input and output files of the
client application. The job script runs an instance of the client application on the Compute Server(s)
used to run the processing task.

Client Application
A client application is the ANSYS application used to submit jobs to RSM, and then solve those jobs as
managed by RSM. Examples include ANSYS Workbench, ANSYS Fluent, ANSYS CFX, etc.

Queue
A queue is a list of Compute Servers available to run jobs. When a job is sent to a queue, the Manager
selects an idle Compute Server in the list.

Compute Server
Compute Servers are the machines on which jobs are run. In most cases, the Compute Server refers to
a remote machine, but it can also refer to your local machine ("localhost").

The Compute Server can be a Windows-based computer or a Linux system equipped with Mono,
the open source development platform based on the .NET framework. The job script performs a
processing task (such as running a finite element solver). If the job script requires a client application
to complete that task, that client application must be installed on the Compute Server.


Once Compute Servers are configured, they are added to a queue (which can contain multiple
Compute Servers). Jobs must specify a queue when they are submitted to a Manager.

RSM Manager
The RSM Manager (also called the “Solve Manager”) is the central RSM service that dispatches jobs to
computing resources. It contains a configuration of queues (lists of Compute Servers available to run
jobs).

RSM Clients submit jobs to one or more queues configured for the Manager, and their jobs are
dispatched to Compute Servers as resources become available.

The RSM administrator decides if users should use the Manager on their local machine or a central
Manager, depending on the number of users and compute resources.

RSM Client
The RSM Client is a computer that runs both RSM and a client application such as ANSYS Workbench.
RSM enables this computer to off-load jobs to a selected queue.

Code Template
A code template is an XML file containing code files (for example, C#, VB, JScript), references, and support
files required by a job. For more information on code templates, see Job Templates.

1.2. Typical RSM Workflows


Any computer with RSM installed can act as the RSM Client, Manager, Compute Server, or any simultan-
eous combination of these three functions. This section provides an overview of several configurations
of these functions as they are typically seen in RSM workflows . For specific instruction regarding RSM
configurations, refer to RSM Service Installation and Configuration (p. 10).

The most effective use of RSM is to designate one computer as the Manager for central management
of compute resources. All RSM Clients submit jobs to one or more queues configured for that Manager,
and the Manager dispatches jobs as compute resources become available on Compute Servers.

The following list shows several typical RSM usage workflows:

1. The RSM Client submits jobs using RSM (running locally) directly to itself so that the job runs locally in
background mode. Here, the RSM Client, the Manager, and the Compute Server are all on the local machine.
This capability is available automatically when you install ANSYS Workbench.

2. The RSM Client submits jobs to the Manager running locally on the same machine. You can assign a remote
Compute Server to run the job or split the job between multiple Compute Servers, optionally including
your local machine (as depicted in the second workflow below). A remote Compute Server requires RSM
and the client application to be installed (the client application is typically installed with ANSYS Workbench,
which also includes RSM).

3. An RSM Client machine submits jobs to a Manager running on a remote machine (refer to Adding a Remote
Connection to a Manager (p. 20)). The remote machine also acts as the Compute Server. This configuration
is available automatically when both machines have ANSYS Workbench installed.

4. An RSM Client machine submits jobs to a Manager running on a remote machine. The Manager then
assigns the job to a remote Compute Server(s). The RSM Client and the Compute Servers must have
ANSYS Workbench installed. You can install ANSYS Workbench on the Manager, or choose to install only
standalone RSM software, as described in Software Installation (p. 7).


1.3. File Handling


Input files are generally transferred from the RSM Client working directory to the Manager project
directory, and then to the Compute Server working directory where the job is run. Output files generated
by the job are transferred back to the Manager’s project storage immediately when the job finishes.
The files are stored there until the client application downloads the output files. This section provides
more details about how RSM handles files.

Client Application
The location of files on the RSM Client machine is controlled by the client application (for example, ANSYS
Workbench). When the RSM Client submits a job to a Manager, it specifies a directory where inputs are
found and where output files are placed. Refer to the client application documentation to determine
where input files are placed when submitting jobs to RSM.

Input files are copied to the Manager immediately when the job is submitted.

RSM Manager
The RSM Manager creates a project directory as defined in the project directory input from the RSM UI.
However, when the Manager is local to the client (i.e., when it is on the same machine as the RSM Client),
it ignores the RSM UI setting and creates the directory where the job is saved. The base project directory
location is controlled with the Solve Manager Properties dialog (see Modifying Manager Proper-
ties (p. 54)). All job files are stored in this location until the RSM Client releases the job. Jobs can also
be deleted manually in the RSM user interface.

Compute Server
If the Working Directory property on the General tab of the Compute Server Properties dialog is set
to Automatically Determined, the Compute Server reuses the Manager’s project directory as an optim-
ization. Otherwise, the Compute Server creates a temporary directory in the location defined in the
Working Directory property on the General tab of the Compute Server Properties dialog. If the
Working Directory property is left blank, the system TMP variable is used. When the job is complete,
output files are immediately copied back to the Manager's Project Directory. If the Delete Job Files in
Working Directory check box of the Compute Server Properties dialog is selected (default), the temporary
directory is then deleted.

Linux SSH
When Windows to Linux SSH file transfer is required by security protocols, the Linux Working Directory
property on the SSH tab of the Compute Server Properties dialog determines where files are located.
If this field is empty, the account’s home directory is used as the default location. In either case, a unique
temporary directory is created.

Third-Party Schedulers
When using the RSM job scripts that integrate with third-party schedulers such as LSF, PBS, Microsoft
HPC (previously known as Microsoft Compute Cluster), SGE (UGE) , etc., the file handling rules listed in
this section apply to the extent that RSM is involved. For more information on integrating RSM with
various third-party schedulers, see:

• Compute Server Properties Dialog: Cluster Tab

• Appendix C

• Appendix D

• Appendix E


File Transfer Methods


ANSYS Remote Solve Manager offers several methods of transferring files. The preferred method, OS
File Transfer, uses existing network shares and the built-in operating system copy commands to copy
files. Other methods include native RSM file transfer, SSH file transfer, and complete custom integration.
You can also reduce or eliminate file transfers by sharing a network save/storage location.

For more information, see Setting Up RSM File Transfers (p. 21).

1.4. RSM Integration with ANSYS Client Applications


This section discusses RSM compatibility and integration topics related to ANSYS client applications.

For client application-specific RSM instruction, integration, or configuration details, refer to the following
resources:

• Submitting Solutions for Local, Background, and Remote Solve Manager (RSM) Processes in the Workbench
User's Guide

• For tutorials featuring step-by-step instructions for specific configuration scenarios, go to the Downloads
page of the ANSYS Customer Portal. For further information about tutorials and documentation on the
ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

• The client application documentation

The following topics are discussed in this section.


1.4.1. RSM Supported Solvers
1.4.2. RSM Integration with Workbench

1.4.1. RSM Supported Solvers


RSM supports the following solvers:

• CFX

• Fluent

• Mechanical (excluding the Samcef solver)

• Mechanical APDL

• Polyflow

1.4.2. RSM Integration with Workbench


Many ANSYS Workbench applications enable you to use RSM; however, the following considerations
may apply:

• Some applications may not always work with remote Compute Servers or Managers.

• When a client application is restricted to the RSM Client machine, RSM enables the client application to
run in the background.

• When a client application can send jobs to remote Compute Servers, the job may be run completely on
one Compute Server, or the job may be broken into pieces so that each piece can run in parallel on
multiple Compute Servers (possibly including the RSM Client machine). In the case where a job is being
run in parallel on multiple machines, you need to ensure that the software that controls the parallel
processing is supported on all of the Compute Servers.

Chapter 2: ANSYS Remote Solve Manager Installation and Configuration
A general overview of RSM installation and configuration is presented in this chapter.

This section discusses the following installation and configuration topics:


2.1. Software Installation
2.2. Using the ANSYS Remote Solve Manager Setup Wizard
2.3. RSM Service Installation and Configuration
2.4. Setting Up RSM File Transfers
2.5. Accessing the RSM Configuration File

For tutorials featuring step-by-step instructions for specific configuration scenarios, go to the Downloads
page of the ANSYS Customer Portal. For further information about tutorials and documentation on the
ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

2.1. Software Installation


RSM is automatically installed with ANSYS Workbench products. You can also install RSM by itself if
desired. For example, you may want to install RSM by itself on a computer that acts as a dedicated
Manager; a Manager requires only an RSM installation for connectivity with remote RSM Clients and
Compute Servers. RSM Clients and Compute Servers require ANSYS Workbench, the ANSYS applications
you want to run, and RSM. Administrator privileges are not required to install or uninstall RSM on RSM
Client machines.

The following RSM installation topics are discussed in this section:


2.1.1. Installing a Standalone RSM Package
2.1.2. Uninstalling RSM

2.1.1. Installing a Standalone RSM Package


In addition to the default method of installing Remote Solve Manager along with Workbench, it is
possible to install a “standalone” RSM package (i.e., to install everything necessary to run RSM services
and the RSM interface, but without a full ANSYS Workbench installation that includes the ANSYS Mechanical,
ANSYS Fluent, ANSYS CFX, ANSYS Polyflow, and other solvers). You can install the standalone RSM package
on either a Windows or a Linux machine via the ANSYS Product Installation Wizard, as follows:

1. Run the wizard as described in Installing ANSYS, Inc. Products.

2. On the Select the products to install page:

• Under ANSYS Additional Tools, select the ANSYS Remote Solve Manager Standalone Services
check box.

• Deselect all the other check boxes.


3. Continue the installation process as directed.

Note

Installing a standalone RSM package does not install RSM services at the same time; you still need to
install or start the necessary RSM services. For instructions, see Installing RSM Services for Windows
or Installing RSM Services for Linux.

2.1.2. Uninstalling RSM


Uninstall RSM with Workbench
For a default installation, where RSM was installed along with ANSYS Workbench, RSM is removed when
you perform a full uninstall of Workbench and ANSYS products. Run the ANSYS Product Uninstall wizard
and click the Select All button to remove all products.

Uninstall a Standalone RSM Package


To uninstall a standalone RSM package, run the ANSYS Product Uninstall wizard and select only the
ANSYS RSM check box.

Uninstall a Standalone RSM Package Manually


To uninstall a standalone RSM package manually, first uninstall all RSM services.

• To uninstall RSM services for Windows, see Uninstalling RSM Services for Windows (p. 11).

• To uninstall RSM services started manually for Linux, see Manually Uninstalling RSM Services for
Linux (p. 15).

• To uninstall RSM daemon services for Linux, see Uninstalling RSM Automatic Startup (Daemon) Services
for Linux (p. 17).

After the services have been uninstalled, delete the RSM installation directory.

2.2. Using the ANSYS Remote Solve Manager Setup Wizard


The ANSYS Remote Solve Manager Setup Wizard is a new utility that guides you through the process
of setting up and configuring Remote Solve Manager; instead of using manual setup processes, you
can launch the wizard and follow its instructions for each part of the setup. Depending on the RSM
Layout you intend to use, you may need to run the wizard on multiple machines. The wizard will walk
you through the following setup tasks:

• Start RSM services

Note

– The creation of shared directories needed for use with a commercial cluster is performed
as part of the Wizard configuration.


– To start RSM services when UAC is enabled on Windows 7, you must launch the wizard
using the right-click Run as administrator menu option. For instructions on enabling
or disabling UAC, see RSM Troubleshooting (p. 99).

• Configure the machines to be included in your RSM Layout

• Perform various cluster configuration tasks

• Integrate RSM with the following third-party job schedulers (without requiring job script customization):

– LSF (Windows and Linux)

– PBS (Linux only)

– Microsoft HPC

– SGE (UGE)

• Create and share RSM directories (Project Directory, Working Directory, and where applicable, Shared
Cluster Directory)

• Define queues

• Create accounts

• Test the final RSM configuration

To launch the RSM Setup Wizard:

1. Log into the machine that will serve as the Manager. If you are configuring a cluster, this is the head
node of the cluster.

• For Windows, you must either have Windows administrative privileges on the Manager, have RSM
administrative privileges (as a member of the RSM Admins user group), or launch the wizard via
the right-click Run as administrator menu option.

• For Linux, you must log in with root privileges or have non-root administrative privileges. (“Non-
root administrative privileges” means that you are a member of the rsmadmins user group. Before
you run the wizard, your IT department must create the rsmadmins user group and manually
add any users who will be starting/running non-daemon services.)

2. Launch the wizard:

• For Windows: Select Start > All Programs > ANSYS 15.0 > Remote Solve Manager > RSM Setup
Wizard 15.0. Alternatively, you can navigate to the [RSMInstall]\bin directory and double-click
Ans.Rsm.Wizard.exe.

• For Linux: Open a terminal window in the [RSMInstall]\Config\tools\linux directory and
run rsmwizard.

Note that the wizard requires different privileges for different parts of the RSM setup process. For details
on necessary permissions, see Prerequisites for the RSM Setup Wizard (p. 105).


For detailed information on the wizard’s requirements, prerequisites, and capabilities, see Appendix A (p. 103).

For a quick-start guide on using the wizard, see the Readme file. To access this file:

• For Windows: Select Start > All Programs > ANSYS 15.0 > Remote Solve Manager > Readme -
RSM Setup Wizard 15.0.

• For Linux: Navigate to the [RSMInstall]\Config\tools\linux directory and open rsm_wiz.pdf.

For more detailed information on the wizard’s capabilities, prerequisites, and use, see Appendix A (p. 103).

2.3. RSM Service Installation and Configuration


This section includes instructions for installing and configuring RSM services for Windows or Linux
machines.
2.3.1. Installing and Configuring RSM Services for Windows
2.3.2. Installing and Configuring RSM Services for Linux
2.3.3. Configuring a Multi-User Manager or Compute Server
2.3.4. Configuring RSM for a Remote Computing Environment

2.3.1. Installing and Configuring RSM Services for Windows


The following RSM configuration topics for Windows are discussed in this section:
2.3.1.1. Installing RSM Services for Windows

2.3.1.1. Installing RSM Services for Windows


On a Windows machine, you can configure RSM services to start automatically at boot time by running
the RSM startup script for Windows. You can also uninstall and restart the services by running the script
with the appropriate command line options.

Note

• RSM services cannot be started from a network installation. It is recommended that you install
RSM on a local machine.

• For GPU requirements when RSM is installed as a service, see GPU Requirements in the
Installation and Licensing documentation.

RSM Command Line Options for Windows


By adding the following command line options to the end of an RSM service script, you can specify
what service or services you wish to configure.

-mgr
Command line option for applying the command to the Manager service.

-svr
Command line option for applying the command to the Compute Server service.


If you use both options with the selected script, the script will be applied to both services.

Configuring RSM Services to Start Automatically at Boot Time for Windows


To configure RSM services to start automatically at boot time, run the AnsConfigRSM.exe script.

1. Log into a Windows account with administrative privileges.

2. Ensure that Ans.Rsm.* processes are not running in the Windows Task Manager.

3. Open a command prompt in the [RSMInstall]\bin directory.

4. Enter the AnsConfigRSM.exe script into the command line, specifying the service by using the
appropriate command line options. The examples below show how to configure both services, the Manager
service only, or the Compute Server service only.
AnsConfigRSM.exe -mgr -svr

AnsConfigRSM.exe -mgr

AnsConfigRSM.exe -svr

5. Run the command.

Note

Windows 7 users may need to select the Run as administrator option.

If the RSM services have been removed, you can also use the above sequence of steps to reconfigure
the services.

Uninstalling RSM Services for Windows


To unconfigure (remove) all RSM services, run the AnsUnconfigRSM.exe script.

1. Log into a Windows account with administrative privileges.

2. Ensure that Ans.Rsm.* processes are not running in the Windows Task Manager.

3. Open a command prompt in the [RSMInstall]\bin directory.

4. Enter the AnsUnconfigRSM.exe script into the command line.

5. Run the command.

Note

• If you are using a Windows 7 operating system, you may need to select the Run as
administrator option from the right-click context menu.

• The uninstaller can only stop services which were started by and are owned by the user
performing the uninstall.


2.3.2. Installing and Configuring RSM Services for Linux


The following RSM configuration topics for Linux are discussed in this section:
2.3.2.1. Configuring RSM to Use a Remote Computing Mode for Linux
2.3.2.2. Installing RSM Services for Linux
2.3.2.3. Additional Linux Considerations

2.3.2.1. Configuring RSM to Use a Remote Computing Mode for Linux


When RSM is installed on a Linux-based platform, you can select either native communication mode
or SSH communication mode for RSM to communicate with remote machines. The differences between
these two modes are detailed below:

Protocol Type
    Native communication: Uses the RSM application to execute commands and copy data to/from
    Compute Servers.
    SSH communication: Uses an external SSH application to execute commands and copy data to/from
    Compute Servers.

Installation Requirements
    Native communication: Requires RSM to be installed and running on the Compute Server (see
    Starting RSM Services Manually for Linux (p. 13)).
    SSH communication: Requires installation of an SSH client (PuTTY SSH) on the RSM Client machines
    (see Appendix B).

Data Transfer Efficiency
    Native communication: Most efficient data transfer for solution process launch and retrieval of results.
    SSH communication: Communication overhead slows solution process launch and retrieval of results.

Platform Support
    Native communication: Supported on Windows and Linux only.
    SSH communication: Supported on all platforms.

ANSYS recommends that you use native communication where possible, and use SSH where platform
support or IT policy requires it.

Configuring Native Cross-Platform Communications


In RSM, it is possible to configure a Linux machine for native mode communications. For performance
reasons, native mode is the recommended method for cross-platform RSM communications; SSH should
only be used if your IT department requires it.

With native mode, a Linux Compute Server has RSM installed and running locally, so the SSH protocol
isn’t needed to provide communications between a Windows Compute Server and a Linux Compute
Server. You can configure native mode communications by performing either of the following options
on the Linux machine:

• OPTION A: Run the ./rsmmanager and ./rsmserver scripts to manually start the Manager and
Compute Server services. Refer to Starting RSM Services Manually for Linux (p. 13) for more information.

• OPTION B: Configure RSM to start the Manager and Compute Server services at boot, as described in
Starting RSM Services Automatically at Boot Time for Linux (p. 15).

Adding Common Job Environment Variables for Native Jobs


Before installing and starting the RSM service on Linux, you can edit the rsm_env_profile file under
the [RSMInstall]/Config/tools/linux directory. In this file, you can add any common job
environment variables needed for native jobs to run. For example, you can use this file to source
environment variables specific to a batch-queueing system, or you can append a cluster-specific PATH.
Once defined, the RSM service and native jobs should inherit these environment settings when any job
is run. It is useful to be able to set common environment variables in a single place instead of having
to set them in each job user's .cshrc or .profile file in the user’s $HOME directory.

The following shows the content of the rsm_env_profile file:


#!/bin/sh

# The following examples show loading environment settings specific to batch system (e.g. LSF, SGE/UGE).
# If defined, RSM service and jobs should then inherit this environment when a job is run.

# . /home/batch/lsf7.0/conf/profile.lsf
# . /home/batch/SGE6.2u2/default/common/settings.sh
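
For example, a site could uncomment and adapt lines like the following. The paths shown here are purely
illustrative assumptions; substitute the actual locations of your scheduler profile and cluster tools:

# Source the batch system profile so RSM service and native jobs can find the scheduler commands
# (example path only; adjust for your site).
. /opt/lsf/conf/profile.lsf

# Append a cluster-specific bin directory to PATH (example path only; adjust for your site).
export PATH=$PATH:/opt/cluster/bin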

2.3.2.2. Installing RSM Services for Linux


The following related topics are discussed in this section:
2.3.2.2.1. Starting RSM Services Manually for Linux
2.3.2.2.2. Starting RSM Services Automatically at Boot Time for Linux

2.3.2.2.1. Starting RSM Services Manually for Linux


Manager and Compute Server machines must have RSM services running in order to manage or run
jobs. If you are submitting jobs to a Manager or Compute Server on a remote machine, you can start
RSM services manually by running the scripts detailed in this section. These scripts include:

rsmmanager
Starts the Manager service.

rsmserver
Starts the Compute Server service.

rsmxmlrpc
Starts the XmlRpcServer service (required for EKM servers only).

These scripts are generated as part of the RSM installation process and are located in the
WBInstallDir/RSM/Config/tools/linux directory. If for some reason these scripts were not generated
during installation or are otherwise not available, you can generate them yourself. For instructions,
see Generating RSM Service Startup Scripts for Linux (p. 99) in the RSM Troubleshooting (p. 99) section.

Important

When installing RSM services, you must determine whether you want to start the RSM services
manually via the startup scripts or want to install the RSM services as daemons (i.e., start the
services automatically when the machine is booted). Only one of these methods should be used.

Important

For security reasons, it is recommended that you do not start and run RSM service processes
manually as the “root” user. If you are installing RSM on a multi-user Linux machine, the recommended
practice is to install the services as daemons. See Starting RSM Services Automatically at Boot Time
for Linux (p. 15).

Note

Note that when RSM services are started manually, the RSM services run as a process for the
user who initiated the services. RSM services that were started manually are stopped each
time the machine is rebooted; after a reboot, before you submit any jobs to RSM you must
first restart the RSM services by running the appropriate startup scripts. If you’d prefer to
start the services automatically when the machine is booted, you can configure daemons as
described in Starting RSM Services Automatically at Boot Time for Linux (p. 15).

2.3.2.2.1.1. Manually Running RSM Service Scripts for Linux

You can run the RSM service scripts to manually start, stop, check the status of, and restart RSM services.

Starting an RSM Service Manually


You can start any of the three RSM services manually by running the appropriate service script with the
command line option start. The examples below illustrate how to start each of the RSM services
manually:
./rsmmanager start

./rsmserver start

./rsmxmlrpc start

Stopping an RSM Service Manually


You can stop any of the three RSM services manually by running the appropriate service script with the
command line option stop. The examples below illustrate how to stop each of the RSM services
manually:
./rsmmanager stop

./rsmserver stop

./rsmxmlrpc stop

Checking the Status of an RSM Service Manually


You can check the status of any of the three RSM services manually by running the appropriate service
script with the command line option status. The examples below illustrate how to check the status
of each of the RSM services manually:
./rsmmanager status

./rsmserver status

./rsmxmlrpc status

Restarting an RSM Service Manually


You can restart any of the three RSM services manually by running the appropriate service script with
the command line option restart. The examples below illustrate how to restart each of the RSM services
manually:
./rsmmanager restart

./rsmserver restart

./rsmxmlrpc restart


2.3.2.2.1.2. Manually Uninstalling RSM Services for Linux

1. Log into a Linux account with administrative privileges.

2. Ensure that Ans.Rsm.* processes are not running.

3. Open a terminal window in the RSM/Config/tools/linux directory.

4. Enter the rsmunconfig script into the command line, as shown below:
tools/linux#> ./rsmunconfig

5. Run the script.

Note

The uninstaller can only stop services which were started by and are owned by the user
performing the uninstall.

2.3.2.2.2. Starting RSM Services Automatically at Boot Time for Linux


You can configure RSM services to start automatically when the machine is booted by configuring them
as “daemon” services (if the services are not configured to start automatically, they must be started
manually, as described in Starting RSM Services Manually for Linux (p. 13)). Daemon services are scripts
or programs that run persistently in the background of the machine, and which are usually executed
at startup by the defined runlevel.

The following related topics are discussed in this section:


2.3.2.2.2.1. Installing RSM Automatic Startup (Daemon) Services for Linux
2.3.2.2.2.2. Working with RSM Automatic Startup (Daemon) Services for Linux
2.3.2.2.2.3. Uninstalling RSM Automatic Startup (Daemon) Services for Linux

2.3.2.2.2.1. Installing RSM Automatic Startup (Daemon) Services for Linux

Security Requirements for Daemon Service Configuration

To install RSM services as daemons, you must have system administrative permissions (i.e., you must
be logged in and installing as a “root” user or “sudoer”).

For security reasons, it is recommended that you do not run RSM services as the root user. Many Linux
operating systems allow only root users to listen on specific ports, so the ports that are required by the
RSM Solve Manager and Compute Server services may be blocked by system administration. For these
reasons, the RSM daemon service installation creates a non-root user account with no logon, called
rsmadmin; the account is a member of the rsmadmins user group and has a home directory of
/home/rsmadmin. The RSM daemon service is then run by the rsmadmin user.

Note

The RSM daemon service installation will only create the rsmadmin user account if the account
does not already exist. The same is true for the rsmadmins user group if the group name
does not exist. The account/group will be created locally on the computer on which the RSM
service(s) will be run. If you want the account/group to be managed in the master server by

Network Information Service (NIS), you need to ask your IT department to create an rsmadmin
user account and rsmadmins group from NIS before running RSM daemon service scripts.

Note

When an RSM package is installed under a directory, please make sure that all its parent
directories (not the files in the directory) have both read and execution permissions so that
the RSM service executable can be started by a non-root user.
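
For example, if the RSM package were installed under a hypothetical path such as
/opt/ansys_inc/v150/RSM, one way to grant read and execute permission on the parent directories
(an illustrative sketch only; adjust the paths and permission scheme to your site's policy) would be:

# Give other users read and execute permission on each parent directory of the RSM installation.
chmod o+rx /opt /opt/ansys_inc /opt/ansys_inc/v150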

Daemon Service Installation Methods

There are two ways to install RSM services as daemons: by running the rsmconfig script, or by running
the install_daemon script. The difference between the two methods is that the rsmconfig script
always generates fresh service scripts before starting the service installation, whereas the
install_daemon script assumes that the service scripts are already available in the
WBInstallDir/RSM/Config/tools/linux directory and uses the existing scripts for service installation,
allowing the system administrator to perform advanced script customizations before the services are
installed.

Both scripts are located in the RSM/Config/tools/linux directory and have the same command
line options.
tools/linux#> ./rsmconfig -help
Options:
-mgr: Install RSM Job Manager service.
-svr: Install RSM Compute Server service.
-xmlrpc: Install RSM XML-RPC Server.

tools/linux# ./install_daemon
Usage: ./install_daemon [-mgr] [-svr] [-xmlrpc]
Options:
-mgr: Install RSM Job Manager service.
-svr: Install RSM Compute Server service.
-xmlrpc: Install RSM XML-RPC Server.

Installing RSM Services as Daemons

To install RSM services as daemon services, run either the rsmconfig script or the install_daemon
script, as follows:

1. Log into a Linux account with administrative privileges.

2. Ensure that Ans.Rsm.* processes are not running.

3. Open a terminal window in the RSM/Config/tools/linux directory.

4. Enter the script into the terminal window.

5. Add the appropriate command line options (-mgr, -svr, or -xmlrpc).

6. Run the command.

The two examples below show the command line used to configure the Manager and Compute Server
service daemons via either the rsmconfig or the install_daemon script.
tools/linux#> ./rsmconfig -mgr -svr

tools/linux#> ./install_daemon -mgr -svr


Once the daemon service is installed, the RSM service is started automatically without a reboot. Each time the machine is rebooted thereafter, the installed RSM service will be started automatically.

Verifying the RSM Daemon Installation

To verify that the automatic boot procedure is working correctly, reboot the system and check to see
that the services are running by typing the appropriate ps command and looking for Ans.Rsm in
the resulting display:
ps aux | grep Ans.Rsm

2.3.2.2.2.2. Working with RSM Automatic Startup (Daemon) Services for Linux

Once an RSM daemon service is configured, any user can check the status of the service. System administrators can also stop, start, or restart the service.

Stopping the Daemon Service


To stop the daemon service:
/etc/init.d/rsmmanager150 stop

Checking the Status of the Daemon Service


To check the status of the daemon service:
/etc/init.d/rsmmanager150 status

Restarting the Daemon Service


To restart the daemon service:
/etc/init.d/rsmmanager150 restart
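
If you are not sure which RSM init scripts are installed on a given machine, listing them is a quick check (a sketch; the exact script names vary by service and release):
ls /etc/init.d | grep -i rsm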

2.3.2.2.2.3. Uninstalling RSM Automatic Startup (Daemon) Services for Linux

As with RSM daemon service installation, only a system administrator can uninstall the RSM daemon
service. Also, the uninstaller can only stop services which were started by and are owned by the user
performing the uninstall.

Uninstalling All RSM Daemon Services

To uninstall all RSM daemon services, run the rsmunconfig script (without command line options). The script is located in the WBInstallDir/RSM/Config/tools/linux directory.

The example below shows the command line used to uninstall all RSM service daemons.
tools/linux#> ./rsmunconfig

Uninstalling Individual RSM Daemon Services

To uninstall RSM daemon services individually, run the uninstall_daemon script. The script is located in the WBInstallDir/RSM/Config/tools/linux directory. Specify the service by using command line options, as shown below:
tools/linux# ./uninstall_daemon
Usage: ./uninstall_daemon [-mgr] [-svr] [-xmlrpc] [-rmadmin]
Options:
-mgr: Uninstall RSM Job Manager service.
-svr: Uninstall RSM Compute Server service.
-xmlrpc: Uninstall RSM XML-RPC Server.
-rmadmin : Remove 'rsmadmin' user and 'rsmadmins' group service account.


The example below shows the command line used to uninstall Solve Manager and Compute Server
service daemons via the uninstall_daemon script.
tools/linux#> ./uninstall_daemon -mgr -svr

Removing the Administrative User Account and Service Group Manually

By default, the rsmunconfig script does not remove the rsmadmin user account and rsmadmins
user group that were created earlier when the service was configured. This allows the same account and
user group to be reused for the next service installation and configuration, and also prevents the acci-
dental deletion of important files from the rsmadmin home directory (/home/rsmadmin).

However, if you decide that you do not want to keep the user account and user group, you can remove
them manually by adding the -rmadmin command line option to the uninstall_daemon script.
tools/linux#> ./uninstall_daemon -rmadmin

Important

The service account and group cannot be deleted if one or more RSM services are still being run by that user account and service group. When no services are being run by these accounts and RSM attempts to delete them, the above command will prompt you to answer “Yes” or “No”.

2.3.2.3. Additional Linux Considerations


When running RSM on Linux, the following considerations apply:

Linux Path Configuration Requirements


The RSM job scripts that integrate with Linux using PuTTY SSH require you to set AWP_ROOT150 in
the user's environment variables. If the job is not running properly, check the job log in the Job Log
view for "Command not found". Remote command clients like PuTTY SSH use the remote account's
default shell for running commands. For example, if the account's default shell is CSH, the following
line needs to be added to the .cshrc file (path may be different for your environment):
setenv AWP_ROOT150 /ansys_inc/v150
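
If the account's default shell is Bash instead, the equivalent entry in the .bashrc file would be (the installation path shown is an assumption):
export AWP_ROOT150=/ansys_inc/v150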

Note

• ~ (tilde) representation of the home directory is not supported for use in RSM paths (for example,
the Working Directory in the Compute Server Properties dialog).

• Different shells use different initialization files in the account's home directory and may use a different syntax than shown above. Refer to the Linux man page for the specific shell or consult the machine administrator.

RSH/SSH Settings for Inter/Intra-Node Communications


The Use SSH protocol for inter- and intra-node communication (Linux only) property, located on
the General tab of the Compute Server Properties dialog, determines whether RSM and solvers use
RSH or SSH for inter-node and intra-node communications on Linux machines. When Fluent, CFX, Mechanical, and Mechanical APDL are configured to send solves to RSM, their solvers will use the same RSH/SSH settings as RSM.

Explicit Dynamics Systems


RSM does not support Linux connections for Explicit Dynamics systems. Only Windows-to-Windows
connections are currently supported.

2.3.3. Configuring a Multi-User Manager or Compute Server


When configuring RSM on a single machine used by multiple users to submit RSM jobs, follow these
guidelines:

• All RSM users should have write access to the RSM working directory. The default working directory may
not function properly if write permissions are not enabled for all applicable users.

• All RSM users should cache their account password (refer to Working with Account Passwords (p. 48)). If
all users do not cache their password, only the user that started RSM on the machine can submit jobs.

• When installing RSM to a multi-user Linux machine, ANSYS strongly recommends that you set up RSM as a daemon (see Starting RSM Services Automatically at Boot Time for Linux (p. 15)). Running RSM as a daemon allows you to maintain consistent settings. If RSM is not run as a daemon, the settings vary depending on which user first starts RSM processes.

• If you are running ANSYS Workbench on a multi-user RSM machine, the My Computer, Background option
that is available for ANSYS Mechanical (see Using Solve Process Settings in the ANSYS Mechanical User's
Guide) will likely not function as expected with Rigid Dynamics or Explicit Dynamics due to write permissions
for RSM working directories. As a workaround for this issue, follow these guidelines:

– Ensure that Manager and Compute Server (ScriptHost) processes always run under the same user
account. This will ensure consistent behavior.

– Do not use the built-in ‘My Computer’ or ‘My Computer Background’ solve process settings.

– Add a remote Solve Process Setting that specifies that the Manager name is the machine name, rather
than localhost. For more information, see Using Solve Process Settings in the ANSYS Mechanical User's
Guide.

– To run more than one job simultaneously, adjust the Max Running Jobs property in the Compute
Server Properties dialog.

2.3.4. Configuring RSM for a Remote Computing Environment


You must configure RSM Clients to work with Managers and Compute Servers on remote computers.
If RSM services are run across multiple computers, refer to the following RSM configuration procedures:
2.3.4.1. Adding a Remote Connection to a Manager
2.3.4.2. Adding a Remote Connection to a Compute Server
2.3.4.3. Configuring Computers with Multiple Network Interface Cards (NIC)

Note

When communicating with a remote computer, whether RSM Client to Manager or Manager
to Compute Server, RSM services must be installed on those computers.


2.3.4.1. Adding a Remote Connection to a Manager


RSM Clients can monitor and configure multiple Managers. The following steps describe how to add a
remote connection to a Manager on a remote computer:

1. Launch RSM.

2. In the RSM main window select Tools > Options. The Options dialog appears.

3. In the Name field, enter the name of a remote machine with the Manager service installed.

4. Select the Add button and then OK. The Manager and all of its queues and Compute Servers appear
in the tree view.

5. Passwords are cached on the Manager machine, so you must set the password again. Refer to Working
with Account Passwords (p. 48) for this procedure.

2.3.4.2. Adding a Remote Connection to a Compute Server


To use compute resources on a remote Compute Server, the Manager machine must add a new Compute
Server as described in Adding a Compute Server (p. 55), and then configure remote Compute Server
connections with the following considerations:

• If the Compute Server is running Windows, only the machine name is required in the Display Name
property on the General tab of the Compute Server Properties dialog.

• If the Compute Server involves integration with a Linux machine or another job scheduler, refer to
Appendix B for integration details.

• Ensure that you have administrative privileges to the working directory of the new Compute Server.

• Always test the configuration of a connection to a new remote Compute Server after it has been
created, as described in Testing a Compute Server (p. 70).

2.3.4.3. Configuring Computers with Multiple Network Interface Cards (NIC)


When multiple NICs are used, RSM may require additional configuration to establish the desired communications between tiers (i.e., the RSM Client, Manager, and Compute Server machines).

The most likely scenario is that the issues originate with the Manager and/or Compute Server. First, try
configuring the Manager and/or Compute Server machine(s):

1. In a text editor, open the Ans.Rsm.JMHost.exe.config file (Manager) and/or the Ans.Rsm.SHHost.exe.config file (Compute Server). These files are located in Program Files\ANSYS Inc\v150\RSM\bin.

2. To both files, add the machine’s IP address to the TCP channel configuration. Substitute the machine’s
correct IP address for the value of machineName. The correct IP address is the address seen in the
output of a “ping” from a remote machine to the Fully Qualified Domain Name (FQDN).
<channel ref="tcp" port="9150" secure="false" machineName="1.2.3.4">

3. Save and close both files.


4. Restart the following services: ANSYS JobManager Service V15.0 and ANSYS ScriptHost
Service V15.0.

• For Windows: On your Administrative Tools or Administrative Services page, open the Services
dialog. Restart the services by right-clicking on the service and selecting Restart.

• For Linux: Log into a Linux account with administrative privileges and ensure that Ans.Rsm.* processes
are not running. Open a terminal window in the [RSMInstall]/Config/tools/linux directory
and run the following command: ./rsmmanager restart

If configuring the Manager and/or Compute Server does not resolve the problem, the RSM Client machine may have multiple NICs and require additional configuration. For example, a virtual NIC used for a VPN connection on an RSM Client machine can cause a conflict, even if it is not connected.

In that case, configure the multi-NIC RSM Client machine:

1. Using a text editor, create a file named Ans.Rsm.ClientApi.dll.config in Program Files\ANSYS Inc\v150\RSM\bin. (If this file does not exist, RSM uses a default configuration.)

2. Copy and paste the text below into Ans.Rsm.ClientApi.dll.config:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.runtime.remoting>
    <application>
      <channels>
        <channel ref="tcp" port="0" secure="true" machineName="ip_address">
          <clientProviders>
            <formatter ref="binary" typeFilterLevel="Full"/>
          </clientProviders>
        </channel>
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>

3. Replace ip_address with a valid IP address.

4. Save and close the file.

2.4. Setting Up RSM File Transfers


ANSYS Remote Solve Manager offers different methods of transferring files. The preferred method is
OS File Transfer and involves using existing network shares to copy the files using the built-in operating
system copy commands. Other methods include native RSM file transfer, SSH file transfer, and complete
custom integration. You can also reduce or eliminate file transfers by sharing a network save/storage
location.

One of these methods will be used when you are submitting a job to a remote machine. For details on
each method or how to eliminate file transfers, see:
2.4.1. Operating System File Transfer Utilizing Network Shares
2.4.2. Eliminating File Transfers by Utilizing a Common Network Share
2.4.3. Native RSM File Transfer
2.4.4. SSH File Transfer
2.4.5. Custom Client Integration


2.4.1. Operating System File Transfer Utilizing Network Shares


RSM file transfer provides the ability to use the Operating System (OS) Copy operation. The OS Copy operation can be significantly (up to 5 times) faster than the native file transfer used in the RSM code (as described in Native RSM File Transfer (p. 28)). OS Copy is a faster and more efficient method of file transfer because it utilizes standard OS commands and network shares. Typically, the client files are local to the Client machine and are transferred to the remote machines only for solving, because of storage speed, capacity, and network congestion concerns.

No specific configuration is necessary within RSM itself. To enable the OS Copy operation, you must
configure the directories that will be involved in the file transfer so that the target directory is both
visible to and writable by the source machine. Generally, the target directories involved are:

• The Project Directory on the Manager machine (as specified in the Solve Manager Properties dialog)

• The Working Directory on the Compute Server machine (as specified in the Compute Server
Properties dialog)

Once the configuration is complete, the RSM Client machine should be able to access the Project Dir-
ectory on the Manager machine and the Manager machine should be able to access the Working
Directory on the remote Compute Server machine. The OS Copy operation will be used automatically
for file transfers.


If two RSM services are on the same machine, no configuration is necessary for OS Copy to function between those two services. For example, in an RSM layout where the RSM Manager and Compute Server are on the same machine and the Client is running on a separate machine, the RSM Manager can access the Working Directory as long as the permissions are set to allow it. In this case, the only other configuration necessary is to ensure that the RSM Client can access the Manager's network-shared Project Directory on the remote machine.

The steps for configuring directories for the OS Copy operation, discussed in the following sections, are
different between Linux and Windows.

Note

For the sake of general applicability, the configuration instructions in the following sections
assume an RSM layout in which each service runs on a separate machine. In a typical envir-
onment, however, ANSYS suggests that the Manager and Compute Server be on the same
machine.

Related Topics:
2.4.1.1. Windows-to-Windows File Transfer
2.4.1.2. Linux-to-Linux File Transfer
2.4.1.3. Windows-to-Linux File Transfer
2.4.1.4. Verifying OS Copy File Transfers

2.4.1.1. Windows-to-Windows File Transfer


System Administrator permissions are required to configure directories for Windows-to-Windows OS
Copy file transfers.

For Windows-to-Windows file transfers, RSM uses predefined share names to locate and identify the
target directories. You must perform the following setup tasks for each of the target directories:

• Share the target directory out to the remote machine.

• Provide full read-write permissions for the shared directory.

Perform these steps for each of the target directories:

1. In Windows Explorer, right-click on the target directory.

This is the directory you want to make visible for the OS Copy operations: either the Manager
Project Directory or the Compute Server Working Directory.

2. Select the Sharing tab and click Share.

3. Click the Advanced Sharing button.

4. In the Advanced Settings dialog, click Share this Folder and enter the correct name for the share,
as shown below.

• For the Project Directory on the Manager machine, enter RSM_Mgr.

For example, the directory C:\Projects\ProjectFiles may have a share named \\winmachine06\RSM_Mgr.


• For the Working Directory on the Compute Server machine, enter RSM_CS.

For example, the directory D:\RSMWorkDir may have a share named \\winmachine2\RSM_CS.

5. Ensure that full read-write permissions are defined for the target directory.

6. This naming requirement applies only to the network share for the target directory; the directory
itself can have a different name.

Note

Once the target directory is shared, you can access it by typing the share path into Windows Explorer.

7. Perform these steps for the other target directory.

2.4.1.2. Linux-to-Linux File Transfer


Root permissions are required to configure directories for Linux-to-Linux OS Copy file transfers.

For Linux-to-Linux file transfers, RSM uses mount points to locate and identify the target directories.
You must configure each of the target directories by performing the following setup tasks:

1. Ensure that the target directory belongs to a file system that is mounted, so that the target directory
is visible to the machine on which the source directory is located. Use the full path for the target
directory.

2. Provide full read-write privileges for the target directory.
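
The exact commands depend on your environment. As a minimal NFS-based sketch (the host name, export path, and mount point below are assumptions), a Compute Server Working Directory exported from a machine named linuxserver could be made visible to the Manager machine as follows:
# On linuxserver, add the export to /etc/exports and re-export:
/rsm/WorkingDirectory   *(rw,sync)
exportfs -ra
# On the Manager machine, mount the export at the same full path:
mkdir -p /rsm/WorkingDirectory
mount linuxserver:/rsm/WorkingDirectory /rsm/WorkingDirectory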

2.4.1.3. Windows-to-Linux File Transfer


Root permissions on the Linux machine are required to configure directories for Windows-to-Linux OS
Copy file transfers.

For Windows-to-Linux transfers (using Samba or a similar Linux utility), entries in the Samba configuration
file map the actual physical location of the Linux target directories to the predefined Windows share
names that RSM uses to locate and identify the target directories. The following example shows how
to configure a Samba share on Linux for the target directories RSM requires for the OS Copy operation.
If you are unable to create the share, contact your IT System Administrator for assistance with this step.

Edit the smb.conf Samba configuration file to include definitions for each of the Linux target direct-
ories. The example below shows Samba’s default values for the Linux target directories.
[RSM_Mgr]

path = /home/staff/RSM/ProjectDirectory
browseable = yes
writable = yes
create mode = 0664
directory mode = 0775
guest ok = no

[RSM_CS]

path = /home/staff/RSM/WorkingDirectory
browseable = yes
writable = yes
create mode = 0664
directory mode = 0775
guest ok = no

The path should point to the actual physical location of the existing target directories. The path for the
Project Directory should match the Project Directory path defined in the Solve Manager Properties
dialog. The path for the Working Directory should match the Working Directory path defined in the
Compute Server Properties dialog.

After making your changes to smb.conf, restart the Samba server by running the following command:
/etc/init.d/smb restart

Note

The locations of files and method of restarting the Samba service may vary for different Linux
versions.
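
For instance, on a systemd-based distribution the equivalent command might be the following (the service name is an assumption; it is often smb, smbd, or samba depending on the distribution):
systemctl restart smb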

Verify that the Samba shares are accessible by your Windows machine, indicating that they have been
properly set up. Check this by using Windows Explorer and navigating to the locations shown below
(using your specific machine name in place of linuxmachinename):

• \\linuxmachinename\RSM_Mgr for the Project Directory on the Manager machine

• \\linuxmachinename\RSM_CS for the Working Directory on the Compute Server machine
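
Alternatively, you can check the shares from a Windows command prompt (a sketch; substitute your Linux machine name):
dir \\linuxmachinename\RSM_Mgr
dir \\linuxmachinename\RSM_CS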

Additional Windows-to-Linux Configuration When Using Alternate Accounts


A permissions issue can occur when an alternate account is used to run jobs on the Linux side. To resolve
this issue, make sure that Samba (or a similar Linux utility) is correctly configured.

The following code sample is from the Samba configuration file, smb.conf, showing a configuration
for file sharing between three accounts:

• A Windows account mapped to a Linux account

• An alternate account

• An account that runs as the RSM service


[RSM_CS]

path = /lsf/wbtest
browseable = yes
writable = yes
create mode = 0666
directory mode = 0777
guest ok = no

create mode:
The Samba default is 664, which corresponds to rw-rw-r--. If the alternate account is not in the same
group as the owner of the file, the job cannot write to the file and an error occurs for files that are both
inputs and outputs.


To provide full read-write access for all the accounts, set create mode to 666, as shown above in
the code sample. This sets the permissions for files that are copied from Windows to Linux to rw-
rw-rw, allowing all accounts to both read from and write to the file.

directory mode:
The Samba default is 775. If the copy from Windows to the Samba share results in the creation of direct-
ories, a value of 775 prevents the job running under the alternate account from creating files in the
newly copied subdirectories.

To allow the job to create files in the new subdirectories, set directory mode to 777.

After making your changes to smb.conf, restart the Samba server as shown above.

Note

The locations of files and method of restarting the Samba service may vary for different Linux
versions.
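
If jobs run under the alternate account still fail with permission errors, it can help to confirm the effective permissions from the Linux side; for example, using the sample share path from the configuration above:
ls -ld /lsf/wbtest      # directory permissions (should allow the alternate account to create files)
ls -l /lsf/wbtest       # file permissions on copied input/output files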

2.4.1.4. Verifying OS Copy File Transfers


After configuring a target directory for sharing, you can run a test server operation. Information about
the method used for file transfer is written to the job log in the RSM Job Log view and can be used to
verify whether RSM files are being transferred via the OS Copy operation:

In the job log, the messages “Manager network share is available” and “Compute Server network share
is available” indicate that all necessary directories are visible and OS Copy is being used.

2.4.2. Eliminating File Transfers by Utilizing a Common Network Share


Although Workbench projects are typically run locally, small projects, or larger models utilizing the exceptional networks and file systems that exist today, can be saved to and opened from a network share. When a shared Workbench storage location is used, this shared folder can be used to minimize file transfers. In particular, it can remove the need to transfer files between the Client machine and the remote machine(s); ideally, this storage would be directly attached to the Compute Server(s).

RSM places marker files in the RSM Client, Manager, and Compute Server directories to uniquely
identify the job.

• If the Manager finds the RSM Client’s marker in the project storage area (by recursively searching
subfolders), it will use that folder rather than copying the files to a separate folder.

• Similarly, if the Compute Server finds the Manager’s marker (by recursively searching subfolders), it
will also use that location rather than copying files unnecessarily.

Remember that while this leverages drivers at the operating system level which are optimized for network
file manipulation, files are still located on remote hard drives. As such, there will still be significant
network traffic, e.g. when viewing results and opening and saving projects. Each customer will have to
determine the RSM configuration that best utilizes network resources.

The Client must be able to access the Client Directory under the RSM Manager Project Directory. The Manager must have access to its sub-folders, including the RSM Client Directory and the shared Compute Server Working Directory. One or both of these directories will be under the shared Manager Project Directory in this setup.

Example: You can set up RSM to use file shares in order to remove unnecessary file transfers. For example, you might have a Linux directory /usr/user_name/MyProjectFiles/ that is shared via Samba or a similar method and mounted on the Windows Client machine as Z:\MyProjectFiles\. If you save your Workbench projects to this network location, you can set the Manager and Compute Server properties as follows in order to remove all file transfers and use the network share directly as the working directory:

• Manager

– For a Linux-based Manager, set the Project Directory Location property to /usr/user_name/MyProjectFiles/.

– For a Windows-based Manager, set the Project Directory Location property to Z:\MyProjectFiles\.

• Compute Server

– For a Linux-based Compute Server, set the Working Directory Location property to /usr/user_name/MyProjectFiles/.

– For a Windows-based Compute Server, set the Working Directory Location property to
Z:\MyProjectFiles\.


In some cases, you might still want a separate Working Directory and/or Project Directory, and thus would not define the corresponding network file share(s) as described above. For example, if the jobs
to be submitted will make heavy use of scratch space (as some Mechanical jobs do), you might wish
to retain a separate Working Directory which is on a separate physical disk and thus would not define
the two Working Directories to be in the same location.

2.4.3. Native RSM File Transfer


Native RSM file transfer occurs automatically if the preferred OS file copy or a Common Network Share
setup is not found. Native transfer requires no special setup or considerations, but is usually slower
than the preferred OS File copy setup. This method of file transfer uses the installed RSM services to
start a “service to service” file copy using the standard Microsoft .Net libraries. RSM has also included
some built-in compression features which can aid with copying over slow connections. For more in-
formation about these features see section Modifying Manager Properties (p. 54).

2.4.4. SSH File Transfer


SSH file transfer can be defined to transfer files between a Windows proxy Compute Server and a Linux machine, but is not supported in other configurations. SSH file transfer mode references an external PuTTY implementation and is not natively included with RSM; it is provided as an option for customers who must use this protocol because of their specific IT security requirements. This method is also usually slower than the preferred OS File Copy method, and thus is not recommended unless it is required. For more information on setting up SSH, see Appendix B (p. 113).

2.4.5. Custom Client Integration


RSM also provides a method for completely customizing RSM file handling, using client-side integration with customer-written scripts to suit specialized customer needs. For more information on custom integration techniques, see Customizing ANSYS Remote Solve Manager (p. 73).

2.5. Accessing the RSM Configuration File


RSM configuration data is stored in the RSM.Config file. It is not recommended that you edit this file, but you may want to locate it in order to create a backup copy of your RSM configurations. You can also manually load RSM configurations onto another machine by copying the file to the appropriate directory on that machine. The location of the RSM.Config file depends on how the Manager service has been installed.

To access the RSM.Config file:

• If the Manager service has been installed as a Windows service running as SYSTEM, the file is located in %ALLUSERSPROFILE%\Ansys\v150\RSM\RSM.Config.

• If the Manager is run as a normal process on Windows, the file is located in %AppData%\Ansys\v150\RSM\RSM.Config.

Note

For a user who can log on from different machines, the system must already be configured
to use the Roaming profile.


• On Linux, the file is located in ~/.ansys/v150/RSM/RSM.Config, where ~ is the home directory of the account under which the Manager is being run.
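
For example, a minimal way to back up the file on Linux before making any changes (the destination file name is an assumption):
cp ~/.ansys/v150/RSM/RSM.Config ~/RSM.Config.backup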

Chapter 3: ANSYS Remote Solve Manager User Interface
This chapter describes the following features of the RSM user interface:
3.1. Main Window
3.2. Menu Bar
3.3. Toolbar
3.4. Tree View
3.5. List View
3.6. Status Bar
3.7. Job Log View
3.8. Options Dialog Box
3.9. Desktop Alert
3.10. Accounts Dialog
3.11. RSM Notification Icon and Context Menu

3.1. Main Window


To launch the RSM application main window:

• If you are using a Windows system, select Start > All Programs > ANSYS 15.0 > Remote Solve Manager
> RSM 15.0.

• If you are using a Linux system, run the rsmadmin script.

The main window displays as shown below:

The RSM main window interface elements are described in the table that follows.


Menu Bar: Provides access to the following menus: File, View, Tools, and Help.

Toolbar: Contains the following tools, from left to right: the Show drop-down and the Remove, All Owner Jobs, and Job Log icons.

Tree View: Displays defined Solve Managers, along with the Queues and Compute Servers configured for each.

List View: Displays a listing of current jobs. You can delete jobs from this area by selecting one or more jobs from the list and selecting Remove from the context menu.

Job Log View: Displays the progress and log messages for the job selected in the List view.

Status Bar: Displays an icon indicating the status of the active operation.

3.2. Menu Bar


The menu bar provides the following functions:

File menu:

Minimize to System Tray: Hides the RSM main window. RSM continues to run in the system tray.

Exit: Exits the RSM application. Alternatively, you can right-click the RSM icon in the notification area (or system tray) and select Exit from the context menu.

View menu:

All Owner Jobs: Controls the display of jobs in the List view, allowing you to display or hide jobs according to ownership. Deselect to display only your own jobs. Select to display the jobs of all owners.

Job Log: Displays or hides the Job Log view.

Refresh Now: Forces the List view to update immediately, regardless of the update speed setting.

Update Speed: Provides the following submenu selections:

• High - updates the display automatically every 2 seconds.

• Normal - updates the display automatically every 4 seconds.

• Low - updates the display automatically every 8 seconds.

• Paused - the display does not automatically update.

Always On Top: When this option is selected, the main RSM window remains in front of all other windows unless minimized.

Hide When Minimized: When this option is selected, RSM will not appear in the task bar when minimized. An RSM icon will display in the notification area (or system tray).

Tools menu:

Desktop Alert: Enables/disables the desktop alert window.

Remove: Deletes the job or jobs selected in the List view.

Submit a Job: Displays the Submit Job dialog, which allows you to submit jobs manually.

Options: Displays the Manager Options dialog, which allows you to define Managers and specify desktop alert settings.

Help menu:

ANSYS Remote Solve Manager Help: Displays the Help system in the ANSYS Help Viewer.

About ANSYS Remote Solve Manager: Provides information about the program.

3.3. Toolbar
The toolbar provides the following functions:

Show (drop-down menu):

All Jobs: When this menu item is selected from the drop-down, all jobs display in the List view.

Completed: When this menu item is selected from the drop-down, only completed jobs display in the List view. These jobs display with a Status of Finished.

Running: When this menu item is selected from the drop-down, only running jobs display in the List view. These jobs display with a Status of Running.

Queued: When this menu item is selected from the drop-down, only queued jobs display in the List view. These jobs display with a Status of Queued.

Failed: When this menu item is selected from the drop-down, only failed jobs display in the List view. These jobs display with a Status of Failed.

Cancelled: When this menu item is selected from the drop-down, only cancelled jobs display in the List view. These jobs display with a Status of Cancelled.

Remove: This icon allows you to delete the currently selected job or jobs. It functions in the same way as the Remove option of the right-click context menu, the Tools > Remove option in the menu bar, or the Delete key.

All Owner Jobs (selected or deselected): This icon allows you to display or hide jobs that belong to owners other than yourself. The function is the same as using the View > All Owner Jobs option in the menu bar.

Job Log (selected or deselected): This icon allows you to display or hide the Job Log view. The function is the same as using the View > Job Log option in the menu bar.

3.4. Tree View


The Tree view contains a list of Compute Servers, Queues, and Managers. Compute Servers and queues
that appear may be set up on either your local machine (shown as My Computer) or remotely on a
Manager.

The components in the list are summarized below:

• Each Manager node is a separate configuration, defined by the machine designated as the Manager. New
Managers are added via the Options dialog, accessed by Tools > Options on the menu bar.

• The Queues node for a Manager contains all of the queues that have been defined for that Manager. You
can expand a Queue to view the Queue Compute Servers associated with it; these are the Compute
Servers that have been assigned to the Queue (i.e., the machines to which the Manager will send queued
jobs for processing).

• The Compute Servers node contains all of the Compute Servers associated with the Manager; these are
the machines that are available to be assigned to a Queue and to which jobs can be sent for processing.

Note

• If you disable a Manager, Queue, or Compute Server, it will be grayed out on the Tree view.

– For information on disabling Managers, see Options Dialog Box (p. 40).

– For information on disabling Queues, see Creating a Queue (p. 53).

– For information on disabling Compute Servers, see Adding a Compute Server (p. 55).

• If a connection cannot be made with a Manager, the Manager will be preceded by a red “X”
icon.

– For information on testing Managers, see Testing a Compute Server (p. 70).

Tree View Context Menus


When a Manager node is selected, Properties and Accounts options are available in the context menu.
If you haven’t yet cached your password with RSM, the Set Password option is also available.

When a Queues node is selected, only the Add option is available in the context menu.

When a queue is selected, the Properties and Delete options are available in the context menu.


When a Compute Server is selected under a Queues node or under a Compute Servers node, the
Properties and Test Server options are available. The Delete option becomes available if a Compute
Server that is not assigned to any queue is selected under a Compute Servers node, as shown in the
image on the right below.

When a Compute Servers node is selected, only the Add option is available.

For more information on using the Tree view context menu options, see RSM Administration (p. 51).

3.5. List View


You can sort the displayed fields by clicking on the column by which you want to sort. You can delete
one or more jobs that belong to you by selecting the jobs and clicking the Remove button in the
toolbar. Alternatively, you can also select Remove from the context menu, select Remove from the
Tools menu, or press the Delete key. When you delete a job, the job may not be removed from the
List view immediately; it will be removed the next time that the List view is refreshed.

Note

If a job is still running, you cannot remove it. Use either the Abort or the Interrupt option
in the List view context menu. Once the job Status changes to either Finished or Canceled,
you can click the Remove button to delete the job. The Interrupt command allows a job to
clean up the processes it has spawned before termination; the Abort command terminates
the job immediately. The client application that submitted the job may also provide a job-stopping option (for example, the ANSYS Workbench Mechanical Stop Solution command) or a disconnect option (for example, the ANSYS Workbench Mechanical Disconnect Job from RSM command).


The List view context menu provides the following options:

Inquire: Inquire about a running job. This action depends on the type of job being run. Generally, the Inquire command will run some additional job script code to perform some action on a running job. It can also bring back intermediate output and progress files.

Abort: Immediately terminates a running job. Enabled only if a running job is selected. Jobs terminated via this option will have a Status of Canceled in the List view.

Interrupt: Terminates a running job. Enabled only if a running job is selected. Jobs terminated via this option will have a Status of Finished in the List view.

Remove: Deletes the selected job or jobs from the List view. Enabled only if a completed job is selected. Cannot be used on a running job. It functions in the same way as the Tools > Remove option in the menu bar.

Set Priority: Allows you to set the submission priority for the selected job or jobs. When jobs are submitted they have a default priority of Normal. Enabled only for jobs with a Status of Queued. The higher priority jobs in a queue run first. To change the priority of a Queued job, right-click on the job name, select Set Priority, and change the priority. Only RSM administrators can change a job priority to the highest level.

The status of each job displayed in the List view is indicated by the Status column and an icon. For
jobs that have completed, the Status column and an icon indicate the final status of the job; the addition
of an asterisk (*) to the final status icon indicates that the job has been released.

Input Pending: Job is being uploaded to RSM.

Queued: Job has been placed in the Manager queue and is waiting to be run.

Running: Job is running.

Cancelled: Job has been terminated via the Abort option. Also applies to jobs that have been aborted because you exited a project without first performing one of the following actions:

• Saving the project since the update was initiated

• Saving results retrieved since your last save

Finished: Job has completed successfully. Also applies to jobs that have been terminated via the Interrupt option or for which you have saved results prior to exiting the project.

Failed: Job has failed. Also may be applied to jobs that cannot be cancelled due to fatal errors.

3.6. Status Bar


The Status bar indicates the status of the currently running operation by displaying either a Busy icon
or a Ready icon in its bottom left corner.

3.7. Job Log View


The Job Log view provides log messages about the job. The log automatically scrolls to the bottom to keep the most recent messages in view. To stop automatic scrolling, move the vertical slider from its default bottom position to any other location. To resume automatic scrolling, either move the vertical slider back to the bottom or select End from the Job Log view context menu.


The right-click context menu provides the following options:

Copy: Copy selected text in the Job Log view. Alternatively, you can use the Ctrl+C key combination.

Select All: Select all of the text in the Job Log view. Alternatively, you can use the Ctrl+A key combination.

Home: Go to the top of the Job Log view.

End: Go to the bottom of the Job Log view.

Save Job Report...: This option allows you to generate a Job Report for the job item selected from the RSM List view. Enabled when the job has completed (i.e., has a final Status of Finished, Failed, or Cancelled).

The Job Report will include job details and the contents of the job log shown in the Job Log view. When generating the report, you can specify the following report preferences:

• Include Debug Messages: whether debugging messages are included in the Job Report

• Include Log Time Stamp: whether a log time stamp is included in the Job Report

• Include Line Numbering: whether line numbering will be displayed on the Job Report

Click the Browse button to select the directory to which the report will be saved, type in the report filename (RSMJob.html by default), select the report format (HTML or text), and click Save.

Line Numbering: Enable or disable the display of line numbers in the Job Log view. Right-click inside the Job Log view and select or deselect Line Numbering from the context menu.

Time Stamp: Enable or disable the display of the time stamp for each line in the Job Log view. Right-click inside the Job Log view and select or deselect Time Stamp from the context menu.

Debug Messages: Enable or disable the display of debugging information. Right-click inside the Job Log view and select or deselect Debug Messages from the context menu to toggle between standard job log messages and debugging messages.

Note

When making a support call concerning RSM functionality, send the RSM job report. Note that the HTML-format job report uses color highlighting by row to distinguish the Job Log view contents from other information, which can be helpful for troubleshooting.


3.8. Options Dialog Box


From the menu bar, select Tools > Options to open the Options dialog. Use the Options dialog to configure Managers or set up desktop alert settings.

The Options dialog contains the following functions:

• The Managers pane lists available RSM Manager machines.

– To enable or disable a Manager, select or deselect the preceding check box. Disabled Managers will
display as grayed out in the Tree view.

– To add a new Manager, type its name into the Name field and click the Add button.

– To remove a Manager, highlight it in the list and click the Delete button.

– To change the name of a Manager, highlight it in the list, edit the name in the Name field, and click
the Change button.

• The Desktop Alert Settings pane contains check boxes to configure the following desktop alerts:

– Show Running Jobs

– Show Pending Jobs

– Show Completed Jobs

3.9. Desktop Alert


The desktop alert automatically appears when jobs are active. It displays the running, queued, and
completed jobs. The number of queued, running and completed jobs is also displayed in the window
title. If all jobs are finished, the desktop alert disappears automatically.

If you wish to hide the desktop alert window, turn it off via the menu options or the context menu of the RSM icon in the notification area (or system tray). If you close the desktop alert, it will not remain hidden permanently; it will display again as long as jobs are active, unless the alert is turned off.

You can specify what jobs display on the desktop alert via the Options dialog. To access the Options
dialog, select Options from the RSM icon context menu or select Tools > Options from the menu bar.


3.10. Accounts Dialog


The Accounts dialog allows you to add, edit, and delete primary and alternate accounts. You can also
define Compute Servers and change the passwords for primary and alternate accounts. To access the
Accounts dialog, right-click the Manager node in the Tree view and select Accounts from the context
menu.

The Add Primary Account button allows you to define primary accounts for RSM. A primary account must be defined for your current user name before you can add any other new accounts. If a primary account is not defined for your current user name, the User Name field of the Adding Primary Account dialog defaults to your current user name.

When you right-click an existing account, the following context menu options are available:


Add Alternate Account: Create an alternate account for the selected primary account. Available only when a primary account is selected.

Change Password: Change the password for the selected account.

Remove: Deletes the selected account. When a primary account is removed, any associated alternate accounts are also removed.

By default, the primary account can send jobs to all of the Compute Servers. If an alternate account is
defined, the check boxes in the Compute Servers list allow you to specify which Compute Servers will
use the alternate account.

For details on working with accounts, see RSM User Accounts and Passwords (p. 45).

3.11. RSM Notification Icon and Context Menu


When RSM is minimized, it does not display in the task bar, but you can open the interface by double-
clicking on the RSM icon in the notification area (also called the “system tray” for Windows XP or Linux
GNOME).

On a Windows system, the notification area or system tray is accessible from the desktop and the RSM
icon is loaded to the notification area by default. On a Linux system, you may need to enable the noti-
fication area or system tray for the desktop.

To open the RSM interface, double-click the notification icon. The icon changes based on the status of jobs, and the tooltip on the icon displays the current status of the jobs (i.e., the status and how many of those jobs are running, queued, failed, etc.).

A distinct notification icon is displayed for each of the following job statuses:

• No jobs are running.

• At least one job is running.

• At least one job has failed.

• All jobs have completed.

Right-click the RSM icon to access its context menu. The context menu contains most of the options that are available on the RSM menu bar, as shown below:


Options: Displays the Options dialog. Functions in the same way as Tools > Options on the menu bar.

Help: Displays the Help system in another browser window.

About: Provides information about the program.

All Owner Jobs: Displays or hides jobs that belong to other owners. Works in conjunction with the View > All Owner Jobs option in the menu bar and the All Owner Jobs icon in the toolbar.

Desktop Alert: Enables/disables the desktop alert window (see Desktop Alert (p. 40)). Works in conjunction with the Tools > Desktop Alert option in the menu bar.

Open Job Status: Displays the RSM main window.

Exit: Exits the RSM application. Functions in the same way as File > Exit on the menu bar.

Chapter 4: RSM User Accounts and Passwords
The Accounts dialog allows you to configure accounts and passwords for RSM users. The changes you
are able to make depend on your RSM privileges, as follows:

RSM Administrative Privileges If you are a member of the RSM Admins user group, you have ad-
ministrative privileges for RSM. You can use the Accounts dialog to perform the following tasks:

• Create accounts

• Modify any account

• Change the password for any account

• Change the assignment of Compute Servers to any alternate account

Note

To create the RSM Admins user group and add users:

1. Right-click on Computer and select Manage.

2. On the Computer Management dialog, expand Local Users and Groups.

3. Right-click on the Groups folder and select New Group.


4. On the New Group dialog, enter “RSM Admins” as the Group Name and add members
by clicking the Add button.

5. On the Select Users, Computers, Service Accounts, or Groups dialog:

• Type in user names.

• Click the Check Names button to check and select each name.

6. Click the Create button to create the new group.
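
As an alternative to the dialogs above, the same local group can typically be created from an elevated command prompt (a sketch; the domain and user name are placeholders):
net localgroup "RSM Admins" /add
net localgroup "RSM Admins" DOMAIN\johnd /add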

RSM Non-Administrative Privileges If you are not a member of the RSM Admins user group, you
do not have administrative privileges for RSM. You can use the Accounts dialog to perform the following
tasks:

• Add or remove your own primary and alternate accounts

• Change the passwords for your own accounts

• Change the assignment of Compute Servers to your own alternate account

RSM configuration data, including user account and password information, is stored in the RSM.Config
file. For details, see Accessing the RSM Configuration File (p. 28).

The following topics are discussed in this section.


4.1. Adding a Primary Account
4.2. Adding Alternate Accounts
4.3. Working with Account Passwords
4.4. Manually Running the Password Application
4.5. Configuring Linux Accounts When Using SSH

4.1. Adding a Primary Account


A “primary account” is the account that communicates with RSM, typically the account used with the
client application (ANSYS Workbench) on the RSM Client machine. You must define a primary account
for your current user name in order to add any other new accounts or edit existing ones.

To add a primary account:

1. Right-click the Manager node of the tree view and select Accounts from the context menu.

2. On the Accounts dialog, click the Add Primary Account button.

3. Specify account details in the Adding Primary Account dialog.

• Enter a user name for the account. If a primary account has not yet been defined for the user
name under which you’ve logged in, by default the User Name field will be populated with your
current user name.

• Cache the password with RSM by entering and verifying an account password. See Working with
Account Passwords (p. 48) for details.

• Click OK.


4.2. Adding Alternate Accounts


For each primary account, you can create and associate one or more alternate accounts. An alternate
account allows you to send jobs from the primary account on the RSM Client machine to be run on a
specific remote Compute Server under the alternate account. A primary account with one or more as-
sociated alternate accounts is called an “owner account.”

An alternate account is necessary if the remote Compute Server machine does not recognize the primary
account used on the RSM Client machine. For example, an RSM Client running on Windows with the
account name DOMAIN\johnd would need an alternate account to run jobs on a Linux machine acting
as a Compute Server, because the Linux machine would be unable to recognize the RSM Client account
name.

To add an alternate account:

1. In the Accounts list of the Accounts dialog, right-click the primary account and select Add Alternate
Account from the context menu.

2. Specify alternate account details in the Adding Alternate Account dialog.

• Enter a user name for the account.

• Cache the password with RSM by entering and verifying an account password. See Working with
Account Passwords (p. 48) for details.

• Click OK.

3. Specify the Compute Servers to which the new alternate account will have access.

• In the Alternates list, select the newly created alternate account.

• In the Compute Servers list, select the check box for each Compute Server to which the account
will send jobs. Each alternate account can have access to a different combination of Compute
Servers, but each Compute Server can only be associated with one alternate account at a time.
You will receive an error if you attempt to assign more than one alternate account to a single
Compute Server.

In the Accounts dialog, select an owner account to view the alternate accounts associated with it. Select
an alternate account to view the Compute Servers to which it can send jobs.

It is also possible to add an alternate account by running the RSM password application manually, rather
than via the Accounts dialog. For details, see Manually Running the Password Application (p. 49).


4.3. Working with Account Passwords


If you will be sending jobs from the RSM Client machine to a remote Manager machine, you must cache
the account password with that Manager. By caching the password, you enable RSM to run jobs on a
Compute Server on behalf of that account.

When you first configure RSM, if the RSM Client and the Manager are running on different machines, a
Set Password reminder will be displayed on the tree view Manager node. It will also be displayed if
the owner account is removed. This reminder indicates that you need to cache the password for the
owner account.

Caching the Account Password


To cache an account password:

1. In the RSM tree view, right-click on My Computer [Set Password] and select Set Password.

2. On the Password Setting dialog, the User Name will be auto-populated with the Domain\user-
name of the account under which you’re currently logged in. Enter and verify the account password.

3. Click OK.

4. If the [Set Password] reminder is still displayed in the tree view, exit the RSM main window and
relaunch it to refresh the indicator to the correct state.

Note

• It is not necessary to cache your password with the Manager if you are using RSM only for
local background jobs.

• When you create a new account via the Accounts dialog and define the password for it, the
password is cached with RSM. It is encrypted and stored by the Manager, which maintains a
list of registered accounts.

• For security reasons, RSM will not allow any job to be run by the "root" user on Linux, including
primary and alternate accounts. You should not need to cache the "root" account password in the
RSM Manager.

Changing an Account Password


To change an account password:


1. In the RSM tree view, right-click on My Computer and select Accounts.

2. In the Accounts pane of the Accounts dialog, right-click the account and select Change Password.

3. On the Changing Account Password dialog, the User Name will be auto-populated with the Do-
main\username of the selected account. Enter and verify the password.

4. Click OK.

5. If the [Set Password] reminder is still displayed in the tree view, exit the RSM main window and
relaunch it to refresh the indicator to the correct state.

Recaching the Account Password


Whenever a password is changed, you must recache the password by changing it in RSM. To do this,
follow the same sequence of steps used for changing an account password.

When you change the password, the Set Password reminder may be redisplayed on the tree view Manager node. Exit the RSM main window and relaunch it to refresh the indicator to the correct state.

It is also possible to cache a password by running the RSM password application manually, rather than
from the Accounts dialog. For details, see Manually Running the Password Application (p. 49).

4.4. Manually Running the Password Application


It is usually unnecessary to manually run the password caching application; however, you may find it
useful in certain circumstances. For example, it may be necessary to manually run the password applic-
ation on a Linux machine if the terminal used to start the RSM user interface is not available.

Rather than stopping and restarting the RSM interface, you can run the Ans.Rsm.Password.exe password application directly; it is located in the [RSMInstall]\bin directory. The instructions in this section are provided in case a general solution is needed.

Windows
You can run the password application directly by locating Ans.Rsm.Password.exe in the [RSMIn-
stall]\bin directory and double-clicking it.

Linux
You can open the password application by running the rsmpassword shell script, located in the
[RSMInstall]\Config\tools\linux directory.

If you run the script with no command options, it displays available options as below:
Usage: Ans.Rsm.Password.exe [-m manager][-a account][-o owner][-p password]
-m manager: RSM Manager machine (default = localhost).
-a account: Target account. If no -o owner, this is a primary account.
-o owner: Account owner. Setting password for an alternate account
specified with -a.
-p password: Account password.
-? or -h: Show usage.


NOTES: If no -a or -p, this is normal interactive mode.


Accounts can be entered as username or DOMAIN\username.

Note

The rsmpassword shell script depends on its relative location in the Workbench install-
ation; it should not be moved.

Alternate accounts are typically added to the owner account via the Accounts dialog, but can also be
manually added and edited by running the password application. In the example below, DOMAIN\johnd
is the owner account and johndoe is an alternate account to be used on a Compute Server specified
in the Accounts dialog.
Setting password for primary (default), alternate or new alternate account.
Existing alternate accounts:
johndoe

Enter user name (DOMAIN\johnd):johndoe


Enter password for DOMAIN\johnd: ********
Re-enter password: ********

Password set for johndoe:


Your password has been encrypted and stored.
It can only be decrypted and used to run jobs on behalf of DOMAIN\johnd.
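
The same password caching can be performed non-interactively by supplying the command options shown in the usage listing above. The following sketch assumes a Manager machine named rsm-manager1 and an illustrative password value; both are hypothetical, and on Linux the same options can be passed to the rsmpassword shell script:

Ans.Rsm.Password.exe -m rsm-manager1 -a johndoe -o DOMAIN\johnd -p MyPassw0rd

Because a password supplied with -p appears on the command line (and so may be visible in the shell history or process list), the interactive mode shown above is generally preferable unless you are scripting the setup.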

4.5. Configuring Linux Accounts When Using SSH


If the Windows and Linux account names are the same (for example, DOMAIN\johnd on Windows and
johnd on Linux) then no additional configuration is required. If the account name is different, specify
the account in the Linux Account property on the SSH tab of the Compute Server Properties dialog.

Client applications may also have a mechanism to specify an alternate account name. For example, you
can specify a Linux account in the ANSYS Workbench Solve Process Settings Advanced dialog box.
Remember that SSH must be configured for password-less access (see Appendix B). RSM does not store
Linux passwords for use with SSH.

Chapter 5: RSM Administration
Users with RSM administrator privileges can perform a variety of additional tasks. For instance, RSM
administrators can create and modify Managers and Compute Servers, manage queues, set jobs to
highest priority, and delete the jobs of any user.

RSM administrators must fulfill one of the following requirements:

Windows:

• The RSM administrator is a Windows administrator on the Manager machine (i.e., they are in the local
or domain administrators group).

• The RSM administrator has been added as a member of the RSM Admins group on the Manager
machine.

Linux:

• The RSM administrator is a “root” user.

• The RSM administrator has been added as a member of the rsmadmins group on the Manager machine.

Note

In both of the above cases, the RSM services ANSYS JobManager Service V15.0
and ANSYS ScriptHost Service V15.0 may need to be restarted in order for ad-
ministrator privileges to take effect.

RSM configuration data, including the configurations for the Manager, Compute Servers, and queues,
is stored in the RSM.Config file. For details, see Accessing the RSM Configuration File (p. 28).

The following RSM administration tasks are discussed in this section:


5.1. Automating Administrative Tasks with the RSM Setup Wizard
5.2. Working with RSM Administration Scripts
5.3. Creating a Queue
5.4. Modifying Manager Properties
5.5. Adding a Compute Server
5.6. Testing a Compute Server

5.1. Automating Administrative Tasks with the RSM Setup Wizard


The ANSYS Remote Solve Manager Setup Wizard is a utility designed to guide you through the process
of setting up and configuring Remote Solve Manager. The setup tasks addressed by the wizard include
adding and managing Managers, Compute Servers, queues, and accounts. It also allows you to test the
final configuration.

For information on using the wizard, see Using the ANSYS Remote Solve Manager Setup Wizard (p. 8).


5.2. Working with RSM Administration Scripts


Sometimes it is more convenient to work with RSM manually, rather than via the user interface. In ad-
dition to allowing you to manually run the password application, RSM provides you with a way to
manually open the main RSM window and start the RSM Utility application.

Opening the Main RSM Window Manually


For Windows, you can open the main RSM administration window directly by locating Ans.Rsm.Admin.exe in the [RSMInstall]\bin directory and double-clicking it.

For Linux, you can open the main RSM administration window by running the rsmadmin shell
script, located in the [RSMInstall]\Config\tools\linux directory.

Starting the RSM Utility Application Manually


For Windows, you can start the RSM Utilities application by opening a command prompt in the
[RSMInstall]\bin directory and running rsm.exe. The example below shows available command
line options.

The -s configFile command option can be used to create a backup file containing configuration
information for each of the queues and Compute Servers you have defined. For example, if you
need to rebuild a machine, you can run this command beforehand. The backup file, configFile,
is created in the [RSMInstall]\bin directory and can be saved as a .txt file. Once the machine
is rebuilt, you can use the saved configuration file to reload all of the previously defined queues
and Compute Servers, rather than having to recreate them.

The -migrate vVer command option allows you to migrate the existing RSM database into the
newer release without having to set up your RSM queues and Compute Servers again.

Note

• In order to use the -migrate vVer command, you must first start the RSM Manager
service or process.

• The migration can also be achieved by running the RSM Setup wizard to set up RSM
as a SYSTEM user and then running the rsm.exe –migrate vVer command via
the command prompt.

C:\Program Files\ANSYS Inc\v150\RSM\bin>rsm.exe


Usage: rsm.exe [-m manager|machine][-clr]
[-c configFile|-s configFile|-migrate vVer]
[-stop mgr|svr|xmlrpc|all [-cancel]][-status mgr|svr]
-m manager: RSM Manager machine (default = localhost).
-c configFile: File containing Queues and Servers.
-s configFile: File to save Queues and Servers.
-clr: Clear Queues and Servers. If used with -c, clears before configure.
-stop mgr|svr|xmlrpc|all: Stop RSM services, where:
mgr = Manager, svr = ScriptHost, xmlrpc = XmlRpc Server, all = All three.
-cancel Cancel all active Jobs. For use with -stop.
-status mgr|svr: Query Manager and ScriptHost on localhost or use -m option.
-migrate vVer: Migrate database from previous version (ex. v145). Can be used with -clr.
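
As an illustration of these options, the commands below save the current queue and Compute Server definitions to a backup file, clear and then restore them, migrate a database from a previous release, and query the Manager status. The backup file name rsm_backup.txt is only an example:

rsm.exe -s rsm_backup.txt        Save the defined Queues and Servers to rsm_backup.txt.
rsm.exe -clr -c rsm_backup.txt   Clear the existing Queues and Servers, then reload them from rsm_backup.txt.
rsm.exe -migrate v145            Migrate the RSM database from release 14.5.
rsm.exe -status mgr              Query the status of the Manager on localhost.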


For Linux, you can start the RSM Utilities application by running the rsmutils shell script, located
in the [RSMInstall]\Config\tools\linux directory. The rsmutils shell script accepts the same
command options noted above for Windows.

Note

The Linux shell scripts are dependent on their relative location in the ANSYS Workbench in-
stallation, so cannot be moved.

5.3. Creating a Queue


A queue is a list of Compute Servers available to run jobs. To create a queue:

1. In the tree view, right-click on the Queues node for a desired Manager.

2. Select Add. The Queue Properties dialog displays:

3. Configure the Queue Properties described below, then select OK.

The table below lists the fields on the Queue Properties dialog:

Name
This field should contain a descriptive name for the queue. Examples of queue names include “Local Queue”, “Linux Servers”, or “HPC Cluster”. If the Compute Server(s) in the queue have a Start/End Time specified, you may want to use a name that indicates this to users (e.g., "Night Time Only").

Enabled
If True, the Manager dispatches queued jobs to available Compute Servers. If False, jobs remain in a Queued state until the queue is enabled.

Priority
Possible values are Low, Below Normal, Normal, Above Normal, or High. When determining the next job to run, the Manager pulls jobs from the highest priority queue first. Priority settings are commonly used to create a separate, higher priority queue for smaller jobs, so that they are processed before large jobs that tie up the computing resource for a long period of time.

Assigned Servers
Select the check box for each Compute Server to be used in this queue. A queue can contain more than one Compute Server, and a Compute Server can also be a member of more than one queue.

5.4. Modifying Manager Properties


To modify Manager properties:

1. In the tree view, right-click on a Manager node.

2. Select Properties. The Solve Manager Properties dialog appears:

3. Modify Manager properties described below, and then select OK.

The table below lists the editable fields on the Solve Manager Properties dialog:

Job Cleanup Period
The length of time (in D.H:MM:SS format) that a job stays in the List view after it is released. Default value is 02:00:00 (2 hours).

Acceptable values are as follows:

• D (days) = integer indicating the number of days

• H (hours) = 0–23

• MM (minutes) = 0–59

• SS (seconds) = 0–59

You can enter only the number of days (without the zeros), only the hours/minutes/seconds, or both.

Examples:

• 1.00:00:00 or 1 = one day

• 1.12:00:00 = 1.5 days

• 02:30:00 = 2.5 hours

• 00:15:00 = 15 minutes

Project Directory
The base location where the Manager stores input and output files for a job. As jobs are created, a unique subdirectory for each job is created in the Project Directory.

When defining the location of your Project Directory, you must enter an absolute path in this field; relative paths are not accepted. If you enter a path to a base location that does not exist on the Manager machine, RSM will automatically create the path on the machine. This location can be either on the local disk of the Manager machine or on a network share (for example, \\fileserver\RSMProjects).

Compression Threshold
The threshold (in MB) at which files are compressed prior to transfer. There is always a trade-off between the time it takes to compress/decompress versus the time to transfer; the appropriate value depends on the specific network environment. Enter a value of 0 to disable compression.

Example: If you set this value to 50, files greater than 50 MB will be compressed before being sent over the network.

5.5. Adding a Compute Server


A Compute Server is the machine on which the RSM Compute Server process runs. It executes the jobs
that are submitted by the RSM Client and distributed by the Manager. You can add and configure a
new Compute Server via the Compute Server Properties dialog. Once a Compute Server has been
added, you can also use this dialog to edit its properties.

To add a new Compute Server:

1. In the Manager tree view, right-click on the Compute Servers node under the machine you are desig-
nating as the Manager.

2. Select Add from the context menu. The Compute Server Properties dialog displays.


The dialog includes three tabs: General, Cluster, and SSH.

• The General tab contains information that is used for all RSM configurations.

• The Cluster tab contains information that is necessary only if you are using a cluster computing configur-
ation.

• The SSH tab contains information that is necessary only if you are using SSH functionality to connect the
Compute Server Windows machine with a remote Linux machine.

To configure the new Compute Server, go to each tab relevant to your RSM configuration and enter or
select property values. When finished, click the OK button. Each tab is described in detail in the following
sections.

Editing Compute Server Properties


To edit the properties of an existing Compute Server:

1. In the Manager tree view, right-click on the Compute Server name, either under the Compute Servers
node or under a queue name in the Queues node.

2. Select Properties from the context menu.

3. On the Compute Server Properties dialog, edit the properties.


4. When finished, click the OK button.

Note

If you do not have permissions to a Compute Server machine (i.e., you have not set your
account password in RSM for the Manager node), you cannot add the machine as a Compute
Server or edit its properties. For instructions on setting your password, see Working with
Account Passwords (p. 48).

5.5.1. Compute Server Properties Dialog: General Tab


The General tab of the Compute Server Properties dialog allows you to define general information
that is necessary for all RSM configurations.

By default, only the first three fields display. Click the More>> button to expand the rest of the fields
and the <<Less button to collapse them.

The Quick Reference table below lists the properties that you can configure on the General tab. Click
on the property name for more detailed configuration instructions.

• Display Name: Descriptive name for the Compute Server machine. Required field.

• Machine Name: Name (the hostname or IP address) of the Compute Server machine. Required field.

• Working Directory Location: Determines whether the location of the Working Directory is user-defined or auto-defined by the system.

• Working Directory: Directory on the Compute Server machine where jobs are run. Enabled when User Specified is selected for Working Directory Location. Required when enabled.

• Server Can Accept Jobs: Determines whether the Compute Server can accept jobs.

• Maximum Number of Jobs: The number of jobs that can be run at the same time on the Compute Server.

• Limit Times for Job Submission: Determines whether there will be limitations on the times when jobs can be submitted to the Compute Server.

• Start Time: Time at which the Compute Server becomes available to accept job submissions.

• End Time: Time at which the Compute Server is no longer available to accept job submissions.

• Save Job Logs to File: Determines whether job logs are kept in a log file or discarded upon job completion.

• Delete Job Files in Working Directory: Determines whether the temporary job subdirectories created in the Compute Server Working Directory are deleted upon completion of the associated job.

• Use SSH protocol for inter- and intra-node communication (Linux only): Specifies that RSM and the solvers use SSH instead of RSH for inter-node and intra-node communications on Linux machines.

Display Name
The Display Name property requires that you enter a descriptive name for the Compute Server machine.
It is an easy-to-remember alternative to the Machine Name, and is the name that is displayed in the
Manager tree view.

The Display Name defaults first to New Server and thereafter to New Server n to guarantee
uniqueness. Examples of default display names include New Server, New Server 1, and
New Server 2.

Examples of display names you might select are Bob’s Computer and My Computer to
Linux.

Machine Name
The Machine Name is the name (the hostname or IP address) of the Compute Server machine. Both RSM
and the application being used (for example, ANSYS Mechanical or Fluent) must be installed on this
machine. This is a required field.


• If the Compute Server is the same physical machine as the Manager, enter localhost.

• For a remote machine, enter the network machine name.

Examples of machine names for remote machines are comp1, comp1.win.domain.com, and
100.10.87.465.

Working Directory Location


The Working Directory Location property allows you to specify the location of the Working Directory,
where the Compute Server will read and write job scripts and solution files.

Available options are Automatically Determined and User Specified.

• Automatically Determined: This option is selected by default. Leave selected if you want the Working
Directory location to be determined by the system. When this option is selected, the Compute Server
will try to re-use files from the Manager Project Directory. If it cannot find the Project Directory, it will
copy files to a temporary subdirectory in the Compute Server TEMP directory.

• User Specified: Select if you want to manually specify the location of the Working Directory on the
Compute Server machine. When this option is selected, you must enter the directory path in the
Working Directory field. The Compute Server will copy all files to a temporary subdirectory within
Working Directory specified.

Note

With a native cluster configuration (i.e., you are not using SSH), your file management
settings may cause this property to be restricted to Automatically Determined. See the
descriptions of the Working Directory, Shared Cluster Directory, and File Management
properties.

Working Directory
The Working Directory property becomes enabled when you select User Specified for the Working
Directory Location property. When this property is enabled, it requires that a path is entered for the
Working Directory on the Compute Server machine. This can be done in one of the following ways:

• You can enter the path here.

• Alternatively, if you will be using a native cluster configuration (i.e., will not be using SSH) and opt to
use the Shared Cluster Directory to store temporary solver files, the Working Directory property will
be auto-populated with whatever path you enter for the Shared Cluster Directory property. See the
descriptions of the Shared Cluster Directory and File Management properties.

When the Compute Server and Manager are two different machines, for each job that runs, a tem-
porary subdirectory is created in the Compute Server Working Directory. This subdirectory is where
job-specific scripts, input files, and output files are stored. When the job completes, output files are
then immediately transferred back into the Project Directory on the Manager machine.

Requirements:

• The Working Directory must be located on the Compute Server machine (the machine specified
in the Machine Name field).

• All RSM users must have write access and full permissions to this directory.


• If you will be using a cluster configuration, the directory must be shared and writable to all of the
nodes in the cluster.

• Note that in some cluster configurations, the Working Directory may also need to exist on each cluster
node and/or may share the same physical space as the Shared Cluster Directory. Examples of Working
Directory paths are D:\RSMTemp and C:\RSMWorkDir.

Note

• In a configuration where the Compute Server and Manager are the same machine (i.e., the
job is queued from and executed on the same machine), the job execution files are stored
directly in the Project Directory on the Manager, rather than in the Working Directory on
the Compute Server.

• In a native cluster configuration (i.e., you are not using SSH), when you specify that you
want to use the Shared Cluster Directory to store temporary solver files, essentially you are
indicating that the Working Directory and the Shared Cluster Directory are the same location;
as such, the Working Directory property is populated with the path entered for the Shared
Cluster Directory property in the Cluster tab. See the descriptions of the Shared Cluster
Directory and File Management properties.

Server Can Accept Jobs


The Server Can Accept Jobs property determines whether the Compute Server can accept jobs. Selected
by default.

• Leave selected to indicate that the Compute Server can accept jobs.

• Deselect to prevent jobs from being run on this Compute Server. Primarily used when the server is
offline for maintenance.

Note

The Server Can Accept Jobs property can also be set on the client side (i.e., on the RSM
Client machine via the Workbench Update Design Point Process properties). This can
be done both in scenarios where the Manager runs locally on the same machine as the
RSM Client, and in scenarios where the Manager is run remotely on a different machine.
In either case, the Server Can Accept Jobs value set on the server side (i.e., on the remote
Compute Server machine) takes precedence.

Maximum Number of Jobs


The Maximum Number of Jobs property allows you to specify the maximum number of jobs that can
be run on the Compute Server at the same time. When this number is reached, the server is marked as
Busy.

The purpose of the Maximum Number of Jobs property is to prevent job collisions, which can
occur because RSM cannot detect the number of cores on a machine. The ability to determine a
maximum number of jobs is particularly useful when the job is simply forwarding the work to a
third-party job scheduler (for example, to an LSF or PBS cluster). Default value is 1.


In a cluster configuration, this property refers to the maximum number of jobs at the server level,
not at the node/CPU level.

Note

• The Maximum Number of Jobs property can also be set on the client side (i.e., on the
RSM Client machine via the Workbench Update Design Point Process properties). This
can be done both in scenarios where the Manager runs locally on the same machine as
the RSM Client, and in scenarios where the Manager is run remotely on a different machine.
In either case, the Maximum Number of Jobs value set on the server side (i.e., on the remote
Compute Server machine) takes precedence.

• When multiple versions of RSM are being run at the same time (for example, 13.0 and 14.0),
this property applies only to the current instance of RSM. One version of RSM cannot detect
the jobs being assigned to a Compute Server by other versions.

Limit Times for Job Submission


The Limit Times for Job Submissions property allows you to specify whether there will be limitations
on the times when jobs can be submitted to the Compute Server. Deselected by default.

• Leave the check box deselected for 24-hour availability of the Compute Server.

• Select the check box to specify submission time limitations in the Start Time and End Time fields.
This option is primarily used to limit jobs to low-load times or according to business workflow.

Start Time / End Time


The Start Time and End Time properties become enabled when the Limit Times for Job Submissions
check box is selected. They allow you to specify a time range during which the Compute Server is
available to accept submitted jobs. The Start Time property determines when the Compute Server be-
comes available and the End Time property determines when it becomes unavailable to accept job
submissions.

A job cannot be run on a Compute Server if it is submitted outside of the Compute Server’s range
of availability. The job may still be queued to that Compute Server later, however, when the Compute
Server again becomes available to accept it. Also, if there are multiple Compute Servers assigned
to a queue, a queued job may still be submitted to another Compute Server that is available to
accept submissions.

It can be useful to define an availability range when Compute Servers or application licenses are
only available at certain times of the day.

You can either enter the time (in 24-hour HH:MM:SS format) or select a previously entered time from
the drop-down list.

Note

Do not indicate 24-hour availability by entering identical values in the Start Time and
End Time fields; doing so will cause an error. Instead, indicate unlimited availability by
deselecting the Limit Times for Job Submissions check box.


Save Job Logs to File


The Save Job Logs to File property allows you to specify whether job logs should be saved as files.
Deselected by default.

• Leave the check box deselected to specify that no job log files will be saved.

• Select the check box to save job log files.

When a job runs, a log file named RSM_<ServerDisplayName>.log is saved to the TEMP directory
on the Manager machine unless the TMP environment variable has been defined; in that case,
job log files are saved to the location defined in the TMP environment variable.

Note

To access the default TMP directory, go to %TMP% in Windows Explorer on Windows
or to /tmp on Linux.

Selecting this option could potentially increase disk usage on the Manager.

Job log files are primarily used for troubleshooting. The log file for a job contains the same inform-
ation displayed on the Job Log view when the job is selected in the List view of the main RSM ap-
plication window.

Note

When this option is enabled, the log file on disk is updated and saved only once, when
the job finishes running. While the job is running, you can view the same live log in
the RSM user interface.

Delete Job Files in Working Directory


The Delete Job Files in Working Directory property determines whether the temporary job subdirect-
ories created in the Compute Server Working Directory are deleted upon completion of the associated
job. Selected by default.

• Leave the check box selected to delete temporary job subdirectories and their contents upon comple-
tion of the associated job.

• Deselect the check box to save temporary job subdirectories and their contents after the completion
of the associated job.

The job files in the Working Directory are primarily used for troubleshooting. When a submitted job
fails, saved job-specific scripts and files can be helpful for testing and debugging. You can find these
files by looking at the RSM log (either in the Job Log view of the main application window or in
the job log file saved to the Working Directory on the Manager machine) and finding the line that
specifies the “Compute Server Working Directory.”

Note

This option does not control whether job-specific files in the Project Directory on the
Manager machine are deleted. When the Compute Server and Manager are the same
machine and job-specific files are stored in the Manager Project Directory instead of the


Compute Server Working Directory, job-specific files will not be deleted from the Project
Directory until the job is released.

When a job is stopped abruptly rather than released (for instance, via the Abort option
in the right-click context menu of the List view) or is not released immediately, you may
need to take additional steps to ensure that its files are deleted from the Project Directory
on the Manager. You can ensure that job-specific files are deleted by one of the following
two methods:

• Remove the job from the List view. You can do this by right-clicking the job and selecting
Remove from the context menu. Alternatively, you can highlight the job and either select
the Tools > Remove option or press the Delete key.

• Configure the system to remove the files automatically by setting the Job Cleanup Property
on the Solve Manager Properties dialog.

Use SSH protocol for inter- and intra-node communication (Linux only)
The Use SSH protocol for inter- and intra-node communication (Linux only) property determines
whether RSM and solvers use RSH or SSH for inter-node and intra-node communications on Linux ma-
chines. Deselected by default.

• Leave the check box deselected to use RSH.

• Select the check box to use SSH.

This setting will be applied to all Linux Compute Servers, not only those in clusters, allowing for
solvers to run in distributed parallel mode on a single machine.

Note

When ANSYS Fluent, ANSYS CFX, ANSYS Mechanical, and ANSYS Mechanical APDL are
configured to send solves to RSM, their solvers will use the same RSH/SSH settings as
RSM.

Related Topics:

Adding a Compute Server (p. 55)

Compute Server Properties Dialog: Cluster Tab (p. 63)

Compute Server Properties Dialog: SSH Tab (p. 67)

5.5.2. Compute Server Properties Dialog: Cluster Tab


The Cluster tab of the Compute Server Properties dialog allows you to define information necessary
for a cluster computing configuration.

By default, only the first three fields display. Click the More>> button to expand the rest of the fields
and the <<Less button to collapse them.


The Quick Reference table below lists the properties that you can configure on the Cluster tab. Click
on the property name for more detailed configuration instructions.

• Cluster Type: Type of job scheduling system used to manage the cluster.

• Shared Cluster Directory: Location of the Shared Cluster Directory, which primarily serves as a central file-staging location but can also be used to store the temporary working files created by the application solver. Enabled only when a value other than None is selected for Cluster Type. Required when enabled.

• File Management: Location of the temporary directory for the temporary working files created by the application solver. Enabled only when a value other than None is selected for Cluster Type. Required when enabled.

• Parallel Environment (PE) Names: Names of the Shared Memory Parallel and Distributed Parallel environments. The environment(s) must have already been created by your cluster administrator. Enabled only when SGE is selected for Cluster Type. Required when enabled.

• Job Submission Arguments: Arguments that will be added to the job submission command line of your third-party job scheduler. Enabled only when a value other than None is selected for Cluster Type. Optional.

Cluster Type
The Cluster Type property allows you to select the type of job scheduling system that will be used to
manage the cluster.

Select one of the following options:

• None: Selected by default. Leave selected if you won’t be using a job scheduler for cluster management.
When this option is selected, the rest of the properties on the tab are disabled.

• Windows HPC: Select to use MS HPC. When this option is selected, the SSH tab is disabled because
SSH is applicable only to Linux cluster configurations.

• LSF: Select to use LSF.

• PBS: Select to use PBS.

• SGE: Select to use SGE or UGE.

• Custom: Select to customize RSM integration. See Customizing ANSYS Remote Solve Manager (p. 73).

Shared Cluster Directory


The Shared Cluster Directory property allows you to enter the path for the Shared Cluster Directory.

Requirements:

• This directory must be shared and writable to the entire cluster.

• Note that in some cluster configurations, the Shared Cluster Directory may also need to exist on each
cluster node and/or may share the same physical location as the Working Directory.

Examples of Shared Cluster Directory paths are \\<MachineName>\Temp, /user/temp, and
/staging. With a Shared Cluster Directory path of /staging, RSM might create a temporary job
subdirectory of /staging/dh3h543j.djn.

The primary purpose of the Shared Cluster Directory is to serve as a central file-staging location for
a cluster configuration. When the Manager queues a job for execution, a temporary job subdirectory
is created for the job inside the Shared Cluster Directory. All job files are transferred to this subdir-
ectory so they can be accessed by all of the execution nodes when needed.

A secondary (and optional) purpose for this directory is to store the temporary working files created
by each application solver as a solution progresses. Depending on your configuration, the Shared
Cluster Directory may share the same location as either the Working Directory or the Linux Working
Directory. Implementation of this purpose is controlled by the File Management property on the
Cluster tab.

File Management
As part of the solution process, each application solver produces temporary working files that are stored
in a temporary directory. The File Management property allows you to specify where the temporary
directory will be created, which in turn impacts solver and file transfer performance.


• Select Reuse Shared Cluster Directory to store solver files in the central file-staging directory along
with all the other RSM job files. This option is recommended when one or more of the following is
true:

– You are using a native cluster setup (i.e., you are not using SSH).

– You have a fast network connection between the execution nodes and the Shared Cluster Directory.

– You are using a solver that produces fewer, relatively small files as part of the solution and does
not make heavy use of local scratch space (for example, the CFX or the Fluent solver).

Note

– When you select this option, you specify that the Shared Cluster Directory is being used
  for its secondary purpose: to store temporary working files created by the solver. Depending
  on your configuration, this indicates that the Shared Cluster Directory shares the same
  location with either the Working Directory or the Linux Working Directory. When the
  Reuse Shared Cluster Directory option is selected:

  – If you are using a native cluster (i.e., have selected a Cluster Type and are not using
    SSH), the Working Directory property on the General tab is populated with the Shared
    Cluster Directory path.

  – If you are using SSH for Linux cluster communications, the Linux Working Directory
    property on the SSH tab is populated with the Shared Cluster Directory path.

– If you will be sending CFX jobs to a Microsoft HPC Compute Server, the Reuse Shared
  Cluster Directory option will always be used, regardless of the File Management
  property setting.

– Be careful when using slower NAS storage and running many concurrent jobs. For each
  specific disk setup there is a specific upper limit on the number of jobs that can run
  concurrently without affecting each other. Refer to your disk supplier's analysis tools
  for verification.

• Select Use Execution Node Local Disk to store solver files in a local directory on the Compute Server
machine (also called “using scratch space”). This option is recommended to optimize performance
when one or both of the following is true:

– You have a slower network connection between the execution nodes and the Shared Cluster Direct-
ory.

– You are using a solver that produces numerous, relatively large files as part of the solution and
makes heavy use of local scratch space (for example, Mechanical solvers).

Note

When you select this value, for performance reasons it is recommended that you set
the Working Directory to a path with a fast I/O rate. In this case, on the General
tab you should set Working Directory Location to User Specified and then set
Working Directory to the path with the fastest I/O rate on each compute node of the
cluster.

• Select Custom Handling of Shared Cluster Directory to specify that file management for the cluster
will be customized, so RSM will not copy or stage files to a Shared Cluster Directory. This option is
used when the cluster’s file staging area is not visible to the RSM Client machine via a network share,
mapped drive, Samba share, or mounted directory.

For more information on solver working files, see Product File Management in the Workbench User's
Guide.

Parallel Environment (PE) Names


Names of Shared Memory Parallel and Distributed Parallel environments. The environment(s) must
have already been created by your cluster administrator.

Defaults to pe_smp and pe_mpi. To use one of the default names, your cluster administrator must
create a parallel environment with the same name. The default PE names can also be edited to
match the names of your existing parallel environments.

Enabled when SGE is selected for Cluster Type. Required when enabled.

Job Submission Arguments


The Job Submission Arguments property allows you to enter arguments that will be added to the job
submission command line of your third-party job scheduler. For example, you can enter job submission
arguments to specify the queue (LSF, PBS, SGE) or the nodegroup (MS HPC) name. For valid entries, see
the documentation for your job scheduler.
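
For instance, on an LSF cluster an entry such as -q night would direct RSM jobs to an LSF queue named night, and on a PBS cluster -q workq would target a PBS queue named workq. The queue names here are hypothetical; the exact argument syntax is defined by your scheduler.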

Related Topics:

Adding a Compute Server (p. 55)

Compute Server Properties Dialog: General Tab (p. 57)

Compute Server Properties Dialog: SSH Tab (p. 67)

5.5.3. Compute Server Properties Dialog: SSH Tab


The SSH tab of the Compute Server Properties dialog allows you to indicate if you intend to use SSH
(secure shell) communications protocol to establish a connection between the Compute Server machine
(specified in the Machine Name property of the General tab) and a remote Linux machine (specified
in the Linux Machine property on this tab). If you are using SSH, you can specify details about the remote
Linux machine that will serve as a proxy for the Compute Server.

Note

Since the SSH protocol is applicable only to cross-platform communications (i.e., Windows-
Linux), this tab is disabled if the Windows HPC option is selected in the Cluster Type drop-
down of the Cluster tab.


Note

SSH is not a recommended communication protocol and should be used only if it is required
by your IT policy. For ease of configuration and enhanced performance, RSM native mode
is the recommended communication protocol. For more information on using native mode
for cross-platform communications, see Configuring RSM to Use a Remote Computing Mode
for Linux (p. 12).

The Quick Reference table below lists the properties that you can configure on the SSH tab. Click on
the property name for more detailed configuration instructions.

• Use SSH: Determines whether the SSH communications protocol will be used.

• Linux Machine: Name (the hostname or IP address) of the remote Linux machine. Displayed when the Use SSH check box is selected. Required when displayed.

• Linux Working Directory: Location on the remote Linux machine where the temporary working files created by the application solver will be saved. Displayed when the Use SSH check box is selected. Required when displayed.

• Linux Account: Name of the account being used to log in to the remote Linux machine. Displayed when the Use SSH check box is selected. Required when displayed.

Use SSH
The Use SSH check box allows you to specify whether you intend to use the SSH communications protocol.
The SSH protocol may be used for communications either to a remote Linux machine or to the head
node of a Linux cluster. Deselected by default.

Leave the check box deselected to indicate that you will not be using SSH. When this check box is
deselected, the rest of the fields on the tab do not display.

Select the check box to allow the Compute Server (specified in the Machine Name field of the
General tab) to submit jobs via SSH to a remote Linux machine (specified in the Linux Machine
field of this tab) for execution.

Linux Machine
Displays when the Use SSH check box is selected.

Enter the name (the hostname or IP address) of the remote Linux machine.

This value is accessed by the Task.ProxyMachine property in the job script. Custom job scripts
can use this value for any purpose.

Examples of Linux Machine names are linuxmachine01, lin05.win.abc.com, and
100.10.67.432.

Required field if the Use SSH check box is selected.

Linux Working Directory


The Linux Working Directory property displays when the Use SSH check box is selected. When this
property displays, it requires that a path is entered for the Working Directory on the Linux machine (the
machine specified in the Linux Machine property). This can be done in one of the following ways:

• If the File Management property on the Cluster tab is set to Use Execution Node Local Disk, set
the Linux Working Directory path to a local disk path (e.g. /tmp). The full RSM-generated path (e.g.
/tmp/abcdef.xyz) will exist on the machine specified on that tab, as well as the node(s) that the
cluster software selects to run the job.

• If the File Management property is set to Reuse Shared Cluster Directory, the Linux Working Dir-
ectory path is populated with the path specified for Shared Cluster Directory on the Cluster tab
and cannot be edited. This is where the cluster job runs, as expected.

Requirements
• All RSM users must have write access and full permissions to this directory.

• The Linux Working Directory must be shared and writable to the remote Linux machine. For a cluster
configuration, it must be shared and writable to all of the nodes in the cluster.

• Note that in some cluster configurations, the Linux Working Directory may also need to exist on each
cluster node and/or may share the same physical location as the Shared Cluster Directory.


This value is accessed by the Task.ProxyPath property in the job script. Custom job scripts can
use this value for any purpose.

Examples of Linux Working Directory paths are /scratch/josephuser, \\lsfClusterNode1\RSMTemp,
and \\msccHeadNode\RSM_Temp.

Note

In a Linux cluster configuration that uses the SSH protocol, when you specify that you
want to use the Shared Cluster Directory to store temporary solver files, essentially you
are indicating that the Linux Working Directory and the Shared Cluster Directory are the
same location; as such, the Linux Working Directory property is populated with the
path entered for the Shared Cluster Directory property in the Cluster tab. See the de-
scriptions of the Shared Cluster Directory and File Management properties.

Linux Account
The Linux Account property displays when the Use SSH check box is selected. It requires that you enter
the name of the account used to log in to the remote Linux machine.

This value is accessed by the Task.ProxyAccount property in the job script. Custom job scripts
can use this value for any purpose.

For instructions on integrating Windows with Linux using SSH/SCP, see Appendix B.

Related Topics:

Adding a Compute Server (p. 55)

Compute Server Properties Dialog: General Tab (p. 57)

Compute Server Properties Dialog: Cluster Tab (p. 63)

5.6. Testing a Compute Server


To test a Compute Server configuration, right-click on the Compute Servers node in the tree view and
select Test Server from the context menu that displays. This runs a test job using the settings provided.
The Job Log view displays a log message indicating if the test finished or failed. If the test finishes, you
can successfully run jobs on the Compute Server.

Note

If you do not have full permissions to the Compute Server working directory, Compute
Server tests will fail.

If tests fail, try deselecting the Delete Job Files in Working Directory check box on the General tab
of the Compute Server Properties dialog. You can then examine the contents of the temporary job
directories for additional debugging information.

When this option is deselected, RSM will keep the temporary directories on the server after the job is
completed. You can find the location of these temporary directories by looking for the line that specifies
the "Compute Server Working Directory" in the RSM log.


The Test Server job will always keep the temporary client working directory created by RSM on the client
machine, regardless of the Delete Job Files in Working Directory setting. You can find the location of the
temporary client working directory by looking for the line that specifies the "Client Directory" in the
RSM log.

Chapter 6: Customizing ANSYS Remote Solve Manager
This section discusses various methods of customizing ANSYS Remote Solve Manager.

The following topics are addressed:


6.1. Understanding RSM Custom Architecture
6.2. Custom Cluster Integration Setup
6.3. Writing Custom Code for RSM Integration

6.1. Understanding RSM Custom Architecture


The [RSMInstall]\Config directory contains job templates, code templates, job scripts, and other
files that are used to define and control RSM jobs. The RSM architecture allows the user to customize
how jobs are executed on a cluster or Compute Server by providing a custom version of some of the
files. This section briefly describes the types of files used in the customization.

This section addresses each file type in the RSM customization architecture:
6.1.1. Job Templates
6.1.2. Code Templates
6.1.3. Job Scripts
6.1.4. HPC Commands File

6.1.1. Job Templates


Job Templates define the code template, inputs, and outputs of a job.

RSM job templates are located in the [RSMInstall]\Config\xml directory. Examples of job tem-
plates in this directory are GenericJob.xml, Addin_ANSYSJob.xml, and Addin_CFXJob.xml.

An example job template for a server test job is shown below:


<?xml version="1.0" ?>
<JobTemplate>
  <script>ServerTestCode.xml</script>
  <debug>TRUE</debug>
  <cleanup>TRUE</cleanup>
  <inputs>
    <file type="ascii">*.in</file>
  </inputs>
  <outputs>
    <file type="ascii">*.out</file>
  </outputs>
</JobTemplate>

6.1.2. Code Templates


Code templates are used by the corresponding job template and determine which scripts will be used
to run a specific job. Code templates contain sections for the actual code files (job scripts), referenced
assemblies (.dlls), and support files. These code templates are chosen at runtime based upon the
job template and cluster type selected to run the job.


RSM code templates are located in the [RSMInstall]\Config\xml directory.

An example code template for a server test job is shown below:


<?xml version="1.0" ?>
<codeDefinition transfer="true" prebuilt="false" assemblyName="Ans.Rsm.Test.dll">
  <codeFiles>
    <codeFile>Test.cs</codeFile>
  </codeFiles>
  <references>
    <reference>Ans.TestDlls.dll</reference>
  </references>
  <supportFiles>
    <supportFile transfer="true" type="ascii">TestingScript.py</supportFile>
  </supportFiles>
</codeDefinition>

6.1.3. Job Scripts


The job scripts for a particular type of job are defined in the <codeFiles> section of the code template.
The term script refers generically to the code used for running the different types of RSM jobs, such as
native jobs, SSH jobs, cluster jobs, etc. Depending on the Compute Server configuration, different sets
of scripts may also be compiled dynamically during the run time of the job. Job scripts also include
actual command scripts that you may provide to customize the cluster job behavior. These scripts are
included in the <supportFiles> section.

RSM job script files are located in the [RSMInstall]\Config\scripts directory.

Specialized job scripts for integrating RSM with third-party job schedulers are invoked based upon the
Cluster Type property on the Cluster tab of the Compute Server Properties dialog. Your selection
from the Cluster Type drop-down is appended to the name of the base code template. These files are
generically in the format of <BaseName>_<Keyword>.xml.

• For example, if the base code template named in the job template is TestCode.xml and you set
Cluster Type to LSF, then LSF will be your “keyword” and RSM will look for the corresponding code
template TestCode_LSF.xml. This code template then invokes the scripts necessary to run a test job
on an LSF cluster.

• If you choose a Cluster Type of Custom, then “Custom” is not used as the keyword; you are required to
provide a name for your Custom Cluster Type. That name becomes your “keyword”, allowing you
to customize the cluster and modify these files without breaking any functionality.

6.1.4. HPC Commands File


The cluster-specific HPC commands file is the configuration file used to specify the commands or
queries that will be used in the cluster integration. The file is in XML format and is located in
the [RSMInstall]\Config\xml directory. By default, the file name is hpc_commands_<clustertype>.xml.
When using a custom cluster type, you will need to provide a copy of the HPC commands
file that matches your custom cluster type name, in the format hpc_commands_<keyword>.xml, as
discussed in the setup sections of Custom Cluster Integration Setup (p. 75).

The commands inside the HPC command file can point directly to cluster software specific commands
(like bsub or qstat). When the operations are more complex, the commands can reference scripts or
executables that call the cluster software functions internally. These scripts can be in any language that
can be run by the Compute Server. The HPC command file is described in greater detail in Custom
Cluster Integration Setup (p. 75).


6.2. Custom Cluster Integration Setup


ANSYS Remote Solve Manager (RSM) provides built-in functionality that allows Workbench jobs to be
submitted to a commercial cluster. The built-in functionality includes the ability to transfer files auto-
matically to/from the cluster from a remote client and the ability to submit, cancel, and monitor Work-
bench jobs. The current supported commercial clusters are Windows LSF, Linux LSF, Linux PBS Pro,
Linux UGE (SGE), and Microsoft HPC (MSCC).

RSM also provides a custom cluster integration mechanism that allows third parties to use custom
scripts to perform the tasks needed to integrate Workbench with the cluster. The custom integration
scenarios can be grouped into the following categories in order of complexity:

• Commercial clusters (listed above) for which the customers need some additional operation to be performed
as part of the RSM job execution. This is a type of server-side integration.

• “Unsupported” clusters, not included in the list above, that the customers want to use for executing a job
via RSM. This is also a type of server-side integration.

• Customers with specialized requirements that need to fully replace RSM functionality with 3rd-party scripts
for handling all aspects of job submission including file transfer. This is called client-side integration.

The terms server-side and client-side integration refer to the location (in the RSM architecture) where
the custom script files are going to be located.

In the typical RSM usage with a (supported or unsupported) cluster, the head node of the cluster is
typically configured as RSM Manager and Compute Server. The cluster acts as a Compute Server with
respect to the RSM client from where the jobs are submitted; therefore the customization of RSM files
on the cluster is referred to as server-side integration. For server-side integration, you must be able to
set up the RSM services on the cluster head node, and file transfers cannot use SSH. The methods of file
transfer discussed in Setting Up RSM File Transfers (p. 21) are available, except for SSH File Transfer (p. 28)
and Custom Client Integration (p. 28).

The client-side integration refers to the case where the RSM functionality is completely replaced by the
3rd-party scripts. In this case, the RSM Manager and Compute Server are located on the Client machine.
However, only a thin layer of the RSM architecture is involved, in order to provide the APIs for execution
of the custom scripts, which are located on the Client machine. The RSM services are not installed on
the cluster machine.

Please note that for supported clusters it is also possible to include additional job submission arguments
to the command executed by the cluster. The addition of custom submission arguments does not require
the creation of custom scripts. For more details, please refer to Compute Server Properties Dialog: Cluster
Tab (p. 63).

The following sections describe the general steps for customization with server-side and client-side in-
tegration. The detailed instructions for writing the custom code are similar for the two cases. They are
addressed in Writing Custom Code for RSM Integration (p. 89).

The following topics are addressed:


6.2.1. Customizing Server-Side Integration
6.2.2. Customizing Client-Side Integration
6.2.3. Configuring File Transfer by OS Type and Network Share Availability


6.2.1. Customizing Server-Side Integration


RSM allows you to customize your integration with supported cluster types (LSF, PBS, HPC, and SGE)
by starting with examples of production code for one of the standard cluster types and then changing
command lines or adding custom code where necessary. If an unsupported cluster is being used, the
recommended procedure is still to start from the example files for one of the supported clusters. This
section walks through the process of integrating such changes into RSM.

On the server-side RSM installation, you will need to log into the remote cluster (RSM Manager) machine
to perform all of the tasks (steps 1 through 5). To override or modify specific cluster commands, you
must:

1. Configure RSM to use a cluster-specific code template.

2. Create copies of existing code and rename files using your new Custom Cluster Type “keyword.”

3. Edit the cluster-specific code template to use your new cluster type.

4. Edit the cluster-specific hpc_commands_<keyword> file to reference the code you want to execute.

5. Provide a cluster-specific script/code/command that does the custom action and returns the required
RSM output.

The following sections discuss the steps needed for customizing your integration cluster:
6.2.1.1. Configuring RSM to Use Cluster-Specific Code Template
6.2.1.2. Creating Copies of Standard Cluster Code Using Custom Cluster Keyword
6.2.1.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type
6.2.1.4. Modifying Cluster-Specific HPC Commands File

6.2.1.1. Configuring RSM to Use Cluster-Specific Code Template


On the server-side RSM installation, you will need to log into the remote cluster (RSM Manager) machine
to perform all the tasks (steps 1 through 5) in Customizing Server-Side Integration (p. 76).

After creating a new Compute Server, set up the Compute Server Properties dialog box under the
Cluster tab. You must set Cluster Type to Custom and create a short phrase/word in the Custom
Cluster Type property as the custom cluster name. The name is arbitrary, but you should make it simple enough
to append to file names. This name will be referred to as the “keyword” from now on.

For supported clusters, you can include the original cluster name in the new custom name, for clarity.
For example, if your cluster is actually an LSF or PBS cluster but you need to customize the RSM inter-
action with it, you might use the keyword “CUS_LSF” or “CUS_PBS”. If the underlying cluster is not a
supported platform, it could be called “CUSTOM” or any other arbitrary name. The names are in capital
letters for simplicity, but the only requirement is that the capitalization is the same in all places where
this keyword is referenced. A full example of a typical cluster setup using the remote RSM Manager and
custom properties is shown below.


Note

• The Working Directory you choose must be readable and writable by all users of RSM and also
by rsmadmins. The Working Directory should not be shared between nodes but it should have
the same name (and parent directories) on all nodes.

• The Shared Cluster Directory you choose must be readable and writable by all users of RSM
and also by rsmadmins. The Shared Cluster Directory must be shared between all nodes of
the cluster.

Please refer to Adding a Compute Server (p. 55) in the Remote Solve Manager User's Guide for information
about the properties in the General and Cluster tabs of the Compute Server Properties dialog.

6.2.1.2. Creating Copies of Standard Cluster Code Using Custom Cluster Keyword
As part of the setup, you must create a custom copy of the cluster-specific code template and
hpc_commands files and modify them to load the custom job code for your custom integration. You
must also create a custom copy of the xml file that contains the definition of the HPC commands to
be used for the job execution. The starting point for the code template and command files can be
created by copying them from existing RSM files as shown below:

• Locate the directory [ANSYS V15 Install]/RSM/Config/xml. Please note that all the actions listed
below should be performed on the cluster installation.


• Locate the GenericJobCode file that pertains to your cluster type (for instance, if you are starting from
PBS, the file is GenericJobCode_PBS.xml).

Note

You cannot use the SSH versions of these files.

• Copy the content of the GenericJobCode_PBS.xml code template into a new code template Gener-
icJobCode_<YOURKEYWORD>.xml. If your keyword for the custom cluster was “CUS_PBS” like the
example in Configuring RSM to Use Cluster-Specific Code Template (p. 76), the new file should be called
GenericJobCode_CUS_PBS.xml.

• Locate the commands file that pertains to your cluster type (for instance, if you are using PBS, the file is
hpc_commands_PBS.xml).

• Copy the content of the hpc_commands_PBS.xml file into a new file hpc_commands_<YOUR-
KEYWORD>.xml. If your keyword for the custom cluster was “CUS_PBS” like the example in Configuring
RSM to Use Cluster-Specific Code Template (p. 76), the new file should be called hpc_com-
mands_CUS_PBS.xml.

Note

In order to use the native RSM cluster functionality (i.e. using a fully supported cluster type
in your setup, e.g. LSF, PBS, etc.), you must not change the file names or contents of the
corresponding cluster-specific templates provided by RSM. Changing them can cause those standard
cluster setups to fail and will make it harder to start over if you need to change something
later on. Here we have created a custom cluster type but started from copies of a standard
template; this is the recommended method.

6.2.1.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type
The code sample below provides an example of a modified job code file, GenericJobCode_CUS_PBS.xml.
The modifications are described after the sample, and you will need similar edits for any cluster type.
<?xml version="1.0"?>
<codeDefinition transfer="true" prebuilt="false" assemblyName="Ans.Rsm.GenericJob_PBS.dll">
<codeFiles>
<codeFile>GenericJobTP.cs</codeFile>
<codeFile>GenericJobBase.cs</codeFile>
<codeFile>GenericCommand.cs</codeFile>
<codeFile>IProxyScheduler.cs</codeFile>
<codeFile>PBSUtilities.cs</codeFile>
<codeFile>SchedulerBase.cs</codeFile>
<codeFile>ClusterCustomization.cs</codeFile>
<codeFile>Utilities.cs</codeFile>
</codeFiles>
<references>
<reference>Ans.Rsm.ScriptApi.dll</reference>
<reference>Ans.Rsm.Utilities.dll</reference>
</references>
<supportFiles>
<supportFile transfer="true" type="ascii">ClusterJobs.py</supportFile>
<supportFile transfer="true" type="ascii">ClusterJobCustomization.xml</supportFile>
<supportFile transfer="true" type="ascii">hpc_commands_CUS_PBS.xml</supportFile>
<supportFile transfer="true" type="ascii">submit_PBS_EXAMPLE.py</supportFile>
</supportFiles>
</codeDefinition>


The following modifications were made to the original file:

1. The hpc_commands file was changed from the cluster-specific hpc_commands_PBS.xml to the
custom file we created: hpc_commands_CUS_PBS.xml. All cluster types will need to change this
file similarly, using <YOURKEYWORD>.

2. A custom script, submit_PBS_EXAMPLE.py, was added as a new support file. Its location is given in the
   note below. Adding support files in the job code file tells RSM to find these files in the scripts
   (or xml, respectively) directory and transfer them to the working directory for use when the job
   runs. All custom scripts used by RSM should be referenced here.

Note

The submit_PBS_EXAMPLE.py script is provided in the directory
[RSMInstall]/RSM/Config/scripts/EXAMPLES and can be used as a starting point for a
customized Submit command. The script should be copied into the scripts directory,
[RSMInstall]/RSM/Config/scripts, or a full path to the script must be provided
along with the name.

6.2.1.4. Modifying Cluster-Specific HPC Commands File


The command file prior to the modification is pasted below. While a detailed description of the commands
is beyond the scope of this documentation, note that the command file provides the information
on how actions related to job execution (submitting a job, canceling a job, getting the job status) are executed.
The file also refers to a number of environment variables.
<?xml version="1.0" encoding="utf-8"?>
<jobCommands version="2" name="Custom Cluster Commands">
<environment>
<env name="RSM_HPC_PARSE">PBS</env>
</environment>
<command name="submit">
<application>
<app>qsub</app>
</application>
<arguments>
<arg>
<value>-q %RSM_HPC_QUEUE%</value>
<env name="RSM_HPC_QUEUE">ANY_VALUE</env>
</arg>
<arg>
<value>-l select=%RSM_HPC_CORES%:ncpus=1:mpiprocs=1</value>
<env name="RSM_HPC_DISTRIBUTED">TRUE</env>
</arg>
<arg>
<value>-l select=1:ncpus=%RSM_HPC_CORES%:mpiprocs=%RSM_HPC_CORES%</value>
<env name="RSM_HPC_DISTRIBUTED">FALSE</env>
</arg>
<arg>
<value>%RSM_HPC_NATIVEOPTIONS% -V -o %RSM_HPC_STDOUTFILE% -e %RSM_HPC_STDERRFILE%</value>
</arg>
<arg>
<value>-- %RSM_HPC_COMMAND%</value>
<env name="RSM_HPC_USEWRAPPER">FALSE</env>
</arg>
<arg>
<value>%RSM_HPC_COMMAND%</value>
<env name="RSM_HPC_USEWRAPPER">TRUE</env>
</arg>
</arguments>
</command>
<command name="cancel">
<application>
<app>qdel</app>
</application>
<arguments>
<arg>%RSM_HPC_JOBID%</arg>
</arguments>
</command>
<command name="status">
<application>
<app>qstat</app>
</application>
<arguments>
<arg>%RSM_HPC_JOBID%</arg>
</arguments>
</command>
<command name="queues">
<application>
<app>qstat</app>
</application>
<arguments>
<arg>-Q</arg>
</arguments>
</command>
</jobCommands>

The submit command section is the portion we want to customize in this example. In the original
version, the Submit command invokes the cluster qsub command with arguments determined via
environment variables. The actual executable that is submitted to the cluster is determined by RSM
at runtime and can be specified via an environment variable named RSM_HPC_COMMAND. For details,
see Submit Command (p. 91).

The example below shows the same section after it is customized to execute the Python file sub-
mit_PBS_EXAMPLE.py. In this example, we defined the type of application to execute (runpython,
accessed from the ANSYS installation) and the name of the Python file to be executed (submit_PBS_EX-
AMPLE.py).
<command name="submit">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/linx64/python/runpython</app>
</application>
<arguments>
<arg>
<value>submit_PBS_EXAMPLE.py</value>
</arg>
</arguments>
</command>

The custom Submit command appears much simpler than the original one. However, the details of the
submission are handled inside the Python file, which contains the same arguments used in the original
section. The Python file will also contain any custom code to be executed as part of the submission.
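For illustration, a hedged sketch of what such a submission script might look like is shown below. This is
not the shipped submit_PBS_EXAMPLE.py; it simply rebuilds the original qsub arguments from the RSM
environment variables described in Custom Integration Environment Variables (p. 93) and forwards the
qsub output so that PBS parsing can extract the cluster Job ID.

import os
import subprocess
import sys

# Hedged sketch only; not the shipped submit_PBS_EXAMPLE.py. It rebuilds the
# qsub command line from the RSM environment variables and prints qsub's own
# output so that RSM_HPC_PARSE=PBS can extract the cluster Job ID from it.
queue = os.environ.get("RSM_HPC_QUEUE", "")
cores = os.environ.get("RSM_HPC_CORES", "1")
distributed = os.environ.get("RSM_HPC_DISTRIBUTED", "FALSE").upper() == "TRUE"
native_options = os.environ.get("RSM_HPC_NATIVEOPTIONS", "")
stdout_file = os.environ.get("RSM_HPC_STDOUTFILE", "")
stderr_file = os.environ.get("RSM_HPC_STDERRFILE", "")
job_command = os.environ.get("RSM_HPC_COMMAND", "")

# Same resource selection logic as the original hpc_commands_PBS.xml section.
if distributed:
    select = "select=%s:ncpus=1:mpiprocs=1" % cores
else:
    select = "select=1:ncpus=%s:mpiprocs=%s" % (cores, cores)

# Custom pre-submission actions (logging, license checks, etc.) would go here.

# Mirrors the non-wrapper ("-- %RSM_HPC_COMMAND%") form of the original template.
cmd = "qsub -q %s -l %s %s -V -o %s -e %s -- %s" % (
    queue, select, native_options, stdout_file, stderr_file, job_command)
proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
output, _ = proc.communicate()
print(output)          # the PBS parser reads the cluster Job ID from this output
sys.exit(proc.returncode)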

Other commands or queries can be overridden using the same procedure. You can find the command
name in the cluster-specific hpc_commands file and replace the application that needs to be executed
and the arguments needed by the application. Details on how to provide custom commands, as well
as the description of the environment variables, are provided in Writing Custom Code for RSM Integra-
tion (p. 89).

6.2.2. Customizing Client-Side Integration


The mechanism and operations for custom client-side integration are very similar to the ones for custom
server-side integration. However, the underlying architecture is different. In the server-side integration,
the customization affects the scripts used for RSM execution on the server/cluster side. In the client-
side integration, only a thin layer of the RSM on the client side is involved. The layer provides the APIs


for the execution of the custom scripts, which are located on the Client machine. RSM is not installed
on the server/cluster. It is the responsibility of the custom scripts to handle all aspects of the job execution,
including transfer of files to and from the server.

The RSM installation provides some prototype code for client integration that can be tailored and
modified to meet specific customization needs. As indicated above, the steps needed for client-side
integration are very similar to those for server-side integration. On the client-side RSM installation, you
will be using the local Client machine (and Manager) to perform all the tasks (steps 1 through 5), as
follows:

1. Configure RSM to use cluster-specific code template.

2. Create copies of prototype code for the custom cluster type.

3. Edit cluster-specific code template to use your new cluster type.

4. Edit the cluster-specific hpc_commands_<keyword> file to reference the custom commands.

5. Provide cluster-specific scripts/code/commands that perform the custom actions and return the
   required RSM output.

The following sections discuss the steps to customize your integration:


6.2.2.1. Configuring RSM to Use Cluster-Specific Code Template on the Client Machine
6.2.2.2. Creating Copies of Sample Code Using Custom Client Keyword
6.2.2.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type
6.2.2.4. Modifying Cluster-Specific HPC Commands File

6.2.2.1. Configuring RSM to Use Cluster-Specific Code Template on the Client Machine
On the client-side RSM installation, you will be using the local Client machine (and Manager) to perform
all the tasks (steps 1 through 5) in Customizing Client-Side Integration (p. 81).

After creating a new Compute Server, set up the Compute Server Properties dialog under the Cluster
tab. You must set Cluster Type to Custom and then create a short phrase/word in the Custom
Cluster Type property as the custom cluster name. The name is arbitrary, but you should make it simple
enough to append to file names. This name will be referred to as the “keyword” from now on.

For supported clusters, you can include the original cluster name in the new custom name, for clarity.
For example, if your cluster is actually an LSF or PBS cluster but you need to customize the RSM inter-
action with it, you might use the keyword “CUS_LSF” or “CUS_PBS”. If the underlying cluster is not a
supported one, it could be called “CUSTOM” or any other arbitrary name. The names are in capital letters
for simplicity, but the only requirement is that the capitalization is the same in all places where this
“keyword” is referenced.

For the Cluster tab File Management property, select Custom Handling of Shared Cluster Directory.
This option turns off any attempt to copy files to the shared cluster directory. The custom scripts are re-
sponsible for getting files to and from the appropriate location on the cluster side. See Configuring File
Transfer by OS Type and Network Share Availability (p. 86) for more details on the file transfer scenarios
by OS and network share availability.

A full example of a typical cluster setup using the local Manager and custom client definition is shown
below.


6.2.2.2. Creating Copies of Sample Code Using Custom Client Keyword


As part of the setup, you must create a custom copy of the cluster-specific code template and hpc_com-
mands files and modify them to load the custom job code for your custom integration. The starting
point for the code template and command files can be created by copying them from sample files that
are provided in the RSM installation. The sample files are marked with the suffix CIS (Client Integration
Sample) and provide an example of LSF-based integration.

1. Using the RSM installation on your Client machine, locate the directory [RSMInstall]\Config\xml.
Please note that all the actions listed below should be performed on the Client machine.

2. Locate the sample file for code template GenericJobCode_CIS.xml.

3. Copy the content of the GenericJobCode_CIS.xml code template into a new code template Gen-
ericJobCode_<YOURKEYWORD>.xml. If your keyword for the custom cluster was “CUS_LSF” like
the example in Configuring RSM to Use Cluster-Specific Code Template on the Client Machine (p. 82),
the new file should be called GenericJobCode_CUS_LSF.xml.

4. Locate the sample file for command execution hpc_commands_CIS.xml.

5. Copy the content of the hpc_commands_CIS.xml command file into a new command file template
hpc_commands_<YOURKEYWORD>.xml. If your keyword for the custom cluster was “CUS_LSF” like
the example in Configuring RSM to Use Cluster-Specific Code Template on the Client Machine (p. 82),
the new file should be called hpc_commands_CUS_LSF.xml.

The client-side integration requires a custom implementation to be provided for all the commands to
be executed on the cluster. The standard RSM installation includes sample scripts for all these commands,
which should be used as a starting point for the customization. The sample scripts are named sub-
mit_CIS.py, cancel_CIS.py, status_CIS.py, transfer_CIS.py, and cleanup_CIS.py.
They are located in the [RSMInstall]\RSM\Config\scripts directory.

While it is not absolutely necessary to create a copy and rename the scripts, we have done so for con-
sistency; in the rest of the example, it is assumed that they have been copied and renamed to add the
same keyword chosen for the custom cluster, e.g. (submit_CUS_LSF.py, cancel_CUS_LSF.py,
status_CUS_LSF.py, transfer_CUS_LSF.py, and cleanup_CUS_LSF.py). These scripts will have
to be included in the custom job template, as shown in the following section, Modifying Cluster-Specific
Job Code Template to Use New Cluster Type (p. 84).

These CIS scripts are sample scripts that implement a fully custom client integration on a standard
LSF cluster, for illustration purposes only. Generally, custom client integrations do not use standard
cluster types, and thus there are no samples for custom client integrations on other cluster types.

Note

Any additional custom code that you want to provide as part of the customization should
also be located in the [RSMInstall]\RSM\Config\scripts directory corresponding
to your local (client) installation. Alternatively, a full path to the script must be provided
along with the name.

6.2.2.3. Modifying Cluster-Specific Job Code Template to Use New Cluster Type
The code sample pasted below provides an example of a modified GenericJobCode_CUS_LSF.xml.


<?xml version="1.0"?>
<codeDefinition transfer="false" prebuilt="false" assemblyName="Ans.Rsm.CustomCluster.dll">
<codeFiles>
<codeFile>GenericJobTP.cs</codeFile>
<codeFile>GenericJobBase.cs</codeFile>
<codeFile>GenericCommand.cs</codeFile>
<codeFile>IProxyScheduler.cs</codeFile>
<codeFile>SchedulerBase.cs</codeFile>
<codeFile>Utilities.cs</codeFile>
<codeFile>ClusterCustomization.cs</codeFile>
<codeFile>GenericCluster.cs</codeFile> <!-- Generic class loads hpc commands file below -->
</codeFiles>
<references>
<reference>Ans.Rsm.ScriptApi.dll</reference><!-- Required. Interface to RSM -->
<reference>Ans.Rsm.Utilities.dll</reference>
</references>
<supportFiles>
<!-- files to copy to RSM job working directory.
if no path, *.xml assumed to be in RSM/Config/xml,
otherwise RSM/Config/scripts -->
<supportFile transfer="true" type="ascii">ClusterJobs.py</supportFile>

<!-- Compute Server keyword is CUS_LSF. -->


<supportFile transfer="true" type="ascii">hpc_commands_CUS_LSF.xml</supportFile>

<!-- In this example, these are the customer provided external scripts. -->
<supportFile transfer="true" type="ascii">submit_CUS_LSF.py</supportFile>
<supportFile transfer="true" type="ascii">status_CUS_LSF.py</supportFile>
<supportFile transfer="true" type="ascii">cancel_CUS_LSF.py</supportFile>
<supportFile transfer="true" type="ascii">transfer_CUS_LSF.py</supportFile>
<supportFile transfer="true" type="ascii">cleanup_CUS_LSF.py</supportFile>
</supportFiles>
</codeDefinition>

The following modifications were made to the original file:

1. The hpc_commands file was changed from the sample name hpc_commands_CIS.xml to your
custom file name hpc_commands_CUS_LSF.xml. All cluster types will need to change this file
similarly, using <YOURKEYWORD>.

2. Custom scripts described in the previous section are added as new support files.

Note

All the scripts listed should be in the directory [RSMInstall]\RSM\Config\scripts,
or a full path to the scripts must be provided along with the name.

6.2.2.4. Modifying Cluster-Specific HPC Commands File


The cluster-specific HPC commands file is the configuration file used to specify the commands that will
be used in the cluster integration. The file is in xml format and is located in the [RSMIn-
stall]\RSM\Config\xml directory.

This section provides an example of a modified file, hpc_commands_CUS_LSF.xml. The cluster
commands are provided by the CIS sample scripts referred to in the previous section. These scripts
have been copied from the samples provided in the RSM installation and renamed to match the keyword
chosen for the custom cluster.

The hpc_commands file provides the information on how commands or queries related to job execution
are executed. The file can also refer to a number of environment variables. Details on how to provide


custom commands, as well as the description of the environment variables, are provided in Writing
Custom Code for RSM Integration (p. 89).
<jobCommands version="2" name="Custom Cluster Commands">
<environment>
<env name="RSM_HPC_PARSE">CUSTOM</env>
<env name="RSM_HPC_PARSE_MARKER">START</env>
</environment>
<command name="submit">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/winx64/python/python.exe</app>
</application>
<arguments>
<arg>submit_CUS_LSF.py</arg>
</arguments>
</command>
<command name="status">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/winx64/python/python.exe</app>
</application>
<arguments>
<arg>status_CUS_LSF.py</arg>
</arguments>
</command>
<command name="cancel">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/winx64/python/python.exe</app>
</application>
<arguments>
<arg>cancel_CUS_LSF.py</arg>
</arguments>
</command>
<command name="transfer">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/winx64/python/python.exe</app>
</application>
<arguments>
<arg>transfer_CUS_LSF.py</arg>
</arguments>
</command>
<command name="cleanup">
<application>
<app>%AWP_ROOT150%/commonfiles/CPython/winx64/python/python.exe</app>
</application>
<arguments>
<arg>cleanup_CUS_LSF.py</arg>
</arguments>
</command>
</jobCommands>

6.2.3. Configuring File Transfer by OS Type and Network Share Availability


A remote job execution on a cluster usually requires the transfer of files to and from a cluster directory.
With client-side custom integration, the cluster job file management can vary according to whether the
cluster staging area is visible to the RSM Client machine.

The Compute Server settings are used to specify information about the cluster staging area and local
scratch directory. For client-side custom integration, the Compute Server settings can also be overridden
through environment variables. The environment variables (such as RSM_HPC_PLATFORM,
RSM_HPC_SCRATCH, and RSM_HPC_STAGING) are set on the Client machine where the RSM job will
run. For details on these variables, see Environment Variables Set by Customer (p. 94).

The following sections contain example configuration settings for different scenarios:
6.2.3.1. Windows Client to Windows Cluster
6.2.3.2. Windows Client to Linux Cluster
6.2.3.3. Linux Client to Linux Cluster


For each scenario, the Shared Cluster Directory (aka staging directory) can be:

• Visible to the RSM Client machine via a network share, Samba share, or mapped drive. In these cases,
RSM will attempt to copy files to and from the cluster staging area.

• Not visible to the RSM Client machine. In these cases, file management is handled by external HPC
commands/scripts. RSM is not involved in the copying of files to/from the cluster.

6.2.3.1. Windows Client to Windows Cluster


In the following two scenarios, a Windows Client machine is integrated with a Windows cluster.

6.2.3.1.1. Windows-to-Windows, Staging Visible


In this scenario, the Windows Client can “see” the Windows cluster staging area via a network share or
mapped drive.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab:

• Set Cluster Shared Directory to the path of the actual shared directory on the cluster. RSM will copy
jobs to and from this location.

• Set File Management to either Use Execution Node Local Disk or Reuse Shared Cluster Directory.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

6.2.3.1.2. Windows-to-Windows, Staging Not Visible


In this scenario, the Windows Client cannot “see” the Windows cluster staging area.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab, set File Management to Custom Handling of Shared Cluster Directory. This option
will prevent RSM from file copying on the client side.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

4. Set the RSM_HPC_STAGING environment variable to the path of the staging directory for the cluster.

6.2.3.2. Windows Client to Linux Cluster


In the following two scenarios, a Windows Client machine is integrated with a Linux cluster.


6.2.3.2.1. Windows-to-Linux, Staging Visible


In this scenario, the Windows Client can “see” the Linux cluster staging area via a Samba UNC or mapped
drive.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab:

• Set Cluster Shared Directory to the path of the actual Windows-side shared directory on the cluster.

• Set File Management to either Use Execution Node Local Disk or Reuse Shared Cluster Directory.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

4. Set the RSM_HPC_STAGING environment variable to the path of the Linux-side staging directory.

Note

Once a unique directory for the job is created on the Windows-side (for instance, \\ma-
chine\StagingShare\abcdef.xyz), then RSM_HPC_STAGING is internally updated
by RSM to include the unique directory name (for instance /staging/abcdef.xyz).

5. Set the RSM_HPC_PLATFORM environment variable to linx64.

6.2.3.2.2. Windows-to-Linux, Staging Not Visible


In this scenario, the Windows Client cannot “see” the Linux cluster staging area.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab, set File Management to Custom Handling of Shared Cluster Directory.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

4. Set the RSM_HPC_STAGING environment variable to the path of the staging directory on the cluster.

Note

Once a unique directory for the job is created on the Windows-side (for instance, \\ma-
chine\StagingShare\abcdef.xyz), then RSM_HPC_STAGING is internally updated
by RSM to include the unique directory name (for instance, /staging/abcdef.xyz).

5. Set the RSM_HPC_PLATFORM environment variable to linx64.


6.2.3.3. Linux Client to Linux Cluster


In the following two scenarios, a Linux Client machine is integrated with a Linux cluster.

6.2.3.3.1. Linux-to-Linux, Staging Visible


In this scenario, the Linux Client can “see” the Linux cluster staging area because the staging area is
mounted on the Client machines.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab:

• Set Cluster Shared Directory to the path of the actual shared directory on the cluster.

• Set File Management to either Use Execution Node Local Disk or Reuse Shared Cluster Directory.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

6.2.3.3.2. Linux-to-Linux, Staging Not Visible


In this scenario, the Linux Client cannot “see” the Linux cluster staging area.

1. On the Compute Server Properties dialog General tab, set Working Directory to Automatic.

2. On the Cluster tab, set File Management to Custom Handling of Shared Cluster Directory.

3. Set the RSM_HPC_SCRATCH environment variable:

• If using local scratch, set to the path of the desired local scratch space on the cluster.

• If you are doing scratch management, set to CUSTOM.

4. Set the RSM_HPC_STAGING environment variable to the path of the staging directory on the cluster.

Note

Once a unique directory for the job is created on the client side (for instance,
/tmp/abcdef.xyz), then RSM_HPC_STAGING is internally updated by RSM to include
the unique directory name (for instance, /staging/abcdef.xyz).

6.3. Writing Custom Code for RSM Integration


This section provides detailed information about the code that should be provided for custom integration
with RSM.

The custom code can be in any form convenient to you, typically in the form of scripts or executables.
Generally, scripts are used to wrap the underlying cluster software (for example, LSF) commands. You
can review sample Python scripts in the [RSMInstall]\Config\scripts directory.


The scripts have access to environment variables that are set to override default RSM behavior and to
environment variables that are dynamically set by RSM to provide information about job related variables.
A detailed description of the environment variables that the scripts can access is given in Custom Integ-
ration Environment Variables (p. 93).

This section discusses the following topics:


6.3.1. Parsing of the Commands Output
6.3.2. Customizable Commands
6.3.3. Custom Integration Environment Variables
6.3.4. Providing Client Custom Information for Job Submission

6.3.1. Parsing of the Commands Output


Since some of the commands used for custom integration are wrappers around cluster-specific commands,
it is necessary to parse the output of the cluster commands. The output of the cluster command provides
information such as cluster Job ID or status. It can also be used to report error and debugging messages.

6.3.1.1. Commands Output in the RSM Job Log


The output for all cluster command scripts should be sent directly to stdout or stderr. The contents
of stdout will be added to the RSM job log as standard messages. This content is also searched in
order to parse the information necessary as a result of the command execution. The handling of the
command output depends upon the value of the environment variable RSM_HPC_PARSE. The environ-
ment variable defines what type of output RSM should expect from the command. If the underlying
cluster used for the integration is one of the supported types (LSF/PBS/SGE/MSCC) you should set the
value of RSM_HPC_PARSE to the corresponding type. Printing the output of the command will allow
the RSM code to extract the appropriate information. For example, if the LSF option is used, RSM is
expecting the output of the Submit command to contain output from LSF bsub command.

If your cluster is not one of the supported ones, you should set RSM_HPC_PARSE to CUSTOM. In this
case, it is your responsibility to parse the output of the commands and provide RSM a variable with
the result. An optional RSM_HPC_PARSE_MARKER variable can be set to a ‘marker’ string of an output
line in order to indicate the line after which parsing should start. If no start marker is found, RSM
will parse all of the output as if the start marker were at the beginning of the output.
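For example, with RSM_HPC_PARSE set to CUSTOM and RSM_HPC_PARSE_MARKER set to START (the marker
value used by the client integration sample later in this chapter), a Submit script could print output
such as the following; the Job ID value shown is illustrative only:

import sys

# Hedged illustration of CUSTOM parsing with a start marker. Lines printed
# before the marker are ordinary log output; RSM begins parsing after it.
print("Submitting job to the scheduler...")   # ignored by the parser
print("START")                                # matches RSM_HPC_PARSE_MARKER
print("RSM_HPC_JOBID=12345")                  # parsed by RSM (illustrative ID)
sys.stdout.flush()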

6.3.1.2. Error Handling


Error messages and warning information are written to stdout as necessary. If they are properly
labeled as indicated below, they appear in the RSM log in orange for warnings and in bold red for errors.

Output format:

• RSM_HPC_ERROR=<errormessage>

• RSM_HPC_WARN=<warning>

Example Python snippet:

print('RSM_HPC_WARN=This is what a warning displays like')


6.3.1.3. Debugging
Debugging information, typically used for troubleshooting purposes, is shown on the RSM job log only
if the Debug Messages option is selected from the job log context menu. (To access this option, right-
click anywhere inside the job log pane of the RSM application main window.)

Output format:

• RSM_HPC_DEBUG=<debugmessage>
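Similarly, a debug message can be written from a Python script as shown below; the message text is
illustrative only:

print('RSM_HPC_DEBUG=submission arguments assembled, calling the scheduler')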

6.3.2. Customizable Commands


RSM will invoke a custom implementation for the following commands:
6.3.2.1. Submit Command
6.3.2.2. Status Command
6.3.2.3. Cancel Command
6.3.2.4.Transfer Command
6.3.2.5. Cleanup Command

6.3.2.1. Submit Command


The Submit command is invoked to submit a job to the cluster. The command should return as soon
as the queuing system has taken ownership of the job and a unique Job ID is available.

If using CUSTOM parsing, the command must write a unique Job ID with the following format:
RSM_HPC_JOBID=<jobid>. If using (LSF/PBS/SGE/MSCC) parsing, the script only needs to send the
direct output from the submission command (bsub/qsub/job submit).

The custom integration infrastructure provides the Python script, ClusterJobs.py, in the [RSMIn-
stall]\Config\scripts directory. The script serves as a layer of abstraction that allows a user-
selected operation (such as a component update for one or more of the applications or a design point
update) to be invoked without the need to be aware of the command line arguments and options re-
quired for the appropriate submission of the job.

In the Submit command, the ClusterJobs.py script should be invoked (rather than executing the
individual applications). This Python script should be considered as a layer that builds the appropriate
command line and sets the appropriate environment variables for the remote execution. The usage of
application specific command line in the Submit script is strongly discouraged and cannot be properly
supported in a general way.

For user convenience, the complete Python command that contains the job to be executed by the
Submit command (for instance, by LSF bsub) is provided through the environment variable
RSM_HPC_COMMAND.
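A hedged sketch of a Submit script that uses CUSTOM parsing is shown below. The scheduler submission
command (my_submit) and the pattern used to extract the Job ID are assumptions for illustration; a real
script would call your cluster's actual submission command and parse its real output.

import os
import re
import subprocess
import sys

# Hedged sketch only; assumes RSM_HPC_PARSE=CUSTOM. "my_submit" is a
# hypothetical scheduler command; replace it and the Job ID pattern with the
# real ones for your cluster.
queue = os.environ.get("RSM_HPC_QUEUE", "")
job_command = os.environ.get("RSM_HPC_COMMAND", "")   # invokes ClusterJobs.py

proc = subprocess.Popen("my_submit -q %s %s" % (queue, job_command),
                        shell=True, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
output, _ = proc.communicate()
print(output)                                # appears in the RSM job log

match = re.search(r"Job <?(\d+)>?", output)  # assumed output pattern
if match:
    print("RSM_HPC_JOBID=%s" % match.group(1))   # required for CUSTOM parsing
else:
    print("RSM_HPC_ERROR=Could not determine the cluster Job ID")
    sys.exit(1)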

Examples:

• Custom server examples for LSF, PBS, SGE, and MSCC are located in the [RSMInstall]\Con-
fig\scripts\EXAMPLES directory.

• A custom client example (for LSF) is provided in the file submit_CIS.py, located in the [RSMIn-
stall]\Config\scripts directory.

• More examples may be available on the ANSYS Customer Portal. For further information about tutorials
and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.


6.3.2.2. Status Command


The Status command has access to the Job ID through the environment variable RSM_HPC_JOBID.
Given a Job ID, the command should query the cluster for the status of the job and return the status
of that job in string format. If using CUSTOM parsing, the output should be parsed in order to provide
the status information with format RSM_HPC_STATUS=<jobstatus>, where jobstatus is:

• CANCELED

• FAILED

• FINISHED

• QUEUED

• RUNNING
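A hedged sketch of a Status script that uses CUSTOM parsing is shown below. The scheduler query command
(my_jobstatus) and the mapping from scheduler states to the RSM states listed above are assumptions for
illustration only.

import os
import subprocess

# Hedged sketch only. Replace "my_jobstatus" and the state mapping with the
# real query command and states for your cluster.
job_id = os.environ.get("RSM_HPC_JOBID", "")
proc = subprocess.Popen("my_jobstatus %s" % job_id, shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                        universal_newlines=True)
output, _ = proc.communicate()
print(output)                        # keep the raw scheduler output in the log

state_map = {"PEND": "QUEUED", "RUN": "RUNNING", "DONE": "FINISHED",
             "EXIT": "FAILED", "KILL": "CANCELED"}
status = "FAILED"                    # assumed fallback when nothing matches
for scheduler_state, rsm_state in state_map.items():
    if scheduler_state in output:
        status = rsm_state
        break
print("RSM_HPC_STATUS=%s" % status)  # required when RSM_HPC_PARSE is CUSTOM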

Examples:

• Custom server examples are not provided for this command.

• A custom client example (for LSF) is provided in the file status_CIS.py, located in the [RSMIn-
stall]\Config\scripts directory.

• More examples may be available on the ANSYS Customer Portal. For further information about tutorials
and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

6.3.2.3. Cancel Command


The Cancel command has access to the Job ID through the environment variable RSM_HPC_JOBID.
Given a Job ID, the command should invoke the cluster command to cancel the job.

No output is required from the Cancel command. However, an output statement should be given for
verification in the RSM log.
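A hedged sketch of a Cancel script is shown below, assuming a PBS cluster (qdel, as in the server-side
example earlier in this chapter); substitute the cancel command of your own scheduler.

import os
import subprocess

# Hedged sketch only. Cancels the cluster job identified by RSM_HPC_JOBID and
# writes a verification line to the RSM job log.
job_id = os.environ.get("RSM_HPC_JOBID", "")
print("Cancelling cluster job %s" % job_id)
subprocess.call("qdel %s" % job_id, shell=True)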

Examples:

• Custom server examples are not provided for this command.

• A custom client example (for LSF) is provided in the file cancel_CIS.py, located in the [RSMIn-
stall]\Config\scripts directory.

• More examples may be available on the ANSYS Customer Portal. For further information about tutorials
and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

6.3.2.4. Transfer Command


The Transfer command is invoked in order to transfer files to and from the cluster.

No output is required from the Transfer command. However, it is suggested to output the files that are
being copied for verification in the RSM log.

The Transfer command can check if the environment variable RSM_HPC_FILEDIRECTION equals
UPLOAD or DOWNLOAD to detect whether files should be uploaded to the cluster or downloaded from
the cluster.


The Transfer command is invoked to upload files to and retrieve files from the cluster, as follows:

• Uploading of files is invoked for input files and also when the user interrupts an application. (Applications
typically look for an interrupt file in a specified location.)

• Retrieving of files is invoked for output files once the job is completed. It is also invoked for inquiring
(downloading) files during the execution of the job. Inquiring of files is typically invoked from Workbench
for small files (such as convergence information).

The list of files to be uploaded or downloaded is provided through a semi-colon delimited list in the
environment variable RSM_HPC_FILELIST. File names can possibly contain wildcards (e.g. *.out).
The files are located in the current Working Directory in which the script is invoked (i.e. the RSM job
Working Directory).

The command can also access the environment variable RSM_HPC_FILECONTEXT, which is set to INPUTS
(beginning of job), OUTPUTS (end of job), CANCEL (cancelling a job), or INQUIRE (request for files
while the job is running). This information is especially useful in the case of INQUIRE, when extra
processing may be required to locate files for a running job.
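A hedged sketch of a Transfer script is shown below. It assumes that the staging area given by
RSM_HPC_STAGING is reachable as a local or mounted path from the machine running the script; a real
integration would use whatever transfer mechanism (network share, scp, and so on) reaches your cluster.

import glob
import os
import shutil

# Hedged sketch only. Adapt the copy calls to your own transfer mechanism.
staging = os.environ.get("RSM_HPC_STAGING", "")
direction = os.environ.get("RSM_HPC_FILEDIRECTION", "")
context = os.environ.get("RSM_HPC_FILECONTEXT", "")
file_list = os.environ.get("RSM_HPC_FILELIST", "")

print("Transfer context: %s, direction: %s" % (context, direction))

for pattern in [p for p in file_list.split(";") if p]:
    if direction == "UPLOAD":
        # Patterns are relative to the RSM job working directory.
        for path in glob.glob(pattern):
            print("Uploading %s" % path)      # verification in the RSM log
            shutil.copy(path, staging)
    elif direction == "DOWNLOAD":
        for path in glob.glob(os.path.join(staging, pattern)):
            print("Downloading %s" % path)
            shutil.copy(path, os.getcwd())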

Examples:

• Custom server integrations do not use this command.

• A custom client example is provided in the file transfer_CIS.py, located in the [RSMIn-
stall]\Config\scripts directory.

• More examples may be available on the ANSYS Customer Portal. For further information about tutorials
and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

6.3.2.5. Cleanup Command


The Cleanup command is called at the very end of the execution when all the other actions have been
completed. It can be used to perform clean-up operations or other actions that are needed
at the end of a job.

No output is required from the Cleanup command. However, an output statement should be given for
verification in the RSM log.
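A hedged sketch of a Cleanup script is shown below; the temporary-file pattern is an assumption for
illustration only.

import glob
import os

# Hedged sketch only. Removes illustrative temporary files from the job working
# directory and logs the action so it is visible in the RSM job log.
for path in glob.glob("*.tmp"):      # hypothetical temporary-file pattern
    print("Cleanup: removing %s" % path)
    os.remove(path)
print("Cleanup complete")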

Examples:

• Custom server examples are not provided for this command.

• A custom client example (for LSF) is provided in the file cleanup_CIS.py, located in the
[RSMInstall]\Config\scripts directory.

• More examples may be available on the ANSYS Customer Portal. For further information about tutorials
and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

6.3.3. Custom Integration Environment Variables


Workbench/RSM makes job settings available to custom commands via environment variables. Some
environment variables are set automatically by RSM at runtime, providing necessary information to the
custom scripts or executables in the HPC commands file. Other environment variables can be set by
your RSM administrator, if appropriate to your job management process.


6.3.3.1. Environment Variables Set by Customer


The following optional environment variables can be set by your RSM administrator on the
Compute Server side; they will be passed to the Compute Server as environment variables to be
used in scripting. Additionally, the user can set any number of variables as described in Providing Client
Custom Information for Job Submission (p. 96).

Note

For client-side custom integration, the RSM Compute Server is running on the Client machine.

RSM_HPC_CONFIG
Optional. Specifies the file name of the HPC commands file. Set this variable only if the file
name to be used is different than the default file name, hpc_commands.xml or
hpc_commands_<keyword>.xml, where keyword (the Custom Cluster Type) is defined on the Cluster
tab of the RSM Compute Server Properties dialog.
Example: RSM_HPC_CONFIG=hpc_commands_TEST.xml

RSM_HPC_PARSE
Specifies what type of output RSM should expect from these commands; choose LSF, PBS, SGE,
MSCC, or CUSTOM. If the underlying cluster used for the integration is one of the supported types
(LSF/PBS/SGE/MSCC), set the value of RSM_HPC_PARSE to the corresponding type. For these supported
types, RSM can extract the relevant information from the output of the command. For unsupported
types, set RSM_HPC_PARSE to CUSTOM and see Customizable Commands (p. 91) for what variables
must be set in each command.

RSM_HPC_PARSE_MARKER
Optional. Specifies a ‘marker’ string of an output line. The marker string is used to indicate the
line after which parsing should start.

RSM_HPC_PLATFORM
Optional. Specifies the cluster platform being used. Set this variable only if the cluster platform
is different than the machine from which the RSM job is submitted (for example, when the job is
submitted from a Windows client to a Linux cluster via client custom integration).
Example: RSM_HPC_PLATFORM=linx64

RSM_HPC_SCRATCH
Optional. Path for the cluster’s job scratch directory (i.e. solver files are stored in a local
directory on the Compute Server machine). Set this variable only if the path is different than the
one specified in the Working Directory property on the General tab of the Compute Server
Properties dialog.
Example: RSM_HPC_SCRATCH=/tmp
Note: Specifying a value of CUSTOM for RSM_HPC_SCRATCH instructs the code that is executed on
the cluster side not to create any scratch directories. The directory where the scripts are executed
is considered to be the scratch directory for the job.

RSM_HPC_STAGING
Optional. Path for the cluster’s central staging area for job files. Typically needed when client
and cluster platforms are different. Set this variable only if the path is different than the one
specified in the Shared Cluster Directory property on the Cluster tab of the Compute Server
Properties dialog. Set by both the administrator and RSM; Workbench/RSM modifies the original
administrator-entered path specified here so that a unique subdirectory is added to the end.
Example: RSM_HPC_STAGING=/staging

6.3.3.2. Environment Variables Set by RSM


RSM will set the following environment variables at runtime, communicating job-specific data to the
HPC commands. These variables will need to be used in your scripts to do the job handling.

RSM_HPC_CORES
The number of cores requested by the user for the job.

RSM_HPC_DISTRIBUTED
Indicates whether a distributed (multi-node) cluster job is allowed. Set to TRUE if the target
solver (specified in RSM_HPC_JOBTYPE) supports distributed execution. Set to FALSE if cores can
be used on only one node.

RSM_HPC_FILECONTEXT
Used only by the Transfer command/script. Specifies the context in which files are being
transferred in case any special handling is required. Possible values are CANCEL, INPUTS, INQUIRE,
and OUTPUTS.

RSM_HPC_FILEDIRECTION
Used only by the Transfer command/script. Specifies the direction of file transfers. Possible
values are UPLOAD (which moves files from the client to the cluster) or DOWNLOAD (which moves
files from the cluster to the client).

RSM_HPC_FILELIST
Used only by the Transfer command/script. Semi-colon delimited list of files to transfer for the
job submission or status request. Dynamically generated because the list can depend on the job
type or the specific UI action. May contain wildcards.

RSM_HPC_JOBID
Identifier for the cluster job returned by the successful Submit command. RSM sets this variable
so it is available to subsequent commands.

RSM_HPC_JOBTYPE
The solver being used for the job. Possible values are ANSYS, Addin_ANSYS, Addin_CFX,
Addin_FLUENT, Addin_POLYFLOW, AUTODYN, Contact, FrameworkUpdateDPs, and RBD. The job types with
the Addin_ prefix are jobs executed from within Workbench as part of the component update.
FrameworkUpdateDPs is the job type corresponding to the execution of the Workbench Update Design
Points operation. The other job types correspond to jobs submitted through RSM without Workbench
mediation.

RSM_HPC_NATIVEOPTIONS
Value(s) of the Job Submission Arguments property on the Cluster tab of the Compute Server
Properties dialog. Workbench/RSM does not define or manipulate these administrator-specified
options.

RSM_HPC_QUEUE
The queue requested by the user for the job. The list of available queues is defined by the
Workbench/RSM administrator.

RSM_HPC_STAGING
Path in the Shared Cluster Directory property on the Cluster tab of the Compute Server Properties
dialog. Set by both the administrator and RSM; Workbench/RSM modifies the original
administrator-entered path so that a unique subdirectory is added to the end.

RSM_HPC_STDERRFILE
A request that cluster job stderr be redirected into the named file. The contents of this file
will be added to the RSM job log.

RSM_HPC_STDOUTFILE
A request that cluster job stdout be redirected into the named file. The contents of this file
will be added to the RSM job log.

6.3.4. Providing Client Custom Information for Job Submission


When executing a job, you can provide custom information from the client side that allows you to
perform custom actions prior to the submission of a job to the Compute Server or cluster. Custom inform-
ation that you define on the RSM Client machine can be picked up by RSM and then passed to the
Compute Server or cluster machine where the job is being executed.

Note

For a custom client integration, the Compute Server is the Client machine; therefore, the in-
formation is made available to the custom scripts on the Client machine. In this case, the
environment variables are also passed to the cluster machine on the remote side.

Examples of custom information that can be provided to the cluster are:


• The username of the submitter (which, for instance, provides the ability to monitor jobs submitted by a
particular user for accounting purposes) and

• The license necessary to execute the job, which can be used to integrate with cluster resource management
to check ANSYS license availability before a job starts running.

For more information on how to integrate licensing with cluster software, please contact your cluster
administrator or ANSYS customer support.

As an example, we’ll pass the submitter’s username from the client to a PBS cluster.

The following sections detail the steps for providing custom information for job submissions to clusters.
6.3.4.1. Defining the Environment Variable on the Client
6.3.4.2. Passing the Environment Variable to the Compute Server
6.3.4.3. Verify the Custom Information on the Cluster

6.3.4.1. Defining the Environment Variable on the Client


First, you must define the information on the RSM Client machine by creating an environment variable.
The environment variable must begin with the prefix RSM_CLIENT_ in order for RSM to detect it and
pass the information from the Client machine to the Compute Server or cluster.

In this example, we define the environment variable RSM_CLIENT_USERNAME. The name
is arbitrary as long as it begins with the RSM_CLIENT_ prefix.

6.3.4.2. Passing the Environment Variable to the Compute Server


Once you’ve defined the environment variable on the RSM Client machine, this environment variable
will be passed along with other job files to the Compute Server or cluster machine. You can access this
environment variable value from your custom cluster job scripts. In our example, we will add the client
job user name as a new command line argument to the PBS qsub command defined in the commands file
RSM uses for PBS clusters, hpc_commands_PBS.xml (located in the [RSMInstall]\Config\xml
directory).

In the code sample below, you can see that the environment variable is added to the qsub command.
Note, also, that it is preceded by -A, which defines the account string associated with the job for the
PBS cluster.
<command name="submit">
<application>
<app>qsub</app>
</application>

<arguments>
<arg>
<value>-q %RSM_HPC_QUEUE%</value>
<env name="RSM_HPC_QUEUE">ANY_VALUE</env>
</arg>
<arg>
<value>-A %RSM_CLIENT_USERNAME%</value>
<env name="RSM_CLIENT_USERNAME">ANY_VALUE</env>
</arg>
<arg>
<value>-l select=%RSM_HPC_CORES%:ncpus=1:mpiprocs=1</value>
<env name="RSM_HPC_DISTRIBUTED">TRUE</env>
</arg>
<arg>
<value>-l select=1:ncpus=%RSM_HPC_CORES%:mpiprocs=%RSM_HPC_CORES%</value>
<env name="RSM_HPC_DISTRIBUTED">FALSE</env>
</arg>
<arg>
<value>%RSM_HPC_NATIVEOPTIONS% -V -o %RSM_HPC_STDOUTFILE% -e %RSM_HPC_STDERRFILE%</value>
</arg>
<arg>
<value>-- %RSM_HPC_COMMAND%</value>
<env name="RSM_HPC_USEWRAPPER">FALSE</env>
</arg>
<arg>
<value>%RSM_HPC_COMMAND%</value>
<env name="RSM_HPC_USEWRAPPER">TRUE</env>
</arg>
</arguments>
</command>

To view a sample of this file before the addition of custom information, see Modifying Cluster-Specific
HPC Commands File (p. 85).

6.3.4.3. Verify the Custom Information on the Cluster


To verify that the custom information has been successfully passed from the RSM Client to the cluster,
run a job that will call the script you’ve customized. The environment variable should show up in the
Reading environment variables… section of the RSM job log.
Reading environment variables...
RSM_CLIENT_USERNAME = myname

Since we added the environment variable to the qsub command in the PBS commands file, it will also
show up in the area of the job log indicating that the qsub command has been run.
qsub -q %RSM_HPC_QUEUE% -A %RSM_CLIENT_USERNAME% -l
select=1:ncpus=%RSM_HPC_CORES%:mpiprocs=%RSM_HPC_CORES% ...
qsub -q WB_pbsnat -A myname -l select=1:ncpus=1:mpiprocs=1 ...

Chapter 7: RSM Troubleshooting
This section contains troubleshooting tips for RSM.

Generating RSM Service Startup Scripts for Linux


The scripts for manually starting RSM services are usually generated during installation. In the event
that the scripts are not generated as part of the install or you’ve removed the generated scripts, you
can generate the scripts manually in either of the following ways:

• Generate scripts for all of the services by running rsmconfig (without command line options).

• Generate the script for a specific service by running generate_service_script. Specify the service
by using command line options, as shown below:
tools/linux> ./generate_service_script
Usage: generate_service_script -mgr|-svr|-xmlrpc
Options:
-mgr: Generate RSM Job Manager service script.
-svr: Generate RSM Compute Server service script.
-xmlrpc: Generate RSM XML-RPC Server service script.
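
For example, to generate only the Job Manager service script, you could run the command with the -mgr option (a sketch; the name and location of the generated script depend on your installation):
tools/linux> ./generate_service_script -mgr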

Configuring RSM for Mapped Drives and Network Shares for Windows
If RSM is used to solve local or remote jobs on mapped network drives, you may need to modify security
settings to allow code to execute from those drives because code libraries may be copied to working
directories within the project.

You can modify these security settings from the command line using the CasPol utility, located under
the .NET Framework installation.

• For a 32-bit machine: C:\Windows\Microsoft.NET\Framework\v2.0.50727

• For a 64-bit machine: C:\Windows\Microsoft.NET\Framework64\v2.0.50727

In the example below for a 32-bit machine, full trust is opened to files on a z:\ mapped drive to enable
software to run from that share:
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol.exe -q -machine -ag 1 -url "file://z:/*"
FullTrust -name "RSM Work Dir"

In the example below for a 64-bit machine, full trust is opened to files on a shared network drive to
enable software to run from that share:
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\CasPol.exe
-q -machine -ag 1 -url "file://fileserver/sharename/*"
FullTrust -name "Shared Drive Work Dir"
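
To confirm what was added, you can list the machine-level code groups with CasPol (a general CasPol option, shown here as a sketch rather than an RSM-specific requirement):
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol.exe -m -lg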

For more information on configuring RSM Clients and Compute Servers using a network installation,
please refer to Network Installation and Product Configuration.

Temporary Directory Permissions on Windows Clusters


Some applications executed through RSM (e.g. Fluent) require read/write access to the system temporary
directory on local compute nodes. The usual location of this directory is C:\WINDOWS\Temp. All users
should have read/write access to that directory on all nodes in the cluster to avoid job failure due to
temporary file creation issues.
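
As a sketch of one way to grant that access with the built-in icacls utility (run on each compute node from an elevated command prompt; your IT policy may call for a narrower permission set):
rem Grant the local Users group modify rights on the temporary directory and its contents
icacls C:\Windows\Temp /grant Users:(OI)(CI)M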

Firewall Issues
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM, as follows (an example using netsh follows the list):

• Add port 8150 for the Compute Server service (Ans.Rsm.SHHost.exe).

• Add port 9150 for the Manager service (Ans.Rsm.JMHost.exe).
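
As a sketch, the exceptions could be added from an elevated command prompt with the Windows netsh utility (the rule names are arbitrary; adjust the program paths to your RSM installation directory):
netsh advfirewall firewall add rule name="ANSYS RSM Compute Server" dir=in action=allow protocol=TCP localport=8150 program="C:\Program Files\ANSYS Inc\v150\RSM\bin\Ans.Rsm.SHHost.exe"
netsh advfirewall firewall add rule name="ANSYS RSM Manager" dir=in action=allow protocol=TCP localport=9150 program="C:\Program Files\ANSYS Inc\v150\RSM\bin\Ans.Rsm.JMHost.exe"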

Enabling or Disabling Microsoft User Account Control (UAC)


To enable or disable UAC:

1. Open Control Panel > User Accounts > Change User Account Control settings.

2. On the User Account Control settings dialog, use the slider to specify your UAC settings:

• Always Notify: UAC is fully enabled.

• Never Notify: UAC is disabled.

Note

Disabling UAC can cause security issues, so check with your IT department before changing
UAC settings.

Internet Protocol version 6 (IPv6) Issues


When running a cluster, you will receive an error if a connection to a remote Manager is not possible
because localhost has not been configured correctly on the Manager machine.

If you are not running a Microsoft HPC cluster, test this by opening a command prompt and running
the command ping localhost. If you get an error instead of the IP address:

1. Open the C:\Windows\System32\drivers\etc\hosts file.

2. Verify that localhost is not commented out (with a # sign in front of the entry). If localhost is commented
out, remove the # sign.

3. Comment out any IPv6 information that exists.

4. Save and close the file.
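
After these edits, the relevant portion of the hosts file might look like the following minimal sketch (your file may contain additional site-specific entries):
127.0.0.1       localhost
# ::1           localhost       (IPv6 entry commented out)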

Note

If you are running on a Microsoft HPC cluster with Network Address Translation (NAT) enabled,
Microsoft has confirmed this to be a NAT issue and is working on a resolution.

Multiple Network Interface Cards (NIC) Issues


When multiple NIC cards are used, RSM may require additional configuration to establish desired com-
munications between tiers (i.e., the RSM Client, Manager, and Compute Server machines).

The most likely scenario is that the issues originate with the Manager and/or Compute Server. First, try
configuring the Manager and/or Compute Server machine(s):

1. In a text editor, open the Ans.Rsm.JMHost.exe.config file (Manager) and/or Ans.Rsm.SHHost.exe.config file (Compute Server). These files are located in Program Files\ANSYS Inc\v150\RSM\bin.

2. To both files, add the machine’s IP address to the TCP channel configuration. Substitute the machine’s
correct IP address for the value of machineName. The correct IP address is the address seen in the
output of a “ping” from a remote machine to the Fully Qualified Domain Name (FQDN).
<channel ref="tcp" port="9150" secure="false" machineName="1.2.3.4">

3. Save and close both files.

4. Restart the following services: ANSYS JobManager Service V15.0 and ANSYS ScriptHost
Service V15.0.

• For Windows: On your Administrative Tools or Administrative Services page, open the Services
dialog. Restart the services by right-clicking on the service and selecting Restart.

• For Linux: Log into a Linux account with administrative privileges and ensure that Ans.Rsm.* processes
are not running. Open a terminal window in the [RSMInstall]/Config/tools/linux directory
and run the following command: ./rsmmanager restart

If the Manager and/or Compute Server does not resolve the problem, the RSM Client machine may
have multiple NICs and require additional configuration. For example, a virtual NIC used for a VPN
connection on an RSM Client machine can cause a conflict, even if not connected.

If configuring the Manager and/or Compute Server machines doesn’t work, configure the Multi-NIC RSM
Client machine:

1. Using a text editor, create a file named Ans.Rsm.ClientApi.dll.config in Program Files\ANSYS Inc\v150\RSM\bin. (If this file does not exist, RSM uses a default configuration.)

2. Copy and paste the text below into Ans.Rsm.ClientApi.dll.config:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<system.runtime.remoting>
<application>
<channels>
<channel ref="tcp" port="0" secure="true" machineName="ip_address">
<clientProviders>
<formatter ref="binary" typeFilterLevel="Full"/>
</clientProviders>
</channel>
</channels>
</application>
</system.runtime.remoting>
</configuration>

3. Replace ip_address with a valid IP address.

4. Save and close the file.

RSH Protocol Not Supported


The RSH protocol is not officially supported as of Release 14.5 and will be completely removed in future releases.
Windows Server 2008, Windows Vista, and Windows 7 do not include the RSH client.

SSH File Size Limitation


The PuTTY SSH/SCP client has file size limitations that RSM circumvents by splitting and joining very
large files (greater than 2 GB). The Windows Compute Server and the Linux machine may also have file
system limitations that are beyond the control of RSM. You must configure the Linux machine with
large file support, and the Windows file system must be NTFS in order to transfer files larger than ap-
proximately 2 GB. If any job output file is not successfully retrieved, all job output files are left on the
Linux machine. Consult the job log in the RSM Job Log view to learn the temporary directory name
used to store the job files. You can then manually retrieve the files from the temporary directory (using
Samba or a similar application) so the results can be loaded back into your ANSYS client application.

Appendix A. ANSYS Inc. Remote Solve Manager Setup Wizard
The ANSYS Remote Solve Manager Setup Wizard is designed to guide you through the process of setting
up and testing Remote Solve Manager (RSM). Once setup and testing are complete, you will be able
to use RSM to submit jobs from Workbench to be executed on remote machines or clusters.

The following sections contain detailed information on using the ANSYS Remote Solve Manager Setup
Wizard:
A.1. Overview of the RSM Setup Wizard
A.2. Prerequisites for the RSM Setup Wizard
A.3. Running the RSM Setup Wizard
A.4. Troubleshooting in the Wizard

A.1. Overview of the RSM Setup Wizard


The RSM Setup Wizard can help you to configure all the machines that will be part of your RSM Layout
(the actual physical configuration of machines to be used for initiating, queuing, and solving jobs). It
allows you to perform the following tasks:

• Automate the Workbench setup before starting RSM services for certain cluster scenarios. As part of
the optional auto-configuration process, the wizard performs the following setup tasks to ensure that
Workbench is available to each node in the cluster:

– If it does not already exist, create a share to the cluster head node Workbench installation directory.

– Run ANSYS Workbench configuration prerequisites.

Note

→ In order for the wizard to install prerequisites, UAC must be disabled on any cluster
node where prerequisites are missing and need to be installed.

→ Installed prerequisites include MS.NET Framework 4.0 Redistributable and MS VC++ 2010 Redistributable x64. (If needed, packages from previous versions, such as MS VC++ 2008 Redistributable x64, among others, could be included by editing the AutoConfig.AllPrereqInstaller entry in the Ans.Rsm.Wizard.exe.config file.)

→ Once you have completed the RSM setup, it is recommended that you reboot the
machine(s) on which MS.NET Framework 4.0 and/or MS VC++ 2010 Redistributable
x64 have been installed.

– Set up Workbench environment variables on each node in the cluster.

• Start RSM services locally or remotely for the Manager and Compute Server (i.e., both on the local
machine on which you are currently running the wizard and on remote machines).

It is best to perform these tasks from the Manager/Compute Server machine.

• Configure machines locally and/or remotely to serve as an RSM Client, the Manager, or a Compute
Server.

It is best to perform these tasks from the Manager/Compute Server machine.

• Integrate RSM with the following third-party job schedulers (without requiring job script customization):

– LSF (Windows and Linux)

– PBS (Linux only)

– Microsoft HPC

– SGE (UGE)

• Configure a cluster.

It is best to perform cluster configuration tasks from the machine that is the head node of the
cluster. When you indicate that you are configuring the cluster head node, the wizard will walk
you through the steps to configure it as both Manager and Compute Server and to configure
all the compute nodes in the cluster.

• Create a Project Directory, Working Directory, and where applicable, a Shared Cluster Directory for
the storage of project inputs, outputs, solver files, and results. Options for allowing the wizard to select
directory paths and to automatically configure the Working Directory and Shared Cluster Directory
are available. Automation of these steps helps to ensure consistency for your RSM setup.

• Define one or more Queues that will receive jobs from the Manager and send the jobs to one or
more Compute Servers.

• Create primary accounts or alternate accounts. Alternate accounts may be required to allow access
to all the Compute Servers to which jobs will be sent.

• Test the Compute Servers to ensure that your RSM configuration is working properly. When there
are issues, the wizard will attempt to diagnose and provide you with information on the problem. If
the wizard cannot diagnose the problem, it will offer suggestions for troubleshooting outside of the
wizard.

It is best to perform testing from the RSM Client machine. For details, see Step 3: Test Your RSM
Configuration (p. 109).

Note that there are a number of setup tasks that the RSM Setup Wizard cannot perform. The wizard
cannot:

• Start Compute Server or Manager services from a network installation. You must start services locally
on the Compute Server or Manager machine before running the wizard.

• Perform certain tasks without correct permissions. For details on necessary Windows and Linux per-
missions, see Prerequisites for the RSM Setup Wizard (p. 105).

• Detect file permissions issues in the Compute Server or Manager until the final step of the setup.

• Perform some cluster setup tasks and checks remotely from the Manager or Compute Server machine;
these tasks must be performed locally on each of the machines in the cluster.

• Create parallel environments (PEs), which are required for SGE (UGE) Clusters.

• Diagnose Test Compute Server configuration issues from a machine other than the RSM Client.

• Correct some connection problems, typically issues related to hardware, firewalls, IPv6, and multiple
NIC, etc. For details on these issues, see RSM Troubleshooting (p. 99).

A.2. Prerequisites for the RSM Setup Wizard


1. RSM must already be installed on all the machines to be included in the RSM Layout.

• For a machine that will serve as an RSM Client or a Compute Server (in any combination), the installation
of ANSYS Workbench, RSM, and client applications is required.

• For a machine that will serve solely as the Manager, the installation of RSM is required (so it can connect
with the RSM Client and Compute Server machines). However, if it will also serve as an RSM Client or
Compute Server, you must install ANSYS Workbench and client applications as well.

Note

• RSM and Workbench are both installed by default as product components to most ANSYS,
Inc. products. RSM can also be installed independently as a standalone package.

• For cluster configurations, when you configure the head node of the cluster as a Manager,
it will also be configured as a Compute Server. The compute nodes in the cluster will be
configured via the head node.

2. Before starting the wizard, exit Workbench and verify that no RSM jobs are running.

3. Different privileges are necessary for different parts of the setup process. Verify that you have the appro-
priate privileges for the setup tasks you will perform.

• For Windows, “administrative privileges” means that the user either has Windows administrative priv-
ileges on the Manager machine, launches the wizard via the right-click Run as administrator menu
option, or is added to the RSM Admins user group. For RSM Admins privileges, you must create the
RSM Admins user group and add users to it manually. For instructions, see RSM User Accounts and
Passwords (p. 45).

• For Linux, “administrative privileges” can be root or non-root. “Non-root administrative privileges”
means that the user is added to the rsmadmins user group. As a member of this group, you have
administrative, non-root permissions, which are necessary for certain parts of the setup. When a root
user starts RSM services, if the rsmadmins user group and rsmadmin account do not already exist, the
rsmadmins group is automatically created on the Manager machine and an rsmadmin account is
added to the group. This account can then be used to add additional users to the group.

For Linux, if the user prefers to start the non-daemon services from the RSM Setup Wizard (as
opposed to installing and starting the services as daemons with a root account), then a user account
from the rsmadmins user group must be used. Note that if the RSM services are not installed as
daemons, the rsmadmins user group is not automatically created. Therefore, in order to start
non-daemon services via the wizard, prior to running the wizard your IT department must:

– Create the rsmadmins user group manually

– Add the user(s) who will be running/starting non-daemon services to the rsmadmins group

• Starting RSM services.

– For Windows, you must have administrative privileges.

Note

To start RSM services when UAC is enabled on Windows 7, you must use the right-
click Run as administrator menu option to launch the wizard. For instructions on
enabling or disabling UAC, see RSM Troubleshooting (p. 99).

– For Linux, you must have either root user or rsmadmins (non-root administrative) privileges.

Note

If you start the services with an rsmadmins non-root user account, the service will
be run by that account in non-daemon mode. Root user privileges are required for
starting RSM services as daemons. If you start RSM services as daemons, any non-
daemon services will be killed.

• Configuring new or existing machines, queues, and accounts.

– For Windows, you must have administrative privileges.

– For Linux, you must have rsmadmins (non-root administrative) privileges. (You cannot perform this
step with root permissions.)

• To test the final RSM configuration, you must be logged in as a user who will be sending jobs from
the RSM client.

– For Windows, you can have either administrative or non-administrative privileges.

– For Linux, you can have either rsmadmin (non-root administrative) or non-administrative privileges.

4. In most cluster scenarios, client users (other than the user who set up the cluster) must cache their
password with the cluster prior to using the wizard for RSM configuration testing. The exceptions are as
follows:

• For MS HPC clusters, if you are logged in with administrative privileges, the wizard asks you to cache
and verify your password in order to use the wizard’s auto-configuration functionality.

• For LSF Windows clusters, password caching via the wizard has been disabled for security reasons.
You must cache your password with the LSF Windows cluster before logging into the head node and
starting the wizard.

5. If you are running an SGE (UGE) cluster, parallel environments (PEs) must have already been defined by
your cluster administrator. For more information, see Compute Server Properties Dialog: Cluster Tab (p. 63).

6. If you are running a Microsoft HPC cluster with multiple network interface cards (NIC), additional config-
urations are required to establish communications between the RSM Client and Compute Server machines.
For more information, see RSM Troubleshooting (p. 99).

A.3. Running the RSM Setup Wizard


This section divides running the RSM Setup Wizard into the following steps:
A.3.1. Step 1: Start RSM Services and Define RSM Privileges
A.3.2. Step 2: Configure RSM
A.3.3. Step 3: Test Your RSM Configuration

A.3.1. Step 1: Start RSM Services and Define RSM Privileges


In this part of the setup, you will start RSM services and define RSM administrative privileges for yourself
or other users.

Required Privileges

• For Windows, you must either have Windows administrative privileges on the Manager machine, have
RSM Admin privileges, or launch the wizard via the right-click Run as administrator menu option.

Note

To start RSM services when UAC is enabled on Windows 7, you must use the right-
click Run as administrator menu option to launch the wizard.

• For Linux, you must have either root user or rsmadmins (non-root administrative) privileges. (To start
RSM services as daemons, root user privileges are required. In some cases, these tasks may need to
be performed by a member of your IT department.)

1. Log into the machine that will serve as the Solve Manager. If you are configuring a cluster, this is the
head node of the cluster.

2. Launch the wizard:

• For Windows, select Start > All Programs > ANSYS 15.0 > Remote Solve Manager > RSM Setup
Wizard 15.0. Alternatively, you can navigate to the [RSMInstall]\bin directory and double-click
Ans.Rsm.Wizard.exe.

• For Linux, open a terminal window in the [RSMInstall]/Config/tools/linux directory and run rsmwizard.

Note

For a quick-start guide on using the wizard, see the Readme file. To access this file:

• For Windows: Select Start > All Programs > ANSYS 15.0 > Remote Solve Manager
> Readme - RSM Setup Wizard 15.0.

• For Linux: Navigate to the [RSMInstall]/Config/tools/linux directory and open rsm_wiz.pdf.

3. Specify if you are configuring the head node of a cluster.

• If yes, specify the cluster type.

• If yes and you are configuring a Windows (MS HPC or LSF) cluster, indicate whether you want to automate the setup to ensure
that Workbench is available to each node in the cluster.

Note

UAC must be disabled on any cluster node where ANSYS Workbench prerequisites are
missing and need to be installed.

• If no, verify prerequisites when prompted and then specify the service role(s) for which the local machine
is being configured.

4. Start RSM services on the local machine. If the necessary services haven’t already been started, the wizard
will start them when you click the Start Services button.

5. Provide RSM administrative privileges to users as necessary (example commands for both platforms are sketched after this list).

• For Windows, to provide users with RSM administrative privileges, you must manually create an RSM
Admins user group and add users to this group.

• For Linux, when the RSM services are started by running the wizard with root user privileges, if the
rsmadmins user group and an rsmadmin account do not already exist, the group is automatically
created on the Manager machine. An rsmadmin user account is created in the new user group. This
account has administrative, non-root privileges and can be used to perform RSM administrative and
configuration tasks via the wizard on Linux.

On Linux, to provide additional users with RSM administrative privileges, you must add them to
the rsmadmins user group.

6. If you are logged in with:

• Windows administrative or RSM Admin permissions, you can continue the RSM setup process via your
current wizard session.

• Linux root permissions, there are no further steps that you can perform with the wizard. All further
wizard configurations must be performed by a user with rsmadmin permissions. You can close the
wizard now via the exit button and log back in with rsmadmins permissions to continue the setup.
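
As a sketch of step 5, the groups described above could be created and populated from the command line as follows (run with administrative or root privileges; the user name someuser is illustrative, and on Linux the rsmadmins group only needs to be created manually if it does not already exist):
rem Windows: create the RSM Admins group and add a user to it
net localgroup "RSM Admins" /add
net localgroup "RSM Admins" someuser /add

# Linux: create the rsmadmins group (if needed) and add a user to it
groupadd rsmadmins
usermod -a -G rsmadmins someuser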

A.3.2. Step 2: Configure RSM


In this part of the setup, you will configure the Manager and Compute Server(s) in your RSM Layout. If
you are using a cluster, you will do the configurations that can be performed by the wizard. You will
also define queues and accounts.

Required Privileges

For Windows, you must have administrative permissions. For Linux, you must have rsmadmins (non-
root administrative) privileges.

Note

If you are on a Windows Manager and continuing your existing wizard session, you have
already performed the first three steps. Skip to step #4.

1. Log into the machine that will serve as the Manager. If you are configuring a cluster, this is the head
node of the cluster.

2. Launch the wizard as described in Step 1: Start RSM Services and Define RSM Privileges (p. 107).

3. Specify if you are configuring the head node on a cluster as described in Step 1: Start RSM Services and
Define RSM Privileges (p. 107).

4. Follow the steps provided by the wizard to perform the following setup tasks. The tasks will vary according
to your RSM Layout.

• Configure the Manager(s)

• Configure the Compute Server(s)

• Configure Queues

• Create Accounts

5. When the configuration is complete, exit the wizard.

A.3.3. Step 3: Test Your RSM Configuration


In this part of the setup, you will accomplish two tasks: you will specify the Manager to be used and
test the final RSM configuration before submitting jobs. You should perform these tasks by logging into
a machine that will serve as an RSM Client.

Note

Under certain circumstances, testing can also be performed from a Manager machine with
remote access to the RSM Client. However, testing from the Manager may prevent the wizard
from performing some of the setup tasks, such as those for cluster configuration.

Required Privileges

For Windows, you can have either administrative or non-administrative permissions. For Linux, you must
have non-root permissions.

1. Once the setup is finished, log into a machine that will be an RSM Client. You must log in under an account
that will be used to send jobs via RSM.

Note

In most cluster scenarios, client users (other than the user who set up the cluster) must
cache their password with the cluster prior to using the wizard for RSM configuration
testing. The exceptions are as follows:

• For MS HPC: If you are logged in with administrative privileges, the wizard asks you
to cache and verify your password in order to use the wizard’s auto-configuration
functionality.

• For LSF: For security reasons, password caching has been disabled for Windows LSF
clusters. You must cache your password with the Windows LSF cluster before logging
into the head node and starting the wizard.
For instructions on caching the password, see “Manually Running the Password Applica-
tion” in the Remote Solve Manager User's Guide.

2. Launch the wizard as described in Step 1: Start RSM Services and Define RSM Privileges (p. 107).

3. Follow the steps in the wizard as before, identifying your local machine as an RSM Client.

4. When you reach the Test Compute Servers step, select the Queue and Compute Servers to be tested
and click Start Test.

5. If the tests pass, you can exit the wizard. If the tests fail, click the Diagnose Failure button for information
on the reason for the failure.

• If the wizard specifies what caused the error, correct the problems identified in the error message and
retry the test.

• If the wizard is unable to identify the exact problem, it will suggest possible troubleshooting steps.
For details, see RSM Troubleshooting (p. 99).

A.4. Troubleshooting in the Wizard


This section contains information on the sorts of problems that the wizard can diagnose. The wizard
can potentially diagnose the following problems.

• Manager problems, such as:

– RSM services have not been started

– File send or compression errors

– Script or job errors

• Compute Server problems, such as:

– Account authentication issues

– Job code compilation or load failures

– Missing files

• Job Script problems, such as:

– AWP_ROOT environment variable undefined

– Remote command execution errors

– Command runtime exceptions

– Script class exceptions

– Shared directory path creation failure

– Project Directory or Working Directory creation or path issues

• Cluster-specific problems, such as:

– Invalid cluster type

– Unavailable cluster nodes

– AWP_ROOT environment variable undefined on execution node

– Queue issues (RSM queue does not exist on cluster, queue list unavailable)

– Execution node directory issues (failure to create Working Directory, failure to locate cluster shared
directory)

– Cluster control file reading errors

• SSH-specific problems, such as:

– Authentication failures (issues with public, private, or host keys)

– KEYPATH errors (environment variable undefined, KEYPATH file missing)

– Proxy machine name undefined

– Host nonexistent or unavailable

– Network error

• Client API problems, such as:

– File transfer exceptions (upload, download)

– File compression exceptions (compression, decompression)

– Manager Project Directory unshared

– Manager project file missing

For instructions on addressing problems that the wizard cannot diagnose, see RSM Troubleshoot-
ing (p. 99) and view the following entries:

• Firewall Issues (p. 100)

• Multiple Network Interface Cards (NIC) Issues (p. 101)

• Internet Protocol version 6 (IPv6) Issues (p. 100)

Appendix B. Integrating Windows with Linux using SSH/SCP
RSM supports using SSH/SCP (Secure Shell/Secure Copy) in custom job scripts. The built-in job scripts
for the RSM job submissions have been tested using the PuTTY SSH client (http://www.chiark.greenend.org.uk/~sgtatham/putty).

SSH/SCP is used for integrating a Windows Manager with a Linux Compute Server. The Manager and
the Compute Server proxy (the Compute Server defined on the General tab of the Compute Server
Properties dialog) are typically on the same Windows machine. The actual Compute Server is on a remote
Linux machine (defined on the SSH tab of the Compute Server Properties dialog). Jobs are sent via
the SSH/SCP protocol from the Windows Compute Server proxy to the actual Linux Compute Server for
processing.

Communications to the Compute Server can be configured either for a single Linux machine or for a
Linux cluster. Note that this section focuses primarily on setting up SSH for connection to a single remote
Linux Compute Server. If you are using SSH with a Linux LSF or PBS cluster, you can use the cluster
setup instructions contained in the Configure PuTTY SSH (p. 114) section of this appendix. Then, for
detailed instructions on configuring the LSF or PBS cluster Compute Server, refer to the cluster configuration
instructions in Appendix C (p. 121).

Note

SSH is not a recommended communication protocol and should be used only if it is required
by your IT policy. For ease of configuration and enhanced performance, native RSM is the
recommended communication protocol. Before proceeding with this configuration, see
Configuring RSM to Use a Remote Computing Mode for Linux (p. 12) and Configuring Native
Cross-Platform Communications (p. 12) for more information.

Before You Begin


These instructions assume the following:

• ANSYS Workbench and RSM have been installed on the Windows machine.

• RSM has been installed on both the Windows and Linux machines.

• PS, AWK, GREP, LS, and the ANSYS150 command must exist on the Linux machine.

• You are able to install and run ANSYS, Inc. products, including Licensing, on both Windows and Linux
systems. For information on product and licensing installations, go to the Downloads page of the ANSYS
Customer Portal. For further information about tutorials and documentation on the ANSYS Customer
Portal, go to http://support.ansys.com/docinfo.

SSH Job Limitations


File Size Limitation The PuTTY SSH/SCP client has file size limitations that RSM circumvents by
splitting and joining very large files (greater than 2 GB). The Windows Compute Server and the Linux
machine may also have file system limitations that are beyond the control of RSM. You must configure
the Linux machine with large file support, and the Windows file system must be NTFS in order to
transfer files larger than approximately 2 GB. If any job output file is not successfully retrieved, all job
output files are left on the Linux machine. Consult the job log in the RSM Job Log view to learn the
temporary directory name used to store the job files. You can then manually retrieve the files from the
temporary directory (using Samba or a similar application) so the results can be loaded back into your
ANSYS client application.

High Maximum Number of Jobs Value When you use SSH as the protocol to run RSM jobs and set a
high maximum number of jobs, some jobs could fail, providing a message such as “Server unexpectedly
closed network connection.” This happens because too many SSH calls are made simultaneously from
different jobs. In this case, you may need to reduce the maximum number of jobs that can be run
concurrently. To do so, go to the General tab of the Compute Server Properties dialog and lower the
value for the Maximum Number of Jobs field.

B.1. Configure PuTTY SSH


In order to send RSM jobs to a remote Linux machine using SSH, you must configure SSH to allow access
from a Windows machine. SSH configuration involves creating a cryptographic key on the Windows
Manager machine and placing public portions of the key on the Linux machine.

Note

SSH configuration must be completed by your IT administrator. This section provides instruc-
tions for a PuTTY SSH implementation. Other SSH implementations are possible, and your IT
administrator can determine which one is best for your site.

Download and install PuTTY.


Download and install PuTTY from the following location: http://www.chiark.greenend.org.uk/~sgtatham/
putty/download.html

If this link is invalid, perform a web search for "PuTTY".

Create a cryptographic key.


Create a cryptographic key using PuTTYGen (puttygen.exe) as follows:

1. On the PuTTY Key Generator dialog, click Generate.

2. Change the Key comment to include your machine name and Windows username.

3. Do not enter a key passphrase.

4. Save the private key file without a passphrase.

For example, <drive>:\Program Files\Putty\id_rsa.ppk.

If you use a passphrase, jobs will hang at a prompt waiting for you to enter the passphrase. Be sure to secure
the private key file using some other means. For example, if only you will be using the key, save it
to a location where only you and administrators have access to the file, such as the My Documents
folder. If multiple users share the same key, allow the owner full control, then create a group and
give only users in that group access to this file.

5. If your Linux cluster uses OpenSSH, convert the key to OpenSSH format by selecting Conversions > Export
OpenSSH key in the PuTTY Key Generator dialog.

6. Move the public portion of the key to the Linux machine. This requires you to edit the ~/.ssh/author-
ized_keys file on the Linux machine as follows:

• Open an SSH session to one of your cluster nodes, cd into ~/.ssh, and open the authorized_keys
file in your favorite editor (for example, VI or EMACS).

• Copy all the text from the box under Public key for pasting and paste it into ~/.ssh/author-
ized_keys. All of this text should be one line.

• If the authorized_keys file does not exist, create one. Alternately, paste it into a text file and move
that file to the Linux machine for editing.
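
If you saved the one-line public key text to a file (for example, id_rsa.pub) and copied that file to the Linux machine, the append and permission steps might look like the following sketch (assumes OpenSSH on the Linux side; the file name is illustrative):
mkdir -p ~/.ssh
cat id_rsa.pub >> ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys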

Modify system environment variables.


1. Open the Windows System Properties dialog.

2. On the Advanced tab, select Environment Variables. The Environment Variables dialog appears.

3. On the Environment Variables dialog, locate the Path variable in the System variables pane.

4. Select the Path variable and then click the Edit button. The Edit System Variable dialog appears.

5. Add the PuTTY install directory to the Variable value field (for example, C:\Program Files\putty)
and then click OK.

6. In the System variables pane, click the New button. The New System Variable dialog appears.

7. In the New System Variable dialog , create a new environment variable named KEYPATH with a value
containing the full path to the private key file (for example, <drive>:\Program
Files\Putty\id_rsa.ppk).

Use a user variable if the key file is used only by you. Use a system variable if other users are sharing
the key file. For example, if a Windows XP user has a key file in My Documents, the variable value
should be %USERPROFILE%\My Documents\id_rsa.ppk (this expands to <drive>:\Docu-
ments and Settings\<user>\My Documents\id_rsa.ppk).

8. Click OK.

9. Reboot the computer for environment changes to take effect.
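
Alternatively, the KEYPATH variable can be set from a command prompt with the setx utility (a sketch; setx creates a user variable by default, or a system variable with the /M switch from an elevated prompt, and the new value applies only to programs started afterwards):
setx KEYPATH "C:\Program Files\Putty\id_rsa.ppk"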

Perform an initial test of the configuration.


1. Run the following from the command prompt (quotes around %KEYPATH% are required):
plink -i "%KEYPATH%" unixlogin@unixmachinename pwd

2. When prompted by plink:

• If plink prompts you to store the key in cache, select Yes.

• If plink prompts you to trust the key, select Yes.

B.2. Add a Compute Server


Underneath the Manager node on the RSM tree view, right-click on the Compute Server node and
select Add. The Compute Server Properties dialog displays. See Adding a Compute Server (p. 55) for
more detailed information.

General Tab
The General tab is used to set the properties of the Windows Compute Server. On the General tab,
set properties as described below.

• For Display Name, enter a descriptive name for the Windows Compute Server.

• Set Machine Name to the network machine name for the Windows Compute Server. If the Manager and
Compute Server will be on the same Windows machine, enter localhost.

For example, the Manager and the Compute Server can be on the same machine, with Machine Name set to localhost.

• For Working Directory Location, select Automatically Determined to allow the system to determine
the location for the Working Directory; in this case, you do not need to enter a path and the property is
disabled. Alternatively, you can select User Specified to specify the location of the Working Directory
yourself.

• Enter the path to your Working Directory if you’ve opted to specify the location. If the location is determ-
ined by the system, this property is blank and disabled.

• Select Use SSH protocol for inter- and intra-node communications (Linux only) so that RSM and
solvers will use SSH for inter-node and intra-node communications for Linux machines. This setting applies
to all Linux Compute Servers.

Note

When ANSYS Fluent, ANSYS CFX, ANSYS Mechanical, and ANSYS Mechanical APDL are
configured to send solves to RSM, their solvers will use the same RSH/SSH settings as RSM.

See Compute Server Properties Dialog: General Tab (p. 57) for more detailed information on available
properties.

Cluster Tab
If you are using SSH to connect to a single remote Linux Compute Server, you do not need to fill out
the Cluster tab. You can skip this tab and go straight to the SSH tab.

If you are using SSH to connect to a Linux cluster, however, you must fill out the Cluster tab. For in-
structions, see Appendix C.

SSH Tab
The SSH tab is used to configure SSH communications between the Windows Compute Server (defined
on the General tab) and a remote Linux Compute Server (defined here).

On the SSH tab, set properties as described below.

• Select the Use SSH check box. This enables the rest of the properties on the tab.

• For Machine Name, enter the hostname or IP address of the Linux Compute Server.

• For the Linux Working Directory property:

– Enter the path for your Linux Working Directory.

→ If the File Management property on the Cluster tab is set to Use Execution Node Local Disk, set
the Linux Working Directory path to a local disk path (e.g. /tmp). The full RSM-generated path (e.g.
/tmp/abcdef.xyz) will exist on the machine specified on that tab, as well as the node(s) that the
cluster software selects to run the job.

→ If the File Management property is set to Reuse Shared Cluster Directory, the Linux
Working Directory path is populated with the path specified for Shared Cluster Directory on
the Cluster tab and cannot be edited. This is where the cluster job runs, as expected.

→ If the Windows and Linux account names are the same (for example, DOMAIN\testuser on Windows
and testuser on Linux), then no additional configuration is required. If the account names are different,
use the Linux Account field to enter the name of the account being used to log into the remote
Linux machine.

Note

This Linux account is an alternate account that allows you to send jobs from the
primary Windows account on the RSM Client and run them under the alternate account
on a remote Linux Compute Server. Both accounts are defined in the RSM Accounts
dialog. For more information, see RSM User Accounts and Passwords (p. 45).

– For the Linux Account property, enter the name of the account being used to log into the remote
Linux machine.

See Compute Server Properties Dialog: SSH Tab (p. 67) for more detailed information on available
properties.

Test the Compute Server configuration.


On the Linux Compute Server machine, ensure that the ANSYS Product environment variable
AWP_ROOT150 is set to the location of your ANSYS product installation. This is done by adding the
environment variable definition to your .cshrc (C shell) resource file or .bashrc (bash shell) resource
file.
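
For example, assuming the product is installed under /ansys_inc/v150 (adjust the path to match your installation), the definition would look like this:
# bash: add to ~/.bashrc
export AWP_ROOT150=/ansys_inc/v150

# C shell: add to ~/.cshrc
setenv AWP_ROOT150 /ansys_inc/v150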

To test the Compute Server configuration, right-click on the name of the Compute Server in the tree
view and select Test Server. This runs a test job using the settings provided. The Job Log view displays
a log message that shows if the test finished or failed. If the test finishes, you can successfully run jobs
on the Compute Server.

Appendix C. Integrating RSM with a Linux Platform LSF, PBS, or SGE
(UGE) Cluster
The following sections divide the setup and configuration process for integrating RSM with a Linux-
based Platform LSF (Load Sharing Facility), PBS (Portable Batch System), or SGE/UGE (Sun Grid Engine)
cluster into sequential parts. The sequential parts are followed by general integration details.

Before You Begin


These instructions assume the following:

• Both the Manager and Compute Server machines are set up on the network.

• An LSF, PBS, or SGE (UGE) cluster has been established and configured.

• You are not using the SSH protocol but instead are using native RSM mode. For information on native
RSM, see Configuring RSM to Use a Remote Computing Mode for Linux (p. 12).

Note

If you will be using SSH for Windows-Linux communications, see Appendix B for SSH setup
instructions. Then refer back to this appendix for instructions on configuring RSM to send
jobs to a Linux LSF, PBS, or SGE (UGE) cluster.

• You have the machine name of the LSF, PBS, or SGE (UGE) submission host.

• RSM has been installed on the LSF, PBS, or SGE (UGE) submission host.

• If you are using an SGE (UGE) cluster, parallel environments have already been defined by your cluster
administrator.

• You are able to install and run ANSYS, Inc. products, including Licensing, on both the Manager and
Compute Server machines. For information on product and licensing installations, go to the Downloads
page of the ANSYS Customer Portal. For further information about tutorials and documentation on the
ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

C.1. Add a Linux Submission Host as a Compute Server


In this step, we’ll add the Linux submission host as a Compute Server. Underneath the Manager node
on the RSM tree view, right-click on the Compute Server node and select Add. The Compute Server
Properties dialog displays. See Adding a Compute Server (p. 55) for more detailed information.

General Tab
On the General tab, set properties as described below.

• If both the Manager and Compute Server services will be on the submission host of the cluster, set Machine
Name to localhost. Otherwise, enter the network name of the submission host node that will run the
Compute Server.

For example, pbsclusternode1 could be the name of the submission host being defined as the
Compute Server.

• For Working Directory Location, select Automatically Determined to allow the system to determine
the location; in this case, you do not enter a Working Directory path. Alternatively, you can select User
Specified to specify the location of the Working Directory.

• The Working Directory property is blank and disabled if the Working Directory Location is Automatically
Determined. If the Working Directory Location is User Specified, enter the path to your Working Dir-
ectory; this directory must be shared and writeable for the entire cluster. (If the directory is not shared and
instead is a local directory, it must exist on each compute node in the job scheduler queue.)

• Select Use SSH protocol for inter- and intra-node communications (Linux only) so that RSM and
solvers will use SSH for inter-node and intra-node communications for Linux machines. This setting applies
to all Linux Compute Servers.

Note

When ANSYS Fluent, ANSYS CFX, ANSYS Mechanical, and ANSYS Mechanical APDL are
configured to send solves to RSM, their solvers will use the same RSH/SSH settings as RSM.

See Compute Server Properties Dialog: General Tab (p. 57) for more detailed information on available
properties.

Cluster Tab
On the Cluster tab, set properties as described below.

• Set Cluster Type to LSF, PBS, or SGE.

Note

SGE (UGE) clusters are not supported for Polyflow.

• Enter the path for your Shared Cluster Directory. This is the central file-staging directory.

• For the File Management property:

– Select Reuse Shared Cluster Directory if you want to store temporary solver files in the Shared Cluster
Directory.

When you select this option, the Shared Cluster Directory and the Working Directory are in the
same location. As such, when you select Reuse Shared Cluster Directory, the Shared Cluster Directory
path will be populated to the Working Directory Path property on the General tab, and the
Working Directory Location property on the General tab will be set to Automatically Determined.

Note

The Shared Cluster Directory is on the machine defined on the General tab. The RSM
job creates a temporary directory here. Mount this directory on all execution hosts so
that the LSF, PBS, or SGE (UGE) job has access.

– Select Use Execution Node Local Disk if you want to store temporary solver files locally on the cluster
execution nodes. When you select this option, the Shared Cluster Directory and the Working Directory
are in different locations, so the path you entered for the Working Directory property on the General
tab remains. The path specified for the Working Directory property is used to specify the local scratch
space on the execution nodes. The path must exist on all nodes.

• If you set Cluster Type to SGE, enter names for the predefined Shared Memory Parallel and Distributed
Parallel environments that will be used for parallel processing.

These fields default to pe_smp and pe_mpi. To use one of the default names, your cluster adminis-
trator must create a PE with the same name. The default PE names can also be edited to match the
names of your existing parallel environments.
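
As a sketch, your cluster administrator could inspect or create these parallel environments with the standard SGE qconf utility (the allocation rule and slot settings depend on your MPI configuration and are not shown here):
qconf -spl          # list the parallel environments defined on the cluster
qconf -sp pe_mpi    # show the configuration of an existing PE named pe_mpi
qconf -ap pe_mpi    # add a new PE named pe_mpi (opens the configuration in an editor)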

See Compute Server Properties Dialog: Cluster Tab (p. 63) for more detailed information on available
properties.

When you are finished entering values on the Cluster tab, click the OK button.

Note

Since you are not using the SSH protocol, you can skip the SSH tab. (The Use SSH check box is
deselected by default.)

Test the Compute Server configuration.


Test the configuration by right-clicking on the newly added Compute Server in the tree view and selecting
Test Server from the context menu.

When the Compute Server is part of a cluster:

• When the server test is performed from a compute server node under a Queue parent node, the name
of the parent queue will be used by default.

• For cluster types other than Microsoft HPC, you must have already defined a queue in order to perform
a server test from a compute server node under a Compute Servers parent node. If no queue is defined,
you will receive an error.

For both of these scenarios, you can define a cluster queue and specify that it is used for subsequent
server tests. To do so:

1. Right-click on the compute server node and select Properties.

2. In the Compute Server Properties dialog, open the Cluster tab.

3. In the Job Submission Arguments (optional) field, enter the following argument:
-q queuename

4. Click OK.

Note

If -q <queuename> is entered in the Job Submission Arguments (optional) field, this
queue name is always used, even when you submit a job or perform a server test from a
compute server node under a Queue parent node. In other words, the -q <queuename>
argument takes priority in specifying the cluster queue to be used.

C.2. Complete the Configuration


Create a queue.
To complete the configuration, create a new queue and add the Compute Server to it. The RSM queue
name must match the cluster queue name exactly (where the cluster queue name can be found by executing
the LSF bqueues command or the PBS qstat -Q command on the cluster head node). Jobs can now
be submitted to this queue and then forwarded to the cluster queue for scheduling. See Creating a
Queue (p. 53) for details.

Test the configuration.


Test the configuration by sending a job to RSM.

C.3. Additional Cluster Details


Adjusting the Maximum Number of Jobs
You can set the Max Running Jobs property on the General tab to the value appropriate to your
cluster. Note that the RSM job could be in a “Running” state, but LSF, PBS, or SGE (UGE) may not yet
be able to execute the job due to limited resources. Refer to the Job Log view to determine the job ID
and state.

Integration Details
RSM essentially forwards the job to the LSF, PBS, or SGE (UGE) job scheduler. This RSM job must build
and execute the job submission command of the scheduler you’ve selected in the Cluster Type drop-
down of the Cluster tab of the Compute Server Properties dialog. The RSM job does not perform
the solve itself; rather, it monitors the status of the job it has submitted to the job scheduler, performing
the actions listed below:

1. Reads the control file containing paths, inputs and outputs.

2. Makes temporary directories on all nodes assigned for the job.

3. Copies inputs to the Working Directory of the execution host.

4. Runs the command (for example, solver).

5. Copies outputs to the staging folder on the submission host.

6. Cleans up.

Appendix D. Integrating RSM with a Windows Platform LSF Cluster
The following sections divide the setup and configuration process for integrating RSM with a Windows-
based Platform LSF (Load Sharing Facility) cluster into sequential parts. The sequential parts are followed
by general integration details.

Before You Begin


These instructions assume the following:

• Both the Manager and Compute Server machines are set up on the network.

• An LSF cluster has been established and configured.

• You have administrative privileges on the LSF submission host of the cluster you are configuring. This is
a node of the cluster on which the bsub and lsrcp (requires the RES service) commands are available.

• You have the machine name of the LSF submission host.

• RSM has been installed on the LSF submission host.

• You are able to install and run ANSYS, Inc. products, including Licensing, on both the Manager and
Compute Server machines. For information on product and licensing installations, go to the Downloads
page of the ANSYS Customer Portal. For further information about tutorials and documentation on the
ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

Limitations
LSF clusters for Windows are not supported for standalone Fluent, standalone CFX, or Polyflow.

PBS clusters for Windows are not supported.

D.1. Add the LSF Submission Host as a Compute Server


Underneath the Manager node on the RSM tree view, right-click on the Compute Server node and
select Add. The Compute Server Properties dialog displays. See Adding a Compute Server (p. 55) for
more detailed information on properties available in the Compute Server Properties dialog.

General Tab
On the General tab, set properties as described below.

• If both the Manager and Compute Server services will be on the submission host of the cluster, set Machine
Name to localhost. Otherwise, enter the network machine name of the submission host node that will
run the Compute Server.

In the example below, LSFClusterNode1 is the name of the submission host being defined as the
Compute Server.


• For Working Directory Location, select Automatically Determined to allow the system to determine
the location; in this case, you do not enter a Working Directory path. Alternatively, you can select User
Specified to specify the location of the Working Directory.

• The Working Directory property is blank and disabled if the Working Directory Location is Auto-
matically Determined. If the Working Directory Location is User Specified, enter the path to your
Working Directory; this directory must be shared and writeable for the entire cluster.

See Compute Server Properties Dialog: General Tab (p. 57) for more detailed information on available
properties.

Cluster Tab
On the Cluster tab, set properties as described below.

• Set Cluster Type to LSF.

• Enter the path for your Shared Cluster Directory. This is the central file-staging directory. This directory
must be accessible by all execution nodes in the cluster.

• For the File Management property:

– Select Reuse Shared Cluster Directory if you want to store temporary solver files in the Shared Cluster
Directory.

When you select this option, the Shared Cluster Directory and the Working Directory are in the
same location. As such, when you select Shared Cluster Directory, the Shared Cluster Directory
path will be populated to the Working Directory Path property on the General tab. Also, the
Working Directory Location property on the General tab will be set to Automatically Determined.
See the image below.

Note

The Shared Cluster Directory is on the machine defined on the General tab. This directory
should be accessible by all execution nodes and must be specified by a UNC (Univer-
sal/Uniform Naming convention) path.

– Select Use Execution Node Local Disk if you want to store temporary solver files locally on the cluster
execution node. When you select this option, the Shared Cluster Directory and the Working Directory
are in different locations, so the path you entered for the Working Directory property on the General
tab remains. The path specified for the Working Directory property is used to specify the local scratch
space on the execution nodes. The path must exist on all nodes.
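
For illustration, a configuration that keeps temporary solver files on the execution nodes might use
values such as the following, where the share name RSMStaging and the folder C:\RSMTemp are
placeholders rather than defaults:
Shared Cluster Directory: \\LSFClusterNode1\RSMStaging
Working Directory (User Specified): C:\RSMTemp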

See Compute Server Properties Dialog: Cluster Tab (p. 63) for more detailed information on available
properties.

When you are finished entering values on the Cluster tab, click the OK button.

Note

Since you are not using the SSH protocol, you can skip the SSH tab. (The Use SSH check box is
deselected by default.)

Test the Compute Server configuration.


Test the configuration by right-clicking on the newly added Compute Server in tree view and selecting
Test Server from the right-click context menu.


D.2. Complete the Configuration


Create a queue.
To complete the configuration, create a new queue and add the Compute Server to it. The RSM queue
name must match the cluster queue name exactly (where the cluster queue name can be found by executing
the LSF bqueues command on the cluster head node). Jobs can now be submitted to this queue and
then forwarded to the cluster queue for scheduling. See Creating a Queue (p. 53) for details.

Test the configuration.


Test the configuration by sending a job to RSM.

Note

The first time RSM launches an LSF Windows cluster job, you may receive the following error:
CMD.EXE was started with the above path as the current directory.
UNC paths are not supported. Defaulting to Windows directory.

To resolve this issue, create a text file with the following contents and save it (for example, as
commandpromptUNC.reg):
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\Software\Microsoft\Command Processor]
"CompletionChar"=dword:00000009
"DefaultColor"=dword:00000000
"EnableExtensions"=dword:00000001
"DisableUNCCheck"=dword:00000001

Next, run the following command on the head node and all of the compute nodes in the
cluster:
regedit -s commandpromptUNC.reg

D.3. Additional Cluster Details


Adjusting the Maximum Number of Jobs
You can set the Max Running Jobs property on the General tab to a value appropriate for your
cluster. Note that the RSM job could be in a “Running” state, but LSF or PBS may not yet be able to
execute the job due to limited resources. Refer to the Progress Pane to determine the job ID and state.

Integration Details
RSM essentially forwards the job to the LSF job scheduler. The RSM job builds and executes the job
submission command of the scheduler you’ve selected in the Cluster Type drop-down of the Cluster
tab of the Compute Server Properties dialog. The RSM job does not perform the solution work itself;
rather, it monitors the status of the job it has submitted to LSF, performing the actions listed below:

1. Reads the control file containing paths, inputs and outputs.

2. Makes temporary directories on all nodes assigned for the job.

3. Copies inputs to the Working Directory of the execution host.


4. Runs the command (for example, solver).

5. Copies outputs to the staging folder on the submission host.

6. Cleans up.
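
Because RSM only monitors the LSF job it submits, you can also inspect that job directly with standard
LSF commands. For example, where 1234 is an illustrative LSF job ID taken from the RSM job log:
bjobs -l 1234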

Temporary Directory Permissions on Windows Clusters


Some applications executed through RSM (e.g. Fluent) require read/write access to the system temporary
directory on local compute nodes. The usual location of this directory is C:\WINDOWS\Temp. All users
should have read/write access to that directory on all nodes in the cluster to avoid job failure due to
temporary file creation issues.
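
One possible way for an administrator to grant this access is shown below; the command and the
Users group are illustrative, so adjust them to your site's security policy before use. It grants the local
Users group modify access to C:\Windows\Temp, inherited by files and subfolders:
icacls C:\Windows\Temp /grant Users:(OI)(CI)M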

Appendix E. Integrating RSM with a Microsoft HPC Cluster
The following sections divide the setup and configuration process for integrating RSM with a Windows-
based Microsoft HPC (High-Performance Computing) cluster into sequential parts. The sequential parts
are followed by additional information about working with an HPC cluster.

Before You Begin


These instructions assume the following:

• A Microsoft HPC cluster has been established and configured.

• You have administrative privileges on the head node of the HPC cluster you are configuring.

• You have the machine name of the HPC head node.

• You have already configured and verified communications between RSM and the HPC head node. See
the HPC installation tutorials on the Downloads page of the ANSYS Customer Portal. For further information
about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.

• RSM is installed on the HPC head node. This allows you to use both the Manager and Compute Server
(also known as ScriptHost) services, or just use the Compute Server service. If the latter is chosen, the
Manager runs on the RSM Client machine, or on a central, dedicated Manager machine.

• You are able to install and run ANSYS, Inc. products, including Licensing, on both Windows and Linux
systems. For information on product and licensing installations, go to the Downloads page of the ANSYS
Customer Portal. For further information about tutorials and documentation on the ANSYS Customer
Portal, go to http://support.ansys.com/docinfo.

E.1. Configure RSM on the HPC Head Node


1. In your RSM installation directory, navigate to C:\Program Files\ANSYS Inc\v150\RSM\bin.

2. Configure and start RSM services on the head node by running the following command from the command
prompt:
AnsConfigRSM.exe -mgr -svr

3. Set your RSM password. This is the password RSM will use to run jobs on the Compute Server.

4. Note that you need to update your RSM password when you update your password on the RSM client
machine.

For details, see Working with Account Passwords (p. 48).


E.2. Add the HPC Head Node as a Compute Server


Underneath the Manager node on the RSM tree view, right-click on the Compute Server node and
select Add. The Compute Server Properties dialog displays. See Adding a Compute Server (p. 55) for
more detailed information on properties available on the Compute Server Properties dialog.

General Tab
On the General tab, set properties as described below.

• Set Machine Name to localhost if both the Manager and Compute Server services will run on the head
node of the cluster. Otherwise, enter the network name of the head node machine that will run the
Compute Server.

In the example below, HPCHeadNode is the network name of the head node being defined as the
Compute Server.

• For Working Directory Location, select Automatically Determined to allow the system to determine
the location; in this case, you do not enter a Working Directory path. Alternatively, you can select User
Specified to specify the location of the Working Directory.

• The Working Directory property is blank and disabled if the Working Directory Location is Automatically
Determined. If the Working Directory Location is User Specified, enter the path to your Working Dir-
ectory; this directory must be shared and writeable for the entire cluster.

See Compute Server Properties Dialog: General Tab (p. 57) for more detailed information on available
properties.


Cluster Tab
On the Cluster tab, set properties as described below.

• Set the Cluster Type property to Windows HPC. (This selection enables the rest of the properties on the
tab and disables the SSH tab.)

• Enter the path for your Shared Cluster Directory. This is the central file-staging directory on the head
node and must be accessible by all nodes in the cluster.

• For the File Management property:

– Select Reuse Shared Cluster Directory if you want to store temporary solver files in the Shared Cluster
Directory. When you select this option, the Shared Cluster Directory and the Working Directory are in
the same location. As such, when you select Shared Cluster Directory, the Shared Cluster Directory
path will be populated to the Working Directory Path property on the General tab. Also, the Working
Directory Location property on the General tab will be set to Automatically Determined. See the
image below.

– Select Use Execution Node Local Disk if you want to store temporary solver files locally on the
Compute Server machine. When you select this option, the Shared Cluster Directory and the
Working Directory are in different locations, so the path you entered for the Working Directory
property on the General tab remains. The path specified for the Working Directory property is
used to specify the local scratch space on the execution nodes. The path must exist on all nodes.

Note

If you will be sending CFX jobs to a Microsoft HPC Compute Server, the Reuse Shared
Cluster Directory option will always be used, regardless of the File Management
property setting.


See Compute Server Properties Dialog: Cluster Tab (p. 63) for more detailed information on available
properties.

When you are finished entering values, click the OK button.

Test the Compute Server


Test the configuration by right-clicking on the newly added Compute Server in the tree view and selecting
Test Server from the right-click context menu.

E.3. Complete the Configuration


Configure a queue.
To complete the configuration, create a new queue and add the Compute Server to it. Jobs can now
be submitted to this queue and then forwarded to the Microsoft HPC cluster for scheduling. See Creating
a Queue (p. 53) for details.

Test the configuration.


Test the cluster configuration by submitting a job to RSM.

E.4. Additional HPC Details


Integration Details
RSM essentially forwards the job to a third-party job scheduler. The RSM job builds and executes
the job submission command of the scheduler you’ve selected in the Cluster Type drop-down of the
Cluster tab of the Compute Server Properties dialog. The RSM job does not perform the solution work
itself; rather, it monitors the status of the job it has submitted to HPC, performing the actions listed below:

1. Reads a control file containing paths, inputs, and outputs.

2. Makes temporary directories on all nodes assigned for the job.

3. Copies inputs to the Working Directory of the execution node.

4. Runs the command (for example, solver).

5. Copies the outputs to the staging folder on the head node.

6. Cleans up.

The number of CPUs/nodes allocated by Microsoft HPC is controlled by the job script implementation.
For example, the Mechanical application contains a Max number of utilized processors setting that
is passed along on the solver command line. The command line is parsed in the job script and this
information is passed on to Microsoft HPC. The number of requested CPUs is reported in the Progress Pane.
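
For illustration, the request that ultimately reaches Microsoft HPC is comparable to a job submit call
of the following general form, where the core count and solver command line are placeholders supplied
by the job script; you do not normally run this command yourself:
job submit /numcores:8 <solver command line>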

Passwords
RSM no longer requires users to manually cache their Windows password with Microsoft HPC. Each RSM
job runs the hpcutils.exe tool prior to submitting the job to the cluster. This tool programmatically
does the equivalent of cluscfg setcreds.


However, if you still see password-related error messages in the RSM log, such as "Failed to
cache password with HPC" or "Account password MUST be cached with MS Compute Cluster," verify
that the Microsoft HPC Pack 2008 Service Packs and Windows Server Service Packs have been installed
properly. If you have not installed the service packs, you may still need to run the cluscfg
setcreds command from the cluster head node to cache the HPC password.
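
If manual caching is required, run the following on the cluster head node and supply the account
credentials when prompted (the exact prompts may vary between HPC Pack versions):
cluscfg setcreds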

Temporary Directory Permissions on Windows Clusters


Some applications executed through RSM (e.g. Fluent) require read/write access to the system temporary
directory on local compute nodes. The usual location of this directory is C:\WINDOWS\Temp. All users
should have read/write access to that directory on all nodes in the cluster to avoid job failure due to
temporary file creation issues.

Mixed Domains
You can use RSM when the client computer and the cluster are in different domains. The assumption is
that the client computer and user account are on the corporate domain and the cluster has its own domain.
In this case, the cluster domain must be configured to have a ‘one-way trust’ with the corporate domain.
That is, the cluster domain trusts the corporate domain but not vice-versa. Corporate domain users
must be able to use cluster resources (log in as CORPORATE\user on a cluster node). If the cluster
administrator can add corporate domain accounts as cluster users, then this trust was likely configured
when the cluster domain was created.

Multiple Network Interface Cards


Cluster nodes, especially the head node, generally have multiple network interface cards (NIC) to facil-
itate separate public and private networks. When configuring the network topology for Microsoft HPC
with RSM, be sure to select either Compute nodes isolated on a private network or Compute nodes
isolated on private and application networks. Otherwise, client-server communication difficulties
may arise and additional manual configuration will be required. Refer to Configuring Computers with
Multiple Network Interface Cards (NIC) (p. 20) for configuration instructions.

Network Path Configuration


If the RSM working directory or ANSYS software installation is referenced using a UNC path specification
(e.g. \\nodename\path), refer to Network Installation and Product Configuration for special
considerations related to network drives. Note that both the working directory and the ANSYS software
installation must have “Full Trust” set on all compute nodes.

Troubleshooting
If RSM jobs submitted to a Microsoft HPC cluster are failing for unknown reasons, you can gain additional
diagnostic information by running the HPC Job Manager (supplied as part of the Microsoft HPC Pack),
selecting the failed job, and examining the output section of the job’s tasks.

Depending on the installed version of Microsoft HPC, registry modification may be required to enable
the execution of commands via UNC paths. Special configuration is required if the task shows the fol-
lowing error:
UNC paths are not supported. Defaulting to Windows directory.
Input Error: Can not find script file "C:\Windows\ClusterJob.py".


To resolve this issue, create a text file with the following contents and save it (for example, as
commandpromptUNC.reg):
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Command Processor]
"CompletionChar"=dword:00000009
"DefaultColor"=dword:00000000
"EnableExtensions"=dword:00000001
"DisableUNCCheck"=dword:00000001

Next, run the following command on the head node and all compute nodes in the cluster:
regedit -s commandpromptUNC.reg

The task of executing this on the compute nodes may be automated using the clusrun utility that is
part of the Microsoft HPC Pack installation.
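
For example, assuming the .reg file has been placed on a share that all nodes can read (the node group
name and path below are placeholders), the import could be pushed to the compute nodes with a single
command from the head node:
clusrun /nodegroup:ComputeNodes regedit -s \\HeadNode\Share\commandpromptUNC.reg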

Glossary
Abort The Abort command immediately terminates a running job. Jobs termin-
ated via this command have a Status of Canceled.

alternate account An alternate account is necessary if the remote Compute Server machine
does not recognize the primary account used on the RSM Client machine.
An alternate account allows you to send jobs from the primary account
on the RSM Client machine and run them on a remote Compute Server
under the alternate account.

client application A client application is the ANSYS application run on the local RSM Client
machine and is used to submit jobs to RSM, and then to solve those jobs
as managed by RSM. Examples include ANSYS Workbench, ANSYS Fluent,
ANSYS CFX, etc.

client-side integration A client-side integration is a custom integration scenario in which RSM
functionality is replaced by third-party scripts. Only a thin layer of the
RSM architecture is involved, in order to provide the APIs for execution
of the custom scripts, which are located on the client machine.

code template A code template is an XML file containing code files (for example, C#,
VB, JScript), references, and support files required by a job.

Compute Server A Compute Server is a machine on which jobs are run. In most cases, a
Compute Server is a remote machine, but it can also be your local ma-
chine ("localhost").

compression threshold The compression threshold is the file size above which files are compressed
before they are transferred. File compression reduces file sizes, so it is
useful for file transfers on slower networks.
custom cluster integration Custom cluster integration is the mechanism provided by RSM that
allows third parties to use custom scripts to perform the tasks needed to
integrate ANSYS Workbench with the cluster. Both client-side and server-
side customizations are possible.

daemon services Daemon services are scripts or programs that run persistently in the
background of the machine, and which are usually executed at startup.
RSM services are recommended to be installed as daemon services. Once
an RSM service is installed as a daemon, it is started automatically
without a reboot, and it will also be started automatically each time the
machine is rebooted.

execution node An execution node is a machine in a cluster that actually executes jobs
that have been submitted. Jobs are distributed from the cluster head
node/submission host to be run on available execution nodes.

head node The head node is the machine in a cluster that is configured as the
control center for communications between the Manager and the execu-
tion nodes in the cluster. Typically, it serves as the submission host; it
accepts jobs from the Manager (which, in some cases, may be installed
on the head node itself) and distributes them across the cluster for execution.

Interrupt The Interrupt command terminates a running job, but allows for clean-
up of running processes before termination. Jobs terminated via this
command have a Status of Finished.

job A job consists of a job template, a job script, and a processing task sub-
mitted from a client application such as ANSYS Workbench. An example
of a job is the update of a group of design points for an ANSYS Mechan-
ical simulation.

job log In the main RSM window, the job log displays the progress and log
messages for the job selected in the list view.

job script A job script is a component of an RSM job. It runs an instance of the
client application on the Compute Server(s) used to run the processing
task.

job template A job template is a component of an RSM job. It is an XML file that
specifies input and output files of the client application.

LSF IBM Platform Load Sharing Facility is a batch queuing system supported
by RSM.

native mode Native mode is the recommended cross-platform RSM configuration, in
which a Linux Compute Server has RSM installed and running locally so
that the SSH protocol isn’t needed to provide communications between
a Windows Compute Server and a Linux Compute Server.

non-root privileges Non-root privileges give the user a limited subset of administrative
privileges. With RSM, non-root privileges are conferred by an rsmadmin
account (i.e., membership in the rsmadmins user group). It is recommended
that non-root privileges are used for starting and running RSM services.

OS Copy OS Copy is a method of file transfer provided by RSM which allows for
full utilization of the network bandwidth and uses direct access to direct-
ories across machines.

parallel processing In parallel processing, jobs are executed on multiple CPU cores simul-
taneously.

parallel environment (PE) A parallel environment allows for parallel execution of jobs. By default,
RSM is configured to support Shared Memory Parallel and Distributed
Parallel environments for SGE clusters.

PBS Altair PBS Pro is a batch queuing system supported by RSM.

primary account A primary account is the main account that is used to access the RSM
Client machine. Typically, it is the account used with the client application
(ANSYS Workbench) on the RSM Client machine.

queue A queue is a list of one or more Compute Servers available to run jobs.
When a job is sent to a queue, the Manager selects an idle Compute
Server in the queue to execute the job.

root privileges Root privileges give the user administrative access to all commands and
files on a Linux system. It is recommended that root privileges are not
used for starting and running RSM services.

rsmadmin user account An rsmadmin user account is a Linux account with membership in the
rsmadmins user group; as such, the account has RSM administrative
privileges.

rsmadmins user group The rsmadmins user group is a Linux user group that confers adminis-
trative privileges for RSM.

RSM Admins group The RSM Admins group is a Windows user group that confers adminis-
trative privileges for RSM. Also refers to the privileges conferred on
members of this group (i.e., “RSM Admins privileges”).

RSM Client The RSM Client is the local machine from which RSM jobs are submitted
to a Compute Server. It runs both RSM and a client application such as
ANSYS Workbench.

scratch space Using scratch space is the practice of storing solver files in a local direct-
ory on the Compute Server machine. Recommended to optimize perform-
ance when there is a slow network connection between execution nodes
and the Shared Cluster Directory or when the solver used produces many
relatively large files.

serial processing In serial processing, jobs are executed on only one CPU core at a time.

server-side integration Server-side integration is a custom integration scenario in which RSM
is used in conjunction with a cluster (either supported or unsupported),
with the cluster head node typically configured as both Manager and
Compute Server. The cluster acts as a server with respect to the RSM
Client from which jobs are submitted.

SGE Sun Grid Engine is not technically supported by RSM because UGE is
the latest version, though many SGE installations will still work without
modification. See UGE.

Manager The Manager is the central RSM service that dispatches jobs to computing
resources. It contains a configuration of queues (lists of Compute Servers
available to run jobs). The Manager service can be run locally (on the
same machine as the RSM Client) or remotely (on a standalone remote
machine or as part of a cluster). For clusters, it is typically installed on
the head node.

SSH Secure Shell is a network protocol providing a secure channel for the
exchange of data between networked devices. RSM can use SSH for cross-
platform communications, but native mode is the recommended method.
See native mode.


submission host A submission host is the machine or cluster node to which the Manager
submits jobs. In most cluster scenarios, the Manager is installed on the
head node of a cluster; in this case, the submission host, head node, and
Manager are all the same machine.

UGE Univa Grid Engine is a batch queuing system supported by RSM.

Index

A
accounts, 45
    alternate, 47
    Linux with SSH, 50
    password, 48
    primary, 46
    set password manually, 49
adding a Compute Server, 55
administration, 51

C
client application
    defining, 1
    file handling, 4
    integration, 5
    integration with Workbench, 5
    supported solvers, 5
code template, 1, 73
Compute Server
    adding a Compute Server, 55
    Compute Server properties, 55
        Cluster tab, 63
        General tab, 57
        SSH tab, 67
    defining, 1
    file handling, 4
    remotely connecting to a Compute Server, 20
    startup scripts, 13
    testing, 70
configuration file, 28
configuring RSM
    Linux, 12
    multi-user machines, 19
    multiple network interface cards, 20
    remote computing, 12, 19
    starting Linux RSM services at boot time, 15
    Windows, 10
Configuring RSM
    RSM Setup Wizard, 103
configuring RSM Services, 10
custom architecture, 73
custom integration, 73

E
EKM Servers, 13
Explicit Dynamics systems, 18

F
file handling, 4
File Transfer, 21
    Network Files Systems, 26
    OS Copy, 22

I
installing RSM, 7
    service installation, 10
Installing RSM
    RSM Setup Wizard, 103
integrating
    using LSF or PBS, 121, 127
    using Microsoft HPC, 133
    using SSH/SCP, 113

J
job
    defining, 1
job template, 73

L
Linux
    configuration, 12
    Explicit Dynamics systems, 18
    native mode, 12
    remote computing, 12
    starting services at boot time, 15
Linux Path considerations, 18
LSF, 121, 127

M
Manager
    file handling, 4
    Manager properties, 54
    remotely connecting to a Manager, 20
    startup scripts, 13
mapped drives, 99
Microsoft HPC, 133
multi-user machines, 19

N
native mode, 12
Network File System, 4
Network Files Systems, 26

O
OS Copy Operation, 22
overview, 1

P
passwords, 45
    caching, 48
    caching manually, 49
PBS, 121, 127
primary, 46

Q
queue
    creating, 53
    defining, 1

R
remote computing
    configuration, 19
remotely connecting to a Compute Server, 20
remotely connecting to a Manager, 20
Remote Solve Manager Setup Wizard, 103
RSM Client
    defining, 1
    file handling, 4
RSM Solve Manager
    defining, 1
RSM user interface
    Accounts dialog, 41
    context menu, 42
    desktop alert window, 40
    Job Log, 31
    Job Log View, 38
    List View, 31, 36
    main window, 31
    Menu Bar, 31-32
    Notification Area icon, 42
    Options dialog, 40
    Status Bar, 31, 38
    system tray icon, 42
    Toolbar, 31, 33
    Tree view, 31, 34

S
SSH
    integrating Windows with Linux, 113
    job limitations, 113
    selecting a remote computing mode, 12
SSH/SCP configuration, 113
starting RSM services manually, 13
startup scripts
    Compute Server, 13
    EKM Servers, 13
    Manager, 13

T
terminology, 1
troubleshooting, 99

U
user interface, 31

W
Windows
    configuration, 10
    installation, 10
    integration with Linux using SSH/SCP, 113
    integration with Platform LSF or PBS Cluster, 121, 127
Wizard for RSM Setup, 103
workflows, 2