CONTENTSgc.nuaa.edu.cn/hangkong/doc/ziliao/MSC_PATRAN/MSC.Patran Analys… · 1.1 Purpose...


C O N T E N T S

MSC.Patran Analysis Manager User’s Guide

CHAPTER

1 Overview

■ Purpose, 2

■ Product Information, 3

■ What is Included with this Product?, 4

■ Integration with MSC.Patran, 5

■ How this Manual is Organized, 6

2 Getting Started

■ Quick Overview, 8

■ Enabling/Disabling the Analysis Manager, 10

■ MSC.Nastran Submittals, 11

■ ABAQUS Submittals, 13

■ MSC.Marc Submittals, 14

■ Generic Submittals, 15

■ The Main Form, 16
❑ UNIX Interface, 16
❑ Windows Interface, 17

- Window Pull-down Menus, 18
- Windows Icons, 18

■ Invoking the Analysis Manager Manually, 19

■ Files Created, 22

3 Submit

■ Introduction, 24

■ Selecting Files, 25

■ Where to Run Jobs, 26

■ Windows Submittal, 27

4 Configure

■ Introduction, 30

■ Disk Space, 31
❑ MSC.Nastran Disk Space, 31
❑ ABAQUS, MSC.Marc, and General Disk Space, 33


■ Memory, 35
❑ MSC.Nastran Memory, 35
❑ ABAQUS Memory, 37
❑ MSC.Marc and General Memory, 39

■ Mail, 41

■ Time, 42

■ General, 44

■ Restart, 48
❑ MSC.Nastran Restarts, 48
❑ MSC.Marc Restarts, 50
❑ ABAQUS Restarts, 51

■ Miscellaneous, 52
❑ MSC.Nastran Miscellaneous, 52
❑ MSC.Marc Miscellaneous, 53
❑ ABAQUS Miscellaneous, 54
❑ General Miscellaneous, 55

5 Monitor

■ Introduction, 62

■ Running Job, 63
❑ Windows Interface, 66

■ Completed Job, 68
❑ Windows Interface, 70

■ Host/Queue, 71
❑ Job Listing, 72
❑ Host Status, 73
❑ Queue Manager Log, 74
❑ Full Listing, 75
❑ CPU Loads, 76

6 Abort

■ Selecting a Job, 78

■ Aborting a Job, 79
❑ UNIX Interface, 79
❑ Windows Interface, 79

7 System Management

■ Directory Structure, 82

■ Analysis Manager Programs, 84
❑ Analysis Manager Program Startup Arguments, 86

- AbaMgr, NasMgr, MarMgr, GenMgr, 86
- JobMgr, 86
- RmtMgr, 86
- QueMgr (AdMgr), 87
- P3Mgr, 88
- TxtMgr, 89


❑ Analysis Manager Environment File, 91

■ Organization Environment Variables, 94

■ Installation, 99
❑ Installation Requirements, 99
❑ Installation Instructions, 100

■ X Resource Settings, 102

■ Configuration Management Interface, 104
❑ Modify Configuration Files, 107

- Applications, 109
- Physical Hosts, 113
- Analysis Manager Host Configurations, 116
- Disk Configuration, 119
- Queue Configuration, 121
- Groups (of hosts), 123

❑ Test Configuration, 125
- Application Test, 125
- Physical Hosts Test, 126
- AM Hosts Test, 128
- Disk (Filesystem) Test, 130
- Queue Test, 132

❑ Queue Manager, 134

■ Examples of Configuration Files, 136

■ Starting the Queue/Remote Managers, 144
❑ Starting Daemons at Boot Time, 146

A Error Messages

■ Error Messages, 152

- PCL Form Messages..., 152
- Windows..., 152
- Job Manager Daemon (JobMgr) Errors..., 153
- User Interface (P3Mgr) Errors..., 158
- Additional (submit) Errors..., 162
- Editing (p3edit) Errors..., 165
- RmtMgr Errors..., 167
- QueMgr Errors..., 169
- Administration (AdmMgr) Testing Messages..., 180

B Application Procedural Interface (API)

■ Analysis Manager API, 186
❑ Analysis Manager Application Procedural Interface (API) Description, 186

- Assumptions:, 186
- A Quick Background, 186
- General outline of the Analysis Manager API:, 187
- Structures, 191

■ Include File, 196

■ Example Interface, 213

INDEX

■ MSC.Patran Analysis Manager User’s Guide, 239


MSC.Patran Analysis Manager User’s Guide

CHAPTER

1 Overview

■ Purpose

■ Product Information

■ What is Included with this Product?

■ Integration with MSC.Patran

■ How this Manual is Organized


1.1 Purpose

MSC.Nastran, MSC.Marc, and MSC.Patran are analysis software systems developed and maintained by the MSC.Software Corporation. MSC.Nastran and MSC.Marc are advanced finite element analysis programs used mainly for analyzing complex structural and thermal engineering problems. The core of MSC.Patran is a finite element analysis pre/postprocessor. Several optional products are available with MSC.Patran, including advanced postprocessing, interfaces to third-party solvers, and application modules. This document describes the MSC.Patran Analysis Manager, one of these application modules.

The Analysis Manager provides interfaces within MSC.Patran to submit, monitor and manage analysis jobs on local and remote networked systems. It can also operate in a stand-alone mode directly with MSC.Nastran, MSC.Marc, ABAQUS, and other general purpose finite element solvers.

At many sites, engineers have several computing options. Users can choose from multiple platforms or various queues when jobs are submitted. In reality, the resources available to them are not equal. They differ based on the amount of disk space and memory available, system speed, cost of computing resources, and number of users. In networked environments, users frequently do their modeling on local workstations with the actual analysis performed on compute servers or other licensed workstations.

The MSC.Patran Analysis Manager automates the process of running analysis software, even on remote and dissimilar platforms. Files are automatically copied to where they are needed; the analysis is performed; pertinent information is relayed back to the user; files are returned or deleted when the analysis is complete even in heterogeneous computing environments. Time consuming system housekeeping tasks are reduced so that more time is available for productive engineering.

The Analysis Manager replaces text-oriented submission scripts with a Motif-based menu-driven interface (or windows native interface on Windows platforms), allowing the user to submit and control his job with point and click ease. No programming is required. Most users are able to productively use it after a short demonstration.


1.2 Product Information

The MSC.Patran Analysis Manager provides convenient and automatic submittal, monitoring, control and general management of analysis jobs to local or remote networked systems. Primary benefits of using the Analysis Manager are engineering productivity and efficient use of local and corporate network-wide computing resources for finite element analysis.

The Analysis Manager has its own scheduling capability. If commercially available queueing software, such as LSF (Load Sharing Facility) from Platform Computing Ltd. or NQS is available, then the Analysis Manager can be configured to work closely with it.

This release of the MSC.Patran Analysis Manager works explicitly with MSC.Nastran releases up to version 2004, MSC.Marc releases up to version 2003, and versions of ABAQUS up to 6.x. It also has a general capability which allows almost any analysis application to be supported in a generic way.

For more information on how to contact your local MSC representative see Technical Support (p. xi).


1.3 What is Included with this Product?

The MSC.Patran Analysis Manager product includes the following items:

1. Various executable programs, services or daemons for ALL supported computer platforms which usually reside in

$P3_HOME/p3manager_files/bin

where $P3_HOME is a variable indicating the <installation_directory>, the directory location of the MSC.Patran installation. The main executables are:

• P3Mgr (Graphical User Interface)

• QueMgr (Queue Manager)

• JobMgr (Job Manager)

• NasMgr (MSC.Nastran Manager)

• AbaMgr (ABAQUS Manager)

• MarMgr (MSC.Marc Manager)

• GenMgr (General Manager)

• RmtMgr (Remote Manager)

• AdmMgr (Admin Manager - Unix only - part of P3Mgr on Windows)

• TxtMgr (Text User Interface)

• Job_Viewer (Database Job Viewer - Unix only)

2. Template configuration files contained in $P3_HOME/p3manager_files/default/conf

These configuration files must be modified to fit each new computing environment and network. These and the above executables are described in System Management (Ch. 7).

3. Two empty working directories called

$P3_HOME/p3manager_files/default/log

$P3_HOME/p3manager_files/default/proj

which are necessary and are used during analysis execution to store various files.

4. This User’s Manual. An on-line version is provided to allow direct access to this information from within MSC.Patran.


1.4 Integration with MSC.Patran

The MSC.Patran Analysis Manager can function as a separately run program but is intended to be run directly from within MSC.Patran in a seamless manner when submitting analysis jobs. It is integrated with MSC.Patran such that engineers can submit, monitor and manage their analysis jobs directly from within the MSC.Patran graphical interface. It provides a user-friendly environment to submit analysis jobs, then monitor and control job execution graphically. It is a distributed, multiple-process application which runs in a heterogeneous network.

There are various modes in which the Analysis Manager can be invoked. Normally, a user will see a seamless integration between MSC.Patran and the Analysis Manager. Jobs can be submitted, monitored and aborted simply by setting the appropriate Action in pull down menus available from the Analysis application form in MSC.Patran. When a job is being monitored, the monitoring window or form can be put away and recalled at any time. The user can even quit MSC.Patran and the monitoring window will remain present until the user closes it.

The full user interface is also available from within MSC.Patran simply by pressing a button on the Analysis application form or from the Tools pull down menu on the main form. This gives access to change default settings, submit previously created input files, change the default computer host or queue in which to submit jobs, and many other options which are explained throughout this manual.

The MSC.Patran Analysis Manager can also be invoked from the system prompt. This mode of implementation gives the user maximum flexibility to manage analysis jobs.


1.5 How this Manual is Organized

This manual is organized into various chapters, each dealing with certain functions of the product. The manual includes the following chapter topics:

Overview (Ch. 1) provides general information and an overview of the features of MSC.Patran Analysis Manager.

Getting Started (Ch. 2) describes rules for analysis input decks, how to invoke MSC.Patran’s Analysis Manager and gives the details involved in setting up, submitting, monitoring, and aborting an analysis job directly from within MSC.Patran.

Submit (Ch. 3) describes the use of the job submittal capability from the MSC.Patran Analysis Manager user interface.

Configure (Ch. 4) describes how to configure various options such as memory, disk space, restarts, time of submittal, host or queue selection, and a number of other options.

Monitor (Ch. 5) describes the monitoring capability of jobs, completed jobs, and hosts or queues. The graphical monitoring window is also described in detail.

Abort (Ch. 6) describes how to abort running jobs.

System Management (Ch. 7) details system management tasks. The individual program executables are described, as well as the necessary configuration files, installation guidelines, and requirements. This chapter is mainly for the system administrator who must install and configure the Analysis Manager.

Error Messages (App. A) gives descriptions and solutions to error messages.


MSC.Patran Analysis Manager User’s Guide

CHAPTER

2 Getting Started

■ Quick Overview

■ Enabling/Disabling the Analysis Manager

■ MSC.Nastran Submittals

■ ABAQUS Submittals

■ MSC.Marc Submittals

■ Generic Submittals

■ The Main Form

■ Invoking the Analysis Manager Manually

■ Files Created


2.1 Quick Overview

Before MSC.Patran’s Analysis Manager can be used, it must be installed and configured by the system administrator. See System Management (Ch. 7) for more on the installation and set-up of the module.

In so doing, the system administrator starts the Analysis Manager’s queue manager (QueMgr) daemon or service, which is always running on a master system. The queue manager schedules all jobs submitted through the Analysis Manager. The master host is generally the system on which MSC.Patran or an analysis module was installed, but does not have to be.

The system administrator also starts another daemon (or service) that runs on all machines configured to run analyses, called the remote manager (RmtMgr). This daemon/service allows for proper communication and file transfer to/from these machines.

Users that already have analysis input decks prepared and are not using MSC.Patran may skip to The Main Form (p. 16) after reviewing the rules for input decks for the various submittal types in this Chapter.

When using MSC.Patran, in general, the user begins by setting the Analysis Preference to the appropriate analysis, such as MSC.Nastran, which is available from the Preferences pull down menu on the top menu bar.

[Figure: MSC.Patran main window with the Preferences pull-down menu open (Analysis..., Global..., Graphics..., Mouse..., Key Map..., Picking..., Report..., Geometry..., Finite Element..., Insight...) and the resulting Analysis Preference form: Analysis Code MSC.Nastran, Analysis Type Structural, Input File Suffix .bdf, Output File Suffix .op2.]


Once the Analysis Preference is set and a proper analysis model has been created in MSC.Patran, the user can submit the job. Generally, the submittal process takes place from the Analysis application form when the user presses the Apply button. The full interface with access to all features of MSC.Patran’s Analysis Manager is always available, regardless of the Preference setting, from the Tools pull down menu or from the Analysis Manager button on the Analysis form. The location of the submittal form is explained throughout this chapter for each supported analysis code.

[Figure: MSC.Patran main window with the Tools pull-down menu open (showing the Analysis Manager... pick) and the Analysis application form: Action Analyze, Object Entire Model, Method Full Run, Job Name myjob, with Translation Parameters..., Solution Type..., Subcase Create..., Subcase Select..., Restart Parameters..., an Available Jobs list, the Apply button, and the Analysis Manager... button.]

Analysis Manager...

Brings up the full interface with the current analysis job ready for submittal, but does not submit the job until this is done in the Analysis Manager interface itself (see The Main Form (p. 16)). This gives the user full access to the capabilities of the Analysis Manager, including the ability to submit already existing analysis decks, monitor running or completed jobs, monitor host or queue activity, abort jobs, or customize the individual’s setup. If this button does not appear on the form, the product is either not licensed, not properly installed, or the Action is not set to Analyze.

The appropriate Action is taken when this button is pressed, such as submitting the job automatically when the Analysis Manager is installed and configured.

With the Action set to Analyze and the Method set to either Full Run or Check Run, the analysis is submitted via the Analysis Manager when the Apply button is pressed (if properly licensed and installed).

The Analysis Manager keys off of this Job Name when the Apply button is pressed and the Action is set to either Monitor, Abort, or Analyze (with Method set to Full Run or Check Run). A job by this name is submitted, monitored, or aborted when the Apply button is pressed.


2.2 Enabling/Disabling the Analysis Manager

There may be times when it is not desirable or required to submit a job through the Analysis Manager. In such a case, the user can temporarily disable the Analysis Manager and make use of MSC.Patran’s generic submittal capability for each supported analysis code. Disabling the Analysis Manager does not change the user interface at all, i.e., the Analysis Manager button remains on the Analysis form. However, when the Apply button is pressed on the Analysis application form, the job will be submitted via MSC.Patran’s generic submit scripts.

To disable the Analysis Manager, type the following command in MSC.Patran’s command window and press the Return or Enter key:

analysis_manager.disable()

To enable the Analysis Manager after it has been disabled type this command:

analysis_manager.enable()

If a more permanent enabling or disabling of the Analysis Manager is required, the user may place these commands as necessary in a p3epilog.pcl file. This file is invoked at startup from the user’s local directory, home directory, or $P3_HOME, in that order, if found.
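For illustration only, the startup search order described above can be sketched as follows; the function name `find_p3epilog` is ours, not part of the product, and the actual lookup is performed internally by MSC.Patran:

```python
import os

def find_p3epilog(local_dir, home_dir, p3_home):
    """Return the path of the first p3epilog.pcl found, searching the
    local directory, then the home directory, then $P3_HOME -- the
    order in which the manual says MSC.Patran looks at startup."""
    for directory in (local_dir, home_dir, p3_home):
        candidate = os.path.join(directory, "p3epilog.pcl")
        if os.path.isfile(candidate):
            return candidate
    return None  # no p3epilog.pcl anywhere; startup proceeds without one
```

A consequence of this order is that a p3epilog.pcl in the local directory shadows one in the home directory or in $P3_HOME.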


2.3 MSC.Nastran Submittals

Any standard MSC.Nastran (up to version 2004) problem can be submitted using MSC.Patran’s Analysis Manager. This is accomplished from the Analysis form with the Analysis Preference set to MSC.Nastran.

The following rules apply to MSC.Nastran run-ready input files for submittal:

1. The BEGIN BULK and ENDDATA statements must be in the main input file (the one specified when submitting), not in INCLUDE files.

2. The filename may not have any '.' characters except for the extension. The filename must also begin with a letter (not a number).

Run-ready input files prepared by MSC.Patran follow these rules. Correct and proper analysis decks are created by following the instructions and guidelines as outlined in the MSC.Patran MSC.Nastran Preference Guide, Volume 1: Structural Analysis.
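As an illustration, the two rules above can be written as a small check; the helper name `is_valid_input_name` is ours, not part of the product, and it mirrors only the naming rules stated here without looking inside the file:

```python
def is_valid_input_name(filename):
    """Apply the stated MSC.Nastran input-file naming rules:
    no '.' characters other than the extension separator, and
    a filename that begins with a letter (not a number)."""
    if filename.count(".") > 1:    # at most one '.', for the extension
        return False
    base = filename.split(".")[0]  # name without the extension
    return bool(base) and base[0].isalpha()
```

For example, myjob.bdf passes, while 1job.bdf and my.job.bdf do not.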

To submit, monitor, and manage MSC.Nastran jobs from MSC.Patran using the Analysis Manager, make sure the Analysis Preference is set to MSC.Nastran. This is done from the Preferences menu on the main MSC.Patran form. The Analysis form appears when the Analysis toggle, located on the main MSC.Patran application switch, is chosen. Pressing the Apply button on the Analysis application form with the Action set to Analyze, Monitor, or Abort will cause the Analysis Manager to perform the desired action. A chapter is dedicated to each of these actions in the manual as well as one for custom configuration of MSC.Nastran submittals.

The Analysis Manager generates the MSC.Nastran File Management Section (FMS) of the input file automatically, unless the input file already contains the following advanced FMS cards:

INIT, DBLOCATE, ACQUIRE, DBCLEAN, DBFIX, DBLOAD, DBSETDEL, DBUNLOAD, EXPAND, RFINCLUDE, ENDJOB, ASSIGN USRSOU, ASSIGN USROBJ, ASSIGN OBJSCR, ASSIGN INPUTT2, ASSIGN INPUTT4

in which case the user is prompted whether or not to use the existing FMS as-is, or to have the Analysis Manager auto-generate the FMS, using what FMS is already present, with certain exceptions.

The question asked is:

This file contains advanced FMS statements. Do you want to bypass the MSC.Patran Analysis Manager auto-FMS capability?

Answer NO to auto-generate the FMS; answer YES to use the existing FMS as-is. Typically you would answer NO to this question unless you are fully aware of the FMS in the file.

Page 16: CONTENTSgc.nuaa.edu.cn/hangkong/doc/ziliao/MSC_PATRAN/MSC.Patran Analys… · 1.1 Purpose MSC.Nastran, MSC.Marc, and MSC.Patran are analysis software systems developed and maintained

With FMS automatically generated, each logical database is made up of multiple physical files, each with a maximum size of 2^31 bytes (2 GB, the typical maximum file size), up to the disk space currently free, or until the size limit requested in the Analysis Manager is met. Large problems requiring databases and scratch files larger than 2^31 bytes can, therefore, be run without the user having to add ANY FMS statements. This, however, requires that you do not bypass the auto-FMS capability.
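As a back-of-the-envelope illustration of this splitting, assuming a 2^31-byte (2 GB) per-file limit and ignoring free space and any size limit configured in the Analysis Manager (the helper name is ours):

```python
import math

MAX_PHYSICAL_FILE = 2 ** 31  # assumed 2 GB per-file limit

def physical_files_needed(database_bytes):
    """Minimum number of physical files a logical database of the
    given size must be split into under the per-file limit."""
    return max(1, math.ceil(database_bytes / MAX_PHYSICAL_FILE))
```

A 5 GB scratch database, for instance, needs at least three physical files under this limit.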

If multiple file systems have been defined, the Analysis Manager will generate FMS (provided the input file does not contain advanced FMS or the user wishes to use the Analysis Manager’s automatic FMS capability along with his advanced deck) so that the scratch and database files are split onto each file system defined, according to the free space available at run time. See Disk Space (p. 31) for more information.

Restarts are handled by the Analysis Manager in the following manner: the needed FMS is generated so that the restart run will succeed. If database files exist on the local machine, they are copied to the analysis machine prior to execution; otherwise, they are expected to already exist in the scratch areas. Any ASSIGN MASTER cards are changed or generated to ensure that MSC.Nastran locates pre-existing databases correctly. See Restart (p. 48) for more information.


2.4 ABAQUS Submittals

Any standard ABAQUS (up to version 6.x) problem can be submitted using MSC.Patran’s Analysis Manager. This is accomplished from the Analysis form with the Analysis Preference set to ABAQUS.

The following rules apply to ABAQUS run-ready input files for submittal:

1. The filename may not have any '.' characters except for the extension. The filename must begin with a letter (not a number).

2. The combined filename and path should not exceed 80 characters.

Run-ready input files prepared by MSC.Patran follow these rules. Correct and proper analysis decks are created by following the instructions and guidelines as outlined in the MSC.Patran ABAQUS Preference Guide.
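For illustration, both ABAQUS rules above can be checked together; this sketch (the helper name `check_abaqus_input` is ours, not part of the product) encodes only the two restrictions listed here:

```python
import os

def check_abaqus_input(full_path):
    """Apply the stated ABAQUS input-file rules: the filename has no
    '.' other than the extension separator and begins with a letter,
    and the combined path and filename stay within 80 characters."""
    name = os.path.basename(full_path)
    if name.count(".") > 1:               # only the extension dot allowed
        return False
    if not (name and name[0].isalpha()):  # must start with a letter
        return False
    return len(full_path) <= 80           # combined path + name limit
```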

To submit, monitor, and manage ABAQUS jobs from MSC.Patran using the Analysis Manager, make sure the Analysis Preference is set to ABAQUS. This is done from the Preferences menu on the main form. The Analysis form appears when the Analysis toggle, located on the MSC.Patran application switch, is chosen. Pressing the Apply button on the Analysis application form with the Action set to Analyze, Monitor, or Abort will cause the Analysis Manager to perform the desired action. A chapter is dedicated to each of these actions in the manual as well as one for custom configuration of ABAQUS submittals.

If multiple file systems have been defined, the Analysis Manager will generate aux_scratch and split_scratch parameters appropriately based on current free space among all file systems for the host on which the job is executing. See Disk Space (p. 31) for more information.

Restarts are handled by the Analysis Manager by optionally copying the restart (.res) file to the executing host first, then running ABAQUS with the oldjob keyword. See Restart (p. 48) for more information.


2.5 MSC.Marc Submittals

Any standard MSC.Marc (up to version 2003) problem can be submitted using MSC.Patran’s Analysis Manager. This is accomplished from the Analysis form with the Analysis Preference set to MSC.Marc.

The following rules apply to MSC.Marc run-ready input files for submittal:

1. The filename may not have any '.' characters except for the extension. The filename must begin with a letter (not a number).

Run-ready input files prepared by MSC.Patran follow these rules. Correct and proper analysis decks are created by following the instructions and guidelines as outlined in the MSC.Marc Preference Guide.

To submit, monitor, and manage MSC.Marc jobs from MSC.Patran using the Analysis Manager, make sure the Analysis Preference is set to MSC.Marc. This is done from the Preferences menu on the main form. The Analysis form appears when the Analysis toggle, located on the MSC.Patran application switch, is chosen. Pressing the Apply button on the Analysis application form with the Action set to Analyze, Monitor, or Abort will cause the Analysis Manager to perform the desired action. A chapter is dedicated to each of these actions in the manual as well as one for custom configuration of MSC.Marc submittals.

Multiple file systems are not supported with MSC.Marc submittals. See Disk Space (p. 31) for more information.

Restarts, user subroutines, externally referenced result (POST) and view factor files are handled by the Analysis Manager by optionally copying these files to the executing host first, then running MSC.Marc with the appropriate command arguments. See Restart (p. 48) for more information.


2.6 Generic Submittals

Aside from the explicitly supported analysis codes (MSC.Nastran, MSC.Marc, and ABAQUS), almost any analysis application can be submitted, monitored and managed using MSC.Patran’s Analysis Manager general analysis management capability. This is accomplished by selecting Analysis Manager from the Tools pull down menu on the main MSC.Patran form. This brings up the full Analysis Manager user interface, which is described in the next section, The Main Form (p. 16).

When the Analysis Manager is accessed in this manner, it keys off the current Analysis Preference. If the Preference is set to MSC.Nastran, MSC.Marc, or ABAQUS, the jobname and any restart information are passed from the current job, and the Analysis Manager is brought up ready to submit, monitor, or manage this job.

Any other Preference that is set must be configured correctly as described in Installation (p. 99) and is considered part of the general analysis management. The jobname from the Analysis form is passed to the Analysis Manager and the job submitted with the configured command line and arguments. (How to configure this information is given in Miscellaneous (p. 52) and Applications (p. 109).) If an analysis code is to be submitted, yet no Analysis Preference exists for this code, the Analysis Manager is brought up in its default mode and the user must then manually change the analysis application to be submitted via an option menu. This is explained in detail in the next section.

On submittal of a general analysis code, the job file is copied to the specified analysis computer, the analysis is run, and all resulting files from the submittal are copied back to the invoking computer and directory.


2.7 The Main Form

When MSC.Patran’s Analysis Manager is invoked either from the system prompt or via a button or a pull down menu from within MSC.Patran, the main Analysis Manager form appears as shown. There are two interfaces, one for UNIX and one for Windows platforms. Only the main form is shown here with brief explanations. Details are provided in subsequent chapters.

UNIX Interface

[Figure: The Analysis Manager main form on UNIX, titled MSC.Nastran, with Action set to Submit, Object set to Job, an MSC.Nastran Input File databox (myjob.dat) with Select File... and Edit Input File buttons, a Host Selection list (atf_ibm, atf_sun, atf_sgi, atf_dec, atf_hp), Group set to default, and Apply/Cancel buttons.]

Available Actions are Submit (Ch. 3), Configure (Ch. 4), Monitor (Ch. 5), and Abort (Ch. 6). Separate chapters are dedicated to each of these actions.


The Object is a Job for the Submit and Abort actions. When the Action is set to Configure, the Objects can be Memory (p. 35), Restart (p. 48), Disk Space (p. 31), Time (p. 42), General (p. 44), Miscellaneous (p. 52) and Mail (p. 41), depending on the analysis code. For Monitor, the objects can be set to Running Job (p. 63), Completed Job (p. 68), or Host/Queue (p. 71).

This is an organization of configured analysis codes and hosts or queues on which to submit jobs. Multiple groups can be defined with different applications. Each Group assumes that its own QueMgr daemon is active. Organization Environment Variables (Ch. 7) explains the advantage and necessity of creating different groups. At least one group must exist and is called Default by default. The Group can be changed to select a different analysis code or a different set or organization of codes and hosts/queues.

Brings up a simple editor window that allows minor editing of the input file. The environment variable P3_EDITOR can be used to change the default editor. The following keystrokes are available for editing with the default editor:

ctrl-s: search for a string

ctrl-n: repeat the search

ctrl-c: exit the editor

ctrl-<: go to the top of the file
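For example, a user who prefers a different editor could set the variable before starting the Analysis Manager (Bourne-shell syntax shown; the variable name is from the manual, the editor path is illustrative):

```shell
# Point P3_EDITOR at a preferred editor before launching the
# Analysis Manager; the path below is only an example.
P3_EDITOR=/usr/bin/vi
export P3_EDITOR
```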

The title displays the current analysis code set. In this case, MSC.Nastran.

Note: The rest of this form’s appearance varies depending on the Action that is set. Different databoxes, listboxes, or other items are displayed in accordance with the Action/Object menu settings. Each of these is discussed in the following chapters.



Windows Interface

The title displays the current analysis code set. In this case, MSC.Nastran.

Available Actions are Submit (Ch. 3), Monitor (Ch. 5), and Administration (System Management (Ch. 7)). Separate chapters are dedicated to each of these actions. Abort (Ch. 6) and Configure (Ch. 4) are part of these actions, as explained in each chapter.

An organization of configured analysis codes and hosts or queues on which to submit jobs. Multiple groups can be defined with different applications. Each Group assumes that its own QueMgr daemon or service is active. Organization Environment Variables (Ch. 7) explains the advantages and necessity of creating different groups. At least one group must exist; it is called Default by default. The Group can be changed to select a different analysis code or a different set or organization of codes and hosts/queues. This is done under the Queue pull-down menu.

Brings up a simple editor window that allows minor editing to be done on the input file. The default editor is Notepad. This can be changed under the Tools | Options menu pick.

Note: The rest of this form’s appearance varies depending on the Tab that is set. Different databoxes, listboxes, or other items are displayed in accordance with the Tree and/or Tab settings. Each of these is discussed in the following chapters.


Window Pull-down Menus

The following simple pull-down menus are available from the Windows interface:

Queue The main purpose of this pull-down menu is to allow a user to Exit the program, Print where appropriate, and Connect To... other queue manager daemons or services. User Settings can also be saved and read from this pull-down menu. For administrators, other items on this pull-down menu become available for configuring the Analysis Manager and for starting and stopping queue manager services. This is detailed in System Management (Ch. 7). These items in the Queue pull-down menu are only enabled when the Administration tree tab is accessed.

Edit Gives access to standard text Cut and Paste operations when applicable.

View This pull-down menu mainly allows the user to update the view when jobs are being run. The Refresh (F5) option graphically updates the window when in monitoring mode. The program also automatically refreshes the screen based on the Update Speed. All Jobs or only the current User Jobs can be shown if desired.

Tools The Options item under this menu allows the user to change the default editor used when viewing result files or input decks. The number of completed user jobs viewable from the interface is also set here.

Windows The main purpose of this pull-down menu is to hide or display the Status Bar and Output Window at the bottom of the window.

Help Not currently implemented in this release.

Windows Icons

These icons appear on the main form.

Folder The open folder icon is the same as the Connect To... option under the Queue pull-down menu, which allows you to connect to other queue manager daemons/services that may be running and accessible.

Save The diskette icon saves user settings.

Printer Prints when appropriate.

Paintbrush Refreshes the window when in monitoring mode.


2.8 Invoking the Analysis Manager Manually

If installed properly, MSC.Patran’s Analysis Manager can be invoked from the system prompt with the following arguments:

$P3_HOME/p3analysis_mgr arg1 arg2 arg3 arg4 [optional args] [args5-8]

where $P3_HOME is a variable indicating the <installation_directory>, the directory location of the MSC.Patran installation.

Each argument is described in the following table.

Argument Description

arg1 (start-up type) The program can be started in one of the following 8 modes (enter the number only):
1 - Start up the full interface. See The Main Form (p. 16). (default)
2 - Start up the Queue Monitor. See Monitor (Ch. 5).
3 - Abort the job now. See Abort (Ch. 6).
4 - Monitor a running job. See Monitor (Ch. 5).
5 - Monitor a completed job. See Monitor (Ch. 5).
6 - Submit the job. See Submit (Ch. 3).
7 - Submit in batch mode. (No user interface or messages appear.)
8 - Same as 7, but waits until the job is done and returns a status code: 0 = success, 1 = failure, 2 = abort.

arg2 (extension) The extension of the analysis input file (e.g., .dat, .bdf, .inp). (.dat is the default)

arg3 (jobname) The MSC.Patran jobname, as it appears in any jobname textbox (without the extension). (default = unknown)

arg4 (application type) The analysis application requested (enter the number only):
1 - MSC.Nastran (default)
2 - ABAQUS
3 - MSC.Marc
20 - General code #1
21 - General code #2
  through
29 - General code #10

optional args (MSC.Nastran) -coldstart coldstart_jobname
The -coldstart parameter followed by the cold start MSC.Nastran jobname indicates a restart job. Also see P3Mgr (p. 88).

optional args (ABAQUS) -runtype <0, 1 or 2> -restart oldjobname
The -runtype parameter followed by 0, 1, or 2 specifies whether the run is a full analysis, a restart, or a check run, respectively. The -restart parameter specifies the old jobname for a restart run.


If no arguments are provided, defaults are used (full interface (1), “.dat”, “unknown”, MSC.Nastran (1)).

The arguments listed in the table above are convenient when invoking the Analysis Manager from pre- and postprocessors such as MSC.Patran, which have access to the pertinent information and can pass it along in the arguments. It may, however, be more convenient for the user to define an alias so that the program always comes up in the same mode.

Here are some examples of invoking MSC.Patran’s Analysis Manager:

$P3_HOME/bin/p3analysis_mgr

or

$P3_HOME/p3manager_files/p3analysis_mgr 1 bdf myjob 1

or

$P3_HOME/p3manager_files/p3analysis_mgr 1 bdf myjob MSC.Nastran

This invokes MSC.Patran’s Analysis Manager by specifying the entire path name to the executable, where $P3_HOME is a variable containing the MSC.Patran installation directory. The first argument specifies that the entire user interface is brought up. The input file is called myjob.bdf, and the last argument specifies MSC.Nastran as the analysis code of preference.

Here is another example:

p3analysis_mgr 1 inp myjob 2 -runtype 1 -restart oldjob

or

p3analysis_mgr 1 inp myjob ABAQUS -runtype 1 -restart oldjob

This example invokes the Analysis Manager assuming the executable can be found in the user’s path. The first argument specifies that the entire user interface is brought up. The input file is called myjob.inp, the code of preference is ABAQUS, and the last two arguments indicate a restart analysis from a previous job called oldjob. Another example:

p3analysis_mgr 3 dat myjob 20

This example requests the termination of an analysis by the jobname of myjob with an input file called myjob.dat. The analysis code specified is a user defined application defined by the number 20 in the configuration files.

p3analysis_mgr 5 dat myjob 1

This example requests the completed monitor graph of an MSC.Nastran analysis by the jobname of myjob with an input file called myjob.dat.
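Start-up modes 7 and 8 lend themselves to scripting. Below is a minimal sketch of handling the status codes returned by mode 8 (the executable name and codes are from the table above; the reporting function is our own illustration):

```shell
# Interpret the status codes documented for start-up mode 8
# (0 = success, 1 = failure, 2 = abort).
report_status() {
  case "$1" in
    0) echo "job completed successfully" ;;
    1) echo "job failed" ;;
    2) echo "job was aborted" ;;
    *) echo "unknown status $1" ;;
  esac
}

# A real run would look like this (requires an installed Analysis Manager):
#   "$P3_HOME/p3analysis_mgr" 8 dat myjob 1
#   report_status $?
```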

The remaining arguments from the table above:

optional args (MSC.Marc) See P3Mgr (p. 88).

arg5 (x position) Optional. Specifies the X position, in inches, of the upper left corner of the MSC.Patran right-hand side interface. (UNIX only)

arg6 (y position) Optional. Specifies the Y position, in inches, of the upper left corner of the MSC.Patran right-hand side interface. (UNIX only)

arg7 (width) Optional. Width of the right-hand side interface in inches. (UNIX only)

arg8 (height) Optional. Height of the right-hand side interface in inches. (UNIX only)


If the user typically brings up only the full interface in stand-alone mode, it may be more convenient to define an alias and place it in a login file (.login, .cshrc), such as:

alias p3am 'p3analysis_mgr 1 dat unknown 1'

This way, all the user has to type is p3am to invoke the program.


2.9 Files Created

Aside from the files generated by the analysis codes themselves, MSC.Patran’s Analysis Manager also generates files, the contents of which are described in the following table.

Any or all of these files should be checked for error messages and codes if a job is not successful and it does not appear that the analysis itself is at fault for abnormal termination.

File Description

jobname.mon This file contains the final monitoring or status information from a submitted job. It can be replotted using the Monitor | Completed Job selection from the main form.

jobname.tml This is the analysis manager log file that gives the status of the analysis job and parameters that were used during execution.

jobname.submit This file contains the messages that would normally appear on the screen if the job were submitted interactively. When a silent submit is performed (batch submittal), this file is created. Interactive submittals will display all messages to a form on the screen.

jobname.stdout This file contains any messages that would normally go to the standard output (generally the screen) if the user had invoked the analysis code from the system prompt.

jobname.stderr This file will contain any messages from the analysis which are written to standard error. If no such messages are generated this file does not appear.
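When a job is unsuccessful, the files above can be searched for error messages in one pass; a small sketch (our own illustration, not part of the product — only the file names are from the table above):

```shell
# Scan the Analysis Manager output files for a given jobname for
# lines mentioning "error" (case-insensitive), printing the file
# name and line number with each match.
scan_job_files() {
  job="$1"
  for f in "$job.submit" "$job.stdout" "$job.stderr" "$job.tml"; do
    # jobname.stderr in particular may not exist if no messages
    # were written to standard error.
    [ -f "$f" ] && grep -in "error" "$f" /dev/null
  done
  return 0
}
```

For example, `scan_job_files myjob` prints any matching lines from myjob.submit, myjob.stdout, myjob.stderr, and myjob.tml.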


MSC.Patran Analysis Manager User’s Guide

CHAPTER

3 Submit

■ Introduction

■ Selecting Files

■ Where to Run Jobs

■ Windows Submittal


3.1 Introduction

The process of submitting a job requires the user to select the file and options desired. The job is submitted to the system and ultimately executes MSC.Nastran, ABAQUS, MSC.Marc, or some other application module. MSC.Patran’s Analysis Manager properly handles all necessary files and provides monitoring capability to the user during and after job execution. See Monitor (Ch. 5) for more information on monitoring jobs.

In MSC.Patran, jobs are submitted in one of two ways: through the Analysis application form for the particular Analysis Preference, or outside of MSC.Patran through MSC.Patran’s Analysis Manager user interface with the Action (or tree tab in the Windows interface) set to Submit. Submitting through the Analysis form in MSC.Patran makes the submittal process transparent to the user and is explained in Getting Started (Ch. 2).

For more flexibility, the full user interface can be invoked from the system prompt, as explained in the previous chapter, or from within MSC.Patran by pressing the Analysis Manager button on the Analysis application form or by invoking it from the Tools pull-down menu. This gives access to more advanced and flexible features, such as submitting existing input decks from different directories, changing groups or organizations (queue manager daemons/services), selecting different hosts or queues, and configuring analysis-specific items. The rest of this chapter explains these capabilities.

Below is the UNIX submittal form (see Windows Submittal (p. 27) for the Windows interface).

[Screenshot: UNIX submittal form — Action: Submit, Object: Job, Group: default, MSC.Nastran Input File myjob.dat, Host Selection list (atf_sun4, atf_sgi, atf_dec, atf_suns), Edit Input File, Select File..., Apply, and Cancel controls]

Action set to Submit allows submittal of the indicated analysis input file. The Object is always set to Job.

Currently set analysis code, analysis input file or jobname, and Host or Queue selection. You can also edit the file before submittal if necessary.

Once everything is set properly, submit the job by pressing the Apply button. An information button appears and displays the following messages as they occur:

1. Initializing
2. Sending Request to Queue Manager
3. Waiting for Queue Manager to Connect
4. Sending Queue Manager Input Parameters
5. Job jobname Submitted... Job #<#> Assigned

If an error occurs or these messages do not appear, check for proper installation.

The Cancel button immediately exits the program and does nothing with any information or parameters that may have been selected up to that point.


The Group menu allows selecting a Queue (a group of hosts and analysis codes) to submit to. The currently selected analysis code shows up in blue under the active queue. Use this menu and its submenus to change the analysis code to be used. Having multiple Groups means you have more than one queue manager daemon or service running, each with its own configuration files (multiple master hosts).


3.2 Selecting Files

The filename of the job that is currently open appears in a textbox on the form shown on the previous page. If this is not the job to be submitted, press the Select File button and a file browser appears.

Below is the UNIX file browser form (see Windows Submittal (p. 27) for the Windows interface).

All appropriate files in the selected directory are displayed in the file browser. Select the file to be run from those listed in the file browser or change the directory path in the Filter databox and then press the Filter button to re-display the files in the new directory indicated. An asterisk (*) serves as a wild card.

Select OK once the file is properly selected and displayed, or double-click on the selected file.

Note: The directory in the Filter databox indicates where the input file will be copied from upon submission AND where the results files from the analysis will be copied to upon completion. Any existing results files of the same names will be overwritten on completion and you must have write privileges to the specified directory.

[Screenshot: UNIX file browser, “Choose MSC.Nastran Input File” — Filter databox (/okinawa/users/smith/*.dat), Directories and Files lists, Selection databox, and OK / Filter / Cancel buttons]


3.3 Where to Run Jobs

A default host system or queue is provided. However, a different host system or queue may be selected using the Host/Queue list on the form. Select the host system or queue where the job is to execute.

If MSC.Patran Analysis Manager is to schedule the job, then select the host where the job will run. Make the choice by clicking the toggle to the left of the appropriate host name.

If another scheduling software system (e.g., LSF or NQS) is enabled, then select the queue to submit the job to. The queueing software executes each job on the host it selects.

Below is the UNIX interface (see Windows Submittal (p. 27) for the Windows interface).

The submit function can also be invoked manually from the system prompt. See Invoking the Analysis Manager Manually (p. 19) for details. It can be invoked in both an interactive and a batch mode.

Note: Often, the user will look into the Host/Queue listing window described in Host/Queue (Ch. 5) to see which host/queue is most appropriate (free or empty) before selecting from the list and submitting. When submitting to an LSF/NQS queue, the host is selected automatically; however, you can select a particular host with the Choose Specific Host button (not shown) if desired.

[Screenshot: UNIX submittal form — Action: Submit, Object: Job, Group: default, MSC.Nastran Input File myjob.dat, Host Selection list (atf_sun4, atf_sgi, atf_dec, Cray (group)), Edit Input File, Select File..., Apply, and Cancel controls]

The hosts or queues that appear in this listbox are determined during installation and the subsequent configuration of MSC.Patran Analysis Manager by the system administrator. System Management (Ch. 7) gives specific details on this procedure.

It is possible to change applications and/or change Queues (groups of hosts and applications) if necessary before submitting a job. This can be done in two different ways: either by changing the group and application from the Group option menu, or by selecting the group name from the list of available hosts or queues. If an apparent host in the Host Selection listbox has the word (group) in parentheses, this host is actually another organizational group of either hosts or queues that will accept job submissions. In this example the “Cray” host is actually a group of NQS queues on a Cray machine that accept MSC.Nastran submittals, and would show up as such if the “Cray” host is selected.



3.4 Windows Submittal

The interface on Windows platforms is quite different in appearance from that for UNIX, but the process is almost identical. Submitting through this interface is simple; follow these steps:

1. Select the Submit tree tab or Job Control tab under Submit.

2. Select the Application from the Application pull-down menu.

3. Enter the Jobname by using the Select File button. This brings up a standard Windows file browser to allow you to select the input file.

4. Select the host or queue to which the job should be submitted under the Submit To pull-down menu.

5. Change the time that you wish the job to run if required, otherwise the default is to submit immediately. You can also put a limit on the maximum time the job should run (zero indicates that no limit is imposed).

6. Press the Submit button to submit the job. Appropriate messages appear in the command window at the bottom of the form where the Output tab appears.



Once a file is selected, you can edit the file if necessary before submitting it. This is done by pressing the Edit File button. By default the Notepad application is used as the editor. The default editor can be changed under the Tools | Options menu pick as shown below.


MSC.Patran Analysis Manager User’s Guide

CHAPTER

4 Configure

■ Introduction

■ Disk Space

■ Memory

■ Mail

■ Time

■ General

■ Restart

■ Miscellaneous


4.1 Introduction

By setting the Action to Configure on the main MSC.Patran Analysis Manager form, the user has control over a variety of options that affect job submittal. The user can customize the submitting environment by setting any of the parameters discussed in this chapter. These parameters can be saved so that all subsequent submittals use the new settings, or they can be set for a single submittal only. All of this is at the control of the user.

Note: The UNIX interface is shown above. In subsequent sections, both the UNIX and the Windows interfaces are shown. Saving settings is done under the Queue pull-down menu in the Windows interface, as shown in subsequent sections. There is no concept of a Configure action in the Windows interface; all items in this chapter are found under the Submit tree and tabs in the Windows interface.

Action set to Configure allows user-defined parameters to be set for items such as Memory (p. 35), Disk Space (p. 31), Time (p. 42), Mail (p. 41), Restart (p. 48), Miscellaneous (p. 52), and other General (p. 44) items.

What appears here depends on the Object set in the option menu above. All of these Objects and their respective forms are explained in this chapter.

Once everything is as the user wants it, press the Save button. A confirmation is requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are desired for only one submittal, do not press the Save button; instead, press Apply, change the Action back to Submit, and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session; they are lost when the program is exited.

The Reset button changes settings to the system default values.

The Cancel button immediately exits the program and does nothing with any information or parameters that may have been selected up to that point.



4.2 Disk Space

The Disk Space configuration is analysis-code specific.

MSC.Nastran Disk Space

After selecting the Disk Space option on the Object menu, the following Disk Space form appears.

Note: MSC.Patran’s Analysis Manager will only check for sufficient disk space if the numbers for DBALL, MASTER, and SCRATCH are provided. An error message will appear if not enough disk space is available. If these values are not specified the job will be submitted and will run until completion or the disk is full and an error occurs.
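The check described in this note amounts to comparing the requested file sizes against the available space; a rough sketch of that logic (our own illustration, not the actual implementation — all numbers in MB):

```shell
# Refuse a submittal when the requested MASTER + DBALL + SCRATCH
# space exceeds the available disk space, as the note describes.
# Usage: check_disk <master_mb> <dball_mb> <scratch_mb> <avail_mb>
check_disk() {
  master="$1"; dball="$2"; scratch="$3"; avail="$4"
  needed=$((master + dball + scratch))
  if [ "$needed" -gt "$avail" ]; then
    echo "error: need ${needed} MB, only ${avail} MB available"
    return 1
  fi
  echo "ok: ${needed} MB of ${avail} MB"
}
```

Recall that when any of the three values is unspecified (zero), no such check is performed and the job runs until completion or the disk fills.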

[Screenshot: UNIX Disk Space form — Action: Configure, Object: Disk, Group: default; MASTER, DBALL, and SCRATCH databoxes (enter “0” for all available disk) and a Disk Space Units menu (MB)]

Use the Disk Space form to set the maximum disk space accessible by MSC.Nastran during executions. There is one disk space setting each for the MASTER, DBALL, and SCRATCH files, which are applied to the corresponding parameters in MSC.Nastran FMS statements.

If the values are left at or set to “0”, the limits used will be the limits of the file system space defined to work with MSC.Patran Analysis Manager, as specified in the disk.cfg file. The Disk Space Units menu permits specifying sizes in kilobytes (KB), megabytes (MB), or gigabytes (GB).

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; also see General (p. 44).


The Windows interface for MSC.Nastran disk space is shown below.

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are desired for only one submittal, do not save the settings; instead, simply change back to the Job Control tab and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session; they are lost when the program is exited.

Use the Resources tab to set the maximum disk space size accessible by MSC.Nastran during executions. There is one disk space setting for MASTER, DBALL, and SCRATCH files which are applied to the corresponding parameters in MSC.Nastran FMS statements.

If the values are left at or set to “0”, the limits used will be the limits of the file system space defined to work with MSC.Patran’s Analysis Manager, as specified in the disk.cfg file. The Disk Units menu permits specifying sizes in kilobytes (KB), megabytes (MB), or gigabytes (GB).


ABAQUS, MSC.Marc, and General Disk Space

After selecting the Disk Space option on the Object menu, the following Disk Space form appears.

[Screenshot: UNIX Disk Space form for ABAQUS — a single Space databox (enter “0” for all available disk) and a Disk Space Units menu (MB)]

Use the Disk Space form to set the maximum disk space size accessible by ABAQUS, MSC.Marc, or other user defined analysis code during executions submitted by MSC.Patran’s Analysis Manager. There is one disk space setting for all files created during a run.

If the value is left at or set to “0,” the limits used will be the limits of the file system space defined to work with MSC.Patran’s Analysis Manager, as specified in the disk.cfg file. The Disk Space Units menu permits specifying sizes in kilobytes (KB), megabytes (MB), or gigabytes (GB).

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; also see General (p. 44).


The Windows interface for ABAQUS, MSC.Marc, or other user defined analysis disk space requirements is shown below.

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are desired for only one submittal, do not save the settings; instead, simply change back to the Job Control tab and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session; they are lost when the program is exited.

Use the Resources tab to set the maximum disk space size accessible by ABAQUS, MSC.Marc, or other user defined analysis code during executions submitted by MSC.Patran’s Analysis Manager. There is one disk space setting for all files created during a run.

If the value is left at or set to “0,” the limits used will be the limits of the file system space defined to work with MSC.Patran’s Analysis Manager, as specified in the disk.cfg file. The Disk Space Units menu permits specifying sizes in kilobytes (KB), megabytes (MB), or gigabytes (GB).


4.3 Memory

The Memory configuration is analysis-code specific.

MSC.Nastran Memory

After selecting the Memory option on the Object menu, the following Memory form appears.

[Screenshot: UNIX Memory form for MSC.Nastran — General Memory Req. databox (mem: 0), mem units and smem units menus (32 bit Words), and per-host mem/smem settings (e.g., atf_sgi and atf_hp: mem 8000000, smem 100)]

All numbers in the Memory menu are absolute and are used as MSC.Nastran’s Open Core and Scratch Memory definition, i.e., the mem= and smem= parameters. The units for mem and smem parameters may be set in Words, kilobytes (KB), or megabytes (MB).
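As a quick sanity check on units (the 8000000-word value is the one shown in the example form; the arithmetic below is our own illustration):

```shell
# Convert a mem setting given in 32-bit words to bytes and megabytes.
words=8000000
bytes=$((words * 4))        # 4 bytes per 32-bit word
mb=$((bytes / 1024 / 1024)) # integer megabytes
echo "$words 32-bit words = $bytes bytes (about $mb MB)"
```

So mem=8000000 in 32-bit words corresponds to roughly 30 MB; choosing KB or MB units instead avoids the conversion.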


Use the Memory menu to set the maximum memory size accessible by MSC.Nastran during executions submitted by MSC.Patran’s Analysis Manager. There are two memory settings for each of the machines configured to use MSC.Patran’s Analysis Manager.

Memory configurations can be set for each machine supported to execute MSC.Nastran. A scroll bar appears if there are more hosts than can appear on the form. You must press the HOST Details button to see all the hosts. A zero value indicates that no minimum requirement is set.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; also see General (p. 44).


The Windows interface for MSC.Nastran memory requirements is shown below:

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are desired for only one submittal, do not save the settings; instead, simply change back to the Job Control tab and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session; they are lost when the program is exited.

Use the Resources tab to set the maximum memory size accessible by MSC.Nastran during executions submitted by MSC.Patran’s Analysis Manager. There are two memory settings for each of the machines configured to use MSC.Patran’s Analysis Manager.

All numbers in the Resources tab are absolute and are used as MSC.Nastran’s Open Core and Scratch Memory definitions, i.e., the mem= and smem= parameters. The units for the mem and smem parameters may be set to kilobytes (KB), megabytes (MB), or gigabytes (GB).

Memory configurations can be set independently for each machine supported to execute MSC.Nastran. A spreadsheet is displayed to allow specification of mem and smem per machine; change the numbers directly in the spreadsheet. A zero value indicates that no minimum requirement is set.


ABAQUS Memory

After selecting the Memory option on the Object menu, the following Memory form appears.

[Memory form: for each host (e.g. atf_ibm, atf_sgi), Units (64 Bit Words), main buffer, main memory, pre buffer, and pre memory fields, plus a General Memory Req. field (mem; enter “0” for no minimum requirement).]

All numbers in the Memory menu are absolute and are used as HKS’s Open Core and Scratch Memory definitions. The units for memory may be set to 64 Bit Words, kilobytes (KB), or megabytes (MB).

Memory configurations can be set for each machine supported to execute ABAQUS. A scroll bar appears if there are more hosts than can appear on the form. You must press the HOST Details button to see all hosts.

Use the Memory menu to set the maximum memory size accessible by ABAQUS during executions submitted by MSC.Patran’s Analysis Manager. There are four memory settings for each of the machines configured to use MSC.Patran’s Analysis Manager.

The HOST Details, Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


The Windows interface for ABAQUS memory requirements is shown below:

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are only desired for one submittal, do not save the settings; instead, simply change back to the Job Control tab and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session and will be lost when the program is exited.

Use the Resources tab to set the maximum memory size accessible by ABAQUS during executions submitted by MSC.Patran’s Analysis Manager. There are four memory settings for each of the machines configured to use MSC.Patran’s Analysis Manager.

All numbers in the Memory menu are absolute and are used as HKS’s Open Core and Scratch Memory definitions. The units for memory may be set to kilobytes (KB), megabytes (MB), or gigabytes (GB).

Memory configurations can be set independently for each machine supported to execute ABAQUS. A spreadsheet is displayed to allow specification of the memory values per machine. Change the numbers directly on the spreadsheet. A zero value indicates that no minimum requirement is set.


MSC.Marc and General Memory

After selecting the Memory option on the Object menu, the following Memory form appears.

[Memory form: Units (MB) and a General Memory Req. field (mem; enter “0” for no minimum requirement), with Save, Cancel, Reset, and Apply buttons, which are explained on the first page of this section; see General (p. 44).]

All numbers in the Memory menu are absolute and are used as the Open Core Memory definition. The units for memory may be set to kilobytes (KB), megabytes (MB), or gigabytes (GB).

Use the Memory menu to set the maximum memory size accessible by MSC.Marc or other general applications during executions submitted by MSC.Patran’s Analysis Manager. There is a single memory setting for each of the machines configured to use MSC.Patran’s Analysis Manager.


The Windows interface for MSC.Marc or other general application memory requirements is shown below:

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are only desired for one submittal, do not save the settings; instead, simply change back to the Job Control tab and submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session and will be lost when the program is exited.

Use the Resources tab to set the maximum memory size accessible during executions submitted by MSC.Patran’s Analysis Manager.

All numbers in the Memory menu are absolute and are used as the Open Core Memory definition. The units for memory may be set to kilobytes (KB), megabytes (MB), or gigabytes (GB).


4.4 Mail

The Mail configuration setting determines whether or not to have mail notification and, if so, where to send the mail notices.

Note: In this version there is no mail notification. This feature has been disabled.

[Mail form: Mail Notification Options (Mail Off, Mail On Job Completion) and Destination Host For Mail (Local Host or Master Host).]

If Mail On Job Completion is ON, MSC.Patran’s Analysis Manager will send an electronic mail message to the owner of the job at the host specified in the Destination Host for Mail setting. Mail messages are sent when the job completes.

If Mail Off is set (the default), then mail is not sent upon completion.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


4.5 Time

Any job can be submitted to be run immediately, with a delay, or at a specific future time. The default submittal is immediate. To change the submittal time, use the following Time form.

[Time form: Submit Immediately, Submit With Delay, and Submit At Specific Time options, with Hours, Minutes, and Day selectors and a Maximum Job Time (min) field.]

To wait a certain amount of time before running the job, select Submit With Delay. Then specify the delay in Hours and Minutes in the boxes to the right. Jobs cannot be delayed more than 23 hours and 59 minutes, unless the Submit At Specific Time method below is used.

To run a job on a certain day and time, select Submit At Specific Time. Click on the weekday under Day; the days appear as Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday. Select the desired day and then type the Hours and Minutes in the boxes to the right.

Jobs cannot be submitted to be run more than a week in advance. The day and time are always interpreted as the next occurrence. Be sure your computer’s clock is accurate.

A maximum amount of time (Maximum Job Time, in minutes) can also be specified, after which the job will be terminated. Zero indicates no limit.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


The Windows interface for setting job submit delay and maximum job time is specified directly on the Submit | Job Control tab as shown below:

Note: There is no Day-of-the-Week type submittal on Windows.

A maximum amount of time can also be specified after which the job will be terminated. Zero indicates no limit.

To wait a certain amount of time before running the job select Submit with Time Delay. Then, specify the delay in Hours and Minutes in the boxes to the right. Jobs cannot be delayed more than 23 hours and 59 minutes, unless the Submit At Specific Time method below is used.

The default is immediate (Now) submittal.

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are only desired for one submittal, do not save the settings; instead, simply submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session and will be lost when the program is exited.


4.6 General

The General configuration form allows preferences to be set for a number of items as described below. Nothing in this form is analysis specific.

Note: Items not described on this page are described on subsequent pages in this section.


Using a General configuration setting, you can control whether or not the Monitor Running Job function is started after a submittal. For more on monitoring a job, see Monitor (Ch. 5). Select Yes if you want each job monitored immediately after it is submitted.

[General form: Project Name, Default Host (e.g. atf_ibm, atf_hp, atf_sun), Start Job Monitor After Job Submit? (Yes/No), and Monitor Log File? On Originating Host (Yes/No).]

The Default Host or Queue can also be specified. If more hosts or queues exist than can fit on the form, they will appear in a list box.

A default project directory can be specified using the Project Name configuration setting. Type the project directory name into the box. The default project, if none has been specified, is the user name. A directory with the name provided is created if it does not already exist in the $P3_HOME/p3manager_files/<org>/proj directory. See more explanation below.
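For illustration only (the install path and organization name below are hypothetical, site-specific values), the resulting project directory path is built like this:

```shell
# Hypothetical values; $P3_HOME and the <org> name are site-specific.
P3_HOME=/opt/msc
org=acme
proj=design1

# The directory created for the project if it does not already exist.
PROJ_DIR="$P3_HOME/p3manager_files/$org/proj/$proj"
echo "$PROJ_DIR"
```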

If you would like the contents of the Log file to be dumped to the monitoring window during execution, set the Monitor Log File? toggle to Yes. This is applicable for MSC.Nastran, MSC.Marc, and the Analysis Manager.

More detail on these settings, and on those that do not appear in this form without scrolling, is given on subsequent pages of this section.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


The Windows interface for the General settings is specified directly on the Submit | General tab as shown below:

Note: Unlike the UNIX interface, to save a default Host/Queue, you select the Host/Queue on the Job Control tab and then save the settings under the Queue pull-down menu.

If you would like the contents of the Log file to be dumped to the monitoring window during execution, set this toggle to Yes. This is applicable for MSC.Nastran, MSC.Marc, and the Analysis Manager.

A default project directory can be specified using the Project Name configuration setting. Type the project directory name into the box. The default project, if none has been specified, is the user name. A directory with the name provided is created if it does not already exist in the $P3_HOME/p3manager_files/<org>/proj directory. See more explanation below.

Once everything is as the user wants it, select Save User Settings under the Queue pull-down menu. A confirmation will be requested as to whether the user wants to save the settings in a file called .p3mgrrc, which is stored in the user’s home directory.

If the particular settings are only desired for one submittal, do not save the settings; instead, simply submit the job. The selected settings will not be written to the .p3mgrrc file but will be used during that session and will be lost when the program is exited.


Project Directory. The project directory is a subdirectory below the MSC.Patran Analysis Manager install path where the Analysis Manager's job-specific files are created during job execution.

Projects are a method of organizing one’s jobs and results. For instance, if a user had two different bracket assembly designs and each assembly contained many similar if not identical parts, each assembly file might be named assembly.dat. But to avoid interference, each file is executed out of a different project directory.

If the first project is design1 and the second is design2, then one job is executed out of <file system(s) for selected host>/proj/design1 and the other is <file system(s) for selected host>/proj/design2. Hence, the user could have both jobs running at the same time without any problems, even though they are labeled with the same file name. See Disk Configuration (p. 119).
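The layout described above can be sketched as follows (the file-system root is hypothetical; the project and file names are the ones from the example):

```shell
# Hypothetical file system root for the selected host.
fsroot=/scratch/p3am

# Two jobs with the same input file name, isolated by project directory.
job1="$fsroot/proj/design1/assembly.dat"
job2="$fsroot/proj/design2/assembly.dat"
echo "$job1"
echo "$job2"
```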

When the job is completely finished, all appropriate files are copied back to the originating host/directory (the machine and directory where the job was actually submitted from).

Pre and Post Commands. Commands can be executed before submission of an analysis and after its completion, in the form of a Pre and Post capability. For instance, suppose that before submitting an analysis the user needs to translate an input deck from ASCII form to binary form by running some utility called ascbin. This is done on the submitting host by typing ascbin at the system prompt. This same operation can be done by specifying ascbin in the Pre databox for the originating host.

Similarly, on completion of the analysis and after the files have been copied back from the executing host, the user needs to again run a program to translate the results from one file format to another using a program called trans. He would then place the command trans in the Post databox for the originating host.

A Pre and a Post command can be specified on the executing (analysis) host side also.

These commands specified in the databoxes can be as simple as a one word command or can reference shell scripts. Arguments to the command can be specified. Also, if keywords, such as the jobname or hostname, from MSC.Patran’s Analysis Manager are needed, they can be referenced by placing a $ in front of them. The available keywords that are interpreted in the Pre and Post databoxes can be examined by pressing the Keyword Index button. For more explanation of keywords, see General Miscellaneous (p. 55).
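A minimal sketch of the keyword substitution (ascbin is the hypothetical utility from the example above; the expansion mechanism shown here is illustrative, not the Analysis Manager’s actual code):

```shell
# Keyword values the Analysis Manager would supply for a job.
JOBFILE="assembly.dat"        # $JOBFILE: actual filename selected
JOBNAME="${JOBFILE%.*}"       # $JOBNAME: $JOBFILE without its extension

# A Pre databox entry, with $-prefixed keywords left for substitution.
PRE_CMD='ascbin $JOBNAME'

# Expand the keywords before the command would be run.
EXPANDED=$(eval echo "$PRE_CMD")
echo "$EXPANDED"
```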

Separate User. The Separate User option allows job submittal to the selected system as a different user, in case the current user does not have an account on the selected system. This must be enabled and set up in advance by the system administrator. In order for this to work properly, the separate user account specified in this databox must exist both on the selected system that runs the job and on the machine from which the job is submitted. See Examples of Configuration Files (p. 136) for an explanation of how to set up separate user submission.

Default Host/Queue. The Default Host/Queue, if saved, is the host/queue to which jobs are submitted when submitted directly from MSC.Patran by using the Apply button on the Analysis form. It is also the host/queue to which jobs will be submitted when using the batch submittal from the direct Analysis Manager command line. It is also the host/queue which will come up as the selected default when the full Analysis Manager interface is started. If this setting is not saved, the default host/queue is the first in the list.

Patran Database. You can specify the name of an MSC.Patran database so that a post-submit task, such as running a script file, will know which MSC.Patran database to use for what it (the script) wants to do (like automatically reading the results back in after a job has completed).


Copy/Link Results Files. By default, all results files are copied back to the directory where the input file resides. The copy/link functionality is simply a method for transferring files. If you run remotely, the files are copied via the Analysis Manager. But if you run locally, there is no good reason to transfer the files or even copy them, so you can set this flag and the Analysis Manager will either link the files in the work directory to the original ones or use the system copy command instead of reading one file and sending bytes over to write another. If you are low on disk space, the link is a good way to go, but of course the Analysis Manager must be able to see the results files in the analysis host’s scratch disk space from the submittal host for this to work.
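The two local transfer modes can be sketched as follows (the paths are illustrative throwaway locations; this is not the Analysis Manager’s actual code):

```shell
# Set up a throwaway original results file and a work directory.
demo=$(mktemp -d)
mkdir -p "$demo/orig" "$demo/work"
echo "results" > "$demo/orig/job.f06"

# Copy mode: duplicates the bytes (uses extra disk space).
cp "$demo/orig/job.f06" "$demo/work/job.f06.copy"

# Link mode: points at the original file (no extra disk space used).
ln -s "$demo/orig/job.f06" "$demo/work/job.f06.link"

ls "$demo/work"
```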


4.7 Restart

The Restart configuration is analysis code specific and does not apply to General applications.

Within MSC.Patran, to perform a restart using the Analysis Manager, the job is submitted from the Analysis application as normal; however, a restart job must be indicated. When the Analysis Manager’s main interface is invoked with a restart job from MSC.Patran, this information is passed to the Analysis Manager and the restart jobname shows up in the Configure | Restart form. The restart job can be submitted directly from the main form or from MSC.Patran. In either case, the restart job looks for the previous job to be restarted in the local path and/or on the host machine. If this restart jobname is not specified, the databases must be located on the host machine to perform a successful restart.

MSC.Nastran Restarts

After selecting the Restart option on the menu, the following Restart form appears. To save the MSC.Nastran database for restart using the MSC.Patran Analysis Manager, the Scratch Run toggle must be set to No in the Configure | Restart form. If the Save Databases toggle is set to No, the database is deleted from the host machine after the analysis. If the Copy Databases Back toggle is set to No, the databases are not copied back to the local path. The database files are given .DBO and .MST filename extensions for the .DBALL and .MASTER files, respectively.

A restart job that is submitted with the Analysis Manager searches for the Initial Job name .MST files in the path where MSC.Patran is invoked. Therefore, if this file and the other database files are renamed or moved, the restart job will not be successful.

MSC.Patran automatically generates the ASSIGN MASTER FMS statement required to perform a restart. If the restart .bdf is not generated by MSC.Patran and the Analysis Manager is used to submit the job, the .bdf must contain an ASSIGN MASTER FMS statement that specifies the name and location of the restart database. The following error will be issued by the MSC.Patran Analysis Manager if the ASSIGN statement is missing.

ERROR... Restart type file but no MASTER file specified with an ASSIGN statement. Use an ASSIGN statement to locate at least the MASTER database file(s) for previous runs.
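A minimal sketch of such a statement (the path and database name are hypothetical), placed in the File Management Section of the restart .bdf:

```
ASSIGN MASTER='/users/atf_user/proj/design1/bracket.MASTER'
```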

See the UNIX and Window forms below for more explanation.

The restart items in the Windows interface accessible from the Restart tab in the Submit tree are identical to those in the UNIX interface shown on the next page.


[Restart form: Cold Start Jobname databox and Scratch Run?, Save Databases?, and Copy Databases Back? toggles (Yes/No).]

The old jobname from the cold start MSC.Nastran job which the restart is to use is supplied in the Cold Start Jobname databox.

If a scratch restart run is to be done (default), then no databases are saved or copied back. For the initial run, this must be set to No if you wish to do subsequent restarts.

If a non-scratch restart run is to be performed, you have the choice of saving the database(s) or not. If databases are saved, they remain on the scratch disks specified by the configuration files. However, the MASTER file is always copied if Scratch Run is set to No so that restarts can be performed in the future.

If a non-scratch restart run is to be performed, you have the choice of copying the database back to the invoking host and directory. Be aware of binary incompatibilities.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


MSC.Marc Restarts

Restarts in MSC.Marc are quite similar to those in MSC.Nastran.

[Restart form: Restart Jobname databox and Save .t08 File?, Copy .t08 File Back?, and Append .t16/t19 File? toggles (Yes/No).]

The restart items in the Windows interface accessible from the Restart tab in the Submit tree are identical to those in the UNIX interface.


Set these two flags dependent on whether the MSC.Marc jobname.t08 file should be saved and/or copied back to the directory where the job was submitted. If the file is not copied back, then it remains in the scratch directory as configured in the disk.cfg configuration file. If it is not saved and not copied back, then it will be discarded at the end of the run.

If you wish to append results to the end of the MSC.Marc .t16/t19 file of the old job from which you are doing a restart, then set this flag accordingly.

The old jobname which the restart is to use is supplied in the Restart Jobname databox.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


ABAQUS Restarts

After selecting the Restart option on the menu, the following Restart form appears.

[Restart form: Restart Jobname databox and Save .res File?, Copy .res File Back?, and Append .fil File? toggles (Yes/No).]


Set these two flags dependent on whether the ABAQUS jobname.res file should be saved and/or copied back to the directory where the job was submitted. If the file is not copied back, then it remains in the scratch directory as configured in the disk.cfg configuration file. If it is not saved and not copied back, then it will be discarded at the end of the run.

If you wish to append results to the end of the ABAQUS .fil file of the old job from which you are doing a restart, then set this flag accordingly.

The old jobname which the restart is to use is supplied in the Restart Jobname databox.

The restart items in the Windows interface accessible from the Restart tab in the Submit tree are identical to those in the UNIX interface.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).


4.8 Miscellaneous

The Miscellaneous configuration is analysis code specific.

MSC.Nastran Miscellaneous

After selecting the Miscellaneous option on the menu, the following form appears.

[Miscellaneous form: Extra Cmd-Line Args and Number of CPU’s fields per host (e.g. lansing, tavarua).]

Supply any additional command line arguments that you wish to be used in the nastran command for invoking MSC.Nastran. See the MSC.Nastran documentation for available options.

Enter the number of CPUs that are to be used for each MSC.Nastran submittal if the MSC.Nastran job can take advantage of a machine with multiple CPUs.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).

These items, accessible in the Windows interface from the Miscellaneous tab in the Submit tree, are identical to those in the UNIX interface.


MSC.Marc Miscellaneous

After selecting the Miscellaneous option on the menu, the following form appears.

[Miscellaneous form: User Subroutine, POST File, Viewfactor File, Save Executable? (Yes/No), Extra Cmd-Line Args, and Number of CPU’s fields per host (e.g. lansing).]

Supply the name of any user subroutine here, including the full path, or the name of a previously compiled program. If a subroutine is being compiled, you can save the executable.

Enter the number of CPUs that are to be used for each MSC.Marc submittal if MSC.Marc can take advantage of a machine with multiple CPUs. You must be submitting a domain decomposition (DDM) job to take advantage of this option.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).

Supply the name of any POST file to be used for temperature loading or in an axisymmetric to 3D analysis. Also for thermal radiation, a view factor file can be supplied.

Note: When invoked from MSC.Patran, items requiring file locations are usually passed directly into the Analysis Manager such as the User Subroutine, POST file, and View Factor file. Thus, in this case, there would be no need to reenter these items.

These items, accessible in the Windows interface from the Miscellaneous tab in the Submit tree, are identical to those in the UNIX interface.

Supply any additional command line arguments (Extra Cmd-Line Args) that you wish to be used in the run_marc command for invoking MSC.Marc. See the MSC.Marc documentation for available options.


ABAQUS Miscellaneous

After selecting the Miscellaneous option on the menu, the following form appears.

[Miscellaneous form: User Subroutine, Extra Cmd-Line Args, Version/Memory Test, and Number of CPU’s fields per host (e.g. lansing).]

Supply the name of any user subroutine here.

Enter the number of CPUs that are to be used for each ABAQUS submittal if ABAQUS can take advantage of a machine with multiple CPUs.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).

Old ABAQUS versions use the main_mem and main_buffer names, but newer ABAQUS versions use the standard_mem and standard_buffer names for setting the memory. The Version/Memory Test flag tells which to use.


Supply any additional command line arguments that you wish to be used in the abaqus command for invoking ABAQUS. See the ABAQUS documentation for available options.

These items, accessible in the Windows interface from the Miscellaneous tab in the Submit tree, are identical to those in the UNIX interface.


General Miscellaneous

After selecting the Miscellaneous option on the menu, the following form appears.

[Miscellaneous form: Command Line databox (with keywords such as $JOBNAME) and Keyword Index button.]

Use this databox to supply the arguments to the command line of the analysis code that is to be submitted. This overrides the command line set up in the host.cfg file.

The Keyword Index button brings up an information window with the names of the possible keyword variables available and their descriptions.

Keywords can be:

$JOBFILE    Actual filename selected (w/o full path)
$JOBNAME    Jobname ($JOBFILE w/o extension)
$P3AMHOST   Hostname of the MSC.Patran AM host
$P3AMDIR    Directory on the MSC.Patran AM host where $JOBFILE resides
$APPNAME    Application name (P3 preference name)
$PROJ       Project Name selected
$DISK       Total disk space requested (MB)

Specific examples of some command lines are presented below.

The Save, Cancel, Reset, and Apply buttons are explained on the first page of this section; see General (p. 44).

Note: Some examples of General analysis applications are discussed below.

These items, accessible in the Windows interface from the Miscellaneous tab in the Submit tree, are identical to those in the UNIX interface.


Examples of some specific command lines used to invoke analysis codes are given here.

Example 1:

The first example involves the ANSYS 5 code. First the Analysis Preference must be set to ANSYS 5 from MSC.Patran’s Analysis Preference form and an input deck for ANSYS 5 must have been generated via the Analysis application (this is done by setting the Action to Analyze, and the Method to Analysis Deck). Then MSC.Patran’s Analysis Manager can be invoked from the Analysis main form. Note that a direct submittal from MSC.Patran is not feasible in this and the subsequent example.

The jobfile (jobname.prp in this case) is automatically displayed as the input file and the Submit button can be pressed. The jobfile is the only file that is copied over to the remote host with this general analysis submittal capability.

In the host.cfg configuration file the path_name of the executable is defined. The rest of the command line would then look like this:

-j $JOBNAME < $JOBFILE > $JOBNAME.log

If the executable and path defined is as /ansys/bin/ansys.er4k50a, then the entire command that is executed is:

/ansys/bin/ansys.er4k50a -j $JOBNAME < $JOBFILE > $JOBNAME.log

Here the executable is invoked with a parameter (-j) specifying the jobname. The input file ($JOBFILE) is redirected using the UNIX redirect symbol as the standard input and the standard output is redirected into a file called $JOBNAME.log. The variables beginning with the $ sign are passed by MSC.Patran’s Analysis Manager. All resulting output files are copied back to the invoking host and directory on completion.
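With hypothetical values $JOBNAME=bracket and $JOBFILE=bracket.prp, the substitution can be sketched as follows (the command string is only built and printed here, not executed):

```shell
# Hypothetical job selected in the Analysis Manager.
JOBNAME=bracket
JOBFILE=bracket.prp

# The full command after keyword substitution; < and > are literal
# characters in this string, not live redirections.
CMD="/ansys/bin/ansys.er4k50a -j $JOBNAME < $JOBFILE > $JOBNAME.log"
echo "$CMD"
```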

Example 2:

This is a more complicated example where an analysis code needs more than one input file. The general analysis capability in MSC.Patran’s Analysis Manager only copies one input file over to the remote host for execution. If more than one file needs to be copied over then a script must be developed for this purpose. This example shows how MSC.Patran FEA can be submitted via a script that does the proper copying of files to the remote host.

The Analysis Preference in MSC.Patran is set to MSC.Patran FEA and, in addition to setting the Preference, the input file suffix is specified as .job. MSC.Patran FEA needs three possible input files: jobname.job, jobname.ntl, and an auxiliary input file. The jobname.job file is automatically copied over to the remote host. The auxiliary input file can be called anything and is specified in the jobname.job file.

A shell script called FeaExecute is created and placed on all hosts that allow execution of MSC.Patran FEA. This FeaExecute script does the following:

1. Parses the jobname.job file to find the name of the auxiliary input file if it is specified.

2. Copies the auxiliary input file and the jobname.ntl file to the remote host.

3. Executes the FeaControl script, which controls actual execution of the MSC.Patran FEA job. This is a standard script delivered with the MSC.Patran FEA installation.

In the MSC.Patran Analysis Manager configuration file, the FeaExecute script and its path are specified. The input parameters for this script are:

-j $JOBNAME -h $P3AMHOST -d $P3AMDIR


which specify the jobname, the host from which the job was submitted, and the directory on that host from which it was submitted. With this information the job can be run successfully. Assuming FeaExecute is installed in /fea/bin, the full command that is executed on the remote host is:

/fea/bin/FeaExecute -j $JOBNAME -h $P3AMHOST -d $P3AMDIR

The FeaExecute script contents are shown for completeness:

#! /bin/sh
# Script to submit MSC.Patran FEA to a remote host via the Analysis Manager

# Define a function for displaying valid params for this script
abort_usage() {
    cat 2>&1 <<EOF
Usage: $Cmd -j Jobname -h Remote_Host -d Remote_Dir
EOF
    exit 1
}

# Define a function for checking status
check_status() {
    Status=$1
    if [ "$1" -ne 0 ] ; then
        echo "Error detected ... aborting $Cmd"
        exit 2
    fi
}

# Define a function for doing a general-purpose exit
exit_normal() {
    echo "$Cmd complete"
    exit 0
}

# Define a function for extracting keyword values from
# the .job file. Convert keyword value to upper case.
GetKeyValue() {
    JobFile=${1?} ; Key=`echo ${2?} | sed 's/ //g'`
    cat $JobFile | sed 's/ //g' | grep -i '^'$Key'=' | \
        sed 's/^.*=//' | tr '[a-z]' '[A-Z]'
}

# Define a function for extracting keyword values from
# the .job file. Return the correct case for all characters
# (don't force anything to upper case).
GetKeyValueCC() {
    JobFile=${1?} ; Key=`echo ${2?} | sed 's/ //g'`
    cat $JobFile | sed 's/ //g' | grep -i '^'$Key'=' | \
        sed 's/^.*=//'
}

# Define a function to get the Jobname from the jobfile name
#
# usage: get_Jobname filespecification
#
get_Jobname() {
    echo $1 | sed -e 's;^.*/;;' -e 's;\..*$;;'
}

# Determine the command name of this script
Cmd=`echo $0 | sed 's;^.*/;;'`

# Assign the default argument parameter values
Jobname=''
Verbose=''
if [ "<installation_directory>" = "" ] ; then
    Acommand="<installation_directory>/bin/FeaControl"
else
    Acommand="<installation_directory>/bin/FeaControl"
fi
Status=0

# Parse through the input arguments.
if [ $# -ne 6 ] ; then
    abort_usage
fi
while [ $# -ne 0 ] ; do
    case "$1" in
        -j) Jobname=$2 ; shift 2 ;;
        -h) remhost=$2 ; shift 2 ;;
        -d) remdir=$2 ; shift 2 ;;
        *)  abort_usage ;;
    esac
done

# Runtime determination of machine/system type
OsName=`uname -a | awk '{print $1}'`
case "$OsName" in
    SunOS)
        Rsh=rsh
        RshN1='-n'
        RshN2=''
        ;;
    HP-UX)
        Rsh=remsh
        RshN1=''
        RshN2=''
        ;;
    AIX)
        Rsh=/usr/ucb/remsh
        RshN1=''
        RshN2='-n'
        ;;
    ULTRIX)
        Rsh=/usr/ucb/rsh
        RshN1=''
        RshN2='-n'
        ;;
    IRIX)
        Rsh=rsh
        RshN1=''
        RshN2='-n'
        ;;
    *)
        Rsh=rsh
        RshN1=''
        RshN2=''
        ;;
esac

# Determine the fully expanded names for the input files.
JobFile=${Jobname}.job
AifFile=`GetKeyValueCC "$JobFile" "AUXILIARY INPUT FILE"`

# Copy the files over from the remote host
NtlFile=${Jobname}.ntl
lochost=`hostname`
curdir=`pwd`
if [ "$curdir" = "$remdir" ] ; then
    crap=1
else
    if [ "$remhost" = "$lochost" ] ; then
        cp ${remdir}/${NtlFile} .
        if [ "$AifFile" = "" ] ; then
            crap=1
        else
            cp ${remdir}/${AifFile} .
        fi
    else
        rcp ${remhost}:${remdir}/${NtlFile} .
        if [ "$AifFile" = "" ] ; then
            crap=1
        else
            rcp ${remhost}:${remdir}/${AifFile} .
        fi
    fi
fi

# Perform the analysis
$Acommand $Jobname ; check_status $?

# Successful exit of script
exit_normal
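To see the keyword extraction in isolation, the GetKeyValueCC pipeline can be exercised against a made-up .job file. The file contents below are hypothetical; the sed/grep pipeline is the one from the script:

```shell
#!/bin/sh
# Demo of the keyword-extraction pipeline used by GetKeyValueCC:
# strip blanks, find the "KEY=" line case-insensitively, keep the value.
cat > demo.job <<'EOF'
AUXILIARY INPUT FILE = loads.aux
EOF
Key=`echo "AUXILIARY INPUT FILE" | sed 's/ //g'`
AifFile=`cat demo.job | sed 's/ //g' | grep -i '^'$Key'=' | sed 's/^.*=//'`
echo "$AifFile"    # prints: loads.aux
rm -f demo.job
```

Because all blanks are stripped from both the key and the file before matching, the lookup tolerates arbitrary spacing around the keyword and the equals sign.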


MSC.Patran Analysis Manager User’s Guide

CHAPTER 5 Monitor

■ Introduction

■ Running Job

■ Completed Job

■ Host/Queue


5.1 Introduction

By setting the Action to Monitor on the main MSC.Patran Analysis Manager form, the user can monitor not only active jobs but also Host or Queue activity. In addition, monitoring graphs of completed jobs can be recalled at any time. Each of these functions is explained in this chapter.

Each of these functions for monitoring jobs or hosts/queues is also accessible directly from the Analysis application form within MSC.Patran. The only difference is that the full user interface of the MSC.Patran Analysis Manager is not accessed first; instead, the monitoring forms are displayed directly, as explained in the next few pages.

Note: The UNIX interface is shown above. In subsequent sections both the UNIX and the Windows interface are shown. Monitoring in the Windows interface is done from the Monitor tree tabs.

[Figure: main form, UNIX interface, with Action: Monitor, Object: Running Job, Group: default]

Action set to Monitor allows monitoring of such items as Running Job (p. 63), Completed Job (p. 68), and Host/Queue (p. 71).

What appears on the rest of the form depends on the Object set in the option menu. All of these objects and their respective forms and parameters are explained in this chapter.

Once the proper Action and Object have been selected, press the Apply button to execute the monitoring action. The Cancel button immediately exits the program, discarding any information or parameters selected up to that point.


5.2 Running Job

With the Action set to monitor a Running Job, pertinent information about a specific job that is currently running or queued to run can be obtained. Jobs can be monitored from any host in the Analysis Manager's configuration, not just from where they were submitted.

Note: This form is not displayed when a job is monitored directly from MSC.Patran. Instead, only the monitoring form is displayed, as shown on the next page, since all the pertinent information to monitor a job is passed in from MSC.Patran. The Windows interface is also shown below.

[Figure: Running Job form with Action: Monitor, Object: Running Job, Group: default, showing the Job to Monitor text box and the Active Job List]

The filename of the user's job appears in the Job to Monitor text box. To monitor this job, simply press Apply.

The monitor form then appears if the job is running. If the job has terminated, or that particular job cannot be found, an error message is given.

The job name that appears in this text box is the jobname from the Analysis application form in MSC.Patran if this form was initiated from MSC.Patran; otherwise, it will be undefined and an active job will have to be selected as described below.

The Cancel button exits without changing any settings.

If a different job is to be monitored, press Update Active List for a list of currently active jobs, for example:

    Job #  Name      Owner
    1      smalljob  atf_user1
    2      medjob    atf_user1
    3      bigjob    atf_user1
    4      myjob     atf_user2

Select the job to be monitored from those listed, then press Apply. If your job does not appear in this list, it has completed and is no longer running.


A graph of the selected running job appears, showing the duration of the job on the host where it has been or is running.

The following table describes all the widgets that appear in this job graph.

[Figure: Job Graph window for a running job, showing Job Information (Job #, Job Name, Owner, Elapsed Time), the Job Status bar, Percent CPU Usage and Percent Disk Usage dials, Total Disk Usage, a Returning Job Files list, the timestamped event log, and Controls (Normalize Graph, mb/inch, min/inch, Update (Sec.), Remove Beginning Queue Time, Suspend/Resume Job, Close)]

Item Description

Job Status This widget gives the total elapsed time in blue and the actual CPU time in red. A check mark appears when the job is completed successfully. Otherwise, an “X” appears. The clear portion of the blue bar indicates the amount of time the job was queued before execution began. Elapsed and CPU time are reported in minutes.

Percent CPU Usage This widget gives the percentage of CPU that is being used by the analysis code at any given time. The maximum percentage of CPU during job execution is indicated as a grey shade which remains at the highest level of % CPU usage.

Total Disk Usage This widget gives the total amount of disk space used by the job during execution in megabytes.

Percent Disk Usage This widget gives the percentage of the total disk space that this job occupies at any given time for all file systems. If you click on this widget with the mouse, all file systems will be shown. The maximum percentage of disk space used during job execution is indicated as a grey shade which remains at the highest level.


The bottom left panel lists information about the job, such as date and time of event, task name, host name, and status. Any error and status messages will appear here. An example listing is:

Fri Jan 4 13:31:31 1994 <TASK COMPLETED> Task Name: shock

The running job function can also be invoked manually from the system prompt. See Invoking the Analysis Manager Manually (p. 19) for details.

Job Information
  Job # - the sequential number of the job
  Job Name - the name of the job
  Owner - the name of the user or job owner
  Elapsed Time - how long the job has been running

Returning Job Files
  All files created during execution are copied back and displayed in this list box. After job completion and during job execution, it is possible to click on any of these files to view them with a simple editor/viewer. The following keystrokes are available in this viewer window:
  ctrl-s: search for a string
  ctrl-n: repeat search
  ctrl-c: exit the viewer
  ctrl-<: go to top of file
  ctrl->: go to bottom of file

Controls
  Remove Beginning Queue Time - removes the queued portion of the graphics bar, i.e., the portion that is not blue before the job begins
  Suspend/Resume Job - when toggled on, the job is suspended indefinitely. A banner across the CPU dial displays the word SUSPENDED while the job is suspended. Toggle the switch off to resume the job; the banner is removed
  Update (Sec.) - how often to update the graph/display
  Pixels Per Min - how many pixels wide per minute
  MB Per Inch - how many megabytes per inch to display
  Normalize Graph - make the graph fit in the window area

Close
  Closes the monitoring form.
Status Window
  Status messages are returned in this window. If the log file is being monitored, log file lines also appear here for MSC.Nastran and ABAQUS.


Windows Interface

For Running Jobs, when a job is submitted from the Windows interface, the user is asked whether the interface should switch automatically to monitoring mode.

When a job is running the Monitor tree shows running jobs and jobs that have been queued.

The Running Jobs tab under the Monitor tree shows a list of running jobs. The job names are listed in the tree structure and in the window with other pertinent information. This list is updated automatically at a set interval (see the Update Speeds setting under the View pull-down menu). Alternatively, press the Repaint (update) icon, select Refresh under the View pull-down menu, or press the F5 key to update the list at any time.


When a running job in the tree structure is selected, three tabs become available that give the specific status of the job, allow viewing of created output files, and give a graphical display of memory, CPU, and disk usage.

Job settings are displayed under the Job Control tab.

Viewable results files are shown under the Job Files tab. You must Download a file before it can be viewed.

The log file contents are displayed in this bottom list box.

The Performance tab shows a graphical representation of the CPU, disk and memory usage. The graph turns red when the resource becomes about 75% used.


5.3 Completed Job

This is an Analysis Manager utility that allows the user to graph a particular completed job run by the Analysis Manager.

Note: This form is not displayed if this action is selected directly from the Analysis application form in MSC.Patran. Instead, only the monitoring form is displayed, as shown on the next page. The Windows interface is also shown.

[Figure: Completed Job form with Action: Monitor, Object: Completed Job, Group: default, showing the Monitor File text box (myjob.mon) and a Select File... button]

The filename of the user's job appears in the Monitor File text box. To examine this completed job, simply press Apply. The monitor report appears as shown on the next page.

If a different job is to be monitored, press Select File for a list of completed jobs. Use the file browser to choose a .mon file of a previous job; it can be found in the directory from which the job was originally submitted. Select the job to be monitored from those listed, then press Apply.

The full filename of the .mon file is jobname.mon, where jobname is the name of the data file submitted.

The job name that appears in this text box is the jobname from the Analysis application form in MSC.Patran if this form was initiated from MSC.Patran; otherwise, it is undefined and a previously run job has to be selected via the Select File option as described above.

The Cancel button exits without changing any settings.


The .mon file is created when a job is first submitted to MSC.Patran’s Analysis Manager. Information on all the job tasks is written to the .mon file: time submitted, job name, job number, time actually run, time finished, and completion status are all recorded, so that this Analysis Manager function can read the file and have enough information to graph the job’s progress completely.
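Because the .mon file is an ordinary text log of timestamped events, simple questions can be asked of it with standard UNIX tools. The sketch below is purely illustrative: the file name and event lines are invented (modeled on the job-graph status listing shown earlier), and the real .mon layout is defined by the Analysis Manager itself:

```shell
#!/bin/sh
# Hypothetical sketch: pull the start and finish events out of a
# .mon-style event log. Sample contents are invented for illustration.
cat > sample.mon <<'EOF'
Fri Mar 4 15:51:22 1994 <JOB STARTED>
Fri Mar 4 15:51:27 1994 <TASK RUNNING>
Fri Mar 4 15:52:13 1994 <JOB FINISHED>
EOF
start=`grep 'JOB STARTED' sample.mon`
finish=`grep 'JOB FINISHED' sample.mon`
echo "$start"
echo "$finish"
rm -f sample.mon
```

Comparing the two timestamps gives the job's wall-clock span without opening the graphical monitor.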

The explanation of the graphs on this form is identical to that of a Running Job (p. 63) except that the Update slider bar does not show up since it is not applicable to a completed job.

[Figure: Job Graph window for the completed job, identical to the running-job graph, with the Returning Job Files list showing jobname.f04 and jobname.f06]


Windows Interface

For Completed Jobs, the Windows interface displays them under the Completed Jobs tab in the Monitor tree.

The Completed Jobs tab under the Monitor tree shows a list of completed jobs. The job names are listed in the tree structure and in the window with other pertinent information. The number of jobs retained in the list (up to 25) is set under the Tools | Options menu pick; older jobs are no longer displayed. You can always open an existing .mon file directly.

Once a completed job is selected from the tree structure or an existing .mon file is opened, the Job Control, Job Files, and Performance tabs are displayed. For completed jobs, the most useful tab is the Job Files tab which allows you to view/edit the result files. The other tabs are identical to the Running Jobs interface. See Windows Interface (p. 66) under Running Job (p. 63).

The editor used to view files is set under the Tools | Options menu pick.


5.4 Host/Queue

Information about all hosts or queues used by MSC.Patran’s Analysis Manager, and jobs submitted through the Analysis Manager, can be reviewed using the Monitor Host/Queue selection. Options available include Job Listing (p. 72), Host Status (p. 73), Queue Manager Log (p. 74), Full Listing (p. 75), and CPU Loads. Press the Apply button to invoke these functions. The user can vary how often the information is updated using the slider control.

The Host/Queue monitoring function can also be invoked manually from the system prompt. See Invoking the Analysis Manager Manually (p. 19) for details.

[Figure: main form with Action: Monitor, Object: Host/Queue, Group: default]

Press Apply to initiate the Host/Queue monitoring. The Cancel button exits without changing any settings.

In the Windows interface, all the same information is available from the Host/Queue tab in the Monitor tree.


Job Listing

The initial application form of Monitor's Host/Queue appears as follows:

At the top of the main form for Monitor Queue is a slider labeled Update Time (Min.). Drag the slider to the left to shorten the interval between information updates, or to the right to lengthen it. The default interval is 5 minutes. In the Windows interface, the refresh setting is set under the View | Update Speeds menu pick.

The update interval may be changed at any time during the use of any Monitor Queue options.

All jobs currently running in some capacity are listed. Information about each job includes Job Number, Job Name, Owner, and Time Submitted. The job number is a unique, sequential number that the Analysis Manager generates for each job submitted to it. Pressing the Close button closes the monitor form.

[Figure: QUEUE MONITOR form, Job Listing view, with the Update Time (Min.) slider, the Job Listing / Host Status / Queue Manager Log / Full Listing / CPU Loads toggles, and the Close button]

    Job #  Job Name  Owner     Time Submitted
    33     small     atf_user  Fri Mar 4 19:03:17 1994
    35     medium    atf_user  Fri Mar 4 19:04:17 1994
    36     large     atf_user  Fri Mar 4 19:05:17 1994

Windows interface, Job Listing window.


Host Status

When the Host Status toggle is highlighted, the form appears as follows:

The status is reported on all hosts or queues used by the Analysis Manager. Information about each host/queue includes: host/queue name (Host Name), number of jobs running (# Running), number of jobs queued (# Queued), maximum allowed to run concurrently (Max Running), and Host Type (i.e., MSC.Nastran).

If NQS or LSF is being used, queue information is provided instead of host information. See Submit (Ch. 3) for more information on default settings.

The update interval may be changed at any time during the use of any Monitor Queue options. The default interval time is 5 minutes. In Windows, use the View | Update Speeds menu option.

To exit from the Monitor Queue, select the Close button on the bottom of the main form on the right. Log files are unaffected when the form is closed.

[Figure: QUEUE MONITOR form, Host Status view]

    Host Name  # Running  # Queued  Max Running  Host Type
    atf_ibm    0          0         2            MSC.Nastran
    atf_hp     1          0         2            MSC.Nastran
    atf_sun    0          0         2            MSC.Nastran
    atf_dec    0          0         2            MSC.Nastran
    atf_suns   2          0         2            MSC.Nastran
    atf_sgi    1          0         2            MSC.Nastran

Windows interface, Host Status window.


Queue Manager Log

When the Queue Manager Log toggle is selected, the form appears as follows:

The most recent jobs submitted are listed, regardless of where or when they were run. Information about each job includes: date and time of event, event description, job number, job or task name or host name, task type or PID (process id of task), and owner. Most recent jobs are listed in the text list box from the time the Analysis Manager’s Queue Manager daemon was started. See System Management (Ch. 7) for a description of the Queue Manager daemon.

The update interval may be changed at any time during the use of any Monitor Queue options. The default interval time is 5 minutes. In Windows, use the View | Update Speeds menu option.

To exit from the Monitor Queue, select the Close button on the bottom of the main form on the right. Log files are unaffected when the form is closed.

[Figure: QUEUE MONITOR form, Queue Manager Log view]

    =======================================================
    MSC.Patran Analysis Manager
    =======================================================
    Queue Manager Started   Host: atf_sgi
    Sun Mar 6 17:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 1
    Sun Mar 6 17:59:03 1994: <JOB ACK> Port: 1600 Job Name: dresden_baja
    Sun Mar 6 17:59:04 1994: <TASK SUBMIT> Job #: 1 Task Name: dresden
    Sun Mar 6 17:59:09 1994: <TASK PID> Job #: 1 Task Host: baja
    Sun Mar 6 18:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 2
    Sun Mar 6 18:59:03 1994: <JOB ACK> Port: 1600 Job Name: dresden_baja
    Sun Mar 6 18:59:04 1994: <TASK SUBMIT> Job #: 2 Task Name: dresden
    Sun Mar 6 18:59:09 1994: <TASK PID> Job #: 2 Task Host: baja
    Sun Mar 6 19:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 3
    Sun Mar 6 19:59:03 1994: <JOB ACK> Port: 1600 Job Name: dresden_baja
    Sun Mar 6 19:59:04 1994: <TASK SUBMIT> Job #: 3 Task Name: dresden
    Sun Mar 6 19:59:09 1994: <TASK PID> Job #: 3 Task Host: baja
    Sun Mar 6 20:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 4
    Sun Mar 6 20:59:03 1994: <JOB ACK> Port: 1600 Job Name: dresden_baja
    Sun Mar 6 20:59:04 1994: <TASK SUBMIT> Job #: 4 Task Name: dresden
    Sun Mar 6 20:59:09 1994: <TASK PID> Job #: 4 Task Host: baja

Windows interface, Queue Log window.
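Since the Queue Manager log is plain text, it can also be summarized outside the monitor form with standard UNIX tools. The sketch below reuses the line format shown above; the saved-log file name and its contents are hypothetical:

```shell
#!/bin/sh
# Count jobs begun and tasks submitted in a saved Queue Manager log.
# The log excerpt is invented, but mirrors the format shown above.
cat > quemgr_sample.log <<'EOF'
Sun Mar 6 17:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 1
Sun Mar 6 17:59:04 1994: <TASK SUBMIT> Job #: 1 Task Name: dresden
Sun Mar 6 18:59:00 1994: <JOB BEGIN> User: sciviz Delay: 0 Job #: 2
Sun Mar 6 18:59:04 1994: <TASK SUBMIT> Job #: 2 Task Name: dresden
EOF
jobs_begun=`grep -c '<JOB BEGIN>' quemgr_sample.log`
tasks_submitted=`grep -c '<TASK SUBMIT>' quemgr_sample.log`
echo "jobs begun: $jobs_begun, tasks submitted: $tasks_submitted"
rm -f quemgr_sample.log
```

The same grep patterns can be pointed at the live QueMgr.log for a quick activity check from the command line.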


Full Listing

When Full Listing is selected, the form appears as follows:

The Full Listing information shows all job tasks submitted. Information about each host/queue includes: status (blue = running; red = queued), job number, task name, task type, date and time submitted, and owner.

If an additional scheduler (LSF/Utopia) is present and being used, the queue name is shown along with a pointer to the actual queue name.

The update interval may be changed at any time during the use of any Monitor Queue options. The default interval time is 5 minutes.

To exit from the Monitor Queue, select the Close button on the bottom of the main form on the right.

[Figure: QUEUE MONITOR form, Full Listing view]

    Job #  Task Name  Task Type    Time Submitted        Owner
    atf_ibm
    7      small      MSC.Nastran  Mon Mar 7 8:38:27 94  atf_user
    8      medium     MSC.Nastran  Mon Mar 7 8:38:27 94  atf_user
    9      large      MSC.Nastran  Mon Mar 7 8:38:27 94  atf_user
    atf_sgi
    Empty
    atf_hp
    Empty
    atf_sun
    Empty

Note: There is no Full Listing form in the Windows interface.


CPU Loads

When CPU Loads is selected, the form appears as follows:

The load on the workstations and computers can be determined by inspecting this form which periodically updates itself. The list of hosts or queues appears with the percent CPU usage, total amount of free disk space, and available memory at that particular snapshot in time. The user may sort the hosts by CPU UTILIZATION, FREE DISK SPACE, or AVAILABLE MEMORY, so that the host or queue with the best situation appears at the top. Also, indicated in blue are the best hosts or queues for each category of CPU, disk space and memory.

[Figure: QUEUE MONITOR form, CPU Loads view, with CPU UTILIZATION / FREE DISK SPACE / AVAILABLE MEMORY sort buttons]

    Host     % CPU Util  Total Free Disk (mb)  Avail Mem (mb)
    atf_sgi  50          31                    39
    atf_hp   66          20                    14
    atf_sun  95          33                    99

Windows interface, CPU Loads window. Sort by pressing the headings in the spreadsheet.


MSC.Patran Analysis Manager User’s Guide

CHAPTER 6 Abort

■ Selecting a Job

■ Aborting a Job


6.1 Selecting a Job

This capability allows the user to terminate a running job originally submitted through MSC.Patran’s Analysis Manager. When aborting a job, the Analysis Manager cleans up all appropriate files.

The abort function can also be invoked manually from the system prompt. See Invoking the Analysis Manager Manually (p. 19) for details. A currently running job must be available.

[Figure: main form with Action: Abort, Object: Job, Group: default, showing the Job to Abort text box and the Active Job List]

The filename of the user's job appears in the Job to Abort text box. To abort this running job, simply press Apply. After a confirming question, the job is aborted.

The job must be active to be aborted. Only the user's own jobs may be aborted, not other users' jobs, although all jobs appear in the list.

If a different job is to be aborted, press Update Active List for a list of currently active jobs, for example:

    Job #  Name      Owner
    1      smalljob  atf_user1
    2      medjob    atf_user1
    3      bigjob    atf_user1
    4      myjob     atf_user2

Select the job to be aborted from those listed, then press Apply.

The Cancel button exits without changing any settings.


6.2 Aborting a Job

You can only abort jobs which you own (i.e., jobs originally submitted by you).

When a job is aborted, the analysis files are removed from where they were copied to, and all scratch and database files are removed, unless the job is a restart from a previous run, in which case the scratch files are removed, but the original database files from previous runs are left unaffected.

UNIX Interface

Press the Apply button on the main form with the Action set to Abort, as shown on the previous page. You are asked to confirm with:

Are you sure you wish to abort job # <jobname> ?

Press the OK button to confirm.

The Cancel button will take no action and close the Abort form.

Windows Interface

There are three ways to abort a job from the Windows interface.

1. When the job is initially submitted, a modal window appears asking whether you want to monitor or abort the job or simply do nothing and let the job run.

2. Once the job is running, from the Job Control tab in the Monitor tree structure; an Abort button on this form terminates the job.

Note: When a job is aborted from within MSC.Patran, no user interface appears. The job is simply aborted after the confirmation.

Abort the job by pressing this button once a job has been submitted.


3. From the Monitor | Running Jobs tree structure you can right mouse click on a running job. A pulldown menu appears from which you can select Abort.

Abort the job by right mouse clicking the running job in the tree structure.


MSC.Patran Analysis Manager User’s Guide

CHAPTER 7 System Management

■ Directory Structure

■ Analysis Manager Programs

■ Organization Environment Variables

■ Installation

■ X Resource Settings

■ Configuration Management Interface

■ Examples of Configuration Files

■ Starting the Queue/Remote Managers


7.1 Directory Structure

The Analysis Manager has a set directory structure, configurable environment variables, and other tunable parameters, which are discussed in this chapter.

The Analysis Manager directory structure is displayed below. The main installation directory is shown as an environment variable, $P3_HOME = <installation_directory>. Typically this would be /msc/patran200x or something similar.

where:

<org> (optional) is an additional organizational group and shares the same directory tree as default yet will have its own unique set of configuration files. See Organization Environment Variables (p. 94).

p3manager_files

bin

default

conf

log proj

QueMgrJobMgrP3MgrNasMgr

host.cfg

disk.cfg

lsf.cfg

QueMgr.log [User files]

<org>org.cfg

AbaMgrGenMgrAdmMgr

p3analysis_mgr<arch>

p3am_admin

nqs.cfg

.p3amusers

p3edit

QueMgr.rdb

RmtMgr

$P3_HOMEbin p3analysis_mgr

p3am_adminp3am_viewerQueMgr

exe

p3analysis_mgrp3am_adminp3am_viewerQueMgr

p3am_viewer

Job_Viewer

RmtMgr

RmtMgr

TxtMgr

MarMgr



<arch> is one of:

There may be more than one <arch> directory in a filesystem. Architecture types that are not applicable to your installation may be deleted to reduce disk space usage; however, all machine architecture types that will be accessed by the Analysis Manager must be kept. Each one of the executables under the bin directory is described in Analysis Manager Programs (p. 84).

All configuration files are explained in detail in Examples of Configuration Files (p. 136). These include org.cfg, host.cfg, disk.cfg, lsf.cfg, and nqs.cfg.

Organization groups and their uses are described in Organization Environment Variables (p. 94).

The QueMgr.log file is created when a Queue Manager daemon is started; it does not exist until then and, therefore, will not appear in the above directory structure until after the initial installation. Use of this file is described in Starting the Queue/Remote Managers (p. 144). The file QueMgr.rdb is also created when a Queue Manager daemon is started and is a database containing job-specific statistics for every job ever submitted through the Queue Manager for that particular set of configuration files (<org>). The contents of this file can be viewed on Unix platforms using the Job_Viewer executable.

Items in the bin and exe directories are scripts that enable easier access to the main programs. These scripts make sure that the proper environment variables are set before invoking the particular program residing in $P3_HOME/p3manager_files/bin/<arch>.

HP700 - Hewlett Packard HP-UX

RS6K - IBM RS/6000 AIX

SGI5 - Silicon Graphics IRIX

SUNS - Sun SPARC Solaris

LX86 - Linux (MSC or Red Hat)

WINNT - Windows 2000 or XP

p3analysis_mgr Invokes P3Mgr

p3am_admin Invokes AdmMgr (Unix only - on Windows this is P3Mgr.)

p3am_viewer Invokes Job_Viewer (Unix only)

QueMgr Invokes QueMgr (Unix only)

RmtMgr Invokes RmtMgr (Unix only)
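The scripts listed above must pick the correct <arch> subdirectory before invoking the real executable. A minimal sketch of that selection is shown below; this is an illustrative reconstruction (not the shipped scripts), and the uname-based mapping is an assumption layered on the architecture names given above.

```shell
#!/bin/sh
# Illustrative sketch only (not the shipped wrapper scripts): map the OS
# name reported by uname -s onto the Analysis Manager <arch> directory.
am_arch() {
    case "${1:-$(uname -s)}" in
        HP-UX)  echo HP700 ;;
        AIX)    echo RS6K ;;
        IRIX*)  echo SGI5 ;;
        SunOS)  echo SUNS ;;
        Linux)  echo LX86 ;;
        *)      echo "unsupported platform" >&2; return 1 ;;
    esac
}

# A wrapper script would then run something like:
#   exec "$P3_HOME/p3manager_files/bin/$(am_arch)/P3Mgr" "$@"
am_arch Linux
```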

Note: The directories (conf, log, proj) for each set of configuration files (organizational structure) must have read, write, and execute (777) permission for all users. Missing permissions can be the cause of task manager errors.
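The permission requirement in the note above can be applied with a short script. The sketch below uses a scratch directory standing in for the real $P3_HOME, and default can be replaced by any <org> name.

```shell
#!/bin/sh
# Sketch: give the conf, log and proj directories of an org the
# world read/write/execute (777) permission the Analysis Manager needs.
# A scratch directory stands in for $P3_HOME purely for illustration.
P3_HOME=$(mktemp -d)
ORG=default

mkdir -p "$P3_HOME/p3manager_files/$ORG/conf" \
         "$P3_HOME/p3manager_files/$ORG/log" \
         "$P3_HOME/p3manager_files/$ORG/proj"

for d in conf log proj; do
    chmod 777 "$P3_HOME/p3manager_files/$ORG/$d"
done
```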


7.2 Analysis Manager Programs

The Analysis Manager consists of two main parts, the user interface and a number of daemons. Each of these parts and their executables are described below. All executables are found in the $P3_HOME/p3manager_files/bin directory, where $P3_HOME is the installation directory, typically /msc/patran200x.

User Interface. The first part of the Analysis Manager is the user interface from which a user submits jobs and monitors their progress (P3Mgr is the executable name). This program can be executed in many different ways and from many different locations (i.e., either locally or remotely over a network). An administration tool is also available to easily set up and edit configuration files and to test for proper installation. (AdmMgr is the executable name on Unix. On Windows there is no separate executable; it is part of P3Mgr.) A small editor program (p3edit) is also part of the user interface portion and is invoked directly from the main user interface when editing and viewing files.

Two shell scripts are actually used to invoke the Analysis Manager and the administration tool. These are p3analysis_mgr and p3am_admin. When properly installed, these scripts automatically determine the installation path directory structure and which machine architecture executable to use.

Daemons. The second part of the Analysis Manager is a series of daemons (or services on Windows) which actually execute and control jobs. These daemons are responsible for queuing jobs, finding a host to run jobs, moving data files to selected hosts, executing the selected analysis code, etc. Each one is described here:

Queue Manager. This is a daemon (or service on Windows) which must run all the time (QueMgr executable name). The machine on which the Queue Manager runs is known as the master host. Generally it runs as root (or administrator) and is responsible for scheduling jobs. The Queue Manager always has a complete account of all jobs running and/or queued. When a request to run a job is received, the Queue Manager checks which hosts are eligible to run the selected code and how many jobs each host is currently running. If there is an eligible host, the Queue Manager starts the task on that host. If the Analysis Manager is installed along with a third-party scheduling program (i.e., LSF or NQS), the Queue Manager is responsible for communicating with the scheduling software to control job execution. In summary, the Queue Manager is the scheduler of the Analysis Manager environment. (Also see Starting the Queue/Remote Managers (p. 144), Starting the Queue Manager.)

Remote Manager. There is only one Queue Manager, but there are many Remote Managers. A RmtMgr process runs on each and every analysis machine, that is, each machine configured to run an analysis such as MSC.Nastran or MSC.Marc. A RmtMgr can also be run on each submit machine (recommended - see Job Manager below), that is, each machine from which an analysis is submitted, such as where MSC.Patran runs. If the submit and analysis machines are the same host, then only one RmtMgr needs to be running. The QueMgr and RmtMgr processes start up automatically at boot time and run continuously, but they use very little memory and CPU, so users will not notice any performance effect. These processes can run as root (Administrator on Windows) or, if those privileges are not available, as any user.



Each RmtMgr binds to a known/chosen port number that is the same on every RmtMgr machine. Each RmtMgr process collects machine statistics on free CPU cycles, free memory, and free disk space and returns this data to the QueMgr at frequent intervals. The RmtMgr is also used to perform a command and return that command's output on the host on which it is running; this is essentially a remote shell (rsh <host> <command>) as on a Unix machine.

Job Manager. The Job Manager (JobMgr executable name) runs for the life of a job. When a user submits a job using the Analysis Manager, the user interface tells the Queue Manager about the job and then starts a Job Manager daemon. The Job Manager daemon receives and saves job information from the Analysis Manager's user interface. The main purpose of the Job Manager is to record job status for monitoring and file transfer.

During the execution of jobs, users utilizing the Analysis Manager's user interface program can seamlessly connect to the Job Manager of their job and see what the status of the job is. In summary, the Job Manager controls the execution of a single job and is always aware of the current status of that job. The Job Manager runs on the submit host machine.

MSC.Nastran Manager. The MSC.Nastran Manager (NasMgr executable name) runs only for the life of a job. The MSC.Nastran Manager is started by the Queue Manager when the task reaches the top of its queue and is eligible to run. The purpose of the MSC.Nastran Manager is to run the MSC.Nastran job. When the NasMgr first comes up, it generates FMS (if necessary), checks to see if there is enough disk space, etc. The NasMgr will make sure it has all of the files it needs for the job. If not, it will obtain them. Finally, the MSC.Nastran job is started.

During execution, the NasMgr relays pertinent information (disk usage, cpu, etc.) to the Job Manager (JobMgr), which then updates the graphical information displayed to the user. The NasMgr is also responsible for cleaning up files and putting results back to desired locations, as well as reporting its status to the Job Manager. This daemon runs on the analysis host machine and only for the life of the analysis.

MSC.Marc Manager. The MSC.Marc Manager (MarMgr executable name) runs only for the life of a job. The MarMgr is identical in function to the MSC.Nastran Manager (NasMgr) except it is for execution of MSC.Marc analyses.

ABAQUS Manager. The ABAQUS Manager (AbaMgr executable name) runs only for the life of a job. The AbaMgr is identical in function to the MSC.Nastran Manager (NasMgr) except it is for execution of ABAQUS analyses.

Note: It is best to run the RmtMgr service on Windows as someone other than SYSTEM (the default if you do not do anything different). After installing the RmtMgr, use the control panel to access the services, find the RmtMgr, and change its startup to use a different account - something generic if it exists, or an Analysis Manager admin account. If the RmtMgr is running as a user and not SYSTEM, then the NasMgr/MarMgr/AbaMgr/GenMgr will run as this user and have access to Windows networking, shared drives, and so on. If it is run as SYSTEM, then it is limited to local Windows drives and shares only. The QueMgr does not do much in the way of files, so running it as SYSTEM is OK.

Note: On Windows, if a RmtMgr is running on the local machine, the JobMgr will be started through it as usual; if a RmtMgr is NOT running, a JobMgr will be started anyway, and the submittal will still work fine. The only restriction in the latter case is that if the user logs off, a popup dialog appears asking whether the user really wants to log off, and the job will be terminated if they do. This does not happen if the RmtMgr is running as a service.


General Manager. The General Manager (GenMgr executable name) runs only for the life of a job. The GenMgr is identical in function to the MSC.Nastran Manager (NasMgr) except it is for execution of general analysis applications.

Editor. The editor (p3edit executable name) runs when requested from P3Mgr when viewing results files or editing the input deck.

Text Manager. The Text Manager (TxtMgr executable name) is a text based interface to the Analysis Manager to illustrate the Analysis Manager API. See Application Procedural Interface (API) (Ch. B).

Job Viewer. The job viewer (Job_Viewer executable name) is a simple program, available on UNIX platforms, for opening and viewing job statistics from the Analysis Manager's database file. This file is generally located in $P3_HOME/p3manager_files/default/log/QueMgr.rdb. You must run Job_Viewer and then open the file manually.

Analysis Manager Program Startup Arguments

AbaMgr, NasMgr, MarMgr, GenMgr

Started automatically by QueMgr (or NQS/LSF); no command line arguments.

JobMgr

Started automatically by P3Mgr/TxtMgr (or RmtMgr); no command line arguments.

RmtMgr

This is a daemon on Unix (or a service on Windows) and is started automatically at boot time. Possible command line arguments (also see Organization Environment Variables (p. 94)):

Argument Description

-version Just prints the Analysis Manager version and exits.

-ultima Switch to change P3_HOME to AM_HOME and p3manager_files to analysis_manager, so that no p3 is required in the environment. (Generally not used!)

-port <####> Port number to use. It MUST be the SAME port number for ALL RmtMgr's for the whole network (per QueMgr). The default is 1800 if not set.

-path <path> Use to specify the base path for finding the Analysis Manager executables: $P3_HOME/p3manager_files/bin/{arch}/*Mgr. <path> is the base path $P3_HOME. The default is to use the same path the program was started with, but in the case of "./RmtMgr ...." that will not work. If a full path is used to start RmtMgr (as in a startup script), then this argument is not needed.
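Putting these arguments together, a boot-time startup line might look like the following sketch; the installation path and the SUNS architecture directory are illustrative placeholders, not a prescribed location.

```shell
# Illustrative boot-script fragment (path and arch are site-specific):
# start RmtMgr on the default port, telling it where the executables live.
/msc/patran200x/p3manager_files/bin/SUNS/RmtMgr -port 1800 -path /msc/patran200x &
```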


-orgpath <path> Use to specify the base path for finding the Analysis Manager org tree (configuration files and directories): $P3_HOME/p3manager_files/{org}/{conf,log,proj}. <path> is the base path $P3_HOME. Use this to specify the base path of the org tree only if it is different from the -path argument. RmtMgr writes files in the proj/{projectname} directories, so if the default location (same as -path above) is not the desired location, this argument needs to be set.

-name <name> Windows only. Use if you want to run more than one RmtMgr service. Each must have a unique name so the start/stop method can distinguish which one to work with. Default <name> is MSCRmtMgr.

QueMgr (AdmMgr)

This is a daemon on Unix (or a service on Windows) and is started automatically at boot time. Possible command line arguments (also see Organization Environment Variables (p. 94)):

Note: On Unix, the AdmMgr (p3am_admin) accepts the same arguments as QueMgr.

Argument Description

-version Just prints the Analysis Manager version and exits.

-ultima Switch to change P3_HOME to AM_HOME and p3manager_files to analysis_manager, so that no p3 is required in the environment. (Generally not used!)

-port <####> Port number to use. The default is 1900 if not set. If using an org.cfg file, use this argument with the -org option below to force a port number and org name.

-path <path> Use to specify the base path for finding the Analysis Manager executables: $P3_HOME/p3manager_files/bin/{arch}/*Mgr. <path> is the base path $P3_HOME. The default is to use the same path the program was started with, but in the case of "./QueMgr ...." that will not work. If a full path is used to start QueMgr (as in a startup script), then this argument is not needed.

-orgpath <path> Use to specify the base path for finding the Analysis Manager org tree (configuration files and directories): $P3_HOME/p3manager_files/{org}/{conf,log,proj}. <path> is the base path $P3_HOME. Use this to specify the base path of the org tree only if it is different from the -path argument. RmtMgr writes files in the proj/{projectname} directories, so if the default location (same as -path above) is not the desired location, this argument needs to be set.

-rmtmgrport <####> The port number to use for ALL RmtMgr's that this QueMgr will connect to for the entire network. The default is 1800 (the default RmtMgr -port value) if not set.

-rmgrport <####> Same as -rmtmgrport above.

-org <org> org name to use. This is the name of the directory containing the configuration files for this Queue Manager daemon (i.e., $P3_HOME/p3manager_files/{org}/{conf,log,proj}). The default is default. If using an org.cfg file, use this with the -port option above to force a port number and org name.

-delayint <###> Default is 20 seconds. This is rarely used. Every delay_interval seconds the QueMgr asks another host in its list of all job hosts for a status. If it has not heard back from a host in (delay_interval * number_of_hosts * 3) + 30 seconds, then approximately 3 round trips through the host list have passed without a response, so the QueMgr marks the host as DOWN and will not submit new jobs to it until it starts responding again. This flag modifies that interval to account for network problems, etc., which may otherwise cause the Analysis Manager to think some hosts are down when they are not.

-name <name> Windows only. Use if you want to run more than one QueMgr service. Each must have a unique name so the start/stop method can distinguish which one to work with. Default <name> is MSCQueMgr.

P3Mgr

This program is started by the user. If 4 arguments are present, then it is assumed that:

Argument Description

arg 1 Startup type. It is one of the following:
1 - Start Up Full Interface.
2 - Start Up Queue Monitor Now.
3 - Start Up Abort Job Now.
4 - Start Up Monitor Running Job Now.
5 - Start Up Monitor Completed Job Now.
6 - Start Up Submit Now. (Submit current job)
7 - Start Up Submit Quiet. (Submit current job without GUI)
8 - Start Up Submit Quiet and wait for job to complete. (With exit status)

arg 2 Extension of the job input file.

arg 3 Job name (with optional path).

arg 4 Application type (integer):
1 - MSC.Nastran
2 - ABAQUS
3 - MSC.Marc
20 through 29 - General (user-defined applications)

If 4 more arguments are specified (Unix only), then it is assumed that:

Argument Description

arg 5 X position of the upper left corner of the MSC.Patran right-hand-side interface, in inches.

arg 6 Y position of the upper left corner of the MSC.Patran right-hand-side interface, in inches.

arg 7 Width of the MSC.Patran right-hand-side interface, in inches.

arg 8 Height of the MSC.Patran right-hand-side interface, in inches.

The following arguments can be used alone or after the first 4 arguments above:

Argument Description

-rcf <file> rcf file to use for all GUI settings (same format as the -env/-envall output) - see Analysis Manager Environment File (p. 91).

-auth <file> License file to use. The environment variable MSC_LICENSE_FILE is the default. This can also point to a port as well as a physical license file (with path), e.g., -auth 1700@banff.

-env Prints the rcf / GUI settings for all applications.

-envall Same as -env but even more information is printed.

-extra <args> Adds extra arguments on the end of a particular command line.

-runtype <#> ABAQUS ONLY - set run type to:
0 - full analysis
1 - restart
2 - data check

-restart <file> ABAQUS ONLY - coldstart filename for restart.

-coldstart <file> MSC.Nastran ONLY - coldstart filename for restart. MSC.Marc uses the rcfile - see Analysis Manager Environment File (p. 91).

TxtMgr

This program is started by the user to manage jobs through a simple text submittal program. Possible arguments are:

Argument Description

-version Same as RmtMgr.

-qmgrhost <hostname> Hostname the QueMgr is running on. The default is the local host if no org.cfg is found.

-qmgrport <####> Port the QueMgr is running on. The default is 1900 if no org.cfg is found.


-rmgrport <####> Port for ALL RmtMgr's for this org (QueMgr). Not needed unless using the Admin test feature and the default RmtMgr port is not being used.

-org <org> org to use. Default is default.

-orgpath <path> Same as RmtMgr. Needed for writing configuration files and/or Admin tests if it is not the default path (default is $P3_HOME).

-auth <file> License file to use. Environment variable MSC_LICENSE_FILE is the default.

-app <name> Application name to use. Default is MSC.Nastran (or first valid app).

-rcf <file> rcf file to use for all GUI settings (same format as the -env/-envall output) - see Analysis Manager Environment File (p. 91).

-p3home <path> Switch to use if $P3_HOME environment variable is not set.

-amhome <path> Switch to use if $AM_HOME environment variable is not set.

-choice <#> Startup option if not the full menu:
1) submit a job
2) abort a job
3) monitor a job
4) show QueMgr log file
5) show QueMgr jobs/queues
6) show QueMgr cpu/mem/disk
7) list completed jobs
8) write rcfile settings
9) admin test
10) admin reconfig QueMgr

-env Prints the rcf / GUI settings for all applications.

-envall Same as -env but even more info printed.

-envf <file> Write env settings to file specified.

-envfall <file> Same as -envf but even more info written.

-nocon Do not attempt to connect to a QueMgr. Useful for when one is not running and you want to test the Admin configuration files, etc.




Analysis Manager Environment File

The -env and -envall arguments to some of the above programs (P3Mgr in particular) list the environment settings used in the Analysis Manager. The environment can be set by reading a particular file with the -rcfile argument. Default values of this environment are found in the .p3mgrrc file, which is stored in the user's home directory when any settings are saved from within the Analysis Manager. Almost all of the widgets in the P3Mgr user interface can be set by reading an rcfile. When MSC.Marc jobs are submitted via MSC.Patran, additional parameters such as the restart filename, the number of domains and host information for parallel processing, and other information are passed to the Analysis Manager via this rcfile. There is an entry in the rcfile for each widget in the user interface. A list of these entries is given below. Notice that the configuration information is also listed in the rcfile. Configuration information is explained in Configuration Management Interface (p. 104) and Examples of Configuration Files (p. 136).

## rc file ---
#
cfg.total_h_list[0].host_name = tavarua
cfg.total_h_list[0].arch = HP700
cfg.total_h_list[0].maxtasks = 2
cfg.total_h_list[0].num_apps = 3
cfg.total_h_list[0].sub_app[MSC.Nastran].pseudohost_name = tavarua_nast2001
cfg.total_h_list[0].sub_app[MSC.Nastran].exepath = /solvers/nast2001/bin/nast2001
cfg.total_h_list[0].sub_app[MSC.Nastran].rcpath = /solvers/nast2001/conf/nast2001rc
cfg.total_h_list[0].sub_app[ABAQUS].pseudohost_name = tavarua_aba62
cfg.total_h_list[0].sub_app[ABAQUS].exepath = /solvers/hks/Commands/abaqus
cfg.total_h_list[0].sub_app[ABAQUS].rcpath = /solvers/hks/6.2-1/site/abaqus_v6.env
cfg.total_h_list[0].sub_app[MSC.Marc].pseudohost_name = tavarua_marc2001
cfg.total_h_list[0].sub_app[MSC.Marc].exepath = /solvers/marc2001/tools/run_marc
cfg.total_h_list[0].sub_app[MSC.Marc].rcpath = /solvers/marc2001/tools/include
cfg.total_h_list[1].host_name = salani
cfg.total_h_list[1].arch = WINNT
cfg.total_h_list[1].maxtasks = 2
cfg.total_h_list[1].num_apps = 3
cfg.total_h_list[1].sub_app[MSC.Nastran].pseudohost_name = salani_nast2001
cfg.total_h_list[1].sub_app[MSC.Nastran].exepath = d:\msc\bin\nast2001.exe
cfg.total_h_list[1].sub_app[MSC.Nastran].rcpath = d:\msc\conf\nast2001.rcf
cfg.total_h_list[1].sub_app[ABAQUS].pseudohost_name = salani_aba62
cfg.total_h_list[1].sub_app[ABAQUS].exepath = d:\hks\Commands\abq621.bat
cfg.total_h_list[1].sub_app[ABAQUS].rcpath = d:\hks\6.2-1\site\abaqus_v6.env
cfg.total_h_list[1].sub_app[MSC.Marc].pseudohost_name = salani_marc2001
cfg.total_h_list[1].sub_app[MSC.Marc].exepath = d:\msc\marc2001\tools\run_marc.bat
cfg.total_h_list[1].sub_app[MSC.Marc].rcpath = d:\msc\marc2001\tools\include.bat
#
unv_config.auto_mon_flag = 1
unv_config.time_type = 0
unv_config.delay_hour = 0
unv_config.delay_min = 0
unv_config.specific_hour = 0
unv_config.specific_min = 0
unv_config.specific_day = 0
unv_config.mail_on_off = 0
unv_config.mon_file_flag = 1
unv_config.copy_link_flag = 0
unv_config.job_max_time = 0
unv_config.project_name = user1
unv_config.orig_pre_prog =


unv_config.orig_pos_prog =
unv_config.exec_pre_prog =
unv_config.exec_pos_prog =
unv_config.separate_user = user1
unv_config.p3db_file =
unv_config.email_addr = empty
#
nas_config.disk_master = 0
nas_config.disk_dball = 0
nas_config.disk_scratch = 0
nas_config.disk_units = 2
nas_config.scr_run_flag = 1
nas_config.save_db_flag = 0
nas_config.copy_db_flag = 0
nas_config.mem_req = 0
nas_config.mem_units = 0
nas_config.smem_units = 0
nas_config.extra_arg =
nas_config.num_hosts = 2
nas_host[tavarua.scm.na.mscsoftware.com].mem = 0
nas_host[tavarua.scm.na.mscsoftware.com].smem = 0
nas_host[tavarua.scm.na.mscsoftware.com].num_cpus = 0
nas_host[lalati.scm.na.mscsoftware.com].mem = 0
nas_host[lalati.scm.na.mscsoftware.com].smem = 0
nas_host[lalati.scm.na.mscsoftware.com].num_cpus = 0
nas_config.default_host = tavarua_nast2001
nas_config.default_queue = N/A
nas_submit.restart_type = 0
nas_submit.restart = 0
nas_submit.modfms = 1
nas_submit.nas_input_deck =
nas_submit.cold_jobname =
#
aba_config.copy_res_file = 1
aba_config.save_res_file = 0
aba_config.mem_req = 0
aba_config.mem_units = 0
aba_config.disk_units = 2
aba_config.space_req = 0
aba_config.append_fil = 0
aba_config.user_sub =
aba_config.use_standard = 1
aba_config.extra_arg =
aba_config.num_hosts = 2
aba_host[tavarua.scm.na.mscsoftware.com].num_cpus = 1
aba_host[tavarua.scm.na.mscsoftware.com].pre_buf = 0
aba_host[tavarua.scm.na.mscsoftware.com].pre_mem = 0
aba_host[tavarua.scm.na.mscsoftware.com].main_buf = 0
aba_host[tavarua.scm.na.mscsoftware.com].main_mem = 0
aba_host[lalati.scm.na.mscsoftware.com].num_cpus = 1
aba_host[lalati.scm.na.mscsoftware.com].pre_buf = 0
aba_host[lalati.scm.na.mscsoftware.com].pre_mem = 0
aba_host[lalati.scm.na.mscsoftware.com].main_buf = 0
aba_host[lalati.scm.na.mscsoftware.com].main_mem = 0
aba_config.default_host = tavarua_aba62
aba_config.default_queue = N/A
aba_submit.restart = 0
aba_submit.aba_input_deck =
aba_submit.restart_file =
#
mar_config.disk_units = 2
mar_config.space_req = 0
mar_config.mem_req = 0
mar_config.mem_units = 2
mar_config.translate_input = 1
mar_config.num_hosts = 2
mar_host[tavarua.scm.na.mscsoftware.com].num_cpus = 1



mar_host[lalati.scm.na.mscsoftware.com].num_cpus = 1
mar_config.default_host = tavarua_marc2001
mar_config.default_queue = N/A
mar_config.cmd_line =
mar_config.mon_file = $JOBNAME.sts
mar_submit.save = 0
mar_submit.nprocd = 0
mar_submit.datfile_name =
mar_submit.restart_name =
mar_submit.post_name =
mar_submit.program_name =
mar_submit.user_subroutine_name =
mar_submit.viewfactor =
mar_submit.hostfile =
mar_submit.iamval =
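Since each entry is a plain name = value line, a saved rcfile can be inspected or scripted against with ordinary text tools. The sketch below writes two sample entries (values illustrative, taken from the listing above) to a scratch file and pulls one back out.

```shell
#!/bin/sh
# Sketch: read a single setting back out of a saved Analysis Manager rcfile.
# The scratch file and its two entries exist only for illustration.
rcfile=$(mktemp)
cat > "$rcfile" <<'EOF'
unv_config.project_name = user1
mar_config.mon_file = $JOBNAME.sts
EOF

# Extract the value of one entry (the text after "= ").
project=$(sed -n 's/^unv_config\.project_name = //p' "$rcfile")
echo "$project"
```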


7.3 Organization Environment Variables

P3_HOME & P3_PLATFORM. These two environment variables are set automatically for the user when MSC.Patran and/or the Analysis Manager is invoked, if the proper installation has been performed. The P3_HOME variable references the actual installation directory and P3_PLATFORM references the machine architecture. The architecture can be one of the following:

HP700 - Hewlett Packard HP-UX

RS6K - IBM RS/6000 AIX

SGI5 - Silicon Graphics IRIX

SUNS - Sun SPARC Solaris

LX86 - Linux (MSC or Red Hat)

WINNT - Windows 2000 or XP

If necessary, these variables can be set in the following manner with the C shell:

setenv P3_HOME /msc/patran200x
setenv P3_PLATFORM HP700

or for Bourne shell or Korn shell users:

P3_HOME=/msc/patran200x
P3_PLATFORM=HP700
export P3_HOME
export P3_PLATFORM

or on Windows:

set P3_HOME=c:/msc/patran200x
set P3_PLATFORM=WINNT

In most instances, users never have to concern themselves with these environment variables, but they are included here for completeness. In a typical MSC.Patran installation, a file called .wrapper exists in the $P3_HOME/bin directory which automatically determines these environment variables. The invoking scripts, p3analysis_mgr and p3am_admin, exist as pointers to .wrapper in this bin directory; when executed, .wrapper determines the variable values and then executes the actual scripts. For this to work conveniently, the user should have $P3_HOME/bin in his/her path; otherwise, the entire path name must be used when invoking the programs.

P3_ORG. It may be desirable to have multiple Queue Managers running (groups of systems for the Analysis Manager to use), each with a separate organizational directory for Analysis Manager configuration files. An optional environment variable, P3_ORG, may be set for each user to specify a separate named organizational directory. If defined, the Analysis Manager will use it for accessing its required configuration files and thus connect to the Queue Manager specified by P3_ORG.

If not defined, the default organizational directory of default is used (i.e., $P3_HOME/p3manager_files/default).

For example, suppose a site has many computers with MSC.Nastran installed, yet access to each is to be limited to certain engineering groups. Each group expects to be able to submit only to its own computers and is not permitted to see a choice to submit to the other groups' machines. To solve this problem, set up two or more different Analysis Manager organizational groups and start a separate Queue Manager for each. The configuration files (host.cfg and disk.cfg) for the first set of machines are located in the $P3_HOME/p3manager_files/default/conf



directory. Now, to handle another group’s machines, create another directory structure under $P3_HOME/p3manager_files called groupb. A directory $P3_HOME/p3manager_files/groupb should be created with subdirectories identical to the $P3_HOME/p3manager_files/default tree. The easiest way to create the new organization is to just copy the default organization tree:

cp -r $P3_HOME/p3manager_files/default $P3_HOME/p3manager_files/groupb

Now, the configuration files in the $P3_HOME/p3manager_files/groupb/conf directory can be edited to set up the new group of machines. When more than one organization is defined, there will be one Queue Manager (QueMgr) running for each organizational group. When starting the Queue Manager, the -org argument must be used for organizations other than default.

In this example, the Queue Manager for the groupb organization would be started as follows:

QueMgr $P3_HOME -org groupb

Where $P3_HOME is the directory where the installation is located, say /msc/patran200x.
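The clone step above can be sketched end-to-end in a scratch directory standing in for the real $P3_HOME; the file names follow the conf directory contents described earlier, and only the copy itself is exercised (starting the new Queue Manager is left as a comment).

```shell
#!/bin/sh
# Sketch: clone the default org tree to create a new "groupb" org.
# A scratch directory stands in for the real $P3_HOME installation.
P3_HOME=$(mktemp -d)
mkdir -p "$P3_HOME/p3manager_files/default/conf" \
         "$P3_HOME/p3manager_files/default/log" \
         "$P3_HOME/p3manager_files/default/proj"
touch "$P3_HOME/p3manager_files/default/conf/host.cfg" \
      "$P3_HOME/p3manager_files/default/conf/disk.cfg"

# Copy the whole default tree; groupb then has its own conf/log/proj.
cp -r "$P3_HOME/p3manager_files/default" "$P3_HOME/p3manager_files/groupb"

# Next (not run here): edit groupb/conf/host.cfg and disk.cfg, then start
# a second Queue Manager with:  QueMgr $P3_HOME -org groupb
```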

Once the organizations are created, the configuration files edited and the Queue Managers started, users can utilize the P3_ORG environment variable to specify which Queue Manager to communicate with. Therefore, users wishing to use the newer version of MSC.Nastran will set their P3_ORG environment variable to groupb and users wishing to use the normal version of MSC.Nastran will not set the environment variable at all.

setenv P3_ORG groupb

or for Bourne shell or Korn shell users:

P3_ORG=groupb
export P3_ORG

or on Windows:

set P3_ORG=groupb

It is also possible to dynamically change organizational groups without setting this environment variable. If different organizational groups need to be created, but access should be given to all users, a configuration file called org.cfg can be created and placed in the $P3_HOME/p3manager_files directory. This allows users to change organizational group directories from the Analysis Manager user interface. The form of this configuration file is described in Examples of Configuration Files (p. 136). If this configuration file is used to allow users to change groups on the fly from the user interface, then each QueMgr must be started with a unique port ID using the -port parameter, e.g.,

QueMgr $P3_HOME -port 1501 -org groupb

Page 100: CONTENTSgc.nuaa.edu.cn/hangkong/doc/ziliao/MSC_PATRAN/MSC.Patran Analys… · 1.1 Purpose MSC.Nastran, MSC.Marc, and MSC.Patran are analysis software systems developed and maintained

P3_PORT and P3_MASTER. The Analysis Manager is actually very flexible in the manner in which it can be installed and accessed. This is to take into account the variation in networks that exist from company to company or even from department to department. These two environment variables exist to allow flexibility in more restrictive networking environments.

Say, for example, that there are 50 machines, all with their own MSC.Patran installations, yet none of them are NFS mounted to each other or to the master node where the QueMgr daemon is running. A difficult way to solve this problem is to make sure the same configuration files exist on all 50 machines in the same directory tree structure, including the master node. If a change has to be made to the configuration file, then all 50 machines and the master node will have to be updated. Since the QueMgr is the only program that needs to read the configuration files, an easier solution would be for only the master node to keep an up-to-date set of configuration files and have the users of the other 50 machines set the P3_PORT and P3_MASTER environment variables to reference the port and machine of the master node. For example,

setenv P3_PORT 1501
setenv P3_MASTER bangkok

or for Bourne shell or Korn shell users:

P3_PORT=1501
P3_MASTER=bangkok
export P3_PORT
export P3_MASTER

or on Windows:

set P3_PORT=1501
set P3_MASTER=bangkok

where 1501 is the port number that QueMgr is using and bangkok is the master node’s computer hostname.

The Analysis Manager works in the following manner:

1. First a check is made to see if P3_PORT and P3_MASTER have been set. If they have, the information is used to communicate with the master host and only the organizational group specified either through P3_ORG or the default organizational group will appear.

2. If P3_PORT and P3_MASTER have not been set, then the org.cfg configuration file is read which contains the master nodes and port numbers for all organizational groups that have been created. If P3_ORG is also specified, then that organizational group will appear as the default; however, all other organizational groups will still be accessible.

3. If neither P3_PORT nor P3_MASTER is set and the org.cfg file does not exist, then the defaults are used. The default host is the machine on which the Analysis Manager was started. The default port is 1900, and the default configuration files (organization) are in the default directory. Multiple organizational groups will not be accessible; however, the P3_ORG variable can be set to change organizational groups each time the Analysis Manager is invoked. With this last method, the user or the system administrator that starts the QueMgr never needs to worry about assigning unique port numbers; however, it is one of the most restrictive installations and methods of access.
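The documented resolution order and defaults can be sketched as a small shell fragment. This is an illustrative model of the behavior described above, not the actual P3Mgr source; the resolve_master helper is hypothetical.

```shell
# Hypothetical helper mirroring the documented defaults.
# Arguments are the P3_MASTER, P3_PORT, and P3_ORG values (empty = unset).
resolve_master() {
    master="${1:-$(hostname)}"   # default host: the machine AM was started on
    port="${2:-1900}"            # default port: 1900
    org="${3:-default}"          # default organization directory
    echo "$master:$port org=$org"
}

resolve_master "$P3_MASTER" "$P3_PORT" "$P3_ORG"
```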

CHAPTER 7  System Management

MSC_RMTMGR_ARGS and MSC_QUEMGR_ARGS. The RmtMgr and QueMgr will also read the environment variables MSC_RMTMGR_ARGS and MSC_QUEMGR_ARGS, respectively, for their arguments. Each is one big string, as in these C shell settings:

setenv MSC_RMTMGR_ARGS "-port 1850 -path /msc/patran200x"
setenv MSC_QUEMGR_ARGS "-port 1950 -path /msc/patran200x"

or for Bourne shell or Korn shell users:

MSC_RMTMGR_ARGS="-port 1850 -path /msc/patran200x"
MSC_QUEMGR_ARGS="-port 1950 -path /msc/patran200x"
export MSC_RMTMGR_ARGS
export MSC_QUEMGR_ARGS

or on Windows:

set MSC_RMTMGR_ARGS="-port 1850 -path /msc/patran200x"
set MSC_QUEMGR_ARGS="-port 1950 -path /msc/patran200x"

When RmtMgr and/or QueMgr start, they check these variables and take their arguments from these strings. Actual command-line arguments override these values if both are set.
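The precedence rule can be illustrated with a short sketch. This is assumed behavior modeled on the description above, not the actual QueMgr source; the resolve_quemgr_args helper is hypothetical.

```shell
MSC_QUEMGR_ARGS="-port 1950 -path /msc/patran200x"

# Hypothetical sketch of the argument resolution: use real command-line
# arguments if present, otherwise fall back to the environment string.
resolve_quemgr_args() {
    if [ $# -gt 0 ]; then
        echo "$@"
    else
        echo "$MSC_QUEMGR_ARGS"
    fi
}

resolve_quemgr_args -port 1850   # explicit command line wins
resolve_quemgr_args              # falls back to MSC_QUEMGR_ARGS
```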

This method is needed on Windows because there is currently no way to save the startup arguments for a service; on reboot, the RmtMgr would not otherwise know its startup arguments, and would have to read them from a file or from an environment string. The one limitation is that if two RmtMgrs are running on the same machine, there is no way to give each one different arguments.

Note: On Windows these variables should be set under the System Control Panel so that on reboot, the RmtMgr and QueMgr start up with these arguments. You can check the Event Viewer (under Administrative Tools in the Control Panel) to verify proper startup.


AM_CMD_STATUS and AM_JOB_STATUS. In addition, at the end of a job these two environment variables get set.

AM_CMD_STATUS is set on the executing host after the job has completed there, with the exit status of the command. This can be used by a post-script on the execute host to take different actions based on the exit status of the job. You must know the exit status conventions of the application you are running to know which values are good, which are bad, and whether there are any other possible codes and meanings.

AM_JOB_STATUS is set on the submit host at the end of the job, after all the files have been transferred, and can be used by a post program on the submit host for the same reasons. The possible values are 0, 1, and 2, where 0 means success, 1 means abort, and 2 means failure of any kind.
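A post program on the submit host might branch on AM_JOB_STATUS as in this minimal sketch. The describe_status helper and its messages are illustrative, not part of the product; only the 0/1/2 codes come from the description above.

```shell
# Map the documented AM_JOB_STATUS codes to human-readable outcomes.
describe_status() {
    case "$1" in
        0) echo "job completed successfully" ;;
        1) echo "job was aborted" ;;
        2) echo "job failed" ;;
        *) echo "unknown status: $1" ;;
    esac
}

# A real post program would read the variable set by the Analysis Manager:
describe_status "${AM_JOB_STATUS:-2}"
```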


7.4 Installation

Installation Requirements

The following definitions apply to this section:

1. The master host is the machine which continually runs the Analysis Manager daemon (called QueMgr). This is also referred to as the master node.

2. The submit host is the machine from which the analysis is submitted; it is also sometimes referred to as the client.

3. The analysis host is the machine which actually executes the analysis.

Below is an itemized list of installation requirements:

1. One master node must be chosen for each organizational group, i.e. for each Queue Manager that will be running (typical installations have only one).

2. The Queue Manager (QueMgr) should run as root on the master node. This is not a strict requirement, but it is recommended on Unix. On Windows it can run as a user or as administrator.

3. Each node (submit and analysis hosts) in the Analysis Manager configuration must be reachable to and from the master node via a TCP/IP network.

4. Each analysis host must have a Remote Manager (RmtMgr) running with the same port number (for each QueMgr). It is recommended that each submit machine run one as well (especially on Windows), although this is not a strict requirement. (RmtMgr takes the place of the rsh (remsh) remote access capabilities used in older versions of the Analysis Manager.)

5. The Analysis Manager software is installed from the installation media onto any machine (master, submit, or analysis host) under the $P3_HOME/p3manager_files directory. The $P3_HOME variable is the installation directory, typically /msc/patran200x, and is usually defined as an environment variable. The p3manager_files directory and its tree must exist as-is and must not be renamed.

6. Each analysis host machine in the Analysis Manager configuration must be able to identically see the installation tree. If a RmtMgr is running this is not an issue because the RmtMgr knows where the Analysis Manager executables are.

7. The root user should run the administration program (p3am_admin (AdmMgr)) on the master node to test and ensure that new users can correctly access the Analysis Manager. See Configuration Management Interface (p. 104).

Each user wishing to use the Analysis Manager must meet the following requirements:

1. Users who are using the Analysis Manager should have the same login name, user id, and group id on all hosts/nodes in the Analysis Manager configuration. This will prevent file access problems. In specific cases, users may run jobs under accounts other than their own, but this must be set up by the system administrator. This is described in Examples of Configuration Files (p. 136).

2. Users must have uname in their default search path (path or PATH environment variable in the user's .cshrc or .profile file).
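The per-user requirements above can be checked with a short script. This is a hypothetical pre-flight check, not part of the product; run it on every host in the configuration and compare the output.

```shell
#!/bin/sh
# Hypothetical pre-flight check for the per-user requirements above.
# uname must be in the default search path; the login name, uid, and gid
# printed here should match on every host in the configuration.
if command -v uname >/dev/null 2>&1; then
    uname_ok="yes"
else
    uname_ok="no"
fi
echo "uname in PATH: $uname_ok"
echo "login: $(id -un)  uid: $(id -u)  gid: $(id -g)"
```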


Installation Instructions

1. Unload the p3manager_files directory from the installation media. (Consult the Installation guide for more information on how this is done.)

2. Decide on a master node (typically the node the MSC.Patran software is located on), and login to that node as root.

3. Decide which machine(s), having MSC.Nastran, MSC.Marc, ABAQUS, or other analysis modules to be used, will be included in the Analysis Manager’s configuration. Find out where each runtime script and/or configuration file is located on each machine (e.g. /msc/bin/nast200x and /msc/conf/nast200xrc for MSC.Nastran). Only these machines will be enabled for later job submission, monitoring, and management.

4. Each analysis host machine that will be configured to run an analysis code must be able to see the p3manager_files directory structure as outlined in Directory Structure (p. 82). This directory structure must also exist on the master node as well as client (submit) nodes. This can be done in one of two ways. Either the directory structure can be copied directly to each machine so that it can be accessed in the same manner as on the master node, or symbolic links and NFS mounts can be created. In any case, if on one machine you type

cd $P3_HOME/p3manager_files

you should be able to do the same on all analysis nodes and see the same directory structure.
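That visibility requirement can be verified with a small script. This check_tree helper is a hypothetical sketch, not part of the product; run it on the master node and on every analysis node and expect the same answer.

```shell
#!/bin/sh
# Hypothetical check that this node sees the installation tree.
# P3_HOME is assumed to point at the install directory, e.g. /msc/patran200x.
check_tree() {
    if [ -d "$1/p3manager_files" ]; then
        echo "installation tree visible"
    else
        echo "p3manager_files not found under $1"
    fi
}

check_tree "${P3_HOME:-/msc/patran200x}"
```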

As an example of setting up a link, suppose that the machine venus is the master host and has the installation directory structure in /venus/users/msc/patran200x. A link can be established on venus by typing:

ln -s /venus/users/msc/patran200x /patran

This will ensure that on venus, if you type cd /patran you will be put into /venus/users/msc/patran200x.

Now on an analysis host called jupiter, NFS mount the disk /venus/users and then type:

ln -s /venus/users/msc/patran200x /patran

This will ensure that analysis host jupiter can see the installation directory structure. Repeat this for all analysis hosts. NFS mounts are not necessary if you wish to copy the installation directory structures to each host separately instead of creating links.

Submit hosts (hosts that submit jobs) do not necessarily need to see the directory structure in exactly the same way as the master and analysis hosts do. They only need to be able to see an installation directory structure to find the user interface executable (P3Mgr).

5. Start up the RmtMgr daemon or service on each and every analysis node. It is recommended to start RmtMgr on submit machines also. Starting the Queue/Remote Managers (p. 144) explains this procedure. This must be done before configuration testing can be done.

6. Use the p3am_admin program to set up the configuration files. This program is located in $P3_HOME/bin/p3am_admin.

Note: The above description sounds a bit more restrictive than it really is. In actuality, if a RmtMgr is started on each analysis host, the directory structure can be seen because RmtMgr knows from where it was launched and thus knows where all the Analysis Manager executables are. However, it is still recommended to follow the above procedure if at all possible.

Modify Configuration Files (p. 107) explains the use of this program and the format of the configuration files it generates. The configuration files will be placed in the correct locations automatically. The following configuration files will be generated:

7. Test the configuration setup using p3am_admin’s testing features. Specifically, do basic tests and network tests for each user that wishes to access the Analysis Manager. Test Configuration (p. 125) explains this procedure in detail.

8. Start up the QueMgr daemon on the master node. Starting the Queue/Remote Managers (p. 144) explains this procedure.

9. Add commands to the appropriate rc files for automatic start-up of the QueMgr and RmtMgr daemons when the master, submit or analysis nodes have to be rebooted. Starting the Queue/Remote Managers (p. 144) also explains this procedure.

10. Invoke the Analysis Manager user interface as a normal user and check that the installation was performed properly. Invoking the Analysis Manager is explained in Invoking the Analysis Manager Manually (p. 19).

11. Repeat the procedure from step 2. for each organizational group (Queue Manager) you wish to set up.

12. When more than one organizational group (Queue Manager) is to be accessed, either modify the org.cfg file and add the port numbers and group names, or have users set the appropriate environment variables to access them. See Organization Environment Variables (p. 94) for an explanation of these variables and see Examples of Configuration Files (p. 136) for setting up the org.cfg file.

13. Make sure users have $P3_HOME/bin in their path. Most Analysis Manager executables can be invoked from $P3_HOME/bin, or are links from $P3_HOME/bin that set all required environment variables. These include:

p3am_admin
p3am_viewer (Unix only)
p3analysis_mgr
QueMgr
RmtMgr

It is always safest to invoke these executables from $P3_HOME/bin.

host.cfg - Host configuration file
disk.cfg - Disk space configuration file
lsf.cfg - LSF configuration file (if you plan to use LSF as your scheduler, and not the Analysis Manager's own built-in scheduler)
nqs.cfg - NQS configuration file (if you plan to use NQS as your scheduler, and not the Analysis Manager's own built-in scheduler)

Note: For a minimal configuration with a single Queue Manager, you should remove or rename the file $P3_HOME/p3manager_files/org.cfg. See step 12. for more information.


7.5 X Resource Settings

The Analysis Manager GUI on Unix requires the use of certain X Window System resources. The following explains this use.

The name of the Analysis Manager X application class is P3Mgr. Therefore, to change the background color the Analysis Manager uses to red, the following resource specification is used:

P3Mgr*background: red

The lines below belong in the P3Mgr file delivered with your installation. This file can be found in $P3_HOME/app-defaults. It can also reside in the user’s current directory or home directory, or be placed in .Xdefaults or /usr/lib/X11/app-defaults. It is most convenient to place it in the user’s home directory; that way, changes take effect without having to log out. These are the resources the Analysis Manager requires in order to look and behave like MSC.Patran.

!
! Resources for MSC.Patran Analysis Manager:
!
P3Mgr*background: bisque
P3Mgr*foreground: black
P3Mgr*bottomShadowColor: bisque4
P3Mgr*troughColor: bisque3
P3Mgr*topShadowColor: white
P3Mgr*highlightColor: black
P3Mgr*XmScrollBar.foreground: bisque
P3Mgr*XmScrollBar.background: bisque
P3Mgr*mon_run_trough.background: DodgerBlue
P3Mgr*mon_ok_label.foreground: DodgerBlue
P3Mgr*mon_bad_label.foreground: red
P3Mgr*que_mon_queued.background: red
P3Mgr*que_mon_run.background: DodgerBlue
P3Mgr*mon_disk_trough.background: red
P3Mgr*mon_cpu_trough.background: green
!
! End of MSC.Patran Analysis Manager Resources
!

A file called p3am_admin (AdmMgr) also exists for the system administration tool X resources.


Font Handling. The Analysis Manager on Unix requires three fonts to work correctly. At start-up, the Analysis Manager looks through the fonts available on the machine and picks out three fonts which meet its needs. You will notice that there are no font definitions in the default Analysis Manager resources. On platforms which utilize an R4 based version of X windows, the fonts are NOT adjustable by the user. The fonts that the Analysis Manager calculates are used all the time.

On R5 X windows platforms, the three fonts are still calculated by the Analysis Manager, but the user has the option of overriding the calculated fonts by using the X resources. The names of the resources to use are as follows:

If the user decides to change the fonts, these are the resources to set. Not all three fonts have to be changed; a single one can be adjusted by itself. The only requirement is that a fixed-width font is defined for P3Mgr*fixed.fontList; if this font is not fixed-width, the Queue Monitor interface will not appear correctly.

P3Mgr*fontList: *lucida-bold-r-*-14-140-*

P3Mgr*middle.fontList: *lucida-medium-r-*-14-140-*

P3Mgr*fixed.fontList: *courier-medium-r-*-12-120-*


7.6 Configuration Management Interface

The Analysis Manager requires a predefined set of configuration files for its use. These configuration files may be changed and validated using the Configuration Management Preference (p3am_admin executable name), which enables menu-driven ease in changing and testing the configuration files. Examples of the configuration files are found in Examples of Configuration Files (p. 136). You may also edit them using any text editor; however, you will probably find it more intuitive to use the administration tool until you become familiar with the configuration files.

To run p3am_admin from the installation directory using all defaults type:

$P3_HOME/bin/p3am_admin

or to call out a specific set of configuration files other than the default:

$P3_HOME/bin/p3am_admin -org <org>

where <org> is the name of the directory containing the configuration files, located in $P3_HOME/p3manager_files/<org>/{conf, log, proj}, or specify the full path:

<path_name>/AdmMgr $P3_HOME -org <org>

where:

<path_name> = $P3_HOME/p3manager_files/bin/<arch>/

<arch> = the architecture type of the machine you wish to run on, which can be one of the following:

The arguments are defined as follows:

HP700 - Hewlett Packard HP-UX

RS6K - IBM RS/6000 AIX

SGI5 - Silicon Graphics IRIX

SUNS - Sun SPARC Solaris

LX86 - Linux (MSC or Red Hat)

WINNT - Windows 2000 or XP

$P3_HOME The path where the Analysis Manager is installed. This path is used to locate the p3manager_files directory. For example, if /msc/patran200x is specified, the p3am_admin (AdmMgr) program will look for the /msc/patran200x/p3manager_files directory. Typically, the install directory is /msc/patran200x and is defined in an environment variable called P3_HOME.

-org <org> This is the organizational group to be used. See Organization Environment Variables (p. 94) for a description on the use of organizations. It is the name of the directory under the p3manager_files directory that contains the configuration files.


Both of the arguments listed above are optional. If they are not specified, the p3am_admin (AdmMgr) program will check for the following two environment variables:

If the command line arguments are not specified, then at least the P3_HOME environment variable must be set. The P3_ORG variable is not required; if it is not set and the -org option is not provided, an organization of default is used. In that case, p3am_admin (AdmMgr) will check for configuration files in the following location:

$P3_HOME/p3manager_files/default/conf

It is recommended that the p3am_admin (AdmMgr) program be run on the master node and as the root user. It can be run by normal users, but some of the testing options will not be available. In addition, the user may not have the necessary privileges to save changes to the configuration files or to start up a Queue Manager daemon.

When p3am_admin (AdmMgr) starts up, it will take the arguments provided (or environment variables) and check to see if configuration files already exist. The configuration files should exist as follows. The last two are only necessary if LSF or NQS queueing is used.

$P3_HOME/p3manager_files/<org>/conf/host.cfg
$P3_HOME/p3manager_files/<org>/conf/disk.cfg

$P3_HOME/p3manager_files/<org>/conf/lsf.cfg
$P3_HOME/p3manager_files/<org>/conf/nqs.cfg

If these files exist, they will be read in for use within the p3am_admin (AdmMgr) program. If these files are not found, p3am_admin (AdmMgr) will start up in an initial state. In this state there are no hosts, filesystems, or queues defined and they must all be added using the p3am_admin (AdmMgr) functionality.
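That existence check can be sketched as a short script. The check_conf helper is hypothetical, modeled on the file locations listed above; it is not the actual p3am_admin logic.

```shell
#!/bin/sh
# Hypothetical reproduction of the existence check p3am_admin performs
# at start-up, for a given install directory and organization.
check_conf() {
    for f in host.cfg disk.cfg lsf.cfg nqs.cfg; do
        if [ -f "$1/p3manager_files/$2/conf/$f" ]; then
            echo "$f: found"
        else
            echo "$f: missing"
        fi
    done
}

check_conf "${P3_HOME:-/msc/patran200x}" "${P3_ORG:-default}"
```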

Therefore, upon initial installation and/or configuration of the Analysis Manager, the p3am_admin (AdmMgr) program will come up in an initial state and the user can build up configuration files to save.

P3_HOME The path where the Analysis Manager is installed.

P3_ORG The organization to be used. This is the <org> directory.


Action Options. The initial form for p3am_admin (AdmMgr) has the following Actions/Options:

1. Modify Config Files

2. Test Configuration

3. Reconfigure Que Mgr

On Windows the Administration tree tab is the equivalent:

[Screenshot: the A/M Admin Tool form, showing the Action, Object, and Queue Type menus; Application Name, Executable, and Optional Args fields for MSC.Nastran (NasMgr), ABAQUS (AbaMgr), and MARC (GenMgr, -j $JOBNAME); Add Application and Delete Application buttons; and the Admin user and Config Vers settings.]


Modify Configuration Files

Modify Config(uration) Files has the following Objects:

1. Applications

2. Physical Hosts

3. Hosts

4. Filesystems

5. Queues

Selecting the Queue Type. The Analysis Manager requires a Queue Type: LSF, NQS, or the Analysis Manager’s own queueing capability. This should typically be the first thing set when setting up a new configuration.

To select or change the Queue Type, click on the Queue Type: menu, listed on the right side of the Modify Config Files form, and choose LSF, NQS, or MSC (AM) Queue. Only one queue type may be selected.

To save the configuration, the Apply button must be pressed and the newly added queue type information will be saved in the host.cfg file. (Note: Apply saves all configuration files: host, disk, and, if applicable, lsf or nqs.)

Note: Queue Managers set up on Windows only have the choice of the default MSC Queue type. LSF and NQS are not supported for Queue Managers running on Windows.

[Screenshot: the A/M Admin Tool form with Action set to Modify Config Files, the Object menu listing Applications, Physical Hosts, AM Hosts, Filesystems, and Queues, and the Queue Type menu on the right.]


Administrator User. You must also set the Admin user. This should not be root on Unix or the administrator account on Windows, but should be a normal user name.

Configuration Version. There are three configuration versions. The functionality available during setup depends on which version you select. Version 1 is the original.

Version 2 adds the capability of limiting the maximum number of tasks any given application is allowed to run at any one time. If this number is exceeded, additional submittals are queued until the number of running tasks for that application drops below the limit. This is typically used when only so many application licenses are available, so that a job is not submitted unless a license is available; otherwise the application might fail because no license is available.

Version 3 includes all capabilities of versions 1 and 2, and adds the ability to set up a host, made up of a group of hosts, that is monitored for the least-loaded machine. Once a machine in that group satisfies the loading criteria, the job is submitted to that machine.
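The version-3 selection idea can be sketched in a few lines. This is a conceptual illustration only, not the actual scheduler; the host names and load figures are invented.

```shell
#!/bin/sh
# Conceptual sketch of version-3 host-group selection: given
# "host load" pairs, pick the least-loaded machine for the job.
least_loaded() {
    sort -k2 -n | head -1 | cut -d' ' -f1
}

# Hypothetical load readings for a three-host group:
printf '%s\n' "jupiter 2.10" "venus 0.35" "saturn 1.40" | least_loaded
```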


Applications

Since the Analysis Manager can execute different applications, it needs to know which applications to execute and how to access them. This configuration information is stored in the host.cfg file located in the $P3_HOME/p3manager_files/default/conf directory. This portion of the host.cfg file contains the following fields:

type An integer number used to identify the application. The user never has to worry about this number because it is automatically assigned by the program.

program name Program names can be either:

NasMgr for executing MSC.Nastran

MarMgr for executing MSC.Marc

AbaMgr for executing ABAQUS

GenMgr for executing other analysis modules.

MSC.Patran name The name of the MSC.Patran Preference to key off of when invoking the Analysis Manager. These can be MSC.Nastran, MSC.Marc, ABAQUS, ANSYS, etc. Check the exact MSC.Patran Preference spelling and remove any spaces. If the Preference does not exist, then the first configured application will be used when the Analysis Manager is invoked from MSC.Patran, after which the user can change it to the desired one.

optional args Used for generic program execution only. These specify the arguments to be added to the invoking line when running a generic application.

MaxAppTask By default this is not set. If the configuration file version is set to 2 or 3, you may specify the maximum number of tasks that the given application can run at any one time (on all machines). This is convenient when you do not want jobs submitted with the possibility that one or more cannot check out the proper licenses because too many jobs are running at once.
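The MaxAppTask check amounts to a simple comparison, sketched below. This is assumed behavior based on the description above, not the actual Queue Manager logic; the function name and messages are invented.

```shell
#!/bin/sh
# Conceptual sketch of the MaxAppTask check: if the running-task count
# for an application has reached the limit, the new submittal is queued.
max_task_decision() {
    running=$1
    limit=$2
    if [ "$running" -ge "$limit" ]; then
        echo "queue"
    else
        echo "submit"
    fi
}

max_task_decision 4 4   # at the limit: job is queued
max_task_decision 2 4   # below the limit: job is submitted
```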


The p3am_admin (AdmMgr) program can be used to add and delete applications or change any field above as shown in the forms below.

The exception to this is the Maximum Number of Tasks. This value must be changed manually by editing the configuration file and then restarting the Queue Manager service on Windows. On UNIX, this can be controlled through the Administration GUI.


Adding an Application. To add an application, select the Add Application button. (On Windows, right mouse click the Applications tree tab.) An application list form appears from which an application can be selected. If GENERAL is selected the Application Name and Optional Args data boxes appear on the main form.

For GENERAL, enter the name of the application as it is known by the MSC.Patran Preference, without any spaces. For example, if ANSYS 5 is a preference, then enter ANSYS5.

Enter the optional arguments that are needed to run the specified analysis code. For example, if an executable for the MARC analysis code needs arguments of -j jobname, you can specify -j $JOBNAME as the optional args. Arguments can be specified explicitly, such as the -j, or passed as variables, such as $JOBNAME. The following variables are available:

Up to 10 GENERAL applications can be added. To save the configuration, the Apply button must be pressed and the newly added application information will be saved in the host.cfg file. On Windows this is Save Config Settings under the Queue pull down menu. (Note: Apply saves all configuration files: host, disk, and if applicable, lsf or nqs.)

$JOBFILE Actual filename selected (without full path)

$JOBNAME The jobname ($JOBFILE without extension)

$P3AMHOST The client hostname from where the job was submitted

$P3AMDIR Directory on client host where $JOBFILE resides

$APPNAME Application name (MSC.Patran Preference name)

$PROJ Project Name selected

$DISK Total Disk space requested (mb)
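The variable substitution above can be illustrated with a short sketch. The file name and the run_analysis command are hypothetical; only the $JOBFILE/$JOBNAME relationship comes from the table above.

```shell
#!/bin/sh
# Illustrative sketch of how optional args could expand at submit time.
JOBFILE="wing.dat"
JOBNAME="${JOBFILE%.*}"       # $JOBNAME is $JOBFILE without its extension
OPTIONAL_ARGS='-j $JOBNAME'   # stored verbatim in the configuration

# At submit time the variables are substituted into the invoking line:
expanded=$(eval echo "$OPTIONAL_ARGS")
echo "run_analysis $expanded"
```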

[Screenshot: the AM Admin Tool form with the Add Application button pressed and the Application List dialog open, offering MSC.Nastran, ABAQUS, MSC.Marc, and GENERAL.]


Deleting an Application. To remove an application, select the Delete Application button. A list of defined applications appears.

Select the one to be deleted by clicking on the application name in the list. Then select OK. The application will be removed and the application list will disappear.

On Windows, simply select the application you want to delete from the Applications tree tab and press the Delete button (or right-mouse click the application and select Delete).

To save the configuration, the Apply button must be pressed and the newly deleted application information will be saved in the host.cfg file. On Windows this is Save Config Settings under the Queue pull down menu. (Note: Apply saves all configuration files: host, disk, and, if applicable, lsf or nqs.)

[Form: AM Admin Tool — Modify Config Files / Applications, showing the Delete Application flow: the Application List dialog lists the defined applications (e.g., MSC.Nastran, ABAQUS, MARC) with OK/Cancel buttons.]


CHAPTER 7  System Management

Physical Hosts

Since the Analysis Manager can execute jobs on different hosts, it needs to know about each analysis host. Host configuration for the Analysis Manager is done via the host.cfg file located in the $P3_HOME/p3manager_files/default/conf directory.

This portion of the host.cfg file contains the following fields:

The p3am_admin (AdmMgr) program can be used to add and delete hosts or change any field above as shown in the forms below.

physical host Name of the host machine to be used by the Analysis Manager

class System & O/S type:

HP700 - Hewlett Packard HP-UX

RS6K - IBM RS/6000 AIX

SGI5 - Silicon Graphics IRIX

SUNS - Sun SPARC Solaris

LX86 - Linux (MSC or Red Hat)

WINNT - Windows 2000 or XP

maximum tasks Maximum allowable concurrent job processes for this machine.
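As an illustration, the physical-host portion of host.cfg might contain entries like these (the host names and column layout are hypothetical; check your installed file for the exact syntax):

```
#physical host   class   maximum tasks
#--------------------------------------
venus            RS6K    2
mars             HP700   1
jupiter          SGI5    1
```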


Adding a Physical Host. To add a host for use by the Analysis Manager, press the Add Physical Host button. (On Windows, right mouse click the Physical Hosts tree tab.) A new host description will be created and displayed in the left scrolled window, with Host Name: Unknown and Host Type: UNKNOWN.

Enter the name of the host in the Host Name box, and select the system/OS in the Host Type menu. Additional hosts can be added by repeating this process.

When all hosts have been added, select Apply and the newly added host information will be saved in the host.cfg file. On Windows, use Save Config Settings under the Queue pull-down menu. (Note: Apply saves all configuration files: host, disk, and, if applicable, lsf or nqs.)

[Form: AM Admin Tool — Modify Config Files / Physical Hosts, showing a newly added host with Host Name: Unknown, Host Type: UNKNOWN, Maximum Concurrent Running Jobs: 1, and Add Physical Host/Delete Physical Host buttons.]


Deleting a Host. To remove a host from use by the Analysis Manager, select the Delete Physical Host button on the bottom of the p3am_admin (AdmMgr) form. A list of possible hosts will appear.

Select the host to be deleted by clicking on the hostname in the list. Then, select OK. The host will be removed and the Physical Host List will close.

On Windows, simply select the Host you want to delete from the Physical Hosts tree tab and press the Delete button (or right-mouse click the host and select Delete).

When all host configurations are ready, select Apply and the revised host.cfg file will be saved, excluding the deleted hosts. On Windows this is Save Config Settings under the Queue pull down menu.

[Form: AM Admin Tool — Modify Config Files / Physical Hosts, showing defined hosts (e.g., jupiter, venus, mars with their Host Types) and the Physical Host List dialog used to select a host to delete.]


Analysis Manager Host Configurations

In addition to specifying physical hosts, it is necessary to specify specific names by which the Analysis Manager can recognize the actions it should take on various hosts. For example, it may be possible that ABAQUS and MSC.Nastran are configured to run on the same physical host or that two versions of MSC.Nastran are installed on the same physical host. In order to account for this, each application and physical host has its own name or AM host name assigned to it. Host configuration for the Analysis Manager is done via the host.cfg file located in the $P3_HOME/p3manager_files/default/conf directory.

This portion of the host.cfg file contains the following fields:

The p3am_admin (AdmMgr) program can be used to add and delete AM hosts and change any field above as shown by the forms below.

AM hostname Unique name for the combination of the analysis application and physical host. It can be called anything but must be unique, for example, nas68_venus.

physical host The physical host name where the analysis application will run.

type The unique integer ID assigned to this type of analysis. It is assigned automatically by the program; the user does not need to set it.

path How this machine can find the analysis application - for MSC.Nastran, this is the runtime script (typically the nast200x file), for MSC.Marc, ABAQUS, and GENERAL applications, this is the executable location.

rcpath How this machine can find the analysis application runtime configuration file - the MSC.Nastran nast200xrc file or the ABAQUS site.env file. This is not applicable to MSC.Marc or GENERAL applications and should be filled with the keyword NONE.
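Putting these fields together, an AM_HOSTS fragment might look like this (hypothetical names and paths, patterned on the VERSION: 3 example shown in Groups (of hosts) (p. 123)):

```
AM_HOSTS:
#am_host       host    type  bin path          rc path
#------------------------------------------------------------------
nas68_venus    venus   1     /msc/bin/nast68   /msc/conf/nast68rc
mar2001_venus  venus   3     /m2001/marc       NONE
```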


Adding an AM Host. An AM host is a unique name which the user will specify when submitting a job. Information contained in the AM host is a combination of the physical host and application type along with the physical location of that application. To add a specific AM host press the Add AM Host button. A new host description will be created and displayed in the left scrolled window, with AM Host Name: Unknown, Physical Host: UNKNOWN, and Application Type: Unknown.

Enter the unique name of the host in the AM Host Name box, and select the Physical Host that this application will run on. The application is selected from the Application Type menu. Then, specify the Configuration Location and Runtime Location paths in the corresponding boxes. The unique name should reflect the name of the application to be run and where it will run. For example, if V68 of MSC.Nastran is to be run on host venus, then specify NasV68_venus as the AM host name.

The Runtime Location is the actual path to the executable or script to be run, such as /msc/bin/nas68 for MSC.Nastran. The Config Location is the actual path to the MSC.Nastran rc (nast68rc) file or the ABAQUS site.env file.

Additional AM hosts can be added by repeating this process.

For each AM host, at least one filesystem must be specified. Use the Add Filesystem capability in Modify Config Files/Filesystems to specify a filesystem for each added host.

When all hosts have been added, select Apply and the newly added host information will be saved in the host.cfg file. On Windows this is Save Config Settings under the Queue pull down menu. Note that Apply saves all configuration files: host, disk, and if applicable, lsf or nqs.

For Group, see Groups (of hosts) (p. 123).

[Form: AM Admin Tool — Modify Config Files / AM Hosts, showing a newly added entry with AM Host Name: Unknown, Physical Host: Unknown, Application Type: Unknown, Group: Unknown, and placeholder Config Location/Runtime Location paths, with Add AM Host/Delete AM Host buttons.]


Deleting an AM Host. To remove a host from use by the Analysis Manager, select the Delete AM Host button. A list of possible hosts will appear.

Select the host to be deleted by clicking on the hostname in the list. Then, select OK. The host will be removed and the AM Host List will close.

On Windows, simply select the AM Host you want to delete from the AM Hosts tree tab and press the Delete button (or right-mouse click the host and select Delete).

When all host configurations are ready, select Apply and the revised host.cfg file will be saved, excluding the deleted hosts. On Windows this is Save Config Settings under the Queue pull down menu.

[Form: AM Admin Tool — Modify Config Files / AM Hosts, showing defined AM hosts (e.g., atf_V68 on atf_sgi, atf_V67 on atf_ibm, each with its Config and Runtime Locations) and the AM Host List dialog used to select an AM host to delete.]


Disk Configuration

To define the filesystems to which scratch and database files will be written, the Analysis Manager needs a list of the file systems available on each host. This list is kept in the disk.cfg file and is used when running analyses. The file contains each host, the file systems for that host, and each file system's type. There are two Analysis Manager file system types: NFS and local.

Adding a Filesystem. Use the Modify Config Files/Filesystems form to specify or add a filesystem for use by the Analysis Manager.

Press the Add Filesystem button. Then, select a host from the list provided.

There are two types of filesystems: NFS and local. Select the appropriate type for the newly added filesystem.

Additional filesystems can be added by repeating this process. Multiple filesystems can be added for each host. When all filesystems have been added, select Apply and the newly added filesystem information will be saved in the disk.cfg file.

Each host must contain at least one filesystem.
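A disk.cfg fragment might therefore read as follows (hypothetical layout using the host and path names from the form above; the actual file format may differ in your version):

```
# host         filesystem           type
#-----------------------------------------
atf_ibm_V68    /atf_ibm/p3_dir      local
atf_ibm_V68    /atf_ibm/users/tmp   NFS
atf_sgi_V68    /atf_sgi/p3_dir      local
```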

After adding a host or filesystem, test the configuration information using the Test Configuration form. See Test Configuration (p. 125).

Note: When using the Analysis Manager with LSF or NQS, you must run the administration program and start a Queue Manager on the same machine that LSF or NQS executables are located.

[Form: AM Admin Tool — Modify Config Files / Filesystems, listing each AM host (e.g., atf_ibm_V68, atf_sgi_V68, atf_hp_V68) with its filesystems and a Local/NFS type selector, plus Add FileSystem/Delete Filesystem buttons.]


On Windows the form appears as below:

When an AM Host is created, one filesystem is created by default (c:\temp). You can add more filesystems to an AM Host by selecting it under the Disk Space tree tab and pressing the Add button. You can change the directory path by clicking on the Directory itself and editing it in place. The Type is changed with the pull-down menu next to the Directory name. If the filesystem is a Unix filesystem, make sure you remove the c:, e.g., /tmp.

Deleting a Filesystem. At the bottom of the Modify Config Files/Filesystems form, select the Delete Filesystem button to delete a filesystem from use by the Analysis Manager.

Then, select a host from the list provided, and click OK.

After selecting a host, a list of filesystems defined for the chosen host will appear. Choose the filesystem to delete from this list and click OK.

On Windows, select the AM Host under the Disk Space tree tab and press the Delete button. The last filesystem created is deleted.

Additional filesystems can be deleted by repeating this process. When all appropriate filesystems have been deleted, select Apply and the updated filesystem information will be saved in the disk.cfg file. On Windows this is Save Config Settings under the Queue pull down menu.


Queue Configuration

If the LSF or NQS scheduling system is being used at this site, the Analysis Manager can interact with it using the queue configuration file (i.e., lsf.cfg or nqs.cfg). Ensure that LSF or NQS Queue is set for the Queue Type field in the Modify Config Files form. See Analysis Manager Host Configurations (p. 116). This sets a line in the host.cfg file to QUE_TYPE: LSF or NQS. The queue configuration file lists each queue name and all hosts allowed to run MSC.Nastran, MSC.Marc, ABAQUS, or other GENERAL applications for that queue. In addition, a queue install path is required so that the Analysis Manager can execute queue commands with the proper path.
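As a sketch, an lsf.cfg entry might associate the MSC queue name, the LSF queue name, an application, and the hosts allowed to run it (the layout and names here are hypothetical, drawn from the form below; consult your installed lsf.cfg for the real syntax):

```
# msc queue   lsf queue   application    hosts allowed
sciviz        normal      MSC.Nastran    venus atf_ibm eightball
```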

Adding a Queue. To add a queue for use by the Analysis Manager, press the Add Queue button on the bottom of the p3am_admin (AdmMgr) form. A new queue description will be created and displayed on the left panel, with MSC Queue Name: Unknown and LSF (or NQS) Queue Name: Unknown.

Enter the names of the queue in the MSC Queue Name and LSF (or NQS) Queue Name boxes provided. These names can be the same or different. In addition, the administrator must choose one or more hosts from the listbox on the right side of the specified queue name. The hosts in this listbox appear only after an application is selected from the Application pull-down menu, and only those hosts configured to run that application will appear. These are the hosts that will be allowed to run the analysis application when it is submitted to that queue.

Note: NQS and LSF are only supported by Unix platform Queue Managers. Although you can submit to an LSF or NQS queue from Windows to a Unix platform, the Windows Queue Manager does not support LSF or NQS submittals at this time.

[Form: AM Admin Tool — Modify Config Files / Queues, showing MSC Queue Name and LSF Queue Name fields, an Application pull-down, a Host List (e.g., eightball, venus, atf_ibm), the Path of LSF Executables, Add'l Submit Params, Minimum MEM (mb), Minimum DISK (mb), and Add Queue/Delete Queue buttons.]


Additional queues can be added by repeating this process. When all queues have been added, press Apply and the new queue information will be saved in the lsf.cfg (or nqs.cfg) file.

Various information must be supplied for the Analysis Manager to communicate properly with the queueing software. The most important item is the Executable Path: enter the full path where the NQS or LSF executables can be found. In addition, you may specify additional (optional) parameters for the NQS or LSF executables to use if necessary. Keywords can also be used; a description of how these keywords work can be found in General (p. 44). Two keywords are available, MEM and DISK, which evaluate to the Minimum MEM and Minimum DISK values specified. For example, if an NQS command has these additional parameters: -nr -lm $MEM -lf $DISK

then submission will be

qsub -nr -lm <current MEM value> -lf <current DISK value>

where the current MEMory value is the larger of the MEMory specified here or the general memory requirement specified by the user. Current DISK space operates similarly. See Memory (p. 35) and/or Disk Space (p. 31). The MEM and DISK specified here are only used if additional parameters using the keywords are supplied.
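The expansion can be sketched as follows (a hypothetical illustration of the behavior just described, not the Analysis Manager's actual code):

```python
def expand_params(template, min_mem, min_disk, user_mem, user_disk):
    """Expand $MEM/$DISK in an additional-parameters string.

    The current value for each keyword is the larger of the minimum
    configured for the queue and the user's general requirement.
    """
    mem = max(min_mem, user_mem)
    disk = max(min_disk, user_disk)
    return template.replace("$MEM", str(mem)).replace("$DISK", str(disk))

# With additional parameters "-nr -lm $MEM -lf $DISK", the NQS
# submittal becomes: qsub -nr -lm <current MEM> -lf <current DISK>
args = expand_params("-nr -lm $MEM -lf $DISK",
                     min_mem=64, min_disk=500,
                     user_mem=256, user_disk=100)
print("qsub " + args)  # qsub -nr -lm 256 -lf 500
```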

Deleting a Queue. To remove a queue from use by the Analysis Manager, press the Delete Queue button on the bottom of the p3am_admin (AdmMgr) form. A list of possible queues will appear.

Select the queue to be deleted by clicking on the queue name in the list. Then, select OK. The queue will be removed from the list of queues and the list of queues will go away.

When the queue configuration is ready, select Apply and the revised lsf.cfg (or nqs.cfg) file will be saved, excluding the deleted queues.


Groups (of hosts)

This feature allows you to define a group. The group attribute can then be assigned to an AM Host. All AM Hosts with the same attribute are grouped together, and when a job is submitted, the host in the group that best matches the least-loaded criteria is the one selected for job submission. This is a semi-automatic host selection mechanism based on certain criteria, explained below.

This version of the Analysis Manager supports the concept of groups of hosts. In the host.cfg file, if you specify VERSION: 3 as the first non-commented line and also add the group/queue name at the end of each am_host line in the AM_HOSTS section, this feature is enabled. Here is an example:

VERSION: 3
...
AM_HOSTS:
#am_host     host   type  bin path         rc path               group
#------------------------------------------------------------------------------
N2004_hst1   host1  1     /msc/bin/n2004   /msc/conf/nast2004rc  grp_nas2004
N2004_hst2   host2  1     /msc/bin/n2004   /msc/conf/nast2004rc  grp_nas2004
N2004_hst3   host3  1     /msc/bin/n2004   /msc/conf/nast2004rc  grp_nas2004
N2001_hst1   host1  1     /msc/bin/n2001   /msc/conf/nast2001rc  grp_nas2001
N2001_hst2   host2  1     /msc/bin/n2001   /msc/conf/nast2001rc  grp_nas2001
N2001_hst3   host3  1     /msc/bin/n2001   /msc/conf/nast2001rc  grp_nas2001
M2001_hst1   host1  3     /m2001/marc      NONE                  grp_mar2001
M2001_hst2   host2  3     /m2001/marc      NONE                  grp_mar2001
M2001_hst3   host3  3     /m2001/marc      NONE                  grp_mar2001
...

[Form: AM Admin Tool — Modify Config Files / Groups, showing a Group Name field, an AM Host List, and Add Group/Delete Group buttons.]


In this configuration, when you submit a job, you will also have the choice of the group name, labeled 'least-loaded-grp:<group name>' to distinguish it from regular host names. When you select this group instead of a regular host, the Analysis Manager decides which host in the group is best suited to run the job and starts it there when possible. Here, best suited means the next available host based on several factors, including:

• Free tasks on each host (Maximum currently running jobs)

• Cpu utilization of host

• Available memory of host

• Free disk space of host

• Time since most recent job was started on host

In the above example, if you submitted an MSC.Nastran job to grp_nas2004, there are three machines the Analysis Manager could select to run the job: host1, host2, or host3. The Analysis Manager queries each host for its current cpu utilization, available memory, and free disk space (as configured by the Analysis Manager), along with its free tasks and the time since an Analysis Manager job was last started, and determines which machines, if any, can run the job. If more than one machine qualifies, the Analysis Manager selects the best-suited host by sorting the acceptable hosts in a user-selectable sort order. If no machine meets the criteria, the job remains queued, and the Analysis Manager tries again to find a suitable host at periodic intervals. The user-selectable sort order is specified in an optional configuration file called msc.cfg. If this file does not exist, the sort order and criteria are as follows:

• free_tasks

• cpu_util

• avail_mem

• free_disk

• last_job_time

The defaults for cpu utilization, available memory, and available disk are:

• Cpu util: 98

• Available mem: 5 mb

• Available disk: 10 mb

Thus any host with cpu utilization < 98, available memory > 5 mb, available disk > 10 mb, and at least one free task (so it can start another Analysis Manager job) is eligible to run the job; the best-suited host is the first one after the eligible hosts are sorted. You can change the sort order and the defaults for cpu utilization, available memory, and disk in the msc.cfg file. The msc.cfg file resides in the same location as host.cfg and disk.cfg; its format is explained in Group/Queue Feature (p. 143).
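The selection logic can be sketched as follows (a hedged illustration of the eligibility criteria and default sort order described above; the field names, threshold handling, and tie-breaking details are assumptions, not the Analysis Manager's actual code):

```python
# Default thresholds: cpu util 98 (%), available mem 5 (mb), disk 10 (mb).
DEFAULTS = {"cpu_util": 98, "avail_mem": 5, "free_disk": 10}

def pick_host(hosts, limits=DEFAULTS):
    # A host is eligible if it has at least one free task and meets
    # the cpu/memory/disk criteria.
    eligible = [h for h in hosts
                if h["free_tasks"] >= 1
                and h["cpu_util"] < limits["cpu_util"]
                and h["avail_mem"] > limits["avail_mem"]
                and h["free_disk"] > limits["free_disk"]]
    if not eligible:
        return None  # no suitable host: the job stays queued
    # Default sort order: free_tasks, cpu_util, avail_mem, free_disk,
    # last_job_time (most free tasks first, lowest cpu first, and so on).
    eligible.sort(key=lambda h: (-h["free_tasks"], h["cpu_util"],
                                 -h["avail_mem"], -h["free_disk"],
                                 h["last_job_time"]))
    return eligible[0]["name"]

hosts = [
    {"name": "host1", "free_tasks": 0, "cpu_util": 20, "avail_mem": 512,
     "free_disk": 4000, "last_job_time": 10},
    {"name": "host2", "free_tasks": 1, "cpu_util": 50, "avail_mem": 256,
     "free_disk": 9000, "last_job_time": 30},
    {"name": "host3", "free_tasks": 1, "cpu_util": 15, "avail_mem": 256,
     "free_disk": 9000, "last_job_time": 5},
]
# host1 has no free task; host2 and host3 tie on free tasks, so the
# lower cpu utilization wins.
print(pick_host(hosts))
```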


Test Configuration

The p3am_admin (AdmMgr) program has various tests that facilitate verification of the configuration.

Application Test

Changes to the host.cfg file dealing with defined applications can be tested by selecting the Test Configuration/Applications option. The Applications Test form will appear when the Application Test button is pressed. On Windows, press the Test Configuration button under Administration.

[Form: AM Admin Tool — Test Configuration / Applications, with the Application Test window reporting the Unique Name Test and Host Count Test results (e.g., duplication checks for MSC.Nastran and ABAQUS passed, and each application is used by at least one AM host definition).]


This test checks to make sure that:

1. Applications are defined only once.

2. At least one AM host has been assigned on which the application can be executed.

Physical Hosts Test

Changes to portions of the host.cfg file dealing with physical hosts can be tested by selecting the Test Configuration/Physical Hosts option.

At the bottom left of the form are two buttons:

1. Basic Host Test.

2. Network Host Test.

Basic Host Test. The Basic Host Test will validate the host configuration information in the host.cfg file. There are no requirements for running the Basic Host Test. A message box provides status information as each of the following basic host tests is run:

1. Validates that at least one host in the host.cfg file is present.

2. Ensures that each host specified is a valid host with the nameserver. (i.e., makes sure the machine that the p3am_admin (AdmMgr) program is running on recognizes each of the host names provided.)

3. Ensures that a valid Host Type has been provided for each host (i.e., when a new host is added, the Host Type is set to Unknown; this test makes sure it has been changed to something valid).

4. Checks that a Master host has been selected.

[Form: AM Admin Tool — Test Configuration / Physical Hosts, with Basic Host Test and Net Host Test buttons and a status window listing the Host Name, # Of Hosts, Nameserver, Host Type, and Duplicate Host tests, each initially Pending.]


5. Makes sure two hosts with the same address were not specified.

If a problem is detected, close the form and return to the Modify Config File form to correct the configuration.

Network Host Test. The Network Host Test will validate all of the physical host configuration information in the host.cfg file, and validate communication paths between hosts.

Requirements to run the Network Host Test include:

1. Must be root.

2. Must be on the Master node (the host running the Queue Manager).

3. Must provide a username (each user must be tested separately).

A message box provides status information as each of the following network host tests is run:

1. Checks user remote command (rcmd) access between the Master node and other specified hosts.

2. Validates that each host has the correct architecture setting.

3. Makes sure that each host sees the installation directory in the same way.

4. Checks the Analysis Manager directories (i.e., makes sure the user can read all configuration files and can create a directory under the proj directory and other locations).

If a problem is detected, close the form and return to the Modify Config Files form to correct the configuration or exit to the system to correct the problem. It is highly recommended that you run the Network Host Test for each user who wants to use the Analysis Manager.


AM Hosts Test

Changes to portions of the host.cfg file dealing with the AM hosts can be tested by selecting the Test Configuration/AM Hosts option.

At the bottom left of the form are two buttons:

1. Basic AM Host Test.

2. Network AM Host Test.

Basic AM Host Test. The Basic AM Host Test will validate the AM host configuration information in the host.cfg file. There are no requirements for running the Basic AM Host Test. A message box provides status information as each of the following basic AM host tests is run:

1. Validates that at least one AM host in the host.cfg file is present.

2. Ensures that each AM host specified is a valid name.

3. Ensures that each AM host has a physical host assigned.

4. Checks that an application has been assigned to each AM host.

5. Checks that the configuration file exists for each AM host. This is applicable to MSC.Nastran and ABAQUS only.

6. Checks that the actual executable is accessible and has the proper privileges.

7. Makes sure two AM hosts with the same names are not specified.

If a problem is detected, close the form and return to the Modify Config File form to correct the configuration.

[Form: AM Admin Tool — Test Configuration / AM Hosts, with Basic A/M Host Test and Net A/M Host Test buttons and a status window listing the AM Host Name, # AM Hosts, Physical Host Selected, Application Selected, Valid Configuration, Valid Runtime, and Unique AM Host Name tests, each initially Pending.]


Network AM Host Test. The Network AM Host Test will validate all of the AM host configuration information in the host.cfg file, and validate communication paths between hosts.

Requirements to run the Network Host Test include:

1. Must be root.

2. Must be on Master node.

3. Must provide a username.

A message box provides status information as each of the following network AM host tests is run:

1. Checks the location and privileges of the configuration file for each application.

2. Checks the runtime location (executable) for each application and privileges.

If a problem is detected, close the form and return to the Modify Config Files form to correct the configuration or exit to the system to correct the problem. It is highly recommended that you run the Network Host Test for each user who wants to use the Analysis Manager.


Disk (Filesystem) Test

Changes to the disk.cfg file can be tested by selecting the Test Configuration/Disk Configuration option. The test disk configuration form will appear.

At the bottom left of the form are two buttons:

1. Basic Disk Test.

2. Network Disk Test.

Basic Disk Test. The Basic Disk Test will validate the disk configuration information. There are no requirements for running the Basic Disk Test. A message box provides status information as each of the following basic disk tests is run:

1. Makes sure at least one filesystem is defined for each host.

2. Makes sure there is a value for each filesystem (i.e., no empty entries), and that the entries are absolute paths which start with a “/”.

3. Checks the length of the filesystem definitions. If a definition is longer than 25 characters, a warning is provided, since overly long paths may cause problems.

If a problem is detected, close the form and return to the Modify Config Files form to correct the disk configuration.

[Form: AM Admin Tool — Test Configuration / Filesystems, with Basic Disk Test and Net Disk Test buttons and a status window listing the # Filesystems, Valid Filesystem, and Filesystem Length tests, each initially Pending.]


CHAPTER 7 System Management

Network Disk Test. The Network Disk Test will validate all of the disk configuration information. A message box provides status information as each test is run.

Requirements to run the Network Disk Test include:

1. Must be root.

2. Must be on Master node.

3. Must provide a username.

A message box provides status information as each of the following network disk tests is run:

1. Checks that each filesystem exists for each host, and that it can be written to by the provided user.

If a problem is detected, close the form and return to the Modify Config Files form to correct the disk configuration or exit to the system to correct the problem. It is highly recommended that you run the Network Disk Test for each user who wants to use the Analysis Manager.
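The writability check the Network Disk Test performs can be sketched as follows. This is an illustration only, not the actual implementation: the `writable_by` helper name and the scratch-file approach are assumptions, and the su call is why the test must be run as root:

```shell
#!/bin/sh
# Sketch of the Network Disk Test check: as root, verify that a
# filesystem directory exists and that the named user can write to it.
# "writable_by" is a hypothetical helper, not part of the AM tools.
writable_by() {
    user="$1"; dir="$2"
    if [ ! -d "$dir" ]; then
        echo "ERROR: $dir does not exist"
        return 1
    fi
    # Try to create and remove a scratch file as the given user.
    if su "$user" -c "touch '$dir/.am_disk_test' && rm -f '$dir/.am_disk_test'" \
        >/dev/null 2>&1; then
        echo "OK: $user can write to $dir"
    else
        echo "ERROR: $user cannot write to $dir"
        return 1
    fi
}
```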


Queue Test

Changes to the lsf.cfg (or nqs.cfg) queue configuration file can be tested by selecting the Test Configuration/Queue Configuration option. The test queue configuration form will appear.

At the bottom left of the form are two buttons:

1. Basic Queue Test.

2. Advanced Queue Test.

Basic Queue Test. The Basic Queue Test will validate the queue configuration information in the lsf.cfg or nqs.cfg file. A queueing system (e.g., LSF or NQS) must be defined to run the Basic Queue Test. A message box provides status information as each of the following basic queue tests is run:

1. Makes sure at least one queue has been specified.

2. Makes sure that at least one application has been specified per queue.

3. Makes sure a unique MSC queue name has been specified.

4. Makes sure that the LSF or NQS queue is unique and exists.

5. Makes sure that for each queue at least one physical host is specified as a member of the queue.

6. Makes sure the LSF or NQS executables path has been specified and that it is an absolute path.

[Test Queue Configuration form: each queue (e.g., hp_machines/hp_queue, ibm_machines/ibm_queue, dec_machines/dec_queue) is listed with its P3 Queue Name, LSF Queue Name, and Host List (hosts such as eightball, venus, atf_ibm, sciviz), along with the Path of LSF Executables (/usr/lsf/bin). The Basic Queue Test and Network Queue Test buttons appear at the bottom left, and the Basic Queue Test status panel shows the #Queues Test, #Queues Per App Test, MSC Queue Name Test, LSF Queue Name Test, Queue Host List Test, and Executable Path Test, each marked Passed.]



If a problem is detected, close the form and return to the Modify Config Files form to correct the configuration.

Advanced Queue Test. The Advanced Queue Test will validate all of the queue configuration information (i.e., the lsf.cfg or nqs.cfg file).

Requirements to run the Advanced Queue Test include:

1. Must be using LSF or NQS for queue management.

2. Must be on Master node.

3. Must provide a username.

4. Must be root.

A message box provides status information as each of the following advanced queue tests is run:

1. Makes sure the LSF executables (bsub, bkill, bjobs) or the NQS executables (qsub, qdel, qstart) are in the specified location, and are executable by the provided user.

If a problem is detected, close the form and return to the Modify Config Files form to correct the queue configuration or exit to the system to correct the problem. It is highly recommended that you run the Advanced Queue Test for each user who wants to use the Analysis Manager.
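The executables check the Advanced Queue Test performs can be sketched in shell. The helper name below is illustrative, not an AM command; the path and command names mirror the lsf.cfg example in this chapter:

```shell
#!/bin/sh
# Sketch of the Advanced Queue Test check: make sure the queueing
# system's commands are present in the configured path and executable.
# "check_queue_cmds" is a hypothetical helper for illustration only.
check_queue_cmds() {
    que_path="$1"; shift
    for cmd in "$@"; do
        if [ -x "$que_path/$cmd" ]; then
            echo "OK: $que_path/$cmd"
        else
            echo "ERROR: $que_path/$cmd missing or not executable"
        fi
    done
}

# LSF example (for NQS, check qsub, qdel, qstart instead):
check_queue_cmds /usr/lsf/bin bsub bkill bjobs
```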


Queue Manager

This form simply allows any changes made to the configuration files during a p3am_admin (AdmMgr) session to be applied. If the configuration files are owned by root, you must have root access to change them. Once they have been changed, the QueMgr must be reconfigured in order to recognize them. Simply press the Apply button with the Restart QueMgr toggle selected. This forces the Queue Manager to reread the configuration files. Once the Queue Manager has been reconfigured, newly submitted jobs use the updated configuration.

If a reconfiguration is issued while jobs are currently running, those jobs are allowed to finish before the reconfiguration occurs. During this period the Queue Manager is said to be in drain mode, not accepting any new jobs until all old jobs are complete and the Queue Manager has reconfigured itself. The Queue Manager can also be halted immediately (which kills any running job) or halted after it is drained.

When the Queue Manager is halted, the three toggles on the right side change to one toggle to allow the Queue Manager to be started. All configurations that are being used are shown on the left. When the Queue Manager is halted, you may change some of the configurations on the left side, such as Port, Log File, and Log File User before starting the daemon again. For more information on the Queue Manager see Starting the Queue/Remote Managers (p. 144).

[Queue Manager form: shows the Configuration Path (/patran/p3manager_files/default/conf), Executable Path (/patran/p3manager_files/bin/SGI5), Organization (default), Port (1500), Log File (/tmp/QueMgr.log), Log File User (root), and QueMgr Status (Running), with toggles to Restart QueMgr, Halt QueMgr When Empty, or Halt QueMgr Immediately, and Apply/Cancel/Help buttons.]



On Windows you can start and stop the Queue Manager from the Queue pull-down menu when you are in the Administration tree tab.

Alternatively, you can right-click the Administration tree tab; the choices to Read or Save the configuration file, or to Start and Stop the Queue Manager, are also available there.


7.7 Examples of Configuration Files

The Analysis Manager is a distributed application and requires a predefined set of configuration files for its use. These configuration files may be changed using the Configuration Management tool, called p3am_admin (AdmMgr) (see Configuration Management Interface (p. 104)), or they may be edited by hand. When one or more of the configuration files is changed, the Queue Manager must either be restarted or reconfigured to force it to read and recognize the changes.

Host Configuration File. To set up and execute different applications on a variety of physical computer hosts, the Analysis Manager uses a host configuration file (host.cfg) located in the directory:

$P3_HOME/p3manager_files/default/conf

where $P3_HOME is the location of the installation, typically /msc.

The host.cfg file contains five distinct areas of information: administrator, Queue Type, AM Host information, Physical Host information, and Application information, in that order. The queue type may be either MSC, LSF, or NQS. The administrator is any valid user name except root (or Administrator on Windows).

The AM host information has the following fields associated with it:

AM Hostname A unique name for the combination of the analysis application and physical host. It can be called anything but must be unique, for example nas68_venus.

Physical Host The physical host name where the analysis application will run.

Type The unique integer ID assigned to this type of analysis. This is assigned automatically by the program and the user should not have to worry about it.

Path How this machine can find the analysis application. For MSC.Nastran, this is the runtime script (typically the nast68 file); for MSC.Marc, ABAQUS, or GENERAL applications, this is the executable location.

rcpath How this machine can find the analysis application runtime configuration file: the MSC.Nastran nast68rc file or the ABAQUS site.env file. This is not applicable to MSC.Marc or GENERAL applications and should be filled with the keyword NONE.

The physical host information has the following fields associated with it:

Physical Host Name of the host machine for the use of the Analysis Manager.

Class Machine type (RS6K, HP700, etc.).

Max Maximum allowable concurrent processes for this machine.

MaxAppTsk Maximum application tasks. This is used when, say, four MSC.Nastran hosts are configured but there are only enough licenses for three concurrent jobs. Without this, the 4th job would always fail. With MaxAppTsk set to 3, the 4th job waits in the queue until one of the running jobs completes, then it gets submitted. It is ONLY present if the configuration file version is >= 2. This is set with the VERS: or VERSION: field at the top of the file.



The application information has the following fields associated with it:

If the scheduling system is a separate package (e.g., LSF or NQS), then the Analysis Manager submits jobs to a provided queue. Queues are described below. Also, if the scheduler is separate from the Analysis Manager, the maximum task field is not used: all tasks are submitted through the queue, and the queueing system executes or holds each task according to its own configuration. An example of a host.cfg file is given below. Each comment line must begin with a # character. All fields are separated by one or more spaces. All fields must be present.

#------------------------------------------------------
# Analysis Manager host.cfg file
#------------------------------------------------------
#
# A/M Config file version
# Que Type: possible choices are P3, LSF, or NQS
#
VERSION: 2
ADMIN: am_admin
QUE_TYPE: MSC
#
#------------------------------------------------------
# AM HOSTS Section
#------------------------------------------------------
#
# Must start with a "P3AM_HOSTS:" tag.
#
# AM Host:
#   Name to represent the choice as it will appear
#   on the AM menus.
#
# Physical Host:
#   Actual hostname of the machine to run the application on.
#
# Type:
#   1  - MSC.Nastran
#   2  - ABAQUS
#   3  - MSC.Marc
#   20 - User defined (General) application #1
#   21 - User defined (General) application #2
#   etc. (max of 29)
#
#   This field defines the application for this entry.
#   Each value will have a corresponding entry in the
#   "APPLICATIONS" section.
#

Note: The MaxAppTsk setting must be added manually. There is no widget in the AdmMgr to do this. If there are NO configuration files on start up of the AdmMgr, then it will set the version to 2 and use 1000 as the MaxAppTsk. If configuration files exist and version 2 is set, it will honor whatever is already there and pass them through. If version 1 is set, then MaxAppTsk is not written to the configuration files.

Type A number indicating analysis program type

Prog_name The name of the application job manager for this application

MSC.Patran name The name of the application which corresponds to the MSC.Patran analysis preference

Options Optional arguments for use with the GENERAL application.


# EXE_Path:
#   Where the executable entry is made.
#
# RC_Path:
#   Where the runtime configuration file (if present) is found.
#   Set to "NONE" if "General" application.
#
#------------------------------------------------------
# Physical Hosts Section
#------------------------------------------------------
#
# Must start with a "PHYSICAL_HOSTS:" tag.
#
# Class:
#   HP700 - Hewlett Packard HP-UX
#   RS6K  - IBM RS/6000 AIX
#   SGI5  - Silicon Graphics IRIX
#   SUNS  - Sun Solaris
#   LX86  - Linux
#   WINNT - Windows
#
# Max:
#   Maximum allowable concurrent tasks for this host.
#
#------------------------------------------------------
# Applications Section
#------------------------------------------------------
#
# Must start with an "APPLICATIONS:" tag.
#
# Type: See above for values
#
# Prog_name:
#   The name of the MSC.Patran AM Task Manager executable to start.
#   This field must be set to the following, based on the
#   application it represents:
#
#   MSC.Nastran     -> NasMgr
#   HKS/ABAQUS      -> AbaMgr
#   MSC.Marc        -> MarMgr
#   Any General App -> GenMgr
#
# option args:
#   This field contains the default command line which will
#   appear in the AM user interface configure menu. This
#   field is only valid for user defined (General) applications.
#   The command line can contain any text including any of the
#   following keywords (which will be evaluated at runtime):
#
#   $JOBFILE   Actual filename selected (w/o full path)
#   $JOBNAME   Jobname ($JOBFILE w/o extension)
#   $P3AMHOST  Hostname of AM host
#   $P3AMDIR   Dir on AM host where $JOBFILE resides
#   $APPNAME   Application name (P3 preference name)
#   $PROJ      Project Name selected
#   $DISK      Total Disk space requested (mb)
#
#
# AM Host       Physical Host   Type   EXE_Path                RC_Path
#---------------------------------------------------------------------
P3AM_HOSTS:

Venus_nas675 venus 1 /msc/msc675/bin/nast675 /msc/msc675/conf/nast675rc

Venus_nas68 venus 1 /msc/msc68/bin/nast68 /msc/msc68/conf/nast68rc



Venus_aba53     venus   2    /hks/abaqus             /hks/site/abaqus.env

Venus_mycode    venus   20   /mycode/script          NONE

Mars_nas68      mars    1    /msc/msc68/bin/nast68   /msc/msc68/conf/nast68rc

Mars_aba5       mars    2    /hks/abaqus             /hks/site/abaqus.env

Mars_mycode     mars    20   /mycode/script          NONE
#---------------------------------------------------------------------
#
# Physical Host   Class   Max
#--------------------------------------------------------------
PHYSICAL_HOSTS:

venus   SGI4D   2

mars    SUN4    1
#--------------------------------------------------------------
#
# Type   Prog_name   MSC P3 name   MaxAppTsk   [option args]
#--------------------------------------------------------------
APPLICATIONS:

1    NasMgr   MSC.Nastran   3

2    AbaMgr   ABAQUS        3

3    MarMgr   MSC.Marc      3

20   GenMgr   MYCODE        3   -j $JOBNAME -f $JOBFILE
#--------------------------------------------------------------


Disk Configuration File. This configuration file defines the scratch disk space and disk systems to use for temporary files and databases. Every AM host must have a filesystem associated with it.

In particular, the Analysis Manager’s MSC.Nastran Manager (NasMgr) generates MSC.Nastran File Management Section (FMS) statements for each job submitted. The FMS statements initialize and allocate each MSC.Nastran scratch and database file for each job. To define the files to be written for each scratch and database logical file, the Analysis Manager uses the disk configuration file (called disk.cfg), which lists the file systems to be used when running MSC.Nastran for each host in the host.cfg file. The disk configuration file therefore contains a list of each host, a list of each file system for that host, and the file system type. There are two Analysis Manager file system types: nfs, or local (leave the field blank). An example of the disk.cfg file:

#---------------------------------------------------------------------
# Analysis Manager disk.cfg file
#---------------------------------------------------------------------
#
# AM Host
#
#   AM host from the host.cfg file "MSC.Patran AM_HOSTS" section.
#
# File System
#
#   The filesystem directory
#
# Type
#
#   The type of filesystem. If the filesystem is local
#   to the machine, this field is left blank. If the
#   filesystem is NFS mounted, the string "nfs" appears
#   in this field.
#
#
#---------------------------------------------------------------------

Each comment line must begin with a # character. All fields are separated by one or more spaces. All fields must be present.

# AM Host       File System                 Type (nfs or blank)
#----------------------------------------------------------------------
Venus_nas675    /user2/nas_scratch
Venus_nas675    /venus/users/nas_scratch
#
Venus_nas68     /user2/nas_scratch
Venus_nas68     /venus/users/nas_scratch
Venus_nas68     /tmp
#
Venus_aba53     /user2/aba_scratch
Venus_aba53     /venus/users/aba_scratch
Venus_aba53     /tmp
#
Venus_mycode    /tmp
Venus_mycode    /server/scratch             nfs
#
Mars_nas68      /mars/nas_scratch
#
Mars_aba5.2     /mars/users/aba_scratch
Mars_aba5.2     /tmp
#
Mars_mycode     /tmp



In this example, the term file system is used to define a directory that may or may not be its own file system, and that already exists and has permissions so that any Analysis Manager user can create directories below it. It is recommended that the Analysis Manager file systems be directories with large amounts of disk space, restricted to the Analysis Manager’s use, because the Analysis Manager’s MSC.Nastran, MSC.Marc, ABAQUS, and GENERAL Managers only know about their own jobs and processes.

Queue Configuration File. If a separate scheduling system (i.e., LSF or NQS) is being used at this site, the Analysis Manager can interact with it, using the queue configuration file. This file has the same name as the queue type field in the host.cfg file (i.e., QUE_TYPE: LSF or NQS), with a .cfg extension (i.e., lsf.cfg or nqs.cfg). The queue configuration file lists each queue name and all hosts allowed to run applications for that queue. In addition, a queue install path is required, so that the Analysis Manager can execute queue commands with the proper path. An example of a queue configuration file is given below.

Each comment line must begin with a # character. All fields are separated by one or more spaces. All fields must be present.

#------------------------------------------------------
# Analysis Manager lsf.cfg file
#------------------------------------------------------
#
# Below is the location (path) of the LSF executables (i.e. bsub)
#
QUE_PATH: /lsf/bin
QUE_OPTIONS:
QUE_MIN_MEM:
QUE_MIN_DISK:
#
# Below, each queue which will execute MSC tasks is listed.
# Each queue contains a list of hosts (from host.cfg) which
# are eligible to run tasks from the given queue.
#
# NOTE:
#   Each queue can only contain one host of a given application
#   version (i.e., if there are two version entries for
#   MSC.Nastran, nas67 and nas68, then each queue
#   set up to run MSC.Nastran tasks could only include
#   one of these versions. To be able to submit to
#   the other version, create a separate, additional
#   MSC queue containing the same LSF queue name, but
#   referencing the other version)
#
TYPE: 1
#
# MSC Que       LSF Que    Hosts
#---------------------------------------------------------
Priority_nas    priority   mars_nas675, venus_nas675
Normal_nas      normal     mars_nas675, venus_nas675
Night_nas       night      mars_nas675
#---------------------------------------------------------
#
TYPE: 2
#
# MSC Que       LSF Que    Hosts
#---------------------------------------------------------
Priority_aba    priority   mars_aba53, venus_aba53
Normal_aba      normal     mars_aba53, venus_aba53
Night_aba       night      mars_aba53, venus_aba53
#---------------------------------------------------------


Organizational Group Configuration File. If you wish to link all organizational groups (running Queue Managers) together, so that any user may see and switch between them from within the Analysis Manager without setting any environment variables, create an org.cfg file in the top-level directory:

$P3_HOME/p3manager_files/org.cfg

where $P3_HOME is the MSC.Patran installation location.

Three fields are required in this file:

An example of this configuration file follows:

org The organizational group name

master host The host on which the Queue Manager daemon is running for the particular organizational group in question

port # The unique port ID used for this Queue Manager daemon. Each Queue Manager must have been started with the -port option.

#------------------------------------------------------
# MSC.Patran ANALYSIS MANAGER org.cfg file
#------------------------------------------------------
#
# Org       Master Host   Port
#------------------------------------------------------
default     casablanca    1500
atf         atf_ibm       1501
lsf_atf     atf_sgi       1502
support     umea          1503
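Given the three-field layout above, the master host and port for an organizational group can be looked up by skipping comment lines and matching the first field. The following shell sketch is illustrative only (the `org_lookup` helper and the temporary file path are not part of the Analysis Manager):

```shell
#!/bin/sh
# Look up the master host and port for an org in an org.cfg-style file.
# The awk filter skips '#' comment lines and matches the first field.
org_lookup() {
    org="$1"; cfg="$2"
    awk -v org="$org" '$1 !~ /^#/ && $1 == org { print $2, $3 }' "$cfg"
}

# Example using entries like those shown above:
cat > /tmp/org.cfg.example <<'EOF'
# Org     Master Host  Port
default   casablanca   1500
atf       atf_ibm      1501
EOF
org_lookup atf /tmp/org.cfg.example    # prints: atf_ibm 1501
```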



Separate Users Configuration File. In order to allow execution of analysis jobs on machines where the user does not have an account, the system administrator may have to set up special accounts or allow access to other accounts for users to submit jobs. This is done with the .p3amusers configuration file located in $P3_HOME/p3manager_files/<org>/conf.

The file format is simple: one user account name per line, each naming an account that other users are allowed to use when submitting jobs. As an example:

user1
user2
sippola
smith

The filename begins with a period (“.”), so it is hidden from a normal directory listing. The Queue Manager daemon must be restarted once this file is created or modified.

The capability or necessity of this separate user file has largely been obsoleted. In general, the following applies:

1. On Unix machines, if RmtMgrs are running as root then they can run the job as the user (or the separate user as specified by this file) with no problem.

2. On Unix machines, if RmtMgrs are running as a specific user then the job will run as that user regardless of the user (or separate user) who submitted the job.

3. On Windows, the job runs as whoever is running the RmtMgr on the PC. The user (and separate user) is ignored.

Group/Queue Feature. This configuration file, msc.cfg, allows the default least-loaded criteria to be modified when using the host grouping feature for automatically selecting the least-loaded machine to submit to. The file contents look like:

SORT_ORDER: free_tasks cpu_util last_job_time avail_mem free_disk
GROUP: grp_nas2004
MIN_DISK: 10
MIN_MEM: 5
MAX_CPU_UTIL: 95

The SORT_ORDER line lists the names of the sort criteria in the order used to rank eligible hosts. The remaining lines are repeated for each group whose defaults you want to change; that is, you must define a GROUP, MIN_DISK, MIN_MEM, and MAX_CPU_UTIL entry for each such group.

A group cannot contain multiple entries that use the same physical host (e.g., nast2004_host1 and nast2001_host1), because the Analysis Manager would not know which to use. In this case, just create another group name (such as grp_nas2001) and it will work as expected. You can have different applications in the same group with no problems. In the above example you could have used grp_nas2004 as the group name for all the MSC.Nastran entries (possibly renaming the group to make clear that it is for hosts which run MSC.Nastran), or you can keep them separate, with the added flexibility of defining a different sort order and util/mem/disk criteria for each application/group.

Note: Any user account that is configured in this manner must exist not only on the machine where the analysis is going to run, but also on the machine from which the job was submitted.


7.8 Starting the Queue/Remote Managers

This discussion pertains to Unix machines; see the end of this section for a discussion of Windows. The Queue and Remote Managers are the two daemons (or services on Windows) which run continuously. The Queue Manager (QueMgr executable) runs on the master node. The Remote Manager (RmtMgr executable) runs on all analysis hosts, and it is recommended that it run on all submit hosts also. The programs are located in the following directory:

$P3_HOME/p3manager_files/bin/<arch>

where <arch> is one of the following:

HP700 - Hewlett Packard HP-UX
RS6K - IBM RS/6000 AIX
SGI5 - Silicon Graphics IRIX
SUNS - Sun SPARC Solaris
LX86 - Linux (MSC or Red Hat)
WINNT - Windows 2000 or XP

QueMgr Usage: The Queue Manager can be manually invoked by typing

$P3_HOME/bin/QueMgr <args>

with the arguments below:

QueMgr -path $P3_HOME -org <org> -log <logfile> -port <#>

where:

$P3_HOME is the installation directory.

<org> Organization name for subgroup config files. Defaults to default.

<logfile> A different log filename for QueMgr. If not specified, the QueMgr.log located in $P3_HOME/p3manager_files/<org> is used.

<#> If the org.cfg file is to be used to allow users to interactively switch between organizational groups, then the QueMgr must be started with a unique port ID. The ID can be any number as long as it is unique, is not being used by anything else, and matches the port listed in org.cfg if that file is used.

Only -path is required, unless the QueMgr is started with a full path. Starting the QueMgr as root is recommended, although it is not a strict requirement; it can also be run as a separate user, such as the administrator account.

Example:

If the Analysis Manager is installed in /msc/patran200x and the master node is an IBM RS/6000 computer, log into the master node (as root if you want) and do the following:

/msc/patran200x/bin/QueMgr -path /msc/patran200x



If the Analysis Manager is installed on a filesystem that is not local to the master node and the QueMgr is started as root, use the -log option to specify a log file on a filesystem local to the master node; writing files as root onto network-mounted filesystems is sometimes not possible. Starting the QueMgr as a normal user also avoids this problem.

You may want to put this command line somewhere in a script so the Queue Manager is started as root each time the master node is rebooted. See Starting Daemons at Boot Time (p. 146).

RmtMgr Usage: The Remote Manager can be manually invoked by typing

$P3_HOME/bin/RmtMgr

where:

$P3_HOME is the installation directory. No arguments are necessary unless you start it from its own directory with ./RmtMgr, in which case you need the -path $P3_HOME argument.

The RmtMgr should not be started as root.

Example:

If the Analysis Manager is installed in /msc/patran200x and the analysis node is an IBM RS/6000 computer, log into the analysis node as root and do the following:

/msc/patran200x/bin/RmtMgr -path /msc/patran200x

All other arguments not specified will be defaulted. You may want to put this command line somewhere in a script so the Remote Manager is started each time the node is rebooted. See Starting Daemons at Boot Time (p. 146).

Note: There are other arguments that can be used when starting up the Queue Manager for more flexibility. See Analysis Manager Program Startup Arguments (p. 86).

Note: There are other arguments that can be used when starting up the Remote Manager for more flexibility. See Analysis Manager Program Startup Arguments (p. 86).


Starting Daemons at Boot Time

To restart the QueMgr (or RmtMgr) daemon when the master host workstation is rebooted, there are a number of approaches. Two are listed here for Unix platforms; the /etc/rc2.d method is recommended over the inittab method. These methods can vary from one Unix machine to another. If you have trouble, consult your system administrator.

Windows uses services. Manually installing and configuring these services is also described below.

Unix Method: rc

As root, do the following (in general terms):

Create a file in /etc/rc2.d called Sxxz_p3am where xx is a number as high as possible (say 99) and the name z_p3am is simply a name. (The higher number indicates that it will be executed last of all the scripts in this directory during startup.) In this file you place the script commands to start the QueMgr and RmtMgr. You can also add the su command to start up the daemons as a user.

Below is an example of a file called S99z_p3am:

#! /sbin/sh

# This script starts up the QueMgr and RmtMgr
# of the MSC.Patran Analysis Manager application.

# starts QueMgr as am_admin
su - am_admin -c "/etc/p3am_que start"

# starts the RmtMgr as am_admin
su - am_admin -c "/etc/p3am_rmt start"

What this script actually does is call another script (or two) that actually starts or stops the QueMgr and RmtMgr, but it could have been done directly in the above script. The contents of the p3am_que script are:

#! /usr/bin/csh -f

# This service starts/stops the QueMgr used with
# the Analysis Manager application.
if ( $#argv != 1 ) then
    echo "Usage: $0 { start | stop }"
    exit 1
endif
if ( $status != 0 ) then
    echo "Cannot determine platform. Exiting..."
    exit 1
endif
set P3_HOME = "/msc/patran200x"
switch ( $argv[1] )
    case start:
        if ( -x ${P3_HOME}/p3manager_files/bin/SGI5/QueMgr ) then
            ${P3_HOME}/p3manager_files/bin/SGI5/QueMgr
        endif
        breaksw
    case stop:
        set quepid = `ps -eo comm,user,pid | \
            egrep "QueMgr[ ]*[am_admin]" | awk '{print $3}'`
        foreach Qproc ( $quepid )

Note: The location of the rc2.d directory may vary from computer to computer. Check /etc and /sbin.



            kill -9 $Qproc
        end
        breaksw
    default:
        echo "Usage: $0 { start | stop }"
        exit 1
        breaksw
endsw

The p3am_rmt script would be identical, except that RmtMgr replaces QueMgr. This could also be done with a single script that takes the daemon type (RmtMgr or QueMgr) as an additional argument and starts or stops the one specified.
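The single-script variant suggested above can be sketched as follows. This is an illustration under stated assumptions, not a drop-in replacement: `build_cmd` only constructs the command line (a real script would execute it), the paths follow this section's examples, and `pkill` stands in for the ps/egrep/kill pipeline shown in the p3am_que script:

```shell
#!/bin/sh
# Sketch of one script handling both daemons: it takes the daemon name
# and the action, and builds the command that would be run.
P3_HOME=/msc/patran200x

build_cmd() {
    daemon="$1"; action="$2"
    case "$daemon" in
        QueMgr|RmtMgr) ;;
        *) echo "Usage: $0 { QueMgr | RmtMgr } { start | stop }"; return 1 ;;
    esac
    case "$action" in
        start) echo "$P3_HOME/bin/$daemon -path $P3_HOME" ;;
        stop)  echo "pkill -u am_admin -x $daemon" ;;
        *)     echo "Usage: $0 { QueMgr | RmtMgr } { start | stop }"; return 1 ;;
    esac
}

build_cmd QueMgr start   # -> /msc/patran200x/bin/QueMgr -path /msc/patran200x
```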

The above script can also be used to stop the daemons, as when the machine is brought down for a reboot. In this case you use a script in the rc0.d directory with a name of Kxx_p3am, where xx is a low number such as 01 to force it to be executed first among the scripts in this directory. The argument to the above script is then stop instead of start. This gives a clean and proper exit of the daemons when the machine is shut down. An example of a script called K01_p3am is:

#! /sbin/sh

# This script stops the QueMgr and RmtMgr
# of the MSC.Patran Analysis Manager application.

# stop the QueMgr
/etc/p3am_que stop

# stop the RmtMgr
/etc/p3am_rmt stop

UNIX Method: inittab:

As root do the following:

Edit the file, /etc/inittab, and add the following line at the end:

p3am:2:once:/bin/sh /etc/p3am >/dev/null 2>&1 # MSC.AM QueMgr daemon

Now create the file, /etc/p3am and add the following lines:

#!/bin/sh
QueMgr=$P3_HOME/bin/QueMgr
RmtMgr=$P3_HOME/bin/RmtMgr
if [ -x $QueMgr ]
then
    $QueMgr -path $P3_HOME
fi
if [ -x $RmtMgr ]
then
    $RmtMgr
fi

Note: The script above is specific to starting the QueMgr on SGI machines. For other machines, replace the SGI5 with the appropriate <arch> as described in Directory Structure (p. 82).

Note: The run level (the number following p3am: in the line above, here 2) must match the initdefault run level in the inittab file. Check this number to make sure you are using the correct one; otherwise the daemons will not start on reboot.
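To see which run level your inittab defaults to (the value the p3am entry must match), look at the initdefault entry. A sketch, parsing a sample line rather than the live /etc/inittab:

```shell
#!/bin/sh
# Sample initdefault line; on a real system, read /etc/inittab instead.
sample="is:2:initdefault:"
# The second colon-separated field is the default run level.
runlevel=$(printf '%s\n' "$sample" | awk -F: '/initdefault/ {print $2}')
echo "default run level: $runlevel"
```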

Page 152: CONTENTSgc.nuaa.edu.cn/hangkong/doc/ziliao/MSC_PATRAN/MSC.Patran Analys… · 1.1 Purpose MSC.Nastran, MSC.Marc, and MSC.Patran are analysis software systems developed and maintained

where $P3_HOME is the Analysis Manager installation directory referred to throughout this manual. In the file itself you must replace $P3_HOME with the actual installation path. Make sure that the file's permissions allow execution:

chmod 755 /etc/p3am

For Windows machines:

The Queue and Remote Managers are installed as Windows services. Installation from the media normally installs these services, and once a service is installed no further action is needed; you only have to stop and restart the services to reconfigure if you change the configuration files. If for some reason you must install the Analysis Manager manually, and assuming that the following directory exists:

$P3_HOME\p3manager_files

follow these steps:

1. Edit the files install_server.bat and install_client.bat in $P3_HOME\p3manager_files\bin\WINNT and make sure that the paths point to $P3_HOME\\p3manager_files\\bin\\WINNT\\QueMgr.exe and RmtMgr.exe, respectively. Make sure there are two backslashes between each entry.

2. Double click the install_server.bat and install_client.bat files. This will install the services.

3. Edit the gui_install.reg file and make sure the path is correct there as well, with two backslashes between each entry in the "Path"= field, e.g.,

"Path"="C:\\MSC.Software\\MSC.Patran\\2004\\p3manager_files\\bin\\WINNT\\AnalysisMgrB.dll"

4. Right-click the gui_install.reg file and select Merge. This merges it into the registry. (This step is not necessary if you installed from the CD.) If you get the message "No doctemplate is loaded. Cannot create new document." it is because you have not merged this file into the registry, or the path in it was incorrect.

5. Optional: You may want the Queue and Remote Manager services to start up as a user other than Administrator. To do this, right-click My Computer and select Manage. Open the Services tree, find MSCQueMgr (or MSCRmtMgr), select it, and choose Properties from the Action pull-down menu. Under the Log On tab you can select This Account and specify another account for the service to start up as.

6. You can start and stop the Queue Manager and/or Remote Manager from the Services form described in the previous step. You can also use the small command files in $P3_HOME\p3manager_files\bin\WINNT

called:
start_server.bat
start_client.bat
stop_server.bat
stop_client.bat
query_server.bat
query_client.bat
remove_server.bat
remove_client.bat


to start, stop, query, and remove the Queue Manager (server) service or the Remote Manager (client) service, as each file name describes.

If you follow the above steps, manual installation should be successful. You will still have to edit your configuration files and then reconfigure (or stop and start) the Queue Manager to read the configuration before you can successfully use the Analysis Manager. See Configuration Management Interface (p. 104).


MSC.Patran Analysis Manager User’s Guide

APPENDIX A
Error Messages

■ Error Messages


Error Messages

The following are possible error messages and their corresponding explanations and possible solutions. Only messages which are not self-explanatory are elaborated upon. If you are having trouble, check the QueMgr.log file, usually located in the directory $P3_HOME/p3manager_files/<org>/log or in the directory specified by the -log argument when starting the QueMgr. On Windows, check the Event Log under the Administrative Tools Control Panel (on UNIX, check the system log).

Sometimes errors occur because the RmtMgr is running as root (or Administrator on Windows) yet is trying to access network resources such as shared drives. For this reason it is recommended that RmtMgr (and QueMgr) be started as a normal user.

PCL Form Messages...

MSC.Patran Analysis Manager not installed.

Check for proper installation and authorization. Check with your system administrator. The Analysis Manager directory $P3_HOME/p3manager_files must exist and a proper license must be available for the Analysis Manager to be accessible from within MSC.Patran.

Windows...

No doctemplate is loaded. Cannot create new document.

This occurs because you have not merged the gui_install.reg file into the registry, or the path in it was incorrect. See For Windows machines: (p. 148) in Queue Manager (p. 134).

Note: The directories (conf, log, proj) for each set of configuration files (organizational structure) must have read, write, and execute (777) permission for all users. Missing permissions are the cause of many task manager errors.
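The permission requirement can be applied and verified with chmod and ls; a sketch using a scratch copy of the directory layout, with a hypothetical organization name default:

```shell
#!/bin/sh
# Create a scratch copy of the conf/log/proj layout and open up permissions.
base=/tmp/p3am_demo/default
mkdir -p "$base/conf" "$base/log" "$base/proj"
chmod 777 "$base/conf" "$base/log" "$base/proj"
# The first 10 characters of the mode string should read drwxrwxrwx.
ls -ld "$base/conf" | cut -c1-10
```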


Job Manager Daemon (JobMgr) Errors...

ERROR... Starting a JobMgr on local host.

JobMgr is unable to run most likely because of a permission problem. Make sure that the input deck is being submitted from a directory that has read/write permissions set.

================

311 ERROR... Unable to start network communication on server side.
335 ERROR... Unable to initiate server communication.

JobMgr is unable to create server communication. Possible reason is the host’s network interface is not configured properly.

================

312 ERROR... Unable to start network communication on client side.
ERROR... Unable to create and connect to client socket

JobMgr is unable to create client communication. Possible reason is the host’s network interface is not configured properly.

================

ERROR... Problem in socket accept
301 ERROR... Unable to accept network connection.
ERROR... Unable to accept message
ERROR... Unable to complete network accept.

JobMgr is unable to complete communication connection. Possible reason is the host’s network interface is not configured properly, or the network connectivity has been interrupted.

================

307 ERROR... Problem with network communication select.
ERROR... Select ready returned, but no data
ERROR... Problem in socket select
302 ERROR... Unable to read data from network connection.
306 ERROR... Data ready on network, but unable to read.
ERROR... Socket empty
327 ERROR... Data channel empty during read.
324 ERROR... Error with network communication select.
ERROR... Unknown error on select

JobMgr is unable to determine when data is available, or cannot read the data. Possible cause is loss of network connectivity.

================

ERROR... Problem reading socket message.
326 ERROR... Timeout while reading message.
325 ERROR... Error in message received.
304 ERROR... Unknown receive_message error
305 ERROR... Timeout with no responses

JobMgr received an error while trying to read data or received a timeout while waiting to read data. Possible cause is loss in network connectivity or the sending process has terminated prematurely.

================


321 ERROR... Unable to contact QueMgr
ERROR... Timeout with no response from server.

JobMgr received an error or timeout while trying to contact the QueMgr. Possible cause is loss in network connectivity or the QueMgr process has terminated prematurely.

================

ERROR... Unable to accept connection from A/M
ERROR... Timeout with no response from A/M
ERROR... Unable to contact ANALYSIS MANAGER interface.

JobMgr received an error or timeout while trying to contact the Analysis Manager interface. Possible cause is loss in network connectivity or the Analysis Manager interface process has terminated (either prematurely or by user intervention).

================

339 ERROR... Unable to initiate config network communication.
328 ERROR... Unable to receive gen_config struct
329 ERROR... Unable to receive app_config struct
330 ERROR... Unable to receive app_submit struct
ERROR... Unable to receive general config info.
ERROR... Unable to receive application specific config info.
ERROR... Unable to receive application specific submit info.

JobMgr is unable to receive configuration information from the Analysis Manager interface for a submit job request. Possible cause is loss in network connectivity or premature termination of the Analysis Manager interface process.

================

340 ERROR... Unable to send general config info.
341 ERROR... Unable to send application specific config info.
342 ERROR... Unable to send application specific submit info.

JobMgr is unable to send configuration information to the [Aba,Gen,Mar,Nas]Mgr process. Possible cause is loss in network connectivity or premature termination of the [Aba,Gen,Mar,Nas]Mgr process.

================

ERROR... Out of memory
303 ERROR... Unable to alloc memory.
ERROR... Unable to alloc mem for file sys max
ERROR... Unable to alloc mem for file sys space
ERROR... Unable to alloc mem for file sys names

This indicates the workstation is out of memory. Free up memory used by other processes and try to submit again at a later time.

================

344 ERROR... Unable to determine mail flag from config struct
ERROR... Unable to determine mail config setting.

JobMgr is unable to query memory for the mail config setting. Contact support personnel for assistance.

================


345 ERROR... Unable to determine delay time from config struct
ERROR... Unable to determine delay time setting.

JobMgr is unable to query memory for the delay time config setting. Contact support personnel for assistance.

================

350 ERROR... Unable to determine disk req from config struct
ERROR... Unable to determine disk requirement.

JobMgr is unable to query memory for the disk req config setting. Contact support personnel for assistance.

================

349 ERROR... Unable to determine memory req from config struct
ERROR... Unable to determine memory requirement.

JobMgr is unable to query memory for the memory req config setting. Contact support personnel for assistance.

================

352 ERROR... Unable to determine pos prog from config struct
ERROR... Unable to determine pos program.

JobMgr is unable to query memory for the pos prog config setting. Contact support personnel for assistance.

================

351 ERROR... Unable to determine pre prog from config struct
ERROR... Unable to determine pre program.

JobMgr is unable to query memory for the pre prog config setting. Contact support personnel for assistance.

================

353 ERROR... Unable to determine job filename from config struct
ERROR... Unable to determine job filename.

JobMgr is unable to query memory for the job filename config setting. Contact support personnel for assistance.

================

337 ERROR... Unable to determine specific index from submit struct
338 ERROR... Unable to determine submit specific host index.

JobMgr is unable to query memory for the specific index config setting. Contact support personnel for assistance.

================

ERROR... Unable to determine submit index from submit struct
ERROR... Unable to determine submit host index.

JobMgr is unable to query memory for the submit index config setting. Contact support personnel for assistance.

================


ERROR... Unable to determine/execute message

JobMgr received an unrecognizable message. Contact support personnel for assistance.

================

323 ERROR... Unable to fork child process
ERROR... Unable to fork new process.

JobMgr cannot fork a new process. Perhaps the system is heavily loaded or the maximum number of per-user processes has been exceeded. Terminate extra processes and try again.

================

336 ERROR... Unable to send file receive setup info.
ERROR... Unable to accept connection on recv file
ERROR... Unable to recv file
334 ERROR... Unable to receive data file.

JobMgr is unable to set-up and/or receive the data file from the [Aba,Gen,Mar,Nas]Mgr. Possible cause is loss of network connectivity or premature termination of the [Aba,Gen,Mar,Nas]Mgr process.

================

ERROR... Unable to send file
333 ERROR... Unable to transfer data file.

JobMgr is unable to send data file to [Aba,Gen,Mar,Nas]Mgr process. The executing host or network connection may be down.

================

346 ERROR... Unknown state
347 ERROR... Inconsistant state.

JobMgr is in an inconsistent state. Contact support personnel for assistance.

================

310 ERROR... Unable to open log file.

JobMgr is unable to open log file. Check write permission on the current working directory and log file (if it exists).

================

348 ERROR... Unable to cd to local work dir.

JobMgr is unable to change directory to the directory with the input filename specified. Check existence and permissions on this directory.

================

314 ERROR... Unable to determine true host address.
315 ERROR... Unable to determine actual host address.

JobMgr is unable to determine its host address. Possible causes are an invalid host file or name server entry.

================

316 ERROR... Unable to determine usrname.

JobMgr cannot determine the user name. Check the passwd and user account files for possible errors.


================

322 ERROR... Unable to open stdout stream.
ERROR... Unable to open stderr stream.

JobMgr cannot open stdout, stderr streams.

================

331 ERROR... Application is NOT supported.

JobMgr is asked to submit a job which is not from a supported application. Check installation by running the basic and network tests with the Administration tool.

================

343 ERROR... Received signal.

JobMgr received a signal from the operating system. JobMgr has encountered an error or was signalled by a user.

================

354 ERROR... Invalid version of Job Manager.

The current version of JobMgr does not match that of the QueMgr. An invalid/incomplete installation is most likely the cause. To determine what version of each executable is installed, type JobMgr -version, and QueMgr -version and compare output.
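The check amounts to comparing the two version strings; a sketch with hypothetical values in place of the actual -version output:

```shell
#!/bin/sh
# Hypothetical version strings; in practice, capture the output of
# `JobMgr -version` and `QueMgr -version` here.
jobmgr_ver="2004.0"
quemgr_ver="2004.0"
if [ "$jobmgr_ver" = "$quemgr_ver" ]; then
    echo "versions match"
else
    echo "version mismatch: JobMgr $jobmgr_ver vs QueMgr $quemgr_ver"
fi
```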

=======================================================


User Interface (P3Mgr) Errors...

ERROR... <type> argument must be a number from 1 - 8

P3Mgr must be started with an initial interface type of 1 through 8 only.

================

ERROR... -runtype not acceptable.

P3Mgr was started with an invalid runtype for ABAQUS.

================

ERROR... Getting P3_HOME environment variable.

P3Mgr is unable to determine the P3_HOME environment variable. Set the environment variable to the location of the MSC.Patran install path (<installation_directory> for example).

================

ERROR... Obtaining ANALYSIS MANAGER licenses.

P3Mgr is unable to obtain necessary license tokens. Check nodelock file or netls daemons.

================

ERROR... Problem reading .p3mgrrc
ERROR... Unable to write to file <.../.p3mgrrc[_org]>

P3Mgr is unable to read/write to the “rc” file to load/save configuration settings. Check the owner and permissions on the designated file.

================

ERROR... Unable to determine QueMgr host or port

P3Mgr cannot determine which host and port to use to connect to a valid Queue Manager. The QueMgr.sid file is no longer used. Set the P3_MASTER and P3_PORT environment variables, or use the org.cfg file.
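For example, in a Bourne shell the two environment variables could be set like this (the host name and port number here are hypothetical):

```shell
#!/bin/sh
# Tell the Analysis Manager interface where the Queue Manager listens.
P3_MASTER=quehost.example.com   # hypothetical master host
P3_PORT=1500                    # hypothetical port number
export P3_MASTER P3_PORT
echo "QueMgr expected at $P3_MASTER:$P3_PORT"
```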

================

ERROR... QueMgr host <> from org.cfg file inconsistent
ERROR... QueMgr port <> from org.cfg file inconsistent

P3Mgr has found the QueMgr host and/or port from the QueMgr.sid file to be different from that found in the org.cfg file. Check the org.cfg file and modify it to match the current QueMgr settings, or restart the QueMgr with the org.cfg settings. The QueMgr.sid file is no longer used so this message should never appear. Call MSC support personnel if this happens.

================

ERROR... Unable to determine address of master host <>

P3Mgr is unable to determine the address of the master host provided. Check the QueMgr.sid file for proper hostname and/or the org.cfg file and/or the P3_MASTER environment variable. Also check for an invalid host file or name server entry. The QueMgr.sid file is no longer used so this message should never appear. Call MSC support personnel if this happens.

================


ERROR... Unable to Contact QueMgr to determine version information.

P3Mgr is unable to contact the QueMgr. Check that the QueMgr is up and running, that the master host is up, and that the P3Mgr host and the master host can communicate over the network.

================

ERROR... Problem creating socket to communicate with Queue Mgr
ERROR... Unable to send request to Queue Mgr. Is the Queue Mgr running
ERROR... #<> sending request to Queue Mgr.
ERROR... #<> receiving message back from QueMgr.
ERROR... In message back from QueMgr.
ERROR... Communicating with Queue Mgr.
ERROR... Invalid message received from QueMgr.
ERROR... Timeout Waiting For Message From QueMgr.

P3Mgr cannot create a communication socket, or is unable to send a request to the QueMgr process. Check that the QueMgr process is up and running, the QueMgr host is up and running, and the network is connected.

================

ERROR... Creating Communications Socket For Job Monitor
ERROR... Establishing communication to Job Mgr with port <>.
ERROR... In Job Mgr Communication

P3Mgr cannot create a communication socket, or is unable to send a request to the JobMgr process. Check that the JobMgr process is up and running, and the network is connected.

================

ERROR... Sending generic configuration structure to Job Mgr
ERROR... Sending application configuration structure to Job Mgr
ERROR... Sending submit structure to Job Mgr

P3Mgr cannot send configuration info to the JobMgr process. Check that the JobMgr process is up and running, and that the network interface is configured properly.

================

ERROR... An incompatible version of the QueMgr is currently running.

P3Mgr has determined that the version of the QueMgr presently running is not compatible. An incomplete or invalid installation is most likely the cause. Type P3Mgr -version and QueMgr -version and compare the output. Re-install the software if necessary.

================

ERROR... No valid applications defined in QueMgr configuration

P3Mgr has been started with an application not supported by the current configuration used by the QueMgr. Update the configuration files to include the new application and restart the QueMgr before continuing.

================

ERROR... Org <> does not contain any defined applications.

P3Mgr has been started with (or switched to) an organization (group) which does not contain any applications. Check the configuration files and restart the QueMgr process for the designated organization.

================


ERROR... Filename is too long. Shorten jobname

P3Mgr can only submit jobs whose filenames are no longer than 32 characters. Shorten the job filename to fewer than 32 characters and submit again.

================

ERROR... Job <> is not currently active.

P3Mgr was asked to monitor or delete a job with a given name (and owner) which cannot be located in the queue.

================

ERROR... File <> does not exist... Enter A/M to select file explicitly.

P3Mgr was asked to monitor a completed job (using the mon file) from the jobname information only and this file cannot be found. Use the Full Analysis Manager interface and the file browser (under Monitor, Completed Job) to select an existing mon file.

================

ERROR... Monitor file <> does not exist

P3Mgr was asked to monitor a completed job and the selected mon file does not exist. Select an existing mon file and try again.

================

ERROR... You must choose an A/M monitor file (.mon extension).

P3Mgr was asked to monitor a completed job, but no mon file was specified. Select a mon file and try again.

================

ERROR... Unable to parse Monitor File

P3Mgr encountered an error while parsing the mon file. Contact support personnel for assistance.

================

ERROR... More than one active job named <> found. Request an active list
ERROR... More than one active job named <> found. Enter A/M to explicitly select job
ERROR... No jobs named <> owned by <> are currently active

P3Mgr was asked to monitor or delete a job by job name (and owner) and no such job can be located in the queues. Select from an active list of jobs in the Full Analysis Manager interface (Monitor, Running Jobs).

================

ERROR... No Host Selected. Submit cancelled.
ERROR... No Queue Selected. Submit cancelled.

P3Mgr attempted to submit a job, but no host or queue was selected. Select a host or queue and try to submit again.

================

ERROR... QueMgr has been reconfigured. Exit and restart to continue.

P3Mgr has found the QueMgr has been reconfigured (restarted) and so the P3Mgr’s copy of the configuration information may be invalid. Exit the P3Mgr interface and restart to load the latest configuration information before continuing.


================

ERROR... Starting a Job Mgr on local host.

P3Mgr is unable to spawn a JobMgr process. Perhaps the system is heavily loaded or the maximum number of per-user processes has been reached. Terminate unneeded processes and try to submit again.

================

ERROR... Unable to alloc mem for load info.
ERROR... Unable to allocate memory for org structure

This indicates the workstation is out of memory. Free up memory used by other processes and try again.

================

ERROR... Unable to open log file

P3Mgr is unable to open a log file. Check file permissions if it exists, or check local directory access/permissions.

================

ERROR... Unable to open unique submit log file

P3Mgr is unable to open a submit log file. Check file permissions if it exists, or check local directory access/permissions.

================

ERROR... Unknown version of ANALYSIS MANAGER .p3mgrrc file

P3Mgr has attempted to read a .p3mgrrc file from an unsupported version. Remove the .p3mgrrc file and save the configuration settings in a new .p3mgrrc file.

================

ERROR... Could not open file <>

P3Mgr was asked to submit a job, but no filename was given. Select an input filename and try to submit again.

================

ERROR... File <> does not exist.

P3Mgr has been asked to submit a file, but no such file can be found. Select an existing input file (or check file permissions) and submit again.


Additional (submit) Errors...

ABAQUS:

ERROR... JobName <> and Restart JobName <> are identical.

ABAQUS cannot have jobs where the job name and the restart job name are the same. Change one or the other and re-submit.

================

ERROR... Unable to open input file <>

P3Mgr is unable to open the designated input file. Check file permissions.

================

ERROR... JobName <> and Input Temperature File JobName <> are identical.

ABAQUS cannot have jobs where the job name and the temperature data file job name are the same. Change one or the other and re-submit.

================

ERROR... *RESTART, READ found but no restart jobname specified.

ABAQUS “RESTART” card encountered, but no filename specified. Add filename to this card and re-submit.

MSC.Nastran:

================

ERROR... File <> cannot contain more than one period in its name.

P3Mgr will currently only allow MSC.Nastran jobs to contain one period in their filename. Rename the input file to contain no more than one period and re-submit.

================

ERROR... File <> must begin with an alpha character.

P3Mgr currently requires MSC.Nastran job filenames to start with an alphabetic character, not a number. Rename the input file to start with a letter (A-Z, a-z) and re-submit.
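The two filename rules above (at most one period, and a leading letter) can be checked before submitting; a sketch in Bourne shell (the helper name check_jobname is hypothetical, not part of the Analysis Manager):

```shell
#!/bin/sh
# Validate an MSC.Nastran job filename: it must start with a letter and
# contain at most one period.
check_jobname() {
    name=$1
    case "$name" in
        [A-Za-z]*) ;;
        *) echo "$name: must begin with a letter"; return 1 ;;
    esac
    # Count the periods by deleting every other character.
    dots=$(printf '%s' "$name" | tr -cd '.' | wc -c)
    if [ "$dots" -gt 1 ]; then
        echo "$name: contains more than one period"; return 1
    fi
    echo "$name: ok"
}

check_jobname wing_model.bdf
check_jobname 2ndrun.bdf
```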

================

ERROR... Include cards are too early in file.

P3Mgr currently supports MSC.Nastran jobs only with Include cards between the “BEGIN BULK” and “ENDDATA” cards. Place the contents of any include files that appear before the “BEGIN BULK” card directly into the input file and re-submit.

================

ERROR... BEGIN BULK card present with no CEND card present.

P3Mgr has encountered a “BEGIN BULK” card without finding a “CEND” card. P3Mgr currently requires a “CEND” card whenever a “BEGIN BULK” card is present in the input file. Add a “CEND” card to the input file and re-submit.

================


ERROR... ENDDATA card missing.

P3Mgr has encountered the end of the input file without finding an “ENDDATA” card. Add an “ENDDATA” card and re-submit.

MSC.Marc:

================

ERROR... Unable to load MSC.Marc configuration info.

The MSC.Marc config info could not be transferred over the network to the MarMgr from the JobMgr running on the submit machine. Check network connectivity and the submit machine for any problems.

================

ERROR... Unable to load MSC.Marc submit info

The MSC.Marc submit info could not be transferred over the network to the MarMgr from the JobMgr running on the submit machine. Check network connectivity and the submit machine for any problems.

================

INFORMATION: Total disk space req of %d (kb) met

Informational message indicating that enough disk space has been found on the file systems configured for MSC.Marc to run.

================

WARNING: Total disk space req of %d (kb) cannot IMMEDIATLEY be met. Continuing on regardless ...

Informational message indicating that there is currently not enough free disk space to honor the space requirement provided by the user. The job will continue, however, because the space may be freed (perhaps by another job finishing) before this job needs it.

================

ERROR... Total disk space req of %d (kb) cannot EVER be met. Cannot continue.

There is not enough disk space (free or used) to honor the space requirement provided by the user so the job will stop. Add more disk space or check the requirement specified.

================

WARNING: Cannot determine if disk space req %d (kb) can be met. Continuing on regardless ...

Informational message indicating that the disk space of the file system(s) configured for MSC.Marc cannot be determined. The job will continue anyway, as there may be enough space. Sometimes, if the file system is NFS-mounted, its size is not available.

================

INFORMATION: No disk space requirement specified

This informational message is printed when the user provides no disk space requirement.

================


ERROR... Unable to alloc ## bytes of memory in sss, line lll

The MarMgr is unable to allocate memory for its own use. Check the memory and swap space on the executing machine.

================

ERROR... Unable to receive file sss

MarMgr could not transfer a file from the JobMgr on the submit machine. Check the network connectivity and submit machine for any problems.


Editing (p3edit) Errors...

ERROR... Unable to allocate enough memory for file list.

This indicates the workstation is out of memory. Free up memory used by other processes and try again.

================

ERROR... Unable to determine file statistics for <>

This indicates the operating system is unable to determine file statistics for the requested file. The requested file most likely does not exist. Select an existing file to view/edit and try again.

================

ERROR... File <> does not appear to be ASCII

P3edit can only view/edit ASCII files, and the requested file appears to be non-ASCII. Select an ASCII file to view/edit and try again.

================

ERROR... File <> is too large to view

Due to memory constraints, P3edit is limited to viewing/editing files no larger than 16 MB (except on Cray and Convex machines, where the limits are 60 MB and 40 MB, respectively). Select a smaller file to view/edit and try again.

================

ERROR... Unable to open file <>

P3edit is unable to open the requested file to load into the viewer/editor. Select an existing file to view/edit and try again.

================

ERROR... File <> is empty

P3edit has found the selected file is empty. Currently P3edit can only view/edit files with data. Select a file containing data to view/edit and try again.

================

ERROR... Unable to seek to beginning of file <>

P3edit is unable to seek to the beginning of the selected file. A system error may have occurred during the seek, or the file may be corrupted.

================

ERROR... Unable to read in file <>

P3edit is unable to read the file data. The file may be corrupted.

================

ERROR... System error while reading file <>

A system error occurred during the file read. Try to view/edit the file again later.

================


ERROR... Unable to read text

P3edit is unable to read the file completely, or is unable to read the text from memory when writing the file.

================

ERROR... Unable to scan text

P3edit is unable to scan the text in memory to search for the text pattern.

================

ERROR... Unable to write text to file

P3edit is unable to write the text out to the file. Perhaps the disk is full, or a system error occurred during the write call.


RmtMgr Errors...

RmtMgr errors are returned to the connecting program and also printed in the OS system log (syslog), or in the Event Viewer on Windows.

================

RmtMgr Error RM_CANT_GET_ADDRESS

This should not happen, but if for some reason the OS/network cannot determine the network address of the machine on which RmtMgr is started, this message is printed before RmtMgr exits. Contact your system administrator for more information.

================

RmtMgr Error port number #### invalid

This should not happen, but if for some reason the program contacting RmtMgr does not supply a valid port, this message is printed. The connection to RmtMgr cannot be completed, but RmtMgr will continue listening for other connections.

================

RmtMgr Error RM_CANT_CREATE_SERVER_SOCKET

If RmtMgr cannot create its main server socket for listening, this error message is printed before RmtMgr exits.

================

RmtMgr Error accept failed, errno = ##

If the accept system call fails on the socket after a connection is established this will be printed. The error number should be checked against the system errno list for the platform to see the cause.
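Checking an error number against the system errno list can be done with any POSIX-aware tool. A quick sketch in Python; the `explain_errno` helper is illustrative, not part of the Analysis Manager:

```python
import errno
import os

def explain_errno(code: int) -> str:
    """Map a numeric errno (e.g. the ## printed by RmtMgr) to its symbolic
    name and the platform's human-readable description."""
    name = errno.errorcode.get(code, "UNKNOWN")
    return f"{name}: {os.strerror(code)}"

# errno 13 is EACCES ("Permission denied") on POSIX systems
print(explain_errno(13))
```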

================

RmtMgr Error unable to end proc ## errno = ##

If RmtMgr is asked to kill/end a process and it is unable to do so then this message is printed. The errno should give the reason.

================

RmtMgr Error Invalid message of <xxxxx>

An invalid message format/syntax was sent to RmtMgr. The message will be ignored and RmtMgr will continue, listening for other connections.

================

RmtMgr Error Invalid message status code = ##

The status on the received message is not correct. The status code will help determine the cause; the most likely reason is a network connectivity problem.

================

RmtMgr Error Invalid NULL message

An invalid message format/syntax was sent to RmtMgr. The message will be ignored and RmtMgr will continue, listening for other connections.

================


RmtMgr Error unable to determine system type, error = ##

RmtMgr is unable to determine what kind of platform it is running on. RmtMgr will exit. Check the supported platform/OS list.

================

RmtMgr Error WSAStartup failed, code = ##

Windows network/socket communication initialization failed. The code should be checked against the Windows system error list for the cause.

================

RmtMgr Error WSAStartup version incompatible, code = ##

If a contacting program is of a different version than the RmtMgr then this message is printed. The RmtMgr will continue, listening for other connections.


QueMgr Errors...

Sometimes errors occur because RmtMgr is running as root (or Administrator on Windows) yet is trying to access network resources such as shared drives. For this reason it is recommended that RmtMgr (and QueMgr) be started as a normal user.

ERROR... Determining computer architecture

QueMgr is unable to recognize its host architecture. Check installation and OS version compatibility.

================

ERROR... Invalid -port option argument <>
ERROR... Invalid -usr option argument <>

QueMgr was started with an invalid port or user argument. Select a valid port or user name argument and try to start the QueMgr again.

================

ERROR... Can’t Find Job Number To Remove.
ERROR... Can’t Find Job Number To Resume.
ERROR... Can’t Find Job Number To Suspend.
ERROR... Can’t Find Job To Resume.
ERROR... Can’t Find Job To Suspend.
202 ERROR... Job to remove not found by the Que Manager.

QueMgr is unable to locate job in internal list to remove, resume or suspend. Contact support personnel for assistance.

================

ERROR... Can’t Resume Job unless its RUNNING and SUSPENDED.
ERROR... Can’t Suspend Job unless its RUNNING.

QueMgr received an invalid suspend/resume request. QueMgr can only suspend running jobs, and can only resume suspended jobs.

================

203 ERROR... Problem creating com file for Task Manager execution.

QueMgr is unable to create a com file on the eligible host(s) for execution. Possible causes are lack of permission connecting to the eligible host(s) as the designated user, or an incorrect path or permissions on the directory on the eligible host(s). (Use the Administration tool to check this.) The major cause of this error is that the specified user does not have remote shell access from the Master Host to the Analysis Host. To resolve this, add the Master Host name to the hosts.equiv file or to the user's .rhosts file on the Analysis Host.

================

ERROR... Creating host/port file <>.

QueMgr is unable to create the QueMgr.sid file containing its host and port number. Possible causes are an invalid sid user name (the -usr command line option), an invalid organization (the -org command line option), or invalid network or directory/file permissions. This file and -usr are no longer used, and the message should never appear; call MSC support if it does.

================

ERROR... Opening log file <>
ERROR... Opening rdb file <>


ERROR... Seeking to end of rdb file <>
ERROR... Unable to determine next valid job number
RECONFIG ERROR... Unable to determine next valid job number

QueMgr is unable to open the log and/or rdb files. Check the owner and permissions of these files. If the rdb file is corrupted, QueMgr may not be able to seek to its end and determine the next available job number to use.

================

ERROR... Out Of Memory
ERROR... Problem Allocating memory for return string
ERROR... Unable to alloc mem for load info
ERROR... Unable to alloc memory for load information
RECONFIG ERROR... Unable to alloc mem for load info

The workstation is out of memory. Free up memory used by other processes and restart the QueMgr.

================

ERROR... Problem Creating Socket.
ERROR... Problem Binding Socket.
ERROR... Problem Connecting Socket.
ERROR... Problem Reading Socket Message.
ERROR... Problem Writing To Socket.
ERROR... Problem in socket accept.
ERROR... Problem in socket select.

QueMgr cannot create/complete/read/write network communication. Possible causes are an invalid network interface configuration or a loss of network connectivity.

================

ERROR... Unable to signal Task

QueMgr is unable to send signal to [Aba,Gen,Mar,Nas]Mgr. Check network permission/access using the Administration tool.

================

ERROR... Problem Telling JobMgr That Job Is Being Killed.

QueMgr has been requested to terminate a job and cannot inform the JobMgr of this. Possible causes are a loss of network connectivity, unexpected termination of the JobMgr process, or the JobMgr workstation being down.

================

ERROR... Host Index <> Received, Max Index is <>
ERROR... Queue Index <> Received, Max Index is <>
ERROR... Specific Index <> Received, Max Index is <>

QueMgr was asked to submit a job with an invalid index. Contact support personnel for assistance.

================

ERROR... User: <> can not delete job number <> which is owned by: <>
201 ERROR... User can not kill a job owned by someone else.

QueMgr was asked to delete a job from a user other than the one who submitted the job. Only the user who submitted a job is eligible to delete it.

================

ERROR... You must be the root user to run QueMgr.

The QueMgr process must run as the root user. Restart the QueMgr as the root user.


================

209 ERROR... Unable to start Task Manager.

QueMgr is unable to start up an [Aba,Gen,Mar,Nas]Mgr process. Check network/host connections and access, and install tree path on the remote host. (Use the Administration tool to check network permissions.)

================

ERROR... Unable to start Load Manager on host <>

QueMgr is unable to start up a LoadMgr process. Check network/host connections and access, and the install tree path on the remote host. Also check the admin user account access. (Use the Administration tool to check network permissions.) This message is obsolete; RmtMgr is now used.

================

205 ERROR... Error submitting job to NQS. See Que Manager Log.
211 ERROR... Unable to submit task to NQS queue.

QueMgr received an error while trying to submit a job to an NQS queue. Check the QueMgr.log file for the more detailed NQS error.

================

206 ERROR... Error submitting job to LSF. See Que Manager Log.
207 ERROR... Error in return string from LSF bsub. See Que Manager Log.
210 ERROR... Unable to submit task to LSF queue.

QueMgr received an error while trying to submit a job to an LSF queue. Check the QueMgr.log file for the more detailed LSF error.

================

214 ERROR... Unable to delete task from NQS queue.

QueMgr is unable to delete task from an NQS queue. Perhaps the task has finished or has already been deleted by an outside source.

================

213 ERROR... Unable to delete task from LSF queue.

QueMgr is unable to delete task from an LSF queue. Perhaps the task has finished or has already been deleted by an outside source.

================

217 ERROR... Job killed from outside queue system.

QueMgr cannot find job in queue when it is expected to be there. QueMgr can only assume the job was deleted from outside the Analysis Manager environment.

================

218 ERROR... Invalid version of Task Manager.

The current version of [Aba,Gen,Mar,Nas]Mgr does not match that of the QueMgr. An invalid/incomplete installation is most likely the cause. To determine what version of each executable is installed, type [Aba,Gen,Mar,Nas]Mgr -version and QueMgr -version and compare the output.


Application Manager (AbaMgr, GenMgr, MarMgr, NasMgr) Errors...

420 ERROR... Unable to determine host address.

[Aba,Gen,Mar,Nas]Mgr is unable to determine its host address. Possible causes are an invalid host file or name server entry.

================

ERROR... File <> is NOT executable
442 ERROR... Unable to execute specified application file.
ERROR... Unable to exec new application process

The file to be run is not executable. Check file/directory permissions.

================

ERROR... File <> is NOT found
424 ERROR... Unable to determine application executable path.

Executable file cannot be found. Check application installation.

================

ERROR... Inconsistant number of valid file systems
410 ERROR... Bad file system count.

[Aba,Gen,Mar,Nas]Mgr has received an invalid number of file systems.

================

436 ERROR... Application errors.

The application has terminated with errors. This is not an Analysis Manager error, but just indicates there are errors in the analysis. Check and modify the input file and try again.

================

ERROR... Pos_application fatal
434 ERROR... Application fatal in pos routine.

The application received a fatal error in the pos routine. Contact support personnel for assistance.

================

ERROR... Pre application error
435 ERROR... Application fatal in pre routine.

The application received a fatal error in the pre routine. Contact support personnel for assistance.

================

ERROR... Abort_application fatal

The application received a fatal error in the abort routine. Contact support personnel for assistance.

================

426 ERROR... Unable to alloc mem.
ERROR... Unable to alloc mem for File_Sys


ERROR... Unable to alloc mem for executable path
ERROR... Unable to alloc mem for file system names

The workstation is out of memory. Free up memory used by other processes and try again.

================

403 ERROR... Unable to initiate network communication.
413 ERROR... Unable to initiate file transfer network communication.
423 ERROR... Problem with network communication accept.
415 ERROR... Problem with network communication select.
404 ERROR... Problem with network communication select.
ERROR... Unknown error on select

[Aba,Gen,Mar,Nas]Mgr is unable to start/complete network connection/read. Possible causes are loss in network connectivity or premature termination of sending process.

================

431 ERROR... Unable to connect network communication.
422 ERROR... Unable to contact QueMgr

Process is unable to contact the QueMgr. Perhaps the QueMgr host is down, or the QueMgr process is not running or the network is down.

================

ERROR... Timeout with no response from JobMgr
ERROR... Unable to accept connection from JobMgr
430 ERROR... Unable to contact JobMgr

Process is unable to contact the JobMgr. Perhaps the JobMgr host is down, or the JobMgr process is not running or the network is down.

================

ERROR... Unable to create info socket
417 ERROR... Unable to initiate job info network communication.
414 ERROR... Unable to determine job info over network.

[Aba,Gen,Mar,Nas]Mgr is unable to start info socket communication. Check network interface configuration.

================

ERROR... Unable to determine job filename

Process is unable to determine job filename. Contact support personnel for assistance.

================

ERROR... Unable to determine number of clock tics/sec
425 ERROR... Unable to determine machine clock rate.

The process cannot determine the machine's setting for the number of clock ticks per second. Check the machine's operating system manual for further assistance.
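On POSIX systems the setting in question can be queried directly with sysconf; a minimal sketch (this only illustrates what the process looks up, not how the Analysis Manager implements it):

```python
import os

# SC_CLK_TCK is the number of clock ticks per second used to scale
# process CPU times; commonly 100 on modern UNIX systems.
ticks_per_sec = os.sysconf("SC_CLK_TCK")
print(f"clock ticks per second: {ticks_per_sec}")
```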

================

ERROR... Unable to fork child process
418 ERROR... Unable to fork new process

[Aba,Gen,Mar,Nas]Mgr cannot fork a new process. Perhaps the system is heavily loaded, or the maximum number of per-user processes has been exceeded. Terminate extra processes and try again.
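One quick way to see whether the per-user process limit is the culprit is to inspect the limit itself. A sketch, assuming a POSIX platform where the resource module exposes RLIMIT_NPROC:

```python
import resource

# Soft/hard limits on the number of processes this user may own;
# fork() starts failing once the soft limit is reached.
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
print(f"per-user process limit: soft={soft} hard={hard}")
```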


================

ERROR... Unable to load file system info
409 ERROR... Unable to load file system information.

Contact support personnel for assistance.

================

408 ERROR... Unable to locate this host in config structure
ERROR... Unable to locate this host in config host list.

Contact support personnel for assistance.

================

440 ERROR... Unable to determine current working directory.

The process could not determine its current working directory. Check the file systems designated for executing [Aba,Gen,Mar,Nas]Mgr.

================

427 ERROR... Unable to create work file system dirs.
ERROR... Unable to make proj sub-dirs off file systems

Process could not make directories off main file system entries in the configuration. Check file system/directory access/permission. (Use the Administration tool.)

================

428 ERROR... Unable to create local work dir.
ERROR... Unable to make unique dir off proj dir

Process is unable to make a unique named directory below the designated file system/directory it is currently executing out of. Check file system/directory access/permission. (Use the Administration tool.)

================

411 ERROR... Unable to cd to local work dir.
429 ERROR... Unable to cd to local work dir.
ERROR... Unable to cd to unique dir off proj dir

Process is unable to change directory to the unique directory created below the file system/directory designated in the configuration. Check owner and permission of parent directory.

================

400 ERROR... Unable to open log file.
412 ERROR... Unable to re-open log file.
401 ERROR... Unable to open stderr stream
421 ERROR... Unable to open stdout stream

[Aba,Gen,Mar,Nas]Mgr is unable to open or re-open its log, stdout, or stderr stream files. Check the current directory permissions.

================

ERROR... Unable to place process in new group

Process is unable to change process group. Contact support personnel for assistance.

================


ERROR... Unable to obtain config info from QueMgr
402 ERROR... Unable to receive configuration info.
ERROR... Unable to receive gen_config struct
ERROR... Unable to receive app_config struct
ERROR... Unable to receive app_submit struct
405 ERROR... Unable to receive general config info.
406 ERROR... Unable to receive application config info.
407 ERROR... Unable to receive application submit info.

[Aba,Gen,Mar,Nas]Mgr is unable to receive configuration information from JobMgr. Possible causes are network connectivity loss, or the JobMgr host is down.

================

ERROR... Unable to determine configuration host index
ERROR... Unable to determine <> submit paramters

[Aba,Gen,Mar,Nas]Mgr is unable to query memory for either a host index or submit parameters. Contact support personnel for assistance.

================

416 ERROR... Unable to receive data file.
ERROR... Unable to recv file <> from JobMgr

Process is unable to receive file from JobMgr. Possibly the network is down, or the JobMgr host is off the network, or the JobMgr host is down.

================

ERROR... file: <> cannot be sent

The process cannot send the file to the JobMgr. Possibly the network is down, the JobMgr host is off the network, or the JobMgr host is down.

================

432 ERROR... Task aborted.
437 ERROR... Task aborted while executing.
438 ERROR... Task aborted before execution.
439 ERROR... Task aborted after execution.

The [Aba,Gen,Mar,Nas]Mgr has been aborted. This is not necessarily an Analysis Manager error, but an indication that the analysis has been terminated by the user.

================

441 ERROR... Received 2nd signal.

The process received a signal, either from an abort (by the user) or from an internal error, and during the shutdown procedure a second signal occurred, indicating an error in the shutdown procedure.

================

ERROR... Total disk space req of %d (kb) cannot be met

[Aba,Gen,Mar,Nas]Mgr is unable to find the required amount of free disk space across all designated file systems of this host for the analysis to be run. Either free up some disk space, reduce the amount of disk space requested in the interface, or submit the job to a different host (with a different set of file systems).
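The check the message describes, summing free space across all designated file systems, can be reproduced by hand. A minimal sketch; the helper names and any path arguments are illustrative:

```python
import os

def free_kb(path: str) -> int:
    """Free space, in KB, on the file system containing `path`."""
    st = os.statvfs(path)
    return st.f_bavail * st.f_frsize // 1024

def total_free_kb(file_systems) -> int:
    """Sum of free space over all designated file systems."""
    return sum(free_kb(fs) for fs in file_systems)

# e.g. compare total_free_kb(["/scratch1", "/scratch2"]) against the
# disk space requirement (in KB) requested in the interface
print(total_free_kb(["/"]))
```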

================


ERROR... Unable to temporarily rename input file

[Aba,Gen,Mar,Nas]Mgr is unable to rename input file temporarily. Contact support personnel for assistance.

================

ERROR... File <> cannot be found
ERROR... Unable to open input file

Process has transferred the designated file, but is now unable to locate it for opening or reading. Check network connections, JobMgr host, and file system permissions.

================

443 ERROR... Invalid version of Task Manager

The current version of [Aba,Gen,Mar,Nas]Mgr does not match that of the QueMgr/RmtMgr. An invalid/incomplete installation is most likely the cause. To determine what version of each executable is installed, type [Aba,Gen,Mar,Nas]Mgr -version, and QueMgr/RmtMgr -version and compare output.

Additional application specific Errors...

ABAQUS (AbaMgr):

ERROR... Unable to create local environment file

AbaMgr is unable to create a local abaqus.env file. Check file system/directory free space and permissions.

================

ERROR... Unable to load ABAQUS configuration info

AbaMgr is unable to load configuration information from internal memory. Contact support personnel for assistance.

================

ERROR... Unable to load ABAQUS submit info

AbaMgr is unable to load submit information from internal memory. Contact support personnel for assistance.

================

GENERAL (GenMgr):

ERROR... Unable to load GENERAL configuration info

GenMgr is unable to load configuration information from internal memory. Contact support personnel for assistance.

================

ERROR... Unable to load GENERAL submit info

GenMgr is unable to load submit information from internal memory. Contact support personnel for assistance.

================


MSC.Nastran (NasMgr):

ERROR... ASSIGN statement <> contains relative pathname

MSC.Nastran ASSIGN statements cannot contain relative pathnames, as this may point to different locations from invocation to invocation. Change ASSIGN statement to a full pathname and submit again.

================

ERROR... Bad card format

NasMgr cannot parse statement properly. Check card format.

================

ERROR... Include cards in the Executive Control section

NasMgr only allows Include cards between the BEGIN BULK and ENDDATA cards of the input file. Place the contents of any include cards that lie before the BEGIN BULK card directly in the input file, and submit again.

================

ERROR... Restart type file but no MASTER file specified

The input file appears to be a restart, as the RESTART card is found, but no MASTER file is specified. Use an ASSIGN card to designate which MASTER file is to be used, and submit again.

================

ERROR... Unable to add MASTER database FMS
ERROR... Unable to add DBALL database FMS
ERROR... Unable to add SCRATCH database FMS

When NasMgr adds the FMS statements, a line length greater than the maximum of 80 characters was found. Decrease the filename (jobname) length or use links to shorten the file system/directory names.
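The limit applies to the generated FMS lines; a quick sketch of the kind of length check involved (the ASSIGN line shown is a made-up example, and `fms_line_fits` is illustrative, not part of NasMgr):

```python
MAX_FMS_LINE = 80  # MSC.Nastran FMS lines may not exceed 80 characters

def fms_line_fits(line: str) -> bool:
    """True if a generated FMS statement fits within the 80-character limit."""
    return len(line) <= MAX_FMS_LINE

# A long scratch directory plus a long jobname pushes the line over 80:
line = "ASSIGN MASTER='/scratch/jobs/myjob.MASTER'"
print(fms_line_fits(line))
```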

================

ERROR... Unable to load MSC.Nastran configuration info

NasMgr is unable to load configuration information from internal memory. Contact support personnel for assistance.

================

ERROR... Unable to load MSC.Nastran submit info

NasMgr is unable to load submit information from internal memory. Contact support personnel for assistance.

================

ERROR... Unable to read file include <>

NasMgr has transferred the designated file, but is now unable to locate it for opening or reading. Check network connections, JobMgr host, and file system permissions.

================


ERROR... Unexpected end of file

NasMgr has encountered the end of the input file without finding complete information. Check input file.

================

MSC.Marc (MarMgr):

ERROR... Unable to load MSC.Marc configuration info.

The MSC.Marc configuration info could not be transferred over the network to MarMgr from the JobMgr running on the submit machine. Check network connectivity and the submit machine for any problems.

================

ERROR... Unable to load MSC.Marc submit info

The MSC.Marc submit info could not be transferred over the network to MarMgr from the JobMgr running on the submit machine. Check network connectivity and the submit machine for any problems.

================

INFORMATION: Total disk space req of %d (kb) met

Informational message indicating that enough disk space has been found on the file systems configured for MSC.Marc to run the job.

================

WARNING: Total disk space req of %d (kb) cannot IMMEDIATLEY be met. Continuing on regardless ...

Informational message indicating that there is currently not enough free disk space to honor the space requirement provided by the user. The job continues anyway, because space may be freed (perhaps by another job finishing) before this job needs it.

================

ERROR... Total disk space req of %d (kb) cannot EVER be met. Cannot continue.

There is not enough disk space (free or used) to honor the space requirement provided by the user, so the job stops. Add more disk space or check the specified requirement.

================

WARNING: Cannot determine if disk space req %d (kb) can be met. Continuing on regardless ...

Informational message indicating that the disk space of the file system(s) configured for MSC.Marc cannot be determined. The job continues anyway, as there may be enough space. Sometimes, when a file system is mounted over NFS, its size is not available.

================

INFORMATION: No disk space requirement specified

If no disk space requirement is provided by the user, this informational message is printed.

================


ERROR... Unable to alloc ## bytes of memory in sss, line lll

MarMgr is unable to allocate memory for its own use. Check the memory and swap space on the executing machine.

================

ERROR... Unable to receive file sss

MarMgr could not transfer a file from the JobMgr on the submit machine. Check the network connectivity and submit machine for any problems.


Administration (AdmMgr) Testing Messages...

ERROR... An invalid version of Queue Manager is currently running

The current version of AdmMgr does not match that of the running QueMgr. An invalid/incomplete installation is most likely the cause. To determine what version of each executable is installed, type AdmMgr -version, and QueMgr -version and compare output.

================

ERROR... <> specified as type for host, but <> detected.
ERROR... <> specified as type for host, but UNKNOWN is detected.
ERROR... Host Type <> is not a valid selection for host <>.

The AdmMgr program has discovered that the host architecture of the indicated host is not the same as that designated in the configuration, or that no specific type has been given to this host. Change the host type to the correct one and re-test.

================

ERROR... A/M Host <> configuration file <> does not contain an absolute path.

The AdmMgr program has found an rc file entry or an exe file entry in the host.cfg file, or a file system in the disk.cfg file, that is not a full path. Change the entries to be fully qualified (starting with a slash character ‘/’).

================

ERROR... A/M Host <> does not have a valid application defined. Run Basic A/M Host Test.

The configuration files do not contain any valid applications. Add a valid application and all its required information and run the basic test to verify.

================

ERROR... A/M Host <> filesystem <> does not contain an absolute path.

The file system designated for the host listed is not fully qualified. Change the entry to begin with a slash ‘/’ character.

================

ERROR... A/M Host <> runtime file <> does not contain an absolute path.

The rc file designated for the host listed is not fully qualified. Change the entry to begin with a slash ‘/’ character.

================

ERROR... A/M Host name <> is used more than once within application <>.

Each application contains a list of Analysis Manager host names (which are mapped to physical host names), and each Analysis Manager host name must be unique. The AdmMgr program has found that the designated Analysis Manager host name is being used more than once. Change the Analysis Manager host name for all but one of the applications and re-test.

================

ERROR... Access to host <> failed for admin <>.
ERROR... Access to host <> invalid return string for admin <>.
ERROR... Access to host <> failed for user <>.


ERROR... Access to host <> invalid return string for user <>.
ERROR... Bad return string from physical host <>.

These errors indicate various problems when trying to execute a command on the designated host as the admin or user provided. Network access to a host can fail for a number of reasons, among them lack of network permission or the network/host being down. To check network permission, the user must be able to rsh (or remsh on some platforms) from the master host (where the QueMgr process runs) to each application host (not the interface host, but the host where the defined application is targeted to run). Rsh (or remsh) access is denied if the user has no password on the remote host; if the user does not have a valid .rhosts file in his/her home directory on the remote host, owned by the same user and group IDs and with file permissions of 600 (-rw-------); or if there is no /etc/hosts.equiv file on the remote host with an entry for the originating host. RmtMgr now replaces rsh; call MSC if you get this error.
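The .rhosts conditions above (owned by the user, mode 600) can be verified with a short script run on the remote host. A sketch assuming POSIX stat semantics; the `rhosts_ok` helper is illustrative:

```python
import os
import stat

def rhosts_ok(path: str, expected_uid: int) -> bool:
    """True if `path` exists, is owned by `expected_uid`, and has
    permissions of exactly 600 (-rw-------)."""
    try:
        st = os.stat(path)
    except OSError:
        return False  # no .rhosts file at all
    return st.st_uid == expected_uid and stat.S_IMODE(st.st_mode) == 0o600
```

Run against the user's ~/.rhosts on the remote (application) host; a False result matches one of the denial conditions listed above.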

================

ERROR... Admin account can NOT be root.

The AdmMgr program requires an administrator account which is not the root account. Change the administrator account name to something other than root and continue testing.

================

ERROR... Unable to locate Admin account <>.

The AdmMgr program is unable to locate the admin account name in the passwd file/database. Make sure the admin account name provided is a valid user account name on all application hosts (and the master host as well) and continue testing.

================

ERROR... Application must be chosen for A/M Host <>.

The configuration requires each Analysis Manager host to reference an application. Add a reference for this AM host and re-test.

================

ERROR... Application name <> is not referenced by any A/M Host.

The application specified is not referenced by any Analysis Manager hosts. Add AM hosts or remove this application and continue testing.

================

ERROR... Application name <> is used more than once.

Only unique application names can be used. Re-name the applications so no two are alike and re-test.

================

ERROR... Application not specified for A/M Host <>.

The Unique A/M Host Test cannot be run. The configuration requires each A/M host to reference an application. Add a reference for this A/M host and re-test.

================

ERROR... At least one filesystem must be defined for A/M Host <>.

The configuration requires each Analysis Manager host to reference a file system. Add a reference for this AM host and re-test.


================

ERROR... At least one host must be specified.

At least one physical host must be specified. Add a physical host entry and re-test.

================

ERROR... At least one application must be specified.

At least one application must be defined. Add an application and re-test.

================

ERROR... Could not determine host address for host <>.

The Admin program is unable to determine the host address for the designated host. Possible causes are an invalid name entered, or an invalid host file or name server entry.
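The lookup that fails here can be reproduced with a short, self-contained sketch (not part of the Analysis Manager itself); any host name that resolves through the hosts file or name server passes:

```c
#include <stdio.h>
#include <string.h>
#include <netdb.h>
#include <sys/socket.h>

/* Returns 1 if the host name can be resolved to an address,
   0 otherwise -- a failed lookup corresponds to this error. */
static int can_resolve(const char *hostname)
{
    struct addrinfo hints, *res = NULL;
    memset(&hints, 0, sizeof(hints));
    hints.ai_family = AF_UNSPEC;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(hostname, NULL, &hints, &res) != 0)
        return 0;
    freeaddrinfo(res);
    return 1;
}

int main(void)
{
    printf("localhost resolves: %d\n", can_resolve("localhost"));
    printf("bad name resolves: %d\n", can_resolve("no-such-host.invalid"));
    return 0;
}
```

A host name that fails this check needs a corrected spelling, a hosts-file entry, or a name-server entry.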

================

ERROR... Detected NULL host name.

Provide a host name where specified and re-test.

================

ERROR... Executable test on host <> failed for user <>.

Either the Analysis Manager install tree is not accessible on the remote host, or network access to the remote host is denied. Make sure the Analysis Manager install tree is the same on all application hosts (either through NFS or by creating identical directories) and try again.

If network access is the cause of the failure, check network permission. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================

ERROR... Execution of command failed on host <>.

Either the command does not exist, or network access is denied. Most likely due to network access permission. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================

ERROR... Failure Creating file <> on host <>.
ERROR... Failure Accessing Test File <> on host <>.

The user does not have permission to create/access a test file in the proj directory on the designated host. Check the permission of this directory on the remote host and re-test. If permission is not the problem, check network access to the remote host as the user. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================

ERROR... Failure Creating Test File <> on host <>.

The user does not have permission to create/access a test file in the file system/directory on the designated host as listed in the disk.cfg file. Check the permission of this directory on the remote host and re-test. If permission is not the problem, check network access to the remote host as the user. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================


ERROR... Host <> and Host <> have identical addresses.

Remove one of the host entries (since they are the same host) or change one to point to a different host and re-test.

================

ERROR... Host not specified for A/M Host <>. Run Basic A/M Host Test.

Each Analysis Manager host must reference a physical host. Provide a physical host reference and continue testing.

================

ERROR... Invalid A/M queue name <>.
ERROR... Invalid LSF queue name <>.
ERROR... Invalid NQS queue name <>.

Enter a valid queue name (no space characters, etc.) and re-test.

================

ERROR... LSF executables path <> must be an absolute path.
ERROR... NQS executables path <> must be an absolute path.

The pathname must be a fully qualified path name. Change the path to be fully qualified (starts with a slash ‘/’ character) and re-test.

================

ERROR... NULL A/M Host name is invalid.

Provide an A/M host name where specified and re-test.

================

ERROR... No queue defined for application <>.

Provide a queue where specified and re-test.

================

ERROR... Physical host has not been defined for A/M Host <>. Run Basic A/M Host Test.

Each Analysis Manager host must reference a physical host. Provide a physical host reference and continue testing.

================

ERROR... Physical host must be chosen for A/M Host <>.

Each Analysis Manager host must reference a physical host. Provide a physical host reference and continue testing.

================

ERROR... Remote execution of uname command failed on host <>.

Either the uname command (required by the Analysis Manager) cannot be found in the user’s default search path on the remote host, or network access to the remote host is denied. Check the existence, permissions, and location of the uname command on the remote host. (Some Convex machines ship without a uname, but the Analysis Manager provides one; just place a copy of the uname program, or a link to it, in a default-path directory such as /bin.) If network access is the cause of the failure, check network permission as above. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================


ERROR... Unable to start executable <.../bjobs> on host <> as user <>.
ERROR... Unable to start executable <.../bkill> on host <> as user <>.
ERROR... Unable to start executable <.../bsub> on host <> as user <>.
ERROR... Unable to start executable <.../qdel> on host <> as user <>.
ERROR... Unable to start executable <.../qstat> on host <> as user <>.
ERROR... Unable to start executable <.../qsub> on host <> as user <>.

Either the designated files do not exist on the remote host as indicated, or network access to the remote host to test each executable is failing. Check the file existence and change the path as required, or check network access as described above. (See the “ERROR... Access to host <> failed for admin <>” error description.)

================

ERROR... Zero A/M hosts defined. At least one required.

Define an Analysis Manager host and re-test.

================

ERROR... Zero queues defined. At least one required.

Define a queue and re-test.

================

ERROR... Queue <> must contain at least one selected host.

Select a host or hosts from the list for each queue and continue.

================

ERROR... configuration file <> not located on physical host <>.
ERROR... runtime file <> not located on physical host <>.

The AdmMgr program cannot locate the rc or exe file designated in the configuration on the specified host. Check the installation of the application or the rc/exe path and re-test.

================

ERROR... Unable to open unique submit log file

In the current working directory, there are more than 50 submit log files.


MSC.Patran Analysis Manager User’s Guide

APPENDIX B  Application Procedural Interface (API)

■ Analysis Manager API

■ Include File

■ Example Interface


B.1 Analysis Manager API

Analysis Manager Application Procedural Interface (API) Description

The Analysis Manager can be used by literally any program or user interface that can access the Analysis Manager API. Thus, with some programming knowledge and skill, customized user interfaces to the Analysis Manager can be built. The API is not part of the Analysis Manager product for general use. However, for completeness, this appendix describes the procedural calls used to access the Analysis Manager. The necessary include file and an example usage of the API are also included in this appendix. If you have a special customization need that calls for the Analysis Manager API, please contact your local MSC representative. MSC is happy to provide customized solutions to its customers on a fee basis.

Assumptions:

The product is ALREADY installed and configured. A description of what it takes to install and configure is included in System Management (Ch. 7), but for the purpose of describing the API, assume this has been done.

A Quick Background

There are 3 machines involved in the job submit/abort/monitor cycle:

1. The QueMgr scheduling process machine, labelled the master node.

2. The user's submit (home) machine where the graphical user interface (GUI) is run and the input files are located, labelled the submit node.

3. The analysis machine where the job actually runs, labelled the analysis node.

All three machines can be the same host, all different, or any combination of these. There are also two separate persistent processes, already running as part of the installation. These processes (called daemons on UNIX or services on Windows) are:

1. QueMgr - job scheduling daemon or service

2. RmtMgr - remote command daemon or service

There is one and only one QueMgr process per site (or group, organization, or network), but there may be many RmtMgr processes: a RmtMgr runs on each and every analysis machine, and a RmtMgr can also be run on each submit machine (recommended). If the submit and analysis machines are the same host, only one RmtMgr needs to be running.

The QueMgr and RmtMgr processes start up automatically at boot time and run continuously, but they use very little memory and CPU, so users will not notice any performance effect. These processes can run as root (Administrator on Windows) or, if those privileges are not available, as any user.

Each RmtMgr binds to a known/chosen port number that is the same for every RmtMgr machine. Each RmtMgr process collects machine statistics on free CPU cycles, free memory and free disk space and returns this data to the QueMgr at frequent intervals.

The QueMgr then maintains a sorted list of each RmtMgr machine and its capacity to report back to a GUI/user. (A least-loaded host selection is currently being developed so that the QueMgr selects the actual host for a submit based on these statistics, instead of a user explicitly setting the hostname in the GUI.)


There are a few other AM executables:

1. The TxtMgr - a simple text-based UI which is built on this API and demonstrates all these features.

2. The JobMgr - the GUI back-end process; it starts up on the same machine as the GUI (the submit machine) when a job is submitted and runs only for the life of the job. There is always one JobMgr process per job.

3. The analysis family: these programs are all built on top of an additional API that factors out the many tasks common to all of them. The common code is shared, and the custom work for each application is in a few separate routines: pre_app(), post_app(), and abort_app().

• NasMgr - The MSC.Nastran analysis process which communicates data to/from the JobMgr and spawns the actual MSC.Nastran sub-process. It also reads include files and transfers them, adds FMS statements to the deck if appropriate, and periodically sends job resource data and msgpop message data to the JobMgr to store off.

• MarMgr - The MSC.Marc analysis process which does the same things as NasMgr, but for the MSC.Marc application.

• AbaMgr - The Abaqus analysis process which does the same things as NasMgr, but for the Abaqus application.

• GenMgr - The General application analysis process, used for any other application. Does what NasMgr does except it has no knowledge of the application and just runs it and collects resource usage.

General outline of the Analysis Manager API:

With Analysis Manager there are 5 fundamental functions one can perform:

1. Submit a job

2. Abort a job

3. Monitor a specific job

4. Monitor all the hosts/queues

5. List statistics of a completed job

Each function requires some common data and some unique data. Common data include the QueMgr host, the port it is listening on, and the configuration structure information. Unique data are described further below.

Configure

The first step to any of the Analysis Manager functions is to connect to an already running QueMgr. To do this you must first know the host and port of the running QueMgr, which is usually in the $P3_HOME/p3manager_files/org.cfg or the $P3_HOME/p3manager_files/default/conf/QueMgr.sid file. After that, simply call

Note: With the job database viewer ($P3_HOME/p3manager_files/bin/ARCH/Job_Viewer) one can view/gather/query statistics about ALL jobs for a company/site/etc. as the QueMgr maintains a database of all job statistics. (The database is generally located in the $P3_HOME/p3manager_files/default/log/QueMgr.rdb file.)


CONFIG *cfg;
char qmgr_host[128];
int qmgr_port;
int ret_code;
int error_msg;

cfg = get_config(qmgr_host, qmgr_port, &ret_code, error_msg);

ret_code and possibly error_msg are returned for checking errors.

The CONFIG structure is defined in an include file shown below. Then initialize sub-parts of the configuration structure by calling

init_config(cfg)

Then determine the application name/index. The application is the name of the application you plan to work with, most likely MSC.Nastran, but it could be anything that is already pre-configured. A configuration basically includes the application name and a list of hosts and paths where it is installed, as described in the $P3_HOME/p3manager_files/default/conf/host.cfg file, which the QueMgr reads on start up. Each application has different names and possibly different options to the Analysis Manager functions. All application names/indexes are in the cfg structure so the GUI can prompt the user and check against the accepted list.

Then call the function of choice:

1. Submit a job

2. Abort a job

3. Monitor a specific job

4. Monitor all the hosts/queues

5. List statistics of a completed job

Submit

For submit, the GUI then needs to fill in the application structure data and make a call to submit the job. The call may block and wait for the job to complete (possibly a very long time) or it can return immediately. See the job info rcf/GUI settings listed below for what can be set and changed. Assuming defaults for ALL settings, only a jobname (input file selection), a hostname, and (possibly) memory need to be set before submitting.

Then call

char *jobfile;
char *jobname;   /* usually same as basename of jobfile */
int background;
int ret_code;
int job_number;

job_number = submit_job(jobfile, jobname, background, &ret_code);

This call goes through many steps: contacting the QueMgr, getting a valid reserved job number, asking the QueMgr to start a JobMgr, etc., and then sends all the config/rcf/GUI structure info to the JobMgr. The JobMgr runs for the life of the job and is essentially the back-end of the GUI, transferring files between the user's submit machine and the analysis machine (where the NasMgr, MarMgr, AbaMgr, or GenMgr process runs).

Abort

For abort, the GUI needs to query the QueMgr for a list of jobs and present it for the user to select:


char *qmgr_host;
int qmgr_port;
JOBLIST *job_list;
int job_count;

job_count = get_job_list(qmgr_host, qmgr_port, job_list);

Once a job is chosen, a simple call deletes it:

int job_number;
char *job_user;
int ret_code;

ret_code = delete_job(job_number, job_user);

Monitor Running Job

To monitor a specific job, the GUI needs to query the QueMgr for a list of jobs and present it for the user to select:

char *qmgr_host;
int qmgr_port;
JOBLIST *job_list;
int job_count;

job_count = get_job_list(qmgr_host, qmgr_port, job_list);

Once a job is chosen, a simple call with a severity level returns data:

int job_number;
int severity_level;
int cpu, mem, disk;
int msg_count;
char **ret_string;   /* array of message strings */

ret_string = monitor_job(job_number, severity_level,
                         &cpu, &mem, &disk, &msg_count);

The ret_string then contains a list (array of strings) of all messages the application stored (msgpop type) that are <= the severity level input, along with a resource usage string. The number of msgpop messages is stored in msg_count, and they can be referenced like:

for (i = 0; i < msg_count; i++)
    printf("%s", ret_string[i]);
printf("cpu time used by job = %d, mem used by job = %d, disk used by job = %d\n",
       cpu, mem, disk);

The CPU, MEM and DISK values are the current resources used by the job.

Monitor Hosts/Queues

To monitor all hosts/queues, the GUI needs to make a call and get back all QueMgr data for the application chosen. This gets complex: there are five different types/groups of data available. For now, let's assume only one type is wanted. They are:

1. FULL_LIST

2. JOB_LIST

3. QUEMGR_LOG

4. QUE_STATUS

5. HOST_STATS

Each has its own syntax and set of data. For the QUE_STATUS type, the call returns an array of structures containing the hostname, the number of running jobs, the number of waiting jobs, and the maximum number of jobs allowed to run on that host, for the given (input) application.


char *qmgr_host;
int qmgr_port;
int job_count;
QUESTAT *que_info;

que_info = get_que_stats(qmgr_host, qmgr_port, &job_count);
for (i = 0; i < job_count; i++)
    printf("%s %d %d %d\n", que_info[i].hostname,
           que_info[i].num_running, que_info[i].num_waiting,
           que_info[i].maxtsk);

For FULL_LIST:

See Include File (p. 196).

For JOB_LIST:

See Include File (p. 196).

For QUEMGR_LOG, this is simply a character string of the last 4096 bytes of the QueMgr log file:

See Include File (p. 196).

For HOST_STATS:

See Include File (p. 196).

Monitor Completed Job

To list completed jobs, the GUI needs to query the QueMgr for a list of jobs and present it for the user to select.

char *qmgr_host;
int qmgr_port;
JOBLIST *job_list;
int job_count;

job_count = get_job_list(qmgr_host, qmgr_port, job_list);

Once a job is chosen, a simple call will return all the job data saved.

int job_number;
JOBLIST *comp_info;

comp_info = get_completedjob_stats(job_number);

Remote Manager

On another level, a GUI could also connect to any RmtMgr and ask it to perform a command and return that command's output. This is essentially a remote shell (rsh) host command, as on a UNIX machine. This functionality may come in handy when adding to or extending the Analysis Manager product, for example to network-install other MSC software. The syntax is as follows:

char *ret_msg;
int ret_code;
char *rmtuser;
char *rmthost;
int rmtport;
char *command;
int background;   /* FOREGROUND (0) or BACKGROUND (1) */

ret_msg = remote_command(rmtuser, rmthost, rmtport, command,
                         background, &ret_code);


Structures

The JOBLIST structure contains these members:

int job_number;
char job_name[128];
char job_user[128];
char job_host[128];
char work_dir[256];
int port_number;

cfg structure from config.h:

typedef struct {
    char org_name[NAME_LENGTH];
    char org_name2[NAME_LENGTH];
    char host_name[NAME_LENGTH];
    unsigned int addr;
    int port;
} ORG;

typedef struct {
    char prog_name[NAME_LENGTH];
    char app_name[NAME_LENGTH];
    char args[PATH_LENGTH];
    char extension[24];
} PROGS;

typedef struct {
    char pseudohost_name[NAME_LENGTH];
    char host_name[NAME_LENGTH];
    char exepath[PATH_LENGTH];
    char rcpath[PATH_LENGTH];
    int glob_index;
    int sub_index;
    char arch[NAME_LENGTH];
    unsigned int address;
} HSTS;

typedef struct {
    int num_hosts;
    HSTS *hosts;
} HOST;

typedef struct {
    char pseudohost_name[NAME_LENGTH];
    char exepath[PATH_LENGTH];
    char rcpath[PATH_LENGTH];
    int type;
} APPS;

typedef struct {
    char host_name[NAME_LENGTH];
    int num_subapps;
    APPS subapp[MAX_SUB_APPS];
    int maxtsk;
    char arch[NAME_LENGTH];
    unsigned int address;
} TOT_HST;

typedef struct {
    char queue_name1[NAME_LENGTH];
    char queue_name2[NAME_LENGTH];
    int glob_index;
} QUES;

typedef struct {
    int num_queues;
    QUES *queues;
} QUEUE;

typedef struct {
    char queue_name1[NAME_LENGTH];
    char queue_name2[NAME_LENGTH];
    HOST sub_host[MAX_APPS];
} TOT_QUE;

typedef struct {
    char file_sys_name[NAME_LENGTH];
    int model;
    int max_size;
    int cur_free;
} FILES;

typedef struct {
    char pseudohost_name[NAME_LENGTH];
    int num_fsystems;
    FILES *sub_fsystems;
} TOT_FSYS;

typedef struct {
    char sepuser_name[NAME_LENGTH];
} SEP_USER;

typedef struct {
    int QUE_TYPE;
    char ADMIN[128];
    int NUM_APPS;
    unsigned int timestamp;

    /* prog names */
    PROGS progs[MAX_APPS];

    /* host stuff */
    HOST hsts[MAX_APPS];
    int total_h;
    TOT_HST *total_h_list;

    /* que stuff */
    char que_install_path[PATH_LENGTH];
    char que_options[PATH_LENGTH];
    int min_mem_value;
    int min_disk_value;
    int min_time_value;
    QUEUE ques[MAX_APPS];
    int total_q;
    TOT_QUE *total_q_list;

    /* file stuff */
    int total_f;
    TOT_FSYS *total_f_list;

    /* separate user stuff */
    int total_u;
    SEP_USER *total_u_list;
} CONFIG;

An example of all the rcf/GUI settings from the app_config.h files:

cfg.total_host[0].host_name = hal9000.macsch.com
cfg.total_host[0].arch = HP700
cfg.total_host[0].maxtasks = 3
cfg.total_host[0].num_apps = 3
cfg.total_host[0].sub_app[MSC/NASTRAN].pseudohost_name = nas_host_u
cfg.total_host[0].sub_app[MSC/NASTRAN].exepath = /msc/bin/nast705
cfg.total_host[0].sub_app[MSC/NASTRAN].rcpath = /msc/conf/nast705rc
cfg.total_host[0].sub_app[ABAQUS].pseudohost_name = aba_host_u
cfg.total_host[0].sub_app[ABAQUS].exepath = /hks/abaqus
cfg.total_host[0].sub_app[ABAQUS].rcpath = /hks/site/abaqus.env
cfg.total_host[0].sub_app[GENERIC].pseudohost_name = gen_host_u
cfg.total_host[0].sub_app[GENERIC].exepath = /apps/bin/GENERALAPP
cfg.total_host[0].sub_app[GENERIC].rcpath = NONE
cfg.total_host[1].host_name = daisy.macsch.com
cfg.total_host[1].arch = WINNT
cfg.total_host[1].maxtasks = 3
cfg.total_host[1].num_apps = 4
cfg.total_host[1].sub_app[MSC/NASTRAN].pseudohost_name = nas_host_nt
cfg.total_host[1].sub_app[MSC/NASTRAN].exepath = c:/msc/bin/nastran.exe
cfg.total_host[1].sub_app[MSC/NASTRAN].rcpath = c:/msc/conf/nast706.rcf
cfg.total_host[1].sub_app[ABAQUS].pseudohost_name = aba_host_nt
cfg.total_host[1].sub_app[ABAQUS].exepath = c:/hks/abaqus.exe
cfg.total_host[1].sub_app[ABAQUS].rcpath = c:/hks/site/abaqus.env
cfg.total_host[1].sub_app[GENERIC].pseudohost_name = gen_host_nt
cfg.total_host[1].sub_app[GENERIC].exepath = c:/apps/bin/GENERALAPP.exe
cfg.total_host[1].sub_app[GENERIC].rcpath = NONE
cfg.total_host[1].sub_app[GENERIC2].pseudohost_name = gen_host2_nt
cfg.total_host[1].sub_app[GENERIC2].exepath = c:/WINNT/System32/mem.exe
cfg.total_host[1].sub_app[GENERIC2].rcpath = NONE

#
unv_config.auto_mon_flag = 0
unv_config.time_type = 0
unv_config.delay_hour = 0
unv_config.delay_min = 0
unv_config.specific_hour = 0
unv_config.specific_min = 0
unv_config.specific_day = 0
unv_config.mail_on_off = 0
unv_config.mon_file_flag = 0
unv_config.copy_link_flag = 0
unv_config.job_max_time = 0
unv_config.project_name = nastusr
unv_config.orig_pre_prog =
unv_config.orig_pos_prog =
unv_config.exec_pre_prog =
unv_config.exec_pos_prog =
unv_config.separate_user = nastusr
unv_config.p3db_file =

#
nas_config.disk_master = 0
nas_config.disk_dball = 0
nas_config.disk_scratch = 0
nas_config.disk_units = 2
nas_config.scr_run_flag = 1
nas_config.save_db_flag = 0
nas_config.copy_db_flag = 0
nas_config.mem_req = 0
nas_config.mem_units = 0
nas_config.smem_units = 0
nas_config.extra_arg =
nas_config.num_hosts = 2
nas_host[hal9000.macsch.com].mem = 0
nas_host[hal9000.macsch.com].smem = 0
nas_host[daisy.macsch.com].mem = 0
nas_host[daisy.macsch.com].smem = 0
nas_config.default_host = nas_host_u
nas_config.default_queue = N/A
nas_submit.restart_type = 0
nas_submit.restart = 0
nas_submit.modfms = 0
nas_submit.nas_input_deck =
nas_submit.cold_jobname =

#
aba_config.copy_res_file = 1
aba_config.save_res_file = 0
aba_config.mem_req = 0
aba_config.mem_units = 0
aba_config.disk_units = 2
aba_config.space_req = 0
aba_config.append_fil = 0
aba_config.user_sub =
aba_config.use_standard = 1
aba_config.extra_arg =
aba_config.num_hosts = 2
aba_host[hal9000.macsch.com].num_cpus = 1
aba_host[hal9000.macsch.com].pre_buf = 0
aba_host[hal9000.macsch.com].pre_mem = 0
aba_host[hal9000.macsch.com].main_buf = 0
aba_host[hal9000.macsch.com].main_mem = 0
aba_host[daisy.macsch.com].num_cpus = 1
aba_host[daisy.macsch.com].pre_buf = 0
aba_host[daisy.macsch.com].pre_mem = 0
aba_host[daisy.macsch.com].main_buf = 0
aba_host[daisy.macsch.com].main_mem = 0
aba_config.default_host = aba_host_u
aba_config.default_queue = N/A
aba_submit.restart = 0
aba_submit.aba_input_deck =
aba_submit.restart_file =

#
gen_config[GENERIC].disk_units = 2
gen_config[GENERIC].space_req = 0
gen_config[GENERIC].mem_units = 2
gen_config[GENERIC].mem_req = 0
gen_config[GENERIC].cmd_line = jid=$JOBFILE mem=$MEM
gen_config[GENERIC].mon_file = $JOBNAME.log
gen_config[GENERIC].default_host = gen_host_u
gen_config[GENERIC].default_queue = N/A
gen_submit[GENERIC].gen_input_deck =


#
gen_config[GENERIC2].disk_units = 2
gen_config[GENERIC2].space_req = 0
gen_config[GENERIC2].mem_units = 2
gen_config[GENERIC2].mem_req = 0
gen_config[GENERIC2].cmd_line =
gen_config[GENERIC2].mon_file = $JOBNAME.log
gen_config[GENERIC2].default_host = gen_host2_nt
gen_config[GENERIC2].default_queue = N/A
gen_submit[GENERIC2].gen_input_deck =
#


B.2 Include File

This include file (api.h) must be included in any source file using the Analysis Manager API.

#ifndef _AMAPI
#define _AMAPI

#ifdef __cplusplus
extern "C" {
#endif

#if defined(SGI5)
typedef int socklen_t;
#elif defined(DECA)
typedef size_t socklen_t;
#elif defined(HP700)
# if !defined(_ILP32) && !defined(_LP64)
typedef int socklen_t;
# endif
#elif defined(WINNT)
typedef int socklen_t;
#endif

#define RMTMGR_RESV_PORT 1800
#define QUEMGR_RESV_PORT 1900

#define GLOBAL_AM_VERSION "2003.0.1"

#ifndef AM_INITIALIZE
# define AM_EXTERN extern
#else
# define AM_EXTERN
# if !defined(__LINT__)
#  if !defined(__TAG_USED)
#   define __TAG_USED static char *sccsid[] = { "@(#) MSC Analysis Manager 2003.0.1", "@(#) " };
#  endif /* __TAG_USED */
# endif /* __LINT__ */
#endif

#if defined(AM_INITIALIZE)
char *global_auth_msg = NULL;
int __is_checked_out = 0;
#else
extern char *global_auth_msg;
extern int __is_checked_out;
#endif

#if defined(AM_INITIALIZE)
int xxx_has_input_deck;
int hks_has_restart;
int has_extra_arg;
#else
extern int xxx_has_input_deck;
extern int hks_has_restart;
extern int has_extra_arg;
#endif

#define SOCKET_VERSION1 1
#define SOCKET_VERSION2 1

#ifndef PATH_LENGTH
# define PATH_LENGTH 400
#endif

#ifndef NAME_LENGTH
# define NAME_LENGTH 256
#endif

#ifndef MAX_STR_LEN
# define MAX_STR_LEN 256
#endif

#ifndef SOMAXCONN
# define SOMAXCONN 20
#endif


#ifdef ULTIMA
#define MSGPOP 1
#else
#ifdef MSGPOP
# undef MSGPOP
#endif
#define MSGPOPnotused 1
#endif

#define NOT_JOB_OWNER -201

#define UNKNOWN_STATUS -1
#define OK_STATUS 0
#define BAD_STATUS 999

#define BLOCK_TIMEOUT 60
#define NONB_TIMEOUT 15

#define MAX_EVENT_NUMBER 115

#define TOTAL_TO_QM_EVENTS 39

/* ---------------------------- */

/* all events to QueMgr are first (and sequential) */

#define TRANS_CONFIG 1

#define XX_QM_PING 39 /* highest to QueMgr message */
#define QM_XX_PING 98

#define JM_QM_JOB_FINISHED 2
#define JM_QM_JOB_INIT 3
#define JM_QM_ADD_TASK 4
#define JM_QM_DB_UPDATE 19
#define JM_QM_CLEANUP_JOB 26

#define TM_QM_TASK_FINISHED 5
#define TM_QM_TASK_RUNNING 6
#define TM_QM_APP_FILES 25

#define PM_QM_REMOVE_JOB 7
#define PM_QM_FULL_LIST 8
#define PM_QM_JOB_LIST 9
#define PM_QM_QUEMGR_LOG 10
#define PM_QM_QUE_STATUS 11
#define PM_QM_JOB_SELECT_LIST 12
#define PM_QM_JOB_COMP_LIST 27
#define PM_QM_JOBNUM_REQ 13

#define PM_QM_SUSPEND_JOB 21
#define PM_QM_RESUME_JOB 22

#define PM_QM_CPU_LOADS 23

#define PM_QM_START_UP_JOBMGR 29

#define PA_QM_HALT_QUEMGR 14
#define PA_QM_DRAIN_HALT 15
#define PA_QM_DRAIN_RESTART 16
#define PA_QM_CHECK 17
#define PA_QM_GET_RECFG_TEXT 18

#define XX_QM_REQ_VERSION 20

/* future XX_QM events (33-38) */

#define RM_QM_LOAD_INFO 24
#define RM_QM_CMD_OUT 28
#define RM_XX_PROC_OUT 32

#define QM_JM_TASK_FINISHED 40
#define QM_JM_TASK_RUNNING 41
#define QM_JM_KILL_TASK 42
#define QM_JM_ACCEPT_REQUEST 43

#define TM_JM_IN_PRE 44
#define TM_JM_RUN_INFO 45
#define TM_JM_IN_POS 46
#define TM_JM_GET_FILES 62
#define TM_JM_PUT_FILES 63


#define TM_JM_CFG_STRUCTS 65
#define TM_JM_DISK_INIT 66
#define TM_JM_LOG_INFO 69
#define TM_JM_PRE_PROG 96
#define TM_JM_POS_PROG 97

#define TM_JM_SUSPEND_JOB 77
#define TM_JM_RESUME_JOB 78

#define TM_JM_ADD_COMMENT 85

#define TM_JM_RM_FILE 86

#define TM_JM_RUNNING_FILE 87

#define TM_JM_MSG_BUFFERS 95

#define TM_PM_GET_FILES 108

#define XX_RM_STOP_NOW 74#define XX_RM_RMT_CMD 81#define XX_RM_RMT_AM_CMD 99#define XX_RM_SEND_LOADS 82#define XX_RM_KILL_PROCESS 83#define XX_RM_REMOVE_FILE 84#define XX_RM_REMOVE_AM_FILE 100#define XX_RM_WRITE_FILE 75#define XX_RM_PUT_FILE 109#define XX_RM_PUL_FILE 110#define XX_RM_PING_ME 111#define XX_RM_GET_UNAME 112#define XX_RM_EXIST_FILE 113#define XX_RM_DIR_WRITEABLE 114#define XX_RM_CAT_FILE 115

#define QM_PM_RET_CODE47#define QM_PM_FULL_LISTING48#define QM_PM_JOB_LIST49#define QM_PM_QUEUE_STATUS50#define QM_PM_QUEMGR_LOG51#define QM_PM_JOB_SEL_LIST52#define QM_PM_SEND_JOBNUM53

#define QM_PM_NEEDS_RECFG 91#define QM_PM_LOAD_INFO 92

#define QM_PM_JOBMGR_START 94

#define PM_JM_REQ_JOBMON 54#define PM_JM_REQ_RUNNING_FILE 88#define PM_JM_KILL_TRANSFERS 90 #define PM_JM_MSGDEST_REQ 101 #define PM_JM_STATS_REQ 102 #define PM_JM_LOGFILE_REQ 103 #define PM_JM_MON_INIT 104#define PM_JM_LIST_RUN_FILES 105#define PM_JM_REQ_RUNNING_FILE2 106

#define QM_PA_INFO 67#define QM_PA_SEND_RECFG_TEXT 68

#define JM_PM_LOG_COMMENT55#define JM_PM_LOG_INIT_JOB56#define JM_PM_LOG_TASK_SUBMIT57#define JM_PM_LOG_TASK_RUN58#define JM_PM_LOG_TASK_COMPLETE59#define JM_PM_LOG_JOB_FINISHED60#define JM_PM_TIME_SYNC61#define JM_PM_LOG_LINE 70#define JM_PM_FILE_PRESENT 71

#define JM_JM_PRE_FINISHED 72#define JM_JM_POS_FINISHED 73

#define JM_TM_RECV_SETUP 64#define JM_TM_GIVEME_FILE 89#define JM_TM_GIVEME_FILE2 107

#define QM_XX_REQ_VERSION 76

#define QM_TM_SUSPEND_JOB 79#define QM_TM_RESUME_JOB 80


APPENDIX B  Application Procedural Interface (API)

#define QM_TM_KILL_JOB 93

#define MAX_ORGS     28
#define MAX_APPS     30
#define MAX_SUB_APPS 50
#define MAX_GEN_APPS 10

#define LOCAL 0
#define NFS   1

#define MSC_QUEUE 0
#define LSF_QUEUE 1
#define NQS_QUEUE 2

#define MSC_NASTRAN 1
#define HKS_ABAQUS  2
#define MSC_MARC    3
#define GENERAL    20

#define MAX_NUM_FILE_SYS 20

#define UNITS_WORDS        0
#define UNITS_64BIT_WORDS 99
#define UNITS_KB           1
#define UNITS_MB           2
#define UNITS_GB           3

#define MIN_MEM_REQ      1 /* (mb) */
#define MIN_DISK_REQ     1 /* (mb) */
#define MIN_TIME_REQ 99999 /* (min) */

#define JOB_SUBMITTED 0
#define JOB_QUEUED    1
#define JOB_RUNNING   2

#define JOB_SUCCESSFUL 0
#define JOB_ABORTED    1
#define JOB_FAILED     2

#define FILE_STILL_DOWNLOADING 1
#define FILE_DOWNLOAD_COMPLETE 0

/* ---------------------------- */

#define IC_CLEAN                  0
#define IC_CANT_GET_ADDRESS    -100
#define IC_CANT_OPEN_HOST_FILE -101
#define IC_CANT_ALLOC_MEM      -102
#define IC_NOT_ENUF_HOSTS      -103
#define IC_CANT_OPEN_QUE_FILE  -104
#define IC_MISSING_FIELDS      -105
#define IC_CANT_FIND_HOST      -106
#define IC_ADD_QUE_ERROR       -107
#define IC_NOT_ENUF_QUES       -108
#define IC_CANT_FIND_QUE       -109
#define IC_NO_QUE_TYPE         -110
#define IC_UNKNOWN_QUE_TYPE    -111
#define IC_NO_QUE_PATH         -112
#define IC_CANT_FIND_MACH      -113
#define IC_BAD_MAXTSK          -114
#define IC_TOO_FEW_QUE_APPS    -115
#define IC_BAD_APP_TYPE        -116
#define IC_NOT_ENUF_SUB_HOSTS  -117
#define IC_BAD_PORT            -118
#define IC_NO_ADMIN            -119
#define IC_BAD_ADMIN           -120

#define ID_CLEAN                  0
#define ID_CANT_OPEN_DISK_FILE -150
#define ID_CANT_GET_ADDRESS    -151
#define ID_CANT_ALLOC_MEM      -152
#define ID_CANT_FSTAT          -153
#define ID_NOT_ENUF_FSYS       -154
#define ID_NOT_ENUF_SUBS       -155
#define ID_CANT_FIND_HOST      -156

#define IU_CLEAN            0
#define IU_CANT_ALLOC_MEM -180

#define TIME_SYNC     99
#define LOG_COMMENT  100
#define LOG_INIT_JOB 101


#define LOG_TASK_SUBMIT     102
#define LOG_TASK_RUN        103
#define LOG_TASK_COMPLETE   104
#define LOG_JOB_FINISHED    105
#define LOG_DISK_INIT       106
#define LOG_DISK_UPDATE     107
#define LOG_CPU_UPDATE      108
#define LOG_DISK_SUMMARY    109
#define LOG_DISK_FS_SUMMARY 110
#define LOG_CPU_SUMMARY     111
#define LOG_LOGLINE         112
#define LOG_FILE_PRESENT    113
#define LOG_TASK_SUSPEND    114
#define LOG_TASK_RESUME     115
#define LOG_RUNNING_FILE    116
#define LOG_RUNNING_DONE    117
#define LOG_MEM_UPDATE      118
#define LOG_MEM_SUMMARY     119

/* ---------------------------- */

typedef struct{
    char file_sys_name[PATH_LENGTH];
    int  disk_used_pct;
    int  disk_max_size_mb;
}JOB_FS_LIST;

typedef struct{
    char filename[PATH_LENGTH];
    int  sizekb;
}FILE_LIST;

typedef struct{
    char org_name[NAME_LENGTH];
    char org_name2[NAME_LENGTH];
    char host_name[NAME_LENGTH];
    unsigned int addr;
    int  port;
}ORG;

typedef struct{
    char prog_name[NAME_LENGTH];
    char app_name[NAME_LENGTH];
    int  maxapptsk;
    char args[PATH_LENGTH];
    char extension[24];
}PROGS;

typedef struct{
    char pseudohost_name[NAME_LENGTH];
    char host_name[NAME_LENGTH];
    char exepath[PATH_LENGTH];
    char rcpath[PATH_LENGTH];
    int  glob_index;
    int  sub_index;
    int  maxapptsk;
    char arch[NAME_LENGTH];
    unsigned int address;
}HSTS;

typedef struct{
    int  num_hosts;
    HSTS *hosts;
}HOST;

typedef struct{
    char pseudohost_name[NAME_LENGTH];
    char exepath[PATH_LENGTH];
    char rcpath[PATH_LENGTH];
    int  maxapptsk;
    int  type;
}APPS;

typedef struct{
    char host_name[NAME_LENGTH];
    int  num_subapps;
    APPS subapp[MAX_SUB_APPS];
    int  maxtsk;
    char arch[NAME_LENGTH];
    unsigned int address;
}TOT_HST;


typedef struct{
    char queue_name1[NAME_LENGTH];
    char queue_name2[NAME_LENGTH];
    int  glob_index;
}QUES;

typedef struct{
    int  num_queues;
    QUES *queues;
}QUEUE;

typedef struct{
    char queue_name1[NAME_LENGTH];
    char queue_name2[NAME_LENGTH];
    HOST sub_host[MAX_APPS];
}TOT_QUE;

typedef struct{
    char file_sys_name[NAME_LENGTH];
    int  model;
    int  max_size;
    int  cur_free;
}FILES;

typedef struct{
    char pseudohost_name[NAME_LENGTH];
    int  num_fsystems;
    FILES *sub_fsystems;
}TOT_FSYS;

typedef struct{
    char sepuser_name[NAME_LENGTH];
}SEP_USER;

/* ---------------------------- */

typedef struct{

int QUE_TYPE;

char ADMIN[128];

int NUM_APPS;

int config_file_version;

unsigned int timestamp;

char prog_version[32];

/* prog names */

PROGS progs[MAX_APPS];

/* host stuff */

HOST hsts[MAX_APPS];

int total_h; TOT_HST *total_h_list;

/* que stuff */

char que_install_path[PATH_LENGTH];

char que_options[PATH_LENGTH];

int min_mem_value; int min_disk_value; int min_time_value;

QUEUE ques[MAX_APPS];

int total_q; TOT_QUE *total_q_list;

/* file stuff */

int total_f; TOT_FSYS *total_f_list;

/* separate user stuff */


int total_u; SEP_USER *total_u_list;

int qmgr_port; int rmgr_port; char qmgr_host[256];

}CONFIG;

/************************************************************************/
/* Defines for setting the different values of the config structure    */
/************************************************************************/
#define CONFIG_VERSION 1

#define NO_JOB_MON    0
#define START_JOB_MON 1

#define SUBMIT_NOW      0
#define SUBMIT_DELAY    1
#define SUBMIT_SPECIFIC 2

#define SUNDAY    0
#define MONDAY    1
#define TUESDAY   2
#define WEDNESDAY 3
#define THURSDAY  4
#define FRIDAY    5
#define SATURDAY  6

#define MAIL_OFF 0
#define MAIL_ON  1

#define UI_MGR_MAIL 0
#define MASTER_MAIL 1

#define MAX_PROJ_LENGTH 16

typedef struct{
#ifndef CRAY
    int pad1;
#endif
    int version;
#ifndef CRAY
    int pad2;
#endif
    int job_mon_flag;
#ifndef CRAY
    int pad3;
#endif
    int time_type;
#ifndef CRAY
    int pad4;
#endif
    int delay_hour;
#ifndef CRAY
    int pad5;
#endif
    int delay_min;
#ifndef CRAY
    int pad6;
#endif
    int specific_hour;
#ifndef CRAY
    int pad7;
#endif
    int specific_min;
#ifndef CRAY
    int pad8;
#endif
    int specific_day;
#ifndef CRAY
    int pad9;
#endif
    int mail_on_off;
#ifndef CRAY
    int pad10;
#endif
    int bogus;
#ifndef CRAY
    int pad11;
#endif


    int mon_file_flag;
#ifndef CRAY
    int pad12;
#endif
    int copy_link_flag;
#ifndef CRAY
    int pad13;
#endif
    int job_max_time;
#ifndef CRAY
    int pad14;
#endif
    int bogus1;
    char project_name[128];
    char orig_pre_prog[256];
    char orig_pos_prog[256];
    char exec_pre_prog[256];
    char exec_pos_prog[256];
    char separate_user[128];
    char p3db_file[256];
    char email_addr[256];
} Universal_Config_Info;

/* ---------------------------- */

typedef struct {
    char host_name[128];
    int  num_running;
    int  num_waiting;
    int  maxtsk;
    char stat_str[64];
}Que_List;

typedef struct {
    char msg[2048];
}Msg_List;

typedef struct {
    int  job_number;
    char job_name[128];
    char job_user[128];
    char job_submit_host[128];
    char am_host_name[128];
    char job_proj[128];
    char work_dir[256];
    int  application;
    int  port_number;
    char job_run_host[128];
    char sub_time_str[128];
    int  jobstatus;
}Job_List;

typedef struct {
    char host_name[128];
    int  cpu_util;
    int  free_disk;
    int  avail_mem;
    int  status;
}Cpu_List;

/************************************************************************/
/*                                                                      */
/* MSC.Nastran specific configuration structures.                       */
/*                                                                      */
/************************************************************************/

#define DEFAULT_BUFFSIZE 8193

/*
** mck 6/12/98 - change to 0, so they dont get added unless you type something ...
**
#define CONFIG_DEFAULT_SMEM ( (DEFAULT_BUFFSIZE-1) * 100 )
#define CONFIG_DEFAULT_MEM 8000000
*/

#define CONFIG_DEFAULT_SMEM 0
#define CONFIG_DEFAULT_MEM  0

#define NAS_NONE 0

#define NO  0
#define YES 1


#define SINGLE 1
#define MULTI  2

#define DB_GET_NO_FILES     500
#define DB_GET_MST_P3_FILE  600
#define DB_GET_ALL_P3_FILES 650
#define DB_GET_MST_MK_FILE  700
#define DB_GET_ALL_MK_FILES 750

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int host_index;          /* Global Host Index. */
#ifndef CRAY
    int pad2;
#endif
    float mem;               /* stored as whatever. */
#ifndef CRAY
    int pad3;
#endif
    float smem;              /* stored as whatever. */
#ifndef CRAY
    int pad4;
#endif
    int num_cpus;            /* Number cpu's on machine. */

    char host_name[128];     /* Real Host Name (host_name) */
    char mem_str[64];
    char smem_str[64];
} Nas_Config_Host;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int application_type;    /* Should be set to MSC_NASTRAN */
#ifndef CRAY
    int pad2;
#endif
    int default_index;       /* Index just within Nas List */
#ifndef CRAY
    int pad3;
#endif
    int disk_master;         /* stored as KB. */
#ifndef CRAY
    int pad4;
#endif
    int disk_dball;          /* stored as KB. */
#ifndef CRAY
    int pad5;
#endif
    int disk_scratch;        /* stored as KB. */
#ifndef CRAY
    int pad6;
#endif
    int disk_units;          /* see defines below */
#ifndef CRAY
    int pad7;
#endif
    int scr_run_flag;
#ifndef CRAY
    int pad8;
#endif
    int save_db_flag;
#ifndef CRAY
    int pad9;
#endif
    int copy_db_flag;
#ifndef CRAY
    int pad10;
#endif
    float mem_req;           /* stored as whatever */
#ifndef CRAY
    int pad11;
#endif
    int mem_units;
#ifndef CRAY
    int pad12;
#endif
    int smem_units;


#ifndef CRAY
    int pad13;
#endif
    int num_hosts;
#ifndef CRAY
    int pad14;
#endif
    int bogus;
    char default_host[128];  /* uihost_name is saved here */
    char default_queue[128]; /* queue_name1 is saved here */
    char mem_req_str[64];
    char extra_arg[256];
    Nas_Config_Host *host_ptr;
} Nas_Configure_Info;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int submit_index;        /* Index just within Nas List */
#ifndef CRAY
    int pad2;
#endif
    int specific_index;      /* see description below. */
#ifndef CRAY
    int pad3;
#endif
    int restart_type;
#ifndef CRAY
    int pad4;
#endif
    int restart;
#ifndef CRAY
    int pad5;
#endif
    int modfms;
#ifndef CRAY
    int pad6;
#endif
    int bogus;
    char nas_input_deck[256]; /* full path and filename */
    char cold_jobname[256];   /* coldstart jobname */
} Nas_Submit_Info;

/* The "specific_index" variable is only used when the queuing type is  */
/* not MSC_QUEUE (i.e. it is LSF). If it is -1 then that means the      */
/* task can be submitted to any host in the defined queue. If the       */
/* "specific_index" has a value other than -1, then this is an index    */
/* into the host list (host list for the application, not global index) */
/* of the specific host the task should be submitted to.                */

/************************************************************************/
/*                                                                      */
/* ABAQUS specific configuration structures.                            */
/*                                                                      */
/************************************************************************/

/* Following default values are in words (64bit).*/

/*
** mck - 6/12/98 change to 0 so they dont get added unless you type something ...
**
#define DEFAULT_PRE_BUF 400000
#define DEFAULT_PRE_MEM 1000000
#define DEFAULT_MAIN_BUF 2000000
#define DEFAULT_MAIN_MEM 6000000
*/

#define DEFAULT_PRE_BUF  0
#define DEFAULT_PRE_MEM  0
#define DEFAULT_MAIN_BUF 0
#define DEFAULT_MAIN_MEM 0

#define ABA_NONE    0
#define ABA_RESTART 1
#define ABA_CHECK   2

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int host_index;          /* Global Host Index. */


#ifndef CRAY
    int pad2;
#endif
    int num_cpus;            /* Number cpu's on machine. */
#ifndef CRAY
    int pad3;
#endif
    float pre_buf;           /* stored as whatever. */
#ifndef CRAY
    int pad4;
#endif
    float pre_mem;           /* stored as whatever. */
#ifndef CRAY
    int pad5;
#endif
    float main_buf;          /* stored as whatever. */
#ifndef CRAY
    int pad6;
#endif
    float main_mem;          /* stored as whatever. */
    char pre_buf_str[64];
    char pre_mem_str[64];
    char main_buf_str[64];
    char main_mem_str[64];
    char host_name[128];     /* Real Host Name (host_name) */
} Aba_Config_Host;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int application_type;    /* Should be set to HKS_ABAQUS */
#ifndef CRAY
    int pad2;
#endif
    int default_index;       /* Index just within Aba List */
#ifndef CRAY
    int pad3;
#endif
    int copy_res_file;
#ifndef CRAY
    int pad4;
#endif
    int save_res_file;
#ifndef CRAY
    int pad5;
#endif
    float mem_req;           /* stored as whatever */
#ifndef CRAY
    int pad6;
#endif
    int mem_units;           /* One of the defines above */
#ifndef CRAY
    int pad7;
#endif
    int disk_units;          /* One of the defines above */
#ifndef CRAY
    int pad8;
#endif
    int space_req;           /* stored as KB. */
#ifndef CRAY
    int pad9;
#endif
    int append_fil;          /* 0 = no 1 = yes */
#ifndef CRAY
    int pad10;
#endif
    int num_hosts;
#ifndef CRAY
    int pad11;
#endif
    int use_standard;        /* 0 = no 1 = yes */
    char default_host[128];  /* uihost_name is saved here */
    char default_queue[128]; /* queue_name1 is saved here */
    char user_sub[128];
    char mem_req_str[64];
    char extra_arg[256];
    Aba_Config_Host *host_ptr;
} Aba_Configure_Info;

typedef struct {
#ifndef CRAY


    int pad1;
#endif
    int submit_index;        /* Index just within Aba list */
#ifndef CRAY
    int pad2;
#endif
    int specific_index;      /* see description below */
#ifndef CRAY
    int pad3;
#endif
    int restart;
#ifndef CRAY
    int pad4;
#endif
    int bogus;
    char aba_input_deck[256]; /* full path and filename */
    char restart_file[256];
} Aba_Submit_Info;

/* The "specific_index" variable is only used when the queuing type is  */
/* not P3_QUEUE (i.e. it is LSF). If it is -1 then that means the       */
/* task can be submitted to any host in the defined queue. If the       */
/* "specific_index" has a value other than -1, then this is an index    */
/* into the host list (host list for the application, not global index) */
/* of the specific host the task should be submitted to.                */

/************************************************************************/
/*                                                                      */
/* MSC.Marc specific configuration structures.                          */
/*                                                                      */
/************************************************************************/

#define MAR_NONE    0
#define MAR_RESTART 1

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int host_index;          /* Global Host Index. */
#ifndef CRAY
    int pad2;
#endif
    int num_cpus;            /* Number cpu's on machine. */
#ifndef CRAY
    int pad3;
#endif
    int bogus;
    char host_name[128];     /* Real Host Name (host_name) */
} Mar_Config_Host;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int application_type;    /* Should be set to MSC_MARC */
#ifndef CRAY
    int pad2;
#endif
    int default_index;       /* Index just within Mar List */
#ifndef CRAY
    int pad3;
#endif
    int disk_units;          /* One of the defines above */
#ifndef CRAY
    int pad4;
#endif
    int space_req;           /* stored as KB. */
#ifndef CRAY
    int pad5;
#endif
    int mem_units;           /* One of the defines above */
#ifndef CRAY
    int pad6;
#endif
    float mem_req;           /* stored as whatever */
#ifndef CRAY
    int pad7;
#endif
    int num_hosts;
#ifndef CRAY
    int pad8;


#endif
    int translate_input;
    char default_host[128];  /* uihost_name is saved here */
    char default_queue[128]; /* queue_name1 is saved here */
    char cmd_line[256];      /* command line to run with */
    char mon_file[256];      /* log file to monitor */
    char mem_req_str[64];
    Mar_Config_Host *host_ptr;
} Mar_Configure_Info;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int submit_index;        /* Index just within Mar list */
#ifndef CRAY
    int pad2;
#endif
    int rid;                 /* Flag: restart file (-rid filename) */
#ifndef CRAY
    int pad3;
#endif
    int pid;                 /* Flag: post_name (-pid filename) */
#ifndef CRAY
    int pad4;
#endif
    int prog;                /* Flag: program_name (-prog progname) */
#ifndef CRAY
    int pad5;
#endif
    int user;                /* Flag: user_subroutine_name (-user subname) */
#ifndef CRAY
    int pad6;
#endif
    int save;                /* Flag: save executable (0/1) (-save yes/no) */
#ifndef CRAY
    int pad7;
#endif
    int vf;                  /* Flag: viewfactor file (-vf vfname) */
#ifndef CRAY
    int pad8;
#endif
    int nprocd;              /* Number processes or domains (-nprocd #) */
#ifndef CRAY
    int pad9;
#endif
    int host;                /* Flag: hostfile (-host hostfilename) */
#ifndef CRAY
    int pad10;
#endif
    int iam;                 /* Flag: iam flag for licensing (-iam iamtag) */
#ifndef CRAY
    int pad11;
#endif
    int specific_index;      /* see description below */

    /* All files should have full path and filename */
    char datfile_name[256];            /* input deck */
    char restart_name[256];            /* restart file */
    char post_name[256];               /* post file */
    char program_name[256];            /* program file */
    char user_subroutine_name[256];    /* user subroutine file */
    char viewfactor[256];              /* viewfactor file */
    char hostfile[256];                /* hostfile */
    char iamval[256];                  /* iam licensing tag - no file involved */
} Mar_Submit_Info;

/* The "specific_index" variable is only used when the queuing type is  */
/* not P3_QUEUE (i.e. it is LSF). If it is -1 then that means the       */
/* task can be submitted to any host in the defined queue. If the       */
/* "specific_index" has a value other than -1, then this is an index    */
/* into the host list (host list for the application, not global index) */
/* of the specific host the task should be submitted to.                */

/************************************************************************/
/*                                                                      */
/* GENERAL specific configuration structures.                           */
/*                                                                      */
/************************************************************************/
typedef struct {
#ifndef CRAY
    int pad1;


#endif
    int host_index;          /* Global Host Index. */
#ifndef CRAY
    int pad2;
#endif
    int bogus;
    char host_name[128];     /* Real Host Name (host_name) */
} Gen_Config_Host;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int application_type;    /* Should be set to GEN - RANGE */
#ifndef CRAY
    int pad2;
#endif
    int default_index;       /* Index just within Gen List */
#ifndef CRAY
    int pad3;
#endif
    int disk_units;          /* One of the defines above */
#ifndef CRAY
    int pad4;
#endif
    int space_req;           /* stored as KB. */
#ifndef CRAY
    int pad5;
#endif
    int mem_units;           /* One of the defines above */
#ifndef CRAY
    int pad6;
#endif
    float mem_req;           /* stored as whatever */
#ifndef CRAY
    int pad7;
#endif
    int num_hosts;
#ifndef CRAY
    int pad8;
#endif
    int translate_input;
    char default_host[128];  /* uihost_name is saved here */
    char default_queue[128]; /* queue_name1 is saved here */
    char cmd_line[256];      /* command line to run with */
    char mon_file[256];      /* log file to monitor */
    char mem_req_str[64];
    Gen_Config_Host *host_ptr;
} Gen_Configure_Info;

typedef struct {
#ifndef CRAY
    int pad1;
#endif
    int submit_index;        /* Index just within Gen list */
#ifndef CRAY
    int pad2;
#endif
    int specific_index;      /* see description below */
    char gen_input_deck[256]; /* full path and filename */
} Gen_Submit_Info;

/* The "specific_index" variable is only used when the queuing type is  */
/* not MSC_QUEUE (i.e. it is LSF). If it is -1 then that means the      */
/* task can be submitted to any host in the defined queue. If the       */
/* "specific_index" has a value other than -1, then this is an index    */
/* into the host list (host list for the application, not global index) */
/* of the specific host the task should be submitted to.                */

/* ---------------------------- */

/*
** api globals ...
*/

#ifdef AM_INITIALIZE
AM_EXTERN int gbl_nwrk_timeout_secs = BLOCK_TIMEOUT;
AM_EXTERN int api_use_this_host = 0;
#else
AM_EXTERN int gbl_nwrk_timeout_secs;
AM_EXTERN int api_use_this_host;
#endif


AM_EXTERN CONFIG *cfg;
AM_EXTERN ORG *org;
AM_EXTERN int num_orgs;
AM_EXTERN Universal_Config_Info ui_config;
AM_EXTERN Nas_Configure_Info nas_config;
AM_EXTERN Nas_Submit_Info nas_submit;
AM_EXTERN Aba_Configure_Info aba_config;
AM_EXTERN Aba_Submit_Info aba_submit;
AM_EXTERN Mar_Configure_Info mar_config;
AM_EXTERN Mar_Submit_Info mar_submit;
AM_EXTERN Gen_Configure_Info gen_config[MAX_GEN_APPS];
AM_EXTERN Gen_Submit_Info gen_submit[MAX_GEN_APPS];
AM_EXTERN char api_this_host[256];
AM_EXTERN char api_user_name[256];
AM_EXTERN char api_application_name[64];
AM_EXTERN int api_application_index;

/*
 * api functions ...
 */

/*
 * init - MUST BE FIRST api_* call made by application ...
 */
extern int api_init(char *out_str);

/*
 * just to set the global timeout for communication ...
 */
extern int api_get_gbl_timeout();
extern int api_set_gbl_timeout(int secs);

/*
 * reads an org.cfg file if possible and builds the ORG struct for list of QueMgrs ...
 */
extern ORG *api_read_orgs(char *dir,int *num_orgs,int *status);

/*
 * contacts running QueMgr and builds cfg struct ...
 */
extern CONFIG *api_get_config(char *qmgr_host,int qmgr_port,int *status,char *out_str);

/*
 * reads *.cfg files and builds cfg struct (No QueMgr process involved) ...
 */
extern CONFIG *api_read_config(CONFIG *cfg,char *path,char *orgname,int *status,char *out_str);

/*
 * reads *.cfg files (without building path) and builds cfg struct (No QueMgr process involved) ...
 */
extern CONFIG *api_read_config_fullpath(CONFIG *cfg,char *path,int *status,char *out_str);

/*
 * writes *.cfg files from cfg struct (No QueMgr process involved) ...
 */
extern void api_write_config(CONFIG *cfg,char *path,char *orgname,int *status,char *out_str);

/*
 * tries to contact running QueMgr and check if timestamp is ok ...
 * returns 0 if all ok ...
 */
extern int api_ping_quemgr(char *qmgr_host,int qmgr_port,unsigned int timestamp,char *out_str);

/*
 * initializes UI config structs (nas, aba, gen[] submit and config) ...
 */
extern void api_init_uiconfig(CONFIG *cfg);

/*
 * gets logged in user name
 */
extern char *api_getlogin(void);

/*
 * checks on job data deck and returns possible question for UI to ask, setting answer for
 * submit call below ...
 */
extern int api_check_job(char *ques_text,char *ans1_text,char *ans2_text,char *out_str);


/*
 * submits job (needs filled in UI config and submit structs as well as global cfg struct) ...
 */
extern int api_submit_job(char *qmgr_host,int qmgr_port,char *jobname,int background,
                          int *job_number,char *base_path,int *jmgr_port,int answer,char *out_str);

/*
 * gets list of all running jobs from QueMgr ...
 */
extern Job_List *api_get_runningjob_list(char *qmgr_host,int qmgr_port,int *job_count,char *out_str);

/*
 * gets initial socket for later on api_mon_job_* calls ...
 */
extern int api_mon_job_init(char *job_host,int job_port,int *msg_port,char *out_str);

/*
 * gets all messages of sev level and lower from JobMgr ...
 */
extern Msg_List *api_mon_job_msgs(int msg_sock,char *ui_host,int msg_port,int sev_level,
                                  int *num_msgs,char *out_str);

/*
 * gets current job statistics and run status ...
 */
extern JOB_FS_LIST *api_mon_job_stats(int msg_sock,char *ui_host,int msg_port,
                                      int *cpu,int *pct_cpu,
                                      int *mem,int *pct_mem,
                                      int *dsk,int *pct_dsk,
                                      int *elapsed,int *status,
                                      int *num_fs,int *retcod,
                                      char *out_str);

/*
 * gets last 100 lines of job mon file ...
 */
extern char *api_mon_job_mon(int msg_sock,char *ui_host,int msg_port,char *out_str);

/*
 * returns list of files active while job is running ...
 */
extern FILE_LIST *api_mon_job_running_files_list(int msg_sock,char *ui_host,int msg_port,
                                                 int *num_files,char *out_str);

/*
 * returns general info about a job from a mon_file ...
 */
extern Job_List *api_com_job_gen(char *sub_host,char *mon_file,char *out_str);

/*
 * gets job statistics and run status from mon file
 */
extern JOB_FS_LIST *api_com_job_stats(char *sub_host,char *mon_file,
                                      int *cpu,int *pct_cpu_avg,int *pct_cpu_max,
                                      int *mem,int *pct_mem_avg,int *pct_mem_max,
                                      int *dsk,int *pct_dsk_avg,int *pct_dsk_max,
                                      int *elapsed,int *status,
                                      int *num_fs,int *retcod,
                                      char *out_str);

/*
 * gets last 100 lines of job mon file ...
 */
extern char *api_com_job_mon(char *sub_host,char *mon_file,char *out_str);

/*
 * returns list of files from mon file ...
 */
extern FILE_LIST *api_com_job_received_files_list(char *sub_host,char *mon_file,int *num_files,char *out_str);

/*
 * starts file download ...
 */
extern int api_download_file_start(int msg_sock,int job_number,char *filename,char *out_str);

/*
 * checks on download file status ...
 */


extern int api_download_file_check(int job_number,char *filename,int *filesizekb);

/*
 * returns all jobs for all hosts and apps from a running QueMgr ...
 */
extern Que_List *api_mon_que_full(char *qmgr_host,int qmgr_port,int *num_tsks,char *out_str);

/*
 * gets last 4k bytes of QueMgr log file ...
 */
extern char *api_mon_que_log(char *qmgr_host,int qmgr_port,char *out_str);

/*
 * gets all hosts statistics ...
 */
extern Cpu_List *api_mon_que_cpu(char *qmgr_host,int qmgr_port,char *out_str);

/*
 * gets list of last 25 or so completed jobs from QueMgr ...
 */
extern Job_List *api_get_completedjob_list(char *qmgr_host,int qmgr_port,int *job_count,char *out_str);

/*
 * abort job ...
 */
extern int api_abort_job(char *qmgr_host,int qmgr_port,int job_number,char *job_user,char *out_str);

/*
 * reads rc file and overrides all UI settings found ...
 */
extern int api_rcfile_read(char *rcfile,char *out_str);

/*
 * writes rc file from UI settings ...
 */
extern int api_rcfile_write( char *rcfile,char *out_str);
extern int api_rcfile_write2(FILE *stream,int short_or_long);

/*
 * prints UI settings in rc format to screen (0 is short, != 0 is full display) ...
 */
extern void api_rcfile_print(int fullprint);

/*
 * calls admin test procedure(s) and returns status and msgs ...
 */
extern char *api_admin_test(char *orgpth,char *orgnam,int rport,int *status,char *out_str);

/*
 * just to get home dir ...
 */
extern void api_get_home_dir(char *home_dir);

/*
 * to reconfig quemgr ...
 */
extern char *api_reconfig_quemgr(char *qmgr_host,int quemgr_port,int *status,char *out_str);

/*
 * license checkout and return ...
 */
extern int api_checkout_license(char *license_file);
extern void api_release_license(void);

#ifdef __cplusplus
}
#endif

#endif /* _AMAPI */


B.3 Example Interface
This is the actual source file of the TxtMgr, which uses the Analysis Manager API and the api.h include file shown in the previous section.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/types.h>
#ifndef WINNT
# include <unistd.h>
# include <sys/time.h>
# include <sys/uio.h>
# include <sys/socket.h>
# include <netinet/in.h>
# include <netdb.h>
#else
# include <winsock.h>
#endif
#include <time.h>

#define AM_INITIALIZE 1
#include "api.h"

int dont_connect = 0;
int has_qmgr_host;
int has_qmgr_port;
int has_org;
int has_orgpath;
char lic_file[256];
char org_name[256];
char binpath[256];
char orgpath[256];
char qmgr_host[256];
int qmgr_port;
int rmgr_port;
int msg_sock = -1;
int msg_port = -1;
int msg_sock_job = -1;
int auto_startup;
char sys_rcf_file[256];
char usr_rcf_file[256];
int has_cmd_rcf;
char cmd_rcf_file[256];

/* ==================== */

#define SUBMIT        1
#define ABORT         2
#define WATCHJOB      3
#define WATCHQUE_LOG  4
#define WATCHQUE_FULL 5
#define WATCHQUE_CPU  6
#define LISTCOMP      7
#define RCFILEWRITE   8
#define ADMINTEST     9
#define RECONFIG     10
#define QUIT         11 /* must be highest defined number type */
#define NOTVALID   9999

/* ==================== */

#ifdef WINNT
BOOL console_event_func(DWORD dwEvent)
{
    if(dwEvent == CTRL_LOGOFF_EVENT)
        return TRUE;

#ifdef DEBUG
    fprintf(stderr,"\nbye ...");
#endif
    fprintf(stderr,"\n");

    api_release_license();
#ifdef WINNT
    WSACleanup();
#endif

    return FALSE;


}
#endif

/* ==================== */

void leafname(char *input_string, char *output_string)
{
    int string_length;
    int i;
    char temp_string[256];
    int found;

    /*********************************************************************/
    /* First get rid of the leading path (if any).                       */
    /*********************************************************************/
    string_length = strlen(input_string);
    if(string_length < 1){
        output_string[0] = '\0';
        return;
    }

    found = 0;
    for(i = string_length - 1; i >= 0; i--){
        if( (input_string[i] == '/') || (input_string[i] == '\\') ){
            found = 1;
            strcpy(temp_string, &input_string[i + 1]);
            break;
        }
    }

    if(found == 0)
        strcpy(temp_string, input_string);

    /*********************************************************************/
    /* Now get rid of the extension (if any).                            */
    /*********************************************************************/
    string_length = strlen(temp_string);
    if(string_length < 1){
        output_string[0] = '\0';
        return;
    }

    for(i = string_length - 1; i >= 0; i--){
        if( (temp_string[i] == '.') && (i != 0) ){
            temp_string[i] = '\0';
            strcpy(output_string, temp_string);
            return;
        }
    }

    strcpy(output_string, temp_string);
    return;
}

/* ==================== */

int submit_job(void)
{
    int background;
    int i;
    int lenc;
    int submit_index;
    int job_number;
    int jmgr_port;
    char job_name[256];
    int mem;
    char job_fullname[256];
    int srtn;
    char out_str[2048];
    int ans;
    char ques_text[512];
    char ans1_text[32];
    char ans2_text[32];

    background = 0;

    /*
    ** if not auto_startup, ask for details ...
    */
    if(auto_startup == 0){


        background = 1;

        /*
        ** ask jobname ...
        */
        printf("\nEnter job name: ");
        scanf("%s", job_name);

        /*
        ** ask memory ...
        */
        printf("\nEnter memory (in set units): ");
        scanf("%d", &mem);

        /*
        ** print list of hosts from QueMgr ...
        ** and ask for which to submit to ...
        */
        if(cfg->QUE_TYPE == MSC_QUEUE){

            printf("\nhosts:\n");
            printf("index name\n");
            printf("------------\n");
            for(i = 0; i < cfg->hsts[api_application_index-1].num_hosts; i++){
                printf("%-5d %s\n", i+1,
                       cfg->hsts[api_application_index-1].hosts[i].pseudohost_name);
            }
            printf("\nEnter host index: ");
            scanf("%d", &submit_index);
            submit_index--;
            printf("\n");

            if( (submit_index < 0) ||
                (submit_index >= cfg->hsts[api_application_index-1].num_hosts) ){
                printf("Error, invalid index\n");
                return 1;
            }

        }else{

            printf("\nqueues:\n");
            printf("index name\n");
            printf("------------\n");
            for(i = 0; i < cfg->ques[api_application_index-1].num_queues; i++){
                printf("%-5d %s -> %s\n", i+1,
                       cfg->ques[api_application_index-1].queues[i].queue_name1,
                       cfg->ques[api_application_index-1].queues[i].queue_name2);
            }
            printf("\nEnter queue index: ");
            scanf("%d", &submit_index);
            submit_index--;
            printf("\n");

            if( (submit_index < 0) ||
                (submit_index >= cfg->ques[api_application_index-1].num_queues) ){
                printf("Error, invalid index\n");
                return 1;
            }

        }

        /*
        ** set up config/submit struct info ...
        */
        strcpy(job_fullname, job_name);

        lenc = (int)strlen(job_fullname);
        for(i = 0; i < lenc; i++){
            if(job_fullname[i] == '\\') job_fullname[i] = '/';
        }

        leafname(job_fullname, job_name);

        if(api_application_index == MSC_NASTRAN){
            sprintf(nas_submit.nas_input_deck, "%s", job_fullname);
            nas_config.mem_req = (float)mem;
            nas_submit.submit_index = submit_index;
        }else if(api_application_index == HKS_ABAQUS){
            sprintf(aba_submit.aba_input_deck, "%s", job_fullname);
            aba_config.mem_req = (float)mem;


            aba_submit.submit_index = submit_index;
        }else if(api_application_index == MSC_MARC){
            sprintf(mar_submit.datfile_name, "%s", job_fullname);
            mar_config.mem_req = (float)mem;
            mar_submit.submit_index = submit_index;
        }else{
            sprintf(gen_submit[api_application_index-GENERAL].gen_input_deck, "%s", job_fullname);
            gen_config[api_application_index-GENERAL].mem_req = (float)mem;
            gen_submit[api_application_index-GENERAL].submit_index = submit_index;
        }

    }else{

        /*
        ** leave all config and submit struct settings alone, as
        ** the rcf/override is ASSUMED to have it all correct ...
        ** (just get job_name for use below ...)
        */
        if(api_application_index == MSC_NASTRAN){
            leafname(nas_submit.nas_input_deck, job_name);
        }else if(api_application_index == HKS_ABAQUS){
            leafname(aba_submit.aba_input_deck, job_name);
        }else if(api_application_index == MSC_MARC){
            leafname(mar_submit.datfile_name, job_name);
        }else{
            leafname(gen_submit[api_application_index-GENERAL].gen_input_deck, job_name);
        }

    }

    ans = NO;

    srtn = api_check_job(ques_text, ans1_text, ans2_text, out_str);

    if(srtn < 0){
        printf("%s", out_str);
        return srtn;
    }

    if(srtn > 0){
redo:
        printf("%s\n", ques_text);
        printf("\nAnswer:\n");
        printf("-------\n");
        printf("0 - %s\n", ans1_text);
        printf("1 - %s\n", ans2_text);
        printf("\nanswer: ");
        scanf("%d", &ans);
        printf("\n");
        if( (ans != NO) && (ans != YES) ){
            printf("Error, invalid answer\n\n");
            goto redo;
        }
    }

    srtn = api_submit_job(qmgr_host, qmgr_port, job_name, background, &job_number,
                          binpath, &jmgr_port, ans, out_str);

    if(out_str[0] != '\0'){
        printf("%s", out_str);
    }

    if( (srtn == 0) && (background == 1) ){
        /*
        ** right away get monitor socket ...
        */
        msg_sock = api_mon_job_init(api_this_host, jmgr_port, &msg_port, out_str);
        if(msg_sock < 0){
            msg_port = -1;
            msg_sock_job = -1;
            printf("%s", out_str);
        }else{
            msg_sock_job = job_number;
        }
    }

    return srtn;
}

/* ==================== */


int abort_job(void)
{
    int       srtn;
    Job_List *jr_ptr = NULL;
    int       num_running_jobs;
    int       job_number;
    char      j_numstr[100];
    int       found;
    char      job_user[256];
    char      job_name[256];
    char      proj_name[256];
    int       i;
    char      out_str[2048];

    jr_ptr = api_get_runningjob_list(qmgr_host, qmgr_port, &num_running_jobs, out_str);
    if(num_running_jobs == 0){
        printf("\nNo active jobs found\n");
        return 0;
    }

    if( (num_running_jobs < 0) || (jr_ptr == NULL) ){
        printf("%s", out_str);
        if(jr_ptr != NULL) free(jr_ptr);
        return 1;
    }

    job_number = -1;

    if(auto_startup == 0){

        /*
        ** present list to user ...
        */
        printf("\nRunning jobs ....\n\n");
        printf("num  jobname  jobuser  project  amhost  runhost  subtime\n");
        printf("-----------------------------------------------------------------\n");
        for(i = 0; i < num_running_jobs; i++){
            printf("%-4d %-20s %-20s %-20s %-20s %-20s %-20s\n",
                   jr_ptr[i].job_number, jr_ptr[i].job_name, jr_ptr[i].job_user,
                   jr_ptr[i].job_proj, jr_ptr[i].am_host_name,
                   jr_ptr[i].job_run_host, jr_ptr[i].sub_time_str);
        }

        for(i = 0; i < 100; i++) j_numstr[i] = '\0';
        printf("\nEnter job number: ");
        scanf("%s", j_numstr);
        if( (j_numstr[0] == 'q') || (j_numstr[0] == 'Q') || (j_numstr[0] == '0') ){
            free(jr_ptr);
            return 0;
        }
        sscanf(j_numstr, "%d", &job_number);

        found = 0;
        for(i = 0; i < num_running_jobs; i++){
            if(job_number == jr_ptr[i].job_number){
                found = 1;
                break;
            }
        }

        if(!found){
            printf("Error, job number %d not in list\n", job_number);
            free(jr_ptr);
            return 1;
        }

        printf("\n");

    }else{

        if(api_application_index == MSC_NASTRAN){
            leafname(nas_submit.nas_input_deck, job_name);
        }else if(api_application_index == HKS_ABAQUS){


            leafname(aba_submit.aba_input_deck, job_name);
        }else if(api_application_index == MSC_MARC){
            leafname(mar_submit.datfile_name, job_name);
        }else{
            leafname(gen_submit[api_application_index-GENERAL].gen_input_deck, job_name);
        }
        strcpy(proj_name, ui_config.project_name);

        /*
        ** search list for match and set job_number ...
        */
        job_number = -1;
        for(i = 0; i < num_running_jobs; i++){
            if(strcmp(jr_ptr[i].job_name, job_name) == 0){
                if(strcmp(jr_ptr[i].job_proj, proj_name) == 0){
                    job_number = jr_ptr[i].job_number;
                    break;
                }
            }
        }

    }

    strcpy(job_user, api_user_name);

    srtn = api_abort_job(qmgr_host, qmgr_port, job_number, job_user, out_str);

    if(out_str[0] != '\0'){
        printf("%s", out_str);
    }

    free(jr_ptr);

    return srtn;
}

/* ==================== */

int watch_job(void)
{
    Job_List    *jr_ptr = NULL;
    int          num_running_jobs;
    int          check;
    char         job_host[128];
    int          job_port = 0;
    char         j_numstr[100];
    int          found;
    int          srtn;
    char        *log_str;
    char         job_name[256];
    char         proj_name[256];
    char         sfile[256];
    int          i;
    int          job_number;
    int          sev_level;
    char         out_str[2048];
    int          num_msgs;
    Msg_List    *msg_ptr = NULL;
    int          cpu, pct_cpu;
    int          mem, pct_mem;
    int          dsk, pct_dsk;
    int          elapsed;
    int          status;
    FILE_LIST   *file_list = NULL;
    int          num_files = 0;
    int          file_index;
    int          sizekb;
    int          num_fs;
    JOB_FS_LIST *job_fs_list;

    extern void get_leaf_and_extention(char *, char *);

    jr_ptr = api_get_runningjob_list(qmgr_host, qmgr_port, &num_running_jobs, out_str);
    if(num_running_jobs == 0){
        printf("\nNo active jobs found\n");
        return 0;
    }

    if( (num_running_jobs < 0) || (jr_ptr == NULL) ){
        printf("%s", out_str);
        if(jr_ptr != NULL)


            free(jr_ptr);
        return 1;
    }

    job_number = -1;

    if(auto_startup == 0){

        /*
        ** present list to user ...
        */
        printf("\nRunning jobs ....\n\n");
        printf("num  jobname  jobuser  project  amhost  runhost  subtime\n");
        printf("-----------------------------------------------------------------\n");
        for(i = 0; i < num_running_jobs; i++){
            printf("%-4d %-20s %-20s %-20s %-20s %-20s %-20s\n",
                   jr_ptr[i].job_number, jr_ptr[i].job_name, jr_ptr[i].job_user,
                   jr_ptr[i].job_proj, jr_ptr[i].am_host_name,
                   jr_ptr[i].job_run_host, jr_ptr[i].sub_time_str);
        }

        for(i = 0; i < 100; i++) j_numstr[i] = '\0';
        printf("\nEnter job number: ");
        scanf("%s", j_numstr);
        if( (j_numstr[0] == 'q') || (j_numstr[0] == 'Q') || (j_numstr[0] == '0') ){
            free(jr_ptr);
            return 0;
        }
        sscanf(j_numstr, "%d", &job_number);

        found = 0;
        for(i = 0; i < num_running_jobs; i++){
            if(job_number == jr_ptr[i].job_number){
                job_port = jr_ptr[i].port_number;
                strcpy(job_host, jr_ptr[i].job_submit_host);
                found = 1;
                break;
            }
        }

        if(!found){
            printf("Error, job number %d not in list\n", job_number);
            free(jr_ptr);
            return 1;
        }

    }else{

        if(api_application_index == MSC_NASTRAN){
            leafname(nas_submit.nas_input_deck, job_name);
        }else if(api_application_index == HKS_ABAQUS){
            leafname(aba_submit.aba_input_deck, job_name);
        }else if(api_application_index == MSC_MARC){
            leafname(mar_submit.datfile_name, job_name);
        }else{
            leafname(gen_submit[api_application_index-GENERAL].gen_input_deck, job_name);
        }
        strcpy(proj_name, ui_config.project_name);

        /*
        ** search list for match and set job_number ...
        */
        job_number = -1;
        for(i = 0; i < num_running_jobs; i++){
            if(strcmp(jr_ptr[i].job_name, job_name) == 0){
                if(strcmp(jr_ptr[i].job_proj, proj_name) == 0){
                    job_number = jr_ptr[i].job_number;
                    job_port = jr_ptr[i].port_number;
                    strcpy(job_host, jr_ptr[i].job_submit_host);
                    break;
                }
            }
        }

        if(job_number < 0){


            printf("Error, job name %s not in list\n", job_name);
            free(jr_ptr);
            return 1;
        }

    }

    free(jr_ptr);

#ifdef DEBUG
    fprintf(stderr, "posa\n");
#endif

    /*
    ** get msg socket if needed ...
    */
    if( (msg_sock < 0) || (msg_sock_job != job_number) ){

#ifdef DEBUG
        fprintf(stderr, "posa1\n");
#endif

        msg_sock = api_mon_job_init(job_host, job_port, &msg_port, out_str);
        if(msg_sock < 0){
            msg_port = -1;
            msg_sock_job = -1;
            printf("%s", out_str);
            return 1;
        }else{
            msg_sock_job = job_number;
        }
    }

#ifdef DEBUG
    fprintf(stderr, "posb\n");
#endif

    /*
    ** get severity if not auto ...
    */
    sev_level = 3;
    if(auto_startup == 0){
#ifdef MSGPOP
        if(api_application_index == MSC_NASTRAN){
            printf("Enter message severity level >=: ");
            scanf("%d", &sev_level);
            printf("\n");
        }
        if(sev_level < 0) sev_level = 0;
        if(sev_level > 3) sev_level = 3;
#endif
    }

#ifdef DEBUG
    fprintf(stderr, "posc\n");
#endif

    /*
    ** get monitor info ...
    */
    msg_ptr = api_mon_job_msgs(msg_sock, api_this_host, msg_port, sev_level,
                               &num_msgs, out_str);
    if(num_msgs < 0){
        printf("%s", out_str);
        return 2;
    }

#ifdef DEBUG
    fprintf(stderr, "posd\n");
#endif

    if(msg_ptr == NULL){
        printf("%s", out_str);
        return 3;
    }else if(num_msgs == 0){
        printf("\nNo messages at this time ...\n\n");
    }else{
        /*
        ** msg format is "severity@sevbuf@msgtxt" ... sevbuf is string "NULL" when severity=0
        */
        for(i = 0; i < num_msgs-1; i++){
            printf("  %s\n", msg_ptr[i].msg);


        }
        free(msg_ptr);
    }

#ifdef DEBUG
    fprintf(stderr, "pose\n");
#endif

    job_fs_list = api_mon_job_stats(msg_sock, api_this_host, msg_port,
                                    &cpu, &pct_cpu, &mem, &pct_mem,
                                    &dsk, &pct_dsk, &elapsed, &status,
                                    &num_fs, &srtn, out_str);
    if(srtn != 0){
        printf("%s", out_str);
    }else{
        printf("job stats:\n");
        if(status == JOB_SUBMITTED){
            printf("cpu=%d, %%cpu=%d, mem=%d, %%mem=%d, disk=%d, %%disk=%d, elapsed=%d, status=%s\n",
                   cpu, pct_cpu, mem, pct_mem, dsk, pct_dsk, elapsed, "submitted");
        }else if(status == JOB_QUEUED){
            printf("cpu=%d, %%cpu=%d, mem=%d, %%mem=%d, disk=%d, %%disk=%d, elapsed=%d, status=%s\n",
                   cpu, pct_cpu, mem, pct_mem, dsk, pct_dsk, elapsed, "queued");
        }else if(status == JOB_RUNNING){
            printf("cpu=%d, %%cpu=%d, mem=%d, %%mem=%d, disk=%d, %%disk=%d, elapsed=%d, status=%s\n",
                   cpu, pct_cpu, mem, pct_mem, dsk, pct_dsk, elapsed, "running");
        }else{
            printf("cpu=%d, %%cpu=%d, mem=%d, %%mem=%d, disk=%d, %%disk=%d, elapsed=%d, status=%s\n",
                   cpu, pct_cpu, mem, pct_mem, dsk, pct_dsk, elapsed, "unknown");
        }

        /*
        printf("total num filesys = %d\n", num_fs);
        for(i = 0; i < num_fs; i++){
            fprintf(stdout, "  %s max=%d usage=%d\n", job_fs_list[i].file_sys_name,
                    job_fs_list[i].disk_max_size_mb, job_fs_list[i].disk_used_pct);
        }
        */

        printf("\n");

        if( (num_fs > 0) && (job_fs_list != NULL) ){
            free(job_fs_list);
        }
    }

#ifdef DEBUG
    fprintf(stderr, "posf\n");
#endif

    log_str = api_mon_job_mon(msg_sock, api_this_host, msg_port, out_str);
    if(log_str == NULL){
        printf("%s", out_str);
    }else{
        printf("mon file contents:\n");
        printf("%s", log_str);
        free(log_str);
    }

    file_list = api_mon_job_running_files_list(msg_sock, api_this_host, msg_port,
                                               &num_files, out_str);

#ifdef DEBUG
    printf("api_mon_job_running_files_list: num_files = %d\n", num_files);
#endif

    if(num_files < 0){
        printf("%s", out_str);
        return 4;
    }

    if(num_files == 0) return 0;

    for(i = 0; i < num_files; i++){
        if(i == 0){
            printf("\ndownloadable files: (use q to quit)\n");


            printf("index     job file                       size (kb)\n");
            printf("--------------------------------------------------\n");
        }
        get_leaf_and_extention(file_list[i].filename, sfile);
        printf("%-10d%-30s %d\n", i+1, sfile, file_list[i].sizekb);
    }

    for(i = 0; i < 100; i++) j_numstr[i] = '\0';
    printf("\nEnter file index to download: ");
    scanf("%s", j_numstr);
    if( (j_numstr[0] == 'q') || (j_numstr[0] == 'Q') || (j_numstr[0] == '0') ){
        free(file_list);
        return 0;
    }
    sscanf(j_numstr, "%d", &file_index);

    if(file_index == 0){
        free(file_list);
        return 0;
    }

    check = 0;
    if(file_index < 0){
        check = 1;
        file_index *= -1;
    }
    if(file_index > num_files){
        printf("invalid index\n");
        free(file_list);
        return 5;
    }

    if(check){

        srtn = api_download_file_check(job_number, file_list[file_index-1].filename, &sizekb);

#ifdef DEBUG
        printf("check returns %d\n", srtn);
#endif

        if(srtn == FILE_STILL_DOWNLOADING){
            printf("File %s is still being transferred\n", file_list[file_index-1].filename);
        }else if(srtn == FILE_DOWNLOAD_COMPLETE){
            printf("File %s transfer complete !\n", file_list[file_index-1].filename);
        }

    }else{

        srtn = api_download_file_start(msg_sock, job_number,
                                       file_list[file_index-1].filename, out_str);
        if(srtn != 0){
            printf("File download (%s) start failed, error = %d (%s)",
                   file_list[file_index-1].filename, srtn, out_str);
        }

    }

    free(file_list);

    return 0;
}

/* ==================== */

int watch_que(int which)
{
    int       i;
    int       num_tasks;
    Que_List *ql_ptr = NULL;
    char      out_str[2048];
    char     *log_str = NULL;
    Cpu_List *cpu_ptr = NULL;

    if(which == WATCHQUE_LOG){

        log_str = api_mon_que_log(qmgr_host, qmgr_port, out_str);
        if(log_str == NULL){
            printf("%s", out_str);


            return 1;
        }

        printf("\n");
        printf("%s", log_str);

        free(log_str);

        return 0;

    }else if(which == WATCHQUE_FULL){

        ql_ptr = api_mon_que_full(qmgr_host, qmgr_port, &num_tasks, out_str);

        if( (num_tasks < 0) || (ql_ptr == NULL) ){
            if(ql_ptr != NULL) free(ql_ptr);
            printf("%s", out_str);
            return 1;
        }
        if(num_tasks == 0){
            printf("\nNo active jobs found\n");
            return 0;
        }

        printf("\nQueue stats for all hosts/apps\n");
        printf("\n%-35s%-6s%-6s%-6s %s\n",
               "hostname", "run", "que", "max", "status");
        printf("------------------------------------------------------------\n");

        for(i = 0; i < num_tasks; i++){
            printf("%-35s%-6d%-6d%-6d %s\n",
                   ql_ptr[i].host_name, ql_ptr[i].num_running, ql_ptr[i].num_waiting,
                   ql_ptr[i].maxtsk, ql_ptr[i].stat_str);
        }

        free(ql_ptr);

        return 0;

    }else if(which == WATCHQUE_CPU){

        cpu_ptr = api_mon_que_cpu(qmgr_host, qmgr_port, out_str);
        if(cpu_ptr == NULL){
            printf("%s", out_str);
            return 1;
        }
        printf("\nQueue load stats for all hosts/apps\n");
        printf("\n%-35s%-12s%-12s%-12s\n",
               "hostname", "%cpu util", "avail mem", "avail disk");
        printf("---------------------------------------------------------------------\n");

        for(i = 0; i < cfg->total_h; i++){
            printf("%-35s%-12d%-12d%-12d\n",
                   cpu_ptr[i].host_name, cpu_ptr[i].cpu_util,
                   cpu_ptr[i].avail_mem, cpu_ptr[i].free_disk);
        }

        free(cpu_ptr);

        return 0;

    }else{
        printf("\nError, invalid selection\n");
        return 1;
    }

    /*NOTREACHED*/
}

/* ==================== */

int list_complete(void)
{
    int        num_completed_jobs;
    Job_List  *jc_ptr = NULL;
    Job_List  *jc_ptr2 = NULL;
    char      *mon_msgs = NULL;
    int        num_files;
    FILE_LIST *fl_list = NULL;


    int          srtn;
    int          i;
    int          job_number;
    char         j_numstr[100];
    int          found;
    char         out_str[2048];
    char         sfile[256];
    char         mon_file[256];
    int          cpu_secs, pct_cpu_avg, pct_cpu_max;
    int          mem_kbts, pct_mem_avg, pct_mem_max;
    int          dsk_mbts, pct_dsk_avg, pct_dsk_max;
    int          elapsed, status;
    int          num_fs;
    JOB_FS_LIST *job_fs_list;

    extern void get_leaf_and_extention(char *, char *);

    jc_ptr = api_get_completedjob_list(qmgr_host, qmgr_port, &num_completed_jobs, out_str);
    if(num_completed_jobs == 0){
        printf("\nNo completed jobs found\n");
        return 0;
    }

    if( (num_completed_jobs < 0) || (jc_ptr == NULL) ){
        printf("%s", out_str);
        if(jc_ptr != NULL) free(jc_ptr);
        return 1;
    }

    job_number = -1;

    if(auto_startup == 0){

        /*
        ** present list to user ...
        */
        printf("\nCompleted jobs ....\n\n");
        printf("num  jobname  username  subtime\n");
        printf("------------------------------------------------------\n");
        for(i = 0; i < num_completed_jobs; i++){
            printf("%-4d %-20s %-20s %-20s\n", jc_ptr[i].job_number,
                   jc_ptr[i].job_name, jc_ptr[i].job_user, jc_ptr[i].sub_time_str);
        }

        printf("\nEnter job number: ");
        scanf("%s", j_numstr);
        if( (j_numstr[0] == 'q') || (j_numstr[0] == 'Q') || (j_numstr[0] == '0') ){
            free(jc_ptr);
            return 0;
        }
        sscanf(j_numstr, "%d", &job_number);

        printf("\n");

    }else{

        printf("\nError, can't list completed jobs in batch mode.\n");
        free(jc_ptr);
        return 1;

    }

    found = -1;
    for(i = 0; i < num_completed_jobs; i++){
        if(job_number == jc_ptr[i].job_number){
            found = i;
            break;
        }
    }

    if(found < 0){
        printf("Error, job number %d not in list\n", job_number);
        free(jc_ptr);
        return 1;
    }

    printf("Job name:             %s\n", jc_ptr[found].job_name);
    printf("Job user:             %s\n", jc_ptr[found].job_user);
    printf("Job originating host: %s\n", jc_ptr[found].job_submit_host);
    printf("Job originating dir:  %s\n", jc_ptr[found].work_dir);


    printf("Job AM hostname:      %s\n", jc_ptr[found].am_host_name);
    printf("Job run host:         %s\n", jc_ptr[found].job_run_host);

    if(jc_ptr[found].jobstatus == JOB_SUCCESSFUL)
        printf("Job complete status:  success\n");
    else if(jc_ptr[found].jobstatus == JOB_ABORTED)
        printf("Job complete status:  aborted\n");
    else if(jc_ptr[found].jobstatus == JOB_FAILED)
        printf("Job complete status:  failed\n");
    else
        printf("Job complete status:  unknown\n");

    /* ------------ */

    sprintf(mon_file, "%s/%s.mon", jc_ptr[found].work_dir, jc_ptr[found].job_name);

    jc_ptr2 = api_com_job_gen(jc_ptr[found].job_submit_host, mon_file, out_str);
    if(jc_ptr2 == NULL){
        printf("%s", out_str);
        free(jc_ptr);
        return 1;
    }

    /* check job number ... */

    if(jc_ptr[found].job_number != jc_ptr2->job_number){
        printf("\nJob numbers do not match -\n");
        printf("assuming newer job with same .mon file is currently running\n");
        printf("so no additional job info is available\n");
        free(jc_ptr);
        free(jc_ptr2);
        return 1;
    }

    printf("\ngeneral info:\n");
    printf("num  jobname  jobuser  amhost  runhost  subtime  status\n");
    printf("------------------------------------------------------------------------\n");
    printf("%-4d %-20s %-20s %-20s %-20s %-30s %-6d\n", jc_ptr2->job_number,
           jc_ptr2->job_name, jc_ptr2->job_user, jc_ptr2->am_host_name,
           jc_ptr2->job_run_host, jc_ptr2->sub_time_str, jc_ptr2->jobstatus);

    /* ------------ */

    job_fs_list = api_com_job_stats(jc_ptr[found].job_submit_host, mon_file,
                                    &cpu_secs, &pct_cpu_avg, &pct_cpu_max,
                                    &mem_kbts, &pct_mem_avg, &pct_mem_max,
                                    &dsk_mbts, &pct_dsk_avg, &pct_dsk_max,
                                    &elapsed, &status,
                                    &num_fs, &srtn, out_str);
    if(srtn < 0){
        printf("%s", out_str);
        if( (num_fs > 0) && (job_fs_list != NULL) ){
            free(job_fs_list);
        }
        free(jc_ptr);
        free(jc_ptr2);
        return 1;
    }

    printf("\njob stats:\n");
    printf("cpu(sec)=%d, %%cpu(avg)=%d, %%cpu(max)=%d\n", cpu_secs, pct_cpu_avg, pct_cpu_max);
    printf("mem(kb) =%d, %%mem(avg)=%d, %%mem(max)=%d\n", mem_kbts, pct_mem_avg, pct_mem_max);
    printf("dsk(mb) =%d, %%dsk(avg)=%d, %%dsk(max)=%d\n", dsk_mbts, pct_dsk_avg, pct_dsk_max);
    printf("elapsed =%d, status=%d\n", elapsed, status);

    /*
    printf("total num filesys = %d\n", num_fs);
    for(i = 0; i < num_fs; i++){
        fprintf(stdout, "  %s max=%d usage=%d\n", job_fs_list[i].file_sys_name,
                job_fs_list[i].disk_max_size_mb, job_fs_list[i].disk_used_pct);
    }
    printf("\n");
    */


    if( (num_fs > 0) && (job_fs_list != NULL) ){
        free(job_fs_list);
    }

    /* ------------ */

    mon_msgs = api_com_job_mon(jc_ptr[found].job_submit_host, mon_file, out_str);
    if(mon_msgs == NULL){
        printf("Error, unable to determine mon file msgs\n%s\n", out_str);
        free(jc_ptr);
        free(jc_ptr2);
        return 1;
    }

    printf("\nmon file contents:\n%s", mon_msgs);
    free(mon_msgs);

    /* ------------ */

    fl_list = api_com_job_received_files_list(jc_ptr[found].job_submit_host, mon_file,
                                              &num_files, out_str);

#ifdef DEBUG
    printf("api_com_job_received_files_list: num_files = %d\n", num_files);
#endif

    if(num_files < 0){
        printf("%s", out_str);
        free(jc_ptr);
        free(jc_ptr2);
        return 4;
    }

    if(num_files > 0){
        for(i = 0; i < num_files; i++){
            if(i == 0){
                printf("\nviewable files:\n");
                printf("index     job file                       size (kb)\n");
                printf("--------------------------------------------------\n");
            }
            get_leaf_and_extention(fl_list[i].filename, sfile);
            printf("%-10d%-30s %d\n", i+1, sfile, fl_list[i].sizekb);
        }
        free(fl_list);
    }

    /* ------------ */

    free(jc_ptr);
    free(jc_ptr2);

    return 0;
}

/* ==================== */

int write_rcfile(void)
{
    int  srtn;
    char out_str[2048];

    if(has_cmd_rcf){
        srtn = api_rcfile_write(cmd_rcf_file, out_str);
        if(srtn != 0){
            printf("%s", out_str);
            return 1;
        }else{
            printf("\nSettings successfully written to rc file <%s>\n", cmd_rcf_file);
        }
    }else{
        printf("\nWarning, no -rcf file specified so cannot write settings\n");
    }

    return 0;
}

/* ==================== */

int admin_test(void)


{
    int   status;
    char *test_str = NULL;
    char  out_str[2048];

    test_str = api_admin_test(orgpath, org_name, rmgr_port, &status, out_str);
    if(status != 0){
        printf("\nAdmin test returns %d, text = %s", status, out_str);
    }
    if(test_str != NULL){
        printf("\n%s", test_str);
        free(test_str);
    }

    return 0;
}

/* ==================== */

int reconfig_quemgr(void)
{
    int   status;
    char *recfg_str = NULL;
    char  out_str[2048];

    /*
    ** if user is Admin then ...
    */
    if(strcmp(api_user_name, cfg->ADMIN) != 0){
        printf("\nError, user <%s> is not the Admin <%s>, so cannot reconfig\n",
               api_user_name, cfg->ADMIN);
        return 0;
    }

    recfg_str = api_reconfig_quemgr(qmgr_host, qmgr_port, &status, out_str);
    if(status != 0){
        printf("\nReconfig returns %d, text = %s", status, out_str);
    }
    if(recfg_str != NULL){
        printf("\n%s", recfg_str);
        free(recfg_str);
    }

    return 0;
}

/* ==================== */

void print_menu(void)
{
    printf("\n");
    printf("Enter selection:\n");
    printf("  1). submit a job\n");
    printf("  2). abort a job\n");
    printf("  3). monitor a job\n");
    printf("  4). show QueMgr log file\n");
    printf("  5). show QueMgr jobs/queues\n");
    printf("  6). show QueMgr cpu/mem/disk\n");
    printf("  7). list completed jobs\n");
    printf("  8). write rcfile settings\n");
    printf("  9). admin test\n");
    printf(" 10). admin reconfig QueMgr\n");
    printf(" 11). quit\n");
    printf("\n");
    printf("choice: ");
    return;
}

/* ==================== */

int get_response(void)
{
    char bogus[100];
    int  choice;

    (void)scanf("%s", bogus);
    if( (bogus[0] == 'q') || (bogus[0] == 'Q') )
        choice = QUIT;
    else
        choice = atoi(bogus);


    if( (choice < SUBMIT) || (choice > QUIT) ) return NOTVALID;

    return choice;
}

/* ==================== */

int doit(int choice)
{
    int srtn;

    if(choice == QUIT) return -1;

#ifdef DEBUG
    printf("choice made was: %d\n", choice);
#endif

    if(dont_connect){
        if(choice != ADMINTEST){
            printf("\nError, only valid option with -nocon is Admin test\n");
            return 0;
        }
    }

    if(choice == SUBMIT){
        srtn = submit_job();
    }else if(choice == ABORT){
        srtn = abort_job();
    }else if(choice == WATCHJOB){
        srtn = watch_job();
    }else if(choice == WATCHQUE_LOG){
        srtn = watch_que(choice);
    }else if(choice == WATCHQUE_FULL){
        srtn = watch_que(choice);
    }else if(choice == WATCHQUE_CPU){
        srtn = watch_que(choice);
    }else if(choice == LISTCOMP){
        srtn = list_complete();
    }else if(choice == RCFILEWRITE){
        srtn = write_rcfile();
    }else if(choice == ADMINTEST){
        srtn = admin_test();
    }else if(choice == RECONFIG){
        srtn = reconfig_quemgr();
    }else{
        srtn = 0;
        printf("invalid choice ?\n");
    }

    return srtn;
}

/* ==================== */

int main(int argc, char **argv)
{
    int    i, j, k;
    int    len1;
    int    done;
    int    not_first_real_app;
    int    first_real_app_num;
    FILE  *wp = NULL;
    int    do_print;
    char   env_file[256];
    char   first_real_app_str[128];
    int    srtn;
    int    choice;
    int    timout;
    int    has_timout;
    char  *ptr;
    char  *qmgr_hoststr;
    char  *qmgr_portstr;
    char  *user_str;
    char   home_dir[256];
    char   tmpstr[256];
    char   error_msg[256];
    char   tmp_host[256];
    char   out_str[2048];
#ifdef WINNT


    int     err;
    WORD    wVersionRequested;
    WSADATA wsaData;
#endif

    struct hostent *host_entry;

#ifdef WINNT
    extern BOOL console_event_func(DWORD);
#endif
    extern void get_home_dir(char *);

    /* ------------ */

    /*
    ** necessary windows startup socket code ...
    */

#ifdef WINNT
    wVersionRequested = MAKEWORD( SOCKET_VERSION1, SOCKET_VERSION2 );

    err = WSAStartup( wVersionRequested, &wsaData );

    if(err != 0){
        printf("Error, WSAStartup failed\n");
        return 1;
    }

    if( ( LOBYTE( wsaData.wVersion ) != 1 ) || ( HIBYTE( wsaData.wVersion ) != 1 ) ){
        WSACleanup();
        printf("Error, WSAStartup version incompatible\n");
        return 1;
    }
#endif

    /* ------------ */

#ifdef WINNT
    /*
    ** console handler ...
    */
    (void)SetConsoleCtrlHandler((PHANDLER_ROUTINE)console_event_func, TRUE);
#endif

    /* ------------ */

    /*
    ** get this hostname ...
    */
    gethostname(api_this_host, 256);
    strcpy(tmp_host, api_this_host);
    host_entry = (struct hostent *)gethostbyname(tmp_host);
    if(host_entry != NULL) strcpy(api_this_host, host_entry->h_name);

    /* ------------ */

    /*
    ** get this username ...
    */
    user_str = api_getlogin();
    strcpy(api_user_name, user_str);

    /* ------------ */

    /*
    ** assume binpath is from P3_HOME (or AM_HOME) ...
    ** command-line will override ...
    */
    binpath[0] = '\0';
    orgpath[0] = '\0';
#ifdef ULTIMA
    ptr = getenv("AM_HOME");
#else
    ptr = getenv("P3_HOME");
#endif
    if(ptr != NULL){
        strcpy(binpath, ptr);
    }


#ifdef DEBUG
    printf("binpath = <%s>\n", binpath);
#endif

    /* ------------ */

    /*
    ** get QueMgr host, port, app name, index, p3home (or amhome),
    ** and -rcf from command line args ...
    */
    lic_file[0] = '\0';
    has_qmgr_host = 0;
    has_qmgr_port = 0;
    has_org = 0;
    has_orgpath = 0;
    has_timout = 0;
    timout = BLOCK_TIMEOUT;
    strcpy(org_name, "default");
    ptr = getenv("P3_ORG");
    if(ptr != NULL){
        strcpy(org_name, ptr);
        has_org = 1;
    }
    qmgr_host[0] = '\0';
    qmgr_port = -1111;
    rmgr_port = RMTMGR_RESV_PORT;
    api_application_name[0] = '\0';
    api_application_index = -1;

    sys_rcf_file[0] = '\0';
    usr_rcf_file[0] = '\0';
    has_cmd_rcf = 0;
    cmd_rcf_file[0] = '\0';

    strcpy(usr_rcf_file, ".p3mgrrc");
    get_home_dir(home_dir);
    if(home_dir[0] != '\0'){
        sprintf(usr_rcf_file, "%s/.p3mgrrc", home_dir);
    }

#ifdef DEBUG
    fprintf(stderr, "usr_rcf_file = <%s>\n", usr_rcf_file);
#endif

    auto_startup = 0;
    do_print = 0;
    env_file[0] = '\0';
    if(argc > 1){
        i = 1;
        while(i < argc){
            if((strcmp(argv[i], "-qmgrhost") == 0) && (i < argc-1)){
                has_qmgr_host = 1;
                strcpy(qmgr_host, argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-qmgrport") == 0) && (i < argc-1)){
                has_qmgr_port = 1;
                qmgr_port = atoi(argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-rmgrport") == 0) && (i < argc-1)){
                rmgr_port = atoi(argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-timeout") == 0) && (i < argc-1)){
                timout = atoi(argv[i+1]);
                has_timout = 1;
                i++;
            }else if((strcmp(argv[i], "-org") == 0) && (i < argc-1)){
                has_org = 1;
                strcpy(org_name, argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-orgpath") == 0) && (i < argc-1)){
                has_orgpath = 1;
                strcpy(orgpath, argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-auth") == 0) && (i < argc-1)){
                strcpy(lic_file, argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-app") == 0) && (i < argc-1)){
                strcpy(api_application_name, argv[i+1]);
                i++;
            }else if((strcmp(argv[i], "-rcf") == 0) && (i < argc-1)){
                has_cmd_rcf = 1;


strcpy(cmd_rcf_file,argv[i+1]); i++;#ifdef ULTIMA }else if((strcmp(argv[i],”-amhome”) == 0) && (i < argc-1)){#else }else if((strcmp(argv[i],”-p3home”) == 0) && (i < argc-1)){#endif strcpy(binpath,argv[i+1]); i++; }else if((strcmp(argv[i],”-choice”) == 0) && (i < argc-1)){ auto_startup = atoi(argv[i+1]); i++; }else if(strcmp(argv[i],”-env”) == 0){ do_print = 1; }else if(strcmp(argv[i],”-envall”) == 0){ do_print = 2; }else if((strcmp(argv[i],”-envf”) == 0) && (i < argc-1)){ strcpy(env_file,argv[i+1]); do_print = 3; i++; }else if((strcmp(argv[i],”-envfall”) == 0) && (i < argc-1)){ strcpy(env_file,argv[i+1]); do_print = 4; i++; }else if(strcmp(argv[i],”-nocon”) == 0){ dont_connect = 1; }else if(strcmp(argv[i],”-version”) == 0){ fprintf(stderr,”version: %s\n”,GLOBAL_AM_VERSION); return 0; } i++; } }

#ifdef DEBUG if(has_cmd_rcf) fprintf(stderr,”cmd_rcf_file = <%s>\n”,cmd_rcf_file);#endif

/* ------------ */

/* ** if binpath is still emtpy then its an error ... */#ifdef DEBUG printf(“binpath = <%s>\n”,binpath);#endif

if(binpath[0] == ‘\0’){#ifdef ULTIMA printf(“Error, AM_HOME env var not set\n”);#else printf(“Error, P3_HOME env var not set\n”);#endif return 1; }

#ifdef LAPI if(lic_file[0] == ‘\0’){ ptr = getenv(“MSC_LICENSE_FILE”); if(ptr == NULL){ ptr = getenv(“LM_LICENSE_FILE”); } if(ptr == NULL){ printf(“Error, authorization file not set (MSC_LICENSE_FILE)\n”); return 1; } strcpy(lic_file,ptr); }#else strcpy(lic_file,”empty.noauth”);#endif

/*
** change back-slashes to forward slashes for binpath ...
*/
i = 0;
j = 0;
k = (int)strlen(binpath);
while(i < k){


#ifdef DEBUG
    fprintf(stderr,"txtmgr: i=%d, k=%d\n",i,k);
    fprintf(stderr,"txtmgr: binpath[i] = %c\n",binpath[i]);
#endif

    if(binpath[i] == '\\'){
        if(i < k-1){
            if(binpath[i+1] == '\\'){
                i++;
            }
        }
        tmpstr[j] = '/';
        j++;
    }else{
        tmpstr[j] = binpath[i];
        j++;
    }

#ifdef DEBUG
    fprintf(stderr,"HERE\n");
#endif

    i++;
}
tmpstr[j] = '\0';
strcpy(binpath,tmpstr);

/*
** make sure binpath has no slash at end ...
*/
len1 = (int)strlen(binpath);
if(len1 > 0){
    if( (binpath[len1-1] == '/') || (binpath[len1-1] == '\\') ){
        binpath[len1-1] = '\0';
    }
}

/*
** mck - add /p3manager_files (or analysis_manager) to binpath ...
*/
#ifdef ULTIMA
strcat(binpath,"/analysis_manager");
#else
strcat(binpath,"/p3manager_files");
#endif

/* ------------ */

/*
** MCK MCK MCK - get orgpath - it WILL be the same as binpath
** for the org.cfg file ...
*/
if(has_orgpath == 0){

    strcpy(orgpath,binpath);

}else{

    /*
    ** change back-slashes to forward slashes for orgpath ...
    */
    i = 0;
    j = 0;
    k = (int)strlen(orgpath);
    while(i < k){

#ifdef DEBUG
        fprintf(stderr,"txtmgr: i=%d, k=%d\n",i,k);
        fprintf(stderr,"txtmgr: orgpath[i] = %c\n",orgpath[i]);
#endif

        if(orgpath[i] == '\\'){
            if(i < k-1){
                if(orgpath[i+1] == '\\'){
                    i++;
                }
            }
            tmpstr[j] = '/';
            j++;
        }else{
            tmpstr[j] = orgpath[i];


            j++;
        }

#ifdef DEBUG
        fprintf(stderr,"HERE\n");
#endif

        i++;
    }
    tmpstr[j] = '\0';
    strcpy(orgpath,tmpstr);

    /*
    ** make sure orgpath has no slash at end ...
    */
    len1 = (int)strlen(orgpath);
    if(len1 > 0){
        if( (orgpath[len1-1] == '/') || (orgpath[len1-1] == '\\') ){
            orgpath[len1-1] = '\0';
        }
    }

    /*
    ** mck - add /p3manager_files (or analysis_manager) to orgpath ...
    */
#ifdef ULTIMA
    strcat(orgpath,"/analysis_manager");
#else
    strcat(orgpath,"/p3manager_files");
#endif

}

/* ------------ */

sprintf(sys_rcf_file,"%s/%s/p3mgrrc",orgpath,org_name);

#ifdef DEBUG
fprintf(stderr,"sys_rcf_file = <%s>\n",sys_rcf_file);
#endif

/* ------------ */

/*
** check env vars if not set on command-line
*/
if(qmgr_host[0] == '\0'){
    qmgr_hoststr = getenv("P3_MASTER");
    if(qmgr_hoststr != NULL){
        strcpy(qmgr_host,qmgr_hoststr);
        has_qmgr_host = 1;
    }
}

if(qmgr_host[0] == '\0'){
    qmgr_hoststr = getenv("MSC_AM_QUEMGR");
    if(qmgr_hoststr != NULL){
        strcpy(qmgr_host,qmgr_hoststr);
        has_qmgr_host = 1;
    }
}

if(qmgr_host[0] == '\0'){
    qmgr_hoststr = getenv("QUEMGR_HOST");
    if(qmgr_hoststr != NULL){
        strcpy(qmgr_host,qmgr_hoststr);
        has_qmgr_host = 1;
    }
}

if(qmgr_port == -1111){
    qmgr_portstr = getenv("P3_PORT");
    if(qmgr_portstr != NULL){
        qmgr_port = atoi(qmgr_portstr);
        has_qmgr_port = 1;
    }
}

if(qmgr_port == -1111){
    qmgr_portstr = getenv("MSC_AM_QUEPORT");
    if(qmgr_portstr != NULL){


        qmgr_port = atoi(qmgr_portstr);
        has_qmgr_port = 1;
    }
}

if(qmgr_port == -1111){
    qmgr_portstr = getenv("QUEMGR_PORT");
    if(qmgr_portstr != NULL){
        qmgr_port = atoi(qmgr_portstr);
        has_qmgr_port = 1;
    }
}

/* ------------ */

#ifndef ULTIMA
/*
** checkout license ...
*/
if( (do_print == 0) && (dont_connect == 0) ){
    if((srtn = api_checkout_license(lic_file)) != 0){
        printf("Error, Authorization failure %d.",srtn);
        if(global_auth_msg != NULL){
            printf(" Error msg = %s\n",global_auth_msg);
        }else{
            printf("\n");
        }
        return 1;
    }
#ifdef DEBUG
    fprintf(stderr,"auth_file = %s\n",lic_file);
    fprintf(stderr,"checkout_license returns %d\n",srtn);
#endif
}
#endif

/* ------------ */

/*
** init api ...
*/
srtn = api_init(out_str);
if(srtn != 0){
    printf("%s, error code = %d\n",out_str,srtn);
    return 1;
}

/* ------------ */

/*
** adjust global network timeout if desired ...
*/
if(has_timout == 0){
    timout = 30;
}
srtn = api_set_gbl_timeout(timout);
if(srtn != 0){
    printf("Error, unable to set global timeout to %d secs\n",timout);
    api_release_license();
#ifdef WINNT
    WSACleanup();
#endif
    return 1;
}

/* ------------ */

ptr = getenv("AM_THIS_HOST");
if(ptr != NULL){
    if( (strcmp(ptr,"no") != 0) && (strcmp(ptr,"NO") != 0) ){
        api_use_this_host = 1;
    }
}

/* ------------ */

/*
** read orgs if possible (org.cfg is in binpath) ...
*/
org = NULL;
num_orgs = 0;


if( (has_qmgr_host == 0) && (has_qmgr_port == 0) ){

    org = api_read_orgs(binpath,&num_orgs,&srtn);
    if(srtn != 0){
        printf("Warning, unable to read org.cfg file, code = %d\n",srtn);
        /*
        ** use defaults ...
        */
        strcpy(qmgr_host,api_this_host);
        qmgr_port = QUEMGR_RESV_PORT;
    }else{
        if( (num_orgs > 0) && (org != NULL) ){
            /*
            ** figure out which quemgr to connect to ...
            */
            done = 0;
            for(i=0;i<num_orgs;i++){
                if(strcmp(org[i].org_name,org_name) == 0){
                    strcpy(qmgr_host,org[i].host_name);
                    qmgr_port = org[i].port;
                    done = 1;
                    break;
                }
            }
            if( (!done) && (has_org == 0) ){
                /*
                ** use first available ...
                */
                strcpy(qmgr_host,org[0].host_name);
                qmgr_port = org[0].port;
                done = 1;
            }else if( (!done) && (has_org == 1) ){
                /*
                ** no match found, assume this host and all ...
                */
                strcpy(qmgr_host,api_this_host);
                qmgr_port = QUEMGR_RESV_PORT;
                done = 1;
            }
        }else{
            printf("Warning, unable to read org.cfg file, no orgs found\n");
            /*
            ** use defaults ...
            */
            strcpy(qmgr_host,api_this_host);
            qmgr_port = QUEMGR_RESV_PORT;
            done = 1;
        }

}

}

#ifdef DEBUG
printf("\n");
printf("quemgr org  = %s\n",org_name);
printf("quemgr host = %s\n",qmgr_host);
printf("quemgr port = %d\n",qmgr_port);
#endif

/* ------------ */

if(! dont_connect){

    /*
    ** get config info ...
    */
    cfg = api_get_config(qmgr_host, qmgr_port, &srtn, error_msg);
    if(srtn != 0){
        printf("Error, msg = %s, error = %d\n",error_msg,srtn);
        api_release_license();
#ifdef WINNT
        WSACleanup();
#endif
        return 1;
    }

/* ------------ */

/* find first real app, just in case */


    first_real_app_str[0] = '\0';
    not_first_real_app = 1;
    first_real_app_num = 1;
    for(i=0;i<MAX_APPS;i++){
        if(not_first_real_app){

#ifdef DEBUG
            fprintf(stderr,"cfg->progs[%d].app_name = <%s>\n",i,cfg->progs[i].app_name);
#endif

            if(cfg->progs[i].app_name[0] != '\0'){
                not_first_real_app = 0;
                strcpy(first_real_app_str,cfg->progs[i].app_name);
                first_real_app_num = i + 1;
                break;
            }
        }
    }

    if(not_first_real_app){
        /* error, no apps defined */
        fprintf(stderr,"TxtMgr Error: No valid applications defined.\n");
        return 1;
    }

    if(api_application_name[0] != '\0'){
        for(i=0;i<MAX_APPS;i++){
            if(strcmp(api_application_name,cfg->progs[i].app_name) == 0){
                api_application_index = i + 1;
                break;
            }
        }
    }

    if(api_application_index <= 0){
        /* app not specified - use first available */
        strcpy(api_application_name,first_real_app_str);
        api_application_index = first_real_app_num;

/* put up message about app not found, using first one */

        fprintf(stderr,"\nTxtMgr Info: No application specified.\nUsing first available application of <%s> = %d\n",api_application_name,api_application_index);
    }

#ifdef DEBUG
    fprintf(stderr,"application_name  = <%s>\n",api_application_name);
    fprintf(stderr,"application_index = %d\n",api_application_index);
#endif

/* ----------- */

    /* DEBUGGING
    if(orgpath[0] != '\0'){
        api_write_config(cfg,orgpath,"bogus",&srtn,error_msg);
        if(srtn != 0){
            printf("Error, unable to write config files, msg = %s, code = %d\n",error_msg,srtn);
            api_release_license();
#ifdef WINNT
            WSACleanup();
#endif
            return 0;
        }
    }
    DEBUGGING */

/* ----------- */

    /*
    ** initialize config values ...
    */

api_init_uiconfig(cfg);

    /*
    ** because we are txtmgr, reset job_mon_flag of ui_config
    ** to be off by default ...
    */

if(auto_startup == SUBMIT){


        ui_config.job_mon_flag = 0;
    }

#ifdef DEBUG_111
    api_rcfile_print(1);
#endif

/* ----------- */

    /*
    ** override some settings if needed ...
    */

(void)api_rcfile_read(sys_rcf_file,out_str);

(void)api_rcfile_read(usr_rcf_file,out_str);

    if(has_cmd_rcf){
        srtn = api_rcfile_read(cmd_rcf_file,out_str);
        if(srtn != 0){
            printf("%s\n",out_str);
        }
    }

#ifdef DEBUG_111
    api_rcfile_print(1);
#endif

/* ----------- */

    /*
    ** if just a print env then do it and stop ...
    */
    if(do_print){
        if(do_print >= 3){
            if(env_file[0] != '\0'){
                wp = fopen(env_file,"wb");
                if(wp != NULL){
                    api_rcfile_write2(wp,(do_print-3));
                    fclose(wp);
                }
            }
        }else{
            api_rcfile_print(do_print-1);
        }
        api_release_license();
#ifdef WINNT
        WSACleanup();
#endif
        return 0;
    }

} /* ! dont_connect ... */

/* ----------- */

if(auto_startup > 0){

srtn = doit(auto_startup);

}else{

    /*
    ** query for selection and do work ...
    */
    while(1){
        print_menu();
        choice = get_response();
        srtn = doit(choice);

#ifdef DEBUG
        printf("doit(%d) returns %d\n",choice,srtn);
#endif

        if(srtn < 0) break;
    }
    srtn = 0;

}


api_release_license();
#ifdef WINNT
WSACleanup();
#endif

return srtn;
}


I N D E X
MSC.Patran Analysis Manager User's Guide


A

ABAQUS, 13
ABAQUS submittals, 13
abort, 78

    selecting job, 78
action

    abort, 78
    configure, 30
    monitor, 62
    submit, 24

administrator, 108
analysis

    ABAQUS, 13
    general, 14, 15
    MSC.Nastran, 11

Analysis Preference, 8
applications, 109

    adding, 111
    deleting, 112

C
command arguments, 46
configuration

    disk, 119
    examples, 136
    files, 107
    general, 44
    organizational group, 142
    queue, 121
    separate users, 143
    test, 125

configuration management, 104

configure, 30
    command line, 55
    command line arguments, 52, 53, 54
    database, 44, 53, 54
    disk space, 31
    host/queue, 44
    mail, 41
    memory, 35
    miscellaneous, 52
    monitoring, 44
    number of CPUs, 52, 53, 54
    project, 44, 45
    restart, 48
    time, 42
    user subroutines, 53, 54

D
daemon, 84

    General Manager, 86
    Job Manager, 85
    Queue Manager, 84

default host/queue, 46
disable, 10
disk configuration, 119

E
edit file, 16, 17, 24, 28
editor, 86
enable, 10
environment variables, 94
errors, 152
executables, 4, 82
execute, 19


F
files

    configuration, 105, 107
    created, 22
    databases, 44
    directory structure, 82
    disk configuration, 140
    edit, 16, 17, 24, 28
    examples, 136
    host configuration, 136
    queue configuration, 141
    save settings, 30, 32, 34, 36, 38, 40, 43, 45
    selecting, 25
    X resources, 102

filesystem
    add, 119
    delete, 120
    test, 130

fonts, 103

G
general, 15
Generic submittals, 15

H
host, 26

    adding, 117
    configuration, 116
    deleting, 118
    test, 125, 126

host groups, 123

I
installation, 99

    instructions, 100
    requirements, 99

integration, 5
interface

    configuration management, 104
    user, 84

J
job stats, 86, 187
job viewer, 86, 187

K
keyword index, 55

L
least loaded, 123
limitations, 11, 13, 14
load leveling, 123
LSF, 121

M
maximum application tasks, 108, 109
modify

    configuration files, 107
monitor, 62

    completed job, 68
    CPU Loads, 76
    full listing, 75
    host status, 73
    host/queue, 71
    job listing, 72
    queue manager log, 74
    running job, 63

MSC.Marc Submittals, 14
MSC.Nastran, 10, 11
MSC.Nastran submittals, 11

N
NQS, 121

O
organization, 6

    multiple, 94

P
physical hosts, 113

    adding, 114
    deleting, 115
    test, 126

product information, 3
product purpose, 2
program, 4, 19, 82

    queue manager, 144
program arguments, 86


project directory, 46

Q
queue, 26

    add, 121
    delete, 122
    test, 132

queue type, 107
    LSF, 107
    NQS, 107

R
reconfigure, 134
restart, 48
rules, 11, 13, 14

S
startup arguments, 86
statistics, 86, 187
submit, 16, 17, 24

    preparing, 8
    separate user, 46

T
test

    application, 125
    disk, 130
    MSC.Patran AM host, 128
    physical hosts, 126
    queue, 132

test configuration/host, 125, 126

U
user interface, 84

V
variables, 94

X
X resources, 102
