Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2


Date: 04/21/2010

WellPoint Inc.

Clinical Informatics Solutions Technical Design Document [Source-LZ] [Technical Leader- ] Portfolio No.:

Service Request No.:

CART No.:

AOP Tracking No.:

Approvals

Enterprise Architect: [Name] Signature: Date:

IT Project Mgr: [Name] Signature: Date:

Authorizing Business Sponsor:

[Name] Signature: Date:

Authorizing IT Sponsor:

[Name] Signature: Date:

History

[Purpose of this Section: Record changes to this document here making an entry for each new version.]

Version Number | Release and/or Approval Date | Author(s) | Section(s), Page(s) and Topic Revised

1.0 | 12/07/2009 | Offshore Team | Initial Version

1.1 | 12/14/2009 | Offshore Team | Updated Version

1.2 | 04/21/2010 | Offshore Team | Updated Complete Version

document.doc 1

Notice of Confidentiality and Custodial Responsibilities

This WellPoint document contains confidential information that is WellPoint’s intellectual property. As a holder of this document, you may NOT disclose its content or any information derived from it to any person or entity outside of WellPoint.


Contents

1. Introduction
1.1. Scope
1.2. Definitions, Acronyms and Abbreviations
2. Resources Affected
2.1. External Resources
3. Application Design
3.1. Architectural and Coding References
3.2. Platform and Version Information
4. Process Flow of CIS
4.1 Server Check Process
4.2 Load Log Script
4.3 Loading Process from Source to Landing Zone
4.3.1 Weekly Full Refresh Loads
4.3.2 Weekly Incremental Loads
4.3.3 History Loads
4.3.4 Deriving Member Key Fields
4.3.5 Audit Check Process
5. Components/Objects/Modules
i. Major Component Inventory
ii. Major Component Details
6. Data Stores
6.1 Data Store Inventory
6.2 Data Store Data Elements
6.3 Data Store Descriptions
7. Implementation Activities
7.1 Packaging/Release Activity
8. Technical Assumptions
9. Reference Documents
10. Project Team Signoffs / Approvals


1. Introduction

This high-level section requires no input from the author. It is simply information to assist authors in understanding and completing each sub-section.

General Template Information

Purpose of Document

The Functional Design Document (FDD), the Architectural Design Document and the Infrastructure Impact Assessment (IIA) document are the predecessors to this document. The IT Technical Lead normally completes this document with contribution from the Solution Architect, Data Architect, and Infrastructure Build Engineering as needed.

This document must contain all the elements needed to code a fully functional system. It should be complete and accurate enough that any developer, inside or outside the immediate development team, can construct all the components needed to complete the system.

The IT Technical Lead is responsible for the creation of this document. The document is owned by IT.

Help Completing Template [email protected]

Frequently Asked Questions about Completing this Document

# Question Answer

1 How do I attach another doc as an object in this doc?

(In this Word Doc) click Insert > Object > Create From File tab > check "Display as Icon" > browse for the file > click "OK". The file should now be on the document, but may not be fully visible. If not fully visible: click on the object once > right click > "Format Object" > Layout tab > select "Tight" > hit "OK".

2 How do I provide a hyperlink in this doc to another doc?

(In this Word Doc) Insert > Hyperlink > enter the hyperlink

3 How do I update the Table of Contents?

Go to the Table of Contents page > position the cursor to the left of the table (not over the table) > left click the mouse button; the entire table should be highlighted > press the F9 key.


1.1. Scope

Information Guide on Scope

Scope

The current scope of the design is to extract data from four operational source systems (ECC, WMDS, Trimed and Care planner) and load them into the Landing Zone (LZ). This document is based on the Business System Document and details the solution approach to load all the tables into the landing zone for CIS. There are 15 tables from ECC, 18 tables from WMDS, 8 tables from Trimed and 1 Flat file for CM. This document gives an overview in terms of performing Incremental Refresh (IR), Full Refresh (FR) and One Time Load of Historical Data.

In the scheduled process, the incremental data (new or changed records) from the source databases (WMDS, ECC, Trimed and Care Planner) will be loaded into the Landing Zone by means of Informatica ETL mappings.

Every week the records from the source would be pulled based on the load criteria from the source tables and loaded into their respective tables in Landing Zone (LZ).

Landing Zone will be a transient staging area and no history will be maintained.

Description of Section

The following steps will be executed as part of the project scope:

Design and develop landing zone tables as per the source table layouts and LZ table creation guidelines.

Extract clinical data from ECC (Oracle), WMDS (Oracle), Trimed (DB2) and Careplanner (flat file), from Jan 1, 2007 forward.

Load the History data into landing zone tables.

Design processes to handle complex transformations.

Design ETL process to load incremental data, full refresh data into landing zone staging area.

Setup process to load & maintain load log table for LZ data loads.

Setup process for Audit balancing.

Setup process for WLM jobs scheduling.

Define job dependencies and restart-ability.

Post load cleanup activities.

Note:

1.2. Definitions, Acronyms and Abbreviations

Information Guide on Definitions, Acronyms, and Abbreviations

document.doc 4

Page 5: Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

BI Business Intelligence

BI Staging Area Staging area for Current Clinical reporting platform on a regional server (AEDW)

CDC Change Data Capture

CM Case Management

CMS Codes Management System

COB Coordination of Benefits

COBRA Consolidated Omnibus Budget Reconciliation Act

CP Care Planner

CS90 Claims System – New York

CSA Conformed Staging Area

DM Disease Management

ECC Empire Care Connects

ECR Enterprise Client Reporting

EDL Enterprise Data Layer

EDL R2 Enterprise Data Layer Release 2

EDW Enterprise Data Warehouse

EDWard Enterprise Data Warehouse and Research Depot. Earlier this was known as EDL R2.

ERISA Employment Retirement Income Security Act

ETL Extract Transform & Load

FR Full Refresh

HMC Health Management Corporation

IM Information Management

INC Incremental Load

INFA Informatica

IQ Information Quality

IR Implementation Readiness

LZ Landing Zone Staging Area


MBU Marketing Business Unit

NAICS_CD North American Industry Classification System Code

PCP Primary Care Physician

POC Proof of Concept

RFC Request for Change

SDLC System Development Life Cycle

SIC_CD Standard Industry Classification Code

SLA Service Level Agreement

TD Teradata database

TROOP True Out of Pocket

UAT User Acceptance Testing

WEDW West-Enterprise Data Warehouse

WGS WellPoint Group Systems

WLM Work Load Manager

WMDS WellPoint Medical Decision Support System

WPD WellPoint - Product Database

2. Resources Affected

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

Information Guide on Resources Affected

Resources Affected WMDS and ECC - Oracle database, Trimed – DB2 database, Careplanner – Flat file, Landing Zone Area (LZ) - Teradata.

Description of Section

List all other external resources (applications/entities) affected by the changes and a brief description of how they will be affected. Do not include business entities.

External resources are represented by: hardware, a person, program, or another system.

Note:

2.1. External Resources

Information Guide on External Resources

External Resources Operational System WMDS/ECC/Trimed/Careplanner Host Systems


Teradata

ENTDWPROD - Host DBA Support

Unix

TSM Disk Utilization System Admin Support

WLM

WLM Scheduling Support Service Delivery Support

Informatica

Underlying Oracle DB Application Admin Support

Description of Section

These are the External Entities outside of the applications that can be affected (i.e. DBA, External Vendor, Tape Management, Regulatory Agency, etc.). Describe in detail the impact.

Note:

3. Application Design

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

Information Guide on Application Design

Application Design

This section details the technical design for the clinical subject area. The design approach is explained in the sub-sections below.

Description of Section

The aim of the current application design is to load the CM data from the WMDS, ECC, Trimed and Careplanner source systems into the Landing Zone (LZ). This document is based on the Business System Design Document and is used to create the Technical Detail Design Specification Document. It defines the approach for loading the Landing Zone tables for historical, incremental and full refresh loads.

Note:

3.1. Architectural and Coding References

Information Guide on Architectural and Coding References

Architectural and Coding References

1. Informatica Developer guideline 4.4

2. EDL R2 Informatica Standards and Naming Conventions

http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx

3. ETL Guidelines document and RA decision Points:

The following Share Point link gives the information about the latest ETL Guidelines document and Reference Architecture Decision Points:

http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx

4. Landing Guidelines document – LZ_CDC_guidelines_V2

This document gives general guidelines on the process that needs to be followed for creating the Landing Zone (LZ) tables and the high-level inputs needed for Change Data Capture (CDC) attribute mapping.

http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx

5. WLM Policy Document

http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx

Description of Section

Identify the Architectural and coding standards to be used. If reference is to a document that is not in an enterprise wide library, attach a link to the document or attach the document itself in the Reference section of this document.

Note:Please include as much information as possible.

3.2. Platform and Version Information

The table below lists the software for the Clinical Subject area.

Sl. No | Software | Required Version
1 | Teradata | 06.02.02.80
2 | Informatica | 8.6.1
3 | Unix | AIX
4 | WLM | 3.0.1

The following diagram describes the platform information.


The tables below contain server details for the Clinical Subject Area in different environments.

Informatica Servers/Environment:

Environment | Server Name | IP | Ports
DEV/SIT | vaathmr380.corp.anthem.com | 30.135.22.46 | 6001, 55010-55100
UAT/IR | vaathmr381.corp.anthem.com | 30.135.22.47 | 6001, 55000-55100
PROD | vaathmr357.corp.anthem.com | 30.130.16.150 | 6001, 55201-55300

Teradata Servers/Environment:

Environment | Server Name | Server IP | Ports
DEV | DWDEV | 30.135.31.232 | 1025
SIT | DWTEST1 | 30.135.88.22 | 1025
UAT/IR | DWTEST3 | 30.135.88.22 | 1025
PROD | ENTDWPROD | 30.128.223.28 | 1025

Informatica Repository:

Environment | Repository Name
DEV/SIT | EDL_DEV_86x
UAT/IR | EDL_UAT_86x
PROD | EDL_PROD_86x

Informatica Folders:

Environment | Folder Name
DEV/SIT | ENT_CLINICAL_PROGRAMS_DEV, ENT_CLINICAL_PROGRAMS_SHARED_DEV
UAT/IR | ENT_CLINICAL_PROGRAMS_UAT, ENT_CLINICAL_PROGRAMS_SHARED_UAT
PROD | ENT_CLINICAL_PROGRAMS, ENT_CLINICAL_PROGRAMS_SHARED

Teradata Database:

Environment | Database Names
DEV | ETL_TEMP_CPARP, QADATA_CPARP, CPARP, CPARP_ALLPHI, CPARP_NOPHI, CPARP_NOHAPHI, ETL_VIEWS_CPARP

SIT | T36_ETL_DATA_ENT, T36_ETL_TEMP_ENT, T36_QADATA_ENT, T36_UTLTY_ENT, T36_ETL_VIEWS_ENT, T36_EDW, T36_EDW_[ALL|NO|NOHA]PHI

UAT/IR | T37_ETL_DATA_ENT, T37_ETL_TEMP_ENT, T37_QADATA_ENT, T37_UTLTY_ENT, T37_ETL_VIEWS_ENT, T37_EDW, T37_EDW_[ALL|NO|NOHA]PHI

PROD | ETL_DATA_V20_ENT, ETL_TEMP_V20_ENT, QADATA_V20_ENT, UTLTY_V20_ENT, ETL_VIEWS_V20_ENT, EDW_V20, CIS_HIST, EDW_[ALL|NO|NOHA]PHI, EDW_SL_[ALL|NO|NOHA]PHI

Supporting Databases:

Functionality | Database Name
High level hierarchy for CIS | PRJ_CPARP
Proxy ID for data loads | CPARP_ETL_ID
Views | ETL_VIEWS_CPARP
Macros | ETLMACRO_CPARP
Stored Procedures | ETLPROC_CPARP
Work objects and error tables needed by Teradata utilities | UTLTY_CPARP
Hierarchy node for storing CIS DDLs for use by developers | CPARP_DEVDDL

UNIX Directories:


Type | Path
Scripts | /u01vaathmr380/app/cis/test/scripts
Source file | /u97vaathmr380/pcenterdata/test/SrcFiles
Target file | /u97vaathmr380/pcenterdata/test/TgtFiles
Cache File | /u97vaathmr380/pcenterdata/test/Cache
Parameter Files | /u97vaathmr380/pcenterdata/test/InfaParm


4. Process Flow of CIS


4.1 Server check process

For WMDS and ECC:

An Informatica process is created to check server availability. The process loads data from the ECC_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task makes the process wait for 1200 seconds. After the 1200-second wait, the Informatica session runs again to look for an entry in the ECC_GLOBAL_NAME table. This cycle continues for 7200 seconds before the entire workflow fails. A similar process is created for the WMDS source, which loads data from the WMDS_GLOBAL_NAME table.
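The wait-and-retry logic above can be sketched as a simple loop. This is an illustrative Python rendering only; `probe` stands in for the Informatica session that selects from ECC_GLOBAL_NAME, and the 1200/7200-second values come from the text.

```python
import time

def wait_for_server(probe, wait_secs=1200, timeout_secs=7200, sleep=time.sleep):
    """Retry a connectivity probe until it succeeds or the timeout elapses.

    probe() returns True when the source database is reachable (in the real
    workflow this is the Informatica session reading ECC_GLOBAL_NAME).
    """
    elapsed = 0
    while True:
        if probe():
            return True            # server is up; downstream loads may start
        if elapsed >= timeout_secs:
            return False           # give up; the entire workflow fails
        sleep(wait_secs)           # command task: wait 1200 seconds
        elapsed += wait_secs
```

The `sleep` parameter is injectable only so the loop can be exercised without real waits; the production behavior is the plain `time.sleep`.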

For Trimed:

[Workflow diagram: Start, then sessions s_CIS_TRIMED_SERVER_UP through s_CIS_TRIMED_SERVER_UP_6, each followed by a command task c_SLEEP_TIMER through c_SLEEP_TIMER_5.]

This Informatica process is created to check the availability of the DB2 database for Trimed. It selects data from the SYSIBM.SYSDUMMY1 table every 20 minutes, for up to 120 minutes, as per the business terms. A command task after every session task puts the process to sleep for 1200 seconds.


4.2 Load Log Script

Insertion of the Load Log Key:

This workflow runs command tasks to invoke a BTEQ script that inserts the load log key into the LZ_LOAD_LOG table, a script to update the parameter file values, a script to update the present date parameter, and a script to verify whether the parameter file was updated properly.

Contents of the file:

This script checks whether any load log key has a load end date time of 8888-12-31. If it finds a load key value with a load end date time of 8888-12-31, it errors out with return code 100. If it does not, it creates a new load log key entry based on the information in the parameter file.
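The open-entry check can be summarized as follows. This is an illustrative Python sketch of the BTEQ logic, not the script itself; the row shape and `next_key` parameter are assumptions for the example.

```python
OPEN_END_DATE = "8888-12-31"   # sentinel: a load that never closed out

def new_load_log_key(load_log_rows, next_key):
    """Sketch of the load-log insert check. Rows are (key, load_end_dt_tm)
    tuples; returns (return_code, new_key)."""
    if any(end == OPEN_END_DATE for _, end in load_log_rows):
        return 100, None                        # a prior load is still open
    load_log_rows.append((next_key, OPEN_END_DATE))  # open entry for this run
    return 0, next_key
```

A second call without closing the first entry reproduces the return-code-100 failure described above.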

Fetching and Updating the Load Log Key Value:

After the load log key value is successfully inserted into the LZ_LOAD_LOG table, the BTEQ script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the FR and INC parameter files.

1. The script checks whether any other instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.

2. The script checks whether the old bteq_outfile is present. If it exists, it removes that output file.

3. The script also checks whether valid parameters are passed. If not, it displays a message that an invalid parameter was passed and prompts for the Full Script Name, Mapping Parameter File and Logon_Parameter_File.

4. The script checks whether the load log values were fetched properly; if not, it displays a message that there was no valid load log entry in the table and exits with exit code 100. After a successful run, the script removes the temp files and captures any errors present.

5. The script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the parameter files.

Fetching and Updating the Present Run Date Value:

After the load log key value in the mapping parameter file is updated successfully, the BTEQ script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.

Contents of the file:

1. The script checks whether any other instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.

2. The script checks whether an old bteq_outfile is present. If it exists, it removes that output file.

3. The script also checks whether valid parameters are passed. If not, it displays a message that an invalid parameter was passed and prompts for the Full Script Name, Mapping Parameter File and Logon_Parameter_File.

4. The script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.

Validating the Load Log Key Value:

This script checks whether the inserted load log value in the load log table is the same as the value in the parameter file. If the values are equal, it displays a message that the load log key value has been updated correctly in the parameter file, and the loads that follow will proceed. If they are not equal, it displays a message that the verification has failed and exits with exit code 9; the loads that follow will not proceed.

This check is performed for both the FR and INC parameter files.
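The validation step reduces to a comparison between the table value and each parameter file. A minimal Python sketch of that decision (the dict of parameter-file values is an assumption for illustration; the real check is a shell/BTEQ script):

```python
def verify_load_log_key(table_value, param_values):
    """Compare the load log key in LZ_LOAD_LOG against the value written into
    each parameter file (FR and INC). Returns 0 when all match, 9 otherwise."""
    for name, value in param_values.items():
        if value != table_value:
            print(f"Verification failed for {name}; loads will not proceed.")
            return 9
    print("Load log key updated correctly; loads will proceed.")
    return 0
```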

4.3 Loading Process from Source to Landing Zone

Once the load log key is created in the LZ load log table, Informatica workflows are executed to load the data from the source to the Landing Zone. Per EDW standards, all mappings from source to Landing Zone set default values for nulls or blanks present in char, varchar, number and date fields.

The following are the types of loads from Source to LZ.

document.doc 15

Page 16: Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

1) The first type is the Weekly Full Refresh load, used for tables with low volume counts. Full refresh tables are truncated and reloaded. Every table has a command session before it containing a generic delete script that deletes the data from that particular table.

2) The second type is the Incremental Load, which is based on the 'Last Run Date'. All records that have been updated or inserted in the source after the last run date are fetched and loaded into the Landing Zone tables. As mentioned above, every table has a command session before it containing a generic delete script that deletes the data from that particular table.

4.3.1 Weekly Full Refresh Loads

An example of the full refresh mapping (m_CLINICAL_LZ_TRIMED_TMDTPATIENT) is shown pictorially above. At a high level, the mapping reads data from the Oracle database and loads it into the Landing Zone tables (LZ_TRIMED and LZ_CP) in Teradata. The Load Log and Audit processes are followed in every mapping to capture the source record count and insert a row into AUDT_STTSTC for post-load verification.

document.doc 16

Page 17: Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

Mapping Description:

ID | Transformation Name | Component Type | Description
1 | SQ_CME_CASE_MANAGEMENT_EPISODE | Source Definition | Source qualifier for the table CME_CASE_MANAGEMENT_EPISODE from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the Landing Zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_CME_CM_EPISODE | Target Definition | Truncate and load. Populates the Landing Zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended in each mapping, based on table name.

Mapplet Description:

ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended in each mapping based on table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE id based on table_nm and sor_cd; the AUDT_BLNCG_RULE id cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | The mapplet output from which the complete mapplet data is accumulated.

4.3.2 Weekly Incremental Loads

An example of the incremental process is shown pictorially above. At a high level, the mapping reads data from the Oracle database for the last three days, based on the create date and update date fields, and loads it into the Landing Zone table LZ_ECC_COI_CON_ISSUE in Teradata. The load log and audit processes are followed in every mapping to capture the source record count and write to AUDT_STTSTC for post-load verification.

Mapping Description:

ID | Transformation Name | Component Type | Description
1 | SQ_COI_CON_ISSUE | Source Definition | Source qualifier for the table COI_CON_ISSUE from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the Landing Zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_COI_CON_ISSUE | Target Definition | Truncate and load. Populates the Landing Zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended in each mapping, based on table name.

Mapplet Description:

ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended in each mapping based on table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE id based on table_nm and sor_cd; the AUDT_BLNCG_RULE id cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | The mapplet output from which the complete mapplet data is accumulated.

The Pseudo Code for the Source filter can be as follows:

WHERE
(
    (TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) >= TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
     AND TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) < TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS')))
    OR
    (TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) >= TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
     AND TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) < TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS')))
)
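The filter keeps a row when either its create date or its last-update date falls in the half-open window [last run date, present date). A Python rendering of that window logic (illustrative only; the real filter runs in the source qualifier SQL, and the function name is an assumption):

```python
from datetime import date

def in_incremental_window(create_dt, update_dt, last_run, present):
    """Keep a row when its create date OR last-update date falls in
    [last_run, present) -- the half-open window used by the source filter."""
    def hit(d):
        return d is not None and last_run <= d < present
    return hit(create_dt) or hit(update_dt)
```

Note that the window is inclusive of the last run date and exclusive of the present date, so consecutive runs never pick up the same row twice on the date boundary.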

4.3.3 History Loads

The history load is applicable only to tables identified as incremental tables; the data for these tables is brought into the Landing Zone based on the category they fall into.

Eg:

select /*+ parallel(a,4) parallel(b,4) */
    col1,
    col2,
    ----------
    ----------
FROM
WHERE TAU_TREATMENT_AUTHORIZATION.TAU_MEM_UID = MEM_MEMBER.MEM_UID
AND trunc(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) >= trunc(to_date(to_char('$$MAPP_HIST_START_DATE'), 'mm-dd-yyyy hh24:mi:ss'))
AND trunc(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) < trunc(to_date(to_char('$$MAPP_HIST_END_DATE'), 'mm-dd-yyyy hh24:mi:ss'))

document.doc 20

Page 21: Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

Below are the tables whose history data, starting from Jan 01, 2007 and matched on TAU_TREATMENT_AUTHORIZATION.TAU_UID, is brought down to the Landing Zone.

ECC TABLE_NAME | WMDS TABLE_NAME
MNO_MEM_NOTE | COI_CON_ISSUE
COI_CON_ISSUE | MEM_MEMBER
MEM_MEMBER | MNO_MEM_NOTE
MPG_MEM_PLAN_GROUP | TAU_TREATMENT_AUTHORIZATION
TAU_TREATMENT_AUTHORIZATION | MPG_MEM_PLAN_GROUP

PVD_PROVIDER

If there is a huge volume of data in the source tables, loading all the records in one pass is an issue because of the time it takes. Hence, very large tables are split into multiple stages and loaded into the Landing Zone. Each stage holds the data for a particular period based on the TAU_TREATMENT_AUTHORIZATION table create_date (say 6 or 3 months). The Landing Zone table is complete only after all the stage loads finish, and it is then used as a single table for further processing.
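Splitting the overall history range into 6- or 3-month stage windows can be sketched as follows. This is an illustrative helper, not part of the actual ETL; it assumes month-start boundaries (the history load starts Jan 1, 2007).

```python
from datetime import date

def stage_windows(start, end, months_per_stage):
    """Split a history-load date range into half-open stage windows
    (e.g. 6- or 3-month chunks) so a very large table can be loaded in
    several passes. Assumes start/end fall on month boundaries."""
    windows = []
    cur = start
    while cur < end:
        # advance by months_per_stage, clamping at the overall end date
        y, m = divmod(cur.month - 1 + months_per_stage, 12)
        nxt = min(date(cur.year + y, m + 1, cur.day), end)
        windows.append((cur, nxt))
        cur = nxt
    return windows
```

Each (start, end) pair would feed the $$MAPP_HIST_START_DATE / $$MAPP_HIST_END_DATE parameters of one stage run.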

4.3.4 Deriving Member Key fields

WMDS.MEM_MEMBER table:

WMDS.MEM_MEMBER needs to be joined with the WMDS.TAU_TREATMENT_AUTHORIZATION table to get the matching MEM_ID for each TAU_UID. The information inside the 'Member id' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Source Code', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'.

Pseudo Code (LZ_WMDS_MEM_MEMBER) used in the Source qualifier can be found in Incremental.doc or History.doc documents in ‘Deriving Member Key fields section’.

Expression to Derive fields for MBR_KEY :

Ports Expression

SUBSCRIBER_ID

decode (TRUE,
    instr(MEM_ID, 'DENWGS') != 0 OR
    instr(MEM_ID, 'DENINT') != 0 OR
    instr(MEM_ID, 'WGS2BCC') != 0 OR
    instr(MEM_ID, 'WGS2CO') != 0 OR
    instr(MEM_ID, 'WGS2GA') != 0 OR
    instr(MEM_ID, 'WGS2MO') != 0 OR
    instr(MEM_ID, 'WGS2NV') != 0 OR
    instr(MEM_ID, 'WGS2SSP') != 0 OR
    instr(MEM_ID, 'WGS2UNI') != 0 OR
    instr(MEM_ID, 'WGS2WI') != 0 OR
    instr(MEM_ID, 'WGSMPD') != 0 OR
    instr(MEM_ID, 'WBMO') != 0 OR
    instr(MEM_ID, 'CR') != 0 OR
    instr(MEM_ID, 'AFEP') != 0 OR
    instr(MEM_ID, 'HLADV') != 0 OR
    instr(MEM_ID, 'HLHMO') != 0 OR
    instr(MEM_ID, 'HLPPO') != 0 OR
    instr(MEM_ID, 'NA') != 0 OR
    instr(MEM_ID, 'WBGA') != 0 OR
    instr(MEM_ID, 'UNIBOR') != 0 OR
    instr(MEM_ID, 'UNISHBP') != 0 OR
    instr(MEM_ID, 'DENSTAR') != 0 OR
    instr(MEM_ID, 'STAR') != 0 OR
    instr(MEM_ID, 'MTRKUNI') != 0 OR
    instr(MEM_ID, 'WGS13') != 0,
        SUBSTR((RTRIM(LTRIM(MEM_ID))), 1, 9),
    instr(MEM_ID, 'WBWI') != 0,
        SUBSTR((RTRIM(LTRIM(MEM_ID))), 4, 9),
    instr(MEM_ID, 'D950') != 0,
        SUBSTR((RTRIM(LTRIM(MEM_ID))), 1, 12),
    'UNK')

MEMBER_SEQ_NBR

decode (TRUE,
    instr(MEM_ID, 'CR') != 0 OR
    instr(MEM_ID, 'AFEP') != 0 OR
    instr(MEM_ID, 'UNIBOR') != 0 OR
    instr(MEM_ID, 'UNISHBP') != 0,
        to_char(SUBSTR((RTRIM(LTRIM(MEM_ID))), 10, 2)),
    instr(MEM_ID, 'WBGA') != 0 OR
    instr(MEM_ID, 'DENSTAR') != 0 OR
    instr(MEM_ID, 'STAR') != 0 OR
    instr(MEM_ID, 'WGS13') != 0,
        to_char((SUBSTR((RTRIM(LTRIM(MEM_ID))), 11, 2))),
    instr(MEM_ID, 'NA') != 0,
        to_char((SUBSTR((RTRIM(LTRIM(MEM_ID))), 15, 2))),
    'UNK')

MBR_CD

decode (TRUE,
    instr(MEM_ID, 'DENWGS') != 0 OR
    instr(MEM_ID, 'WGS2BCC') != 0 OR
    instr(MEM_ID, 'WGS2CO') != 0 OR
    instr(MEM_ID, 'WGS2GA') != 0 OR
    instr(MEM_ID, 'WGS2MO') != 0 OR
    instr(MEM_ID, 'WGS2NV') != 0 OR
    instr(MEM_ID, 'WGS2SSP') != 0 OR
    instr(MEM_ID, 'WGS2UNI') != 0 OR
    instr(MEM_ID, 'WGS2WI') != 0 OR
    instr(MEM_ID, 'WGSMPD') != 0 OR
    instr(MEM_ID, 'WBMO') != 0 OR
    instr(MEM_ID, 'HLADV') != 0 OR
    instr(MEM_ID, 'HLHMO') != 0 OR
    instr(MEM_ID, 'HLPPO') != 0 OR
    instr(MEM_ID, 'MTRKUNI') != 0,
        SUBSTR((RTRIM(LTRIM(MEM_ID))), 11, 2),
    instr(MEM_ID, 'WBWI') != 0 OR
    instr(MEM_ID, 'D950') != 0,
        SUBSTR((RTRIM(LTRIM(MEM_ID))), 13, 2),
    'UNK')

MBR_SOR_CD

decode (TRUE,
    rtrim(substr(MEM_ID, instr(MEM_ID, 'CR'))) = 'CR', '823',
    instr(MEM_ID, 'AFEP') != 0, 'FEP',
    (instr(MEM_ID, 'NA') != 0 AND is_spaces(substr(MEM_ID, instr(MEM_ID, 'NA') - 1, 1))), '824',
    instr(MEM_ID, 'DEN') != 0, 'NA',
    instr(MEM_ID, 'STAR') != 0, '815',
    instr(MEM_ID, 'WGS13') != 0, 'NA',
    instr(MEM_ID, 'WGS') != 0, '808',
    'NA')

SRC_GRP_NBR 'UNK'
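The branch structure of the SUBSCRIBER_ID decode above can be read as: if the MEM_ID contains any of the WGS-family tags, take the first 9 characters of the trimmed value; WBWI takes 9 characters starting at position 4; D950 takes the first 12; otherwise 'UNK'. An illustrative Python translation (the function name is an assumption, and this has not been validated against real MEM_ID formats):

```python
WGS_TAGS = ("DENWGS", "DENINT", "WGS2BCC", "WGS2CO", "WGS2GA", "WGS2MO",
            "WGS2NV", "WGS2SSP", "WGS2UNI", "WGS2WI", "WGSMPD", "WBMO",
            "CR", "AFEP", "HLADV", "HLHMO", "HLPPO", "NA", "WBGA",
            "UNIBOR", "UNISHBP", "DENSTAR", "STAR", "MTRKUNI", "WGS13")

def wmds_subscriber_id(mem_id):
    """Python rendering of the WMDS SUBSCRIBER_ID decode: pick a substring of
    the trimmed MEM_ID depending on which source-system tag it contains,
    evaluated in the same order as the decode branches."""
    trimmed = mem_id.strip()
    if any(tag in mem_id for tag in WGS_TAGS):
        return trimmed[0:9]      # SUBSTR(..., 1, 9)
    if "WBWI" in mem_id:
        return trimmed[3:12]     # SUBSTR(..., 4, 9)
    if "D950" in mem_id:
        return trimmed[0:12]     # SUBSTR(..., 1, 12)
    return "UNK"
```

Branch order matters, exactly as in decode: a MEM_ID matching both a WGS-family tag and WBWI resolves through the first branch.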

ECC.MEM_MEMBER table:

ECC.MEM_MEMBER needs to be joined with the ECC.TAU_TREATMENT_AUTHORIZATION table to get the matching MEM_ID for each TAU_UID. The information inside the 'Member id' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Source Code', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'.

Pseudo Code (LZ_ECC_MEM_MEMBER) used in the Source qualifier can be found in Incremental.doc or History.doc documents in ‘Deriving Member Key fields section’.

Expression to Derive fields for MBR_KEY:

Ports Expression

SUBSCRIBER_ID

DECODE(TRUE,
    INSTR(RTRIM(LTRIM(MEM_ID)), '-') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 1, INSTR(MEM_ID, '-') - 1),
    MEM_ID)

MEMBER_SEQ_NBR

DECODE(TRUE,
    INSTR(RTRIM(LTRIM(MEM_ID)), '-') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), (INSTR(MEM_ID, '-') + 1), 2),
    'NA')

MBR_SOR_CD '809'
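The ECC case is much simpler than WMDS: the MEM_ID is effectively 'subscriber-seq', split on the hyphen. A minimal Python sketch (the function name is an assumption; it approximates the decode expressions above by locating the hyphen on the trimmed value):

```python
def ecc_member_key(mem_id):
    """Split an ECC MEM_ID of the form 'subscriber-seq' on the first hyphen.
    Without a hyphen, the whole value is the subscriber ID and the member
    sequence number defaults to 'NA', mirroring the decode fallbacks."""
    trimmed = mem_id.strip()
    dash = trimmed.find("-")
    if dash == -1:
        return trimmed, "NA"
    # SUBSTR(..., INSTR(...)+1, 2): take two characters after the hyphen
    return trimmed[:dash], trimmed[dash + 1:dash + 3]
```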

TRIMED.OPRTPRSNCVRD table:

The information inside OPRTPRSNCVRD.I_COVD_PRSN needs to be split and decoded in an expression to get the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'.

Pseudo Code (LZ_TRIMED_OPRTPRSNCVRD) used in the Source qualifier can be found in Incremental.doc or History.doc documents in ‘Deriving Member Key fields section’.

Expression to Derive fields for MBR_KEY:


Ports Expression

SUBSCRIBER_ID

decode (TRUE,
    instr(C_OWNER_CAT, 'CORP') != 0 OR
    instr(C_OWNER_CAT, 'HMORIC') != 0 OR
    instr(C_OWNER_CAT, 'FEP') != 0,
        SUBSTR((RTRIM(LTRIM(I_COVD_PRSN))), 1, 9),
    'UNK')

MEMBER_SEQ_NBR

decode (TRUE,instr(C_OWNER_CAT,'CORP') != 0 ORinstr(C_OWNER_CAT,'HMORIC') != 0 ORinstr(C_OWNER_CAT,'FEP') != 0,to_char(SUBSTR((RTRIM(LTRIM(I_COVD_PRSN))),10,2)),'UNK')

MBR_SOR_CD decode (TRUE,instr(C_OWNER_CAT, 'HMORIC') != 0,'868',instr(C_OWNER_CAT, 'CORP') != 0,'869',instr(C_OWNER_CAT, 'FEP') != 0,'888','NA')

SRC_GRP_NBR 'UNK'
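The TRIMED derivation above can be sketched in Python as follows; this is an illustration of the decode logic, not production code, and the function name is ours:

```python
# Illustrative sketch of the TRIMED.OPRTPRSNCVRD key derivation above: when
# C_OWNER_CAT contains a known owner category, the first 9 characters of the
# trimmed I_COVD_PRSN are the Subscriber ID and the next 2 are the Member
# Sequence Number. The category checks mirror the decode order (HMORIC,
# then CORP, then FEP).

def derive_trimed_member_key(i_covd_prsn, c_owner_cat):
    trimmed = i_covd_prsn.strip()
    known = any(m in c_owner_cat for m in ('CORP', 'HMORIC', 'FEP'))
    subscriber_id = trimmed[:9] if known else 'UNK'    # SUBSTR(x, 1, 9)
    member_seq = trimmed[9:11] if known else 'UNK'     # SUBSTR(x, 10, 2)
    if 'HMORIC' in c_owner_cat:
        sor_cd = '868'
    elif 'CORP' in c_owner_cat:
        sor_cd = '869'
    elif 'FEP' in c_owner_cat:
        sor_cd = '888'
    else:
        sor_cd = 'NA'
    return subscriber_id, member_seq, sor_cd
```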

CarePlanner.CASE_EVNT table: The information inside 'CASE_EVNT.PAT_ID' needs to be split and decoded in an expression to derive the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code', 'Source Group Number' and 'Source Member Code'.

Expression to Derive fields for MBR_KEY:

Ports Expression

SUBSCRIBER_ID
decode (TRUE,
  instr(PAT_ID, 'ACES') != 0, SUBSTR((RTRIM(LTRIM(PAT_ID))), 1, 10),
  'UNK')

MEMBER_SEQ_NBR
decode (TRUE,
  instr(PAT_ID, 'ACES') != 0, to_char((SUBSTR((RTRIM(LTRIM(PAT_ID))), 12, 1))),
  'UNK')

MBR_SOR_CD '822'

SRC_GRP_NBR 'UNK'

SRC_MBR_CD 'UNK'
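A minimal Python sketch of the CarePlanner PAT_ID split above (illustrative names and a sample PAT_ID shape assumed, since the layout of PAT_ID is not spelled out here):

```python
# Illustrative sketch of the CASE_EVNT.PAT_ID derivation above: when PAT_ID
# contains 'ACES', the first 10 characters of the trimmed value are the
# Subscriber ID and character 12 is the Member Sequence Number.

def derive_cp_member_key(pat_id):
    trimmed = pat_id.strip()
    if 'ACES' in pat_id:
        return trimmed[:10], trimmed[11:12]   # SUBSTR(x,1,10), SUBSTR(x,12,1)
    return 'UNK', 'UNK'
```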

Default Values: Mapping parameters are defined and utilized to assign the values appropriately. A sample parameter definition and a sample parameter file are attached below.


The pseudo expression templates for replacing null values with defaults, based on the data type, can be summarized as follows:

Data Type: CHAR, VARCHAR
Default Value(s): UNK, NA
Pseudo Expression:
DECODE(TRUE,
  ISNULL(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK),
  IS_SPACES(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK),
  IN_STRING_1 = '', to_char($$MAPP_DEFAULT_STRING_UNK),
  IN_STRING_1)

Data Type: INTEGER, DECIMAL
Default Value(s): 0
Pseudo Expression:
DECODE(TRUE,
  ISNULL(IN_NUMBER_1), $$MAPP_DEFAULT_INTEGER,
  IN_NUMBER_1)

Data Type: DATE
Default Value(s): 8888-12-31 00:00:00
Pseudo Expression:
DECODE(TRUE,
  ISNULL(IN_DATE_1), ROUND(to_date(to_char($$MAPP_DEFAULT_HIGH_DATE))),
  IN_DATE_1)
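As an illustration, the three default-substitution templates above behave roughly like the following Python sketch; the $$MAPP_DEFAULT_* parameter values are shown as assumed constants here, since their actual values come from the mapping parameter file:

```python
# Illustrative sketch of the null-default templates above (not the actual
# Informatica expressions). Parameter values are assumed placeholders.
from datetime import datetime

MAPP_DEFAULT_STRING_UNK = 'UNK'
MAPP_DEFAULT_INTEGER = 0
MAPP_DEFAULT_HIGH_DATE = datetime(8888, 12, 31)

def default_string(value):
    # NULL, all-spaces, and empty strings all collapse to the default
    if value is None or str(value).strip() == '':
        return MAPP_DEFAULT_STRING_UNK
    return value

def default_number(value):
    return MAPP_DEFAULT_INTEGER if value is None else value

def default_date(value):
    return MAPP_DEFAULT_HIGH_DATE if value is None else value
```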

4.3.5 Audit Check Process

1) After the load process is completed at the Landing Zone tables, a BTEQ script is run to update the load statistics of the Landing Zone tables into the Audit table AUDT_STTSTC.

The row count is updated for the load log key having the latest 'Load start date time', based on the 'source code', 'subject area', 'workflow name' and 'publish indicator = N'. This is loaded into a GTT (Global Temporary Table). The data is then loaded into AUDT_STTSTC from the GTT table.

When zero rows are processed from the source, i.e., when the source has no rows matching the fetch criteria, the source record count will not be inserted into the audit table. The target record count (LZ target table count) will be inserted with a row count of zero.

2) The record variance of the source and target columns is aggregated and updated into the GTT table, then loaded back into the AUDT_BLNC_STTSTC table.

3) This BTEQ script checks whether the differences of counts and sums are within the allowed variance for all the tables in a particular subject area. If the variance exceeds the accepted value in the rule table, the subsequent process will not be executed.


4) If the above process succeeds, then a BTEQ script is executed to update the load end time in the load log table.

The script updates the load end time based on the 'source code', 'subject area', 'workflow name' and publish_ind = 'N'.

5) The final BTEQ updates the publish indicator to Y in the load log table.

In case of any count variances in the audit, audit log entries need to be deleted manually by production support personnel, and jobs need to be restarted to fix the problem.
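A minimal sketch of the balancing decision in steps 1-3 above, assuming simple per-table counts; the function and field names are illustrative and do not reflect the actual AUDT_STTSTC/AUDT_BLNC_STTSTC schema:

```python
# Illustrative sketch of the audit balancing check: compare source and LZ
# target counts, and block the downstream process when any table's variance
# exceeds the allowed threshold from the rule table.

def audit_check(source_count, target_count, allowed_variance):
    """Return True when the variance is within the allowed limit."""
    return abs(source_count - target_count) <= allowed_variance

def publish_decision(stats, rules):
    # stats: {table_name: (source_count, target_count)}
    # rules: {table_name: allowed_variance}
    failures = [t for t, (src, tgt) in stats.items()
                if not audit_check(src, tgt, rules.get(t, 0))]
    # Any failure stops the subsequent process for the whole subject area
    return len(failures) == 0, failures
```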

The following link gives the DDL for the Audit Balancing table structures:

http://sharepoint.auth.wellpoint.com/sites/esppm/edlr2/Executing%20Phase%20Documents/DDL/dev/IQ_cut1_00.ddl

A pictorial representation of the above steps:

5. Components/Objects/Modules

This section provides detailed information for each component and object of the Clinical subject area for the WMDS/ECC source systems.


i. Major Component Inventory

SL No.  Component               Layer
1       Trigger file            UNIX scripting
2       Load log keys           Teradata
3       Source to Landing Zone  Informatica
4       Audit Balancing         Teradata/Informatica

ii. Major Component Details

This section provides detailed information on the sources and the ETL process used to load them into the Landing Zone.

Source Data Extraction Process

An Informatica process is created to check for server availability. The Informatica process loads data from the TRIMED_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task is executed to make the process wait for 1,200 seconds. After the 1,200-second wait, the Informatica session runs again to look for an entry in the TRIMED_GLOBAL_NAME table. This process continues for up to 7,200 seconds before the entire workflow fails.
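The wait-and-retry behaviour described above can be sketched as follows; `probe`, `wait_for_server`, and the injected `sleep` are illustrative stand-ins for the Informatica session and command task, not actual product APIs:

```python
# Illustrative sketch of the availability check: retry the probe every
# 1,200 seconds until 7,200 seconds have elapsed, then give up.
import time

WAIT_SECONDS = 1200
TIMEOUT_SECONDS = 7200

def wait_for_server(probe, sleep=time.sleep):
    """Retry `probe` every WAIT_SECONDS until TIMEOUT_SECONDS is reached."""
    elapsed = 0
    while elapsed < TIMEOUT_SECONDS:
        if probe():                  # e.g. the session reading TRIMED_GLOBAL_NAME succeeds
            return True              # server is up; continue the workflow
        sleep(WAIT_SECONDS)          # command task: wait 1,200 seconds
        elapsed += WAIT_SECONDS
    return False                     # fail the entire workflow after 7,200 seconds
```

Injecting `sleep` keeps the sketch testable without real waits.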

Source:

WMDS Oracle Database

ECC Oracle Database

TRIMED DB2 Database

CarePlanner Flat File

Source Table Details

Serial No.  Source  FR/INC  Source Table Name  Landing Zone Target Table Name

1.  WMDS    FR   CDA_CME_DIAGNOSIS        LZ_WMDS_CDA_CME_DIAGNOSIS
2.  WMDS    FR   CMA_CME_MNGR_ASGNMNT     LZ_WMDS_CMA_CME_MNGR_ASGNMNT
3.  WMDS    FR   CME_CM_EPISODE           LZ_WMDS_CME_CM_EPISODE
4.  WMDS    FR   CMG_CM_GOAL              LZ_WMDS_CMG_CM_GOAL
5.  WMDS    FR   CMI_CM_ISSUE             LZ_WMDS_CMI_CM_ISSUE
6.  WMDS    FR   CMV_CM_INTERVENTION      LZ_WMDS_CMV_CM_INTERVENTION
7.  WMDS    INC  COI_CON_ISSUE            LZ_WMDS_COI_CON_ISSUE
8.  WMDS    FR   DXC_DIAGNOSIS_CODE       LZ_WMDS_DXC_DIAGNOSIS_CODE
9.  WMDS    INC  MEM_MEMBER               LZ_WMDS_MEM_MEMBER
10. WMDS    INC  MNO_MEM_NOTE             LZ_WMDS_MNO_MEM_NOTE
11. WMDS    FR   ORG_ORGANIZATION         LZ_WMDS_ORG_ORGANIZATION
12. WMDS    FR   SRP_SBR_RESPONSE         LZ_WMDS_SRP_SBR_RESPONSE
13. WMDS    FR   STF_STAFF                LZ_WMDS_STF_STAFF
14. WMDS    INC  TAU_TREATMENT_AUTHRZN    LZ_WMDS_TAU_TREATMENT_AUTHRZN
15. WMDS    FR   MCD_MEM_CLINICAL_DATA    LZ_WMDS_MCD_MEM_CLINICAL_DATA
16. WMDS    INC  MPG_MEM_PLAN_GROUP       LZ_WMDS_MPG_MEM_PLAN_GROUP
17. WMDS    INC  PVD_PROVIDER             LZ_WMDS_PVD_PROVIDER
18. ECC     FR   CME_CM_EPISODE           LZ_ECC_CME_CM_EPISODE
19. ECC     FR   SRP_SBR_RESPONSE         LZ_ECC_SRP_SBR_RESPONSE
20. ECC     INC  MNO_MEM_NOTE             LZ_ECC_MNO_MEM_NOTE
21. ECC     FR   SBR_SURVEY_BUILDER       LZ_ECC_SBR_SURVEY_BUILDER
22. ECC     FR   CDA_CME_DIAGNOSIS        LZ_ECC_CDA_CME_DIAGNOSIS
23. ECC     FR   CMG_CM_GOAL              LZ_ECC_CMG_CM_GOAL
24. ECC     FR   CMI_CM_ISSUE             LZ_ECC_CMI_CM_ISSUE
25. ECC     FR   CMV_CASE_MGMT_INTRVNTN   LZ_ECC_CMV_CASE_MGMT_INTRVNTN
26. ECC     FR   ORG_ORGANIZATION         LZ_ECC_ORG_ORGANIZATION
27. ECC     INC  COI_CON_ISSUE            LZ_ECC_COI_CON_ISSUE
28. ECC     INC  MEM_MEMBER               LZ_ECC_MEM_MEMBER
29. ECC     FR   CMA_CME_MANAGER_ASGNMNT  LZ_ECC_CMA_CME_MANAGER_ASGNMNT
30. ECC     FR   STF_STAFF                LZ_ECC_STF_STAFF
31. ECC     INC  MPG_MEM_PLAN_GROUP       LZ_ECC_MPG_MEM_PLAN_GROUP
32. ECC     INC  TAU_TREATMENT_AUTHRZN    LZ_ECC_TAU_TREATMENT_AUTHRZN
33. TRIMED  FR   OPRTPRSNCVRD             LZ_TRIMED_OPRTPRSNCVRD
34. TRIMED  FR   TMDTPATIENT              LZ_TRIMED_USR_USER
35. TRIMED  FR   TMDTRPPTCMGT             LZ_TRIMED_TMDTRPPTCMGT
36. CP      FR   CARESECURE_CASE_EVNT     LZ_CP_CARESECURE_CASE_EVNT

NOTE: TMDTRPTKLRLG is used only for the history load. For incremental loads, we will use TMDTTMTICKLR, which contains one month of information.


ETL Process Flow

The existing data in the Landing Zone tables is deleted, and the data from the source tables is read by Informatica PowerCenter and loaded into the corresponding Landing Zone tables. Teradata external loader (MLoad) connections will be used to load the data into the LZ tables.

If there are null values coming from the source, they will be populated with the corresponding default values as per the EDW standards.

The following link will give the detailed ETL Specification for all the above LZ tables.

http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%5fSpecification%5fdoc&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

6. Data Stores

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

Information Guide on Data Stores

Data Stores: CIS history extract and incremental/full refresh data are loaded into Landing Zone tables.

In this section all the data sources and its sizes will be documented.

Description of Section

Identify the new, changed or deleted data schema(s) affected, highlighting changes from/additions to existing schema(s).

Describe new tables, file, structures, data elements, and changes to the existing data dictionary objects related to the changes.

Include data model diagrams as appropriate.

Note: Please include as much information as possible.

6.1 Data Store Inventory

The grid below shows the CM inventory list, which contains the details of the history extract. It also gives the details for the LZ & CSA tables. The name of the document is 'Inventory List.xls'.

The spreadsheet below gives statistics for the WMDS/ECC/Trimed/Careplanner Landing Zone tables.


6.2 Data Store Data Elements

Create one table for each new or existing table/structure.

Data Element Table Definitions

Schema/Database/File Name: Name of the new or existing schema, database, or file
Table/Segment Name: Name of the new or existing table or segment
Copybook/Object Name: Name of the new or existing copybook or object
Data Element: Data element name
Description: Description of the field
Data Type: The data type for the element (alpha, numeric, String, Date, Integer, etc.)
Length: The length of the data element; indicate units (characters, bytes, KB, etc.)
Values: If applicable, list the values for the data element, or reference a table or document with this information.
Key/Search/Index Field: Is this a key, search, or index field?
Mod Type: Modification Type (A, C, or D) for addition, change, or deletion of the particular data element.

The following SharePoint link gives information about the Landing Zone tables DDL.

Landing Zone tables DDL:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fLanding%20Zone%20Models&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

6.3 Data Store Descriptions

Information Guide on Data Store Descriptions

Data Store Descriptions: Description of each Physical Model attribute.

Description of Section

Provide narrative descriptions for field changes, if they will assist the developer when coding. Refer to Data Elements/Data Stores based on Schema Name


Note: Please include as much information as possible.

The following SharePoint link gives information about the Clinical Programs Physical Data Model and Metadata for UM and CM. The document describes each Physical Model attribute:

http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fPhysical%20Model%20Documents&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

7. Implementation Activities

This section holds detailed information about batch processing and the recovery process.

General Information

Purpose of This Section

The CA UNICENTER scheduling tool, also referred to as the WORKLOAD MANAGER (WLM), is the preferred WellPoint, Inc. and Central Region Data Warehouse (CRDW) and Service Delivery (SD) scheduling tool.

The WLM scheduling tool is designed to process job execution across multiple technology platforms.

This section discusses in detail the WLM Jobsets & Jobs that will be set up for the WMDS/ECC/Trimed/Careplanner Source to Landing Zone load.

Instructions for this Section:

Repeat the following section for each layer in the application (Presentation, Business, Messaging, Data, Applications, and Infrastructure). Be sure to articulate all environments such as Development, Test, QA and Production or any other environment types.

NOTE

If you choose not to use the table below, you can provide your own format. Be sure to provide any Implementation Activities not contained explicitly in this section. Repeat the following table for each step in the program unit, package or component.

7.1 Packaging/Release Activity

Scheduling of the workflows can be done using the WLM (Workload Manager) tool. The WLM tool is designed to:

1. Manage end-to-end processing flows.

2. Manage execution of parallel job processing.


3. Manage overall process sequence (i.e., jobs or processes that are dependent upon the completion of a preceding job or process).

4. Manage job failure notification

For the WMDS/ECC/CP/TriMed Source to Landing Zone load, the process will be split into logical units of work called Jobsets in WLM. A Jobset is a logical unit of work in WLM, and each Jobset contains jobs through which dependencies are set. Each JOB is a unit of work that executes Informatica/Teradata processes to load the target tables.
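As an illustration of the dependency-ordered execution a Jobset provides (the job names and the tiny scheduler below are hypothetical, not WLM internals):

```python
# Illustrative sketch: run a Jobset's jobs so that every job executes only
# after the jobs it depends on have completed.

def run_jobset(jobs, dependencies, execute):
    """jobs: list of job names; dependencies: {job: [prerequisite jobs]}.
    Runs jobs in dependency order and returns the execution sequence."""
    done, order = set(), []
    pending = list(jobs)
    while pending:
        progressed = False
        for job in list(pending):
            if all(d in done for d in dependencies.get(job, [])):
                execute(job)
                done.add(job)
                order.append(job)
                pending.remove(job)
                progressed = True
        if not progressed:
            raise RuntimeError('circular dependency in jobset')
    return order
```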

Below is the list of Informatica return codes captured by the WLM team; these are the outcomes of the Informatica pmcmd command.

PMCMD RETURN CODES

Code Description

0 For all commands, a return value of zero indicates that the command ran successfully. You can issue the following commands in the wait or nowait mode: starttask, startworkflow, aborttask, and abortworkflow. If you issue a command in the wait mode, a return value of zero indicates the command ran successfully. If you issue a command in the nowait mode, a return value of zero indicates that the request was successfully transmitted to the Integration Service, and it acknowledged the request.

1 Integration Service is not available, or pmcmd cannot connect to the Integration Service. There is a problem with the TCP/IP host name or port number or with the network.

2 Task name, workflow name, or folder name does not exist.

3 An error occurred starting or running the workflow or task.

4 Usage error. You passed the wrong options to pmcmd.

5 An internal pmcmd error occurred. Contact Informatica Technical Support.

7 You used an invalid user name or password.

8 You do not have the appropriate permissions or privileges to perform this task.

9 Connection to the Integration Service timed out while sending the request.

12 Integration Service cannot start recovery because the session or workflow is scheduled, waiting for an event, waiting, initializing, aborting, stopping, disabled, or running.

13 User name environment variable is set to an empty value.

14 Password environment variable is set to an empty value.

15 User name environment variable is missing.

16 Password environment variable is missing.

17 Parameter file does not exist.

18 Integration Service found the parameter file, but it did not have the initial values for the session parameters, such as $input or $output.

19 Integration Service cannot resume the session because the workflow is configured to run continuously.

20 A repository error has occurred. Make sure that the Repository Service and the database are running and the number of connections to the database is not exceeded.

21 Integration Service is shutting down and it is not accepting new requests.

22 Integration Service cannot find a unique instance of the workflow/session you specified. Enter the command again with the folder name and workflow name.

23 There is no data available for the request.

24 Out of memory.

25 Command is cancelled.
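A wrapper script acting on these return codes might classify them roughly as follows; the retry/fix groupings are an assumed operational policy for illustration, not part of pmcmd itself:

```python
# Illustrative sketch: map a subset of the pmcmd return codes listed above
# to coarse operational actions. The groupings are an assumed policy.

SUCCESS = {0}
RETRYABLE = {1, 9, 20, 21, 24}                    # connectivity/resource issues
CONFIG_ERRORS = {2, 4, 7, 8, 13, 14, 15, 16, 17, 18}  # bad names, credentials, files

def classify_pmcmd_rc(rc):
    if rc in SUCCESS:
        return 'success'
    if rc in RETRYABLE:
        return 'retry'
    if rc in CONFIG_ERRORS:
        return 'fix-configuration'
    return 'fail'
```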

8. Technical Assumptions

Each assumption should be verified for accuracy. It is the responsibility of the document author(s) to obtain this verification from the appropriate source.

Assumption Table Definitions

Assumption #: Reference number for each assumption.
Identified By: The team member who identified the assumption.
Identified Date: The date each assumption was identified.
Verified By: The resource (team member or other) who verified that the assumption is accurate.
Verified Date: The date each assumption was verified.
Assumption: Detailed description of the assumption being made in this document. If possible, indicate which functional requirements this assumption affects.
Comments: Include any comments about the assumption.


Assumption #  Identified By  Identified Date  Verified By  Verified Date  Assumption  Comments

1. CIS Team: Initially the Landing Zone tables will hold a week's data; later this will be converted to daily.
2. CIS Team: Preprocess Pharmacy LZ tables are used for getting the member key field.
3. CIS Team: All attributes will be defaulted for incoming null/blank values.
4. CIS Team:
5. Claims Team:

9. Reference Documents

All related documents can be found in this section. Below are the SharePoint links to the related documents.

Mapping Docs:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fMapping%20Templates&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

Approach Doc:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%20Approach&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

Business System Design Doc:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fBusiness%20System%20Design%20Document%20%28BSD%29&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d


10. Project Team Signoffs / Approvals

SOLUTION DELIVERY APPROVALS

IT Technical Lead: IT Functional Area Lead
Signature: Date:

IT Project Mgr: IT Functional Area PM
Signature: Date:

Solution Architect: IT Functional Area Solution Architect
Signature: Date:
