SAS as Glue - An Integration Example
Presenter: Lionel Teed
As Senior Manager of Reporting and Analytics for TD Bank Financial Group's Financial Planning business, Lionel is responsible for all of the group's analytics and reporting, is the compensation program and reporting metrics expert and, annually, is (unwillingly) pulled into (and responsible for) the group's planning process. His 31 years' experience with the bank includes stints in the Retail Credit Risk department, Real Estate Secured Lending (mortgages, as non-bankers call them!) and even a good amount of time in the Finance department. He will admit that even though he worked in accounting, many believe he is not balanced! Lionel first started working with SAS in June of 1993, having been part of the purchased assets and assumed liabilities from Central Guaranty Trust. He believes he was one of the purchased assets, though others at the bank say he was THE assumed liability. He fondly remembers opening his first SAS 5.0 manual and wondering…what the heck is this all about? Almost 20 years later he is an avid fan of the tremendous power of SAS in delivering information solutions on a myriad of platforms.
SAS as Glue An Integration Example
Lionel Teed Senior Manager
Business Analysis and Reporting
June 8, 2012
Business Model
Value Proposition
We complete a holistic, comprehensive financial plan that articulates goals and objectives for every client.
We conduct an annual Client Relationship Review.
We proactively contact our clients a minimum of 3 times per year.
We maintain an asset allocation through annual rebalancing.
We take a long-term approach to investing; therefore, we do not engage in active trading or market timing.
Client On-Boarding
1. Steps to Financial Planning
Understanding
Confirmation
Plan Development
Presentation
Implement Solutions
Ongoing Management and Review
2. Wealth Considerations
Investment Strategies
Banking and Credit Management
Retirement Planning
Protecting Your Assets
Protecting Your Income
Tax Management
Passing on Your Wealth
Education Funding
Business Succession Planning
Planning for Major Purchases
Charitable Giving
3. Products and Solutions
Financial Plan
Mutual Funds
Managed Solutions
Fee-Based
Investment Savings Account
GICs
Money Market
Protected Notes
Cash
In the Beginning…
Financial Planning Comes About…
Prior to the merger of TD and Canada Trust, financial planning was handled in a separate manner by each predecessor company.
The combined business had no infrastructure to rely on and limited technology resources available.
The business wanted to create a 'Best in Class' experience for clients, employees and shareholders.
Identifying Information Sources
TDCT would use an existing TD Book of Record for recording the account activities for clients. This information was host-based.
The Financial Planning value proposition would be presented through a 'Planning' software component maintained in an Oracle environment.
As compensation was modeled on the security holdings of the portfolio, it was necessary to retrieve security metadata. This was EBCDIC-encoded, but delivered to the PC environment.
Identifying Information Sources…2
Some client compliance information would be maintained in a home-grown Access database.
Compliance overview would occur through a daily monitoring process, identifying select securities as 'no-no's. This list was maintained on an AS/400.
Identifying Information Sources…3
A form of Customer Contact system would be created in the Lotus Notes environment.
The business organization rules would be maintained in the same Lotus Notes environment.
As Financial Planning and the retail bank (TDCT) worked closely together, it was necessary for FP to report at the TDCT organization level as well. This information was host-based.
Information Frequency
The fact is that the actual 'balances' of an account change with the market.
As the market changes every day…we needed to know the balances every day.
Further, compliance requirements were such that the sooner a 'violation' is noted, the sooner action is required.
Therefore, transaction information was required…every day.
Information Frequency…2
FP movement (arrivals, departures, etc.) also created unique situations which required prompt action. Again…ending with a daily information requirement.
New security types (Mutual Funds, etc.) being added on a regular basis required monitoring.
The Challenge!
The end result: we needed to create a process to retrieve, consolidate, compare and report from all of these information sources on a daily basis.
The trick was that the information was in many different environments, in many different formats.
The process could get fairly tedious, and subject to errors, unless it could be 'wrapped up' into one clean 'container'.
The Quandary!!!
HOST
FLAT FILES
EXCEL
ACCESS
ORACLE
LOTUS NOTES
Enter Base SAS as the solution!
SAS was available on the Host and the PC.
SAS/ACCESS for PC Files and ODBC were available.
Base SAS FTP was available.
Easy Access!!!
Using SAS/ACCESS for PC Files, importing Access data was as easy as…

PROC IMPORT OUT=NagGoals DATATABLE="tblFPNagGoals"
     DBMS=ACCESS2000 REPLACE;
     DATABASE="&FPDBMainTables." ;
RUN;
Notes on ODBC connections!
Reading Notes data was facilitated through the NotesSQL ODBC driver, downloaded from the IBM Lotus Notes site.
The interactive password requirement was a 'bugaboo': it prevented the complete 'automation' of the process.
Upon installation, the SAS syntax was…
Connecting up…..
Proc Sql ;
  Connect To Odbc As FPAdminx
    (NoPrompt="Driver={Lotus NotesSQL Driver (*.nsf)};Database=&NotesServernsfLocation;Server=&ServerName;") ;
  Create Table FPFullAdminX As
    Select * From Connection To FPAdminx
      (Select * From Staff) ;
Quit ;
Note that some work was required with the way in which Notes data came across: mostly text, and 'odd' characters would creep in, so some 'cleanup' was required.
Reading EBCDIC on the desktop
One of our main information sources was presented in the desktop environment but…..
…in an EBCDIC format!
SAS Informats to the rescue!
Filename PCFile 'MyDocuments\EBCDICFile' ;
Infile PCFile ;
Input RRCode     $Ebcdic4.
      ClientId   $Ebcdic10.
      KYCUpdated S370FF4. ;
The Magic of FTP on the Desktop
FTP as an option on the Filename statement was a key facility.
As expected, it allows reading or writing of host data directly from a PC SAS session.
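As a sketch of the read direction (the fileref, host dataset name, and record layout here are hypothetical, mirroring the PUT example on the next slide), a host file could be pulled into PC SAS like this:

```sas
/* Hypothetical sketch: read a mainframe dataset into a PC SAS session  */
/* via FILENAME FTP. Dataset name and record layout are placeholders.   */
Filename HostIn FTP "'PROD.FP.DAILY.KYC'"
  User="&TSOUser" Pass="&TSOPass" Host="&TDMFURL"
  recfm=Fb lrecl=36 ;

Data DailyKYC ;
  Infile HostIn ;
  Input RunDate  PDJulG4.    /* packed-decimal Julian date */
        RRCode   $Ebcdic4.
        ApprovID $Ebcdic8. ;
Run ;
```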
PUTing a file onto the Host
Filename ByPassL FTP "'&GDGHostName(+1)'"
  User="&TSOUser" Pass="&TSOPass" Host="&TDMFURL" recfm=Fb lrecl=36
  rcmd='site lrecl=36 DISP=Old recfm=FB BLKSIZE=32000 TR PRI=2000 SEC=1000 UNIT=TSSU' ;

Data _Null_ ;
  Set tblByPassKYC ;
  File ByPassL ;
  Put RunDate  PDJulG4.
      RRCode   $Ebcdic4.
      ApprovID $Ebcdic8. ;
Run ;
Even more magic of FTP on the Desktop
Even more powerful for us was the use of….
…the INTRDR option!
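The slides that follow drive the internal reader through a scripted command-line ftp client. As an alternative sketch (the fileref and remote name are hypothetical), FILENAME FTP can reach the JES internal reader directly by issuing 'site filetype=jes':

```sas
/* Hypothetical sketch: submit a JCL deck straight to the JES internal   */
/* reader. rcmd='site filetype=jes' asks the host FTP server to route    */
/* the transferred file to JES rather than storing it as a dataset.      */
Filename JesRdr FTP 'TRIGGERD.JCL'
  User="&TSOUser" Pass="&TSOPass" Host="&TDMFURL"
  rcmd='site filetype=jes' ;

Data _Null_ ;
  Infile 'G:\Jcl\Daily\TriggerD.jcl' ;
  Input ;
  File JesRdr ;
  Put _infile_ ;   /* each JCL line is written to the internal reader */
Run ;
```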
Triggering a Host job from PC Sas
Filename TriggerD FTP "'&HostJobToExecute.'" User="&TSOUser" Pass="&TSOPass" Host="&TDMFURL" recfm=fb lrecl=80 ;
Filename CpyJob00 "&JobDataLoc.\Daily\TriggerD.jcl" LRecl=80 ;
Filename TriggFTP "&JobDataLoc.\Daily\TriggerD.ftp" LRecl=80 ;
Building the trigger files
Data _Null_ ;
  File TriggFTP ;
  Put "open &TDMFURL" /
      "&TSOUser" / "&TSOPass" /
      'literal site filetype=jes' /
      'put G:\Jcl\Daily\TriggerD.jcl' /
      'bye' ;
Run ;
..building the trigger files…
Data _Null_ ;
  File CpyJob00 ;
  Infile TriggerD ;
  Input LineDesc $EBCDIC80. ;
  Put LineDesc ;
Run ;
…and then triggering…the trigger files
Data _Null_ ;
  If Date() ne . Then Do ;
    X 'G:\JCL\Daily\TriggerD' ;
  End ;
Run ;
AS/400….I don’t need no Stinking AS/400!
On occasion, key data would be present in the AS/400 environment.
In one case the information was available as a web page on the AS/400.
The magic of Filename again came to the rescue with…URL!
URule Filename statement!
Filename AISec URL 'http://IStink400/sequel?Obj=Gobbledigook&Lib=netdatau' lrecl=1024 debug ;
Data _Null_ ;
  Infile AISec ;
  Input ;
  If Input(_infile_, $8.) eq '<TR clas' ;
  File 'C:\Temp\FlatFile.Txt' LRecl=1024 ;
  Put _infile_ ;
Run ;
UStillRule Filename Statement
Data AITemp (Keep=SecCode WordData I) ;
  Infile 'C:\Temp\FlatFile.Txt' LRecl=1024 ;
  Input Blob1 $10. @035 SecCode $6. @ ;
  If Blob1 eq '<TR class=' Then Do ;
    Do I=1 To 39 ;
      WordData=Scan(_infile_, I, '<>') ;
      If I In (6,21,24,27,30,33,36) Then Output ;
    End ;
  End ;
Run ;
Giving to others
Once the main data extraction and transformation process was created other business areas came looking for information.
Some of these information needs were permanent; others were temporary.
Small snippets of code would be inserted into the main script to copy/transfer/transform information for these other business areas.
In some cases, the information was for external vendors and required SFTP processing.
SFTP example
In this SFTP case, the end user was looking for a Version 8 SAS dataset.
First thing required was to convert to V8.
Libname Ver8 v8 '\\UNCNamedDrive\TDWFPDB\Analytics\' ;

Data Ver8.BaseOne (Compress=Yes) ;
  Set BaseOne ;
Run ;

Filename cportout '\\UNCNamedDrive\TDWFPDB\Analytics\CportFile' ;
Proc Cport data=Ver8.BaseOne file=cportout ;
Run ;
SFTP….building the FTP command
The next step was building the SFTP command file…
Data _Null_ ;
  File LookUFTP ;
  Put "&InsUser" / "&InsPass" /
      "lcd \\UNCNamedDrive\TDWFPDB\Analytics" /
      "put \\UNCNamedDrive\TDWFPDB\Analytics\TDWDBFPList.DAT TDWDBFPList.DAT" /
      "binary" /
      "put baseone.sas7bdat baseone.sas7bdat" /
      "get tdw_result.sas7bdat tdw_results.sas7bdat" /
      "get quotes_notfound.sas7bdat quotes_notfound.sas7bdat" /
      "Bye" / "Exit" ;
Run ;
SFTP….the simple execution
And the last step was the execution!
Data _Null_ ;
  File LookUBat ;
  Put 'sftp -v -i -s:G:\Directory\JCL\FTPInsurance.ftp 12.34.567.89' ;
  Put 'exit' ;
Run ;

Data _Null_ ;
  If Date() ne . Then Do ;
    X 'G:\JCL\FTPInsurance' ;
  End ;
Run ;
‘Gluing’ it all together.
One single SAS Program was created.
As each data ‘piece’ was dealt with, an FTP trigger would be created that would execute a Host job.
After executing the host job, the process would continue on retrieving data for the next few steps.
In some cases one system was converted to another (Notes to Oracle). Our process was ‘modular’, and we only needed to update the one piece to continue with our data processing stream.
…Gluing it all together!
In other cases, new business needs would be dealt with by inserting the appropriate code at the required location.
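The single 'glue' program described above can be sketched as a chain of %INCLUDEs, one per information source. The module paths and names below are hypothetical; they only illustrate the modular structure that let one piece (e.g. Notes to Oracle) be swapped without touching the rest:

```sas
/* Hypothetical driver sketch: one SAS program glues the modular pieces.  */
/* Each %INCLUDE pulls in the extract/transform code for one source.      */
%Include 'G:\SAS\Daily\GetHostBookOfRecord.sas' ;  /* Host via FILENAME FTP   */
%Include 'G:\SAS\Daily\GetPlanningOracle.sas'   ;  /* Oracle planning data    */
%Include 'G:\SAS\Daily\GetNotesContacts.sas'    ;  /* Lotus Notes via ODBC    */
%Include 'G:\SAS\Daily\GetComplianceAccess.sas' ;  /* Access via PROC IMPORT  */
%Include 'G:\SAS\Daily\GetAS400NoNoList.sas'    ;  /* AS/400 via FILENAME URL */
%Include 'G:\SAS\Daily\ConsolidateAndReport.sas';  /* compare + daily reports */
```

Converting a source then means rewriting exactly one included module; the driver and the downstream consolidation stream stay unchanged.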