TEAM LINKAGE
IS480 FINAL PRESENTATION
Supervisor: Dr Chris Boesch
Clients: Bhagya Perera, Meryl Kwek, Doris Soh
PricewaterhouseCoopers IT Risk Consulting, Advisory
Team: Low Ying Lin | Liang Yahui | Liu Huan | Li Lu | Jerome Ching | Zhong Zhenyu
Content
Project Introduction and Overview
Demo
Testing
Technical Complexity
Project Management
X Factor and Learning Outcomes
Stakeholders
Mr Bhagya Perera, Ms Meryl Kwek, Ms Doris Soh
PricewaterhouseCoopers Advisory, IT Risk Department
• World's largest professional services firm
• Conducts security reviews
• Recommends threat and vulnerability solutions for their clients
Prof Chris Boesch, Team Supervisor, Associate Professor of Information Systems
• Guided the team along the FYP journey
Team: Ying Lin, Li Lu, Liu Huan, Jerome, Zhenyu, Yahui
• Implementation of solutions for our sponsor
PwC LinkAge Application
Deployed and "live" in the client's environment (4 April 2013)
Our Users
PWC IT Risk & Advisory Department Staff
3 Types of Users:
Project Manager(s)
Project Member(s)
System Administrator
ADMIN
• Receives notifications
• Accepts or rejects a user's role (Manager) request
• Resets a member's password (if the member has forgotten it)
• Deletes a member's account (if the member has left the department)
MANAGER
• Receives a project
• Selects a suitable project team
• Breaks the project down into specific tasks
• Assigns each specific task to project member(s)
• Compiles the outcomes of completed tasks
Members complete their task(s), the findings are compiled into a completed report, and the report is presented to the client.
Hmmm… Who should be assigned to this new project?
Who is doing which project, and at what stage?
What will my team members' schedules be next week?
Problem (Managers)
• Manual assignment of tasks does not cater for effective resource management
• Difficult to ascertain a member's suitability to perform a required task
• Difficulty in tracking the progress and deadlines of every assigned task
• May create a situation of imbalanced workload among team members
MEMBERS
• Receive task(s) from the Manager
• Perform the task and issue findings by the deadline
• Submit findings to the Project Manager
It is so troublesome to type the same issues every time.
It is a headache to format the reports.
Why are there no user-friendly applications to help us manage our process?
Problem (Members)
• High error rate due to multiple manual processes
• Difficult to track their own schedules, since they are assigned multiple tasks with varying due dates
• Difficulty in managing the different versions of documents sent to the manager for review
• May be assigned unsuitable tasks, leading to unhappiness and frustration
Introducing the PWC-LinkAge App
A customized project management web application
used to optimize the business process
Designed To:
Streamline Project Scheduling
Ensure optimal and suitable resource allocation
Track project’s progress
Analyze project data
Provide industry trends
Framework
Model–View–Controller
View: Project.html ↔ Controller: ProjectController.js ↔ Model: Project JSON object
Technologies
Client ↔ Internet ↔ Server
• PwC server: application + SQL database
• GAE server: application + GAE datastore (JSON objects)
RESTful Architecture
Breaking Down the PWC-LinkAge App
Manage Project
• Centralized location for creating projects and assigning tasks to project members
• Dynamically generates a visual schedule for each member and project to track progress
Manage Report
• Dynamically generates an MS Word formatted report for each project, based on members' findings, for submission to the client
Analytics
• Analyzes past project information to provide insights on industry trends (e.g. top issues per industry)
Other Features
• Messaging function (allows users to communicate with each other live)
• Viewing members' statistics
• Responsive pages
Demo
http://pwclinkage.appspot.com
• Appspot version used for demonstration; the actual live application can only be accessed from PwC's environment
• Report generation is unavailable on Appspot
• Requires a browser that supports HTML5 (Chrome, Firefox)
Value to Sponsor
• Optimal resource allocation
• Accurate tracking of progress
• Provides timely information on industry trends
• Cuts down the man-hours taken to complete a project
• Significant cost savings
• Increased PM efficiency
USER TESTING
Testing Objectives
Ensure that functionalities satisfy client’s
specifications and requirements
Align team and sponsor’s vision of user-friendliness
Gather feedback for evaluation purposes and
further improve the application
Internal Testing
Application is tested at the end of each iteration
following a detailed test plan
Ensure that newly completed functionalities are
working as planned
Ensure that earlier functionalities still work
Sample Test Plan
User Testing
UAT 1 (15 Feb 2013) at PwC
Tested functionalities up till Iteration 12
UAT 2 (4 Mar 2013) at PwC
Tested all functionalities
UAT 3 (15 Mar 2013) at PwC
Handover to PwC
Tester Profiles
Actual End Users
8 testers (4 male, 4 female)
IT proficient
Used their own PCs for the test
User Testing
Each UAT consists of:
Functional Testing
• Testers are given a hard-copy test script
• They follow the test cases and indicate Pass/Fail
Usability Testing
• Testers navigate freely around the pages
• They fill in the online survey form
Functionality Test Results
[Chart: percentage of test cases passed per UAT]
UAT 1: 98.4% | UAT 2: 91.7% | UAT 3: 100%
Functionality Tests
UAT 1 – Iterations 1 to 12
Focused on less complex functionalities to build up
confidence
Tested on appspot version
UAT 2 – Iterations 12 to 19
Implemented all functionalities
Tested on deployed version on client’s environment
UAT 3 – Iteration 21 to End of FYP
Tested on deployed version on client’s environment
Usability Tests Survey Areas
Consistency throughout the pages
Error Prevention Features
Aesthetics
Efficiency of Use
Match between system and business process
Focus on recognition instead of recall
Help features
User Control and Freedom
Feedback Collection
Open Ended User Feedback
[Chart: average survey scores (scale 0-5) for Consistency, Error Prevention, Aesthetics, Efficiency of Use, Match between system and BP, Recognition, Help Features, and User Control, compared across UAT 1 and UAT 2]
Improvements (Task Page)
Design Stage
Earlier Iterations Currently
Improvements (Schedule)
Design Stage
Currently
TECHNICAL COMPLEXITY
Technical Complexity #1
RESTful Data Service
• Preserves front-end progress: the front end keeps working whether the back end is the PwC SQL database or the Python GAE datastore
• Fulfills the needs of a changing back-end database: instead of changing the whole app, only the queries change
• Requests are sent over as JSON objects and parsed into JSON objects on the server
• Differentiating requests:
  - Extract what is required from the URI mapping
  - Process each request differently by its method (PUT, GET, POST, DELETE)
• Different implementation: SQL cannot store arrays directly, so data containing arrays is processed differently, but the service is still RESTful
• This works!
  - xxx/apikey/model/id (GET): returns a JSON object for the specific item
  - xxx/apikey/model/id (POST + a JSON object): updates the old item with the new one
  - etc.
• Functions are the same as with the old database
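The URI scheme and method dispatch above can be sketched as a minimal handler. This is an illustrative reconstruction, not the team's actual GAE code: the in-memory STORE, the handle() function, and the sample data are all assumptions.

```python
import json

# Stand-in data store; keys are (model, id) pairs. Illustrative only.
STORE = {("project", "1"): {"id": "1", "name": "Security Review"}}

def handle(method, path, body=None):
    """Dispatch an xxx/apikey/model/id request by its HTTP method."""
    _apikey, model, item_id = path.strip("/").split("/")
    key = (model, item_id)
    if method == "GET":      # return a JSON object for the specific item
        return json.dumps(STORE.get(key))
    if method == "POST":     # update the old item with the new one
        STORE[key] = json.loads(body)
        return json.dumps(STORE[key])
    if method == "DELETE":   # remove the item
        STORE.pop(key, None)
        return json.dumps({"deleted": item_id})
    raise ValueError("unsupported method: " + method)
```

Because the front end only ever sees these URIs and JSON payloads, swapping the storage behind handle() (SQL vs. GAE datastore) does not touch the front-end code.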
Technical Complexity #2
Report Generation
• Reports are edited through the LinkAge application
• The "Generate Report" function downloads a well-formatted report as a Word document
• The header that makes it work: complex Microsoft Office headers
• Some formats are more complex than others…
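One common way to implement the "Microsoft headers" trick described above is to emit HTML carrying Word's XML namespaces and serve it with an application/msword content type, so the downloaded file opens in Word as a formatted report. The sketch below assumes that approach; generate_report, WORD_HTML_HEADER, and HTTP_HEADERS are hypothetical names, not the team's actual code.

```python
# Word opens HTML that declares the Office/Word XML namespaces as a
# formatted document; these are the "complex Microsoft headers".
WORD_HTML_HEADER = (
    '<html xmlns:o="urn:schemas-microsoft-com:office:office" '
    'xmlns:w="urn:schemas-microsoft-com:office:word" '
    'xmlns="http://www.w3.org/TR/REC-html40">'
)

def generate_report(title, findings):
    """Wrap members' findings in Word-flavoured HTML (illustrative)."""
    items = "".join(f"<li>{f}</li>" for f in findings)
    body = f"<body><h1>{title}</h1><ul>{items}</ul></body>"
    return WORD_HTML_HEADER + body + "</html>"

# HTTP response headers that make the browser download the result
# as a .doc file instead of rendering it as a page.
HTTP_HEADERS = {
    "Content-Type": "application/msword",
    "Content-Disposition": 'attachment; filename="report.doc"',
}
```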
Technical Complexity #3
Chat Function
• Enhances communication among team members
• Uses the channel method to accomplish real-time chatting
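The channel-based flow can be sketched as a fan-out: the server pushes each chat message to every connected team member's channel. On App Engine the push would be google.appengine.api.channel.send_message; here a stand-in send_message and OUTBOX keep the sketch self-contained and runnable. Everything below is an illustrative assumption, not the team's actual code.

```python
import json
import time

OUTBOX = []  # stand-in transport so the sketch runs without App Engine

def send_message(client_id, message):
    """Stand-in for the real channel push to one connected client."""
    OUTBOX.append((client_id, message))

def broadcast_chat(sender, text, team_client_ids):
    """Fan a chat message out to all connected team members."""
    envelope = json.dumps({
        "from": sender,
        "text": text,
        "sent_at": time.strftime("%H:%M:%S"),
    })
    for client_id in team_client_ids:
        send_message(client_id, envelope)
```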
System Flexibility & Maintainability
• Flexible back end: deploy to a different back end without changes to the front-end code
• Log file at the back end: records the time and type of each request received, and the queries executed
PROJECT MANAGEMENT
Schedule (Iterative)
• 5 working days per iteration (Mon-Fri), with weekends as buffer
• Each iteration: development (design and code), test cases, debugging, documentation review, metrics review
Schedule (Tracked with MS Project)
Milestones
Iteration | Dates | Milestone
2  | 22 Oct 2012 - 26 Oct 2012 | FYP Acceptance Proposal
4  | 5 Nov 2012 - 9 Nov 2012   | FYP Acceptance Demo
12 | 11 Feb 2013 - 15 Feb 2013 | UAT 1
13 | 18 Feb 2013 - 22 Feb 2013 | FYP Midterm Evaluation
15 | 4 Mar 2013 - 8 Mar 2013   | 1st Deployment
18 | 25 Mar 2013 - 29 Mar 2013 | Poster & Video Submission
19 | 1 Apr 2013 - 5 Apr 2013   | UAT 2 & 2nd Deployment
21 | 15 Apr 2013 - 19 Apr 2013 | UAT 3 & Final Deployment; FYP Hand Over & Final Presentation
22 | 22 Apr 2013 - 26 Apr 2013 | FYP Poster Day
Schedule
• Current iteration: Iteration 21
• Completed: 9 milestones
• "Fully deployed" at the sponsor's site
• 2 of 21 iterations 'Behind Schedule'; 1 of 21 'Ahead of Schedule'
• Task-level breakdown: mixture of indicators
Schedule Metrics Calculation
Schedule Index (per iteration) = Estimated Hours / Actual Hours
Status | Index | Action
Ahead of Schedule | > 1.2 | Reschedule and allocate less time for similar tasks.
On Schedule | 0.8 - 1.2 | No re-adjustment needed on the schedule.
Behind Schedule | < 0.8 | Allocate more time for similar tasks. Reschedule and split into smaller tasks.
e.g. Schedule Index (Iteration 21) = 11.00 / 9.50 = 1.16
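The index and its thresholds can be written as a small helper. This is a sketch using exactly the slide's thresholds; the function name is our own.

```python
def schedule_status(estimated_hours, actual_hours):
    """Return (index, status) using the thresholds from the slide."""
    index = estimated_hours / actual_hours
    if index > 1.2:
        status = "Ahead of Schedule"
    elif index >= 0.8:
        status = "On Schedule"
    else:
        status = "Behind Schedule"
    return index, status

# Iteration 21 example from the slide: 11.00 / 9.50 = 1.16 (On Schedule)
index, status = schedule_status(11.00, 9.50)
```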
Schedule Metrics
[Bar charts: frequency of iterations that finished Ahead of, On, or Behind Schedule, for the first and second halves of the project]
Schedule Metrics - Per Iteration
[Stacked bar chart: number of tasks Ahead of, On, or Behind Schedule for Iterations 1-13]
Schedule Metrics - Per Iteration
[Stacked bar chart: number of tasks Ahead of, On, or Behind Schedule for Iterations 14-21]
Project Status
Iteration | Planned / Actual | Comments
1 | FYP Acceptance Proposal |
2 | FYP Acceptance Demo Prototypes |
3 | FYP Acceptance Demo Prototypes Phase 2 |
4 | Project Management Phase 1 |
5 | Acceptance Clean-up | Dropped the IT Audit function, the voice-chat function, and auto-assigning members to tasks
6 | Project Management Phase 2 |
7 | Project Management Phase 3 |
8 | Project Management + Report Management Phase 1 |
Project Status
Iteration | Planned | Actual | Comments
9 | Project Management + Report Management Phase 2 | |
10 | Project Management + Report Management Phase 3 | Project Management UI front end | Deployment was put on hold until the sponsor's server finished maintenance
11 | UAT Preparation | Wrap-up of Profile Page & Project Management | UAT shifted to Iteration 12 as the sponsor's schedule did not allow for it
12 | Report Management & Schedule Management | Heuristic & UAT | Heuristic & UAT feedback collection; UI changes made from UAT feedback
13 | FYP Midterm Preparation | | Report Management & Schedule Management Phase 1 shifted to Iteration 14
14 | Schedule Management Phase 1 | | Deployment to be scheduled in recess week
Project Status
Iteration | Planned | Actual | Comments
15 | Analytics Phase 1 | Adjustments made for a smooth deployment | Analytics Phase 1 shifted to Iteration 16; 1st deployment completed
16 | Analytics Phase 2 | Analytics Phase 1 & 2 |
17 | UAT 2 Preparation | Touch-screen-enabled responsive website Phase 1 | UAT 2 shifted to Iteration 18
18 | Touch-screen-enabled responsive website Phase 1 | Touch-screen-enabled responsive website Phase 2 & FYP Wiki | UAT 2 shifted to Iteration 19
Project Status
Iteration | Planned | Actual | Comments
19 | Touch-screen-enabled responsive website Phase 2 | UAT 2 | 2nd deployment completed; UAT 2 results collection & feedback
20 | UAT 3 | UAT 2 adjustments for the sponsor's UI requirements; FYP Final Preparation Phase 1 |
21 | FYP Final Presentation | UAT 3 | Final deployment completed; FYP hand-over; FYP final presentation
22 | FYP Poster Day | |
Bug Metrics Calculation
Scoring = 1 x num(low) + 5 x num(high) + 10 x num(critical)
Severity | Description
Low (1 point) | Non-critical. Application runs normally; core functions are working. Small amendments to the code.
High (5 points) | System runs but is not functioning properly. Some core functions working.
Critical (10 points) | Application is unstable or crashes. Unable to continue without troubleshooting.
Score | Action
0-5 | Fix during buffer time
6-10 | Debug during the debugging session set aside in each iteration
11 or more | Resolve bugs immediately; stop developing the project
e.g. Bug Score (Iteration 21) = 1 x 1 (low) + 5 x 4 (high) + 10 x 0 (critical) = 21
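The scoring formula and its action thresholds can be combined into a small helper. This is a sketch using the slide's values; the function names are our own.

```python
def bug_score(num_low, num_high, num_critical):
    """Scoring = 1 x num(low) + 5 x num(high) + 10 x num(critical)."""
    return 1 * num_low + 5 * num_high + 10 * num_critical

def bug_action(score):
    """Map a score to the action thresholds from the slide."""
    if score <= 5:
        return "Fix during buffer time"
    if score <= 10:
        return "Debug during the iteration's debugging session"
    return "Resolve bugs immediately; stop developing the project"

# Iteration 21 example: 1 low + 4 high + 0 critical = 21
score = bug_score(1, 4, 0)
```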
Bug Metrics
Score: 21 → Action: resolve bugs immediately; stop development of the project.
Iteration | Date Found | Description | Severity | Status | Date Solved
21 | 14/4/2013 | 'Complete' button not working in mytask.html | High | Completed | 13/4/2013
21 | 14/4/2013 | Sequence of tasks jumbles up in add_task.html | High | Completed | 14/4/2013
21 | 15/4/2013 | Timing is incorrect in message.html | Low | Completed | 15/4/2013
21 | 15/4/2013 | 'Edit' & 'delete' buttons not working in myTasks.html | High | Completed | 15/4/2013
21 | 15/4/2013 | 'Prev' button not working in add_tasks.html | High | Completed | 15/4/2013
Scoring = 1 x 1 (low) + 5 x 4 (high) = 21
Bug Metrics - Per Iteration
Iteration: 2, 4, 6, 7, 8, 9, 10, 11, 12, 13
Score:     2, 5, 7, 6, 5, 10, 2, 9, 22, 20
Bug Metrics - Per Iteration
Iteration: 14, 15, 16, 17, 18, 19, 20, 21
Score:     31, 2, 6, 20, 33, 18, 30, 21
X Factor
Improved project operations:
• Increased the efficiency of the report-editing process by over 20%
• Increased the efficiency of project scheduling by over 20%
• Data analytics can identify industry trends
• Sponsor estimated cost savings of at least $50,000 per year
Lessons Learnt
• Flexibility (Python back end)
• Design from the user's perspective
• Cater for a large amount of buffer time
• Plan to integrate the application at the earliest time possible (we deployed three times)
• Importance of feedback