JUC Europe 2015: How to Optimize Automated Testing with Everyone's Favorite Butler
Transcript of JUC Europe 2015: How to Optimize Automated Testing with Everyone's Favorite Butler
#jenkinsconf
How to Optimize Automated Testing with Everyone's Favorite Butler
Viktor Clerc, Product Manager & Jenkins Fan, XebiaLabs
Agenda
• The World of Testing is Changing
• Testing = Automation
• Test Automation and CD: Execution and Analysis
• Focus on the Basics
• Best Practices for Test Execution using Jenkins
• Supporting Test Analysis
But first…a bit about me
• Product Manager, XL TestView, at XebiaLabs
• Traversed through all phases of the software development lifecycle
• Supported major organizations in setting up a test strategy and test automation strategy
• Eager to flip the way (most) organizations do testing
…and about XebiaLabs
• We build tools to solve problems around DevOps and Continuous Delivery at scale
The World of Testing is Changing
Introducing Test Automation For Real
[Diagram: delivery phases SPECIFY → DESIGN → BUILD → TEST → INTEGRATE → REGRESSION → USER ACCEPTANCE → RELEASE, showing how acceptance-driven testing (“Development = Test, Test = Development, automate ALL”) shifts test effort left and shrinks the integrate/regression/user-acceptance tail.]
Testing = Automation
Testing = Automation: Implications
• Developers are becoming testers
  – Maintain test code as source code
• Need to set up on-demand pipelines and environments
• Infrastructure as code
  – X-browser tests, Selenium grids, dedicated performance environments, mobile, etc.
• Hosted services
Testing = Automation: Challenges
• Many test tools for each of the test levels, but no single place to answer “Good enough to go live?”
• Requirements coverage is not available
  – “Did we test enough?”
  – Minimize the mean time to repair
  – Support for failure analysis

JUnit, FitNesse, JMeter, YSlow, Vanity Check, WireShark, SOAP-UI, Jasmine, Karma, Speedtrace, Selenium, WebScarab, TTA, DynaTrace, HP Diagnostics, ALM stack, AppDynamics, Code Tester for Oracle, Arachnid, Fortify, Sonar, …
Testing = Automation: Challenges
• Thousands of tests make test sets hard to manage:
  – “Where is my subset?”
  – “What tests add most value, what tests are superfluous?”
  – “When to run what tests?”
• Running all tests all the time takes too long; feedback comes too late
• Quality control of the tests themselves and maintenance of testware
Testing = Automation: Challenges
• Tooling overstretch
Testing = Automation: Challenges
• Tooling overstretch • Poor butler!
Test Automation and CD: Execution and Analysis
The Two Faces of CD
• A lot of focus right now is on pipeline execution
• …but there’s no point delivering at light speed if everything starts breaking
• Testing (= quality/risk) needs to be a first-class citizen of your CD initiative!
The Two Faces of CD
• CD = Execution + Analysis
• = Speed + Quality
• = Pipeline orchestration + ..?
Focus on the Basics
Quick Review
1. Cohn’s pyramid
  – Unit tests
  – Service tests (under the GUI)
  – (graphical) User Interface tests
2. And even further downstream
  – Integration Tests
  – Performance Tests
“Modern Testing” 101
1. Testers are developers
2. Test code equals production code
  – Conway’s Law
  – Measure quality
3. Linking tests to use cases
4. Slice and dice
  – Labeling
5. Radical parallelization
  – Fail FASTer! “Kill the nightlies”
Dealing With Growing Tests
• Conway’s Law for test code
  – Let the test code mimic the production code
  – Organize tests under the project/system under test
• Suite.App.UseCase.TestCase
• Cut the suite at UseCase: now you have independent chunks which you can run massively in parallel
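The “cut at UseCase” idea can be sketched as a small shell script. Everything here is illustrative: `run_chunk` stands in for your real test-tool invocation (FitNesse, JUnit, …) and the suite names are made up.

```shell
#!/bin/sh
# Illustrative only: run_chunk stands in for the real test-tool command line.
run_chunk() {
    echo "running $1"            # replace with e.g. a FitNesse suite run
}

# Each chunk is one UseCase-level suite (Suite.App.UseCase), so chunks
# are independent and can run massively in parallel.
CHUNKS="WebshopSuite.BusinessAccountSuite.UseCase1500
WebshopSuite.BusinessAccountSuite.UseCase1501
WebshopSuite.ConsumerSuite.UseCase2100"

for chunk in $CHUNKS; do
    run_chunk "$chunk" > "result-$chunk.log" 2>&1 &   # one background job per chunk
done
wait                                                  # collect all results together
echo "all chunks finished"
```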
Dealing With Growing Tests
• Tests should not depend on other tests
  – Setup and tear down of test data done within each test
  – Share test components (as you would with ‘real’ production code)
  – Trade-off between:
    • No code duplication, yet somewhat more complex fixtures
    • Easy-to-grab simple fixtures, but a lot of them (and duplication)
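A minimal sketch of such self-contained tests: each one sets up and tears down its own data, so they can run in any order. `create_account`/`delete_account` are hypothetical shared fixture components, not a real API.

```shell
#!/bin/sh
# Sketch of self-contained tests: setup and teardown live inside each test,
# so test order never matters. The fixture functions are invented examples.
WORK=$(mktemp -d)

create_account() { echo "account-$1" > "$WORK/fixture-$1"; }
delete_account() { rm -f "$WORK/fixture-$1"; }

test_business_login() {
    create_account business                      # setup inside the test
    grep -q "account-business" "$WORK/fixture-business" \
        && echo "test_business_login PASS"
    delete_account business                      # teardown inside the test
}

test_consumer_login() {
    create_account consumer
    grep -q "account-consumer" "$WORK/fixture-consumer" \
        && echo "test_consumer_login PASS"
    delete_account consumer
}

# Order does not matter: run them in any order and both still pass.
test_consumer_login
test_business_login
```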
Keep It Manageable
• Focus on functional coverage, not technical coverage
• Say 40 user stories, 400 tests:
  – Do I have relatively more tests for the more important user stories?
  – How do I link tests to user stories/features/fixes?
• Metrics:
  – Number of tests
  – Number of tests that have not passed in <time>
  – Flaky tests
  – Duration
Slice and Dice
• Use appropriate labels in your test code:
  – Responsible team
  – Topic
  – Functional area
  – Flaky
  – Known issue
  – etc.
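A sketch of slicing by label: here the labels live in a plain manifest file (an invented format — in practice they would be FitNesse tags, JUnit categories, Cucumber tags, and so on), and a slice is just a filter over it.

```shell
#!/bin/sh
# Invented manifest format: <test> <labels...>. Real labels would live in
# the test tool itself (FitNesse tags, JUnit categories, Cucumber tags).
cat > labels.txt <<'EOF'
WebshopSuite.BusinessAccountSuite.UseCase1500 team-a payments nightly
WebshopSuite.BusinessAccountSuite.UseCase1501 team-a payments flaky
WebshopSuite.ConsumerSuite.UseCase2100 team-b checkout nightly
EOF

# Slice and dice: everything labeled "nightly" that is not known to be flaky.
grep ' nightly' labels.txt | grep -v ' flaky' | awk '{print $1}'
```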
Best Practices for Test Execution in Jenkins
Jenkins Testing Basics
• Tilt the pyramid …
• … and use this as the guiding principle to set up your Jenkins test jobs “left to right”
Organizing Test Jobs in Jenkins
1. Create unique artifacts and fingerprints to monitor what you are pushing across your pipeline
2. Treat different platforms (e.g. browsers) as different tests, handled by different jobs
3. Well-known plugins:
  – Multi-job
  – Copy Artifact
  – Workflow
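A Jenkins fingerprint is simply the MD5 checksum of an archived artifact; computing one by hand makes point 1 concrete (the `app.ear` file here is a stand-in):

```shell
#!/bin/sh
# Jobs that archive or use the same file record the same MD5 checksum, which
# is how Jenkins tracks one artifact across the pipeline. app.ear is a
# stand-in artifact created just for this sketch.
printf 'demo artifact contents\n' > app.ear
FINGERPRINT=$(md5sum app.ear | awk '{print $1}')
echo "$FINGERPRINT"        # a 32-character hex digest
```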
Organizing Test Jobs in Jenkins
4. Keep Jenkins jobs sane and simple
  – Ergo: execute shell scripts from your Jenkins jobs
5. Shell scripts are parameterized
6. Parameters are fed to individual test tools
  – FitNesse labels, Cucumber labels, etc.
7. Shell scripts placed under version control
  – Managed by the team as any other source code
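Points 4–7 combined might look like this version-controlled wrapper; the parameter names and the commented-out tool invocation are illustrative, and a Jenkins "Execute shell" step would call it as `./run_tests.sh "$LABEL" "$TARGET_ENV"`.

```shell
#!/bin/sh
# Illustrative wrapper a Jenkins job would call; parameter names and the
# commented-out tool invocation are examples, not a real CLI.
LABEL="${1:-smoke}"        # e.g. a FitNesse or Cucumber label
TARGET_ENV="${2:-test}"    # which (throw-away) environment to test against

echo "running tests labeled '$LABEL' against '$TARGET_ENV'"
# A real script would now feed the parameters to the test tool, e.g.:
#   cucumber --tags "@$LABEL" ENV="$TARGET_ENV"
```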
Example Job Distribution
[Diagram: two variants of a pipeline of Jenkins jobs — Build → Deploy → Int. Tests → three Test jobs in parallel → Perf. Tests.]
Beware of scattered result qualification
Distributing Tests Across Jobs
• Radical parallelization using cheap and cheerful throw-away environments
  – Especially when environments (e.g. containers) lie at your fingertips
• Jobs should not depend on other jobs
• Test jobs are your “eyes and ears” – optimize for them!
Example Job Distribution
[Diagram: Build → Deploy → Int. Tests → three Test jobs in parallel → Perf. Tests, with a “?” where the combined verdict should come together.]
Challenge: Scattered Results
Supporting Test Analysis
Making Sense of Test Results
• Real go/no go decisions are non-trivial:
  – No failing tests
  – 5% of failing tests
  – No regression (tests that currently fail but passed previously)
  – List of tests-that-should-not-fail
• Need historical context
• One integrated view
• Data to guide improvement
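One way to make such a decision concrete is a verdict richer than pass/fail; the result and known-failure file formats below are invented for illustration.

```shell
#!/bin/sh
# Qualify a test run as GO / GO-with-known-failures / NO-GO.
# results.txt and known-failures.txt use invented formats.
cat > results.txt <<'EOF'
UseCase1500 PASS
UseCase1501 FAIL
UseCase2100 PASS
EOF
cat > known-failures.txt <<'EOF'
UseCase1501
EOF

FAILED=$(awk '$2 == "FAIL" { print $1 }' results.txt)
UNKNOWN=""
for t in $FAILED; do
    grep -qx "$t" known-failures.txt || UNKNOWN="$UNKNOWN $t"
done

if [ -z "$FAILED" ]; then
    echo "GO"
elif [ -z "$UNKNOWN" ]; then
    echo "GO (passed, with known failures)"    # the verdict plain pass/fail cannot express
else
    echo "NO-GO: unexpected failures:$UNKNOWN"
fi
```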
Making Sense of Test Results
Executing tests from Jenkins is great, but…
• Different testing jobs have their share of Jenkins plugins
• Historic view merely available per job, not across jobs
• Pass/Unstable/Fail is too coarse
  – How to do “Passed, but with known failures”?
Making Sense of Test Results
• Ultimate analysis question (“are we good to go live?”) is difficult to answer
• No obvious solution for now, unless all your tests are running through one service
Example Case Study
FitNesse Implementation
• Started with 1 project containing all tests
• Sharing knowledge
• Structured the same as our use cases, i.e. WebshopSuite.BusinessAccountSuite.UseCase1500
• Nightly runs from the beginning
  – Indicated by labels (“nightly”)
• First sequential per application
  – WebshopSuite
• Later parallel, split by functional area
  – WebshopSuite.BusinessAccountSuite.*
Example Pipeline
[Diagram: a pipeline table listing each step with its tools and environment — a check-in step (git server and Jenkins server: code review, build, unit tests, build EAR, deploy, smoke test on a dedicated team server), a source code quality test step (Jenkins server and Sonar server), a system test step (deploy to chain, chain test on Chain {1-5}), a security test step (dedicated team server), and end-to-end testing (smoke test and chain test across Chain {1-5}), leading to acceptance test and production.]
Test Analysis: Homebrew
Test Analysis: Custom Reporting
Summary
• Testing = Automation
  – Testers are developers
• Structure and annotate tests
  – Conway’s Law for tests
  – Link to functions/features/use cases
• Radical parallelization
  – Throwaway environments
Summary
• Keep Jenkins jobs simple
• Keep Jenkins jobs independent
• Track the SUT with fingerprints
• Invoke test tools via plugins or version-controlled scripts
• Parameterization!
• Parallelize & optimize
Summary
• CD = Speed + Quality = Execution + Analysis
• Making sense of scattered test results is still a challenge
• Need to figure out how to address real-world go/no-go decisions
What’s Next?
• Visit http://tiny.cc/webinar-xebialabs for a webinar by CloudBees and XebiaLabs demonstrating the key value of CD and go-live decisions
• Read more on the testing challenges in CD: http://tiny.cc/ta-and-cd
• Try XebiaLabs’ XL TestView solution to bring quality into the heart of your CD initiative: http://tiny.cc/xl-testview
Please Share Your Feedback
• Did you find this session valuable?
• Please share your thoughts in the Jenkins User Conference Mobile App.
• Find the session in the app and click on the feedback area.