Can You Really Automate Yourself Secure
Transcript of Can You Really Automate Yourself Secure
Can you really automate yourself secure? Facts vs. Fantasies
Scott Crawford, Research Director, 451 Research
Nabil Hannan, Managing Principal, Security, Cigital
Security teams are struggling
• "Lack of staff expertise" is the most common obstacle to multiple aspects of security operations
• "Organizational politics / lack of attention" is the second most reported infosec pain point¹
• More data than teams can handle
• SIEM: events per second into the 5-digit range
• One-fourth of security organizations still can't understand and baseline normal behavior²
DAST/SAST inhibitors: What inhibitors has your organization encountered in adopting or fully utilizing your vendor's technology?
Source: 451 Research, Voice of the Enterprise: Information Security, Q3 2015
¹ 451 Research, Voice of the Enterprise: Information Security, Q3 2015
² SANS Institute, 2015 Analytics and Intelligence Survey
Automation can help with these burdens
• Continued growth in the use of security analytics
  • Assessment and testing as well as operational monitoring and control
  • Applications of machine learning
  • Ability to handle data at speed and scale
  • Advantages of the cloud
• Rise of security task automation
  • Today:
    • "Playbook" approach to orchestration
    • Auto-generation of code (.py)
    • Incident response workflow & data aggregation
    • Testing and assessment
  • Tomorrow:
    • Containers and microservices
    • "Infrastructure-as-code"
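To make the "playbook" idea concrete, here is a minimal sketch of playbook-style task automation: steps run in a fixed order and their outputs are aggregated into a case record. The step names (`enrich_alert`, `contain_host`) and the fields they attach are hypothetical illustrations, not the API of any particular orchestration product.

```python
# Minimal "playbook" runner sketch: each step is a plain function that
# takes the triggering alert and returns a result to aggregate.

def enrich_alert(alert):
    """Attach context to a raw alert (hypothetical lookup results)."""
    return {**alert, "asset_owner": "app-team", "severity": "high"}

def contain_host(alert):
    """Record a (pretend) containment action for the affected host."""
    return {"action": "isolate", "host": alert["host"], "done": True}

def run_playbook(alert, steps):
    """Run each step in order, aggregating results for the case record."""
    case = {"alert": alert, "actions": []}
    for step in steps:
        case["actions"].append(step(alert))
    return case

case = run_playbook({"host": "web-01", "rule": "brute-force"},
                    [enrich_alert, contain_host])
print(len(case["actions"]))  # 2
```

Real orchestration platforms add branching, approvals, and connectors to external tools, but the core pattern is the same: codified response steps replacing manual checklist work.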
What CAN’T automation do?
First of all, what's your strategy?
• Case in point: application security covers a lot of ground
  • Static, dynamic, or interactive testing?
  • IDE-integrated real-time code analysis/guidance?
  • Source supply chain? Runtime protection?
• What do you hope to achieve with automation?
  • Who will be responsible? Developers? Operations teams? Security experts?
  • Limited requirements for test data and automated test suites? Developers implementing more secure code?
  • Limited requirements for modifying the production environment?
• Automation without a plan and a strategy runs the risk of automating the wrong things
Goal Setting
The reality of automation tools
• They are designed to address "likely" use cases out of the box
• BUT… every application is different
• Specific implementations require one (or both) of two things:
  • Adaptation of the technology to the environment (tool tuning)
  • Adaptation of the environment to the tool (ensuring that tools have access to all relevant functionality in the target application)
• Who or what makes this happen?
• Do you know what the cost in time and expertise will be to make sure your automation tools deliver expected results?
Adapting to application differences
Automation has its limits
• Do you know how far automation can go in your case?
• Application security coverage:
  • How much direction do your tools require?
  • Do they follow all the logic and branches you expect? What are the tradeoffs if they do?
• Caveat: be aware when benchmarks may mask these factors!
  • Some favor techniques such as IAST by scripting the assessment of all inputs, vs. SAST or DAST tools, which must drive their own coverage
  • Scope/scale of test cases: how many does each type of benchmark evaluate? Are you measuring a sprint against a marathon?
• Do the limits of your tools fit your expectations? Do you know how to manage the gaps if they don't?
Automation has its own information overload risks
• False positives aren't the only issue. Say:
  • A given run of your tool reports 500 possible issues
  • The true positive rate is 20%, so 100 of those 500 reported issues are actually legitimate
  • …but your team only has time to evaluate 200 of the 500 findings
  • If the true positives are evenly distributed, your team will identify only 40 TPs among the 200 findings they examine (200 × 20% = 40)
  • …meaning your team will miss the remaining 60 true positives
• Does your team really have the resources to fully assess the results of automation without missing important findings?
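The triage arithmetic above can be written out directly, which also makes it easy to try other capacities and true-positive rates:

```python
# The slide's triage arithmetic: 500 reported findings, a 20%
# true-positive rate, and capacity to review only 200 of them.
findings = 500
tp_rate = 0.20
reviewed = 200

true_positives = int(findings * tp_rate)   # 100 real issues in the report
# With TPs evenly distributed, the reviewed sample catches the same rate:
tps_found = int(reviewed * tp_rate)        # 200 x 20% = 40
tps_missed = true_positives - tps_found    # 100 - 40 = 60 missed
print(tps_found, tps_missed)  # 40 60
```

Doubling review capacity to 400 would still leave 20 true positives unexamined, which is the slide's point: tool output scales faster than human triage.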
[Venn diagram: all findings vs. the findings our team can get to; all true positives, divided into the TPs our team found and the TPs our team missed]
Where will automation fit into your processes?
• Traditional application security testing:
  • Periodic
  • DAST may be "monolithic" – run against an entire application or large/comprehensive components
• In a DevOps environment:
  • Frequent updates and releases – moving toward CI/CD
  • Testing may need to be closer to continuous
  • Broken into smaller chunks – more "unit" oriented than monolithic
• Are your teams ready for the impact of integrating secure development and testing into DevOps?
  • How well do your security pros understand the DevOps toolchain?
  • How well do your developers & ops teams understand security needs?
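One way to picture "unit-oriented" security testing is a small per-commit check that exercises one security property instead of a monolithic scan. The `validate_redirect` function and its allow-list below are hypothetical application code, used only to illustrate the pattern:

```python
# Sketch of a "unit-oriented" security check suitable for every CI run,
# in contrast to a periodic, monolithic DAST scan of the whole app.
from urllib.parse import urlparse

ALLOWED_HOSTS = {"example.com"}  # hypothetical redirect allow-list

def validate_redirect(url):
    """Reject open-redirect targets outside the allow-list."""
    return urlparse(url).hostname in ALLOWED_HOSTS

def test_rejects_external_redirect():
    assert not validate_redirect("https://evil.example.net/login")

def test_allows_internal_redirect():
    assert validate_redirect("https://example.com/dashboard")

# In a CI/CD pipeline a test runner (e.g. pytest) would execute these on
# every commit, keeping security verification close to continuous.
test_rejects_external_redirect()
test_allows_internal_redirect()
```

Checks like this do not replace broader DAST/SAST coverage; they complement it by catching regressions in known-sensitive code paths at commit speed.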
Congratulations, you have results! …Now what?
• Don't forget: discovery is only half the battle!
• Once your automation tools reveal exposures, will you be able to close them successfully?
• Security task automation can close operational exposures…
• …but can it remediate vulnerabilities or flawed implementations in the applications themselves?
• If your automated testing tools produce results, what's your strategy for closing the loop on remediation?
Automation solves a lot of problems… but
• Only people can:
  • Understand your business and security goals and objectives
  • Know the limits of automation tools… and how and where to close their gaps
  • Invest the effort required to assess the results of automation and apply it correctly
  • Integrate security automation into development and operational practices
  • Ensure that remediation answers the security needs revealed by automated monitoring and assessment
  • Help you develop the right automation strategy based on experience
Solutions for the full SDLC