International Journal of Services Computing (ISSN 2330-4472) Vol. 2, No. 2, April - June 2014
58 http://hipore.com/ijsc
A COMPLIANCE AWARE INFRASTRUCTURE AS A SERVICE
Shakil M. Khan, Lorraine M. Herger, Mathew A. McCarthy
IBM Corporation
[email protected], [email protected], [email protected]
Abstract
With cloud eclipsing the $100B mark, it is clear that the main driver is no longer strictly cost savings. The focus now is to exploit the cloud for innovation, utilizing its agility to expand resources and quickly build out new designs, products, simulations and analyses. Companies will use this agility and speed as a competitive advantage. An example of this agility is the adoption by enterprises of the software-defined datacenter (SDDC) model, required to support the changing workloads and dynamic patterns of the enterprise. Often, security and compliance become an 'afterthought', bolted on later when problems arise. In this paper, we discuss our experience in developing and deploying a centralized management system for public clouds as well as an OpenStack-based cloud platform in SoftLayer, with an innovative, analytics-driven 'security compliance as a service' that constantly adjusts to varying workload, security and compliance requirements.
Keywords: SDDC, GRC, Ontology, IaaS, Compliance, OWL, SWRL, Cloud
__________________________________________________________________________________________________________________
1. INTRODUCTION
Companies are increasingly going “cloudwards”, using
both public providers and private datacenters because of the
business agility that Infrastructure as a Service (IaaS)
enables. Full IT automation, self-service provisioning, and
metered usage billing help companies accelerate the
development of their products and services, and improve
organizational efficiency. Unfortunately, many companies
are struggling to accelerate the most important parts of their
business due to the challenges of securing these highly
dynamic environments. Use of a cloud service does not
automatically guarantee strong security or required
compliance. Although some providers offer optional
security capabilities that can help reach the
required security and compliance posture, it is the user’s
obligation to ensure that workloads running on the cloud
are secure and compliant, a fact often forgotten in the haste
to bring an application or service online.
IBM Research has collaborative research projects with
clients, ranging from internal business units to external
clients such as government and almost all vertical market
segments. Researchers need to run their experiments with
innovative architectures and algorithms in a datacenter
environment modeled around the ‘living lab’ concept in
order to pilot solutions for highly dynamic and volatile
markets in a timely fashion. This introduces tremendous
challenges in supporting heterogeneity in workloads as well
as security and compliance requirements.
In most cases the researchers who need to move fast and
implement change run into very legitimate barriers and
concerns from their IT and “governance” teams when they
bring their ideas to the table. The groups responsible for
creating and supporting applications and solutions are
chartered with ensuring that data and intellectual property
are secure, privacy laws and other regulations are complied
with, and that the solutions are “future proof” and smart
investments. The stewardship of one group to protect the
company, and of the other to accelerate the response to change,
creates tension, frustration, and conflict.
2. BRIDGING THE CHASM BETWEEN AGILITY AND SECURITY
With the acquisition of SoftLayer, IBM Research is
being encouraged to use it to power its research workloads.
Unfortunately, SoftLayer does not automatically guarantee
strong security or required compliance. In order to stay
relevant and competitive, research needs to respond to
market forces almost immediately. Capabilities such as a
service catalog with standardized offerings and tiered SLAs,
automated workload-aware provisioning in private, public
and hybrid clouds, proactive incident and problem
management, and IT cost transparency and chargeback have
helped unlock the efficiency, agility and benefits of the
cloud. Yet reliability, security and compliance remain
formidable barriers to turning these benefits into innovation
at the speed of the business. Manual security and compliance
as an “afterthought” pose the following challenges to
researchers:
- Need-specific, piecemeal solutions bolted on to existing infrastructures create silos, drive up cost, and impede innovation.
- Users lack expertise in security and compliance.
- Changes in regulations are often not communicated outside the security and compliance functions, leading to contextually invalid security implementations by users.
- Data and intellectual property theft results from this lack of security and compliance expertise.
- ’Home grown’ research solutions meet business requirements but fall short of security and compliance audit requirements.
This situation can only be overcome by building an IaaS
integrated with a fully automated risk assessment and
remediation engine in a “Compliance as a Service (CaaS)”
model. CaaS treats non-functional security and compliance
requirements in a non-proprietary and interoperable way.
CaaS functional activities are controlled by a set of dynamic
policies. An analytics function constantly interfaces with
security information and event management (SIEM) tools,
audit logging, etc., to measure drift, and then disseminates
policy commands to the policy-aware security control
components, applications, IaaS, and Platform as a Service
(PaaS) layers to fix the drift. The IaaS and PaaS layers also
solicit guidance during provisioning to select a target
environment based on compliance requirements and trend
analysis. This integration of the “infrastructure as code”
model (IaaS) with the “compliance as code” model (CaaS)
bridges the gap between agility and security.
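The measure-drift-then-remediate cycle just described can be sketched as a simple control loop. This is a minimal illustration, not the product implementation; the Policy type, the observed-state map, and the remediate callback are all hypothetical names:

```python
# Minimal sketch of the CaaS control loop: poll observed state, compute
# drift against declared policy, and disseminate remediation commands.
# All names here (Policy, compute_drift, remediate) are illustrative,
# not the CaaS product API.

from dataclasses import dataclass

@dataclass
class Policy:
    control_id: str   # security control this policy item maps to
    expected: str     # declared compliant state

def compute_drift(policies, observed):
    """Return the policies whose observed state deviates from the declared one."""
    return [p for p in policies if observed.get(p.control_id) != p.expected]

def control_loop(policies, observed, remediate):
    """One iteration: measure drift, then disseminate remediation commands."""
    drifted = compute_drift(policies, observed)
    for p in drifted:
        remediate(p)  # e.g., push a policy command to a firewall or IDS
    return drifted

# Example: one control has drifted out of its declared state.
policies = [Policy("fw-ingress", "deny-all"), Policy("host-ids", "enabled")]
observed = {"fw-ingress": "allow-all", "host-ids": "enabled"}
fixed = control_loop(policies, observed, remediate=lambda p: None)
```

In the deployed system the observed state comes from SIEM and audit-log feeds, and remediation commands are disseminated to the policy-aware components.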
3. COMPLIANCE AS A SERVICE SOLUTION ARCHITECTURE
Figure 1 describes the high level CaaS solution
framework that enables security provisioning. Major
solution components are:
- OpenPages
- CaaS Controller
- Policy/Plan Translation Framework
- Event Collection Framework
- IaaS Audit Trail
- PaaS Audit Trail

[Figure: the CaaS controller, analytics, messaging bus and external event collection framework connect OpenPages (policy lifecycle management, orchestration, repository, analytics) with the infrastructure and applications layer (network, VPN, host IDS, OpenFlow controller, firewall, operations/asset/identity/patch/vulnerability management, topology, configuration, application dependencies, IaaS/PaaS provisioning audit trails), backed by a CMDB, an RDF state store, and an enterprise ontology of business process models, data models, organizational hierarchies and contract models.]
Fig 1: CaaS solution architecture
3.1 OpenPages
OpenPages is an IBM solution in the space of governance,
risk and compliance management. OpenPages allows
describing underlying security, compliance and risk
requirements as declarative policy items. Policy items are
then mapped to security controls and checked periodically
through management workflows to ensure policy adherence.
Changes in regulations affect the requirements, and thus also
the mapping between security controls and policy items.
Each policy therefore has a process controlling its overall
lifecycle. We enhanced and extended OpenPages security,
compliance and operational management plans so that they
could be annotated with security controls in a domain-agnostic
manner. We then integrated the OpenPages orchestration
definition functions with the CaaS functions that handle
policy-to-control and control-to-plan mapping. We also
enhanced the OpenPages portal to include CaaS functionality.
3.2 CaaS Controller
This is the centralized management system for security
provisioning. The CaaS controller co-ordinates with various
distributed policy-aware components to maintain the desired
compliance state in a fully automated fashion. The
controller continuously polls filtered monitoring data
through an event collection framework. It indexes and
aggregates the machine-produced data, applies security
control contexts, and persists the result in the OpenPages repository. It
then invokes OpenPages analytics and native algorithms to
compute drifts. If components are determined to be out of
policy, the controller invokes the OpenPages management
plan to remediate the non-compliance. All the compliance
state computations are also validated by a metadata driven
controller function called the “Compliance provenance
function”. It also provides sophisticated agentless
monitoring of compromised virtual machines for forensic
and root cause analysis. Finally, the controller provides
functions to formalize knowledge derived from trend
analysis and applies the knowledge to predict compliance
drift.
3.3 Policy/Plan Translation Framework
Management plans in the OpenPages orchestration
framework are annotated with security controls in an
abstract manner. In order to remediate a component deemed
out of policy, there is a need for a semantic layer which
knows the component domain, domain specific security
control, domain specific policy and domain specific action
to take to bring the component back into compliance.
Success of “Compliance as Code” depends on the policy
awareness of the cohorts of components that the CaaS
controller interacts with. In reality, it is impractical to expect
policy awareness from vast swaths of domain solutions that
make up the CaaS solution. Most of the solutions capture
policies in the form of static configuration and do not
expose any API to manipulate them. This semantic layer
also converts high-level OpenPages-native policies into an
XACML format, and then into domain-specific,
source-controlled configurations.
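As a rough sketch of that translation step, a single declarative policy item might first be rendered as a minimal XACML 3.0 policy before being specialized into domain configurations. The function and identifiers below are hypothetical illustrations, not the framework's actual API:

```python
# Sketch: render one high-level policy item as a bare XACML 3.0 Policy
# with a single Rule. A real translation layer would also emit targets,
# conditions and obligations; the policy/rule ids below are made up.

import xml.etree.ElementTree as ET

XACML_NS = "urn:oasis:names:tc:xacml:3.0:core:schema:wd-17"

def to_xacml(policy_id, rule_id, effect):
    """Render a policy item as a minimal XACML Policy element."""
    policy = ET.Element(
        "{%s}Policy" % XACML_NS,
        {"PolicyId": policy_id,
         "RuleCombiningAlgId":
             "urn:oasis:names:tc:xacml:3.0:rule-combining-algorithm:deny-overrides"})
    ET.SubElement(policy, "{%s}Rule" % XACML_NS,
                  {"RuleId": rule_id, "Effect": effect})
    return ET.tostring(policy, encoding="unicode")

xml_text = to_xacml("caas-encrypt-at-rest", "require-encryption", "Deny")
```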
3.4 Event Collection Framework
The Event Collection Framework enables modularized
solutions to collect events and alerts from key domains in a
common format. The Framework also allows defining
domain-specific event filters through a common user
interface.
3.5 IaaS Audit Trail
The Software Defined Infrastructure, with its
“Infrastructure as Code” paradigm introduces such a
dynamic environment and scale of operation that the
traditional operations model is inadequate. It also
redistributes responsibilities from the lower levels of the
stack to the platforms and applications. Operations are
crucial to success,
but operations can only succeed to the extent that it
collaborates with developers and participates in the
development of applications that can monitor and heal
themselves. DevOps is an approach which streamlines
interdependencies between development and operations
through a set of protocols and tools. DevOps facilitates an
enhanced degree of agility and responsiveness through
continuous integration, continuous delivery, and continuous
feedback loops between development and operations teams.
DevOps tools scan an environment to gather infrastructure
components and configuration information and make these
available to the deployment engine manifests. The
deployment engine then manipulates these configurations
through plain text language which, along with deployment
artifacts, could easily be version controlled. This provides a
powerful framework to make compliance specific security
controls available to policy-aware applications in the form
of fully traceable configuration parameters. Since DevOps is
at the core of the IaaS framework, audit logging of
provisioning activities is automatically supported to
provide information for security control.
3.6 PaaS Audit Trail
The Information Technology Infrastructure Library
(ITIL) is a framework of best practice approaches intended
to facilitate the delivery of high quality information services.
In order to facilitate the integration and automation of ITIL
best practices, ITIL specifies the use of a Configuration
Management Database (CMDB) to leverage a single source
of information for all configuration items (CIs) such as
computer systems, operating systems, and software
installations. The configuration management process
includes performing tasks such as identifying configuration
items and their relationships, and adding them to the CMDB.
The contextual mapping of CIs stored in the CMDB provides
the basis for converting the information into a knowledge
graph (RDF) based Semantic model. This allows us to
traverse the relationships to form pattern-based queries and
deduce other implicit relationships, which may not be stored.
This permits meshing an external information graph with
the CMDB knowledge graph through entailment. In the
presence of partial information (an essential feature of
volatile unstructured data) the output is still a consistent
RDF model, which can be successfully processed. CMDB
acts as a Trusted Information Management Framework for
Master Configuration Data. Topology and Orchestration
Specification for Cloud Applications (TOSCA) ensures the
portability of a complex cloud application running on
complex software and hardware infrastructures. TOSCA’s
abstraction level provides a way to describe both
applications and infrastructure components at a high level,
which enables cloud orchestration that can leverage CMDB
for the infrastructure layer. Assembling and orchestrating
virtual images into larger structures, and then relating these
to existing infrastructure, produces a useful audit trail which
could be mined to unearth process flaws that could lead to
non-compliance. Also through TOSCA’s lifecycle support
beyond deployment, it is possible to provide historical data
to measure topology drift.
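The entailment idea can be illustrated with a toy example: CIs and their relationships as RDF-style triples, plus one entailment rule (transitivity of a hypothetical dependsOn relation) that deduces a relationship never explicitly stored. A real deployment would use an RDF store and SPARQL; this pure-Python sketch only shows the principle, and the CI names are made up:

```python
# Toy CMDB-to-knowledge-graph illustration: close a set of RDF-style
# triples under transitivity of one predicate, deducing an implicit
# relationship that was never stored explicitly.

def entail_transitive(triples, predicate):
    """Close the triple set under transitivity of the given predicate."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(closed):
            for (s2, p2, o2) in list(closed):
                if (p1 == p2 == predicate and o1 == s2
                        and (s1, predicate, o2) not in closed):
                    closed.add((s1, predicate, o2))
                    changed = True
    return closed

# Explicitly stored CIs: an application depends on middleware, which
# depends on an operating system (hypothetical CI names).
cmdb = {
    ("app:billing", "dependsOn", "mw:websphere"),
    ("mw:websphere", "dependsOn", "os:rhel6"),
}
graph = entail_transitive(cmdb, "dependsOn")
# The implicit app -> OS dependency is now deducible from the graph.
```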
4. MAPPING NON-IT BUSINESS CONTROLS TO IT/SECURITY CONTROLS
A dynamic enterprise comprises hierarchical functional
layers, where a business reference model of ideas and goals
sits at the top, followed by business functions, business
processes, business services, and the IT functional model
and its realization at the bottom. Each layer comprises
cohorts of domain meta-models that represent domain scope,
functions, and policies. Policies defined at the top layer to
guide business goals, ideas, and functions become more and
more IT-implementation specific as they move toward each
successive lower layer.
[Figure: contracts and the enterprise ontology feed a semantic business model inference and translation engine, which produces an optimized enterprise ontology, an optimized IT functional model, an optimized IT security configuration model, and the discrepancy between the optimized IT functional model and the realized IT model.]
Fig 2: Business controls to IT/Security controls
Governance, Risk and Compliance activities in an
enterprise rely on measuring the state of compliance of
business processes using security controls derived from
regulatory policies. These controls can be implemented in a
variety of ways. In an ever-evolving enterprise, business
events arising outside IT may very well change the
consistency, definition and implementation of the security
controls (for example, a merger between a business-process-oriented
company and a functionally oriented company),
leading to outright non-compliance of business processes,
security, etc. Understanding security compliance behavior
through the lens of regulatory policies, and within the
confines of the business process model and IT-oriented
models, is not sufficient. We need a way to factor in the
impact of business functions, goals, ideas and complex
cross-business-area functional and process interactions
described in high-level business policies to determine the
optimized security configuration.
Complicating the situation, enterprises also utilize
operating-agreement contracts with customers which cover
security and compliance requirements, in a manner similar
to procurement, pricing or SLA specifications. However,
the security and compliance requirements introduced by
these contracts with enterprise clients are defined in free
text, which is not easily interpreted into IT requirement
policies. As with the other aspects of the contract, which
have been interpreted with XML formats, the security and
compliance feature/requirement set can be automatically
extracted via a Compliance as a Service model, and then
interpreted and applied.
When read together, these ideas still present several
important gaps:
1. The IT functional model is evaluated against statically
defined high-level compliance policies to assess impact on
low-level policies and the state of compliance only. Business
demand is limited to changing the IT functional model, and
subsequently the compliance state, which current practice
suggests correcting through manual changes in low-level IT
policies.
2. IT-business alignment is interpreted through the IT
functional model only.
3. Criteria for satisfied compliance are based only on
evidence relating IT system policies to documented security
requirements (compliance policies). There is no notion of an
optimized compliance configuration based on tolerance for
risk and budget for security defined through contracts and
the enterprise architecture (ontology).
4. Proposed methods ignore operational goals
(sustainability, performance, profitability).
Since there is no iterative correspondence between the
enterprise operational universe and a Compliance-as-a-Service
solution to negotiate and implement an optimized compliance
configuration that balances tolerance for risk and budget
for security, we propose a fully automated risk assessment
and remediation solution in a “Compliance as a Service
(CaaS)” model that dynamically derives the IT functional
model from the Enterprise Ontology, which essentially
captures organizational goals and the operational universe.
A contract model within the Enterprise Ontology decomposes
the portions of contracts that require IT commitments into
high-level policies and dynamically binds them to security
requirements.
A semantic representation of the regulations, as well as the
enterprise reference architecture, provides the flexibility and
extensibility needed for modeling continuously evolving
enterprise domains and compliance regulations, and for
capturing the impact of business goals and ideas on the
state of compliance.
Figure 2 depicts a semantic business model inference
and translation engine that reasons over the enterprise
ontology, constrained by business model and control
instances derived from contracts or from manual input in
the form of business metrics for performance, the effect of
risk on capital and earnings, sustainability, tolerance for
risk, risk scores and vectors, the cost of security, etc. It
provides one or more outputs, as a report to a user: the
degree of performance, an optimized enterprise architecture,
an optimized security configuration, an optimized IT
functional model, and the discrepancy between the optimized
IT functional model and the realized IT model. Figure 3
shows a flow diagram for iterative computation of the state
of compliance using the enterprise ontology, contracts, IT
functional model, etc.
[Figure: flowchart in two parts. Contract content analysis (steps 100-160): enterprise-profile- and rule-based contract content analysis; analysis-based visualization of IT-relevant parameters; IT-professional contract modification via a visualization assistant; creation of a revised contract version with known, accepted IT parameters; additional revisions from other parties; generation of deal-specific artifacts (IT implementation details, project-specific policies, IT security and compliance requirements). Iterative compliance loop (steps 200-280): read the Enterprise Ontology; read the contract model; map portions of contracts to business semantics (processes, activities, roles, goals, technologies); derive the IT functional model; generate and manage high-level operational policies; disambiguate contextual references and map policies to security controls; annotate security controls with security requirements and generate low-level policies; create an IT instance from the IT functional, security requirement and implementation models; deploy and collect evidence; compute drift; iterate or stop.]
Fig 3: Enterprise Ontology and policy generation
In Figure 3, contract analytics, with the help of domain
experts, the Enterprise Ontology (enterprise profile,
services, rules) and parameterized operational criteria
(sustainability, performance, profitability), e.g. costs,
tolerance for risk, and budget for security as input,
decomposes the portions of contracts that require IT
commitments into high-level policies and requirements
(100, 110, 120, 130, 140, 150, 160).
An IT functional model is dynamically derived with the help
of the Enterprise Ontology and the artifacts generated from
contract analytics. The IT functional model dynamically
binds with security requirements and security
implementations and generates deployable policies. The
policies are deployed and evidence is collected. The
evidence is evaluated against parameterized enterprise
operational goals and requirement thresholds. If drift is
detected, modifications to contractual items and the
Enterprise Ontology are suggested. The process iterates until
the measures fall within a defined tolerance of the
threshold (200, 210, 220, 230, 240, 250, 260, 270, 280).
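The iterative portion of the flow (steps 200-280) can be sketched as a loop that adjusts until the evidence falls within tolerance of the thresholds. The metric name and the adjustment callback below are hypothetical illustrations of the suggested contract/ontology modification step:

```python
# Sketch of the Figure 3 iteration: deploy, collect evidence, compare
# against parameterized thresholds, adjust, repeat until within
# tolerance. Metric names and the adjust() step are illustrative only.

def within_tolerance(evidence, thresholds, tolerance):
    """True when every measured value is within tolerance of its threshold."""
    return all(abs(evidence[k] - thresholds[k]) <= tolerance for k in thresholds)

def iterate_compliance(evidence, thresholds, tolerance, adjust, max_rounds=10):
    """Iterate contract/ontology adjustments until evidence meets thresholds."""
    rounds = 0
    while not within_tolerance(evidence, thresholds, tolerance) and rounds < max_rounds:
        evidence = adjust(evidence)  # e.g., a suggested contract modification
        rounds += 1
    return evidence, rounds

# Example: drive a single hypothetical "risk" metric toward its threshold.
evidence, rounds = iterate_compliance(
    {"risk": 0.9}, {"risk": 0.5}, tolerance=0.1,
    adjust=lambda e: {"risk": e["risk"] - 0.2})
```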
5. ONTOLOGY BASED COMPLIANCE VALIDATION EXAMPLE
Security incidents involving data breaches present a wide array
of legal problems for victim companies. Data breach
notification laws vary widely across states in their
definition of personal information, identification of
notification triggers, method and content of notification,
and determination of time and acceptable delays.
A meta model for compliance validation and evaluation
i.e. Compliance-Ontology is proposed, based on which,
regulation constraints can be modeled into OWL axioms
and SWRL rules. An activity (Data-Sensitivity-Assessment-
Activity) from the Data Breach Notification Process Flow
has been used to show how state statute provisioned
regulatory constraints are applied to validate activity result
compliance. This meta model is influenced by “Ontology-
based semantic modeling of regulation constraint for
automated construction quality compliance checking”
(Zhong, Ding, et al., 2012). The Compliance Checking Ontology
serves as a meta model, defining the concepts and relations
related to the IT Security regulatory compliance checking.
Analysis-Task class is the central concept in this Ontology.
An Analysis-Task is set according to the specific regulation
constraint. An Analysis-Task can be related to the Analysis-
Object through the “hasAnalysisObject” property, which
indicates that the Analysis-Object will be inspected to
ensure its compliance with the relevant regulation constraints
through the execution of the Analysis-Task. The Analysis-
Object refers to any concepts governed by regulations and
indicates what is to be inspected; in the IT Security
regulatory compliance domain the entities include
identification, evaluation, and remediation processes
(activities and procedures), data security products, and the
resources used in analysis. An Analysis-Object may include
a set of violation Analysis items. These analysis items can
be identified from the regulation provisions. For example,
The NYS Information Security Breach and Notification Act
comprises section 208 of the State Technology Law
and section 899-aa of the General Business Law. Section
899-aa states that “(c) "Breach of the security of the
system" shall mean unauthorized acquisition or acquisition
without valid authorization of computerized data that
compromises the security, confidentiality, or integrity of
personal information maintained by a business. Good faith
acquisition of personal information by an employee or agent
of the business for the purposes of the business is not a
breach of the security of the system, provided that the
private information is not used or subject to
unauthorized disclosure.
In determining whether information has been acquired,
or is reasonably believed to have been acquired, by an
unauthorized person or a person without valid authorization,
such business may consider the following factors, among
others:
(1) Indications that the information is in the physical
possession and control of an unauthorized person, such as a
lost or stolen computer or other device containing
information …
(d) "Consumer reporting agency" shall mean any person
which, for monetary fees, dues, or on a cooperative
nonprofit basis, regularly engages in whole or in part in the
practice of assembling or evaluating consumer credit
information or other information on consumers for the
purpose of furnishing consumer reports to third parties, and
which uses any means or facility of interstate commerce
for the purpose of preparing or furnishing consumer
reports.
2. Any person or business which conducts business in New
York state, and which owns or licenses computerized data
which includes private information shall disclose any
breach of the security of the system following discovery or
notification of the breach in the security of the system to any
resident of New York state whose private information was,
or is reasonably believed to have been, acquired by a
person without valid authorization.” From these provisions,
the analysis items include determination of personal
information, identification of notification triggers, method
and content of notification, determination of time and
acceptable delays (which vary widely across states), and so on.
Furthermore, an Analysis-Task needs a set of
Analysis-Item-Checking-Actions to test and collect the
conformance information/data for the analysis items. Each
Analysis-Item-Checking-Action has a Checking-Result, which
represents the actual violation/conformance/compliance
information collected. Similarly, an Analysis-Task needs a
set of Evaluation-Tasks to evaluate the provenance of those
analysis items in accordance with the Evaluation-Criteria.
The Evaluation-Criteria are imposed by the regulation
provisions or set by domain experts. Based on the
Checking-Result and the Evaluation-Criteria, the
Evaluation-Task judges whether the analysis items are
compliant with the regulation constraints. Each
Evaluation-Task has an Evaluation-Result, which together
constitute the Analysis-Report. The Analysis-Report of a
particular Analysis-Task for the corresponding
Analysis-Object can be documented, based on the
Evaluation-Results of all the inspection items. In Compliance
Ontology, the Regulation-Constraint constitutes the main
analysis knowledge, since the focus is regulation-based
compliance analysis. Each constraint comes from the
corresponding provision text in regulations. The relation
“hasRegulation” associates the constraint with the provision
text from which the constraint is extracted. Meanwhile, an
Analysis-Task must be assigned to a Position as its
responsibility; the holder of that Position performs the
Analysis-Item-Checking-Actions and the Evaluation-Tasks to
accomplish the Analysis-Task. In addition, many parameters,
such as business process parameters, IT functional and
realization parameters, user behavioral parameters and so on,
are used to depict the compliance features/state in the IT
security regulatory compliance domain.
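The checking/evaluation chain described above can be sketched as follows. The data shapes and the 30-day limit in the example are illustrative assumptions, not values from a specific statute or from the implementation:

```python
# Sketch of the ontology's checking/evaluation chain: each
# Analysis-Item-Checking-Action yields a Checking-Result, an
# Evaluation-Task judges it against the Evaluation-Criteria, and the
# Evaluation-Results are assembled into an Analysis-Report.

def evaluation_task(checking_result, criteria):
    """Judge one Checking-Result against the Evaluation-Criteria."""
    return {"item": checking_result["item"],
            "compliant": criteria(checking_result["value"])}

def analysis_report(checking_results, criteria):
    """Assemble the Evaluation-Results into an Analysis-Report."""
    results = [evaluation_task(r, criteria) for r in checking_results]
    return {"evaluation_results": results,
            "compliant": all(r["compliant"] for r in results)}

# Example: a hypothetical criterion that notification delay (in days)
# must not exceed an assumed statutory limit of 30 days.
checks = [{"item": "notification-delay-incident-1", "value": 12},
          {"item": "notification-delay-incident-2", "value": 45}]
report = analysis_report(checks, criteria=lambda v: v <= 30)
```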
As shown in Fig. 4, the Analysis-Object can be the IT
functional model, IT security model, IT configuration model,
IT security products, business processes, user activities and
so on. Each main concept indicates one facet of the analysis
objects, and can be modeled in the IT Security process
ontology. In the Compliance Ontology, the Analysis-Object
concepts (enveloped with the dashed line, as shown in
Fig. 4) are also concepts of the IT Security process model.
Through the Analysis-Object concept, the Compliance
Ontology for compliance checking can interact with the IT
Security process model. The meta model provides general
terms and relations common to IT Security compliance
checking against the regulatory requirements domain. Based
on the meta model, the specific domain model for security
compliance checking can be obtained by specializing and
instantiating the generic concepts and relations in the meta
model. Since the meta model is not limited to any specific
IT Security domain, it can be reused independently of any
specific security implementation. Based on the meta model
and the ontology, the constraints knowledge imposed by the
regulations can be clearly and unambiguously defined such
that it may be interpreted by a machine.
[Figure: the Compliance Checking Ontology — Analysis-Task, Regulation-Constraint, Regulation, Deontic-Constraint, Analysis-Object, Analysis-Item-Checking-Action, Checking-Result, Evaluation-Criteria, Evaluation-Task, Evaluation-Result, Compliance-Report, Parameter and Role, connected by properties such as hasAnalysisTask, isRegulatedBy, hasReference, hasEvaluationCriteria, hasEvaluationResult, hasAnalysisItemComplianceCheckingAction, hasCheckingResult, isResponsibilityOf, performAnalysis, performEvaluation and isComposedOf; the Analysis-Object includes process-model concepts (resource, product, activity).]
Fig 4: Compliance checking Ontology
Here, sensitive data breach notification process
compliance analysis is presented as an example to
demonstrate the approach. Based on the Compliance-Ontology, regulation
constraints can be modeled into OWL axioms and SWRL
rules. An activity (Data-Sensitivity-Assessment-Activity)
from the Data Breach Notification Process Flow has been
used to show how state statute provisioned regulatory
constraints are applied to validate activity result compliance.
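An SWRL encoding of such a constraint takes roughly the following shape. The rule below is a hypothetical sketch using the ontology's concept names; the properties containsPersonalInformation and hasNotificationTrigger are illustrative placeholders, not rules taken from the implementation:

```
Data-Sensitivity-Assessment-Activity(?a) ^ isUsedIn(?i, ?a) ^
hasCheckingResult(?a, ?r) ^ containsPersonalInformation(?r, true)
-> hasNotificationTrigger(?i, true)
```

Read as: if a data-sensitivity assessment of incident ?i yields a checking result indicating personal information was involved, the incident triggers the statutory notification obligation.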
[Figure: the Data Breach Notification Analysis Process and its activities — Incident-Investigation-Activity directly before Data-Sensitivity-Assessment-Activity, followed by Restore-System-Security-Activity, Notification-Activity, and others, with the incident used in the analysis. Annotations: State determines the statutes pertaining to the definition of personal information, notification triggers and notification content; Incident-timestamp determines the acceptable delay.]
Fig 5: Data breach notification process flow
[Figure: a Compliance Checking Ontology instance for the data breach notification process — Data-Breach-Notification-Analysis-task1 (regulated by Title 15, United States Code; the responsibility of investigation personnel, law enforcement, and John Doe) hasAnalysisItemComplianceCheckingAction Data-Sensitivity-Checking-Action_1, which hasCheckingResult Data-Sensitivity-Checking-Result_1; Data-Sensitivity-Evaluation-Action_1 yields Data-Sensitivity-Evaluation-Result_1, composed into Data-Breach-Notification-Analysis-Report1, judged against a compliance acceptance standard.]
Fig 6: Compliance Checking Ontology and Data Breach Notification process instance
Based on the Compliance Ontology and the IT Security process, each compliance analysis task can be modeled as an ontology instance. Fig. 6 shows the Compliance Ontology instance for sensitive data breach notification process compliance checking. To make the ontology knowledge understandable to both machines and humans, it is described in OWL, the W3C-recommended language for ontology representation on the semantic web. OWL offers a relatively high level of expressivity while remaining decidable. In addition, as a formal language with description-logic-based semantics, OWL enables automatic reasoning about inconsistencies between concepts and provides an RDF/XML syntax to represent ontology knowledge.
5.1 Existential restriction
The existential restriction "one Analysis-Task has at least one Analysis-Item-Checking-Action" can be modeled with the following axioms:
Axiom A1. "Analysis-Task hasAnalysisItemComplianceCheckingAction only Analysis-Item-Checking-Action"
Axiom A2. "Analysis-Task hasAnalysisItemComplianceCheckingAction min 1"
These can be expressed in the OWL format below:
<owl:Class rdf:ID="Analysis_Task">
  <rdfs:subClassOf>
    <owl:Restriction>
      <owl:onProperty>
        <owl:ObjectProperty rdf:ID="hasAnalysisItemComplianceCheckingAction"/>
      </owl:onProperty>
      <owl:allValuesFrom>
        <owl:Class rdf:ID="Analysis-Item-Checking-Action"/>
      </owl:allValuesFrom>
    </owl:Restriction>
  </rdfs:subClassOf>
  <rdfs:subClassOf>
    <owl:Restriction>
      <owl:onProperty>
        <owl:ObjectProperty rdf:about="#hasAnalysisItemComplianceCheckingAction"/>
      </owl:onProperty>
      <owl:minCardinality rdf:datatype="http://www.w3.org/2001/XMLSchema#int">1</owl:minCardinality>
    </owl:Restriction>
  </rdfs:subClassOf>
  <rdfs:subClassOf rdf:resource="http://www.w3.org/2002/07/owl#Thing"/>
</owl:Class>
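Read operationally, the two axioms amount to a simple check on each Analysis-Task instance: every filler of hasAnalysisItemComplianceCheckingAction must be an Analysis-Item-Checking-Action (A1), and there must be at least one filler (A2). The following Python sketch illustrates this; the instance data is hypothetical and not part of the paper's toolchain.

```python
# Hypothetical instance data: task name -> list of (filler, filler class) pairs
# for the property hasAnalysisItemComplianceCheckingAction.
tasks = {
    "Data-Breach-Notification-Analysis-task1": [
        ("Data-Sensitivity-Checking-Action_1", "Analysis-Item-Checking-Action"),
    ],
    "Orphan-Analysis-task": [],  # violates Axiom A2 (min 1)
}

def check_axioms(fillers):
    """Return the axioms violated by one Analysis-Task's property fillers."""
    violations = []
    if len(fillers) < 1:  # Axiom A2: hasAnalysisItemComplianceCheckingAction min 1
        violations.append("A2")
    for value, cls in fillers:  # Axiom A1: fillers only of the checking-action class
        if cls != "Analysis-Item-Checking-Action":
            violations.append("A1")
    return violations

report = {task: check_axioms(f) for task, f in tasks.items()}
```

In an OWL reasoner the min-cardinality check only applies under closed-world assumptions about the fillers; this sketch makes that assumption explicit.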
5.2 Constraints
The constraint "Personal Information must contain the consumer's name and at least one of the following: Social Security Number, Driver's License Number or State Identification Card Number, credit card number, debit card number, account number, and any codes or passwords (from the State Data Breach Notification Law)" can be modeled in the following Axiom A:
<owl:Class rdf:ID="Personal_Information">
  <rdfs:subClassOf>
    <owl:Restriction>
      <owl:onProperty rdf:resource="#consumer_fname"/>
      <owl:minCardinality rdf:datatype="http://www.w3.org/2001/XMLSchema#int">1</owl:minCardinality>
    </owl:Restriction>
  </rdfs:subClassOf>
  <rdfs:subClassOf>
    <owl:Class>
      <owl:unionOf rdf:parseType="Collection">
        <owl:Class rdf:about="#Social_Security_Number"/>
        <owl:Class rdf:about="#Driver_License_Number"/>
        <owl:Class rdf:about="#Credit_Card_Number"/>
        ......
      </owl:unionOf>
    </owl:Class>
  </rdfs:subClassOf>
</owl:Class>
Axiom B.
<owl:Class rdf:ID="Breached_Information">
  <rdfs:subClassOf>
    <owl:Restriction>
      <owl:onProperty rdf:resource="#hasType"/>
      <owl:hasValue rdf:resource="#personal_information"/>
    </owl:Restriction>
  </rdfs:subClassOf>
</owl:Class>
Typical constraints in the regulation occur in the form of
rules. However, OWL only provides a basic, standard level
of reasoning, limited to a certain level of complexity. When
a more complex logical reasoning is necessary, one may
need to build his or her rules in a more dedicated rule
language. Several rule languages, such as Semantic Web
Rule Language (SWRL), the Rule Interchange Format (RIF)
and the N3Logic language, have been developed to express
such logic. SWRL is a good candidate to represent the
constraints rules, since SWRL rule language is
tightly integrated with OWL and the predicates in SWRL
rules may be OWL-based classes or properties. The use of
SWRL rules along with OWL axioms results in more
powerful constraints and intuitive inferring capability,
which could not be achieved through the use of axioms
alone. In addition, the SWRL rule representation enables
that the quality inspection information is separated from
regulation constraint knowledge, and provides the level of
flexibility needed so that the user can add or modify the set
of governing rules and regulations. This feature is useful,
since the regulation often changes. Another benefit is that
SWRL is a descriptive language that is independent of any
rule language internal to rule engines, which decouples the
rules from the technical implementation of the rules engine.
For example, the constraint "The system security configuration must be restored before the notification goes out" can be modeled in Axioms C1 and C2 and SWRL Rule 1:
Rule 1. Notification-Activity(?CT_na) ∧ Restore-System-Security-Activity(?CT_rss) → isDirectlyBefore(?CT_na, ?CT_rss)
Axiom C1. Notification-Activity isDirectlyBefore only
Restore-System-Security-Activity
Axiom C2. Notification-Activity isDirectlyBefore exactly 1
Here, Rule 1 is written in terms/concepts from the Data
breach notification process model. Rule 1 indicates that the
existence of one instance of the Notification-Activity
implies the existence of one corresponding instance of the
Restore-System-Security-Activity, and the constraints
represented by object property isDirectlyBefore should be
met.
In the Data Breach Notification process definition, people may mistakenly pair the instance of one notification process (call it notification process A) with the instance of another (call it notification process B). At the instance level, the reasoner would consider the relationship isDirectlyBefore between the instance Ins_Notification-Activity_a of class Notification-Activity and the instance Ins_Restore-System-Security-Activity_b of class Restore-System-Security-Activity to be acceptable. Obviously, this is not logical, as shown in Fig. 7. Given that two Data Breach Notification processes A and B occur at the same time, the property isDirectlyBefore would hold between the two instances Ins_Notification-Activity_a and Ins_Restore-System-Security-Activity_b. However, this violates Axioms C1 and C2; therefore, we can preclude the mismatch.
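The mismatch detection above can be sketched procedurally: with each activity instance tagged by its owning process (a hypothetical representation, for illustration), a Notification-Activity instance is flagged when its isDirectlyBefore fillers violate "exactly 1" (Axiom C2) or cross process boundaries.

```python
from collections import defaultdict

# Hypothetical instance data: which process each activity instance belongs to.
process_of = {
    "Ins_Notification-Activity_a": "A",
    "Ins_Restore-System-Security-Activity_a": "A",
    "Ins_Notification-Activity_b": "B",
    "Ins_Restore-System-Security-Activity_b": "B",
}
# Asserted isDirectlyBefore edges, including the mistaken cross-process edge.
is_directly_before = [
    ("Ins_Notification-Activity_a", "Ins_Restore-System-Security-Activity_a"),
    ("Ins_Notification-Activity_a", "Ins_Restore-System-Security-Activity_b"),  # mismatch
    ("Ins_Notification-Activity_b", "Ins_Restore-System-Security-Activity_b"),
]

def axiom_c_violations(edges):
    """Flag instances whose isDirectlyBefore fillers violate 'exactly 1'
    (Axiom C2) or point into a different process instance."""
    fillers = defaultdict(list)
    for subj, obj in edges:
        fillers[subj].append(obj)
    bad = []
    for subj, objs in fillers.items():
        if len(objs) != 1:  # Axiom C2: exactly 1
            bad.append(subj)
        elif process_of[subj] != process_of[objs[0]]:  # cross-process pairing
            bad.append(subj)
    return bad

violators = axiom_c_violations(is_directly_before)
```

Here Ins_Notification-Activity_a is flagged, mirroring how the reasoner rejects the configuration of Fig. 7.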
[Fig 7 contrasts isDirectlyBefore at the class level (Notification-Activity isDirectlyBefore Restore-System-Security-Activity) with the instance level, where the cross-process edge from Ins_Notification-Activity_a to Ins_Restore-System-Security-Activity_b is ruled out.]
Fig 7: isDirectlyBefore at class/instance level
During the analysis stage, namely during the execution of
the Data Breach analysis process, it is necessary to assure
the corresponding analysis actions and evaluation are done,
and any discrepancies should be detected in time to prevent
rework and cost increase. Based on the ontology, we can
divide the analysis task constraints into a set of SWRL rules.
For example, the deontic provision: “In the process
of Data Breach Notification, it is necessary to verify that
data breach actually took place according to the state
statute… identify owner of data”, can be modeled in the
following two SWRL rules.
Rule 2-1. Data-Breach-Notification-Task(Data-security-task1) → needsAnalysis(Data-security-task1, true)
Rule 2-2. Data-Breach-Notification-Task(Data-security-task1) ∧ needsAnalysis(Data-security-task1, true) → Data-Breach-Notification_Analysis-Task(Data-Breach-Notification-Analysis-task1) ∧ hasAnalysisTask(Data-security-task1, Data-Breach-Notification-Analysis-task1)
Rule 2-1 indicates that once the analysis of data security starts, the person in charge of the analysis will be reminded that Data-Security-task1 should be analyzed. When Rule 2-2 executes, the instance Data-Breach-Notification-Analysis-task1 is assigned to the class Data-Breach-Notification_Analysis-Task via the property hasAnalysisTask.
Similarly, the constraint(s) “the owner of the data should
be identified” can be modeled in Rule 3-1/2, which indicates
that the analysis task for Data security breach includes the
checking action of the analysis item.
Rule 3-1. Data-Breach-Notification_Analysis-Task(Data-Breach-Notification-Analysis-task1) → Analysis-Item-Checking-Action(Data-Sensitivity-Checking-Action_1) ∧ hasAnalysisItemComplianceCheckingAction(Data-Breach-Notification-Analysis-task1, Data-Sensitivity-Checking-Action_1)
Rule 3-2. Data-Breach-Notification_Analysis-Task(Data-Breach-Notification-Analysis-task1) → Evaluation-Task(Data-Sensitivity-Evaluation-Action_1) ∧ hasAnalysisItemComplianceEvaluationAction(Data-Breach-Notification-Analysis-task1, Data-Sensitivity-Evaluation-Action_1)
Furthermore, based on Rule 4-1/2, the checking action
and evaluation for some specific quality inspection items,
which comprise the inspection task, can be assigned to the
corresponding data security analysis task.
Rule 4-1. Data-Security-Task(?dst) ∧ Analysis-Task(?at) ∧ hasAnalysisTask(?dst, ?at) ∧ Checking-Action(?cha) ∧ hasAnalysisItemComplianceCheckingAction(?at, ?cha) → ComplianceCheck(?cha, ?dst)
Rule 4-2. Data-Security-Task(?dst) ∧ Analysis-Task(?at) ∧ hasAnalysisTask(?dst, ?at) ∧ Evaluation-Task(?et) ∧ hasAnalysisItemComplianceEvaluationAction(?at, ?et) → ComplianceEvaluate(?et, ?dst)
Once Rule 4-1 is fired, the property "ComplianceCheck"
of the data security task Data-Security-task1 is
automatically deduced and filled in. Obviously, based on
these rules, the applicable compliance requirements can be
translated into a set of analysis tasks to be performed by
Compliance Auditors. This enables the regulatory
compliance checking to be a parallel activity to the data
security, rather than an afterthought. Meanwhile, it also
facilitates the integration of the regulation deontic knowledge with the data security process. Once the inspection task for a data security task is determined, the checking actions are performed to collect quality data. The quality data is then evaluated against the acceptance criteria imposed by regulations to decide whether the inspection objects comply with the quality acceptance criteria. If the difference between the actual inspection results and the acceptance criteria exceeds a certain range, the inspection objects are identified as quality defects and need to be investigated or reworked. Here, the general rules are defined as follows:
Rule 5-1. Analysis-Item-Checking-Action(?cha) ∧ hasActualDeviation(?cha, ?ad) ∧ Evaluation-Criteria(?ec) ∧ hasPermissibleDeviation(?ec, ?pd) ∧ swrlb:lessThan(?ad, ?pd) ∧ Evaluation-Task(?et) ∧ hasEvaluationCriteria(?et, ?ec) ∧ Checking2Evaluation(?cha, ?et) → hasAnalysisItemComplianceEvaluationResult(?et, "isSatisfied") ∧ Satisfied-Entity(?cha)
Rule 5-2. Analysis-Item-Checking-Action(?cha) ∧ hasActualDeviation(?cha, ?ad) ∧ Evaluation-Criteria(?ec) ∧ hasPermissibleDeviation(?ec, ?pd) ∧ swrlb:greaterThanOrEqual(?ad, ?pd) ∧ Evaluation-Task(?et) ∧ hasEvaluationCriteria(?et, ?ec) ∧ Checking2Evaluation(?cha, ?et) → hasAnalysisItemComplianceEvaluationResult(?et, "isUnSatisfied") ∧ UnSatisfied-Entity(?cha)
Note that the object property hasPermissibleDeviation is a
sub-property of hasEvaluationCriteria. The object property
hasActualDeviation is a sub-property of hasCheckingResult.
After executing Rules 5-1/2, the analysis items whose ActualDeviation is less than the PermissibleDeviation are classified as Satisfied-Entity, meaning the compliance criteria satisfy the requirements. Otherwise, the compliance analysis items are classified as UnSatisfied-Entity, meaning that further measures (investigation or rework) need to be taken.
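The deviation comparison at the heart of Rules 5-1/2 can be sketched in plain Python; the checking-action names and deviation values below are hypothetical, for illustration only.

```python
def evaluate(checking_actions, permissible_deviation):
    """Classify each checking action as satisfied or unsatisfied by comparing
    its actual deviation against the permissible deviation (Rules 5-1/5-2)."""
    satisfied, unsatisfied = [], []
    for name, actual in checking_actions.items():
        if actual < permissible_deviation:  # swrlb:lessThan, Rule 5-1 fires
            satisfied.append(name)
        else:                               # otherwise Rule 5-2 fires
            unsatisfied.append(name)
    return satisfied, unsatisfied

# Hypothetical checking results evaluated against a permissible deviation of 0.05.
ok, rework = evaluate(
    {"Data-Sensitivity-Checking-Action_1": 0.02,
     "Access-Log-Checking-Action_1": 0.09},
    0.05,
)
```

Items in the second list correspond to UnSatisfied-Entity instances that would be routed to investigation or rework.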
The implementation environment is shown in Fig. 8. The ontology editor enables users to load and save OWL and RDF ontologies; edit and visualize classes, properties, and SWRL rules; define logical class characteristics as OWL expressions; execute reasoners such as description logic classifiers; and edit OWL individuals. The actual reasoning is conducted by the rule engine, which converts a combination of OWL and SWRL into new facts. Inferences are carried out by the rule engine's inference engine by matching the facts in working memory against the rules in the rule base. If the inference engine infers knowledge using forward chaining, the new knowledge can be used for further inference or for querying stored or inferred knowledge.
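The forward-chaining loop described above can be sketched as a naive fixpoint: keep applying rules until no rule adds a new fact. The rule encoding below is a simplified stand-in for OWL+SWRL, with facts as tuples; it restates Rules 2-1 and 2-2 from this section.

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly apply (condition -> conclusion)
    rules until no rule adds anything, mirroring the rule-engine loop."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)  # new knowledge feeds further inference
                changed = True
    return facts

# Rules 2-1 and 2-2 restated as Python predicates over tuple-shaped facts.
rules = [
    (lambda f: ("Data-Breach-Notification-Task", "Data-security-task1") in f,
     ("needsAnalysis", "Data-security-task1")),
    (lambda f: ("needsAnalysis", "Data-security-task1") in f,
     ("hasAnalysisTask", "Data-security-task1",
      "Data-Breach-Notification-Analysis-task1")),
]
inferred = forward_chain(
    {("Data-Breach-Notification-Task", "Data-security-task1")}, rules)
```

Note how the fact produced by the first rule enables the second, which is exactly the chaining behavior the rule engine provides.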
[Fig 8 shows the components: the Ontology Editor maintains the ontology (classes, instances, axioms) and SWRL rules in a fact/knowledge base; a Reasoner and Rule Engine consume these facts and rules to produce new knowledge.]
Fig 8: Soft Environment for Semantic Analysis
6. SECURITY ANALYTICS AND MACHINE LEARNING
The heart of the CaaS solution is the ability to process massive amounts of structured and unstructured data using a Big Data analytics approach. CaaS analytics can analyze both explicit and implicit knowledge and tie them together to discover new contexts and new facts. This is essential for automated policy generation and security control mapping.
7. INFORMATION PROVENANCE FRAMEWORK
Compliance automation is composed of complex interactions between actionable policies and processes. A policy must be decomposable into discrete action plans, which in turn should be mapped to domain-specific security controls. Processes responsible for interrogating security controls for the current state of the operation, validating compliance conformance, and performing actions to bring components out of non-compliance should generate process execution metadata to trace the execution path. This metadata can be analyzed to understand process integrity and behavior and the accuracy of decisions. Learning from these analyses can then be formalized and fed back to the CaaS controller, which may modify or control process behavior through domain-specific dynamic policies or process modification. Figure 9 shows the high-level components of the metadata-driven provenance framework.
[Fig 9 shows process execution metadata, compliance metadata, and process definition metadata feeding metadata analytics and CaaS event-cube analytics, which drive policy generation and process model simulation.]
Fig 9: Metadata driven provenance framework
8. FORENSIC ANALYSIS
Forensic analysis of a compromised virtual machine (VM) provides critical information about 'holes' in the security controls in place to protect the assets. Time-based correlation of system and application data gives deeper insight into the root cause of compliance drift.
We needed a continuous, non-intrusive monitoring system that provides disk and memory state introspection, and fingerprint generation and indexing from image snapshots. A snapshot preserves the current state of the configuration; analyzing it requires no intra-VM agents and leaves the running VM instance unperturbed. We envisioned building a "System as Data" meta-layer that would enable querying a datacenter much as one queries the web, together with multi-dimensional analysis of event cubes in conjunction with process metadata and compliance metadata.
[Fig 10 shows VM introspection: a crawler VM on the hypervisor introspects client VM memory and disk through a connector, with no in-VM agent; crawled and enriched features feed the Origami service knowledge base, which CaaS uses for forensic drilling, drift analysis, and root cause analysis.]
Fig 10: VM Introspection
Figure 10 shows the high-level architecture of Origami (Bala, 2013), an IBM Research solution for agent-less monitoring through introspection. Figures 11 and 12 show a typical feature set present in an Origami Discovery Frame.
[Fig 11 shows a filesystem configuration frame. The crawl captures the guest filesystem tree (os, package, file) as a nested config frame, for example:
os: { ipaddr: '9.xx.xx.x', mount-points: { '/dev/vda1': 'ext3', '/dev/vda2': 'ext4' }, ... }
package: { tomcat6: { version: '6.0.2', vendor: 'apache', arch: 'x86_64' }, ... }
file: { '/etc/hosts': { permission: '-rw-r--r--', size: 236, user: 'root', name: 'hosts', group: 'wheel', mtime: '2010:04:11:09' }, ... } ]
Fig 11: Filesystem configuration frame
Agentless access to this vast wealth of configuration
information enables Origami to perform analytics at
multiple levels of granularity – files, products (aggregation
of files), images (aggregation of products) and systems
(aggregation of images). All these information sources are consumed by the CaaS controller through a "System as Data" semantic meta-layer. Data collected by Origami is also converted into a knowledge graph format. Beyond traditional time correlation, cube analysis, and data mining, a reasoning framework meshes the configuration knowledge graph with other information sources to unearth implicit knowledge. Together, these explicit and implicit analytics frameworks give deeper insight into forensic and root-cause analysis of non-compliance.
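One basic analytic over these crawled config frames is drift detection: diff two nested frames to find entries that changed between snapshots. The sketch below uses hypothetical frame data shaped like the filesystem frame of Fig. 11; it is an illustration, not the Origami implementation.

```python
def drift(baseline, snapshot, path=""):
    """Recursively compare two nested configuration frames and report
    (path, old value, new value) for every changed or missing entry."""
    changes = []
    for key in baseline.keys() | snapshot.keys():
        p = f"{path}/{key}"
        a, b = baseline.get(key), snapshot.get(key)
        if isinstance(a, dict) and isinstance(b, dict):
            changes.extend(drift(a, b, p))  # descend into nested frames
        elif a != b:
            changes.append((p, a, b))
    return changes

# Hypothetical snapshots: /etc/hosts became world-writable between crawls.
before = {"file": {"/etc/hosts": {"permission": "-rw-r--r--", "size": 236}}}
after  = {"file": {"/etc/hosts": {"permission": "-rw-rw-rw-", "size": 236}}}
changed = drift(before, after)
```

The reported path pinpoints exactly which configuration attribute drifted, which is the starting point for root-cause analysis.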
[Fig 12 shows a pattern configuration frame. The crawl captures a deployment pattern (httpd, was-dmgr, was-cell, db2) as a config frame, for example:
pattern: { httpd: { ipaddr: '9.25.34.1', vcpu: 2, vmem: 4, host: '9.45.22.4', posture: 'Internet facing', connectsTo: 'was-dmgr' }, was-dmgr: { ... }, was-cell: { ... }, db2: { ... } } ]
Fig 12: Pattern configuration frame
Another innovative cloud service is hosting replication and forensic analysis of infrastructure, platform, and software operational states and configuration events. The power to leverage federated cloud resources to analyze Big Data and intelligently automate repetitive tasks and predictable outcomes can optimize IT delivery, technical support, and service level management.
8.1 Cloud Service 1 - Problem and Change Optimization: Work Load Automation
This scenario leverages cloud resources to automate mature IT support best-practice procedures and processes. The cloud infrastructure hosts asynchronous registration and replication of distributed platforms to capture the sequential computation states (N-3, N-2, N-1, N) occurring on the registered platform. When a registered platform encounters
a known error condition, the cloud service launches services
to log/track defects and execute digital forensic analysis
using the replicated computation state history from that
platform. This allows root cause analysis to be proceduralized (automated) to a very high degree, because a complete history of change and configuration events is readily available. The replication data captures the computation states of each IT as a Service (ITaaS) layer, i.e. virtual infrastructure (HW), virtual platform (OS), and virtual applications (SW), preceding an executed instruction, computational event, or error condition. Analogies to this process can be drawn from two legacy IT practices: data replication, and software development functions such as Debug, Trace, Verbose, and STEP. In this service architecture, computation states are replicated across all the ITaaS layers and provide a literal audit trail of the key resource computation states of CPUs, co-processors, registers, cache, memory, paging, etc.
8.2 Cloud Service 2 - Platform & Application Performance Continuous Service Improvement
In scenario 2, cloud services "reverse engineer", harvest, and reuse key technical and operational characteristics of the infrastructure, platform, and software service layers. IT domains with established best practices and procedures are good service candidates. In this example, the installation and configuration of application software on distributed client systems could be safely recorded, recreated, tested, and automated. To start, the cloud captures state-based
replication data for a target system as each ITaaS layer is
applied. Special configurations or modifications to any of the layers are captured as the replication of discrete computational states runs throughout the target system build process. This capability is the foundation for
developing, optimizing, and delivering distributed platforms
and bundled software services from the Cloud. These
platform and application services can be logically extended
to optimize continuous improvement solutions in the Cloud.
The state based replication history can be used to repeatedly
recreate and test platform and application performance.
Replication history of computation states enables repeated
roll back and re-execution of key performance test scenarios
from any previous computation state. As the subsequent
computations are re-executed, key performance indicators
are measured, tuned, and retested to achieve optimal
performance across all ITaaS layers.
Fig 13: System replication and simulation services in the Cloud
9. COMPLIANCE REMEDIATION EXAMPLE
IBM's security policy for computing is dictated by corporate instruction; its purpose is to establish requirements for the protection of IBM's worldwide information technology services and the information assets they contain. Since all employees must observe this policy, running a workload in SoftLayer must adhere to it as well. We used this opportunity to test our solution framework and lighten the burden on the researchers.
We first identified a subset of security checks needed to
comply with IBM IT security requirements. We grouped
them into user-centric and server-centric checks so that the
controls and remediation actions are defined accordingly.
User-centric checks:

Compliance Check: Email addresses must have IBM Internet format, i.e. userid@cc[n].ccc.com, and be found in the Enterprise Directory.
Why: So activity can be linked back to an individual. Non-compliant users can optionally be set to Inactive status.

Compliance Check: SoftLayer accounts must be set with a password expiration less than or equal to the value in the policy configuration file.
Why: Reduces the risk of a brute-force password attack. Non-compliant users can optionally be set to Inactive status.

Compliance Check: PPTP VPN is not permitted.
Why: Non-compliant users can optionally be set to Inactive status.

Compliance Check: Only a limited number of users should have permission to manage security aspects of the account (VPN, firewalls, etc.).
Why: Least privilege is an important security principle.

Compliance Check: Unused SoftLayer Portal accounts are identified, and optionally disabled.
Why: Lack of regular and recent logon to the SoftLayer Portal may indicate unnecessary privilege.
Server-centric checks:

Compliance Check: Identify servers that have a public network interface.
Why: Remind users to limit use to the private network interface only, unless there is an explicit reason for public Internet connectivity.

Compliance Check: Servers running for more than a policy-controlled number of days.
Why: Remind users to remove servers that are no longer required. This saves the account money and reduces the potential attack surface. A secondary policy-controlled time interval reminds users to register their servers for IBM-mandated security scanning.

Compliance Check: Windows servers without McAfee anti-virus installed.
Why: Anti-virus is a basic security control.

Compliance Check: Servers unprotected by a firewall.
Why: Checks for server-based hardware firewalls, VLAN-based hardware firewalls, host-based software firewalls, and Vyatta gateway appliances. NOTE: a host-based software firewall can be a mitigating control for not requiring a hardware firewall.

Compliance Check: Servers with a public network interface and no SoftLayer vulnerability scan within a policy-controlled number of days.
Why: Vulnerability scanning is a security compliance requirement. Use of the SoftLayer vulnerability scanner is an acceptable interim measure before registration for internal vulnerability scanning. If there is no sufficiently recent vulnerability scan, the tool will initiate one automatically for each server that requires it.

Compliance Check: Servers with an SSHD port that is contactable on a public IP and has password authentication configured.
Why: Experience tells us that weak SSH passwords are a common attack vector that has caused Internet-facing Linux servers to be compromised in a very short period of time.

Compliance Check: Servers with insecure ports open to the Internet via a public network interface.
Why: Eliminate the use of insecure protocols in favor of secure alternatives, e.g. HTTPS instead of HTTP, SSH instead of TELNET and FTP. Also ensures that traditionally weak protocols such as NETBIOS are not enabled on an Internet-facing public network interface.

Compliance Check: Insecure ports open on public IP addresses on static and portable VLANs.
Why: Same check as above, but catches servers contactable from the Internet that are hosted on a self-managed hypervisor.
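Several of the server-centric checks above reduce to simple predicates over per-server inventory records. The sketch below evaluates a few of them against a mocked server record; the field names and policy values are hypothetical (in practice this data comes from the SoftLayer API via the Python adapter described below).

```python
# Hypothetical policy configuration values.
POLICY = {"max_uptime_days": 90}

def server_checks(server):
    """Evaluate a subset of the server-centric checks against one
    server record and return the list of findings."""
    findings = []
    if server["public_interface"] and not server["explicit_public_reason"]:
        findings.append("public interface without documented need")
    if server["uptime_days"] > POLICY["max_uptime_days"]:
        findings.append("running longer than policy allows")
    if server["os"] == "windows" and not server["antivirus"]:
        findings.append("missing anti-virus")
    if not server["firewall"]:
        findings.append("unprotected by any firewall")
    return findings

# Mocked inventory record for one server.
result = server_checks({"public_interface": True, "explicit_public_reason": False,
                        "uptime_days": 120, "os": "windows",
                        "antivirus": False, "firewall": True})
```

Each finding would then be routed to the corresponding remediation workflow (reminder, optional deactivation, or automatic scan initiation).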
[Fig 14 shows compliance remediation in SoftLayer: IBM IT security policies feed a policy and management plan translation and dissemination framework; the CaaS controller performs event correlation, root cause analysis, and drift analysis over a messaging bus spanning network, ops, and management infrastructure; and a "System as Data" meta-layer, IaaS/PaaS brokerage, and security and process control remediation reach SoftLayer through a SoftLayer Python adapter over SSL VPN.]
Fig 14: Compliance remediation in SoftLayer
Figure 14 shows the required customization in the policy
and control management domains. IBM IT security policies
are defined at a higher level using OpenPages. Appropriate
atomic business process workflows are defined at the
compliance orchestration level. These atomic flows belong
to various categories such as user and server centric
compliance checks, periodic collection of master entities
from target cloud environment, compliance remediation, etc.
Workflow activities are mapped to appropriate security
controls with appropriate domain specific annotations. All
the high-level activities are then translated into SoftLayer
API calls by the translation layer. The event collection,
control remediation overlay, and IaaS/PaaS brokerages leverage a SoftLayer Python adapter to access SoftLayer cloud Operational Support System (OSS) and Business Support System (BSS) information.
10. LESSONS LEARNED
The goal of this project was to create a solution architecture using policy-aware cloud brokerage, bi-directional IT event handlers, policy lifecycle management, policy-to-control and control-to-policy translation, etc., that truly aligns IT compliance operations with the "Compliance as Code" paradigm. Security is not a one-shot deal (e.g.,
“Let’s install this patch, anti-virus, audit mechanism and we
are done"). Security requires constant updates, but not every update is necessary, so updates must be applied judiciously. Some updates may cause already-running applications and middleware to stop functioning as expected, and these expectation mismatches can lead to serious compromise. Only tight, policy-based integration between
CaaS and the rest of the “as a Service” components can
alleviate this problem. Cloud brokerage is a new functional area that provides a common API to access various cloud providers' "as a Service" components, but it still lags in supporting all provider-specific IaaS, PaaS, and Software as a Service (SaaS) component features. For our SoftLayer
examples we implemented support for a set of security
controls that we identified in our IBM IT security guidelines
for SoftLayer.
Policy-driven hardening of host and guest configurations worked well with the private OpenStack cloud in SoftLayer. SoftLayer allowed manipulating the guest configuration for its Xen-based virtual machines only. In the absence of Software Defined Networking (SDN) and Software Defined Storage (SDS), we had limited programmable hardening capabilities.
Cloud technology and services have the potential to
optimize, automate, and simplify key ITIL processes and
best practices. Virtual self-service solutions maximize IT support using digital knowledge resources while improving the efficiency of skilled (but limited) human resources. The focus of SME (subject matter expert) resources can then be
continuous improvement of these same cloud services. IT service providers must follow one simple edict to achieve dramatic efficiency through IT workload automation: "optimize and simplify by modeling best practices, then automate repetitive and repeatable processes. Best practices is the key."
11. REFERENCES Schectman,Joel (2012), Electronic source with authors and publication
time, Retrieved January 15, 2014 from http://blogs.wsj.com/cio/2012/12/27/netflix-amazon-outage-shows-any –
company-can-fail/
Electronic source for references without author names. (2013). Retrieved
January 15, 2014, from
http://www.womencitizen.com/business-2/cloud-computing-will-exceed-
100b-in-2014-3162.html
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from
http://www.webopedia.com/TERM/S/software_defined_data_center_SDD
C.html
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from https://www.openstack.org/
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from http://www.interoute.com/what-iaas
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from www.softlayer.com
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from
http://www.itil.org/en/vomkennen/itil/servicetransition/servicetransitionpro
zesse/serviceassetconfigurationmgmt.php
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from
http://en.wikipedia.org/wiki/Security_information_and_event_management
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from http://www.interoute.com/what-paas
Electronic source for references without author names. (2001). Retrieved
September 12, 2005, from http://www-
01.ibm.com/software/analytics/openpages/
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from https://www.oasis-
open.org/committees/tc_home.php?wg_abbrev=xacml
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from http://radar.oreilly.com/2012/06/what-
is-devops.html
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from
http://www.itil.org/en/vomkennen/itil/servicetransition/servicetransitionpro
zesse/serviceassetconfigurationmgmt.php
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from http://www.itil.org/en/vomkennen/itil/servicetransition/servicetransitionpro
zesse/serviceassetconfigurationmgmt.php
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from http://dmtf.org/standards/cim
Lassila, O., Swick, R (1998), “Resource Description Framework (RDF)
model and syntax specification”, Working Draft, W3C
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from https://www.oasis-
open.org/committees/tc_home.php?wg_abbrev=tosca
Bala, V. (2013) ), Electronic source with authors and publication time,
Retrieved January 15, 2014 from
http://ibmresearchnews.blogspot.com/2013/07/virtual-image-library-
fingerprints-data.html#fbid=6dVVjXDs7qZ
Electronic source for references without author nor publication time.
Retrieved January 15, 2014, from
http://en.wikipedia.org/wiki/IT_as_a_service
What is SaaS? (n.d.). Retrieved January 15, 2014, from http://www.interoute.com/what-saas
Zhong, B.T., Ding, L.Y., Luo, H.B., Zhou, Y., Hu, Y.Z., & Hu, H.M. (2012). Ontology-based semantic modeling of regulation constraint for automated construction quality compliance checking. Elsevier.
Authors
Lorraine M Herger is the Director of
Integrated Solutions and CIO of IBM
Research. In this role, Lorraine is the
service provider for the IBM Research
Division. As part of being the Research CIO, Lorraine and
her team provide mobility services to the Research team,
and work with the IBM CIO office to develop new
technologies which can improve mobility services, as well
as the associated governance, policies and practices to
streamline the introduction of mobility across IBM.
Lorraine holds a BSEE from University of Maryland; BA,
Columbia University and MBA, Stern School of Business,
NYU. Ms. Herger is currently the President of the SWE-NY Professional Chapter, a Senior Member of the IEEE, and
an ABET (Accreditation Board for Engineering and
Technology) Board member, representing SWE.
Shakil M Khan is a Senior Software
Engineer with IBM Research. Shakil is a
Subject Matter Expert on IT Service
Management and IT Infrastructure
Library (ITIL). His current focus is the adoption of
Semantic Web and ontology technologies
for services computing, to enable a true “Service
Knowledge Management System” core and machine-learning-driven
intelligent IT process automation. He is also
working on building semantics-enhanced DevOps for next-generation
cloud. His other interest areas include big data
analytics, contextual computing, and cognitive computing.
Shakil holds a Master of Science in Mechanical Engineering
from Texas Tech University.
Matt McCarthy's 25+ year IT career has
been marked by steady advancement
in leadership positions on increasingly
complex IT projects. Matt is a recognized
leader and SME in IBM's IT Services
Management (ITaaS) and IT Asset
Management disciplines, with numerous awards for
technical innovation and thought leadership in the IT
architecture domain. He has maintained professional
certification as a Consulting IT Architect (Enterprise
Integration Strategy) since 2002 and was recently nominated
for appointment as an “IBM Master Inventor”. He received a
BS/CS degree in Software Systems Engineering from
Colorado Tech, graduating magna cum laude.