02 - Redgate Software - Compliant Database DevOps

Shift LEFT: The Database DevOps magazine from Redgate Software
Issue 02, Autumn/Winter 2017

DevOps and the database: what's going on?
All you ever wanted to know about GDPR
The language of DevOps ROI
SQL Clone: the aspirin for database provisioning headaches
Achieving DevOps success in financial services




2 Shift LEFT Issue 2

Visual Studio Enterprise 2017 now includes Redgate Data Tools – rolling your SQL Server and Azure SQL databases into DevOps for higher productivity, agility and cross-team performance.

Accelerate release cycles and increase collaboration by making your database part of the DevOps workflow.

Learn more: https://aka.ms/vsenterprise

Equip your teams for better collaboration


The reasons to Shift LEFT are increasing

DevOps has moved from the back room to the board room. Once the challenger threatening the status quo, it is now being seen as the enabler for change that will help to maintain the status quo.

New disruptors are emerging in the tech space, their ability to turn on a dime giving them a competitive advantage over bigger, slower players. Established companies with a defined market position are recognizing they are also entrenched in a confined space. New data regulations are looming that will force every company and organization to follow rigid guidelines.

DevOps can help – and the database is part of that. As Microsoft’s Donovan Brown points out in the lead article, “If you’re doing DevOps differently for your database and the rest of your system … there’s something wrong there.”

Truth is, you can introduce DevOps for both the application and the database by adding to the infrastructure you already have in place, rather than replacing it. At the same time, you’ll put in place the processes you need to match the disruptors on the one hand and meet regulatory requirements on the other.

Times are changing, but the technology is now in place to help you change with them.

Matt Hilbert, Editor

Contents

4 DevOps and the database: what’s going on?

6 Five challenges to scaling DevOps at enterprise level

8 All you ever wanted to know about GDPR

12 Why it’s time to think seriously about SQL Server 2017

13 Avoiding the slide from DevOps to DevOops

14 The language of DevOps ROI

16 SQL Clone: the aspirin for your database provisioning headaches

20 Achieving DevOps success in financial services

24 Bringing DevOps to the database – Moody’s Analytics and SQL Clone

26 You’re not delivering DevOps to the database

30 Find a DevOps partner

EDITOR

Matt Hilbert

COVER DESIGN

Anastasiya Rudaya

PHOTOGRAPHY

James Billings

http://jamesbillings.photography

DESIGN AND ARTWORK

Pete Woodhouse

CONTRIBUTORS

Tom Austin

William Brewer

Karis Brummitt

Coeo

Tony Davis

DevOpsGuys

Kate Duggan

Grant Fritchey

Ed Leighton-Dick

www.red-gate.com


When DevOps was first talked about in Flickr’s seminal ’10 deploys per day’ Velocity presentation in 2009, it was regarded by some as strange and alien to corporate culture. It was the antithesis to the accepted way of doing things, a threat to established and proven processes: it was downright dangerous.

Times have changed. Unicorns like Uber, Airbnb, Spotify and GitHub couldn’t live without it. Microsoft has embraced it and actively promotes it. It’s become the key to gaining competitive advantage in a technology marketplace that is moving faster and changing quicker than it ever has before.

But what about the database? Why are practices like version control, continuous integration and automated deployment being introduced to application development but left on the shelf when it comes to the database? The 2017 State of Database DevOps survey, for example, found that continuous integration was in place for 39% of applications, but fell to 20% for databases. What’s going on?

In search of some answers, I spoke to Donovan Brown, Principal DevOps Manager at Microsoft and DevOps advocate. Also known as The Man in the Black Shirt, his unofficial tagline is #RubDevOpsOnIt. He lives DevOps. In fact, you can’t stop him talking about it and enthusing about its advantages.

He describes DevOps as: “The union of people, process, and products to enable continuous delivery of value to our end users.” The most important of those three, according to Donovan, is people. It’s also, conversely, the most difficult to fix.

“People are creatures of habit.”

“If you’ve been doing something the same way for 30 years,” he says, “why would you fix it? We all resist change, even if change is for the better, and DevOps requires a lot of change.”

A large part of that needs to come from the Ops side of the equation because, while processes like version control and continuous integration have become second nature to developers, Ops teams don’t traditionally do them on a daily basis. In fact, Ops are often seen as the opponents of change.

Donovan does, however, hold his hand up as being partly responsible for the division. “Historically,” he says, “the Ops people have had to become gatekeepers because developers like me 20 years ago were wreaking havoc on production servers, running on there with admin credentials. Their job in my opinion has never been to deploy software for us. Their job is to protect and manage the infrastructure, to make sure it runs as efficiently as it can. That history, though, stays with us because there is so much regulation and red tape to get code from the fingers of developers to the hands of users.”

He sees DevOps as the solution to the problem. “What I’m hoping is that DevOps will help automate what we’re forcing them to do, which was never their job, and rebuild the trust between Dev and Ops. We really need to help Ops people realize that we’re not trying to replace them, we’re trying to empower and enable them to allow us to continuously deliver value.”

Interview

{ matt hilbert }

If you’re doing DevOps differently for your database and the rest of your system, there’s something wrong. There shouldn’t be a separate conversation about it, says Microsoft’s Donovan Brown.

DevOps and the database: what’s going on?


It’s not down to Ops teams on their own, however. C-level executives also have a part to play because they often favor a traditional approach to software development, with targets and long term strategies standing in the way of DevOps.

“What I’ve noticed in a lot of companies that are struggling to adopt DevOps is that they’re not getting the backing from the executive level,” Donovan says. “You have this waterfall mentality from the top that still wants Gantt charts and milestones, and things that just do not line up with agile thinking at all. I’ve had the best success implementing DevOps in organizations where the C-level executive got it. When you have a grassroots team who get it, but no one else does, they usually get snuffed out by tradition.”

The database is not separate from Donovan’s goal of people, process and products coming together. Prior to joining Microsoft, he was a Process Consultant and Scrum Master, helping a range of companies introduce DevOps to their development practices. He made sure the database was part of that.

Then, as now, including the database was a challenge, but Donovan worked around it, using the technologies available to allow customers to treat the database just like every other element in their DevOps solution.

He does, however, admit it was hard work. “It’s actually the most difficult part of the pipeline to automate because of the risk of failure,” he says. “You can easily roll back a front-end – you can’t do the same for your database because any transactions that have occurred during the deployment could potentially be lost. So, it’s a crucial, integral part of your DevOps pipeline and I challenge any company that claims to be doing DevOps that only does it for the front-end but their back-end is still done manually. To me, they’re faking it.”

“Infrastructure as code should not be a special snowflake. It’s just part of your solution.”

The key is to remember that the code written to make changes to a database is just that: code. Traditionally, it has been managed outside of and separate to application code, with database changes often managed on-the-fly, at the last minute. Consequently, deployments have become worrying at best, with teams on hand for the expected firefight against downtime.

Include it in version control, however – and continuous integration and automated deployments – and the database can be developed in parallel with applications. Problems are then picked up earlier, deployments become predictable and reliable, downtime and rollbacks are a thing of the past.
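The idea of treating database change code like application code can be sketched as a minimal, hypothetical migration runner: each versioned script lives in source control, and a history table records what has already been applied, so the same deployment step runs unchanged in development, CI and release. SQLite stands in for SQL Server here, and all names are illustrative, not from any Redgate tool.

```python
import sqlite3

# Version-controlled migration scripts, applied strictly in order.
MIGRATIONS = [
    ("001_create_customers", "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email", "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> list:
    """Apply any not-yet-applied migrations; return the versions that ran."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_history (version TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_history")}
    ran = []
    for version, sql in MIGRATIONS:
        if version not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_history VALUES (?)", (version,))
            ran.append(version)
    conn.commit()
    return ran

conn = sqlite3.connect(":memory:")
print(migrate(conn))  # both migrations run on a fresh database
print(migrate(conn))  # second run is a no-op: []
```

Because the history table makes the step idempotent, the identical command can run on every environment in the pipeline, which is what makes deployments predictable rather than hand-crafted.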

What has made this even more appealing is the tooling now available – software that in Donovan’s days as a consultant was only just emerging. As he admits: “The tooling had to mature to allow us to get over a lot of the hurdles that are unique to database deployments, so that we can have everyone incorporated into the DevOps pipeline, not just the unicorns. Everyone should have the power available to them to be able to take their databases and incorporate them as part of their DevOps pipeline. Until you do, you still have that being the bottleneck.”

There is a price to pay, however, if by introducing tools to bring DevOps to the database, companies get bogged down by what Donovan calls ‘the vendor explosion’.

“Every vendor you add to your pipeline also adds integration tax.”

Integration tax is the cost to companies, in terms of training and introducing unfamiliar ways of working, in order to use a new piece of software that resolves part of the DevOps puzzle. If that cost is too high by, for example, moving people out of the development environment they’re comfortable with, it betrays the ideals of DevOps. One onerous way of working is simply replaced by another.

Donovan is keen to remove that tax by encouraging companies to work with as few vendors as possible – and preferably those vendors who have already paid the integration tax in advance. By, for example, introducing tools that work inside or alongside existing software, thereby reducing the time it takes to become familiar with them, and making the most of the current infrastructure rather than adding to it.

Otherwise, Donovan concludes, “You’re paying tax for that integration. When we at Microsoft have partners who create extensions for our products, it’s the vendor and Microsoft who pay the tax in advance.”

Finally, I asked Donovan what the biggest challenge was for companies who adopt DevOps for the database. His answer was the most telling:

“If you’re doing DevOps differently for your database and the rest of your system, to me there’s something wrong there. When I think of DevOps, the database is part of that. I don’t need a separate conversation about it.”

This interview was originally published on DevOps.com


Five challenges to scaling DevOps at enterprise level

We and many in our industry believe successful digital transformation initiatives will need DevOps to be the driving force. DevOps techniques are already being used by 74% of technology professionals working today (at enterprise level, this figure rises to 81%), and IDC believes that 80% of the top 1,000 companies will be embracing DevOps practices by 2019.

DevOps scaling challenges

Many large enterprises launch small DevOps initiatives within certain departments, but subsequently find that scaling DevOps across the organization faces a number of challenges that must be overcome.

Digital transformation means a wholesale change that will rock the very foundations of your business model. Fear of the new can be crippling – a sense of the risk that may be involved with such a serious undertaking can put up cautionary barriers that hinder your organization’s path to progress and future survival.

Make no bones about it, the need to transform is urgent, especially for established companies still running legacy operations.

At an organizational level, keeping pace with the competition means developing and deploying new digital services; establishing a new operating model for IT that increases the speed at which new and/or altered products are brought to market, and improving operating efficiency.

This means DevOps.

DevOps is an emerging model of product delivery that facilitates higher and faster rates of change through the optimization of development and delivery processes. It’s a working culture that breaks down the traditional siloes between development, operations and all stakeholders in the delivery process, ultimately driving better business outcomes more rapidly.

{ devopsguys }

Opinion

Deciding that your organization needs to kick-start its future growth plans with a digital transformation initiative is as exciting as it is daunting. No matter what your industry – financial services, insurance, media, retail, travel – disruption is here and more is coming, and you know you must respond.


Many of the above challenges are simply perception issues. Changing perceptions, therefore, is key to enacting the changes that are required to enable the DevOps imperative for organizations wishing to survive and compete in the future.

Not just among the people around you, but right across development, operations, and every other stakeholder in the delivery process.

This is a guest article from DevOpsGuys, a UK DevOps consultancy which transforms and accelerates the way organizations deliver software.

Lack of sponsorship

In order for DevOps to succeed throughout an organization, buy-in is needed from senior leadership. The value of DevOps needs to be demonstrated, showing how it can be applied across the organization to drive profitable change. Ultimately, without senior endorsement, DevOps will not get the support and funding needed to scale.

Inflexible command and control structures

A fundamental principle of DevOps is to realize rapid progress and improvement. However, such is the hierarchical culture of command and control within many large enterprises that slow and complex approval processes stand in the way of DevOps even getting off the ground.

Over-dependence on static outsourcing models

The pace of technology advances has left many companies struggling to keep up. Knowledge gaps have emerged and project managers have relied too heavily on outsourcing to counterbalance in-house skills shortages. In such circumstances, organizations find themselves simply unequipped to suddenly change direction or try something new, and are therefore unable to respond to new customer demands.

Reinforced change-resistant culture

Success with DevOps will drive change across nearly every aspect of IT and many other parts of the business. But fear of such change – especially amongst long-serving staff – creates a culture of resistance. However, some staff members will be keen to learn new approaches and technologies, and will want to further their careers. If the organization cannot provide such opportunities, they will be attracted elsewhere, which can reinforce the change-resistant culture in the company they leave behind.

Fear of the “fail fast” principle

“Fail fast” is a concept that tends to raise eyebrows amongst those not familiar with the discipline. Failure is of course not a desirable outcome, but certain mistakes or failures are bound to creep in when building software – it’s just the nature of the practice, and a fact of life.

The point of “fail fast” is that if certain failures are inevitable, organizations should detect them quickly, fix the problem, and learn from it. Its unfortunate nomenclature appears to go against the grain, but in fact it is one of the most efficient principles in practice.

How you can scale DevOps to enable digital transformation


The GDPR aims to harmonize data protection in Europe and, notably, will affect any and every company and organization wishing to trade with Europe, or which holds personal information about individual Europeans.

Inevitably, equivalent legislation will be adopted by other countries outside the EU, either because of trade requirements or because it represents the most thorough definition of best-practice in managing personal data.

The GDPR incorporates the requirements of several international standards for information protection, and covers the rather different aspects of the rights of individuals to be informed, and have their data deleted and moved.

Essentially, the GDPR provides a broader definition of personal data and makes it far clearer what constitutes consent to the use of personal data. It increases the requirement for accountability and frankness from organizations that use data, including the need to report incidents of security breaches, allows users to move data from one organization to another, and requires a better standard of custodianship of data from organizations which hold personal information on individuals.

Prior to the introduction of GDPR, organizations which handle personal data will need to audit their data, identify personal data, prove that they have plans in place to deal with issues that breach privacy, and ensure their databases and data processing systems meet the basic rights of privacy in society. If they don’t, then it is time to put it right.

The GDPR defines two categories of personal data that require special handling:

Standard personal data includes names, addresses, phone numbers, mobile device IDs, IP addresses and cookie strings, and any data specific to the physical, physiological, mental, economic, cultural or social identity of that person, including shopping or web-surfing data. This is typical Big Data information.

‘Special’ personal data is data relating to racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, and health or sex life. Biometric data such as fingerprints, facial recognition and retinal scans, as well as genetic data is also included.

All you ever wanted to know about GDPR

With many well-publicized examples of the consequences of data breaches and data misuse, there is increasing public pressure for legislation on privacy and personal data that has enough clout to prosecute serious offenders. In the vanguard has been the EU Data Protection Directive, soon to be succeeded by the General Data Protection Regulation (GDPR). It defines IT practices for data that are likely to extend worldwide. William Brewer gives a rundown of the implications for IT practice.

Insight

{ william brewer }


How GDPR will affect individuals

The GDPR gives individuals the right to have their ‘private and family life, home and correspondence’ respected, and can be categorized as a right to be informed how data is being used, obliging organizations holding data to get explicit consent for using it.

It also gives individuals the right to access the data and check whether it is correct, and have it altered or erased if necessary.

If a decision is made automatically via profiling, on the basis of data held on an individual, they must be able to express their point of view, obtain an explanation of the decision, and challenge it.

Individuals should additionally be able to move their data between organizations securely where this is appropriate (for example, utility providers or those in the healthcare sector).

There are six specific areas companies and organizations need to be aware of.

Explicit consent to its use

Organizations may not process the personal information of individuals unless they have been freely given a specific, informed and unambiguous indication of consent, either by a statement or by a clear, affirmative action.

Erasure

Individuals have the right to request that their personal data is deleted or removed without undue delay. This is an exercise of the ‘right to be forgotten’.

This is not an unlimited right, but must be balanced against legal freedom of expression, the public interest in health, scientific and historical research, and the exercise or defense of legal claims.

Portability

If the storage of personal data was based on the user’s consent, or on a contract, then individuals will have the right to have their data transferred elsewhere in a ‘structured, commonly used, machine-readable and interoperable format’.

Breach notifications

GDPR requires organizations holding data on individuals to notify them of certain specified types of breaches within 72 hours of discovering the breach.

Protection of data, both by design and default

GDPR requires that data protection safeguards are integrated into products and services from the earliest stage of development, with privacy always the default option.

Transparency

Transparency means that individuals should be informed when data is collected about them, and that it should be easy for them to respond.

How GDPR will affect organizations

There are certain areas where complying with GDPR will prove technically or administratively difficult, even where an organization is committed to adopting the correct practices.

Storage of personal data

There must be effective encryption of data both in transit and at rest, and only secure remote access to data is allowed.

Erasure of data

In many database systems, there are practical problems in deleting personal data on request. Legacy systems can be tricky to adapt, for example, and new systems will need to be designed to erase data records upon request.

Data portability

Users must be able to transfer their data from one organization to another, with requests accommodated using a secure, standard, structured format.

Notification of breaches

The need to notify users of data breaches means organizations will need audit logs that enable them to determine where a hacker has gone on a network and what data has been breached.

Resilience

The GDPR insists on a high standard for data resilience, so that personal data is treated in compliance with all business continuity standards for the industry. It states that there must be a service level agreement that guarantees the provider of the data processing services has the ability to quickly restore data when necessary.

Security

Any employee with a high level of access to personal data needs to have had background checks. Organizations must also have a sensible policy to restrict access to data to only those who need it.



Identify the risks

The GDPR requires that ‘controllers’ and ‘processors’ of personal data make Data Protection Impact Assessments or Privacy Impact Assessments. This will require you to identify, analyze, and document the risks to privacy. These impact assessments are similar to those described in ISO 27001, and are there to allow organizations to identify and fix problems at an early stage.

Consolidate the location of personal data

The more that you can restrict the number of databases that handle personal data, the easier it will be to meet the requirements for audit, operational conditions, network security, intrusion detection and logging.

Encrypt all ‘special’ personal data

Encryption is not, by itself, sufficient for the protection of data, but it is necessary for compliance and must be considered an effective method. ‘Pseudonymization’ is widely practiced as a way of using real data for research, development, testing, demonstrating databases and for training staff, but pseudonymized data is not considered by the GDPR to be an effective way of protecting data.

This means that pseudonymized personal data that is intended to be processed for purposes such as research must be handled in the same way as any other personal data. Indications are that full anonymization via data masking represents the safest strategy where testing requires data that is close to production in volume and distribution.
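The distinction above can be illustrated with a minimal, hypothetical masking routine: the masked row keeps a realistic shape for testing, but the original personal values are discarded rather than kept under a re-identification key. Field names and the token scheme are invented for this sketch; real masking tools substitute independently generated values with production-like distributions.

```python
import hashlib

def mask_row(row: dict) -> dict:
    """Return a test-safe copy of a customer row with personal values removed."""
    # A stable token keeps the demo deterministic; a production masker would
    # draw random substitutes instead of deriving anything from the source row.
    token = hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()[:8]
    return {
        "name": f"Customer-{token}",            # realistic-looking, not reversible by inspection
        "email": f"user-{token}@example.com",   # preserves email format for application tests
        "postcode": row["postcode"][:2] + "** ***",  # keep only the coarse region
    }

masked = mask_row({"name": "Jane Doe", "email": "jane@corp.com", "postcode": "CB4 0WZ"})
print(masked)
```

The point is that nothing in the output can be mapped back to an individual, which is what moves the data outside the GDPR’s definition of personal data, while volume and distribution can still mirror production.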

Introduce full access control to personal data

This practice, often called ‘minimum necessary privileges’, is essential for the proper management of data, and allows a more accurate audit of data access and unsuccessful attempts at access. It also minimizes the requirement for background checks for users.

Ensure there is organization-wide data management

Many of the obligations placed on organizations are easier to implement if there is centralized management of data. This is essential in order to demonstrate that you know where data goes, what category it belongs to, whether it is secure when in transit, where it is used and whether it is properly disposed of. Without this, an audit of compliance for GDPR would be time-consuming and frustrating for external auditors.

Cloud-based data

Even if an organization stores customer data in the cloud, or in a third party co-location facility, it is still responsible for the security of that data.

Data Protection Officers

Larger organizations in Europe that are engaged in ‘systematic monitoring of data subjects’ will need to employ, or outsource, Data Protection Officers (DPOs), as is already customary in larger companies in Germany. All public authorities are also required to appoint a DPO under the GDPR.

Strategies for compliance

The more you summarize the requirements of the GDPR, the more draconian and harsh they might seem. They are, however, extraordinarily close to existing good practices for the custodianship of data, and there are some relatively straightforward steps to be compliant.

Identify your data

This may sound obvious, but many organizations find it hard to explain precisely what categories of data they’re storing, where and why. External audits can become an expensive and protracted nightmare, so if you haven’t an agreed and well-understood framework for identifying and categorizing data in your organization, this is a good place to start.

You can then use this framework to find out where the data is being held, and in what system. This will, in turn, provide answers to the questions you will need to ask. If data falls within the GDPR’s definition of personal data, for example, can you easily report to individuals what data is being held on them? Can you delete part or all of this data, on request? Can you encrypt it, both at rest and in transit? Is there adequate access-control? Are you allowing developers to use it for developing or modifying the database that holds it? Are backups held securely and encrypted?

Identify access requirements within the organization

You need to implement a data-access system which adopts the principle that users of an application should have permission to view, modify or delete only the data in the live database that is relevant to their job role.
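A hypothetical sketch of the ‘minimum necessary privileges’ principle: a deny-by-default permission map in which each application role is granted only the table operations its job requires. The roles and tables here are invented for illustration; in a real database this would be enforced with roles and grants in the database engine itself.

```python
# Each role maps to the exact (table, action) pairs its job role needs - nothing more.
PERMISSIONS = {
    "support_agent": {("customers", "read")},
    "billing_clerk": {("customers", "read"), ("invoices", "read"), ("invoices", "write")},
    "dpo":           {("customers", "read"), ("customers", "delete"), ("audit_log", "read")},
}

def check_access(role: str, table: str, action: str) -> bool:
    """Deny by default; grant only what the role explicitly needs."""
    return (table, action) in PERMISSIONS.get(role, set())

print(check_access("support_agent", "customers", "read"))   # permitted by the role
print(check_access("support_agent", "invoices", "write"))   # denied: not in the role's grants
```

Because unknown roles and unlisted operations fall through to a denial, the audit question “who could have touched this data?” has a short, checkable answer.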



Implement an audit of personal data

To respond properly to a request from an individual on whom you hold data, you not only need to explain clearly how the data is used, but also how and when data got into the system, especially if there is a dispute over the correctness of the data.

Data also needs to be audited in the case of an intrusion. Auditing has to be in place beforehand; this information cannot be gathered retrospectively. Where changes to data cannot be confidently documented, disputes, prosecutions, audits and other legal processes can become highly distracting and expensive.

Make sure you have a robust intrusion-detection system
Breaches or intrusions are a fact of life, even in the most vigilant of organizations, and occasionally a hacker just gets very lucky. Organizations need to react as quickly as possible and be completely transparent in explaining what has happened to anyone whose data has been compromised by the breach. To do this, you need a good intrusion-detection system in place, and your databases must log both failed and successful logins.

Draw up plans for dealing with data breaches, including timely notifications
Organizations which use personal data are obliged to report certain types of data breach to the relevant supervisory authority, and in some cases to the individuals affected. The breach can be as simple as a member of staff reading a medical record without authorization, out of curiosity rather than clinical need, or where real data was used for testing an application.

Organizations are obliged to keep records that, in the event of a breach, can show they have thought through the impact on systems and processes, and made informed choices about protecting personal data.

A plan needs to be in place for dealing with a range of severity of breach, so that if it happens, staff can react appropriately. Any notification of a breach needs to include the number of individuals affected, the type of data and approximate number of data records compromised, the name and contact details of the Data Protection Officer or other responsible person, the possible consequences of the personal data breach and a description of the measures taken, or proposed to be taken, to deal with it.
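The required contents of a notification, as listed above, can be captured in a simple structured record so that nothing is omitted under pressure. The sketch below is illustrative only; the field names and example values are assumptions, not a regulatory template.

```python
from dataclasses import dataclass

@dataclass
class BreachNotification:
    """Fields a GDPR breach notification must cover (names are illustrative)."""
    individuals_affected: int   # number of individuals affected
    data_categories: str        # type of personal data compromised
    records_compromised: int    # approximate number of data records
    dpo_name: str               # Data Protection Officer or responsible person
    dpo_contact: str
    likely_consequences: str    # possible consequences of the breach
    measures_taken: str         # measures taken, or proposed, to deal with it

notification = BreachNotification(
    individuals_affected=1200,
    data_categories="contact details",
    records_compromised=1500,
    dpo_name="J. Smith",
    dpo_contact="dpo@example.com",
    likely_consequences="risk of phishing against affected individuals",
    measures_taken="credentials reset; affected individuals notified",
)
```

Keeping the notification as a structured record also makes it easy to log, review and transmit consistently.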

Make all your IT systems secure
The best source of information on application security in general is OWASP, a not-for-profit charitable organization that maintains a wealth of resources on security best practices. The most appropriate international standard for security is ISO 27001, which covers most of the GDPR requirements.

Conclusion

Database administrators, database developers and Ops people will need to take a lead in preparing organizations which use or process personal data to deal with upcoming legislation on data privacy. The GDPR is in the vanguard of this legislation but it is part of an international move to tighten up the way organizations handle personal data.

Aside, perhaps, from the practical difficulties of implementing the right to be forgotten, the right to refuse to allow certain data to be held, and the right to transfer data between organizations (such as energy suppliers or phone companies), the requirements aren’t insurmountable, and the more you look at the detail, the less draconian the regulations seem.

The most effective strategy will be to deal with it as soon as possible, and there will, inevitably, be some temporary inconvenience. Developing databases against live data, for example, would seem to require all manner of bureaucratic tasks, even with pseudonymized data. Encryption, logging and auditing will be required, and there will be paperwork and effort to perform all the required assessments. Security in general will become a greater concern for senior IT management.

Nevertheless, in reading through the impending legislation in detail, there seems nothing particularly unreasonable about it: it conforms with what the best players in the industry already achieve, and this sort of legislation defends ordinary people from the worst abuses of ‘Big Data’. The security practices that it mandates are in line with OWASP opinion and ISO 27001 standards.

In short, we now need to implement all those practices that we’ve always known were necessary but had seemed lower priority than dealing with whatever the current crisis happens to be.


Opinion

Why it’s time to think seriously about SQL Server 2017

SQL Server 2017 has landed and is now on general release. The latest version of the heavyweight platform is more than the sum of its parts, however, because it doesn’t just deliver new functionality. Alongside the list of extra features, it also changes two important ways we think about the platform itself.

Time to think Linux

The headline message about SQL Server 2017 running on Linux has been talked about for months, but the big wow factor is just how well the transition has been managed. What its database engine does on a server running the Windows operating system, it now does on a server running Linux. The only easy way to find a difference between the two is to look at how the server that SQL Server is installed on is configured, managed and monitored.

Microsoft has been able to make an instance of SQL Server running on a server using Linux almost indistinguishable from one running on a server using its own Windows operating system. That’s quite something.

If there’s little difference to SQL Server itself, then why bother making it run on the Linux operating system? The answer lies in the data centers of some of the world’s largest companies. The evolution in technology over the last decade has left some of the largest spenders with very few servers that use Windows. And according to chatter from Microsoft staff, most of those servers are often just there to run SQL Server.

Allowing SQL Server to run on the Linux operating system lets those organizations retire their remaining Windows servers and standardize on servers that run Linux. At the same time, it gives them the freedom to adopt the next generation of workload management platforms, such as OpenShift, Kubernetes and Docker, which work best managing Linux workloads.

Time to think again about Oracle

On the opposite side of the table are organizations which have a different problem. They’ve already standardized on Linux as their server operating system, but have little choice about what commercial database server software they can use. A decade ago, there were options from a variety of vendors including Oracle, IBM, and Sybase. Today, market share data tells us there’s only really one, and that’s Oracle.

So what happens if those organizations fall out with Oracle, or want some bargaining power when they renegotiate their licensing agreements? Right now, the media will tell us they can look at open source options such as MySQL or cloud-hosted alternatives, but the reality for some organizations is that their boards still want their most important systems to use a traditional vendor-written and vendor-supported database platform.

SQL Server 2017 puts a different option on the table because it’s a mature, proven platform that now offers a real strategic alternative to Oracle. Merv Adrian, a VP of Research at Gartner specializing in data platforms, told me that SQL Server on Linux will have a huge impact not only on Microsoft, but also on Oracle and IBM. For the first time in a generation, there’s now a compelling new entrant to the RDBMS on Linux market.

Time to welcome a new way of working

For most of its lifetime, new versions of SQL Server have been rare. Some industry veterans will remember a five-year gap between releases, while most have become used to a regular two-year release cycle. SQL Server 2017 has changed that rhythm by appearing just a year after the last release, but there has been little commotion about this change.

Knowing which version of SQL Server an application uses is important, which is why vendors often support several versions and organizations standardize on a handful. They press the pause button and freeze database server innovation in their organization, staying with a specific version of SQL Server for as long as they need to, often many years, sometimes over a decade. This option to manage versions gives them the feeling of control they want or the technical compatibilities they need.

That approach is very different to how cloud services evolve. There, providers are constantly adding functionality that developers and users can start using without too much effort. A cloud-generation developer today is used to new functionality appearing in the platforms they use, or new programming languages and frameworks emerging that solve problems which didn’t exist until recently.

With SQL Server 2017, Microsoft is appealing to both worlds. It’s providing new database server functionality to those ready to deploy the current latest version and start programming, and it’s extending support through its Premium Assurance offering to those that want to use older versions of SQL Server for much longer than it expected.

The interesting question is whether Microsoft now continues with a yearly release cycle for SQL Server and plans for SQL Server 2018. Right now, there are still parts of the SQL Server family waiting to be brought to the Linux operating system, such as SQL Server Analysis Services. So, it wouldn’t surprise me to see another release in 12 months’ time.

For now, however, SQL Server 2017 is already a big step forward. It will be interesting to see the effect it has – and what the response from Oracle will be.

This is a guest article from Coeo, a UK-based Microsoft Gold Partner for Data Platform, Data Analytics and Cloud Platform.

{ coeo }


Introducing DevOps can be harder than it first appears. Without the right checks and balances in place, cracks will start to appear:

• Team members complain of unmanageable workloads
• Requirements, quality management and metrics get neglected, while customer complaints increase
• You promote and reward the ‘firefighters’ rather than the staff who prevent bad things from happening
• Your DevOps practices don’t work with legacy technology
• You have pockets of DevOps practice in just a few teams
• You feel the need to appoint specialist ‘DevOps’ people to sort out the issues
• You suddenly find that you’ve lost your testers
• Your developers talk more about NoOps than DevOps

What are the underlying causes of these painful symptoms? A lack of technical skills? Or of effective automation tools? In fact, the cause is more likely to be ineffective team working. It’s soft skills, not technical, tooling or automation skills, that are the biggest factor in determining the ease or pain with which an organization can switch to DevOps.

This goes against many of our instincts as technologists. DevOps can seem at first to be another unruly aspect of the universe that can be fixed with technology. The fun part is often imagining the ability to deploy to production 10 times a day, and figuring out the technologies, cloud services, and the collaborative scripting, automation and monitoring tools that will make it happen. The hard part is defining DevOps in a way that wins support from all sides. Not everyone trusts the real motivation for such change.

So how can we achieve the hard part? Much of it is down to the old-fashioned art of effective communication: show what you’re learning, where you succeeded – and where you failed – and share your knowledge with others. Whereas other professions provide training in teamwork skills like communication, arbitration, empathy, compromise, and tactful persuasion, such skills are less in evidence in IT.

DevOps requires these teamwork and communication skills by the bucket load. If we can find better ways of facilitating communication within teams, then misunderstandings and tribalism are less likely to happen. Communication and coordination tools such as Slack, Trello and Jira are evolving to the point of becoming the ‘operating system of the business’.

Some operations teams are already using tools like Slack or HipChat to automate and document basic infrastructure changes, using bots and slash commands. The resulting text is searchable and helps to explain to others not just what’s happening, but how and why. The automated processes can directly communicate their progress and outcome.
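As a minimal sketch of that idea, the hypothetical handler below parses a slash command, rejects anything outside an allow-list, and returns a searchable audit record that a bot could post back to the channel. The command names, fields and workflow are all assumptions for illustration, not the API of any real chat platform.

```python
import shlex
from datetime import datetime, timezone

# Commands the bot is allowed to run (illustrative names).
ALLOWED_COMMANDS = {"refresh-clone", "restart-service"}

def handle_slash_command(text: str, user: str) -> dict:
    """Parse a chat command like '/refresh-clone staging' into an audit record."""
    parts = shlex.split(text.lstrip("/"))
    command, args = parts[0], parts[1:]
    if command not in ALLOWED_COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return {
        "command": command,
        "args": args,
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "status": "queued",  # the automation would update this on completion
    }

record = handle_slash_command("/refresh-clone staging", user="alice")
```

Because every request becomes a structured, timestamped record, the channel history doubles as documentation of who changed what, and when.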

DevOps can make spectacular improvements in the speed and quality of the delivery of software, but it is the power of good teamwork and realistic workloads that makes it work. Sure, automation will help but it will only help an organization whose members are enabled to work cooperatively. It must support the teamwork and make the team processes more visible. That way, automation becomes the servant to the team, not its master.

Avoiding the slide from DevOps

{ tony davis }

Opinion

If you roll out DevOps across an organization before it’s culturally prepared, you will see warning signs that the initiative is failing. Tony Davis outlines what they are – and what to do to avoid the slippery slope.


It can take time to undergo a transformation like this, and there is likely to be initial pain to go through before you actually start to reap the benefits.

But as the annual State of DevOps report from DORA and Puppet shows us, there are enormous tangible and intangible benefits for organizations that have adopted this approach to software delivery. Efficiencies are made through automating manual processes, reducing downtime, or minimizing the time spent recovering from failed deployments.

Additional value is gained as time previously spent on tasks such as unnecessary rework or manual testing can be redirected to work that delivers value to the business.

But what does the term ‘value’ even mean for your organization?

Before you can quantify the benefit of DevOps, perhaps it’s worth reflecting on what would be the greatest value you could add to your business.

DevOps is clearly becoming more popular. 33% of respondents in our State of Database DevOps survey had already adopted a DevOps approach across at least some of their IT projects, and a further 47% intended to within the next two years.

For the 20% with no plans to move towards this new way of working, the barrier most frequently cited was a lack of understanding of the business benefits.

Quite understandably, getting over that hurdle of determining how this will actually help the business seems to be a key factor in adoption.

So how do you calculate the business benefit, or the return on investment, of DevOps?

It’s not as straightforward as, say, calculating the return on investment of a new piece of hardware. DevOps is a new way of working that encompasses changes across people, processes and tools. It may involve a shift in the way your teams work, prompting the need for training as well as investment in new technologies.

The language of DevOps ROI

Research

{ kate duggan }

How do you quantify the value of DevOps? The answer might depend on what value actually means for your organization, which stakeholder you’re talking to, and what type of lens they’re looking through.

The definition of that business value may well depend on the nature of the organization and industry you’re in. How that value manifests itself may also vary at different levels within the organization, depending on who you’re talking to and what lens they’re looking through.

DevOps measurement lenses

The idea of different lenses for measuring the benefits of DevOps is something David Linwood, an experienced IT Director, explored when carrying out an MSc research project on DevOps ROI. He looked at the perspective taken by three different levels of management within the typical organization – CEOs, CIOs and Team Managers – and came up with three different measurement lenses:

Business outcome
This is the CEO’s area of interest. What’s important to him or her, and the rest of the Board, is how this investment will ultimately lead to higher revenue and/or profitability. Whether that’s delivered through tangible cost reductions, faster speed to market or other improvements in business performance, ultimately the CEO will be interested in the


value that this change can deliver to the organization and its customers. And that value often varies according to the nature of the organization.

Preserving value is equally important though. In 2015, Gartner put the average cost of a critical application failure at anywhere between $500,000 and $1 million an hour. So investments that can be proven to help minimize financial risk caused by catastrophic IT failures will quickly capture board-level attention. And that’s before you even take into account the potential reputational damage, and the subsequent impact on shareholder value, caused by high-profile incidents of downtime or data breaches.

Ingredients for success
At the next level down, for the CIO or Head of IT, the focus is on the ingredients for success. How can processes be put in place to increase the throughput of the IT department? Can the skilled IT staff needed to deliver quality services to the business be recruited and retained?

The 2017 State of DevOps report found that organizations practicing DevOps spent 21% less time on unnecessary rework and 44% more time on new work than their peers. Not only was their throughput higher but spending more time on enjoyable, value-added development also led to higher employee satisfaction levels.

As the DORA ROI guide outlines, “Retaining existing talent is more cost-effective, preserves institutional knowledge and gives organizations an advantage by having a strong technical workforce that is engaged and continuing to learn”.

If this sounds too intangible, how about measuring and quantifying the cost to the business of recruiting and training new members of staff? A study by the Center for American Progress found that the typical cost of turnover is 21% of an employee’s salary.
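Using the Center for American Progress figure quoted above, that cost is easy to put into numbers. The salary and headcount below are assumed example figures, not data from the study.

```python
# Back-of-envelope cost of replacing departing staff, using the
# Center for American Progress estimate of ~21% of annual salary.
TURNOVER_RATE = 0.21

def turnover_cost(annual_salary: float, leavers: int = 1) -> float:
    """Estimated cost of replacing `leavers` employees on a given salary."""
    return annual_salary * TURNOVER_RATE * leavers

# e.g. losing three developers on $80,000 each costs roughly $50,400
cost = turnover_cost(80_000, leavers=3)
```

Even a rough figure like this turns an intangible retention argument into a line item a CIO can compare against the cost of a DevOps initiative.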

IT output performance
Down in the engine room, for team lead roles such as IT Managers or Technical Leads, the focus becomes more firmly on the output performance of the IT department or team. They care about metrics like the speed of

deployment or number of new releases delivered by their teams. They’re also more interested in the ability to reduce defects, decrease downtime or improve time to recovery, perhaps reflecting the pressure on these leaders to maintain what is a core function for the business.

Again, the State of DevOps report shows improvements across all these key metrics in high-performing organizations. They can typically deploy changes, updates and improvements 46 times more frequently, their change failure rate is 5 times lower, and they can recover from failures when they do occur 96 times faster.

It’s all about perspective
Evidently, all of these factors are important because each will eventually translate into some kind of business value. But what you choose to focus on will differ, depending on who in the organization you’re talking to.

So, when you’re at the start of the journey and seeking buy-in, think about which stakeholder you’re talking to and the lens that they’re looking through. Then talk to them in their language.


The aspirin for your database provisioning headaches

SQL Clone can help reduce the pain associated with many of the provisioning tasks that go with database development, regardless of what development model you use. For teams who use a shared development database out of necessity rather than choice, SQL Clone can remove enough of the additional problems to make the dedicated model viable for the first time.

When developing a database as a team, most team members would generally prefer to work with their own, isolated copy of the database, rather than work on a shared development database.

Up to now, however, there have been a number of management and security problems with the dedicated database approach, and these burdens only increase as the size of databases grows, and as regulations around data security and sharing tighten. As a result, many organizations still tend to favor the shared model, despite its drawbacks.

Product

{ ed leighton-dick }

Why is the shared development database model still prevalent?

Whichever model you use when developing a database, you’re always going to have to overcome database provisioning problems. Even in the shared model, you still need to manage a few ‘sandboxes’, where developers can try out new ideas. Both development models require test databases, and both require the provisioning and management of databases in development environments.

Over the years, I’ve heard and read some impassioned arguments in favor of dedicated development databases. Each developer can do whatever is needed to accomplish an assigned task, without fear of conflicting with the work of others, for example. They have the freedom to experiment without having to worry about causing unpredictable data states, disrupting integration tests, and so on.

A dedicated development database simplifies the task of diagnosing a sudden change in behavior or performance by removing the possibility that someone else’s modification caused the issue. In short, dedicated database development makes developers more productive.

These arguments are all valid, and yet in my experience, for databases of any size or significance, I’ve found the shared database development model to be more prevalent. Why?

SQL Clone


So how does SQL Clone help?

SQL Clone is a database provisioning tool that creates full copies of SQL Server databases and backups in seconds, using around 40MB of disk space per clone. Instead of spending hours provisioning multiple copies of a database, it creates a single ‘image’ of a SQL Server database or backup, which is used as the source for multiple clones. Each clone works just like a normal database, but takes only a few seconds to create and requires only a fraction of disk space.

If additional data security is required, data masking or loading of development data can be completed before the image is created. Development work can then proceed without any concern that user data or confidential data can be compromised.

Even with SQL Clone, you’ll still need a slick management process to keep things under control when provisioning and regularly refreshing multiple development, test, and other environments. Fortunately, in addition to a streamlined web interface that is accessible to both administrators and developers, SQL Clone comes with a library of PowerShell cmdlets, allowing full control of the process within a script.

For many companies, the question is not Why?, but rather Why not? The shared model uses less disk space and requires less DBA time to manage (at least, in theory). Security concerns compel companies to keep data in locations where it is under IT control. And if the workload and database schemas and permissions are managed properly, the changes of different developers don’t often conflict with each other.

The risk of changing a development practice that isn’t broken, in favor of one that will likely have a higher administrative overhead, is not outweighed by the intangible reward of increased productivity.

Of course, the benefits of the shared development database come at a cost. Since multiple developer schedules are involved, opportunities for refreshing it are dramatically reduced. Often, this means planning an outage on the database to allow time for the refresh.

When the time does finally come to provision the new copy of the database, work must be stopped while the provisioning is occurring – for large databases, the time required can be hours or even days. Far too often, this means the refresh is avoided in favor of keeping the normal flow of work moving. Developers must adapt to using stale data for their testing, while companies settle for results that are ‘good enough’. The cost to change that state has traditionally been high enough to put it out of reach for many companies that may otherwise desire to improve.


Provisioning for database testing
For companies adopting a fully-automated DevOps workflow, provisioning presents a unique challenge. Ideally, a fresh copy of the database would be created for each run of the build process, so that unit tests and integration tests can judge accurately whether the changes made in that build had the intended effect, and whether they introduced any unwanted regressions.

This ideal has not been feasible in many cases, simply because large databases take too long to restore. This has left teams with the choice of finding ways to revert the changes in the previous test or living with an unclean copy of the database.

With SQL Clone, a clean copy of the database can be provisioned as part of the build process without creating unwanted delays, and because a clone only requires a small amount of disk space, clones can be created as often as necessary to run the desired tests.

It even becomes feasible to do parallel testing with real databases. Let’s say you have a series of five tests, one each for end-to-end testing of a specific business function, which, if run serially, take 16 hours. What if you could spin up five separate clones and test each business process in parallel? It would dramatically reduce the time required for end-to-end testing.
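The arithmetic of that example is worth seeing. Assuming illustrative per-test durations that sum to the 16 hours mentioned above, the wall-clock time of a parallel run shrinks to the longest single test:

```python
# Illustrative durations (hours) for five end-to-end tests that
# take 16 hours when run one after another on a single database.
test_hours = [4, 3, 3, 2, 4]

serial_time = sum(test_hours)           # 16 hours, one shared database
parallel_time = max(test_hours)         # 4 hours: one clone per test
speedup = serial_time / parallel_time   # 4.0x faster end-to-end testing
```

The parallel run is bounded by the slowest test, so the more evenly the tests are sized, the closer the speedup gets to the number of clones.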

The end result is faster test times, much more accurate testing, and higher quality code.

Provisioning multiple development environments
Sometimes, database provisioning simply means restoring the latest backup of the production database to the target server, though it’s rarely as simple as that sounds. For many databases, there are usually a number of additional steps to perform any time a fresh database copy is deployed, such as basic data cleansing, or removing production logins, and then recreating logins for that target environment and mapping them to existing users and roles.

For large databases containing sensitive data, the problems are more challenging. It can be hard to keep up with the regular demand for fresh data, as well as the various ad hoc requests from developers who need to return to a stable state after a failed experiment, or need to replicate a production issue from last night that didn’t exist the night before.

With SQL Clone, you can create an initial ‘clean’ data image, and from that image create as many clones as required, in seconds. It means that each developer can work with their own copy of the production database. If they need to create a branch, they can quickly create a new clone to support development on that branch. Likewise, if they need to reset their environment, or refresh it with a clone of the latest data image, it becomes a matter of seconds rather than hours or days.

Provisioning copies of large databases
As the size of databases has grown, the difficulty of transferring those databases between servers has also increased. Large production databases can take hours, or longer, to restore. Finding space for multiple copies of these large databases is also a challenge in the face of ever-increasing demands for storage space from all sides.

SQL Clone solves this problem by significantly reducing the amount of space required to store multiple copies of databases. Instead of requiring the full amount of space on the database server each time a copy is provisioned, the image is only stored once, in a central location. Each clone is then created with a small differencing disk on the database server, starting with around 40MB of space. Even though the clone will grow as changes are made to the schema or data, the space required is generally tiny compared to what would have been required for a full copy of the database.
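A rough comparison shows the scale of the saving. Assuming an illustrative 500GB database and ten environments, full restores versus one central image plus ~40MB differencing disks look like this:

```python
# Storage needed to provision ten copies of a 500 GB database.
# Sizes are illustrative; clones grow as local changes are made.
DB_SIZE_GB = 500
CLONE_START_MB = 40
copies = 10

# Traditional approach: a full restore per environment.
full_copies_gb = DB_SIZE_GB * copies                          # 5000 GB

# Clone approach: one central image plus a small differencing
# disk per clone on each database server.
clone_total_gb = DB_SIZE_GB + copies * CLONE_START_MB / 1024  # ~500.4 GB
```

Until the clones accumulate significant local changes, ten environments cost barely more disk than one.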


Dealing with sensitive data
Most companies now have to comply with some sort of regulation regarding the privacy of the data they store, and in those cases, providing developers with realistic volumes of ‘production-like’ data for testing is a bigger problem.

SQL Clone can help make this process easier. In cases where there are no real data sensitivity issues, very basic data cleansing, such as replacing real email addresses, can be performed by altering the data in the clone. This can easily be incorporated in an automated PowerShell script that produces the clones. Because a clone can also be imaged, we can still create a new data image to use for distribution.
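As an illustration of that ‘very basic data cleansing’, the snippet below rewrites real email addresses into safe dummy ones keyed by row id. The masking pattern and function name are assumptions for the sketch, not part of SQL Clone; in practice the equivalent logic would run as an UPDATE against the clone, driven from the provisioning script.

```python
import re

# Simple pattern covering typical email addresses in free-text columns.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_email(value: str, row_id: int) -> str:
    """Replace any email address in `value` with a dummy keyed by row id."""
    return EMAIL_RE.sub(f"user{row_id}@example.com", value)

masked = mask_email("alice.smith@corp.com", row_id=17)
# masked == "user17@example.com"
```

Keying the dummy address to the row id keeps masked values unique, so joins and uniqueness constraints in the test data still behave realistically.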

For more stringent requirements, though, it’s probably best to produce clean data before creating the data image. This means restoring the backup, sanitizing or obfuscating the production data to an extent that it complies with any relevant regulations regarding data sharing, then creating the data image from that database.

Alternatively, production data can be avoided altogether by performing a new database build from the latest version in source control, importing standard data sets for testing, and then creating the image.

Streamlined management
With traditional methods, the task of provisioning databases, and keeping control of each environment, falls heavily on the shoulders of database administrators.

Creating a backup, restoring it to a different server, performing data cleansing, and adjusting the security for the development environment all take a significant amount of time when done manually. Automation is possible with existing tools, but it often requires complex scripts, which also take time to create and maintain. If a company is affected by strict regulations that prevent a DBA from seeing customer data, all of this must be performed under the watchful eyes of an auditor.

With SQL Clone, an image can be created using SQL Clone’s built-in PowerShell cmdlets, which simplify the scripting required. That script can be scheduled during off-hours using the company’s standard scheduling tools. To satisfy regulators, additional scripts can create the variant that uses non-production data without any intervention from a DBA.

Clones can then be created from the resulting image, either automatically with PowerShell or on-demand from the SQL Clone web interface, by anyone with permissions, including developers. Automation that was previously difficult becomes feasible for anyone.
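Under the same assumptions (SQL Clone's Connect-SqlClone, Get-SqlCloneImage, Get-SqlCloneSqlServerInstance and New-SqlClone cmdlets; invented server, image and developer names), the self-service step might be scripted like this:

```powershell
# Hand each developer an isolated copy of the latest sanitized image
Connect-SqlClone -ServerUrl 'http://sqlclone-server:14145'
$image       = Get-SqlCloneImage -Name 'Sales-20171101'
$devInstance = Get-SqlCloneSqlServerInstance -MachineName 'dev-sql01'

foreach ($dev in 'alice', 'bob', 'carol') {
    # Each clone is a few tens of MB, however large the source database
    New-SqlClone -Name "Sales_$dev" -Location $devInstance -Image $image |
        Wait-SqlCloneOperation
}
```

The same loop could equally run on demand from a build server, or be replaced by developers creating their own clones from the SQL Clone web interface.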

Conclusion

SQL Clone might be a game-changer.

Developers benefit from being able to work with realistic data and ‘self-service’ provisioning.

Database administrators benefit from the reduced management overhead, leaving more time for tasks that add real value.

Companies benefit from improved developer productivity and the higher quality code that results from improved testing.

And the best part is, all of these benefits are possible without giving up the benefits of the development model you’re already using.


Achieving DevOps success in financial services

{ tom austin }

Analysis

The financial services industry is heavily regulated, complicated, and challenging in IT terms. Conversely, it’s also the industry that has the most to gain from DevOps initiatives.

Firstly, let’s start by trying to define DevOps. I’ve seen 101 different presentations and heard 101 ways of describing it, but I like the one from Donovan Brown, Principal DevOps Manager at Microsoft:

“DevOps is the union of people, process, and products to enable continuous delivery of value to our end users.”

Unfortunately, it’s not something you can just buy or decide to do tomorrow. Instead, it’s a shift that needs the right guidance to become reality. While there are significant challenges and costs to adopting DevOps in the financial industry, the benefits are too great to ignore – as are the risks of not delivering value to customers quickly enough and losing them to competitors or new fintech disruptors. So what are the drivers for adopting DevOps in financial services, how can you include the database, and how can you assess your database DevOps maturity against your peers?


DevOps drivers in financial services

As the 2017 State of DevOps Report from DORA and Puppet shows, there are big advantages for companies and organizations that embrace DevOps. They can typically deploy changes, updates and improvements 46 times more frequently, for example. Their change failure rate is also 5 times lower, and when failures do occur they can recover from them 96 times faster.

In the financial services sector specifically, DevOps enables companies to perform better in three key areas.

Increasing the speed of delivery

Financial services companies are under increasing pressure to release software faster. Whether it’s new entrants to the market such as mobile-only banks, or the likes of Apple and Google entering the mobile payments space, or increased investment in fintech start-ups, change is afoot.

Adopting DevOps practices has been proven to significantly increase the speed of delivery, with high IT performers deploying multiple times per day, and low performers deploying once a week or even once a month.

Reducing downtime

In its 2015 report, DevOps and the Cost of Downtime, IDC calculated that, on average, infrastructure failures cost large enterprises $100,000 per hour. The 2016 Cost of Data Center Outages report from the Ponemon Institute goes further, indicating the cost of unplanned outages in the financial services industry is the highest of any business sector, and more than double that of the public sector.

In an ever-more competitive industry, today’s financial institutions can’t afford these costly mistakes. Especially when DevOps practices have been proven to significantly reduce downtime, as mentioned above, with the mean time to recovery (MTTR) of high IT performers 96 times faster than low performers.

Improving compliance

The financial services industry is one of the most highly regulated sectors in the world. While introducing DevOps may at first appear to be the antithesis of such regulations, the opposite is true.

DevOps practices allow for greater risk management, for example, with small, iterative changes being thoroughly tested by processes like continuous integration. This in turn leads to levels of confidence far higher than the traditional software development cycle. The 2017 State of DevOps report also found that high performers spend 50% less time remediating security issues than low performers.


Including the database in DevOps

DevOps is about changing the culture of software development and improving collaboration between development and operations teams. But it’s also about automating many of the common jobs in delivering software, such as source control, testing, compliance and security checks, and deployments. With the automation in place, a process is established that is now common in application development:

Development progresses from source control through continuous integration to release management before changes are deployed. At each stage, the changes are checked and tested so that errors are picked up earlier in the cycle and software releases are both faster and more reliable.

Databases, however, are more problematic because business-critical data needs to be safely and correctly preserved. In addition to this, there are specific challenges in financial services, such as extremely complex systems, legacy databases, and siloed departments.

Tools and processes have now been introduced, however, that allow databases to be developed alongside applications by plugging into and integrating with the systems and infrastructure already in place.

The result is that, rather than database development being separate from that of the application and managed at the very end by a siloed team, it becomes an integral and natural part of the whole development process.

This is a real advantage for companies and institutions where, typically, the database has been a bottleneck. Because the application and database are developed and tested together, errors or potential issues are highlighted much earlier in the development process, avoiding problems when changes are deployed.

Compare this to the conversations I’ve had with a lot of DBAs who are required to review thousands of lines of script when it comes to deploying database changes. That can take days, depending on how many errors they find in the script.

By committing database changes to source control on a regular basis, you can introduce automated builds and tests to make sure that all of those small units of change are tested and validated multiple times before you are ready to deploy to your next environment. This results in releases being more reliable and less time-consuming, and also means you can respond to change a lot faster.
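As an illustrative sketch only (the script folder, server and database names are invented), a minimal CI build step could rebuild a scratch database from the schema scripts held in source control, failing the build on the first error:

```powershell
# Rebuild a throwaway database from the versioned schema scripts,
# so every commit proves the schema still builds from scratch
$server = 'build-sql01'
Invoke-Sqlcmd -ServerInstance $server -Query `
    'IF DB_ID(''CI_Build'') IS NOT NULL DROP DATABASE CI_Build; CREATE DATABASE CI_Build;'

Get-ChildItem '.\database\schema' -Filter '*.sql' | Sort-Object Name |
    ForEach-Object {
        # -ErrorAction Stop turns any SQL error into a failed build
        Invoke-Sqlcmd -ServerInstance $server -Database 'CI_Build' `
            -InputFile $_.FullName -ErrorAction Stop
    }

Write-Host 'Database built cleanly from source control'
```

A build server running a step like this on every commit is what turns "small units of change tested multiple times" from an aspiration into routine.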


Find out where you are on the journey

Redgate conducted a State of Database DevOps survey in early 2017 which showed that 42% of companies do not yet have their database under version control, the first step in optimizing the development environment. This is despite the fact that version control lets companies spend more time on improving code quality, prompts earlier feedback on potential errors, and minimizes potential performance problems in production.

Similarly, the survey also revealed that only 17% of respondents deploy database changes daily, with a further 20% deploying changes more than once a week. So 63% of companies are missing the advantages of a test-driven development process which makes it possible to deploy changes at will, and reduce errors at deployment to almost zero.

You can find out how you compare to the industry average by benchmarking your current practices with the Database DevOps Maturity Assessment tool online. Do so and you’ll receive a tailored report with practical advice and insights into the next steps to take.

Visit red-gate.com/devops-assessment for more details.

Assessing the maturity of your database DevOps

Whether you’re exploring the advantages of DevOps or already fully immersed in the journey, including the database brings additional advantages.

How are you performing compared to the competition, however? What could you do better? What benchmark should you be aiming for?

To help you answer these questions, Redgate has created a Database DevOps Maturity Assessment tool which evaluates where you are now, shows how you compare with your peers, and makes recommendations for how to move forward.

The tool divides the database development process into its three main stages: Environment & Development, Continuous Integration & Deployment, and Protecting & Preserving Data.

Environment & Development

The first stage covers the broad development environment, and its aim is to discover how much collaboration exists, what practices are already in place, and if any bottlenecks exist.

With good collaboration between teams, effective management of environments, and use of best practices like version control or automated provisioning, development practices can be optimized, leaving teams free to focus on process improvements.

Continuous Integration & Deployment

The second stage of the assessment tool is concerned with how you validate and test code changes, what processes are in place to deploy those changes, and how the changes are deployed.

By automating the database deployment pipeline and testing potential changes with realistic data and server environments, the risk of introducing bugs and defects further downstream can be minimized. This gives teams the ability to focus on iterative improvements that enable a higher frequency of deployments.

Protecting & Preserving Data

The final stage is probably the most important for financial services companies. It covers how performance is monitored, what backup and recovery strategies are in place, and how data access is controlled and audited.

With a solid data management strategy in place, monitoring across environments can enable performance to be correlated with changes so the cause of any issues or errors can be pinpointed and acted upon immediately. Compliance can also be readily demonstrated, which will become increasingly relevant when the General Data Protection Regulation (GDPR) enters the picture.


Bringing DevOps to the database – Moody’s Analytics and SQL Clone

A subsidiary of Moody’s Corporation, Moody’s Analytics helps capital markets and risk management professionals worldwide respond to an evolving marketplace with confidence. The company offers unique tools and best practices for measuring and managing risk through expertise and experience in credit analysis, economic research and financial risk management.

Innovative, data-driven IT

Moody’s Analytics provides insurers with industry-leading modeling and content delivered through high-performance, modular and configurable software. Its Economic Scenario Generator (ESG) is an award-winning software product that provides Monte Carlo simulation paths for the joint behavior of financial market risk factors and economic variables.

The ESG is constantly updated with the latest market and calibration data, and the responsibility for keeping the update process running smoothly falls on the shoulders of the Service Delivery Infrastructure (SDI) team.

The architect, two testers and three software engineers in the team have already adopted agile software development practices and Principal Software Engineer, Daryl Griffiths, is constantly looking for ways to innovate yet further.

Case study

{ karis brummitt }

Developers need copies of databases to successfully create new code, test programs and fix issues, but provisioning copies is time-consuming, and each one can take up large amounts of storage space – up to a terabyte and more in many cases. This challenge can potentially slow down extending DevOps to the database and ultimately hit the speed at which updates can be created, tested and deployed. That’s exactly the situation that Moody’s Analytics found itself in, until it discovered SQL Clone.


They had a constant need to provision database copies, for example, particularly for the Test Engineers looking to run multiple daily database integration and acceptance tests. While Daryl had developed an easy-to-use approach to provisioning, it still took over an hour to perform each restore, limiting the number of copies that could be created each day.

SQL Clone – provisioning copies in seconds, not hours

An existing Redgate Software customer, the Moody’s Analytics team already uses SQL Source Control, SQL Compare and SQL Data Compare as part of its daily work. So when the beta version of SQL Clone was announced, Daryl leapt at the chance to try it.

SQL Clone creates full copies of SQL Server databases and backups in seconds, yet uses only a tiny fraction of disk space. By doing so, it allows companies to provision copies for development, testing and diagnostics quickly and easily.

The copies, or clones, which are only around 40MB in size for a 1TB database, work just like normal databases and can be connected to and edited using any program. By saving time and disk space, it enables teams to work locally on up-to-date, isolated copies of the database and speed up development, accurately test code using realistic data, and fix issues faster.

After testing SQL Clone in his own development environment, Daryl saw immediate results. “Our process before was quite slick, but in my tests with SQL Clone, I reduced the time to provision copies of all six of our databases from nearly two hours to ten minutes.”

Following further tests, Daryl shared the tool with the team’s Test Engineers. They also saw the benefits and reported that there was no difference in performing tests against a clone rather than a full copy of the database. As a final check, while the developers in the team knew SQL Clone was coming, they were not told exactly when the remaining provisioning was switched over. The result? They didn’t notice any difference, and there was no impact on performance or on how they carried out their work.

Increasing efficiency and supporting continuous deployment

Now that SQL Clone is a standard part of the SDI team’s development process, anyone can self-serve a database copy in minutes, rather than needing to wait hours as before. This frees up time and means development and testing are faster and more efficient.

This new freedom is bringing wider benefits too. Working with clones removes the fear that changing data in development databases might lead to problems that take a long time to fix, encouraging the team to be more innovative. For example, they can try two different ways of achieving something, each on one clone, and compare the results to work out the best approach.

It is also supporting Moody’s Analytics’ move to embrace DevOps, streamlining the testing process and ensuring that the database is a central part of development. As Daryl concludes, “If we want to deliver new functionality more frequently, then we need to adapt and improve our development processes, and ensure that the testers can make the most effective use of their testing time. We’re now moving towards continuous deployment, and SQL Clone is a big part of that.”


You’re not delivering DevOps to the database

There have been, and, in our increasingly connected and interconnected world, will continue to be, a large number of spectacular failures with technology. I want to point the finger at all of us for these failures. It’s not Development. It’s not Operations. It’s us. We’re in it together. If we don’t communicate with each other and share our knowledge, we’re going to mess up horribly.

{ grant fritchey }

Opinion

I’ve read through a number of the industry thought leaders to get an understanding of how DevOps is being communicated out there. As with so much else in life, you can start at Wikipedia to get a general understanding:

DevOps … is a term used to refer to a set of practices that emphasize the collaboration and communication of both software developers and information technology (IT) professionals while automating the process of software delivery and infrastructure changes.

This is the opening sentence and neatly sums up my understanding of the approach. However, I know there are haters out there that have zero trust for that source. Let’s go to another from AgileAdmin:

DevOps is the practice of Operations and Development engineers participating together in the entire service lifecycle, from design through the development process to production support.

While I think these are pretty well informed people doing good work, that might be an overly obscure source of information. Let’s try Gartner, who everyone trots out when trying to make a point (when Gartner supports the point they’re trying to make anyway):

DevOps represents a change in IT culture, focusing on rapid IT service delivery through the adoption of agile, lean practices in the context of a system-oriented approach. DevOps emphasizes people (and culture), and seeks to improve collaboration between Operations and Development teams. DevOps implementations utilize technology – especially automation tools that can leverage an increasingly programmable and dynamic infrastructure from a lifecycle perspective.

Typical Gartner: very wordy. However, when you peel it apart a little, you can tell that it’s largely in sync with the other definitions. In short, DevOps is meant to represent a crossover between Development and Operations (the infrastructure and management part of IT, including DBAs).


I have successfully worked within an environment that implemented a DevOps approach to development, deployment and maintenance. I regularly teach classes and provide consulting on how to approach DevOps from the Ops perspective. Heck, I’ve written books on DevOps.

All of this is just a setup to complain that you’re letting me down. It’s not that I’m not providing you the information and methods needed for you to implement DevOps within your environment. No, it’s you.

Developers

Please, developers, don’t get smug. I’m starting with you because you’re a core part of the problem here. Go back and re-read the definitions of DevOps … I’ll wait …

You have to notice one salient point. Nowhere does it say, Developers rule the world, or Developers have Sys Admin privileges, or We get to ignore the Operations side of IT and do anything we want. No, instead, it talks about cooperation.

I get it – you’ve put up with DBAs and all the headaches they create for decades (they’ll get their turn). However, the fix to the problem is not attempting to eliminate Operations. You still need the specialists within Operations to get your job done. Eliminating them will, if nothing else, put you on the on-call rota (and I know from the number of times I’ve attempted to give developers my phone, only to have it summarily rejected – you don’t want that).

I’ve read the treatises on the full-stack developer, only to find that more than a few of them leave out little things like high availability in the systems they build. They don’t worry about disaster recovery too much while programming because it’s going to slow things down. In fact, all the issues around things going just horribly wrong are written off. I even saw someone arguing that they were a full-stack developer because they could code the front-end and could work with the back-end and the rest was just: Operational details increasingly handled by service providers.

Ha! No kidding. Operational details? You mean security, backups, high availability, disaster recovery – all the stuff that your Operations team does? In short, you’re not actually full-stack but instead are relying on others to provide you an operational space, let’s say, within which to do your development. In short, Operations specialists, like DBAs maybe?

So you do need DevOps. You’re going to build code that has to live within the operational space provided for you. For you to deliver better quality code faster, wouldn’t it help for that operational space to ‘shift left’, to move closer to you and your development so that your code works when it ultimately lands in Production? If all that’s true, then you want to, need to, communicate with the Operations people and the DBAs. Which brings me to the DBAs.

DBAs

Hey guys! Said “No” to someone this morning, or is it a slow day?

You’ve spent many years developing the knowledge necessary to set up a well-functioning database, the server it sits on, backups (and backup testing), your Availability Group, and that off-site disaster recovery location in New Jersey. And yes, all those trips to the middle of nowhere were pretty awful. Then, out of nowhere, this development team wants to do some crazy thing with Entity Framework (yeah, I’ve seen some of the queries), and all you can see is another trip to New Jersey (I’d suggest looking at Azure). The answer is “No”.

Automating deployments? No.

Shifting Production databases to the left? Left? No.

Source control? Tried that in 2002. No.

Every single time I point out how you guys say “No” to everything, I get the litany of reasons why, well, of course, you’re perfectly reasonable creatures and you’re more than open to what the developers want, but they’re wrong, so, NO.

It’s time to learn a new word. Code is going to change faster and faster. New technologies are not slowing down, they’re speeding up. Further, as service offerings get better, aspects of your job (not your whole job) are going to go away. You need to get on the bandwagon and start to communicate with your brethren in the Development community. I’m saying this because they need you and your knowledge. If you take part, in a DevOps way, in what they do, you can exert positive influences. Where? Let’s talk about that.

Why DevOps

I know that you know that you should both take backups and validate those backups. However, have you worked with your other teams in development to ensure that they know?

How about the fact that you should have some pretty stringent controls on when you’re in Production versus when you’re in a Test environment? We use completely different logins, right? We use different tool settings and all the different ways we can to ensure we don’t accidentally drop stuff in Production, right? Or, maybe this hasn’t been communicated as thoroughly as we thought.

And if you have an outage, have you done the work to ensure that you have a disaster recovery process (even if it means a trip to New Jersey)?

It gets even more fundamental than that. Backups are frequently ignored by startups and small organizations that are overly focused on change. Fundamental behaviors of information storage such as ACID properties are dismissed, incorrectly it turns out.

I’m not saying that all these problems are solved by DevOps. I am saying that more, better, faster communication between the specializations involved in Development and the specializations involved in Operations will lead to fewer of these types of problems. DevOps provides that communication mechanism.

It’s not me, it’s you

I know the whole break-up thing is supposed to go the other way (to my shame, I’ve used that line). However, we’re not breaking up. I refuse to let this go. I know that DevOps as a communication mechanism and process not only works, but makes everyone’s lives better.

Developers deliver higher quality code, faster. DBAs have safer, more stable Production environments. Most of all, the business gets new, better functionality in order to make money.

After all, that’s what it’s all about, supporting the business. We have a common goal. We have mechanisms for automation and communication.

Let’s start using them and deliver DevOps.


Educate Accelerate Transform

We’re Hiring!

We Transform and Accelerate the Way That Organisations

Deliver Software

Educate | Automate | Transform

0800 368 7378

@DevOpsGuys

[email protected]

www.devopsguys.com


Partners


In Europe, Middle East, and Africa

Architecture & Performance, Lyon, [email protected]
Axians IT Solutions GmbH, Ulm, [email protected]
DataMovements, London, [email protected]
DevOpsGuys, Cardiff, [email protected]
DLM Consultants, Cambridge, [email protected]
Drost Consulting, Amsterdam, [email protected]
GDS Business Intelligence GmbH, Essen, [email protected]
Jarrin Consultancy, Newcastle-upon-Tyne, [email protected]
xTEN | High-Performance Data, Leeds, [email protected]

In North America

AdaptivEdge, Alameda, [email protected]
Centino Systems, Oxford, [email protected]
Crafting Bytes, San Diego, [email protected]
CSW Solutions, Chicago, [email protected]
IowaComputerGurus, West Des Moines, [email protected]
Kingfisher Technologies, Cedar Rapids, Iowa, [email protected]
Nebbia Technology, Orlando, [email protected]
Northwest Cadence, Bellevue, [email protected]
SQL Watchmen, Austin, [email protected]

In Asia Pacific

DBInsight, Brisbane, Australia, [email protected]
Human Interactive Technology Inc., Tokyo, [email protected]
LobsterPot Solutions, Melbourne, [email protected]
SQL Down Under, Melbourne, [email protected]
SQL Maestros, Bangalore, [email protected]
SQL Masters Consulting, Brisbane, [email protected]

Let’s talk

If you’d like advice on which partner is best able to help you, talk to us. We know them, we’ve trained them, and we can recommend which of our experts is the most suitable for your particular requirements.

Contact [email protected].

If you’re interested in becoming a Redgate Certified Partner, we’d love to hear from you.

Visit www.red-gate.com/consultant

Find a Redgate DevOps partner

More and more companies and organizations are exploring DevOps. But while you have the people, and Redgate and Microsoft have the products, there is often a gap in the process. Step in Redgate’s Certified Partners.

We have a growing number of consultants across the globe who can help you with the installation, customization and training of our products to get you on the road to database DevOps.


We do DevOps for databases

We’re looking for bright people to join us

We’re looking for Account Executives in Pasadena, CA and Austin, TX.
We’re looking for Software Engineers and UX Designers in Cambridge, UK.

red-gate.com/careers