Changing the Shape of E&P Data Management

Summer 1997

Thirty years ago, oil companies owned drilling rigs. Twenty years ago, they owned seismic vessels. Today, both activities have been outsourced. Now, management of oil company data—often called the very heart of a company’s value—is also being outsourced. Here is a look at three companies that have chosen this path, and their observations on what they’ve learned so far.

Robert Beham, Conoco Inc., Aberdeen, Scotland
Alastair Brown, Chris Mottershead and Jane Whitgift, BP Exploration Operating Co. Ltd., Aberdeen, Scotland
Joe Cross, Conoco Inc., Lafayette, Louisiana, USA
Louis Desroches, Caracas, Venezuela
Jørgen Espeland, Aberdeen, Scotland
Michael Greenberg, Paul Haines and Ken Landgren, Houston, Texas, USA
Ignacio Layrisse, Intevep, S.A., Caracas, Venezuela
Jairo Lugo, Lagoven, Caracas, Venezuela
Orlando Moreán and Eugenio Ochoa, Petróleos de Venezuela, S.A., Caracas, Venezuela
Dennis O’Neill, Houston, Texas
Jim Sledz, Conoco Inc., Houston, Texas

1. Balough S, Betts P, Breig J, Karcher B, Erlich A, Green J, Haines P, Landgren K, Marsden R, Smith D, Pohlman J, Shields W and Winczewski L: “Managing Oilfield Data Management,” Oilfield Review 6, no. 4 (July 1994): 32-49.

For help in preparation of this article, thanks to John Adams and Alan Woodyard, Conoco Inc., Houston, Texas, USA; Cyril Andalcio, Alan Black, Herve Colin, Alberto Nicoletti, Todd Olsen and Matt Vanderfeen, GeoQuest, Caracas, Venezuela; Carmelo Arroyo, Zak Crawford, Geraldine McEwan and Tom O’Rourke, GeoQuest, Aberdeen, Scotland; Bill Baksi, Meyer Bengio, Knut Bülow, John Dinning and Mike Rosenmayer, GeoQuest, Houston, Texas; Karen El-Tawil, Geco-Prakla, Houston, Texas; Jay Haskell, Schlumberger Oilfield Services, Caracas, Venezuela; Brent McDermed, Legacy Solutions, Houston, Texas; Tony Oldfield, Schlumberger Integrated Project Management, Aberdeen, Scotland; John Pooler, BP Exploration Operating Co. Ltd., Aberdeen, Scotland; David Scheibner, Schlumberger Austin Product Center, Austin, Texas; Gustavo Valdés, Centro de Petróleos de Venezuela, S.A., Caracas, Venezuela.

LogDB, LogSAFE, Finder and FMI (Fullbore Formation MicroImager) are marks of Schlumberger. Dwights is a mark of Dwight’s Energydata, Inc. Openworks is a mark of Landmark Graphics. Oracle is a mark of Oracle Corp. PI is a mark of Petroleum Information/Dwights LLC. POSC is a mark of Petrotechnical Open Software Corp. QC Data is a mark of QC Data Holdings Limited. Sun is a mark of Sun Microsystems, Inc. UNIX is a mark of AT&T. VAX is a mark of Digital Equipment Corp.

The long, hard pull through a decade of low oil prices yielded some sweet rewards for the exploration and production (E&P) industry. Many inefficiencies in E&P operations have been overcome; organizations have been downsized, rightsized and reengineered to strike quickly at short-lived opportunity; more powerful technologies are efficiently mobilized; the fastest computers and the ablest software are purring along; and multidisciplinary teams are empowered to maximize the motivation of each player. Economic necessity and a new posture of looking forward rather than just surviving provided the impetus for all these advances. And yet, still more is needed to reach even higher levels of productivity. What do you do for an encore?

You set upon the thorny, unglamorous and indispensable world of data management.

Today, with E&P organizations driving to enhance productivity and recovery ratio, data management has moved into the mainstream. The transition has taken some time. In the early 1990s, the E&P community focused on cost containment, and to contain costs in data handling, the first bits of data began to trickle off legacy mainframes and onto distributed desktop workstations.1 It was a new world, and the questions of the day concerned mechanics and economics—How does it work? How much will it save us? What is the best technology and work process for our needs? By the mid 1990s, the



desktop environment began to mature and stabilize. Data storage problems diminished. Software grew more robust and powerful. The walls between disciplines—“data silos”—began to crumble. The focus in data management shifted from savings to value creation, still with an eye to making the technology work and to the promise of seamless data integration.

By the late 1990s, the data management story had changed as it moved to the high-priority list. No longer is technology itself the main concern—not only because its novelty has worn off, but also because its rapid cycle time has elevated expectations. Now, users don’t want more technology; they want the technology to disappear, to become transparent. Data management issues today are less about technology, and more about changing the business process to become more efficient: How do we improve the way we manage data, and who does what? If we outsource, what do we keep doing, what do we outsource, and how do we manage it?

Changing Data Management is Changing Culture

A major trend today in E&P organizations is learning how to successfully outsource all or part of data management (right). While outsourcing is far from new, it is new to the field of E&P data management.

Several forces are driving this change. One is the growing volume, diversity and complexity of the data itself. Not only is there more data, but there is more data in a greater variety of forms—digital files, images, tables, text, maps and physical specimens, such as fluids, cores and thin sections; data from more vendors and from an ever-increasing number of partners; and data from various generations of processing and interpretation (next page). This means the task of management has grown along with the complexity of data. In addition, as data diversity and data volume grow, the number of interpretable relationships between data entities can become unmanageable. “People can typically handle about 10 to 15 unrelated items at a time,” said Knut Bülow of GeoQuest. “Some very good people can handle 25 items. But in complex settings, there may be upward of 50 different pieces of information that need to be integrated.” The future will require developing ways to handle shortcuts and refine information about the information, to break it into manageable pieces.
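Bülow’s numbers can be made concrete with a simple count: if any two pieces of information may be related, the number of potential pairwise relationships grows quadratically with the number of items. A back-of-envelope sketch (the pairwise model is our illustration, not taken from the article):

```python
def pairwise_relationships(n: int) -> int:
    """Number of distinct pairs among n items: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# The thresholds Bulow mentions: 10 to 15 items, 25 items, 50 items.
for n in (10, 25, 50):
    print(f"{n} items -> {pairwise_relationships(n)} potential pairwise relationships")
# 10 -> 45, 25 -> 300, 50 -> 1225
```

Going from 15 items (105 pairs) to 50 items (1225 pairs) multiplies the potential relationships more than tenfold, which is why breaking information into manageable pieces becomes the only workable strategy.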

A second factor is the drive to increase the productivity of the E&P process. At BP, for example, a chief motivation is to free geoscientists to concentrate on building reserves. For Petróleos de Venezuela, S.A. (PDVSA), it is to double production by increasing internal efficiency and by opening assets to partnerships with multinationals, both of which require state-of-the-art data management. In most organizations, improvements in productivity require more streamlined and efficient data handling.

Other driving factors are cost reduction and a refocusing within operating companies on the core business of finding and producing oil. For cost reduction, significant savings come from outsourcing management of large volumes of bulk data to shared infrastructures, either government centers or commercial operations. The emphasis on core business follows the goal of increasing production with the same personnel levels, and therefore having geoscientists focus on finding and producing oil, not on noncore activities like data management.

But why turn outside for solutions? Not all E&P companies do. Some still prefer to build and operate their own systems. The growing numbers that have sought outside assistance, however, often have similar observations:
• “We could not afford to do it all and be the best.”
• “Service companies are in a better position to advance certain technologies efficiently and stay ahead of the fast rate of technological change.”
• “Our in-house system worked fine, but sharing data with a bigger group of partners was difficult.”
• “We now probably spend 10% of our time manipulating data, down from about 30% before.”

Some report difficulty in making the transition, and some initial loss of efficiency. Whether this dip happens, and its duration and depth, depend on conditions before the change.

■ Comparison of how data have been handled in the past, and the direction taken by some oil companies toward outsourcing. In general, the traditional approach of the oil company was “my data, my people, my machines, my software, on my premises.” Today, “my data” stays the same, but everything else changes. Oil company personnel are still involved in overseeing management, and in high-order data validation, which requires intimate local knowledge, or which invokes a significant HSE concern. Similarly, it is likely that at least some data management will continue to be conducted on oil company premises, although many operators are moving some tasks—especially those performed easily in batch mode—off-site to regional data centers.


“We did not experience a decline in productivity after initiating our new approach,” said Chris Mottershead of BP. “For us, the data themselves weren’t the source of inefficiency. Our data were in good shape. It was a people problem—we were down to so few people who understood data management. This was a key motivation for us to quickly bring our data partners up to speed and make the system work.”

For Conoco, which started its effort after BP, the benefits are still in the future, as efforts now focus on building a critical mass of high-quality, validated data. “We don’t see the benefit yet,” said Jim Sledz. “We won’t see it for a while. We’re still laying the foundation for the house.”

As part of the move toward outsourcing, oil companies are taking three approaches to shared data centers, all of which offer an efficiency of scale not possible for a single operator: government-led efforts, operator initiatives and service company initiatives.

National archives have been in place for some time in the UK, Norway, Kazakhstan and Algeria. The goal common to all of these efforts is to reduce the cost of data handling by reducing the need for multiple copies and by gaining economies of scale associated with centralized data stores. In the last few years, these efforts have become more active in processing data requests, and are starting to function more as live libraries than archives.

One of the more ambitious operator-led efforts was initiated in 1995 in London, England. A consortium of 40 operating companies working in the UK sector of the North Sea formed the Common Data Access (CDA), an archive near London that houses all partner data and public data, in which firewalls protect data privacy. Examples of vendor-sponsored efforts are the LogSAFE program and the Houston Data Management Center, which was established this September for companies operating in North America. The LogSAFE program is an on-line service providing log data archiving and transmission, with data access in 24 hours or less. Active since 1991, the program now covers the most active areas of the North American market.

In the move toward outsourcing and data consortia, all oil companies are seeking essentially the same goal with their data management: immediate, seamless and secure access to the greatest possible diversity of relevant, high-quality data and interpretations. With this capability, geoscientists can entertain more “what-if” scenarios in less time and develop a higher degree of certainty about each decision. Evaluation of more prospects and higher certainty ultimately translate into fewer lost opportunities and greater overall productivity.

There are nearly as many paths to this vision as there are travelers on the path (see “How Do You Relate to Data?” next page). Within each operating company, there are often three types of people who share the same goal but have varying—and sometimes conflicting—views of how to get there: Data Visionaries, Geoscientists and Data Managers.

Data Visionaries—Typically a mid-level manager, sometimes at the VP level, charged with framing a vision and building consensus and mechanisms to achieve it. They may have arrived at this position through the geoscience path. Visionaries view good data management as a key to higher productivity and a tool for better risk management. They may not view all of it, however, as a core expertise. Visionaries tend to view software and hardware as “less is more,” and favor more sharply defined use of technology to solve problems at the heart of value generation.

Geoscientists—Charged with getting the job done, the geoscientist is often the focus of process reengineering. He or she has the most intimate knowledge of information requirements, yet sometimes has conflicting interests in data management. Geoscientists may load or verify their own data because they don’t trust the existing system, or because of the temptation to “do it all myself because I can.” And yet, they ultimately want to have as little as possible to do with data management. For them, data management is nonproductive time, diverted from picking formation tops or fairways. Because geoscientists may have seen data management initiatives come and go, they may rely on proven and often proprietary home-grown solutions that enable individuals to work, but prohibit the flow of knowledge to others. Nevertheless, they are often open to better ways of working.

Data Managers—Data managers oversee a team that loads, formats and manipulates data, and performs some data quality assurance, often up to the gray area that marks the beginning of interpretation. The data manager and the team members may have come from a systems science or computer science background and have some degree of geoscience exposure, often from on-the-job training. Data managers view data management as the lifeblood of the E&P process and therefore as a core expertise.

To understand the diversity of these views from a variety of organizations, Oilfield Review asked the same six questions of E&P professionals in three oil companies: PDVSA; BP Aberdeen, Scotland; and Conoco USA. This trio represents three approaches to data management. PDVSA, a major national, is in the early stages of revamping its data management to attract multinational investors and double production by the year 2007. BP Aberdeen is the most advanced and aggressive in outsourcing, and Conoco is taking a middle ground in outsourcing. From each, a range of perspectives is represented, from geoscientists and IT support people in the trenches to data visionaries.

■ Where generations of data meet. Geoscientists, graphics specialists and data processors at GeoQuest in Caracas work side by side on different parts of the same data set for a PDVSA project. Proximity saves time in collaboration on data updating and quality checking.


Petróleos de Venezuela, S.A. (PDVSA)
Caracas, Venezuela

Ignacio Layrisse
General Manager of Exploration and Production, Intevep, S.A.

Jairo Lugo
Information Technology Exploration Geologist, Lagoven

Orlando Moreán
E&P Consultant for the BADEP project, PDVSA

Eugenio Ochoa
Manager of E&P for the BADEP project, PDVSA

Venezuela produces 3.2 million barrels of oil per day, ranking as the fifth largest oil producer worldwide.2 All upstream and downstream operations are overseen by an arm of the government, PDVSA. Under PDVSA the three major exploration and production affiliates are Lagoven, Corpoven and Maraven. INTEVEP is the research and development arm of PDVSA. In January 1997, PDVSA spun off data management to a project called BADEP, which is coordinated by representatives from PDVSA, its affiliates and GeoQuest.3 Its objective is to migrate data management from legacy systems onto standardized, commercial data management and interpretation systems. The BADEP team, which by year end will consist of about 150 people at 21 sites, also establishes service lines to support about 2000 data users, and performs data loading and some data quality assurance (next page, top).

The BADEP approach was viewed as a way to most rapidly advance the data management system and to attract investment from multinationals. This investment is seen as a necessary step in the effort to double Venezuela’s oil production by 2007. Training needed for migration to newer data management systems is coordinated through the education arm of PDVSA, Centro Internacional de Educación y Desarrollo (CIED), located in Caracas.

As one of the largest corporate-wide efforts, the data management task is huge. It involves three main steps. The first is migration to the Finder system, and creation of a common database system for all three affiliates (next page, bottom). The second is validation of data, some of which date back nearly 80 years. This step alone will take several years to complete. And the third is to develop a work flow to capture, load and validate new data, and establish a means to update interpretations and make data corrections. During all three steps, effort is focused on moving users from the old concept of data management as a “safe” for data, toward a new concept of live information management—making sure value added by interpretations is preserved and accessible.
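To give a flavor of what the validation step involves, here is a minimal, hypothetical sketch of a rule-based check on legacy well records. The field names and rules below are invented for illustration; they are not BADEP’s actual system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WellRecord:
    # Hypothetical fields, invented for illustration.
    name: str
    spud_year: int
    total_depth_m: float

def validation_issues(rec: WellRecord) -> List[str]:
    """Flag obviously questionable values in a legacy well record."""
    issues = []
    # Venezuelan wells in the article date back to 1920; anything
    # outside a plausible window is suspect.
    if not (1900 <= rec.spud_year <= 1997):
        issues.append("spud_year out of plausible range")
    if rec.total_depth_m <= 0:
        issues.append("non-positive total depth")
    return issues

legacy = [WellRecord("LL-001", 1920, 1450.0), WellRecord("LL-002", 1897, -1.0)]
for rec in legacy:
    print(rec.name, validation_issues(rec) or "OK")
```

In practice, records that fail such automated rules would be routed to people with local knowledge, since, as noted elsewhere in the article, high-order validation remains an oil company responsibility.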

What are your top data management challenges today? What were they three years ago?

All
• Improving productivity of geoscientists to reduce the time they spend accessing data.
• Improving data quality. Data up to five years old are in satisfactory shape, but older legacy data—we have wells going back to 1920—are in questionable condition.
• Creating an integrated working environment, in which the data, systems and applications function as a single unit. Now we have geophysical data in one database, core data somewhere else, logs in a third place. We want geoscientists to have it all at their fingertips, and to be able to work as if all data reside in one place.

2. BP Statistical Review for 1996: the top five producers are Saudi Arabia, USA, the Russian Federation, Mexico and Venezuela.

3. BADEP is a Spanish acronym for “exploration and production database”: Banco de Datos de Exploración y Producción.

How Do You Relate to Data?

It is written, in the apocrypha of information science, that how you think about and work with data relates to your age. If you want to move out of the box you are in—mainly, if you want to act younger than you are—you will have to work at it, like learning a new language.

As with all generalizations, exceptions abound. And within organizations, nothing accelerates a culture change as much as a resounding success with a new method. But at the beginning of a new initiative in data management, the reaction to the change may follow these lines....

People work with data as if they belong to one of three categories: Baseball Kids, TV Kids or Video Kids.

Baseball Kids (age 50+) grew up before television became a defining cultural medium. They played baseball (or soccer, or cricket) instead of watching TV. They learned computing as adults and are occasionally comfortable with it. They might have written some code, and might even have an advanced degree in computer science. Today, they are senior managers and executives. They keep a pocket calendar that they update in pencil.

TV Kids (age 30 to 49) grew up with television, probably did FORTRAN programming on punch cards in middle school and cut their teeth on home-built microcomputers. They equate power with hardware and software: “The more tools I have, the more control I have; the more control I have, the more I can do.” They are least likely to willingly surrender control of data or equipment. As geoscientists, they tend to have multiple computer screens, disk drives and tape drives. They enjoy writing code and take pride in their macros. They have a calendar program on their desktop computer.

Video Kids (age 30 and under) grew up with video games, and by the time they left college, the dormitories were wired to the Internet. They never used a typewriter or carbon paper. They are not likely to care about control of data, since they are used to data being served to them. They do not care where data reside. Their sense of power resides in work they do with data, not on data. They can write code, but prefer to let others do it. They organize their life with a PDA, which they download to their notebook computer; they eschew paper. (PDA: a personal digital assistant, which sits in your palm; see local Video Kid for a demo.)


Orlando Moreán
My first priority is to manage not just the library, but the meaning of the library. As we continue to outsource a larger volume of our data management, we need to find a way to include not just the numbers, but also the interpretations.

Data volume is an issue. We simply have more data today than ever before. So any management system we establish must be powerful enough to function with an exponential growth in data. At the same time, we may be focusing on an increasingly smaller percentage of data. So we need to be able to quickly get to the most important data, which are buried ever deeper under a growing pile.

Trust is also a key factor. People used to rely on their own resources. Now, I am asked to give my knowledge to someone else. Trust has to come from the top down, and that trust needs to be built with milestones everyone can see. In the first stages of outsourcing, it is essential to have an early and persuasive success to build confidence when trust is just beginning to be built.

How do you measure the efficiency of your data management in financial terms? Give an example of how a change in work process or technology provided you with a significant efficiency gain.

Orlando Moreán
One measure of efficiency is the length of the data path from the source to the database. The longer the path, the lower the efficiency. Three years ago, it took a person on the rig an hour to fill out the daily drilling report, then send it to the regional center. Now, the system on the rig is wired to a database at the regional center. This allows rig personnel to compare their well with others in the area, and then drop the report directly into the database.

Ignacio Layrisse
Our gas-lift operations in Maracaibo involve 6000 wells, probably the largest such operation worldwide. In our gas-lift programs, the process from starting analysis to taking action took about 10 days. Lagoven took data from all 6000 wells and, after cleaning up the data, put them into a regional database. Using a parallel processor, now two or three engineers can do in two or three hours what took 20 engineers more than a week.
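Layrisse’s figures imply a striking reduction in effort. A rough calculation, assuming a 40-hour work week (the article gives only “20 engineers, more than a week” versus “two or three engineers, two or three hours”):

```python
# Rough effort comparison implied by Layrisse's figures.
# The 40-hour work week is our assumption, not from the article.
before = 20 * 40   # 20 engineers x one 40-hour week = 800 person-hours
after = 3 * 3      # 3 engineers x 3 hours = 9 person-hours
print(before, after, before / after)   # roughly a 90-fold reduction
```

Even allowing generous slack in the assumptions, the cleaned, centralized database with parallel processing cuts the effort by well over an order of magnitude.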

■ What data loading looks like. Jesus Pagua, a member of the BADEP team at PDVSA, loads data into the LogDB system for Corpoven.

■ Training for the move to a new data management system. Cyril Andalcio of GeoQuest, center, runs a training school that teaches PDVSA affiliate geoscientists how to use new data interpretation and management tools. Across all PDVSA affiliates, more than 400 data users will receive training at Centro Internacional de Educación y Desarrollo (CIED), the company’s education center in Caracas, Venezuela.


Based on your experience, what are the hidden costs of data management?

Orlando Moreán
The problem is often having too much technology, and redundancy between databases. At worst, 70% of the time can be spent trying to find data.

Jairo Lugo
Poor tape storage can be a hidden cost. If tapes need to be transcribed, then the added cost of better storage is trivial compared to the cost of recopying.

In making the transition to outsourcing of data management, what are the foremost benefits? The foremost pitfalls?

All
Benefits:
• Cost savings.
• Allowing geoscientists to concentrate on their core business: evaluating prospects and finding oil.
• Incorporation of best practices, drawing on a vendor’s broad experience.

Pitfalls:
• Retaining “data translators,” people who speak both the language of data management and of geoscience.
• Assuring that the outsourcing vendor invests an “emotional account” in our data, to be assured that they are safeguarded. “I prefer the word ‘alliance’ to ‘outsource,’” Ignacio Layrisse said. “Outsourcing sounds like I am getting rid of my data. In an alliance, the vendor keeps in contact with me and my reality—it is like running a three-legged race. You can win if you coordinate your efforts, but you fall flat if you are out of step.”
• Protecting data confidentiality.

Which data management functions are considered core expertise?

All
Data quality assurance remains a core function at PDVSA. Second, we need to keep some data management expertise in-house to understand the commercial data management system and provide constructive feedback to determine its optimum use.

What is your data management dream? How do you imagine your data management in the year 2000 will be different from what it is today?

Jairo Lugo
My dream is to have it all on a map, to have everything linked to a geographic coordinate.

Orlando Moreán
All revisions are automatically updated. My dream is for a customized data system that knows what I want to know. It might even learn from my queries.

Eugenio Ochoa
By the year 2000, the walls between geoscience disciplines will have come down completely. The geologist, geophysicist and reservoir engineer, for example, will be working in a seamless asset team using what appears to them as a single database. All data are immediately available to all team members to view, interpret and use. There are no separate reports by discipline, but the team makes one report, with all knowledge shared in one database. This will be necessary to get to the next level of integration: continuous, real-time dynamic updating of the field model.

Ignacio Layrisse
To reach this next level of integration we need a culture change, toward the Japanese model of the team, in which responsibilities are shared and felt by all team members. For people to get out of the mindset of “these are my data and part of my pay is what I know,” they need to achieve—as Orlando suggested earlier—a strong and early success in a team enterprise. An indisputable success is the only way to achieve the culture change.

BP Exploration Operating Company Limited
Aberdeen, Scotland

Alastair Brown
Senior Petrophysicist/Geologist

Chris Mottershead
Technology Business Manager

Jane Whitgift
Business Information Manager

Nearly a quarter of the oil consumed in the UK comes from North Sea fields operated by BP, and the majority of those operations are coordinated from BP’s center in Aberdeen. As part of a larger effort started four years ago to lower cost and improve its productivity, BP Aberdeen moved aggressively toward outsourcing of data management. In two years, from 1993 to 1995, the cost of data management fell by nearly 60%. Now an industry leader in this approach, BP has drawn attention across the industry. Chris Mottershead, the technology business manager who helped frame the new path for BP, has consulted with visitors from nearly all petroleum provinces.

Today, GeoQuest handles BP’s data management and a major portion of graphics support, and provides information technology support—hardware, local- and wide-area networking and desktop support—under a teaming arrangement with Science Applications International Corporation (SAIC). Around two dozen GeoQuest people work for BP—a small service delivery team is based onsite while the bulk of staff is located in the GeoQuest Service Centre about seven miles [11 km] from the BP offices (next page, top). Well, wireline and seismic data are all stored at the offsite location and accessed from the customer’s desktop via a 34-MB/sec network link.

Page 7: Changing the Shape of E&P Data Management

Summer 1997 27

BP groups its data into two categories, structured and unstructured. Structured data are electronically managed, usually by a database system, providing security, integrity and lineage control. These data—including well, geological and production information, well logs and seismic—are managed by GeoQuest as a shared resource for all appropriately authorized users. By volume, about 80% of data are structured (below).

Unstructured data reside with, and are controlled by, geoscientists, and generally consist of users’ live, working documents, such as maps and reservoir simulation outputs. Unstructured data are available to, and useable by, a few individuals and small groups. Knowledge of integrity, currency and lineage, however, generally resides with only one individual.

“In our projects we try to structure as much data as possible,” Chris Mottershead said, “but in the end, we live with the contradiction that our smallest but most valuable volume of data is not structured. We can see a future in which Web-based search engines will provide tools to allow us to flexibly store and retrieve these unstructured data.”

■ Geraldine McEwan, left, with Jennifer Laird restoring BP map files from backup in the server room at the GeoQuest office in downtown Aberdeen. The bulk of BP digital data is managed from this site and served to users at the BP office on the other side of the city.

■ A BP Aberdeen view of data management. BP strives for full integration between its three main data sets: subsurface, wells and facilities. Ultimately, a user would be able to bring up a seismic trace or a finance memo with equal ease. On the vertical axis, data concern only facts, such as “porosity is 20%.” Information would be “20% porosity means we are above the Dunlin formation;” knowledge would be “we might get a water influx;” and understanding would be “despite all the indicators, we will not get a water influx.” On the axis coming out of the page, BP views the shared earth model as “allowing us to worry about managing only the few percent of data that are critical to our business decisions,” said Chris Mottershead.


28 Oilfield Review

What are your top data management challenges today? What were they three years ago?

Chris Mottershead

Three years ago, our goal was to lower data unit costs, such as the cost of data management per head, per barrel of reserves added or as a percentage of lifting cost. We lowered these costs so efficiently that we found ourselves with only a few people who knew data management. In a way, we had become victims of our own success, and placed ourselves at risk. What if our handful of key people left, or moved to other jobs? We saw outsourcing as a means to lower this risk and develop a single reliable source of data management.

When we talk to other industry people about why we chose this path, there is a temptation to take the BP solution and plug it in elsewhere. What we did, however, was motivated by our business structures and goals: because it works for BP does not mean it’s a “plug-and-play” for everyone. Our goal is to double the productivity of our staff to meet our aggressive hydrocarbon productivity goals. We don’t have the people to do it, so the only way to grow is to try a different approach. The way we have chosen is to make geoscientists more productive by letting them shed tasks not core to growing productivity—that is, outsourcing data management.

Jane Whitgift

Our business structure is crucial for making this approach work. BP functions as 12 independent businesses that all focus on the same goal of growing reserves and production. Outsourcing of data management is one device to free geoscientists to be more productive. A related approach is to create learning, integrated business teams. In the past, there were walls between disciplines. The geophysicist might pass his or her information to the facilities engineer and that would be the end of their communication. The facilities engineer might not understand the implications of the geophysics, and as a result, facilities might be underdesigned. Now, people work in “fuzzy teams,” so the geophysicist knows which data are of interest to the facilities engineer, and the problem of underdesigning may disappear. The idea is that if data are shared by an integrated team, each team member will have less data to consider, and only the most important data.

Alastair Brown

Many of the technical problems in data handling of five years ago have diminished. This allows geoscientists to be more productive by creating more time to interpret the data to further enhance the finding and production of additional hydrocarbon reserves.

The second challenge is rapid turnaround of data interpretation, measured from the time the tool comes out of the hole to the time we have an interpretation. Now we can have an interpretation on the desktop in a few hours. Three years ago, it would have taken longer since the process was more awkward and labor-intensive.

The third challenge would be cross-platform communication. We still have two difficulties here: links can corrupt data, and we need to solve the problem of linking multiple databases and multiple formats of data. In this respect, the demands of data are always a step or two ahead of the capability of the system for data handling. It’s the nature of the beast: we don’t know what our data demands will be in three years, so how can we build a system to handle it? For example, today we can display and interactively manipulate on a computer screen core photos at different scales, along with thin sections and an FMI borehole image log. In 1994 we couldn’t do this, and it took a leap to make it possible.

For me, the overall challenge in the future is to further optimize our data management, integration and manipulation processes to ensure that we do not become slaves to our data. A lot of data is just the ammunition used by the geoscientist to better describe the size and flow characteristics of hydrocarbons within the subsurface in order to more efficiently find new reserves and optimally produce existing developments.

How do you measure the efficiency of your data management in financial terms? Give an example of how a change in work process or technology provided you with a significant efficiency gain.

Chris Mottershead

We use surveys of our 300 data users to gauge not only the efficiency, but the success of our data management approach. Three years ago, when we started data management outsourcing, our problem was lack of people, not lack of data quality. Our data were in good shape when we started outsourcing, so that enabled us to get up to speed quickly. We cut costs by 50% within the first year, and improved staff productivity and end-user satisfaction by 10% per quarter. Satisfaction was measured simply by asking users, “Are you happy?”

In 1993, we mapped our work flow and found that we had 500 geoscience applications. By careful winnowing, we cut that to seven core applications and eight specialist applications, then we started the long process of building links between those 15 applications. Users at first rebelled at the thought of giving up technology, but we were able to convince them that less is indeed more when it comes to software. We focused our support resources to make a few applications work well, rather than many that limped along. In the end, we don’t believe it’s valuable to have anyone doing everything; we encourage them to focus on the most valuable things. For example, we now have geoscientists picking tops, not writing code.

Based on your experience, what are the hidden costs of data management?

Chris Mottershead

We probably spend 20% of our time trying to close disconnects in our less-than-ideal data management, mostly with unstructured data. Now we have data that reside in three places: with the user, with the team and in the corporate database. There is some redundancy. Getting rid of duplicate logs, for example, will reduce cost, but not add value. Because we’re talking about the small but significant percentage of data that is redundant, there is probably limited value in rationalizing the user-team-corporate databases into a single database.

Alastair Brown

A cost that is sometimes not considered is that of managing the outsourcing of data management. It becomes too easy to make reports and maps and fancy plots. You suddenly have all these resources at your fingertips and there is a temptation to manipulate data for the sake of manipulating data. Managing this outsourcing resource means you keep everyone focused only on tasks that grow value.

In making the transition to outsourcing of data management, what are the foremost benefits? Foremost pitfalls?

Chris Mottershead and Jane Whitgift

Benefits:
• Longevity and security of data. Outsourcing of data management formalizes data management as a stand-alone task, which falls under the purview of a vendor devoted to that purpose only.

• Releasing interpretation people from data management obligations, allowing them to focus on geoscience.

• Removing islands of data defined by discipline, and allowing for introduction of standards that permit easier sharing of data throughout the value chain. This removal of legacy systems, which tended to be discipline-specific, also allows for easier sharing of data and interpretations with partners and contractors.

Pitfalls:
• Giving cost reduction too much weight. Driving down cost needs to be an issue, but not the only issue. One needs to keep an eye as well on the creation of long-term value.

• Inflated expectations. Any change involves some initial awkwardness. Shifting data management outside the company is no exception. “There is a risk that efficiency and productivity decline, if not well managed,” said Chris Mottershead. “As responsible users of data, we need to keep track of problems, and quickly develop solutions. Initiating a conversation to find a solution is more productive than feeling victimized by the fact that a plotter doesn’t work.”

• Believing someone else can sort out your data better than you can. This is a falsehood. The petrophysicist is ultimately responsible for data quality. You can delegate the act of data management, but you can’t delegate accountability.

• Retaining the ability to manage outsourcing. The degree of intervention in the process of data management depends on the risk associated with the data (left). In the case of a deviation survey, we encourage a high degree of intervention, since erroneous survey coordinates can mean you drill into an existing well, with potentially catastrophic losses.

Alastair Brown

A chief pitfall is the failure to manage the outsourcing of data. An analogy is taking a trip in a fast car: the car is quick and convenient, but it doesn’t know where to go. You need to steer it, otherwise you might end up at the town dump rather than the bank.

A related pitfall is thinking, in the world of software, that more and most recent are better. When it comes to software and data handling, fewer, smarter steps with simple systems are better than many dumb steps with complicated systems. For every minute you spend manipulating and interpreting data in cyberspace, you need to spend twice the time thinking about what you want to obtain from the data, about what are the most valuable steps to take, not what are the possible steps to take.

Another perceived pitfall is becoming reliant upon the vendor, with a resulting loss in our own skills base. We could not easily make a U-turn if things do not work out. I think there are two keys to minimizing this risk. The first is financial. Both parties have to win. The second is people: there has to be a bond and a mutual trust so that any issues of conflict are aired immediately, with the focus not on finger-pointing but on problem resolution. To achieve this bond, for us at least, it is tremendously helpful to have the vendor move in, to live and work with us to understand our culture, work processes and requirements.


The main benefit of outsourcing is saving time. We now probably spend 10% of our time manipulating data, down from about 30% before. I doubt we would go lower than 10%—you need to spend some time checking, working with and validating the data, looking them over. That’s still part of the thinking process.

Another benefit is that in passing data management to the vendor, we also pass along personnel management. Vendors can more easily move staffing levels up and down in response to demand, since they can shuttle people to other projects.

Which data management functions are considered core expertise?

Chris Mottershead

Management of any data associated with a high-risk decision, or affecting HSE, remains a core expertise. We need to know exactly how these data are managed, and monitor the management, but not perform the task ourselves.

Jane Whitgift

Another core expertise is remaining an informed buyer of data management services, to distinguish service level provisions and efficiencies. Data management is like many other of our activities in which we lean hard on vendors. We need to keep and cultivate people with in-depth understanding in key functional areas, such as structural engineering, chemistry—and in data management. We don’t need a whole department, but a person who stays up to speed enough to be an informed buyer. For all data management functions, except those of high risk, we need to know the what and when, but not the how.

Alastair Brown

We need to be able to follow and verify the audit trail of data—and not just the deviation surveys, but also well log data, perforation intervals and any data that are crucial to the safety and success of the well. GeoQuest might do the data handling and formatting and archiving, but BP remains responsible for the data. So 10% of our time will remain occupied with checking and managing this interface. Perhaps in the future it will shrink to 5% when GeoQuest gets to know our data, but it can never go to zero.

What is your data management dream? How do you imagine your data management in the year 2000 will be different from what it is today?

Chris Mottershead

There are two interwoven dreams: the technology dream and the human dream. The technology dream is a shared earth model—a single source that holds all data from all sources. But it is more than a depository of data, since a depository does not add value. What adds value is information about uncertainties inherent in the data, so that you know what are high-quality data and what are of lesser certainty. This will allow you to more fully rationalize different types and qualities of data in the overall reservoir model. You can’t do this today. At present, the small subset of high-quality data is diluted by the lower quality data, and you lose the advantage of the high-quality data.

The human dream is the behavioral changes needed to achieve integrated learning teams. In such an environment, team members would go out of their way to share knowledge. This exchange would be part of the job, not an added extra. “I’m not sure how we encourage this to happen,” Chris Mottershead said. “If we knew, we’d have the problem solved!”

Alastair Brown

With the tremendous growth in new software and hardware technologies to acquire and handle increasing quantities of data, plus the easier routes to manipulate these data, we have a danger of data overload. The exponential improvements in technology allow us to do bigger, better and more things. But it’s up to us to focus on adding value—to use data intelligently and efficiently and not become slaves to our data. I think the companies that will prosper won’t be distracted by the new toys. They will focus on keeping data sets minimal, high-quality and reliable, minimizing data movement and manipulation, and maximizing the quality of their interpretation in order to add value.


Conoco Inc.

Robert Beham
Senior Geophysical Advisor
Aberdeen, Scotland

Joe Cross
Geophysicist
Lafayette, Louisiana, USA

Jim Sledz
Director, Global Exploration Information Management Strategy
Houston, Texas, USA

Conoco is the eleventh-ranked US oil company according to assets, producing 366,000 barrels [58,157 m3] of petroleum liquids per day, with slightly more than 10% coming from the Gulf of Mexico.

Conoco’s operations in the USA have taken a middle road between the approach of BP and what PDVSA hopes to achieve. Their outsourcing is farther along than that of PDVSA, but not as far as BP’s. Conoco has outsourced data management and application support to GeoQuest at five locations, and is migrating from an in-house database running on a VAX to UNIX-based Sun systems that access Oracle databases. About 60% of the work consists of loading data into workstations, building regional master databases, providing graphics support and overseeing archives of physical and digital data, such as seismic data, logs, cores, cuttings, books and journals. The remaining 40% is software application support. Since the start of outsourcing in 1996, Conoco estimates that geoscientists spend 5% to 10% less time looking for and preparing data. A goal is to reduce this time by 50%.

What are your top data management challenges today? What were they three years ago?

Joe Cross

In the last three years, our data management challenges have changed—and in some ways they have remained the same.

In 1993, we decided to move data off a central database in Ponca City, Oklahoma, USA, and push that responsibility out to the four business units, for efficiency and so they could develop a stake in productive data management. At the time, the challenge was finding a system that could handle the vast data volume. The second challenge was to populate the now-dispersed databases with sources of data other than those in Ponca City. The business units had to learn how to secure sources of data that they traditionally did not have to think about. The third challenge was quality control of data.

Quality control was, and remains, a nontrivial challenge. For example, not every data vendor calls a well by the same name. For Offshore Oil Scouts, an API well number might have 10 or 12 digits, whereas Petroleum Information might use 14 digits. If there’s a discrepancy between two well identifiers, you can’t just lop off the last three or four numbers; you need to know the information beyond that embedded in the numbering system, such as a company selling a well to another company, or drilling the well deeper. When we have a 40-ft [12-m] difference in kelly bushing elevation from PI or Dwights, we don’t know which is right and it takes some sleuthing to find the right answer.
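The well-identifier mismatch described above can be illustrated with a minimal sketch. Everything here is hypothetical—the sample numbers and the convention of a shared 10-digit base with extra suffix digits are assumptions for illustration—but it captures the point made in the text: identifiers with matching bases but differing suffixes are flagged for human review rather than blindly truncated, since only a person can tell whether the suffix hides real history such as a sale or a deepening.

```python
# Hypothetical sketch of first-pass well-identifier reconciliation.
# Assumes vendor identifiers share a 10-digit base, with some vendors
# appending extra digits for later well events -- an assumption for
# illustration, not a statement of any vendor's actual format.

def api_base(api: str) -> str:
    """Return the first 10 digits of a vendor-supplied well identifier."""
    digits = "".join(ch for ch in api if ch.isdigit())
    return digits[:10]

def match_wells(id_a: str, id_b: str) -> str:
    a, b = api_base(id_a), api_base(id_b)
    da = "".join(ch for ch in id_a if ch.isdigit())
    db = "".join(ch for ch in id_b if ch.isdigit())
    if a != b:
        return "different wells"
    if da == db:
        return "same well"
    # Matching base, differing suffixes: never lop digits off silently.
    return "same base -- review suffixes manually"

print(match_wells("42-501-20130", "42-501-20130-00-01"))
# same base -- review suffixes manually
```

The deliberate design choice is that the function refuses to merge on a base match alone; the “review” outcome routes the pair to a geoscientist, mirroring the sleuthing the speaker describes.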

The second challenge involves moving data around and handling updates. Part of the Geoshare promise is fulfilled, but it is not seamless yet. It is still sometimes a struggle to move data from the central database to the project databases, then propagate changes in project data back to the central database. Likewise, we get updates and corrections from data vendors. Finding the trail of that data, and making corrections to all the instances, can be difficult.

Robert Beham

Three years ago, there was very little validated, commercial data around in digital form. Only a few percent of log curve data for the Gulf of Mexico was available in digital form, so people relied on the old method of hand-posting. There were 3D seismic surveys, but they were fairly simple data sets—a series of zeros and ones—and to manage these meant having sufficient space and horsepower to handle a monotone data set. So the challenge at the time was to develop digital databases and a methodology for handling them.

Today, we have the databases, and some of the data management software. But an understanding of proper data management is still evolving. The complexity of data still presents a problem. For example, we need to do numerous extensions to tables to cover areas needed in Petrotechnical Open Software Corporation (POSC) and Finder data models.

We have the basic direction and tools. The harder challenge today is developing an understanding of the importance of data. The hardest thing to get anyone to do on a project is to document what they did. To do this, management needs to provide incentives for documenting work—we need to capture both the raw and value-added data.

Jim Sledz

Our challenges have changed in three years because our business drivers have changed. Three years ago, we worked from year to year with no long-term perspective. Oil was $17 a barrel and all we accomplished were tactics to solve current problems and get to the next year. Now that pricing pressure is less and we can think strategically, we can look three to five years down the road, and when we do that, we start to think about fixing problems—like data management—that will require a long-term commitment.


How do you measure the efficiency of your data management in financial terms? Give an example of how a change in work process or technology provided you with a significant efficiency gain.

Joe Cross

Moving from a legacy system to a commercial system mainly means we are more nimble—we can work faster, respond to opportunities faster. In the past, when I relied on the central database in Ponca City, I would call there and ask for base maps for the area in which I was working. I’d get in a queue with everyone else in the company. Eventually, they would build my project database for me: gather seismic traces, shot points, well numbers and so on. I got what I needed to start working within about a week. Now, if I already have the seismic data loaded, I can be up and working in less than a day.

Robert Beham

Efficiency of data management is difficult to express as a business metric, such as more discoveries. Even if you have the most efficient data management system in the world, geoscientists might not be able to use it to make discoveries. Proving the efficiency of a new data system is largely out of the hands of the data people.

Jim Sledz

Decreases in cycle time and in perceived risk are to me the key metrics. By perceived risk, I mean not only fewer dry holes, but an ability to better understand the risk factors of each prospect. We may not be able to quantify the risk, other than feeling that we have a better understanding of its order of magnitude. I don’t think I can directly or realistically relate data management efficiency to barrels of oil produced.

Based on your experience, what are the hidden costs of data management?

All

Taking “hidden costs” to mean those that are not normally acknowledged or tracked, there are many hidden costs to data quality assurance.
• Propagation of error detection. We tend to load data once, from which point they may be served to multiple projects. If an error is detected—such as the navigation data in a seismic set being off by 2°—one project may get the correction, but not all. This results in a cascade of error.

• Removing all interpreters from quality assurance. Geoscientists need to be in the data quality control loop. Data managers are fine at loading data in the right place—they can spot errors like a 15,000-ft [4570-m] log curve placed in a well only 8000 ft [2440 m] deep—in fact the Finder system will flag errors in constraints like this. But information technology (IT) people can’t always tell if a fossil is placed in the wrong horizon. “This involves time in ‘meat-space,’ not in cyberspace,” Joe Cross said. “You need to talk to people, find the experts, work in the real world to solve problems like this. The solutions are still in people’s minds, not on-line. In our approach, we don’t want every geoscientist QCing all data, but a handful of geoscientists working side-by-side with IT people.” Jim Sledz said, “Geologists and geophysicists still need to be in the loop.”

• Data quality control always takes longer than predicted. “It will humble you to realize what has to happen before you can point to a seismic section and say ‘here is a gas zone,’” said Joe Cross. “You have seismic from Geco-Prakla, well surface location data from Petroleum Information, a directional survey from DDI, a velocity survey from Velocity Databank, tops from PDI and well log curves from QC Data. That’s six vendors. For an interpreter to point to an intersection of a seismic trace and a well, all six vendors have to be spot-on in putting the right data in the right location—and they all have to agree on location conventions.”

• Data maintenance. Most people stop thinking about the cost of data once loading is complete, but data require continuous maintenance. “If I get another tape of new data that supplements or updates what we have,” said Jim Sledz, “that still costs a fair bit to load and make sure it does not cover up edited Conoco information.”
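The automatic constraint flag mentioned in the list above—a log curve deeper than the well it is loaded into—can be sketched as a simple check. The record fields and sample values here are hypothetical illustrations, not the actual Finder schema.

```python
# Hypothetical sketch of a mechanical database constraint check of the
# kind described above: flag any log curve whose bottom depth exceeds
# the total depth of its well. Field names are illustrative only.

def depth_violations(wells, curves):
    """Return (well_id, curve_id) pairs where a curve outruns its well."""
    td = {w["well_id"]: w["total_depth_ft"] for w in wells}
    bad = []
    for c in curves:
        well_td = td.get(c["well_id"])
        if well_td is not None and c["bottom_depth_ft"] > well_td:
            bad.append((c["well_id"], c["curve_id"]))
    return bad

wells = [{"well_id": "W1", "total_depth_ft": 8000}]
curves = [
    {"curve_id": "GR-1", "well_id": "W1", "bottom_depth_ft": 15000},
    {"curve_id": "GR-2", "well_id": "W1", "bottom_depth_ft": 7900},
]
print(depth_violations(wells, curves))  # [('W1', 'GR-1')]
```

Checks like this catch loading mistakes; as the speakers stress, they cannot catch semantic errors such as a fossil assigned to the wrong horizon, which still requires a geoscientist in the loop.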

In making the transition to outsourcing of data management, what are the foremost benefits? Foremost pitfalls?

All

Benefits:
• Importing a growing body of data management expertise. Vendors bring in expertise that we did not have, such as knowledge of Structured Query Language, and how to structure Oracle databases.

• More appropriate and flexible staffing. By having GeoQuest examine our data needs, they can more accurately determine the staffing level needed, and can more easily move people on and off the project as needed.

• We’re getting results: data are being loaded, the systems we planned are being built, and physical libraries are being built and put in order.

Pitfalls:
• Loss of expertise. We hate to lose the expertise we once had, and to see resourceful people leave the company.

• Dip in efficiency. The vendor needs to learn our data management history and how we did things. “No matter how well you plan,” Joe Cross said, “you will have a brownout during the transition.” To shorten this period, data managers need to work closely with interpreters, at least until a critical mass of validated data is assembled.

• Steep learning curve. The E&P industry does not have a long history of data management skills, so there are few true experts. We are all, operators and vendors, learning as we go.

• Determining the pricing structure based on the tasks. With ongoing work, such as a help desk or library filing, paying per person makes sense. But for projects, it is probably more economic to negotiate a per-project cost.

Which data management functions are considered core expertise?

All

Data management in Conoco today is confined to four areas: large-scale project management, management of outsourcing, contract management and setting data management strategy (which involves defining the data processes, such as how loading is carried out and how correction factors are applied). The company has moved toward the position of setting goals and establishing techniques, and overseeing implementation rather than performing implementation.

“My opinion has evolved over the last three years,” said Joe Cross. “At the outset, outsourcing didn’t seem like a good idea to me. I felt it was critical that we guard the bin, so to speak. We had our own people manipulating, moving, sifting and dispensing data. Now that I’ve seen the new system work, I can see its benefits in new expertise, and in more flexible headcount. However, some degree of data QCing remains part of my job. For example, if GeoQuest is loading or mapping with data that I’ve worked on for 17 years, I have a pretty good idea what’s right and what’s not.”

In Conoco, we think of data management as lying along a continuum, and the closer a function comes to interpretation, the more it remains a core function. We have a position called “geodata specialist.” These specialists work in an exploration or exploitation team, setting up projects, generating base maps, and loading data. They have a direct interaction with geoscientists and engineers. Their role is part of the exploration process, not just an information management process, so it is considered core. But around the corner are people who just load data or just manage physical records, and their role is not considered core. Generally, if we are dealing with simple facts, we are farther from core; if we are dealing with information about facts, we’re closer to core. “Project management of data management is a core function, but data management itself is not,” said Robert Beham. “Project management needs to be a full-time job, and perhaps one done by more than one manager per office.”

What is your data management dream? How do you imagine your data management in the year 2000 will be different from what it is today?

Joe Cross

Seamless and transparent integration across platforms. I want to go into the Finder system, draw a rectangle on a map, and have everything that pertains to that rectangle put into my Openworks project.
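The rectangle-on-a-map selection described above amounts to a bounding-box query, which can be sketched as below. The record fields and coordinates are hypothetical, and a real system would use the spatial indexing of the underlying database rather than a linear scan.

```python
# Hypothetical sketch of "draw a rectangle, gather everything inside":
# collect every record whose map coordinates fall within the rectangle.
# A production system would use a spatial index (e.g. an R-tree) rather
# than scanning; record fields here are illustrative only.

def in_rect(x, y, rect):
    xmin, ymin, xmax, ymax = rect
    return xmin <= x <= xmax and ymin <= y <= ymax

def select_for_project(records, rect):
    """Return the records (wells, traces, surveys) inside the rectangle."""
    return [r for r in records if in_rect(r["x"], r["y"], rect)]

records = [
    {"id": "well-A", "x": 10.0, "y": 20.0},
    {"id": "trace-7", "x": 55.0, "y": 60.0},
]
print([r["id"] for r in select_for_project(records, (0, 0, 30, 30))])
# ['well-A']
```

The hard part of the dream is not this filter but the integration behind it: resolving every selected record across platforms and loading it into the target project automatically.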

Robert Beham

• Recording data provenance. I would like to see clean data residing in a secure, central location, where the data pedigree is easily determined.
• Capture of value-added data. Today we don’t capture interpretations in a way that allows them to be easily verified or built upon. There isn’t sufficient documentation to explain a rationale for picking tops or net gross for sandstone. Anyone who examines the data can’t reconstruct the interpretations, so they tend to be trashed. We need effective documentation to build collective corporate knowledge grounded with evidence, instead of “I remember when...”

Today, there is a small group of people who are keen to document what they do and make it available to the next interpreter. We need to institutionalize that spirit. If I have a source rock distribution map that’s just paper in a folder—that works OK, if people know to go to the file. But it would be better to have it digitally, attached to a metadata list—a list of data available.

Jim Sledz

In a single sitting, I’d like to be able to determine what data are available for a project, both internally and externally, and get them loaded within a couple of hours—excepting the physical data, which might take a few days. Now I might have to go to nine people, or tap into 15 systems. There is tremendous value for me if I can do it all in one stop.

Common Threads in Data Management

This brief survey shows that even companies of diverse cultural backgrounds and at different points in their development efforts share some perspectives:
• Quantifying the payback: Operating companies had, to varying degrees, metrics in place to gauge costs of conventional data management. In retooling and restructuring data management, new metrics need to be established to match the demands of the new system.

• Cross-platform compatibility: There is a shared vision of a single point of contact with all kinds of data—diverse types (physical, digital, images, text), different generations in the interpretation cycle and from different disciplines.

• Walking the fine line: There is internal discussion of which functions and skills the operating company retains and which are delegated to a service partner. Even when data management is largely shifted outside, there is debate about the best way to maintain the expertise necessary to oversee data management and plan its strategic direction.

• Quality control: Data quality remains a central concern. Any improvement in work process is incomplete unless it also addresses means to improve and maintain data quality. —JMK