45 results for "Bob Harrison"
Search Results
2. FELTAG in rearview: FE from the past to the future through plague times
- Author
-
Howard Scott, Alison Iredale, and Bob Harrison
- Published
- 2022
- Full Text
- View/download PDF
3. Command college – foresight as a foundation to police executive development
- Author
-
Bob Harrison
- Subjects
Research design, Law enforcement, Public relations, Education, Futures studies, Executive education, Psychology, Curriculum, Criminal justice
- Abstract
Purpose
The education of police executives has been a priority of criminal justice agencies for more than 40 years, addressing the need to professionalize law enforcement in America. Since the 1980s, programs for this purpose have existed, one of which is the California POST Command College. Command College is an academically oriented executive development program intended to "invest in the future" as its students, mid-career police managers, acquire the tools and skills necessary to be promoted to executive positions. This paper aims to answer the question, "Does the Command College achieve its intended goals?"
Design/methodology/approach
A survey instrument was used to obtain the perspectives of recent graduates and of those who had graduated from the program more than four years before the survey. An assessment of the frequency of promotions to command and executive roles was completed, and an external academic assessment of the program's curriculum was carried out by a university.
Findings
Support for the program by graduates increased over time, graduates were promoted at a rate three times higher than baseline averages for police managers, and the program's curriculum was vetted as being equivalent to graduate-level university courses.
Research limitations/implications
As the program's value is validated through this assessment, others can learn how they might better prepare their police executives for the future. No comparable law enforcement program has been assessed in this way, so others may also learn from this example how to ensure they are achieving their intended outcomes. Given the differences among law enforcement leadership programs in terms of student selection and specific goals, direct comparisons would be limited, both by the program differences and by the research designs used by others as they work to validate their success in meeting their goals.
Originality/value
Although law enforcement executive education has existed since 1935, and leadership training programs for the police since 1982, no research has been conducted to validate the outcomes and impact of such programs on their graduates and agencies.
- Published
- 2019
- Full Text
- View/download PDF
4. An Integrated Approach to Reservoir Characterization and Tilted Contact Saturation Modelling Across the Shuaiba/Kharaib Zones of the Thamama Group: A Case Study of an Abu Dhabi Onshore Carbonate Field
- Author
-
Bob Harrison, Ivan Kok, Jan van der Wal, Francis Eriavbe, Alyazia Mohammed, Abdurahiman Kutty, and Aluka Osakwe
- Subjects
Abu Dhabi, Reservoir modeling, Carbonate, Petrology, Saturation, Geology
- Abstract
A Reservoir Rock Typing (RRT) and saturation modelling integrated study of the Lower Cretaceous Group reservoirs in Field X was conducted as part of the required inputs into an ongoing FDP update. The results of this study allowed for a robust 3D modelling workflow that included saturation height modelling of the observed tilted oil-water contact in the aquifer underlying three (3) of the major hydrocarbon-bearing zones. The study covered intervals that, for the purpose of this paper, will be called the XH-4, XH-1, KX-2, XK-3 and XK-2 zones. Rock types were defined using core data integrated with geological descriptions and were distributed in 3D taking into consideration the depositional environments. Porosity, permeability and saturation property distributions were guided by the RRT distribution. In total, 44 wells in the field plus 4 nearby wells were incorporated in this study. This paper analyses the petrophysical, geological and static modelling workflows utilized. The RRT definition was based on the prediction of Self-Organising Map (SOM) electrofacies, which were classified using the petrophysical grouping obtained from lithofacies and depositional environment, permeability and MICP data. As a further step, the predicted RRT was optimised where the Archie saturation indicated that a different rock type was more likely. The analysis of virgin formation pressure data to understand the tilted OWC/FWL concept reveals that different wells have different absolute pressures. When the differences are mapped, no regional trends are observed, which makes a hydrodynamic origin of the tilted OWC/FWL less likely. The free water level (FWL) has been interpreted on a well-by-well basis and mapped across the field for a better understanding of the variation. The pressure data are consistent with the scenario that the XH-4, XH-1 and KX-2 zones all share the same free water level for modelling purposes.
The OWC is more or less flat in the central field area and is about 35 ft deeper in the eastern part of the structure. Although the difference in OWC could be abrupt (compartmentalisation), a gradual tilt seems to fit the data better. Furthermore, the 'deepest oil observations', which relate to paleo conditions, are deeper than the present-day FWL in the east. All these observations would be consistent with a structural tilt towards the east. Similar trends were also observed from the FWL interpreted across key wells having formation pressure data. In terms of reservoir quality, the XK-3 interval has the highest porosity and permeability. XH-1 and KX-2 also contain good-quality reservoir rocks. XH-4 is located above XH-1 and KX-2 and has relatively low-quality matrix properties. It does contain low-porosity (and hence brittle) rocks that can be expected to contain oil when fractured. The deeper XK-3 is probably disconnected from the underlying XK-2, which can be explained by the argillaceous interval between XK-3 and XK-2. XK-2 consists of thin reservoir intervals, so both the Archie saturation estimates and the rock typing work carry significant uncertainty. The derived saturation height (SH) models utilise both a present-day and a paleo free water level per well. Each reservoir interval has its own SH model based on SH functions for each RRT. All defined RRTs with associated SHFs were implemented to allow for the highest-resolution population of the geomodel, though a significant number of algorithms are required for this process. The RRT population of the static models was designed to be obtained from the combination of three distributed parameters: porosity, depositional environment, and reservoir quality, using the algorithms provided. Reservoir quality is the driver most closely linked to diagenesis and local permeability variations.
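The saturation-height workflow summarised above can be sketched in a few lines. The power-law function, the coefficient table and every number below are hypothetical illustrations, not the study's actual SHFs or rock-type parameters:

```python
# Sketch only: a generic power-law saturation-height function applied
# per rock type, using height above a well-specific free water level (FWL).
def sw_from_height(h_affwl, swirr, a, b):
    """Water saturation from height above FWL (ft)."""
    if h_affwl <= 0:
        return 1.0  # at or below FWL: fully water saturated
    return min(1.0, swirr + (1.0 - swirr) * a * h_affwl ** (-b))

# Hypothetical per-RRT coefficients: (irreducible Sw, a, b)
RRT_PARAMS = {"RRT1": (0.10, 2.0, 0.8), "RRT2": (0.20, 3.0, 0.6)}

def model_cell(depth_tvdss, fwl_tvdss, rrt):
    """Saturation for one geomodel cell (depths positive downwards)."""
    swirr, a, b = RRT_PARAMS[rrt]
    h = fwl_tvdss - depth_tvdss  # height above FWL
    return sw_from_height(h, swirr, a, b)
```

In the study's terms, the FWL surface would vary per well (present-day versus paleo), and each of the defined RRTs would carry its own fitted SHF rather than the two toy entries shown here.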
- Published
- 2020
- Full Text
- View/download PDF
5. Low power mode energy demand of household appliances—SELINA and APP projects
- Author
-
Barbara Schlomann, Bob Harrison, Melissa Damnics, Paula Fonseca, Anibal T. de Almeida, and Carlos Patrão
- Subjects
Sustainable development, Energy consumption, Environmental economics, Natural resources, Standby power, Energy efficiency
- Abstract
In recent decades, it has been recognized that the energy consumption of electrical and electronic products in low power modes is an important issue. There is a need to expand energy efficiency efforts beyond simple standby modes into the new, more complex area of networks, thus tackling the new paradigm of living based on the Internet of Things. The European project SELINA carried out a large-scale in-store monitoring campaign, measuring about 6,300 different pieces of equipment. Since then, there is no reference to other similar market surveillance studies being carried out in Europe. In Asia, a market surveillance campaign performed by the Asia Pacific Partnership, with measurements on a regular basis, has been very successful. SELINA results show that 18.5% of the measured products present power values higher than the 2010 EC 1275/2008 regulation threshold in off-mode, and for standby this value reached 31%. When a comparison is made with the 2013 EC 1275/2008 regulation threshold, these values roughly double. The Asia Pacific Partnership results alert policy makers that low passive standby does not guarantee low active standby. Several studies indicate that consumer electronic products are becoming more efficient and their energy consumption is decreasing. However, because the ownership of appliances is also increasing, these improvements in energy efficiency do not seem to have a significant impact on the overall consumption of households. In addition, there is evidence that not all appliances in the market reach the performance announced by the manufacturers. Recent measurements carried out by the Natural Resources Defense Council on flat-screen TVs revealed that their real energy consumption seems to be higher than announced on the label. This shows the urgent need for measurement campaigns, since no market surveillance is being carried out on a regular basis and manufacturers' data cannot simply be taken on trust.
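The compliance shares quoted above come from a simple check of measured demand against a regulatory threshold. A minimal sketch, with made-up sample measurements; the 1.0 W and 0.5 W thresholds are quoted as assumptions about the EC 1275/2008 tiers, not from the paper itself:

```python
# Given a list of measured low-power-mode demands (watts), compute the
# share that exceeds a regulation threshold, as in the SELINA campaign.
def share_exceeding(measurements_w, threshold_w):
    """Percentage of measurements strictly above the threshold."""
    over = sum(1 for p in measurements_w if p > threshold_w)
    return 100.0 * over / len(measurements_w)

# Hypothetical off-mode measurements (watts) for a handful of appliances
off_mode = [0.3, 0.6, 1.2, 0.1, 2.5, 0.4, 0.9, 1.8]
print(share_exceeding(off_mode, 1.0))  # assumed tier-1 (2010) threshold
print(share_exceeding(off_mode, 0.5))  # assumed tier-2 (2013) threshold
```

Running the same comparison with the tighter 2013 threshold is exactly how the "values roughly double" observation arises: more of the measurement distribution falls above the limit.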
- Published
- 2017
- Full Text
- View/download PDF
6. Data room documentation
- Author
-
Bob Harrison
- Subjects
Documentation, Memorandum, Caveat emptor, Due diligence
- Abstract
How do we see what is hidden? Imagine looking at a dog behind a picket fence. You do not see several slices of dog; you see a single dog that is partially hidden by a series of opaque vertical slats. Even though we often cannot see an entire object, the human brain is able to fill in the gaps of missing information and create that image. Scientists refer to this amazing ability as amodal completion (Ramachandran et al., 2010), and it is an extremely difficult thing to do. For instance, researchers have been trying for years, without success, to program a computer to perform the task. So, if our brains are so smart, why do we often struggle to understand and appreciate the complete picture when reviewing opportunities that are outlined in M&A documents? This chapter examines those documents, namely the teaser, the confidentiality agreement (CA), and the all-important information memorandum (IM), from the perspective of the seller and that of the buyer. By detailing key elements of these documents, the text explains what must go into them and what is better left out, illustrating how sellers can make their sales pitches stand out from the crowd. These illustrations are also used to give the prospective buyer's point of view and show how potential investors can read between the lines of an IM to formulate searching and critical questions to put to the seller (Harrison, 2014a). Any buyers or investors who are considering embarking on a due diligence exercise must always keep in mind the phrase "caveat emptor," which is Latin for "let the buyer beware."
- Published
- 2020
- Full Text
- View/download PDF
7. External help and advice
- Author
-
Bob Harrison
- Subjects
Negotiation, Documentation, Control (management), Bidding, Marketing
- Abstract
The seller can remain in total control of the M&A process by taking a Do It Yourself (DIY) approach to advertising the asset sale, setting up and administering the data room, writing supporting documentation, finding and contacting potential buyers, and managing the negotiations and the bidding procedure. However, the skills, resources, and time required to achieve these tasks satisfactorily should not be underestimated. Sellers (and buyers) can get support for deal promotion, data room services, and professional advice on financial, legal, and strategic matters, but such external help is not cheap, the specialist services offered can vary widely in scope and in quality, and the seller tends to have less control over the deal. This chapter discusses the pros and cons of keeping the M&A process in-house or getting external help. It suggests how to go about choosing advisers and what they might cost. Finally, some words of caution are given to those who decide to use deal brokers and advisory services.
- Published
- 2020
- Full Text
- View/download PDF
8. Before the data room
- Author
-
Bob Harrison
- Subjects
Polymath, Advice
- Abstract
Benjamin Franklin, the renowned American polymath of the 18th century, famously said, “By failing to prepare, you are preparing to fail,” and nowhere is this advice more apt than in a data room exercise, where every minute counts. This chapter describes how sellers and buyers should get ready for a data room.
- Published
- 2020
- Full Text
- View/download PDF
9. After the data room
- Author
-
Bob Harrison
- Subjects
Schedule, Process, Due diligence
- Abstract
Tired and footsore, the M&A team returns from its visit to the physical data room (PDR). Yet, there is likely to be no respite for the team members as their management (or client) will be eager to discover what they think about the assets about which they have gathered information. However, buyers must rein in their impatience for feedback from the data room team as there are some key tasks that need attention in order to ensure the due diligence process stays on schedule.
- Published
- 2020
- Full Text
- View/download PDF
10. Due diligence, definitions, and doubt
- Author
-
Bob Harrison
- Subjects
Jargon, Petroleum industry, Audit, Due diligence
- Abstract
Before one embarks on discovering the world of data rooms, it is essential to recognize that all parties do not necessarily speak the same language. Whenever the term "reserves" is used, all interested parties in a potential asset deal should know what it means and have the same definition in their minds. One must never lose sight of the fact that M&A activity means that money (and we mean "lots of money") may change hands, so there is a requirement for due diligence in any oil and gas transaction, as investors should never accept anything at face value. Potential buyers must ensure that a seller's resource estimates are "properly classified, accurately calculated, and appropriately reported" (Deloitte, 2013). To achieve this goal, buyers need to be diligent, to be knowledgeable about resource reporting requirements and limitations, and to read the provided memoranda carefully to examine a seller's claims. This chapter explains what due diligence is and why it needs a framework that defines how resources should be reported. It also introduces the main definitions, conventions, and jargon commonly used by M&A professionals in the oil and gas industry. In this way, it is hoped to get all readers "on the same page" whenever the petroleum resource lexicon is used throughout this book. The fundamentals of a petroleum resource reporting system are described here briefly. Finally, the need for analysts to harbor doubts when auditing hydrocarbon resource volumes is raised.
- Published
- 2020
- Full Text
- View/download PDF
11. The data room
- Author
-
Bob Harrison
- Subjects
Documentation, Workstation, Portfolio, Electronic data, Due diligence
- Abstract
The data room is a potential treasure trove of proprietary intelligence about oil and gas assets that are up for sale, but one must admit that there are more salubrious places in which to ply one's trade. The author has experienced intense scrutiny by security guards before being permitted to enter a windowless room, with flickering fluorescent lighting and drab monochromatic decor. The seismic workstation and computers sat on bare tables, alongside boxes of musty documentation. The link with the world outside is via the seller's staff, who pop their heads round the door every hour or so to ask if one has any questions or needs any missing reports or data to be brought from the archives. This work space is to be one's "home" for the next few days. Welcome to the Orwellian world of pain that is known as the physical data room (PDR). And the virtual data room (VDR) is not much better: the M&A team, which has been tasked with due diligence of a large asset portfolio in an impossibly short space of time, can say goodbye to their families and friends for the project duration as they withdraw into their cubicles each day for complete immersion in an online world, in order to make sense of a morass of electronic data downloads that is continually updated by the seller. This chapter covers the various types of data room one might be faced with and why they are so important.
- Published
- 2020
- Full Text
- View/download PDF
12. Tips for quicker focused evaluation
- Author
-
Bob Harrison
- Subjects
Risk analysis, Time pressure, Due diligence
- Abstract
Arguably the biggest challenge in any data room exercise is keeping the due diligence on track while coping with the immense time pressure. This chapter offers some advice on how to ensure the due diligence goes according to plan. Some techniques are presented that can help keep the M&A team focused on processing sufficient data to allow it to provide the buyer with enough comfort and understanding about the deal either to walk away or to make an informed offer.
- Published
- 2020
- Full Text
- View/download PDF
13. In the data room
- Author
-
Bob Harrison
- Subjects
Value, Capital expenditure, Confidentiality agreement, Asset
- Abstract
With a mixture of curiosity and trepidation, the M&A team arrives at the offices hosting the physical data room (PDR). After sitting through a carefully orchestrated sales pitch by the seller's management, they may be asked to show the signed confidentiality agreement to the security personnel before they can gain entry to the PDR itself. The buyer's team hopes that the PDR will be an Aladdin's cave of data. This chapter outlines what the M&A team must achieve during its limited period in the PDR and, equally important, what it must refrain from doing in order not to waste time. The data room framework and hierarchy are introduced, and teams are encouraged to follow them so that no stone is left unturned, essential data are captured, and imperative questions are asked (Harrison, 2012). The buyer's team's efforts must be geared to understanding the four essential things about any potential asset purchase: volume, uncertainty, value, and risk. Thus, the aim of the PDR visit is to furnish the prospective buyers with an independent view of the uncertainty in the seller's petroleum initially in-place (PIIP), the claimed petroleum volumes that are likely to be recoverable, the timeframe and level of capital expenditure involved, an estimate of their value, and the contingencies that may impact their recovery. To this end, the roles of the various members of the multidisciplined M&A team are described in detail, including checklists of which data are considered vital and which are nice to have while in the PDR. In contrast, the role of the seller is also discussed.
- Published
- 2020
- Full Text
- View/download PDF
14. When things go pear shaped
- Author
-
Bob Harrison
- Subjects
Slang, Pear shaped, Due diligence
- Abstract
There are many pitfalls one might encounter during a due diligence exercise on oil and gas properties in the confines of a data room, far too many to be covered in this book. This chapter describes almost two dozen examples (from the author's experience) that have occurred but might have been avoided if the approaches suggested in this book had been taken. For the record, going pear shaped is English slang for when things go horribly wrong. The source of the phrase is purportedly the Royal Air Force in the 1940s, which referred to trainee pilots' failed attempts to loop the loop in their aircraft as "going pear shaped." Yet mischievous wags claim it is a reference to a person's figure being narrower at the shoulders and wider at the hips, possibly due to the aging process and the effects of gravity.
- Published
- 2020
- Full Text
- View/download PDF
15. Data Room Management for Mergers and Acquisitions in the Oil and Gas Industry
- Author
-
Bob Harrison
- Subjects
- Petroleum industry and trade--Information resources, Consolidation and merger of corporations
- Abstract
Data Room Management and Rapid Asset Evaluation - Theory and Case Studies in Oil and Gas, Volume 66 introduces frameworks and workflows that help streamline the data room process, highlight the essential data that must be assembled in the permitted time window, and accelerate the subsequent assessment of the opportunity. The book combines theory with case studies, some of which describe lessons learned directly by the author himself. Methodologies are presented that can be used immediately by those involved in the technical and commercial evaluation of oil and gas exploration and production ventures. The book is suitable for readers with a wide spectrum of experience, from those who are newcomers to the strange world of data rooms, to those diehards who may have spent too many hours in them. The purposes, strategies, and tactics of data rooms are explained, along with some suggestions on how to survive them, and how to get a fit-for-purpose evaluation in front of the decision makers in the shortest timeframe possible.
- Demonstrates what makes a good data room, including how vendors attract potential buyers to attend and how the latter can decide whether they should go or not
- Presents how to prepare for a data room, what needs to be done there, and how to evaluate the assets on offer as quickly as possible
- Covers which essential data should be gathered and questions to ask
- Suggests how to avoid common 'banana skins' when under pressure to provide a rapid but reasonable evaluation
- Published
- 2020
16. Panel - Software Tools for High-Performance Distributed Computing.
- Author
-
Vaidy S. Sunderam, Geoffrey C. Fox, Al Geist, William Gropp, Bob Harrison, Adam Kolawa, Michael J. Quinn, and Anthony Skjellum
- Published
- 1993
17. Choosing an Unconventional Play Analog for the Bowland Shale and Incorporating Onshore United Kingdom Operational Constraints in Potential Recovery Estimates
- Author
-
Gioia Falcone, Tamara Oueidat, and Bob Harrison
- Subjects
Petroleum engineering, Oil shale, Geology
- Published
- 2019
- Full Text
- View/download PDF
18. The Development and Implementation of Subsoil Use Standards for the Petroleum Sector in Kazakhstan
- Author
-
Nurzhan Kairbayev and Bob Harrison
- Subjects
Environmental protection, Petroleum, Subsoil
- Abstract
Modernization of the regulatory framework of the Republic of Kazakhstan (RoK) will complement the "100 Concrete Steps" national initiative of President Nazarbayev. Planned reforms aim to increase the transparency and predictability of the subsoil use sector by implementing an international system of reporting standards. The Society of Petroleum Engineers' Petroleum Resources Management System (SPE-PRMS-2007) is the most likely choice for the hydrocarbon sector, as its guidelines can be tailored to meet the needs of RoK: to increase the contribution of petroleum production to economic growth, improve governance, and attract more investment, along with best available practices and technologies. A comparative analysis of the leading petroleum resource reporting systems used by governments and stock markets around the world concurred that a tailored SPE-PRMS-2007, named KAZ-PRMS, was the most appropriate system for the needs of RoK. Discussions were then held with a RoK Working Group, comprising government bodies, technical institutes and operators, to address their concerns and to get their 'buy-in' to create a road map for the development and implementation of the new Code for Reporting Hydrocarbon Resources (Code) as required by international SPE-PRMS standards. The initial concerns of some of the Working Group about migrating to a new reporting system were overcome through open and lively discussions, workshops and focused training sessions. The various parties concurred that migrating to KAZ-PRMS could be a valid alternative to the existing RoK resource reporting system from the Soviet era, which is disliked by the international investor community. Some concern remained over the proposed speed of change presented in the road map, but there was consensus on the need for change.
The administrative, technical and cultural challenges facing the RoK, which intends to radically overhaul its Soviet-era regulatory system and its petroleum resource reporting structure whilst remaining legally compliant, are described in detail.
- Published
- 2018
- Full Text
- View/download PDF
19. Energy savings potential of uninterruptible power supplies in European Union
- Author
-
Chris Nuttall, Christoph Jehle, Pedro Moura, Bob Harrison, and Anibal T. de Almeida
- Subjects
Emerging technologies, Energy consumption, Reliability, European Union, Uninterruptible power supply, Ecodesign, Energy efficiency
- Abstract
Uninterruptible power supplies (UPS) are key components of information and communications technologies (ICT) systems, ensuring reliability by maintaining the continuity and quality of the systems' power supply. The energy consumption of UPS should be an important consideration due to its high impact on lifecycle costs, but in most applications of UPS, energy efficiency is not the most important issue, since the operational reliability of the ICT systems and the related security of data processing and storage are the major concerns. However, the conversion efficiency of UPS systems has been improving in recent years, and high energy savings can be achieved with the adoption of new technologies without a reduction of the reliability levels. The Ecodesign Preparatory Study for UPS (Lot 27) aimed to identify and recommend ways to improve, at their design phase, the environmental performance of UPS in the European Union throughout their lifetime. This paper presents the work developed during the Preparatory Study for UPS, focused on the technical analysis of the best available and not yet available technologies, as well as the potential energy savings that can be achieved. Several technologies were considered at component and product level. The main design options were then modelled, and the potential energy savings achievable with policy options focused on minimum efficiency performance standards and energy labelling were assessed, showing a potential for energy savings in the European Union in 2025 of 11.4 TWh (a 65% energy saving relative to the predicted energy requirement of EU ICT UPS systems based on current practice).
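As a sanity check on the headline figure, the stated 11.4 TWh saving at 65% of the predicted requirement implies a business-as-usual baseline of roughly 17.5 TWh. A one-line calculation, assuming the 65% is expressed relative to that baseline:

```python
# Back-of-envelope check of the 2025 EU UPS savings figure.
savings_twh = 11.4        # stated potential saving
saving_fraction = 0.65    # stated share of predicted requirement
baseline_twh = savings_twh / saving_fraction
print(round(baseline_twh, 1))  # implied baseline requirement, TWh
```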
- Published
- 2015
- Full Text
- View/download PDF
20. The Development and Implementation of Subsoil Use Standards for the Petroleum Sector in Kazakhstan (Russian)
- Author
-
Bob Harrison and Nurzhan Kairbayev
- Subjects
Environmental protection, Petroleum, Subsoil
- Published
- 2018
- Full Text
- View/download PDF
21. Fracture Behaviour of ODS 410L Stainless Steel using Small Punch Test
- Author
-
Hanliang Zhu, Kim Lu, Michael E. Fitzpatrick, Bob Harrison, D.G. Carr, Tao Wei, Lyndon Edwards, and Asim Zeybek
- Subjects
Materials science, Ion beam, Scanning electron microscope, Oxide, Fracture, Irradiation, Dispersion (chemistry), Helium
- Abstract
The fracture behavior of oxide dispersion strengthened (ODS) 410L stainless steels was investigated by means of small punch testing and scanning electron microscopy. The results showed that there are noticeable strengthening effects from the addition of 0.9 μm yttria to 410L steel. Small punch testing was also conducted on a specimen of ODS 410L steel with 50 nm yttria irradiated by a helium ion beam at a dose of 2 × 10¹⁶ ions·cm⁻². The small punch test is an effective method to reveal irradiation effects in these materials.
- Published
- 2017
- Full Text
- View/download PDF
22. Handling Decommissioning and Restoration Liabilities within PRMS and their Impact on Reported Reserves of Producing Fields Approaching Abandonment
- Author
-
Bob Harrison and Gioia Falcone
- Subjects
Decommissioning, Net present value, Negotiation, Operating cash flow, Oil reserves
- Abstract
It is estimated that nearly 150 fields may cease production by 2020 on the United Kingdom Continental Shelf (UKCS), requiring tens of billions of dollars to be spent on decommissioning and restoration (D&R) to remove platforms and development wells. Unfortunately, SPE's Petroleum Resource Management System (PRMS) offers limited guidance on how to handle D&R liabilities, even though such costs impact on whether late-life producible hydrocarbon volumes from a mature field can be classified as reserves.The economic limit test (ELT), which ignores any costs related to abandonment, is described in the PRMS as the date when net operating cash flow (NOCF) becomes negative, and it defines the reserves to the end of commercial producing life. However, with the agreement of government regulators, some North Sea field operations continue at negative NOCF to defer abandonment expenditure (Abex) and to maximize recovery. Therefore, a discontinuity can occur when subsequently calculating the point forward net present value (NPV) of the developed project, which must include D&R costs, and which leads to the developed project's NPV becoming negative, even though there is a positive ELT to some future date. Recoverable hydrocarbon volumes must be commercial to be classified as reserves, so we have an increasingly common state of affairs where mature fields in the North Sea continue to produce, but should perhaps be reclassified as contingent resources. Operators will find it far more difficult to get loans from banks or attract investor funding if their reserve base is eroded. 
The PRMS has an implicit, but unstated, assumption that the operators of developed projects have arranged financing for D&R liabilities, yet some operators of mature producing fields, who are still negotiating such cover, argue that the tail-end produced volumes can be booked as reserves in accordance with the vague guidance in the PRMS. The paper discusses this discontinuity between ELT and point-forward NPV, as well as issues around abandonment in the region, methods to finance D&R and safeguard against default, differing views of how D&R costs should be handled within PRMS, and their impact on project classification. The definition of D&R within PRMS is critically reviewed, with formal responses solicited from the SPE Oil & Gas Reserves Committee (OGRC). Finally, the paper suggests changes to the PRMS are needed to make it more suitable for characterizing mature assets that are approaching abandonment.
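The discontinuity the abstract describes can be sketched numerically. A minimal illustration with hypothetical cash flows (all figures invented for illustration, not taken from the paper): production passes the PRMS economic limit test out to a future year, yet the point-forward NPV turns negative once D&R costs are included.

```python
# Illustrative sketch of the ELT vs. point-forward NPV discontinuity.
# All cash-flow figures are hypothetical, chosen only to show the effect.

def economic_limit_year(nocf):
    """PRMS economic limit test: first year in which net operating
    cash flow (which ignores abandonment costs) turns negative."""
    for year, cash_flow in enumerate(nocf):
        if cash_flow < 0:
            return year
    return len(nocf)

def npv(cashflows, rate=0.10):
    """Point-forward net present value at a 10% discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

nocf = [50, 30, 15, 5, -2]   # $MM/yr, excluding abandonment expenditure
abex = -150                  # $MM decommissioning and restoration cost

elt = economic_limit_year(nocf)          # positive ELT to a future year
project_npv = npv(nocf[:elt] + [abex])   # but NPV including D&R is negative
print(elt, round(project_npv, 1))
```

With these invented numbers the field is economic on the ELT for four years, yet the developed project's NPV is negative once Abex is charged at shutdown, which is exactly the reclassification dilemma the paper raises.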
- Published
- 2017
- Full Text
- View/download PDF
23. Carbon capture and sequestration versus carbon capture utilisation and storage for enhanced oil recovery
- Author
-
Gioia Falcone and Bob Harrison
- Subjects
Engineering ,Waste management ,business.industry ,Saline aquifer ,Environmental economics ,Geotechnical Engineering and Engineering Geology ,Public domain ,Due diligence ,Workflow ,Earth and Planetary Sciences (miscellaneous) ,European commission ,Enhanced oil recovery ,business ,Energy system - Abstract
There are 74 integrated carbon capture projects worldwide currently listed by the Global CCS Institute, including the few already running and those still at the identification, evaluation, definition or execution stage for operation by 2018. Significant funding programmes have recently been launched by the European Commission (NER300 in November 2011) and by the UK Department of Energy and Climate Change (CCS Commercialisation Programme in April 2012) for commercial demonstration projects leading to innovation across the CCS/CCUS technology chain to reduce energy system costs. In their calls for proposals, these programmes were open to both CCS and CCUS projects. However, there are significant technical and commercial differences between projects for enhanced oil recovery and those for permanent storage of carbon dioxide in saline aquifers or in depleted hydrocarbon reservoirs, just as offshore implementation involves additional complexities and limitations. Such differences are accompanied by different levels of field verification of the various storage and utilisation concepts, with permanent sequestration having only a more recent history and smaller-scale implementation. In this scenario, the need for appropriate due diligence workflows and screening criteria to assess the technical viability and the deliverability of different CCS/CCUS projects remains crucial, vis-à-vis the high component costs, efficiency penalties, reservoir uncertainties and the many challenges related to full chain integration (from carbon dioxide capture to underground sequestration). Based on information in the public domain, this paper reviews the current status of offshore CCS/CCUS implementation worldwide and discusses screening criteria for use by governments, operators and investors alike.
- Published
- 2013
- Full Text
- View/download PDF
24. Re-grading Proven Reserves with PRMS
- Author
-
Bob Harrison, LR Senergy, and Gioia Falcone
- Subjects
Proven reserves ,Finance ,Government ,Resource (project management) ,business.industry ,Natural resource economics ,Economics ,Sanctions ,Force majeure ,Asset (economics) ,Downgrade ,business ,Contingency - Abstract
The Petroleum Resources Management System (PRMS) is a project-based framework that classifies and categorizes a project's resources based on their likely commerciality and recoverability. The path from prospective resource to reserves is inferred to be ‘one way’, as re-grading proven reserves is considered a "rare event", but evidence suggests reclassification decisions are more commonplace. According to PRMS, reserves should only be downgraded if there is "an unforeseeable event that is beyond the control of the company … that causes development activities to be delayed beyond a reasonable time frame". This rather loose definition allows companies to interpret the meaning quite differently. Some choose to leave ‘under threat’ reserves as booked, by claiming the projects are subject to force majeure that might adversely affect their commerciality. Others downgrade such reserves to contingent resources, at least until the critical contingency is removed and the volumes return to the higher class. Two case studies involving reserves re-grading are given to highlight the applicability (or not) of PRMS to politically unstable areas and mature provinces, respectively. The first example from the Middle East tells how continued civil conflict and international sanctions have led shareholders to invest elsewhere, so that some operators have reclassified reserves. The second example tells how a North Sea operator, responding to pressure from government and partners, initiated a remedial program to get an asset back on stream, and in so doing allowed reserves to remain booked. Decisions over whether or not to re-grade proven reserves are more prevalent than implied in the PRMS. Consideration should be given to amending the PRMS guidelines to reflect this observation.
- Published
- 2015
- Full Text
- View/download PDF
25. Multiphase Flow Metering: Current Trends and Future Developments
- Author
-
Gioia Falcone, Bob Harrison, Geoffrey F. Hewitt, and Claudio Alimonti
- Subjects
Engineering ,Measure (data warehouse) ,Computer science ,business.industry ,Strategy and Management ,Multiphase flow ,Flow assurance ,Electrical engineering ,Energy Engineering and Power Technology ,Current (stream) ,Natural gas field ,Operator (computer programming) ,Fuel Technology ,Petroleum industry ,Industrial relations ,Systems engineering ,Wet gas ,Metering mode ,Oil field ,business ,Reliability (statistics) - Abstract
Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized as experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering. Abstract Over the last decade, the development, evaluation, and use of multiphase-flow-metering (MFM) systems have been a major focus for the oil and gas industry worldwide. Many alternative metering systems have been developed, but none can be referred to as generally applicable or universally accurate. Both established and novel technologies suitable to measure the flow rates of gas, oil, and water in three-phase flow are reviewed and assessed within this framework. Technologies already implemented in various commercial meters then are evaluated in terms of operational and economic advantages or shortcomings from an operator's point of view. The lessons learned about the practical reliability, accuracy, and use of available technology are discussed. As operators now realize, use of MFM systems (MFMSs) is essential in exploiting marginal fields. A new approach to flow assurance, deepwater developments, downhole/seabed separation systems, and wet-gas fields is foreseen. The authors suggest where additional research to develop the next generation of MFM devices will be focused to meet the as-yet-unsolved problems. Brief History The first commercial MFMSs appeared approximately 10 years ago, as a result of several multiphase metering research projects in the early 1980s.
The driving force to develop MFM technology was the forecast decline of production from the major North Sea fields, accompanied by the necessity to tie back future smaller discoveries to existing infrastructure. Increasing gas and water fractions, inherent in a mature producing province, would create more-unstable flow conditions in existing production facilities and require more-flexible multiphase solutions. In less than a decade, MFM has become accepted in the field and is beginning to be considered as a primary metering solution for new field developments. MFM Applications Within the oil and gas industry, it is generally recognized that MFM could lead to great benefits in terms of the following.1,2,3 Layout of Production Facilities. The use of MFMs reduces the hardware needed for onshore, offshore topside, and offshore subsea applications. Of primary importance is the removal of a dedicated test separator for well-testing applications. Use of MFM (with its smaller footprint) for topside applications minimizes platform space and load requirements for well-testing operations. Finally, costly well-testing lines can be stripped from the production facilities, which may be of vital importance for unmanned locations, deepwater developments, and satellite fields. Well Testing. Conventional test separators are expensive and require much time to monitor each well's performance because of the time required to reach stabilized flow conditions. This is particularly important in deepwater developments, because of the exceptional length of the flowlines. In such cases, production from individual wells connected to the same manifold may be monitored by use of a dedicated test line to avoid shutting down all the wells, then testing them one by one (with considerable production loss). However, the expense of a separate flowline may be prohibitive, hence the advantages of MFM installed in the subsea manifold.
Test separators have an accuracy between approximately 5 and 10% (currently achievable with MFMSs) but require regular intervention by trained personnel and cannot provide continuous well monitoring. Another disadvantage of conventional well testing with conventional separators is that well performance suffers after shutdown cycles related to well testing. Often, wells tested on a regular basis require more-frequent workovers to maintain their production rates. Use of MFMSs for exploration-well testing4 provides satisfactory flow measurements without separation of the phases. It is claimed that they can be used to monitor the well during its cleanup flow (traditionally, this flow information is lost because the well stream is not directed through the test separator). Added value is represented by improved control of the drawdown applied to the formation, the pressure transient, and shortened flow periods. Reservoir Management. MFMSs provide real-time, continuous data to enable operators to characterize field and reservoir performance better and react faster. Changes in gas/oil ratio or water cut can be detected and quantified immediately, whereas traditional test separators provide information about only cumulative volumes at discrete points in time.
- Published
- 2002
- Full Text
- View/download PDF
26. Guest Editorial: Keeping Reservoir Stewardship on Course
- Author
-
Bob Harrison
- Subjects
Fuel Technology ,business.industry ,Strategy and Management ,Industrial relations ,Environmental resource management ,Energy Engineering and Power Technology ,Engineering ethics ,Stewardship ,business ,Course (navigation) - Abstract
Guest Editorial The French term, déjà vu, which means literally “already seen,” is the feeling that you have previously experienced something you are currently experiencing. Two thirds of adults claim to have sensed this phenomenon, but this figure rises to 100% when one considers professionals in the oil and gas industry, which is undergoing yet another boom-and-bust cycle. Besides the real concern that the recently announced staff layoffs will only hasten the “big crew change” (as some “golden oldies” may decide to call it a day this time around), one hopes that operators will maintain good stewardship of their wells and fields and resist the temptation to cut back on essential data acquisition. Reservoir stewardship, in which operators accept the responsibility to shepherd and safeguard the assets of a company or a country, involves the periodic review of asset performance to ensure productivity and recovery targets are met and maintained, and to guide future work plans. Continuous reservoir appraisal and surveillance are essential to minimize production losses from downtime in wells, facilities, and export systems. Unfortunately, it is evident that some operators (and governments) pay only lip service to good reservoir stewardship, especially when oil and gas prices are low. Some time ago, I read a student thesis that looked at options for reducing costs in the unconventional “factory drilling” process. It concluded that significant time and money could be saved if formation evaluation services were eliminated from the well program. The project sponsor was happy with the result and the student graduated, but I was appalled that this suggestion could ever be taken seriously. Unconventional reservoirs have complex pore systems, very low interparticle permeability, contain free and adsorbed gas, and exhibit variable water salinity, all of which make their characterization a major challenge for the geoscientist.
Therefore, more core data (not less) are needed to calibrate the responses of logging suites, which also require enhanced measurement services as opposed to standard tool strings. The taking of core permits subsequent rock typing to include dynamic properties and fracturability and allows partitioning of the reservoir into zones that reflect quartz content and producibility.
- Published
- 2015
- Full Text
- View/download PDF
27. Computing: A busy year ahead for schools
- Author
-
Bob Harrison
- Subjects
Engineering ,business.industry ,Information and Communications Technology ,ComputingMilieux_COMPUTERSANDEDUCATION ,Countdown ,Mathematics education ,Use of technology ,business ,Curriculum - Abstract
The countdown is well under way towards the new computing curriculum. Bob Harrison considers the challenges facing schools in 2014 – both with computing and wider use of technology – and discusses the support available for computing to help schools hit the ground running in September.
- Published
- 2014
- Full Text
- View/download PDF
28. Putting a Value on Estimated NGL Volumes in Prospect Evaluation
- Author
-
Bob Harrison
- Subjects
Engineering ,Petroleum engineering ,business.industry ,Value (economics) ,business ,Liquefied natural gas - Abstract
Natural gas liquids (NGLs) are recovered from the produced wet gas stream by a more complicated process than typical separation that involves additional chilling and compression. Thus, NGL volumes can be obtained by decreasing a field's wet gas production profile, but this should be avoided unless a premium price can be obtained for the liquids relative to the gas that is used to make them. Traditionally, in the evaluation of undrilled exploration prospects in the North Sea, there is so much uncertainty in the forecasts of product prices that analysts tend to only consider oil, condensate and sales gas, when deriving an economic value for a prospect. However, some North Sea Operators routinely offer farm-in opportunities where the probable resource estimates in their prospects, which are assumed to contain volatile oil, are broken down into oil, dry sales gas and NGLs. This practice has come about as the price differential between oil and gas has grown. The upshot is the company farming in will see a higher prospect valuation, and thus receive a lower working interest for the same money it puts on the table. Some say that putting a value on NGLs in prospect evaluation is over-sciencing the process. They argue that as analysts are guessing the hydrocarbon type and its gas in solution, it is going a step too far to claim they can estimate NGL yield too with any confidence. They also hold the view that the range calculated for a prospect's resources should be wide enough to adequately accommodate any additional benefit which may result from NGLs, as they would be taken care of in the noise. Those that support attributing value to NGLs counter that the real issue is a matter of the amount and reliability of the technical data on the opportunity. They argue that if analysts can determine the NGL yield with an acceptable level of accuracy, then it should be considered, provided that one remembers to include the incremental facilities cost of NGL recovery. 
Both points of view are discussed, in particular whether companies may have under-valued and walked away from past opportunities by failing to put a value on NGLs. By valuing prospects higher, companies can live with a higher exploration risk and tolerate higher drilling costs. The paper briefly considers the ethics of a company adopting a binary strategy of assigning value to NGLs when farming out, but ignoring it when farming in. Finally, appendices provide detailed descriptions of how to use a linear version of the expected value concept for farming analysis and give a simple workflow to estimate potential NGL recoverable volumes for a North Sea volatile oil prospect. Many of the issues discussed also apply to gas condensate exploration targets, but this paper focuses on those prospects that are thought to contain volatile oil. Readers can draw their own conclusions as to which view is better, yet the majority of the feedback from analysts was that many prefer not to assign a separate value to the potential NGL stream from a volatile oil prospect in the North Sea.
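The linear expected-value concept the appendix applies to farm-in analysis can be sketched as follows. The probabilities, NPVs and well cost below are hypothetical, and the one-line function is an illustrative simplification rather than the paper's own workflow; it only shows how crediting an NGL stream lifts a prospect's valuation.

```python
# Linear expected-value (EV) farm-in sketch with hypothetical figures;
# a simplification for illustration, not the paper's appendix workflow.

def expected_value(p_success, npv_success, well_cost):
    """EV = Pg * NPV(success) - (1 - Pg) * dry-hole well cost."""
    return p_success * npv_success - (1 - p_success) * well_cost

# Valuing the prospect on oil and sales gas alone, then with an NGL credit
ev_without_ngl = expected_value(0.25, 200.0, 40.0)  # $MM
ev_with_ngl = expected_value(0.25, 230.0, 40.0)     # NGL credit lifts NPV
print(ev_without_ngl, ev_with_ngl)
```

Because the NGL case carries a higher success-case NPV at the same chance of success and well cost, the farmee sees a higher prospect valuation, and so would accept a lower working interest for the same spend, as the abstract notes.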
- Published
- 2013
- Full Text
- View/download PDF
29. Deciding Whether to Fund Either CCS or CCUS Offshore Projects: Are We Comparing Apples and Pears in the North Sea?
- Author
-
Gioia Falcone and Bob Harrison
- Subjects
Natural gas field ,Engineering ,Open and closed systems in social science ,Lead (geology) ,business.industry ,Natural resource economics ,Greenhouse gas ,Fossil fuel ,Submarine pipeline ,Enhanced oil recovery ,business ,Unit (housing) - Abstract
Recent years have seen significant funding competitions launched in Europe and in the UK that call for bidders to propose commercial demonstration projects which will bring innovation across the carbon capture and sequestration (CCS) technology chain to reduce energy system costs. The primary carbon dioxide (CO2) storage site candidates that are targeting funds are the saline aquifers and depleted oil and gas fields in the North Sea. At the time of writing, no outright winners have been announced. These programs are open to CCS projects as well as carbon capture, utilization and storage (CCUS) projects. The former deals exclusively with greenhouse gas storage, but the latter differs by using the injected CO2 for enhanced oil recovery (EOR) before eventually being stored. Hence, there are considerable technical and commercial differences between CCS and CCUS projects, in much the same way as onshore projects face fewer challenges and constraints than if they were being implemented offshore. The evaluation and selection of which offshore carbon storage projects should be funded is a tough exercise to undertake, but it becomes much more difficult if the competing projects under consideration are allowed to be CCS or CCUS, full chain or part chain, or a mixture of all of the aforementioned. Bias may arise via reliance on selection criteria such as volume of stored CO2 per unit of expenditure, which is likely to favor saline aquifer storage projects over other types, no matter how innovative or compelling they are. The authors believe that selection committees in Brussels and London would greatly simplify their decision of which bid should be funded, and in what proportion, by separating competing projects into straightforward storage types and CO2-EOR types.
Offshore experience of either project type is scarce and their relative merits are difficult to reconcile as the subsurface understanding, timeframe, economics, performance and goals of each project type are quite different. The paper recognizes that CCS-type projects can be further subdivided into saline aquifers, with open and closed systems, and abandoned gas fields, as each has different storage limitations. Also, CCUS-type projects, which realistically only include abandoned oil fields, can be further subdivided to reflect the operational and commercial characteristics of different EOR schemes. It is hoped that the discussion outlined in this paper will lead to easier and fairer screening criteria for offshore CCS and CCUS projects for use by governments, operators and investors alike.
- Published
- 2013
- Full Text
- View/download PDF
30. The countdown to computing
- Author
-
Bob Harrison
- Subjects
Medical education ,Engineering ,business.industry ,Pedagogy ,Countdown ,business ,Curriculum ,Expert group ,Advice (programming) - Abstract
An expert group has been convened by the DfE to develop advice, resources and CPD as schools prepare for the introduction of the new computing curriculum. Bob Harrison explains.
- Published
- 2013
- Full Text
- View/download PDF
31. Evaluation of Grid Pattern Photocoagulation for Macular Edema in Central Vein Occlusion
- Author
-
John G. Clarkson, Elaine Chuang, Donald Gass, Maria Pedroso, Tony Cubillas, Erlinda S. Duria, Ditte J. Hess, Isabel Rams, Marguerite Ball, Alex Gutierrez, Nayla Muniz, June Thompson, Michele Pall, Charles J. Pappas, Daniel Finkelstein, Arnall Patz, Dolores Rytel, Judy Belt, Dennis Cain, Terri Cain, David Emmert, Terry George, Mark Herring, Pete Sotirakos, David H. Orth, Timothy P. Flood, Kirk H. Packo, Toni Larsen, Nancy Perez, Doug Bryant, Don Doherty, Jay Fitzgerald, Martha Gordon, Cynthia Holod, Kathy Kwiatkowski, Celeste MacLeod, Chris Morrison, Charlotte Westcott, Michael L. Klein, David Wilson, Richard G. Weleber, Susan Nolte, Nancy Hurlburt, Mark Evans, Patrick Wallace, Peter Steinkamp, Debora Funkner, Cathy Gordon, Clement Trempe, Alex Jalkh, John Weiter, Sherry Anderson, Dennis Donovan, Tom O'Day, Gerald Friedman, Rodney Immerman, Gabriel Coscas, Gisele Soubrane, Rose Marie Haran, Christophe Debibie, Jean Gizelsky, Ingolf H.L. Wallow, Guillermo de Venecia, George Bresnick, Sandra Larson, Sandy Fuller, Bob Harrison, Gene Knutson, Michael Neider, Greg Weber, Ruth Bahr, Bonnie Grosnick, Robert Lazorik, Helen Lyngaas, Diane Quackenboss, Guy Somers, Froncie A. Gutman, Sanford Myers, Tina Kiss, Deborah Ross, Pamela Vargo, Janet Edgarton, Sue Hanson, Janet Nader, Nancy Tomsak, Lawrence J. Singerman, Hernando Zegarra, Susan Lichterman, Adrienne Fowler Kramer, Sheila Smith-Brewer, Pam Brown Rowe, Geraldine Daley, Anne Pinter, Kathy Coreno, Lori Cooper, Marty Delisio, Donna Cross, Wendy Lord, Argye Hillis, Mark W. Riggs, Cheryl KasbergPreece, M. Hasan Rajab, Krista Carlson Giniewicz, Kevin Gilmore, Carol Zimmerman, Mary Lou Lewis, Maria Cristina Wells, Julie Lord Forbes, Kathleen C. Fetzer, Heather McNish, George H. Bresnick, Lissa McNulty, Jim Baliker, Linda Alanen, Laura Gentry, Richard L. Mowery, Donald F. Everett, Robert J. Hardy, Gary Abrams, Robert N. Frank, Maureen G. Maguire, and Abner V. McCall
- Subjects
education.field_of_study ,medicine.medical_specialty ,Visual acuity ,genetic structures ,medicine.diagnostic_test ,business.industry ,Population ,Physical examination ,medicine.disease ,eye diseases ,Vein occlusion ,Surgery ,Clinical trial ,Ophthalmology ,Grid pattern ,Central retinal vein occlusion ,medicine ,sense organs ,medicine.symptom ,business ,education ,Macular edema - Abstract
Purpose: To evaluate the efficacy of macular grid photocoagulation in preserving or improving central visual acuity in eyes with macular edema due to central vein occlusion (CVO) and best-corrected visual acuity of 20/50 or poorer. Methods: Patients with angiographically documented macular edema due to CVO were entered into a multicenter randomized controlled clinical trial supported by the National Eye Institute. Eligibility was determined based on both clinical examination findings and photographic documentation evaluated at a photograph reading center. Eyes were assigned randomly to macular grid photocoagulation (77 eyes) or no treatment (78 eyes). Patients were followed every 4 months for 3 years or until the end of the study. The outcome measure was visual acuity. Results: The study population consisted of 155 eyes in 155 patients. There was no difference between treated and untreated eyes in visual acuity at any point during the follow-up period. Initial median visual acuity was 20/160 in treated eyes and 20/125 in control eyes. Final median visual acuity was 20/200 in treated eyes and 20/160 in control eyes. However, treatment clearly reduced angiographic evidence of macular edema. Conclusion: The results of this study do not support a recommendation for macular grid photocoagulation for the population meeting the Central Vein Occlusion Study macular edema group eligibility criteria.
- Published
- 1995
- Full Text
- View/download PDF
32. Learning without frontiers
- Author
-
Bob Harrison
- Subjects
Information and Communications Technology ,Computer science ,ComputerSystemsOrganization_MISCELLANEOUS ,ComputingMilieux_COMPUTERSANDEDUCATION ,Mathematics education ,Data_CODINGANDINFORMATIONTHEORY ,Mobile device - Abstract
Teacher and ICT expert Bob Harrison reports from the recent Handheld Learning 2009 conference.
- Published
- 2009
- Full Text
- View/download PDF
33. Where is the 21st century learning?
- Author
-
Bob Harrison
- Subjects
Engineering ,Looming ,Information and Communications Technology ,business.industry ,Digital native ,Happening ,Library science ,Public relations ,business - Abstract
Everybody presumes all children are digital natives in the 21st century, but is this really the case? And with funding cuts looming, what is happening to ICT? Education advisor Bob Harrison discusses.
- Published
- 2009
- Full Text
- View/download PDF
34. Reading Between the Lines of A&D Documentation
- Author
-
Bob Harrison
- Published
- 2014
- Full Text
- View/download PDF
35. Technology Focus: Formation Evaluation (August 2014)
- Author
-
Bob Harrison
- Subjects
Focus (computing) ,Fuel Technology ,Strategy and Management ,Political science ,Industrial relations ,Energy Engineering and Power Technology ,Engineering ethics - Abstract
Technology Focus Any oil and gas deal, whether it is acquisition, divestment, greenfield, brownfield, or out of left field, usually involves a data room, where rich seams of proprietary information can be mined for a limited period. However, this physical data room, as opposed to a virtual version where data are downloadable, is the bane of due diligence. Some may recall caffeine-fueled assignments to paw through boxes brimming with confidential documents that allegedly held the financial secrets of a potential acquisition. The process is expensive; can be a logistical nightmare of travel arrangements, visas, and inoculations; and is always stressful because of time pressure. Data rooms are migraine inducing. The priority of the petrophysicist on a data-room team is to access digital data for all wells—for example, logs, wireline formation pressures, and deviation surveys for depth conversion. Often neglected are the log headers themselves, which provide data on tool setup, bottomhole temperature, mud properties, and the casing program and present the logging engineer’s comments. Gathering data is less straightforward if they are in an unfamiliar language or are vintage (noncombinable tools with unfocused measurements requiring more runs). These factors require translation, tool response modeling, and increased depth matching, adding time, cost, and potential for error. Petrophysicists audit the hydrocarbons in place by reviewing previous studies to confirm the all-important trinity of porosity, water saturation, and net pay claimed by the seller. The appropriateness of the saturation-height function in the geomodel is checked for the assumed hydrocarbon and rock type, or merely to ascertain that one was used at all. Proposed fluid contacts must be verified, no mean feat when faced with structure maps showing only oil-down-to (ODT) and water-up-to (WUT) depths, with the contact possibly being anywhere in between. 
Some companies assume that the fluid contact is halfway between ODT and WUT, which is a completely arbitrary approach. In these digital days, almost everything needed to carry out petrophysical due diligence can be downloaded for off-site analysis. However, there are places where it is illegal to take data out of the country, so physical-data-room attendance is compulsory. In these instances, as in the past, the petrophysicist’s briefcase might hold color pencils, a ruler, tracing paper, chart books, and maybe a planimeter, a strange device for measuring areas within mapped contours. Yet painful memories suggest that the most important piece of equipment one should take along to a data room is probably a bottle of aspirin. JPT
- Published
- 2014
- Full Text
- View/download PDF
36. Technology Focus: Formation Evaluation (August 2013)
- Author
-
Bob Harrison
- Subjects
Focus (computing) ,Fuel Technology ,Strategy and Management ,Political science ,Industrial relations ,Energy Engineering and Power Technology ,Engineering ethics - Abstract
Technology Focus When interviewing potential senior-petrophysicist recruits, we exchange pleasantries to break the ice and then I ask an easy, yet fundamental technical question: “How would you explain Archie’s law to an inexperienced colleague?” Then, I sit back in expectation of a slick answer from the candidate. Imagine my surprise when the interviewee takes a sharp intake of breath, looks up at the ceiling, scratches his head, and, finally, exhales audibly before launching into a ramble with more “errs” than a David Beckham post-match press conference. “Sorry, I do know it. Honestly, I use it every day in my petrophysical interpretation program. It’s just that I haven’t been asked to describe it before. Sorry,” mumbles the candidate. Unfortunately, the scene I have just described is an all-too-common one. I was genuinely taken aback that some experienced petrophysicists were unable to state Archie’s law chapter and verse. I was expecting to hear something like, “Archie’s law relates the in-situ electrical conductivity of a sedimentary rock to its porosity and brine saturation. It is purely empirical, describing ion flow (mostly sodium and chloride) in clean, consolidated sands, with varying intergranular porosity of moderate to high values. Electrical conduction is assumed not to be present within the rock grains or in fluids other than water.” Hence, I was gladdened by the news that, from next September, the University of Aberdeen in Scotland is to offer a master’s degree in petrophysics and formation evaluation. Hopefully, the next generation of petrophysicists not only will answer my question without hesitation but also will appreciate the danger of blindly applying Archie’s law. At a recent conference organized by the London Petrophysical Society to celebrate 70 years since Gus Archie gave his seminal paper, attendees were reminded of the various influences on calculated water saturation. 
It was shown that, with a classic Archie rock of 20% porosity and formation and water resistivities of 10 and 0.1 Ω∙m, respectively, the resulting computed water saturation of 50% will have a fractional uncertainty of 7%, which increases rapidly for porosities less than 10%. The uncertainty depends on the combination of reservoir parameters. At high porosity, the major contributor to uncertainty is formation resistivity. At intermediate porosity, the major contribution is from the cementation exponent. And, at low porosity, the most significant parameter is porosity itself. Archie’s work is one of many fine papers in the archives of SPE and the Society of Professional Well Log Analysts that deserve to be read, to understand the methods described therein, where they can be applied, and, possibly more importantly, where they cannot. Simply knowing how to drive expensive interpretation software is no substitute for a solid understanding of the fundamental theory behind the key strokes. Recommended additional reading at OnePetro: www.onepetro.org. SPE 158545 A Greater Dolphin Area Case Study Part 1: Defining Geological Uncertainty by K.S. Taylor, BG, et al. SPE 163973 Gas/Condensate Flow Behavior Within Tight Reservoirs by Mahmood Al-Harrasi, Petroleum Development Oman, et al. SPE 164884 Modeling Net-to-Gross in Deepwater Reservoirs by Jiajie Chen, Marathon, et al. IPTC 16808 The Eagle Ford Shale Play, South Texas: Regional Variations in Fluid Types, Hydrocarbon Production, and Reservoir Properties by Yao Tian, Texas A&M University, et al.
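The classic Archie example quoted above can be checked directly. A minimal sketch, assuming the standard Archie parameters a = 1, m = 2 and n = 2 (these exponents are conventional defaults, not stated in the editorial):

```python
# Archie's law: Sw = ((a / phi**m) * (Rw / Rt)) ** (1/n)
def archie_water_saturation(phi, rw, rt, a=1.0, m=2.0, n=2.0):
    """Water saturation of a clean, water-wet sand from Archie's law."""
    return ((a / phi ** m) * (rw / rt)) ** (1.0 / n)

# The "classic Archie rock" quoted above: 20% porosity,
# formation resistivity Rt = 10 ohm.m, water resistivity Rw = 0.1 ohm.m
sw = archie_water_saturation(phi=0.20, rw=0.1, rt=10.0)
print(f"Sw = {sw:.0%}")  # prints Sw = 50%
```

Reproducing the quoted 50% this way also makes the low-porosity sensitivity easy to see: halving porosity to 10% drives the computed saturation sharply upward, consistent with the uncertainty behaviour described above.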
- Published
- 2013
- Full Text
- View/download PDF
37. Technology Focus: Formation Evaluation (August 2012)
- Author
-
Bob Harrison
- Subjects
Focus (computing), Fuel Technology, Strategy and Management, Political science, Industrial relations, Energy Engineering and Power Technology, Engineering ethics
- Abstract
Technology Focus A little bit of knowledge is not always a dangerous thing. Recently, I was reviewing some simulation models with a client’s subsurface team. They were taken aback when I asked them if they were happy with the petrophysical inputs they had been provided with. “It’s not our job to question the petrophysics,” they replied in unison. However, it was obvious that they were struggling to get the simulation model to match the observed reservoir performance and could do so only by dramatically changing the input rock properties. It transpired that their employer had recently bought the asset and the team had inherited the previous operator’s interpretations, which, “to save time,” had simply been imported into their model. After much cajoling, the team agreed to review the raw core and log data and opened a Pandora’s box. The older wells had poor borehole quality with large washouts, usually in shaly zones, that adversely affected the logs so that they indicated porosity in shales. The net-pay count was, therefore, optimistic. Closer inspection of the permeability/porosity transform revealed that the core data had not been corrected for the Klinkenberg effect or for compaction or even checked against test values. Finally, the computed Corey coefficients for the relative permeability curves looked very low for the oil-wet carbonate they were supposed to characterize, which was because of unstable flooding of the core plugs in the laboratory. With these revelations in mind, the team members decided that the petrophysics for the field had to be reinterpreted, and they collaborated closely with their company petrophysicist during the review. A new simulation model was built that was more coherent, more consistent, and more realistic and that gave much better results than before. There is no harm in developing a petrophysical side to one’s discipline. 
As I mentioned in my Focus column in August 2009, petrophysics touches all the subsurface disciplines in one way or another. Being able to converse intelligently with log analysts and ask them pertinent questions can only be good news for employers and subsurface project managers. It can also save time that may have been wasted creating simulation models that do not reflect reality. Recommended additional reading at OnePetro: www.onepetro.org. SPE 153537 A Study of Differences in Array-Induction and Multi-Laterolog Responses in a Well Drilled With High-Salinity Water-Based Mud by Bill Corley, Baker Hughes, et al. SPE 153593 A New Method for Estimating Waterflood Oil-Recovery Efficiency Using Post-Waterflood NMR and Dielectric Well Logs, Belridge Field, California by Daniel A. Reed, Aera Energy, et al. SPE 154426 Reducing Reservoir Uncertainties Using Advanced Wireline Formation Testing by D. Loi, Eni, et al.
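The Corey coefficients flagged in the review above set the curvature of the relative permeability curves: unusually low exponents give nearly straight-line curves, one reason they looked wrong for an oil-wet carbonate. A minimal sketch of the Corey power-law form, with all parameter names and numerical values illustrative rather than taken from the column:

```python
def corey_rel_perms(sw, swc, sor, krw_max, kro_max, nw, no):
    """Water/oil relative permeabilities from Corey power laws.

    sw: water saturation; swc: connate-water saturation; sor: residual-oil
    saturation; krw_max, kro_max: endpoint relative permeabilities;
    nw, no: Corey exponents (higher values give more curvature).
    """
    swn = (sw - swc) / (1.0 - swc - sor)  # normalized movable saturation
    swn = min(max(swn, 0.0), 1.0)         # clamp to the movable range
    krw = krw_max * swn ** nw
    kro = kro_max * (1.0 - swn) ** no
    return krw, kro


# Illustrative endpoints and exponents (not the field's actual values)
krw, kro = corey_rel_perms(sw=0.5, swc=0.2, sor=0.2,
                           krw_max=0.4, kro_max=0.8, nw=2.0, no=3.0)
```

Sliding nw and no toward 1 flattens both curves toward straight lines, which is the signature the team spotted; laboratory artifacts such as unstable core floods can masquerade as exactly that.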
- Published
- 2012
- Full Text
- View/download PDF
38. Gas Cycling: A New Approach
- Author
-
Bob Harrison
- Subjects
Fuel Technology, Nuclear engineering, Environmental science, Geotechnical Engineering and Engineering Geology, Cycling
- Published
- 2002
- Full Text
- View/download PDF
39. Technology Focus: Formation Evaluation (August 2011)
- Author
-
Bob Harrison
- Subjects
Focus (computing), Fuel Technology, Strategy and Management, Political science, Industrial relations, Energy Engineering and Power Technology, Engineering ethics
- Abstract
Technology Focus Reportedly, companies are struggling to find petrophysicists for their subsurface teams, especially companies exploiting unconventional resources that require a major petrophysical effort to unlock their reserves potential. How has this state of affairs come about? Undoubtedly, there are fewer routes into the discipline because few universities offer degrees in the subject. Most petrophysicists either have a logging-company background or are professionals from other disciplines who developed a taste for it. Yet, there are more-deep-seated issues at play, concerning the nature of the job itself and how petrophysicists are perceived in the industry. Practitioners must spend considerable time performing the less-glamorous tasks of gathering, checking, loading, merging, shifting, and correcting raw data before any analysis can be performed. This may discourage professionals from joining the discipline before they can appreciate its richness. Some petrophysicists feel undervalued. Their main workload comes early in a project and ends when they pass their well/rock-property data to the geomodelers. By servicing several teams, particularly in operational environments, they end up belonging to none and feeling isolated. Some suspect that other disciplines offer greater career mobility, with increased chances of fast-tracking into management. In reality, the role has a unique multitasking and technically challenging nature, with a blend of fundamental science and applied practices in which reservoir studies are interjected with designing laboratory core-analysis programs and monitoring wellsite operations. Thus, it seems that a makeover is required to enhance the petrophysicist’s image, standing, and popularity. To address the reported shortage, the role must be broadened to attract and retain high-caliber people. 
Newcomers must be taught integrated working practices instead of how to push buttons of sophisticated software that provides analysis products of which they have little understanding. Quality-focused training from academia and the industry is essential to achieve this aim. For their own part, petrophysicists must contribute more than porosity, saturation, net-reservoir, and permeability data. They must step into the limelight and be acknowledged by peers for their help and advice in all areas of established subsurface workflows, while providing input throughout a project’s life. Petrophysicists perform a complex and onerous role, synthesizing multifarious data into a single coherent interpretation. Their work impinges on every discipline, providing deliverables for drillers, geologists, geophysicists, reservoir simulators, and production engineers. We need more of these valuable people. Formation Evaluation additional reading available at OnePetro: www.onepetro.org SPE 137766 • “Barnett Shale (Mississippian), Fort Worth Basin, Texas: Regional Variations in Gas and Oil Production and Reservoir Properties” by Y. Tian, SPE, Texas A&M University, et al. SPE 134515 • “Novel Approach To Quantifying Deepwater Laminated Sequences Using Integrated Evaluation of LWD, Real-Time Shear, Porosity, Azimuthal Density, and High-Resolution-Propagation Resistivity” by Katerina Yared, SPE, Baker Hughes, et al. SPE 121313 • “Petrophysical-Analysis Method To Identify ‘Sweet Spots’ in Tight Reservoirs: Case Study From Punta Rosada Formation in Neuquen Basin, Argentina” by C. Naides, Petrobras.
- Published
- 2011
- Full Text
- View/download PDF
40. Technology Focus: Formation Evaluation (August 2010)
- Author
-
Bob Harrison
- Subjects
Focus (computing), Fuel Technology, Strategy and Management, Political science, Industrial relations, Energy Engineering and Power Technology, Engineering ethics
- Abstract
Technology Focus Analysts of conventional logs, who routinely evaluate complex formations, may be less productive when faced with more-esoteric responses from modern tools, especially because the latter often are interpreted by the logging contractors themselves. This situation arises because available petrophysical software may not be capable of handling the new log responses, and the required interpretive algorithms often are claimed as proprietary and, therefore, are withheld. Logging-contractor-generated results from new logs may bring additional insight that is not readily apparent from basic logging suites, but end users must beware of blindly accepting these results and the implication that the new tools are necessary. Many published case studies that claim new logging services are essential do not prove that basic logs are inadequate. Readers should be skeptical if these conference papers do not conform to Richard Feynman’s exhortation, “The idea is to try and give all the information to help others to judge the value of your contribution, not just the information that leads to judgment in one particular direction or another.” We should always ask ourselves three questions: Was the log really necessary? Does it add new information or simply confirm what could have been determined from a basic logging suite? For example, a nuclear-magnetic-resonance log in shaly gas-bearing sands with oil-based-mud-filtrate invasion may be able to distinguish gas from oil, but invaded-zone fluids also can be solved for by use of a combination of conventional logs. Is the logging-contractor interpretation correct? New logs should be validated against careful traditional-log analysis, which has been calibrated against core. Can operators improve on the logging-contractor interpretation? Yes! Operators have more relevant data and local knowledge, so they can use the log data more effectively. Petrophysical software and a log analyst’s time cost much less than log acquisition. 
Logging contractors should run a safe and efficient operation while acquiring good raw data. If they also can supply an interpretation, then this is a bonus. End users must not abdicate their responsibility for the interpretation of log data that their employer pays to acquire. Formation Evaluation additional reading available at OnePetro: www.onepetro.org SPE 125342 • “Reservoir-Rock-Types Application—Kashagan” by A. Francesconi, Eni, et al. SPE 128013 • “Residual Hydrocarbons—A Trap for the Unwary” by T. O’Sullivan, SPE, Cairn India, et al. SPE 130414 • “Fulmar Sandstones—Correcting Some Received Wisdom in the Greater Kittiwake Area (North Sea)” by G. Coghlan, SPE, Centrica Energy, et al.
- Published
- 2010
- Full Text
- View/download PDF
41. Cost of cerivastatin in cost-effectiveness study
- Author
-
Karen C. Chung, Kenneth Pomerantz, Bob Harrison, William J. Elliott, and David R. Weir
- Subjects
Pharmacology, Computer science, Cost effectiveness, Health Policy, medicine, Cerivastatin, medicine.drug, Reliability engineering
- Published
- 2000
- Full Text
- View/download PDF
42. Technology Focus: Formation Evaluation (August 2009)
- Author
-
Bob Harrison
- Subjects
Focus (computing), Fuel Technology, Strategy and Management, Political science, Industrial relations, Energy Engineering and Power Technology, Engineering ethics
- Abstract
Technology Focus Generally, the entire interval between the surface casing and total depth is logged, but far too often we neglect to take core. While the expense of coring and core analysis is not small, it usually is only a fraction of total well cost; yet it remains an uphill struggle to convince management that the project will benefit from the knowledge gained. With the quest for reserves leading to exploration for and development of reservoirs with ever-more-complex porosity systems at greater depths and higher temperatures where log responses become suspect, I would argue that taking core has never been more important. There is no denying that logs provide greater statistical coverage of a larger volume; have a better depth reference; can identify fluid contacts, missed pay, and swept zones; measure reservoir pressure; and can be calibrated to seismic data for reservoir-property mapping. However, visual inspection and laboratory analysis of core provide many key data that logs simply cannot provide, with applications for every discipline. Core confirms the lithology and mineralogy of reservoirs; calibrates estimates of fundamental rock properties such as porosity, saturation, and net thickness; and remains the only true measure of permeability. Core shows how fluids occupy and flow within the reservoir pore space; enables formation-damage studies; and supplies mechanical properties to allow faster and safer drilling and better-designed completions. Despite issues with cleaning, storage, restoration to native state, and scaling up, core still provides the best estimates of many of the crucial inputs for accurate reservoir modeling, even though logging-contractor marketing literature would have us believe otherwise. Logs cannot characterize a reservoir if knowledge of the rock is absent, so subsequent modeling must rely on uncalibrated and unverified log-derived correlations and analogs. The inevitable consequence is greater uncertainty. 
Formation Evaluation additional reading available at OnePetro: www.onepetro.org IPTC 12837 • "Accurate NMR Fluid Typing Using Functional T1/T2 Ratio and Fluid-Component Decomposition" by Boqin Sun, Chevron Energy Technology, et al. SPE 117728 • "Reservoir Rock Typing From Crest to Flank: Is There a Link?" by R.E. Mahmoud Basioni, Abu Dhabi Company, et al. IPTC 12328 • "Data-Acquisition and Formation-Evaluation Strategies in Anisotropic, Tight Gas Reservoirs of the Sultanate of Oman" by H.J. de Koningh, SPE, Petroleum Development Oman, et al.
- Published
- 2009
- Full Text
- View/download PDF
43. Are we reasonably certain that reasonable certainty adequately defines uncertainty in our reserves estimates?
- Author
-
Gioia Falcone and Bob Harrison
- Subjects
Value (ethics) ,Investment decisions ,media_common.quotation_subject ,Probabilistic logic ,Econometrics ,Certainty ,Divergence (statistics) ,Cognitive bias ,Outcome (probability) ,Mathematics ,Event (probability theory) ,media_common - Abstract
Words can have many meanings, while numbers tend to have only one, so using the language of probability can better communicate the potential risks and uncertainties in developed oil and gas projects, which all employ uncertain forecasts of time, hydrocarbon price, cost, and reserves. Yet, rather than assign a probability value to the Proved or 1P deterministic estimate, SPE's Petroleum Resources Management System (PRMS) uses loosely defined expressions such as "reasonable certainty" to convey a high degree of confidence that the deterministically estimated hydrocarbon volumes will be recovered. A literature review is conducted to highlight what biases people are prone to, and to reveal what people consider to be reasonable certainty. The study recounts the provenance of the phrase "reasonable certainty" in the oil and gas industry. Using published work on cognitive bias by acknowledged oil and gas consultants, drawing on research by Harvard University statisticians, and carrying out our own tests on industry professionals, the authors demonstrate that there are consistent perceptions of the language of quantitative probability that differ markedly from those suggested in the PRMS. Tests show repeatedly that industry professionals believe the probability value that corresponds to "reasonable certainty" is in the range 70-75%. This is far less than the implied 90% figure assigned to the probabilistic low estimate of recoverable volumes, generally taken as equivalent to the 1P deterministic case. The divergence between deterministic language and associated and perceived probability is more pronounced when one considers the other reserves classes of 'Probable' and 'Possible'. Such perceptions lead to overly optimistic expectations and can result in poor investment decisions. 
There is a reluctance to assign probability values to deterministic reserves classes, but that should not prevent the assignment of more appropriate language within PRMS. Project team members, experts, and stakeholders must be aware that they are all prone to cognitive bias, which, when coupled with overly optimistic perceptions of uncertainty, leads to wrong decisions being made and to projects being mismanaged. The PRMS text could be changed to better describe the likelihood of the outcome of an event. The continued overuse of the loosely defined adjective 'reasonable' should be avoided. To convey probability greater than 90% or to describe the Low or Proved 1P case, the PRMS would better inform its users by using expressions like "very high probability", "almost certain", or "very likely".
44. From blissful ignorance to intelligent foreboding
- Author
-
Bob Harrison
- Subjects
Engineering, Engineering management, General Energy, business.industry, Energy (esotericism), Association (object-oriented programming), media_common.quotation_subject, Institution, Engineering ethics, Ignorance, Management, Monitoring, Policy and Law, business, media_common
- Published
- 1979
- Full Text
- View/download PDF
45. Energy modelling and net energy analysis
- Author
-
Bob Harrison
- Subjects
Engineering, General Energy, business.industry, Net energy, Management, Monitoring, Policy and Law, business, Engineering physics, Energy (signal processing)
- Published
- 1980
- Full Text
- View/download PDF