Detailed imaging and careful measurement boost field recovery rates
Dick Ghiselin
Contributing Editor
Great philosophers almost universally maintain that we learn more from our mistakes than from our successes. This quirk of human nature probably explains why many automobile owners go to great lengths to coax one more year out of their ancient vehicles, when it would have been wiser to take better care of them in the first place. Sadly, the analogy also applies to our own bodies, as many of us have experienced.
Reservoirs are no different. Their production behavior after a couple of decades reflects the care they were given when they were first developed and completed. When they finally stop producing on their own, we come up with all sorts of techniques to extend their economic life. These include artificial lift, water flooding, pressure maintenance, and infill drilling projects. Somewhere near the end of the line are enhanced oil recovery (EOR) schemes. The current favorite, by a large margin, is thermal, followed by chemical and biological (microbial) methods.
With all these techniques at the command of the world's producing companies, one would think we are on the brink of near-total recovery of every drop of hydrocarbon from our reservoirs. In fact, we are far from it. Although some conventional natural gas reservoirs have recovery factors (the fraction of the original hydrocarbons in place that is ultimately produced) near 80%, most reservoirs reach their economic limits somewhere short of 40%.
The lesson for success at improved recovery has been staring us in the face for a century. On Jan. 10, 1901, Capt. Anthony Lucas and his partners brought in the famous Spindletop gusher, which blew an estimated 100,000 b/d of oil into the Southeast Texas sky. Lesser gushers were the distinctly unscientific way oil producers recognized success. About a quarter-century later, the Schlumberger brothers invented a way to assess the potential of subsurface reservoirs without having them spew a large volume of their potential profits onto the lease. Logging, or electrical coring as it was known at the time, could help an operator by answering a few basic questions: Are hydrocarbons present? Where are they? Will they produce? How much? The Schlumberger inventions were accompanied by drilling and well-control technology that quelled the gushers that had, up to that point, characterized discovery.
The fact that critical reservoir knowledge could be obtained remotely without losing control of the well intrigued drillers, especially with the development of the dipmeter log, which answered a fifth, and very important, question: Where should we drill the next well?
Subsurface measurement
This is the point where a critical gap appeared in reservoir knowledge. Logs could tell geoscientists and engineers many valuable facts about the formations pierced by the drill bit, but they could not reveal what was really inside the reservoir, between the wells. Conventional wisdom at the time held that wells could be accurately correlated, and that the intervening geology was simply a continuation of what was observed in the wells. The only techniques that shed light on the reservoir volumes between the wells were seismic surveys and well testing. However, both had their limitations. Inter-well knowledge was enhanced by the introduction of production logs that could be run into producing wells through tubing to measure dynamic production parameters that shed light on reservoir behavior. Still, the ultimate solution remained elusive.
The evolution of subsurface measurement technology took decades. In the meantime, several early reservoirs were reaching their limits: they either produced unacceptable water cuts, or coned gas, or petered out altogether due to loss of reservoir pressure. Operators reasoned that residual hydrocarbons could be harvested by applying artificial lift, water flooding, or gas-cap pressurization. These techniques worked, up to a point. It soon became obvious that water injected in one well was simply being produced from a neighboring well. Both would be shut in as operators moved on to greener pastures.
But when reservoir engineers compared cumulative production from the reservoir to original-oil-in-place (OOIP) estimates, it became obvious that as much as 70% of the OOIP was still in the reservoir. Analysis revealed that the residual oil was trapped by a variety of geological or petrophysical conditions. High oil viscosity was a big factor, and this paved the way for the thermal, chemical, and biological recovery techniques that achieved reasonable success over the years. However, many were cost-prohibitive in all but the giant reservoirs. Another factor was relative permeability. Relative permeability modifiers have been successful in reducing water cut and allowing residual oil to flow through water-wet pores.
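To make the arithmetic concrete, consider the standard volumetric estimate of OOIP, N = 7,758 A h phi (1 - Sw) / Bo (in stock-tank barrels, with area in acres and thickness in feet), compared against cumulative production. The sketch below uses entirely hypothetical input values; it is meant only to illustrate how a large remaining-oil fraction falls out of the comparison, not to represent any actual field.

```python
# Back-of-the-envelope volumetric OOIP and recovery factor.
# All input values below are assumed for illustration only.

def ooip_stb(area_acres, thickness_ft, porosity, water_saturation, bo):
    """Volumetric OOIP in stock-tank barrels.

    7,758 bbl = 1 acre-ft; bo is the oil formation volume factor (res bbl/STB).
    """
    return (7758.0 * area_acres * thickness_ft * porosity
            * (1.0 - water_saturation) / bo)

def recovery_factor(cumulative_production_stb, ooip):
    """Fraction of OOIP produced to date."""
    return cumulative_production_stb / ooip

if __name__ == "__main__":
    n = ooip_stb(area_acres=2000, thickness_ft=50, porosity=0.22,
                 water_saturation=0.30, bo=1.2)
    rf = recovery_factor(cumulative_production_stb=25e6, ooip=n)
    print(f"OOIP ~ {n / 1e6:.0f} MMSTB, recovery factor ~ {rf:.0%}")
    print(f"Remaining oil ~ {1 - rf:.0%} of OOIP")
```

With these assumed numbers, roughly 100 MMSTB is in place and only about a quarter has been produced, leaving some 75% of the OOIP behind, in line with the figures cited above.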
But the biggest factor restricting hydrocarbon production is reservoir heterogeneity—in other words, the unknowns between the wells. Reservoir heterogeneity appears as undetected compartmentalization, or at worst, as highly complex mineralogy. Engineers agreed that it would be extremely valuable to have this knowledge in advance.
Building a knowledge base
Recently, time-lapse techniques have enabled reservoir managers to deduce how hydrocarbons move through the reservoir. 4D seismic imaging has seen success, particularly where gas is involved. Cross-well tomography has enabled inter-well resistivity imaging with enough precision to allow the siting of infill wells or the steering of sidetracks to improve sweep efficiency. The biggest disadvantage of time-lapse techniques is that they are reactive processes; achieving maximum reservoir productivity requires predictive processes.
Leaseholders today are finding that they can improve reservoir productivity by improving their base knowledge. That means building the highest-quality 3D reservoir model possible using all the measurements taken during the exploration, drilling, and completion processes. Any physician will tell you that the chances of detecting and treating the diseases and infirmities of later life are greatly improved by a good set of baseline images taken before you start having problems. The same is true for hydrocarbon reservoirs. With a high-quality 3D base model, even the most subtle changes are highlighted in subsequent images. These provide clues about how the hydrocarbons are moving toward the producing wells, any obstacles they are encountering, and where significant volumes of residual hydrocarbons remain.
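As a toy illustration of why a high-quality baseline matters, the sketch below differences a hypothetical baseline oil-saturation grid against a later "monitor" grid and flags cells whose change exceeds a noise threshold. The grid, saturations, and threshold are all invented for the sketch; a real workflow would difference calibrated 4D seismic attributes or saturations on a full 3D model.

```python
# Toy time-lapse differencing against a baseline model.
# All grids and values are invented for illustration only.
import numpy as np

rng = np.random.default_rng(seed=1)

# Baseline oil-saturation model on a coarse 10 x 10 areal grid.
baseline_so = np.full((10, 10), 0.70)

# Monitor "survey": a flood has swept the western half, plus noise.
monitor_so = baseline_so.copy()
monitor_so[:, :5] = 0.35                               # swept zone
monitor_so += rng.normal(0.0, 0.02, monitor_so.shape)  # measurement noise

# Flag cells whose saturation change exceeds a noise threshold.
change = baseline_so - monitor_so
swept = change > 0.10   # boolean map of significantly drained cells

print(f"{swept.sum()} of {swept.size} cells show significant drainage")
print(f"Mean saturation drop in swept cells: {change[swept].mean():.2f}")
```

The point of the exercise: without a trusted baseline grid to subtract from, the swept zone would be indistinguishable from measurement noise.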
Planning is the best way to ensure maximum early recovery and to predict what is likely to occur as the reservoir continues to drain. This allows operators to take remedial steps to manage their production, avoiding many pitfalls.
Timing is key
Some may argue that all this planning and measurement-taking costs too much. However, it is better to spend money up front to build an accurate knowledge base than to wait until problems occur, which can necessitate far more costly interventions or remedial work. Also, some of the most valuable data can only be obtained by open-hole logging and sample-taking. Once casing is set, the opportunity to acquire the data is gone, or at least severely compromised. Measurements taken during exploration, drilling, and initial completion can be capitalized as part of finding and development costs. However, remedial work to improve well performance is considered a lifting cost and must be expensed.
A good example can be found in California's San Joaquin Valley. Years ago, when production from the famed Kern River field waned, the reactive decision was to establish a steamflood program to sweep the remaining oil toward designated producing wells. The steam, generated from fresh water, swept some of the oil as expected but left hundreds of thousands of barrels behind, trapped in isolated pockets. Resistivity tools deployed to find these pockets were unable to discriminate between the highly resistive oil and the equally resistive fresh condensed steam. Only recently has a new logging tool been introduced that can differentiate between oil and fresh water; in the meantime, millions of dollars were spent trying to solve the problem.
With quality measurements, modern reservoir or production simulators can be used to predict future conditions, quantify the implications of problems, and even dry-test proposed solutions. Reservoirs must be managed and solutions designed using a holistic approach. The performance of each well affects that of all other wells in the production unit whether the other wells are producers or injectors.
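A full reservoir simulator solves multiphase flow on a detailed 3D grid, which is well beyond the scope of this column. As a minimal stand-in, the sketch below "dry-tests" a depletion plan with a single-tank material balance for an undersaturated oil reservoir above the bubble point, predicting average reservoir pressure from cumulative production. Every input value is assumed for illustration.

```python
# Drastically simplified "dry test" of a depletion plan: a single-tank
# material balance for an undersaturated oil reservoir above the bubble
# point. All reservoir properties are assumed values for illustration;
# a real study would use a full multiphase reservoir simulator.

def effective_compressibility(co, cw, cf, sw):
    """Effective compressibility (1/psi) above the bubble point."""
    so = 1.0 - sw
    return (co * so + cw * sw + cf) / (1.0 - sw)

def pressure_after_depletion(p_init, n_ooip_stb, np_cum_stb, bo, boi, ce):
    """Predicted average reservoir pressure after producing np_cum_stb."""
    return p_init - (np_cum_stb * bo) / (n_ooip_stb * boi * ce)

if __name__ == "__main__":
    ce = effective_compressibility(co=1.0e-5, cw=3.0e-6, cf=4.0e-6, sw=0.30)
    for np_cum in (1e6, 2e6, 3e6):
        p = pressure_after_depletion(p_init=4500.0, n_ooip_stb=100e6,
                                     np_cum_stb=np_cum, bo=1.21, boi=1.20,
                                     ce=ce)
        print(f"After {np_cum / 1e6:.0f} MMSTB produced: p ~ {p:,.0f} psi")
```

Even a crude forecast like this lets an engineer quantify the implications of a plan before committing capital; the same discipline, applied with a real simulator and quality measurements, is what turns reactive fixes into predictive management.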
Perhaps the combination of newly available technology in the form of more precise measurements, the ability to run several different logs on a single trip thanks to universal tool combinability, and increasingly sophisticated LWD tools will encourage better understanding of our offshore reservoirs.
Giant deepwater discoveries represent huge investments. Operators may be loath to pass up any measurement that could affect subsequent reservoir management plans for maximum recovery, and thus maximum return on their investment. Offshore seismic, logging, well testing, and reservoir modeling have reached world-class status in terms of accuracy, resolution, and reliability. Operators can use these to enhance their reservoir knowledge before production problems occur. Engineered solutions based on good data will enable better recovery factors. In the past, EOR was implemented as a last resort, when all else failed. However, for maximum effectiveness, EOR must be part of the game plan from the beginning.