Geopressure Gradient

Definition - What does Geopressure Gradient mean?

Geopressure gradient is the measure of the increase in pore pressure per unit depth below the earth's surface. It is generally measured in pounds per square inch per foot (psi/ft) or kilopascals per meter (kPa/m). Under normal hydrostatic conditions it is about 0.433 psi/ft (9.8 kPa/m); however, it may be lower or higher than this value. It should not be confused with the geothermal gradient, which is the increase in temperature per unit depth, typically 1.5 to 2.3 degrees Fahrenheit per 100 feet.
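
A minimal sketch of how the gradient is obtained in practice: divide the measured pore pressure by the true vertical depth. The depth, pressure, and function name below are hypothetical illustrations, not field data.

# Minimal sketch: average geopressure gradient from a measured pore
# pressure and true vertical depth (hypothetical values).

PSI_PER_FT_TO_KPA_PER_M = 22.62  # 1 psi/ft is approximately 22.62 kPa/m

def geopressure_gradient_psi_per_ft(pore_pressure_psi: float, depth_ft: float) -> float:
    """Return the average pore-pressure gradient in psi/ft."""
    return pore_pressure_psi / depth_ft

# Example: a formation at 10,000 ft with a measured pore pressure of 4,330 psi
gradient = geopressure_gradient_psi_per_ft(4330.0, 10000.0)
print(f"Gradient: {gradient:.3f} psi/ft ({gradient * PSI_PER_FT_TO_KPA_PER_M:.1f} kPa/m)")
# Prints roughly 0.433 psi/ft (9.8 kPa/m), i.e. a normal hydrostatic gradient.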

Petropedia explains Geopressure Gradient

The geopressure gradient relates to the change in pore pressure in the formation as depth increases: with increasing depth, the pore pressure rises (as does the temperature, which is described separately by the geothermal gradient). Besides pounds per square inch per foot (psi/ft), the geopressure gradient can also be expressed in kilopascals per meter (kPa/m). The geopressure gradient is described as abnormally high or low if it deviates from the normal hydrostatic pressure gradient of 0.433 psi/ft (9.8 kPa/m).
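
The sketch below illustrates that comparison against the normal hydrostatic gradient of 0.433 psi/ft quoted above. The tolerance band used to decide what still counts as "normal" is an arbitrary illustrative choice, not an industry standard.

# Minimal sketch: labeling a pore-pressure gradient relative to the
# normal hydrostatic gradient (0.433 psi/ft, about 9.8 kPa/m).

NORMAL_HYDROSTATIC_PSI_PER_FT = 0.433
TOLERANCE = 0.02  # psi/ft, hypothetical band around "normal"

def classify_gradient(gradient_psi_per_ft: float) -> str:
    """Label a gradient as subnormal, normal, or abnormally high."""
    if gradient_psi_per_ft < NORMAL_HYDROSTATIC_PSI_PER_FT - TOLERANCE:
        return "subnormal (low) geopressure gradient"
    if gradient_psi_per_ft > NORMAL_HYDROSTATIC_PSI_PER_FT + TOLERANCE:
        return "abnormal (high) geopressure gradient"
    return "normal hydrostatic gradient"

print(classify_gradient(0.433))  # normal hydrostatic gradient
print(classify_gradient(0.60))   # abnormal (high) geopressure gradient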
