Example: Measuring blood pressure
Prerequisite
Understanding the situation
When I get my blood pressure taken, the doctor, physician's associate, or technician always insists that my arm be aligned so that the cuff is at the same level as my heart. Does this really matter? My upper arm can only rotate a very limited amount.
Presenting a sample problem
If my blood pressure is normal, say 120/80 mmHg, when measured with the cuff at the level of my heart, estimate what the reading would be if I placed my forearm on top of my head so that the cuff was at the level of my eyes.
Solving this problem
While this is not a realistic situation (Why in the world would I put my arm on top of my head while I'm having my blood pressure measured?), this gives an extreme case: it's about as far above my heart as I can move my arm. So any more realistic situation (say a patient lying on a bed with their arm hanging down on the side of the bed) should have less of an effect than we calculate here.
Let's choose to treat this problem using the toy model of a static fluid container (as in the consideration of Archimedes' Principle). This isn't quite right since the blood system is highly dynamic, with the blood vessels having some elasticity and muscles being able to squeeze them, but it should be a good starting point.
Our analysis of pressure as a function of depth in a static fluid shows that the pressure varies with depth as
$$p=p_0 + \rho g d$$
where $p_0$ is the pressure when $d=0$ and $\rho$ is the density of the fluid we are considering.
So if we go up a distance $\Delta h$ from our starting point, we expect the pressure to drop by an amount $\rho g \Delta h.$ Let's estimate this.
As a first cut, let's take the density of blood to be about that of water, $\mathrm{10^3\ kg/m^3}.$ We'll take $g = \mathrm{10\ N/kg,}$ and I estimate the distance between my heart and eyes as being about $\Delta h = \mathrm{30\ cm = 0.3\ m.}$
Putting these in I get a change of pressure of
$$\Delta p = \rho g \Delta h = \mathrm{(10^3\ kg/m^3)\times (10\ N/kg)\times (0.3\ m) \\= 0.3\times 10^4\ N/m^2.}$$
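As a sanity check, this arithmetic can be reproduced in a few lines (a quick sketch using the round numbers estimated above):

```python
# Hydrostatic pressure difference Delta p = rho * g * Delta h,
# using the round-number estimates from the text.
rho = 1.0e3    # density of blood, approximated as water, in kg/m^3
g = 10.0       # gravitational field strength in N/kg
delta_h = 0.3  # estimated heart-to-eye distance in m

delta_p = rho * g * delta_h
print(delta_p)  # 3000.0 N/m^2, i.e. 0.3 x 10^4 N/m^2
```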
Unfortunately, medical practice measures blood pressure in the historical units of "millimeters of mercury = mmHg". But since the dimensionalities are the same, we know this is just a change of scale, so we can use proportionality to figure out what our result is in mm Hg.
We know (or can look up) that 1 atmosphere of pressure is pretty close to $\mathrm{100\ kPa = 10^5\ N/m^2}.$ And we know (or can look up) that 1 atmosphere is about 760 mmHg. So we can find what our change is in mmHg by the proportion
$$\mathrm{\frac{760\ mmHg}{10^5\ N/m^2} = \frac{\Delta p}{0.3\times 10^4\ N/m^2}.}$$
This gives our result as
$$\Delta p = \mathrm{0.03 \times 760\ mmHg \approx 20\ mmHg.}$$
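The proportion-based unit conversion can be sketched the same way (using the values from the text: 760 mmHg corresponding to $\mathrm{10^5\ N/m^2}$):

```python
# Convert the pressure change from N/m^2 (pascals) to mmHg by proportion:
# 760 mmHg corresponds to 1 atm, which is about 10^5 N/m^2.
delta_p_pa = 0.3e4          # pressure change in N/m^2 from the estimate above
mmHg_per_pa = 760 / 1.0e5   # conversion factor in mmHg per (N/m^2)

delta_p_mmHg = delta_p_pa * mmHg_per_pa
print(delta_p_mmHg)  # 22.8, i.e. about 20 mmHg
```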
This suggests that if the blood pressure reads 120/80 mmHg at the level of the heart, it would read 100/60 mmHg at the level of the eyes.
I performed this experiment with my home blood-pressure measuring cuff. My blood pressure at the level of my heart came out to be $\mathrm{119/74\ mmHg}$ (measured 3 times in quick succession and averaged, since blood pressure fluctuates a bit from moment to moment). When I measured it at the level of my eyes, it came out to be $\mathrm{100/54\ mmHg}.$ This agrees remarkably well with my estimate using the very simple toy model!
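To see how well the toy model did, we can subtract the readings reported above directly:

```python
# Measured drops from the home-cuff experiment, compared with the ~20 mmHg estimate.
systolic_at_heart, diastolic_at_heart = 119, 74  # reading at heart level, mmHg
systolic_at_eyes, diastolic_at_eyes = 100, 54    # reading at eye level, mmHg

systolic_drop = systolic_at_heart - systolic_at_eyes     # mmHg
diastolic_drop = diastolic_at_heart - diastolic_at_eyes  # mmHg
print(systolic_drop, diastolic_drop)  # 19 20, both close to the ~20 mmHg estimate
```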
While the difference from 120/80 mmHg to 100/60 mmHg might not be particularly significant medically, consider a patient receiving drugs to reduce fluid buildup: their blood pressure would be watched carefully, since the drugs could produce a blood pressure that was dangerously low. A reading of 100/60 mmHg would be acceptable and not lead to a change of medications, but a reading of 80/40 mmHg would be a red flag and could lead to an unnecessary and possibly dangerous change of meds.
Joe Redish 3/21/21
Last Modified: March 21, 2021