I've now tried several algorithms. In all of these, I set both the temperature and pressure oversampling to ×1 (no oversampling):
1. Forced mode, with a fixed 5 ms delay between reads
2. Forced mode, with a fixed 4 ms delay between reads
3. Forced mode, with a fixed 3 ms delay between reads
4. Forced mode, with a fixed 1 ms delay between reads
5. Normal mode, t_sb = 0.5 ms: wait until status bit 3 = 0, read data, repeat
6. Normal mode, t_sb = 0.5 ms: wait until status bit 3 = 0, read data, wait until status bit 3 = 1, repeat (sketched below)
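To make algorithms 5 and 6 concrete, here's a minimal C sketch of the normal-mode polling loop, assuming a BMP280 on I2C. The register map (0xF3 status with the "measuring" flag at bit 3, 0xF4 ctrl_meas, 0xF5 config, 0xF7 onward for the burst data) is from the BMP280 datasheet; i2c_write_reg and i2c_read_regs are hypothetical stand-ins for whatever bus access your platform provides.

```c
#include <stdint.h>

/* Platform-specific I2C helpers -- hypothetical signatures. */
void i2c_write_reg(uint8_t reg, uint8_t val);
void i2c_read_regs(uint8_t reg, uint8_t *buf, uint8_t len);

#define REG_STATUS    0xF3  /* bit 3 = "measuring" */
#define REG_CTRL_MEAS 0xF4
#define REG_CONFIG    0xF5
#define REG_DATA      0xF7  /* press_msb .. temp_xlsb, 6 bytes */
#define ST_MEASURING  0x08

void run_normal_mode_test(void)
{
    /* config: t_sb = 0.5 ms (000), IIR filter off */
    i2c_write_reg(REG_CONFIG, 0x00);
    /* ctrl_meas: osrs_t = x1 (001), osrs_p = x1 (001), mode = normal (11) */
    i2c_write_reg(REG_CTRL_MEAS, 0x27);

    for (int i = 0; i < 2000; i++) {
        uint8_t status, d[6];

        /* wait for the current conversion to finish (bit 3 -> 0) */
        do { i2c_read_regs(REG_STATUS, &status, 1); } while (status & ST_MEASURING);

        i2c_read_regs(REG_DATA, d, 6);
        uint32_t raw_p = ((uint32_t)d[0] << 12) | ((uint32_t)d[1] << 4) | (d[2] >> 4);
        uint32_t raw_t = ((uint32_t)d[3] << 12) | ((uint32_t)d[4] << 4) | (d[5] >> 4);
        /* ... run raw_p / raw_t through the compensation formulas,
               track the min/max of each ... */

        /* algorithm 6 only: also wait for the NEXT conversion to start
           (bit 3 -> 1), so every pass reads a fresh sample.
           Algorithm 5 skips this loop. */
        do { i2c_read_regs(REG_STATUS, &status, 1); } while (!(status & ST_MEASURING));
    }
}
```

The only difference between #5 and #6 is that final wait: #6 refuses to start another read until a new conversion has actually begun.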
With each algorithm, I'd do 2000 reads.
As I went through the scenarios above, the minimum and maximum readings (for both temperature and pressure) got farther and farther apart, i.e. the range of values grew.
That held until #5, which was slightly better than #1. The range with #6 was the smallest of all; in other words, algorithm 6 gave the most consistent readings (I should say the sensor was lying still throughout these tests).
In #5, it took 6 seconds to make 2000 readings, an ODR of about 333 Hz. In #6, it took 12 seconds, an ODR of about 166 Hz: exactly the maximum ODR the datasheet specifies for this configuration.
I have to assume that with #5, about half the reads happened before the data registers had been updated, so they just returned the previous sample. In other words, those reads were a waste of CPU time. One way to check this is sketched below.
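This is a diagnostic I'm adding here, not anything from the datasheet: with ×1 oversampling the raw ADC words are noisy enough that two genuinely fresh conversions almost never match exactly, so counting back-to-back identical raw samples gives a rough estimate of the stale-read fraction.

```c
#include <stdint.h>

/* Rough stale-read counter: call once per pass with the raw 20-bit
   values decoded in the sketch above. prev_* start at an impossible
   value so the first sample never counts as stale. */
static uint32_t prev_p = 0xFFFFFFFF, prev_t = 0xFFFFFFFF;
static int stale_reads = 0;

void note_sample(uint32_t raw_p, uint32_t raw_t)
{
    if (raw_p == prev_p && raw_t == prev_t)
        stale_reads++;  /* data registers hadn't updated since the last read */
    prev_p = raw_p;
    prev_t = raw_t;
}
```

If the assumption holds, stale_reads / 2000.0 should land near 0.5 for algorithm 5 and near 0 for algorithm 6.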