In the AD8310 data sheet (Rev. F, 6/10) the sensitivity (slope) is stated in several places, but with different values given: 20, 22, 24, 25 and 26 mV/dB.

I calculate that for a DC output voltage of 1.8 V, slopes in the range 20 to 26 mV/dB give an input voltage range of -18 to -39 dBV.
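That range can be sketched numerically. This is a minimal check assuming the usual log-amp model Vout = slope × (Pin − intercept) and a nominal intercept of -108 dBV (an assumed typical value; verify against your datasheet revision):

```python
# Sketch: input level in dBV implied by Vout = 1.8 V for each quoted slope.
# Assumes the log-linear model Vout = slope * (Pin_dBV - intercept) and a
# hypothetical nominal intercept of -108 dBV.

INTERCEPT_DBV = -108.0  # assumed nominal intercept, not from this thread

def vout_to_dbv(vout_v, slope_v_per_db, intercept_dbv=INTERCEPT_DBV):
    """Invert the transfer function: Pin = Vout / slope + intercept."""
    return vout_v / slope_v_per_db + intercept_dbv

for slope_mv in (20, 22, 24, 25, 26):
    pin = vout_to_dbv(1.8, slope_mv / 1000.0)
    print(f"{slope_mv} mV/dB -> {pin:.1f} dBV")
```

With these assumptions, 20 mV/dB gives -18.0 dBV and 26 mV/dB gives about -38.8 dBV, matching the -18 to -39 dBV spread above.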

The conversion from Vout to input dBV is essential and has to be made, but the uncertainty in the slope makes it hard to hold an acceptable tolerance on the input dBV.

In my application I am currently using a slope of 24 mV/dB. But what slope should I choose, and on what basis? Does anyone have an opinion on this?

The slope and intercept can change from part to part, over temperature, and over frequency. That is why we give minimum, typical, and maximum numbers for the slope and intercept. To calculate an accurate input voltage (or power), you must calibrate each part with a minimum of two calibration points to determine what the slope and intercept are for the part you are currently working with.
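The two-point calibration described above can be sketched as follows. The measurement values here are hypothetical; in practice you would apply two known input levels within the device's linear range and record the corresponding output voltages:

```python
# Two-point calibration sketch for a log amp obeying
# Vout = slope * (Pin_dBV - intercept).
# The readings below are made-up example values, not measured data.

def calibrate(p1_dbv, v1, p2_dbv, v2):
    """Solve for (slope in V/dB, intercept in dBV) from two points."""
    slope = (v2 - v1) / (p2_dbv - p1_dbv)
    intercept = p1_dbv - v1 / slope
    return slope, intercept

def vout_to_dbv(vout, slope, intercept):
    """Convert a measured Vout back to input level using this part's pair."""
    return vout / slope + intercept

# Hypothetical readings: -60 dBV gives 1.152 V, -20 dBV gives 2.112 V
slope, intercept = calibrate(-60.0, 1.152, -20.0, 2.112)
print(f"slope = {slope * 1000:.1f} mV/dB, intercept = {intercept:.1f} dBV")
```

Once the per-part slope and intercept are stored, every subsequent Vout reading is converted with `vout_to_dbv`, removing the part-to-part spread that a single nominal slope cannot account for.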

There's a decent write-up on how to do this and how to calculate the resulting error in the ADL5506 datasheet, starting on page 17 and continuing on page 18. In that case, it shows a three-point calibration and error calculation.