You guys know I am a data weenie and have to jump in here. I was a participant in a study that compared two ultrasensitive PSA test methods, both reporting to two decimal places: the older Siemens Centaur assay (June 2006 protocol) and the newer, more selective Roche Cobas 601 assay (April 2010 protocol). All samples were drawn the same day from the same vial.
Here is my data after RP in July 2009.
As you can see below, the older method reports a slightly higher value, and my results are quite stable.
1/14/2010 (6 months) - 0.05 (Siemens Centaur)
4/14/2010 (9 months) - 0.04 (Siemens Centaur); <0.01 (Roche Cobas 601 ECLIA)
7/12/2010 (12 months) - 0.03 (Siemens Centaur, direct chemiluminescence); <0.01 (Roche Cobas 601 ECLIA)
10/22/2010 (15 months) - 0.03 (Siemens Centaur, direct chemiluminescence); <0.01 (Roche Cobas 601 ECLIA)
12/29/2010 (17 months) - Siemens testing discontinued; <0.01 (Roche Cobas 601 ECLIA)
So, does 0.02 make a difference? It depends on the machine used. He should ask for a copy of the test report, see which assay was run, and then decide. (Either way, it is good.)
Post Edited (Worried Guy) : 4/6/2011 5:43:03 PM (GMT-6)