volt-nuts@lists.febo.com

Discussion of precise voltage measurement


traceable calibration

Andreas Jahn
Wed, Sep 7, 2011 12:37 PM

Hello everyone,

I have some questions for the calibration experts:

On Monday I calibrated my voltage references at a friend's place.
He has two 6.5-digit Keithley 2000s.

The reading of my LM399 #2 was

6.86089 V on Keithley #1  (last cal 02/2010)
6.86082 V on Keithley #2  (last cal 05/2011)

Both values sometimes flipped to the next 10 uV level,

giving
6.86090 or 6.86083 as maximum values.

Room temperature was 24.6 degrees Celsius;
humidity, read from a hair hygrometer, was 58%.

So far so good. But what can I say about the absolute voltage
of my reference with respect to the absolute volt?

Normally I would use the 1-year spec of the instrument,
which is 30 ppm of reading + 5 ppm of range (10 V),
giving an uncertainty of +/-260 uV,
so I would spec the reference as
(6.86089 + 6.86082) / 2 = 6.860855 V +/- 260 uV.
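A quick back-of-the-envelope check of that figure in a few lines of Python
(just my own arithmetic, using the 1-year spec quoted above):

    # Check of the +/-260 uV figure: 1-year spec of the Keithley 2000,
    # 10 V range = 30 ppm of reading + 5 ppm of range.
    reading = (6.86089 + 6.86082) / 2              # V, mean of the two Keithleys
    u_uv = 30 * reading + 5 * 10.0                 # spec limit in uV (ppm * volts)
    print(round(reading, 6), "V +/-", round(u_uv, 1), "uV")   # 6.860855 V +/- 255.8 uV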

Calibration reference:

On the other hand, I was lucky that the calibration
protocol of Keithley #2 was available.

Keithley #2 was calibrated in mid 05/2011
at Keithley's German calibration lab.
Calibration was done with a Fluke 5720A calibrator
at a room temperature of 22.6 degrees and 42% humidity.
The calibrator itself was calibrated at the end of 02/2011
with a due date at the end of 05/2011, so the calibrator was
near the end of its 90-day calibration cycle.

Readings of the Keithley 2000 in the 10 V range were

reading       Error      Error (% TOL)
-10.00000     0 ppm      0%
 -5.00001     1.2 ppm    3%
  0.0         100%       18.4%
  4.99999    -2.0 ppm    5%
  9.99999    -1.5 ppm    4.29%

Further, the calibration protocol states that the
TUR (Test Uncertainty Ratio) was 4 or greater
and that the Keithley was not adjusted during calibration.

The 90-day specs of the Fluke 5520A I have found
to be 1.5 ppm of reading + 3 uV relative to calibration,
and 3 ppm of reading + 3 uV absolute
(whatever "absolute" means in this context).

So I am tempted to take the reading of Keithley #2,
add 10 uV (the calibration offset between 5 and 10 V),
and use the specs of the 5520A as the tolerance.
For the 2 K temperature difference between the instrument's
calibration and the calibration of my reference, I would use
the Keithley's temperature-coefficient spec of
2 ppm of reading + 1 ppm of range per degree Celsius.

So this would give a total uncertainty of 7 ppm + 23 uV.
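As far as I can see, this total is just the linear sum of the calibrator's
absolute spec and the temperature-coefficient term over 2 K; a small sketch
of that interpretation (my own reading of the numbers, not an official budget):

    # Where the 7 ppm + 23 uV figure comes from (my interpretation):
    cal_ppm, cal_uv = 3.0, 3.0            # calibrator absolute spec: 3 ppm + 3 uV
    tc_rd_ppm, tc_rng_ppm = 2.0, 1.0      # Keithley tempco per K: reading / range terms
    dT, v_range = 2.0, 10.0               # 2 K difference, 10 V range

    total_ppm = cal_ppm + tc_rd_ppm * dT              # 3 + 4 = 7 ppm of reading
    total_uv = cal_uv + tc_rng_ppm * dT * v_range     # 3 + 20 = 23 uV
    print(total_ppm, "ppm of reading +", total_uv, "uV")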

Calibration history:

2010:

I had already compared my LM399 #2 one year earlier.
The readings of September 2010 were

Keithley #1 6.86089 V (last cal unknown)   (same as in 2011)
Keithley #2 6.86081 V (last cal 05/2009)   (10 uV less than in 2011)

Temperature was 23.8 degrees (1 degree less than in 2011).
Humidity was not recorded.

Also: the reference LM399 #2 runs in parallel with the
two LTZ1000A references which I built at the end of last year.

2009:

Keithley #1 6.86087 V (last cal unknown)
Keithley #2 6.86079 V (last cal 05/2009)

Temperature was 23.5 degrees (1 degree less than in 2011).
Humidity was not recorded.

But this measurement is uncertain by 30 uV because it was taken
before I wrote "top" and "bottom" on the housing of the reference,
so I cannot tell in which orientation the calibration was done.

Before and after that calibration, the values measured at home for this
reference differed by 30 uV against my ADC with its 5 V LT1027 reference.
In 2010 and 2011 there was no significant shift (less than the 2 uV noise level)
before and after transportation.

The conclusions from the history are:

Both Keithleys read a constant difference of 70-80 uV at 7 V
(so neither Keithley drifted during transport from calibration),

and: the drift of my LM399 #2 against the Keithleys has been
less than 10-20 uV during the last year.

Since the Keithleys use an LM399 as their reference element,
the worst case would be that all three LM399 references drift
at the same rate.

Questions:

  • Does anyone know what the "absolute" spec means
    in the context of a calibrator?
    Is this spec usually reached within a calibration lab?
    Do I have to add an additional uncertainty?

  • What absolute uncertainty would you spec for my LM399 #2?

  • What can be derived from the calibration history? Can the
    absolute tolerance be narrowed further?

  • How should the temperature difference between the Keithley
    calibration lab and my friend's non-climate-controlled room
    be handled?

  • What influence does humidity have on the calibration?
    (Since the references are all in metal housings and
    the 10 V range does not need voltage-divider resistors,
    I do not expect any significant influence.)

  • Under which conditions will the instruments be adjusted
    during calibration?
    When they miss the 1-year spec, or
    when they miss the 24-hour spec?

  • What can I derive from the TUR spec of the calibration?

  • How are the ERROR and ERROR (% TOL) values
    calculated? They seem to be quite non-linear:
    10 uV on the negative side gives 1.2 ppm and
    10 uV on the positive side gives 2 ppm at 5 V.

with best regards

Andreas

gbusg
Wed, Sep 7, 2011 3:39 PM

Hi Andreas,

...Good point regarding SI 1990 (your previous post).

Before we "dissect" some of the questions of your current post, can you tell
us if the German cal lab's standard was a Fluke 5520A or 5720A or 5720A
Series II? (Your post mentions both 5520A and 5720A.)

Based on the Fluke specs you mentioned, I'm guessing it is a 5720A?

Regarding Fluke's use of the terms "Absolute" vs. "Relative", their
"Absolute" specs include the propagation of uncertainty through external
standards, tracing up the chain to your National Metrology Institute (PTB).
Here's a quote from Fluke's 5700A/5720A Operator's Manual:

Using Absolute and Relative Uncertainty Specifications

To evaluate the 5700A/5720A Series II coverage of your calibration workload,
use the Absolute Uncertainty specifications. Absolute uncertainty includes
stability, temperature coefficient, linearity, line and load regulation, and
the traceability to external standards. You do not need to add anything to
absolute uncertainty to determine the ratios between the calibrator's
uncertainties and the uncertainties of your calibration workload. Relative
uncertainty specifications are provided for enhanced accuracy applications.
These specifications apply when range constants are adjusted (see "Range
Calibration"). To calculate absolute uncertainty, you must combine the
uncertainties of your external standards and techniques with relative
uncertainty.

Cheers!

Greg


Andreas Jahn
Wed, Sep 7, 2011 4:31 PM

> Before we "dissect" some of the questions of your current post, can you
> tell us if the German cal lab's standard was a Fluke 5520A or 5720A or
> 5720A Series II? (Your post mentions both 5520A and 5720A.)
>
> Based on the Fluke specs you mentioned, I'm guessing it is a 5720A?

Yes, it was a Fluke 5720A calibrator (sorry for the typo).
I cannot tell whether it is a Series II or not
(Series II is not mentioned on the calibration sheet).
The serial number begins with 954.. if that helps.

So I got the specs from the Fluke homepage. I did not find
calibrator specs for the non-Series-II device.
I hope this does not make too big a difference.
Do you have the standard 5720A specs?

> To evaluate the 5700A/5720A Series II coverage of your calibration
> workload, use the Absolute Uncertainty specifications.

Sounds good to me.

With best regards

Andreas

gbusg
Thu, Sep 8, 2011 11:23 PM

Hi Andreas,

You wrote:

> Under which conditions will the instruments be adjusted
> during calibration?
> When they miss the 1-year spec, or
> when they miss the 24-hour spec?

Good question! The term "calibration" means different things in different
disciplines, and its application varies by company and product even within
a given discipline. In some disciplines, there's an expectation that the lab
will adjust your instrument. But I think most of the time in electrical
metrology the lab runs a Performance Test but does not adjust unless a
parameter is out of tolerance (OOT). Some paper standards and/or customers
require the lab to adjust if a parameter's value reaches the guardband
limit of a spec. There has been much debate about this topic down through
the years within metrology organizations and paper standards writing
committees.

If you want to know how the German Keithley lab handles this topic with
regards to Keithley 2000 calibration, you will have to ask them. Do please
let us know what they tell you. (I'm curious too!)

Best,
Greg

Andreas Jahn
Fri, Sep 9, 2011 9:07 PM

> Hi Andreas,

Hello Greg,

> You wrote:
>
>> Under which conditions will the instruments be adjusted
>> during calibration?
>> When they miss the 1-year spec, or
>> when they miss the 24-hour spec?
>
> Good question! The term "calibration" means different things in different
> disciplines, and its application varies by company and product even within
> a given discipline. In some disciplines, there's an expectation that the lab
> will adjust your instrument. But I think most of the time in electrical
> metrology the lab runs a Performance Test but does not adjust unless a
> parameter is out of tolerance (OOT). Some paper standards and/or customers
> require the lab to adjust if a parameter's value reaches the guardband
> limit of a spec. There has been much debate about this topic down through
> the years within metrology organizations and paper standards writing
> committees.
>
> If you want to know how the German Keithley lab handles this topic with
> regards to Keithley 2000 calibration, you will have to ask them. Do please
> let us know what they tell you. (I'm curious too!)

In the meantime I have found the Keithley calibration page on the web,
which answers the question:
http://www.keithley.com/services/calibration

So in Keithley's standard calibration procedure the instrument is
adjusted when the 70% level of the 1-year accuracy is not met.
For the 10 V range, 70% of 35 ppm gives roughly 25 ppm, which
corresponds approximately to the 90-day accuracy.
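A quick check of that guardband arithmetic (my own calculation, taking the
35 ppm as the full 1-year spec at 10 V, i.e. 30 ppm of reading + 5 ppm of range):

    # 70% adjustment guardband applied to the 1-year 10 V spec (35 ppm at full scale)
    one_year_ppm = 30 + 5
    print(round(0.70 * one_year_ppm, 2), "ppm")   # 24.5 ppm, roughly the 25 ppm above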

With best regards

Andreas

gbusg
Sat, Sep 10, 2011 12:50 AM

Andreas wrote:

> In the meantime I have found the Keithley calibration page on the web,
> which answers the question:
> http://www.keithley.com/services/calibration
>
> So in Keithley's standard calibration procedure the instrument is
> adjusted when the 70% level of the 1-year accuracy is not met.
> For the 10 V range, 70% of 35 ppm gives roughly 25 ppm, which
> corresponds approximately to the 90-day accuracy.
>
> With best regards
>
> Andreas


That's good information, Andreas.

Keithley's (70% of spec) adjustment guardband sounds like a good strategy.

Thanks for sharing!

Greg

gbusg
Sat, Sep 10, 2011 5:24 AM

Andreas asked:

> So I got the specs from the Fluke homepage. I did not find
> calibrator specs for the non-Series-II device.
> I hope this does not make too big a difference.
> Do you have the standard 5720A specs?


Hi Andreas,

I compared the Fluke 5720A 2-sigma specs to their 5720A Series II 2-sigma
specs, and on the 11 Vdc range they both have identical specs. For example,
they're both spec'd at +/- (3.5 ppm of output + 2.5 uV) for 1 year.

Cheers,
Greg

Andreas Jahn
Sat, Sep 10, 2011 10:04 AM

OK Greg,

so for the 10 V range the specs seem to be identical.
But since they have a 90-day calibration cycle for the 5720A
according to the measurement protocol, I'd prefer to use the
90-day spec, which is 2.5 ppm of output + 2.5 uV for the 2-sigma value,
or 3 ppm of output + 3 uV for the 3-sigma value.

The interesting thing is that on a sample report
according to ISO they state the measurement uncertainty as
16.1 uV for a 5 V output and
28.1 uV for a 10 V output.

http://www.keithley.com/support/services/calibrationrepair/ReportA.pdf

On the other hand, on their calibration page they say that they will use
the 2-sigma specs for ISO calibration:
5 V would give 15 uV with 2 sigma,
10 V would give 27.5 uV with 2 sigma.

So the values in the report seem to include a tolerance of 0.6 to 1.1 uV
for wiring.
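A short check of those numbers (my own arithmetic, assuming the 2-sigma
90-day spec of 2.5 ppm of output + 2.5 uV):

    # Compare the 2-sigma spec with the sample-report uncertainties.
    for v_out, report_uv in [(5.0, 16.1), (10.0, 28.1)]:
        spec_uv = 2.5 * v_out + 2.5          # 2.5 ppm of output (in uV per volt) + 2.5 uV
        print(v_out, "V:", spec_uv, "uV spec vs", report_uv, "uV report,",
              round(report_uv - spec_uv, 1), "uV extra")
    # -> 5 V: 15.0 vs 16.1 uV (1.1 uV extra); 10 V: 27.5 vs 28.1 uV (0.6 uV extra)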

So I would use the calibrator tolerance of 2.5 ppm + 3.6 uV as the worst case
and transfer this to the 7 V of my reference voltage as a 21.1 uV uncertainty
for the calibrator (2.5 ppm of 7 V = 17.5 uV, plus 3.6 uV).
But how do I have to treat the other tolerances up to the measurement of my
LM399 #2?

With best regards

Andreas


gbusg
Sat, Sep 10, 2011 11:48 AM

Andreas wrote:

> ...so I would use the calibrator tolerance of 2.5 ppm + 3.6 uV as the worst
> case and transfer this to the 7 V of my reference voltage as a 21.1 uV
> uncertainty for the calibrator.
> But how do I have to treat the other tolerances up to the measurement of
> my LM399 #2?

Hi Andreas,

Actually, assuming Keithley's Test Uncertainty Ratios (TUR) are reasonably
high enough - say >4:1 -  you don't even need to carry-forward Keithley's
calibration Measurement Uncertainties (associated with their process to
calibrate your 2000) because Keithley's published specs (for their Model
2000) include allowances for the Measurement Uncertainties (MU) of their
processes to calibrate your 2000.

Therefore in metrology we assume the MU of your process mostly equals the
rectangular distribution of Keithley's published specs for Model 2000.
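As a generic illustration of what that rectangular-distribution assumption
means in practice (standard GUM Type B handling, not anything specific from
Keithley's documents), using the 1-year 10 V spec Andreas quoted:

    # A +/- spec limit treated as a rectangular distribution is divided by
    # sqrt(3) to get a standard (1-sigma) uncertainty, which can then be
    # RSS-combined with the other contributions of an uncertainty budget.
    import math

    spec_limit_uv = 30 * 6.860855 + 5 * 10.0        # 1-year 10 V spec at ~6.86 V, in uV
    std_u_uv = spec_limit_uv / math.sqrt(3)          # ~147.7 uV standard uncertainty
    print(round(spec_limit_uv, 1), "uV limit ->", round(std_u_uv, 1), "uV (1 sigma)")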

If your specific Model 2000 has a "proven track record" (e.g., at least 3
historical calibration test reports showing that your 2000 DC readings are
very repeatable and stable between cal events), then at least on an informal
basis you might use Keithley's test report data as correction factors in
your process to enhance your accuracy (lower MU). However that strategy
would involve a lot of additional statistical analysis and knowledge of your
Model 2000.

My best,
Greg

Andreas Jahn
Sat, Sep 10, 2011 1:07 PM

>> ...so I would use the calibrator tolerance of 2.5 ppm + 3.6 uV as the worst
>> case and transfer this to the 7 V of my reference voltage as a 21.1 uV
>> uncertainty for the calibrator.
>> But how do I have to treat the other tolerances up to the measurement of
>> my LM399 #2?

Hello Greg,

after thinking a while I have a guess at what Keithley is actually doing.
They seem to add a constant 6 uV tolerance geometrically (root-sum-square)
to the 5 V and 10 V tolerances:
sqrt(15^2 + 6^2)   = 16.1
sqrt(27.5^2 + 6^2) = 28.1

So for 7 V I would correct my uncertainty to 20.9 uV.
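A quick numerical check of that guess (purely my own arithmetic):

    # RSS of a constant 6 uV term with the 2-sigma calibrator tolerance
    # of 2.5 ppm of output + 2.5 uV.
    import math

    def combined_uv(v_out):
        base = 2.5 * v_out + 2.5                 # 2-sigma spec in uV
        return math.sqrt(base ** 2 + 6.0 ** 2)   # add the 6 uV geometrically

    for v in (5.0, 7.0, 10.0):
        print(v, "V ->", round(combined_uv(v), 1), "uV")
    # -> about 16.2, 20.9 and 28.1 uV (the report quotes 16.1 and 28.1 uV)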

This leads to another interesting question:
which errors are statistical and thus can be added geometrically,
and which errors have to be added linearly?

> Hi Andreas,
>
> Actually, assuming Keithley's Test Uncertainty Ratios (TUR) are reasonably
> high enough - say >4:1 - you don't even need to carry-forward Keithley's
> calibration Measurement Uncertainties (associated with their process to
> calibrate your 2000) because Keithley's published specs (for their Model
> 2000) include allowances for the Measurement Uncertainties (MU) of their
> processes to calibrate your 2000.
>
> Therefore in metrology we assume the MU of your process mostly equals the
> rectangular distribution of Keithley's published specs for Model 2000.
>
> If your specific Model 2000 has a "proven track record" (e.g., at least 3
> historical calibration test reports showing that your 2000 DC readings are
> very repeatable and stable between cal events), then at least on an informal
> basis you might use Keithley's test report data as correction factors in
> your process to enhance your accuracy (lower MU). However that strategy
> would involve a lot of additional statistical analysis and knowledge of
> your Model 2000.

So I would have to use the +/-260 uV for the 7 V value as the MU.
I still hope that I can narrow down the MU for my LM399 #2 measurement.

Is there really no chance of narrowing this MU to a lower value,
knowing that over 2 years (3 calibrations) both Keithleys have
always shown a constant difference of 70-80 uV for the 7 V reference,
and that during the last year the difference of both instruments
against LM399 #2 has been 10 uV?

> My best,
> Greg

With best regards

Andreas
