time-nuts@lists.febo.com

Discussion of precise time and frequency measurement


GPSDO with all-digital phase/time measurement?

Mark Haun
Wed, Feb 26, 2014 9:51 PM

Hi everyone,

I'm new to the list, and have been reading the recent threads on
Arduino-based GPSDOs and the pros/cons of 10-kHz vs 1-Hz time pulses with
interest.

As I understand it, there are a couple of reasons why one needs a
time-interval / phase measurement implemented outside the MCU:

  1. Time resolution inside the MCU is limited by its clock period, which is
    much too coarse.  The GPSDO would ping-pong within a huge dead zone.
  2. Software tends to inject non-determinism into the timing.

Are there others?  I have no background or experience with PLLs/DLLs, so
I'm really just feeling my way blindly here.

That being said, I find myself wondering as follows:
Suppose that we count OCXO cycles (at, say, 10 MHz) using one of the MCU's
timer/counter peripherals, and periodically sample the counter value with an
interrupt triggered on the rising edge of the GPS 1pps.  Assume that this
interrupt is the highest priority in the system, so that our measurement is
fully deterministic, having only the +/- one cycle ambiguity inherent in the
counting.  Also assume that we keep the counter running continuously.

At this point the time measurement is quite crude, with 100-ns resolution.
But because we keep the counter running, the unknown residuals will keep
accumulating, and we should be able to average out this "quantization noise"
in the long run.  That is, we can measure any T-second period to within 100
ns, so the resolution on a per-second basis becomes 100 ns / T.
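A minimal sketch of the idea (the numbers and counter model are illustrative, not from any particular MCU): simulate a free-running counter clocked by the OCXO, latch it at each 1PPS edge, and estimate average frequency from the endpoint difference over T seconds.

```python
import random

F_NOM = 10_000_000                  # nominal OCXO frequency, Hz

def latched_counts(f_actual, seconds, phase0):
    """Free-running counter clocked by the OCXO, latched at each 1PPS edge."""
    return [int(phase0 + f_actual * t) for t in range(seconds + 1)]

# OCXO running 0.05 Hz high (5e-9 fractional offset), unknown starting phase
f_true = F_NOM + 0.05
counts = latched_counts(f_true, 1000, phase0=random.random())

T = 1000
f_est = (counts[-1] - counts[0]) / T      # cycles per second over T seconds
assert abs(f_est - f_true) <= 1.0 / T     # error bounded by one count / T
```

The +/- one-count ambiguity at each endpoint bounds the average-frequency error at 1 cycle (100 ns) divided by T, which is the 100 ns / T intuition above.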

Is there any reason why this sort of processing cannot attain equivalent
performance to the more conventional analog phase-detection approach?

Thanks,

Mark
KJ6PC

Tom Van Baak
Thu, Feb 27, 2014 12:24 AM

Hi Mark,

Not to worry. It's not really "too coarse". Your method is more than enough for millisecond or microsecond timing. Consider that everything about time & frequency is merely an exponent; avoid fuzzy words like coarse or fine. Your approach will work just fine; many a GPSDO has been designed that way.

If you measure your oscillator to a second using WWVB and make adjustments, or if you measure your oscillator to a millisecond using NTP and make adjustments, or if you measure your oscillator to a microsecond using GPS and make adjustments, it is really all the same thing. The exponent is different but the concept is the same.

The final result depends on the time quality of the external 1PPS, the frequency quality of your oscillator, the resolution of your measurement, and the granularity of your adjustment. So every GPSDO "works"; the question is simply how well it works, and whether it meets your needs. For many applications 1 ms or 1 us timing accuracy is more than enough. For only a few applications does 100 ns or 50 ns matter. It gets much, much harder as you get into the double- and single-digit ns range.

Several members of this list have developed GPSDOs based on MCU timestamping of 1PPS pulses. It works fine. Perfecting the tuning constants takes some time, and is somewhat dependent on your GPS receiver, your antenna, and your location. Your oscillator, environment, and MCU also play a key part. But the art is to be able to measure all of this, to document it, to experiment, and possibly to improve on it over time.

/tvb

----- Original Message -----
From: "Mark Haun" haunma@keteu.org
To: time-nuts@febo.com
Sent: Wednesday, February 26, 2014 1:51 PM
Subject: [time-nuts] GPSDO with all-digital phase/time measurement?

<snip>
Bob Stewart
Thu, Feb 27, 2014 2:03 AM

Hi Mark,

I'm neither an engineer nor an expert, but here are my comments.

I think that the idea of 100ns/T is wrong.  There are several variables that control accuracy, but the time between pulses from your OCXO (assuming no phase or frequency drift) isn't one of them.  So, that gives 1/T.  Here the problem is that T must get large before your accuracy can be good.  You can achieve very good accuracy, but at the cost of waiting thousands of seconds between "phase points"; i.e. where your 1PPS coincides with the 10 millionth OCXO pulse.  The theoretical maximum would be infinity, of course, but your oscillator won't be that stable.

Another big problem is the accuracy of the 1PPS pulse.  I'm using an Adafruit GPS receiver, and it's listed as accurate to within 10ns.  And it is, but you have to be wary of exactly what that means.  It doesn't mean +/- 5ns.  So, as your 1PPS pulse bobs back and forth, you will often encounter an OCXO pulse up to 10ns early, or up to 10ns late.  So you might count 9,999,999 pulses from the OCXO immediately followed by 10,000,001 pulses.  Neither of those, by itself, is a signal to change the EFC voltage to your OCXO.  In fact, it is normal for your count to alternate between the two for long periods, if you are very very close to exactly 10MHz, just from the quantization error on the 1PPS.  It is also normal for 1/T to control the time between phase crossings.  So you have to wait for two miscounts in a row in the same direction before making a change.  And even then, you can't be 100% sure that it's not due to the quantization errors in your 1PPS signal.

The better GPS receivers will output a quantization error value every second.  But if you're using the 1/T method, there's nothing you can do with it, so you have to live with whatever quantization errors you get.
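The bobbing Bob describes can be sketched with a toy simulation (hypothetical numbers; the receiver error is modeled as uniform jitter of up to 10 ns on each PPS edge):

```python
import random

def pps_counts(n_seconds, jitter_ns=10.0, phase0=0.95, seed=1):
    """OCXO cycles counted between consecutive 1PPS edges, with each
    edge displaced by up to jitter_ns of receiver quantization error."""
    rng = random.Random(seed)
    f_ocxo = 10_000_000.0              # OCXO dead-on 10 MHz
    edges = [t + rng.uniform(0.0, jitter_ns * 1e-9)
             for t in range(n_seconds + 1)]
    latched = [int(phase0 + f_ocxo * e) for e in edges]
    return [b - a for a, b in zip(latched, latched[1:])]

counts = pps_counts(200)
# even with a perfect oscillator, per-second counts hop among
# 9,999,999 / 10,000,000 / 10,000,001 purely from PPS jitter
assert set(counts) <= {9_999_999, 10_000_000, 10_000_001}
assert len(set(counts)) > 1
```

The hop only appears when the OCXO phase sits near a counter-tick boundary (phase0 near 1 here), which is exactly the "very very close to exactly 10MHz" regime Bob describes.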

Anyway, those are my experiences.

Bob - AE6RV


From: Mark Haun haunma@keteu.org
To: time-nuts@febo.com
Sent: Wednesday, February 26, 2014 3:51 PM
Subject: [time-nuts] GPSDO with all-digital phase/time measurement?

<snip>
Chris Albertson
Thu, Feb 27, 2014 2:04 AM

At this point the time measurement is quite crude, with 100-ns resolution.
But because we keep the counter running, the unknown residuals will keep
accumulating, and we should be able to average out this "quantization noise"

What you are saying is "With a long enough gate time you can measure
frequency to any desired level of accuracy."

For that to be true the frequency must remain stable over the long period
you are averaging.  For example, let's say you average over 10 seconds.
What happens if the nominal 10 MHz oscillator ran at 10.1 MHz for five
seconds and then 9.9 MHz for 5 seconds?  You'd measure 10 MHz and be happy.
So there is the problem.  With long-period averaging, you can only do it
for as long as you trust the clock not to change.

All the long gate time tells you is the AVERAGE, so it only works if the
change over the period is smooth and monotonic.
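The 10.1/9.9 MHz example above, in numbers (the excursion values are Chris's illustration):

```python
# 5 s at 10.1 MHz followed by 5 s at 9.9 MHz
cycles = 5 * 10_100_000 + 5 * 9_900_000
avg_hz = cycles / 10
assert avg_hz == 10_000_000.0   # a 10 s gate reports exactly 10 MHz;
                                # the 1% frequency excursion is invisible
```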

If you want very good accuracy, say one part in 10^14 you can't get there
by waiting days and weeks because the frequency will move while you are
measuring it.

The quicker way is to measure the phase.  You can still do this digitally:
make a one-bit measurement, 0 if the phase leads, 1 if it lags, and
then over time you want the average to be 0.5.  This works much faster
because your controller does not have to wait for an entire cycle of error
to accumulate.
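A toy bang-bang loop along these lines (all constants are hypothetical; there is no real loop filter here, just a fixed frequency nudge based on the sign of the phase error):

```python
def bang_bang(f_err0=5e-9, step=2e-10, seconds=20_000):
    """One-bit control: each second, nudge frequency down if the OCXO
    is ahead of the 1PPS, up if it is behind; track the duty cycle."""
    phase = 0.0            # phase error vs. 1PPS, seconds
    f_err = f_err0         # fractional frequency error
    ahead = 0
    for _ in range(seconds):
        phase += f_err                 # phase accumulates once per second
        if phase > 0:                  # OCXO ahead: steer frequency down
            ahead += 1
            f_err -= step
        else:                          # OCXO behind: steer frequency up
            f_err += step
    return phase, ahead / seconds

phase, duty = bang_bang()
assert abs(duty - 0.5) < 0.02      # one-bit average settles near 0.5
assert abs(phase) < 1e-6           # phase error stays bounded (limit cycle)
```

Note the undamped limit cycle: a real GPSDO would add proportional/integral filtering, but even this crude version keeps phase bounded without waiting for a whole-cycle slip.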

Your counting system would have to wait for a full one-cycle error and then
make a very large, infrequent correction.  It could work for some use cases.

Chris Albertson
Redondo Beach, California

Tom Van Baak
Thu, Feb 27, 2014 5:22 AM

At this point the time measurement is quite crude, with 100-ns resolution.
But because we keep the counter running, the unknown residuals will keep
accumulating, and we should be able to average out this "quantization noise"
in the long run.  That is, we can measure any T-second period to within 100
ns, so the resolution on a per-second basis becomes 100 ns / T.

No. The timing resolution per second is always 100 ns. You're probably thinking about average frequency, in which case dividing by T is sometimes valid, and it looks better and better as time goes by, usually.

What saves you here is that your counter noise (100 ns) is likely greater than the quantization noise. So you can pretty much ignore the receiver 1PPS quantization noise. For people with much lower measurement noise (e.g., 1 ns) the quantization noise becomes a more important piece of the error pie.

Try not to say average "out"; that sounds like it goes away over time or gets smaller. You're doing a timing measurement so the 100 ns measurement granularity is always there, on every measurement.

Is there any reason why this sort of processing cannot attain equivalent
performance to the more conventional analog phase-detection approach?

All other factors equal, a GPSDO based on 100 ns measurement resolution can never attain the equivalent of a GPSDO based on 10 ns or 1 ns measurement resolution. Waiting shorter or longer doesn't change the RMS timing accuracy.
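A small numerical check of this point (illustrative only): quantize 1PPS timestamps to a 100 ns counter grid. The per-edge timing error never improves with time, while the endpoint-based average-frequency error does shrink as 100 ns / T.

```python
import random

TICK = 100e-9                         # 10 MHz counter granularity, seconds

def quantize(t):
    """Timestamp as truncated by a free-running 100 ns counter."""
    return int(t / TICK) * TICK

rng = random.Random(42)
offset = rng.uniform(0.0, TICK)       # unknown phase between PPS and counter

T = 1000
edges = [t + offset for t in range(T + 1)]    # true 1PPS edge times
stamps = [quantize(e) for e in edges]

time_errs = [abs(s - e) for s, e in zip(stamps, edges)]
freq_err = abs((stamps[-1] - stamps[0]) - T) / T

assert max(time_errs) <= TICK         # every timestamp is still +/-100 ns
assert freq_err <= TICK / T           # but average frequency is good to 100 ns / T
```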

/tvb

Bob Stewart
Thu, Feb 27, 2014 5:32 AM

Tom,

I took his 100ns figure to be simply the period of 10MHz.  He mentioned using an interrupt-driven system, so the counts should not necessarily be limited to 100ns accuracy.  At least on the PIC I'm using, the CCP and timer interrupts don't seem to be synchronous with the PIC clock.  I could be mistaken.

Bob


From: Tom Van Baak tvb@LeapSecond.com
To: Discussion of precise time and frequency measurement time-nuts@febo.com
Sent: Wednesday, February 26, 2014 11:22 PM
Subject: Re: [time-nuts] GPSDO with all-digital phase/time measurement?

<snip>
Alexander Pummer
Thu, Feb 27, 2014 1:42 PM

It would be interesting to see the "accuracy" of the 1PPS pulses by
comparing them with a second 1PPS pulse derived from a rubidium
standard, which on its own does not have quantization errors.
73
KJ6UHN
Alex

On 2/26/2014 6:03 PM, Bob Stewart wrote:

<snip>