I was doing this sort of test with my new IC7610 - with its GPS sync'd oscillator - and eventually found that the two methods produce frequencies that differ by 0.1 Hz. It looks like USB/LSB are offset by equal amounts: 0.05 Hz in opposite directions. This seems independent of the frequency being measured.
Most receivers have limited, not infinite, frequency resolution, and at least some Icoms do; this can be seen by zooming in on an accurately known transmitted frequency. Most SDR receivers behave the same way: even if GPS-stabilized, they will be stable but not necessarily on frequency (with errors sometimes on the order of 10 Hz or more with some). That is true for both Airspy and SDRplay receivers. And the observed offset may differ in different frequency ranges.
When I want accurate measurements, I use a GPS-disciplined SignalHound spectrum analyzer. $$$$
I had thought many stations were consistently a little off frequency; when I started using the SignalHound, it revealed that many were not. But some are still off frequency.
David L. Wilson
I suspect that is a cost-driven thing.
If you go out and buy a piece of test equipment, the documentation usually covers this kind of thing, or there's an app note, or you can ask the manufacturer. And you pay for it.
If you buy a piece of consumer equipment (which ham radios most certainly are), the answer is more likely to be "it meets the published specification".
For SSB HF radios, for instance, a common requirement is that the frequency be within 10 or 20 Hz of what's displayed (because that is, empirically, close enough that voice pitch won't be adversely affected - you won't need a "clarifier"). So if your synthesizer has a mixture of 5 Hz, 6 Hz, and 7 Hz steps, depending on the dial frequency, the reference frequency, and any drift corrections - that meets spec, and the manufacturer's job is done.
Consider the NanoVNA - it's a $50-100 vector network analyzer that does a fairly decent job, but did not come with a lot of documentation on its idiosyncrasies or quirks. There's a fairly active community that has reverse engineered aspects of it (and, even, modified the firmware), but that wasn't in the original device, nor in the $50 selling price.
Even in test equipment, it's not always obvious from the documentation. For instance, the Keysight 33500/33600 series waveform generators have a 10 MHz input on the back. But hey, it turns out that that's a frequency reference, not a phase reference, so the phase relationship between the 10 MHz outputs of two 33500s driven with the same reference signal is not fixed. (Learned this with a stack of 33622s.)
Or a more ancient example: The Deep Space Network used dozens of HP 8663 signal generators for all kinds of things, including generating precisely modulated uplink signals by monkeying with the 10 MHz reference input and the phase modulation input. That trick was used in all kinds of test setups. When we switched to the new E8663B, it all stopped working, because the synthesis chain works differently. (phase modulation was done by making small steps in a DDS, rather than an analog modulator). There was some Fluke signal generator (model number I don't recall) which would work "the old way" so that provided a way to keep doing it, but eventually, we had to figure out how to work with the equipment we could buy. (There being a limited number of old 8663As on pallets in storage that we could leverage for repairs)
On Sat, 7 Dec 2024 05:01:44 -0600, Dana Whitlow via time-nuts time-nuts@lists.febo.com wrote:
What really irks me is that the manufacturers virtually never admit
to the existence of synthesis offset errors in their data, and that they
do not display the amount of synthesis offset error in the course of
normal operation. Nor, as a rule, do they show sufficiently detailed info on the overall design that one could compute that error oneself.
Dana
On Sat, Dec 7, 2024 at 3:58 AM Magnus Danielson via time-nuts <
time-nuts@lists.febo.com> wrote:
Hi,
There are a couple of factors in how a receiver creates offsets:
Timebase offset - internal reference oscillator not perfect on
nominal frequency
Synthesis offset - For practical reasons, synthesis is not engineered
to necessarily give exactly 100 Hz steps (or whatever) with perfect 0 Hz
offset from nominal.
The synthesis offset comes from implementation and forms a scaling of
the timebase. The timebase can be locked up in some receivers, and then
synthesis offset remains.
With a bit of care, these can be characterized for the particular receiver dial frequency. But indeed, these are not test instruments and are not built for it, so you will have to learn how they work and compensate for it.
Cheers,
Magnus
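Magnus's "synthesis offset" can be illustrated numerically. The sketch below is a generic phase-accumulator DDS, not any particular radio's architecture; the 100 MHz clock and 32-bit accumulator are assumptions chosen for the example:

```python
def dds_actual_freq(f_target_hz, f_clk_hz=100e6, acc_bits=32):
    # A phase-accumulator DDS can only produce integer multiples of
    # f_clk / 2**acc_bits; pick the tuning word nearest the request.
    word = round(f_target_hz / f_clk_hz * 2**acc_bits)
    return word * f_clk_hz / 2**acc_bits

step = 100e6 / 2**32                # ~0.0233 Hz tuning resolution
f_requested = 10_000_100.0          # dial frequency plus a 100 Hz step
f_actual = dds_actual_freq(f_requested)
print(f"synthesis offset: {f_actual - f_requested:+.6f} Hz")
```

The residual is bounded by half a tuning-word step; a timebase offset would then scale the whole result multiplicatively, which is Magnus's point about the synthesis offset forming a scaling of the timebase.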
On 2024-12-07 07:34, glen english LIST via time-nuts wrote:
IC7610s are not test instruments..... the Icoms have fixed offset errors. I told them about this and their response was "it's good enough".
On 7/12/2024 2:38 pm, ke2n--- via time-nuts wrote:
I understand that if I offset my radio tuning by (say) 1500 Hz from a
received carrier, I will hear a 1500 Hz note in the audio.
If you use USB mode, then the tuning should be offset in a negative
direction.
For LSB mode, the tuning should be offset in a positive direction.
Both methods should produce the exact same beat note with a properly-aligned radio.
I was doing this sort of test with my new IC7610 - with its GPS sync'd oscillator - and eventually found that the two methods produce frequencies that differ by 0.1 Hz. It looks like USB/LSB are offset by equal amounts: 0.05 Hz in opposite directions. This seems independent of the frequency being measured.
Since this is an SDR-based radio, I suppose that the cause may be some sort of round-off error in the frequency synthesis. Perhaps due to resolution limitations of the scheme. Or maybe it's an actual programming error?
I was wondering if others have seen this type of thing with SDR radios? Is it common?
Ken/KE2N
time-nuts mailing list -- time-nuts@lists.febo.com
To unsubscribe send an email to time-nuts-leave@lists.febo.com
On 2024-12-07 16:52, Tim Shoppa via time-nuts wrote:
Note that DDS synth blocks don't always give the "nice round numbers" that humans would see on a dial, and they should also block out divider values
....
Stanford Telecom has (had?) DDS chips that used BCD counters.
Maybe I'll add that as an option to my FPGA/VHDL DDS block on opencores.org.
Cheers, Gerhard, DK4XP
In connection with my complaint about synthesizer frequency offset errors, I am well aware that a synthesizer of finite complexity cannot go to arbitrary frequencies. But I know how to correct for this, provided that I know what the frequency error is. That is all I am asking.
AFAIK, once the frequency-setting algorithm has picked the register values (or equivalents) fed to the synthesizer, calculation of the frequency error to extreme precision is straightforward and requires no iteration or anything like that. Thus it would seem that what I ask is virtually trivial in both programming effort and computational burden.
Since we're apparently OK to mention brand names and models, I will say that the problem I'm having is with the Signal Hound spectrum analyzer model SA44B. But even their more expensive models seem to be similarly afflicted. Signal Hound has got one thing right when streaming I&Q data to disk, in that they also write a brief XML file with the metadata. I mention this because communicating the tuning error to the user would simply be a matter of adding one line of text to said XML file.
Dana Whitlow
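Dana's claim - that once the register values are chosen, the frequency error follows from simple arithmetic - can be sketched for a generic fractional-N synthesizer. The register names (INT, FRAC, MOD) and the 10 MHz reference below are hypothetical illustration values, not the SA44B's actual architecture:

```python
from fractions import Fraction

def frac_n_error_hz(f_target_hz, f_ref_hz, int_n, frac, mod):
    # A fractional-N PLL outputs f_ref * (INT + FRAC/MOD); the tuning
    # error is just that minus the requested frequency. Exact rationals
    # avoid any round-off in the error calculation itself.
    f_out = Fraction(f_ref_hz) * (int_n + Fraction(frac, mod))
    return float(f_out - Fraction(f_target_hz))

# Hypothetical tuning: 100.000100 MHz from a 10 MHz reference, MOD = 4096.
f_target, f_ref, mod = 100_000_100, 10_000_000, 4096
int_n = f_target // f_ref                                      # -> 10
frac = round(Fraction(f_target - int_n * f_ref, f_ref) * mod)  # -> 0
print(frac_n_error_hz(f_target, f_ref, int_n, frac, mod))      # -100.0
```

With MOD = 4096 the frequency grid is f_ref/MOD (about 2441 Hz), so the requested 100 Hz offset is unreachable and the exact -100 Hz error falls straight out of the register values, with no iteration, as Dana says.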
On Tue, Dec 10, 2024 at 3:10 PM Gerhard Hoffmann via time-nuts <
time-nuts@lists.febo.com> wrote:
Having once developed an API for a "generic SDR" (as a component of the NASA Space Telecommunications Radio Standard (STRS) - we wanted to devise canonical abstraction layers for radios), I've got some comments:
It is sometimes straightforward to do this, but not always. Consider -
1) Do you have your software calculate the "as tuned" frequency (to some arbitrary precision) and send that back? This is the most general, but gets you into the precision question. And, of concern to us time-nuts people, do you give the tuned frequency assuming the "nominal" source oscillator frequency, the "last measured and/or modeled" oscillator frequency (perhaps rolling in a calibrated temperature correction?), or something else?
2) Or do you return the "programming info" - in which case, is it generalized parameters A, B, N, R, etc., with the synthesizer frequency equation defined separately in those terms? Or do you just give the part #, and let the user figure it out?
3) There are complexities in some synthesizer designs which should be accounted for: even if they don't have an effect on the frequency, they have effects on the spurs. Various and sundry spur cancellation/reduction techniques, or techniques that move "close to carrier" phase noise farther out.
4) Some of that might be proprietary or export controlled.
This can get really complex, really rapidly, and a lot depends on what you're using that "actual frequency" for.
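Option (1) above might look something like the following sketch. The structure, the 1 Hz step grid, and the 0.02 Hz reference offset are all invented for illustration - this is not any real radio's API:

```python
from dataclasses import dataclass

@dataclass
class TuneResult:
    requested_hz: float
    tuned_hz_nominal: float    # as synthesized, assuming a perfect reference
    tuned_hz_corrected: float  # scaled by the last measured reference offset

def tune(requested_hz: float,
         step_hz: float = 1.0,
         ref_nominal_hz: float = 10e6,
         ref_measured_hz: float = 10e6 + 0.02) -> TuneResult:
    # Toy synthesizer: quantize to the step grid (synthesis offset), then
    # scale by the measured/nominal reference ratio (timebase offset).
    nominal = round(requested_hz / step_hz) * step_hz
    return TuneResult(requested_hz, nominal,
                      nominal * ref_measured_hz / ref_nominal_hz)

r = tune(14_000_000.3)
print(r.tuned_hz_nominal)   # 14000000.0  (synthesis offset: -0.3 Hz)
```

Even in this toy, the API has to decide which of the two numbers is "the" tuned frequency - which is exactly the ambiguity in option (1).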
Some practical experience. The original Iris cubesat deep space transponder on MarCO used a particular fractional-N PLL. The V2 version (on all the Artemis 1 cubesats) used an integer-N PLL (ADF4108), which as a side effect only synthesizes in "steps". There's another 36 or 42 bit NCO/DDS that generates an offset from those steps for the actual frequency. (In the original Iris, that offset would normally be zero, and is used to "track out" Doppler and such.) The receiver and the transmit side use different LOs (X-band space radios receive in the 7.1 GHz area, and transmit around 8.4 GHz), so there's a different "offset" NCO for each. And while the NCO on the receive side is done in "software", on the transmit side, we use an NCO to drive the PLL's reference input at around 20 MHz, so that's pushed around a bit to make the frequency come out right. Figure 2 of the Kobayashi paper cited below shows a simplified version of this.
So "return the tuned frequency" gets pretty complex - you have the various programming values for the ADF4108, you have the NCO offsets, and you might have an added arithmetic term for the turnaround ratio calculation. Having worked with all the teams using those half dozen Irises on EM-1, I can confidently say that none of them really understood what was going on frequency-wise. They'd say, "OK, I guess if I need it, we can call JPL and get an explanation of our data. As long as it's on frequency, we're happy." For that matter, it wasn't particularly easy going from "frequency" to "what do we program" - that is, what divisors you choose determines the PLL's reference clock, which in turn affects the "step spacing", which in turn affects where the PLL spurs wind up in the output. So for some frequencies you might go in 4 MHz steps, for others in 5 MHz steps, and adjust the NCO accordingly. (An Excel spreadsheet with many pages, and lots of manual iteration to "find the right combination that works well", is required.)
But DSN channel frequencies are weird multiples with a lot of decimal places (because they follow a mathematical formula, and the Tx and Rx frequencies are related by a (usually) fixed ratio). This is because, in the time-honored way when the stuff was defined back in the 1970s, everything is done with multipliers and dividers and locked to a single reference on the ground (and, as a side effect, relying on HP 8663 synthesizer peculiarities and such). And it was all fully analog (dozens of modules cabled up in racks).
The downlink frequency for channel N is: Fch(N) = (N-14) * (10/27) + 2295 MHz, rounded to the nearest Hz. Uplink in S band is then that times 221/240. X-band uplink channels are 749/240 * Fch, and so forth. And it gets weirder for Ka band because more integer ratios are involved.
Note well: none of these integer ratios has a finite decimal expansion (most are infinitely repeating decimals, though they are exact rational fractions).
For one of the units, the desired Tx was 8416.358023 MHz (to 1 Hz, as 810-005 calls out), so the desired Rx is 749/880 of that, or 7163.468363 MHz. (Rounded to 1 Hz, so it's not exact... it's off by 3.37E-11.)
The synthesizers produced an N of 3596 (for the Rx), against a desired 3595.993865, so the receive IF is only about 12 kHz off from the desired 112.5 MHz. (Oscillator 50 MHz * 9/4, with the signal sampled at 50 MHz.)
And for the Tx, the N was 1053, resulting in an offset of 7.641977 MHz and requiring the DDS to be set to 19.98185665 MHz instead of 20.0000000 MHz.
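The channel arithmetic above can be checked with exact rational math. This sketch assumes the rounding order implied by the worked numbers (Fch rounded to 1 Hz before the band ratios are applied) and infers an 8 MHz PLL step from 1053 * 8 MHz = 8424 MHz; both are reconstructions from the figures in the text, not an official algorithm:

```python
from fractions import Fraction

def dsn_channel_hz(n):
    # Fch(N) = (N - 14) * (10/27) MHz + 2295 MHz, rounded to the nearest Hz.
    return round(Fraction(n - 14) * Fraction(10_000_000, 27) + 2_295_000_000)

fch = dsn_channel_hz(15)                        # channel 15 -> 2295.370370 MHz
tx = round(Fraction(fch) * Fraction(880, 240))  # X-band downlink, in Hz
rx = round(Fraction(fch) * Fraction(749, 240))  # X-band uplink, in Hz

# Tx PLL: N = 1053 in (inferred) 8 MHz steps lands at 8424 MHz, which is
# 7.641977 MHz high, so the ~20 MHz DDS reference is pulled down to match.
dds_mhz = 20 * Fraction(tx, 8_424_000_000)
print(tx, rx, round(float(dds_mhz), 8))  # 8416358023 7163468363 19.98185665
```

Channel 15 reproduces all three numbers from the worked example (8416.358023 MHz, 7163.468363 MHz, and the 19.98185665 MHz DDS setting), which also confirms that the 1 Hz rounding happens on Fch before the ratios are applied.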
Someone doing radio science or wanting to check the nav process would be getting telemetry of things like the NCO values from the loop filters.
Circling back - if you're interested in the precise frequency, then you probably DO want all those numbers, and you'll work through the math and figure it out. But if you're just recording data, and need an "as tuned" frequency to dump into your VITA 41 data file of raw samples, and 1 Hz is good enough, then you want that.
For what it's worth, for the STRS standard, we wound up defining some notional "hardware abstraction layer" calls that used frequency request (as a double-precision float in Hz) and frequency return (as a double-precision float in Hz), and left it up to the user to figure out what it wanted. If you wanted to "pierce the abstraction", then you could use a "hardware specific" API call to set registers and such, or retrieve register settings and/or values from inside the tracking loop.
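As a sanity check on that design choice, an IEEE-754 double carries sub-microhertz resolution even at X-band, so "Hz as a double" loses essentially nothing for a frequency return value:

```python
import math

# Spacing between adjacent representable doubles near an X-band
# downlink frequency (~8.4 GHz, expressed in Hz):
print(math.ulp(8.4e9))  # ~9.5e-07 Hz
```

That is, the representation error is orders of magnitude below the 1 Hz "good enough" threshold mentioned above; the precision limit lies in the synthesis chain, not the API type.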
I'll just conclude by noting that the KaTS (Ka-band transponder system) on Juno is regularly used to measure the radio range to Juno with a precision of centimeters, at a range of just under a billion km (that's 1E12 meters, so 1E14 cm), and the range rate to mm/s. This is with a measurement interval of ~1000 or 10000 seconds - note that it's not a "single frequency measurement": since Juno is orbiting Jupiter, there's a continuously varying Doppler component you have to model.
In comparison, Iris can't do as well (partly because of compromises in all those DDSes and such). It does "meters" at lunar and Mars distances. (I'll have to look up the residuals for MarCO.)
(Duncan, Smith, Aguirre, "Iris Transponder - Communications and Navigation for Deep Space", https://dataverse.jpl.nasa.gov/dataset.xhtml?persistentId=hdl:2014/45593)
(Kobayashi, "Iris Deep-Space Transponder for SLS EM-1 Cubesat Missions", https://digitalcommons.usu.edu/cgi/viewcontent.cgi?referer=&httpsredir=1&article=3603&context=smallsat)
(Module 201 of the DSN 810-005 document, available online; I normally google 810-005 and go from there, but it's at https://deepspace.jpl.nasa.gov/dsndocs/810-005/201/201F.pdf)
I sympathize with your desires, Dana, but it's not so simple. <grin>
On Tue, 10 Dec 2024 16:44:19 -0600, Dana Whitlow via time-nuts time-nuts@lists.febo.com wrote: