Tom,
On 02/27/2016 03:50 PM, Tom McDermott wrote:
The reason many cell sites went to GPS for time and frequency
synchronization is that in many cases
it was either less expensive or only possible to backhaul the cell site
traffic to the MTSO (Mobile Telephone
Switching Office) via microwave radio rather than wireline copper or fiber
carrier.
That microwave backhaul did not always provide sufficiently precise phase
and frequency reference
needed at the cell site.
For most telecom purposes, the frequency transfer of microwave links
suffices. Newer links also have some basic support for protocols like PTP.
I run time-transfer over many microwave links, which is a bit of a
challenge for all the wrong reasons, but it works.
It is only when you push the requirement below +/- 1 us that you start
to have issues.
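As context for the PTP mention above: the two-way exchange that PTP (IEEE 1588) is built on reduces to simple arithmetic once you have the four event timestamps. A minimal sketch (function and variable names are mine, not from any particular PTP stack):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Classic two-way time-transfer arithmetic, as used by PTP.
    t1: Sync sent by master (master clock)
    t2: Sync received by slave (slave clock)
    t3: Delay_Req sent by slave (slave clock)
    t4: Delay_Req received by master (master clock)
    Returns (offset of slave relative to master, one-way path delay),
    in the same units as the timestamps."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Example: 100 us symmetric path, slave running 5 us ahead of master.
off, dly = ptp_offset_delay(0.0, 105.0, 200.0, 295.0)
```

The derivation assumes the forward and reverse paths have equal delay; on a microwave hop any asymmetry between the two directions lands directly in the computed offset, which is one reason sub-microsecond transfer over such links gets hard.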
Cheers,
Magnus
In message A08B0266-2ED5-422E-8E5F-A227E491D8B2@n1k.org, Bob Camp writes:
(Stratum 1,2,3) is based on various timing sources. It also was
designed in an era of “top down” timing. That is a very different
approach than the “bottom up” timing of the over the air codes
on CDMA or some (but not all) advanced TDMA systems.
The BSTJ contains some very interesting articles about how
synchronization got introduced and rolled out in the telephone
network originally.
You can find them on archive.org
--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.
Mark Spencer wrote:
I'd be curious to know how many carriers have a
reference source other than GPS for their "data
line sync."
Fifteen-ish years ago when I was in wireline telecom, most offices used a Telecom Solutions (later Symmetricom) DCD-LPR as the local primary reference for the BITS clock. The LPR accepts various cards, including GPS and LORAN-C receivers, and while every single office had a GPS card, I saw at least a few that also had a LORAN-C card slotted. I don't remember exactly how many.
I don't know what ever became of them, but I haven't run across any in surplus. Haven't been looking specifically, though.
LPR manual: http://www.syncworks.com/wp-content/uploads/2014/05/DCD-LPR-Manual.pdf
...
Poul-Henning Kamp wrote:
The BSTJ contains some very interesting articles about how
synchronization got introduced and rolled out in the telephone
network originally.
You can find them on archive.org
My (admittedly US-centric) understanding is that prior to the Building Integrated Timing Supply (BITS) concept, a master oscillator timed a DS-1 signal that was fanned out across the country and distributed to any equipment that needed it. The master was in Kansas City, being roughly central to the continental US.
I only interacted tangentially with this system, when we replaced some of the Synchronization Distribution Expander (SDE) equipment with BITS equipment (DCD-whatevers).
The SDE documentation refers to an "Office Timing Supply (OTS)", but that appears to be a generic term rather than a specific piece of equipment, and I'm not finding much information about it. As far as I can infer, that was the gear directly involved in receiving the signal from the upstream source(s), and presumably providing some modicum of source selection and holdover, but that's speculation on my part.
One of the more interesting aspects of an SDE-to-DCD cutover was checking for timing slips between the sources, and the T-Berd analyzers had a differential timing mode for just this purpose. In this mode, both of the T-1 inputs are used, and their frame timing offset is displayed (with a nice bargraph on the VFD, as I recall). Static offset was expected, but any drift was cause to abort the cutover and get on the phone with someone at a higher pay grade. ;)
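The differential timing check described above amounts to comparing frame-start times against the DS-1 frame period (193 bits at 1.544 Mbit/s, i.e. 125 us per frame): a static offset is harmless, but a changing offset means the two sources aren't frequency-locked. A rough sketch of that arithmetic (my own illustration, not the T-Berd's actual algorithm):

```python
FRAME_US = 125.0  # DS-1 frame period: 193 bits at 1.544 Mbit/s

def frame_offset(t_a, t_b):
    """Frame-timing offset of source B relative to source A, in
    microseconds, wrapped into +/- half a frame (like a bargraph
    centered on zero)."""
    d = (t_b - t_a) % FRAME_US
    return d - FRAME_US if d > FRAME_US / 2 else d

def drift_rate(offsets, interval_s):
    """Mean change in offset per second (us/s) over a series of
    measurements taken interval_s apart. Nonzero drift means the
    sources are not frequency-locked and a slip will eventually occur."""
    deltas = [b - a for a, b in zip(offsets, offsets[1:])]
    return sum(deltas) / len(deltas) / interval_s
```

With a true frame slip corresponding to a full 125 us of accumulated drift, even a small steady drift rate tells you a slip is coming, which matches the "abort and escalate" reaction described above.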
More on frame slip measurements: http://www.reeve.com/Documents/Sync%20Testing%20R3.pdf
-Nate B-