[USRP-users] Strange behavior of N210 with timed commands

LOUF Laurent laurent.louf at thalesgroup.com
Mon Dec 15 05:29:01 EST 2014

Hello Martin,

I indeed work on a continuous stream of samples. To measure the latency, I do the following:
  - First, I send a dummy packet through the chain to initialize everything (filters, FFT, ... )
  - Once this packet is sent, I issue a stream command with a time spec of 1 s (I initialize the device time to 0 when starting the system), with the mode STREAM_MODE_START_CONTINUOUS
  - When I have to send the first processed packet, I calculate the latency as follows: each packet carries through my system its time of reception (obtained with gettimeofday). I take the current time minus the time of reception, divide that quantity by 625 (which gives me the number of frames of latency), add 3 (as a margin, since gettimeofday may not be that precise, and to leave time to send the data), and multiply the result by 625 again. That gives me a rough estimate of the latency in µs, which I add to 1 s (since the system was supposed to begin streaming at 1 s).
  - Following packets get a time spec equal to the calculated time spec of the first packet + the current packet number * 625.1 µs
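The arithmetic in the steps above can be sketched as follows (a sketch, not the actual code from my system; the 625 µs frame length, the 3-frame margin and the 625.1 µs spacing are the values described above, and all times are in microseconds):

```cpp
#include <cstdint>

// Quantize the measured latency up to a whole number of 625 us frames,
// plus a 3-frame safety margin for gettimeofday imprecision.
// 625 us is one 1250-sample frame at 2 Msps.
std::uint64_t quantized_latency_us(std::uint64_t now_us, std::uint64_t rx_time_us)
{
    const std::uint64_t frame_us = 625;
    const std::uint64_t margin_frames = 3;
    return ((now_us - rx_time_us) / frame_us + margin_frames) * frame_us;
}

// Time spec of packet n, in microseconds since t = 0: the 1 s start time,
// plus the quantized latency, plus n frame intervals of 625.1 us.
double packet_time_spec_us(std::uint64_t latency_us, std::uint64_t n)
{
    return 1e6 + latency_us + n * 625.1;
}
```

For example, a raw measurement of 1000 µs quantizes to (1 + 3) * 625 = 2500 µs, and the first packet then gets a time spec of 1.0025 s.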

There does not seem to be a problem with the time spec of the first packet, as I never get an error when sending it. It is only when I replace 625.1 µs by 625 µs as the spacing between two consecutive frames that I get a fastpath message every two frames, and only one frame out of two seems to be actually sent.

Concerning the burst mode on the sending side, I just played around until I found something that worked; these are maybe not the parameters I should use for my application.



-----Original Message-----
From: USRP-users [mailto:usrp-users-bounces at lists.ettus.com] On behalf of Martin Braun via USRP-users
Sent: Monday, 15 December 2014 10:33
To: usrp-users at lists.ettus.com
Subject: Re: [USRP-users] Strange behavior of N210 with timed commands

On 12/12/2014 02:58 PM, LOUF Laurent via USRP-users wrote:
> I'm not quite sure how to use this mailing list so do not hesitate to 
> redirect me to a better place or to someone that may know the answer 
> to my question.

Hey Laurent,

You came to the right place.

> That being said, I'm currently working with a USRP N210 at work and 
> observed a strange behavior on which I would like to have your input. 
> I use the USRP to receive a signal, apply a channel simulation to it 
> and transmit the resulting signal. I have absolutely no problem on the 
> receive part, but a small one on the transmit part. The signal I use is 
> sampled at 2 Msamples/second, and I work on frames of 1250 samples, 
> so frames of 625 µs. So the whole process goes this way: I receive a 
> frame, apply the channel simulation and then transmit it, at a fixed interval.

To clarify: You're working on a continuous stream of samples, processed in chunks of 1250 samples? Or do you 'only' receive 1250 samples, then stop, do the processing and restart?

> The problem occurs now: I do some calculations based on the time when 
> I receive the first frame, add the latency of my system and a number 
> of times 625 µs corresponding to the number of frames sent, which gives 
> the time to transmit the frame; that time is sent to the USRP along with 
> the current frame. If I put an interval of exactly 625 µs, I get a 
> fastpath message "L" every two frames, and I can see that the data of 
> the second frame is not transmitted. If I put an interval of 625.1 µs, 
> everything seems to work fine and I don't get any error. The metadata 
> I send along with the frame are: start_of_burst = true, end_of_burst 
> = true, has_time_spec = true, and the time spec I got from my 
> calculations. That feels strange to me since I don't even have to add 
> the time corresponding to one sample (0.5 µs) to get it to work. Could 
> it be something related to the burst mode that I'm using? I did not 
> fully understand that part of the documentation, so feel free to 
> enlighten me on that subject.
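The per-frame metadata described in the quote can be sketched like this (the struct here is a hypothetical mirror of the UHD fields named above; the real type is uhd::tx_metadata_t, and the actual send call is omitted):

```cpp
#include <cstdint>

// Hypothetical stand-in for uhd::tx_metadata_t, reduced to the four
// fields named in the quote. The real time spec is a full-seconds /
// fractional-seconds pair; a plain double is used here for brevity.
struct TxMetadata {
    bool start_of_burst = false;
    bool end_of_burst = false;
    bool has_time_spec = false;
    double time_spec_sec = 0.0;
};

// Build the metadata for frame n: each frame is sent as its own
// one-frame burst, timed at first_tx_sec + n * 625.1 us.
TxMetadata frame_metadata(double first_tx_sec, std::uint64_t n)
{
    TxMetadata md;
    md.start_of_burst = true;
    md.end_of_burst   = true;
    md.has_time_spec  = true;
    md.time_spec_sec  = first_tx_sec + n * 625.1e-6;
    return md;
}
```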

When you say burst mode, that sounds like you're using STREAM_MODE_NUM_SAMPS_AND_DONE. Is that correct? Also, how are you calculating your system latency?
If you've determined that precisely, you might be cutting it very close, and the .5µs is all you need to just make it in time.

Note that the time stamp is stored in ticks, i.e. with reference to the DAC sampling rate (100 Msps on the N210). .5µs is thus a difference of
50 samples.
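In tick terms, the numbers in this thread work out as follows (a sketch of the conversion only, using the 100 MHz tick rate quoted above):

```cpp
#include <cmath>

// Convert a duration in microseconds to N210 ticks (100 MHz tick clock).
long long us_to_ticks(double us)
{
    const double tick_rate_hz = 100e6;
    return std::llround(us * 1e-6 * tick_rate_hz);
}
// One 625 us frame is 62500 ticks; the extra 0.1 us that makes things
// work adds 10 ticks of slack per frame; 0.5 us (one sample at 2 Msps)
// would be 50 ticks.
```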


USRP-users mailing list
USRP-users at lists.ettus.com
