ARM Technical Support Knowledge Articles

How does the JTAG synchronisation logic work? / How does adaptive clocking work?

Applies to: Multi-ICE, RealView ICE and Trace (RVI / RVT)


Synthesizable cores need to implement a JTAG synchronisation logic block, which samples TDI, TMS and TCK from the JTAG emulator with the core clock. The synchronisation logic generates a signal called RTCK, which is a synchronised (delayed) version of TCK.
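The behaviour of the synchronisation block can be illustrated with a small model. This is a sketch only, not ARM's RTL: it assumes a simple chain of flip-flops clocked by the core clock (CLK), with the last flop driving RTCK as a delayed copy of TCK.

```python
# Sketch of the TCK-to-RTCK synchroniser: TCK is shifted through a chain
# of flip-flops on each core-clock (CLK) rising edge; the final stage
# drives RTCK, so RTCK is a delayed version of TCK.

class JtagSynchroniser:
    """Models the TCK-to-RTCK synchroniser as a configurable flop chain."""

    def __init__(self, stages=3):
        self.chain = [0] * stages  # synchroniser flip-flops, reset low

    def clk_edge(self, tck):
        """Advance one CLK rising edge; returns the new RTCK level."""
        # Each flop samples the output of the previous one; the first
        # flop samples TCK itself.
        self.chain = [tck] + self.chain[:-1]
        return self.chain[-1]  # last flop drives RTCK

sync = JtagSynchroniser(stages=3)
# Drive TCK high and clock the core: RTCK follows three CLK cycles later.
rtck_history = [sync.clk_edge(tck=1) for _ in range(4)]
print(rtck_history)  # [0, 0, 1, 1]
```

With three stages, a TCK edge appears on RTCK three core-clock cycles later, which is why RTCK is described as a synchronised (delayed) version of TCK.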

The JTAG synchronisation logic is explained in the cores' technical reference manuals. Below is a diagram taken from the ARM9E-S TRM.


If you configure the JTAG emulator (Multi-ICE or RealView ICE) in adaptive clocking mode, it uses the information in RTCK to generate TCK.

The JTAG tool waits to detect a falling edge on RTCK before generating the next rising edge on TCK, and waits for a rising edge on RTCK before generating the next falling edge on TCK.
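This edge-for-edge handshake can be sketched as a simple loop. The sketch below is illustrative, not ARM's implementation: `target_rtck` is a hypothetical callback modelling the RTCK level the emulator observes for a given TCK level.

```python
# Illustrative adaptive-clocking handshake: the emulator only toggles TCK
# after it has seen the matching RTCK edge come back from the target.

def adaptive_clock(target_rtck, cycles):
    """Drive `cycles` full TCK periods against a target's RTCK response.

    target_rtck(tck) models the target: it returns the RTCK level the
    emulator eventually observes for a given TCK level.
    """
    tck = 0
    trace = []
    for _ in range(cycles):
        # Wait until RTCK has gone low before driving TCK high.
        while target_rtck(tck) != 0:
            pass  # in hardware: keep sampling RTCK with the internal clock
        tck = 1
        trace.append(tck)
        # Wait until RTCK has gone high before driving TCK low.
        while target_rtck(tck) != 1:
            pass
        tck = 0
        trace.append(tck)
    return trace

# A trivially fast target whose RTCK simply follows TCK:
print(adaptive_clock(lambda tck: tck, cycles=2))  # [1, 0, 1, 0]
```

Because every TCK edge waits for the corresponding RTCK edge, TCK automatically slows down to whatever rate the target's core clock can sustain.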

There is a delay between the detection of an edge in RTCK and the generation of the next edge in TCK, due to the time needed to sample RTCK with an internal clock. This delay is different depending on the JTAG tool used.

- With Multi-ICE you can select a TCK frequency in either adaptive or non-adaptive mode. The delay is one half of the selected TCK period plus approximately 50ns.

- With RealView ICE, if you select adaptive clocking, you cannot select a TCK frequency. RTCK is sampled by default with a 50MHz internal clock, so TCK changes approximately 60ns after RTCK. This can be a problem when connecting to very slow targets such as hardware emulators. For more information see the following FAQ entry: How to change the frequency of the RVI sampling clock.

As a result, the TCK frequency in adaptive clocking mode is not the highest possible, but it is close to it. If you need to work at the maximum possible JTAG frequency you should disable adaptive clocking and set the JTAG speed manually.
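A back-of-the-envelope calculation shows why adaptive clocking lands close to, but below, the maximum. The figures below are assumptions chosen for illustration (a 20MHz core clock, three core-clock periods of RTCK latency per edge, and the RVI's ~60ns sampling delay per edge):

```python
# Illustrative only: each TCK edge waits for the RTCK edge to return
# (three core-clock periods through the synchroniser) plus the emulator's
# own RTCK sampling delay (~60 ns for RealView ICE).

core_clock_hz = 20e6                 # assumed core clock: 20 MHz
clk_period = 1 / core_clock_hz       # 50 ns
rtck_latency = 3 * clk_period        # three synchroniser flops per edge
emulator_delay = 60e-9               # RVI: TCK changes ~60 ns after RTCK

tck_period = 2 * (rtck_latency + emulator_delay)   # two edges per period
adaptive_tck_hz = 1 / tck_period
theoretical_max_hz = core_clock_hz / 6             # one sixth of core clock

print(f"adaptive TCK:    {adaptive_tck_hz / 1e6:.2f} MHz")
print(f"theoretical max: {theoretical_max_hz / 1e6:.2f} MHz")
```

Under these assumptions the adaptive TCK rate works out to roughly 2.4MHz against a theoretical maximum of about 3.3MHz, which matches the statement above: close to the maximum, but not at it.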

The three sampling flip-flops set the theoretical maximum TCK frequency, which is one sixth of the core clock frequency (one eighth in ARM11). The figure below shows the timing of the signals inside the three-stage JTAG synchronisation logic.

[Figure: signal timings]

In position 1 the synchronisation logic registers TDI and TMS, so the falling edge of TCK (which changes TDI and TMS) must happen after position 1. This imposes a minimum length of 3 CLK periods on the TCK high period.

In position 2 DBGTDO is generated by the core. TDO is sampled by the JTAG tool at the following rising edge of TCK (position 3). This imposes a minimum length of 3 CLK periods on the TCK low period.
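The two constraints above can be written out directly: a minimum of three CLK periods for the TCK high phase and three for the low phase bound the TCK period from below, giving a maximum TCK frequency of one sixth of the core clock (one eighth for ARM11, taking four CLK periods per phase). A minimal sketch of that arithmetic:

```python
# Theoretical maximum TCK frequency from the synchroniser constraints:
# each TCK phase (high and low) must last at least N core-clock periods.

def max_tck_hz(core_clock_hz, clk_periods_per_phase=3):
    """Upper bound on TCK frequency for a given core clock."""
    min_high = clk_periods_per_phase / core_clock_hz
    min_low = clk_periods_per_phase / core_clock_hz
    return 1 / (min_high + min_low)

# Assuming a 60 MHz core clock (illustrative value):
print(max_tck_hz(60e6) / 1e6)                           # 10.0 MHz (CLK / 6)
print(max_tck_hz(60e6, clk_periods_per_phase=4) / 1e6)  # 7.5 MHz (CLK / 8)
```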

In order to avoid a race condition in the sampling of TDI, TMS and TDO, the real maximum TCK frequency will be lower than this and will depend on the core clock frequency, the delays on the JTAG signals and the setup times of the JTAG emulator and the ASIC.

In adaptive clocking mode, the JTAG emulator ensures that there is some time between the falling edge of RTCK and the rising edge of TCK, and vice versa. This way the sampling of the JTAG signals is always correct.


Attachments: img4496.jpg , img4497.jpg

Article last edited on: 2008-09-09 15:47:40

Copyright © 2011 ARM Limited. All rights reserved. External (Open), Non-Confidential