A Hybrid Approach to Decrease Port Influence in Transmission Line Characterization
This document has been relocated to http://scholarsmine.mst.edu/ele_comeng_facwork/1366
Characterization of and models for multi-gigabit signaling are important issues in modern digital systems. A good physics-based model relies on precise characterization of the test board, which is typically done through scattering-parameter measurement, either with a VNA (vector network analyzer) in the frequency domain or with a TDR (time-domain reflectometer) in the time domain. The launch techniques commonly used on PCBs (printed circuit boards) for VNA or TDR measurements in the microwave frequency range employ SMA or 3.5 mm connectors in edge-launch or vertical-launch configurations. The transition between the launch port and the DUT (device under test) introduces errors into the measurement. Embedding/de-embedding techniques are generally used to remove these port influences; for example, TRL (through, reflect, and line) calibration is the standard method for eliminating port influences in measurement. However, TRL calibration requires extra test kits, and it is sometimes difficult to implement, such as for coupled differential lines. In this paper, an effective hybrid approach to transmission line characterization is proposed, which includes choosing a suitable port launch technique for the test board, estimating the port parasitic parameters, and building a proper circuit model that is evaluated with genetic algorithms (GA).
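The GA-based parameter estimation step can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes a hypothetical port parasitic network consisting of a series inductance L and a shunt capacitance C, and fits both values by matching the modeled reflection coefficient to "measured" data with a basic real-coded genetic algorithm. All function names, parameter ranges, and GA settings are illustrative choices.

```python
import cmath
import random

Z0 = 50.0  # reference impedance, ohms


def port_s11(L, C, freqs, ZL=Z0):
    """Reflection coefficient of an assumed port parasitic network:
    a series inductance L feeding a shunt capacitance C in parallel
    with the load ZL (a deliberately simplified transition model)."""
    out = []
    for f in freqs:
        w = 2 * cmath.pi * f
        Zc = 1.0 / (1j * w * C)            # shunt capacitor impedance
        Zp = (Zc * ZL) / (Zc + ZL)         # capacitor in parallel with load
        Zin = 1j * w * L + Zp              # series inductor in front
        out.append((Zin - Z0) / (Zin + Z0))
    return out


def fitness(ind, freqs, target):
    """Negative squared error between modeled and measured S11."""
    s = port_s11(ind[0], ind[1], freqs)
    return -sum(abs(a - b) ** 2 for a, b in zip(s, target))


def run_ga(freqs, target, pop_size=60, gens=120, seed=1):
    """Basic real-coded GA: elitist selection, blend crossover,
    Gaussian mutation. Search ranges below are assumptions."""
    rng = random.Random(seed)
    lo = (1e-12, 1e-15)                    # L >= 1 pH, C >= 1 fF (avoid C = 0)
    hi = (5e-9, 2e-12)                     # L <= 5 nH, C <= 2 pF
    pop = [[rng.uniform(lo[k], hi[k]) for k in range(2)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: fitness(ind, freqs, target), reverse=True)
        elite = pop[: pop_size // 4]       # keep the top quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            k = rng.randrange(2)                          # mutate one gene
            child[k] = min(hi[k], max(lo[k],
                           child[k] + rng.gauss(0, 0.05 * hi[k])))
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind, freqs, target))


# Synthetic "measurement": true parasitics L = 1.2 nH, C = 0.3 pF,
# sampled at 1-10 GHz. In practice the target would come from VNA/TDR data.
freqs = [f * 1e9 for f in range(1, 11)]
target = port_s11(1.2e-9, 0.3e-12, freqs)
best = run_ga(freqs, target)
```

With the synthetic target above, the GA recovers parameter values close to the ones used to generate it; with real measured S-parameters, the same loop would tune whatever circuit topology is chosen for the port model.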