Can you design and manufacture PCB interconnects for 10-50 Gbps data rates and achieve perfect simulation-to-measurement correlation? How are laminate materials selected for these data rates? Why are laminate Dk and Df measurements at single frequency points not enough to accurately define dielectric models? Can copper foil specifications be used to define roughness model parameters? How does material characterization with GMS-parameters compare with IPC-standard techniques? Can the loss and dispersion effects of the dielectric and of conductor roughness be separated? This tutorial will answer your “material” questions and teach a systematic approach to interconnect design at 10-50 Gbps data rates.
1. Learn to define and identify model parameters for a particular data rate and select PCB materials for particular applications.
2. Understand the role of dielectric and conductor roughness models and why broadband characterization is necessary for accurate analysis of PCB and packaging interconnects at 10-50 Gbps data rates.
In this paper, we introduce a special class of resonant structure, the Beatty standard, alongside other classic resonators. We examine the resonant Beatty standard closely: its impedance profile, its loss characteristics, and its application to printed circuit board material property extraction. Suited to 32 Gb/s simulation requirements, this proposed supercoupon not only provides engineers a compact structure for material property extraction, but also helps establish a robust design flow.
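The impedance-step resonance that makes the Beatty standard useful can be illustrated with a simple cascaded transmission-line model. The sketch below (the section impedance, length, and phase velocity are hypothetical illustration values, not taken from the paper) computes |S11| of a mismatched mid-section embedded between matched feed lines:

```python
import numpy as np

def line_abcd(z0, beta_l):
    """ABCD matrix of an ideal lossless transmission-line section."""
    return np.array([[np.cos(beta_l), 1j * z0 * np.sin(beta_l)],
                     [1j * np.sin(beta_l) / z0, np.cos(beta_l)]])

def s11_beatty(f_hz, z_mid=25.0, length_m=0.025, z_ref=50.0, vp=1.5e8):
    """|S11| of a Beatty-style standard: a z_mid section between matched
    z_ref feed lines, so only the stepped section reflects."""
    beta_l = 2.0 * np.pi * f_hz / vp * length_m
    a, b, c, d = line_abcd(z_mid, beta_l).ravel()
    return abs((a + b / z_ref - c * z_ref - d) /
               (a + b / z_ref + c * z_ref + d))
```

With these example numbers, reflection peaks where the mid-section is an odd number of quarter wavelengths (1.5 GHz, 4.5 GHz, ...) and nulls at half-wave multiples; this periodic, well-defined reflection signature is what makes the structure attractive for material property extraction.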
We analyze the computational procedure specified for Channel Operating Margin (COM) and compare it to traditional statistical eye/BER analysis. There are a number of differences between the two approaches, ranging from how they perform channel characterization, to how they treat Tx and Rx noise and apply termination, to the numerical procedures employed to convert given jitter and crosstalk responses into the vertical distribution that characterizes eye diagrams and BER. We show that, depending on the channel, COM may overestimate the effect of crosstalk and, depending on a number of factors, over- or underestimate the effect of transmit jitter, especially when the channel operates at its rate limits. We propose a modification to the COM procedure that eliminates these problems without a considerable increase in computational effort.
Learn how to specify the PCB stack-up and call out materials so that the PCBs are manufactured to meet the requirements of high-speed designs. Then see how to match measurements with simulations, and benefit from improved specification of PCB manufacturing tolerances. Finally, learn how to design PCB test structures that validate measurements and simulations for a robust design flow. The workshop will work through real-world examples and conclude with a demonstration of an as-fabricated PCB channel passing 32 Gb/s and matching simulation. Prerequisites: knowledge of eye diagrams and S-parameters for signal integrity analysis.
Meaningful interconnect design and compliance analysis must start with the identification of broadband dielectric and conductor roughness models. Such models are not available from manufacturers, and model identification is the most important element of successful interconnect design for link paths at 10-50 Gbps and higher data rates. Electromagnetic analysis of interconnects without such models may simply not be accurate. This paper provides an overview of broadband dielectric and conductor roughness models for PCB and packaging interconnect problems. The theory of model identification with generalized modal S-parameters, and the separation of dielectric and conductor dispersion and loss effects, is described. Practical examples of successful dielectric and conductor roughness model identification up to 50 GHz are also provided.
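As one concrete (and deliberately simple) example of a conductor-roughness model, the classic Hammerstad correction multiplies smooth-copper loss by a factor that saturates at 2 once roughness exceeds the skin depth; the identification procedure described in the paper is more general, and the roughness value below is an assumption for illustration:

```python
import numpy as np

MU0 = 4e-7 * np.pi      # vacuum permeability, H/m
RHO_CU = 1.68e-8        # copper resistivity, ohm*m

def skin_depth(f_hz):
    """Skin depth in copper, meters (about 2.1 um at 1 GHz)."""
    return np.sqrt(RHO_CU / (np.pi * f_hz * MU0))

def hammerstad_factor(f_hz, rq_m):
    """Hammerstad conductor-loss multiplier for RMS roughness rq_m:
    ~1 while roughness << skin depth, saturating at 2 above it."""
    return 1.0 + (2.0 / np.pi) * np.arctan(1.4 * (rq_m / skin_depth(f_hz)) ** 2)
```

The saturation at 2x is exactly why Hammerstad alone underpredicts loss for modern foils at tens of GHz, motivating the broadband model identification the paper advocates.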
As data speeds increase, signal integrity engineers are increasingly using frequency-domain measurements to characterize their designs. This raises the question of how frequency-domain measurements affect factors such as causality when transformed into the time domain, and how to reliably verify simulation-measurement correspondence. This presentation will highlight various considerations, including the importance of the extent of the frequency range at the low and high ends of S-parameter measurements and how this affects correlation between simulation and measurement. The use of various tools, including vector network analyzers, channel modeling platforms, and BERTs, will be described, along with how each fits into the high-speed serial data interconnect design flow.
The traditional S-parameter model approach to high-speed time-domain simulation does not always work well. It can produce time-domain distortion; poor estimation of peak-to-peak deterministic jitter, vertical eye-diagram fidelity, and contour due to a lack of causality; and poor DC-point estimation. This session will be valuable to engineers designing systems from 10 to 32 Gb/s and to users of system and 3D EM simulators. It refocuses the RF/microwave centrism of VNA-produced S-parameter data onto a sampled-system construct, which is then extrapolated down to DC and up to frequencies higher than initially measured. An explanation will be given of how the discretely sampled frequency nature of the S-parameter measurement affects time-domain simulation. Finally, practical measurement results will be presented to illustrate how the sampled model affects various structures and how the uncertainties vary with the selected start frequency, stop frequency, and point density. Takeaway: This session will offer practical methods of managing S-parameters for optimal time-domain simulation correspondence (VNA set-up, calibration, and sampling); identify the causality and convergence issues that plague time-domain simulation and show how to resolve them in the S-parameter sampled system; and provide illustrative examples of the measurement uncertainties achieved for several classic structures and one novel structure.
Data rates corresponding to unit intervals of less than 35 ps require unprecedented model accuracy and validation. Validation of small interconnect structures, such as vias, is the primary objective of this session. Additionally, some recent data and analysis demonstrate unexplained higher losses at high frequencies. Finally, we validate an ultra-fast TDR measurement methodology with a novel approach. Takeaway: Advanced time-domain measurement techniques can be used to study small electromagnetic structures such as vias with the time/spatial resolution required by the latest generation of high-speed serial channel designs. A combination of mesh-based and hybrid electromagnetic modeling techniques can produce analysis results that match high-resolution measured data for structures such as vias, providing deeper engineering insight that can be used to produce high-speed serial channel designs with greater performance and/or lower cost.
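For reference, the standard first-order conversion from a TDR reflection trace to an instantaneous impedance profile is Z(t) = Zref·(1+ρ(t))/(1−ρ(t)); a minimal sketch:

```python
import numpy as np

def tdr_impedance(rho, z_ref=50.0):
    """First-order TDR conversion Z(t) = Zref*(1+rho)/(1-rho).
    Ignores re-reflections (masking), so accuracy degrades behind large steps."""
    rho = np.asarray(rho, dtype=float)
    return z_ref * (1.0 + rho) / (1.0 - rho)

# Example: matched line, then a 100-ohm step, then a 25-ohm dip
z = tdr_impedance([0.0, 1.0 / 3.0, -1.0 / 3.0])   # -> [50, 100, 25] ohms
```

The masking limitation is exactly why small structures such as vias demand the ultra-fast edges and careful de-embedding discussed in this session: the via's reflection is short and buried behind launch discontinuities.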
This paper presents a new methodology for 12 Gbps interoperability that combines a concerted family of pathological channels, internal eye monitoring, and external EQ simulation tools, providing insight into an EQ optimization strategy that addresses a specific channel’s mix of crosstalk noise, jitter, and channel loss. It also gives a backplane designer the ability to configure a high-loss, crosstalk-aggressed system. The method, combining co-simulation channel optimization, a reconfigurable channel platform, and receiver eye monitoring, has two key benefits: the separation of channel eye opening from un-equalizable deterministic jitter (Dj), and the capability to map the loss-crosstalk space for a particular SERDES channel pair, a new concept in SERDES interoperability evaluation. The method is described in detail, followed by relevant case examples using hardware specifically designed for this endeavor. Finally, we compare eye-monitor results with the original co-simulation, validating the method.
As serial link speeds increase, systems become more stressed. Loss, low-probability deterministic jitter, crosstalk aggression from densely packed signal nets, via and connector impedance and associated resonances, and package and power delivery issues each add their own jitter density function, resulting in a net jitter picture that is inherently complicated. This paper presents a rigorous and practical crosstalk analysis of 10 Gbps and higher serial data transmission systems, beginning with pre-layout 3D EM extraction, continuing with material parameter identification and post-layout analysis, and ending with direct jitter measurement and separation. We believe this is one of the timeliest topics in signal integrity today.
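The statement that each mechanism adds its own jitter density function can be made concrete: for independent components, the total jitter PDF is the convolution of the component PDFs. A minimal numpy sketch (the grid and component values are illustrative assumptions, not from the paper):

```python
import numpy as np

def combine_jitter_pdfs(pdfs, dt):
    """Total jitter PDF for independent components: convolve the component
    PDFs, then renormalize to unit area on the dt grid."""
    total = pdfs[0]
    for p in pdfs[1:]:
        total = np.convolve(total, p)
    return total / (total.sum() * dt)

# Illustrative components on a picosecond grid:
dt = 0.01                                   # ps
t = np.arange(-10.0, 10.0 + dt / 2, dt)
rj = np.exp(-t**2 / 2.0)                    # Gaussian RJ, sigma = 1 ps
rj /= rj.sum() * dt
dj = np.zeros_like(t)                       # dual-Dirac DJ at +/- 2 ps
dj[np.argmin(abs(t - 2.0))] = 0.5 / dt
dj[np.argmin(abs(t + 2.0))] = 0.5 / dt
total = combine_jitter_pdfs([rj, dj], dt)   # variance adds: 1 + 4 ps^2
```

Since variances of independent components add under convolution, the combined distribution here has an RMS spread of sqrt(5) ps; separating measured jitter back into such components is the harder inverse problem the paper addresses.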
Design of PCB interconnects for 6-10 Gb/s data rates requires electromagnetic models from DC up to 20 GHz. Manufacturers of low-cost FR-4 PCBs typically provide values for dielectric constant and loss tangent either at one frequency or without specifying the frequency at all, which is not acceptable for broadband models. A simple and practical methodology to extract frequency-dependent dielectric parameters, based on correlating measurements with simulations, is proposed. A board with 30 test structures has been built to validate the extraction methodology and to verify the ability of electromagnetic analysis to predict interconnect parameters.
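A widely used way to turn single-frequency Dk/Df data into a causal, frequency-dependent model is the Djordjevic-Sarkar (wideband Debye) form, sketched below; the pole-edge frequencies and fitted values are illustrative assumptions, not extracted data from this paper:

```python
import numpy as np

def wideband_debye(f_hz, eps_inf, d_eps, f1=1e5, f2=1e12):
    """Djordjevic-Sarkar (wideband Debye) complex permittivity: causal by
    construction, with a nearly flat loss tangent between edges f1 and f2."""
    w = 2 * np.pi * np.asarray(f_hz, dtype=float)
    w1, w2 = 2 * np.pi * f1, 2 * np.pi * f2
    return eps_inf + d_eps * np.log((w2 + 1j * w) / (w1 + 1j * w)) / np.log(w2 / w1)

# Hypothetical fitted values for an FR-4-class laminate:
eps = wideband_debye(np.array([1e9, 1e10]), eps_inf=3.5, d_eps=0.6)
dk = eps.real               # slowly falling with frequency
df = -eps.imag / eps.real   # nearly constant loss tangent
```

Because the real and imaginary parts come from one analytic expression, dispersion and loss stay Kramers-Kronig consistent, which a constant-Dk, constant-Df model cannot guarantee.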
This paper begins with a review of jitter fundamentals, including a discussion of the various random jitter (RJ) and deterministic jitter (DJ) components and their possible causes. RJ/DJ separation and bathtub-curve analysis are typically the first step in determining whether your high-speed system and/or components comfortably meet a particular timing-budget requirement for reliable data transmission. But separation tables and bathtub-curve extractions give no indication of the source of jitter. Knowing how to isolate jitter components with time correlation will enhance your ability to find the root cause, so that you can then beat down individual error components one at a time to improve system reliability.
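The bathtub-curve extrapolation behind RJ/DJ separation is usually summarized by the dual-Dirac model, TJ(BER) = DJ(δδ) + 2·Q(BER)·RJrms; a dependency-free sketch (the budget numbers are hypothetical):

```python
import math

def q_scale(ber):
    """Q such that 0.5*erfc(Q/sqrt(2)) == ber (Gaussian tail), by bisection."""
    lo, hi = 0.0, 40.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > ber:
            lo = mid          # tail probability still too high: Q must grow
        else:
            hi = mid
    return 0.5 * (lo + hi)

def total_jitter(dj_pp, rj_rms, ber=1e-12):
    """Dual-Dirac extrapolation: TJ(BER) = DJ(delta-delta) + 2*Q(BER)*RJ_rms."""
    return dj_pp + 2.0 * q_scale(ber) * rj_rms

# Example budget in UI: DJ = 0.10 UI p-p, RJ = 0.01 UI rms
tj = total_jitter(0.10, 0.01)   # ~0.24 UI at BER 1e-12 (Q ~ 7.03)
```

As the paper notes, this table-level number says nothing about where the DJ comes from; time-correlated isolation is what turns the budget into actionable fixes.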
As the cost of bringing complex high-performance systems to market increases and the time to market decreases, new methods must be used to ensure first-pass design success. This paper describes a verifiable design method that establishes excellent signal integrity and enables leading-edge performance with common off-the-shelf materials. By integrating field-solver and measurement-based modeling methods, the overall process is verifiable, and the optimum low-cost production design can be attained in the shortest time. This process yields unmatched time-domain and frequency-domain performance using off-the-shelf, generic, low-cost materials.