Workshops and Tutorials ISCE and SI Symposiums 2012



Day 1, June 4, 2012
8:30-9:30 KEYNOTE:
Glass Interface, Izhak Baharav, Corning Glass
9:30-11:30 TUTORIAL 1:
Quality of Experience (QoE) evaluation in consumer electronics applications, devices and services
Dr. Ulrich Reiter. Norwegian University of Science and Technology (Norway)
12:30-1:30 LUNCH:
Speaker: Dr. Scott Snyder, President and Chief Strategy Officer, Mobiquity
1:45-3:45 TUTORIAL 2:
GPU Programming and Architecture for Multimedia Signal Processing
Dr. Saeid Nooshabadi. Michigan Technological University (USA)

Day 2, June 5, 2012
8:30-9:25 KEYNOTE:
3DTV, Tony Vetro, Mitsubishi Electric
9:30-11:30 TUTORIAL 3:
Techniques for Permittivity Measurement of Materials at High Frequencies
Dr. Mohammad-Reza Tofighi. Pennsylvania State University (USA) and
Dr. Nathaniel Hager III. Elizabethtown College (USA)
12:00-1:30 LUNCH: Speaker: David Heslter.
1:45-3:45 TUTORIAL 4:
An Overview of the America Invents Act and How It Changes U.S. Patent Law
Brian T. Sattizahn. McNees Wallace & Nurick (USA)
BANQUET DINNER: Dr. Taechan Kim, Samsung VP

Day 3, June 6, 2012
8:30-9:25 KEYNOTE: Jim Nadolny, Samtec
9:30-11:30 TUTORIAL 5:
Using the LeCroy Signal Integrity Studio Tool for Simulating Eye Diagrams, Jitter Behavior and Equalization Schemes using S-parameters
Dr. Alan Blankman, LeCroy (USA)
TUTORIAL 6: Finite Element Practices for Causal and Broad Band Frequency Sweeps of PCBs and Interconnects with ANSYS HFSS
Dr. Matt Commens. ANSYS (USA)
12:00-1:30 LUNCH: Speaker: Howard Heck, Intel
1:45-4:00 TUTORIAL 7:
High-Speed Serial Interconnect Analysis Software for Next Generation High-Speed Digital Standards
Yoji Sekine. Agilent (USA)
TUTORIAL 8:
Jitter & Timing Analysis for High Speed Designs and Link Analysis for High Speed Serial Standards
Frank Selvaggio. Tektronix (USA)

TUTORIAL NUMBER 1:
Quality of Experience (QoE) evaluation in consumer electronics applications, devices and services
Author:
Dr. Ulrich Reiter, Norwegian University of Science and Technology, Trondheim (Norway)
Abstract:
About ten years ago, Quality of Experience (QoE) was introduced as a concept that challenges the well-known Quality of Service (QoS) approach. Whereas QoS looks at the technical parameters of a network, service, device, or application to measure its performance, QoE looks beyond these parameters and includes the human user in the equation. It stems from the insight that our perception of quality is influenced by technical factors, but also significantly by human and context-related ones. A simple example of this is that we can be equally satisfied with the overall quality of a movie we watch at the cinema and a YouTube clip we watch on our mobile while travelling on the bus. However, the same clip at the same bitrate, watched on the TV in our living room, would probably fail to meet our quality expectations.
Although perceived quality / QoE is highly relevant to Consumer Electronics, it has not yet received much attention. In fact, many professionals still find it hard to replace their QoS-based approaches with QoE-based ones. The reasons for this are manifold: whereas QoS management can rely on traditional, engineering-style measurements of parameters easily described and quantified in numbers, QoE measurement involves a multi-disciplinary set of parameters that can be difficult to measure, that varies depending on a number of factors external to the actual device, and that generally requires advanced methodologies of parameter estimation and analysis to be applied successfully.
In this tutorial we will give an overview of QoE in the context of Consumer Electronics. We will define the term itself in more detail, look at relevant features and influence factors, and discuss traditional and novel techniques and methodologies for QoE measurement. Special focus will be put on human perception mechanisms (visual, auditory, and audiovisual perception), and on how knowledge about such mechanisms can be used to model and design advanced objective metrics for QoE. The tutorial will provide interested attendees with a jump-start into the field of QoE, will discuss standard procedures and recommendations, and will give insight into the latest developments and current hot topics in QoE research.
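
As a flavor of the subjective assessment methodologies the tutorial covers, the sketch below computes the Mean Opinion Score (MOS) with 95% confidence intervals from panel ratings, the basic statistic behind most subjective QoE tests (cf. ITU-T P.910). The rating data are hypothetical placeholders, not material from the tutorial:

# Minimal sketch: Mean Opinion Score (MOS) with 95% confidence intervals.
# The ratings below are hypothetical placeholders.
import numpy as np
from scipy import stats

def mos_with_ci(ratings, confidence=0.95):
    """Return (MOS, half-width of the confidence interval) for one test condition."""
    r = np.asarray(ratings, dtype=float)
    mos = r.mean()
    # t-distribution suits the small panels (15-30 subjects) typical of QoE tests
    half_width = stats.t.ppf((1 + confidence) / 2, df=len(r) - 1) * stats.sem(r)
    return mos, half_width

# Hypothetical 5-point ACR ratings from 16 subjects for two viewing conditions
hd_clip     = [5, 4, 5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4, 3, 4, 5]
mobile_clip = [3, 4, 3, 2, 3, 4, 3, 3, 2, 4, 3, 3, 4, 3, 2, 3]

for name, ratings in [("HD clip", hd_clip), ("mobile clip", mobile_clip)]:
    mos, ci = mos_with_ci(ratings)
    print(f"{name}: MOS = {mos:.2f} +/- {ci:.2f}")
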
Short Bio:
Dr. Ulrich Reiter is a researcher and lecturer working in the fields of audiovisual quality perception, subjective assessment methodologies, and interactivity issues in audiovisual applications at the Norwegian University of Science and Technology (NTNU) in Trondheim, Norway. He holds a Master's degree in electrical engineering from RWTH Aachen and a PhD in media technology from TU Ilmenau, both in Germany.
Ulrich is the author or co-author of more than 50 articles for scientific books, journals, and conferences. His publications mainly focus on the subjective assessment of perceived audiovisual quality, on MPEG-4 audio at the scene reproduction level, and on virtual acoustics in interactive application systems. More recently, he has been designing novel subjective assessment methodologies for the comparison of different types of video artifacts, and evaluating the content-dependent quality trade-off between modalities in audiovisual media. His latest research focuses on categorization of audiovisual media content as a pre-requisite for better objective quality metrics, on the role of emotional state in perceived quality, and on quality evaluation methods for long-duration audiovisual content.
He received the ‘IEEE International Symposium on Consumer Electronics (ISCE) Best Paper Award’ in 2005 and 2007, and the ‘Quality of Multimedia Experience (QoMEX) Best Paper Award’ in 2010. Since 2006 he has been a member of the editorial board of the IEEE Transactions on Consumer Electronics. He has served on the TPCs of various international conferences and is a regular reviewer for IEEE Signal Processing Magazine, Signal Processing: Image Communication, IEEE Transactions on Consumer Electronics, and other scientific publications. Ulrich is a member of the Audio Engineering Society and the IEEE. He is the deputy leader of Working Group 2 on ‘mechanisms and models of human perception’ in the EU-funded COST Action IC1003 ‘QUALINET’ (http://www.qualinet.eu).
TUTORIAL NUMBER 2:
GPU Programming and Architecture for Multimedia Signal Processing
Author:
Dr. Saeid Nooshabadi. Department of Electrical and Computer Engineering and Department of Computer Science, Michigan Technological University (USA)
Abstract:
This tutorial addresses multimedia signal processing using Graphics Processing Units (GPUs): how to bring the power of a supercomputer to your desktop using a GPU graphics card with a price tag of just under $1500.

In the era of single-core microprocessors, most application developers relied on advances in hardware to increase the speed of their applications under the hood: the same software simply ran faster as each new generation of processors was introduced. However, this drive has slowed since 2003, mainly due to power dissipation issues.

To achieve high performance with low power across a variety of tasks, multiprocessor SoCs and chip multiprocessors have emerged as an important class of VLSI system since their debut a decade ago, and they have become widespread. For example, the Sony PlayStation 3 is equipped with an 8-core IBM Cell Broadband Engine processor, the Nvidia GeForce 9800 GX2 has 256 stream processors, and the SUN UltraSPARC T1/T2 processors have 8 cores.

Further, virtually all microprocessor vendors have switched to models where multiple processing units, called “processor cores”, are used in each chip to increase processing power. The AMD Phenom™ X4 9000 Series Quad-Core with HyperTransport™ Technology has 4 cores, and even some of the latest mobile workstations come with a 4-core Intel® Core™ i7. We expect to see an explosion of multiple processing units in the future.

However, the sequential algorithms that most developers are familiar with are ill-placed to take advantage of these many-core and multi-core systems. Rather, properly parallelized algorithms and programs, in which multiple threads of execution cooperate to complete work faster, will enjoy performance improvements with each new generation of many-core and multi-core processors.

Unfortunately, the best sequential algorithm is not necessarily the best parallel algorithm. This tutorial will therefore discuss the algorithm development, algorithm remapping, and programming skills needed to enable the user to harness the tremendous power of massively parallel GPUs to do general-purpose computing at thousands of GFLOPS.

This tutorial is primarily designed to serve the needs of those who need high-performance computing for multimedia applications in which tremendous amounts of data are processed. Examples are video and audio coding and manipulation, radar signal processing, IC CAD tools, structural engineering, computational physics, molecular dynamics simulation, 3D imaging and visualization, consumer game physics, genetics, cryptography, machine intelligence, support vector machines, and virtual reality products.

Topics to be covered:
For a half-day tutorial, the essentials and techniques of algorithm development for parallel processing of massively large problems on GPUs will be presented. The process of mapping a sequential algorithm to an efficient parallel algorithm that can run on the GPU will be explored (a small illustrative sketch follows the topic list below). The role of efficient memory bandwidth utilization in getting the best performance from the streaming multiprocessors in the GPU will also be discussed. The following topics will be covered in adequate depth.
• Multi-core and many-core (GPUs) introduction and architectural features (15 minutes)
• Streaming multiprocessors and the Compute Unified Device Architecture (CUDA) (20 minutes)
• Caching and shared memory architectures on GPUs (10 minutes)
• Scan, reduction, and geometric algorithms on GPUs (30 minutes)
• Dense matrix multiplications on GPUs (30 minutes)
• Sparse matrix multiplications and Linear Algebra Algorithms on GPUs (20 minutes)
• Multimedia signal processing on GPUs (30 minutes)
• Scientific Computing on GPUs (25 minutes)
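
As a flavor of the algorithm remapping covered in the scan/reduction topic, here is a minimal, language-neutral sketch (in Python/NumPy rather than CUDA) of the tree-reduction pattern GPU kernels use: a sequential sum takes one dependent step per element, while the tree reduction halves the working set on each pass, which is exactly the structure that lets thousands of GPU threads cooperate on a single result:

# Illustrative sketch (not CUDA): the tree-reduction pattern behind GPU
# "scan/reduction" kernels, expressed in NumPy.
import numpy as np

def sequential_sum(x):
    total = 0.0
    for v in x:          # one dependent step per element: no parallelism
        total += v
    return total

def tree_reduce_sum(x):
    x = np.array(x, dtype=np.float64)
    n = 1 << (len(x) - 1).bit_length()      # pad to a power of two
    x = np.pad(x, (0, n - len(x)))
    while len(x) > 1:
        # each pass is an independent element-wise add: on a GPU, every
        # pair would be handled by a different thread in the same step
        x = x[: len(x) // 2] + x[len(x) // 2 :]
    return x[0]

data = np.random.rand(1000)
assert np.isclose(sequential_sum(data), tree_reduce_sum(data))

The sequential loop needs O(n) dependent steps; the tree reduction needs only O(log n) parallel passes, which is why the remapping pays off on massively parallel hardware.
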
Short Bio:
Saeid Nooshabadi is a professor of multimedia signal processing in the Department of Electrical and Computer Engineering and the Department of Computer Science at Michigan Technological University, Houghton, MI. Prior to his current appointment he was with the Department of Information and Communications, Gwangju Institute of Science and Technology, Republic of Korea (2007 to 2010). Earlier he was with the School of Electrical Engineering and Telecommunications, University of New South Wales, Sydney, Australia (2000 to 2007), where he currently holds an adjunct appointment. In 1992, he was a Research Scientist at the CAD Laboratory, Indian Institute of Science, Bangalore, India, working on the design of VLSI chips for TV ghost cancellation in digital TV. In 1996 and 1997, he was a Visiting Faculty and Researcher at the Center for Very High Speed Microelectronic Systems, Edith Cowan University, Western Australia, working on high-performance integrated circuits, and at Curtin University of Technology, Western Australia, working on the design of high-speed, high-frequency modems.
He has extensive research and teaching experience and interests in the areas of SoC design for multimedia systems, high-performance and low-power computing systems, application-specific integrated circuit design for information processing systems, and embedded electronic systems. He is a co-author of multiple patents and more than 150 technical journal and conference papers on all aspects of VLSI information processing. He has supervised more than 20 postgraduate students.

He was a co-author of best paper awards at the Midwest Symposium on Circuits and Systems 2007 and the VLSI Design Conference 1997. Saeid Nooshabadi received the MTech and PhD degrees in electrical engineering from the Indian Institute of Technology, Delhi, India, in 1986 and 1992, respectively.

TUTORIAL NUMBER 3:
Techniques for Permittivity Measurement of Materials at High Frequencies
Authors:
Dr. Mohammad-Reza Tofighi [1] and Dr. Nathaniel Hager III [2]
[1] School of Science, Engineering, and Technology, Pennsylvania State University, Harrisburg, Middletown (USA)
[2] Department of Physics and Engineering, Elizabethtown College, Elizabethtown, PA, and Material Sensing & Instrumentation, Lancaster (USA)

Abstract:
Knowledge of material dielectric properties (or complex permittivity) at RF and microwave frequencies is important in many industrial, medical, research, and regulatory applications. In telecommunication devices, the performance of high-frequency components and antennas is greatly influenced by the dielectric characteristics of packaging and circuit boards. In high-speed signal transmission, the dielectric properties of cables and transmission lines have a great influence on the signal. Communication products can be buried inside materials with certain desirable features, or placed inside or adjacent to heterogeneous and complex dielectric objects such as the human body. Other applications include continuous monitoring of the processing of materials through their dielectric properties, such as monitoring the curing process in composite polymers or hydrating cement, or rapidly identifying components in chemical or biological solutions. Despite its significance, high-frequency permittivity measurement is mostly limited to research labs, and its related apparatus and procedures remain highly customized.
This tutorial reviews contemporary methods of complex permittivity measurement at high frequencies, as well as current trends in the subject area. The focus will be on methods involving transmission line components and techniques. These methods can be broadly categorized as resonant versus non-resonant techniques, narrowband versus broadband frequency methods, time domain versus frequency domain approaches, low-loss versus lossy materials, and reflection versus transmission methods. These distinctions will be clarified in detail, various probing and fixturing apparatus will be described, typical sample measurement results will be presented, and calibration techniques specific to permittivity characterization will be discussed. In particular, a perspective on time domain reflectometry (TDR) versus vector network analyzer (VNA) methods will be provided.
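
To give a feel for the reflection methods discussed above, the sketch below uses a deliberately idealized model: a single air-to-dielectric interface in a lossless, non-magnetic TEM line, where the line impedance scales as 1/sqrt(eps_r). Real fixtures need the calibration and de-embedding steps covered in the tutorial, and the sample value is a hypothetical placeholder:

# Idealized single-interface reflection model for permittivity extraction.
import numpy as np

def gamma_from_permittivity(eps_r):
    """Reflection coefficient at an air-to-dielectric interface (TEM line)."""
    # Z = Z0 / sqrt(eps_r), so Gamma = (Z - Z0)/(Z + Z0)
    #                                = (1 - sqrt(eps_r)) / (1 + sqrt(eps_r))
    return (1 - np.sqrt(eps_r)) / (1 + np.sqrt(eps_r))

def permittivity_from_gamma(gamma):
    """Invert the relation above to recover eps_r from a measured Gamma."""
    return ((1 - gamma) / (1 + gamma)) ** 2

eps_r = 2.55                      # hypothetical value, roughly polystyrene
gamma = gamma_from_permittivity(eps_r)
print(f"Gamma = {gamma:.4f}, recovered eps_r = {permittivity_from_gamma(gamma):.3f}")
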

Short Bio:
Mohammad-Reza Tofighi received his Ph.D. degree in electrical engineering from Drexel University, Philadelphia, PA in 2001. He is currently an Associate Professor of Electrical Engineering at the Pennsylvania State University, Harrisburg, where he has taught a variety of undergraduate and graduate courses on RF and microwaves, communication systems, and electromagnetics since 2004. Dr. Tofighi’s main research interest is in RF and microwave technologies. In particular, he is conducting research on implanted wireless devices, biomedical antennas, permittivity measurement methods, and microwave sensing. He is the first author of a chapter titled “Measurement Techniques for Electromagnetic Characterization of Biological Materials,” which appeared in the Handbook of Engineering Electromagnetics (New York: Marcel Dekker, 2004). Dr. Tofighi is an active member and officer of the IEEE Microwave Theory and Techniques (MTT) Society, serving as the chair of its Technical Committee 10 on Medical Applications of RF and Microwave.

Nathaniel Hager III received his Ph.D. in physics from State University of New York at Binghamton in 1981. He worked 12 years in R&D at Armstrong World Industries in Lancaster, developing a program on dielectric relaxation of materials as a process monitor and probe of molecular dynamics. He then worked 10 years with the Small Business Innovation Research (SBIR) program, receiving Phase I/II awards from the US Army on aerospace composite cure monitoring and Phase I awards from Department of Commerce and National Science Foundation on concrete hydration monitoring and composite structural-health monitoring. Dr. Hager’s main interest is Time Domain Reflectometry (TDR) Dielectric Spectroscopy for process monitoring applications, and he currently has a grant from NSF on concrete hydration monitoring. Dr. Hager consults for a variety of companies in the wireless, medical, aerospace, and construction industries on problems involving TDR and microwave cavity characterization.

TUTORIAL NUMBER 4:
An Overview of the America Invents Act and How It Changes U.S. Patent Law
Author:
Brian T. Sattizahn. McNees Wallace & Nurick (USA)
Abstract:
In September 2011, the America Invents Act (AIA) was enacted into law and has significantly changed U.S. Patent law. This tutorial will provide an overview of the changes to U.S. Patent Law as a result of the AIA and discuss how those changes may impact you and your patent strategy. Included in the overview are changes such as the switch to a first-to-file system, new patent infringement defenses based on prior commercial use, and new mechanisms for the post-grant review of issued patents.

Short Bio:
Brian is an IEEE member, Intellectual Property Attorney, and Registered Patent Attorney with over 15 years of experience. Brian's practice areas include the preparation and prosecution of patent applications in the electrical arts, including computer and software technologies, the licensing of patents and technology, and the evaluation of new technologies. Brian has a B.S. in Electrical Engineering and is a former Examiner with the U.S. Patent and Trademark Office.

TUTORIAL NUMBER 5:
Using the LeCroy Signal Integrity Studio Tool for Simulating Eye Diagrams, Jitter Behavior and Equalization Schemes using S-parameters
Author:
Dr. Alan Blankman, Signal Integrity Product Manager, LeCroy Corporation (USA)

Abstract:
Today's signal integrity engineering design flow requires design engineers to both measure and model S-parameters, and then to emulate the S-parameters in time domain simulations of high-speed serial data traffic. The Signal Integrity Studio software gives SI engineers the ability to quickly analyze signal integrity effects like eye diagram closure and inter-symbol interference using imported S-parameters, and to explore equalization schemes that can be used to open closed eyes. In this tutorial, we will first introduce the basic concepts of S-parameters, eye diagrams and jitter analysis, and then show how to explore and emulate your S-parameters using Signal Integrity Studio. (A generic illustration of eye-diagram construction follows the outline below.)

Outline:

I. Understanding the basic theory of:
a. S-parameters
b. Eye diagram generation
c. Jitter calculation techniques
d. Channel emulation
e. Equalization and Emphasis techniques
II. What can the tool do?
a. Simulation of NRZ patterns
b. Simulation of signal impairments in the presence of a channel via its S-parameters
c. Eye diagram modeling
d. Jitter analysis & breakdown
e. Opening closed eyes via equalization
III. Application examples
a. Pre-fab modeling
b. Model/measurement comparisons
c. Equalization exploration
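
As a generic illustration of items I.b and II.c, and not of the Signal Integrity Studio tool itself, the sketch below builds an eye diagram by folding a channel-filtered NRZ waveform modulo two unit intervals. A first-order low-pass stands in for the channel here; the tool would instead emulate the channel from imported S-parameters:

# Generic eye-diagram construction sketch (hypothetical channel, not the tool).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
samples_per_ui = 32
n_bits = 2000
bits = rng.integers(0, 2, n_bits)
waveform = np.repeat(2.0 * bits - 1.0, samples_per_ui)   # ideal NRZ, +/-1 V

# crude stand-in channel: discrete first-order low-pass
alpha = 0.35
rx = np.empty_like(waveform)
acc = 0.0
for i, v in enumerate(waveform):
    acc += alpha * (v - acc)     # y[n] = y[n-1] + alpha*(x[n] - y[n-1])
    rx[i] = acc

# fold the received waveform into two-UI slices to form the eye
ui2 = 2 * samples_per_ui
n_slices = len(rx) // ui2
eye = rx[: n_slices * ui2].reshape(n_slices, ui2)

t = np.arange(ui2) / samples_per_ui
plt.plot(t, eye[10:].T, color="steelblue", alpha=0.05)   # skip slices while the filter settles
plt.xlabel("time (UI)"); plt.ylabel("amplitude (V)"); plt.title("Eye diagram (sketch)")
plt.show()
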

Short Bio:
Dr. Blankman is a Product Marketing Manager at LeCroy Corporation, focusing on signal integrity products and applications, including the SPARQ series network analyzers and serial data solutions. He has a PhD in physics from the University of Pennsylvania and over 20 years of experience developing instrumentation and software solutions for high-energy physicists and electrical engineers.

TUTORIAL NUMBER 8:
Jitter & Timing Analysis for High Speed Designs and Link Analysis for High Speed Serial Standards
Author:
Frank Selvaggio. Tektronix (USA)
Abstract:
We will discuss the challenges to performing accurate and repeatable jitter measurements and how to easily test for the latest high-speed serial standards. You will also learn how jitter analysis can be used to debug your design and provide additional insight into Bit Error Rate (BER) performance. Guidance will be provided on the different measurement solutions from oscilloscopes to spectrum analyzers to bit-error-ratio analyzers.

Learn about the techniques and tools to analyze the complete serial data link including all interactions between the different elements. We will cover customized application-specific measurements that include advanced Jitter Analysis and Link Analysis with equalization. We will also discuss additional concepts needed to prepare and succeed with next-generation technologies.
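
To make the jitter measurement concrete, here is a minimal sketch of the time interval error (TIE) approach: recover an ideal clock by a least-squares fit to measured edge times, then report the RMS and peak-to-peak of the residuals. The edge times below are synthetic placeholders, not instrument data:

# Minimal TIE jitter sketch on synthetic edge timestamps.
import numpy as np

ui = 100e-12                          # 10 Gb/s: unit interval of 100 ps
n_edges = 10_000
rng = np.random.default_rng(1)

# synthetic edges: random jitter (1 ps RMS) plus sinusoidal periodic jitter
ideal = np.arange(n_edges) * ui
edges = (ideal
         + rng.normal(0, 1e-12, n_edges)
         + 2e-12 * np.sin(2 * np.pi * 5e6 * ideal))

# recover the "ideal" clock by least-squares fit (slope = recovered UI);
# TIE is the residual of each edge against that fit
k = np.arange(n_edges)
slope, intercept = np.polyfit(k, edges, 1)
tie = edges - (slope * k + intercept)

print(f"recovered UI : {slope*1e12:.3f} ps")
print(f"RMS jitter   : {tie.std()*1e12:.3f} ps")
print(f"pk-pk jitter : {(tie.max()-tie.min())*1e12:.3f} ps")
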

Short Bio:
Frank Selvaggio is a Senior Applications Engineer at Tektronix Inc. with responsibility for supporting customers in oscilloscope, logic analyzer, and signal source applications. He currently focuses on high-speed data links, digital analysis (bus and microprocessor debug and validation), TDNA link analysis (impedance, S-parameters, and eye simulation), and optical data links. He specializes in standards compliance testing, including DDR, SATA/SAS, USB, HDMI, Ethernet, and PCI Express, along with signal integrity test methodologies for jitter, BER, link analysis, power supply, and protocol analysis. He has also specialized in designs for VXI and general GPIB ATE test systems. During his past 22 years at Tektronix, Frank has helped thousands of companies solve their design problems in a wide variety of applications.

Frank was born and raised in Bethlehem, PA, developed an interest in electrical and electronic design as a child, and began studying electronics fundamentals in high school. After graduating, he continued his electrical, electronic, and computer science education at Lehigh University, Northampton Community College, and Lincoln Technical Institute, along with over 28 additional courses across the US in microprocessor design, DC power supply design, audio/video design, and data network and telecommunication design.

TUTORIAL NUMBER 7:
High-Speed Serial Interconnect Analysis Software for Next Generation High-Speed Digital Standards
Author:
Yoji Sekine. Agilent (USA)
Abstract:
With the increase in bit rates, standards continue to evolve, and new measurements are often the result. There is a growing need in the industry for more thorough evaluation of components, as well as evaluation under actual operating conditions. The ENA Option TDR offers a variety of measurement capabilities, providing you with tools to characterize high-speed digital designs more thoroughly. (A small sketch of the emphasis idea in the first item follows the list.)
- Determine optimal signal conditioning for high-speed digital links with the emphasis and equalization features
- Perform stressed eye diagram analysis of interconnects with the jitter insertion feature
- Analyze the impedance of active devices under actual operating conditions with the Hot TDR measurement capability
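
The sketch below illustrates the emphasis idea from the first item under simplified assumptions: a two-tap FIR pre-emphasis filter that boosts transition bits relative to repeated bits, pre-compensating a channel's low-pass loss. The tap values and 3 dB boost are hypothetical; an instrument would sweep such settings against the measured channel:

# Two-tap transmitter pre-emphasis sketch (hypothetical tap values).
import numpy as np

def pre_emphasize(bits, boost_db=3.0):
    """Apply 2-tap pre-emphasis to an NRZ symbol sequence (+/-1)."""
    symbols = 2.0 * np.asarray(bits) - 1.0
    post_tap = -(1 - 10 ** (-boost_db / 20)) / 2     # negative post-cursor tap
    main_tap = 1 + post_tap                          # keeps the peak swing at 1
    out = main_tap * symbols
    out[1:] += post_tap * symbols[:-1]               # subtract a scaled previous symbol
    return out

bits = [0, 1, 1, 1, 0, 0, 1, 0]
print(np.round(pre_emphasize(bits), 3))
# transition bits come out with larger amplitude than repeated bits,
# which flattens the channel's low-pass response at the receiver
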

Short Bio:
Yoji Sekine, Marketing Engineer, Agilent Technologies
Yoji Sekine is a Marketing Engineer for the Component Test Division of Agilent Technologies. His most recent activities have focused on the ENA Option TDR, which provides a simple and intuitive time domain characterization method utilizing a vector network analyzer. Prior to his current assignment in marketing, he worked as an R&D engineer designing various products, including vector network analyzers, signal source analyzers, and LCR meters. He received his BSEE degree from the University of California, Davis in 1997.

TUTORIAL NUMBER 6:
Finite Element Practices for Causal and Broad Band Frequency Sweeps of PCBs and Interconnects with ANSYS HFSS

Author:
Dr. Matt Commens. ANSYS (USA)

Abstract:
This tutorial will outline techniques applied in the Finite Element Method to extract accurate and causal broadband frequency sweeps when simulating IC packages, printed circuit boards (PCBs), and interconnects. Instruction will be given on setting up and solving such problems, and the importance and treatment of finite-thickness metal traces, an accurate DC point, causal material models, causal surface roughness models, and de-embedding of parasitic effects will be discussed and demonstrated.
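
As a generic numerical illustration of the causality requirement (a plain FFT demonstration, not an HFSS workflow), the sketch below uses the fact that for a causal impulse response, the real part of the frequency response alone determines the imaginary part (the Kramers-Kronig / Hilbert relation). A broadband sweep that violates this relation produces non-physical time-domain behavior:

# Causality check sketch: reconstruct a causal response from Re{H} only.
import numpy as np

n = 4096
t = np.arange(n)
h = np.exp(-t / 40.0)                  # causal impulse response: zero for t < 0
H = np.fft.fft(h)

# ifft of Re{H} gives the even part of h; for a causal signal the even part
# determines the whole response (the Hilbert/Kramers-Kronig constraint)
even = np.fft.ifft(H.real).real
h_rec = np.zeros(n)
h_rec[0] = even[0]
h_rec[1:] = 2.0 * even[1:]
h_rec[n // 2:] = 0.0                   # the second half aliases h(-t); discard it

print("max reconstruction error:", np.max(np.abs(h - h_rec)))   # ~1e-16
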
Short Bio:
Dr. Matthew Commens is a Lead Product Manager at ANSYS, Inc. in charge of HFSS (High Frequency Structure Simulator), a 3D full-wave electromagnetic field simulator. He has held this position since January 2009 and works in Pittsburgh, PA, USA. He first joined the greater ANSYS organization in August 2001, working for Ansoft, LLC as an applications engineer specializing in high-frequency electromagnetic simulation tools. Prior to joining ANSYS he worked as an antenna designer and simulation manager at Rangestar Wireless of Aptos, CA, USA, where he specialized in the design of compact, integrated antenna solutions for commercial wireless applications. Before that he worked at Varian Inc. as a high-resolution NMR probe designer. Dr. Commens holds five patents in the areas of NMR coil and antenna design. He received a B.S. in Physics from the University of Missouri-Rolla (1989) and a Ph.D. in Physics from Washington University in St. Louis, MO, USA.