Semi-Additive PCB Processing: Process, Reliability Testing and Applications

The continued miniaturization of both packaging and component size in next-generation electronics presents a significant challenge for PCB designers and fabricators. To work within the constraints of traditional subtractive-etch PCB fabrication processes, designs require advanced fabrication capabilities that push the limits of finer feature sizes, higher layer counts, multiple levels of stacked microvias and additional lamination cycles. Semi-Additive PCB processes, which can be implemented and integrated with existing PCB fabrication equipment and processes, provide an alternative that effectively resets the SWaP-C curve while increasing reliability.

The ability to design with and manufacture 15-micron trace and space repeatedly and reliably provides options and opportunities previously not available to PCB designers and fabricators. While just scratching the surface, Semi-Additive PCB processes can:

• reduce the number of layers needed for routing high-density BGAs (a rough escape-routing estimate is sketched below)

• increase the hole size

• reduce the number of microvia layers required

• dramatically reduce size, weight and packaging and, conversely, increase the electronic content within an existing footprint

These benefits and more are being explored and realized as PCB fabricators implement semi-additive processes into their manufacturing facilities.
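As a rough illustration of the first benefit, the sketch below estimates how many traces can be escaped between adjacent BGA pads for a given pitch, pad size and trace/space rule. The 0.8 mm pitch, 0.4 mm pad diameter and the trace/space values are illustrative assumptions, not figures from the paper.

    # Illustrative estimate of BGA escape-routing capacity (assumed geometry,
    # not from the paper): n traces between pads need n traces plus (n + 1) spaces.

    def traces_between_pads(pitch_um: float, pad_um: float,
                            trace_um: float, space_um: float) -> int:
        """Return how many traces fit between two adjacent BGA pads."""
        gap = pitch_um - pad_um                  # copper-free span between pads
        n = 0
        while (n + 1) * trace_um + (n + 2) * space_um <= gap:
            n += 1
        return n

    if __name__ == "__main__":
        # Assumed example: 0.8 mm pitch BGA with 0.4 mm pads.
        for rule in (75, 50, 15):                # trace/space in microns
            n = traces_between_pads(800, 400, rule, rule)
            print(f"{rule} um trace/space -> {n} traces per channel")

Finer rules let more traces escape per routing channel, which is why a reliable 15-micron trace and space can reduce the number of layers needed to break out a dense BGA.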

This session will begin with an overview of Semi-Additive technology as it relates to PCB fabrication, including materials, required equipment and process flow. This overview will be followed by a discussion of reliability test results and signal integrity modeling, and will close with a discussion of use cases demonstrating the various ways the technology can be applied.

Author(s)
Mike Vinson
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

Thermal Improvement in 3D Embedded Modules Using Copper Bar Vias

The combination of increased I/O density, reduced footprint and multi-die capability within a single platform makes embedded die an attractive solution. The benefits of embedding active die include miniaturization, improved electrical and thermal performance, heterogeneous integration and an opportunity for cost reduction. Currently, vertical integration is happening in power management chips, with embedded components integrated into modules. Heat dissipation in integrated 3D high-efficiency power modules is challenging when the die is embedded inside the substrate, and it affects the efficiency and electromigration of the packaged structures. Circular vias can be replaced by copper bar vias in the substrate design for thermal improvement. The thermal improvement is simulated using Ansys FEM (Finite Element Method) tools.

Keywords: Embedded die, Fan-Out Wafer Level Module, Thermal Improvement
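As a back-of-the-envelope illustration of why a bar via helps, the sketch below compares the conduction resistance R = t / (k·A) of a small circular via array against a copper bar via of the same depth. The via counts and dimensions are assumptions for illustration, not the geometry simulated in the paper.

    # Illustrative comparison (assumed dimensions, not the paper's design):
    # through-substrate conduction resistance R = t / (k * A) for filled copper vias.
    import math

    K_CU = 385.0          # W/(m*K), thermal conductivity of copper
    T = 200e-6            # assumed via depth: 200 um substrate

    def r_thermal(area_m2: float) -> float:
        return T / (K_CU * area_m2)

    # Assumed layout: 16 filled circular vias, 100 um diameter
    circ_area = 16 * math.pi * (50e-6) ** 2
    # Assumed copper bar via occupying a 1.0 mm x 0.2 mm slot
    bar_area = 1.0e-3 * 0.2e-3

    print(f"circular via array: {r_thermal(circ_area):.2f} K/W")
    print(f"copper bar via:     {r_thermal(bar_area):.2f} K/W")

Because the bar via presents a larger copper cross-section under the die, its conduction path to the next layer has lower thermal resistance than a comparable array of circular vias.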

Author(s)
Manoj Kakade, Richard Dowling, Mumtaz Bora, Jake Tubbs, and Ahmed Maghawri
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

FIDES, Reliability Assessment of Electronics: A New Approach to the Lead-Free Process Factor

Reliability control in airborne electronics products is essential for safety and business reasons. Because users do not have confidence in the raw results provided by previous reliability prediction methodologies, FIDES aims to build up this confidence by considering technology, process and use. After more than a decade of RoHS legislation and the beginning of the lead-free transition of airborne electronics, it is imperative to consider an appropriate lead-free process factor (Π_LF) in the reliability prediction. As shown in this work, the aeronautical, military and medical industries have waivers that allow them to continue using lead to ensure reliability. Some industries use components with lead-free surface finishes together with tin-lead soldering alloys. These mixtures, higher temperatures, different reflow times and other factors introduce risks to product reliability, mainly when the product is subjected to thermomechanical fatigue over a long life cycle.

This work presents a new approach to the lead-free process factor (Π_LF) and new recommendations for the lead-free grade (LF_grade). An analysis was performed of the current recommendations for the LF_grade. Thirteen new recommendations and their respective weights were defined for the LF_grade, and the lead-free process factor (Π_LF) equation was simplified while respecting the value range (from 1 to 2) proposed by FIDES; the lead-free process factor (Π_LF) can therefore double the predicted failure rate (λ).

Predicting reliability with an updated lead-free factor, and identifying and controlling the factors that presently influence reliability, are now even more important objectives.

Keywords: failure rate, FIDES, lead-free, Pb-free, pi-factors, reliability prediction
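As a rough illustration of how a multiplicative pi-factor of this kind scales a prediction, the sketch below maps a weighted LF_grade onto the 1-to-2 range and applies it to a base failure rate. The weighting scheme and all numbers are assumptions for illustration, not the equation or grades proposed in the paper.

    # Illustrative sketch only: maps a weighted lead-free grade onto [1, 2] and
    # scales a base failure rate. The actual FIDES equation and weights differ.

    def pi_lf(grades: list[float], weights: list[float]) -> float:
        """grades in [0, 1] (1 = least controlled lead-free process), assumed form."""
        weighted = sum(g * w for g, w in zip(grades, weights)) / sum(weights)
        return 1.0 + weighted            # stays within the FIDES range of 1 to 2

    # Hypothetical audit scores for thirteen LF_grade recommendations
    grades  = [0.2] * 10 + [0.8, 0.6, 0.4]
    weights = [1.0] * 13

    lam_base = 50.0                      # FIT, assumed base failure rate
    factor = pi_lf(grades, weights)
    print(f"Pi_LF = {factor:.2f}, adjusted lambda = {lam_base * factor:.1f} FIT")

A fully uncontrolled lead-free process (all grades at 1) would give Π_LF = 2 and double the predicted failure rate, which is the upper bound described above.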

Author(s)
Murilo Levy Casotti, José Carlos Boareto, Orestes Estevam Alarcon, Andre Oliveira,
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

Failure Analysis Case Studies on Solder De-Wetting for Electronics Products

Over many years, defect analysis has been used at the company to determine the root cause of defects experienced in the field on customers' electronic products. Based on this work, it has been found that around 25 percent of all case studies have been due to de-wetting issues.

De-wetting is a solder joint issue in which the molten solder and the substrate/component repel each other during the soldering process. As a result, a very weak intermetallic bond, or none at all, is formed at the interface after reflow, leading to defective and unreliable solder joints.

Case studies in this area will be reviewed based on root cause analysis and countermeasures to prevent these defects. The case studies relate to inferior component/board plating quality, contaminated plating on both the PCB and the component, Foreign Object Debris (FOD) as a cause of solder de-wetting, damaged component plating, and improperly offset solder paste printing. The results of the failure analysis are reported.

Author(s)
Jasbir Bath, Kentaro Asai, Shantanu Joshi, Jack Harris, Roberto Segura
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

Analyzing Printed Circuit Board Voiding and other Anomalies when Requirements Covering the Anomalies are Vague

Two independent Printed Circuit Board (PCB) suppliers found unusually high voiding anomalies in multiple manufacturing lots of PCBs that were processed over a 5-month period. The issue was noted during conformance coupon inspection of an initial lot and was subsequently determined to be isolated to the prepregnated portion of pure polyimide constructions in a number of PCB manufacturing lots. During the study, the team investigated numerous variables potentially associated with the root cause of the anomaly, including but not limited to raw materials, conformance coupon preparation and processing issues. The objective of this paper is to describe the approaches and techniques used in this void-anomaly case history when requirements are not always clearly stated.

An initial review meeting was requested by one of the PCB suppliers and was attended by the PCB manufacturer, procuring activity, design agent, and raw material manufacturer. The purpose of the meeting was to identify the anomalies, categorize them, and determine acceptability based on product specifications. Also addressed during the review were potential impacts to product functionality. The following categories were used for anomaly identification: glass tear out, void/striation, bundle cracks, foreign material, and unknown. The categorization was conducted to help the team determine the appropriate acceptability criteria for each attribute. A Pareto analysis indicated that the top two anomalies observed were void/striation and glass tear out. In many cases, it was difficult to determine the difference between the two. A striation (or tunnel void) is a void in the resin between the filaments of the fiberglass bundle. A tear out is a condition where sections of glass bundles are removed from the potted coupon as a result of coupon preparation (grinding and polishing). Based on MIL-PRF-31032/1 (2020), voids/striations are a rejectable condition if they are out of the thermal zone and greater than 0.08 mm, and/or if they reduce the dielectric to below the minimum requirement. Upon further investigation, it was found that the anomalies were potentially related to the glass style used in the prepregnated layers.
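As a simple illustration of how such criteria can be captured for screening, the sketch below encodes the rejection logic summarized above as a boolean check. The field names and the interpretation of the compound condition are assumptions for illustration, not an implementation used by the authors or a substitute for the specification.

    # Illustrative sketch: void/striation acceptance check modeled on the
    # MIL-PRF-31032/1 criteria summarized above. Field names are assumed.
    from dataclasses import dataclass

    @dataclass
    class VoidObservation:
        in_thermal_zone: bool        # within the thermal zone of the coupon
        size_mm: float               # longest dimension of the void/striation
        remaining_dielectric_mm: float
        min_dielectric_mm: float     # minimum dielectric required by the design

    def is_rejectable(v: VoidObservation) -> bool:
        """Rejectable if outside the thermal zone and larger than 0.08 mm,
        or if the remaining dielectric falls below the minimum requirement."""
        too_large = (not v.in_thermal_zone) and v.size_mm > 0.08
        thin_dielectric = v.remaining_dielectric_mm < v.min_dielectric_mm
        return too_large or thin_dielectric

    # Example: a 0.10 mm striation outside the thermal zone with adequate dielectric
    print(is_rejectable(VoidObservation(False, 0.10, 0.12, 0.09)))   # True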

This paper provides a methodical approach to investigating particular anomalies in PCBs to determine acceptability. It may be utilized as a guideline for others facing similar anomalies associated with PCBs.

Author(s)
Wade Goldman, Hailey Jordan, Curtis Leonard
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

A Framework for Large-Scale AI-Assisted Quality Inspection Implementation in Manufacturing Using Edge Computing

In recent years, neural-network-based deep learning models have demonstrated high accuracy in object detection and classification in the area of digital image processing. The manufacturing industry has successfully implemented prototypes and small-scale deployments that employ artificial intelligence (AI) models for quality inspection. These prototypes and small-scale deployments have shown that AI-assisted quality inspection can significantly improve inspection accuracy, operational throughput and efficiency. However, the industry-known challenge of Operational Technology (OT) and Information Technology (IT) integration arises when scaling up AI-assisted quality inspection in manufacturing operations. While model accuracy is the main concern from an inspection point of view, the IT implementation has to meet requirements for high availability, scalability, security, and model and device lifecycle management.

This paper discusses in detail the challenges of large-scale deployment of AI models for quality inspection and introduces a framework for large-scale AI-assisted quality inspection in a manufacturing environment using an edge computing architecture. The framework focuses on the IT architectural decisions needed to fulfill the OT requirements, including the user experience in the quality inspection ecosystem.

Keywords: Quality Inspection, AI Models, Edge Computing
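As a minimal sketch of the kind of edge-side loop such a framework has to manage, the example below separates model lifecycle (pulling the currently published model version from a registry) from local inference on captured images. The registry URL, function names and record shapes are hypothetical placeholders, not the authors' framework.

    # Minimal, hypothetical sketch of an edge inspection loop: check a model
    # registry for a newer published version, then run local inference.
    import json
    import time
    import urllib.request

    REGISTRY_URL = "http://registry.example.local/models/solder-inspect/latest"  # hypothetical
    current_version = None
    model = None

    def load_model(version: str):
        """Placeholder for loading the downloaded model into the edge runtime."""
        print(f"loading model version {version}")
        return object()

    def classify(model, image_bytes: bytes) -> str:
        """Placeholder inference call; a real deployment would invoke the model."""
        return "pass"

    def sync_model():
        global current_version, model
        with urllib.request.urlopen(REGISTRY_URL, timeout=5) as resp:
            meta = json.load(resp)                 # e.g. {"version": "1.4.2", ...}
        if meta["version"] != current_version:
            model = load_model(meta["version"])
            current_version = meta["version"]

    def inspection_loop(capture_image, report_result):
        while True:
            try:
                sync_model()                       # lifecycle: stay on the published model
            except OSError:
                pass                               # registry unreachable: keep current model
            if model is not None:
                result = classify(model, capture_image())
                report_result(current_version, result)
            time.sleep(1.0)

Keeping lifecycle and inference separate is one way to meet the availability requirement noted above: the edge device keeps inspecting with its last known-good model even when the central registry is unreachable.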

Author(s)
Feng Xue, Charisse Lu, Christine Ouyang, James Hoey, Rogelio Fernando Gutierrez Valdez, Richard B Finch
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

The Case for an Electronics Supply Chain Blockchain

Blockchain technology has received a lot of publicity in the electronics industry because of the way it has been used to address issues with sharing data across distributed networks, and it is recognized for providing "greater transparency, enhanced security, improved traceability, increased efficiency and speed of transactions, and reduced costs". The challenge for the electronics industry is to set up an electronics supply chain blockchain architecture correctly so that those benefits can be leveraged quickly while expanding the blockchain back into sub-tier suppliers. This paper will discuss key blockchain use cases within the supply chain, how to determine whether a blockchain solution is the right answer for a problem, the key design considerations for a blockchain solution, and the importance of standards. Some of the lessons learned from recent efforts will also be shared, along with a discussion of some of the key challenges.

Keywords: blockchain, supply chain, electronics industry
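To make the traceability claim concrete, the sketch below shows the core mechanism a blockchain ledger relies on: each record carries the hash of the previous one, so altering any earlier entry invalidates everything after it. This is a generic illustration, not the architecture or data model discussed in the paper.

    # Generic illustration of hash-chained supply chain records (not the paper's design).
    import hashlib
    import json

    def add_record(chain: list[dict], payload: dict) -> None:
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        body = {"prev_hash": prev_hash, "payload": payload}
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append(body)

    def verify(chain: list[dict]) -> bool:
        """Recompute every hash; any tampered entry breaks the chain."""
        prev = "0" * 64
        for rec in chain:
            body = {"prev_hash": rec["prev_hash"], "payload": rec["payload"]}
            ok = (rec["prev_hash"] == prev and
                  rec["hash"] == hashlib.sha256(
                      json.dumps(body, sort_keys=True).encode()).hexdigest())
            if not ok:
                return False
            prev = rec["hash"]
        return True

    ledger: list[dict] = []
    add_record(ledger, {"part": "U7", "lot": "A12", "supplier": "sub-tier-2"})
    add_record(ledger, {"part": "U7", "event": "received", "site": "assembly-1"})
    print(verify(ledger))                       # True
    ledger[0]["payload"]["lot"] = "B99"         # tamper with an earlier record
    print(verify(ledger))                       # False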

Author(s)
Michelle Lam, Dave Verburg, Curtis Grosskopf
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

Enriching Test Equipment Analytics with Structured Logging

Automated test equipment plays a large role in the manufacturing process at the Kansas City National Security Campus (KCNSC). Every test run generates data which is used to determine if a part meets its requirements or to troubleshoot failures if they occur. Among this data are log events which contain unique information about each test run. Ordinarily, these events are rendered to a string and routed to a file, database, or status window. Strings are used because they are easy to create and store, are human-readable, and are capable of encoding many data types. 

When troubleshooting failures, these logs are a critical resource. Log files can be browsed manually to find key information; however, it can quickly become tedious to correlate a specific log event across multiple runs. At the KCNSC, tester software now utilizes an approach known as structured logging. A structured log event message is still defined using a string, but one that is annotated to indicate which parts were derived from data. This enhances each event, allowing it to be treated as a collection of properties rather than a simple string.

Using this approach to define log events sacrifices no flexibility while greatly improving data accessibility. With access to this data, tester teams can quickly find answers to their questions about tester or part performance. For example, one tester exhibited an issue with crosstalk between digitizer channels. Using log aggregation software, the structured logs were queried and used to produce plots indicating on which channels the issue most frequently appeared. This minimized the effort required for inspecting signal routing through the tester. Analysis like this is critical to reducing downtime during troubleshooting and also allows for proactive monitoring in a way that is not feasible with conventional unstructured logs.
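As a small illustration of the idea, the sketch below renders a human-readable message from an annotated template while also emitting the extracted properties as a queryable JSON record. The event text and fields are made up for the example rather than taken from the KCNSC tester software.

    # Illustrative structured logging sketch: one annotated template yields both a
    # readable message and a machine-queryable record. Standard library only.
    import json
    import logging

    logging.basicConfig(level=logging.INFO, format="%(message)s")
    log = logging.getLogger("tester")

    def log_event(template: str, **properties) -> None:
        """Render the template for humans, keep the properties for analysis."""
        record = {"template": template, **properties}
        log.info(template.format(**properties))           # e.g. for the status window
        with open("tester_events.jsonl", "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")             # e.g. for log aggregation

    # Hypothetical digitizer measurement during a test run
    log_event("Channel {channel} crosstalk measured at {crosstalk_db:.1f} dB",
              channel=4, crosstalk_db=-62.3, serial_number="SN1234")

Because the channel number and measurement survive as named properties rather than being baked into a string, a log aggregator can later group or plot them across many test runs.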

Author(s)
Jack Savage, Mike Tohlen, Alex Czarnick
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

A Guide to Manufacturing Data Analytics

Data is the New Oil

Clive Humby, UK mathematician and architect of Tesco’s Clubcard, is widely credited as the first to coin the phrase, in 2006: “Data is the new oil. It’s valuable, but if unrefined it cannot really be used. It must be changed into gas, plastic, chemicals, etc. to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value.”

There is no doubt that we are witnessing an authentic technical revolution. Call it Industry 4.0, Internet of Things, Big Data, or Artificial Intelligence; the proliferation of solutions facilitating Industry 4.0 is accelerating.

Whatever perspective we take on this fast-paced evolution, one asset is at the center of everything: data.

However, contrary to what happened in previous industrial revolutions, where manufacturing was the focus of the revolution, manufacturing has lagged in implementing the base technologies underlying this data transformation. It has been conservative and extremely slow to realize that the application of these technologies is invaluable, perhaps much more so than in any other segment. Manufacturers are refining their data, but at a much slower pace.

Case in point: the term “The Internet of Things” was coined by Kevin Ashton in a presentation to Procter & Gamble in 1999, and more than 20 years later, manufacturing is still learning the meaning of IoT and trying to devise strategies to take advantage of it.

Author(s)
Francisco Almada Lobo & Dave Trail
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021

Advanced, Non-Real-Time Uses of Machine Data for Factory Operational Improvement

EMS factories have collected and used machine data for many decades. Over that time, much of the value derived from machine data collection has come from three operational use cases: allowing fewer operators to simultaneously monitor more machines for errors, reducing common operational mistakes through programmatic interlocks, and maintaining traceability records in case of product recalls. There has been significantly less use of machine data for strategic optimization of factory operations, with the notable exception of asset utilization monitoring using simple calculations like Line Utilization and OEE. One of the historical reasons for the absence of large-scale analysis of machine data in the EMS industry has been that it was difficult to interpret machine data absent external context on what intended operation was being performed when the data was collected. More recently however, the advent of big data analysis techniques and machine learning algorithms has largely removed this traditional limitation. In this paper we discuss the difference between tactical and strategic data analysis approaches to the common EMS factory goals of lowering component attrition and increasing line utilization. We show how machine data can provide significant value at the strategic level if it is stored and analyzed in granular detail instead of being pre-aggregated into high level key performance indicators before being analyzed. As the EMS industry looks forward to Industry 4.0, we argue that one of the biggest areas of efficiency gain may come from such strategic data analysis.
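As a point of contrast with the pre-aggregated KPIs mentioned above, the sketch below computes OEE from its standard Availability x Performance x Quality definition using granular per-event machine data, so the underlying events remain available for deeper analysis. The event schema and numbers are assumptions for illustration.

    # Illustrative OEE calculation from granular machine events (assumed schema).
    # OEE = Availability x Performance x Quality.

    events = [
        # assumed machine-state records from one shift on one line
        {"state": "running", "duration_s": 21600, "boards": 900, "defective": 12},
        {"state": "down",    "duration_s": 3600,  "boards": 0,   "defective": 0},
        {"state": "running", "duration_s": 3600,  "boards": 120, "defective": 3},
    ]

    PLANNED_TIME_S = 28800          # 8-hour shift
    IDEAL_RATE_BPS = 180 / 3600     # ideal rate: 180 boards per hour (assumed)

    run_time = sum(e["duration_s"] for e in events if e["state"] == "running")
    boards   = sum(e["boards"] for e in events)
    good     = boards - sum(e["defective"] for e in events)

    availability = run_time / PLANNED_TIME_S
    performance  = boards / (IDEAL_RATE_BPS * run_time)
    quality      = good / boards

    print(f"Availability {availability:.2%}  Performance {performance:.2%}  "
          f"Quality {quality:.2%}  OEE {availability * performance * quality:.2%}")

Because the calculation starts from the raw events rather than a pre-computed OEE figure, the same data can later be sliced by recipe, shift or machine state to ask the strategic questions the paper describes.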

Author(s)
Timothy M Burke
Resource Type
Technical Paper
Event
IPC APEX EXPO 2021