Step 12: Stencil Printing for Wafer Bumping: Statistical Process Control
BY RONALD C. LASKY
With the advent of type 5 and type 6 solder paste, and much effort in process development, stencil printing for wafer bumping has arrived. As with stencil printing for PCB assembly, controlling the volume of the stencil-printed solder paste "brick" is crucial for high yields in wafer bumping. This need can best be met with an effective statistical process control (SPC) program for the wafer printing process. The purpose of this article is to outline how to establish an SPC program for a wafer bumping process.
Attribute Vs. Variables Data
High end-of-line yield is the goal of all processes. The data related to yield typically are called "attribute" data. In a process related to stencil printing of wafers, the attribute data might be shorts, opens, or lack of solder bump coplanarity. It usually is advantageous to plot such data in a "Pareto" graph.1 A Pareto graph of fails helps determine which failure modes are causing the most yield loss, information that is crucial in developing a process improvement plan. A little thought, however, reveals that attribute data alone cannot directly fix the problems. As an example, let's assume that our printing process is producing some shorts. Knowing this information does not help us solve the problem. However, if we are also measuring the volume of the stencil-printed bricks, we might find that the brick volume has increased well beyond the control limits of the process. Hence, this out-of-control variable (i.e., the brick volume) is too high and is causing the shorts. Such volume data belong to a class called variables data. Any good SPC program monitors both types of data to improve process yields effectively. The proper use of both attribute and variables data is crucial to an effective continuous improvement plan.
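As an illustrative sketch, the Pareto ranking described above can be computed directly from a table of failure counts. The failure modes and counts below are hypothetical, not measured data:

```python
# Hypothetical end-of-line failure counts for a wafer-bumping line
# (illustrative numbers only, not measured data).
failures = {"shorts": 120, "opens": 45, "non-coplanarity": 25, "missing bump": 10}

# Pareto analysis: sort failure modes by count, descending, and
# accumulate the percentage of total fails each mode accounts for.
total = sum(failures.values())
cumulative = 0.0
print(f"{'Failure mode':<18}{'Count':>6}{'Cum %':>8}")
for mode, count in sorted(failures.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    print(f"{mode:<18}{count:>6}{cumulative:>8.1f}")
```

The top one or two bars of such a chart typically identify where a continuous improvement effort should focus first.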
Wafer Bumping Vs. PCB Assembly
In PCB assembly, SPC data collection of variables data typically monitors the average and range (referred to as Xbar-R) of a small sample of stencil-printed brick volumes. A similar approach would be helpful in an SPC program for wafer bumping. Since it has been determined that the stencil-printed "brick" volume is going to be measured, the first step is to verify the precision of the test equipment. To accomplish this task, a gage repeatability and reproducibility (often called gage R&R) study must be performed to evaluate the precision of the volume measurement system.2 Typically, one would like less than 10% of the variation to be introduced by the measurement system; the gage R&R analysis shows whether or not this is the case. Fortunately, tools are now emerging that can achieve this type of precision. The gage R&R analysis also helps determine which sites on the wafer provide the best data. After this gage analysis is performed, we are ready to collect data.
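A full gage R&R crosses operators and parts in an ANOVA, but the core idea, comparing the measurement system's variation to the total observed variation, can be sketched for the repeatability component alone. All readings below are illustrative:

```python
import statistics

# Simplified gage-repeatability sketch (one operator, repeats only; a
# full gage R&R would also estimate reproducibility across operators).
# Each inner list holds 3 repeat volume readings (mils^3) of one wafer
# site; the numbers are illustrative, not measured data.
repeats = [
    [101.2, 100.8, 101.0],
    [ 95.4,  95.9,  95.6],
    [104.1, 103.7, 104.3],
    [ 99.0,  99.5,  99.2],
    [ 97.8,  98.1,  97.6],
]

# Repeatability: pooled within-site variance (equipment variation).
gage_var = statistics.mean(statistics.variance(r) for r in repeats)

# Total variance of all readings (gage plus true site-to-site variation).
all_readings = [x for r in repeats for x in r]
total_var = statistics.variance(all_readings)

pct_grr = 100.0 * gage_var / total_var
print(f"%GRR (variance basis): {pct_grr:.1f}%")  # want well under 10%
```

If the measurement system's share of the variance exceeds roughly 10%, the tool, not the printing process, may be driving the readings.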
As an example, consider an 8-in. wafer that has 200,000 sites to be bumped. The pad sites are on 10-mil centers, and the 3-mil-thick stencil used for printing has square apertures with 6-mil sides. The stencil aperture volume is 108 mils3 (6 x 6 x 3 mils). In our SPC program, we might measure 200 sites on each wafer and calculate an average and a range for each sampling. With this many sites sampled, however, it probably is more helpful to calculate the average and the standard deviation, σ, instead of the range. Let's put some numbers to this process scenario.
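A minimal sketch of these calculations, using the example's aperture dimensions and Python's standard statistics module:

```python
import statistics

# Aperture volume for the stencil in the example: 6 x 6 mil square
# apertures in a 3-mil-thick stencil.
side_mils, thickness_mils = 6.0, 3.0
aperture_volume = side_mils * side_mils * thickness_mils
print(aperture_volume)  # 108.0 (mils^3)

def summarize(volumes):
    """Summarize one wafer's sampled brick volumes (e.g., 200 sites)
    with the mean and sample standard deviation, sigma, rather than
    the range."""
    return statistics.mean(volumes), statistics.stdev(volumes)
```

With 200 readings per wafer, the standard deviation uses all the data in each sample, whereas the range reflects only the two extreme readings.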
Assume that during our process development efforts, designed experiments showed that as long as the standard deviation of 200 readings on a wafer was < 8 mils3, we had acceptable coplanarity performance. This being the case, to be conservative, we might set our specification for the standard deviation at less than 7 mils3. Therefore, we would want our process to deliver a standard deviation consistently less than 7 mils3. Figure 1 shows this situation: we want the upper control limit of the process (essentially the maximum that the process will deliver, in a 3-sigma sense) to be comfortably below the upper specification limit.
Although less important, our process development data showed that the grand average of the means was 100 mils3. Since our specification is that the standard deviation be below 7 mils3, we would expect any individual value of solder paste printed brick volume to be 100 ± 3σ, or about 79 to 121 mils3. We make our specification a little tighter than 3 sigma and set it at 80 to 120 mils3. However, we expect our process to deliver better than this target, as the historical σ from process development work was better than 7 mils3.
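These limits follow directly from the grand mean and the worst acceptable sigma:

```python
# Expected spread of individual brick volumes, using the article's
# historical grand mean (100 mils^3) and the worst acceptable
# standard deviation (7 mils^3).
grand_mean, sigma_spec = 100.0, 7.0
low = grand_mean - 3 * sigma_spec   # 79.0 mils^3
high = grand_mean + 3 * sigma_spec  # 121.0 mils^3
print(low, high)

# The slightly tighter specification chosen in the text:
LSL, USL = 80.0, 120.0
```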
With our process well established, we go into production. For our SPC program, we measure 200 stencil-printed brick volumes on each wafer. After 100 wafers, we analyze the data with a statistical software tool.2 The data confirm that our process is well in control, as seen in Figure 2.
Figure 2. SPC process capability analysis. Note the "S" chart, which shows the upper control limit to be well below our specification of < 7.
Note that the upper control limit of the standard deviation is 5.741 mils3, well below the specification of < 7 mils3. In addition, the Xbar chart shows a grand average of 99.96 mils3, very close to that of the process development analysis. With the process standard deviation about 5 mils3, comfortably below the 7 mils3 specification, the Cpk of our process is 1.33; in other words, we have a "4 sigma process." 3
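For a two-sided specification, Cpk is the lesser of (USL − mean)/3σ and (mean − LSL)/3σ. A quick check with the example's numbers (σ = 5 mils3 is an illustrative value consistent with the quoted Cpk of 1.33):

```python
def cpk(mean, sigma, lsl, usl):
    """Process capability index Cpk for a two-sided specification:
    the lesser of the distances from the mean to each spec limit,
    measured in units of 3 sigma."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

# Spec 80-120 mils^3, grand mean ~100 mils^3, sigma = 5 mils^3
# (illustrative sigma consistent with the quoted Cpk of 1.33).
print(round(cpk(100.0, 5.0, 80.0, 120.0), 2))  # 1.33
```

A Cpk of 1.33 means the nearer specification limit sits 4 sigma from the process mean, hence the "4 sigma process" shorthand.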
Setting up an SPC program should have at least two objectives:
- To monitor and control a process to provide high yields
- To serve, together with a Pareto analysis of attribute data, as the foundation for a continuous improvement program for the process
SPC programs are well established in PCB assembly. With the advent of wafer bumping, an SPC program should be developed for these processes as well. Since a critical failure mode in wafer bumping is non-coplanarity, the standard deviation of the volume of the stencil-printed bricks may be the most important variable to monitor.
- Vilfredo Pareto, 1848-1923, Italian economist. His Pareto graphs led to the development of the now famous "80-20 Rule": 80% of problems are caused by 20% of variables.
- A trial download of Minitab14 is available at www.minitab.com.
- Montgomery, Douglas, Applied Statistics and Probability for Engineers, 3rd edition, p. 595, John Wiley & Sons, 2001.
RONALD C. LASKY, Ph.D., PE, senior technologist at Indium Corp., as well as a visiting professor at Dartmouth College, may be contacted at 26 Howe St., Medway, MA 02053; e-mail: email@example.com.