
Reduction of defects in the lapping process of silicon wafer manufacturing: a Six Sigma application



Introduction

The semiconductor industry consists of companies engaged in the design and fabrication of semiconductor devices, which form the foundation of modern electronics. The industry began in the 1960s and currently accounts for about 0.5% of the global GDP ($299.5 billion). It also enables the generation of approximately $1200 billion in the electronic systems business and $5000 billion in services, representing close to 10% of the global GDP, thus gaining recognition for its critical role in the supply chain of the electronics industry (Kazmierski, 2012).

The semiconductor industry consists of three key sectors: silicon ingot growing, silicon wafer manufacturing and the fabrication of integrated circuit (IC) chips. The silicon ingot growing process is contingent on many factors, such as size, specifications and quality, so the ingot growing time can range from a week to a month. The next stage is wafer production, whereby a fully grown silicon ingot is sliced into wafers of different thicknesses. The sliced wafer is then subjected to the flattening process and finally undergoes the fabrication process that produces IC chips. The focus of this research project is on silicon wafer manufacturing, which itself involves several stages.

Silicon wafer manufacturing consists of four key value-adding processes: slicing, lapping, chemical etching and polishing. The primary aim of the slicing process is to define the crystal structure of the wafer and obtain the best possible shape. The wafer is subjected to high-pressure cutting to achieve the correct wafer thickness, but on the downside, this causes high surface damage and contamination. The second stage is lapping, which removes the surface damage caused by slicing. It is also critical in defining the flatness of the wafer, as a failure to achieve the optimum wafer flatness can lead to complete wafer rejection. Wafers are then exposed to abrasive chemicals during the chemical etching stage, which helps to remove impurities from the wafer surface. Finally, wafers that have been manufactured to the required flatness standards, with no surface damage, are polished on one side to give a smooth, mirror-like finish, on which IC chips can be fabricated. This process flow is summarised in Tab. 1.

Four key stages of silicon wafer manufacturing process

Process Flow | Description | Process Objective | Process Demerits
Slicing | The silicon ingot is mounted on the slicing machine. A web of metal wire, along with cutting slurry, passes slowly through the ingot to produce disc-shaped silicon wafers | To generate the wafer slice structure; to achieve correct wafer thickness, orientation and warp | Poor flatness; high surface damage; high contamination
Lapping | Sliced wafers held in carriers are placed between two metal plates while abrasive slurry falls onto the wafers. Rotation of the plates and carriers causes mechanical removal of silicon, thereby reducing wafer thickness | Reduce slicing damage; establish optimum wafer flatness | Susceptible to surface and flatness rejects
Chemical Etching | Lapped wafers are subjected to abrasive chemicals, which dissolve silicon into the chemical, thereby thinning the wafers | Reduce previous process damage; reduce surface contamination | Degrades flatness; wafer staining
Polishing | Etched wafers are mounted on plates. With pressure applied to these plates, the wafers are rubbed against fine abrasive polishing pads along with polishing slurry to give a mirror-like finish | Establish optimum flatness; eliminate surface damage; minimise contamination level | High rejection rate; rework cost

The lapping process helps to achieve the maximum wafer flatness. Wafer flatness, measured as Total Thickness Variation (TTV) in microns, is one of the critical-to-quality (CTQ) requirements for a silicon wafer. This project aimed to reduce the number of TTV rejects in the lapping process.

Literature review

Quality is an elusive and abstract concept (Hossain, Tasnim & Hasan, 2017; Wright, 1997). In a study by Evans and Dean (2002), managers from 86 firms in the United States were asked to define “quality”. They found several definitions ranging from waste elimination, conformity to customer requirements, policy compliance, consistency in output and getting it right the first time to customer satisfaction. Broadly, quality can be understood along four dimensions: excellence, value, conformance to specifications and the extent to which expectations are met (Yong & Wilkinson, 2002). Quality as excellence is believed to be immeasurable, and control can only be exerted through the investment of maximum effort and the best skill sets. This approach has been criticised for being of little practical value to an organisation (Garvin, 1988). The quality-as-value definition focuses on external effectiveness (meaning costs) and internal efficiency. This definition, once again, has been argued to lack practical and measurable parameters for organisational use (Yong & Wilkinson, 2002). A more quantifiable and objective definition of quality is conformance to specifications, which was dominant in the 20th century. It means that an outcome must not deviate from the specifications set by an organisation, and any deviation is considered as lowering the quality (Reeves & Bednar, 1994). Such conformance measures have been argued to increase the internal efficiency of the process and sale prices over time (Topalovic, 2015). Later, as the focus shifted from manufacturers to customers, a new definition of “quality” emerged, i.e. meeting or exceeding customer expectations. Anything that does not satisfy the customer is considered to be of low quality (Yong & Wilkinson, 2002). However, customer requirements are variable and subjective, which makes them difficult to satisfy. These intricacies associated with the concept of quality highlight the need for a management system that ensures the manufacturing of good-quality products, resulting in monetary profits and customer satisfaction. This leads to the concept of quality management.

The revolution in quality management began in Japan during the 1950s and gradually gained momentum in the rest of the world during the 1970s and the 1980s (Foley, 2004). One of the most influential quality movements during this era was total quality management (TQM), which again owes its origin and conceptual development to Japan (Cole, 1998; Esaki, 2016; Juran, 1995). TQM eliminated the weaknesses of previous quality improvement techniques, which made it an efficient quality management tool. TQM is an integrated approach that stresses a top-down approach, staff engagement in the process, evidence-based decision-making and the consideration of customer requirements (Tobin, 1990). Despite its influential and successful reputation, many publications have documented unsuccessful implementation stories of the TQM concept in the manufacturing industry (e.g., Brown et al., 1994; Eskildson, 1994; Cao et al., 2000; Nwabueze, 2001). Based on the evidence from independent publications by consulting firms, it can be argued that two-thirds of TQM implementation efforts failed to produce any significant improvement in the overall quality of the product, financial gains or the company’s competitive situation in the industry (Jimoh et al., 2018). This is partly due to the ever-changing and evolving definitions of TQM, which can mean different things to different people, making its implementation insufficiently consistent and reliable (Andersson et al., 2006; Boaden, 1997; Talapatra, Uddin & Rahman, 2018).

Two other branches of quality management were introduced during the 1960s: reliability engineering and zero defects. The reliability engineering technique has roots in the disciplines of pure probability and statistics. It was mostly used in the USA and aimed to apply principles of probability to reduce the defect rates of durable products (Dimitri, 1991). The zero defects strategy, which originated in the USA during the 1960s, was considered to be the most optimistic approach in the quality management field as it aimed to achieve the complete elimination of defects and process failures (Crosby, 1979). Both concepts, however, received criticism for being impractical and expensive (Crosby, 1984). It can, thus, be argued that the different quality management concepts differ in their origin, aim, definitions, methodology and focus, thereby confusing rather than informing the reader. Furthermore, each of the techniques cited above has limitations that reduce their applicability and anticipated benefits (Kedar et al., 2008). Many organisations have reported difficulties in the implementation of quality management programmes (Brown et al., 1994; Eskildson, 1994; Harari, 1997; Nwabueze, 2001), citing the lack of inherent connectivity between their parts and missing information about some relevant aspects, as summarised in Tab. 2.

Overview of key quality management concepts

Concept | Origin | Aim | Focus | Methodology and Tools | Criticisms
Total Productive Maintenance | Japan (the 1950s) | Increase process capability by reducing unplanned failures, accidents and defects | Preventive and predictive maintenance of processes | Gap analysis of historical records, cause-effect analysis | Skilled workers required to implement, resource demanding and long-term
Total Quality Control | Japan (the 1960s) | To coordinate quality maintenance and improvement from all groups to achieve the most economical process | To reduce rework and achieve maximum customer satisfaction | Methodology: Plan Do Study Act; Tools: statistical techniques | Vague and difficult to coordinate
Total Quality Management (TQM) | Japan (the 1990s) | Improve the quality and consistency of processes | Customer satisfaction | Methodology: Plan Do Study Act; Tools: statistical techniques | Vague and inconsistent conceptualisation, excessive resource consumption, unsatisfactory results
Zero Defects | Denver Division of the Martin Marietta Corporation (the 1960s) | To enhance the quality of a process outcome through the elimination of any defects in the production process | Defect elimination | Extra attention and care devoted to each step of the production process, ensuring no mistakes | Expensive
Reliability Engineering | Shewhart during the 1920s and the 1930s; cited in Kapur and Lamberson (1977) | To reduce failure modes | Longevity and dependability of parts, products and systems | Reliability-Centred Maintenance, failure modes and effects, root cause analysis, condition-based maintenance | Technical and requires skilled staff; expensive and demands long-term commitment

None of the quality management tools discussed so far had global success, and quality managers were still in search of a complete quality management programme when Six Sigma arrived. Six Sigma is a systematic set of guidelines that aims to significantly improve the quality of a manufacturing process and reduce costs by minimising process variation and reducing defects. It utilises statistical tools that can be applied either to facilitate new product development or to drive strategic process improvement (Breyfogle et al., 2001). Six Sigma gives organisations a competitive edge because it provides financial, business and personal benefits: financial, through the optimal and efficient use of resources; business, by ensuring maximum customer satisfaction; and personal, by enhancing the skills of individuals and, thereby, increasing their employability. In the last decade or so, there has been a rapid uptake of the Six Sigma technique as a process change, management and improvement strategy by global industries, which has helped them beat market competition and maximise yearly savings (Su & Chou, 2008; Yang & Hsieh, 2009).

This uptake spans a wide range of sectors, including manufacturing (Al-Aomar, 2006; Gangidi, 2019; Valles et al., 2009), financial organisations (Brewer & Eighme, 2005), engineering firms (Bunce et al., 2008), hospitals and intervention clinics (Craven et al., 2006), banking, hospitality, pharmaceutical companies (Cupryk et al., 2007), chemical industries (Doble, 2005), educational institutions (Bandyopadhyay & Lichtman, 2007), the software industry (Arul & Kohli, 2004), call centres (Schmidt & Aschkenase, 2004), utility service providers (Agarwal & Bajaj, 2008), the automobile sector (Gerhorst et al., 2006), information technology (Edgeman et al., 2005), human resources departments (Wyper & Harrison, 2000), military administration units (Chappell & Peck, 2006) and even government departments (Furterer & Elshennawy, 2005). A summary is presented in Tab. 3.

Selected success stories of the Six Sigma implementation in industries

Authors | Name of the organisation | Benefits of implementing the Six Sigma technology
Financial sector
Rucker (2000) | Citibank group | Numerous benefits have been reported across different organisations of this group. They successfully halved their credit processing time and reduced internal call-back time by 80% and external call-back time by 85%. The time between a customer first placing an order and the actual service delivery, as well as the credit decision cycle, was reduced from 3 days to just 1 day. The time taken to process a statement was also decreased from 28 days to only 15 days
| JP Morgan Chase (Global Investment Banking) | Improved customer experience in using the bank's services, such as account opening, balance enquiry, and making transfers and payments online or by cheque, leading to increased customer satisfaction and a reduction in process cycle time by more than 30%
Antony (2006) | British Telecom wholesale | Financial benefits of over $100 million, greater customer satisfaction, error reduction
Roberts (2004) | Bank of America | 24% reduction in customer complaints and a 10.4% increase in customer satisfaction
| Sun Trust Banks | Significant improvement in customer satisfaction
Bolt et al. (2000) | American Express | Improved the external vendor-related processes and reduced the number of non-received renewal cards
Manufacturing sector
Antony (2006) | Motorola (1992 and 1999) | 1992: achieved a dramatic reduction in process defect levels, by about 150 times; 1999: huge financial gains of about $15 billion over 11 years
| Honeywell | Profit of $1.2 billion
| Texas Instruments | Achieved a financial gain of over $600 million
| Johnson and Johnson | A financial gain of about $500 million
| Telefonica de Espana (2001) | Whopping increase in revenue of about 30 million in the first 10 months, as well as gains in savings
| Dow Chemical / rail delivery project | Reported substantial savings in capital expenditures of over $2.4 million
McClusky (2000) | AlliedSignal / Bendix IQ brake pads | The cycle time of their production-shipment process decreased by 10 months (18 to 8 months)
| AlliedSignal / Laminates plant in South Carolina | Reaped a range of benefits: capacity almost doubled, punctuality in delivering goods reached the 100% threshold level, and cycle time and inventory were each reduced by 50%
| DuPont / Yerkes plant in New York (2000) | Increase in yearly savings of over $25 million
| Seagate Technology | Gained financial profits of about £132 million in just 2 years
| General Electric | Increase in yearly financial savings by about $2 billion
| Hughes Aircraft's Missiles Systems Group / wave soldering operations | The quality of their yields improved by about 1000% and productivity by 500%
| Raytheon / Aircraft Integration Systems | Achieved a significant reduction (approx. 88%) in the inspection time spent on the depot maintenance process
McClusky (2000) | GE / railcar leasing business | Achieved a 62% reduction in the time spent at repair stops
Healthcare sector
Benedetto (2003) | Radiology film library, Anderson Cancer Centre | Service quality improved
| Outpatient CT exam lab at the University of Texas | Preparation and waiting times for patients reduced from 45 min to 5 min; a dramatic increase of 45% in the daily number of examinations without an increase in workforce or equipment
Engineering and Construction sector
Byrne (1998) | General Electric | In 1997, made a profit of $320 million, which was more than double the goal of $150 million; in 1999, annual savings of $2 billion
Magnusson et al. (2003) | Volvo Cars (Sweden) | Profit of over 55 million euro in the years 2000 and 2002
Anderson et al. (2006) | Business Unit of Transmission and Transportation Networks at Ericsson | Savings of over 200 million euro between the years 1997-2003

The origins of Six Sigma lie in statistics, and the term owes its origin to the terminology employed in the statistical modelling of manufacturing processes. A Six Sigma process is one that produces no more than 3.4 defects or non-conformances per million opportunities (DPMO). A defect is anything that does not conform to the manufacturer’s guidelines or the customer’s specifications, and an opportunity is any chance for such a defect to occur. The sigma level, also known as the Z-value, is used as a capability index for the process, which indicates how well that process can meet the customer’s requirements (Bothe, 2001; Da Silva et al., 2019). Each sigma level corresponds to a certain number of defects/non-conformances associated with the process, as shown in Fig. 1.

Fig. 1

Sigma levels depending on DPMO

Each manufacturing process has set specification limits for process and product quality. If six standard deviations can fit between the statistical mean of a process and its nearest specification limit, then all aspects of that process would meet the specification criteria. This distance between the process mean and the specification limit is measured in sigma units and is known as the process capability. The process capability measurement index used here is the process performance index (Ppk). Once a process has been brought under statistical control through the implementation of a Six Sigma project, Ppk estimates how stable these improvements would be in the long term and how closely they would meet customer expectations. The larger the Ppk value, the lower the process variability and the higher the long-term stability. To satisfy customers, the Ppk value should be greater than 1.67 (Kotz, 1993; Raman & Basavaraj, 2019).
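For illustration, a minimal sketch of this Ppk calculation is given below in Python; the sample data are hypothetical, and for a one-sided characteristic such as TTV only the upper specification limit applies.

```python
import numpy as np

def ppk(data, lsl=None, usl=None):
    # Process performance index: distance from the overall mean to the nearest
    # specification limit, expressed in units of three overall standard deviations.
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    indices = []
    if usl is not None:
        indices.append((usl - mu) / (3 * sigma))
    if lsl is not None:
        indices.append((mu - lsl) / (3 * sigma))
    return min(indices)

# Hypothetical TTV readings (microns) assessed against the loose upper spec of 3.5 microns
ttv = np.random.default_rng(1).normal(loc=1.8, scale=0.3, size=200)
print(round(ppk(ttv, usl=3.5), 2))  # values above 1.67 indicate a capable, stable process
```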

The literature indicates that even if a process achieves a high sigma level in the short term, its performance can decline over the long term; a common research finding is that it might fall from the Six Sigma level to the 4.5 sigma level (Alexander, Antony & Rodgers, 2015; Pandey, 2007). This happens because the process mean may ‘drift’ over time. Such shifting leads the process mean to move away from the target, thus reducing the number of standard deviations that can fit between the process mean and the closest specification limit. This is commonly known as a 1.5 sigma shift. So, the standardised definition of Six Sigma quality considers this shift and guarantees that a six-sigma process will produce no more than 3.4 DPMO (Antony, Snee & Hoerl, 2017; Harry, 1988).
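The relationship between the sigma level and DPMO, including the conventional 1.5 sigma shift, follows directly from the standard normal distribution; the short Python sketch below is included only to make this arithmetic explicit, and the example inputs are illustrative.

```python
from scipy.stats import norm

def dpmo_from_sigma(sigma_level, shift=1.5):
    # Long-term defect rate implied by a short-term sigma level,
    # allowing for the conventional 1.5 sigma drift of the process mean.
    return norm.sf(sigma_level - shift) * 1_000_000

def sigma_from_dpmo(dpmo, shift=1.5):
    # Inverse conversion: an observed long-term DPMO back to an equivalent sigma level.
    return norm.isf(dpmo / 1_000_000) + shift

print(round(dpmo_from_sigma(6.0), 1))     # about 3.4 DPMO for a Six Sigma process
print(round(sigma_from_dpmo(44_282), 2))  # sigma level implied by a DPMO of 44,282
```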

Some previous attempts to resolve the problem of high TTV had failed. Obtaining high TTV rejects had been a recurrent issue at the organisation for a significantly long time. Earlier, the traditional problem-solving technique called One Factor At a Time (OFAT) had been applied in an attempt to reduce TTV rejects. OFAT is an experimental technique that evaluates the impact of potential factors on the process outcome one at a time while keeping the other factors constant. However, these attempts failed to identify the root cause of the problem. Further expert consultations and a systematic review of the literature indicated that the technique called Factorial Experimental Design (FED) had better potential than the OFAT strategy. FED evaluates the effects of several factors, together with their interactions, on the process outcome simultaneously. This gives FED an edge over the OFAT technique, wherein only one factor can be evaluated at a time.

Many studies have compared OFAT and FED, the two problem-solving techniques, and FED is considered to be more effective than OFAT for the following reasons (Czitrom, 1999), illustrated by the numerical sketch after this list:

In FED, the investment of comparatively fewer resources (time, money and material) results in more, and more accurate, information. This makes it extremely useful in industries where the time and financial costs of running a process are extremely high;

Interactions between factors cannot be identified using the OFAT technique, which relies on a trial-and-error method, whereas the FED technique provides a systematic procedure for estimating interactions between several factors;

Each observation carried out during a factorial experiment considers all the factors and their interactions, which estimates the effects of the factors much more precisely. In contrast, OFAT typically uses only two observations to measure the effect of one factor, so these estimates are subject to greater variability.
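To make the contrast concrete, the sketch below (in Python, with entirely hypothetical response values) estimates both main effects and the interaction from a 2x2 full factorial design; an OFAT experiment varying one factor at a time from a baseline run could not estimate the interaction term at all.

```python
import numpy as np

# Hypothetical 2x2 full factorial: factors A and B at coded levels -1/+1,
# with one response measurement per run (four runs in total).
runs = np.array([
    # A,  B,  response
    [-1, -1, 5.0],
    [+1, -1, 3.2],
    [-1, +1, 4.8],
    [+1, +1, 1.4],
])
A, B, y = runs[:, 0], runs[:, 1], runs[:, 2]

# Factorial effect estimates: every run contributes to every estimate.
effect_A = y[A == +1].mean() - y[A == -1].mean()
effect_B = y[B == +1].mean() - y[B == -1].mean()
effect_AB = y[A * B == +1].mean() - y[A * B == -1].mean()  # interaction, invisible to OFAT

print(effect_A, effect_B, effect_AB)
```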

The current study used the FED technique to address the problem of high TTV rejects in the lapping process. At a broader level, the Six Sigma methodology was applied for improving the output quality of the lapping process. Six Sigma emphasises the need to identify and clearly define customer requirements and internal industrial factors for setting goals. The data-driven rigour component of the Six Sigma approach delineates objective decision-making purely guided by the statistical analysis of data to determine process strengths and weaknesses. It is crucial to this approach that a solution is not offered until the problem has been clearly and completely defined (Ishikawa, 1985; Kume, 1985, 1995; Hoerl, 1998; Sreedharan et al., 2019).

Research methods

The Six Sigma methodology was applied to resolve the issue of high TTV rejects in the wafer lapping process of the semiconductor manufacturing industry. The problem addressed by the current project was formulated as follows: the mandatory replacement of slurry in the lapping process results in poor wafer flatness, causing TTV rejects to increase from 0.1% to 4.43%, or a loss of £58k/month.

The overall aim of this research was to evaluate the effectiveness of Six Sigma in improving the quality output of the lapping process in the silicon wafer manufacturing industry.

Specific research objectives were to test the utility of the Six Sigma methodology in:

Identifying the factors responsible for high TTV defect rates;

Implementing sustainable long-term process improvements that will reduce the defect rate to <0.1%;

Increasing the Lapping Process Performance index (Ppk) to >1.67;

Delivering at least a £25k saving to the company by the end of the project.

The rationale for the project selection was based on the experience of one of the authors of this paper, who worked as a process engineer in the world’s leading semiconductor wafer manufacturing company and received black belt level in-service training in the Six Sigma methodology. The issue of high TTV rejects had been causing huge financial losses (> £50k/month) to the organisation and was topping the priority list of the senior management. It was, thus, considered necessary to apply the statistically validated, well-known, effective strategy of Six Sigma to address this problem. A team of the appropriately qualified technical staff was delegated to undertake this task, under the leadership of a trained Six Sigma black belt specialist, the first author of this article.

DMAIC, a Six Sigma process, was employed to achieve the above-stated objectives.

Six Sigma, inspired by Deming’s Plan-Do-Check-Act cycle, has two popular methodologies, namely, DMAIC and DFSS. The DMAIC methodology is utilised for improving an existing process, whereas DFSS is used for the development of a new product. The current investigation followed the DMAIC principles through the following five stages:

Define is the most important and critical stage of the Six Sigma process. First and foremost, project scoping and mapping are carried out, which helps explain the basic problem to all the team members. Then, current process defects are defined according to the customer’s preferences; these are known as Critical to Quality (CTQ) metrics;

Measure: at this stage, in accordance with the set goal and CTQ specifications, further measurements of key elements for the concerned process are carried out;

Analyse: this stage includes an in-depth and scientific investigation of the collected data; statistical tools are utilised to identify and assess the statistical significance of associations between various aspects of a process. Any findings should be objective, valid and justifiable by means of data. Analysis results help reveal the root cause of the defect in the manufacturing process. Results from the Analyse stage may also be treated as pilot results, which are documented to facilitate the replication of the same process design in the future;

Improve: the results from the analyses are applied to eliminate the root cause of defects in a process, thereby improving the overall quality of the outcomes;

Control: the methodology and results of each stage are clearly spelt out in sufficient detail to allow replication in future processes, to identify and correct errors early and to prevent financial loss due to defects in the yield. At this stage, the Six Sigma tools of poka yoke, statistical process and quality control charts, and a control plan were used.

Research results

Stage 1 — Define: the key aim of this phase was to mutually agree on a clear and concise problem statement, gain a fuller understanding of the process and identify the manageable focus area.

To clearly define and communicate the issue at hand, the following problem statement was developed based on IS-IS Not Analysis: the mandatory replacement of slurry in the lapping process is resulting in poor wafer flatness causing TTV rejects to increase from 0.1% to 4.43% or a loss of £58k/month.

Process mapping: all the essential components and steps of lapping were outlined in detail, such as material flow, operational activities, and the resources and materials required; the desired standards of the outcome were also established. The aim is to provide all the team members with an understanding of the internal requirements of the process to allow for effective and quicker reviews in the case of any errors.

Critical to quality metrics: in Six Sigma, customer requirements are expressed through the Voice of Customer (VOC), which is converted into quantifiable Critical to Quality (CTQ) metrics. Tab. 4 displays the results on VOC and CTQ characteristics for both internal and external customers in the context of the current project. As lapping supplies two different products to polishing, there were two different CTQs for the same customer requirement (VOC). These CTQs were used throughout the project to assess improvements made to the lapping process.

Critical to Quality metrics of the lapping process

Type | Name | VOC | CTQ
External | Polishing | No sharp roll-off at the edge of the wafer | For loose spec, TTV < 3.5 μm; for tight spec, TTV < 2.0 μm
Internal | Lapping | Comparable yield to old active agent | TTV reject < 0.10 %
Internal | Lapping | Good flatness | For tight spec, TTV < 2.0 μm

Cause and Effect (C&E) Diagram is also known as the fishbone, 6M or Ishikawa diagram (Ishikawa, 1968). It is a tool that facilitates brainstorming, identifying the causes of an effect under the six broad categories of Measurements, Material, Man, Environment, Methods and Machines. In the current project, poor flatness, or TTV reject (the CTQ), was the effect, and the diagram captured the potential factors responsible for poor flatness. The initial project scoping identified over one hundred potential causes of TTV rejects. It would have been excessively time-consuming to analyse all of these factors, so it was important to prioritise the potential causes and reduce the project scope to a manageable extent, which was achieved by using the Y=f(x) cascade tool, as shown in Fig. 2.

Fig. 2

Y = f (x) cascade

All causes identified during the making of the C&E diagram were combined into broad categories to form clusters. These clusters (five in the current scenario) formed the highest level in the Y=f(x) cascade, which was drilled down into lower levels until it reached a manageable scope, as shown in Fig. 2. The key input variables to be investigated are highlighted in blue and green.

It should be noted that Six Sigma is an iterative process, and it must continue until the desired results are achieved. So, in this case, if the chosen experimental variables had not resulted in any improvement, the remaining inputs would have been selected for the next iterative cycle.

Stage 2 — Measure: from the Define stage, fourteen input variables were identified, for which a data collection plan was developed using Kipling’s checklist, as shown in Tab. 5.

Data collection plan

Item No. | What | Why | When | How | Where | Who
1 | Wafer Traceability | To correlate different lots with poor TTV | Every lot | Data capture application | Central Database Browser | Lapping Op
2 | TTV | KPOV, CTQ | Every wafer | ADE measurement, data upload at lapping | Central Database | CW Insp. Op
3 | Slurry Mixing Time | KPIV | Every time fresh slurry is prepared | Machine setting (SOP) | Slurry Sheet | Lapping Op
4 | Slurry Density | KPIV | Every time fresh slurry is prepared | Manual measurement | Control Charts | Lapping Op
5 | Machine Flowrate | KPIV | Every time the loop was changed | Manual measurement | Control Charts | Lapping Op
6 | Plate Shape | KPIV (can affect wafer shape) | Every 40 hr of operation time | Using a dial gauge | Plate Shape Sheet | Lapping Op
7 | Recycle Slurry Status | To evaluate the impact of different slurry | Every 5 mins | Data capture application | BMS Database | PSE Op
8 | Active Agent Volume | KPIV (can affect slurry viscosity) | Every time fresh slurry is prepared | Machine setting (SOP) | Slurry Sheet | Lapping Op
9 | Sun Gear Ratio | KPIV (can affect wafer rotation) | Any time changed by the engineer | Machine setting (SOP) | QA Records | Engineer
10 | Bottom Plate Speed | KPIV (can affect wafer rotation) | Any time changed by the engineer | Machine setting (SOP) | QA Records | Engineer
11 | Exhaust Timer | KPIV (can affect wafer rotation) | Any time changed by the engineer | Machine setting (SOP) | QA Records | Engineer
12 | Acceleration Timer | KPIV (can affect wafer rotation) | Any time changed by the engineer | Machine setting (SOP) | QA Records | Engineer
13 | Slurry Temperature | KPIV (can affect slurry viscosity) | Every time fresh slurry is prepared | Machine setting (SOP) | Slurry Sheet | Lapping Op
14 | Plate Temperature | KPIV (can affect slurry viscosity) | At the start of the shift | Digital thermometer | Lot Processing Sheet | Lapping Op

MSA, the measurement systems analysis, was conducted on all the measurement devices used for data collection, aiming to identify sources of variation introduced by the measurement process. It included checks on the measurement devices, the personnel engaged in the data collection process, their skill sets, the adequacy and accuracy of specifications, the raw material and the measurement procedure. In the current study, two MSA tools were used: a Gage R&R study and a Gage Bias & Linearity study. Results from both studies suggested that all measurement tools were fit for purpose.

Establishing the baseline DPMO: first and foremost, the starting point for the process was established with the help of DPMO and the process capability (Ppk). The key CTQ was TTV, based on loose and tight specifications. The process capability for both types of material was calculated using the Minitab software, as shown in Fig. 3 and Tab. 6.

Fig. 3

Process capability of TTV for Tight and Loose Spec Material

Key results from process capability charts

Parameters | Tight Flatness Spec | Loose Flatness Spec | Combined
Wafer Qty | 5,623 | 28,120 | 33,743
TTV Reject % | 9.82 | 3.35 | 4.43
DPMO | 98,168 | 33,499 | 44,282
Process Capability (Ppk) | 0.21 | 0.52 | N/A

As shown in Tab. 6, the combined TTV reject rate was very high at 4.43%, with TTV rejects for the tight flatness specification being higher than those for the loose flatness specification. For a Six Sigma project to be considered successful, it should reduce DPMO to one-tenth of the baseline value, which means the combined TTV DPMO should be less than 4,428.

Stage 3 — Analyse: during this phase, the collected data was analysed through a systematic application of statistical and graphical tools.

Control charts, also known as Shewhart charts, were used to identify key trends and generate clues.

Although there are different kinds of control charts, in the current study, Xbar-S and I-MR charts were used together with boxplots. Further investigation into out-of-control lots gave more clues, from which a number of Multi-Vari charts were generated, but only two charts displayed a significant trend (Fig. 4).

Fig. 4

Multi-Vari charts for Avg and Std Dev TTV by LSM reject — Run order

Avg TTV refers to the mean TTV for a lot; it should be noted that a lot can contain from 100 to 320 wafers. This Avg TTV is then split by three factors: lapping machines, lots with Line Saw Mark (LSM) rejects and the run order of lots during the shift. The right-hand graph is identical, except that Std Dev of TTV is plotted on the Y-axis. In summary, the following inferences could be extracted from the Multi-Vari charts (Fig. 4):

The green trend line in Fig. 4 shows that, generally, Avg TTV for the first lot of the shift was comparatively higher than for the rest of the lots processed during the same shift. It means that something at the start of the shift was not correct, which resulted in high Avg TTV. Factors that were different at the start of the shift and stabilised during the shift were the slurry temperature, slurry mixing time and plate temperature;

Generally, the wafers with Line Saw Mark (LSM) reject have a higher Avg and Std Dev TTV than the wafers without LSM reject. This is an important finding as it indicates that due to the poor quality of an incoming wafer, the wafer was unable to rotate freely at lapping. Wafer rotation could be affected by factors like sun gear ratio, plate speed, acceleration time and exhaust time;

Generally, lapping machines have no significant impact on Avg and Std Dev TTV. It means that the problem is global and related to something that was common to all lapping machines, like slurry composition, slurry type etc.

Stage 4 — the Improve phase: based on the enhanced learning gained from the Analyse phase, a list of factors that can influence TTV was generated by the team using the brainstorming technique. To characterise the impact of these input variables on TTV, a Design of Experiments (DoE) approach was used. As part of DoE planning, a CNX diagram was generated, as shown in Fig. 5.

Fig. 5

CNX Diagram for DOE

The list of experimental variables was still too long, so only a screening DOE was feasible. The Taguchi L12 design was used to rank the factors in order of their impact on TTV: by performing 12 lapping batches, nine experimental factors, each at two levels, were evaluated for their impact on TTV, as shown in Fig. 6 and Tab. 7.
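As an illustration of how such a screening analysis is summarised, the Python sketch below computes level means and the delta for each factor from a small, hypothetical two-level design with made-up TTV responses; ranking the factors by delta then gives a response table of the kind shown in Tab. 7.

```python
import numpy as np

# Hypothetical screening data: a design matrix of coded factor levels (1 or 2)
# with one averaged TTV response per run; values are made up for illustration.
design = np.array([
    [1, 1, 1],
    [1, 2, 2],
    [2, 1, 2],
    [2, 2, 1],
])                                   # rows = runs, columns = factors
response = np.array([1.25, 1.10, 0.78, 0.70])
factors = ["Sun Gear Ratio", "Plate Temp.", "Slurry Mixing Time"]

for j, name in enumerate(factors):
    level_means = [response[design[:, j] == level].mean() for level in (1, 2)]
    delta = abs(level_means[0] - level_means[1])
    print(name, [round(m, 4) for m in level_means], "delta =", round(delta, 4))
# Factors are then ranked by delta: the larger the delta, the stronger the main effect.
```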

Fig. 6

Main effect plots for means

Response Table for Means

Level | Sun Gear Ratio | Bottom Plate Speed | Exhaust Timer | Acceleration Timer | Plate Temp. | Recycle Slurry Status | Active Agent Concentration | Slurry Temp. | Slurry Mixing Time
1 | 1.2893 | 1.0338 | 1.0053 | 1.0092 | 1.1997 | 0.9940 | 1.0120 | 0.9957 | 1.0062
2 | 0.7123 | 0.9678 | 0.9963 | 0.9925 | 0.8020 | 1.0077 | 0.9897 | 1.0060 | 0.9955
Delta | 0.5770 | 0.0660 | 0.0090 | 0.0167 | 0.3977 | 0.0137 | 0.0223 | 0.0103 | 0.0107
Rank | 1 | 3 | 9 | 5 | 2 | 6 | 4 | 8 | 7

From the Taguchi DOE, the top two ranked factors (the sun gear ratio and the plate temperature) seemed to have a significant impact on TTV. Although the factor ranked third (the bottom plate speed) did not seem significant, it was still chosen for further analysis. As Taguchi is only a screening DOE, it is always recommended to perform a more comprehensive DOE to confirm the results. So, a two-level, three-factor full factorial design of experiments (DOE) with three centre points was designed to characterise and optimise the three experimental factors.
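A minimal sketch of such a two-level, three-factor full factorial with centre points is given below (Python, with made-up coded responses); it fits a model containing the main effects and two-factor interactions and uses the centre points to check for curvature, in the spirit of the analysis summarised in Figs. 7 and 8.

```python
import itertools
import numpy as np

# Hypothetical coded design: 2^3 full factorial (8 corner runs) plus 3 centre points
# for the three factors retained after screening; the responses are illustrative only.
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
centres = np.zeros((3, 3))
X = np.vstack([corners, centres])
y = np.array([0.95, 0.80, 0.88, 0.55, 0.90, 0.72, 0.84, 0.46,   # corner runs
              0.78, 0.80, 0.77])                                 # centre points

# Model matrix: intercept, main effects and two-factor interactions.
A, B, C = X[:, 0], X[:, 1], X[:, 2]
M = np.column_stack([np.ones(len(y)), A, B, C, A * B, A * C, B * C])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print(coef)  # [intercept, A, B, C, AB, AC, BC] regression coefficients

# A gap between the corner-run mean and the centre-point mean hints at curvature,
# i.e. a non-linear response such as the plate-temperature effect seen in Fig. 8.
print(y[:8].mean() - y[8:].mean())
```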

The Pareto chart of the DoE results represents the relative impact of the different factors on the process outcome. As shown in Fig. 7, factor A (the sun gear ratio) had the biggest impact on TTV. This was followed by factor B and the interaction effect between factors A and B. The Pareto chart was also used to determine the statistically significant factors and interaction effects on the output: the factors above the red reference line were significant, and the factors below it were not.

Fig. 7

Normal and standardised effects for Pareto charts

The main effects plot (Fig. 8) indicated that an increase in the sun gear ratio, the plate temperature and the bottom plate speed resulted in reduced TTV. Further, as the centre points (marked in red) for all three factors were not on the line, the impact of these factors on TTV was not linear. In simple terms, increasing the plate temperature to 24 °C did not result in any reduction in TTV; however, a plate temperature closer to 30 °C was most likely to reduce the TTV reject rate.

Fig. 8

Main effects and interaction plots for avg TTV

To simplify this, Minitab’s response optimiser function was used, which calculated the values of the input variables required to achieve the optimum output. The results are shown in Fig. 9, suggesting that to achieve the minimum TTV (0.462), the inputs should be set to the following parameters (highlighted in red):

Sun Gear Ratio = 0.7;

Plate Temperature = 30 °C;

Bottom Plate Speed = 35 rpm.

Fig. 9

Response optimiser result
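A simple stand-in for this optimisation step is a grid search over the fitted response model; the Python sketch below is illustrative only, using a made-up prediction function rather than the project's actual fitted model.

```python
import numpy as np

def predict_ttv(ratio, temp_c, speed_rpm):
    # Hypothetical response surface standing in for the fitted DOE model.
    return 1.7 - 1.0 * ratio - 0.015 * temp_c - 0.003 * speed_rpm + 0.002 * (30 - temp_c) ** 2

# Evaluate the model over a grid of candidate settings and keep the minimiser.
candidates = [(r, t, s)
              for r in np.linspace(0.5, 0.7, 5)
              for t in np.linspace(24, 30, 7)
              for s in np.linspace(25, 35, 6)]
best = min(candidates, key=lambda p: predict_ttv(*p))
print(best, round(predict_ttv(*best), 3))  # settings with the lowest predicted Avg TTV
```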

To validate the theoretical predictive DOE model, a confirmation run should be conducted using the recommended input values. A confirmation run was carried out with six lapping batches, and the results are presented in Fig. 10.

Fig. 10

Graphical Summary of Confirmation Test Run Results

Using the recommended input values, the response optimiser predicted Avg TTV to be around 0.462 μm, which closely matches the confirmation run result of 0.460 μm (see Fig. 10). The confirmation test, thus, confirmed the validity of the theoretical model.

Pilot testing was used to validate the results of the Improve stage by implementing them on a larger sample size. An incremental implementation with a continuous review of results was conducted to reduce the risk of large rejects. Loose TTV spec material was run on one lapping machine for a limited period of one week, after which the results were reviewed. Since the results from this initial phase were as expected, the optimum condition was extended to the other machines, one at a time. Boxplot results from the pilot stage for loose spec material are shown in Fig. 11.

Fig. 11

Boxplot of TTV for tight flatness spec by improvement stages

There were quite a few wafers with TTV > 3.5 microns (the spec limit) before the improvement. However, a noticeable difference was visible after the introduction of the improved condition: TTV had become stable, and few wafers were above the spec limit. This improvement was statistically significant (W = 1367172420.0, p < 0.01). These data provided sufficient evidence in support of the improved condition for loose spec material. The conditions were then transferred to tight spec material, and the improvement in TTV was again statistically significant according to the Mann-Whitney test (W = 124139220.0, p = 0.00).
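For reference, the same kind of comparison can be reproduced outside Minitab; the sketch below applies SciPy's Mann-Whitney test to hypothetical before and after TTV samples, as the actual project data are not reproduced here.

```python
from scipy.stats import mannwhitneyu

# Illustrative, made-up TTV values (microns) before and after the improvement.
ttv_before = [3.1, 2.8, 3.6, 2.9, 3.4, 3.8, 2.7, 3.2]
ttv_after = [1.9, 2.1, 1.7, 2.0, 1.8, 2.2, 1.6, 1.9]

# One-sided test: is TTV before the change stochastically greater than after?
stat, p = mannwhitneyu(ttv_before, ttv_after, alternative="greater")
print(stat, p)  # a small p-value supports a genuine reduction in TTV
```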

Establishing the process capability and DPMO was the next step, aiming to quantify and validate the new, improved results. At the start of the project, the baseline for the lapping process was established using DPMO; this was now repeated for the improved process. The process capability charts are shown in Fig. 12. The key results from Fig. 12 and from the start of the project are summarised in Tab. 8.

Fig. 12

Process capability of TTV for tight and loose spec after improvement

Displaying the impact of improvements on TTV rejects

Parameters | Tight Flatness Spec (Before) | Loose Flatness Spec (Before) | Combined (Before) | Tight Flatness Spec (After) | Loose Flatness Spec (After) | Combined (After)
Wafer Qty | 5,623 | 28,120 | 33,743 | 19,677 | 50,666 | 70,343
TTV Reject % | 9.82 | 3.35 | 4.43 | 0.02 | 0.01 | 0.01
DPMO | 98,168 | 33,499 | 44,282 | 152 | 118 | 127
Process Capability (Ppk) | 0.21 | 0.52 | N/A | 3.87 | 1.30 | N/A

As shown in Tab. 8, there was a massive reduction in TTV rejects: from 9.82% to 0.02% for the tight flatness spec; from 3.35% to 0.01% for the loose flatness spec; and from 4.43% to 0.01% for the combined materials. This is also reflected in the DPMO values. The initial target of the project was to reduce TTV rejects to less than 0.1%; after the improvements, the TTV reject rate for both specs was found to be less than 0.03%. The process capability charts clearly demonstrated that the mean and variation for the loose spec material were comparatively higher than those of the tight spec material. Loose spec material went through other manufacturing processes that increased its TTV value, whereas tight spec material went straight to the TTV measurement stage. Hence, a true reflection of the lapping process capability could be evaluated using the tight flatness spec material only. The process capability index increased from 0.21 to 3.87, suggesting a significant improvement in the output quality of the process, as well as sustainable long-term stability of the lapping process in terms of controlling the TTV of wafers.

Stage 5 — the Control phase: poka yoke. This term, commonly referred to as “mistake proofing”, originated from the Japanese words “poka”, meaning “mistakes”, and “yokeru”, meaning “to avoid”. In the current scenario, the sun gear ratio and the bottom plate speed were identified as two key input variables for TTV control in the lapping process. So, the poka yoke method was exercised on the lapping machine, which would automatically prevent the machine from running if either of these two parameters was set incorrectly. This was achieved through continuous, automatic measurement of these parameters by the machine; as soon as incorrect values were identified, an alarm went off and the machine stopped running.
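A minimal software sketch of such an interlock is shown below; the parameter names and permitted values are assumptions made for illustration and do not represent the machine's actual control logic.

```python
# Hypothetical poka yoke interlock: refuse to start a lapping run if either
# key input variable deviates from its validated setting.
ALLOWED_SUN_GEAR_RATIO = 0.7
ALLOWED_BOTTOM_PLATE_SPEED_RPM = 35

def start_lapping_run(sun_gear_ratio, bottom_plate_speed_rpm):
    if sun_gear_ratio != ALLOWED_SUN_GEAR_RATIO:
        raise RuntimeError("Poka yoke: incorrect sun gear ratio, run blocked")
    if bottom_plate_speed_rpm != ALLOWED_BOTTOM_PLATE_SPEED_RPM:
        raise RuntimeError("Poka yoke: incorrect bottom plate speed, run blocked")
    print("Parameters verified, lapping run started")

start_lapping_run(0.7, 35)  # runs; any other combination raises an error and stops the machine
```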

The Statistical Process Control (SPC) chart: the underlying principle of Six Sigma is to reduce process output variation by controlling the key process input variables (KPIVs), which is done with the help of SPC charts. As discussed earlier, the second most critical parameter for good TTV is the plate temperature, to which poka yoke cannot be applied; it is, therefore, imperative to control it using SPC charts. The most appropriate SPC chart for this purpose is the Individual-Moving Range (I-MR) control chart.
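The I-MR control limits follow directly from the individual readings and their moving ranges; a minimal sketch with hypothetical plate-temperature data is given below, using the standard chart constants for subgroups of size two (d2 = 1.128, D4 = 3.267).

```python
import numpy as np

# Hypothetical plate-temperature readings (degrees C), one per shift start.
plate_temp = np.array([29.8, 30.1, 29.6, 30.3, 29.9, 30.0, 30.2, 29.7])
moving_range = np.abs(np.diff(plate_temp))
x_bar, mr_bar = plate_temp.mean(), moving_range.mean()

# Individuals chart limits and moving-range chart upper limit.
ucl_i = x_bar + 3 * mr_bar / 1.128
lcl_i = x_bar - 3 * mr_bar / 1.128
ucl_mr = 3.267 * mr_bar
print(round(lcl_i, 2), round(x_bar, 2), round(ucl_i, 2), round(ucl_mr, 2))
```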

Statistical Quality Control Chart is used to control the key process output variables (KPOVs). In this project, the KPOV was TTV, which could be controlled using the Mean and Standard Deviation (Xbar-S) chart.

Control Plan is the master document, which lists all the parameters of the process that should be monitored and controlled to achieve the desired results. It is of utmost importance to communicate the control procedure amongst the various stakeholders, and maintaining timely updates to the lapping process control plan was also important. Proper control and vigilance mechanisms for the key input parameters impacting the identified output, together with adequate controls for achieving quantifiable improvements, were crucial.

Discussion of the results

The current implementation of the Six Sigma methodology to reduce flatness defects in lapping was successful. The silicon wafer flatness was found to be dependent on several factors and their interactions. This project presented an interesting revelation regarding the key input variables that can affect lapping TTV. Based on the controlled design of experiments, it can be concluded that the sun gear ratio and the plate temperature were the two most critical input variables affecting TTV. The interaction between these two factors was also statistically significant. The sun gear ratio of 0.7 and the plate temperature of 30 ºC were the optimum settings to minimise TTV. However, care must be taken, as these were the optimum settings for one manufacturing plant location and were completely different from those at other locations using a similar manufacturing set-up.

The bottom plate speed was a statistically significant input factor. However, its impact on TTV was not as high as that of the other two significant factors (the sun gear ratio and the plate temperature). It was found that the bottom plate speed should be optimised at 35 rpm. Wafer rotation was critical for achieving the optimum TTV: lapping is a closed process, which makes it difficult to see what is going on inside the machine. As per the test results discussed earlier, increasing the sun gear ratio and the bottom plate speed helped with wafer rotation. Further, increasing the plate temperature reduced the viscosity of the lapping slurry, thereby also aiding wafer rotation. This suggested that if a silicon wafer does not rotate sufficiently in the lapping carrier, the result can be uneven lapping removal and, thereby, high TTV.

Detailed scoping of the problem at the early stages helped to generate the following solutions. As part of the Six Sigma approach, a clear statement of the problem was drafted at the start of the project. Consistent with the evidence of previous research (Mishra, 2018; Pyzdek, 2003), the scoping exercise largely consisted of ensuring that team members understood the nature of the problem and what was expected of them. A clear and shared understanding of the problem and the process at the early stages enabled working towards a common goal and the channelling of team members’ efforts in the right direction, avoiding any ambiguity or confusion. Following this, a detailed breakdown of the components of the problem facilitated a productive brainstorming session, resulting in a list of possible causes and solutions. For example, initially, a long list of over 60 potential causes was generated, and it was practically impossible to evaluate all of them at the same time. However, it did not take very long for the Six Sigma experts in the team to suggest solutions for dealing with such situations, which was only possible due to a clear and detailed written statement of the problem. The Six Sigma tools of project framing, the cause and effect diagram and the Y=f(x) cascade helped to shorten the long list of identified potential causes, thus making it easier to reduce the scope of the project to a manageable extent and providing the necessary focus to the evaluation.

A seamless, systematic roadmap of statistical thinking was ensured. Six Sigma is a step-by-step procedure for problem solving or process improvement. Previously, at this organisation, all the efforts to identify the root cause of high TTV rejects had failed, whereas with the help of the Six Sigma methodology, wherein every step was clearly laid out in sufficient detail, the problem-solving process was much more straightforward. Whenever the project met a roadblock, a range of statistical tools was available to facilitate the problem-solving process. For example, after using several statistical and graphical analysis techniques, such as Multi-Vari charts, there were still nine potential factors whose impact on wafer flatness was unknown. At that stage, the Taguchi screening DOE helped reveal that causal relationship. The learning point here is that since a wide range of tools and techniques is listed at every step of the five phases of the Six Sigma process, at least one of them is likely to work for the concerned process.

Statistical tools were used to validate the engineering models. Statistical knowledge forms the core of the Six Sigma methodology, which is yet another important learning point gained from the current project. The implementation of the Six Sigma methodology in the current project can be argued to be a creative interplay between three components: managerial objectives, the expansion of engineering knowledge and the application of statistical techniques. Statistical analyses were used to validate the theoretical engineering models.

The full factorial DOE technique gives the Six Sigma methodology an edge over traditional problem-solving techniques. The full factorial DOE tool, which also belongs to the Six Sigma toolkit, was particularly useful in identifying the key factors responsible for causing TTV rejects. Previous studies have also shown that the application of the Taguchi and full factorial DOE techniques helps to identify sources of variation in a process (Ghosh & Rao, 1996). Previous attempts at this organisation involved the use of the One Factor At a Time (OFAT) technique, which had repeatedly failed to identify the underlying cause in the current scenario. In hindsight, it is now logical to understand why the traditional problem-solving techniques had failed: in the current project, an interaction between two factors (the plate temperature and the sun gear ratio) was responsible for the high TTV rejects. Such an evaluation of the interaction between multiple input factors would have been impossible to perform using traditional problem-solving techniques, such as OFAT. The use of the full factorial DOE technique not only revealed the impact of all three significant factors but also imparted new insights into the interactions between these factors. It was found that unless these factors are set at their respective optimum levels simultaneously, setting the individual optimum values of these factors will still result in poor TTV values.

Implementation and control were also carefully considered. It had been argued in previous research (Sreedharan et al., 2019; Raman & Basavaraj, 2019) that after the desired results have been achieved through the Six Sigma methodology, if appropriate monitoring and control procedures are not established, the process will most likely regress to its original, flawed state (Antony & Coronado, 2002; Muraliraj et al., 2018). A similar phenomenon had been observed in this organisation, where an effective solution had temporarily fixed a problem, but a lack of appropriate control systems caused the re-appearance of the same issue. For such reasons, in the current project, much emphasis was put on the Control stage of the Six Sigma methodology, which was a new experience for the entire team and, probably, for the organisation too.

Limitations

How stable are the improvements in the lapping process? It has been shown that despite noticeable improvements achieved by a project in the short term, its sigma level and performance can decline over time by 1.5 sigma. The standardised six-sigma level accounts for such variations in the long term. Since the current process did not achieve the full six-sigma level, it is likely that, should a 1.5 sigma shift occur, the process capability might regress towards its original state. Nevertheless, another measure of the long-term stability of an improved process achieved through the implementation of Six Sigma methodologies is the process capability (Ppk) index. The Ppk index of the current process was found to be 3.87, which indicates that a sustainable solution to the problem has been identified and implemented. It also indicates that the correct factors responsible for poor wafer flatness have been identified and that adequate control has been exerted to maintain long-term process stability. Any decline in the process capability or quality over the long term should, therefore, be unlikely.

Conclusions

This study aimed to reduce flatness rejects in the lapping process of silicon wafer manufacturing using the Six Sigma methodology. This is a novel application of Six Sigma: previously, it has been implemented in several industries, such as finance, services, manufacturing and non-manufacturing, but not in the lapping process of silicon wafer manufacturing in the semiconductor industry. A significant reduction in rejects from 4.83% to 0.02% was achieved through the implementation of the Six Sigma methodology, resulting in savings of £57.5k/month. A significant increase in the capability index (Ppk) of the lapping process also occurred, which is indicative of enhanced product quality and efficiency, thereby increasing customer satisfaction. In the current project, three factors (the sun gear ratio, the plate temperature and the bottom plate speed) and the interaction between the sun gear ratio and the plate temperature were statistically significant. The optimisation and tighter control of these variables were key to the successful reduction in TTV rejects.

On a large scale, when dealing with such complex processes as lapping, it is often difficult to agree on a starting point and to contain the scope to a manageable extent. The Six Sigma techniques of Define and Measure provided much-needed direction and structure to the problem-solving process at an early stage. Previous attempts using traditional problem-solving techniques had failed since several potential factors and their complex interactions were responsible for the high TTV rejects. In contrast, the Six Sigma technique could quantify and rank-order several factors and their interactions in terms of their relative effect on the process outcome. Other Six Sigma pointers, such as the commitment of the top management, the interim publication of success stories and open communication, also helped to maintain motivation and focus throughout the project.

Overall, Six Sigma was shown to be an effective problem-solving and quality improvement statistical technique for the semiconductor industry, which led to a series of other benefits (such as an increase in process efficiency, a reduction in time and financial costs, the enhanced satisfaction of customers and employees, as well as the provision of innovative team-building and networking opportunities), in addition to the defect rate reduction in the lapping process. The relevance of Six Sigma in the manufacturing industry is irrefutable, but there may be a need to conduct further rigorous research on its utility across non-conventional sectors and the practicalities in terms of the costs and resources incurred.