
Design for Manufacturing

Using grid-based optical proximity correction verification at the 65-nm node and below

Travis Brist, Mentor Graphics

The 0.18-µm technology node has proven to be the mother of invention for the modern semiconductor industry. Because design features at larger technology nodes (e.g., 0.35 and 0.25 µm) were larger than the wavelength of light used in the photolithography process, ICs could be manufactured as drawn. But the 0.18-µm node posed new challenges, requiring a different set of rules and innovations, and the manufacturing status quo changed. At that node, geometric features shrank to a size smaller than the lithographic wavelength, and ICs could no longer be manufactured as drawn. In fact, they could not be manufactured at all without the assistance of resolution enhancement techniques (RET). A group of postlayout applications, RET includes optical proximity correction (OPC), also called optical and process correction, which “tricks” the light into creating a pattern transfer as close as possible to the intended design. RET applications have enabled foundries to extend the life of 248- and 193-nm manufacturing equipment longer than was thought possible in the subwavelength era.

Each new process node introduces manufacturing struggles that result in a cycle of low initial yields, a slower ramp-up to maximum yield, a lower maximum yield, and a longer time to market. Even so, advances in RET and in the lithography process allow the semiconductor industry to continue the progression of Moore’s Law.

For engineers working in manufacturing and RET, the 0.18-µm technology node seems a long time ago. Since then, much has changed in process technology and manufacturing. The increased complexity and continued shrinking of IC designs have resulted in a smaller lithographic process window, the range of dose and focus conditions over which features print within an acceptable critical dimension (CD) tolerance. Greater complexity and smaller feature sizes have also increased the lithographic process’s sensitivity to layout topology and to complex mask-rule constraints, affecting the application of RET. All of these conditions are major contributors to silicon failure.

Smaller technology nodes have brought lower k1 values, lower contrast, higher mask error enhancement factor (MEEF) conditions, and the need for tighter CD control to maintain yield, thus increasing the impact of process variability. As a result, the available process window continues to shrink, despite better reticle CD control and improved control over lithography process conditions and materials.

Process variability is of critical concern because it can have a catastrophic effect on yields. This is especially true where variability puts image fidelity at risk, even when the lithographic system’s operating conditions are acceptable. Adding to these systematic issues is the increasing complexity of RET applications, which can be rendered suboptimal by several factors: mask-rule constraints, setup-file errors (fragmentation, site placement), tile and hierarchy boundaries, modeling errors, and metrology errors. These constraints can contribute to RET failure and, subsequently, to manufacturing and yield failures.

OPC was at the forefront of design for manufacturing (DFM) long before that term came into vogue. Without OPC, the 0.18-µm technology node and all subsequent nodes would not have been possible. While efforts to detect and respond to manufacturing disturbances continuously reduce the magnitude of manufacturing variability, the problem persists, leading to increasing yield losses.

Leading-edge OPC (the 90- and 65-nm device generations) alters reticle-pattern data to compensate for process transformations, ensure adequate patterning margin for every feature in a design, and guarantee that the final wafer pattern fidelity meets product/circuit requirements.1 While the semiconductor industry continues to view OPC as a proven path to successful silicon, additional technologies are needed to mitigate the yield impact of manufacturing variability.

These issues must be identified with a fast, accurate, and advanced RET verification capability that detects yield-limiting conditions. This requirement will be especially important as the semiconductor industry ramps up the 65-nm node and begins to develop the 45-nm generation.

Many challenges face OPC at small technology nodes, including modeling, user rules, integration, and production.1 In addition, design errors can prevent OPC from functioning as intended or expected. New DFM methodologies, including those with litho-friendly design capabilities that capture process variation data for use in the design domain, are being used to help improve yields through better process-aware design practices.

Whereas the 0.18-µm node became the catalyst for advanced OPC and improved manufacturing, advanced OPC is becoming a catalyst for virtual manufacturing methods at the 65- and 45-nm nodes. Because of its complexity at these nodes, OPC requires a new method of verification to reduce the risk of silicon failure and ensure acceptable yields. A method that can detect lithographic errors or marginalities caused by process variability, and in which RET recipes can be simulated at the full chip level before the design goes to the mask or wafer manufacturer, will help avoid costly respins and associated delays in the time to market. The cost of yield losses is also critical: In 300-mm wafer fabs, a mere 1% yield loss costs chipmakers about $5 million annually.2

From Sparse to Dense OPC

With the push to 65 nm and 45 nm and with costs, yields, and revenues at stake, there is a strong desire in the IC industry to create a virtual manufacturing environment in which OPC recipes can be analyzed and optimized before chip manufacturing begins. Managing this technology leap for some designs at the 65- and 45-nm nodes may require a reevaluation of the industry’s approach to OPC. A method will be needed that can enable a more comprehensive accounting of sensitivities and variances across the process window. To achieve that goal, dense OPC is under consideration. In contrast to traditional, or sparse, OPC, dense OPC uses a grid format. The two approaches are compared in Figure 1.

Figure 1: Simulation using sparse, or traditional, OPC focuses on features in areas of concentration. In sparse simulations, some areas are oversimulated, while others are not simulated at all. In contrast, dense simulation is grid based and evenly distributed across the design.

Current OPC tools require users to define sparse simulation sites along the layout polygon edges. That is, the user defines the amount of fragmentation and where sites should be placed for simulation and correction by the OPC model. Typically, the results of this approach must be verified through simulation, after which changes must be made to the fragmentation, site placement, and the number of OPC iterations. This cycle is performed several times to achieve an optimal OPC application.

Dense OPC uses pixel-based simulation on a full chip. (In dense simulation, a single simulation point is often called a pixel.) This procedure creates simulated contours of how the design will look when manufactured and allows those contours to be analyzed to determine the optimal number of iterations during the OPC application process. Dense OPC can also eliminate the site and fragmentation discrepancies associated with sparse OPC; its advanced options enable users to automate site placement and fragmentation. In addition, it simplifies OPC configuration and can result in more thorough verification (simulation) of the full chip across the process window.
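To make the pixel-based approach concrete, the following Python/NumPy fragment is a minimal sketch of dense aerial-image simulation. It is not Mentor’s implementation: it assumes a sum-of-coherent-systems (SOCS) style optical model, in which the partially coherent image is approximated by a weighted sum of coherent convolutions, and it substitutes a toy Gaussian kernel for a real calibrated kernel set.

    # Sketch of dense (pixel-based) aerial-image simulation, SOCS style.
    # Illustrative only; kernel and threshold values are placeholders.
    import numpy as np

    def aerial_image(mask, kernels, weights):
        # mask: 2-D transmission values on the simulation grid.
        # kernels: complex kernels, same shape as mask, centered in the array.
        # weights: eigenvalue weight for each kernel.
        M = np.fft.fft2(mask)
        intensity = np.zeros(mask.shape)
        for kernel, w in zip(kernels, weights):
            K = np.fft.fft2(np.fft.ifftshift(kernel))  # kernel center to origin
            field = np.fft.ifft2(M * K)                # coherent convolution
            intensity += w * np.abs(field) ** 2        # incoherent sum of systems
        return intensity

    # Toy usage: one Gaussian kernel stands in for a real optical kernel set.
    n = 256
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    kernel = np.exp(-(x**2 + y**2) / (2 * 4.0**2)).astype(complex)
    kernel /= kernel.sum()

    mask = np.zeros((n, n))
    mask[100:156, 120:136] = 1.0            # one line feature, as drawn

    image = aerial_image(mask, [kernel], [1.0])
    contour = image > 0.3 * image.max()     # constant-threshold resist model

Because the same grid covers the whole layout, every pixel of the design is simulated, with no site placement decisions left to the user.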

Dense OPC simulation is used to obtain an aerial image intensity function on a 2-D grid. The aerial image is computed exactly at the grid points. As long as the grid spacing is at or below the Nyquist sampling limit, the aerial image intensity can be interpolated exactly, without error, at any location (x, y) in the continuum of points between grid samples.3 The image intensity is computed in the spatial-frequency domain, and the image contour is obtained using an inverse Fourier transform. As shown in Figure 2, the image intensity is a positive-valued 2-D function on the grid.

Figure 2: Dense image intensity simulation.
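The exact-interpolation property follows from the band-limited nature of the aerial image: for conventional partially coherent imaging, the intensity contains no spatial frequencies above roughly 2NA/λ, so a grid sampled at or finer than the corresponding Nyquist spacing loses no information. The following is a minimal illustrative sketch, not the tool’s method, of such interpolation via zero-padding in the Fourier domain, which is equivalent to ideal sinc interpolation for a periodically sampled band-limited image.

    # Sketch: exact interpolation of a band-limited, periodically sampled image
    # by zero-padding its Fourier spectrum. Assumes Nyquist-compliant sampling.
    import numpy as np

    def upsample_bandlimited(img, factor):
        n0, n1 = img.shape                        # original grid size (even)
        F = np.fft.fftshift(np.fft.fft2(img))     # center the spectrum
        pad0 = (factor - 1) * n0 // 2
        pad1 = (factor - 1) * n1 // 2
        Fp = np.pad(F, ((pad0, pad0), (pad1, pad1)))
        # Scale to preserve amplitude on the finer grid.
        out = np.fft.ifft2(np.fft.ifftshift(Fp)) * factor**2
        return out.real                           # intensity is real valued

Sampling the upsampled array between the original grid points reproduces the values that direct simulation on a finer grid would give, which is what allows contours to be extracted with sub-grid accuracy.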

A grid-based, or dense, verification tool can also be used on existing technology nodes. It provides a method of measuring pattern-transfer accuracy after OPC across process-window conditions, accurately checking for conditions that can result in manufacturing failures.

OPC Verification Methodology

The goals of OPC verification flow easily from the goals of OPC: Its purpose is to test whether OPC sufficiently compensates for process transformations and to ensure adequate patterning margin in order to meet production requirements.1 The key benefits of OPC verification are that it allows for the testing of RET recipes, determines pattern transfer, and finds critical yield-limiting errors as early in the product development flow as possible, where problem resolution is easiest. All of these steps are performed before actual chip manufacturing commences.

Dense OPC verification provides a thorough check of lithographic patterning failures that can affect yields, such as bridging, pinching, extra printing, and nonprinting features. It also checks for problems that can affect performance once yield has been achieved, such as CD variation and two-layer overlay marginality. Process-variation simulations consider user-specified focus and dose settings (or other optical variations) to predict layout features that will fail to meet CD requirements through the process window. Such CD variations as edge-placement errors, in turn, can be used to identify areas with inadequate RET treatment. An example of failures detected through process variations is presented in Figure 3.

Figure 3: Example of failures detected through process variations using dense OPC verification. The method accurately finds bridging at a subnominal dose, even when no bridging occurs under the best conditions.4
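In outline, such a process-variation check amounts to sweeping the simulator over a set of focus and dose conditions and flagging any feature whose printed CD leaves its tolerance band. The sketch below illustrates only the loop structure: it stands in a Gaussian blur for defocus and a threshold shift for dose, a crude placeholder for a real calibrated optical and resist model.

    # Sketch of a process-window sweep: simulate the printed pattern at several
    # focus/dose corners and flag CD failures. Placeholder optical model.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def printed_cd(mask, blur, threshold, row):
        # Printed linewidth (in pixels) of a vertical line, along one row.
        image = gaussian_filter(mask, sigma=blur)
        return int((image[row] > threshold).sum())

    mask = np.zeros((256, 256))
    mask[64:192, 120:136] = 1.0               # a 16-pixel-wide line, as drawn
    nominal_cd = 16

    for blur in (2.0, 4.0, 6.0):              # focus steps
        for threshold in (0.25, 0.30, 0.35):  # dose steps
            cd = printed_cd(mask, blur, threshold, row=128)
            if abs(cd - nominal_cd) > 0.1 * nominal_cd:   # ±10% CD tolerance
                print(f"CD failure at blur={blur}, dose={threshold}: CD={cd} px")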

Dense OPC verification can be used to monitor bridging, pinching, CD errors, extra printing features or nonprinting features, and edge errors.

Bridging. Instances of bridging can be checked using an outer tolerance zone or CD measurement check. The user can generate a simulation contour for a given dose and focus setting and identify bridging regions where the contour exceeds the tolerance zone or the separation between contours goes below a specified limit. Those data points can then be sorted according to the severity of the bridging region or be used to score a layout to identify regions of high failure concentrations in the design.

Pinching. Pinching cases can be checked using an inner tolerance zone. As with bridging checks, the user can generate a simulation contour for a given dose and focus, check for pinching under those process conditions, and then sort the data according to severity or score the layout. Again, any contour outside the parameters of the tolerance band or below a given CD tolerance is considered a catastrophic edge placement error (EPE) and is a potential failure. Tolerance bands are user defined and can be created within the verification setup file. While pinch and bridge checks can be completed using other methods, including CD-based checks, tolerance bands are exceptionally good for detecting difficult pinch/bridge scenarios.
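As an illustration of both checks, the sketch below implements a tolerance-band test on binary images of the printed contour and the drawn target: printed material outside the target grown by the outer band width indicates bridging risk, and unprinted material inside the target shrunk by the inner band width indicates pinching risk. The images, band widths, and helper name are hypothetical.

    # Sketch of tolerance-band bridging/pinching checks on a pixel grid.
    # printed, target: boolean 2-D arrays on the same simulation grid;
    # band widths are in pixels. Illustrative only.
    import numpy as np
    from scipy.ndimage import binary_dilation, binary_erosion

    def tolerance_band_check(printed, target, outer=3, inner=3):
        outer_band = binary_dilation(target, iterations=outer)  # outer zone
        inner_core = binary_erosion(target, iterations=inner)   # inner zone
        bridging = printed & ~outer_band   # print extends beyond the outer zone
        pinching = inner_core & ~printed   # print fails to cover the inner zone
        return bridging, pinching

Violation pixels can then be counted, clustered and ranked by severity, or accumulated across the layout to score regions with high failure concentrations.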

CD Errors. For gate CD checks, the user inputs the active-region and poly mask shapes into the OPC verification tool. The wafer contours are simulated at a given dose and focus. Internal checks can be performed in the active region to measure CD and quantify variation across the gate. Simulations can be checked for variations and weak points across the process window. These checks can be defined in the setup file of a tool such as Calibre OPCverify from Mentor Graphics (Wilsonville, OR) or specified through the verification center graphical user interface (GUI). In addition to a gate CD check (across-chip linewidth variation), a collection of contours across the process window can be generated to create process variability bands. These bands can be used to locate the regions of low contrast and high MEEF that are most affected by process variability and should be monitored during the manufacturing process.5
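Process variability bands of this kind can be built by overlaying the printed shapes from each process-window condition: pixels printed under every condition and pixels printed under any condition bound the band. A minimal sketch, reusing the placeholder blur/threshold model from the process-window example above:

    # Sketch of process-variability bands from overlaid process-window prints.
    # Placeholder optical model; illustrative only.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    mask = np.zeros((256, 256))
    mask[64:192, 120:136] = 1.0

    conditions = [(2.0, 0.25), (4.0, 0.30), (6.0, 0.35)]   # (blur, threshold)
    prints = [gaussian_filter(mask, s) > t for s, t in conditions]

    always = np.logical_and.reduce(prints)   # printed at every condition
    ever = np.logical_or.reduce(prints)      # printed at some condition
    band = ever & ~always                    # variability band: monitor here

Wide bands mark low-contrast, high-MEEF regions; narrow bands mark features that are robust across the process window.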

Extra Printing Features and Nonprinting Features. Lithographic features that are not intended to print, such as scattering bars, frequently do print, creating failure issues and yield loss. It is beneficial to predict and locate those features so that corrections can be made before manufacturing. In this area, dense OPC verification has a distinct advantage over the sparse method: with a sparse simulator, simulation occurs only where the user has placed simulation sites, and because subresolution assist features and side lobes carry no simulation sites, their printability is never simulated. Dense simulation is the only way to detect whether such features are printing. The dense simulator does not require simulation sites for each polygon; instead, the simulation is performed on a grid that is virtually overlaid on the design.
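One simple way to express such a check on a pixel grid is to label the connected regions of the simulated printed pattern and flag any region that does not overlap the drawn target. A minimal sketch with hypothetical inputs:

    # Sketch of an extra-printing check: flag printed regions (e.g., printing
    # assist features or side lobes) that do not overlap the drawn layout.
    import numpy as np
    from scipy.ndimage import label

    def extra_printing(printed, target):
        # printed, target: boolean 2-D arrays on the same grid.
        labels, count = label(printed)          # connected printed regions
        extra = np.zeros_like(printed)
        for i in range(1, count + 1):
            region = labels == i
            if not (region & target).any():     # region touches no drawn shape
                extra |= region
        return extra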

Edge-Error Detection. Edge-error detection measures the error between the simulated wafer pattern and the desired target (the edge-placement error).6 This feature is useful for detecting the amount of line-end pullback that can appear when user rules, design errors, and reticle manufacturing constraints prevent OPC from fully compensating for the pullback as determined by the OPC model.
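On the dense grid, an edge-placement error can be measured by sampling the image along a cutline perpendicular to a drawn edge and locating the sub-pixel position where the intensity crosses the resist threshold. The sketch below reuses the placeholder blur/threshold model from the earlier examples; the geometry and threshold are hypothetical.

    # Sketch of edge-placement-error measurement along one horizontal cutline.
    # Placeholder optical model; illustrative only.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    mask = np.zeros((256, 256))
    mask[64:192, 120:136] = 1.0      # drawn edge between pixel centers 135, 136

    image = gaussian_filter(mask, sigma=3.0)
    row, threshold, drawn_edge = 128, 0.3, 135.5

    profile = image[row]
    j = 127                          # a column safely inside the feature
    while profile[j] >= threshold:
        j += 1
    # Linear interpolation between the last pixel above threshold and the
    # first below it gives a sub-pixel printed-edge position.
    frac = (profile[j - 1] - threshold) / (profile[j - 1] - profile[j])
    printed_edge = (j - 1) + frac
    epe = printed_edge - drawn_edge  # negative values indicate pullback
    print(f"EPE = {epe:.2f} pixels")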

Conclusion

While there is no argument that advanced lithographic capability is needed to achieve the 65- and 45-nm nodes, RET is a time-consuming and intricate process. A dense verification approach is possible only because advanced processing power has been brought to the desktop. Five years ago, dense verification would have been impossible or would have required the use of a prohibitively expensive supercomputer. With the availability of distributed processing on relatively inexpensive off-the-shelf computer platforms, virtual manufacturing for small technology nodes has become a reality.

By its very nature, grid-based OPC verification is simple to set up, since it does not require users to specify simulation sites or define fragmentation. A key to its ease of use is a user-friendly verification center GUI that enables simple setup of the most common checks and is customizable for multiple applications and results outputs. Using available tools, such a GUI can be easily incorporated into the current lithographic flow.

Early prevention of yield-inhibiting problems is far better for the bottom line than late detection because it reduces costs, improves yields, and brings designs to market quickly. As a supplement to an existing system for checking pattern-transfer accuracy under nominal process conditions, lithography process-window verification provides value-added checking that is able to identify process-sensitive structures across the chip. From the standpoint of lithography-friendly design and DFM, this added value helps to improve OPC and design quality. Silicon-proven simulation models can provide 100% simulation coverage of the chip to ensure silicon patterning success, satisfying the need for both RET recipe validation and mask verification.

References

1. O Toublan et al., “Verification Requirements for 45nm and 65nm Optical Proximity Correction” (paper presented at the Interface Microlithography Symposium, Coronado Island, CA, October 23–25, 2005).

2. JR Lineback, “Even the Slightest Yield Losses Are Becoming More Expensive,” in “Confab Speakers Debate the New Economics of Chipmaking,” Solid State Technology [online] August 2005 [cited 9 December 2005]; available from Internet.

3. NB Cobb and Y Granik, “Dense OPC for 65nm and Below,” in Proceedings of SPIE, BACUS Symposium on Photomask Technology, vol. 5992 (Bellingham, WA: SPIE, 2005), 1521–1532.

4. J Belledent et al., “Critical Failure ORC: Application to the 90-nm and 65-nm Nodes,” in Proceedings of SPIE, Optical Microlithography XVII, vol. 5377 (Bellingham, WA: SPIE, 2004), 1184–1197.

5. JA Torres and N Cobb, “Study Towards Model-Based DRC Verification,” in Proceedings of SPIE, BACUS Symposium on Photomask Technology, vol. 5992 (Bellingham, WA: SPIE, 2005), 1049–1056.

6. Y Granik et al., “Subresolution Process Windows and Yield Estimation Technique Based on Detailed Full-Chip CD Simulation,” in Proceedings of SPIE, Process Control and Diagnostics, vol. 4182 (Bellingham, WA: SPIE, 2002), 335–341.


Travis Brist is an RET technical marketing engineer in the Calibre Design to Silicon division at Mentor Graphics (Wilsonville, OR). With 10 years of photolithography experience, he previously worked as a process development engineer at LSI Logic in Gresham, OR, and as a lithography engineer at Cypress Semiconductor in Round Rock, TX. He received a BSEE from Texas A&M University in College Station. (Brist can be reached at 503/685-1409 or travis_brist@mentor.com.)

