Nowadays, high-end photomasks are usually patterned with electron beam writers because they provide superior resolution. However, placement accuracy is severely limited by the so-called charging effect: each shot of the electron beam deposits charges inside the mask blank, which deflect the electrons of subsequent shots and therefore cause placement errors. In this paper, a model is proposed that predicts the deflection of the beam and thus provides a method for improving pattern placement on photomasks.
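The paper's actual charging model is not reproduced here; purely for illustration, the following sketch assumes that each previously exposed shot can be represented as a point charge in the blank and estimates the lateral displacement of a later shot from a small-angle Coulomb deflection. All parameters (beam energy, drift distance, charge per patch) and the point-charge representation itself are assumptions made for this example.

    import numpy as np

    EPS0 = 8.854e-12          # vacuum permittivity, F/m
    E_CHARGE = 1.602e-19      # elementary charge, C

    def deflection_nm(beam_energy_keV, charge_sites, shot_xy, drift_um=100.0):
        """Rough small-angle estimate of the lateral beam displacement (nm).

        charge_sites : list of (x_um, y_um, n_electrons) for previously
                       deposited charge patches (hypothetical representation).
        shot_xy      : (x_um, y_um) of the shot currently being written.
        drift_um     : assumed distance over which the deflected beam drifts.
        """
        e_kin = beam_energy_keV * 1e3 * E_CHARGE          # kinetic energy, J
        x0, y0 = shot_xy
        dx = dy = 0.0
        for (x, y, n) in charge_sites:
            bx, by = (x0 - x) * 1e-6, (y0 - y) * 1e-6     # impact vector, m
            b = np.hypot(bx, by)
            if b == 0:
                continue
            # small-angle Coulomb deflection: theta ~ q1*q2 / (4*pi*eps0*b*E_kin)
            theta = (n * E_CHARGE**2) / (4 * np.pi * EPS0 * b * e_kin)
            # the beam is pushed away from the deposited (like-sign) charge
            dx += theta * drift_um * 1e-6 * (bx / b)
            dy += theta * drift_um * 1e-6 * (by / b)
        return dx * 1e9, dy * 1e9                         # displacement in nm

    # example: one previously written patch of 1e6 electrons, 50 um away
    print(deflection_nm(50.0, [(0.0, 0.0, 1e6)], (50.0, 0.0)))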
Today's semiconductors consist of up to forty structured layers which make up the electrical circuit. Since the market demands more powerful chips at minimal cost, the structure size is decreased with every technology node. The smaller the features become, the more sensitive the functional efficiency of the chip is to placement errors. One crucial contributor to placement errors is the mask, which can be viewed as a blueprint of the layer's structures. Hence, placement accuracy requirements for masks are also tightening rapidly, and mask shops strive to improve their positioning performance. However, more and more effort is required, which increases the cost of masks. Therefore, the transfer of mask placement errors onto the wafer is analyzed in order to check the guidelines used for deriving placement error specifications.
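As a hedged illustration of the first-order scaling involved in this transfer: a mask placement error is demagnified by the scanner reduction ratio, commonly 4x, before any pattern- or process-dependent effects are considered. The snippet below is illustrative only; the reduction ratio and the numbers are assumptions, not the specifications analyzed in this paper.

    # illustrative only: first-order transfer of a mask placement error to wafer scale
    REDUCTION = 4.0            # typical scanner demagnification (assumption)
    mask_error_nm = 7.0        # hypothetical mask-side placement error
    wafer_error_nm = mask_error_nm / REDUCTION
    print(f"{mask_error_nm} nm on mask -> {wafer_error_nm:.2f} nm on wafer "
          f"(1:{REDUCTION:.0f} reduction)")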
In the first section of this paper, the basic concepts for measuring placement errors are provided. Then, a method is proposed that characterizes the transfer of placement errors from mask to wafer. This is followed by two sections giving a thorough statistical analysis of this method. In the fifth section, the connection to placement accuracy specifications on mask and wafer is established. Finally, the method is applied to a set of test masks provided by AMTC and printed by AMD.
As a consequence of the shrinking sizes of integrated circuit structures, the overlay budget shrinks as well. Overlay is traditionally measured with relatively large test structures located in the scribe line of the exposure field, in the four corners. Although the performance of overlay metrology tools has improved significantly over time, it is questionable whether this traditional method of overlay control will be sufficient for future technology nodes. For advanced lithography techniques such as double exposure or double patterning, in-die overlay is critical, and it is important to know how much of the total overlay budget is consumed by in-die components.
We reported earlier that small overlay targets were included directly inside die areas and that good performance was achieved. This methodology enables a wide range of investigations and provides insight into processes that were less important in the past or not accessible to metrology. The present work provides actual data from productive designs, instead of estimates, illustrating the differences between scribe-line and in-die registration and overlay. The influence of the pellicle on pattern placement on the mask and on wafer overlay is studied. Furthermore, the registration error of the reticles is correlated with the wafer overlay residuals.
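One way to quantify such a correlation, sketched below under an assumed data layout (the file names and column names are hypothetical), is to demagnify the measured mask registration errors to wafer scale and compute correlation coefficients against the wafer overlay residuals at matching in-die sites.

    import numpy as np
    import pandas as pd

    def correlate_registration_to_overlay(reg_csv, ovl_csv, reduction=4.0):
        """Correlate mask registration errors with wafer overlay residuals.

        Assumed (hypothetical) file layout: both CSVs contain the columns
        'site', 'dx_nm', 'dy_nm', measured at matching in-die positions.
        """
        reg = pd.read_csv(reg_csv).set_index("site")
        ovl = pd.read_csv(ovl_csv).set_index("site")
        merged = reg.join(ovl, lsuffix="_mask", rsuffix="_wafer").dropna()

        # demagnify mask-side errors to wafer scale before comparing
        mask_dx = merged["dx_nm_mask"] / reduction
        mask_dy = merged["dy_nm_mask"] / reduction

        r_x = np.corrcoef(mask_dx, merged["dx_nm_wafer"])[0, 1]
        r_y = np.corrcoef(mask_dy, merged["dy_nm_wafer"])[0, 1]
        return r_x, r_y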
The influence of scanner-induced distortions (tool-to-tool differences) on in-die overlay is shown.
Finally, the individual contributors to in-die overlay are discussed in the context of other overlay contributors. It is proposed to use in-die overlay and registration results to derive guidelines for future overlay and registration specifications. It will be shown that new overlay correction schemes that take advantage of the additional in-die overlay information need to be considered for production.
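As one possible ingredient of such a correction scheme, the sketch below fits linear intrafield correctables (translation, magnification, rotation) to dense in-die overlay measurements by least squares; the residuals are what remains for non-correctable in-die terms. The data layout and the restriction to linear terms are assumptions made for illustration, not the correction scheme proposed here.

    import numpy as np

    def fit_linear_intrafield(x_mm, y_mm, dx_nm, dy_nm):
        """Least-squares fit of linear intrafield correctables.

        Model (per axis):  dx = Tx + Mx*x - R*y,   dy = Ty + My*y + R*x
        Returns the fitted parameters and the residuals left after correction.
        """
        ones = np.ones_like(x_mm)
        Ax = np.column_stack([ones, x_mm, -y_mm])
        Ay = np.column_stack([ones, y_mm, x_mm])
        px, *_ = np.linalg.lstsq(Ax, dx_nm, rcond=None)
        py, *_ = np.linalg.lstsq(Ay, dy_nm, rcond=None)
        res_x = dx_nm - Ax @ px
        res_y = dy_nm - Ay @ py
        return (px, py), (res_x, res_y)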
According to the International Technology Roadmap for Semiconductors, the image placement precision for the 65 nm technology node has to be 7 nm. In order to be measurement capable, the measurement error of a 2D coordinate measurement system has to be close to 2 nm. For those products, we are using the latest Vistec registration metrology tool, the LMS IPRO3. In this publication we focus on the tool performance analysis and compare different methodologies. Besides the well-established ones, we demonstrate the statistical method of analysis of variance (ANOVA) as a powerful tool to quantify different measurement error contributors. Here we deal with short-term, long-term, orientation-dependent and tool matching errors.
For comparison, we also present some results based on LMS IPRO2 and LMS IPRO1 measurements. Whereas the short-term repeatability and long-term reproducibility are largely determined by the tool setup and physical constraints, the orientation-dependent part is a result of a software correction algorithm.
Finally, we analyse these residual tool systematics and test some improvement strategies.
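To illustrate the idea behind such an ANOVA-based decomposition, the sketch below splits repeated registration measurements into orientation-dependent, long-term (day-to-day) and short-term (residual repeat) mean squares from the classical between-group sums of squares. The column names and the assumption of a balanced design with main effects only are hypothetical and do not reproduce the analysis performed on the LMS IPRO data.

    import numpy as np
    import pandas as pd

    def variance_components(df):
        """Simple ANOVA-style decomposition of a registration error column.

        Assumed (hypothetical) columns: 'error_nm', 'day' (long-term factor)
        and 'orientation' (0/90/180/270), with equal repeats per cell.
        """
        grand = df["error_nm"].mean()
        n = len(df)

        def ss_between(factor):
            g = df.groupby(factor)["error_nm"]
            return (g.count() * (g.mean() - grand) ** 2).sum(), g.ngroups - 1

        ss_day, df_day = ss_between("day")
        ss_ori, df_ori = ss_between("orientation")
        ss_tot = ((df["error_nm"] - grand) ** 2).sum()
        ss_res = ss_tot - ss_day - ss_ori
        df_res = n - 1 - df_day - df_ori

        return {
            "long_term_MS": ss_day / df_day,        # day-to-day contribution
            "orientation_MS": ss_ori / df_ori,      # orientation-dependent part
            "short_term_MS": ss_res / df_res,       # residual repeatability
        }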