Facing the Rising Costs of R&D
Collaborative Partnerships, Alliances Reduce Industry Research Gap
Moslehi, PhD, is chief technology officer and senior vice president,
semiconductor technology research, for The Noblemen Group, a boutique
investment banking, strategic advisory, and business development firm.
Moslehi has 20 years' experience working in the semiconductor and semiconductor
equipment industries. He can be reached at email@example.com.
Escalating research and development and manufacturing costs have compounded
the already-daunting business pressures faced by the chipmaking industry.
According to the Semiconductor Industry Association, IC manufacturers
spend about 30% of their sales on capital expenses and R&D.
Further complicating matters, the diverse R&D paths defined in the
International Technology Roadmap for Semiconductors (ITRS), and the
uncertainties surrounding the adoption and timing of the various options,
confront the industry with difficult decisions about how to allocate
scarce R&D resources. The roles of cyclical industry revenues, deteriorating
profit margins, access to capital and its high cost, roadmap challenges
and timing, and stagnant levels of government R&D investment all
demand careful examination.
The ITRS provides the consensus of technical requirements for continued
industry advancements. The unprecedented number of technological challenges
with no known solutions requires significant levels of research investment.
The semiconductor industry in recent years has typically invested 12-15%
of its revenues in R&D; the investment by SIA members increased from
$2.6 billion in 1990 to $14 billion in 2003. Despite these efforts to
address long-term research needs, an estimated annual investment gap
of about $1.5 billion still exists.
Process R&D costs have been increasing rapidly with each new technology
node and now outpace the semiconductor revenue growth rate. Since 1995,
the industry's average growth rate has slowed from the historical
17% to around 8-10%. Yet various industry sources (such as VLSI
Research and Sematech) note that with each node the R&D costs accelerate,
from about 12% to as much as 20% annually. The estimated total cost of
developing 90 nm is thought to be around $500 million, while the expense
of developing upcoming technology nodes is projected to approach $1 billion.
This means that to keep pace with ITRS targets, a company must
have an annual R&D budget of at least several hundred million dollars.
On the equipment side, the total cost of developing the latest lithography
tools is believed to be in the $500 million to $1 billion range.
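The squeeze these growth rates imply can be seen with a quick compounding sketch. The starting figures below are hypothetical round numbers chosen for illustration, not data from the sources cited above:

```python
# Illustrative compounding sketch (hypothetical round numbers, not the
# article's data): R&D costs growing ~16%/yr vs. revenue growing ~9%/yr.
def project(value, rate, years):
    """Compound `value` at annual `rate` for `years` years."""
    return value * (1 + rate) ** years

revenue0, rnd0 = 1000.0, 130.0  # assume R&D starts at 13% of revenue
for years in (5, 10):
    revenue = project(revenue0, 0.09, years)  # midpoint of 8-10% growth
    rnd = project(rnd0, 0.16, years)          # midpoint of 12-20% growth
    print(f"after {years} yrs: R&D = {rnd / revenue:.1%} of revenue")
```

At these assumed rates, R&D climbs from 13% of revenue to roughly 18% in five years and about 24% in ten, which is the unsustainable divergence the paragraph describes.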
The progressively widening gap created by the escalating cost of R&D
cannot be sustained in the long term, a situation that has alarmed such
industry leaders as Robert Helms (former CEO of Sematech), Mike Splinter
(CEO of Applied Materials), and Hans Stork (CTO of Texas Instruments).
Various factors contribute to rising R&D expenditures, including
steep fab and equipment depreciation and wafer/materials costs (by far
the largest contributors); the high cost of introducing new processes
(e.g., high-k dielectrics); skyrocketing mask costs; and ever-more-expensive
lithography tools.

Most technology development takes place in either advanced pilot lines
or volume manufacturing wafer fabs. The price tag for building modern
chipmaking facilities has been increasing by about 10-15% annually,
resulting in a doubling of fab cost every 4 or 5 years. High-volume 300-mm
production fabs cost $2 to $3 billion, with equipment expenditures accounting
for more than 80%.
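The doubling-time claim follows from compound growth; here is a quick check of the arithmetic (my calculation, not the article's sources):

```python
import math

# Doubling time T for compound annual growth r: (1 + r)**T = 2,
# so T = ln(2) / ln(1 + r). Checking the 10-15% range cited above.
def doubling_time(rate):
    return math.log(2) / math.log(1 + rate)

for r in (0.10, 0.15):
    print(f"{r:.0%} annual growth -> doubles in {doubling_time(r):.1f} years")
```

Note that the 5-year doubling corresponds to the high end of the cited range; 10% annual growth implies a doubling closer to every 7 years.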
However, this upward trend is significantly offset by the vastly improved
manufacturing efficiencies and higher productivity of modern fabs. In
1985, about 2700 production fabs generated $22 billion in revenue;
in 2004, fewer than 400 manufacturing fabs helped push total revenues
to around $220 billion, a tenfold increase in less than two decades.
According to Gartner Dataquest, capital expenditures in 2004 were $49
billion, with about 77% of this amount spent on equipment. This is slightly
more than 22% of total device revenues for the year, which is consistent
with the historical capex budgets of 20-30% of fab revenues. To
afford a new fab, a company needs to have annual revenues of at least
$2 to $3 billion and an annual capex budget exceeding $1 billion.
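As a quick cross-check, the figures quoted above are mutually consistent:

```python
# Cross-checking the Gartner Dataquest figures quoted above.
capex = 49e9              # 2004 capital expenditures
equipment_share = 0.77    # portion of capex spent on equipment
revenue = 220e9           # approximate 2004 device revenues

equipment_spend = capex * equipment_share    # ~ $37.7 billion
capex_share_of_revenue = capex / revenue     # ~ 22.3%
print(f"equipment spend: ${equipment_spend / 1e9:.1f} billion")
print(f"capex as a share of revenue: {capex_share_of_revenue:.1%}")
```

The 22.3% result matches the "slightly more than 22%" cited above and sits inside the historical 20-30% capex range.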
Since the 1980s, chipmakers have been delegating an increasing portion
of process and technology development R&D to the capital equipment
and materials suppliers. Consequently, the R&D investment (as a percentage
of sales) made by the equipment and materials companies has steadily
grown, equaling and in some cases surpassing that of the device manufacturers.
More recently, OEMs have been transferring some of the R&D cost burden
to their suppliers. In a cyclical industry with severe pricing margin
pressures across the supply chain, this situation cannot be economically
sustained for the long term. Following the recent drastic downturn, equipment
suppliers have been forced to reduce their R&D expenditures and,
according to Applied's Splinter, now spend a smaller percentage
of sales on process R&D than their customers.
Statistics from the Alliance for Science and Technology Research in America
(ASTRA) show that about 70% of corporate R&D investments are typically
devoted to commercial product development, with 25% spent on applied
research (with a <7-year outlook) and only 5% allocated to basic scientific
research (with a >15-year outlook). Although applied research relies
heavily on basic research, ASTRA reports that while the GDP has doubled
since 1970, the federal government's investment in basic research
in the physical and mathematical sciences and engineering has fallen
from 0.25% of GDP to 0.16% (a 37% drop). This deficit in basic research
spending has created a gap, with major long-term implications for the
U.S. chipmaking industry as well as the overall economy.
The macroeconomic impact of recent semiconductor-sector growth
far outweighs the relative size of the industry's revenues.
As the 2005 SIA annual report states, "since 1995, information
technology industries have accounted for 25% of overall economic growth,
while making up only 3% of the GDP. As a group, these industries contribute
more to economywide productivity growth than all other industries combined."
The National Science Foundation reports that state governments have
increased their basic-research investments by 50% over the past decade,
to more than $1.6 billion. This certainly helps, but it remains a small
fraction of the nearly $50 billion in federal research funds. Moreover,
because the bulk of recent federal and state investment is dedicated to
the health sciences, less than $20 billion of that federal total goes to
research in the physical and mathematical sciences and engineering.
A somewhat brighter spot is the increase in worldwide investment in
nanotechnology R&D, which is approaching $10 billion annually,
more than 50% of which is provided by government entities. The federally
funded National Nanotechnology Initiative has a 2005 allocation of $980
million; state-sponsored programs with significant budgets include Albany
NanoTech (New York) and the California NanoSystems Institute (UCLA and
UC Santa Barbara).
To counter the mounting pressure and challenges of rising R&D costs,
most chipmakers (with the exception of Intel) have embraced various cost-
and risk-sharing collaborative partnerships. The IBM-led alliance (IBM,
AMD, Chartered, Infineon, Sony, Toshiba, and Samsung) and the Crolles2
alliance (STMicroelectronics, Philips, Freescale, and TSMC) are examples
of seemingly competitive companies joining forces to ease the R&D
spending burden. Top-tier foundries
also offer R&D solutions to fabless or fab-lite chip suppliers. Another
trend is outsourced collaborative R&D alternatives such as Sematech's
Advanced Technology Development Facility, Cypress Semiconductor's
Silicon Valley Technology Center, and LSI Logic's Oregon fab. University-,
industry-, and government-affiliated R&D efforts such as Sematech,
IMEC, Selete, and LETI also play vital roles.
Such precompetitive collaborations have been much tougher to
implement and are largely impractical for the volatile, margin-challenged,
and fiercely competitive tool and materials sectors. These companies
usually prefer to enter joint development projects with device manufacturers,
partner with noncompetitive suppliers with synergistic products, and
work with academic research centers.
The long-term implications of all the innovation, strategies, and policies
adopted by the IC industry have not been fully embraced by, or even
effectively registered with, many industry and business leaders,
policymakers, and economists. Companies and policymakers alike have a
short-term focus: stock-price pressures from the financial community
influence the former, while preoccupation with reelection and short-term
politics affects the latter. The semiconductor industry can learn from
those in the health sciences and make a concerted, unified effort to
get its concerns understood.
Better prioritization and improved strategies for R&D allocations,
more-efficient spending of R&D investments, and reduced duplication
of effort are essential basic steps. Increased investment in basic research
is also critical for the industry's longer-term needs. Permanent
R&D tax credits and more-favorable depreciation schedules and rules
would help a great deal. Successful approaches such as precompetitive
collaborative partnerships and alliances among industry, universities,
research labs, and government entities; R&D consortia; advanced foundries;
outsourced R&D; and customer/supplier cost sharing must continue
to flourish as effective ways of ameliorating spiraling costs and narrowing
the industry's research gap.
© 2007 Tom Cheyney
All rights reserved.