Engineers at the Massachusetts Institute of Technology and the Santa Fe Institute in New Mexico have found that two formulas for predicting technological change, including Moore's Law, produce reasonably accurate forecasts. The team, led by MIT engineering systems professor Jessika Trancik, formerly a postdoctoral fellow at the Santa Fe Institute, published its findings last week in the online journal PLoS ONE.
The researchers, who included colleagues from St. John's College in Santa Fe and the University of Oxford in the U.K., tested common models for forecasting changes in industrial technology against real-world data to find which of the formulas were the best predictors. This kind of analysis, the researchers say, could help industries better plan their research efforts, point investors to high-growth sectors, and help regulators better predict the economic impacts of policy changes.
Probably the best-known of these models is Moore's Law, promulgated by Intel Corporation co-founder Gordon Moore in 1965 to describe advances in computer chips. That formula predicts that the number of components on an integrated circuit will double roughly every 18 months, and it has since been generalized to other technologies, stating that performance improves, or cost falls, exponentially over time.
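In its generalized form, the law can be written as an exponential decline in unit cost over time. The notation below is an illustrative sketch of the functional form commonly used in such studies, not the paper's own:

```latex
% Generalized Moore's Law: unit cost y(t) falls exponentially with time t.
% y_0 (initial cost) and m > 0 (improvement rate) are illustrative symbols.
y(t) = y_0 \, e^{-m t}
```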
Trancik and her colleagues tested Moore's Law and four other models against 62 different industrial technologies, categorized as chemical, hardware, energy, and other technologies, stored in the Santa Fe Institute's Performance Curve Database. The team gathered data on technological improvements, costs, and production levels from research articles, government reports, market research publications, and other published sources, using only sources that offered at least 10 years' worth of consistent data with metrics for both production and technological improvement.
The researchers developed a statistical model to calculate and rank the power of each hypothesis to predict future costs. The results showed that Moore's Law scores high on predictive value, but it came in a close second to a hypothesis advanced by Theodore Wright in 1936: that cost decreases as a power law of cumulative production, meaning that costs in a given industry fall by a fixed percentage with every doubling of the cumulative number of units produced.
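As a rough illustration of how the two hypotheses differ, both reduce to straight-line fits after a log transform: Moore's Law regresses log cost on time, while Wright's Law regresses it on log cumulative production. The sketch below uses synthetic data and ordinary least squares; it is not the paper's actual statistical procedure, which also models out-of-sample forecast errors.

```python
import numpy as np

# Synthetic "performance curve": cumulative production grows exponentially
# while unit cost falls. Illustrative only; not data from the study.
rng = np.random.default_rng(0)
t = np.arange(30)                                   # years since start
cum_production = 100.0 * np.exp(0.15 * t)           # cumulative units produced
cost = 50.0 * cum_production**-0.3 * np.exp(rng.normal(0, 0.05, t.size))

log_cost = np.log(cost)

# Moore's Law: log(cost) is linear in time -> cost = y0 * exp(-m * t).
moore_slope, moore_icept = np.polyfit(t, log_cost, 1)

# Wright's Law: log(cost) is linear in log(cumulative production)
# -> cost = y0 * x**(-w), a power law of cumulative production.
wright_slope, wright_icept = np.polyfit(np.log(cum_production), log_cost, 1)

print(f"Moore fit:  cost ~ exp({moore_slope:.3f} * t)")
print(f"Wright fit: cost ~ x^({wright_slope:.3f})")
# A fuller comparison would hold out later years and score each model's
# forecast error on them, as the study does across 62 technologies.
```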
The team found rates of change vary greatly among different technologies. “Information technologies improve the fastest,” says Trancik, “but you also see the sustained exponential improvement in many energy technologies.” She adds, “One of our main interests is in examining the data to gain insight into how we can accelerate the improvement of technology.”