Big Data and the Future of Optical Interconnection

This research note is based on CIR’s new report, “Revenue Opportunities for Optical Interconnects: Market and Technology Forecast – 2013-2020. Vol. I Board-to-Board and Rack-Based.”

CIR’s latest forecast for sales of optical interconnection devices used at the rack level and for board-to-board interconnection remains bullish. We anticipate revenues doubling from $1.1 million to $2.2 million between 2013 and 2018. Several factors are driving this growth.

Usual suspects: The growth drivers are the usual suspects: faster processors, lower power consumption and swelling data rates:

• Multicore and 3D chips put off the demise of Moore’s Law for a few more years; processors will continue to get faster, requiring faster I/O.

• To accommodate this, large data centers are leaping to 40 Gbps, a data rate not easily supported by copper.

• At high speeds, copper interfaces run hot, which is another reason to go with optics.

Cloud computing: Another factor is cloud computing, but its impact is ambiguous.

The “cloud computing” phenomenon is new, or at least new-ish. It leads to some very large data centers being built at cloud service providers’ locations, facilities with a propensity to use optical interconnection. But clouds also reduce the number of data centers that large corporations and government organizations will build; in fact, that is their whole point. Clouds are therefore a mixed blessing from the point of view of optical interconnection.

Big data is different: But there is one application that is genuinely new and that CIR thinks will add wind to the sails of optical interconnection. This application is “big data.”

Big data is an emerging paradigm built on the recognition that one can pull far more trends and conclusions out of a large data set analyzed whole than out of the same data broken into smaller sets. Big data requires massively parallel computers, potentially with tens of thousands of CPUs and hundreds of servers. These computers almost cry out for optical interconnection. Two factors are important here:

• In the past, massively parallel computers have been associated primarily with niche scientific disciplines, such as biological cell simulations and fluid dynamics modeling. The new big data paradigm means that massive parallelism – and with it optical interconnection – will become much more common in general computing environments. It will be increasingly required for Internet search, finance and so-called business informatics, which ultimately account for far more data centers than scientific/university computing does.

• With so many racks needed for big parallel machines, interconnection distances get bigger and harder to handle with copper at the required speed.

Forward: The “big data” paradigm has also been boosted by the huge growth in data collection devices, ranging from mobile phones to RFID tags to remote sensors. But it also faces challenges: big data does not work well with the current generation of relational database management systems, statistics packages and visualization software.

Instead, what many big data insiders envision is a future species of big data software that is specifically designed for massively parallel machines. It may take a little while for such packages to emerge. But when they do, we anticipate that they will drive optical interconnection in data centers forward with the force of a tsunami.
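To make the point concrete, the sketch below is our own illustration, not something drawn from the CIR report: it shows, in miniature, the map/reduce style of data-parallel processing that software for massively parallel machines typically follows. A large data set is split into partitions, each partition is processed by a separate worker, and the partial results are then merged. On a real cluster those workers sit on different servers and racks, and the merge (“shuffle”) step generates exactly the rack-to-rack traffic that strains copper and favors optical interconnects.

```python
# Hypothetical sketch (ours, not from the report): a toy map/reduce word count
# illustrating the data-parallel pattern behind "big data" software.
# In a real deployment each partition would live on a different server, and the
# intermediate results would be shuffled between racks over the interconnect.
from collections import Counter
from multiprocessing import Pool

def map_partition(lines):
    """Map step: count words within one partition of the data set."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(partials):
    """Reduce step: merge the per-partition counts into a global result."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    # Stand-in for a large data set already split into partitions.
    partitions = [
        ["big data needs parallel machines", "parallel machines need fast links"],
        ["fast links favor optics", "big data favors big clusters"],
    ]
    with Pool() as pool:  # one worker per partition, in miniature
        partials = pool.map(map_partition, partitions)
    print(reduce_counts(partials).most_common(3))
```

The design point is simply that the computation scales by adding more partitions and more workers; the cost that does not disappear is moving the intermediate results between those workers, which is where interconnect bandwidth becomes the bottleneck.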
