These overarching themes will keep evolving in 2013. As the world becomes increasingly connected and mobile, every tech company that touches the network is pushed to find new ways to remain relevant. They must continue honing their business models and operations for maximum efficiency in an unpredictable economic environment. They must continue building out faster and more flexible networks to support growing and less predictable traffic patterns. And they must continue to create innovative products and services that satisfy our need for instant access via new apps and the next connected device that captivates us, on ever-shorter product cycles.
None of this is breaking news, but these themes will intensify throughout 2013. Here’s a look at how they will play out in various aspects of optical networking.
Network providers rally behind 100G, self-aware networks; 400G on the horizon
Network operators indicate that capacity is doubling in their networks every six months. According to IDC, worldwide broadband traffic for both wireline and mobile activity could reach 116,539 petabytes per month by 2015. This continued need for bandwidth is being driven both by more people accessing video via PCs and TVs and also by their increasing use of mobile smart devices to access information anywhere at any time. People around the world now leverage networks as a daily part of how they communicate and share information during both their work and personal lives.
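A doubling every six months compounds faster than intuition suggests. The short sketch below illustrates the math; the starting volume is a hypothetical figure chosen for illustration, not a number from the operators cited above.

```python
# Sketch: compound growth when traffic doubles every six months.
# The starting volume (10,000 PB/month) is hypothetical.
volume_pb = 10_000  # petabytes per month at the start

for year in range(1, 4):
    volume_pb *= 4  # two doublings per year => 4x annually
    print(f"after year {year}: {volume_pb:,} PB/month")
```

Two doublings per year means traffic grows 64-fold in three years, which is why capacity planning at these rates forces operators toward step changes like 100G rather than incremental upgrades.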
To bolster and speed up network infrastructures, network operators began volume implementations of 100G optical equipment in 2012. 100G took off much more quickly than 40G, as all network players bought into the overarching need for 100G and developed a healthy supply chain with a variety of competing building blocks.
Many experts believe 100G will play a central role in transmission much the way 10G did in the past. 100G creates a new baseline for network performance by using bandwidth in an efficient way; it aligns with 100GbE standards, and today’s 100G technology is expected to serve as the foundation for higher transmission line rates in the future. Implementation of 100G started as line cards much like 10G did in its day, and the packaging will quickly become smaller due to new developments in photonic components that drive down costs and power requirements.
With 100G in full swing in 2013, many will look ahead to 400G. Most agree that 400G will be implemented in two forms – superchannels with either four 100G dual-polarization quadrature phase-shift keying (DP-QPSK) modulated wavelengths or two 200-Gbps wavelengths encoded with dual polarization 16QAM. Each approach has its benefits and drawbacks. DP-QPSK modulation provides better performance over long distances but consumes more spectrum within the fiber; 16QAM suffers from shorter reach but offers more efficient spectrum use.
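The capacity difference between the two 400G options follows directly from bits per symbol: QPSK encodes 2 bits per symbol and 16QAM encodes 4, each doubled by dual polarization. A minimal sketch of that arithmetic, using a nominal 25-Gbaud symbol rate that ignores FEC and framing overhead:

```python
import math

def line_rate_gbps(constellation_points, polarizations=2, gbaud=25):
    """Nominal line rate: bits/symbol x polarizations x symbol rate.

    The 25-Gbaud figure is illustrative and ignores FEC/framing overhead.
    """
    bits_per_symbol = int(math.log2(constellation_points))
    return bits_per_symbol * polarizations * gbaud

qpsk = line_rate_gbps(4)     # QPSK: 4 points -> 2 bits/symbol -> 100 Gbps
qam16 = line_rate_gbps(16)   # 16QAM: 16 points -> 4 bits/symbol -> 200 Gbps

print(qpsk, qam16)  # 100 200
```

So at the same symbol rate (and hence roughly the same spectral width per carrier), 16QAM carries twice the rate per wavelength, which is why 400G needs four DP-QPSK carriers but only two DP-16QAM carriers. The denser constellation packs points closer together, however, which is the source of its shorter reach.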
The 16QAM option will likely be a good fit for areas of high population density, such as Europe and the East Coast of the United States, where the end-to-end service distances are shorter. The DP-QPSK option will be necessary where distances between major population centers are greater, such as the rest of North America. Since many of the technologies for 400G are very similar to 100G and exist today, the industry could see initial deployments in late 2013 or early 2014.
Along with faster transmission rates, the development of “self-aware networks” that provide a much more flexible way to manage wavelength traffic also made great strides in 2012. These networks incorporate colorless, directionless, contentionless, and flexible spectrum capabilities that let network operators automatically restore and rebalance optical network traffic and quickly provision new services. Optimizing the network in this way also helps them more efficiently manage equipment and operations. Taken together, these benefits lower overall costs.
Network operators now fully appreciate the value of self-aware networks and are committed to their implementation. Expect to see deployments of such networks in 2013.
Organizations increasingly move to the cloud
As businesses continue to streamline processes and reduce costs, cloud-based services that can offload the handling and storing of data and house key enterprise applications will see more traction in 2013.
Forrester Research expects the cloud computing market to grow to $241 billion by 2020. Along with network providers, many non-telecommunications companies have begun offering some form of cloud-based services. Amazon and Google were two early adopters and have built vast data centers to meet demand. The main challenge with cloud services to date has been their dependency on the public Internet for sending data – which means unreliable delivery times.
This reliance has created an opportunity for providers that own their networks to safely transport the data. They can add cloud-based services to their list of capabilities and leverage their existing networks to connect data centers around the globe and push information to the edge for faster, more reliable delivery.
Many providers have targeted the high-end enterprise market, as it provides the best synergy with the global reach of their networks. They are also able to provide different delivery options such as dedicated lines and virtual private networks to ensure performance.
One issue with current cloud architectures is that virtual machines within a data center traditionally have had difficulty transferring information to other data centers through switched networks. An Ethernet-connected cloud provides a cost-effective, flexible Layer 2 protocol over longer distances. The question then becomes how to test for performance and quality.
New network monitoring and optimization approaches will play a key role in providing real-time information about any problems that arise, even at the edge of the network. This will be important for enterprise customers with multiple branch offices and remote locations; new capabilities will make it easier to see how an enterprise’s wide area network is being used and let enterprise network managers troubleshoot issues quickly. It will also help network providers sell new services based on a real-time view of how enterprise customers are using the network.
Innovation for the future
Technology innovation is moving faster than at any other time in history. Connected devices that used to take years to develop now are launched on much shorter cycles and offer new ways for people to communicate and access a variety of content from home, at the office, or while on the move.
All of the companies responsible for creating and managing networks to support this ever-changing environment must continue to innovate and create common standards that will support network requirements far into the future. As networks continue to become more complex and intelligent, it will be important to focus on dynamic and flexible optical, network monitoring, and optimization technologies that give providers a competitive advantage beyond owning transmission lines. This new type of innovation will take thinking beyond current business models.
Meanwhile, cloud-based network management, 100G, and self-aware networks will continue to evolve as they are more broadly deployed this year. The likely result is the generation of more new ideas for the future.