IT Innovators: Is This The Year SDDC Will Shine?

It’s now 2016 and you’ve probably already had more than your fill of 2016 predictions; however, I have one of my own. I think 2016 will be a big year for the Software Defined Data Center (SDDC) as it takes hold in a growing number of IT organizations, or at the very least, increasingly becomes fodder for internal discussion. It’s not a startling prediction given the steady advance of Software Defined Networks (SDNs) and virtualization; however, if this truly is the year that SDDC finally takes off, it promises big changes and benefits for the data center.

The SDDC, as defined by Gartner, is a data center wherein all infrastructure is virtualized and delivered "as-a-service." The virtualized infrastructure is automated by software, with cloud computing playing a critical role. According to Research and Markets, by the year 2020, the SDDC market will top $77.18 billion. That’s a pretty significant increase given that 2015 was expected to finish up around $21.78 billion.

What makes this concept so appealing is its ability to deliver automation, flexibility and increased scalability. It can also help reduce data center complexity and lower costs. For IT organizations everywhere, the end result is greater business agility, enabling them to compete more effectively in today’s highly dynamic marketplace.

These are enviable benefits, granted, but realizing them will demand preparation, even from those organizations not yet ready to take the plunge into a software-defined world. After all, today you can’t just go out and buy a turnkey SDDC. Instead, you have to first figure out what you want to accomplish with the SDDC, then buy the parts required to help you meet that goal, likely from different vendors. Finally, you have to integrate those parts and deploy the solution.

It can be a challenging task, one that Dave Russell, vice president and distinguished analyst at Gartner, says is typically “most appropriate for visionary organizations with advanced expertise in I&O engineering and architecture.” Russell also points out that beyond the deployment itself, new skill sets will be needed, and a cultural shift within the IT organization will have to take place to ensure the SDDC delivers solid business results.

While these changes may be hard to swallow for those risk-averse data center managers out there, some emerging trends are helping to soften the blow. One such trend is the move toward open standards. In the SDN market, for example, the Open Networking Foundation’s (ONF’s) OpenFlow has emerged as the industry’s first standard communications interface between the control and forwarding layers of an SDN architecture. According to the organization, SDN technologies based on the standard “enable IT to address the high-bandwidth, dynamic nature of today's applications, adapt the network to ever-changing business needs, and significantly reduce operations and management complexity.”
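To make that split between the control and forwarding layers a little more concrete, here is a minimal sketch of an OpenFlow control application. It uses the open-source Ryu controller framework purely for illustration; Ryu isn’t mentioned in the article and isn’t tied to any particular vendor’s SDDC stack, and the class and handler names below are my own placeholders. When a switch connects, the application pushes a single “table-miss” rule that sends unmatched traffic to the controller, which is exactly the kind of software-driven programming of the forwarding layer that the ONF describes.

    # Minimal OpenFlow 1.3 control application (illustrative sketch using Ryu)
    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class MinimalSdnApp(app_manager.RyuApp):
        # Speak OpenFlow 1.3 to any switch that connects.
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def on_switch_connect(self, ev):
            datapath = ev.msg.datapath              # the forwarding element
            ofproto = datapath.ofproto
            parser = datapath.ofproto_parser

            # Match all traffic and send it to the controller for a decision.
            match = parser.OFPMatch()
            actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                              ofproto.OFPCML_NO_BUFFER)]
            inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS,
                                                 actions)]

            # The flow-mod message is the control layer programming the
            # forwarding layer over the OpenFlow channel.
            datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=0,
                                                match=match, instructions=inst))

Run with ryu-manager against an OpenFlow 1.3 switch (Open vSwitch pointed at the controller’s address, for example) and the rule shows up in the switch’s flow table. The point is simply that network behavior is now set by software talking a standard protocol, not by box-by-box configuration.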

While standards such as OpenFlow will certainly help ease some IT organizations into the idea of an SDDC, others have come to the realization that, for them, adopting an SDDC just makes good business sense. These are typically the organizations that have already thoroughly evaluated its risks and benefits, and have the necessary skill set and mindset to make it work. With more and more organizations taking this leap, 2016 may just be the year that the SDDC starts to shine.

More information on the SDDC is available in the Gartner report, “Should Your Enterprise Deploy a Software-Defined Data Center?” For additional information on specific software-defined areas like SDNs and software-defined storage, check out the free Microsoft eBook, "Microsoft System Center: Network Virtualization and Cloud Computing." The company also offers a Software-Defined Storage Design Calculator spreadsheet that IT professionals can use to record their storage requirements and calculate the hardware and software configurations needed to create a storage solution customized to their specific needs.

This blog is sponsored by Microsoft.

Cheryl J. Ajluni is a freelance writer and editor based in California. She is the former Editor-in-Chief of Wireless Systems Design and served as the EDA/Advanced Technology editor for Electronic Design for over 10 years. She is also a published book author and patented engineer. Her work regularly appears in print and online publications. Contact her at [email protected] with your comments or story ideas.

 
