IT Innovators: Eye on SDDC Standards

Standards are an important part of any burgeoning industry or emerging technology, carrying both technological and economic weight. When it comes to infrastructure management of the software-defined data center (SDDC), the Distributed Management Task Force (DMTF) is looking to make a significant impact. The DMTF is a non-profit organization made up of industry-leading companies, including Intel, Microsoft and VMware, that collaborate on the development, validation and promotion of infrastructure management standards.

To aid in the development of SDDC standards, the DMTF formed the Open Software Defined Data Center (OSDDC) Incubator. Its job is to develop use cases, reference architectures and requirements drawn from real-world customer needs. That input will then be used to produce white papers and a set of recommendations for industry standardization of the architectures and definitions that describe the SDDC.

Recently, the OSDDC Incubator took a big step toward meeting that goal by releasing its first white paper, Software Defined Data Center (SDDC) Definition. While the specification presented in the paper does not constitute a DMTF standard, it serves as an exploratory effort intended to gather industry comments and feedback. Some of the content may ultimately find its way into future standards; more likely, though, the specification is simply a starting point that will evolve over time based on the feedback received.

The OSDDC Incubator's white paper outlines specific SDDC use cases and definitions. It also identifies existing gaps in standards and possible architectures for the various implementations of the SDDC. Of particular interest are the use cases it describes: they validate why the standardization effort is warranted in the first place, and they will help ensure that emerging SDDC standards meet the needs of the widest possible community.

The white paper lays out two primary use cases for the SDDC, Infrastructure as a Service (IaaS) and Software as a Service (SaaS), as shown in Figures 1 and 2. Neither should come as any great surprise. According to the OSDDC Incubator, IaaS involves a customer (the infrastructure requestor and consumer) who wants to execute a workload and uses the data center to host the infrastructure. Once the infrastructure is available, the customer simply installs the necessary software and content/data and then executes the workload.

Figure 1. IaaS use case for the SDDC, as specified in the OSDDC Incubator's DSP-IS0501, version 1.0 document.

Figure 2. SaaS use case for the SDDC, as specified in the OSDDC Incubator's DSP-IS0501, version 1.0 document.

In the case of SaaS, the customer (the service requestor) wants to instantiate a service and uses the data center to host it (Figure 2). The service is then used by a service consumer, who is not the same as the SaaS customer. Once the service is instantiated, the service requestor may need to provide additional content before the service is enabled and ready to be consumed.
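To make those two flows a bit more concrete, here is a rough sketch of how the IaaS and SaaS use cases might be modeled in code. The class, field and method names below are illustrative assumptions on my part; they are not drawn from the white paper or from any DMTF specification.

    # Hypothetical sketch of the two use cases; not a DMTF API.
    from dataclasses import dataclass, field

    @dataclass
    class IaaSRequest:
        """IaaS: the customer requests infrastructure, then installs and runs a workload."""
        requestor: str                  # the infrastructure requestor/consumer
        cpu_cores: int
        memory_gb: int
        software: list[str] = field(default_factory=list)

        def fulfill(self) -> str:
            # The data center hosts the infrastructure; the customer installs
            # software and data, then executes the workload.
            steps = [f"provision {self.cpu_cores} cores / {self.memory_gb} GB for {self.requestor}"]
            steps += [f"customer installs {pkg}" for pkg in self.software]
            steps.append("customer executes workload")
            return "; ".join(steps)

    @dataclass
    class SaaSRequest:
        """SaaS: the requestor instantiates a service that a separate consumer uses."""
        requestor: str                  # the service requestor (not the end consumer)
        service: str
        content: list[str] = field(default_factory=list)

        def fulfill(self) -> str:
            # The data center hosts the service; the requestor may add content
            # before the service is enabled for consumers.
            steps = [f"instantiate {self.service} for {self.requestor}"]
            steps += [f"requestor loads content: {c}" for c in self.content]
            steps.append("service enabled for consumers")
            return "; ".join(steps)

    if __name__ == "__main__":
        print(IaaSRequest("acme-dev", 8, 32, ["postgres", "analytics-app"]).fulfill())
        print(SaaSRequest("acme-hr", "payroll-service", ["employee records"]).fulfill())

The point of the sketch is simply that the two flows share a request, fulfill and consume shape while differing in who consumes the result, which is the distinction the white paper draws.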

While defining these use cases is a good start on the part of the OSDDC Incubator, much more work remains. According to Alex McDonald, the Cloud Storage Initiative Chair, the services (software) provided through these use cases must be managed, which means there needs to be a way to provision, monitor and account for them, as well as a way to address them over the network. There is also the issue of a standards gap: some technologies do not yet have well-defined standards, which makes a true standards-based SDDC solution nearly impossible today.
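To get a feel for what such a management contract might eventually look like, here is a minimal sketch in the form of a hypothetical Python interface. None of these names come from the white paper or from an existing DMTF standard; they simply mirror the provisioning, monitoring, accounting and network-addressing needs McDonald describes.

    # Hypothetical management contract for a software-defined service.
    from abc import ABC, abstractmethod

    class ManagedService(ABC):
        """Illustrative contract only; not a DMTF-defined interface."""

        @abstractmethod
        def provision(self, spec: dict) -> str:
            """Create the service and return an identifier for it."""

        @abstractmethod
        def monitor(self, service_id: str) -> dict:
            """Return current health and utilization metrics."""

        @abstractmethod
        def account(self, service_id: str) -> dict:
            """Return usage and chargeback data for the service."""

        @abstractmethod
        def network_address(self, service_id: str) -> str:
            """Return the address at which the service can be reached."""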

One of those gaps is in the area of SDDC application and workload management. Instrumentation requirements for applications and workloads need to be defined, and that instrumentation must enable auto-configuration and scaling once workloads and applications have been deployed. According to the OSDDC Incubator, additional work is also needed so that emerging containerized applications can expose their requirements in a standard way, which would allow software-defined resources to be created and removed dynamically.
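One way to picture what "exposing requirements in a standard way" could mean is a declarative manifest that the data center consults when deciding whether to create or remove software-defined resources. The sketch below is illustrative only; the manifest fields and the scaling rule are my own assumptions, not a format proposed by the OSDDC Incubator.

    # Hypothetical workload manifest plus a trivial scaling decision.
    workload_manifest = {
        "name": "web-frontend",
        "image": "example/web-frontend:1.4",
        "requirements": {"cpu_cores": 2, "memory_gb": 4},
        "scaling": {
            "metric": "cpu_utilization",   # instrumentation the workload exposes
            "scale_out_above": 0.75,       # add an instance above this level
            "scale_in_below": 0.25,        # remove an instance below this level
            "min_instances": 2,
            "max_instances": 10,
        },
    }

    def desired_instances(manifest: dict, current: int, observed_metric: float) -> int:
        """Decide how many instances the software-defined infrastructure should run."""
        rules = manifest["scaling"]
        if observed_metric > rules["scale_out_above"]:
            current += 1
        elif observed_metric < rules["scale_in_below"]:
            current -= 1
        return max(rules["min_instances"], min(rules["max_instances"], current))

    print(desired_instances(workload_manifest, current=3, observed_metric=0.82))  # -> 4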

These are issues that will likely be addressed in the months and years ahead as the SDDC continues to move forward. That is good news for those wanting to make the transition, since standards will allow the industry to become much more efficient. For IT professionals using standards-based SDDC solutions, conformance to a standard brings reassurance that the solutions they rely on are safe, deliver a guaranteed set of functionality and, in some cases, are environmentally friendly. And that explains why the OSDDC Incubator, under the auspices of the DMTF, is working so diligently to lay the groundwork on which emerging SDDC standards will be based.

If you have any thoughts on the SDDC standardization effort, drop me a line at [email protected]. In the meantime, check back here for future blog posts on a range of IT-related issues.

This blog is sponsored by Microsoft.

Cheryl J. Ajluni is a freelance writer and editor based in California. She is the former Editor-in-Chief of Wireless Systems Design and served as the EDA/Advanced Technology editor for Electronic Design for over 10 years. She is also a published book author and patented engineer. Her work regularly appears in print and online publications. Contact her at [email protected] with your comments or story ideas.
