In a world where software drives nearly every technology, the term software-defined everything (SDx) strikes me as a bit redundant. After all, what isn’t dependent on software these days? Even in the technologies we readily associate with hardware — storage, for instance — the magic is in the software that runs on top.

To learn about using SDx, read CDW’s white paper, “Achieving Software-Defined Everything.”

As SDx matures, this once holistic idea is becoming more narrowly segmented. As with any evolving technology, opinions may differ on what falls under its broad umbrella. Does SDx include Infrastructure as Code? Artificial intelligence? Cloud operating models?

IT professionals may enjoy debating what SDx is, but there’s another valuable approach we can take to the conversation, and that’s to define what SDx isn’t. To that end, let’s examine and debunk four common myths about SDx.

MYTH: Automation Enabled by SDx Will Eliminate the Need for IT Employees

The nature of work continually changes. What is constant is that employees are happiest when engaged in the types of work they enjoy — work that allows them to contribute something of value and advance the organizational mission. That’s harder to do when they are burdened with the tedious manual tasks that automation can alleviate. When organizations start automating that work, they free people up to invest their time and talents in rewarding, business-focused projects that serve customers and constituents.

MYTH: Organizations Will Lose Visibility and Monitoring Capabilities in an SDx Infrastructure

Where you look and how you consume infrastructure monitoring information may change with SDx, but that doesn’t mean the information disappears. Visibility, observability, metrics, key performance indicators and the like are all still essential to software-defined infrastructure, but some work may be required to integrate them with the software-defined system you’re creating.

Some people believe they obtain better telemetry or statistical data directly from individual technology components — as opposed to, for example, hyperconverged infrastructure appliances that combine multiple components. The assumption is that a single, integrated system limits the visibility of this data. In reality, SDx offers various ways of looking at that data, rather than requiring users to access each individual system to pull and aggregate it.

Depending on the activities and technologies involved, SDx can actually increase capabilities around event correlation. This facilitates a better understanding of how systems interact with each other and what that means for the performance, availability or reliability of any application you need to support.
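To make the single-pane idea above concrete, here is a minimal sketch — with purely hypothetical component names and metrics, not any vendor’s actual API — of how a software-defined control plane can answer one telemetry query across every component, instead of forcing users to poll each device and aggregate the results by hand:

```python
# Hypothetical sketch: a software-defined control plane exposes one query
# surface for telemetry that would otherwise require polling each device.
from dataclasses import dataclass

@dataclass
class Metric:
    component: str   # e.g., "storage-node-1" (illustrative name)
    name: str        # e.g., "latency_ms"
    value: float

class ControlPlane:
    """Aggregates metrics reported by managed components (illustrative only)."""
    def __init__(self):
        self._metrics: list[Metric] = []

    def ingest(self, metric: Metric) -> None:
        self._metrics.append(metric)

    def query(self, name: str) -> dict[str, float]:
        # One call returns the metric across every component --
        # no per-device pull-and-aggregate step.
        return {m.component: m.value for m in self._metrics if m.name == name}

cp = ControlPlane()
cp.ingest(Metric("storage-node-1", "latency_ms", 2.1))
cp.ingest(Metric("compute-node-1", "latency_ms", 0.9))
cp.ingest(Metric("storage-node-1", "iops", 54000.0))
print(cp.query("latency_ms"))  # latency across all components in one view
```

Because every reading lands in one place, correlating events across systems becomes a query rather than a manual stitching exercise.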

MYTH: SDx Will Hamper the Performance of Mission-Critical Applications

This myth often arises because people assume SDx is a one-size-fits-all proposition. It isn’t. SDx can support any application, provided the environment is sized and architected appropriately for the workloads it will run. That architectural alignment with a particular application stack may come at a cost, but it can certainly deliver the performance an organization requires.

MYTH: Software-Defined Environments Can’t Pass Regulatory Compliance Audits

If anything, SDx should enhance and expedite regulatory audits. It consolidates logging and audit trails so that organizations have fewer places to chase down information. It allows them to apply role-based access controls to entire infrastructure deployments from a single point. Unless you have already funneled all infrastructure activity through a single point, keeping everything spread out and disaggregated actually puts you at greater risk when a compliance audit comes around.
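The single-point control idea above can be sketched in a few lines. This is a hypothetical illustration — the role names, actions and deployment names are invented for the example — showing how one role-based policy decision, made at the control plane, governs every deployment it manages:

```python
# Hypothetical sketch: role-based access control applied once at the
# control plane covers every managed deployment (names are illustrative).
ROLES = {
    "auditor":  {"read"},
    "operator": {"read", "deploy"},
    "admin":    {"read", "deploy", "configure"},
}

DEPLOYMENTS = ["prod-cluster", "dr-site", "edge-pop-7"]

def allowed(role: str, action: str) -> bool:
    return action in ROLES.get(role, set())

def scope_of(role: str, action: str) -> list[str]:
    # One policy decision governs all deployments -- a single place
    # for an auditor to verify, instead of one policy per system.
    return list(DEPLOYMENTS) if allowed(role, action) else []

print(scope_of("auditor", "read"))    # read access everywhere, from one rule
print(scope_of("auditor", "deploy"))  # denied everywhere, from the same rule
```

An auditor reviewing this kind of setup checks one policy table rather than reconciling separate access lists on every system — which is exactly why consolidation tends to speed audits up rather than slow them down.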
