Interoperability in the Data Center

Remember the old days when you could argue with coworkers whether buying everything from one vendor to guarantee interoperability was better or worse than buying best-of-breed tools and cobbling them together? Ah, the good old days. Now, for better or worse, every data center is a test of interoperability. No data center stays the same, and even if it could, a merger or partnership would force the integration of vastly different equipment. Interoperability is the “new normal,” if you like buzzwords.

Non-technical management is always good for a laugh on this topic. “We’re all using Ethernet, right? So they all just plug together, right?” Enjoy the chuckles, then get back to work.

While there are multiple angles to the interoperability process (anyone want to talk about Multichassis Link Aggregation versus Shortest Path Bridging?), let’s focus on the first word in data center: data.

Those non-technical managers point to racks of storage systems and assume that’s where all the data lives. Ah, if it were only that simple. True, data lives in those storage systems on the spinning drives and the static memory bits in SSD storage, but that’s not all the data. Every server has internal storage, every NAS (Network Attached Storage) appliance has data, every SAN (Storage Area Network) device has data, and sharing that data can be tough. Managing all that data cohesively might be impossible for you right now, but help is on the way.

“Cloud” is used in many ways, but private clouds of services and storage enter into most medium to large data center conversations sooner or later. And taking a cloud approach to storage means data from here will be available there as well.

Sometimes, the connections stymie interoperability. Does your SAN run on Fibre Channel or iSCSI? If one array speaks one protocol and another speaks the other, can you still provide interoperability? With EqualLogic SANs, you can choose between iSCSI and Fibre Channel, so both options are covered. Sometimes, connections increase interoperability.
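Not sure which protocol a given array on your floor actually speaks? Discovery is cheap. Below is a minimal sketch, assuming a Linux host with the standard open-iscsi tools installed, that asks a storage portal which iSCSI targets it exposes; the portal address is a placeholder, not a real array on your network.

```python
# A minimal sketch, assuming the open-iscsi tools (iscsiadm) are installed
# on a Linux host. The portal address is a placeholder, not a recommendation.
import subprocess

def discover_iscsi_targets(portal="192.168.10.50:3260"):
    """Ask an iSCSI portal which targets it exposes (SendTargets discovery)."""
    result = subprocess.run(
        ["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", portal],
        capture_output=True, text=True, check=True,
    )
    # Each output line looks like: "<portal>,<tag> <target IQN>"
    return [line.split()[-1] for line in result.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    for iqn in discover_iscsi_targets():
        print(iqn)
```

If the portal answers with a list of IQNs, the array speaks iSCSI; if it doesn’t, you are probably looking at a Fibre Channel fabric and a zoning conversation instead.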

Interoperability includes brand names, not just connectors. Vendors that work together make interoperability happen: SUSE, for example, got involved in the Emerging Solutions Ecosystem to integrate its OpenStack work with Dell’s open source software Crowbar. Open source software currently leads the interoperability march, so make sure your vendor of choice plays well with others. The “friendlier” your vendor, the more available your data across multiple platforms. Going “big” with your data? Make sure your vendor supports Apache Hadoop, the primary Big Data infrastructure tool today.
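If “supports Hadoop” sounds abstract, the easiest way to picture it is Hadoop Streaming, which lets any program that reads standard input and writes standard output act as a mapper or reducer. Here is a sketch of the classic word-count pair in Python; file names and paths are placeholders for whatever your own cluster uses.

```python
#!/usr/bin/env python3
# mapper.py -- classic Hadoop Streaming word-count mapper.
# Reads raw text on stdin and emits "word<TAB>1" pairs on stdout.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- matching reducer. Hadoop Streaming delivers the mapper output
# sorted by key, so we just sum the counts for each run of identical words.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, _, value = line.rstrip("\n").partition("\t")
    if word == current_word:
        count += int(value)
    else:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, int(value)

if current_word is not None:
    print(f"{current_word}\t{count}")
```

You would submit the pair with something along the lines of hadoop jar hadoop-streaming-*.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input in_dir -output out_dir, with the jar path and directories adjusted for your own installation.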

And look for vendors with products that promise interoperability in the name itself. When you first see the name Dell Fluid File System, what comes to mind? A locked-down, proprietary option? Absolutely the opposite, especially when connected to Dell’s DX Object Storage Platform to control and distribute digital content, from Web publishing to Big Data to end-of-life safe archive options.

You may not be able to judge a book by its cover, but interoperability in the data center is a different animal. Even the name can tell you that data interoperability is no longer fiction.

James Gaskin writes books, articles, and jokes about technology, and consults for those who don’t read his books and articles. Email him at [email protected].

Tags: Data Center, Technology