No matter what industry you’re in, there is someone having grand success with a big data project… and many, many more having no success at all.
Big data projects truly can revolutionize an organization and help physicians find better disease treatments, athletes find weaknesses in their game or retailers better serve their customers. But before you start making grandiose promises to everyone at your organization, take a look around and make sure you have what you need to make a big data project a success.
Systems integration – To solve a puzzle, you need to collect all the clues. So if you want to discover valuable new insights about your users, you need to pool together everything you know about them — and that means you’ll want to do a thorough inventory of your resources and integrate some systems. Integration is rarely quick, simple or pleasant, but it might just be crucial to bringing your big data dreams to fruition.
Adequate storage – Big data obviously requires big storage capacity. Luckily, storage is cheap these days, so choose the media that best suit your needs and stock up. Elastic cloud storage offers excellent scalability and can be economical: it lets you quickly add capacity when you need it and cut back when you find you don't. And remember, you don't need to collect and store everything. Talk to your compliance officers about your data retention obligations, talk to your stakeholders about what they want to achieve and trim the fat as much as possible.
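To make the retention point concrete, here is a minimal sketch of trimming stored records against a retention window. The record shape and the 365-day policy are illustrative assumptions, not anyone's actual compliance rule:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # assumed policy; confirm with your compliance officers

def trim_expired(records, now=None, retention_days=RETENTION_DAYS):
    """Keep only records newer than the retention cutoff.

    `records` is a list of (record_id, created_at) pairs, a hypothetical
    shape chosen for this sketch.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [(rid, ts) for rid, ts in records if ts >= cutoff]

# Example: one record inside the window, one well outside it.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    ("a", datetime(2024, 5, 1, tzinfo=timezone.utc)),  # 31 days old: kept
    ("b", datetime(2022, 1, 1, tzinfo=timezone.utc)),  # over two years: trimmed
]
print(trim_expired(records, now=now))  # only record "a" survives
```

In practice the same filter would run as a scheduled job against your actual store, but the decision logic (a dated cutoff per data class) is the part to agree on with stakeholders first.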
Processing power – Storage for big data may be cheap these days, but the computing power necessary to quickly draw insights from that data may be more expensive. Honestly assess your needs for high-performance computing or in-memory analytics, and invest in the right tools — it doesn’t matter how big or how good your data sets are if you can’t make use of them.
Security – When you integrate systems and data flows, identity and access management must become a higher priority. The more data is centralized, the trickier it is to maintain proper segregation of duties and to ensure that intellectual property and corporate secrets are not accessed by unauthorized users. Large, centralized databases are also very tempting attack targets, so adequate encryption — both for data at rest and data in transit — is essential.
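One lightweight way to keep segregation of duties explicit as data centralizes is a role-based access check. The roles, resources and actions below are illustrative assumptions for this sketch, not any real product's model:

```python
# Minimal role-based access sketch: each role maps to the resources and
# actions it is granted. Names here are hypothetical examples.
ROLE_GRANTS = {
    "analyst":  {"sales_db": {"read"}},
    "engineer": {"sales_db": {"read"}, "etl_pipeline": {"read", "write"}},
    "admin":    {"sales_db": {"read", "write"}, "etl_pipeline": {"read", "write"}},
}

def is_allowed(role, resource, action):
    """Return True only if the role's grants cover this action on this resource."""
    return action in ROLE_GRANTS.get(role, {}).get(resource, set())

print(is_allowed("analyst", "sales_db", "read"))   # True
print(is_allowed("analyst", "sales_db", "write"))  # False: not in the grant set
```

The design choice worth noting is the default-deny posture: an unknown role, resource or action falls through to an empty set and is refused, which is the behavior you want when new systems keep joining the pool.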
Expertise – These days, the hottest, hippest job title is “data scientist.” The trouble is, nobody is entirely certain what a data scientist should be. An IT engineer who reports to the CIO? A social networking genius who reports to the CMO? A scientist with PhDs in statistics and psychology? A high school kid with a million Twitter followers? Or someone already on staff who just needs some extra training? Regardless of who’s “in charge of” a big-data project, it’s clear that they’ve got to be creative, analytical, collaborative, curious and always willing to learn new skills. Make sure you have the right people for the job and invest in their career development.
About the Author
Sara Peters is Editor in Chief of EnterpriseEfficiency.com, a UBM Tech community.