
Is Data Complexity Blinding Your IT Decision-Making?

The growing complexity of data about IT environments is making it increasingly difficult for organizations to make effective IT decisions.

According to a new study by Forrester Research, commissioned by Data as a Service (DaaS) company BDNA (creator of the Technopedia repository of information on enterprise hardware and software), 73 percent of high-level IT decision makers cite the complexity of data as the largest challenge in making effective IT decisions in the next 12 months.

"We're a big company with lots of systems, and harmonization is a monumental task-taking inventory, what systems are out there, what validation state they're in, what systems they can combine," says an IT corporate quality executive at a multibillion dollar equipment manufacturer interviewed for the study. "And feeding this information into change control, consolidation and validation-where they are and status tracking. We are trying to find one global solution, but end up doing it with multiple tools-and sometimes manually."

There are two main drivers of the accelerating challenges around data complexity, says BDNA CMO Mahesh Kumar: innovation and the Internet of Things. Innovation by both an organization and its vendors (i.e., new products) and the introduction of complex new technologies like cloud, mobile and virtualization add to the volume and complexity of data. The addition of new connected devices such as wearables, vehicles and even buildings that are part of the Internet of Things will only exacerbate the problem.

"Enterprises need to take a step back and honestly assess the magnitude of the problems that lie ahead," says Constantin Delivanis, CEO of BDNA. "Without doing so, it is impossible to take correction action. Compounding the current data problem is the amount of updates vendors are making to their products-giving customers decision paralysis due to the lack of clean and purposeful data. In an analysis of our data, Microsoft, IBM and BMC Software combined to update more than 16,500 software products in the first nine months of this year!"

"Several years ago, outsourcing was going to reduce all the money we spend in IT," Kumar says. "It really didn't happen. Then virtualization was going to be the panacea. Cloud computing is it now. The thing that everyone is overlooking is that each of these endeavors is complicating the data behind it all. A lot of the efficiency you gained in virtualization is lost in trying to manage the data or the environment itself. You're simplifying one aspect but complicating another. We believe it's the data that's getting so complicated right now that the benefits of innovation are getting delivered in a very localized fashion."

And that complexity can translate directly into unnecessary costs. Kumar points to one BDNA customer, a Fortune 100 financial firm, that had to migrate 300,000 desktops to Windows 7. The company initially tried to perform the migration manually: it had close to 20 employees working on the project for nine months with very little progress to show for it. They were struggling to compile data on the software the company owned, the versions of that software and what those versions were compatible with. Moreover, only a little more than 50 percent of the information they had compiled was accurate.

With BDNA's help, the company completed the entire analysis in two weeks with 99 percent accuracy. In doing so, IT discovered that it had purchased 6,000 Microsoft Project and Visio licenses, deployed 5,000 of those licenses, but was only using 1,000. Eliminating that waste allowed IT to immediately realize $1.2 million in savings.

Another example is a Fortune 200 financial services company that used BDNA's DaaS solution to consolidate 112 human resource systems in 12 countries and migrate to an HR cloud solution. In doing so, the company discovered that it had 300 servers deployed that weren't doing anything. Eliminating those servers saved the company $3 million.

BDNA's DaaS aggregates inventory data and normalizes it to a consistent taxonomy. It fixes inconsistencies, resolves duplicates and removes irrelevant low-level data from analyses; in a migration, for example, such low-level data might include drivers, DLLs and hotfixes. Essentially, BDNA gains new information on deployment patterns with each engagement and is able to leverage that experience on future engagements.
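To make the idea concrete, here is a minimal sketch in Python of what that kind of normalization step could look like; the alias table, field names and record layout are hypothetical illustrations, not BDNA's actual Technopedia schema:

# Illustrative only: map raw, inconsistent discovery strings to a canonical
# (publisher, product, version) entry, drop low-level artifacts such as
# drivers, DLLs and hotfixes, and collapse duplicates per host.

CANONICAL = {  # hypothetical alias table
    "MS Project Pro 2010": ("Microsoft", "Project Professional", "2010"),
    "Microsoft(R) Project Professional 2010": ("Microsoft", "Project Professional", "2010"),
    "Visio Std 2010": ("Microsoft", "Visio Standard", "2010"),
}

IGNORED_TYPES = {"driver", "dll", "hotfix"}  # low-level noise for a migration analysis


def normalize(raw_records):
    """Return one deduplicated record per (host, canonical product)."""
    seen = {}
    for rec in raw_records:
        if rec.get("type") in IGNORED_TYPES:
            continue  # irrelevant low-level data
        canon = CANONICAL.get(rec["name"].strip())
        if canon is None:
            continue  # unknown string; in practice it would be flagged for research
        publisher, product, version = canon
        seen[(rec["host"], canon)] = {
            "host": rec["host"], "publisher": publisher,
            "product": product, "version": version,
        }
    return list(seen.values())


raw = [
    {"host": "pc-001", "name": "MS Project Pro 2010", "type": "application"},
    {"host": "pc-001", "name": "Microsoft(R) Project Professional 2010", "type": "application"},
    {"host": "pc-001", "name": "usbstor.sys", "type": "driver"},
]
print(normalize(raw))  # one canonical record instead of three raw rows

Two raw strings and a driver entry collapse into a single canonical record, which is the kind of cleanup that makes a large migration analysis tractable.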

As Kumar puts it, IT patterns are fairly consistent across many different enterprises. BDNA can often aggregate and normalize the majority of an organization's data with very little effort and then focus its attention on what is unique to the engagement in question.

Once the data has been aggregated and normalized, BDNA appends rich market intelligence to it. In a migration, for instance, it will append information on hardware readiness, 64-bit compatibility, Windows 7 compatibility, upgrade paths and so forth. Some of the market intelligence allows customers even finer control over their environments.
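A hedged sketch of that enrichment step, continuing the normalized records above and assuming a made-up market-intelligence catalog rather than BDNA's actual data:

# Illustrative only: join normalized inventory records against a catalog of
# product attributes (compatibility, upgrade path) keyed by the same
# canonical (publisher, product, version) identity.

MARKET_INTEL = {  # hypothetical catalog entries
    ("Microsoft", "Project Professional", "2010"): {
        "x64_compatible": True,
        "win7_compatible": True,
        "upgrade_path": "Project Professional 2013",
    },
}


def enrich(records, catalog=MARKET_INTEL):
    """Attach catalog attributes to each normalized record, where known."""
    enriched = []
    for rec in records:
        key = (rec["publisher"], rec["product"], rec["version"])
        enriched.append({**rec, **catalog.get(key, {})})
    return enriched


sample = [{"host": "pc-001", "publisher": "Microsoft",
           "product": "Project Professional", "version": "2010"}]
print(enrich(sample))  # record now carries compatibility and upgrade-path fields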

"We have data that shows that the oldest 20 percent of the servers in a data center consume up to 40 percent of the power in that data center," Kumar notes. "If you're trying to reduce power consumption, we can pinpoint exactly the 20 percent that will get you the biggest bang for your buck."

Thor Olavsrud covers IT Security, Big Data, Open Source, Microsoft Tools and Servers for CIO.com. Follow Thor on Twitter @ThorOlavsrud.
