The age of virtualisation

The NZ Defence Force and other early adopters discuss the benefits and pitfalls.

Attend almost any ICT-related conference these days and one word seems to crop up again and again: virtualisation. It’s not hard to understand why. At a time when data centres are running up against the constraints of limited cooling and back-up generation capacity, as well as physical space, any technology that promises to free up servers, power and network infrastructure will be welcomed with open arms. According to a recent survey by The Strategic Counsel, conducted on behalf of CA, at least 69 per cent of New Zealand and Australia-based organisations with more than 500 employees have already adopted server virtualisation, and this installed base is expected to grow by 28 per cent in the next 18 months.

For many organisations, the ‘green’ benefits of virtualisation, chiefly the prospect of reduced power consumption, are enough to provide a business case on their own.

Paul Harapin, Australia and New Zealand managing director of virtualisation software company VMware, claims the past five or six years have seen server power consumption increase by around 400 per cent, and that more than half of office power consumption now comes from IT. This comes at a time when the cost of power is rising and organisations are under pressure from customers, shareholders, governments and even their own employees to reduce their carbon footprints.

Harapin says that average server consolidation ratios are between 10:1 and 15:1 in production environments and between 15:1 and 20:1 in development and testing environments. Each server removed from the data centre as a result of virtualisation will save as much as $2000 in power charges and the equivalent of 12.5 tonnes of carbon dioxide emissions a year — which in green terms is the same as planting 55 native trees.
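Harapin’s figures scale linearly with each server removed, so a back-of-envelope estimate is straightforward. The sketch below uses only the per-server numbers quoted above; the constants are the article’s approximations, not vendor-published data, and the function name is our own.

```python
import math

# Back-of-envelope annual-savings estimate using the per-server figures
# quoted in the article. These are rough approximations, not vendor data.
POWER_SAVING_NZD = 2000     # power charges saved per removed server, per year
CO2_SAVING_TONNES = 12.5    # tonnes of CO2 avoided per removed server, per year
TREE_EQUIVALENT = 55        # 'native trees planted' equivalent per server

def consolidation_savings(physical_servers, ratio):
    """Estimate annual savings when physical_servers are consolidated
    at a ratio:1 consolidation ratio (e.g. ratio=10 for 10:1)."""
    hosts_needed = math.ceil(physical_servers / ratio)
    removed = physical_servers - hosts_needed
    return {
        "servers_removed": removed,
        "power_saving_nzd": removed * POWER_SAVING_NZD,
        "co2_tonnes": removed * CO2_SAVING_TONNES,
        "tree_equivalent": removed * TREE_EQUIVALENT,
    }

# Example: 100 production servers consolidated at 10:1
print(consolidation_savings(100, 10))
# → 90 servers removed, $180,000 and 1125 tonnes of CO2 per year
```

At Harapin’s quoted ratios, even a modest server fleet yields savings large enough to anchor a business case on power costs alone.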

Of course, virtualisation also has its own set of ‘cons’. According to the Strategic Counsel survey, virtualisation deployment has led to the creation of multiple, mixed server environments, which in turn has caused difficulties with reporting, visibility and metrics. The survey showed 21 per cent of the Australasian organisations that have deployed server virtualisation have either failed to realise an ROI or are unsure whether they have done so, while 51 per cent were unable to say whether or not the deployment has been successful.

At a recent conference on virtualisation hosted by CIO and Computerworld in Auckland, senior IT executives from three early adopters of this technology — the New Zealand Defence Force, Manukau Institute of Technology and Carter Holt Harvey — related their virtualisation experiences. While all three organisations were realising benefits from virtualisation, they had also encountered a number of pitfalls — ranging from software licensing and governance issues, to disk contention and the physical difficulties of re-cabling and shifting servers around.

Project Crush

Captain Roger MacDonald, IT security programme director at the NZDF’s Joint Information Services Agency, says the NZDF started to look at server virtualisation in 2005, as part of a wider IT simplification and consolidation programme known as Project Genesis. MacDonald says the NZDF’s complex IT environment, which includes 790 servers in New Zealand and more than 100 servers in about 20 different overseas locations, presented “a clear case for convergence and consolidation”.

Genesis, which is not expected to be complete until 2020, consists of four separate transformation programmes to simplify and streamline the NZDF’s IT operations. Although the deadline is still 13 years away, MacDonald says the pressure is already being felt, due to the scale of the task ahead.

With a main data centre located in Wellington and a secondary data centre in Devonport, the NZDF’s IT infrastructure includes more than 14 data network layers, the largest of which serves about 9000 seats, as well as voice and video networks.

Some of these networks are connected to real-time sensors and weapons systems, several involve extra security layers and many of the systems must be available 24x7.

The hardware and software that makes all this tick has been supplied by a wide range of vendors, each with its own set of methodologies.

“It’s not just the hardware, it’s about people, skill sets and business processes,” says MacDonald. The NZDF began its virtualisation project with an evaluation programme, picking three virtualisation platforms for further examination. Throughout 2006 the IT department ran benchmarking tests and built up its virtualisation skills.

Preliminary tests on one of the NZDF’s largest networks, running on 390 servers in New Zealand, showed the average server CPU usage was only 2.85 per cent, with occasional peaks of 35.3 per cent.
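A crude way to read those utilisation figures: size each consolidated host against either the peak or the average load, subject to a safety cap, and the achievable consolidation ratio falls somewhere between the two bounds. The 70 per cent host-utilisation cap below is our assumption, not NZDF policy; the utilisation numbers come from the article.

```python
# Crude headroom estimate from the NZDF's preliminary figures:
# average 2.85% CPU, with occasional peaks of 35.3% (from the article).
# The 70% host-utilisation cap is an assumed safety margin, not NZDF policy.
avg_util = 0.0285
peak_util = 0.353
threshold = 0.70

# Worst case assumes every VM peaks at once; best case assumes load
# averages out across the host. Real ratios land between the bounds.
worst_case = int(threshold // peak_util)   # VMs per host if peaks align
best_case = int(threshold // avg_util)     # VMs per host on average load

print(f"worst case: {worst_case} VMs per host")   # 1
print(f"best case:  {best_case} VMs per host")    # 24
```

Harapin’s quoted production ratios of 10:1 to 15:1 sit comfortably inside these bounds, which is consistent with the NZDF’s conclusion that there was scope for virtualisation.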

“The bottom line was that there was scope for virtualisation,” he says.

MacDonald says when the network is fully virtualised it will free up the equivalent of 25 fully-populated server racks, cut power draw by 143 kilowatts and allow more than 300 servers to be recovered.

Project Crush began in earnest earlier this year, using VMware Virtual Infrastructure 3 and VMware Server as the main server virtualisation platform.

MacDonald says 60 NZDF servers in Auckland have been virtualised to date, and since August the project has been working to a target of ten physical servers virtualised per week. With little opportunity for downtime, he describes it as “virtualisation on the fly”. The NZDF is using HP DL585 and DL385 dual and quad-processor servers as its virtualisation hosts and the project is already paying “big dividends” in server recovery. In one case, 34 servers were virtualised onto three, and at this stage many of the redundant machines are being earmarked for reuse in further virtualisation work.

Physical factors such as server cabling and “shifting cabinets around” have been the biggest obstacles to Project Crush so far. For this reason the NZDF is concentrating on the secondary Auckland data centre first, because it offers a “cleaner environment” with less complicated cabling than the Wellington site.

The NZDF is tackling the problems posed by the security network layers with a combination of virtualisation and virtual private networking. As well, it plans to virtualise its SAN and some of its remote sites.

MacDonald says the main objectives of Crush are to save space in the data centres and reduce some of the load on their diesel-powered back-up generators. The project will also cut software licensing costs and reduce the organisation’s carbon footprint. Virtualisation can also reduce the number of servers that need to be deployed at sea or in land operations areas. For example, each frigate requires access to seven different network layers that previously required their own servers; virtualisation allows the same capability to be hosted on just two servers.

With land deployments, virtualisation could mean putting smaller, more easily transported servers into the field, or deploying additional back-up servers.

Consolidation and migration

Reducing hardware costs is one of the main objectives of a virtualisation programme at timber products company Carter Holt Harvey, which is aiming to consolidate at least half of its 400-plus servers by the end of 2007.

Krassi Modkov, manager of design and implementation at CHH Infotech, the company’s IT department, says scoping for the consolidation project began in late 2004 and the company is using Microsoft Virtual Server 2005 as its main virtualisation platform.

The business objectives include reducing the cost of server applications, simplifying the overall administration, back-up and support environment, improving scalability and reducing end-of-lease migration costs.

Modkov says CHH leases server equipment on a three-year cycle, which means it incurs extra costs every time the company has to migrate applications to new hardware.

After surveying the available options the company chose Microsoft’s virtualisation product, even though it was a relatively new technology. Describing CHH Infotech as “a Microsoft shop”, Modkov says it was easy for staff to work with the Microsoft virtualisation platform. Modkov adds he believed the long-term benefits of sticking with Microsoft outweighed any current shortcomings in Microsoft’s virtualisation platform. “We looked at Microsoft’s roadmap, and we believed them.”

Another reason for choosing Microsoft, says Modkov, was that the company was “slightly keener” than the other vendors. Microsoft flew in a consultant from the US free of charge to help with initial development. Modkov says this turned out to be a godsend, as the deployment tools weren’t very mature at the time.

Modkov says the Microsoft platform turned out to be easy to use and reliable, but back-up was not as easy as expected and some applications will not work in a virtual environment. Modkov also stresses strong governance processes are needed to track and control the spread of virtual servers.

Since 2005, around 100 server operating systems have been decommissioned at CHH and approximately 180 physical servers have been recovered.

Modkov says this has been a big contributing factor to a significantly reduced operational IT budget and virtualisation has improved CHH’s ability to respond to rapidly changing business needs.

“At the moment, we are very pleased with Microsoft’s product,” he says. “The benefits so far would not have been any different with, for example, VMware.”

Modkov says CHH is now ready to start accelerating the virtualisation programme and it may investigate other types of virtualisation in the future, for example shared OS virtualisation and desktop virtualisation.

Stress buster

Another early adopter of the technology, Auckland’s Manukau Institute of Technology, simplified its IT infrastructure and saved server-room space through virtualisation.

Systems specialist Daniel Kenna says he and his team started investigating virtualisation technology in 2004 and moved it into the production environment in 2005. At that point, MIT’s infrastructure consisted of a range of Intel, Sun, Compaq and TMC servers, crammed into 19 racks.

The proliferation of servers was placing heavy demands on cooling, overloading the UPS systems and requiring a lot of maintenance, all of which left the IT team stressed.

“There were a huge amount of cables and it was difficult to know what was plugged in where,” Kenna says. “Over time it became a complete mess. We ran out of rack space and ended up putting servers in wherever we could fit them.”

Another concern with the old infrastructure was it did not allow room for a development environment, which made testing and upgrading of live systems both disruptive and risky.

Kenna started using VMware’s GSX server for development and testing, but when he tried hosting some production services on it, he found the overhead was too high.

However, the virtualisation concept seemed promising so Kenna started implementing VMware’s more powerful ESX server. Initially this was version 2.5, though the IT department has since updated to ESX 3.0.

Two years after the programme began, Kenna says MIT has achieved a 64-to-five (roughly 13:1) consolidation ratio, with 95 per cent of its Microsoft servers now virtualised. Virtualisation has led to higher systems availability and business continuity, and the IT team is providing fast, reliable service to other departments. Kenna says the reduced stress levels have also resulted in a happier IT team.

Other benefits include quicker provisioning of servers, and Kenna says live migrations of virtual machines are now possible. Because the virtual machines consist simply of folders and files, they are easier to manage and back up.

Kenna says when two disks in MIT’s SAN failed, the system was up and running again in about 15 minutes, whereas recovery would previously have taken two days.

But Kenna warns that virtualisation also has its pitfalls. Software licensing has proved to be one of the issues at MIT. Kenna recommends checking with the suppliers whether they support their software being virtualised.

He adds that resource contention, particularly disk I/O contention, is more of a problem than high CPU utilisation, and for this reason MIT won’t be virtualising application servers such as its Exchange server. Another shortcoming of MIT’s current virtualisation platform is that it does not support PCI, parallel or USB devices, although this should be addressed in a future release of the software.

Managed services

Auckland’s North Shore City Council (NSCC) embarked on a sweeping virtualisation programme as part of a managed service contract it signed with IT outsourcing company Revera in March. “Basically we took our apps to Revera’s new data centre and virtualised all of them, with a couple of exceptions,” says information services manager Geoff Shaw.

Prior to the Revera contract, the council had already virtualised some of its applications using VMware’s GSX and ESX platforms. “We had about 70 physical servers and 100 logical [virtualised] servers,” says Shaw. “Virtualisation was piecemeal at this stage. Our main driver was the lack of capacity in our server room and the air conditioning, which was causing us some grief.”

This first taste of virtualisation was enough to convert NSCC to the concept when it started seeking a managed services provider early this year.

“When looking for partners, we were favouring anyone who is looking at virtualisation as a means of reducing our costs,” says Shaw.

As part of the contract, Revera took ownership of most of NSCC’s servers and virtualised all of its main applications, including Technology One financials, Exchange and several SQL databases. Revera is using VMware ESX as the virtualisation platform, running in an HP Blade server hardware environment.

The only applications not virtualised were a geographical information service running on Sun hardware, a data warehouse server and “a couple of legacy applications”.

“It was a pretty ambitious project, but we did it in six weeks,” says Shaw.

Revera is now about to start building a disaster recovery facility for NSCC at another data centre, which will also serve as the council’s test environment.

Under the new arrangement, Revera owns all of the IT equipment and is responsible for delivery of service to NSCC.

“We sold servers to Revera and they are now in the process of decommissioning them,” says Shaw.

Shaw says the biggest advantage of virtualisation is that, in conjunction with the managed services contract, it has brought NSCC’s costs down.

“The risk was pretty high, but there were really no issues,” says Shaw. “With a couple of applications, performance could be improved, but some are working faster.”

© Fairfax Business Media
