Knowledge is power, particularly when it comes to your IT estate. Knowing how your applications relate to each other, or exactly how much capacity there is in your Storage Area Network (SAN), can lead to considerable savings. In this time of economic uncertainty, that knowledge and understanding is more vital than ever: large organisations could make significant savings just by ensuring that their storage networks and software estates operate as efficiently as possible.

An accurate understanding of how all applications in an IT estate interact with each other, and the underlying architecture, is vital for several projects including: data centre virtualisation; data centre moves; application performance monitoring; current state assessments; and compliance, regulation and risk.

A detailed analysis of a SAN estate has the potential to greatly improve the cost efficiency of a SAN investment and provides companies with a real awareness of their storage status. However, further cost savings can be made through improving the processes that identify SAN usage and map the relationships between applications.

Traditionally, companies that recognise the need to map their application architecture or to analyse their SAN usage have relied on an in-house IT team to do it. But this can be a time-consuming process: the team is often required to write its own scripts, or to provision multiple tools and deploy them across the estate, which can take many months to roll out completely.

The data is then collated into a database or a reporting tool so that the business can correlate information across the different estates. That information may come from SAN management software such as SANscreen, from infrastructure data about the hosts, and from the output of the in-house scripts; because the sources are so disparate, correlating them often involves significant manual work.

Organisations that go through this process often need several people, and it can take two to three weeks to collate and correlate the data and produce reports. The exercise has to be repeated on a regular basis, but because the information is static, it must be collected afresh each time. Not only does this take a lot of time and effort, the output can also be inaccurate: by its nature, the information is out of date the moment it is collected, and it is likely to contain gaps and mistakes that can cost the business a lot of money.

On top of that there are other factors, such as the ongoing cost of supporting in-house scripts and customisation, or of maintaining multiple deployed tools. Worse still, if the internal resource responsible for the scripts leaves the company, the knowledge of how to support and update those scripts, and how to correlate their output, goes with them, meaning further training costs for the business.

One option is to buy off-the-shelf products that automate these processes, but these represent a large investment. An organisation not only pays for the product itself; it also needs an ongoing annual licence for the software, must spend the time to deploy it across the estate, and has to train its teams to use the discovery software.

However, these products have their own disadvantages: analysts must learn unique scripts in order to produce the reports, and the discovery of the SAN tends to run from frame to host, which does not offer the level of detail required.

Fortunately, there are multi-platform solutions out there that not only capture SAN or application data without the use of complex scripts, but can also capitalise on the investments already made. These discovery and reporting technologies automatically create a detailed analysis of a company’s SAN estate and usage status, or of its application architecture, removing the need for in-house custom-written scripts, reporting tools and all the associated support.

Furthermore, these SAN discovery solutions identify SAN usage from host to frame, across all data sources and at every tier of storage, so a granular level of detail is possible. Reports can therefore be tailored to specific customer needs, offering an in-depth view of SAN deployment, capacity utilisation and comprehensive usage metrics. Because data capture is automatic and always up to date, it is repeatable at no extra cost. Mapping the data no longer takes two to three weeks, so in terms of time, support and product cost, the saving is considerable.
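To make the host-to-frame idea concrete, the roll-up such a tool performs can be sketched in a few lines. This is a minimal illustration with entirely hypothetical hosts, frames and figures, not the output format of any particular product.

```python
# Hypothetical host-to-frame discovery records: each links a host to a
# storage frame, the tier it consumes, and allocated versus used capacity.
records = [
    {"host": "app-db-01", "frame": "frame-A", "tier": 1, "allocated_gb": 500, "used_gb": 320},
    {"host": "app-db-01", "frame": "frame-B", "tier": 2, "allocated_gb": 200, "used_gb": 40},
    {"host": "web-02",    "frame": "frame-A", "tier": 2, "allocated_gb": 100, "used_gb": 95},
]

def utilisation_by_frame(records):
    """Roll per-host discovery records up to per-frame capacity utilisation (%)."""
    totals = {}
    for r in records:
        alloc, used = totals.get(r["frame"], (0, 0))
        totals[r["frame"]] = (alloc + r["allocated_gb"], used + r["used_gb"])
    return {frame: round(used / alloc * 100, 1)
            for frame, (alloc, used) in totals.items()}

print(utilisation_by_frame(records))
```

Because the records retain the host, frame and tier for every allocation, the same data can be re-sliced per host, per tier or per application without collecting it again, which is where the repeatability saving comes from.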

Such analysis has the potential to greatly improve the cost efficiency of an IT estate and provide companies with a real awareness of their storage status and application architecture.

Using the information generated by a SAN discovery solution, companies can make accurate, cost-aware business decisions about storage reclamation, decommissioning and buy-back. The information also matters from a risk management perspective. For example, a well-known bank recently linked a production system and a development system to the same SAN array by accident during a restructuring of its IT systems.

As part of this restructuring, the bank load tested the development system, which meant filling up the SAN array. Unfortunately, this brought down a production trading application that was using the same array. The bank was fined several thousand pounds for the resulting outage, all because it did not know that both systems were attached to the same array. A SAN discovery tool would have spotted this.
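The check that would have caught the bank's problem is simple once the host-to-array mappings exist. The sketch below, using invented host and array names, flags any array that serves more than one environment:

```python
# Hypothetical host -> environment -> array mappings from a discovery run.
mappings = [
    ("trade-prod-01", "production",  "array-7"),
    ("dev-load-01",   "development", "array-7"),
    ("web-prod-02",   "production",  "array-3"),
]

def shared_array_conflicts(mappings):
    """Return arrays attached to hosts from more than one environment."""
    envs_by_array = {}
    for host, env, array in mappings:
        envs_by_array.setdefault(array, set()).add(env)
    return sorted(a for a, envs in envs_by_array.items() if len(envs) > 1)

print(shared_array_conflicts(mappings))  # flags array-7
```

Run against an automatically refreshed mapping, a check like this turns the "we did not know they shared an array" failure into a routine report line.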

Additionally, if you understand what your applications do and where they sit, you can map how much storage each of them is using. From there, you can accurately charge the business back for the space it consumes and prevent waste. You can also identify what has been left unallocated, so it is clear what is available when further storage space is required, preventing the unnecessary purchase of extra capacity.
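The chargeback arithmetic described above is straightforward. This sketch uses an assumed per-gigabyte rate and invented application allocations purely for illustration:

```python
# Illustrative figures only: array capacity, an assumed monthly rate per GB,
# and per-application allocations taken from a discovery report.
ARRAY_CAPACITY_GB = 2000
RATE_PER_GB = 0.25  # hypothetical monthly cost per GB

allocations = {"payments": 600, "crm": 350, "reporting": 150}

# Charge each application for the space it has been allocated.
chargeback = {app: gb * RATE_PER_GB for app, gb in allocations.items()}

# Whatever is not allocated is available before any new capacity is bought.
unallocated_gb = ARRAY_CAPACITY_GB - sum(allocations.values())

print(chargeback)
print(unallocated_gb)
```

With 1,100 GB allocated of a 2,000 GB array, 900 GB remains available, which is exactly the figure that prevents an unnecessary capacity purchase.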

Whatever your reason for understanding your IT estate more accurately – data centre move, virtualisation, compliance – there is no doubt that to make real cost savings you need to use automation tools.