ITAMS Blog
How to release value from your IT assets

According to the Gartner Group, “organisations can realise cost savings of between 5–35% by implementing focused software asset management practices”. So how can you realise cost savings or even release value from your IT assets?

There are a number of ways to do this. IT asset value can be ‘hard’ (monetary) or ‘soft’ (the efficiency and effectiveness of operations). Hard value is realised by minimising spend and selling used assets; soft value comes from reducing the cost of associated operations. Cost avoidance is a large component of the latter: avoiding the purchase of more software than is actually needed, and re-harvesting existing surplus software.

The following checklist provides a quick view of where value and savings can be derived:

  • Sell unused software: sell software that is no longer required. To do this you must have a good handle on your IT assets: know exactly what licences you have, who is using them, how they are being used and on what hardware, and whether you are likely to embark on a software migration/upgrade project or an M&A that may result in unwanted licences. An ELP (Effective Licence Position) supported by good-quality inventory and usage data will also help to pinpoint unwanted licences (see the sketch after this list).
  • Stay compliant: avoid unbudgeted and unforeseen audit costs (including internal resources, third parties, tools and fees) by being audit ready. Understanding natural licence renewal points and anticipating licence demand in the business also helps you stay compliant.
  • Maximise product usage rights and entitlements: ensure you are exploiting the usage rights that come with your software (e.g. the free second install on an allocated laptop that comes with some products).
  • Provide the right application for the right user: ensure that users have access to what they need to perform their roles, improving efficiency and productivity. This includes the right edition and user type for a product – the default position is often to purchase the highest-featured edition, which may not be needed. Target training requirements and improve collaboration.
  • Reduce support and maintenance costs: ensure that only approved, standard applications are installed. This reduces infrastructure and support costs, as well as related IT functions such as security. Support costs can be reduced significantly: a 2008 KPMG survey found that a 50% reduction in support costs can be achieved by managing software effectively.
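To illustrate the ELP arithmetic mentioned above, here is a minimal sketch (all product names and counts are invented, and this is not ITAMS tooling) that compares licence entitlements against discovered installs and actual usage to flag both compliance gaps and re-harvest or resale candidates:

```python
# Minimal ELP-style surplus check: entitlements vs. installs vs. usage.
# All product names and counts are hypothetical illustrations.

entitlements = {"ProductA": 500, "ProductB": 120, "ProductC": 80}   # licences owned
installs     = {"ProductA": 430, "ProductB": 140, "ProductC": 25}   # instances discovered
active_usage = {"ProductA": 390, "ProductB": 100, "ProductC": 10}   # instances actually used

for product in sorted(entitlements):
    owned = entitlements[product]
    installed = installs.get(product, 0)
    used = active_usage.get(product, 0)
    position = owned - installed   # negative => compliance gap
    idle = installed - used        # installed but unused => re-harvest candidates
    if position < 0:
        print(f"{product}: shortfall of {-position} licences (audit risk)")
    elif position > 0 or idle > 0:
        print(f"{product}: {position} spare licences, {idle} idle installs "
              f"(re-harvest or resale candidates)")
```

The same comparison, fed with good-quality inventory and usage data, is what turns a raw licence count into an actionable position.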

Understanding your IT assets allows you to:

  • budget and forecast effectively
  • support and facilitate internal re-charging and cost allocation for services
  • prepare for and implement upgrade/migration projects more effectively
  • identify security vulnerabilities and apply fixes and patches
  • use your existing assets more efficiently

ITAMS can help with services including building a business case for SAM, gap and risk analysis, audit defence and ELP generation.

Why data accuracy has an impact on your bottom line

Pursuing data accuracy can initially be unnerving: not knowing the true state of your estate can hide a plethora of issues that do not come to light until a vendor, such as IBM, demands an audit. When this happens there is no window for remediation work, nor for investigating how best to proceed.

On the other hand, good-quality data is a ‘win’ and can help with:

  • planning for warranty – reducing maintenance budgets
  • managing leases – reducing cost at end of life
  • software optimisation – reducing licensing costs
  • asset reuse and increased reliability – maximising useful life

If IT asset information is not accurate it can have a dramatic effect on the bottom line. Two cases come to mind, both of which turned on the accuracy of the information: one client had an exposure of approximately £3 million, whilst another made a saving of about £20 million.

On another occasion, ITAMS was asked to conduct a due diligence audit for an outsourcing organisation. This involved an audit of all the hardware which was going to be managed under the contract. Initially we were told there were 10,000 desktops to audit; within the first two weeks of the project it became apparent that the number would be much higher. On completion, we had audited approximately 17,500 systems. The implications were large and the atmosphere difficult, as many decisions, such as licensing and hardware maintenance budgets, had been made on the basis of the assumed figure.

With all such undertakings there are generally more assets than anticipated – the norm is 10% to 15% extra. However, this can swing the other way when an outsourcer manages an audit, where reporting can be overstated by 30%. A classic finding is maintenance being paid on 10–15% of hardware assets which no longer exist, and approximately 20% of software assets that are no longer in use.
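The reconciliation that surfaces this classic finding can be sketched very simply. In the toy example below (the asset tags are invented), the maintenance contract list is compared against the assets actually verified in the estate:

```python
# Hypothetical reconciliation: which assets are we paying maintenance on
# that no longer exist in the verified estate?

maintenance_contract = {"HW0001", "HW0002", "HW0003", "HW0004", "HW0005"}
verified_estate      = {"HW0001", "HW0003", "HW0004", "HW0006"}

ghosts = maintenance_contract - verified_estate     # paid for, but not found
unmanaged = verified_estate - maintenance_contract  # found, but not covered

print(f"Paying maintenance on {len(ghosts)} missing assets: {sorted(ghosts)}")
print(f"{len(unmanaged)} live assets with no maintenance cover: {sorted(unmanaged)}")
```

Run across a real estate, the "ghosts" set is where the 10–15% of wasted maintenance spend typically hides.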

In summary, inaccurate data will affect the bottom line, both negatively and positively. The effect may not be immediate, but the longer it is left, the greater the risk that an event will expose problems which could be both embarrassing and expensive.

Walk-around audits versus Electronic Network Tool audits

Traditionally, walk-around or physical audits have been the mainstay for IT asset managers, but in today’s technology-rich climate there is a plethora of electronic network audit tools that offer faster data collection.

Electronic audit tools are, however, complementary to walk-around audits, not a replacement. Physical audits are expensive and, if used, should only be used to kick-start a project. Drivers for a physical audit might be the level of accuracy required, coverage of secure or non-LAN-connected assets (e.g. stores), and the additional information it offers (e.g. location, assignment, asset tag, nearest phone). Cross-correlation of physical and discovery data is highly useful, especially when integrated with an asset lifecycle tracking system.

The overview below sets out the two common data collection methods:

Electronic Network Audit Tools

Discovers and scans all hardware devices on the IT network, capturing detailed hardware attributes plus all installed software and operating systems.

Advantages:
  • detailed technical data
  • non-invasive
  • fast and repeatable

Disadvantages:
  • cost
  • IT security and change control blockers
  • no real environmental or location data available
  • cannot audit non-networked systems (cupboards or secure areas)
  • cannot ‘see’ what has been missed

Walk-around / physical audit

Physically/visually identifies each hardware device.

Advantages:
  • definitive information
  • detailed environment, status and location data (crucial for an ELP)
  • stock coverage of non-networked devices (store rooms etc.)
  • unique device tags
  • user assignment information

Disadvantages:
  • lengthy and more costly
  • human error (scanning technology reduces errors)
  • can be disruptive if not well planned
  • limited technical information gathered (unless a local login is performed)

Both data collection methods provide practical options for managing your hardware estate; however, neither method on its own can provide 98%+ data accuracy.

So what is the best method?

From experience, several obstacles can get in the way of correctly managing your hardware estate. Firstly, at management level, key decision makers tend to argue: “We know what we have, I do not see any problem – why fix it when it is working?”. Lower down the organisation the story is different: “We have always done it this way, but we know the data is less than 70% accurate”. Eventually, as installs, moves, adds and changes occur, the inventory repository (if there is one) is not kept current (old assets are not removed), and the accuracy level falls even further.

There is therefore often a need for a one-off physical audit to re-initiate and validate a lifecycle system’s core data, but no one wants to keep repeating it. Discovery plus a slick barcoding add-on to the lifecycle system acts as a way of constantly refreshing and validating the data, avoiding a costly re-audit while providing the rich benefits of cross-correlated physical and discovery information (great for maintaining the constantly changing hardware CI layer of a CMDB, for example).
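The cross-correlation idea can be sketched as follows (the asset tags, dates and wording are invented, and this is not a specific lifecycle product): a device seen by both network discovery and a barcode scan is confirmed, while a device seen by only one source is flagged for follow-up.

```python
from datetime import date

# Hypothetical last-seen dates per asset tag from each source.
discovery_seen = {"A100": date(2016, 5, 1), "A101": date(2016, 5, 2)}
barcode_seen   = {"A100": date(2016, 4, 20), "A102": date(2016, 4, 22)}

all_tags = discovery_seen.keys() | barcode_seen.keys()
for tag in sorted(all_tags):
    on_network = tag in discovery_seen
    scanned = tag in barcode_seen
    if on_network and scanned:
        status = "confirmed by both sources"
    elif on_network:
        status = "on network but never physically scanned: check location/tag"
    else:
        status = "physically present but not on network: stored, secure or retired?"
    print(f"{tag}: {status}")
```

Each pass of this check refreshes confidence in the repository without a full re-audit.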

So my message to asset managers is: are you sure you want to keep managing data you know is bad, or would you prefer to kick-start a new, more efficient mechanism which generates real value based on trustworthy records?

I recently visited an organisation which, due to its business sector, had a “laissez-faire” attitude to tracking its assets. However, recent changes from the Financial Reporting Council (FRC) mean that this has to change. Below are the key areas which have to be considered before committing to change, as it will affect the whole organisation and may conflict with existing cultures.

  • Understand why IT Asset Management is important to the business.
  • Demonstrate the ways in which this information is or could be used.
  • Get Board level ‘buy-in’ to the above – crucial!
  • Produce a business plan which clearly shows why this is needed, what the cost would be and how the returns will be realised.
  • Have a clear roadmap / strategy over time, not just a single project.
  • Have an IT Asset Policy which is clearly communicated and understood.
  • Have solid lifecycle touch / capture points.
  • Produce sound and tested procedures which are simple to follow, and automate using barcoding.
  • Know what accuracy is required and why, and who your data customers are.
  • Decide where the data is going to be held, who is responsible for it and who should have access to it.
  • Know what sources of data are currently available – and whether they are trustworthy.
  • Identify what other sources are required for your business purposes.
  • Have a feedback system to test the accuracy, including a service compliance function in larger organisations.

Key: Clarify and align responsibility across the whole organisation.

Tip: Start to plan for the future.

Why locating hidden IT assets is key to delivering accurate data for vendor audits

An accurate effective licence position (ELP) depends on the accuracy of the underlying IT inventory.

Many organisations cannot produce an asset inventory report with 98% accuracy. Ensuring you have a 360° view of your hardware estate is an essential piece of the jigsaw: knowing what hardware you have, where it is located and what status it has.

Technology audits alone typically provide only between 85% and 95% accuracy. All too often there is a reliance on a variety of technologies, spreadsheets and Active Directory, with no coherent strategy for reaching the level of accuracy that today’s complex licensing models demand.

The 360° view of your hardware estate must also take into account licensing implications and the location and hardware profile of devices which are stored, retired, passive, in production or in disaster recovery systems. Once this information has been carefully collated it can be reflected correctly in SAM data, so that the correct effective demand is used in ELP calculations.
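As a sketch of how device status feeds effective demand (the statuses, hostnames and rules below are illustrative; real licence terms vary by vendor, and some agreements exempt disaster recovery instances), only devices in statuses that consume a licence are counted:

```python
# Illustrative rule: which device statuses consume a licence?
# Whether DR consumes a licence varies by vendor and agreement.
CONSUMES_LICENCE = {"production": True, "disaster_recovery": True,
                    "stored": False, "retired": False, "passive": False}

devices = [
    {"host": "srv01", "status": "production"},
    {"host": "srv02", "status": "disaster_recovery"},
    {"host": "srv03", "status": "stored"},
    {"host": "srv04", "status": "passive"},
    {"host": "srv05", "status": "retired"},
]

effective_demand = sum(1 for d in devices if CONSUMES_LICENCE[d["status"]])
print(f"Effective demand: {effective_demand} of {len(devices)} devices")
```

Without the status data, all five devices would be counted and the ELP would overstate demand.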

How to boost data accuracy with a Software Catalogue

Many large organisations struggle to maintain licence management accuracy within their SAM processes. ITAMS recommends that clients use a catalogue-centric licence management tool, which helps to marry installed software to SKUs (Stock Keeping Units) and procurement contracts.

The key benefits of using a software catalogue are substantial: less dependency on discovery tools (data from other inventory tools can be used for software recognition), time and money saved on manual effort, and a higher degree of data accuracy.
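A toy illustration of catalogue-centric recognition follows (the catalogue entries, raw Add/Remove Programs strings and SKUs here are all invented, not taken from any real product): raw install evidence is normalised to a catalogue product, which in turn links to the SKUs it can be bought under.

```python
# Invented mini-catalogue: normalised product -> purchasable SKUs.
catalogue = {
    "vendorx officesuite 2013 pro": ["VX-OS13-PRO-EA", "VX-OS13-PRO-RTL"],
    "vendory draw 11": ["VY-DRW11-SEL"],
}

# Invented recognition rules mapping raw Add/Remove Programs strings
# to normalised catalogue names.
recognition = {
    "VendorX OfficeSuite 2013 Professional (x64)": "vendorx officesuite 2013 pro",
    "VY Draw v11.0.2": "vendory draw 11",
}

raw_arp = ["VendorX OfficeSuite 2013 Professional (x64)",
           "VY Draw v11.0.2",
           "In-house Payroll Client"]

for entry in raw_arp:
    product = recognition.get(entry)
    if product is None:
        print(f"UNMATCHED: {entry!r} (customer-specific or new to catalogue)")
    else:
        print(f"{entry!r} -> {product} -> SKUs {catalogue[product]}")
```

The value of a commercial catalogue is that these recognition rules and SKU linkages are pre-built and maintained for you, as the case study below describes.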

Case Study

Software Catalogue Use Proof of Concept (POC)

By Paul Sheehan, Managing Consultant

“About a year ago we ran a 3-day, desktop-focused POC in an 80,000-user multinational organisation, including the evaluation of two of the very few catalogue-centric licence management tools in the market. Overall, we evaluated 5 licence management tools in the POC. For those without a catalogue, the work to reconcile discovery, entitlement and contract data in the POC was prohibitive and we could not fully test them in the 3-day window.

The more mature catalogue-centric tool had SKUs from 150 vendors and tens of thousands of products at its heart. This is constantly updated and links: a) discovery profiles to normalised product names; b) these product names to SKUs (where each SKU can be thought of as a specific and unique set of rights to use the software); and c) these SKUs to the contracts under which they may be purchased (Microsoft EA vs. Select vs. Retail, for example).

Note that this isn’t just software recognition: it’s a pre-built cross-reference between the software installation and usage evidence from the network and the ways a firm may (or may not!) have purchased licences to use it.

The incoming discovery data from the customer contained only ARP (Add/Remove Programs) data, but this dataset held in excess of 2 million software instances. Over 91% of the non-customer-specific software instances were auto-matched to the catalogue in this POC, more than double the rate of its nearest rival.

This not only meant we very quickly knew what each product was, and had common naming conventions for it and its publisher, but also that we understood the various ways it could be purchased and licensed. In production, that means invoices with SKUs can be auto-matched, permitting very fast reconciliation. Neither we nor the client needed to build this intelligence ourselves, nor did they have to maintain this specialised knowledge in their BAU operations team.

Very quickly it became clear that this element of the tool’s functionality was the single most important deciding factor – it directly influenced time to value and the number of operational staff required to properly run and administer the system.”

Is your data fit for consumption?

ITAMS’ Strategic Advisory Services have been designed to help you navigate the difficult challenges of controlling and managing software asset ownership and utilisation.

Amongst the variety of services we offer, the “Data Fit for Purpose Testing” service in particular is pertinent to helping you understand the quality of your data.

The service performs a high-level review of the data quality in your major data sources and highlights any data quality issues. Once these are identified, ITAMS’ experts can make informed recommendations on how they can be addressed. To find out more about the “Data Fit for Purpose Testing” service, please call 0870 4050508 or email info@itamsolutions.com.

ITAMS always delivers its services as an independent service provider. For more information or expert advice on any of the topics above please contact ITAMS.

Get Real Value from Disparate Data Sets

Are you using the right asset to manage your licensing entitlement?

A common issue ITAMS’ consultants come across when helping clients build a SAM/licensing programme is conflicting records for a given asset, normally from different data sources. These sources often include various tools such as discovery, security, patching and software distribution, the results of physical audits or barcoding solutions, and entries in a CMDB.

Each conflicting record provides limited coverage and is likely to be out of date, threatening not only its accuracy but also its integrity. As a result, many clients are left in a difficult position when deciding which of the conflicting records should be used to run compliance calculations, or against which data CMDB CIs should be validated.

ITAMS’ best practice recommendation is for clients to consolidate and normalise the disparate asset data sets into a “golden source”, and to use only this common record across the organisation as the basis for licence and configuration decisions.
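A minimal sketch of the consolidation step follows (the source names, precedence order, fields and values are all hypothetical): conflicting records for the same asset are merged field by field, preferring the most trusted source that holds a value, and recording where each value came from so the result is auditable.

```python
# Hypothetical source precedence: lower number = more trusted.
PRECEDENCE = {"physical_audit": 0, "cmdb": 1, "discovery": 2, "patching": 3}

records = [  # conflicting records for the same asset tag from different sources
    {"source": "discovery",      "tag": "A100", "model": "Lattitude E5470", "owner": None},
    {"source": "physical_audit", "tag": "A100", "model": "Latitude E5470",  "owner": "J.Smith"},
    {"source": "cmdb",           "tag": "A100", "model": "Latitude E5470",  "owner": "Unassigned"},
]

def golden_record(recs):
    """Merge per field, taking the value from the most trusted source that has one."""
    merged = {"tag": recs[0]["tag"], "provenance": {}}
    for field in ("model", "owner"):
        for rec in sorted(recs, key=lambda r: PRECEDENCE[r["source"]]):
            if rec.get(field):
                merged[field] = rec[field]
                merged["provenance"][field] = rec["source"]  # audit trail
                break
    return merged

print(golden_record(records))
```

The provenance map is what provides the “full audit trail” of how the golden record was derived, as in the case study below.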

Case Study:

Hardware data reconciliation supporting licence, configuration and outsourcer management

With help from ITAMS, a recent client in the finance sector successfully consolidated, compared, normalised, visualised and reported against 15 hardware data sources. The results were spectacular. The client had been assuming the worst case when calculating its hardware maintenance contract, including every machine identified in any of the data sources as part of its assumed estate.

In fact, less than 70% of these machines were real and still present. Analysis showed that the largest contributor was discovery tools continuing to report machines they had last scanned years ago as active. As many of these tools were outsourcer-managed, with no mandatory ‘3 month aged device’ policy applied, the reported data was evidently inaccurate and accounted for unnecessary overpayments, across various hardware- and software-related categories, running into tens of millions of pounds.
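An ‘aged device’ policy can be as simple as the following filter (the 90-day threshold, hostnames and dates are illustrative):

```python
from datetime import date, timedelta

AGE_LIMIT = timedelta(days=90)  # illustrative "3 month aged device" threshold
today = date(2016, 6, 1)        # fixed date so the example is reproducible

last_seen = {"srv01": date(2016, 5, 28), "srv02": date(2014, 1, 10)}

active = {h: d for h, d in last_seen.items() if today - d <= AGE_LIMIT}
stale  = {h: d for h, d in last_seen.items() if today - d > AGE_LIMIT}
print(f"Active: {sorted(active)}; stale (exclude from billing/compliance): {sorted(stale)}")
```

Applying such a rule to the discovery feeds would have removed the long-dead machines before they inflated the assumed estate.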

The value from the project goes beyond cost avoidance, supporting many areas of operational and cost management, and provides a full audit trail of how the “golden record” is derived, against which reports on complex data sets can be run.

Are you spending too much on software?

Organisations must focus more on managing their top vendors to reduce software spend. In a recent enterprise licence management engagement (80k seats), ITAMS was able to attribute approximately 40% of its client’s total annual software spend of £165m to their top 12 vendors. With 70% of that budget spent on support and maintenance and 30% on new licences, there is a large opportunity to make substantial savings.
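The arithmetic behind that observation is straightforward. In the sketch below, only the £165m total and the 40%/70%/30% splits come from the engagement described above; the assumption that the 70/30 split applies to the top-12 vendor spend is mine:

```python
# Figures from the engagement described above; the 70/30 split is
# assumed to apply to the top-12 vendor spend.
total_annual_spend = 165_000_000          # GBP
top12_spend = 0.40 * total_annual_spend   # ~40% attributed to top 12 vendors

support_maintenance = 0.70 * top12_spend  # 70% on support & maintenance
new_licences = 0.30 * top12_spend         # 30% on new licences

print(f"Top-12 vendor spend: £{top12_spend:,.0f} "
      f"({top12_spend / total_annual_spend:.0%} of total)")
print(f"  support & maintenance: £{support_maintenance:,.0f}")
print(f"  new licences:          £{new_licences:,.0f}")
```

Even a few percentage points saved on a concentration of spend this size translates into millions of pounds.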

Do you know what percentage of your budget is attributable to your top vendors?