Whilst on client site, ITAMS’ consultants come across a lot of software that is not officially authorised for use in a business environment. With the increasing emergence of freeware, shareware and trial software being used in businesses across the UK, I decided to ask our Pre-Sales Consultant Matt about some of the common instances he has come across, what organisations need to do in order to stay compliant and how these new instances of software should be treated.
“As a certified software compliance auditor, I have heard some strange responses when asking organisations why they use so much of this type of licensing; it appears many are oblivious to the risks.
Just because an application is freeware doesn’t mean there are no associated terms and conditions of use. A software environment can sometimes be made up of 60% freeware, but the question is: can those applications be used in a corporate environment?
This is one such ‘grey area’. Other examples include the use of iTunes and Google Earth in a corporate environment. The iTunes application, for example, can sit on your work machine and be used for synchronising a business iPhone, but when you start downloading files, where does the responsibility for those files lie? You can install the private version of Google Earth at work, but only for personal use. Once you start using the application for business, the corporate version must be purchased, and this carries a substantial per-installation cost, so beware!
Trial software is another area of contention. Some licences have so many hooks that it is sometimes difficult to know what you can and can’t do! Some vendors allow you to download trial software but attach a fair few stipulations around usage, even though they provide you with access to the full version.
The vendor may also state that you have to remove the software from machines when the trial expires. If you don’t, regardless of whether you have accessed it since expiration, then you are liable to pay for the full application license.
WinZip is one vendor that applies this policy to its trial software. I have lost count of the number of organisations I have audited where trial software is still sat on computers even after it has lain dormant for months. An audit would expose that weakness, and something as innocuous as that could mean a substantial bill!
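As a rough illustration, an inventory check for this kind of exposure might look like the following sketch (the field names and the 45-day trial period are assumptions for illustration, not any vendor’s actual terms):

```python
from datetime import date, timedelta

TRIAL_PERIOD = timedelta(days=45)   # assumed trial length; varies by vendor

def expired_trials(inventory, today):
    """Flag trial installs still present on a machine past the trial period."""
    flagged = []
    for machine, product, installed, last_used in inventory:
        # last_used is deliberately ignored: dormancy does not remove liability
        if installed + TRIAL_PERIOD < today:
            flagged.append((machine, product))
    return flagged

# Hypothetical inventory rows: (machine, product, install date, last used)
inventory = [
    ("PC-0012", "ZipTool (trial)", date(2013, 1, 5), date(2013, 1, 20)),
    ("PC-0047", "ZipTool (trial)", date(2013, 6, 1), date(2013, 6, 2)),
]
print(expired_trials(inventory, today=date(2013, 6, 20)))
# PC-0012's trial expired months ago, so it is flagged even though dormant
```

In practice the install date and product classification would come from your discovery tooling, but the principle is the same: the check is on presence, not usage.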
There is a whole raft of similar examples, and without education and guidance, misinterpretation of usage rights in a business environment could prove to be a liability.
The main lesson is to treat all software the same, whether it is officially purchased, freeware, shareware or trial software. Follow the same process and apply the same fundamental software lifecycle and recording strategy. This will stand you in good stead and help you control your ‘non-standard’ software estate. Alternatively, place stringent conditions on the use of non-authorised software in the corporate environment.”
I’m on a mission at ITAMS. I want to get under the skin of our experienced consultants who work mainly on site with some of our most important clients, so I’ve decided to spend a little time talking to them and understanding some of the daily challenges they face. We have a range of consultants working at ITAMS, from data specialists to technical gurus and project managers.
This week I decided to catch up with John who is currently working for a large insurance provider. He is in the middle of a VMWare licence entitlement assessment. Whilst his current observations are centred on VMWare products, many of these are also common for other popular software vendors where licensing support is being provided by a third party.
I asked him what observations and tips he may have that he would like to share with our community. Here’s a round-up of his thoughts:
When a VMWare licence is purchased, the vendor usually insists that a support or maintenance contract is purchased and used at the same time (these are usually valid for 12 months at a time). However, the support contract doesn’t have to be with VMWare; it can be with a third party, and subsequent contracts are agreed on the anniversary.
So John’s observations are mostly about the way in which some licence vendors treat the data that identifies each licence, when they are also the third party support contract supplier.
When a licence is issued by VMWare it is identified by both a VMWare ‘Instance’ number and a VMWare Licence Key. Each support contract is acknowledged by VMWare who provides a contract identification number. Each contract may contain a number of licences, so each licence is also identified by the support contract on which it is listed.
Because licences are purchased at various times during the year, the contracts tend to have different start and end dates. VMWare allows contracts to be Co-termed, where common start and end dates are agreed and two or more contracts are combined, usually under a different contract number or the number of the ‘dominant’ contract (which has the most licences listed against it).
Co-terming continues to be practised as more and more licences are purchased, and as the list of licences grows it is very easy to lose the audit trail of each individual licence, from purchase through to the latest support contract, if the identification numbers are not carried forward.
John has observed that the invoices and contract paperwork issued by most third party support entities fail to maintain audit trails: they do not keep each licence identified by its Licence Key, Instance number, or even the contract from which it was co-termed. The Licence Key is arguably the most important piece of information identifying each licence, yet it is invariably missing from every support contract he has ever seen. In some cases incorrect duplication has occurred where Licence Keys have been guessed at by the third party support supplier when required by the client.
So John’s top suggestion is that as licence owners, we need to insist on better, clearer audit trails for our licences and on all subsequent contracts with third party suppliers, so that each licence can be traced back to purchase records for the purpose of identifying the Proof of Entitlement. If software has to be re-installed for whatever reason, without the Licence Key it will prove to be quite difficult. Also the added advantage of keeping a firm handle on entitlement records is that it will save time hunting for the correct documentation when this is required for an audit event.
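The record-keeping John describes can be sketched as a simple per-licence record that carries the Licence Key and Instance number through every co-terming event (the field names are illustrative, not any VMWare format):

```python
from dataclasses import dataclass, field

@dataclass
class Licence:
    # The identifiers that must survive every co-terming (names illustrative)
    licence_key: str        # the key needed for any re-installation
    instance_number: str    # the vendor 'Instance' number
    contract_history: list = field(default_factory=list)  # oldest first

def co_term(licences, new_contract_id):
    """Combine licences under one co-termed contract while preserving each
    licence's audit trail back to its original contract."""
    for lic in licences:
        lic.contract_history.append(new_contract_id)

a = Licence("KEY-AAA", "INST-001", ["CTR-2011-01"])
b = Licence("KEY-BBB", "INST-002", ["CTR-2011-07"])
co_term([a, b], "CTR-2012-01")
print(a.contract_history)  # ['CTR-2011-01', 'CTR-2012-01']
```

The point of the sketch is that co-terming appends to each licence’s history rather than replacing it, so any licence can still be traced back to its Proof of Entitlement.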
Do you have any tips or advice you would like to share? If so please leave a comment.
All too often we see managed services that are just PaaS (platform-as-a-service), where the customer effectively leases a hosted tool (and, even worse, is sold on the contents of the future roadmap rather than current capabilities).
If your requirement is simply to outsource the management of a technology then this can work; however, you must be clear that you understand the deliverables, that you have seen a proof of concept demonstrating what you will actually get, and that a statement of works or contract puts this down in black and white.
If we take generating an Effective Licence Position (ELP) as a normal (and probably now expected!) output of a licence management service, one way that an MSP (Managed Service Provider) can help you is by providing an ‘on-boarding’ service.
This is a dramatically different approach to simply ‘throwing data over the wall’ at your MSP. Onboarding involves generating a list of publishers and products (prioritised by the criteria that are important to you, such as risk, spend, contract and maintenance anniversaries, vendor relationship and even ease of completing the task!) that will be properly managed, leading to an ELP that you understand and trust.
Onboarding includes the following main activities that ITAMS can deliver as part of its ELM360 (Enterprise Licence Management) service:
- Data requirements definition
Different products and licence metrics have greatly differing data requirements; there is no one-size-fits-all approach. Your existing discovery tools may not be capable of tracking SQL Server deployments correctly, mapping server farms or monitoring usage. ITAMS can identify the data requirements needed to produce a real ‘effective demand’ figure rather than a simple installation count.
- Licence Clearing
Bringing all relevant entitlement and contract data together in the licence repository and modelling the licences which have adequate proof of purchase. This includes linking entitlement to contracts, contract relationships, renewals, upgrades and downgrade paths.
- Software Recognition
Where tools are required to detect the consumption of licences, working with those tools to configure them to recognise software and count it accurately. Verifying that the recognition patterns and rules in the solution are correct. Creating custom rules and patterns for customer specific deployment scenarios.
- Additional data sources
Some products may need new discovery tools or data sources to provide accurate data such as physical/virtual relationships, editions, CPU data and usage tracking. These data sources often require combining before they are ready for use in an ELP calculation.
- Exception reporting
Highlighting areas of data weakness, insufficient entitlement, data gaps and where assumptions have been made in the final calculations.
- Reaching Effective Licence Position (ELP)
As a result of the above activities, reconciling the installation/usage and entitlement/contract data to reach a known, provable Effective Licence Position.
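The final reconciliation step can be sketched as below, under the simplifying assumption that one install consumes one licence (real metrics such as per-core or per-user are far more involved, and the product names here are invented):

```python
from collections import Counter

def effective_licence_position(installs, entitlements):
    """Reconcile effective demand against proven entitlement per product.
    Positive result = surplus licences; negative = compliance shortfall."""
    demand = Counter(installs)          # simplest metric: one install, one licence
    owned = Counter(entitlements)       # mapping: product -> licences owned
    return {p: owned[p] - demand[p] for p in set(demand) | set(owned)}

elp = effective_licence_position(
    installs=["ProductA", "ProductA", "ProductB"],   # from discovery data
    entitlements={"ProductA": 3},                    # with proof of purchase
)
print(elp)  # ProductA: +1 surplus; ProductB: -1 shortfall (no entitlement found)
```

The value of the preceding onboarding activities is precisely that both inputs to this calculation (the demand figure and the proven entitlement counts) are trustworthy before the arithmetic is done.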
According to the Gartner Group, “organisations can realise cost savings of between 5–35% by implementing focused software asset management practices”. So how can you realise cost savings, or even release value from your IT assets?
There are a number of ways this can be done. IT asset value can be ‘hard’ (monetary) and ‘soft’ (efficiency and effectiveness of operations). Hard value is realised by minimising spend and selling used assets. Soft value comes from reducing the cost of associated operations. Cost avoidance is a large component of this: avoiding the cost of buying more software than is actually needed and re-harvesting existing surplus software.
The following checklist provides a quick view on where value / savings can be derived:
- Sell unused software – selling software that is no longer required. To do this you must have a good handle on your IT assets: know exactly what licences you have, who is using them, how they are being used and on what hardware, and whether you are likely to embark upon a software migration/upgrade project or an M&A that may result in unwanted licences. An ELP (Effective Licence Position) supported by good quality inventory and usage data will also help to pinpoint unwanted licences.
- Stay compliant – avoid unbudgeted / unforeseen audit costs (including internal resources, 3rd party, tools and fees), by being audit ready. Understanding natural licence renewal points and anticipating licence demand in the business can also help to stay compliant.
- Maximise product usage rights and entitlements – ensure you are exploiting the product usage rights that come with your software (e.g. the free second install on an allocated laptop that comes with some products).
- Provide the right application for the right user – ensure that users have access to what they need to perform their roles, improving efficiency and productivity. This includes the right edition and user type for a product; often the default position is to purchase the most fully featured edition, which might not be needed. It also helps to target training requirements and improve collaboration.
- Reduce support & maintenance costs – ensure that only approved and standard applications are installed. This reduces infrastructure and support costs and the cost of related IT functions such as security. A KPMG survey in 2008 stated that a 50% reduction in support costs can be achieved by managing software effectively.
Understanding your IT assets allows you to:
- budget and forecast effectively
- support and facilitate internal re-charging and cost allocation for services
- prepare and implement more effectively for upgrades/migration projects
- identify security vulnerabilities and apply fixes and patches
- use your existing assets more efficiently
Some of the services that ITAMS can help with include: building a business case for SAM, gap & risk analysis, audit defence, ELP generation, etc.
Confronting the true accuracy of your data can initially be unnerving. Not knowing it can hide a plethora of issues that do not come to light until a vendor, such as IBM, demands an audit. When this happens there is no window for remediation work, nor for investigating how best to proceed.
On the other hand having good quality data can be a ‘win’ situation and can help with:
- planning for warranty – reducing maintenance budgets
- managing leases – reducing cost at end of life
- software optimisation – reducing licensing costs
- asset reuse and increased reliability – maximising useful life
If IT Asset information is not accurate it can have a dramatic effect on the bottom line. In the two cases which come to mind, both relied heavily on accurate information. One client had an exposure of approximately £3 million whilst another made a saving of about £20 million.
On another occasion, ITAMS was asked to conduct a due diligence audit for an outsourcing organisation. This involved an audit on all the hardware which was going to be managed under the contract. Initially we were told there were 10,000 desktops to audit. Within the first two weeks of the project, it soon became apparent that this number would be much higher. On completion, we audited approximately 17,500 systems. The implications were large and the atmosphere difficult, as many decisions, such as licensing and hardware maintenance budgets, had been made based on the assumed figure.
With all such undertakings there are generally more assets than anticipated; the norm is 10% to 15% extra. However, this can swing the other way when an outsourcer manages an audit, and reporting can often be overstated by 30%. A classic finding is maintenance being paid on 10–15% of hardware assets which no longer exist, and on approximately 20% of software assets that are no longer in use.
In summary, inaccurate data will affect the bottom line, both negatively and positively. The effect may not be immediate, but the longer it is left, the greater the risk that an event will expose problems which could prove both embarrassing and expensive.
Traditionally, walk-around or physical audits have been the mainstay for IT Asset Managers, but in today’s technology-rich climate there is a plethora of electronic network audit tools offering faster data collection.
Electronic audit tools are, however, complementary to walk-around audits, not a replacement. Physical audits are expensive and should generally only be used to kick-start a project. Drivers for physical audits might be the level of accuracy required, coverage of secure or non-LAN-connected assets (e.g. stores), and the additional information they offer (e.g. location, assignment, asset tag, nearest phone, etc.). Cross-correlation of physical and discovery data is now highly useful, especially when integrated with an asset lifecycle tracking system.
The table below sets out a brief overview of the two common data collection methods:
| Data Collection Method | Description |
| Electronic network audit tools | Discovers and scans all hardware devices on the IT network for detailed hardware attributes, plus all installed software and operating systems |
| Walk-around / physical audit | Physically/visually identifies the hardware device |
Both data collection methods provide practical options for managing your hardware estate; however, neither method on its own can provide 98+% data accuracy.
So what is the best method?
From experience, several obstacles can get in the way of correctly managing your hardware estate. Firstly, at management level, key decision makers tend to argue, “We know what we have, I don’t see any problem – why fix it when it is working?”. Lower in the organisation the story is different: “We have always done it this way, but we know the data is less than 70% accurate”. Eventually, as installs, moves, adds and changes occur, the inventory repository (if there is one) is not kept current (old assets are not removed) and the accuracy level falls even further.
Therefore there is often a need for a one-off physical audit to re-initiate/validate a lifecycle system’s core data, but no one wants to keep doing that. Discovery, plus a slick barcoding add-on to the lifecycle system, acts as a way of constantly refreshing and validating the data and avoiding a costly re-audit, as well as giving the rich benefits of cross-correlated physical and discovery information (great for maintaining the constantly changing hardware CI layer of a CMDB, for example).
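At its core, cross-correlating physical and discovery data boils down to simple set comparisons. A minimal sketch, assuming both sources share a common identifier such as a serial number:

```python
def cross_correlate(repository, discovered):
    """Compare the lifecycle repository (e.g. barcode scans) with network
    discovery output. 'Ghosts' exist only on paper; 'rogues' are live on the
    network but missing from the records."""
    ghosts = repository - discovered     # recorded but not seen on the network
    rogues = discovered - repository     # seen on the network but not recorded
    confirmed = repository & discovered  # validated by both sources
    return ghosts, rogues, confirmed

# Hypothetical serial numbers from each source
repository = {"SN-100", "SN-101", "SN-102"}
discovered = {"SN-101", "SN-102", "SN-103"}
ghosts, rogues, confirmed = cross_correlate(repository, discovered)
print(sorted(ghosts), sorted(rogues))  # ['SN-100'] ['SN-103']
```

Ghosts are candidates for removal (and for cancelling maintenance payments), while rogues need investigating and adding to the lifecycle system; the confirmed set is what you can actually trust.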
So my message to Asset Managers is: are you sure you want to keep managing data you know is bad, or would you prefer to kick-start a new and more efficient mechanism which generates real value based on trustworthy records?
I recently visited an organisation which, due to its business sector, had a “laissez-faire” attitude to tracking its assets. However, recent changes from the Financial Reporting Council (FRC) mean that this has to change. Below are the key areas which have to be considered before committing to change, as it will affect the whole organisation and may conflict with existing cultures.
- Understand why IT Asset Management is important to the business.
- Demonstrate the ways in which this information is or could be used.
- Get Board level ‘buy-in’ to the above – crucial!
- Produce a business plan which clearly shows why this is needed, what the cost would be and how the returns will be realised.
- Have a clear roadmap / strategy over time, not just a single project.
- Have an IT Asset Policy which is clearly communicated and understood.
- Have solid lifecycle touch / capture points.
- Produce sound and tested procedures which are simple to follow, and automate using barcoding.
- Know what accuracy is required and why, and who your data customers are.
- Decide where the data is going to be held, who is responsible for it and who should have access to it.
- Know what sources of data are currently available – and whether they are trustworthy.
- Identify what other sources are required for your business purposes.
- Have a feedback system to test the accuracy, including a service compliance function in larger organisations.
Key: Clarify and align responsibility across the whole organisation.
Tip: Start to plan for the future.
An accurate Effective Licence Position (ELP) relies on the accuracy of the IT inventory.
Many organisations cannot produce an asset inventory report with 98% accuracy. Ensuring you have a 360° view of your hardware estate is an essential piece of the jigsaw: knowing what hardware you have, where it is located and the status of each item.
Technology audits can only provide between 85% and 95% accuracy. All too often there is a reliance on a variety of technologies, spreadsheets and Active Directory, with no coherent strategy to reach the level of accuracy necessary in today’s world of complex licensing models.
The 360° view of your hardware estate must also take into account licensing implications and the location and hardware profile of devices which are stored, retired, passive, in production or in disaster recovery systems. Once this information has been carefully collated, it can be reflected correctly in SAM data, so that the correct effective demand is used in ELP calculations.
Many large organisations struggle to maintain licence management accuracy within their SAM processes. ITAMS recommends that clients use a catalogue-centric licence management tool which helps to marry installed software with SKUs (Stock Keeping Units) and to procurement contracts.
The benefits of using software catalogues are considerable: less dependency on discovery tools (data from other inventory tools can be used to perform software recognition), time and money saved on manual effort, and a higher level of data accuracy.
Software Catalogue Use Proof of Concept (POC)
By Paul Sheehan, Managing Consultant
“About a year ago we ran a 3-day, desktop-focused POC in an 80,000-user multinational organisation, including the evaluation of two of the very few catalogue-centric licence management tools on the market. Overall, we evaluated five licence management tools in the POC. For those without a catalogue, the work to reconcile discovery, entitlement and contracts was prohibitive and we could not fully test them in the 3-day window.
The more mature catalogue centric tool had 150 vendor SKUs and tens of thousands of products at its heart. This is constantly updated and links: a) discovery profiles to normalised product names; b) these product names to SKUs (where each SKU can be thought of as a specific and unique set of rights to use the software) and c) these SKUs to the contracts under which they may be purchased (Microsoft EA vs. Select vs. Retail for example).
Note that this isn’t just software recognition… it’s a pre-built cross reference between the software installation and usage evidence from the network and the ways a firm may (or may not!) have purchased licences to use it.
The incoming discovery data from the customer contained only ARP (Add/Remove Programs) data, but this dataset held in excess of 2 million software instances. Over 91% of the non-customer-specific software instances were auto-matched to the catalogue in this POC, more than double the rate of its nearest rival.
This not only meant we very quickly knew what each product was, and had common naming conventions for it and its publisher, but also that we understood the various ways it could be purchased and licenced. In production, that means invoices with SKUs could be auto-matched, permitting very fast reconciliation. Neither we nor the client needed to build this intelligence ourselves, nor did they have to maintain this specialised knowledge in their BAU operations team.
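The a) to c) chain described above can be sketched as a pair of lookups. This is a heavily simplified, hypothetical illustration: all product names, SKU codes and agreement types here are invented, not taken from any real catalogue.

```python
RECOGNITION = {   # a) discovery/ARP strings -> normalised product names
    "MS Office Pro 2010 (ARP)": "Microsoft Office Professional 2010",
}
SKUS = {          # b) product names -> SKUs (each SKU = a unique set of use rights)
    "Microsoft Office Professional 2010": ["SKU-0001", "SKU-0002"],
}
CONTRACTS = {     # c) SKUs -> the agreement under which each may be purchased
    "SKU-0001": "Select",
    "SKU-0002": "Retail",
}

def resolve(arp_name):
    """Walk an Add/Remove Programs string through the catalogue chain."""
    product = RECOGNITION.get(arp_name)
    if product is None:
        return None                          # unmatched: needs a custom rule
    return product, [(s, CONTRACTS[s]) for s in SKUS.get(product, [])]

print(resolve("MS Office Pro 2010 (ARP)"))
```

The commercial value of a real catalogue tool is that these mappings arrive pre-built and constantly maintained; the sketch just shows why invoices carrying SKUs can then be auto-matched to discovered installs.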
Very quickly it became clear that this element of the tools’ functionality was the single most important deciding factor – it directly influenced time to value and the number of operational staff required to properly run and administer the system.”
ITAMS’ Strategic Advisory Services have been designed to help you navigate the difficult challenges of controlling and managing software asset ownership and utilisation.
Amongst the variety of services we offer, the “Data Fit for Purpose Testing” service in particular helps you understand the quality of your data.
The service performs a high-level review of the data quality in your major data sources and highlights any data quality issues. Once these are identified, ITAMS’ experts can make informed recommendations on how they can be addressed. To find out more about the “Data Fit for Purpose Testing” service, please call 0870 4050508 or email: email@example.com
ITAMS always delivers its services as an independent service provider. For more information or expert advice on any of the topics above please contact ITAMS.
Are you using the right asset to manage your licensing entitlement?
A common issue ITAMS’ consultants come across when helping clients build a SAM/licensing programme is conflicting records for a given asset, normally from different data sources. These sources often include various tools such as discovery, security, patching and software distribution, results from physical audits or barcoding solutions, as well as entries in the CMDB.
The conflicting records provide limited coverage and are often outdated, posing a threat not only to data accuracy but also to its integrity. As a result, many clients are left in a difficult position when deciding which set of conflicting records should be used to run compliance calculations, or against which data CMDB CIs should be validated.
ITAMS’ best practice recommendation is for clients to consolidate and normalise the disparate asset data sets into a “golden source” and to use only this common record across the organisation and on which to base licence and configuration decisions.
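A minimal sketch of such a consolidation follows. The field names and the 90-day ageing window are assumptions for illustration; the idea is to keep one record per asset (the most recently sighted) and drop devices no source has seen recently.

```python
from datetime import date, timedelta

MAX_AGE = timedelta(days=90)   # assumed ageing window for 'still present'

def golden_source(records, today):
    """Merge per-asset records from many tools into one record per asset,
    keeping the most recent sighting and ageing out stale devices."""
    best = {}
    for rec in records:   # rec: {"serial": ..., "source": ..., "last_seen": ...}
        cur = best.get(rec["serial"])
        if cur is None or rec["last_seen"] > cur["last_seen"]:
            best[rec["serial"]] = rec
    return {s: r for s, r in best.items()
            if today - r["last_seen"] <= MAX_AGE}

records = [
    {"serial": "SN-1", "source": "discovery", "last_seen": date(2013, 5, 1)},
    {"serial": "SN-1", "source": "patching",  "last_seen": date(2013, 6, 1)},
    {"serial": "SN-2", "source": "discovery", "last_seen": date(2011, 2, 1)},
]
golden = golden_source(records, today=date(2013, 6, 15))
print(sorted(golden))  # SN-2 was last seen years ago and is aged out
```

A production golden source would also normalise identifiers and merge attributes across sources, but even this simple last-seen rule exposes machines that only exist in stale tool output.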
Hardware data reconciliation supporting licence, configuration and outsourcer management
With help from ITAMS, a recent client in the finance sector successfully consolidated, compared, normalised, visualised and reported against 15 hardware data sources. The results were spectacular. The client had assumed the worst case when calculating its hardware maintenance contracts, including every machine identified across a number of data sources as part of its assumed estate.
In fact, less than 70% of these machines were real and still present. Analysis showed that the largest contributor was discovery tools continuing to routinely report machines they had last scanned years ago as active. As many of these tools were outsourcer-managed, with no mandatory ageing or ‘3 month aged device’ policy applied, the reported data was evidently inaccurate and accounted for unnecessary overpayments in various categories (hardware and software related) running into tens of millions of pounds.
The value from the project goes beyond cost avoidance and supports many areas of operational and cost management. It provides a full audit trail of how the “golden record” is derived, against which reports on complex data sets can be run.