ITAMS Blog

Case Study: The Golden Record

In our three-part whitepaper series on ‘IT Asset Data Quality’, we highlighted the issues that inaccurate IT asset data can cause across multiple areas of the organisation, such as outsourcer billing optimisation, CMDB and ITSM controls, licence management optimisation and security.

For organisations to have complete confidence in their IT asset data, we recommended creating a ‘Golden Record’.

A customer of ours recently asked for a brief explanation of what a Golden Record is and how it is created. This is how our consultant explained it:

A Golden Record is a single, well-defined version of specific data, held within a company, about a device or asset. It is also known as the ‘single version of the truth’: the place to go to be sure you have the correct version of a piece of information.

A Golden Record can be created by reconciling several sources of data that may have a different ‘view’ of the asset. The aim of creating these is to have a set of records that the company knows are accurate and can rely on. The Golden Record can be held in a separate repository but it will not be the master. The master data will always sit within the source it was created and the Golden Record will be updated each time a reconciliation is carried out.

Data taken from the sources will first need to be optimised, to give the best possible chance of matching the records, and a repository will need to be created to accept all the prepared data. Data must then be extracted from the sources within the same time period, to reduce the risk of time affecting the field values.

The analysis required to identify which records describe the same asset, and to create the Golden Record from them, is very time consuming. The best matches come from the key fields. Even once a match has been made, the remaining fields must be analysed to decide which values should flow through to the Golden Record and which are incorrect.
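
To make the matching step concrete, here is a minimal Python sketch of one reconciliation pass. The source names, the use of serial number as the key field, and the source-precedence rule for non-key fields are all assumptions for illustration, not a description of any particular tool:

```python
from collections import defaultdict

# Example extracts from three sources, taken within the same time window.
# Serial number is assumed to be the key field used for matching.
sources = {
    "cmdb":      [{"serial": "ABC123", "hostname": "LDN-WS-01", "status": "In Use"}],
    "discovery": [{"serial": "ABC123", "hostname": "ldn-ws-01", "status": "Active"}],
    "antivirus": [{"serial": "ABC123", "hostname": "LDN-WS-01", "status": ""}],
}

# Assumed precedence for non-key fields; in practice each field may have
# its own most-trusted source, decided during the analysis phase.
PRECEDENCE = ["cmdb", "discovery", "antivirus"]

def build_golden_records(sources, precedence):
    """Match records on the key field, then take each remaining field's
    value from the highest-precedence source that populated it."""
    by_key = defaultdict(dict)                    # serial -> {source: record}
    for name, records in sources.items():
        for rec in records:
            by_key[rec["serial"]][name] = rec

    golden = []
    for serial, recs in by_key.items():
        merged = {"serial": serial}
        fields = {f for rec in recs.values() for f in rec} - {"serial"}
        for f in sorted(fields):
            for source in precedence:
                value = recs.get(source, {}).get(f)
                if value:                         # first non-empty value wins
                    merged[f] = value
                    break
        golden.append(merged)
    return golden

print(build_golden_records(sources, PRECEDENCE))
```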

Once the analysis is complete, you can investigate where the inaccuracies lie and what needs to be done to correct them. The processes that populate each source will need to be reviewed and adjusted where necessary. Historically incorrect data may require manual correction, which may in turn mean changing roles and responsibilities in some teams to ensure the work gets done.

Improving data quality should be an ongoing process. The initial reconciliation will generate a great deal of work and raise many questions, but it will also deliver the largest immediate benefits. Repeating these activities will reveal trends over time and demonstrate that the work being done is producing improvements. New or enhanced processes will ensure the data does not deteriorate.


If you have a question regarding any of the points raised, would like more information on ‘Data Hub’, our IT Asset Data Reconciliation Service, a demonstration of the service’s capabilities, or to investigate a no-risk, no-fee engagement, please call ITAMS on +44 (0)1582 464740 or email your enquiry to info@itamsolutions.com.

For further information on managing IT Asset Data Quality, please visit: Data Hub

Case Study: Outsourcer Billing Optimisation

Where an estate is outsourced, the service provider creates bills based on their view of the client’s assets. Either they own and manage their own tools and databases for this, or they use those belonging to the client. Either way, the client must be able to validate the information. If an asset is not correctly billed for, it may not be registered and therefore not supported; or it may be marked as decommissioned yet still be billed for. Both scenarios are equally bad.
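
As an illustration, both failure modes can be surfaced with a simple cross-check of the billing file against a reconciled view of the estate. The field names and status values here are assumptions for the sketch:

```python
# Serial numbers the outsourcer has billed for (assumed input).
billed = {"ABC123", "DEF456", "GHI789"}

# The client's reconciled view of the estate: serial -> status (assumed).
estate = {
    "ABC123": "In Use",
    "DEF456": "Decommissioned",
    "JKL012": "In Use",
}

# Marked as decommissioned but still being billed for.
billed_but_retired = {s for s in billed if estate.get(s) == "Decommissioned"}

# In use but not billed: likely not registered, therefore not supported.
unbilled_in_use = {s for s, status in estate.items()
                   if status == "In Use" and s not in billed}

# Billed but unknown to the client altogether.
unknown = billed - estate.keys()

print("Billed but decommissioned:", billed_but_retired)
print("In use but not billed:", unbilled_in_use)
print("Billed but not in the estate:", unknown)
```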

The IT asset data used for billing will also support a multitude of other functions within the business, such as refresh, service desk, procurement and software asset management/audits. It is therefore vital that there is complete confidence that the data upon which the billing is based is true and accurate. Using data reconciliation techniques, across multiple sources, will ultimately lead to improved data integrity, processes and confidence – further supporting the business and saving time and money.

Case Study

A London-based company was taking back ownership of its server and end-user computing assets from its existing IT outsourcer. To date, there had been little ongoing validation of the service billing figures, or of the physical estate numbers in general. The discovery and operational tools already deployed (e.g. AD, AV, CMDB) had also been under the outsourcer’s management, so there had been limited day-to-day involvement with the asset figures.

An initial consultative assessment was proposed to determine the scope of the transfer and to discuss the impact on the company in terms of the required refresh, software licensing, service design, roles and responsibilities, and the future mode of operation. In this case, the billing figures would have been used to understand the estate and plan these activities; however, a lack of confidence in those figures, and in the data held within the sources, meant further action was required. This work would have been equally important had the outsourcer continued billing with no transfer under way. It is also worth noting that a change in outsourcer agreements will often attract greater software publisher attention, compounding the issues already being faced.

The sources needed to validate billing were identified and used to populate a “Data Hub”, and a targeted reconciliation was carried out. Further analysis identified where data was incorrect and in which sources processes needed to change, both to remediate and update existing information and to put actions in place so the data did not deteriorate; support was provided in this area. Benefits included correction of asset status, removal of duplicate and invalid assets, and population of missing fields. This supported more accurate billing and the transfer of asset ownership. The client had more confidence in the figures and could more accurately predict and manage the refresh. The transfer of ownership is now in the final stage of completion, and the client is actively considering an ongoing data quality process to perform regular checkpoints and maintain a high degree of data accuracy.



Case Study: Configuration Management Optimisation

IT Configuration Management is a process for establishing and maintaining information about the performance, functional and physical attributes of hardware assets, e.g. servers and desktops.

This information is commonly stored in a central database tool, supporting other processes such as the service desk, refresh, transformation, break-fix, billing and patching. The data is updated and maintained in a number of ways, and if it becomes inaccurate, the validity of the processes that use it is reduced.

Understanding whether your data is complete and accurate can be very time consuming, as the most effective way is to compare several data sources and establish the ‘true’ picture of any particular asset. Using this information correctly will lead to improvements in the processes that feed and maintain the data, which in turn has a positive effect on the other processes that depend on it.

A well proven method of optimising Configuration Management is to engage with a supplier who can provide a data reconciliation service coupled with some advisory support. The case study below shows an example of how this can progress.

Case Study

This customer engagement started with an initial advisory project to evaluate the existing ITAM maturity. The customer is a multi-national banking sector corporation with approximately 10,000 seats. During the engagement, it was found that a large transformation project would cover the majority of the company’s IT hardware.

Phase 1 included delivery of a workshop detailing a Service Design Overview (with descriptions and a RACI), identification of the data sources supporting the operational configuration management function, and in-depth analysis of how the transformation would be run and its effect on the current mode of operation. Data quality was clearly an area for improvement, to ensure the assets to be supported were known and that the new configuration database could be populated with accurate, fit-for-purpose information.

In Phase 2, the data sources identified in Phase 1 (e.g. AD, CMDB, AV and discovery) were fed into a ‘Data Hub’, where a reconciliation exercise was carried out. The analysis identified where data was incorrect and in which sources processes needed to change, both to update existing information and to put actions in place so the data did not deteriorate; support was provided in this area. An advisory group was set up with all the stakeholders involved, so that the optimisation would be felt across all operational areas and their buy-in would support the actions. Benefits included correction of asset status, removal of duplicates and population of missing fields, increasing the overall accuracy of the data. Ongoing full technical and data services were provided to the customer in support of this work. Tangible savings will be realised when maintenance and support contracts are based on more accurate data.



What is an ELP?

ELP stands for Effective Licence Position and relates to a specific software vendor or product.

Being able to create an ELP is fundamental to keeping on top of your product compliance for audit defence and to support ongoing SAM activities such as optimisation and cost control.


The Importance of Data to your SAM Service Line


Is your software asset data providing you with the right information to help you make informed decisions for your business?

In this post, ITAMS’ consultants define ‘data’, the data elements that support IT asset management (ITAM), and what the data model comprises for effective licence and compliance management.

Accurate, complete, and timely data is essential for reliable licence and compliance management. Regardless of how many sources you have, data coming into your asset repository must be of sufficient quality, quantity and coverage.

Data quality is the measurement of the correctness and completeness of data elements across all entities, and of the application of standards to the attributes within them. When data is generated across multiple systems and then brought into the repository, and multiple data elements map to a single attribute, best practice is for the data to share the same format, e.g. Date = DD/MM/YYYY.

Data quality also refers to the age of the data: it must be generated and transported in a timely fashion that supports the asset’s needs. The required timeliness will depend on the asset (licence) and business requirements.

Data quantity, then, is the measurement of the number of relevant data elements available to map against each entity. For example, if you are trying to create an asset profile but all you have available is the product name and the date acquired, you have insufficient data quantity; you would also need the PO number, SKU, cost, etc. Without the correct data quantity, your asset records will not be complete.

Data coverage is the measurement of the availability of data elements and entities across the entire organisation, in all environments and platforms. To generate a correct view of your assets, you must have data on Wintel, UNIX, Development and Test, Production, Disaster Recovery, Infrastructure, Regional Data, etc. Unless you limit the scope of your SAM service, you must have sufficient coverage of asset data.
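
A minimal sketch of how the three measures might be tested against a repository extract; the required profile fields, the DD/MM/YYYY standard and the platform scope are assumptions taken from the examples above:

```python
from datetime import datetime

# Assumed fields needed for a complete asset profile (data quantity).
REQUIRED = ["product_name", "date_acquired", "po_number", "sku", "cost"]

# Assumed in-scope platforms (data coverage); extend to Dev/Test,
# Production, DR, Infrastructure, etc. as the SAM scope dictates.
PLATFORMS = {"Wintel", "UNIX"}

records = [
    {"product_name": "DB Server", "date_acquired": "03/05/2021",
     "po_number": "PO-991", "sku": "SKU-17", "cost": 1200, "platform": "Wintel"},
    {"product_name": "App Server", "date_acquired": "2021-05-03",
     "platform": "UNIX"},
]

def issues(rec):
    found = [f"missing {f}" for f in REQUIRED if not rec.get(f)]
    try:  # data quality: dates must follow the DD/MM/YYYY standard
        datetime.strptime(rec.get("date_acquired", ""), "%d/%m/%Y")
    except ValueError:
        found.append("date_acquired not DD/MM/YYYY")
    return found

for rec in records:
    print(rec["product_name"], "->", issues(rec) or "OK")

# Data coverage: is every in-scope platform represented at all?
print("Platforms with no data:", PLATFORMS - {r["platform"] for r in records})
```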

Data elements in support of ITAM (of which SAM is a subset) must:

  • Include all relevant data, not just software asset and licence data.
  • Support all IT asset views resident in different technologies:
    – Acquired (IT asset & licence repository).
    – Installed (inventory discovery).
    – Used (usage metering).
    – Entitled (access authorisations).
  • Support specific requirements for analysis and reporting.

The data model comprises:

  • Entities – e.g. asset, contract.
  • Elements (fields) – e.g., asset name, asset type.
  • Definitions (purpose, intended content).
  • Standards for key data elements – e.g., asset status.
  • Attributes – e.g., alphanumeric, character.
  • Links between entities – e.g., assets to licences to contracts.
  • Relationships among entities – e.g., ‘installed on’, ‘assigned to’.
  • Sources (systems-of-record) for each entity, possibly element.
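
As an illustration of this model, here is a minimal Python sketch using dataclasses. The entities, elements and relationships follow the examples in the list above; the specific values are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:                      # entity
    asset_name: str               # element (field), attribute: alphanumeric
    asset_type: str               # element, e.g. "server", "desktop"
    asset_status: str             # key element with a defined standard

@dataclass
class Licence:                    # entity
    product: str
    installed_on: list = field(default_factory=list)  # relationship: 'installed on'

@dataclass
class Contract:                   # entity
    contract_id: str
    licences: list = field(default_factory=list)      # link: licences to contracts

# Linking entities together: assets to licences to contracts.
server = Asset("LDN-DB-01", "server", "In Use")
licence = Licence("Database Enterprise", installed_on=[server])
contract = Contract("CTR-0042", licences=[licence])
print(contract)
```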

Without this data, your technology will not support your business requirements and will be unable to provide the reporting and analysis information needed to make qualified decisions regarding your assets.

For more information about SAM data analysis services, please contact ITAMS.

Oracle Licence Inventory Uncovered


ITAMS’ Licensing Analyst Sergiu provides an insight into the Oracle Licence Inventory.

The Oracle audit process is a series of steps. Its basis is Oracle’s so-called “Licence Inventory”, although other software vendors may refer to it as a ‘licence repository’, ‘entitlement’, etc.

The Oracle licence inventory report is an Excel spreadsheet summarising the licence products purchased by a particular customer, and it provides a comprehensive overall picture of the historical support and licences.

How is this report used?

The report is used by Oracle Licence Management Services (LMS) consultants as a primary verification tool against which the original contracts are checked, in order for them to build a clear, simple and complete software repository. Later on, this will act as a guide for Oracle field consultants when managing their clients’ software accounts.

Grouped in several tabs, the information is based on the contract migrations report data and product migration rules. As a customer, you should pay special attention to your reference information, such as the correct spelling of names, and ensure that the address you have registered is correct. Also pay particular attention to the status of your licences (active or inactive) and the licence metric name of each licensed product.

Can the customer obtain this information from Oracle?

In 90% of cases, the customer will receive a “customer facing document” containing details of Oracle’s licensing audit report. It includes a licence table (a simplified format of the licence inventory), non-standard clauses (found in the original contracts), definitions of the licence metrics the customer is licensed on (taken either from the original contract or from Oracle’s price lists) and, finally, the minimum number of users/devices required for each software product (so these can be correctly licensed).

Gotchas

As Oracle consultants build the customer repository using different systems, the information sometimes lacks accuracy, so the customer should pay extra attention to ensure that the details contained in the inventory are correct.
For the customer, the most sensitive information to check relates to:

  • Product name
  • Licence Level
  • Licence Term
  • Metrics
  • Quantity
  • Support status
  • Licence ownership
  • Duplicate products

Recommendations

For a customer to have a clear licensing position, they should ensure they have:

  • A tool that is capable of tracking all software licences currently in use.
  • Identified all the deployed licences across the organisation’s network. (Oracle may provide measurement scripts, though not in all cases; customers must ensure they know what the outputs are and that an expert is working with them.)
  • Built and maintained a report with detailed information on licence use. In other words, keep track of licence use internally.
  • Started comparing entitlement and deployment on a regular basis in order to have a strong compliance position.

Once all of the above have been taken into consideration, the customer should have a clear view of whether they are under-licensed and need to acquire extra licences, or over-licensed and able to uninstall surplus installations.
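
The entitlement-versus-deployment comparison behind this view can be sketched in a few lines of Python; the product names and counts are invented for the example:

```python
# Entitlement per product, from the licence inventory (assumed figures).
entitled = {"Database Enterprise": 10, "WebLogic": 4}

# Deployments per product, from measurement scripts or discovery (assumed).
deployed = {"Database Enterprise": 12, "WebLogic": 3, "Middleware X": 1}

# Positive position = spare entitlement; negative = compliance shortfall.
for product in sorted(entitled.keys() | deployed.keys()):
    position = entitled.get(product, 0) - deployed.get(product, 0)
    state = ("over-licensed" if position > 0
             else "under-licensed" if position < 0 else "compliant")
    print(f"{product}: {position:+d} ({state})")
```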

Managing software assets can feel like an endless maze if you don’t know how to approach licensing information. Remember that creating an efficient SAM process takes a lot of time and needs dedicated resources.

If you would like to listen to the pre-recorded webinar on Oracle Licence Inventory Uncovered, please click here. For further information, please contact us.

Is your data fit for consumption?

ITAMS’ Strategic Advisory Services have been designed to help you navigate the difficult challenges of controlling and managing software asset ownership and utilisation.

Amongst the variety of services we offer, the “Data Fit for Purpose Testing” service in particular is pertinent to helping you understand the quality of your data.

The service performs a high-level review of the data quality in your major data sources and highlights any data quality issues. Once these are identified, ITAMS’ experts can make informed recommendations on how they can be addressed. To find out more about the “Data Fit for Purpose Testing” service, please call 0870 4050508 or email info@itamsolutions.com.

ITAMS always delivers its services as an independent service provider. For more information or expert advice on any of the topics above please contact ITAMS.

Get Real Value from Disparate Data Sets

Are you using the right asset to manage your licensing entitlement?

A common issue ITAMS’ consultants come across when helping clients build a SAM/licensing programme is conflicting records for a given asset, normally from different data sources. These sources often include tools for discovery, security, patching and software distribution, results from physical audits or barcoding solutions, and entries in the CMDB.

The conflicting records provide limited coverage and are likely to be out of date, threatening not only their accuracy but also their integrity. As a result, many clients are left in a difficult position when deciding which of the conflicting records should be used to run compliance calculations, or against which data CMDB CIs should be validated.

ITAMS’ best practice recommendation is for clients to consolidate and normalise the disparate asset data sets into a “golden source”, and to use only this common record across the organisation as the basis for licence and configuration decisions.

Case Study:

Hardware data reconciliation supporting licence, configuration and outsourcer management

With help from ITAMS, a recent client in the finance sector successfully consolidated, compared, normalised, visualised and reported against 15 hardware data sources. The results were spectacular. The client had assumed the worst case when calculating hardware maintenance contracts, including every machine identified across a number of data sources as input to their assumed estate.

In fact, less than 70% of these machines were real and still present. Analysis showed that the largest contributor was discovery tools routinely continuing to report, as active, machines they had last scanned years ago. As many of these tools were outsourcer-managed, with neither a mandatory nor a ‘3 month aged device’ policy applied, the reported data was evidently inaccurate and accounted for unnecessary overpayments in various categories (hardware and software related) to the tune of tens of millions of pounds.
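
As a sketch of the kind of check that exposes this, each source’s device list can be filtered by last-seen date before it is counted toward the estate. The 90-day threshold mirrors the ‘3 month aged device’ policy mentioned above; the dates are invented:

```python
from datetime import date, timedelta

AGE_LIMIT = timedelta(days=90)     # the '3 month aged device' rule
as_of = date(2017, 6, 1)           # assumed reference date for the extract

# Devices as reported by one discovery tool, with their last scan date.
reported = [
    {"serial": "ABC123", "last_seen": date(2017, 5, 20)},
    {"serial": "DEF456", "last_seen": date(2014, 2, 11)},  # scanned years ago
]

active = [d for d in reported if as_of - d["last_seen"] <= AGE_LIMIT]
stale  = [d for d in reported if as_of - d["last_seen"] > AGE_LIMIT]

print(f"{len(active)} active, {len(stale)} stale of {len(reported)} reported")
```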

The value of the project goes beyond cost avoidance and supports many areas of operational and cost management. It provides a full audit trail of how the “golden record” is derived, against which reports on complex data sets can be run.