The SAM Global Process Maturity Report has been compiled by SAM Charter from anonymous responses gathered from clients across its partner base in 2016. The report covers 16 countries and nearly 1.2 million devices, and offers a summary insight into the state of SAM Process Maturity around the globe.

SAM Global Process Maturity Report

SAM Charter has so far logged 256 participants taking the light assessment; 71 assessments were completed, of which 13 were removed as dummy or exercise-only reports, leaving 58 valid responses. While we appreciate that English is the language of international commerce, we would not wish to exclude any company or person who would prefer to take the assessment in a different language. Over the year, French, German, Spanish and Brazilian Portuguese have been added to the platform so that we can offer greater global participation and a richer dataset.

The report explains why you need it, describes its data set and findings, and offers top tips in terms of governance, scope, data, systems, policies and procedures, implementation, control, reporting, operational processes and best-practice processes.

To get your full report, please contact us here.

Here is a summary of the findings of the report.

Governance

The report reveals either:
– SAM programmes are being driven from IT operations up (through a company’s command structure); or
– Senior Management still don’t ‘get’ SAM.

Scope

SAM processes around Scope fared slightly better than Governance, averaging 1.26, but that still puts this topic in the category of ‘practised but not documented, OR documented but not practised’. Several reasons might exist for this low score:

– SAM was adopted as a discipline to tackle a specific software vendor (thereby ignoring certain divisions of an IT estate);
– Automated inventory-gathering techniques only cover certain elements of an IT estate (thereby ignoring certain divisions of an IT estate);
– A write-off policy for missing devices and a lax attitude towards information security mean that inventory data from a single source is taken at face value.

Data

We see a steady climb in the likelihood of SAM processes existing around data management, with the two questions asked eliciting a score of 1.43, but that still firmly roots data management in the ‘practised but not documented, OR documented but not practised’ category. Rather like the validation of Scope in the previous section, we suspect the reasons Data still scored relatively low are:
– Consumption of data in the production of SAM reports is deemed a technical discipline, and so its gathering and importation are not considered worthy of documentation;
– Requests for contract data might typically be handled by emailing someone in Procurement (a focus of one of the questions in the light assessment).

Systems

Regrettably, processes around Systems and Systems Management fall back as low as the Governance score, reaching just 1.01. The reasons such processes might be missing are:
– Too much faith being placed in the ability of the SAM suite to recognise all inventory data;
– Not enough thought being given to quality assurance of the data that comprises ELP reports.

Policies and Procedures

The Policies and Procedures score (perhaps understandably) didn’t fare too well, reaching a meagre 0.82 across the 58 participants. For the first time we see a section fall into the ‘not practised or documented’ portion of the SAM Charter scale. Possible reasons for this low score include:
– An absence of a QA/PDCA approach in benchmarking a SAM programme;
– An absence of a QA/PDCA approach in creating a SAM programme.

Implementation

The score for Implementation and Processes is 0.92. As before, these processes could be deemed moot, as they will have been disregarded once any SAM implementation has been executed. Possible reasons for the low score include:
– Formal processes to make such assessments are deemed project-only, and therefore not carried forward into Business As Usual (BAU);
– QA/PDCA does not form part of the SAM ethos.

Control

This element of the assessment seeks to measure whether the SAM programme is under control, rather than being led astray by random elements of data. This section saw yet another poor score: a 0.88 average across the participants, achieving ‘not practised or documented’ status. Reasons for this might include:

– Processes are not annually reviewed against KPIs (if they are even in place at all);
– Contracts are not reviewed annually.

Reporting

Processes around Reporting see the cohort’s average rise sufficiently to achieve a ‘practised but not documented, OR documented but not practised’ status of Level 1, with a score of 1.05. As before, formal processes around reporting and the release and management of reports might not be deemed valuable, since it is assumed that members of the SAM team naturally understand a licence compliance process.

Processes (operational)

It’s hardly surprising that the section measuring Operational SAM Processes scored higher than many other sections, but at 1.38 it is still a Level 1 result of ‘practised but not documented, OR documented but not practised’. If processes don’t exist around the management of the SAM lifecycle, then the ability of the SAM team to drive cultural change/maturity in handling software is greatly diminished.

Processes (best practice)

The final section of the light assessment, perhaps not surprisingly, scored the lowest mark out of the 10 sections, averaging 0.71 across all participants and earning it the status of ‘not practised or documented’. Clearly, best practice processes won’t even be attempted until the necessary level of SAM Maturity is in place, and the operational SAM processes are performing optimally.

Summary

We hope that the findings contained in this report have offered you an assurance that you are heading in the right direction with your SAM efforts, but more importantly, a feeling that you are not alone – help is available, and you may not be as far behind as you first thought.

The absolute worst thing you can do about managing software assets is to do nothing(!) and hope that software vendors don’t know of your existence – ignorance is not bliss. Gartner recently reported startling statistics about the frequency of companies receiving at least two vendor audits in a 12-month window. Having the steps in place to address a vendor audit is the kernel of good SAM. This takes us back to the process review above: no amount of technology or personnel is going to prevent eye-watering demands for extra capital if you haven’t laid the foundation for informed SAM business intelligence.

As mentioned previously, SAM Charter and our partners are well placed to guide you to the next level of SAM Maturity – drilling much deeper than the high-level review provided above. Why not contact us now?
