Are You Ready for Analytics? We Say Yes!
All organizations these days go through change at a rapid pace, whether it be organizational realignments, new CIOs, mergers and acquisitions, staffing reductions, new applications, technologies or processes, or market disruptions. The IT department is not exempt from this; in fact, one could argue it is more impacted, as it needs to be agile enough to respond to changes in the business AND in the IT landscape! As a result, there is often insufficient time to ensure business and master data management processes are fully defined, followed, and controlled.
At Numerify, we have increased our focus on providing both the tools and services that allow our customers to gain valuable insights in spite of imperfect operational systems. Perhaps your organization can relate to some of the following data challenges, which Numerify customers are finding easier to address through data warehousing and analytics:
Several of our ITSM customers do not enforce clean separation of IT tasks into categories or task types, either because they want to “start simple” or because they wish to minimize the effort for the team entering the transactions.
- By providing flexible filtering and reporting across task types, we can support an organization’s ongoing transition from a less mature model to a more mature one.
- Through use of text analytics on description fields and work notes by agents, we can extrapolate and highlight the most common issue categories even if categories are not explicitly tracked on the incoming incidents.
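As an illustration of the second point, here is a minimal sketch of keyword-based categorization over free-text descriptions. The incidents, field names, and keyword buckets below are hypothetical examples, not Numerify's actual text analytics:

```python
from collections import Counter
import re

# Illustrative incident descriptions; these records and the keyword
# taxonomy below are made up for the example.
incidents = [
    "VPN connection drops every hour on the retail network",
    "Password reset needed for new hire account",
    "Printer in Building 4 offline again",
    "VPN tunnel fails after password change",
]

# Simple keyword buckets standing in for issue categories.
CATEGORY_KEYWORDS = {
    "network": ["vpn", "network", "tunnel"],
    "access": ["password", "account", "login"],
    "hardware": ["printer", "laptop", "monitor"],
}

def categorize(description):
    """Return every category whose keywords appear in the text."""
    words = set(re.findall(r"[a-z]+", description.lower()))
    return [cat for cat, kws in CATEGORY_KEYWORDS.items()
            if words & set(kws)]

# Tally the most common categories even though no category field
# was ever filled in on the tickets themselves.
counts = Counter(cat for d in incidents for cat in categorize(d))
print(counts.most_common())  # → [('network', 2), ('access', 2), ('hardware', 1)]
```

A production approach would use richer text analytics than keyword matching, but even this sketch shows how common themes can be surfaced when categories are not explicitly tracked on incoming incidents.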
In some cases, customers have not made a complete transition to a single system of record for their ITSM needs. Common reasons for running multiple systems include: an in-progress switch to a newer SaaS application such as ServiceNow, mergers and acquisitions, different systems supporting external customer support vs. internal IT support, or regional variations.
- This is where analytics and data warehouses really shine. Numerify360 for IT can support sourcing the transactions out of multiple, diverse systems and provide a single normalized view across all of them.
- By starting to track your key metrics very early in the deployment of a new operational system (ideally from Day 1), you can catch system or process shortcomings immediately and adjust the configuration while users are still defining their usage patterns for the new system and are not yet “set in their ways”. Additionally, a discussion with your internal stakeholders about which metrics to measure in the first place is best held while you are already engaging them to define the new operational system workflows.
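The core idea behind a single normalized view can be sketched as follows. This is a hypothetical example, not Numerify360's actual pipeline: the two source systems, their field names, and the priority mapping are all illustrative assumptions.

```python
# Records as each (hypothetical) source system exports them.
servicenow_rows = [
    {"number": "INC0001", "opened_at": "2017-03-01", "priority": "1 - Critical"},
]
legacy_rows = [
    {"ticket_id": "T-9942", "created": "2017-02-27", "sev": "P1"},
]

# Illustrative mapping of one system's priority labels onto a common scale.
PRIORITY_MAP = {"1 - Critical": "P1", "2 - High": "P2"}

def from_servicenow(row):
    """Translate a ServiceNow-style record into the common schema."""
    return {"source": "servicenow",
            "ticket_id": row["number"],
            "opened": row["opened_at"],
            "priority": PRIORITY_MAP.get(row["priority"], "P3")}

def from_legacy(row):
    """Translate a legacy-system record into the common schema."""
    return {"source": "legacy",
            "ticket_id": row["ticket_id"],
            "opened": row["created"],
            "priority": row["sev"]}

# One normalized view across both systems.
all_tickets = ([from_servicenow(r) for r in servicenow_rows] +
               [from_legacy(r) for r in legacy_rows])

# Metrics can now be computed across systems in one pass.
p1_count = sum(1 for t in all_tickets if t["priority"] == "P1")
print(p1_count)  # → 2
```

Once every source is translated into the same schema, cross-system metrics like the P1 count above become a single query instead of a manual reconciliation exercise.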
Dirty Data – It May Be Dirtier Than You Think!
Some of our customers do not enforce validation on all of their reference data. For example, when filing an IT support ticket, a retailer we work with asks employees to provide the store reference on the ticket form, but many different notations can refer to the same store location, such as Store B12, B12, Newport Beach, etc. In many cases, cleaning up this data at the source has limited value, since for operational processing any reference that clearly identifies the store is sufficient. However, it causes a nightmare for reporting!
- To support store-level summary trend reporting, we worked with the customer to load a master set of stores from another system. With the master set in place, we now simply maintain a mapping from the source data to the master data. The end result provides accurate location-based analytics without any need to alter the workflow or, even worse, the culture of the teams opening the tickets.
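The mapping approach can be sketched in a few lines. The store IDs and notations below are invented for illustration and are not the customer's actual data:

```python
# Master set of stores, e.g. loaded from another system of record.
master_stores = {"B12": "Newport Beach", "B13": "Irvine"}

# Mapping from the many notations seen on tickets to master store IDs.
# In practice this table grows as new variants appear in the source data.
source_to_master = {
    "Store B12": "B12",
    "B12": "B12",
    "Newport Beach": "B12",
    "Irvine store": "B13",
}

def resolve_store(raw_reference):
    """Return the master store ID, or None for an unmapped reference."""
    return source_to_master.get(raw_reference.strip())

tickets = ["Store B12", "Newport Beach", "Irvine store", "b-12??"]
resolved = [resolve_store(t) for t in tickets]
print(resolved)  # → ['B12', 'B12', 'B13', None]
```

Unmapped values come back as `None` rather than silently fragmenting the report, so they can be routed to a review queue and added to the mapping, all without touching the source workflow.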
Several customers have remarked how much more clearly they see their known data issues, but also discover *unknown* issues, once their Numerify instance is populated.
- Their known issues are magnified because they realize how much their configuration or process issues are limiting insights, creating inefficiencies, and/or decreasing transparency.
- Regarding unknown issues, we are introducing programmatic data profiling and moving it earlier in the discovery/delivery process. Doing this early allows us to highlight the unknown issues right away rather than leaving them to be stumbled upon later. We find this approach really helps add credibility and value to our service: this is not just a BI tool, but a service with domain expertise that understands what drives value and how to overcome the common stumbling blocks associated with data quality.
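To make the profiling idea concrete, here is a minimal sketch that computes per-field fill rates and distinct-value counts over ticket records. The records and field names are hypothetical, and real profiling covers far more checks than these two:

```python
# Illustrative ticket records with the kinds of gaps and notation
# drift that profiling is meant to surface.
rows = [
    {"category": "network", "store": "B12", "assignee": "alice"},
    {"category": None, "store": "Store B12", "assignee": "bob"},
    {"category": "network", "store": None, "assignee": None},
]

def profile(records):
    """Return {field: (fill_rate, distinct_non_null_count)} per field."""
    stats = {}
    fields = {f for r in records for f in r}
    for f in sorted(fields):
        values = [r.get(f) for r in records]
        non_null = [v for v in values if v is not None]
        stats[f] = (len(non_null) / len(values), len(set(non_null)))
    return stats

for field, (fill, distinct) in profile(rows).items():
    print(f"{field}: {fill:.0%} filled, {distinct} distinct values")
```

Even on three rows, the profile flags that `category` is incompletely filled and that `store` has two distinct notations for what may be the same location, exactly the kind of unknown issue worth raising during discovery rather than after go-live.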
Continuous improvement is a stated goal for leaders in many domains, and data management is no exception. The reality is that achieving continuous improvement requires consistent visibility and assessment against goals. Waiting to ‘get to’ data quality is often not practical, and those kinds of projects rarely lend themselves to continuous measurement. Leveraging analytics and data profiling not only provides insights for an improved operation, it casts a light on process and data issues. Oftentimes, casting that light is the surest path to improvement.