Your Industry and Applied ARTIFICIAL INTELLIGENCE

by Dharmesh Kishore

Customer Service Operations

Unlocking Performance: Applied AI and Customer Service Operations Analytics

Customer service operations generate a ton of data, and data analytics helps make sense of it all. By analysing key metrics, you can gain valuable insights into how your team is performing, identify areas for improvement, and ultimately deliver better customer experiences. Here’s a breakdown of different metric categories used in data analytics for customer service:

Performance Metrics:

First Contact Resolution Rate (FCR): Percentage of issues resolved during the first interaction. High FCR indicates efficient service.
Average Handling Time (AHT): Average time spent resolving an issue. Lower AHT translates to quicker resolutions.
Customer Satisfaction Score (CSAT): Measures customer satisfaction with the interaction. High CSAT shows happy customers.
Net Promoter Score (NPS): Gauges customer loyalty and likelihood to recommend your business. High NPS indicates positive brand perception.
Agent Productivity: Tracks the number of issues handled or time spent by agents. Optimizing this metric improves efficiency.
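To make these concrete, here is a minimal, hedged sketch in Python of how FCR, AHT, and CSAT might be computed from a ticket-level log. The table and column names (ticket_id, contacts, handle_minutes, csat_score) are illustrative assumptions, not a reference to any specific system.

# A minimal sketch: computing FCR, AHT, and CSAT from a ticket-level log.
# Column names are hypothetical; adapt them to your own schema.
import pandas as pd

tickets = pd.DataFrame({
    "ticket_id": [1, 2, 3, 4],
    "contacts": [1, 3, 1, 2],          # interactions needed to resolve
    "handle_minutes": [12, 45, 8, 20], # total handling time
    "csat_score": [5, 2, 4, 4],        # post-interaction survey, 1-5
})

fcr = (tickets["contacts"] == 1).mean() * 100      # % resolved on first contact
aht = tickets["handle_minutes"].mean()             # average handling time
csat = (tickets["csat_score"] >= 4).mean() * 100   # % satisfied (4 or 5)

print(f"FCR: {fcr:.1f}%  AHT: {aht:.1f} min  CSAT: {csat:.1f}%")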

Operational Metrics:

Call/Chat/Email Volume: Tracks volume of service requests across different channels. Helps with staffing and resource allocation.
Self-Service Usage: Measures customer adoption of self-service options like FAQs or knowledge bases. High usage reduces agent workload.
Resolution Trends: Identifies frequently occurring issues and helps prioritize solutions.
Channel Performance: Compares performance across channels (phone, email, chat) to identify strengths and weaknesses.

Sentiment Analysis:

Positive Mentions: Identifies positive feedback about your service or products.
Negative Mentions: Helps understand customer frustrations and areas for improvement.
Emotion Classification: Categorizes customer sentiment (e.g., anger, joy, frustration) for deeper analysis.
Topic Analysis: Uncovers trending topics and areas of customer concern.
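As a hedged illustration, the sketch below classifies a couple of made-up customer messages with NLTK's VADER analyser. A production pipeline would use your own channel data and likely a domain-tuned model.

# A minimal sentiment-classification sketch using NLTK's VADER analyser;
# the messages are invented examples.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

messages = [
    "The agent resolved my issue in minutes, brilliant service!",
    "Still waiting after three days. This is really frustrating.",
]

for msg in messages:
    score = sia.polarity_scores(msg)["compound"]  # -1 (negative) .. +1 (positive)
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:>8}  {score:+.2f}  {msg}")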

Predictive Analytics:

Churn Prediction: Identifies customers at risk of leaving and helps with proactive retention efforts.
Issue Prediction: Uses historical data to predict potential issues and take preventive measures.
Resource Optimization: Optimizes staffing and scheduling based on anticipated contact volume.
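Churn prediction in particular lends itself to a simple supervised baseline. Below is a minimal, illustrative scikit-learn sketch on synthetic data; the features and labels are assumptions for demonstration only.

# A minimal churn-prediction sketch with scikit-learn on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.integers(0, 10, n),    # complaints in last 90 days
    rng.uniform(0, 60, n),     # average resolution time (hours)
    rng.integers(1, 48, n),    # tenure in months
])
# Synthetic label: more complaints and slower resolutions -> more churn.
y = ((X[:, 0] * 5 + X[:, 1] - X[:, 2] * 0.5 + rng.normal(0, 10, n)) > 30).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank customers by churn risk so retention teams can act proactively.
risk = model.predict_proba(X_test)[:, 1]
print("Top-5 at-risk indices:", np.argsort(risk)[::-1][:5])
print("Holdout accuracy:", model.score(X_test, y_test).round(2))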

Remember, the specific metrics you use will depend on your unique business goals and priorities. But by tracking and analysing relevant metrics, you can gain valuable insights that drive data-driven decisions
for better customer service operations.

Case Study 1 – Using AI to form Specialised Teams

This case study is from a fintech major. While business at that point was booming and nearly all teams were expanding, the client engaged our consultants to build applications and analytics infrastructure to address the ever-rising backlog of customer complaints.

Problem Statement:

The client had done an initial analysis, grouping customer-reported issues by category and measuring resolution times. It was noticed that a high number of tickets, even though categorised as minor issues, were taking a significantly long time to resolve, and this backlog was significantly large.

The Investigation:
Having gone through the initial reports, our team of data scientists and analysts applied data mining, process mining, clustering, and classification techniques and concluded that these tickets had the following three properties (a minimal clustering sketch follows this list):

  1. A set of specific failure codes
  2. High number of back and forth between experts
  3. Tickets clustered around 3 expert groups.
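As a hedged illustration of the clustering step (not the client's actual pipeline or data), a minimal sketch might look like this:

# A minimal sketch of grouping tickets by failure code and expert-transfer
# count. Data and column names are hypothetical illustrations.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

tickets = pd.DataFrame({
    "failure_code": ["F17", "F17", "F09", "F23", "F09", "F17"],
    "transfers":    [4, 5, 3, 1, 4, 5],       # hand-offs between experts
    "resolve_days": [12, 15, 9, 2, 11, 14],
})

features = pd.get_dummies(tickets["failure_code"])
features[["transfers", "resolve_days"]] = StandardScaler().fit_transform(
    tickets[["transfers", "resolve_days"]]
)

tickets["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(tickets)  # slow, multi-transfer tickets tend to land in one cluster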

Recommendations:

Our team recommended that, for an extended class of the observed failure codes, instead of going through a three-step escalation or desk transfer, a special group with one expert from each of the earlier groups should be created, and tickets should be routed directly to this group. Our team expected that, with some cross-training, switch times between teams would be reduced. The client accepted the recommendations and staffed this group with experts from the respective groups.

Results:
As predicted, we quickly saw the cross-training bear fruit and resolution times come down drastically.


Cascading effects were soon realised as well. Teams were seated in a formation such that they could easily make eye contact with other teams and seek the required support. Due to these interactions and the cross-training, individuals quickly gained expertise in adjacent areas, and this was reflected in ticket closures, handling times, and handover times.

Case Study 2 – Using Predictive Analytics to Automate Ticket Allocation to Field Support

Problem Statement:

The customer was struggling with extremely high time-to-resolution for support tickets that required a field support visit to merchants. The customer sought our team's help in analysing the problem and designing a solution, as these issues directly impacted customer satisfaction and churn.

Investigation:

Our team of data scientists and analysts ran analytics on logs that also contained text data. We identified a pattern in a significant number of the failures that required field support.

Our data scientists visualised the data and showed how most of these issues could be traced back to a series of events over the preceding hours and days. The ML models created by our team could predict a failure type with high accuracy.

Recommendations:

Our team proposed deploying these models in an end-of-day run to predict and classify the customers with a high probability of seeing these failures over the next few hours and days. Based on the classification and severity of errors, field support resources could then be allocated the respective tickets, even proactively. The customer accepted the recommendation, and our team helped deploy the ML model and the ticket assignment to field support.
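A hedged sketch of what such an end-of-day run might look like follows; the model, features, merchants, and 0.8 threshold are synthetic stand-ins, not the client's production artefacts.

# End-of-day scoring sketch: score every merchant on recent event features
# and queue field-support tickets above a risk threshold. All data here is
# synthetic and illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
hist = pd.DataFrame({
    "error_count_24h": rng.integers(0, 20, 400),
    "retry_rate": rng.uniform(0, 1, 400),
})
hist["failed"] = ((hist["error_count_24h"] > 10) & (hist["retry_rate"] > 0.5)).astype(int)
model = LogisticRegression(max_iter=1000).fit(
    hist[["error_count_24h", "retry_rate"]], hist["failed"]
)

# "Today's" snapshot to score at end of day.
today = pd.DataFrame({
    "merchant_id": [501, 502, 503],
    "error_count_24h": [2, 15, 18],
    "retry_rate": [0.1, 0.7, 0.9],
})
today["failure_prob"] = model.predict_proba(today[["error_count_24h", "retry_rate"]])[:, 1]

for row in today[today["failure_prob"] > 0.8].itertuples():
    # Stand-in for the client's ticketing API call.
    print(f"Proactive field-support ticket for merchant {row.merchant_id} "
          f"(risk={row.failure_prob:.2f})")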

Results:

As promised, we could predict the failures and, most of the time, attend to the customer pre-emptively. This greatly reduced the class of failures that had been driving a high number of customer escalations.

Ecommerce and Retail Analytics

Unleashing the Power of Ecommerce and Retail Analytics

1. What is Sales Analytics?

Sales analytics involves systematically analysing and interpreting sales data and key performance indicators (KPIs) to improve sales strategies. It goes beyond mere number-crunching, offering a nuanced understanding of consumer behaviour trends, market dynamics, and operational efficiency. By leveraging advanced analytics, retailers can make informed decisions that impact everything from inventory management to customer satisfaction.

2. Categories of Sales Analytics:

Let’s explore the four core categories of sales analytics:

A. Descriptive Analytics:
Descriptive sales analytics provides a snapshot of past performance by analysing historical sales data. For instance, sales reports might reveal that Q4 revenue growth exceeded expectations. Understanding what has or hasn’t worked in the past is essential for sales teams.

B. Diagnostic Analytics:
Diagnostic analytics delves into the “why” behind key sales metrics. Sales managers use analytics tools to investigate customer behaviour and market trends. This understanding helps explain anomalies, such as a spike in customer acquisition costs during a specific month.

C. Predictive Analytics:
Predictive analytics uses existing sales data to make predictions about future sales. Sales representatives can forecast that the current sales pipeline will likely translate into increased total revenue for the next quarter. This is crucial for resource allocation in sales and marketing efforts.
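As a minimal illustration of predictive analytics, the sketch below projects next-quarter revenue from a linear trend over past quarters; the revenue figures are invented, and real forecasting would account for seasonality and pipeline stage.

# A toy revenue forecast from a least-squares trend over past quarters.
import numpy as np

revenue = np.array([1.8, 2.1, 2.4, 2.3, 2.7, 3.0])  # past 6 quarters, $M
quarters = np.arange(len(revenue))

slope, intercept = np.polyfit(quarters, revenue, 1)   # linear trend fit
forecast = slope * len(revenue) + intercept

print(f"Trend: +${slope:.2f}M per quarter; next-quarter forecast: ${forecast:.2f}M")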

D. Prescriptive Analytics:
Prescriptive analytics offers actionable insights for resolving specific issues or achieving desired outcomes. It guides decision-making by suggesting optimal strategies based on data. For example, it might recommend adjusting pricing strategies or optimising inventory levels.

3. Why Ecommerce and Retail Analytics Matter:

● Refining Pricing Strategies: Analytics helps you understand price elasticity, identify optimal price points, and adjust pricing strategies to maximise revenue.
● Optimising Inventory: Accurate demand forecasting ensures you stock the right products in the right quantities, minimising excess inventory or stockouts.
● Enhancing Customer Experience: Personalised recommendations, targeted marketing, and understanding customer preferences lead to better customer satisfaction and loyalty.

Success Mantra: Data-driven decisions are the key to success!

1. SKU Pricing Comparison Using Web Scraping

What is SKU Pricing?

SKU (Stock Keeping Unit) pricing refers to the specific price assigned to individual products within your inventory. It’s essential for retailers to analyse and compare SKU prices to ensure competitiveness and profitability.

How Web Scraping Helps:

Web scraping allows you to extract pricing data from competitor websites, marketplaces, or other sources. By comparing your SKU prices with those of competitors, you gain insights into pricing trends, identify opportunities for adjustment, and optimise your pricing strategy.
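A minimal, hedged scraping sketch is shown below; the URL and CSS selector are placeholders (every site differs), and you should check a site's terms of service and robots.txt before scraping it.

# A minimal price-scraping sketch with requests + BeautifulSoup.
# The URL and CSS selector below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

def fetch_competitor_price(url: str, selector: str) -> float | None:
    resp = requests.get(url, timeout=10, headers={"User-Agent": "price-check/1.0"})
    resp.raise_for_status()
    tag = BeautifulSoup(resp.text, "html.parser").select_one(selector)
    if tag is None:
        return None
    # Strip currency symbols/commas, e.g. "$1,299.00" -> 1299.00
    return float("".join(ch for ch in tag.get_text() if ch.isdigit() or ch == "."))

# Example (placeholder site and selector):
# price = fetch_competitor_price("https://example.com/sku/123", "span.price")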

2. Recommending SKU Pricing

Dynamic Pricing Algorithms:

Implementing dynamic pricing algorithms based on real-time data can enhance your SKU pricing recommendations. These algorithms consider factors like demand, supply, competitor prices, and customer behaviour to suggest optimal pricing for each SKU.
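As a hedged illustration only, the sketch below shows a toy rule-based variant; real dynamic-pricing systems estimate elasticity and demand from data, and the weights and bounds here are assumptions.

# A toy rule-based dynamic-pricing sketch; weights and bounds are illustrative.
def suggest_price(cost: float, competitor_price: float, demand_index: float) -> float:
    """demand_index: ~1.0 is normal demand; >1 is hot, <1 is slow."""
    floor = cost * 1.05                      # never price below a 5% margin
    target = competitor_price * 0.98         # slightly undercut competitors
    adjusted = target * (0.95 + 0.1 * min(demand_index, 2.0) / 2.0)
    return round(max(floor, adjusted), 2)

print(suggest_price(cost=80.0, competitor_price=100.0, demand_index=1.4))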

3. Product Recommendation Systems

Personalised Product Recommendations:

Leverage analytics to create personalised product recommendations for your customers. By analysing their browsing history, purchase behaviour, and preferences, you can offer tailored suggestions. These recommendations boost cross-selling and upselling opportunities.
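A minimal illustration of one common approach, item-item collaborative filtering over a tiny, made-up purchase matrix:

# Item-item collaborative filtering via cosine similarity.
# Rows are customers, columns are products; the data is invented.
import numpy as np

purchases = np.array([
    [1, 1, 0, 0],   # customer A bought items 0 and 1
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
])

norm = np.linalg.norm(purchases, axis=0, keepdims=True)
sim = (purchases.T @ purchases) / (norm.T @ norm)   # item-item cosine similarity
np.fill_diagonal(sim, 0)

item = 1
print("Recommend alongside item 1:", np.argsort(sim[item])[::-1][:2])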

4. Promotion Recommendation Systems

Strategic Promotions:

Analytics helps you identify the most effective promotional strategies. By analysing historical data, customer segments, and seasonal trends, you can recommend targeted promotions. Whether it’s discounts, bundling, or loyalty rewards, data-driven decisions lead to better results.

5. Marketing Analytics and Promotions Tracking

Measuring Campaign Effectiveness:

Track the performance of marketing campaigns using analytics. Understand which channels drive the most conversions, monitor click-through rates, and optimise your marketing spend. Data-driven insights ensure efficient resource allocation.

6. Abandoned Cart Recovery with Promotion Automation

Reducing Cart Abandonment:

Analyse the cart abandonment patterns to identify bottlenecks in the purchase process. Implement automated email reminders with personalised promotions to encourage customers to complete their transactions.
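As a hedged sketch, the selection and discount logic for such reminders might look like this; the table, columns, and thresholds are illustrative assumptions.

# Selecting stale carts for an automated reminder with a small discount.
# Data, column names, and thresholds are hypothetical.
import pandas as pd

carts = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "cart_value":  [35.0, 240.0, 88.0],
    "hours_idle":  [2, 30, 26],
})

# Remind after 24 idle hours; offer a bigger nudge on high-value carts.
stale = carts[carts["hours_idle"] >= 24].copy()
stale["discount_pct"] = (stale["cart_value"] > 100).map({True: 10, False: 5})

for row in stale.itertuples():
    print(f"Email customer {row.customer_id}: complete your ${row.cart_value:.0f} "
          f"cart and save {row.discount_pct}%")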

7. Fraud Prevention, Customer Analytics, Loyalty Programs, CRV, CIV

Fraud Detection:

Use analytics to detect fraudulent activities such as payment fraud, account takeovers, and fake reviews. Real-time monitoring and anomaly detection safeguard your business.

Customer Insights:

Analyse customer data to understand preferences, behaviour, and lifetime value. Tailor your offerings and communication accordingly.

Loyalty Programs:

Design loyalty programs based on analytics. Reward loyal customers, encourage repeat purchases, and foster brand loyalty.

CRV (Customer Retention Value) and CIV (Customer Incremental Value):

Calculate CRV to assess the long-term value of retaining a customer. CIV measures the additional value generated by incremental purchases. Both metrics guide retention strategies.
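Since CRV and CIV are defined loosely here, the sketch below shows one simple, assumption-laden way to operationalise them; the formulas and figures are illustrative, not standard industry definitions.

# Illustrative (non-standard) formulas for the CRV/CIV definitions above.
def crv(monthly_margin: float, retention_rate: float, months: int) -> float:
    """Customer Retention Value: margin expected if the customer is retained."""
    return sum(monthly_margin * retention_rate**m for m in range(1, months + 1))

def civ(baseline_spend: float, actual_spend: float, margin_pct: float) -> float:
    """Customer Incremental Value: margin on spend above a baseline."""
    return max(actual_spend - baseline_spend, 0) * margin_pct

print(f"CRV over 12 months: {crv(20.0, 0.9, 12):.2f}")
print(f"CIV this quarter:   {civ(300.0, 420.0, 0.25):.2f}")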

Case Study 1 – Using Analytics to Detect Abuse/Fraud and Propose a Pricing Strategy

While consulting for one of our ecommerce clients, our consultants discovered during exploratory data analysis that some SKUs were flying off the shelves unusually fast. These included one of the well-known energy drink brands.

While presenting this data to the marketing and product managers, it was discovered that these items were not part of any promotion and carried little margin. Further analysis revealed a pattern to these purchases: whenever available, these items were bought in large quantities through multiple accounts in a certain geography, and many of these accounts pointed to the same addresses.

Digging further, we wanted to find out why these items were being bought in such large numbers. Our hypothesis was that these items, including the energy drink, were priced cheaper than other sources, and retailers took the opportunity to stock up as soon as they came back in stock. We devised a price discovery strategy by running web-scraping scripts that collected the prices of the SKUs daily. We were able to collect and compare the prices of over 83,000 SKUs, and we quickly found, and could prove, that the bestsellers were priced well below the competition; it was as good as leaving money on the table.
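As a hedged illustration of the duplicate-account pattern described above (not the client's actual data or rules), flagging address-sharing bulk buyers might look like this:

# Flag clusters of accounts that share an address and bulk-buy the same SKU.
# Data and thresholds are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "account_id": [1, 2, 3, 4, 5],
    "address":    ["12 Main St", "12 Main St", "12 Main St", "7 Oak Ave", "9 Elm Rd"],
    "sku":        ["ENERGY-1L"] * 4 + ["SOAP-2PK"],
    "qty":        [48, 60, 36, 50, 2],
})

flags = (orders.groupby(["address", "sku"])
               .agg(accounts=("account_id", "nunique"), total_qty=("qty", "sum"))
               .reset_index())
suspicious = flags[(flags["accounts"] >= 3) & (flags["total_qty"] >= 100)]
print(suspicious)  # candidate reseller/abuse clusters for manual review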

Recommendations and engineering changes:

  1. Duplicate accounts were identified and throttled so that retail stores could not abuse the offers.
  2. Limits were placed on the purchase quantity of a promoted SKU so that its benefits could reach more of the targeted consumers.
  3. A mechanism was devised to suggest a price per SKU that would keep it in a range where customers would still prefer to purchase from the platform while the client no longer left money on the table.

Results:

The strategy brought great results for the ecommerce company:

  1. After the engineering changes, marketing and promotion managers could no longer arbitrarily price an SKU just to increase sales.
  2. Promotions were tracked against their targeted results, and non-performing promotions were quickly stopped.
  3. The organisation no longer lost money on unprofitable SKUs to unprofitable customers.

Case Study 2: Using AI/ML for a Focussed Promotion Strategy

While we were working with the same ecommerce client, the customer asked if we could help them look at unfulfilled and failed orders so that the promotion process could be streamlined while optimising costs. It was observed that, in overnight promotions, customers from remote areas would often buy products whose promotion objectives were achieved in Tier-1 cities but not in Tier-2 and Tier-3 cities. The logistics costs for these Tier-2 and Tier-3 cities were extremely high, resulting in poor or negative ROI on a promotion.

Our team of consultants looked at the click-through and sales data and collated them with the promotions data. We found that the app lacked the technology to limit the visibility of promotions by geography. It was also found that the marketing team's allocation of the promotion budget required finer tuning.

Recommendations:

  1. Final product pricing was recommended to include a dynamically calculated logistics cost based on pin code, quantity, volume, and weight (a minimal pricing sketch follows this list).
  2. Customer segmentation using parameters such as age, gender, location, purchase history, preferred product categories, cart value, and discount and promotion usage was used to supplement cohort identification.
  3. Promotions designed by Marketing now had to go through matching and filtering against these targeted cohorts.
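As referenced in recommendation 1, here is a minimal sketch of folding a dynamically computed logistics cost into the final price; the zone rates, pin-code lookup, and volumetric-weight rule are hypothetical placeholders.

# Toy final-price calculation with a pin-code-based logistics cost.
ZONE_RATE = {"tier1": 0.5, "tier2": 1.5, "tier3": 3.0}   # cost per billable kg

def zone_for_pin(pin_code: str) -> str:
    # Stand-in for a real pin-code -> logistics-zone lookup table.
    return {"4000": "tier1", "4130": "tier2"}.get(pin_code[:4], "tier3")

def final_price(base_price: float, qty: int, weight_kg: float,
                volume_l: float, pin_code: str) -> float:
    billable = max(weight_kg, volume_l / 5.0) * qty      # volumetric-weight rule
    logistics = billable * ZONE_RATE[zone_for_pin(pin_code)]
    return round(base_price * qty + logistics, 2)

print(final_price(base_price=299.0, qty=2, weight_kg=1.2, volume_l=4.0, pin_code="413512"))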

Results:

  1. The app was now able to show or block the respective promotions for the targeted demographics/cohorts, and the business no longer lost money on overbooking and unnecessary returns.
  2. With the dynamically calculated logistics costs included in the final cart value, the company lost little money on failed orders.
  3. With the cohort-matching algorithms, push-message campaigns could be targeted to bring back churned or inactive customers.

Pharma and Manufacturing

“Transforming Operations Excellence in Pharma and Manufacturing with AI/ML”

As Chemical and Pharma Manufacturing industries grow in volume and complexity, they face challenges like:

● Product Quality
● Delivery Delays
● Cost Optimization
● First to Market (R&D processes)
● Lack of digital systems
● Life Cycle Management of Products

The Covid-19 pandemic has also shown how unprepared these industries can find themselves in combating disruption, especially regarding the availability of talent to help sustain the above-mentioned organisational goals.

It’s no secret that, to do things faster, a growing number of organisations are adopting AI/ML alongside the knowledge base they have built over the years (Reference: https://hbr.org/2023/11/how-ai-fits-into-lean-six-sigma, which discusses applying AI/ML/automation to all stages of DMAIC). At 1team.ai we offer a unique combination of decades of expertise in Applied AI/ML and Data/Process Mining with the Lean and Six Sigma methodology used in Operations Excellence projects in Pharma and Chemical Manufacturing.

We have developed proprietary investigation methodologies that apply AI/ML to the Lean and Six Sigma processes generally followed in large pharma and manufacturing organisations. Through these we have been able to shorten the Measure and Analyse phases of Six Sigma's DMAIC cycle.

Define (Ref: https://asq.org/quality-resources/dmaic)
Measure (Ref: https://asq.org/quality-resources/dmaic)
Plus:
○ We use Automation and AI to collect data from digital or hard-copy artefacts.
○ We use AI and Process Mining to discover and confirm the “process map” shared by the SMEs.
○ We use AI for Genealogy Analytics.
○ We use Automation to Measure and Baseline.
Analyse (Ref: https://asq.org/quality-resources/dmaic)
Plus:
○ Create hypotheses.
○ Run deep analytics using Machine Learning and AI tools for CPP/CQA analysis.
○ Run advanced models and create simulations for testing hypotheses.
○ Communicate RCA findings using the models’ results.
○ Recommend corrective actions.
Improve (Ref: https://asq.org/quality-resources/dmaic)
Control (Ref: https://asq.org/quality-resources/dmaic)
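As an illustration of the "deep analytics for CPP/CQA analysis" step above, here is a hedged sketch of ranking critical process parameters (CPPs) against a quality attribute (CQA) with a random-forest model; the parameter names and data are synthetic.

# Rank CPPs by importance against a CQA using a random forest.
# Parameter names and data are synthetic illustrations.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n = 300
cpps = pd.DataFrame({
    "inlet_rh":     rng.uniform(20, 70, n),     # inlet relative humidity, %
    "spray_rate":   rng.uniform(80, 120, n),
    "product_temp": rng.uniform(25, 40, n),
})
# Synthetic CQA: dissolution driven mostly by inlet RH plus noise.
cqa = 100 - 0.6 * cpps["inlet_rh"] + 0.05 * cpps["spray_rate"] + rng.normal(0, 2, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(cpps, cqa)
ranking = pd.Series(model.feature_importances_, index=cpps.columns)
print(ranking.sort_values(ascending=False))   # inlet_rh should dominate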


Case Study 1: Differentiating KPIVs and KPOVs, the Human Factor, and the Serendipity that Helped Us Improve Our Process

Minimising the Cost of Poor Quality (CoPQ) is among the commonly identified project types in the Pharma industry. This article covers an oral drug project that one of our consultants worked on for a pharma client.

Problem Statement: The measured assay for this product, a Critical Quality Attribute (CQA) at the FP (Finished Product) stage, had repeated issues, and the product's process performance index (Ppk) was coming out fairly low (<0.66). The mandate for this project was to identify the real root cause of the issue, since other probable root causes identified earlier, and the preventive actions taken, had not yielded results.
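For readers unfamiliar with Ppk, the sketch below shows the standard calculation, which uses the overall (long-term) standard deviation; the spec limits and assay values are made-up illustrations.

# Standard Ppk calculation: min distance from the mean to a spec limit,
# in units of three overall standard deviations. Data here is synthetic.
import numpy as np

def ppk(values, lsl, usl):
    mu, sigma = np.mean(values), np.std(values, ddof=1)  # overall sigma
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

assay = np.random.default_rng(1).normal(loc=99.2, scale=1.4, size=60)
print(f"Ppk = {ppk(assay, lsl=95.0, usl=105.0):.2f}")  # < 1.33 signals poor capability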

We followed Six Sigma Methodology and applied conventional DMAIC steps. (As mentioned earlier, we have already integrated Machine Learning and AI modelling as part of the investigation steps.)

  • Project Charter signed.
  • CTQ (Critical to Quality) analysis done.
  • Process Map and Cause and Effect analysis done.
  • For this project the CTQs are measured at the IP (in-process) and FP (finished product) stages.

In the DMAIC “Analyse” stage we reviewed the previous investigation reports and Corrective and Preventive Actions (CAPAs). The reports indicated that the CAPAs had not been effective, and issues continued to recur.

The Investigation

It was observed that the CTQ was on target at the IP stage but higher at the FP stage, even though there are no major process steps in between.
This indicated possible analytical variation. However, before attributing it to analytical variation, we had to collect the CPP (Critical Process Parameters) data. We ran exploratory analysis using different ML algorithms. This was the first time we had even operator-level data made available to us, so we looked at the Equipment and Operator dimensions. The ML models and exploratory analysis did not indicate any potential causes in manufacturing, which led us to suspect analytical variation.

Further data was collected in the given format and analysed. We experimented extensively with ML models, and the analytical variation was validated.

Based on the analysis, the variation appeared to arise in the standard preparation stage of the analytical method. This conclusion was presented and was confirmed by R&D and QC/QA.
As a recommendation, a new CAPA was proposed and implemented.

Result: A marked difference in the mean value of the CTQ was observed. The project was appreciated by the senior production managers and by technical and quality management.

Serendipity

While Water by KF (WKF, water content by Karl Fischer titration) was not part of the earlier investigations, the extensive analysis of analytical variation surfaced signals pointing directly to the variance seen in WKF measurements. We ran ML models on the collected CPP profile data and presented the observations and modelling results to the manufacturing SMEs and our OE counterparts.
Kudos to the Operations Excellence (OE) team for being open-minded about the scope of the investigation. The SMEs went to great lengths to help us collect the relevant data and corroborate the findings with their observations. In several cases the SMEs would also challenge, and help us correct, observations that could not be explained purely by looking at the data.
The prediction model for WKF was accepted and was thereafter referenced for all further working ranges on the equipment.

Lessons in What Not to Do

We had a false start when our ML models started showing parameters like Bed Temperature and Outlet Temperature as highly significant in the SHAP plots. Before this could waste significant time, the OE experts' experience proved very helpful: a Master Black Belt was quick to point out that these are output variables, not input variables. Segregating KPIVs (key process input variables) from KPOVs (key process output variables) remains one of the first steps in our modelling process.

Case Study 2: An easily overlooked inlet RH was impacting dissolution.

This case study is for another such project where Operations Excellence highlighted the product for its high cost of poor quality.

Problem Statement: The process capability and performance indices Cpk and Ppk had been tracked, and the product's Ppk was very low (<0.66). The product had repeated issues with the quality attribute "dissolution". Previous investigation reports indicated that the CAPAs taken had not been effective, and issues continued to recur. This project was given to us to identify the real root cause of the issue. We followed conventional Six Sigma methodology and applied the DMAIC steps. With the Project Charter signed, the next steps in the methodology were the CTQ analysis, Process Map, and Cause and Effect analysis. For this product the CTQs are measured at the IP (in-process) and FP (finished product) stages, and compensation is done as per the IP-stage analysis.

The Investigation

In this case the customer had an advanced data collection mechanism, a decent data engineering team, and an evolving data science team. The customer's data lake collects all the CPP and CQA data into what is called a flat file. We made a conscious decision to use the software packages already known to the customer, as this helps bring the teams onto the same page with a familiar interface and style of storytelling, which also shortens the investigation time.
A few initial reports were already available with the Operations Excellence liaison team. After examining the existing reports, we pulled in our data scientists to speed up the investigation and dig further to discover any hidden patterns by applying Machine Learning and AI models.
We took this flat file and ran exploratory analysis using visualisation and machine learning software that was also known to the customer. Our experts ran a few quick analyses that helped identify potential causes of the variation. Further data was then collected for all critical parameters, and detailed analysis followed using the time-series profiles of the CPPs (critical process parameters). The experts followed through with various ML models and validated them.
Over the course of a few iterations of sifting through the data and posing new questions to the manufacturing team, the investigation allowed us to develop a special methodology. We applied this methodology to investigate the causes of the variations and arrived at the hypothesis that inlet RH (relative humidity) at a key process step is a key CPP, and that the variation in inlet RH at this step is the root cause of the variation in dissolution. This hypothesis was validated against many other batches of data.
As a final step and recommendation, a new CAPA was proposed and implemented.
Result: After the CAPA implementation, a marked difference in the mean value of the CTQ was observed, and the value was in the desirable range. The project was appreciated by senior production managers and the technical teams.

Conclusion

Complementing Six Sigma investigations with automation and applied Machine Learning and AI can significantly speed up the phases of DMAIC.
Descriptive, predictive, and prescriptive analytics, accompanied by appropriate visualisation and dashboarding, can augment process quality and results through quicker identification of variations and outliers, thus prompting the necessary actions from stakeholders.