Cognition in Testing Cognitive Software

Cognition in Testing Cognitive Software
Sindu Vijayan – Principal QE
Vidhya Nandhini Paramanathan – Sr Principal QE
Manhattan Associates

Abstract
The technology advancements of the last decade have made storage and connectivity so affordable that the digital world is transforming at an unimaginable speed. No field is untouched by this digital revolution, and software testing is no exception. The enormous volume of data demands cognitive or machine learning logic to derive meaningful information from it. Programs that use cognitive logic, better termed "non-testable programs", were once a niche; they are now becoming an integral part of any business solution. This article covers best practices, techniques and challenges, taking examples from supply chain solutions. While traditional testing techniques can be applied to such applications wherever possible, assessing the correctness of the solution often remains a challenge. In discussing "testing the non-testable programs", the article touches upon the following areas at a broad level: effectiveness of the algorithm/logic, correctness of implementation, end-to-end solution accuracy, and quality benchmark. Cognitive solutions are usually about optimization or better insights over traditional models; there are no fixed boundaries or pass/fail criteria. Taking forecast projections as an example, we use what we call the "Time Travel Technique", where validations are conducted against the actuals by moving back in time. Other techniques, such as cross-validation against key performance indicators, can also help test accuracy. The closer you are to reality, the better the predictions. There is no one-size-fits-all solution for addressing non-testable programs. Every such solution needs a unique test plan and methods to ensure that the primary objective of the quality function is met: the solution provided meets the customer needs.

The World Right Now
The technology advancements of the last decade have made storage and connectivity so affordable that the digital world is transforming at an unimaginable speed.
- 44 trillion gigabytes of data by 2020
- Data doubles every year
- The potential information this data holds is also unimaginable

The Potential Impacts!!!
Technology is no longer the differentiator! Business intelligence transformation demands:
- The right information, which steers the business in the right direction.
- Detection of trends early enough for better response time; emerging trends are captured over a period of time.
- The rise of cognitive computing: AI, once a luxury, is now a necessity!
What is an integral part of the software solution is an integral part of testing too!

Cognitive Solutions – What Do They Do?
- Add optimization to the otherwise traditional approach
- Can be used to provide better insights
- Are statistical in nature

Cognitive Solutions in Real Life
Cognitive solutions make meaningful information out of data that seems unrelated. Examples:
- Sentiment analysis/trend analysis on social media data for future planning – marketing strategy
- Real-time weather or traffic analysis – predicting customer inflow or supply of goods

Major Challenges in Testing the Enormous Data
- Identifying the right data for validation
- Validating the effectiveness of the solution

What's the Big Deal About Data?
"The solution provided is only as good as the data."
- The data set chosen determines the effectiveness and accuracy of the solution.
- In cases where the results of one cognitive solution feed into another cognitive solution, the effects of poor data can multiply.
- Most data science models assume that the features (predictors) are accurately measured.
So should you generate your own "enormous" data for testing?
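As a minimal illustration of this point, a data set can be screened for obviously poor predictors before it is fed into (or between) cognitive solutions. The check below is a hedged sketch; the thresholds and the pandas-based shape of the data are assumptions for illustration, not part of the original material.

```python
import pandas as pd

def check_feature_quality(df: pd.DataFrame, max_missing_ratio: float = 0.02) -> list:
    """Flag features that are too incomplete or carry no signal (illustrative thresholds)."""
    issues = []
    for col in df.columns:
        missing = df[col].isna().mean()
        if missing > max_missing_ratio:
            issues.append(f"{col}: {missing:.1%} values missing")
        if pd.api.types.is_numeric_dtype(df[col]) and df[col].nunique(dropna=True) <= 1:
            issues.append(f"{col}: constant value, no predictive signal")
    return issues
```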

The Right Data for Validation

Generate a standard data set
- Create the data set from snapshots of proven data sources, crafted carefully with the "actuals" captured.
- Required to test the impact of different flavors of data.
- The data is static in nature.
- e.g. for a sales forecasting system: sales data gathered from different verticals over a period of time.

Real-time data feeds – see what the customer sees
- Real-time, production-like data.
- Testing solutions that analyze trends or customer sentiment benefits from this approach.
- With real-time data, the actual results are not readily available, so assessing the effectiveness of the solution is a challenge.
- The patterns may be diluted compared to a manually crafted standard data set.

The two methods serve different purposes. The most beneficial approach is to use both techniques, in the order above, where applicable.
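A minimal sketch of the first approach, assuming a weekly sales forecasting scenario: a standard data set is crafted with a known trend and yearly seasonality so that the "actuals" are fully controlled and available for every later comparison. The column names and numbers are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def build_standard_dataset(weeks: int = 156, seed: int = 7) -> pd.DataFrame:
    """Craft weekly sales with a known trend and seasonality, so the 'actuals'
    are captured by construction and can back later validations."""
    rng = np.random.default_rng(seed)
    t = np.arange(weeks)
    trend = 200 + 1.5 * t                          # steady growth
    seasonality = 40 * np.sin(2 * np.pi * t / 52)  # yearly cycle on weekly data
    noise = rng.normal(0, 10, weeks)               # measurement noise
    actual_sales = np.maximum(trend + seasonality + noise, 0).round()
    return pd.DataFrame({"week": t, "actual_sales": actual_sales})

standard_data = build_standard_dataset()
```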

Validate the Effectiveness of the Solution – Testing the Non-Testable Programs

Non-testable programs have:
- No applicable test oracle.
- No direct way of assessing the results of most such solutions.
- A level of optimization that differs from solution to solution.

Test the non-testable: how optimized are the results with and without the cognitive solution?

The test methodologies:
- Effectiveness of the algorithms
- Correctness of the implementation
- End-to-end solution accuracy
- Quality benchmark

The level of optimization required for each solution is unique!

Effectiveness of Algorithms

Compare prototypes with other algorithms
- Utilize tools (with different algorithms) to compare results; this often helps while choosing the right model or algorithm.
- Compare the calculated results with the actuals available, e.g. precision vs. recall for classification problems (see the sketch below).

Data seeding techniques
- Artificially inject a pattern and evaluate the response of the system.
- Identifies whether the cognitive solution can capture changing trends and react to changes.
- e.g. in association rule learning, injecting a large number of records of a particular association should surface the newly fed association with a strong affinity score.
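As referenced above, a minimal sketch of the precision-vs-recall comparison for a classification problem, using scikit-learn. The synthetic data and the two candidate models are assumptions for illustration; in practice the candidates would be the prototype and the alternative algorithms under evaluation.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced classification data standing in for real labelled actuals.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Score each candidate algorithm on the same held-out data.
for name, model in [("logistic_regression", LogisticRegression(max_iter=1000)),
                    ("random_forest", RandomForestClassifier(random_state=0))]:
    pred = model.fit(X_train, y_train).predict(X_test)
    print(name,
          "precision=%.2f" % precision_score(y_test, pred),
          "recall=%.2f" % recall_score(y_test, pred))
```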

Correctness of Implementation

Cognitive solutions often use prototyping for the algorithmic logic; a well-tested prototype eases the implementation testing. Methods:
- Benchmark testing against the prototype – benchmarking against the same data is the way to go (see the sketch below).
- This is the relatively easier form of testing, as we have a standard set to compare against.
- Automation testing can be used.

Validate OTS solutions
Off-the-shelf (OTS) solutions, where the component with the algorithmic logic can be purchased, are becoming popular for cognitive solutions. Challenges:
- The algorithmic logic is usually intellectual property and not shared.
- Optimizing the cognitive logic is a challenge.
- However, the effectiveness can still be assessed with benchmarking techniques.
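A minimal sketch of benchmarking the implementation against the prototype on the same data. The callables prototype_forecast and production_forecast are hypothetical stand-ins for whatever interfaces the real prototype and product expose; the tolerance is an assumption.

```python
import numpy as np

def assert_matches_prototype(input_data, prototype_forecast, production_forecast,
                             rel_tol: float = 1e-3) -> None:
    """Run prototype and production implementation on identical input data and
    fail if the results drift beyond the tolerance."""
    expected = np.asarray(prototype_forecast(input_data), dtype=float)
    actual = np.asarray(production_forecast(input_data), dtype=float)
    # Same data in both runs, so any difference points at the implementation.
    np.testing.assert_allclose(actual, expected, rtol=rel_tol)
```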

End-to-End Solution Accuracy

The rule of thumb: compare the results with and without the cognitive solution. The extent of improvement is the gauge!

Time Travel Technique (see the sketch below):
- Collect the data, with actuals, over a period of time.
- Go back in time and make the predictions.
- Compare the predictions to the actuals.
- Cross-validation techniques can be used.

Taking the forecasting example above, evaluate KPIs such as lost sales and service-level agreements (SLAs): a well-optimized cognitive solution should reduce lost sales and thereby improve the SLAs. How close to reality is close enough?
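As referenced above, a minimal sketch of the Time Travel Technique, reusing the standard data set crafted earlier. The forecast_fn interface is a hypothetical stand-in for the solution under test, and the error metric (MAPE) is one reasonable choice among several.

```python
import numpy as np
import pandas as pd

def time_travel_accuracy(history: pd.DataFrame, forecast_fn, cutoff_week: int,
                         horizon: int = 8) -> float:
    """Move the cutoff back in time, forecast the held-back horizon, and score
    the predictions against actuals that have since materialized (MAPE)."""
    train = history[history["week"] < cutoff_week]
    future = history[(history["week"] >= cutoff_week) &
                     (history["week"] < cutoff_week + horizon)]
    actuals = future["actual_sales"].to_numpy(dtype=float)
    preds = np.asarray(forecast_fn(train, horizon), dtype=float)
    return float(np.mean(np.abs(preds - actuals) / np.maximum(actuals, 1.0)))
```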

Quality Benchmark

Yet another challenge is testing between versions where the algorithms are optimized. The quality benchmark technique (a sketch of the version-over-version scoring follows the conclusion):
- Collect a standard data set.
- Run the end-to-end solution accuracy validation technique.
- Capture the metrics.
- The accuracy of the product, version over version, is measured; this evaluates whether the optimization attempts have paid off.

Conclusion
- Testing the "non-testable programs" remains a challenge.
- There is no one-size-fits-all test solution available. Every solution needs a unique test plan that suits the problem it tries to address.
- The primary objective of any quality function: ensure the solution meets the customer needs.
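As referenced in the quality benchmark section above, a minimal sketch of the version-over-version scoring. The version names and forecast functions are hypothetical; score_fn is expected to be an end-to-end accuracy check such as time_travel_accuracy from the earlier sketch.

```python
def quality_benchmark(versions, standard_data, score_fn, cutoff_week: int = 120) -> dict:
    """Score every product version on the same standard data set with the same check,
    so gains or regressions between versions are directly comparable."""
    return {name: score_fn(standard_data, forecast_fn, cutoff_week)
            for name, forecast_fn in versions.items()}

# Hypothetical usage: a falling error version over version indicates the
# optimization attempts have paid off.
# quality_benchmark({"v2023.2": old_forecast, "v2024.1": new_forecast},
#                   standard_data, time_travel_accuracy)
```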

References & Appendix
E. J. Weyuker, "On Testing Non-Testable Programs", The Computer Journal, 25(4), 1982. https://academic.oup.com/comjnl/article/25/4/465/366384/On-Testing-Non-Testable-Programs

Author Biography

Sindu Vijayan: Principal Quality Engineer with Manhattan Associates. Sindu leads the testing activities in the Inventory suite of products, which deal with planning, forecasting and optimization of supply chain inventory. With over 14 years of overall industry experience, she has been associated with Manhattan for more than 11 years. She holds a Master's in Computer Applications from Bharathiyar University and a Post Graduate Diploma in Retail Management, and is a QAI Certified Manager in Software Quality.

Vidhya Nandhini Paramanathan: Vidhya is a Senior Principal Quality Engineer with Manhattan Associates. She leads the testing activities in the Inventory suite of products, which deal with planning, forecasting and optimization of supply chain inventory. Vidhya has more than 14 years of industry experience and has been associated with Manhattan for over 11 years.

Thank You!!!