Trustable AI-Assisted Analysis: Dataset Management and Reliable AI Flaw Detection
Tracks
NDT Methods
Wednesday, October 23, 2024
4:30 PM - 5:00 PM
207/208 - Technical Session
Details
The fast pace of progress in artificial intelligence (AI) should not obscure the fact that AI is an empirical method: success is not guaranteed, and failure can happen without warning. Successfully deploying AI in a high-stakes environment such as NDT is therefore challenging and requires a new workflow that bridges the gap between AI expertise and NDT expertise.
The presentation will detail such a workflow, whose goal is to develop and deploy trustworthy AI for assisted analysis. The workflow has been implemented in a software tool. The primary use case is phased array flaw detection in composite materials, but the tool supports other use cases.
Since AI is driven by data, dataset management is a crucial component of any AI development tool. A typical dataset for AI can contain hundreds of thousands of images (e.g., D-scans) and their indications (e.g., flaw position, size, and type). We will present a complete dataset management tool covering dataset format, dataset creation and revision, dataset augmentation, and dataset quality assessment. The goal is to guide users in creating high-quality, trustable, and traceable datasets for training AI models.
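As a rough illustration only, one entry in such a dataset might pair a D-scan image with its labeled indications, assuming a Python representation; the field names (scan_id, dscan_path, indications) and units below are illustrative assumptions, not the actual format of the tool described in this talk.

    # Illustrative dataset entry: a D-scan image plus its labeled indications.
    # Field names and units are assumptions for the sake of the example.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Indication:
        flaw_type: str    # e.g. "void" or "foreign_object"
        x_mm: float       # position along the scan axis, in millimeters
        y_mm: float       # position along the index axis, in millimeters
        size_mm: float    # estimated flaw size, in millimeters

    @dataclass
    class DatasetEntry:
        scan_id: str      # traceability key back to the raw acquisition
        dscan_path: str   # path to the D-scan image file
        revision: int     # dataset revision this entry belongs to
        indications: List[Indication] = field(default_factory=list)  # labeled flaws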
AI models are then trained on these datasets. The main AI model detects flaws such as voids and foreign objects and accurately measures their size. However, a robust assisted-analysis AI must also detect any event that could prevent a correct analysis because the data is corrupt or incomplete, for instance a mechanical problem during the inspection, a loss of coupling, or a broken probe. The flaw detection model may not flag such corrupt data, so a secondary anomaly detection model is trained to detect abnormal data, which must then be reviewed by inspectors. The goal for these two AI models is to achieve 100% probability of detection. After training, model validation is facilitated by an interactive visualization tool.
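As a hedged sketch of how the two models could be combined, the snippet below routes each scan through an anomaly check before flaw detection; the model objects, their predict methods, and the threshold value are placeholder assumptions, not the actual implementation presented.

    # Hypothetical two-stage analysis: screen for abnormal data first, then detect flaws.
    # `anomaly_model` and `flaw_model` are placeholders with assumed predict() methods.
    def analyze_scan(dscan, anomaly_model, flaw_model, anomaly_threshold=0.5):
        score = anomaly_model.predict(dscan)   # assumed abnormality score in [0, 1]
        if score > anomaly_threshold:
            # Data may be corrupt or incomplete (loss of coupling, broken probe, ...):
            # hand the scan to a human inspector instead of trusting the flaw model.
            return {"status": "review_required", "anomaly_score": score}
        flaws = flaw_model.predict(dscan)      # assumed list of detected indications
        return {"status": "analyzed", "flaws": flaws}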
Finally, the AI workflow includes a qualification software tool in which the AI models are qualified for use in production. Qualification is a dedicated process during which inspection personnel build trust in the AI solution. Together, these tools form a complete solution that allows trustable AI to be integrated into the traditional NDT workflow, from development to final deployment in production.
Speaker
Patrick Huot
Evident