Master's thesis, 2023
151 pages, grade: 1.0
This thesis explores the impact of Explainable Artificial Intelligence (XAI) on User Experience (UX), specifically in the context of computer vision tasks. It aims to prototype and evaluate a UX-optimized XAI interface for brain tumor detection using a Convolutional Neural Network (CNN) and the Local Interpretable Model-agnostic Explanations (LIME) method. This research investigates the influence of visual explanations on user trust, acceptance, and overall UX, considering both pragmatic and hedonic qualities.
The thesis centers on the interplay between Explainable Artificial Intelligence (XAI), User Experience (UX), and computer vision, in particular the design and evaluation of UX-optimized XAI interfaces for brain tumor detection. It draws on user-centered design, Convolutional Neural Networks (CNNs), Local Interpretable Model-agnostic Explanations (LIME), and the User Experience Questionnaire (UEQ) to examine how visual explanations affect usability, trust, and user satisfaction.
XAI refers to artificial intelligence systems designed to provide human-understandable explanations for their decisions and predictions, moving away from "black box" models.
XAI can significantly improve UX by increasing transparency, which in turn enhances user trust, perceived usefulness, and the feeling of control over the system.
The BTA is a custom XAI system developed for this thesis that uses a Convolutional Neural Network to classify brain x-rays and provides visual explanations using the LIME method.
LIME stands for Local Interpretable Model-agnostic Explanations. It is a technique used to explain the predictions of any machine learning classifier by approximating it locally with an interpretable model.
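To make the idea concrete, here is a minimal, self-contained sketch of LIME's core mechanism: perturb a simplified (binary) representation of the instance, weight each perturbation by its proximity to the original, and fit a weighted linear surrogate whose coefficients serve as the explanation. The `black_box` classifier and all parameter values are hypothetical stand-ins, not the thesis's actual CNN or the `lime` library's API.

```python
import math
import random

def black_box(z):
    # Hypothetical stand-in classifier: the score depends almost
    # entirely on feature 0 (e.g. one salient image region).
    return 1.0 if z[0] == 1 else 0.1

def lime_explain(predict, n_features, n_samples=500, kernel_width=0.75, seed=0):
    """Minimal LIME sketch: sample binary perturbations, weight them by
    proximity to the original instance (all features "on"), and solve a
    weighted least-squares surrogate. Returns per-feature importances."""
    rng = random.Random(seed)
    X, y, w = [], [], []
    for _ in range(n_samples):
        z = [rng.randint(0, 1) for _ in range(n_features)]
        dist = 1 - sum(z) / n_features            # fraction of features turned off
        weight = math.exp(-(dist ** 2) / kernel_width ** 2)
        X.append([1.0] + [float(v) for v in z])   # intercept column + features
        y.append(predict(z))
        w.append(weight)
    # Normal equations (X^T W X) beta = X^T W y, solved by naive
    # Gaussian elimination -- adequate for a handful of features.
    d = n_features + 1
    A = [[sum(w[k] * X[k][i] * X[k][j] for k in range(n_samples)) for j in range(d)]
         for i in range(d)]
    b = [sum(w[k] * X[k][i] * y[k] for k in range(n_samples)) for i in range(d)]
    for i in range(d):
        for j in range(i + 1, d):
            f = A[j][i] / A[i][i]
            for c in range(d):
                A[j][c] -= f * A[i][c]
            b[j] -= f * b[i]
    beta = [0.0] * d
    for i in range(d - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, d))) / A[i][i]
    return beta[1:]  # drop the intercept; coefficients = feature importances

weights = lime_explain(black_box, n_features=3)
# The surrogate assigns nearly all weight to feature 0, mirroring how
# LIME highlights the image regions that drive a CNN's prediction.
```

In image applications such as the BTA, the "features" are superpixels of the scan rather than scalar inputs, but the surrogate-fitting step is the same.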
Quality can be quantified using tools like the User Experience Questionnaire (UEQ), specifically adapted to assess dimensions like trustworthiness, attractiveness, and controllability in XAI systems.
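As a small illustration of how UEQ-style scores are derived: items are answered on a 7-point scale and transformed to the range -3 to +3, and a scale score is the mean of its items. The sketch below ignores item polarity (the official UEQ analysis sheet reverses negatively worded items before averaging), and the sample answers are invented.

```python
def ueq_scale_score(item_answers):
    """Mean UEQ scale score: 7-point answers (1..7) shifted to -3..+3.
    Assumes answers are already polarity-corrected."""
    return sum(a - 4 for a in item_answers) / len(item_answers)

# Hypothetical answers for the four items of one scale, e.g. "trust":
trust_score = ueq_scale_score([6, 5, 7, 6])  # positive score = positive rating
```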
The study conducted in this thesis found that visual explanations have a statistically significant positive effect on user trust and overall system acceptance.

