ADVPRED-CORE VS FOCUSPRED: THE IMPACT OF ATTENTION ON MODEL PERFORMANCE AND EXPLAINABILITY

Abstract
In the era of expanding customer-centric services, accurate prediction of customer satisfaction is critical for businesses. This study explores the application of advanced machine learning and neural network models for predicting customer satisfaction in the airline industry using the Airline Passenger Satisfaction dataset. We compare traditional models such as Random Forest and XGBoost with more complex architectures, including Deep Neural Networks (DNN) and Quantum-Inspired Neural Networks (QINN), which incorporate entanglement and superposition principles, both with and without attention mechanisms. Our results show that integrating attention layers into the models enhances their performance. Additionally, a stacked ensemble model combining XGBoost, Random Forest, DNN with attention, and QINN with attention achieved strong results. To ensure interpretability and transparency, we employed a range of Explainable AI (XAI) techniques, including LIME, Permutation Importance, Accumulated Local Effects (ALE), and Integrated Gradients. These methods enabled us to examine the models' decision-making processes and assess the influence of attention on feature importance. Overall, our work revealed that the attention mechanism not only improved accuracy but also highlighted key features more effectively, providing deeper insight into the factors driving customer satisfaction. The combination of high-performing models and explainability tools offers a robust solution for both accurate prediction and transparent decision-making in customer satisfaction analysis. The code for our work can be found at: github

Keywords - Neural Network Models, Attention Mechanisms, Explainable AI (XAI), Quantum-Inspired Neural Networks (QINN), Stacked Ensemble Models
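The stacked ensemble described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's `StackingClassifier` on synthetic data, with `GradientBoostingClassifier` standing in for XGBoost and a small `MLPClassifier` standing in for the attention-based DNN/QINN base learners (the attention layers themselves are not shown here).

```python
# Hypothetical sketch of a stacked ensemble in the spirit of the paper.
# Stand-ins (assumptions, not the authors' models): GradientBoostingClassifier
# for XGBoost, MLPClassifier for the attention-based DNN/QINN.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier,
                              GradientBoostingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic binary-classification data in place of the airline dataset.
X, y = make_classification(n_samples=600, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

base_learners = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("gb", GradientBoostingClassifier(random_state=0)),  # XGBoost stand-in
    ("nn", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                         random_state=0)),               # DNN stand-in
]

# The meta-learner combines out-of-fold predictions of the base models.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression())
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
print(f"held-out accuracy: {acc:.3f}")
```

The same fitted ensemble can then be passed to model-agnostic XAI tools such as LIME or permutation importance, since it exposes the standard `predict`/`predict_proba` interface.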