1. On Pitfalls of RemOve-And-Retrain: Data Processing Inequality Perspective (arXiv)
Author : Junhwa Song, Keumgang Cha, Junghoon Seo
Abstract : Approaches for appraising feature importance approximations, alternatively referred to as attribution methods, have been established across an extensive array of contexts. The development of resilient techniques for performance benchmarking constitutes a critical concern in the sphere of explainable deep learning. This study scrutinizes the dependability of the RemOve-And-Retrain (ROAR) procedure, which is prevalently employed for gauging the performance of feature importance estimates. The insights gleaned from our theoretical foundation and empirical investigations reveal that attributions containing lesser information about the decision function may yield superior results in ROAR benchmarks, contradicting the original intent of ROAR. This occurrence is similarly observed in the recently introduced variant RemOve-And-Debias (ROAD), and we posit a persistent pattern of blurriness bias in ROAR attribution metrics. Our findings serve as a warning against indiscriminate use of ROAR metrics. The code is available as open source.
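To make the ROAR procedure the abstract critiques concrete, here is a minimal toy sketch of its loop: train a model, rank features by an attribution score, replace the top-ranked features with their per-feature mean, and retrain from scratch to measure the accuracy drop. Everything below is illustrative and not from the paper: the data is synthetic, the model is a plain gradient-descent logistic regression, and |weight| is used as a stand-in attribution method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification: only the first 5 of 20 features carry signal.
n, d, d_inf = 2000, 20, 5
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:d_inf] = 2.0
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)

def train_logreg(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression; returns the weight vector."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

# Step 1: train once; use |weight| as a simple stand-in attribution score.
w = train_logreg(X, y)
scores = np.abs(w)

# Step 2 (ROAR): ablate the top-k attributed features by replacing them with
# their per-feature mean, then RETRAIN from scratch on the ablated data.
def roar_accuracy(scores, k):
    top = np.argsort(scores)[-k:]
    X_abl = X.copy()
    X_abl[:, top] = X[:, top].mean(axis=0)
    w_new = train_logreg(X_abl, y)
    return accuracy(w_new, X_abl, y)

acc_full = accuracy(w, X, y)
acc_good = roar_accuracy(scores, k=5)                 # informative attribution
acc_rand = roar_accuracy(rng.permutation(scores), 5)  # random-ranking baseline
```

Under ROAR's intended reading, a more faithful attribution should produce a larger accuracy drop after retraining than a random ranking; the paper's point is that this reading can be confounded, e.g. by blurriness bias in the attributions.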
2. Integral formula for quantum relative entropy implies data processing inequality (arXiv)
Author : Péter E. Frenkel
Abstract : Integral representations of quantum relative entropy, and of the directional second and higher order derivatives of von Neumann entropy, are established, and used to give simple proofs of fundamental, known data processing inequalities: the Holevo bound on the quantity of information transmitted by a quantum communication channel, and, much more generally, the monotonicity of quantum relative entropy under trace-preserving positive linear maps — complete positivity of the map need not be assumed. The latter result was first proved by Müller-Hermes and Reeb, based on work of Beigi. For a simple application of such monotonicities, we consider any 'divergence' that is non-increasing under quantum measurements, such as the concavity of von Neumann entropy, or various known quantum divergences. An elegant argument due to Hiai, Ohya, and Tsukada is used to show that the infimum of such a 'divergence' on pairs of quantum states with prescribed trace distance is the same as the corresponding infimum on pairs of binary classical states. Applications of the new integral formulae to the general probabilistic model of information theory, and a related integral formula for the classical Rényi divergence, are also discussed.
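The central object here, the quantum relative entropy D(ρ‖σ) = Tr ρ(log ρ − log σ), and its monotonicity under a channel can be checked numerically. The sketch below is illustrative only (it is not the paper's integral-formula proof): it draws two random density matrices and verifies the data processing inequality under a fully dephasing channel, i.e. a computational-basis measurement, which is trace-preserving and positive.

```python
import numpy as np

def rand_density(rng, dim):
    """Random density matrix: A A-dagger, normalized to unit trace."""
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = a @ a.conj().T
    return rho / np.trace(rho).real

def logm_psd(m, eps=1e-12):
    """Matrix logarithm of a positive semidefinite Hermitian matrix via eigh."""
    vals, vecs = np.linalg.eigh(m)
    return (vecs * np.log(np.maximum(vals, eps))) @ vecs.conj().T

def rel_entropy(rho, sigma):
    """Quantum relative entropy D(rho || sigma) = Tr rho (log rho - log sigma)."""
    return np.trace(rho @ (logm_psd(rho) - logm_psd(sigma))).real

def dephase(rho):
    """Fully dephasing channel (computational-basis measurement): keep only
    the diagonal. This map is trace-preserving and positive."""
    return np.diag(np.diag(rho))

rng = np.random.default_rng(1)
rho, sigma = rand_density(rng, 4), rand_density(rng, 4)
d_before = rel_entropy(rho, sigma)
d_after = rel_entropy(dephase(rho), dephase(sigma))
# Data processing inequality: D(N(rho) || N(sigma)) <= D(rho || sigma)
```

After dephasing, the relative entropy reduces to the classical KL divergence between the two diagonals, so the check also illustrates the abstract's comparison between quantum states and classical (here, diagonal) ones.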