Compactness and Separability in MR-Metric Spaces with Applications to Deep Learning

Authors

  • Abed Al-Rahman Malkawi, Department of Mathematics, Faculty of Arts and Science, Amman Arab University, Amman 11953, Jordan

DOI:

https://doi.org/10.29020/nybg.ejpam.v18i3.6592

Keywords:

MR-metric spaces, compactness, separability, paracompactness, deep learning, neural networks

Abstract

This paper establishes fundamental topological properties of MR-metric spaces, a generalization of conventional metric spaces in which the triangle inequality is replaced by an R-scaled tetrahedral inequality. We prove several key results, including: (1) a complete characterization of compactness via three equivalent conditions, (2) an adaptation of the Lebesgue number lemma to this setting, (3) the equivalence of separability and the Lindelöf property, and (4) that every MR-metric space is paracompact. The theoretical framework is applied to four domains: (i) global optimization in Euclidean spaces, (ii) neural network weight space analysis, (iii) fractal geometry, and (iv) quantum state spaces. The proofs leverage the distinctive properties of MR-metrics, particularly the R-scaling factor in the tetrahedral inequality, to extend classical metric space results to this broader setting. The applications demonstrate the utility of these theoretical advances in computational and machine learning contexts.
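As a rough illustration of the weight-space application, the sketch below is a toy numerical check of an R-scaled tetrahedral inequality of the assumed form D(x, y, z) ≤ R·[D(x, y, a) + D(y, z, a) + D(x, z, a)]. The paper's exact MR-metric axioms are not reproduced on this page, so both the inequality form and the candidate three-point map D (the perimeter of the triangle spanned by three weight vectors) are illustrative assumptions, not the authors' definitions; for this particular candidate the inequality holds with R = 1.

```python
import random

def euclid(u, v):
    """Euclidean distance between two weight vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def D(x, y, z):
    """Candidate 3-point distance (an assumption, not the paper's MR-metric):
    the perimeter of the triangle with vertices x, y, z."""
    return euclid(x, y) + euclid(y, z) + euclid(x, z)

def tetrahedral_holds(x, y, z, a, R=1.0):
    """Check the assumed R-scaled tetrahedral inequality at one 4-tuple,
    with a small tolerance for floating-point error."""
    return D(x, y, z) <= R * (D(x, y, a) + D(y, z, a) + D(x, z, a)) + 1e-9

# Sample random points in a toy 8-dimensional "weight space" and test
# the inequality on 200 independent 4-tuples (x, y, z, a).
random.seed(0)
dim = 8
points = [[random.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(800)]
quadruples = [points[i:i + 4] for i in range(0, 800, 4)]
print(all(tetrahedral_holds(*q) for q in quadruples))  # True for this candidate
```

With R = 1 the check is trivially satisfied here, since each side length on the left also appears on the right; the interest of the MR-metric setting lies in spaces where a scaling factor R > 1 is genuinely needed.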

Published

2025-08-01

Section

Topology

How to Cite

Compactness and Separability in MR-Metric Spaces with Applications to Deep Learning. (2025). European Journal of Pure and Applied Mathematics, 18(3), 6592. https://doi.org/10.29020/nybg.ejpam.v18i3.6592