Quantum signal processing involves the injection of abstract quantum mechanical frameworks into classical signal processing problems. In this work we apply this idea to optimal likelihood ratio tests in the context of the location verification problem. We first draw parallels between quantum mechanical measurements and generalized likelihood ratio tests. As we show, these quite different measurement frameworks are mathematically similar, since both can be described in the language of projections onto subspaces, with the projections removing the nuisance parameters of the underlying system in the latter case. We then show how the imposition of an 'artificial' mathematical constraint, borrowed from the constraint imposed on quantum measurements by the uncertainty principle, is likely to assist machine-learning solutions of the location verification problem, such solutions being more useful in real-world deployments.
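As a toy illustration of the projection idea referred to above (this sketch is not taken from the paper; the linear model, the matrix names `H` and `s`, and all dimensions are illustrative assumptions), a generalized likelihood ratio test with linear nuisance parameters can remove those parameters by projecting the observation onto the orthogonal complement of the nuisance subspace:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 8, 3                       # observation length, number of nuisance parameters
H = rng.standard_normal((n, p))   # columns span the (assumed) nuisance subspace
s = rng.standard_normal(n)        # assumed signal direction under the alternative

# Projector onto the orthogonal complement of the nuisance subspace:
# P = I - H (H^T H)^{-1} H^T, so P @ H = 0 and nuisance terms vanish.
P = np.eye(n) - H @ np.linalg.solve(H.T @ H, H.T)

# Synthetic observation: signal + nuisance contribution + noise.
y = 2.0 * s + H @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# GLRT-style statistic: energy of the projected observation along the
# projected signal direction; the nuisance component has been removed by P.
Ps = P @ s
T = (Ps @ (P @ y)) ** 2 / (Ps @ Ps)
```

Because `P @ H = 0`, the statistic `T` is unchanged by any value of the nuisance parameters, which is the sense in which the projection "removes" them.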