The problem of autoignition caused by the absorption of moisture has been postulated to occur in a number of situations of high practical importance. In this article we examine the phenomenon from a theoretical point of view. The problem is formulated as a combination of classical criticality theory and the more recently defined critical initial-value problem. We show that wetting-induced ignition (WII) is possible in certain regions of parameter space, but that a finite amount of water is required to cause ignition, however close the dry material may have been to criticality before wetting. Equally interesting is the finding that, for a given material, WII is impossible in sufficiently small samples but possible in larger ones. This reveals an important flaw in scaling test procedures.