This paper investigates a three-layer IoT-fog-cloud computing system to determine the optimum workload and power allocation at each layer. The objective is to minimize the maximum per-layer latency (comprising both data-processing and transmission delays) subject to individual power constraints. The resulting resource-allocation problem is a mixed-integer optimization problem with exponential complexity. Hence, the problem is first relaxed under appropriate modeling assumptions, and an efficient iterative method is then proposed to solve the relaxed but still non-convex problem. The proposed algorithm is based on an alternating optimization approach and yields close-to-optimum results with significantly reduced complexity. Numerical results illustrate the performance of the proposed algorithm relative to exhaustive search, and the latency gain of three-layer distributed IoT-fog-cloud computing is quantified with respect to fog-only and cloud-only computing systems.
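The alternating min-max idea can be sketched on a deliberately simplified toy model. Purely for illustration (none of the constants, the shared power budget, or the latency expressions below come from the paper, which treats individual per-layer power constraints and a richer system model), assume three layers sharing one power budget, per-layer latency equal to its workload times a per-bit processing-plus-transmission delay, and Shannon-type transmission rates. The alternation then repeats two steps: a workload split that equalizes per-layer latencies for fixed powers, and a min-max power reallocation via bisection on the target latency.

```python
import math

# Toy three-layer model (IoT, fog, cloud). All constants are illustrative
# assumptions, not values or the model from the paper.
F = [1.0, 5.0, 20.0]   # hypothetical compute speeds per layer
G = [4.0, 2.0, 1.0]    # hypothetical channel gains (SNR per unit power)
B = 1.0                # normalized bandwidth
W_TOTAL = 10.0         # total workload to split across the layers
P_TOTAL = 6.0          # shared power budget (a simplifying assumption)

def per_bit_delay(i, p):
    """Processing + transmission delay per unit workload at layer i."""
    rate = B * math.log2(1.0 + p * G[i])
    return 1.0 / F[i] + 1.0 / rate

def workload_step(p):
    """Given powers, equalize per-layer latencies: w_i proportional to
    the inverse per-bit delay, so w_i * delay_i is the same for all i."""
    inv = [1.0 / per_bit_delay(i, p[i]) for i in range(3)]
    s = sum(inv)
    return [W_TOTAL * v / s for v in inv]

def min_power_for_latency(i, w, T):
    """Smallest power letting layer i finish w units within latency T."""
    slack = T / w - 1.0 / F[i]   # transmission time budget per unit
    if slack <= 0:
        return float('inf')      # even infinite power cannot meet T
    exponent = 1.0 / (B * slack)
    if exponent > 700:           # would overflow; effectively infeasible
        return float('inf')
    return (2.0 ** exponent - 1.0) / G[i]

def power_step(w):
    """Bisection on the target latency T: feasible iff the total power
    needed to hit T at every layer fits the shared budget."""
    lo, hi = 1e-9, 1e6
    for _ in range(100):
        T = 0.5 * (lo + hi)
        need = [min_power_for_latency(i, w[i], T) for i in range(3)]
        if sum(need) <= P_TOTAL:
            hi = T               # T achievable: tighten from above
        else:
            lo = T               # T too aggressive: relax
    return [min_power_for_latency(i, w[i], hi) for i in range(3)]

def alternating_optimization(iters=30):
    p = [P_TOTAL / 3.0] * 3              # start from a uniform power split
    for _ in range(iters):
        w = workload_step(p)             # latency-equalizing workload split
        p = power_step(w)                # min-max power reallocation
    latency = max(w[i] * per_bit_delay(i, p[i]) for i in range(3))
    return w, p, latency
```

At the fixed point of this toy model, every layer experiences the same latency, which is the usual optimality signature of a min-max allocation: if one layer finished earlier, shifting workload or power toward the bottleneck layer would reduce the maximum.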