In this paper, we study the problem of positioning pedestrians relative to their walking direction. Existing approaches are mainly based on trajectory information or device proximity detection, and they rely heavily on infrastructure or specialized device support. Importantly, most existing work does not provide relative position information with respect to people's walking direction. To address these issues, we propose a direction-aware, audio-based solution that uses only everyday wearable devices. Building on the observation that a pedestrian's arms swing back and forth during walking, we develop the wrist-body model, which formally characterizes the distance change between a user's wrist and a walking companion's body as they walk together. Based on this model, we design our system by attaching audio sources to one user's wrists and an audio receiver to the other user's body. We develop key indicators that characterize the Doppler shift of the received audio signal induced by arm-swing motions, as well as differences in signal strength. We further propose methods such as cycle segmentation and aggregation to address several real-world challenges. Extensive evaluation on real-world data shows that the prototype system achieves 85.9% positioning accuracy, demonstrating its effectiveness.
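To make the Doppler-shift indicator concrete, the sketch below estimates the frequency shift a receiver would observe from a swinging wrist-mounted source. All parameters (a 20 kHz carrier, a 1 m/s peak radial wrist velocity, a sinusoidal swing profile) are illustrative assumptions, not values taken from this paper; only the standard moving-source Doppler formula is used.

```python
import math

def doppler_observed(f_src, v_source, c=343.0):
    """Observed frequency for a source moving at v_source (m/s)
    relative to a stationary receiver (positive = approaching),
    using the classical moving-source Doppler formula."""
    return f_src * c / (c - v_source)

# Hypothetical parameters (assumptions, not from the paper):
F_CARRIER = 20_000.0  # 20 kHz carrier, near-inaudible to most adults
V_PEAK = 1.0          # peak radial wrist velocity during arm swing (m/s)

# Model one 1-second arm-swing cycle as a sinusoidal radial velocity
# and record the resulting Doppler shift at each sample.
shifts = []
for i in range(100):
    t = i / 100.0
    v = V_PEAK * math.sin(2 * math.pi * t)
    shifts.append(doppler_observed(F_CARRIER, v) - F_CARRIER)

max_shift = max(shifts)  # peak shift, roughly F_CARRIER * V_PEAK / (c - V_PEAK)
```

Under these assumptions the peak shift is on the order of tens of hertz, large enough to be resolved with a short-time Fourier analysis of the received signal, which is what makes arm-swing motion a usable positioning cue.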