With the advent of the Internet of Vehicles (IoV), drivers are provided with diverse time-sensitive vehicular services that usually require large-scale computation. Since civilian vehicles are generally limited in computational resources, their service requests are offloaded to cloud data centers and edge computing devices (ECDs), which possess ample computational resources, to enhance the quality of service (QoS). However, ECDs are often overloaded by excessive service requests. Moreover, because network conditions and service compositions are complicated and dynamic, centralized control of ECDs is difficult to achieve. To tackle these challenges, a dynamic task offloading method with the minority game (MG) in cloud-edge computing, named DOM, is proposed in this paper. Technically, the MG is an effective tool with a distributed mechanism that minimizes the dependency on centralized control in resource allocation. Within the MG, reinforcement learning (RL) is applied to optimize the distributed decision-making of participants. Finally, the effectiveness and adaptability of DOM are evaluated on a real-world dataset of IoV services.
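To illustrate the core idea of combining a minority game with reinforcement learning, the following is a minimal sketch (not the paper's actual DOM algorithm; all function names, parameters, and the reward scheme are illustrative assumptions). Each vehicle independently chooses between offloading to an edge device or to the cloud, and the side chosen by the minority experiences less congestion and is rewarded; agents update simple action-value estimates from that feedback, with no centralized controller involved.

```python
import random

def minority_game(n_agents=101, n_rounds=200, epsilon=0.1, lr=0.5, seed=0):
    """Illustrative minority game with epsilon-greedy value learning.

    Each agent picks action 0 (offload to EDGE) or 1 (offload to CLOUD).
    Agents on the minority side receive reward +1, the majority -1,
    mimicking lower congestion on the less-loaded tier.
    """
    rng = random.Random(seed)
    # Per-agent value estimate for each of the two actions.
    values = [[0.0, 0.0] for _ in range(n_agents)]
    edge_attendance = []  # number of agents choosing EDGE each round
    for _ in range(n_rounds):
        choices = []
        for v in values:
            if rng.random() < epsilon:          # explore
                choices.append(rng.randrange(2))
            else:                               # exploit current estimate
                choices.append(0 if v[0] >= v[1] else 1)
        edge_count = choices.count(0)
        minority = 0 if edge_count < n_agents - edge_count else 1
        # Distributed update: each agent learns only from its own reward.
        for v, a in zip(values, choices):
            reward = 1.0 if a == minority else -1.0
            v[a] += lr * (reward - v[a])
        edge_attendance.append(edge_count)
    return edge_attendance

attendance = minority_game()
```

In this toy setting, each agent adapts using only the round's outcome, so the population self-organizes toward a balanced load without any central scheduler, which is the property the MG mechanism is meant to capture.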