This paper describes a motion capture (mocap) data-driven digital museum roaming system with realistic walking. We focus on three main problems: avatar animation, path planning, and collision detection among avatars. Using only a few walking clips from mocap data, we synthesize walking motions of any direction and any length with natural transitions. Avatars roam the digital museum along its Voronoi skeleton path, a shortest path, or an offset path, and the Voronoi diagram is also used for collision detection among avatars. Different users can set up their own avatars and roam along their own paths. We modify the motion graph method by classifying the original mocap data and building a motion graph for each class, which greatly improves search efficiency.
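The classified motion graph idea can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the clip records, the `straight`/`turn` categories, and the scalar "poses" (real mocap frames are joint-angle vectors compared with a pose-distance metric) are all hypothetical, not the paper's actual data or implementation. The sketch only shows why restricting transition candidates to clips of the same class shrinks the search.

```python
from collections import defaultdict

# Hypothetical clip records: (name, category, start_pose, end_pose).
# Poses are reduced to single scalars for illustration only.
CLIPS = [
    ("walk_fwd_a", "straight", 0.0, 0.1),
    ("walk_fwd_b", "straight", 0.1, 0.0),
    ("turn_left_a", "turn", 0.1, 0.5),
    ("turn_left_b", "turn", 0.5, 0.1),
]

def build_classified_graph(clips, tol=0.05):
    """Group clips by category, then add an edge clip_i -> clip_j when
    the end pose of clip_i is within `tol` of the start pose of clip_j.
    Comparing only within one category at a time cuts the number of
    pairwise pose comparisons, which is the kind of search-efficiency
    gain the classification step aims at."""
    by_cat = defaultdict(list)
    for clip in clips:
        by_cat[clip[1]].append(clip)
    graph = defaultdict(list)
    for cat_clips in by_cat.values():
        for name_i, _, _, end_i in cat_clips:
            for name_j, _, start_j, _ in cat_clips:
                if name_i != name_j and abs(end_i - start_j) <= tol:
                    graph[name_i].append(name_j)
    return dict(graph)

graph = build_classified_graph(CLIPS)
```

With the toy data above, transitions are found only between compatible clips of the same class (e.g. `walk_fwd_a -> walk_fwd_b`); cross-class transitions would be handled by designated transition clips or a coarser top-level graph, which this sketch omits.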