This paper addresses the problem of identifying individuals by their gait, using motion descriptors computed on densely sampled short-term trajectories instead of the binary silhouettes traditionally used in gait recognition. A person detector is used to define a pyramidal spatial configuration of regions around the walking subject, within which rich local gait motion is captured. These local motion features are aggregated with Fisher Vector encoding into a high-level gait descriptor, termed Pyramidal Fisher Motion. The method is validated on the CASIA, TUM GAID, CMU MoBo, and AVA Multiview Gait datasets, achieving state-of-the-art results in identifying individuals under varying viewpoints, clothing, walking speeds, and walking paths.
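To make the aggregation step concrete, the sketch below shows how a set of local motion descriptors can be encoded as a Fisher Vector with respect to a diagonal-covariance Gaussian mixture model (gradients with respect to means and variances, followed by the standard power- and L2-normalization). It is a minimal illustration of Fisher Vector encoding in general, not the paper's implementation: the descriptor dimensionality, the number of mixture components, and the random toy descriptors are all illustrative assumptions.

```python
# Minimal Fisher Vector encoding sketch (illustrative, not the paper's code).
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(X, gmm):
    """Encode local descriptors X (N x D) as a Fisher Vector w.r.t. a
    diagonal-covariance GMM, using gradients w.r.t. means and variances."""
    N, _ = X.shape
    q = gmm.predict_proba(X)            # soft assignments, N x K
    mu = gmm.means_                     # K x D
    sigma = np.sqrt(gmm.covariances_)   # K x D (diagonal std. deviations)
    w = gmm.weights_                    # K
    parts = []
    for k in range(gmm.n_components):
        diff = (X - mu[k]) / sigma[k]   # normalized deviations, N x D
        # Gradient w.r.t. the k-th mean and variance (Perronnin-style FV)
        g_mu = (q[:, k, None] * diff).sum(0) / (N * np.sqrt(w[k]))
        g_sig = (q[:, k, None] * (diff**2 - 1)).sum(0) / (N * np.sqrt(2 * w[k]))
        parts.extend([g_mu, g_sig])
    fv = np.concatenate(parts)
    # Power-normalization then L2-normalization, standard for Fisher Vectors
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)

# Toy usage: 500 local motion descriptors of dimension 32, 8-component GMM.
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 32))
gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(descriptors)
print(fisher_vector(descriptors, gmm).shape)  # (2 * 8 * 32,) = (512,)
```

In a pyramidal scheme such as the one named in the abstract, one such vector would be computed per spatial cell and the results concatenated into the final descriptor.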