Abstract: The emergence of the upright human bipedal gait can be traced back 4 to 2.8 million years, to the now-extinct hominin Australopithecus afarensis. Fine-grained analysis of gait using the MEMS motion sensors found on all modern smartphones not only reveals a great deal about a person's orthopedic and neuromuscular health, but also carries enough idiosyncratic clues to be harnessed as a passive biometric. While the machine learning community has made many siloed attempts to model bipedal gait sensor data, these relied on small datasets often collected in restricted academic settings. In this talk, we introduce the ImageNet moment for human gait analysis by presenting 'Project GaitNet', the largest planet-scale motion-sensor-based human bipedal gait dataset ever curated. We also present the associated state-of-the-art results in classifying humans with novel deep neural architectures, along with the success stories we have enjoyed in transfer learning to disparate domains of human kinematics analysis.
Bio: Vinay Prabhu is currently on a mission to model human kinematics using the motion sensors on smartphones, paving the way for breakthroughs in areas such as passive password-free authentication, geriatric care, neurodegenerative disease modeling, fitness, and augmented reality. He is the Chief Scientist at UnifyID Inc. and has over 30 peer-reviewed publications spanning physical-layer wireless communications, estimation theory, information theory, network science, and machine learning. His recent research projects include Deep Connectomics networks, Grassmannian initialization, the SAT (Synthetic-seed-Augment-Transfer) framework, and the Kannada-MNIST dataset. He holds a PhD from Carnegie Mellon University and an MS from the Indian Institute of Technology Madras. In his spare time, he works on his cricketing skills and generative art projects, some of which have made it to the playa at Black Rock City.