Samsung Funds Visual Lexicon Research

Department of Computer Science Assistant Professor Minh Hoai Nguyen has received a grant for his latest research into computer recognition of human actions. Professor Nguyen's work aims to build a database of human actions, collected via video, that allows a computer to remember and recognize actions using techniques currently applied to text and speech recognition. Computer recognition of human action in video has a wide array of potential applications, in fields ranging from entertainment to robotics to healthcare.

Human action recognition is currently very difficult for computers because of the complexity of video data and the enormous variety of physical actions the human body can perform. However, Professor Nguyen believes that if a lexicon of human actions is created, computers will be able to break actions down into segments that are easier to comprehend and work with.

“If a well-constructed visual lexicon exists, a complex and hard-to-recognize human action can be decomposed into constituent visual components that are simpler and recognizable,” said Professor Nguyen.
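The decomposition Professor Nguyen describes can be illustrated with a toy sketch. Everything below is hypothetical: the action names, component names, and the prefix-matching rule are stand-ins for illustration only, not the project's actual data or method.

```python
# Toy "visual lexicon": each complex action is represented as a sequence
# of simpler visual components. All entries are hypothetical examples.
VISUAL_LEXICON = {
    "drinking":      ["reach", "grasp", "raise-to-mouth", "tilt"],
    "making a call": ["reach", "grasp", "raise-to-ear"],
    "waving":        ["raise-arm", "oscillate-hand"],
}

def recognize(observed_components):
    """Match a sequence of recognized simple components to an action.

    Uses longest-common-prefix length as a crude stand-in for a real
    sequence-matching model."""
    def prefix_len(a, b):
        n = 0
        for x, y in zip(a, b):
            if x != y:
                break
            n += 1
        return n

    best_action, best_score = None, -1
    for action, components in VISUAL_LEXICON.items():
        score = prefix_len(observed_components, components)
        if score > best_score:
            best_action, best_score = action, score
    return best_action

print(recognize(["reach", "grasp", "raise-to-ear"]))  # prints "making a call"
```

The point of the sketch is the data structure, not the matcher: once complex actions are expressed as sequences of shared, simpler components, a recognizer only needs to identify those components reliably.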

The grant amounts to just under $100,000 per year. The project is scheduled to run for three years, with the ultimate goal of completing a visual lexicon of human actions. The first year will focus mostly on data collection and on developing the algorithms needed to produce a useful lexicon.

Minh Hoai Nguyen received a PhD in Robotics from Carnegie Mellon University and a Bachelor of Engineering from the University of New South Wales. Before coming to Stony Brook, he was a postdoctoral research fellow at Oxford University. His research interests are in computer vision, machine learning, and time series analysis. The first part of this Samsung-supported research is expected to be completed by the end of September 2016.