Hand gesture recognition is a challenging task in machine vision due to the similarity between inter-class samples and the high amount of variation within intra-class samples. Gesture recognition that is independent of light intensity and color has drawn attention because such systems are also required to work at night. This paper provides an insight into dynamic hand gesture recognition using depth data and images collected from a time-of-flight (ToF) camera, and it provides a user interface to track natural gestures. The area of interest and the hand region are first segmented out using adaptive thresholding and region labeling, under the assumption that the hand is the closest object to the camera. A novel algorithm is proposed to segment only the hand region, and the noise introduced by ToF camera measurements is eliminated by preprocessing algorithms. We propose two algorithms for extracting hand gesture features. The first is based on computing the region distance between the fingers; the second computes a shape descriptor of the gesture boundary in a radial fashion from the centroid of the hand gesture. For matching a gesture with the first approach, the distance between two independent regions is computed for every row, and the same process is repeated across the columns. The total number of region transitions is computed for every row and column, and these transition counts across rows and columns form the feature vector. The proposed solution handles both static and dynamic gestures. In the second approach, we compute the distance between the gesture centroid and the shape boundary at angles from 0 to 360 degrees; these distances form the feature vector. A comparison of results shows that this method is very effective in extracting shape features and is competitive in terms of accuracy and speed. The gesture recognition algorithms described in this paper can be used in automotive infotainment systems and consumer electronics, where the hardware needs to be cost effective and the system response must be fast.
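As a rough illustration of the nearest-object assumption used during segmentation, the sketch below thresholds a ToF depth map around its closest valid measurement. The function name, the 150 mm depth band, and the zero-means-invalid convention are illustrative assumptions; this is not the paper's adaptive-thresholding and region-labeling procedure.

```python
import numpy as np

def segment_hand(depth, band=150.0):
    """Keep pixels close in depth to the nearest valid measurement.

    Minimal sketch of the "hand is the closest object to the camera"
    assumption; `band` (in millimetres) is an illustrative parameter.
    """
    valid = depth > 0                    # assume ToF reports 0 for invalid pixels
    if not valid.any():
        return np.zeros_like(depth, dtype=bool)
    nearest = depth[valid].min()         # depth of the closest surface
    return valid & (depth <= nearest + band)
```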
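The row/column transition feature could look roughly like the following sketch: for a binary hand mask, it counts how often each row and each column switches between background and hand regions and concatenates the counts into one vector. The exact counting and ordering are assumed for illustration, not taken from the paper.

```python
import numpy as np

def transition_features(mask):
    """Count background/hand transitions per row and per column.

    Sketch of a region-transition feature vector computed from a
    binary hand mask (assumed interpretation of the description).
    """
    m = mask.astype(np.int8)
    row_transitions = np.abs(np.diff(m, axis=1)).sum(axis=1)  # one count per row
    col_transitions = np.abs(np.diff(m, axis=0)).sum(axis=0)  # one count per column
    return np.concatenate([row_transitions, col_transitions])
```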
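The radial shape descriptor could be sketched as below: distances from the hand centroid to the gesture boundary are sampled over 0 to 360 degrees. The 1-degree binning and the choice of the farthest boundary pixel per angular bin are assumptions made for illustration.

```python
import numpy as np

def radial_descriptor(mask, n_angles=360):
    """Centroid-to-boundary distance sampled over 0-360 degrees (sketch)."""
    if not mask.any():
        return np.zeros(n_angles)
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                  # centroid of the hand region
    # Boundary pixels: hand pixels with at least one background 4-neighbour.
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    by, bx = np.nonzero(mask & ~interior)
    angles = (np.degrees(np.arctan2(by - cy, bx - cx)) + 360.0) % 360.0
    dists = np.hypot(by - cy, bx - cx)
    # Keep the largest boundary distance falling into each angular bin.
    feat = np.zeros(n_angles)
    bins = (angles * n_angles / 360.0).astype(int) % n_angles
    np.maximum.at(feat, bins, dists)
    return feat
```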