Presented in this thesis is the design and development of a novel real-time HCI system combining a pair of data gloves with fibre-optic curvature sensors to acquire finger joint angles, a hybrid tracking system using inertial and ultrasonic sensors to capture hand position and orientation, and a stereoscopic display to provide immersive visual feedback. The potential and effectiveness of the proposed system are demonstrated through a number of hand-gesture-based applications, namely virtual object manipulation and visualisation, direct sign writing, and finger spelling. For virtual object manipulation and visualisation, the system is shown to allow a user to manipulate and visualise virtual objects in 3D space using natural hand gestures in real time. For direct sign writing, the system is shown to display immediately the SignWriting symbols corresponding to a range of complex hand gestures signed by a user. Furthermore, for finger spelling, the system is shown to recognise in real time the five vowels of British Sign Language signed with two hands.
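
As a purely illustrative sketch of the per-frame data such a system might fuse, the following C++ fragment defines a hypothetical hand-state record combining glove joint angles with the tracker's position and orientation. All names, field counts, and units here are assumptions made for illustration and do not describe the thesis implementation.

// Hypothetical per-frame hand state: finger joint angles from the data
// glove, plus position and orientation from the hybrid inertial/ultrasonic
// tracker. Field counts and units are illustrative assumptions only.
#include <array>
#include <cstdio>

struct HandState {
    std::array<float, 14> jointAnglesDeg; // glove curvature-sensor joint angles (assumed 14 sensors)
    std::array<float, 3>  positionM;      // hand position from the hybrid tracker (metres)
    std::array<float, 4>  orientationQ;   // hand orientation as a quaternion (w, x, y, z)
};

// Combine one glove reading and one tracker reading into a single record
// that a renderer or gesture recogniser could consume each frame.
HandState fuseFrame(const std::array<float, 14>& gloveAngles,
                    const std::array<float, 3>&  trackerPos,
                    const std::array<float, 4>&  trackerQuat) {
    return HandState{gloveAngles, trackerPos, trackerQuat};
}

int main() {
    // Dummy readings standing in for real sensor input.
    std::array<float, 14> angles{};                    // all joints straight
    std::array<float, 3>  pos{0.0f, 1.2f, 0.4f};
    std::array<float, 4>  quat{1.0f, 0.0f, 0.0f, 0.0f};

    HandState state = fuseFrame(angles, pos, quat);
    std::printf("first joint angle: %.1f deg, hand height: %.2f m\n",
                state.jointAnglesDeg[0], state.positionM[1]);
    return 0;
}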