Zero-Shot Sign Language Recognition:
Can Textual Data Uncover Sign Languages?

Yunus Can Bilge    Nazli Ikizler-Cinbis     R. Gokberk Cinbis


We introduce the problem of zero-shot sign language recognition (ZSSLR), where the goal is to leverage models learned over the seen sign class examples to recognize instances of unseen signs. To this end, we propose to utilize the readily available descriptions in sign language dictionaries as an intermediate-level semantic representation for knowledge transfer. We introduce a new benchmark dataset called ASL-Text that consists of 250 sign language classes and their accompanying textual descriptions. Compared to ZSL datasets in other domains (such as object recognition), our dataset contains a limited number of training examples for a large number of classes, which imposes a significant challenge. We propose a framework that operates over the body and hand regions by means of 3D-CNNs, and models longer temporal relationships via bidirectional LSTMs. By leveraging the descriptive text embeddings along with these spatio-temporal representations within a zero-shot learning framework, we show that textual data can indeed be useful in uncovering sign languages. We anticipate that the introduced approach and the accompanying dataset will provide a basis for further exploration of this new zero-shot learning problem.
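At test time, a zero-shot recognizer of this kind scores a video embedding against the text embedding of each unseen class and picks the best match. The following is a minimal sketch of that scoring step with a bilinear compatibility function; the dimensions, the random features standing in for 3D-CNN + BiLSTM outputs, and the random matrix W (which would normally be learned on the seen classes) are all illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a video feature from a 3D-CNN + BiLSTM pipeline,
# and text embeddings of the dictionary descriptions of 5 unseen sign classes.
d_video, d_text, n_unseen = 256, 300, 5

video_feat = rng.standard_normal(d_video)            # one test clip's embedding
text_embs = rng.standard_normal((n_unseen, d_text))  # one row per unseen class

# Bilinear compatibility matrix W; random here purely to illustrate scoring.
W = rng.standard_normal((d_video, d_text)) * 0.01

# s_c = v^T W t_c for each unseen class c; predict the highest-scoring class.
scores = text_embs @ (W.T @ video_feat)
pred = int(np.argmax(scores))
print(pred)
```

The key property of this formulation is that adding a new sign class only requires a text embedding of its dictionary description, not any video examples.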

Example textual descriptions from the sign language dictionary:

Move both S hands in alternating forward circles, palms facing down, in front of each side of the body.

Bring the fingertips of the right flattened O hand, palm facing in, to the lips with a repeated movement.

With the right index finger extended up, move the right hand, palm facing back, in a small repeated circle in front of the right shoulder.

Beginning with the fingertips of both F hands touching in front of the chest, palms facing each other, bring the hands away from each other in outward arcs while turning the palms in, ending with the little fingers touching.

Beginning with the bent thumb and middle finger of the right 5 hand touching the chest, palm facing in, bring the hand forward while closing the fingers to form an 8 hand.

Strike the knuckles of the right A hand, palm facing in, against the extended left index finger held up in front of the chest, palm facing right.
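Descriptions like the ones above can be mapped to fixed-length class embeddings, for example by averaging pretrained word vectors over the description. The snippet below is a toy sketch of that idea; the tiny random vocabulary stands in for real pretrained embeddings (e.g. 300-d word2vec or GloVe vectors), and is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy word-vector table standing in for pretrained embeddings; a real system
# would load vectors trained on a large text corpus.
vocab = {w: rng.standard_normal(8) for w in
         "move both s hands in alternating forward circles palms facing down".split()}

def embed(description: str) -> np.ndarray:
    """Average the word vectors of known words; out-of-vocabulary words are skipped."""
    vecs = [vocab[w] for w in description.lower().split() if w in vocab]
    return np.mean(vecs, axis=0)

e = embed("Move both S hands in alternating forward circles")
print(e.shape)  # (8,)
```

Averaging discards word order; sequence encoders (e.g. an LSTM over the description) are a common alternative when order matters.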

Please consider citing if you make use of this work and/or the dataset:

@inproceedings{bilge2019zsslr,
      author = {Bilge, Yunus Can and Ikizler-Cinbis, Nazli and Cinbis, Ramazan Gokberk},
      title = {Zero-Shot Sign Language Recognition: Can Textual Data Uncover Sign Languages?},
      booktitle = {Proceedings of the British Machine Vision Conference ({BMVC})},
      year = {2019}
}

A Medium post about this study is available in Turkish.
