

metadata language: English

TEgO: Teachable Egocentric Objects Dataset

Version
1
Resource Type
Dataset: images (photographs, drawings, graphical representations)
Creator
  • Kacorri, Hernisa (University of Maryland, College Park)
Publication Date
2019-01-04
Funding Reference
  • NIDILRR
    • Award Number: 90REGE0008
Description
  • Abstract

    The TEgO dataset includes egocentric images of 19 distinct objects taken by two people for training and testing a teachable object recognizer. Specifically, a blind individual and a sighted individual each took photos of the objects with a smartphone camera to train and test their own teachable object recognizers.
    A detailed description of the dataset (people, objects, environment, and lighting) can be found in our CHI 2019 paper titled “Hands holding clues for object recognition in teachable machines” (see citation below).
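    As a rough illustration of how such a dataset might be consumed, the sketch below groups image paths by object label into training and testing sets for a teachable object recognizer. The directory layout used here (`<split>/<photographer>/<object>/<image>`) is a hypothetical example, not the dataset's documented structure; consult the CHI 2019 paper for the actual organization.

    ```python
    from collections import defaultdict
    from pathlib import PurePosixPath

    def split_by_object(image_paths, test_prefix="testing"):
        """Group image paths by object label into train/test sets.

        Assumes a hypothetical layout: <split>/<photographer>/<object>/<image>,
        where <split> is "training" or "testing". The real TEgO layout may differ.
        """
        train, test = defaultdict(list), defaultdict(list)
        for p in image_paths:
            split, _photographer, obj = PurePosixPath(p).parts[:3]
            target = test if split == test_prefix else train
            target[obj].append(p)
        return train, test

    # Synthetic example paths, purely for illustration:
    paths = [
        "training/blind/can/img001.jpg",
        "training/blind/can/img002.jpg",
        "testing/blind/can/img003.jpg",
        "training/sighted/mug/img004.jpg",
    ]
    train, test = split_by_object(paths)
    ```

    Keeping the two photographers' images separate (as the `_photographer` path component allows) matters here, since each individual trains and tests their own recognizer.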
Availability
Download
Relations
  • Cites
    DOI: 10.1145/3290605.3300566 (Text)
Publications
  • Lee, Kyungjun, and Hernisa Kacorri. “Hands Holding Clues for Object Recognition in Teachable Machines.” In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). New York, NY, USA: ACM Press, 2019. https://doi.org/10.1145/3290605.3300566.
    • ID: 10.1145/3290605.3300566 (DOI)

Update Metadata: 2019-05-31 | Issue Number: 1 | Registration Date: 2019-05-31

Kacorri, Hernisa (2019): TEgO: Teachable Egocentric Objects Dataset. Version: 1. ICPSR - Interuniversity Consortium for Political and Social Research. Dataset. https://doi.org/10.3886/E109967V1