Simultaneous Deep Transfer Across Domains and Tasks
# Simultaneous Deep Transfer Across Domains and Tasks
## Tzeng, Hoffman, Darrell, Saenko, 2015

* The paper aims to exploit unlabeled and sparsely labeled data from the target domain.
* As a baseline, the authors note that one could simply match feature distributions between the source and target domains. This work additionally exploits correlations between categories, such as _bottle_ and _mug_.
* The authors draw inspiration from the _Name That Dataset_ game of Torralba and Efros, in which a classifier is trained to predict which dataset an image originates from. This idea becomes the domain confusion loss: a domain classifier measures how separable the learned features of the source and target domains are, while the image classifier learns a feature representation that makes the domains indistinguishable, as measured by that domain classifier.
* The second idea learns the similarity structure between objects in the target domain. This works as follows: _We first compute the average output probability distribution, or "soft label," over the source training examples in each category. Then, for each target labeled example, we directly optimize our model to match the distribution over classes to the soft label. In this way we are able to perform task adaptation by transferring information to categories with no explicit labels in the target domain._
* The experiments cover two situations: the _supervised_ case, where only a few labels are present in the target domain, and the _semi-supervised_ case, where labels are present for only a subset of the classes.
* In the final section, the authors analyze their own results. They show how the image classifier correctly labeled a monitor, even though no labels for monitor were present in the target domain.
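The two losses described above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the function names and the temperature parameter `T` are my own (the paper softens the softmax with a high temperature, in the spirit of distillation), and the domain confusion term is shown here as a cross-entropy against a uniform target over domains.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; T is an assumed hyperparameter."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def compute_soft_labels(source_logits, source_labels, num_classes, T=2.0):
    """Average the softened output distribution over source examples per class."""
    probs = softmax(source_logits, T)
    return np.stack([probs[source_labels == c].mean(axis=0)
                     for c in range(num_classes)])

def soft_label_loss(target_logits, label, soft_labels, T=2.0):
    """Cross-entropy between a target example's softened prediction
    and the soft label of its class."""
    p = softmax(target_logits, T)
    return -np.sum(soft_labels[label] * np.log(p + 1e-12))

def domain_confusion_loss(domain_logits):
    """Loss on the feature extractor: push the domain classifier's
    output toward a uniform distribution, i.e. maximal confusion."""
    q = softmax(domain_logits)
    return -np.mean(np.log(q + 1e-12))
```

In training, the domain classifier and the feature extractor are optimized adversarially: the classifier minimizes ordinary domain-prediction cross-entropy, while the features minimize the confusion term above.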

Summary by robromijnders