In this work, we introduce a context-based vocabulary remapping model to reprogram neural networks trained on a specific sequence classification task. We develop methods to repurpose text classification neural networks for alternate tasks without modifying the network architecture or parameters.
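As a rough illustration of what such a remapping can look like in practice, the sketch below wires a trainable token-remapping matrix around a frozen victim classifier in PyTorch. The victim interface (an exposed embedding matrix and a `classify_from_embeddings` entry point), the context-free simplification of the remapping, and the idea of a fixed label mapping are assumptions made for the example, not the exact construction of the work quoted above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VocabRemapProgram(nn.Module):
    """Sketch: reprogram a frozen text classifier via a learned vocabulary remapping.

    Assumes `victim` is a pretrained sequence classifier that exposes
    `victim.embedding` (an nn.Embedding over the source vocabulary) and
    `victim.classify_from_embeddings(emb)` returning source-label logits.
    Only `self.theta` is trained; the victim's weights are never modified.
    """

    def __init__(self, victim: nn.Module, new_vocab: int, src_vocab: int):
        super().__init__()
        self.victim = victim.eval()                 # frozen pretrained classifier
        for p in self.victim.parameters():
            p.requires_grad_(False)
        # One row of source-vocabulary logits per target-task token
        # (a context-free simplification of a context-based remapping).
        self.theta = nn.Parameter(torch.zeros(new_vocab, src_vocab))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:   # (batch, seq) token ids
        probs = F.softmax(self.theta[tokens], dim=-1)           # (batch, seq, src_vocab)
        emb = probs @ self.victim.embedding.weight              # soft source embeddings
        return self.victim.classify_from_embeddings(emb)        # source-label logits
```

Training then amounts to minimizing cross-entropy between these source-label logits, restricted to a fixed subset of source labels that stand in for the new task's labels, and the new task's ground truth.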
We demonstrate adversarial reprogramming on six ImageNet classification models, repurposing these models to perform a counting task as well as classification tasks.
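A minimal sketch of the pad-and-program construction used in that line of work is given below, written against torchvision's ResNet-50 as the frozen ImageNet model; the choice of victim, the image sizes, and the "first ten ImageNet classes" label mapping are illustrative assumptions, not the papers' exact setup.

```python
import torch
import torch.nn as nn
from torchvision import models

class ImageReprogram(nn.Module):
    """Sketch of an adversarial program wrapped around a frozen ImageNet classifier.

    A small target-task image (assumed already 3-channel, values in [0, 1]) is
    placed in the centre of a 224x224 frame; the learned program fills the
    surrounding border. Only `self.W` is trained. ImageNet mean/std
    normalization is omitted for brevity.
    """

    def __init__(self, small_size: int = 28, big_size: int = 224):
        super().__init__()
        self.victim = models.resnet50(weights="IMAGENET1K_V1").eval()
        for p in self.victim.parameters():
            p.requires_grad_(False)
        self.W = nn.Parameter(torch.zeros(3, big_size, big_size))
        # Mask is 0 where the target image sits, 1 elsewhere.
        mask = torch.ones(3, big_size, big_size)
        lo = (big_size - small_size) // 2
        mask[:, lo:lo + small_size, lo:lo + small_size] = 0
        self.register_buffer("mask", mask)
        self.lo, self.small = lo, small_size

    def forward(self, x_small: torch.Tensor) -> torch.Tensor:   # (batch, 3, 28, 28)
        batch = x_small.size(0)
        frame = torch.zeros(batch, *self.mask.shape, device=x_small.device)
        frame[:, :, self.lo:self.lo + self.small, self.lo:self.lo + self.small] = x_small
        program = torch.tanh(self.W) * self.mask    # learned border pattern
        logits = self.victim(frame + program)       # 1000 ImageNet logits
        return logits[:, :10]                       # fixed remapping: first 10 ImageNet classes
```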
Adversarial reprogramming is a technique that repurposes a machine learning model, originally trained for one task, to perform a different task chosen by the adversary.
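Schematically, and using generic notation rather than any single paper's (the maps $h_{\mathrm{in}}$ and $h_{\mathrm{lab}}$ and the program parameters $\theta$ below are only a sketch of the usual setup):

```latex
\[
  \theta^{\ast} \;=\; \arg\min_{\theta}\;
  \mathbb{E}_{(\tilde{x},\,\tilde{y})}
  \Bigl[ -\log P_{f}\bigl( h_{\mathrm{lab}}(\tilde{y}) \,\big|\, h_{\mathrm{in}}(\tilde{x};\,\theta) \bigr) \Bigr]
\]
```

Here $f$ is the frozen pretrained classifier, $h_{\mathrm{in}}(\cdot\,;\theta)$ embeds a target-task input $\tilde{x}$ into $f$'s input space through the learned adversarial program $\theta$, and $h_{\mathrm{lab}}$ is a fixed assignment of each target-task label $\tilde{y}$ to one of $f$'s original labels; only $\theta$ is optimized, so $f$'s architecture and parameters are never changed.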
In this work, we develop techniques to adversarially reprogram image classification networks for discrete sequence classification tasks.
We prove that two-layer ReLU neural networks with random weights can be adversarially reprogrammed to achieve arbitrarily high accuracy on Bernoulli data.
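For context, a "two-layer ReLU network with random weights" in results of this kind typically refers to the following architecture (the precise data model and program form in the quoted statement may differ; the display only pins down the architecture being referred to):

```latex
\[
  f(x) \;=\; \sum_{j=1}^{m} a_j \,\sigma\!\bigl(\langle w_j, x\rangle\bigr),
  \qquad \sigma(t) = \max(t, 0)
\]
```

with hidden weights $w_j$ and output weights $a_j$ drawn at random and then held fixed; adversarial reprogramming then seeks a single input transformation (for example, an additive program applied to every input) such that the sign of $f$ on the transformed inputs matches the labels of the new task.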
Adversarial reprogramming has demonstrated success in utilizing pre-trained neural network classifiers for alternative classification tasks without modifying the original networks.