The DeepMind Fast Language Learning Tasks are a set of machine-learning tasks that require agents to learn the meaning of instruction words either slowly (i.e. across many episodes), quickly (i.e. within a single episode), or both. The tasks are Unity-based and are provided through pre-packaged Docker containers; this package consists of support code to run those containers. You interact with the task environment via a `dm_env` Python interface.
Please see the documentation for more detailed information on the available tasks, actions and observations.
`dm_fast_mapping` requires Docker, Python 3.6.1 or later, and an x86-64 CPU with SSE4.2 support. We do not attempt to maintain a working version for Python 2.
Note: We recommend using a Python virtual environment to mitigate conflicts with your system's Python environment.
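For example, a virtual environment can be created and activated as follows (the environment name `dm_fast_mapping_env` is just an illustrative choice):

```shell
$ python3 -m venv dm_fast_mapping_env
$ source dm_fast_mapping_env/bin/activate
```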
Before installing this package, download and install Docker for your platform.
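You can then check that Docker is installed and the daemon is running with Docker's standard smoke test:

```shell
$ docker run hello-world
```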
You can install `dm_fast_mapping` by cloning a local copy of our GitHub repository:

```shell
$ git clone https://github.com/deepmind/dm_fast_mapping.git
$ pip install ./dm_fast_mapping
```
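As a quick sanity check that the package installed correctly (this only verifies the Python import, not the Docker image), you can run:

```shell
$ python -c "import dm_fast_mapping"
```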
You can install the additional dependencies for the scripts in `examples/` with:

```shell
$ pip install ./dm_fast_mapping[examples]
```
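After that, you should be able to launch one of the example scripts directly, for instance (assuming the repository ships a human-playable agent script; the exact filename may differ):

```shell
$ python dm_fast_mapping/examples/human_agent.py
```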
Once `dm_fast_mapping` is installed, you can instantiate a `dm_env` instance by running the following:

```python
import dm_fast_mapping

settings = dm_fast_mapping.EnvironmentSettings(
    seed=123, level_name='fast_slow/fast_map_three_objs')
env = dm_fast_mapping.load_from_docker(settings)
```
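To illustrate the `dm_env` interface, here is a minimal sketch of a single-episode loop that feeds the environment placeholder actions. It assumes the action spec is a dict mapping action names to `dm_env` specs; see the documentation for the actual action and observation names:

```python
import dm_fast_mapping

settings = dm_fast_mapping.EnvironmentSettings(
    seed=123, level_name='fast_slow/fast_map_three_objs')
env = dm_fast_mapping.load_from_docker(settings)

timestep = env.reset()
while not timestep.last():
    # Assumption: env.action_spec() is a dict of dm_env specs.
    # spec.generate_value() produces a minimal conforming (all-zeros)
    # value; a real agent would instead choose actions from its policy.
    action = {
        name: spec.generate_value()
        for name, spec in env.action_spec().items()
    }
    timestep = env.step(action)

env.close()
```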
If you use `dm_fast_mapping` in your work, please cite the accompanying paper:

```bibtex
@misc{hill2020grounded,
    title={Grounded Language Learning Fast and Slow},
    author={Felix Hill and Olivier Tieleman and Tamara von Glehn and
            Nathaniel Wong and Hamza Merzic and Stephen Clark},
    year={2020},
    eprint={2009.01719},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
For the `with_distractors` tasks, please also cite the source for those tasks:

```bibtex
@misc{lampinen2021towards,
    title={Towards mental time travel: a hierarchical memory for
           reinforcement learning agents},
    author={Lampinen, Andrew Kyle and Chan, Stephanie C Y and
            Banino, Andrea and Hill, Felix},
    year={2021},
    eprint={2105.14039},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
This is not an officially supported Google product.