FindingDory: A Benchmark to Evaluate Memory in Embodied Agents
Karmesh Yadav*, Yusuf Ali*, Gunshi Gupta, Yarin Gal, Zsolt Kira

Current vision-language models (VLMs) struggle with long-term memory in embodied tasks. To address this, we introduce FindingDory, a benchmark in Habitat that evaluates memory-based reasoning across 60 long-horizon tasks.
In this repo, we release the FindingDory Habitat Dataset. The episode dataset is used to run online evaluations in the Habitat simulator with VLM-based navigation agents. Each episode involves a robot performing multiple pick-and-place object interactions. The agent then needs to execute actions in the simulator to complete the tasks specified in the FindingDory benchmark.
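Below is a minimal sketch of how the episode files could be pulled from the Hub for local use with the Habitat simulator. The repository id is a placeholder, and the file path assumes the layout described under "Dataset Structure" below.

```python
import gzip
import json

from huggingface_hub import snapshot_download

# Placeholder: substitute this dataset's actual repository id on the Hub.
REPO_ID = "<org>/<findingdory-dataset-repo>"

# Download the raw files instead of streaming them through a generic JSON loader,
# since the episode records are deeply nested.
local_dir = snapshot_download(repo_id=REPO_ID, repo_type="dataset")

# Peek at the training episodes to confirm the download worked; the path assumes
# the findingdory/train layout described in the next section.
with gzip.open(f"{local_dir}/findingdory/train/episodes.json.gz", "rt") as f:
    episode_data = json.load(f)
print(type(episode_data))
```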
Dataset Structure
This repository contains data and models for the dnb_release package, structured into two main components: findingdory and findingdory_imagenav.
findingdory
| File/Folder | Description |
|---|---|
| train/ | Training split of the Habitat dataset. |
| ├── episodes.json.gz | Gzip-compressed JSON file containing episode metadata for the training episodes. |
| ├── transformations.npy | NumPy array of transformation matrices (of each rigid object in the simulator). |
| └── viewpoints.npy | NumPy array of viewpoints (camera poses of all objects/receptacles). |
| val/ | Validation split of the Habitat dataset. |
| ├── episodes.json.gz | Gzip-compressed JSON file containing episode metadata for the validation episodes. |
| ├── transformations.npy | NumPy array of transformation matrices (of each rigid object in the simulator). |
| └── viewpoints.npy | NumPy array of viewpoints (camera poses of all objects/receptacles). |
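A hedged sketch of how one split's three files could be inspected locally; the top-level "episodes" key is an assumption based on the usual Habitat episode-dataset format, and the array shapes/dtypes are not documented here, so verify against the actual files.

```python
import gzip
import json

import numpy as np

# Point at one split of the findingdory component; train/ and val/ share the same layout.
split_dir = "findingdory/train"

# Episode metadata. Habitat episode datasets conventionally wrap records in a
# top-level "episodes" list, but check the keys against the actual file.
with gzip.open(f"{split_dir}/episodes.json.gz", "rt") as f:
    episode_data = json.load(f)
if isinstance(episode_data, dict):
    print("top-level keys:", list(episode_data.keys()))
    print("num episodes:", len(episode_data.get("episodes", [])))

# Rigid-object transformation matrices and object/receptacle viewpoints.
# allow_pickle=True in case the arrays store Python objects rather than plain floats.
transformations = np.load(f"{split_dir}/transformations.npy", allow_pickle=True)
viewpoints = np.load(f"{split_dir}/viewpoints.npy", allow_pickle=True)
print("transformations:", transformations.shape, transformations.dtype)
print("viewpoints:", viewpoints.shape, viewpoints.dtype)
```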
findingdory_imagenav
| File/Folder | Description |
|---|---|
| dataset/ | Image navigation dataset (split into train and val). |
| ├── train/ | Training data for the image navigation task. |
| └── val/ | Validation data for the image navigation task. |
| policy_ckpt/ | Policy checkpoint files (trained navigation policies). |
| └── ckpt.17.pth | PyTorch checkpoint file containing the trained policy weights. |
| pretrained_vis_enc/ | Pretrained visual encoder weights. |
| └── ckpt.16.pth | PyTorch checkpoint file for the pretrained visual encoder used in imagenav training. |
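To sanity-check the released checkpoints, a minimal sketch using plain PyTorch; the internal key names are assumptions, so inspect the printed keys rather than relying on a fixed schema.

```python
import torch

# Load on CPU so no GPU is needed just to inspect the files.
# With newer PyTorch versions you may need weights_only=False if the checkpoint
# stores config objects in addition to tensors.
policy_ckpt = torch.load("findingdory_imagenav/policy_ckpt/ckpt.17.pth", map_location="cpu")
vis_enc_ckpt = torch.load("findingdory_imagenav/pretrained_vis_enc/ckpt.16.pth", map_location="cpu")

# Checkpoints are typically dictionaries; the top-level keys reveal whether they
# hold a raw state_dict, optimizer state, or the training config as well.
print("policy checkpoint keys:", list(policy_ckpt.keys()))
print("visual encoder checkpoint keys:", list(vis_enc_ckpt.keys()))
```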
Citation
@article{yadav2025findingdory,
title = {FindingDory: A Benchmark to Evaluate Memory in Embodied Agents},
author = {Yadav, Karmesh and Ali, Yusuf and Gupta, Gunshi and Gal, Yarin and Kira, Zsolt},
journal = {arXiv preprint arXiv:2506.15635},
year = {2025}
}