A Cloud-Optimized Storage for Interactive Access of Large Arrays




Keywords: FAIR, community, bioimaging, data, cloud, format


For decades, sharing large N-dimensional datasets has posed challenges across multiple domains. Interactive access to terabyte-scale data has previously required significant server resources to prepare cropped or down-sampled representations on the fly. A cloud-native chunked format that eases this burden, Zarr, has now been adopted for standardization in the bioimaging domain. The format is potentially of interest for other consortia and sections of NFDI.
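The core idea behind such chunked formats is that an array is split into fixed-size blocks, each stored under its own key in a plain key-value store (a filesystem or an object store such as S3), so a client fetching a small crop downloads only the chunks it touches rather than the whole file. The following is a minimal, self-contained sketch of that principle using a Python dict as the store; the `.zarray` metadata key and the `row.col` chunk naming loosely follow the Zarr v2 layout, while the chunk size and helper names are illustrative, not part of any real API.

```python
# Sketch of chunked array storage over a key-value store (here: a dict).
# Chunk keys follow a Zarr-v2-style "row.col" convention; sizes are illustrative.
import json

CHUNK = 4  # chunk edge length (illustrative choice)

def write_array(store, data):
    """Split a square 2-D list of numbers into CHUNK x CHUNK chunks."""
    n = len(data)
    # Array-level metadata, analogous to Zarr's ".zarray" document.
    store[".zarray"] = json.dumps({"shape": [n, n], "chunks": [CHUNK, CHUNK]})
    for ci in range(0, n, CHUNK):
        for cj in range(0, n, CHUNK):
            block = [row[cj:cj + CHUNK] for row in data[ci:ci + CHUNK]]
            store[f"{ci // CHUNK}.{cj // CHUNK}"] = json.dumps(block)

def read_value(store, i, j):
    """Fetch one element by loading only the chunk that contains it."""
    block = json.loads(store[f"{i // CHUNK}.{j // CHUNK}"])
    return block[i % CHUNK][j % CHUNK]

store = {}
write_array(store, [[r * 16 + c for c in range(16)] for r in range(16)])
# Reading one value touches exactly one chunk key, not the whole array.
print(read_value(store, 5, 10))  # element at row 5, col 10
```

A down-sampled overview for interactive viewers works the same way: pre-computed lower-resolution arrays are stored alongside the full-resolution one, so a viewer never has to reduce terabytes on the fly.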






Conference Proceedings Volume


Poster presentations II (Call for Papers)
Received 2023-04-26
Accepted 2023-06-30
Published 2023-09-07