bioio.writers package¶
Submodules¶
bioio.writers.ome_tiff_writer module¶
- class bioio.writers.ome_tiff_writer.OmeTiffWriter[source]¶
Bases:
Writer
- static build_ome(data_shapes: List[Tuple[int, ...]], data_types: List[dtype], dimension_order: List[str | None] | None = None, channel_names: List[List[str] | None] | None = None, image_name: List[str | None] | None = None, physical_pixel_sizes: List[PhysicalPixelSizes] | None = None, channel_colors: List[List[List[int]] | None] | None = None) OME [source]¶
Create the necessary metadata for an OME tiff image
- Parameters:
- data_shapes:
A list of 5- or 6-d tuples
- data_types:
A list of data types
- dimension_order:
The order of dimensions in the data array, using T,C,Z,Y,X and optionally S
- channel_names:
The names for each channel to be put into the OME metadata
- image_name:
The name of the image to be put into the OME metadata
- physical_pixel_sizes:
Z, Y, and X physical dimensions of each pixel, defaulting to microns
- channel_colors:
List of all images channel colors to be put into the OME metadata
- is_rgb:
Whether an S (samples) dimension is present. S is expected to be the last dimension in the data shape.
- Returns:
- OME
An OME object that can be converted to a valid OME-XML string
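The build_ome helper can also be called directly when OME metadata needs to be constructed or inspected before writing. The following is a minimal illustrative sketch (not taken from the bioio documentation); the shape, dtype, channel names, and image name are placeholder values.
>>> import numpy as np
... ome = OmeTiffWriter.build_ome(
...     data_shapes=[(1, 3, 10, 1024, 2048)],   # one 5-D TCZYX image, placeholder shape
...     data_types=[np.dtype(np.uint16)],
...     dimension_order=["TCZYX"],
...     channel_names=[["DAPI", "GFP", "Brightfield"]],
...     image_name=["Image:0"],
... )
... # ome is an OME object that can be converted to a valid OME-XML string.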
- static save(data: List[ndarray | Array] | ndarray | Array, uri: str | Path, dim_order: str | List[str | None] | None = None, ome_xml: str | OME | None = None, channel_names: List[str] | List[List[str] | None] | None = None, image_name: str | List[str | None] | None = None, physical_pixel_sizes: PhysicalPixelSizes | List[PhysicalPixelSizes] | None = None, channel_colors: List[List[int]] | List[List[List[int]] | None] | None = None, fs_kwargs: Dict[str, Any] = {}, **kwargs: Any) None [source]¶
Write a data array to a file.
- Parameters:
- data: Union[List[biob.types.ArrayLike], biob.types.ArrayLike]
The array of data to store. Data arrays must have 2 to 6 dimensions. If a list is provided, then it is understood to be multiple images written to the ome-tiff file. All following metadata parameters will be expanded to the length of this list.
- uri: biob.types.PathLike
The URI or local path for where to save the data. Note: OmeTiffWriter can only write to local file systems.
- dim_order: Optional[Union[str, List[Union[str, None]]]]
The dimension order of the provided data. Dimensions must be a list of T, C, Z, Y, X, and S (S=samples for RGB data). Dimension strings must be the same length as the number of dimensions in the data. If S is present it must be last and its data count must be 3 or 4. Default: None. If None is provided for any data array, we will guess dimensions based on a TCZYX ordering. In the None case, data will be assumed to be scalar, not RGB.
- ome_xml: Optional[Union[str, OME]]
Provided OME metadata. The metadata can be an XML string or an OME object from ome-types. A provided ome_xml will override any other provided metadata arguments. Default: None. The passed-in metadata will be validated against the current OME-XML schema, and an exception will be raised if it is invalid. The ome_xml will also be compared against the dimensions of the input data. If None is given, then OME-XML metadata will be generated from the data array and any of the following metadata arguments.
- channel_names: Optional[Union[List[str], List[Optional[List[str]]]]]
Lists of strings representing the names of the data channels. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Channel:image_index:channel_index”.
- image_name: Optional[Union[str, List[Union[str, None]]]]
List of strings representing the names of the images. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Image:image_index”.
- physical_pixel_sizes: Optional[Union[biob.types.PhysicalPixelSizes, List[biob.types.PhysicalPixelSizes]]]
List of numbers representing the physical pixel sizes in Z, Y, X in microns. Default: None
- channel_colors: Optional[Union[List[List[int]], List[Optional[List[List[int]]]]]]
List of RGB color values per channel or a list of lists for each image. These must be values compatible with the OME spec. Default: None
- fs_kwargs: Dict[str, Any]
Any specific keyword arguments to pass down to the fsspec created filesystem. Default: {}
- Raises:
- ValueError:
Non-local file system URI provided.
Examples
Write a TCZYX data set to OME-Tiff
>>> image = numpy.ndarray([1, 10, 3, 1024, 2048])
... OmeTiffWriter.save(image, "file.ome.tif")
Write data with a dimension order into OME-Tiff
>>> image = numpy.ndarray([10, 3, 1024, 2048])
... OmeTiffWriter.save(image, "file.ome.tif", dim_order="ZCYX")
Write multi-scene data to OME-Tiff, specifying channel names
>>> image0 = numpy.ndarray([3, 10, 1024, 2048])
... image1 = numpy.ndarray([3, 10, 512, 512])
... OmeTiffWriter.save(
...     [image0, image1],
...     "file.ome.tif",
...     dim_order="CZYX",  # this single value will be repeated to each image
...     channel_names=[["C00","C01","C02"],["C10","C11","C12"]]
... )
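Write with an image name and physical pixel sizes (illustrative sketch)
The sketch below is not from the bioio documentation. It assumes PhysicalPixelSizes can be imported from bioio_base.types (the biob.types alias used in the signature above), and the shape and pixel-size values are placeholders.
>>> from bioio_base.types import PhysicalPixelSizes  # assumed import path
... image = numpy.ndarray([1, 3, 10, 1024, 2048])
... OmeTiffWriter.save(
...     image,
...     "file.ome.tif",
...     image_name="Image:0",
...     physical_pixel_sizes=PhysicalPixelSizes(Z=0.5, Y=0.1, X=0.1),  # microns, placeholder values
... )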
bioio.writers.ome_zarr_writer module¶
- class bioio.writers.ome_zarr_writer.OmeZarrWriter(uri: str | Path)[source]¶
Bases:
object
Constructor.
- Parameters:
- uri: biob.types.PathLike
The URI or local path for where to save the data.
- static build_ome(size_z: int, image_name: str, channel_names: List[str], channel_colors: List[int], channel_minmax: List[Tuple[float, float]]) Dict [source]¶
Create the omero metadata for an OME zarr image
- Parameters:
- size_z:
Number of z planes
- image_name:
The name of the image
- channel_names:
The names for each channel
- channel_colors:
List of all channel colors
- channel_minmax:
List of all (min, max) pairs of channel intensities
- Returns:
- Dict
An “omero” metadata object suitable for writing to ome-zarr
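A usage sketch with illustrative values; in particular, the packed-integer color encoding is an assumption inferred from the channel_colors: List[int] type, not something stated by this documentation.
>>> omero = OmeZarrWriter.build_ome(
...     size_z=10,
...     image_name="Image:0",
...     channel_names=["C00", "C01", "C02"],
...     channel_colors=[0xFF0000, 0x00FF00, 0x0000FF],  # assumed packed RGB ints
...     channel_minmax=[(0.0, 65535.0), (0.0, 65535.0), (0.0, 65535.0)],
... )
... # omero is a plain dict suitable for inclusion in ome-zarr metadata.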
- write_image(image_data: ndarray | Array, image_name: str, physical_pixel_sizes: PhysicalPixelSizes | None, channel_names: List[str] | None, channel_colors: List[int] | None, chunk_dims: Tuple | None = None, scale_num_levels: int = 1, scale_factor: float = 2.0, dimension_order: str | None = None) None [source]¶
Write a data array to a file. NOTE that this API is not yet finalized and will change in the future.
- Parameters:
- image_data: biob.types.ArrayLike
The array of data to store. Data arrays must have 2 to 6 dimensions.
- image_name: str
String representing the name of the image.
- physical_pixel_sizes: Optional[biob.types.PhysicalPixelSizes]
PhysicalPixelSizes object representing the physical pixel sizes in Z, Y, X in microns. Default: None
- channel_names: Optional[List[str]]
List of strings representing the names of the data channels. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Channel:image_index:channel_index”.
- channel_colors: Optional[List[int]]
List of RGB color values per channel. These must be values compatible with the OME spec. Default: None
- scale_num_levels: Optional[int]
Number of pyramid levels to use for the image. Default: 1 (represents no downsampled levels)
- scale_factor: Optional[float]
The scale factor to use for the image. Only active if scale_num_levels > 1. Default: 2.0
- dimension_order: Optional[str]
The dimension order of the data. If None is given, the dimension order will be guessed from the number of dimensions in the data according to TCZYX order.
Examples
Write a TCZYX data set to OME-Zarr
>>> image = numpy.ndarray([1, 10, 3, 1024, 2048])
... writer = OmeZarrWriter("/path/to/file.ome.zarr")
... writer.write_image(image)
Write multi-scene data to OME-Zarr, specifying channel names
>>> image0 = numpy.ndarray([3, 10, 1024, 2048])
... image1 = numpy.ndarray([3, 10, 512, 512])
... writer = OmeZarrWriter("/path/to/file.ome.zarr")
... writer.write_image(image0, "Image:0", ["C00","C01","C02"])
... writer.write_image(image1, "Image:1", ["C10","C11","C12"])
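Write a multiresolution pyramid (illustrative sketch)
To request downsampled levels, scale_num_levels and scale_factor can be passed explicitly. The sketch below uses placeholder data and names and passes every parameter by keyword; it is illustrative rather than taken from the bioio documentation.
>>> image = numpy.random.rand(1, 3, 10, 1024, 2048)
... writer = OmeZarrWriter("/path/to/file.ome.zarr")
... writer.write_image(
...     image_data=image,
...     image_name="Image:0",
...     physical_pixel_sizes=None,
...     channel_names=["C00", "C01", "C02"],
...     channel_colors=None,
...     scale_num_levels=3,   # base level plus two downsampled levels
...     scale_factor=2.0,
...     dimension_order="TCZYX",
... )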
bioio.writers.ome_zarr_writer_2 module¶
- class bioio.writers.ome_zarr_writer_2.OmeZarrWriter[source]¶
Bases:
object
Class to write OME-Zarr files. Example usage:
from ome_zarr_writer import OmeZarrWriter, compute_level_shapes, compute_level_chunk_sizes_zslice

# We need to compute the shapes and chunk sizes for each
# desired multiresolution level.
shapes = compute_level_shapes(input_shape, scaling, num_levels)
chunk_sizes = compute_level_chunk_sizes_zslice(shapes)

# Create an OmeZarrWriter object
writer = OmeZarrWriter()

# Initialize the store. Use s3 url or local directory path!
writer.init_store(str(save_uri), shapes, chunk_sizes, im.dtype)

# Write the image.
# This will compute downsampled levels on the fly.
# Adjust t batch size based on dask compute capacity.
writer.write_t_batches_array(im, tbatch=4)

# Generate a metadata dict and write it to the zarr.
meta = writer.generate_metadata(
    image_name="my_image_name",
    channel_names=my_channel_names,
    physical_dims=physical_scale,
    physical_units=physical_units,
    channel_colors=my_channel_colors,
)
writer.write_metadata(meta)
- generate_metadata(image_name: str, channel_names: List[str], physical_dims: dict, physical_units: dict, channel_colors: List[str] | List[int]) dict [source]¶
Build a metadata dict suitable for writing to ome-zarr attrs.
- Parameters:
- image_name:
The image name.
- channel_names:
The channel names.
- physical_dims:
For each physical dimension, include a scale factor. E.g. {"x": 0.1, "y": 0.1, "z": 0.3, "t": 5.0}
- physical_units:
For each physical dimension, include a unit string. E.g. {"x": "micrometer", "y": "micrometer", "z": "micrometer", "t": "minute"}
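A usage sketch of generate_metadata, reusing the placeholder dictionaries shown in the parameter descriptions above; the channel color values are illustrative assumptions, not required encodings.
>>> writer = OmeZarrWriter()
... meta = writer.generate_metadata(
...     image_name="my_image_name",
...     channel_names=["C00", "C01"],
...     physical_dims={"x": 0.1, "y": 0.1, "z": 0.3, "t": 5.0},
...     physical_units={"x": "micrometer", "y": "micrometer", "z": "micrometer", "t": "minute"},
...     channel_colors=[0xFF0000, 0x00FF00],  # assumed packed RGB ints
... )
... writer.write_metadata(meta)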
- init_store(output_path: str, shapes: List[Tuple[int, int, int, int, int]], chunk_sizes: List[Tuple[int, int, int, int, int]], dtype: dtype, compressor: Codec = Blosc(cname='lz4', clevel=5, shuffle=SHUFFLE, blocksize=0)) None [source]¶
Initialize the store.
- Parameters:
- output_path:
The output path. If it begins with “s3://” or “gs://”, it is assumed to be a remote store. Credentials are expected to be provided externally.
- shapes:
The shapes of the levels.
- chunk_sizes:
The chunk sizes of the levels.
- dtype:
The data type.
- write_metadata(metadata: dict) None [source]¶
Write the metadata.
- Parameters:
- metadata:
The metadata dict. Expected to contain a multiscales array and an omero dict.
- write_t_batches(im: BioImage, tbatch: int = 4, debug: bool = False) None [source]¶
Write the image in batches of T.
- Parameters:
- im:
The BioImage object.
- tbatch:
The number of timepoints (T) to write at a time.
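A small sketch of writing an image in T batches, assuming the store has already been initialized with init_store as in the class example above; the file name is a placeholder.
>>> from bioio import BioImage
... im = BioImage("my_image.ome.tiff")
... writer.write_t_batches(im, tbatch=4)  # writes 4 timepoints at a time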
- class bioio.writers.ome_zarr_writer_2.ZarrLevel(shape: Tuple[int, int, int, int, int], chunk_size: Tuple[int, int, int, int, int], dtype: str, zarray: zarr.core.Array)[source]¶
Bases:
object
- chunk_size: Tuple[int, int, int, int, int]¶
- dtype: str¶
- shape: Tuple[int, int, int, int, int]¶
- zarray: Array¶
- bioio.writers.ome_zarr_writer_2.build_ome(size_z: int, image_name: str, channel_names: List[str], channel_colors: List[int], channel_minmax: List[Tuple[float, float]]) dict [source]¶
Create the omero metadata for an OME zarr image
- Parameters:
- size_z:
Number of z planes
- image_name:
The name of the image
- channel_names:
The names for each channel. Must be of correct length!
- channel_colors:
List of all channel colors
- channel_minmax:
List of all (min, max) pairs of channel intensities
- Returns:
- Dict
An “omero” metadata object suitable for writing to ome-zarr
- bioio.writers.ome_zarr_writer_2.chunk_size_from_memory_target(shape: Tuple[int, int, int, int, int], dtype: str, memory_target: int) Tuple[int, int, int, int, int] [source]¶
Calculate chunk size from memory target in bytes. The chunk size will be determined by considering a single T and C, and subdividing the remaining dims by 2 until the chunk fits within the size target.
- Parameters:
- shape:
Shape of the array. Assumes a 5d TCZYX array.
- dtype:
Data type of the array.
- memory_target:
Memory target in bytes.
- Returns:
- Chunk size tuple.
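A hedged worked example of the subdivision described above: for a (1, 1, 64, 2048, 2048) uint16 array, a single-T, single-C slab starts at 64 * 2048 * 2048 * 2 bytes = 512 MiB, and each halving of Z, Y, and X divides that by 8. The exact tuple returned depends on the implementation and is not asserted here; the shape, dtype, and target are placeholders.
>>> chunk_size = chunk_size_from_memory_target(
...     shape=(1, 1, 64, 2048, 2048),
...     dtype="uint16",
...     memory_target=64 * 1024 * 1024,  # 64 MiB
... )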
- bioio.writers.ome_zarr_writer_2.compute_level_chunk_sizes_zslice(shapes: List[Tuple[int, int, int, int, int]]) List[Tuple[int, int, int, int, int]] [source]¶
Convenience function to calculate chunk sizes for each of the input level shapes assuming that the shapes are TCZYX and we want the chunking to be per Z slice. This code also assumes we are only downsampling in XY and leaving TCZ alone. For many of our microscopy images so far, we have much more resolution in XY than in Z so this is a reasonable assumption.
The first chunk size returned will always be (1, 1, 1, shapes[0][3], shapes[0][4]), and each subsequent chunk size will scale the number of Z slices up by the same factor by which the corresponding shape was downsampled.
This is an attempt to keep the total size of chunks the same across all levels, by increasing the number of slices for downsampled levels. This is making a basic assumption that each of the shapes is a downsampled version of the previous shape.
For example, in a typical case, if the second level is scaled down by 1/2 in X and Y, then the second chunk size will have 4x the number of slices.
- Parameters:
- shapes:
List of all multiresolution level shapes
- Returns:
- List of chunk sizes for per-slice chunking
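An illustrative sketch of the per-slice chunking described above, using placeholder level shapes. The commented chunk sizes show the pattern implied by the description (4x more Z slices per 2x XY downsample), not captured output.
>>> shapes = [
...     (1, 2, 64, 2048, 2048),  # level 0
...     (1, 2, 64, 1024, 1024),  # level 1, XY halved
...     (1, 2, 64, 512, 512),    # level 2, XY halved again
... ]
... chunk_sizes = compute_level_chunk_sizes_zslice(shapes)
... # Expected pattern per the description above:
... #   level 0 -> (1, 1, 1, 2048, 2048)
... #   level 1 -> (1, 1, 4, 1024, 1024)
... #   level 2 -> (1, 1, 16, 512, 512)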
- bioio.writers.ome_zarr_writer_2.compute_level_shapes(lvl0shape: Tuple[int, int, int, int, int], scaling: Tuple[float, float, float, float, float], nlevels: int) List[Tuple[int, int, int, int, int]] [source]¶
Calculate all multiresolution level shapes by repeatedly scaling. The minimum dimension size will always be 1. This will always return nlevels shapes, even if the levels become irreducible and have to repeat.
- Parameters:
- lvl0shape:
Shape of the array. Assumes a 5d TCZYX tuple.
- scaling:
Amount to scale each dimension by. Dims will be DIVIDED by these values.
- nlevels:
Number of levels to return. The first level is the original lvl0shape.
- Returns:
- List of shapes of all nlevels.
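An illustrative call with placeholder values; the commented result shows the shapes implied by dividing Y and X by 2 at each level, not captured output.
>>> shapes = compute_level_shapes(
...     lvl0shape=(1, 2, 64, 4096, 4096),
...     scaling=(1.0, 1.0, 1.0, 2.0, 2.0),
...     nlevels=3,
... )
... # Implied result:
... #   [(1, 2, 64, 4096, 4096), (1, 2, 64, 2048, 2048), (1, 2, 64, 1024, 1024)]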
- bioio.writers.ome_zarr_writer_2.dim_tuple_to_dict(dims: Tuple[int, int, int, int, int] | Tuple[float, float, float, float, float]) dict [source]¶
- bioio.writers.ome_zarr_writer_2.get_scale_ratio(level0: Tuple[int, ...], level1: Tuple[int, ...]) Tuple[float, ...] [source]¶
- bioio.writers.ome_zarr_writer_2.resize(image: Array, output_shape: Tuple[int, ...], *args: Any, **kwargs: Any) Array [source]¶
Wrapped copy of skimage.transform.resize. Resize image to match a certain size.
- Parameters:
- image: dask.array
The dask array to resize
- output_shape: tuple
The shape of the resized array
- *args: list
Arguments of skimage.transform.resize
- **kwargs: dict
Keyword arguments of skimage.transform.resize
- Returns:
- Resized image.
bioio.writers.timeseries_writer module¶
- class bioio.writers.timeseries_writer.TimeseriesWriter[source]¶
Bases:
Writer
A writer for timeseries of greyscale, RGB, or RGBA image data. Primarily directed at formats: “gif”, “mp4”, “mkv”, etc.
- DIM_ORDERS = {3: 'TYX', 4: 'TYXS'}¶
- static save(data: ndarray | Array, uri: str | Path, dim_order: str | None = None, fps: int = 24, fs_kwargs: Dict[str, Any] = {}, **kwargs: Any) None [source]¶
Write a data array to a file.
- Parameters:
- data: biob.types.ArrayLike
The array of data to store. Data must have either three or four dimensions.
- uri: biob.types.PathLike
The URI or local path for where to save the data.
- dim_order: str
The dimension order of the provided data. Default: None. Based on the number of dimensions, the dimensions will be assumed as follows: three dimensions: TYX; four dimensions: TYXS.
- fps: int
Frames per second to attach as metadata. Default: 24
- fs_kwargs: Dict[str, Any]
Any specific keyword arguments to pass down to the fsspec created filesystem. Default: {}
- Raises:
- IOError
Cannot write FFMPEG formats to remote storage.
Notes
This writer can also be useful when wanting to create a timeseries image using a non-time dimension. For example, creating a timeseries image where each frame is a Z-plane from a source volumetric image as seen below.
>>> image = BioImage("some_z_stack.ome.tiff")
... TimeseriesWriter.save(
...     data=image.get_image_data("ZYX", T=0, C=0),
...     uri="some_z_stack.mp4",
...     # Overloading the Z dimension as the Time dimension
...     # Technically not needed as it would have been assumed due to three dim
...     dim_order="TYX",
... )
Examples
Data is the correct shape and dimension order
>>> image = dask.array.random.random((50, 100, 100))
... TimeseriesWriter.save(image, "file.gif")
Data provided with a specified dimension order
>>> image = numpy.random.rand(100, 3, 1024, 2048)
... TimeseriesWriter.save(image, "file.mkv", "TSYX")
Save to remote
>>> image = numpy.random.rand(300, 100, 100, 3)
... TimeseriesWriter.save(image, "s3://my-bucket/file.png")
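Attach a frame rate (illustrative sketch)
The fps keyword is attached as metadata, as described above; the data shape and value below are placeholders.
>>> image = numpy.random.rand(60, 256, 256)
... TimeseriesWriter.save(image, "file.gif", fps=10)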
bioio.writers.two_d_writer module¶
- class bioio.writers.two_d_writer.TwoDWriter[source]¶
Bases:
Writer
A writer for image data that is only 2-dimensional, with optional samples (RGB / RGBA). Primarily directed at formats: “png”, “jpg”, etc.
This is primarily a passthrough to imageio.imwrite.
- DIM_ORDERS = {2: 'YX', 3: 'YXS'}¶
- FFMPEG_FORMATS = ['mov', 'avi', 'mpg', 'mpeg', 'mp4', 'mkv', 'wmv', 'ogg']¶
- static get_extension_and_mode(path: str) Tuple[str, str] [source]¶
Provided a path to a file, return the extension (format) of the file and the imageio read mode.
- Parameters:
- path: str
The file to provide extension and mode info for.
- Returns:
- extension: str
The extension (a naive guess at the format) of the file.
- mode: str
The imageio read mode to use for image reading.
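A small usage sketch; the path is a placeholder and the returned values are described rather than asserted.
>>> extension, mode = TwoDWriter.get_extension_and_mode("my_image.png")
... # extension is a naive guess at the format (e.g. "png");
... # mode is the imageio read mode selected by the writer for that format.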
- static save(data: ndarray | Array, uri: str | Path, dim_order: str | None = None, fs_kwargs: Dict[str, Any] = {}, **kwargs: Any) None [source]¶
Write a data array to a file.
- Parameters:
- data: types.ArrayLike
The array of data to store. Data must have either two or three dimensions.
- uri: types.PathLike
The URI or local path for where to save the data.
- dim_order: str
The dimension order of the provided data. Default: None. Based on the number of dimensions, the dimensions will be assumed similar to how https://github.com/bioio-devs/bioio-imageio/blob/main/bioio_imageio/reader.py reads in data. That is, two dimensions: YX; three dimensions: YXS.
- fs_kwargs: Dict[str, Any]
Any specific keyword arguments to pass down to the fsspec created filesystem. Default: {}
Examples
Data is the correct shape and dimension order
>>> image = dask.array.random.random((100, 100, 4))
... TwoDWriter.save(image, "file.png")
Data provided with a specified dimension order
>>> image = numpy.random.rand(3, 1024, 2048)
... TwoDWriter.save(image, "file.png", "SYX")
Save to remote
>>> image = numpy.random.rand(100, 100, 3)
... TwoDWriter.save(image, "s3://my-bucket/file.png")
bioio.writers.writer module¶
- class bioio.writers.writer.Writer[source]¶
Bases:
ABC
A small class to build standardized image writer functions.
- abstract static save(data: ndarray | Array, uri: str | Path, dim_order: str = 'TCZYX', **kwargs: Any) None [source]¶
Write a data array to a file.
- Parameters:
- data: types.ArrayLike
The array of data to store.
- uri: types.PathLike
The URI or local path for where to save the data.
- dim_order: str
The dimension order of the data.
Examples
>>> image = numpy.ndarray([1, 10, 3, 1024, 2048])
... DerivedWriter.save(image, "file.ome.tif", "TCZYX")
>>> image = dask.array.ones((4, 100, 100))
... DerivedWriter.save(image, "file.png", "CYX")
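The DerivedWriter used in the examples above is not part of bioio; it stands for any concrete subclass. A minimal hypothetical sketch of such a subclass (names and behavior are illustrative only):
>>> from bioio.writers.writer import Writer
... class DerivedWriter(Writer):
...     @staticmethod
...     def save(data, uri, dim_order="TCZYX", **kwargs):
...         # Reorder / validate `data` according to dim_order, then hand it
...         # off to whatever backend this writer targets.
...         ...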
Module contents¶
- class bioio.writers.OmeTiffWriter[source]¶
Bases:
Writer
- static build_ome(data_shapes: List[Tuple[int, ...]], data_types: List[dtype], dimension_order: List[str | None] | None = None, channel_names: List[List[str] | None] | None = None, image_name: List[str | None] | None = None, physical_pixel_sizes: List[PhysicalPixelSizes] | None = None, channel_colors: List[List[List[int]] | None] | None = None) OME [source]¶
Create the necessary metadata for an OME tiff image
- Parameters:
- data_shapes:
A list of 5- or 6-d tuples
- data_types:
A list of data types
- dimension_order:
The order of dimensions in the data array, using T,C,Z,Y,X and optionally S
- channel_names:
The names for each channel to be put into the OME metadata
- image_name:
The name of the image to be put into the OME metadata
- physical_pixel_sizes:
Z, Y, and X physical dimensions of each pixel, defaulting to microns
- channel_colors:
List of all images channel colors to be put into the OME metadata
- is_rgb:
Whether an S (samples) dimension is present. S is expected to be the last dimension in the data shape.
- Returns:
- OME
An OME object that can be converted to a valid OME-XML string
- static save(data: List[ndarray | Array] | ndarray | Array, uri: str | Path, dim_order: str | List[str | None] | None = None, ome_xml: str | OME | None = None, channel_names: List[str] | List[List[str] | None] | None = None, image_name: str | List[str | None] | None = None, physical_pixel_sizes: PhysicalPixelSizes | List[PhysicalPixelSizes] | None = None, channel_colors: List[List[int]] | List[List[List[int]] | None] | None = None, fs_kwargs: Dict[str, Any] = {}, **kwargs: Any) None [source]¶
Write a data array to a file.
- Parameters:
- data: Union[List[biob.types.ArrayLike], biob.types.ArrayLike]
The array of data to store. Data arrays must have 2 to 6 dimensions. If a list is provided, then it is understood to be multiple images written to the ome-tiff file. All following metadata parameters will be expanded to the length of this list.
- uri: biob.types.PathLike
The URI or local path for where to save the data. Note: OmeTiffWriter can only write to local file systems.
- dim_order: Optional[Union[str, List[Union[str, None]]]]
The dimension order of the provided data. Dimensions must be a list of T, C, Z, Y, X, and S (S=samples for RGB data). Dimension strings must be the same length as the number of dimensions in the data. If S is present it must be last and its data count must be 3 or 4. Default: None. If None is provided for any data array, we will guess dimensions based on a TCZYX ordering. In the None case, data will be assumed to be scalar, not RGB.
- ome_xml: Optional[Union[str, OME]]
Provided OME metadata. The metadata can be an XML string or an OME object from ome-types. A provided ome_xml will override any other provided metadata arguments. Default: None. The passed-in metadata will be validated against the current OME-XML schema, and an exception will be raised if it is invalid. The ome_xml will also be compared against the dimensions of the input data. If None is given, then OME-XML metadata will be generated from the data array and any of the following metadata arguments.
- channel_names: Optional[Union[List[str], List[Optional[List[str]]]]]
Lists of strings representing the names of the data channels. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Channel:image_index:channel_index”.
- image_name: Optional[Union[str, List[Union[str, None]]]]
List of strings representing the names of the images. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Image:image_index”.
- physical_pixel_sizes: Optional[Union[biob.types.PhysicalPixelSizes, List[biob.types.PhysicalPixelSizes]]]
List of numbers representing the physical pixel sizes in Z, Y, X in microns. Default: None
- channel_colors: Optional[Union[List[List[int]], List[Optional[List[List[int]]]]]]
List of RGB color values per channel or a list of lists for each image. These must be values compatible with the OME spec. Default: None
- fs_kwargs: Dict[str, Any]
Any specific keyword arguments to pass down to the fsspec created filesystem. Default: {}
- Raises:
- ValueError:
Non-local file system URI provided.
Examples
Write a TCZYX data set to OME-Tiff
>>> image = numpy.ndarray([1, 10, 3, 1024, 2048])
... OmeTiffWriter.save(image, "file.ome.tif")
Write data with a dimension order into OME-Tiff
>>> image = numpy.ndarray([10, 3, 1024, 2048])
... OmeTiffWriter.save(image, "file.ome.tif", dim_order="ZCYX")
Write multi-scene data to OME-Tiff, specifying channel names
>>> image0 = numpy.ndarray([3, 10, 1024, 2048])
... image1 = numpy.ndarray([3, 10, 512, 512])
... OmeTiffWriter.save(
...     [image0, image1],
...     "file.ome.tif",
...     dim_order="CZYX",  # this single value will be repeated to each image
...     channel_names=[["C00","C01","C02"],["C10","C11","C12"]]
... )
- class bioio.writers.OmeZarrWriter(uri: str | Path)[source]¶
Bases:
object
Constructor.
- Parameters:
- uri: biob.types.PathLike
The URI or local path for where to save the data.
- static build_ome(size_z: int, image_name: str, channel_names: List[str], channel_colors: List[int], channel_minmax: List[Tuple[float, float]]) Dict [source]¶
Create the omero metadata for an OME zarr image
- Parameters:
- size_z:
Number of z planes
- image_name:
The name of the image
- channel_names:
The names for each channel
- channel_colors:
List of all channel colors
- channel_minmax:
List of all (min, max) pairs of channel intensities
- Returns:
- Dict
An “omero” metadata object suitable for writing to ome-zarr
- write_image(image_data: ndarray | Array, image_name: str, physical_pixel_sizes: PhysicalPixelSizes | None, channel_names: List[str] | None, channel_colors: List[int] | None, chunk_dims: Tuple | None = None, scale_num_levels: int = 1, scale_factor: float = 2.0, dimension_order: str | None = None) None [source]¶
Write a data array to a file. NOTE that this API is not yet finalized and will change in the future.
- Parameters:
- image_data: biob.types.ArrayLike
The array of data to store. Data arrays must have 2 to 6 dimensions.
- image_name: str
String representing the name of the image.
- physical_pixel_sizes: Optional[biob.types.PhysicalPixelSizes]
PhysicalPixelSizes object representing the physical pixel sizes in Z, Y, X in microns. Default: None
- channel_names: Optional[List[str]]
List of strings representing the names of the data channels. Default: None. If None is given, the list will be generated as a 0-indexed list of strings of the form “Channel:image_index:channel_index”.
- channel_colors: Optional[List[int]]
List of RGB color values per channel. These must be values compatible with the OME spec. Default: None
- scale_num_levels: Optional[int]
Number of pyramid levels to use for the image. Default: 1 (represents no downsampled levels)
- scale_factor: Optional[float]
The scale factor to use for the image. Only active if scale_num_levels > 1. Default: 2.0
- dimension_order: Optional[str]
The dimension order of the data. If None is given, the dimension order will be guessed from the number of dimensions in the data according to TCZYX order.
Examples
Write a TCZYX data set to OME-Zarr
>>> image = numpy.ndarray([1, 10, 3, 1024, 2048])
... writer = OmeZarrWriter("/path/to/file.ome.zarr")
... writer.write_image(image)
Write multi-scene data to OME-Zarr, specifying channel names
>>> image0 = numpy.ndarray([3, 10, 1024, 2048])
... image1 = numpy.ndarray([3, 10, 512, 512])
... writer = OmeZarrWriter("/path/to/file.ome.zarr")
... writer.write_image(image0, "Image:0", ["C00","C01","C02"])
... writer.write_image(image1, "Image:1", ["C10","C11","C12"])
- bioio.writers.OmeZarrWriter2¶
alias of
OmeZarrWriter