BatchMapper¶
- class torchdata.datapipes.iter.BatchMapper(datapipe: IterDataPipe, fn: Callable, batch_size: int, input_col=None)¶
Combines elements from the source DataPipe into batches and applies a function over each batch, then flattens the outputs into a single, unnested IterDataPipe (functional name: map_batches).
- Parameters:
datapipe – Source IterDataPipe
fn – The function to be applied to each batch of data
batch_size – The size of batch to be aggregated from datapipe
input_col – Index or indices of data to which fn is applied, such as:
- None as default to apply fn to the data directly.
- Integer(s) is used for list/tuple.
- Key(s) is used for dict.
Example
>>> from torchdata.datapipes.iter import IterableWrapper
>>> def fn(batch):
>>>     return [d + 1 for d in batch]
>>> source_dp = IterableWrapper(list(range(5)))
>>> mapped_dp = source_dp.map_batches(fn, batch_size=3)
>>> list(mapped_dp)
[1, 2, 3, 4, 5]
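The batch-apply-flatten behavior above can be sketched in plain Python. This is a minimal approximation of the semantics, not the actual torchdata implementation (which also handles input_col selection and DataPipe plumbing); the helper name is illustrative.

```python
from typing import Callable, Iterable, Iterator, List


def map_batches_sketch(source: Iterable, fn: Callable[[List], Iterable],
                       batch_size: int) -> Iterator:
    """Aggregate `source` into batches, apply `fn` per batch, flatten."""
    batch: List = []
    for item in source:
        batch.append(item)
        if len(batch) == batch_size:
            # Apply fn to the full batch and yield its elements one by one.
            yield from fn(batch)
            batch = []
    if batch:
        # The final batch may hold fewer than batch_size elements.
        yield from fn(batch)


# Mirrors the docstring example: increment each element by one.
result = list(map_batches_sketch(range(5), lambda b: [d + 1 for d in b], 3))
# → [1, 2, 3, 4, 5]
```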
Notes
Compared with map, map_batches does not take an output_col argument because the size of fn's output is not guaranteed to match the size of the input batch. With differing sizes, this operation cannot assign data back to the original data structure.

This operation was introduced based on a use case from TorchText: a pybinded C++ vectorized function can be applied per batch for efficiency.
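To make the size mismatch concrete: a batch-level filter is a legitimate fn whose output is shorter than its input batch, so there is no per-row correspondence that an output_col could use. A sketch of that situation, using a simplified pure-Python stand-in for map_batches (the helper name is illustrative, not part of the torchdata API):

```python
from typing import Callable, Iterable, Iterator, List


def apply_in_batches(source: Iterable, fn: Callable[[List], Iterable],
                     batch_size: int) -> Iterator:
    """Simplified stand-in for map_batches: batch, apply fn, flatten."""
    batch: List = []
    for item in source:
        batch.append(item)
        if len(batch) == batch_size:
            yield from fn(batch)
            batch = []
    if batch:
        yield from fn(batch)


# fn drops odd values, so each output batch can be shorter than its input:
# batches [0, 1, 2] and [3, 4] produce [0, 2] and [4] respectively.
keep_even = lambda batch: [d for d in batch if d % 2 == 0]
result = list(apply_in_batches(range(5), keep_even, batch_size=3))
# → [0, 2, 4]; 5 inputs shrank to 3 outputs, so no column of the original
# rows could receive the results one-to-one.
```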