
Pipeline Transform concept #486

@wkornewald

Description


Currently, transformations like encrypt/decrypt and compress/decompress have to be implemented as a Source or a Sink. If you want to support encryption, you have to implement it twice, as EncryptionSource and EncryptionSink, because the decision of where the encryption is applied depends on how you want to structure the code from a security and abstraction perspective. In Node.js, the Streams API already supports such a Transform concept.

What we see with Okio's GzipSink and GzipSource is that you can only decompress a source and compress a sink. That's a bit inflexible, because you have to structure your code around this limitation. Written as a pipeline, it's clear that compression is its own concept, independent of Source and Sink:

source | compress | sink
source | decompress | sink

It would be nice to introduce a general Transform API with source.pipe(transform) -> Source and transform.pipe(sink) -> Sink operations. See Node.js as an example (though not necessarily as a model for how we should structure the API): https://nodejs.org/api/stream.html#class-streamtransform
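As a sketch of what such an API could look like (all names here are hypothetical, and Source/Sink are deliberately simplified stand-ins, not the real kotlinx-io interfaces):

```kotlin
// Simplified stand-ins for illustration; NOT the real kotlinx-io interfaces.
fun interface Source { fun read(): ByteArray? }      // null = end of stream
fun interface Sink { fun write(chunk: ByteArray) }

// A Transform is its own concept, independent of Source and Sink.
fun interface Transform { fun transform(chunk: ByteArray): ByteArray }

// source.pipe(transform) -> Source: the transform runs while reading.
fun Source.pipe(t: Transform): Source = Source { read()?.let(t::transform) }

// transform.pipe(sink) -> Sink: the same transform runs while writing.
fun Transform.pipe(sink: Sink): Sink = Sink { chunk -> sink.write(transform(chunk)) }
```

The same Transform implementation could then be attached to whichever side fits the code's security and abstraction boundaries, e.g. `source.pipe(encrypt)` or `encrypt.pipe(sink)`, instead of writing it twice.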

This is just an example of how this could work (but please let's first agree on the need for a Transform concept): a Transform could, for example, receive an input and an output Buffer and be allowed to use UnsafeBufferOperations on both to transfer data from the source to the sink. It would also need some way to request a minimum amount of data in the input Buffer, and it would only be called again once the Buffer has the requested size or the input has been closed / has reached the end of the stream (in which case the Transform can process the remaining data or fail).
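To illustrate the calling convention described above (everything here is hypothetical: the Buffer is modeled as a plain byte queue, and `minInputSize`/`transform` are made-up names), a driver would only invoke the transform once the requested minimum amount of input is buffered, or once the stream has ended:

```kotlin
// Toy Buffer: a growable byte queue standing in for kotlinx.io.Buffer.
class Buffer {
    private val bytes = ArrayDeque<Byte>()
    val size: Int get() = bytes.size
    fun write(data: ByteArray) = data.forEach { bytes.addLast(it) }
    fun read(count: Int): ByteArray = ByteArray(count) { bytes.removeFirst() }
    fun readAll(): ByteArray = read(size)
}

interface BufferedTransform {
    // transform() is only invoked once this much input is buffered
    // (or the stream has ended and the remainder must be drained).
    val minInputSize: Int
    fun transform(input: Buffer, output: Buffer, endOfStream: Boolean)
}

// Example: a transform over fixed 4-byte blocks (a block cipher would have
// the same shape); here each block is simply reversed. At end of stream the
// final partial block is processed as-is.
class BlockTransform : BufferedTransform {
    override val minInputSize = 4
    override fun transform(input: Buffer, output: Buffer, endOfStream: Boolean) {
        while (input.size >= minInputSize) output.write(input.read(minInputSize).reversedArray())
        if (endOfStream && input.size > 0) output.write(input.readAll().reversedArray())
    }
}

// Driver loop honoring the contract: call transform() only when enough
// input is available, plus one final call at end of stream.
fun runPipeline(transform: BufferedTransform, chunks: List<ByteArray>): ByteArray {
    val input = Buffer(); val output = Buffer()
    for (chunk in chunks) {
        input.write(chunk)
        if (input.size >= transform.minInputSize) transform.transform(input, output, false)
    }
    transform.transform(input, output, true)
    return output.readAll()
}
```

The real implementation would presumably move data between segments via UnsafeBufferOperations instead of copying byte-by-byte; the point here is only the minimum-input contract.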
