Allow to use streaming payload for big objects #114

Closed
opened 2024-01-11 17:44:50 +00:00 by fyrchik · 0 comments

If we store the payload in memory, we are easily limited by the amount of RAM, even though the gRPC native client accepts an arbitrary reader. The same applies to the S3 client, where the payload is wrapped in `bytes.NewReader`.

Why not use this reader directly?
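
A minimal sketch of the difference, assuming the payload is random bytes of a known size (the function names below are illustrative, not from the codebase):

```go
package main

import (
	"bytes"
	"crypto/rand"
	"io"
)

// payloadInMemory mirrors the current approach: the whole payload is
// materialized in RAM first, so object size is bounded by available memory.
func payloadInMemory(size int64) (io.Reader, error) {
	buf := make([]byte, size)
	if _, err := rand.Read(buf); err != nil {
		return nil, err
	}
	return bytes.NewReader(buf), nil
}

// payloadStreaming hands the client a bounded reader over a generator
// instead, so only a small internal buffer is resident at any time.
func payloadStreaming(size int64) io.Reader {
	return io.LimitReader(rand.Reader, size)
}
```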
I see 2 problems with this:

  1. `[]byte` is wrapped in a JS array and passed to the client via the scenario. If we use streams, some "interop" wrapper needs to be used, or the client interface should be refactored.
  2. Payload is generated together with the hash. If we use streams, the hash will be available only _after_ the object is put; this shouldn't be a problem in practice (see the sketch after this list).
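
For problem 2, a minimal sketch of computing the hash while streaming, assuming the client consumes an `io.Reader` (`putObject` and SHA-256 are stand-ins here, not the actual client call or hash algorithm):

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"os"
)

// putObject is a stand-in for the real client call; it only drains the reader.
func putObject(payload io.Reader) error {
	_, err := io.Copy(io.Discard, payload)
	return err
}

func main() {
	f, err := os.Open("payload.bin") // hypothetical payload source
	if err != nil {
		panic(err)
	}
	defer f.Close()

	h := sha256.New()
	// TeeReader feeds every byte the client reads into the hash as well,
	// so the digest is ready as soon as the put finishes.
	if err := putObject(io.TeeReader(f, h)); err != nil {
		panic(err)
	}
	fmt.Printf("%x\n", h.Sum(nil))
}
```

With `io.TeeReader` the digest accumulates as the client reads, so no second pass over the payload is needed; the trade-off is exactly the one noted above: the hash is known only once the put completes.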
fyrchik self-assigned this 2024-01-11 19:16:29 +00:00