Example use cases are streaming IO and large files.
It should be feasible to do something like this:
```elixir
def deserialize_stream(stream, prototype_object \\ %__MODULE__{}) do
  Stream.transform(stream, {prototype_object, ""}, fn data, {object, pvs_remainder} ->
    # Consume as much of the buffered input as possible, emit the
    # updated object, and carry the unparsed remainder forward.
    # (Stream.transform's reducer must return {elements, acc}.)
    {new_object, new_remainder} =
      __MODULE__.deserialize_partial(object, pvs_remainder <> data)

    {[new_object], {new_object, new_remainder}}
  end)
end
```
where `deserialize_partial(t, binary) :: {t, binary}` would deserialize as much of an object as it can get from the input data, merge it with the input object, and return the new object along with whatever data was left over (e.g., a partial field).
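To make that contract concrete, here is a minimal sketch of a `deserialize_partial/2`; the wire format (a 32-bit length prefix followed by a payload) and the single `field` key are hypothetical stand-ins for real Thrift field decoding:

```elixir
# Minimal sketch of the deserialize_partial/2 contract. The length-
# prefixed layout and the :field key are hypothetical placeholders.
def deserialize_partial(object, data) do
  case data do
    # A complete length-prefixed value is available: decode it, merge
    # it into the object, and keep going on the remaining bytes.
    <<len::32, payload::binary-size(len), rest::binary>> ->
      deserialize_partial(%{object | field: payload}, rest)

    # Not enough bytes for a complete value yet: hand back the object
    # as-is plus the unconsumed remainder for the next chunk.
    partial ->
      {object, partial}
  end
end
```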
That’s a good question. I was also realizing there might have to be some kind of “merge” function implemented for each field, i.e., what happens if you have a set and then get more elements of that set in the next stream chunk?
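For example, such a merge might dispatch on the shape of the field’s value, along these lines (a rough sketch; `merge_field/2` is a hypothetical helper, not an existing API):

```elixir
# Hypothetical per-field merge: collection fields accumulate across
# stream chunks, while scalar fields are replaced by the newer value.
defp merge_field(%MapSet{} = old, %MapSet{} = new), do: MapSet.union(old, new)
defp merge_field(old, new) when is_list(old) and is_list(new), do: old ++ new
defp merge_field(old, new) when is_map(old) and is_map(new), do: Map.merge(old, new)
defp merge_field(_old, new), do: new
```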
On Oct 22, 2017, at 2:26 PM, Jon Parise wrote:
That sounds useful to me. How would errors propagate?
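One possibility (only a sketch of one option, nothing settled): `deserialize_partial/2` could return `{:error, reason}` on a bad parse, and the stream wrapper could emit that tuple as the final element and then halt:

```elixir
def deserialize_stream(stream, prototype_object \\ %__MODULE__{}) do
  Stream.transform(stream, {prototype_object, ""}, fn
    # A previous chunk already failed: stop consuming input.
    _data, :halted ->
      {:halt, :halted}

    data, {object, remainder} ->
      case __MODULE__.deserialize_partial(object, remainder <> data) do
        # Parse error: emit the error tuple, then halt on the next call.
        {:error, reason} ->
          {[{:error, reason}], :halted}

        # Progress: emit the updated object, carry the leftover bytes.
        {new_object, new_remainder} ->
          {[new_object], {new_object, new_remainder}}
      end
  end)
end
```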