Add a how-it-works section to the readme
miraclx committed Jul 2, 2020
1 parent e87108c commit f5ba3ca
Showing 1 changed file with 19 additions and 0 deletions.
19 changes: 19 additions & 0 deletions README.md
@@ -68,6 +68,25 @@ xget("https://github.com/microsoft/TypeScript/archive/master.zip", {
Get the master branch of the TypeScript repository,
with 10 simultaneous downloads, retrying each one up to a max of 10 times.

## How it works

``` txt
|progress| | cacher |
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -> chunkmerger [ -> hasher ] -> file
xresilient[axios] -> || part || -> ||cachingstream|| -/
xresilient[axios] -> || part || -> ||cachingstream|| -/
xresilient[axios] -> || part || -> ||cachingstream|| -/
|progress| | cacher |
```

xget infers from a [HEAD](https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods/HEAD) response whether the server supports [byte-ranges](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Accept-Ranges).
If it does, xget opens N connections, each feeding in a non-overlapping segment of the resource. Each request is wrapped in an [xresilient](https://github.com/miraclx/xresilient) stream so broken connections are retried and each segment is likely to complete. The streams are tracked through the [progress bar](https://github.com/miraclx/xprogress), piped into a [caching stream](lib/streamCache.js), then all chunks are [merged](https://github.com/teambition/merge2) sequentially, in place and in order, piped through an optional hasher, and finally written to the output file.
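
As a rough sketch (not xget's actual code), the range-probing step could look like the following with plain axios. The `probe` function and `connections` parameter are illustrative names only; in xget, the per-segment requests are additionally wrapped in xresilient streams for retries.

``` js
const axios = require('axios');

// Probe the resource with a HEAD request to learn its size and whether
// the server advertises byte-range support via the Accept-Ranges header.
// `probe` and `connections` are illustrative names, not xget's API.
async function probe(url, connections = 4) {
  const head = await axios.head(url);
  const size = Number(head.headers['content-length']);
  const rangeable = head.headers['accept-ranges'] === 'bytes' && Number.isFinite(size);

  // No byte-range support: fall back to a single, full-body request.
  if (!rangeable) return [{ start: 0, end: null }];

  // Split the resource into N non-overlapping segments.
  const per = Math.ceil(size / connections);
  return Array.from({ length: connections }, (_, i) => ({
    start: i * per,
    end: Math.min((i + 1) * per - 1, size - 1),
  }));
}

// Each segment is then fetched as a stream with a Range header, e.g.
//   axios.get(url, { responseType: 'stream', headers: { Range: `bytes=${start}-${end}` } });
```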

The caching stream lets later chunks keep downloading while the merger is still writing the first, decoupling download speed from write speed: received chunks are buffered in memory up to a maximum cache limit.
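
A minimal sketch of the idea (not the actual [lib/streamCache.js](lib/streamCache.js) implementation): a pass-through stream whose internal buffer serves as the in-memory cache, with `cacheSize` as a hypothetical limit.

``` js
const { PassThrough } = require('stream');

// Illustrative only: the stream's internal buffer acts as the cache, so a
// segment keeps downloading while the merger is still draining an earlier
// one. `cacheSize` is a hypothetical parameter, not xget's actual option.
function cachingStream(cacheSize = 16 * 1024 * 1024) {
  return new PassThrough({ highWaterMark: cacheSize });
}

// Simplified usage: response.data.pipe(cachingStream()) would feed one
// input of the sequential chunk merger.
```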

## API

### xget(url[, options])
