Non-interactive, chunk-based, web content retriever
Via NPM:
npm install libxget
This installs a CLI binary accessible via the xget command.
# Check if the xget command has been installed and accessible on your path
$ xget -V
v0.8.0
The xget command utilizes the library to retrieve web content in chunks, according to the specification.
# Normal
xget https://google.com/doodle.png
# Write to output file
xget https://google.com/doodle.png image.png
# Piping output
xget https://myserver.io/runtime.log --no-bar | less
# Stream response in real time (e.g. watching a movie)
xget https://service.com/movie.mp4 | vlc -
Use --help
to see full usage documentation.
// Node CommonJS
const xget = require("libxget");
// Or ES6 Modules
import xget from "libxget";
// Or Typescript
import * as xget from "libxget";
xget("https://github.com/microsoft/TypeScript/archive/master.zip", {
chunks: 10,
retries: 10,
}).pipe(fs.createWriteStream("master.zip"));
This fetches the master branch of the TypeScript repository with 10 simultaneous chunk connections, retrying each chunk up to a maximum of 10 times.
|progress| | cacher |
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -\
xresilient[axios] -> || part || -> ||cachingstream|| -> chunkmerger [ -> hasher ] -> file
xresilient[axios] -> || part || -> ||cachingstream|| -/
xresilient[axios] -> || part || -> ||cachingstream|| -/
xresilient[axios] -> || part || -> ||cachingstream|| -/
|progress| | cacher |
xget infers from a HEAD response whether or not the server supports byte ranges. If it does, xget opens N connections, each feeding in a non-overlapping segment of the resource. To survive broken connections, xget wraps each request in an xresilient stream, ensuring proper retries and probable completion of the request. The streams are piped and tracked through the progress bar and into a caching stream; all chunks are then merged sequentially, in place and in order, and piped through an optional hasher into the output file.
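The non-overlapping segmentation can be sketched as follows. splitRange is a hypothetical helper for illustration only, not the library's internal implementation:

```javascript
// Illustrative sketch: divide `size` bytes into `chunks` non-overlapping
// [min, max] byte ranges, the way a chunked downloader might before
// issuing one ranged GET per segment. Not libxget's actual internals.
function splitRange(size, chunks) {
  const base = Math.floor(size / chunks);
  const ranges = [];
  let offset = 0;
  for (let i = 0; i < chunks; i += 1) {
    // the last chunk absorbs any remainder
    const len = i === chunks - 1 ? size - offset : base;
    ranges.push({ min: offset, max: offset + len - 1, size: len });
    offset += len;
  }
  return ranges;
}

console.log(splitRange(10, 3));
// [ { min: 0, max: 2, size: 3 }, { min: 3, max: 5, size: 3 }, { min: 6, max: 9, size: 4 } ]
```

Each range would map onto a Range: bytes=min-max request header for its connection.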
The caching stream lets subsequent chunks begin downloading while the merger is still consuming the first, decoupling the download speed from the write speed. Received chunks are buffered in memory up to a maximum cache limit.
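The buffering idea can be sketched with a toy bounded cache. This only models the backpressure decision and is not the library's caching stream (which is a Node transform stream):

```javascript
// Toy sketch of a bounded in-memory chunk cache: accept buffers until a
// byte limit is hit, then refuse until the consumer drains.
class BoundedCache {
  constructor(maxBytes) {
    this.maxBytes = maxBytes;
    this.bytes = 0;
    this.queue = [];
  }
  push(buf) {
    if (this.bytes + buf.length > this.maxBytes) return false; // caller must wait
    this.queue.push(buf);
    this.bytes += buf.length;
    return true;
  }
  shift() {
    const buf = this.queue.shift();
    if (buf) this.bytes -= buf.length;
    return buf;
  }
}

const cache = new BoundedCache(8);
console.log(cache.push(Buffer.alloc(5))); // true
console.log(cache.push(Buffer.alloc(5))); // false: would exceed the 8-byte cap
cache.shift();
console.log(cache.push(Buffer.alloc(5))); // true again once drained
```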
url: <string>
options: <XGETOptions>
- Returns: <XGETStream>
XGETOptions extends
RequestOpts: Object
chunks: <number> Number of simultaneous chunked downloads. Default: 5
retries: <number> Number of retries for each chunk. Default: 5
timeout: <number> Network response timeout (ms). Default: 20000
start: <number> Position to start feeding the stream from. Default: 0
auto: <boolean> Whether to start the request automatically or wait for a request.start() call (useful when chaining events you want to fire in order). Default: true
size: <number> Number of bytes to stream off the response.
hash: <string> Hash algorithm used to create a crypto.Hash instance computing the stream hash.
cache: <boolean> Whether or not to use an in-memory cache to enable read-aheads of pending chunks.
cacheSize: <number> Custom maximum cache size (bytes).
use: <object> Key-value pairs of middlewares through which to pipe the response object. Keys are strings; values are Transformer-generating functions. (Alternatively, use the xget.use() method.)
with: <object> Key-value pairs of middlewares through which to pipe the dataslice object. Keys are strings; values are functions whose return values are accessible within the store. (Alternatively, use the xget.with() method.)
headHandler: <HeadHandler> An interceptor for the initial HEAD data, useful for programmatically defining a range offset.
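Putting the options together, a full XGETOptions object might look like the sketch below. The values are illustrative examples, not recommendations:

```javascript
// Illustrative XGETOptions object mirroring the fields documented above.
const options = {
  chunks: 10,      // 10 simultaneous chunked downloads
  retries: 10,     // retry each chunk up to 10 times
  timeout: 20000,  // 20s network response timeout
  start: 0,        // begin from the first byte
  auto: true,      // start the request immediately
  hash: "sha256",  // compute a SHA-256 digest of the stream
  cache: true,     // buffer pending chunks in memory
  cacheSize: 16 * 1024 * 1024, // cap the cache at 16 MiB
};

// Hypothetical usage (requires libxget to be installed):
// xget("https://example.com/file.bin", options).pipe(fs.createWriteStream("file.bin"));
console.log(options.cacheSize); // 16777216
```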
xget.store: Map
A map whose keys and values are tags and return types of content processed within the withStack of the xget object.
xget(URL)
.with("variable", () => 5)
.once("set", (store) => {
/*
`store` is a map whose key and values directly match tags and return types within
> a with call or the with object in the xget options
*/
console.log(store.get("variable")); // 5
})
.pipe(FILE);
xget.ended: Boolean
A readonly property that tells whether or not the xget instance has ended.
xget.loaded: Boolean
A readonly property that tells whether or not the xget instance has been loaded.
xget.bytesRead: Number
A readonly property that tells how many bytes have been processed by the underlying streams.
Class: XGETStream extends
stream.Readable
The core multi-chunk request instance.
url: <string>
options: <XGETOptions>
- Returns: <XGETStream>
The 'end'
event is emitted after the data from the URL has been fully flushed.
store
: <xget.store> The shared internal data store.
The 'set' event is emitted after all the middlewares defined in the with option of the XGETOptions, or with the xget.with() method, have been executed.
This event is fired after the 'loaded'
event.
err
: <Error> The error instance.
The 'error' event is emitted once a chunk has reached its maximum number of retries, at which point all other chunk connections are abruptly destroyed.
retrySlice:
meta: <boolean> Whether or not the error causing the retry occurred while getting the URL metadata, i.e. before any streams were employed.
index: <number> The index count of the chunk.
retryCount: <number> The number of retry iterations so far.
maxRetries: <number> The maximum number of retries possible.
bytesRead: <number> The number of bytes previously read (if any).
totalBytes: <number> The total number of bytes that are to be read by the stream.
lastErr: <Error> The error emitted by the previous stream.
store: <xget.store> The shared internal data store.
The 'retry'
event is emitted by every chunk once it has been re-initialized underneath.
Per the spec of the xresilient module, chunks are reinitialized whenever an error event occurs.
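A 'retry' listener might summarize the retrySlice fields above into a log line. formatRetry is a hypothetical helper, and the sample object below mirrors the documented shape with invented values:

```javascript
// Hypothetical formatter for the documented retrySlice shape.
function formatRetry(slice) {
  if (slice.meta) return "retrying metadata (HEAD) request";
  return (
    `chunk ${slice.index}: retry ${slice.retryCount}/${slice.maxRetries}, ` +
    `${slice.bytesRead}/${slice.totalBytes} bytes read`
  );
}

// Sample slice mirroring the fields listed above (values invented):
const sample = {
  meta: false,
  index: 2,
  retryCount: 1,
  maxRetries: 5,
  bytesRead: 1024,
  totalBytes: 4096,
  lastErr: new Error("socket hang up"),
};
console.log(formatRetry(sample)); // chunk 2: retry 1/5, 1024/4096 bytes read

// Hypothetical wiring (requires libxget):
// xget(URL).on("retry", (slice) => console.error(formatRetry(slice)));
```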
loadData
: <LoadData> The pre-computed config for the loaded data slice.
This is emitted immediately after the HEAD data is received, preprocessed, parsed, and used to tailor the configuration for the chunk setup.
This loadData
contains information like the actual size of the remote file and whether or not the server supports multiple connections, chunking, file resumption, etc.
This event is fired prior to the 'set'
event.
- Returns: <boolean>
Starts the request process if options.auto
was set to false.
Returns true
if the request was started, false
if it had already been started.
Calculates the digest of all data that has been processed by the library and its middleware transformers. This creates a deep copy of the internal state of the current crypto.Hash object, from which it calculates the digest.
This ensures you can get a hash of a snapshot of the data even while still streaming from the URL response.
- Returns: <string>
Returns the hash algorithm if any is in use.
fn
: <HeadHandler> Handler to be set.- Returns: <boolean> Whether or not the handler was successfully set.
Sets an interceptor for the initial HEAD data, useful for programmatically defining a range offset. Returns false if the request has already been loaded; true if successfully set.
size
: <number>- Returns: <XGETStream>
Set maximum capacity for internal cache.
tag
: <string>handler
: <UseMiddlewareFn>- Returns: <XGETStream>
Add a named handler to the use middleware stack whose return value would be used to transform the response stream in a series of pipes.
The handler
method is called after the stream is requested from and we start pumping the underlying request
instances for a data response stream.
The core expects the handler to return a stream.Duplex instance (a readable, writable stream) to transform or pass through the raw data streams along the way.
// Example, compressing the response content in real time
xget(URL)
.use("compressor", () => zlib.createGzip())
.pipe(createWriteStreamSomehow());
tag
: <string>handler
: <WithMiddlewareFn>- Returns: <XGETStream>
Add a named handler
to the with middleware stack whose return value would be stored within the store after execution.
xget(URL)
.with("bar", ({ size }) => progressBar(size)) // Create a finite-sized progress bar
.use("bar", (_, store) => store.get("bar").genStream()) // Create a stream handling object that updates the progressbar from the number of bytes flowing through itself
.once("set", (store) => store.get("bar").print("Downloading..."))
.pipe(createWriteStreamSomehow());
Extract data from an error if it was either thrown from within a UseMiddlewareFn or a WithMiddlewareFn function.
xget(URL)
.use('errorThrower', () => {
throw new Error('Custom error being thrown');
})
.once('error', err => {
const {tag, source} = xget.getErrContext(err);
if (source)
console.log(`Error thrown from within the [${tag}] method of the [${source}] middleware`);
// Error thrown from within the [errorThrower] method of the [xget:use] middleware
})
.pipe(createWriteStreamSomehow());
HeadHandler: function
props: <object>
headers: <IncomingHttpHeaders> HEAD headers from the URL.
acceptsRanges: <boolean> Whether or not the URL resource accepts byte ranges.
- Returns: <number | void> An offset to begin streaming from. Analogous to start. If void, defaults to .start or 0.
An interceptor for the initial HEAD data, useful for programmatically defining a range offset.
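Because a HeadHandler is a plain function, it can be written and exercised standalone. resumeOffset below is a hypothetical example that computes a resume point, tested against a mock props object:

```javascript
// Hypothetical HeadHandler: resume from `alreadyHave` bytes when the
// server accepts byte ranges; otherwise return undefined so the stream
// falls back to `.start` (or 0).
const alreadyHave = 2048; // e.g. the size of a partially downloaded file

function resumeOffset(props) {
  if (!props.acceptsRanges) return undefined;
  const total = Number(props.headers["content-length"] || 0);
  return Math.min(alreadyHave, total);
}

// Exercising it with mock props objects:
console.log(resumeOffset({ acceptsRanges: true, headers: { "content-length": "8192" } })); // 2048
console.log(resumeOffset({ acceptsRanges: false, headers: {} })); // undefined

// Hypothetical wiring (requires libxget):
// xget(URL, { headHandler: resumeOffset }).pipe(createWriteStreamSomehow());
```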
LoadData: Object
url: <string> The URL specified.
size: <number> Finite number returned if the server responds appropriately, else Infinity.
start: <number> Sticks to specification if the server allows chunking via content-ranges; else, resets to 0.
chunkable: <boolean> Whether or not the URL feed can be chunked, supporting simultaneous connections.
totalSize: <number> Actual size of the resource without an offset.
chunkStack: <ChunkLoadInstance[]> The chunkstack array.
headers: <IncomingHttpHeaders> The received headers.
ChunkLoadInstance: Object
min: <number> The minimum extent for the chunk segment range.
max: <number> The maximum extent for the chunk segment range.
size: <number> The total size of the chunk segment.
stream: <ResilientStream> A resilient stream that wraps around a request instance.
WithMiddlewareFn: Function
loadData: <LoadData>
This handler is called immediately after the metadata describing the response is loaded from the URL.
That is, pre-streaming data from the HEAD, like size (content-length), content-type, filename (content-disposition), whether or not it's chunkable (accept-ranges), and a couple of other criteria.
This information is passed into the handler, whose return value is filed within the store, referenced by the tag.
UseMiddlewareFn: Function
dataSlice: <ChunkLoadInstance>
store: <xget.store>
- Returns: <stream.Duplex>
- To avoid cluttering the terminal while using pipes, direct other chained binaries' stdout and stderr to /dev/null
# Watching from a stream, hiding vlc's log information
xget https://myserver.com/movie.mp4 | vlc - > /dev/null 2>&1
Feel free to clone, use in adherence to the license, and perhaps send pull requests.
git clone https://github.com/miraclx/libxget-js.git
cd libxget-js
npm install
# hack on code
npm run build
Apache 2.0 © Miraculous Owonubi (@miraclx) <omiraculous@gmail.com>