Internal event handler modifications as fix for #10
nathanpeck committed Aug 22, 2014
1 parent 65eef85 commit 7b1a0ee
Showing 4 changed files with 15 additions and 17 deletions.
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -1,17 +1,23 @@
Changelog
=========

#### 0.6.1 (2014-08-22)

* The internal event emitter wasn't set up properly, causing errors about the upload stream object not having the `.emit` and/or `.once` methods. This bug impacted versions 0.5.0 and 0.6.0. Fixes issue #10.

#### 0.6.0 (2014-08-15)

* Fix for mismatch between documentation and reality in the maxPartSize() and concurrentParts() options.
* New feature: part size and concurrent part helpers can now be chained.
* *Warning, this version has a critical bug. It is recommended that you use 0.6.1 instead*

#### 0.5.0 (2014-08-11)

* Added client caching to reuse an existing s3 client rather than creating a new one for each upload. Fixes #6
* Updated the maxPartSize to be a hard limit instead of a soft one so that generated ETags are consistent due to the reliable size of the uploaded parts. Fixes #7
* Added this file. Fixes #8
* New feature: concurrent part uploads. Now you can optionally enable concurrent part uploads if you wish to allow your application to drain the source stream more quickly and absorb some of the backpressure when uploading to S3.
* *Warning, this version has a critical bug. It is recommended that you use 0.6.1 instead*
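The client-caching change in this release can be sketched as follows (a simplified illustration with stand-in names; the real module caches an `AWS.S3` client in a module-level variable):

```javascript
// Simplified sketch of client caching (stand-in names; the real module
// caches an AWS.S3 client): the first call creates a client, and later
// calls reuse it instead of constructing a new one per upload.
var cachedClient;

function createClient(config) {
  // stand-in for: new AWS.S3(config)
  return { config: config };
}

function getClient(config) {
  if (!cachedClient) {
    cachedClient = createClient(config);
  }
  return cachedClient;
}

var first = getClient({ region: 'us-east-1' });
var second = getClient({ region: 'us-east-1' });
console.log(first === second); // true: one client instance serves every upload
```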

#### 0.4.0 (2014-06-23)

12 changes: 2 additions & 10 deletions README.md
@@ -6,17 +6,9 @@ A pipeable write stream which uploads to Amazon S3 using the multipart file uplo

### Changelog

#### 0.6.0 (2014-08-15)
#### 0.6.1 (2014-08-22)

* Fix for mismatch between documentation and reality in the maxPartSize() and concurrentParts() options.
* New feature: part size and concurrent part helpers can now be chained.

#### 0.5.0 (2014-08-11)

* Added client caching to reuse an existing s3 client rather than creating a new one for each upload. Fixes #6
* Updated the maxPartSize to be a hard limit instead of a soft one so that generated ETags are consistent due to the reliable size of the uploaded parts. Fixes #7
* Added a changelog.md file. Fixes #8
* New feature: concurrent part uploads. Now you can optionally enable concurrent part uploads if you wish to allow your application to drain the source stream more quickly and absorb some of the backpressure from a fast incoming stream when uploading to S3.
* Fix for an issue with the internal event emitter being improperly attached. This issue caused crashes in v0.5.0 and v0.6.0, so it is recommended that you upgrade to v0.6.1 if you are using one of the affected versions.

[Historical Changelogs](CHANGELOG.md)

12 changes: 6 additions & 6 deletions lib/s3-upload-stream.js
@@ -1,6 +1,5 @@
var Writable = require('stream').Writable,
util = require("util"),
EventEmitter = require("events").EventEmitter,
events = require("events"),
AWS = require('aws-sdk');

var cachedClient;
@@ -13,6 +12,7 @@ module.exports = {
// Generate a writeable stream which uploads to a file on S3.
Uploader: function (connection, destinationDetails, doneCreatingUploadStream) {
var self = this;
var e = new events.EventEmitter();

if (arguments.length == 2) {
// No connection passed in, assume that the connection details were already specified using
@@ -50,6 +50,8 @@ module.exports = {
highWaterMark: 4194304 // 4 MB
});

events.EventEmitter.call(self);

// Data pertaining to the overall upload
self.partNumber = 1;
self.partIds = [];
@@ -104,14 +106,14 @@ module.exports = {
else {
// Block uploading (and receiving of more data) until we upload
// some of the pending parts
self.once('chunk', upload);
e.once('chunk', upload);
}

function upload() {
self.pendingParts++;
self.flushPart(function (partDetails) {
--self.pendingParts;
self.emit('chunk'); // Internal event
e.emit('chunk'); // Internal event
self.ws.emit('chunk', partDetails); // External event
});
next();
@@ -267,5 +269,3 @@ module.exports = {
);
}
};

util.inherits(module.exports.Uploader, EventEmitter);
2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
{
"name": "s3-upload-stream",
"description": "Writeable stream for uploading content of unknown size to S3 via the multipart API.",
"version": "0.6.0",
"version": "0.6.1",
"author": {
"name": "Nathan Peck",
"email": "nathan@storydesk.com"
