@tus/s3-store: finalize incomplete parts #502

Closed · wants to merge 1 commit
6 changes: 6 additions & 0 deletions packages/s3-store/index.ts
@@ -504,7 +504,13 @@ export class S3Store extends DataStore {

      if (metadata.file.size === newOffset) {
        try {
          // If no parts exist yet, then the incomplete part needs to be completed
Collaborator (review comment on the line above):

Suggested change
- // If no parts exist yet, then the incomplete part needs to be completed
+ // If there is any incomplete part it needs to be completed

          const incompletePart = await this.getIncompletePart(id)
          if (incompletePart) {
            await this.uploadPart(metadata, incompletePart, nextPartNumber)
          }
          const parts = await this.retrieveParts(id)

          await this.finishMultipartUpload(metadata, parts as Array<AWS.Part>)
          this.clearCache(id)
        } catch (error) {
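For readers outside the codebase: the hunk above relies on two existing S3Store helpers, getIncompletePart and uploadPart. Below is a minimal sketch of what such helpers could look like against the AWS SDK v3, assuming the incomplete part is kept as a separate `<id>.part` object in the same bucket; the names, parameters, and error handling here are illustrative assumptions, not the store's actual implementation.

import {GetObjectCommand, S3Client, UploadPartCommand} from '@aws-sdk/client-s3'
import type {Readable} from 'node:stream'

// Sketch only: shapes are assumptions, not the @tus/s3-store API.
async function getIncompletePartSketch(
  client: S3Client,
  bucket: string,
  id: string
): Promise<Readable | undefined> {
  try {
    const data = await client.send(
      new GetObjectCommand({Bucket: bucket, Key: `${id}.part`})
    )
    return data.Body as Readable
  } catch (error) {
    // A missing `.part` object simply means there is no incomplete part.
    if ((error as Error).name === 'NoSuchKey') {
      return undefined
    }
    throw error
  }
}

async function uploadPartSketch(
  client: S3Client,
  bucket: string,
  id: string,
  uploadId: string,
  body: Readable,
  partNumber: number
): Promise<string | undefined> {
  // Promote the buffered data to a proper part of the multipart upload.
  const data = await client.send(
    new UploadPartCommand({
      Bucket: bucket,
      Key: id,
      UploadId: uploadId,
      PartNumber: partNumber,
      Body: body,
    })
  )
  return data.ETag
}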
35 changes: 35 additions & 0 deletions packages/s3-store/test.ts
@@ -113,6 +113,41 @@ describe('S3DataStore', function () {
    }
  })

  it('completes an incomplete part when deferred length becomes resolved', async function () {
    const store = this.datastore
    const incompleteSize = 2 * 1024 * 1024 // 2MB
    const uploadIncompletePart = sinon.spy(store, 'uploadIncompletePart')
    const uploadPart = sinon.spy(store, 'uploadPart')
    const finishMultipartUpload = sinon.spy(store, 'finishMultipartUpload')
    const upload = new Upload({
      id: 'deferred-incomplete-part-test-' + Uid.rand(),
      // Deferred length
      size: undefined,
      offset: 0,
    })
    let offset = upload.offset
    await store.create(upload)

    // Upload a single chunk small enough to create an incomplete part
    offset = await store.write(
      Readable.from(Buffer.alloc(incompleteSize)),
      upload.id,
      offset
    )
    assert.equal(uploadIncompletePart.called, true)
    assert.equal(uploadPart.called, false)

    // Simulate the completion PATCH of a deferred length multipart upload
    // Resolve the deferred length
    await store.declareUploadLength(upload.id, incompleteSize)
    // Notify the store to complete the multipart upload (empty payload)
    await store.write(Readable.from(Buffer.alloc(0)), upload.id, offset)
    // The incomplete part will now be completed
    assert.equal(uploadPart.called, true)
    // The multipart upload will now be completed
    assert.equal(finishMultipartUpload.called, true)
  })

Collaborator (review comment):

I prepared another test that checks whether it works when there is one complete part and one incomplete part.

  it('completes an incomplete part when deferred length becomes resolved', async function () {
    const store = this.datastore
    const firstPartSize = 8 * 1024 * 1024 // 8MB
    const secondPartSize = 2 * 1024 * 1024 // 2MB

    const uploadPart = sinon.spy(store, 'uploadPart')
    const getIncompletePart = sinon.spy(store, 'getIncompletePart')
    const uploadIncompletePart = sinon.spy(store, 'uploadIncompletePart')
    const finishMultipartUpload = sinon.spy(store, 'finishMultipartUpload')

    sinon.stub(store, 'calcOptimalPartSize').callsFake(() => firstPartSize)

    const upload = new Upload({
      id: 'deferred-incomplete-part-test-' + Uid.rand(),
      // Deferred length
      size: undefined,
      offset: 0,
    })

    await store.create(upload)

    // Upload a single chunk large enough to create a part
    {
      upload.offset = await store.write(
        Readable.from(Buffer.alloc(firstPartSize)),
        upload.id,
        upload.offset
      )
      assert.equal(uploadPart.called, true)
      assert.equal(uploadIncompletePart.called, false)
    }

    // Upload a single chunk small enough to create an incomplete part
    {
      uploadPart.resetHistory()
      uploadIncompletePart.resetHistory()

      upload.offset = await store.write(
        Readable.from(Buffer.alloc(secondPartSize)),
        upload.id,
        upload.offset
      )
      assert.equal(uploadIncompletePart.called, true)
      assert.equal(uploadPart.called, false)
    }

    // Simulate the completion PATCH of a deferred length multipart upload
    {
      getIncompletePart.resetHistory()
      uploadIncompletePart.resetHistory()

      // Resolve the deferred length
      await store.declareUploadLength(upload.id, firstPartSize + secondPartSize)
      // Notify the store to complete the multipart upload (empty payload)
      await store.write(Readable.from(Buffer.alloc(0)), upload.id, upload.offset)
      // Final write needs to check for incomplete part
      assert.equal(getIncompletePart.called, true)
      // The incomplete part will now be completed
      assert.equal(uploadPart.called, true)
      // The multipart upload will now be completed
      assert.equal(finishMultipartUpload.called, true)
    }
  })
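For context, the flow both tests simulate maps to the tus creation-defer-length extension: the client creates the upload with Upload-Defer-Length: 1, streams data in PATCH requests, and sends Upload-Length once the total size is known, at which point the store finalizes any incomplete part. A rough client-side sketch using tus-js-client follows; the endpoint and input file are placeholders, and the options shown are an illustration rather than part of this PR.

import {createReadStream} from 'node:fs'
import {Upload} from 'tus-js-client'

// Placeholder input and endpoint; any tus server backed by @tus/s3-store applies.
const file = createReadStream('./video.mp4')

const upload = new Upload(file, {
  endpoint: 'https://tus.example.com/files/', // assumed endpoint
  uploadLengthDeferred: true, // creation request sends Upload-Defer-Length: 1
  chunkSize: 2 * 1024 * 1024, // a small final chunk exercises the incomplete-part path
  onError: (error) => console.error('upload failed:', error),
  onSuccess: () => console.log('upload finished'),
})

upload.start()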

  it('upload as multipart upload when incomplete part grows beyond minimal part size', async function () {
    const store = this.datastore
    const size = 10 * 1024 * 1024 // 10MiB