pacote - npm Package Compare versions

Comparing version 1.0.0 to 2.0.0

CHANGELOG.md

@@ -5,2 +5,64 @@ # Change Log

<a name="2.0.0"></a>
# [2.0.0](https://github.com/zkat/pacote/compare/v1.0.0...v2.0.0) (2017-04-15)
### Bug Fixes
* **api:** use npa@5 for spec parsing (#78) ([3f56298](https://github.com/zkat/pacote/commit/3f56298))
* **deprecated:** remove underscore from manifest._deprecated ([9f4af93](https://github.com/zkat/pacote/commit/9f4af93))
* **directory:** add _resolved to directory manifests ([1d305db](https://github.com/zkat/pacote/commit/1d305db))
* **directory:** return null instead of throwing ([d35630d](https://github.com/zkat/pacote/commit/d35630d))
* **finalize:** don't try to cache manifests we can't get a good key for ([8ab1758](https://github.com/zkat/pacote/commit/8ab1758))
* **finalize:** refactored finalize-manifest code + add _integrity=false sentinel ([657b7fa](https://github.com/zkat/pacote/commit/657b7fa))
* **git:** cleaner handling of git tarball streams when caching ([11acd0a](https://github.com/zkat/pacote/commit/11acd0a))
* **git:** emit manifests from git tarball handler ([b139d4b](https://github.com/zkat/pacote/commit/b139d4b))
* **git:** fix .git exclusion, set mtime = 0 to make tarballs idempotent ([9a9fa1b](https://github.com/zkat/pacote/commit/9a9fa1b))
* **git:** fix fallback order and only fall back on hosted shortcuts ([551cb33](https://github.com/zkat/pacote/commit/551cb33))
* **git:** fix filling-out of git manifests ([95e807c](https://github.com/zkat/pacote/commit/95e807c))
* **git:** got dir packer option working with git ([7669b3e](https://github.com/zkat/pacote/commit/7669b3e))
* **headers:** nudge around some headers to make things behave ([db1e0a1](https://github.com/zkat/pacote/commit/db1e0a1))
* **manifest:** get rid of resolved-with-non-error warning ([d4d4917](https://github.com/zkat/pacote/commit/d4d4917))
* **manifest:** stop using digest for manifests ([4ddd2f5](https://github.com/zkat/pacote/commit/4ddd2f5))
* **opts:** bring opt-check up to date ([564419e](https://github.com/zkat/pacote/commit/564419e))
* **opts:** rename refreshCache to preferOnline cause much clearer ([94171d6](https://github.com/zkat/pacote/commit/94171d6))
* **prefetch:** fall back to the _integrity in the manifest if none calculated ([083ac79](https://github.com/zkat/pacote/commit/083ac79))
* **prefetch:** if there's no stream, just skip (for directory) ([714de91](https://github.com/zkat/pacote/commit/714de91))
* **registry:** fix error handling for registry tarballs ([e69539f](https://github.com/zkat/pacote/commit/e69539f))
* **registry:** nudging logging stuff around a bit ([61d62cc](https://github.com/zkat/pacote/commit/61d62cc))
* **registry:** only send auth info if tarball is hosted on the same registry ([1de5a2b](https://github.com/zkat/pacote/commit/1de5a2b))
* **registry:** redirect tarball urls to provided registry port+protocol if same host ([f50167e](https://github.com/zkat/pacote/commit/f50167e))
* **registry:** support memoizing packuments ([e7fff31](https://github.com/zkat/pacote/commit/e7fff31))
* **registry:** treat registry cache as "private" -- bumps m-f-h ([6fa1503](https://github.com/zkat/pacote/commit/6fa1503))
### Features
* **directory:** implement local dir packing ([017d989](https://github.com/zkat/pacote/commit/017d989))
* **fetch:** bump make-fetch-happen for new restarts ([cf90716](https://github.com/zkat/pacote/commit/cf90716))
* **git:** support pulling in git submodules ([5825d33](https://github.com/zkat/pacote/commit/5825d33))
* **integrity:** replace http client (#72) ([189cdd2](https://github.com/zkat/pacote/commit/189cdd2))
* **prefetch:** return cache-related info on prefetch ([623b7f3](https://github.com/zkat/pacote/commit/623b7f3))
* **registry:** allow injection of request agents ([805e5ae](https://github.com/zkat/pacote/commit/805e5ae))
* **registry:** fast request pooling ([321f84b](https://github.com/zkat/pacote/commit/321f84b))
* **registry:** registry requests now follow cache spec more closely, respect Age, etc ([9e47098](https://github.com/zkat/pacote/commit/9e47098))
### BREAKING CHANGES
* **api:** spec objects can no longer be realize-package-specifier objects. Pass a string, or generate spec objects with npa@>=5.
* **integrity:** This PR replaces a pretty fundamental chunk of pacote.
* Caching now follows standard-ish cache rules for http-related requests.
* manifest() no longer includes the `_shasum` field. It's been replaced by `_integrity`, which is a Subresource Integrity hash string containing equivalent data. These strings can be parsed and managed using https://npm.im/ssri.
* Any functions that accepted `opts.digest` and/or `opts.hashAlgorithm` now expect `opts.integrity` instead.
* Packuments and finalized manifests are now cached using sha512. Tarballs can start using that hash (or any other more secure hash) once registries start supporting them: `packument.dist.integrity` will be prioritized over `packument.shasum`.
* If opts.offline is used, an `ENOCACHE` error will be returned.
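In practice, the old `opts.digest` + `opts.hashAlgorithm` pair collapses into one `opts.integrity` Subresource Integrity string. A minimal sketch of that mapping, assuming hex-encoded digests (the helper name is hypothetical; real SRI handling should go through https://npm.im/ssri):

```javascript
// Hypothetical migration helper (not part of pacote) showing how a 1.x
// digest + hashAlgorithm pair maps onto a single 2.x `integrity` string.
// An SRI string has the form '<algorithm>-<base64-of-raw-digest>'.
function migrateOpts (legacyOpts) {
  const opts = Object.assign({}, legacyOpts)
  if (opts.digest) {
    const algo = opts.hashAlgorithm || 'sha1'
    const b64 = Buffer.from(opts.digest, 'hex').toString('base64')
    opts.integrity = `${algo}-${b64}`
    delete opts.digest
    delete opts.hashAlgorithm
  }
  return opts
}
```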
<a name="1.0.0"></a>

@@ -7,0 +69,0 @@ # [1.0.0](https://github.com/zkat/pacote/compare/v0.1.1...v1.0.0) (2017-03-17)

extract.js

@@ -7,5 +7,7 @@ 'use strict'

 const extractStream = require('./lib/extract-stream')
+const npa = require('npm-package-arg')
 const pipe = BB.promisify(require('mississippi').pipe)
 const optCheck = require('./lib/util/opt-check')
-const rps = BB.promisify(require('realize-package-specifier'))
+const retry = require('promise-retry')
 const rimraf = BB.promisify(require('rimraf'))

@@ -15,10 +17,19 @@ module.exports = extract

 opts = optCheck(opts)
-if (opts.digest) {
-  opts.log.silly('extract', 'trying ', spec, ' digest:', opts.digest)
+spec = typeof spec === 'string' ? npa(spec, opts.where) : spec
+const startTime = Date.now()
+if (opts.integrity && opts.cache && !opts.preferOnline) {
+  opts.log.silly('pacote', 'trying', spec.name, 'by hash:', opts.integrity.toString())
   return extractByDigest(
-    dest, opts
+    startTime, spec, dest, opts
   ).catch(err => {
-    if (err && err.code === 'ENOENT') {
-      opts.log.silly('extract', 'digest for', spec, 'not present. Using manifest.')
-      return extractByManifest(spec, dest, opts)
+    if (err.code === 'ENOENT') {
+      opts.log.silly('pacote', `data for ${opts.integrity} not present. Using manifest.`)
+      return extractByManifest(startTime, spec, dest, opts)
+    } else if (err.code === 'EINTEGRITY') {
+      opts.log.warn('pacote', `cached data for ${opts.integrity} failed integrity check. Refreshing cache.`)
+      return cleanUpCached(
+        dest, opts.cache, opts.integrity, opts
+      ).then(() => {
+        return extractByManifest(startTime, spec, dest, opts)
+      })
     } else {

@@ -29,22 +40,45 @@ throw err

 } else {
-  opts.log.silly('extract', 'no digest provided for ', spec, '- extracting by manifest')
-  return extractByManifest(spec, dest, opts)
+  opts.log.silly('pacote', 'no tarball hash provided for', spec.name, '- extracting by manifest')
+  return retry((tryAgain, attemptNum) => {
+    return extractByManifest(
+      startTime, spec, dest, opts
+    ).catch(err => {
+      // We're only going to retry at this level if the local cache might
+      // have gotten corrupted.
+      if (err.code === 'EINTEGRITY' && opts.cache) {
+        opts.log.warn('pacote', `tarball integrity check for ${spec.name}@${spec.saveSpec || spec.fetchSpec} failed. Clearing cache entry. ${err.message}`)
+        return cleanUpCached(
+          dest, opts.cache, err.sri, opts
+        ).then(() => tryAgain(err))
+      } else {
+        throw err
+      }
+    })
+  }, {retries: 1})
 }
 }
-function extractByDigest (dest, opts) {
+function extractByDigest (start, spec, dest, opts) {
   const xtractor = extractStream(dest, opts)
-  const cached = cache.get.stream.byDigest(opts.cache, opts.digest, opts)
-  return pipe(cached, xtractor)
+  const cached = cache.get.stream.byDigest(opts.cache, opts.integrity, opts)
+  return pipe(cached, xtractor).then(() => {
+    opts.log.verbose('pacote', `${spec.name}@${spec.saveSpec || spec.fetchSpec} extracted to ${dest} by content address ${Date.now() - start}ms`)
+  })
 }
-function extractByManifest (spec, dest, opts) {
-  const res = typeof spec === 'string'
-    ? rps(spec, opts.where)
-    : BB.resolve(spec)
+function extractByManifest (start, spec, dest, opts) {
   const xtractor = extractStream(dest, opts)
-  return res.then(res => {
-    const tarball = require('./lib/handlers/' + res.type + '/tarball')
-    return pipe(tarball(res, opts), xtractor)
+  return BB.resolve(() => {
+    const tarball = require('./lib/handlers/' + spec.type + '/tarball')
+    return pipe(tarball(spec, opts), xtractor)
+  }).then(() => {
+    opts.log.verbose('pacote', `${spec.name}@${spec.saveSpec || spec.fetchSpec} extracted in ${Date.now() - start}ms`)
+  })
 }
+function cleanUpCached (dest, cachePath, integrity, opts) {
+  return BB.join(
+    rimraf(dest),
+    cache.rm.content(cachePath, integrity, opts)
+  )
+}
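The new extract path retries exactly once when a cached tarball fails its checksum: clear the corrupt cache entry, then re-extract. The same control flow isolated from pacote's internals (`extract` and `cleanCache` below are hypothetical stand-ins for `extractByManifest` and `cleanUpCached`):

```javascript
// Sketch of the clear-cache-and-retry-once pattern used for EINTEGRITY
// errors: on checksum failure, purge the cache entry and try again, up to
// `retries` more times; any other error propagates immediately.
function extractWithRetry (extract, cleanCache, retries) {
  return extract().catch(err => {
    if (err.code === 'EINTEGRITY' && retries > 0) {
      // The local cache entry is corrupt: remove it, then try again.
      return cleanCache().then(() =>
        extractWithRetry(extract, cleanCache, retries - 1))
    }
    throw err
  })
}
```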

@@ -6,5 +6,3 @@ 'use strict'

const cache = require('./cache')
-const checksumStream = require('checksum-stream')
-const dezalgo = require('dezalgo')
-const finished = require('mississippi').finished
+const finished = BB.promisify(require('mississippi').finished)
const gunzip = require('./util/gunzip-maybe')

@@ -15,7 +13,24 @@ const minimatch = require('minimatch')

const path = require('path')
-const pipe = require('mississippi').pipe
-const pipeline = require('mississippi').pipeline
+const pipe = BB.promisify(require('mississippi').pipe)
+const ssri = require('ssri')
const tar = require('tar-stream')
const through = require('mississippi').through
// `finalizeManifest` takes as input the various kinds of manifests that
// manifest handlers ('lib/handlers/*/manifest.js') return, and makes sure they
// are:
//
// * filled out with any required data that the handler couldn't fill in
// * formatted consistently
// * cached so we don't have to repeat this work more than necessary
//
// The biggest thing this package might do is do a full tarball extraction in
// order to find missing bits of metadata required by the npm installer. For
// example, it will fill in `_shrinkwrap`, `_integrity`, and other details that
// the plain manifest handlers would require a tarball to fill out. If a
// handler returns everything necessary, this process is skipped.
//
// If we get to the tarball phase, the corresponding tarball handler for the
// requested type will be invoked and the entire tarball will be read from the
// stream.
//
module.exports = finalizeManifest

@@ -25,25 +40,31 @@ function finalizeManifest (pkg, spec, opts) {

opts = optCheck(opts)
opts.memoize = true
return ((opts.cache && key) ? cache.get.info(opts.cache, key, opts).then(res => {
if (!res) { throw new Error('cache miss') }
return new Manifest(res.metadata)
}) : BB.reject()).catch(() => {
return BB.fromNode(cb => {
tarballedProps(pkg, spec, opts, function (err, props) {
if (err) { return cb(err) }
// normalize should not add any fields, and once
// makeManifest completes, it should never be modified.
var result = pkg && pkg.name
const cachedManifest = (opts.cache && key && !opts.preferOnline)
? cache.get.info(opts.cache, key, opts)
: BB.resolve(null)
return cachedManifest.then(cached => {
if (cached && cached.metadata.manifest) {
return new Manifest(cached.metadata.manifest)
} else {
return tarballedProps(pkg, spec, opts).then(props => {
return pkg && pkg.name
? new Manifest(pkg, props)
: new Manifest(props)
if (opts.cache) {
opts.metadata = result
cache.put(
opts.cache, key || finalKey(result, spec), '.', opts
).then(() => cb(null, result), cb)
}).then(manifest => {
const cacheKey = key || finalKey(manifest, spec)
if (!opts.cache || !cacheKey) {
return manifest
} else {
cb(null, result)
opts.metadata = {
id: manifest._id,
manifest,
type: 'finalized-manifest'
}
return cache.put(
opts.cache, cacheKey, '.', opts
).then(() => manifest)
}
})
})
}
})

@@ -67,6 +88,6 @@ }

this.peerDependencies = pkg.peerDependencies || {}
this.deprecated = pkg.deprecated || false
// These depend entirely on each handler
this._resolved = pkg._resolved
this._deprecated = pkg._deprecated || pkg.deprecated || false

@@ -76,3 +97,3 @@ // Not all handlers (or registries) provide these out of the box,

// These are details required by the installer.
-this._shasum = pkg._shasum || fromTarball._shasum
+this._integrity = pkg._integrity || fromTarball._integrity
this._shrinkwrap = pkg._shrinkwrap || fromTarball._shrinkwrap || null

@@ -93,3 +114,3 @@ this.bin = pkg.bin || fromTarball.bin || null

-this._id = null // filled in by normalize-package-data, but unnecessary
+this._id = null

@@ -110,5 +131,3 @@ // TODO - freezing and inextensibility pending npm changes. See test suite.

// from the stream.
-function tarballedProps (pkg, spec, opts, cb) {
-  cb = dezalgo(cb)
-  const extraProps = {}
+function tarballedProps (pkg, spec, opts) {
const needsShrinkwrap = (!pkg || (

@@ -123,94 +142,101 @@ pkg._hasShrinkwrap !== false &&

))
-const needsShasum = !pkg || !pkg._shasum
+const needsHash = !pkg || (!pkg._integrity && pkg._integrity !== false)
 const needsManifest = !pkg || !pkg.name
+const needsExtract = needsShrinkwrap || needsBin || needsManifest
-if (!needsShrinkwrap && !needsBin && !needsShasum && !needsManifest) {
+if (!needsShrinkwrap && !needsBin && !needsHash && !needsManifest) {
   opts.log.silly('finalize-manifest', 'Skipping tarball extraction -- nothing needed.')
-  return cb(null, extraProps)
+  return BB.resolve({})
} else {
opts = optCheck(opts)
opts.memoize = false
const tarball = require('./handlers/' + spec.type + '/tarball')
const tarData = tarball.fromManifest(pkg, spec, opts)
let shaStream = null
let extractorStream = null
let paths = []
if (needsShrinkwrap || needsBin || needsManifest) {
opts.log.silly('finalize-manifest', 'parsing tarball for', spec.name)
const dataStream = tar.extract()
extractorStream = pipeline(gunzip(), dataStream)
dataStream.on('entry', (header, fileStream, next) => {
const filePath = header.name.replace(/[^/]+\//, '')
if (
(needsShrinkwrap && filePath === 'npm-shrinkwrap.json') ||
(needsManifest && filePath === 'package.json')
) {
let data = ''
fileStream.on('data', d => { data += d })
return finished(fileStream, err => {
if (err) { return dataStream.emit('error', err) }
let parsed
try {
parsed = JSON.parse(data)
next()
} catch (e) {
dataStream.emit('error', e)
}
if (filePath === 'package.json') {
Object.keys(parsed).forEach(k => {
if (extraProps[k] == null) {
extraProps[k] = parsed[k]
const tarHandler = require('./handlers/' + spec.type + '/tarball')
const tarStream = tarHandler.fromManifest(pkg, spec, opts)
const extracted = needsExtract && tar.extract()
extracted && extracted.on('entry', (h, str, next) => {
// Drain it
str.on('data', () => {}).on('end', next).on('error', next)
})
return BB.join(
needsShrinkwrap && jsonFromStream('npm-shrinkwrap.json', extracted),
needsManifest && jsonFromStream('package.json', extracted),
needsBin && getPaths(extracted),
needsHash && ssri.fromStream(tarStream, { algorithms: ['sha1'] }),
needsExtract && pipe(tarStream, gunzip(), extracted),
(sr, mani, paths, hash) => {
const extraProps = mani || {}
// drain out the rest of the tarball
tarStream.unpipe()
tarStream.on('data', () => {})
// if we have directories.bin, we need to collect any matching files
// to add to bin
if (paths && paths.length) {
const dirBin = mani
? (mani && mani.directories && mani.directories.bin)
: (pkg && pkg.directories && pkg.directories.bin)
if (dirBin) {
extraProps.bin = {}
paths.forEach(filePath => {
if (minimatch(filePath, dirBin + '/**')) {
const relative = path.relative(dirBin, filePath)
if (relative && relative[0] !== '.') {
extraProps.bin[path.basename(relative)] = path.join(dirBin, relative)
}
})
extraProps._resolved = spec.spec
} else if (filePath === 'npm-shrinkwrap.json') {
extraProps._shrinkwrap = parsed
}
})
} else if (needsBin) {
paths.push(filePath)
}
})
}
}
// Drain and get next one
fileStream.on('data', () => {})
return Object.assign(extraProps, {
_shrinkwrap: sr,
_resolved: (mani && mani._resolved) ||
(pkg && pkg._resolved) ||
spec.fetchSpec,
_integrity: hash && hash.toString()
})
}
)
}
}
function jsonFromStream (filename, dataStream) {
return BB.fromNode(cb => {
dataStream.on('error', cb)
dataStream.on('finish', cb)
dataStream.on('entry', function handler (header, stream, next) {
const filePath = header.name.replace(/[^/]+\//, '')
if (filePath !== filename) {
next()
})
} else {
extractorStream = through()
}
if (needsShasum) {
shaStream = checksumStream({
algorithm: opts.hashAlgorithm
})
shaStream.on('digest', d => {
extraProps._shasum = d
})
} else {
shaStream = through()
}
// Drain the end stream
extractorStream.on('data', () => {})
return pipe(tarData, shaStream, extractorStream, err => {
if (needsBin) {
const dirBin = pkg
? (pkg.directories && pkg.directories.bin)
: (extraProps.directories && extraProps.directories.bin)
extraProps.bin = {}
Object.keys((pkg && pkg.bin) || {}).forEach(k => {
extraProps.bin[k] = pkg.bin[k]
})
paths.forEach(filePath => {
if (minimatch(filePath, dirBin + '/**')) {
const relative = path.relative(dirBin, filePath)
if (relative && relative[0] !== '.') {
extraProps.bin[path.basename(relative)] = path.join(dirBin, relative)
}
} else {
let data = ''
stream.on('data', d => { data += d })
stream.on('error', cb)
finished(stream).then(() => {
dataStream.removeListener('entry', handler)
try {
cb(null, JSON.parse(data))
next()
} catch (err) {
cb(err)
}
}, err => {
dataStream.removeListener('entry', handler)
cb(err)
})
}
cb(err, extraProps)
})
}
})
}
function getPaths (dataStream) {
return BB.fromNode(cb => {
let paths = []
dataStream.on('error', cb)
dataStream.on('finish', () => cb(null, paths))
dataStream.on('entry', function handler (header, stream, next) {
const filePath = header.name.replace(/[^/]+\//, '')
stream.on('data', () => {})
paths.push(filePath)
next()
})
})
}
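Both `jsonFromStream` and `getPaths` normalize entry names with `header.name.replace(/[^/]+\//, '')`, because npm tarballs nest everything under a single top-level directory (conventionally `package/`). That transform in isolation (the function name here is mine, not pacote's):

```javascript
// npm tarball entries are prefixed with one top-level directory,
// conventionally 'package/'. This strips only the first path segment.
function stripTarballPrefix (entryName) {
  return entryName.replace(/[^/]+\//, '')
}
```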
function finalKey (pkg, spec) {

@@ -222,6 +248,6 @@ if (pkg && pkg._uniqueResolved) {

return (
-  pkg && (pkg._sha512sum || pkg._shasum) &&
+  pkg && pkg._integrity &&
   cache.key(
     `${spec.type}-manifest`,
-    `${pkg._sha512sum || pkg._shasum}-${pkg._resolved}`
+    `${pkg._resolved}:${ssri.stringify(pkg._integrity)}`
   )
)

@@ -228,0 +254,0 @@ )

@@ -12,4 +12,4 @@ 'use strict'

function manifest (spec, opts) {
-const pkgPath = path.join(spec.spec, 'package.json')
-const srPath = path.join(spec.spec, 'npm-shrinkwrap.json')
+const pkgPath = path.join(spec.fetchSpec, 'package.json')
+const srPath = path.join(spec.fetchSpec, 'npm-shrinkwrap.json')
return BB.join(

@@ -24,4 +24,4 @@ readFileAsync(pkgPath).then(JSON.parse).catch({code: 'ENOENT'}, err => {

pkg._hasShrinkwrap = !!sr
-pkg._shasum = 'directory manifests have no shasum'
-pkg._sha512sum = 'directory manifests have no shasum'
+pkg._resolved = spec.fetchSpec
+pkg._integrity = false // Don't auto-calculate integrity
return pkg

@@ -32,11 +32,11 @@ }

const dirBin = pkg.directories.bin
-return glob(path.join(spec.spec, dirBin, '/**')).then(matches => {
+return glob(path.join(spec.fetchSpec, dirBin, '/**'), {nodir: true}).then(matches => {
   matches.forEach(filePath => {
-    const relative = path.relative(dirBin, filePath)
+    const relative = path.relative(spec.fetchSpec, filePath)
     if (relative && relative[0] !== '.') {
       if (!pkg.bin) { pkg.bin = {} }
-      pkg.bin[path.basename(relative)] = path.join(dirBin, relative)
+      pkg.bin[path.basename(relative)] = relative
     }
   })
}
})
})
}).then(() => pkg)
} else {

@@ -43,0 +43,0 @@ return pkg

'use strict'
-module.exports = tarballNope
-module.exports.fromManifest = tarballNope
-function tarballNope () {
-  throw new Error('local directory packages have no tarball data')
-}
+const BB = require('bluebird')
+const manifest = require('./manifest')
+const packDir = require('../../util/pack-dir')
+const through = require('mississippi').through
+const pipe = BB.promisify(require('mississippi').pipe)
+module.exports = tarball
+function tarball (spec, opts) {
+  const stream = through()
+  manifest(spec, opts).then(mani => {
+    return pipe(fromManifest(mani, spec, opts), stream)
+  }).catch(err => stream.emit('error', err))
+  return stream
+}
module.exports.fromManifest = fromManifest
function fromManifest (manifest, spec, opts) {
const stream = through()
packDir(manifest, manifest._resolved, manifest._resolved, stream, opts).catch(err => {
stream.emit('error', err)
})
return stream
}

@@ -14,6 +14,7 @@ 'use strict'

opts = optCheck(opts)
-if (spec.hosted) {
+if (spec.hosted && spec.hosted.getDefaultRepresentation() === 'shortcut') {
   return hostedManifest(spec, opts)
 } else {
-  return plainManifest(spec.spec, spec, opts)
+  // If it's not a shortcut, don't do fallbacks.
+  return plainManifest(spec.fetchSpec, spec, opts)
}

@@ -23,8 +24,14 @@ }

function hostedManifest (spec, opts) {
-return plainManifest(
-  spec.hosted.gitUrl, spec, opts
-).catch(() => {
-  return plainManifest(spec.hosted.sshUrl, spec, opts)
-}).catch(() => {
-  return plainManifest(spec.hosted.httpsUrl, spec, opts)
+return BB.resolve(null).then(() => {
+  return plainManifest(spec.hosted.https(), spec, opts)
+}).catch(err => {
+  if (!spec.hosted.ssh()) {
+    throw err
+  }
+  return plainManifest(spec.hosted.ssh(), spec, opts)
+}).catch(err => {
+  if (!spec.hosted.git()) {
+    throw err
+  }
+  return plainManifest(spec.hosted.git(), spec, opts)
})

@@ -31,0 +38,0 @@ }

@@ -10,8 +10,7 @@ 'use strict'

const osenv = require('osenv')
const packDir = require('../../util/pack-dir')
const PassThrough = require('stream').PassThrough
const path = require('path')
const pipe = BB.promisify(require('mississippi').pipe)
const rimraf = BB.promisify(require('rimraf'))
const tar = require('tar-fs')
const through = require('mississippi').through
const to = require('mississippi').to
const uniqueFilename = require('unique-filename')

@@ -24,8 +23,7 @@

opts = optCheck(opts)
-let streamErr = null
-const stream = through().on('error', e => { streamErr = e })
+const stream = new PassThrough()
gitManifest(spec, opts).then(manifest => {
if (streamErr) { throw streamErr }
stream.emit('manifest', manifest)
return pipe(fromManifest(manifest, spec, opts), stream)
})
}, err => stream.emit('error', err))
return stream

@@ -38,7 +36,7 @@ }

let streamError
-const stream = through().on('error', e => { streamError = e })
+const stream = new PassThrough().on('error', e => { streamError = e })
const cacheStream = (
opts.cache &&
cache.get.stream(
-  opts.cache, cache.key('git-clone', manifest._resolved), opts
+  opts.cache, cache.key('packed-dir', manifest._resolved), opts
)

@@ -52,3 +50,3 @@ )

stream.emit('reset')
-withTmp(opts, tmp => {
+return withTmp(opts, tmp => {
if (streamError) { throw streamError }

@@ -59,3 +57,3 @@ return cloneRepo(

if (streamError) { throw streamError }
-return packDir(spec, manifest._resolved, tmp, stream, opts)
+return packDir(manifest, manifest._resolved, tmp, stream, opts)
})

@@ -71,3 +69,3 @@ }).catch(err => stream.emit('error', err))

// cacache has a special facility for working in a tmp dir
-return cache.tmp.withTmp(opts.cache, opts, cb)
+return cache.tmp.withTmp(opts.cache, {tmpPrefix: 'git-clone'}, cb)
} else {

@@ -88,39 +86,1 @@ const tmpDir = path.join(osenv.tmpdir(), 'pacote-git-tmp')

}
-function packDir (spec, label, tmp, target, opts) {
-  opts = optCheck(opts)
-  const packer = opts.gitPacker
-    ? opts.gitPacker(spec, tmp)
-    : tar.pack(tmp, {
-      map: header => {
-        header.name = 'package/' + header.name
-      },
-      ignore: (name) => {
-        name.match(/\.git/)
-      }
-    })
-  if (!opts.cache) {
-    return pipe(packer, target)
-  } else {
-    const cacher = cache.put.stream(
-      opts.cache, cache.key('git-clone', label), opts
-    )
-    cacher.once('error', err => packer.emit('error', err))
-    target.once('error', err => packer.emit('error', err))
-    packer.once('error', err => {
-      cacher.emit('error', err)
-      target.emit('error', err)
-    })
-    return pipe(packer, to((chunk, enc, cb) => {
-      cacher.write(chunk, enc, () => {
-        target.write(chunk, enc, cb)
-      })
-    }, done => {
-      cacher.end(() => {
-        target.end(done)
-      })
-    }))
-  }
-}
'use strict'
const BB = require('bluebird')
module.exports = manifest
function manifest () {
// The tarball handler will take care of it!
-return Promise.resolve(null)
+return BB.resolve(null)
}

@@ -16,3 +16,3 @@ 'use strict'

function tarball (spec, opts) {
-const src = spec._resolved || spec.spec
+const src = spec._resolved || spec.fetchSpec
const stream = through()

@@ -19,0 +19,0 @@ statAsync(src).then(stat => {

'use strict'
const BB = require('bluebird')
module.exports = manifest

@@ -9,3 +11,3 @@ function manifest (spec) {

// a manifest based on ./tarball.js
-return null
+return BB.resolve(null)
}
'use strict'
-const request = require('../../registry/request')
-const url = require('url')
+const registryTarball = require('../../registry/tarball')
module.exports = tarball
function tarball (spec, opts) {
-const uri = spec._resolved || spec.spec
-const parsed = url.parse(uri)
-return request.stream(uri, {
-  protocol: parsed.protocol,
-  hostname: parsed.hostname,
-  pathname: parsed.pathname
-}, opts)
+const uri = spec._resolved || spec.fetchSpec
+return registryTarball.fromManifest({
+  _resolved: uri,
+  _integrity: opts.integrity
+}, spec, opts)
}

@@ -16,0 +13,0 @@

'use strict'
-var optCheck = require('../util/opt-check')
-var pickManifest = require('./pick-manifest')
-var pickRegistry = require('./pick-registry')
-var url = require('url')
-var request = require('./request')
+const BB = require('bluebird')
+const fetch = require('make-fetch-happen')
+const optCheck = require('../util/opt-check')
+const pickManifest = require('./pick-manifest')
+const pickRegistry = require('./pick-registry')
+const registryKey = require('./registry-key')
+const ssri = require('ssri')
+const url = require('url')
module.exports = manifest
function manifest (spec, opts) {
opts = optCheck(opts)
opts.memoize = true
-var registry = pickRegistry(spec, opts)
-var uri = metadataUrl(registry, spec.escapedName)
+const registry = pickRegistry(spec, opts)
+const uri = metadataUrl(registry, spec.escapedName)
opts.log.verbose(
'registry.manifest',
'looking up registry-based metadata for ', spec
)
-return request(uri, registry, opts).then(metadata => {
-  opts.log.silly('registry.manifest', 'got metadata for', spec.name)
-  return pickManifest(metadata, spec, {
+return fetchPackument(uri, registry, opts).then(packument => {
+  return pickManifest(packument, spec, {
engineFilter: opts.engineFilter,
defaultTag: opts.defaultTag
}).catch(err => {
if (
err.code === 'ETARGET' &&
packument._cached &&
!opts.offline
) {
opts.log.silly(
'registry:manifest',
`no matching version for ${spec.name}@${spec.fetchSpec} in the cache. Forcing revalidation`
)
opts.preferOnline = true
return manifest(spec, opts)
}
throw err
})
}).then(manifest => {
// Done here instead of ./finalize-manifest because these fields
// have to be filled in differently depending on type.
-manifest._shasum = (
-  manifest.dist && manifest.dist.shasum
-) || manifest._shasum
+const shasum = manifest.dist && manifest.dist.shasum
+manifest._integrity = manifest.dist && manifest.dist.integrity
+if (!manifest._integrity && shasum) {
+  // Use legacy dist.shasum field if available.
+  manifest._integrity = ssri.fromHex(shasum, 'sha1').toString()
+}
manifest._resolved = (
manifest.dist && manifest.dist.tarball
) || url.resolve(
registry,
'/' + manifest.name +
'/-/' + manifest.name +
'-' + manifest.version +
'.tgz'
)
if (!manifest._resolved) {
const err = new Error(
`Manifest for ${manifest.name}@${manifest.version} from ${uri} is missing a tarball url (pkg.dist.tarball). Guessing a default.`
)
err.code = 'ENOTARBALL'
err.manifest = manifest
if (!manifest._warnings) { manifest._warnings = [] }
manifest._warnings.push(err.message)
manifest._resolved =
`${registry}/${manifest.name}/-/${manifest.name}-${manifest.version}.tgz`
}
return manifest
-}).catch({code: 'ETARGET'}, err => {
-  if (!opts.cache || opts.maxAge < 0 || opts.offline) {
-    throw err
-  } else {
-    opts.log.silly('registry.manifest', 'version missing from current metadata, forcing network check')
-    opts.maxAge = -1
-    return manifest(spec, opts)
-  }
})

@@ -55,3 +65,3 @@ }

function metadataUrl (registry, name) {
-var normalized = registry.slice(-1) !== '/'
+const normalized = registry.slice(-1) !== '/'
? registry + '/'

@@ -61,1 +71,55 @@ : registry

}
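The hunk above only shows the changed first line of `metadataUrl`. Filled out as a sketch, under the assumption that the rest of the function simply appends the already-escaped package name to the normalized registry URL:

```javascript
// Assumed shape of metadataUrl: ensure a trailing slash on the registry
// URL, then append the (already-escaped) package name.
function metadataUrl (registry, name) {
  const normalized = registry.slice(-1) !== '/'
    ? registry + '/'
    : registry
  return normalized + name
}
```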
function fetchPackument (uri, registry, opts) {
const startTime = Date.now()
const auth = opts.auth && opts.auth[registryKey(registry)]
const memo = opts.memoize && opts.memoize.get // `memoize` is a Map-like
return memo && memo.has(uri) ? BB.resolve(memo.get(uri)).then(p => {
opts.log.http('registry:manifest', `MEMOIZED ${uri}`)
return p
}) : fetch(uri, {
agent: opts.agent,
cache: opts.offline
? 'only-if-cached'
: opts.preferOffline
? 'force-cache'
: opts.preferOnline
? 'no-cache'
: 'default',
cacheManager: opts.cache,
headers: {
// Corgis are cute. 🐕🐶
accept: 'application/vnd.npm.install-v1+json; q=1.0, application/json; q=0.8, */*',
authorization: (auth && auth.token && `Bearer ${auth.token}`) || '',
'user-agent': opts.userAgent,
'pacote-req-type': 'packument',
'pacote-pkg-id': `registry:${manifest.name}`
},
proxy: opts.proxy,
retry: opts.retry,
timeout: opts.timeout
}).then(res => res.json().then(packument => {
if (res.headers.get('npm-notice')) {
opts.log.warn('notice', res.headers.get('npm-notice'))
}
const elapsedTime = Date.now() - startTime
const attempt = res.headers.get('x-fetch-attempts')
const attemptStr = attempt && attempt > 1 ? ` attempt #${attempt}` : ''
const cacheStr = res.headers.get('x-local-cache') ? ' (from cache)' : ''
opts.log.http(
'registry:manifest',
`GET ${res.status} ${uri} ${elapsedTime}ms${attemptStr}${cacheStr}`
)
if (res.status >= 400) {
const err = new Error(`Failed with ${res.status} while fetching ${uri}. ${JSON.stringify(packument)}`)
err.code = `E${res.status}`
err.response = res
throw err
}
opts.log.silly('pacote', res.headers.raw())
packument._cached = !!res.headers.get('x-local-cache')
packument._contentLength = +res.headers.get('content-length')
memo && memo.set(uri, packument)
return packument
}))
}
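The nested ternary in `fetchPackument` (and again in the tarball fetch below) maps pacote's offline options onto standard `fetch` cache modes. Written out as a plain helper for readability:

```javascript
// offline       -> 'only-if-cached': never touch the network, error on miss
// preferOffline -> 'force-cache':    use any cached copy, even if stale
// preferOnline  -> 'no-cache':       always revalidate with the registry
// otherwise     -> 'default':        standard HTTP cache behavior
function cacheMode (opts) {
  if (opts.offline) return 'only-if-cached'
  if (opts.preferOffline) return 'force-cache'
  if (opts.preferOnline) return 'no-cache'
  return 'default'
}
```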

@@ -25,5 +25,5 @@ 'use strict'

 if (spec.type === 'tag') {
-  target = distTags[spec.spec]
+  target = distTags[spec.fetchSpec]
 } else if (spec.type === 'version') {
-  target = spec.spec
+  target = spec.fetchSpec
} else if (spec.type !== 'range') {

@@ -39,3 +39,3 @@ return cb(new Error('Only tag, version, and range are supported'))

metadata.versions[tagVersion] &&
-  semver.satisfies(tagVersion, spec.spec, true)
+  semver.satisfies(tagVersion, spec.fetchSpec, true)
) {

@@ -46,6 +46,6 @@ target = tagVersion

if (!target) {
-  target = semver.maxSatisfying(versions, spec.spec, true)
+  target = semver.maxSatisfying(versions, spec.fetchSpec, true)
 }
-if (!target && spec.spec === '*') {
+if (!target && spec.fetchSpec === '*') {
// npm hard-codes `latest` here, but it's likely intended

@@ -62,3 +62,3 @@ // to be `defaultTag`.

if (!manifest) {
-err = new Error(`No matching version found for ${spec.name}@${spec.spec}`)
+err = new Error(`No matching version found for ${spec.name}@${spec.fetchSpec}`)
err.code = 'ETARGET'

@@ -65,0 +65,0 @@ err.name = metadata.name
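For `tag` and `version` specs, the selection in pick-manifest is a plain lookup; only `range` specs need semver resolution. A simplified stand-alone sketch (`pickTarget` is a made-up name, and the semver branch is omitted):

```javascript
// distTags: e.g. { latest: '2.0.0' }; versions: array of published versions.
function pickTarget (spec, distTags, versions) {
  if (spec.type === 'tag') {
    return distTags[spec.fetchSpec]
  }
  if (spec.type === 'version') {
    return versions.indexOf(spec.fetchSpec) !== -1 ? spec.fetchSpec : undefined
  }
  // 'range' specs go through semver.maxSatisfying (omitted here).
  return undefined
}
```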

'use strict'
-var manifest = require('./manifest')
-var optCheck = require('../util/opt-check')
-var pickRegistry = require('./pick-registry')
-var pipe = require('mississippi').pipe
-var request = require('./request')
-var through = require('mississippi').through
+const BB = require('bluebird')
+const fetch = require('make-fetch-happen')
+const manifest = require('./manifest')
+const optCheck = require('../util/opt-check')
+const PassThrough = require('stream').PassThrough
+const pickRegistry = require('./pick-registry')
+const pipe = BB.promisify(require('mississippi').pipe)
+const registryKey = require('./registry-key')
+const ssri = require('ssri')
+const url = require('url')
module.exports = tarball
function tarball (spec, opts) {
opts = optCheck(opts)
opts.log.verbose(
'registry.tarball',
'looking up registry-based metadata for ', spec
)
-var stream = through()
+const stream = new PassThrough()
manifest(spec, opts).then(manifest => {
opts.log.silly(
'registry.tarball',
'registry metadata found. Downloading ', manifest.name + '@' + manifest.version
)
pipe(
fromManifest(manifest, spec, opts),
stream.emit('manifest', manifest)
return pipe(
fromManifest(manifest, spec, opts).on(
'integrity', i => stream.emit('integrity', i)
),
stream
)
}, err => stream.emit('error', err))
}).catch(err => stream.emit('error', err))
return stream

@@ -34,6 +34,88 @@ }

opts = optCheck(opts)
var registry = pickRegistry(spec, opts)
var uri = manifest._resolved
opts.digest = manifest._shasum
return request.stream(uri, registry, opts)
opts.scope = spec.scope || opts.scope
const stream = new PassThrough()
const registry = pickRegistry(spec, opts)
const uri = getTarballUrl(registry, manifest)
const auth = (
opts.auth &&
// If a tarball is on a different registry, don't leak creds
url.parse(uri).host === url.parse(registry).host &&
opts.auth[registryKey(registry)]
)
const startTime = Date.now()
fetch(uri, {
agent: opts.agent,
cache: opts.offline
? 'only-if-cached'
: opts.preferOffline
? 'force-cache'
: opts.preferOnline
? 'no-cache'
: 'default',
cacheManager: opts.cache,
headers: {
authorization: (auth && auth.token && `Bearer ${auth.token}`) || '',
'user-agent': opts.userAgent,
'pacote-req-type': 'tarball',
'pacote-pkg-id': `registry:${manifest.name}@${manifest.version}`,
'pacote-pkg-manifest': JSON.stringify(manifest)
},
integrity: manifest._integrity,
algorithms: [
manifest._integrity
? ssri.parse(manifest._integrity).pickAlgorithm()
: 'sha1'
],
proxy: opts.proxy,
retry: opts.retry,
timeout: opts.timeout
}).then(res => {
stream.emit('integrity', res.headers.get('x-local-cache-hash'))
res.body.on('error', err => stream.emit('error', err))
if (res.headers.get('npm-notice')) {
opts.log.warn('notice', res.headers.get('npm-notice'))
}
if (res.status >= 400) {
const err = new Error(`Failed with ${res.status} while fetching ${uri}.`)
err.code = `E${res.status}`
err.response = res
throw err
} else {
res.body.pipe(stream)
stream.on('end', () => {
const elapsedTime = Date.now() - startTime
const attempt = res.headers.get('x-fetch-attempts')
const attemptStr = attempt && attempt > 1 ? ` attempt #${attempt}` : ''
const cacheStr = res.headers.get('x-local-cache') ? ' (from cache)' : ''
opts.log.http(
'registry:tarball',
`GET ${res.status} ${uri} ${elapsedTime}ms${attemptStr}${cacheStr}`
)
})
}
}).catch(err => stream.emit('error', err))
return stream
}
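The nested ternary passed as `cache:` above maps pacote's offline flags onto `make-fetch-happen` cache modes. Written out as a plain function for readability (`cacheMode` is a hypothetical helper name, not part of pacote's API):

```javascript
// How the offline/preferOffline/preferOnline options select a
// make-fetch-happen cache mode, per the fetch() call above.
function cacheMode (opts) {
  if (opts.offline) return 'only-if-cached'   // never hit the network
  if (opts.preferOffline) return 'force-cache' // use cache even if stale
  if (opts.preferOnline) return 'no-cache'     // revalidate with registry
  return 'default'                             // standard HTTP caching
}
```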
function getTarballUrl (registry, manifest) {
// https://github.com/npm/npm/pull/9471
//
// TL;DR: Some alternative registries host tarballs on http and packuments on
// https, and vice-versa. There's also a case where people who can't use SSL
// to access the npm registry, for example, might use
// `--registry=http://registry.npmjs.org/`. In this case, we need to rewrite
// `tarball` to match the protocol.
//
const reg = url.parse(registry)
const tarball = url.parse(manifest._resolved)
if (reg.hostname === tarball.hostname && reg.protocol !== tarball.protocol) {
tarball.protocol = reg.protocol
// Ports might be same host different protocol!
if (reg.port !== tarball.port) {
delete tarball.host
tarball.port = reg.port
}
delete tarball.href
}
return url.format(tarball)
}

@@ -66,6 +66,8 @@ 'use strict'

}, opts).then(() => {
return execGit(['checkout', committish, '-c', 'core.longpaths=true'], {
return execGit(['checkout', committish], {
cwd: target
})
}).then(() => headSha(repo, opts))
}).then(() => {
return updateSubmodules(target, opts)
}).then(() => headSha(target, opts))
}

@@ -88,6 +90,21 @@

return execGit(gitArgs, {
cwd: path.dirname(target)
}, opts).then(() => headSha(repo, opts))
cwd: target
}, opts).then(() => {
return updateSubmodules(target, opts)
}).then(() => headSha(target, opts))
}
function updateSubmodules (localRepo, opts) {
const gitArgs = [
'submodule',
'update',
'-q',
'--init',
'--recursive'
]
return execGit(gitArgs, {
cwd: localRepo
}, opts)
}
function headSha (repo, opts) {

@@ -94,0 +111,0 @@ opts = optCheck(opts)

'use strict'
var silentlog = require('./silentlog')
const pkg = require('../../package.json')
const silentlog = require('./silentlog')

@@ -8,2 +9,4 @@ function PacoteOptions (opts) {

this._isPacoteOptions = true
this.agent = opts.agent
this.annotate = opts.annotate
this.auth = opts.auth

@@ -13,20 +16,17 @@ this.scopeTargets = opts.scopeTargets || {}

this.cache = opts.cache
this.cacheUid = opts.cacheUid
this.cacheGid = opts.cacheGid
this.digest = opts.digest
this.integrity = opts.integrity
this.engineFilter = opts.engineFilter
this.hashAlgorithm = opts.hashAlgorithm || 'sha1'
this.log = opts.log || silentlog
this.maxAge = typeof opts.maxAge === 'undefined' ? 1000 : opts.maxAge
this.maxSockets = opts.maxSockets || 10
this.memoize = !!opts.memoize
this.offline = opts.offline
this.pkg = opts.pkg
this.preferOffline = opts.preferOffline
this.proxy = opts.proxy
this.registry = opts.registry || 'https://registry.npmjs.org'
this.retry = opts.retry // for npm-registry-client
this.scope = opts.scope
this.userAgent = opts.userAgent || `${pkg.name}@${pkg.version}/node@${process.version}+${process.arch} (${process.platform})`
this.where = opts.where
this.preferOnline = opts.preferOnline
this.gitPacker = opts.gitPacker || null
this.dirPacker = opts.dirPacker || null

@@ -33,0 +33,0 @@ this.uid = opts.uid

'use strict'
const BB = require('bluebird')
const finalizeManifest = require('./lib/finalize-manifest')
const optCheck = require('./lib/util/opt-check')
const rps = BB.promisify(require('realize-package-specifier'))
const pinflight = require('promise-inflight')
const npa = require('npm-package-arg')

@@ -14,19 +13,36 @@ let handlers = {}

opts = optCheck(opts)
spec = typeof spec === 'string' ? npa(spec, opts.where) : spec
const res = typeof spec === 'string'
? rps(spec, opts.where)
: BB.resolve(spec)
return res.then(res => {
const label = [
spec.name,
spec.saveSpec || spec.fetchSpec,
spec.type,
opts.cache,
opts.registry,
opts.scope
].join(':')
return pinflight(label, () => {
const startTime = Date.now()
const fetcher = (
handlers[res.type] ||
handlers[spec.type] ||
(
handlers[res.type] =
require('./lib/handlers/' + res.type + '/manifest')
handlers[spec.type] =
require('./lib/handlers/' + spec.type + '/manifest')
)
)
return fetcher(res, opts).then(manifest => {
return finalizeManifest(manifest, res, opts)
return fetcher(spec, opts).then(manifest => {
return finalizeManifest(manifest, spec, opts)
}).then(manifest => {
// Metadata about the way this manifest was requested
if (opts.annotate) {
manifest._requested = spec
manifest._spec = spec.raw
manifest._where = opts.where
}
const elapsedTime = Date.now() - startTime
opts.log.verbose('pacote', `${spec.type} manifest for ${spec.name}@${spec.saveSpec || spec.fetchSpec} fetched in ${elapsedTime}ms`)
return manifest
})
})
}
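The `pinflight(label, ...)` call deduplicates concurrent lookups for the same spec/cache/registry/scope combination, so parallel installs don't fetch the same manifest twice. The core idea of `promise-inflight`, sketched without the library (`coalesce` is a hypothetical name):

```javascript
// Concurrent calls with the same key share one in-flight promise;
// the entry is cleared once that promise settles.
const inflight = new Map()

function coalesce (key, fn) {
  if (!inflight.has(key)) {
    const clear = () => inflight.delete(key)
    const p = Promise.resolve().then(fn)
    p.then(clear, clear)
    inflight.set(key, p)
  }
  return inflight.get(key)
}
```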
{
"name": "pacote",
"version": "1.0.0",
"version": "2.0.0",
"description": "JavaScript package downloader",

@@ -45,8 +45,6 @@ "main": "index.js",

"bluebird": "^3.5.0",
"cacache": "^6.2.0",
"checksum-stream": "^1.0.2",
"dezalgo": "^1.0.3",
"cacache": "^7.0.3",
"glob": "^7.1.1",
"inflight": "^1.0.6",
"lru-cache": "^4.0.2",
"make-fetch-happen": "^2.2.2",
"minimatch": "^3.0.3",

@@ -56,9 +54,8 @@ "mississippi": "^1.2.0",

"normalize-package-data": "^2.3.6",
"npm-registry-client": "^7.4.6",
"npm-package-arg": "^5.0.0",
"osenv": "^0.1.4",
"promise-inflight": "^1.0.1",
"realize-package-specifier": "^3.0.3",
"request": "^2.81.0",
"promise-retry": "^1.1.1",
"semver": "^5.3.0",
"slide": "^1.1.6",
"ssri": "^4.1.1",
"tar-fs": "^1.15.1",

@@ -71,3 +68,3 @@ "tar-stream": "^1.5.2",

"mkdirp": "^0.5.1",
"nock": "^9.0.6",
"nock": "^9.0.13",
"npmlog": "^4.0.1",

@@ -77,3 +74,3 @@ "nyc": "^10.0.0",

"rimraf": "^2.5.4",
"standard": "^9.0.1",
"standard": "^10.0.1",
"standard-version": "^4.0.0",

@@ -80,0 +77,0 @@ "tacks": "^1.2.6",

@@ -8,3 +8,3 @@ 'use strict'

const optCheck = require('./lib/util/opt-check')
const rps = BB.promisify(require('realize-package-specifier'))
const npa = require('npm-package-arg')

@@ -14,30 +14,47 @@ module.exports = prefetch

opts = optCheck(opts)
spec = typeof spec === 'string' ? npa(spec, opts.where) : spec
const startTime = Date.now()
if (!opts.cache) {
opts.log.info('prefetch', 'skipping prefetch: no cache provided')
return BB.resolve()
return BB.resolve({spec})
}
if (opts.digest) {
opts.log.silly('prefetch', 'checking if ', spec, ' digest is already cached')
return cache.get.hasContent(opts.cache, opts.digest, opts.hashAlgorithm).then(exists => {
if (exists) {
opts.log.silly('prefetch', 'content already exists for', spec)
if (opts.integrity && !opts.preferOnline) {
opts.log.silly('prefetch', 'checking if', opts.integrity, 'is already cached')
return cache.get.hasContent(opts.cache, opts.integrity).then(integrity => {
if (integrity) {
opts.log.silly('prefetch', 'content already exists for', spec.raw, `(${Date.now() - startTime}ms)`)
return {
spec,
integrity,
byDigest: true
}
} else {
return prefetchByManifest(spec, opts)
return prefetchByManifest(startTime, spec, opts)
}
})
} else {
opts.log.silly('prefetch', 'no digest provided for ', spec, '- fetching by manifest')
return prefetchByManifest(spec, opts)
opts.log.silly('prefetch', 'no integrity hash provided for', spec, '- fetching by manifest')
return prefetchByManifest(startTime, spec, opts)
}
}
function prefetchByManifest (spec, opts) {
const res = typeof spec === 'string'
? rps(spec, opts.where)
: BB.resolve(spec)
return res.then(res => {
const stream = require('./lib/handlers/' + res.type + '/tarball')(res, opts)
setImmediate(() => stream.on('data', function () {}))
function prefetchByManifest (start, spec, opts) {
let manifest
let integrity
return BB.resolve().then(() => {
const stream = require('./lib/handlers/' + spec.type + '/tarball')(spec, opts)
if (!stream) { return }
stream.on('data', function () {})
stream.on('manifest', m => { manifest = m })
stream.on('integrity', i => { integrity = i })
return finished(stream)
}).then(() => {
opts.log.verbose('prefetch', `${spec.name}@${spec.saveSpec || spec.fetchSpec} done in ${Date.now() - start}ms`)
return {
manifest,
spec,
integrity: integrity || (manifest && manifest._integrity),
byDigest: false
}
})
}

@@ -73,4 +73,3 @@ # pacote [![npm version](https://img.shields.io/npm/v/pacote.svg)](https://npm.im/pacote) [![license](https://img.shields.io/npm/l/pacote.svg)](https://npm.im/pacote) [![Travis](https://img.shields.io/travis/zkat/pacote.svg)](https://travis-ci.org/zkat/pacote) [![AppVeyor](https://ci.appveyor.com/api/projects/status/github/zkat/pacote?svg=true)](https://ci.appveyor.com/project/zkat/pacote) [![Coverage Status](https://coveralls.io/repos/github/zkat/pacote/badge.svg?branch=latest)](https://coveralls.io/github/zkat/pacote?branch=latest)

"_resolved": TarballSource, // different for each package type
"_shasum": TarballSha1Sum,
"_sha512sum": TarballSha512Sum,
"_integrity": SubresourceIntegrityHash,
"_shrinkwrap": null || ShrinkwrapJsonObj

@@ -127,10 +126,10 @@ }

##### `opts.digest`
##### `opts.integrity`
If provided, pacote will confirm that the relevant `shasum` for each operation's
results matches the given digest. The call will return `EBADCHECKSUM` if the
check fails.
If provided, pacote will confirm that the relevant integrity hash for each
operation's results matches the given digest. The call will return `EINTEGRITY`
if the check fails.
Additionally, `pacote.extract` will check the cache before performing any other
operations.
Additionally, `pacote.extract` will use this integrity string to check the cache
directly for matching contents before performing any other operations.

@@ -137,0 +136,0 @@ ##### `opts.cache`
