# fix: Limit concurrent open files during lsstream (#285)
During `npm cache verify`, all of the cache index files are currently opened at the same time, which can fail with an EMFILE error in environments that limit the maximum number of open files.
This change uses the p-map module to cap the number of concurrently open files in lsStream() (used by garbageCollect()), avoiding the problem.
I first sent this pull request to npm/cli, where it was merged, before realizing that the original code lives in this repository.
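
For illustration, a minimal sketch of the pattern this change applies; the `files` array and helper names here are hypothetical, while `pMap(iterable, mapper, { concurrency })` is p-map's actual signature:

```js
const pMap = require('p-map')
const fs = require('fs/promises')

// Unbounded: starts a read for every file at once, which can exhaust
// the process's file-descriptor limit and fail with EMFILE.
const readAllAtOnce = (files) =>
  Promise.all(files.map((file) => fs.readFile(file)))

// Bounded: p-map keeps at most 5 reads in flight at any moment.
const readAllLimited = (files) =>
  pMap(files, (file) => fs.readFile(file), { concurrency: 5 })
```

Because the diff below nests three such calls (buckets → subbuckets → entries), each limited to 5, at most 5 × 5 × 5 = 125 entry files are read concurrently, regardless of how many entries the cache holds.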

## References
npm/cli#7631
npm/cli#4783
oikumene committed Jul 11, 2024
1 parent f9ebcea · commit 5f2166a
Showing 1 changed file with 12 additions and 6 deletions.

lib/entry-index.js
```diff
@@ -19,6 +19,9 @@ const hashToSegments = require('./util/hash-to-segments')
 const indexV = require('../package.json')['cache-version'].index
 const { moveFile } = require('@npmcli/fs')
 
+const pMap = require('p-map')
+const lsStreamConcurrency = 5
+
 module.exports.NotFoundError = class NotFoundError extends Error {
   constructor (cache, key) {
     super(`No cache entry for ${key} found in ${cache}`)
@@ -182,15 +185,15 @@ function lsStream (cache) {
   // Set all this up to run on the stream and then just return the stream
   Promise.resolve().then(async () => {
     const buckets = await readdirOrEmpty(indexDir)
-    await Promise.all(buckets.map(async (bucket) => {
+    await pMap(buckets, async (bucket) => {
       const bucketPath = path.join(indexDir, bucket)
       const subbuckets = await readdirOrEmpty(bucketPath)
-      await Promise.all(subbuckets.map(async (subbucket) => {
+      await pMap(subbuckets, async (subbucket) => {
         const subbucketPath = path.join(bucketPath, subbucket)
 
         // "/cachename/<bucket 0xFF>/<bucket 0xFF>./*"
         const subbucketEntries = await readdirOrEmpty(subbucketPath)
-        await Promise.all(subbucketEntries.map(async (entry) => {
+        await pMap(subbucketEntries, async (entry) => {
           const entryPath = path.join(subbucketPath, entry)
           try {
             const entries = await bucketEntries(entryPath)
@@ -213,9 +216,12 @@
             }
             throw err
           }
-        }))
-      }))
-    }))
+        },
+        { concurrency: lsStreamConcurrency })
+      },
+      { concurrency: lsStreamConcurrency })
+    },
+    { concurrency: lsStreamConcurrency })
     stream.end()
     return stream
   }).catch(err => stream.emit('error', err))
```
