Fooney
fooneyp · Jan 12, 12:56 (12mo ago)

I have a Node project that converts files, so I set up workers to speed it up, and it does by a lot. The problem is that it creates a new worker for every single file at once. With 3k files that's really bad.
3 Replies
ScriptyChris · 12mo ago
If you want to limit the number of workers created, you can chunk the process with a queue so that no more than N workers/files are being processed at a time (with N depending on how many cores/threads the machine running the script has).
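For illustration, here's a minimal TypeScript sketch of that idea using node:worker_threads. The worker script name (./convert-worker.js) and the convention of passing one file path per worker via workerData are assumptions for the example, not something from the original thread:

```ts
// A minimal sketch: cap concurrency at the number of CPU cores by pulling
// files off a shared queue instead of spawning 3k workers at once.
import { Worker } from "node:worker_threads";
import { cpus } from "node:os";

const MAX_WORKERS = cpus().length; // roughly one worker per core/thread

// Assumed worker script: converts the file given in workerData, then exits.
function convertFile(file: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const worker = new Worker("./convert-worker.js", { workerData: file });
    worker.once("error", reject);
    worker.once("exit", (code) =>
      code === 0 ? resolve() : reject(new Error(`worker exited with code ${code}`))
    );
  });
}

async function convertAll(files: string[]): Promise<void> {
  const queue = [...files];
  // Start at most MAX_WORKERS "runners"; each one takes the next file from
  // the queue as soon as its previous worker finishes.
  const runners = Array.from({ length: MAX_WORKERS }, async () => {
    while (queue.length > 0) {
      const file = queue.shift()!;
      await convertFile(file);
    }
  });
  await Promise.all(runners);
}

// Usage: convertAll(listOfFilePaths).then(() => console.log("done"));
```

This still creates one worker per file, but only MAX_WORKERS of them exist at any moment; reusing long-lived workers via a pool library would cut the startup cost further.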
reactibot · 12mo ago
This thread hasn’t had any activity in 36 hours, so it’s now locked. Threads are closed automatically after 36 hours. If you have a followup question, you may want to reply to this thread so other members know they're related. https://discord.com/channels/102860784329052160/565213527673929729/1195350689812066304