On posts with many missing articles, I often see my cache drop to "0/300", which of course stops all downloading until it can recover.
On a recent download, which turned out to be too broken to repair, I gave up and removed it from the download list. I then went to the chunks directory and found about 32,000 chunks totaling around 12GB waiting to be assembled (or orphaned and just left lying around). My "Max Retries" = 1, and I have only two servers. I understand there is a delay before Newsbin does its retries and that this can affect the assembly of incompletes. No real question here beyond general suggestions; it just seemed extreme, so I thought I'd report it.
My real questions are...
Exactly how big is a cache unit, and what is it exactly? Some posts have their part sizes around 500MB. What setting would I need so a part stays in memory until it is complete? If Newsbin has to write out parts of the file due to memory constraints, does it keep making chunks until the file is complete and then write the final file?
If I start downloading a file and then move another ahead of it in the download list, does Newsbin release the cache units from the first file to work on the new one? My observation says "no", but I don't know for sure.