Quade wrote: If Newsbin is caching them in memory, there's a bunch more disk IO time for the unrar to happen. I have to say, I consider download speed far more important than unrar speed. I'd be more inclined to throttle the unrar than to throttle the download.
If you download an HD documentary where every rar is 150 MB x 30 parts, at a speed of 300 Mbit/s, storing them in memory is easy. But when the disk drive has to write over 35 MB/s and at the same time read other files for unraring, that's a problem.
So the download speed gets killed by the disk IO. It's faster, and healthier for the disk, to process one item at a time when you have extreme download speeds.
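To put rough numbers on it (just my own back-of-the-envelope maths in Python, not measured figures, and assuming the extract runs on the same drive as the download):

    download_mbit = 300                          # line speed in Mbit/s
    download_mb = download_mbit / 8              # = 37.5 MB/s being written to disk

    # while an unrar runs on the same drive it also has to read the rar set
    # and write the extracted file, so very roughly:
    unrar_read_mb = download_mb                  # assume unrar reads about as fast as we write
    unrar_write_mb = download_mb
    total_io_mb = download_mb + unrar_read_mb + unrar_write_mb

    print(f"download alone: {download_mb:.1f} MB/s")
    print(f"download + unrar on one drive: ~{total_io_mb:.1f} MB/s, plus all the seeking")

And that seeking back and forth between the incoming parts and the rar set is what really hurts a single spindle.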
Why not just add an option that forces Newsbin Pro to download, unrar, delete, and then take the next item in the queue? (Something like Singleprocessing=1.)
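Something like this loop is what I mean (plain Python pseudocode just to show the order of the steps; the download() stub, the unrar call and the paths are made up, not Newsbin's actual internals):

    import glob, os, subprocess

    def download(item_dir):
        pass  # stand-in for Newsbin downloading the rar set into item_dir

    def process_queue(item_dirs):
        for item_dir in item_dirs:              # one item at a time, nothing in parallel
            download(item_dir)                  # 1. download at full line speed
            first_rar = sorted(glob.glob(os.path.join(item_dir, "*.rar")))[0]
            subprocess.run(["unrar", "x", first_rar, item_dir + os.sep],
                           check=True)          # 2. unrar once the download is done
            for rar in glob.glob(os.path.join(item_dir, "*.r??")):
                os.remove(rar)                  # 3. delete the rars, then take the next item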
The only workarounds for me right now are to pick one file at a time or to lower the download speed, and both feel like the wrong way to go.
The only other solution I can think of is to create a ramdrive, but then I must be able to force Newsbin Pro to single-process the files for me. Otherwise I would need something like 100 GB of RAM; I'm planning to stay around 20 GB for the ramdrive.
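For the sizing, my rough assumption (extracted file about the same size as the rar set, numbers from the example above):

    part_mb, parts = 150, 30
    rar_set_gb = part_mb * parts / 1024          # ~4.4 GB of rars per item
    extracted_gb = rar_set_gb                    # assume the extracted file is about the same size
    per_item_gb = rar_set_gb + extracted_gb      # ~8.8 GB on the ramdrive while one item unrars
    print(f"one item needs ~{per_item_gb:.1f} GB while it is being unrared")

So roughly 8-9 GB per item: a 20 GB ramdrive only works if items go one at a time, and with several items in flight at 300 Mbit/s you end up in the ~100 GB range very quickly.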