Taking advantage of lots of RAM

Technical support and discussion of Newsbin Version 6 series.


Postby jumpingstar » Wed Oct 03, 2012 3:19 pm

Hello,

As RAM prices are what they are, I would assume more and more people have 12 GB or more of RAM in their systems.

How can I take advantage of this when using Newsbin?

Of course I could use RAM disk software and configure Newsbin to download everything there, then extract to a directory on the hard disk.

Could I achieve this with Newsbin, without any RAM disk? Say I have 16 GB of RAM and Windows uses at most 4 GB, so I have 12 GB of ultra-fast RAM available. Is there any way to configure Newsbin to download data, keep up to X MB/GB of it in RAM, and then extract to disk?

I can't get any good results using the MemCacheLimit= variable...

Any comments? Does anybody have experience with a similar setup?
jumpingstar
n00b
 
Posts: 5
Joined: Sun Apr 19, 2009 7:27 am

Registered Newsbin User since: 04/19/09

Re: Taking advantage of lots of RAM

Postby Quade » Wed Oct 03, 2012 4:09 pm

What version are you currently using?
Quade
Eternal n00b
 
Posts: 44984
Joined: Sat May 19, 2001 12:41 am
Location: Virginia, US

Registered Newsbin User since: 10/24/97

Re: Taking advantage of lots of RAM

Postby jumpingstar » Thu Oct 04, 2012 2:15 pm

Quade wrote:What version are you currently using?


I've always updated via the beta page; I had 6.40 Build 2057 and just upgraded to Build 2059.

Re: Taking advantage of lots of RAM

Postby jumpingstar » Thu Oct 04, 2012 6:30 pm

Tried with the latest version: I changed MemCacheLimit=3000 (so 3 GB), but while downloading, everything immediately gets written to disk...

This is how the Disk tab of Windows' Resource Monitor looks while a download is in progress:

[Screenshot: Resource Monitor Disk tab during a download]

(Newsbin downloading at 25 MB/sec)

The HDD's I/O becomes the bottleneck when there are many downloads in the queue. Newsbin is writing data at, let's say, 20 MB/sec. Then, when a download finishes and extraction starts, everything goes mad: it tries to read from the HDD, extract to the HDD, and still save the currently queued data... you can imagine my HDD LED ;)

If downloaded chunks could be kept in RAM, that would definitely speed up the whole process.

Is there anything I could try, or should I stick with a RAM disk?

Re: Taking advantage of lots of RAM

Postby Quade » Thu Oct 04, 2012 7:46 pm

I was formulating an answer; I needed to check where the MemCacheLimit setting lives now. It's probably in the [Performance] section, meaning you need to change it there and not in [SETTINGS]. You won't want to use 3 GB anyway, because most RAR files are under 500 megs.

Even without the MemCacheLimit, it's probably writing 6 megs at a time to disk.

[PERFORMANCE]
MemcacheLimit=?
ChunkCacheSize=?

- How about trying this: in the performance options, tell it to stop writing to disk while it's unraring/repairing. Then set "ChunkCacheSize" to 4000, remove the MemCacheLimit from the NBI, and see what happens. Figure a chunk is about 600 KB, so 4000 × 600 KB (roughly 2.4 GB) is how much RAM it might potentially use. Make sure the performance options are set to use memory buffering.

- Then, if performance still isn't up to snuff, you could set the MemCacheLimit to 500. It's not the whole RAR set that gets buffered, it's the individual RAR files, so there's really no point in setting it to 3 GB.
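As a rough sanity check on those numbers (using the approximate 600 KB chunk size mentioned above; actual chunk sizes vary), the potential RAM use for a given ChunkCacheSize works out like this:

```python
# Back-of-envelope estimate of the RAM a full chunk cache could consume.
# The ~600 KB-per-chunk figure is the approximation used in this thread,
# not an exact value.
CHUNK_SIZE_KB = 600

def chunk_cache_ram_mb(chunk_cache_size: int) -> float:
    """Worst-case RAM (in MB) if every cache slot holds a chunk."""
    return chunk_cache_size * CHUNK_SIZE_KB / 1024

for size in (1000, 2000, 4000):
    print(f"ChunkCacheSize={size}: ~{chunk_cache_ram_mb(size):.0f} MB")
```

So ChunkCacheSize=4000 caps out around 2.3-2.4 GB, well within a 12 GB headroom.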

The number of files in the download queue has no impact on disk I/O. Only one thread writes download data to disk, no matter how many files or server connections you use. That's what ChunkCacheSize is for: it determines how many "packets" of data can be outstanding in RAM waiting to be written to disk.
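Conceptually, that single-writer arrangement behaves like a bounded producer/consumer queue: many download connections produce chunks, one writer thread drains them to disk, and the cache size caps how many chunks can be outstanding. A minimal sketch of the pattern (illustrative only, not Newsbin's actual code; the names and sizes are made up):

```python
import queue
import threading

CHUNK_CACHE_SIZE = 8  # stand-in for ChunkCacheSize; real values are much larger

chunk_cache = queue.Queue(maxsize=CHUNK_CACHE_SIZE)
written = []

def writer():
    # The single writer thread: drains chunks to "disk" in arrival order.
    while True:
        chunk = chunk_cache.get()
        if chunk is None:  # sentinel: all downloads finished
            break
        written.append(chunk)

def downloader(conn_id: int, n_chunks: int):
    # Each connection blocks when the cache is full, which throttles
    # the download instead of piling up unbounded RAM.
    for i in range(n_chunks):
        chunk_cache.put((conn_id, i))

w = threading.Thread(target=writer)
w.start()
conns = [threading.Thread(target=downloader, args=(c, 5)) for c in range(3)]
for t in conns:
    t.start()
for t in conns:
    t.join()
chunk_cache.put(None)
w.join()
print(f"{len(written)} chunks written by one thread")
```

However many connections feed the queue, disk writes stay serialized through one thread.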

A) I can sustain 200-400 Mbps download speeds during unrar to a single encrypted 1 TB disk. Even when downloading from my test server, none of these contortions are needed to maintain high-speed downloads.

B) Buffering won't solve a slow-disk problem. It just "kicks the can down the road" a little. It might let you ride through the unrar, but that depends on how fast the connection is and how much RAM you're willing to dedicate to it.

C) Downloading to one drive and unraring to another is likely to perform better than doing both to the same drive.

Re: Taking advantage of lots of RAM

Postby jumpingstar » Fri Oct 05, 2012 3:41 am

Thanks for your reply,

In the Performance section, there wasn't an option to stop writing to disk during unrar/repair.

There is a setting, "Pause Download during UnRAR/Repair", which would stop all downloads when one download is finished.

Everything in Performance is unchecked, aka "High Powered PC".

I have ChunkCacheSize=4000 under the .nbi file's [Performance] section.

When I removed MemCacheLimit (or basically set MemCacheLimit=0), everything was written to disk as well. Based on http://help.newsbin.com/index.php/V600- ... ache_Limit, if MemCacheLimit is disabled, nothing is kept in RAM.

So I then set MemCacheLimit=500. And still, immediately after starting the download, everything was written to disk... :(



Here is another screenshot that illustrates the problem. Note that when I took it, I no longer had anything in the queue:

[Screenshot: Resource Monitor Disk tab during unrar]

Writing is currently in progress (the unrar), and it is also reading the files it is extracting.

That's 64 MB/sec of disk I/O. If I had anything else in the queue, the download speed would be horrible, since the disk is already so busy...

Do these settings work for you, i.e. everything is kept in RAM and only decoded files are written to the HDD?

Re: Taking advantage of lots of RAM

Postby Quade » Fri Oct 05, 2012 10:54 am

There is a setting, "Pause Download during UnRAR/Repair", which would stop all downloads when one download is finished.


That is the setting I'm suggesting you use. It stops writing to disk during the unrar/repair but lets the download run until it runs out of chunk cache.

When you set ChunkCacheSize=4000, you should see it in the status bar; do you?

What's your connection speed?

Your biggest complaint seems to be thrashing during unrar/repair, and that's what my suggestion addresses. Writing to disk during download doesn't really matter as long as your connection is 100% used, right? The total amount written to disk is the same whether you do it on the fly or backload it.

I verified that "ChunkCacheSize=4000" works. I see it in the status bar. (3972/4000).

I haven't tried MemCacheLimit lately, so it's possible it's not working.

I am puzzled as to why you have so many open files, though.

Re: Taking advantage of lots of RAM

Postby jumpingstar » Fri Oct 05, 2012 4:35 pm

Thanks Quade once again for your reply :)

Quade wrote:
There is a setting, "Pause Download during UnRAR/Repair", which would stop all downloads when one download is finished.


That is the setting I'm suggesting you use. It stops writing to disk during the unrar/repair but lets the download run until it runs out of chunk cache.

When you set ChunkCacheSize=4000, you should see it in the status bar; do you?



Yes, I see it. I tested this now, and even though there was a lot of chunk cache available, it stopped downloading. I still had about 3400/4000, but the download stopped. It resumed when the unrar finished.

What's your connection speed?


300Mbit+

Your biggest complaint seems to be thrashing during unrar/repair and that's what my suggestion addressed, writing to disk during download doesn't really matter as long as your connection is 100% used right? The total amount of writing to disk is the same whether you do it on the fly or backload it.

I verified that "ChunkCacheSize=4000" works. I see it in the status bar. (3972/4000).

I haven't tried MemCacheLimit lately, so it's possible it's not working.

I am puzzled as to why you have so many open files, though.

Since releases are, let's say, 50 MB per RAR, and I download them at 25 MB/sec, it only takes 2 seconds to get one RAR, then another, and another... so I end up having multiple RARs open.

I think we can dismiss this, given that SoftPerfect has released their very good RAM Disk software as freeware: http://www.softperfect.com/products/ramdisk/

It supports a boot-disk feature (the RAM disk is available as soon as the system starts, no user login needed), so just download that software, mount a RAM disk, point Newsbin's downloads at the RAM disk, and unrar to the HDD. You're done...

It would be very good to see this feature directly in Newsbin, though. But since that version is now freeware, everyone can grab it...

Re: Taking advantage of lots of RAM

Postby Quade » Fri Oct 05, 2012 5:39 pm

Since releases are, let's say, 50 MB per RAR, and I download them at 25 MB/sec, it only takes 2 seconds to get one RAR, then another, and another... so I end up having multiple RARs open.


I was just testing downloads at 500 Mbps, and I still don't see anything like your open file count. By the time the file is a RAR on disk, it's closed. It's only open while it's an NB2.

I queued up a number of 7 GB RAR sets from my test server. At 500 Mbps, with a chunk count of 4000 and pause-writes-on-unrar/repair enabled, my download continued after the repair/unrar started. I watched the chunk cache shrink from 3900 to 700 during the unrar.

So, for me it works exactly how I described it. I wonder if yours is defaulting to something else.
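Those figures are easy to sanity-check: at 500 Mbps and roughly 600 KB per chunk, about 104 chunks arrive per second, so a 4000-slot cache can absorb close to 40 seconds of downloading while disk writes are paused. A rough calculation (the chunk size is the approximation used in this thread, and decimal units are assumed throughout):

```python
# Rough estimate: how long a full chunk cache can absorb a download
# while disk writes are paused during unrar/repair.
CHUNK_SIZE_KB = 600   # approximate chunk size from this thread
CACHE_SLOTS = 4000    # ChunkCacheSize

def buffer_seconds(speed_mbps: float) -> float:
    """Seconds until the cache fills at a given download speed."""
    kb_per_sec = speed_mbps / 8 * 1000      # network Mbps -> KB/s (decimal)
    chunks_per_sec = kb_per_sec / CHUNK_SIZE_KB
    return CACHE_SLOTS / chunks_per_sec

for mbps in (300, 500):
    print(f"{mbps} Mbps: cache lasts ~{buffer_seconds(mbps):.0f} s")
```

Whether that window covers a whole unrar depends on the RAR set size and disk speed, which matches point B above: buffering only rides through the pause, it doesn't remove the disk bottleneck.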

[Performance]
PauseDownload=1
RepairPriority=0
RepairThreads=0
SaveChunksMode=0
ChunkCacheSize=2000

If you want to experiment, I'd suggest exiting Newsbin, then cutting and pasting that into your existing NBI's [Performance] section. Then it'll be apples to apples. I have, at best, 2-3 download files open at a time. It sounds to me like it might be defaulting to unbuffered mode for you. Unbuffered mode would stall the download during unrar too.
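Since the NBI file appears to be INI-style (as the [Performance] fragments in this thread suggest), that paste can also be scripted. A hedged sketch using Python's configparser; the NBI_PATH filename is a placeholder for your actual .nbi file, Newsbin should be closed while editing, and back the file up first since Newsbin's exact key handling isn't documented here:

```python
import configparser

# Placeholder path: point this at your real Newsbin .nbi file.
NBI_PATH = "newsbin.nbi"

# The [Performance] values suggested above.
settings = {
    "PauseDownload": "1",
    "RepairPriority": "0",
    "RepairThreads": "0",
    "SaveChunksMode": "0",
    "ChunkCacheSize": "2000",
}

cp = configparser.ConfigParser()
cp.optionxform = str  # preserve key case, e.g. "ChunkCacheSize"

cp.read(NBI_PATH)  # silently skips a missing file
if not cp.has_section("Performance"):
    cp.add_section("Performance")
for key, value in settings.items():
    cp.set("Performance", key, value)

with open(NBI_PATH, "w") as fh:
    cp.write(fh)
```

Overriding optionxform matters here: configparser lowercases keys by default, which would mangle names like ChunkCacheSize.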

