1 - I'd look in the performance options and see if "Pause Download during unrar/repair" is checked.
2 - Down at the bottom, I'd look at the Cache line. If the first number goes to zero during unrar/repair, it means Newsbin can't write the data it has in memory to disk. My cache is typically set to 400. You want it to be larger than the largest number of chunks in the files you download.
[PERFORMANCE]
ChunkCacheSize = 1000;
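To put rough numbers on that (mine, just for illustration, not your actual files): if your largest rar part were, say, 250MB posted in ~600K articles, that works out to about 250,000K / 600K ≈ 420 chunks. A cache of 400 would come up just short there, while 1000 leaves comfortable headroom.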
If you have the RAM, you might want to bump the size up. Each cache block is probably 500-600K of RAM while actively downloading.
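So at the 1000 in the example above, worst case is roughly 1000 x 600K ≈ 600MB of RAM while actively downloading. Worth checking you have that much free before raising it further.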
You said the main download folder is set to a 24x 2TB 7.2K drive array over a 10Gb network.
Set the download path to the local drive and see what happens. You might be better off downloading locally and unraring remotely.
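If you want to test that split by hand first, the command-line unrar can extract straight to the array (the paths here are just placeholders for your setup, not anything Newsbin writes):

unrar x "D:\Downloads\archive.part01.rar" "\\array\share\extracted\"

That way only the big sequential write of the extracted files crosses the 10Gb link, while the chunk-by-chunk assembly stays on the local disk.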
Re-reading your comment, it sounds like your slowdown is the process of assembling the files from chunks. Running a larger cache means they can all be assembled out of RAM, but going over the network for assembly will still be slower than local. There's also a hit because autopar has to re-read the assembled file to collect the PAR status. I think downloading locally and unraring remotely would be more efficient.
You're on the edge of the performance envelope though, so there probably aren't many people out there with experience at this level. You ought to be able to sustain 200 without too much trouble. Going to a local server and downloading to a local drive, I've often gotten well over 2 times that speed, so it's doable. I get 110 Mbps day in and day out downloading locally.