
Getting Newsbin to cleanup some common download crud

PostPosted: Thu May 22, 2014 7:17 pm
by jimerb
Almost every time I download an NZB file,
I get a bunch of junk as part of that download, in addition to the file I want to keep.

So, for instance, it might look like:

Code:
LIST OF FILES:
 - TOWN - NZB FORUM - KLICK IT.url
 - Top Usenet Provider -CHEAP-FULL SPEED.url
 - BONUS.rar
 - somecoolfile.nfo
 - somecoolfile.mp4   <---- here's the one I want


On each of these downloads I have to delete the other useless files.

Is there a way I can get Newsbin to do this for me? For example, can I say, never leave a .url file behind?

I'm on the latest production version.

Thanks.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 23, 2014 9:09 am
by Quade
There are potentially two ways to do it:

1 - Using the "Post UnRAR filter".

2 - Using the scripting interface to delete the files with a script you write (a rough sketch of such a script follows below).
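
For the scripting route, here's a minimal standalone sketch of the kind of cleanup script you might write. The extension list and the "bonus.rar" name are just examples taken from the files above, and how Newsbin actually hands the download folder to your script isn't covered here, so check the scripting documentation for the real hook:

Code:
#!/usr/bin/env python3
# Rough sketch only: delete common junk files from a finished download
# folder. Pass the folder path as the first argument. The extension list
# and the "bonus.rar" name are assumptions based on the files above.
import sys
from pathlib import Path

JUNK_EXTENSIONS = {".url", ".nfo", ".sfv", ".srr", ".htm", ".html"}

def clean(folder: Path) -> None:
    for path in folder.iterdir():
        if not path.is_file():
            continue
        # Reject by extension, plus the recurring "BONUS.rar" spam file.
        if path.suffix.lower() in JUNK_EXTENSIONS or path.name.lower() == "bonus.rar":
            print(f"Deleting {path.name}")
            path.unlink()

if __name__ == "__main__":
    clean(Path(sys.argv[1]))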

To use the "Post UnRAR filter":

1 - Create a Filter profile.

2 - Use the filename accept/reject filters to add files to reject. For example, I added "Bonus.rar" to the filename reject filter.

3 - In the autopar options, select this filter profile. ONLY Filename filters should be used for this.

[.]htm
[.]html
[.]nfo
[.]sfv
[.]srr
[.]txt
[.]url
bonus.rar


That's my list.

Here's what it looks like after an unRAR:

[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Bonus.rar
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter TOWN - NZB FORUM - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter TOWN - NZB INDEXER - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Top Usenet Provider - CHEAP - FULL SPEED.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Copy(1) of TOWN - NZB FORUM - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Copy(1) of TOWN - NZB INDEXER - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Copy(1) of Top Usenet Provider - CHEAP - FULL SPEED.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter TOWN - NZB FORUM - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter TOWN - NZB INDEXER - KLICK IT.url
[08:05:36] HIGH HFC: Rejected RAR Filter: Filename Filter Top Usenet Provider - CHEAP - FULL SPEED.url


That's what it shows in the log.
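
For what it's worth, entries like [.]url look like regular expressions, where [.] matches a literal dot. Assuming Newsbin matches them case-insensitively against each filename (the "Bonus.rar" hits in the log above suggest case doesn't matter), the matching behaves roughly like this Python sketch:

Code:
import re

# The reject patterns from the list above, treated as regular
# expressions. [.] is a character class matching a literal dot;
# note the unescaped dot in "bonus.rar" matches any character.
reject_patterns = [r"[.]htm", r"[.]html", r"[.]nfo", r"[.]sfv",
                   r"[.]srr", r"[.]txt", r"[.]url", r"bonus.rar"]

def is_rejected(filename: str) -> bool:
    # A file is rejected if any pattern matches anywhere in its name.
    return any(re.search(p, filename, re.IGNORECASE) for p in reject_patterns)

print(is_rejected("TOWN - NZB FORUM - KLICK IT.url"))  # True
print(is_rejected("somecoolfile.mp4"))                 # False

Since a pattern like [.]url matches anywhere in the name, anchoring it as [.]url$ would restrict it to the extension, if that ever mattered.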

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 23, 2014 6:59 pm
by jimerb
Wow this is perfect! Thanks Quade.

Would you recommend putting [.]exe in there, or is that being handled by the spam filter?

Also, how can I get rid of .par2 files that seem to pop up here and there? I'm thinking they are not included in the RAR.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 23, 2014 7:53 pm
by Quade
Most likely the PAR2 files are from spam you tried to download, where the download went to the failed list. There's no real hook for Newsbin to clean these up.

It can't hurt to put [.]exe in there, because I don't typically look at what's in the RAR files beyond the spam detector.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 23, 2014 8:54 pm
by jimerb
You're right. I deleted the PARs sitting in the Downloading Files tab and they all went away.

These changes are working great! Nice and clean so far!

I've been using Newsbin for nearly 10 years and I was just spending time looking at all the settings. This program has really become one heck of an amazing tool. So many options.

Thanks.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Wed May 28, 2014 8:29 am
by wiggins09
I'm using 6.52b3 bd3196 & have successfully tried combinations for file cleanup as outlined above.
For some reason [.]srr doesn't work. I can clean other files using this syntax & extension, but not *.srr :?
I have -
Code:
[.]sfv
[.]srr

& only *.sfv is deleted. The *.srr is untouched.
I find this a bit strange. Any ideas?

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Wed May 28, 2014 11:50 pm
by Quade
I'll have to check it out.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Thu May 29, 2014 7:19 am
by wiggins09
Thanks.
Reading http://forums.newsbin.com/viewtopic.php?f=43&t=29961#p203704 about cleanup made me wonder if there are multiple .srr files which somehow escape cleanup. Could just be a red herring tho :roll: :)

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 30, 2014 4:00 am
by wiggins09
Yep, looks like a red herring. :P

Just tried a couple of NZBs where .srr files are specifically listed along with the .sfv, RARs & PARs, & the .srr files remain at the end of processing.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 30, 2014 8:39 am
by Quade
If the file is just detached and hanging out, meaning it's not in the PAR files and not in the RAR files, the post-unRAR filter won't do anything about it.

You might want to add "[.]srr" to the global filename reject filter.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 30, 2014 6:24 pm
by wiggins09
Thanks Quade.
I will try the global as suggested. :)

I had assumed that if a filter was selected to run after unRAR, it would clean up the directory of any files matching the criteria, i.e. whether they were in the PAR/RAR set or not, they are still in the target dir.
Is that not the case? :?

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Fri May 30, 2014 8:03 pm
by Quade
It only knows about the files in the PARs and RARs. It's not looking in the folder for something to clean up. Plenty of people download and unrar into a single folder; trying to scan all of that every time wouldn't scale as the file counts went up.

Re: Getting Newsbin to cleanup some common download crud

PostPosted: Sat May 31, 2014 11:08 am
by wiggins09
Oh OK, thanks for clearing that up Quade.
I'm so used to $(FILENAME) subfolder creation that I totally missed that not using that MO would create complications. :)

I'm beginning to see there must be quite a lot of NB features that I don't make full use of. I'm grateful to you for helping me understand & get the most from NB with your replies.