Yes, NewsBin did exactly that! It spent more than half an hour generating two files (and using 15% of a quad-core CPU) instead of downloading them in less than 2 minutes.
It happens when there is a set of separate posts, with the par2s posted separately but covering all the posts together, let's say:
+Title.S03.COMPLETE.DL - Title.S03.par2
Title.S03.COMPLETE.DL - Title.S03E01.mkv
Title.S03.COMPLETE.DL - Title.S03E02.mkv
Title.S03.COMPLETE.DL - Title.S03E03.mkv
......
Title.S03.COMPLETE.DL - Title.S03E21.mkv
Title.S03.COMPLETE.DL - Title.S03E22.mkv
Title.S03.COMPLETE.DL - Title.S03E23.mkv
As you can see, only the par2 post is expandable (+); the others are just single files.
A couple of those mkv files were incomplete.
So, after S03E21 was downloaded, there were enough par2 blocks to fix the problems in the incomplete files and to create the S03E22 and S03E23 files from scratch.
And NewsBin did exactly that, instead of downloading S03E22 and S03E23 first.
Instead of a fast download of two files plus a 3-block repair, it proceeded to generate 65 blocks' worth of files from the par2s.
And the total size of all the files was about 40 GB, so it took a considerable amount of time: more than 30 minutes and 15% of a quad-core CPU.
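For what it's worth, here is a minimal sketch of the par2 block arithmetic as I understand it, just to show why the repair ballooned from 3 blocks to 65: a damaged file needs one recovery block per bad block, but a file that is missing entirely needs one recovery block for every source block it contains. The file and block sizes below are illustrative guesses of mine, not values from NewsBin or from the actual post.

import math

def recovery_blocks_needed(file_size_bytes, block_size_bytes, damaged_blocks=None):
    # Recovery blocks required for a single file in a PAR2 set.
    total_blocks = math.ceil(file_size_bytes / block_size_bytes)
    if damaged_blocks is None:        # file is absent: every block must be rebuilt
        return total_blocks
    return min(damaged_blocks, total_blocks)

EPISODE_SIZE = 1_700_000_000          # roughly 40 GB / 23 episodes
BLOCK_SIZE = 50_000_000               # illustrative par2 block size, not the real one

# Fixing the two incomplete episodes costs only a handful of recovery blocks.
fix_cost = (recovery_blocks_needed(EPISODE_SIZE, BLOCK_SIZE, damaged_blocks=2)
            + recovery_blocks_needed(EPISODE_SIZE, BLOCK_SIZE, damaged_blocks=1))

# Recreating S03E22 and S03E23 from scratch instead of downloading them.
rebuild_cost = 2 * recovery_blocks_needed(EPISODE_SIZE, BLOCK_SIZE)

print(fix_cost, rebuild_cost)         # a few blocks vs. dozens of blocks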
Only after S03E22 and S03E23 were generated were they also downloaded, so I also got -(0001).mkv files, thank you very much.
I'm not sure; perhaps I caused all this myself, because after the par2 header file was downloaded I forced the download of the par2 blocks as well.
So I'm not sure what happens if those par2 blocks are available but not yet downloaded.
Perhaps NewsBin thinks (very wrongly) that it is faster to use the already downloaded blocks rather than to download the remaining two files.
But I suspect that NewsBin does not even look at what is next in the download queue once it has enough blocks to repair all the files covered by the par2 set.
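If it really is just a simple "enough blocks available, start repairing" check, then even a rough queue-aware comparison would avoid this. A minimal sketch of what I mean, with purely hypothetical names and numbers (I obviously have no idea what NewsBin's actual code looks like):

from dataclasses import dataclass

@dataclass
class QueuedFile:
    name: str
    size_bytes: int

def should_rebuild_from_par2(missing_files, download_queue, set_size_bytes,
                             download_bps, repair_bps):
    # Return True only if rebuilding the missing files beats downloading them.
    # repair_bps is the effective throughput of par2 reconstruction; it has to
    # read the whole set (about 40 GB here) and do the Reed-Solomon math, so it
    # is usually far slower than simply fetching the files off the wire.
    queued = {q.name for q in download_queue}
    still_fetchable = [f for f in missing_files if f.name in queued]
    if not still_fetchable:
        return True                         # nothing left to download, repair away

    download_time = sum(f.size_bytes for f in still_fetchable) / download_bps
    repair_time = set_size_bytes / repair_bps
    return repair_time < download_time      # otherwise let the queue finish first

With the numbers from this incident (two roughly 1.7 GB episodes against a 40 GB repair pass), downloading wins by a wide margin, which is exactly what I expected NewsBin to do.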