Did I hit the limit?

Posted: Sun Jul 26, 2009 6:53 am
by xtrips
Hello,

Lately Newshosting has increased its retention to 400 days.
Since then, whenever I try to update a.b.hdtv, I get a message box saying "alt.binaries.hdtv cnt.dat overflow. Limit retention, headers per server, then apply retention and invoke group Compact binary-> Compact storage."

Well, this doesn't work (or I didn't do it right).
What happens next is that UE stops downloading the headers for this group.

What should I do? Can I keep my settings at 400 days of retention, or is there a limit?

Thank you

Posted: Sun Jul 26, 2009 7:40 pm
by alex
newshosting now most likely has the same retention as http://www.ngroups.net/ - 331 days and growing (both are highwinds servers).

the message box is not related to retention; i just added the limit detection to prevent the program from crashing. in short, cnt.dat per group cannot grow beyond 4GB.

given this limitation, i think the current limit is around 150M+ headers per newsgroup; for boneless it took 154M headers to reach 4GB.
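
to illustrate (a simplified sketch, not the actual UE code, all names are invented; it just shows the overflow guard and the bytes-per-header arithmetic):

[code]
// sketch only: a guard against cnt.dat outgrowing 32-bit file offsets,
// plus the back-of-the-envelope bytes-per-header arithmetic
#include <cstdint>
#include <cstdio>

constexpr std::uint64_t kMaxCntDat = 4ULL << 30;  // 4GB, 32-bit offset ceiling

// refuse an append that would overflow the offset space instead of crashing
bool can_append(std::uint64_t current_size, std::uint64_t record_size) {
    return current_size + record_size <= kMaxCntDat;
}

int main() {
    const std::uint64_t headers = 154000000;  // boneless when it reached 4GB
    std::printf("avg bytes per header: %llu\n",
                (unsigned long long)(kMaxCntDat / headers));  // prints 27
}
[/code]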

i can increase it 2 or 4 times (i have such a provision in the code; i intended to use it for the search service). the question is whether it makes sense: loading time is proportional to the newsgroup size, and the current limit is already far more than any other newsreader allows. if the limit is increased, the database format would change as well (then per-group cnt.dat would be able to grow e.g. to 8GB or 16GB instead of the current 4GB).
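
for illustration, one way to get 2x or 4x out of the same 32-bit field (a simplified sketch, not the actual implementation): keep the stored offsets 32-bit but interpret them in 2- or 4-byte units:

[code]
// illustrative sketch: scaling 32-bit stored offsets by a fixed unit widens
// the addressable cnt.dat size without widening the on-disk field
#include <cstdint>

constexpr std::uint64_t kUnit = 2;  // 1 -> 4GB, 2 -> 8GB, 4 -> 16GB

// records would have to be written kUnit-aligned for this to work
std::uint64_t file_offset(std::uint32_t scaled) {
    return std::uint64_t(scaled) * kUnit;
}
[/code]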

so maybe you should limit retention as the message prescribes; those options are in the workspace, in the newsgroup context menu. open map.txt to find the directory number of the newsgroup; in that directory you'll see cnt.dat. e.g. if it ends up around 3.5GB after compacting, with all headers up to date, the retention is adequate.
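
or, to check the sizes programmatically, something like this would do (the root path here is hypothetical, adjust it to your installation):

[code]
// sketch: walk the data directory and report any cnt.dat nearing the 4GB cap
// (the root path is hypothetical; map.txt maps groups to directory numbers)
#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    const fs::path root = "C:/UsenetExplorer/data";  // adjust to your setup
    const std::uintmax_t warn = 3500000000ULL;       // ~3.5GB
    for (const auto& e : fs::recursive_directory_iterator(root))
        if (e.is_regular_file() && e.path().filename() == "cnt.dat"
            && e.file_size() > warn)
            std::cout << e.path() << "  " << e.file_size() << " bytes\n";
}
[/code]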

Posted: Sun Jul 26, 2009 9:24 pm
by MikusR
At least 8GB per group would be nice. The problem is that with the growing retention it's impossible to get all headers (it mainly affects groups with big files, like a.b.h.x264).

Posted: Mon Aug 10, 2009 5:20 pm
by Archibald
Would it be possible to add an option "retain as many headers as possible"?

Doing it by a specific number of days is rather inelegant.

Posted: Tue Aug 11, 2009 11:32 pm
by alex
the option wouldn't make sense; it would have the same drawbacks as "fill as much RAM as possible".

i'll check whether i can change it without changing the db format; it may be possible. the second option is to keep a db version number somewhere so i can change the database when needed; as of now the format has been unchanged since v1.0 despite all additions. the 4GB cnt.dat limit itself is trivial to change, so there's no need to add any special options; the catch is that i'm not sure i can change it and leave the database compatible with prior versions.
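
the version-number idea in sketch form (field names are invented for illustration):

[code]
// sketch of the versioning idea: a version field in the db header lets a
// later release detect old files and migrate or refuse them cleanly
#include <cstdint>
#include <cstdio>

struct CntHeader {
    std::uint32_t magic;    // identifies a cnt.dat file
    std::uint16_t version;  // 1 = original format, 2 = e.g. widened offsets
};

bool open_ok(const CntHeader& h) {
    if (h.version <= 2) return true;  // known formats, read accordingly
    std::fprintf(stderr, "cnt.dat version %u is too new\n",
                 (unsigned)h.version);
    return false;                     // refuse rather than risk corruption
}
[/code]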

Posted: Wed Aug 12, 2009 6:26 am
by MikusR
Maybe add it as another storage type:
compact-binary (as it is now)
compact-binary large (with appropriate mentions of it being slower)

Posted: Wed Aug 12, 2009 12:03 pm
by alex
there is no need for a new type since nothing will get slower. it appears i can also incorporate this into the current database format, just without backwards compatibility; i'll think of something.

try to follow the next version preview; this time maybe i'll put it there first.

later: resolved in v2.6

Posted: Sun Sep 06, 2009 4:03 pm
by slotboxed
950 million boneless headers (from giganews) gzipped is about 21 gigs. ;) Unzipped, well... the only way to handle a file that big is not to have it all in memory at once; you just have to work with a view of a small slice of it at a time.
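
Something like this, say (just a sketch of the sliced-read idea, nothing UE-specific; the function name is made up):

[code]
// sketch: read only one slice of a huge file instead of loading it whole
#include <cstddef>
#include <cstdint>
#include <fstream>
#include <vector>

std::vector<char> read_slice(const char* path, std::uint64_t offset,
                             std::size_t len) {
    std::ifstream in(path, std::ios::binary);
    in.seekg(static_cast<std::streamoff>(offset));
    std::vector<char> buf(len);
    in.read(buf.data(), static_cast<std::streamsize>(len));
    buf.resize(static_cast<std::size_t>(in.gcount()));  // trim a short read
    return buf;
}
[/code]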

It's only gonna get worse, you know.