Did I hit the limit?

xtrips
Posts: 28
Joined: Mon Mar 10, 2003 3:37 pm

Did I hit the limit?

Post by xtrips »

Hello,

Lately Newshosting has increased retention to 400 days.
Since then, when I try to update a.b.hdtv, I always get a message box saying "alt.binaries.hdtv cnt.dat overflow. Limit retention, headers per server, then apply retention and invoke group Compact binary-> Compact storage."

Well, this doesn't work (or I didn't do it right).
What happens next is that UE stops downloading the headers for this group.

What should I do? Can I keep my settings at 400 days of retention, or is there a limit?

Thank you
alex
Posts: 4547
Joined: Thu Feb 27, 2003 5:57 pm

Post by alex »

Newshosting now most likely has the same retention as http://www.ngroups.net/ - 331 days and growing (both are Highwinds servers).

The message box is not related to retention; I just added the limit detection to prevent the program from crashing. In short, cnt.dat per group cannot grow beyond 4GB.

I think the practical limit, given this constraint, is around 150M+ headers per newsgroup; for boneless I got 154M headers when the file reached 4GB.
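The figures above imply a rough per-header cost, and scaling the cap scales the header count proportionally. A quick sanity check of the arithmetic, using only the numbers quoted in the post (not anything from UE's actual code):

```python
# Rough arithmetic behind the 4GB cnt.dat limit, based on the figures
# quoted above for a.b.boneless (154M headers at the 4GB cap).
LIMIT_BYTES = 4 * 1024**3        # 4 GiB per-group cnt.dat cap
HEADERS_AT_LIMIT = 154_000_000   # observed header count when the file hit 4GB

bytes_per_header = LIMIT_BYTES / HEADERS_AT_LIMIT
print(f"~{bytes_per_header:.1f} bytes of cnt.dat per header")  # ~27.9

# Raising the cap 2x or 4x raises the header count proportionally:
for factor in (2, 4):
    print(f"{factor * 4}GiB cap -> ~{factor * HEADERS_AT_LIMIT // 10**6}M headers")
```

So a 2x cap would hold roughly 308M headers per group and a 4x cap roughly 616M, assuming the per-header cost stays the same.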

I can increase it 2 or 4 times (I have such a provision in the code, intended for the search service). The question is whether it makes sense: loading time is proportional to the newsgroup size, and the current limit is already far more than any other newsreader allows. If the limit is increased, the database format would change as well (per-group cnt.dat would then be able to grow to e.g. 8GB or 16GB instead of the current 4GB).
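A common reason a per-file cap sits at exactly 4GB is 32-bit offsets. If that is what's going on here (an assumption; the post doesn't say), the "2 or 4 times" provision could correspond to storing offsets in 2-byte or 4-byte units rather than single bytes:

```python
# Hypothetical illustration, NOT UE's actual format: a 32-bit offset
# addresses at most 4 GiB. Storing offsets in coarser units (with records
# aligned to that unit) stretches the same 32-bit field to 8 or 16 GiB,
# matching the "2 or 4 times" provision mentioned above.
MAX_U32 = 2**32 - 1

def max_file_size(offset_unit: int) -> int:
    """Largest file size reachable with a 32-bit offset in `offset_unit`-byte units."""
    return (MAX_U32 + 1) * offset_unit

for unit in (1, 2, 4):
    print(f"{unit}-byte units -> {max_file_size(unit) // 1024**3} GiB addressable")
```

The trade-off such a scheme implies is exactly the one described: old readers interpreting offsets as plain byte positions would misread the new files, so the format version changes.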

So maybe you should limit retention as the message prescribes; those options are in the workspace, in the newsgroup context menu. Open map.txt: there you can find the directory number of the newsgroup, and in that directory you'll see cnt.dat. If, for example, it is 3.5GB after compacting with all headers up to date, the retention is adequate.
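The "check cnt.dat after compacting" step can be scripted. A minimal sketch, assuming you've already looked up the group's directory number in map.txt (the path below is a made-up example, not UE's real layout):

```python
# Hedged sketch: report how close a group's cnt.dat is to the 4GB cap.
# The path is a hypothetical example; find the real one via map.txt
# in the workspace directory, as described above.
import os

LIMIT = 4 * 1024**3  # the per-group cnt.dat cap
path = r"C:\UsenetExplorer\data\3\cnt.dat"  # hypothetical example path

if os.path.exists(path):
    size = os.path.getsize(path)
    print(f"cnt.dat is {size / 1024**3:.2f} GiB "
          f"({100 * size / LIMIT:.0f}% of the 4 GiB cap)")
else:
    print("cnt.dat not found; check the directory number in map.txt")
```

A 3.5GB file, as in the example above, would report about 88% of the cap.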
MikusR
Posts: 7
Joined: Fri Jan 04, 2008 9:37 am

Post by MikusR »

At least 8GB per group would be nice. The problem is that it's impossible to get all headers for the growing retention (it mainly affects groups with big files, like a.b.h.x264).
Archibald
Posts: 10
Joined: Sun Jun 29, 2008 7:47 pm

Post by Archibald »

Would it be possible to add an option "retain as many headers as possible"?

Doing it by a specific number of days is rather inelegant.
alex
Posts: 4547
Joined: Thu Feb 27, 2003 5:57 pm

Post by alex »

The option wouldn't make sense; it would have the same drawbacks as "fill as much RAM as possible".

I'll check whether I can change it without changing the db format; it may be possible. A second option is to keep a db version number somewhere so I can change the database format when needed; as of now the format has been unchanged since v1.0 despite all the additions. The 4GB cnt.dat limit itself is trivial to change, so there's no need for any special options; the catch is that I'm not sure I can change it and keep the database compatible with prior versions.
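The "db version number" idea is the standard way out of this bind: reserve a version field in the file header so future readers can detect and migrate old formats. A minimal sketch (the magic string and field layout here are invented for illustration, not UE's real format):

```python
# Sketch of a versioned file header: a magic string plus a version number.
# If the magic is absent, the file predates versioning and is treated as
# the original format. All names here are hypothetical.
import struct

MAGIC = b"CNT1"
HEADER_FMT = "<4sI"  # 4-byte magic, 32-bit little-endian version

def write_header(buf: bytearray, version: int) -> None:
    buf[:8] = struct.pack(HEADER_FMT, MAGIC, version)

def read_version(buf: bytes) -> int:
    magic, version = struct.unpack_from(HEADER_FMT, buf, 0)
    if magic != MAGIC:
        return 0  # pre-versioned file: treat as the original format
    return version

buf = bytearray(8)
write_header(buf, 2)
print(read_version(buf))          # 2
print(read_version(b"\x00" * 8))  # 0 (legacy file, no magic)
```

The one-time cost is that files written before the version field existed have no magic, which is why the reader falls back to "version 0" rather than failing.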
MikusR
Posts: 7
Joined: Fri Jan 04, 2008 9:37 am

Post by MikusR »

Maybe add it as another storage type:
compact binary (as it is now)
compact binary large (with an appropriate mention that it's slower)
alex
Posts: 4547
Joined: Thu Feb 27, 2003 5:57 pm

Post by alex »

There is no need for a new type, since nothing will get slower. It appears I can also incorporate this into the current database format, the only cost being backwards compatibility; I'll think about something.

Try to follow the next version preview; this time maybe I'll put it there first.

later: resolved in v2.6
slotboxed
Posts: 57
Joined: Sun Nov 09, 2003 3:49 am

Post by slotboxed »

950 million boneless headers (from Giganews) gzipped is about 21 gigs. ;) Unzipped, well... the only way to handle a file that big is not to have it all in memory at once; you have to view just a small slice of it at a time.
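The "small slice at a time" approach is just windowed reading: fixed-size chunks, tracking the offset, never holding the whole file. A minimal sketch (the window size and stand-in data are example values, not anything UE prescribes):

```python
# Windowed reading: iterate a huge file in fixed-size slices instead of
# loading it whole. Window size is an example value.
import io

def iter_windows(f, window=64 * 1024 * 1024):
    """Yield successive (offset, chunk) windows of at most `window` bytes."""
    offset = 0
    while chunk := f.read(window):
        yield offset, chunk
        offset += len(chunk)

# Usage with a tiny in-memory stand-in for a multi-gigabyte headers dump:
blob = io.BytesIO(b"h" * 150)
sizes = [len(c) for _, c in iter_windows(blob, window=64)]
print(sizes)  # [64, 64, 22]
```

Memory use is bounded by the window size regardless of file size, which is the whole point; memory-mapping is the other common way to get the same effect.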

It's only gonna get worse, you know.