This is not a real issue, but since one big Usenet service provider recently raised its retention to 200 days, what will the long-term impact of that be on UE usage?
I mean, if I set retention to 200 days, even using the compact binary type for huge binary groups, how much RAM will be needed to open them once they are full of 200 days' worth of headers? And what about the CPU load and disk space (headers only, no bodies)?
I know it may be pointless to keep so many days of headers, but since they offer that service I'd like to use it.
Impact on UE of setting very large retention on binary groups
For actual numbers, you'd have to search the forum for what UE can handle (it's been mentioned somewhere before).
Realistically, though, how many headers would you want to bother reading through if you decided you wanted something older? With a large group, you'd get bored even reading through the past seven days' worth. The option I'd take is to set retention to something sensible, like thirty days or even seven, and use the Search Service to look for anything older. There are two benefits to this: it saves you disk space, and the Search Service would be even faster than opening up your relevant groups and then filtering. Plus you always have the option of importing NZBs, which would be picked up from your NSP regardless of whether or not you had the headers stored.
What possible benefits do you see in storing two hundred days' worth of headers? Personally, I see none, other than that it would help in the rare case that the Search Service is down, but even then it's always restarted fairly quickly.
In UE you can load large groups. I saw it load a newsgroup with 150M headers in alt.binaries.boneless, though rather by accident: a user who was participating in the beta stage before the official release forgot to set retention, so he kept headers for about a year. I think you'd better have at least 1GB of RAM.
I'm not sure how much RAM it will take. The amount of RAM is proportional to the newsgroup size within the same newsgroup destination type, so you can just load a smaller but comparable newsgroup (not too small), measure the difference in RAM consumption, and then multiply by the ratio of the sizes to get an approximate figure.
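To make that concrete, here is a minimal sketch of that scaling estimate. The figures in it are made-up example numbers, not UE measurements; only the proportionality idea comes from the post above.
[code]
# Rough scaling estimate for header RAM usage, as described above.
# All figures here are assumptions for illustration, not UE measurements.

baseline_ram_mb = 100            # assumed RAM with UE running, reference group not loaded
loaded_ram_mb = 170              # assumed RAM after loading a reference group
reference_headers = 10_000_000   # size of the reference group (assumed)
target_headers = 150_000_000     # e.g. a very large group at very long retention

# RAM attributable to the reference group's headers
delta_mb = loaded_ram_mb - baseline_ram_mb

# Scale by the ratio of group sizes (RAM is roughly proportional to size
# within the same newsgroup destination type, per the post above)
estimate_mb = delta_mb * (target_headers / reference_headers)

print(f"~{delta_mb * 1024 * 1024 / reference_headers:.1f} bytes per header (in RAM)")
print(f"Estimated RAM for {target_headers:,} headers: ~{estimate_mb / 1024:.2f} GB")
[/code]
With these example numbers the estimate comes out at roughly 1 GB for 150M headers, in line with the "at least 1GB of RAM" remark above; your own measurement will of course differ.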
A 32-bit system allows about 1.4GB per process; with the /3GB option (in Vista it is the default, I think) it rises to about 2.4GB, and on a 64-bit system those limits don't exist (you need the RAM, though). A 64-bit system would require a 64-bit compilation, which I can easily make, but right now I'm only compiling the indexing server in 64-bit (same project, source code and database support as in UE).
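As a quick sanity check, a sketch like the following compares an estimated header working set against those per-process ceilings. The 1GB estimate is just the example figure from the sketch above; the 1.4GB and 2.4GB ceilings are the approximate limits quoted in this post.
[code]
# Compare an estimated header working set against the per-process ceilings above.
# estimate_gb is an assumed example value, not a UE measurement.
estimate_gb = 1.0

limits_gb = {
    "32-bit, default": 1.4,
    "32-bit with /3GB": 2.4,
    "64-bit build": float("inf"),   # in practice limited by installed RAM
}

for name, limit in limits_gb.items():
    verdict = "fits" if estimate_gb < limit else "does NOT fit"
    print(f"{name}: ~{estimate_gb} GB {verdict} (limit ~{limit} GB)")
[/code]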
NewsPro allowed about 4.5M headers per instance (all newsgroups), and the maximum memory allocated was half of what the system allows, because restoring the database could take the rest. In UE this was the main improvement target and the ultimate reason to rewrite NewsPro completely instead of going the route of gradual upgrades: by merely optimizing the restore-database process the limit could only be doubled, to about 9M headers per instance, and even then only if you have at least 1.4GB of RAM.
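For what it's worth, here is the back-of-the-envelope arithmetic behind those figures as I read them; the per-header cost is derived from the numbers quoted above, not a measured value.
[code]
# Back-of-the-envelope arithmetic for the NewsPro limits described above.
# Derived from the quoted figures (about 1.4GB per process, 4.5M headers).
process_limit_gb = 1.4        # usable 32-bit address space per process
usable_fraction = 0.5         # half was kept free for the database restore
newspro_headers = 4_500_000   # old per-instance header limit

usable_bytes = process_limit_gb * 1024**3 * usable_fraction
bytes_per_header = usable_bytes / newspro_headers
print(f"~{bytes_per_header:.0f} bytes per header")   # roughly 170 bytes

# If the restore no longer needs its reserve, the whole 1.4GB becomes usable,
# which only doubles the ceiling; hence the ~9M figure and the full rewrite.
max_headers = process_limit_gb * 1024**3 / bytes_per_header
print(f"~{max_headers / 1e6:.0f}M headers at best without a redesign")
[/code]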
As far as I know, other binary downloaders are limited to 5-12M headers per newsgroup; 150M is not a limit in UE either, you just need to check it yourself. You can also limit the newsgroup size by setting last headers to, say, 100M headers in properties->newsgroups. I haven't seen anyone else focusing on header download any more (some programs nowadays don't have header download at all). With UE you probably get the best possible size, since the database is optimized for Usenet down to the bits, optimized for binaries, and written in C++, which is a fairly low-level and efficient (which means fast) language. With SQL it would simply take more RAM because of its universality, and it would be slower and only able to load smaller groups than it can now, for sure.
With Giganews it is very funny. Newsfeeds recently set 150 days of retention (I'm not comparing them now), and I was wondering how long the Giganews response would take; Giganews wasn't prepared, so they bought time by increasing retention gradually. These are Usenet provider wars, and frankly I don't like to see Giganews on the scene with all that advertising everywhere trying to keep prices twice as high, since I'm not sure what the impact on the Usenet community is. A cheaper price would certainly bring more Usenet users and thus more diverse content; say, students or less "busy" low-income users would be at least as likely to make interesting posts. But with higher prices we have a smaller Usenet user pool, and cheaper Usenet providers have fewer upgrade options and have to keep prices higher than they could if they had more users.
What I hope is that at some stage the retention war will turn into a price war. Usenet providers just run common software and can choose the best available, so the quality of service shouldn't depend on the provider as hardware becomes cheaper and networks become faster. Right now, though, Giganews just tries to keep prices twice as high through dirty tricks (see for example this dumb article: http://www.giganews.com/blog/2006/11/ss ... peeds.html; I tried to post a reply but it didn't appear, so-called "blogs" mean only strict censorship).
alex wrote: ...Giganews just tries to keep prices twice as high through dirty tricks (see for example this dumb article: http://www.giganews.com/blog/2006/11/ss ... peeds.html) ...
I hadn't seen that before. Someone should start a petition to get the Giganews Blog renamed to Giganews Propaganda. I suppose everyone is being told how much better broadband is these days and how easy and simple everything is, so people who don't know any better get suckered by misleading articles full of misinformation. I always read at least a couple of testimonials on sites that have them. I don't trust them or get swayed by them, but I find reading them can be a measure of what kind of company it is, and whether or not they are confident enough to publish negative comments.
I tried a few NSPs once to find out which I'd prefer to stay with. I trialled Giganews when they guaranteed 70 days' retention across 'ALL GROUPS' (their capitalisation, not mine). I found that anything over around 50 days errored as not found, yet their status page said all was running well across all servers. That was enough for me to make my informed decision.