In UE you can load large groups. I saw it load a newsgroup with 150M headers (alt.binaries.boneless), though rather by accident: a user who participated in the beta stage before the official release forgot to set retention, so he kept headers for about a year. I think you'd better have at least 1GB of RAM for that.
I'm not sure how much RAM it will take. The RAM usage is proportional to newsgroup size within the same newsgroup destination type, so you can load a smaller but comparable newsgroup (not too small), measure the increase in RAM consumption, then multiply by the ratio of the sizes to get an approximate figure.
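Just to illustrate the arithmetic (the numbers below are made up, only the proportional scaling is the point):

[code]
// Rough back-of-the-envelope RAM estimate by proportional scaling.
// All figures are illustrative; measure your own on a test newsgroup.
#include <cstdio>

int main()
{
    double test_headers   = 10e6;   // headers in a smaller test newsgroup
    double test_ram_mb    = 120.0;  // measured RAM increase after loading it, in MB
    double target_headers = 150e6;  // headers in the group you actually want to load

    // RAM grows proportionally with header count for the same destination type,
    // so scale the measured figure by the ratio of the sizes.
    double estimated_ram_mb = test_ram_mb * (target_headers / test_headers);
    printf("Estimated RAM: about %.0f MB\n", estimated_ram_mb);
    return 0;
}
[/code]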
A 32-bit system allows about 1.4GB per process; with the /3GB option (in Vista it is the default, I think) it rises to about 2.4GB, and on a 64-bit system those limits don't exist (you need the RAM though). On a 64-bit system a 64-bit compilation would be needed, which I can easily make, but right now I'm only compiling the indexing server in 64-bit (same project, source code and database support as in UE).
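By the way, if you ever want to check which way a particular build was compiled, a trivial test program shows it (just an illustration, nothing UE-specific):

[code]
#include <cstdio>

int main()
{
    // 4-byte pointers -> 32-bit build (limited per-process address space),
    // 8-byte pointers -> 64-bit build (address space no longer the bottleneck).
    printf("Pointer size: %u bytes (%s-bit build)\n",
           (unsigned)sizeof(void*), sizeof(void*) == 8 ? "64" : "32");
    return 0;
}
[/code]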
NewsPro allowed about 4.5M headers per instance (all newsgroups combined), and the maximum memory allocated was half of what the system allows, because restoring the database could take the rest. So in UE this was the main improvement target and the ultimate reason to rewrite NewsPro completely instead of going the way of gradual upgrades: even by optimizing the database restore process the limit could only have been doubled to about 9M headers per instance, and only if you had at least 1.4GB of RAM.
As far as I know other binary downloaders are limited to 5-12M headers per newsgroup; 150M is not a limit in UE either, you just need to check it yourself. You can also set "last headers" to, say, 100M headers in properties->newsgroups to limit the newsgroup size. I haven't seen anyone else focusing on header download any more (some programs nowadays don't have header download at all). With UE you probably get the best possible size, since the database is optimized for Usenet down to the bits, optimized for binaries, and written in C++, which is quite a low-level and efficient (which means fast) language. With SQL it would just take more RAM because of its universality, and it would be slower and able to load only smaller groups than now, for sure.
With Giganews it is very funny. Newsfeeds set 150 days retention recently (I'm not comparing them now), and I was wondering how long the Giganews response would take; Giganews wasn't prepared, so they bought time by increasing retention gradually. Those are Usenet provider wars, and frankly I don't like to see Giganews on the scene with all that advertising everywhere trying to keep prices two times higher, since I'm not sure what the impact on the Usenet community is. A cheaper price would certainly bring more Usenet users and thus more diverse content (say, students or less "busy" low-income users would be at least as likely to make interesting posts), but with higher prices we have a smaller Usenet user pool, and cheaper Usenet providers have fewer upgrade options and have to keep prices higher than they could if they had more users. What I hope is that at some stage the retention war will turn into a price war, since Usenet providers just run common software and can choose the best available, so the quality of service shouldn't depend on the provider as hardware becomes cheaper and networks become faster. But right now Giganews just tries to keep a two times higher price through dirty tricks (see for example this dumb article:
http://www.giganews.com/blog/2006/11/ss ... peeds.html). I tried to post a reply but it didn't appear; so-called "blogs" mean only strict censorship.