I want to try out Giganews Accelerator; it says to change the server name to "localhost". Since I can't change the server name, I assume I have to make a new server named "localhost" and put in the username/password from the old server, right? But I can't, because I don't remember the server password.
-------------------------------------------------------
Release Notes
The Giganews Accelerator is easy to install and comes pre-configured for optimal use. You will need to update the news server settings in your news client based on the instructions listed below.
News client settings:
Change the news server address in your news client to "localhost" and the news server port to "119". This tells your client to pass all traffic to the Giganews Accelerator proxy. The Giganews Accelerator handles all SSL traffic as well, so SSL must be disabled in your client.
Note: If your client has any rate-limit settings, you should disable them. The Giganews Accelerator can rate-limit for you if necessary.
All other configuration information should remain the same, including the username/password and the maximum number of connections.
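Before repointing your client at localhost:119, it's worth confirming that something NNTP-like is actually answering there. This little check is just my own sketch (the helper names are mine, not anything from Giganews' docs); it reads the one-line NNTP greeting, where code 200 means posting allowed and 201 means read-only:

```python
import socket

def parse_greeting(line: str):
    """Parse an NNTP greeting such as '200 server ready' into
    (status_code, posting_allowed). 200 = posting ok, 201 = read-only."""
    code = int(line.split()[0])
    return code, code == 200

def check_proxy(host: str = "localhost", port: int = 119) -> bool:
    """Return True if an NNTP-style greeting is received on host:port."""
    try:
        with socket.create_connection((host, port), timeout=5) as s:
            line = s.makefile("r", encoding="latin-1").readline().strip()
    except OSError:
        return False
    code, _ = parse_greeting(line)
    return code in (200, 201)

if __name__ == "__main__":
    print("Accelerator reachable:", check_proxy())
```

If this prints False, the Accelerator isn't running (or isn't listening on 119) and your newsreader won't connect either.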
How do I get the server password? Also, using GN Accelerator…
In the Workspace, in the Servers pane, right-click the server, then invoke Advanced->Rename.
Also ensure that Edit menu->Properties->Tasks->Prevent header overload is checked. I don't think the defaults there are good, though: if you see wild fluctuation in the header download speed, try increasing the download speed limit in the header overload settings from 25 to somewhere around 100-200K, and maybe the amount of headers as well. Keep the parameters small enough that the UE process's RAM consumption is not rising quickly: rising RAM means UE is processing headers slower than they arrive, so unprocessed headers fill memory and the UE process may eventually run out of memory.
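The memory risk above is just an unbounded producer/consumer backlog. Here's a toy simulation of the idea (my own sketch with made-up rates, not UE's internals): when the download rate exceeds the indexing rate, the in-RAM backlog grows without limit, while a rate limit at or below the indexing rate keeps it flat.

```python
def simulate(download_rate: int, process_rate: int, ticks: int) -> int:
    """Return the peak backlog (unprocessed headers held in RAM) when
    headers arrive at download_rate per tick and UE indexes them at
    process_rate per tick."""
    backlog = peak = 0
    for _ in range(ticks):
        backlog += download_rate              # headers fetched this tick
        peak = max(peak, backlog)
        backlog -= min(backlog, process_rate)  # headers indexed this tick
    return peak

# Downloading twice as fast as UE can index: backlog grows every tick.
runaway = simulate(download_rate=200, process_rate=100, ticks=1000)
# Rate limit at the indexing rate: backlog never exceeds one tick's worth.
bounded = simulate(download_rate=100, process_rate=100, ticks=1000)
print(runaway, bounded)
```

That's all the "prevent header overload" limit is doing: throttling the supply so it never outruns what your machine can chew through.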
UE and Giganews Accelerator - Testimonial & Howto
I'm a current user of both.
With the Accelerator, I'm no longer 'afraid' to d/l 30M headers (170-day retention to date) in one go over a 'puny' 1.5Mbps ADSL (PPPoE) connection; my last 'full header download' snagged a total of 70+M headers from 4 binary groups. Yeah, I like to 'keep everything' and 'sloooowly' vet through them, purging those that I don't want. This is where UE helps me a lot. I call this 'distilling' my headers.
Here are some stats (I'm on a C2D OC'd 3.375GHz, 2GB RAM, SATA HDD):
The UE database is on the same drive as C:\WINDOWS, not yet defragmented after the header downloads.
With a 1.5Mbps PPPoE connection, the Accelerator tells me that I'm getting about 1.3Mbps 'actual'. Accelerated, my header d/l goes as fast as 13Mbps! UE's status bar tells me I'm running at 1.3 MBytes/sec, about 8-10 times faster. It took me a little more than 2 hrs to pull all 70 million headers that GN has in its 170-day retention.
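For the curious, the back-of-envelope arithmetic behind those figures (my own, using the numbers quoted above; the 2.25 hrs is my reading of "a little more than 2 hrs", and the bytes/header is a derived estimate, not a measurement):

```python
line_rate_mbps = 1.3           # measured 'actual' throughput of the 1.5Mbps link
ue_reported_MBps = 1.3         # decompressed header stream as seen by UE

# Effective header throughput delivered over the wire, in Mbit/s.
effective_mbps = ue_reported_MBps * 8
speedup = effective_mbps / line_rate_mbps   # lands inside the quoted 8-10x

headers = 70_000_000
seconds = 2.25 * 3600          # 'a little more than 2 hrs'
# Rough size of one overview line, taking 1 MB = 1,000,000 bytes.
bytes_per_header = ue_reported_MBps * 1_000_000 * seconds / headers
print(f"{speedup:.1f}x faster, ~{bytes_per_header:.0f} bytes per header")
```

So the compression is buying roughly an order of magnitude, and each header works out to about 150 bytes of overview data on the wire.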
All 4 binary newsgroups are set to the compact binary type in UE. Currently, I have all 4 opened/loaded and UE tells me that it is using 203MB (reserved). From Task Manager, I see Mem Usage as 293,180K, which is about right.
Opening/loading the 30M group takes about 20 secs (I sort by date). Searching/re-sorting takes below 5 secs to display results. I feel no slowdown even when doing complex searches in UE (mixed AND/OR on Subject combined with AND/OR on Author), perhaps an extra 1-2 seconds. For example: Subject: title1|title2|title3 AND Author: ^spammer1&^spammer2&^spammer3. Just tried it in UE's Global View (with all 4 newsgroups loaded): below 5 secs.
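For readers unfamiliar with that filter syntax: the way I use it, '|' means OR, '&' means AND, and a leading '^' negates a term. Here's a tiny evaluator sketching those semantics as I understand them; this is purely my own illustration, not UE's actual matcher (which supports far more):

```python
def matches(expr: str, text: str) -> bool:
    """Evaluate a simple UE-style filter against one header field.
    '|' = OR across terms, '&' = AND across terms, leading '^' = NOT;
    terms are plain case-insensitive substring matches."""
    text = text.lower()

    def term(t: str) -> bool:
        if t.startswith("^"):
            return t[1:].lower() not in text
        return t.lower() in text

    if "&" in expr:
        return all(term(t) for t in expr.split("&"))
    return any(term(t) for t in expr.split("|"))

# Subject must mention one of three titles...
subject_ok = matches("title1|title2|title3", "Re: title2 part 001")
# ...and the author must be none of three known spammers.
author_ok = matches("^spammer1&^spammer2&^spammer3", "gooduser@example.com")
```

Combined, the two filters express "any of these titles, from none of these posters", which is exactly the 'distilling' I do.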
The UE folder used up 2.19GB of HDD space. Things may be a bit more efficient once I defrag my C: drive, but at 1-2 secs per search call already, who cares...
Oh yes, system CPU utilization while everything's in progress is a mere 2% (though this might differ on a slower system).
Conclusion:
GN's Accelerator works beautifully. Combined with UE, fetching and manipulating huge databases becomes a non-issue. Personally, I feel that UE is the 'best' newsreader to be used with the GN service, especially when you want to play with all 200 days' worth of retention.
With GN's soon-to-become 200-day retention and UE's very efficient handling, I'm now able to have my own personal newzbin database. I can use UE's even more powerful search criteria to get a more accurate header list and export it to NZBs. I keep 2 instances of UE: one is 'live', used for actual article downloads based upon NZBs; the other just keeps all my headers, to search and generate NZBs for the former to use.
Howto:
Setting up the Accelerator to work with UE is similar to setting up an external SSL service for use with UE.
Think of a new name for this pseudo-server (no spaces in name). I call mine 'Giganews.Accelerator' (duh!).
Next, look up the 'hosts file' on your system. In XP, this is the file named: %WINDIR%\SYSTEM32\DRIVERS\ETC\HOSTS.
Use notepad.exe (or similar) to edit this text file and locate the entry:

    127.0.0.1       localhost

Append your 'new name' after 'localhost'. I used a 'tab' instead of a space to provide better readability:

    127.0.0.1       localhost       Giganews.Accelerator

Check that you've not typed in anything extra, and then save the hosts file.
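If you want to double-check the edit, the alias should now map to 127.0.0.1. A small sketch of such a check (the helper name is mine; the parsing just follows the standard hosts-file format of "IP name [aliases...]", with '#' starting a comment):

```python
def parse_hosts(text: str) -> dict:
    """Map each hostname/alias in hosts-file text to its IP address."""
    table = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and blanks
        if not line:
            continue
        ip, *names = line.split()              # whitespace: spaces or tabs
        for name in names:
            table[name] = ip
    return table

hosts_text = "127.0.0.1\tlocalhost\tGiganews.Accelerator\n"
table = parse_hosts(hosts_text)
print(table["Giganews.Accelerator"])
```

On a real system you'd read %WINDIR%\SYSTEM32\DRIVERS\ETC\HOSTS instead of the inline string; I've kept it inline so the sketch runs anywhere.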
Next, go to UE and create a new server entry. Supply 'Giganews.Accelerator:119' as servername:portno, and provide your necessary login_id and password.
For my case, I then brought up the Properties->Servers pane, ticked 'Headers', and made sure 'Msg-id' and 'Bodies' are unticked. I also have a server entry for 'news.giganews.com'; for that entry, I ticked 'Msg-id' and 'Bodies' and unticked 'Headers'. To be doubly sure, I also set 'Giganews.Accelerator' to 'Lowest' priority while leaving 'news.giganews.com' at a higher server priority, and I use a 'Strict Preference' rule. This ensures that I only d/l headers from 'Giganews.Accelerator' and article bodies via the normal way.
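The server-role setup above boils down to a simple routing rule: each server advertises which tasks it may serve, and strict preference walks the priority list and never falls through to a disallowed server. A toy model of that idea (my own sketch of the logic, not UE's actual scheduler):

```python
# Each entry: (name, priority, set of roles it is allowed to serve).
# Lower number = preferred first; 9 mirrors the 'Lowest' priority setting.
SERVERS = [
    ("news.giganews.com", 1, {"msg-id", "bodies"}),
    ("Giganews.Accelerator", 9, {"headers"}),
]

def pick_server(task: str) -> str:
    """Strict preference: take the most-preferred server whose role set
    includes the task; a server never serves a role it doesn't offer."""
    for name, _prio, roles in sorted(SERVERS, key=lambda s: s[1]):
        if task in roles:
            return name
    raise LookupError(f"no server configured for {task!r}")

print(pick_server("headers"))   # Giganews.Accelerator
print(pick_server("bodies"))    # news.giganews.com
```

Note that headers still route to the Accelerator despite its 'Lowest' priority, simply because no other server offers the 'headers' role; the priorities only matter where roles overlap.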
Whenever I need to download headers, I start up the Accelerator (you can auto-start it if you want; I opt to control it manually). Then I bring up UE's Workgroup->Newsgroups pane, expand the newsgroup I want to d/l headers from, right-click on 'Giganews.Accelerator', and select the appropriate 'get headers' function. After I've gotten all my headers, I can optionally shut down and exit the Accelerator.
When I load up a newsgroup and wish to d/l any bodies (e.g. NFO files), I just click accordingly, and the download is done using my normal server list based on the priorities I've set.
For my case, I then apply UE's searches and generate NZBs for the 'collections' I want to d/l, and import them into the other UE instance for the actual d/l. Additionally, after creating each NZB, I mark those headers as read. Once my other instance has successfully downloaded the articles, I return to my 'main UE database engine', show only read headers, and purge them accordingly. This is my method of doing 'simple housekeeping'.
End HowTo.
@krdu: I suggest you write to GN and somehow get your userid and password from them for future use. Keep it handy. I have a strange feeling that the Accelerator is GN's 'killer app', which they can use to differentiate their service offering from other NSPs. How about: SSL VPN with data acceleration/compression (headers & bodies)? A Juniper/Bluecoat device at their end (if one isn't there now) and an 'updated' version of the Accelerator is all it takes to get this done (with a bump up in subscription price, of course).
I sincerely hope this post helps someone...
XEQ.