I guess my purpose for writing this is to encourage anyone who has been thinking about upgrading their system: now might be the time. CPUs are dropping in price every week, and 256 MB of PC133 memory is about $24 now.
Check this link for the best prices.
www.sharkyextreme.com/hardware/weekly_cpu/
Klaus
[This message has been edited by Klaus (edited 08-07-2001).]
Thanks for the info...
Hauser
So far it has run perfectly – no lockups or crashes, even with power management on. I am running SETI all day today, so hopefully it will still be running when I get home. I might try to overclock it to 1.5 GHz tonight – supposedly it shouldn't be an issue. The Thunderbirds over 1.2 GHz are not multiplier locked, so overclocking is easier now (either the FSB or the multiplier can be increased). I will keep an eye on how hot the chip gets at different speeds until I find one I like.
A couple of other things I like about the motherboard: it has two additional IDE connectors for a total of four (all ATA100), and it comes with four USB ports. And no more ISA slots! I would stick with either an ASUS or an ABIT board and stay away from MSI (makers of my previous setup's glitchy motherboard). Let me know if you have any other questions.
Klaus
Just my two cents.
(And, I am only kidding around)
These days AMD doesn't multiplier-lock their Thunderbirds at 1.2 GHz and above – some think this is to let them be easily overclocked and therefore create more demand for them. It's always a crap shoot when it comes to overclocking anyway – my 1.3 might hit 1.6, or it might not go over 1.4.
Klaus
[This message has been edited by Chadwick (edited 08-07-2001).]
Two years ago, when I picked up my Athlon 500 (with a 650 core), it was a well-known fact among overclockers that AMD's batches were turning out so good (all the chips were testing at 650+) that they didn't have enough 500s to meet demand. So they labeled the 650 as a 500 – this had nothing to do with engineering and everything to do with supply and demand (something I know a thing or two about). After a while the batches were identified, and web sites started to offer the "650" (that was actually labeled a 500) plus the "snap-on device" to overclock it, for less money than the real 650. This of course only lasted for several months, until the 500 was phased out.
Hauser
I bought an OEM chip and an aftermarket CPU cooler (don't remember the name); it seems to run cool so far.
From what I understand, you shouldn't need to perform the "pencil trick" to overclock any Thunderbird over 1.2 GHz (they aren't multiplier locked). Basically, the trick reconnects a group of bridges on the chip by drawing between them with a pencil to make a connection – this is what unlocks the multiplier.
Again, make sure to get an ASUS or ABIT board that allows overclocking. My board is the first that allows 1 MHz increments in FSB; on my old board you only had something like three options – 100, 112, 133.
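Just to put numbers on the FSB-times-multiplier point above, here's a quick back-of-the-envelope sketch (the function name and the specific multiplier values are mine, purely for illustration – actual stable settings vary chip to chip):

```python
# Core clock = front-side bus (FSB) x multiplier.
# With an unlocked multiplier you can raise either knob.
def cpu_clock_mhz(fsb_mhz, multiplier):
    """Effective CPU core clock in MHz."""
    return fsb_mhz * multiplier

# A stock ~1.33 GHz Thunderbird: 133 MHz FSB x 10
print(cpu_clock_mhz(133, 10))    # 1330 MHz
# Raise the multiplier instead of the bus: 133 x 11.5
print(cpu_clock_mhz(133, 11.5))  # 1529.5 MHz, i.e. ~1.5 GHz
# Or bump the FSB in small steps with the multiplier fixed: 140 x 10
print(cpu_clock_mhz(140, 10))    # 1400 MHz
```

Raising the FSB also overclocks the memory and PCI/AGP buses on most boards of this era, which is why fine 1 MHz steps are handy; raising the multiplier only affects the CPU.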
Good luck
quote:
Originally posted by mikey:
I think I have to agree with Chad here, even though that does hurt me a little. Not because I know anything about heat transfer or pencil snapping or chippies' jumpers or crazy shit like that. I look at things on a fundamental basis. What really is the difference between 1.2 gig and 1.5 gig? I guess I'm not a computer whiz, but my guess would be not that much. I just think a person is setting themselves up for some trouble for very little gain by pencil jerking and whizbanging. Anyway, just my 2 cents worth.
I agree with you, Mike, that a 1.3 GHz chip at 1.5 GHz isn't really a big increase (about 15%), and I would just be seeing if my chip could run at a higher speed – I am not planning on keeping it overclocked. Now in your case – you have an 800 Thunderbird, right? What if, by drawing a couple of lines on the chip with a pencil, you could run it at 1.0 GHz or 1.2 GHz? A 25% or 50% increase might be worth it then, right? I know you would rather just spend the $100 and get a 1.0 GHz or 1.2 GHz chip, and that's cool :)
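The percentages above fall straight out of the usual relative-gain formula; a quick sketch (the helper name is mine, and this is clock speed only, not real-world performance):

```python
# Relative clock-speed gain from an overclock, as a percentage.
def gain_pct(stock_ghz, overclocked_ghz):
    return (overclocked_ghz - stock_ghz) / stock_ghz * 100

print(round(gain_pct(1.3, 1.5), 1))  # 15.4  -> 1.3 GHz to 1.5 GHz
print(round(gain_pct(0.8, 1.0), 1))  # 25.0  -> 800 MHz to 1.0 GHz
print(round(gain_pct(0.8, 1.2), 1))  # 50.0  -> 800 MHz to 1.2 GHz
```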
quote:
Originally posted by Chadwick:
You do understand, Scott, that you just don't get a straight percentage speed increase – your speed and performance depend on a lot of other factors, which can easily muddle a 25% clock increase in an identical system architecture.
Did I ever say that the percentage increase in MHz equated to the percentage increase in "performance"? I just don't agree with your statement that MHz within the same architecture are all the same (i.e., a PIII 800 = a PIII 900) – that's just silly.
What is the name of that place where you found that cheap memory over by the university? The Micron stuff, not the crappy PNY stuff that General NanoSystems and Best Buy sell.
Tran Micro Computer – 612-379-2572 – 2720 University Ave
General NanoSystems, Inc. - 612-331-3690 – 3014 University Ave
I think Tran Micro had the Micron memory, but I can't remember.
EVERY TIME!!!
Good god man, 128 MB of DDR RAM...
[This message has been edited by BoondockSaint (edited 02-11-2002).]
Nvidia's GeForce4 Hits and Misses
By Vince Freeman
The 4 Ti Is a Clear Advance; the 4 MX Is False Advertising
The 3D graphics market got a lot hotter today, as Nvidia announced new products spanning all price segments. The GeForce4 graphics processor line represents a big shift for the company -- in the past, each new Nvidia product offered a quantum leap forward in processing power or features, along with a new core design. The high-performance GeForce4 Ti and value-priced GeForce4 MX are quite different in that respect, being more enhancements of the existing GeForce3 and GeForce2 technologies, respectively.
At the high end, there's the true GeForce4 chip, which is a higher-clocked, core-enhanced version of Nvidia's former flagship GeForce3 Ti 500. There are some new bells and whistles, such as an extra vertex shader, a new antialiasing engine, and an improved 128-bit DDR memory controller, but basically the 4 Ti still looks a lot like a faster GeForce3.
The 4 comes in three flavors -- the ultra-high-end GeForce4 Ti 4600, the high-end GeForce4 Ti 4400, and the midrange GeForce4 Ti 4200. The last offers performance at about the level of a GeForce3 Ti 500, while the upper two are even faster.
While ATI is already sniping about "GeForce3.5," the new name is at least partly deserved for the true GeForce4 cards. The new boards' prices will be welcome news, with the GeForce4 Ti 4200 undercutting GeForce3 Ti 500 prices, while the higher-speed models stay competitive with Nvidia's previous performance-card pricing. So bleeding-edge game crazies get higher performance at the same price, while the rest of us get a new midrange board that's likely to be popular with a far larger market segment. It may be rather like Intel hypothetically introducing the Northwood as the Pentium 5, but as long as price/performance improves, the label is largely inconsequential.
There's a reason I'm referring to the 4 Ti as "the true GeForce4," however: things get a lot muddier when considering the new GeForce4 MX. Whether or not you're cool with the new GeForce4 naming scheme, the GeForce4 MX -- available in MX 460, MX 440, and MX 420 versions -- is actually more of a GeForce2 MX derivative with some enhancements thrown in for good measure.
For the uninitiated, the GeForce2 MX was a stripped-down version of the GeForce2, with only half of the processing pipelines found in the original GeForce2 core. This helped Nvidia achieve a lower price and made the GeForce2 MX a popular (though lower-performing) choice with system OEMs, both on separate graphics cards and in Nvidia's nForce integrated chipset.
You Are the Weakest Link, Goodbye
Unfortunately, the new MX is a GeForce4 in name only. The GeForce4 MX has none of the nifty DirectX 8/8.1 features supported by Nvidia's nfiniteFX engine, and even the former midrange GeForce3 Ti 200 is a far more advanced and feature-rich chip. There have been a few enhancements, such as a faster memory controller and hardware DVD playback, but these exist outside the core architecture. Again, even though the GeForce2 MX was a hobbled product for the value market, it was at least based on a GeForce2 core. By contrast, I'd be hard pressed to find a reason to call Nvidia's new value entry even a GeForce3 MX.
This move might even indicate a plateau effect in 3D technology. When Nvidia started its GeForce3 marketing push, DirectX 8 and nfiniteFX support were placed on a pedestal and their importance to future 3D development was proclaimed. Now with the GeForce4 MX not even including these basic features, what kind of product support can GeForce3 owners expect?
Frankly, I think the GeForce4 MX announcement immediately tarnishes the GeForce4 name and makes Nvidia's numbering system nearly useless. I'm quite familiar with the market, but can still find vendors' myriad names and numbers confusing. Imagine, then, how it'll look to the average consumer, who buys a new "High-Performance GeForce4" desktop for Junior only to find that its GeForce4 MX 420 SDR video is actually slower than the old GeForce2 Ultra?
These are very real risks for Nvidia -- aside from streamlining its varied product line (even mobile chips) under a single name, what could the company hope to gain by letting the hobbled MX version share the prestigious GeForce4 label with the real high-performance products? Well, the company took a lot of flak for including "only" GeForce2 MX video in its nForce chipset; maybe it'll sound better to advertise the next nForce as incorporating GeForce4 MX graphics. Also, to be fair, Nvidia might feel that the general computing public doesn't want or need the immense power of a true GeForce4 or even a GeForce3, while acknowledging that its economy TNT2 and Vanta products are past due for retirement.
Whatever the reasons, the DirectX-7-generation GeForce4 MX will spell confusion in the marketplace, and I wouldn't imagine Nvidia's competitors are happy with this sleight of hand. Imagine the ads we'll see touting GeForce4 systems at prices well below those of ATI Radeon-equipped PCs. There'll likely be a lowball GeForce4 MX inside, but it's an easy bet which system will get more attention.
So I'm tossing some laurels to Nvidia for leading the 3D performance race with the impressive GeForce4 Ti, but throwing darts at the deceitfully labeled GeForce4 MX -- it's not that the 4 MX is a poor performer for its market, but it clearly flaunts a name well above its station and hopes to beat its competitors with branding alone. Time will tell if the ploy works, but consumers like an ordered world with consistent numbering. Marketing a pumped-up GeForce2 MX as a GeForce4 seems to be stepping well over the line, and might give ATI some extra market share based on consumer goodwill alone.
god-damn I find that radio spot funny.....
it is sooooooo me!!!
[This message has been edited by cramer (edited 02-19-2002).]