Saturday, July 01, 2006

Crazy Bitch...

I must not be as computer-smart as I thought I was.

Since my gaming PC is hooked up to my sister's super-ancient Gateway monitor, I've decided to try out some of my prettiest games on it. You see, I purchased a 19" Envision flatscreen monitor for my PC. It is, in short, a piece of shit. Fuzzy center graphics, a fake flat screen, and distorted geometry at all corners make it a 45-pound paperweight to me. The Gateway monitor, from 1997, mind you, is a super-nice 17" near-flatscreen monitor capable of putting out 1600 x 1200 at 75 Hz.

I wanted to load up Half-Life 2: Episode One and see how sharp I could get things running. On my AMD Athlon 64 3000+, 1.8 GHz machine, I typically run the game at 1024 x 768, 32-bit color, with most settings, minus textures, at HIGH. Textures are always set to MEDIUM quality. I knew, absolutely without a doubt, that setting this game to 1600 x 1200 would bring framerates down to the single digits, as my Radeon X700, 256MB (128-bit) video card just isn't the fastest out there.

What the hell is going on here? My computer is tearing through Half-Life 2 at those settings with a cool 70 FPS average. Thing is, this is the same framerate I'm achieving at 1024 x 768. I even bumped textures up to HIGH quality at the max resolution of 1600 x 1200. Again, 70 FPS average.

What does this say about my rig? I guess a gig of DDR-400 memory helps things, but I'm beginning to think I'm underrating my GPU and greatly overrating my CPU, which is currently overclocked to a steady 2.2 GHz.
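The back-of-the-envelope math here is worth spelling out. A minimal sketch, using only the numbers from this post: if the GPU's fill rate were the bottleneck, framerate should drop roughly in proportion to the pixel count, which jumps about 2.4x going from 1024 x 768 to 1600 x 1200. The variable names and the inverse-scaling model are my own rough illustration, not anything measured from the game itself.

```python
# Rough check: is the game GPU-bound or CPU-bound at these settings?
# Naive model (an assumption): FPS scales inversely with pixel count
# when the GPU is the bottleneck.

low_res = 1024 * 768        # 786,432 pixels
high_res = 1600 * 1200      # 1,920,000 pixels
observed_fps_low = 70       # average FPS at 1024 x 768 (from the post)
observed_fps_high = 70      # average FPS at 1600 x 1200 (from the post)

pixel_ratio = high_res / low_res  # ~2.44x more pixels to draw

# What the GPU-bound model would predict at the higher resolution:
expected_fps_if_gpu_bound = observed_fps_low / pixel_ratio

print(f"Pixel ratio: {pixel_ratio:.2f}x")
print(f"Predicted FPS if GPU-bound: {expected_fps_if_gpu_bound:.1f}")
print(f"Observed FPS at 1600x1200: {observed_fps_high}")
```

The model predicts roughly 29 FPS at 1600 x 1200 if the X700 were the limiting factor. Since the observed framerate didn't budge from 70, something else, almost certainly the CPU, is capping it.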

How high can I go? I'll be taking official benchmarks with 3DMark 2001 and re-evaluating the overall power of my rig soon enough.
