XP Quick Fix Plus - Free Portable Repairing Tools

XP Quick Fix Plus is freeware by LeeLu Soft, 2009, v2.0 (Plus).
There are plenty of excellent free, comprehensive Windows repair tools around the internet. But there are times when you want just one quick fix: for example, when you are under a virus attack and simply want to re-enable the Task Manager or the Registry Editor so you can fight back, or when a new installation has changed XP's behavior and My Documents now opens at every start-up.

This is the right time to reach for XP Quick Fix Plus: fixes for 40 common Windows XP problems in a portable, small and fast package of only 0.58 MB, a must-have on every computer, plus a small extra, a command-line utility that applies 6 common fixes directly from the command line!

Note! A problem can show the same symptoms as one described in XP Quick Fix but have a different cause. In that case XP Quick Fix may not solve the problem, but it also will not harm your computer.

How to use XP Quick Fix Plus GUI


XP Quick Fix is a portable application: there is no install process, just unzip it to any folder and run it. It includes two files:
  • LFX.exe (584 KB) - the main program GUI with 40 fixes
  • QFC.exe (38 KB) - a command-line utility with 6 fixes

Using XP Quick Fix is very simple: just run it and click the button for the relevant fix. Some of the fixes only take effect after a restart.

Clicking a fix button is safe and will not harm even a properly configured computer!

XP Quick Fix Features
  • Enable Task Manager
  • Enable Registry Editor
  • Stop My Documents Open At Start-up
  • Enable Folder Options
  • Restore Missing Run Dialog
  • Enable CMD
  • and many more tools ...
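For background, fixes like the ones listed above usually work by resetting well-known policy values in the Windows registry. The exact keys XP Quick Fix Plus changes are not documented here, so the lines below are only a rough sketch of the standard XP policy values involved (an assumption, not the tool's actual implementation):

rem Sketch only: standard XP policy values that fixes of this kind typically reset
rem (not confirmed to be what XP Quick Fix Plus actually does)
rem Re-enable the Task Manager (0 = enabled)
reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\System" /v DisableTaskMgr /t REG_DWORD /d 0 /f
rem Re-enable the Registry Editor
reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\System" /v DisableRegistryTools /t REG_DWORD /d 0 /f
rem Restore Folder Options by removing the restriction value
reg delete "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer" /v NoFolderOptions /f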

How to use the Command Line Utility

In case there is no way to run XP Quick Fix Plus from Windows, or if you need to run a fix from a batch file or script, a simple command-line utility (QFC.EXE, only 38 KB) is included in the original zip file (see the example after the parameter list below).

Open a command prompt (cmd.exe), change directory to the program folder and run QFC.EXE with one of the following parameters:

QFC /t - quick enable the Task Manager
QFC /r - quick enable the Registry Editor
QFC /f - quick enable the Folder Options
QFC /e - quick restore the Run Dialog
QFC /p - quick restore My Computer Properties
QFC /x - quick fix for when Windows can't run EXE files
  • Note! Parameters are case sensitive.
  • Note! You can use only one parameter at a time.
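For example, a minimal batch file that re-enables the Task Manager and the Registry Editor could look like the lines below (the folder C:\Tools\XPQuickFix is only an assumed extraction path; adjust it to wherever you unzipped the tool):

@echo off
rem Assumed extraction folder - adjust to your own
cd /d C:\Tools\XPQuickFix
rem One parameter per call, and the switches are case sensitive
QFC /t
QFC /r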
OS Support

XP Quick Fix Plus was made for Windows XP and was tested only on Windows XP SP2; it has not been tested on any other OS.

Notes
  • Very important! XP Quick Fix Plus will not remove any virus or malware! It only re-enables features that were disabled by the virus or by other programs. You still have to take the required steps to remove the virus itself, but the tool will certainly help you fight back.
  • Be aware that if your computer is part of an organizational network, some of these issues are likely caused by the organization's group policy. This tool is intended only for personal and private computers!
  • If your computer is part of an organizational network, consult your system administrator before using this tool!
LeeLu Soft Freeware License
Copyright © 2008 by LeeLu Soft
All rights reserved

How to Send a Fax from Your Computer

So how can you send a fax from your computer using the Internet or email? We live in an age of convenience where we can do almost everything from our computers over the Internet, often for free, and sending and receiving faxes is one of those things.

Not long ago, you needed a bulky fax machine and a phone line to send and receive faxes, not to mention the costs of the machine itself, fax paper, the phone line, and more.

But today you can easily send and receive your faxes from your computer using your email. Many of these online fax services even offer you a free local phone number to use as your faxing number.

Why is sending a fax from your computer better than using a fax machine?

Simply because you save a lot of time and can send and receive faxes with a single click. You will also save a lot of money by using free online fax services.

Not to mention that you can access your faxes from anywhere in the world, as long as you have an Internet connection and your computer. You can see all your incoming faxes in your email online. How much easier can it get?

How to Send a Fax from Your Computer?

It is very easy and you can do it in 3 simple steps. Here they are...

Step 1: Choose a Good Fax to Email Service Online

There are many services and programs available online that give you an easy-to-use fax number linked to your email address, so any fax sent to that number is automatically delivered to your email address as an attachment.

When you want to send a fax, you can log in to your web account, upload your fax in one of many formats, and then send it to any local or even international fax number you want.

Step 2: Upload Your Fax Document Online

You can upload the documents you want to send in various formats, such as Microsoft Word, picture files, Excel, or plain text.

Then you can choose one of their professional, ready-made fax document templates and cover pages to give your fax an impressive layout.

Step 3: Click the Send Fax Button

After everything is ready, you simply click the button to send out your fax to the number you want. Yes, as easy as that.

The most important of these three steps is choosing the best online faxing service, one that offers all the features and benefits you want. So make sure you do enough research and choose the best online fax-to-email service.

Author:
Alex C Johnson

  • Did You Know? You can send and receive your faxes by email, starting 3 minutes from now, easily and quickly. Simply get your Free Fax to Email Service now.
  • Want to find the most reliable and easy-to-use online fax services? Check out this helpful guide to discover the best Free Online Fax Services.

Computer Tips: 23 Easy Ways to Speed Up Windows XP

Computer Info, Tips and Tricks about the Operating System (Windows XP).

Since defragging the disk won't do much to improve Windows XP performance, here are 23 suggestions that will. Each can enhance the performance and reliability of your customers' PCs. Best of all, most of them will cost you nothing.

1.) To decrease a system's boot time and increase system performance, use the money you save by not buying defragmentation software -- the built-in Windows defragmenter works just fine -- and instead equip the computer with an Ultra-133 or Serial ATA hard drive with 8-MB cache buffer.

2.) If a PC has less than 512 MB of RAM, add more memory. This is a relatively inexpensive and easy upgrade that can dramatically improve system performance.

3.) Ensure that Windows XP is using the NTFS file system. If you're not sure, here's how to check: first, double-click the My Computer icon, right-click the C: drive, then select Properties and examine the File System type. If it says FAT32, back up any important data, then click Start, click Run, type CMD, and click OK. At the prompt, type CONVERT C: /FS:NTFS and press Enter. This process may take a while; it's important that the computer be uninterrupted and virus-free. The file system used by the bootable drive will be either FAT32 or NTFS. I highly recommend NTFS for its superior security, reliability, and efficiency with larger disk drives.
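As a quick command-prompt alternative to the GUI check, something like the following should work on XP (the fsutil line is an assumption on my part and its output wording may vary slightly; the CONVERT command is the one described above and cannot be undone, so back up first):

rem Show the file system of C:, then convert it to NTFS if it reports FAT32
fsutil fsinfo volumeinfo C:\ | find "File System Name"
convert C: /FS:NTFS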

4.) Disable file indexing. The indexing service extracts information from documents and other files on the hard drive and creates a "searchable keyword index." As you can imagine, this process can be quite taxing on any system.

The idea is that the user can search for a word, phrase, or property inside a document, should they have hundreds or thousands of documents and not know the file name of the document they want. Windows XP's built-in search functionality can still perform these kinds of searches without the Indexing service. It just takes longer. The OS has to open each file at the time of the request to help find what the user is looking for.

Most people never need this feature of search. Those who do are typically in a large corporate environment where thousands of documents are located on at least one server. But if you're a typical system builder, most of your clients are small and medium businesses. And if your clients have no need for this search feature, I recommend disabling it.

Here's how: First, double-click the My Computer icon. Next, right-click on the C: Drive, then select Properties. Uncheck "Allow Indexing Service to index this disk for fast file searching." Next, apply changes to "C: subfolders and files," and click OK. If a warning or error message appears (such as "Access is denied"), click the Ignore All button.
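If you prefer to go further and stop the Indexing Service itself rather than just unchecking the per-drive box, the XP service should be the one named CiSvc; a hedged command-line sketch (assuming you have confirmed nothing on the machine relies on it) would be:

rem Stop the Indexing Service and prevent it from starting again
net stop CiSvc
sc config CiSvc start= disabled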

5.) Update the PC's video and motherboard chipset drivers. Also, update and configure the BIOS. For more information on how to configure your BIOS properly, see this article on my site.

6.) Empty the Windows Prefetch folder every three months or so. Windows XP can "prefetch" portions of data and applications that are used frequently. This makes processes appear to load faster when called upon by the user. That's fine. But over time, the prefetch folder may become overloaded with references to files and applications no longer in use. When that happens, Windows XP is wasting time, and slowing system performance, by pre-loading them. Nothing critical is in this folder, and the entire contents are safe to delete.
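One simple way to empty it from a command prompt (nothing critical lives there, as noted above, and Windows rebuilds it over time):

rem Delete the contents of the Prefetch folder
del /q %windir%\Prefetch\*.*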

7.) Once a month, run a disk cleanup. Here's how: Double-click the My Computer icon. Then right-click on the C: drive and select Properties. Click the Disk Cleanup button -- it's just to the right of the Capacity pie graph -- and delete all temporary files.
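If you'd rather script this monthly cleanup than click through the GUI, Windows XP's cleanmgr supports saved cleanup profiles; a rough sketch (profile number 1 is an arbitrary choice):

rem Choose once which categories of temporary files to clean...
cleanmgr /sageset:1
rem ...then run that saved profile whenever you like (e.g. from a scheduled task)
cleanmgr /sagerun:1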

8.) In your Device Manager, double-click on the IDE ATA/ATAPI Controllers device, and ensure that DMA is enabled for each drive you have connected to the Primary and Secondary controller. Do this by double-clicking on Primary IDE Channel. Then click the Advanced Settings tab. Ensure the Transfer Mode is set to "DMA if available" for both Device 0 and Device 1. Then repeat this process with the Secondary IDE Channel.

9.) Upgrade the cabling. As hard-drive technology improves, the cabling requirements to achieve these performance boosts have become more stringent. Be sure to use 80-wire Ultra-133 cables on all of your IDE devices with the connectors properly assigned to the matching Master/Slave/Motherboard sockets. A single device must be at the end of the cable; connecting a single drive to the middle connector on a ribbon cable will cause signaling problems. With Ultra DMA hard drives, these signaling problems will prevent the drive from performing at its maximum potential. Also, because these cables inherently support "cable select," the location of each drive on the cable is important. For these reasons, the cable is designed so drive positioning is explicitly clear.

10.) Remove all spyware from the computer. Use free programs such as AdAware by Lavasoft or SpyBot Search & Destroy. Once these programs are installed, be sure to check for and download any updates before starting your search. Anything either program finds can be safely removed. Any free software that requires spyware to run will no longer function once the spyware portion has been removed; if your customer really wants the program even though it contains spyware, simply reinstall it. For more information on removing Spyware visit this Web Pro News page.

11.) Remove any unnecessary programs and/or items from Windows Startup routine using the MSCONFIG utility. Here's how: First, click Start, click Run, type MSCONFIG, and click OK. Click the StartUp tab, then uncheck any items you don't want to start when Windows starts. Unsure what some items are? Visit the WinTasks Process Library. It contains known system processes, applications, as well as spyware references and explanations. Or quickly identify them by searching for the filenames using Google or another Web search engine.
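MSCONFIG covers several startup locations; if you just want a quick look at the two most common ones from a command prompt, these standard reg queries will list them (a supplement to the MSCONFIG steps above, not a replacement):

rem List programs launched at logon from the common Run keys
reg query "HKLM\Software\Microsoft\Windows\CurrentVersion\Run"
reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Run"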

12.) Remove any unnecessary or unused programs, from the Add/Remove Programs section of the Control Panel.

13.) Turn off any and all unnecessary animations, and disable active desktop. In fact, for optimal performance, turn off all animations. Windows XP offers many different settings in this area. Here's how to do it: First click on the System icon in the Control Panel. Next, click on the Advanced tab. Select the Settings button located under Performance. Feel free to play around with the options offered here, as nothing you can change will alter the reliability of the computer -- only its responsiveness.

14.) If your customer is an advanced user who is comfortable editing their registry, try some of the performance registry tweaks offered at Tweak XP.

15.) Visit Microsoft's Windows update site regularly, and download all updates labeled Critical. Download any optional updates at your discretion.

16.) Update the customer's anti-virus software on a weekly, even daily, basis. Make sure they have only one anti-virus software package installed. Mixing anti-virus software is a sure way to spell disaster for performance and reliability.

17.) Make sure the customer has fewer than 500 type fonts installed on their computer. The more fonts they have, the slower the system will become. While Windows XP handles fonts much more efficiently than did the previous versions of Windows, too many fonts -- that is, anything over 500 -- will noticeably tax the system.

18.) Do not partition the hard drive. Windows XP's NTFS file system runs more efficiently on one large partition. The data is no safer on a separate partition, and a reformat is never necessary to reinstall an operating system. The same excuses people offer for using partitions apply to using a folder instead. For example, instead of putting all your data on the D: drive, put it in a folder called "D drive." You'll achieve the same organizational benefits that a separate partition offers, but without the degradation in system performance. Also, your free space won't be limited by the size of the partition; instead, it will be limited by the size of the entire hard drive. This means you won't need to resize any partitions, ever. That task can be time-consuming and also can result in lost data.

19.) Check the system's RAM to ensure it is operating properly. I recommend using a free program called MemTest86. The download will make a bootable CD or diskette (your choice), which will run 10 extensive tests on the PC's memory automatically after you boot to the disk you created. Allow all tests to run until at least three passes of the 10 tests are completed. If the program encounters any errors, turn off and unplug the computer, remove a stick of memory (assuming you have more than one), and run the test again. Remember, bad memory cannot be repaired, but only replaced.

20.) If the PC has a CD or DVD recorder, check the drive manufacturer's Web site for updated firmware. In some cases you'll be able to upgrade the recorder to a faster speed. Best of all, it's free.

21.) Disable unnecessary services. Windows XP loads a lot of services that your customer most likely does not need. To determine which services you can disable for your client, visit the Black Viper site for Windows XP configurations.
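As one illustrative example only (check the Black Viper list for your own configuration before disabling anything), the Alerter service is commonly cited as safe to turn off on a stand-alone XP machine:

rem Check the service's state, then stop it and set it to disabled
sc query Alerter
net stop Alerter
sc config Alerter start= disabled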

22.) If you're sick of a single Windows Explorer window crashing and then taking the rest of your OS down with it, then follow this tip: open My Computer, click on Tools, then Folder Options. Now click on the View tab. Scroll down to "Launch folder windows in a separate process," and enable this option. You'll have to reboot your machine for this option to take effect.
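For what it's worth, the same option can also be set through the registry; to the best of my knowledge it corresponds to the SeparateProcess value below (treat this as an assumption and prefer the Folder Options checkbox if in doubt), and it still needs a reboot:

rem Registry equivalent of "Launch folder windows in a separate process"
reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced" /v SeparateProcess /t REG_DWORD /d 1 /f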

23.) At least once a year, open the computer's case and blow out all the dust and debris. While you're in there, check that all the fans are turning properly. Also inspect the motherboard capacitors for bulging or leaks.

Following any of these suggestions should result in noticeable improvements to the performance and reliability of your customers' computers. If you still want to defrag a disk, remember that the main benefit will be to make your data more retrievable in the event of a crashed drive.

Author: Rathish Kumar

In the next article, let's talk about tips and tricks for Windows Vista.

Troubleshooting and Maintaining Your Laptop Battery


The actual life of a laptop battery will vary with computer usage habits. For most users, it is not uncommon to experience differences in battery life, of anywhere from just under one hour to over two hours in each sitting. If you are experiencing shorter battery life cycles, say 10 to 15 minutes, it may not yet be time to order that new battery.

There are several factors to take into consideration when determining whether the time has come to replace your battery. This information may also apply to a new battery that you recently purchased and that has been giving you fits. The two primary things to consider when troubleshooting battery problems are usage habits and battery memory. We will cover both in their complexities in just a moment, but first, let us take a look at what you should expect from your battery's life cycle.

NiMH batteries usually last 1.5 to 2.5 hours.
Li-Ion batteries usually last 2.0 to 3.0 hours.

These are average results and the results will vary greatly depending on your system's conservation settings, the temperature of the room and the climate that you are operating your computer in. As a general rule, your Lithium Ion battery will last much longer than your standard Nickel Metal Hydride battery.

Now let's take a look at the various usage habits to consider when troubleshooting your laptop's battery. These processes are very similar to the way your portable stereo uses batteries: just think how much faster your stereo eats batteries when you are playing a CD or the tape deck, as opposed to when you are just playing the radio.

The more you use physical devices --- which require more electricity to operate --- the more of the battery's power you can expect to consume. The devices that create a larger power drain are the hard drive, the floppy drive and the CD-ROM.

When the computer is able to use its physical memory resources to store information, it will use less of the battery's power, since the process is mostly electrical in nature. However, when the processes you are running exhaust the physical memory available to the system, the system turns to virtual memory to continue the task at hand. Virtual memory extends system memory by building a memory swap file on the hard drive and transferring needed information between the hard drive and physical memory as required. Since the hard drive is an electricity hog, the use of virtual memory becomes an electricity hog by proxy.
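If you want a rough idea of how much virtual memory your XP system is actually dipping into while on battery, the built-in systeminfo command reports it (a small, hedged check; the exact wording of the output lines may differ between XP editions):

rem Show current virtual memory figures
systeminfo | find "Virtual Memory"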

Two other processes that engage virtual memory on your computer are computational programs and the calculation processes used by spreadsheet applications and database programs. Both of these also engage the processor to a greater degree, which in itself consumes electricity. Because they compute and calculate large quantities of information, they will increase the amount of electricity that your laptop consumes.
More information about laptops is available at the author's website:
Laptop Parts - Laptop Battery, Laptop Keyboards, Laptop AC Adapter, LCD Screen Panel

Overclock Your Computer

There are two schools of thought as to why you can, or would even want to overclock most CPUs and GPUs. One of them takes the peace, love and understanding route, namely that the manufacturing process is never 100 per cent reliable, so not every chip that rolls off the same production line is born equal. Those with the most lustrous coats and shiniest eyes (bred on Pedigree, presumably) are ready to be high-end components, but those with a bit of a squint and a runny nose may have a funny turn if they exert themselves too much.


Hence, some chips are slapped with a lower official clock-speed and sold for less groats than their beefier brethren. The potential for their intended glory remains, however. Overclocking techniques can unlock at least some of that potential, albeit at the risk of frying the chip completely.

The tinfoil hat/Angry Internet Men theory is based on the same concept but chucks in a bit of paranoia. In this scenario, every same-series processor is born equal, but The Man artificially neuters most of them and slaps different badges on what are fundamentally the same chips. Overclocking, then, is simply a way of taking back what's rightfully yours.

The truth likely lies somewhere between the two. Mass production certainly makes more financial sense than dozens of separate lines, and it's true that a low-end CPU or GPU can be made to punch far above its weight, but their stability isn't as guaranteed as a chip that's officially able to run at a higher speed. No manufacturer wants to deal with a steady trickle of returned parts, after all. But it does mean home overclocking is almost always productive - and seemingly more so with every new hardware generation.

It's also increasingly easy. The earliest overclocking, on the 4 to 10MHz 8088-based CPUs of 1983, involved desoldering a clock crystal from the motherboard and replacing it with a third-party one, with only partially successful results. Ouch. Still, the precedent was set: a dedicated guy-at-home could exceed his chip's official spec. IBM, then very much the top dog of PC land, wasn't entirely happy about this, so follow-up hardware included hard-wired overclock blocks.

More soldering, this time of a BIOS chip, managed to get around this. By 1986 IBM's stranglehold had been broken, resulting in a raft of 'clone' systems - and a wealth of choice. Intel's 286 and 386 processors became the de facto standard chips, and bus speed and voltage controls began to shift from physical switches and jumpers to BIOS options and settings.

It was the 486 that really changed everything, however. It's telling that this was the most prevalent chip during the era that birthed the first-person shooter as we know it: 1993's Doom very much popularized performance PCs for gaming, driving system upgrades in the same way a Half-Life 2 or Crysis does these days. At the same time, the 486 introduced two concepts absolutely crucial to overclocking both then and now. Firstly, it popularized split product lines; no longer was it a matter of simply buying a processor, but rather which processor. The 486SX and DX offered some serious performance differential, and notably the SXs were hobbled/failed DXs, giving rise to the ongoing practice of assigning different speeds and names to what was essentially the same chip.

For a while too, the 25MHz SXes could be overclocked to 33MHz by adjusting a jumper on the motherboard; something less salubrious retailers took full advantage of. Secondly, it introduced the multiplier: performing several clock cycles for every one mustered by the system's front side bus. The 486's 2x multiplier thus effectively doubled the bus frequency. This was something overclockers would make the best of for successive processor generations - bumping up the multiplier was the simplest and often most effective way of increasing CPU speed. Nowadays (since the Pentium II, in fact), the multiplier is locked to prevent this, save for high-end chips, such as Intel's Extreme Edition series. For a while, there were complicated ways of defeating the multiplier lock: soldering on a PCB for earlier chips, third-party add-ons and the infamous practice of drawing a line onto certain AMD CPUs with a pencil. No CPU manufacturer's likely to make that mistake again.

Around this time, RAM overclocking became more commonplace as memory speeds were ratified, and with that came more tweaking of the front-side bus to compensate for the locked multipliers. Overclocking shifted further towards the BIOS and away from jumpers, which in turn led to overclocking software.

The first was 1998's SoftFSB, which enabled bus-tweaking from within Windows for the first time. With the Pentium III era came aftermarket coolers, as processors now chucked out so much heat that a standard cooling block and fan weren't enough to cope with an overclocked chip. And so it continued, overclocking largely becoming easier and more commonplace with each processor generation. This leads us to the Core 2 chips of today, and Intel's current terrifyingly unassailable dominance of the CPU market. Generally drawing as little as half the power of the Pentium 4s that preceded them, most of the range offers a vast amount of overclocking headroom, to the point that a low-end Core 2 Duo can almost go toe-to-toe with the top of the line.

So how's it done? Key to processor overclocking is the front side bus (FSB). In the very simplest terms, this is the connection between the CPU and the rest of the PC, and its speed defines the processor's speed to a significant extent. An Intel CPU's final speed is the FSB times the multiplier - so if you've got an FSB of 266MHz and a multiplier of 9, your chip will run at approximately 2.4GHz. While the multiplier is usually locked - though some chips let you at least lower it, to conserve power and reduce heat - the FSB isn't. Bump up the FSB and you bump up the chip. In our example, taking the bus to 290MHz gives us a 2.6GHz processor. This is no random example, incidentally: it's what we run the Intel Core 2 Quad Q6600 in one of our office test systems at, giving it a healthy 200MHz boost that makes a noticeable difference in CPU-intensive games and hi-def video re-encodes.
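To make the arithmetic explicit, here's a tiny batch sketch of the speed = FSB x multiplier sum using the figures above (the variable names are just illustrative):

@echo off
rem CPU speed = FSB x multiplier, using the Q6600 example figures
set /a STOCK=266*9
set /a BUMPED=290*9
echo Stock:       %STOCK% MHz (about 2.4 GHz)
echo Overclocked: %BUMPED% MHz (about 2.6 GHz)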

What stops us from going higher? Not a lot in the case of this particular chip. We're playing it safe for desktop work, cos we're in a particularly sweaty office. When we're playing around with high-end tasks, we can have it running stably at over 3.3GHz (with an FSB of 370 or so) on a decentish, third-party air cooler. That's more or less trading blows with the best Intel has to offer on a $200 chip. But while going to 280MHz on the FSB took a BIOS tweak, a reboot and Microsoft BOB's your uncle, going much higher does involve more fuss.

First up, when our Q6600 is at 3.3GHz, it's also running at nearly 70°C under maximum load (and around 50°C when idling). It's perfectly stable, but that kind of heat could damage the chip in the long run, and on top of that the fan is making enough noise to wake the deaf pensioner in the next street over. Watercooling, a fancier air-cooler or even just a spot of dust-cleaning will bring the heat down, but there can come a point where that stuff becomes more expense and hassle than simply buying a better processor.

Hurdle the second is the motherboard. Pushing up the FSB doesn't affect only the CPU, but also the motherboard and, in many cases, the RAM and PCI-e slot to boot. In our case, we're using a motherboard that supports a monstrously high FSB. When shopping for a motherboard, its max FSB will usually be referred to as four times the actual speed, due to the way the processor actually fetches data. So when we've got the FSB set to 266MHz, in effect that's 1,066MHz. When it's up to 372MHz, we need a motherboard that's happy at nearly 1,500MHz. That simply isn't a given, especially on cheaper boards, so shop carefully.
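In other words, the number on the motherboard box is the quad-pumped figure: four transfers per bus clock. A quick sketch with the numbers used above:

@echo off
rem Motherboard spec sheets quote 4x the actual FSB clock
set /a QUOTED=372*4
echo A 372 MHz FSB is advertised as roughly %QUOTED% MHz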

As well as that, if you've got a board with a stingy BIOS, you may not be able to alter RAM and PCI timings independently of the FSB, which can lead to those falling over. Ours does, and for our mighty near-gigahertz Q6600 overclock, we have to lower the RAM's clock speed a little to compensate for the strain put on it by the raised FSB - we have it sitting pretty at 893MHz. It could comfortably go higher, but the real-world benefits (as opposed to the willy-waving benefits, which are a different matter entirely) would be so minuscule that it's simply not worth placing the extra pressure on the RAM.


Similarly, while faster and, most likely, more expensive RAM will cope better at its stock speed with a massive FSB, the pay-off is often so minor that value RAM running at a lower clock speed may well be enough to make your overclocking masterplan hugely successful. Even the best memory will net you something in the region of just a five per cent performance boost - worth having if every little helps, but it's the FSB that makes the big difference. And for that, the motherboard is critical.

Thirdly, there's the matter of voltage. The faster your chip runs, the more power it needs to feed it. As the FSB goes up, you'll find your motherboard's North Bridge and your RAM also get hungrier.

Unfortunately, your hardware won't automatically report its revised power requirements, so trial and miserable error are required to find the sweet spot. Volt tweaking is a fiddly and danger-fraught business.

Some overclocking-friendly motherboards can automatically adjust voltages for you, but are understandably conservative about it, so for the really big overclocks you'll need to set them yourself. This needs to be done by the tiniest increments possible, establishing reboot-by-reboot how many volts your embiggened CPU needs; as low as possible, essentially, as firing too many into it can fry it.

Establish in advance what your chip's out-of-the-box voltage is and, through a mix of common sense and googling, decide on a number you're not going to risk going higher than. We pushed our Q6600 from 1.3 to 1.4V, which is a fairly big increase as volt modding goes. It's not just a matter of the so-called vCore either - as you go for the big overclocks, you'll find you're having to play with the arcane likes of CPU PLL and FSB termination voltage. Again, so long as you raise things in tiny increments, the risk of killing your chip, RAM or motherboard is fairly minimal.


It's a different matter with AMD processors, which for a while now have had an onboard memory controller. This allows the chip to communicate more directly with the RAM, which in turn means there isn't an FSB as such. Instead, you're overclocking something known as the HyperTransport bus, which is achieved in more or less the same way, but can require lowering the HT bus's own multiplier to retain stability when you bump the speed. If you've gone for one of the recent AMD Phenom Black Editions, you'll find it comes with the multiplier unlocked, which makes overclocking an easier affair.

By contrast, overclocking a graphics card is dead simple. As a more self-contained piece of hardware, there's none of this confusing multiplier or FSB business; just overclocking the card itself, finding the right speeds for both the GPU and the card's onboard memory. Free software - some of it official NVIDIA/ATI driver plug-ins - will do the trick from within Windows, and built-in safety cut-offs and stability tests make it incredibly hard to damage the card, though of course you are going beyond the warranty.

It's also grown a little more complicated of late in that you may need to overclock the shader clock as well as the GPU and RAM for the best boosts. In the case of NVIDIA cards, it used to be that this was twinned to the GPU speed, meaning a raise in one had a synchronous effect on the other, but for a little while now they've been able to be altered independently. So if you hit the speed ceiling on the GPU, it may yet be possible to eke more performance out of the card by pushing the shader clock a little further.


While the present situation is that you can overclock everything and be pretty confident it'll work, the future of the form is harder to call. One thing seems sure: it's not a dirty little nerdy secret anymore, but an increasingly common practice, most especially with Core 2 chips. There's a vast aftermarket cooler industry to support it, and even cheap motherboards can handle a bit of a free boost. If anything overclocking will become easier, with more and better applications to achieve it within Windows, rather than from the BIOS, and possibly more in the way of automatic volt-modding. But much depends on the future of desktop processing. There's a big war brewing between Intel and NVIDIA as to whether the CPU or the GPU will be the major element in the PC of the near-future.

Intel are pushing ray-tracing using a multi-core CPU to render game graphics, while NVIDIA's CUDA enables its recent GeForce cards to perform parallel processing, such as video encoding and in-game physics, far faster than a CPU could manage. If either of these beds in, overclocking will need to take them into account. At the same time, the slow move to ever-more cores potentially reduces the need for conventional overclocking, as raw clock speed becomes a lesser concern than multi-threading and, in the case of 3D cards, the number of stream processors and texture units. That's hardly going to stop anyone from trying it, of course. Even when its effects are minimal, overclocking is always going to be a sure-fire way of making a system feel like it's yours rather than simply a collection of mass-produced parts.

Modding the case is one thing, but what makes a PC is its performance. When you've painstakingly tweaked that performance into something that suits your own purposes, and it's become something that feels like you've gone far beyond what you paid for it, the system will feel more unique than all the green neon tubing in the world could ever hope to achieve.

Author:
Sandra Prior

website:
http://usacomputers.rr.nu
http://sacomputers.rr.nu.

Other Tips and Tricks