When CNET invited me to start blogging here, I suggested the name "Speeds and Feeds" because that phrase bridges two fields I'm interested in: machine-shop operations, where it originated, and the computer industry, which adopted it fairly recently.
In a machine shop, the phrase has a definite meaning: "speed" is the rate at which a tool's cutting edge moves through the workpiece. "Feed" is the rate at which the tool is advanced into or across the workpiece, which determines how much material each cutting edge removes.
So a 1"-diameter two-flute end mill (a tool that looks like a drill bit, but designed to cut sideways) turning at 1,000 rpm has a "speed" of (1 * 3.14 * 1,000 / 60) = 52.3 inches per second.
To set a chip load of 0.003"--that is, to remove 0.003" of material each time a cutting edge reaches the workpiece--that same end mill needs a "feed" rate of (0.003 * 2 * 1,000) = 6 inches per minute.
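For readers who think better in code than in arithmetic, here's a quick Python sketch of those two formulas. The function names and units are my own choices, not machine-shop standards; this just mechanizes the example above.

```python
import math

def cutting_speed_ips(diameter_in, rpm):
    # Speed of the cutting edge through the workpiece:
    # circumference (inches) times revolutions per second.
    return diameter_in * math.pi * rpm / 60

def feed_rate_ipm(chip_load_in, flutes, rpm):
    # Feed rate in inches per minute: material removed per edge,
    # times edges per revolution, times revolutions per minute.
    return chip_load_in * flutes * rpm

# The 1"-diameter, two-flute end mill at 1,000 rpm from the example:
speed = cutting_speed_ips(1.0, 1000)   # about 52.4 in/s (52.3 with pi rounded to 3.14)
feed = feed_rate_ipm(0.003, 2, 1000)   # 6.0 in/min
print(f"speed: {speed:.1f} in/s, feed: {feed:.1f} in/min")
```

(Using the full value of pi gives 52.4 inches per second rather than the 52.3 you get from rounding pi to 3.14; the difference doesn't matter in the shop.)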
A machinist uses his or her experience, along with guidelines and tables from Machinery's Handbook (in print since 1914 and now in its 27th edition) and other sources, to select the right tool, speed, and feed for a particular material and other requirements.
In the computer industry, "speeds and feeds" has no particular meaning, but it's generally used as a blanket term for the features and performance of a microprocessor or a whole computer system. I think many of the people who use this phrase in the computer industry have no idea where it came from or what it means; I hope this blog post will help spread the word.