One question that always comes up when programming a part in a CAM system is just how accurate the speeds and feeds really have to be. It's a point that has been debated over and over.
Let’s start by defining the RPM formula from the Machinery’s Handbook.
N = 12V / (πD)
where N = spindle speed (RPM), V = cutting speed (SFM), and D = cutter diameter in inches.
So for an HSS end mill in cold-drawn 1212 carbon steel the suggested cutting speed is 160 SFM, and we will be using a 1" cutter.
RPM = (12 × 160)/(3.1416 × 1.0), or 611 RPM.
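As a quick sanity check, the handbook formula is easy to compute directly (a minimal sketch; the function name is mine):

```python
import math

def spindle_rpm(sfm, diameter_in):
    """N = 12V / (pi * D): convert cutting speed in SFM to spindle RPM."""
    return (12 * sfm) / (math.pi * diameter_in)

# HSS end mill in 1212 steel: 160 SFM, 1" diameter cutter
print(round(spindle_rpm(160, 1.0)))  # -> 611
```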
We have the RPM; how about the feed? The Machinery's Handbook 22nd edition gives us only two choices of depth of cut: .250" and .050". So much for the 1-5/8" LOC. At .250" depth of cut, .004" is the recommended chip load per tooth.
So for a 4 flute 1” HSS end mill in 1212 we should be running 611RPM, 9.8IPM, at .25"DOC. We are all set now right?
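The feedrate above follows from multiplying chip load per tooth by the number of flutes and the RPM (again a sketch; the function name is illustrative):

```python
def feed_ipm(rpm, flutes, chip_per_tooth):
    """Feed (IPM) = RPM x number of flutes x chip load per tooth (inches)."""
    return rpm * flutes * chip_per_tooth

# 4-flute cutter at 611 RPM, .004" chip per tooth
print(round(feed_ipm(611, 4, 0.004), 1))  # -> 9.8
```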
Let's take a close look at what happens when cutting an arc: a 1" end mill programmed to cut a 2" radius at 10 IPM. The programmed feedrate applies to the center of the tool, which moves at 10 IPM. When cutting on the outside of the radius, the cutting edge of the tool slows down to 8 IPM. When cutting on the inside of the radius, the cutting edge speeds up to about 13.3 IPM. That's more than a 5 IPM spread using the same tool and the same programmed feedrate.
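The slowdown and speedup come from the ratio of the arc radius at the cutting edge to the radius of the tool-center path. A minimal sketch of that geometry (function and parameter names are mine):

```python
def edge_feed_on_arc(programmed_ipm, arc_radius, tool_dia, inside):
    """Effective feed at the cutting edge when milling an arc.

    The programmed feed applies to the tool center. On an outside (convex)
    arc the center path radius is the part radius plus the tool radius, so
    the edge lags; on an inside (concave) arc it is the part radius minus
    the tool radius, so the edge leads.
    """
    tool_r = tool_dia / 2
    center_path_r = arc_radius - tool_r if inside else arc_radius + tool_r
    return programmed_ipm * arc_radius / center_path_r

# 1" end mill, 2" arc, programmed at 10 IPM
print(round(edge_feed_on_arc(10, 2.0, 1.0, inside=False), 1))  # -> 8.0
print(round(edge_feed_on_arc(10, 2.0, 1.0, inside=True), 1))   # -> 13.3
```

Many CAM systems and controls offer arc feedrate compensation that applies exactly this ratio automatically.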
What hasn't been accounted for? How are we holding the part? Is it a new, used, or resharpened cutter? How rigid is the machine spindle (30, 40, or 50 taper)? Is it a knee mill or a machining center? Is it a horizontal or vertical mill? What is the horsepower of the machine? What type of coolant is being used, and how much (flood, mist)?
Ok, let's face it: at the time of programming there are so many unaccounted-for variables that, unless we create some kind of complicated database, we can only program the speeds and feeds close to what we THINK they should be, or based on what the tool has run at in the past on similar cuts. After all, isn't that why there are speed and feed overrides on the machine?