Filmmaker Peter Jackson recently performed a very public experiment by releasing The Hobbit: An Unexpected Journey at 48 frames per second (also known as "HFR," for High Frame Rate). Some movie lovers, members of the press, and even other filmmakers have, perhaps reluctantly, deemed it a qualified failure; Vincent Laforet wrote a particularly insightful piece on the reasons why. In the wake of this, it is worth taking a moment to consider the reasons for and against using higher frame rates.
Back when moving pictures were first being developed, many experiments took place to determine the best frame rate to use. Many early films were shot at 16 fps, but this rate was found to be too slow to properly capture motion (filming a fistfight seems to have been one of the common litmus tests). Rates up to 48 fps were tried; as we all now know, 24 fps was decided upon as the best compromise between rendering satisfactory motion and not using up excessive amounts of film - with each frame displayed twice during projection, for an effective rate of 48 fps, to reduce the visible flicker caused by changing the image behind a closed shutter. More than a century later, 24 fps is still considered by many filmmakers to be a magical number.
Another lesser-known result of these tests, which I learned from fellow content creator Kevin Dole, was that Edison supposedly discovered 48 fps was a second magic number: Below this frame rate, the brain perceived the images as being dream-like (or at the very least, not real); at this rate and above, the brain perceived the motion to be a representation of reality. If true, one can understand that the lower frame rate of 24 fps could help provide a mental mood conducive to storytelling, while 48 fps and higher are better for news, sports, and other happening-now depictions of reality. This helps account for why soap operas, which used to be recorded on interlaced video, had a very different look than prime time television, which used to be recorded on film and transferred to video. You can also test this yourself with most video cameras: Record the same scene - cars driving past, people in motion, birds feeding, etc. - at 25, 29.97, or 30 fps progressive scan, versus the same scenes interlaced (interlacing effectively doubles the motion frame rate to 50 or 59.94 fps) or versus 50, 59.94, or 60 fps progressive.
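If you want to keep the numbers straight while running that test, the comparison boils down to effective motion samples per second. Here is a minimal sketch of that arithmetic (the function name is my own, and it assumes the simple doubling described above):

```python
def motion_samples_per_second(fps, interlaced=False):
    """Effective temporal samples per second for a given capture mode.

    Interlacing records two fields per frame, each captured at a
    different moment in time, which effectively doubles the rate at
    which motion is sampled.
    """
    return fps * 2 if interlaced else fps

# 29.97 fps interlaced samples motion 59.94 times a second - the
# classic "video look" - while 29.97p samples it only 29.97 times,
# closer to film's dream-like feel.
print(motion_samples_per_second(29.97, interlaced=True))  # 59.94
print(motion_samples_per_second(29.97))                   # 29.97
```

In other words, 25i, 29.97i, and 30i land on the "reality" side of Edison's supposed 48 fps threshold, while the same rates shot progressive stay below it.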
The problem with lower frame rates is that they are prone to distracting strobing, especially when there is fast movement in the scene. This is a headache for filmmakers, motion graphics designers, and 3D animators alike. Copious amounts of motion blur can help cover the strobing; the film standard is 180 degrees of shutter (a shutter open for half the frame's duration, producing a motion blur trail that covers half of each frame). Even then, you really have to watch out for fast motion. The inertia of a heavy film camera helps moderate the temptation to move the camera too quickly; smaller, lighter cameras (a tip learned from Scott Billups: try attaching them to weights) or the virtual environment of the computer make it all too easy to move far too fast. A lot of the discussion of HFR seems to center on eliminating motion blur, but often that blur is necessary glue that holds movement together.
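The shutter-angle rule of thumb above reduces to simple arithmetic: the exposure (and blur trail) per frame is the shutter angle's fraction of 360 degrees, divided by the frame rate. A quick sketch, with a function name of my own choosing:

```python
def shutter_speed(fps, shutter_angle=180.0):
    """Exposure time per frame, in seconds, for a given shutter angle.

    A 360-degree shutter exposes for the full frame duration (1/fps);
    the film-standard 180 degrees exposes for half of it.
    """
    return (shutter_angle / 360.0) / fps

# Film's classic 24 fps at a 180-degree shutter: a 1/48 s exposure,
# so blur trails cover half of each frame's duration.
print(shutter_speed(24))   # 0.0208... (1/48 s)

# Doubling to 48 fps with the same 180-degree shutter halves the
# exposure - and the blur trail - to 1/96 s.
print(shutter_speed(48))   # 0.0104... (1/96 s)
```

This is the tradeoff Jackson faced: keeping the standard 180-degree shutter at 48 fps automatically shortens every blur trail, stripping away some of that glue.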
Avoiding strobing seems to be at least part of the reason why Peter Jackson decided to create The Hobbit at 48 fps - and you can see why this would be an attractive solution to someone making any action film. Two problems result: you're bumping up against Edison's threshold-of-reality frame rate, and you have to shorten the motion blur trails, as each frame records a shorter slice of time. Some moviegoers (and more than a few professional critics) seem to feel the tradeoff wasn't worth it; maybe it's just the shock of the new, and our brains will learn to adjust. After all, many of us are already watching high frame rate video on HD TVs with high refresh rates and motion interpolation, and I've heard only a relatively muted outcry about those. It may also be a matter of learning to adapt our storytelling to the medium. You could consider using frame rate itself as a storytelling device: high frame rates for scenes set in the present, and low frame rates for flashbacks or historical material. This can be done today in video by intercutting interlaced and progressive scan material in the same program, for example.
But if you have to choose one frame rate - today - which do you choose? When faced with life's conundrums, I occasionally fall back on a nugget of wisdom from Don Juan. Carlos Castaneda posed to him the following question: If given the choice between being gored by the left horn of the bull or the right, which do you choose? Don Juan replied: Neither - go between the horns of the bull. For me, that "between the horns of the bull" frame rate is 29.97 or 30 fps progressive. It's 25% faster than film's traditional 24 fps, helping with the strobing problem and providing smoother motion in general, but it is far enough below the 48 fps threshold that motion can still have that dream-like aura - and motion blur trails are still long enough to glue everything together. It's not yet a common rate for film projection, but it is already a standard for video and broadcast. I encourage you to give it a try and see if it's an acceptable compromise for your work - whether the story you are telling is of mythic proportions, or merely how a new and improved cereal is better than all the others.