The BBC has published a research white paper about high frame-rate television. According to their experiments, a higher frame rate becomes necessary to avoid juttering or smearing of moving objects as the resolution and size of the TV screen increase.
As an old demo coder, I tend to frown upon the jerky motion of sub-50 frames per second graphics. Many hard-core gamers will tell you they won't play a game running at only 30 fps. The BBC's research shows that even at 100 fps, there is room for improvement. Still, motion pictures are trudging along at their old 24 fps, a standard set when sound film was introduced in the 1920s. As the BBC's research shows, the perceived image quality can be increased tremendously by increasing the frame rate.
There have been suggestions to raise the frame rate of motion pictures, but since video has always had higher frame rates than film, those suggestions are typically met by a chorus of filmmakers saying, "It looks like video!"
So what? It's not like video is a dirty word.
A new generation of filmmakers is growing up that has never shot on film and never will. In a few years, digital acquisition and projection will have more or less killed canned celluloid. Why should filmmakers stick to the limitations of film? Sure, video gives a different look than film, and 100 fps gives a different look than 24 fps. But assuming that different means worse is conservative in a very boring way.