The first number is the average framerate over the last second. The second number is the minimum framerate over the last second. Both numbers refresh every second.
The first number will always be higher than or equal to the second, and neither is wholly representative of your framerate. The average gets dragged up or down by outliers: it can read 60 FPS while the game actually spends 25% of its time at 195 FPS and 75% of its time at 15 FPS, which is not a smooth experience by any means. The second number would show 15 FPS in that scenario.
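To make the arithmetic concrete, here's that scenario worked out in a few lines of Python (the 25%/75% split and the two framerates are the hypothetical numbers from above, not measurements):

```python
# Frames rendered in one second, split between the fast and slow portions:
fast_frames = 0.25 * 195   # a quarter of the second at 195 FPS
slow_frames = 0.75 * 15    # three quarters of the second at 15 FPS

avg_fps = fast_frames + slow_frames
print(avg_fps)  # 60.0 -- the counter reads a "smooth" 60 FPS

min_fps = 15    # but the worst sustained rate inside that second is 15 FPS
```

So a plain 60 on the counter can hide a game that feels like 15 FPS most of the time, which is exactly why the second number matters.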
You'll want to enable the lagometer under the Other settings. It shows the frametime of every frame over the last few seconds. Read the setting's tooltip to see what each color means: the taller the bar, the longer that frame took to render, and each bar is color-coded by what the frame spent its time on.
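A bar's height is just that frame's duration, and frame time is the reciprocal of framerate, so tall bars and low FPS are the same thing seen two ways. A quick sketch of that relationship (not Optifine's actual code):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds a single frame takes at a given steady framerate."""
    return 1000.0 / fps

print(frame_time_ms(60))   # ~16.7 ms per frame
print(frame_time_ms(15))   # ~66.7 ms per frame -- a bar four times taller
```

A single 66 ms frame in a run of 16 ms frames is one visible stutter; a lagometer full of them is the 15 FPS case from the scenario above.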
You can also enable the debug profiler on the same page, and it'll show you a pie chart of what is taking up the most execution time.
You never told us your system specifications or what settings you're using with Optifine, so nobody can really tell you what's wrong. Is the JVM constantly running garbage collection because it has very little RAM? Is chunk loading slow because you're running an old 5400 RPM hard drive? Is chunk generation slow because the game is running single-threaded, or because your CPU runs at less than 2 GHz?
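If a too-small heap forcing constant garbage collection is the suspect, the heap is sized with standard HotSpot flags in the launcher's JVM-arguments box. The 2G/1G values below are only examples; size them to what your machine can spare:

```shell
# Standard HotSpot flags (paste into the Minecraft launcher's JVM arguments):
# -Xmx sets the maximum heap size, -Xms the initial heap size.
-Xmx2G -Xms1G
```

Don't give the JVM all of your RAM either; the OS and the game's off-heap memory need room too.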
Take a look around the Performance page in the settings and try out Fast Render and Fast Math, along with Lazy Chunk Loading and the Smooth World setting. You can also drop Chunk Updates down to 1. That limits you to one chunk update per frame, but it spreads chunk loading out so each individual frame takes a smaller hit.