In 1.2.5 I was getting about 30-40 FPS on average. It wasn't bad and I wasn't complaining, mostly since I knew my GPU wasn't being maxed out; it just didn't seem to consider Java a game. Since I've started using the 1.3 snapshots (started at 12w26a, upgraded to the most recent yesterday) my framerates have jumped up to 130-150 with Advanced OpenGL off. Now, my understanding of the functional bits of computing is pretty limited, but how, if the requirements are going up, does my framerate more than triple?
Honestly, I know about as much about that kind of stuff as you apparently do, but it depends. If you didn't use Adv. OpenGL on the snapshots or on the official build, I don't know. If you had Adv. OpenGL on in the official build and off in the snapshots, it's not all that strange. Also, they might have been smoothing out performance between 1.2.5 and 1.3, which would also explain the increase.
I used to keep Advanced OpenGL on; I didn't really know what it did. Even with it on in the snapshots I still get about 80 FPS.
I'm obviously not mad about this, I just don't get how (unless they did optimization for various GPUs, or some programming wizardry centered around the game being recognized as needing my GPU's power) they could raise framerates while making the game more demanding.
Contrary to what has been spreading around the community, 1.3 is not the end of the world.
Running a server in the background alongside the client is more resource intensive, but that model actually works better on most modern systems. Most modern PCs have two or more CPU cores, and Minecraft performance is mainly tied to your CPU.
In 1.2.5 the game would heavily use only one of those CPU cores. In 1.3 the client and the background server can each heavily use a core, so the game more efficiently uses typical hardware.
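If it helps to picture that, here's a minimal sketch of the threading model in Java. The class and method names are made up for illustration; this is not Mojang's actual code, just the general shape of a client thread and a server thread running side by side:

```java
// Hypothetical sketch of the 1.3-style threading model -- not Mojang's code.
// The client render loop and the integrated server tick loop each get their
// own thread, so a multi-core CPU can service both at once.
public class IntegratedServerDemo {
    public static void main(String[] args) {
        Thread serverThread = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                tickWorld();      // simulate one game tick
                sleepQuietly(50); // ~20 ticks per second
            }
        }, "Integrated-Server");
        serverThread.setDaemon(true);
        serverThread.start();

        // Meanwhile the main (client) thread renders as fast as the GPU
        // allows, instead of sharing one loop with the world simulation.
        while (true) {
            renderFrame();
        }
    }

    private static void tickWorld()   { /* entity updates, block ticks, ... */ }
    private static void renderFrame() { /* draw the current world state */ }

    private static void sleepQuietly(long ms) {
        try { Thread.sleep(ms); }
        catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}
```

On a dual-core machine the OS can schedule each thread onto its own core, which is exactly why typical hardware sees the kind of framerate jump the OP describes.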
The performance issue comes from users on marginal hardware: single-core systems. In the 1.3 snapshots there are two heavy CPU tasks trying to share a single CPU's time. Getting optimal performance in that scenario takes careful resource management, so that each CPU hog (the client and the local server) leaves some CPU cycles for the other to use.
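Here's a rough sketch of what that careful sharing could look like, again with made-up names rather than Minecraft's real internals. The server tick measures how long its work took and sleeps off the rest of its 50 ms budget, which hands the core back to the client thread on single-core machines:

```java
// Hypothetical single-core-friendly tick loop -- illustrative only.
public class FairTickLoop implements Runnable {
    private static final long TICK_MS = 50; // ~20 ticks per second

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            long start = System.nanoTime();
            tickWorld();
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            long spare = TICK_MS - elapsedMs;
            if (spare > 0) {
                try {
                    Thread.sleep(spare); // give the client thread the core
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
            // If a tick overruns its budget there is nothing left to yield,
            // which is when single-core machines start to stutter.
        }
    }

    private void tickWorld() { /* world simulation work */ }
}
```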
Since this is a retrofit to an existing code base there are some growing pains compared to, say, FPS games, which use the same local-server-plus-multipurpose-client model. FPS games have used that model for decades, but they design the client and the server around it from day one.
Minecraft did it backward: it started as a monolithic client (the client does everything), then the project was forked into a multiplayer client plus a dedicated server, while the monolithic client was kept for singleplayer. Since the fork they have maintained the two versions in parallel, and making changes to both for every release requires at least double the effort for every change.
To implement a mod API on top of the two separate server models (the MP dedicated server and the one integrated into the original singleplayer client) would require two massive coding efforts. The most logical path is to collapse the two clients into one and move forward under a single, consistent client+server model. That has been the focus of the 1.3 release: merging the clients, which has meant taking the multiplayer client and improving its performance against a local server to get as close to the feel of the old singleplayer client as possible. As of the latest snapshot they are pretty close to that goal.
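To make the merged model concrete, here's a toy sketch of what "one consistent client+server model" means in practice. The classes and the connect/runLoop names are invented for this example; only the default port 25565 is real. Singleplayer becomes multiplayer pointed at a server in the same process:

```java
// Hypothetical sketch of the post-merge architecture -- invented names.
public class UnifiedClient {
    public static void main(String[] args) {
        // The same server code runs whether it's local or dedicated.
        GameServer server = new GameServer();
        Thread serverThread = new Thread(server::runLoop, "Local-Server");
        serverThread.setDaemon(true);
        serverThread.start();

        // One client code path for everything: singleplayer connects to
        // localhost, a public server is just a remote address.
        GameClient client = new GameClient();
        client.connect("localhost", 25565);
        client.runLoop();
    }
}

class GameServer {
    void runLoop() { /* accept connections, tick the world */ }
}

class GameClient {
    void connect(String host, int port) { /* open the connection */ }
    void runLoop() { /* render, send input, apply world updates */ }
}
```

The payoff is that a mod API only has to target one server model instead of two.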
Read the changelogs on the wiki... with almost every snapshot there was an "improved performance" bullet on the list... obviously there is (or was) lots of headroom for optimization... after all, OptiFine doesn't depend on black magic or voodoo either.