-snip- I found a site that would download a zip for a resource pack called OptiFine. -snip-
I'm going to say that is DEFINITELY bogus. There is no texture pack capable of fixing lag, period, and any download for OptiFine that is not on this forum or on the optifine.net downloads page is most likely a fraud, with the exception of some other sites that need not be listed.
Actually, lower-resolution texture packs will obviously give better framerates than higher-res ones.
Rollback Post to RevisionRollBack
PC specs: Intel Core 2 Duo e8500 @ 3.16ghz / XFX Radeon HD 6670 2GB DDR3 / 6GB 1666MHz DDR3 RAM / Windows 7 Professional
When using OptiFine, in some areas the sand changes from the default coloration to a kind of reddish/orange. Is it possible to disable this coloration? It's really bothering me, since I use OptiFine to help my FPS, and without it the sand looks normal.
Blocks that are colored differently depending on their position (biome) are a resource pack feature. You can disable it with "Custom Colors = OFF".
Is 1.8 going to have Advanced OpenGL? I know it's alpha, but that didn't help my framerate at all, and I know AGL did.
I'm just wondering. The biggest issue is that 1.8 has broken both PCs and made them unplayable. Max framerate is 20, and it normally averages around 8.
With the 1.8 graphics engine, Advanced OpenGL will be very hard to implement. Instead, 1.8 has chunk visibility detection, which seems to work well.
The lag and FPS drops in 1.8 have a different source, and AOGL is not going to help.
Up to 32x there is little measurable difference in performance.
However, the texture packs (even 16x) may use some advanced features (animations, connected textures, etc.) that can decrease the FPS.
Minecraft 1.8 has so many performance problems that I just don't know where to start.
Maybe the biggest and ugliest problem is the memory allocation. Currently the game allocates (and throws away immediately) 50 MB/sec when standing still and up to 200 MB/sec when moving. That is just crazy.
What happens when the game allocates 200 MB of memory every second and discards it immediately?
1. With a default memory limit of 1 GB (1000 MB) and working memory of about 200 MB, Java has to make a full garbage collection every 4 seconds, otherwise it would run out of memory. When running at 60 FPS, one frame takes about 16 ms. In order not to be noticeable, the garbage collection should run in 10-15 ms maximum. In this minimal time it has to decide which of the several hundred thousand newly generated objects are garbage and can be discarded, and which are not. This is a huge amount of work, and it needs a very powerful CPU in order to finish in 10 ms.
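To make the numbers concrete, here is a small Java sketch (not Minecraft code; the class name and sizes are made up for illustration) that allocates roughly 200 MB of immediately discarded arrays, the way one second of 1.8 gameplay does, and then asks the JVM how many collections that triggered:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcChurnDemo {
    // Allocate ~200 MB of short-lived byte arrays: 2000 x 100 KB,
    // each discarded immediately, like one second of 1.8's churn.
    static long churn() {
        long sink = 0;
        for (int i = 0; i < 2000; i++) {
            byte[] garbage = new byte[100 * 1024]; // 100 KB, instantly unreachable
            sink += garbage.length;
        }
        return sink; // total bytes allocated
    }

    public static void main(String[] args) {
        long before = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            before += gc.getCollectionCount();
        }
        long bytes = churn();
        long after = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            after += gc.getCollectionCount();
        }
        System.out.println("Allocated " + bytes + " bytes, GCs observed: " + (after - before));
    }
}
```

On a small heap the collection count climbs visibly during the loop; exactly how many GCs fire depends on heap size and collector.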
2. Why not give it more memory?
Let's give Minecraft 4 GB of RAM to play with. This would need a PC with at least 8 GB of RAM (as the real memory usage is almost double the memory visible in Java). If the VM decides to use all the memory, then it will increase the time between garbage collections (20 sec instead of 4), but it will also increase the garbage collection time by a factor of 4, so every 20 seconds there will be one massive lag spike.
3. Why not use incremental garbage collection?
The latest version of the launcher enables incremental garbage collection by default (-XX:+CMSIncrementalMode), which in theory should replace the one big GC with many shorter incremental GCs. However, the problem is that the times at which the smaller GCs happen and their durations are mostly random. Also, they are not much shorter (maybe 50%) than a full-scale GC. That means the FPS starts to fluctuate up and down and there are a lot of random lag spikes. Stable FPS with a lag spike from time to time is replaced with unstable FPS and microstutter (or not so micro, depending on the CPU). This strategy can only work with a powerful enough CPU, so that the random lag spikes become small enough not to be noticeable.
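For reference, this behavior can be observed directly by enabling GC logging alongside the incremental-mode flag. A sketch of the relevant HotSpot (Java 7/8 era) options; the jar name is a placeholder and the launcher's real command line differs between versions:

```shell
# Illustrative only: run the game with CMS incremental mode and GC logging.
# "Minecraft.jar" is a placeholder; the launcher builds its own command line.
java -Xmx1G \
     -XX:+UseConcMarkSweepGC \
     -XX:+CMSIncrementalMode \
     -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
     -jar Minecraft.jar
```

With these flags every collection prints a line with its pause time, which makes the random timing and duration of the incremental GCs easy to see.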
4. How did that work in previous releases?
The previous Minecraft releases were much less memory hungry. The original Notch code (pre-1.3) was allocating about 10-20 MB/sec, which was much easier to control and optimize. The rendering itself needed only 1-2 MB/sec and was designed to minimize memory waste (reusing buffers, etc.). The 200 MB/sec is pushing the limits and forcing the garbage collector to do a lot of work, which takes time. If it were possible to control how and when the GC works, then maybe it would be possible to distribute the GC pauses such that they are not noticeable or less disturbing. However, there is no such control in the current Java VM.
5. Why is 1.8 allocating so much memory?
This is the best part - over 90% of the memory allocation is not needed at all. Most of the memory is probably allocated to make the life of the developers easier.
- There are huge amounts of objects which are allocated and discarded milliseconds later.
- All internal methods which used parameters (x, y, z) are now converted to one parameter (BlockPos), which is immutable. So if you need to check another position around the current one, you have to allocate a new BlockPos or invent some object cache which will probably be slower. This alone is a huge memory waste.
- The chunk loading is allocating a lot of memory just to pass vertex data around. The excuse is probably "multithreading", however this is not necessary at all (see the last OptiFine for 1.7).
- the list goes on and on ...
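The BlockPos point above can be sketched in a few lines. This uses a made-up Pos class, not the real Minecraft BlockPos, but it shows the same pattern: an immutable position type forces a fresh allocation for every neighbor lookup.

```java
// Hypothetical simplified class for illustration; NOT the actual
// Minecraft BlockPos implementation.
final class Pos {
    final int x, y, z;
    Pos(int x, int y, int z) { this.x = x; this.y = y; this.z = z; }
    // Immutable: every neighbor lookup must allocate a brand-new object.
    Pos offset(int dx, int dy, int dz) {
        return new Pos(x + dx, y + dy, z + dz);
    }
}

public class PosDemo {
    static final int[][] DIRS = {
        {1, 0, 0}, {-1, 0, 0}, {0, 1, 0}, {0, -1, 0}, {0, 0, 1}, {0, 0, -1}
    };

    // Visiting the six neighbors of n positions costs 6*n allocations,
    // each discarded milliseconds later; with plain (x, y, z) int
    // parameters the same loop would allocate nothing.
    static long neighborAllocations(int n) {
        long allocations = 0;
        Pos p = new Pos(0, 64, 0);
        for (int i = 0; i < n; i++) {
            for (int[] d : DIRS) {
                Pos neighbor = p.offset(d[0], d[1], d[2]); // garbage immediately
                allocations++;
            }
        }
        return allocations;
    }

    public static void main(String[] args) {
        System.out.println(neighborAllocations(100_000) + " throwaway objects");
    }
}
```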
The general trend is that the developers do not care that much about memory allocation and use "best industry practices" without understanding the consequences. The standard reasoning is "immutables are good", "allocating new memory is faster than caching", "the garbage collector is so good these days" and so on.
Allocating new memory is really faster than caching (Java is even faster than C++ when it comes to dynamic memory), but getting rid of the allocated memory is not faster and it is not predictable at all. Minecraft is a "real-time" application and to get a stable framerate it needs either minimal runtime memory allocation (pre 1.3) or controllable garbage collecting, which is just not possible with the current Java VM.
6. What can be done to fix it?
If there are 2 or 3 places which are wasting memory (bugs), then OptiFine can fix them individually. Otherwise a bigger refactoring of the Minecraft internals will be needed, which is a huge task and not possible for OptiFine.
7. Example
A sample log of GC activity with effective FPS for the GC lag spikes is available here.
- the average rendering FPS is about 50 FPS
- the GC lag spikes have effective FPS of 7-20
- there are 1-2 lag spikes per second caused by GC activity
tl;dr: When 1.8 is lagging and stuttering, the garbage collector is working like crazy, doing work which has nothing to do with the game itself (rendering, running the internal server, loading chunks, etc.). Instead it is constantly cleaning up the mess behind code which thinks that memory allocation is "cheap".
The important question is how many of the allocated objects survive long enough to get into OldGen. Cleaning up the survivor spaces is indeed very cheap, and if the objects only live for a couple of milliseconds then I would assume that most don't get into OldGen.
How many full GCs do you get according to your GC logs?
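One quick way to answer that question, assuming a log produced with -verbose:gc (the exact line format varies by JVM version; the two sample lines below are fabricated stand-ins for real JVM output):

```shell
# "gc.log" is a placeholder; here we fabricate two sample lines in the
# rough -verbose:gc format, since real logs come from the JVM itself.
printf '%s\n' \
  '[GC 100K->50K(1000K), 0.0010 secs]' \
  '[Full GC 900K->100K(1000K), 0.2000 secs]' > gc.log

# Each "Full GC" line is one stop-the-world full collection.
grep -c "Full GC" gc.log   # prints 1
```

Young collections appear as plain "[GC ...]" lines; only the "Full GC" lines indicate the expensive whole-heap pauses discussed above.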
"Cheap" is relative when talking about 200 MB allocated per second and response time in the order of milliseconds. What is cheap for a server application that runs on a powerful server CPU and can easily afford 100 ms GC pause is not cheap for a "real-time" application running on a laptop CPU where 100 ms means a drop to 10 FPS.
I have not checked the full GC events, but I see a strong correlation between GC events and single lag spikes. About 70% of the single lag spikes coincide with a drop in used memory (GC at work).
The days of good coding are long behind us. I was reading an article the other day about how they managed to squeeze amazing graphics out of the tiniest memory and CPU envelopes of old consoles: http://arstechnica.com/gaming/2014/08/same-box-better-graphics-improving-performance-within-console-generations/ It's a really interesting read. Solaris on the Atari 2600 was particularly impressive: 128 bytes of RAM, and it manages pseudo-3D graphics and gradient textures... crazy.
I've noticed Minecraft getting slower and slower over the past 4 years. It all started with the internal server change in 1.3, which halved FPS for everyone, and it definitely hasn't improved since then. 1.8 has been awful. I know Java was a manageable way of coding it back when the game was simpler, but now it's like trying to fit a pig in a wedding dress. Oh well, we're stuck with it, so I hope someone can figure it out.
Thank you for taking the time to explain this to us!! I feel a little more informed now.
"The general trend is that the developers do not care that much about memory allocation and use "best industry practices" without understanding the consequences."
Does this mean, in your opinion, that when Microsoft officially takes over work on the game, they will also encourage/force their coders to use the 'best industry practices', since it is a huge corporation?
Seeing sp614x post that made me realize how much hard work goes into it. I knew a lot of work goes into it, but seeing him post that detailed info made me realize that he's working his butt off. I always appreciated OptiFine, but I appreciate it much more now. My PC has been decent without OptiFine, but overall it does stabilize mostly everything. I do have to say, though, that 1.8 has run the best without OptiFine since probably version 1.2.5; 1.3 and up is when I started noticing a performance decrease when not using OptiFine. Using the alpha versions actually makes it run a little better than vanilla 1.8. I cannot wait for it to be fully done.
ONE PROBLEM WITH 1.8 SINCE PRE-VERSIONS I THINK (STILL IN OPTIFINE!!!)
When I press F5 to see my back and then again to see my front, a lot of the map around my character doesn't display until I move the mouse. This is a serious issue, for me at least; I can't imagine I'm the only one, though. I can see it being an issue for probably everyone. My machine is up to date and I keep it very clean. I noticed it on the latest Java 7 version, but now I'm on Java 8 update 40, a preview version of Java that Minecraft seems to run very well on. I can verify it's not Java 8. Anybody have any recommendations? I will say I haven't tried the 1.8.1 versions yet, though.
My goodness, it seems that it must be very hard for all these developers! I was reading about the Sponge project, and it seems the developers there are also complaining about spaghetti code! I wish I could help, but I'm not very experienced with Java. I've taken two classes and that's about it.
thank you
Actually lower resolution texture packs will obviously give better framerates than higher res ones
Lower than 16x16? God help us...
Compared to my older video, there is quite an improvement.
A wonderful mod! I am looking forward to the official version!
(Japanese to English; I am using Google to translate.)
Intel Core i7 4970k
8GB Ram
GTX760 GDDR2GB
Z97 chipset motherboard