There's nothing wrong with 4 GB of RAM when the game doesn't even need 1/8 of that, and if mods do need more it is usually because they are badly coded. My computer only has 3.2 GB of usable RAM (32-bit OS with 4 GB installed) and there is still plenty of free memory when playing, whether with 512 MB or 1 GB allocated. The allocation doesn't actually affect the memory used by Java unless the JVM actually reserves all of it; even if the game doesn't need all of 512 MB, it may exceed that with 1 GB allocated simply because the garbage collector is a bit lazy, and that can cause issues with the 2 GB limit for a 32-bit process ("out of memory" despite F3 showing plenty free within the game).
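If you want to see the allocated-versus-used distinction yourself, here is a quick sketch using the standard `Runtime` API (the same numbers the F3 screen's memory readout is based on):

```java
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // the -Xmx ceiling: what you "allocated"
        long total = rt.totalMemory();  // what the JVM has actually reserved so far
        long free = rt.freeMemory();    // unused portion of the reserved heap
        long used = total - free;       // what the game is really using right now
        System.out.printf("used %d MB / reserved %d MB / max %d MB%n",
                used >> 20, total >> 20, max >> 20);
    }
}
```

"Used" can sit far below "max" for the whole session; raising -Xmx only raises the ceiling, it doesn't make the game use more.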
A test somebody else did with a newer version showed the same thing; despite the heavy memory churn since 1.8, actual memory usage has not increased much.
From my testing, memory usage does affect performance slightly, but the change is negligible considering I got a framerate above 60 on all occasions, and I usually play capped at 60 FPS. The game functions perfectly fine with 512 MB allocated, exactly the same as with 4 GB allocated. There's no difference because the game isn't actually using any more RAM; allocating more would only help if the world had more going on in it, such as a bank of furnaces smelting a heap of ore or something similar.
(the part about furnaces is nonsense because they do not use much more RAM when smelting something; even a null object reference uses 4-8 bytes and an ItemStack itself only uses a few dozen bytes per instance)
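A rough back-of-envelope supports that point. Assuming typical HotSpot layout numbers (a 12-16 byte object header, 4 bytes per int field or compressed reference; these are ballpark assumptions, not measurements), even an absurd number of busy furnaces amounts to very little:

```java
public class StackFootprint {
    public static void main(String[] args) {
        // Assumed HotSpot figures: 16-byte header, 4 bytes per int field.
        long header = 16;
        long fields = 3 * 4;             // e.g. item id, count, damage
        long perStack = header + fields; // ~28 bytes...
        long perStackAligned = (perStack + 7) / 8 * 8; // ...rounded to 32 by alignment
        // A furnace holds at most 3 stacks; assume 10,000 furnaces all smelting:
        long total = 10_000L * 3 * perStackAligned;
        System.out.println(total / 1024 + " KB"); // under 1 MB, nowhere near gigabytes
    }
}
```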
https://www.reddit.com/r/feedthebeast/comments/5x0twz/investigating_extreme_worldgen_lag/ (this also causes the game to use far more memory than needed, due to all the extra chunks being generated; aside from that, I simply cannot believe that even 1000 new blocks or items need gigabytes of memory, since vanilla itself has not changed much in memory usage over the years despite the equivalent of a huge modpack being added to the game)
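The cascading worldgen effect described in that link can be sketched with a toy model (a hypothetical, simplified 1-D simulation, not actual Minecraft code): if a mod's decorator carelessly reaches into a neighboring chunk, generating one chunk forces a whole chain of neighbors to generate too.

```java
import java.util.HashSet;
import java.util.Set;

public class CascadeDemo {
    static Set<Integer> generated = new HashSet<>();
    static final int WORLD_EDGE = 50; // pretend the world ends here

    // A "bad mod" whose decoration spills over the chunk border, forcing the
    // next chunk to generate, whose decoration does the same, and so on.
    static void generate(int chunk) {
        if (!generated.add(chunk) || chunk >= WORLD_EDGE) return;
        generate(chunk + 1);
    }

    public static void main(String[] args) {
        generate(0); // the player only needed one chunk...
        System.out.println("chunks generated: " + generated.size()); // ...but got 51
    }
}
```

Every one of those extra chunks occupies memory and took time to generate, which is exactly the lag spike pattern the Reddit investigation found.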
The GPU is a bigger factor when concerning mods, particularly those that are heavy on rendering, not necessarily including shaders (even some vanilla blocks with complex models can cause a lot of lag when many are rendered at once; some mods also make heavy use of tile entities for various reasons, such as getting around blockstate limitations, and this can also significantly increase CPU usage).
"Intel HD Graphics" can refer to a wide range of GPUs with a wide performance spectrum. Without knowing which HD Graphics you're asking about, and the resolution it needs to render at, we can only guess at performance. Typically, the better the CPU, the better the HD Graphics that comes with it; also, the lower-end the laptop, the lower the display resolution tends to be. Lastly, newer ones tend to be faster as well (though this is not absolute; a newer Celeron may still have slower graphics than a slightly older Core i5, for example). With a Celeron, you're probably not looking at good graphics regardless, and the CPU itself is also pretty much one of Intel's lowest offerings. If you keep the render distance and graphics settings low, it should be modestly playable, but I can't say what happens once you start adding mods to the game.
Using 32-bit is partly why your RAM use may be lower, and it is not directly comparable to 64-bit OS/Java setups (which is probably what most typical players are dealing with these days), as is what I'm guessing to be a lower render distance. Render distance seems to be one of the bigger contributors to RAM use. I'm on the opposite end of the spectrum: I can't even start the game with 1 GB of RAM allocated because it literally freezes as soon as I open the world, with all the RAM in use, and even with twice that it loads fine but I'm near the allocation cap right from the start. I had to set it to 4 GB to be completely problem-free.
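The render distance effect is roughly quadratic, since the loaded area is a square of chunks around the player. A quick calculation (the per-chunk cost below is a made-up ballpark figure purely for illustration; real numbers vary wildly by version and terrain):

```java
public class ChunkMath {
    public static void main(String[] args) {
        long perChunkKB = 50; // assumed average, for illustration only
        for (int r : new int[] {8, 16, 32}) {
            // Loaded area is a (2r+1) x (2r+1) square of chunks.
            long chunks = (2L * r + 1) * (2L * r + 1);
            System.out.printf("render distance %2d -> %5d chunks, ~%d MB%n",
                    r, chunks, chunks * perChunkKB / 1024);
        }
    }
}
```

Doubling the render distance roughly quadruples the number of loaded chunks, which is why it dominates RAM use.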
There's also the strange matter that the RAM use the game reports doesn't necessarily seem to be the whole of it, at least not for me. My PC has 16 GB of RAM, and with 4 GB allocated (and the game using less), I once ran into actually running out of RAM somehow, with Minecraft supposedly taking up close to 10 GB despite the in-game number reporting less than 4 GB. I don't know if that's something peculiar to my setup, but I still believe that in practice you want much more RAM on hand than what the game says it can run with.
This is why it is generally said to allocate no more than half the RAM your computer has: the JVM and native code, including OpenGL, have a large overhead. It is also why you can run into odd out-of-memory errors on 32-bit systems, where the process memory (2 GB max) is exhausted despite F3 showing plenty of free heap (I had this happen a few times with 1 GB allocated; instead of crashing, the game exited the world and displayed an in-game screen saying it ran out of memory).
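The 32-bit failure mode is just arithmetic: the Java heap is only one slice of the 2 GB process budget. The overhead figures below are assumptions for illustration, not measured values:

```java
public class ProcessBudget {
    public static void main(String[] args) {
        long limitMB = 2048;  // 32-bit user address space (typical Windows default)
        long heapMB = 1024;   // -Xmx1G
        // Assumed ballpark figures for everything outside the Java heap:
        long jvmOverheadMB = 300; // permgen/metaspace, code cache, thread stacks
        long nativeMB = 600;      // OpenGL driver, textures, direct buffers
        long headroomMB = limitMB - heapMB - jvmOverheadMB - nativeMB;
        // Only ~124 MB left: one allocation spike and the process dies with
        // "out of memory" even though F3 shows plenty of free heap.
        System.out.println("headroom: " + headroomMB + " MB");
    }
}
```

With these assumed numbers, halving the heap to 512 MB roughly quintuples the headroom, which matches the "allocate less on 32-bit" advice.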
In any case, I just don't see how mods need so much memory unless they are really that terribly written (and you thought that Notch was a terrible programmer). As mentioned before, world generation mods are often a big culprit since they cause cascading chunk generation, which loads many more chunks than necessary, and chunks are the single biggest use of memory in the game (hence why it is so sensitive to render distance). Even tile entities don't seem to be enough to explain it; I once created a Superflat world with 256 layers of furnaces and memory usage was around 900 MB with a view distance of 10 (in 1.6.4 render distance matters much less since the server always loads the same number of chunks; even with Optifine it has no effect until you set it above 16 chunks, due to a bug within Optifine itself). I did have to increase memory to 768 MB to avoid "heartbeat" lag spikes after playing for a while with my "triple height terrain" mod, but that was pretty self-explanatory, as there were around 3 times more blocks loaded (this also explains part of the higher memory usage with a max-height Superflat world, as well as Amplified); yet even then it did not use much more memory than vanilla (back then I also used Forge and several mods, not my own, on my worlds, but they did not make a big difference; one of them added blocks with IDs above 255, so some chunk sections required an extra 2 KB of data to store them).
This was also before the various optimizations I've made to my later mods; for example, vanilla allocates a 32 KB array (4 times that in 1.8+) to store data during the first stages of terrain generation every time a new chunk is generated, while I reuse a permanently allocated 64 KB array, reducing GC load despite doubling the size (128-256 high terrain); during cave generation vanilla creates a ton of Random objects while I reuse the same one over and over (cave generation still takes about twice as long as vanilla, largely due to all the spawn checks I added, so these changes do not necessarily completely offset the additions); I also disabled structure saving for mineshafts, though the game still stores data for each mineshaft in chunks loaded during the current session.
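The two reuse optimizations described above reduce garbage without changing output. A minimal sketch (a hypothetical toy generator, not the actual mod code) showing both styles side by side:

```java
import java.util.Arrays;
import java.util.Random;

public class ReuseDemo {
    // Permanently allocated and reused for every chunk, trading one 64 KB
    // block that lives forever for far less per-chunk GC churn.
    private static final byte[] BUFFER = new byte[65536];
    private static final Random RAND = new Random();

    // Allocation-heavy style: fresh array + fresh Random for every chunk.
    static byte[] generateNaive(long seed) {
        byte[] data = new byte[65536];
        Random rand = new Random(seed);
        for (int i = 0; i < data.length; i++) data[i] = (byte) rand.nextInt(4);
        return data;
    }

    // Reuse style: identical output, zero per-chunk allocations.
    static byte[] generateReused(long seed) {
        RAND.setSeed(seed); // reseeding replaces "new Random(seed)"
        for (int i = 0; i < BUFFER.length; i++) BUFFER[i] = (byte) RAND.nextInt(4);
        return BUFFER;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.equals(generateNaive(123L), generateReused(123L)));
    }
}
```

The reused buffer works because chunk generation is single-threaded here; the same trick is unsafe if chunks can generate concurrently.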
Recent versions of vanilla/modded Minecraft seem to use much more complex graphics by default, which just adds to the extreme load the renderer is put under. While the OP will definitely have to tweak things a bit more, most of the rest of us who played fine on 1.7.10-1.9 but now have major issues with 1.10+ can pretty much wipe out 90% of the lag by turning off a single particle type via the Animations settings (Terrain animations) provided by Optifine.
Render distance, fast/fancy graphics, fast render, fast math, and all the other stuff we can tweak via vanilla/Optifine settings? Combined, they don't come close to the wreckage that Terrain animations does to Minecraft.
(Recent) versions of vanilla/modded Minecraft seem to use much more complex graphics stuff by default, which just adds to the extreme load the renderer gets put under. While the OP will definitely have to tweak things a little bit more, most of the rest of us who maybe played fine with 1.7.10-1.9 but now have major issues with 1.10+ can pretty much wipe out 90% of the lag issue by turning off a single particle type via the Animations (Terrain animations) settings provided by Optifine.
Render distance, fast/fancy graphics, fast render, fast math, and all the other stuff we can tweak via vanilla/optifine settings? Combined they don't really come close to the wreckage that Terrain animations does to Minecraft.
Can you elaborate? I remember versions 1.3, 1.7, and 1.8 all having severe performance impacts (1.7 and 1.8 especially were horrible), but when I went from 1.8 to 1.9, and then from 1.9 to 1.10 where I now am, I didn't notice anything.
The lag I still notice is the initial horrible lag for about ten seconds when entering the Nether (started with 1.8, and it even affects my faster PC), and my weaker PC only has issues with jungles, though strangely it seems to be just certain ones, and only when looking at them (started with 1.7). I like to keep myself informed on performance impacts and changes, so what are the terrain animations and what do they do? I wonder if it might be worth changing on my weaker PC.
Can you elaborate? I remember versions 1.3, 1.7, and 1.8 all having severe performance impacts (1.7 and 1.8 especially were horrible), but when I went from 1.8 to 1.9, and then from 1.9 to 1.10 where I now am, I didn't notice anything.
The lag I still do notice is that initial horrible lag for about ten seconds when entering the Nether (started with 1.8 and even affects my faster PC) and my weaker PC only has issues with jungles, but strangely it seems to just be select ones, and only when looking at them (started with 1.7). I like to keep myself informed on performance impacts and changes, so what are the terrain animations and what did they do? I wonder if it might be worth trying to change on my weaker PC.
There are roughly 25 different animation options that can be toggled separately. Terrain seems to be by far the most expansive one, with all the others covering things that are either frequent but limited (i.e., water/lava, fire, etc.) or outright rare. Given how vastly the game improves just by turning it off, my relatively uneducated guess is that it affects the way grass and leaves get rendered (coloring and biome blending still happen). Aside from a generally complete lack of lag, I don't notice anything different about leaf or grass blocks, so it might be a far more subtle result than turning off the water/lava/fire animations (those make things look weird to the point of disorientation for me).
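If you'd rather edit the setting directly than click through the menu, Optifine stores these toggles in `optionsof.txt` in the game directory. The key names below are my best recollection of its format, so treat them as an assumption and verify against your own file:

```
ofAnimatedTerrain:false
ofAnimatedWater:true
ofAnimatedLava:true
ofAnimatedFire:true
```

Setting only the terrain key to `false` while leaving the rest alone matches the single-toggle change described above.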
I assume it also affects blocks like prismarine that have an actual animation (prismarine slowly shifts color between green and blue), but I'm generally nowhere near that stuff, so I don't actually know one way or the other. Unless there's some underlying thing applied to every block even when unused, I can't imagine what else it would apply to that isn't already covered by the other options (my further assumption is that anything affected by those other options isn't handled by Terrain).
Interesting. I've been through that menu before but never really looked too closely at most of it. At first I thought it was some new feature, since I'm a bit outdated, but you're right, there it was even for me.
I wonder what it does, and if it might help. Thanks for pointing it out and sharing your experience with it; I'll try it the next time that PC is used. I'm not expecting miracles, so if it doesn't do anything for me, that's fine. It could just be that I need a better video card for that PC with the settings and render distance I want.
Quick Question:
Is Intel HD Graphics enough for lag-free modded survival?
I have an Intel Celeron CPU (in a laptop) with 4 GB of RAM,
and 2 GB of RAM allocated to my modded launcher. Is that enough for lag-free modded survival?
Nope. Sorry, but there's no way a Celeron and 4 GB of RAM will be "lag-free." It won't be terrible, but it's not going to be flawless.
Want to host a dedicated server yourself, easily, and for free? Click here!
Need to post a DXDiag log and don't know how? Here you go!
I make YouTube videos! Why not go check em out?
My specs:
R7 1700 (8c/16t) @ 3.8ghz
Cryorig H7 cooler
G1 Gaming GTX 1080 8gb @ ~2000mhz core
16gb DDR4 3200mhz ram
250gb 850 EVO SSD
240gb Sandisk SSD Plus
1tb WD Blue 7200rpm HDD
1tb Generic 2.5" 7200rpm HDD
500gb WD 7200rpm HDD
Win 10
3x 24" 1080p Monitors @75hz
Click me, and let all your dreams come true....
TheMasterCaver's First World - possibly the most caved-out world in Minecraft history - includes world download.
TheMasterCaver's World - my own version of Minecraft largely based on my views of how the game should have evolved since 1.6.4.
Why do I still play in 1.6.4?
That's the key. Depending on the OP's definition of "modded survival" (and their definition of "lag-free"), 2 GB allocated may very well not be enough.
Mostly it depends on your settings, and on it being a laptop (if you have a laptop); there's a chance you can't even run it.
I have Intel HD 630 with 2 GB of DDR4 RAM allocated to Minecraft, and in the normal world it sometimes has lag spikes, but not a lot.
In the Nether, though, it is very laggy and crashes a lot. I have 32 mods loaded.
Get more RAM.
Of the mods I use, 6 or 7 are big or really big mods, and the others are very small changes.
Also, if you're talking about a Twitch modpack (like FTB), it might not run at all with that hardware.