Is it just me, or are the majority of people here complaining about lag in Minecraft playing on laptops?
Just because your laptop plays WoW on good settings doesn't mean you are going to get the same results playing Minecraft.
You may meet the minimum system requirements, but then minimum results are what you are going to get. When a game updates, it improves on the things that are bad about it, and this sometimes makes the game demand more of your system. So although your game wasn't lagging on 1.6, now on 1.7.8 it is. There are a lot of factors behind why this is the case: if you were meeting the minimum and not the recommended system requirements and you updated your game, you're probably not going to get the same results.
If you are running a laptop that is a few years old, unless it was top of the line back then, don't expect it to perform at peak settings. You are going to have to lower your settings and deal with it, or upgrade your older laptop.
If you know anything about Java, you know that it is a resource-intensive platform, prone to memory leaks. Restarting Minecraft and your computer can improve your gameplay for that session.
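As a quick aside (my own illustration, nothing from the game itself): you can check how much heap the JVM actually has to work with, since the cap set by the -Xmx flag is often well below your installed RAM, and raising it in the launcher's JVM arguments is one of the common tuning steps.

```java
// Prints the JVM's heap limits. Run with e.g. `java -Xmx2G HeapCheck` to see
// how the -Xmx flag (also settable in the Minecraft launcher's JVM arguments)
// changes the cap. Nothing here is Minecraft-specific.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("Max heap (MB):     " + rt.maxMemory() / mb);
        System.out.println("Current heap (MB): " + rt.totalMemory() / mb);
        System.out.println("Free in heap (MB): " + rt.freeMemory() / mb);
    }
}
```

Note that allocating more heap isn't always better; as discussed further down the thread, the OS uses leftover memory as a disk cache.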
Lessening the load on your CPU and limiting the number of programs that load when your computer starts can also improve your gameplay.
Please feel free to leave your questions or comments below. I would like to hear from anyone whose laptop has lag or slow gameplay, and what you did or are doing to improve it.
I'd say that it's the people with laptops that have the most lag. I have a laptop myself, but I find it OK. Yes, occasionally on servers it WILL lag, and when it does, it's always severe. I've lost many a PvP fight to severe lag. But raging over it won't do anything; what happens happens.
I've played on a laptop for quite a while, and one day it just started lagging to the point that I couldn't play anymore. I tried everything: uninstalling and reinstalling everything I could, changing options, etc. Then, just now, I discovered that my chicken farm had far too many chickens. I killed them all and, voila, my month-long lag is gone.
I recently got a new laptop. Minecraft runs fine on it. When a server gets a bit crowded it may lag for a few seconds, but other than that it's fine. I use a pretty high-resolution texture pack as well. It might just be your computer.
I'm running an AMD six-core CPU and an AMD 7700 1 GB GPU with all up-to-date drivers, on a gaming PC. The update is laggy as heck. Can't play. It's not just laptops.
Well, the community usually calls these players' laptops "toasters", and insults them and yells at them about getting a new one. For all of those people out there: not everyone has an extra $199.29 lying around...
BTW, 1.9 is really laggy on my computer; it's possibly just the game.
I've played on a laptop for quite a while, and one day it just started lagging to the point that I couldn't play anymore. I tried everything: uninstalling and reinstalling everything I could, changing options, etc. Then, just now, I discovered that my chicken farm had far too many chickens. I killed them all and, voila, my month-long lag is gone.
I guess the moral is check your automatic farms.
I've killed a realm server via this method. I stupidly made an automatic chicken machine (eggs go into a hopper, come back up through a dispenser, and it never ends).
Minecraft runs fine on my laptop (avg 60 FPS) even with around 80 mods, although that's with Optifine.
My Dell plays 50 mods at 90+ FPS.
My laptop peaks at 30 FPS but often drops as low as 10 or 15 when I first join a server. However, in single-player I can get as much as 60 FPS. Optifine and a lower render distance make no difference. I don't think it's my specs, though: I have a Dell Inspiron 15 7573 with an i7 processor and 16 GB of RAM, and I would expect it to easily push 100 FPS on vanilla. Yes, it's a few years old, but as far as I remember I never got decent FPS on servers, while my brother, playing on a much older HP Pavilion with lower specs on the same network and connecting to the same server, gets 50 FPS. Crucial says I can double the RAM to 32 GB and get an SSD that is about 3x faster, but I don't want to pay $200 for upgrades when I don't think that's the problem. What am I doing wrong? Also, I do not normally use any mods; I tried Optifine, but it made no noticeable difference.
I'm not sure which specific Core i7 that is, but the model number indicates it's newer than my Inspiron 3537, which runs the game better than that, so it should be peaking above 30 FPS. Lowering render distance should absolutely make a massive difference, as that's probably THE biggest thing impacting performance in this game (the only time I can think of where it won't is if something other than the CPU is holding back performance, such as when using shaders, or if the GPU is otherwise the limiting factor).
Also, playing on servers SHOULD give better performance than single-player, as your PC isn't having to run the internal server like it does in single-player. You're also limited in render distance (typically to 10, I think?), as you only get sent as many chunks as the server is configured for, so unless you run single-player at a render distance of 6 or below, it shouldn't be slower on servers. Make sure you're not confusing connection lag with performance lag.
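To put rough numbers on that (my own back-of-envelope figuring, not game code): the chunks loaded around a player form roughly a square of side 2r+1 for render distance r, so the chunk count, and with it the rendering and memory load, grows quadratically with the slider.

```java
// How many chunks are roughly in view at a given render distance r?
// The loaded area is approximately a (2r+1) x (2r+1) square of chunks
// centered on the player. Illustration only; the game's exact shape varies.
public class ChunkCount {
    static int chunksInView(int renderDistance) {
        int side = 2 * renderDistance + 1;
        return side * side;
    }

    public static void main(String[] args) {
        // A server capped at 10 sends far fewer chunks than single-player at 16:
        System.out.println("r=10: " + chunksInView(10) + " chunks"); // 441
        System.out.println("r=16: " + chunksInView(16) + " chunks"); // 1089
        System.out.println("r=6:  " + chunksInView(6) + " chunks");  // 169
    }
}
```

So going from a server's cap of 10 to single-player at 16 roughly 2.5x's the loaded area, which is why the render-distance slider is usually the first thing to try.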
Upgrading to 32 GB of RAM and an SSD won't really help, so I wouldn't spend the money if making a difference in Minecraft is what you're after (though I recommend an SSD anyway; there's really no reason not to have one as your main drive, and it makes overall computing much smoother; just don't expect it to change matters for Minecraft).
As an aside, having an SSD for saving the local world massively improves gameplay in local games when you are generating new chunks, especially if you have a fast CPU.
There actually isn't much of a difference between a traditional hard drive and even a RAM disk, and generating new chunks is entirely CPU-bound:
https://www.reddit.com/r/Minecraft/comments/7octba/minecraft_doesnt_really_care_if_its_run_on_an_ssd/
The most significant difference was loading an existing world, where an HDD was 43% slower than an SSD or RAM disk, but the relative difference was much smaller in other tests. Also, even in 1.6.4 the game saves chunks asynchronously, using a dedicated file I/O thread, precisely so it doesn't need to wait for them to be saved (this does not apply to chunk loading, which is nevertheless smart enough that if a chunk that was unloaded but not yet saved needs to be reloaded, it will be fetched from the save queue instead of from disk). Even then, the OS itself will cache reads and writes (this is one reason you should never allocate too much memory: leave plenty for the OS, as it will use free memory as a disk cache).
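The "fetch from the save queue instead of disk" behavior described above can be sketched roughly like this. All class and method names here are made up for illustration; the game's real internals differ.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Rough sketch of an asynchronous chunk saver: unloaded chunks go into a
// pending map that a dedicated I/O thread drains to disk, and a reload
// request checks the pending map first so it never reads stale data from
// disk while a newer copy is still waiting to be written.
public class ChunkSaveQueue {
    private final Map<Long, String> pending = new ConcurrentHashMap<>();

    static long key(int x, int z) {                    // pack chunk coords into one key
        return ((long) x << 32) | (z & 0xFFFFFFFFL);
    }

    public void queueForSave(int x, int z, String chunkData) {
        pending.put(key(x, z), chunkData);             // game thread returns immediately
    }

    // Called from the dedicated I/O thread; the entry is removed only after
    // the (simulated) disk write completes.
    public void flushOne(int x, int z) {
        String data = pending.get(key(x, z));
        if (data != null) {
            // writeToRegionFile(x, z, data);          // real disk I/O would go here
            pending.remove(key(x, z));
        }
    }

    // Reload: prefer the not-yet-saved copy over whatever is on disk.
    public String load(int x, int z) {
        String queued = pending.get(key(x, z));
        return (queued != null) ? queued : readFromDisk(x, z);
    }

    private String readFromDisk(int x, int z) {
        return "chunk-from-disk(" + x + "," + z + ")"; // stand-in for a region-file read
    }
}
```

The point of the design is that the main thread only ever touches the in-memory map, so slow disks stall the I/O thread, not the game tick.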
For example, the first time I recompile the game in MCP, which reads/writes a few thousand files, it takes about twice as long as subsequent recompilations, until I restart the computer or enough time has passed without accessing them. I haven't measured the difference when loading a world, but it is quite small, even for my modded worlds, which have significantly larger region files: up to 8 MB vs 6 MB for vanilla (my worlds are larger than the average player's, since lighting up caves increases the complexity of chunk data; otherwise they would be closer to 5 MB per region. 1.13+ may be different due to changes in the save format). Even so, they load within a second (it helps that I disabled spawn chunks, which can also be done by enabling "smooth world" in newer versions of Optifine; be aware, though, that anything that depends on spawn chunks being permanently loaded will not work).
Also, I've seen plenty of cases where people with high-end desktops (like an i9-9900 and an RTX 2080) had performance issues, which seem to be related more to some specific combination of hardware and software than to raw performance, as indicated by the following bug reports (the first was resolved as "upgrade your system or lower settings", but many people whose systems still met the newly updated system requirements had issues):
MC-45458 Framerate drop/lag in 1.8 for some hardware setups
MC-164123 Poor FPS performance with new rendering engine
Perhaps the biggest issue with laptops is the switchable graphics they often have, which will frequently run the game on the integrated GPU instead of the more powerful dedicated GPU, since the drivers don't recognize Java as a high-performance 3D application and/or the power-saving settings are too strict.
Please do install Optifine. I previously had terrible lag until I installed Optifine, and the lag was gone instantly.
It has been a long time since I got a new laptop, and I have concerns about whether the i5-7200U CPU in mine will handle Minecraft at acceptable frame rates. It is paired with an Nvidia 940MX GPU, but that isn't going to guarantee good performance, because at the end of the day it's an old architecture.
I could probably get away with both Java and Bedrock Edition (only just) if I don't run extra unnecessary apps in the background, but what happens when the system requirements of the game change? This is a game that is continuously receiving new content, not just patches. Thankfully, they're not shoehorning the ray-tracing stuff onto us, allowing people with old hardware to play as if it's business as usual.
I successfully played MC again on my old MacBook Pro with a Core Duo and 8 GB of RAM.
I stopped playing in 1.12 and went back a couple of days ago to 1.6.4 (I think .4 is the latest). I installed Optifine, reduced view distance to 10 chunks, and got 50-60 FPS at native (1366x768) resolution (save during chunk generation).
I should also say that I've installed Linux on it and I'm using a custom (but stock with the OS) JVM.
I wish Bedrock Edition were on Linux, because then I could have the best of both worlds on my laptop: a much more resource-friendly OS combined with the version of the game I actually play. I could use the Java version, but then I'd have to create a new world file to play it that way. I'm sure it would work; there are benefits to using Ubuntu over Windows 10. But even that wouldn't save us forever.
Well, I know there are unofficial ways to convert worlds between versions... of course, make more than one backup first.