Hey all, so I recently got into filming Minecraft videos for the server I'm a member of. So far I've managed to get time-lapses of the events sorted, but when it comes to filming, my laptop struggles beyond belief.
I currently have an HP DV6-6137tx:
8GB RAM
2GB dedicated graphics
i7 processor
and a 24-inch 1080p LED screen
However, when I run Minecraft I have to switch to my integrated graphics rather than my 2GB card, as Minecraft crashes the drivers on it, and I have to turn all my settings down to avoid lag.
I was going to install some shaders to make the videos look better, but with my laptop lagging as it is, I'm a bit stuck.
The strange thing is my desktop:
an AMD something-or-other, so old I rarely use it and have forgotten what chip it is, with 3GB RAM
and a 2GB graphics card (once again, I forget the model).
It seems to run at slightly higher quality than my laptop, but it doesn't have the power to run the shaders properly.
Does anyone know how to increase my Minecraft performance?
How do I get the shaders to run at full potential?
And how do I get my laptop to use all its resources? What's the point of all that power if I can't use it?
Thanks in advance
Minecraft lags even on the most powerful computer in the world because it is coded in Java. You can try installing OptiFine and switching to multi-core rendering so it uses two cores instead of one for chunk loading. Also, I would recommend allocating at most 1GB of RAM to Minecraft if it is a vanilla client; more RAM than that does more harm than good.
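If you want to set that allocation yourself on the vanilla client, one common approach is launching the game from the command line (or a small batch/script file) with the JVM heap flags. This is only a sketch; the jar name and location are assumptions, so adjust them for your own install:

```shell
# Start Minecraft with a 512MB initial heap (-Xms) and a 1GB cap (-Xmx).
# "Minecraft.jar" is assumed to sit in the current folder; point this at
# wherever your launcher jar actually lives.
java -Xms512M -Xmx1G -jar Minecraft.jar
```

The `-Xmx1G` flag is what enforces the "at most 1GB" advice above; raising it higher on a vanilla client mostly just gives the garbage collector more to churn through.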
If you can give us the maker of your graphics card, it'd be a lot easier to help you.
As far as I know, if you have an NVIDIA card, make sure to add javaw.exe (C:\Program Files\Java\javaw.exe) to the 3D programs list so your laptop uses the dedicated GPU and not the integrated one (i7s have Intel HD graphics built in).
The reason for this (in NVIDIA's case) is the Optimus software, which saves power by using the integrated GPU instead of the dedicated one. Because Java is typically used for internet applications or simple applets, Optimus often assumes it only needs the integrated GPU.
You can try installing OptiFine and switching to multi-core rendering so it uses two cores instead of one for chunk loading. Also, I would recommend allocating at most 1GB of RAM to Minecraft if it is a vanilla client; more RAM than that does more harm than good.
Yes, I have tried installing OptiFine when trying to get the shaders to work, but that just caused more problems than it solved.
And how do I allocate RAM to Minecraft? I have a vanilla client, minus a camera mod or two.
If you can give us the maker of your graphics card, it'd be a lot easier to help you.
As far as I know, if you have an NVIDIA card, make sure to add javaw.exe (C:\Program Files\Java\javaw.exe) to the 3D programs list so your laptop uses the dedicated GPU and not the integrated one (i7s have Intel HD graphics built in).
The reason for this (in NVIDIA's case) is the Optimus software, which saves power by using the integrated GPU instead of the dedicated one. Because Java is typically used for internet applications or simple applets, Optimus often assumes it only needs the integrated GPU.
I don't have my laptop on me at the moment, but a quick Google search says I have an AMD Radeon HD 6770M.
However, as I said above, Minecraft is using my graphics card, but for some reason it crashes after about 30 seconds of use.
Wrong. Although Java won't, in theory, match the speed of native languages like C, that isn't the reason for Minecraft's 'slow' performance.
Minecraft is often compared to games such as Call of Duty. That comparison is completely wrong because those games deal mostly with static terrain: it's precomputed and stored so it can be drawn fast. Minecraft doesn't have that luxury and has to do everything in real time. The rendering isn't the major issue; all the world updates, along with the use of old immediate-mode OpenGL, put a lot of load on the CPU, making the game CPU-bound. Don't expect Minecraft to suddenly run at 2000fps if it were coded in C. That simply won't happen, as Minecraft is a lot more complex and intensive than it might look.
I'm not expecting 200fps from Minecraft, as I understand there's a lot of processing required on the Java side.
But I have seen videos online of scenic shots using HD graphics and shaders at around 30fps,
while when I try it, I can only get about 12 on a good day.