Hi guys, my Forge log says this when I try to put OptiFine into my mods folder:
"2013-10-27 09:42:04 [INFO] [ForgeModLoader] FML has found a non-mod file OptiFine_1.6.4_HD_U_C6.jar in your mods directory. It will now be injected into your classpath. This could severe stability issues, it should be removed if possible."
Yet, Optifine does not work when I test it in game. Can anyone tell based on this log whether something is going wrong? Thanks.
If you're running 1.7.2, OptiFine has not been updated to that version yet.
Bro, not to upset you or anything, but these are nowhere near gaming specs... 4GB is the minimum, as Windows usually takes up 2GB from all the services it runs. I'm not sure what Pineview is, but I'm assuming this is a laptop or netbook. If Minecraft is your main thing, I'd save up and build a nice $400 mini-desktop to play games on, or pay more to get a laptop with a dedicated graphics card and a Core i3 processor.
You, sir, are not rendering Minecraft in fullscreen, which is how everyone plays. FPS is different when the window is the size of an iPod.
It's a Gateway laptop. Again, not the best, but at least you have a Pentium and the RAM. Since Minecraft 1.7+ is going to be using shaders, a graphics card is going to be required, if not now, then soon...
Tip for all: your system should be able to "load balance" what's on the machine. Whether you are a gamer or not, a graphics card is usually desirable in all consumer applications these days, even just to run the OS and surf the web. Like I stated before, I have an i3 and a GT 610 (the cheapest hardware available at the time). The system runs fine and can play most games for me and my wife on Low/Medium (examples: LoL + Minecraft). These games run fine because the system has two sources of processing to deal with the load. Most laptops and low-end PCs have graphics "built in" to the processor (Intel HD // Radeon APU). This is bad juju because the processor not only has to deal with the data load, but the graphical load as well. Minecraft specifically has to render chunks (65,536 blocks each). 8 chunks is 524,288 blocks, and every block is an array of data holding values such as: textures, animation, light data, block IDs, blast resistance, tool resistance, flammability, placement parameters, texture connection data, entity data, etc. Multiply those values by 524,288 and you are pushing a lot of on-demand information. So, if your processor has 2 cores and needs 1 core to do video, that leaves 1 core to deal with Minecraft, Windows, services, and any other applications you are running (music, YouTube, systray stuff). In short: you're gonna have a bad time.
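For what it's worth, the per-chunk numbers above can be sanity-checked with a little arithmetic (in Minecraft 1.6 a full chunk column is 16x16 blocks across and 256 blocks tall):

```java
public class ChunkMath {
    public static void main(String[] args) {
        // A chunk column in Minecraft 1.6 spans 16x16 blocks horizontally
        // and 256 blocks vertically.
        int blocksPerChunk = 16 * 16 * 256;   // 65,536 blocks
        int eightChunks = 8 * blocksPerChunk; // 524,288 blocks
        System.out.println(blocksPerChunk + " blocks per chunk");
        System.out.println(eightChunks + " blocks in 8 chunks");
    }
}
```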
Bonus tip: you can buy an i3/GT micro desktop for about $400 (tower + internals). My suggestion is to always get the newest socket for your motherboard, so you can upgrade the next time games decide to crank it up a notch. Recommended specs to live by:
CPU : Core i3+
MEM : 4GB + (8GB for actual gamers)
VID : GeForce or Radeon with 128-bit Bus Width (Examples GT630 / Radeon 6xxx Series)
HDD : 7200RPM (Size depends on User)
SND : Sound Blaster Go PRO (Not required, but $20 on Amazon to help alleviate the CPU even more)
PSU : 480W (Way overkill, but longevity is higher with super light loads)
Um, I have an 8-year old PC with only 2GB RAM, and Minecraft is still playable. Specs:
Windows XP Professional SP3 32-bit
Intel Pentium 4, 3.2GHz
2GB RAM DDR-400
Nvidia GeForce 8500 GT 256MB
I usually play in windowed mode, but fullscreen works too.
That's fine... but those specs are acceptable for playing Minecraft. 3.2GHz vs. his 1.8GHz processor, Pentium vs. Atom? Plus you have an 8500GT. So imagine, if you will, that these are cars: you have a V6 engine with a turbo, and he's driving a four-banger. You can't expect him to go as fast as you.
You, sir, are not rendering Minecraft in fullscreen, which is how everyone plays. FPS is different when the window is the size of an iPod.
I usually don't play in fullscreen because it's a hassle when you need to switch to other things while you play, but contrary to what you say, I get notably higher FPS when I'm in fullscreen, not the other way around xD
Bro, not to upset you or anything, but these are nowhere near gaming specs... 4GB is the minimum, as Windows usually takes up 2GB from all the services it runs. I'm not sure what Pineview is, but I'm assuming this is a laptop or netbook. If Minecraft is your main thing, I'd save up and build a nice $400 mini-desktop to play games on, or pay more to get a laptop with a dedicated graphics card and a Core i3 processor.
You, sir, are not rendering Minecraft in fullscreen, which is how everyone plays. FPS is different when the window is the size of an iPod.
It's a Gateway laptop. Again, not the best, but at least you have a Pentium and the RAM. Since Minecraft 1.7+ is going to be using shaders, a graphics card is going to be required, if not now, then soon...
Tip for all: your system should be able to "load balance" what's on the machine. Whether you are a gamer or not, a graphics card is usually desirable in all consumer applications these days, even just to run the OS and surf the web. Like I stated before, I have an i3 and a GT 610 (the cheapest hardware available at the time). The system runs fine and can play most games for me and my wife on Low/Medium (examples: LoL + Minecraft). These games run fine because the system has two sources of processing to deal with the load. Most laptops and low-end PCs have graphics "built in" to the processor (Intel HD // Radeon APU). This is bad juju because the processor not only has to deal with the data load, but the graphical load as well. Minecraft specifically has to render chunks (65,536 blocks each). 8 chunks is 524,288 blocks, and every block is an array of data holding values such as: textures, animation, light data, block IDs, blast resistance, tool resistance, flammability, placement parameters, texture connection data, entity data, etc. Multiply those values by 524,288 and you are pushing a lot of on-demand information. So, if your processor has 2 cores and needs 1 core to do video, that leaves 1 core to deal with Minecraft, Windows, services, and any other applications you are running (music, YouTube, systray stuff). In short: you're gonna have a bad time.
Bonus tip: you can buy an i3/GT micro desktop for about $400 (tower + internals). My suggestion is to always get the newest socket for your motherboard, so you can upgrade the next time games decide to crank it up a notch. Recommended specs to live by:
CPU : Core i3+
MEM : 4GB + (8GB for actual gamers)
VID : GeForce or Radeon with 128-bit Bus Width (Examples GT630 / Radeon 6xxx Series)
HDD : 7200RPM (Size depends on User)
SND : Sound Blaster Go PRO (Not required, but $20 on Amazon to help alleviate the CPU even more)
PSU : 480W (Way overkill, but longevity is higher with super light loads)
That much I do understand, but I can't really do much about the laptop at the moment, since it was a gift from my parents, just so I would have a computer. Thanks for the tip. When I do manage to buy a rig of some sort, you can bet it will be a desktop of some kind; maybe not the most advanced, but at least something that handles better. I haven't had many problems with it really, thankfully. With the OptiFine features, I usually turn a few things off and don't use certain ones. It will work for the time being, but when they do decide to up the ante, I will just have to play whatever version I can until I'm able to get a better rig. :D
My first world: getting back to it is a pleasure I enjoy with each new update that brings in more things to add.
I am unable to disable the debug profiler when loading OptiFine through Forge. The debug profiler also seems to be causing significant lag. Has anybody else had this problem?
Bump. This is still a problem for me. I have disabled OptiFine for the time being. I suppose my issue may vanish with the release of OptiFine 1.7.*, but to my knowledge that is not yet released, and the server(s) I play on are still on 1.6.4.
It just baffles me they haven't found a way to make Minecraft perform any better than it does. In all seriousness, a single core CPU running at 1.2 GHz or so, adequate amount of RAM (2GB+ for Windows XP and 4GB+ for Windows Vista and up), and a video card with at least 128 MB of VRAM should theoretically run Minecraft at a solid 60 FPS at 1920x1080 (granted you're not trying to run some super fancy shaders).
When my computer was down, my dad's AMD Sempron (which is a dual core CPU) and GeForce 5800 struggled to run it at like 20 FPS with all the graphic settings set to fast, which is ridiculous.
I know people keep saying "Derp, buy a faster computer", but that's not where the problem lies. The people having problems understand they don't have the fastest computers in the world. They understand they're not going to pull 300 FPS at 1920x1080 with a 512x512 resolution texture pack and Sonic Ether's Unbelievable Shaders. The problem is any computer -- hell, a netbook made in the past 10 years even (and since netbooks aren't even that old I think you get my drift), theoretically should run Minecraft at a solid 60 FPS in the vanilla state. The fact we even have to use mods such as OptiFine to maximize the performance is downright pitiful. Even my computer, which is quite a beast (runs Battlefield 3 on Ultra without even dropping below 35 FPS in even the most intense fights) doesn't get maximum performance in Minecraft unless I use Optifine. That in itself is terrible.
It's really time to ditch Java and rewrite this game in C++. The game's potential would quadruple, and the performance would shoot through the roof.
That would mean a complete rewrite of the whole game, which would take months.
When I take OptiFine out of MC, everything works perfectly,
but as soon as I install OptiFine I get the weird pages in NEI, as shown in the spoiler.
The page it does it on is the one where Tinkers' Construct is.
All mods are up to date, and all block IDs and item IDs have been checked; there are no conflicts at all.
Everything works perfectly when there is no OptiFine.
This problem only happens when I have OptiFine installed.
** Look at my signature to see which NEI I have installed.
Windows 7 x64-bit SP-1 - MC v1.7.10 - Forge 10.13.4.1517 - CodeChickenCore v1.0.7.47 - NEI v1.0.5.118 - (MultiMC v5.0.4.8)
---- Minecraft Crash Report ----
// You should try our sister game, Minceraft!
Time: 16:45 28/10/13
Description: Unexpected error
java.lang.NullPointerException
at bdd.a(SourceFile:322)
at bdy.l_(SourceFile:74)
at beh.a(SourceFile:57)
at atv.k(SourceFile:1402)
at atv.S(SourceFile:663)
at atv.d(SourceFile:619)
at net.minecraft.client.main.Main.main(SourceFile:101)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at net.minecraft.launchwrapper.Launch.launch(Launch.java:131)
at net.minecraft.launchwrapper.Launch.main(Launch.java:27)
A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------
-- System Details --
Details:
Minecraft Version: 1.6.4
Operating System: Windows 7 (amd64) version 6.1
Java Version: 1.7.0_40, Oracle Corporation
Java VM Version: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation
Memory: 4119262728 bytes (3928 MB) / 4285005824 bytes (4086 MB) up to 4285005824 bytes (4086 MB)
JVM Flags: 3 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xmx4096M -Xms4096M
AABB Pool Size: 16282 (911792 bytes; 0 MB) allocated, 15444 (864864 bytes; 0 MB) used
Suspicious classes: IWrUpdater, Config, WrUpdates, ...]
IntCache: cache: 0, tcache: 0, allocated: 0, tallocated: 0
Launched Version: 1.6.4-OptiFine_HD_U_C6
LWJGL: 2.9.0
OpenGL: GeForce GTX 760/PCIe/SSE2 GL version 4.4.0, NVIDIA Corporation
Is Modded: Very likely; Jar signature invalidated
Type: Client (map_client.txt)
Resource Pack: Sphax PureBDcraft 128x MC16.zip
Current Language: Canadian English (CA)
Profiler Position: N/A (disabled)
Vec3 Pool Size: ~~ERROR~~ NullPointerException: null
You, sir, are not rendering Minecraft in fullscreen, which is how everyone plays. FPS is different when the window is the size of an iPod.
... Most laptops and low-end PCs have graphics "built in" to the processor (Intel HD // Radeon APU). This is bad juju because the processor not only has to deal with the data load, but the graphical load as well. Minecraft specifically has to render chunks (65,536 blocks each). 8 chunks is 524,288 blocks, and every block is an array of data holding values such as: textures, animation, light data, block IDs, blast resistance, tool resistance, flammability, placement parameters, texture connection data, entity data, etc. Multiply those values by 524,288 and you are pushing a lot of on-demand information. ...
First, not everyone plays fullscreen. I happen to prefer the "wider view" over a "squarer" view. (I keep my window at 854x480; but then, I've only got a 1024-wide display.)
Second, depending on your operating system, graphics card/driver, and general setup, you may find that fullscreen has to copy pixels to the screen once, while windowed mode has to copy twice (once to a buffer, then to the screen). In such a setup, if fullscreen has fewer than twice the pixels, it may be faster. (Useful if your fullscreen mode can be set to a lower pixel count.)
Third, the built-in HD 3000/4000 GPU, while not very powerful or feature-rich, is not CPU bound. It does not consume CPU power to operate. Its biggest contention, as I understand it, is that it wants to use normal system RAM as VRAM, while dedicated VRAM can actually serve two users at the same time (real VRAM can be updated by the system and read by the display generator simultaneously, without having to pause one of the two).
Fourth, your concept of what Minecraft has to do is... let's say "badly mistaken". Minecraft does a huge amount of culling of unused data before it tries to render. With Advanced OpenGL, which the 3000 chipset does support, there are typically only about 100 minichunks being examined for graphical information -- each minichunk is 16x16x16, only block faces that are exposed to air get rendered (so most blocks are not rendered), and most blocks are static images based on block ID, without any metadata or tile-entity information being read. So a lot of information isn't even looked at, and more is thrown away, so very little actually has to be read and processed to generate the display lists for the OpenGL driver.
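To illustrate the face-culling point with a toy sketch (this is not Mojang's actual renderer, just the idea): geometry is only emitted where a solid block touches air, so even a completely solid 16x16x16 section produces only its outer shell of faces.

```java
public class CullingSketch {
    static final int N = 16; // a render section ("minichunk") is 16x16x16

    // blocks[x][y][z]: 0 = air, 1 = solid.
    // Count faces where a solid block meets air; only these get drawn.
    static int exposedFaces(int[][][] blocks) {
        int faces = 0;
        int[][] dirs = {{1,0,0},{-1,0,0},{0,1,0},{0,-1,0},{0,0,1},{0,0,-1}};
        for (int x = 0; x < N; x++)
            for (int y = 0; y < N; y++)
                for (int z = 0; z < N; z++) {
                    if (blocks[x][y][z] == 0) continue; // air emits nothing
                    for (int[] d : dirs) {
                        int nx = x + d[0], ny = y + d[1], nz = z + d[2];
                        boolean outside = nx < 0 || nx >= N || ny < 0 || ny >= N
                                       || nz < 0 || nz >= N;
                        // for this sketch, treat neighbours outside the section as air
                        if (outside || blocks[nx][ny][nz] == 0) faces++;
                    }
                }
        return faces;
    }

    public static void main(String[] args) {
        // A fully solid section: 4096 blocks x 6 faces = 24,576 candidate faces,
        // but only the 6 x 16 x 16 = 1,536 shell faces are exposed.
        int[][][] solid = new int[N][N][N];
        for (int[][] plane : solid)
            for (int[] row : plane)
                java.util.Arrays.fill(row, 1);
        System.out.println(exposedFaces(solid)); // 1536
    }
}
```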
I usually don't play in fullscreen because it's a hassle when you need to switch to other things while you play, but contrary to what you say, I get notably higher FPS when I'm in fullscreen, not the other way around xD
Very possible; as I said, some systems only copy pixels once in full-screen, but twice in windowed.
It just baffles me they haven't found a way to make Minecraft perform any better than it does. In all seriousness, a single core CPU running at 1.2 GHz or so, adequate amount of RAM (2GB+ for Windows XP and 4GB+ for Windows Vista and up), and a video card with at least 128 MB of VRAM should theoretically run Minecraft at a solid 60 FPS at 1920x1080 (granted you're not trying to run some super fancy shaders).
And ... your basis for this claim is what?
Have you made any attempt to document how much non-graphical work minecraft has to do every second to run at full speed?
Have you made any attempt to document how much graphical work is spent doing what graphics operations, or how much would be gained by making changes to the rendering model?
Do you know what Mojang has said about the performance hit minecraft takes from the use of the now outdated, old style OpenGL that it uses, or that they want to toss it for newer, better performance up-to-date OpenGL? (It would not surprise me if Minecraft 1.7 did that, by the way.)
Have you even done anything to address java's memory/garbage collection? (The default garbage collector is designed for throughput -- least total GC overhead -- in uses where response time is not critical -- where a 2-10 second pause is considered OK.)
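On the garbage-collection point: the collector can be swapped at launch with JVM flags. As a sketch only (Java 7-era options; `minecraft.jar` stands in for however you actually launch the game, and the heap sizes are illustrative, not a tuned recommendation), something like this trades peak throughput for shorter pauses:

```shell
# Example Java 7-era launch flags favouring shorter GC pauses over
# raw throughput; adjust heap sizes to your machine.
java -Xms1G -Xmx2G \
     -XX:+UseConcMarkSweepGC \
     -XX:+UseParNewGC \
     -jar minecraft.jar
```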
...
The problem is any computer -- hell, a netbook made in the past 10 years even (and since netbooks aren't even that old I think you get my drift), theoretically should run Minecraft at a solid 60 FPS in the vanilla state. The fact we even have to use mods such as OptiFine to maximize the performance is downright pitiful. Even my computer, which is quite a beast (runs Battlefield 3 on Ultra without even dropping below 35 FPS in even the most intense fights) doesn't get maximum performance in Minecraft unless I use Optifine. That in itself is terrible.
It's really time to ditch Java and rewrite this game in C++. The game's potential would quadruple, and the performance would shoot through the roof.
Again: Please document your research showing that:
1. Minecraft's performance should be that much better,
2. The performance hit is inherent in Java as opposed to the rendering model being used.
(In regard to a mod that gives realistic animal genetics):
Would you really rather have bees that make diamonds and oil with magical genetic blocks?
... did I really ask that?
It says it's compatible with #933. Load 933 and try OptiFine that way.
That's fine... but those specs are acceptable for playing Minecraft. 3.2GHz vs. his 1.8GHz processor, Pentium vs. Atom? Plus you have an 8500GT. So imagine, if you will, that these are cars: you have a V6 engine with a turbo, and he's driving a four-banger. You can't expect him to go as fast as you.
First, not everyone plays fullscreen. I happen to prefer the "wider view" over a "squarer" view. (I keep my window at 854x480; but then, I've only got a 1024-wide display.)
Second, depending on your operating system, graphics card/driver, and general setup, you may find that fullscreen has to copy pixels to the screen once, while windowed mode has to copy twice (once to a buffer, then to the screen). In such a setup, if fullscreen has fewer than twice the pixels, it may be faster. (Useful if your fullscreen mode can be set to a lower pixel count.)
Third, the built-in HD 3000/4000 GPU, while not very powerful or feature-rich, is not CPU bound. It does not consume CPU power to operate. Its biggest contention, as I understand it, is that it wants to use normal system RAM as VRAM, while dedicated VRAM can actually serve two users at the same time (real VRAM can be updated by the system and read by the display generator simultaneously, without having to pause one of the two).
Fourth, your concept of what Minecraft has to do is... let's say "badly mistaken". Minecraft does a huge amount of culling of unused data before it tries to render. With Advanced OpenGL, which the 3000 chipset does support, there are typically only about 100 minichunks being examined for graphical information -- each minichunk is 16x16x16, only block faces that are exposed to air get rendered (so most blocks are not rendered), and most blocks are static images based on block ID, without any metadata or tile-entity information being read. So a lot of information isn't even looked at, and more is thrown away, so very little actually has to be read and processed to generate the display lists for the OpenGL driver.
Well then...
As "optimized" as that rendering setup sounds, this doesn't explain the issue. My default, as is the Minecraft default, is 854x480 (my monitor resolution is 1920x1080). I misspoke earlier when I said fullscreen; I meant to say maximized (still not used to Minecraft's new real fullscreen option). So, with that said, a system rendering 44 FPS windowed at 854x480 will take a performance hit windowed at 1920x1058. (Note: that is nearly five times the game's default draw area.)
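The size of that hit is easy to put a number on. Comparing the two window areas (taking 1920x1058 as the maximized client area mentioned above):

```java
public class PixelMath {
    public static void main(String[] args) {
        int defaultArea   = 854 * 480;    // Minecraft's default window
        int maximizedArea = 1920 * 1058;  // 1080p screen minus window chrome
        System.out.println(defaultArea);   // 409,920 pixels per frame
        System.out.println(maximizedArea); // 2,031,360 pixels per frame
        // Roughly five times as many pixels to fill each frame.
        System.out.printf("%.1fx%n", (double) maximizedArea / defaultArea);
    }
}
```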
Have you made any attempt to document how much non-graphical work minecraft has to do every second to run at full speed?
Have you made any attempt to document how much graphical work is spent doing what graphics operations, or how much would be gained by making changes to the rendering model?
Do you know what Mojang has said about the performance hit minecraft takes from the use of the now outdated, old style OpenGL that it uses, or that they want to toss it for newer, better performance up-to-date OpenGL? (It would not surprise me if Minecraft 1.7 did that, by the way.)
Have you even done anything to address java's memory/garbage collection? (The default garbage collector is designed for throughput -- least total GC overhead -- in uses where response time is not critical -- where a 2-10 second pause is considered OK.)
I'm with you on this. However, I'm more concerned that people feel games should run on a single core at 1.2GHz, 2GB of RAM, and 128MB of VRAM. I have not seen those specs in, what, 10 years? I build and fix PCs as a side gig, and I still can't believe people think computers should last as long as cars.
Not going to say I have documentation, but I do have considerable experience in PC building for gaming and application support. When I do gaming PCs, Minecraft is one of the benchmarks I run. $400 can get you a "Minecraft machine", which would probably need to be replaced (or upgraded with the right parts) in a year or so.
If you want Minecraft to run at will, get it for the consoles and deal with the world cap. Yes, the Xbox 360 has an older patch, but it's designed for a system that runs with 512MB of unified memory. Plus, the Xbox One and PS4 should have better world caps.
Again: Please document your research showing that:
1. Minecraft's performance should be that much better,
2. The performance hit is inherent in Java as opposed to the rendering model being used.
I code in both Java and C#. I find Java to be far less effective at pulling anything off -- not because of the code's structure, but because of how people utilize it.
About the comment on switching to C++: that's all dandy, but games have two puzzle pieces: code and developer. You can have the best code/engine in the world, and it can still run poorly if the developer does not take his product into consideration.
Let's take Facebook games as an easy example. If you load up some of the EA ones, they run horrifically slowly, even on i7 PCs, because of a poor development process. The ones I checked out were written in ActionScript 2.0 (Flash), which should not be an issue; I've written AS and AS 2.0 and never had anything run that badly. It's all about what you put into it and how you manage your resources. Some languages can handle certain things better, but you also have to think about deployment as well.
Java is on everything, so theoretically Minecraft can run on anything with the proper Java version installed. I'm sure the reason the Raspberry Pi has a different Minecraft version is simply the significant processing difference between it, consoles, and desktops.
Just wanted to inform you of the stack trace I'm getting for the error problems I'm having,
which I posted in post #37269
and forgot to add to it. (The original post would not let me edit, for some odd reason.)
I was informed by the Tinkers' Construct dev that the problem is on your end, just FYI.
If you're running 1.7.2, OptiFine has not been updated to that version yet.
I'm still on 1.6.4, and OptiFine works if I copy the files straight into the Forge jar, but it won't work as a mod file.
Are you running Forge #933?
Not being on a computer I can't say much, but check your computer's control panel.
No, 935.
When my computer was down, my dad's AMD Sempron (which is a dual core CPU) and GeForce 5800 struggled to run it at like 20 FPS with all the graphic settings set to fast, which is ridiculous.
I know people keep saying "Derp, buy a faster computer", but that's not where the problem lies. The people having problems understand they don't have the fastest computers in the world. They understand they're not going to pull 300 FPS at 1920x1080 with a 512x512 resolution texture pack and Sonic Ether's Unbelievable Shaders. The problem is any computer -- hell, a netbook made in the past 10 years even (and since netbooks aren't even that old I think you get my drift), theoretically should run Minecraft at a solid 60 FPS in the vanilla state. The fact we even have to use mods such as OptiFine to maximize the performance is downright pitiful. Even my computer, which is quite a beast (runs Battlefield 3 on Ultra without even dropping below 35 FPS in even the most intense fights) doesn't get maximum performance in Minecraft unless I use Optifine. That in itself is terrible.
It's really time to ditch Java and rewrite this game in C++. The game's potential would quadruple, and the performance would shoot through the roof.
Paragraph 3, Sentence 1. Read it. Their point is that Minecraft's performance is awful even on a decent computer. It could be much better.
When I install this version of OptiFine on my modded MC 1.6.4,
I get a weird NEI page when going to a certain page in NEI.
Here is an exact example of what's happening:
http://i.imgur.com/IKaDihf.png
First, not everyone plays full-screen. I happen to prefer the "wider view" over a "square-er" view. (I keep my window at 854x480; but then, I've only got 1024 wide display).
Second, depending on your operating system, graphics card/driver, and general setup, you may find that full-screen has to copy pixels to screen once, while windowed mode has to copy twice (once to a buffer, then to the screen). In such a setup, if full-screen has less than twice the pixels, it may be faster. (Useful if your full-screen mode can be set to a lower pixel count.)
Third, the built-in 3000/4000 GPU, while not very powerful / feature rich, is not CPU bound. It does not consume CPU power to operate. It's biggest contention, as I understand it, is that it wants to use normal system ram as vram, while dedicated vram can actually work with two users at the same time (so real vram can be updated by the system, and read by the display generator, at the same time without having to pause one of the two).
Fourth, your concept of what minecraft has to do is ... lets say "badly mistaken". Minecraft does a huge amount of culling of unused data before it tries to render. With Advanced OpenGL, which the 3000 chipset does support, there is typically only about 100 minichunks being examined for graphical information -- each minichunk is 16x16x16, it only renders block faces that are exposed to air (so most blocks are not rendered), and then most blocks are static images based on block ID without any meta-data or tile entity information being read. So there is a lot of information that isn't even looked at, before information is thrown away, so very little actually has to be read and processed to generate the display lists for the OpenGL driver.
Very possible; as I said, some systems only copy pixels once in full-screen, but twice in windowed.
And ... your basis for this claim is what?
Have you made any attempt to document how much non-graphical work minecraft has to do every second to run at full speed?
Have you made any attempt to document how much graphical work is spent doing what graphics operations, or how much would be gained by making changes to the rendering model?
Do you know what Mojang has said about the performance hit minecraft takes from the use of the now outdated, old style OpenGL that it uses, or that they want to toss it for newer, better performance up-to-date OpenGL? (It would not surprise me if Minecraft 1.7 did that, by the way.)
Have you even done anything to address Java's memory/garbage collection? (The default garbage collector is designed for throughput -- least total GC overhead -- in uses where response time is not critical, where a 2-10 second pause is considered OK.)
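For example, on the HotSpot JVMs of this era (Java 6/7) you can swap the default throughput collector for the concurrent CMS collector, trading some total overhead for shorter pauses. The heap sizes below are illustrative, not a recommendation:

```shell
# Launch Minecraft with shorter-pause GC settings (Java 6/7-era flags).
# CMS collects the old generation concurrently instead of in one long
# stop-the-world pause; ParNew is its matching young-generation collector.
# -Xms/-Xmx (initial/max heap) should be tuned for your machine.
java -Xms512M -Xmx1G -XX:+UseConcMarkSweepGC -XX:+UseParNewGC -jar minecraft.jar
```

If lag spikes shrink after a change like this, the bottleneck was GC pauses, not rendering or the language itself.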
The problem is that any computer -- hell, even a netbook made in the past 10 years (and since netbooks aren't even that old, I think you get my drift) -- theoretically should run Minecraft at a solid 60 FPS in its vanilla state. The fact that we even have to use mods such as OptiFine to maximize performance is downright pitiful. Even my computer, which is quite a beast (it runs Battlefield 3 on Ultra without dropping below 35 FPS in even the most intense fights), doesn't get maximum performance in Minecraft unless I use OptiFine. That in itself is terrible.
It's really time to ditch Java and rewrite this game in C++. The game's potential would quadruple, and the performance would shoot through the roof.
Again: Please document your research showing that:
1. Minecraft's performance should be that much better,
2. The performance hit is inherent in Java as opposed to the rendering model being used.
P.S.: My computer isn't a netbook, it's a desktop.
Woah woah!
Well then...
As "optimized" as that rendering setup sounds, this doesn't explain the issue. My default, which is also the Minecraft default, is 854x480 (my monitor resolution is 1920x1080). I misspoke earlier when I said Fullscreen; I meant Maximized (still not used to Minecraft's new real Fullscreen option). So, with that said, a system rendering 44 FPS in Windowed 854x480 will take a performance hit in Windowed 1920x1058. (Note: that is nearly five times the pixels of the game's default draw size.)
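The arithmetic on those two window sizes (1058 being 1080 minus window chrome) shows how big the jump actually is:

```java
// Pixel counts for the two window sizes discussed above.
public class PixelMath {
    public static void main(String[] args) {
        int defaultPixels   = 854 * 480;    // Minecraft's default window
        int maximizedPixels = 1920 * 1058;  // 1080p minus title bar / taskbar
        System.out.println(defaultPixels);   // prints 409920
        System.out.println(maximizedPixels); // prints 2031360
        // Roughly 5x as many pixels to fill per frame when maximized.
        System.out.printf("%.1fx%n", (double) maximizedPixels / defaultPixels);
    }
}
```

So if fill rate is the bottleneck, a maximized window has to push about five times the pixels of the default one every frame.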
I'm with you on this. However, I'm more concerned that people feel games should run on a single-core 1.2GHz CPU, 2GB RAM, and 128MB VRAM. I haven't seen those specs in what, 10 years? I build and fix PCs as a side gig, and I still can't believe people think computers should last as long as cars.
Not going to say I have documentation, but I do have considerable experience in PC building for gaming and application support. When I build gaming PCs, Minecraft is one of the benchmarks I run. $400 can get you a "Minecraft Machine", which would probably need to be replaced (or upgraded with the right parts) in a year or so.
If you want Minecraft to just run, get it for the consoles and deal with the world cap. Yes, the Xbox 360 has an older patch, but it's designed for a system that runs with 512MB of unified memory. Plus, the Xbox One and PS4 should have better world caps.
I code in both Java and C#. I find Java to be way less effective at pulling anything off -- not because of the code's structure, but because of how people utilize it.
As for the comment about switching to C++: that's all dandy, but games take two puzzle pieces: code and developer. You can have the best code/engine in the world, and it can still run poorly if the developer does not take his product into consideration.
Let's take Facebook games as an easy example. If you load up some of the EA ones, they run horrifically slowly, even on i7 PCs, because of a poor development process. The ones I checked out were written in ActionScript 2.0 (Flash), which should not be an issue; I've written AS and AS2.0 and never had anything run that badly. It's all about what you put into it and how you manage your resources. Some languages can handle certain things better, but you also have to think about deployment as well.
Java is on everything, so theoretically Minecraft can run on anything with the proper Java version installed. I'm sure the reason the Raspberry Pi has a different Minecraft version is just the significant processing difference between it, consoles, and desktops.
That's my 2 cents. (Sorry for any spelling errors.)
Just wanted to inform you of the stack trace I'm getting for the error problems I'm having, which I posted in post #37269 and forgot to include there. (The original post would not let me edit, for some odd reason.) I was informed by the Tinkers' Construct dev that the problem is on your end. Just FYI.
http://www.minecraftforum.net/topic/249637-164-optifine-hd-c6-fps-boost-hd-textures-aa-af-and-much-more/page__st__37260#entry25429809
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] @ Post render
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] @ Post render
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [INFO] [STDERR] Error while rendering: 1xitem.tconstruct.ArmorPattern@1
2013-10-28 13:20:40 [INFO] [STDERR] java.lang.NullPointerException
2013-10-28 13:20:40 [INFO] [STDERR] Error while rendering: 1xitem.tconstruct.ArmorPattern@2
2013-10-28 13:20:40 [INFO] [STDERR] java.lang.NullPointerException
2013-10-28 13:20:40 [INFO] [STDERR] Error while rendering: 1xitem.tconstruct.ArmorPattern@3
2013-10-28 13:20:40 [INFO] [STDERR] java.lang.NullPointerException
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] @ Post render
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [INFO] [STDERR] Error while rendering: 1xitem.tconstruct.ArmorPattern@0
2013-10-28 13:20:40 [INFO] [STDERR] java.lang.NullPointerException
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] @ Post render
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] @ Post render
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] 1283: Stack overflow
2013-10-28 13:20:40 [SEVERE] [Minecraft-Client] ########## GL ERROR ##########
A few other mods give a "Stack Overflow GL Error" as well, but this is the main one that crashes the game.
Windows 7 x64-bit SP-1 - MC v1.7.10 - Forge 10.13.4.1517 - CodeChickenCore v1.0.7.47 - NEI v1.0.5.118 - (MultiMC v5.0.4.8)