I'm taking a C++ class right now but I am also trying to learn Java via Bucky's YouTube tutorials. From what I've gathered so far, I must say that I much prefer Java. For some reason it seems that it is easier to get a grasp on Java than C++ and it seems better organized.
Java is easier and faster to develop in, but it is a bit slower and more of a memory hog than Windows products. This is evident from Minecraft using 500-800 MB of RAM. Java is easier to port and run on multiple systems, though.
C++ is harder and takes a lot longer to develop in, but it is a bit faster and, when written properly, much more efficient with memory. I do not know how easy it is to develop C++ for non-Windows systems. I think you just recompile for each system and architecture. (Windows, Linux, Mac × 2 architectures = 6 compiles)
I love Bucky's videos, and they can only further your understanding of concepts. Most of programming is understanding the concepts: How, and when to use them. The syntax is the same or just about between languages, so I wouldn't get caught up in it.
... more of a memory hog than Windows products. This is evident from Minecraft using 500-800 MB of RAM.
It's not that much of a memory hog, and the reason Minecraft uses so much memory is not a fault of Java; it's the fault of Minecraft storing huge three-dimensional arrays. A C++ program would use just about the same amount of memory.
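To put some rough numbers on that: a big three-dimensional block array costs the same per element whether it's a Java `byte[]` or a C++ `char[]`. A back-of-the-envelope sketch (the dimensions here are illustrative, not Minecraft's actual storage format):

```java
public class ChunkMemory {
    public static void main(String[] args) {
        // An illustrative loaded region: a 512 x 256 x 512 block volume,
        // with one byte per block id. (Not Minecraft's real format.)
        long blocks = 512L * 256L * 512L;
        long bytes = blocks; // one byte per element, same in Java or C++
        System.out.println(bytes / (1024 * 1024) + " MB"); // prints "64 MB"
    }
}
```

One such volume is already 64 MB before counting entities, lighting data, or caches, so hundreds of megabytes come from the data itself, not from the language.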
Quote from Mithrildor »
C++ for the sake of it being way and way faster.
Java is only a little bit slower than C++ at the start of execution, and it's debatable which wins out in long-term execution.
To me, the only benefit of using C++ over Java is for Direct Memory Access.
I'd recommend Java to anyone learning programming, due to its massive standard library and far more compile-time and run-time checking, so it's much easier to see where your programs are going wrong. You also don't have to worry about the possibility of memory leaks.
The only time you can cause memory leaks in Java is if you're using classes which use the Java Native Interface (which is not covered by the collector because it's not running Java code), and most programs don't use much of the native interface anyway.
When you do happen to use them, it's mentioned all over the place that you need to explicitly release their resources.
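A concrete example of this is `java.util.zip.Deflater`, whose compression state lives in native zlib memory via JNI; its documentation tells you to call `end()` yourself because the GC can't see that memory. A minimal sketch:

```java
import java.util.zip.Deflater;

public class NativeResourceDemo {
    public static void main(String[] args) {
        // Deflater holds native zlib state via JNI; the garbage collector
        // does not reclaim it, so the docs say to call end() explicitly.
        Deflater deflater = new Deflater();
        try {
            deflater.setInput("hello hello hello".getBytes());
            deflater.finish();
            byte[] out = new byte[100];
            int n = deflater.deflate(out);
            System.out.println("compressed to " + n + " bytes");
        } finally {
            deflater.end(); // explicit release of the native resources
        }
    }
}
```

The `try`/`finally` guarantees the native memory is released even if compression throws, which is exactly the "explicitly release its resources" rule the post describes.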
Java doesn't run as fast (it can't, as it uses an intermediary layer between code and CPU)
You mean that intermediary layer which doesn't exist due to the JIT compiler?
The JIT compiler is the intermediary layer he is talking about. At run time, a native program executes code that can be loaded directly into the processor. Java generally requires a run-time translation step: the bytecode is translated into native code before execution, so every instruction must first be translated before it can run.
The bytecode does have a nice advantage, though: a program written on Windows will generally work on *nix with few changes. The same .jar is also portable across the various architectures which support Java.
... why are there tutorials and guides on GC optimisation and manual collection? Also, why are there a multitude of Java heap analysis tools?
Optimising for the garbage collector means "Make your program do the same thing without creating thousands of new objects".
Programmers do so by caching reusable objects that are significantly slow to create (such as threads, database connections, or gigantic arrays), speeding things up by reducing the amount of work that has to be done by the collector, the hard drive, or the processor.
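The standard library's thread pools are the textbook case of this caching pattern: instead of allocating a fresh `Thread` (and its stack) for every task, an `ExecutorService` reuses a fixed set of threads. A minimal sketch:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PoolDemo {
    public static void main(String[] args) throws Exception {
        // Threads are expensive to create, so the pool caches four of them
        // and reuses them across tasks rather than creating one per task.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        Future<Integer> result = pool.submit(() -> 21 * 2);
        System.out.println(result.get()); // prints 42
        pool.shutdown();
    }
}
```

Every task submitted to the pool runs on one of those four long-lived threads, so the allocator and garbage collector see four `Thread` objects total instead of one per task.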
Why do heap analysis tools exist? To let you easily debug problems with said caches, along with helping in any situation where direct heap analysis could be beneficial. Just because you can only think of one reason for having a tool doesn't mean it only has one use.
What is it you mean about tutorials on manual collection in Java?
Quote from _s1gma »
surely if it was perfect you wouldn't need to analyse the heap usage and manually clean stuff up because it should do everything for you.
The only classes I have ever seen that require explicit closing have always required the Java Native Interface.
The GC cannot collect memory from native code because it's not part of the JVM's heap space.
As for heap analysis, see above.
Just as jmp has said, the garbage collector doesn't cause memory leaks; if a program is still referencing an object, it is not a memory leak, despite it taking up memory, because it's still being used by the program. Attempting to deallocate an object in such a case could be done in C and C++, but it would leave your program with the far worse potential bug that is a dangling pointer.
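The Java equivalent of "freeing" such an object is simply dropping the reference: the GC then reclaims it safely, and there is no pointer left behind to dangle. A small sketch of the pattern:

```java
import java.util.ArrayList;
import java.util.List;

public class ReferenceDemo {
    // A long-lived collection: anything added here stays reachable,
    // so the GC must keep it alive even if it is never read again.
    static final List<byte[]> cache = new ArrayList<>();

    public static void main(String[] args) {
        cache.add(new byte[1024 * 1024]); // 1 MB the GC cannot collect yet
        // Dropping the reference is the Java way to "free" the memory;
        // the GC reclaims it later, with no dangling pointer possible.
        cache.clear();
        System.out.println("cache size: " + cache.size()); // prints "cache size: 0"
    }
}
```

Forgetting to clear such a collection is the closest thing Java has to a leak, but unlike a C++ `delete`-then-use bug, the failure mode is growing memory use, never a crash from reading freed memory.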
However, there are possible advantages too. For example, the HotSpot JVM interprets the Java bytecode first, monitors for frequently executed bytecode, and compiles that to machine code. Interpreting the code can offer a lot of insight to the JIT compiler/optimizer, whereas static compilers usually have to rely on analysis of the source code without actually knowing how it performs.
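You can see the shape of that interpret-then-compile behaviour in any hot loop. In the sketch below, HotSpot initially interprets `sumTo`, counts its invocations, and compiles it to native code once it crosses the JIT threshold (roughly 10,000 invocations by default, though the exact number varies with tiered-compilation settings):

```java
public class WarmupDemo {
    // A hot method: HotSpot interprets it at first, counts how often
    // it runs, then compiles it to machine code once it is "hot".
    static long sumTo(long n) {
        long s = 0;
        for (long i = 1; i <= n; i++) s += i;
        return s;
    }

    public static void main(String[] args) {
        // Call it many times so the JIT threshold is crossed; the later
        // iterations run as compiled native code, not interpreted bytecode.
        long result = 0;
        for (int i = 0; i < 20_000; i++) result = sumTo(1_000);
        System.out.println(result); // prints 500500
    }
}
```

The program's observable output never changes; only the machinery underneath does, which is why "warmup" matters so much when benchmarking Java against statically compiled C++.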
Most static compilers (ie, every one I've ever used) perform both source and assembly level analysis and optimization so your point is void. Even MSVS does this.
"Assembly level analysis" has nothing to do with analyzing bytecode during runtime. Can you name a single static compiler that does runtime analysis? You can't, because such a compiler does not exist, by definition; it would be a JIT compiler.
That is exactly my point. Analyzing bytecode during runtime costs resources that detract from the overall performance of the program. While it does provide some nice benefits such as portability, a static compiler can optimize just as much, and maybe more depending on settings.
Also, I don't recall mentioning that assembly level analysis was linked to runtime analysis, only that it is done.