Offtopic:
Quote:
Btw, what are the key parts of C++ I should be focusing on? Just to give me an idea of what I should work hardest on.
Quick ideas:
- Learn to code in modular ways; it makes code easier to maintain, understand, and reuse.
- Learn C++ language constructs/_libraries_; they are harder to use at first than C language constructs/_libraries_, but the gain is there.
EDIT:
- Learn good memory management and error handling techniques.
- Learn data structures (advanced ones too), their benefits and drawbacks, and when to use them.
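The memory-management point above can be illustrated with RAII, the idiomatic C++ technique of tying a resource's lifetime to an object's scope. This is only a minimal sketch; the `Connection` type and `query` function are hypothetical names invented for the example:

```cpp
#include <memory>
#include <stdexcept>
#include <string>

// Hypothetical resource type, used purely for illustration.
struct Connection {
    std::string host;
    explicit Connection(std::string h) : host(std::move(h)) {}
};

// RAII: the unique_ptr releases the Connection automatically,
// even if an exception propagates out of this function, so there
// is no leak on the error path.
std::string query(const std::string& host) {
    auto conn = std::make_unique<Connection>(host);
    if (conn->host.empty())
        throw std::invalid_argument("empty host");
    return "connected to " + conn->host;  // conn freed on return
}
```

The design point is that error handling and memory management are solved by the same mechanism: no explicit `delete` is needed on either the success or the failure path.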
----------------
Tycho comments, for the fun of arguing; no war is declared here.
Quote:
Second, you really need to learn the pitfalls in C++. Many computer science majors will wrongly tell you how wonderful recursive functions are. Recursive functions solve problems with less effort from the programmer but with much more effort for the machine. It also is dangerous because you can get infinite recursion (or something close to it which has just as bad results) if you're not careful. There's a function in the Uplink code which is called TraceLog(). It's unfortunately recursive, and it wasn't able to recognize that it had entered an infinite recursion, so one of our forum users experienced a crash when the in-game computer's log said "Two computers. Computer A and computer B. Computer A has connected to itself." The connection to itself made it trace back to itself over and over and over, and eventually caused a crash.
The problem here is not the use of recursion; it's the algorithm. An infinite loop wouldn't have been better.
(People need to verify that their algorithms meet their termination conditions.)
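One way to make the termination condition explicit is a visited-set guard. This is a hedged sketch, not the actual Uplink `TraceLog()` code: the `Computer` struct and `trace` function are hypothetical stand-ins for a trace that would otherwise loop forever on a self-connection:

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// Hypothetical network model: each computer knows who connected to it.
struct Computer {
    std::string name;
    const Computer* connected_from = nullptr;
};

// Recursive trace that guards against cycles such as
// "Computer A has connected to itself" by tracking visited nodes.
void trace(const Computer* c, std::unordered_set<const Computer*>& seen,
           std::vector<std::string>& route) {
    if (!c || seen.count(c)) return;  // termination: end of chain or cycle
    seen.insert(c);
    route.push_back(c->name);
    trace(c->connected_from, seen, route);
}
```

With the guard in place, a computer connected to itself produces a one-hop route instead of unbounded recursion and a crash.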
Quote:
Third, optimization. I'm not listing these in order of importance, because I actually think optimization comes first, and the computer engineering side of programming is extremely important. Why? Computer science works in the theoretical plane. Stuff that you have to run for months on end on a supercomputer or not even solve for years. Engineering is about making things work with what you have to meet a set of constraints, and deals in milliseconds or microseconds. In game programming, you have a CPU, a GPU, memory that has a certain latency, a disk with a certain seek time, etc. You need to accomplish so much every 60th of a second given those hardware constraints or else the game will suck.
Optimization is good: algorithm first (the gain can be orders of magnitude), code second. A lot of the time, optimizations make things ugly/hard to maintain and introduce new bugs. Optimization should be done in critical sections where there is proof that the optimization has tangible benefits.
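The "orders of magnitude" claim is easy to demonstrate with a textbook example (not from the thread): naive recursive Fibonacci is exponential-time, while the iterative version of the same function is linear. No amount of low-level tuning of the first version catches up with the algorithmic change:

```cpp
#include <cstdint>

// Exponential-time recursion: fine for tiny n, hopeless beyond n ~ 40.
std::uint64_t fib_naive(int n) {
    return n < 2 ? static_cast<std::uint64_t>(n)
                 : fib_naive(n - 1) + fib_naive(n - 2);
}

// Same result in O(n); the algorithmic change dwarfs any
// micro-optimization applied to fib_naive.
std::uint64_t fib_linear(int n) {
    std::uint64_t a = 0, b = 1;
    for (int i = 0; i < n; ++i) {
        std::uint64_t t = a + b;
        a = b;
        b = t;
    }
    return a;
}
```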
Quote:
What I've found is that computer science majors write really shitty code. Not code that doesn't work, but code that's slow. They don't know how to optimize. They can't tell you the difference between an AMD Athlon XP and an Intel Pentium 4. They can't explain why the new Core 2 is such a good thing. In their little abstract world of trees and lists and Java, they don't need to understand the low level hardware. Many of them can't even read an x86 disassembly or tell me the first thing about how many registers a Pentium processor has or what the registers are for.
I do agree that most computer science majors write shitty code, and not only speed/resource-wise: the code is badly structured, and error handling is very poor. Having an idea of the inner workings of the hardware and low-level libraries is indeed a good thing; it helps evaluate the cost of a given piece of code. But once again, I would say that knowing the cost of your algorithms is as important (if not more important): how many times have I seen O(n^2) algorithms where an O(n) algorithm could have been used. EDIT: Same goes for data structures; how many times have I seen arrays/lists where hashtables/sets/trees would have been much better.
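The arrays-versus-hashtables point can be sketched with a membership test. These two helpers are illustrative names, not code from the thread: the vector version costs O(n) per lookup (so O(n^2) over n lookups), while `std::unordered_set` averages O(1) per lookup:

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// Membership test over a vector: O(n) per lookup.
bool in_vector(const std::vector<std::string>& v, const std::string& x) {
    for (const auto& e : v)
        if (e == x) return true;
    return false;
}

// Same test over a hash set: O(1) on average per lookup.
bool in_set(const std::unordered_set<std::string>& s, const std::string& x) {
    return s.count(x) != 0;
}
```

For a handful of elements the vector may even be faster in practice (better cache behavior), which is exactly why knowing the trade-offs matters more than a blanket rule.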
Quote:
Computer Science has a long standing solution to concurrency - the concept of a lock. Which in Linux and Windows are synchronization objects known as locks, mutexes, semaphores, and other names. They're used in every operating system and just about every shipping Windows and Linux application today. Even calling malloc() is a serializing operation on the memory heap that causes a lock. Have multiple threads calling malloc() and they basically get to stand in line (a queue data structure) and execute serially. That's not very parallel. So once again, computer science has given us something that doesn't translate well into real-world performance.
I don't get why you blame the lock problems on Computer Science. If some piece of code relies heavily on locks, I would guess the code is badly written; the problem is not the locks but their use. On the malloc problem: malloc is a generalized memory manager, a fits-all design that doesn't do any one thing very well. You could code your own memory manager (and some have done it), or use other methods to reduce the number of calls to malloc.
EDIT: I would add that C is not designed with parallelism in mind; some other programming languages do fare better in that domain.
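One of the "other methods to reduce the number of calls to malloc" mentioned above is simply to allocate up front. A minimal sketch with `std::vector::reserve` (the `build` function is a hypothetical example, not code from the thread):

```cpp
#include <vector>

// Growing a vector with repeated push_back may reallocate (and thus
// take the heap lock) several times as capacity doubles; reserving
// up front does a single allocation for the whole run.
std::vector<int> build(int n) {
    std::vector<int> v;
    v.reserve(n);  // one heap allocation instead of ~log2(n)
    for (int i = 0; i < n; ++i)
        v.push_back(i);
    return v;
}
```

The same idea scales up to object pools and per-thread arenas, which is what custom memory managers in games typically do.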
Quote:
My point of this is that you should NOT go pure computer science when you program. You'll rot your head with abstract ideas and end up writing very poor code. If you can't visualize an algorithm or a piece of code and understand what that touches in the microprocessor, in the memory, on the disk, and what the costs and delays of all the steps are, you're going to write shitty code.
Keeping an idea of how things work on the inside is good; it keeps things down to earth and realistic. But I think one of the first things to keep in mind is the structure of your program: easy to maintain, easy to read/modify, easy to spot and correct errors/bugs.