Multijointed Monster Maker #1 Posted March 25, 2010 (edited)
Why is it that whenever I use the phrase "optimizing code," people always assume I'm referring to fine-tuning or nit-picking code? Fine-tuning or nit-picking barely speeds up code at all. Architectural optimizations and taking different approaches make far more of a difference, so why do small optimizations always come to people's minds whenever they hear the word "optimize"?
Edited March 25, 2010 by Multijointed Monster Maker
Crazyace #2 Posted March 25, 2010
You're simply misusing it by not spelling it 'optimise'
+FujiSkunk #3 Posted March 25, 2010
Optimizing can be for more than just speed gains. I consider it "optimizing" when I go over my code to make execution flow more cleanly or logically, or otherwise just to make the code easier to read for anyone who works on it after me. Such optimization may not speed things up, but it will make the program a whole lot more maintainable. Then again, you might be surprised at how such tweaks really can speed things up sometimes. A little nip here and a little tuck there, especially in code that gets looped over a lot, can make big differences down the road.
danwinslow #4 Posted March 25, 2010
Because it has no strongly defined meaning in terms of computer science. To me it means things that a human can do to speed up an algorithmic process that the compiler/assembler/linker itself cannot do. These things usually reduce the generality of the code in favor of speed by taking advantage of special cases or of methodology that the compiler doesn't know about.
Multijointed Monster Maker #5 Posted March 25, 2010
I also consider creative use of DMA, V-RAM, fitting a sprite frame within the least number of 8x8 tiles, and recycling tiles between frames to be another type of optimization. I'm writing my own game, and I'm planning to simulate sprite rotation by drawing out 1/4 of the rotation steps in ROM, using an algorithm to make the CPU rotate each rotation frame by 90 degrees into RAM for another 1/4, and using X and Y flips for the other 1/2. Symmetrical sprites make it even easier to fake rotation.
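The flip-based rotation trick described above can be sketched out. This is an illustrative sketch only (Python standing in for what would really be 6502-era assembly, with a hypothetical 8x8 tile held as rows of pixel values): storing 1/4 of the rotation steps works because a 90-degree rotation plus X and Y flips generate the remaining orientations.

```python
def rotate90_cw(tile):
    # tile: 8 rows of 8 pixel values; returns the tile rotated 90 degrees
    # clockwise (new[y][x] comes from old[7-x][y])
    return [[tile[7 - x][y] for x in range(8)] for y in range(8)]

def flip_x(tile):
    # mirror horizontally (what sprite hardware X-flip gives for free)
    return [row[::-1] for row in tile]

def flip_y(tile):
    # mirror vertically (hardware Y-flip)
    return tile[::-1]

# A 180-degree rotation is just an X-flip plus a Y-flip, which is why
# hardware flips cover half the rotation circle at no storage cost.
```

Note that rotating twice equals flipping on both axes, and rotating four times returns the original tile, so only a quarter of the frames ever need to exist in ROM.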
BigO #6 Posted March 26, 2010
Code can be optimized for speed, resource utilization, size, readability, portability, etc., etc. To me, saying "optimize" is ambiguous without the additional specification of "for what".
unhuman #7 Posted March 26, 2010
Optimization just means making better - and you're right - in these days of compiled languages, the compilers have gotten so good that they handle many optimizations for you. If you're programming on ghetto-old computers, like many of us, that's not as true. Loop optimization is often best done by hand. And, of course, there's optimization done to reduce memory footprints and code size. That's often still best done by hand.
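The hand loop optimization mentioned above is classically loop unrolling. A minimal sketch of the transformation (Python here purely to show the shape of it; on a 6502 the actual win is dropping a DEX/BNE pair per iteration, and the cost is larger code):

```python
def clear_rolled(buf):
    # straightforward loop: one counter update and branch per element
    for i in range(len(buf)):
        buf[i] = 0

def clear_unrolled(buf):
    # unrolled by 4: a quarter of the loop-control overhead, at the
    # cost of bigger code (assumes len(buf) is divisible by 4)
    for i in range(0, len(buf), 4):
        buf[i] = 0
        buf[i + 1] = 0
        buf[i + 2] = 0
        buf[i + 3] = 0
```

Both produce the same result; the unrolled version simply pays for fewer branches with more bytes of code - exactly the speed-versus-size trade discussed later in this thread.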
Multijointed Monster Maker #8 Posted March 26, 2010
How about optimizing code to make it easier to write efficient code?
doppel #9 Posted April 2, 2010 (edited)
I've come to the realization that optimization, as it applies to the 6502, is a battle of speed versus size. And sometimes both versus robustness. I never use the word "optimizing" when I'm fine-tuning things, unless the fine-tuning itself is optimizing. I usually say "fine-tuning", "tweaking" or "polishing it off".
Edited April 3, 2010 by doppel
PeteD #10 Posted April 24, 2010
We always used to call speeding code up "optimising" in the 80s. On a 1-2MHz CPU, when you're trying to make a game shove around as much stuff as possible, every cycle, especially in a loop, can make a massive difference. At the time there wasn't a lot of code sharing, so as long as code was well commented it didn't really matter how easy it was for other people to pick up. It's stuck with me ever since.
Pete
Multijointed Monster Maker #11 Posted May 7, 2010 (edited)
I can never understand why everyone says "premature optimization is the root of all evil"? It's like a million times easier to optimize your code early on than to wait till the end to fix it. At the first sign of my game lagging I would unroll a loop or knock a few cycles off a macro. I'm not waiting till the end of development to solve such a simple problem that can be fixed within seconds.
Edited May 7, 2010 by Multijointed Monster Maker
RevEng #12 Posted May 7, 2010
Premature optimization refers to optimizing before you clearly understand where the bottlenecks are... Coding the first whack at a routine in some fast-but-less-maintainable way without knowing it's a bottleneck would be one example. In your example you've tested and realized that a bottleneck exists. That's not premature.
Multijointed Monster Maker #13 Posted May 7, 2010
Man, did the ROM hacking community get that one wrong!
RevEng #14 Posted May 7, 2010
You think the ROM hacking community doesn't understand the bottlenecks involved before they optimize? That's a pretty dim view of them, I think. Usually with ROM hacking, any performance issues are solved in the original game, and the hacker just needs to optimize to gain some additional space for the hack code. Even with homebrewers, premature optimization is a mistake. Take the VCS for example: you could optimize the code that draws the scanline by running self-modifying code from RAM. Afterward you may find you have unused cycles on the scanline, but desperately need the RAM you threw at your optimization.
Multijointed Monster Maker #15 Posted May 7, 2010
You haven't met the Super Mario World hacking team. They took Super Mario World and butchered it with tons of slowdowns that weren't in the original.
RevEng #16 Posted May 7, 2010
Ouch! Grateful I haven't met them.
+intvnut #17 Posted September 23, 2010 (edited)
That's only half of it. Perhaps you've perfected your high-speed, branchless, unrolled, mega-awesome bubble sort. It'll always be beaten by the most lazily coded (but correct) quicksort or heap sort, if you have a large enough data set. Understanding that something's a bottleneck is only half the battle. Understanding why it's a bottleneck, and whether there are better algorithms, is an important step.
As for the original premise of this thread: I use "optimize" in all contexts of improving a program. For example, I could write a game using the general-purpose framework offered by the Intellivision EXEC. The EXEC is compact, fairly straightforward, and in many ways elegant. However, its architecture and overall structure are not well suited to many games; i.e. it's not an optimal choice. Then there's the issue of implementation: they squeezed a lot into 4K of ROM, but they gave up speed in the process. So, from a speed perspective it's far from optimal, though it's much closer on size. The concept remains valid at both levels.
Optimize is a mathematical concept: there's some theoretical minimum (or maximum) you're trying to achieve, and a solution is optimal if you've achieved it. Optimizing is merely the process of moving toward that minimum or maximum. Using the term "optimize" to refer to low-level fiddling is almost as old as computing itself. (While that article is dated 1983, it's describing events of the 50s or 60s.)
Edited September 23, 2010 by intvnut
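The bubble-sort-versus-quicksort point above is worth making concrete. A hypothetical sketch (not intvnut's code; Python for brevity): no amount of hand-tuning rescues the O(n^2) algorithm, because the lazily written O(n log n) one does asymptotically less work.

```python
def bubble_sort(a):
    # even a carefully hand-tuned O(n^2) sort does ~n^2/2 comparisons...
    a = a[:]  # work on a copy
    for i in range(len(a) - 1, 0, -1):
        for j in range(i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def quicksort(a):
    # ...while a lazily coded O(n log n) sort wins on any large data set
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))
```

Both produce identical output; only the growth rate differs, which is why knowing *why* something is a bottleneck matters more than shaving cycles off the wrong algorithm.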
sack-c0s #18 Posted October 15, 2010 (edited)
My optimisation strategy is simple - my code gets thrown into two piles: the stuff that runs in the frameloop and the stuff that doesn't. The frameloop code gets targeted for speed, often with a code/data size increase taking up more memory; the outside code gets treated for size at the expense of speed. After all, if the level initialisation code takes 3 frames of solid processing instead of 2 when nothing is being drawn, who really sees that extra 20ms? Not a massively elegant way of doing it - but it seems to work.
Edited October 15, 2010 by sack-c0s
Thomas Jentzsch #19 Posted October 15, 2010
...so why do small optimizations always come to people's minds whenever they hear the word "optimize?"
I call bigger changes "refactoring".
Thomas Jentzsch #20 Posted October 15, 2010
Agreed. Usually there are three steps involved in 2600 coding:
1. make it run somehow
2. look for bottlenecks and optimize code/data structure and algorithms(!) ("refactoring")
3. squeeze out the maximum performance by "peephole" code optimizations
Sometimes you mix 1 and 2, or 2 and 3. But you should never start early with 3. Usually step 2 is the most efficient one. This goes for everything but the kernel. There, step 1 is usually skipped very fast and a lot of time is invested into steps 2 and especially 3.
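A toy illustration of that step 3 (a hypothetical example, not from the thread, with Python standing in for assembly): a classic peephole-style move on 6502-class hardware is replacing per-call arithmetic with a precomputed table, trading ROM for cycles - the same trade the self-modifying-code example earlier in the thread trades RAM for.

```python
# Step 1: make it run somehow -- compute a sprite's screen offset
# each frame with a multiply (expensive on a CPU with no MUL).
def offset_naive(row, col):
    return row * 40 + col

# Step 3: peephole-style optimization -- precompute row * 40 once
# into a table, so each call is one lookup and one add.
# (192 rows x 40 columns is an assumed screen layout for illustration.)
ROW_OFFSET = [row * 40 for row in range(192)]

def offset_table(row, col):
    return ROW_OFFSET[row] + col
```

The two functions are interchangeable; only after step 2 has settled the data structures is it safe to commit table space like this, which is exactly why starting with step 3 early is discouraged above.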