Misuse of the word "Optimization"


19 replies to this topic

#1 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • 170 posts

Posted Thu Mar 25, 2010 8:02 AM

Why is it that whenever I use the phrase "optimizing code," people always assume I'm referring to fine-tuning or nit-picking code? Fine-tuning or nit-picking barely speeds up code at all. Architectural optimizations and taking different approaches make far more of a difference, so why do small optimizations always come to people's minds whenever they hear the word "optimize"?

Edited by Multijointed Monster Maker, Thu Mar 25, 2010 8:03 AM.


#2 Crazyace OFFLINE  

Crazyace

    Stargunner

  • 1,018 posts
  • Location:London / HK / Tokyo / San Fransisco

Posted Thu Mar 25, 2010 10:04 AM

You're simply misusing it by not spelling it 'optimise' :)

#3 FujiSkunk OFFLINE  

FujiSkunk

    Quadrunner

  • 6,037 posts
  • Behold the Fuji!
  • Location:Planet Houston

Posted Thu Mar 25, 2010 10:30 AM

Optimizing can be for more than just speed gains. I consider it "optimizing" when I go over my code to make execution flow more cleanly or logically, or otherwise just to make the code easier to read for anyone who works on it after me. Such optimization may not speed things up, but it will make the program a whole lot more maintainable.

Then again, you might be surprised at how such tweaks really can speed things up sometimes. A little nip here and a little tuck there, especially in code that gets looped over a lot, can make big differences down the road.
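As a minimal C sketch of that kind of nip and tuck (the function names are made up for illustration): work that doesn't change between iterations gets hoisted out of a loop that runs a lot.

    #include <stddef.h>
    #include <string.h>

    /* Before: strlen() is re-evaluated on every pass through the loop. */
    void to_upper_slow(char *s)
    {
        for (size_t i = 0; i < strlen(s); i++)
            if (s[i] >= 'a' && s[i] <= 'z')
                s[i] -= 'a' - 'A';
    }

    /* After: the length is computed once; the loop body is unchanged. */
    void to_upper_fast(char *s)
    {
        size_t len = strlen(s);
        for (size_t i = 0; i < len; i++)
            if (s[i] >= 'a' && s[i] <= 'z')
                s[i] -= 'a' - 'A';
    }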

#4 danwinslow OFFLINE  

danwinslow

    Stargunner

  • 1,996 posts

Posted Thu Mar 25, 2010 10:45 AM

Because it has no strongly defined meaning in terms of computer science.

To me it means things that a human can do to speed up an algorithmic process that the compiler/assembler/linker itself cannot do. These things usually reduce the generality of the code in favor of speed by taking advantage of special cases or of methodology that the compiler doesn't know about.
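A hypothetical C sketch of that idea (the ring-buffer scenario and the names are made up): the size arrives at run time, the programmer knows it is always a power of two, and so trades the general modulo for a bitwise AND that the compiler could not safely apply on its own.

    #include <stdint.h>

    /* ring_size is decided elsewhere at run time; the programmer knows it is
     * always a power of two, but the compiler cannot prove that. */
    uint8_t next_index_general(uint8_t i, uint8_t ring_size)
    {
        return (uint8_t)((i + 1u) % ring_size);        /* generic, may cost a division */
    }

    uint8_t next_index_fast(uint8_t i, uint8_t ring_size)
    {
        return (uint8_t)((i + 1u) & (ring_size - 1u)); /* valid only for powers of two */
    }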

#5 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • Topic Starter
  • 170 posts

Posted Thu Mar 25, 2010 4:31 PM

I also consider creative use of DMA and V-RAM, fitting a sprite frame into the fewest 8x8 tiles, and recycling tiles between frames to be another type of optimization.

I'm writing my own game, and I'm planning on simulating sprite rotation by drawing out 1/4 of the rotation steps in ROM, using an algorithm to make the CPU rotate each rotation frame by 90 degrees into RAM for another 1/4, and using x and y flips for the other 1/2. Symmetrical sprites make it even easier to fake rotation.
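Roughly, the 90-degree step could look like the C sketch below for a single 8x8 1-bit tile (one byte per row, bit 7 = leftmost pixel). The tile format and function name are assumptions for illustration; on the actual hardware this would more likely be unrolled assembly writing straight into RAM.

    #include <stdint.h>

    /* Rotate one 8x8 1bpp tile 90 degrees clockwise: the source pixel at
     * (row, col) lands at (col, 7 - row) in the destination. */
    void rotate_tile_90_cw(const uint8_t src[8], uint8_t dst[8])
    {
        for (int r = 0; r < 8; r++)
            dst[r] = 0;
        for (int r = 0; r < 8; r++)
            for (int c = 0; c < 8; c++)
                if (src[r] & (0x80u >> c))
                    dst[c] |= (uint8_t)(0x80u >> (7 - r));
    }

    /* The x flip is just reversing the bits of each row byte, and the y flip
     * is reversing the order of the eight row bytes. */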

#6 BigO OFFLINE  

BigO

    River Patroller

  • 3,335 posts
  • Location:Phoenix, AZ

Posted Thu Mar 25, 2010 6:07 PM

Code can be optimized for speed, resource utilization, size, readability, portability, etc., etc.
To me, saying "optimize" is ambiguous without the additional specification of "for what".

#7 unhuman OFFLINE  

unhuman

    Dragonstomper

  • 991 posts
  • Location:Vienna, VA

Posted Thu Mar 25, 2010 6:19 PM

Optimization just means making things better - and you're right - in these days of compiled languages, the compilers have gotten so good that they handle many optimizations for you. If you're programming on ghetto-old computers, like many of us, that's not as true. Loop optimization is often best done by hand.

And, of course, there's optimization done to reduce memory footprints and code size. That's often still best done by hand.

#8 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • Topic Starter
  • 170 posts

Posted Thu Mar 25, 2010 6:46 PM

How about optimizing code to make it easier to write efficient code?

#9 doppel OFFLINE  

doppel

    Star Raider

  • 67 posts

Posted Thu Apr 1, 2010 6:07 PM

I've come to the realization that optimization as it applies to the 6502 is a battle for speed versus size. And sometimes both versus robustness.

I never use the word "optimizing" when I'm fine-tuning things, unless the fine-tuning itself is optimizing. I usually say "fine-tuning", "tweaking" or "polishing it off".

Edited by doppel, Sat Apr 3, 2010 3:31 AM.


#10 PeteD OFFLINE  

PeteD

    Stargunner

  • 1,747 posts
  • Location:Wales

Posted Sat Apr 24, 2010 1:30 PM

We always used to call speeding code up "optimising" in the 80s. On a 1-2MHz CPU, when you're trying to make a game shove around as much stuff as possible, every cycle, especially in a loop, can make a massive difference. At the time there wasn't a lot of code sharing, so as long as code was well commented it didn't really matter how easy it was for other people to pick up. It's stuck with me ever since.


Pete

#11 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • Topic Starter
  • 170 posts

Posted Fri May 7, 2010 12:20 PM

I can never understand why everyone says "premature optimization is the root of all evil". It's like a million times easier to optimize your code early on than to wait till the end to fix it. At the first sign of my game lagging I would unroll a loop or knock a few cycles off a macro. I'm not waiting till the end of development to solve such a simple problem that can be fixed within seconds.

Edited by Multijointed Monster Maker, Fri May 7, 2010 12:20 PM.


#12 RevEng OFFLINE  

RevEng

    River Patroller

  • 3,413 posts
  • bit player
  • Location:Canada

Posted Fri May 7, 2010 1:41 PM

I can never understand why everyone says "premature optimization is the root of all evil". It's like a million times easier to optimize your code early on than to wait till the end to fix it. At the first sign of my game lagging I would unroll a loop or knock a few cycles off a macro. I'm not waiting till the end of development to solve such a simple problem that can be fixed within seconds.

Premature optimization refers to optimizing before you clearly understand where the bottlenecks are... Coding the first whack at a routine in some fast-but-less-maintainable way without knowing it's a bottleneck would be one example.

In your example you've tested and realized that a bottleneck exists. That's not premature.

#13 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • Topic Starter
  • 170 posts

Posted Fri May 7, 2010 3:06 PM


I can never understand why everyone says "premature optimization is the root of all evil". It's like a million times easier to optimize your code early on than to wait till the end to fix it. At the first sign of my game lagging I would unroll a loop or knock a few cycles off a macro. I'm not waiting till the end of development to solve such a simple problem that can be fixed within seconds.

Premature optimization refers to optimizing before you clearly understand where the bottlenecks are... Coding the first whack at a routine in some fast-but-less-maintainable way without knowing it's a bottleneck would be one example.

In your example you've tested and realized that a bottleneck exists. That's not premature.


Man, did the ROM hacking community get that one wrong!

#14 RevEng OFFLINE  

RevEng

    River Patroller

  • 3,413 posts
  • bit player
  • Location:Canada

Posted Fri May 7, 2010 4:08 PM

Man, did the ROM hacking community get that one wrong!

You think the ROM hacking community doesn't understand the bottlenecks involved before they optimize? That's a pretty dim view of them, I think.

Usually with ROM hacking, any performance issues are solved in the original game, and the hacker just needs to optimize to gain some additional space for the hack code.

Even with homebrewers, premature optimization is a mistake. Take the VCS for example: you could optimize the code that draws the scanline by running self-modifying code from RAM. Afterward you may find you have unused cycles on the scanline, but desperately need the RAM you threw at your optimization.

#15 Multijointed Monster Maker OFFLINE  

Multijointed Monster Maker

    Chopper Commander

  • Topic Starter
  • 170 posts

Posted Fri May 7, 2010 4:46 PM

Man, did the ROM hacking community get that one wrong!

You think the ROM hacking community doesn't understand the bottlenecks involved before they optimize? That's a pretty dim view of them, I think.

Usually with ROM hacking, any performance issues are solved in the original game, and the hacker just needs to optimize to gain some additional space for the hack code.

Even with homebrewers, premature optimization is a mistake. Take the VCS for example: you could optimize the code that draws the scanline by running self-modifying code from RAM. Afterward you may find you have unused cycles on the scanline, but desperately need the RAM you threw at your optimization.


You haven't met the Super Mario World hacking team. They took Super Mario World and butchered it with tons of slowdowns that weren't in the original.

#16 RevEng OFFLINE  

RevEng

    River Patroller

  • 3,413 posts
  • bit player
  • Location:Canada

Posted Fri May 7, 2010 5:03 PM

You haven't met the Super Mario World hacking team. They took Super Mario World and butchered it with tons of slowdowns that weren't in the original.

Ouch! Grateful I haven't met them. ;)

#17 intvnut ONLINE  

intvnut

    Stargunner

  • 1,286 posts
  • Location:@R6 (top of stack)

Posted Thu Sep 23, 2010 7:00 AM


I can never understand why everyone says "premature optimization is the root of all evil". It's like a million times easier to optimize your code early on than to wait till the end to fix it. At the first sign of my game lagging I would unroll a loop or knock a few cycles off a macro. I'm not waiting till the end of development to solve such a simple problem that can be fixed within seconds.

Premature optimization refers to optimizing before you clearly understand where the bottlenecks are... Coding the first whack at a routine in some fast-but-less-maintainable way without knowing it's a bottleneck would be one example.

In your example you've tested and realized that a bottleneck exists. That's not premature.


That's only half of it. Perhaps you've perfected your high-speed, branchless unrolled mega-awesome bubble sort. It'll always be beaten by the most lazily coded (but correct) quicksort or heap sort, if you have a large enough data set.

Understanding that something's a bottleneck is only half the battle. Understanding why it's a bottleneck and if there are better algorithms is an important step.
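A short C illustration of that point (toy code, not from any real project): the tightest bubble sort still does on the order of n^2 comparisons, while the laziest possible call to the standard library's qsort() is O(n log n) on average and pulls ahead once n gets large.

    #include <stdlib.h>

    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    /* Hand-tuned or not, this performs roughly n*n/2 comparisons. */
    void bubble_sort(int *a, size_t n)
    {
        for (size_t i = 0; i + 1 < n; i++)
            for (size_t j = 0; j + 1 < n - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                }
    }

    /* "Lazily coded but correct": one library call, O(n log n) on average. */
    void lazy_sort(int *a, size_t n)
    {
        qsort(a, n, sizeof a[0], cmp_int);
    }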

As for the original premise of this thread: I use "optimize" in all contexts of improving a program. For example, I could write a game using the general-purpose framework offered by the Intellivision EXEC. The EXEC is compact, fairly straightforward, and in many ways elegant. However, its architecture and overall structure are not well suited to many games: i.e., it's not an optimal choice. Then there's the issue of implementation: they squeezed a lot into 4K of ROM, but they gave up speed in the process. So, from a speed perspective it's far from optimal, though it's much closer on size. The concept remains valid at both levels.

Optimize is a mathematical concept: There's some theoretical minimum (or maximum) you're trying to achieve, and a solution is optimal if you've achieved it. Optimizing is merely the process of moving toward that minimum or maximum.

Using the term "optimize" to refer to low-level fiddling is almost as old as computing itself. (While that article is dated 1983, it's describing events of the 50s or 60s.)

Edited by intvnut, Thu Sep 23, 2010 7:01 AM.


#18 sack-c0s OFFLINE  

sack-c0s

    Stargunner

  • 1,113 posts
  • Location:Kingston Upon Thames, UK

Posted Fri Oct 15, 2010 7:14 AM

My optimisation strategy is simple - my code gets thrown into 2 piles: the stuff that runs in the frameloop and the stuff that doesn't.

The frameloop code gets targeted for speed, often with a code/data size increase taking up more memory.
The outside code gets treated for size at the expense of speed - after all, if the level initialisation code takes 3 frames of solid processing instead of 2 when nothing is being drawn, who really sees that extra 20ms?
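As a toy C illustration of the two piles (the names are made up, and a real 8-bit target would have no floating point at all): the frameloop side spends memory on a precomputed table, while the init side does the slow, compact computation exactly once.

    #include <math.h>
    #include <stdint.h>

    static uint8_t sine_table[256];   /* 256 bytes traded away for frameloop speed */

    /* Outside the frameloop: compact and slow is fine - it runs once at init. */
    void init_tables(void)
    {
        for (int i = 0; i < 256; i++)
            sine_table[i] = (uint8_t)(128.0 + 127.0 * sin(i * (2.0 * 3.14159265358979 / 256.0)));
    }

    /* Inside the frameloop: one table lookup instead of the maths. */
    uint8_t fast_sin(uint8_t angle)
    {
        return sine_table[angle];
    }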

Not a massively elegant way of doing it - but it seems to work

Edited by sack-c0s, Fri Oct 15, 2010 7:15 AM.


#19 Thomas Jentzsch OFFLINE  

Thomas Jentzsch

    Thrust, Jammed, SWOOPS!, Boulder Dash

  • 18,909 posts
  • Always left from right here!
  • Location:Düsseldorf, Germany

Posted Fri Oct 15, 2010 9:02 AM

...so why do small optimizations always come to people's minds whenever they hear the word "optimize"?

I call bigger changes "refactoring".

#20 Thomas Jentzsch OFFLINE  

Thomas Jentzsch

    Thrust, Jammed, SWOOPS!, Boulder Dash

  • 18,909 posts
  • Always left from right here!
  • Location:Düsseldorf, Germany

Posted Fri Oct 15, 2010 9:11 AM

Even with homebrewers, premature optimization is a mistake. Take the VCS for example: you could optimize the code that draws the scanline by running self-modifying code from RAM. Afterward you may find you have unused cycles on the scanline, but desperately need the RAM you threw at your optimization.

Agreed.

Usually there are three steps involved in 2600 coding.
1. make it run somehow
2. look for bottlenecks and optimize code/data structure and algorithms(!) ("refactoring")
3. squeeze out the maximum performance by "peephole" code optimizations (a small example of this kind follows at the end of this post)

Sometimes you mix 1 and 2 or 2 and 3. But you should never start early with 3. Usually step 2 is the most efficient one.

This goes for everything but the kernel. Here step 1 is usually skipped very fast and a lot of time is invested in steps 2 and especially 3.
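For what it's worth, a tiny C example of the step-3 "peephole" kind (a modern compiler does this by itself; on old assemblers it was done by hand): strength reduction, replacing a multiply with shifts and an add.

    /* x * 10 rewritten as (x * 8) + (x * 2): two shifts and an add,
     * no multiply instruction needed. */
    unsigned times10(unsigned x)
    {
        return (x << 3) + (x << 1);
    }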



