
Sprites in Pascal


Rossman


Wow, I never would have caught that.

 

Everything I do is on a specific value and I do not do any array-level actions. I do pass arrays as parameters and (fortunately) have not had any difficulty. I don't have the code in front of me right now, but I'm pretty sure I declare all of my arrays as types.

 

type
  Hand = array[0..3] of integer;

var
  Punto, Banco : Hand;

 

And that might have prevented any problems when passing parameters.
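For what it's worth, here is roughly what a named type buys when passing parameters (a sketch; the integer element type and the HandValue function are my assumptions, not the actual code):

type
  Hand = array[0..3] of integer;

function HandValue(h : Hand) : integer;
var
  i, sum : integer;
begin
  sum := 0;
  for i := 0 to 3 do
    sum := sum + h[i];
  HandValue := sum mod 10;  { baccarat scoring: only the last digit counts }
end;

Because Hand is a named type, the formal parameter and any variable declared as Hand are the same type, so a call like HandValue(Punto) compiles cleanly. Standard Pascal does not even allow an anonymous array type in a procedure heading, so declaring the type is required anyway.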

 

(Sorry if the code doesn't read quite right. I'm typing this on an iPad on the way to work, and autocorrect is adding leading capital letters, changing words, etc.)


  • 1 month later...

I am sure this question has been well discussed in Atariage, probably many times over, and many moons ago. I apologize for not doing more thorough research. If there is an existing thread please direct me toward it.

 

I am making progress on my card game in Pascal. I am about to integrate a GUI in addition to / in spite of / in replacement of the text UI.

 

That reads better than it lives. Two cards will emerge from a shoe positioned on the right side of the screen, destined for the player (punto); two more will emerge for the house (banco). Reveals with card flips will happen, and additional cards will be drawn or will not be drawn. While an animated production of cards is more visually appealing than gotoxy(x,y); write('JACK OF CLUBS') and gotoxy(x,y); write('COWBOY OF DIAMONDS') { which, by the way, is a lousy hand as it results in a score of 0 }, it will not be a mind-expanding experience. At best. I am creating a non-offensive graphical UI with correctly color-coded (red/black) values for each suit, revealed as appropriate for the game logic. Meh. The coding is far more interesting than the game, if you are into that sort of thing.

 

As much as I want a resource-efficient UI, I am leaning toward the convenience of having long-lived sprites that show card values. I am using sprites with a magnification of 2, which means each sprite consumes 4 characters. Such is the price of high-res card faces.

 

If I have a maximum of 3 cards each across two players' hands (punto and banco can stand on 2, draw a 3rd), I am consuming upwards of 24 ASCII characters. That is not itself a problem. But if I toss in a few other visuals (sprites or otherwise) for game controls, all at the same 2x magnification, pretty soon I am consuming 48 ASCII characters. Still, I can make this work within the limits of the ASCII character set.

 

Which brings me to my question. What ASCII limit am I really working against? I started doing some research into ASCII character real estate and came up with incongruities I cannot quite explain. I'm sure the answer is there, but it is late.

 

-- According to the E/A quick reference card, there are 128 characters (0..127, decimal).

 

-- According to the Pascal manual page 145 (and other places), ASCII values range from 0..255 (decimal).

 

-- According to the TI User's Reference Guide, standard characters are in the range of 32-127 (decimal) with additional characters from 128-159, but anything above 127 is at-risk. I never stopped to think about how anything below 32 was not re-programmable (or at any rate, isn't encouraged), but why the stop at 240 octal (this I know I can probably find if I research it) and more importantly, why the risk - what else is competing for that memory?

 

-- There is a sample program in Molesworth's Introduction to Assembly Language for the TI Home Computer (pages 129-131) that redefines the character set (e.g., while using XB) from 30 up to character 143 (decimal). Is there something magical about 217 octal (143 decimal)?

 

@apersson850 pointed out earlier that sprites built from low ASCII characters perform better than those using high character values (assuming I grokked that post correctly), so I am motivated to use low ASCII values. But some of my not-so-important graphics can occupy a higher set of characters. Are characters 128 and above freely available? In Pascal? In other languages? Or, thar be dragons? And if yes, why?

 

This is not a show-stopper by any means. It's just an amusing incongruity. Is the low value of the ASCII range 0, 30 or 32? Is the high value 127, 143, 159, or 255?

 

Best regards,

 

 

R.


ASCII is defined from 0-127. It just declares which code is assigned to which character. The video processor does not care about such assignments. You could load your character definitions in a way to implement EBCDIC.

 

Space is very tight in TI BASIC's VDP memory. If I remember correctly, BASIC allocates memory that intersects with the pattern or color tables. In principle, you can have 256 characters (0-255), as far as the video processor is concerned. However, if you block the usage of higher character codes (as done in TI BASIC) you can re-use that memory for buffers or other stuff.


The answer above is correct.

In Pascal, the space used for the character definitions isn't shared with other functions. Thus you can use the full set, from 0 to 255, for character definitions. Check page 117 in the Pascal compiler manual.

Those in the range 32 to 127 are used for normal, printable characters by the system. If you redefine them, and display text at the same time, you start seeing funny things on the screen.

 

Likewise, any pattern in the range 0..255 can be used to define a sprite (see page 145).

 

It's when you use automatic motion of sprites within the p-system that it's beneficial to either let sprites with sprite number (not character number) 0..15 be the moving ones, or let those in the range 16..31 move. Provided not more than 16 are moving at the same time, that is. The reason for this is that the system keeps a bitmap of the moving sprites in two words, and if no bit at all is set in one of these ranges, then the code to move sprites in that range is skipped altogether. So it's simply more efficient to concentrate all motion in one half of the sprite numbers.
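The point above can be sketched like this (set_sprite as it appears later in this thread; movingCard is a hypothetical array of already-populated link pointers):

{ keep all moving sprites in sprite numbers 0..5, so only one of the
  two motion-bitmap words ever has bits set; the motion code for the
  other half (16..31) is then skipped entirely }
for i := 0 to 5 do
  set_sprite(i, movingCard[i]);

Spreading the same six moving sprites across both halves (say numbers 0..2 and 16..18) would force the system to scan both bitmap words every time.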

 

If you control motion by changing the sprite positions by yourself, under program control, then this doesn't matter.


Thank you for the clarifications. Since I will have text and sprites on screen at the same time, I identified several groups of 8 characters (2 sets of 4 per sprite) I could use without risk of something peculiar showing up in the text I also display. But I can simplify all of that and simply use characters 128 and above. That's one less array I need to worry about.

 

@apersson850 I must have misunderstood your earlier post. I will have a maximum of 6 moving sprites, so I'll use the first 6.

 

Best regards,

 

 

R.


  • 2 weeks later...

Still plugging away on this.

 

I created a sprite test bed as a program that I quickly converted into a GUI interface for my little card game. It did not take long to connect it up to my existing game logic and augment the text UI. The relative ease validated my primitive model-view-controller (such as it is; I am quite sure I am violating a lot of established patterns here). The bottom line is, punto 1 card 1 isn't entirely implemented (no flip animation, no reveal), but it made its debut appearance and didn't embarrass itself.

 

I expect to have some questions on how abstract I can make the sprite control logic, as I don't want to have lengthy link declarations. I will probably finish animation of punto cards 1 and 2 and I'll try to create a reusable base, and that's when those questions will arise.

 

Pascal is turning out to be a fantastic language for this.

 

Best regards,

 

 

R.


Growing up, the p-system always seemed to me very powerful, but I could not confirm that as it was always out of reach. Learning it now, I appreciate virtual machine architectures in ways I do not think I would have in the 1980s. The p-system had flaws, but (arguably) so did every architecture. In the end, the p-system was pretty good.

 

I always did love/hate Pascal as a language, and of course I could easily write this little game to be p-system portable with extensions for users of the TI. With little modification, I could make this deployable on an Apple ][ and TI 99/4A. In fact, the non-gui version was portable excluding the random number generation.

 

I've shared the fact that I'm re-learning Pascal with a number of colleagues at work. Those of my generation have mostly asked, "why?", although a few have been encouraging. Of course, those of a much younger generation have responded, "Hey, that's cool, I've heard of that language called Pascal." (sigh).

 

Sigh, indeed. I now have two card deals and flips. I need to reveal the cards. I have worked this out previously, I just need to connect up existing code for that to show the suit over the value. This also helps me fine-tune the animation.


I've got the player cards animated in sprites, and it works well. Except for one little thing...

 

The start of the game involves dealing two cards to punto and two cards to banco.

 

Sprites appear at (y := 135, x := 240). The first two cards to punto are launched in consecutive fashion. They each go from right to left through two sprites before stopping at (sprite #1 - y := 135, x := 42) and (sprite #3 - y := 135, x := 58). Once they stop lateral motion, each goes through a change in character to simulate a card flip (countdown := 16 for flip 1, 2 and 3 respectively) before revealing the card itself. Once the card is revealed, a "hidden" sprite parked on the landing spot (sprite #2 blends in with color := 0 at y := 135 and x := 42, sprite #4 is hidden at y := 135, x := 58) changes color (color := 15) at the same moment in time the higher-order sprite card flip is complete.

 

This works great. There's no lag between the card face reveal and the background, and it looks ok, even if the Spade looks like a mushroom and the animation as I defined it lacks grace.

 

I have an array of values that returns (almost all) animation parameters depending on which card it is, and a single sprite procedure with the sprite record logic and the behavior geometry. The code is not optimized, but it is reasonably compact. I can easily add calls to define new cards and (separately) initiate them.

 

A curious thing happened when I went to add a third card. The player (and, ultimately, the house) has a choice as to whether to draw a 3rd card. The first two cards are initiated, then there is a prompt asking the player if they want a card, and if yes a 3rd card is dealt. All I had to do was add a third entry to the array and invoke the sprite procedure referencing those array values, in addition to the details about the card drawn.

 

Only thing is, when I launch the sprite, I don't see it. Actually, it's two sprites: it's the card back coming out of the shoe toward the player's side of the screen that gets flipped to show the card, and the background sprite that is invisible until the card flip is complete. I don't see either of them. Not the foreground, nor the background.

 

If I end the game (respond to play again with "N") I do see the sprites exactly where I expect to see them with exactly the values I expect them to have, if only briefly: until the sprites are deleted and I set_screen back to 1.

 

I seem to remember a problem like this with sprites in XB years ago: something about sprites being invisible relative to others under some circumstances. I just don't recall the details.

 

I changed the array to put the 3rd sprite on a different row (y := 153), and it works! I see it appear and land exactly where I want it, and the background appears without issue.

 

Any ideas? I'm sure there's something rudimentary about sprites I've just forgotten. I'm going to try to add the banco cards as sprites now, which will be in the same X, Y planes and see what happens.


... Only thing is, when I launch the sprite, I don't see it. Actually, it's two sprites: it's the card back coming out of the shoe toward the player's side of the screen that gets flipped to show the card, and the background sprite that is invisible until the card flip is complete. I don't see either of them. Not the foreground, nor the background. ...

 

The TMS9918A VDP can only show four sprites on the same scan line. The fifth and higher-numbered sprites on a line will be invisible, as you have just observed.
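One way to catch this before it bites is to count the sprites overlapping each scan line (a sketch; SpriteY, numSprites and the 16-line height for double-size sprites are my assumptions):

function SpritesOnLine(line : integer) : integer;
var
  i, count : integer;
begin
  count := 0;
  for i := 0 to numSprites - 1 do
    { a double-size sprite covers 16 scan lines from its y position }
    if (line >= SpriteY[i]) and (line < SpriteY[i] + 16) then
      count := count + 1;
  SpritesOnLine := count;
end;

Any line where this exceeds 4 means the fifth and later sprites on that line will not be displayed.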

 

...lee


Aaaaaaaahhhhhhhhh!!!!!!!!!!!!!!!!!! Or as James T. Kirk would have it, "Kaaaaaaaaaaahhhhhhhhhhnnnnnnnnn!!!!!!!!!!!!!!!"

 

Thanks for the quick reply, Lee. You saved me many hours of experimentation. I saw your message just as I tried reverse ordering the sprites (the first sprite is 12, the second sprite is 11, etc.)

 

I guess the screen will be divided into north and south hemispheres, not east and west.

 

Any chance a minor offset (e.g., half-row distance) will work? I'll give that a try.


I will retain the east-west layout. Changing to a north-south layout would not have been worthwhile, as I would face the same 4-sprite limitation. I have adjusted the visuals, and it is working fine.

 

@Lee, thank you again for the prompt reply.

 

The last bit I need for the GUI to be complete is the sprite zombie code: take the banco cards which are showing card back while punto decides what to do and re-animate the sprite to reveal the cards.


All cards (up to 6 per hand) appear with the correct animation. The card placement is not ideal given the four-sprites-per-line limit, but my implementation with the card back being a sprite is a factor in that. The animation isn't great, but as Dr. McCoy would say, "I'm a doctor, not a choreographer".

 

It is satisfying that it works. Specifically, that the gui was relatively easy to implement. My primitive MVC worked out. Now it's just details. And making the code less bad.

 

@Lee, thank you again for the fast response last night. And thank you, apersson850, for pointing out that attempting to fight it through some other means would be futile.

 

I have code bloat to get under control, some gui controllers that I need to create, and animation to improve. And that will complete the game logic. Then I can get on with the wagering logic.

 

Thank you all for your help and support.


The GUI for my little card game works, but it doesn't work for very long. I ran an extended test with the sprites connected and before long I got a STACK OVERFLOW * REBOOT error. The GUI code is connected to the main logic flow at multiple points (explained below). I tried reducing this to invoking sprites at 3 points of the code, 2 points, and 1 point, and the game always failed with STACK OVERFLOW on the 14th round of the game. Always. That is, it didn't matter whether I showed the first 4 cards and backgrounds (there will be 4 cards drawn, guaranteed every hand), or 5 cards or 6 cards (optional depending on user choice and algorithm); it didn't matter the cumulative number of sprites; the code failed on the 14th cycle, every single time.

 

I disconnected the sprites completely and the code ran its natural life. That is not a perfect test: the text UI should run in perpetuity without a stack overflow, and as it stands it self-terminates when the shoe runs out of cards (52 cards x 4 decks; I haven't implemented reshuffle logic). But the fact that the text UI ran to completion tells me the stack overflow problem likely has to do with the GUI code I added.

 

The code is structured this way:

-- CDFI is the controller, it contains the core game logic, it USES:

-- Unit CARDCODE, which contains card logic, which USES random

-- Unit CDFTUI, which contains the text user interface

-- Unit CDFGUI, which contains the graphical user interface; it USES sprite, support and

-- Unit CDFGHLP, which contains helpers specific to CDFGUI, and USES support

 

The main game logic does this:

-- Deal cards to punto and banco

-- Display text information about punto cards

-- Invoke sprites for punto, animated from right to left card back up, then flipping to reveal card face

-- Invoke sprites for banco, animated from right to left card back up

 

*******

 

-- Depending on scores and what not, prompt punto for draw or stand

-- If punto elects to draw, invoke sprite for another punto card, animated from right to left card back up, flipping to reveal card face

-- Banco algorithm determines whether Banco wants a 3rd card; if yes, animate a 3rd card from right to left, card back up, flipping to reveal card face

-- Banco's first two cards are reanimated, flipping to reveal card face

-- Whether the user elects to play again or not, all sprites are deleted

 

Note that I disabled all sprite logic after the string of asterisks above, and still the code reports a stack overflow on the 14th round. That is, sprites for the first four cards still result in a stack overflow.

 

I am running this on Classic99, not real iron or another emulator. I have had success with Classic99 and I am not inclined to suggest it is a problem with the emulator. I suspect there is something untidy in the construction of my code. I am deleting the sprites. Clearly, I am accreting something that is overloading the stack. Should I merge the gui helper code back into the GUI controller? Should I make all of the dependencies visible from the root?

I am clearly doing something wrong. I'm just not certain what it is.

 

Before calling it a night, I ran one final test: just launch the sprite for the first punto card on every hand. Not both punto cards, just the one. This worked, and the code ran to its natural completion.

 

One thing about my implementation of the sprite logic is that I have tried to make the record constructors reusable. That is, instead of having a new packet definition for each sprite (6 different sprite controllers), I am re-using the sprite definitions. Cards 1, 2 and 3 (all punto) as well as 6 (banco's optional draw) move from right to left quickly in sp1^, then slowly in sp2^, then begin their flip sequence in sp3^ and eventually reveal their card suit and face in sp7^. The timing is different but the sequence is the same, and the math works out.

 

Similarly, Banco's first two cards travel right to left and stop (one logic flow), and are revealed only at the end of the game.

 

The animation geometry for each card is loaded into memory in CDFGUI (locally to CDFGUI), and returned by procedure (locally to CDFGUI).

 

It'll be a whole lot of code bloat, but I wonder if I should create a unique sequence for sprites 1, 2 and eventually 3, and only explore consolidation once I get the sprites to work without incident.

 

Meantime, I can add punto sprite #2 and see what happens. I can alternatively add banco sprite #1, to see what happens. Plenty of things to test before drawing any conclusions. Meanwhile, all suggestions welcome.

 

Best regards,

 

 

R.


The most obvious reason for a stack overflow is that you try to run too many too large code segments together with too much data allocated on the stack. I don't know how big your program is, but in this case it doesn't sound too likely.

The most common reason for issues like this is a memory leak. I don't know if you are experienced in hunting such creatures, so I'll refrain from any further input until I see if you reply "Alright" or "Duh?".


Memory leaks? Oh yeah, I've had my share of memory leaks. Especially in the early days of Java. Bad memories!!

 

I do think you are onto something with the large code segments and data.

 

For better or worse, I created levels of abstraction in the code. For example, the primary game logic calls a single textui controller with parameters that determine whether to reveal cards or update scores or clear the text UI space. By doing this, I can transfer control from the game logic to the text UI logic, and have all of the text UI logic consolidated in a single location - and not co-mingled with the game logic. I have quite a few cases where I structured the code this way, calling procedures that localize logic to determine what action to take, which usually results in another procedure to call. As much as I might find this tidy, the abstraction creates layers of calls, and that may be a contributing factor.

 

Then there is the data. I have two procedures that return the bitmap data for the card face when flipped. There are 4 images per card (2 for the card, 2 for the suit), so that adds up to a fair bit of data. I moved those procedures into a separate unit just to make the GUI logic more manageable at compile time.

 

There is also the size of the code. I hadn't been looking at it until now, but the two GUI units nearly double the size of the codebase. The sprite animation goes through 9 different steps (7 foreground, 2 background), so as you can imagine I have a lot of packet definition code. And I have multiple sprite animations. Perhaps instead of having a single animation sequence for punto1, punto2, punto3 and banco3, I can have the two-sprite animation for banco1 and banco2 for all, and save a lot of code. Or better yet, save the theatre: deal the cards that immediately go face up as face up. That will reduce the sprite packet code complexity.

 

So maybe the things to look at are code levels (how deep is the code nested?) and code bloat (can I simplify)?

 

I have been experimenting with reducing calls to the sprite code. What it boils down to is, I can make more repetitive calls to the simpler sprites than the complex ones. I'll start by simplifying.


Considering how the sprite functionality is implemented, I would assume that it allocates dynamic memory for itself. Perhaps you do too? Not disposing that memory will eventually lead to a stack overflow.

Are you aware of how code and data space is managed in UCSD Pascal in general, and on the 99/4A in particular?

 

Basically, the remaining free memory in the VDP RAM is used as the primary code pool. Only pure p-code (no assembly) segments are loaded here. As many segments as fit can be loaded. It will be roughly 12 K RAM in size, once the screen data has taken what it requires.

The secondary code pool is in one end of the 24 K RAM. The stack is at the other end. They grow in opposite directions, and if they meet, then you have *STACK OVERFLOW*. In between the two you have the heap, where dynamically allocated data (new(variable)) is kept.

 

In case of a stack fault, the system first tries to release code segments from the secondary code pool, if possible. Note that for inter-segment calls, both the calling and the called segment must be in memory simultaneously. Thus if your main program uses units A and B, but only one at a time, B can be rolled out when A is called, and vice versa. But if your main program uses A, and unit A in turn uses unit B, then both the main program and the two units have to be in memory at the same time. Thus you gain no memory from the segmentation in that case.
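The two cases might be sketched like this (hypothetical unit names):

program Main;          { Case 1: Main uses A and B, one at a time }
uses A, B;             { whichever unit is idle can be rolled out }

unit A;                { Case 2: A itself uses B }
uses B;                { now any call into A may continue into B, so
                         Main, A and B must all be resident at once }

In the second case the segmentation buys you nothing in terms of memory.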

 

Also remember that local data in procedures is allocated (on the stack) when the procedure is called and disposed when it's exited. So if you don't need the data to remain across calls, make it local and regain the space used at each exit from the procedure.

 

It usually makes sense to have a globals unit, which contains frequently referenced routines. This unit will probably be in memory almost always.

Then split up the rest of the program into units with clear borders between their tasks. How to split depends on the program, of course. In a more calculation-oriented program, perhaps data entry, calculation and data printout are three segments with no relation to each other, provided they can all interface with the globals unit to reach the data they need to work with.

 

When debugging, consider printing the result from calls to memavail and/or varavail over and over again. Thus you can keep an eye on that data, and see if it's diminishing as the program continues to work. If so, you almost certainly have a memory leak.
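A minimal version of that check, printed once per round (the screen position is arbitrary):

gotoxy(0, 23);
write('MEM: ', memavail, '   ');  { trailing blanks overwrite stale digits }

If the printed value keeps shrinking round after round while the program's state is nominally the same, something allocated with new is never being disposed.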

Edited by apersson850

Thank you, @apersson850. This is hugely helpful.

 

Before I started, I read up on but did not put much thought into memory management. It was not a problem before the sprites as the codebase was small and manageable. With the sprites, the code has doubled in size, so clearly something is up.

 

I did put some thought into how best to encapsulate the code as far as what is public and what is private within the units. I try to minimize the global variables and code, even at a unit level. Within the GUI unit, I created a GUI controller that is exposed publicly, with all the sprite invocation logic private to the GUI unit. That should minimize the total footprint and allow for memory to be freed once a procedure is complete.

 

I will add a gotoxy(x,y) and perform a couple of write statements on the screen to show memory status with each round of the game. That will tell me how well my code is freeing memory.

 

I spent the past couple of days going over my code to see if I missed something. I have only the example sprite logic from section 6 of the compiler manual to compare against, but I checked whether I missed something in declaration or cleanup. It does not appear that I did. I have also installed MAME. I do not expect a different result (I believe the problem is in my code), but a different environment might reveal something.

 

Thanks again.

 

Best regards,

 

 

R.


Assuming I am reading MEMAVAIL correctly, I have some leaky code - but the leak is consistent. As long as there is a pattern, it is not a mystery problem and is therefore solvable.

 

I perform a gotoxy() and write(MEMAVAIL) after each round of the game, just before the user is prompted whether or not to play again. The observed values:

 

Hand   Memavail   Delta   Cards dealt
  1      6888      ----       4
  2      5953      (418)      4
  3      5337      (517)      5
  4      4919      (418)      4
  5      4303      (616)      6
  6      3885      (418)      4
  7      3368      (517)      5

 

I am not cleaning up something properly, probably with each card.

 

Lots more testing to do, and I have to start by ruling things out. What if it has nothing to do with the sprites, and it's in the core game logic? What if the sprites are just amplifying the problem? But these results are encouraging as they show a consistent pattern of erosion based on the number of cards dealt.

Edited by Rossman

I have isolated the source of the memory leak. The data that describes a sprite in Pascal is in a linked data structure. A linked structure is a pointer type. A pointer reserves a bit of memory.

 

var
  sc1, sc2, sc3 : link;

begin
  new(sc1);  { reserves memory for sc1 }
  new(sc2);  { reserves memory for sc2 }
  new(sc3);  { reserves memory for sc3 }
  sc1^.packet := [spr_pattern..spr_x_vel];
  with sc1^ do
  begin
    ...
    { populate the first sprite movement }
    link := sc2;
  end;
  with sc2^ do
  begin
    ...
    { populate the second sprite movement }
    link := sc3;
  end;
  ...
  set_sprite(1, sc1);
  ...
end;

 

Every new([link_name]), like new(sc1), creates a pointer that reserves a location in memory.

 

The counterpart to new is dispose([link_name]). But I wasn't calling dispose, and because of that I was accreting 11 bytes of memory per link every time I initiated a sprite. Each sprite, with background, has 9 links. Every time I call the sprite creation logic, I get 9 new links, but no dispose statements. With a maximum of 6 cards per hand, or 12 sprites total (including the card backs), it adds up quickly.

 

The good news is, the memory leak is under control. The bad news is, my little game will need a change in sprite implementation.

 

I had hoped to re-use the sprite definition (packet) and invocation logic to contain code bloat. The sprites follow identical patterns with different absolute and delta x,y values. But I can't invoke a procedure like this twice in rapid succession with different sprite geometry and expect my sprites to appear correctly:

 

 

begin
  new(sp1);
  new(sp2);
  ...
  sp1^.packet := [spr_pattern..spr_x_vel];
  ...
  { define packets }
  ...
  set_sprite([val], sp1);
  ...
  dispose(sp1);
  dispose(sp2);
  dispose(...);
end;

 

This works if I call this code to invoke a single sprite. It does not work if I call this code to invoke multiple sprites. I suspect it fails because the link pointer needs to be as long lived as the sprite it was initiating. Once the pointer is disposed of, its location is lost to the sprite.

 

This tells me that the links need to be long-lived, which means making them globals. There are a finite number of these, so the memory allocation will be minimal. If I define the links in a var statement in the INTERFACE section of the GUI unit, I can encapsulate this reasonably well through the GUI controller.
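The INTERFACE declaration might look something like this (a sketch; the names and procedure signature are mine, not the actual code):

unit CDFGUI;

interface

var
  { long-lived sprite links: allocated once with new at startup,
    disposed once at shutdown, instead of once per deal }
  sp1, sp2, sp3, sp4, sp5, sp6 : link;

procedure gui_control(action : integer);

implementation
  ...

Since each pointer lives for the whole game, new and dispose are each called once per link, and memavail should hold steady from round to round.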

 

Thank you, @apersson850. Once I started monitoring memavail, it didn't take me long to isolate the problem. That helped me to make the problem very concrete. I believe it is all implementation on my part from here.

 

Best regards,

 

 

R.


I'm glad I could offer some advice that proved useful.

 

Even if it's reasonable that a dynamic variable has to exist as long as the thing it's controlling (a sprite this time), it may still be possible to dispose it, provided you don't need the sprite any more. The tricky thing is perhaps to efficiently figure out when to dispose it?

But, as you write, when the data you allocate is reasonably small, it doesn't matter.

 

Another trick, a bit "dirty" but sometimes useful, is to revert to the older p-system way of allocating and releasing memory on the heap. Version IV.0 fully supports the real meaning of new and dispose, i.e. create one variable and dispose of just that one.

But it still supports the older mechanism with mark and release, too. That implies you can do a mark, then any number of news, and still get rid of them all with one single release. The mark/release mechanism structures the heap like a stack: everything you have pushed onto the heap (with new) since a mark can be popped off with a single call to release, using the same variable you used for the mark.
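In code, the mark/release pattern looks roughly like this (heapTop is my name for the mark variable):

var
  heapTop : ^integer;
  sc1, sc2, sc3 : link;
begin
  mark(heapTop);      { remember the current top of the heap }
  new(sc1);
  new(sc2);
  new(sc3);
  { ... run the sprite animation ... }
  release(heapTop);   { frees everything allocated since the mark }
end;

The caveat is that release frees all of it at once, so nothing allocated after the mark may still be in use when it is called.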

Edited by apersson850
