Everything posted by potatohead
-
So what if it will corrupt the floppy? Just don't keep anything valuable in the drive... Unfrickingbelievable! And your defense is that you keep lots of copies of things? No wonder you have all those copies...
-
Hmmm... It's almost like a fundamental life change has occurred. Maybe a new job, financials, etc... that forced a serious priority change. Though one would think at least answering a coupla e-mails would make sense. We've all been there before. Sometimes it makes sense to just bail, but that's always a conversation. This is different. No response at all. You guys were working well together, so there should be no reason not to at least have some small conversation. Doesn't add up, that's for sure.

Good luck finding stuff out. Hope you hear some good news. I've always appreciated your project work. It's gotta be damn tough to have it just stop. This was gonna be an interesting one, IMHO. Hope Delicon is ok too. Nobody wants to see bad things happen like that.
-
Changing the direction registers might be interesting... The pins are common to all COGs. I'll have to do some reading on that aspect of the chip. The designer of the thing is not into inflexible operation. I'll bet that was a design choice, just in case.

I was on a long drive back from helping a friend today, thinking about this. I think all three video COGs can run the same code. Start one of them, and it looks to see if it's the first one running or not. Use a flag in HUB memory for this. It then starts the others, which make the same analysis. All of them work from a master line counter in HUB memory and shared parameters for resolution, number of lines, etc... Any of them can output the SYNC signals necessary, if it's their role to do so.
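If it helps, here's the "who runs first" idea sketched in Python (just pseudocode-in-Python; the real thing would be a flag in a hub long, and would want one of the Prop's hardware locks to make the claim atomic -- the `hub` dict and names here are made up):

```python
# Sketch of the shared-flag handshake: the first video COG to run claims a
# hub flag and takes the SYNC role; later COGs see the flag already set and
# become plain renderers. 'hub' stands in for hub memory.
hub = {"master_claimed": False, "line": 0}

def cog_start(hub):
    """What each identical video COG does at startup."""
    if not hub["master_claimed"]:
        hub["master_claimed"] = True    # first one in: take the SYNC role
        return "sync_master"
    return "renderer"                   # somebody beat us to it

# three identical video COGs starting one after another
roles = [cog_start(hub) for _ in range(3)]
print(roles)   # ['sync_master', 'renderer', 'renderer']
```

All three run the same code; only the hub flag decides the role.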
-
Yep, you gotta feed it. There are solutions though. One could set VSCL to different values to keep the generator at bay, so multiple waitvids don't get in the way of processing. At first, I thought the COG was totally consumed during a waitvid from start until end. This is not the case. Once the sync has happened, and the video generator grabs the D & S values (these are referred to as both instruction fields and registers, and I'm not sure they are differentiated), the COG can go on about its business while the pixels are being shifted out.
-
I think at some point there's a law of diminishing returns. You could very easily end up without enough COG power left to actually manipulate those sprites. Going with more video COGs might make it feasible to go back to the calculated byte->long translation for color, thus leaving half of the COG RAM for code which could do things during VSYNC. More time might also make a tiled background possible, but I think that would need too many HUB cycles to be effective for anything more than lo-res. I think 3 COGs is a reasonable sweet spot for my design. I particularly like the 1/3 pixel side-effect. The Donkey Kong game in progress ended up with plenty of game power, after consuming 6 COGs for video. Still, that's excessive, and the ideas you've come up with here will improve on that. I will be able to do some coding next week, now that I'm back home with a display!

I've toyed with running two video generators together. At the time I ran them both as full screen bitmaps for a layering effect. It was hard to get the color synced, but otherwise they ran just fine. The outputs are OR'ed together. As long as you keep them fed with the proper waitvids, your idea of just running zeros should work just fine. I never thought about having them output their own scanlines! I was thinking about one master SYNC COG, with the others drawing into the graphics area. This method will take one less COG. If you want to run some code, just let me know. I'm seriously impressed with your ability to just consider this stuff. How did you get that skill? It's just the way I need to work, given my schedule and availability on the machine.

I know how to sync the COGs. Did that when trying to overlay two bitmaps. You store the counter in SPIN, just prior to launching assembly code. No matter what, a small three line SPIN wrapper is required to start any program. On boot, COG 0 launches SPIN, which can then launch assembly. Store a value, add some time to it, then have the COGs launch, watch the system counter, then begin executing code based on that count. The system counter is shared among all COGs, cannot be zeroed, and is 32 bit. One master assembly COG could do the same thing, with a shared HUB memory location as well.

Learned something about waitvid this morning too. It stops the COG until the video generator is ready. Once the generator is loaded (4 cycles), the COG continues executing code. If instructions are well timed, delays per waitvid can be held to as few as 8 cycles. Once started, the video generators keep running. If they are not fed with waitvids, they grab whatever happens to be in the D & S registers at the time. Your idea of using the COG video registers together like this is interesting and new. All other game engines currently use one video generator and feed data to it. This is probably why so many COGs are being consumed. IMHO, keeping them all synced with similar waitvids will reduce throughput, but will keep things deterministic.
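Here's that counter scheme roughed out in Python (the MARGIN value is made up; the point is the 32-bit wraparound math you have to get right, since the system counter can't be zeroed):

```python
# Shared-counter launch: read the system counter once, add a margin big
# enough to cover starting all the COGs, and have every COG wait for that
# same target count before executing its first real instruction.
MARGIN = 80_000          # hypothetical headroom: 1 ms at 80MHz
MASK32 = 0xFFFF_FFFF     # the system counter is 32 bits and wraps

def launch_target(cnt_now):
    """Target count every COG should wait for before starting."""
    return (cnt_now + MARGIN) & MASK32

def elapsed(start, now):
    """Cycles from start to now, correct across 32-bit wraparound."""
    return (now - start) & MASK32

# deliberately pick a counter value near wraparound to show the math holds
target = launch_target(0xFFFF_FF00)
print(elapsed(0xFFFF_FF00, target))   # 80000, despite the wrap
```

Same trick works for one master assembly COG publishing a target through a shared HUB long.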
-
Man, that is worrisome. You should try to locate some family in the area. Maybe somebody knows something more. For a few bucks, many of those people-search sites will indicate relatives. Surely they know something. Just thought I would share an idea. You clearly have some investment here, in both materials and just people. Knowing the score will help.
-
Used to. The one I remember best was easy. Insert Space Invaders, hold reset down, then turn the console on. You get double shots. If you flicked the switch a lot, sometimes you would get deep purple space invaders as well! A friend klonked his Sears Telegames unit doing this, frying the sound... Pissed him off huge. Everything worked fine, just no more sound. We all stopped doing it about that time, except for the space invaders trick. That one seemed safe.
-
You can order oscillators with whatever frequency you want from Digi-Key; in moderate quantities (100 or so) they're not much more than standard ones, and even in onesie-twosies they're not too bad. It's got PLL circuits for each COG. They are good to 120MHz with the 5MHz crystal. There is some harmonic distortion at the higher frequencies, as the generated frequency exceeds the CPU clock, but I think the ranges we are discussing are below that enough to not worry so much. Another user has used this thing as a fairly solid frequency generator, and another wrote a logic probe application that can actually debug one COG using another one to capture the pin states! In this regard, it's a very flexible design.

Clocked at 80MHz, it does broadcast on channel 3 very well. Not sure why I had so much trouble with this on the HYDRA though. Could be my environment, could be the different crystals used, or some artifact of the circuit in the HYDRA -vs- the demo board.

I think one COG could do Donkey Kong sounds along with the video just fine, given some work replicating them. Samples would take their own COG for any sort of quality. Will be interesting to see how that all shakes down. I find it amazing it takes so many COGs to draw the game graphics. Maybe that's how it is, but I am reluctant to buy into that given all the options. It's just gonna take some thought. (or maybe some more RAM...)

Byte addressing in the COG takes multiple ops to sort out. Byte addressing from the HUB costs cycles... Seems to me, changing how sprite data is stored might yield faster solutions. I was thinking about one driver, building lines from a buffer. The problem with that happens to be the transfer time from HUB to COG. Can't really move much when doing waitvids. Two COGs alternating is doable. Something similar to that was done to add a cursor to a high resolution VGA driver.

Another option is to change the waitvids to place things at specific times, but that is not gonna work well for a lot of sprites. There is the overlay idea too. Essentially have more than one COG draw to the pins at the same time. If they are color cycle synced, the only real problem appears to be the video getting OR'ed together. This allows for more intensities, but also looks bad when things overlap. (no priorities, no collisions without COG to COG communication.)
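To show what the OR'ed overlap problem looks like, here's a tiny Python illustration (the 3-bit pixel values are made up; real Prop output pins carry the encoded video levels):

```python
# Two synced generators driving the same pins: the outputs OR together, so
# wherever both COGs have non-zero pixels you get a third value neither one
# asked for. That's the "looks bad when things overlap" effect.
def overlay(line_a, line_b):
    """OR two scanlines of pixel values together, like shared output pins."""
    return [a | b for a, b in zip(line_a, line_b)]

cog_a = [0b010, 0b110, 0b000, 0b101]   # 0b000 = drives nothing ("transparent")
cog_b = [0b000, 0b011, 0b100, 0b000]
print(overlay(cog_a, cog_b))           # second pixel blends to 0b111
```

Where only one COG draws, the overlay is harmless; it's the overlapping non-zero pixels that blend.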
-
Holy Crap! I'm hosed at the moment too. No display option. I PM'ed you on this... Five per line is what others have seen as well. You clearly grok this thing. There are 8 COGs, so that's not the end of the world... (thinking about that and your code block)

Edit: Just finished looking over the code, over lunch. Seems to me, the color table lookup method makes the most sense. You are assuming the sub-pixel method worked out above is in play, right? (I think that will work too) One would have to pay for the table in RAM somewhere, but the higher number of objects possible seems to outweigh that. I'm thinking about what the actual line driver can do in this regard. It's not clear to me just yet how those play together. I'll have to do some more looking things over this evening...

The DK game in progress uses a lot of COGs to get its graphics display. Essentially, it leaves one for sound and another one to process the actual game logic in SPIN. That just seems to be a lot of CPU to get all of that done. On the background, lower resolutions might make sense, 2600 style. A larger sprite size might cut down on the bit chopping required as well, though that will cost RAM.
-
I need help with final fantasy 7
potatohead replied to super_dos_man's topic in Modern Console Discussion
I did this a while back. It does take a PS1 card. Great game, BTW. Enjoy.
-
Thanks for thinking that through. I did the same this weekend and arrived at a similar conclusion. There is enough horsepower to do sprites full on, without the display lists. Given that, it's probably the more direct way to go. I'm out on business travel right now. Brought the Prop with me. Damn TV has no composite input, so I'm gonna have to spend tonight adding in the broadcast functionality to the driver I want to use before progressing on this. The good thing is that it is possible, however! Edit: This worked, but not anywhere near as well with HYDRA as it did with the demo board. So, I'll be looking for an inexpensive RF modulator...
-
Agreed! Keep the CDs. If you are worried about space, take them out of the cases and keep the cover art, liners and media in one of those CD books. Won't take much room at all to keep your collection handy. I did this a while back and do not regret having done it.

BTW: Get a copy of LAME. It is by far the best encoder out there. Good rips, done at 160 - 256Kbps, are excellent with the quality settings on that encoder. Over time, I've noted quite a difference in high frequency accuracy with LAME over other encoders, particularly the Fraunhofer ones.
-
...so I think they'll be feeling this soon enough. No time like the present to start talking about it. Time Warner had a major hand in this affair. It's the first time a major corporation contributed to the rate and fee structure. Clearly it should be the last.

http://www.usps.com/ratecase/_pdf/Mar19FINAL.pdf
http://action.freepress.net/freepress/post...xplanation.html

Advocacy link: http://action.freepress.net/campaign/postal/
-
Just wanted to say, "great job!". I knew you were gonna crank out an excellent game. Keep up the creative efforts. It's damn cool!
-
Been reading regularly since Atariage.com. Didn't contribute for some time after that. Used to hang out in USENET in the mid '90s. Ended up with one of the original Stella Gets a Brain CDs that resulted from one of Glenn's projects. (Damn cool) I've really enjoyed watching all the great talent here build the excellent homebrew scene. I also enjoy the quality of people and conversation. It's an excellent community with lots going on. Thanks Al, for keeping it all going.
-
Dude!!! That's ugly. I'm getting ready to check out. Don't know if you are up for a short, but very entertaining read.... Try Neal Stephenson's "In the Beginning There Was The Command Line". Awesome look at some core quirks surrounding computing. Any retro computing geek will identify with this in a second. Cheers! http://www.cryptonomicon.com/beginning.html
-
Make it so it is possible for the player to clear the ooze! This ends the level, instead of the timer. Use the timer for something else, like bonus points, etc... Each level puts the ooze on screen in different configurations, and the game attributes are changed to make it difficult. Do this with some goofy sound effects and it should then be good. If the player gets through all the levels, then play a little tune! That will make it a game.

level 1

000000000000
000000000000
000000000000

level 2

000 000
000 000
000 000
000 000
000 000

etc....

Edit: forgot to publish this. After some time away from Bb, I think this is probably where the game should go. The new features will add a lot to Ooze. This will get a re-write at some point. Wanted to archive the idea here for later on.
-
That's an interesting idea for sure. (Likely to work too) Tempted to try doing something like that next time I get some time on the chip.

BTW, there is an emulator that will run code and display (most) TV output. It's slow, but effective for tinkering with code ideas. I've attached a coupla screenies from that. The IDE runs on win32, is free to use, propeller or not, and can be found at the Parallax site. Gear is OSS, runs on .Net, and is also free. I'm using the emulator to work on stuff when I can't get time in front of the chip and a television... http://sourceforge.net/projects/gear-emu

I think what I am going to do next is generate video on the fly. That's a kernel essentially, built for the game graphics I want to display. This will feed into the scanline buffer and will be a nice run through some of the tougher elements.

This chip is both very easy and difficult at the same time. The assembly language, for example, has no registers. It's a memory to memory design, with conditional instructions. So, the A, B above simply reference two of the 512 COG longs, and I treat them as registers. That's cool in that any of the 512 longs in the COG can be instructions or data. Everything is a long in the COG as well. (no byte level access to COG memory; it's a long or nothing) Self modifying code is the norm, for loops and indexing. No biggie there, I've done that on the 8 bit chips before.

One nice element I'm still working through is that writing results, setting flags and executing instructions are all conditional things. Do an AND instruction, for example, have it set the flags but not write the result, then have the next instruction execute or not, based on the flags set by the AND. Lots of things happen in just two instructions where short strings of variable length instructions are required on most of the CPUs I know how to program.
Interacting with the HUB is done with pointers from COG memory, as is passing parameters from the higher level, on chip, SPIN interpreter. Knowing where things are, what is running on which COG, and when it's actually doing it, is the trick. So far, it's been an exercise in timing, using flags or just knowing how long things take to happen, to know when a passed value is ok to use. That round robin memory access timing also requires one to interleave tasks to make best use of HUB access windows.

On this first pass to actually get something playable done: program one COG to interpret memory to generate on-screen graphics, a sprite or two, another to generate the actual video, another to capture user input from a joystick, etc..., and another for sound. So 4 or 5 COGs define the environment, such that the actual game can be written in SPIN, which runs from an interpreter on one COG. All it really should be doing is manipulating memory locations, branching, etc...

(how does one do attachments now??) Looks like they go in blog entries... so they are in the main post.
-
Each of the 8 CPUs runs at 80MHz, clocked together. Instructions are 4 cycles each, for 20 MIPS. Bytes per line is completely arbitrary. Well, mostly. Depends on how you want to do pixels. The waitvid instruction runs for either 4, 16 or 32 pixels: 8 bit, 2 bit, or one bit color, respectively. The driver I'm building off of works by setting the colors to SYNC levels, and the scales to be equal to the time required for them. One waitvid then draws the HSYNC, etc... when necessary. Right as the user graphics area happens, the scale is changed to pixel size and then my bitmap code, in this case, just starts grabbing bytes and feeding them to waitvids for that scanline. Right now, it's running at one byte per pixel, for a total of 160, or 320 interlaced. Sprite engine code, and other video code, would consume less, depending on how long unpacking pixels takes, the number of pixels, colors, etc...
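To make the bookkeeping concrete, here's the waitvids-per-scanline arithmetic in Python (Python just because it's easy to check; the real driver is PASM):

```python
# At each color depth the waitvid shifter handles a fixed number of pixels,
# so the number of waitvids per scanline is just pixels / pixels-per-waitvid.
PIXELS_PER_WAITVID = {8: 4, 2: 16, 1: 32}   # color depth (bits) -> pixels

def waitvids_per_line(pixels, depth):
    per = PIXELS_PER_WAITVID[depth]
    assert pixels % per == 0, "line width must be a multiple of the group size"
    return pixels // per

print(waitvids_per_line(160, 8))   # 40 waitvids at 1 byte/pixel
print(waitvids_per_line(80, 8))    # 20 -- the 80-wide demo's loop count
```

That's why the 80 pixel demo driver's inner loop counts down from 20.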
-
Hmmm... Take a look at the schematic here: http://www.parallax.com/dl/docs/prod/prop/PropDemoDschem.pdf Pins 12-15 handle video output.

Did some looking at the stock TV driver. Seems that chroma can be output standalone, or mixed in with the luma. So, it will do an S-video signal, if that's the desired effect. The behavior I noted was given the output circuit shown. In that configuration, chroma and luma are added together.

I've got one of these boards, and the HYDRA game machine. Both are really just support circuits for the prop, but the HYDRA does have a cart/card slot for expansion, tinkering, etc... The demo board has a small breadboard for the same thing. Tinkering with different video output would just involve a few resistors, a cap or two, and another RCA jack plugged into the breadboard. Really, one could connect any kind of video, it seems. The circuit shown is the current standard, however.
-
Just re-read your display list comment again. One problem I'm having with a flat lookup table, for which sprite ends up on the scan line, is simple speed. Takes a while to determine what actually is on the line... Might be a better solution in the end. Likely to use fewer COGs, or provide more sprite options.
-
As things stand right now, the propeller video generators (8 of them, all identical) output pure luma when no color is specified, and both when color is specified. Colors range from dark to very bright. I think the chroma is a function of the luma. Let's put it this way: I can ask for black pixels that have colors! I found this interesting, and it is the source of the comment above in the code I posted. On some televisions, these colors do not display well. The video hardware essentially has 16 sub-pixel clocks that translate into chroma when combined with the color burst. Their intensity, again, I believe is keyed to the actual luma being generated at the time.

As far as the hardware goes, it's really simple. Basically, you've got a serializer that takes color and pixel data, four pixels at a time, with each one being one of the available hardware generated colors. Or, you can have 2 bit color and feed it 16 pixels at a time. That's the norm for the chip. In addition to that, there is a scale register and a PLL counter that together define how long a pixel clock is. This can be synced to the NTSC color burst for pixel perfect graphics, and that's what this driver code does. I was not able to get it, but another user did. (fine by me) Additionally, it's running in the 4 pixels at a time mode, so there are 8 bits per pixel of color. Requires more CPU time, but there are 8 of them, so no biggie, as long as it all keeps up.

There are no interrupts on the thing; all timing is deterministic. All instructions take 4 cycles, except for taken branches, which take 8, and HUB memory accesses, which take up to 20 or so, depending on where they are in the round robin memory access scheme. The pixel clocks can be quite fast, with 1280 x 1024 VGA being possible! Overclocking the NTSC pixels, to essentially pack more than one pixel into an NTSC color clock, is gonna be possible, but I'm not going to bother for now.

As for the display lists, this is a very interesting question. I'm quite sure the answer is yes. I've been writing a 16 color display system that's modeled after the Atari 8 bit way of doing things. So, it's gonna have 16 color registers, bitmap modes and sprites --or some combination of those. There are three core approaches:

1. Have one COG output a bitmap, then have other COGs draw sprites into it, in the usual way one would see on the Apple ][, for example. This is quick and easy, but memory intensive, and has color limitations because a full color bitmap consumes all the RAM. (sounds familiar huh?) These engines then use tiled displays where 16x16 pixel regions are 4 colors each. Sprites take on the color of the region they are in.

2. Have one COG output the core NTSC sync, etc... but have it render from a scanline buffer that is built by one or more COGs. It's possible to have flags that tell the group of CPUs when VBLANK, etc... is happening, so things can be frame locked, if you want --or not. This is the approach I'm currently taking. Run one COG in high color mode, then have the others, however many it takes, read from memory to get color, bitmap, sprites, etc... In this fashion, it can all act like ANTIC + GTIA (sort of). And that's why I bought one of the darn things. Wanted to be able to have a software video environment to recreate nice frame locked displays, like the 2600 does and the 8 bitters do.

3. Have more than one COG output video, on the same pins, at the same time. This one has proven difficult, but I don't think it's impossible. If the COGs are synced, then essentially you've got video on layers that ends up being OR'ed together. This is gonna yield more colors, as would sub NTSC color clocking. All the COGs have their own video generators. They are independent, and can output to their own pins, or the same pins, etc... One could drive a TV, VGA and another coupla TVs if that was the goal. Not sure where the RAM would come from, but the chip is happy to output whatever it's asked to.

Getting display list style sprites would involve building an engine to feed the main COG the scanline information, and it will end up on screen. Said engine could run on multiple additional COGs, depending on the demand. Lots of options there that currently lie beyond my skills at the moment. It took a lot to grok just how to make the signal --and how to program in a parallel fashion. Picking up speed now though. I think the curve is very similar to what I've seen many here experience when working on a new classic machine.
-
After an extended time off, I've started to tinker with the Propeller again. Another user managed to do what I was trying to do; namely, build an NTSC video driver that's got stable color clocks on every scan line. Now it's possible to emulate the look of nearly any classic machine on this chip! There is enough control over the signal to essentially output any kind of NTSC / PAL signal you want. IMHO, this is really cool --and what I was hoping was doable. I'll have to get a camera, or something, for screenies... perhaps this weekend!

I was just reading the Closed Captioning thread. This is completely doable on the Propeller. It's got full control of the entire video signal. I'll have to go find a sample waveform to encode...

To get warmed up, I modified and built on that driver to output 80x192x8. This is a lot like the GTIA mode on the Atari, only with more color bits. Did this to check out the color possibilities, with nice big, easily debugged pixels. Turns out, about 96 different colors are possible, largely due to there only being 6 intensity levels on the chip. It's quite possible to use two of the video generators together to get more, but that's gonna be outside the scope for now. There are 16 distinct hues per pixel, with six intensities. Sub-pixel artifacting, which I've not yet really tried, is gonna yield more. Either way, sub-pixel tricks, or combining the output of multiple video generators at the same time, are gonna yield plenty of additional color and intensity levels beyond the 96 done by the hardware.

As things stand right now, I've got a driver that outputs one pixel per NTSC color clock, non-interlaced. This is exactly what the older Atari hardware does. If the chip had more intensity levels, it would all map out nicely, but it doesn't. It's totally possible to overdrive things to sub-color clock levels and get more direct control of what happens in a given pixel. I'm gonna save that for later.
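A quick sanity check on the "about 96 colors" figure, in Python:

```python
# 16 hues times the 6 usable intensity levels (the remaining encodable
# intensity values sit at or below sync level, so they aren't displayable
# colors) gives the on-screen palette size.
HUES = 16
INTENSITIES = 6
print(HUES * INTENSITIES)   # 96
```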
Onboard RAM, being limited to 32K, does limit the color screens. 160x192x8 takes up 30K of RAM! Eek. Just enough to build a color demo or two, and that's it. Maybe, maybe a small game in assembler. Breakout or something... It is 8 bit addressable. No color limitations, just a nice high-color bitmap. It appears there is plenty of time between pixels for all sorts of tricks. Pixel packing to generate 32 or 64 color displays is doable, and that would make a ~20K or so screen at classic game resolutions. Now there is room to actually do stuff. On the fly video will do better --or add external RAM. Andre, the designer of the HYDRA game system that uses the chip, has done this, giving 64K of random access memory, with 500K or so sequential above that. More than plenty for high-color graphics in full on bitmap mode. I'll be playing with that later...

For now, the next task is to build an emulation of some of the better Atari modes with sprites. I've a coupla choices in this. Somebody already worked out how to connect Atari joysticks to the VGA port for quick and dirty control interfaces on the Demo Board. On that note, it's a bummer --sort of. The HYDRA game machine has Nintendo style ports, and code to read the controllers. This is actually very cool in that modern controllers all function in a similar fashion. The Demo Board, which lacks some storage and essentially limits you to just the 32K in chip without building stuff, does not have game inputs, but does have mouse, keyboard, VGA, etc... in common with the HYDRA. If one is using a TV for the display, then that VGA port can easily become a controller port instead! Either system can do this, because all the I/O is bidirectional. Time to make some cables and hook up my Atari stuff! Paddles are gonna be an annoyance, as a cap, charging, timing, etc... are gonna be necessary. All possible, but extra work right now. Joysticks, driving controllers, etc... are gonna be just fine, so I'll start with those.
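The RAM squeeze in numbers (all figures from the post above), worked out in Python:

```python
# A full 160x192 screen at one byte per pixel, against the Prop's 32K hub RAM.
WIDTH, HEIGHT = 160, 192
screen_bytes = WIDTH * HEIGHT      # bytes for the bitmap
hub_ram = 32 * 1024

print(screen_bytes)                # 30720 bytes, i.e. 30K
print(hub_ram - screen_bytes)      # 2048 bytes left for everything else
```

2K left over for code, stack, and variables is why a full color bitmap only leaves room for a demo or two.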
Have one COG (that's one of the 8 CPUs running on this little bugger) build the screen display one scanline at a time, in high color mode. It works from a buffer that's built by one or two more of the COGs. Those two will be building sprite graphics, read from memory and drawn into the buffer on the fly. Graphics done this way only need the storage necessary to define the images, with no full screen bitmap required for display. It's like a programmable TIA, essentially. Once the sprite engine is up and running, it will appear as just video hardware to the real game program running on one or more of the other CPUs.

I'm finding the Atari style player/missile graphics sprites an interesting design option. The existing sprite engines for the chip all use your typical rectangular sprite definitions. Drawing these into the buffer takes time --more time than would be required if all the sprites were screen height! So, vertical movement would be actually moving data, and horizontal movement would be changing a register. Maybe add an origin register so we get to say where a particular bit of data ends up on the screen. This might make packing things in memory easier, and vertical movement easier, given the sprite is not to be reused vertically, as seen on most every game done on the 8bitters. I'm not sure how many will be possible. I strongly suspect this number is high, given the 8 CPUs to work with. It will be possible to have them be more than one color, for sure. I don't think I'm there yet where emulating hardware collisions is concerned, either. No biggie, there is plenty of speed and RAM to check these things in the usual non-hardware ways.

That's it for now. By way of reference, here is the actual 80x192x8 driver code, and the little quick and dirty color demo that went with it.
********** Color Demo code first ***************

{ ******** 80 x 192 High Color 1 byte / Pixel display Demo ***********
  * This demo writes all the useful color values to the screen       *
  * Linear addressing, no tiles                                      *
  * Derived from CardBoardGuru's Simple NTSC display example         *
  * Written for HYDRA                                                *
  ********************************************************************* }

CON
  ' Set up the processor clock in the standard way for 80MHz
  _CLKMODE = xtal1 + pll8x
  _XINFREQ = 10_000_000 + 0000

VAR
  byte displayb[15360]  'allocate display buffer in RAM: 80 x 192 x 1 byte / pixel
  long index            'temp offset for byte statement below

OBJ
  tv : "80x192_NTSC"    'the TV driver, for this example running at 80x192

PUB start | j, c, k, o
  { Fill bitmap with NTSC black color. A zero is below black and will hose
    the display. Perhaps it's not a bad idea to have the TV driver watch
    for this condition...

    8 bits / pixel appears to be a waste in that only 120 distinct pixel
    conditions result. Of these, perhaps 90 or so are really useful.

    Of the 8 bits, the first three deal with intensity:

      Black   = %010
      01 Grey = %011
      02 Grey = %100
      03 Grey = %101
      04 Grey = %110
      White   = %111

    Useful color exists in the remaining bits, as shown by this program.
    The first row of darker colors is questionable. On my better TV, it
    works. On lesser ones, it doesn't... }

  'fill bitmap with black pixels, before triggering display
  repeat index from 0 to 15359
    byte [@displayb] [index] := 2       'another way to point at HUB memory...

  tv.start(@displayb)   'start the tv cog & pass it the address of the bitmap

  'draw a border around the visible graphics screen (80 x 192)
  repeat j from 0 to 79
    plot(j,0,251)
    plot(j,191,251)
  repeat j from 0 to 191
    plot(0,j,251)
    plot(79,j,251)

  'draw the 6 intensities possible (black is one)
  repeat o from 15 to 55 step 10
    repeat j from o to o+9
      repeat k from 160 to 180
        plot(j,k,o/10+2)

  'draw the useful sets of colors possible
  repeat k from 8 to 15
    if k == 9
      k := 10
    c := k
    repeat o from 3 to 77 step 5
      c := c + 16
      repeat j from 20 to 30
        plot(o,j+16*(k-8),c)
        plot(o+1,j+16*(k-8),c)

PUB plot(x,y,c)
  'very simple dot plotter -- one byte per pixel is sweet!
  displayb[y*80+x] := c

And the driver... actually, it's full of commentary, written by me and the other guy! (Attached instead --actually I don't know how, after the AA upgrade. No biggie...)

Just for fun, this is all it really takes to encode the bitmap display, once the framework is all up and running:

        mov     VSCL, CALC_user_data_VSCL
        mov     r1, #20                 '80 pixels horizontal resolution is 20 waitvids

:draw_pixels
        rdlong  B, A                    'get four pixels from hub
        waitvid B, #%%3210              'draw them to screen
        add     A, #4                   'point to next pixel group
        djnz    r1, #:draw_pixels       'line done? Move to sync...

Just one tight loop, grabbing bytes, writing them to the screen, etc...

BTW, the entire signal is encoded as colors. This is why there are only 6 intensity levels when three bits are actually defined for this. The other two are below black, for sync. These produce interesting results when present on screen, in the display graphics area! Not all TVs will cope with this either.... That's why the demo program above filled the display memory with pixel values greater than zero. I'll have the driver itself handle this going forward, so zero will actually be black and not sync!

One other thing... I saw the great little project to make a 7800 adapter for the 5200 machine. Didn't know that one had a video input! I'm assuming this is a pass thru kind of thing.
If so, one of these little chips could be on a cart, and present itself as ROM, maybe co-exist with the ROM, and output a video signal to be used in lieu of the 5200 one! These screens were taken with the emulator. I don't have a good video capture system right now, so this will have to do! The emulator works from a compiled binary from the Propeller IDE. See comment thread for more...
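Circling back to the demo code for a second: plot() is just linear addressing, and the math is easy to sanity check in Python:

```python
# One byte per pixel, 80 bytes per row: the buffer offset is y*80 + x,
# which is exactly what the demo's plot() does with displayb[y*80+x].
def plot_offset(x, y, width=80):
    return y * width + x

print(plot_offset(0, 0))      # 0, first byte
print(plot_offset(79, 191))   # 15359, last byte of the 15360-byte buffer
```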
-
What is the best version of SSX for the Gamecube?
potatohead replied to kevin242's topic in Modern Console Discussion
I'll third that recommendation. Excellent game. Did the original end up on the cube? It's still way up there for the music alone. "call your mamma in the room and show (tell?) her how great you are!"
