Posts posted by Chilly Willy
-
-
It's amazing how fast this is. You've really got some efficient code going there.

-
-
-
I gotta check the code then to see what's going on.
From some coding folks I get that it still isn't a raycaster; they just find which wall segments are visible through the BSP and project the walls/columns on the screen, with no mention of raycasting.
This code review explains some things, but doesn't mention raycasting at all. At one point it says "Visible SEGS are projected on screen via a lookup table and clipped via an occlusion array."
I think I'll study this site more (if I have time) and maybe look at the code for any signs.
Ugh... must be getting old or something. The way it works is like this:
It walks the BSP tree looking for subsectors on the same side of the player as the direction he's looking. When a subsector is found, all of its segments are checked. Each line segment has its start and end view angles computed, which are clipped to the angles corresponding to the edges of the screen. The line segment and the clipped start and end x coords are stored for later.
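A minimal sketch of that angle clipping, assuming a 90-degree field of view and angles measured relative to the view direction (the conventions and the function name here are my own illustration, not Doom's actual code):

```c
#define FOV_HALF 0.7853981634  /* half of an assumed 90-degree FOV, in radians */

/* Clip a wall segment's start/end view angles to the screen.
 * Angles are relative to the view direction: positive = left of center.
 * A front-facing wall has its start angle greater than its end angle.
 * Returns 0 if the segment is back-facing or entirely off screen,
 * otherwise writes the clipped angles and returns 1. */
int clip_seg(double a1, double a2, double *c1, double *c2)
{
    if (a1 <= a2)
        return 0;                                /* back-facing: skip it */
    if (a2 > FOV_HALF || a1 < -FOV_HALF)
        return 0;                                /* entirely outside the view cone */
    *c1 = (a1 > FOV_HALF) ? FOV_HALF : a1;       /* clip to left screen edge */
    *c2 = (a2 < -FOV_HALF) ? -FOV_HALF : a2;     /* clip to right screen edge */
    return 1;
}
```

The surviving angle range would then be mapped to screen x coordinates via a lookup table, as the code review quoted above describes.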
So it really isn't a raycaster in the traditional sense. Sorry about the confusion... most of which was my own.

-
The 8088, 8086, 80286, 80386, 80486 (and I would argue the Pentium 1 series) are not RISC chips.
http://stackoverflow.com/questions/13071221/is-x86-risc-or-cisc
I sold, spec'd, and thoroughly researched those machines during their heyday. To this day, I still have stacks of magazine articles, spec sheets, and other documentation on them.
Everything changed with the Pentium Pro architecture (I have a suspicion of where that technology came from, but that's a different topic) and the introduction of the Pentium 4. So it's important to make a distinction between x86 as it relates to the original Pentium and earlier and x86 as it relates to Pentium Pro and after.
Yes, this is true. I should have been more specific. We were talking about the newer x86 where Motorola finally gave up trying to keep up, and I was pointing out that Motorola switched the 680x0 family into the RISC based ColdFire family, which allowed them to up the speed while maintaining backwards compatibility with a library for unsupported opcodes, and how Intel basically did the same thing - converting the x86 from a full CISC processor into one with a RISC core for better speed, and a translation layer to keep the same ISA.
Incorrect. The 500 came out in late 1987 in the US. Unless you have specific stats you can provide from a reputable source, I'm gonna call BS on the 2-1. I'm sure both Commodore and Atari used whatever PR techniques they could to convince their dealer body to stay with them and I wouldn't put it past either company to lie to their dealers about how their own lines were selling in relation to the competition.
Came out in Oct 87. I got one in Jan or Feb 88 to replace my Atari 400. You know, I came THIS CLOSE to working for GVP in 1990. I was in talks with them to fly out for an interview, then the war in Kuwait/Iraq started and I never heard back from them.
-
That's the first time I've ever heard anyone say that iNTEL's x86 architecture, well its evolutionary descendants anyway, are really RISC architecture at heart. That architecture has oodles of instructions, op-codes, and complex operations supported right in the chips. They're most definitely CISC based, my friend, and everyone else I've ever heard or read so far sees it that way too. Now if we are all somehow mistaken, I really would like to learn the truth. But you just saying so isn't going to be enough. Sorry.
You must not be an engineer, nor read ANY boards that talk about hardware. I suggest you google "x86 risc core" and prepare to be blown away.

-
-
-
As the GTIA is in some ways a successor to the "TV adapter" part (the TIA), it kept the visualization part of it. It seems the original GTIA design of the mid 70s had no room left on the chip to keep a fully working sound part. The "click" sound produced by the GTIA, including the audio mixing with POKEY and the SIO sound, shows that there had been an intention to make the GTIA play sound. The interesting part is that the GTIA sound doesn't interfere with POKEY's volume mixing. The TIA was even able to play stable bass sounds. IF the GTIA had kept that part, EVERYTHING soundwise would have been fine with the A8.
But, as we know, GTIA was the problem chip...
The CLICK is an unused GPIO, nothing to do with audio. And there's NO MIXING in the GTIA. All audio is handled external to the GTIA entirely - the SIO audio is mixed by an external circuit with the POKEY audio, while the GPIO line from the GTIA goes to a transistor to the speaker on the old consoles, and into the external mixer on the newer consoles.
-
I did! I hacked a hole in the back of my 3000. I didn't take it to a shop. And it showed...
Yes, there were instructions on how to do it yourself in magazines and on BBSes not long after the A3000 came out, but you really needed to be sure you weren't all thumbs, as more than one person managed to kill both their A3000 and Toaster doing it themselves. If all you wound up with for trouble was a not-as-nice-looking job, yuh dun good!

-
At one point in time, I had my A500 with the Fatter Agnus, 1MB of chip, 2.75 MB of bogo mem, a 25MHz 68030 with 8MB of ram, a Slingshot adapter plugged into the side (which gives you two standard A2000 style Zorro II slots), with a Ethernet card in one slot, and an EMPLANT card in the second slot. I did all my EMPLANT development on that for at least a couple years before finally getting an A4000 to do EMPLANT development on.
-
On the A8, things would have been easy to upgrade. ANTIC is often called the "graphics" chip on the A8, but it isn't. It's the DMA controller, nothing else. GTIA is the graphics and sound chip, even if heavily cut down on the sound part. A lot of registers were still unused, leaving the possibility of adding its own RAM (graphics memory) and a self-running digitizer without disturbing any compatibility. It could have used a coprocessor as well, acting like the "Copper".
It's just that chaos at Atari that stopped any progress there.
The GTIA has no real sound resources on it at all. An unused GPIO (general purpose in/out) line was used to toggle a speaker on the old 400/800, and that was the extent of "sound" in the GTIA. The sound chip on the A8 is the POKEY, which also handled serial and keyboard interfaces. This became the PAULA chip in the Amiga.
Perhaps the simplest upgrade Atari could have done to the A8 line would have been to double the speed of the ANTIC/GTIA path. 160 wide color modes would have become 320 wide color modes, and the 320 wide B&W would have become a 640 wide B&W. The GTIA modes would have gone from 80 wide to 160 wide. It would have required memory to be twice as fast, but memory got cheaper and faster all the time. Make the speed switchable for backwards compatibility and you would have had a really nice upgrade.
-
-
-
Add a third Antic and give the CPU a fulltime break

A box with several slots and make a retro version of SLI with ANTIC cards.

-
And to compound this mistake, the Amiga 3000 they phased in to replace the A2K couldn't fit the Toaster!
Actually, you could MAKE it fit, but doing so would void your warranty. I knew quite a few folks who took their Toaster and A3000 to certain shops that would do the fitting for you and warrant it themselves... for a price.

-
-
-
In the fall semester of 87 at college, I used my Atari 400 to solve a bounded planar electric field from a 2D source, requiring an eigenfunction expansion of a 2D partial differential equation. It took almost four hours to compute. I decided right there it was time to move on, so I did my research and bought an Amiga 500 in early 88. I knew I wanted a 68000 based system (all the best systems were... the ST, the Amiga, the Mac...) and while the CPU in the ST was slightly faster, the Amiga beat it hands down in every other respect. And we had a few Amiga 1000s in the lab, so it was easier to take stuff back and forth going with the Amiga. When we needed something more, we had several NeXT boxes for the high end stuff.
-
-
-
Not for RAPTOR, it's designed to take the windows little endian format so people don't have to arse around.
Yeah, if you can have the function deal with the BMP directly, that's much easier on the user - they just include some BMPs with their project. They just need to make sure they're in the proper format.
-
-
-
I am an engineer, both of hardware and software, so when it came time to update from my venerable A400, I looked into it, and it was OBVIOUS even 25 years ago that the Amiga was the successor of the A8. It wasn't heretical, all us engineers knew and acknowledged it. So even if it meant changing which company we bought from, we bought the design we knew was best and most familiar to us. What was also clear was that the ST was NOT the next generation A8.
-
-
-
Actually, Doom IS a raycaster. What is different is the method used to determine the intercept points with walls. Wolf3D uses a regular grid to represent walls, while Doom uses a BSP tree. Each ray is cast like a normal raycaster, then for each sector the ray enters, line intersection calculations are done between the ray and each line segment inside the sector. If there is no intersection, or if the line segment indicates the wall isn't solid, the ray is advanced to the next sector along the ray's path, and the intersection check is done again. The BSP tree merely makes finding which sector the ray is in faster given arbitrary shaped sectors.
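The ray-versus-segment check described above can be sketched with the standard parametric line intersection (the function and variable names here are my own illustration, not id's actual code):

```c
#include <math.h>

/* Cast a ray from (px,py) along direction (dx,dy) and test it against
 * a wall segment from (x1,y1) to (x2,y2). Returns 1 and the distance
 * along the ray on a hit, 0 on a miss (parallel, behind the player,
 * or past either end of the segment). */
int ray_hits_seg(double px, double py, double dx, double dy,
                 double x1, double y1, double x2, double y2,
                 double *dist)
{
    double sx = x2 - x1, sy = y2 - y1;          /* segment direction */
    double denom = dx * sy - dy * sx;           /* 2D cross product  */
    if (fabs(denom) < 1e-12)
        return 0;                               /* ray parallel to the wall */
    double t = ((x1 - px) * sy - (y1 - py) * sx) / denom; /* along the ray */
    double u = ((x1 - px) * dy - (y1 - py) * dx) / denom; /* along the seg */
    if (t < 0.0 || u < 0.0 || u > 1.0)
        return 0;                               /* behind player or off the seg */
    *dist = t;
    return 1;
}
```

With t in hand, the wall column height falls out of the (perspective-corrected) distance, same as in Wolf3D.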
-
When saving as a 16 bit BMP, there are two modes most programs save in - REAL 16 bit (RGB565), and 15 bit (XRGB1555). When I'm working on the 32X, I need the latter. For the Jaguar, you'd need the former. Two more points - BMPs are always in PC format, i.e., little endian. The Jaguar is going to want the data in big endian format. And finally, the Jaguar 16 bit RGB is actually RBG556, not RGB565. Converters need to take all this into account.
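A sketch of those conversions in C. The XRGB1555 and RGB565 layouts are the standard PC ones; the Jaguar layout here (red in the top 5 bits, blue next, green in the low 6) is my reading of "RBG556" and should be checked against the Jaguar docs:

```c
#include <stdint.h>

/* Assumed field layouts:
 *   XRGB1555: x RRRRR GGGGG BBBBB   (PC "15 bit")
 *   RGB565:   RRRRR GGGGGG BBBBB    (PC "real 16 bit")
 *   Jaguar:   RRRRR BBBBB GGGGGG    (my reading of "RBG556") */

/* Widen XRGB1555 to RGB565: green grows from 5 to 6 bits,
 * replicating its top bit so full intensity stays full. */
uint16_t xrgb1555_to_rgb565(uint16_t p)
{
    uint16_t r = (p >> 10) & 0x1F;
    uint16_t g = (p >> 5)  & 0x1F;
    uint16_t b = p & 0x1F;
    return (uint16_t)((r << 11) | (((g << 1) | (g >> 4)) << 5) | b);
}

/* Shuffle RGB565 into the Jaguar's red-blue-green field order. */
uint16_t rgb565_to_jag(uint16_t p)
{
    uint16_t r = (p >> 11) & 0x1F;
    uint16_t g = (p >> 5)  & 0x3F;
    uint16_t b = p & 0x1F;
    return (uint16_t)((r << 11) | (b << 6) | g);
}

/* Byte-swap a 16-bit value from little endian (BMP) to big endian (Jaguar). */
uint16_t swap16(uint16_t p)
{
    return (uint16_t)((p << 8) | (p >> 8));
}
```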
-
Blackthorne will NEVER work... it's a 32X game.

-
-
-
I always thought the color cycling was in the hardware. Was it something that coders had to use up space to add in?
Nope. It's software. You cycle the color palette after so much time without an input. The 8-bit computers had it in the OS, but it was still software.
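A minimal sketch of that kind of software cycling, assuming a small hardware palette the CPU can rewrite (the function name and rotation direction are my own choice):

```c
#include <string.h>
#include <stdint.h>

/* Software color cycling: rotate a range of palette entries by one slot
 * each time the idle timer expires. The display hardware just keeps
 * reading whatever colors are currently in the palette. */
void cycle_palette(uint16_t *pal, int first, int last)
{
    uint16_t saved = pal[last];
    /* shift entries first..last-1 up by one slot */
    memmove(&pal[first + 1], &pal[first],
            (size_t)(last - first) * sizeof(pal[0]));
    pal[first] = saved;  /* the last color wraps around to the front */
}
```

Call it from a timer or vblank handler once the "no input" countdown runs out and the colors appear to flow with zero extra display hardware.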
-
Actually...
Here's RAPTORVADERS....
Written in under 24 hours (including debugging the .OBJ file and commenting the code) using the RAPTOR library.
Nice job!
Once this is released there'd better not be a single utterance of 'the jaguar is hard to code for' ever again. Coffin, meet Nail.
Too hard generally means no examples and/or libraries. I think you're taking care of both those here.

-
I think he's looking for the Dreamcast or Jaguar. I know I'm interested in the Dreamcast version if you have a date on that.
-
Good to hear. Welcome to the CDX Club.

-
-
-
Yeah, those old CD drives can sometimes be hard to find. For the Sega family, the Model 2 is the easiest to replace.
-
Yes, the CDX is a persnickety console. Mine is VERY picky about CDRs, but pretty decent with pressed CDs.
-
Heh - I had the Koala Pad and MicroIllustator for the Atari. Still have the Koala Pad...
-
So the Z80 could directly control the audio like in their arcade units?
You could use either the 68000 or Z80... you were mainly supposed to use the Z80 for Genesis music. Some drivers use only the 68000, and a few use both the Z80 AND the 68000.
I'll never understand Japan's fetish for the Z80 whereas the American arcade companies went with the 6502 and the 6809 for the most part.
The Z80 was big for many companies... in the US, Tandy made a ton of Z80 based computers before finally switching to the 6809. The Z80 was huge in Europe as well. I think the main thing was companies trying to save every cent possible went with the 6502. The Z80 was powerful, but pricey, so you saw it in devices where the price didn't matter as much, like arcades and PCs. Home consoles liked the 6502 to keep the price low.

5.25" drive on Amiga -- how to question
in Commodore Amiga
Posted
I used a 720K 5.25" drive with the guts of a normal 3.5" external drive. The computer thought it was a 3.5" drive, but the only real difference with a 720K 5.25" drive is the physical size in any case. Most external drives are designed so that they shift the select lines over by one: the new "drive zero" goes to the drive they control, while the rest are passed through to the daisy chain. Makes it really simple.