Everything posted by EricBall

  1. While going with Swift 2.3 was okay for my existing projects, I couldn't easily create a new Swift 2.3 project. Fortunately, you can download old versions of Xcode and rename them as part of the install. So now I can keep developing using Swift 2.2 for iOS 9 without issues. It also appears the big downloads last month were Xcode updating itself on my MacBook and iMac.
  2. Last night I told myself to stop playing Minecraft and get back to work on my iOS game - at least start to put together the level editor. Once I have something partially working I'm much more likely to spend time working on it. But Xcode had other ideas, as it had updated itself. When I opened the project, it asked me (twice) whether I wanted to convert it to the current version of Swift. I said no, but then it said it wouldn't be able to compile it. That wasn't a good option either. So I made a backup of the project and told it to convert to Swift 3. For the most part the change from Swift 2.2 to Swift 3 is cosmetic - more consistent method naming (as if stdio.h were fixed so the fputs parameters were in stream-then-string order). But I found one Catch-22. In my game I'm using an OpenGL fragment (pixel) shader. To pass a variable from the main program to the shader, you put it in a "uniform" - a special kind of variable. Uniforms have special types to match OpenGL types, vectors in particular. The commands to create these vector uniforms are different in Swift 3. That wouldn't be a problem, except the new commands aren't supported by iOS 9 for some reason. So while I'd like to use Swift 3 in case I need to make code changes in the future, I also want to support iOS 9. For the moment I've gone with Swift 2.3, which Xcode will still compile. And I still haven't started coding the level editor.
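     To illustrate the Catch-22, here's a minimal sketch, assuming SpriteKit's SKShader/SKUniform API (the shader file and uniform names are made up): the scalar initializer works everywhere, but the Swift 3 vector initializers take simd types and are only available on iOS 10 and later.

        import SpriteKit
        import simd

        let shader = SKShader(fileNamed: "ball.fsh")   // hypothetical shader file

        // Scalar uniforms are created the same way on iOS 9 and 10.
        let radius = SKUniform(name: "u_radius", float: 0.5)
        shader.uniforms = [radius]

        // The Swift 3 vector initializers use simd types (vector_float2 etc.)
        // but are only available on iOS 10+ - hence the Catch-22 above.
        if #available(iOS 10.0, *) {
            let center = SKUniform(name: "u_center",
                                   vectorFloat2: vector_float2(0.5, 0.5))
            shader.uniforms.append(center)
        }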
  3. Over the years my home internet access (through my cableco) has steadily improved. However, it's always had a usage cap. Not a hard cap where it stops working, or even a soft cap where the bandwidth gets downgraded. Nope, instead I get a usage charge if I exceed it - which I've done occasionally. The "fix" is to upgrade my service. However, I've also configured my Netgear router to kick out a warning page when I've exceeded 100GB to give me some advance notice (the cableco will warn me as well, but theirs only kicks in after I've exceeded the cap and has a 24-48 hour lag). This month that warning popped up even though I hadn't done anything particularly bandwidth intensive (like pulling down the MAME torrent...). Checking the cableco's website I could see there were several days where I had used over 10GB. Hmm... what could be downloading that much data? In a perfect world, my router would have a nice little report breaking down usage by device and destination. There's no technical reason it couldn't, as part of the NAT functionality (although it might require some external storage). Heck, the router is even a DNS proxy, so the report could map each destination IP address to a FQDN. But this isn't a perfect world - far from it. Other than overall usage, the router doesn't track anything beyond the list of currently connected devices. Nor does this function appear to be part of other routers, even the open source ones. Bleargh - so how to do this? Hmm - could I sniff the WiFi network and get the info that way? The short answer is yes. Long answer:
     1. My first attempt was to use Wireshark on Windows. But that only captured the traffic for that computer. What I needed was to put the adapter in "monitor mode".
     2. My second attempt was Acrylic WiFi Professional. This worked, in that I could at least see the number of packets being sent by each device. But I couldn't get a count of the number (and size) of packets received by each device. It also seemed to crash while running unattended for several hours.
     3. I then tried to use the drivers from Acrylic with Wireshark (which is supposed to be possible according to Acrylic). But I couldn't get it working or find out how to configure the driver to only listen on the correct channel.
     4. So I downloaded a LiveCD/USB image of Kali Linux and tried to use aircrack-ng, without success. I'm not sure whether the Linux drivers don't support monitor mode or if there was another issue.
     5. Finally I loaded up Wireshark on my wife's old MacBook. Ding, ding, ding, we have a winner! Monitor mode without complaint, and TShark to capture straight to disk. About the only quirk was that it uses the channel of the connected SSID, but I needed to disable my 5GHz network anyway.
     So with this I can easily track usage per device. In theory I should also be able to decrypt the packets (as I know the WPA2 passphrase), assuming I sniffed the WPA handshake. Of course, it looks like Internet usage is back to normal.
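     For the per-device tally itself, a rough sketch: assume the capture is dumped with tshark (e.g. "tshark -r capture.pcap -T fields -e wlan.sa -e frame.len", giving tab-separated source MAC and frame length), and a little Swift script just sums bytes per device:

        // Reads "MAC<tab>length" lines from stdin and totals bytes per device.
        var bytesPerDevice: [String: Int] = [:]
        while let line = readLine() {
            let fields = line.split(separator: "\t")
            guard fields.count == 2, let len = Int(fields[1]) else { continue }
            bytesPerDevice[String(fields[0]), default: 0] += len
        }
        // Print the heaviest talkers first.
        for (mac, total) in bytesPerDevice.sorted(by: { $0.value > $1.value }) {
            print(mac, total)
        }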
  4. Ah, as per "Rebecca Heineman on 3DO DOOM", she got the Jag port source from id (along with the DOS source) and used it because it had already been stripped down to fit in the Jag's 1MB of RAM - making the port easier. That makes more sense. The reuse wasn't for the code itself but for the already-trimmed assets.
  5. A most entertaining podcast (along with the related Wolf3d podcast). I like how you go beyond a simple review of the game and delve into the development history and relationships & comparisons with other ports. (It was also interesting to learn how many console ports were actually ports of previous console ports. I wouldn't have thought there was enough similarity between the various consoles to make that strategy better than going back to the original source.) Personally I only played DOOM (and Wolf3d) on the PC, although I had (at least for DOOM) a Gravis Ultrasound for superior music and sound effects. (IMHO the main failing of the GUS was it didn't provide a hardware compatible Sound Blaster DAC port when so many games were using it.) DOOM was my last FPS until GoldenEye 64 as my PC couldn't keep up and I got off the upgrade treadmill.
  6. Hell yeah, DVD is still popular:
     1. Both players and content are much cheaper than Blu-ray (even without considering DVD-R copies).
     2. Laptops & portable players typically handle DVDs.
     3. People and libraries (and adult video stores) have lots of DVDs, and there's also a big used market.
     4. While there is a quality difference, unlike the VHS-to-DVD transition it isn't enough to make you want to upgrade.
  7. The game part of my iOS app is 90% done. (Which all programmers know means there's still 90% left to do.) But the big challenges have been conquered - the touch & tilt controls and physics work. There are still some to-dos to load a level, do a reset, handle pause etc., but those will wait until I get the level builder working.
     However, before that I wanted to take a look at performance. One of the cool things with iOS development is you can easily test out the app on an actual device. Plug in an iPhone, iPad or iPod Touch and click run. It even links into the debugger. The thing which concerned me was my tutorial level was running at only 40fps on my 5th gen iPod Touch. That's about the slowest iOS 9 device, so if I can keep the framerate up on it I'd probably be good on anything else. And while 40fps isn't bad, the tutorial level is fairly simple, so it's possible some complex level might be unplayable. Unfortunately the GPU analysis wasn't giving me much of a hint. Both the Tiler and Renderer were at 75% and CPU was less than 20% - so no absolute bottlenecks. I could walk through the OpenGL call stack for a frame, which was interesting but not very useful.
     So I started changing the code to see what made a difference. Turning off a couple of recent additions didn't help. Nor did removing some logic from the update loop or removing my shader. Moving the textures into a texture atlas reduced the number of draw calls and provided a minor increase, but not much. But the texture atlas also seemed to weird out the dynamic lighting. Hmm... comment out the dynamic light - 60fps. Ahh-ha! And it turns out texture atlases don't work with normal maps for some reason - which was causing the visual problems. So I guess I'll make the lighting an optional setting.
     It also got me thinking. I was using the normal map textures to generate the physics bodies via the alpha channel, but the process was less than perfect. So if I decided to discard the dynamic lighting entirely, I'd need to generate the physics bodies manually - which would remove the imperfections as well.
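     For reference, a sketch of the kind of setup involved, assuming SpriteKit's atlas and lighting APIs (asset names are made up): the sprite's normalTexture is what the dynamic lighting reads, and per the above, sourcing the color texture from an atlas is where the lighting went weird.

        import SpriteKit

        let atlas = SKTextureAtlas(named: "Level")              // hypothetical atlas
        let sprite = SKSpriteNode(texture: atlas.textureNamed("wall"))

        // The normal map drives the dynamic lighting.
        sprite.normalTexture = SKTexture(imageNamed: "wall-normals")
        sprite.lightingBitMask = 0b1                            // lit by light category 1

        let light = SKLightNode()                               // added to the scene elsewhere
        light.categoryBitMask = 0b1
        light.falloff = 1.0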
  8. Unlike many other smartphones (like my HTC-601), iPhones don't receive FM radio (https://9to5mac.com/2015/04/20/fm-radio-iphone/) - so that's not a big problem for Apple. And while it's a nice feature, it's not a must-have for most people. OTOH, the Bluetooth speaker I just bought has an FM tuner, and it uses the USB cable as the antenna. So maybe if Apple really wanted to, they could use the Lightning cable as an FM antenna.
  9. Reading https://www.buzzfeed.com/johnpaczkowski/inside-iphone-7-why-apple-killed-the-headphone-jack gave me an idea. What if you combined the Apple W1 wireless chip with something like a Chu Moy pocket amplifier? So you plug your $tupidly expen$ive headphones into the pocket amp, which is wirelessly connected to your phone. On the actual removal of the headphone jack, I think Apple is ahead of the curve. Bluetooth-based headphones are becoming more common and there are definite advantages to not having a cable between your head and where you stuff your upgraded Walkman. I suspect not having a headphone jack will be common in the next generation of smartphones.
  10. Although cellphones are common, even ubiquitous, "everyone" does not have a cellphone. Cellphones can also be damaged, lost, stolen, or out of range. Heck, some people don't even have a home phone. Unfortunately, thanks to cellphones, the usage of public telephones has dropped far below the point where they are profitable. So those without a cellphone need to find other ways to make a call.
  11. In my game there's a ball which rolls around the playfield. While SpriteKit is very cool, with baked-in physics and lighting, it's a 2D engine. So although it will rotate the 2D texture of the ball around the Z axis, that doesn't really convey the look of the ball rolling forward. My original idea was to go with a flat-shaded sphere and let the normal map lighting give the ball a more 3D look. Unfortunately the result was less than satisfying. But then I had an idea: what if the ball was a window into a larger texture? Moving the window would then suggest the rolling motion. While doing some investigation into how to implement this idea, I stumbled onto the idea of using a custom shader. In SpriteKit you can assign an OpenGL ES fragment (aka pixel) shader to a sprite, which then renders the sprite. These shaders are written in a C-style language which is compiled at runtime into instructions executed by the GPU. There are all kinds of special instructions for dealing with vectors (e.g. RGBA and XYZ) & matrices (e.g. rotation). So I bashed at it this weekend (ghods it's good to code sometimes) and managed to get it working! The shader does 3 things:
     1. Return transparent for pixels outside of the ball.
     2. Move the center of the ball around the base texture (the ball's center is tracked by the main program).
     3. Warp the edges of the texture to "wrap" it around the ball.
     The one problem with using a custom shader is you lose the built-in normal map lighting. I'd like to add this in as well, but my first attempts didn't work out. So I've put that on the to-do-later list, as the current result is good enough for now. Also, for some reason SpriteKit does the rotation step after the custom shader. It's one less thing to handle in my shader, although it probably causes some quality loss. (But I will need to account for it in the lighting step.)
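     Here's a hedged sketch of the approach - the uniform names and the warp math are illustrative, not my actual shader; v_tex_coord and u_texture are SpriteKit's built-in shader variables:

        import SpriteKit
        import simd

        let source = """
        void main() {
            // Position within the sprite, centered on (0,0); the ball has radius 0.5.
            vec2 p = v_tex_coord - vec2(0.5);
            float r = length(p);
            if (r > 0.5) {
                gl_FragColor = vec4(0.0);                 // 1. transparent outside the ball
            } else {
                float warp = asin(r * 2.0) / 3.14159265;  // 3. stretch toward the rim
                vec2 dir = (r > 0.0) ? p / r : vec2(0.0);
                // 2. u_center is where the ball currently sits on the big texture,
                //    updated each frame by the main program as the ball rolls.
                vec2 uv = u_center + dir * warp * u_scale;
                gl_FragColor = texture2D(u_texture, fract(uv));
            }
        }
        """
        let shader = SKShader(source: source)
        shader.uniforms = [
            SKUniform(name: "u_center", vectorFloat2: vector_float2(0.5, 0.5)),
            SKUniform(name: "u_scale", float: 0.25)
        ]
        // ballSprite.shader = shader   // assign to the ball's sprite node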
  12. I have this idea for a game, which I'm trying to program for iOS (using Swift). As this is a considerable step beyond my typical C & ASM skill set, it's been slow going. Swift itself is a post-C object-oriented / procedural language, so isn't that difficult to understand, and I can certainly appreciate _not_ having to learn Objective-C. No, the problem is the frameworks. Swift on its own can do very little. It needs libraries like UIKit and SpriteKit. So unlike C, where you can do a lot with just a small portion of the standard library functions, here I need to learn how to work within these complex frameworks (which seem to assume implicit knowledge) to accomplish anything.
     But on the plus side - the frameworks have a lot of features built in. SpriteKit has a whole 2D physics engine baked in, so I don't need to worry about collision detection or bouncing. It will even automatically generate the collision models based on the alpha channel in an image. I recently learned SpriteKit also has dynamic lighting - including "normal maps". My 2D sprites are actually 3D objects, so I'm hoping the normal maps will make them look more 3D. So I whipped up some C programs to generate the normal map texture and the alpha channel for the collision models, dropped them all into a test app and started to play around. The physics portion worked great, but the dynamic lighting was strange: only parts of the textures were responding to it. In the middle of the night I realized the problem - the dynamic lighting was picking up the alpha channel part of the image which I'd used for the collision model. I'd used 255 for solid parts and 0 for empty parts, so the empty parts weren't being lit. (The alpha channel on the light color works in the same way, as a general dimming factor.) In theory this could be used to bake shadows into the normal map, but for me it wasn't what I wanted. The solution is to redo the alpha channel to use 255 & 253 and use 254 as the decision point (see the sketch below).
     * I have no illusions about making millions from this game. In fact, my plan is to make the game free (gratis) to download & play. No in-app purchases, no advertising, no information gathering. Just my gift to the world.
     ** For iOS development you just need a Mac running the current version of macOS. The rest of the development tools can be downloaded from Apple for free. There's even a simulator if you don't have an iOS device to test with. To actually publish an app you'll need to sign up with Apple as a developer (<$100) and you need a domain name & website (beware of teaser offers).
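     A sketch of the alpha-channel fix, using SpriteKit's texture-based physics body (the asset name is made up): solid pixels get alpha 255, "empty" pixels 253, and 254 becomes the cutoff, so the empty pixels still render and light as opaque but are excluded from the collision model.

        import SpriteKit

        let texture = SKTexture(imageNamed: "platform")   // hypothetical asset
        let sprite = SKSpriteNode(texture: texture)

        // Texels with alpha at or above the threshold are included in the
        // body, so 255 ("solid") is in and 253 ("empty") is out.
        sprite.physicsBody = SKPhysicsBody(texture: texture,
                                           alphaThreshold: 254.0 / 255.0,
                                           size: texture.size())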
  13. I think you overestimate what a five-year-old is capable of creating. My fifteen-year-old has difficulty programming anything, although I've been trying to teach him for years. I have to admit, not doing something you don't enjoy and don't need to do is often the sensible plan. Perhaps taking a break is what you need. Maybe you'll find a different creative passion to fill your time with. Or maybe that desire to create homebrew games will hit you with enough force to push you through the challenges. If that does happen, one recommendation I have is to focus on one programming language / console - preferably the one you have found "easiest" to develop for. My programming knowledge went through several phases. I started with Applesoft BASIC and from there different dialects of BASIC on different computers. I later learned Borland Turbo Pascal - similar to BASIC but with structure instead of GOTO/GOSUB. C came next, although I wouldn't really understand C until after I learned assembly. After a brief attempt to learn 6502, I discovered the power of assembly with the 6809 (TRS-80 Color Computer). Then I went back to the 6502 but was stymied by the lack of 16-bit index registers. 8086 was next with the rise of PCs. I only got back to the 6502 with the 2600. Now I'm trying to learn Swift & iOS programming. It's slow going and frustrating 'cause I have an idea in my head for a game but I don't have anywhere near enough knowledge to start actually creating it.
  14. Q1 & Q2 - Building the display lists is one of the great challenges with the 7800. As usual, some tradeoffs have to be made between space (RAM) and speed (CPU cycles).
     Q1 - Yes, 6 bytes per display list is enough for a single 4-byte display list entry and the 2-byte end-of-list entry (see the sketch below).
     Q2 - While not rebuilding the entire display list sounds more efficient, it is difficult to accomplish in practice. First, as sprites move vertically they need to be removed from one display list and added to another - which means the remaining entries in the display list need to be rewritten anyway. (Unless you pad the display list with transparent sprites, which will rob your game of precious CPU cycles due to the additional MARIA DMA.) Second, the extra time and space required to determine whether a sprite needs to be updated and to find it in the list can easily negate the savings of not simply rebuilding the display list.
     Q3 - Yes, you can certainly use any space not used for sprites for code or other data - just don't let it spill onto the next page. However, I'd start with the assumption that you're going to use the entire 2K / 4K block for sprite graphics until you run out of space on the holey DMA pages.
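     For illustration, a hedged sketch of that 6-byte minimum (the field packing is per my reading of the MARIA docs - treat the details as assumptions to verify):

        // One 4-byte display list header plus the 2-byte terminator.
        // MARIA stores the width in negated (two's complement) form, and a
        // zero second byte marks the end of the list.
        func minimalDisplayList(addrLow: UInt8, addrHigh: UInt8,
                                palette: UInt8, widthBytes: UInt8,  // widthBytes 1...31
                                hpos: UInt8) -> [UInt8] {
            return [
                addrLow,                                    // graphics address, low byte
                (palette << 5) | ((32 - widthBytes) & 0x1F), // palette (3 bits) + negated width (5 bits)
                addrHigh,                                   // graphics address, high byte
                hpos,                                       // horizontal position
                0x00, 0x00                                  // end-of-list entry
            ]
        }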
  15. How much time, money & effort are you willing to spend? The first step would be to somehow divide up the display into vector sequences. Both the Vectrex and Atari games after Asteroids (Deluxe) use analog vector generators. So each vector sequence would start with a "return to center" which could be used as a starting point. (The vector sequence is then drawn as a series of slope + speed over time at intensity or off.) The vector sequence would need to be slowed down for the laser projector, tweaked to account for momentum and the output buffered. Then you'd need a bunch of laser projectors. You'd want to make them as physically small as possible so they can be placed close together rather than trying to create some kind of "multi angle beam combiner". As long as the distance between the outputs is small relative to the final display the parallax shouldn't be too noticeable. However, to make any significant difference you'd need a lot of laser projectors (tens at least). And hopefully each vector sequence could be drawn quickly enough to match the game's internal refresh rate.
  16. It looks like they have a lot of flicker, so they aren't simply driving the laser with the raw vector outputs. Yep: http://web.archive.org/web/20040401223651/http://games.lasers.org/pressrelease.shtml Not to dismiss what they did - the display looks good. But the galvos are definitely the limiting factor.
  17. Earlier this month the family and I spent a week camping at Darien Lake (née Six Flags) - riding roller coasters & other rides plus roasting marshmallows & drinking beer. Every night DL has a laser & fireworks show set to music. (Unfortunately the same one every night.) The laser portion of the show was particularly impressive and would have made Pink Floyd (from 40 years ago) green with envy - complex animated scenes in full color. (Probably restricted more by the creation tools and software capabilities than the actual hardware.) But seeing what a modern laser projector is capable of got me to thinking - could one be used as a replacement for a vector monitor (e.g. Asteroids and Tempest)? But then I suspected others had probably had the same thought - and there's probably a big reason why I've never heard of a laser version of Tempest. Of course, last week's insurmountable problem might be this week's trivial solution. So I started with a little investigation. To make a long story short, the limiting factor is moving the beam. A vector monitor uses electromagnets, while a laser display uses mirrors - typically a variation of a "mirror galvanometer" or galvo. People have made their own galvos using the voice coil actuator from a hard disk (which is what moves the heads across the disk). But electromagnets can move the beam from one side of the monitor to the other in 100-300 microseconds, whereas it takes 10-20 milliseconds to move a hard disk head across the entire disk. That's roughly 100 times slower, and a bigger challenge than I think can be easily overcome.
  18. I just borrowed Tomorrowland from the local library. IMHO Ben Affleck's Paycheck does a much better job with the whole "future paradox" idea. I do agree with you that the primary trio of characters were adequate, but not really sympathetic. My biggest issue was with the EET dimension-hopping transport. I could believe in Plus Ultra, and even that PU could create Tomorrowland. But an electrically powered rocket hidden under the Eiffel Tower (which is really a gantry in disguise) goes way beyond my ability to suspend disbelief. There was also a lot of casual vaporization for a Disney movie.
  19. A rack mountable 2600 shouldn't be that bad. The hardest thing would be rewiring the cartridge slot and switches so they are in line with the joystick connectors. Or were you thinking something more radical? Of course, you could always make a rack mounted RPi and play everything via emulation.
  20. For those interested in Apple ][ copy protection (and the cracking of it) I have found several troves of information. First is the 4am Apple ][ Library at https://archive.org/details/apple_ii_library_4am Attached to each archive is a text file where 4am describes his cracking process. In most cases he simply follows the boot process, analyzes the often obfuscated code to figure out how to capture the next step in the process, then determines the minimum change required to allow the disk to be easily copied. (e.g. https://archive.org/details/Gumball4amCrack ) Second is PoC||GTFO 0x10 (pocorgtfo10.pdf, available from https://www.alchemistowl.org/pocorgtfo/ and other mirrors) which contains an article on the various methods by which disks were made uncopyable, or at least impossible to copy without being detected. Finally, http://fabiensanglard.net/prince_of_persia/pop_boot.php has information on the RWTS18 format used by Prince of Persia. There's also source code at https://github.com/jmechner/Prince-of-Persia-Apple-II See the comments on http://atariage.com/forums/blog/7/entry-2436-more-apple-musings/ and http://atariage.com/forums/blog/7/entry-2514-more-disk-musings/ for previous discussion on this topic.
  21. I've had the worst luck getting an N64 working with my collection. I've bought a couple off eBay and none of them reliably and consistently play. Sometimes I'll get lucky and the game will start, but most of the time I just get a black screen. Next time I'm going to buy it via Kijiji so I'm buying from someone more local. However, I've got lots of options for playing games, so it's just not a priority. I also need to get a controller extension cable so I can sit farther back from the TV.
  22. One note - a maxlen LFSR will have one "dead" value (typically 0, but it might be another value depending on the feedback) where the output value is the same as the input value - so once the register hits it, it's stuck there forever. It's therefore a good idea to either initialize the LFSR with a known-good seed or detect the "dead" value and handle it.
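     A minimal sketch in Swift, assuming a standard 16-bit Galois LFSR with taps at 16, 14, 13 and 11 (for an XOR-feedback LFSR like this, the dead value is 0):

        struct LFSR {
            private var state: UInt16

            init(seed: UInt16) {
                // 0 is the dead value: next(0) == 0, so never allow it.
                state = (seed == 0) ? 0xACE1 : seed   // fall back to an arbitrary non-zero seed
            }

            mutating func next() -> UInt16 {
                let lsb = state & 1
                state >>= 1
                if lsb != 0 { state ^= 0xB400 }       // feedback taps 16,14,13,11
                return state
            }
        }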
  23. Graphics on the 7800 is very different from the 2600, so you're going to need different tools. (Which might exist, who knows?) First, graphics data is laid out "upside down", with the last scanline at the lowest address and each scanline on a separate 256-byte page. e.g. for a 2-byte x 16-line sprite:
     $e000-$e001 bottom line
     $e100-$e101 2nd-last line
     ...
     $ee00-$ee01 2nd line
     $ef00-$ef01 top line
     (Sprites are typically 16 or 8 lines high and will start on even 4K (16-line) or 2K (8-line) address blocks.) Second, while sprites on the 2600 are 1 bit per pixel, sprites on the 7800 may be 1, 2 (most common), or 4 bits per pixel depending upon the graphics "mode". The bits from the graphics data are combined with bits in the display list entry to determine whether the pixel is transparent or which output color it gets (from a palette of 24 colors out of a possible 256).
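     A small sketch of that layout, in Swift for illustration (the function and buffer are hypothetical): it copies a top-line-first sprite into the upside-down, page-per-scanline arrangement MARIA expects.

        // `sprite` is 16 rows of 2 bytes, top line first.
        func layOutForMaria(sprite: [[UInt8]], into memory: inout [UInt8], base: Int) {
            for (row, line) in sprite.enumerated() {
                // Top line (row 0) lands on the highest page, bottom line on page 0.
                let page = sprite.count - 1 - row
                for (col, byte) in line.enumerated() {
                    memory[base + page * 0x100 + col] = byte
                }
            }
        }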
  24. Yep, need to digitize those tapes before everyone forgets how to handle analog!
  25. Personally, I think the movie could have done with fewer characters on the battlefield. Neither Spidey nor Ant-Man had enough character development to be part of the battle. Even Wanda & Clint didn't really have the emotional weight of the others. This movie was really about Tony's guilt and Steve's convictions.