
danwinslow

Everything posted by danwinslow

  1. Shtetl-Optimized! Great blog, +1. Been following the 'quantum wormhole in the lab' stuff there and on Not Even Wrong. The Eugene thing, though, seems really incapable and obvious, like ELIZA level. Is ChatGPT as easy to fool? Can't get to it at work but I'll try later at home.
  2. Hmm, I thought it had been beaten by now. So, you're saying that given a normal human without prior knowledge that they were conversing with a chatbot, they would be sure to notice it's not an actual human within a short time? Your link does confirm what you say...although it's from 2022. Yeah, it's a fuzzy line to draw...in any case it's kind of an empty test because it's so dependent on uncontrollable variables. But still, I think ChatGPT could come close.
  3. yah, 'intelligence' is a done deal, and I would guess that ChatGPT easily passes a Turing test, at least in its original meaning. Consciousness, now...that's a whole other story.
  4. All of it besides peripherals, probably. It's FPGA, so you load 'circuits and silicon as software' into it. New color modes on an otherwise accurate emulation would be awesome, and make me want to play. Would need some OS patching obviously, but I bet we could squeeze in a few new color modes without too much disruption.
  5. (in my opinion) ChatGPT is a synthesis engine for written material and discussion...it is not doing any reasoning on its own in any way we would recognize. It doesn't actually create, either...even when it's written a poem or a song or a paper that seems pretty good, it's just a kind of extremely deep summation of everything similar it's been trained on for the subject, with some random rolls thrown in, probably. We are really reading our own writings, and we are really conversing with ourselves. There's no separate viewpoint, no separate self. Is it useful? Yeah, probably. Is it better? No. Many things we have created are useful, but not better.
  6. yeah. Well, from the 2 post count and the dope related name I'm guessing that we won't hear much direct conversation from OP.
  7. Have 0 experience with ChatGPT, but I really do think foft has a point. Think about it - if ChatGPT had been asked to 'write an Atari BASIC program to communicate with itself', then it could appear as above, sort of a close-but-not-quite-right deal. ChatGPT, as I understand it, was trained on huge amounts of stuff on the internet. Given the way that works, you kind of expect that it would have some dialect drift, since it's seen hundreds of BASIC variants, and what you would get back out is sort of an averaged mish-mash syntactically. That said, I find it hard to believe that it would be able to write, a priori, the above details about communicating with itself using the FujiNet API and the details of Atari SIO. Right now I lean towards somebody just screwing with us. I don't think ChatGPT actually reasons on its own much; it's just an extremely well trained and sophisticated natural language machine learning model.
  8. Interesting. So when he removed the brackets he got a pointer to location 0? Surprised he didn't crash. Thanks for the explanation.
  9. Oh, brackets vs. parens, yeah. Brackets are apparently initializer lists. So what does a non-bracketed number do? Does INT COUNT=0 do anything different? For that matter, what would INT COUNT=[0,1,2,3] do? It's not an array declaration.
  10. As far as I can see, it shouldn't be. It should be set to just 0. The code uses it as a variable, but count[0] sets up an array. Not sure about Action!, but in some languages you would be incrementing the array pointer by doing things like count=count+1. That would be bad. Removing the brackets is my advice.
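The scalar-vs-array confusion above can be shown in Python too (just an analogy to make the point, not actual Action! semantics - Action! treats the bracketed form as an initializer, as noted in the other posts):

```python
# Analogy only: a name bound to a plain scalar vs. a one-element list.
count = 0              # plain scalar
count = count + 1      # fine: ordinary arithmetic
print(count)           # 1

count = [0]            # one-element list, loosely like a bracketed initializer
# count = count + 1    # TypeError: can't add an int to a list
count[0] = count[0] + 1  # you have to index into it instead
print(count[0])          # 1
```

The failure mode differs by language - Python raises an error, while a language that silently reinterprets the name as a pointer or array base would just corrupt things - but the underlying mistake is the same: the declaration form changed what the name means.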
  11. Ah, that's too bad. I wonder what kind of revenue stream modern Atari Inc. is trying to protect...can't be much.
  12. Kind of surprised the title doesn't have the word 'atari' in it.
  13. Yep, that's why I said "you have to do it on purpose". I was comparing C code written in this manner, not the full C language.
  14. I think that is fairly correct. You can write C in a manner that is very bare metal too, but you have to do it on purpose. I think Action! and C code such as mentioned are very similar in 'height' above bare metal...so then it gets down to which has better code generation and optimization...and I'm sure that's been explored in one of the language benchmarking threads here. Malloc functions are very interesting and fundamental routines; you would learn a lot if you worked your way through writing your own.
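To give a flavor of that exercise, here is a toy first-fit allocator sketched in Python (my own illustrative model over offsets into a fixed arena - real malloc implementations manage raw memory, coalesce adjacent free blocks, keep per-block headers, and so on):

```python
# Toy first-fit allocator over a fixed-size arena. "Pointers" are just
# integer offsets; the free list is a list of (offset, length) blocks.
class Arena:
    def __init__(self, size):
        self.free = [(0, size)]           # initially one big free block

    def malloc(self, n):
        for i, (off, length) in enumerate(self.free):
            if length >= n:               # first block big enough wins
                if length == n:
                    self.free.pop(i)      # exact fit: consume the block
                else:
                    self.free[i] = (off + n, length - n)  # split it
                return off
        return None                       # out of memory

    def free_block(self, off, n):
        # Real allocators would also merge this with adjacent free blocks.
        self.free.append((off, n))

a = Arena(64)
p = a.malloc(16)   # offset 0
q = a.malloc(16)   # offset 16
```

Working through where this toy breaks down - fragmentation, the missing coalescing step, how to find a block's size at free time without passing it in - is exactly where the learning happens.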
  15. I meant it in more of a philosophical sense. Consider: any such run of numbers, and the tests you mention, are by necessity limited runs. Proving the expected distribution over a finite subset of iterations is not the same as saying something is actually inherently random; it's just one way of detecting whether your particular algorithm favors certain outcomes over others. The nature of your algorithm is digital - you may input some analog entropy along the way, but there's no escaping the fact that you are running a deterministic routine, and you would need to run it forever to prove anything. Even sampling only a natural random event, such as the decay of a particle, is necessarily limited in span, and a run of all 5's, say, for any given amount of time is perfectly possible. To prove randomness is to prove a negative - you have to say that no causes can ever be provided for a certain outcome. This is why these algorithms are always called pseudo-random generators. The actual existence of true inherent randomness is by definition impossible to prove algorithmically.
  16. Not to mention that everything was on cards...Hollerith-encoded punch cards in huge trays. JCL cards and embedded program code cards, etc. You'd get your printed output the next morning.
  17. Not sure you can do anything about it via normal programming using the stock RNG algorithm. What you need is a source of more entropy, and probably to implement a custom RNG that takes full advantage of it. You can think of entropy as sources of randomness from the environment: things you can sample as a number, and preferably things that are themselves considered random.
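On a modern machine the usual shortcut is to let the operating system's entropy pool do the sampling for you. A minimal Python sketch of the idea (the `make_rng` name is mine; on an 8-bit Atari you would instead sample hardware sources like POKEY's RANDOM register or timer jitter):

```python
import os
import random

def make_rng():
    # os.urandom pulls bytes from the OS entropy pool (hardware events,
    # interrupt timing, etc.), so the seed isn't just a predictable clock.
    seed = int.from_bytes(os.urandom(16), "big")  # 128 bits of entropy
    return random.Random(seed)

rng = make_rng()
roll = rng.randrange(1, 7)   # a die roll; still a deterministic
                             # algorithm once the seed is fixed
```

The entropy only goes into the seed here - everything after that is the same deterministic generator, which is the point made in the posts above.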
  18. My high school had a mainframe connection, a teletype. I watched someone play original Adventure on it. Then I got a Sinclair ZX80 with 1K of RAM (!). I wrote a 'lunar lander' game on it with characters for graphics (the lander was an 'M' and the thrust was the vertical pipe symbol). Joined the Air Force at 18 and they made me a programmer and I've been doing it ever since.
  19. The odd thing about randomness is that it is impossible to prove, or even define completely. If you could, then by definition it wouldn't be random.
  20. Python is handy, interpreted so no linking mysteries (although there can be DLL hell involved), and it's available on smaller platforms and architectures. That's part of the reason it's around so much. Nobody is writing really serious back-end applications or high performance games in it, though. It's fine for what you need. There's really nothing one way or the other that's special about it; it's just a tool, does some things better than others. The syntax is kinda funky but not overly weird. You can learn some object-oriented programming with it too, although that's not its strong point. Just start!
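For the object-oriented part, the whole idea fits in a few lines (a generic beginner illustration, names my own):

```python
# Minimal Python OOP: a class bundles state (value) with behavior (tick).
class Counter:
    def __init__(self, start=0):
        self.value = start       # per-instance state

    def tick(self):
        self.value += 1
        return self.value

c = Counter()   # each Counter() is an independent object
c.tick()
c.tick()
print(c.value)  # 2
```

Indentation defines the blocks - there are no braces or END statements - which is the syntax quirk mentioned in the next post.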
  21. Yeah, the indentation thing is one of the nits I have with the language. But, I mean, Python is perfectly fine for writing most non-web applications as long as you don't want super performance. playxlo.com looks like a typical JavaScript web site, as was mentioned. Modern dev, especially web stuff, is way more complicated than good old BASIC on an 8-bit. Anywhere you go you will have to get past a lot of complications, one of the biggest being the user interface. Since you like the playxlo example, try some web development with JavaScript, CSS, and HTML. All that site is really doing is drawing some pictures to the web page and handling click events by playing an MP3. Get a book or take a course and have fun. If you want a desktop app, then Python is fine; you'll just have to deal with picking a suitable UI library. Qt, as mentioned, is a good start, but there are others.