
Posts posted by danwinslow

  1. Hmm, I thought it had been beaten by now. So you're saying that a normal human, without prior knowledge that they were conversing with a chatbot, would be sure to notice it's not an actual human within a short time?

     

    Your linkage does confirm what you say...although it's from 2022.

     

    Yeah, it's a fuzzy line to draw...in any case it's kind of an empty test because it's so dependent on uncontrollable variables. But still, I think ChatGPT could come close.

  2. (In my opinion) ChatGPT is a synthesis engine for written material and discussion...it is not doing any reasoning on its own in any way we would recognize. It doesn't actually create, either...even when it's written a poem or a song or a paper that seems pretty good, it's just a kind of extremely deep summation of everything similar it's been trained on for the subject, with some random rolls probably thrown in. We are really reading our own writings, and we are really conversing with ourselves. There's no separate viewpoint, no separate self. Is it useful? Yeah, probably. Is it better? No. Many things we have created are useful, but not better.

  3. I have zero experience with ChatGPT, but I really do think foft has a point. Think about it - if ChatGPT had been asked to 'write an atari basic program to communicate with itself', then it could appear as above, sort of a close-but-not-quite-right deal. ChatGPT, as I understand it, was trained on huge amounts of material from the internet. Given the way that works, you'd expect it to have some dialect drift, since it's seen hundreds of BASIC variants, and what you'd get back out would be a sort of averaged mish-mash syntactically.

     

    That said, I find it hard to believe that it would be able to write, a priori, the above details about communication with itself using the FujiNet API and the details of Atari SIO. Right now I lean towards somebody just screwing with us. I don't think ChatGPT actually reasons on its own much; it's just an extremely well trained and sophisticated natural-language machine learning model.

  4. 23 hours ago, kensu said:

    Why is count set to [0]

    As far as I can see, it shouldn't be. It should be set to just 0. The code uses it as a variable, but count=[0] sets up an array. Not sure about Action!, but in some languages you would end up incrementing the array pointer by doing things like count=count+1. That would be bad. My advice is to remove the brackets.
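    Not Action!, but here's a rough C analogy of the pointer-vs-value hazard I mean (just a sketch with made-up names):

        #include <stdio.h>

        int main(void) {
            int count = 0;            /* plain scalar: count = count + 1 bumps the value */
            count = count + 1;
            printf("%d\n", count);    /* prints 1 */

            int slots[4] = {0};
            int *cursor = slots;      /* a pointer into an array */
            cursor = cursor + 1;      /* moves the pointer, not the value it points at */
            printf("%d\n", *cursor);  /* still 0 - nothing was incremented */
            return 0;
        }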

  5. 4 hours ago, kensu said:

    Thanks to everyone who replied, the issue was that it was overwriting itself. (I assumed MoveBlock moved it as a discrete block, but it must do it byte-by-byte like the De Re Atari example). I fixed the problem by zeroing out the player graphics at the current position and redrawing them (from the array where I originally defined them) in the final position. I also padded it with zeroes, an arcane practice I've been seeing in code examples but hadn't understood until now. It appears the way I did it now is the way it's done in most examples.

     

    I'm beginning to realize that Action! is even closer to the metal than C is; never before have I found myself wondering how malloc works so I could write my own.

     

    I think that is fairly correct. You can write C in a manner that is very bare-metal too, but you have to do it on purpose. I think Action! and C code like that are very similar in 'height' above bare metal...so then it comes down to which has better code generation and optimization...and I'm sure that's been explored in one of the language benchmarking threads here.
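    On the overwriting issue itself: here's a minimal C sketch of why a forward byte-by-byte copy (my assumption about how MoveBlock behaves, not its actual code) corrupts an overlapping move, while a proper memmove does not:

        #include <stdio.h>
        #include <string.h>

        /* Naive forward byte-by-byte copy, like MoveBlock appears to do. */
        void naive_copy(char *dst, const char *src, size_t n) {
            for (size_t i = 0; i < n; i++)
                dst[i] = src[i];   /* once dst catches up to src, we copy clobbered bytes */
        }

        int main(void) {
            char a[] = "ABCDEF";
            char b[] = "ABCDEF";

            naive_copy(a + 2, a, 4);      /* overlapping move, dest ahead of source */
            printf("naive:   %s\n", a);   /* "ABABAB" - the tail got corrupted */

            memmove(b + 2, b, 4);         /* memmove copes with overlap correctly */
            printf("memmove: %s\n", b);   /* "ABABCD" - the block arrives intact */
            return 0;
        }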

     

    Malloc functions are very interesting and fundamental routines; you would learn a lot if you worked your way through writing your own.
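    To give a taste of that: here's a toy C bump allocator, nowhere near a real malloc (no free list, no reuse, and the names are mine), just to show the core idea of carving allocations out of a fixed arena:

        #include <stddef.h>

        static unsigned char heap[4096];   /* fixed arena standing in for system memory */
        static size_t heap_top = 0;

        void *my_alloc(size_t n) {
            n = (n + 7) & ~(size_t)7;      /* round up to 8-byte alignment */
            if (heap_top + n > sizeof heap)
                return NULL;               /* arena exhausted - a real malloc asks the OS */
            void *p = &heap[heap_top];
            heap_top += n;                 /* "bump" the top; there is no way to free */
            return p;
        }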

  6. I meant it in more of a philosophical sense. Consider: any such run of numbers, and the tests you mention, are by necessity a limited run. Proving a given distribution over a subset of iterations is not the same as saying something is actually, inherently random; it's just one way of detecting whether your particular algorithm favors certain outcomes over others. The nature of your algorithm is digital - you may feed in some analog entropy along the way, but there's no escaping the fact that you are running a deterministic routine, and you would need to run it forever to prove anything. Even sampling only a natural random event, such as the decay of a particle, is necessarily limited in span, and a run of all 5's, say, for any given amount of time is perfectly possible. To prove randomness is to prove a negative: you have to show that no cause can ever be provided for a certain outcome.

    This is why these algorithms are always called pseudo-random generators. The actual existence of true, inherent randomness is by definition impossible to prove algorithmically.
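    To make 'deterministic routine' concrete, here's a minimal C sketch - a classic LCG using the usual Numerical Recipes constants - where the same seed yields the same 'random' sequence on every single run:

        #include <stdio.h>
        #include <stdint.h>

        /* Linear congruential generator: pure arithmetic, fully deterministic. */
        static uint32_t lcg(uint32_t *state) {
            *state = *state * 1664525u + 1013904223u;
            return *state;
        }

        int main(void) {
            for (int run = 0; run < 2; run++) {
                uint32_t state = 12345;        /* identical seed on both runs */
                printf("run %d:", run);
                for (int i = 0; i < 5; i++)
                    printf(" %u", lcg(&state) % 10);
                printf("\n");                  /* both lines come out identical */
            }
            return 0;
        }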

  7. 1 hour ago, Mclaneinc said:

     

    I've heard from various people about that practice of only getting to actually run the code on the mainframe a few times; I can understand the reasons, but it seemed a bit cruel. But what we see as a small box on the desk was the best part of a full wall of electronics back then. I dread to think of the price. My wife worked for Barclays back then and it was terminals, punch cards, etc. - she hated it...

     

    I remember seeing the ICL logo. I just looked on Wikipedia and saw they were taken over by Fujitsu in 2002; apparently Russia still uses the ICL brand name.

    Not to mention that everything was on cards...Hollerith-encoded punch cards in huge trays. JCL cards and embedded program code cards, etc. You'd get your printed output the next morning.

  8. Python is handy, and interpreted, so there are no linking mysteries (although there can be DLL hell involved), and it's available on smaller platforms and architectures. That's part of the reason it's around so much. Nobody is writing really serious back-end applications or high-performance games in it, though. It's fine for what you need.

     

    There's really nothing one way or the other that's special about it; it's just a tool that does some things better than others. The syntax is kinda funky but not overly weird. You can learn some object-oriented programming with it too, although that's not its strong point. Just start!

  9. Yeah, the indentation thing is one of the nits I have with the language. But, I mean, Python is perfectly fine for writing most non-web applications as long as you don't need top performance. playxlo.com looks like a typical JavaScript web site, as was mentioned.

    Modern dev, especially web stuff, is way more complicated than good old BASIC on an 8-bit. Anywhere you go, you will have to get past a lot of complications, one of the biggest being the user interface.

     

    Since you like the playxlo example, try some web development with JavaScript, CSS, and HTML. All that site is really doing is drawing some pictures on the web page and handling click events by playing an MP3. Get a book or take a course and have fun. If you want a desktop app, then Python is fine; you'll just have to deal with picking a suitable UI library. Qt, as mentioned, is a good start, but there are others.
