Search the Community
Showing results for tags 'SAM'.
Saw this awesome Color Computer video that explains some of the things I always wondered about in really good detail. I'm pretty sure the 555 timer affected which side of the phase the artifact color set fell on: whether you got red/blue or blue/red as the dominant colors the other artifact colors were generated from when you pressed reset. Back in the day that always reminded me of the Atari 2600's B&W switch, which could change up the games with a second color palette.
There was a Korg microKorg under the Christmas tree last year; only because my wife wouldn't let me set it up in November when it was purchased. I did manage to get the manual out of the box before it was wrapped, so for a whole month I read the manual and watched YouTube videos.

The microKorg has a vocoder. This suggested that the audio output from an Atari running SAM could be hooked up to the line-in on the Korg. Then the Korg could modify, modulate, or magically manipulate the signal into as-yet-unheard sonic stimulations. Singing has never been one of SAM's strengths, but if auto-tune can make SAM sound this good, then maybe I could have been a pop star.

The score for "Row Row Row" was in the book "The Musical Atari" by Hal Glicksman, with music arranged by Laura Goodfriend (cc1984), and it seemed like a good place to start the experiments. The factory preset (A-85 Vocoder Chorus) was used to modify the input. I was a bit surprised.

Sam ROW.mp3

In case you can't remember what SAM sounds like, the input to the vocoder is in this next file.

Sam ROW - vocoder input.mp3

This was fun, but I can't see SAM being my vocalist of choice. There is a chance that a few program changes and the use of the speed-control POKE could change my mind.

How it was done. The score was programmed into MIDI MUSIC SYSTEM software. Voice 1 was the music and was sent to the Korg as MIDI data to play. Voice 2 was programmed to output timing notes on channel 3 to an Arduino. The Arduino would then change the logic state on STRIG(0) to let the Atari know it should speak the next word. The MIDI data flowed from the MIDIMax interface to the Korg and out the THRU port to the Arduino.

The song was recorded twice: once starting at middle C and then an octave lower. The two takes were then offset to produce the round.

I have used the Arduino interface a couple of times in the past to trigger events. It is just one optocoupler hooked up between the Arduino and the joystick port trigger.
http://atariage.com/forums/blog/572/entry-14044-sam-raaks-yuw/

The Arduino runs this program to read the NOTE ON data and set the trigger:

/* SAM Trigger
 * This program accepts MIDI data to sequence SAM's voice.
 *
 * When a NOTE ON command for the selected channel is
 * detected, the joystick trigger is turned on for
 * 50 milliseconds to trigger the next word to be said.
 */
int trigger = 3;
byte midiData = 0;
byte noteOnCommand = 146;  // number representing command and channel:
                           // 144 (NOTE ON) + 2 (channel 3)

void setup() {
  pinMode(trigger, OUTPUT);
  digitalWrite(trigger, LOW);
  Serial.begin(31250);
}

void loop() {
  while (Serial.available() < 1) {}  // wait for data
  midiData = Serial.read();
  if (midiData == noteOnCommand) {   // NOTE ON - channel 3
    digitalWrite(trigger, HIGH);
    delay(50);  // give the Atari a chance to read the joystick
  } else {
    digitalWrite(trigger, LOW);      // reset for next word
  }
}

The SAM Atari runs the following:

1 REM SAM ROW VOICE TRIGGERED BY MIDI
2 REM ---kPack 2019
3 REM Arduino monitors midi input and
4 REM sets trigger when word is to be
6 REM said. Audio output from Atari
7 REM is connected to line-in on the
8 REM microKorg.
10 DIM SAM$(255):SAM=8192
15 POKE 8208,50
20 RESTORE 1000:TRAP 20
30 READ SAM$
35 IF STRIG(0)=1 THEN 35
40 A=USR(SAM)
50 IF STRIG(0)=0 THEN 50
60 GOTO 30
70 REM
1000 DATA ROHW,ROHW,ROHW
1030 DATA YOHWR,BOH4T5
1040 DATA JEH5NT,LIY
1050 DATA DAWN1
1060 DATA DHAH4
1070 DATA STRIY4MM
1080 DATA MEHERAXLIY
1090 DATA MEHERAXLIY
1100 DATA MEHERAXLIY
1110 DATA MEHERAXLIY
1120 DATA LAY4F
1130 DATA IH4SS
1140 DATA BAA4T
1150 DATA AH4,DRIY4MM

The tracks from the microKorg were used as they were recorded; no additional processing was done with Audacity on the PC. The ATR contained in the zip file is a single-density DOS 2.5 disk. The autorun file is SAM. The disk also contains a test program (VOCODER.BAS) used to let SAM speak the words without the timing. This was used when experimenting with the voice programming on the Korg.
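The BASIC loop above is a simple two-phase handshake: wait for the trigger to go active, speak one word, then wait for the trigger to release before arming the next word. Here is the same logic as a hedged Python sketch (speak_words, read_trigger, and speak are my own illustrative names, standing in for the STRIG(0) polls and the USR(SAM) call):

```python
def speak_words(words, read_trigger, speak):
    """Two-phase handshake: for each word, wait for the trigger line to go
    active (0), speak it, then wait for release (1) before the next word.
    read_trigger() and speak() are stand-ins for STRIG(0) and USR(SAM)."""
    for word in words:
        while read_trigger() == 1:   # like line 35: wait for trigger press
            pass
        speak(word)                  # like line 40: call SAM
        while read_trigger() == 0:   # like line 50: wait for trigger release
            pass

# Simulated trigger line that presses and releases once per word
states = iter([1, 0, 0, 1, 1, 0, 0, 1])
spoken = []
speak_words(["ROHW", "ROHW"], lambda: next(states), spoken.append)
print(spoken)  # → ['ROHW', 'ROHW']
```

The release-wait matters: without it, a single long trigger pulse from the Arduino would race through several words at once.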
The MUS file is for use with the MIDI MUSIC SYSTEM.

SAMROW.zip

These photos were taken to remind future me of the setup before I got organized.
Had some fun recording this one. MP3 audio file batsam_mp3.zip
I've added a third computer to the MIDI chain. Computer #1 plays drums, Computer #2 runs S.A.M., and Computer #3 plays the lead. Each computer had a specific BASIC program written to read data from the joystick ports. For this example, Queen's "We Will Rock You" was arranged for the three computers. You can listen to the MP3 file and then decide if you want to read about the how.

SAM Rocks - mp3.zip

COMPUTER #0 - Control
The music was entered using the MIDI MUSIC SYSTEM software. Voice 1 (lead) was assigned to MIDI channel 1, Voice 2 (SAM) to MIDI channel 3, and Voice 3 (drums) to channel 10. All the editing was done using the Casio-481 keyboard. Once the playback was acceptable, the rest of the computers were programmed in BASIC and the Arduino interfaces were built.

COMPUTER #1 - MIDI Channel 10 - Drums
The same drum setup from the last two blog entries was used. Only one computer was required for this simple drum pattern.

COMPUTER #2 - MIDI Channel 3 - S.A.M.
Getting SAM to sing the words was accomplished by activating the trigger button at the right time. The Arduino would read the MIDI data stream. Whenever a 146 (NOTE ON + 2) command was received by the Arduino, the joystick state was changed to zero for 10 milliseconds. When the Atari program detected the joystick status change, the next word was sung. Before the Atari was programmed, the phoneme spellings of the words were created using the SAM Word Editor (another previous blog entry). Then a small test program was used to change the speed and pitch of the voices to meet the requirements of the music.

The BASIC program simply reads the joystick port, and when it reads 0 the next word is sung. This created a problem: if SAM was already speaking, it couldn't detect that the next word needed to be sung, resulting in skipped words. This problem was finally solved by changing the TEMPO in MMS from 175 to 160.
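An aside on that magic number 146: MIDI packs the command and the channel into a single status byte, with the command in the high nibble (0x90 = NOTE ON) and the zero-based channel in the low nibble. A quick sketch (the function names are mine, just for illustration):

```python
def midi_command(status: int) -> int:
    """High nibble of a MIDI status byte: the command (0x90 = NOTE ON)."""
    return status & 0xF0

def midi_channel(status: int) -> int:
    """Low nibble of a MIDI status byte, converted to a 1-based channel."""
    return (status & 0x0F) + 1

# 146 = 0x92 = 0x90 (NOTE ON) + 2, i.e. NOTE ON on channel 3
print(hex(midi_command(146)), midi_channel(146))  # → 0x90 3
```

So the Arduino only needs a single byte compare against 146 to know a channel-3 note has started.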
COMPUTER #3 - MIDI Channel 1 - Lead
If I had started the work on this computer first, the project may never have been finished. Since the work on SAM was already done, I had to continue on. Bad solder joints, program logic, and a misunderstanding of the MIDI data from the Casio and MMS all added to the confusion.

This was the first time a full 8 bits were required to be received by the Atari 8-bit (A8). Four additional optoisolators were added to the Arduino Uno. This way the Atari could look for the MIDI command number for NOTE ON or NOTE OFF. When NOTE ON was detected at PORTA (the joystick address), the computer would wait until the note number was set and the joystick trigger logic state changed. The note number was then used to calculate the index into an array holding frequency settings. The Arduino did most of the work decoding the MIDI data.

You would almost think that a NOTE ON would be followed by a NOTE OFF command before the next NOTE ON command. Not so when the Casio keyboard was being used to test the hardware and software. Eventually the Casio was hooked up to the IBM running a MIDI monitor. Press a key and a NOTE ON command was sent. Release the key and a NOTE ON command was sent. The only difference was the velocity setting: a NOTE ON with a velocity of 0 stops the note from being heard.

You would almost think that a NOTE ON would be followed by a NOTE OFF command before the next NOTE ON command. Not so when MMS was used to create the MIDI data stream. Sometimes a NOTE ON command was issued before the last note turned on was turned off. So, a NOTE OFF command received by the Arduino was sent to the Atari only if the note matched the last note turned on.

A function to set the output pins on the Arduino was used instead of the "case" used in the drum software. Each bit of the note byte is checked starting at bit 0. When the 7th bit is turned on, the A8 knows it is a command byte that starts the process of turning a note on or off.
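The two quirks above boil down to a small set of forwarding rules. Here is a hedged Python sketch of that logic (the names and structure are mine, not the actual Arduino source, which is in the zip): velocity-0 NOTE ONs are treated as note offs, and a NOTE OFF is forwarded only when it matches the last note turned on.

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

def filter_events(events):
    """events: (command, note, velocity) tuples; returns forwarded events.
    Rule 1: NOTE ON with velocity 0 is really a note off (the Casio quirk).
    Rule 2: a note off is forwarded only if it matches the last note turned
    on (the MMS overlapping-notes quirk)."""
    forwarded = []
    last_note_on = None
    for command, note, velocity in events:
        if command == NOTE_ON and velocity > 0:
            last_note_on = note
            forwarded.append(("on", note))
        elif command == NOTE_OFF or (command == NOTE_ON and velocity == 0):
            if note == last_note_on:        # drop stale note offs
                forwarded.append(("off", note))
    return forwarded

# Overlap: note 62 starts before note 60 ends, so 60's late off is dropped
print(filter_events([(0x90, 60, 100), (0x90, 62, 100),
                     (0x80, 60, 0), (0x90, 62, 0)]))
# → [('on', 60), ('on', 62), ('off', 62)]
```

The stale-off filter is what keeps the lead voice from being silenced mid-note when MMS overlaps two notes.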
The A8 then waits for the joystick trigger button to change state to indicate the note number is ready to be read. delay()s are used to give the A8 enough time to check PORTA and the trigger.

Project Software
The ATRs and Arduino source code are in the zip file. The computers in the MIDI chain have no monitor, so the programs for the drums and the lead will autorun the BASIC programs. SAM would not allow the autorun code to execute the BASIC program when appended onto the end of the AUTORUN.SYS file, so an easy name of "A" was used to reduce the typing required to RUN "D:A". The Arduino source code for each of the instruments is in the folder. Also, an ATR containing the MMS music and info files is included.

SAM Rocks - support files.zip

That should be enough information if I ever want to rebuild this setup. It's been nice to be able to write custom programs and build Arduino interfaces for each voice, but I have to start thinking about....
1. A polyphonic A8 program for general use.
2. Two-way communication between the Arduino and the A8.
3. Using commands beyond NOTE ON and NOTE OFF.
4. Programming using MAC/65.
5. An editor for SAM to include pitch and speed manipulation.
6. Getting a new furnace before winter sets in.
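For anyone rebuilding the lead computer's frequency array, the standard equal-temperament relation between a MIDI note number n and its pitch is f = 440 · 2^((n-69)/12); the A8 table just holds values derived from it. A quick sketch (the table here is illustrative, not the project's actual POKEY settings):

```python
def midi_note_to_hz(note: int) -> float:
    """Equal-temperament frequency for a MIDI note (69 = A4 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

# Precompute a lookup table like the one the A8 indexes by note number
freq_table = [midi_note_to_hz(n) for n in range(128)]

print(round(freq_table[69]), round(freq_table[60], 2))  # → 440 261.63
```

In practice each frequency would be converted once, offline, into the matching POKEY register value, so the BASIC program only does an array lookup per note.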