
Was MS-DOS (and similar) an entry barrier to computing?



Ironically, in the early days DOS was simple, but price was the barrier. By the time PC prices came down enough for home users, DOS had grown very complex and there was a usability barrier.

 

DOS was incredibly simple until the 286 came along and suddenly memory management became a thing.

 

There was hardly anything for the home user to configure; you didn't really need to fiddle with CONFIG.SYS or AUTOEXEC.BAT (in fact, CONFIG.SYS didn't even exist until DOS 2.0).
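Compare that to what a home user might be staring at by the DOS 5/6 era. Something like this (a sketch from memory with typical file names; CDROM.SYS here is a stand-in for whatever driver your drive actually shipped with):

    REM CONFIG.SYS -- memory juggling, late-DOS style
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    FILES=30
    BUFFERS=20
    DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001

    REM AUTOEXEC.BAT -- TSRs and drivers loaded high at boot
    @ECHO OFF
    PROMPT $P$G
    PATH C:\DOS;C:\UTILS
    LH C:\DOS\SMARTDRV.EXE
    LH C:\DOS\MOUSE.COM
    LH C:\DOS\MSCDEX.EXE /D:MSCD001

Get the order of those first three CONFIG.SYS lines wrong and nothing would load high.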

 

TSRs and drivers didn't really become necessary until the second half of the 80s.

 

Early disk games auto-booted or required very simple commands to start.

Edited by rpiguy9907

People have always been deterred from going with third party solutions or any solution whose future is unknown.

 

Which is why MS-DOS remained the standard well into the 90s: it had compatibility and a certain future. Plenty of other options were available -- OS/2, any number of Unix variants, GEM. Not even Windows could unseat DOS at first, because until Windows 95 forced the issue, Windows ran as just another application on top of DOS that you had to invoke with the "win" command. If you didn't need it, you didn't bother to run it.
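For anyone who never saw it, a minimal sketch (standard default paths, not any particular machine): a typical AUTOEXEC.BAT of the era simply ended by starting Windows, and quitting Windows dropped you straight back at the DOS prompt.

    @ECHO OFF
    PATH C:\DOS;C:\WINDOWS
    PROMPT $P$G
    REM Windows 3.x was just another program you launched from DOS:
    WIN

Leave off that last line and the machine booted to plain DOS, Windows never entering into it.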


It's too bad MS-DOS didn't include multitasking or task-switching capability. It would have made people's lives easier, and the hardware was certainly capable of it. People did see the advantages of Windows, one being multitasking. But organizations have enormous investments in their existing data and human resources, not to mention the hardware and software investment. That type of change takes time.


No... plenty of non-technical people used DOS. If it was so archaic that it scared people away, it never would have become the dominant platform at a time when everyone else had gone full GUI.

 

This, plus people seem to "forget" about crucial software such as Norton Commander, which made all the alleged command-line sorcery a breeze. In fact, my actual Win 10 GUI at the moment consists of two (or three) Total Commander windows.

 

These kinds of threads seem fertile ground for myth-making and wild statements anyway :) The Amiga was a plastic toy? Sure, all the media folks for whom it was a standard for years had it backwards. Same for the Atari ST, what with the MIDI/tracker revolution it helped to spark. Single-tasking was never a barrier to creativity/productivity. You could maybe even argue that procrastination has increased wildly now that you can have your spreadsheet open next to Minesweeper, or a browser with 30 tabs full of delicious distractions.


It's true that when people multitask they can lose focus. But sometimes having two things open on the computer helps people do one task. Back when Lotus 1-2-3 didn't do charts and graphs, it was really helpful to be able to switch between Harvard Graphics and the spreadsheet. Or when WordPerfect couldn't open more than one file at a time, having two sessions open for copy-paste made it so much easier. Or looking something up in a database without closing your work. Background processing wasn't normally needed, but if you ever printed a large document on a slow printer, it was nice to be able to switch to the next task. Advanced users could script batch processing of data between programs that otherwise didn't have the function, saving hours of manual work or thousands of dollars in custom software (a sketch follows below). Fortunately, MS-DOS didn't come with any games as interesting as Minesweeper, or the internet.
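For example, a nightly job might have looked something like this -- purely a sketch, where EXPORT and CONVERT are made-up stand-ins for whatever single-purpose tools a given office actually had:

    @ECHO OFF
    REM NIGHTLY.BAT -- chain programs that couldn't talk to each other.
    REM EXPORT and CONVERT are hypothetical stand-in utilities.
    EXPORT CUSTOMER.DBF CUSTOMER.TXT
    CONVERT CUSTOMER.TXT REPORT.PRN
    REM Copying to LPT1 sends the result straight to the printer.
    COPY REPORT.PRN LPT1
    ECHO Done -- report is on the printer.

Kick that off before going to lunch and the manual re-keying disappears.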

 

In those days, learning the application software was more of a barrier than learning a couple of DOS commands. Fortunately, everything came with one or more Bible-thick books. Seriously, I miss those books; they were very helpful.

Edited by mr_me

I remember being really ticked off at having to leave DOS for Windows operating-system controls most of the time. Other than the config and autoexec stuff, I felt like I was handing the controls over to Microsoft, which ultimately I was. I never became a fan of Linux. I know how to use it, and still do when necessary, but it's always been undesirable to me for some reason. I guess if we're talking about that brief period before Windows 3.1 went mainstream, I could see people having a hard time taking up a new PC. But if you started on the very first PCs in '79-'80, things just seemed to get more powerful and easier to use, since you'd been doing things the hard way from the start. Use a cassette loader for a couple of years, then tell me how much you hate DOS.



 

Single-tasking was the norm in desktop computers; people took it for granted. When multitasking arrived, it was like, "what kind of sorcery is this?"

 

We are also judging these computers in hindsight, compared to what we have now. But back then we judged them against what we had before: More memory! A faster 16-bit CPU! A higher-capacity disk! A higher-resolution screen! I can do so much more with this!


From someone who sold computers in the MS-DOS days... no, it wasn't an obstacle.
We set up menu programs or autoboot disks for customers all the time, so they never had to type commands (something like the sketch below).
If they wanted to use MS-DOS for something, we'd often show them how to do it. No big deal.
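A typical menu was nothing more than a batch file. A sketch, assuming the CHOICE command from MS-DOS 6 (WP and 123 stand in for whatever the customer actually had installed):

    @ECHO OFF
    :MENU
    CLS
    ECHO  1. Word processing
    ECHO  2. Spreadsheet
    ECHO  3. Quit to DOS
    CHOICE /C:123 Pick a number
    REM IF ERRORLEVEL n is true for any code >= n, so test highest first.
    IF ERRORLEVEL 3 GOTO END
    IF ERRORLEVEL 2 GOTO SHEET
    CD \WP51
    WP
    GOTO MENU
    :SHEET
    CD \LOTUS
    123
    GOTO MENU
    :END

Customers never saw a C:\> prompt unless they picked option 3 on purpose.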


  • 3 weeks later...

I certainly don't miss dealing with memory allocation and that damned DEBUG command.

 

Ugh.

Ah, the serenity of a halted system after an MCB chain corruption.

 

CodeView was a cool debugger, except for the amount of memory it consumed. Same for Turbo Debugger.

 

Watcom Debugger was better than stone knives and bearskins.

 

Sometimes DEBUG was still required anyway :(
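For the uninitiated: DEBUG took terse one-letter commands, and you could redirect a script into it. The classic toy example -- assembling an eight-byte .COM program that prints the letter A (the file name SAYA.COM is just an example):

    A 100
    MOV AH,02
    MOV DL,41
    INT 21
    INT 20

    N SAYA.COM
    RCX
    8
    W
    Q

The blank line ends assembly mode; N names the output file, RCX sets the length to eight bytes, W writes it, and Q quits. Run DEBUG < SAYA.SCR and you've "compiled" a program. Now imagine patching somebody's accounting software that way.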


All this talk about multitasking... you know the iPhone couldn't even multitask until iOS 4 in 2010? Yet it still did pretty well without that feature.

 

Most people do not care a whit about multitasking. Humans can't multitask - it's a myth. The most we can do is task switch quickly, but we lose efficiency when we do so. So nobody really needs a multitasking OS and most people won't ever use more than one program at a time. I personally do, but I'm a weirdo like that. (Even if I do, I'll usually just have one app running in the background on a second monitor while I do something else on the primary monitor.)

 

You can argue that the command line interface was a barrier to computing, but that's like arguing that the stick shift was a barrier to car ownership. First, plenty of people bought both products anyway, and plenty of those and other people still rely on the earlier interface because it can do things that the newer one can't. So... the most you can say is it was a barrier to people who couldn't figure out that earlier interface, be it the command line or stick shift. But that's not to say the newer interface is better, or even necessarily simpler. It's just different in a way that more people can seem to grasp.



 

Human and computer multitasking are two different things. I can't imagine my PC being able to execute only one program at a time anymore; it'd mean I'd have to close Firefox to fire up Thunderbird, copy something from an email, close Thunderbird, and only then open Notepad and paste it in... instead of just alt-tabbing, and without the chance to listen to music during all these tedious operations.


A single-core, single-thread CPU doesn't really multitask the way we think. My Pentium III can run iTunes, download through Firefox, and program a large EPROM, all while I'm playing Gyruss.

 

Does it look like it's multitasking? You bet. Is it really? No. It's just context switching very, very fast to give the illusion of multitasking.

 

Consider the granularity and speed of switching: at what speed does it appear to be multitasking, and at what speed does it appear to be context switching?

 

A sufficiently fast rig will appear to run two or more things simultaneously to a sufficiently slow human. With timeslices of a few tens of milliseconds or less, the switching happens well below the threshold of human perception.



In some ways the CLI was easier for low tech-savvy people. These days everyone is used to tech, phones, and GUI interfaces in general. But in the 90s, I would frequently run into people who did not understand tech at all, or who were afraid of it, afraid of screwing something up. They needed a computer to do their job but would demand step-by-step instructions to run the tasks they needed.

 

In DOS/CLI, this was easy: you put all the commands into a batch file and told them to type the name of the batch file.
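E.g. a hypothetical LETTER.BAT (assuming WordPerfect 5.1 in C:\WP51 -- adjust to taste); you told them "type LETTER and hit Enter" and they were off:

    @ECHO OFF
    REM LETTER.BAT -- the user just types LETTER at the C:\> prompt.
    REM Assumes WordPerfect 5.1 installed in C:\WP51.
    C:
    CD \WP51
    WP
    CD \

One word to type, and when they quit the program they were back where they started.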

 

But with a GUI, trying to give such step-by-step instructions was a nightmare. It was: you have to double-click this icon, but the icon may or may not be on screen; it may be in a window that's minimized, or the app may already be running and you have to maximize it. If they clicked out of the app, you'd have to tell them how to get back into it. And they would panic if anything didn't go like they expected.

Even the concept of double-clicking was hard for some of these people to grasp; they would generally click too slowly at first.

 

Like I said, today people are pretty tech-savvy and I never run into this anymore. Back then it was not uncommon.


^^ that's a good point.

 

For a low-skilled individual with only a few tasks to perform, the CLI could be easiest, especially if frequently used things were automated down to a single keypress or command.

 

For a smart/curious person who wants to think about things other than the needs of the computer's operating system, a WIMP GUI would be better. Until they drag components of their System Folder to the Trash (which was easily done in early Mac OS).

 

For the person who really cares about how things work, the command line will always be the most powerful and versatile.

 

My personal preference will always be the nice GUI, but I'm glad we have a robust CLI in every modern OS now as an option.


  • 2 weeks later...


 

 

Wow. I wonder what most people's employers would think nowadays if their employees stated that "nobody really needs a multitasking OS and most people won't ever use more than one program at a time." Where I work, that would pretty much be grounds for dismissal.

 

So you mean there's no need to download or install a file in the background while you write an email to someone? No need to generate a PDF while doing something else? No need to make edits to video while rendering an image sequence in another app? No need to work in a spreadsheet or word processor while the computer does a virus scan or indexes files? No need to have someone run a remote desktop connection to your machine while showing them something in another app? No need to have two audio apps share information with one another while playing a sequence? No need to run a video/screen capture app while running another program? I could go on, but I think just about everyone on this forum can find multiple cases in which they made good use of a multitasking OS to save time and do things more efficiently.


Which, adjusted for inflation, is equivalent to "only" $3,519.00 in 2019.

 

Yes, to get back on topic: cost was the initial barrier to using a DOS-based computer, not the complexity of learning commands for file management and launching programs. Backing all the way up to the initial release of the IBM 5150 PC: it was sold as a professional, commercial computer that just happened to fit on a desktop. And it was sold part by part. The "base" price of the computer included the motherboard and the case; no power supply, no RAM, no CGA video card, no keyboard, no monitor, no disk drives... you even had to pay separately for the power cord!

 

By the time you assembled a complete system that would boot up and display something on a monitor screen, you were in the neighborhood of $4,000 (1981 dollars...). For the first few years the most serious competition to IBM was the Apple II. (Remember that ad: "Welcome, IBM. Seriously.") A lot of businesses and colleges used Apple IIs to run VisiCalc.

 

Money was the barrier to the first GUI-based computers, too; remember, the Lisa was a $10,000 computer when it was introduced. That's why my first computer was the "first computer for under $100" -- the Timex/Sinclair 1000...

Edited by almightytodd

^^ same

These were the days when access to a university computer lab made a lot more sense than a home purchase for anything other than "toy" computers like that. My T/S-1000 was $39.99 on clearance at Boscov's, right next to the blowout Vectrex.

 

The rate of obsolescence was very fast back then, too. Moore's Law was strongly felt.



 

Yep. I had a $100 Commodore 64 "toy computer" at home for games but a $100,000 Sun workstation at the college lab! It was not until I had been working for a decade that I bought my own computer with my own money. Why should I have, when I had a $100,000 AIX workstation at work and a $3,000 employer-owned "work from home" PC at home?

 

At the time we were encouraged to use our workstations and the PCs for anything we wanted educationally off work hours, i.e., just no offensive or adult-only content and no side businesses.

 

Ah, the good old days... I remember in late December '99 the DOOM source code was open-sourced (GPLed). We all scrambled, even on Christmas '99, frantically patching and compiling to get DOOM built for AIX so we could go to work on January 1, 2000 and play DOOM all day! Ah, to be a twenty-something single geek in the late '90s.

 

The really sad part is we could barely play DOOM on our workstations due to graphics card bugs!!! Prior to DOOM, only CAD/CAM apps stressed the graphics cards/drivers on AIX. Playing DOOM was added to official QA testing for the next round of graphics cards.

Edited by thetick1
