The CWAL Rubber Room

Wow, talk about brain fart.
Posted by TheDeamon from 70.196.3.*, on July 14, 12017 at 09:19:43:

Computers aren't "bad," but the way the business world adjusted to the new technological reality was handled badly, and the precedents set during that transition have left a legacy that isn't entirely good.

It was needed, badly, when initially implemented, because even many of the people in our own generational sub-set (sitting on the split between "X" and "the Millennials") didn't really have much interaction with technology in their lives outside the workplace.

So take some random person off the street who has little to no practical experience working with a computer, aside from tightly controlled "lab time" that might have amounted to a few hours at school (in total, or per week).

Then promptly throw them into a job that requires using a non-intuitively designed computer (remember, early versions weren't very user-friendly) as a core part of the work, and you're in for a lot of pain and suffering for everyone involved.

Intuitive use of computers was basically a non-thing until the late 1990's, and even then only if you happened to be on one of the newer systems and not caught in a "legacy system" from before (and more than a few of those still remain). Since then, though, we've had the weird confluence of computers becoming easier to use even as an increasing number of people deal with computer technology in their daily lives outside the workplace.

The days of the techno-plebe coming off the street and being expected to flawlessly operate a Unix terminal from the command line alone are basically over. The command-line-only jobs still exist, but they're highly specialized and few and far between, and the (young) techno-plebe is becoming nearly impossible to find as well.

But the requirement for hire remains pretty much where they set it back in the early 1990's, if it hasn't been raised outright because of the ongoing problems with K-12.