This study by the Nielsen Norman Group literally changed how I see the world when it came out.
From its own summary: "Across 33 rich countries, only 5% of the population has high computer-related abilities, and only a third of people can complete medium-complexity tasks."
When you read what they consider medium-complexity tasks, you will probably headdesk. But this is the computer skill level of the general population. When you make applications, you have to consider that these are your users, and you need to accommodate them.

Which also leads me to why I feel this study needs a follow-up. In the roughly ten years of UX work since this study, we have seen a significant increase in usability, but I believe a lot of other computer skills have atrophied in its wake. In that time, we built out an infrastructure around simplifying computer use for the average person, and I believe we are now victims of our own success. What a lot of us consider basic computer skills has been seen as black magic by the general public for years.
At my alma mater, they are calling for the state to bring back typing classes because kids don't know how to use a file path, or sometimes never learned how to type at all. PC Gamer had a similar piece about this a couple of years ago. Part of this is the shift to mobile and tablets, and it's not entirely a bad thing. In a lot of cases we have created expert Photoshop users who don't know where their photos are saved, and frankly don't care. They are comfortable and complacent in their walled-garden app stores. I just hope the search function doesn't get bugged on them one day, because that will cause a lot of chaos for them.
Every day I encounter people who have only ever used their phone for everything you'd use a computer for, who don't know how to use a mouse or keyboard and jab the monitor with their fingers. People who don't know the words upload or download. People who don't have an email address. Or people who insist they do know how to use a computer and are very good at them, and then ask me for help printing something from a page that has a giant blue PRINT button that actually downloads a virus.
I think a lot of tech-savvy millennials take for granted why a lot of office jobs are seen as skilled labor: computer skills actually are complicated skills that not just anyone has. You might think "all I'm doing is using MS Office to type shit up and make basic spreadsheets and manage emails," but there are a lot of librarians with master's degrees who absolutely struggle with that stuff even after "15+ years of experience with MS Office."
Whether the work being done is necessary for the economy is another question, but when it comes to salaries being set by supply and demand, the supply of people competent enough with computers to do these "basic easy things" is actually much smaller than you think. It just seems bigger because your own individual life is mediated by computers.
This is extremely resonant. At both my current and previous job, I've had director-level employees ask me to do extremely basic things for them — like delete a slide from a PowerPoint presentation or edit text.
At my first office job out of college, way back in 2010, we realized that most of the salespeople did not know the basics of using a computer. So we created a short and extremely simple test to be used during the interview process, with questions like "Which of these is an email address?" And let me tell you, the vast majority of applicants failed. (So much so that they would usually just hire them anyway, lol.)
And like, no hate or anything. It's actually kind of impressive that people can function in an office setting without knowing a lot of this stuff. But it obviously is (and has been) a concerning trend as technology consumes more and more of our lives!
I guess a silver lining is that if you're savvy, you should realize that your own knowledge and skillset is more valuable than you may initially assume. I'm very glad I actually had typing and word processing classes in high school. (Bring 'em back!)
If you know the verbs "click," "copy-paste," "drag and drop," "unzip," "upload," and "right-click," then you are in a minority.
In another life I was doing tech support for a small specialized software company. Its clients were mostly doctors with a private practice, sometimes their office admin people.
Most of the people who called the support line were at least 50 years old. All incredibly smart and accomplished and able to do difficult things, but not very familiar with computers in general.
I distinctly remember getting a call from this surgeon. Incredibly smart woman. She cut people open and put them back together as her fucking job, come on. She'd been running her own practice for decades. Clearly a skilled person. And she was having issues with her software. It was "broken", it "looked different" and she didn't know what to do.
So I ask her to describe what is different and she's having trouble being specific. She doesn't recognize anything on the screen. At this point I don't even know what part of the software she accessed by mistake. Maybe a settings page?
Eventually I ask her to tell me everything she sees on the screen, describing it in as much detail as possible, top to bottom, right to left.
And she describes: "a small apple, then it says 'Finder,' and next to it it says..."
"ok, what's under the apple and those words"
"under that there's some colors and then a little rectangle thing and next to it a red dot and a yellow and green one, but there's more red dots under that..."
I realized she had accidentally taken the software window out of fullscreen. She didn't recognize that the desktop, the icons, the windows, etc. were all different "layers" of things. Her screen was just a chaos of shapes and words and fragments unrelated to each other. I imagine it really did look "broken" to her.
(Very patiently I managed to guide her, keystroke by keystroke, to launch the screensharing app we used for tech support, and once I could take over her screen I showed her what was going on. It took a while, but she wanted to learn, and she got it.)
I used to work for a small language department at a major university, as a combination office administrator, IT guy, and "whatever needs to get done" person. the faculty was full of internationally renowned authors and researchers and experts, and nearly all of them were absolutely terrible at dealing with their technology. and this was in the mid-2000s, before smartphones and tablets became dominant
the problem was that they couldn't view their computers (and related tech) as systems with behaviors; everything was a separate, arbitrary piece that didn't connect to anything else. this is related to a longer post I made a while back about people who can't model complex systems in their heads. but even the linguists who demonstrably could model complex systems, because they did it every day for their research, couldn't do it with their technology, because they didn't have a foundation to start with, and nobody teaches people how to use computers that way
one of the senior faculty was a language pedagogist and we talked about this sort of thing quite a bit, how computer docs tend to either be "X For Dummies" books that focus on individual tasks, or technical documentation that assumes a minimal level of understanding already, and very little in between
this plus the other decades worth of tech support and related work has led me to have a project on the back burner since basically forever: a book targeted at "smart people who are not computer people". a book about understanding concepts, not about how to do specific tasks. about building models in your head about how a computer works, even very simple models, to give people that foundation to build on
there would be four core aspects to the book:
-
giving the reader a foundation for building a mental model on, specifically the idea that computers receive, store, share, and modify information, and that everything a computer does is one of these things. and on top of that, the instructions that a computer follows are also information, and can be shared, modified, etc the same way that, say, a picture of a cat can be
-
explaining the origins of computer terminology, because they are incredibly useful for building that foundation. what does the word "interface" actually mean, for example, and why does it mean the same thing in both "user interface" and "Digital Visual Interface" (aka DVI)? because an interface is the location where two things overlap and interact, a word that predates its use in computer technology; a DVI port is where the computer interfaces with the display, and the UI is where the computer interfaces with a human being
-
explaining the concept of "states", in the sense that computers (and phones and tablets and whatnots) will shift between modes of operation, either on their own or when told to by a user, and that these modes can overlap and interact. not knowing what state or mode your device or application is in will lead to confusion and frustration, and not understanding what states are in the first place will just lead to complete chaos
-
a rundown of basic concepts that are present on most technological systems, but often in very different ways and often using different terminology. "folders" and "directories" are really the same thing, for example, and "bookmarks", "shortcuts", and "favorites" are usually referring to the same concept, although what they're pointing at varies wildly. instead of explaining how to use the Mac OS Finder or Windows Explorer, explain the idea and what the common functionality is
this book has not happened because A) I'm deeply disabled, B) taking care of basic shit for my partner and me under capitalism has taken up all my spoons, and C) the odds of anybody reading such a book are incredibly slim, but watching the influx of "easier" devices like phones and tablets just make the problem worse keeps making me feel like I should somehow find the time and energy
we think this book is a really excellent idea, and we strongly encourage it
interestingly, people our age did get tutorials on this basic stuff. our first DOS computer came with a little IBM-branded program that explained concepts such as the "monitor", "keyboard", "files" etc. our first Mac came with an Apple program that explained "files", "the desktop", "windows", "the mouse", "click", "drag" etc. when this stuff was new, everyone was well aware that it was necessary to explain it.
we think what happened was, people who are... pretty much the exact age we are, give or take a couple years... all got very excited by computers as kids and learned our way around them and there was an assumption that someday everyone would learn this stuff as kids and understand it intuitively and there would be no need to teach it anymore
like this wasn't just some minor thing, you'd find tech policy people talking about how they envied us "digital natives" (yes, this terminology is very revealing about how colonial thinking is baked into society...) ... us young 'uns who would understand stuff deeper than they did, and about how every future generation would be the same.
this didn't happen, for two key reasons:
-
intuition is built on factual knowledge. there still needs to be a base layer of explanation; if you stop giving the explanations, people stop knowing the things. technology without understanding ceases to be technology, it becomes magic.
-
corporations that own platforms don't actually want users to be empowered, they want users to pay money. that turns out to be at odds with designing systems to be understood and teaching people how they work.
we should elaborate on (2) a bit more. in those days everyone could plainly see - and it's still true - that the most important thing about computers is how they expand the human potential for discovery and creation. computer companies did draw their strength from that wellspring, for a while, from that public desire to go on a voyage of knowledge and creativity together. unfortunately the mechanisms of capitalist enclosure are well studied and easily put into practice, it just takes a while.
like, the capitalist angle on this is basically that people who know things have too much power and independence. if we understand how computers work we might get the idea we shouldn't have to pay for things that are easy (ringtones were an early step down this path, getting people to pay money again for stuff the computer could trivially do. modern game consoles try to sell you every wallpaper individually!!!! with no ability to create your own. wtf)
or worse yet we might make our own stuff and not pay the company at all
so there's really no incentive for corporations to teach how computers work, and it shouldn't be surprising that that fell by the wayside.
and besides again, it's fine anyway because everyone knows. right?
