
Xuelder
@Xuelder

This study by the Nielsen Norman Group literally changed how I see the world when it came out.

From its own summary: "Across 33 rich countries, only 5% of the population has high computer-related abilities, and only a third of people can complete medium-complexity tasks."

When you read what they consider medium-complexity tasks, you will probably headdesk. But this is the computer skill level of the general population. When you make applications, you have to consider that these are your users, and you need to accommodate them. Which also leads me to why I feel this study needs a follow-up. In the roughly ten years of UX improvements since this study came out, we have seen a significant increase in usability, but I believe a lot of other computer skills have atrophied in its wake. In that time, we built out an infrastructure around simplifying computer use for the average person, and I believe we are now victims of our own success. What a lot of us consider basic computer skills has been seen as black magic by the general public for years.

At my alma mater, they are calling for the state to bring back typing classes because kids don't know how to use a file path, or in some cases never learned how to type at all. PC Gamer ran a similar piece about this a couple of years ago. Part of this is the shift to mobile and tablets, and it's not entirely a bad thing. In a lot of cases we have created expert Photoshop users who don't know where their photos are saved, and frankly they don't care. They are comfortable and complacent in their walled-garden app stores. I just hope the search function doesn't break on them one day, because that will cause a lot of chaos for them.


shel
@shel

Every day I encounter people who have only ever used their phone for everything you would use a computer for, who don't know how to use a mouse or keyboard and jab the monitor with their fingers. People who don't know the words "upload" or "download". People who don't have an email address. Or people who insist they do know how to use a computer and are very good at them, and then ask me for help printing something from a page that has a giant blue PRINT button that actually downloads a virus.

I think a lot of tech-savvy millennials take for granted that a lot of office jobs are seen as skilled labor because computer skills actually are really complicated skills that not just anyone knows how to do. You might think "all I'm doing is using MS Office to type shit up and make basic spreadsheets and manage emails", but like, there are a lot of librarians with master's degrees who absolutely struggle with that stuff even after "15+ years of experience with MS Office".

Whether the work being done is necessary for the economy is another question, but when it comes to salaries being set by supply and demand, the supply of people competent enough with computers to do these "basic easy things" is actually much smaller than you think. It just seems bigger because your own individual life is mediated by computers.


mattcolewilson
@mattcolewilson

This is extremely resonant. At both my current and previous job, I've had director-level employees ask me to do very basic things for them, like deleting a slide from a PowerPoint presentation or editing text.

At my first office job out of college, way back in 2010, we realized that most of the salespeople did not know the basics of using a computer. So we created a short and extremely simple test to be used during the interview process, with questions like: "Which of these is an email address?" And let me tell you, the vast majority of applicants failed. (So much so that they would usually just hire them anyway, lol.)

And like, no hate or anything. It's actually kind of impressive that people can function in an office setting without knowing a lot of this stuff. But it obviously is (and has been) a concerning trend as technology consumes more and more of our lives!

I guess a silver lining is that if you're savvy, you should realize that your own knowledge and skillset is more valuable than you may initially assume. I'm very glad I actually had typing and word processing classes in high school. (Bring 'em back!)


shel
@shel

If you know the verbs "click", "copy-paste", "drag and drop", "unzip", "upload", and "right click", then you are in a minority


NoelBWrites
@NoelBWrites

In another life I was doing tech support for a small specialized software company. Its clients were mostly doctors with a private practice, sometimes their office admin people.

Most of the people who called the support line were at least 50 years old. All incredibly smart and accomplished and able to do difficult things, but not very familiar with computers in general.

I distinctly remember getting a call from this surgeon. Incredibly smart woman. She cut people open and put them back together as her fucking job, come on. She'd been running her own practice for decades. Clearly a skilled person. And she was having issues with her software. It was "broken", it "looked different" and she didn't know what to do.

So I ask her to describe what is different and she's having trouble being specific. She doesn't recognize anything on the screen. At this point I don't even know what part of the software she accessed by mistake. Maybe a settings page?

Eventually I ask her to tell me everything she sees on the screen, describing it in as much detail as possible, top to bottom, right to left.

And she describes "a small apple, then it says "finder" and next to it it says..."

"ok, what's under the apple and those words"

"under that there's some colors and then a little rectangle thing and next to it a red dot and a yellow and green one, but there's more red dots under that..."

I realized she had accidentally taken the software window out of fullscreen. She didn't recognize that the desktop, the icons, the windows, etc. were all different "layers" of things. Her screen was just a chaos of shapes and words and fragments unrelated to each other. I imagine it really did look "broken" to her.

(Very patiently I managed to guide her keystroke by keystroke to launch the screensharing app we used for tech support and once I could take over her screen I showed her what was going on. It took a while, but she wanted to learn and she got it)


apocryphalmess
@apocryphalmess

I used to work for a small language department at a major university, as a combination office administrator, IT guy, and "whatever needs to get done" person. the faculty was full of internationally-renowned authors and researchers and experts, and nearly all of them were absolutely terrible at dealing with their technology. and this was in the mid-2000s, so this was even before smartphones and tablets became dominant

the problem was that they couldn't view their computers (and related tech) as systems with behaviors; everything was a separate, arbitrary piece that didn't connect to anything else. this is related to a longer post I made a while back about people who can't model complex systems in their heads. but even the linguists who demonstrably could model complex systems, because they did it every day for their research, couldn't do it with their technology, because they didn't have a foundation to start with, and nobody teaches people how to use computers that way

one of the senior faculty was a language pedagogist and we talked about this sort of thing quite a bit, how computer docs tend to either be "X For Dummies" books that focus on individual tasks, or technical documentation that assumes a minimal level of understanding already, and very little in between

this plus the other decades worth of tech support and related work has led me to have a project on the back burner since basically forever: a book targeted at "smart people who are not computer people". a book about understanding concepts, not about how to do specific tasks. about building models in your head about how a computer works, even very simple models, to give people that foundation to build on

there would be four core aspects to the book:

  1. giving the reader a foundation for building a mental model on, specifically the idea that computers receive, store, share, and modify information, and that everything a computer does is one of these things. and on top of that, the instructions that a computer follows are also information, and can be shared, modified, etc. the same way that, say, a picture of a cat can be (see the little sketch after this list)

  2. explaining the origins of computer terminology, because they are incredibly useful for building that foundation. what does the word "interface" actually mean, for example, and why does it mean the same thing in both "user interface" and "Digital Visual Interface" (aka DVI)? because an interface is the location where two things overlap and interact, a meaning that predates its use in computer technology; a DVI port is where the computer interfaces with the display, and the UI is where the computer interfaces with a human being

  3. explaining the concept of "states", in the sense that computers (and phones and tablets and whatnots) will shift between modes of operation, either on their own or when told to by a user, and that these modes can overlap and interact. not knowing what state or mode your device or application is in will lead to confusion and frustration, and not understanding what states are in the first place will just lead to complete chaos

  4. a rundown of basic concepts that are present on most technological systems, but often in very different ways and often using different terminology. "folders" and "directories" are really the same thing, for example, and "bookmarks", "shortcuts", and "favorites" are usually referring to the same concept, although what they're pointing at varies wildly. instead of explaining how to use the Mac OS Finder or Windows Explorer, explain the idea and what the common functionality is
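
to make point 1 concrete: here's the kind of tiny demo the book might open with. a minimal python sketch, with made-up file names ("cat.jpg" could be any image on disk), showing that a program's instructions are stored and copied exactly like a picture of a cat:

    # point 1 in miniature: a program's instructions are just stored
    # information, handled the same way as any other file.
    import shutil

    # read this very script as raw bytes (works when run as a script)...
    with open(__file__, "rb") as f:
        program_bytes = f.read()
    # ...and read an image exactly the same way
    with open("cat.jpg", "rb") as f:
        picture_bytes = f.read()

    print(type(program_bytes) == type(picture_bytes))  # True: both are bytes

    # and both can be copied, shared, and modified identically
    shutil.copyfile(__file__, "program_copy.py")
    shutil.copyfile("cat.jpg", "cat_copy.jpg")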

this book has not happened because A) I'm deeply disabled, B) taking care of basic shit for my partner and me under capitalism has taken up all my spoons, and C) the odds of anybody reading such a book are incredibly slim. but watching the influx of "easier" devices like phones and tablets just make the problem worse keeps making me feel like I should somehow find the time and energy


ireneista
@ireneista

we think this book is a really excellent idea, and we strongly encourage it

interestingly, people our age did get tutorials on this basic stuff. our first DOS computer came with a little IBM-branded program that explained concepts such as the "monitor", "keyboard", "files" etc. our first Mac came with an Apple program that explained "files", "the desktop", "windows", "the mouse", "click", "drag" etc. when this stuff was new, everyone was well aware that it was necessary to explain it.

we think what happened was, people who are... pretty much the exact age we are, give or take a couple years... all got very excited by computers as kids and learned our way around them and there was an assumption that someday everyone would learn this stuff as kids and understand it intuitively and there would be no need to teach it anymore

like this wasn't just some minor thing, you'd find tech policy people talking about how they envied us "digital natives" (yes, this terminology is very revealing about how colonial thinking is baked into society...) ... us young 'uns who would understand stuff deeper than they did, and about how every future generation would be the same.

this didn't happen, for two key reasons

  1. intuition is built on factual knowledge, there still needs to be a base layer of explanation, if you stop giving the explanations people stop knowing the things. technology without understanding ceases to be technology, it becomes magic.

  2. corporations that own platforms don't actually want users to be empowered, they want users to pay money. that turns out to be at odds with designing systems to be understood and teaching people how they work.

we should elaborate on (2) a bit more. in those days everyone could plainly see - and it's still true - that the most important thing about computers is how they expand the human potential for discovery and creation. computer companies did draw their strength from that wellspring, for a while, from that public desire to go on a voyage of knowledge and creativity together. unfortunately the mechanisms of capitalist enclosure are well studied and easily put into practice, it just takes a while.

like, the capitalist angle on this is basically that people who know things have too much power and independence. if we understand how computers work we might get the idea we shouldn't have to pay for things that are easy (ringtones were an early step down this path, getting people to pay money again for stuff the computer could trivially do. modern game consoles try to sell you every wallpaper individually!!!! with no ability to create your own. wtf)

or worse yet we might make our own stuff and not pay the company at all

so there's really no incentive for corporations to teach how computers work. so it shouldn't be surprising that that fell by the wayside.

and besides, it's fine anyway, because everyone knows. right?


gregory
@gregory

in reply to @Xuelder's post:

I feel there's an edge of, "if you don't use it, you won't need it" in modern society that wants to cut out all cruft and hone children into no-frills workers (and also obedient consumers).

This is added atop the already horrendous digital divide between those who have the time and the money to familiarize with computers and those who don't.

It really is a good way to put it, being victims of our own success, because the success condition has been cutting out cruft and making things simpler and more efficient, but it's done in the context of being driven harder to work more and consume more.

Not everyone has to become a senior-level programmer, but the ability to choose to become more skilled despite easier UX is rapidly being taken away even among the privileged. If I wanted to write a dystopian novel, I'd craft a story where everyone has lost the ability to code anything new, and they're fed algorithmic recycled content like some fucked-up cultural human centipede.

One of the only pieces of Warhammer 40k lore I've learned is that that is the foundation for why the setting is so fucked up.

At some point they made super advanced AI that started doing all the invention for them, to the point where human inventors were not necessary. That went on for centuries or millennia, idk.

When the AI rebelled and tried to wipe them out, humanity destroyed all the super AI. All their old automated factories still work, but now no one knows how anything works to the point where it's fully magic to them.

They talk about "rituals for awakening the machine spirit" when they are really just going through the normal boot up sequence for a smelting factory.

I sent a programmer who's like 20 years my senior a .7z file, because I just hit that instead of .zip and figured "eh, it's a better compression ratio anyway", and he replied "What is this. Please resend it as a zip and refrain from using this file type in the future." I was so caught off guard.

It's horrifying to me that it never occurred to him to simply look for something that opens .7z files, because that's my reflex whenever I encounter a strange new file format, and as shitty as modern web search is, it can still locate some useful things...

... and I guess that's the whole point of this discussion: this is a thing that happens, and it happens more frequently than people think, even among those we assume to be tech-literate.

tbf if i was an old fogey and had not learned about winrar from pirating shit in my teens one look at winrar would set off all my bonzi buddy free toolbar alarm bells

the important question isn't "could the senior programmer have done this", but rather "would this have been a good use of the senior programmer's time"

getting a mystery filetype in the email with no explanation, and having to google around for a program that can open it, then checking that the program is safe and complies with corporate requirements, and then installing it on a work PC is a waste of time that he could have spent on something more important

whether or not he could do all of that, the fact is that it's pointless for him to do so. everyone on the team can already create zips and open zips. it's quicker and easier for him to dash off an annoyed email telling you to use the format he can already open
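
part of the asymmetry is that zip is the one archive format everything already handles. a rough python sketch of the difference, with made-up file names: the standard library covers zip end to end, while .7z would mean going out and installing a third-party package (py7zr, for instance)

    import zipfile

    # creating a zip needs nothing beyond the standard library
    with zipfile.ZipFile("bundle.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write("report.txt")  # hypothetical file to send

    # and opening one is just as universal
    with zipfile.ZipFile("bundle.zip") as zf:
        zf.extractall("unpacked")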

That's a really good point. I think almost any org is going to have strict IT procedures to follow, where even installing a new browser extension has to make it past review. 7z files would have to pass that same review, so the senior dev here would be well within his rights to say "wtf? do this over"

going to be honest, with most IT policies these days, i've noticed that zip files get blocked because "THEY COULD CONTAIN ANYTHING"

i've had to do the "rename .zip to .txt to send it to someone" thing within the last month

7z worked fine :|

If they don't have to know where their photos are saved, then honestly, good for them! That's a sign that usability has gotten way better! Instead of saying "I just hope the search function doesn't get bugged on them one day", let's say "I hope app developers don't break crucial interface elements one day".

It's easy to take pride in our skills and look down on people who don't have them, but a fair percentage of my computer skills were learned not because I liked them but because they were a necessity for dealing with the busted-ass computers of the 90s and 00s. I haven't had to change the graphics color depth in twenty years, and that's a good thing! I'm glad zoomers don't have to remember to change the system graphics settings every time they want to play a game because the system will freeze up and crash if they play a game with more than 256 colors.

My father (born mid-1940s) tells me of when his employer was first rolling out new mainframes with these "disk file system" things, where your data would be stored under a name abstracted behind a file system layer instead of at an actual physical location, and some customers' IT people would freak out that they wouldn't know where their data actually was.

This was smoothed over by making a utility that turned filenames into information about where, physically, the data for that file lived.
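
A rough analogue of that utility still exists, for the curious: on Linux with e2fsprogs installed, filefrag reports the physical extents backing a file. A small Python sketch that wraps it (the file name is made up, and the tool may need elevated privileges on some systems):

    import subprocess

    # ask the filesystem where, physically, this file's data lives
    out = subprocess.run(
        ["filefrag", "-v", "report.txt"],  # hypothetical file on an ext4 volume
        capture_output=True, text=True, check=True,
    )
    print(out.stdout)  # prints the logical-to-physical extent mapping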

Now think about how we look back on those people in the early 70s and smile thinking "what a silly thing to worry about".

Why was it a silly worry? Why is it fine to let go of where data physically is (unless you're doing forensics or similar) and rely only on the name, whereas an expert Photoshop user not knowing the name of the directory their photos live in represents a crisis of "people can't use computers"? What qualitative distinction do we have that gives us any indication that we're worrying about a real thing any more than the people who were freaked out that they wouldn't know where their data was physically?

i think the thing that made this obvious to me was when i wrote a piece of software that wouldn't work without configuration (which meant clicking 1 option out of a selection of about 3). there was a message at startup saying "hey, you need to click 'options' and select which option you want to use". i'd implement proper onboarding later, i said (i never did)

the amount of support tickets i got saying "doesn't work" with no body, and after asking them what was wrong, "i get an error". i ask them what the error was, "something about not having something selected". and i'd copy and paste the exact error message text to them and they'd be like "thanks that fixed it".

i understand not being technical. i understand that that wasn't the peak of UX. however, i don't feel it's too much to ask them to read the error message rather than waste their own time and mine
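
for what it's worth, the proper onboarding i never shipped could have been tiny. a hypothetical sketch (every name here is invented, not my actual code) of a startup check that refuses to run until an option is picked and spells out exactly what to do:

    import sys

    def load_selected_option():
        # stand-in for reading the real config; None means nothing picked yet
        return None

    if load_selected_option() is None:
        # fail loudly, with the required action spelled out
        sys.exit("no option selected: open 'options', pick one of the three choices, then restart")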

I mean computers of any kind are just absolutely terrible and disgusting devices. The less people need to use them to live their lives, the better. We don't need more eyes looking at screens.

in reply to @shel's post:

My opinion in this regard is that it is kind of irksome, though, because with so few people being considered "tech literate", even those of us who laugh at the idea of a job where you can literally just send emails all day find it really hard to actually get a job like that. There is an age bias: they will want an older candidate who can claim "15+ years of experience", whether or not they can even change the font size in MS Word, instead of someone who has been a techie since they could walk. Most people with hiring power don't actually know how to judge a candidate accurately.

in reply to @mattcolewilson's post:

There's that old chestnut: "Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime." I feel like instead of teaching a man to fish with a pole, explaining bait, how the tides correlate with fishing patterns, etc., we ended up developing the solution equivalent of handing the man dynamite and assuming he had a net.

in reply to @shel's post:

I wonder if there's any other profession where the people who know how to do it think it's super easy baby shit everyone knows how to do, while the people who don't know are completely, hopelessly lost, because it seems like computer skills are always the most extreme example

For a long time it was car maintenance, but now, with all the electronics and onboard computers, it really does take a specialist to deal with a lot of the problems. Also, home improvement and property maintenance. The number of people who don't know how to trim bushes, mow the lawn, or do even basic gardening would surprise you.

I think a crucial factor here is that, in addition to the "basic" skills, there are many more tiers of skills that get progressively more advanced and difficult. People who understand how to email and unzip probably also understand that they don't know how to set up an email server. People who know how to set up an email server understand that they don't know how to code a new one. And so on—the deeper you get, the more challenging the next level seems, which by contrast makes the earlier levels seem all the more trivial.

I can't speak for everyone, but I've gotten the vibe before that a lot of lawyers and law students (including me sometimes!) vastly overestimate the average person's familiarity with legal processes, and often even overall literacy.

Step 1 of filing a lawsuit is you hire an attorney. Then the rest happens. For people who only ever use computers at libraries, step 1 of printing a document is you get the librarian, and then the rest happens. No matter how many times I teach them how to do it themselves and demonstrate that it does not take a 6-year education to do this, it doesn't matter. Step 1 is you ask the librarian and then the rest of it Just Happens and you don't really need to pay attention or retain any of it even when they make you do all the steps yourself.

If I was filing a lawsuit I think I'd be the same though. I'd sign the paperwork my lawyer gives me but I probably wouldn't remember what paperwork is needed or why.

The vast majority of people can, with just a few months' training, run a mile and get times under 14 minutes.

Also, the vast majority of people cannot run for a full mile. If you take them unannounced and ask them to, you're likely to get times in the 15-20 minute range, and much of the distance will have been walking.

If you ask on a hobbyist running forum they will guess that most people can run a mile in about 10 minutes, because that's the time you get when you're non-competitive but not fundamentally out of shape. The idea that most people won't even be able to keep running for the whole mile won't even occur to them, so they won't figure that into their estimates.

huh. I had not thought about this because many of my friends do computer things professionally and I'm definitely not as good at computers as they are so therefore I must be average or below average, right? but apparently, wrong

The older I get, the more I realize that the skill set I thought was woefully insufficient is actually pretty advanced; I just don't have the unearned confidence or salesmanship of, say, a 57-year-old Greg who refuses to use Google.

in reply to @ireneista's post:

WRT "digital natives," history is repeating itself. The term "quantum native" is getting used now to describe in a positive sense people who learned everything from quantum computing references that are virtually indistinguishable from product whitepapers. I'll 100% grant that's very niche, but it's interesting to see the same patterns repeat in microcosm.

Sadly, it's coming from folks I respect, but given that the baseline for the field is "supremacy isn't racist, and I will shout at you until you believe me," something like appropriating the term "native" in a colonialist fashion is a few steps down the list, sigh.

yeah, we realized as we wrote this post that it's not even just an appropriation issue

it is a real and serious problem with the tendency to view new areas of discovery as previously uninhabited, empty territory that we can just go into and take over without thought for the consequences or morality

... and like in this case, sure, the potential of computing WAS mostly empty territory prior to electronic computers. but you can see that same mode of thinking being applied to this current issue where ML data sets are being flat-out stolen from the people who created the stuff. those inconvenient people, getting in the way of our progress! how dare they already be living in the places we're trying to build!

Oh, entirely agreed. I think I should perhaps expand my comment above to "appropriating the term 'native' to refer to the horrifying idea that colonizing and displacing people from an area of technical expertise and human experience is a good thing."

In any case, I have thoughts that are best expressed in a different venue on how that applies in my own niche field.

I was doing educational design and tech support as part of a team at an online education program for adults. We were worried about a student who'd never owned a computer and had rarely used computers over their lifetime. With support, time, and resources, they gained a lot of skills and were one of our better students.

In contrast, for the first couple of years we severely overestimated how motivated most of our students with basic computer knowledge would be to learn new tech skills. They struggled, so we provided guided in-person instruction and went student to student verifying that they could complete basic and moderately difficult tasks, thinking that they would retain this information. They did not! We did promise improved tech skills as part of the value of our program, but we were pushing tasks and skills on them that they did not want or value. If I could go back in time, I would spend much more time advocating for a learning management system with a simpler UI and much less time trying to get them to use a user-unfriendly one.

we talked recently to a friend who'd been through some childhood stuff alongside us and, as a result, got exposed to some fancy computer stuff a few years before the rest of the world did

the friend no longer uses computers, only smartphones

they said it no longer felt rewarding to use computers because of how platforms are constantly changing things for no reason other than some product manager wanted to be promoted, so there's no point in building knowledge about them because the knowledge won't have lasting value

it's. damn. this friend is a brilliant person. if they're feeling this way, surely EVERYONE is feeling some version of it. (it's never been a problem for us because we mostly have computer problems of our own making, like how we can't get our patched kernel to be compatible with other things that need differently patched kernels, or whatever.)