The Myth of the Digital Native
by Ryan Creighton on 03/21/13 01:23:00 pm

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


[This article by Ryan Henson Creighton is re-posted from the Untold Entertainment blog, which is awesome.]

There's a term flying around that really gets my goat, to put it like a Nancy Drew character. "Digital native" purports to describe a young person who has grown up surrounded by digital technology. It is a dangerous, grossly misleading term that needs to be nuked from orbit if we ever hope to move forward into a healthy relationship with The Future. Here's why.

There Is No Fork

i remember a quote making the rounds during a conference on kids and technology. i'm not sure if it was borrowed from somewhere, but the gist of it was this: we're not excited about using forks, because we've grown up with forks all our lives. Kids today have the same relationship with the Internet.


It's true: there now exists a generation of people who have never known a life without the Internet, smart phones, VOIP, video conferencing and game consoles. So it must follow, some people reason, that these new technologies are as commonplace to them as are eating utensils.


To compare something as earth-shattering and civilization-changing as the Internet with something as mundane as a fork already betrays a lack of appreciation of the capability and complexity of the current Age ... and i capitalize "Age" because i have no doubt that networked computers have ushered in a capital-A Age of human technological development: as in Stone > Bronze > Iron > Internet. Only an astoundingly myopic focus sees nothing but Pinterest and cat pictures; what's happened in the past few decades is nothing short of epochal.

The Internet has been compared to the printing press, but that invention was not made available at a very low cost to millions of people, enabling the unfettered transmission of type, sound, AND images - both moving and still - WITH automated language translation, free duplication, and instant WORLDWIDE distribution. Take a much more macro view of human existence, and the printing press won't even rank.

But more importantly, the term "digital native" subtly implies that because young people are surrounded by networked technology, they intuitively know how to use that technology. In fact, nothing could be further from the truth. It doesn't matter what sort of technology you're surrounded by: no one comes out of the womb knowing how to type a search engine query, pilot a spaceship, or even use a fork.


The crucial difference, continuing with our fork/computer comparison, is that today's parents know how to use a fork, they know the importance of using a fork, and they consequently teach their children how to use a fork. In contrast, today's parents do not know how to use computers, they do not know the importance of using computers, and they therefore do not and cannot teach their children to use computers.

Father may know best, but he definitely doesn't know how or why to defrag a hard drive.

Calling kids "digital natives" seems to leave technology education up to forces of nature, as if kids are somehow going to learn how to properly use a computer by osmosis - much like we've done with sex education, and look at how that's turned out. i've seen the resulting ignorance that a tack like that produces; when i taught a group of first year college students a few years ago, i required them to zip their midterm test file and email it to me as an attachment. The class erupted with protests. They did not know how to zip computer files. They did not know how to attach files to emails. They did not know how to send emails. And in which program were they enrolled? Video game design.

So in this computer course, you want me to ... USE ... a computer?

But why should they know how to send emails? Email is a relatively recent advancement; it has only seen widespread use for the past fifteen years or so. i didn't really begin to use email heavily until i was working full-time in an office setting. And how were these kids supposed to know how to archive a collection of files? It's an easy thing to do, but you don't know what you don't know. Built-in archiving is a recent addition to operating systems; prior to its inclusion in Windows XP (i believe?), you had to download a shareware program like WinZip or WinRAR to archive files. It's not really something you'd naturally know how to do until you've been required to do it.
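And it really is easy once someone shows you. A minimal sketch using Python's standard `zipfile` module (the file names here are made up for illustration; this isn't how any particular student did it):

```python
import os
import zipfile

def zip_files(archive_path, file_paths):
    """Create a zip archive containing the given files."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in file_paths:
            # Store just the file name so the archive stays flat.
            zf.write(path, arcname=os.path.basename(path))

# Example: archive a midterm test file before emailing it.
with open("midterm.txt", "w") as f:
    f.write("my answers")
zip_files("midterm.zip", ["midterm.txt"])
print(zipfile.ZipFile("midterm.zip").namelist())  # ['midterm.txt']
```

Attaching the result to an email is just as mechanical - but only once you know the steps exist.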

Tying your shoes: not incredibly difficult, but definitely a learned skill.


i found that the students i've taught and the young graduates i've mentored - "digital natives", all - have been completely hopeless at using search engines, a skill i call "Google-Fu". They've been taught by their high school teachers never to use Wikipedia as a source because it's "unreliable", due to the fact that "anyone can edit it". (Teachers, if you think that just anyone is on Wikipedia writing extensive entries on complex mathematical theorems, ancient Jewish mysticism, and common practices in the manufacture of thumbtacks, kindly retire. The Future will take it from here.)

Lately, this admonishment has softened to become "fine - use Wikipedia, but it can't be your only source", which is equally ridiculous, because many well-written Wikipedia articles are already cross-referenced to the nines, with links to all of the material that would turn up through diligent independent research anyway [citation needed]. And even articles further off the beaten path have Talk pages featuring ongoing discussions on how those articles are being written and refined. Talk pages are excellent resources to help young researchers identify authorial biases and develop media criticism skills.

And again, the fact that so many young people i meet have been told not to use Wikipedia as a source suggests an education system that, itself, does not understand the current Age and has been teaching neither adequately nor accurately.

If someone vandalizes a Wikipedia article to make Magellan a contemporary of Cap'n Crunch, and a student cites that passage verbatim, the problem is not Wikipedia.

Forgotten Knowledge from the Mists of Time

i attended college on the cusp of the changeover, when personal computing went from being a niche interest of hobbyists to an explosion of networked machines into the lives of everyone on the planet. And being involved during the changing of the guard, i was very fortunate to attend a class at my school that unravelled some of the crucial mysteries of computing for me, and to this day, i am immensely thankful that i have this knowledge.


The course taught me what a disk is, and explained the actual physical process involved in storing data inside a computer. i learned what RAM was, what a ROM was, and why waving a magnet around near your computer was a bad idea. i came to understand how digital displays worked, and the difference between our increasingly old-fashioned cathode ray tube monitors and these newfangled flat LCD monitors. i learned what a bus was, how a microprocessor worked, why we talk about "BOOTing" computers, and where the term "spam" came from. i learned how search engines indexed web pages on the Internet, and that knowledge alone has made me particularly adept at Google-Fu. i was taught about viruses - what they were and how to avoid them.
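That indexing idea fits in a few lines of code: at its core, a search engine builds an inverted index mapping each word to the pages containing it, then intersects those sets per query. A toy sketch (the sample pages are invented; real engines add crawling, ranking, stemming, and much more):

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page ids containing it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

def search(index, query):
    """Return the pages containing every word in the query."""
    words = query.lower().split()
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

pages = {
    "a": "cats on the internet",
    "b": "the printing press",
    "c": "internet printing services",
}
index = build_index(pages)
print(search(index, "internet printing"))  # {'c'}
```

Knowing that a query is a set intersection over indexed words is exactly the kind of mental model that makes someone better at phrasing searches.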

To this day, i understand how disk drives and CDs store digital information. This should be common knowledge.

All of this amazing and wonderful arcane knowledge is stuff that we no longer teach, because we have a generation populated by "digital natives". Our kids know how to thumb around on tablet and smart phone devices that have one button. They can communicate with each other as long as it's nothing too complicated, and as long as it all boils down to one gigantic shiny graphic element that says "SEND". Some know it all boils down to 1's and 0's somewhere down the line, but they have no idea how or why, or why they should care. As long as it all just works, they're fine. They can't swap the battery out of their devices, but pretty soon they won't need to: companies like Apple are leading the charge with perfectly impenetrable little boxes that we must return to them to service. The days of tinkering are disappearing. Our future - The Future - belongs to the companies who build the devices, who hold the keys, and who alone understand how things work.

Making Us Go

IANASTF (i am not a Star Trek fan), but one Trek episode introduces an alien race called the Pakleds. The Pakleds appear to be very simple-minded, yet somehow they're flying around in spaceships. That's because they steal as much technology as they can get their hands on - "things to make us go" - without ever putting in the effort to develop their own technology, or to understand how their stolen technology works. They desire only the power that this technology brings, and they don't care about the ramifications or consequences of using it.

The poisonous term "digital natives" excuses us from effectively teaching our children how to properly use, appreciate, and understand the incredible networked computer technology that now permeates our lives. We don't want to learn how to program - we just want programs that work. We want things to make us go. We have become, and we are raising, a generation of Pakleds - a devolution of humankind which, instead of standing on the shoulders of giants, is dandruff on the shoulders of giants. To wit: we're flaky. It's time that we do away with the term "digital natives" altogether, accept our responsibility, and recognize the importance of teaching our young people how to effectively navigate and steer the incredible future they will soon inherit.

Jonathan Jou
I kind of want to say that, while Wikipedia is a fabulous source of highly credible information, it's not wrong for teachers to want their students to, say, at least browse through the references. If part of the assignment is developing research skills, being able to find out more than Wikipedia can offer is kind of a useful ability. As a graduate student, I always start with Wikipedia and Google, but I usually wind up browsing journal articles and following the trail of references they lead me on.

So yes, as an only source, Wikipedia is as good as any encyclopedia. And I would say that teachers probably shouldn't encourage students to just cite the encyclopedia.

Nathan Ware
To say that the term 'digital native' has no basis in reality is to dismiss the fact that there IS a certain set of technical literacy skills that one develops "through osmosis", as they say. They are wrong when they imply young people learn to operate computers through a magical aura; they are as wrong as the entire Baby Einstein movement and purveyors of hypnopedia techniques. Where they're right is in the underlying gut feeling that these 'digital natives' understand a logic of operation that older people, who grew up with traditional information mediums, have a hard time wrapping their heads around.

The millennial generation grew up navigating computers. Through their developmental years they learned a range of ways to use computers and where to look on those computers to find answers to their questions. Something not working on your PC? Find its settings and properties. Check its connections. Reinstall its software. Google the problem. Wikipedia the thing. These are all the natural first steps for anyone who has had a PC and had to fix an unexpected problem. The media is right when it assumes that young people who grew up interacting with the systems that were growing up around them have learned some consistencies with those systems. They are also comfortable enough with the IDEA of messing with a computer that they'll experiment a little bit.

But any Mac user will look at the series of troubleshooting solutions listed above and find them mostly gibberish. Here is where the label 'digital native' falls apart. Our technical comfort zone is an exclusive one. I am a PC user, and when operating a Mac, I seem like a bumbling Luddite. If I am native to anything, it is that I am a Microsoft native, or an Android native. Since I'm comfortable with computers, I can mess around and look for menus or keywords I recognize, but if I don't have a Mac user around to talk to, the first button I try to find is the Parallels or Boot Camp button. Mac is a language I do not speak, and have not had to speak all my life.

The last point that muddies the waters around the 'digital native' label is the concept of information hiding. More and more, user interfaces for consumer products attempt to separate the user from the technical aspects of systems, and users have been happy to do what works. Apple is notorious for this, reducing interaction to as few buttons as possible, and organizing data not as it is organized in the computer, but as humans might understand and organize information outside of the technical fields. On some level, it makes sense that we would reduce as far as possible the barrier of entry to a new experience. Designers are creating designs to fit into the consumer's comfort zones. But on another level, it is ludicrous.

Computers are indelibly intertwined with our society. They are almost a utility, and we often talk about them as such. Interaction with computer systems is no longer a luxury reserved for the bourgeois. Everybody, on a daily basis, interacts with a computer, and it is ludicrous that designers have to create for a population that may not know how to use one. What do we call a mild proficiency in the operation of a computer? Computer literacy. It is so important that we compare it to such a basic skill as reading. No producer of the written word has to create their work within the comfort zones of the illiterate. It is shameful that software and hardware designers do.

No functioning person in the modern world can exist without at some point expanding the comfort zone of their computer literacy, and it should be as important in our homes and schools as written literacy.

Curtiss Murphy
"The millennial generation grew up navigating computers. Through their developmental years they learned a range of ways to use computers and where to look on those computers to find answers to their questions. Something not working on your PC? Find its settings and properties. Check its connections. Reinstall its software. Google the problem. Wikipedia the thing. These are all the natural first steps for anyone who has had a PC and had to fix an unexpected problem."

I have a son, 14, and a daughter, 17, and they have grown up with computers their entire lives. So, as a software developer, I was shocked to realize how little my kids knew. They didn't know how to install software, configure their PCs, save screenshots, or zip files - any more than my mother-in-law did.

Of course, with practice, they figured these things out. Just as my 67-year-old, retired mother-in-law did.

The term 'digital native' implies young people just 'get' technology in a way old people never will. Don't hate it 'cause it's ageist; hate it 'cause it's wrong.

Titi Naburu
"today's parents do not know how to use computers, they do not know the importance of using computers, and they therefore do not and cannot teach their children to use computers."

I totally agree - there's a long way to go until everyone understands computers and takes advantage of them to the fullest.

A week ago, I explained to an 8-year-old nephew what the government is, and the fact that we citizens control it. He didn't seem interested, even when I mentioned the military (he loves action cartoons).

Jamie Mann
It's a bit off-topic, but I have to disagree with the statement about Wikipedia: for anything even vaguely controversial, there's a good chance that articles can be either biased or censored/watered down. It's certainly a good place to start, but it's always a good idea to check external sources and get some cross-references.

As regards "digital natives": at the risk of sounding like I'm playing Devil's Advocate, does the general populace need to know how the underlying technology works?

To use the oft-abused car analogy: back in the day, cars were clumsy, relatively fragile mechanical devices. If you owned one, you had to know how to get under the hood to diagnose/repair it; when driving it, you had to learn how to handle situations (e.g. braking on a wet road).

These days, when something does go wrong, cars can usually diagnose themselves; a lot of the stuff under the hood is driven by computers, and you need specialised tools and knowledge to do anything. And with features such as anti-lock brakes, automatic gears, parking sensors, built-in GPS and even automated braking/parking technology, a lot of the complexities of using a car have gone away.

And it's the same with computer technology: a lot of stuff has been abstracted away. My dad doesn't need to know how digital encoding on a DVD works: he just has to make sure it's clean and unscratched and pop it in the player. My 8 year old niece doesn't need to understand how to optimise a search query: Google automatically does this for her. My friend doesn't need to understand zip files: Windows automatically presents them as folders - and has been doing so for over a decade now.

Generally, people don't need to understand how technology works: they just need to know how to use it, whether it's a car, computer, mobile phone or microwave. And as with most things, you don't need to be an expert in how to use it, you just need to be good enough - Moore's law has given even the most humble mobile phone more CPU power than it needs, and all those "excess" cycles are being plowed into increasing the levels of abstraction and simplification.

Admittedly, people looking for "technical" jobs should have at least some understanding of the underlying technology: even a graphics artist needs to understand how polycounts and texture sizes can impact game performance. But I'd argue that this is something which needs to be included in the training courses: we've come a long way from the days of hand-soldering Z80s onto a circuit board, or having to hand-optimise assembler code to unroll loops and minimise memory usage...

Ryan Creighton
People don't need to understand how networking works: they just need to give a telephone con artist their credit card number so that he can fix viruses he found in an "online scan" of their computer.

People don't need to understand how microtransactions work: they just need to hand their phone to their kids so they can rack up thousands of dollars in unexpected Smurfberry charges.

People don't need to know how to change a battery: they can just bring a closed device like a $400 iPod back to the Apple store and pay another $80 to swap out for a refurbished unit with a fresh battery.

People don't need to understand digital rights: they can just upload all their precious family photos to Kodak's online gallery, and then see their images being held hostage when Kodak decides to change its terms of service.

People don't need to understand how RFID technology works: they just need to carry their new credit card around allowing anyone with an RFID reader to gank their personal information and card number.

People don't need to understand online privacy: they just need to keep voting for politicians who enact obtrusive, rights-denying laws that are masked as piracy protection and anti-child pornography measures.

Douglas Rushkoff reframes the car analogy best: it's not about knowing how a car works under the hood - it's about deciding whether or not to let somebody else drive your car.

Alex Leighton
The problem is that anyone who doesn't understand how something works is at the mercy of an "expert" when something goes wrong. And while I don't care one bit if an adult gets scammed because they choose to not know how something works, it really bothers me to think of deliberately raising children to be ignorant.