A theory of the perception of gaming value
Video games are fake, or so we’ve been told. Talk to a non-fan about games and you may hear that they’d rather be doing something real than spend their days indoors. The thinking seems to be that video games occupy a lesser tier of activities, below things like travel or meeting friends. Whereas those activities offer inherent worth, video games supposedly don’t provide experiences of equal value and, ultimately, lack meaning. This is silly and, for that, I blame the 80s.
The original sin of video games is that they started off as toys. Since the release of the Magnavox Odyssey, video games and consoles were designed and sold to kids as something fun to play with. When the Nintendo Entertainment System caught fire in both homes and the public consciousness, it became the filter through which gaming was viewed, both by the young players and by the parents who purchased it. This vision remains powerful: many people, thirty years on, still view video games primarily as a kids’ toy enjoyed by socially unaware losers. Don’t believe me? The popular sitcom The Big Bang Theory relies almost entirely on the stereotypes established during this era. The characters are the grown-up losers of the video game age. Emotionally and socially, they are still the same awkward kids who played games indoors rather than playing football or hanging out with their friends. It is through this perceived contrast of experiences that the idea of the “fake” experience developed.
Notice how games were contrasted with other childhood pursuits. Games didn’t reflect the childhood of parents who had spent their time outside or socializing for want of anything else to do. To be sure, solitary pursuits such as reading did exist, but none of them were closely tied to that amazing time waster known as TV. Whereas kids could potentially learn something from reading, TV watching rarely imparted anything of value. Furthermore, video games did not appear to be a medium that could ever be more than what it was at the time: a toy. Developers clearly saw their target market as (male) children and made games for that audience. Both parents and kids saw only content with the philosophical implications of a Saturday morning cartoon. For parents (and kids) of the 1970s and 80s, video games were toys (strike one), tied to the TV (strike two), that contrasted with the more physical activities of an idealized youth (strike three).
Gaming evolved greatly during the 90s as the tools, developers, and audience matured. Every aspect of gaming increased in complexity, allowing for greater diversity of gameplay and storylines. Players who continued past the NES glory days enjoyed better-developed experiences as their less committed peers sought other interests. Meanwhile, parents continued to buy games for their kids, thereby introducing another generation to gaming. Unlike the gamers of the 70s and 80s, the gamers of the 90s and aughts enjoyed a more diverse ecosystem that offered games suited to them as they aged and a wider variety of experiences, such as multiplayer. Unlike the previous generation, the 90s and aughts gamers kept playing in greater numbers and, if they left, they did so with a stronger appreciation of what gaming could do.
The end result was a bifurcation in the perception of video games. The parents and kids of the 70s and 80s saw video games as a shallow waste of time enjoyed only by socially awkward adults who never moved past playing with kids’ toys. Gamers became a subset of losers without any particularly redeeming characteristics. By contrast, the later generations grew to appreciate video games as a normal part of their everyday lives. Video games joined movies, books, and music as just another medium to be viewed, discussed, and enjoyed. Ultimately, these later generations reflect where I believe gaming is headed. Games will become, if they haven’t already, just another part of the collection of ways we interact with friends, enjoy some downtime, or contemplate the great mysteries of life.