When I was 13, my elder brother was pestering my mother to buy an Intel 8086 desktop computer. At first, I did not understand what the fuss was about. But once we bought our first computer, I found it hard not to play PC games. Some titles I will always remember include Ultima V, Sentinel Worlds: Future Magic, Dark Queen of Krynn, and The Ancient Art of War.
Due to some strange circumstances, I was still messing around with games in grad school. First, I studied World of Warcraft for my dissertation research. Then, though I did not plan for it, I studied StarCraft II for my postdoctoral research.
I cannot stress enough how often I was asked by my family, friends, and colleagues (mostly non-game studies researchers): Why games?
(This also meant: “Don’t you have a better research topic?”)
After many futile attempts at answering this question, I learned that many sceptics had decided on the answer before I could speak. We cannot blame them. Societies often carry negative sentiments toward emerging media: comics and cinema in the first half of the 20th century, television in the second half, and now games. For these sceptics, I have more recently attempted a counter-offensive by simply asking them in return: Why not?
Game studies at the present
What should already be compelling is the fact that gaming is big business. The gaming industry is now a 10-billion-dollar enterprise. And a reported 67% of U.S. households play video games of varying complexity and themes. Thus, it makes a lot of sense to invest in research on this industry.
What is less obvious is that the ways gamers collaborate and learn with each other also help us understand characteristics of our increasingly networked society. While researchers have only begun to tout the benefits of using technology to tinker with and remix current media, gamers are already well versed in this practice. The first video game, Spacewar! (1962), was built by a group of MIT students modding a PDP-1 minicomputer. Since then, more game genres have developed out of user innovations. For example, before modding was technically supported in World of Warcraft, players were already modifying the game in an activity known as FrameXML hacks.
I am most fascinated by the way gamers have mastered a bottom-up, peer-to-peer innovation process that is primarily interest-driven. Many of the gamers that I have studied—modders of World of Warcraft and eSports participants of StarCraft II—volunteered their time to work with peers around the world. In many companies, distributed teams are still learning to collaborate across distance. Working with people outside your face-to-face social networks is hard. Yet many gamers are already accustomed to these social processes, as evidenced by our research within the StarCraft community.
On a different front, researchers are also interested in how gamers are living the future of learning. The Connected Learning initiative is a step toward researching new models of education built on peer-to-peer learning powered by digital media and youth interests (see the full synthesis report). Many gamers are adept self-directed learners. They can acquire skills like maths, programming, and even social skills through their own efforts. However, their learning communities exist outside the powerful networks of educational institutions and authorities.
So how do we credit learners who are learning “in the wild,” i.e., not taking standardized exams? Mozilla Open Badges, which is already in use at Stack Overflow, is one possible intermediate solution. Participation in gaming communities may be more rewarding if their activities are more widely recognized by schools as academically relevant. Imagine a future scenario in which being a StarCraft II Grand Master is a sign of intellectual development and a bragging right to parents and teachers. While this seems unlikely for now, there are already activities like chess and robotics that are believed to spur academic development. At this moment, understanding how gamers learn among their peers would provide valuable data for identifying points where peer-to-peer learning and education may be connected.
Game studies in the future
Like computer science and business, game studies is a scientific discipline centred around an industry. Unlike physics or mathematics, game studies can involve almost any theoretical approach or method, so long as it helps solve a problem relevant to the industry. As a result, the involvement of the industry’s participants in the research process can make a difference. In the past, I have interviewed many gamers in both the U.S. and China, and have found them extremely approachable. What is more challenging is obtaining access to game companies to understand the corporate perspective. My own research on how we can encourage user innovation in gaming communities has been limited because many of my colleagues (many of them professors) have been unable to convince public relations at major game studios to let us interview their developers.
However, there are signs that game companies are opening up. Previously, Linden Lab (developer of Second Life) opened its doors to researchers, leading to some interesting research findings. There are also game designers, like Raph Koster, who attend academic conferences to tell their stories. Then, during my postdoctoral study in 2012, our team at the Connected Learning Research Network gained unprecedented access to developers at Blizzard Entertainment, and also at Valve. We found very interesting connections between the design cultures of these companies and those of their user communities. Also in 2012, I heard that Riot Games had assembled a user research team, including a few Ph.D. and master’s degree holders, to use research methods to understand and improve their community.
I dream of a day when game company-sponsored research projects are plentiful, helping establish connections to issues like learning, collaboration, and even media production in gaming communities. In the computing industry, companies like Microsoft, Facebook, IBM, and Intel already employ researchers, give out research grants, sponsor scientific conferences and meetings, and support graduate students. In the process, these companies use academics to experiment with future technologies and to understand future users.
Of course, another (debatable) use of corporate grants is to channel researchers’ time into solving very specific problems relevant to these companies (some mobile and surface technologies come to mind). By employing researchers, these companies can also strike up meaningful conversations with academia to shape projects and to identify other areas of concern, such as education, medicine, and social issues. But in game studies, the relationship between academics and game developers has not yet come to full fruition. In the meantime, game researchers will remain content hanging out in communities, enjoying our time with gamers, and solving the problems we can foresee.
Given that games provide an avenue for ‘learning in the wild’, what other areas do you feel gaming research and gamers can make an impact? With the potential of games now and in the future, how do you see the future of games research developing?
Yong Ming Kow is a Research Fellow at the National University of Singapore, leading the Digital Memory Project. He studied World of Warcraft during his Ph.D. work under Dr. Bonnie Nardi at the University of California, Irvine. His postdoctoral work with Dr. Mizuko Ito in the Connected Learning Research Network involved interest-driven and peer-supported learning within the StarCraft community. Learn more about his research at: http://kowym.com/