PRESERVING VIRTUAL WORLDS
FINAL REPORT
Jerome McDonough & Robert Olendorf
Graduate School of Library & Information Science
University of Illinois at Urbana-Champaign
501 E. Daniel Street
Champaign, IL 61820
Matthew Kirschenbaum, Kari Kraus, Doug Reside & Rachel Donahue
College of Information Studies
Maryland Institute for Technology in the Humanities
University of Maryland
College Park, MD 20742-7011
Andrew Phelps & Christopher Egert
Interactive Games & Media Department
B. Thomas Golisano College of Computing & Information Sciences
Rochester Institute of Technology
Rochester, NY 14623-5603
Henry Lowood & Susan Rojo
Humanities Research Group, Stanford University Libraries
Green Library, 557 Escondido Mall
Stanford University
Stanford, CA 94305-6004
Aug. 31, 2010
Preserving Virtual Worlds was conducted with the support of the
Library of Congress' National Digital Information Infrastructure for
Preservation Program
The report is made available under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license
(See http://creativecommons.org/licenses/by-nc-sa/3.0/).
Acknowledgements
Numerous individuals assisted in the research described in this report, including many
students and colleagues who helped gather data, proof drafts, and think through the
difficult problems associated with preserving digital games and interactive fiction.
Several stand out for special recognition, however. The authors would like
to thank Janet Eke, Patricia Hswe and Maeve Reilly for their tireless efforts in managing an
extremely complex project and herding cats (and otters) as necessary. This project could not
have happened without all of their extraordinarily capable assistance. We would also like to
thank Beth Dulabahn at the Library of Congress for all of her help and for providing a
valuable sounding board for ideas. The support and collaboration of Neil Fraistat and
Dennis Jerz were invaluable, as was the assistance of Henrik Bennetsen and Matteo Bittanti.
We are extremely grateful to J.P. Dyson, director of the International Center for the History
of Electronic Games (ICHEG) and vice president for exhibit research at the Strong National
Museum of Play, for being free with his time and his knowledge of preservation of gaming
hardware, and to the Internet Archive for their support for our efforts. Finally we would like
to thank all of the game developers, artists and authors who have given us worlds to
explore.
Table of Contents
1. Executive Summary ...............................................................................................................5
2. Introduction ............................................................................................................................9
What is a Virtual World? .....................................................................................................................9
Cultural and Economic Significance ............................................................................................ 12
Challenges to Preservation.............................................................................................................. 13
About the Project .............................................................................................................................. 15
About the Project Partners and Personnel ................................................................................... 16
About this Report.............................................................................................................................. 18
3. Games & Interactive Fiction: Collecting for Preservation ......................................... 19
Issues of Appraisal & Selection ..................................................................................................... 19
Working with Developers ................................................................................................................ 22
Context Information Reconsidered............................................................................................... 27
Literature & Ephemera ................................................................................................................... 29
4. Collections ............................................................................................................................ 33
Collections: Bibliographic & Archival Description ................................................................... 33
Collections: Data and Documentation ......................................................................................... 37
Problems/Opportunities in Access............................................................................................... 48
5. Software Preservation and the Law ................................................................................. 52
Copyright Issues: Complex Ownership and Orphan Works.................................................... 52
Emulators & the Law ....................................................................................................................... 53
6. Preservation Strategies....................................................................................................... 58
Analysis of Hardware Preservation ............................................................................................... 58
Analysis of Emulation and Virtualization .................................................................................... 61
Analysis of Migration ....................................................................................................................... 78
Analysis of Re-enactment................................................................................................................ 83
Digital Forensics and Disk Image Analysis ................................................................................ 85
Recommendations on Preservation Strategies ........................................................................... 87
7. When Strategies Fail: The Case of Second Life ............................................................ 89
Problems in Archiving Second Life............................................................................................... 89
Basics of Archiving a World............................................................................................................ 91
Creating a Manifest of an Island ................................................................................................... 91
Obtaining Permission ...................................................................................................................... 93
Copying a World ............................................................................................................................... 94
Failures in Archiving ........................................................................................................................ 96
8. Packaging Virtual Worlds ................................................................................................. 98
Virtual Worlds and the OAIS Reference Model.......................................................................... 98
Analysis of Packaging Requirements .........................................................................................101
Metadata & Packaging Recommendations...............................................................................102
9. Steps for the Future ..........................................................................................................105
Library of Congress National Collecting Plan ..........................................................................105
Intellectual Property & Digital Preservation: Widening the Discussion.............................112
Representation Information and Format Registries/Tools ...................................................118
Digital Game Canon.......................................................................................................................120
Reimagining Videogame Asset Management & Preservation (ReVAMP) Symposium ..121
A Research Agenda for Preservation of Video Games & Interactive Fiction .....................123
10. Conclusion ........................................................................................................................126
Bibliography...........................................................................................................................128
Appendix A - Virtual Worlds that Died During the Grant ............................................134
Appendix B - Media Coverage of PVW............................................................................140
Appendix C - Preserving Virtual Worlds Ontology .......................................................143
Appendix D - Multi-Institutional Collaboration: Lessons Learned ...........................162
Appendix E - Publications .................................................................................................163
Appendix F - Museogames Exhibit at the Musée des arts et métiers .......................166
Appendix G - Second Life Deed of Gift ...........................................................................170
Appendix H - Gaming Websites Identified in PVW Project .......................................172
Appendix I - Sample Output from CopyBot ...................................................................187
1. Executive Summary
The Preserving Virtual Worlds project is a collaborative research venture of the Rochester
Institute of Technology, Stanford University, the University of Maryland, the University of
Illinois at Urbana-Champaign and Linden Lab, conducted as part of Preserving Creative
America, an initiative of the National Digital Information Infrastructure and Preservation
Program at the Library of Congress. The primary goals of our project have been to
investigate issues surrounding the preservation of video games and interactive fiction
through a series of case studies of games and literature from various periods in computing
history, and to develop basic standards for metadata and content representation of these
digital artifacts for long-term archival storage. The games included within our case studies
were:
- Spacewar! (1962) - a space combat simulation for the PDP-1 computer;
- Adventure (1977) - one of the earliest of the text adventure games;
- Star Raiders (1979) - one of the more popular and complex games released for the
Atari 2600 game console;
- Mystery House (1980) - the first work of interactive fiction to employ computer
graphics as a significant part of the game, and not just text;
- Mindwheel (1984) - an interactive fiction work, notable for having been authored by
U.S. Poet Laureate Robert Pinsky;
- Doom (1993) - the game that popularized the first-person 3D shooter genre;
- Warcraft III: Reign of Chaos (2002) - the real-time strategy game from Blizzard
Entertainment; and
- Second Life (2003) - one of the most successful of the "social" (i.e., non-gaming)
virtual worlds. Given the large amounts of data involved in archiving all of Second
Life, our project has focused on a small set of Second Life's islands.
Virtual worlds such as these are software artifacts, communities, and commodities. Their
preservation is thus intertwined with issues of technology, social relationships and law, and
our investigations have touched on issues in all of these realms. Significant problems we
have identified for the preservation of these materials are summarized below.
Obsolescence - The most obvious problem affecting these materials is the obsolescence of
the hardware and software infrastructures necessary to allow software to run. The earliest
game in our case set, Spacewar!, currently exists in its original form stored on a punched paper
tape intended to be read into the memory of a PDP-1 computer. There is, to the best of our
knowledge, only one functioning PDP-1 computer left in the world, at the Computer
History Museum in Mountain View, California, and paper tape readers are not exactly
common equipment at this time. The fate of the paper tape of Spacewar! is the fate awaiting
all games without the active intervention of preservationists. A book may pass 50 years on a
shelf and still be readily accessible; rapid technological change and the resulting obsolescence
of the technology necessary to access software mean that a computer game will not.
Boundaries - Identification of the exact boundaries of the object of preservation is difficult
in the case of computer games. While we tend to think of the game as a relatively discrete
package of software, the reality is that a functioning game involves a web of interconnections
between the game's executable, an operating system, the hardware platform used to execute
both, and potentially network hardware and software and a multiplicity of other computer
systems (witness the cases of Warcraft III and Second Life). Even a relatively simple, early game
such as Adventure possessed a dependency on an operating system library in the source code
that the original author, Will Crowther, sent to Don Woods. Games may also come in a
multitude of official versions, as well as unofficial modifications (mods), rendering decisions
about what to collect, even in the case of a single game, problematic. Determining what
constitutes the game, and more importantly, what is necessary to preserve the game, can be an
extraordinarily difficult exercise.
Intellectual Property Law - Copyright and related issues manifested themselves repeatedly
in our studies. The Digital Millennium Copyright Act's prohibition on defeating
technological protection measures makes it impossible for a library to create a preservation
copy of games employing DRM and anti-copying measures. While obtaining the permission
of the rights owner to make a preservation copy offers a potential path around this obstacle,
securing these permissions is complicated by the existence of a large number of "orphan
works" in the field of computer games, and the great difficulties encountered in trying to
track intellectual property rights ownership in an industry as volatile as the game software
industry. Intellectual property laws also may pose impediments to the development of
emulation technology necessary for continuing access to some games.
Collection Management - As the Open Archival Information System Reference Model
makes clear, preservation of any object requires the preservation of more than the object
itself. Knowledge of how to decode and play back the information (representation
information) must also be preserved, as must information which provides an intellectual
context for the object to aid in understanding its full meaning and significance. Librarians,
archivists and curators dealing with gaming materials must be extremely proactive in their
collection management to insure that they are adding the requisite additional materials to
their collections. Unfortunately, these types of materials can be exceptionally difficult to
obtain in the case of computer games and electronic literature. Many of them may be in the
hands of private companies that do not wish to see them released (or belonged to companies
that have ceased to exist), and other information may be produced far outside of normal
publication channels (e.g., machinima videos created in networked gaming systems and
traded among fans), requiring significant time and energy to track down.
Preservation Strategies & Significant Properties - Standard preservation strategies for
software entities such as computer games include migration (relatively easy if source code is
available), emulation, maintenance of original hardware and operating systems (the
"computer museum" approach), and in cases where these all fail, re-enactment/recreation
(the game Mystery House is available on modern computing platforms in significant part
because of the reimplementation undertaken by the Mystery House Taken Over project,
which created a bug-compatible reimplementation of the original). None of these
approaches, however, provides a perfect solution to the problem of preservation, and
migration, emulation and re-enactment pose significant risks of altering the appearance and
performance of games. Our research on emulation in particular shows that significant visual
and aural aspects of the work can be strongly affected by running under emulation. Without
a clear understanding of which aspects of a game are likely to be considered significant by
scholars in the future, it is extremely difficult to choose an appropriate preservation strategy,
and preserving games without any change in their appearance and play may simply not be
achievable in many instances.
Despite all of these significant issues, we have identified a number of immediate and
intermediate steps that libraries, archives and museums can take to assist in the long-term
preservation of games. These include:
- Developing packaging standards for ingest of gaming materials into digital
repositories that explicitly include support for application of the FRBR and OAIS
data models to insure that the identification of materials and links between them are
sufficiently precise and detailed to support preservation activity. We hope that the
OWL ontology developed by our project, and demonstrations of its application with
METS and OAI-ORE, might serve as the basis for further development in this area
(an illustrative sketch follows this list of recommendations).
- Have organizations such as the Library of Congress, the National Digital
Stewardship Alliance and others concerned with large-scale collaboration between
institutions in support of preservation take a lead role in trying to negotiate the
distribution of responsibilities for collection development and management that will
be essential to preserve computer games and interactive fiction in a distributed
fashion. Given the common need for repositories of representation information and
supporting documentation, the Library of Congress may wish to consider starting
discussions with the National Institute of Standards and Technology and the Unified
Digital Format Registry about the creation of such a repository, focusing initially on
data format standards.
- Help develop preservation systems that are accessible by and can accept
contributions from the gaming community. The How They Got Game project, as
part of the Preserving Virtual Worlds efforts, has established a sub-collection within
the Internet Archive's moving image collection called "Archiving Virtual Worlds." It
contains video documentation of a large number of virtual worlds, much of it created
by those worlds' users. Efforts such as this, that provide a stable environment in
which to preserve the contributions of the gaming community and assist them in
documenting their own activities and culture, are essential to the preservation of
computer games and interactive literature. Moreover, such efforts help promote
dialog among librarians, archivists and curators and the larger gaming community,
and help build the partnerships essential to preservation activity in this field. The
Library of Congress and members of the National Digital Stewardship Alliance
should seek out further opportunities for such collaborative efforts with the gaming
community. Development of emulators and emulation technology, an area where the
gaming community has already invested significant effort, might be a potential arena
for collaboration, as might establishing repositories of documentation of hardware
platforms, such as the recently launched Maryland Institute for Technology in the
Humanities' Vintage Computers website (available at http://mith.umd.edu/vintage-computers/).
- Intellectual property laws as they currently stand represent serious obstacles to
preservation of computer games and interactive fiction. The inability of libraries and
other cultural memory organizations to make preservation copies of materials
employing technological protection measures (TPMs) will certainly doom these
materials to a rapid demise. The §1201 anti-circumvention rulemaking process does not
provide libraries with a stable legal environment in which to conduct preservation planning,
given the time limits placed on rules allowing circumvention of TPMs, the time and
difficulty of coordinating applications for exemptions to the anti-circumvention
prohibitions in the law, and the uncertainty of the outcome of any petition. In addition to
seeking collaboration with the
inhabitants of virtual worlds, libraries, archives and museums need to build
relationships with gaming companies to work on legislative changes that will enable
preservation of computer games to proceed in a manner that protects the rights of
games' creators while insuring that their creations are available to future generations.
The ReVAMP symposium discussed in Chapter 9 represents the type of discussion
that is necessary if a legal infrastructure conducive to digital preservation is to be
forged.
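By way of illustration of the first recommendation above, the following is a minimal sketch of how FRBR-style entities and relationships for one of our case-set games might be recorded as RDF using the Python rdflib library. The namespace URI and the class and property names shown here are placeholders for illustration only, not the terms of the project's actual ontology, which is documented in Appendix C.

# Minimal illustrative sketch only: the namespace and term names below are
# placeholders, not the actual Preserving Virtual Worlds ontology (Appendix C).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, DCTERMS

PVW = Namespace("http://example.org/pvw-ontology#")  # placeholder namespace
EX = Namespace("http://example.org/games/")

g = Graph()
g.bind("pvw", PVW)
g.bind("dcterms", DCTERMS)

work = EX["doom"]                    # the abstract Work, in FRBR terms
expression = EX["doom-v1.9"]         # a particular Expression (version 1.9)
manifestation = EX["doom-v1.9-dos"]  # a Manifestation for a specific platform

g.add((work, RDF.type, PVW.Work))
g.add((work, DCTERMS.title, Literal("DOOM")))
g.add((expression, RDF.type, PVW.Expression))
g.add((expression, PVW.realizes, work))             # FRBR: an Expression realizes a Work
g.add((manifestation, RDF.type, PVW.Manifestation))
g.add((manifestation, PVW.embodies, expression))    # FRBR: a Manifestation embodies an Expression
g.add((manifestation, PVW.requiresPlatform, Literal("MS-DOS")))  # dependency note

# Serialized as Turtle, such a description could travel inside a METS or
# OAI-ORE package alongside the game files themselves.
print(g.serialize(format="turtle"))

In practice the description would be far richer, covering versions, hardware dependencies, and contextual materials, but even this small graph shows how FRBR-style entities keep distinct versions and platform-specific releases of a single game clearly separated.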
Computer games and interactive fiction form an essential part of our cultural heritage. These
virtual worlds are unique forms of art, places for education, socializing, business and
entertainment, and seem certain to play an increasing role in people's lives. We hope that the
research we report on below will help contribute to librarians', archivists' and curators'
efforts to insure that these virtual worlds remain living worlds.
Jerome McDonough
Preserving Virtual Worlds, Principal Investigator
2. Introduction
"This is our history, and just a handful of people are saving it."
- PixelVixen707, screen name of "Rachael Webster," herself a fictional character in
the alternate reality game Personal Effects: Dark Art
What is a Virtual World?
Virtual worlds are software artifacts, communities, and commodities. They are places and
spaces whose geography and landmarks can be as familiar as your own neighborhood,
teeming with personalities that are rich and genuine and multifaceted, yet, simultaneously
and paradoxically, they are also, finally, layers of logical abstractions mediated by the
conventions of digital computing. They are also (often) branded media properties capable of
generating massive revenues, or else generating very little, a phenomenon not irrelevant to
their frequent and sudden demise.
As software artifacts exhibiting complex dependencies on platform, operating system, and
network environment, virtual worlds are undoubtedly among the most imperiled forms of
interactive digital content. As communities (shared spaces and places) they are defined by
no less delicate and idiosyncratic skeins of people, relationships, memories, and folklore, akin
to those found within oral cultures. Virtual worlds are not virtually (that is, "almost") real.
They are instead, to borrow a phrase from Jesper Juul (2005), precisely half-real: they are
human products, scripted and engineered out of millions of lines of code written by dozens
or hundreds or even thousands of individuals, but they are also focalizers for powerful
collective acts of the imagination that rely on the same willing suspension of disbelief that
characterizes immersion in other media, like novels and films. Virtual worlds are governed
by the non-virtual realities of limitations on processor and rendering speeds, bandwidth,
memory, and server technology, but they have also functioned to spur innovation and
technical breakthroughs in those very same areas. Most virtual worlds are also buffeted by
the inexorable pressures of the marketplace. During the two-year span of our project, no
fewer than seventeen virtual world properties winked out of existence, with varying degrees
of bang and whimper (see Appendix A). Even when profit-motive is not a consideration and
a virtual world is built and maintained out of something like love, it remains vulnerable to
the patience, attention span, resources, disposable income, and of course ultimately the
inescapable mortality of its founders.
The online journal Virtual Worlds Review defines a virtual world as "an interactive simulated
environment accessed by multiple users through an online interface." Six essential features
are prescribed: shared space (multiple users), a graphical user interface, immediacy
("interaction takes place in real time"), interactivity ("the world allows users to alter, develop,
build, or submit customized content"), persistence ("the world's existence continues
regardless of whether individual users are logged in"), and socialization, or a sense of
community. While presumably capturing what many immediately think of when they conjure
the mental image of a virtual world, we find this definition and these criteria limiting. For
example, the requirement for a graphical user interface would seem to exclude text-only
Multi-User Dungeons or MOOs (MUDs, Object-Oriented) from the definition of a virtual
world. Likewise, Maze War, a remarkable 1974 multiplayer game which anticipated the first-
person shooter genre, would not qualify, since there is no persistence to the game world and
precious little in the way of social interaction.
The Preserving Virtual Worlds project has instead taken a deliberately broad and catholic
approach to defining its scope and purview. Not all of the virtual worlds in our case set
exhibit each of the above properties. This is partly a result of our mandate to explore a
variety of different cases and preservation scenarios. However, it is also out of the
conviction that no single quality (three-dimensional graphics, multiple users, user-
contributed content, persistence) defines a virtual world. For our project, a virtual world
means something like a digital setting whose properties are stable and coherent enough to
deliver a consistent ludic or interactive experience to two or more users or to the same user
over time. A virtual world also, however, inevitably involves some form of imaginative
projection based on the promise of inhabitable space. An Excel spreadsheet is thus not a
virtual world because, despite being capable of producing consistent interactive experiences,
it fails to create the sense of immersive possibility so crucial to the experience of space and
place. We believe virtual worlds thrive on this human craving for possibility.
Perhaps no single example illustrates this more effectively than Battlezone, a coin-op arcade
tank combat simulator first introduced in 1980 and subsequently ported to the Atari 2600
and numerous other consoles and home computer systems. Battlezone, which was also
eventually adapted by the US Army for training purposes, presents its players with a sparse
and largely empty world, populated by abstract geometric obstacles and an array of hostile
tanks, missiles, and spacecraft. The kill-or-be-killed ethos of the game matches this stark,
vector-drawn landscape perfectly. There is, however, one unusual touch: on the horizon of
this world is a mountain range, peaks and valleys rising and falling like an EKG. Amongst
the mountain peaks is a solitary volcano, a distinctive landmark capped off with a few
blinking pixels meant to suggest lava or debris.
This volcano became the stuff of gamer legend, with the rumor spreading that if you drove
your tank far enough you could eventually get to it. And there was more: upon reaching the
volcano you could drive up its side, and peer over the rim, and inside was a castle, and inside
the castle treasure . . . One commentator puts it this way:
The attraction of Battlezone's world was so strong that many players wanted to
turn their back on the fighting and drive their tank up into the mountains to
go exploring. The designers of the game had to put in a routine to send a
missile after would-be explorers so that arcade owners wouldn't lose money
on the peaceful tourists who didn't want to fight. Many great legends
emerged from the arcades that centered on finding a way to leave the fighting
behind.
[Figure: The Volcano in Battlezone]
This behavior anticipates the appeal of later, much more complex and fully rendered virtual
worlds, such as those found in the Grand Theft Auto franchise, in which players often engage
in "sandbox" mode, forgoing the missions assigned to advance the plot in favor of ... just
driving around. (The kind of free-form exploration that Situationist Guy Debord (1956)
called a dérive.) Such activity is possible because all virtual worlds exist on a continuum
between (again using terminology from Jesper Juul, who also discusses the example of
Battlezone in this context) coherence and incoherence. A virtual world is coherent or
incoherent to the extent that the player is able to use the information that is presented to
him or her to fill in the gaps and blanks about other aspects of the world that are not literally
portrayed. Battlezone, for example, gives us no basis for conjecture as to why its
antagonists are fighting for possession of this barren planet, or how the crews of the tanks
(are they even human?) eat or sleep or what they do to pass the time when they're not all
busy fighting. But specific details, like the volcano, can become the catalyst for such
imaginative projections because they tantalize us with the promise of a more replete world
just over the vector-limned horizon. Human-computer interaction designer Stan Ruecker
(2007) refers to this in a different context as the "aesthetic function," where attention to
seemingly arbitrary cosmetic details (in this case, the erupting volcano, with its pixelated
lava) has the salutary benefit of capturing a user's attention and engendering trust in the
surrounding environment. Virtual worlds, regardless of where they fall on Juul's continuum
between coherence and incoherence, exploit and depend on the effects of the aesthetic
function on their users.
We draw no hard and fast distinctions between traditional computer games and massive,
multiplayer online virtual environments like Second Life. Our case set encompasses examples
of both, as well as several works of interactive fiction which are at least as much instances of
belles-lettres as games or entertainment. Steve Russell's Spacewar!, programmed in 1961, is thus
a virtual world for purposes of our project: indeed, it is a very literal virtual world, with its
primitive visualization of a solar system on a screen at MIT (or today at the Computer
History Museum in Mountain View). Will Crowther and Don Woods's text-only
ADVENTURE is a virtual world, its faithful recreation of Kentucky's Bedquilt Cave
complex yielding an uncannily accurate world model replete with objects, places, and non-
player characters. John Romero's and John Carmack's DOOM is a virtual world, an
architectural tapeworm enticing users to wonder what lies beyond the vanishing points of its
seemingly endless corridors and mazes: precisely the question taken up by those who
learned to build their own game levels, as well as early creators of machinima who used the
game engine as a platform for narrative. Finally, we might recall the well-known origin story
behind Will Wright's landmark game, The Sims. Wright was at work on his first game design,
a by-the-numbers shoot-'em-up for the Commodore 64 called Raid on Bungeling Bay; as he
tells it, he became captivated by the idea of using the level editor to design new landscapes
and combat zones, eventually finding this activity more compelling than the actual game
play. This is the kind of world-making (made up of imaginative projection, social
interaction, and unabashed play) that forms the basis for the virtual worlds we examined.
Cultural and Economic Significance
The word ?game? comes to us from the Old English gamen, itself descended from the
Germanic, meaning joy, fun, and amusement. This etymology is consistent with most
people's contemporary sense of the term, and so it is not surprising that games are
sometimes regarded as trivial, disposable, or at best tangential to those objects of cultural
heritage seemingly possessed of more inherent gravitas. Yet even a moment's reflection
should reveal the shortcomings of this thinking. Historically, card, tabletop, and parlor
games of all sorts have been integral to the human experience across cultures, spanning all
levels of social class and caste. Yet as often as games are relegated to the category of the
frivolous, they are just as often associated with violence, addiction, and dangerous
tendencies. Digital games in particular seem to suffer for their close association with youth
culture, and of course debates around violence and militarism in contemporary digital
gaming are constantly in the headlines.
This report does not seek to intervene in such debates, other than to point out that the sheer
topicality of computer games and virtual worlds at this particular moment in our collective
history would seem to make preserving accurate and authoritative records of them an
essential aspect of the mission of an institution such as the Library of Congress. Likewise, it
is worth remembering that the history of technology reveals consistent patterns of anxiety
around new media and new forms of immersive storytelling. Consider the following, for
example:
A whole family, brought to destitution, has lately had all its misfortunes
clearly traced by the authorities to an ungovernable passion for novel-reading
entertained by the wife and mother. The husband was sober and industrious,
but his wife was indolent, and addicted to reading everything procurable in
the shape of a romance. This led her to utterly neglect her husband, herself,
and her eight children. One daughter, in despair, fled the parental home and
threw herself into the haunts of vice.... The house exhibited the most
offensive appearance of filth and indigence. In the midst of this pollution,
privation, and poverty, the cause of it sat reading, and refused to allow
herself to be disturbed in her entertainment.
- "T. C.," The Christian's Penny Magazine and Friend of the People (1859);
Qtd. in The "Dangerous" Potential of Reading: Readers and the Negotiation of
Power in Nineteenth Century Narratives (Aliaga-Buchenau, 2003).
This warning about the perils of too much novel reading seems instantly familiar, evoking
many of the jeremiads about the addictive qualities of gaming one associates with public
debates around Everquest, World of Warcraft, and Second Life. The popular pastimes of the
present (outright dangerous or at best mere distractions from more wholesome pursuits)
are the new cultural norms of tomorrow. Just as it would be very difficult to analyze a
painting such as Manet's Olympia without reference to Titian's Venus of Urbino, it will prove
very difficult in the future to discuss games such as Star Wars Galaxies without reference to
Spacewar!. It will also prove difficult to provide a full accounting of our cultural heritage
outside the realm of gaming without preserving games as well. In an age where games such
as Tomb Raider, Resident Evil, Final Fantasy and DOOM spawn media franchises including
films, novelizations, comic books and a variety of action figures and collectibles, and where
games such as Microsoft's Halo and Blizzard's World of Warcraft provide media production
platforms for the creation of machinima videos, it is impossible to provide a proper
contextualization for much of our existing popular culture without preserving games.
Virtual and gaming worlds also exhibit increasingly complex transactions with segments of
society beyond popular entertainment, for example the placement of ads in XBox games by
the Obama presidential campaign (GamePolitics.com, 2008). Bainbridge (2007) has noted
the potential impact of virtual environments on the research community, both as virtual
laboratory spaces and as settings for economic and social research. Increasingly, any
complete understanding of modern society and culture requires an understanding of the
world of gaming, and if cultural heritage institutions are to adequately serve researchers, we
must develop means of preserving these new information resources.
Games have also redefined the social spaces around them: whether it is the coin-op cabinet
in a neighborhood tavern or pizza parlor, the video arcade as a new kind of adolescent
hangout, the changing dynamics of the family living room (where the TV set takes on a
new role as a game-playing device), or the LAN parties that characterize online gaming or
cybercafés in Seoul, it is obvious that games have reconfigured physical as well as virtual
space.
Economically, computer games have become a significant part of the global economy.
Global sales of video games were estimated at $46.5 billion in 2009 (Wu, 2010), and within
the United States, direct and indirect employment by the gaming software industry accounts
for over 80,000 jobs (Siwek, 2007). Socially, video games have become one of the most
popular forms of entertainment within the United States, with 67% of American households
playing computer or video games (Entertainment Software Association, 2010), and with
online gaming sites in the United States reporting over 190 million visitors in a single month
in 2008 (ComScore, 2009).
Challenges to Preservation
Unlike a book in a library, computer games have very poorly defined boundaries that make it
difficult to determine exactly what the object of preservation should be. Is it the source code
for the program? The binary executable version of the program? Is it the executable program
along with the operating system under which the program runs? Should the hardware on
which the operating system runs be included? Ultimately, a computer game cannot be played
without a complex and interconnected set of programs and hardware. Is the preservationist's
job maintaining a particular, operating combination of elements, or is it to preserve the
capability to produce an operating combination using existing software and hardware? Is it
both? Once these questions of the boundaries of the preservation object are addressed, there
are a host of other difficulties confronting the would-be preservationist. What information,
beyond the game itself, will we need to ensure continuing access to the game? How should
librarians, archivists and preservationists go about organizing the body of information
needed to preserve a game? What strategy should we adopt to preserve software in a
technological environment in which computing hardware and operating systems are
undergoing constant and rapid evolution? Given the costs of preservation of normal library
and archival materials, how can we possibly sustain the additional costs of preserving these
complex and fragile technological artifacts?
Given that the entire report really revolves around the challenges to preserving and
maintaining access to games and virtual worlds, here we simply run down, in abbreviated
form, the main problem domains.
- Hardware obsolescence - The original console or computing platform used to run
the game may cease to be supported or even available in the aftermarket.
- Software obsolescence - The original software needed to run the game (operating
system, drivers, frameworks) may lose support, cease development, or become
incapable of running on future hardware/software configurations.
- Scarcity - Some video games are produced in limited quantities and are subject to
the dangers of media decay. This is especially likely to be the case for special editions
and releases, recalled games, or art games.
- Third-party dependencies - Currently most emulators are developed by the game
community and are of questionable legality. They are also typically created without
the benefit of the original specifications and are themselves at risk of becoming
obsolete.
- Complex, proprietary code - and an associated lack of documentation. Digital
games are generally released as compiled binaries, with no documentation of the
build process or even of the programming languages used. Without access to
the source code or language specifications, migrating or emulating software is far
more difficult.
- Authenticity - The elephant in the digital preservation room: proving that a digital
object is what it claims to be, free from tampering or corruption. Digital games exist in
many versions between the first prototype, the official release (on multiple
platforms), and cracked or otherwise altered unauthorized editions. Especially for
older games, the only extant copy may reside in a fan-run web repository, making its
authenticity impossible to establish (see the fixity sketch following this list).
- Intellectual Property Rights - The game development industry is highly creative
and competitive, leading developers to be protective of their intellectual
property. Most have instituted restrictive shrink-wrap licenses reflecting this. And
yet, once a game is no longer actively marketable, developers are less likely to respond
to inquiries about licensing it.
- Significant properties - What are the significant properties of a game that must be
maintained with each transformation/preservation action? What makes Mario
Mario? How important are font size and color palette? What about the speed of text
scrolling or sprite movement? What about controllers? How faithful must we stay to
the original code? Significant properties are essential to define, as they play a major
role in determining authenticity.
- Context - Although not an immediate threat to the preservation of games, capturing
context is important to helping future users understand these works. This is truer for
digital games than for many other record types because, as technology advances, game
players who have only been exposed to the latest and greatest may be apt to play an
older game and say, "so what?", even though the game might have been revolutionary
for its time. For example, Sierra's Mystery House was the first text adventure to
incorporate graphics. An amazing breakthrough in its day, it seems crude in
comparison to today's virtual environments.
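As a small, concrete illustration of the routine fixity checking that underlies the authenticity concern noted in the list above, the following Python sketch records a checksum when a game file is accessioned and re-verifies it later. The file name is an arbitrary example; a checksum demonstrates only that a stored copy remains bit-identical to the copy accessioned, not which version of a game that copy represents.

# Minimal fixity sketch: record a checksum at accession time and verify it later.
# The file name used here is an arbitrary example.
import hashlib

def file_digest(path: str, algorithm: str = "sha256") -> str:
    """Return the hex digest of a file, read in chunks to accommodate large disk images."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# At accession time: record the digest alongside other descriptive metadata.
accession_record = {"file": "mystery_house.dsk", "sha256": file_digest("mystery_house.dsk")}

# At any later audit: recompute and compare.
if file_digest(accession_record["file"]) == accession_record["sha256"]:
    print("Fixity check passed: the stored copy is unchanged.")
else:
    print("Fixity check FAILED: the stored copy has been altered or corrupted.")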
About the Project
Preserving Virtual Worlds (PVW) was a two-year collaborative research venture of the
Rochester Institute of Technology, Stanford University, the University of Maryland, the
University of Illinois at Urbana-Champaign and Linden Lab (2008-10). The project is one of
several launched as part of Preserving Creative America, an initiative of the National Digital
Information Infrastructure and Preservation Program at the Library of Congress intended to
seek novel solutions for the preservation of commercial digital content. Preserving Virtual
Worlds has focused on developing methods for the preservation of digital games and
interactive fiction through the creation of standards for metadata and content representation.
The PVW project has employed a case set approach in its investigations, choosing a set of
games and using those to identify representative types of preservation problems posed by
each member of the case set. We have selected a number of games from different periods in
gaming history, from different platforms and with different intellectual property status to try
to maximize the opportunities for identification of problems. Games, interactive fictions,
and virtual worlds within the case set include:
- Spacewar! (1962) - a space combat simulation for the PDP-1 computer;
- Adventure (1977) - one of the earliest of the text adventure games;
- Star Raiders (1979) - one of the more popular and complex games released for the
Atari 2600 game console;
- Mystery House (1980) - the first work of interactive fiction to employ computer
graphics as a significant part of the game, and not just text;
- Mindwheel (1984) - an interactive fiction work, notable for having been authored by
U.S. Poet Laureate Robert Pinsky;
- DOOM (1993) - the game that popularized the first-person 3D shooter genre;
- Warcraft III: Reign of Chaos (2002) - the real-time strategy game from Blizzard
Entertainment; and
- Second Life (2003) - one of the most successful of the "social" (i.e., non-gaming)
virtual worlds. Given the large amounts of data involved in archiving all of Second
Life, our project has focused on a small set of Second Life's islands.
For each of the games in our case set we have tried to identify specific problems that might
impede a library's or archive's ability to preserve the game over the long term. Identification of
these problems has involved content analysis of the games themselves (including
documentation provided with the game) as well as research into the various games'
intellectual property status. Common problems identified for games in our case set include
the unavailability of the original game, content management problems resulting from the
complex versioning issues surrounding software, storage media obsolescence and fragility,
degradation of game play due to the use of emulation or migration, unavailability of
representation information documenting the file formats used in game software, and a
number of intellectual property issues.
Based on our investigations into the games in our case set, we have developed a set of
requirements for game preservation. We have also implemented a new ontology for game
description that draws upon both the Functional Requirements for Bibliographic Records
Final Report and the Open Archival Information System Reference Model, and successfully
used that ontology to create archival information packages for these games in conjunction
with existing digital preservation standards such as METS, OAI-ORE and BagIt (a brief
packaging sketch follows below). We have
loaded those packages into preservation repositories at Stanford University and the
University of Illinois at Urbana-Champaign.
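As one small example of what such packaging can look like in practice, the sketch below uses the Library of Congress's bagit-python library to wrap a directory of game files, documentation and metadata records in a BagIt bag with payload checksums. The directory name and metadata values are hypothetical, and the sketch omits the METS and OAI-ORE documents that the project's actual packages carry.

# Hypothetical example: the directory name and metadata values are illustrative only.
import bagit

# The directory is assumed to already hold the game files, documentation, and
# metadata serializations (e.g., METS, OAI-ORE) destined for the repository.
bag = bagit.make_bag(
    "mindwheel_package",
    {
        "Source-Organization": "Preserving Virtual Worlds project",
        "External-Description": "Archival information package for Mindwheel (1984)",
    },
    checksums=["sha256"],
)

# A repository can later confirm that the payload and its checksums still agree.
print("Bag valid:", bag.is_valid())

Because BagIt simply adds manifests and tag files around an existing directory tree, the same bag can be transferred between repositories and re-validated on arrival, which is one reason we used it alongside METS and OAI-ORE for the packages loaded at Stanford and Illinois.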
Project infrastructure included a listserv with all participants subscribed; biweekly conference
calls, both project-wide and for the technical team (primarily personnel at Illinois and RIT);
several face-to-face all-hands meetings over the course of the project; a public website,
including a blog (http://pvw.illinois.edu/pvw); and a wiki, with many (but not all) content
areas also open to the public (https://apps.lis.illinois.edu/wiki/display/PVW/Home).
Readers are encouraged to consult these online resources, as well as the project partners'
publications (see Appendix E), for further information and discussion.
About the Project Partners and Personnel
The University of Illinois at Urbana-Champaign was the lead institution, with Jerome P.
McDonough, assistant professor in the Graduate School of Library and Information Science,
serving as principal investigator. Project personnel included a diverse array of staff, faculty,
and graduate and professional students from Illinois and three other universities, thereby
extending the project's reach to additional information schools, a program in game
and interactive media design, a university research library, and a digital humanities center.
The team included specialists in game design and game studies, library and information
science, archives, digital humanities, intellectual property, and descriptive and analytical
bibliography. The participating units within the four partner institutions are further
described below.
University of Illinois at Urbana-Champaign (Lead Institution)
Graduate School of Library and Information Science (GSLIS)
The Graduate School of Library and Information Science began as the first library science
program in the Midwest, founded in 1893 by Katharine Sharp. More than a hundred years
later, it is consistently ranked as one of the very best in the field. The School's faculty
believes strongly that librarianship and newly emerging forms of information science must
develop together, in order to ensure that libraries resist obsolescence and newer institutions
learn the importance of access, privacy, and service. The mission of the Graduate School of
Library and Information Science is to provide: graduate education for leaders in research and
practice in the fields of library and information science; groundbreaking research to advance
preservation of and access to information in both traditional and digital libraries and in the
many settings outside of libraries where large amounts of critical information are collected;
and useful service to librarians and other information service providers, as well as to the
citizens of Illinois.
Rochester Institute of Technology
Department of Interactive Games and Media (IGM), and
B. Thomas Golisano College of Computing & Information Sciences
The Department of Interactive Games and Media is renowned for its innovative approaches
to media-centric computing that merge the creative design of the interactive experience with
the development of content, technologies, and systems that form the basis of such work.
The department will support, wherever and whenever possible, multi-disciplinary work that
fuses these elements in pursuit of its academic mission. IGM is composed of talented and
motivated individuals from a variety of academic backgrounds with a shared interest in
computing as it relates to interactive and social media, new media, games, simulations, and
media-centric systems of all varieties. The department's mission is to provide a sustained
educational environment that supports and encourages creative and collaborative academic
inquiry by both faculty and students into these areas. The department's programs,
coursework, research, and development efforts will provide students with the knowledge and
skills to pursue meaningful and rewarding careers in this arena, while simultaneously
advancing the field and helping to provide a well-rounded educational experience.
The Golisano College is one of the largest and most comprehensive computing colleges in
the nation. The College has garnered accolades and recognition as a premier computing
education and applied research facility. Housed in a 125,000 sq. ft. state-of-the-art building,
the College showcases cutting-edge innovation and world-class faculty who are passionate
about their work. The Golisano College includes the departments of Computer Science,
Information Sciences & Technologies, Interactive Games & Media, Networking, Security,
and Systems Administration, Software Engineering, as well as the Ph.D. in Computing and
Information Sciences program, the research arm of the College. This mixture of applied
computing disciplines is unique and allows the College to offer a strong, diverse series of
programs centered on computing, from infrastructure to the end user.
Stanford University
Stanford University Libraries and Academic Information Resources (SULAIR)
Stanford University Libraries and Academic Information Resources includes more than 30
libraries and programs supporting research, teaching, and learning at Stanford University.
SULAIR acquires and delivers library collections in all formats, establishes policies and
standards to guide the use of academic information resources, develops training and support
programs for academic uses of computers, and maintains a broad array of electronic
information resources, including the online library catalog and several hundred article and
indexing databases and electronic journal subscriptions. In each library unit, knowledgeable
professional staff provides assistance in locating and using print and online information
resources.
University of Maryland
College of Information Studies, and
Maryland Institute for Technology in the Humanities (MITH)
The College of Information Studies, Maryland's iSchool, engages in collaborative,
interdisciplinary, and innovative research, teaching, and service. We educate information
professionals and scholars, and we create knowledge, systems, and processes. The iSchool
offers Master's degrees in Library Science (MLS) and Information Management (MIM), and a
doctoral degree in Information Studies. In Fall 2009, 344 students were enrolled in the
MLS program, 141 enrolled in the MIM program, and 26 enrolled in the doctoral program.
Approximately 70% of the total student body is female. The iSchool has 35 faculty and staff
and 33 adjunct faculty representing diverse subject areas in information studies. The iSchool
serves the mid-Atlantic region.
The Maryland Institute for Technology in the Humanities is the University of Maryland's
primary intellectual hub for scholars and practitioners of digital humanities. On a day-to-day
basis, MITH functions as an applied think tank for the digital humanities, supporting faculty
fellows and engaging in sponsored research clustering around digital databases and tools,
thematic research collections, text mining and visualization, and the creation and
preservation of electronic literature, digital games, and virtual worlds. MITH has sponsored
over two dozen faculty and graduate student fellows, and serves a community of several
hundred researchers and interested members of the public who attend its events.
About this Report
This is the final report of the Preserving Virtual Worlds project, co-authored by the
members of the PVW team. While presenting broad ranging and at times in-depth
discussion, together with conclusions and recommendations and various supporting
documentation in the appendices, it nonetheless cannot claim to be a comprehensive
summation of the full range of PVW's activities over the last two years. The principal
audience for the report is library and information science professionals at the Library of
Congress and other collecting institutions who have already developed, or will soon develop,
collections policies for computer games, interactive fiction, and virtual worlds. We also,
however, hope that the report will be of interest to game developers and designers, many of
whom as yet fail to take even rudimentary steps necessary to ensure the preservation of their
own creative legacy and intellectual property; to the academic game studies community,
especially those scholars with an interest in understanding the material underpinnings of the
platforms and systems they study; and finally, but not least, to the fan and game player
communities who have already done so much to safeguard future access to the content they
cherish.
The report is made available under a Creative Commons Attribution-NonCommercial-
ShareAlike 3.0 license (see http://creativecommons.org/licenses/by-nc-sa/3.0/).
3. Games & Interactive Fiction: Collecting for
Preservation
Issues of Appraisal & Selection
In The Study of Games (1971), an essential treatment of the anthropology of games, Elliott
Avedon and Brian Sutton-Smith asked:
What are games? Are they things in the sense of artifacts? Are they
behavioral models, or simulations of social situations? Are they vestiges of
ancient rituals, or magical rites? It is difficult and even curious when one tries
to answer the question "what are games," since it is assumed that games are
many things and at the same time specific games are different from one
another, but are they? (Avedon & Smith, 1971, p. 419)
We are confronted with a number of questions here. One concerns the essential nature of
games. Either games are fixed objects (perhaps authored texts or built artifacts), or
alternatively, they are the experiences generated by a framework of rules, codes, or stories
and expressed through interaction, competition, or play. Text or performance? Artifact or
activity?
Another question is whether there are general structural similarities among all sorts of games.
Answering this question fully would be out of scope for our project, which was focused on
practical issues of collection and preservation. However, it is worth noting that the authors
of The Study of Games conclude with "seven elements in games" distilled from studies by
psychologists, mathematicians, and others. These structural elements include "procedures
for action," "roles of participant," "participant interaction patterns" and the like, taking
games away from the notion that they are stable artifacts or texts. These elements
underscore the importance of documenting interactivity as a historical phenomenon,
something that predated computers (an obvious statement to any player of Diplomacy or
Dungeons and Dragons [D&D]). As fashionable as it has become to discuss games as cinematic
or as narratives, let us not forget that actions and responses are fundamental to the nature of
interactive games. Games provide a structure within which players do something, whether
the game is baseball, D&D, or Myst, and this structure is not compelled to be a linear
narrative. Collections of digital games provide a tailor-made opportunity to document
interactivity in ways that are not provided by other media. Any collecting activity thus must
consider not only the fixed content of games as authored texts or software, but also
examples of game development, game play, and player response and activity that provide a
more complete picture of digital games as an interactive medium. We will have more to say
about the implications of these considerations later in this report.
Emphasis on both selection and appraisal underlines the diverse nature of the game-related
collections, from collections of published games well-suited to selection policies traditionally
in the domain of library collection development, to collections of complex software,
artifacts, and archives better fitting notions of appraisal and "special" collecting developed in
the realms of archives, manuscript divisions, and museums. The common element uniting
selection and appraisal is the intellectual effort of describing collecting and preservation
priorities.
Two of the PVW partners have active collecting programs: Stanford in games and virtual
worlds, and Maryland in interactive fiction. The Library of Congress asked for our input with
respect to the selection/appraisal problem, in order to aid its own eventual collecting
activities in the areas of digital games and related forms of interactive software. In this spirit,
we offer some thoughts that derive in part from our project work.
With the growth of interactive software as a basis for entertainment and cultural production,
it is safe to predict that many libraries will establish collecting programs for digital games and
related artifacts, such as hardware consoles and interface devices. The foundation for such a
collection in most cases is a library of published console- and personal computer-based
games. A collection development policy is therefore a statement of the scope and extent of
this library. Based on institution-specific program and patron need, the scope will encompass
chronological, technical, and access considerations. A minimal or basic information-level
collection, for example, would likely focus on current and recently published games that run
on readily available and currently supported platforms (XBox 360, Wii, PS3, Windows PC,
Apple Macintosh). With the progression through deeper levels of collecting (instructional
support, research, comprehensive collections), decisions need to be made about the
acquisition of content and hardware for obsolete platforms, some of which are still
supported through third-party vendors (e.g., Atari VCS, Nintendo NES) and others of which
are largely unsupported; for historical operating systems (e.g., DOS); and for titles that involve
particular difficulties with respect to security, account management, or patron access (server-
based games, handheld games, games distributed by web services such as Steam or XBox
Live). Institutions supporting research or comprehensive-level collections will likely need to
support multiple access solutions, such as a managed space for use of recent games on
currently supported hardware platforms, a "special collections" solution for archival and
historical materials, and a digital repository for media migration and long-term preservation,
as well as possible future access solutions.
Once the overall scope of the collection is decided upon, the next level of selection policy
should cover a mix of traditional library decision points (format, chronological categories,
language of publication, country of origin) and categories more specifically tuned to digital
games and virtual worlds. With respect to the former, it should be noted that "chronology"
in the digital game industry is very closely linked to decisions about platforms of interest; the
entire history of published digital games currently spans fewer than four decades, yet this
history is closely tied to a very strong notion of platform and operating system
"generations." Also, the country of origin is a particularly important matter, especially given
the historical importance of Asian game publishers, notably from Japan, Korea, and China.
Several criteria for selection decisions are oriented towards specific uses or capacities of
digital games. As forms of software, for example, many digital games can be modified or re-
used as platforms for activities that are not restricted to the original game content. Some
software platforms, such as Flash, console environments (such as XBox Live Arcade) or
even "ancient" hardware platforms, such as the Atari VCS, provide a foundation for
independent games produced by teams as small as a single designer/coder. Thus a game
might be modified from an existing title or created as a one-off project for an art installation,
used to create animated movies, or provide a basis for commentary on current events
("newsgames"). Such uses have come to define entire areas of activity that require special
tactics of collection and preservation, such as "art games," machinima, serious games, and so
on. These areas, in turn, provide specific foci for collecting activities, depending on local
priorities. Thus, a library might well decide, in addition to a broad collection of current titles,
to collect a specific type of game, such as art games or "auteur" games, as a focused
specialization. Defining such areas in a collection policy statement is a more useful approach
for collection development than typically applied categories for discussion of digital games,
such as genre, because such categories are often loosely applied and may in fact produce
artificial divisions of content based entirely on fine distinctions of game-play characteristics
rather than inclusive research priorities.
Other important aspects that may figure in a collection policy statement include
technological aspects, social or "community" aspects (particularly with respect to network-
based games), competitive aspects (digital games as e-sports), surface qualities (graphics,
audio, music) and inter-textual or cross-media qualities of games (e.g., games in relationship
to literature or cinema). Thus one collection might be differentiated from another by an
emphasis on competitive, multi-player games, including documentation of events,
tournaments, and "virtual communities" of fans through the collection of websites and
replays, or perhaps by a focus on games based on films, which might include a companion
collection of films based on games. Finally, with respect to collections based in the United
States, it may be important to differentiate regional collecting activities. Designers,
developers, publishers, and game and technology companies have historically been concentrated
in a few regions, such as Northern California, Washington state, Maryland, and Texas. Thus,
Stanford University's game-related collections are closely tied to the older Silicon Valley
Archives, and the University of Texas' efforts have thus far emphasized the collection of
papers and artifacts from Texas-based designers, technologists, and musicians.
The linkage of game collections and archival collecting efforts raises the related, but separate
question of appraisal of game-related collections that consist primarily of documentation,
personal papers, corporate records, or other categories we might call archival or special
collections material. Appraisal of games as a part of archival practice was not a core issue for
investigation by the PVW project. It refers to the process of evaluating collections of records
for acquisition, retention or disposition, an assessment made in the context of a particular
institution?s mission and/or research needs. Richard Pearce-Moses (2005), in the Society of
American Archivists' A Glossary of Archival and Records Terminology, defines appraisal as "the
process of determining whether records and other materials have permanent (archival)
value" and notes that it "may be done at the collection, creator, series, file, or item level." In
this sense, appraisal is an issue for organizations such as corporate archives in game
development companies who must make retention decisions on a regular basis; some of the
questions around this issue of corporate stewardship of archival records were addressed in
Rachel Donahue's work included in the IGDA's Before It's Too Late: A Digital Game
Preservation White Paper (Monnens et al., 2009). Academic institutions with active programs in
game studies or game development will eventually face similar issues when making decisions
about the retention of faculty and project papers in these areas. While PVW was not focused
on the appraisal process, we do note that there is a need for careful attention in archival
studies to this issue as it relates to game-related archives, particularly with respect to a better
understanding of production work-flows, intellectual property considerations, and other
matters as they are specifically shaped in the game industry. We also note that some work
has begun in this area, notably by Prof. Megan Winget at the University of Texas.
No single institution can do the work of comprehensively collecting and preserving digital
games and their history. Competent institutions must join together to build archives of
computer game history. Lay historians of digital games, mostly players and fans, have created
websites and entire communities of game players dedicated to the preservation of game
content and technology, and have made emulators and collections of game movies available
to their communities. Institutions committed to game preservation and history could enlist
these pioneers in the effort to create more permanent historical resources, and intelligent
collection policies might provide a basis for initial discussion among potential participants.
The Digital Game Canon, discussed later in this report, is one example of how such an
effort might proceed. Indeed, participation of the game industry, museums, and academic
institutions in this project can help to defuse the adversarial relationship between, say, the
emulation community and publishers by developing mutually acceptable practices with
respect to intellectual property and access. Better communication of collection priorities
from libraries and museums working in this area might even provide a solid basis for
industry support and participation.
Working with Developers
Preservation of video games and interactive fiction requires preserving not only the games
themselves but also contextualizing material necessary to understand games' origin and use.
Significant amounts of this type of material can be found in the possession of the software
developers and designers who create games and the companies that distribute and support
them. Much of this material, unfortunately, is not made public, although in some instances it
may end up in archival collections. Our project has attempted to provide an initial
classification of materials that those engaged in game preservation might wish to try to
collect from developers:
• Source Code & Other Game Assets (including text, still image, audio, video, and 3D
files): As noted in the discussion of FRBR and games as bibliographic objects, a
game can have many expressions, and while most users will wish to interact with a
binary executable version of the game, obtaining and preserving source code for a
video game can serve at least three significant purposes: 1.) it provides information
on game production that is not available from the executable version to users
interested in the underlying technology of games; 2.) it enables migration as a
potential preservation strategy for a game when emulation is not viable; and 3.) it can
improve the results of using emulation as a preservation strategy in those cases
where producing a faithful rendition of game play requires recompilation of the
source. In the case of Spacewar!, for example, in order for the online version of the
game to run correctly on an emulated version of the PDP-1 computer, minor
changes to the original code had to be made (clock speeds on modern computers
being somewhat different from those of a PDP-1). In addition to source code, having copies
of draft and final versions of assets used in a game (including text, still image, audio
files, video files and 3D models) can ease scholarly analysis and use of these
materials.
• Technical Documentation: software and hardware development teams for games
will typically accrue a variety of technical materials in the course of game
development regarding the underlying technologies necessary to enable a game's use,
ranging from documentation of APIs for particular hardware devices, programming
language documentation, their own notes on issues like enabling cross-platform
compatibility of code, etc. This information may be valuable to scholars wishing to
understand game production processes, as well as to preservationists trying to
achieve a better understanding of the platforms on which a game was designed to
run.
• Production Materials: understanding of the life cycle of a game can also be
enhanced by access to materials the game's designers used in its development. This
can include design notebooks, storyboards, mock-ups created of game objects,
scripts, character profiles, maps of the game's terrain, and similar materials.
• Designer Stories: some of the more interesting material that can be collected
regarding games' origins and use does not reside in documents, but in the designers'
minds. Interviews with game designers and recordings of designers presenting their
work at conferences can be a remarkably rich source of data for those wanting to get
a more complete understanding of the history of games' development.
• Records of Interaction with the User Community: Game companies inevitably
generate a fair amount of information in the process of interacting with their users,
including bug reports on software, records of support calls, discussion forums and
wikis. Some of this information may not be available due to privacy issues or other
considerations, but it can provide valuable information regarding the game designers'
relationship with users as well as the culture of game use.
The game source code, production assets and the technical documentation acquired and
generated in the course of game production are likely to have the greatest direct impact on
the preservability of a game. The source code and game assets widen the set of potential
preservation strategies, and technical documentation regarding the platform on which the
game was intended to run is second only to representation information about the game files
themselves in terms of its significance for supporting technical preservation activities.
Production materials, oral documentation from game designers, and records of interaction
with a game?s user community are of great value to anyone seeking to understand the life
cycle of a game over time.
Unfortunately, game companies may be unwilling or unable to part with all of these
categories of materials for varying reasons. While there are exceptions, such as id Software's
treatment of DOOM, game companies are typically not enthusiastic about publicly
releasing their source code, technical documentation or production materials, which might
alert competitors to their design processes. Any social scientist who has studied software
companies can attest that getting a programmer or designer to take the time to sit
down for an interview is difficult, as companies do not wish to sacrifice the
valuable time of their employees in an industry driven by release dates.
They are also disinclined to have their internal operating procedures made public. Records of
interactions with users can raise obvious issues of confidentiality and privacy.
This is not to say that game companies will necessarily be unwilling to share these materials,
but there are obviously a number of factors working against making them available that
libraries, archives or museums seeking to collect and preserve games must address. The most
fundamental issue for any cultural memory organization attempting to collect this material is
trust. Many software companies would view sharing something like source code as
equivalent to handing over the crown jewels. Assurances that material will be "dark"
archived and made available only at some later date or under certain conditions carry
little weight with a software company unless it already trusts the individuals
and the institution making the promises.
Familiarity is key to building trust, and so cultural memory organizations seeking to collect
materials from game designers need to continually engage with those from whom they are
seeking materials, and not just as part of the process of negotiating donation of a collection.
Librarians and archivists need to be in regular attendance at events that draw the video game
design industry, including industry-oriented conferences such as the Game Developers
Conference and E3 Expo, gamer-oriented conferences such as PAX Prime and VGXPO
and academic conferences such as the International Conference on Computer Games, the
IEEE Conference on Computational Intelligence and Games, and the Conference &
Festival of the Electronic Literature Organization. Archivists are well aware that personal
relationships can be the key to successful acquisition of materials, and maintaining a visible
presence in the venues inhabited by game companies is critical to that effort.
Game companies are also more likely to be willing to discuss donations with organizations
that they see making active contributions to the world of gaming. Cultural memory institutions
that wish to acquire materials from game developers need to have public (and permanent)
activities around video games and electronic literature. Points of presence for gaming
materials in the library, both physical and virtual (see, for example, the University of Illinois
Gaming Collection (http://www.library.illinois.edu/gaming/index.html), Stanford
University Library's Stephen M. Cabrinety Collection in the History of Microcomputing
(http://sulair.stanford.edu/depts/hasrg/histsci/index.htm), the University of Maryland's
Maryland Institute for Technology in the Humanities' Deena Larsen collection
(http://mith.umd.edu/larsen/dlcs/), and the Video Game Archive at the University of
Texas at Austin (http://www.cah.utexas.edu/projects/videogamearchive/index.html)), as
well as the sponsoring of events such as workshops on gaming research and game nights, help the
institution promote an image of itself as actively contributing to gaming culture. To the
extent that an institution is seen as having goals that align with the gaming industry regarding
the importance of promoting games, it will be more likely to be seen as trustworthy by the
industry.
Choosing Versions: What's the Difference?
Digital games and virtual worlds are not static or homogeneous entities. Like books with
different printings, editions, and translations, like art works with derivatives, prints, copies,
and fakes, like films with director?s cuts and abridgements, games often exist in numerous
different versions. These versions include ports for different platforms, patches and
upgrades, user-contributed modifications (?mods?), pirated copies and hacks, and sequels
that may prove more or less contiguous to the original. And here we are only talking about
the game as a finished product. During development, the game will iterate through alpha and
beta versions with greater or lesser availability to the public or to smaller communities of
play testers. The constituent digital components of the game, meanwhile, from source code
? 25?
to graphics, movies and soundtrack, will manifest their own complex hierarchies of variants
and version forks on the development tree. Little wonder then that programmers rely on
version control repositories to manage these relations as they check code or master files in
and out of the object library in the course of their work. Versioning is, in short, a fact of life
in the digital world; indeed, one can argue that it is essential to the ontology of the digital at
its most fundamental level, since (strictly speaking) every time a file is accessed a new copy of
it is created.
Libraries and collecting institutions will face constant decisions over which "version" of a
game or virtual world to acquire, often also including the question of which source media it
is stored on. Limited resources will inevitably force trade-offs and decision-making.
One cannot collect?let alone preserve and maintain access to?everything. Sometimes a
particular version of a game will be important to collect since it is the first of its kind, even if
it was never widely disseminated. Adventure from our case set is just such an example. The
version of the game that propagated over the nascent ARPANET, and then to nearly every
mainframe and personal computer system in the industry thereafter, was not the "original"
programmed by Will Crowther, but rather the expanded version of the game released a year
or so later by Don Woods. Anyone who remembers playing Adventure on their first computer
almost certainly is recollecting the expanded Woods version, not the Crowther original. Yet
Crowther?s source code has recently been recovered, is now available on the Internet, and
has been implemented in a variety of different virtual machines. Which version of Adventure
does a collecting institution want then, the Crowther original or the version modified and
expanded by Woods that proved so influential? The answer, of course, is probably both, but
recognizing the difference may not be within the capability of a cataloger or archivist who
does not specialize in games research.
Much like an upside down stamp, some versions of games are significant because they
contain bugs, so-called Easter eggs, or other anomalies. Sometimes the question of version
will be dictated by access: an institution may reach a decision to collect only PC versions of
games, because that corresponds with the vintage hardware they have on hand. Sometimes
version may be dictated merely by happenstance: what comes into an archivist's hands as the
result of an acquisition or what is available that day on eBay. This was precisely the method
by which we obtained our Macintosh-compatible copy of Mindwheel, a format not even listed
in the Moby Games entry for the game (http://www.mobygames.com/game/mindwheel).
(It was left on the doorstep of one of the project members, who narrowly avoided stepping
on it on his way out.)
We have done considerable work with FRBR as a framework for mapping relationships
between different versions, ports, and storage media for games, with Adventure serving as our
focal point (McDonough et al., 2010). While librarians have long recognized the distinction
between a work as an intellectual creation and its embodiment within a particular physical
form (and the need to adequately describe both), the publication of the Functional
Requirements for Bibliographic Records Final Report by the IFLA Study Group on the
Functional Requirements for Bibliographic Records (FRBR) marked a pronounced increase
in the level of attention that the library community has devoted to these issues. FRBR
proposed a formal model for bibliographic description that recognizes four classes of entities
as implicated in descriptive practice: Works (unique intellectual or artistic
creations), Expressions (the realization of Works), Manifestations (the physical
embodiment of particular Expressions), and Items (single exemplars of a Manifestation).
Attributes commonly found in bibliographic description, such as publisher or title, are
bound in the FRBR model to one of these four entities.
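To make these four entities concrete, the sketch below (our own illustration, not a formal FRBR binding or cataloging schema) renders them as simple Python classes and records one tentative mapping for Adventure; the attribute names and example values are assumptions chosen purely for illustration, and the discussion that follows shows immediately why even so tidy a mapping is arguable.

    # Illustrative sketch only: a minimal, hypothetical rendering of the four
    # FRBR Group 1 entities as Python dataclasses. Attribute names and values
    # are our own assumptions, not a standard FRBR binding or cataloging schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Work:
        title: str                          # the abstract intellectual creation
        expressions: List["Expression"] = field(default_factory=list)

    @dataclass
    class Expression:
        work: Work                          # a realization of the Work (e.g., one code version)
        description: str

    @dataclass
    class Manifestation:
        expression: Expression              # a physical embodiment (e.g., a specific file)
        md5: str                            # checksum used here to identify that embodiment

    @dataclass
    class Item:
        manifestation: Manifestation        # a single exemplar (e.g., one disk in one box)
        location: str

    # One possible (and debatable) mapping for Adventure:
    adventure = Work(title="Adventure")
    woods = Expression(work=adventure, description="Crowther/Woods expanded version, FORTRAN")
    adventure.expressions.append(woods)
    binary = Manifestation(expression=woods, md5="d41d8cd98f00b204e9800998ecf8427e")  # placeholder hash
    copy_on_disk = Item(manifestation=binary, location="floppy disk, hypothetical Box 142")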
At first it might seem that all versions of Adventure should be grouped under a single
"Work," a particular instance of the game (the last version modified by Don Woods, for
instance) should be the "Expression," a particular file with a unique MD5 hash should be the
"Manifestation," and an individual copy of that file (perhaps on a Commodore 64 664-block
disk) would be the "Item." But what if the text read by the reader is exactly the same, but the
underlying code is different? These variants might be simple (a comment added to the
FORTRAN source code), peripheral (such as the ability to recognize "x" as a synonym for
the command "examine"), or very large (a port of the code from FORTRAN to BASIC).
Should these code-level variants be considered different expressions? To further complicate
matters, what if the FORTRAN code was exactly the same but compiled for two different
chips? For example, an IBM mainframe and a Commodore 64 might both have a
FORTRAN compiler, but the two compilers will translate the FORTRAN into different sets
of machine instructions. It might also be the case that two FORTRAN compilers
designed by different programmers will generate slightly different machine language. Even
the same compiler might generate slightly different machine code from a single source code
file depending on the options with which it is invoked. Should these compiled executables,
different in their binary structure but based on the same FORTRAN code, represent
different "Manifestations" or different "Expressions"?
Finally, even two files with exactly the same MD5 signature participate in a larger software
environment at runtime. The drivers that run the display interface, the keyboard, the
memory, and the disk drives arguably become part of Adventure when the user is playing the
game. For instance, the experience of playing the game using the 6510 chip in a Commodore
64 hooked up to a black-and-white television may be different from the experience of playing
the game on the same chip in a Commodore SX-64 (the all-in-one machine some felt fit to
call "portable"). Should the software environment on which the binary is executed be a part
of the classification scheme at all? Would playing the game on a video monitor (which
displays only a fixed number of lines at a time) provide a substantially different experience
from a session with the same game played on a Teletype (which saves the output indefinitely
on paper)?
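For the narrowest of these questions, whether two archived copies embody the same bit-level Manifestation, a checksum comparison is at least mechanical. The sketch below is a minimal illustration (the file names are hypothetical); as the discussion above makes clear, identical digests say nothing about whether the surrounding software and hardware environments, and hence the experiences, are the same.

    # Sketch: checking whether two archived copies are bit-for-bit identical.
    # File names are hypothetical; identical digests establish only that the
    # stored bits match, not that the runtime experience would be the same.
    import hashlib

    def md5_of(path: str, chunk_size: int = 1 << 20) -> str:
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if md5_of("adventure_copy_a.img") == md5_of("adventure_copy_b.img"):
        print("Same Manifestation at the bit level")
    else:
        print("Different files: possibly a different Expression or Manifestation")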
Such questions are reminiscent of issues that have long been debated in more
traditional arenas for cultural heritage, such as literature and textual scholarship. "The root
cause of any version is revision," the Melville scholar and editor John Bryant (2002, p. 70)
has contended. But what is the threshold by which mere variants in a text promote the text
to a new version of the work? Bryant delineates the following characteristics: that versions
may be either physical (literally distinct documents) or inferred from the evidence available
on one or more extant documents; that a version of a work can always be linked to another
version; that versions are revisions of the work, and that they may be initiated, either
deliberately or inadvertently, by the author or some other agency like an editor or the public
at large; that versions do not depend on the authority of the author to be authorized as
versions; that versions are manifest by their degree of difference from other versions; that
versions entail some reconceptualization or reimagining of the work in question; that
versions are partly defined by their impact on their audiences; and finally, that versions are
constructed critically by virtue of other artifacts in the documentary field, meaning that the
existence of a version is always finally arguable. If we step back from this delineation, what
we have then is an account of versioning wherein a version itself is always critical,
contingent, relative or relational, functional, rhetorical, and perhaps above all, consequential.
This seems no less true for digital games and virtual worlds than for the literary and
historical documents that concern Bryant. A version is not an inferior derivative of some
legitimized and sanctioned work, nor is it simply a collection of observable variants. Versions
are, instead (to paraphrase Gregory Bateson) the differences that make a difference.
Our work with Adventure and other digital games has highlighted two deficiencies in the
FRBR entity-relationship model. The first problem arises from the complex tangle of
derivative works associated with any particular game. Neither catalogs as they exist today nor
FRBR provide sufficient facilities to ease collocation of these works for users. Computer
games provide one of the stronger arguments for the concept of a superwork and for adding
support for superworks to our bibliographic systems and to the FRBR model. The second
problem is the omission of any mention of intellectual property rights within the FRBR
model. While the IFLA Study Group (1997) made it clear that they were not enumerating
every attribute of or relationship existing between bibliographic entities, the failure to
account for intellectual property relationships between Group 1 and Group 2 entities is
extremely problematic for those attempting to describe computer games, and we suspect
much other digital material. Alignment of legal theory and cataloging theory regarding the
separation between artistic/intellectual creations and their expression in particular forms is,
we suspect, a difficult task that will require the input of both communities.
In sum, there are no easy formulas or checklists to aid in decision-making regarding which
versions of a digital game or virtual world to collect. There is no one rule of thumb or
absolute heuristic. Bryant's criteria, while compelling, are likely to remain resistant to formal
codification. Unraveling the relationships between even the simplest digital works and their
constituent elements is orders of magnitude more challenging than for most artifacts in the
analog world. That said, these same challenges lend themselves well to visualization and
mapping techniques, and one can imagine catalogers and archivists working with increasingly
sophisticated tools to articulate and display the relationships between digital objects in their
collections. All of this also requires considerable subject knowledge, and here the input of
both game scholars and fan communities is likely to be invaluable.
Context Information Reconsidered
Unlike many creative works, video games and virtual worlds are rarely able to stand on their
own as time progresses and technological advancements render them obsolete. That which
was startlingly innovative in the early 1980s appears simplistic to the modern eye. Take, for
example, Mindwheel (1984) and Fallout 3 (2008), two apocalyptic adventures played from a
first-person perspective.
[Screenshots: Mindwheel (Pinsky & Synapse Software Corporation, 1984) and Fallout 3 (Bethesda Softworks, 2008)]
Fallout clearly has superior graphics; Mindwheel has none and, even if it did, the 16-color
graphics of its day would not be competitive. The difference in sound quality is similarly
dramatic. Traversing the world of Fallout 3 is an instantaneous experience, interrupted only
by loading screens for significant changes in setting. In the Mindwheel screenshot above, nearly 20
seconds elapsed between entering the command "DOCTOR, BEGIN GAME" and the text
fully loading. In its simplicity, Mindwheel's data barely reach the megabyte range, while Fallout 3 occupies
several gigabytes.
Games and interactive fiction are not the only works that suffer when viewed after their
time. Virtual worlds, from Second Life to Neopets to World of Warcraft, are not experienced in
a vacuum. The value and meaning of a virtual world is primarily derived from the actions
and interactions of its players. Imagine stepping into Second Life, which doesn't even have the
benefit of plotlines or non-player characters (NPCs), years after the last user signed off.
world would be empty; interactivity limited to the virtual equivalent of archaeology:
examining buildings and prims in an attempt to build a picture of how Second Life was lived.
So how do we prove the historical significance of Mindwheel or other early video games to a
modern gamer? By spending as much effort preserving the context of gameplay as the
software that enables it.
Literature & Ephemera
As with any type of history, contemporary publications play a large role in providing an
adequate context in which to understand video games. The most relevant of these are
documents such as game manuals and press releases, and third-party publications including
strategy guides, reviews, and interviews.
A video game?s retail package is rarely limited to a copy of the software, especially if it was
released before the advent of downloadable content and sophisticated digital rights
management (DRM) technology. The box, a valuable artifact itself, typically contains
instruction manual, registration card, and advertisement or catalog at a minimum. Depending
on the game, maps, control cheat sheets, posters, and comics or other game related fiction
might also be included. Mindwheel was bundled with a 93-page novella. Many older computer
games required information from these physical materials as a way to prevent illegal copies
from being played.
The materials bundled with a game become particularly important if the game itself is no
longer playable. Manuals include introductions to plot and characters, screenshots, credits,
and explanations of controls. These controller instructions can provide some limited insight
into gameplay. For instance, in Mortal Kombat II, the character Shang Tsung can transform
into any of the other player characters with specific sequences of one to four button presses
(or holds). This type of complex button combination for special moves is as intrinsic a part
of the fighting genre as the hardboiled detective is in film noir. If nothing else, this may aid
with understanding "button mashing" in a time when games are increasingly controlled by
motions instead of joysticks. In a similar fashion, the keyboard card packaged with Eidos'
Thief: The Dark Project illustrates the level of thought and detail put into the game's design
as well as the difficult controls required by such nuance. When text adventures were the
games of the day, players often hand drew maps to keep track of their location; in the golden
age of the 16-bit role playing game (RPG), world maps became common pack-ins.
Strategy guides, generally published by third parties, provide a much richer picture of game
play than can be gleaned from the materials a game is packaged with. By their nature, these
books offer detailed descriptions of game play from beginning to end, and often pair
screenshots with the guide text. Strategy guide content may include:
• Screenshots
• Maps
• Concept art
• Tips for mastering difficult game controls
• Puzzle solutions
• Lists of in-game items, their purpose, and where to find them
• Cheats, secrets, and Easter eggs
The last item on the list might be viewed as a type of game developer marginalia (or graffiti).
Easter eggs are messages, events, or effects hidden by game developers that are typically not
encountered during regular game play, or that require inside knowledge to catch if they are.
They often contain programmer signatures, homages to earlier titles, advertisements for
upcoming games, or genre in-jokes. Sierra On-Line was notorious for using Easter eggs as
marketing tools. In particular, the various Quest series all contained references to each other;
King's Quest II hides a preview of Space Quest I in a snake home and a guard in Space Quest II
asks if you've played King's Quest ("Easter Eggs and cheats," n.d.). DOOM has a well-known
cheat code ("idkfa") which, when typed during game play, gives the player every weapon, full
ammo, every key, and full armor. In Heretic, another id Software title, typing this code
removes all acquired weapons and calls the player a cheater ("Easter egg," 2010).
Promotional materials released by game developers with the launch of new titles can provide
important insight into the contemporary state of video game technology. While marketing
claims are likely to be exaggerated, they are unlikely to be outrageously overblown. Thus,
given that both Blizzard's 2010 press release for StarCraft II and Activision's 1980
promo reel emphasize graphics, it can be assumed that each of the games is fairly
representative of the state of graphics technology at the time of its creation. This is the type
of detail that helps with the Mindwheel vs. Fallout scenario presented above; with it, a modern
researcher unfamiliar with the timeline of video game advancement gains a better
understanding of the quality of a game at release.
As with literature, game ephemera may include items from both the game developer and
licensed third parties. These tie-in items range widely in form and material, from ordinary
posters and patches to pocket lint and potions. The table below illustrates some of this
variety.
Item                                                Manufacturer                  Year
Small Centipede Poster                              Atari                         1982
Dragon's Lair Lunchbox and Thermos                  Aladdin Industries Inc        1983
Q*Bert Miniature Figures                            Kenner                        1983
Mindwheel novel                                     Synapse and Pinsky            1984
Packet of pocket fluff (Packaged with
  The Hitchhiker's Guide to the Galaxy)             Infocom                       1984
Nintendo Cereal System                              Ralston Cereals               1988
Breath of Fire Map                                  Capcom                        1993
Chrono Trigger Key Chains                           Bandai                        1995
The Book of Atrus (First in a series of
  novels inspired by Myst)                          Miller                        1997
Trance Vibrator (Packaged with Rez)                 Sony Computer Entertainment   2001
DOOM: The Boardgame                                 Fantasy Flight Games          2004
Yoshi Tag and Run Meter                             McDonald's Happy Meal toy     2006
Master Chief Costume                                Rubies Costume Co             2008
Monster Hunter Health Drink                         Bandai                        2008
Sonic the Hedgehog Plush Backpack                   Hot Topic                     2009

Key: Developer/Third Party [Plain]; Retail Package [Bold]; Limited Edition [Italic]; Nostalgic [Bold Italic]
The meaning constructed by these items is as diverse as their physical forms:
• The lunch kit, Happy Meal toy, and cereal indicate popularity with school-age
children.
• The map implies elaborate world building (for the time).
• The pocket fluff appealed to the humor of Douglas Adams fans.
• The Myst novels represent a reversal of the usual path of adaptation; book to game
(as was the case with The Hitchhiker's Guide to the Galaxy) is much more common.
• The board game could imply either outreach to the wider (non-video) gaming
community, or greater than average popularity among tabletop gamers.
• The costume speaks to the devotion of the game's fans, especially when accompanied
by photographs of those fans dressed up for video game or genre-themed
conventions.
• Nostalgic items like the Sonic backpack, faux vintage t-shirts, and rereleased classic
games speak to games' lasting influence upon American culture.
In addition to these textual and artifactual materials, soundtracks, movies, television series,
audio recordings, and orchestral performances have all emerged from popular video game
franchises. While the specific details of ephemera are important for contextualizing games,
that some varieties exist at all is a valuable appraisal aid. While any game might be sold with a
poster or figurine, only the leading titles are likely to inspire entire lines of action figures or
musical arrangements. Certainly, there are many factors other than simple popularity that
contribute to a game?s enduring value, and not all titles of import come with a large market
impact. But, as a history of video games would be incomplete without the blockbusters,
some decisions for a large repository are simple.
4. Collections
Collections: Bibliographic & Archival Description
Bibliographic records are one of the primary tools that any library has for management of
collections. As has been noted with regards to moving image materials (McDonough &
Jimenez, 2007), however, catalog records that may serve perfectly adequately as mechanisms
to aid access often fail as mechanisms to aid in preservation. In the case of computer games
and interactive fiction, traditional bibliographic description may not even prove adequate for
purposes of access.
Our project focused on research collections of games, such as those at Stanford University,
when considering issues of description for access, and our examination of current practices
of bibliographic description was performed with an assumption that our main interest was in
determining whether they adequately supported the research activities of scholars concerned
with games and related materials. We note that scholars' interests in these materials should
not be considered uniform. The needs of scholars working within a program such as the
Department of Interactive Games & Media at RIT, which is focused on game design and
development, are not identical to those of scholars working within the digital humanities
who might be interested in analyzing the rhetorical nature of games (Bogost, 2007).
Another point which is often neglected in library and information science literature is that
bibliographic information does not only support the work of patrons; it must also support
the work of librarians and archivists. Coyle (2004) has examined the ways in which the
current discussions of the FRBR final report and its implications for bibliographic record
design have had an unfortunately narrow view of the functions which should be included in
"Functional Requirements," particularly with respect to libraries' internal operating
procedures. When examining forms of bibliographic description from the point of view of
preservationists, we have tried to keep in mind that description must support the work of
information professionals as well as that of the research community.
A significant problem with existing practices of bibliographic and archival description from
the point of view of scholars interested in computer games and interactive fiction is the lack
of detail provided by existing formats and descriptive practices, particularly with respect to
the issue of versioning and editions of particular games. Consider this record from the
Library of Congress' online catalog:
[Library of Congress online catalog record for DOOM]
This record does provide some information regarding the version of the game, inasmuch as
it indicates in the title that it contains "Episode 1" of DOOM and that the CD-ROM in
question apparently dates from November 1994. However, by November 1994, DOOM was
already at version 1.7a; in fact, versions 1.2 through 1.7a were all released during the first
eleven months of 1994, and there are significant differences in game play, technical support
and capabilities between the different versions. There is also nothing in this record to
indicate whether this CD-ROM contains a copy of the shareware version of DOOM, or one
of the commercial releases. If the shareware version, it would only contain Episode 1, but
the mail-order releases of the original DOOM contained further episodes. For a scholar
researching games, this record is singularly unhelpful with respect to the version of the game
contained within the Library's collection.
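As a thought experiment, the version-level detail a researcher would need might look something like the sketch below. The field names are our own invention rather than MARC fields or any established schema, and the values illustrate the distinctions discussed above rather than describing the actual CD-ROM in the Library's collection.

    # Hypothetical sketch of the version-level detail missing from the record
    # discussed above. Field names are invented for illustration; they are not
    # MARC fields or part of any established metadata schema, and the values
    # are illustrative rather than a description of the actual CD-ROM.
    doom_record = {
        "title": "DOOM, Episode 1",
        "publisher": "id Software",
        "carrier": "CD-ROM",
        "carrier_date": "1994-11",
        "software_version": "1.7a",          # one of several versions released during 1994
        "release_type": "shareware",         # as opposed to a commercial mail-order release
        "episodes_included": ["Knee-Deep in the Dead"],
        "operating_system": "MS-DOS 3.3 or higher",
    }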
Even the minimal level of detail in the actual catalog record, unfortunately, counts as the epitome of service when compared against
traditional descriptive practice found in archival finding aids, where item-level description is
minimal at best. The finding aid of the Stephen M. Cabrinety Collection in the History of
Microcomputing at Stanford University Libraries provides the following information for
anyone looking for the game Star Raiders:
Box 136 Atari, Inc. ST Star Raiders, 1986
Physical Description: 1 computer disk ; 5 ¼ in. Atari
Box 72 Atari, Inc. Star Raiders, 1982
Physical Description: computer cartridge Atari
Box 134 Atari, Inc. Star Raiders, 1982
Physical Description: computer cartridge Atari
Box 142 Atari, Inc. Star Raiders, 1980
Physical Description: computer cartridge Atari
Box 152 Atari, Inc. Star Raiders, 1982
Physical Description: computer cartridge Atari
Presumably we can depend on the fact that Box 136 contains the version of Star Raiders for
the Atari ST, but as for the rest, whether the cartridge in Box 142 is intended for an Atari
2600 system or one of the Atari 400/800 systems is unclear, and for the remaining three
boxes, the cartridges could in theory be for the Atari 400/800, 2600 or 5200 systems. And
only a scholar with knowledge of the exact release dates for the various editions and
platforms would be able to deduce that information from this finding aid.
As discussed in Chapter 3, our project did extensive exploration of the application of the
FRBR data model to game materials, and found that it provides reasonable if not perfect
support for identification of versions and editions of games. However, FRBR in and of itself
is not a solution to the problems of adequate description of game materials. Analysis of
the current MARC and Anglo-American Cataloging Rules (Delsey, 2002) shows that the
bibliographic format and rules of description employed by the library community already
provide support for much of what is contained within the FRBR Final Report. What is
needed is a commitment to providing an enhanced level of description necessary to support
scholarly work. While clearly there are painful financial ramifications for any library (and
even more painful for any archive) trying to create more detailed bibliographic descriptions
of these materials, existing practices are not adequate to meet the needs of the scholarly
community.
Nor do these descriptions provide anything like the information that would be needed by
preservationists, for whom the details of version and edition information are equally vital.
The cartridges employed by varying Atari systems (e.g., 400/800, 2600, 5200) differed in the
pin-out arrangement used by the cartridge slots to physically interface the cartridge with the
rest of the system. While Atari systems are still available on the secondary market today,
eventually these systems will cease to function, and any ability to access the information on
such a cartridge will be dependent on our ability to read the data off the cartridge and move
it into some form of emulator. Without sufficiently detailed knowledge of what types of
cartridges are in hand, a library or archive would not even know what additional technical
information it might need to collect in order to ensure long-term access. While the
bibliographic record for DOOM above indicates that DOS is required to run the software, it
fails to provide any indication of which version of DOS is necessary.2 Long-term access to
software requires detailed knowledge of the technical environment in which it was designed
2. The original DOOM required MS-DOS version 3.3 or higher.
to execute. Current descriptive practices do not provide anything resembling sufficient
information on the technical aspects of software to support preservation activity.
One of our critical findings regarding the preservation of gaming materials is that
preservation of a game itself is insufficient; we need to also preserve the information that
contextualizes the game and helps researchers achieve a more complete understanding of the
game?s significance and use. Much of the contextualizing material that we might wish to
preserve, however, is archival in nature, existing in relatively few copies. The UT Videogame
Archive at the Dolph Briscoe Center for American History at UT Austin, for example, has
papers and files donated by Warren Spector, a game designer famous for his contributions to
the Ultima series of games from Origin Systems, Inc. and Wing Commander. These materials
are unique, one-of-a-kind contributions, which is to say, the type of material archives
specializing in personal papers and manuscripts acquire on a regular basis. A scholar at
Stanford University looking at the Ultima games contained within the Stephen M. Cabrinety
collection would probably be extremely interested in the material held at UT Austin, but
bibliographic links between collections held by different institutions are so rare as to count
as non-existent. Preservation demands that libraries, archives and museums begin to rethink
their practices of bibliographic description and consider the possibility that catalogs and
finding aids should help users obtain the information they need regardless of whether it
happens to be housed at the current institution. The fact that a catalog provides a valuable
service as an inventory of an institution?s holdings does not mean that that is the only
purpose it should serve.
Problems with linking bibliographic information with databases maintained by other
organizations are not limited to the realm of descriptive metadata. One of the basic tenets of
the Open Archival Information System Reference Model is that an archival information
package for any digital object should contain representation information that allows a user to
interpret the bits comprising an object as syntactically valid and semantically meaningful
information. As much of this information will be identical for different objects of the same
data format, many institutions are hoping that registries such as the developing Unified
Digital Format Registry will become hosts for representation information needed by other
organizations. But if this is to occur for games, we need to be able to link the individual files
comprising a game stored within an institution's repository systems with representation
information stored elsewhere. This will require the addition of linking information that
libraries, archives and museums have never had to store previously, as well as the use of
standards for linking within a distributed system of repositories, standards that are at best
inchoate at the moment.
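What such linking information might look like is sketched below. The field names, registry address, and format identifier are hypothetical, intended only to suggest the kind of pointer from a locally held file to externally hosted representation information for which current descriptive practice offers no standard place.

    # Hypothetical sketch of the linking information an archival information
    # package might need to carry: a pointer from a locally stored file to
    # representation information hosted by an external format registry.
    # The registry address and identifier below are invented for illustration.
    representation_link = {
        "local_file": "doom/DOOM1.WAD",
        "checksum_md5": None,                                  # computed at ingest
        "format_registry": "https://registry.example.org/",   # a UDFR-like service
        "format_identifier": "example:fmt/0001",               # hypothetical identifier for the WAD format
        "registry_record_retrieved": "2010-06-15",
    }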
As the above discussion should make clear, the problems with descriptive metadata created
within libraries, archives and museums today are not a matter of existing metadata being
erroneous; rather, the existing metadata is simply insufficient. Given the increasing costs of generating metadata
manually, this is clearly problematic. While some of the forms of metadata described above
(e.g., links to representation information) will require decisions by professionals with
significant expertise within the preservation field, other forms (e.g., contextualizing
information) could be provided by anyone with a strong interest and knowledge of games.
Given the number of individuals answering to that description, for games at least, cultural
memory organizations should explore how they might enlist the aid of the communities they
serve in the creation of metadata necessary to preserve these objects.
Collections: Data and Documentation
Collections of digital games and virtual worlds are, to a large extent, collections of software.
Many of the preservation issues faced by repositories holding these collections are therefore
problems of software preservation. However, software preservation addresses only one
aspect of digital games and virtual worlds; software preservation was an
important focus of the PVW project, but other, equally challenging problems emerged and
were addressed by the project investigators.
It is important to emphasize here that virtual worlds are historical in two senses of the
phrase: They are worlds of historical interest, and they are going to go away. In PVW, we
explored some of the implications of the life-cycle of virtual worlds, especially of their
extinction, for thinking about how the history of computer-based "worlds," as well as their
use by communities of players or "residents," could be documented. The moment when a
virtual world "is history" (when it shuts down) reminds us that every virtual world has a
history. These histories of individual virtual worlds are inextricably bound up with the
intellectual and cultural history of virtual world technologies and communities. They are also
venues for historically specific events and activities. An important part of the historical
context for virtual world history is the fact that human beings (through their avatars) fill
these digital environments with meaning that emerges from their activities in social spaces,
regardless of whether the spaces are synthetic (digital) or physical. So, in addition to
preserving the software that provides the technical underpinnings for virtual worlds, our
project faced the problem of how to identify, collect and preserve documents that convey
events and activities that take place in virtual spaces.
Perfect Capture
Thinking of a game world or a virtual world as historical brings us directly to issues of
historical documentation, digital preservation and curation of virtual worlds. What will
remain of virtual worlds after they close down, either individually or perhaps even
collectively, i.e., when the technology has become passé?
A notion that plays into the preservation discussion is particularly relevant here: namely, that
of the potentially perfect reproduction of digital data. Recall that our digital personae, our
avatars, and our player characters are ultimately all bits of data on a machine. Death switches
count on that. If we can only get access to these data, shouldn't it be possible to copy them
forever?
Consider an actual historical case, that of Chris Crosby, a.k.a. NoSkill. Crosby was the first
of the highly skilled players of the on-line multiplayer game DOOM to be recognized as a
"Doomgod." An active player from about 1994 to 1996, the young father was killed in a car
crash in 2001. His memorial site on the web, like many others, depicts him in the prime of
life, holding his young son, but it also offers a number of files for downloading (NoSkill
Memorial Site, 2004). These files are demos recorded from games he played between May
1995 and April 1996. A demo, also called an lmp (from its .lmp or "lump" file extension), is
a replay file. It is a recording of a game session in the form of a sequence of commands that
correspond to input control states during each frame of the game, or "tic." DOOM players
could generate a demo file simply by launching the game with the "-record" command-line
parameter. In other words, they could create a script (a sequence of instructions)
generated from game data and save it as a
demo recording. A recording in this format is much more compact than video captured
from the screen. The catch is that the demo data must be run and executed inside a copy of
the same game from which it was generated, and even from the exact same version of that
game, if the game engine is to render the action correctly.
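As a rough illustration of why version matters, the sketch below reads the per-tic commands from a vanilla DOOM demo file, assuming the layout commonly documented for the later DOS releases (a 13-byte header followed by four bytes per tic and a 0x80 terminator). Earlier releases used a different header, so a reader written against one version will misinterpret demos recorded with another, which is precisely the fidelity problem described above; the code is an assumption-laden sketch, not a definitive parser.

    # Rough sketch only: reading the per-tic input commands from a vanilla DOOM
    # demo (.lmp), assuming the commonly documented layout of the later DOS
    # releases (13-byte header, 4 bytes per tic, 0x80 terminator). Earlier
    # versions used a different header, which is exactly why a replay must be
    # run against the matching release of the game engine.
    import struct

    def read_demo(path: str):
        with open(path, "rb") as f:
            data = f.read()
        version = data[0]                   # e.g., 109 for a demo recorded with v1.9
        skill, episode, game_map = data[1], data[2], data[3]
        tics = []
        offset = 13                         # skip the remainder of the header
        while offset + 4 <= len(data) and data[offset] != 0x80:   # 0x80 marks the end of the demo
            forward, strafe, turn = struct.unpack_from("<bbb", data, offset)
            buttons = data[offset + 3]      # fire / use / weapon-change flags
            tics.append((forward, strafe, turn, buttons))
            offset += 4
        return version, skill, episode, game_map, tics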
It is easy enough to download Chris Crosby's demo files from his memorial site and play
these files inside the correct version of this old game, originally published towards the end of
1993. Indeed, DOOM is in the collection of digital games that have been preserved as part of
PVW, so that several versions of the game, including the version Crosby played, will be
available in the Stanford Digital Repository, for example. Playing Crosby's replay files is a
profound experience. It means nothing less than experiencing a dead (historical) game
through the eyes of a dead player, that is, seeing the game as he saw it. NoSkill in this sense
comes back to life, as the replay file activates the game engine to carry out the exact
sequence of actions enacted by the now dead player. Again, in this first-person shooter, we
can see the game action through NoSkill?s eyes. The player is dead, but it is now possible for
his avatar in some sense to live on through an act of perfect reproduction.
Historians will be unable to avoid contrasting the potentially infinite repetition and perfect
reproduction of NoSkill's game-play with the fading memories of his life and death. At the
same time that we are powerfully affected by revisiting a past experience, we know that what
we are seeing may be an historical event, but without documentation it is not history. It is a
remarkable act of software and data preservation, but it would be a mistake, as we begin to
stage early work on preservation of games and virtual worlds, to frame projects like
PVW primarily, or even exclusively, in terms of software preservation and the perfect-
capture mode of game replays. This would be a barren exercise with respect to the
documentation of the events and activities, the history, that have occurred in these worlds.
Perfect Loss
This is because future historians and others interested in the history of virtual worlds will not
just want to experience what it was like to play a historical game or visit a world like Second
Life; they will want to know much more about the things people were doing in virtual
worlds, why they were doing them, and what their activities meant to them.
So perfect event capture with respect to digital data is possible, and replay offers a paradigm
for perfectly reproducing the past, even seeing through the eyes of players who are no longer
with us. From a historian's point of view, perfect capture is half of a paradox, for it must be placed alongside the very real possibility of "perfect loss" in digital spaces. If we save every bit of a virtual world, its software and the data associated with it and stored on its servers, it may still be the case that we have completely lost the history. To date, no virtual world has been produced that offers vestiges or traces of the past after transaction logs or content have been removed from it. When the data is gone, it's gone. That is not the entire
problem, however.
An example might help here. A few years ago, a series of nasty protests in the virtual world Second Life led to an attack on in-world buildings owned by the National Front. Reports about this clash appeared in blogs and forums visited by members of the Second Life community and others interested in virtual worlds not long after the events had occurred. After reading a witness's account of the events, many readers jumped into Second Life to see what was going on, only to find there was absolutely nothing to see. The National Front had already abandoned its Island and deleted all of the content there, essentially stripping the turf of every trace and artifact. The "world" revealed nothing of what had recently been a hotbed of activity and conflict. The only sources left were documents created and stored outside the virtual world itself: blog entries, forum posts, screenshots, etc., on the 2D web.
The historian Timothy Burke has described the difference between game-generated data and
historical documentation in terms of what he calls the "proprietary" data of virtual worlds, meaning the data that is owned, or present on the servers that support that world: "... I think the one thing that *isn't* in the proprietary data is the history of unusual or defining episodes or events in the life of particular virtual worlds ... The narrative history, the event history, of any given virtual world, may in fact be obscured by the kinds of god's-eye view data that developers have. After all, they often don't know what is happening at the subjective level of experience within communities, or have to react to it after it's happened. (Say, when players stage a protest.)" (Burke, 2006) Thus, focusing on preservation of what
Burke calls proprietary data matches up poorly to the likely needs of future scholars of
virtual worlds.
Consider another example that illustrates this point. In the first hours after the WTC and Pentagon attacks on 11 September 2001, online communities used systems such as massively multiplayer role-playing games as a medium for responding to the attacks. In games such as Everquest and Asheron's Call, players read news alerts either via in-game text or system announcements, while other players, outside the world but still on-line, caught up via player community websites. Of course, others watched television, heard from friends, or even
experienced the events up-close and personal. Within hours, players organized candlelight
vigils for the victims of the attacks, using glowing weapons or other objects, taking
screenshots and posting online to document their in-world activities and discuss what they
meant in the context of the dramatic historical events unfolding around them. For a vigil
held on Everquest's Luclin server on 12 September in response to "yesterdays [sic] disheartening display of events," players were invited to "mourn and discuss" on the Everlore website (Lowood, 2008). Players commented on the meaning of this action to them; one of them, with the player name Keeter, argued that, "Just because you are in a game doesn't mean the world outside doesn't effect [sic] you. Many people would like to mourn and share peace along side [sic] people they have battled long and hard side by side with. Yes, I can go to a church to mourn, but I would like to do it with my comrades around the country/world, which is impossible everywhere else. If you don't want to be a part of it, then dont [sic]. You can choose not to do it. But respect the people who would like to. We don't bother people that want to run naked gnomes through the country, so don't bother people that want to gather and discuss something important to us all." Documentation such
as this quotation is necessary both for a full description of an event and for a rich
interpretation of what the activities associated with that event meant to participants, no
matter what kind of world we are talking about.
There are three important points here with regard to a preservation project focused on
software and game-produced data. First, inside Everquest today there is no trace of these
events. Assuming that the game world has not been deleted, erased or remade (an assumption that is, in fact, untrue), that
we are on the right server or shard of the game, and that we are standing on precisely the
spot where such a vigil occurred, it is generally not possible to dig beneath the surface,
scratch underneath a poster, or find a file cabinet of documents or an old newspaper in a
nearby building. There are exceptions, such as a monument on a specific Asheron's Call server
that commemorated a unique achievement by its players, but such exceptions are rare.
Second, this lack of in-world artifacts and documentation clearly has implications for long-
term preservation that focuses on game software and server-side data. Assume that we are
able to capture every bit from a virtual world server, everything from 3D models to account
information, that we are able to reverse engineer or disable authentication and log-in
controls after the original server is no longer live, and that we have received permission from
every rights holder ranging from game developers to third-party developers and players to
copy, store, and use what they created, show their avatar, or reveal their identity and
activities. The chances of all this actually happening are near zero, of course, but assume that
it all could be done. Then assume that we can sync up every state or version of the software
to the matching states of databases. It might then be possible to run a simulation of the
virtual world as an archival time-machine, flying around on a magic carpet in spectator mode
but never interacting with events run by the game engine and player data, much like a game
replay. Turn the dial to 12 September 2001, and you might find a group of players standing
around with brightly colored weapons and wands in their hands. But what are they doing,
and what does it mean to them?
The third point then is that the documentation that is a prerequisite for future historical
studies of virtual worlds may not be located on game servers at all. The most important
qualitative documentation may be somewhere else, on a blog or a wiki, in a player-created
database or Flickr screenshots, or a YouTube video. The same may be true for some of the
contextual information desired under whatever set of transfer protocols or preservation
specifications a project is using, even sometimes for technical aspects such as software
dependencies or relationships among objects. Game researchers such as Dmitri Williams
have extracted and analyzed a wealth of quantitative data from virtual worlds; they have used
these empirical data to explore social and economic aspects of these worlds. Such research
rarely has access to server-side data, but instead relies generally on surveys, participant observation, or data harvested on the client side using bots or automated characters (Williams et al., 2006). The point is that writing the history of virtual worlds on the basis of
software and of associated data alone would be a barren exercise. Installing Everquest in 2050
will not reveal much about the virtual world that emerged from the software, even if future
writers and historians have access to everything needed to run a fully functioning version of
the game. Certainly, there are still important reasons for preserving this software, whether as
artistic or cultural content, for technology studies, or for forms of scholarship that treat
aspects of digital games and virtual worlds as authored texts or artistic objects. Still, we need
to think more about virtual world history in terms of events and activities, much as an
archivist or historian would in the real world, and attend more carefully to preservation of
forms of documentation in digital form that are external to virtual worlds as software
environments.
Implications for Collections
One way of characterizing the mix of software preservation and documentation activities
necessary to preserve virtual world content and history is to think of the primary collections
as a mix of "library" (published digital games and virtual world content) and "archives" (documentation about game/virtual worlds). This is, of course, a familiar model for cultural
repositories such as libraries. However, the nature of digital games and virtual worlds as
software objects and the particular ways in which events and experiences in these virtual
spaces are mediated and reported carry significant implications for collecting strategies as
well as preservation.
It is worth reviewing a few of the issues that have surfaced in the Preserving Virtual Worlds project with respect to the collection and preservation of data and metadata. The first
issue has already been set up in the discussion of virtual worlds as history. What exactly are
we trying to preserve? Specifically, is virtual world preservation focused on the software
and server-side data that in some sense defines or encompasses the "world" as a created
artifact, or are we looking for materials in digital form that document the activities of players
or "residents" of these spaces? There are at least two other ways to shape this distinction.
The first is to separate developer-created or -managed materials from those created or
managed by players. The second, as already noted, is to distinguish repositories of virtual
world data as essentially libraries or museums of created artifacts or texts from archives of
documentation about events. However, depicting virtual world preservation as a binary
proposition such as developer vs. player or artifact vs. document is ultimately unnecessary
and counter-productive. Certainly, there is value in preserving both software artifact and
event histories. The most productive approaches to virtual world preservation will be those
that integrate artifact and documentation in terms of collecting focus, evaluation of digital
content objects, organization of content transfer packages, metadata creation and access
strategies.
Consider a problem that might seem to be entirely a matter of treating the essential task of
game preservation as software preservation, yet turns out to have crucial implications for
documenting player behavior and history: software versions. In the Preserving Virtual
Worlds project, we decided early on to limit our attention to a dozen or so representative
case studies, rather than a comprehensive collection of software or data. One of our key
cases has been the computer game DOOM, created by id Software. DOOM is a multiplayer, first-person shooter, and while this is not the place to delve into game history, suffice it to say that this game immediately made competitive, multiplayer gaming the leading-edge genre for computer games through the 1990s. There are two other aspects of
DOOM that are important with respect to preservation. First, it was distributed first and
throughout its history in shareware versions that featured a limited number of episodes of
the game; the idea was that the shareware version would hook players on the game, so they
would then purchase the full version. Second, the developers of DOOM openly embraced
modification of their software by the player community, and by implication revision of the
notion of game authorship, which immediately de-stabilized the notion of a canonical
version of the game. Defining a version of the game DOOM therefore involves considerable
attention not only to a sequence of patches and versions, but also to combinations of
developer-produced software, third-party add-ons and player-developed modifications,
better known as mods.
De-stabilizing the notion of a fixed version of software is not just a problem when we
attempt to determine what versions to preserve and how to account for revisions and
changes. It also raises significant issues that affect the documentation of player activities in
the game. A crucially important category of objects that provide this documentation is produced by players' efforts to capture their experiences through replays, screen captures and screenshots. In the case of DOOM, we have already seen how NoSkill's demo files make it possible to view the actions of one of this game's earliest and best competitive players.
Again, DOOM demos were essentially replay files, saved sequences of instructions from a
previously played game that, when executed by the game software, would show the same
game from the same (first-person) perspective of the original player. In other words, demos
are recorded game play sessions that are played by the game engine. The same is true of
replay files in later games, such as Blizzard's Warcraft III, up to the present day. Unlike video files captured from the screen or video-card output, demos or replays allow whatever views and settings the game software permits, at the best visual quality the software can produce. However, all of this is possible only when a running version of the game engine is
available in order to view these replays. Not only that, the version used to view the demo or
replay nearly always must correspond exactly to the version that was played when it was
created. Therefore, any decision about which version of the game will be preserved
determines which replay or demo files will be viewable in the future. Likewise, any decision
about which demos or replays are historically significant in terms of game culture or history
will presuppose preservation of the appropriate version of the game software. Treatment of
the software artifact affects documentation, and selection of documentation affects
treatment of the software artifact. At least in the realm of virtual world or digital game
history, separation of these treatment decisions into specialized areas or departments may
lead to disastrous consequences for future archivists and historians.
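One hypothetical way to keep these decisions connected is to record the dependency explicitly in the metadata for both objects. The following Python fragment is purely illustrative; the identifiers and field names are invented for this example and do not represent the project's packaging or metadata schemes.

    # Illustrative only: recording that a demo can only be rendered by one
    # exact version of the game software, and that preserving the software
    # version determines which demos remain viewable.
    demo_object = {
        "id": "demo:noskill-1995",                    # hypothetical identifier
        "type": "replay/lmp",
        "requires_exact_version": "software:doom-1.9",
    }

    software_object = {
        "id": "software:doom-1.9",                    # hypothetical identifier
        "type": "game-software",
        "renders": ["demo:noskill-1995"],
    }

    def viewable(demo, preserved_software_ids):
        """A demo stays viewable only if the exact version it requires has
        itself been selected for preservation."""
        return demo["requires_exact_version"] in preserved_software_ids

    print(viewable(demo_object, {software_object["id"]}))  # True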
A second example with respect to virtual world data and metadata also speaks to the
necessity of maintaining contact between collections and their contexts, as well as between
projects of software preservation and historical documentation. This example suggests that
documentation can also serve as a category of metadata for virtual world data.
An interesting quality of virtual and game worlds is that many of them can be navigated by
in-world coordinate systems, much like real-world cartography. Two well-known examples
are the "Second Life URL" (SLURL) in Second Life and the UI coordinate system in World of
Warcraft. Just as we can mash up data by attaching GPS coordinates to real-world maps,
photographs, and other media, these virtual world coordinate systems might make it possible
to match documentation we have assembled in our virtual world collections not only to
locations in virtual worlds, but also to each other. In the case of the video collection,
however, the metadata scheme based on Dublin Core already provides the "coverage" element for individual objects. As the Dublin Core specifications tell us, this element can be applied "for the use of multiple classification schemes to further qualify the incoming information" such as latitude and longitude or other "native coordinate representations" (Becker et al., 1997). The Internet Archive's Heritrix-based crawler allows collection
curators to input metadata tags at the document level for web pages harvested through
crawls; by adding a tag for a specific SLURL, say, to a document describing an event that
occurred in Second Life, we can thus link documentation to in-world locations.
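As a purely illustrative sketch of what such a link might look like, the Python fragment below builds a minimal Dublin Core record whose coverage element carries a SLURL for an in-world location. The region name, coordinates, and record structure are invented for this example and simplified relative to any metadata specification actually used by the project.

    import xml.etree.ElementTree as ET

    DC_NS = "http://purl.org/dc/elements/1.1/"
    ET.register_namespace("dc", DC_NS)

    def dc_record(title, description, slurl):
        """Build a minimal Dublin Core record that ties a piece of documentation
        (an archived web page, say) to an in-world location via a SLURL placed
        in the coverage element."""
        record = ET.Element("record")
        for name, value in (("title", title),
                            ("description", description),
                            ("coverage", slurl)):
            element = ET.SubElement(record, f"{{{DC_NS}}}{name}")
            element.text = value
        return ET.tostring(record, encoding="unicode")

    # Hypothetical example: a forum post describing an in-world event.
    print(dc_record(
        "Eyewitness account of a protest in Second Life",
        "Forum post harvested through Archive-It",
        "http://slurl.com/secondlife/ExampleRegion/128/128/25"))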
Using virtual world coordinates might help us bridge the gap between documentation and
mute server-side software and data with respect to "event history," perhaps offering a
solution to the problem of perfect loss in virtual world history. Here is a specific use
scenario. A cultural historian is interested in the use of game worlds for scholarly
communication and learns about the first science conference held in World of Warcraft, in
May 2008 (Bohannon, 2008). She finds videos documenting this event in the Archiving
Virtual Worlds collection, but they are a bit grainy and she is curious about the locations
chosen for the event. So as part of her "fieldwork," she installs and fires up the game world.
Then, using the coordinates conveniently provided by the collection metadata, she ports to
the location where the conference was held and, using her avatar, walks the terrain depicted
in the video. This scenario will work better in game worlds, where developers maintain
relatively stable environments with respect to content, than in virtual worlds such as Second
Life, where residents such as the National Front are free to delete everything they created.
However, if we are able to maintain backups of content as part of a package of data
associated with a virtual world, this problem will be alleviated. For now, it is sufficient to
observe that, as in the case of DOOM demos, useful connections between documentation and data will only be available to historians and other researchers if curators and archivists work closely along these lines with software preservation specialists.
Examples of Collections and Impact of Preserving Virtual Worlds' Collection-Building Efforts
Currently, there are only a few significant library collections of historical game and virtual
world software. These collections also include archival documentation, hardware artifacts
(such as game consoles and realia), and print collections related to digital games. They
include the Stephen M. Cabrinety Collection in the History of Microcomputing in the
Stanford University Libraries (http://www.oac.cdlib.org/findaid/ark:/13030/kt529018f2),
the International Center for the History of Electronic Games at the Strong National
Museum of Play (http://www.icheg.org/) and the UT Videogame Archive at the Dolph
Briscoe Center for American History, University of Texas at Austin
(http://www.cah.utexas.edu/projects/videogamearchive/).
The Cabrinety Collection was a resource for the Preserving Virtual Worlds project, as it is
held by one of the project partners, the Stanford University Libraries. Stanford's How They
Got Game was founded in 2000 to begin work on the history and preservation of digital
games and interactive simulations. The founding of the project was stimulated by Stanford's acquisition of the Cabrinety Collection three years earlier. Today, it remains perhaps the largest collection of microcomputing history held by a major cultural repository, with approximately 20,000 software titles, roughly 85% of which are digital games, some 75
hardware platforms, publications, ephemera, and archival materials. How They Got Game
continued earlier work in software history and archives carried out under the auspices of the
Silicon Valley Archives at Stanford, and this activity is closely tied to related archival
collections, such as the papers of Steve Meretzky and Hal Barwood, and records of the 73
Easting Simulation and HPS Simulations. The linkage of library and archival collections is
similar at both UT Austin and at the Strong Museum.
As part of the Preserving Virtual Worlds project, the How They Got Game group at
Stanford has created three collections to document virtual world events using largely player-
generated content. The first is the Archiving Virtual Worlds collection
(http://www.archive.org/details/virtual_worlds) hosted by the Internet Archive as part of
their Moving Image Collections; the second is the Machinima Archive
(http://www.archive.org/details/machinima), a curated collection of the emerging medium
of game-based moviemaking known as machinima; and the third is a curated collection of
websites related to digital games and virtual worlds hosted by the Internet Archive's Archive-It service (http://archive-it.org/). The Archiving Virtual Worlds collection is particularly
relevant to the discussion of documentation. It consists in large part of video footage made
with real-time screen capture tools such as Beepa's Fraps. The "Final Countdown" video
that documented the last minutes of EA-Land is an example of the content preserved in this
collection. The Machinima Archive is dedicated to the academic investigation and historical
preservation of the emerging art form known as machinima. Machinima is filmmaking
within real-time, 3D virtual environments such as games and virtual worlds, usually made
from existing video game engines, whether through editing of replay files ("demos"), video captured from the screen or graphics card, or composited game assets. Machinima "capture" of in-game performance, sometimes called "virtual puppeteering," provides a model for the
creation of high-quality video that documents player activities inside a game or virtual world.
Last but not least, the How They Got Game project established its subscription with the
Internet Archive's Archive-It service in early 2008 for the purpose of crawling, saving, and making searchable numerous game- and virtual world-related websites. In two Archive-It collections ("digital games" and "virtual worlds"), we have preserved videos, weblogs, wikis, player-created websites, maps and many other forms of
documentation that provide information about player activities. These activities might
include modifying game software, demonstrating skills through superior game-play,
commenting on events such as protests or artistic performances, or anything else that
someone who has spent time in a virtual world might consider important. A particular
emphasis of the documentation activity undertaken through Archive-It has been the
preservation of web-based collections of player-created maps, walkthroughs, exploit videos, glitch-hunting records, and other materials that deeply explore in-game and in-world spaces.
Compared to institutional collections of digital games and design archives, collections of
player-created documentation relating to activities and events in game and virtual worlds
have only begun to emerge in similar institutional contexts backed by preservation solutions.
On the other hand, individual collectors and players, fans and fan groups, and projects (such as MobyGames) have assembled and maintain vast troves of such documentation. The Archiving Virtual Worlds collection, the Machinima Archive, and the How They Got Game Project's Archive-It collections will all be preserved as part of the Internet Archive's collections, with backup in the Stanford Digital Repository. These collections complement and supplement the project's efforts to work out schemes for long-
term preservation of digital game software and assets by documenting in-game and in-world
activities and events, player culture, and the efforts made by the player and, to some extent,
the developer community to save, preserve and provide access to historical documentation
about games and their cultural impact as well as game content.
The Contributions of Game Communities
Prior to recent efforts such as those of the institutional repositories mentioned above and
the Preserving Virtual Worlds project, it is fair to say that nearly all of the significant efforts
towards collection, preservation, documentation, enunciation and emulation of game
content, technology and culture were due to the work of players and fans, or individuals with
experience in the game industry. Examples of such efforts include the MAME community
on the emulator front, the Software Preservation Society (founded as the Commodore
Amiga Preservation Society) on the data migration front, or the work of collectors such as Frank Cifaldi and his "Lost Levels" project and website (http://lostlevels.org/).
One of the activities specified by the Library of Congress in the award of the NDIIPP grant to
the Preserving Virtual Worlds project was outreach. In the project, we emphasized two kinds
of outreach with the specific goals of (1) encouraging better contact and communication
between communities of players, developers and fans and cultural repositories working in
the area of game and virtual world preservation; and (2) opening up channels for the
exchange of information among these various groups and institutions.
We believe it will be necessary to encourage more contact between the various communities
and individuals with an interest in preserving the history of digital games and virtual worlds
and cultural repositories such as museums and libraries building programs for game
preservation. The disconnect between cultural repositories on the one hand and the active collectors and programmers building software such as emulators on the other is not unique to game preservation, but is a problem for computer history
more generally. An important moment in raising awareness of the value of forging such contacts was provided, for example, by "The Attic & the Parlor: A Workshop on Software Collection, Preservation & Access," which took place at the Computer History Museum in
Mountain View, California, in May 2006. One of our project members (Curator Henry
Lowood at Stanford) was on the organizing committee of this meeting and facilitated the
proceedings. The meeting included library and museum curators, corporate archivists, and
individuals active in collecting and preserving software, and highlighted the benefits to
repositories of working with broader communities, both with respect to collecting and
technology.
Preserving Virtual Worlds contacted a number of game companies, collectors, and players
throughout the course of our various activities. On the basis of these contacts, we were able
to open up channels for exchanging information among these various parties. While we
believe that much more work can be done in this area, we can point to two specific examples
of our activity as possible models for future projects and programs. The first is our work
with the International Game Developers Association (IGDA), specifically, the Game
Preservation Special Interest Group (SIG). Quoting from its own website, "The International Game Developers Association is the largest non-profit membership organization serving individuals who create video games. We bring together developers at conferences, in local chapters and in special interest groups to improve their lives and craft"
(http://www.igda.org/). While the IGDA serves its mission in many ways, the special
interest groups are the focus of member-based activities in a variety of areas, ranging from
specific game design areas to quality-of-life concerns. The Preservation SIG serves as a "meta-resource, hub and community for those interested in digital game preservation and history." The SIG, with roughly 75 members, maintains an e-mail discussion list that emphasizes game-preservation-related news and projects. It has sponsored events at the mammoth annual Game Developers Conference (such as the Digital Game Canon), where it also hosts an annual round-table for members and other interested parties, and aims "towards setting standards and guidelines for assisting industry/academia in establishing preservation efforts."
One of our project members (Lowood) served through the course of PVW as chair of this
SIG. Reports on the status and results of the project were delivered to the group and
discussed at the annual round-table meetings in 2008, 2009 and 2010. Moreover, several
project members (e.g., McDonough, Donahue) joined the SIG and thus participated in on-
line discussions of a variety of technical, game, and legal issues via the e-mail forum. In 2008,
the SIG launched an effort to raise awareness among IGDA members of the importance of
digital game preservation, both as an issue for the industry and as a matter of concern for
individual members who collect games or for whom games are an important part of their
lives and shared culture. A committee of the SIG began work on a two-part effort to
produce easily distributed publications about game preservation. The first is a white paper
describing the problem of the possible loss of game content and culture if no action is taken
and the challenges facing preservation activities. The second would document best practices
after the conclusion of the Preserving Virtual Worlds project. In March 2009, the first white
paper was published under the title Before It's Too Late: A Digital Game Preservation White Paper,
with publication support provided by the IGDA in the form of a grant to the SIG. It was
announced at GDC 2009, where a limited number of copies were distributed; since that
time, it has been available either as a free download or as a modestly priced print publication via Lulu Press (http://www.lulu.com). Before It's Too Late includes contributions from PVW. Lowood edited the volume, and Donahue's two surveys, "Records Management in the Gaming Industry" and "Preservation Activities in the Video Game User Community," appear as
appendices to the white paper. It is anticipated that work will begin on the best practices
document in 2011.
A second major outreach activity of the project was the Play - Machinima - Law conference
hosted by the Stanford University Law School and the How They Got Game group at
Stanford. This two-day conference covered key legal issues associated with player-generated
content. The specific focus was machinima, computer-animated cinema that is based on 3D
game and virtual world environments, technology, and content. Issues of player-generated
content play into many areas of concern to the Preserving Virtual Worlds project as a whole.
The specific context of machinima production provided a lens for looking at these issues in
detail, while also narrowing the scope of the conference to manageable proportions.
Speakers included machinima artists/players, legal experts, commercial game developers, and
game researchers; participants from Preserving Virtual Worlds included Lowood, Bittanti
and Rojo (as organizers, moderators, and speakers), McDonough and Kraus (as speakers).
Topics included game art, game hacking, open source and "modding," player/consumer-
driven innovation, cultural/technology studies, fan culture, legal and business issues,
transgressive play, game preservation, and notions of collaborative co-creation drawn from
virtual worlds and online games. A particular focal point and the main topic of the second
day was a document drafted by a team from the Stanford Law School as guidance
concerning copyright law for machinima artists. With input from the conference and the
IGDA Preservation SIG, that document will be produced as part of the Preserving Virtual
Worlds effort and is discussed in more detail elsewhere in this report.
The outreach efforts of the Preserving Virtual Worlds project, both with respect to the IGDA Preservation SIG and the Play - Machinima - Law conference, demonstrated the importance of, and the potential gains from, encouraging contact, discussion and collaboration
among cultural repositories, the game industry, and the various communities with an interest
in game cultures.
Problems/Opportunities in Access
Assume again that we are able to save all the software and server data for a virtual world,
game world or simulation and put it into a digital repository in a library. When a historian
fires up this virtual world fifty years from now, it will be empty. Moreover, the
documentation that will be a prerequisite for future historical studies of virtual worlds won't be there. It never was. The most important qualitative documentation was somewhere else, on a blog or a wiki, in a player-created database such as Thottbot, in technical
documentation, or Flickr screenshots, or a YouTube video. The same may be true when a
future repository attempts to fire up some ancient software. If contextual information, such as software dependencies or descriptions of relationships among objects, is not provided under whatever set of transfer protocols or specifications a project is using, and all the retired software engineers are gone, it may be impossible to get old software to run.
We suggested earlier in this report that nobody writing the history of virtual worlds will be
able to do much with software and associated server-side data alone. Installing Everquest in
2050 will not reveal much about the virtual world that emerged from the software, how it
was built or used, even if future writers and historians have access to everything needed to
run a fully functioning version of the game. Certainly, there are important reasons for
preserving this software, whether as artistic or cultural content, for technology studies, or for
forms of scholarship that treat digital games and virtual worlds as authored texts or artistic
objects. Still, we also need to think about virtual world history in terms of events and
activities, much as an archivist or historian would in the real world, and attend more carefully
to preservation of forms of documentation in digital form that are external to virtual worlds
as software environments. Software preservation without historical documentation of in-
world events will be a barren project.
These considerations bring us to specific future use cases and scenarios. Who might want to
have a look inside a historical virtual world, and why? And how might future librarians
provide access to them? Consider the following hypothetical use scenario:
• A historian is interested in the cultural and social history of virtual worlds. His
attention is drawn to vigils that were held in game worlds such as Everquest and
Asheron's Call after the WTC and Pentagon attacks on 11 September 2001. He would like to know more about virtual communities and the meaning to them of such events. However, it turns out that no data other than backed-up chat logs, which would soon be deleted, was ever kept on the game servers. Surviving documentation can only be found via the discussion boards, websites, and screenshots saved by players. Although he is interested in the virtual world's history,
his search for documentation depends on web archiving efforts and personal digital
collections. Will full-text searching of this vast amount of material yield the
documents he needs? Or will finding needles in the web haystack require document-
level indexing with specific date, location, or event tags?
The moral of this example? Documenting virtual worlds through software and data preservation alone will not do the job. As we have already suggested, there will be no trace of past events inside a virtual world, at least not in the present generation of these worlds. We noted one exception earlier (in Asheron's Call), but such exceptions are rare. Here is what this means.
Assume again that such a world has not been deleted, erased or remade, that we are on the
right server or shard of a game or virtual world in which some documented event happened,
and indeed that we are standing on precisely the spot where a historical event such as a
conference or vigil occurred. We cannot dig, find old documents or inspect traces of the past
on the site. Installing Everquest in 2050 will not reveal much about the virtual world that
emerged from its game software. Nothing of the events that shaped player culture, game
politics and relationships between players and developers will remain. Like Ozymandias' giant leg in the desert, the surviving software artifact will tell us nothing about what "once dwelt in that annihilated place." Future researchers will be sorely disappointed if we do
nothing to ensure that historical documentation about virtual worlds, but created outside
them, is preserved along with software and proprietary data.
Now that we have introduced potential users for our virtual world collections, we will say
more about access. Access is perhaps not a core concern for digital preservation per se, and
there were no specific project deliverables that involved investigation of an access model or
solution. Still, the Preserving Virtual Worlds project, due to its participants and its problem-
sets, highlights the importance of finding solutions for problems such as identifying
significant digital artifacts or developing standards for metadata. A key element in finding
solutions to problems such as these is the collaboration of individuals and teams with
different perspectives on software preservation, archives, and history. As a consequence of the Stanford group's connection with a separate project team (led by the former project manager of the Stanford PVW effort) concerned with the development of 3D environments for scholarly applications, we roughed out a concept and began development work on an access/delivery environment that might address some future access issues.
Specifically, we began to re-think access to documentation and artifacts from virtual worlds
collected in cultural repositories and, with that foundation, begin working on an approach to
access that could lead to an alternative model for preservation of 3D artifacts from games
and virtual worlds. A collection of such assets would include models, maps, geometries,
textures, transactions, and so on. In working on this problem, we have been inspired by Ivan Sutherland's earliest essay on virtual reality, "The Ultimate Display." This work depends once again on collaboration, in this case on the allied project mentioned above. Based in the Stanford Humanities Laboratory and the Computer Science Department, that project has been developing a next-generation virtual world platform called Sirikata. Sirikata is a BSD-licensed, open source platform. The development team aims to provide a set of libraries
and protocols that can be used to deploy a virtual world, as well as fully featured sample
implementations of services for hosting and deploying these worlds. An alpha version has
been used for a mixed reality performance at the MiTo International Festival of Music in
Milan, 12-13 September 2009, and is currently live for an installation at the Bornholm Art
Museum in Denmark.
Think about the assets and content that go into the creation of a virtual world: models,
maps, geometries, textures, and so on. We are not sure yet how future scholars will
visualize, analyze, and understand these artifacts in a digital repository consisting of data files
and metadata. In administrator mode, which is what we have today, the Stanford Digital
Repository is essentially a file directory. Now think about another model of access to
artifacts from an historical world, also largely models and suitable spaces for these models: a
natural history museum showing dinosaur skeletons in a set that takes the visitor to a
prehistoric savannah. Access to the information preserved there is visual and is reinforced
by immersion in the world of the artifacts. It should be possible to do something similar
with 3D artifacts from virtual and game worlds.
We are investigating the use of the Sirikata platform for the creation of a new kind of
repository, one in which 3D objects are stored, retrieved and investigated as 3D objects. This
means that we would like to be able to move original geometry and texture data?archival
assets?from their original environments into such a repository. The two cases we are
investigating are (1) digital artifacts such as maps or levels from 3D games, beginning with
early titles such as id Software's DOOM and Quake, and (2) exhibitions created in virtual
worlds such as Second Life by cultural institutions, including libraries and museums. Can we
move these objects into an instance of an open virtual world platform such as Sirikata? If
so, might we think of these instances as virtual wings of a library, rather than file
repositories, places where the historical artifacts are deposited, preserved, found and
investigated in an environment that puts documentation and narrative alongside the
artifacts?
Maps are among the most important artifacts in game development and player cultures.
Players analyze them, re-create them as mods in games other than the ones in which they
were originally created, and build viewers and projections so as to better visualize how to
optimize their game-play in these spaces. As artifacts in a digital repository built with virtual
world technology, historical maps would not just be artifacts; they might also provide spaces in which to site other objects and documentation (such as models, screenshots, or videos) that provide information about what took place in these settings. This might be where our future historian goes to check out the locations used for the science conference held in World of Warcraft, for example, without having to assemble, install and figure out how to use the original movement and navigation systems of the game's user
interface. While further development work is necessary to realize this vision, we have already
successfully exported levels (maps) from id Software's Quake (1996) to the open VRML format, from which we are able to move unaltered geometries and textures from this historical digital artifact to other environments, whether Maya, 3DS Max, or Sirikata. When the pipeline to the Sirikata-based virtual repository is completed, it will be possible to drop into Sirikata and see 3D objects with the same geometries and textures they were given in the original game. In fact, these artifacts will be created from certified copies of the original game
data used to produce them in the first place, utilizing forensics techniques for data extraction
and the preservation methodologies worked out in the Preserving Virtual Worlds project.
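To make the first step of such a pipeline concrete, the short Python sketch below reads the lump directory of a Quake level file, the starting point for pulling out geometry and texture data. It assumes the standard version-29 BSP layout (a 32-bit version number followed by fifteen offset/length pairs); the lump names and their order are stated assumptions for illustration, and this is not the tooling actually used in the project.

    import struct

    # Assumed names of the fifteen lumps in a Quake (1996) version-29 BSP
    # file, in directory order; listed here purely for illustration.
    LUMPS = ["entities", "planes", "textures", "vertices", "visibility",
             "nodes", "texinfo", "faces", "lighting", "clipnodes",
             "leafs", "marksurfaces", "edges", "surfedges", "models"]

    def read_bsp_directory(path):
        """Return the version number and a lump-name -> (offset, length) map,
        from which geometry (vertices, edges, faces) and textures can be
        located for export to VRML or another environment."""
        with open(path, "rb") as f:
            header = f.read(4 + 8 * len(LUMPS))
        (version,) = struct.unpack_from("<i", header, 0)
        directory = {}
        for i, name in enumerate(LUMPS):
            offset, length = struct.unpack_from("<ii", header, 4 + 8 * i)
            directory[name] = (offset, length)
        return version, directory

    # Hypothetical usage on an extracted level file:
    # version, lumps = read_bsp_directory("e1m1.bsp")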
Future access environments for digital repositories will need to consider how to provide
scholars with access both to digital artifacts from environments such as virtual worlds and to
digital documentation about these artifacts and worlds. Bootstrapping virtual world
technology as a means for re-conceptualizing the digital repository as a 3D environment
rather than a file system provides a potential solution to this problem and deserves further
investigation with the specific goal of developing access solutions for virtual world and game
assets.
5. Software Preservation and the Law
Copyright Issues: Complex Ownership and Orphan Works
The owner of IP has the right to transfer all rights by assignment, or a portion of the rights
by license. The rights may be placed in the public domain, either intentionally, through IP
misuse or neglect, or as the result of expiration of a registration. Or the owner may elect to
make no use of the IP rights and to prohibit others from use. It has been charged that some
patents are procured not to protect the owner's use but to prevent use of the invention by
competitors.
– Introduction to Game Development, Ch. 7.5, "Transfer of IP Rights"
Copyright
The most basic preservation action for any digital object is to convert it into a media-neutral
format. This is typically done by creating an image of the original media; that is, making an
exact, bit-for-bit replica of the disk that can then be mounted from the hard drive or burned
to a fresh disk (e.g. an ISO or IMG file). In other words, the simplest and most common
preservation process violates reproduction and (usually) anti-circumvention provisions in
U.S. intellectual property law.
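As a purely illustrative sketch of the imaging step itself (not the project's workflow, and certainly not legal advice), the Python fragment below copies a source device or file bit for bit into an image file while computing a checksum that can later verify that the replica is exact; the paths are placeholders.

    import hashlib

    def image_disk(source_path, image_path, block_size=1024 * 1024):
        """Copy source_path to image_path bit for bit and return a SHA-256
        digest of the data, so the fixity of the image can be verified later."""
        digest = hashlib.sha256()
        with open(source_path, "rb") as src, open(image_path, "wb") as dst:
            while True:
                block = src.read(block_size)
                if not block:
                    break
                dst.write(block)
                digest.update(block)
        return digest.hexdigest()

    # Placeholder paths: on a Unix-like system the source might be a device
    # node such as "/dev/cdrom"; the output is a raw IMG/ISO-style image.
    # checksum = image_disk("/dev/cdrom", "classic_title.img")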
Copyrights held by corporations endure for up to 120 years, and under the DMCA cultural
heritage institutions enjoy no special privileges. A video game console generation typically
lasts less than a decade. We are currently in the 7th generation of consoles, and personal computers have evolved in equally dramatic ways since the Apple II and Commodore 64 began saturating the home PC market. Given the difficulty of identifying and then obtaining permission from the current rights holders of older video games, this translates into libraries risking statutory damages of $200 to $150,000 per game were they to migrate their collection of classic software from 3.5" floppy disks to images stored on hard drives, an act comparable to rebinding a book or creating an access copy of a manuscript.
Patents
According to Stephen Rubin, video game patents (or at least litigation surrounding them)
most often center on game controllers, consoles, and peripherals (Rubin, 2010). As patents
have the shortest lifespan of all intellectual property protections, and since, outside of museums, hardware is likely to be discarded in favor of accessing the games on modern computers, they have far less of an effect on cultural heritage institutions. Even so, at least two
generations of game consoles have passed by the time a patent expires (20 years from the
time of filing) and there is a danger of hardware becoming irreparably damaged after support
has stopped, with no legal option to build a replacement.
The Vectrex console, released in 1982, was unique for two reasons: it used vector rather than
raster graphics, and it was the first console to offer a 3D peripheral. The Vectrex had a short
market life, and the 3D Imager an even shorter one, making it a very rare and expensive device. The rarity makes the games even more appealing to collectors and, in a display of
typical resourcefulness, someone in the video game community began building 3D imagers
from breadboards and portable CD players (Woolff). Because the original patent had expired by the time the Madtronix Imagers were produced in 2005, Woolff was able to sell them for about $75 each, a far cry from the $700 or more one might pay on the grey market, if a functional 3D Imager could even be found to purchase. Without this peripheral, a number
of games are rendered unplayable, and preservation is meaningless without access. Although
in many cases hardware upkeep is not sustainable, there are instances when the device is
unique enough to be worthwhile.
The majority of video game patents may be hardware related, but patents on software, a
fairly controversial topic, do exist. Rabin cites Sega, Nintendo, Sony, and Microsoft as all
having patents on game play. Patents of this nature serve to strengthen the protections
already offered to individual games by copyright.
Trademark
Unlike copyrights and patents, a trademark never expires as long as it is used and protected (that is, as long as the owner does not look the other way when made aware of infringement).
Historically, trademarks have simply been logos or other commercial identifiers. With video
games, flagship characters become brand identifiers and thus eligible to be registered as
trademarks. This could potentially mean that works that would otherwise enter the public
domain in a fixed time period could be monopolized by a progression of rights holders for
centuries.
Trade Secrets and Licensing
Trade secrets and licensing add further complexity to the web of video game IPR. Trade
secret regulations are not as well known or clear-cut as the three major categories of
protections. Without even getting into the mire of contract law created by end user license
agreements (EULAs), licensed content (for instance, games released with a major motion
picture) makes the task of identifying rights holders and obtaining permissions far more
arduous. In the book Challenges for Game Designers, a great deal of time is spent discussing how
to work with licensed content and how to pitch proposals for sequels and other licensed
works. Combined with the proliferation of titles on the shelves using characters from
Marvel, Disney, and other media giants, this seems to indicate that video games utilizing
licensed IP are a sizable chunk of the existing catalog, and so are also likely to be included in
any video game preservation projects.
Emulators & the Law
We have argued for a greater degree of collaboration between the gaming community and the preservation community in ensuring the longevity of games. The gaming community has
been exceptionally active in the creation of emulators to enable access to older games. If
emulation is to be used as a preservation strategy for some types of games, then it would
behoove the preservation community to take advantage of the large amount of prior work
done by the gaming community in creating and disseminating emulators for various
platforms and, if possible, contribute to their further development as necessary. If
preservationists are to become actively involved in the creation and dissemination of
emulators, however, they need to do so with a full awareness of the legal issues surrounding emulators' creation and use.
In the eyes of the law, the creation of an emulator constitutes a form of reverse engineering,
in which an existing computing platform is disassembled and examined for the purposes of
replicating its behaviors and operations in a new platform. While generally speaking the
courts have found reverse engineering to be a legal activity, the ability to create an emulator
can be limited by various laws, including copyright law, trade secret law, patent law, the
Digital Millennium Copyright Act (DMCA), contract law, and the Electronic
Communications Privacy Act (Electronic Frontier Foundation, 2010). To date, most of the
court cases regarding emulators' creation have focused on copyright law, in particular on the extent to which an emulator's creator may examine and use software from the original
computing platform when creating the emulator.
Two of the most significant pieces of case law regarding emulators have been the cases of
Sega Enterprises Ltd. v. Accolade Inc. (977 F.2d 1510 [9th Cir. 1992]) and Sony Computer
Entertainment v. Connectix Corp. (203 F.3d 596 [9th Cir. 2000]). In Sega, Accolade reverse
engineered the Sega Genesis console platform by wiring a decompiler to a Sega Genesis
system and loading different games on it in order to determine the interface specifications
for the console. Accolade used the information obtained by decompiling the Sega Genesis
software to create a software specification that they in turn used for creating Sega Genesis
compatible games. The Ninth Circuit Court found that Accolade's actions constituted fair use, as they had engaged in copying of the software embedded in the Sega Genesis system only to be able to access the "unprotected ideas and functional concepts embodied in the code," and disassembly of the code was the only means of gaining access to those ideas and
functional concepts.
The Sony case, however, is the more definitive ruling on the legality of producing and
marketing an emulator. In this case, Connectix extracted the contents of a Sony Playstation's
BIOS and disassembled it in order to determine its behavior as part of creating an emulator,
the Virtual Game Station, that would run Playstation games on Windows or Macintosh
computers. As in the Sega case, none of the actual code from the Playstation system was used
in the Virtual Game Station; Connectix merely studied the original code in order to replicate
its behavior. The Ninth Circuit Court found that the extraction and examination of the Sony
Playstation BIOS constituted fair use, as in the Sega case. The Sony decision, however, goes into far more detail regarding the court's analysis of the four-factor test involved in a determination of fair use.
The court found that the purpose and character of the use made of the Sony BIOS by
Connectix was "modestly transformative," in that it established a new platform on which
consumers could play Playstation games, and that therefore this factor favored Connectix.
On the second statutory factor, the nature of the copyrighted work, the court recognized that software deserves copyright protection but had previously ruled that such protection is less than that offered to "traditional literary works"; it found that in this case the standard set in the Sega case, that copying be necessary to access the functional aspects of the software, had been met, and that this factor also favored a finding of fair use. While the third
factor, amount and substantiality of the portion used, weighed against Connectix (inasmuch
as they had copied the entire work), the court also recognized that the copying had occurred
as an intermediate step in the production of the Virtual Game Station, and that none of
Sony?s software had actually appeared in the released product. On the fourth factor, the
effect of the use upon the potential market, the court somewhat interestingly found that
while the Virtual Game Station might negatively impact Sony's sales of the Playstation, the
transformative aspect of the Virtual Game Station meant that the VGS did not simply
supplant the Playstation, and that therefore, this factor favored Connectix as well.
The courts to date, then, have found that as long as an emulator does not include software
taken from the original system, it does not directly infringe upon copyright. The gaming
community has discussed the possibility that the creators of an emulator might be taken to
court for contributory infringement, on the theory that by enabling others to play games on
a platform other than the original, the creators of an emulator are inducing others to illegally
copy game software to enable it to work with the emulator. As far as we know, the courts
have not directly addressed this issue. In other cases that have focused on technological
devices' role in contributory infringement (see, for example, Sony Corp. v. Universal City
Studios, Inc., et al. 464 U.S. 417 [1984] and MGM Studios, Inc. v. Grokster, Ltd. 545 U.S. 913
[2005]), the courts have focused on the issues of whether the technology has significant non-
infringing uses, whether those responsible for producing the technology were in a position
to control the behavior of those infringing on others' copyrights, whether those producing
the technology promoted the use of the tool for infringing activities, and whether those
producing the technology materially benefited from the infringing activities of others. Any
library or archive wishing to become involved in the creation of emulators needs to be aware
of these issues surrounding contributory infringement, and avoid any activity that might be seen as encouraging others to infringe copyright or as profiting from others' infringing activities.
A number of legal issues other than copyright may also arise with respect to the creation of emulators. In the Sony v. Connectix case, one of Sony's accusations was that Connectix had tarnished Sony's Playstation trademark through the negative associations that would arise in users' minds regarding Playstation games if they played them under the Connectix Virtual Game Station and had an inferior experience of the game. While the court rejected this argument, stating that consumers would be able to distinguish between the two platforms and avoid misattributing quality issues to the Playstation brand, the existence of this complaint does indicate that those creating emulators may need to be careful to avoid any appearance of intruding on companies' trademarks.
Contract law and the Digital Millennium Copyright Act may also impede programmers'
ability to create an emulation of an existing platform. In Davidson & Associates DBA Blizzard
Entertainment, Inc.; Vivendi Universal Inc. v. Jung et al., 422 F.3d 630 (8th Cir. 2005), Blizzard
Entertainment sued the developers of BNETD, a reverse-engineered implementation of the
Battle.net online gaming servers that support multiplayer online games from Blizzard such as
Diablo, Starcraft and World of Warcraft. Blizzard claimed BNETD's programmers had agreed to
the Battle.net end user license agreement and terms of use when they signed up for the
service, and the EULA and TOU for Battle.net specifically prohibit reverse engineering. The
court upheld the license agreements as enforceable contracts and found BNETD's developers in violation of the EULA. The court also found that BNETD's developers
violated the anti-circumvention provisions of the DMCA by emulating the authentication
sequence used by Battle.net servers when establishing communication with a client game.
As a sidebar to this court ruling with respect to emulation, it should be noted that BNETD
source code continues to be widely disseminated and that hundreds of servers around the
world based on this code are openly operated. This observation raises the possibility that
content created on such a private server (replays, machinima, modified assets) might become
part of a future archive or collection; indeed, it is likely that some of this content is already
evident in a collection such as the Machinima Archive. As part of our collaboration with the
Center for Internet & Society at the Stanford Law School, we consulted with legal staff and
students at the law school to discuss the specific issue of the liability of a repository (such as
the Internet Archive) in such an instance. While PVW was not in a position to obtain formal
legal advice in the absence of a specific example or legal action, and there is to date no clear
legal precedent in this area, it appears that the exposure of an archival repository to legal
action due to the inclusion of such content in a video, for example, is likely to be quite small.
In the case of an objection to making such a video available for viewing, it is likely that
removing public access to the archived content would be an acceptable response. This has
been the practice of the Internet Archive, for example; however, the application of such a
policy to the case of content produced on a "private server" (as distinct from the issues of
reverse engineering and emulation) has not yet been tested in the courts.
Finally, developers of emulators may find that, even if they do not fall afoul of copyright law,
they encounter problems with patents. There has been substantial litigation in recent
years regarding patents and emulation technology, most of which has settled out of court.
With copyright law providing only a limited ability to restrict the actions of those developing
emulators, companies that do not wish to see emulators for their products are more
frequently turning to patents as a means of restraining the development of competing
technology.
The above issues all relate to the development of emulation technology. There are also legal
issues involved in the use of game emulation technology, most notably copyright and the
DMCA's anti-circumvention provisions. Most of the copyright issues center on the creation
and use of "ROM" files from game cartridges for game console platforms. In Atari, Inc. v. JS
& A Group, Inc., 597 F. Supp. 5, Atari sued the JS & A Group for copyright infringement and
patent infringement, stating that JS & A, in marketing its product PROM BLASTER (which
allowed users to make copies of Atari 2600 game cartridges), was guilty of contributory
copyright infringement. JS & A, citing § 117 of the Copyright Act, stated that the PROM
BLASTER was intended to allow users to make archival copies of their legally owned
computer programs and hence its primary use was non-infringing. The District Court
decided that, because § 117 had been rewritten based on input from the final report of the
National Commission on New Technological Uses of Copyrighted Works (CONTU), which
states that archival copies of digital media should be allowed "to guard against destruction or
damage by mechanical or electrical failure," and because game cartridges as media are not
particularly subject to mechanical or electrical failure, making an archival copy of a game
cartridge constituted infringement. While this ruling was mitigated somewhat by the case of
Sega v. Accolade, where the Ninth Circuit found that making a copy of a game cartridge could
be fair use if done for the purposes of making an intermediate copy needed to reverse
engineer a game system, the case law at the moment is clear that copying a game cartridge
for the sole purpose of being able to play the game using the copy, rather than the original, is
copyright infringement.
The DMCA may also inhibit a library's ability to use certain emulation technologies. Some
modern computer games employ digital rights management technology, such as SecuROM,
which checks for the original installation CD-ROM or DVD-ROM when it runs an installed
program. While there are disk emulation technologies, such as Daemon Tools and Alcohol
120%, that claim to be able to create a copy of a protected CD or DVD and mount it in a
virtual drive that will enable the game to run even if the original disk is not present, such an
action constitutes bypassing a technological protection measure. While the use of such disk
image emulation technologies is fairly common among the gaming community, the legality
of such an action has not, to the best of our knowledge, been tested in court.
6. Preservation Strategies
Analysis of Hardware Preservation
As part of the process of preserving video games and virtual worlds, it is important to
understand how current collectors are faring in their acquisition and preservation
strategies. To this end, the research team conducted an interview with J.P. Dyson, director of
the International Center for the History of Electronic Games (ICHEG), and vice president
for exhibit research at the Strong National Museum of Play. The mission of the ICHEG is
to collect, study, and interpret electronic and computer games from a number of
perspectives, including the cultural impact of games upon society, and how people play and
learn from such experiences. Strong and the center have the additional mission of catering to
two distinct audiences: the general public and patrons who visit the Strong National
Museum of Play, as well as academics and researchers engaged in the study of video games
and play experiences.
ICHEG has processed close to 20,000 game artifacts and 100,000 game-related artifacts in
the last 18 months. Dyson states that the initial focus of ICHEG has been threefold:
acquiring primary artifacts related to electronic game hardware and software, restoring the
collection for both display and interaction, and gathering ancillary artifacts related to the play
experience and documentation of the production process. Content is acquired through
private donation as well as through active investigation and outreach by the museum staff.
Restoration processes vary depending upon the type of artifact. ICHEG and Strong rely
upon a number of sources for the repair and restoration of electronic game equipment,
including internal personnel, experts from the community (from both academia and
industry), external paid experts, and others. For arcade cabinets, the process involves either
internal specialists or external vendors. Special attention is paid to the type of restoration. In
some situations, it is only acceptable for the restoration to occur with original parts (the
preferred method). In others, original-part restoration may be secondary, such as when
motherboard components are no longer available. For consoles and
computers, ICHEG often acquires multiple copies of the same artifact. In such situations,
working components can be selected or reassembled from the duplicates within the
collection. When a component does not work, it can be set aside as a display-case sample.
Museum personnel have also acquired parts collections that can, in turn, be used to repair
broken devices.
Along with parts, the restoration process for an individual component can take a number of
forms. Reconditioning may include, depending upon its condition, resurfacing of a CD-ROM
and the restoration of magnetic media components. The archival process includes the
capture of video and images related to the play experience. For components that find their
way into Strong's interactive installations, reinforcement or reconditioning may include
processes by which the artifact can be made "bulletproof" for the duration of the exhibit
showing. For example, joysticks and other input devices may have to be replaced by versions
that can survive repeated use by patrons of all ages. The museum has used this technique
for the Atari 2600 version of Pac-Man, due to its overwhelming popularity.
According to Dyson, the museum currently focuses on the acquisition and assessment of
artifacts. To create the proper experience for the general public and researchers, a number of
steps must be undertaken. For consoles, reproducing an experience means coupling
hardware with televisions and display devices of the proper era. Dyson has noted that the
scarcity of older televisions, coupled with technological problems of newer televisions,
can impede re-creating how the game was experienced in its original form.
Dyson said that ICHEG and the museum are starting to explore the use of emulation and
virtualization technologies as a method of preserving game experiences. The Strong National
Museum of Play has used emulation technologies in at least two situations where the original
artifact was damaged or was too valuable to exhibit for public interaction.
Dyson states that there are a number of challenges looming on the horizon for museums.
Currently, ICHEG is concerned about games delivered through technologies such as Steam
or the iPhone App Store. In such cases, the distribution platform is a part of the game
experience, and if the distribution channel were to cease operations, the process of capturing
the game experience would be exceedingly difficult. Dyson is also concerned about the
preservation of experiences related to massively multi-player games and games that exist and
persist over a series of distributed servers. Such preservation leads to larger questions
regarding the overall preservation of software, firmware, operating systems, and application
infrastructures so that an experience can be properly recreated.
Dyson described current efforts of the museum and ICHEG to create a digital collections
policy. The policy is intended to determine what it means to collect artifacts that are digital
in nature, and is a starting point for the process and for communication related to the
preservation of digital artifacts.
Dyson also considers establishing communication between collectors and the companies
that produce electronic game hardware and software to be another upcoming challenge. With better
connections, the potential exists for greater collaboration related to preservation.
ICHEG's long-range plans are to move its processes beyond acquisition, restoration, and
presentation. ICHEG and the Strong National Museum of Play are committed to creating
plans for large-scale preservation and presentation policies that will require emulation,
virtualization, and migration strategies.
In the Preserving Virtual Worlds project, our research team gained insight into the problems
that are too often encountered by museums in their quest to provide preservation and access
to historical video games.
One such problem involved the process of video conversion, both for experiencing game
play and for capturing video of game play for preservation; the team first encountered it in a
case study related to the Atari 2600 Star Raiders game. The Atari 2600 utilized a circuit called
the "Television Interface Adapter," which output a radio frequency signal. The signal could be
received by a standard television set and was treated in much the same manner as an "over the air"
transmission. The signal was routed into the television set through a switch block, which
allowed the player to select either the game system or a standard television antenna. As part
of the Preserving Virtual Worlds research, the team acquired three such Atari 2600 systems
from different eras, including an original Sears Telegames Atari model, an Atari 2600
"Heavy-Sixer" (manufactured around 1977), and an Atari 2600 "Light-Sixer" (manufactured
around 1980). In order to create an authentic game experience, the team was able to procure
black and white and color televisions from the early 1970s (Sony 1974 Color TV 19",
Motorola B&W 13" portable with components dated 1971). Both of these sets were
characterized by split VHF/UHF inputs, rotary channel selection and tuning, and a wide
bezel to guard against overscan issues related to that era's cathode ray tube technology.
Although the team was able to gain a satisfactory play experience from the 1970s television
technologies, several problems manifested with progressively newer sets. For example, with
televisions from the 1980s through 2009, the team experienced problems such as pixelization
of image, fluctuation of brightness (photodiode and photoresistor compensation), problems
with signal lock (digital tuner problems), problems with sound reproduction (volume,
clarity), impedance conversion (conversion from 300 ohm to 75 ohm), and overscan
problems (ghosting, artifacts around image edge). The problems are only exacerbated by
newer television technologies based upon the conversion to digital TV signals in the United
States. The team was already aware of television technologies that have eliminated the
possibility of over-the-air input based upon the lack of analog signal as well as the decrease
in popularity of VCRs, game consoles, and DVD players that use a direct RF broadcast
signal. In such cases, the television is not even capable of receiving the signal, and as such,
nothing is displayed.
The problem also gains new dimensions if the video signal is to be recorded on a digital
capture device. In such scenarios, additional signal transformations may have to occur. For
example, in order for the team to record screen capture from the Atari 2600 using a digital
process, the signal was transformed from an analog RF signal to a composite TV signal
(using a VCR to do the transformation) to a digital signal (using a digital camcorder).
The experience gained from the video conversion project was shared with a team of curators
and game restoration specialists at the Strong National Museum of Play as part of an
extended dialogue that has resulted in a partnership between the museum and the Rochester
Institute of Technology, in which students gain cooperative education credit by assisting the
museum in video capture and play experience restoration tasks.
A second problem is the conversion of game-related media so that the media can be read
and processed by newer technology. Many of the earlier game technologies utilized
proprietary formats for their ROM cartridges, magnetic tapes, and floppy disks. As the
systems age, it becomes harder to find modern technology equivalents that are capable of
processing the information. For example, if a person no longer possesses an original Atari
2600, the only way to retrieve the information from a cartridge is to either purchase an Atari
2600 ROM reader from a third-party company (it cannot be purchased from the average
computer or electronics store) or to extract the relevant ROMs from the cartridge's circuit
board and process them through a ROM reader. With magnetic media, combinations of
media packaging, sizes, capacities, and protection techniques compound the problem. For
example, when the research team worked on a case study with DOOM, not only did it take
some time to hunt down a computer that was still capable of reading a floppy disk, but it was
also important to identify a floppy drive that could still process both single- and double-density disks.
In summary, game museums are at an important crossroads in the preservation process.
Curators at such institutions must balance the need to acquire rapidly disappearing artifacts
related to the early history of video games against the need to ensure that their acquisitions
can be studied and enjoyed for decades to come. Preservation technologies and strategies may
be able to bridge these two desires.
Analysis of Emulation and Virtualization
As part of the preservation strategy for virtual worlds and electronic games, one must
consider options that allow players to experience a game well beyond the
availability of a given hardware platform. Ideally, museums and collectors wish to preserve
the original experience through maintenance and upkeep, but as hardware platforms age, it
becomes more difficult to find repair parts. Furthermore, finding individuals with sufficient
knowledge to repair such systems also becomes increasingly difficult over time. In order to
augment the preservation of the original game system, techniques of emulation and
virtualization can be used to provide game play experiences on newer hardware. This section
examines the use of emulation and virtualization techniques as well as their implications for
our case study games.
Emulation Overview
An emulator is a software application that is capable of simulating a particular hardware
platform, including its hardware and the appropriate firmware and support software. Playing
through an emulator varies from the original experience because the emulator is designed to
execute on hardware that is radically different from the original platform and corresponding
system architecture. Emulators are designed to utilize the original executable code from the
game. For most emulation systems, this involves extracting and providing a copy of the
native cartridge ROM or game CD-ROM.
Over the last several years, the game community has seen a surge in emulation technologies
designed to allow users to play many of their favorite, classic console games. In order for an
emulator to successfully re-create a play experience, it must address several key areas of the
original game platform, including the central processing unit, the memory manager, the
environment code, and peripherals. In addition, the emulator must have a means of loading
and processing the original game code. (A brief illustrative sketch of these components
follows the list below.)
• Central Processing Unit – An emulator must be able to translate CPU instructions
and emulate the functionality of a specific processor. This includes
all arithmetic and logical operations, registers, storage modes, addressing modes,
interrupt handling, and input/output operations.
• Memory Manager – An emulator must be able to provide the structure and access to
main memory that is analogous to the original platform. This includes memory
layout and management for system internal ROMs, scratch RAM, system RAM,
device-based RAM and ROM (video, audio), and memory controllers (MMUs, DMA
circuits).
• Environment Code (ROM/Firmware/BIOS/OS) – An emulator requires the
appropriate software framework to operate. For some emulated systems, the
firmware is separate from the instructions that are part of the game media. For
example, DOS emulators will provide access to BIOS memory and some consoles
will provide access to ROMs for common system functions (peripheral calls, math
calculations, IO routines). Due to both maintenance and legal reasons, some
emulator developers separate the environment code from the rest of the emulation
system. Despite potential separation, this code is absolutely required for the
emulation system to operate.
• Peripheral Mapping – An emulator must provide mappings from the original system
to the target system. For example, in the case of most console systems, components
external to code execution include video output, player input (joysticks, paddles),
audio, and the game cartridge ports. In many cases, these components were not
designed for general use, but were specific to the particular platform. For example,
the Atari 2600 provided video and audio through a custom-designed Television
Interface Adapter (TIA), which provided a means for the consumer to attach the
console to a standard TV. The emulator must provide peripheral translation such
that TV signaling can be displayed upon modern video cards and monitors, input
devices can be mapped to modern joysticks, mice and keyboards, and ROM
cartridges can be mapped to a modern file system.
• ROM/Emulated Software – Target software that can be run on an emulator.
Technically, the target software is separate from the emulation
environment. However, for most game console emulators, the ROM or Emulated
Software must be processed so that it can be properly loaded and executed in the
emulator. For example, to run an Atari 2600 cartridge in an emulator, a copy of the
ROM code must be extracted with a ROM or cartridge reader and must be
formatted in a particular manner such that the emulator can properly parse the
resulting file.
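To make these components more concrete, the following minimal sketch (written in Python
purely for illustration, and not drawn from any emulator examined in this project) shows the
kind of fetch-decode-execute loop and memory mapping at the heart of an emulator for a
simple 6502-class machine. The two opcodes shown correspond to the 6502's "load
accumulator" and "store accumulator" instructions, but the memory layout, reset vector, and
ROM file name are hypothetical placeholders.

    # Minimal illustrative sketch of an emulator's core loop (not a real emulator).
    # Assumptions: a 6502-like CPU, a small flat address space, and a ROM image
    # file whose name ("game.rom") and load address are hypothetical placeholders.

    class Memory:
        """Maps reads and writes to RAM, cartridge ROM, and (omitted) peripherals."""
        def __init__(self, rom_bytes, rom_base=0xF000):
            self.ram = bytearray(0x1000)           # scratch/system RAM
            self.rom = bytes(rom_bytes)            # read-only game code
            self.rom_base = rom_base

        def read(self, addr):
            if addr >= self.rom_base:              # cartridge ROM region
                return self.rom[(addr - self.rom_base) % len(self.rom)]
            return self.ram[addr % len(self.ram)]  # RAM (peripheral registers omitted)

        def write(self, addr, value):
            if addr < self.rom_base:               # writes to the ROM region are ignored
                self.ram[addr % len(self.ram)] = value & 0xFF

    class CPU:
        """Fetch-decode-execute loop for a tiny subset of 6502 opcodes."""
        def __init__(self, memory, reset_vector=0xF000):
            self.mem = memory
            self.pc = reset_vector                 # program counter
            self.a = 0                             # accumulator register

        def step(self):
            opcode = self.mem.read(self.pc)
            self.pc += 1
            if opcode == 0xA9:                     # LDA immediate: load a constant into A
                self.a = self.mem.read(self.pc)
                self.pc += 1
            elif opcode == 0x8D:                   # STA absolute: store A to a 16-bit address
                lo, hi = self.mem.read(self.pc), self.mem.read(self.pc + 1)
                self.pc += 2
                self.mem.write(lo | (hi << 8), self.a)
            else:
                raise NotImplementedError(f"opcode {opcode:#04x} not emulated in this sketch")

    if __name__ == "__main__":
        with open("game.rom", "rb") as f:          # hypothetical ROM dump file
            cpu = CPU(Memory(f.read()))
        for _ in range(1000):                      # run a bounded number of instructions
            cpu.step()

A production emulator must, of course, also model cycle timing, interrupts, and the
platform's peripheral registers (the TIA, in the Atari 2600's case), which is where most of the
development effort lies.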
The community for emulation development, especially for video games, can best be
characterized as a grass-roots movement led by a few dedicated programmers. For most
emulator developers, the process involves reverse engineering the hardware and firmware
present in the original systems. To date, the emulation movement's focus has been
predominantly on classic console systems, including the Atari 2600, Sega, Playstation 1,
NES, Game Boy, and other early platforms. For general computing platforms, classic game-
related computers such as the Apple II, TI99/4A, Commodore 64, Atari 400/800, and
Amiga have all been the targets of emulation development. At one level, the focus on earlier
platforms may be based on their simplicity: earlier platforms consisted of simpler CPUs
(the 6502 series being a popular model), minimal memory management schemes, and a
limited range of peripherals. At another level, as these platforms age, hobbyists and fans are
urgently looking for ways to preserve these experiences. Because emulation efforts are often
community-driven, they are in various states of completion. A quick survey of popular emulators
for the Atari, Sega, Sony, Commodore, Apple, and Nintendo brands showed that over 70%
of the efforts never resulted in a viable emulation environment. In addition, among those
efforts that were completed, some were later abandoned for a number of reasons, including
loss of developer interest and legal confrontation. To make the situation even more difficult,
a survey of abandoned projects revealed that their websites had been taken over by malicious
software (e.g., many of the Apple II emulation sites have been replaced by malware websites
as well as virus- and spyware-laden applications).
Emulation Criteria
In exploring emulation environments, the research team compiled a list of factors deemed
critical for an emulator that was to be part of a preservation experience. The list of criteria is
as follows:
• Source Availability – Is the source code for the emulator freely available?
• Source Maintainability – Is the programming style of the emulator source code
readable? How easy is it to maintain the code? How well is the source code
documented? Does the source code's architecture make sense? How easy is it to
obtain the compilation tools and libraries necessary to rebuild the emulator from
scratch?
• Licensing – Does the emulator support a license model that is compatible with
preservation efforts?
• Range of Host Platforms and Operating Systems – Does the emulator work on a
variety of PC and Apple platforms? Can the emulator run under common operating
systems such as DOS, Windows, Apple OSX, and Linux?
• External Documentation – What is the extent of external documentation related to
the emulator and the target system, including what is necessary to execute the
emulator and related game ROMs?
• Sensory Fidelity – Does the emulator appropriately reproduce the visual, aural, and
tactile interactions of the original game?
• Peripheral Emulation – Does the emulator properly reproduce the range of
functionality related to the peripheral interfaces? Are the peripherals properly
reproduced from a code and experience standpoint?
• Peripheral Expansion – Does the emulator allow for the connection of original
peripheral devices (such as a gamepad or joystick)?
• Performance – Does the emulator run within acceptable parameters on the host
operating system? Are there options available to adjust the performance of the
emulation software?
• Ease of Use – Is the emulator easy to install and use by someone with a low level of
technical proficiency?
• Community Maintenance – Is the emulator actively maintained? How large is the
community supporting the emulator? Are there any active or looming impediments
to the continuation of the emulator?
• Robustness – To what degree can the emulator execute the original games? If there
are still bugs, to what extent do they inhibit playability?
• System Requirements – What does it take to run the emulator (CPU, memory, RAM,
peripherals)?
• Emulation Variation – Does the emulator emulate only the standard, run-of-the-mill
platform, or does it capture the manufacturing and firmware differences within a
particular series (e.g., does an Atari 2600 emulator give any indication that there were
actually four different production versions of that particular model, and does the
emulator embody the nuances between these variations)?
For purposes of this research, the team considered an emulator to be "good" if it had the
following characteristics (a brief illustrative sketch of how such an assessment might be
recorded follows the list):
• The emulator is based upon freely available source code and appropriate licensing.
• The emulator is actively maintained.
• There is reasonable internal and external documentation for the project.
• The emulator interface is easy to use by non-technical players.
• The emulator supports a wide range of performance and tuning options.
• The emulator is robust and provides a believable level of fidelity when compared to
the original game experience.
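The following is one hypothetical way such an assessment might be recorded so that
emulators can be compared consistently. The Python sketch below is illustrative only; the
fields paraphrase the checklist above, and the sample entry uses an invented emulator name
and made-up values rather than scores actually assigned by the project.

    # Hypothetical sketch of recording an emulator evaluation against the criteria above.
    # The boolean fields paraphrase the report's checklist; the example entry uses an
    # invented emulator name and made-up values, not data gathered by the project.
    from dataclasses import dataclass

    @dataclass
    class EmulatorAssessment:
        name: str
        source_available: bool       # Source Availability
        license_compatible: bool     # Licensing
        actively_maintained: bool    # Community Maintenance
        documented: bool             # Internal and external documentation
        easy_to_use: bool            # Ease of Use
        fidelity_acceptable: bool    # Sensory fidelity / robustness
        notes: str = ""

        def is_good(self) -> bool:
            """Rough translation of the working definition of a 'good' emulator."""
            return all([
                self.source_available,
                self.license_compatible,
                self.actively_maintained,
                self.documented,
                self.easy_to_use,
                self.fidelity_acceptable,
            ])

    example = EmulatorAssessment(
        name="ExampleEmu",           # placeholder, not an emulator tested here
        source_available=True,
        license_compatible=True,
        actively_maintained=False,
        documented=True,
        easy_to_use=True,
        fidelity_acceptable=True,
        notes="Upstream development appears stalled.",
    )
    print(example.name, "meets the 'good' bar:", example.is_good())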
Virtualization Overview
Virtualization is another technique that can be used to help in the preservation process.
Similar to emulation software, virtualization software is an application that allows a user to
execute code intended for an operating environment different from that of the host system.
Unlike emulation, virtualization is performed on CPU and system architectures in which the
host and target share similar attributes. In addition, virtualization creates a level of
abstraction that does not include the target operating system. Instead, virtualization provides
a bootstrapping mechanism in which a legacy or different operating system can be loaded
into a virtual machine. From there, additional software can be loaded into the virtualization
environment.
Virtualization's focus is on providing the appropriate level of abstraction for the hardware
environment, including the CPU, memory manager, and peripheral operation. Unlike
emulation, virtualization employs a variety of techniques to leverage functionality of the host
hardware architecture. For example, virtualization solutions developed for modern PC
architectures may employ a technique known as hardware-assisted virtualization. This
technique relies upon special processor instructions (Intel's VT-x and AMD's AMD-V
extensions) that help accelerate execution of the guest's instructions on the host processor
(a brief sketch of how these extensions can be detected appears after the list below). Some
experimental systems use a technique called paravirtualization, in which the guest operating
system is modified for use with the virtualization process. In either case, the virtualization
system must provide access to the following hardware components or provide an
appropriate level of abstraction:
• Central Processing Unit (CPU) – Ideally, in virtualization, the native CPU handles
every one of the instructions handled by the virtualization system. Unlike emulation,
which requires mapping and translation, this means that either the CPU must have
the appropriate extensions to handle the virtualization instruction calls or the target
operating system must have been modified to allow code to be context-switched
from the virtualization environment to the actual CPU.
• Memory Management – The memory management system is a hybrid between the
actual hardware and an abstraction process. At one level, the virtualization process
must provide the equivalent architectural access to main memory, including the
multi-user and demand paging techniques used within the system. At another level,
the virtual machine's memory space should be separated from any other virtual
machines or any processes running on the host system.
• BIOS/Firmware – The virtual machine should have access to the BIOS of the host
system. In cases where a legacy operating system requires access to an older BIOS,
the virtualization technology may provide a way of loading such ROM images into a
section of RAM. When such techniques are utilized, the virtualization system must
also account for any potential situations where devices are mapped into the
ROM/BIOS memory space.
• Bootstrapper – For native systems, there is usually a process by which an operating
system is loaded onto a hard drive. In such cases, BIOS or firmware code is
necessary in order to search for peripherals that contain bootable environments.
Virtualization systems must contain code to provide an operating system bootstrap
for initial configuration.
• Peripheral Mapping – Virtualization technology must provide resource management
for peripherals. In some cases, the peripheral is remapped, such as the use of a USB
memory stick as a floppy drive or a large file as a primary hard drive. In other cases,
the host peripheral is utilized in either a shared or exclusive mode, as with CD-ROM
drives, video cards, and sound cards.
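As a small, concrete illustration of the hardware-assisted virtualization described above, the
following Python sketch checks a Linux host's /proc/cpuinfo for the CPU flags that advertise
Intel VT-x ("vmx") or AMD-V ("svm"). It assumes a Linux host, and it reports only whether
the processor advertises the extensions, not whether they are enabled in the system firmware.

    # Sketch: detect hardware virtualization extensions on a Linux host.
    # Assumes /proc/cpuinfo is available (Linux only). The "vmx" flag indicates
    # Intel VT-x support and "svm" indicates AMD-V support; either may still be
    # disabled in the host's BIOS/firmware even when the flag is present.

    def virtualization_extensions():
        try:
            with open("/proc/cpuinfo") as f:
                cpuinfo = f.read()
        except OSError:
            return None                      # not a Linux host, or /proc unavailable
        flags = set()
        for line in cpuinfo.splitlines():
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())
        if "vmx" in flags:
            return "Intel VT-x"
        if "svm" in flags:
            return "AMD-V"
        return "none detected"

    if __name__ == "__main__":
        print("Hardware virtualization support:", virtualization_extensions())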
Unlike the emulation community, the virtualization community targets general-purpose
computing and is often supported by larger corporations. For example, leading efforts in this
area are conducted by Oracle/Sun Microsystems, VMWare, Red Hat, and other entities that
have invested substantial resources in the technology.
Virtualization Criteria
The research team compiled a list of criteria for virtualization technologies:
• Source Availability – Is the source code for the virtual machine freely available? If
not, is it supported by a commercial entity with a significant investment in the
success of its virtualization technology?
• Source Maintainability – Is the programming style of the virtual machine source code
readable? How easy is it to maintain the code? How well is the source code
documented? Does the source code's architecture make sense? How easy is it to
obtain the compilation tools and libraries necessary to rebuild the virtual machine
from scratch?
• Licensing – Does the virtualization technology support a license model that is
compatible with preservation efforts?
• Range of Host Platforms and Operating Systems – Does the virtual machine
software work on a variety of PC and Apple platforms? Can the virtual machine run
under common operating systems such as Windows, Apple OSX, and Linux?
• External Documentation – What is the extent of external documentation related to
the virtualization platform?
• Peripheral Emulation – Does the virtualization technology properly reproduce the
range of functionality related to the peripheral interfaces? Are the peripherals
properly reproduced from a code and experience standpoint?
• Peripheral Expansion – Does the virtualization technology allow for appropriate
device mappings? Are there clear interfaces for the addition of shared, exclusive, or
simulated device code? Can the process of device inclusion be handled by a
programmer with moderate systems programming skills?
• Performance – Does the virtualization technology run within acceptable parameters
on the host operating system?
• Ease of Use – Is the technology easy to install and use by someone with a low level
of technical proficiency?
• Community Maintenance – Is the virtual machine actively maintained?
• Robustness – To what degree can the virtualization system run applications? If there
are still bugs, to what extent do they inhibit the software experience?
• System Requirements – What does it take to run the virtualization system (CPU,
memory, RAM, peripherals)?
For purposes of this research, the team considered a virtualization technology to be "good"
if it had the following characteristics:
• The virtualization technology is based upon freely available source code and
appropriate licensing. In cases where the source was not available, the virtualization
technology was being developed by a company dedicated to virtualization.
• The virtualization technology is actively maintained.
• There is reasonable documentation for the project.
• The virtualization interface is easy to use.
• The virtualization technology is robust and provides a believable level of fidelity
when executing legacy software.
Case Studies
As part of this study, the research team selected four game titles from our case set for testing
with emulation technology. The game selection included Star Raiders for the Atari 2600,
Mindwheel for the Apple II computer, Mystery House for the Apple II computer, and DOOM
for an IBM PC/MS-DOS environment. For the purposes of the study, the selected software
titles were executed on modern hardware with a selection of commonly used operating
system software. In all, the team selected ten Intel-based PCs, with Pentium IV processors,
at least 4 GB RAM, Nvidia 8800, 9600, and GTX260 video options, CD/DVD ROM drives,
at least 320 GB hard drives, integrated Realtek sound solutions, as well as standard mouse
and keyboard devices. The systems were configured with a range of host operating systems,
including Windows XP 32 bit edition, Vista Ultimate 32 bit edition, Windows 7 Ultimate 32
bit edition (Release Candidate), Windows 7 Ultimate 64 bit edition (Release Candidate), and
a variety of Linux variations including Ubuntu, Debian, Arch, Gentoo and Fedora. For
Linux installations, additional packages were installed to create an appropriate graphical and
audio environment for emulation and virtualization, including X11, Gnome, ALSA,
PulseAudio, OpenAL, OpenGL, and other library and application sets. In addition to the PC
environments, the team had access to Intel-based MacBook Pro computers along with
PowerPC-based Mac G5 systems. These systems were configured with
both Tiger OSX (10.4) and Leopard OSX (10.5).
Case Study 1 ? Star Raiders
Star Raiders, released by Atari for the 2600 platform in 1982, was a cartridge-based game in
which the player pilots a fighter spacecraft and engages in battle
with opponents known as "Zylons." Star Raiders is an instrumental game in video console
history for a number of reasons. First, it immersed the player in a first-person, "3D"
dogfight environment, which allowed the player to engage enemy craft via a combination of
thrust and rotation interaction. As such, the game is considered to be the forerunner of an
entire series of dogfight combat simulators for arcade and home console play, including such
classics as Wing Commander, Star Wars, and Elite. Second, the game utilized a non-standard
pad controller that allowed for keyboard-style commands on the Atari 2600. The player
worked the keyboard controller in tandem with the joystick depending upon the particular
phase of the game. The Star Raiders cartridge consisted of an 8K ROM, which stored the
entirety of the game code.
The target platform for Star Raiders was the Atari 2600 game console. Created in 1977,
the Atari Video Computer System consisted of a MOS 6507 (6502 variant)
microprocessor, a MOS 6532 RAM-I/O Chip (128 bytes static memory, timer, and 8 digital
I/O ports), and a Television Interface Adapter Chip (TIA). The MOS 6532 was used to
buffer the status of console chips as well as buffer the states of the joystick devices. The TIA
provided the ability to generate an RF signal that could be received on channels 2 or 3 of a
standard television set. The TIA generated both the video and audio signals used by the
game system. The console also provided an adapter port for receiving game cartridge ROMs.
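Because the entire game resides in the cartridge ROM, emulation-based preservation of a
title such as Star Raiders begins with a dump of that ROM as a flat file. The short Python
sketch below illustrates one way such a dump might be verified before it is handed to an
emulator; the file name, the expected 8K size, and the idea of simply printing the checksum
are illustrative assumptions rather than a description of the project's actual workflow, in
which fixity values would normally be compared against those recorded at ingest.

    # Hypothetical fixity check for a cartridge ROM dump prior to emulation.
    # The file name and expected size (8K, matching an 8K cartridge) are placeholders.
    import hashlib
    from pathlib import Path

    def check_rom_dump(path, expected_size=8 * 1024):
        data = Path(path).read_bytes()
        if len(data) != expected_size:
            raise ValueError(f"{path}: expected {expected_size} bytes, got {len(data)}")
        digest = hashlib.sha256(data).hexdigest()
        print(f"{path}: {len(data)} bytes, sha256={digest}")
        return digest

    if __name__ == "__main__":
        check_rom_dump("star_raiders_dump.bin")   # placeholder file name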
In order to ensure emulation accuracy, the research team had access to several original Star
Raiders cartridges as well as three Atari 2600 platforms, including a Sears Telegames Model, a
"heavy sixer," and a "light sixer." As part of emulator testing, the research team selected
four emulation systems to test on PC-based environments. The four systems selected
included Stella (http://stella.sourceforge.net), z26 (http://www.whimsey.com/z26), PC Atari
Emulator (http://pcae.vg-network.com), and MESS (http://www.mess.org).
All four emulation platforms were able to successfully run the Star Raiders ROM image.
However, based upon the team's criteria for emulation software, all but Stella were
eliminated from contention. Specifically, MESS's licensing policy was too restrictive for use
as an archival system, and both PC Atari Emulator and z26 are in an indeterminate state of
upkeep, despite the latter's popular following in some segments of the emulator community.
In addition, PC Atari Emulator experienced difficulties with clock timings in the first version
we evaluated (from a non-authoritative emulator site). The problem proved inconsistent,
varying with the target PC platform and operating system (problems were experienced on
MS-DOS and FreeDOS).
In side-by-side comparisons, the test team noted several differences. First, there
is a perceivable quality difference in the graphical output between the emulator and the
original 2600 output for both black and white as well as color televisions (for authenticity,
we did some of our testing on 1970s era televisions). The RF/television combination created
an experience in which the graphics were not as sharp as the emulated version. Although this
might seem a detriment, the slight blurriness disguised several visual artifacts that
distracted players on the emulated systems. Second, the sound quality was also different in
the emulated versions. Although the sound approximation of the TIA was acceptable in the
emulated version, there were some pitch and timbre variations noted in the audio
reproduction when compared to the original experience. In part, this was due to differences
in the fidelity when comparing the internal speakers of a 1970s era television versus a
modern sound card output to commonly available desktop speakers. To verify that the sound
quality difference was not due to aging of the speakers, the team attached the Atari console
to televisions from the 1980s and 1990s to compare the sound experience. The results
correlated to the experiments performed with the 1970s era television sets. Finally, the
mouse and keyboard experience did not properly substitute for the joystick and pad
experience of the original. The team later found that an adaptor could be purchased
(www.stellaadaptor.com), which would allow the original Atari 2600 joysticks and pads to be
interfaced to a modern PC. The adaptor converted the original digital I/O signals from the
9-pin connector to a USB format that could be utilized by z26 and Stella. Despite these
issues, the Atari 2600 emulator community has created a robust product that provides an
extraordinary level of support for reproducing the original Atari experience. The emulation
technology is mature enough to support most Atari cartridges that were produced, and the
involvement of the development community has ensured that most emulators support a rich
set of tools for both debugging existing ROMs and allowing hobbyists a platform for
software experimentation. However, this investigation has also revealed one of the critical
weaknesses in the emulator community: as emulators age, they too can fall into neglect and
disrepair.
Case Study 2 ? Mindwheel and Mystery House
Mindwheel, published in 1984 by Broderbund, was available on a variety of platforms
including the Apple II. U.S. Poet Laureate Robert Pinsky designed the game as an interactive
novel that allowed the player to explore the thoughts and minds of four deceased characters
within the fictional work, with the player trying to solve particular problems in search of a
"Wheel of Wisdom." The game was accompanied by a hardcover instruction manual
a novella that served both as a way to introduce the story line and as an early form of copy
protection, as the player was requested to enter words from particular pages and paragraphs
in the novella.
Mystery House, an Apple II title published in 1980 by On-line Systems, is a "whodunnit" game
in which the player must explore a mansion in search of clues that will lead to the identity of
a murderer, who is also locked within the house. The game builds on the tradition of text-
based exploration games, but also utilizes simple line drawings to augment the text
descriptions within the game. The game was designed to run on a standard Apple II
computer and was distributed on 5.25" single-sided floppy disks for the Apple floppy drive.
The target platform for these games was the Apple II/Apple II+ computer system. Based
upon the MOS 6502 processor, the Apple II/II+ supported up to 48K RAM, cassette and
floppy drives through expansion ports, and an NTSC composite output signal for both
monochrome text and color graphics. The support ROM built into the Apple II/II+
provided direct support for Apple's BASIC dialect. Sound was generated by pulsing a
speaker. Crude wave generation could be computed on the computer, allowing for low
fidelity playback of voice and multi-channel source sound (on a small internal speaker within
the computer system).
To ensure that the Apple II emulator experience was similar to the original, the team was
able to secure time on an original Apple II+ system in order to determine how well each
emulator matched the original system across a variety of Apple II applications and games.
The team discovered a number of emulator systems for the Apple II. The following list
includes general observations about each emulation application.
Apple PC (http://www.zophar.net/apple2/applepc.html) emulates the Apple II+, Apple
IIe, and Apple IIc computers. The emulator does not include the Apple II ROM, but the
user community provides appropriate sources for downloading the ROM images. Apple PC
is designed to run on a DOS system, and can run on DOS emulators such as DosBox. The
software cannot run on limited DOS environments such as those available in Vista and
Windows 7. Apple PC emulates monochrome green and white monitors, and provides
sound from the PC speaker, Sound Blaster, or MockingBoard options. The emulation allows
for the mouse to substitute for a joystick. The software does not require explicit installation
and can be run from a directory on the hard drive. Help is included, and the
system includes an integrated debugger. Apple PC uses an inverted capslock scheme, which,
when utilized, will persist in lowercase mode appropriate to later Apple models. Backspace
and arrow key functions, along with cursor representation, are mapped such that backspace
and left arrow move the cursor left. Apple PC is incapable of running Mindwheel.
Appler (http://www.zophar.net/apple2/appler.html) is predominantly an Apple II+
emulator, although the README file enumerates several hardware options that span
different Apple II class revisions. Appler is designed to run on a minimal 386 system, with at
least 1MB of RAM, 128K EGA graphics adapter capability, and the MS-DOS 3.30
operating system. Appler will not run on newer versions of Windows, including Vista and
Windows 7 without a DOS emulation environment such as DosBox. The emulator includes
the Apple II ROM and supports a monochrome white monitor mode for text applications
and a color (green, white, magenta) output for graphical applications. Appler supports sound
and does not require any special installation. The emulator comes with help and includes an
integrated debugger. Keyboard interaction varies with the application. For Mindwheel, the
backspace and left arrow both perform the backspace, and delete does not perform any
operation. In Mystery House, backspace and left arrow move the cursor back without erasing
the previously typed text. Appler uses a different font than the standard Apple II character
generator. In addition, there are issues with combined text and graphics renderings as well as
issues with running Mindwheel.
AppleWin (http://developer.berlios.de/projects/applewin/appler) was by far the best Apple
emulator tested on the PC platform. Distributed under a GNU license, the emulator
supports Apple II, II+, and IIe computers and includes the Apple II ROM. The emulator
runs on Windows platforms (tested with Windows XP, Vista, and Windows 7 RC). The
emulator supports a wide range of monitor formats, including four monochrome variants
(white, green, amber, and custom), as well as color modes (standard, TV, text-enhanced, and
half-shift). Sound is available with Mockingboard and Phasor options. AppleWin differs
from other emulators in that it has a performance mode that allows for "authentic" timings
and run speeds as well as optimized modes that allow it to take advantage of the host
hardware. AppleWin requires installation, and includes extensive help, information on the
Apple platform, and a complete debugger. AppleWin supports a toggle capslock mode that
starts in uppercase. In Mindwheel, backspace, left arrow, and delete all perform a backspace.
In Mystery House, backspace and left arrow both move the cursor left. Delete replaces the
highlighted character with a blank and the image of the cursor does not blink.
Virtual Apple (http://www.virtualapple.org) is a browser-based emulator. The emulator is
capable of executing on both Mac and Windows platforms, and is supported by popular
browsers such as Internet Explorer, Firefox, Safari, and Chrome. The application emulates
sound and provides two monitor modes: a white-on-blue mode for text, and a green, white,
magenta mode for images. There is also a Java implementation that can be run from the
browser. There is minimal help and no debugging support. Capslock and keyboard reactions
are fairly atypical, even when compared to other Apple II emulators.
Virtual II (http://www.xs4all.nl/~gp/VirtualII/) is an emulator designed to run on the Mac
platform. The emulator supports Apple II, Apple II+, and Apple IIe computers. It does not
come with Apple II ROMs, requiring the user to search the Internet for an appropriate
source or to extract the information from their own ROM set. The emulator requires at least
the Mac OSX 10.4 operating system. The emulator allows for monochrome and color video,
and supports a mode to toggle scan lines on and off. The emulator supports audio including
Mockingboard emulations. Options allow for controlling refresh rate as well as CPU speed.
Virtual II requires installation and includes help files and a debugger. To emulate flexible
keyboard behavior, Virtual II has a "soft caps lock" option.
By far, the greatest problem the team noted with emulation technology for the Apple II was
the inconsistent use of a standard keyboard metaphor when dealing with the Apple II/II+
keyboard. The Apple II/II+ keyboard worked primarily with uppercase encodings, with such
machines generating only uppercase glyphs for certain applications. The variation among
emulators leads to keyboard behavior that is not representative of the original experience. It
also means that emulator users should explore each emulator's keyboard behavior, or else
they could observe unexpected results in the play experience.
Case Study 3 - DOOM
DOOM, by id Software, is universally synonymous with the first-person shooter genre of
video games. Developed in 1993 by John Carmack and John Romero, the game story
involves a space marine (you, the player), who is dispatched to Mars' moon Phobos to
contain a dangerous situation created by the Union Aerospace Corporation, a mega-
corporation responsible for processing radioactive waste and supplying the military. The
game is known for its graphic nature, supporting a range of weapons as well as stylized
violence. The goal for players was to navigate levels searching for keycards, which in turn
allow access to additional areas of the game. Along the way, players have to both avoid and
fight various monsters with a range of abilities and strengths. DOOM was groundbreaking in
advancing several techniques for game engine design, including 2.5D game level design, the
use of binary space partitioning (BSP), complete texture mapping for room surfaces, and
non-perpendicular
space design. DOOM also provided a means by which the game could be experienced in a
single-player mode as well as a multi-player mode over a network or a modem.
DOOM was designed to run within an IBM PC environment. Its minimum specifications
stated that it could be played on a system with an Intel 80386 or compatible processor, 4MB
of RAM, a VGA graphics card, and a hard disk drive. The preferred specifications called for
an Intel 80486 processor, a Sound Blaster Pro or Sound Blaster Pro compatible
sound card, and a network card that could support IPX network protocols (with a terminate
and stay resident application or DOS network driver stack). Although an operating system is
not specified, the README file contained on the distribution media mentions setup issues
with MS-DOS 5.0 and before, MS-DOS 6, and OS/2.
With the advent of PC compatible clones in the 386/486 era, there is no standard reference
platform for the target system. For systems of this era, motherboards could support 8MB to
32MB of system RAM, a PC compatible BIOS, and a number of expansion slots for ISA,
VESA, and EISA peripherals. Common peripherals included VGA cards that conformed to
the VESA specification, coax-based Ethernet cards, modems, drive controllers, COM ports,
parallel ports, joystick ports, and multifunction IO adaptors for floppy drives and hard disks.
Unlike modern PCs, most peripherals required that the user set, through jumpers,
appropriate IO addresses, memory addresses, interrupt lines, and DMA channels.
Jumper configuration problems could cause unintended consequences, including machine
lockups, system crashes, and indeterminate behavior. When running many applications, the
user would have to provide known settings for peripherals to the program in order for it to
operate effectively.
The team was able to secure an older 486/DX50 system for testing an original DOOM
distribution (version 1.8 on both Floppy and CD-ROM [Shareware/Retail]).
Before selecting virtualization and emulation technologies, the team examined two different
ways of providing an operating system environment for the game. One choice was to use a
Microsoft MS-DOS 6.22 distribution from floppy disk. The MS-DOS option is based upon
Microsoft's closed-source operating system. Although Microsoft no longer directly sells
MS-DOS, it is still available through the Microsoft Academic Alliance for educational use.
The second choice was to use FreeDOS (http://www.freedos.org), an open
source alternative to MS-DOS. FreeDOS has GPL licensing and was developed for both
modern and legacy systems. FreeDOS can be used to provide DOS functionality for the
Windows NT line of operating systems, Windows XP, Vista, and Windows 7. FreeDOS
supports such features as FAT32 file systems, disk cache, memory management (EMS and
XMS), mouse extensions, and multimedia extensions for CD ROM.
The team worked with a number of virtualization and emulation applications for DOOM.
The following is an overview of the various systems, detailing what worked and what did
not.
Virtual PC 2004 and Virtual PC 2007 (http://www.microsoft.com/windows/virtualpc)
are virtualization systems promoted by Microsoft that provide virtualization support for a
Pentium II processor and/or the virtualized CPU of the host platform. Virtualization device
mapping provided a uniform peripheral architecture including an S3 Trio 32 VESA card, an
AMI BIOS, a Sound Blaster 16, and a DEC 21xxx series Ethernet solution. Virtual PC 2004
was able to run on Windows XP and Vista, but not Windows 7 RC. Virtual PC 2007 could
run on all the Windows variants that were tested. Both versions of Virtual PC were able to
run MS-DOS as well as FreeDOS, and supported the DOOM installation process. Video
worked nicely in both versions. Sound effects were selectable for PC Speaker and Sound
Blaster solutions. Music worked with Sound Blaster, Adlib, and Pro Audio Spectrum
selections. For Virtual PC 2007, DOOM would tend to freeze on all tested operating systems
except Windows 7 RC 64 bit. After several attempts, the team was unable to create a
reproducible freeze point: the DOOM application would freeze the Virtual PC application at
indeterminate points in time.
DOSEmu (http://www.dosemu.org) is a combination hardware virtualization and emulation
solution capable of running classic operating systems such as MS-DOS, FreeDOS, and DR-
DOS. The program is available for Linux systems and is distributed under a GPL license.
DOSEmu provides partial support for sound, with current workarounds for Sound Blaster
music compatibility issues. DOSEmu was able to work with all five of the Linux test
platforms. The distribution used by the team automatically installed FreeDOS as the target
operating system. Although DOSEmu is supposedly capable of mapping to an MPU-401
MIDI card, the testers were unable to reproduce this mapping successfully. Therefore, on all
of the host platforms, music did not work. However, sound effects did work in speaker and
Sound Blaster mode, except for the Gentoo build, which does not support standard audio
device mapping in the base distribution set.
QEMU (http://www.qemu.org) has aspects of both a virtualization platform and an
emulation system, and is a contributing component for a variety of other VM solutions,
including VirtualBox, Xen, and KVM. QEMU was deployed on all five of the host Linux
systems. In all cases, it was able to install both MS-DOS and FreeDOS, and was able to load
the DOOM game software. For sound effects, QEMU was able to provide PC Speaker
access and Sound Blaster functionality. The sound support did not properly work on
Gentoo and Fedora due to audio device mapping. Music was supported for Adlib, Sound
Blaster, and Pro Audio Spectrum sound solutions. Fedora had some difficulties mapping the
arrow keys. Problems with Fedora also led to some situations where the game
would manage to get into an unplayable state.
VirtualBox (http://www.virtualbox.org) is a commercial-grade virtualization solution that
was offered by Sun Microsystems and was recently acquired by Oracle along with Sun's
other operational assets. VirtualBox is distributed in both a commercial version as well as an
open source edition. VirtualBox can virtualize a wide range of peripherals and provides
native virtual execution, a just-in-time dynamic recompiler, and hardware-accelerated
virtualization features. VirtualBox can be run on a wide range of host operating systems. In
the testing phase for VirtualBox, the team was able to successfully run the environment
under all Windows variants, as well as Ubuntu and Gentoo Linux. VirtualBox was unable to
execute under a Fedora base installation due to missing files, and was unable to run under
Debian and Arch due to VM initialization problems. Unfortunately, VirtualBox had many
problems running DOOM: the virtualization process made the game very slow on Windows
variants. To verify that this was not a problem with the installation, the team ran text-based
applications, which did not experience the same problems as DOOM.
Although the Sound Blaster virtualization allowed for sound effects, music
selections through the Sound Blaster and other devices were ignored, regardless of the sound
card settings.
The final virtualization test was with VMWare (http://www.vmware.com), a company
dedicated to the creation of virtualization technology. VMWare was used as a baseline to
compare the previously tested virtualization solutions. VMWare worked with the various
versions of Windows, but worked only with Ubuntu and Debian Linux distributions. Arch,
Gentoo, and Fedora were unable to provide re-compilation for the virtualization tools and
device mappings. For all distributions, VMWare experienced problems with crashes and
issues with sound virtualization. As such, VMWare did not provide a suitable execution
environment for DOOM.
The final application we tried was Peter Veenstra's DOSBox. DOSBox is actually an
emulator that provides an environment analogous to an IBM PC running MS-DOS. The
emulator provides a wide range of video support, from original
Hercules monochrome, through CGA, EGA, VGA, VESA, and S3 Trio 64. For audio, the
emulator supports emulation of the PC Speaker, Sound Blaster, MPU-401, Disney Sound
Source, Gravis Ultrasound, AdLib, and Pro Audio Spectrum. The emulator has support for
network emulation as well as advanced features for screen and movie captures. DOSBox
does not require the user to own a copy of MS-DOS from Microsoft, so the FreeDOS scenario is not
applicable to this environment. In testing, DOSBox ran within every host operating system
the team tried, including all Windows and Linux variations. DOSBox performed the best of
all the tools tested, exhibiting only a freeze problem under Linux with the WaveBlaster sound
driver set; by far, it provided the best experience for DOOM of any of the prior
technologies.
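To give a concrete sense of how such an emulated session can be scripted, the sketch below
launches DOOM under DOSBox from Python. The DOSBox binary name and the game directory are
assumptions rather than part of our test environment; the commands passed with -c simply mount
the directory, start the game, and close the emulator afterwards.

    # A minimal sketch (not part of the project's test harness) for scripting a
    # DOSBox session for DOOM.  The binary name and path are assumptions.
    import subprocess

    DOSBOX = "dosbox"                     # assumes DOSBox is on the PATH
    GAME_DIR = "/srv/preservation/doom"   # hypothetical directory holding DOOM.EXE and its WAD

    subprocess.run([
        DOSBOX,
        "-c", f"mount c {GAME_DIR}",      # mount the game directory as drive C:
        "-c", "c:",
        "-c", "doom.exe",                 # the shareware executable may be DOOM1.EXE instead
        "-c", "exit",                     # close DOSBox when the game terminates
    ])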
There were virtualization problems with both MS-DOS and FreeDOS DOOM scenarios,
including intermittent crashes, freezes, and sound mapping failures. The pure emulation
solution created the best experience for the game.
Analysis of Migration
Another important part of the technology of preservation is migration, the
process of moving code from its original platform to a newer target platform that is
capable of running the software application. There are two types of migration that one must
consider for the preservation of virtual worlds and video games: source migration and binary
migration.
Source migration, or "porting," is possible if one has access to the original source code and
assets for a game experience. The process of source migration is usually easier on equipment
that possesses similar CPU series and related platform architectures, but can be performed in
cases where the source and target architectures differ. To perform source migration, one
must possess appropriate tools, such as interpreters, compilers, and assemblers, that are
capable not only of reading the source code but also of targeting the
new environment. Along with the tools, the re-compilation process requires access to
libraries and APIs related to graphics, user input, operating systems and firmware access,
math functions, and data parsing. In some cases, source migration may require the original
source code for the libraries, as well as the game. In other cases, the operating system or the
API set may provide appropriate wrappers for legacy code (e.g. DirectX SDK support for
previous versions). If such libraries are not available, one may have to rely upon conversions
or library ports from the open source community. Along with the code, digital assets might
need some special attention in the source migration process. At the simplest level, the
original code and libraries may be able to use the content directly, and, as such, very little
might have to be done. For example, in migrating image manipulation code, libraries will
allow for older image formats such as BMP and JPEG to be loaded, but special attention
must be paid to elements such as image bit depth, alpha channel designation, bit encoding,
as well as file name format and case sensitivity.
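As a hedged illustration of this kind of asset attention, the sketch below uses the Pillow
library (our own choice, not one mandated by any migration toolchain) to normalize legacy BMP
images to a predictable bit depth and alpha channel and to lower-case their file names so that
case-sensitive target platforms resolve them consistently.

    # A sketch of asset normalization during source migration, assuming the
    # Pillow library is available.  Directory names are hypothetical.
    from pathlib import Path
    from PIL import Image

    SRC = Path("legacy_assets")      # original BMP files
    DST = Path("migrated_assets")
    DST.mkdir(exist_ok=True)

    for bmp in SRC.glob("*.BMP"):
        img = Image.open(bmp)
        img = img.convert("RGBA")    # force 8 bits per channel plus an explicit alpha channel
        out = DST / (bmp.stem.lower() + ".png")
        img.save(out)                # write to a well-documented modern format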
Binary migration is possible if one has access to the binary and assets of the original game.
Binary migration is usually only possible in systems where the CPU series and platform
architecture is backward compatible with the legacy system, for example, the execution of a
PC compatible Intel 386 binary application on a PC compatible Intel Pentium IV system.
Although some binary game applications are self-contained, others rely upon dynamic
libraries and modules provided by the target operating system or platform environment. For
example, DirectX 9 games are reliant upon a Windows operating system as well as the
presence of the DirectX 9 run-time environment. Such games are often packaged with an
integrated or stand-alone installer for the DirectX 9 components. Binary migration can be
susceptible to small changes in computer architecture, changes in peripheral devices, CPU
optimizations, and changes in BIOS and firmware code.
Another difficulty encountered within the binary migration process is the conversion from
original storage and execution media to newer media that can be accessed by modern
devices. Since the advent of the personal computer, media storage formats have changed
frequently. Tapes were replaced by magnetic disks, which gave way to optical disks and,
most recently, to solid-state flash memory. Each format has numerous variants, and each
variant requires its own drive connected to a computer through a series of different ports
and cables often made obsolete within two years of introduction to the market. As a result,
the cultural record of the last 30 years is scattered over a variety of media that cannot be read
by humans (or modern computers) without the aid of obsolete and hard to find machinery.
One of our test cases for this project was the preservation of Mindwheel, a piece of interactive
fiction written in part by U.S. Poet Laureate Robert Pinsky (see also Case Study 2 in Section
6). We obtained a Macintosh copy of this title, but it came to us on the original 400K 3.5"
floppy disk. While, as of this writing, USB-connectable drives capable of reading 1.44 MB
disks are common, these devices are unable to read 800K or 400K disks. Fortunately, we had
access to a Macintosh Powerbook G3 "Wallstreet edition" that had both a floppy drive
capable of reading the 800K disks as well as an Iomega Zip drive.
Using this machine, along with an external USB Zip drive, we created an image of the floppy
on a Zip disk using the Powerbook, then transferred the image via the external USB Zip drive
to a contemporary iMac. Unfortunately, the disk image was in the now-obsolete MFS
format and so accessing the individual files via a contemporary machine remained difficult.
We solved the problem in two ways. The first involved opening the disk image in the
popular Mac Plus emulator, vMac, and transferring the files to a virtual disk image formatted in
HFS (readable by both the emulator and a modern operating system). We later found the
"MFSLives" software provided by Apple and were able to open the image within the native
operating system.
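The imaging step itself can be reduced to a very small amount of code once the medium is
visible to a modern machine. The sketch below assumes a Unix-like host where the carrier
appears as a block device (the device path shown is hypothetical); it reads the medium sector
by sector and records an MD5 fixity value alongside the resulting image.

    # Sector-by-sector imaging with a fixity hash; the device node is a placeholder.
    import hashlib

    DEVICE = "/dev/disk2"             # hypothetical device node for the carrier medium
    IMAGE = "mindwheel-400k.img"

    md5 = hashlib.md5()
    with open(DEVICE, "rb") as src, open(IMAGE, "wb") as dst:
        while True:
            block = src.read(512)     # read one sector-sized chunk at a time
            if not block:
                break
            dst.write(block)
            md5.update(block)

    with open(IMAGE + ".md5", "w") as f:
        f.write(md5.hexdigest() + "\n")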
The chains of transfers from medium to medium and platform to platform described above
are common in digital preservation projects, but the fragility of the links that compose them
is palpable. The process described above depended, ultimately, on an aging laptop, available
only through serendipity. As the chronological distance between the date of creation and the
encounter in the archive grows, accessing old storage media will likely become even more
difficult. Already, formats even as common as the once ubiquitous 5.25" floppy disk present
significant obstacles to preservation and access.
Most scholarly work in born-digital preservation to date has focused primarily on the
problem of archiving, cataloging, and accessing material that has been transferred to modern
hardware. The problem of copying data from older machinery has largely been solved
informally on an ad hoc basis. Obsolete drives are pulled out of storage closets and
connected to computers old enough to have the requisite ports but new enough to accept
more modern connections. Hobbyists build custom-made cables for connecting old
drives to modern computers and sometimes post their schematics on the web, but their
documentation is often understandably amateurish and rarely maintained.
There are, however, a few well-supported
solutions upon which this project will build.
Our experience demonstrates that some
computers built at transitional moments in
the evolution of media technology can
serve as "Rosetta Stone" machines,
translating the archaic and forgotten to a
form recognizable by the modern. Of these,
the Macintosh Wallstreet edition
Powerbook G3 is among the best. The
laptop, manufactured between May and
September 1998, came with swappable CD,
DVD, and floppy drives capable of reading
800K and 400K disks. A swappable zip drive could also be purchased for the machine. An
Ethernet port further allows data to be transferred from the computer using standard
networking protocols, and PCMCIA slots permit the addition of USB ports through a third
party card to which an external hard drive or even flash media can be attached. The
hardware is capable of supporting older versions of Linux, and with it many contemporary
open source software packages. The machine does not, of course, natively support 5.25"
floppies or other more archaic formats, but it does serve as an example of the sorts of
machines that may prove very valuable to
digital preservation laboratories in the
future.
One of the best commercial tools produced
in the present day for accessing old media is
the Catweasel manufactured by the German
company Individual Computers. The
Catweasel is an internal PCI-card that allows
more modern drives to access older formats
(including those manufactured by
Commodore). Unfortunately, it requires the
end user to have an internal floppy drive of
the appropriate size as well as an internal PCI slot, thereby all but limiting the tool to use in
desktop computers. Digital archival and recovery work must often be performed using
portable workstations that can easily be brought to the archival site. For these purposes,
Device Side Data's FC5025 USB 5.25" floppy controller, released in February 2010, provides
most of the same functionality as the Catweasel but with an external USB connection.
For 800K and 400K disks, much of the Catweasel's functionality can be replicated with the
robust OmniFlop software, developed and released as freeware by Jason Watton of the
British company Sherlock Consulting Limited. OmniFlop replaces the default drivers in
Windows XP with custom-made software that controls the mechanisms to permit standard
floppy drives to read almost any format. Like the Catweasel, it requires a computer with an
internal floppy drive (USB floppy drives are not usually compatible with the software).
If one has access to the original machinery,
it is sometimes possible to connect early
personal computers to a modem (originally
used for dialing up bulletin board systems
or commercial services like CompuServe or,
later, America Online). With the
appropriate software, and a working old
computer with the appropriate drive and a
compatible modem, it is possible to send
files to any other computer using what is
called a null modem connection. Finding all
of the necessary components is often
difficult today, however, and one must assume they will become increasingly rare in the
future. The procedure is a useful one, then, but not ultimately sustainable.
In January 2009, Nate Lawson, a consultant at a California digital security company called
Root Labs, announced a USB connection to a Commodore 1541 drive using what he calls
the xum1541 cable. The device is an optimization of the xu1541, originally designed by Till
Harbaum in 2007, but later adopted by Spiro R. Trikaliotis of the Institut für Automation
und Kommunikation in Magdeburg, Germany. The xu1541 project includes schematics for
building a USB connection to the 1541 using components that the author claims will cost
less than 5 euros. Lawson's updated xum1541 device is not yet available for purchase, nor
are his schematics available online, but a YouTube video gives enough information about the
component pieces to suggest an approach for building similar cables for the 1541 and other
early 5.25" drives.
While the amount of preservation work that has been accomplished even under such less-
than-ideal circumstances is remarkable, a formalized and repeatable set of procedures is
needed, and needed immediately. With every passing day, the problem becomes increasingly
urgent as drives fail and 5.25" disks crumble like papyrus, and with them disappear large
swathes of our creative history.
Single density Commodore 1541/Apple II series "flippy" 5.25" disks
According to the 2003 Guinness World Records, the Commodore 64 is the best selling
computer of all time. With over 30 million units sold, the machine represented for many an
introduction to the personal computer. Although it did not originally ship with a disk drive
(plug-in ROM cartridges were the only media that could be used with the machine out of the
box), many users quickly purchased the 1541 drive. The drive, similar to the one Apple
manufactured for its Apple II computer, could read only one side of the disk at a time,
requiring users to physically turn the disk over and reinsert it into the drive to access the
other side (hence the colloquial name, "flippy" disks). The drive thus operated differently
than later IBM-compatible drives (and for that matter later Commodore drives), which,
thanks to the addition of another drive head, could read both sides of the disk at once.
Given the staggering number of Commodores and Apples sold, developing an access
strategy for the media format is at the top of our priority list.
Double density, IBM-formatted 5.25" disks
Although the Commodore and Apple II enjoyed nearly a decade of dominance in the home
markets, businesses tended instead to use IBM and IBM compatibles. The most common
5.25" drives on these machines could read both sides of a floppy disk simultaneously, and
could access a higher number of tracks on a side (hence the designation, double density).
Macintosh 800K/400K floppy drives
The 3.5" disks used by early Macintosh computers were formatted slightly differently than those
used by DOS/Windows machines and later Macs. Unlike 1.44 MB disks, which can be read
with a (currently) commercially available USB drive, the disks used by these earlier drives cannot be
read using modern equipment. Many writers and artists who used Macintosh computers in
the early to mid-1990s have saved their material on these floppies.
3" Amstrad Drives
Although never terribly popular in the United States, the U.K.-based electronics company
Amstrad produced a series of personal computers, including the Spectrum series, which were
very popular in Europe in the 1980s. These machines frequently used a proprietary 3" disk
format that is now beginning to appear in the collections of British authors.
SyQuest Removable Disks
In the mid-1990s, a company called SyQuest produced a series of "removable hard disks"
that vaguely resembled a compact disk permanently sealed in a plastic jewel case. These disks
were read by a special drive that usually connected to a computer via a SCSI port. The drives
were most popular with Mac users, and their disks can be found in several publicly held
collections of artists' papers. Unfortunately, although connecting a SCSI device to a
contemporary machine is not impossible (though still fairly difficult), there are no drivers
available for modern computers to control the SyQuest devices.
Analysis of Re-enactment
Another method of preserving a game experience is through the use of technologies that
support some level of re-enactment process. In this report, re-enactment covers a range of
possible techniques, including the reconstruction of an experience from historical record and
game-related artifacts, the reconstruction of the environment through technology, and the
documentation of game-related experience through secondary sources, such as community
content, advertisements, popular culture artifacts, and other such paraphernalia. In this
section, such techniques are examined and the technological impact of the techniques is
considered.
In working from such artifacts, two levels of implementation must be reconstructed. At one
level, the logic and flow of the game experience must be re-created. This includes navigation
related to title screens, credits, game play instructions, game settings, and game play states. It
also includes the rules of the game for level presentation, character movement, win and lose
states, as well as elements related to play difficulty. At another level, the content of the game
must be reconstructed. This includes game art, animation, video, and sound. Fortunately, in
some situations, partial information is available that may assist in the reconstruction process,
especially in cases where a playable version is available (e.g. games that exist, but due to a
loss of technological information, cannot be migrated to a new system). Such re-enactments
vary in their accuracy and fidelity in re-creating the original game.
Sometimes re-enactment can be used as an exercise with students who are learning the art of
video game development. National programs in the field of game development often
challenge students to re-create classic game experiences such as Asteroids, Space Invaders, and
text adventure games so that students can gain greater insight into the logic and fun
of the game development process. Even as recently as this year, with Google's "Pac-Man in a
Browser" experiment, we have seen the impact of experience-based re-enactment.
Another methodology is environment-based re-enactment. This is especially important for
games that involve server-based technologies or distributed processing in the presentation of
the play experience. For example, games such as World of Warcraft require a client-side
component (the game world visualization component, input manager, and user interface
system) as well as a server-side component (the realm that maintains the object catalog,
object instances, and player instances). In such systems, if the server-side technology is no
longer available, the game cannot be experienced. As we look back over the last several
years, there have been a number of online games that have ceased operation or have
changed so significantly that the original play experience has been lost (e.g. what was it like
to play during the first three months of the World of Warcraft private beta experience?). When
server-based games disappear, there are only a few options available to preserve the
experience.
Environment-based preservation involves the game development company either
distributing the server-side technology to its player base or providing the source
code to the player community. An example of this approach can be seen with Cyan Worlds'
Myst Online title, which was based upon the popularity of the Myst family of exploratory
games. In April 2008, the Myst Online servers were shut down. However, in December 2008,
Cyan Worlds announced plans to open source the project. Although this process is still
ongoing, the possibility of an open source version of the game has re-energized the player
community and has already allowed Cyan Worlds to re-introduce some of the prior
technology to the public.
Another form of environment-based preservation involves the underground development
community. There have been numerous examples of developers creating their own server-
side technology so that the game could be hosted on a local server and a private version of the
game could be experienced with a small group of friends. Such attempts include the UXO
project for Ultima Online and the Mangos project for World of Warcraft. These projects have
resulted in various levels of response from game companies, including active encouragement,
non-committal acknowledgement, and legal cease-and-desist responses.
Environment-based preservation can also be accomplished through software designed to
process and re-create the environment in a stand-alone application. Such scenarios do not
provide the ability for multi-user interactions, but instead allow for an interactive snapshot
of what the game would look like at a precise moment in time. The technique involves the
capture of game assets as well as information related to the position, orientation, and state of
the game objects.
The final methodology is artifact-based re-enactment. This approach involves the use of
secondary sources to define the game experience. Key artifacts for this approach include
video, images, audio, and first-hand testimonials from players and player communities. In
addition, information from trade magazines, conventions, reviewers, commercial outlets, and
other sources can supplement critical information. Popular cultural artifacts such as TV
commercials, music, spin-off media (Saturday morning cartoons), merchandise, and other
items can provide appropriate context for the time period and the experience.
Digital Forensics and Disk Image Analysis
Digital forensics is a rapidly emerging field that encompasses e-discovery, intrusion detection
and incident response, data recovery, and the packaging and presentation of digital evidence
to standards admissible in various legal settings. As libraries and archives acquire more and
more data in born-digital formats, tools, techniques, and workflows from digital forensics
offer solutions and technologies for processing this challenging new kind of material.
Stanford University Libraries, along with the British Library and the Bodleian in the UK, is
currently applying forensics hardware and software in exactly this role. See Kirschenbaum,
Ovenden, and Redwine's forthcoming report (2010) for a more comprehensive introduction
to the field.
The bedrock of all digital forensic techniques is the concept of a disk image, a bitstream
representation of every bit of information originally stored on an instance of physical media.
Bitstream imaging bypasses the file system, thereby allowing an archivist to capture
information stored in slack space at the "end" of a file or in data clusters flagged for
reallocation (the typical fate of supposedly "deleted" material). A technique known as data
carving allows an analyst to reconstruct the pieces of a file from the bitstream. File types can
be identified by tell-tale headers (if they remain intact), allowing for some reconstruction of
the file even if it cannot be reconstituted in its entirety. More exotic techniques may allow for
the recovery of information from data tracks that have already been overwritten at least
once, but these are not within the realm of practicality for libraries and collecting
institutions.
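The core idea of header-based carving can be illustrated with a short sketch. The example
below is a simplification of what real forensic tools do (it ignores footers, fragmentation,
and most file types); it scans a disk image for a few well-known magic numbers and reports the
offsets at which candidate files begin.

    # Simplified header-based data carving over a disk image.
    SIGNATURES = {
        b"\xff\xd8\xff": "JPEG",
        b"\x89PNG\r\n\x1a\n": "PNG",
        b"GIF89a": "GIF",
    }

    def carve_offsets(image_path):
        with open(image_path, "rb") as f:
            data = f.read()
        hits = []
        for magic, name in SIGNATURES.items():
            start = 0
            while True:
                idx = data.find(magic, start)
                if idx == -1:
                    break
                hits.append((idx, name))
                start = idx + 1
        return sorted(hits)

    for offset, kind in carve_offsets("floppy.img"):   # "floppy.img" is a placeholder
        print(f"{kind} header found at byte offset {offset}")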
Disk images have circulated freely in player communities and various Internet venues,
including retro gaming forums and abandonware hubs. Like cartridge-based ROMs, the
availability of disk images allows users to run older games via emulators. Sometimes,
however, the disk image carries additional freight. A report on a web forum details how one
player used an image of an old freeware demo disk to locate files from an early build of the first
Commander Keen game, an episodic side-scroller created by id Software. Residual
data captured as part of the image allowed the player to reconstruct a near-complete running
copy of the original game:
Most of the Keen 2 data had been overwritten by the sample games, but to my
surprise most of the Keen 1 data was still intact and after scratching together the
deleted files I managed to get almost a fully running game. Only EGAHEAD.CK1,
EGALATCH.CK1, EGASPRIT.CK1, HELPTEXT.CK1 and LEVEL01.CK1 had
been completely erased (well LEVEL01.CK1 had about 50% of it's [sic] data there
but I didn't bother combining it with the finalized one), so I took these missing files
from version 1.0 of the game.
Now I was a bit disappointed that I didn't get the graphics files together (usually
graphics are what's most interesting in development versions of games), but it seems
the other game files were actually quite different at this point as well. I don't know at
exactly which stage the game is in this build but it at least looks pretty close to final
with a bunch of minor level design differences in quite many levels.
The story section is no doubt the most interesting difference, you actually get a scene
where a Vorticon comes to your ship and takes the vital parts and then leaves
(explaining why these tiles without and with the parts exist in the final game).
Other points of interest is the lack of the GOD cheat and the fact that you can't
complete the game. (http://www.pckf.com/viewtopic.php?t=1211)
Large-scale, systematic analysis of disk images circulating within the player community may
therefore provide an important tool for recapturing and reconstituting supposedly "lost"
works or versions of works.
Nonetheless, it is important to recognize that even a disk image, while somewhat akin to a
"facsimile" in the realm of photographic reproduction, is an abstraction, a projection (or,
more precisely, an interpretation) of physical phenomena on the surface of the disk. Creating
a disk image is not a homogeneous process, and not all imaging tools will acquire the same
data. Some disk image formats, for example, will not indicate the presence of damaged data
sectors. It is therefore easy to fetishize a disk image as a more authentic version of the data
than just extracting the files and dumping them on a modern file system. A more useful
understanding, however, would be that forensics gives you better documentation about how
you reproduced a copy of the original data, not (necessarily) a superior copy.
In one sense, digital games and virtual worlds present no distinct challenges or opportunities
for applications of forensics, since the same techniques for data acquisition and analysis that
are used with other forms of digital content can be applied just as seamlessly. Whether or
not forensic methods are useful in a particular institutional setting will be determined by a
cost-benefit analysis of time, resources, specialty training, and the kind of access and services
the institution wishes to provide to its patrons. Techniques for authenticating digital content may,
however, have a larger role to play in appraising material collected from the player
community. Demos, speed runs, and other forms of player-generated content are
occasionally subject to question or controversy regarding authenticity and provenance. In
these cases, forensic tools and techniques can prove uniquely useful.
Recommendations on Preservation Strategies
The team offers several recommendations that should be addressed in the preservation
process. These recommendations are derived not only from the emulation, virtualization,
and migration work performed under the grant, but also from the
metadata encoding process undertaken by the research team. The recommendations are as
follows:
• It is important for museums and other collectors to start the exploration process into
emulation, virtualization, and migration techniques as one facet of the preservation
process.
• It is important to encode information about game play under the original
platform, under emulation, and under virtualization within preservation metadata.
• There must be a mechanism to encode a testing rubric for virtualization and
emulation into a metadata set, such that there is a template for the range of
permissible options that is easily searchable and can be indexed. In addition, the
testing rubric must contain mechanisms for freeform comments related to the
emulation and virtualization process.
• Information related to virtualization and emulation must be extensible, and there
must be easy mechanisms by which emulation and virtualization information can be
updated across collections of metadata. This information should be updatable as
emulation and virtualization community efforts grow and wane.
• In cases where re-enactment is necessary, there must be means of collecting and
indexing metadata and content based upon re-enactment data. Since this information
can be more expansive than the original game, there must be mechanisms to
minimize repetition and support important artifact derivation.
In looking towards the future, agencies such as the Library of Congress will want to fund
exploration in a number of areas if the process of emulation, virtualization, and migration is
to be an important part of the preservation strategy. Initiatives should be based upon certain
guidelines:
• It is vital to encourage the dissemination of information to help museums and
collectors understand the importance of emulation, virtualization, and migration.
• Agencies must help support the development of emulation platforms built for the express
purpose of preservation. The goal would be to unite various grassroots development
teams into a larger community dedicated to the preservation mission.
• Agencies must help larger virtualization companies understand the importance of
peripheral virtualization in the preservation strategy. The effort should include the
encouragement of developing adequate video and sound driver solutions, as well as
creating avenues for developers to extend the virtualized device specification in an
easy-to-understand format.
• Agencies must help metadata format creators understand the need for a consistent
way of encoding emulation, virtualization, and migration data into a metadata set. Standard
metadata encoding schemes need to be re-evaluated and extended to support such
functionality.
• Agencies must encourage game engine developers and producers to find
technological and interaction solutions to re-enacting and re-scripting typical
scenarios within games and engines.
7. When Strategies Fail: The Case of Second Life
Second Life is a virtual world where users create nearly all of the content within the world, and
also create most of the social, political and economic activity. While the social, political and
economic activities may be the most interesting aspects of Second Life, any attempt to
preserve those must in part rely on preserving the technological environment providing the
context in which they occur. For our project, then, we wished to investigate the feasibility of
preserving that environment, and, if possible, doing so in such a way as to avoid relying on
the proprietary infrastructure of Linden Lab, if for no other reason than the fact that Linden
simply does not possess the resources at this point to serve as a long-term preservation
repository for archived states of their own virtual territories (known as "islands"). As we also
lacked the resources to try to tackle all of the various islands in Second Life, we chose to
focus on a subset of five for our archiving efforts:
• Life Squared – A reincarnation of the archive of artist Lynn Hershman Leeson
housed in Stanford University Libraries' Special Collections;
• Stanford University Libraries – a virtual library space established by Stanford
University Libraries to support online collaboration, education and exhibits;
• Democracy Island – A project of New York Law School devoted to overcoming
"some of the difficulties associated with civic participation … in real space";
• International Spaceflight Museum – a virtual museum of spacecraft and space travel
managed by a non-profit corporation; and
• Rumsey Historical Maps in Second Life – a virtual exhibit of selections from the David
Rumsey Map collection.
Problems in Archiving Second Life
Second Life is by far the most technologically complex game (if that is even an appropriate
word for it) with which our project dealt, a networked, virtual environment based upon a
client/server architecture involving multiple task-specific server components:
• Login Server – manages user authentication and login processes;
• User Server – manages instant messaging sessions;
• Space Server – manages the routing of messages between residents based upon their
location in the virtual space;
• Data Server – manages connections to the various databases containing Second Life's
data and log information; and
• Simulators – each simulator manages the state of a single region in Second Life,
including the state of both objects and terrain, and manages the simulation of
physics for the region.
Linden Lab has made the client software for Second Life available under the terms of the
GNU General Public License (version 2.0), with the associated artwork licensed under the
Creative Commons Attribution/Share-alike 3.0 license. All software for the server
components is closed source.
Second Life is organized into discrete 256 x 256 meter regions known as islands, or sims
(short for "simulators"). A region can be owned either by Linden Lab or by one of Second
Life's players, known as residents. The contents of a given region consist primarily of a
combination of 3D objects, graphical texture files that can be overlaid on the 3D objects,
audio files providing background noises, and scripts that enable interactivity with the various
objects within a region. All of this content is hosted on Linden Lab's servers, with the
different servers managing the interactions between residents' client software applications
and the various regions.
Objects in Second Life are created by linking various shape primitives (or prims). The
primitives comprise eight basic shapes (box, prism, cylinder, sphere, torus, ring, tube and
sculpted). Each of these can be further modified from their default shapes. A primitive is
more complex than its name might imply, as each prim has an associated inventory that can
contain scripts, notecards, textures and other items. Textures (image files applied to the faces
of the primitives) give objects the illusion of possessing even more detail. Each object has a
set of metadata elements associated with it, including the identity of the individual who
originally created the object and holds its intellectual property rights (the Creator), and the
identity of the individual who has possession of the object (the Owner).
Object scripts, written in the Linden Scripting Language (LSL), are what transform Second
Life from a beautiful, but static environment into a highly dynamic and interactive one,
allowing any object to exhibit a wide variety of complex behaviors and accomplish a variety
of functions within an island. Any island can have thousands of script-bearing objects, each
object with its own creator and owner. Additionally, textures, scripts and other items in an
object inventory may be the creations of someone other than the object creator (and may be
impossible to identify).
Preserving such an environment presents several obstacles. There are technological
impediments to obtaining access to certain forms of content in Second Life. Due to the
permissions system employed by Second Life, an object's inventory (including scripts) is, by
default, not available to anyone other than the object's creator (and so cannot be accessed or
copied by any other resident of Second Life). As a third party (not Linden Lab and not the
owner of an object), we cannot realistically obtain access to the underlying database of object
data on Linden Lab's servers directly to make a complete copy, nor can we access the full
contents of an island as a user on the system within that particular region.
Intellectual property and contract law present additional obstacles to preservation activity.
The Terms of Service agreement covering residents' access and use of the Second Life service
specifically states "You retain any and all intellectual property rights in content you submit to
the service." The Terms of Service also forbid any infringement of intellectual property
through unauthorized copying of content available through the service, stating that "You
must obtain from the applicable Content Providers any necessary license rights in Content
that you desire to use or access" and "You agree that you will not copy, transfer, or
distribute outside the Service any Content that contains any Linden In-World Content, in
whole or in part or in modified or unmodified form, except as allowed by the Snapshot and
Machinima Policy, or that infringes or violates any intellectual property rights of Linden Lab,
other Content Providers, or any third parties." The combination of Second Life's contractual
framework with copyright law makes it illegal to create a preservation copy of the contents
of an island without the explicit permission of all of the content creators who have objects
present within that island.
One final difficulty with archiving Second Life is determining the full scope of what must be
archived. If we were to manage to archive all of the objects from a given region, including all
scripts, animations and other inventory content that are typically protected, we would, in
effect, have managed to archive a ghost town, an empty set of architecture and geography
with no information about how the space was used or what its inhabitants were like. A static
snapshot of a world such as Second Life may in some sense serve as a surrogate for the
original, but it is a poor substitute. Part of the fundamental nature of Second Life is that it is a
living, evolving, dynamic space. An archived copy of a region provides some documentation
of what the world was like, but it is hardly a complete set of documentation, and in a very
real sense it is not and cannot be a complete version of the original.
Basics of Archiving a World
Despite the issues mentioned above, our project decided that we would try to proceed with
an approach that involved making an archival copy of our regions of interest. We decided
that our approach should fulfill several criteria. First, it must respect the intellectual property
of Second Life residents, as well as Linden Lab. It should, if at all possible, produce an archival
copy that is not tied to the Second Life architecture, but which would allow a copy of a
region's contents to be instantiated in multiple virtual environment platforms.
While Linden Lab was a partner in our project, we also decided to approach the issue of
archiving a region as a party without special permissions or access, in effect putting ourselves
in the position that any library, archive or museum might occupy if they decided to make an
archival copy of some region of Second Life.
Creating an archival copy of a region in Second Life involved a multistage process. First, we
created manifests of the various regions' contents and gathered metadata about the objects
within those regions. Using the objects' Creator metadata, we next contacted each rights
holder for objects in the regions asking for their permission to archive the content they
owned. We then downloaded all available information for the objects we had received
permission to archive, along with additional information about the island itself, and created a
submission information package suitable for loading into repositories at the University of
Illinois and Stanford University Libraries containing the complete set of object and region
information. Details on our process are below.
Creating a Manifest of an Island
One of the first issues we needed to deal with was the identification of the objects to be archived.
There are potentially hundreds if not thousands of them on each island, and they may be
invisible, at a high altitude within the virtual space, or otherwise difficult to identify.
Fortunately, the Linden Scripting Language (LSL) provides a fairly simple mechanism for
detecting objects. However, LSL's limited script memory imposed constraints that
required a great deal more thought about algorithm design than was initially imagined.
The primary functions for sensing objects in LSL are llSensor() and llSensorRepeat(), along with the
associated events "sensor" and "no_sensor". The llSensor() function can be imagined as a
form of virtual radar, sending out a ping, with the sensor event acting as the receptor
(no_sensor is triggered when nothing matching the criteria is found). The llSensor() function
has a maximum range of 96 virtual meters, so a single ping senses a sphere with a
radius of 96 meters. The spherical nature of llSensor()'s search capabilities presented us with
a packing issue, however. If we used llSensor() to detect objects every 192 (2 x 96) meters,
there would be gaps between the spheres (imagine packing ping pong balls into a box, and you
can visualize the air space that would represent regions we could not detect). Cubes will pack
perfectly, so we just compute the maximum size for a box bounded by the sphere and stack
the boxes. The diagram below presents a 2D model of the 3D case.
As you can see there are some overlaps
in the scan areas when the scans are
placed so as to focus our scans on
adjacent cubes, rather than spheres, but
checking for objects detected twice is
straightforward. A more significant
problem is that the sensors only
detect the sixteen objects closest to the
center of the search sphere. So if there
are seventeen or more objects within the
space, the probe will not detect any
object further away than the sixteenth
object detected. In a case where we
detect sixteen objects, we will not know
whether this is because there actually are
sixteen objects, or whether there are
more within the space that simply are not
being reported by the sensor. We can
compensate for this by adjusting the
sensor?s detection range down until we
get fewer than sixteen objects per scan, but
we cannot know what the reduced range
necessary to ensure detection of all
objects might be.
We chose to adopt a method that involves a progressive subdivision of the search space in
any case in which a scan detects sixteen objects. If a scan reports sixteen objects within the
sensor's range, the sides of the search box are decreased in size by half (so a 136 meter box
would become 68) and we essentially stack 8 boxes into the original box with more than 16
objects. We then scan each of those sub boxes. If any of them have 16 or more objects, we
repeat the process.
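Our probe was implemented in LSL, but the subdivision logic can be modelled compactly in
Python. In the sketch below, count_objects stands in for an llSensor()-style call that reports
at most sixteen objects within a given box; the function and parameter names are illustrative
assumptions rather than the project's actual code.

    # Python model of the progressive subdivision used by the LSL probe.
    MAX_HITS = 16        # llSensor() reports no more than 16 objects per scan
    MIN_SIDE = 0.01      # smallest shape dimension permitted in Second Life

    def scan_region(corner, side, count_objects, cells):
        """Scan the cube at `corner` with edge length `side`; split when saturated."""
        hits = count_objects(corner, side)
        if hits >= MAX_HITS and side / 2 >= MIN_SIDE:
            half = side / 2
            x, y, z = corner
            for dx in (0, half):
                for dy in (0, half):
                    for dz in (0, half):
                        scan_region((x + dx, y + dy, z + dz), half, count_objects, cells)
        else:
            # Either the scan was not saturated, or we have reached the minimum
            # size (e.g. many objects stacked at one point), so record the cell.
            cells.append((corner, side, hits))

    # Usage: tile the 256 m sim with starting boxes (136 m in our case) and call
    # scan_region() on each, passing a sensor wrapper in place of count_objects.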
[Figure: 2D diagram of the scanning scheme, showing spherical sensor ranges scanned every 136 meters and the overlapping scan areas that result from focusing scans on adjacent cubes rather than spheres.]
There is a potential flaw with this method. It is
possible within Second Life to put more than one
object in the same location. This makes it
possible to have 16 or more objects that have
the same exact coordinates. In these situations,
if our scanning boxes reach a minimum sensor
range of 0.01 meter (the minimum shape
dimension permitted), we exit the function and
our program returns an error alerting the user
to the problem. If desired in those cases, a
manual inventory of the objects can be
performed.
As each object is detected during our scanning
process, information about the object's owner
and creator is obtained, and our program stores
that information in an external MySQL
database. Once all of the objects' intellectual property rights holders have been identified, we
are ready to embark on the next phase of archiving, obtaining permissions to archive the
objects from the appropriate rights holders.
Obtaining Permission
After having created a list of the objects within an island, we can then proceed to contact the
creators of those objects to obtain permission to archive their materials. Given the
potentially large number of content creators in any given region, we used a semi-automated
system to obtain permission to archive objects. The avatar names and UUIDs (unique
identifiers given to every asset in Second Life) of all content creators from a region are retrieved
from our database and sent to a Second Life object that acts as a portal for interaction with
Second Life residents' avatars. The portal receives the avatar information and sends a notecard
within Second Life to the creators, describing our project, what we want to do with the objects
they created, and including an in-world teleport link they can use to travel to our portal,
where they can obtain access to a web interface we have created for granting or denying
permission to archive their materials.
At the portal, creators use the object's interface to access the web interface for permissions
management. The creators are first asked to set up an account, and then they are presented
with a list of all their objects we wish to archive. The creator may choose to allow us to
archive all of the objects, some of the objects or none. Creators are also allowed to specify
an embargo period during which their content will not be made public. The language for
granting permission to archive is based upon a standard archival deed of gift used by
Stanford University Libraries that was modified for use on this project (see Appendix G for
the language used). Once we have a listing of the objects for which we have been granted
permission to archive, we can embark on actually making an archival copy of those objects.
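The semi-automated side of this workflow amounts to little more than pulling the distinct
creators for a region out of the object database and handing that list to the in-world portal.
The sketch below shows the idea; the project used MySQL, but for a self-contained illustration
it is written against SQLite, and the table and column names are assumptions.

    # Hypothetical manifest query used to drive permission requests.
    import sqlite3

    conn = sqlite3.connect("island_manifest.db")     # placeholder manifest database
    cur = conn.cursor()
    cur.execute(
        "SELECT DISTINCT creator_name, creator_uuid "
        "FROM objects WHERE region = ?",
        ("Life Squared",),
    )
    for name, uuid in cur.fetchall():
        # In the actual workflow this list is handed to the in-world portal
        # object, which sends each creator a notecard and teleport link.
        print(f"request permission from {name} ({uuid})")
    conn.close()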
[Figure: a sensor cube subdivided into smaller cubes to ensure detection when more than 16 objects are present.]
Copying a World
Overview of Building in Second Life
The fundamental building block in Second Life is the shape primitive or prim. More complex
objects are built by combining prims of different shapes together, similar to a set of building
blocks or Legos. There are eight prim types used:
• box
• cylinder
• prism
• sphere
• torus
• tube
• ring
• sculpted
Most of these are self-explanatory except for sculpted prims, a newer addition that allows
builders to create more organic shapes using off-world 3D design software and then
import them into Second Life as 2D UV maps (32 x 32 pixel .tga or .jpg files) that are polar
coordinate maps of the sculpted objects. The low resolution of the UV maps limits the
details that can be included and display costs associated with sculpted prims continue to
make it advantageous to use the other prim types for many applications.
There is a daunting array of parameters that can be used to alter the shape of these prims.
Such modifications can often extend to such a degree that they render the base shape
unrecognizable. Parameters cover such aspects of an object as its material composition,
position, rotation within the coordinate space of the world, size, texture files overlaid on the
prim, color, reflectivity and other physical properties. There are also parameters having to
do with ownership, creation, IP issues and the contents of a prim's inventory. Prims can contain
other items such as objects, scripts, animations and so on.
Each face of a prim can be textured and colored independently. Second Life currently
supports .tga, .jpg and .gif files for texturing with a maximum resolution of 1024 x 1024
pixels. Textures can also be scaled, repeated, and positioned on a prim face. The UUID for each
texture can be obtained, and the texture images themselves can be captured using the GLIntercept software.
It is very common to find scripts inside Second Life prims. Without scripts, Second Life objects
would just be pretty (or occasionally ugly) pictures. Prims can contain multiple scripts that
function more or less independently of one another. A person can write scripts that allow
objects to communicate with one another. The Linden Scripting Language actually allows
objects to do some surprisingly complex things, and given the ability to use HTTP and
XML-RPC to contact off-world web servers, objects can exhibit extremely sophisticated
behaviors.
To make more complex shapes or items larger than 10 meters in any dimension, prims are
linked together into objects. Linked prims can be moved as a unit and can communicate via
scripts a bit more easily than unlinked prims. No more than 255 prims can be linked
together in a single object. Many large buildings are therefore composed of many individual
objects, each containing a number of prims. One prim in any object serves as the root prim.
All other linked prims can be referred to by their link number. An unlinked prim has a link
number of 0, the root prim of a linked set has a link number of 1, and the rest are numbered
consecutively based on the order they were added or selected. The root prim is important
because position, rotation and velocity information for an object are all relative to the root
prim. Child prim information is also given relative to the root prim. Additionally, the root
prim is generally where the main scripts are placed, although child prims often contain
scripts that communicate with scripts in the root prim.
The largest independent building block in Second Life is the Island, Sim or Region. A sim
measures 256 meters on the x and y axes. A region is defined by its terrain, stored in a .raw
file that gives the height of the ground, ground textures and the water level. The ground
textures display somewhat sequentially from lowest to highest terrain height and can be
controlled semi-independently by specifying different heights for each corner of the sim.
Additionally, there are sim parameters that determine what functions are permissible within
the region, and privately held sims often have a covenant that functions like a legal
document specifying how the sim is meant to be used. Sims can be divided into parcels that
can also have names, descriptions and parameters that determine what is allowable (if the
sim parameters allow that for smaller parcels). A standard sim supports a maximum of
15,000 prims (not objects; the number of objects will depend on how prims are linked).
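The essentials of this building model can be summarized in a small data structure. The sketch
below is our own rough Python rendering, not the project's schema: each prim carries its shape,
transform, per-face textures, and link number, and objects group up to 255 linked prims under a
creator and owner.

    # Rough model of the prim/object information discussed above (illustrative only).
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PrimRecord:
        prim_type: str                              # box, cylinder, prism, sphere, torus, tube, ring, sculpted
        link_number: int                            # 0 = unlinked, 1 = root prim, 2+ = child prims
        position: Tuple[float, float, float]        # child prim positions are relative to the root prim
        rotation: Tuple[float, float, float, float]
        size: Tuple[float, float, float]
        face_textures: List[str] = field(default_factory=list)   # texture UUIDs, one per face
        inventory: List[str] = field(default_factory=list)        # scripts, notecards, etc. (often unobtainable)

    @dataclass
    class ObjectRecord:
        creator_uuid: str
        owner_uuid: str
        prims: List[PrimRecord] = field(default_factory=list)     # at most 255 linked prims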
The software we chose to copy the objects in a region is known as CopyBot. CopyBot is a
text-based client for Second Life developed by a group of griefers (avatars who disrupt others'
activities in Second Life) known in Second Life as the Patriotic Nigras (whose motto is "Ruining
Your Second Life Since 2006"). The group has a poor reputation in Second Life as its primary
function was to make content theft and griefing easier. Despite their nefarious reputation,
they developed what turns out to be a very useful tool for archiving Second Life content, as it
allows us to export and import Second Life object data to and from an XML file.
The export function downloads all object data into an XML file (see Appendix I for an
example). It also downloads all of the textures used on a particular object. There are several
important caveats. First, it breaks all links to intellectual property ownership. The names of
the content creator and owner do not even appear in the file. Second, it does not gather the
contents of an object's inventory, such as scripts, sounds, etc. These have to be collected and
inserted back into the objects manually, a process that is only feasible when you have the
appropriate permissions to examine the objects' inventories.
Another restriction of CopyBot is that it is designed to download a single object at a time. In
order to download all of the objects on an island, we created a program that interacts with
our database of objects created during the initial scan of an island. This program obtains the
UUIDs for the objects we have permission to archive, and then sends a request via HTTP to
a special attachment worn by the CopyBot avatar, directing the CopyBot to move to the
object of interest and download it. The management program, combined with CopyBot,
allows us to automate the downloading of all the objects we have permission to archive.
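The management program is conceptually simple: it reads the UUIDs of the objects we may archive
and issues one HTTP request per object to the attachment worn by the CopyBot avatar. The sketch
below conveys the shape of that loop; the capability URL, the query parameters, and the database
schema are all assumptions (LSL scripts obtain inbound HTTP URLs at run time via llRequestURL()).

    # Hypothetical driver loop for the CopyBot attachment.
    import sqlite3
    import urllib.parse
    import urllib.request

    ATTACHMENT_URL = "http://example.invalid/copybot-cap"    # placeholder for the attachment's HTTP-in URL

    conn = sqlite3.connect("island_manifest.db")             # placeholder manifest database
    cur = conn.cursor()
    cur.execute("SELECT object_uuid FROM objects WHERE permission_granted = 1")

    for (object_uuid,) in cur.fetchall():
        query = urllib.parse.urlencode({"command": "export", "uuid": object_uuid})
        with urllib.request.urlopen(f"{ATTACHMENT_URL}?{query}") as resp:
            print(object_uuid, resp.status)                  # log each export request
    conn.close()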
In addition to the data for the individual objects, we need to store the information regarding
the region and also provide facilities for relating the objects to each other (and any additional
information we may wish to store about the island). For this purpose we created an XML
schema that stores all of the data required to reinstantiate the objects back into Second Life.
Having the information stored in an XML document would also allow us to export the
information into other virtual world platforms, although this would require some
transformation of the data. The XML Schema is broken into six subdocuments and follows
this basic structure:
SLBuild.xsd – the root file for the schema, which stores basic information about the
island (e.g., the sim owner, a description of the sim) as well as information on all
objects and their composition prims;
SLShapes.xsd – this file defines the parameters that influence the shape of each of
the different prim types;
SLPrimFeatures.xsd – this file defines non-shape parameters that may be recorded
for each prim type including color, flexibility, drag, the influence of gravity on the
object, etc.;
SLPrimTextures.xsd – this file defines the parameters recording the texture
information associated with each prim;
SLPrimContents.xsd – this file defines the information set which may be used to
record information about the contents of a prim's inventory; and
SLBasicTypes.xsd – this file defines basic element types used in other types in the
schema, such as vectors and rotations.
While our process often does not allow us to obtain complete information about the objects
in an island (the inventories of primitives in particular are difficult to obtain), the schema
does allow us to store a complete record of the contents of an island.
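A record conforming to this outline can be assembled with any XML tooling. The fragment below
sketches the idea using Python's xml.etree.ElementTree; only the subdocument names above come
from our schema, while the element and attribute names shown here are illustrative assumptions
rather than the actual SLBuild vocabulary.

    # Illustrative assembly of an SLBuild-style record (element names are assumptions).
    import xml.etree.ElementTree as ET

    build = ET.Element("SLBuild", {"region": "Example Island"})
    ET.SubElement(build, "Description").text = "Archive of an example island"

    obj = ET.SubElement(build, "Object", {"creator": "creator-uuid", "owner": "owner-uuid"})
    prim = ET.SubElement(obj, "Prim", {"type": "box", "link": "1"})
    ET.SubElement(prim, "Position").text = "128.0 128.0 25.0"
    ET.SubElement(prim, "Size").text = "0.5 0.5 0.5"
    ET.SubElement(prim, "Texture").text = "texture-uuid"

    ET.ElementTree(build).write("slbuild-example.xml", encoding="utf-8", xml_declaration=True)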
Failures in Archiving
While in theory the mechanisms defined above allow us to archive a fairly complete
representation of an island in Second Life, separately from the Second Life system, and in a
form that would potentially allow us to reinstantiate the island in another virtual
environment platform, in practice our efforts can only be described as partially successful at
best. We encountered a number of difficulties in attempting to archive our test cases in
Second Life that severely impeded our ability to create a complete record of those worlds.
The greatest impediment was the need to obtain permission from the intellectual property
owners before we archived their objects. At its best, our response rate to requests for
permission was 10% of the individuals we contacted on one island, and at worst, we received
no responses at all. It is reasonable to ask whether an archive that contains only 10% of the
objects within an island is worth creating at all; it is certainly not an accurate reflection of the
island as it existed at that time. Moreover, some of our responses were actually hostile. Many
Second Life builders earn a living from creating objects in Second Life, and they were not
pleased with the notion that anyone might actually copy their creations and place them in an
archive where others would be able to obtain access to them.
In the case of one island, we experienced a complete failure in our attempts to archive, not
because of intellectual property issues but because of a technical one. Democracy Island, a
project of New York Law School, allows any and all Second Life residents to build
within their island. As a result, the island has been packed with objects to the point where it
has actually hit the limit on the maximum number of prims allowed within a given region.
Our approach to archiving does require us to instantiate a probe object within an island to
perform our initial scan for objects. Because we could not instantiate our probe, we were
unable to obtain the necessary information to continue our attempts to archive that island.
Another potential problem exists in that regions and parcels within Second Life can be
declared "no script" zones. Any scripts bound to an object in Second Life immediately cease
when the object enters such an area. Our initial probe to identify objects within an island is a script-
based object. If it accidentally enters such a region, it immediately fails.
A final major problem is that, as a third party, we have no access to the inventories of
objects other than those we have created. While the information we can automatically extract
from an island allows us to create a visually complete recreation of the objects in an island,
any animations or scripts associated with the objects are lost. All of the behaviors of objects
that make Second Life an interesting and dynamic environment are gone. While it is possible
for the intellectual property owners of the objects to provide us with copies of the contents
of the objects? inventories, it was difficult enough to simply obtain permission to archive.
The odds of the IP owners being willing to manually copy all of their objects' inventories for
us seem remote.
In short, our experiments in trying to archive islands in Second Life at best resulted in
extremely partial and static representations of the original. While the techniques we've
developed may be useful in archiving some virtual world systems, at least in the case of
commercial environments such as Second Life, there are severe limits to the preservation
activity in which a third party can engage. Given the intellectual property and contractual
restrictions governing Second Life, any hope for a complete archive of a Second Life world
would rest on Linden Lab's willingness to archive the content itself.
8. Packaging Virtual Worlds
Virtual Worlds and the OAIS Reference Model
The Reference Model for an Open Archival Information System (OAIS) (Consultative
Committee for Space Data Systems, 2002) has been accepted as one of the foundations for
digital preservation efforts throughout the digital library community. One of the key aspects
of the OAIS reference model is the concept of an archival information package, the
complete set of content information and preservation description information necessary to
maintain a data object as interpretable by the designated community a repository serves over
the long-term. The OAIS reference model includes a data model for an archival information
package (see the figure below), and later work by the Consultative Committee for Space
Data Systems led to an XML packaging format for the creation of archival information
packages, XML Formatted Data Units (XFDU) (Consultative Committee for Space Data
Systems, 2008).
OAIS Archival Information Package Data Model
While the data model set forth in the OAIS reference model provides a structure that can in
theory be applied to information housed in any archive, the OAIS reference model was
designed to support the operations of aerospace agencies' archives, including data archives.
While the Consultative Committee for Space Data Systems (2002) did mention software as a
type of content information that might be saved in an archive, the reference model's
examples, particularly in the case of representation information, tend to focus on simpler
data types. It seems safe to assume that the use cases the OAIS reference model's creators
had in mind when designing the model did not include preserving World of Warcraft or
DOOM 3. Our investigations have found that while the OAIS reference model as it stands
today is capable of handling game software as an object of preservation, video games and
interactive fiction are extremely complex digital objects, and packaging them in a manner
consistent with the OAIS reference model can be a challenge, particularly with regards to the
matter of providing adequate representation networks and adequate context information.
The OAIS reference model specifies that an archival information package for digital content
should contain the representation information necessary to convert the bit sequences in the
digital information into more meaningful information by describing the data format used to
encode the digital information. It recognizes that a given piece of representation information may
reference other pieces of representation information (as when Unicode references ISO/IEC
Technical Report 19769 in its discussion of Unicode data types for the C programming
language), and that representation information in digital form, as an object of preservation in
its own right, may require its own representation information (e.g., if we stored the Unicode
standard in PDF file format as representation information for an object, we would need a
copy of Adobe's PDF Reference specifying the PDF data format). The full set of
representation information needed to ensure a digital object's ongoing interpretability can
thus form a large and complex representation network.
As an example, consider the game Mystery House from our case set. The original source code
for Mystery House has apparently been permanently lost. Existing versions of the original
game are encoded as files embedded within an Apple II disk image file. Our representation
information for the game thus needs to start with a copy of Apple II: The DOS Manual, the
documentation for the Apple II disk operating system. Fortunately, we have a copy of that
as a PDF 1.6 file. Obviously, we will then need a copy of the Adobe PDF Reference for
PDF version 1.6. The PDF Reference references 33 other documents published by Adobe,
the vast majority of them technical documents needed to understand aspects of the PDF
specification. It also references 60 documents from other agencies, including a large number
of standards documents from agencies including the Institute of Electrical and Electronics
Engineers, the International Color Consortium, the International Electrotechnical
Commission, the International Organization for Standardization, the International
Telecommunication Union, the Internet Engineering Task Force and the World Wide Web
Consortium that define technologies relevant to implementation of the PDF specification.
Many of these standards documents in turn reference other standards documents. We are
well on our way to having a representation network that resembles a small technical library,
and we only have the information needed for preservation of the disk image file format; we
have not yet considered the files contained within the disk image.
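To make the structure of such a network concrete, the fragment below sketches how the first
few links in the Mystery House chain might be recorded as machine-readable assertions. It is
only an illustration: the RDF/XML encoding, the resource identifiers, and the
"hasRepresentationInformation" property are invented for this example rather than drawn from
the project's packages or from any particular schema.

    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:ex="http://example.org/terms#">
      <!-- The disk image needs documentation of the Apple II disk format -->
      <rdf:Description rdf:about="http://example.org/objects/mystery_house.dsk">
        <ex:hasRepresentationInformation
            rdf:resource="http://example.org/docs/apple2_dos_manual.pdf"/>
      </rdf:Description>
      <!-- That documentation, stored as PDF 1.6, in turn needs the PDF Reference -->
      <rdf:Description rdf:about="http://example.org/docs/apple2_dos_manual.pdf">
        <ex:hasRepresentationInformation
            rdf:resource="http://example.org/docs/adobe_pdf_reference_1.6.pdf"/>
      </rdf:Description>
      <!-- ...and the PDF Reference in turn points to dozens of further standards -->
    </rdf:RDF>

Each resource in such a chain is itself an object of preservation that may require its own
entries, which is precisely how the network grows to the size described above.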
Of course, the size of this representation network could be truncated quite effectively if we
keep our copy of Apple II: The DOS Manual in print form rather than in PDF format. This
requires, however, that the wrapper for our archival information package for this game be
able to successfully reference both digital and non-digital material. And that still leaves the
issue of the representation information necessary to document the Apple II's binary
executable file format, including the documentation for the MOS 6502 instruction set used
in the Apple II line.
In addition to complicated representation networks, the games in our case set have shown
that games can be extremely reliant on what the OAIS reference model calls "context
information" to aid in their interpretation. The reference model defines context information
as "information that documents the relationships of the Content Information to its
environment. This includes why the Content Information was created and how it relates to
other Content Information Objects." It also provides examples of context information for a
software package: a help file for the software, user guides, related software, and
documentation on the programming language used. Taken together, these examples indicate
that the purpose of context information for software is to aid the individual examining the
software to understand its purpose, its operation, and the technical environment necessary to
enable the software?s use.
All of this information is clearly necessary for the game software we examined in our case
set. While we think of games such as Star Raiders for the Atari 2600 as relatively primitive in
comparison to today?s games, players had to understand how to activate and deactivate their
ship's shields, how to use the attack computer in targeting enemy ships, how to use the
game's galactic map to identify their own location, their starbase location, and the location of
enemy ships, how to assess their ship's energy level (and how to recharge it), and how to
determine the amount of damage their ship had taken, in addition to basics such as
maneuvering and shooting. The game also tracks skill levels of the game players over time as
they advance in missions. Without the game manual, it is extremely difficult to interpret the
screen displays of the game and understand exactly how game play is supposed to proceed.
And a game like Star Raiders, designed for an antiquated console platform, nicely
demonstrates the necessity of maintaining complete technical documentation on the
environment in which it was designed to run if we are to offer continuing access to the
game. Without the documentation necessary to produce an emulation of the Atari 2600
Console, the Star Raiders game as distributed is useless.
Games, however, may also require context information beyond the forms documented
within the OAIS reference model document. Games are not just technical phenomena; they
are also social phenomena, and the explanation of "why the content information was
created" can be extraordinarily complex in the case of something like the International
Spaceflight Museum in Second Life. There are entire books devoted to the origins of the Atari
platform and games from our case set (for examples, see Montfort [2009], Montfort [2003],
Barton [2008], Boellstorff [2008]). If context information is needed to understand why
content information is created, then in the case of video games and interactive fiction
context information is going to be far more extensive than in the case of a data file from a
space mission, and will also involve more extensive links between the digital content
information to be preserved and the standard bibliographic universe of published books and
journal literature with which librarians have traditionally been concerned.
Spacewar! running on a PDP-1. Image courtesy of Kenneth Lu.
Another obvious, but critical, aspect of games relating to context information is that they are
highly interactive. And because they are interactive, the "why" of their creation is not
necessarily something established by the game designers, but is constructed on an on-going
basis by the games? users. This is particularly obvious in the case of massively multiplayer
online systems such as World of Warcraft or Second Life. The nature of these virtual worlds is
constituted in significant part through their use. Without documentation of how players
actually engage with these resources, individuals in the future studying these materials will be
left with an incomplete answer as to the "why" of their creation. But this is equally true of
single-user games. The oldest game in our case set, Spacewar!, was developed in part as a
mechanism to demonstrate the capabilities of the PDP-1 computer on which it was
originally implemented, including the capabilities of the vector graphics display used by the
computer (see figure above). Vector graphics display terminals, however, are almost unheard
of today outside of a few niche applications. Video or photographic documentation of the
game in use on its original platform is also needed if people studying the game Spacewar! are
to gain a complete appreciation of the "why" of its creation.
Analysis of Packaging Requirements
As the above discussion (as well as the previous discussions on FRBR) should make clear,
we have found that, to the extent preservation of computer games and interactive fiction is a
metadata problem, it is primarily a matter of structural metadata. More precisely, the
problem of preserving computer games and interactive fiction is one of
structural metadata and collection management: ensuring that you have the complete set of
representation information and context information necessary to render your content
information both accessible and apprehensible, and that all of the necessary relationships
between content, representation, and context information are appropriately recorded.
Possessing a technical metadata record stating that the file named "star_raiders.bin" contains
a binary executable intended to run on a MOS 6502 processor is of limited value for
preservation when compared to having copies of the MCS 6500 Family Programming
Manual and the MOS 6500 Microprocessor data sheets necessary to interpret the contents of
the file. In fact, if we possess structural metadata linking the "star_raiders.bin" file to those
documents, and asserting that they provide representation information for it, recording a file
format in a technical metadata record is actually somewhat redundant.
Both the relationships described in the Functional Requirements for Bibliographic Records Final
Report (1997) and the Reference Model for an Open Archival Information System (2002) are critical to
the successful preservation of video games and interactive fiction. Any packaging mechanism
for preservation of these materials must therefore support expression of these relationships
between different entities. Moreover, as the relationships that exist in FRBR are specific to
particular classes of entities defined in the FRBR report, a packaging mechanism for digital
games must support, either implicitly or explicitly, the identification of entities
described within the package as belonging to a particular FRBR entity class.
Beyond these requirements are ones common to any digital preservation effort. In addition
to representation information and context information, a packaging format needs to be able
to provide a wrapper for fixity, reference, and provenance information. The digital file
format employed for any packaging file for metadata and content should itself demonstrate
the sustainability factors identified in the Library of Congress's Sustainability of Digital Formats
(2007) website.
Finally, while packaging games for preservation is primarily a structural metadata problem, it
is clear from both our research and the OAIS reference model itself that archival
information packages for video games and interactive fiction will need to make reference to
non-digital material (print documentation for software, books and journal literature
containing context information, etc.). Such references need to be capable of providing
sufficiently detailed descriptions of resources to allow for reliable identification of specific
editions of print material.
Metadata & Packaging Recommendations
As noted above, packaging of video games and interactive fiction for preservation is in
significant part a matter of making explicit the different relationships that exist between
various resources. There are two common web technologies in use today that can both link
resources and specify the exact nature of the link between them, and that also provide a
wrapper format that we can consider as highly sustainable if evaluated using the Library of
Congress's sustainability criteria. The first is the XML Linking (XLink) (World Wide Web
Consortium, 2001) standard, and the second is the RDF/XML syntax specification (World
Wide Web Consortium, 2004). While these two standards have some significant differences,
for our purposes, the similarities are more important. Both provide a mechanism to specify a
relationship between two resources. Both provide a mechanism to record a URI that
indicates the specific type of relationship between two resources (the predicate in an RDF
triple in RDF/XML, the xlink:arcrole attribute in XLink). Both are capable of asserting
relationships between resources when:
? both resources are contained within the same document as the link itself;
? one resource is contained within the document containing the link and the other is
external; or
? both resources are external to the document containing the link.
The METS standard employs XLink, while OAI-ORE is expressible within the RDF/XML
format; both of these provide the necessary linking mechanisms and do so using common
standards. While MPEG-21 DIDL does not explicitly support XLink, its schema allows
widespread use of attributes from non-MPEG namespaces (including XLink) and so is
capable of encoding the necessary linking information. The XFDU specification from the
CCSDS does not, unfortunately, currently conform to either XLink or to RDF, and so we
cannot recommend it as a packaging standard at this time.
We believe that a shared, formalized ontology defining the various FRBR and OAIS
reference model relationships is the preferred mechanism for enabling the specification of
relationships in digital preservation packaging standards. By providing standardized URIs for
each of the relationships, an ontology provides both a means of indicating the specific
relationships between resources and a shared language for recording information
about relationships between preserved resources, which will help promote interoperability
between institutions. For the Preserving Virtual Worlds project, we have created an OWL
ontology (see Appendix C) that defines the FRBR Group 1, 2 and 3 entities as classes, and
defines the various relationships mentioned in both FRBR and the OAIS reference model as
properties between these FRBR-based classes.
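To give a sense of the general shape of such an ontology, the fragment below sketches, in
OWL's RDF/XML syntax, one FRBR-derived class and one OAIS-derived relationship expressed as a
property between classes. The namespace URI and the specific class and property names are
placeholders invented for this illustration; the authoritative definitions are those given in
Appendix C.

    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
             xmlns:owl="http://www.w3.org/2002/07/owl#">
      <!-- Placeholder namespace; see Appendix C for the actual ontology -->
      <owl:Class rdf:about="http://example.org/pvw/ontology#Work">
        <rdfs:label>Work (FRBR Group 1 entity)</rdfs:label>
      </owl:Class>
      <!-- An OAIS-derived relationship modeled as a property between FRBR-based classes -->
      <owl:ObjectProperty rdf:about="http://example.org/pvw/ontology#hasContextInformation">
        <rdfs:domain rdf:resource="http://example.org/pvw/ontology#Work"/>
        <rdfs:comment>Relates a preserved work to information documenting why it was
          created and how it relates to other content information.</rdfs:comment>
      </owl:ObjectProperty>
    </rdf:RDF>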
Having this ontology permits the expression of both FRBR and OAIS relationships between
resources in packaging formats employing RDF/XML or that provide appropriate support
for XLink extended links. In OAI-ORE, for example, defining a particular aggregation as a
FRBR work and associating the work with a piece of context information can be
accomplished relatively easily:
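A minimal sketch of such a description follows; the resource URIs and the ontology namespace
are placeholders invented for this illustration, standing in for identifiers that an
implementing repository and the shared ontology would actually supply.

    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:ore="http://www.openarchives.org/ore/terms/"
             xmlns:dcterms="http://purl.org/dc/terms/"
             xmlns:pvw="http://example.org/pvw/ontology#">
      <!-- The resource map that describes the aggregation -->
      <ore:ResourceMap rdf:about="http://example.org/packages/mystery-house/rem">
        <ore:describes rdf:resource="http://example.org/packages/mystery-house"/>
      </ore:ResourceMap>
      <!-- The aggregation itself, typed as a FRBR work by reference to the ontology -->
      <ore:Aggregation rdf:about="http://example.org/packages/mystery-house">
        <rdf:type rdf:resource="http://example.org/pvw/ontology#Work"/>
        <dcterms:title>Mystery House</dcterms:title>
        <ore:aggregates rdf:resource="http://example.org/packages/mystery-house/disk-image"/>
        <!-- The link to a piece of context information uses a property from the ontology -->
        <pvw:hasContextInformation
            rdf:resource="http://example.org/packages/mystery-house/game-manual"/>
      </ore:Aggregation>
    </rdf:RDF>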
A similar description is possible in METS:
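Here the relationship can be carried by an XLink arc in the structLink section, with the
xlink:arcrole attribute pointing to the ontology URI that names the relationship; again, the
IDs, labels, and URI below are placeholders rather than a prescribed encoding.

    <mets:mets xmlns:mets="http://www.loc.gov/METS/"
               xmlns:xlink="http://www.w3.org/1999/xlink">
      <mets:structMap TYPE="logical">
        <mets:div ID="package" LABEL="Archival information package">
          <mets:div ID="work" LABEL="Mystery House (FRBR work)"/>
          <mets:div ID="context" LABEL="Context information for the game"/>
        </mets:div>
      </mets:structMap>
      <mets:structLink>
        <!-- The xlink:arcrole URI names the relationship between the two divs -->
        <mets:smLink xlink:from="work" xlink:to="context"
                     xlink:arcrole="http://example.org/pvw/ontology#hasContextInformation"/>
      </mets:structLink>
    </mets:mets>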
While we believe that an approach to game packaging relying on standardized ontologies for
the description of relationship types is the best means for encoding the necessary linking
information between resources in a preservation environment in a way that supports
interoperability, the PVW ontology should be seen as the start of a discussion on
standardized ontologies for this purpose and not the end. We do believe that an ontology or
ontologies that describe both the FRBR and OAIS models and that integrate the two are an
essential component for packaging of computer games and other software. But if ontologies
are to become a component of the preservation community?s standard toolkit for packaging,
development of those ontologies needs to be done in a way that involves the larger
community as part of a process that would feed into an official standards effort.
With regards to descriptive metadata employed within archival information packaging, we
have already mentioned the need for descriptions that are sufficiently detailed to enable
identification of specific editions of resources. This is particularly true if we assume a
distributed, collaborative preservation environment such as envisioned by the Library of
Congress (2002) in their Plan for the National Digital Information Infrastructure and Preservation
Program. In such an environment, it is conceivable that institutions may rely on copies of
representation information and context information being held by other, remote institutions.
In such a case, being able to describe a work in sufficient detail in order to enable its
identification outside of your own institutional setting is of vital importance.
Dublin Core does not provide this level of detail, and we would recommend against its use
in a preservation setting. Description standards such as MARC/XML, MODS, CDWA and
VRA Core that enable identification of specific editions of resources should be used instead.
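As a rough indication of the level of detail involved, an edition-specific description in
MODS might look something like the sketch below; the field values are illustrative rather
than an authoritative catalog record.

    <mods xmlns="http://www.loc.gov/mods/v3" version="3.4">
      <titleInfo>
        <title>Mystery House</title>
      </titleInfo>
      <typeOfResource>software, multimedia</typeOfResource>
      <originInfo>
        <publisher>On-Line Systems</publisher>
        <dateIssued>1980</dateIssued>
        <!-- The edition statement distinguishes this release from later re-releases -->
        <edition>Apple II disk version</edition>
      </originInfo>
      <physicalDescription>
        <form>5.25 in. floppy disk</form>
      </physicalDescription>
    </mods>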
9. Steps for the Future
Library of Congress National Collecting Plan
The Library of Congress has based its collection policy statements on three fundamental
principles:
? The Library should possess all books and other library materials necessary to the
Congress and the various officers of the Federal Government to perform their
duties;
? The Library should possess all books and other materials (whether in original form
or copy) that record the life and achievement of the American people; and
? The Library should possess, in some useful form, the records of other societies, past
and present, and should accumulate, in original or in copy, full and representative
collections of the written records of those societies and peoples whose experience is
of most immediate concern to the people of the United States.
The Library has a number of separate collection policy statements and overviews for various
specific topic fields and in some cases for particular formats. The collection overviews
covering special materials and formats include one on computer files
(http://www.loc.gov/acq/devpol/colloverviews/computer.pdf), which discusses the
computer game holdings of the Motion Picture, Broadcast and Recorded Sound Division at
the Culpeper facility. The Collection Policy Statement for Moving Image Materials includes
video games within its scope, noting that:
Video games have become an established, popular medium of moving image
entertainment that demand inclusion in the collections of MBRS. The
Division is developing new approaches for the more systematic acquisition
of video games, including playback consoles and platforms, the multiplicity
of formats and their equipment needs, and the technical challenges in
preserving the digital source files. The collection will encompass a wide range
of examples of video game culture, to allow historians decades hence to fully
understand this as a popular phenomenon, and not have simply a few games
that seemed significant at the moment of release.
The Collection Policy Statement also states that the collecting level for video games should
be at the research level, and that the Motion Picture, Broadcast and Recorded Sound
Division "both selects via copyright and purchases video games, their associated hardware,
and magazines about them that reflect the breadth and depth of gaming culture."
We strongly endorse the maintenance of a research-level collection of published video games
by the Library of Congress. Just as important, in addition to documenting the output of the
game industry, the Library of Congress, in collaboration with other cultural institutions,
should make a sustained effort to ensure that materials are collected that document game
cultures and communities built around digital games within the United States, as well as
international cultures and communities that have influenced those in the United States.
Based on our research, we would like to recommend some steps that we believe will assist
the Library in maintaining a research-level collection of video games, as well as suggesting
some guidelines that might be useful in the selection process for materials relating to video
games.
Collection Policy Recommendations
It is perhaps the least surprising of our findings that computer games and interactive fiction
are among the most fragile of digital works, due to their extensive dependency and
interconnection with computer operating systems and particular hardware configurations.
We can safely assume that a PNG image will display equally well on a Mac or a PC, and that
an update to a computer's operating system is unlikely to mean that the image can no longer
be displayed. Neither assumption is safe for software. Given this fragility, all software,
including games, requires an extremely proactive approach to preservation, and this has
implications for collection policy.
One of the first implications that should be considered is the necessity of collecting materials
other than the games themselves. Both related representation information and context
information must be acquired and preserved as part of the Library?s collections if games
added to the collection are to survive. We address issues around representation information
below. Context information is important in two respects. First, it provides understanding of
the relationship of specific content such as a digital game to its technical environment. For
example, documents such as game manuals, developer-created technical support websites,
and contributions by players to discussion forums provide information about hardware and
software dependencies that must be understood in order to install and provide access to game
content, as well as instructions on how to operate a game or tools based on it (as in the case of
replay or machinima production). Second, information about the context of a game often
serves equally well as archival documentation of the technical, player, fan, or creative
communities around a game, or about historical events and activities that took place in game
or virtual worlds; we discuss historical collections at greater length in section 4 of this report.
We note here that contextual information for games and virtual worlds does double
duty, contributing to historical as well as preservation documentation.
With respect to context information generally, our research has shown that preservation of
computer games requires an integrative approach to collection development that is not
common in libraries, archives, and museums to date. Context information for games can be
found in the form of traditional published print material in many forms. The range of
published formats for game-related information extends from scholarly books and journal
literature such as Montfort and Bogost's (2009) Racing the Beam or Dennis Jerz's (2007) article
on the game Adventure, through trade and web-based publications such as game guides and
magazines and the descriptions of play experiences characteristic of the "new game
journalism" (Jim Rossignol's (2008) This Gaming Life, or Tom Chick's columns and essays), to
ancillary publications and ephemera such as printed game manuals or on-line player reviews.
Digital games have been at the heart of scholarly discussion of "trans-media" content, so it
should come as no surprise that non-print media also provide contextual information about
games. We can start with the growing number of game-based movies such as the DOOM,
Tomb Raider, Final Fantasy, Resident Evil, BloodRayne, and Mortal Kombat films, game-based machinima
and replays, and documentaries on games and gaming culture like The King of Kong: A Fistful of
Quarters, Chasing Ghosts: Beyond the Arcade, Frag, and Second Skin. Even audio materials such as
game soundtracks and popular music contribute to our understanding of game content. The
history of video game music has resulted in significant collections of game music, whether in
the form of soundtracks provided with games or produced by fans, or original live
performances. Game composers such as George Sanger (The Fat Man, whose archive is
preserved at the University of Texas) and Nobuo Uematsu (the main composer for the Final
Fantasy series) have established game music as a genre. Well-known musicians and bands
from several styles of music have contributed to the production of specific soundtracks
(such as Nine Inch Nails for id Software's Quake), composed and performed specific songs
for games ranging from EA's Fight Night to music games such as Dance Dance Revolution and
Guitar Hero, or licensed songs for use within video games (such as the large selection of music
available via car radio within the game Grand Theft Auto IV, from classics such as Duke
Ellington's "Take the 'A' Train" to contemporary music by artists including Ghostface
Killah, Sepultura, and Seryoga). Focusing exclusively on materials that might fall within
traditional publication chains, we can see that context information that we might collect
crosses divisional boundaries within the Library of Congress, such as between book- and
media-oriented divisions. Significant coordination on collection policies will be necessary if a
comprehensive collection supporting preservation efforts is to be achieved.
Context information for a new digital medium such as games and virtual worlds will often, if
not usually, be found outside traditional publication channels. We suggest that the Library of
Congress consider web archiving efforts focused on game and game community sites similar
to those which can be found in the Archive-It collections we created as part of the
Preserving Virtual Worlds project (preserved both at the Internet Archive and at Stanford).
Web archiving efforts at the Library of Congress might focus on major sites devoted to
gaming and the gaming industry as a whole, or industry-wide issues and movements,
whether these sites are created by developers, publishers, trade groups, players, or advocacy
groups; specific topical approaches might better be undertaken by collaborating research
libraries and organizations deeply engaged with specific games, genres, or research directions
(such as the emphasis on game world "cartography" in our project). Examples of significant
categories for websites of industry-wide significance might include:
? Portals for information about game development or game publication and history.
Examples: Gamasutra (http://www.gamasutra.com), IGN (http://www.ign.com)
and Moby Games (http://www.mobygames.com)
? Websites of game industry associations. Examples: the International Game
Developers Association (IGDA, http://www.igda.org/) and the Entertainment
Software Association (http://www.theesa.com/). It is worth noting here that the
IGDA includes a Game Preservation SIG with its own website
(http://www.igda.org/preservation); this SIG is chaired by Henry Lowood from our
group and was a productive outreach channel for the Preserving Virtual Worlds
project.
? Websites devoted to retro-gaming and related attention to historical games and game
collecting, including fan-created websites such as the Atari Museum
(http://www.atarimuseum.com/mainmenu/mainmenu.html) and Retrogames
(http://www.retrogames.com/).
? Various sites devoted to development or support of emulation platforms useful for
playing historical games and games from obsolete platforms, such as Stella, an Atari
2600 emulator, and the MAME emulation platform. Examples: the Stella open-
source project site (http://stella.sourceforge.net/), the official site of the MAME
development team (http://mamedev.org/) and the Emulator Zone
(http://emulator-zone.com).
? Fansites for games, many of which offer not only fan forums and fan-produced
content, but also collections of game documentation, such as screenshots, walk-
throughs, FAQs, reviews, replays, and videos produced from screen captures and
replays. Examples: Koinup (http://www.koinup.com/), SL Universe
(http://www.sluniverse.com/php/) and WCR for Warcraft III replays
(http://www.wcreplays.com/). Blizzard Entertainment offers an excellent collection
of such sites on its World of Warcraft fansites page,
(http://www.worldofwarcraft.com/community/fansites.html).
? Major sites devoted to collecting and distributing machinima provide extensive
documentation both of player creativity and of the exploration and content of virtual
and game worlds. Examples: Machinima.com (http://www.machinima.com), the
Machinima Archive (curated by Henry Lowood and a PVW site for archiving
content; http://www.machinima.org), and sites devoted to machinima from specific
games or virtual worlds, e.g., Halo movies (http://halomovies.org) or Warcraft
movies (http://www.warcraftmovies.com/).
? Websites devoted to end-user game modifications. Examples: ModDB primarily for
console games (http://www.moddb.com) or FilePlanet's mod pages primarily for
PC games (http://www.fileplanet.com/mods) and historical FilePlanet sites devoted
to specific games, such as Planet DOOM (http://planetdoom.gamespy.com/)
? Websites devoted to alternative games and "art games." Examples: 56KModern
(http://www.56kmodern.com/), the Independent Games Festival's Indie Games
(http://www.indiegames.com/), the IndieCade festival website
(http://www.indiecade.com/), or Gamescenes: Art in the Age of Videogames
(http://www.gamescenes.org/).
? Websites devoted to "serious games" and educational uses of games, including
military training. Examples: The Serious Games Initiative
(http://www.seriousgames.org/), Games for Change
(http://www.gamesforchange.org/), or Combatsim.com
(http://www.combatsim.com/).
Some of these sites, such as the FilePlanet family of websites, have barely weathered threats
to their continued operation or face ongoing uncertainty about their long-term survival. A
detailed list of virtual world and massively-multiplayer game websites created as part of the
PVW Project in 2008 (see Appendix H) includes many now-defunct websites, only a fraction
of which we were able to preserve as part of our ongoing Archive-It crawl.
In addition to electronic resources, media and web-based collections of documents, the
Library of Congress might consider collecting in some categories of print materials that are
not handled within traditional publication channels. These would typically be collected,
perhaps under the rubric of "ephemera," in a Special Collections Department; we are
proposing that these sorts of materials, along with other categories of near-print or "grey"
publication, play an important role in documenting digital media such as games. Examples
include repair manuals and technical schematics for game platforms of many kinds, including
arcade consoles; advertising materials for games; and production materials from gaming
companies (design notebooks and sketches, development versions of game software and
source code, storyboards, scripts, rule outlines, character descriptions, etc.). Clearly much of
this latter material falls within the domain of archival and special collections as a specialized
department or area of expertise within most libraries. It should be noted here that in libraries
that are currently collecting game-related material (such as the University of Texas and
Stanford University), substantial archival holdings are held in such departments, along with
published books in circulating libraries and audio and video in media collections. However,
we believe that preservation of digital media of all forms will increasingly require eliding the
traditional distinctions between library collections and archival collections if it is to be
successful. If the Library does not choose to collect such archival material on its own,
coordination between its efforts and archival collections at other research institutions
committed to documenting the history of gaming and interactive fiction (including Stanford
University, the Strong Museum, the University of Maryland and the University of Texas)
should form part of the Library?s efforts under the National Digital Stewardship Alliance.
With regards to games themselves, obviously the Library will collect a large number of video
games through mandatory deposit of published software with the Copyright Office.
However, we would caution that, given that registration of copyright fulfills mandatory
deposit requirements, and that registration of copyright for software programs does not
require complete deposit of the software, it is possible for significant games to be registered
with the Library but not actually reside within it. Considering our case set, for example,
Mystery House from Sierra On-Line, Inc., is registered with the Copyright Office, but the
Library does not actually hold a copy of the game. A videotape and textual description were
submitted in lieu of the game itself. Given this reality of the copyright registration and
mandatory deposit system, the Library will need to survey the games actually present within
its collections and determine whether there are additional games which it might wish to add.
In addition, the Library may wish to determine whether other classes of supporting
materials, such as the above-mentioned video and description of Mystery House, should be
identified as relevant and important contextual documentation for game preservation and
history. These considerations raise questions about the criteria that the Library should use in
selecting games for inclusion in their collections, and for evaluating materials already present
for reasons such as copyright registration.
In considering both the games in our case set and in the larger worlds of historical and
contemporary games considered as a whole, there are a variety of factors that we believe
should inform decisions regarding collecting of games:
? Popularity/sales/distribution – popularity, as measured by game sales, should be
considered. A game such as Call of Duty: Modern Warfare 2, which sold 4.7 million
copies on its release day, clearly represents an important piece of popular culture.
However, it should also be noted that lists of top-selling titles require careful
interpretation; for example, they often are crowded with a limited number of titles,
due to dominant series (such as Madden Football), sequels, and titles that appear on
multiple platforms. Games with a larger number of release platforms and ports are
likely to be among the more popular game titles, to the extent that popular industry
"top-10" lists and the like may only provide a narrow view of "hit" titles. A better
source of information on popularity and success might be review-based ranking such
as Game Rankings (http://www.gamerankings.com/) or Metacritic?s game site
(http://www.metacritic.com/games/). Stanford, for example, uses the latter in its
game approval plan.
? Novelty (technical/artistic) – certain games represent significant advances of the
state of the art in gaming, either due to technical or artistic innovations. A game such
as DOOM, which represented a vast leap in the use of immersive 3D graphics, or
Myst, which influenced the aesthetics of many later computer games, should be
considered for inclusion. More recently, specific movements oriented towards
independent ("indie") game development, art games, or news games have emphasized
the use of games as a platform for creative expression; websites tracking these areas
will provide guidance concerning game titles and designers who are influencing these
trends in game development.
? Intertextuality – certain games may be of value in understanding and interpreting
other games. A game such as David Smith's The Colony from 1987, which utilized
true 3D for a game involving an off-world colony overrun by bizarre creatures as a
result of scientific experiments in teleportation, clearly helps contextualize DOOM,
which was released six years later (and in fact some of his techniques, such as using
ray casting to determine the visibility of objects within the 3D space, were later used
by id Software).
? Impact on the Industry – This criterion is closely related to popularity and cultural
impact. However, it also calls for collecting games that had a significant impact not
just in terms of reception, but also in terms of such matters as marketing, packaging,
and distribution. For example, it is important to ensure that a representative
collection of games includes examples that introduced specific media (e.g., CD-
ROM) or packaging formats (e.g., Electronic Arts' "album cover" packaging). These
impacts extend to the introduction of new forms of distribution and game genres,
which may also lead to special difficulties in access to games that should be collected.
For example, Valve's Steam has had enormous significance as a digital distribution
and rights management platform, but the specific nature of its technology will lead to
difficulties in collecting games distributed via Steam. Likewise, web-based "social
network games" such as FarmVille have had an enormous impact on casual game
design and easily satisfy criteria such as popularity and business impact, but the task
of ?collecting? such a game is made difficult by its close dependence on social
network sites such as Facebook.
? Cultural Impact – some games have a significant cultural impact reflected in their
influence on other forms of media or in their ability to spark public discussion and
debate. Id Software's DOOM has achieved significance by both of these markers.
References to the game have appeared in a variety of other media (including episodes
of The Simpsons, Friends, and Family Guy, as well as in a song by the Smashing
Pumpkins). At the same time, the subject matter of DOOM (slaughtering demons in
outer space), its moody graphics and audio, and the vocabulary ("shooters," "death
match") increased public attention to the levels of violence depicted in computer
games. The game was released just as congressional hearings convened by Senators
Herb Kohl and Joseph Lieberman were getting underway to examine media violence
and its influence on children, and thus its release played a role in the national debate
on these issues. In such cases, games become involved in public policy, and the
Library should make a special effort to ensure that the games cited in debates such as
those around DOOM are included within their collections.
? Creator Prestige – A game may also be considered important enough to collect due
to the identity of its creator. Mindwheel probably would not have received the
attention it has were it not authored by U.S. Poet Laureate Robert Pinsky. Games by
well-known designers such as John Romero, Will Wright, or Hironobu Sakaguchi
may not necessarily always be the best examples of games, but they may still have a
significant impact on the game industry. It should be noted that this is a criterion
particularly for archival collecting; the collections at Stanford (Steve Meretzky, Hal
Barwood) and Texas (Warren Spector) have already begun to collect the papers of
noted game designers. Some games are significant due to the prestige of their
developers as inventors or innovators, some due to the importance of computer
technology and software innovation for this medium. John Carmack's early games
(before DOOM) are significant for this reason, and the Ralph Baer papers at the
Smithsonian Institution provide an example of the impact of this criterion on
archival collecting.
? End-user Appropriation – Some games offer features that make them particularly
suited to end-user appropriation in various ways. This is a particularly important
criterion for considering games not just as authored content per se, but as a creative
platform for expression by fans, players, and others. One aspect of this creativity is
the alteration of games or their use to create wholly new games. Games like DOOM
and Quake were among the first to provide the means for users to create new maps
for game play or make new assets such as creature "skins" and monsters. Such games
have since spawned a cottage industry of "mods," modified (in some cases heavily
modified) versions of games. Users have also appropriated various multi-user online
games (including Halo, World of Warcraft, Second Life and many others) for the
production of machinima, videos created using game technology, assets, or the
virtual world of a game as a staging and production platform. Finally, many players
have created reputations as "cyber-athletes" by virtue of their creativity in
competitive game-play, to the extent that today professional and near-professional e-
sports have attained some significance both nationally and internationally. All of
these areas – mods, machinima, e-sports – represent activities through which players
demonstrate their creativity. Games which have led to the creation of significant
user-produced content should also be given some priority in collecting activity, as
should the mods and machinima produced using them, or replay movies that capture
important moments in the history of competitive e-sports.
? Impact on Game Design and Technology – Particularly novel games may have a
major impact on the development of the gaming industry. The game Crysis, for
example, established a new benchmark for the whole industry in terms of
photorealism in the display of the game world and in the complexity of the physics
engine used by the game; the game was so demanding of computer hardware, in fact,
that "But can it run Crysis?" became a common trope when discussing the
capabilities of a new computer platform. The original Pong game was obviously not
noted for the sophistication of its game play, but it led to the development of the
entire console game industry.
In addition to the above issues, there are two other factors that the Library should consider
in developing its game collection. Mandatory deposit for copyright will obviously help
ensure that the Library has a strong collection of games produced and marketed within the
United States. However, the gaming industry (and gaming community) is international, and a
research collection such as the Library's should include significant games from outside the
United States. The factors listed above should be considered when selecting games from
outside the United States, but the Library might also wish to consider making special efforts
to collect representatives of genres of games that are not typical of the U.S. game market
(the genres of visual novels and dating sims from Japan, for example), that have achieved
prominence within their own countries (such as the various MMORPGs operated by Shanda
in China), or that have had significant influence on games or the game industry in the United
States. Likewise, the emphasis on collecting games that document creative play might point
to documentation of competitive play cultures in countries such as Korea and regions such
as Central and Northern Europe. Countries already possessing major video game industries
include Japan, Korea, United Kingdom, Canada, Australia, Germany, Sweden, and France,
and there are significant industries developing in China, Taiwan, Russia, the Ukraine,
Finland, Denmark, and the Netherlands.
On the other hand, game technology and the game industry have also been significantly
shaped by the regional concentration of U.S. "high-technology" industry, particularly in
computing and web-related sectors. In recent years, the game industry has also been a driver
in sectors such as the graphics and audio hardware industries and has influenced mainstream
entertainment such as the movie industry. Thus, it is no surprise that institutions currently
active in game collecting activities tend to be located in or near prominent techno-regions or
have themselves played prominent roles in producing relevant technologies (Stanford, Texas,
Illinois). These comments regarding the collecting implications of both the global and the
regional nature of the digital game industry underscore again our conviction that one
collecting institution is unlikely to be able to do it all and lead us to conclude this section by
noting again the importance of collaborative collecting efforts.
Intellectual Property & Digital Preservation: Widening the Discussion
There is a need for memory institutions interested in establishing video game repositories to
support policy positions and offer services that are better aligned with the preservation and
use practices of the game community, not only because of the potential for integrating the
members of this community into the larger preservation network, but also because their
attitudes and values may well influence those of professional archivists in years to come.
With their deep curatorial investment in games, players have adopted a versatile set of
approaches for collecting, managing, and providing long-term access to these cultural
artifacts. While bitstream preservation and emulation are an essential part of the overall
picture, so too are re-releases, remakes, demakes, ports, mods, and ROM hacks.3 As a result,
game archives in the wild often reflect notions of authenticity that are different from those
3 "Remake" and "demake" are game design terms coined by Ian Bogost. As defined by Bogost,
remakes "are recreations of earlier works, irrespective of the hardware platform of original creation
or recreation." Conversely, demakes are "retro-inspired reimaginings of modern games, as if they had
been created on earlier hardware. Demakes are not necessarily created to run on older machines, but
their design and behavior are constrained by the real or perceived constraints of vintage systems."
See Bogost's spring 2010 syllabus for "Atari Hacks, Remakes and Demakes: Special Topics in Game
Design and Analysis," available at
http://www.bogost.com/teaching/atari_hacks_remakes_and_demake.shtml.
of memory institutions.4 While the dominant orientation of archivists is toward the
stabilizing of cultural records, gamers can tolerate – indeed embrace – greater variability in
the objects with which they interact. The Mario Brothers franchise, for example, has
persevered largely because of the consumer appetite for new and altered game levels, power-
ups, graphics, characters, and items. As another example, consider the case of Mystery House,
the first graphical work of interactive fiction ever created. Released into the public domain in
1987, Mystery House was recently reimplemented in the Inform programming language as
Mystery House Taken Over (MHTO) by the Mystery House Occupation Force, comprised of Nick Montfort, Dan
Shiovitz, and Emily Short (2004).
digital artists to mod the game using a specially designed kit for the purpose. The end result
is that MHTO oscillates between a preservation project and a remix project. As Jon Ippolito
(2008, p. 106) has remarked, new media art "can survive only by multiplying and mutating ...
fixity is death." Consequently, there is an argument to be made in favor of preservation
strategies that involve reprogramming, reimplementation, and recreation. The "softer" view
of authenticity that underlies these strategies operates at what Seamus Ross (2010) would call
a "lower threshold of verisimilitude."
What policy initiatives and preservation services might be adopted in response to the needs,
practices, and perspectives of players and player-archivists? We propose the following:
Digital Preservation Services: Comparative Methods and Stemmatics
In addition to providing authenticated capture, ingest, hashing, and storage services for
archival copies of games, digital repositories might also offer appropriate services for access
copies of games in the wild. Because these copies are often modified rather than fixed
representations – in line with the "softer" canons of authenticity previously mentioned –
repositories could provide users and player-archivists with the means to analyze, document,
and measure their inter-relationships using similarity metrics and other approaches. A good
example is Colossal Cave Adventure, created by Will Crowther in c. 1975, which has the
distinction of being the first documented computer text-adventure game. Inspired by
Kentucky's Mammoth Cave system, the game inaugurated many of the conventions and
player behaviors now associated with the genre, such as solving puzzles, collecting treasure,
and interacting with the simulated fantasy world via short text commands (Jerz, 2007).
Originally written in FORTRAN, the code has been repeatedly ported and expanded by
hackers, fans, and programmers over the years, most influentially by Don Woods (c. 1977).
The number of versions of Adventure is legion and continues to grow, with players mapping
the evolution of the game using tree-like structures showing patterns of inheritance and
variation (Dalenberg, 2004 and 2006).
Applying the techniques of digital stemmatics, archivists could help users visualize and
interpret these patterns in sophisticated ways. Developed in the 19th Century, stemmatics
codified a set of methods for analyzing the filiation of literary manuscripts. Significantly, the
tree structures representing these relationships have parallel importance in evolutionary
biology and historical linguistics, where they are used to group genomes or languages into
families; show how they relate to one another in genealogical terms; and reconstruct lost
4 "Archives in the wild" is a phrase coined by Jeremy John in the British Library's Digital Lives
Report.
archetypes (Kraus, 2009). Speculating on the role of digital stemmatics (or phylogenetics, as
the comparative method is called in biology) in the context of personal digital archives,
Jeremy John of the British Library has postulated that "future researchers will be able to
create phylogenetic networks or trees from extant personal digital archives, and to determine
the likely composition of ancestral personal archives and the ancestral state of the personal
digital objects themselves" (John, 2010, p. 134).
Stemmatic methods have already been applied to board games: Joseph Needham, a
pioneering historian of East Asian science, technology, and culture, published a family tree
of board games connecting divination, liubo, and chess through a long line of ancestry and
descent (331); and biologist Alex Kraaijeveld has applied phylogenetics to variants of chess
to help determine its place of origin. The methodology therefore shows great promise for
the study of video and computer games in the wild, where variability rather than fixity of
representation is often the norm.
Two other tools cited recently by Jeremy John are also relevant in this context:
? ccHost – an open-source content management system developed under the auspices
of Creative Commons that can be used to track and document how media content is
used, reused, and transformed on the web. (ccMixter, the popular music site for
remixing and sharing audio samples, is powered by ccHost.)5
? Comparator – a tool developed by Planets (Preservation and Long-term Access
through Networked Services), a four-year project funded in part by the European
Union. Comparator is designed to measure degrees of similarity between different
versions of a digital object.
Stemmatics might also be experimentally applied to the history of platform architectures.
The system of hardware documentation developed by Walker Sampson for MITH's vintage
computers complements such an approach by modeling the collection "through component
pieces and parts" (Sampson, 2010). By decomposing each computer into discrete hardware
components – microprocessors, disk drives, input devices, display technologies, and so
forth – the documentation isolates possible units of variation. These units or parts can then
be grouped together according to hereditary relationships (for example, partial pedigrees
have already been constructed on Wikipedia and elsewhere for the MOS 6502 8-bit
microprocessor, the ancestor of both the MOS 6510, which was used in the Commodore 64,
and the W65C02S, which was used in the portable Apple II). Similar variations and
derivations could be established for other processors and hardware components. Perhaps
even more significantly, we could begin to model evolutionary degrees of similarity and
difference not only between successive lines of personal computers or consoles developed
by a single company (e.g., the Apple and Macintosh series), but also between entirely
different families or brands of computers (e.g., the Atari ST, nicknamed "Jackintosh" for its
obvious mimicry of the original Macintosh; both machines were based on the Motorola
68000 CPU). These findings would in turn have potential implications for computer
restoration and console modding.
5 Information about ccHost and ccMixter can be found online at
http://wiki.creativecommons.org/CcHost and http://ccmixter.org/. See also Jeremy John 56 and
156n244.
Digital Preservation Services: Calculating Trust In Fan-Run Game Repositories
Because game archives in the wild cannot usually be authenticated according to standard
integrity checks, an alternative method for evaluating the authenticity of their holdings might
involve the application of trust-based information. Jennifer Golbeck, for example, has
demonstrated how the trust relationships expressed in web-based social networks can be
calculated and used to develop end-user services, such as film recommendations and email
filtering. Applying Golbeck's insights, archivists could leverage the trust values in online
game communities as the basis for judgments about the authority or utility of relevant user-
run repositories, such as abandonware sites and game catalogs. Under this scenario,
authenticity is a function of community trust in the content being provided. One
consequence of this approach is that authenticity and mutability need not be considered
mutually exclusive terms; on the contrary, fan-run game repositories that make provisions
for transformational use of game assets – such as altering the appearance of avatars or
inventory items – might in many instances increase trust ratings.
IP and Public Policy: Reform of Contract Law
While archivists and librarians have for the most part targeted copyright law for reform,
gamers have influenced the civil code documents that govern virtual worlds, such as EULAs
and TOS agreements. Because this class of documents is frequently updated to reflect the
evolving nature of the relationships among different stakeholders (Grimes et al, 2008),
gamers have arguably had greater success than archivists and librarians in advancing their
goals over a relatively short period of time. We therefore recommend that archivists and librarians
devote more effort to the reform of contract law and its associated document genres. To the extent that IP is
becoming a matter of private policy rather than public policy, managed through licensing
and contractual agreements as opposed to federal copyright law, advocacy efforts
increasingly need to be directed at the commercial industry rather than the Copyright Office
or the legislative branch. In many respects – and perhaps somewhat counter-intuitively – the
game industry is better positioned than Congress or the courts to respond in a progressive
and timely fashion to emerging social, cultural, and technological forces that are radically
restructuring the relationship between content creators and users. For example, within the
last few years Microsoft and Blizzard have published Game Content Usage Rules that relax
the exclusive authorial right of adaptation in order to permit players to generate and
distribute machinima, video, and other derivative works of art (2004-2010). These rules in
turn provide a nascent legal environment in which remix culture and Web 2.0 can flourish.
That such an environment is the product of contract law rather than copyright law bears
emphasis: the protracted processes through which copyright law has traditionally been
emended increasingly provide inadequate grounds for meeting the challenges of our current
cultural milieu, a milieu characterized by rapid technological upheavals that in turn redefine
social attitudes toward concepts such as originality, copyright, and creativity. Case in point:
the Section 108 Study Group, charged with re-examining the Copyright Act with an eye to
updating it for the digital age, took nearly three years to reach agreement on a set of non-
binding and comparatively tepid recommendations, which may or may not eventually result
in actual policy change upon legislative review. By contrast, the example of Microsoft, in
particular, demonstrates how contract law can help legitimate player innovation by rapidly
legalizing and codifying it ("rapidly" here being an admittedly relative term).
It is crucial to note, however, that the fluid state of contractual agreements is a double-edged
sword: EULAs can just as easily be used to constrict or rescind rights as to expand or confer
them. As Microsoft states in its content usage rules, the company may choose to revoke a
license ?at any time and for any reason? (2004-2010). Although subject to some statutory
constraints, it may act self-interestedly, magnanimously, or capriciously as it sees fit. A
temporal content analysis would shed light on directional patterns of change: are user rights,
once introduced into game licenses, really vulnerable to repeal, or do they tend to persist
over time to the point that they become fully naturalized components of the game culture?
Questions and caveats aside, it remains the case that civil code documents, such as EULAs,
are more dynamic in nature than the U.S. Copyright Act.6 Between 2002 and 2007, for
example, Second Life, the popular 3D virtual world developed by Linden Lab, published at
least nineteen different versions of its EULA (Fitz, 2008). Just as revealingly, Microsoft
published revisions to its Content Usage Rules less than two months after their initial release
based on feedback from the machinima community (Hayes, 2008, p. 570). By contrast,
Section 108 of the Copyright Act has been amended a mere four times since it was first
published in 1976. The opportunities for citizen intervention into the machinery of contract law are therefore much greater than comparable opportunities for intervention into copyright law.7
IP and Public Policy: Layered Licensing Agreements and Archival Endorsements
As a way to address the impenetrability of many software license agreements, Jeremy John has recently proposed a layered approach to their implementation, citing Creative Commons deeds as a relevant model (56, 58). Under this model, the prohibitive length and jargon of many game licenses would be ameliorated by condensing them and restating their salient points in accessible language and machine-readable format. A set of icons would be prototyped as an additional layer encoding baseline user rights and exemptions, again following the CC precedent. The end result would be layered licenses, each layer targeting a different audience: lawyers and experts (original legalese), users (vernacular and visual versions), and machines (RDF version). Applying John's suggestions to game EULAs, Content Usage Rules, and TOS agreements, we recommend that iconic and vernacular versions of these documents be developed and standardized to efficiently communicate the following information (a schematic machine-readable sketch appears after the list):
• Whether or not the user has the right to reproduce game assets
     o The game assets created by the user
     o The game assets created by other users
     o The game assets created by the developers
• Whether or not the user has the right to create derivative works, such as machinima
• Whether or not the user has the right to transfer game assets (e.g., to a public repository)
• Whether or not the user has the right to export his or her game assets (or those of others) in an open format
• Whether or not an archival exemption clause exists that allows librarians and archivists to create preservation copies of game assets
• Whether or not the license includes a "competitive endorsement by official or public repositories" (John et al., 2010).

6. On the dynamism of civil code documents, such as EULAs, see Grimes et al., 2008.
7. Such intervention is a function of the dynamism of the documents, and is by no means due to their being collaboratively authored by different communities of practice. Machinimists and players influence the content of the documents indirectly rather than directly.
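As a purely hypothetical illustration of the machine-readable layer recommended above, the following Python sketch records the listed rights as simple flags. The field names are our own shorthand rather than any published vocabulary; a production version would more plausibly be expressed in RDF, following the Creative Commons deeds.

    # Hypothetical machine-readable summary of a game license. Field names are
    # illustrative assumptions, not a standardized vocabulary.
    import json

    game_license_summary = {
        "license": "Example Game Content Usage Rules",    # hypothetical title
        "reproduce_assets": {
            "own_creations": True,
            "other_users_creations": False,
            "developer_creations": False,
        },
        "derivative_works": {"machinima": True, "mods": True},
        "transfer_assets_to_public_repository": False,
        "export_in_open_format": True,
        "archival_exemption_clause": False,
        "endorsed_by_official_repositories": False,
    }

    # Serialized, the same summary could be harvested by registries or crawlers.
    print(json.dumps(game_license_summary, indent=2))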
The PVW team or LOC should consider using Appendix A: Virtual Worlds that Died
During the Grant to pressure game companies to embrace use policies designed to secure
the endorsement of libraries and archives; and to raise awareness among players about the
importance of these policies for long-term access to and use of game assets. Additionally, the
PVW team recommends convening a working group of librarians, archivists, and researchers
to draft an archival exemption clause and conduct outreach events with game developers and
players to promote its adoption.
IP and Public Policy: Expanding User Rights in Adaptation
Archivists and librarians have spent most of their political capital in lobbying for legal
exceptions to the exclusive authorial right of reproduction, largely ignoring the fact that they
and their users also have a tremendous stake in limiting the exclusive authorial right of
adaptation. Gamers, by contrast, have significantly expanded their rights to produce derivative
works based on the creative assets of game publishers (Hayes, 2008). We therefore recommend
that archivists, librarians, and game scholars work to expand user rights in the area of adaptation, following
the example of gamers. Section 108 of U.S. copyright law privileges librarians and archivists as a
special class of users for whom reproduction is the single most valuable non-exclusive right
that may be accorded to them for purposes of preserving the media objects in their care.8
Conversely, game content usage rules privilege players and the commercialization of
primary content, which is potentially supported by the production of derivative works, such
as machinima, that boost brand value. Granting the user the right to prepare derivative
works is therefore fundamental to realizing what Lawrence Lessig (2008) calls a "Read-Write" (RW) culture. Defined in opposition to a "Read-Only" (RO) culture, which assumes a clear division between creators and consumers, a RW culture sees the two as deeply intertwined: RW creativity is one in which a society's "ordinary citizens" not only passively "read" their culture, but also actively transform it, producing content such as fan fiction, game mods, machinima, and audio remixes.
Within the RW context, it is the right of adaptation, not reproduction, that achieves
preeminence among the five established and inter-related pillars of the copyright code (the
reproduction, adaptation, distribution, performance, or display of the original work).
Moreover, the model of creativity that underwrites RW culture shades almost imperceptibly
into an emergent model of preservation. Because players are already an integral part of the
preservation system for video games, any adaptation rights they acquire as content creators
also help them in their capacity as content preservers (Kraus, Donahue & Winget, 2009).
This crossover advantage stems from the fact that the player as modder and the player as preservationist both produce a transformed digital object. In short, because citizen preservation methods that transmit culture via "version streams" are becoming increasingly prevalent,9 it is essential that archivists, librarians, and curators advocate strongly for the right to prepare derivative works.

8. The word "reproduce" and its cognates (e.g., "reproducing") appear no less than twenty-four times in Section 108 of the U.S. Copyright Act. Conversely, the word "derivative" never appears at all, nor does any related terminology (such as "adaptation"). Additionally, the executive summary report issued by the Section 108 Study Group makes no recommendations regarding the right of adaptation.
Representation Information and Format Registries/Tools
The Library has endeavored in creating its digital preservation systems to ensure that they
comply with the Open Archival Information System (OAIS) Reference Model. This carries
with it the obligation to ensure that any digital content acquired is matched by the necessary
representation information to interpret the digital content. Our research into our case set indicates that the representation information necessary for preservation of software will fall into several major classes (a schematic data model for these classes follows the outline below):
1. Source Code Representation Information
a. Structure Information - documentation of text standard (e.g., Unicode)
employed by the source code documents
b. Semantic Information
i. Documentation of the programming language employed
ii. Documentation of any OS libraries invoked by the code which may
not be part of the standard libraries for a language
iii. Documentation of any external APIs invoked by the code
2. Binary Executable Representation Information
a. Structure Information
i. Documentation of the executable file format employed by the binary
(e.g., COFF, ELF, Mach-O, etc.)
b. Semantic Information
i. Documentation of the instruction set employed by the processor the
binary executable was created to run on (e.g., Intel 64 and IA-32
Architecture reference manuals).
ii. Documentation of the standard hardware architecture for the
computing platform the executable was designed to run on, including
information on memory, buses used to interconnect parts of the
system, standard I/O devices and any special I/O devices required
by the program.
3. Game Data File Representation Information (images, databases, audio tracks, etc.)
a. Structure Information
b. Semantic Information
4. Media Representation Information (disk image formats, ROM organization, etc.)
a. Structure Information
b. Semantic Information
5. Recursive Representation Information (rep. information for all preceding forms of rep. information)
   a. Structure Information
   b. Semantic Information

9. "Version stream" is a concept defined by Jon Ippolito, et al., in the context of The Pool, an online collaborative environment for designing, sharing, and disseminating variable media art: http://bit.ly/57hhE0.
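The following schematic data model, written in Python, illustrates how the classes outlined above might be recorded for a single game. The class and field names are illustrative assumptions only; they are not drawn from OAIS, PREMIS, or any tool produced by this project.

    # Schematic data model for the classes of representation information
    # outlined above. Names and example values are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RepresentationInformation:
        kind: str                 # e.g. "source code", "binary executable", "media"
        structure_docs: List[str] = field(default_factory=list)  # text standards, file formats
        semantic_docs: List[str] = field(default_factory=list)   # languages, APIs, instruction sets
        # Recursive representation information: documentation about the documentation.
        about: List["RepresentationInformation"] = field(default_factory=list)

    binary_rep_info = RepresentationInformation(
        kind="binary executable",
        structure_docs=["ELF executable format specification"],
        semantic_docs=["Intel 64 and IA-32 Architectures Software Developer's Manual"],
    )
    binary_rep_info.about.append(
        RepresentationInformation(kind="standards document",
                                  structure_docs=["specification of the format the manuals are stored in"]))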
While in many ways the representation information for software is similar to that for any
other complex data object which may employ a variety of file formats, there are several
aspects of representation information for computer games that we believe deserve
highlighting and that suggest possible future steps that the Library might wish to consider.
The first is that the representation information for computer software consists in significant
part of standards documents or public specifications created by companies. Standards for
character encoding and programming languages form the greatest part of representation
information for source code versions of programs (augmented by documentation regarding
device specific APIs that might be used in a program). For binary executables, the formats
employed for executables on different platforms are well documented, as are the instruction
sets for processors for particular machines; companies producing hardware typically have
every incentive to provide developers this information in order to encourage them to create
software for that platform. While representation information for source code and binary
executable formats is typically readily available for modern platforms, it should be noted that
this information can be 1) quite voluminous (the documentation for Intel 64 and IA-32
architectures runs to several thousand pages); 2) difficult to obtain once the technologies
supporting a particular game become antiquated; 3) unlikely to be found in typical library
collections, particularly standards documents; and 4) highly repetitive for different pieces of
software.
Given these facts, and given that the OAIS Reference Model suggests that the end points of
a representation network for a data object must be decipherable without use of computing
equipment, there appear to be clear cost benefits to maintaining a separate collection of
printed representation information with the Library of Congress. This would allow
information packages that the Library of Congress creates for any games it holds to
reference print copies of representation information for these items. Given that so much of
this information is in the form of standards documents, there might be grounds for some
collaboration with the National Institute of Standards and Technology on the creation of
such a collection.
Another crucial point about representation information for gaming materials that should be
highlighted is that it may include more information about storage media than might be
required for some other forms of data. Some of the games that we have investigated would
require an emulation strategy for preservation due to the fact that the original source code
was lost or unavailable, and in those instances, the binary executable versions of games that
we were able to obtain were contained within a disk image file for the original game
platform. Such instances can be seen as examples of the onion-model set forth in the
PREMIS metadata dictionary, where one file may contain bitstreams, which may in turn
contain further bitstreams, each of which may require separate representation information.
For disk images, the representation information will include documentation on the original
media format and use.
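A minimal illustration of this onion of nested bitstreams, using assumed file names, might look as follows, with each layer carrying a pointer to its own representation information.

    # Illustration of the onion model: a disk image file whose contained
    # bitstreams each reference their own representation information.
    # File names and descriptions are assumptions for the example.
    disk_image_package = {
        "file": "game_disk.img",
        "representation_information": ["original media format documentation"],
        "contains": [
            {
                "bitstream": "GAME.EXE",
                "representation_information": ["executable format and instruction set manuals"],
                "contains": [],   # could nest further, e.g. packed resource files
            },
        ],
    }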
As noted in the OAIS Reference Model, representation information can itself require
representation information, leading to the creation of a representation network. We found
that representation networks for games could quickly become quite large both in terms of
the number of works and the byte count of the representation information files, and that the
representation information for a game could easily outsize the game itself by several orders
of magnitude. In the case of Star Raiders for the Atari 2600 platform, the game itself required
only 8K of storage, while the representation information for the game consumed over 100
MB. This provides yet another incentive to compile a central store of representation
information that information packages for games (and probably other materials) would
reference at the Library.
Having the representation information, however, is not enough. One of the aspects of
representation networks that has not been sufficiently commented upon in the case of
standards documents is that they are stable (at least until a standards document is
significantly revised), and that much of the recursively referenced representation information
for a standards document will be other standards documents cited in the original standard. A
database tracking these representation network links between documents would be of great
benefit to the Library and others. At the moment, we do not believe that the data models
being proposed for the Unified Digital Format Registry (UDFR) will support recording this
type of information linking the pieces of documentation associated with a particular data
format. The Library may wish to either work with the UDFR to ensure that support for this
type of information is added into the UDFR data model, or consider creating its own
database for representation network information.
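By way of illustration only, and not as a proposal for the UDFR data model, the following Python/SQLite sketch shows the kind of minimal link table such a database might contain; the table and column names are assumptions.

    # Illustrative sketch: record "document A cites document B" links so a
    # representation network can be reassembled by walking the citations.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE rep_document (
        id     INTEGER PRIMARY KEY,
        title  TEXT NOT NULL,
        issuer TEXT            -- e.g. ISO, Ecma, Intel, a platform vendor
    );
    CREATE TABLE rep_link (
        citing_id INTEGER REFERENCES rep_document(id),
        cited_id  INTEGER REFERENCES rep_document(id),
        PRIMARY KEY (citing_id, cited_id)
    );
    """)
    conn.executemany("INSERT INTO rep_document VALUES (?, ?, ?)", [
        (1, "ELF executable format specification", "TIS Committee"),
        (2, "System V Application Binary Interface", "AT&T / SCO"),
    ])
    conn.execute("INSERT INTO rep_link VALUES (1, 2)")

    # Follow the network one hop out from a given standard.
    for (title,) in conn.execute("""SELECT d.title FROM rep_link l
                                    JOIN rep_document d ON d.id = l.cited_id
                                    WHERE l.citing_id = 1"""):
        print(title)

Walking the link table outward from a given standard would reassemble the representation network for any format that cites it.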
Digital Game Canon
This outreach activity coincided with the initial development of the Preserving Virtual
Worlds proposal. The creation of the Digital Game Canon was undertaken under the
auspices of the Game Preservation Special Interest Group (SIG) of the International Game
Developers Association (IGDA); Henry Lowood of the PVW team has chaired the Game
Preservation SIG since 2006 and organized the original Digital Game Canon activity,
namely, the announcement of the initial selection of 10 games for inclusion in the Digital
Game Canon.
The goals of the Digital Game Canon are two-fold: 1) recognition for the importance of
digital game culture, including raising awareness of the impact and responsibilities of this
importance for the game industry; and 2) establishment of a basis for decisions about the
historical value of specific game titles that reflect a mix of academic, industry, and
journalistic perspectives.
The Canon provides a starting-point for the difficult task of preserving this history, inspired by the role that the U.S. National Film Registry has played for film culture and history.
The scope of the Canon has thus far been international. Our argument: We could do worse
than to start by making sure these games and archival material related to them are available
to future developers, players, and scholars.
At the 2007 Games Developers Conference, five panelists (Matteo Bittanti, Christopher
Grant, Henry Lowood, Steve Meretzky, and Warren Spector) revealed and discussed their
choices for the first 10 games on this list (https://www.cmpevents.com/GD07/a.asp?option=C&V=11&SessID=3885). These were the 10 game titles put on the Canon at this event:
• Spacewar! (MIT, 1962)
• Star Raiders (Atari, 1979)
• Zork I: The Great Underground Empire (Infocom, 1980; PDP-11 version)
• Tetris (Alexey Pajitnov, 1985)
• Sim City (Maxis, 1989)
• Super Mario Brothers 3 (Nintendo, 1990)
• Civilization I/II (MicroProse, 1991-1996)
• DOOM (id Software, 1993)
• Sensible World of Soccer (Sensible, 1994)
• Warcraft I/II/III (Blizzard, 1994-2003)
Full audio and slides from the event can be downloaded from the IGDA Preservation SIG
website (http://www.igda.org/preservation/files/dgc_gdc2007/).
The Preservation SIG also maintains a web page for the Digital Game Canon
(http://wiki.igda.org/Game_Preservation_SIG/Digital_Game_Canon/). The Gamasutra
game design website has commissioned articles on games in the Canon, such as this essay on
Spacewar:
http://www.gamasutra.com/view/feature/1433/down_the_hyperspatial_tube_.php. The
Canon project was originally conceived as an ongoing project, much like the National Film
Registry. Plans have been discussed to revive the project in 2010 or 2011 as a continuing
series of annual additions of 10 or so games at a suitable public venue, possibly as a joint
venture with a website such as the Joystiq blog community. Just as the National Film
Registry is linked to the work of the Library's National Film Preservation Board, we propose that the Digital Game Canon be coordinated in a similar fashion by a board with ties to the Library of Congress. If this were of interest to the Library, this would be a topic for further investigation with Henry Lowood and the IGDA Preservation SIG, as well as with potential
project partners in industry and the player community. It could also provide the basis for
future preservation activities based on the work accomplished by the Preserving Virtual
Worlds project.
Reimagining Videogame Asset Management & Preservation (ReVAMP)
Symposium
The challenges of digital preservation require early intervention -- a requirement that has
brought archivists, librarians, information technology professionals, and scientists from a
range of fields (notably in the space and geospatial communities) together to find solutions.
To date, video game designers have been relatively uninvolved in these collaborations.
Through our work with the Preserving Virtual Worlds project, a survey of game developers,
and discussions with game industry professionals, we have realized that there is a large gap
of understanding and experience between people in the industry and those in cultural
institutions, with scholars sitting somewhere in the middle, perhaps able to act as bridging
agents. As video games grow in cultural importance, the need to preserve them and the
materials generated during their development becomes more evident. It is essential that these
diverse communities understand each other's goals and perspectives to ensure that future students, scholars, and game developers are able to access the rich history of video games'
creation and use. The natural first step in creating such understanding is to bring
representatives of each community into one room for discussion.
To impress upon the video game industry the importance of preserving its own history, as
well as for the purpose of educating its talent pool, we propose to start with a two-day
Symposium styled after the 1960 Conference on Science Manuscripts (Woolf, 1962).10 The
event will bring video game developers, information professionals, and game scholars
together to discuss the challenges and value of instituting formal preservation programs
through presentations and discussion. It is hoped that such a forum will lead to the
development of a roadmap for the future, generate follow-up activities, encourage cross-
community collaboration, and motivate the video game developers to take action for
preservation. The planning details, agendas, and activity summaries could be used as a model
to jump-start preservation and collaboration in other industries that may currently be
struggling with (or ignoring) the same issues.
The goal at the heart of our proposal is inspired by the history of science documentation
efforts in the United States; we intend nothing less than to instigate an intensive series of
meetings and jump-start serious efforts toward the adequate documentation and archival
preservation of the many worlds involved in video games: design, technology, business, and
culture. To do so, we propose to host a series of symposia styled after the Conference on Science Manuscripts mentioned above. The conference could be scheduled around a major
industry event, such as the Game Developers Conference (GDC), to ensure maximum
attendance. Stanford would be an ideal venue due to its location in the heart of Silicon
Valley, the presence of significant game-related holdings in its archives, and its proximity to
the GDC site in San Francisco.
We believe that a series of small symposia would be best suited towards influencing key
members of the game industry to build support for future preservation work. To accomplish
this goal, the first ReVAMP conference will bring together scholars, information
professionals, and developers to present and discuss the benefits of preserving game
development materials, including providing a general awareness of game history, reuse of
game assets, training new developers, preserving culturally important records and artifacts,
and building brand awareness. Additionally, we will strive to ease the understandable
concerns of business managers and legal offices regarding respect for licenses and
intellectual property rights. Attendance at the first ReVAMP conference will be by invitation
only, and we expect to include no more than 30 to 40 individuals.
Ultimately, we seek to partner with the game development industry in the development of
one or more disciplinary centers with archival repositories to support all sectors of the video
game world in much the same way that the Center for History of Physics (CHP) was
established in 1965 by the American Institute of Physics to ensure adequate documentation
and preservation of physics materials. Gaining the support of the creators themselves is a vital first step along this ambitious path, and a conference of this nature represents the best way to open discussion with game developers and get them thinking about preserving their own history. As Joseph Anderson, head of the CHP's archival program, has put it, "All the stakeholders in archival records - the people who create them, the researchers who use, and the archivists who preserve them - should work together to decide what can and should be preserved and to develop appraisal guidelines." The ReVAMP conference will represent our best effort to date to create such a consensus among parties concerned with the records of game development, technology, business, and culture. We therefore propose a follow-up symposium, somewhat smaller than the first, for the organization of an appropriate effort to create a disciplinary history center for the preservation of these records.

10. See also Joseph Anderson's write-up on the development of the Center for History of Physics at the American Institute of Physics, "Difficult to Document: Physics in Government and Industry," available at http://www.bath.ac.uk/ncuacs/FP_Anderson.htm.
A Research Agenda for Preservation of Video Games & Interactive
Fiction
While the Preserving Virtual Worlds project has managed to answer a number of research
questions regarding the issues which make preservation of computer games and interactive
fiction problematic, as well as demonstrating the feasibility of building upon existing packaging mechanisms within the digital preservation community for the storage of these unique materials, any such investigation will in turn spark new questions and lines of inquiry to be pursued. Based on our research, there are a number of questions with respect to the preservation of games and related content that we believe need further investigation.
One of the foremost of these arises out of the work we have done on emulation as a
preservation strategy for computer games. As discussed in chapter 6, emulation proved to be
only a partially successful strategy. Aspects of the original game which we would, in the best
case, wish to preserve (e.g., the music accompanying game play in DOOM, differences in
graphic output in Star Raiders) proved difficult to maintain under emulation, and we expect
similar problems to arise under migration approaches. The operation of any piece of
software involves a complex interplay with a computer's hardware and its operating system,
and changes to any component in this dynamic can have an impact on the software?s
performance and appearance. To expect software to stay utterly unchanged in the face of
changing computing platforms may not be realistic, but we clearly need more research into
how emulators might improve their performance, particularly with respect to their ability to
successfully reproduce the behaviors of I/O devices.
Preservationists have accepted the necessity of changing the form and appearance of stored
information in order to insure its on-going accessibility; newspapers in microform are not
identical to the paper copies they replace. But we have little experience to date with users' desires and expectations regarding preserved software, or with their unstated criteria for what constitutes "good enough" or "better than good enough" preservation. In the language of the preservation community, we do not know what properties of games our users consider significant. Given that the preservation strategies we have evaluated with respect to computer games and interactive fiction all suffer from some degree of imperfection, a first major research question for the preservation community is "what are the significant properties of digital games which we should seek to maintain, and what are their relative degrees of importance?" Without this knowledge, we are poorly positioned to select an appropriate preservation strategy from the options available.
As demonstrated by our severe difficulties in attempting to create a preservation copy of
islands in Second Life, massive, multiplayer virtual environments are extraordinarily difficult to
preserve in any meaningful way. Even at the most basic level of creating a complete replica
of the physical environment, our efforts were only (very) partially successful. Clearly further
research must be undertaken regarding not only how we might preserve these environments, but how we might provide access to what we've preserved. The How They Got Game project at Stanford University has been investigating the use of an open source virtual environment software package, Sirikata, as a means of displaying 3D worlds originally built on other platforms. The project has already managed to export a portion of the environment from id Software's game Quake and re-instantiate it within Sirikata. Further research is needed, however, not only on how to achieve more complete translation of an original environment into the Sirikata platform (including such features as scripts, animations and other dynamic
aspects of the virtual world) but also on how to present a recreated virtual world as part of a
virtual museum of 3D worlds.
While our project was able to develop mechanisms for packaging games for preservation that build upon existing standards within the preservation community, our work also
revealed the practical limits of using these technologies. Generation of XML packaging files
for complex compilations of material such as games is simply prohibitively time-consuming
using current technologies. Research and development of tools that speed the process of
generating packaging files for such materials, including recording the necessary relationships
between content, representation information and context information, will be absolutely
critical to the long-term success of the preservation community.
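By way of illustration of the kind of tooling needed, the sketch below (Python, standard library only) walks a hypothetical directory of game materials and emits a skeletal METS fileSec. It is deliberately incomplete: a usable package would also require a structMap, descriptive metadata, and the links to representation and context information discussed above.

    # Sketch only: generate a skeletal METS <fileSec> for a directory of game
    # files. Not a complete or valid METS document and not a project deliverable.
    import os
    import xml.etree.ElementTree as ET

    METS_NS = "http://www.loc.gov/METS/"
    XLINK_NS = "http://www.w3.org/1999/xlink"
    ET.register_namespace("mets", METS_NS)
    ET.register_namespace("xlink", XLINK_NS)

    def skeletal_mets(game_dir):
        mets = ET.Element(f"{{{METS_NS}}}mets")
        file_sec = ET.SubElement(mets, f"{{{METS_NS}}}fileSec")
        file_grp = ET.SubElement(file_sec, f"{{{METS_NS}}}fileGrp", USE="game data")
        for i, name in enumerate(sorted(os.listdir(game_dir))):
            f = ET.SubElement(file_grp, f"{{{METS_NS}}}file", ID=f"FILE{i:04d}")
            ET.SubElement(f, f"{{{METS_NS}}}FLocat", LOCTYPE="URL",
                          attrib={f"{{{XLINK_NS}}}href": "file://" + name})
        return ET.tostring(mets, encoding="unicode")

    # Example call (assumes a local directory of game files named "doom_wads"):
    # print(skeletal_mets("doom_wads"))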
Another recurring theme of our investigations has been that preservation activity
surrounding computer games and interactive fiction could benefit significantly from the
involvement of the gaming community. Gamers have documentation regarding games' use
and history that will be incredibly valuable for scholars in the future; they often possess
technical documentation regarding the hardware and software necessary to run a game that
the game companies themselves may no longer possess; and they have demonstrated their
willingness to invest long hours in the creation of emulators, websites on games and other
activities. If this energy could be harnessed by the digital preservation community, it would
be of immense value.
Whether this is possible or not, and if so how, are questions that remain to be answered and
that we believe deserve further investigation. Some specific questions that have arisen in the
course of our research have been:
• How might metadata systems run by the cultural heritage sector be opened up to contributions from the gaming community to allow them to provide enhanced description information, as well as context and representation information, for games? Are there legal or social issues that might impede this?
• What legal and social issues might impede libraries collaborating with the gaming community on development and maintenance of emulation software?
• Websites created by the gaming community documenting particular aspects of games have demonstrated the same longevity as other parts of the web, which is to say, they are highly perishable. If the library community were to provide infrastructure for the gaming community for some of this activity, it might help promote the survival of some of this material. What factors might impede this, and how can they be addressed?
Through studying these questions with regard to the gaming community, digital
preservationists might also gain further insight into their relationships with other
communities of practice.
10. Conclusion
In 1996, as the first generation of online, 3D
multi-user virtual environments were becoming
increasingly known and popular on the Internet,
the VRML Consortium created a working group
known as "Living Worlds." The Living Worlds
group was to create specifications that would
allow the virtual worlds created by different
companies to be truly interoperable, with avatars
and objects being able to move freely between
environments provided by different companies.
Through cooperation, the companies involved in
the working group hoped to spur further user
interest and participation in these evolving virtual
worlds.
Unfortunately, while draft specifications were
developed and some test implementations were
performed, the Living Worlds work never really
came to fruition. The initial burst of development in multi-user virtual environments died
quickly as companies struggled to find a sustainable business model for running a virtual
world, and several companies involved in the Living Worlds effort went bankrupt or were
acquired by other firms. Later entries into the multi-user virtual environment sphere, such as
the early MMORPGs Ultima Online and Everquest, had little motivation for pursuing
interoperability.
There are several lessons that the preservation community can take from the early history of
3D virtual worlds. The first is that without a sustainable business model, any effort to try to
keep a virtual world alive is doomed to failure. The second, perhaps less obvious lesson is
that involvement of the user community is a vital component of sustainability. While it has
struggled financially over the years, ActiveWorlds (the company responsible for Alphaworld, a
3D virtual world in operation since 1995) has managed to stay in business, and this may in
part be ascribed to the strong community of users that has inhabited the space and their
active involvement in its development. It does not seem coincidental that one of the very
few companies producing 3D virtual worlds to have survived since 1995 is also the one with
many users willing to undertake the expense of traveling to face-to-face annual "Reunion"
events. A final lesson is that, given the time necessary for collaborative efforts to develop, it
is usually better to start them sooner rather than later.
Our project has documented a number of significant issues impeding the preservation of
virtual worlds, including obsolescence of original platforms, difficulties in identification and
description of the objects of preservation, problematic rendition of games using common
preservation strategies of migration and emulation, and numerous difficulties resulting from
intellectual property and contract law. But we have also found several reasons to be hopeful
that these problems can be overcome. Existing packaging formats employed by the digital
preservation community, such as METS, OAI-ORE and BagIt, can be employed for
[Figure: A map of Alphaworld, an early, multi-user 3D world, in 1996]
packaging game materials for preservation, particularly when supplemented with ontologies
which delineate the data models set forth in the Functional Requirements for Bibliographic Records
Final Report and the Reference Model for an Open Archival Information System. While we
documented several problems with existing emulation technology?s ability to successfully
render older games, we also note that there have been successful efforts to modify emulators
to enable them to more perfectly reproduce the original experience of game play (Bogost,
2009). Interaction between the gaming community and game companies such as Microsoft
and Blizzard Entertainment has resulted in those companies re-crafting their licensing
regimes for their MMORPGs to provide machinima creators the legal permissions they need
to create new video works using these virtual worlds. Our own collaborations with the
gaming community and the Internet Archive have resulted in the creation of a repository of
moving image documentation of virtual worlds within the Internet Archive that will allow
user-created documentation of worlds to persist within a stable preservation repository.
While these initial efforts provide hopeful signs that the preservation of virtual worlds is, in
fact, a tractable problem, continuing work and collaboration by librarians, archivists,
curators, game developers, authors of electronic literature, and the communities they all
serve will be needed to insure the problem of preserving virtual worlds finds a permanent
solution. Libraries, archives and museums need to start long-overdue conversations about
how they might more effectively collaborate with each other on collection management,
description and preservation of virtual worlds. Game companies and the digital preservation
community need to collaborate on changes to intellectual property law that will allow game
developers to profit from their work while insuring their work is also known to future
generations. Cultural heritage organizations also need to actively engage the gaming
community. A good starting-point would be a discussion of how libraries, archives and
museums might provide stable, preservation infrastructures that the gaming community
might use to document the history and culture of virtual worlds and to collaboratively develop
tools to insure on-going access to those worlds.
In October of 2007, Linden Lab, our commercial collaborator on the Preserving Virtual
Worlds project, and IBM announced their intention to develop open standards to enable the
interoperability of 3D virtual worlds. By July of 2008, Linden Lab and IBM were able to
demonstrate their first successful case of interoperability, moving avatars back and forth
between a Second Life Preview Grid and an OpenSim virtual world server. Through their
collaboration, the two companies were able to finally realize the dream that sparked the
Living Worlds project over a decade ago, and through the collaboration of the cultural
heritage community, game companies and gamers, we will be able to insure that virtual
worlds remain living worlds in the future.
Bibliography
Aliaga-Buchenau, Ana-Isabel (2003). The ?Dangerous? Potential of Reading: Readers and the
Negotiation of Power in Nineteenth Century Narratives. New York: Routledge.
Avendon, Elliott M. & Brian Sutton-Smith (1971). The Study of Games. New York: J. Wiley.
Bainbridge, William Sims (July 27, 2007). ?The Scientific Research Potential of Virtual
Worlds.? Science 317(5837), pp. 472-476.
Bartle, Richard (1996). Hearts, Clubs, Diamonds, Spades: Players Who Suit MUDs. Retrieved
August 31, 2010 from http://www.mud.co.uk/richard/hcds.htm
Barton, Matt (2008). Dungeons & Desktops: The History of Computer Role-Playing Games.
Wellesley, MA: A. K. Peters.
Becker, et al. (1997). Dublin Core Element: Coverage. Retrieved Aug. 30, 2010 from
http://www.alexandria.ucsb.edu/public-documents/metadata/dc_coverage.html
Bethesda Softworks (Firm) (2008). Fallout 3. Rockville, MD: Author.
Boellstorff, Tom (2008). Coming of Age in Second Life: An Anthropologist Explores the Virtually
Human. Princeton, NJ: Princeton University Press.
Bogost, Ian (2007). Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: The
MIT Press.
Bogost, Ian ([2009]). A Television Simulator: CRT Emulation for the Atari VCS. Retrieved
August 10, 2010 from http://www.bogost.com/games/a_television_simulator.shtml.
Bohannon, J. (2008). ?Slaying Monsters for Science.? Science 320 (20 June), 1592.
Book, Betsy (2006). "What is a Virtual World?" Virtual Worlds Review. Retrieved August 10,
2010 from http://www.virtualworldsreview.com/info/whatis.shtml
Bryant, John (2002). The Fluid Text: A Theory of Revision for Book and Screen. Ann Arbor:
University of Michigan Press.
Burke, T. (2006). The History of Virtual Worlds (weblog comment), Terra Nova. Retrieved
March 1, 2009 from
http://terranova.blogs.com/terra_nova/2006/12/the_history_of_.html.
Candlelight Vigil. (2001). re: Candlelight Vigil on Luclin. Posts by Nirrian and Keeter, Sept. 12
2001. Retrieved Sept. 15, 2001 from http://www.everlore.com.
comScore (Jan. 28, 2009). Game On! Online Gaming Surges as Gamers Seek Out Free Alternatives in
Tight Economy. Retrieved July 17, 2010 from
http://www.comscore.com/layout/set/popup/Press_Events/Press_Releases/2009/1/Onli
ne_Gaming_Grows.
Consultative Committee for Space Data Systems (2002). Reference Model for an Open Archival
Information System (OAIS) (CCSDS 650.0-B-1). Washington, DC: CCSDS Secretariat, National
Aeronautics and Space Administration.
Consultative Committee for Space Data Systems (2008). XML Formatted Data Unit (XFDU)
Structure and Construction Rules. (CCSDS 661.0-B-1). Washington, DC: CCSDS Secretariat,
National Aeronautics and Space Administration.
Coyle, Karen (2004). "Future Considerations: The Functional Library Systems Record."
Library Hi Tech 22(2), pp. 166-174.
Dalenberg, Russel (2004). "Adventure Family Tree." Russel Dalenberg's Home Page [personal website]. Retrieved August 31, 2010 from http://www.io.com/~ged/www/family.html.
Dalenberg, Russel (2006). "Versions and Ports of Adventure known to exist, v. 2.6.7." Russel Dalenberg's Home Page [personal website]. Retrieved August 31, 2010 from
http://www.io.com/~ged/www/advelist.html.
Debord, Guy-Ernest (November 1956). "Théorie de la Dérive." Les Lèvres Nues 9.
Easter egg (June 24, 2010). The Doom Wiki. Retrieved August 31, 2010 from
http://doom.wikia.com/wiki/Easter_egg.
Easter Eggs and cheats (n.d.). ONLINE SIERRA - The Sierra Online Fan Site. Retrieved
August 31, 2010 from http://tawmis.com/onlinesierra/eggsandcheats.html.
Electronic Frontier Foundation (2010). Coders' Rights Project Reverse Engineering FAQ. San
Francisco, CA: Author.
Entertainment Software Association (2010). Essential Facts about the Computer and Video Game
Industry: 2010 Sales, Demographic and Usage Data. Retrieved Aug. 17, 2010 from
http://www.theesa.com/facts/pdfs/ESA_Essential_Facts_2010.PDF.
Fitz, Kate (2008). "Terms of service (Second Life) - History of changes." Second Life LawSpot.
Retrieved March 24, 2008 from http://www.lawspotonline.com/lawspot/termsofservice.jsp.
GamePolitics.com (Oct. 9, 2008). Report: Obama Ads in Burnout Paradise. Available at:
http://www.gamepolitics.com/2008/10/09/report-obama-ads-burnout-paradise.
Golbeck, Jennifer (2005). Computing and Applying Trust in Web-based Social Networks. Ph.D.
Dissertation. Retrieved Aug. 30, 2010 from
http://trust.mindswap.org/papers/GolbeckDissertation.pdf.
Grimes, Justin, Paul Jaeger and Kenneth Fleischmann (2008). "Obfuscatocracy: A Stakeholder Analysis of Governing Documents for Virtual Worlds," First Monday 13(9)
http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2153/2029.
Hayes, Christina (2008). "Changing the Rules of the Game: How Video Game Publishers Are Embracing User-Generated Derivative Works," Harvard Journal of Law & Technology
21(2): 567-587.
IFLA Study Group on the Functional Requirements for Bibliographic Records (1997).
Functional Requirements for Bibliographic Records: Final Report. The Hague: International
Federation of Library Associations and Institutions.
Ippolito, Jon (2008). "Death by Wall Label." In Christiane Paul (ed.), New Media in the White
Cube and Beyond. Berkeley: University of California Press.
Ippolito, Jon, et al. The Pool. Retrieved January 30, 2010 from
http://pool.newmedia.umaine.edu/index.php.
J. Paul Getty Trust and the College Art Association (2009). Categories for the Description of
Works of Art. Patricia Harpring and Murtha Baca (Eds.). Los Angeles, CA: Author.
Jerz, Dennis (2007). "Somewhere Nearby is Colossal Cave: Examining Will Crowther's Original 'Adventure' in Code and in Kentucky." DHQ 1
http://digitalhumanities.org/dhq/vol/001/2/000009/000009.html.
John, Jeremy L., et al. (2010). Digital Lives / Personal Digital Archives for the 21st Century: An
Initial Synthesis. March 3, 2010. Beta Version 0.2.
Juul, Jesper (2005). Half-Real: Video Games Between Real Rules and Fictional Worlds. Cambridge,
MA: MIT Press.
Kirschenbaum, Matthew, Richard Ovenden, and Gabriela Redwine (2010). Digital Forensics
and Born-Digital Content in Cultural Heritage Collections. Washington, DC: Council on Library
and Information Resources.
Kraus, Kari (2009). "Conjectural Criticism: Computing Past and Future Texts." Digital Humanities Quarterly 3 http://digitalhumanities.org/dhq/vol/3/4/000069/000069.html.
Kraus, Kari, Rachel Donahue, Megan Winget (2009). "Game Change: The Role of Amateur and Professional Cultures in Preserving Virtual Worlds," Digital Humanities Conference, College
Park, MD (June 22-29 2009). Retrieved Aug. 30, 2010 from
http://www.mith2.umd.edu/dh09/index.html%3Fpage_id=99.html.
Lessig, Lawrence (2008). Remix: Making Art and Commerce Thrive in the Hybrid Economy. New
York: Penguin Press.
Library of Congress (2002). Preserving our Digital Heritage: Plan for the National Digital Information
Infrastructure and Preservation Program. Washington, DC: Author.
Library of Congress (2007). Sustainability of Digital Formats: Planning for Library of Congress
Collections. Washington, DC: Author.
Library of Congress Network Development & MARC Standards Office (2008). MODS 3.3.
Washington, DC: Author.
Library of Congress Network Development & MARC Standards Office (2009).
MARCXML: The MARC 21 XML Schema. Washington, DC: Author.
Lowood, H. (2008). "Impotence and Agency: Computer Games as a Post-9/11 Battlefield,"
in Andreas Jahn-Sudmann & Ralf Stockmann (Eds.), Games Without Frontiers - War Without
Tears. Computer Games as a Sociocultural Phenomenon. London: Palgrave Macmillan: 78-86.
McDonough, Jerome, Matthew Kirschenbaum, Doug Reside, Neil Fraistat & Dennis Jerz
(Fall 2010). "Twisty Little Passages Almost All Alike: Applying the FRBR Model to a Classic Computer Game." Digital Humanities Quarterly 4(2).
McDonough, Jerome & Mona Jimenez (April 2007). "Video Preservation and Digital Reformatting: Pain and Possibility." Journal of Archival Organization 4(1/2), pp. 167-191.
Microsoft Corp. (2009). Xbox.com, Game Content Usage Rules. Retrieved August 31, 2010
from http://www.xbox.com/en-US/community/developer/rules.htm.
Monnens, Devin, Andrew Armstrong, Judd Ruggill, Ken McAllister, Zach Vowell & Rachel
Donahue (2009). Before It?s Too Late: A Digital Game Preservation White Paper. Henry Lowood
(Ed.). Mt. Royal, NJ: International Game Developers Association. Retrieved Aug. 11, 2010
from http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-
_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf.
Montfort, Nick (2003). Twisty Little Passages: An Approach to Interactive Fiction. Cambridge, MA:
MIT Press.
Montfort, Nick & Ian Bogost (2009). Racing the Beam: The Atari Video Computer System.
Cambridge, MA: MIT Press.
Montfort, Nick, et al. (2004). Mystery House Taken Over. Commissioned by New Radio and
Performing Arts, Inc. Retrieved January 30, 2010 from
http://www.turbulence.org/Works/mystery/.
NoSkill Memorial Site. Retrieved Dec. 1, 2004 from
http://www.doom2.net/noskill/index.htm.
Pearce-Moses, Richard (2005). A Glossary of Archival and Records Terminology. Chicago, IL: The
Society of American Archivists. Retrieved Aug. 11, 2010 from
http://www.archivists.org/glossary/index.asp.
Pinsky, R., & Synapse Software Corporation (1984). Mindwheel. Richmond, CA: Synapse
Software Corp. & Broderbund.
Ross, Seamus (2010). Personal Communication. Computer Forensics and Cultural Heritage
Symposium, University of Maryland.
Rossignol, Jim (2008). This Gaming Life: Travels in Three Cities. Ann Arbor, MI: University of
Michigan Press.
Rubin, Stephen (2010). "Intellectual Property Content, Law, and Practice." In Steve Rabin (Ed.), Introduction to Game Development, 2nd Edition. Course Technology / Charles River Media.
Ruecker, Stan, Stéfan Sinclair & Milena Radzikowska (2007). "Confidence, Visual Research and the Aesthetic Function." Partnership: the Canadian Journal of Library and Information Practice
and Research. 2(1).
Sampson, Walker (July 9, 2010). "Modeling a Computer on Omeka" [blog post]. iSchool
Digital Humanities Network Blog. Retrieved Aug. 31, 2010 from
http://www.ischooldh.org/network/.
Section 108 Study Group (2008). Executive Summary Report, Retrieved August 30, 2010 from
http://www.section108.gov/docs/Sec108ExecSum.pdf.
Siwek, Stephen E. (2007). Video Games in the 21st Century: Economic Contributions of the US
Entertainment Software Industry. Entertainment Software Association. Available at:
http://www.theesa.com/facts/pdfs/VideoGames21stCentury.pdf .
Visual Resources Association Data Standards Committee (2007). VRA Core 4.0. Kansas City,
MO: Visual Resources Association.
Williams, D. (2006). "Groups and Goblins: The Social and Civic Impact of Online Gaming."
Journal of Broadcasting and Electronic Media 50, pp. 651-670.
Woolf, Harry (Ed.) (March 1962). "The Conference on Science Manuscripts." ISIS 53(171),
pp. 1-158.
World Wide Web Consortium (27 June 2001). XML Linking Language (XLink) Version 1.0.
Steve DeRose, Eve Maler and David Orchard (Eds.). Cambridge, MA: Massachusetts Inst.
of Technology. Retrieved Aug. 31, 2010 from http://www.w3.org/TR/xlink/.
World Wide Web Consortium (10 February 2004). RDF/XML Syntax Specification (Revised).
Dave Beckett (Ed.). Cambridge, MA: Massachusetts Inst. of Technology. Retrieved Aug. 31,
2010 from http://www.w3.org/TR/REC-rdf-syntax/.
WorldofWarcraft.com, Letter to the Machinimators of the World. Retrieved May 12, 2008
from http://www.worldofwarcraft.com/community/machinima/letter.html.
Wu, Jia (Feb. 5, 2010). Global Video Game Market Forecast. Strategy Analytics.
Appendix A - Virtual Worlds that Died During the Grant
Virtual
World
Publisher Description DoB Date
Died
Cause of
Death
Links
1 Aradath/Drag
on's Gate
Mythic
Realms
(when
closed)
Gamer's
World, AUSI,
Genie, AOL,
Mythic
Realms
1984 10
February
2007
2 Championshi
p Manager
Online
Beautiful
Game
Studios/Ja
destone
Publisher:
Eidos.
Cancelled by
Square Enix
after it
acquired
Eidos.
2005 30 April
2010
http://www.cm-
online.com/main/acti
on/forum/thread/232
78?
?
http://www.computer
andvideogames.co
m/article.php?id=23
3969?
?
http://www.edge-
online.com/news/ch
ampionship-
manager-online-to-
close
3 Cities XL Monte
Cristo
Games
a Sim City-
esque game
with MMO
and social
networking
components.
9
Octo
ber
2009
8 March
2010
Low
subscriber
numbers are
leading Monte
Cristo to pull
back the
online
component
and focus on
its single-
player side.
http://www.virtualwo
rldsnews.com/2010/
01/cities-xl-to-close-
mmo-due-to-small-
subscriber-
base.html?
?
http://www.massivel
y.com/2010/01/27/ci
tiesxl-to-close-
multiplayer-features/
4 EA-Land
(AKA The
Sims Online)
Electronic
Arts
MMOG
based on the
video game
The Sims
17
Dece
mber
2002
4.35am
PST on 1
August
2008
?Widely seen
as a failed
attempt to port
the single-
player game
to an online,
multiplayer
environment.
Still, EA kept
TSO running,
even as it was
eclipsed by
other social
virtual worlds,
and it limped
along with a
small
http://www.stanford.
edu/group/htgg/cgi-
bin/drupal/?q=node/
239?
?
http://news.cnet.com
/8301-13772_3-
9931757-52.html?
?
http://terranova.blog
s.com/terra_nova/20
08/04/ea-to-close-
ea.html
? 135?
membership.?
5 Faketown Identity
Play
A free MMO
created in a
pixel art
aesthetic,
Faketown
features
drawing and
animation
tools, along
with photo
and mp3
hosting, as
well as
YouTube
videos.
23 June
2008
"The site and
underlying
technology
are currently
on auction to
the highest
bidder," said
Yeary via
email. "We
still believe in
the product
and we are
optimistic that
someone will
see the value
in re-
establishing
the
community."
http://www.zazzle.co
m/rip_faketown_tshi
rt-
2359978227933893
36?
?
http://www.virtualwo
rldsnews.com/2008/
07/faketown-
closes.html?
?
http://www.virtualwo
rldsnews.com/2008/
07/interview-
lesso.html
6 GoPets Zynga Virtual pet
site where
owners could
customize
3D pets.
4
Augu
st
2005
8
Novemb
er 2009
Chose to
focus on
Petville
instead?
http://en.wikipedia.or
g/wiki/GoPets?
?
http://www.virtualwo
rldsnews.com/2009/
12/this-week-zynga-
unleashed-its-latest-
social-game-petville-
-as-the-name-
implies-petville-is-
zyngas-entry-into-
the-time-test.html
7 Hellgate:
London
Flagship
Studios /
Hanbitsoft
Fantasy RPG 31
Octo
ber
2007
31
January
2009
Hanbitsoft has
acquired the
US/EU
territory rights
and will be re-
releasing back
to the US/EU
territories with
its sequel
Hellgate
London:
Resurrection.
?
?
http://en.wikipedia.or
g/wiki/Hellgate:_Lon
don
8 Lively Google Web-based
virtual
environment
integrated
with
YouTube and
Picasa.
8
July
2008
31
Decemb
er 2008
?But we've
also always
accepted that
when you take
these kinds of
risks not every
bet is going to
pay off.?
http://googleblog.blo
gspot.com/2008/11/l
ively-no-more.html?
?
http://www.lively.co
m/goodbye.html?
?
http://arstechnica.co
? 136?
m/old/content/2008/
11/google-to-shut-
down-lively-its-
interactive-3d-
world.ars
9 The Matrix
Online
Sony
Online
Entertain
ment
A MMOG
developed by
Monolith
Productions.
It was the
official
continuation
of the
storyline of
the Matrix
series of
films.
22
Marc
h
2005
31 July
2009
?doubts about
the game
circled within
the industry,
based on the
lacklustre
reception of
the later two
Matrix films
and an
overcrowded
MMORPG
market.?
http://en.wikip
edia.org/wiki/T
he_Matrix_On
line
http://thematrixonlin
e.station.sony.com/i
ndex.vm?
?
http://www.joystiq.co
m/2009/08/04/the-
matrix-onlines-final-
moments-
documented/?
?
http://thematrixonlin
e.station.sony.com/g
ame_basics.vm
10 Meridian 59 Near
Death
Studios
Sword and
Sorcery
combat RPG
15
Dece
mber
1995
Announc
ed 5
January
2010
"Unfortunately
, M59 never
really grew,"
Green admits
in his personal
blog. "We
were lucky
that we got a
lot of attention
for keeping an
old game alive
from the
press. We
also had a
small and
dedicated
group of fans
willing to keep
the game
alive. But, the
press didn't
really care
about our
attempts to
improve the
game, and the
fans weren't
interested in
trying to
attract new
players."
http://meridian59.ne
ardeathstudios.com/?
?
http://www.gamasutr
a.com/view/news/26
677/Meridian_59_D
eveloper_Near_Dea
th_Studios_Closes.p
hp?
?
http://www.gamepro.
com/article/news/21
3452/near-death-
gives-up-the-ghost-
after-nine-years-of-
meridian-59/
? 137?
11 Metaplace Areae The
Metaplace
website
promises that
the platform
will integrate
smoothly into
our current
web
standards,
allowing for
integration of
Metaplace
elements into
websites,
RSS feeders,
and more.
1
January
2010
It just hasn?t
gotten traction
http://www.raphkost
er.com/2009/12/21/
metaplace-com-
closing/?
?
http://www.massivel
y.com/2008/02/25/g
dc08-raph-kosters-
reinventing-mmos-a-
metaplace-
antemortem/?
?
http://www.metaplac
eveterans.com/foru
m/viewforum.php?f=
3&sid=3d45064c8a0
c864cac00a0cc20fa
1812
12 Phantasy
Star Online
Sega On
Dreamcast.
21
Nove
mber
2000
31 March
2007
13 Tabula Rasa NCsoft A MMORPG
developed by
Destination
Games about
humanity's
last stand
against a
group of
aliens called
"The Bane".
http://eu.play
nc.com/eu/ga
mes/overvie
w/tabula_ras
a/
2
Nove
mber
200
7
28
February
2009
"Unfortunately
, the fact is
that the game
hasn't
performed as
expected."?
?
http://www.ga
masutra.com/
php-
bin/news_inde
x.php?story=2
1224
http://eu.plaync.com
/eu/games/info/tabul
a_rasa/?
?
http://www.gamasutr
a.com/php-
bin/news_index.php
?story=22510?
?
http://www.gamepoli
tics.com/2009/05/06
/richard-garriott-
sues-nc-soft-over-
millions-stock-
options?
?
http://arstechnica.co
m/gaming/news/200
9/03/does-a-game-
have-to-fail-to-have-
an-ending-tabula-
rasa.ars?
?
http://en.wikipedia.or
g/wiki/Tabula_Rasa
_%28video_game%
29
14 There.com Makena
Technolog
ies
A 3D online
virtual world.
From home
page: "...an
9
Janu
ary
2003
9 March
2010
Recession: "...
at the end of
the day, we
can't cure the
http://www.prod.ther
e.com/info/announc
ement?
?
? 138?
online
getaway
where you
can hang out
with your
friends and
meet new
ones..."
recession,
and at some
point we have
to stop writing
checks to
keep the
world open,"
says Wilson.
"There's
nothing more
we would like
to avoid this,
but There is a
business, and
a business
that can't
support itself
doesn't work."
http://news.cnet.com
/8301-13772_3-
10462627-52.html?
?
http://www.virtualwo
rldsnews.com/2010/
03/therecom-
shutting-down-on-
march-9th11.html
15 Virtual Magic
Kingdom
Walt
Disney
Parks and
Resorts
Online
MMOG
created as
part of their
50th
anniversary
with areas
and games
based on
real park
scenery and
attractions
open to the
public daily
between
7:00am?
10:00pm
PST.
23
May
2005
21 May
2008
Created for
50th Ann and
promotion
ended long
after originally
planned.
http://www.intercot.c
om/discussion/show
thread.php?t=13054
8?
?
http://familyinternet.
about.com/od/websit
es/tp/most-
influential-virtual-
worlds.htm?
?
http://www.worldsin
motion.biz/2008/04/
disney_closes_gate
s_to_virtual.php

16. Vivaty (Vivaty)
Description: Vivaty is a next generation 3D virtual world in the browser where you can meet new friends and express yourself by dressing up your avatar and personalizing your own virtual scene for free.
Opened: March 2008
Closed: 16 April 2010
Reason: "Vivaty.com is a rather expensive site to run, much more than a regular web site, and Vivaty the company has been running out of money for some time." (Jay Weber, Co-founder and Chief Technical Officer)
Sources:
http://blog.vivaty.com/2010/03/31/vivaty-shutdown-party/
http://www.raphkoster.com/2010/03/31/vivaty-is-closing-down/
http://games.venturebeat.com/2010/03/31/vivaty-shuts-down-virtual-world/
http://blog.vivaty.com/2008/03/31/welcome-to-vivaty/

17. Weblin (Zweitgeist)
Description: Software that turns the internet into a virtual world where you can chat with people from all over the world.
Closure announced: 7 August 2009
Reason: "Media reports attribute the closure to a lack of funds."
Sources:
http://www.virtualworldsnews.com/2009/08/weblin-ceases-operations.html
http://www.raphkoster.com/2009/08/08/weblins-closing/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+RaphsWebsite+%28Raph%27s+Website%29

Appendix B - Media Coverage of PVW

Kotaku October 2007
"The Library of Congress Loves Video Games"
http://kotaku.com/313328/the-library-of-congress-loves-video-games

Kotaku November 2008
"On the LOC Preserving Virtual Worlds Project"
http://kotaku.com/5096782/on-the-loc-preserving-virtual-worlds-project

Special Collections 2.0: New Technologies for Rare Books, Manuscripts and Archival Collections 2009
On p. 14: "Creative born-digital works in Second Life are also earmarked for long term preservation; the Library of Congress recently provided a grant to several universities, spearheaded by the University of Illinois at Urbana-Champaign, to develop digital preservation standards for Second Life, as part of their Preserving Creative America program."

Crispy Gamer January 2009
"Saved Games: Preserving the New TV"
http://www.crispygamer.com/features/2009-04-08/saved-games-preserving-the-new-tv.aspx

MetaverseTV April 2009
http://blip.tv/file/2003603

Blog post: Crispy Gamer April 2009
http://www.crispygamer.com/features/2009-04-08/saved-games-preserving-the-new-tv.aspx

Salon April 2009
"How to Make Machinima Without Getting Sued Blind"
http://www.salon.com/technology/the_gigaom_network/online_video/index.html?section=online_video&blog=/tech/giga_om/online_video/2009/04/28/how_to_make_machinima_without_getting_sued_blind

Kotaku April 2009
"What AJ Learned About Machinima Law Today"
http://kotaku.com/5226851/what-aj-learned-about-machinima-law-today

IEEE Spectrum April 2009
"Stanford Boots Up Machinima"
http://spectrum.ieee.org/sandbox/consumer-electronics/gaming/stanford_boots_up_machinima

Future Tense, American Public Media April 2009
"Machinima lowers barrier of entry to filmmaking, but raises legal questions"
http://www.publicradio.org/columns/futuretense/2009/04/machinima-lower.html

Blog post: Pixel Vixen May 2009
http://www.pixelvixen707.com/?p=1743/#content

Stanford Magazine November/December 2009
"Saving Worlds: Preserving the digital and virtual"
http://www.stanfordalumni.org/news/magazine/2008/novdec/farm/news/virtual.html

Voice of America with Art Chimes March 2010
http://www1.voanews.com/english/news/science-technology/Our-World--13-March-2010-87482377.html at ~15:53

Atlantic Monthly March 2010
"Pac Rat"
http://www.theatlantic.com/doc/201003/archiving-video-games

Kojo Nnamdi Show March 2010
http://thekojonnamdishow.org/shows/2010-03-30/preserving-video-games-and-virtual-worlds

University of Maryland Diamondback April 2010
"A Second Life"
http://www.diamondbackonline.com/news/a-second-life-1.1434797

Baltimore Public Radio May 2010
http://stream.publicbroadcasting.net/production/mp3/wypr/local-wypr-899636.mp3
http://mdmorn.wordpress.com/2010/05/03/53102-the-perplexing-task-of-archiving-virtual-worlds/

Ars Technica June 2010
"Saving Virtual Worlds from Extinction"
http://arstechnica.com/gaming/news/2010/06/the-art-of-archiving-virtual-worlds.ars

WILL-AM 580's Sidetrack July 2010
http://will.illinois.edu/sidetrack/program/sidetrackjul10/

USA Today August 2010
"Video Game Hall of Fame Inducting Pac-Man and Pals"
http://www.usatoday.com/tech/gaming/2010-08-05-gamearchive05_ST_N.htm

Christian Science Monitor August 2010
"Video game museum gives arcade classics extra lives"
http://www.csmonitor.com/Innovation/Tech/2010/0805/Video-game-museum-gives-arcade-classics-extra-lives

This Way Up, Radio New Zealand August 2010
http://www.radionz.co.nz/national/programmes/thiswayup

ABC News Technology site August 2010
"Hall of Fame playing up video games"
http://abcnews.go.com/Technology/hall-fame-playing-video-games/story?id=11328263&page=1
Appendix C - Preserving Virtual Worlds Ontology

(The ontology's class diagrams are not reproduced here; the class definitions that accompany them are listed below.)

- an abstract notion or idea
- a material thing
- an action or occurrence
- the physical embodiment of an expression of a work
- the intellectual or artistic realization of a work
- a location; encompasses a comprehensive range of locations: terrestrial and extra-terrestrial; historical and contemporary; geographic features and geo-political jurisdictions
- an individual; encompasses individuals that are deceased as well as those that are living
- a distinct intellectual or artistic creation
- an organization or group of individuals and/or organizations acting as a unit
- the class to which the work belongs (e.g., novel, play, poem, essay, biography, symphony, concerto, sonata, map, drawing, painting, photograph, etc.)
- a word, phrase, or group of characters naming an intellectual or artistic creation
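The definitions above enumerate the kinds of entities the ontology distinguishes (works, their realizations and embodiments, agents, places, and so on). Purely as an illustrative sketch, and not the project's actual encoding, the fragment below shows how definitions of this sort can be recorded as annotated OWL classes with Python and rdflib; the namespace URI and the class names Work, Expression, and Manifestation are hypothetical placeholders rather than identifiers taken from the PVW ontology itself.

    # Illustrative sketch only: record entity definitions like those above as
    # annotated OWL classes. The namespace and class names are placeholders,
    # not the identifiers used by the Preserving Virtual Worlds ontology.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    PVW = Namespace("http://example.org/pvw#")  # hypothetical namespace

    g = Graph()
    g.bind("pvw", PVW)

    definitions = {
        "Work": "a distinct intellectual or artistic creation",
        "Expression": "the intellectual or artistic realization of a work",
        "Manifestation": "the physical embodiment of an expression of a work",
    }

    for name, gloss in definitions.items():
        cls = PVW[name]
        g.add((cls, RDF.type, OWL.Class))           # declare an OWL class
        g.add((cls, RDFS.label, Literal(name)))     # human-readable name
        g.add((cls, RDFS.comment, Literal(gloss)))  # definition kept as annotation

    print(g.serialize(format="turtle"))

Keeping each definition as an rdfs:comment on its class makes the vocabulary self-documenting when it is serialized (here to Turtle) or loaded into an ontology editor.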
Appendix D - Multi-Institutional Collaboration: Lessons Learned
When partners are slow in providing contributions, enlist a strike force of elite otter
mercenaries for encouragement.
Appendix E - Publications
Forthcoming
Kraus, K. "'A Counter-Friction to the Machine': What Game Scholars, Librarians, and Archivists Can Learn from Machinimists about User Activism." Journal of Visual Culture (special Machinima-themed issue guest-edited by the Stanford Humanities Lab), forthcoming spring 2011.

Lowood, H. & M. Nitsche. The Machinima Reader. Cambridge, Mass.: MIT Press, forthcoming 2011. (Includes "Video Capture: Machinima, Documentation, and the History of Virtual Worlds," by Lowood.)

Lowood, H. "Memento Mundi: Are Virtual Worlds History?" In: Digital Media: Interdisciplinary Perspectives on History, Preservation, and Ontology, eds. Megan Winget and William Aspray. Scarecrow Press, forthcoming 2011.

Lowood, H. "Perfect Capture: Three Takes on Replay, Machinima and the History of Virtual Worlds." Journal of Visual Culture (special Machinima-themed issue guest-edited by the Stanford Humanities Lab), forthcoming spring 2011.

McDonough, J. "Packaging Videogames for Long-Term Preservation: Integrating FRBR and the OAIS Reference Model." Journal of the American Society for Information Science and Technology. Forthcoming.

2010
Kraus, K. (March 2010). Quoted in "Pac Rat: The Fight to Preserve Old Video Games from Bit Rot, Obsolescence, and Cultural Oblivion." Atlantic Magazine.

Kraus, K. & Donahue, R. (May 3, 2010). "The Perplexing Task of Archiving Virtual Worlds." Interview. Maryland Morning Show with Sheilah Kast, broadcast on Baltimore's NPR member station, WYPR 88.1 FM. http://mdmorn.wordpress.com/2010/05/03/53102-the-perplexing-task-of-archiving-virtual-worlds/

Lowood, H. (2010). "The Future of Virtual Worlds" (with William Sims Bainbridge, Wayne Lutters and Diana Rhoten), in: Online Worlds: The Convergence of the Real and the Virtual, ed. William Sims Bainbridge (London: Springer, 2010): 289-302.
McDonough, J., Kirschenbaum, M., Reside, D., Fraistat, N. & Jerz, D. (Fall 2010). "Twisty Little Passages Almost All Alike: Applying the FRBR Model to a Classic Computer Game." Digital Humanities Quarterly 4(2). Retrieved Sept. 13, 2010 from http://www.digitalhumanities.org/dhq/vol/4/2/000089/000089.html.

2009
Kirschenbaum, M. & Farr, E., et al. (October 2009). "Digital Materiality: Preserving Access to Computers as Complete Environments." iPres 2009.

Kirschenbaum, M., Tabbi, J., Grigar, D., Tata, M.A., Heckman, D., Gibbs, A. & Angel, M. (December 2009). "E-Ject: On the Ephemeral Nature, Genres, and Criticism of Electronic Objects." Digital Arts and Culture 2009.

Lowood, H., ed. (2009). Before It's Too Late: A Game Preservation White Paper (written by the Game Preservation SIG of the International Game Developers Association). Available at: http://wiki.igda.org/Game_Preservation_SIG/White_Paper/Before_It%27s_Too_Late:_A_Digital_Game_Preservation_White_Paper.

Lowood, H., McDonough, J., Frick, C. & Renear, A. (2009). "Digital Curation of Humanistic, Multimedia Materials: Lessons Learned and Future Directions" (panel abstract), Proceedings of DigCCurr 2009: Digital Curation - Practices, Promise & Prospects, ed. Helen R. Tibbo, et al. (Chapel Hill: School of Library and Information Science, Univ. of North Carolina, 2009; distributed by Lulu Press): 42-43.

Lowood, H. (2009). "Game Counter," in Fiona Candlin and Raiford Guins (Eds.), The Object Reader. Abingdon, Eng., and New York: Routledge: 466-69.

Lowood, H. (2009). "Players are Artists," in Debora Ferrari and Luca Traini (Eds.), The Art of Games: Nuove Frontiere tra Gioco e Bellezza. Aosta: Regione Autonoma Valle d'Aosta: 190-93. Also in Italian trans. as "I giocatore come artisti," pp. 194-97.

Lowood, H. (Summer 2009). "Putting Politics Into Play: Three Recent Books on Virtual Worlds," American Journal of Play 2(1).

Lowood, H. (July-Sept. 2009). "Video Games in Computer Space: The Complex History of Pong," in IEEE Annals of the History of Computing: 5-19.

Lowood, H. (2009). "Warcraft Adventures: Texts, Replay and Machinima in a Game-Based Story World," in Pat Harrigan and Noah Wardrip-Fruin (Eds.), Third Person: Authoring and Exploring Vast Narratives. Cambridge: MIT Press: 407-27.

Lowood, H., guest editor, July-Sept. 2009 issue of IEEE Annals of the History of Computing.

McDonough, J. (2009). "How will we preserve virtual worlds?" in Bart De Nil & Jeroen Walterus (Eds.), Nieuwe Perspectieven voor Digitaal Erfgoed. Brussel: Pharo Publishing.
McDonough, J., Kirschenbaum, M., Reside, D., Fraistat, N., Jerz, D., Lowood, H., Kraus, K., Donahue, R. & Winget, M. (2009). "Preserving Virtual Worlds: Models & Community" (panel & paper abstracts). In Digital Humanities 2009. Conference Abstracts. University of Maryland, College Park. June 22-25, 2009, pp. 22-29. Available at: http://www.mith2.umd.edu/dh09/wp-content/uploads/dh09_conferencepreceedings_final.pdf.

2008
Lowood, H. (2008). "La cultura del replay. Performance, spettatorialità, gameplay," in Schermi interattivi. Il cinema nei videogiochi, ed. Matteo Bittanti. Rome: Meltemi: 69-94.

Lowood, H. (Spring 2008). "Game Capture: The Machinima Archive and the History of Digital Games," in Mediascape: Journal of Cinema and Media Studies.

Lowood, H. (2008). "Replay Culture: Performance and Spectatorship in Gameplay," in Carlos A. Scolari (Ed.), L'homo videoludens: Videojocs, textualitat i narrativa interactiva. Vic: Eumo Editorial: 167-87.
Appendix F - Museogames Exhibit at the Musée des arts et métiers
Implications for Collecting Repositories
The Museogames exhibit, on display from 22 June to 7 November 2010 at the Musée des Arts et Métiers in Paris, showcases the history of videogames from the classic arcade games of the 1980s through the sixth-generation console games that became available at the turn of the 21st century. Filling three rooms of the museum's lower level, the exhibit includes home consoles, arcade cabinets, peripherals, miscellaneous game culture documents and artifacts, and recorded interviews with game developers and scholars. The exhibit's main attraction is the game table in the second room, where visitors are invited to sit down and experience retro games and platforms spanning nearly three decades, from the Magnavox Odyssey, the first home gaming console, to Sony's PlayStation 2, the world's best-selling console. Real-time video capture of visitors playing Pac-Man, Super Mario Bros, and other vintage games is projected onto the walls of the exhibition space.
Left: A view of the game table. Right: Magnavox Odyssey game console. Photos courtesy of Matthew
Kirschenbaum. July 2010.
For collecting institutions such as the Library of Congress, Museogames serves as an object lesson in managing the underlying tensions between the preservation and access functions of archives, particularly where interactive media are concerned. While many of the works in the exhibition are drawn from the museum's permanent collections, the Musée des Arts et Métiers also partnered with MO5.com, a non-profit French association dedicated to the preservation and public dissemination of digital culture and videogames. Founded in 1996, the association comprises a diverse set of stakeholders, including private collectors, hobbyists, journalists, historians, researchers, and gamers. The organization boasts a collection of some 30,000 objects, which run the gamut from antiquated computers, peripheral devices, and other hardware to software, source code, technical manuals, and rare press kits.
In its information leaflets the museum takes pains to distinguish its video game collection from that of MO5.com along two axes: selection and use.
• Selection - "Toward the end of the 90s, a few private collectors took it upon themselves to safeguard videogame heritage. Their motivation was essentially sentimental and the objective was to keep everything. The approach adopted by the museum is somewhat different. All pieces that join its collections represent a landmark in the history of videogames. The objects acquired by the Musée des Arts et Métiers bear witness to the technical developments of their time; they have played an important role in our economic, social, and cultural history and represent an essential link between the inventions that preceded them and those that were to come after them. This policy involves selectivity when acquiring pieces for the museum" ("Two Collections"; emphasis added).
• Use - "The objective is [for the museum] to preserve [its] collections for future generations. For this reason, unlike the pieces belonging to private collections and which do not have this perenniality requirement, handling of the video games is strictly limited. This difference is illustrated throughout this exhibition with the MO5.com collection featuring videogames that can be played, and the collection that belongs to the Musée des arts et métiers, which is protected for preservation purposes" ("Two Collections"; emphases added).
These distinctions are visually reinforced throughout the exhibit, with game consoles from the museum's collection sequestered behind wire cages and those from the MO5.com collection set up in the open as game stations.11 One obvious consequence of this functional division of collections is that the role of the museum visitor changes from that of passive observer to active participant over the course of the exhibition. Likewise, the status of the game artifact itself shifts from one room to the next, from antiquarian relic to source of hands-on entertainment.
Contrasting experiences: on the left Kraus views the museum's collection of game consoles displayed within wireframe cages; on the right, kids play with classic console games lined up in the second gallery room. Photos courtesy of Matthew Kirschenbaum. July 2010.
____________
11 While we cannot categorically state that every item behind the cages is drawn from the museum's permanent collection rather than MO5.com, this inference is supported by the museum's information leaflet, quoted above. All the games on the game table are from the MO5.com collection.
The example of Museogames prompts two observations with special relevance to collecting institutions:
• In an age of interactive media, archivists and curators need a richer typology of user behaviors with which to work. Access models to cultural heritage have been overly determined by the stewardship of paper documents, with the researcher variously understood as reader, transcriber, or viewer of content. In the case of videogames and variable media art, however, a more accurate depiction of the user might be that of player or interactor.
• In a user-centered milieu, it is especially important that collecting institutions solicit the input and consider the needs and expectations of the user community when developing selection and appraisal models. These expectations need to be thoughtfully and creatively balanced against other archival responsibilities, notably preservation.
With these observations in mind, we propose the following ways MO5.com might serve as a resource and model for collecting institutions:
• An analysis of the scope and diversity of the MO5.com holdings would allow a repository to get a purchase on the publishing, documentary and artifactual universe of videogame production and reception: from concept art and storyboards to marketing materials to vintage software and hardware (and everything in between). Determining the full spectrum of products of videogame culture would pave the way for the archive to adopt a systematic approach to selection and appraisal rather than be disadvantaged by a partial view of the collecting landscape.
• The Museogames exhibition demonstrates the value of establishing strong partnerships with private collectors in order to design interactive game experiences for the public without compromising the preservation mission of the collecting institution. In the absence of a North American counterpart to MO5.com, the Library of Congress might consider facilitating the formation of one, perhaps under the auspices of (or in cooperation with) the IGDA Game Preservation SIG.
• Finally, a collecting institution such as the Library of Congress might consider internalizing some of the practices of MO5.com by adopting a dual-track strategy for collection development. This strategy would entail acquiring and maintaining both preservation and access copies of software and hardware (such as early microcomputers and home game consoles) rather than following the example of the Musée des Arts et Métiers in assuming a noncustodial role with respect to playable versions of games. Such a collection model has been adopted by the International Center for the History of Electronic Games (ICHEG) at the Strong National Museum of Play. As described in the "Analysis of Hardware Preservation" section of this report, the ICHEG collects multiple copies of the same artifact, allowing the museum to source hardware components for installations and exhibits from duplicates in its own collection.
Bibliography
"Two Collections Brought Together for Museogames," Paris: Musée des Arts et Métiers, 22 June-7 November 2010. Print.
Museogames exhibition, Paris: Musée des Arts et Métiers, 22 June-7 November 2010. http://museogames.com/
Appendix G - Second Life Deed of Gift
LICENSE AGREEMENT
This is a license agreement between DONOR NAME (Donor) and The Board of Trustees
of Leland Stanford Junior University (Stanford) regarding the preservation of virtual
property in the Stanford University Libraries' Preserving Virtual Worlds Second Life Archive
(Archive).
Donor is the creator and copyright owner of material developed for Second Life, more
particularly described to the right of the webpage.
Donor hereby grants Stanford a non-exclusive, royalty-free, irrevocable, worldwide license
to preserve the Object(s), and to display and distribute the Object(s) for educational and/or
not-for-profit purposes in all media now known or hereafter created, including but not
limited to print, audio, electronic, video, optical disk, photographic, digital, and film, subject
to the following.
1. Donor warrants and represents that s/he is the exclusive creator of and sole
rightsholder for the Object(s) and has the full authority to enter into this Agreement.
Donor agrees to indemnify the Board of Trustees of the Leland Stanford Junior
University against any claims (including attorneys' fees and court costs) made against
the University relating to this Agreement.
2. Stanford is licensed to use and manipulate the Object(s) in all ways necessary to
preserve the work, including changing the format of the Object(s).
3. Stanford may display and distribute the Object(s) on its own website, and may also
share the Object(s) with partner organizations for purposes of display and
distribution.
4. Except as set forth herein, copyrights held by the Donor in the Object(s) will remain
with the Donor, including the rights to sell, assign or otherwise exploit or dispose of
copyrights, subject to this irrevocable license agreement. Donor may commercially
exploit the copyrighted materials by any means now known or hereafter known,
including but not limited to print, motion picture, audio, audio-visual, visual,
electronic, film, internet, or similar means of exploitation and dissemination.
5. Stanford University Libraries is under no obligation to accept the Object(s) or to
maintain the Object(s) if accepted. Stanford reserves the right to refuse to accession
Objects, and may delete Objects from the Archive at any time for any reason.
Stanford is under no obligation to maintain the Object(s) or the Archive. Stanford is
in no way liable for third party use of the Object(s).
6. Stanford will provide access to the Object(s) in accordance with its own policies and
procedures. The Object(s) will not become accessible to researchers until archival
processing has been completed, and the time required for such processing is at
Stanford's discretion.
7. Optional: Stanford agrees to restrict access to Object(s) as requested by the Donor
below, and will not make the Object(s) available on publicly accessible media until
that time. Stanford will, however, preserve the Object(s) in the Archive during that
time.
8. This Agreement will be subject to the laws of California and any action arising out of
this Agreement will be exclusively heard in a court of competent jurisdiction in Santa
Clara County, California.
Appendix H - Gaming Websites Identified in PVW Project
? = Archive-It seed has been made
Virtual Worlds, MMOs, Game Worlds in general
? ?http://terranova.blogs.com/
Terra Nova : Collaborative blog on online communities and worlds.
? ?http://my.binhost.com/lists/listinfo/mud-dev2
MUD-Dev2 Info Page : discussion list of MUD system design, development and
implementation
? ?http://my.binhost.com/pipermail/mud-dev2/
MUD-Dev2 archives
? ?http://www.raphkoster.com/2007/02/02/full-mud-dev-archive-for-download/
Raph's Website : MUD-Dev1 archive can be downloaded here
? ?http://mmorpg.qj.net/
QJ.net : MMORPG blog "24/7 Coverage of Your Favorite MMORPG News"
? ?http://www.fudco.com/habitat/
Habitat Chronicles : Chip Morningstar and Randy Farmer
? ?http://www.dsgames.net/index.htm
DSGames history, including Qlink.net and Habitat screenshots
? ?http://www.guildcafe.com/
Guild Cafe : a social network site with blogs and forums, supporting player guilds and
clans of many MMORPGs and online games
Second Life:
? ?http://secondlife.com
Second Life : official website
? http://mariogerosa.blogspot.com/
Played in Italy: Curation of History of Second Life
? http://second-life.com
Second Life : tips, tricks, news, and everything you want to know
? http://blog.secondlife.com/
Official Linden Blog
? http://secondlifeherald.com
The Second Life Herald : presents the game's events, news, and author's tips
? http://secondlife.reuters.com
Reuters Second Life News Portal: company portal
? http://lindenlab.com
Linden Lab : official website of Linden Lab, makers of Second Life
? http://secondlifeinsider.com
Second Life Insider : weblog with news and tips about both the players and the
game
? http://valleywag.com/tech/second-life/
Valleywag : gossip rag on Second Life
? http://secondlifegrid.net/programs
Second Life Grid : open source platform used for 3D virtual world development in
business, education, and nonprofit organizations
? http://forums.secondlife.com
Second Life Forums
? http://sundancechannel.com/secondlife
Sundance Channel: Blogs: Second Life : virtual screening room
? http://edition.cnn.com/2007/TECH/11/12/second.life.irpt/
CNN Enters the Virtual World of Second Life : CNN to open i-report hub in
Second Life
? http://secondlifevideo.com
Second Life Video
? http://wiki.secondlife.com/wiki/Video_Tutorials
Second Life Video Tutorials : list of video tutorials
? http://crn.com/it-channel/205101362
Accenture Scientist Predicts Death of Second Life
? http://slurl.com
SLurl : location-based linking in Second Life
? http://somethingawful.com/d/second-life-safari/
Second Life Safari
? http://slprofiles.com
Second Life Profiles : online community for residents to meet
? http://getafirstlife.com
Get A First Life: one page satire of Second Life
? http://abc.net.au/services/secondlife
ABC Online: Second Life : ABC island in Second Life
? http://dell.com/html/global/topics/sl/index.html
Dell Island In Second Life
? http://wiki.secondlife.com/wiki/Main_Page
Second Life Wiki
? http://slexchange.com
SL Exchange : Second Life commerce
? ?http://infoisland.org
Second Life Library 2.0
? http://usd.auctions.secondlife.com
Second Life Land Auctions
? http://slcc2007.wordpress.com
The Official SLCC Blog
? http://reperes-secondlife.com
Reperes : first market research institute on Second Life
? http://alpha.cbs.com/primetime/csi_ny/second_life/
Virtual CSI NY : CSI NY on Second Life
? http://secondlla.googlepages.com
Second Life Liberation Army
? http://slnn.com
Second Life News Network
? http://mmorpg.qj.net/category/Second-Life/cid/850
MMORPG News : MMORPG news on Second Life
? http://roughtype.com/archives/2006/12/avatars_consume.php
Rough Type : Nicholas Carr's blog
? http://secondlife.vodafone.com
Vodafone
? http://freshtakes.typepad.com/sl_communicators
Business Communicators on Second Life
? http://slgames.wordpress.com/
Second Life Games : blog
? http://sleducation.wikispaces.com
Second Life in Education : wiki
? http://teen.secondlife.com/whatis
Teen Second Life
? http://slvid.com
SLVid : share Second Life videos
? http://nonprofitcommons.org
Nonprofits in Second Life
? http://ibm.com/developerworks/spaces/secondlife
Second Life
? http://flickr.com/groups/secondlife
Flickr: Second Life : upload pictures from Second Life
? http://2lifeblog.com
Second Life Scripts
? http://massively.com/category/second-life
Posts From The Second Life Category At Massively : the daily news about
MMOs
? http://crunchbase.com/company/secondlife
Crunchbase Company Profile
? http://secondlife.intellagirl.com
Second Life Education Research : blog
? http://sldrama.com
SecondLife Drama
? http://web.ics.purdue.edu/~mpepper/slbib
Second Life Annotated Bibliography
? http://secondlifelibrary.blogspot.com
Second Life library 2.0
? http://slpodcast.com
Second Life Podcast : the podcast and blog about Second Life
? http://sl-art-news.blogspot.com
Second Life Art News
? http://secondlifenotes.com
Second Life Notes: podcast dedicated to exploring the music, art, and culture of
Second Life
? http://jira.secondlife.com/secure/Dashboard.jspa
Second Life Issues
? http://secondlife.blogs.cnn.com
Second Life I-Reports : news of a virtual world
? http://slfuturesalon.blogs.com/
Second Life Future Salon
? http://secondlifeupdate.com
Second Life Update
? http://slcn.tv
Second Life Cable Network
? http://second411.com
Second 411 : Second Life search engine
? http://iste.org/Content/NavigationMenu/Membership/Member_Networking/ISTE_Second_Life.htm
Iste Second Life: Iste island on Second Life
? http://foureyedmonsters.com/secondlife
Four Eyed Monsters : blog
? http://taotakashi.wordpress.com
Tao's Thoughts On Second Life : blog
? http://molotovalva.com
My Second Life
? http://bloghud.com
BlogHUD : blogging system for the residents of Second Life
? http://rootscamp.org/RootsCampSL
Roots Camp
? http://secondlife.crowneplaza.com
Crowne Plaza's Second Life Meeting Room Reservation
? http://apps.facebook.com/second-life
Second Life Facebook Application
? http://slballet.org
Second Life Ballet
? http://lists.secondlife.com/cgi-bin/mailman/listinfo
Second Life Mailing Lists
? http://slcreativity.org/blog
SL Creativity
? http://slleftunity.blogspot.com
Second Life Left Unity
? http://secondliferesearch.blogspot.com
Second Life Research : blog about research on Second Life
? http://slambling.blogspot.com
Ambling In Second Life : blog
? http://secondlifehowto.com
Second Life HowTo: free guide to content creation in Second Life
? http://virtualaloft.com
Virtual Aloft
? http://sltree.com
Second Life Tree : the metaverse directory
? http://secondlife.wikia.com
Second Life Wikia
? http://www.myfirstsecondlife.com
My First Second Life
? http://secondlife.podcast.com
Second Life Podcasts
? http://secondlifeproject.com
Second Life Project
? http://bethssecondlife.blogspot.com
Beth's Second Life : a teacher's blog
? http://libsecondlife.org
Libsecondlife : project trying to understand the game from a technical aspect
? http://lindenlifestyles.com
Linden Lifestyles: unofficial Second Life fashion and shopping blog
? http://mynameiskate.typepad.com/secondlife
Kate's So-Called Second Life : blog
? http://slbusinesscommunicators.pbwiki.com
Second Life Business Communicators Wiki : a collaborative resource for anyone
interested in business and communications applications of Second Life
? http://sltutorials.net
SLTutorials.net
? http://blog.secondstyle.com
Second Style Fashionista : fashionista blog for Second Life
? http://mercedes-benz-secondlife-infos.com
Mercedes-Benz : Mercedes-Benz in Second Life
? http://siobhantaylor.wordpress.com
Sio's Second Life : blog
? http://nwn.blogs.com/nwn/2007/12/second-life-as.html
New World Notes : Wagner James Au's blog
? http://sweden.se/templates/cs/CommonPage____18195.aspx
Second Life: The House of Sweden : the official gateway to Sweden
? http://storyofmysecondlife.com
The Story of My "Second Life" : educator's grant-funded exploration into Second Life
? http://slfashionpolice.wordpress.com
Second Life Fashion Police : fashion faux pas of Second Life
? http://slhandbook.com
Second Life handbook : guide to people, places, and things in Second Life
? http://esl-secondlife.blogspot.com
Secondlifeenglish.com Blog
? http://slsolutions.org
Second Life Solutions : home of the metaverse stock exchange
? http://slsailing.org
Second Life Sailing Federation
? http://aeneaideas.wordpress.com
Aenea?s Second Life : blog
? http://caterin.wordpress.com
Girl Meets Second Life : Caterin's Second Life blog
? http://www.secondlifeize.com
Second Life Linden Dollars
? http://secondlifesearch.ourtoolbar.com
Second Life Search Toolbar
? http://kellysecondlife.com/eprise/main/web/us/customers/secondlife/index.html
Kelly's second life
? http://secondlifeenglish.com
Do You Speak SLEnglish?
? http://slpulse.com
Second Life Pulse : second life press releases, announcements, and events
? http://gridgrind.com
The Second Life Grid Grind : Second Life news, info, gossip and gab
? http://secondstyle.com
Second Style Magazine
? http://secondlife.blogs.com/babbage/2005/08/second_life_in_.html
The Creation Engine: Second Life in Mono : Babbage?s blog
? http://ausslers.com
AusSLers : Australian Second Life Educators and Researchers
? http://sl.nmc.org
NMC Campus Observer
? http://slrecord.typepad.com
Second Life Record : blog
? http://fizzysecondlife.blogspot.com
Fizzy?s Second Life
? http://nicolaescher.com
Nicola Escher : virtual world fashion designer
? http://secondlifenow.com
Mr. P's : bringing Second Life to life
? http://secondlifetraveler.com
Second Life Traveler
? http://scottsecondlife.blogspot.com
Oh Second Life : blog
? http://sluniverse.com/pics
SLUniverse
? http://sl-sexualhealth.org.uk
A Sexual Health SIM in Second Life
? http://sltweets.com
Second Life Twitter HUD Plus : metaverse to real life gateway
? http://sldnug.net
Second Life .Net Developers User Group: a microsoft.net user community in
Second Life
? http://www.metaversemessenger.com/
Metaverse Messenger: an SL in-world publication; Tagline: "A real newspaper for a virtual world"
World of Warcraft:
? ?http://worldofwarcraft.com
World of Warcraft Community Site : official site
? http://wow-europe.com
World of Warcraft Europe
? http://wow.allakhazam.com
Allakhazam's magical realm
? http://bestofwarcraft.com
The Best of Warcraft
? http://forums.worldofwarcraft.com
WOW Forums
? http://thottbot.com
Thottbot : World of Warcraft database
? http://worldofwar.net
World of Warcraft the Unofficial Site : offers news, information, movies, screen
shots, forums, journals, live chats, and item database
? http://blizzard.com
Blizzard Entertainment
? http://wow.warcry.com
World of Warcraft Warcry
? http://worldofwarcraft.filefront.com
World of Warcraft Downloads: World of Warcraft files for download
? http://mapwow.com
World of Warcraft Maps
? http://wowwiki.com
WoWWiki : guide to the World of Warcraft
? http://wowvault.ign.com
World of Warcraft Vault : World of Warcraft news, trailers, screenshots, previews,
reviews, guides
? http://wow.incgamers.com
World of Warcraft the Unofficial Site
? http://wowguru.com
World of Warcraft Guru : database
? http://wowhead.com
Wowhead : database website for World of Warcraft
? http://wow.qj.net
QuickJump : World of Warcraft news, announcements, patches, downloads, and videos
? http://wowarmory.com
The World of Warcraft Armory : vast searchable database of information for World
of Warcraft
? http://wowinsider.com
Wow Insider
? http://worldofwconline.com
WoW World of Warcraft Online
? ?http://warcraftmovies.com
World of Warcraft Movies: large database of World of Warcraft movies
? http://almostgaming.com/wow
World of Warcraft Strategy Guides
? http://warcraft.trei.ro/world_of_warcraft.php
Warcraft Trei Ro
? http://wowdetox.com
WOW Detox: detox center for wow addiction
? http://war3world.com
War3World : contains downloads, replays, maps, and wallpapers
? http://wow.curse.com
World of Warcraft AddOns, Downloads : World of Warcraft news, articles, files,
screenshots, and videos
? http://wow.gameamp.com
World of Warcraft GameAmp: maps, gold, guides, quests, downloads, guilds, demos,
and videos
? http://warcraftrpg.com
World of Warcraft the Roleplaying Game
? http://gamesites200.com/wow
World of Warcraft Top 200 : top 200 WOW sites as ranked by users
? http://wowui.incgamers.com
WoWUI @ IncGamers : World of Warcraft mods, addons, and more
? http://wow.stratics.com
World of Warcraft Stratics: news and information coverage for World of Warcraft
? http://askapadwe.com
World of Warcraft Help
? http://worldofwarcraftguru.blogspot.com
World of Warcraft : blog
? http://worldofwarcraftempire.com
WoW Empire : information and resources for World of Warcraft
? http://wow-europe.com/en/burningcrusade
World of Warcraft: The Burning Crusade
? http://warcrafthelp.com
World of Warcraft Community: discussion forums for the World of Warcraft gaming
community
? http://wowchron.com
World of Warcraft Chronicles Podcasts and News
? http://freewarcraftguides.com
World of Warcraft Guides
? http://stormscale.org
Stormscale : news about World of Warcraft from official websites and boards
? http://wowforum.com
World of Warcraft Forum
? http://worldofwarcraftboard.com
World of Warcraft Board: discussion forum powered by vBulletin
? http://forum.igsky.com/forum-3-1.html
World of Warcraft IGSky Game Forum
? http://blizzard.co.uk/wow/forums.shtml
World of Warcraft Forum
? http://goblinworkshop.com
The Goblin Workshop: forum
? http://mmotricks.com/forums/world-of-warcraft
Massive Multiplayer Online Gaming Forum
? http://graffe.com/forums/forumdisplay.php?f=53
Graffe Forums
? http://forum.ragezone.com
MMORPG development forum
? http://ubuntuforums.org/showthread.php?t=120615
World of Warcraft with Wine : forum
? http://broadbandreports.com/forum/wowetc
World of Warcraft Forum
? http://moncom.net/moncomwow.asp
World of Warcraft Forum
? http://forums.gameaxis.com/forumdisplay.php?f=236
The Unofficial World of Warcraft Forum
? http://worldofwarcraft-game.com
WoW World of Warcraft : description and links
? http://uberwow.com
Uber WoW Fansite : uber WOW fansite; WOW forum; WOW guides
? http://mmoverload.com
MMOverload : strategy website
? http://abcwow.net
ABC World of Warcraft
? http://wowvillage.com
World of Warcraft Village : "all things world of warcraft"
? http://leeroyjenkins.net
Leeroy Jenkins
? http://wowfix.com
WoWFix
? http://www.almostgaming.com/wow
World of Warcraft Strategy Guides
? http://wowstatus.com
World of Warcraft Private Server Status
? http://exploitsrus.com/wow.html
Exploits R Us : wow cheats, gold, and hacks
? http://worldofwarcraft100.com
World of Warcraft 100 : source for WOW sites
? http://worldofwarcraft.bz
World of Warcraft BZ : free private EMU wow server
? http://frostbolt.com
Frostbolt : WOW blog
? http://worldofwarcraft.areblogs.com
World of Warcraft Addict's Blog
? http://hogit.org
Hogit's Story : blog
? http://wowgrrl.com
World of Warcraft Blog by WoWGrrl
? http://casualwow.blogspot.com
Casual WoW : blog
? http://world-of-warcraft-gold.com/blog
World of Warcraft News Blog
? http://1up.com/do/my1Up?publicUserId=5829624
Legendary Thread : blog
? http://lamthara.blogspot.com
Lamthara : blog
? http://marama-wow-corner.blogspot.com
Marama's WoW Corner : blog
? http://the-wow-blog.com
The World of Warcraft Blog
? http://aeonity.com/wow
Alliance World of Warcraft Blog
? http://wowguy.wordpress.com
World of Warcraft Guy : blog
? http://worldofwarcraftblog.wordpress.com
World of Warcraft Blog
? http://wowdaily.info
WoWDaily
? http://worldofwarcraftblog.info
World of Warcraft Blog : tips on gold making and power leveling
? http://wowgold.shoutpost.com
ShoutPost : blog
? http://wow-gold-reviews.blogspot.com
World of Warcraft WoW : blog
? http://noggaddicts.com
Noggaddicts : blog
? http://wow-tips.blogspot.com
World of Warcraft Tips, Guides, and Exploits
? http://warcraftbot.blogspot.com
World of Warcraft Bot
? http://aeigelus.com
Adventures of Aeigelus : blog
? http://warcraft.topplayers.com
World of Warcraft Blog
? http://squidoo.com/worldofwarcraftblogs
List of World of Warcraft Blogs on Squidoo
? http://wow.mmodb.com
World of Warcraft Database
? http://wow.gamepressure.com
GamePressure Network : database of maps, locations, quests, items, etc.
? http://warcraftworlds.net
World of Warcraft Resource Site
? http://hiddenstuff.com
World of Warcraft Guides Database
? http://wowecon.com
World of Warcraft Auction House Price Database
? http://worldofwconline.com/database/items
World of Warcraft Items Database
? http://warcraftpets.com/wow.pets/index.asp
Warcraft Small Pets : index of all non-combat pets
? http://wow-item.info
Wow Item Compare Database
? http://wowdigger.com
WoWDigger : database
? http://gankbang.com/a/
Gankbang - Armory data search and ranking
? http://www.wowjutsu.com/world/
WoWJutsu -- Guild data and ranking
? http://www.blogazeroth.com/
Blog Azeroth -- forum for WoW bloggers
? http://www.killerguides.com/
Killer Guides -- strategy guides for MMOs, incl. WoW (commercial site)
? http://elitistjerks.com/
Elitist Jerks -- forums and strategy for WoW (guild site)
Lord of the Rings Online:
? http://lotro.com
The Lord of the Rings Online : official community site
? http://lotro.allakhazam.com
Allakhazam's Magical Realm : database with quest walkthroughs, item lists, a full
bestiary, spells, etc.
? http://lotrovault.ign.com
The Lord of the Rings Online Vault : Lord of the Rings Online news, trailers,
screenshots, previews, etc.
? http://lotrmmorpg.com
LOTR MMORPG : community site; features forums, screenshots, updated news,
guild forums, user photos
? http://lotro.warcry.com
Lord of the Rings Online WarCry
? http://lordoftheringsonline.net
The Madhouse Tavern : European fansite of the MMORPG LOTR Online
? http://lotro.mmodb.com
Lord of the Rings Online Database
? http://gamespot.com/pc/rpg/middleearthonline/index.html
Gamespot : news, previews, images, videos, links, and a forum
? http://lotrtalk.com
Lord of the Rings Online : LOTR Online Community Talk
? http://lotronline.com
Lord of the Rings Online : information database fan site
? http://forums.lotro.com
Lord of the Rings Online Forums
? http://lordoftheringsonlinenews.blogspot.com
Lord of the Rings Online : blog
? http://lotro.curse.com
Curse : LOTR Online news, articles, files, screenshots, and videos
? http://lordoftherings.filefront.com
Lord of the Rings Files : LOTR downloads
? http://lotro.us
Lord of the Rings Online Fansite : information on classes, races, and creatures
? http://guides.ign.com/guides/12112
Lord of the Rings Online Guide
? http://lord-of-the-rings.org/lotr_game.html
The Lord of the Rings Online Game
? http://massively.com/category/lord-of-the-rings-online
Posts from the Lord of the Rings Online Category at Massively
? http://lotronlineguides.com
Lord of the Rings Online Leveling & More Guide
? http://mmorpg.qj.net/category/The-Lord-of-the-Rings-Online-Shadows-of-Angmar/cid/1905
QuickJump : MMORPG blog
? http://lordoftherings.gameamp.com
GameAmp : betas, release dates, maps, guides, races, classes and PVP information
? http://games.slashdot.org/games/07/06/01/0816244.shtml
Slashdot : LOTR Online review
? http://community.codemasters.com/forum/forumdisplay.php?f=417
Codemasters Forum
? http://lotrovideo.blogspot.com
Lord of the Rings Online Videos
? http://lorebook.lotro.com
Lord of the Rings Online Lorebook
? http://gamesites200.com/lotro
Lord of the Rings Online Top 200 : top 200 sites for LOTR Online
? http://lotro.turbine.com/index.php?page_id=73&siid=3
Lord of the Rings Online Gameplay
? http://arda-online.com/map/
Lord of the Rings Online Game Database on Google Maps
? http://lotro.tentonhammer.com
Lord of the Rings Online @ TenTon Hammer : LOTRO community site
? http://mapslotro.com
Lord of the Rings Online Maps
? http://lotro-wiki.com
Lord of the Rings Online Wiki
? http://lotroedge.com
LOTRO Edge : lotr online guide
? http://lotrofaces.com
Character Faces
? http://lotrointerface.com
LotRO Interface : comprehensive site dealing with anything Interface related in
LOTRO
? http://lotrolife.com
Lord of the Rings Online MMORPG web comic
? http://lotrorphaven.com
Lord of the Rings Online Role Player's Haven
? http://lotro.stratics.com
Lord of the Rings Online Community Resource
? http://lotro.wikia.com/wiki/Main_Page
Lord of the Rings Online Wikia Wiki
? http://thebrasse.com/lotro
The Brasse
? http://weathertopradio.com
Weathertop Radio : podcasts with news and updates on LOTRO
? http://lotroalliance.com
Lord of the Rings Online Alliance
? http://gamertales.com/lotrotales.php
Gamer Tales : collection of LOTRO gaming tales
? http://kismetbp.com
Kismet Lord of the Rings Online : news and fansite
? http://guildcafe.com/LordOfTheRingsOnline.php
Guildcafe : kinships, guilds, profiles, screenshots
? http://lotrocraft.com
Lord of the Rings Online Crafters of Middle Earth : fansite for crafters
? http://lotroguilds.net
LOTRO Guilds : guild listing, screenshots, videos, interviews
? http://lotronotes.com
LOTRO Notes : searchable tips database
? http://lotropolis.com
LOTROpolis : LOTRO community
? http://middleearthcenter.com
Middle Earth Center
? http://virginworlds.com/podcast.php?show=3&ep=15
Ringcast : weekly podcast covering LOTRO
? http://lickanear.com/corcoffee.html
Lick An Ear : LOTRO related fansite
? http://underthebanner.com
Under the Banner : information blog
? http://visionsofthering.com
Visions of the Ring: website for LOTRO suggestions, concepts, ideas
? http://arda-online.com
Arda Online : LOTRO community
? http://lotromovies.net
Lord of the Rings Online Movies Database
? http://onlinelotr.com
Lord of the Rings Online : database of quests, deeds, items
? http://lotro-game.com
LOTR Online
? http://www.lotro-game.com/resources/databases
Lord of the Rings Online Databases
? http://lotrotraits.com
LotRO Traits : filterable trait database
? http://lotrocrafters.com
Lord of the Rings Online : crafters forum
? http://pooh.cz/lotro
Lord of the Rings Online : fansite and blog
? http://lotrovideo.vsocial.com
Lord of the Rings Online Videos
? http://annonamarth.eu
Annon Amarth
? http://lotroinsider.blogspot.com
Lord of the Rings Online Insider : blog
? http://blogtoplist.com/rss/lord-of-the-rings-online.html
Blog Toplist
? http://lordoftheringsonline.wordpress.com
Lord of the Rings Online : LOTRO blog
? http://lotr-online-leveling-guides.blogspot.com
Free Lord of the Rings Online Leveling Guide
? http://female-gamer.com/lotro
LOTRO Insider Blog : female gamer blog
? http://lotroblog.cn
Lotro Blog
? http://lotro.mmorpgedge.com
LOTRO Edge
? http://thesafehouse.org/forums/forumdisplay.php?f=64
The Safehouse Forums
? http://ubuntuforums.org/showthread.php?t=386480
Lord of the Rings Online : forum
? http://lotroquests.com
LOTRO Quests : guides, pictures, maps
There.com:
? http://there.com
There : official site
Sony's Home
? http://www.homebetatrial.com/
PlayStation Home Trial
? http://www.playstationhome.com/
PlayStation Home Forums (unofficial)
? http://www.youtube.com/watch?v=oInP2DAa3BA
PlayStation Home GDC 2007 presentation video
Entropia Universe
? http://www.entropiauniverse.com/
Entropia Universe : Official website
? http://www.euturnpike.com/
EU Turnpike : Ore Rates, Crafting Costs, Charts, Maps, guides, tools and more
? http://www.entropedia.info/Page.aspx?page=Main%20Page
Entropedia : Charts, maps and guides
The Sims Online
? http://player.thesimsonline.ea.com/index.jsp
The Sims Online : Official website
? http://sims.stratics.com/
Sims Stratics: Sims Online Community
? http://ea-land.ea.com/
EA Land: The Sims Online, free version, launched in Feb 2008
Digital Games (generally)
? ?http://www.mobygames.com/home
? ?http://grandtextauto.org/
? ?http://www.gamasutra.com/
? ?http://www.ludology.org/
? ?http://vgmaps.com/
Appendix I - Sample Output from CopyBot