The first session I attended at the ACA conference was titled “Where We Go From Here: Challenges in Digital Preservation.”
Adam Jansen spoke first, an enthusiastic fellow in a Hawaiian shirt who is affiliated with UBC’s InterPARES (International Research on Permanent Authentic Records in Electronic Systems) Trust. He emphasized the importance of preserving documentary evidence, whether electronic or physical, and discussed the inherent differences in how electronic records must be preserved. Unlike their analog counterparts, electronic records all but demand that preservation begin at creation, with the production of accurate metadata. They also require the early adoption of systems (intellectual systems as well as hardware and software) that can accommodate swiftly growing and ever-changing collections of digital media.
In an effort to meet the increasingly rigorous demands of preserving electronic records, some archives (and many individuals) have turned to the cloud. “The Cloud,” or cloud computing, focuses on maximizing the effectiveness of shared resources, which can be dynamically reallocated on demand. The end result is that vast amounts of information (from multiple sources) can be stored in a flexible shared space which may exist either on or off site.
While there are obvious benefits to cloud computing for preservation (storing electronic records at a secure data center, for example, which might otherwise be far outside an institution’s financial means), there are also great risks. Jansen was keen to impart the dangers of relying on the cloud as a means of electronic record preservation (making that face << and an “EAUGH!” noise of frustration and despair at one point in his discussion). Questions of ownership, jurisdiction, and privacy all become complicated when cloud computing is introduced to e-records management, and, as has been demonstrated time and again (remember last year’s celebrity iCloud hacking scandal?), the cloud is not necessarily as secure as we would like to think.
In many ways, cloud computing was not designed for preservation (as my Computer-Science genius-brained fiancé was quick to point out to me). The system was designed to meet rapidly fluctuating demands for processing power (e-mail servers, for example), not the more-or-less steady needs of long-term data storage. Nevertheless, the potential for a fruitful partnership between cloud computing vendors and archivists should not be ignored. The key to ensuring that important data is not lost when applying preservation principles to the cloud lies in the metadata–which, Jansen assured us, is where InterPARES Trust plays a role.
The trustworthiness of an electronic document–its documentary integrity–lies in the completeness and purity of its metadata, and InterPARES Trust has made it a goal to work with users and cloud vendors to regulate the preservation of that metadata in the cloud. The group has identified six preservation services necessary to ensure data integrity within the cloud:

1) ensuring that what was sent to the cloud was received intact;
2) ensuring that metadata has been accurately imported with the digital object;
3) ensuring that what has been imported is authentic;
4) ensuring that users receive a report on the status of the physical storage space of the data center;
5) ensuring that data is generated reflecting any migrations and emulations that occur; and
6) ensuring that there is data on how materials are being disseminated.
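To make the first of those services a little more concrete: “received intact” is typically verified with a fixity check, comparing a checksum recorded before upload against one recomputed after retrieval. The sketch below is my own minimal illustration of that idea (the object, identifier, and manifest structure are all hypothetical), not InterPARES Trust’s or any vendor’s actual implementation.

```python
import hashlib

def sha256_fixity(data: bytes) -> str:
    """Return a SHA-256 checksum for a digital object's bytes."""
    return hashlib.sha256(data).hexdigest()

# Before upload: record the checksum alongside the object's metadata.
original = b"scanned letter, 1923"  # hypothetical digital object
manifest = {"object_id": "MS-102-001", "sha256": sha256_fixity(original)}

# After retrieval from the cloud: recompute and compare.
retrieved = original  # in practice, the bytes downloaded from the provider
intact = sha256_fixity(retrieved) == manifest["sha256"]
print("received intact" if intact else "FIXITY FAILURE: object was altered")
```

Any mismatch between the two checksums means the object was altered somewhere between sender and receiver, which is exactly the kind of event the remaining services (migration logs, dissemination data) are meant to document.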
Jansen, then, sees archivists and information professionals as the bridge between cloud providers and their audiences when it comes to safely preserving electronic records for the long term. We have seen some of the fruits of this sort of union in our own unit with the introduction of Archivematica, a platform for the long-term preservation of “trustworthy, authentic, and reliable digital content,” which makes use of cloud-based technologies. Although not yet fully implemented at UASC, the benefits of Archivematica in keeping our digital (and particularly born-digital) records accessible, safe, and clean are inarguable.
(If you couldn’t guess, Jansen’s talk sparked a ton of subsequent discussion, which is why I’m rambling about it here. For the others, I’ll be brief: )
Next to speak was Paul Wagner from Library and Archives Canada, who focused on LAC’s shifting digital strategy. Of particular note is the attention LAC will be paying to digital initiatives across the country, attempting to sort out who has digitized what and thereby cutting down on the overlap in digitization projects between institutions. He also reminded us that “context improves content” when it comes to digital material, and that it may be fruitful to ask clients what they want to see digitized. A very down-to-earth and helpful talk.
Finally, Allana Mayer, an independent researcher, discussed her survey, which asked archivists about the nature of their digital holdings: What do we have? How much are we holding? How often is it asked for? How are we preserving it? She also encouraged archives to treat their digital records like nitrate film: swiftly and with constant vigilance. These are all important questions to be asking; however, Allana’s impatience with archivists and records managers for not dealing more effectively with their electronic records seemed symptomatic of a non-holistic view of archival work. With so many archives and special collections facing years and kilometers of physical backlog, it is perhaps unsurprising that the thought of tackling electronic records as a full-body contact sport makes us cringe.
Still, if there’s one thing this session assured me of, it’s that digital preservation is not as unmanageable as we may fear.