Posted by Mike Thuman
At a high level, the question being asked by all is: How will we guarantee the authenticity and accessibility of the terabytes, and now petabytes, of digital content we are creating? Steve Knight of the National Library of New Zealand (NLNZ), the lead speaker at one symposium, described how NLNZ had gone through all of the phases: documenting the business requirements, analyzing "build versus buy" strategies, explaining and positioning open source, proprietary, and open platform solutions to NLNZ directors, training the organization on preservation standards, and, through a partnership with Ex Libris, going live with Rosetta – a workflow-based digital preservation infrastructure that provides both access and long-term preservation.
Also of interest was the fact that half of the attendees believed that digital preservation is really just preservation applied to yet another new information format. (See our previous blog post on this subject at http://commentary.exlibrisgroup.com/2009/05/libraries-choosing-to-end-preservation.html). Digital preservation (the ability to guarantee long-term access to digital content) is in many ways a new discipline, and one that comes with new questions. For instance: a) how do we create a digital preservation policy? b) how do we plan for digital preservation? c) how do we build the business case? d) how do we orchestrate a program and eventually go live with a workflow-based solution? During the symposium, attendees discussed answers to these questions, drawing on best practices, available Web-based resources, and firsthand experiences.