Saturday, March 25, 2017

21st-Century Preservation Basics

21st-Century Preservation Basics. Brian J. Baird. Sidebar.  American Libraries. March 1, 2017.
    Since most scholarly information is now electronic, the basic elements of any digital library preservation policy in the 21st century include:

  • Cooperation. Every library has unique digital collections to preserve, but as the volume continues to grow exponentially, and as older material gets accessed less frequently, libraries may need to cooperate in order to collect and preserve materials long term. 
  • Environmental conditions. Optimal conditions for storing and preserving electronic information must continually be reexamined and improved. 
  • Disaster planning. A library disaster plan should build on an institution’s IT disaster plan to address specific needs.
  • Reformatting.  
  • Repositories. Ideally, repository collections should be well preserved, sharable, and cost-effective and could expand on the consortial efforts already in use.

"Preservation in the 21st century must be proactive, visionary, and cooperative. If it is not, vast amounts of cultural heritage are in danger of vanishing."

Wednesday, March 22, 2017

Collecting Digital Content at the Library of Congress

Collecting Digital Content at the Library of Congress. Joe Puccio, Kate Zwaard. The Signal.
March 21, 2017.
     The Library of Congress has increased its digital collecting capacity in order to acquire as much selected digital content as technically possible, currently 12.5 petabytes, and make that content accessible to users. Expansion of the digital collecting program is "an essential part of the institution’s strategic goal to: Acquire, preserve, and provide access to a universal collection of knowledge and the record of America’s creativity." The newly adopted strategy is directed at acquisitions and collecting, and is based on a vision in which the "Library’s universal collection will continue to be built by selectively acquiring materials in a wide range of formats" and via collaborative relationships with other entities.

The strategy is based on the assumptions that the amount of available digital content will continue to grow rapidly, that the Library will acquire content selectively, that the same content will be "available both in tangible and digital formats", and that intellectual rights will be respected.  Their plan for digital collecting over the next five years is categorized into six strategic objectives:
  1. Maximize collections of selected digital content submitted for copyright purposes
  2. Expand digital collecting through purchase, exchange and gifts
  3. Focus on purchased and leased electronic resources
  4. Expand use of web archiving to acquire digital content
  5. Acquire openly available content
  6. Collect appropriate datasets and other large units of content

Thursday, March 16, 2017

Creating the disruptive digital archive

Creating the disruptive digital archive. John Sheridan. Digital Preservation Coalition. 1 March 2017.
     The National Archives has been working on a new Digital Strategy. "Digital" is their biggest strategic challenge. Archives worldwide are "grappling with the issues of preserving digital records. We also need to be relevant to our audiences: public, government, academic researchers and the wider archives sector – to provide value to them at a time of change."

Traditional archives are built around the physical nature of the records, but digital records "change all our assumptions around the archive – from selection to preservation and access". Their new Digital Strategy is to move beyond the digital simulation of physical records and to become a ‘disruptive’ digital archive, to be "digital by design".

The National Archives is currently a "fully functioning digital archive with a Digital Records Infrastructure capable of safely, securely and actively preserving very large quantities of data with associated descriptive metadata" which is applying the paper records paradigm of selection, preservation and access to digital records. This is their first generation archive.  The second generation digital archive they are aiming for is to be "digital by instinct and design":

  • rich mixed-media content (things like websites), datasets, computer programs, even neural networks, as records, not just information in document formats
  • the ability to select and preserve all of these types of things
  • recognition that digital information has value in aggregate, that it is not just individually important artefacts that have historical value
  • a relentless engineering effort to preserve digital objects that measures and manages the preservation risks
  • transparency in its practices
  • approaches for enabling access to the whole collection with regard to legal, ethical and public considerations
  • a view of the archive as conceptually interconnected data

"These are ambitious aims and there are many challenges we need to tackle along the way." Collaboration between archives and other institutions is essential in moving forward.


Wednesday, March 15, 2017

Developing a Digital Preservation Infrastructure at Georgetown University Library

Developing a Digital Preservation Infrastructure at Georgetown University Library. Joe Carrano, Mike Ashenfelder. The Signal. March 13, 2017.
     At the Georgetown University Library, half of the library IT department is focused on digital services such as digital publishing, digitization, and digital preservation. These IT and library functions overlap and support each other, which creates a need for the librarians, archivists, and IT staff to work together. It provides better communication and makes it easier to get things done. "Often it is invaluable to have people with a depth of knowledge from many different areas working together in the same department. For instance, it’s nice to have people around that really understand computer hardware when you’re trying to transfer data off of obsolete media."

While digital preservation and IT are centered in one department, the preservation files live in different systems and on different storage media throughout the library; they are now in the process of moving them into APTrust. Several strategies to improve their digital preservation management are:
  1. Implement preservation infrastructure, including a digital-preservation repository
  2. Develop and document digital-preservation workflows and procedures
  3. Develop a training program and documentation to help build skills for staff
  4. Explore and expand collaborations with both university and external partners to increase the library’s involvement in regional and national digital-preservation strategies.
These goals build upon each other to create a sustainable digital-preservation framework which includes APTrust and the creation of tools to manage and upload the content, particularly custom automated solutions to fit their needs. They are also developing documentation and workflows so any staff member can "upload materials into APTrust without much training".
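
For flavor, here is a minimal sketch, not Georgetown's actual tooling, of what such an automated upload can look like using the bagit-python and boto3 libraries; the directory name, metadata, and receiving-bucket name are invented for the example:

```python
# Hypothetical sketch (not Georgetown's actual tooling): package a folder as a
# BagIt bag, tar it, and upload it to an APTrust receiving bucket.
import shutil

import bagit   # Library of Congress bagit-python
import boto3

# Convert the folder in place into a BagIt bag with minimal bag-info metadata.
bagit.make_bag('etd_12345', {'Source-Organization': 'Example University'})

# APTrust ingest takes tarred bags; this creates etd_12345.tar.
archive = shutil.make_archive('etd_12345', 'tar', root_dir='.',
                              base_dir='etd_12345')

# Bucket name is an invented placeholder for an institution's receiving bucket.
boto3.client('s3').upload_file(archive, 'aptrust.receiving.example.edu',
                               'etd_12345.tar')
```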

Librarians and archivists need to be trained and integrated into the process to ensure the sustainability of the project’s outcome and to speed up the ingest rate. "Digital curation and preservation tasks are becoming more and more commonplace and we believe that these skills need to be dispersed throughout our institution rather than performed by only a few people". 

"By the end of this process we hope to have all our preservation copies transferred and the infrastructure in place to keep digital preservation sustainable at Georgetown."

Monday, March 13, 2017

What Makes A Digital Steward: A Competency Profile Based On The National Digital Stewardship Residencies

What Makes A Digital Steward: A Competency Profile Based On The National Digital Stewardship Residencies. Karl-Rainer Blumenthal, et al. Long paper, iPres 2016. (Proceedings p. 112-120 / PDF p. 57-61).
       Digital stewardship is the active and long-term management of digital objects with the intent to preserve them for long-term access. Because the field is relatively young, there is not yet "sufficient scholarship performed to identify a competency profile for digital stewards". A profile details the specific skills, responsibilities, and knowledge areas required, and this study attempts to describe a competency profile for digital stewards using a three-pronged approach:
  1. reviewing literature on the topics of digital stewardship roles, responsibilities, expected practices, and training needs
  2. qualitatively analyzing current and completed project descriptions
  3. quantitatively analyzing the results from a survey that identified the competencies needed to successfully complete projects
"This study had two main outputs: the results of the document analysis (qualitative), and the results of the survey (quantitative)."  Seven coded categories of competence emerged from the analysis:
  1. Technical skills;
  2. Knowledge of standards and best practices;
  3. Research responsibilities;
  4. Communication skills;
  5. Project management abilities;
  6. Professional output responsibilities; and
  7. Personality requirements.
Based on the responses for Very important and Essential, a competency statement representing this profile would suggest that "effective digital stewards leverage their technical skills, knowledge of standards and best practices, research opportunities, communication skills, and project management abilities to ensure the longterm viability of the digital record." They do this by:
  • developing and enhancing new and existing digital media workflows
  • managing digital assets
  • creating and manipulating asset metadata
  • committing to the successful implementation of these new workflows
  • managing both project resources and people
  • soliciting regular input from stakeholders
  • documenting standards and practices
  • creating policies, professional recommendations, and reports
  • maintaining current and expert knowledge of standards and best practices for metadata and data management
  • managing new forms of media
The study suggests that, in practice, technical skills are not always as essential in digital stewardship as job postings suggest. Hardware/software implementation and qualitative data analysis skills were important to only half of the respondents. Workflow management is a universally important skill, deemed "Essential" by almost all respondents. Other categories appeared as Somewhat Important, or as areas that need further research.

The study suggests that "although specific technical skills are viewed as highly important in different settings, a much larger majority of projects required skills less bound to a particular technology or media, like documentation creation and workflow analysis."  Digital stewards should possess, not only a deep understanding of their field, but the ability to "effectively disseminate their work to others."

Thursday, March 09, 2017

Top 10 Digital Archives Blogs

Top 10 Digital Archives Blogs. Jan Zastrow.  Information Today. July/August 2016.
A post about keeping up with reading on archival and historical topics. By sharing sources with each other, we can learn about new developments in the field without having to read all the current literature ourselves. Here is a list of selected sources to help sift through the noise and keep up with the quickly evolving world of digital archives, electronic records, digital preservation and curation, personal archiving, digital humanities, and more. Some are from institutions, others are more informal, and they are mostly U.S.-centric, English-language sources. [I learned about some new helpful sites here.]

Society of American Archivists
1. The Society of American Archivists’ semi-annual journal The American Archivist covers theoretical and practical developments in the archives profession in North America.

2. SAA Electronic Records Section runs the popular BloggERS! which aggregates news, information, and resources on electronic records, including case studies, reviews, and surveys.

U.S. Federal Agencies
3. The National Archives’ AOTUS Blog, with more at archives.gov/social-media/blogs.html.

4. The Library of Congress’s The Signal covers up-to-the-minute digital preservation issues (such as web archiving, audiovisual preservation, digital forensics, data migration, and digital asset management).

Aggregated Sources to save you time.
5. ArchivesBlogs is a syndicated collection of blogs about archives, “by and for archivists,” taken from international RSS and Atom feeds every hour.

6. Digital Archiving Resources is an excellent annotated database of materials on digital archiving created by doctoral students at the University of Central Florida.

7. Digital Preservation Matters: for more than a decade, articles on digital preservation, long-term access, digital archiving, digital curation, institutional repositories, and electronic records management. Search the blog’s archive, use the tag cloud interface, or subscribe via RSS or on Twitter.

Blogs: By and For Individuals
8. The brainchild of Kate Theimer, ArchivesNext covers archives advocacy, technology, and professional issues.

9. Trevor Owens’s User Centered Digital History blog offers cutting-edge essays on digitization, born-digital primary sources, web archives, digital art, and more.

10. Jaime Mears’s Notes From a Nascent Archivist is chock-full of great ideas, resources, projects, and more.


Wednesday, March 08, 2017

The Hidden Phenomenon That Could Ruin Your Old Discs

The Hidden Phenomenon That Could Ruin Your Old Discs. Ernie Smith. Motherboard. February 6, 2017.
     An article about regular CD and DVD optical discs and the problems that cause them to deteriorate. "CDs and DVDs were sold to consumers as these virtually indestructible platters, but the truth, as exemplified by the “disc rot” phenomenon, is more complicated." Early research showed that problems with the reflective layer could make a disc fail in 8 to 10 years, or that the degradable dye used in recordable discs would break down. The disc degradation sometimes looks like a stain or discoloration, or tiny pin pricks on the disc surface. "The eventual decay of optical media is a serious situation, whether you're a digital archivist or simply someone who wants to watch a movie on a weird format like a Laserdisc."

A Library of Congress preservation specialist said that the disc destruction shows up in three different forms: the "bronzing" of discs; small pin-hole specks on the discs; or "edge-rot".
Five facts about disc rot, according to the Library of Congress:
  1. Discs with significant errors are often still at least partially readable. This depends on the type of disc and where the error occurs.
  2. A scratch at the top of a CD is more problematic than one on the bottom, because scratches to the top surface can penetrate through and damage the reflective layer.
  3. DVDs generally have better integrity than CDs, but layers can delaminate over time. Dual-layer discs tend not to hold up as well.
  4. Recordable discs, particularly DVDs, don't last as long, due to the degradation of the organic dye used. A poorly recorded disc tends to wear out more quickly.
  5. Proper storage and handling helps. A well-made commercially pressed disc can last many decades if stored and handled properly. Discs stored in harsh environmental conditions with elevated temperature and/or humidity will have shorter expected lifetime.

Tuesday, March 07, 2017

The role of archives

The role of archives. Helen Hockx. Things I cannot say in 140 characters.  January 20, 2017.
     The role of Archives, especially when it comes to digital records, is not commonly understood. An archivist should ask questions "about the file structure, the access system, who accessed it, and how was it used… Appraisal is based on context, or the entire record keeping system and the importance of individual items depends on how they relate to one another within a system". This is difficult to do after the fact. The heart of the problem is: who makes decisions on what records to keep? A perception is that Archives are "museums with artifacts, and have no authority over digital records". In this view, access to the digital files should be determined by the “data stewards” under the direction of the University’s Information Governance Committee. The role of Archives, data access, record lifecycles and retention schedules seem to be largely misunderstood.


Monday, March 06, 2017

Electric WAILs and Ham

Electric WAILs and Ham. John Berlin. Web Science and Digital Libraries Research Group. February 13, 2017.
     Web Archiving Integration Layer (WAIL) is a one-click configuration and utilization tool that sits between institutional and individual archiving tools, run from a user's personal computer. Changing the tool from a Python application into an Electron application has brought many improvements, especially the ability to update it and package it for Linux, MacOS, and Windows.

WAIL is now collection-centric and provides users with the ability to curate personalized web archive collections, similar to Archive-It, but on their local machines. It also adds the ability to monitor and archive Twitter content automatically. WAIL is now available from the project's release page on GitHub, and more information is available on the project wiki.
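
WAIL itself bundles full crawl and replay components, but for a taste of the underlying format, here is a minimal, unrelated sketch using the warcio library to capture a page into a WARC file, the standard format that personal and institutional web archives share:

```python
# Minimal sketch (not WAIL's code): capture an HTTP exchange into a WARC file
# using the warcio library.
from warcio.capture_http import capture_http
import requests  # must be imported after capture_http so it can be patched

with capture_http('my_collection.warc.gz'):
    requests.get('https://example.com/')
# my_collection.warc.gz now holds request/response records usable by replay tools.
```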

Saturday, March 04, 2017

What Do IT Decision Makers Want?

What Do IT Decision Makers Want? Tom Coughlin. Forbes. March 1, 2017.
     An article that looks at a study of over 1,200 senior IT decision makers in 11 countries. Some findings:

  • The vast majority of those surveyed have revised their storage strategy in the last 12 months because of frustrations with storage costs, performance, complexity and fragmentation of existing solutions. 
  • 60% say storage expenses are under increased scrutiny 
  • 95% are interested in the scalability and efficiency of software-defined storage. 
  • Digital storage is about 7% of the total IT budget.
  • Some concerns: 
    • High costs: 
      • 80% were concerned with the cost of their storage system
      • 92% worry about managing storage costs as capacity needs grow. 
      • On average 70% of IT budgets are allocated to data storage 
    • Performance: 
      • 73% are concerned with the performance of their existing storage solution. 
    • Growing complexity and fragmentation: 
      • 71% of respondents said storage systems were complex and highly fragmented.  
  • Software-defined storage [which involves separating the storage capabilities and services from the storage hardware] is playing a significant role in improving the utilization of storage resources and stretching storage budgets.

Thursday, March 02, 2017

A lifetime in the digital world

A lifetime in the digital world. Helen Hockx. Blog: Things I cannot say in 140 characters.
February 15, 2017.
     A very interesting post about papers donated to the University of Notre Dame in 1996, and how the library has been dealing with the collection. The collection includes a survey that is possibly “the largest, single, data gathering event ever performed with regard to women religious”. The data was stored on “seven reels of 800 bpi tapes, recl 120, blocksize 12,000, approximately 810,000 records in all”, extracted from the original EBCDIC tapes and converted to newer formats in 1996, then transferred to CDs and on to computer hard disk in 1999. The 1967 survey data has fortunately survived the format migrations. Some other data in the collection has been lost: at least 3 tape reels could not be read during the 1996 migration exercise, and at least one file could not be copied in 1999. The survey data had not been used for 18 years since 1996, nicely and appropriately described by the colleague as “a lifetime in the digital world”.

The dataset has now been reformatted and stored in .dta and .csv formats. They also recreated the “codebook” of all the questions and pre-defined responses and put it in one document. The dataset is now in the best possible format for re-use. The post gives examples of digital collection items that require intervention or preservation actions. A few takeaways:
  • Active use seems to be the best way for monitoring and detecting digital obsolescence.
  • Metadata really is essential. Without the notes, finding aid and scanned codebook, we would not be able to make sense of the dataset.
  • Do not wait a lifetime to think about digital preservation. 
  • The longer you wait, the more difficult it gets.
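
As an illustration of the kind of migration described above, here is a minimal sketch of decoding fixed-length EBCDIC records into CSV; the record length, code page, file names, and column boundaries are assumptions for the example:

```python
# Hypothetical sketch: decode fixed-length EBCDIC records to CSV (Python 3.8+).
import csv

RECL = 120  # assumed fixed record length

def ebcdic_records(path, recl=RECL):
    """Yield each fixed-length record decoded from EBCDIC (code page 037)."""
    with open(path, 'rb') as f:
        while (chunk := f.read(recl)):
            yield chunk.decode('cp037')

with open('survey.csv', 'w', newline='') as out:
    writer = csv.writer(out)
    for record in ebcdic_records('tape_extract.dat'):
        # Column boundaries would come from the recreated codebook; these
        # slices are invented for the example.
        writer.writerow([record[0:4].strip(), record[4:120].strip()])
```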

Wednesday, February 01, 2017

Why Aren't We Doing More With Our Web Archives?

Why Aren't We Doing More With Our Web Archives? Kalev Leetaru. Forbes. January 13, 2017.
     The post looks at the many projects that have been launched to archive and preserve the digital world; the best known is the Internet Archive, "which has been crawling and preserving the open web for more than two decades" and has preserved more than 510 billion distinct URLs from over 361 million websites. The author asks: "With such an incredible repository of global society’s web evolution, why don’t we see more applications of this unimaginable resource?"

Some of the reasons that there isn't a more vibrant and active research and software development community around web archives may be:
  • Economics plays a role
  • The complex nature of web archives
  • The Internet Archive's archive is over 15 petabytes, which is difficult to manipulate
  • There aren't many tools that can use the archive, particularly for indexing
The Internet Archive last year announced the first efforts at keyword search capability. These kinds of search tools are needed to make the Archive’s holdings more accessible to researchers and data miners.
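
The Wayback Machine's public CDX server already offers one programmatic entry point. A minimal sketch of querying it follows; the field names match the documented JSON output, but verify against the current API:

```python
# Sketch: list recent captures of a URL via the Wayback Machine CDX API.
import requests

rows = requests.get(
    'https://web.archive.org/cdx/search/cdx',
    params={'url': 'example.com', 'output': 'json', 'limit': 5},
    timeout=30,
).json()

if rows:  # the first row is the field-name header
    header, captures = rows[0], rows[1:]
    for capture in captures:
        fields = dict(zip(header, capture))
        print(fields['timestamp'], fields['original'], fields['statuscode'])
```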

"At the end of the day, web archives are our only record capturing the evolution of human society from the physical to the virtual domains. The Internet Archive in particular represents one of the greatest archives ever  created of this immense transition in human existence and with the right tools and a greater focus on non-traditional avenues, perhaps we can launch a whole new world of research into how humans evolved into a digital existence."

Tuesday, January 31, 2017

20 TB Hard Disk Drives, The Future Of HDDs

20 TB Hard Disk Drives, The Future Of HDDs. Tom Coughlin. Forbes. January 28, 2017.
     Interesting article on the status and future of hard drives. It looks at the declining market and the trends for hard disk drives over the next few years. Overall drive shipments dropped about 9.4% in 2016, to 424 million drives. Of the total HDDs shipped in 2016:
  • Western Digital shipped 41% 
  • Seagate shipped 37%  
  • Toshiba shipped 22%.
"The long-term future of HDDs likely rests with high capacity HDDs, particularly in data centers serving cloud storage applications".  Seagate plans to ship 14 and 16 TB drives in the next 18 months, and possibly 20 TB drives in the next three years.

Digital Preservation and Archaeological data

Digital Preservation.  Michael L. Satlow. Then and Now. Jan 26, 2017.
     The post looks at the issue of preservation in relation to modern scholarly and artistic works. "The underlying problem is a simple one: most scholarly and creative work today is done digitally." Archaeological excavations generate reams of data, and like other scientific data, archaeological data are valuable. There is no single way that archaeologists record their findings. "Unlike scientists, many archaeologists and humanists have not thought very hard about the preservation of digital data. Scientists routinely deposit their raw data in institutional repositories and are called upon to articulate their digital data management and preservation plan on many grant applications. The paths open to others are less clear."

Institutional digital repositories provide a simple and inexpensive solution. When a project is complete, the data can be converted to XML and deposited; the data conversion would be the most involved part. The XML format would allow the data to be easily accessed and used. "It is time to think about digital preservation as a staple of our 'best practices'.”
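
A toy sketch of what such a conversion might look like; the element names and values are invented for the example, not a published archaeological schema:

```python
# Hypothetical sketch: serialize one excavation record to XML for deposit.
import xml.etree.ElementTree as ET

find = ET.Element('find', attrib={'id': 'L2017-042'})
ET.SubElement(find, 'context').text = 'Locus 12, Stratum III'
ET.SubElement(find, 'material').text = 'ceramic'
ET.SubElement(find, 'recorded').text = '2017-01-26'

ET.ElementTree(find).write('find-L2017-042.xml', encoding='utf-8',
                           xml_declaration=True)
```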


Monday, January 30, 2017

Born-digital news preservation in perspective

Born-digital news preservation in perspective. Clifford Lynch. RJI Online. January 26, 2017. [Video and transcript.]
   The challenge with news and academic journals: how do you preserve this body of information? The journal community has been working on that in a much more systematic way. There is a shared consensus among all players that preserving the record of scholarly journal publication is essential. Nobody wants their scholarship to be ephemeral, so you have to tell people a convincing story about how their work will be preserved.

The primary responsibility for the active archive in most cases is the publisher, but there must be some kind of external fallback system so content will survive the failure of the publisher and the publisher’s archive. These are usually collaborative. Libraries have been the printed news archive, but that is changing. There is also a Keepers Registry so you can see how many keepers are preserving a given journal. The larger journals are well covered, but the smaller ones are really at risk, and a lot of these are small open source journals. "So, we need to be very mindful of those kinds of dynamics as we think about what to do about strategies for really handling the digital news at scale."

With the news, there are a few very large players, and a whole lot of other small news outlets of various kinds. Different strategies are needed for the two groups. We need to be very cautious about news boundaries. "Now in many, many cases, the journalism is built on top of and links to underlying evidence which at least in the short term is readily inspectable by anyone clicking on a link." But the links deteriorate and the material goes away and "preserving that evidence is really important." But it is unclear who is or should be preserving this. There are also questions about the news, the provenance, the motives, the accuracy, and these have to be handled in a more serious way.

"most social media is actually observation and testimony. Very little of it is synthesized news. It’s much more of the character of a set of testimonies or photographs or things like that. And collectively it can serve to give important documentation to an event, but often it is incomplete and otherwise problematic. We need to come to some kind of social consensus about how social media fits into  the cultural record.

We need to devise some systematic approaches to this because the journalistic organizations really need help; "their archives are genuinely at risk" and in many cases the "long term organizational viability is at risk". We need a public consensus. "We need a recognition that responsible journalism implies a lasting public record of that work." The need for a free press is recognized constitutionally. "We cannot, under current law, protect most of this material very effectively without the active collaboration of the content producers." This is too big a job for any single organization, and we don't want a single point of failure.


Tuesday, January 24, 2017

The UNESCO/PERSIST Guidelines for the selection of digital heritage for long-term preservation

The UNESCO/PERSIST Guidelines for the selection of digital heritage for long-term preservation. Sarah CC Choy, et al. UNESCO/PERSIST Content Task Force. March 2016.
     The survival of digital heritage is much less assured than that of its traditional counterparts. “Identification of significant digital heritage and early intervention are essential to ensuring its long-term preservation.” This project was created to help preserve our cultural heritage, and to provide a starting point for institutions creating their policies. Preserving and ensuring access to digital information is also a challenge for the private sector. Acquiring and collecting digital heritage requires significant effort and resources. It is vital that organizations accept digital stewardship roles and responsibilities. Some thoughts and quotes from the document:
  • There is a strong risk that the restrictive legal environment will negatively impact the long-term survival of important digital heritage.
  • The challenge of long-term preservation in the digital age requires a rethinking of how heritage institutions identify significance and assess value.
  • New forms of digital expression blur boundaries and lines of responsibility and challenge past approaches to collecting.
  • Libraries, archives, and museums have common interests in preserving heritage.
  • Heritage institutions must be proactive in identifying digital heritage and information for long-term preservation before it is lost.
  • Selection is essential, as it is economically and technically impossible, and often legally prohibited, to collect all current digital heritage. Selecting for long-term preservation will thus be a critical function of heritage institutions in the digital age.
  • Selecting digital heritage for long-term preservation may focus primarily on evaluating publications already in a collection, originally acquired for short-term use, rather than assessing new publications for acquisition.
  • Rapid obsolescence in digital formats, storage media, and systems is collapsing the window of opportunity for selection, and increases the risk of losing records that have not yet “proved” their significance over time.
The guidelines address strategies for collecting digital heritage and help institutions develop selection criteria. Four possible steps to use:
  1. Identify the material to be acquired or evaluated
  2. Determine the legal obligation to preserve the material
  3. Assess the material using three selection criteria: significance, sustainability, and availability
  4. Compile the above information and make a decision based on the results
Management of long-term digital preservation and metadata is important. There are five basic functional requirements for digital metadata (a minimal sketch follows the list):
  1. Identification of each digital object
  2. Location of each digital object, so that it can be found and retrieved
  3. Description of each digital object, both content and context, needed for recall and interpretation
  4. Readability and encoding, in order to remain legible over time
  5. Rights management, including conditions of use and restrictions for each digital item
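
A minimal sketch of one record covering the five requirements; the field names and values are invented for the example:

```python
# Hypothetical sketch: one record covering the five functional requirements.
from dataclasses import dataclass

@dataclass
class PreservationMetadata:
    identifier: str   # 1. identification of the digital object
    location: str     # 2. where the object can be located and retrieved
    description: str  # 3. content and context, for recall and interpretation
    encoding: str     # 4. readability/encoding, to remain legible over time
    rights: str       # 5. conditions of use and restrictions

record = PreservationMetadata(
    identifier='urn:uuid:1b9e4e2c-52a2-4a4e-9d0e-3c1f7d2a8b11',
    location='https://repository.example.org/objects/1b9e4e2c',
    description='Oral history interview, audio transcript, 1987',
    encoding='application/pdf (PDF/A-2b)',
    rights='Public domain; no access restrictions',
)
```
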
“The long-term preservation of digital heritage is perhaps the most daunting challenge facing heritage institutions today.”

Wednesday, January 11, 2017

Digital preservation is a mature concept, but we need to pitch it better

Digital preservation is a mature concept, but we need to pitch it better. Dave Gerrard. Digital Preservation at Oxford and Cambridge.  6 December, 2016.
     The OAIS standard can be confusing for newcomers to the field, and one of the potentially confusing areas is the Administrative area. It looks "like a place where much of the hard-to-model, human stuff had been separated from the technical, tool-based parts." The diagram is busier and more information-packed than other areas, and thus could use more modeling. The standard may be easier to use if there were other documents focusing on the ‘technical’ and ‘human’ aspects.

Communication, particularly an explanation to funders, about the importance of digital preservation is vital. It will help to have an 'elevator pitch' to explain simply what digital preservation is. The post suggests "Digital Preservation means sourcing computer-based material that is worthy of preservation, getting that material under control, and then maintaining the usefulness of that material, forever." [Some of these words may be easily misunderstood.]

The "OAIS standard is confusing" "but it has reached a level of maturity: it’s clear how much deep thought and expertise underpins it."  The digital preservation community is ready to take their ideas to a wider audience: "we perhaps just need to pitch them a little better".

Saturday, December 31, 2016

Managing the preservation and accessibility of public records from the past into the digital future

Managing the preservation and accessibility of public records from the past into the digital future.  Dean Koh. Open Gov.  30 November 2016.
     A post about the Public Record Office of the State Archives of Victoria. They have many paper records but now also a lot of born-digital government records, so the archive is a hybrid paper and digital archive. For accessibility purposes, paper records are digitised to provide online access. The Public Record Office also sets records management standards for government agencies across Victoria. "In the digital environment, there is not a lot of difference between records and information so that means we set standards in the area of information management as well." Access to records is a major focus, including equity of access in a digitally focused age.

"There’s a lot to access that isn’t necessarily ‘just digitise something’, there’s a lot of work to be done in addition to just digitising them. There’s capturing metadata about the digital images because again, if I just take photographs of a whole lot of things and send you the files, that’s not very accessible, you have to open each one and look at it in order to find the one that you want. So we have to capture metadata about each of the images in order to make them accessible so a lot of thinking and work goes into that."

Another issue around records, particularly born digital records, is the different formats used to create records in government. There are a "whole bunch of different technologies" used to create born digital records and the archives is trying to manage the formats and the records so that they "continue to remain accessible into the far future. So 50 years, a 100 years, 200 years, they still need to be accessible because those records are of enduring value to people of Victoria. So that’s a format issue and a format obsolescence issue."


Friday, December 30, 2016

How Not to Build a Digital Archive: Lessons from the Dark Side of the Force

How Not to Build a Digital Archive: Lessons from the Dark Side of the Force. David Portman. Preservica. December 21, 2016.
     This post is an interesting and humorous look at Star Wars archiving: "Fans of the latest Star Wars saga Rogue One will notice that Digital Archiving forms a prominent part in the new film. This is good news for all of us in the industry, as we can use it as an example of how we are working every day to ensure the durability and security of our content. Perhaps more importantly it makes our jobs sound much more glamorous – when asked 'so what do you do' we can start with 'remember the bit in Rogue One….'"

The Empire’s choice of archiving technology is not perfect and there are flaws in their Digital Preservation policy in many areas, such as security, metadata, redundancy, access controls, off site storage, and format policy. Their approaches are "hardly the stuff of a trusted digital repository!"

Thursday, December 29, 2016

Robots.txt Files and Archiving .gov and .mil Websites

Robots.txt Files and Archiving .gov and .mil Websites. Alexis Rossi. Internet Archive Blogs. December 17, 2016.
     The Internet Archive collects webpages "from over 6,000 government domains, over 200,000 hosts, and feeds from around 10,000 official federal social media accounts". Do they ignore robots.txt files? Historically, sometimes yes and sometimes no, but the robots.txt file is less useful than it was, and is becoming less so over time, particularly for web archiving efforts. Many sites do not actively maintain the files, or increasingly block crawlers with other technological measures. The "robots.txt file is not relevant to a different era". The best way for webmasters to exclude their sites is to contact archive.org and specify the exclusion parameters.

"Our end-of-term crawls of .gov and .mil websites in 2008, 2012, and 2016 have ignored exclusion directives in robots.txt in order to get more complete snapshots. Other crawls done by the Internet Archive and other entities have had different policies."  The archived sites are available in the beta wayback. They have had little feedback at all on their efforts. "Overall, we hope to capture government and military websites well, and hope to keep this valuable information available to users in the future."


Thursday, December 22, 2016

Securing Trustworthy Digital Repositories

Securing Trustworthy Digital Repositories. Devan Ray Donaldson, Raquel Hill, Heidi Dowding, Christian Keitel.  Paper, iPres 2016. (Proceedings p. 95-101 / PDF p. 48-51).
     Security is necessary for a digital repository to be trustworthy. This study looks at digital repository staff members’ perceptions of security for Trusted Digital Repositories (TDR) and explores:
  • Scholarship on security in digital preservation and computer science literature
  • Methodology of the sample, and data collection, analysis techniques
  • Report findings; discussion of implications of the study and recommendations
Security in the paper refers to “the practice of defending information from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction”.  Three security principles mentioned are confidentiality, integrity, and availability.  Recent standards for TDRs show the best practices of the digital preservation community, including security as part of attaining formal “trustworthy” status for digital repositories. However, security can be hard to measure. Part of security is the threat modeling process, where "assets are identified; threats against the assets are enumerated; the likelihood and damage of threats are quantified; and mechanisms for mitigating threats are proposed". Understanding threats should be based on "historical data, not just expert judgment" to avoid unreliable data. The study discusses the Security Perception Survey, which "represents a security metric focused on the perceptions of those responsible for managing and securing computing infrastructures". 

Two standards, DIN 31644 and ISO 16363, draw upon DRAMBORA, an earlier standard, which consisted of six steps for digital repository staff members:
  1. identify their objectives
  2. identify the central activities necessary to achieve those objectives, and their assets
  3. align and document risks to their activities and assets
  4. assess, avoid, and treat risks by each risk’s probability, impact, owner, and remedy (a toy sketch follows the list)
  5. determine which threats are most likely to occur and identify the improvements required
  6. complete a risk register of all identified risks and the results of their analysis
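
A toy sketch of the quantification in step 4, scoring each risk as probability times impact; the risks and numbers are invented:

```python
# Hypothetical risk register: score = probability x impact, highest first.
risks = [
    {'risk': 'storage media failure', 'probability': 0.3, 'impact': 5,
     'owner': 'storage team', 'remedy': 'replicate to a second site'},
    {'risk': 'format obsolescence', 'probability': 0.2, 'impact': 4,
     'owner': 'preservation team', 'remedy': 'migrate to open formats'},
]

for r in sorted(risks, key=lambda r: r['probability'] * r['impact'],
                reverse=True):
    score = r['probability'] * r['impact']
    print(f"{r['risk']}: score {score:.1f}, owner {r['owner']}, "
          f"remedy: {r['remedy']}")
```
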
Security is a major issue for digital repositories. "Taken together, standards for TDRs underscore the importance of security and provide relatively similar recommendations to digital repository staff members about how to address security." Participants in this study found the security criteria in the standard that they chose sufficient.

Wednesday, December 21, 2016

We Are Surrounded by Metadata--But It’s Still Not Enough

We Are Surrounded by Metadata--But It’s Still Not Enough. Teresa Soleau. In  Metadata Specialists Share Their Challenges, Defeats, and Triumphs. Marissa Clifford. The Iris. October 17, 2016.
     Many of their digital collections end up in their Rosetta digital preservation repository. Descriptive and structural information about the resources comes from many sources, including the physical materials themselves as they are being reformatted. "Metadata abounds. Even file names are metadata, full of clues about the content of the files: for reformatted material they may contain the inventory or accession number and the physical location, like box and folder; while for born-digital material, the original file names and the names of folders and subfolders may be the only information we have at the file level."

A major challenge is that the collection descriptions must be at the aggregate level because of the volume of materials, "while the digital files must exist at the item level, or even more granularly if we have multiple files representing a single item, such as the front and back of a photograph". The question is how to provide useful access to all the digital material with so little metadata. This can be overwhelming and inefficient if the context and content are difficult to recognize and understand. And "anything that makes the material easier to use now will contribute to the long-term preservation of the digital files as well; after all, what’s the point of preserving something if you’ve lost the information about what the thing is?"

Technical information about the files themselves serves as a fingerprint that helps verify a file hasn’t changed over time, in addition to tracking what has happened to the files after entering the archive. Software preservation, such as with the Software Preservation Network, is now being recognized as an important effort, and digital preservationists are working out who should be responsible for preserving which software. There are many preservation challenges yet to be solved in the years ahead.
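
A minimal sketch of such a fingerprint, a checksum recorded at ingest and re-verified later (standard library only; the file name is invented):

```python
# Sketch: fixity checking with a SHA-256 checksum.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

recorded = sha256_of('photo_front.tif')           # stored at ingest
assert sha256_of('photo_front.tif') == recorded   # later fixity audit
```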


Tuesday, December 20, 2016

File Extensions and Digital Preservation

File Extensions and Digital Preservation. Laura Schroffel. In  Metadata Specialists Share Their Challenges, Defeats, and Triumphs. Marissa Clifford. The Iris. October 17, 2016
     The post looks at metadata challenges in digital preservation. Most of the born-digital material they work with exists on outdated or quickly obsolescing media, such as floppy disks, compact discs, hard drives, and flash drives; it is transferred into their Rosetta digital preservation repository and made accessible through Primo.

"File extensions are a key piece of metadata in born-digital materials that can either elucidate or complicate the digital preservation process". The extensions describe format type, provide clues to file content, and indicate a file that may need preservation work. The extension is an external label that is human readable, often referred to as external signatures. "This is in contrast to internal signatures, a byte sequence modelled by patterns in a byte stream, the values of the bytes themselves, and any positioning relative to a file."

Their born-digital files are processed on a Forensic Recovery of Evidence Device (FRED), which can acquire data from many types of media, such as Blu-ray, CD-ROM, DVD-ROM, Compact Flash, Micro Drives, Smart Media, Memory Stick, Memory Stick Pro, xD Cards, Secure Digital Media and Multimedia Cards. The workstation also runs the Forensic Toolkit (FTK) software, which can process a file and indicate the file format type and often the software version. There are challenges, since file extensions are not standardized or unique: naming conflicts arise between types of software, and older Macintosh systems did not require file extensions. Also, because FRED and FTK originated in law enforcement, challenges arise when using them to work with cultural heritage objects.


Monday, December 19, 2016

Metadata Specialists Share Their Challenges, Defeats, and Triumphs

Metadata Specialists Share Their Challenges, Defeats, and Triumphs. Marissa Clifford. The Iris. October 17, 2016.
     "Metadata is a common thread that unites people with resources across the web—and colleagues across the cultural heritage field. When metadata is expertly matched to digital objects, it becomes almost invisible. But of course, metadata is created by people, with great care, time commitment, and sometimes pull-your-hair-out challenge."  At the Getty there are a number of people who work with metadata "to ensure access and sustainability in the (digital) world of cultural heritage—structuring, maintaining, correcting, and authoring it for many types of online resources." Some share their challenges, including:
Some notes from some of the challenges:
  • The metadata process had to be re-thought when they started publishing digitally, because the metadata machinery was built specifically for print books. That proved mostly useless for their online publications, so they started from scratch to find the best ways of sharing book metadata to increase discoverability. 
  • "Despite all of the standards available, metadata remains MESSY. It is subject to changing standards, best practices, and implementations as well as local rules and requirements, catalogers’ judgement, and human error." 
  • Another challenge with access is creating relevancy in the digital image repository 
  • Changes are needed in skills and job roles to make metadata repositories truly useful. 
  • "One of the potential benefits of linked open data is that gradually, institutional databases will be able speak to each other. But the learning curve is quite large, especially when it comes to integrating these new concepts with traditional LIS concepts in the work environment."

Thursday, December 15, 2016

DPN and uploading to DuraCloud Spaces

DPN and uploading to DuraCloud Spaces. Chris Erickson. December 15, 2016.
     For the past while we have been uploading preservation content into DuraCloud as the portal to DPN. DuraCloud can upload files by drag-and-drop, but a better way is the DuraCloud Sync Tool (the wiki had helpful information for setting this up). The sync tool can copy files from any number of local folders to a DuraCloud Space, and can add, update, and delete files. I preferred running the GUI version in one browser window with the DuraCloud account open in another.

We have been reviewing all of our long term collections and assigning Preservation Priorities, Preservation Levels, and also the number of Preservation Copies. From all this we decided on three collections to add to DPN, and created a Space (which goes into an Amazon bucket) for each. The Space will then be processed into DPN:
  1. Our institutional repository, including ETDs (which are now born digital) and research information, from our ScholarsArchive repository
  2. Historic images that have been scanned; the original content is either fragile or not available. Exported from Rosetta Digital Archive.
  3. University audio files; the original content was converted from media that is at risk. Some from hard drives, others exported from Rosetta Digital Archive.
Some of the files were already in our Rosetta preservation archive, and some were in processing folders ready to be added to Rosetta; they all had metadata files with them. The sync tool worked well for uploading these collections, configuring the source location as the Rosetta folders and the target as the corresponding DuraCloud Space. Initially, the uploading was extremely slow: several days to load 200 GB. But DuraCloud support provided a newer, faster version of the sync tool, and we changed to a faster connection. The upload threads went from 5 to 26, and we uploaded the next TB in about a day.

We also had a very informative meeting with DPN and the two other universities in Utah that are DPN members, where Mary and Dave told us that the price per TB was now half the original cost. Also, that unused space could be carried over to the next year. This will be helpful in planning additional content to add. Instead of replicating our entire archive in DPN, we currently have a hierarchical approach, based on the number and location of copies, along with the priorities and preservation levels.


Wednesday, December 14, 2016

PDF/A as a preferred, sustainable format for spreadsheets?

PDF/A as a preferred, sustainable format for spreadsheets?  Johan van der Knijff. johan's Blog. 9 Dec 2016.
     The National Archives of the Netherlands published a report on preferred file formats, with an overview of its ‘preferred’ and ‘acceptable’ formats for 9 categories. The blog post concerns the ‘spreadsheet’ category, for which the report lists the following ‘preferred’ and ‘acceptable’ formats:
  • Preferred:  ODS, CSV, PDF/A     
  • Acceptable: XLS, XLSX
And the justification / explanation for using PDF/A:
PDF/A – PDF/A is a widely used open standard and a NEN/ISO standard (ISO:19005). PDF/A-1 and PDF/A-2 are part of the ‘act or explain’ list. Note: some (interactive) functionality will not be available after conversion to PDF/A. If this functionality is deemed essential, this will be a reason for not choosing PDF/A
There are some problems with the choice of PDF/A and its justification:
  • Displayed precision is not equal to stored precision (a toy sketch follows the list)
  • Loss of precision after exporting to PDF/A
    • Also loss of precision after exporting to CSV
    • Using cell formatting to display more precise data is possible, but less than ideal
  • Interactive content
  • Reading PDF/A spreadsheets: this may be difficult without knowing the intended users, the target software, the context, or how the user intends to use the spreadsheet
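
The display-versus-stored precision point is easy to demonstrate:

```python
# Sketch: a cell can display a rounded value while storing full precision;
# an export that keeps only the displayed value loses information.
stored = 2 / 3                # stored precision: 0.6666666666666666
displayed = f'{stored:.2f}'   # a two-decimal cell format shows '0.67'

exported = float(displayed)   # an export based on the displayed value
print(stored - exported)      # nonzero: precision lost in the export
```
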
The justification states that some interactive functionality "will not be available after conversion to PDF/A. If this functionality is deemed essential, this will be a reason for not choosing PDF/A." However, deciding what functionality is ‘essential’ depends on the context and the intended user base. In addition, the interactivity criterion may imply that any spreadsheet that does not involve interaction with a user can be safely converted to PDF/A; it may be better to make an explicit distinction between ‘static’ and ‘dynamic’ spreadsheets.

There may be situations where PDF/A is a good or even the best choice, but choosing a preferred format should "take into account the purpose for which a spreadsheet was created, its content, its intended use and the intended (future) user(s)."


Monday, December 12, 2016

Harvesting Government History, One Web Page at a Time

Harvesting Government History, One Web Page at a Time.  Jim Dwyer. New York Times. December 1, 2016.
     With the arrival of any new president, large amounts of information on government websites are at risk of vanishing within days. Digital federal records, reports and research are very fragile. "No law protects much of it, no automated machine records it for history, and the National Archives and Records Administration announced in 2008 that it would not take on the job."  Referring to government websites: “Large portions of dot-gov have no mandate to be taken care of. Nobody is really responsible for doing this.”  The End of Term Presidential Harvest 2016  project is a volunteer, collaborative effort by a small group of university, government and nonprofit libraries to find and preserve valuable pages that are now on federal websites. The project began before the 2008 elections. Harvested content from previous End of Term Presidential Harvests is available at http://eotarchive.cdlib.org/.

The project has two phases of harvesting:
  1. Comprehensive Crawl: The Internet Archive crawled the .gov domain in September 2016, and will crawl it again after the inauguration in 2017.
  2. Prioritized Crawl: The project team will create a list of related URLs and social media feeds.
The political changes at the end of presidential terms over the past 8 years have made a lot of people worried about the longevity of federal information.

Saturday, December 10, 2016

Error detection of JPEG files with JHOVE and Bad Peggy – so who’s the real Sherlock Holmes here?

Error detection of JPEG files with JHOVE and Bad Peggy – so who’s the real Sherlock Holmes here?  Yvonne Tunnat. Yvonne Tunnat's Blog. 29 Nov 2016.
     A post that describes an examination of the findings of two validation tools, JHOVE (version 1.14.6) and Bad Peggy (version 2.0), which scans image files for damage using the Java Image IO library. The goal of the test is to compare the findings from these validation tools and know what to expect in digital curation work. There were 3,070 images in the test, including images from Google's publicly available Imagetestsuite; 1,007 of the files had problems.

The JHOVE JPEG module can determine 13 different error conditions; Bad Peggy can distinguish at least 30 errors. The results of each are in tables in the post. The problem images could not be opened and displayed, or had missing parts, mixed-up parts, and colour problems. The conclusion is that Bad Peggy was able to detect all of the visually corrupt images, while the JHOVE JPEG module missed 7 corrupt images out of 18.
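
Neither JHOVE nor Bad Peggy is shown here, but a minimal Python sketch using the Pillow library gives a feel for this kind of check; verify() plus a full decode catches only some classes of damage:

```python
# Sketch: flag JPEGs that fail a structural check or a full decode.
from pathlib import Path
from PIL import Image

def looks_intact(path):
    try:
        with Image.open(path) as im:
            im.verify()   # structural check without a full decode
        with Image.open(path) as im:
            im.load()     # full decode catches truncated image data
        return True
    except Exception:
        return False

bad = [p for p in Path('images').glob('*.jpg') if not looks_intact(p)]
print(f'{len(bad)} suspect files')
```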

Thursday, December 08, 2016

OAIS: a cage or a guide?

OAIS: a cage or a guide? Barbara Sierman. Digital Preservation Seeds. December 3, 2016.    
     A post about the OAIS standard, asking whether it is a restriction or a guide. It considers the functional model, the data model and metrics in OAIS, and related standards like the audit and certification standard. "OAIS is out there for 20 years and we cannot imagine where digital preservation would be, without this standard." It is helpful for discussing preservation by naming the related functions and metadata groups, but it lacks a link to implementation and application for daily activities. OAIS is a lot of common sense put into a standard. The audit and certification standard, ISO 16363, is meant to explain how compliance can be achieved, a more practical approach.

Many organisations are using this standard to answer the question "Am I doing it right?" People working with digital preservation want to know the approach that others are using and the issues that they have solved. The preservation community needs to "evaluate regularly whether the standards they are using are still relevant in the changing environment", and a continuous debate is required to do this. In addition, we need evidence that practical implementations that follow OAIS are the best way to do digital preservation. Proof of what worked and what did not work is needed in order to adapt standards, and the DPC OAIS community wiki has been set up to gather thoughts related to the practical implementation of OAIS and to provide practical information about the preservation standards.


Monday, December 05, 2016

Digital Preservation Network - 2016

Digital Preservation Network - 2016. Chris Erickson. December 5, 2016.
     An overview of the reasons for DPN. Academic institutions require that their scholarly histories, heritage and research remain part of the academic record. This record needs to continue beyond the life spans of individuals, technological systems, and organizations. The loss of academic collections that are part of these institutions could be catastrophic. These collections, which include oral history collections, born-digital artworks, historic journals, theses, dissertations, media and fragile digitizations of ancient documents and antiquities, are irreplaceable resources.

DPN is structured to preserve the stored content by using diverse geographic, technical, and institutional environments. The preservation process consists of:
  1. Content is deposited into the system through an Ingest Node (Ingest Nodes are preservation repositories themselves); 
  2. Content is replicated to at least two other Replicating Nodes and stored in different types of repository infrastructures; 
  3. Content is checked by bit auditing and repair services to prevent change or loss; 
  4. Changed or corrupted content is restored by DPN; 
  5. As Nodes enter and leave DPN, preserved content is redistributed to maintain the continuity of preservation services into the far-future.
The Ingest Node that we are using is through DuraCloud.


Thursday, December 01, 2016

Implementing Automatic Digital Preservation for a Mass Digitization Workflow

Implementing Automatic Digital Preservation for a Mass Digitization Workflow. Henrike Berthold, Andreas Romeyke, Jörg Sachse.  Short paper, iPres 2016.  (Proceedings p. 54-56 / PDF p. 28-29). 
     This short paper describes their preservation workflow for digitized documents and the in-house mass digitization workflow, based on the Kitodo software, and the three major challenges encountered.
  1. validating and checking the target file format and the constraints to it,
  2. handling updates of content already submitted to the preservation system, 
  3. checking the integrity of all archived data in an affordable way
They produce several million scans a year and preserve these digital documents in their Rosetta-based archive, which is complemented by a submission application for pre-ingest processing, an access application that prepares the preserved master data for reuse, and a storage layer that ensures the existence of three redundant copies of the data in permanent storage and a backup of data in the processing and operational storage. They have customized Rosetta operations with plugins they developed. In the workflow, the data format of each file is identified and validated, and technical metadata are extracted. AIPs are added to the permanent storage (disk and LTO tapes). The storage layer, which uses hierarchical storage management, creates two more copies and manages them.

To ensure robustness, only single page, uncompressed TIFF files are accepted. They use the open-source tool checkit-tiff to check files against a specified configuration. To deal with AIP updates, files can be submitted multiple times: the first time is an ingest, all transfers after that are updates. Rosetta ingest functions can add, delete, or replace a file. Rosetta can also manage multiple versions of an AIP, so older versions of digital objects remain accessible for users.

They manage three copies of the data, which totals 120 TBs. An integrity check of all digital documents, including the three copies, is not feasible due to the time that is required to read all data from tape storage and check them. So to get reliable results without checking all data in the archive they use two different methods:

  • Sample method: a 1% sample of archival copies is checked for integrity yearly (a minimal sketch follows the list)
  • Fixed bit pattern: a specified fixed bit pattern workflow is checked quarterly
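
A minimal sketch of the sample method; the manifest format and paths are invented, and the idea is simply to re-hash a random 1% of files against stored checksums:

```python
# Hypothetical 1% fixity sample against a stored checksum manifest.
import hashlib
import json
import random

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 20), b''):
            h.update(chunk)
    return h.hexdigest()

with open('aip_manifest.json') as f:
    manifest = json.load(f)   # {file path: expected SHA-256}

sample = random.sample(list(manifest), max(1, len(manifest) // 100))
failures = [p for p in sample if sha256_of(p) != manifest[p]]
print(f'checked {len(sample)} files, {len(failures)} mismatches')
```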

Their current challenges are in supporting new media types (digital video, audio, photographs and PDF documents), unified pre-ingest processing, and automation of processes (e.g. performing tests of new software versions).