The increased digitalisation of science and technology is problematic for museums as institutions for the preservation of the material cultural heritage.
The reason is that we usually think of digital information as something ‘immaterial’, a mere collection of zeros and ones that, as Jean-François Blanchette (Dept. of Information Studies, UCLA) points out, is considered “wholly independent from the particular media on which it resides (hard drive, network wires, optical disk, etc.) and the particular signal carrier which encode bits (magnetic polarities, voltages, or pulses of light)”.
But, suggests Blanchette, “however immaterial it might appear, information cannot exist outside of given instantiations in material forms”. Building on, among others, Philip Agre’s Computation and human experience (1997), Blanchette discusses the material foundation of digital information:
It suggests that various factors, including the trope of immateriality, have obscured the physical constraints that obtain on the storage, circulation, and processing of digital information, resulting in inadequate theorization of this fundamental dimension of information systems. In fact, computing systems are suffused through and through with the constraints of materiality, and the computing professions devote much of their activity to the management of these constraints, as manifested in infrastructure software.
In Blanchette’s analysis, computing is material through and through:
But this materiality is diffuse, parceled out and distributed throughout the entire computing ecosystem. It is always in subtle flux, structured by the persistence of modular decomposition, yet pressured to evolve as new materials emerge, requiring new trade-offs. This project thus argues that, in a very literal and fundamental sense, materiality is a key entry point for reading infrastructural change, for identifying opportunities for innovation that leverage such change, and for acquiring a deep understanding of the possibilities and constraints of computing. This understanding is not particularly provided by exposure to programming languages. Rather, it requires familiarity with the conflicts and compromises of standardization, with the principles of modularity and layering, and with a material history of computing that largely remains to be written.
In other words, this is food for thought for those of us who are interested in the materiality of contemporary computational biomedicine. Read the whole paper (submitted to the Journal of the American Society for Information Science and Technology) here.
(thanks to Haidy L. Geismar for the tip)