Metadata are data about data. But a better mental model than that textbook definition is a room key. If a key is lying on the street, there is a good chance it will end up in the garbage or the sewer. If, however, someone picks it up and has a clue about which building it belongs to and who owns it, the room can be located far more easily. Most modern digital cameras record a fairly large amount of metadata, such as manufacturer and model, exposure and resolution, date and time, and GPS coordinates, the moment you click the shutter. These metadata cause little harm or benefit while floating in the background. But when people examine them and try to make sense of them, they can be a game-changer. The 2013 revelations by Edward Snowden about the US National Security Agency's surveillance programme were largely about call metadata. One might assume that the actual voice recordings would pose the immediate danger; yet someone can make a very good educated guess knowing only the callers' and recipients' identities and the durations of the calls, i.e. the metadata.
There are many ways to preserve important pieces of information. A trusted method is to set rules and procedures, an approach that is prevalent in the mining industry. Even so, the industry suffers from memory loss from time to time, mainly due to workforce turnover and poor management of its datasets. Consider the closing of a mine after twenty-plus years of successful operation: the longest-lasting value probably comes from a well-organized record of past failures and successes.
The earliest known library catalogue, the Pinakes, dates back to about 245 BCE and was organized by author and subject. In mining, by contrast, the way we organize our catalogues has been inconsistent. It is logical to expect that solutions to problems in operating mines would be refined over time. But if our hypothetical mine keeps critical information in an Outlook email chain or in a strangely named folder on a network drive, that is even worse than word-of-mouth transmission of vital knowledge. So what are the characteristics of better-managed metadata?
With the advent of the internet, an overabundance of online information with varying degrees of structural complexity made searching hard. Without any consensus on standard metadata, web searches returned results that either matched too narrowly or too broadly. One solution was to describe all networked resources with fifteen metadata elements, the Dublin Core Elements. Almost any man-made object can be described, at least partially, with those elements; they are a lowest common denominator. Even though Google search does not operate on the Dublin Core, it reminds us of the need for a fixed schema so that different resources can talk to each other. Such metadata can be expressed as subject-predicate-object triples. For instance, 'XXX' underground drift (subject) is (predicate) 15 years old (object). The subject should preferably be tied to other resources (either a weblink or a link to an existing database) to ensure continuity and reduce ambiguity.
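The triple model above can be sketched in a few lines of Python. This is an illustrative sketch only: the prefixes (`mine:`, `dc:`, `ex:`) and identifiers are hypothetical, not taken from any real schema, although `dc:` echoes the Dublin Core convention.

```python
# Metadata triples for a hypothetical mine asset, stored as
# (subject, predicate, object) tuples. All identifiers are made up.
triples = [
    ("mine:drift/XXX", "dc:type", "underground drift"),
    ("mine:drift/XXX", "ex:ageYears", 15),
    ("mine:drift/XXX", "dc:relation", "https://example.org/survey/2008"),
]

def describe(subject, store):
    """Collect every predicate-object pair recorded for a subject."""
    return {p: o for s, p, o in store if s == subject}

record = describe("mine:drift/XXX", triples)
```

Because the subject is a stable identifier rather than free text, any new fact about the drift (an inspection report, a closure date) can be attached later as another triple without restructuring what is already there.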
The concept of a living document in the mining industry is probably realistic only if a job title is tied to the document or database so that it can be updated and accounted for. At the other end of the spectrum is the concept of "open source", a rethinking of how software and hardware are developed and distributed. Anyone can pull the source up and use or modify it. Being open creates a dynamic environment and engages a wider community, which in turn can offer better security because the work is under the continuous watch of its users. The shorter feedback loop improves the odds of long-term success, because it accepts failure and helps contributors mature. This is probably why Linux-family operating systems have withstood the test of time and remain a go-to choice for developers. So another proposed characteristic of mining metadata is to open them to as many people as possible.
The third characteristic is about creating a chain across the users of documents and datasets. It is important to keep a usage log. Perhaps the most extreme example would be knowing who was alarmed at what time, and how they responded, when a safety-critical system failed or was triggered. However, a usage log usually stays passive in the background until something goes wrong, and it loses the detail of interactions among users. In recent years, digital signatures for documents have been used extensively thanks to their convenience and security. In principle, the document is first reduced to a fixed-length hash (e.g. 256 bits with SHA-256), and that hash is encrypted with the signing party's private key. Any minor change in transit produces a different hash. If the signature is legitimate, the recipient can decrypt it with the sender's public key and confirm that it matches the hash they compute themselves. Extending this further, such a handshake plus the previous block's hash ID forms a block. A blockchain is a matter of connecting every block via the previous block's hash and replicating the chain to every node in the network. Whichever chain is longest, i.e. backed by the most computational work, is considered valid. A blockchain with minor modifications could be useful in the mining industry, especially for defining live documents or files, since it guarantees a feedback loop and rewards longer chains of contributions.
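The chaining idea above can be sketched with nothing more than SHA-256 from Python's standard library. The document revisions below are invented for illustration, and a real blockchain adds signatures and consensus on top of this linking mechanism, but the sketch shows the core property: tampering with any earlier block breaks every hash that follows it.

```python
import hashlib

def block_hash(prev_hash: str, payload: str) -> str:
    """SHA-256 over the previous block's hash plus this block's payload."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    """Link each payload to its predecessor via the previous block's hash."""
    chain, prev = [], "0" * 64  # genesis block points at an all-zero hash
    for p in payloads:
        h = block_hash(prev, p)
        chain.append({"prev": prev, "payload": p, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every link; any tampered payload breaks the chain."""
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or block_hash(prev, b["payload"]) != b["hash"]:
            return False
        prev = b["hash"]
    return True

# Hypothetical document revisions for a mine ventilation report.
chain = build_chain(["rev1: ventilation survey", "rev2: updated fan curves"])
assert is_valid(chain)

chain[0]["payload"] = "rev1: edited after the fact"
assert not is_valid(chain)  # the retroactive edit is detected
```

This is why a chained usage log is stronger than a passive one: a retroactive edit cannot be hidden without recomputing, and redistributing, every subsequent block.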
Wasting effort by reinventing the wheel is inevitable when things are misplaced, because misplacing them is effectively the same as losing them altogether. Better-organized and forward-looking metadata management could be a silver bullet for the problem of institutional memory loss. To reiterate: first, a metadata scheme needs to be agreed upon and must be traceable and interconnected; second, metadata need to be as open as possible to keep them alive and ensure continuous improvement; third, incentives for active participation and guaranteed transactions could be provided via a blockchain system.
Contributor: Munkhtsolmon Munkhchuluun