Monday 13 November 2023

Executing the King's English

In early January, I was transcribing a supplementary document for court proceedings. It was a simply awful time for someone who was an English grammar teacher in a past life. (I nearly got an English literature scholarship for my first university degree, too.) The documents were so badly written that they gave me profound thoughts about the death of my language (which we will get to!).

There were hundreds of pages of this stuff, mostly variations on the first ten pages or so, but it wasn’t exactly repetitive. (At least that would have been quick and easy to transcribe!) Every page was full of mistakes, but different mistakes. It was all written by the same person – as far as I can tell – who simply couldn’t manage spelling, capitalisation, or indentation. That much I could excuse, except this person didn’t even make consistent mistakes; it was almost as if they didn’t know, so they were hedging their bets. Their spelling was puzzling, and even their sentence structure didn’t always make sense. After I had transcribed only dozens of the hundreds of pages of this documentation, the Microsoft spell checker was highlighting ‘Melbourne’ and ‘subtotal’ as mistakes. This writing was so bad that it caused the machine to second-guess itself! I even started wondering: how could someone learn to use a language in a way that made them actively more difficult to understand?

Of all things, I started thinking about the latest series of advertisements I’d been getting on YouTube. These were obviously written and voiced entirely by artificial intelligence, then recommended to me by another artificial intelligence, in a way that insulted my organic intelligence. None of the artificial intelligences involved was capable of detecting any mistakes. (Pro tip: try saying “this AU$200 machine” about anything. It’s just weird and wrong, isn’t it?)

Even as a child, I observed how television programming was becoming more automated, with less human involvement. Who, for example, would decide to begin an advertising break midway through a sentence? To cut off a joke before the punchline? Now, though, the artificial intelligence writes and voices the advertisements as well as recommending them to me, after engineering the greatest certainty that my attention won’t go anywhere else.

I had a horrible flash-forward to a future in which even such creative and intuitive tasks would be performed entirely by artificial intelligence. They would, of course, be designed and administered by unaccountable mathematicians and engineers, who in turn are not likely to deeply understand grammatical intricacy or poetic appeal. Even the language tools designed by engineers prove this: digital spelling checkers and online translators are an imperfect substitute for a professional proofreader. Nevertheless, people seem more willing to pay for tools like Grammarly than for a human proofreader (at least according to the advertisements served to me, a former grammar teacher).

We are familiar with the expression that “those who don’t learn [from] history are doomed to repeat it” (Santayana, 1905). I recently heard a variation of this: that “those who learn history are doomed to watch helplessly while others repeat it”. I foresaw yet another version, in which people who passionately care about language – or English, at least – are doomed to watch helplessly while it is abused by bureaucracies that are insufficiently self-aware to realise the problem, let alone their contribution to it.

From my perspective, the trouble began (just) before my time. The 1980s push for economic rationalism valued maths over languages, and technical skills over contextual thinking. I distinctly remember being told in school that there was no need to study anything except business, law, and accounting degrees, in that order, because “every way you want to work is a business and everyone you want to do business with speaks English”. Retrospectively, this proved my point about awareness, because the advice was given to us by a P.E. teacher. As a society we invest butt-tonnes of money into professional and community sport. The dividends are a society which is less healthy than it was 50 years ago, similarly successful in international competitions, but which spends more per head on gambling than anyone else (including other cultures that are stereotypically famous for it). My lifetime had already seen the Japanese boom and the $5 orange, with its legacy still visible in parts of this country. Within a few short years of us young ones receiving this advice came the China boom. The major skills in demand were mining and civil engineering, but we already had people who could do that, as well as sports science. What we needed now was Mandarin fluency and cultural skill. Whoops!

We muddled through that, of course; Australia continues to be the lucky country. China is the major trading partner for us and pretty much everyone else. Yet there is still a shortage of people who “understand” China, and of people who can write well. People can’t write well? How can that be? We all learned that at school! So did the people who wrote the submission, who somehow don’t understand capitalisation, punctuation, or spelling.

You might have heard employers today complaining that there are too many people in the job market who “don’t know how to write”. Perhaps this is also linked to a resurgent discourse of “soft skills” in hiring practice? Never mind that the project managers and lawyers would all have been encouraged to study “hard” sciences, maths, and technology over languages or creative/performing arts. Those on the other side will be consigned to jobs in data entry, systems maintenance and administration, hospitality, childcare, aged care, teaching, social work, or manual trades, without any hint of irony. (That is, until machines completely replace manual trades as well.) Irony can at least be unintentional; it doesn’t require reflective capacity, after all. According to recruitment theory, formal education is only 10% of what somebody knows. Education theory likewise says that one reliably retains only 10% of lesson input, if one was even paying attention. What these jobs tend to have in common is low pay, low social status, and insecure conditions. The rich will get even richer, and the poor will remain poor.

None of my concerns are at all new. The book ‘Eats, Shoots &amp; Leaves’ (Truss, 2003) is based on a pun which most native anglophones wouldn’t immediately understand, since the pun is inherently grammatical. The book makes an analogy to a parrot in danger, which can only scream out phrases and words it has overheard (from memory, “I’ve lost my other sock!”). The parrot doesn’t understand that those words don’t fit its current context, and that a different expression would fit perfectly (e.g., “Help! Fire!” or “Call an ambulance!”). Even context is beyond its bird-brain, and so it dies helplessly. I may never even have learned about any of this in school, except for a particular choice of subjects and teachers.

Even George Orwell took inspiration from science-fiction dystopias in which human government was completely surrendered to artificial intelligence. Accordingly, the population were taught facts like 2 + 2 = 4 without ever learning the meaning of ‘plus’. Presumably, no one had thought that was important enough to program into the A.I. Orwell’s crowning glory (1984) imagined a world in which all literature, cinema, and other performing arts were procedurally generated by machines, operated by people with the requisite technical skill who didn’t actually care about the content they generated. Meanwhile, the P.O.V. protagonist Winston still reads books recreationally. He values his books so highly that he tries to keep them secret from the deep state. He is also an employee of that same state, and his primary duties appear to be maintaining the retroactive continuity of the authorised truth and history. While comparing what we don’t like about the world to 1984 is very clichéd – see also ‘The Handmaid’s Tale’ – I’d like to consider how to prevent this future from becoming irreversibly entrenched.

I’m a great admirer of continental Europe. I appreciated learning in school – since I defied that P.E. teacher’s advice and continued studying languages, in which I got good marks – that the German we were taught had been codified and authorised by the German government. This took a rigorous and extensive scientific process involving qualified professionals, academics, and politicians who took their tasks very seriously. The French government likewise regulates the vocabulary and grammar of their language. Perhaps there is another purist in their society who bemoans the encouragement of “le weekend”?

This role I see being taken up by the Commonwealth of Nations – which presently also seems to care more about sport than other forms of “cultural ties” – or the British Council. With input from truly worldwide expertise and understanding, there should be authoritative standards separate from the influence of celebrity micro-blogging, deliberate misspelling as corporate trademarks, programming syntax, and other sources which society values more than its own science and historical evolution. That standard could helpfully replace the absurdity of the Microsoft and Apple spell checkers treating Australian, Bermudan, Canadian, Irish, New Zealand, Zimbabwean, and British English as separate languages, then defaulting to American English, apparently on a whim.

Our language is dying. The forces involved are globally significant, and they require a globalised response. I certainly can’t save it on my own. The King’s English needs our help. (If King Charles happens to get a copy of this, I’m available to discuss how to reverse this trend.)

Monday 13 February 2023

Everyone needs a librarian

 

We think about librarianship as being primarily about cataloguing and inventory: knowing which books are in stock and exactly where they all are. The profession itself regards its work as precise service: matching what a customer wants exactly, down to the best storage medium (book, D. V. D., etc.). My argument is that the core of the profession is maintenance: ensuring that the information you have is accurate and accessible. This profession should be treated as mission-critical for any serious bureaucracy. My opinion comes from both study and experience.

My first post on this log mentioned how ‘library’ derives from the name for “a place of papers”, typically part of a mediæval monastery or cathedral school. Imagine the hazardous conditions of that time and place: mould from European rain in a room without sealed windows; moths and other bugs; fire; theft; general entropy. A large part of a librarian’s time would have been taken up by checking the condition of the ‘papers’ and commissioning replacements as needed. Even in classical times, the full contents of a library were restricted, so controlling access has also been a core function. Discretion over records was also historically necessary: with all the effort and expense of producing new books, only knowledge judged to be historically significant was preserved at all.
We’ve talked about living and working in the information age for a long time (~30 years, I’d say). We’ve all surely heard talk about “working smarter, not harder”. I’d argue that attitude is especially relevant in hot, dry climates like Australia’s. This has taken on added impetus since the Chinese Communist Party last year amended its fundamental doctrine to say that data is now a factor of economics, as important as land, labour, and capital. All of this should mean that managing information – and information literacy – are skills just as important as managing people or machines. Even Australian defence planning now talks about having “decision superiority”.
About ten years ago, I worked in a business which had a revolutionary service offering: a paperless office. Another part of their revolution was doing away with a traditional I. T. department in favour of simply ‘information’. Unfortunately, the revolution was incomplete: the – now fully digital – mail room wasn’t integrated with the ‘information’ department, and neither was freedom of information. Surely the change was made with the best intentions, but with tragic results: this was also the job in which I discovered bit rot.
A large part of my day was spent downloading clients’ case files onto a U. S. B. drive and walking them to solicitors’ offices. The only alternative – used by everyone else – was to have someone print every page of every file and carry them through the street. After pushing a trolley full of paper through central Melbourne, complete with its wind and rain, I realised that, for all the talk about modernisation and “working smarter, not harder”, this specific job hadn’t changed for at least 600 years. I jumped on board with the revolution of this new business, until the day a lawyer complained that I’d sent them a corrupted file. After checking, I had to explain that there was nothing we could do; the only (digital) copy of this file had been corrupted during upload. They shot the messenger soon afterwards.

The experience gave me the understanding to build on this topic in my Master’s degree. Entropy is the enemy of librarianship and archiving; we seek to maintain knowledge forever, but the physical medium for storing information will constantly decay. Whether it be paper and leather, the metal of a hard drive, the human brain, celluloid film, or even tablets literally carved in stone, we are always in a race against time. The bits rot as surely as a log. We keep seeking technology which is more permanent in storage, easy to secure, easy to maintain, and doesn’t corrode anything around it.
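Bit rot of the kind that struck that case file can at least be detected early, before the last good copy is gone. A minimal sketch of the standard approach, using checksums (the file and manifest names here are my own invention, not anything from that workplace):

```python
import hashlib
import json
from pathlib import Path


def checksum(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def record_manifest(folder: Path, manifest: Path) -> None:
    """Record a checksum for every file, to audit against later."""
    digests = {p.name: checksum(p) for p in folder.iterdir() if p.is_file()}
    manifest.write_text(json.dumps(digests, indent=2))


def audit(folder: Path, manifest: Path) -> list[str]:
    """Return the names of files whose contents no longer match the manifest."""
    digests = json.loads(manifest.read_text())
    return [name for name, recorded in digests.items()
            if checksum(folder / name) != recorded]
```

Run `record_manifest` when the files are known to be good, and `audit` on a schedule; any name it returns is a file that has silently changed and should be restored from a backup. The same idea, at industrial scale, is why serious archives verify their holdings against stored checksums rather than trusting the disks.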

With knowledge, experience, and understanding of bit rot, I’ve come around to a few realisations. I now understand why any organisation of sufficient size will periodically update/upgrade/refresh its storage media: one way to guard against bit rot is to always have the bits stored on factory-fresh hardware. I now also appreciate why professions heavy on media – like law, academia, or government – keep critical information in both ‘hard’ and ‘soft’ copies, if only in case of emergencies. I would also like to experiment more with printing on plastic or rubber. After all, given its rate of decomposition and the difficulty of recycling it, plastic will endure practically forever.
Furthermore, I maintain that librarians and archivists, as information managers, should be natural candidates for risk management. Keeping information resources safe and clean is an essential element of the profession. Maintaining information resources, including backup and recovery, is vital to any white-collar organisation operating at scale.

Monday 6 April 2020

Goat stories and globalised IP


Because I’ve been writing and thinking a lot about historical fiction in medieval Bohemia, YouTube recommended to me the “Old Goat Stories”, animated movies based on Central European fairy tales. At one point in the second one, I imagined myself as a writer inserting the line: “I would have gotten away with it too, if it weren’t for you meddling kids, and your talking goat!”
That got me thinking: can you copyright a catchphrase? At what point should intellectual property that becomes pervasive enough to turn into memes or tropes stop being protected?

The hair to split here is that short phrases and brand identifiers are protected by trademarks, not simply copyright. That’s a surprisingly important difference, because copyright is automatic upon publication, but trademarks effectively have to be bought from regulators. It’s similar to buying a radio frequency, or a licence to sell alcohol.
Furthermore, trademarks are only valid for specific purposes. Returning to the original research question, I looked up the U.S. Patent and Trademark Office and typed some phrases in. That’s how I learned that anyone can say “yabba dabba doo” on the radio or internet – because the original source material is no longer in production, so its trademark has lapsed – but not print it on a T-shirt, because the trademark on apparel is ongoing. Similarly, ripping off “what’s up, doc” or “wascally wabbit” without attribution will still cause trouble for you.

Secondly, copyright and trademarks operate for different purposes. Copyright is meant to spread original ideas around, as long as others acknowledge whose original idea it was. [Academia has this at its foundation.] The only time you’re allowed to get aggressive is when your work is copied without acknowledgement instead of being reproduced with it. Technically speaking, then, Warner Music could demand payment from any children’s party which sang “Happy Birthday” without acknowledging that they legally owned the song (a claim the courts finally rejected in 2015, placing the song in the public domain). Similarly, anyone who writes a series of books about Harry the Halfling travelling with a wizard across a war-torn hellscape populated by orcs to destroy an ancient artefact will probably find themselves in trouble with the Tolkien estate [at least until 2043, when the rights expire and the content enters the “public domain”].

However, we’re still free to write about “halflings” and “orcs”, and to spell “dwarves” instead of “dwarfs”, even though these are all things which Tolkien invented. This is because he didn’t register them as trademarks, which would have given a legal monopoly not only on copying but on any reproduction. Any easily recognisable symbol or slogan you’re aware of from the 20th century is likely a trademark.

The relationship between intellectual property and legal protections can be demonstrated by investigating some of the biggest brands around. Robin Hood has been around for so long that, even if we could determine the original author, it is definitely public domain. Precisely because it is a story and character so widely recognisable, D.C.’s Green Arrow and Marvel’s Hawkeye used it as the base for their own, newer products, which are now protected by commercial law. Likewise, the various animated and cinematic portrayals of Robin Hood and the merry men have paradoxically become products protected by commercial law. In particular, anyone who reimagines the story from a different angle could argue that the new angle is the author’s own innovation, and therefore protected.

Another successful example demonstrates why it can’t be assumed that an original author wants people to add to work they’ve already done, in order to preserve their money-making rights. Tolkien’s “swords and sorcery” genre of “fantasy adventure” led to Dungeons and Dragons (D&amp;D). (Seriously, even if you’ve never played before, I bet you’ve heard of it, right?) Since their content isn’t original, all they have left are the rules systems and game mechanics, and those have been open-sourced for 20 years. In the year 2000, the owners of the brand released an open game licence (OGL), allowing others – a.k.a. third-party publishers, 3PPs – to write and distribute material to be used with their game. Without this licence, the owners had monopoly distribution rights, which they could enforce under penalty of law. Instead, recognising the wide variety of “home-brewed” material that was already being used by their customers – including tweaks to their core rule systems – the publishers allowed that content to be distributed widely, provided that the new authors “clearly indicate which portions of the work that you are distributing are open game content” (OGL s8). Some of this experimentation they later adopted into new, official rules and mechanics.

Another consequence of throwing open the rights to their rules system was allowing more competition. Some of their staff left and started a rival game publishing company (Paizo), utilising the content covered by the OGL and taking it in a different direction. Normally that is exactly what you don’t want from your IP regime.

Doing another assignment, I was also impressed to find that Microsoft’s online pictures included coins and banknotes from Australia and Ethiopia. Inserting them into a document includes a text box with the licensing information, specifically: “This Photo by Unknown Author is licensed under CC BY-SA”. This is a Creative Commons protocol, meaning that anyone can use the image freely, anywhere, including for commercial purposes.

Other specific terms of the licence say to “indicate any changes made”, and that “any sharing of transformed material must be distributed under the same licence as the original”. After all, the point of Creative Commons is to encourage widespread use of content by removing barriers. For images of currency, the purpose is similarly indiscriminate distribution, which raises the question: if everyone is meant to have it, should it have any IP protection at all?

Yes, in fact. When Australia changed from pounds to dollars in 1966, we originally had $1 notes. With the best of intentions at the time, they had Aboriginal art printed on one side. The artist, David Malangi, recognised the work as his, and successfully pursued the Reserve Bank for compensation; the Bank admitted it had not asked for permission to use his designs.
When it comes to digital publications, IP protections are intertwined with other concepts and technological capabilities. Metadata is a big one, since it can effectively determine who gets credit and who gets paid. Digital rights management software [DRM] is another emergent battleground, since it combines questions of ownership and distribution with questions of access. It can therefore be seen that copyright and other considerations of disseminating intellectual property on the internet are a rabbit hole that requires professional understanding to navigate.
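To make the metadata point concrete, here is a minimal sketch of what an attribution record for a reused CC BY-SA image might look like. The field names and the record structure are my own invention for illustration, not any standard schema; the point is simply that the licence’s requirements (credit the author, name the licence, indicate changes) map naturally onto a few structured fields that software can carry alongside the image.

```python
from dataclasses import dataclass


@dataclass
class Attribution:
    """A minimal attribution record for a reused work (field names hypothetical)."""
    title: str
    author: str
    licence: str          # e.g. "CC BY-SA 4.0"
    changes: str = ""     # CC licences require any changes to be indicated

    def credit_line(self) -> str:
        """Assemble a human-readable credit line from the stored fields."""
        line = f'"{self.title}" by {self.author}, licensed under {self.licence}'
        if self.changes:
            line += f" / {self.changes}"
        return line


photo = Attribution("Australian banknotes", "Unknown Author", "CC BY-SA")
print(photo.credit_line())
# → "Australian banknotes" by Unknown Author, licensed under CC BY-SA
```

A word processor that stored a record like this with every inserted picture could generate the licensing text box automatically, which is presumably roughly what Microsoft’s feature does behind the scenes.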