Autocorrect Is Not Your Mother
Though tech assists so much of our daily communication, it’s not omniscient. Nor is it any kind of authority in our lives.
This is Dialek :: Dialect, a column by Khairani Barokka on language, culture, and power.
I recently received an official letter proclaiming that I’d been granted my PhD. This was an event I’d been looking forward to for most of my life, achieved through difficulty, so I’d rushed to share the good news with the people who’d rooted for me all these years. I immediately sent the document, along with a brief message, to my family.
Understandably enthused, I typed “hore” at the end, the Indonesian for “hurray.” Imagine my surprise when I discovered later that the email was sent with autocorrect’s special sauce, turning “hore”—without my knowledge—into “horse.”
That’s right. In one of the most important, meaningful emails I’ll ever send in my life, I inadvertently called my family “horse.” HORSE.
It has become a recurring joke. Though I hope I’ll never commit such a grave mistake again—god forbid sending such a flub to people who may not have a sense of humor—I now quite like being able to look back on it with a chuckle and, upon reflection, with a sense of relief. The email powers that be still don’t have the ability to read my mind entirely.
My email app, which I’ve been using for years, still doesn’t get that I’m bilingual and write in two languages—Indonesian and English, at times within the same email. It’s annoying, but ultimately comical. If I look down deep, it’s also welcome.
Years ago, on a trip to China, I gleefully discovered the ability of a translation app to turn photographs I took of Mandarin signs into English. My joy at this discovery could have been used in an ad for the app in question. Today, that joy is replicated each time I successfully outwit the same app, after it has completely mistranslated a word in my native Indonesian—also known as Bahasa Indonesia; not “Bahasa,” which literally just means “language.” As Gmail’s autocorrect lets me down, I internally clap my hands with approval.
This ambivalence is useful to hold when using technology that alters our language, our meanings, our intentions, whether or not we like it. We know how we use language-creating products by tech behemoths to keep us going—literally, by allowing many of us to do our jobs that keep us alive, to communicate with loved ones, and to otherwise express ourselves in ways that feel fulfilling.
For those of us who have names that are autocorrected as “not proper English”—in word processing apps, in texts, and the like—there’s been an implicit understanding. The understanding that, though tech assists so much of our daily communication, it’s not omniscient. Nor is it any kind of authority in our lives.
There’s the way in which language and translation technologies—Google, Yahoo!, Bing, WhatsApp, Facebook, etc.—blatantly grease the wheels of capital. To capture a language is to capture a market, a people. The more people are socialized to use a dominant language or languages, the greater that market. And in the name of commerce, we give up such intimate parts of ourselves.
One conglomerate owns the details of my life in a way I, you, we, daily sweep under the rug of our bodyminds. Our awareness of being owned in so many ways by a single corporation is embedded as a low-level hum in our cells. We know the Black Mirror nature of our lives, the inscrutability of the privacy policies in legalese I skim when I’m impatient to get to a page. We know the way employees we’ve never met, and never will, have access to the ins and outs of our daily behavior, our personal lives, our archived existences.
We willfully forget we are not ours alone. So when I’m home in Jakarta and the maps cannot label a tiny side alley, I rejoice.
This same company highlights its translation of the gender-neutral Indonesian pronoun “dia” as “he” before “she” or “it,” and does not even include “they” as a possible translation of the word. It makes sense that it tries, and still fails, to understand the names of alleys important to Jakarta-dwellers. I flinch at the hurts that must arise from these mistranslations. Simultaneously, I celebrate the knowledge that—as flawed as we are and as entrenched as we are in capitalist systems of injustice—I can still understand a patriarchal, colonial bias when I see one.
Out of curiosity, I check Google’s translation of other gendered Indonesian words into English. They’re lacking in context or wholly inadequate. It will not comprehend whether “perempuan” or “wanita” is to be used in your Indonesian sentence to say “woman,” nor the histories embedded in each.
“Perempuan” has a Sanskrit origin that means “powerful,” while the Javanese origin of “wanita” connotes a subservience to men. Thus many gender activists (and myself) prefer to call ourselves “perempuan.” However, the usages of both words over time have shifted, and the meanings attached to each continue to vary from person to person.
Capitalism runs on ableist time, privileging speed and shortcuts. It ignores vast amounts of useful information to reach an end goal that benefits a few over the collective, and disregards process, which is vital to so many cultures. The word I use most often in Indonesian for “Miss” is not translated as “Miss” at all by Google’s translation tool; it’s not even one of the options. One translation it does provide for the word is “mother.”
Looking at this, the thought arises: “These translation apps are not our mothers.”
Those who advocate for AI translation and other language tool algorithms as “objective” would do well to read Safiya Noble’s Algorithms of Oppression, to understand how AI is built of human-created data, to remember that human data is littered with our biases. In the craze for AI as a supposed cure-all for the world’s woes—including the touting of AI translation tools as something that will wipe out the need for human translators (as though artificial intelligence is sui generis truth)—it helps to understand tech as a tool.
Like any tool, it can be used for any manner of things, with a range of outcomes for people and the planet. And it is important to understand that AI is of the planet, is literally earthen, as Lakota artist and researcher Suzanne Kite, of the Initiative for Indigenous Futures, says in this Adjacent article: “AI is formed from not only code, but from materials of the earth.”
The materials that comprise electronics are mined from the earth, and mining continues to be responsible for profound destruction of communities, especially indigenous communities around the world. Yet we who consume and consume so many electronics are taught to divorce ourselves from the realities of what exactly is meant by “green technologies”—including those that rely on destructive lithium mining in Latin America—when creating them involves uprooting people from their lands, poisoning them, and using extraction as part of more deeply unjust economic systems.
If you think this is “too far” a tangent from using translation technologies, then the distancing of us as consumers from the suffering caused by our consumption has fully succeeded.
So, as an infinitesimally small part of the gigantic capitalist cogs that tear our planet apart while they claim to be able to “solve” its problems, I’m grateful for the “horse” errors, the lack of contextualization, the inability as yet for technology to fully invade our linguistic psyches.
I do not want them to gain all the information we have as a community. I do not want the meetings of people from my family’s close-knit West Sumatran village to be infiltrated. As a professional translator for over two decades, I do not want corporate machines invested in profit over all else to better our understanding of other languages to the detriment of human beings’ own critical thinking.
They cannot then give this information to the individuals and companies that would do us harm. Much of us has still not been captured by them, even if our heart rate and steps walked per day are being monitored. The data cannot constitute the whole, nor the soul some of us believe we have.
In the meantime, we can try to know our privacy and data rights better, and support efforts to help grassroots activist groups maintain their privacy, so they can protect themselves against those who are accelerating climate change and making us part of it, whose aim is to capture language, in order to control and delete.