Featured Article from The ATA Chronicle (July 2014)

Internationalization and Localization: An Interview with Cisco’s Gary Lefman
By Marta Chereshnovska

A few months ago, I was fortunate enough to speak with Gary Lefman, an internationalization architect at Cisco Systems. Gary is a computer scientist with 14 years of experience in software engineering, network voice engineering, and telecommunications at Cisco. Over the past decade, he has developed into a prominent specialist in software internationalization and localization. He is a practical and visual thinker, intent on internationalization innovation, and is currently gathering material for his second book on advanced software internationalization.

Please say a few words about yourself. How did you get involved with internationalization and localization?
My response will probably sound very familiar to most people in the localization industry. The reason I am where I am today, and the reason I have modeled my life around this industry, is simply because of an accident--a disastrous accident--if that makes it seem any less passé.

I was invited to join Cisco in 2000, before I had finished my undergraduate degree, and became a network engineer with a research and development group specializing in telecommunications protocols and switching. It was not long before I started to design, build, and manage development labs in England and China.

Somewhere around the time the Dotcom Bubble had a blowout, management volunteered me to rectify a tragic cocktail of localization issues with a prominent voice product. I had never heard of the term localization, but I threw myself into the project with full gusto--blind and naïve--as a good engineer always should, and solved the problem. Unbeknown to me at the time, I had altered the course of a crushed localization project, which subsequently expanded from four locales into 52 within a very short time. Thus, having been thrown to the wolves and walking away unscathed, I had gained a level of recognition and respect that fueled my decision to switch to the dark, and far more exciting I might add, side of voice localization engineering.

With an abnormal thirst for producing truly global products, it was only natural that I should move into an architectural role and focus on developing an internationalization strategy for the entire engineering organization. This involved developing an internationalization support structure for developers and internationalization champions, as well as a full training program to cover all aspects of product and content internationalization to multimedia localization. In the meantime, I have been working on several projects outside of Cisco.

At the end of 2013, I graduated with an MSc in multilingual computing and localization from the University of Limerick, via the Localisation Research Centre. Not wanting to stop there, I am now working on my PhD with CNGL, the Centre for Global Intelligent Content, at Trinity College Dublin. I am also a partner and chief technology officer in a fantastic company that redefines localization education. In addition, I serve as a director for an internationalization consultancy.

In December 2013, my first book was published, Internationalisation of People Names (Scholars’ Press, 2013). It is a study of human name structures around the world and presents a model to prevent identity loss within computer systems. Earlier this year, I was recognized as a Fellow of the British Computer Society, as well as a Fellow of the Royal Institution of Great Britain. Although I have a strong background in internationalization and localization engineering, I would say that I still have a very long way to go.
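The problem the book addresses can be illustrated with a short sketch. The field and class names below are hypothetical illustrations, not taken from the book: the idea is that forcing every name into a given-name/family-name split loses information for mononyms, family-name-first orders, and patronymic conventions, whereas storing the name verbatim preserves identity.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PersonName:
    """Stores a person's name verbatim to avoid identity loss.

    Rigid 'first name' / 'last name' fields break for mononyms
    (e.g., Indonesian 'Suharto') and for family-name-first orders
    (e.g., Hungarian or Japanese names).
    """
    full_name: str                  # the name exactly as the person writes it
    sort_key: Optional[str] = None  # optional hint for collation only
    phonetic: Optional[str] = None  # optional hint, e.g., furigana for Japanese

    def display(self) -> str:
        # Always render the verbatim form; never reassemble from fragments.
        return self.full_name

# A mononym and a family-name-first name both survive intact:
suharto = PersonName(full_name="Suharto")
yamada = PersonName(full_name="山田太郎", phonetic="やまだ たろう", sort_key="Yamada Taro")
print(suharto.display())  # Suharto
print(yamada.display())   # 山田太郎
```

The design choice is that derived fields (sort key, phonetic reading) are optional hints layered on top of the verbatim name, never substitutes for it.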

Can you describe some peculiarities of the internationalization platform for Cisco products you developed?
Up until five years ago, I would have said there was nothing peculiar about internationalization. Throw in a few standards and best practices, and anything is possible.

But today, with a bring-your-own-device mentality and the so-called Internet of Everything, we have really mixed things up. It has gotten to the point where developers are rushing to produce application programming interfaces, but do not consider how the fruits of their labor will be implemented in locales other than their own. This is probably due to a focus on supporting different manufacturers’ platforms, more than the people that use them.

With a lot of applications being developed in the U.S., there is still too little consideration for the global user. This is not exactly a peculiarity of internationalization per se, because every platform has its qualifications and quirks, but it is still a challenge nonetheless.

What is the most challenging aspect of your work?
Changing the mindset of development teams--convincing them of the real value of internationalization--is a monumental task. The first barrier is deep-seated common misconceptions that inject ice-cold fear into the hearts of many developers when we utter the term localization. It is fear of the unknown and self-doubt that ultimately cause a developer’s resolve to crumble.

I believe this challenge can be addressed, to a certain degree, by academic institutions. Schools, colleges, and universities continue to be oblivious to the need for internationalization when teaching computing and writing. They often fail to provide an awareness of the world’s variety of cultures and how these cultures perceive and interact with programming systems. These are the very systems that students may one day be developing themselves.

Can you recommend any best practices and tools for proper internationalization?
Without a doubt, the Unicode Common Locale Data Repository (CLDR) [1] is the quintessential resource a developer should have in his or her bag of tricks. This thoroughly grounded library of locale data will help enormously in the development of truly global products. Also, the use of programming libraries like the International Components for Unicode [2] will make implementation of CLDR child’s play. Adopting new technologies and standards, such as the Internationalization Tag Set 2.0 [3] and HTML 5 [4], will also make life much easier for developers and localizers alike.
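To see what kind of data CLDR supplies, consider number formatting. The separator table below is hand-rolled purely for illustration; a real project should pull this data from CLDR via a library such as ICU rather than maintain it by hand.

```python
# Hand-rolled separators shown only to illustrate what CLDR supplies;
# real code should obtain this locale data from CLDR via ICU.
SEPARATORS = {
    "en-US": {"decimal": ".", "group": ","},
    "de-DE": {"decimal": ",", "group": "."},
}

def format_number(value: float, locale: str) -> str:
    """Format a number with locale-specific decimal and grouping separators."""
    sep = SEPARATORS[locale]
    integer, _, fraction = f"{value:.2f}".partition(".")
    # Insert grouping separators every three digits from the right.
    groups = []
    while integer:
        groups.append(integer[-3:])
        integer = integer[:-3]
    grouped = sep["group"].join(reversed(groups))
    return grouped + sep["decimal"] + fraction

print(format_number(1234567.5, "en-US"))  # 1,234,567.50
print(format_number(1234567.5, "de-DE"))  # 1.234.567,50
```

The same value renders with swapped separators in German, which is exactly the class of difference a developer cannot safely hard-code and CLDR exists to catalog.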

Developers should also consider moving toward Best Current Practice 47 [5] for finer-grained locale codes that better represent languages and cultures than the simple ISO 639-1 and ISO 3166-1 codes for languages and countries, respectively.
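To make that difference concrete, here is a minimal sketch of how a BCP 47 tag decomposes into subtags. This simplified parser handles only the common language-script-region shape, not the full BCP 47 grammar with variants, extensions, and private-use subtags.

```python
def parse_bcp47(tag: str) -> dict:
    """Split a simple BCP 47 tag into language/script/region subtags.

    Handles only the language[-script][-region] shape; the full BCP 47
    grammar also allows variants, extensions, and private use.
    """
    parts = tag.split("-")
    result = {"language": parts[0].lower(), "script": None, "region": None}
    for part in parts[1:]:
        if len(part) == 4 and part.isalpha():    # script subtag, e.g. Hant
            result["script"] = part.title()
        elif len(part) == 2 and part.isalpha():  # region subtag, e.g. TW
            result["region"] = part.upper()
    return result

# A bare ISO 639-1 "zh" cannot distinguish these two audiences:
print(parse_bcp47("zh-Hans-CN"))  # Simplified script, mainland China
print(parse_bcp47("zh-Hant-TW"))  # Traditional script, Taiwan
```

The script subtag is what a bare language-plus-country pair cannot express, and it is often the difference that actually determines which translation a user should receive.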

As for tools, the most important tool I have worked with is Globalyzer [6] by Lingoport. This is a crucial piece for any developer because it performs internationalization static analysis and pinpoints almost all of the common internationalization problems. These are problems that would otherwise become apparent during product localization, and ultimately affect the time and cost of software development.
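The flavor of what internationalization static analysis looks for can be sketched in a few lines. This toy scanner is not Globalyzer and does not reflect its actual rule set; it merely flags two classic issues, hardcoded UI strings and string concatenation, which breaks when word order changes across languages.

```python
import re

# Two classic internationalization bugs (illustrative patterns only;
# commercial analyzers apply far more thorough, language-aware rules):
HARDCODED_UI_STRING = re.compile(r'(alert|printf|setText)\s*\(\s*"[^"]+"')
STRING_CONCAT = re.compile(r'"\s*\+\s*\w+|\w+\s*\+\s*"')  # breaks under reordering

def scan(source: str) -> list:
    """Return (line_number, issue) pairs for lines with suspect patterns."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if HARDCODED_UI_STRING.search(line):
            findings.append((lineno, "hardcoded UI string"))
        if STRING_CONCAT.search(line):
            findings.append((lineno, "string concatenation"))
    return findings

sample = 'setText("Welcome back")\ngreeting = "Hello, " + name'
for lineno, issue in scan(sample):
    print(lineno, issue)
```

Catching these patterns at development time, rather than during localization, is the cost saving the interview describes.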

How do you envision the future of the localization industry?
Today, the World Wide Web of information is predominantly accessible in the English language, but this will undoubtedly change. Why? Because advances in the way audio, visual, and textual content is linked to similar content in other languages will help make all of this content even more available on the Internet. This is inevitable, and, to compound the matter, the vast amount of new information (and disinformation) being added online on a daily basis means there is going to be an ever-increasing desire to share it with the world. This is where we step in, but it is not going to be an easy ride.

Things are going to move very quickly when it does happen, and whilst the smaller localization service providers are lean and agile, the larger, institutionalized providers are going to be faced with some tough choices if they want the best seats in the house. They will need to develop a culture of adaptation and learn how to change direction in a very short space of time. They will also need to know their limits, because they will find it detrimental to their business if they attempt to accept every job.

I also foresee a lot of new language services providers appearing on the scene. They will be more specialized, focusing only on smaller and more specific domains such as social media or tourism.

This increase in demand will also affect developers, but there will be a greater understanding of the need for internationalization. This will accelerate research into more effective internationalization standards, as well as better integration of internationalization into programming languages and software development tools. The result: seamless internationalization and localization workflows. Is this all pie in the sky? Perhaps.

But note that I have not mentioned machine translation yet. This is because the advances in machine translation will probably continue to be slow and painful. If we have not cracked it since the 1950s, then we are not going to crack it in the next 60 years. This is because the human brain is a brilliant, yet at the same time wonderfully complicated, organ, and no machine will ever match it.

Notes
1. Common Locale Data Repository, http://cldr.unicode.org.

2. International Components for Unicode, http://site.icu-project.org.

3. Internationalization Tag Set 2.0, www.w3.org/TR/its20.

4. HTML 5, www.w3.org/TR/html5.

5. Best Current Practice 47, http://tools.ietf.org/search/bcp47.

6. Globalyzer, http://bit.ly/Globalyzer.

Marta Chereshnovska is a translation and localization specialist (English>Ukrainian, English>Russian). She has seven years of translation, localization, and subtitling experience. She has worked on information technology, telecommunications, marketing translation, and localization projects (software, hardware, web, mobile, and games). You can find her blog, Translation and l10n for dummies, at http://transl10n.tumblr.com, or follow her on Twitter @Martav88. Contact: martavelychko@gmail.com.
