Wednesday, May 5, 2010

If Content is Exploding, Why are Translation Prices Still Falling?

In my last entry I talked about three macro trends that are putting downward pressure on translation prices. I got a fair amount of feedback, some agreeing with and some challenging what I said. I also found out that I am being lumped together with others in the “Localization 2.0” camp. This confused me, since as a child I was denied the camp experience in apartheid-era South Africa (no darkies allowed!), and CSA just released research saying that “Localization 2.0” was a failed experiment, a non-starter, and that in the end it is really all about translation. However, several good questions were raised in this feedback that I would like to address here.

Am I “against” the TEP model?

I am not proposing tearing down the TEP model. I am suggesting that it has a place and is best suited to static content with a relatively long shelf life. The TEP model made a lot of sense in the late ’90s and even into the early 2000s, but it is much less useful as a production model for the more dynamic content that is common in global enterprises today. This "new" content is more voluminous and often has a half-life of 3–6 months or even less, yet it can be very valuable during that peak period. Much of it is user or community generated and strongly influences purchase behavior, so it is risky to ignore. However, manuals and software still need to be localized, and I do not advocate discarding this proven model for those kinds of content. I do expect it will need to evolve and connect to new technology and infrastructure.

If we are experiencing a content explosion and there is a shortage of human translators, how can prices be falling? This does not jibe with standard market behavior, where increasing demand typically results in higher prices.

On the surface this does sound illogical and inconsistent with standard economic theory: usually, more demand = higher prices. However, for the longest time we have seen enormous amounts of web content remain unavailable to customers and communities around the world. Why? Because traditional translation processes have been largely unaffordable and too slow for many of these new applications. By this I mean content and communications such as web pages, intranets, knowledge bases, social network feedback, product documents, IMs, blogs and emails. I believe the demand has always been there but could not be met within the cost and time constraints that traditional translation production models were built around. Now the urgency created by globalization and internet-based commerce makes this a much more pressing issue.

So we see global enterprises asking: How do we reduce costs, maintain quality and increase the productivity and speed of the translation process? And we see localization managers asking: Is this MT/crowdsourcing stuff really going to affect the “commercial-grade localization” market that is our area of focus?

As far as I can see, the content explosion affects the commercial-grade localization market in three ways.

Firstly, it raises questions of value at an executive management level. 

Most professional translation is done to facilitate and drive international revenue. If product and sales line managers hear from end customers (who generate that international revenue) that they do not care about the “traditional” content, then some of these P&L-driven managers will seek to reduce this spend. That may shrink the need for, and volume of, traditional content, or it may create price pressure, because the value of that content is perceived to be low and it can therefore be cut back without damaging revenue flows. Localization managers in global enterprises are generally not power players in the corporate hierarchy and are often simply told to do more with less. However, some content (GUI, brand marketing content, legal terms) cannot be compromised; for this material there will likely be little or no price pressure, and we may even see rate increases. So corporate localization managers have to cut budgets on documentation and other relatively “low-value” content.

Secondly, as managers become aware of the automation possibilities, they begin to consider them for “high-volume” commercial-grade work. I have been involved with several automotive OEMs who produce 2,000- and 3,000-page manuals for their dealer and service networks. Customized and tuned SMT makes great sense here: there is a lot of data to train with, the content is repetitive, and MT can deliver measurable productivity benefits. The net gain from using MT: manuals delivered significantly faster and at a much lower ongoing cost, at the SAME QUALITY levels as the old TM-based process. While editors/translators are usually paid lower rates, they have much higher throughput, and under this scenario most actually make more money, as the rough arithmetic below illustrates. Note that I said CUSTOMIZED MT, not Google.
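To make that last point concrete, here is a back-of-the-envelope sketch. The per-word rates and daily throughput figures are purely hypothetical assumptions, not data from any of the OEM projects mentioned above; the point is only that a lower rate combined with higher throughput can still mean higher daily earnings.

```python
# Back-of-the-envelope comparison of daily earnings: conventional translation
# vs. MT post-editing. All rates and throughput figures are hypothetical.

scenarios = {
    "conventional TEP translation": {"rate_per_word": 0.10, "words_per_day": 2500},
    "customized MT + post-editing": {"rate_per_word": 0.06, "words_per_day": 6000},
}

for name, s in scenarios.items():
    daily_earnings = s["rate_per_word"] * s["words_per_day"]
    print(f"{name}: {s['words_per_day']:,} words/day at "
          f"${s['rate_per_word']:.2f}/word = ${daily_earnings:.0f}/day")
```

With these made-up numbers the post-editor ends the day ahead ($360 vs. $250), despite the lower per-word rate, because the throughput gain more than compensates.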

Thirdly, for companies with very short product life-cycles (usually in consumer electronics), custom MT-based automation can greatly speed up the documentation production process and, over time, reduce the actual net cost of each new manual. Again, post-editing ensures that HT quality levels are delivered.

Custom MT is the equivalent of building a production assembly line: it can make one or a few products very efficiently and gives you a translation factory that becomes more and more profitable as production volumes increase, since the marginal cost continues to fall. In every case I know of, if you want publishable quality you still have to put the output through human post-editing. However, in my opinion MT is a strategic long-term investment and a production capacity asset (though several RbMT experts claim it can be used on a project-to-project basis). Real leverage for commercial-grade localization comes from domain focus and continuing enhancement of these systems. To produce HT quality, it will ALWAYS be necessary to have a post-editing step in the process. An informed SMT engine development process will deliver continuing improvements in starting-point quality. This task of producing an ever-improving translation production line is relatively new, and we are just beginning to understand how to do it well. Don’t believe the experts who tell you they have done this before; this is new and we are all still learning. Just as cars have gotten much more reliable as automation technology evolved, we will see that effective man-machine collaboration will produce both compelling quality and efficiency.
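The assembly-line economics can be sketched in a few lines. The figures below (engine build cost, per-word post-editing cost, a human-only rate for comparison) are assumptions invented purely for illustration; the shape of the curve, not the specific numbers, is the point: as volume grows, the amortized cost per word falls toward the post-editing floor.

```python
# Illustrative cost curve for a customized MT "production line": a one-time
# engine build cost amortized over a growing word volume, plus an ongoing
# per-word post-editing cost. All figures are hypothetical assumptions.

ENGINE_BUILD_COST = 30_000.00       # one-time cost to build/tune the engine (assumed)
POST_EDIT_COST_PER_WORD = 0.05      # ongoing human post-editing cost (assumed)
HUMAN_ONLY_COST_PER_WORD = 0.12     # conventional TEP cost, for comparison (assumed)

for volume in (250_000, 1_000_000, 5_000_000, 20_000_000):
    cost_per_word = ENGINE_BUILD_COST / volume + POST_EDIT_COST_PER_WORD
    print(f"{volume:>12,} words: MT + post-editing ${cost_per_word:.4f}/word "
          f"vs. human-only ${HUMAN_ONLY_COST_PER_WORD:.2f}/word")
```

The fixed engine investment dominates at small volumes, but once millions of words flow through the same engine the per-word cost approaches the post-editing rate alone, which is the sense in which the factory gets more profitable with scale.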
The most polished human translations (TM) will, I think, generally produce the best systems, and the most skilled correction feedback will produce the biggest ongoing boost to quality. If you want human quality you will need a modified form of TEP to make that happen, with more objectivity and more automation. I believe the growing use of SMT in particular will create new ecosystems and tools that require strong human linguistic skills. There will be a need for more skilled linguists, in addition to translators and cheaper editors who may not even be bilingual but who have subject-matter expertise.

Expanding use of MT and community collaboration does not mean that QA, process control and quality concerns go away. I actually think there is much more room to build differentiation, as these skills are broader and deeper than those needed with TM. I expect that at both the translator and the LSP level there will be more room for differentiation than there is today.

Google continues to improve its MT systems, and so will Microsoft and maybe even IBM now (finally!), with assistance from Lionbridge. These are all largely SMT technology initiatives, though each is a little bit hybrid nowadays. I am still betting that domain-focused SMT systems developed by agile and open-minded LSPs and translators, working in collaboration with technology partners like Asia Online, will easily outperform anything the big boys produce. The key to excellence is still the quality of the collaboration and the dialog between these players, not just the technology.

1 comment:

  1. Hi Kirti, I think that falling prices are due to the Google Translation Toolkit which allows any translator to access high quality MT for free. People are using this and other tools/workflows to streamline and expedite their work, making prices more competitive.
