Does Crowdsourcing have a Place in the Translation Industry?
By Deb Kramasz
Crowdsourcing will work for translation needs only where it works for original authoring needs, because translation is essentially a writing or authoring activity in another language.
And where does crowdsourcing work for original authoring? For what document genres, if any, does it work? It may be useful for compiling the Internet’s funniest videos and for problem solving (technical research on tough R&D problems, grading student work, patent examination), but what about original authoring?
Crowdsourcing has been successful for online open encyclopedias, such as Wikipedia.
Another possible authoring application may be journalism, if you consider incident reports a genre of collaborative authoring. Journalist Robert Niles, in his article “A journalist’s guide to crowdsourcing,” sees value in crowdsourcing for incident reports: “True crowdsourcing involves online applications that enable the collection, analysis and publication of reader-contributed incident reports, in real time.”
The attempt to crowdsource the translation of the professional networking site LinkedIn was a failure. In fact, it failed before any actual translation crowdsourcing could begin: the effort required a specific group (professional translators), and that group was unwilling to participate. Freelance translators refused to work for free.
Beyond open encyclopedias and incident reports, I have yet to see examples of successful authoring or translation using crowdsourcing.
References

“The Rise of Crowdsourcing” by Jeff Howe
“What Is Crowdsourcing?” by Jennifer Alsever
“Duke Professor Uses ‘Crowdsourcing’ to Grade” by Erica Hendry, The Chronicle of Higher Education, July 30, 2009
“A journalist’s guide to crowdsourcing” by Robert Niles, The Online Journalism Review, July 31, 2007