“Will I be replaced by a computer?” This question was not the only thing on my mind when I attended the ACL annual conference – but I’d be lying if I said it didn’t cross my mind.
Understanding Hamlet
Given enough time, a monkey bashing away at the keyboard will eventually type out Hamlet. But what would it take for your computer to actually understand Hamlet? Natural Language Processing (or NLP) is all about developing computer algorithms that can understand texts we human beings have written.
It’s a booming field of computer science right now, mostly because of the huge amount of textual information we’re putting onto computers.
Wading through it all to make sense of it would take humans an unfeasibly long time, but computers are now starting to do it at lightning speed. They can even translate text and apply ‘sentiment classification’ to characterise the tone of the content.
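To give a flavour of what ‘sentiment classification’ means in practice, here’s a minimal sketch in Python using scikit-learn. The tiny training set, the labels and the example prediction are all invented purely for illustration – this isn’t any particular production system.

```python
# A minimal sentiment-classification sketch (toy data invented for illustration).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "I absolutely loved this play",
    "A brilliant, moving performance",
    "What a dreadful waste of an evening",
    "The acting was wooden and the plot was dull",
]
train_labels = ["positive", "positive", "negative", "negative"]

# Bag-of-words features feeding a linear classifier: the classic
# pre-deep-learning recipe for text classification.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# With this toy data the model should lean towards 'negative' here.
print(model.predict(["A dull evening at the theatre"]))
```

Real systems train on far larger labelled datasets and richer features, but the shape of the pipeline is the same.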
And although it’s been with us for over five decades, NLP (also known as Text Analytics and Computational Linguistics) is gaining a lot of attention at the moment because of the vast amount of textual data now available and growing business interest.
Broad appeal
I first attended the Association for Computational Linguistics (ACL)’s annual conference 12 years ago, when most of the attendees came from universities. Then, around five years ago, the likes of Google, IBM and Amazon started to show up. But at this year’s conference in Berlin, attended by over 1,600 people, I noticed the make-up was much more diverse: around half of the attendees were affiliated with small and mid-size enterprises (including three Black Swan Data scientists). NLP research has been swept up in the start-up era!
ACL is the best place to keep up to date with the latest developments in NLP research. This year there were seven parallel sessions, consisting of presentations by top scientists from Stanford University, MIT, Cambridge University, Google, and Microsoft. While the topics were diverse, one central theme ran through all of them: deep learning.
Deep learning
Deep learning emerged from machine learning around 10 years ago and has been behind revolutionary success stories in speech technology and image processing.
However, it remains to be seen whether deep learning can achieve a breakthrough of the same magnitude in NLP. Processing audio and image signals is one thing, but getting computers to understand text is a far more complex task.
A force for good – or evil?
There’s a massive commitment to advancing deep learning, since the opportunities are clearly huge. Some of the people in this field have already dedicated a decade or more of their lives to deep learning in NLP. Others are still waiting for them to deliver tools they can actually use, while some think it’s plain evil. One big fear, of course, is that it might even put people like us out of work. Why pay someone a wage to do something you can get a computer to do for free?
Deep learning’s big promise is exactly that: to find the best architecture for a given task instead of needing a human to engineer it. A professor told me that he recently decided to buy more CPU/GPU capacity instead of hiring an engineer. I’m an NLP engineer myself, but I’m not afraid that a computer will replace me in the next decade.
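To make that promise concrete, here’s a hedged sketch in PyTorch of what the shift looks like: instead of hand-crafting features, the word representations and the classifier weights are all learned jointly from (again, invented) toy data.

```python
# A minimal deep-learning text classifier (toy data and settings invented for illustration):
# no hand-engineered features; embeddings and weights are learned jointly.
import torch
import torch.nn as nn

texts = ["loved this play", "brilliant moving performance",
         "dreadful waste of time", "dull wooden acting"]
labels = torch.tensor([1, 1, 0, 0])  # 1 = positive, 0 = negative

vocab = {w: i + 1 for i, w in enumerate(sorted({w for t in texts for w in t.split()}))}

def encode(text, length=4):
    ids = [vocab.get(w, 0) for w in text.split()][:length]  # 0 = padding/unknown
    return ids + [0] * (length - len(ids))

X = torch.tensor([encode(t) for t in texts])

model = nn.Sequential(
    nn.Embedding(len(vocab) + 1, 16),  # learned word representations
    nn.Flatten(),
    nn.Linear(16 * 4, 2),              # learned classifier on top
)

optimiser = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):  # a few training steps on the toy examples
    optimiser.zero_grad()
    loss = loss_fn(model(X), labels)
    loss.backward()
    optimiser.step()

# Should come out as class 0 ('negative') on this toy data.
print(model(torch.tensor([encode("dull dreadful play")])).argmax(dim=1))
```

The point is what’s missing: nobody sat down and decided which features matter. Given enough data (and enough CPU/GPU time), the model works that out itself.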
Deep learning can definitely contribute if you focus on a specific sub-problem for years, much in the way that academics do. On the other hand, if you’re regularly challenged with new problems – like we are at Black Swan Data – the creativity human engineers bring to those situations cannot be replaced by machines.
And… relax
The good news is that attending ACL has not given me sleepless nights. It was a great event, packed with stimulating talks to spark fresh thinking. Personally, I found it comforting to see that we’re bang on track with our state-of-the-art research at Black Swan Data. In fact, I couldn’t help noticing that one of the papers presented took an approach remarkably similar to the geolocation system we’d developed during our Szeged hackathon just three weeks before!