Natural language processing (NLP) might sound like something to do with speech therapy or human consciousness. In fact, it is something else altogether: NLP is an advanced field of computer science, closely tied to ongoing research in artificial intelligence. The field is concerned with how human beings and computers interact through language. It focuses on a computer's ability to recognize context, nuance, and implication, and on making the interface between humans and computers more intuitive: easier for people to understand and to use.

This highly complicated field is consistently in the public eye; it represents one of the most rapidly growing branches of computer science.

How Did NLP Start?

The subject began to gather momentum with the Turing Test, first proposed by Alan Turing in 1950. Rather than being a specific test, the Turing Test is an outline of how such a test might be structured: a computer (or a software system) with the capacity for intelligence is questioned in order to determine whether its responses can functionally pass for those of a human being. The idea was revolutionary at the time. It rests on linguistic comprehension, including the ability to take loosely structured human input and impose the appropriate structure on it based on context, much as our brains do for us in a given conversation, more or less without our having to think about it.


Why Do We Need NLP?

The idea behind an increasingly intuitive, or "natural," style of language processing is to broaden the scope of human-computer interaction. Presently, specialized knowledge and training are required to use computers to their fullest potential. Most people don't know how to structure their input so that a computer can comprehend it. Imagine a human language so rigidly structured and convoluted that a meaningful conversation in it required a graduate degree, or a similar amount of experience; this is fundamentally where computers stand today. The purpose of NLP is to make them more accessible to people whose specialized skills lie outside the world of computer science, allowing everyone greater access to these powerful tools.

Popular References to Natural Language Processing

Before Turing outlined the concept of the Turing Test in his paper "Computing Machinery and Intelligence," the subject of creating computers that could think, reason, and even program other computers without direct human intervention at every step had not been seriously considered. It wasn't even a particularly common feature in the science fiction of the day. Turing's concept got researchers and scientists thinking, and it encouraged the use of AI in fiction. The most famous popular example of a Turing-style test today is probably the one depicted in the "Blade Runner" film franchise, based on Philip K. Dick's novel "Do Androids Dream of Electric Sheep?" Testing for machine intelligence is one of the central themes of that film, but the idea also appears in works such as "2001: A Space Odyssey" and in the various incarnations of the "Star Trek" television franchise. All of the above depict humans interacting constructively with computer systems through natural, intuitive linguistic processes.

Natural language processing is perhaps most frequently encountered by the general public through search engine queries. Google's search engine is particularly well known for its ability to recognize spelling errors, analyze sentence structure for context, and suggest related search terms or subject matter. It does this through advanced computational algorithms that model the complexity of linguistic interaction between two people.
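To make the spelling-correction idea concrete, here is a minimal sketch of one classic ingredient: suggesting the nearest known word by edit distance. Real search engines combine this with statistical language models and query logs; the tiny vocabulary below is a hypothetical stand-in, not Google's actual method.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: the minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete ca
                            curr[j - 1] + 1,             # insert cb
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

def suggest(word: str, vocabulary: list[str]) -> str:
    """Return the vocabulary word closest to the (possibly misspelled) input."""
    return min(vocabulary, key=lambda v: edit_distance(word, v))

# Hypothetical toy vocabulary standing in for a real language model.
vocab = ["language", "processing", "computer", "science"]
print(suggest("procesing", vocab))  # closest match: "processing"
```

The same distance measure underlies "did you mean" features in many text tools; production systems additionally weight candidates by how often users actually type them.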