How is Cognitive Computing Different From Big Data and NLP?

Praful Krishna
4 min read · Jun 5, 2020

Any organizational behavior expert will tell you that even in the largest, most complex organizations, the hairiest business problems often go untouched. Systemic issues involving huge amounts of data spread across many different repositories seem truly overwhelming: there is simply too much information to sort through.

This problem has been exacerbated in recent years by the advent of Big Data. Big Data problems involve processing vast amounts of data for cumbersome tasks: targeted advertisements, recommendation systems, and the learning of straightforward correlations.

Of course, data is inherently valuable, but many enterprise search solutions are simply ill-equipped to effectively analyze structured and unstructured data sources using traditional methods. This leads to band-aid solutions, applied to patch issues in the short term while waiting for some vague point in the future when there are no other pressing issues, resources are limitless, and the competition says, "can't we all just get along."

Is there any good news?

Yes. The good news for enterprises is that there are many players in the data science market offering proven solutions that are more than capable of handling large amounts of structured data. The bad news is that if you focus on structured data only, you may be looking in the wrong place for a solution to your problems.

Cognitive computing aims to teach computers how to think and process information like humans. One of its key applications is natural language understanding (NLU). Forward-thinking CTOs recognize that this is a valuable technology: truly comprehending natural language without any human intervention remains a huge, unsolved problem today. Have you used a customer service chatbot for anything outside of its 15 set FAQs lately? If so, you know what we mean. Humans are expensive, and they get bored and make mistakes. Why not free up your critical-thinking, outside-the-box problem-solving superhumans to focus on what they do best? Our guess is you're already thinking along these lines. As computers get "smarter," they will be configured to automate complex workflows and truly make decisions in the enterprise world.

You’re describing a utopia. How can this be possible?

Right now, most computing is based on strictly structured inputs. But what if computers could understand natural or unstructured inputs as well? This is AI for human language. In our day-to-day lives, too, natural language interaction with our machines is becoming more common. With Droid, Siri, Cortana, and the like, the building blocks are already in place.

Natural language is just one application of cognitive computing. Advanced cognitive systems are already being developed that work with vision, speech, and other modalities. With the ability to comprehend, and actually take action on, non-text, unstructured information, cognitive computing seems to be the natural choice for solving traditional big data problems, especially where data size results in prohibitively high cost or latency, where the underlying data is too dynamic, or where structured data is only one piece of a larger puzzle.

Here are the advantages of cognitive computing vs. traditional big data technologies for these problems:

  • Scalability: Cognitive computing is all about forming hypotheses, proving or disproving them, and learning from them to form new hypotheses, which is essentially how humans think. Cognitive systems have memory, they can second-guess themselves, and they are designed to go back and forth to find the right answers. This means they can come up with equally rigorous insights without having to brute-force their way through entire databases. They make fewer redundant calculations, a hallmark of scalability.
  • Dynamism: The same capabilities give cognitive computers a lot of flexibility in ingestion and processing. If, for example, a system is processing batches of data and a new variable shows up in one batch, a cognitive computer will not skip a beat; it will automatically extend its models to include the new variable (see the sketch after this list). With traditional technologies, the models may have to be rewritten, and performance suffers accordingly. This also holds true for situations where the content itself keeps evolving.
  • Natural Interaction: Given that natural language is a prominent application in any case, pulling in natural language capabilities makes cognitive computing systems very powerful. For example, these systems can output their insights directly in business language. More importantly, they can extract information from natural language and unstructured text. This increases their applicability and reduces the time taken to prepare data for advanced analyses.
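To make the dynamism point concrete, here is a minimal, purely hypothetical sketch (not any vendor's actual API): an ingestion loop that grows its own set of tracked fields whenever a batch introduces a variable it has not seen before, instead of failing or requiring the model to be rewritten. The class and field names are illustrative only.

```python
# Illustrative sketch only: a batch ingester that adapts when new fields appear,
# rather than requiring a rewritten schema or model.

class AdaptiveIngester:
    def __init__(self):
        self.known_fields = set()   # fields the "model" currently tracks
        self.rows = []

    def ingest_batch(self, batch):
        for record in batch:
            new_fields = set(record) - self.known_fields
            if new_fields:
                # A cognitive system would fold these into its model automatically;
                # here we simply extend the tracked field set without skipping a beat.
                self.known_fields |= new_fields
            self.rows.append(record)

# Usage with toy data: the second batch introduces a new "region" field.
ingester = AdaptiveIngester()
ingester.ingest_batch([{"customer": "A", "spend": 120}])
ingester.ingest_batch([{"customer": "B", "spend": 90, "region": "EMEA"}])
print(ingester.known_fields)  # {'customer', 'spend', 'region'}
```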

Before we close: a super geeky note for all you computer scientists out there…

The worst-case performance of a cognitive system over a database with n records and m fields could be O(nm²), essentially that of the brute-force method. In reality, if the model is trained well, it almost always performs near the best case, which asymptotes to O(n) with a very low constant. The longer the system is in operation, the lower that constant becomes.
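As a rough, hypothetical sketch of that difference (not Coseer's actual system, and with made-up function names), compare a brute-force pass that scores every pair of fields against every record, roughly O(nm²), with a hypothesis-driven pass that only scores the handful of field pairs the model already believes matter, which behaves like O(n) with a small constant.

```python
from itertools import combinations

def brute_force(records, fields):
    """Score every field pair against every record: ~O(n * m^2)."""
    scores = {}
    for a, b in combinations(fields, 2):           # on the order of m^2 pairs
        scores[(a, b)] = sum(
            1 for r in records if r.get(a) == r.get(b)   # n records per pair
        )
    return scores

def hypothesis_driven(records, hypotheses):
    """Score only the k field pairs the model proposes, k << m^2: ~O(n * k)."""
    scores = {}
    for a, b in hypotheses:
        scores[(a, b)] = sum(
            1 for r in records if r.get(a) == r.get(b)
        )
    return scores

# Toy usage: three fields, but the trained model only checks one promising pair.
records = [{"city": "SF", "office": "SF", "region": "West"} for _ in range(1000)]
print(brute_force(records, ["city", "office", "region"]))
print(hypothesis_driven(records, [("city", "office")]))
```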

But where is Natural Language Processing (NLP) in all of this? The data-science-driven approach to NLP is very different from traditional NLP stacks, where the machine relies on inherent grammatical structure. Cognitive computers learn faster and can be very flexible about the quality of their input. Based on our experience, if there were a scale where completely structured data scored 1 and a well-composed piece of language scored 10, Coseer's cognitive systems perform best between 6 and 9.

Overall, cognitive computing is very different from a big data or NLP based system. The artificial intelligence and truly cognitive features in its design make it powerful, and especially suited for applications that span structured and unstructured domains and/or are dynamic in nature.
