From big data to AI: from automation to code autonomy

Business Intelligence is historically the part of IT that processes data - usually structured - to make it intelligible. The purpose of these complex tools, connected to databases (including the famous data warehouses), was to transform data into information (operational dashboards), then information into lessons for management (strategic dashboards).

In the beginning: Big Data

Then came other sources of data, added to the traditional customer databases, account books, statistical surveys, supplier invoices, and so on. Unlike those mentioned above, these unstructured - or semi-structured - data come from social networks or informal interactions with customers and partners (tweets about a brand or visits to a website, for example).


"They are often interesting and can be cross-referenced with your business data. That was the arrival of what we called Big Data," recalls Isabelle Flory, Intel Sales Director for Western Europe.

One of the limits of Big Data is that "we were working on past data". The idea thus emerged of developing algorithms to try to understand the mechanisms that could explain what happened in the past.

This search for recurrences, correlations and complex cause-and-effect relationships laid the foundations of "learning". In other words, it was the first cornerstone of machine learning, which would later become far more sophisticated.
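As a minimal sketch of what this search for correlations in past data looks like (the two series and the scenario below are invented for illustration), one can compute a Pearson correlation between two historical series and flag strong recurrences worth investigating:

```python
# Minimal sketch: looking for correlations in historical data,
# the kind of pattern search that preceded machine learning.
# The two series below are invented for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical past data: weekly ad spend vs. weekly sales.
ad_spend = [10, 12, 9, 15, 20, 18, 25]
sales = [100, 115, 95, 140, 180, 170, 230]

r = pearson(ad_spend, sales)
print(f"correlation: {r:.2f}")  # a value near 1.0 signals a strong recurrence
```

A high coefficient only flags a recurrence; establishing a cause-and-effect relationship, as the article notes, takes far more work.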
Then came predictive, prescriptive ...

The next step followed naturally. "If I can model the learning of what happened so well, can't that be used to predict what will happen? Does it work if I say 'let's see if I apply the same algorithm to the rest'?", explains Isabelle Flory, to illustrate the birth of predictive analytics.

"To give you an example, we worked with AP-HP on emergency admissions. That service suffers from congestion. [...] We investigated whether there were time series that would make it possible to anticipate the peaks and better assign nursing staff."
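The time-series idea can be sketched very simply: average past admissions per hour of day, then flag the hours whose historical average exceeds staffed capacity. The data, the hourly granularity and the capacity figure below are invented for illustration, not AP-HP's actual method:

```python
# Sketch of anticipating peaks from a time series: build an hourly
# profile from past admissions, flag hours likely to exceed capacity.
from collections import defaultdict

def hourly_profile(history):
    """history: list of (hour_of_day, admissions). Returns mean per hour."""
    totals, counts = defaultdict(float), defaultdict(int)
    for hour, n in history:
        totals[hour] += n
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in totals}

def expected_peaks(history, capacity):
    """Hours whose historical average exceeds the staffed capacity."""
    profile = hourly_profile(history)
    return sorted(h for h, mean in profile.items() if mean > capacity)

# Two invented days of hourly admission counts: (hour, count).
history = [(9, 4), (10, 7), (18, 12), (9, 6), (10, 9), (18, 14)]
print(expected_peaks(history, capacity=8))  # hours needing extra staff
```

A real deployment would of course use richer seasonality (day of week, holidays, epidemics) rather than a bare hourly average.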

Predictive analytics applies to humans, as in the AP-HP example, but also to machines. This became possible in industry thanks to the advent of sensors and connected objects (the IoT). "It was very interesting for predictive maintenance."

"From this 'predictive' analytics we moved on to the 'prescriptive'," recalls the Intel executive. Again, a logical continuation for IT, which had already automated part of the preparation for decision-making. The solution now says which option it "thinks" is the best.
... and Event Stream Processing

In the next step, it is the decision-making itself - not just its preparation - that is automated.

"There are cases where I can perfectly well wait, when it is not critical. But there are cases where it is better to give the system a rule. In a nuclear power plant, for example, if a particular alarm is triggered, it is better to tell it: 'if this event occurs, stop everything; we will look into it and make the real decision later'."

This new phase is now arriving. Called Event Stream Processing, it comes mainly from the industrial world. "But we realize that it can also have applications in other areas, such as the customer journey (editor's note: a customer's location triggering a promotion, for instance)."
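A minimal sketch of such a manually-set rule applied to a stream of events, in the spirit of "if this event occurs, stop everything" (event names and the shutdown action are invented for illustration):

```python
# Minimal sketch of Event Stream Processing: a hand-written rule is
# applied to each event as it arrives; the critical case acts at once,
# everything else is merely logged for later analysis.

def process_stream(events, on_shutdown):
    """Apply a fixed, manually-set rule to a stream of events."""
    handled = []
    for event in events:
        if event == "critical_alarm":       # the manually-set rule
            on_shutdown()                    # act immediately, decide later
            handled.append((event, "SHUTDOWN"))
        else:
            handled.append((event, "log"))   # non-critical: just record it
    return handled

log = []
result = process_stream(
    ["temp_reading", "critical_alarm", "temp_reading"],
    on_shutdown=lambda: log.append("reactor stopped"),
)
print(result)
```

The key property is that the reaction is immediate and rule-driven; as the article goes on to ask, nothing in this loop evaluates whether the rule itself is a good one.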

Yes, but at the heart of Event Stream Processing, the rule still has to be set manually. So who evaluates the effectiveness of that rule? Who says it is well adapted? Or even appropriate?

Here again, IT has managed - or is trying - to automate this phase.
Machine Learning, Deep Learning: from automation to autonomy

To automate it, learning algorithms compare the expected effects of a decision with its actual effects. Based on the gap, they modify the rule, and proceed by successive iterations. This is Machine Learning ("apprentissage automatique" in French).
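The iterative loop described above can be sketched with a trivially simple "rule": an alert threshold that is nudged, iteration after iteration, until the observed alert rate matches the expected one (the readings, target rate and learning rate below are invented):

```python
# Sketch of the learning loop: compare the rule's expected effect with
# its actual effect, modify the rule, repeat. Here the "rule" is a
# single alert threshold over a stream of sensor readings.

def tune_threshold(observations, target_alert_rate, threshold=50.0,
                   lr=5.0, iterations=20):
    """Adjust the threshold until the alert rate matches the target."""
    for _ in range(iterations):
        actual_rate = sum(v > threshold for v in observations) / len(observations)
        error = actual_rate - target_alert_rate   # actual vs. expected effect
        threshold += lr * error                   # modify the rule, iterate
    return threshold

readings = [30, 42, 55, 61, 48, 70, 35, 52, 66, 44]
t = tune_threshold(readings, target_alert_rate=0.2)
rate = sum(v > t for v in readings) / len(readings)
print(f"threshold={t:.1f}, alert rate={rate:.0%}")
```

Real machine learning adjusts far richer rules (weights of a model rather than one threshold), but the expected-vs-actual feedback loop is the same.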

A subset of Machine Learning is Deep Learning, which attempts to answer the question summarized by Isabelle Flory: "What happens if I do not know the rule?"

In other words: "can't I deduce, from all my data, algorithms that will allow me to create a rule that I did not know myself?" Algorithms that design algorithms, in short.
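As a toy stand-in for this idea (a single perceptron rather than a deep network; the data and the hidden rule below are invented), here is an algorithm that derives a decision rule purely from labeled examples, without ever being told the rule:

```python
# Sketch of "deducing a rule from the data": a tiny perceptron learns a
# decision rule from labeled examples instead of being given the rule.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), label) with label 0 or 1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = y - pred                       # expected vs. actual
            w1 += lr * err * x1                  # adjust the learned rule
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# The hidden rule (never shown to the algorithm): label = 1 iff x1 + x2 > 1.
data = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1), ((2, 1), 1)]
w1, w2, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])
```

Deep learning stacks many such units in layers, which is what lets it discover rules - like the Zika example below - that no one could have written by hand.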

"It is with Deep Learning, for example, that the origin and the carrier of the Zika virus were identified, by processing the data available on the epidemic," illustrates Isabelle Flory.
Cognitive Computing: IT copies biology

All these technologies today rely on architectures, "logic and mathematics that we know well," she analyzes.

But a new IT trend is introducing a breakthrough, with "technology inspired by biology". And in particular neuroscience.

This computer science "relies on logical processes that are very close to those of the human brain". And this extends to the physical architectures, "even into the silicon, which also imitates the functioning of our brain, with synapses and highly parallelized systems".

This imitation of biology is, for Isabelle Flory, the branch of IT now called "Cognitive Computing" - today the most central branch of Artificial Intelligence.

Technologies that rub shoulders

This historical account may give the impression that these technologies succeed one another, and that "one passes from one to the other as one climbs the steps of a staircase", from the most rudimentary to the most powerful.

It is not so. "All these technologies exist simultaneously," warns Isabelle Flory, because each answers different problems: real-time or not, concerning machines or humans, relying on binary logic or on fuzzier, more complex logic.

This is not without creating "a certain complexity" for CIOs, both in their choices and in how these tools are combined. Especially since the categorization of solutions by type of problem is neither strict nor fixed.

"We should not believe that cognitive computing will only be used for voice analysis, automatic translation or image recognition - unstructured processing. Boeing, for example, had long been working on predictive maintenance, but managed to save an extra $100 million by applying cognitive logic," says the manager. Logic that came from Saffron, a start-up specializing in Artificial Intelligence... acquired by Intel in 2015. Which shows how closely, in AI, silicon and code are now intertwined.