Good data rather than big data

Artificial intelligence (AI) is nothing new: its roots go back to 1642, when Blaise Pascal invented the first mechanical calculator. And it gains ground every day as it is increasingly used to analyze or predict the most varied behaviors.

The growth of the world population, the acceleration of urbanization and the expansion of the middle class, which constantly create new needs, will intensify this trend further.

But its immoderate, or above all inappropriate, use could create unexpected difficulties, particularly in terms of water or energy consumption, not to mention CO2 emissions.

Gigantic quantities

As Luc Julia, the chief scientific officer of the Renault group, rightly points out: “AI is a simple tool that must remain simple, even and above all in the face of the growing complexity of the problems it must solve. It is the women and men who use it who decide to orient it toward the safeguard of our planet… or toward its destruction.”

Positively oriented, AI makes it possible in particular to:

- discover new materials or active ingredients that match the performance of existing ones while removing or substituting toxic or polluting ingredients;
- acquire and give meaning to data obtained from industrial processes (sound, vibration, temperature) to identify waste more quickly and further upstream;
- make traffic more fluid at peak hours to reduce energy consumption while giving “free time” back to users;
- optimize the masses of connected objects in dense urban areas so that more energy is created than consumed;
- use images to speed up diagnoses and make the choice of therapeutic treatments more effective.

We would consume less data by using only the ‘good’ data to solve a problem.

All this is obviously very positive, but it generates gigantic quantities of data whose growth is unbridled.

And if we add to AI the use of the Internet by all of us citizens – 62.5% of the world’s population was connected to the Internet in 2022, against only 26.6% in 2010 – the figures are staggering: 33 zettabytes (10^21 bytes) in 2018, 59 ZB in 2020, 75 ZB in 2021 and an expected 175 ZB in 2025!

Limited number

Our electricity distribution capacities and networks will not be able to keep up, as last summer’s fires in California showed. What can be done? We can of course reduce the power consumption associated with storing and using data by developing technologies such as spin electronics. But above all, we can reduce the amount of data itself.

It is a question of replacing the “model centric / big data” approach, in which we collect as much data as possible before developing an intelligent model or algorithm, with a “data centric” approach, where we focus on a limited number of targeted data points.

Example: a voice recognition system malfunctioned when there was car noise in the background. To solve the problem, engineers collected only data with car noise in the background, rather than as much non-specific data as possible.
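The selection step behind that example can be illustrated with a minimal sketch. The sample labels, recording IDs and the `select_targeted` helper below are all hypothetical, invented purely for illustration; the point is only that a data-centric workflow filters for the failure condition before retraining, instead of accumulating more generic data.

```python
# Hypothetical catalogue of audio samples: (recording_id, background_label).
# The IDs and labels are invented for this sketch.
samples = [
    ("rec-001", "car_noise"),
    ("rec-002", "silence"),
    ("rec-003", "car_noise"),
    ("rec-004", "crowd"),
    ("rec-005", "car_noise"),
]

def select_targeted(samples, condition):
    """Keep only the samples matching the condition we need to fix --
    the 'good data' -- rather than feeding the model all the data."""
    return [s for s in samples if s[1] == condition]

# Retrain only on recordings that reproduce the failure condition.
targeted = select_targeted(samples, "car_noise")
print(len(targeted))  # 3 of the 5 samples are retained
```

The retraining set here is a fraction of the catalogue, which is exactly the point of the article: less data stored, moved and processed for the same corrective effect.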

Greg Mulholland, CEO of Citrine Informatics, also cites the example of a chemical group that was able to remove a toxic ingredient from one of its formulas thanks to an AI system using only a hundred data points, while moving five times faster than with the traditional approach.

In other words, we would consume less data and only the “good” data to solve a problem. Good data rather than big data! We still have to accept the principle of simple solutions based on common sense rather than sophisticated solutions based on (too) refined intelligence.

Victoire de Margerie is founder and vice-president of the World Materials Forum.
