The risk of data decentralization is mitigated with unified consumption

Data has become the central axis of financial institutions. Both banks and insurance companies face a pivotal moment in terms of competitiveness, one that can only be met through innovation in the management of massive volumes of information.

That is why elEconomista.es, in collaboration with PUE, Dell and Starburst, organized a round table to discuss a crucial question for how companies operate: whether it is better to centralize or decentralize data.

Opinions on the matter varied, but all the speakers agreed that the ideal is to have a centralized place where data can be accessed quickly and easily, while its consumption remains decentralized and always tied to the needs of the company’s business areas. This argument, put forward by Nicolás Oriol, deputy general director of data, advanced analytics and robotics at Mutua Madrileña, brought the insurance and banking sectors into agreement, although it was not the only vision of how data should be processed and stored.

Tomás Arteaga, director of data science and analytics at Deutsche Bank Spain, stated that they have been working for years “on the centralization of data in a Data Warehouse so that all divisions can access it easily.” He added: “We see decentralizing data for proofs of concept, sandboxes, etc.”

“Centralizing means security, protection, sovereignty…, but ten years on, reality has overtaken us, and the speed with which the business asks us for things has changed,” said César Tapias, sales director EMEA WER at Dell. He pointed to the time the centralization process takes, which would mean “arriving late to the market”, something that for Tapias is not acceptable, because it leads to poor decision-making. The Dell representative then offered a striking figure: companies use only 28% of their data to make decisions. This, Tapias said, means that in the future the problems “will multiply due to the massive use of artificial intelligence (AI)”, and he warned that if companies do not have a solution “that allows them to access the data from anywhere, in an open and standards-based way, the seams will show.”

Adrián Estala, field CTO of Starburst, insisted on the need for speed: “Decisions have to be made now, because we do not have the luxury of missing opportunities. Before, we could take five years to change the company; now we only have five months,” he clarified.

Javier Marqués, head of data at Generali, explained that the Italian insurer is committed to decentralization at the international level, but declared that within each country they opt for centralization “so that activities are not repeated and duplicated.”

This type of “mixed model” was also highlighted by Asier Gochicoa, CDO of Kutxabank, for whom “the data must come from the banking model, and from there other aspects can be explored.” Sergio Rodríguez, CTO of PUE, took a similar view: “The evolution between centralization and decentralization is now a gray scale. The trend is a decentralized model that can unify actions,” he explained.

The role of the legacy system

Legacy systems, computer systems that have become outdated through technological advances but are still in use, were another of the issues discussed at the round table. Marqués is clear that moving to an updated system “involves a lot of effort, and more so the older it is, but the legacy also has good things, we should not demonize it.” According to Marqués, this inherited system, in the end, “also transmits good things, such as stability and very strong solvency, because the data is robust and the response times are good.”

Fernando Lipúzcoa, CDO of ING in Spain and Portugal, highlighted that neobanks have “the ability to create onboarding and an outstanding customer experience in record time.” More traditional banks, on the other hand, take longer: “We have worked a lot to enrich the data and give it meaning according to the regulator’s demands, but that defensive strategy has become an offensive one, giving value to the client.”

Generative AI, with caution

Finally, another of the growing topics within data management is the emergence of generative artificial intelligence and how it will affect the process. Oriol explained that, for Mutua, generative artificial intelligence “is ready to be used at a business level,” and that the company is already working on initiatives that will bring improvements in the customer experience and in the company’s results.

Gochicoa, for his part, said that “problems are still being encountered, because the technology is not always mature enough for some use cases. We are a cautious sector, and we have to do a lot of trial and error until we find the key,” he concluded. Arteaga, Marqués and Lipúzcoa agreed with him, though the latter stressed that “sooner or later, we will have to surf the tsunami.”

Rodríguez rounded off the round table by highlighting that “AI closes the circle, and specializing this technology based on data is key and necessary,” but he clarified that we must not lose sight of sovereignty (with whom the data is shared) or of where the information is processed.
