Outlook 2019: Gerard Francis, Bloomberg

12.24.2018

Gerard Francis is head of enterprise data at Bloomberg

What will be the skill sets most in demand in 2019?

Continuing efforts to build a more diverse workforce will be pivotal to success. It has been my experience, time and time again, that diverse teams thrive and dramatically outperform homogeneous teams.

More specifically, as we look at the rapid changes AI and machine learning are bringing, data science skill sets will be the most sought after. Employees who have a deep understanding of these technologies and can draw insightful conclusions from vast amounts of raw data will be in demand. People who can integrate AI, machine learning, and natural language processing concepts with insightful knowledge of a business domain will be especially valuable as the industry looks to automate solutions to real-world problems.

What changes do you expect to see regarding artificial intelligence and machine learning in 2019?

When it comes to AI and machine learning, three components make it work. The first is the technology stack, or the ability to get computation on demand. The second is the learning algorithms, which have been in place for decades. Finally, there is the data wrangling, which is the hardest part.

You cannot do supervised learning unless you have clean historical data with which to work. Relationships between types of data are also complicated and difficult for a machine to interpret.

In 2019, there will be a move towards solving that problem. At Bloomberg, we are particularly focused on getting to a point where the data our clients receive is ready to be consumed directly by their learning algorithms. That change will drive much innovation within the overall financial technology world.
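
To make the data-wrangling point concrete, here is a minimal sketch in Python using pandas and scikit-learn. The column names, cleaning rules, and label are hypothetical stand-ins, not Bloomberg's actual pipeline; the point is simply that duplicates, gaps, and inconsistent types have to be resolved before a supervised model can be trained on historical data.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical raw vendor data: duplicates, gaps, and mixed types.
raw = pd.DataFrame({
    "ticker": ["AAPL", "AAPL", "MSFT", "GOOG", None],
    "close":  ["150.10", "150.10", None, "1050.30", "98.20"],
    "volume": [1_000_000, 1_000_000, 2_500_000, None, 750_000],
    "label":  [1, 1, 0, 1, 0],  # e.g. next-day direction (illustrative)
})

# Data wrangling: the step that must happen before any learning.
clean = (
    raw.drop_duplicates()            # remove duplicated records
       .dropna(subset=["ticker"])    # rows without an identifier are unusable
       .assign(close=lambda d: pd.to_numeric(d["close"], errors="coerce"))
)
clean[["close", "volume"]] = clean[["close", "volume"]].fillna(
    clean[["close", "volume"]].median()
)

# Only once the history is clean can supervised learning run on it.
X, y = clean[["close", "volume"]], clean["label"]
model = LogisticRegression().fit(X, y)
print(model.predict(X))
```

The two lines that fit and predict are trivial; everything above them is the wrangling, which is where most of the effort goes in practice.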

What was the most important lesson the industry learned this year?

The most important lesson of 2018 concerns the high operational and efficiency costs of data fragmentation. The volume, complexity, and diversity of data available to decision-makers have grown exponentially in recent years, but that data has also become more disjointed. As a result, financial services firms have had to re-evaluate the fundamental ways in which they discover, analyze, and apply data to make critical decisions.

Historically, financial firms could not satisfy all their data needs in one place. They would use separate data vendors for reference data, real-time data, and pricing data, and then stitch it all together manually across various platforms. Without a common symbology and common field definitions to link the data together, making sense of the growing volume of unstructured and structured data became an extremely costly process. On average, for every dollar spent acquiring data, firms spend another $5 to $7 cleaning it for use. Quants at these firms spend 80% to 90% of their time just on this cleaning process.
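
A small illustration of that stitching problem, with made-up identifiers: each vendor keys the same instrument differently, so without a shared symbology a firm has to build and maintain its own mapping table just to join two feeds. The feeds, identifier schemes, and fields below are hypothetical.

```python
import pandas as pd

# Hypothetical feeds: each vendor keys the same instruments differently.
reference = pd.DataFrame({
    "vendor_a_id": ["IBM.XN", "MSFT.XO"],
    "sector":      ["Technology", "Technology"],
})
realtime = pd.DataFrame({
    "vendor_b_id": ["IBM-US", "MSFT-US"],
    "last_price":  [134.20, 98.60],
})

# Without common identifiers, a hand-maintained mapping table is the glue.
# This is the manual "stitching" described above, and it has to be kept
# current for every instrument and every pair of vendors.
symbology_map = pd.DataFrame({
    "vendor_a_id": ["IBM.XN", "MSFT.XO"],
    "vendor_b_id": ["IBM-US", "MSFT-US"],
})

linked = (
    reference.merge(symbology_map, on="vendor_a_id")
             .merge(realtime, on="vendor_b_id")
)
print(linked)
```

With a common symbology supplied at the source, the mapping table and its maintenance cost disappear, which is the consolidation benefit described next.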

As firms simplify their approach and realize the cost and efficiency benefits of receiving quality, standardized data through one source, they can spend more time applying the data to drive bottom-line value across their enterprise.
