For most of this decade, new-age, technology-driven disruptors have been threatening the existence of traditional incumbents in every industry. To survive and thrive, incumbents need to match the technological prowess and agility of these new-age competitors. In 2016, HSBC embarked on an ambitious technology transformation journey. From new ways of working to automation and tooling, HSBC overhauled its entire global IT function to fundamentally change how it serves its customers.
One of the key pillars of the transformation was the adoption of cloud technologies. HSBC partnered with leading cloud technology providers to develop new services and create engagement models that allowed the highly regulated financial industry to adopt the public cloud at scale, across a complex global regulatory landscape.
With more than 40 million customers, HSBC is no stranger to big data and cloud adoption. A sizeable share of its approximately 240 petabytes of data now sits in the cloud.
Adoption of Big Data & Cloud Technology
At the outset, the Bank tried traditional relational databases, adding data warehousing systems to crunch greater volumes of big data, but this mixed stack was not efficient. As volumes grew and unstructured data started coming in, it turned to Hadoop, making a foray into open-source technologies. However, HSBC required the performance and data integrity of an RDBMS, the analytics capabilities of a data warehouse, and the speed and scale of the Hadoop ecosystem, all in one system. Cloud technologies seemed like the best option for the future.
“4 years into the journey, we have built an enduring capability to enable personal and relevant customer experiences. We are improving data-driven business decisions that will also support the regulatory agenda. We have various modern Analytics and Machine Learning modules running on massive, curated datasets to support automated decision making for Risk, Financial Crime and Fraud. These also support IFRS9 regulatory reporting and AI-powered equity index products. With data analytics, open-source, and machine learning at the core of our data platforms, as well as our growing cloud footprint, this has turned out to be like a magic pill for HSBC,” explains Pradeep Menon, HSBC’s Global Head of Data Engineering, whose teams are based in Hyderabad and Pune, India.
One of the key programmes currently underway is moving big data lakes to the cloud, where vast amounts of data are held securely in their native format until needed for analytics. HSBC is leveraging the latest in data engineering technologies, including the Google Cloud stack (Cloud Storage, BigQuery, Cloud SQL, Stackdriver), to give its analytics the machine learning edge. These technologies are helping the Bank make better sense of its data, and do so faster, to meet evolving needs across its various markets.
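The data-lake idea described here, keeping data in its native format and applying a schema only when it is queried, is often called "schema-on-read". The following is a minimal, self-contained sketch of that pattern in plain Python; the file layout, function names, and sample records are hypothetical illustrations, not HSBC's actual pipeline or the Google Cloud APIs.

```python
import json
from pathlib import Path

# Illustrative sketch of schema-on-read: records land in the lake in their
# native format (raw JSON lines) and a schema (the requested fields) is
# applied only at query time, unlike an RDBMS, which enforces it up front.

def write_raw(lake_dir: Path, name: str, records: list[dict]) -> Path:
    """Land records in the lake untouched, one JSON document per line."""
    lake_dir.mkdir(parents=True, exist_ok=True)
    path = lake_dir / f"{name}.jsonl"
    with path.open("w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
    return path

def query(path: Path, fields: list[str], predicate) -> list[dict]:
    """Apply a schema (field projection) and a filter at read time."""
    out = []
    with path.open() as f:
        for line in f:
            rec = json.loads(line)
            if predicate(rec):
                # Missing fields simply come back as None; the raw data
                # was never forced into a fixed schema when written.
                out.append({k: rec.get(k) for k in fields})
    return out

if __name__ == "__main__":
    lake = Path("lake/transactions")
    p = write_raw(lake, "2020-01", [
        {"id": 1, "amount": 250.0, "currency": "GBP", "channel": "branch"},
        {"id": 2, "amount": 9800.0, "currency": "USD"},  # no channel field
    ])
    large = query(p, ["id", "amount"], lambda r: r["amount"] > 1000)
    print(large)  # [{'id': 2, 'amount': 9800.0}]
```

In a cloud setting the same idea maps naturally onto object storage for the raw landing zone and a query engine for the read-time schema, which is one reason object stores pair well with analytics warehouses.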
“Another important area where we are applying our big data technologies is in preventing financial crime and fraud. HSBC has deployed an industry-leading Anti-Money Laundering (AML) system and an automated sanctions checking system as part of its ongoing efforts to improve financial crime detection. HSBC recently won the Celent Model Bank Award for Risk Management for launching a global analytics solution that identifies potential financial crime by contextually analysing customer, transactional, and publicly available data in order to understand a customer’s global network,” shares Chittaranjan Bhide, Head of Data Engineering Delivery.
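The core of such contextual analysis is treating customers and counterparties as a network rather than screening each transaction in isolation. Below is a toy sketch of one common building block, scoring parties by their proximity to known high-risk entities via breadth-first search; the function, party names, and two-hop cutoff are hypothetical illustrations, not HSBC's actual AML solution.

```python
from collections import deque

# Illustrative sketch only: build an undirected network from payment
# edges, then flag every party within max_hops of a known high-risk
# (e.g. sanctioned) entity, recording how many hops away it sits.

def risk_by_proximity(edges, sanctioned, max_hops=2):
    """Return {party: hops} for all parties within max_hops of a
    sanctioned entity in the transaction network."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    hops = {s: 0 for s in sanctioned if s in graph}
    queue = deque(hops)
    while queue:                      # standard BFS from all seeds
        node = queue.popleft()
        if hops[node] == max_hops:    # do not expand past the cutoff
            continue
        for nbr in graph.get(node, ()):
            if nbr not in hops:
                hops[nbr] = hops[node] + 1
                queue.append(nbr)
    return hops

if __name__ == "__main__":
    # Hypothetical payments between parties.
    edges = [("alice", "acme_ltd"), ("acme_ltd", "shell_co"),
             ("shell_co", "sanctioned_x"), ("bob", "carol")]
    flagged = risk_by_proximity(edges, sanctioned={"sanctioned_x"})
    print(flagged)  # {'sanctioned_x': 0, 'shell_co': 1, 'acme_ltd': 2}
```

A production system would of course enrich this with transaction amounts, timing, and public data, but the hop-distance signal illustrates why network context catches intermediary shell structures that per-transaction rules miss.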
Wrapping Up
HSBC has understandably spent a lot of time and effort defining and implementing a safe route to the cloud for its needs. This not only expedites its own future cloud migrations, but also paves the way for other large global financial organisations to do the same at a much faster pace, while adhering to the sector’s stringent security, privacy and compliance rules.