May 4, 2023

UCalgary-U.S. team uses technology to fight financial cybercrime

Federated learning approach nets 3rd-place finish at Summit for Democracy 2023
Steve Drew, left, and Fan Dong were part of a team that won third place at Summit for Democracy 2023. Joe McFarland, Schulich School of Engineering

A Schulich School of Engineering assistant professor and his graduate student are being rewarded for their work in the fight against financial cybercrimes.

Dr. Steve Drew, PhD, and Master of Science in software engineering student Fan Dong teamed up with a research group from Michigan State University (MSU) that took part in the Summit for Democracy 2023.

Convened by U.S. President Joe Biden, the summit included the Privacy-Enhancing Technologies Prize Challenges, aimed at using data analysis and machine learning to solve pressing societal issues.

For the UCalgary-MSU team, the challenge was to detect financial crime moving through banks without compromising personal information and data, and their approach earned them a third-place finish.

“This is a perfect example of how research and technology can help protect and promote democracy,” says Drew, an assistant professor in the Department of Electrical and Software Engineering. “Our solution has been recognized internationally for preventing financial crime using federated learning.”

Multi-billion-dollar industry

Drew says federated learning is a distributed, privacy-preserving machine-learning approach that uses algorithms and systems to solve problems like money laundering, which, according to the United Nations, costs the world between two and five per cent of its gross domestic product.

That works out to somewhere between $800 billion and $2 trillion U.S. being laundered annually.

By whisking money through a complex web of accounts and loans at different banks to obscure where it comes from, criminals are able to escape detection, since each bank can only see its own customers’ behaviour.

“Financial crime detection requires both transfer records and account histories to collaboratively learn a predictive model that detects problematic transactions,” Drew says. “However, payment network systems, such as the Society for Worldwide Interbank Financial Telecommunication (SWIFT), have money transfer records but don’t have access to sender and recipient account histories, while the financial institutions have account histories but don’t have the big picture of all transfers.”

He says the two types of data can’t be shared because of privacy-related regulations. The General Data Protection Regulation (GDPR), often described as the toughest privacy and security law in the world, came into effect in 2018.

Drew says the GDPR levies harsh fines against those who violate its privacy and security standards, with penalties reaching the tens of millions of euros.

Federated learning is a promising way to protect the privacy of user data without breaking these regulations, while still allowing artificial intelligence to benefit from the data.

Detecting high-risk money transfers

Drew says he has worked with Dr. Jiayu Zhou, PhD, at MSU on federated learning research for a number of years with multiple publications and ongoing projects, so it made sense to collaborate for the challenge.

They designed a protocol that Drew says can securely train a prediction model without sharing users’ account data.

The model is first trained at the financial institution level, then encrypted and sent to a payment network system like SWIFT.

“With models from multiple financial institutions, the payment network system aggregates their models with its own trained model to generate a global model for detecting financial crime,” says Drew. “Such models have proven to be more effective in detecting high-risk money transfers compared with traditional methods, which leverage only the money transfer interactions from a payment network system.”
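The workflow Drew describes follows the familiar federated-averaging pattern: each institution trains on its own data, and only model weights are shared and combined. The sketch below is a minimal, illustrative Python example of that pattern, assuming a toy least-squares model, synthetic data and a plain weighted average in place of the team’s actual encrypted training and aggregation pipeline; all names and functions here are hypothetical.

```python
# Illustrative FedAvg-style sketch of the workflow described above.
# Assumptions: a toy least-squares "model", no real encryption, synthetic
# data -- this is not the team's actual implementation.
import numpy as np

def train_local_model(features: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Each financial institution trains on its own account data; only the
    resulting weights (not the raw data) ever leave the institution."""
    weights, *_ = np.linalg.lstsq(features, labels, rcond=None)
    return weights

def aggregate(network_weights: np.ndarray,
              bank_weights: list[np.ndarray],
              bank_sizes: list[int]) -> np.ndarray:
    """The payment network combines the banks' models with its own model,
    weighting each bank's contribution by how much data it trained on."""
    total = sum(bank_sizes)
    combined = sum(w * (n / total) for w, n in zip(bank_weights, bank_sizes))
    return 0.5 * network_weights + 0.5 * combined  # simple equal blend

# Example: three banks with synthetic account data train locally, then the
# payment network builds a global model from the shared weights alone.
rng = np.random.default_rng(0)
banks = [(rng.normal(size=(200, 5)), rng.integers(0, 2, 200).astype(float))
         for _ in range(3)]
local_models = [train_local_model(X, y) for X, y in banks]
global_model = aggregate(np.zeros(5), local_models, [len(y) for _, y in banks])
print(global_model)
```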

With Dong writing most of the code, Drew says the team stayed up late on the final night of the competition to make sure they submitted the optimal solution.

Data 'belongs to the users'

Drew says he was thrilled that his team finished in the top three of the competition, but it’s the real-world impact he is most proud of.

“As AI technologies are rapidly advancing today, there are growing concerns that large corporations and governments have too much information about individuals for things like surveillance,” he says. “This kind of activity could be a major threat to democracy.”

Drew says we are already seeing the implications of this technological evolution with social media and the political content users are seeing on their feeds.

“I believe the ownership of data ultimately belongs to the users,” he says. “Corporations and governments may use technologies like AI to learn from the data, but must ensure that the data is solely used for the intended purposes which have been disclosed to users in advance.”

Drew says protecting the privacy of citizens around the world is a vital part of maintaining democracy, and he hopes more people will pay attention to the long-term impacts of technology, not just its short-term convenience.

