
As technology impacts everything from how products are chosen to how advisors interact with clients, it brings with it new accountability questions.

What happens, for example, to an advisor’s practice (and fees) when almost everything is automated? And who is held responsible for an algorithm’s recommendations?

“I’m just waiting for the day when there might be some sort of enforcement action, [and] somebody will say, ‘The algorithm made me do it!’” said James Leong, senior legal counsel at the BC Securities Commission.

Leong was speaking Tuesday on a panel about AI and fintech at Advocis’s symposium in Toronto. The panellists covered issues that could soon confront advisors and regulators.

For one, current regulations require individuals and their firms to be registered, Leong said. “In the future then, if you do end up with an algorithm that really does have a decision-making process that is transparent—that has been highly tested and the quality of its recommendations is in line with [those of] a qualified representative—how would you actually look at that under the regulatory framework?” he asked.

Leong said he’s not sure when we’ll get to that point, but that regulatory sandboxes provide opportunities to observe this type of risk. He’s not convinced regulators need to be granted access to an algorithm’s inner workings in order to regulate it, though.

Algorithms can create problems and also provide solutions for product suitability, Leong said. Advisors have often worried about being “able to navigate the whole universe of products to figure out if something’s suitable,” he said. Algorithms will be able to refine that search, and possibly even identify previously unknown biases in an advisor’s decision-making process.

One potential benefit would be eliminating certain types of conflicts of interest by, for example, screening out products that give the advisor a higher referral fee. “Maybe that factors inherently into people’s decision-making while they don’t necessarily realize it,” Leong said.

Of course, he also noted that algorithms’ own biases have been found in areas ranging from hiring to sentencing.

Leong also discussed the scenario where an advisor uses a registered portfolio manager who provides a platform that offers recommendations.

“You have to step back and consider who’s providing what in this context,” he said. “Again, if you’re providing registrable activity, somebody needs to be registered for it. Maybe in some cases, if you have no input and the portfolio manager’s actually learning all about the clients and they have all the recommendations—as part of your larger financial advice portfolio—then there might be two different types of hazards in a sense.”

Both the advisor and the portfolio manager may have responsibilities to the client in that case, he said. A portfolio manager who does planning for a client has a fiduciary duty.

IIROC addressed online advice offerings in its 2017/2018 compliance priorities report. The SRO said it’s looking for clear disclosure around products and conflicts of interest where proprietary recommendations are made, and for “an appropriate account-opening process, including adequate online [KYC] and risk-tolerance assessments.” It’s also seeking “sufficient” oversight within firms and of affiliated advisors.

Wendy Rudd, IIROC’s senior vice-president of member regulation and strategic initiatives, told Advisor’s Edge earlier this year that if there were an online advisory problem that a client didn’t cause, “we would look to the firm, given that it created the algorithm. The client would not be without recourse.”

Read: How regulators are adapting to robo growth

A role for in-person advice?

Another hypothetical, posed as an ethics question about fees, asked what happens when technology reaches the point where an advisor’s business is completely automated aside from the occasional client meeting, with an AI advisor managing account changes.

Chadi Habib, EVP for information technology at Desjardins Group, said he doesn’t think more online tools will mean the end of in-person advice—even though 94% of interactions with Desjardins’ customers already don’t involve a human. Those aren’t the value-added interactions, he said.

While he said everything “mechanical” about the financial services industry will be automated, human interaction isn’t part of that. “We do not believe in a 100% virtual or automated advice,” he said.

Even digitally savvy clients are no exception, he said. “Millennials are pushing us hard in our industry, saying ‘Stop over-virtualizing everything. There are several key life events that are critical for us to interact with a human being.’”

Clients are increasingly looking for that human interaction on their own terms, though, so advisors need to be prepared to meet via Skype or in a coffee shop rather than in their office, he said.

“You have to decide in your business—where is it that you bring the most amount of value?” he said.