Sebi has proposed that registered Investment Advisers and Research Analysts who employ artificial intelligence (AI) tools in their services must disclose the extent of such usage to clients, while emphasising the need for strong security measures to avoid unintended data exposure.
This transparency is crucial for clients to understand how AI tools contribute to their investment decisions and to make informed choices about their advisory services.
"The possibility of unintended data exposure highlights the need for strong security measures and clear disclosure to clients about the extent of AI tool usage", Trivesh D, COO at Tradejini, a stock trading platform, told PTI.
The Securities and Exchange Board of India (Sebi), in its consultation paper earlier this month, highlighted the growing usage of AI tools in Investment Adviser (IA) and Research Analyst (RA) services.
With technological innovations and advancements, many AI tools are now available in chatbot form, such as OpenAI's ChatGPT and Google's Gemini.
These AI-based tools allow users to hold human-like conversations with the chatbot and receive human-like responses. They assist with tasks such as summarising and analysing data, and may help improve efficiency and productivity.
"These AI tools, however, may not adequately safeguard sensitive data shared during conversations, potentially leading to unintended data exposure and concerns related to data security," Sebi said in its consultation paper issued last week.
Feroze Azeez, Deputy CEO of Anand Rathi Wealth Ltd, said, "While embracing this innovation, we must be mindful of its implications and responsibilities."
IAs provide personalised services tailored to client-specific requirements based on risk profiling and suitability. Similarly, RAs provide recommendations based on defined parameters and methodology, and are required to keep records of their research reports, recommendations and the rationale for arriving at those recommendations.
While AI tools can provide significant assistance in the work of IAs and RAs, they may not always produce meaningful outputs, which are expected to be based on an understanding of complex security-specific or client-specific scenarios and requirements, such as personal or financial conditions and goals, Sebi stated.