For some time now, consumer protection experts, politicians and industry representatives have been urging BaFin to approve algorithms used in the financial sector. In their view, BaFin’s advance approval would enable algorithms to be used in compliance with the law – even in the case of outsourcing arrangements, i.e. where external service providers make algorithms available to supervised undertakings. In practice, however, the large-scale review of algorithmic decision-making processes is not feasible; particularly on the financial market, it would hardly be worthwhile for most of the algorithms being used.
BaFin understands an algorithm to be a well-defined set of instructions for solving a problem or a category of problems. Algorithms are normally implemented in computer programs, where they execute predefined, individual steps. Algorithms and their results are of particular interest to BaFin when they are used to make decisions that are relevant to financial supervision. BaFin’s supervisory focus is not on the algorithm itself, however, but on the overall algorithm-based decision-making process.
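To make this definition concrete, the following is a minimal, purely hypothetical sketch of an algorithm in the sense described above: a well-defined set of instructions, executed as predefined, individual steps, whose result feeds a supervision-relevant decision. The function name, field names and thresholds are invented for illustration and do not reflect any actual undertaking’s system or any supervisory standard.

```python
# Hypothetical illustration only: a deliberately simple rule-based
# credit decision. It shows the shape of an "algorithm" as BaFin
# defines it, i.e. predefined steps applied to input data.

def credit_decision(applicant: dict) -> str:
    """Execute predefined, individual steps to reach a decision."""
    # Step 1: if income data is missing, no automated decision is made.
    # (The result of an algorithm depends on the data available.)
    if applicant.get("monthly_income") is None:
        return "refer to human review"
    # Step 2: apply a fixed, well-defined debt-to-income rule.
    ratio = applicant["monthly_debt"] / applicant["monthly_income"]
    if ratio > 0.4:
        return "reject"
    # Step 3: otherwise approve.
    return "approve"

print(credit_decision({"monthly_income": 3000, "monthly_debt": 600}))
print(credit_decision({"monthly_income": 3000, "monthly_debt": 1500}))
```

Even in this toy example, what matters from a supervisory perspective is not the few lines of code themselves but the process around them: where the input data come from, how their quality is assured, and how the output is used in the undertaking’s decision-making.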
In any case, the important factor is how supervised undertakings actually incorporate algorithms (see info box) into their decision-making processes. An algorithm that is suitable for one particular context may yield unusable results in another situation or contravene the conditions under which an approval was granted. Furthermore, the results of an algorithm depend on the data available and the quality of such data. This is why BaFin’s supervision does not focus solely on the algorithm itself but on the overall algorithm-based decision-making process – from the data to the results – and on the associated risks. BaFin is thus able to remain technology-neutral.
Risk-based supervision of algorithm-based decision-making processes – a worthwhile approach
BaFin’s supervisory activities are risk-oriented and are conducted as and where necessary. In processes such as the authorisation procedure, the ongoing supervision of institutions and the prevention of and fight against violations of statutory provisions, BaFin reviews and raises objections to algorithm-based decision-making processes in the same manner as it deals with human decision-making processes. In doing so, BaFin draws on the statutory provisions – which are worded for the most part in technology-neutral terms – governing the authorisation requirement, proper business organisation, organisational requirements and documentation requirements. With this approach – instead of approving all algorithm-based decision-making processes – BaFin allocates its available resources in a risk-oriented manner to fulfil its supervisory objectives. At the same time, it prevents algorithm-based processes from being subjected to standards that are more stringent than those applied to comparable processes performed by humans. What is more, BaFin thus makes sure that extensive approval procedures do not stifle innovation.
No legal basis for general approval
Finally, there is also no legal basis for a general approval of algorithms or algorithm-based decision-making processes. From the supervisors’ point of view, there is also no need for BaFin to have a general approval process in place for algorithm-based decision-making processes. However, there are special cases regulated by law in which the scope of an algorithm is defined and for which there are at least general provisions and minimum requirements in place for the procedures used. Even in such cases, though, BaFin does not grant a general approval but examines whether a procedure is suitable for its purpose. BaFin’s assessment depends on factors such as the data available and their quality, as well as the processes in which the relevant undertakings use such procedures. Examples in this respect include internal models used by credit institutions and insurers to calculate their regulatory capital requirements or solvency. The law likewise requires BaFin to approve the models that central counterparties use to calculate margin requirements and default fund contributions. The legal bases are set out for credit institutions in Article 142 et seq. and Article 362 et seq. of the European Capital Requirements Regulation (CRR), for insurers in sections 111 et seq. of the German Insurance Supervision Act (Versicherungsaufsichtsgesetz – VAG) and for central counterparties in Article 49 of the European Market Infrastructure Regulation (EMIR). In terms of risk, these provisions are necessary because the models have a significant impact on the resilience of the supervised undertakings and thus, potentially, on financial stability in general.
When BaFin checks whether a financial service is subject to the authorisation requirements under the supervisory legislation, it is likewise irrelevant whether an undertaking uses algorithms. Here too, the law is structured in a technologically neutral way. One exception is high-frequency trading as a special form of proprietary trading under section 1 (1a) sentence 2 no. 4 (d) of the German Banking Act (Kreditwesengesetz – KWG). In this case, the purchase and sale of financial instruments are based directly on decisions determined by algorithms; the algorithms themselves constitute an element of the financial service that is subject to the authorisation requirement. In addition, there are no universal legal provisions common to or comparable across all supervisory legislation that require undertakings to report or notify algorithm-based decision-making processes to BaFin. There are only a few individual reporting or notification requirements, such as the notification requirement under section 80 (2) sentence 5 of the German Securities Trading Act (Wertpapierhandelsgesetz – WpHG), which is aimed at investment services enterprises engaged in algorithmic trading.
This article reflects the situation at the time of publication and will not be updated subsequently. Please take note of the Standard Terms and Conditions of Use.