AI in the regulated environment: A case study with grenke AG
Financial institutions are subject to particularly strict regulations in the areas of data protection, information security, and IT security. This also applies to AI systems. This case study demonstrates how two secure AI solutions were implemented at grenke to support employees in carrying out their tasks.

Co-authors: Tobias Domnik, Tobias Eljasik-Swoboda, Katharina Rainer, Branko Ristivojcevic, Jasmin Schauer, Tobias Wüchner
About grenke: A global financing partner
grenke is a global financing partner for small and medium-sized enterprises, offering flexible, fast, and straightforward small-ticket leasing and banking services. This creates financial flexibility and helps customers focus on their core business. Founded in 1978 in Baden-Baden, the group now operates with around 2300 employees in over 30 countries worldwide and is an international leader in small-ticket leasing.
At the beginning of 2023, grenke publicly announced its strategic decision to invest extensively in digitalization. As part of the Digital Excellence program, essential foundations for a modern cloud infrastructure were established, and IT security and data architecture were aligned with future challenges. ONTEC AI was already involved in building the delivery platform at that time.
Building on this technical foundation, 2024 saw the development of grenke’s own AI platform, focusing on generative AI, machine learning, and agent-based AI. The goal is twofold: to make business processes more efficient by integrating AI into grenke’s software systems and to equip employees with state-of-the-art tools to support them in successfully performing their tasks.
A company-wide campaign on AI use cases
Through a company-wide campaign, the global demand for AI use cases within the organization is identified, evaluated, classified, and prioritized along a dependency roadmap. However, the final decision lies with the employees, who, through an internal vote, select their top three use cases as starting points for implementation. This approach ensures that the topic of artificial intelligence is addressed with strong buy-in from the workforce.

Among the top use cases is a general AI chatbot assistant that supports daily work.
Another top use case is a grenke AI knowledge database, enabling quick and easy access to the right company information at the right time. This made it a logical step to deepen the existing collaboration and, in particular, to draw on ONTEC AI’s expertise in generative AI solutions.
The chatbot assistant
The grenke chatbot – named ELLA through an internal vote – provides secure access to various LLMs and LLM services within an environment defined and fully controlled by grenke, an essential requirement under GDPR and DORA. The LLM chat was implemented within grenke’s infrastructure and offers a wide range of features, including:
- Text generation
- Translation
- Text summarization
- Proofreading
- … and many more.
The LLM chat is further enhanced by a “Chat-with-Documents” feature, a question-and-answer function for individual documents. A clear chat history and a prompt library, where frequently used prompts can be saved or imported from a company-wide library, complete the usability of this AI solution.
To ensure a secure environment for sensitive data, it was necessary to deploy on-premise LLMs within grenke’s infrastructure. In a hybrid setup, data can now be routed to on-premise models or a standard LLM provider, depending on the use case.
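The hybrid routing described above can be sketched as a simple dispatch rule. This is a minimal illustration, not grenke’s actual implementation: the endpoint names and the sensitivity flag are hypothetical, and a production system would likely derive the flag from a data classifier or policy engine.

```python
from dataclasses import dataclass


@dataclass
class ChatRequest:
    prompt: str
    contains_sensitive_data: bool  # in practice, set by a classifier or policy


# Illustrative endpoints only; real targets would come from configuration.
ON_PREM_ENDPOINT = "https://llm.internal.example/v1/chat"
EXTERNAL_ENDPOINT = "https://api.provider.example/v1/chat"


def route(request: ChatRequest) -> str:
    """Send sensitive requests to the on-premise model; all others may
    go to the standard external LLM provider."""
    if request.contains_sensitive_data:
        return ON_PREM_ENDPOINT
    return EXTERNAL_ENDPOINT
```

The key design choice is that routing happens inside the controlled environment, so sensitive data never leaves the company’s infrastructure regardless of which model ultimately answers.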
AI Knowledge Management: An Internal Question-and-Answer Solution
grenke possesses a vast internal data repository that employees continuously rely on for their daily work. A significant portion of these documents consists of critical internal guidelines on processes and legal frameworks for the highly regulated company. In the past, this required regular, manual searching and scanning of documents across different languages and regional versions – a time-consuming and often exhausting task.
The developed AI assistant significantly streamlines this process. The tool offers a semantic search function, allowing employees to query documents and information in plain language. They receive clear answers in the form of summaries based on the information in the grenke knowledge database – including quoted passages and direct references to the original documents from which the answers were generated.
To achieve this, the AI assistant accesses internal document repositories through various interfaces. The tool is based on RAG (Retrieval-Augmented Generation) technology and uses validation algorithms to ensure that answers are accurate and traceable to the documents they were generated from.
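The core retrieve-then-answer loop of a RAG system can be reduced to a few lines. The sketch below uses a toy bag-of-words similarity over an in-memory document store purely for illustration; a real deployment would use embeddings, a vector database, and an LLM to synthesize the answer, with validation against the retrieved passages before anything is returned.

```python
import math
from collections import Counter

# Toy document store; document names and contents are invented examples.
DOCS = {
    "leasing_policy.pdf": "Leasing contracts above the approval limit require a second signature.",
    "it_security.pdf": "All employees must use two-factor authentication for remote access.",
}


def _vec(text: str) -> Counter:
    """Bag-of-words term frequencies as a stand-in for real embeddings."""
    return Counter(text.lower().split())


def _cosine(a: Counter, b: Counter) -> float:
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0


def retrieve(query: str):
    """Return the best-matching (source, passage) pair for a query."""
    q = _vec(query)
    return max(DOCS.items(), key=lambda kv: _cosine(q, _vec(kv[1])))


def answer(query: str) -> str:
    source, passage = retrieve(query)
    # A real system would feed the passage to an LLM and validate the
    # generated answer against it; here we return the passage directly.
    return f"{passage} (source: {source})"
```

Returning the source reference alongside the answer is what gives employees the quoted passages and direct document references described above.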
We’ve discussed this project in depth in a webinar!
Watch our webinar with grenke AG and grenke digital GmbH and learn how the financial institution implemented AI in their highly regulated environment.

Beta Testing in the Field of Agent-Based AI
During the course of the project, it became evident that scopes of applicability (e.g., which country or department a guideline applies to) have dependencies, and iterative searches are required to place queries in the correct context.
To support this process, an AI agent was implemented with various capabilities:
- Pre-filling form fields, such as specifying which division or department documents should be searched within (bottom-up search).
- Asking follow-up questions and refining search parameters until the correct document or information is found.
- Multilingual functionality, automatically translating and delivering relevant search results across languages.
The agent operates on the “human-in-the-loop” principle: while it handles much of the manual, preparatory work, a human always makes the final input by confirming the action via a “confirmation button.”
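The human-in-the-loop principle can be expressed as a small control-flow pattern: the agent prepares an action, but nothing executes without explicit confirmation. The function names below are illustrative, and `confirm` stands in for the UI’s confirmation button.

```python
EXECUTED = []  # records actions that were actually carried out


def execute(action: dict) -> None:
    """Placeholder for the real side effect, e.g. running the refined search."""
    EXECUTED.append(action)


def run_agent_step(action: dict, confirm) -> bool:
    """Human-in-the-loop: the agent proposes a prepared action, but it only
    runs after a human confirms it. Returns True if the action was executed."""
    if confirm(action):
        execute(action)
        return True
    return False
```

This keeps the agent in a purely preparatory role: it can pre-fill fields and refine parameters, but the final, effectful step is always gated by a human decision.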
Special Capabilities
The tools mentioned are enhanced by additional features and functionalities:
- Authentication: Highly secure SSO-based two-factor authentication for all employees, including those working outside virtual desktops.
- Token usage dashboards: Easy monitoring of system utilization and cost-benefit analyses for various use cases.
- Architecture compatibility: The implementation was carried out within grenke’s existing infrastructure. The well-structured architecture enables seamless integration of ONTEC AI’s solutions.
Approach: Sprints, Check-ins, and Collaboration on Equal Terms
At grenke’s request, the work was organized into two-week sprints. Additionally, there were multiple weekly check-ins between ONTEC AI and grenke’s internal team.
From the very beginning, this collaboration was close and intensive – a joint process in which both sides contributed their expertise equally. This fostered a partnership between the two implementation teams, characterized by mutual trust and respect. The collaboration was not only technically demanding but also personally rewarding.
“We at grenke Digital GmbH are very satisfied with ONTEC AI as a partner. All the solutions we have utilized so far fit our requirements perfectly. Above all, we value their flexibility, agility, and strong service-oriented mindset – an absolute necessity in the fluid and fast-paced world of artificial intelligence.
Even though there is geographical distance, it is hardly noticeable thanks to regular communication and a straightforward communication culture. We appreciate the collaborative partnership, where we work together on a solution rather than simply purchasing a product and implementing it without further alignment.”
Dr. Tobias Wüchner, Managing Director of grenke Digital GmbH
Outcome: Scalable and Compliance-Secure AI Solutions
The implementation of the LLM chatbot and AI knowledge management transitioned seamlessly from a multi-phase testing period into full production, currently supporting up to 400 unique daily users.
Supported by a regulatory AI framework and an accompanying change management project, the grenke AI platform is becoming increasingly embedded within the organization. The AI solutions are evolving into constant companions for employees, regardless of their role or position within the company.
These new AI solutions have significantly improved the work experience for employees, and considerable untapped potential remains.
Outlook
With the broad identification of use cases and the defined implementation roadmap, the next use cases are already in the pipeline. grenke’s AI ambitions will continue to expand, building on the established foundations.
“We are very much looking forward to working with ONTEC AI again in the future and implementing additional use cases. The experiences we have gained have shown us that ONTEC AI is a reliable and competent partner who supports us in achieving our goals. We are confident that our collaboration will continue to be successful and that together we will develop many more innovative solutions.”
Dr. Tobias Domnik, Head of Organization Development at grenke AG
Summary and Conclusion
In this project, two AI solutions were developed for grenke to ease the workload of employees in their highly regulated daily operations: an AI assistant for knowledge management and an LLM chatbot that meets the particularly high regulatory requirements of the financial sector.
Initial audit results show that the AI solutions comply with the strict regulatory standards of the financial industry, demonstrating that AI can be implemented securely and in full compliance even in such environments.
The collaboration is set to continue: with the defined implementation roadmap, the next use cases are already in development.