We learn & share

ACA Group Blog

Read more about our thoughts, views, and opinions on various topics, important announcements, useful insights, and advice from our experts.

Featured

8 MAY 2025
Reading time 5 min

In the ever-evolving landscape of data management, investing in platforms and navigating migrations between them is a recurring theme in many data strategies. How can we ensure that these investments remain relevant and can evolve over time, avoiding endless migration projects? The answer lies in embracing 'composability', a key principle for designing robust, future-proof data (mesh) platforms.

Is there a silver bullet we can buy off-the-shelf?

The data-solution market is flooded with vendor tools positioning themselves as the platform for everything, the all-in-one silver bullet. It's important to know that there is no silver bullet. While opting for a single off-the-shelf platform might seem like a quick and easy solution at first, it can lead to problems down the line. These monolithic off-the-shelf platforms often turn out to be too inflexible to support all use cases, not customizable enough, and eventually become outdated. The result is big, complicated migration projects to the next silver-bullet platform, and organizations ending up with multiple all-in-one platforms, causing disruptions in day-to-day operations and hindering overall progress.

Flexibility is key to your data mesh platform architecture

A complete data platform must address numerous aspects: data storage, query engines, security, data access, discovery, observability, governance, developer experience, automation, a marketplace, data quality, and more. Some vendors claim their all-in-one data solution can tackle all of these. Typically, however, such a platform excels in certain aspects but falls short in others. For example, a platform might offer a high-end query engine but lack depth in the data marketplace included in the solution. To future-proof your platform, it must incorporate the best tools for each aspect and evolve as new technologies emerge.
Today's cutting-edge solutions can be outdated tomorrow, so flexibility and evolvability are essential for your data mesh platform architecture.

Embrace composability: Engineer your future

Rather than locking into one single tool, aim to build a platform with composability at its core. Picture a platform where different technologies and tools can be seamlessly integrated, replaced, or evolved, with an integrated and automated self-service experience on top. A platform that is both generic at its core and flexible enough to accommodate the ever-changing landscape of data solutions and requirements. A platform with a long-term return on investment, because it allows you to expand capabilities incrementally and avoid costly, large-scale migrations. Composability enables you to continually adapt your platform's capabilities by adding new technologies under the umbrella of one stable core platform layer.

Two key ingredients of composability

Building blocks: the individual components that make up your platform.
Interoperability: all building blocks must work together seamlessly to create a cohesive system.

An ecosystem of building blocks

When building composable data platforms, the key lies in sourcing the right building blocks. But where do we get them? Traditional monolithic data platforms aim to solve all problems in one package, but this stifles the flexibility that composability demands. Instead, vendors should focus on decomposing these platforms into specialized, cost-effective components that excel at addressing specific challenges. By offering targeted solutions as building blocks, they empower organizations to assemble a data platform tailored to their unique needs. In addition to vendor solutions, open-source data technologies also offer a wealth of building blocks. It should be possible to combine both vendor-specific and open-source tools into a data platform tailored to your needs.
This approach enhances agility, fosters innovation, and allows for continuous evolution by integrating the latest and most relevant technologies.

Standardization as glue between building blocks

To create a truly composable ecosystem, the building blocks must be able to work together; in other words, they must be interoperable. This is where standards come into play, enabling seamless integration between data platform building blocks. Standardization ensures that different tools can operate in harmony, offering a flexible, interoperable platform.

Imagine a standard for data access management that allows seamless integration across various components. It would enable an access management building block to list data products and grant access uniformly. At the same time, it would allow data storage and serving building blocks to integrate their data and permission models, ensuring that any access management solution can be effortlessly composed with them. This creates a flexible ecosystem where data access is consistently managed across different systems.

The discovery of data products in a catalog or marketplace can be greatly enhanced by adopting a standard specification for data products. With such a standard, each data product can be made discoverable in a generic way. When data catalogs or marketplaces adopt the standard, you gain the flexibility to choose and integrate any catalog or marketplace building block into your platform, fostering a more adaptable and interoperable data ecosystem.

A data contract standard allows data products to specify their quality checks, SLOs, and SLAs in a generic format, enabling smooth integration of data quality tools with any data product. It lets you combine the best solutions for ensuring data reliability across different platforms.

Widely accepted standards are key to ensuring interoperability through agreed-upon APIs, SPIs, contracts, and plugin mechanisms. In essence, standards act as the glue that binds a composable data ecosystem.
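As a toy illustration of the two ingredients, building blocks and interoperability, the sketch below defines a hypothetical standard interface for a catalog building block and a stable core platform layer that depends only on that interface. All names (DataProductCatalog, InMemoryCatalog, Platform) are invented for demonstration; they are not part of any real standard or product.

```python
from typing import Protocol


# Hypothetical standard interface (the "glue"): any catalog building block
# that implements it can be composed into the platform.
class DataProductCatalog(Protocol):
    def register(self, product_id: str, description: str) -> None: ...
    def list_products(self) -> list[str]: ...


# One interchangeable building block: a trivial in-memory catalog.
class InMemoryCatalog:
    def __init__(self) -> None:
        self._products: dict[str, str] = {}

    def register(self, product_id: str, description: str) -> None:
        self._products[product_id] = description

    def list_products(self) -> list[str]:
        return sorted(self._products)


# The stable core platform layer depends only on the standard interface,
# so the catalog implementation can be swapped without a migration project.
class Platform:
    def __init__(self, catalog: DataProductCatalog) -> None:
        self.catalog = catalog

    def publish(self, product_id: str, description: str) -> None:
        self.catalog.register(product_id, description)


platform = Platform(InMemoryCatalog())
platform.publish("sales.orders", "Daily order snapshots")
platform.publish("hr.headcount", "Monthly headcount figures")
print(platform.catalog.list_products())  # ['hr.headcount', 'sales.orders']
```

Because the core layer only knows the interface, replacing InMemoryCatalog with, say, a vendor-backed or open-source catalog is a constructor change rather than a migration.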
A strong belief in evolutionary architectures

At ACA Group, we firmly believe in evolutionary architectures and platform engineering, principles that extend seamlessly to data mesh platforms. It's not about locking yourself into a rigid structure, but about creating an ecosystem that can evolve and stay at the forefront of innovation. That's where composability comes in. Do you want a data platform that not only meets your current needs but also paves the way for the challenges and opportunities of tomorrow?

Let's engineer it together

Ready to learn more about composability in data mesh solutions? Contact us now!

Read more

All blog posts

Let's talk!

We'd love to talk to you!

Contact us and we'll get you connected with the expert you deserve!


Ship-IT day 2023
Reading time 7 min
8 MAY 2025

November 30, 2023 marked a highly anticipated day for numerous ACA employees. On Ship-IT Day, nine teams of ACA team members, in some cases supplemented with customer experts, delved into creating inventive solutions for customer challenges or for ACA Group itself. The hackathon proved to be both inspiring and productive, with a deserved winner at the end!

The atmosphere in the ACA office in Hasselt was sizzling right from the early start. Eight of the nine project teams were stationed here. During the coffee cake breakfast, you immediately felt that it was going to be an extraordinary day. There was a palpable sense of excitement among the project team members, as well as a desire to tackle the complex challenges ahead.

9 innovative projects for internal and external challenges 🚀

After breakfast, the eight project teams swarmed to their working habitat for the day. The ninth team competed from the ACA office in Leuven. We list the teams here:

1. Chatbot course integration in customer portal
2. System integration tests in a CI/CD pipeline
3. Onboarding portal/platform including gamification
4. Automatic dubbing, transcription and summary of conversations
5. publiq film offering data import via ML
6. SMOCS, a low-level mock management system
7. Composable data processing architecture
8. Virtual employees
9. Automated invoicing

If you want to know more about the scope of the different project teams, read our first blog article, Ship-IT Day 2023: all projects at a glance.

Sensing the atmosphere in the teams

Right before noon, we wondered how the teams had started and how their work was evolving. And so we went to take a quick look... 👀

1. Chatbot course integration in customer portal

"After a short kick-off meeting with the customer, we divided the tasks and got to work straight away," says Bernd Van Velsen. "The atmosphere is great and at the end of the day, we hope to present a result that will inspire the customer. In the best case, we will soon be able to use AI tools in a real customer project, with the aim of making better use of the customer's wealth of data."

"The Ship-IT Day is an annual tradition that I like to participate in," says Bernd. "Not only because it is great to collaborate with colleagues from other departments, but also because it is super educational."

2. System integration tests in a CI/CD pipeline

"We want to demonstrate that we can perform click tests in the frontend in an existing environment and verify whether everything works together properly," says Stef Noten. "We can currently run the necessary tests locally, so we are well on schedule. The next step is to also make this work in our build pipeline. At the end of the day, we hope to be able to run the tests, either manually or on a schedule, against the latest version of the backend and frontend."

3. Onboarding portal/platform including gamification

The members of this project team all started at ACA fairly recently. And that is exactly what brought them together, because their goal was to develop a platform that makes the onboarding process for new employees more efficient and fun. Dieter Vennekens shared his enthusiasm with us: "We kicked off with a brainstorming session to define the platform's requirements and goals. Subsequently, we reviewed these with the key users to ensure the final product aligns with their expectations. Our aim is to establish the basic structure before lunch, allowing us to focus intensively on development and styling in the afternoon. By the day's end, our objective is to unveil a functional prototype. This project serves as an opportunity to showcase the capabilities of Low-Code."

4. Automatic dubbing, transcription and summary of conversations

Upon entering their meeting room, we found the project team engrossed in their work. Katrien Gistelinck provided a concise explanation of their work: "Our project is essentially divided into two aspects. Firstly, we aim to develop an automatic transcription and summary of a conversation. Concurrently, we are working on the live dubbing of a conversation, although we're uncertain about the feasibility of the latter within the day. It might be a tad ambitious, but we are determined to give it a try." She continued: "This morning, our focus was on defining the user flow and selecting the tools we'll utilize. Currently, multiple tasks are progressing simultaneously, addressing both the UI and backend components."

5. publiq film offering data import via ML

Comprising six publiq employees and three from ACA, this team engaged in an introductory round, followed by a discussion of the project approach at the whiteboard. They then allocated tasks among themselves. Peter Jans mentioned: "Everyone is diligently working on their assigned tasks, and we maintain continuous communication. The atmosphere is positive, and we even took a group photo! Collaborating with the customer on a solution to a specific challenge for an entire day is energizing."

"At the close of the day, our objective is to present a functional demo showcasing the AI and ML (Machine Learning) processing of an email attachment, followed by the upload of the data to the UIT database. The outcome should be accessible on uitinvlaanderen.be." Peter adds optimistically: "We're aiming for the win." That's the spirit, Peter!

6. SMOCS, a low-level mock management system

Upon our arrival, the SMOCS team was deeply engrossed in their discussions, making us hesitant to interrupt. Eventually, they graciously took the time to address our questions, and the atmosphere was undoubtedly positive. "We initiated the process with a brief brainstorming session at the whiteboard. After establishing our priorities, we allocated tasks accordingly. Currently, we are on track with our schedule: the design phase is largely completed, and substantial progress has been made with the API. We conduct a status check every hour, making adjustments as needed," they shared. "By the end of the day, our aim is to showcase an initial version of SMOCS, complete with a dashboard offering a comprehensive overview of the sent requests along with the associated responses, which we can adjust. Additionally, we have high hopes that the customized response will also show up in the end-user application."

7. Composable data processing architecture

This project team aims to establish a basic architecture applicable to similar projects, often centered around data collection and processing. Currently, customers typically start projects from scratch, while many building blocks could be reused via platform engineering and composable data. "Although time flies very quickly, we have already collected a lot of good ideas," says Christopher Scheerlinck. "What do we want to present later? A very complex scheme that no one understands (laughs). No, we aspire to showcase our concepts for realizing a reusable architecture, which we can later pitch to the customer. Given that we can't provide a demo akin to other teams, we've already come to terms with the likelihood of securing second place!"

8. Virtual employees

This team may have been the smallest of them all, but a lot of work had already been done just before noon. "This morning we first had a short meeting with the customer to discuss their expectations," Remco Goyvaerts explains. "We then identified the priority tasks and both of us quickly got to work. The goal is to develop a virtual colleague who can be fed with new information, based on AI and ML. This virtual colleague can help new employees find certain information without having to disturb other employees. I am sure that we will be able to show something beautiful, so at the moment the stress is well under control."

Chatbot technology is becoming more and more popular. Remco sees this Ship-IT project as the ideal opportunity to learn more about applications with long-term memory. "The Ship-IT Day is a fantastic initiative," says Remco. "It's wonderful to have the opportunity to break away from the routine work structure and explore innovative ideas."

9. Automated invoicing

The client involved in this project handles 50,000 invoices annually, in various languages. The objective is to extract accurate information from these invoices, translate it into the appropriate language, and convert it into a format that is easy for the customer to manage. "Although we started quite late, we have already made great progress," notes Bram Meerten. "We can already send the invoice to Azure, which extracts the necessary data reasonably well. Subsequently, we transmit that data to ChatGPT, yielding great results. Our focus now is on visualizing it in a frontend. The next phase involves implementing additional checks and solutions for line information that isn't processed correctly."

Bram expresses enthusiasm for the Ship-IT Day concept: "It's fun to start from scratch in the morning and present a functional solution at the end of the day. While it may not be finished to perfection, it will certainly be a nice prototype."

And the winner is... 🏆

At 5 p.m., the moment had arrived. Each team had the opportunity to showcase their accomplishments in a 5-minute pitch, followed by a voting session where everyone present could choose their favorite. All teams successfully presented a functional prototype addressing their customer's challenges. While the SMOCS team may not have managed to visualize their solution, they introduced additional business ideas with the SMOCintosh and the SMOCS-to-go food concept. However, these ideas fell just short of securing victory. In a thrilling final showdown, the team working on the onboarding platform for ACA came out as the winners!

Under the name NACA (New at ACA), they presented an impressive prototype of the onboarding platform, in which employees gradually build a rocket while progressing through their onboarding journey. Not only was the functionality noteworthy, but the user interface also received high praise. Congratulations to the well-deserving winners! Enjoy your shopping and dinner vouchers. 🤩 See you next year!

Read more
EventSourcing and CQRS
Reading time 1 min
8 MAY 2025

Staying current with the latest trends and best practices is crucial in the rapidly evolving world of software development. Innovative approaches like EventSourcing and CQRS can enable developers to build flexible, scalable, and secure systems. At Domain-Driven Design (DDD) Europe 2022 , Paolo Banfi delivered an enlightening talk on these two techniques. What is EventSourcing? EventSourcing is an innovative approach to data storage that prioritises the historical context of an object. Rather than just capturing the present state of an object, EventSourcing stores all the events that led to that state. Creating a well-designed event model is critical when implementing EventSourcing. The event model defines the events that will be stored and how they will be structured. Careful planning of the event model is crucial because it affects the ease of data analysis. Modifying the event model after implementation can be tough, so it's important to get it right from the beginning. What is CQRS CQRS (Command Query Responsibility Segregation) is a technique that separates read and write operations in a system to improve efficiency and understandability. In a traditional architecture, an application typically interacts with a database using a single interface. However, CQRS separates the read and write operations, each of which is handled by different components. Combining EventSourcing and CQRS One of the advantages of combining EventSourcing and CQRS is that it facilitates change tracking and data auditing. By keeping track of all the events that led to a particular state, it's easier to track changes over time. This can be particularly useful for applications that require auditing or regulation. Moreover, separating read and write operations in this way provides several benefits. Firstly, it optimises the system by reducing contention and improving scalability. Secondly, it simplifies the system by isolating the concerns of each side. 
Finally, it enhances the security of sensitive data by limiting access to the write side of the system. Another significant advantage of implementing CQRS is the elimination of the need to traverse the entire event stream to determine the current state. By separating read and write operations, the read side of the system can maintain dedicated models optimised for querying and retrieving specific data views. As a result, when querying the system for the latest state, there is no longer a requirement to traverse the entire event stream. Instead, the optimised read models can efficiently provide the necessary data, leading to improved performance and reduced latency. When to use EventSourcind and CQRS It's important to note that EventSourcing and CQRS may not be suitable for every project. Implementing EventSourcing and CQRS can require more work upfront compared to traditional approaches. Developers need to invest time in understanding and implementing these approaches effectively. However, for systems that demand high scalability, flexibility or security, EventSourcing and CQRS can provide an excellent solution. Deciding whether to use CQRS or EventSourcing for your application depends on various factors, such as the complexity of your domain model, the scalability requirements, and the need for a comprehensive audit trail of system events. Developers must evaluate the specific needs of their project before deciding whether to use these approaches. CQRS is particularly useful for applications with complex domain models that require different data views for different use cases. By separating the read and write operations into distinct models, you can optimise the read operations for performance and scalability, while still maintaining a single source of truth for the data. Event Sourcing is ideal when you need to maintain a complete and accurate record of all changes to your system over time. 
By capturing every event as it occurs and storing it in an append-only log, you can create an immutable audit trail that can be used for debugging, compliance, and other purposes. Conclusion The combination of EventSourcing and CQRS can provide developers with significant benefits, such as increased flexibility, scalability and security. They offer a fresh approach to software development that can help developers create applications that are more in line with the needs of modern organisations. If you're interested in learning more about EventSourcing and CQRS, there are plenty of excellent resources available online. Conferences and talks like DDD Europe are also excellent opportunities to stay up-to-date on the latest trends and best practices in software development. Make sure not to miss out on these opportunities if you want to stay ahead of the game! The next edition of Domain-Driven Design Europe will take place in Amsterdam from the 5th to the 9th of June 2023. Did you know that ACA Group is one of the proud sponsors of DDD Europe? 
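To make the ideas above concrete, here is a minimal, illustrative sketch (our own, not from the talk) of an event-sourced account combined with a CQRS-style read model; all class and event names are hypothetical:

```python
from dataclasses import dataclass

# --- Write side: state is derived from an append-only event log ---
@dataclass(frozen=True)
class Deposited:
    amount: int

@dataclass(frozen=True)
class Withdrawn:
    amount: int

class Account:
    def __init__(self):
        self.events = []  # the append-only event store

    def deposit(self, amount):
        self.events.append(Deposited(amount))

    def withdraw(self, amount):
        if amount > self.balance():
            raise ValueError("insufficient funds")
        self.events.append(Withdrawn(amount))

    def balance(self):
        # EventSourcing: current state = replay of all past events
        total = 0
        for e in self.events:
            total += e.amount if isinstance(e, Deposited) else -e.amount
        return total

# --- Read side: a dedicated model kept up to date from the same events ---
class BalanceView:
    """Optimised read model: queries need no event replay (CQRS)."""
    def __init__(self):
        self.balance = 0

    def apply(self, event):
        self.balance += event.amount if isinstance(event, Deposited) else -event.amount

account = Account()
view = BalanceView()
account.deposit(100)
view.apply(account.events[-1])
account.withdraw(30)
view.apply(account.events[-1])

print(view.balance)  # 70, served by the read model without replaying the log
```

The write side stays a pure audit trail, while the read side answers queries instantly; in a real system the view would typically be fed asynchronously from the event store.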

time
Reading time 3 min
6 MAY 2025

Nowadays, there is constant talk about the importance of a 'real-time enterprise' that can immediately notice and respond to any event or request. So, what does it mean to be ‘real-time’? Real-time technology is crucial for organizations because real-time decision-making is a competitive differentiator in today's fast-paced world. A real-time application requires the ability to ingest, structure, analyze and act on data in real time. The emphasis lies on providing insights and decisions the moment an event occurs, rather than days or even weeks afterwards. Today’s business systems are mostly capable of delivering the first part of what a real-time application promises: collecting data in real time. The second part, analyzing that data and gaining valuable insights in real time, is a whole other challenge. It is also often confused with the former, diverting attention from what should be the main considerations when planning a real-time application: What are the decisions your business needs to make when receiving data? What slows down your business in making those decisions? How will your business benefit from this ability to make decisions? Enterprises must first be able to answer these questions and make the answers clear to the rest of the business before the successful implementation of a real-time application can be guaranteed. Goals of real-time The sole purpose of a real-time application is to make decisions in real time. As these applications will control a much larger part of an enterprise, close cooperation with humans will offer significant advantages and become a requirement in the future. Software will automate deterministic functions and standardized activities. At the same time, humans will add experience, intuition, and values to: assure the most appropriate actions are taken, intervene when they are not, and take charge when it is not clear enough what to do. By cooperation, we mean interactions that go far beyond text, email or chat systems. 
We are talking about truly sophisticated collaborative relationships in which a software application and a human being communicate and are each aware of the context of what is happening, how a situation changes over time, and what choices or recommendations are likely to produce the best results. The 3 steps to become a real-time enterprise Now that we’ve established what a real-time enterprise is: how do you become one? There are 3 key steps to take into account: Put business needs first: adopt the mindset to create and change both business and operational processes with a real-time-first attitude. For example: allowing certain automatic decisions depending on what data streams are feeding your applications. Speaking of data… get it right! Moving to real-time also requires robust data management that supports both emerging streaming data and traditional data sources for real-time data integration. Look to the edge: going real-time also requires implementing real-time analytics where the data originates. This requires autonomous support to perform analytics closer to the data source without connecting to the cloud, creating more flexible and powerful deployments. With edge computing, organizations can ingest, enrich and analyze data locally, run machine learning models on cleansed datasets and deliver enhanced predictive capabilities. The velocity and volume of data arriving in real time require in-memory stream analytics and complex event processing. This calls for a shift from a traditional 3-tier, database-centric architecture (with presentation, application and data tiers) to a modern event-driven method of application development. Conclusion Although we’ve only scratched the surface, we hope this article has shown you how exciting and valuable real-time applications can be. If you want to learn more or explore ways to implement these technologies into your business, get in touch. 
We would be happy to help you transform into a real-time enterprise!
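As a purely illustrative sketch of the in-memory stream analytics described above (our own example, not a specific product), a monitor can ingest readings as they arrive and trigger a decision the moment a sliding-window average crosses a threshold:

```python
from collections import deque

class SlidingWindowMonitor:
    """Keeps the last `size` readings in memory and reacts at ingestion time."""
    def __init__(self, size, threshold, on_alert):
        self.window = deque(maxlen=size)  # bounded, in-memory window
        self.threshold = threshold
        self.on_alert = on_alert          # the "automatic decision"

    def ingest(self, value):
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        # The decision happens when the event occurs, not in a nightly batch
        if avg > self.threshold:
            self.on_alert(avg)

alerts = []
monitor = SlidingWindowMonitor(size=3, threshold=80, on_alert=alerts.append)
for reading in [70, 75, 85, 95, 99]:
    monitor.ingest(reading)

print(alerts)  # the window averages that exceeded 80, caught immediately
```

Production systems would use a stream-processing engine rather than a hand-rolled loop, but the event-driven shape, react per event instead of per batch, is the same.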

digital transformation
Reading time 3 min
6 MAY 2025

A couple of months ago, Dorien from Marketing asked me to write a blog post about my first experiences at ACA Group. Now the roles are reversed: “Dorien, may I share my experiences?” I could write many pages on my experiences from the past 6 months, but I’m not going to do that. I would rather talk about the topic “Digital Transformation: Buzz or Hot?” Digital transformation During the last few months, I have participated in many events and talked to experts and clients. Digital transformation is possibly the most discussed term of the moment (besides GDPR, of course). One way or another, every organization and every employee is “confronted” with it. Digital transformation (DT): “what does it mean for my organization, when do I have to start, what does it mean for my job and what will it cost?” It’s very clear to me that there are two kinds of people (putting it very black-and-white for a moment). One group considers digital transformation a buzzword. Let’s call them buzzers. Another group has a somewhat anxious view on digital transformation. Let’s call them hotters. 😉 The next technological wave I believe that we – as a community – should indeed evolve to a higher level. Call it the next technological wave. Of course, it’s difficult, almost impossible, to predict what this will mean for the future of an organization. At WebTomorrow, one of the speakers put it very nicely: “The technological revolution accelerates at an ever faster pace. For people and employees, it means that acquiring new skills becomes necessary.” Self-education is and remains of the utmost importance. There are truly some incredible, innovative technologies: chatbots, AR, VR, AI, IoT, … Not thinking about where to start would simply be irresponsible. But as with any evolution, it’s even more important to start at the beginning. 
The (digital) future In my opinion, companies should start thinking as soon as possible about how new technologies can truly add value to their organization and employees. But how do you stay up to date on all new trends and possibilities, and what do they mean for your organization? A partner is indispensable for this. Look for a partner that deals with the digital transformation of business processes on a daily basis, and consider them an extension of your IT department. And of course, involve your own employees as much as possible. Reassure the hotters: digital transformation doesn’t imply immediate job loss. It means eliminating paper and letting employees focus on what they love to do and what they’re good at. The term digital transformation may be a bit buzzy, but emphasize to the buzzers that the next technological wave is a crucial factor that must not be underestimated. Some companies have already done a lot, but that doesn’t mean they are finished with their digital transformation. It’s an ongoing process. And of course, other companies haven’t done anything yet. That doesn’t mean it’s already too late: embrace it, study it and do something with it. Digital transformation costs money, but it’s not a real cost. It’s an investment and, as with any other investment, a positive ROI is expected. So, start as soon as possible. Look for a partner and discuss the possibilities. Focus on the low-hanging fruit and gradually grow to a higher level. Just another opinion This blog post is no science; it’s simply how I look at digital transformation. An opinion like many others, and of course opinions should be shared. I’m very curious about your view on digital transformation. Are you part of the buzzers or the hotters? Do you want to discover what your company needs for its digital transformation? Contact us through the ACA website and we will certainly help you along.

superhero
Reading time 6 min
6 MAY 2025

Artificial intelligence (AI) is typically defined as the ability of a machine to perform cognitive functions we associate with human minds: functions such as perceiving, reasoning, learning and problem solving. In specific cases, AI machines do a far better job at those things than we do. AI is not one technology. It’s a toolbox of different technologies with the potential to outperform or augment human performance, especially in complex repetitive tasks that require connecting vast amounts of data points. Every technology in AI’s toolbox is a building block capable of doing one specific task very well and ad infinitum, without ever complaining. To be able to employ this formidable power, you’ll need two things: a lot of data and a mathematical model that you need to train. Simply put, that mathematical model is a formula that generates output from the data you feed it. But just any data isn’t enough. Only when your data is visible, adequate, complemented with external data and representative of your demographic can you really profit from AI’s huge potential. We’ve written an earlier blog post about this: ‘Is your data ready for Artificial Intelligence?’. No, we’re not helping the robots enslave humanity Okay, so structured data and a mathematical model are all we need to give people superpowers. But… what about creating supervillains? Robot overlords enslaving all of humanity in the not-so-distant future is a concern that’s often portrayed in popular culture. Think for example of 2001: A Space Odyssey’s HAL 9000, Terminator’s Skynet and the androids from I, Robot. And it’s not just popular culture: well-known scientists such as Stephen Hawking have warned us of the dangers that advanced AI might pose. So, that’s it then. We have a few good years left before AI takes over the world and enslaves us all. Right? Well, not really. 
Even though popular culture and science fiction are rife with examples, AI advanced enough to rule the world is still a long way off. Besides, that’s not what we’re focusing on here at ACA. At least, not right now… 😉 Instead of artificial intelligence, which aims to replace humans, we chose to focus on augmented intelligence. Augmented intelligence is the use of technology to supplement and support human intelligence, with humans remaining at the center of the decision-making process. Basically, augmented intelligence allows us to give people superpowers: predicting the future, optimizing processes in ways that weren’t possible before, helping us make decisions and so much more. What use is there for AI then? Artificial intelligence impacts businesses in 4 benefit domains: engaging your customers, e.g. by shortening conversation cycles through chatbots; enabling your employees, e.g. by automating repetitive tasks, allowing them to focus more on creative or difficult-to-automate tasks; transforming your products and services, e.g. by providing added value with new services; optimizing your operations, e.g. by reducing costs through prediction and deep insights. Companies today use AI mainly for the 5 business cases listed below. For clarity’s sake, I’ve added a tangible example for each business case, based on a service most of us are familiar with: Google Maps. Predicting: anticipating events and their outcomes. For example: Google Maps will predict how long your commute to work will be for your chosen means of transport. Through a predictive model, we were able to reduce the inventory carrying cost of one of our clients by almost 75%; you can read more about how exactly in our earlier blog post. Automating: handling tasks without human intervention. For example: Google Maps asking you if you want to navigate home as soon as it notices you’re driving away from work, after it previously figured out which location is work and which one is home. 
Insights: identifying and understanding patterns and trends. In this case, Google Maps provides insights to its users (e.g. notifying them of traffic jams), but also to advertisers: how long people commute, when they work from home, and so on. Personalizing: tailoring content and user experiences to specific users and providing them with recommendations based on their profile. This should sound familiar to anyone using Netflix’s ‘Recommended for you’ or Spotify’s ‘Discover Weekly’ feature. Google Maps does this as well, for example by providing you with tailored points of interest along your route. Prescribing: complex decision-making based on numerous factors. For example: Google Maps will tell you to take the next exit on the highway to reach your destination while avoiding the traffic jam ahead. Why should I care about AI in my business? Remember when we used actual film rolls in our cameras to take pictures? In the early 2000s, Kodak was still a well-known and lucrative company. However, by failing to truly embrace the opportunities that the shift to digital photography brought along, it was forced to file for bankruptcy in 2012 with a debt of 6.8 billion US dollars. It doesn’t matter if you call it digital disruption, digital transformation or digital revolution: the fact is that businesses that don’t go digital now may not survive (Cisco, 2015). And even for companies still in business, investing in digital is now understood as an effort to catch up. Everyone realizes the power of going digital now, but many people have yet to realize the power of artificial intelligence. It’s not that Kodak didn’t do anything with digital photography – they even invented it – but they didn’t capitalize on its potential to change the industry. Just like the digital transformation before it, AI will change industries. 
AI will alter the relationship between businesses and technology, reduce the burden on skilled labor and ease the decision-making processes of management while revolutionizing business models (Observer, 2016). You don’t want to make the same mistake as Kodak when it comes to AI. You have to stop looking at the tech itself and start looking at its impact. You can bet your competitors will. The simulated statistics above show why it’s important to start adopting AI now (see the front-runner breakdown). The sooner you do, the more revenue you’ll get in the long run. Yes, adopting AI will take a lot of resources, but the profits far outweigh the investments. Don’t delay: act today and enjoy a strong competitive advantage tomorrow. If you act tomorrow, your competitive edge will be largely gone, and if you don’t act tomorrow, you’ll end up with a hopeless competitive handicap (see the laggard breakdown). When it comes to AI, it’s do-or-die. But I don’t even know how to start with AI! There are indeed a few obstacles to overcome when incorporating AI into your business: there might not be a clear strategy for AI in your business, it’s difficult to find people with the appropriate skill set for AI work, functional silos still constrain end-to-end solutions (e.g. different departments all working according to their own processes without sharing information), your organization might lack the technological infrastructure to support AI, and there might be a lack of available (i.e. collected and structured) data. This is where we can help you. You need to get on the AI train, but don’t jump on the AI wagon blindly. It’s never nice to start with something and then discover you skipped a few essential steps later on. That’s why we propose a step-by-step approach with 3 essential steps to transform your business into an AI-driven organization. First, you’ll need leadership with the ability to lead an AI transformation from top to bottom. 
You can do this by articulating a vision, setting goals and securing broad buy-in across your organization. Identify which problems you want to solve and which opportunities to pursue. Second, manage your data. Capture, store, structure, label, and govern your data to build the foundation and infrastructure to work with AI technologies. Third, obtain and deploy specialized data science, data engineering, data architecture and data visualization skills by relentlessly training employees or attracting new talent, preferably ahead of the curve. You don’t have to do this alone. We can help you bridge the gap and get started with a workshop in which we demystify AI and hold an interactive session around 1 to 3 specific pains, with the goal of checking whether: you need AI to solve that pain, and your organization is ready for an AI solution. Contact us for personalized advice and let’s get you your superpower.
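To demystify the "formula trained on data" idea from earlier in this article, here is a deliberately tiny, hypothetical example (no specific ACA project or library implied): fitting the formula y = w·x + b to data with gradient descent, so the trained model can generate output for inputs it has never seen:

```python
# Training data: inputs x and the outputs y we want the model to predict.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # hidden rule the model should discover: y = 2x + 1

w, b = 0.0, 0.0              # the "formula" y = w*x + b, initially untrained
lr = 0.01                    # learning rate: how big each training nudge is

for _ in range(5000):        # training = repeatedly nudging w and b
    grad_w = grad_b = 0.0
    for x, y in zip(xs, ys):
        err = (w * x + b) - y        # how wrong the current formula is
        grad_w += 2 * err * x
        grad_b += 2 * err
    w -= lr * grad_w / len(xs)       # move w and b to reduce the error
    b -= lr * grad_b / len(xs)

print(round(w, 2), round(b, 2))      # ≈ 2.0 and 1.0: the model learned the rule
prediction = w * 5 + b               # the trained formula generating new output
```

Real models have millions of parameters instead of two, but the loop is the same: feed data in, measure the error, adjust the formula, repeat.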

ship it 2024
Reading time 6 min
6 MAY 2025

Friday, June 7th, was a day many ACA members had marked in red on their calendars. It was Ship-IT Day, the annual hackathon where multidisciplinary teams work on innovative solutions for customer challenges. The day was filled with excitement, culminating in impressive results and a well-deserved winner. Read the full report below. Ship-IT Day started early this year. As participants arrived at the ACA offices in Hasselt, they were greeted by the aroma of fresh coffee and pastries. Conversations in the kitchen buzzed with excitement and determination. Everyone seemed ready to tackle the day's challenges. AI as the Common Thread in Ship-IT Projects After a refreshing breakfast, the eight project teams got to work. As with the previous edition, AI and LLMs (Large Language Models) were central to most projects. Here’s a brief overview of the eight project teams: Energie.be - Support the customer support DEMAZE your legacy Oracle Fluvius - “Stroomlijn” insights: FAQ optimizer Fluvius - Flow in “De Stroomlijn” Umani Group Flexer - CV matching S-Lim - AI-powered visual notifications for local government Digitalizing of task cards and E2E cleaning service journey in a B2B context YouGO soccer app - AI gamification For more details on the scope of these projects, check out our first blog post: Ship-IT Day 2024: Discover the 8 Innovative Projects. Checking In on Team Status (and Stress Levels) By midday, the office was unusually quiet despite the activity level of all project teams. Was this a good or bad sign? We decided to check in on each team's progress and stress levels. 1. Energie.be - Support the customer support This project team aims to provide extra support to Energie.be's customer support team by centralizing customer data from various systems into a convenient widget in Freshdesk. "After a brief kick-off meeting with the client to clarify all needs and expectations, we first conducted data mapping to identify the necessary data formats", explains Michiel Sioen. 
"This allowed us to start building the frontend. We already have several standalone elements, and we should be able to present a working widget before the end of the day." "Ship-IT is fantastic because it allows you to explore new things with colleagues you don't usually work with", says Michiel. "It's exciting to build a proof of concept in one day that can provide immediate value to a client." 2. DEMAZE your legacy Oracle This project team aims to create an LLM-based code assistant to help developers navigate legacy codebases more easily. "We want to be able to ask AI questions about an existing codebase so we can understand its structure faster and make changes or additions more quickly", explains Pieter Vandeperre. "We started by testing different AI models to understand codebases. The feedback we received was quite good. We also verified the results with the original developer of the codebase, who confirmed their accuracy. So, we're optimistic." Pieter participates in Ship-IT every year with great enthusiasm. "I see it as a hands-on training to discover new technologies and methods", he says. 3. Fluvius - “Stroomlijn” Insights: FAQ Optimizer This project team aims to use AI to analyze incoming customer questions and interactions at Fluvius, allowing them to more accurately and quickly detect the top 10 current customer queries and automatically generate FAQ articles. "We first solidified the concept internally and then validated it with two Fluvius representatives present at the ACA office in Ghent", says Jo Corthals. "We're in a good position now. The data anonymization and structuring are complete, and the frontend is ready to receive the data. Now, we're focusing on data processing." For Jo, who usually prefers a background role, this Ship-IT Day is a step out of his comfort zone. "It's also a perfect opportunity to provide added value for our client", Jo adds. 4. Fluvius - Flow in “De Stroomlijn” The second project for Fluvius is also progressing smoothly. 
This team focuses on generating summaries of past customer interactions at “De Stroomlijn”, providing helpdesk staff with quick insights into customer context and sensitivities. "Everything is going according to plan, and the client is pleased with our progress", says Jelle Cayman. "We'll be presenting a concept, part of which we've already developed." 5. Umani Group Flexer - CV Matching This project team aims to build an AI assistant for Umani Group to automatically match candidates with job vacancies. "Additionally, we want to incorporate OCR functionality to process textual information from scanned documents or images", says Alexander Frimout. "We're also developing a chatbot that provides candidates with targeted information about specific vacancies and directs them to jobs that match their profiles." The six-member team is also supported today by a representative from Umani Group. "That is very productive", Alexander notes, "as it ensures we have first-hand information to tailor our solution perfectly to the client's needs." Together with his five teammates, Alexander is confident they will deliver an impressive final product: "We're going to blow everyone away!" 6. S-Lim - AI-powered visual notifications for local government S-Lim is collaborating with ACA to develop a new app for cities and municipalities. This project team aims to create a proof of concept for a smart reporting feature that uses AI to analyze a photo, categorize the issue automatically, and forward it to the municipality. "We started by setting up the project and discussing the designs and user interface", says Jeffrey Vanelderen. "The design is now finalized, and the camera functionality and permissions are in place. Currently, we're working on interfacing with the AI model to see what information it returns and optimizing the results. That's the toughest part, but once we succeed, we'll deliver a great proof of concept." 
Jeffrey enjoys the opportunity to see a new project through from start to finish during Ship-IT. "As a mobile developer, you're usually brought into an ongoing project. It's nice to start from scratch for once; you learn a lot from that." 7. Digitalizing of task cards and E2E cleaning service journey in a B2B context This project team aims to streamline a cleaning company's operations by digitizing the current time-consuming and error-prone briefing process, which involves a lot of paperwork, to increase efficiency. "The idea is for cleaning staff to receive targeted cleaning instructions on their smartphones based on their location", explains Stijn Schutyser. "We started by mapping out the current process and identifying areas for digital optimization", Stijn continues. "We've made significant progress. There’s still work to be done on the UI and UX because we want it to look attractive and be user-friendly." Last year, Stijn's team won Ship-IT Day, and he hopes for the same outcome this year. 8. YouGO soccer app - AI gamification The Mobile Flutter team at ACA sent some of its members to create a proof of concept for YouGO soccer's training app. “We want to make the existing exercises more dynamic and interactive”, says Stijn Huygh. “We started by outlining and distributing the work packages. Some parts are on schedule, while others are progressing slower than expected. The biggest challenge is the accuracy of ball detection. By the end of the day, we hope to convert one exercise into a more dynamic version.” By participating in Ship-IT Day 2024, Stijn hopes to gain first-hand experience in how machine learning can enhance user experience and enable new features in mobile development. Who won Ship-IT Day 2024? At exactly 5:00 PM, Stijn Van den Enden kicked off the closing ceremony. Each project team had five minutes to pitch their work and demonstrate the final result. 
Despite some teams needing a couple of attempts to get their demos working, the final results were impressive. It’s remarkable how much the multidisciplinary ACA teams can achieve in just one day. However, there can only be one winner, determined by a quick vote using a mobile voting app. Ultimately, the S-Lim project team took home the victory along with well-deserved shopping and dining vouchers. Their functional demo, which offered high potential value for the client, earned them a whopping 40% of the votes. A score any political party would envy with election Sunday approaching! See you next year for another edition of Ship-IT Day!

chat gpt
LangChain: A revolution in Conversational AI
Reading time 5 min
6 MAY 2025

The world of chatbots and Large Language Models (LLMs) has recently undergone a spectacular evolution. ChatGPT, developed by OpenAI, is one of the most notable examples: it reached over 1,000,000 users in just five days. This rise underlines the growing interest in conversational AI and the unprecedented possibilities that LLMs offer. LLMs and ChatGPT: A Short Introduction Large Language Models (LLMs) and chatbots are concepts that have become indispensable in the world of artificial intelligence. They represent the future of human-computer interaction: LLMs are powerful AI models that understand and generate natural language, while chatbots are programs that can simulate human conversations and perform tasks based on textual input. ChatGPT, one of the most notable chatbots, has gained immense popularity in a short period of time. LangChain: the Bridge to LLM-Based Applications LangChain is one of the frameworks that enable developers to leverage the power of LLMs when developing and supporting applications. This open-source library, initiated by Harrison Chase, offers a generic way to address different LLMs and extend them with new data and functionalities. Currently available in Python and TypeScript/JavaScript, LangChain is designed to easily create connections between different LLMs and data environments. LangChain Core Concepts To fully understand LangChain, we need to explore some core concepts: Chains: LangChain is built on the concept of a chain. A chain is simply a generic sequence of modular components. These chains can be put together for specific use cases by selecting the right components. LLMChain: The most common type of chain within LangChain is the LLMChain. This consists of a PromptTemplate, a Model (which can be an LLM or a chat model) and an optional OutputParser. A PromptTemplate is a template used to generate a prompt for the LLM. 
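Conceptually, a PromptTemplate is a parameterised string. The plain-Python stand-in below illustrates the idea without depending on LangChain itself (whose import paths have shifted between versions); the class and prompt text are our own invention:

```python
class SimplePromptTemplate:
    """Illustrative stand-in for LangChain's PromptTemplate concept."""
    def __init__(self, template, input_variables):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs):
        # Refuse to build a prompt when a declared variable is missing
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise ValueError(f"missing variables: {missing}")
        return self.template.format(**kwargs)

template = SimplePromptTemplate(
    template="Write a short, enthusiastic introduction about {topic}.",
    input_variables=["topic"],
)
prompt = template.format(topic="LangChain")
print(prompt)  # the completed prompt that would be sent to the model
```

LangChain's real class works the same way, plus extras such as validation and serialisation.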
Such a template lets the user fill in one or more variables (for example a topic), after which the completed prompt is sent as input to the model. LangChain also offers ready-made PromptTemplates, such as Zero Shot, One Shot and Few Shot prompts. Model and OutputParser: A model is the implementation of an LLM itself. LangChain has several implementations for LLM models, including OpenAI, GPT4All, and HuggingFace. It is also possible to add an OutputParser to process the output of the LLM. For example, a ListOutputParser is available to convert the output of the LLM into a list in the current programming language. Data Connectivity in LangChain To give the LLMChain access to specific data, such as internal data or customer information, LangChain uses several concepts: Document Loaders Document Loaders allow LangChain to retrieve data from various sources, such as CSV files and URLs. Text Splitter This tool splits documents into smaller pieces to make them easier to process by LLMs, taking into account limitations such as token limits. Embeddings LangChain offers several integrations for converting textual data into numerical data, making it easier to compare and process. The popular OpenAI Embeddings is an example of this. VectorStores This is where the embedded textual data is stored as vectors. FAISS (from Meta) and ChromaDB are popular examples. Retrievers Retrievers make the connection between the LLM and the data in VectorStores. They retrieve relevant data and expand the prompt with the necessary context, allowing context-aware questions and assignments. Demo Application To illustrate the power of LangChain, we can create a demo application that follows these steps: Retrieve data based on a URL. Split the data into manageable blocks. Store the data in a vector database. 
Grant an LLM access to the vector database. Create a Streamlit application that gives users access to the LLM. Let's walk through these steps: 1. Retrieve Data Fortunately, retrieving data from a website with LangChain does not require any manual work; a Document Loader handles it for us. 2. Split Data The loaded documents contain a lot of information, sometimes too much for the LLM to work with, as many LLMs accept only a limited number of tokens. Therefore, we need to split the documents into smaller pieces. 3. Store Data Now that the data has been broken down into smaller contextual fragments, we store it in a vector database to give the LLM efficient access to it. In this example we use Chroma. 4. Grant Access Now that the data is stored, we can build a "Chain" in LangChain: a series of LLM executions that together achieve the desired outcome. For this example we use the existing RetrievalQA chain that LangChain offers. This chain retrieves relevant contextual fragments from the newly built database, processes them together with the question in an LLM, and delivers the desired answer. 5. Create Streamlit Application Now that we've given the LLM access to the data, we need to provide a way for the user to consult it. To do this efficiently, we use Streamlit. Agents and Tools In addition to the standard chains, LangChain also offers the option to create Agents for more advanced applications. Agents have access to various tools that perform specific functionalities, ranging from a Google Search tool to Wolfram Alpha, a tool for solving complex mathematical problems. This allows Agents to support more advanced reasoning: they decide for themselves which tool to use to answer a question. Alternatives to LangChain Although LangChain is a powerful framework for building LLM-driven applications, there are alternatives available.
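What a RetrievalQA-style chain does under the hood can be sketched in a few lines: fetch the chunks most relevant to the question, paste them into a context-aware prompt, and hand that prompt to the model. The `retrieve` function, the `echo_llm` lambda and the template text below are illustrative stand-ins, not LangChain code; a real application would plug in a vector-store retriever and an actual LLM.

```python
# Sketch of a retrieval-QA step: retrieve context, build the prompt, call the model.

QA_TEMPLATE = (
    "Answer the question using only the context below.\n"
    "Context:\n{context}\n\n"
    "Question: {question}\n"
    "Answer:"
)

def retrieve(question, documents, k=2):
    """Stand-in retriever: rank documents by shared words with the question."""
    words = set(question.lower().split())
    return sorted(documents, key=lambda d: -len(words & set(d.lower().split())))[:k]

def retrieval_qa(question, documents, llm):
    # Expand the prompt with retrieved context, then ask the model.
    context = "\n".join(retrieve(question, documents))
    return llm(QA_TEMPLATE.format(context=context, question=question))

docs = [
    "LangChain was initiated by Harrison Chase.",
    "Streamlit turns Python scripts into web apps.",
]
echo_llm = lambda prompt: prompt  # stand-in LLM that simply returns its prompt
answer = retrieval_qa("who initiated langchain", docs, echo_llm)
print(answer)
```

The context-aware prompt produced here is the essence of the pattern: the question alone goes to the retriever, but the question plus retrieved fragments go to the LLM.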
For example, a popular tool is LlamaIndex (formerly known as GPT Index), which focuses on connecting LLMs with external data. LangChain, on the other hand, offers a more complete framework for building applications with LLMs, including tools and plugins. Conclusion LangChain is an exciting framework that opens the doors to a new world of conversational AI and application development with Large Language Models. With the ability to connect LLMs to various data sources and the flexibility to build complex applications, LangChain promises to become an essential tool for developers and businesses looking to take advantage of the power of LLMs. The future of conversational AI is looking bright, and LangChain plays a crucial role in this evolution.

Read more
ship it day 2022 aca group
Reading time 7 min
6 MAY 2025

Every year ACA organizes a Ship-IT Day. Teams, composed on the basis of skills and interests, try to solve customers' (potential) problems in a single day. They work out innovative ideas and explore new technology. Together with those customers, the Ship-IT teams validated various innovative ideas and developed them into Proofs of Concept. You can discover them all below: 1. Spot the free charging spot Anyone who has been following ACA for a while knows that we are fully engaged in greening our vehicle fleet. In order to continue to grow, we must provide sufficient charging options. In addition, the 'facilities fleet' manager wants to map the use of charging stations and parking spaces in order to consider further optimizations and investments. For example, how many charging stations are not in use at a given moment, even though parking is taking place? Together with Mobility+, our partner and supplier of charging stations, a few ACA employees built a dashboard using Azure that optimizes the use of parking spaces and justifies potential investments with real-time data. The dashboard offers an overview of the status (free or occupied) of the charging stations in the underground car park of our office, in combination with the active charging user. This allows the 'facilities fleet' manager to see who is parked in the relevant parking spot and to consult other important data and metrics. Employees with an EV can use an application to see where parking is still possible. 2. Widgets Home Automation The second project is part of an existing case for a customer who specializes in windows and window extensions. We had already developed an application for this customer that offers a lot of interesting functionality, for example the possibility to consult the air quality at home. To create an even better user experience, the team delved into widgets during Ship-IT Day. At the moment, users have to open the application every time they want to check the air quality.
Since this can be a bit cumbersome at times, widgets now make it possible to send an alert without disrupting the daily flow and without requiring users to open the application first. In other words, the team wanted to create an experience where the information automatically reaches the user. The application itself is written in Xamarin.Forms. While this turned out to be quite simple for Android, for iOS a native widget extension had to be developed in Swift and SwiftUI. Finally, the option was also added to ask Google Assistant about the air quality in the home. 3. UGent: Keyword matching Ghent University has developed a mechanism that connects researchers and project proposals. Each researcher has their own bibliography from which the tool can extract information about expertise and research topics. Keywords are also extracted from the project proposals. Based on this, the Keyword Matcher makes a list of researchers and project proposals that match. Users can therefore quickly see which projects are of interest to researchers at Ghent University. The Keyword Matcher was built by the university itself, but the tool could use some improvement, especially in terms of UX and UI. During a workshop, our Ship-IT Day team first looked for the users' pain points. For example, it was cumbersome to select a researcher, and it was not possible to share one specific result. For the selection of a researcher, the UI team developed a live search input field that can search on multiple fields, such as first name and surname, but also a unique ID per researcher. In addition, users now have the option to export the complete list of results or a specific result, or to share it directly via email with, for example, one or more researchers. 4. MyValipac - Micro Frontends The fourth team wanted to transform various applications of our customer Valipac into one well-organized whole.
To do this, they suggested building a new platform that could act as a kind of ecosystem. The focus was on the use of Micro Frontends, with which the different business domains can be developed independently of each other. During the Ship-IT Day, the team wanted to guide the customer to the start of such a platform and also map the use and benefits of Micro Frontends for themselves. The project was a great success. For example, the team found that the "assembly" of Micro Frontends goes very smoothly if you apply the Module Federation principles correctly. The end result is a POC of a platform that consists of: a login based on different user rights, a platform landing page, a maintenance page, a task list page (an internal module called in the platform), and a link with legacy providers (external modules called in the platform). For the end user, the platform has one UX and look and feel; in other words, it feels like a single entity, even though the frontend actually consists of several separate micro frontends and several separate services are called in the backend. 5. NFT ticketing system with fan tokens for clubs, tournaments and festivals This team developed an NFT ticketing system with a virtual currency (fan token). NFTs are non-fungible tokens. The idea for Ship-IT Day was to develop a platform where fans can buy tickets online and receive fan tokens in exchange. The fan tokens are redeemable on the platform and can be used by supporters to purchase goods or services, such as: Merchandising Voting (participation in e.g. T-shirt design, music, etc.) Live meet-ups, autographed gadgets or VIP tickets ... In addition, the fan tokens can be used in the Metaverse to shop in your favorite fan store through AR. 6. The Tech Radar Tech Radar is an online visualization tool that shows which technology choices are and are not available in an organization or team. Based on this, you can determine whether the necessary knowledge is already available in-house.
And if so, where that knowledge resides and how you can make full use of it in projects. The first version of the Tech Radar was developed earlier this year by a colleague who did an internship at ACA. Although the application already contained many functionalities, we wanted to further optimize the tool and roll it out throughout ACA during the Ship-IT Day. The Tech Radar not only shows what knowledge and expertise we have in-house, but also offers a visual representation of the evolution of a certain technology over time, for example how often a tool is used within the organization. In addition to the classic visualization, there is also one with "quadrants" available. This not only shows whether a tool has been used, but also how much it is used within the organization: the more popular a technology, the bigger its circle. The last visualization shows a top 5 of the technologies that are most often worked with or of which the most knowledge is present within the organization. 7. The Mobility+ charge card in your mobile wallet Every employee with an EV receives a charge card. This makes it possible to charge the car in various places, such as the underground car parks of our offices. The physical charge card is not always handy, especially if you suddenly notice that you have forgotten it. The charge card is also sometimes a bit cumbersome for our partner Mobility+: it costs money to produce, it has to be sent by post, and so on. As a solution, this team came up with the idea of putting the existing charge card in a mobile wallet in the existing Mobility+ app. Every time you log in to the app, you arrive at a general screen with various assets, including the digital charge card. The mobile solution is not only convenient for daily use; new users no longer have to wait for their plastic card during onboarding, and it is even possible to start a charging session offline. 8.
Chatbot integration for our customer This project team started working for one of our customers, who remains anonymous. The scope of this last team's project was to make the use of data and interactions easier through a conversational interface. At the beginning of the hackathon, it became clear that this could not only offer an advantage on the website itself, but also provide onboarding flows on other media such as Facebook or WhatsApp, or even let partners handle onboarding while maintaining internal control. Although no one had experience with it, the team members decided to get started with Power Virtual Agents. The big advantage of this tool is the possibility for several people to work on the same chatbot. One of the team members is a project manager who, despite a lack of technical knowledge, was able to build a lot himself thanks to the low-code capabilities of the tool. It eventually resulted in a working chatbot that is able to collect the necessary information, perform data validation (both locally and on the server), and fully handle the effective registration. The intention is certainly to develop this story further, possibly with different technology than was used during the hackathon. And the winner is… Project 1, 'Spot the free charging spot'! The winning team was rewarded with dinner and a Coolblue voucher. Many of our employees voted for this project because of its innovative nature and added value for both ACA itself and our customers.

Read more
ship it 2024 people
Reading time 5 min
6 MAY 2025

On Friday 7 June, ACA Group will once again organize the annual Ship-IT Day, a hackathon where various ACA teams work on innovative ideas for and with the customer. This seventh edition promises to be another day full of creativity and collaboration. This year, eight project teams are competing for the coveted title of winner of Ship-IT Day 2024. Not surprisingly, AI and LLMs are the common thread running through most projects this year. You will discover them all in this blog. What is Ship-IT Day? Ship-IT Day is all about collaboration and innovation. On this day, multidisciplinary ACA teams use their knowledge and expertise to come up with innovative solutions for internal or external challenges. The goal is to present a first proof of concept (POC) by the end of the day, after which a winner is chosen. Why Ship-IT Day? Ship-IT Day gives ACA team members the chance to work on innovative ideas that could potentially grow into concrete solutions. It is a unique opportunity to build knowledge and explore new possibilities, away from daily projects. This stimulates innovation within the company and gives creative ideas the space to flourish. The 8 projects of Ship-IT Day 2024 🚀 1. Energie.be - Support the Customer Support The customer support team at Energie.be receives tickets through Freshdesk, but they don't have direct access to important customer information like previous support tickets or recent bills. This project aims to solve that problem by building an application that gathers customer information from various data sources. The project team plans to develop an app that integrates with Freshdesk and provides a comprehensive overview of all relevant customer information. In the future, a Large Language Model (LLM) could be used to summarize this information and offer more specific insights based on the nature of the query. 💼 Customer: Energie.be 2.
DEMAZE - Your Legacy Oracle Navigating an existing codebase can be challenging due to outdated or missing documentation. This project aims to create a code assistant that provides targeted guidance on architecture and starting points using the latest generations of Large Language Models (LLMs). This will make it easier to find your way through legacy codebases. 💼 Customer: Confidential 3. Flow in "De Stroomlijn" Fluvius' customer contact center, De Stroomlijn (The Streamline), often finds that customers have to repeatedly explain their problems when being transferred between different representatives. Although the CRM system documents all interactions, it is often disorganized, and helpdesk staff do not have the time to review all cases for each customer. This is frustrating for both customers and employees. This project aims to solve this issue for Fluvius by generating summaries of previous interactions, including an indication of customer satisfaction. This way, helpdesk staff can quickly understand the context and any sensitivities, allowing them to assist more efficiently and effectively. The ultimate goal is to reduce call duration and increase customer satisfaction. 💼 Customer: Fluvius 4. Stroomlijn Insights: FAQ Optimizer Fluvius lacks a clear view of the top 10 current customer questions they receive. Identifying the most important questions is often based on intuition or time-consuming manual work. This project aims to automatically analyze incoming questions and interactions to more accurately and quickly detect what customers are asking. By using AI, Fluvius can gain quicker insights into current issues and the impact of recent events. The ultimate goal is to automatically detect the most pressing customer questions and generate corresponding FAQ articles to improve customer satisfaction and efficiency. 💼 Customer: Fluvius 5. Umani CV Matching Umani Group, an HR consultancy firm, spends a lot of time manually matching CVs with job postings.
This project aims to automate this process using LLMs (Large Language Models). A demo environment has already been built based on OpenAI, and the goal is to enhance and expand it. Additionally, the project team will explore whether OCR technology can assist in reading and interpreting handwritten CVs accurately. Introducing a chatbot for candidates could simplify the process and make it more customer-centric. Furthermore, the project will focus on various UX aspects, such as improving the flow, visuals, and information presentation. 💼 Customer: Umani Group Flexer 6. AI-powered visual notifications for local government S-Lim brings together the municipalities of Limburg to collaborate and transform the region into a smart region. Citizens can report issues such as road damage, waste, or other concerns through the websites of cities and municipalities. However, filling out these reports requires many steps and lots of information, which can be discouraging. This project aims to streamline this process by simplifying the reporting system and making it more user-friendly. Specifically, the project team aims to develop a feature allowing citizens to easily upload photos of issues. These photos will be analyzed by AI to simplify the reporting forms and swiftly address the problems. The technical focus will be on image recognition, AI, and integrating with back-office systems like GreenValley, TopDesk, and 3P. 💼 Customer: s-Lim 7. Digitization of task cards and E2E cleaning service journey in a B2B context In a secure industrial environment, over 300 buildings need to be cleaned by approximately 75 employees, each with specific cleaning requirements and restrictions. The current process is error-prone and cumbersome, involving manual updates and communication via email, as well as physical prints of floor plans with a lot of information manually added. The client aims to increase efficiency, reduce costs, and minimize paper usage.
This project team wants to work on a first concept that digitizes and simplifies the current process. 💼 Customer: Confidential 8. YouGO Soccer App - AI Gamification You Go Soccer has a Flutter application for soccer training and wants to expand it with additional features such as real-time video analysis and gamification to make the app more appealing to users. Specifically, this project team aims to implement Google ML Kit for position detection and an AI vision model like YOLOv8 for real-time ball tracking. Additionally, they intend to develop an algorithm to detect soccer-related actions and gamify the exercises. Also included in the scope of this project are the addition of a point system, tracking reaction times, and selecting different training sessions. 💼 Customer: You GO Soccer App by Thomas Buffel Follow Ship-IT live and be the first to discover the winner! Curious to see which project will win this year? Follow the event live on our social media channels: LinkedIn, X, Instagram and Facebook! 🏆 With Ship-IT Day, ACA Group continues to innovate and improve, always keeping the customer in mind. Which project is your favorite? Let us know!

Read more
A day in the life of a Data Protection Officer
Reading time 4 min
5 MAY 2025

In our last blog post about GDPR, we looked at the state of GDPR 8 months after it went into effect. Today, we'll look at what the job of a Data Protection Officer actually entails. What could a Data Protection Officer (DPO) possibly do besides looking at implementation methods for a European regulation, or answering questions from their customers about the same topic? A day in the life of a DPO, what's that like? Data protection impact assessment A typical day starts at 8:30am in the offices of a customer where back-to-back meetings take up the whole morning. Preparation for these meetings is key. There are professionals in front of you: CFOs, legal counsels, CIOs, CEOs, ICT development and ICT infrastructure managers, GDPR coordinators, … These people know their business, so you'd better come prepared! A recent example of such a morning is with a customer where we need to finalize a data protection impact assessment (DPIA). A DPIA is a way to assess the privacy risks of data processing beforehand. The methodology we use is the CNIL application approach. That day, we discuss the consequences of the 'DPO validation step' which I prepared the day before. The meeting's attendees are the COO, the HR director and myself, and although the DPIA did not produce a 'high or very high risk' for the assessed processing activity, we find that we do need to define certain actions or mitigations for some smaller risks related to flaws we found in the process. Being the Data Protection Officer, I had defined the required actions to mitigate each of the documented risks, and these now need to be discussed, approved and added to the action list with deadlines and responsibilities. Compromise is key... It is worth noting that a DPO only has an advisory function and does not have the mandate to take decisions.
However, if, in this case, the COO or HR director did not agree with one or more of my proposed to-dos and we couldn't agree on an alternative with the same result, the company would need to document and motivate the reason(s) why it didn't follow the DPO's advice. Fortunately, we had a good meeting with a very good discussion on one of the mitigating actions, with an interesting compromise as a result. This is why the discussion is so important: an external Data Protection Officer needs to understand that the company knows its business processes, business risks, business value and commercial proposition far better than the DPO does, so it's essential to listen to the customer. But, and this is a very important but, it doesn't mean that we can bend the rules! In this case, we came up with a valid compromise, but in other cases (with another customer) we haven't, which meant my advice wasn't accepted and the required documented motivation was written. As the meeting came to an end sooner than I expected, I had some time left. The marketing manager took this opportunity to discuss the possible impact of the GDPR on the next marketing campaign that was still under development. The campaign itself was very nice and creative, but since interactivity with the (potential) customer was a key part of it, the GDPR indeed had a certain impact. This meeting took a bit longer than "just a quick question". 😊 ... and so is context! On that particular day I went to our office for the afternoon. When I'm at the office, I mostly prepare customer meetings, review Data Protection Agreements, and prepare policies, presentations, trainings (e.g. privacy by design for IT development) and DSAR (Data Subject Access Request) concepts. Additionally, I answer questions from our customers: "I have been asked to... Can I do this?" "I would like to add this functionality to our website. Does the GDPR have any impact on it?" "We would like to implement an MDM tool.
Is that OK?" "An ex-employee sent a DSAR and would like to receive this specific information. Do we need to give it to them?" Of course, these are just a few examples. In reality, there are many more questions of all types, from different companies with different processes, cultures and policies. The same question may have different answers depending on the situation or company. Knowing the legislation (and this means more than only the GDPR) is a basic requirement, but unfortunately that's not enough. Interpreting it for specific situations and knowing how to explain those interpretations within different types of companies in such a way that people accept them is one of the more challenging aspects. After all, not everybody loves the GDPR… 💔 Data Protection Officer: a varied and challenging job Being a Data Protection Officer is a very interesting and challenging job if you're interested in business processes, data security, lifelong learning, lively discussions and sharing legal views or interpretations. While a lot of the job revolves around the GDPR, it is much more varied than that. I hope I've been able to give you some insight into what a DPO does from day to day!

Read more
Reading time 2 min
6 DEC 2023

Make it concrete for all stakeholders Data Mesh is frequently perceived as highly abstract and theoretical, leaving stakeholders uncertain about its precise implications and potential solutions. Therefore, at ACA Group, we focus on making it as concrete as possible for business stakeholders, technical stakeholders, and other impacted stakeholders in the organization. We recommend simultaneously addressing three key challenges: IDENTIFY BUSINESS VALUE – Define exactly how Data Mesh contributes to business value by considering data as a product. ORGANIZE TEAMS – Specify the role of every team, team member and persona within the context of Data Mesh. BUILD PLATFORM – Show how Data Mesh influences the technical architecture. Challenge 1: Identifying the Data Mesh Business Value One of the first challenges in adopting Data Mesh is to explain and prove its business value. At ACA Group, we start by identifying potential data products, domains, and use cases. This process is grounded in business input and results in a data product landscape: for an e-commerce company, for example, such a diagram shows applications as boxes, data products as hexagons, and owning domains as colors. This landscape serves as a navigation map, inspiring new innovative business ideas and showcasing the value that Data Mesh can bring to the organization. By demonstrating how Data Mesh can enable new possibilities, we clarify its relevance to business stakeholders. Aligning Data Mesh Solutions with Organizational Goals To get the most out of Data Mesh, alignment with the organization's overall goals and strategy is paramount. It's essential to ensure that the investment in technology and process aligns with the broader business objectives. This alignment helps maintain support and momentum, which are crucial for the success of a Data Mesh initiative. Identifying Data Mesh Opportunities through Gamestorming At ACA Group, we apply gamestorming techniques to discover domains and data products.
This process begins with business capabilities and data use cases identified through workshops, such as impact mapping. By aligning Data Mesh with these aspects, we identify a data product landscape from two perspectives: an inventory of available data and potential data products inspires and generates new business ideas, while the desired business impact and goals help to identify the required data and data products. Challenge 2: Organizing Teams and Empowering Individuals Data Mesh is not just about technology; it's about transforming how teams and team members operate within the organization. ACA Group believes in organizing teams effectively to harness the power of Data Mesh. We engage with existing teams and team members, positioning their valuable roles and expertise within a Data Mesh team organization. This typically involves platform teams, domain teams, enabling teams, and a federated governance team. Additionally, we explore the various user journeys and experiences for each persona, ensuring that Data Mesh positively impacts the organization, its people, and their roles. Challenge 3: Building the Technical Architecture as a First-Class Component The technical architecture is a critical aspect of Data Mesh, and ACA Group is committed to making it a tangible reality. We demonstrate how Data Mesh can work in practice by developing a coded, working proof of concept. Leveraging our platform engineering expertise, we bring data products to life, showcasing how Data Mesh can leverage existing data technology while providing a future-proof and flexible architecture tailored to the client's unique context. Conclusion Adopting Data Mesh is a transformative journey for any organization. By breaking down the challenges into actionable steps, as ACA Group does, you can make Data Mesh more tangible, clarify its value, and align it with your organization's goals.
These incremental actions serve to demystify Data Mesh, rendering it comprehensible to a wide array of stakeholders and facilitating well-informed decisions. Embracing Data Mesh represents an embrace of the future of data management, with its potential to unlock myriad possibilities for your organization. This journey is about making Data Mesh a practical reality while aligning it with your organizational objectives. 💡 Curious about what else Data Mesh has to offer you? Discover it here ✅

Read more
Reading time 5 min
30 NOV 2023

On November 30, ACA Group will host their annual Ship-IT Day. During this hackathon, various teams of ACA team members will develop an innovative idea for (and together with) the customer. This year, nine project teams will compete for the eternal glory that awaits at the end of the day. Discover all the projects here. What is Ship-IT Day? Ship-IT Day is in its sixth edition this year 🎉. On this day everything revolves around collaboration and innovation. Various multidisciplinary ACA teams (whether or not supplemented with the right experts from the customer) will use their knowledge and expertise on November 30 to come up with innovative solutions for internal or external challenges. The goal is to present a first proof of concept at the end of the day, with a winner selected based on these presentations. Why? With Ship-IT Day, ACA Group wants to give its employees the opportunity to work on innovative ideas that can form the basis for concrete solutions. It is also a great opportunity to build knowledge and explore new opportunities. The 9 Projects of Ship-IT Day 2023 1. Chatbot course integration in customer portal In the customer portal of one of our clients, employers have access to more than fifty different brochures. However, it is difficult to know where to find specific information. In addition, this customer offers 384 training courses, and finding the right one also requires a lot of searching. With this project, the team of eight experts wants to offer a solution that simplifies the search for documentation and training for employers. They want to do this by expanding the existing AI tool, transforming it into a web component, and integrating it as a chatbot into the existing customer portal. 2. System integration tests in a CI/CD pipeline During the R&D and testing phase of traffic management systems, this company relies heavily on manual testing and actions such as deploys to test environments.
In addition to the extra work, problems are also discovered relatively late, making the cost of solving them higher than necessary. The project team of four experts aims to address these issues by introducing end-to-end testing against an integrated backend and frontend, performed in GitHub Actions. After a successful run, the software can then be automatically deployed to a test environment. The solution will result in shorter feedback loops and an improvement in product quality. 3. Onboarding portal/platform including gamification Every new ACA member undergoes an onboarding process, and the relevant information is currently distributed across different platforms. This project aims to establish a centralized platform that consolidates all onboarding information. The incorporation of gamification elements adds an extra layer of fun for each incoming employee. Additionally, utilizing the platform enhances HR's ability to efficiently manage follow-ups. 4. Automatic dubbing, transcription and summary of conversations The consultants of one of our clients engage in numerous conversations, each requiring a subsequent report, which is a time-consuming task. The project team of seven experts will use speech-to-text technology to automatically create transcriptions of these conversations in real time. Leveraging Large Language Models, the team can also provide live translations for seamless communication with partners who may not speak Dutch or French. Furthermore, the technology facilitates the automatic generation of summary reports after each conversation. 5. publiq film offering data import via ML publiq manages the communication of public leisure offerings, frequently receiving numerous emails weekly regarding film offerings. Currently, this information is manually entered into the UiT database (https://www.uitdatabank.be), a time-consuming task that occasionally leads to duplicate publications.
To address this, a project team of four ACA employees and seven publiq experts aims to improve the data import process for film offerings into the UiT database. By integrating machine learning, this initiative seeks to significantly improve efficiency and make information available more quickly on www.UiTinVlaanderen.be.

6. SMOCS, low-level mock management system

The Woningpas, an online tool from the Flemish government, relies on many integrations with external parties. These parties provide a Swagger file describing their endpoints while they develop their API in the meantime. In parallel, ACA develops a mock application that mocks the API endpoints with specific cases provided by the customer. This involves a lot of development work. Moreover, changes often occur that require the mock application to be reworked, and the customer regularly adds additional edge cases that need to be tested ad hoc. The project team aims to create a user-friendly tool that lets the entire team add mock data, empowering the customer to independently test and add edge cases. The objective is to let developers concentrate more on actual code development.

7. Composable data processing architecture

For a government service that collects a lot of data from external parties through largely similar processes, this project team wants to establish a reference architecture that can be used for the development of new applications. That way, you do not have to start from scratch every time, reducing development and maintenance costs. The focus is on composability and platform thinking within the context of data ingestion. The developers of an application must be able to choose which building blocks they bring together to build their application.
Setting up a new application should be relatively simple; ideally, the application can limit itself to defining the application-specific business rules.

8. Virtual employees

This project team wants to expand the customer's team with virtual employees based on AI and chatbot technology. These extra virtual colleagues can help train new employees and make internal knowledge accessible. The idea is to set up several separate bots, each with their own specialization. The bots will be trained gradually.

9. Automated invoicing

The customer involved is responsible for incident management of trucks and transport throughout Europe, including the coordination of truck repairs. They handle approximately 50,000 invoices from local garages each year, which need to be translated before being forwarded to insurers and suppliers. This process is highly labor-intensive and time-consuming. The project team wants to build an AI solution that largely automates the invoicing process, starting from 32 sample invoices.
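To give a flavor of the mock-management idea behind SMOCS (project 6): the core of such a tool is a small HTTP server that answers API calls with canned responses taken from a shared mock catalogue, so that new cases can be added by editing data instead of code. The sketch below is a minimal, hypothetical illustration of that principle using only the Python standard library; the paths, response bodies, and port are invented for the example and have nothing to do with the actual SMOCS implementation.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mock catalogue: request path -> canned JSON response.
# In a tool like SMOCS this would live in a shared, editable store
# (e.g. a JSON file) so the whole team can add cases without coding.
MOCKS = {
    "/api/cases/1": {"status": "open", "owner": "ACA"},
    "/api/cases/2": {"status": "closed", "owner": "publiq"},
}

class MockHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = MOCKS.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Silence per-request logging for this example.
        pass

def serve(port=8765):
    """Start the mock server on a background thread and return it."""
    server = HTTPServer(("127.0.0.1", port), MockHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve()
    with urllib.request.urlopen("http://127.0.0.1:8765/api/cases/1") as resp:
        print(json.loads(resp.read())["status"])  # prints "open"
    server.shutdown()
```

Adding a new edge case then means adding one entry to the catalogue, which is exactly the kind of change a non-developer on the customer side could make independently.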
