We learn & share

ACA Group Blog

Read more about our thoughts, views, and opinions on various topics, important announcements, useful insights, and advice from our experts.

Featured

8 MAY 2025
Reading time 5 min

In the ever-evolving landscape of data management, investing in platforms and navigating migrations between them is a recurring theme in many data strategies. How can we ensure that these investments remain relevant and can evolve over time, avoiding endless migration projects? The answer lies in embracing 'composability', a key principle for designing robust, future-proof data (mesh) platforms.

Is there a silver bullet we can buy off-the-shelf?

The data-solution market is flooded with vendor tools positioning themselves as the platform for everything, the all-in-one silver bullet. It's important to know that there is no silver bullet. While opting for a single off-the-shelf platform might seem like a quick and easy solution at first, it can lead to problems down the line. These monolithic off-the-shelf platforms often turn out to be too inflexible to support all use cases, not customizable enough, and eventually become outdated. This results in big, complicated migration projects to the next silver-bullet platform, and organizations ending up with multiple all-in-one platforms, causing disruptions in day-to-day operations and hindering overall progress.

Flexibility is key to your data mesh platform architecture

A complete data platform must address numerous aspects: data storage, query engines, security, data access, discovery, observability, governance, developer experience, automation, a marketplace, data quality, and more. Some vendors claim their all-in-one data solution can tackle all of these. Typically, however, such a platform excels in certain aspects but falls short in others. For example, a platform might offer a high-end query engine but lack depth in the data marketplace included in the solution. To future-proof your platform, it must incorporate the best tools for each aspect and evolve as new technologies emerge.
Today's cutting-edge solutions can be outdated tomorrow, so flexibility and evolvability are essential for your data mesh platform architecture.

Embrace composability: engineer your future

Rather than locking into one single tool, aim to build a platform with composability at its core. Picture a platform where different technologies and tools can be seamlessly integrated, replaced, or evolved, with an integrated and automated self-service experience on top. A platform that is both generic at its core and flexible enough to accommodate the ever-changing landscape of data solutions and requirements. A platform with a long-term return on investment, because it allows you to expand capabilities incrementally and avoid costly, large-scale migrations. Composability enables you to continually adapt your platform's capabilities by adding new technologies under the umbrella of one stable core platform layer.

Two key ingredients of composability

- Building blocks: the individual components that make up your platform.
- Interoperability: all building blocks must work together seamlessly to create a cohesive system.

An ecosystem of building blocks

When building composable data platforms, the key lies in sourcing the right building blocks. But where do we get these? Traditional monolithic data platforms aim to solve all problems in one package, but this stifles the flexibility that composability demands. Instead, vendors should focus on decomposing these platforms into specialized, cost-effective components that excel at addressing specific challenges. By offering targeted solutions as building blocks, they empower organizations to assemble a data platform tailored to their unique needs. In addition to vendor solutions, open-source data technologies offer a wealth of building blocks. It should be possible to combine both vendor-specific and open-source tools into a data platform tailored to your needs.
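To make these two ingredients concrete, here is a minimal sketch (in Python, purely for illustration; the interface and class names are hypothetical and no specific product is implied) of building blocks sitting behind a shared interface that a stable core platform composes:

```python
from typing import Protocol


class CatalogBlock(Protocol):
    """Interface every catalog building block must implement (hypothetical)."""

    def register(self, product: str) -> None: ...
    def list_products(self) -> list[str]: ...


class OpenSourceCatalog:
    """One interchangeable implementation of the catalog interface."""

    def __init__(self) -> None:
        self._products: list[str] = []

    def register(self, product: str) -> None:
        self._products.append(product)

    def list_products(self) -> list[str]:
        return sorted(self._products)


class DataPlatform:
    """The stable core layer: building blocks are chosen at composition time."""

    def __init__(self, catalog: CatalogBlock) -> None:
        self.catalog = catalog


# Swapping vendors means passing a different CatalogBlock implementation;
# the core platform code itself does not change.
platform = DataPlatform(catalog=OpenSourceCatalog())
platform.catalog.register("sales-orders")
print(platform.catalog.list_products())  # ['sales-orders']
```

Because the core depends only on the interface, replacing the catalog (or adding an access-management or data-quality block behind its own interface) is a composition change rather than a migration project.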
This approach enhances agility, fosters innovation, and allows for continuous evolution by integrating the latest and most relevant technologies.

Standardization as glue between building blocks

To create a truly composable ecosystem, the building blocks must be able to work together; in other words, they must be interoperable. This is where standards come into play, enabling seamless integration between data platform building blocks. Standardization ensures that different tools can operate in harmony, offering a flexible, interoperable platform.

Imagine a standard for data access management that allows seamless integration across various components. It would enable an access management building block to list data products and grant access uniformly. Simultaneously, it would allow data storage and serving building blocks to integrate their data and permission models, ensuring that any access management solution can be effortlessly composed with them. This creates a flexible ecosystem where data access is consistently managed across different systems.

The discovery of data products in a catalog or marketplace can be greatly enhanced by adopting a standard specification for data products. With such a standard, each data product can be made discoverable in a generic way. When data catalogs or marketplaces adopt this standard, you gain the flexibility to choose and integrate any catalog or marketplace building block into your platform, fostering a more adaptable and interoperable data ecosystem.

A data contract standard allows data products to specify their quality checks, SLOs, and SLAs in a generic format, enabling smooth integration of data quality tools with any data product. It lets you combine the best solutions for ensuring data reliability across different platforms. Widely accepted standards are key to ensuring interoperability through agreed-upon APIs, SPIs, contracts, and plugin mechanisms. In essence, standards act as the glue that binds a composable data ecosystem.
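As an illustration of the data contract idea (the field names below are hypothetical and not taken from any particular standard), a generic contract plus a quality tool that depends only on the contract format might look like this:

```python
from dataclasses import dataclass, field


@dataclass
class DataContract:
    """A generic contract a data product publishes about itself.

    Field names are illustrative, not from any specific standard.
    """
    product: str
    freshness_slo_hours: int
    quality_checks: list[str] = field(default_factory=list)


def violated_checks(contract: DataContract, passed: set[str]) -> list[str]:
    """A generic quality tool needs only the contract, not product internals."""
    return [check for check in contract.quality_checks if check not in passed]


contract = DataContract(
    product="sales-orders",
    freshness_slo_hours=24,
    quality_checks=["row_count_positive", "no_null_order_id"],
)
print(violated_checks(contract, passed={"row_count_positive"}))
# ['no_null_order_id']
```

Any quality tool that understands the shared format can evaluate any data product, which is exactly the interoperability the standard is meant to buy.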
A strong belief in evolutionary architectures

At ACA Group, we firmly believe in evolutionary architectures and platform engineering, principles that extend seamlessly to data mesh platforms. It's not about locking yourself into a rigid structure, but about creating an ecosystem that can evolve and stay at the forefront of innovation. That's where composability comes in.

Do you want a data platform that not only meets your current needs but also paves the way for the challenges and opportunities of tomorrow? Let's engineer it together. Ready to learn more about composability in data mesh solutions? Contact us now!

Read more

All blog posts

Let's talk!

We'd love to talk to you!

Contact us and we'll get you connected with the expert you deserve!


code
Istio Service Mesh: What and Why
Reading time 3 min
8 MAY 2025

In the complex world of modern software development, companies face the challenge of seamlessly integrating diverse applications developed and managed by different teams. An invaluable asset in overcoming this challenge is the service mesh. In this blog article, we delve into Istio Service Mesh and explore why investing in a service mesh like Istio is a smart move.

What is a service mesh?

A service mesh is a software layer responsible for all communication between applications, referred to as services in this context. It introduces new functionalities to manage the interaction between services, such as monitoring, logging, tracing, and traffic control. A service mesh operates independently of the code of each individual service, enabling it to work across network boundaries and with various management systems. Thanks to a service mesh, developers can focus on building application features without worrying about the complexity of the underlying communication infrastructure.

Istio Service Mesh in practice

Consider managing a large cluster that runs multiple applications developed and maintained by different teams, each with diverse dependencies like Elasticsearch or Kafka. Over time, this results in a complex ecosystem of applications and containers, overseen by various teams. The environment becomes so intricate that administrators find it increasingly difficult to maintain a clear overview. This leads to a series of pertinent questions: What does the architecture look like? Which applications interact with each other? How is the traffic managed? Moreover, specific challenges must be addressed for each individual application:

- Handling login processes
- Implementing robust security measures
- Managing network traffic directed towards the application
- ...

A service mesh, such as Istio, offers a solution to these challenges.
Istio acts as a proxy between the various applications (services) in the cluster, with each request passing through a component of Istio.

How does Istio Service Mesh work?

Istio introduces a sidecar proxy for each service in the microservices ecosystem. This sidecar proxy manages all incoming and outgoing traffic for the service. Additionally, Istio adds components that handle the incoming and outgoing traffic of the cluster. Istio's control plane enables you to define policies for traffic management, security, and monitoring, which are then applied to these components. For a deeper understanding of Istio Service Mesh functionality, our blog article "Installing Istio Service Mesh: A Comprehensive Step-by-Step Guide" provides a detailed, step-by-step explanation of the installation and use of Istio.

Why Istio Service Mesh?

- Traffic management: Istio enables detailed traffic management, allowing developers to easily route, distribute, and control traffic between different versions of their services.
- Security: Istio provides a robust security layer with features such as traffic encryption using its own certificates, role-based access control (RBAC), and capabilities for implementing authentication and authorization policies.
- Observability: Through built-in instrumentation, Istio offers deep observability with tools for monitoring, logging, and distributed tracing. This allows IT teams to analyze the performance of services and quickly detect issues.
- Simplified communication: Istio removes the complexity of service communication from application developers, allowing them to focus on building application features.

Is Istio suitable for your setup?

While the benefits are clear, it is essential to consider whether the additional complexity of Istio fits your specific setup. First, a sidecar container is required for each deployed service, potentially leading to unwanted memory and CPU overhead.
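To make the traffic-management point concrete, the weighted split between service versions that a mesh performs can be sketched in a few lines of Python (a toy model of the idea, not Istio's actual implementation or configuration format):

```python
import random


def pick_version(weights: dict[str, int], rng: random.Random) -> str:
    """Choose a service version for one request according to percentage
    weights, similar in spirit to a weighted route in a service mesh."""
    roll = rng.uniform(0, sum(weights.values()))
    cumulative = 0
    for version, weight in weights.items():
        cumulative += weight
        if roll <= cumulative:
            return version
    return version  # guard against floating-point edge cases


# Send ~90% of traffic to v1 and ~10% to a canary v2.
rng = random.Random(42)
routed = [pick_version({"v1": 90, "v2": 10}, rng) for _ in range(1000)]
print(routed.count("v2"))  # close to 100, i.e. about 10% of requests
```

In a real mesh the same split is expressed declaratively (in Istio, as weights on a VirtualService) and enforced by the sidecar proxies, so no application code has to change to roll a canary out or back.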
Additionally, your team may lack the specialized knowledge required to operate Istio. If you are considering adopting Istio Service Mesh, seek guidance from specialists with the relevant expertise. Feel free to ask our experts for assistance.

More information about Istio

Istio Service Mesh is a game-changer for IT professionals aiming for advanced control, security, and observability in their microservices architecture. Istio simplifies and secures communication between services, allowing IT teams to focus on building reliable and scalable applications. Need quick answers to all your questions about Istio Service Mesh? Contact our experts.

Read more
handout
Reading time 4 min
8 MAY 2025

Let's begin this blog post by remembering embarrassing memories from our past. Remember when you called your teacher "mom" in kindergarten? Or that huge zit on your forehead on your first day of high school? Let's have a look at a picture from a couple of years ago. Those clothes… those hideous clothes. Why did you think you'd rock those super tight pants forever? Well, you've changed, and so did those fashion trends. Trends also seem to change quite rapidly in other kinds of design. In this blog post, I'll walk you through 3 tips to modernize your handout design and improve an old print design to fit the modern world. I'm using a recent redesign of a handout made for our mobile team as an example.

1. Go digital

The biggest change in this redesign is that we're actually stepping away from print and moving towards digital media, such as a PDF file. A couple of advantages go hand in hand here. You no longer have to worry about how to jam all your information onto two A4 pages, since you aren't limited to a certain page size. This gives you more freedom with both your structure and information placement. As you can see in our example (click the picture on the right), this results in a lighter design that's much more enjoyable to read because of the extra whitespace between sections. Another big improvement is the ability to make use of links. You don't have to write out a full URL anymore, because you can just click on a shorter one or even add a button as if you're designing a webpage. Try adding links to logos or pictures. By doing this, you're changing a completely static object into a link to even more information. Your handout is no longer limited to just the information on your two pages. Be careful not to forget to add links to anything that seems 'clickable', though. If your users try to click on something that should have been a redirect to, for example, your website and it doesn't work, you don't make a good impression.

2. Add more whitespace!
First things first: whitespace doesn't need to be 'white' space. Whitespace refers to empty breathing room in your design, not a white colour between design elements. Imagine getting a page to read where everything is crammed into the first half of the page. I'd rather gouge my eyes out than try to read that. Space out your information and use that empty space to improve the reader's experience and even guide them to the parts you want them to read. In other words, empty space makes your content more readable. When your focus is to inform your reader, readability is top priority. In our old design, we used an abstract background. Pictures or patterns can be used as whitespace, but they aren't ideal if you're going for a professional and clean look. We simply changed to a white background instead. This way, we structured our design into a couple of 'information containers'. The whitespace makes sure each of those containers gets the focus it deserves. So don't just read 'whitespace' and see it as unused space. Think of whitespace as a guide for your reader.

3. Don't overdo it

You want to tell your audience something, so get to the point. Don't try to over-explain, and don't use words you barely know just to sound smart. The same goes for showing pictures and using design elements. Nobody wants to scroll through 10 pictures that all show the same thing to find the information they need. Say what you need to say and show what you need to show. Also, try using a number instead of an icon when speaking about a certain percentage, for example. People love numbers (just look at all those infographics floating around on the internet as proof). Try to strike a nice balance between icons and percentages when designing something. Of course, there's nothing wrong with using icons as a visual too. They can break up your wall of text, make it more enjoyable to read, and keep the reader interested. Last but not least, your design isn't a child's coloring book. Don't use all of the crayons available!
We went way overboard in our previous design. Now we're only using red as our main color, because it's also the main color of ACA's corporate identity. If you really want to use more color in your new design, be consistent. Don't give your first title a blue color while giving your other titles a purple color. It's pretty basic really, but often overlooked.

🚀 Takeaway

Print design isn't exclusively for print anymore in today's digital world. If you want to update your previous print designs so they fit in our online lives as well, here are some things you should take into account:

- Forget strict page size limits.
- Include clickable links in your design. You can even add buttons like on a website.
- Use as much whitespace as you need to help your reader 'breathe' for a second and rest their eyes.
- Be brief!
- Include visuals, but stay consistent in your color palette.

Read more
futuristic man behind laptop ai
Reading time 4 min
8 MAY 2025

The world of data analysis is changing fast. AI tools like Copilot are automating tasks that used to take us hours, which is exciting! But it also means we need to evolve our skills to stay ahead of the curve. Instead of spending time on repetitive tasks, data analysts can now focus on the bigger picture: strategy, problem-solving, and truly understanding the business. This blog explores the key skills data analysts need to thrive in this new AI-powered environment.

The data analyst's new focus: from repetitive tasks to strategy

Imagine having more time to focus on what really matters: understanding the business, solving complex problems, and making strategic decisions. That's the opportunity AI provides. To maximize Copilot's potential, data analysts need to shift their focus from manual tasks to work that requires deep business knowledge and critical thinking. A crucial part of this shift is collaborating closely with stakeholders. Data analysts need to understand their challenges, define the right questions, and ensure their insights truly drive decision-making.

Key skills data analysts need when working with AI

1. Advanced data modeling and metadata management

Why it matters: With AI tools like Copilot handling much of the front-end report creation, the quality of insights will increasingly depend on the robustness of the underlying data model. Data analysts should invest time in refining their data modeling and metadata management skills.

Actionable steps:

- Ensure that data models are clean, scalable, and well-documented. Be honest: how often have you filled out the "Description" field in your Power BI data model? How often have you used the "Synonyms" field? Our guess is: not all that often. Ironically, these fields will now be crucial in your pursuit of qualitative responses from Copilot.
- Organize metadata to improve discoverability, ensuring Copilot (or other AI tools) can leverage the right data to generate insights.
- Build a deep understanding of how to structure data so AI can create actionable, accurate insights. Take a good, hard look at your data model and how it is built. Define what can be improved based on best practices, and then apply those improvements systematically.

2. Data governance and quality assurance

Why it matters: Copilot can only produce reliable outputs with high-quality data. Data analysts will need to focus on ensuring data consistency, reliability, and governance.

Actionable steps: Implement and maintain best practices for data governance. Use clear naming conventions, predefined measures, and logical data structures that make it easier for Copilot to generate actionable insights.

3. Business acumen and strategic insight generation

Why it matters: AI tools lack contextual understanding, so data analysts must bridge this gap. Developing a strong grasp of business operations, industry trends, and strategic objectives allows analysts to create insights that are both relevant and impactful.

Actionable steps: Invest in learning about your organization's goals and strategic challenges. The more clearly you can understand and document these goals and challenges, the better you will be able to translate them into relevant insights. Regularly engage with business leaders to understand the context behind the data, which in turn helps translate findings into actionable strategies.

4. Communication and storytelling skills

Why it matters: Translating technical insights into stories that resonate with business stakeholders is crucial. Storytelling bridges the gap between data and decision-makers.

Actionable steps: Become an expert at framing insights. Work on presenting data in narrative formats that highlight the "why" and "how" behind the insights. Focus on how the data aligns with the company's goals, offering clear recommendations and visualizations that stakeholders can easily grasp.
How to implement these skills: practical actions for data analysts

Developing data modeling and metadata management skills

With AI tools like Copilot in the mix, the quality of insights depends significantly on data models. Data analysts should dedicate time to refining their data modeling skills, focusing on:

- Organizing and documenting data: Pay attention to metadata fields like descriptions and synonyms, which help AI generate more accurate outputs.
- Data structure optimization: Ensure your data structure is scalable, clean, and flexible. This streamlines Copilot's ability to work with the data.

Engaging with business stakeholders

AI-generated insights are only as valuable as their alignment with business goals. Data analysts must regularly engage with stakeholders to:

- Define clear objectives: Discuss goals and pain points with stakeholders to set a clear direction for AI analysis.
- Gather feedback: Regular feedback helps adjust AI-generated insights to better meet business needs, ensuring outputs are practical and actionable.

Conclusion: the future of data analysis is here

AI tools like Copilot are transforming data analysis, and it's an exciting time to be in this field! By focusing on strategic thinking, communication, and strong data foundations, data analysts can not only adapt but thrive. The ability to connect data insights to business context, combined with excellent communication and storytelling, will define the most successful data analysts in the years to come. By investing in these skills, data analysts can stay at the forefront of data-driven innovation. For more insights on how Copilot is shaping data analysis, read the article "How Copilot in Power BI is Transforming Data Analysis".

🚀 Ready to empower your data team with advanced AI skills? Contact our experts to support your transformation.

Read more
woman behind laptop
Reading time 5 min
8 MAY 2025

“Do I have the right qualifications to become a freelancer?” This is a popular question for people who are thinking about becoming their own boss. Perhaps you’ve asked this very question yourself. In this blog post, we’ll go over some stats, as well as some different approaches to education as a freelancer.

Is formal education as a freelancer necessary?

Let’s start by looking at some data. Freelancermap conducted a survey in which they asked freelancers from their community to tell them their highest obtained degree. They found that from a sample of over one thousand freelancers, 42.5% held a university degree. An additional 4.5% of participants held a Master’s degree. Moreover, 28% of the survey participants had finished a technical college degree. That means that in this case, roughly 75% of freelancers have had some form of higher education.

So, should you pursue a degree? Looking at these numbers, it certainly seems like you should if you want to be competitive. However, there’s no right or wrong answer to this question. A degree isn’t a holy grail. While it does increase your chances as a freelancer, it’s no guarantee. So if you don’t have a degree, there’s no reason to panic. A lack of formal education or experience is not a barrier to freelancing. However, you can’t expect clients to just drop out of the sky and trust you with large projects from the get-go. You’ll need to network and create a portfolio full of excellent examples of your work, even if you have a degree. Depending on your major, your degree may enable you to specialize and charge a higher rate for your services. Ultimately, if you want to become a freelancer, you have to take a few things besides a degree into consideration as well, such as:

the education and training required by your niche. If professional licensing or certification requirements demand a degree or other training, that’s simply unavoidable. You’ll need to research what education and/or training is absolutely required.
your own feelings about getting a degree or any type of education as a freelancer. Do you feel like you’ll be more confident with a degree to your name? Is obtaining a degree an important accomplishment to you? Or perhaps you just really want to closely study something you’re interested in? What about intrinsic motivation? The value of higher education isn’t solely wrapped up in earning power or career success.

your desire to be free, to be your own boss, and your entrepreneurial spirit. You gotta want it and dare to go for it!

Continuing education as a freelancer

So: while education as a freelancer is always a good thing, there are no formal requirements to start freelancing. But what about continuing education as a freelancer? While there’s no definitive answer to the question of whether freelancers should pursue a degree, we can be unambiguous when it comes to continuing education. The key to sustaining freelance success is continuing your education, both in your chosen field and as a general businessperson. There are numerous reasons for this.

First off, technology advances at the speed of light. It’s a cliché, but it’s true: what’s cutting-edge one day might be obsolete the next. Clients don’t need developers for iOS 4, Android 2.3 or Windows 95 anymore. It’s vital that you stay up-to-date with current technologies and understand where your niche is moving in the near future. Additionally, while it might not happen quite as quickly as with technology, business practices and methodologies are changing too.

Secondly, investing in personal and professional development makes sense from a business point of view. And what do you know: as a freelancer, you’re a business too. If you want to compete with the big players and increase your hourly fee, maintaining a current level of education and certification is a no-brainer.
Thirdly: taking a workshop, signing up for a seminar, or attending an industry conference are all easy ways to expand your professional network as well as your knowledge base. If you want to be tapped into the pulse of your profession, you need to be talking, sharing, and learning from other freelancers and industry leaders. Taking workshops is also a good way to keep yourself interested in your field.

Lastly, continuing your education will help you keep producing high-quality work. It’s just easier to keep the quality of your work up when you’re up-to-date with what the market expects from you, be it a new version of an operating system, tips and tricks to complete assignments faster, or an entirely new skillset.

There are a few good e-learning platforms you can register for to continue your education:

- Skillshare offers over 23,000 classes in design, business, tech and more. It even has a separate freelance section! We’ve written about Skillshare before in a whitepaper about the 10 tools for growing a successful freelance career. You can download the whitepaper down below. In it, there’s a link that allows you to try Skillshare for free for 2 months instead of just one!
- Udemy offers discounted courses starting from €10.99 for graphic design, writing, web design, editing, photography, and running a freelance business.
- Coursera provides certification for courses that last about 7 days on average. The platform offers language, writing, marketing, advertising, business, and academic material from top universities across the United States. Perfect for freelancers looking to hone their skills!

🚀 Takeaway

While education is always a good thing, there are no formal requirements to start freelancing. However, you can’t expect to become a successful freelancer if you don’t possess the necessary courage, an entrepreneurial spirit, a high degree of motivation, and a skill in which you really excel.
Also take into consideration education and training required by your niche and your own feelings about getting a degree. When it comes to continuing education, there’s no doubt about it: do it as often as you can! Continuing your education is key to sustaining your success as a freelancer for a number of reasons, ranging from keeping up-to-date with current technologies and practices to challenging yourself.

Read more
woman thinking
Reading time 7 min
8 MAY 2025

In software development, assumptions can have a serious impact and we should always be on the look-out. In this blog post, we talk about how to deal with assumptions when developing software.

Imagine… you’ve been driving to a certain place

A place you have been driving to every day for the last 5 years, taking the same route, passing the same abandoned street, where you’ve never seen another car. Gradually you start feeling familiar with this route and you assume that, as always, you will be the only car on this road. But then at a given moment in time, a car pops up right in front of you… there had been a side street all this time, but you had never noticed it, or maybe forgot all about it. You hit the brakes and fortunately come to a stop just in time. Assumption nearly killed you. Fortunately, in our job, the assumptions we make are never as hazardous to our lives as the assumptions we make in traffic. Nevertheless, assumptions can have a serious impact and we should always be on the look-out.

Imagine… you create websites

Your latest client is looking for a new site for his retirement home because his current site is outdated and not that fancy. So you build a fancy new website based on the assumption that fancy means: modern design, social features, dynamic content. The site is not the success he had anticipated… strange… you have built exactly what your client wanted. But did you build what the visitors of the site want? The average user is between 50 and 65 years old, looking for a new home for their mom and dad. They are not digital natives and may not feel at home surfing on a fancy, dynamic website filled with Twitter feeds and social buttons. All they want is to get a good impression of the retirement home and reassurance that it will take good care of their parents. The more experienced you get, the harder you will have to watch out not to make assumptions, and to double-check with your client AND the target audience.
Another well-known peril of experience is “the curse of knowledge”. Although it sounds like the next Pirates of the Caribbean sequel, the curse of knowledge is a cognitive bias that overpowers almost everyone with expert knowledge in a specific sector. It means better-informed parties find it extremely difficult to think about problems from the perspective of lesser-informed parties. You might wonder why economists don’t always succeed in making the correct stock-exchange predictions. Everyone with some cash to spare can buy shares. You don’t need to be an expert or even understand economics. And that’s the major reason why economists are often wrong: because they have expert knowledge, they can’t see past this expertise and have trouble imagining how lesser-informed people will react to changes in the market. The same goes for IT. That’s why we always have to keep an eye out and never stop putting ourselves in the shoes of our clients. Gaining insight into their experience and point of view is key to creating the perfect solution for the end user.

So how do we tackle assumptions?

I would like to say “Simple” and give you a wonderful one-liner… but as usual… simple is never the correct answer. To manage the urge to switch to auto-pilot and let the curse of knowledge kick in, we’ve developed a methodology based on several Agile principles which forces us to involve our end user in every phase of the project, starting when our clients are thinking about a project but haven’t defined the solution yet. And ending… well, actually never. The end user will gain new insights working with your solution, which may lead to new improvements. In the waterfall methodology, an analysis is made upfront by a business analyst at the start of a project. Sometimes the user is involved in this upfront analysis, but this is not always the case. Then a conclave of developers creates something in solitude and after the white smoke… user acceptance testing (UAT) starts.
It must be painful for them to realise after these tests that the product they carefully crafted isn’t the solution the users expected it to be. It’s too late to make vigorous changes without needing much more time and budget. An Agile project methodology will take you a long way. By releasing testable versions every 2 to 3 weeks, users can gradually test functionality and give their feedback during development of the project. This approach incorporates the user’s insights gained throughout the project and guarantees a better match between the needs of the user and the solution you create for those needs. Agile practitioners advocate ‘continuous deployment’: a practice where newly developed features are deployed immediately to a production environment instead of in batches every 2 to 3 weeks. This enables us to validate the system (and in essence its assumptions) in the wild, gain valuable feedback from real users, and run targeted experiments to validate which approach works best. Combining our methodology with constant user involvement will make sure you eliminate the worst assumption in IT: “we know how the employees do their job and what they need”… the peril of experience!

Do we always eliminate assumptions?

Let me make it a little more complicated. Again… imagine: you’ve been going to the same supermarket for the last 10 years. It’s pretty safe to assume that the cereal is still in the same aisle, even on the same shelf as yesterday. If you stopped assuming where the cereal is, you would lose a huge amount of time browsing through the whole store. Not just once, but over and over again. The same goes for our job. If we did our job without relying on our experience, we would not be able to make estimations about budget and time. Every estimation is based upon assumptions. The more experienced you are, the more accurate these assumptions will become. But do they lead to good and reliable estimations?
Not necessarily… Back to my driving metaphor: we take the same road to work every day. Based upon experience, I can estimate it will take me 30 minutes to drive to work. But what if they’ve announced traffic jams on the radio and I haven’t heard the announcement… my estimation will not have been correct. At ACA Group, we use a set of key practices while estimating. First of all, it is a team sport. We never make estimations on our own, and although estimating is serious business, we do it while playing a game: planning poker. Let me enlighten you; planning poker is based upon the principle that we are better at estimating in a group. So we read the story (a chunk of functionality) out loud, everybody takes a card (which represents an indication of complexity) and puts it face down on the table. When everybody has chosen a card, they are all flipped at once. If different numbers are shown, a discussion starts on the why and how. The assumptions that form the basis for one’s estimate surface and are discussed and validated. Another estimation round follows, and the process continues till consensus is reached. The end result: a better estimate and a thorough understanding of the assumptions surrounding the estimate. These explicit assumptions are there to be validated by our stakeholders; a great first tool to validate our understanding of the scope. So do we always eliminate assumptions? Well, that would be almost impossible, but making assumptions explicit eliminates a lot of waste. Want to know more about Agile estimation? Check out this book by Mike Cohn. Hey! This is a contradiction… So what about these assumptions? Should we try to avoid them? Or should we rely on them? If you assume you know everything… you will never again experience astonishment. As Aristotle already said: “It was their wonder, astonishment, that first led men to philosophize”.
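The planning poker rounds described above follow a simple loop: everyone reveals a card at once, divergent numbers trigger a discussion of the underlying assumptions, and the process repeats until the team converges. A minimal sketch (the card values here are invented for illustration):

```python
def poker_round(cards):
    """One planning-poker round: all cards are revealed at once.
    Consensus is reached when every card shows the same value; otherwise
    the outliers explain their assumptions and a new round follows."""
    return len(set(cards)) == 1

# Round 1: estimates diverge; the 13 hides an assumption worth discussing.
round_1 = [5, 5, 13, 8]
# Round 2: after discussing the assumptions behind the 13, the team converges.
round_2 = [8, 8, 8, 8]

assert not poker_round(round_1)
assert poker_round(round_2)
```

The point of the loop is not the numbers themselves but the discussion they force: divergence is the signal that an unstated assumption exists.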
Well, a process that validates the assumptions made through well-conducted experiments and rapid feedback has proven to yield great results. So in essence, managing your assumptions well will produce wonderful things. Be aware though that the curse of knowledge is lurking around the corner, waiting for an unguarded moment to take over.

Interested in joining our team? We are always looking for new motivated professionals to join the ACA team! View career opportunities: https://25145356.hs-sites-eu1.com/en/jobs

Read more
Reading time 6 min
8 MAY 2025

Over the last few years, digitization and innovation have had an enormous impact on the application landscape. A company’s application architecture used to be relatively simple, but that is no longer the case. Numerous cloud solutions, rented on a monthly basis, now complicate things to the point where it’s no longer obvious which data is kept where. Combine this trend with the shift towards self-service applications from a data consumption perspective and the impact on data architectures is inevitable. In this blog post, we’ll dive deeper into this (r)evolution in the world of data and the impact of a changing application landscape on your data architecture.

Keeping an open data architecture

‘Data’ is a broad concept and includes an incredible number of domains that all require specific knowledge or some sort of specialization. There are plenty of examples: data architecture, data visualization, data management, data security, GDPR, and so on. Over the years, many organizations have tried to get a grasp on all these different ‘data domains’. And this really isn’t a cakewalk, since innovative changes are taking place in each of these domains. Additionally, they often coincide with other, newer concepts such as AI, data science, machine learning, and others. In any case, it’s preferable to keep your vision and data architecture as ‘open’ as possible. This keeps the impact of future changes on your current implementation as low as possible. Denying such changes means slowing down innovation, possibly annoying your end-users and vastly increasing the chance of a huge additional cost a few years down the line when the need to revise your architecture can no longer be postponed.

Modern applications complicate combining data

The amount of data increases exponentially every year. Moreover, the new generation of end-users is used to being served at their beck and call. This is a trend that the current application landscape clearly supports.
Within many applications, software vendors offer data in real-time in an efficient, attractive and insightful way. Huge props to these vendors of course, but this poses additional difficulties for CIOs to deliver combined data to end-users. “What is the impact of a marketing campaign on the sale of a certain product?” Answering a question like this poses a challenge for many organizations. The answer requires combining data from two (admittedly well-organized) applications. For example, Atlassian offers reporting features in Jira while Salesforce does the same with its well-known CRM platform. The reporting features in both of these software packages are actually very detailed and allow you to create powerful reports. However, it’s difficult to combine this data into one single report. Moreover, besides well-structured Marketing and Sales domains, a question like that requires an overarching technical and organizational alignment. Which domain has the responsibility or the mandate to answer such a question? Is there any budget available? What about resources? And which domain will bear these costs?

Does Self-Service BI offer a solution?

In an attempt to answer such questions, solutions such as Self-Service BI introduced themselves to the market. These tools are able to simply combine data and provide insights their users might not have thought of yet. The only requirement is that these tools need access to the data in question. Sounds simple enough, right? Self-Service BI tools have boomed over the past few years, with Microsoft setting the example with Power BI. By making visualizations and intuitive ‘self-service data loaders’ a key component, they were able to convince the ‘business’ to invest. But this creates a certain tension between the business users of these tools and CIOs. The latter slowly lose their grip on their own IT landscape, since a Self-Service BI approach may also spawn a lot of ‘shadow BI’ initiatives in the background.
For example, someone may have been using Google Data Studio on their own initiative without the CIO knowing, while that CIO is trying to standardize a toolset using Power BI. The conclusion: tons of data duplication, security infringements, and then we haven’t even talked about GDPR compliance yet.

Which other solutions are there?

The standard insights and analytics reports within applications are old news, and the demand for real-time analytics, also known as streaming analytics, is rising. For example, during online shopping, stores display their actual stock of a product on the product page itself. Pretty run-of-the-mill, right? So why is it then so hard to answer the question regarding the impact of my marketing campaign on my sales in a report? The demands and needs for data are changing. Who is the owner of which data and who determines its uses? Does historical data disappear if it’s not stored in a data warehouse? If the data is still available within the application where it was initially created, how long will it remain there? Storing the data in a data lake or data repository is a possible cheap(er) solution. However, this data is barely organized, if at all, making it difficult to use for things like management reporting. Perhaps offloading this data to a data warehouse is the best solution? Well-structured data, easily combined with data from other domains and therefore an ideal basis for further analysis. But… the information is not available in real-time and this solution can get pretty costly. Which solution best fits your requirements?

Takeaway

As you’ve noticed by now, it’s easy to sum up a ton of questions and challenges regarding the structuring of data within organizations. Some data-related questions require a quick answer, while other, more analytical or strategic questions don’t actually need real-time data. A data architecture that takes all these needs into account and is open to changes is a must.
We believe in a data approach in which the domain owner is also the owner of the data and facilitates this data towards the rest of the organization. It’s the responsibility of the domain owner to organize their data in such a way that it can provide an answer to as many questions from the organization as possible. It’s possible that this person doesn’t have the necessary knowledge or skills within their team to organize all of this. Therefore, a new role within the organization is necessary to support domain owners with knowledge and resources: the role of a Chief Data Officer (CDO). They orchestrate everything and anything in the organization when it comes to data and have the mandate to enforce general guidelines. Research shows that companies that have appointed a CDO are more successful when rolling out new data initiatives. ACA Group commits itself to guiding its customers as best as possible in their data approach. It’s vital to have a clear vision, supported by a future-proof data architecture: an architecture open to change and innovation, not just from a technical perspective, but also when it comes to changing data consumption demands. Relevant to the new generation, and a challenge for most data architectures and organizations.
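To make the earlier marketing-campaign question concrete, here is a minimal, hypothetical sketch of the cross-application join such a combined report needs. The field names and values are invented for illustration, not actual Jira or Salesforce schemas:

```python
# Marketing data from one application, sales data from another, matched on a
# shared product key (all identifiers here are invented for this sketch).
campaigns = [
    {"product": "P100", "campaign_spend": 5000},
    {"product": "P200", "campaign_spend": 2000},
]
sales = [
    {"product": "P100", "units_sold": 420},
    {"product": "P300", "units_sold": 50},  # no campaign ran for this product
]

def combine(campaigns, sales):
    """Join the two datasets on product; only products present in both appear."""
    spend = {c["product"]: c["campaign_spend"] for c in campaigns}
    return [
        {**s, "campaign_spend": spend[s["product"]]}
        for s in sales if s["product"] in spend
    ]

print(combine(campaigns, sales))
# → [{'product': 'P100', 'units_sold': 420, 'campaign_spend': 5000}]
```

The join itself is trivial; the organizational questions raised above (who owns the key, who maintains the pipeline, who pays for it) are the hard part.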
Check out our data services: https://acagroup.be/en/services/data/

Read more
Mob programming in a meeting room
Reading time 4 min
8 MAY 2025

ACA does a lot of projects. In the last quarter of 2017, we did a rather small project for a customer in the financial industry. The deadline for the project was at the end of November and our customer was getting anxious near the end of September. We were confident we could pull off the job on time though, and decided to try out an experiment. We got the team together in one room and started mob programming.

Mob what?

We had read an article that explains the concept of mob programming. In short, mob programming means that the entire team sits together in one room and works on one user story at a time. One person is the ‘driver’ and does the coding for a set amount of time. When that time has passed, the keyboard switches to another team member. We tried the experiment with the following set-up:

- Our team was relatively small and only had 4 team members. Since the project we were working on was relatively small, we could only assign 4 people.
- The user stories handled were only a part of the project. Because this was an experiment, we did not want the project - as small as it was - to be mobbed completely. Hence, we chose one specific epic and implemented those user stories in the mob.
- We did not work on the same computer. We each had a separate laptop and checked in our code to a central versioning system instead of switching the keyboard. This wasn't really a choice we made, just something that happened.
- We switched every 20 minutes. The article we referred to talks about 12, but we thought that would be too short and decided to go with 20 minutes instead.

Ready, set, go!

We spent more than a week inside a meeting room where we could, in turn, connect our laptops to one big screen. The first day of the experiment, we designed. We stood at the whiteboard for hours deciding on the architecture of the component we were going to build. On the same day, our mob started implementing the first story. We really took off!
We flew through the user story, calling out to our customer proxy when some requirements were not clear. Near the end of the day, we were exhausted. Our experiment had only just started and it was already so intense. The next days, we continued implementing the user stories. In less than a week, we had working software that we could show to our customer. While it wasn’t perfect yet and didn’t cover all requirements, our software was able to conduct a full, happy path flow after merely 3 days. Two days later, we implemented enhancements and exception cases discussed in other user stories. Only one week had passed since our customer started getting anxious and we had already implemented so much we could show him.

Finishing touches

Near the end of the project, we only needed to take care of some technicalities. One of those was making our newly-built software environment agnostic. Had we finished this user story with pair programming, only one pair would know all the technical details of the software. With mob programming, we did not need to showcase it to the rest of the team. The team already knew. Because we switched laptops instead of keyboards, everyone had done the setup on their own machine. Everyone knew the commands and the configuration. It was knowledge sharing at its best! Other technicalities included configuring our software correctly. This proved to be a boring task for most of the navigators. At this point, we decided the mob experiment had gone far enough. We felt that we were not supposed to do tasks like these with 4 people at the same time. At least, that’s our opinion. Right before the mob disbanded, we planned an evaluation meeting. We were excited and wanted to do this again, maybe even at a bigger scale.

Our experience with mob programming

The outcome of our experiment was very positive. We experienced knowledge sharing at different levels.
Everyone involved knew the complete functionality of the application and we all knew the details of the implementation. We were able to quickly integrate a new team member when necessary, while still working at a steady velocity. We already mentioned that we were very excited before, during and after the experiment. This had a positive impact on our team spirit. We were all more engaged in completing the project. The downside was that we experienced mob programming as more exhausting. We felt worn out after a day of being together, albeit in a good way!

Next steps

Other colleagues noticed us in our meeting room, programming on one big screen. Conversations about the experiment started. Our excitement was contagious: people were immediately interested. We started talking about doing more experiments. Maybe we could do mob programming in different teams on different projects. And so it begins… Have you ever tried mob programming? Or are you eager to try? Let’s exchange tips or tricks! We’ll be happy to hear from you!

Read more
Liferay AI search
Reading time 3 min
8 MAY 2025

Liferay DXP has become a widely adopted portal platform for building and managing advanced digital experiences over recent years. Organizations use it for intranets, customer portals, self-service platforms, and more. While Liferay DXP is known for its user-friendliness, its default search functionality can be further optimized to meet modern user expectations. To address this, ACA developed an advanced solution that significantly enhances Liferay’s standard search capabilities. Learn all about it in this blog.

Searching in Liferay: not always efficient

Traditionally, organizational searches relied on individual keywords. For example, intranet users would search terms like "leave" or "reimbursement" to find the information they needed. This often resulted in an overload of results and documents, leaving users to sift through them manually to find relevant information: a time-consuming and inefficient process that hampers the user experience.

The way users search has changed

The rise of AI tools like ChatGPT has transformed how people search for information. This is also visible in online search engines like Google, where users increasingly phrase their queries as complete questions. For example: “How do I apply for leave?” or “What travel reimbursement am I entitled to?” To meet these evolving search needs, search functionality must not only be fast but also capable of understanding natural language. Unfortunately, Liferay’s standard search falls short in this area.

ACA develops advanced AI-powered search for Liferay

To accommodate today’s search behavior, ACA has created an advanced solution for Liferay DXP 7.4 installations: Liferay AI Search. Leveraging the GPT-4o language model, we’ve succeeded in significantly improving Liferay’s standard search capabilities. GPT-4o is a state-of-the-art language model trained on an extensive dataset of textual information.
By integrating GPT-4o into our solution, we’ve customized search algorithms to handle more complex queries, including natural language questions. How does Liferay AI Search work? Closed dataset The AI model only accesses data from within the closed Liferay environment. This ensures that only relevant documents, such as those from the Library and Media Library, are accessible to the model. Administrator controls Administrators can decide which content is included in the GPT-4o dataset, allowing them to further optimize the accuracy and relevance of search results. Depending on the user’s profile, the answers and search results are tailored to the information they are authorized to access. Direct answers Thanks to GPT-4o integration, the search functionality provides not only traditional results but also direct answers to user queries. This eliminates the need for users to dig through search results to find the specific information they need. The comparison below illustrates the difference between search results from Liferay DXP’s standard search and the enhanced results from ACA’s Liferay AI Search. Want to see Liferay AI Search in action? Check out the demo below or via this link! Benefits of Liferay AI Search Whether you use Liferay DXP for your customer platform or intranet, Liferay AI Search offers numerous advantages for your organization: Increased user satisfaction: Users can quickly find precise answers to their queries. Improved productivity: Less time is spent searching for information. Enhanced knowledge sharing: Important information is easier to locate and share. Conclusion With Liferay AI Search, ACA elevates Liferay DXP’s search functionality to meet modern user expectations. By integrating GPT-4o into Liferay DXP 7.4, this solution delivers not only traditional search results but also direct, relevant answers to complex, natural language queries. 
This leads to a faster, more user-friendly, and efficient search experience that significantly boosts both productivity and user satisfaction. Ready to optimize your Liferay platform’s search functionality? Contact us today!
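The closed-dataset, permission-aware flow described above can be illustrated with a short sketch: restrict the candidate documents to what the user is authorized to see, rank them against the query, and build a prompt that asks the language model for a direct answer grounded in that context. This is not the actual Liferay AI Search implementation; the function names, document structure, and ranking heuristic are assumptions chosen purely for illustration, and the GPT-4o call itself is reduced to prompt construction.

```python
# Illustrative sketch of permission-aware retrieval with a direct-answer
# prompt. All names and structures here are hypothetical.

def authorized_documents(documents, user_roles):
    """Keep only documents whose required role matches one of the user's roles."""
    return [d for d in documents if d["required_role"] in user_roles]

def rank_by_overlap(documents, query):
    """Rank documents by naive word overlap with the query (a stand-in
    for a real relevance engine)."""
    terms = set(query.lower().split())
    def score(doc):
        return len(terms & set(doc["text"].lower().split()))
    return sorted(documents, key=score, reverse=True)

def build_prompt(query, top_docs):
    """Assemble the context-grounded prompt that would be sent to the model."""
    context = "\n".join(d["text"] for d in top_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    {"text": "Apply for leave via the HR portal before Friday.", "required_role": "employee"},
    {"text": "Payroll export procedure for administrators.", "required_role": "admin"},
]
query = "How do I apply for leave?"
visible = authorized_documents(docs, {"employee"})   # admin-only doc is filtered out
ranked = rank_by_overlap(visible, query)
prompt = build_prompt(query, ranked[:1])
```

The key design point mirrored here is that filtering happens before retrieval and prompting, so content a user is not authorized to see never reaches the model's context at all.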

Read more
Ship-IT day 2023
Reading time 7 min
8 MAY 2025

November 30, 2023 marked a highly anticipated day for numerous ACA employees. On Ship-IT Day, nine teams of ACA team members, some supplemented with customer experts, delved into creating inventive solutions for customer challenges or for ACA Group itself. The hackathon proved to be both inspiring and productive, with a well-deserved winner at the end! The atmosphere in the ACA office in Hasselt was sizzling right from the early start. Eight out of the nine project teams were stationed here. During the coffee cake breakfast, you immediately felt that it was going to be an extraordinary day. There was a palpable sense of excitement among the project team members, as well as a desire to tackle the complex challenges ahead. 9 innovative projects for internal and external challenges 🚀 After breakfast, the eight project teams swarmed to their working habitat for the day. The ninth team competed in the ACA office in Leuven. We list the teams here: Chatbot course integration in customer portal System integration tests in a CI/CD pipeline Onboarding portal/platform including gamification Automatic dubbing, transcription and summary of conversations publiq film offering data import via ML SMOCS, low-level mock management system Composable data processing architecture Virtual employees Automated invoicing If you want to know more about the scope of the different project teams, read our first blog article Ship-IT Day 2023: all projects at a glance. Sensing the atmosphere in the teams Right before noon, we wondered how the teams had started and how their work was evolving. And so we went to take a quick look... 👀 1. Chatbot course integration in customer portal “After a short kick-off meeting with the customer, we divided the tasks and got to work straight away,” says Bernd Van Velsen. “The atmosphere is great and at the end of the day, we hope to present a result that will inspire the customer. 
In the best case, we will soon be able to use AI tools in a real customer project with the aim of making more optimal use of the customer's wealth of data.” “The Ship-IT Day is an annual tradition that I like to participate in,” says Bernd. “Not only because it is great to collaborate with colleagues from other departments, but also because it is super educational.” 2. System integration tests in a CI/CD pipeline “We want to demonstrate that we can perform click tests in the frontend in an existing environment and verify whether everything works together properly,” says Stef Noten. “We can currently run the necessary tests locally, so we are on schedule. The next step is to also make this work in our build pipeline. At the end of the day, we hope we will be able to run the tests either manually or on a schedule against the latest version of the backend and frontend.” 3. Onboarding portal/platform including gamification The members of this project team all started at ACA fairly recently. And that is exactly what brought them together, because their goal was to develop a platform that makes the onboarding process for new employees more efficient and fun. Dieter Vennekens shared his enthusiasm with us, stating, "We kicked off with a brainstorming session to define the platform's requirements and goals. Subsequently, we reviewed these with the key users to ensure the final product aligns with their expectations. Our aim is to establish the basic structure before lunch, allowing us to focus on development and styling intensively in the afternoon. By the day's end, our objective is to unveil a functional prototype. This project serves as an opportunity to showcase the capabilities of Low-Code.” 4. Automatic dubbing, transcription and summary of conversations Upon entering their meeting room, we found the project team engrossed in their work, and Katrien Gistelinck provided a concise explanation of their project. "Our project is essentially divided into two aspects. 
Firstly, we aim to develop an automatic transcription and summary of a conversation. Concurrently, we are working on the live dubbing of a conversation, although we're uncertain about the feasibility of the latter within the day. It might be a tad ambitious, but we are determined to give it a try." She continued, "This morning, our focus was on defining the user flow and selecting the tools we'll utilize. Currently, multiple tasks are progressing simultaneously, addressing both the UI and backend components." 5. publiq film offering data import via ML Comprising six publiq employees and three from ACA, this team engaged in an introductory round followed by a discussion of the project approach at the whiteboard. They then allocated tasks among themselves. Peter Jans mentioned, "Everyone is diligently working on their assigned tasks, and we maintain continuous communication. The atmosphere is positive, and we even took a group photo! Collaborating with the customer on a solution to a specific challenge for an entire day is energizing." "At the close of the day, our objective is to present a functional demo showcasing the AI and ML (Machine Learning) processing of an email attachment, followed by the upload of the data to the UIT database. The outcome should be accessible on uitinvlaanderen.be." Peter adds optimistically, "We're aiming for the win." That's the spirit, Peter! 6. SMOCS, low-level mock management system Upon our arrival, the SMOCS team was deeply engrossed in their discussions, making us hesitant to interrupt. Eventually, they graciously took the time to address our questions, and the atmosphere was undoubtedly positive. "We initiated the process with a brief brainstorming session at the whiteboard. After establishing our priorities, we allocated tasks accordingly. Currently, we are on track with our schedule: the design phase is largely completed, and substantial progress has been made with the API. 
We conduct a status check every hour, making adjustments as needed," they shared. "By the end of the day, our aim is to showcase an initial version of SMOCS, complete with a dashboard offering a comprehensive overview of the sent requests along with associated responses that we can adjust. Additionally, we have high hopes that the customized response will also show up in the end-user application." 7. Composable data processing architecture This project team aims to establish a basic architecture applicable to similar projects, which often center around data collection and processing. Currently, customers typically start projects from scratch, while many building blocks could be reused via platform engineering and composable data. “Although time flies very quickly, we have already collected a lot of good ideas,” says Christopher Scheerlinck. “What do we want to present later? A very complex scheme that no one understands (laughs). No, we aspire to showcase our concepts for realizing a reusable architecture, which we can later pitch to the customer. Given that we can't provide a demo akin to other teams, we've already come to terms with the likelihood of securing second place!" 8. Virtual employees This team may have been the smallest of them all, but a lot of work had already been done just before noon. “This morning we first had a short meeting with the customer to discuss their expectations,” Remco Goyvaerts explains. “We then identified the priority tasks and both of us quickly got to work. The goal is to develop a virtual colleague who can be fed with new information based on AI and ML. This virtual colleague can help new employees find certain information without having to disturb other employees. I am sure that we will be able to show something beautiful, so at the moment the stress is well under control.” Chatbot technology is becoming more and more popular. 
Remco sees this Ship-IT project as the ideal opportunity to learn more about applications with long-term memory. “The Ship-IT Day is a fantastic initiative,” says Remco. “It's wonderful to have the opportunity to break away from the routine work structure and explore innovative ideas.” 9. Automated invoicing The client involved in this project handles 50,000 invoices annually in various languages. The objective is to extract accurate information from these invoices, translate it into the appropriate language, and convert it into a format that is easy for the customer to manage. “Although we started quite late, we have already made great progress,” notes Bram Meerten. "We can already send the invoice to Azure, which extracts the necessary data reasonably well. Subsequently, we transmit that data to ChatGPT, yielding great results. Our focus now is on visualizing it in a frontend. The next phase involves implementing additional checks and solutions for line information that isn't processed correctly." Bram expresses enthusiasm for the Ship-IT Day concept, stating, "It's fun to start from scratch in the morning and present a functional solution at the end of the day. While it may not be finished to perfection, it will certainly be a nice prototype." And the winner is… 🏆 At 5 p.m., the moment had arrived... Each team had the opportunity to showcase their accomplishments in a 5-minute pitch, followed by a voting session where everyone present could choose their favorite. All teams successfully presented a functional prototype addressing their customer's challenges. While the SMOCS team may not have managed to visualize their solution, they introduced additional business ideas with the SMOCintosh and the SMOCS-to-go food concept. However, these ideas fell just short of securing victory. In a thrilling final showdown, the team working on the onboarding platform for ACA came out as the winners! 
Under the name NACA (New at ACA), they presented an impressive prototype of the onboarding platform, where employees gradually build a rocket while progressing through their onboarding journey. Not only was the functionality noteworthy, but the user interface also received high praise. Congratulations to the well-deserving winners! Enjoy your shopping and dinner vouchers. 🤩 See you next year!

Read more
Reading time 4 min
8 MAY 2025

OutSystems: a catalyst for business innovation In today's fast-paced business landscape, organisations must embrace innovative solutions to stay ahead. There are many strategic technological trends that address crucial business priorities such as digital immunity, composability, AI, platform engineering, Low-Code, and sustainability. OutSystems, the leading Low-Code development platform, has become a game-changer in supporting organisations to implement these trends efficiently and sustainably. OutSystems enhances cyber security As organisations increasingly rely on digital systems, cyber threats pose a significant risk. Additionally, digital engagement with customers, employees, and partners plays a vital role in a company's well-being. An organisation's immunity and resilience are now only as strong and stable as its core digital systems. Any unavailability can result in a poor user experience, revenue loss, safety issues, and more. OutSystems provides a robust and secure platform that helps build digital immune systems, safeguarding against evolving cybersecurity challenges. With advanced threat detection, continuous monitoring, secure coding practices, and AI code-scanning, OutSystems ensures applications are resilient and protected. Furthermore, the platform covers most of the security aspects for project teams, enabling them to focus on delivering high value to end customers while best practices are recommended by the platform through code analysis using built-in patterns. OutSystems simplifies cloud-native infrastructure management Cloud-native architecture has emerged as a vital component for modern application development. The OutSystems Developer Cloud Platform enables teams to easily create and deploy cloud-native applications, leveraging the scalability and flexibility of cloud infrastructure through Kubernetes. 
It allows companies to: Optimise resource utilisation Auto-scale application runtimes Reduce operational costs Adopt sustainable practices (serverless computing, auto-scaling, …) All this without the need for upfront infrastructure investment or the deep technical knowledge required to operate it, and without the burdens that typically come with it. OutSystems: gateway to AI and automation AI and hyper-automation have become essential business tools for assisting in content creation, virtual assistants, faster coding, document analysis, and more. OutSystems empowers professional developers to be more productive by infusing AI throughout the application lifecycle. Developers benefit from AI-assisted development, natural language queries, and even generative AI. Once development is complete, promoting an app to the test or production environment takes only a few clicks. The platform highly automates the process and even performs all the necessary validations and dependency checks to ensure unbreakable deployments. OutSystems seamlessly integrates with AI capabilities from major cloud providers like Amazon, Azure (OpenAI), and Google, allowing project teams to leverage generative AI, machine learning, natural language processing, and computer vision. By making cutting-edge technologies more accessible, OutSystems accelerates digital transformation and creates sustainable competitive advantages. OutSystems enables composable architecture for agility Composable architecture and business apps, characterised by modular components, enable rapid adaptation to changing business needs. OutSystems embraces this trend by providing a cloud-native Low-Code platform using and supporting this type of architecture. It enables teams to easily build composable technical and business components. 
With the visual modelling approach of Low-Code, a vast library of customizable pre-built components and a micro-service-based application delivery model, OutSystems promotes high reusability and flexibility. This composable approach empowers organisations to: Respond rapidly to changing business needs Experiment with new ideas Create sustainable, scalable, and resilient solutions OutSystems enables the creation of business apps that can be easily integrated, replaced, or extended, supporting companies on their journey towards composability and agility. OutSystems facilitates self-service and close collaboration Platform engineering, which emphasises collaboration between development and operations teams, drives efficiency and scalability. OutSystems provides a centralised Low-Code platform embracing this concept at its core, continuously extended with new features, tools and accelerators. Furthermore, the platform facilitates the entire application development lifecycle through to operations, including features like version control, automated deployment, continuous integration and delivery (CI/CD), logging, and monitoring, empowering organisations to adopt agile DevOps practices. With OutSystems, cross-functional teams can collaborate seamlessly, enabling faster time-to-market and improved software quality. By supporting platform engineering principles, OutSystems helps organisations achieve sustainable software delivery and operational excellence. OutSystems drives sustainability in IT OutSystems leads the way in driving sustainability in IT through its green IT Low-Code application development platform and strategic initiatives. By enabling energy-efficient development, streamlining application lifecycle management, leveraging a cloud-native infrastructure, and promoting reusability, OutSystems sets an example for the industry. 
Organisations can develop paperless processes, automate tasks, modernise legacy systems, and simplify IT landscapes using OutSystems 3 to 4 times faster, reducing overall costs and their ecological footprint. By embracing OutSystems, companies can align their IT operations with a greener future, contribute to sustainability, and build a more resilient planet. Wrapping it up In the era of digital transformation and sustainability, OutSystems is a powerful ally for organisations, delivering essential business innovations such as high-performance Low-Code development, cloud-native architecture, AI and automation, robust security measures, and collaborative DevOps practices. Take the OutSystems journey to align with IT trends, deliver exceptional results, and contribute to a sustainable and resilient future. Eager to start with OutSystems? Let us help!

Read more
business woman behind laptop
Reading time 5 min
8 MAY 2025

People don’t really read online. Instead, readers ‘scan’ web content for useful bits and pieces. If you truly want to cater to your website’s visitors, you should make their lives as easy as possible. How? By making your texts as scannable as possible and making it easy for your readers to find the information they are looking for. In this blog post, I’ll tell you the best ways to do that, give you some concrete tips from our own web designers and copywriters, and give you a checklist to see whether your website is as scannable as possible. Paper vs. screen If you write texts for websites, you should take into account that your reader reads very differently on a screen compared to on paper. Like I said, readers scan a lot more on tablet, smartphone and PC screens. If they do decide to really read something, they do so 25% slower compared to reading on paper. Reading tests also revealed that reading on a screen is more tiring and that readers tend to remember less of what they’ve read. Use screen fonts Screens and paper handle the legibility of letters differently. Microsoft fonts such as Verdana (instead of Arial) and Georgia (instead of Times New Roman) were designed specifically for screens by Matthew Carter. The Dutch designer Lucas de Groot created the Calibri screen font for Windows Vista. For your online writing, it’s best to choose a font that has been designed for screens instead of paper. Define what the most important information or task is Why do people visit your website? Check your website statistics (through Google Analytics, for example) or log files and think about the following questions: How do people navigate to your website? Referral, organic search, social media, …? What are your most popular pages? From what pages do people leave your website? Where do they stop reading? Find out what the most common pieces of information are that visitors are looking for on your website and make that information easiest to find. 
For example, people usually call a school to tell them their kid feels sick. The school’s homepage might include something like this to cater to their audience: Will your child be absent from school? Call or send a text to [phone number] before 8:30 AM. Don’t forget to mention: your child's full name, class, and reason for absence. Thank you! This is much more informative than a homepage that starts with ‘Welcome’, which doesn’t really help with your SEO (Search Engine Optimization) either. It’s important that every single one of your web pages displays the call to action as clearly as possible, e.g. clicking a button. Visitors want to get started immediately, so make the most important information or task stand out. Make your web content as scannable as possible The internet is a quick search medium, pretty neat for whoever needs to look something up quickly. But ill-considered web content gets lost quickly, both in search engines and in the reader’s mind. More than on paper, the reader needs short pieces of text, titles and white space. A title summarizes a whole paragraph in just a couple of words. Your titles should be informative and include the most important information. Something important is of course what the reader thinks is important or what they are looking for, not necessarily what you think is important. This provides the reader with structure and makes your text much more scannable. Other significant information should be in headings, photos and captions, short paragraphs, buttons and links. You can also use bullet lists like the one above. Arrows are particularly useful for highlighting something important. Numbers in digits work better than numbers in words. To make your web content more scannable, write 100% instead of one hundred percent. Numbers are more striking than words. 
7 tips from the ACA web designers and copywriters Limit your web page length to 3-5 screen lengths If your web content is longer and all information is relevant, consider altering your website’s structure and adding more web pages. Say the most important things first Web surfers look at the left side of their screen 70% of the time. A visitor glances over your web page roughly in the shape of an ‘F’: they start at the top left, then look at the top right, go down from the left and then glance at the right again somewhere in the middle (see heat map to the right). It’s worthwhile to put the most important information at the top left of the page and less important information towards the bottom right of the page. Write your web text from short to lengthy The first paragraph should be the shortest. The reader will only continue to read the other, longer paragraphs if this one is interesting enough. Mind the imagery Pictures, hyperlinks, bullets, … Provide one absolute eye-catcher and at most 7 visual accents per web page. Take the text-to-imagery ratio into account, as well as the fact that text to the right of or under a picture gets read much more than other text. Provide contrast between text and background color On paper, it’s black on a white background. On a screen, however, this is too taxing for your eyes. Dark grey on a white background is a much better option. One line should include 75 symbols or fewer, including spaces A sentence should never be spread over more than 2 lines. The number of lines per screen length should be 23 or fewer. A line should contain about 75 symbols or fewer, which comes down to about 12 words per line. Links should tell readers what they do Don’t use ‘Click here’ links, but rather ‘Click here to learn how to write reader-friendly emails’. Use buttons for actions like searching and registering, and hyperlinks to navigate to other pages.
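The 75-character line rule from the tips above is easy to check automatically before publishing. Below is a minimal sketch: the threshold comes from the guideline itself, while the function name and structure are our own illustrative choices, not an existing tool.

```python
# Minimal sketch of an automated check for the line-length guideline:
# flag every line longer than 75 symbols, spaces included.

MAX_LINE_LENGTH = 75  # symbols per line, including spaces (per the tip above)

def long_lines(text, limit=MAX_LINE_LENGTH):
    """Return (line_number, length) for every line exceeding the limit."""
    return [
        (number, len(line))
        for number, line in enumerate(text.splitlines(), start=1)
        if len(line) > limit
    ]

sample = "Short line.\n" + "x" * 80 + "\nAnother short line."
offending = long_lines(sample)  # only the 80-character second line is flagged
```

A check like this could run in a content pipeline or CMS hook, so overly long lines are caught before a page goes live rather than spotted by readers.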

Read more