- Taming the Giants: Large-Scale Modelling and How Surrogate Models Can Be the Right Move | Funis Consulting
Taming the Giants: Large-Scale Modelling and How Surrogate Models Can Be the Right Move Image by Pexels from Pixabay 09 Jul 2025

Large-scale models can take ages to run, slowing down decision-makers and frustrating users. Surrogate models offer a solution to this challenge. Surrogate models are simplified, faster alternatives trained on input-output data from the original model. While they aren’t physics-based, they can mimic complex models closely and deliver results far more quickly.

Large-scale modelling is the development and use of computational models to simulate systems which are very complex in nature. It is all about managing complexity: you have lots of variables and many scenarios, with often very time-consuming computations. These systems normally require the processing of large amounts of data (or variables) within wide ranges to represent real-world systems at significantly large scales, both temporally and spatially, and so they require substantial computational power or time to solve.

There are two types of large-scale models. The first type involves machine learning or statistical models trained on vast datasets - think along the lines of predictive models trained on millions of datapoints or high-dimensional data. Such models are used in many fields, ranging from finance and marketing to bioinformatics. The second type is complex mechanistic or first-principles (physics-based) models, which are grounded in physical or chemical laws, are often formulated as systems of differential equations, and are used in engineering, environmental modelling, climate science, fluid dynamics or food process simulations.

Take climate modelling, for instance: these models simulate the Earth's atmosphere, oceans, land surfaces and ice, and use fundamental laws of physics to predict how climate variables like temperature, rainfall or wind patterns change over time. Since they must cover the entire globe over decades or even centuries, they require huge computational resources. Another example is Computational Fluid Dynamics (CFD) in food processing, used to design processes such as spray drying or extrusion in food manufacturing. CFD models simulate how fluids, such as air, steam or liquids, move and transfer heat or mass. These models are based on the Navier-Stokes equations and require fine-grained spatial and temporal resolution to capture key details. Running a single scenario can take hours or days, especially if the geometry or chemistry is complex, or if the material properties vary with conditions such as temperature or pressure.

So, if you've ever worked with large-scale modelling, whether that's handling vast datasets or complex, physics-based models, you’ll know that solving or training these models can take anywhere from a few minutes to several weeks, if not more. This time lag can be frustrating, especially for end users who may not fully understand why the results take so long. Often, this becomes a barrier to adoption. The good news is that there is a way around this!

Surrogate models are simplified mathematical versions of your original model, constructed using the outcomes of simulations from that full-scale model. By running the original model under a variety of starting conditions or inputs, you collect a range of outputs. Provided the underlying model is robust, these input-output pairs can be used to train a new, much faster model that mimics the behaviour of the original.
While this surrogate model won't be rooted in physical laws, it will be built on sound data generated from a model that is. That said, two critical questions arise: how many runs of the original model do you need, and will generating them take so long that building the surrogate model is no longer practical or feasible? The answer depends on several factors, mainly the complexity of your model and the breadth of the input space that you want to explore. If you're dealing with many variables across wide ranges, the effort required might be substantial. Still, it could be worthwhile. Surrogate models can offer results orders of magnitude faster than the full models. Building one isn’t always straightforward, but if it makes your work more accessible and widely used, it might just be the right move.
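As an illustration of the workflow described above, here is a minimal sketch in Python, assuming a scikit-learn setup. The expensive_simulation function is a hypothetical stand-in for your full-scale model, and the sampling plan is deliberately simplistic.

# A minimal sketch of building a surrogate model with scikit-learn.
# "expensive_simulation" is a hypothetical stand-in for a full-scale model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    # Placeholder for the full-scale model: in reality this could take hours per run.
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

# 1. Sample the input space (here: 2 inputs, each in [0, 1]) and run the full model.
rng = np.random.default_rng(42)
X_train = rng.uniform(0.0, 1.0, size=(50, 2))          # 50 training runs of the full model
y_train = np.array([expensive_simulation(x) for x in X_train])

# 2. Fit a fast surrogate (here a Gaussian process) to the input-output pairs.
kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(X_train, y_train)

# 3. The surrogate now predicts new scenarios in milliseconds, with an uncertainty estimate.
X_new = rng.uniform(0.0, 1.0, size=(5, 2))
y_pred, y_std = surrogate.predict(X_new, return_std=True)
print(y_pred, y_std)

In practice the number of training runs and the choice of surrogate (Gaussian process, random forest, neural network) would depend on the dimensionality of the input space and the smoothness of the response.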
- FAQs | Funis Consulting
Find the answers to the most frequently asked questions about Funis Consulting Ltd's services and how we work. Frequently Asked Questions Read our frequently asked questions. If you cannot find the answer to your question here, please send us an email on info@funisconsulting.com or Contact Us

What does Funis Consulting offer? At Funis Consulting, we focus on two key areas: i) Scientific Computing (Mathematical, Physical and Chemistry Modelling & Simulation including Process Modelling, Advanced Machine Learning & AI, Data Analysis & Visualisation, Code Optimisation, Process Optimisation) ii) Scientific Consultancy (Chemistry and Physics Consultancy with a strong focus on FMCG and Food Manufacturing Industries)

In terms of Scientific Computing we specialise in Modelling & Simulation, ranging from chemistry and physics-based models, for instance to see how various chemicals react under certain conditions, to modelling entire processes for manufacturing companies. In Modelling & Simulation we use data-driven methodologies and/or First-Principles models, either on their own or in conjunction with each other, to help businesses tackle complex challenges with precision and efficiency. Data-driven models can be based on data analysis, statistics, AI, and advanced machine learning, whereas First-Principles, or physics-based, models can be based on solutions like reaction kinetics, Computational Fluid Dynamics (CFD), Finite Element Analysis (FEA), Smoothed Particle Hydrodynamics (SPH), Discrete Event Simulation (DES) and Discrete Element Modelling (DEM). Our expertise spans mathematical, chemical and physical modelling, computational chemistry, machine learning & AI and GPU-powered, high-performance computing (HPC). We also offer Code Optimisation, Process (Parameter) Optimisation and Data Analysis & Visualisation. These services are most of the time integrated into the models that we create; however, we can also provide them individually. Funis Consulting both uses off-the-shelf tools and develops tailor-made software solutions from scratch, according to the client’s requirements or the product / process in question. We combine our strong scientific knowledge with technology to create scientific software. Further to the above, we are software-agnostic, meaning we are also able to integrate seamlessly with our client’s preferred programming language and software.

Our second area of expertise is Scientific Consultancy with a strong focus on the FMCG and Food Manufacturing Industries. We provide expert consultancy services in Chemistry and Physics on either the product formulation or the manufacturing processes, from the chemical and physical side of things. In some cases we use Modelling & Simulation solutions, or other more technology-based solutions, for such Consultancy, depending on the case. Some of the services we provide are in research and development, product formulation, product innovation, and process optimisation. We also assist in designing and analysing shelf-life studies, ensuring your products meet regulatory requirements as well as market expectations.
Our core services therefore are:
- Scientific Tailor-Made Software Development
- Process & Parameter Optimisation
- Data Analysis & Visualisation
- Mathematical, Physical & Chemical Modelling
- Process Modelling
- Code Optimisation & High Performance Computing (HPC)
- Machine Learning & AI Integration
- Scientific (Chemistry and Physics) Consulting for FMCG and Food Manufacturing industries

We can work across many industries including, but not limited to, FMCG, Food Manufacturing, flavours and fragrances, pharmaceuticals and healthcare, banking, finance and economics, transportation including automotive and aerospace, utilities and government departments, manufacturing, energy, environmental and agriculture, research and academia.

What core services do you offer? The core services offered by Funis Consulting are:
- Custom Software Development: Tailored scientific applications and business tools
- Process & Parameter Optimisation: Enhancing efficiency through advanced algorithms
- Data Analytics & Visualisation: Transforming complex data into actionable insights
- Mathematical, Physical & Chemical Modelling: Developing high-precision simulations
- Code Optimisation & HPC: Leveraging GPU computing for superior performance
- Machine Learning & AI Integration: Driving innovation through smart automation
- Scientific Consulting for FMCG: From new product development to food chemistry insights

What kind of project engagement models do you offer? We offer flexible project structures to suit businesses of all sizes. Our approach can be fixed-cost or time-based, depending on your specific needs. Plus, our first online consultation, to understand whether Funis Consulting is for you, is always free, with no obligation. If you’re facing a complex challenge or have a project in mind, we’d love to work with you. Talk to us if you need more information and take a look at some of our clients and partners by going to our homepage.

Which industries can make use of your services? We can work across many industries including, but not limited to, FMCG, flavours and fragrances, pharmaceuticals and healthcare, banking, finance and economics, transportation including automotive and aerospace, utilities and government departments, manufacturing, energy, environmental and agriculture, research and academia. Not sure whether we can help your particular business? Don’t worry, drop us an email and ask us, or fill in the contact form and we shall get back to you as soon as we can.

How can my business benefit from Modelling & Simulation? Data-driven models can be powerful tools, especially when your system involves a lot of different variables which can be difficult to make sense of through simple plots or linear relationships. We can help you make sense of your data and build input-output relationships to help you understand, and control, your system better using statistical and machine learning methods. The main limitation of data-driven models is predicting outcomes beyond the available data range, which can lead to significant uncertainties. Models using fundamental physical laws, by contrast, offer greater flexibility in forecasting changes to operating conditions and provide deeper insights into system behaviour. At Funis Consulting we also do mathematical, chemical and physical modelling and simulation such as Computational Chemistry, Computational Fluid Dynamics (CFD), Finite Element Analysis (FEA) and Discrete Element Modelling (DEM).
From simple chemical reactions to complex physical-chemical processes, we develop custom modelling solutions across industries. Our expertise in equation-based modelling enables us to deliver accurate simulations, helping businesses improve predictive accuracy, optimise processes, and gain a deeper understanding of their operations. Contact us on info@funisconsulting.com for more information and follow us on Linkedin to stay tuned to our insights. What are the benefits of Code Optimisation? With expertise in the latest optimisation techniques, we enhance existing code or develop high-performance solutions from the ground up. We can optimise both CPU and GPU-based code and our software-agnostic approach ensures compatibility with any programming language, allowing seamless integration into your current systems. Whether you're accelerating simulations, optimising machine learning workflows, or improving large-scale data processing, we help you achieve faster, more efficient results. GPUs have revolutionised scientific computing and machine learning, dramatically increasing computational efficiency. When code is optimised effectively, GPU computing can reduce model-building and simulation times from months to minutes, unlocking new levels of performance. Read our article on code optimisation and Follow us on Linkedin for more updates. If you would like further information about how code optimisation can help you contact us. How can Process / Parameter Optimisation help my business? Process / Parameter Optimisation plays a crucial role across industries. If you know the desired outcome but aren’t sure which variables to consider or how to manipulate them, that’s where we step in. Every business generates valuable data, and when leveraged correctly, this data becomes a powerful tool for building models that answer critical questions. Whether you need a simple linear model or advanced techniques like neural networks, we have the expertise to guide you. Let us help you optimise your processes and parameters, using data-driven models to achieve the best possible outcomes, faster and more efficiently. Contact us to discuss further whether Process and Parameter Optimisation can be for you. How does Data Analysis & Visualisation benefit my business? Data is the backbone of any model, whether it’s empirical, data-driven, or based on first-principles physics and chemistry. However, the real value comes from interpreting and applying data correctly. While gathering the right data is essential, it’s how you analyse and visualise it that drives meaningful insights and informed decision-making. At Funis Consulting, we help you navigate the complexities of data analysis and visualisation. We guide businesses in selecting the best methods and tools tailored to their specific needs, ensuring the data collected is relevant, high-quality, and ready for analysis. With our experience across various industries, we make sure the right data is used to build robust, reliable models. Once your data is gathered, it's the interpretation that unlocks its true potential. Raw data alone isn’t enough, it needs to be transformed into clear, compelling insights. Using advanced machine learning techniques and AI integration, we ensure that your data is analysed to reveal actionable strategies that drive smarter, data-driven decisions. We create intuitive visualisations that simplify complex datasets, enabling you to communicate insights effectively to stakeholders. 
With clear, well-structured visualisations, you can confidently present your findings, ensuring transparency, credibility, and a stronger foundation for decision-making. By leveraging the power of data analysis and visualisation, your business can unlock smarter, more strategic decisions, enhancing efficiency and driving innovation. Contact us today to discuss more. How can Food Science & Technology Consultancy help my Company? In the fast-paced and ever-evolving FMCG industry, staying ahead of the competition requires a deep understanding of the science behind food processes and manufacturing. Whether you're formulating a new product, refining an existing one, or exploring innovative food alternatives, a science-driven approach is key to optimising quality, functionality, and consumer appeal at every stage. At Funis Consulting, we offer the scientific expertise you need to navigate these challenges. With years of experience across a wide range of food and beverage products, from refrigerated items with short shelf lives to sterile products, we help businesses enhance product stability, improve shelf life, and ensure compliance with regulatory standards. Our expertise spans food chemistry and physics, enabling you to maintain optimal taste, texture, and nutritional value. We also provide consultancy in formulation, process optimisation, and packaging solutions, ensuring your product is both innovative and practical. Additionally, we assist in designing and analysing shelf-life studies to help you develop the right product for your market. Our team ensures that your product complies with all relevant regulations, including accurate ingredient lists, nutritional information, and compliant claims. With our support, you can confidently bring your product to market without delays, ensuring it’s market-ready and positioned for success. Contact us today to discuss further how food science and technology can help your business. How can I book an appointment with Funis Consulting? Just send us an email on info@funisconsultancy.com or fill in the Contact us form and we will get back to you as soon as we can. I am not very technical, however I would like to understand whether Funis Consulting can help my business. Will I be able to understand the solutions that Funis Consulting is proposing? Funis Consulting provides scientific and technical solutions which are generally complex in nature, however we do our very best to explain things in a manner which is easy to understand from the first meeting, so as to cater for all audiences. Funis Consulting ensures that its solutions are designed with the end-user in mind. All solutions can be made to feel easy to the user, as well as ensuring that such solutions are sustainable and human-centric. Further to this we can provide training and support on-site or online as the situation requires. We have experience with training and teaching people, and so we feel that we are in a position that we can assist stakeholders from all parts of the business. In addition, we speak English, Maltese and Italian fluently. How can I reach you? You can reach us by email on info@funisconsulting.com or by filling in the Contact us form. What fees do you charge? The fees for our services are very much contingent on the project involved. It would be very difficult for us to give you a project cost before we understand the scope of your project. Contact us so that we can understand your needs better. Is it possible to have an on-site first meeting with you? 
First consultations / meetings held online or on the Maltese islands for the purposes of understanding whether Funis Consulting is for you, are free of charge. If you wish to have a first on-site consultation but you are located abroad, this can be arranged, however it will be against a fee, as this would involve travelling. Contact us to book your meeting with us. Can Models be made to feel easy and user-friendly? Will staff that is not technical be able to use the technology with ease? Funis Consulting ensures that its solutions are sustainable and human-centric. This means that the software, models or technologies are made to feel easy-to-use as we keep the end-user in mind at all times while designing the solution. For example, different outputs may be required by different people, or the same dataset may be displayed differently to cater for different audiences. This is made possible by designing friendly interfaces while complex algorithm-based technologies are running in the background for you. We also provide training sessions for different audiences, as well as support. Training and refresher training is also available, both in person as well as online as the situation requires. Talk to us to find out how our solution can help you, your team and your business grow.
- Funis Consulting Ltd | Modelling & Simulation
Funis Consulting Ltd specialises in custom software development for scientific applications, modelling, and simulation. Through mathematical modelling, advanced machine learning and GPU computing, we specialise in data analysis, mathematical, physical and chemical modelling, process optimisation and high-performance computing. We also provide consulting services in food science and technology, and new product development.

About Us
Funis Consulting Ltd is a dynamic company based on the Maltese islands (European Union) and was founded by Robert Cordina, who brings over 20 years of experience working with startups, multinational corporations, academia, and government agencies. Driven by a passion for science, technology, and innovation, Funis Consulting specialises in custom software development and consulting services for a wide range of industries.

The first key area is Modelling and Simulation, encompassing both data-driven approaches and first-principles/physics-based methods. Our expertise spans machine learning, mathematical, chemical and physical modelling, process/parameter optimisation, and high-performance computing, allowing us to tackle complex challenges with precision and efficiency, using both custom and off-the-shelf software. Whether it’s data-driven modelling – such as data analysis, statistics, advanced machine learning, and AI – or physics-based approaches like chemistry models (reaction kinetics, Molecular Dynamics (MD) and thermodynamics), Computational Fluid Dynamics (CFD), Finite Element Analysis (FEA), Smoothed Particle Hydrodynamics (SPH), Discrete Element Modelling (DEM) or Discrete Event Simulation (DES), Funis Consulting develops and integrates bespoke software and modelling solutions using your software/programming language of choice to enhance decision-making, automate processes and optimise business operations, driving efficiency and innovation and creating real-world impact. Whether through advanced simulation techniques, machine learning integration, or GPU-accelerated computing, we help businesses remain at the cutting edge of innovation.

The second specialisation is scientific consultancy for the FMCG industry. With a strong background in chemistry and physics, we support businesses in key areas such as research and development, product renovation and innovation, product formulation, NPD and chemistry insight, ensuring your business stays ahead in a rapidly evolving industry.

Core Specialities:
- Build and integrate smart, custom software, from scientific applications to business tools
- GPU & High-Performance Computing (HPC)
- Scientific Computing and Mathematical, Physical and Chemical Modelling & Simulation
- Process / Parameter Optimisation
- Code Optimisation; we are software agnostic for seamless integration into your current systems
- Making sense of complex data through Data Analytics & Visualisation
- Advanced Machine Learning and AI integration
- Scientific Consulting specialising in FMCG industries, from new product development to chemistry insights

We offer flexible project structures tailored to businesses of all sizes, whether large or small. Our approach can be fixed-cost or time-based, depending on your project’s needs. Plus, our first online consultation/meeting, to see how Funis Consulting can help, is always free, with no obligation. Send us a message to discuss more about your project by clicking here. Check out our FAQs and Services sections to know more about what we do.
Follow our Blog for our insights into the world of Scientific Computing and Scientific Consultancy.

Services
At Funis Consulting Ltd, we offer a range of services across different industries, to help businesses stay at the forefront of innovation, efficiency and excellence. Through our modelling and simulation, as well as process optimisation services, we tackle complex challenges using both custom and off-the-shelf software, depending on your needs. Whether it’s leveraging data-driven models or physics-based simulations, here at Funis Consulting Ltd we deliver the precision and reliability that your Company is after in order to stay ahead in the ever-evolving landscape of innovation. Read More About Our Services

Bringing Science, Technology and Innovation together

Beyond the Numbers: Gaining Insights Through Data Analysis and Visualisation
Collecting data is essential, but its true value lies in the ability to interpret and apply it effectively. Businesses must make sense of the data they gather to drive meaningful insights and informed decision-making. With the right analysis and interpretation, companies can transform raw data into actionable strategies. By leveraging advanced machine learning techniques and AI integrations, businesses can unlock smarter, data-driven decisions that enhance efficiency and innovation. Read More About Our Services

Food Science Unlocked: Expert Consultancy for Smarter Solutions
A strong understanding of the science behind food processes and manufacturing is key to developing successful products and staying competitive in an ever-evolving industry. Whether it’s creating a new formulation, refining an existing product, or exploring innovative food alternatives, a science-driven approach ensures that every step is optimised for quality, functionality, and consumer appeal. By applying expertise in food chemistry and physics, businesses can enhance product stability, improve shelf life, and meet regulatory requirements while delivering on taste, texture, and nutritional value. With the right scientific insight, your company can navigate challenges more effectively and bring groundbreaking food innovations to market with confidence. Read More About Our Services

Past and Present Clients and Partners
- Blog | Funis Consulting
Keep up-to-date with our latest insights on scientific computing, modelling & simulation, data analysis and food science for innovative solutions. Follow Our Posts on Linkedin Follow Funis Consulting Ltd's Page on Linkedin 21st March 2025 Partnership News We are delighted to announce our new partnership with Smart Vision Europe Ltd. Read more news Follow us on Linkedin 13th March 2025 Partnership News We’re thrilled to announce our collaboration between Funis Consulting Ltd and Sweeft Analytics. Read more news Follow us on Linkedin Read our Social Media Articles
- Modelling: A Thoughtful Approach to Understanding and Innovating | Funis Consulting
Modelling: A Thoughtful Approach to Understanding and Innovating Image by Arek Socha from Pixabay 02 Jul 2025

At its heart, modelling is about making sense of complexity, allowing us to simulate outcomes, test ideas virtually and optimise systems before we even step into the lab or onto the production floor. In this article we walk through how we approach model building, not in heavy tech-jargon but in plain, accessible language. If you are curious about how modelling could support your work, or simply want a better understanding of what it actually involves, take a look.

Creating a model, at its core, is a way to understand the world. It is a process that allows us to translate complexity into clarity and helps us make better decisions, especially before heavy investment is made or before important decisions are taken. It also helps uncover hidden behaviours and relationships, predicting what might happen next and pointing to the best possible outcome, or the best variables to use, for a given solution or scenario.

Modelling starts with the purpose that you are trying to achieve. A model should never exist in a vacuum or in isolation, as it is built to answer questions and to solve problems. These could be how a product will behave under certain circumstances, simulating a process to reduce waste, or testing what happens when you tweak one or more variables in a production line. Or even understanding how proteins behave in food systems, building economic models, understanding how manufacturing processes respond to different settings and mechanistic variables, or how a new formulation might perform before ever stepping into a lab or creating a first prototype. In essence modelling becomes a powerful ally, with the first step being understanding what you are trying to find out, or what problem you are trying to resolve.

Once the purpose is clearly defined, the next task is to get to know the system, and this mostly comes with breaking it down. Understanding what goes in, what comes out and what happens in between is essential. In a food context, for instance, this might involve ingredients, temperatures, moisture levels, microbial interactions, how all of these connect to each other and how they evolve together.

After understanding the system, we then choose how to represent it. Some models rely on known scientific principles, often called mechanistic models. Other models are driven by data, picking up patterns that might not be obvious to the human eye. The best results sometimes come from blending the two into a hybrid approach, combining first-principles and data-driven models. The choice depends on the nature of the problem, the data available, and how much knowledge of the system, as well as quality data, you have.

Data is of course essential. It may come from experiments or previous studies. This information becomes the fuel for the model, shaping it, calibrating it and giving it meaning. After this, you build. Models are built using tools such as Python, R or specialised simulation software, translating the system into something the computer can work with: an equation, a set of rules or a learning algorithm. We test the model and compare its behaviour to reality, refining it as we go. It is an iterative process, often requiring a fair bit of tweaking and creative thinking.
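To make the build-calibrate-check loop above concrete, here is a minimal sketch in Python. The first-order decay model and the measurements are purely illustrative, not taken from a real project.

# A minimal sketch of the build-calibrate-check loop described above.
# The first-order decay model and the data points are purely illustrative.
import numpy as np
from scipy.optimize import curve_fit

def first_order_model(t, c0, k):
    # Mechanistic representation: a quantity decaying at rate k from an initial value c0.
    return c0 * np.exp(-k * t)

# Hypothetical measurements (e.g. moisture or concentration over time).
t_obs = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
y_obs = np.array([10.0, 7.4, 5.6, 3.1, 0.9])

# Calibrate the model parameters against the data.
(c0_fit, k_fit), _ = curve_fit(first_order_model, t_obs, y_obs, p0=[10.0, 0.3])

# Compare model behaviour to reality; large residuals mean another iteration of refinement.
residuals = y_obs - first_order_model(t_obs, c0_fit, k_fit)
print(f"c0 = {c0_fit:.2f}, k = {k_fit:.2f}, max residual = {abs(residuals).max():.2f}")

The same loop applies whether the representation is a mechanistic equation, as here, or a machine learning model trained on the data.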
Once a well-built model is ready it can offer a new lens for decision-making. It can help companies predict outcomes without trial-and-error, reduce costs by optimising processes, accelerate innovation timelines and make informed strategic choices, especially in environments where real-world experimentation is time-consuming, risky or expensive. Here at Funis Consulting, we see modelling not as a technical afterthought but as a cornerstone of innovation. By combining scientific understanding with modern computational tools, we help clients simulate, predict and optimise across a range of domains, from food systems to industrial processes and beyond. While every model we build is unique to the problem at hand, the goal is always the same: to make the invisible visible and the complex comprehensible. Modelling is about numbers and equations, but it is also about insight and foresight. Increasingly, modelling is becoming a quiet but essential driver of smarter business in a world where agility and accuracy matter more than ever.
- Process Modelling & Simulation: Calibrated Dynamic and Steady-State system infrastructures (Part 2 of 3) | Funis Consulting
Process Modelling & Simulation: Calibrated Dynamic and Steady-State system infrastructures (Part 2 of 3) Image by Oberholster Venita from Pixabay 23 Apr 2025

Modelling can be done on a system which is constantly in a dynamic state or on a system which is in a steady state. Transient behaviours can be embedded in a dynamic model, whereas steady-state models are used to simulate a system which is expected to behave in a much more stable manner. Which to use depends on the question or problem you are trying to resolve.

Process Modelling & Simulation can be carried out on a single process or on a combination of different processes, combining these together to gain a holistic understanding of your system. You can use Process Modelling to model a dynamic system (a system changing over time) or a steady-state system (a system working when all the processes have been coupled together and equilibrated). In dynamic system process modelling, the system is constantly changing and therefore the variables are never constant, sometimes changing drastically and at a high frequency. This means that dynamic systems are influenced by variability, and in the context of a new manufacturing line this could mean that you are modelling a process which is constantly changing. An example of this is when you have a manufacturing line with frequent product switches. Another example of such a dynamic process is when you want to understand the impact of transient behaviours, such as when product or resource changeovers are carried out, or what happens during peak times. In this case Discrete Event Simulation (DES) is the modelling type which is most used. In the example of the manufacturing line you are essentially modelling the flow of products manufactured (you can model from raw material state all the way to a finished good), but also factoring in elements such as the people, the behaviours, the resources and the constraints, and then simulating multiple what-if scenarios. So, you are essentially modelling a real-life situation, or in this case a manufacturing line, but in a digitalised format. You can “play” around or test multiple scenarios, in a safe digital space, until you are ready to implement in real life, once the optimum settings have been found.

A steady-state system, on the other hand, is a model of a system which is already calibrated and where everything is running in a stable state. In manufacturing, for instance, this would be a system running at a constant rate, such as when you are focused on chemical or thermal processes. There are no changes being made to the system, and thus there are no changes to the system’s output. Imagine if we were to run multiple tests of chemical reactions taking place in a chamber (therefore, without any interference to the process). What we would model is a system in a digital environment which has no transient behavioural elements, so once all of the coupled systems have converged we know how the system will perform. In this case the modelling techniques used may vary. So, dynamic models factor in changes, including human interaction and behaviour, as well as constant or frequent changes to the process. On the other hand, steady state is when there are no changes made to a process, and thus the process should reach a stable operation.
In certain industries dynamic models are used more in discrete manufacturing, where there is an element of resource usage, frequent changes over time, human interaction, coordination between automation and manual processes, or settings and environmental changes. Dynamic models are more operational. Steady-state models, on the other hand, are less about the operational aspect and more about how a system behaves when the variables are not changing. In a model, whether dynamic or steady-state, you can add as many variables as you need. Some simple examples of these are costs, throughputs, chemical reactions, mixing and even random events (for a dynamic system), and many more, depending on the model you are building and the problem or question you are trying to answer. These variables do not have to be modelled in isolation; they can be coupled and modelled together all at once in one larger model. This gives you a holistic picture of the system and how it works once calibrated, whether as a dynamic system or a steady-state system.
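As a simple illustration of the distinction, here is a minimal sketch in Python of a single stirred, heated tank, with all values chosen purely for illustration: the steady-state view solves the balance with the time derivative set to zero, while the dynamic view integrates the same balance over time to capture the transient.

# A minimal sketch contrasting steady-state and dynamic views of one simple system.
# All numbers are illustrative placeholders, not real process settings.
import numpy as np
from scipy.optimize import fsolve
from scipy.integrate import solve_ivp

F_in, T_in, V = 2.0, 20.0, 10.0      # feed rate, feed temperature, tank volume (illustrative)
Q, rho_cp = 150.0, 4.0               # heater duty and volumetric heat capacity (illustrative)

def dT_dt(t, T):
    # Single energy balance on a stirred, heated tank.
    return [(F_in * (T_in - T[0]) + Q / rho_cp) / V]

# Steady-state answer: the temperature at which the derivative is zero.
T_ss = fsolve(lambda T: dT_dt(0.0, T), x0=[25.0])[0]

# Dynamic answer: how the temperature actually gets there from a cold start.
sol = solve_ivp(dT_dt, t_span=(0.0, 60.0), y0=[20.0], max_step=0.5)
print(f"steady state ≈ {T_ss:.1f} °C, value after 60 time units ≈ {sol.y[0, -1]:.1f} °C")

In a real process model there would be many such balances coupled together, but the contrast between the two questions stays the same.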
- Harnessing the Power of Optimisation | Funis Consulting
Harnessing the Power of Optimisation Image by Elchinator from Pixabay 12 Mar 2025

We have all been there: seeing a process and thinking there must be a better way to do this, perhaps even achieving a better, more accurate output. It could be a software flow, a manual process or even an entire system; optimisation helps businesses find and implement improvements that can have a huge impact on the business.

Certain processes can be far too complicated when they do not need to be. This means that resources’ time is wasted, leading to sub-optimal productivity within a Company. The more complicated processes are, the higher the risk of human errors and setbacks, holding Companies back from moving projects and innovation forward and from focusing on what really matters. Every system has its own pace, but when inefficiencies start to negatively affect a Company, it is a good idea to pause for a moment, take a closer look at the different components and tools in place, and see where optimisation can make a real change to your business. Process Optimisation can truly help businesses make that transformation, enabling teams to focus and spend their time and energy on what’s important. Optimisation can bring a number of benefits to companies and can be used across all sectors, be it public policy, governmental planning, pharmaceutical, biotechnology, transportation, mobility services, manufacturing and operations, FMCG, supply chain and logistics, healthcare, medical applications and finance, just to name a few.

To understand Optimisation one has to first understand Predictive Modelling. In Predictive Modelling, as long as we know the input x and the relationship between x and y (or f(x)), we are able to predict the output, y. You might be familiar with the example below from your school days, which illustrates the equation of a straight line: y = mx + c, where m is the gradient (or slope) and c is the intercept. In process optimisation m and c could be your process settings. Here, by knowing x and f(x), you are able to predict the output, y.

Graph showing correlation between x and y

So, taking the example above, Optimisation comes in when you need to know m and c, by knowing your input (x) and what you want to get out (y). Therefore, starting from the desired output (y), a known variable, we need to understand the relationship between x and y, i.e. f(x), which is unknown. We do this by utilising the data that is known to us. Therefore, Optimisation is when you find out what variables you need to deploy, and in what manner, in order to get to the desired result or output. Optimisation works by attempting various iterations or value changes in the unknowns (in this case, m and c, our process settings), varying these until we reach what is called a zero loss (0 Loss) and hence achieve the desired output, y. In this way, we are discovering the parameters needed to get to the desired y. Optimisation can be single-objective or multi-objective, with the latter having more complexity, which might make obtaining a 0 Loss very difficult. In such cases, one finds what is called the global minimum, which essentially is the closest possible to a 0 Loss scenario. In Optimisation, a specialised algorithm is used to run the simulations, according to a set of chosen rules and weights attributed to the different rules. Let’s take for instance a multi-objective process Optimisation in a manufacturing setting.
Imagine a number of different ingredients which need to be combined, each bearing different pricing, processing times and various constraints. A specialised algorithm helps in determining the variables and how these are to be deployed in order to get to the desired product / output: the best possible product, manufactured within a certain time, at a certain cost and of a certain quality. With a Random Sampling technique, when working with such a large number of variables and permutations, the higher the number of samples or iteration runs, the closer you get to a 0 Loss, and therefore the more accurate the output. This, however, leaves the probability of finding the global minimum up to chance. With a Bayesian Optimisation technique we can reach the global minimum in a much more focused manner, taking far fewer iterations to do so, especially in a multi-variate scenario, which makes it the preferred method for Optimisation.
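To keep the straight-line example from above concrete, here is a minimal sketch in Python of recovering the unknown settings m and c by driving a squared-error loss towards zero; the data points are made up for illustration.

# A minimal sketch of the idea above: recover the unknown settings m and c
# by driving a loss towards zero. The data points are purely illustrative.
import numpy as np
from scipy.optimize import minimize

# Observed inputs and the outputs we want the process to reproduce.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_target = np.array([3.1, 5.0, 6.9, 9.1, 11.0])    # roughly y = 2x + 1

def loss(params):
    m, c = params
    y_pred = m * x + c
    return np.sum((y_pred - y_target) ** 2)         # squared error: 0 means a perfect match

result = minimize(loss, x0=[0.0, 0.0])              # iterate until the loss stops improving
m_opt, c_opt = result.x
print(f"m ≈ {m_opt:.2f}, c ≈ {c_opt:.2f}, loss = {result.fun:.3f}")

For expensive, multi-objective problems, the simple minimiser here would typically be replaced by a Bayesian optimisation routine (for example, scikit-optimize's gp_minimize), which chooses each new trial point based on all previous ones rather than at random.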
- The Science and Value of Finite Element Analysis (FEA) in Food Packaging: Food packaging plays a crucial part in complex supply chains (Part 2 of 2) | Funis Consulting
The Science and Value of Finite Element Analysis (FEA) in Food Packaging: Food packaging plays a crucial part in complex supply chains (Part 2 of 2) Image by Pexels from Pixabay 11 Jun 2025

Behind every sealed lid there is a world of science and simulation, where temperature shifts, compression forces and impact drops are tested in a virtual setting before the physical prototype is built. This enables precision and removes the guesswork, in order to bring to your homes food that is safe, fresh and intact, enabling high-quality products. The future of food is about smarter design to reduce waste, increase performance and take faster decisions.

Last week, in our article "The Science and Value of Finite Element Analysis (FEA) in Food Packaging: Packaging is more than a mere container for your food product. (Part 1 of 2)", we spoke about how FEA can help companies make better decisions as to which packaging design to go for, when considering various variables. Today we are going to give a more practical example of how food packaging plays a fundamental part in supply chains, ensuring the product arrives safely on our tables at home.

Let’s take as an example that we are designing packaging for a chilled ready meal that needs to be transported across a regional supply chain to be sold in supermarkets for the end-user to enjoy. The conditions are that the product has to stay below 5 °C, remain intact during the various transportation stages, and has a short shelf-life of 7 days from end of production to consumption. Through Finite Element Analysis, or "FEA", we can make use of thermal modelling to understand the behaviour of the meal during the shifts in temperature across different transportation, loading and unloading, and storage scenarios. It helps us predict how well the packaging design insulates the product across various conditions and temperatures. By creating a virtual model and running simulations, you can compare various materials to understand thermal conductivity and insulation, and assess whether additional packaging design features such as layers or vacuum sealing are needed for extra protection.

Through FEA’s structural analysis and simulation you understand how the packaging will endure stressors such as stacking in a warehouse, by applying virtual compression and impact forces to see whether it survives the pressure, what happens if the product is dropped from a certain height and whether the seal will hold under different pressures. As you see the behaviour, you can tweak and optimise the design, for example the packaging’s thickness, creating stronger corners or reinforcing the lid, while balancing these additional protective features against wasting excess material. It is about finding the right balance of cost, quality and sustainability, as well as finding the lightest, lowest-cost material that meets the requirements. Other considerations are also factored in, such as whether the packaging will fit securely in standard crates for handling by retailers, how the design will fit on a pallet, whether its shape and rigid form will allow automated handling in a warehouse, and how it will fare under different temperature shifts.
By using data-driven and physics-based modelling and simulation early in the development process you can reduce the number of physical prototypes needed, reduce packaging failures and take faster, better-informed decisions on the right packaging design to choose, in line with business, technical and sustainability goals. This improves cost, quality and speed, and makes food systems better. The packaging development process becomes proactive and strategic, rather than a trial-and-error exercise that places a burden on companies and societies. It is a smarter way to understand how you can deliver to your consumers in the right manner. Funis Consulting works at the intersection of R&D and Innovation through the use of modelling and simulation techniques. Whether it is understanding new materials, improving the robustness or cost-efficiency of your current design or systems, or reducing environmental impact through smart design, the opportunities for meaningful change in foods and food systems are there and they are vast. We believe in a better way to do things, to create real-world impact for a better world.
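Full packaging simulations are built in dedicated FEA software, but the kind of thermal question described above can be illustrated with a much simpler model. Below is a minimal 1-D finite-difference sketch in Python (not a finite element model) of heat crossing a packaging wall during a warm loading-bay delay; the material values and times are illustrative placeholders, not measured properties.

# Not a full finite element model, but a minimal 1-D finite-difference sketch of the
# thermal question described above: how quickly does heat cross a packaging wall?
# All values are illustrative placeholders.
import numpy as np

alpha = 1.0e-7        # thermal diffusivity of the packaging material, m^2/s (illustrative)
thickness = 0.01      # wall thickness, m
n = 21                # grid points across the wall
dx = thickness / (n - 1)
dt = 0.4 * dx**2 / alpha          # stable explicit time step

T = np.full(n, 4.0)               # wall starts at the chilled product temperature, 4 °C
T_outside = 25.0                  # ambient temperature during a loading bay delay

for _ in range(int(1800 / dt)):   # simulate 30 minutes
    T[0] = T_outside              # outer surface follows ambient
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                 # inner surface treated as insulated, for simplicity

print(f"inner surface temperature after 30 min ≈ {T[-1]:.1f} °C")

A real analysis would resolve the full 3-D geometry, the food itself and convection at the surfaces, which is exactly where FEA earns its keep.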
- Process Modelling & Simulation: Calibrated system infrastructures for when failing is not an option. (Part 1 of 3) | Funis Consulting
Process Modelling & Simulation: Calibrated system infrastructures for when failing is not an option. (Part 1 of 3) Image by Pete Linforth from Pixabay 16 Apr 2025

Sometimes you need scientifically-backed systems to make the right decisions. When you need to make certain decisions which carry a lot of “weight”, you need to ensure that the study is done on a sound virtual infrastructure, because failing is not an option. Mathematical and physical principles provide a reliable way of working out what is happening, and what can happen. Using these in a structured framework via Process Modelling & Simulation, you can have the peace of mind that multiple scenarios have been tried and tested and that the best “settings” for your system have been determined so that you can achieve the desired result.

In one of our previous articles, Harnessing the Power of Optimisation, we went through the basics of Process Optimisation, where we said that if you know what you want to achieve, you can reach that goal by understanding what variable values to use through modelling & simulation and algorithm-based models. A model can be physics-based or chemistry-based, or it can be built using your data, which, together with correct data analysis and the proper visualisations, can enable a business to take the right decisions and to ensure the right mechanisms are adopted to achieve the desired result. Process Modelling & Simulation is crucial in many types of businesses, especially ones where getting it right is a must. Some businesses incur huge losses if they fail to achieve the optimal state or a 0 Loss scenario, and therefore, through Process Modelling, variables are calibrated in such a way as to ensure the achievement of the goal or the best possible outcome. There isn’t only the monetary element at stake; sometimes certain situations have an impact on a national scale, or on the wellbeing of people, and so Process Modelling & Simulation is a must to ensure that the right thing is done and the right decisions are taken.

Let’s take the scenario of a company that wants to build a new manufacturing line in which the target end-of-line output is 1000 packets a minute according to a set of criteria or specifications (e.g. 1000 packets, containing 10 units each, of a certain quality, length and weight, with the right amount of materials per unit and so on). If you wanted to carry out Process Modelling you would need to start by creating a virtual/digital representation of the manufacturing line. This helps to identify the problems that are to be avoided in real-life scenarios when the physical manufacturing line is built. Through various modelling and simulation techniques, one of which is Discrete Event Simulation, or DES, you can change the variables with no repercussions, as all is done in a virtual environment. By testing such variables, you can identify inefficiencies and constraints, assess the impact of any changes to the system, identify the most important variables/parameters and, through data analysis & visualisation and specific testing, ensure data-based decisions are made as well as optimisation for the best throughput of your system. This reduces waste, thus saving costs, as well as ensuring that no disruption of operations takes place in the eventual real-life scenario. In a manufacturing line, some components, from raw material to end product, run in parallel and some others run in sequence.
This is already complex, as there are many moving parts. Through Process Modelling we can understand the specifications needed and the quantity of equipment and resources for each subprocess to get to the output the Company is looking for. One of the core tenets of Process Modelling is that it can be used to model any process: from the simplest chemical reaction to a whole factory or even a national infrastructure. In this case the infrastructure would be split into its individual sub-components. These sub-components can be “described” (modelled) using a mathematical description of the process (be it physical or chemical) and then simulated. This can be quite complex in nature when you have many relationships or variables to assess, with those variables being coupled, i.e. one affects another, which can affect yet another, or even two variables affecting each other. Through virtual simulation you can integrate and assess various variables, and then multiple scenarios (tens, hundreds or thousands) are tested in parallel. In real-life scenarios you have various operations, resources and processes running in parallel and in sequence, as well as automated and manual steps that need to be carried out. Therefore, by “understanding” your system through Modelling & Simulation, you can be certain that you are using the best possible “settings” of the right variables, ensuring these work together in an optimum way so that the goal, the best possible outcome, is achieved.
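As a flavour of what a Discrete Event Simulation looks like in code, here is a minimal sketch using Python and the simpy library. The single packing station, the rates and the times are illustrative placeholders, not a real line design.

# A minimal sketch of a discrete event simulation using the simpy library.
# Rates and times are illustrative placeholders, not real line settings.
import random
import simpy

MEAN_ARRIVAL = 0.0010     # minutes between packets arriving from upstream (≈ 1000/min)
PACK_TIME = 0.0009        # minutes to pack one packet (illustrative machine capability)
RUN_TIME = 10.0           # simulate ten minutes of production
packed = 0

def packet_source(env, packer):
    # Packets arrive from the upstream process at a slightly variable rate.
    while True:
        yield env.timeout(random.expovariate(1 / MEAN_ARRIVAL))
        env.process(pack_one(env, packer))

def pack_one(env, packer):
    global packed
    with packer.request() as slot:      # queue for the packing machine if it is busy
        yield slot
        yield env.timeout(PACK_TIME)    # time the machine spends on one packet
        packed += 1

random.seed(1)
env = simpy.Environment()
packer = simpy.Resource(env, capacity=1)
env.process(packet_source(env, packer))
env.run(until=RUN_TIME)

print(f"end-of-line output ≈ {packed / RUN_TIME:.0f} packets per minute")

A full line model would chain many such stations in series and parallel, add changeovers, breakdowns and operators, and run many replications of each what-if scenario.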
- Data: The Importance of Data for Strategy and Decision-Making (Part 1 of 2) | Funis Consulting
Data: The Importance of Data for Strategy and Decision-Making (Part 1 of 2) A bowl with different glass marbles 02 Apr 2025

Data in its raw form can be very powerful to you and your business if you use it well. Otherwise, it can be overwhelming to manage, as well as misleading if not properly prepared, transformed and interpreted. What is even more important is that you interpret the data correctly, otherwise it can cause more harm than good: data transformation and interpretation are therefore key to making correct, informed, strategic choices.

Data, in its raw form, is useless unless you transform it and interpret it. That is when you start reaping the benefits of the data that you hold. It helps organisations and professionals take the right decisions for their business or for their clients. A bunch of text, numbers or images all brought together without any structure says absolutely nothing. It is simply data overload to no effect, where you risk being lost in data or, even worse, misinterpreting it. It is only upon the correct transformation of data into readable information that we can start truly “seeing” what the message behind the data is. This is the interpretation of data.

When analysing large datasets, data is likely to come from different sources, which makes this exercise even more complex. For example, imagine we want to analyse the major news events taking place over the past ten years, utilising different sources: various social media, various news websites, forums and blogs. What you need to do is put the different data types in different databases. That way you have a comprehensive dataset from various sources which you can then link together. Structuring data is therefore essential when combining data from different sources. There are many tools out there that can help to do this. Python is one of the preferred tools for data scientists, helping with structuring and cleansing the data. Cleaning of data can take many forms, from changing formatting to unit conversions to more complex algorithm-based techniques to filter out unwanted data, outliers and so on. Available libraries in the Python framework, such as NumPy and Pandas, greatly help in data wrangling.

Machine learning, through the use of algorithms and equations, can then start making sense of all this data. Machine learning models can keep learning from the constant inputs being fed to them, by humans or otherwise. A classic example is spam call identification. Of course, there isn’t a person stopping these calls before they reach you; it is automated machine learning, a complex set of rules built and integrated into a machine learning algorithm. This constantly improves and learns from the input made by us humans. So, if a type of call with specific features is identified as a spam call by a large number of people, then the algorithm identifies the pattern amongst these different spam numbers that makes them classify as spam. That is how such a machine learning system learns: through continuous improvement driven by the reaction and input of us humans.

One of the most basic and yet most important factors in data analysis and visualisation is to understand what the goal is… what is the question you are trying to answer? The goal must be specific.
Only this way can the analysis and visualisation give you the answer to your question, because we are using the correct dataset to start with, cleansing the data as appropriate, and then using the relevant algorithms to answer the target question. You don’t want to get lost in too much data when it comes to strategic and important decision-making. Decision-making should be based on reliable information. That reliable information must be presented in a way that stakeholders understand: whether they are analytical or not, the data must speak to them. That is why visualisation is much more than making the data look nice and colourful. It is about the ability to present the data in such a way as to answer the goal or question that you have, without ambiguity or uncertainty, as well as making it easy for other stakeholders to understand the data.
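As a small illustration of the structuring and cleansing steps mentioned above, here is a minimal sketch using pandas; the readings, column names and cleaning rule are made up for illustration.

# A minimal sketch of structuring and cleansing data from two sources with pandas.
# The readings, column names and cleaning rule are made up for illustration.
import pandas as pd

# Two sources reporting the same measurement in different formats and units.
plant_a = pd.DataFrame({"timestamp": ["2025-01-06 08:00", "2025-01-06 09:00"],
                        "temp_C": [4.2, 4.5]})
plant_b = pd.DataFrame({"time": ["06/01/2025 10:00", "06/01/2025 11:00"],
                        "temp_F": [40.1, 120.0]})        # second reading is a faulty spike

# Standardise column names, units and date formats before combining.
plant_a["timestamp"] = pd.to_datetime(plant_a["timestamp"])
plant_b = plant_b.rename(columns={"time": "timestamp", "temp_F": "temp_C"})
plant_b["timestamp"] = pd.to_datetime(plant_b["timestamp"], dayfirst=True)
plant_b["temp_C"] = (plant_b["temp_C"] - 32) * 5 / 9     # convert Fahrenheit to Celsius

data = pd.concat([plant_a, plant_b], ignore_index=True)

# Simple cleaning rule: drop readings outside a plausible chilled range.
clean = data[data["temp_C"].between(0, 10)]
print(clean)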
- Process Modelling & Simulation: Calibrated system infrastructures with friendly-to-use, intuitive human-centric interfaces (Part 3 of 3) | Funis Consulting
Process Modelling & Simulation: Calibrated system infrastructures with friendly-to-use, intuitive human-centric interfaces (Part 3 of 3) Image by kiquebg from Pixabay 30 Apr 2025

Human-centricity in innovation is not just a buzzword. Innovation should serve a purpose. That purpose for us here at Funis Consulting is to do good. Companies have an important role to play in society and here at Funis, we bring together science, technology and innovation for that same purpose. In Process Modelling & Simulation there is a great deal of science, mathematics, data and technical complexity involved, but the system can be designed with the end user in mind. That is what human-centricity should be about - innovation that works with, around and for people and societies.

Once a model is built, you can “play” around with the variables to examine “what-if” scenarios, such as what would happen, or what would my model output be, if variables A, C and G were changed into such and such. Of course, the more complex a system, the more variables there are that can be changed to assess various scenarios. You can run thousands of simulations, changing all inputs over ranges. As you change variables you get to know your system and its limitations, as well as its optimised state. You can also model different systems and connect them into one model, thus understanding the relationship between processes or systems. In modelling and simulation, sensitivity analysis can also be performed, which will help you understand which parameters affect the overall system the most, thus ensuring the core variables that matter most are retained at optimised levels at all times. So once a model is built, through various iterations or simulations, you can carry out process optimisation of the overall system or infrastructure. For instance, you can carry out multi-objective optimisation, add constraints to the system, as well as carry out real-time control of your systems. This means you can run continuous system optimisation for real-time balancing of the system, just to mention a few possibilities. Statistical process control in real time can give you warnings if trends are observed - this helps in forecasting problems before they arise.

Modelling & Simulation can be extremely complex behind the scenes, but it does not need to feel difficult to use for the end-user. With the correct user interface as well as proper training and support, such tools can be made intuitive and approachable. Whilst there is a great deal of science, mathematics, data and technology running in the background, the system can be designed to feel friendly and simple on the surface. Depending on who is using the model, whether for instance your in-house data scientist or your production line machine operator, different users will need different insights, or sometimes the same insights presented in different ways, with more or less detail. The look-and-feel can therefore be adapted to the needs of its users by building different UIs, showing data in different ways, or even showing only the data which is relevant to the person viewing it. Although data, mathematics, science and technology involve a lot of complexity, here at Funis Consulting we believe in innovation that serves a purpose. Here at Funis, our aim is to deliver smart, tailored solutions that bring real value to businesses and society alike, always designed with the end-user in mind.
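As a small illustration of the sensitivity analysis mentioned above, here is a minimal one-at-a-time sketch in Python; the toy process_model stands in for a real calibrated simulation and the baseline values are made up.

# A minimal one-at-a-time sensitivity sketch, assuming a toy process model.
# In practice the "process_model" call would be your calibrated simulation.
def process_model(temperature, speed, moisture):
    # Illustrative stand-in for a real model output (e.g. a quality score).
    return 0.8 * temperature + 0.1 * speed - 2.5 * moisture

baseline = {"temperature": 80.0, "speed": 120.0, "moisture": 12.0}
base_out = process_model(**baseline)

# Nudge each input by 5% in turn and record how much the output moves.
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.05})
    delta = process_model(**perturbed) - base_out
    print(f"{name:>12}: +5% input -> output change {delta:+.2f}")

More rigorous global methods (for example, variance-based Sobol indices) vary all inputs together, but the question is the same: which parameters move the output the most.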
- Data: Patterns and Clusters in Visualisation (Part 2 of 2) | Funis Consulting
Data: Patterns and Clusters in Visualisation (Part 2 of 2) Marbles in different groups according to their colour 09 Apr 2025

When working with large datasets, visualisation is key to gaining insights. This is also important when presenting to other business stakeholders. It makes all the difference when data is presented in a clear and meaningful way. Complex datasets do not need to be overwhelming. Today we explore the concept of clustering - how to identify patterns in unstructured and unlabelled data.

Data collection is an important part of data analysis and visualisation. If you collect your data in the wrong way, it can lead to a misleading interpretation. Data then needs to be sorted, with the same type of data placed together. In the LEGO image below, you see the LEGO pieces of different colours grouped together. Not only do you need to sort the data, but also arrange it, i.e. convert it so that it is uniform and can be compared and used (formatting, unit conversion and so on). Data is then presented in a way which is understandable to analytical and non-analytical internal (and possibly external) stakeholders. Remember, in an organisation, some functions, which might not be analytical in nature, might need to be able to read and understand the data for strategy and/or decision-making. Once data is presented visually, it needs to be analysed and explained, and hence one can reach an outcome. Image by Mónica Rosales Ascencio from Linkedin

There are many visualisation methods one can use, from bar charts to scatter plots, but let’s take a more scientific approach to visualisation, which is mostly used when you have large unstructured datasets to work with: clustering visualisation. Supervised clustering is when you group your data according to clusters which you have defined. These are defined by understanding and finding a pattern or common element in unstructured and unlabelled datasets. So let’s take an easy example and imagine that we have the following data: Cat, Dog, Kitchen, Donkey, Sofa, Wardrobe, Door, Table, Horse, Bird, Chair. We immediately understand there are two clusters, which are furniture (let’s call it Cluster A) and animals (Cluster B). So, all of the above data will be grouped around the clusters we established, either Cluster A for furniture or Cluster B for animals. If to this data I add a candleholder, then this item will sit somewhere outside the range of these clusters, because it is neither furniture nor an animal; however, it will be closer to Cluster A (furniture) than it is to Cluster B (animals). If we then add a glass bowl to the dataset, this too, like the candleholder, would be outside the range. Having said that, the glass bowl might be slightly further away from Cluster A than the candleholder would be. This is because a domestic fish could live in a glass bowl, and so there is a linkage, albeit not a strong one, to the animal cluster. The simple sketch above shows Cluster A with cyan data points (furniture) around it and Cluster B with purple data points (animals) around it; the yellow dot in the middle is the candleholder and the orange dot is the glass bowl. Understanding a pattern is crucial when attributing data points to clusters. In machine learning, for instance, clustering is about grouping raw data. There are many applications for clustering across many industries, from fraud detection in banking and anomaly detection in healthcare, to market segmentation and many more.
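The furniture-and-animals example above is qualitative; on numerical data, a clustering algorithm does the grouping automatically. Below is a minimal sketch using scikit-learn's k-means on made-up two-dimensional points, purely for illustration.

# A minimal sketch of clustering numerical data with scikit-learn's k-means.
# The points are made up; in practice they could be customer or sensor features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
group_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(20, 2))    # one natural group
group_b = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(20, 2))    # another natural group
points = np.vstack([group_a, group_b])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(kmeans.cluster_centers_)          # centres should sit near (0, 0) and (3, 3)
print(kmeans.labels_[:5], kmeans.labels_[-5:])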
Let’s take another simple example of how to make sense of unstructured data. Imagine we are analysing a list of phone numbers and receive this data: 729698782172106674475298921152340587. What we know for sure is that phone numbers start either with 7 (in the case of a mobile phone number) or with 5 (in the case of a landline). Mobile numbers and landline numbers are of different lengths, but each type always contains the same number of digits. Furthermore, the area code (normally found in the first few digits of a phone number) has to be a common number, since this data comes from the same geographical area. Looking at the data, we have identified that the only common digit which follows either a 5 or a 7 is the number 2. We identified an equal length for both the mobile numbers (10 digits) and the landline numbers (8 digits). With this knowledge we can split and structure the dataset as below - the first two in the list are mobile phone numbers and the second two are landline numbers (a short code sketch of this splitting follows at the end of this article). 7296987821 7210667447 52989211 52340587 The larger and more complex the data, the more important it is to visualise it. If you have lots of data to show for interpretation, you simply have to visualise it to make sense of it. Visualisation is simply the key to letting your data help you and to making your data count.
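Returning to the phone-number example above, here is the minimal splitting sketch referred to in the article, assuming exactly the two rules stated (mobiles are 10 digits starting with 7, landlines are 8 digits starting with 5).

# A minimal sketch of the structuring rule described above: scan the digit string and
# cut a 10-digit number whenever it starts with 7, or an 8-digit number when it starts with 5.
raw = "729698782172106674475298921152340587"

numbers, i = [], 0
while i < len(raw):
    length = 10 if raw[i] == "7" else 8      # mobiles start with 7, landlines with 5
    numbers.append(raw[i:i + length])
    i += length

print(numbers)   # ['7296987821', '7210667447', '52989211', '52340587']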