

  • Smoothed Particle Hydrodynamics | Funis Consulting

    Smoothed Particle Hydrodynamics 06 Aug 2025

    In R&D, systems are rarely neat. Irregular flows, soft solids, and messy boundaries are often the norm, making traditional Computational Fluid Dynamics (CFD) a poor fit. Smoothed Particle Hydrodynamics (SPH) offers a flexible, mesh-free alternative that is ideal for modelling the complexity we intuitively understand but struggle to simulate.

    The systems we want to model in R&D are rarely clean or convenient. Irregular boundaries, shifting phases, soft solids, and chaotic flows are quite often the norm! While traditional CFD has its place, it is not always a comfortable fit, especially when the system refuses to behave like a neat little mesh. This is where SPH offers something genuinely useful.

    SPH was originally developed for astrophysics but is now applied across engineering, biophysics, and even food science. It is a mesh-free computational method that treats matter as a collection of discrete particles. These particles interact through smoothing kernels, allowing the method to capture the nuances of deformable materials and complex flows without the constraints of a predefined grid.

    Many of the problems industrial R&D teams face involve free-surface flows, splashing, or breakup; multiphase systems like slurries, emulsions, or suspensions; soft, gel-like, or granular materials that don't behave "neatly"; and flow regimes that are non-Newtonian or highly localised. In other words, R&D teams increasingly face systems that are difficult to model using conventional CFD approaches. SPH is therefore worth exploring in contexts where flexibility and physical intuition matter more than rigid formulations or high-fidelity turbulence models, such as in food science and technology with pastes, emulsions and powder-liquid interactions.
    It can also be used in materials science (soft solids, gels and composites), in bioprocessing (slow flows, yield-stress fluids and phase interactions), and in environmental processes (sedimentation, erosion and pollutant spread). These are just a few of SPH's applications. Notwithstanding the above, SPH is not a silver bullet. For large-scale simulations it can be computationally heavy, and it takes experience to tune things like kernel size and particle density effectively. But when the goal is to gain insight into complex, deformable, and dynamic systems, it often outperforms more conventional options, especially when you want models that reflect the system's quirks rather than smoothing them away. Many R&D teams have a deep understanding of their processes (empirical knowledge, pattern recognition, hands-on experience) but don't always have tools that can express that complexity. SPH offers a bridge between what people know intuitively and what can be represented computationally. It is just versatile enough to model what really matters.
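To make the smoothing-kernel idea concrete, here is a minimal sketch of SPH's core operation: estimating the density at each particle as a kernel-weighted sum over neighbouring particles. It uses the standard 1D cubic spline kernel and a brute-force all-pairs sum for clarity; a production code would use 2D/3D kernels and a proper neighbour search.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1D cubic spline smoothing kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalisation constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: a kernel-weighted sum over all particles."""
    r = positions[:, None] - positions[None, :]  # pairwise separations
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# 51 unit-mass particles evenly spaced over [0, 1]: away from the boundaries,
# the estimate should recover the uniform density of 1 mass per 0.02 spacing
x = np.linspace(0.0, 1.0, 51)
rho = sph_density(x, np.ones_like(x), h=0.05)
```

The same summation pattern, applied to kernel gradients rather than the kernel itself, is what gives pressure and viscous forces in a full SPH solver.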

  • Arthur (Turu) | Funis Consulting

    Arthur (Turu) Vice President of Recreation & Wellbeing Arthur (Paw. D.) Arthur is a distinguished Maltese Hunting Dog who has been an integral part of the Company since Day 1. Arthur holds the prestigious title of Vice President of Recreation & Wellbeing and is fully dedicated to his role. He is very passionate about ensuring that all team members take their well-deserved breaks, get fresh air, and maintain a healthy work-life balance. The latter is achieved through reminders and demands for regular walks and the encouragement of playtime. A true connoisseur of life's pleasures, Arthur, also known as Turu (the Maltese equivalent of Arthur), is highly motivated by food and cuddles. He considers it his life's mission and professional duty to sample (or at least attempt to sample) every snack, meal, and beverage within his vicinity. Lose focus for a moment with some food in hand, and he will sneakily sample it for you. Arthur's refined palate and unwavering enthusiasm make him an unofficial Culinary Inspector, though his review process tends to involve swift and unapproved taste-testing. His passions, however, extend well beyond food. He enjoys running at full speed, engaging in spontaneous play, and conducting meticulous olfactory research on cats and flowers. He is a strong advocate for nature but also for proper rest, as demonstrated by his napping frequency, highlighting the importance of proper napping techniques. As the Company's Champion for Safety & Wellbeing, Arthur reminds everyone that happiness is essential. His daily mission is to spread joy, encourage smiles all around and at all times, and to have fun - FunisTuru after all! Arthur ensures no one forgets to enjoy life. Whether it's nudging someone away from their desk for a midday break or leading a mandatory end-of-day long walk, Arthur takes his responsibilities seriously, though never too seriously!

  • Fat Bloom in Chocolate | Funis Consulting

    Fat Bloom in Chocolate 26 Mar 2025

    Food needs to satisfy the five senses. Enjoying food means that not only should it taste good, but it should also look good, and its texture should feel right in your mouth. Even if you have the most delicious food product, if it doesn't look, taste or smell how it should, then it's probably not going to be very successful with your consumers. This reminds me of fat bloom in chocolate. Not an uncommon phenomenon, fat bloom does not make a bar of chocolate look tasty and actually tends to put people off. The good news is that one can implement changes that make a remarkable difference to the chances of fat bloom developing in chocolate products.

    Thousands of years ago, an ancient civilisation in what is now Ecuador was the first to recognise and revere the cocoa tree as a sacred source of food. Chocolate comes from cocoa beans, which in their raw form have a bland taste and don't taste anything like the chocolate we know. It is only when the cocoa beans go through fermentation and roasting that the familiar cocoa flavour develops. Chocolate is used worldwide in many shapes and forms, in both sweet and savoury dishes, and the husk of the cocoa bean is also used to make tea, which is said to replenish energy and boost the mind.

    When it comes to a chocolate bar, the mouthfeel, as well as how it looks, sounds when you break it, and smells, are of utmost importance. A good chocolate should "snap" when you break it, and it should be shiny with a rich, deep colour. Chocolate never ceases to amaze and indulge the world over, being probably the most loved and widely available confectionery around. So it is a most disappointing experience to open a chocolate bar and find it has a fuzzy white layer or spots. This phenomenon is called "blooming" or "fat bloom".
    To start off, fat bloom in chocolate is definitely not mould, and it is not a health hazard, so a chocolate with fat bloom is still good to eat. Having said that, it is definitely not something you look forward to when unwrapping a nice bar of chocolate. So why does fat bloom occur? Fat bloom in chocolate is caused by uncontrolled crystallisation of the fat in the chocolate. Whilst crystallisation of fat is a natural occurrence, when there is no control over the number, size and orientation of the crystals, fat bloom is observed. In short, the physical crystalline phase of the fat molecules is not the one that gives you a snappy, shiny chocolate bar. One thing to note is that this is not a chemical phenomenon; that is, none of the molecules in the chocolate are broken down and there is no chemical reaction taking place. A fun fact: if you were to take the bloomed chocolate, melt it and temper it again, thus controlling the crystallisation of the fat, the resulting chocolate bar would become shiny and snappy once again. So what can be done to resolve the issue of fat bloom in chocolate, you might ask? The good news is that something can be done to greatly reduce the chances of fat bloom occurring. First and foremost, chocolate manufacturers should understand the root cause of fat bloom, as in most cases fat bloom issues can be resolved via formulation and/or process, depending on the case. When tackling formulation issues, one needs to understand the client's needs as well as how the current formulation interacts at a chemical and physical level, to see whether any changes in formulation are needed. Many of you might think that changing the formulation of the product will definitely change its taste. Whilst in some cases this might be true, it is not necessarily the case if the right changes are implemented.
    Some alternative formulations are extremely close in terms of flavour and texture, and so there would be minimal to no impact on the end product. Process is another important factor to look at when tackling fat bloom. Changing or tweaking the manufacturing process can make a huge difference, and in most cases this would not necessarily mean additional equipment or extra manufacturing costs. Sometimes it's the small tweaks that make a big difference. To determine the root cause, the entire process needs to be kept in mind; everything from manufacturing to storage can affect the chance of fat bloom developing in chocolate. Other than fat bloom, chocolate can experience sugar bloom, which occurs when the chocolate is packed in a high-humidity environment; because it depends on packing conditions, it is much more easily controlled. What happens in such circumstances is that the humidity gets trapped inside the packaging; due to temperature fluctuations in the supply chain, it condenses inside the packaging, the water dissolves the sugar, and sugar crystals then form once the temperature rises again and the water re-evaporates. This type of bloom is a rare phenomenon. So, if you open a chocolate bar with a white fuzzy layer, it is highly likely to be fat bloom. It is not a health hazard, so eat to your heart's content. Having said that, it would be nice if your chocolate bar were shiny and snappy every time you unwrap it.

  • My Python, what's your Numba? The Power of Code Optimisation. | Funis Consulting

    My Python, what's your Numba? The Power of Code Optimisation. 19 Mar 2025

    In today's fast-paced world we expect things to work, and work efficiently. We all hate waiting for a slow system to respond or return results, especially when we have tonnes of other things to do. One way to improve performance is through code optimisation. In simple terms, code optimisation means improving code so that it executes faster, in some cases much faster, and more efficiently. This could mean reducing memory usage to handle larger datasets more smoothly, speeding up calculations, or simply improving readability to make code easier to maintain, all the while reducing carbon footprint. It is about small tweaks that can save you hours or days down the line.

    Python is widely used in the scientific community due to its relative ease of use and extensive libraries. However, let's face it, it can run slower than other languages, such as C++. When performing mathematical transformations, especially on large datasets, even NumPy-based code can become a bottleneck if you have to iterate over an array's elements in pure Python. Despite this, Python remains one of the most popular choices for scientific computing. So, what can we do to make a model written in Python run faster?

    On a standard laptop you typically have about 8 CPU cores, and the speed at which calculations are processed depends on these cores. For normal day-to-day use this is just fine, but imagine that you need to process a model with a large amount of data, or iterate over a large dataset; then GPU computing might be a much better option. CPUs (Central Processing Units) enable you to run different processes at any one time - in the example above, 8 different processes at a time. GPUs (Graphics Processing Units), on the other hand, work a bit differently. You can have thousands of cores in one single GPU.
    Having said that, the processes carried out on these cores have to be the same. So, for instance, if you wanted to process two different calculations at the same time, 1*2 and 1+2, a GPU won't do. What GPUs excel at is doing the same calculation with different values. For instance, with a GPU you can compute 1*2, 1*3 and 1*4, because it is the same calculation with different variables, i.e. x*n, where in our example x is always 1 and n is the changing value, which can go up to millions of variable values; these calculations can all be carried out at the same time. So, for large model computations which apply the same calculation to different variable values, GPU computing works very well. Now imagine that you have some Python code that you would like to optimise. This is where Numba comes in. Numba is a Python framework that enables just-in-time (JIT) compilation of numerical code, targeting either the CPU or, through its CUDA support, the GPU, in order to reduce processing time and drastically speed up computations. Basically, what you are doing differently is that computations are not run in sequence, as on a CPU, but in parallel. The GPU will be running thousands of calculations simultaneously, and that is where you will start seeing a huge difference in execution performance. Scientists and engineers all over the world use GPUs for data-driven modelling because the same equation is computed with different variable values, simultaneously. This can be key in optimisation algorithms, or simply in solving your model, as in most cases you will be working with thousands or millions of calculations. Carrying out all of these calculations, multiple times, would take an unfeasibly long time unless you use GPUs. LLMs (Large Language Models), such as ChatGPT, were trained just like this: handling large amounts of computations at the same time.
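As a minimal sketch of the idea, the snippet below uses Numba's @njit decorator with parallel=True to compile and parallelise a loop that applies the same calculation (x multiplied by each of a million values) - the "same operation, different values" pattern described above. The try/except shim is only there so the sketch still runs, more slowly, where Numba is not installed; Numba's CUDA target (numba.cuda.jit) applies the same pattern on a GPU.

```python
try:
    from numba import njit, prange  # optional dependency
except ImportError:                 # fall back to plain Python so the sketch still runs
    prange = range
    def njit(*args, **kwargs):
        if args and callable(args[0]):
            return args[0]
        return lambda func: func

@njit(parallel=True)
def scaled_sum(x, n):
    # the same calculation (x * k) applied across n different values,
    # which Numba can compile and spread over all available cores
    total = 0.0
    for k in prange(n):
        total += x * k
    return total

result = scaled_sum(1.0, 1_000_000)  # sum of 1*k for k = 0 .. 999999
```

With Numba installed, the first call pays a one-off compilation cost and subsequent calls run as compiled, parallel machine code.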
    With Digital Twins, for instance, where virtual models replicate real-world scenarios, scientists and engineers can test a huge number of scenarios in a safe manner. Take the exploration of space: space scientists and engineers must be almost certain that everything will go as planned. Without such virtual environments, real-world experimentation could be catastrophic.

  • From K-pop to K-beauty...now let's talk about K-food! | Funis Consulting

    From K-pop to K-beauty...now let's talk about K-food! 12 Nov 2025

    Long before refrigerators were invented, humans turned to fermentation as a means to ensure foods could last for months and sometimes even years. Fast forward to today, and we still enjoy fermented foods like kimchi because we love the tangy umami flavour, and because the natural probiotics are great for our gut. Fermented foods like Korean kimchi keep so well because of salt, which kills most bacteria except for lactobacilli, which happen to be halotolerant, i.e. tolerant to salt, and also love sugar. Munching away on sugars during fermentation, these bacteria produce acids (lowering the pH) and carbon dioxide (creating an anaerobic environment), stopping the bad microbes from growing whilst keeping the fermentation going.

    If there is one dish that screams Korean food, it must be kimchi! A fiery, tangy, umami-packed staple, this delicacy is more than just a side dish; it is a truly fascinating example of food science at work, from its very first step of preparation. At its core, kimchi is a fermented food. This means that it is preserved by lowering its pH (increasing its acidity). This acidic environment keeps unwanted microorganisms from taking over and spoiling the food. Fermentation is indeed one of humanity's clever tricks for making food last, even for years! All over the world, people have been fermenting foods since long before fridges were invented. It turns out fermented foods are also delicious, which is why some of them, like kimchi, are very popular nowadays. Today we do not need fermented food for survival, but we still ferment some foods because we love those complex flavours and, as science shows, they are good for us too! Kimchi is rich in probiotics, those friendly bacteria that support a healthy gut microbiome, aid digestion and boost our immune system (and make our skin glow too ;))!
    Kimchi is mostly associated with spicy fermented cabbage, but in reality kimchi can be made with radish, kohlrabi, spring onions and plenty of other vegetables. Some regional varieties also include seafood - think fish sauce and fermented shrimp for that extra savoury kick. So let's delve into the science! Why is kimchi safe to eat, even after months or years sitting in a jar? The answer is salt. When you salt vegetables (pretty much the first step of preparing kimchi), you are not only seasoning them; you are killing off most bacteria except for the halotolerant (salt-tolerant) ones. These survivors are mainly lactobacilli, a type of bacteria that thrive on sugars. You feed these bacteria with sugar, and sometimes cooked rice flour, and as they munch away they produce acids (which lower the pH and stop harmful microbes from growing) and carbon dioxide (which creates an oxygen-free or "anaerobic" environment that keeps the fermentation going). The result is a tangy, crisp, fermented kimchi bursting with flavour and alive with beneficial bacteria. One word of warning though: that carbon dioxide builds up pressure inside your jar, so unless you fancy a kimchi volcano erupting in your kitchen, remember to "burp" your kimchi every now and then. For those in food science and technology, fermentation is a complex process, especially in an industrial-scale kimchi manufacturing operation. Fermentation involves multiple types of bacteria, sugars, acids, temperatures and times, and predicting how a batch of kimchi will turn out can be quite complex due to the many variables. Using the computational tools at our disposal, we can simulate the growth of bacteria, the production of acids and the formation of gas under different conditions, allowing us to optimise fermentation time for consistent flavour and texture, and to experiment virtually with ingredients or salt levels to see how they affect taste as well as safety.
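As an illustrative sketch, with entirely made-up parameters rather than measured kinetics, the growth-and-acidification story above can be captured in a few lines: logistic growth of the lactobacilli population, acid accumulating in proportion to it, and a toy saturating relationship between acid and pH.

```python
# Illustrative parameters, not measured values
GROWTH_RATE = 0.3   # 1/h, logistic growth rate of the lactobacilli
CAPACITY = 1.0      # normalised maximum population
ACID_YIELD = 0.05   # acid units produced per unit population per hour

def simulate_fermentation(hours, dt=0.1, p0=0.01):
    """Euler integration of logistic growth with acid accumulation."""
    pop, acid = p0, 0.0
    for _ in range(int(hours / dt)):
        pop += dt * GROWTH_RATE * pop * (1.0 - pop / CAPACITY)
        acid += dt * ACID_YIELD * pop
    ph = 6.5 - 2.5 * acid / (0.5 + acid)  # toy saturating pH drop from 6.5
    return pop, acid, ph

pop, acid, ph = simulate_fermentation(hours=72)  # three days of fermentation
```

A real model would replace these toy terms with fitted growth and acidification kinetics for the actual strains, sugars and temperatures involved, but the structure - integrate growth, accumulate products, map products to quality attributes - stays the same.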
    So modelling and simulation help turn centuries-old culinary art into a precise science, without losing any of its magic.

  • Process Modelling & Simulation: Calibrated system infrastructures for when failing is not an option. (Part 1 of 3) | Funis Consulting

    Process Modelling & Simulation: Calibrated system infrastructures for when failing is not an option. (Part 1 of 3) 16 Apr 2025

    Sometimes you need science-backed systems to make the right decisions. When you need to make decisions that carry a lot of weight, you need to ensure that the study is done on a sound virtual infrastructure, because failing is not an option. Mathematical and physical principles provide a reliable way of working out what is happening, and what can happen. Using these in a structured framework via Process Modelling & Simulation, you can have the peace of mind that multiple scenarios have been tried and tested, and that the best "settings" for your system have been determined, so that you can achieve the desired result.

    In one of our previous articles, Harnessing the Power of Optimisation, we went through the basics of Process Optimisation, where we said that if you know what you want to achieve, you can reach that goal by understanding which variable values to use, through modelling & simulation and algorithm-based models. A model can be physics-based or chemistry-based, or it can be built using your data, which, together with correct data analysis and the proper visualisations, can enable a business to make the right decisions and adopt the right mechanisms to achieve the desired result. Process Modelling & Simulation is crucial in many types of businesses, especially ones where getting it right is a must. Some businesses incur huge losses if they fail to achieve the optimal state or a zero-loss scenario, and therefore, through Process Modelling, variables are calibrated so as to ensure the achievement of the goal, or the best possible outcome.
    There isn't only the monetary element at stake; sometimes certain situations have an impact on a national scale, or on the wellbeing of people, and so Process Modelling & Simulation is a must to ensure that the right thing is done and the right decisions are made. Let's take the scenario of a company that wants to build a new manufacturing line in which the target end-of-line output is 1000 packets a minute, according to a set of criteria or specifications (e.g. 1000 packets, containing 10 units each, of a certain quality, length and weight, with the right amount of materials per unit, and so on). To carry out Process Modelling, you start by creating a virtual/digital representation of the manufacturing line. This helps to identify the problems that are to be avoided when the physical manufacturing line is built. Through various modelling and simulation techniques, one of which is Discrete Event Simulation (DES), you can change the variables with no repercussions, as everything is done in a virtual environment. By testing such variables, you can identify inefficiencies and constraints, assess the impact of any changes to the system, and identify the most important variables/parameters; through data analysis, visualisation and specific testing, you can ensure data-based decisions are made, as well as optimise for the best throughput of your system. This reduces waste, saving costs, and ensures that no disruption of operations takes place in the eventual real-life scenario. In a manufacturing line, some components, from raw material to end product, run in parallel and others run in sequence. This is already complex, as there are many moving parts. Through Process Modelling we can understand the specifications needed, and the quantity of equipment and resources for each subprocess, to get to the output the company is looking for.
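The core mechanic of a Discrete Event Simulation fits in a few lines, even though a full engine also tracks queues, resources and event calendars. The sketch below is hypothetical (the service time and packet counts are illustrative, not from a real line): each packet is dispatched to the earliest-free machine, and the model answers the what-if question of how many parallel machines a 1000-packets-a-minute target would need.

```python
import heapq
import random

def simulate_line(n_packets, service_time, n_machines, seed=0):
    """Minimal discrete-event sketch of one packing stage.

    A priority queue holds the time at which each machine is next free;
    service times vary +/-10% around the mean to mimic real variability.
    """
    rng = random.Random(seed)
    free_at = [0.0] * n_machines
    heapq.heapify(free_at)
    last_finish = 0.0
    for _ in range(n_packets):
        start = heapq.heappop(free_at)            # earliest available machine
        finish = start + service_time * rng.uniform(0.9, 1.1)
        last_finish = max(last_finish, finish)
        heapq.heappush(free_at, finish)
    return n_packets / last_finish                # packets per simulated minute

# Hypothetical mean service time of 0.002 minutes per packet per machine
throughputs = {m: simulate_line(10_000, 0.002, m) for m in (1, 2, 3)}
```

Under these assumed numbers, throughput scales roughly linearly with the number of machines, so the what-if table immediately shows how much capacity a given target requires before any equipment is bought.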
    One of the core tenets of Process Modelling is that it can be used to model any process: from the simplest chemical reaction to a whole factory or even a national infrastructure. In the latter case, the infrastructure would be split into its individual sub-components. These sub-components can be "described" (modelled) using a mathematical description of the process (be it physical or chemical) and then simulated. This can be quite complex when you have many relationships or variables to assess, and those variables are coupled, i.e. one affects another, which can in turn affect another, or even two variables affect each other. Through virtual simulation you can integrate and assess various variables, and then test multiple scenarios (tens, hundreds or thousands) in parallel. In real-life scenarios you have various operations, resources and processes running in parallel and in sequence, as well as automated and manual steps that need to be carried out. Therefore, by "understanding" your system through Modelling & Simulation, you can be confident that you are using the best possible "settings" of the right variables, ensuring these work together in an optimal way so that the goal, i.e. the best possible outcome, is achieved.

  • From Chemistry to Code: How Modelling, Simulation and Data Science are Transforming Formulation R&D | Funis Consulting

    From Chemistry to Code: How Modelling, Simulation and Data Science are Transforming Formulation R&D 10 Sept 2025 Formulation R&D is evolving. Traditional trial-and-error approaches are no longer enough to keep pace with rising costs, tighter regulations, and growing sustainability targets. Computational modelling and data science make it possible to explore molecular interactions virtually, optimise formulations, and predict outcomes more efficiently. By combining chemistry with data, even smaller teams can innovate smarter, developing products that are more effective, sustainable, and aligned with modern expectations. At the heart of every consumer product, whether food, cosmetics, personal care, or household goods, lies chemistry. Formulation is the science of making ingredients work together: stabilising emulsions, controlling crystallisation, fine-tuning viscosity, balancing actives, and designing textures, aromas, or cleaning performance. For decades, new products have been developed through trial and error in the lab or pilot plant. Scientists experiment, tweak, and test until a stable, effective, or appealing formula emerges. But in today’s environment, this traditional approach is often too slow, too costly, and too uncertain. The pressures are clear. Consumers expect products that deliver functionality, safety, and sensory appeal while also being healthier, gentler, or more sustainable. Competitors move quickly, and faster innovators often capture both shelf space and consumer loyalty. Meanwhile, volatile raw material costs, rising energy prices, and the expense of running iterative formulation trials drive the need for more efficient R&D. Regulations governing ingredients and safety are becoming increasingly complex, especially for chemicals and additives. At the same time, ambitious sustainability targets push companies to reduce environmental impact, optimise resources, and replace legacy ingredients without compromising performance.
    This is where modelling, simulation, and data science redefine the rules of formulation. Instead of relying purely on bench experiments, companies can now test, optimise, and predict product behaviour in silico. Consider the role of chemistry at the microscopic level: surfactants arranging at oil-water interfaces, polymers creating networks that affect viscosity, proteins folding and unfolding, fats crystallising into different structures, or volatile molecules driving aroma. These interactions determine whether a cream remains smooth, a sauce stays stable, a detergent dissolves effectively, or a shampoo delivers the right foam and feel. Traditionally, understanding these behaviours meant months of iterative testing. Now, computational models can simulate these same interactions virtually. Stability over shelf life can be predicted; ingredient compatibility mapped; formulation robustness stress-tested under different conditions. Optimisation becomes faster, as algorithms can explore thousands of compositions long before a single sample is mixed. Even sensory and functional attributes such as flavour, fragrance, mouthfeel, spreadability, and cleaning efficacy can be linked directly to underlying chemistry using statistical and machine learning approaches. So how does this translate into real advantages for FMCG and CPG manufacturers? Most generate vast amounts of data, from lab instruments, formulation databases, pilot plant trials, production lines, and consumer testing. Yet this information often remains fragmented and underutilised. Data science brings it together, combining experimental data with chemical knowledge to build predictive models. These models not only explain why certain formulations behave as they do but also forecast how new combinations will perform. This reduces dead ends, shortens development cycles, and increases confidence when scaling up.
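As a toy illustration of what exploring compositions in silico can look like, the sketch below sweeps a grid of candidate emulsion compositions against a hypothetical stability model. The response surface and its optimum are invented for illustration; in practice this role is played by a model fitted to real trial data.

```python
import itertools

def stability_score(oil, surfactant):
    """Hypothetical response surface standing in for a fitted predictive model:
    stability peaks at 20% oil and 5% surfactant (illustrative numbers only)."""
    return 1.0 - (oil - 0.20) ** 2 - 4.0 * (surfactant - 0.05) ** 2

best = None
for oil, surf in itertools.product([i / 100 for i in range(5, 41)],
                                   [i / 100 for i in range(1, 11)]):
    water = 1.0 - oil - surf  # the rest of the composition is water
    score = stability_score(oil, surf)
    if best is None or score > best[0]:
        best = (score, oil, surf, water)

score, oil, surf, water = best  # the most promising candidate to mix for real
```

The same loop scales to thousands of candidates, extra ingredients and smarter search strategies (Bayesian optimisation, design of experiments) without changing the basic idea: rank virtually, then confirm the shortlist at the bench.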
    Crucially, advances in computing now make such tools accessible to small and mid-sized enterprises as well as multinationals. Working with specialists allows R&D teams to focus on creativity and innovation while computational methods handle the complexity of formulation space. Adopting these techniques requires a mindset shift. Modelling and data science do not replace chemistry and formulation expertise; they amplify it. Chemistry provides the governing rules, while computation offers the means to explore, optimise, and innovate at speed and scale. Together, they enable companies to design products that are more effective, more sustainable, and better aligned with consumer expectations, without the heavy cost of endless trial-and-error. In today’s fast-moving CPG sector, formulation R&D is no longer confined to mixing and measuring in the lab. It is evolving into a powerful interplay between chemistry and computation, where smarter, faster, and more confident innovation becomes possible.

  • Process Modelling & Simulation: Calibrated system infrastructures with friendly-to-use, intuitive human-centric interfaces (Part 3 of 3) | Funis Consulting

    Process Modelling & Simulation: Calibrated system infrastructures with friendly-to-use, intuitive human-centric interfaces (Part 3 of 3) 30 Apr 2025

    Human-centricity in innovation is not just a buzzword. Innovation should serve a purpose. That purpose for us here at Funis Consulting is to do good. Companies have an important role to play in society, and here at Funis we bring together science, technology and innovation for that same purpose. In Process Modelling & Simulation there is a great deal of science, mathematics, data and technical complexity involved, but the system can still be designed with the end user in mind. That is what human-centricity should be about: innovation that works with, around and for people and societies.

    Once a model is built, you can "play" around with the variables to examine "what-if" scenarios: what would happen, or what would my model output be, if variables A, C and G were changed in a particular way? Of course, the more complex a system, the more variables there are that can be changed to assess various scenarios. You can run thousands of simulations, varying all inputs over their ranges. As you change variables, you get to know your system and its limitations, as well as its optimised state. You can also model different systems and connect them into one model, thus understanding the relationships between processes or systems. In modelling and simulation, sensitivity analysis can also be performed, which helps you understand which parameters affect the overall system the most, thus ensuring the most important variables are kept at optimised levels at all times. So once a model is built, through various iterations or simulations, you can carry out process optimisation of the overall system or infrastructure. For instance, you can carry out multi-objective optimisation, add constraints to the system, and carry out real-time control of your systems.
    This means you can run continuous system optimisation for real-time balancing of the system, to mention just one possibility. Statistical process control in real time can give you warnings if trends are observed; this helps in forecasting problems before they arise. Modelling & Simulation can be extremely complex behind the scenes, but it does not have to feel difficult for the end user. With the correct user interface, as well as proper training and support, such tools can be made intuitive and approachable. Whilst there is a great deal of science, mathematics, data and technology running in the background, the system can be designed to feel friendly and simple on the surface. Depending on who is using the model, whether your in-house data scientist or your production-line machine operator, different users will need different insights, or sometimes the same insights presented in different ways, with more or less detail. The look and feel can therefore be adapted to the needs of its users by building different UIs, showing data in different ways, or even showing only the data which is relevant to the person viewing it. Although data, mathematics, science and technology involve a lot of complexity, here at Funis Consulting we believe in innovation that serves a purpose. Our aim is to deliver smart, tailored solutions that bring real value to businesses and society alike, always designed with the end user in mind.
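The sensitivity analysis mentioned above can be sketched in a few lines of Python. The model below is hypothetical (illustrative coefficients, not a real process): each parameter is perturbed one at a time and the parameters are ranked by the relative change they cause in the output.

```python
def model_output(params):
    """Hypothetical process model with illustrative coefficients:
    output depends strongly on temperature and only weakly on stir speed."""
    return (10.0 * params["temperature"]
            + 0.5 * params["stir_speed"]
            + 2.0 * params["ph"])

baseline = {"temperature": 1.0, "stir_speed": 1.0, "ph": 1.0}

def sensitivity(param, delta=0.01):
    """One-at-a-time sensitivity: relative output change per relative input change."""
    perturbed = dict(baseline, **{param: baseline[param] * (1.0 + delta)})
    base = model_output(baseline)
    return (model_output(perturbed) - base) / (base * delta)

# Parameters ranked from most to least influential
ranking = sorted(baseline, key=sensitivity, reverse=True)
```

One-at-a-time screening like this is the simplest variant; for strongly coupled variables, global methods that vary several inputs together give a more faithful picture, but the ranking idea is the same.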

  • Sustainable Food Systems through Data Modelling techniques | Funis Consulting

    < Back Sustainable Food Systems through Data Modelling techniques 07 May 2025 Food and water constitute the most basic of physiological needs. It is therefore important that these resources, staples of humanity's very existence, are cared for appropriately and adequately. Science, coupled with technology, can greatly help innovate food systems. In a world where climate change is an everyday reality, careful resource management and getting the most out of whatever resources are available is essential. Natural resources to grow food, whether that’s water or land, are precious and need to be managed effectively. Feeding the world’s growing population requires more land, and more water to irrigate the crops. In a heating world, this is becoming ever more challenging. Moreover, once the food is grown it needs to be in the right place at the right time and in the right quantities. Too much food goes to waste because of overproduction at any given time, or simply because it can’t be delivered in the right condition. Managing the resources to grow food, and managing which and how much food to grow, are two very different challenges. However, there is a common thread between them, which is to be smart about how we approach both. Starting with actually growing the crops themselves, too much water is often used through indiscriminate irrigation, without taking other factors into consideration. Different plants require different amounts of water to grow at their best. Watering plants continuously (using drip irrigation) has been shown to help with plant growth, and is much more effective than watering in large amounts during a short period of time. However, watering plants on the soil surface leads to a lot of water evaporating before it can trickle down to the roots, where it is then absorbed by the plant. Moreover, there are lots of other factors at play here, notably rainfall (or the lack of it), sunlight intensity, air temperature and wind speed. 
All of these will affect how fast a plant will grow, how much water and nutrients it needs, its water transpiration rate and so on. By implementing systems to measure and process all of this real-time data, one can introduce an automated system for irrigating plants. This could control not only the quantity of water sent to irrigate the plants, but also the main nutrients needed (usually Nitrogen, Phosphorus and Potassium) as well as the micronutrients. This could be done via a continuous closed feedback loop: measuring the soil conditions in real time and adjusting accordingly. More advanced systems could include imaging the crops with drones, looking at leaf coverage and leaf health, and again adjusting accordingly. However, this can only be done if there is data available on what the ideal conditions are, coupled with predictive and optimisation models. Such automated systems, using these optimisation models, have the power to reduce water use through careful dosing, and land use by growing crops in the most efficient manner. But growing crops effectively is only half the picture. If we grow food that then goes to waste because there’s too much of it, or it can’t be delivered to the right place on time, then the sustainable use of water and land would have been for nothing. Good demand forecasting, and supply chain management, is absolutely key here. Predicting how much produce will be required in 6 to 12 months’ time will never be 100% accurate, but it can get pretty close if a robust and validated data model is built. The vagaries of weather (for example, different weather to that expected might give rise to demand for different foods) and new consumer trends are hard to account for, but in most cases seasonal demand for different crops is pretty repetitive. 
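The closed feedback loop for irrigation described earlier - measure soil moisture, compare with a target, adjust the dose - can be sketched as a toy proportional controller. The soil response, target moisture, controller gain and loss terms are all invented for illustration:

```python
# A toy closed-loop irrigation controller: measure moisture, dose water in
# proportion to the deficit, repeat. All numbers are illustrative assumptions.

def irrigation_step(moisture, target, gain=0.5):
    """Proportional control: dose water in proportion to the moisture deficit."""
    deficit = target - moisture
    return max(0.0, gain * deficit)  # never a negative dose

def simulate(moisture, target, steps, evaporation=0.02, uptake=0.8):
    """Each cycle: measure, dose, absorb a fraction of the dose, lose some water."""
    for _ in range(steps):
        dose = irrigation_step(moisture, target)
        moisture += uptake * dose - evaporation
    return moisture

final = simulate(moisture=0.20, target=0.35, steps=50)
```

With these made-up numbers the loop settles below the target because evaporation is a constant drain; a real system would add an integral term, and richer inputs (rainfall, temperature, drone imagery) as described above.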
Throw in the fact that different regions of the world are growing at different rates, and that different regions might grow and/or consume different crops, and this makes for a very interesting predictive model. Such models would help not only individual farmers to know what to sow and when, but would also help governments and regional institutions with agricultural policies. Collect data, but make sure it's data that can be used to build such models. If in doubt about what data to collect, speak to an expert who will help you devise a data collection plan. With good data come good models.
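The point that seasonal demand is "pretty repetitive" can be sketched with the simplest possible model: average each month across past seasons. The monthly figures below are invented for illustration; a real model would be fitted and validated against historical data:

```python
# A seasonal-naive forecast: predict each month of the coming year as the
# average of that month across past years. Demand figures are invented.

def seasonal_forecast(history, period=12):
    """Forecast one full season ahead by averaging each position across seasons."""
    seasons = [history[i:i + period] for i in range(0, len(history), period)]
    seasons = [s for s in seasons if len(s) == period]  # drop a partial season
    return [sum(s[m] for s in seasons) / len(seasons) for m in range(period)]

# Two years of hypothetical monthly demand for one crop (same shape, slight growth).
year1 = [50, 52, 60, 70, 85, 100, 110, 105, 90, 75, 60, 52]
year2 = [54, 56, 64, 76, 91, 108, 118, 113, 96, 81, 64, 56]
forecast = seasonal_forecast(year1 + year2)
```

A seasonal-naive baseline like this captures the repetitive pattern but not growth trends, weather effects or new consumer behaviour - exactly the factors the article flags as hard to account for.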

  • Robert Cordina | Funis Consulting

    < Back Robert Cordina Founder & Managing Director Dr. Robert Cordina (B.Sc.(Hons.) MPhil DIC PGDipAppChem MSc PhD CChem FRSC) Robert holds a PhD in Computational Chemistry from the University of Strathclyde, focusing on the simulation of fat molecules using Molecular Dynamics. He is also a Chartered Chemist and a Fellow of the Royal Society of Chemistry. Robert's undergraduate degree and other postgraduate degrees are from the University of Malta, Imperial College London and Cardiff Metropolitan University. He has over 20 years' working experience in the pharmaceutical and food industries across Malta, Switzerland, Australia and the United Kingdom. Having worked with a range of companies, from startups to multinationals, in different roles such as Quality Assurance, Quality Control, Product Development and Modelling & Simulation, Robert has also worked across a wide range of products and has therefore gained a wide breadth of experience. Before founding Funis Consulting, Robert was Global Technical Lead for Physical Science Modelling & Simulation at a global snacking company. Besides running his consultancy business, where he focuses on mathematical modelling and simulation and food chemistry consultancy, Robert has also been a casual lecturer at the University of Malta for over nine years, lecturing in Food Chemistry and Physics. Robert is also currently on the editorial board of Current Research in Food Science, and is a past Chair of the Food Group of the Royal Society of Chemistry. Robert is keen to share his expertise and experience with the next generation of food scientists and professionals. He is passionate about leveraging his knowledge and skills to solve complex and challenging problems for industry, and to create innovative and sustainable solutions. He is motivated by the opportunity to contribute to the advancement of food science and technology, as well as by applying modelling & simulation to help companies in their decision-making. 
robert@funisconsulting.com

  • Process Modelling & Simulation: Calibrated Dynamic and Steady-State system infrastructures (Part 2 of 3) | Funis Consulting

    < Back Process Modelling & Simulation: Calibrated Dynamic and Steady-State system infrastructures (Part 2 of 3) 23 Apr 2025 Modelling can be done on a system which is constantly in a dynamic state or on a system which is in a steady state. Transient behaviours can be embedded in a dynamic model, whereas steady-state models are used to simulate a system which is expected to behave in a much more stable manner. Which to use depends on the question or problem you are trying to resolve. Process Modelling & Simulation can be carried out on a single process or on a combination of different processes, combined to give a holistic understanding of your system. You can use Process Modelling to model a dynamic system (a system changing over time) or a steady-state system (a system operating once all the processes have been coupled together and equilibrated). In dynamic system process modelling, the system is constantly changing and therefore the variables are never constant, sometimes changing drastically and at a high frequency. This means that dynamic systems are influenced by variability, and in the context of a new manufacturing line this could mean that you are modelling a process which is constantly changing. An example of this is a manufacturing line with frequent product switches. Another example of such a dynamic process is when you want to understand the impact of transient behaviours, such as when product or resource changeovers are carried out, or what happens during peak times. In this case Discrete Event Simulation (DES) is the most commonly used modelling type. In the example of the manufacturing line, you are essentially modelling the flow of products manufactured (you can model from raw material state all the way to a finished good), but also factoring in elements such as the people, the behaviours, the resources and the constraints, and then simulating multiple what-if scenarios. 
So, you are essentially modelling a real-life situation - in this case a manufacturing line - but in a digitalised format. You can “play” around and test multiple scenarios in a safe digital space until the optimum settings have been found and you are ready to implement them in real life. A steady-state system, on the other hand, is a system which is already calibrated and where everything is running in a stable state. In manufacturing, for instance, this would be a system running at a constant rate, such as when you are focused on chemical or thermal processes. There are no changes being made to the system, and thus there are no changes to the system’s output. Imagine running multiple tests of chemical reactions taking place in a chamber, without any interference to the process. What we would model is a system in a digital environment with no transient behavioural elements, so once all of the coupled systems have converged we will know how the system will perform. In this case the modelling techniques used may vary. So, dynamic models factor in changes, including human interaction and behaviour, as well as constant or frequent changes to the process. Steady-state, on the other hand, is when there are no changes made to a process, and thus the process should reach a stable operation. Dynamic models are used more in discrete manufacturing, where there is an element of resource usage, frequent changes over time, human interaction, coordination between automation and manual processes, or settings and environmental changes. Dynamic models are more operational. Steady-state models, by contrast, are less about the operational aspect and more about how a system behaves when the variables are not changing. In a model, whether dynamic or steady-state, you can add as many variables as you need. 
Some simple examples are costs, throughputs, chemical reactions, mixing and even random events (for a dynamic system), and many more, depending on the model you are building and the problem or question you are trying to answer. These variables do not have to be modelled in isolation; they can all be coupled and modelled together in one larger model. This gives you a holistic picture of the system and how it works once calibrated, whether as a dynamic system or a steady-state system.
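To make the Discrete Event Simulation idea from this article concrete, here is a toy single-machine line in which switching products incurs a changeover delay. The processing and changeover times, and the single-machine layout, are invented for illustration:

```python
# A minimal discrete-event simulation of a manufacturing line: jobs of two
# products arrive over time, and a product switch incurs a changeover delay.
import heapq

def simulate_line(jobs, process_time=5.0, changeover=12.0):
    """jobs: list of (arrival_time, product). Returns the total makespan."""
    events = list(jobs)
    heapq.heapify(events)  # process arrivals in time order
    clock = 0.0
    current_product = None
    while events:
        arrival, product = heapq.heappop(events)
        clock = max(clock, arrival)          # machine may sit idle, waiting
        if current_product is not None and product != current_product:
            clock += changeover              # transient behaviour: product switch
        clock += process_time
        current_product = product
    return clock

# Same four jobs, two sequences: batched by product versus alternating.
batched = simulate_line([(0, "A"), (1, "A"), (2, "B"), (3, "B")])
alternating = simulate_line([(0, "A"), (1, "B"), (2, "A"), (3, "B")])
```

Running the same jobs batched by product versus alternating shows how much time changeovers cost - the kind of what-if comparison the article describes, here stripped of the people, resources and constraints a full DES model would include.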

  • Audrey Cordina Sacco | Funis Consulting

    < Back Audrey Cordina Sacco Head of Growth & Operations Audrey Cordina Sacco (BA(Hons), BBA(Hons), MBA) An MBA graduate from Edinburgh Business School, Heriot-Watt University, where she was awarded a Distinction, Audrey is a true all-rounder in business and is our Head of Growth & Operations. Audrey is a versatile professional who handles everything - from content and marketing, to strategy and business growth - and everything in between. With experience in licensing, compliance, sales, marketing, business development, project coordination, team management and insights across Malta, Spain and the UK, Audrey has deepened her technical expertise with various short courses in data science and Python. In fact, Audrey's unique combination allows her to bridge the gap between business operations and the world of science, technology and innovation - ensuring smooth communication and strategic alignment. Besides being a passionate Funis team member, Audrey is passionate about travelling, art, food and wine, and she brings a dynamic (and fun!) approach to business - always adapting and moving according to what needs to be done to move the business forward. Curiosity and adaptability are Audrey's key drivers for success, both in life and in business. audrey@funisconsulting.com
