Future-proofing quantitative investing for complex markets

By Leanne Micklewood, Co-Head of Quantitative Research and Data Science, and Sipho Mkhaba, Data Engineer

23 November 2025 | Read time: 3 min

Recently, we expanded our factor library from local to global stock data, now covering more than 30 years of data worldwide. It includes both traditional academic factors and bespoke, in-house research-driven factors. This global scope empowers us to build alpha models tailored to any stock universe, whether Shari’ah-compliant, South African-focused, or otherwise, effectively creating a unified alpha engine accessible across all our quantitative strategies.
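To illustrate the "unified alpha engine" idea: a factor library can store exposures once and then slice them by any eligibility screen. Everything below (the library contents, universe names, and the `get_exposures` helper) is a hypothetical sketch, not the firm's actual schema:

```python
# Hypothetical sketch: one factor library, many investment universes.
# Factor names, tickers, and universes are illustrative only.

FACTOR_LIBRARY = {
    # factor name -> {security: exposure score}
    "value":    {"AAA": 0.8, "BBB": -0.2, "CCC": 0.5},
    "momentum": {"AAA": -0.1, "BBB": 0.9, "CCC": 0.3},
}

UNIVERSES = {
    # the same library is sliced by any eligibility screen
    "global":  {"AAA", "BBB", "CCC"},
    "shariah": {"AAA", "CCC"},  # screened subset of the global universe
}

def get_exposures(factor: str, universe: str) -> dict[str, float]:
    """Return a factor's exposures restricted to one investment universe."""
    members = UNIVERSES[universe]
    return {s: v for s, v in FACTOR_LIBRARY[factor].items() if s in members}

print(get_exposures("value", "shariah"))  # {'AAA': 0.8, 'CCC': 0.5}
```

The point of the design is that adding a new universe is just a new screen over existing data; no factor needs to be recomputed.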

Given the sheer volume and complexity of the data, we maintain reliability through automated coverage and quality checks. Moreover, a bespoke factor testing module runs comprehensive statistical tests on all factors across any investment universe over extensive back-test periods. This framework identifies which factors are likely to produce alpha, shifting our research team’s focus from raw data acquisition to interpretation of factor testing outputs, and enabling more targeted and effective strategy development.

      Building for scale

The factor library’s role as a single source of truth enables seamless customisation of strategies without rebuilding core infrastructure. As markets shift or new performance patterns emerge, we can quickly analyse which factors are driving returns in any investment universe. These transparent, data-driven insights are integrated into our systems through continuous integration and delivery pipelines, ultimately serving our clients better. Every investment decision is therefore grounded in verifiable data that has passed through our rigorous testing framework; automated quality checks and statistical validation give portfolio managers and clients confidence that decisions rest on accurate historical data.

While technology provides the scale, it is the dedication of the quantitative researchers, data engineers, and portfolio managers that unlocks its true potential. By building infrastructure that handles time-consuming computational tasks, teams are empowered to focus on what humans do best – creatively solving problems, critically analysing market dynamics, and designing investment solutions that align with real client needs. Although AI and automation are expected to take on an increasing share of the quantitative investing workload, human oversight and intervention will remain essential. Investment firms must carefully assess how they deploy their quantitative research capabilities going forward, ensuring that human expertise and judgement continue to guide the development and application of these tools.

      This ongoing partnership between advanced technology and skilled human teams ensures that infrastructure not only supports current demands but also remains adaptable and responsive to future innovations. It is these enhanced teams that build, maintain, and evolve the infrastructure, transforming raw computational power into actionable investment insights.

      A strategic imperative

We operate in an era defined by rapid data growth and market complexity, and building future-fit infrastructure for quantitative analysis is not just an operational necessity but a strategic imperative. It requires a seamless blend of cutting-edge technology, robust data management, and human expertise to deliver consistent alpha and meaningful client outcomes. As markets evolve, our commitment to innovation, transparency, and adaptability is critical to ensure that we remain at the forefront of quantitative investment techniques, equipping investors with the tools and insights they need to thrive in an unpredictable world.