How smarter factors boost returns
In this interview, Carsten Rother, Co-Head of Research Forecasts, explains how smart factors work and how they differ from the basic, often one-dimensional factor definitions used by many other approaches.
Key takeaways
- We continuously refine and enhance the performance drivers for our client portfolios using ‘smart factors’.
- Smart factors differ from the basic, often one-dimensional factor definitions used by many other approaches.
- Quoniam benefits from using alternative sources of insight and data as well as machine learning.
How do you stay ahead of the game when it comes to performance drivers at Quoniam?
The beauty of capital market theory is that it tells us returns can be explained by an underlying factor structure. Factors are certain company characteristics that have consistently led to better risk-adjusted returns. Factor investing aims to exploit those premia.
Unfortunately, we don’t know exactly what these factors are. Decades of research have established the suitability of valuation, quality and momentum for predicting stock returns.[1] While the basic idea behind these three pillars of investment remains valid, the models based on them have had to be continuously refined to capture all the angles.
For this reason, smart beta and factor strategies differ in how they define factors. Many factor approaches use comparatively simple, often one-dimensional definitions of factors – a typical feature of smart beta strategies that rely on specific key ratios related to quality, value, and so on. At Quoniam, however, we base our models on innovative factors that include not only more data, but also information derived from our proprietary research and machine learning approaches.
The chart below shows the difference between the straightforward ‘simple factor’ approach and Quoniam’s more sophisticated, data-driven method.
A more nuanced, data-rich approach to factor investing
On the left are examples of the simple factors that are commonly used in traditional models. Book yield is associated with the value style, return on equity is linked to quality, and 12-month price performance is used to gauge momentum and sentiment. However, these simple indicators often fail to capture the complexity of market drivers, and they can also be limited by outdated assumptions or data constraints.
Quoniam breaks down these styles into composite factors to capture more refined signals, as can be seen on the right. The value style, for example, consists of four composite factors: accounting value, operating value, yield, and intangible value. Each composite factor is explained by multiple underlying metrics in order to capture all the relevant aspects. Intangible value, for instance, is composed of patent value, knowledge capital and organisational capital. In total, ten metrics define the value style.
Our forecast model comprises 16 composite factors, each containing up to seven metrics. These metrics are defined in-house and have the highest long-term explanatory power. This adds up to over 60 single metrics in total. The extended factor model allows for a more granular and robust understanding of what truly drives company performance and stock returns.
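To make the aggregation idea concrete, the sketch below z-scores hypothetical raw metrics across a small stock universe, averages them into composite factors, and averages the composites into a style score. All metric names, groupings, and the equal weights are illustrative assumptions; the article does not disclose Quoniam's actual metric definitions or weighting scheme.

```python
import numpy as np

def zscore(x):
    """Cross-sectional z-score: standardise a metric across the stock universe."""
    return (x - x.mean()) / x.std()

# Hypothetical raw metrics for five stocks (higher = more attractive).
metrics = {
    "book_yield":     np.array([0.80, 0.50, 1.20, 0.30, 0.90]),
    "earnings_yield": np.array([0.06, 0.04, 0.09, 0.02, 0.07]),
    "patent_value":   np.array([0.10, 0.40, 0.20, 0.60, 0.30]),
    "knowledge_cap":  np.array([0.20, 0.50, 0.10, 0.70, 0.40]),
}

# Composite factors group related metrics; a style averages its composites.
composites = {
    "accounting_value": ["book_yield", "earnings_yield"],
    "intangible_value": ["patent_value", "knowledge_cap"],
}

composite_scores = {
    name: np.mean([zscore(metrics[m]) for m in members], axis=0)
    for name, members in composites.items()
}
value_style = np.mean(list(composite_scores.values()), axis=0)
```

Because each metric is standardised before averaging, no single metric dominates a composite, and the style score stays comparable across the universe.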
By considering and constantly refining all these dimensions, we achieve balanced exposure to all factor groups at portfolio level.
Don’t fall for the factor fallacy! Instead, seek out models that are continuously refined and empirically validated to fully capture the potential of today’s markets.
Carsten Rother,
Co-Head of Research Forecasts
What other features make your model stand out?
Firstly, how the individual factors are weighted is crucial for boosting returns. Weighting factors dynamically is akin to playing a game of whack-a-mole, where the moles represent potential sources of excess return and the player must constantly refine their strategies to stay ahead of the game.
Just as the moles pop up at unexpected times and places, the efficacy of factor definitions can diminish over longer periods of time. It’s a constant process of adaptation and refinement. We use evolving weighting to ensure stable allocation of the model components while remaining responsive to market movements.
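One way to make efficacy-based, evolving weighting concrete is to weight each factor by an exponentially decayed average of its historical information coefficient (IC), so that recent efficacy counts more while long-run evidence keeps the allocation stable. This is a generic sketch under assumed parameters (a 36-period half-life, synthetic IC histories), not a description of Quoniam's proprietary scheme.

```python
import numpy as np

def evolving_weights(ic_history, halflife=36):
    """Weight each factor by the exponentially weighted average of its
    historical information coefficient (IC).
    ic_history: (n_periods, n_factors) array of per-period ICs, oldest first."""
    n = ic_history.shape[0]
    decay = 0.5 ** (np.arange(n)[::-1] / halflife)  # newest period gets weight 1
    avg_ic = decay @ ic_history / decay.sum()
    avg_ic = np.clip(avg_ic, 0.0, None)             # no negative allocations
    return avg_ic / avg_ic.sum()

# Synthetic example: value has strengthened recently, momentum has faded.
rng = np.random.default_rng(0)
ics = np.column_stack([
    np.linspace(0.00, 0.06, 120) + rng.normal(0, 0.01, 120),  # value, improving
    np.linspace(0.05, 0.01, 120) + rng.normal(0, 0.01, 120),  # momentum, fading
])
w = evolving_weights(ics)
```

Under these assumptions the scheme tilts toward the factor that has been working recently without discarding the fading one outright, which is the "responsive yet stable" balance the text describes.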
Secondly, alongside well-established balance sheet data, we incorporate information on short selling, directors’ dealings, external fund holdings, and intangible assets. Intangible assets include intellectual property, such as patents; unlike tangible assets, they do not appear on the balance sheet. As a result, traditional valuation measures make companies whose business models rely on intangible assets look overpriced, because their reported book values are understated.
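A common remedy in the academic literature is to capitalise past R&D spending into a knowledge-capital stock via a perpetual-inventory method and add it back to reported book value. The sketch below illustrates that idea only; the 15% amortisation rate and the function name are illustrative assumptions, not Quoniam's actual methodology.

```python
def intangible_adjusted_book(book_value, rd_spend_history, amortisation_rate=0.15):
    """Capitalise past R&D spending into a knowledge-capital stock
    (perpetual-inventory method) and add it to reported book value.
    rd_spend_history: annual R&D outlays, oldest first."""
    knowledge_capital = 0.0
    for rd in rd_spend_history:
        # Each year, the existing stock amortises and new R&D is added.
        knowledge_capital = (1 - amortisation_rate) * knowledge_capital + rd
    return book_value + knowledge_capital

# A firm with reported book value of 100 and three years of R&D at 10 p.a.
adjusted = intangible_adjusted_book(100.0, [10.0, 10.0, 10.0])
```

The adjusted book value raises the book yield of R&D-intensive firms, so they no longer look spuriously expensive on a raw price-to-book basis.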
Could you provide an example of refinements to your alpha engine?
In 2024, we implemented a number of significant upgrades. For instance, we enhanced the price momentum factor by recognising that momentum effects frequently extend to other companies covered by the same analysts, thereby improving predictive power.
In line with our commitment to a constantly evolving model, we also adopted the use of machine learning to capture nonlinear effects, such as interactions or amplifications between factors. Since 2018, we have been using the machine learning technique ‘gradient boosted trees’ for this, having thoroughly tested it in real-world portfolios. As part of our ongoing commitment to innovation, we are exploring large factor models to further improve our model’s predictive capabilities.
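To illustrate why gradient boosted trees help here, the sketch below simulates a cross-section of returns driven purely by an interaction between two factors (say, momentum working best among high-quality names). A linear model finds almost no signal, while scikit-learn's `GradientBoostingRegressor` recovers it. The data, hyperparameters, and factor names are synthetic assumptions, not Quoniam's model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression

# Synthetic cross-section: returns depend only on the *product* of two
# factors, a nonlinear interaction that a purely linear model cannot see.
rng = np.random.default_rng(42)
quality = rng.normal(size=4000)
momentum = rng.normal(size=4000)
returns = 0.5 * quality * momentum + rng.normal(scale=0.5, size=4000)

X = np.column_stack([quality, momentum])
X_train, X_test = X[:3000], X[3000:]
y_train, y_test = returns[:3000], returns[3000:]

linear = LinearRegression().fit(X_train, y_train)
gbt = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                learning_rate=0.05).fit(X_train, y_train)

print(f"linear R^2:        {linear.score(X_test, y_test):.2f}")
print(f"boosted trees R^2: {gbt.score(X_test, y_test):.2f}")
```

Shallow trees combined additively are well suited to this kind of interaction effect, which is one reason boosted trees are a popular complement to linear factor models.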
What makes large factor models such an interesting machine learning framework?
Large factor models are a particularly compelling machine learning framework for asset management due to their ability to capture complex patterns in high-dimensional data. As machine learning continues to evolve, we are exploring how to integrate large factor models alongside our existing linear models. Recent academic work suggests that even models perfectly fitting in-sample data can yield stable and superior out-of-sample performance, challenging traditional assumptions about generalisation. While our initial results are very promising, their robustness in real-world investment settings requires further empirical validation.
What are the benefits of a systematic approach and what is your recommendation to investors?
Systematic investing is often mistakenly viewed as a ‘black box’, but the opposite is actually true. A quantitative approach allows full transparency of return drivers, performance attribution and the rationale behind each investment decision. Each element of the model can be broken down and examined, providing a degree of clarity that is often hard to attain with discretionary strategies.
Systematic strategies offer investors significant benefits beyond transparency. These include access to broadly diversified portfolios, the ability to capture non-linear effects through machine learning and dynamic, efficacy-based weighting of return drivers. All of these features enhance adaptability and robustness in changing market environments.
My recommendation to investors: don’t fall for the factor fallacy. Relying on simplistic or static factor definitions often leads to suboptimal outcomes. Instead, seek out models that are continuously refined and empirically validated — ones that evolve with the data to fully capture the complexity and potential of today’s markets.
[1] E.g. value: Basu (1982), Fama/French (1992); quality: Sloan (1996), Asness/Frazzini/Pedersen (2013); momentum (sentiment): Jegadeesh/Titman (1993), Carhart (1997).