Multivariate Testing (MVT henceforth), the art of testing multiple page elements (almost) simultaneously, has proven its worth to me over the years on numerous projects.
Gone are the days of testing single page variants; we can now test numerous permutations of a page, each with different page elements, in a fraction of the time required for simple A/B/n testing. However, this introduces an interesting concern for us SEOs, who must ensure not only that page content converts but also that the page performs at capacity as far as search engine robots are concerned.
So with your MVT tool serving many variants of various parts of your website, have you considered the impact on your SERPs? Many MVT tools can distinguish a bot from a web browser, but have you actually checked yours? The effect this could have is profound.
Take this example: your site is highly indexed and you're running MVT iterations on your homepage. Your MVT tool is not segmenting robots out of the tests, so the search bots see a different layout every day (because your site is crawled every day). All the hard work you, the SEO guru, spent optimising the homepage for maximum SEO performance may well be compromised and undermined by the constantly changing layouts and content.
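To make the segmentation idea concrete, here is a minimal sketch of how a tool might keep crawlers out of test buckets. This is illustrative only: the `choose_variant` function and the bot pattern are my own assumptions, not any specific MVT product's implementation, and a production check would use a far more complete crawler list (plus reverse-DNS verification, since user agents can be spoofed).

```python
import re

# Assumed, simplified pattern covering a few well-known crawlers.
# Real tools maintain much longer lists and verify claimed bots.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|slurp|duckduckbot|baiduspider|yandexbot",
    re.IGNORECASE,
)

def choose_variant(user_agent: str, variants: list) -> str:
    """Serve the control layout to crawlers; bucket real browsers into a variant."""
    if BOT_PATTERN.search(user_agent or ""):
        # Bots always see the canonical page, so indexing stays stable.
        return variants[0]
    # Hypothetical bucketing: hash the user agent for a stable assignment
    # within a session (real tools would use a persistent visitor cookie).
    return variants[hash(user_agent) % len(variants)]

googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
print(choose_variant(googlebot_ua, ["control", "variant_b", "variant_c"]))
# crawlers are always served "control"
```

The point is not the specific code but the invariant it enforces: whatever the test rotation does for human visitors, a crawler should always receive one consistent, canonical layout.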
Even if the Search Bots are being filtered out, what if the winning permutation of the [Home/Product/Insert Name] page reduces the effectiveness of your page in the eyes of the Bots? This is a crucial aspect of MVT that is often overlooked.
My advice? If you're not also responsible for MVT, ensure you are fully aware of what is being tested and the impact any new layouts will have on your SEO strategy. After all, increasing page conversion while decreasing traffic may leave you spinning your wheels in the sand…
What are your experiences? I'd love to hear of other real-life examples.