Local SEO & marketplaces: A product management view

$3,000 per day. That’s the day rate for an SEO audit from an expert. In other words, $3,000 to tell you how your web app sucks, not even how to suck less. Why? Because SEO matters. There is good news, though: you don’t have to pay a dime as long as you keep your eyes on your users. This is how we did it at Timewith, my company, and ended up growing 1,000% within 6 months from SEO alone.

Timewith is a therapy marketplace. Initially I was apprehensive about SEO. We were a pure-bred product/engineering team, and SEO seemed like a dark art incompatible with our structured approach. How could we trust a process we neither understood nor controlled? But the prize was large, so I decided to make a few shy attempts. Our first foray into search engine optimization was enough to teach me that SEO rewards products focused on pleasing their users, and this changed my mindset. Sure, there is depth to cover in the technical details one needs to master, and sure, a Google “randomness” factor applies, but equally, the biggest wins came from product and UX improvements. We also didn’t have any experts at hand, so perhaps you can do the same without spending $xx,000.

SEO and marketplaces in particular have had a natural love affair that has been going on since the humble beginnings of the internet. Amazon, Expedia, eBay, Airbnb, Craigslist, Thumbtack. The list is endless. 

To illustrate, and to add some other nice examples, here’s a visual with just a few unicorns where SEO really made a difference.

SEO & local Marketplaces. A recipe for success. Mtsireud.com, 2021.

There is good reason for that. Marketplaces are not producers. Instead, their function is to organize the corresponding market’s information and make it easy for potential buyers to assess, compare and select the right product or service for them. A curated, thought-through aggregation results in a better web experience and makes customers happy.

And since marketplaces make customers happy, Google (and other search engines) promotes aggregated listings pages that offer true utility to customers (not to be confused with scrapers/aggregators with no added context, which are down-ranked by Google).

On the other hand, SEO is critical for the growth and viability of a marketplace, and therein lies the basis of this fragile symbiosis.

SEO strategy in 2021: UX above it all

Due to the intrinsic value of SEO, the level of competition has steadily grown. Whilst a decade ago simply aggregating and presenting information would serve a business well, nowadays pages need to abide by the principles of great UX, provide unique insight, and be supported by great engineering. Backlinks are another important factor, but since they’re not location-specific I will leave them out for now. However, there is one case (Thumbtack) worth looking into, where they used backlinks from local government organizations to boost local listings. Amazing.

And since SEO is equal parts art and science, it makes a natural fit for a product management viewpoint. “What does this mean?” you say. Think about the user, think about the business case, think about engineering precision. And think about Google.

The ideal scenario is that when a customer visits a result link, they educate themselves on the offering, engage by progressing deeper down the funnel, and finally take an action. Of course, not all experiences happen in a linear, centralized or sequential fashion, but this is a starting point for the goal. The bad scenario is that users can’t find what they want, or are misled, so they bounce. Google notes that as a negative signal.

Marketplace Funnel, a simple, linear perspective

So what is the best SEO strategy? It’s the one that pleases the users, and a good way to go about this is by pre-empting their needs. Ask their questions beforehand:

  1. How many of the customer queries can we answer?
    1. Can we make the answers granular and specific?
    2. Can we add unique insight to each answer?
  2. Can we get the answers fast enough? Can we internally promote the right pages?
  3. Can we allow users to find the right solutions for them?
    1. Can we allow users to dissect, compare and filter the right solutions for them?
    2. Can we create trust that we are offering the right solution?
    3. Do we have the right products or suppliers to help them?
SEO as a series of UX optimizations.
SEO: A product management approach. Mtsireud.com, 2021.

Product functionality 

Search & filter

Imagine you are looking for a therapist. What do you need to do to find, assess and compare options in order to pick the right one for you?

You might need help assessing what you need help with, so a short “test” might be available. This helps you build a relationship with the app, and the app usually benefits from capturing some data it can use to prompt you to come back.

Secondly, once you have a sense of what you’re looking for, you might think about what “the right therapist” means. This might include credentials, experience, tone and attitude, and therapeutic modality, or it might be practical considerations such as location, price and spoken languages.

In general, to keep a customer happy you need to know the questions they’ll ask and create a series of search and filtering experiences that ideally serve the content before it’s asked for, through personalization and recommendations, or at least allow the user to easily discover what they’re looking for.
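To make this concrete, here is a minimal sketch of the kind of filtering experience described above. The Therapist shape and the filter fields are illustrative assumptions, not Timewith’s actual data model.

```typescript
// Illustrative shapes only; a real marketplace model would be richer.
interface Therapist {
  name: string;
  modality: string;        // e.g. "CBT"
  location: string;
  languages: string[];
  pricePerSession: number; // in GBP
}

interface Filters {
  modality?: string;
  location?: string;
  language?: string;
  maxPrice?: number;
}

// Each filter is optional: an unset field matches everything,
// so the same function serves broad and narrow searches alike.
function filterTherapists(all: Therapist[], f: Filters): Therapist[] {
  return all.filter((t) =>
    (!f.modality || t.modality === f.modality) &&
    (!f.location || t.location === f.location) &&
    (!f.language || t.languages.includes(f.language)) &&
    (f.maxPrice === undefined || t.pricePerSession <= f.maxPrice)
  );
}
```

The same filter object can drive the UI facets and the URL query string, which keeps the filtering experience and the SEO-visible pages in sync.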

Supply and quality of data 

Supply matters. It matters to the business model, and it matters to SEO too. There’s a dangerous allure in onboarding as many professionals or as much inventory as possible to please Google. But this does not help if the content is poor. In other words, a product, a shop or a professional is not a useful addition unless there is proper QA, and the information around the products and services provided is detailed at least, and ideally unique.

Social Proof and Trust building 

Often considered conversion tactics, social proof and trust building can take your SEO a long way. Generous cancellation policies, refunds and 24/7 customer support are great for creating a trusted relationship; publications, accolades and reviews for social proof. Besides the obvious benefits, there are a few SEO-specific reasons why this is important.

  1. More clients convert if the brand seems legitimate. In fact, nowadays clients won’t convert unless the brand is legitimate; the bar is higher than ever. From an SEO perspective, this is also a great signal.
  2. Reviews can be a strong signal for search engines.
  3. It helps with retention and virality, which creates a virtuous cycle for SEO too. When users trust a brand they sign up for newsletters, follow pages on social media, and eventually come to the site “direct”, i.e. by typing the name of the business. All of that is a strong signal for SEO, and at the same time it serves a valuable strategic goal: slowly setting the stage for independence from SEO, which is and always will be owned by Google.

Information Architecture 

Information architecture should be designed by thinking in terms of information priority, or in other words: what are the questions users ask, and in what order? This is where good keyword research comes in handy.

Let’s return to our therapist example. In this case, looking at the keyword structure (UK), we conclude that the top searches are, in this order:

  1. “Therapist in <location>” – between 500,000 and 1,000,000 searches per month. Example: Therapist in Manchester
  2. “<Counselling area> therapist” – between 100,000 and 250,000 searches per month. Example: Depression therapist in London
  3. “<Approach> therapist” – between 50,000 and 100,000 searches per month. Example: CBT therapist

What does this tell us? It tells us the way people think. Firstly, users might think in terms of distance and locality. However, this subset of users may or may not have a lot of knowledge of the sector. For instance, they might not have considered what the right modality for them is, and thus might also be of lower intent.

If you’re starting up and building your domain ranking, it makes sense to go for the niche, specific queries. When it comes to SEO, less search traffic is often correlated with lower competition and higher intent. Then you can build your way up to the fat heads (e.g. therapist in <location>). 

However, from an overall architecture perspective, here is what this could look like in terms of your information order and URL structure.

Information architecture could look like this: 

  • Information order
    1. Nearby areas
    2. Actual results
    3. Pagination
    4. Related searches with area and approach
    5. Stats about therapy inside the search area (custom content, continuously updating)
    6. Featured therapist & therapy approaches (custom content, continuously updating)

The more technical view, including URL architecture could look like this: 

  • URL structure – three SEO-friendly URL structures
    1. Location only:  
      • Top location:  /topLocation/?page=x 
      • Sub-location:  /topLocation/subLocation?page=x 
    2. Location & counselling area: /topLocation/subLocation/counselling/<area>?page=x
    3. Location & therapy approach: /topLocation/subLocation/approach/<therapy_approach>?page=x
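As a sketch, the three structures above could be generated by a single helper. The function name and parameter shape are assumptions for illustration, not a real API:

```typescript
// Illustrative parameter shape for the three URL structures above.
interface ListingParams {
  topLocation: string;
  subLocation?: string;
  counsellingArea?: string;  // e.g. "depression"
  therapyApproach?: string;  // e.g. "cbt"
  page?: number;
}

function listingUrl(p: ListingParams): string {
  const parts = [p.topLocation];
  if (p.subLocation) parts.push(p.subLocation);
  // Counselling area and therapy approach are mutually exclusive branches.
  if (p.counsellingArea) parts.push("counselling", p.counsellingArea);
  else if (p.therapyApproach) parts.push("approach", p.therapyApproach);
  const path = "/" + parts.join("/");
  // Top-location pages keep the trailing slash, matching the structure above.
  const base = p.subLocation ? path : path + "/";
  // Page 1 omits the query string so the canonical first page has one URL.
  return p.page && p.page > 1 ? `${base}?page=${p.page}` : base;
}

// listingUrl({ topLocation: "manchester" })  -> "/manchester/"
// listingUrl({ topLocation: "london", subLocation: "camden",
//              counsellingArea: "depression", page: 2 })
//   -> "/london/camden/counselling/depression?page=2"
```

Centralizing URL generation like this keeps internal links consistent, which matters because inconsistent URLs for the same page split link equity.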

Rich (user-generated) content

Reviews. Insights. Comments. Recovery data. A dedicated section on “How our first session works”. Anything that is not automated, duplicated or scraped, and that offers insight into a question a user might have, is gold.

Here the winnings are twofold: first, users spend longer on your pages; secondly, they progress down the funnel thanks to trust and the absence of cognitive barriers – you have answered their questions.

This is a great example where product management and SEO overlap. Product managers think in those terms already to increase conversion, revenue and decrease CAC. But guess what? This is what Google looks for too.

Finally, deliver content that keeps your users coming back. Emails and social media prompts give them reasons to return.

Here’s what blending the concepts of tags, content and overall architectural considerations might look like for a results page:


Latency

Then there’s latency. When was the last time you waited more than 3 seconds for a page to load? According to Google, after 7 seconds 32% of users will have bounced from a page. Why do you think Facebook, Twitter and Instagram do all this work to load in less than 3 seconds?

So here are a few suggestions to adopt early on that can get any application loading in under 2 seconds.

  • Consider pre-rendered content. An example architectural solution would be a React application with Server-side rendering.
  • CDN caching – cache the pre-rendered pages for <x> hours/days.
  • Only load the information (content) required for Google to index the pages, e.g. don’t pre-load the availability of listings – this is pulled in after the whole page is loaded in the browser.
  • Use compressed versions of images, together with lazy loading them as the user scrolls down the page, i.e. don’t load the whole page, just the first results, like the Facebook feed.
  • Preload any CSS and JS assets so your page rendering isn’t blocked. There are trade-offs here between performance and UI (e.g. with preloading you may see an unstyled version of the page for a fraction of a second before seeing the styled one) but hey; SEO benefits, right?
  • Finally, database design. If you’re using a relational DB, make sure the relationships in your database are designed to serve the performance of loading the data.
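As one concrete example of the image point above, here is a sketch of generating compressed, lazy-loaded image markup. The ?w= query-parameter resizing is an assumption about the image host, not a universal API:

```typescript
// Generates responsive <img> markup: the browser picks the smallest sufficient
// compressed variant from srcset, and loading="lazy" (natively supported in
// modern browsers) defers off-screen images until the user scrolls near them.
function imageTag(src: string, alt: string, widths: number[] = [320, 640, 1280]): string {
  const srcset = widths.map((w) => `${src}?w=${w} ${w}w`).join(", ");
  // The smallest width doubles as the fallback src for older browsers.
  return `<img src="${src}?w=${widths[0]}" srcset="${srcset}" alt="${alt}" loading="lazy">`;
}
```

Because this runs at render time (server- or client-side), the same helper works whether the page is pre-rendered for Google or hydrated in the browser.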

When it comes to engineering specifically, I see this as free points. Every team controls their own performance, as opposed to controlling users’ reactions to a new experience. That’s not to say the above will be trivial or easy; it will take time, but it is also the most straightforward optimization conceptually. In other words, this is up to you.

Link equity and no-follow

Another important concept to consider is that of link equity. Think of your site as having a total amount of credit. Which pages should this credit be distributed to? Are your meta-tags reflecting this hierarchy of page importance? 

  • Meta tags
    1. First listing page <meta name="robots" content="index, follow"> – these are the pages Google indexes, so only the first pages show in Google results. These are the pages your link equity is diverted to.
    2. Next pages <meta name="robots" content="noindex, follow"> – tells Google not to index the other result pages, but to follow all links on them.
    3. Also, avoid indexing pages with few results that don’t provide a meaningful search experience.
    4. Finally, add rel="nofollow" to links in the footer section that you don’t require for SEO reasons (e.g. terms or privacy pages), because why pass link equity to T&Cs?
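The rules above can be condensed into one small helper. The minResults threshold is my own assumption for what counts as a “meaningful” results page, not a Google guideline:

```typescript
// Chooses the robots meta content for a paginated listings page:
// - thin pages (too few results) are never indexed,
// - only the first page of a listing is indexed,
// - "follow" is kept everywhere so link equity still flows through.
function robotsMeta(page: number, resultCount: number, minResults = 3): string {
  if (resultCount < minResults) return "noindex, follow";
  return page === 1 ? "index, follow" : "noindex, follow";
}
```

The returned string is what goes into the content attribute of the robots meta tag during server-side rendering.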

And just like that, you’ve got SEO game.

How much should you invest in SEO? 

If SEO sounds like the promised land, it’s not, and there are pitfalls to be aware of. As time goes by, Google may change their algorithms, or decide to capture a larger part of SEO for themselves by selling promoted listings.

A good example is the travel vertical of search. Type “flights to <location>” and you will see Google Flights, Google’s own flight search product.

Is this an imminent threat in your industry? It depends on how big your market is, but it is an eventuality. When you rent a house, the landlord will eventually kick you out.

For a marketplace, the end game is to have the majority of traffic flowing direct – to become a destination site. Early on, every business has to solve for discoverability, which can either be paid and expensive, or organic; SEO is a known and tested formula for getting there fast, but it’s dangerous as the end-to-end play.

Does this mean not investing in it? Certainly not. Take on SEO early, treat it like a core part of the product, and you will have a consistent, sustainable strategy that can take you far.

The href attribute is not fit for purpose anymore

We have reached the point where the functionality of the href attribute needs to be extended. Links within an article – like the one you are reading – aim to provide context or reference for the main object of interest, i.e. the page currently being accessed.

However, gaining access to the contents of a link means either reprioritizing attention (changing URL, visiting the link) or not gaining the value at the time it’s needed (opening tabs to read later).

In a low-bandwidth world, when internet protocols were still being shaped, this made sense. Unsurprisingly, the mental model of the link is similar to a pointer in C.

Given that this utility is no longer fit for purpose, it feels like we keep it not because we choose it, but because we have not questioned it.

The notion of a “link” could be much more versatile: a) embedded like <video>, b) with controls similar to autoplay, and c) intelligent, i.e. selecting the most relevant bits of the referenced source to show as an interstitial within the article.

Mostly, this would be a better experience, but it would also have heavy implications for traffic distribution. Users would visit fewer sites and thus consume fewer ads. You can immediately see who’s losing in that scenario (hi Google). That being said, one’s loss is another’s opportunity.

This is a geeky topic I would like to talk about. Hit me up if you have any thoughts on it (technical or commercial) – I might be interested in building something in the space.

Alternatively, if you’re reading this and thinking of building solo, do still text me. I could use a better reading experience online and would be your first user.