SEO Hierarchy & Roles
1) Standard SEO Hierarchy & Roles
SEO has matured in terms of competition and complexity, so nowadays (more often than not, though this is not an ‘absolute’ rule) practising SEO requires a team of people to compete sensibly, each focused on different activities and aspects of SEO / Digital PR.
Usually you have a downward hierarchy like this:
- Head of department
- Account / department director (agency / client-side)
- Account / line manager (agency / client-side)
- Account / SEO analysts and/or strategists (agency / client-side)
- Account / SEO executives (agency / client-side)
- Junior staff, learners & interns
2) Sub-Departmental Break-Off:
At certain junctures all of these job roles specialise and head into sub-departments of the SEO team, such as:
- Content (copywriting)
- Production & implementation (think dev / designer-lite)
- Analysis or R&I (SEO research and analysis; your Analytics, API, regex & SQL experts go here)
- Technical (like analysis, but focused purely on on-site analysis, insights and recommendations)
- Link-building / digital promotions (like analysis again, but focused purely on off-site analysis, insights and recommendations)
For example you could have a Digital Promotions Analyst, a Technical SEO Analyst, a Technical SEO Manager or a Creative Content Director.
3) What’s Changed? Why Can’t One Guy Be ‘My SEO’?
Going back 8 or more years, you could, as an individual, process hundreds of directory submissions a day, stuff meta keyword tags and create any number of location- / variation-based doorway pages, conquering Google’s results in any given area relatively quickly. SEO was all about automation, so one person could manage it.
These days Google have released a number of updates to combat the prior relative ease of search manipulation (see their Penguin and Panda updates in particular; others like Caffeine, Farmer and May-Day also had strong impacts, and May-Day was big and is often forgotten about). Google’s aim is to index the web so that users can find what they want with ease; as such, Google aims to show users a reflection of what they are already seeking to find. Having their results injected with new unknown quantities is good for that particular site owner but (9 times out of 10) bad for everyone else.
If Google provide a bad user experience they will lose their users to a competitor like Bing, Yahoo or DuckDuckGo (pretty cool, check it out; I still miss Blekko!). In such an event Google’s search engine would draw far fewer users, and their ad revenue (from Google AdWords, but also from places like their Display Network) would decrease significantly, stopping their main bread-winner (the Google search engine) from being a viable project.
Again: something started by a couple of students (Google was originally a university project named “BackRub”) quickly grows in complexity and ceases to be something which can be continued by one or two people alone.
As Google grows, the SEO (and paid search) industry which feeds on the periphery of Google’s success must grow too. As Google becomes more complex, so must the SEO response to Google’s increasing complexity.
I consider myself one of a number of SEO experts, but like the others, what I do changes every day. I’m actually quite lucky to have very strong training in all areas of SEO: I can build great links and expose a site’s technical flaws (including negative-SEO vulnerability issues) in hours. So if I can do everything to a high standard, why don’t I? Why don’t I just do all of that for one person?
TIME. I don’t have unlimited time, and I can’t compress time.
Without cloning myself my pace of work would be woefully insufficient. I work very quickly, but because SEO is so big now, one person just CAN’T do it all.
Unless you can accept that, you’ll never pay for the package you need or get the results which you require.
4) Alright Mr Smart Arse, but What’s the Main Thing You DO?
Aggregate. For instance, when performing a backlink analysis we have to download as many backlinks to our client’s site as we can find, in order to evaluate their health and toxicity. Many sites and/or tools have tried to chronicle all the links on the web, but they all use different crawling methodologies (thus finding different web pages and links, with some areas of overlap), so we have to take data from all of them and normalise, de-dupe and aggregate it to get the most complete picture possible.
Anyone who promises a 100% complete picture of all your site’s (or your competitors’) links is either lying or doesn’t actually know how stupid they are. We have to download data from Ahrefs, Majestic SEO, OSE (Moz), GSC (Search Console) and many other tools (such as SEO Profiler and SEO SpyGlass) and then aggregate it all.
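The normalise / de-dupe / aggregate step above can be sketched in a few lines of Python. This is a minimal illustration, not any tool’s real export format: the tool names and URL lists below are made-up sample data, and real exports would come in as CSVs with many more columns. The core idea is simply to normalise each URL so the same page reported by different tools collapses to one key, then record which tools saw it.

```python
from urllib.parse import urlsplit

def normalise(url: str) -> str:
    """Normalise a link URL so the same page from different tools matches:
    lowercase the scheme and host, drop any #fragment, strip trailing slashes."""
    parts = urlsplit(url.strip())
    path = parts.path.rstrip("/") or "/"
    query = f"?{parts.query}" if parts.query else ""
    return f"{parts.scheme.lower()}://{parts.netloc.lower()}{path}{query}"

def aggregate(exports: dict[str, list[str]]) -> dict[str, set[str]]:
    """Merge per-tool link exports into one de-duplicated map of
    normalised URL -> set of tools that reported it."""
    merged: dict[str, set[str]] = {}
    for tool, urls in exports.items():
        for url in urls:
            merged.setdefault(normalise(url), set()).add(tool)
    return merged

# Illustrative sample data only -- not real tool exports.
exports = {
    "ahrefs":   ["https://Example.com/page/", "https://example.com/other"],
    "majestic": ["https://example.com/page", "http://example.com/unique"],
    "gsc":      ["https://example.com/page#frag"],
}
links = aggregate(exports)
```

Here all three tools’ variants of `/page` collapse to one entry reported by all three sources, which is exactly the overlap-with-gaps picture described above: each tool finds some links the others miss, and the aggregate is more complete than any single export.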
Google became king by aggregating data. They have done so to such a vast extent that we, as SEO workers, must now also aggregate our insights to keep up. If you DO have to hire just one guy, look for a data aficionado who makes no mistakes: a true Excel and database aggregator. There are some wizards about.