(Revised 23 Feb 2016)
The SEO technique for optimizing a website, called a “silo,” is described pretty well on the Bruce Clay site. But actually implementing silos on a big website can be an exercise in frustration unless you know what you’re doing.
When you optimize (SEO) a large site with a lot of categories and huge link menus, it becomes really important to do the siloing right.
If every page of a large site links to every other page through the menu system, then any PageRank coming to the site from Google is shunted around and diluted: not concentrated into your main categories, and not USED the way it should be to get you better rankings at Google.
The standard method of making a big menu that shows up at the top (or left side) of every page is NOT conducive to good search engine rankings for a large website, especially if it is an HTML/CSS drop-down menu using A HREF tags as links (as pretty and clever as those menu systems may be).
Here’s the theory on which this is based. (I have seen no proof of this, other than the increased PageRank of sub-pages of a site after setting up a silo system):
What happens when you have a big menu on your home page with, say, 100 links to pages in your site, is that the PageRank coming to your site from Google is cut up into 100 pieces and distributed to all 100 links. Google says they give more weight to text links than menu links, and I’m fairly sure that’s true.
What happens if you link TWICE to certain pages (from a top menu and a bottom menu, say)? Well, then you’re just wasting precious PageRank: Google only counts the first link for PageRank purposes. It still divides by 100 (if that’s the number of links from your home page), but it only passes PageRank once, so a little bit gets “leaked” or wasted for every duplicate link on your home page.
But, you say, “Those big navigation menus are important pieces of my human interface for my site. I don’t want to get rid of them entirely.”
So, how can you focus the PageRank down to your “silos” and still have those awful, huge menu systems? I’ll tell you in a minute….
We’ve achieved very good rankings for several websites, and increased the PageRank of their main sub-pages, by making the main pages they want to rank for at Google the only links from the home page that are VISIBLE to Google. That’s just one step of the silo process, which I’ll outline in full below.
There are problems with this, of course:
- “How do you still have a user-friendly navigation system that shows the rest of your website to human visitors?”
- “What about linking to our contact page on every page?”
- “What about the links to our ‘Specials’ page we have on every page of our site?”
- “What about the link to the blog that we’re supposed to have on every page of our site?”
Those questions are answered below.
The Silo Process for Large Websites:
1. Keyword Research.
Sorry if you thought it might be something else. Doesn’t everything start with keywords?
Figure out the main keyword phrases for which your site should rank well at Google. Pare that down to about a dozen terms. If you aim for more than a dozen keyword phrases, then the silo process will be less effective. For the purpose of the silo process, make sure you:
- a) aim for keywords that are relevant to what the site is actually about, and
- b) aim for keyword phrases having a good volume of searches at Google every month.
We use the Google AdWords Keyword Planner to research that information thoroughly, but WordTracker and other keyword research tools also work well. Use the keyword research tools you’re comfortable with.
What I’m saying here is, don’t use the silo process for “long tail” keywords. If you try to have a thousand silos in your site, you’ll fail. Try for a dozen — that’s an attainable goal!
Example: For a large drug rehab website, the main, targeted keywords, based on our keyword research, might be:
- drug treatment
- drug addiction
- drug rehab
- marijuana addiction
- heroin addiction
- meth addiction
- oxycontin addiction
- cocaine addiction
- crack addiction
- drug rehab clinic
2. Set Up Folders on the Server for Those Keywords
Make sure you have your folders organized by content, focused on those keywords.
These are going to be your “silos”.
So in my continuing example, your folders (directories on your server within the www/ or http_docs/ or public_html/ folder) would be:
- drug-treatment/
- drug-addiction/
- drug-rehab/
- marijuana-addiction/
- heroin-addiction/
- meth-addiction/
- oxycontin-addiction/
- cocaine-addiction/
- crack-addiction/
- drug-rehab-clinics/
The way I’ve set up the folders above, each of these folders (directories) is going to be a silo.
3. The ONLY Links from the Home Page Go to These Folders
From the home page of the site, use standard <A HREF> links ONLY to these folders. One reason for doing this: some of these links will become the sitelinks under your site’s listing at Google (if you can get your site to the #1 spot), instead of the terms that typically show up as sitelinks, such as “contact” and “about us”. Think for a moment about why a contact page or an about page might show up as a sitelink. Here’s my conclusion: Google makes them sitelinks because webmasters typically link to them from every page of a website. They must be important, right? So don’t do that unless you really want your contact page or about-us page as sitelinks.
On the home page, you can still give humans the standard set of normal links (to “home, about us, contact us, sitemap, privacy policy” or whatever) in a menu that Google will not crawl, follow, or index: put those links in an iframe.
Don’t use JavaScript to make your navigation menu (because Google can read JavaScript just fine nowadays). And don’t even consider using Flash as your navigation system — you’d lose any visitors on iPhones, iPads, or Macs. Sorry, but Flash is history – I recommend you don’t use it for anything.
It is perfectly valid to put a menu in an iframe on your home page and forbid Google from crawling or indexing the framed page, so that humans can go from the home page to ANY page of the site from within the iframe. But as far as Google is concerned, since Googlebot only follows A HREF links (and JavaScript links), you only have the 10 links (one to each of your silos) coming off your home page.
So you CAN have a lot of other links on the home page; just don’t make them Google-friendly links. You can even use standard A HREF links. Simply put those in an iframe, and then use your robots.txt file to exclude Google from crawling the iframed file. The file called by the iframe (your menu for humans) is then never crawled by Google. Voila! The links within it won’t count for PageRank purposes. I have extensive anecdotal evidence that this works very well, because I’ve done it with sites and they routinely rank highly for very competitive keywords. Of course, your mileage may vary.
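To make that concrete, here’s a minimal sketch of a home page set up this way. The folder and file name for the human menu (menus/human-menu.html) are just examples I’m inventing for illustration; the silo folders are the ones from the example above:

```html
<!-- index.html: the only crawlable A HREF links go to the silo folders -->
<nav>
  <a href="/drug-treatment/">Drug Treatment</a>
  <a href="/drug-addiction/">Drug Addiction</a>
  <a href="/drug-rehab/">Drug Rehab</a>
  <!-- ... one link each for the remaining silos ... -->
</nav>

<!-- the full menu for humans lives in a separate file, pulled in by an iframe -->
<iframe src="/menus/human-menu.html" title="Full site menu"></iframe>
```

And the corresponding robots.txt rule that keeps Googlebot away from the iframed menu file:

```
User-agent: *
Disallow: /menus/
```

With /menus/ disallowed, the links inside human-menu.html never get crawled, so (per the theory above) they don’t divide up the home page’s PageRank.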
4. From the Folders, Only Link UP or DOWN
From the main silo pages, such as (in my example):
drug-treatment/index.php (or asp or html – it truly doesn’t matter what the file extension is)
you use A HREF menus that link only DOWN into that folder, and UP to the home page. DO NOT LINK ACROSS to the other main categories/silos/folders of the site.
drug-treatment/index.php might then use A HREF links to link:
- up to the home page
- down to drug-treatment/dual-diagnosis.html
- down to drug-treatment/outpatient.html
- down to drug-treatment/inpatient.html
- down to drug-treatment/admission/index.html (which might have more pages in this folder)
These sub-pages can link across to other pages within the same silo, but not out of the silo to other categories or folders.
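As a sketch, the crawlable menu on drug-treatment/index.php would then look something like this (paths taken from the list above); the iframed human menu, not shown here, can still link anywhere:

```html
<!-- crawlable nav on drug-treatment/index.php: up to home, down into this silo only -->
<nav>
  <a href="/">Home</a>
  <a href="/drug-treatment/dual-diagnosis.html">Dual Diagnosis</a>
  <a href="/drug-treatment/outpatient.html">Outpatient Treatment</a>
  <a href="/drug-treatment/inpatient.html">Inpatient Treatment</a>
  <a href="/drug-treatment/admission/">Admission</a>
</nav>
```

Note what’s absent: no A HREF links to /drug-rehab/, /heroin-addiction/, or any other silo.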
Again, you can have as many links to the other categories and pages within those categories as you think you should have, within menus for humans to use to navigate the site, but those links MUST be in a non-Google-friendly menu. In other words, in an iframe.
(Historical note: In years past I used links set up as forms, links in JavaScript, and Flash links to keep Google from crawling and indexing the links I didn’t want it to see. Google has been reading JavaScript just fine for some years now, so putting your links into JavaScript no longer works.)
All the content of the site is then put within those folders, or other folders such as
- about/ (the contact page, about us, privacy policy, references, etc. all go here).
At some point you’ll have to put each and every page of the existing website (500 pages? 50,000?) onto a spreadsheet and figure out which silo/folder and sub-folder it should be in. This classification process is called “taxonomy”. Taxonomy is your friend at Google. And, of course, you’ll need to set up a 301 permanent redirect from each old page location to its new one.
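If the site runs on Apache, one simple way to do the redirects is an .htaccess file in the web root; the old paths below are hypothetical stand-ins for wherever your pages used to live (on nginx or IIS the mechanism differs):

```apache
# 301 (permanent) redirects from old page locations to their new silo homes
Redirect 301 /heroin-info.html /drug-treatment/heroin/index.html
Redirect 301 /outpatient.html  /drug-treatment/outpatient.html
```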
5. Go Deep!
You can have silos that keep on going down, deep into the pages within a website. Using my hypothetical drug rehab site as an example, you could do a sub-silo for drug-treatment/heroin/, containing pages such as:
- drug-treatment/heroin/index.html
- drug-treatment/heroin/what-is-heroin.html
- drug-treatment/heroin/symptoms-of-heroin-use.html
- drug-treatment/heroin/intervention.html
- drug-treatment/heroin/drug-withdrawal-symptoms.html
- drug-treatment/heroin/methadone-pros-and-cons.html
- drug-treatment/heroin/relapse.html
and so on. You could have 20 or 200 pages about the rehabilitation and treatment of heroin users, each with its own unique content but all of them focused on the rehabilitation of heroin users.
In this example, the methadone page (methadone is a heroin substitute) would link upward, with an A HREF link, to the drug-treatment/heroin/ folder it is in, but NOT across to every other page of that silo.
Again, you can have as many menus for humans as you want, with links to any other pages in the site (you could conceivably make a huge drop-down JavaScript menu system that would show a link to EVERY page in the site, viewable from every other page of the site) AS LONG AS you don’t make them Google-friendly. Use iframes for those menus you want humans to view, but that you don’t want Google to follow.
So what Google then “sees” when it crawls your site’s link structure, is:
- home page – linking to 10 folders.
- each folder linking down within itself and up to the home page.
- each sub-folder linking down within itself and up to its main page within that folder.
To a web crawler like Googlebot, this gives your site the appearance of being neatly organized into folders, makes it simple to figure out your subject matter, and makes for easy and certain categorization in the index under the keywords that YOU picked.
PageRank coming from Google, when the site is siloed properly, is then smoothly distributed through your home page to your main sub-pages, then down to their sub-pages, and so on. It’s not dispersed across a hundred links, then another hundred links on every page. It’s the difference between a fire hose of PageRank and a sprinkler.
6. Take Advantage of Off-Site Links to Sub-Pages
If you have a page with many links coming to it from off-site, then it can be its own silo as well. It works like the home page: Google flows PageRank directly to that page, and you can flow that PageRank down to where YOU want it to go if you organize the menus on that page so that Google only sees the links you want. The PageRank coming to that page directly from Google (not via the home page) will then flow from that page only to the pages you link to.
In our example, that page with many links to it from other websites, giving it its own PageRank, might be:
- drug-rehab-clinics/Los-Angeles/
Turn that page into a silo by only linking UP and DOWN. Link up to
- drug-rehab-clinics/ (the folder it is in)
link down to
- drug-rehab-clinics/Los-Angeles/Hollywood.php
- drug-rehab-clinics/Los-Angeles/Inglewood.php
- drug-rehab-clinics/Los-Angeles/Beverly-Hills.php
You might need to MAKE a folder to do that.
—–
That’s the silo method we use for our clients with large websites, and it works well.
We’d be happy to talk with you about “siloing” your website so Google can figure out what it is about, and help you attain better search rankings as a result. Feel free to call us at 541-655-0285 or contact us by email.
I want to thank you for your article about silos. I liked that you explained everything in detail.
I have some questions for you:
a) What is the best way to link down from the drug-treatment/ folder to 500 sub-silos?
Example:
drug-treatment/cocaine/
drug-treatment/crack-cocaine/
drug-treatment/ecstasy/
drug-treatment/heroin/
drug-treatment/lsd/
drug-treatment/methamphetamine/
And another 494 sub-silos.
b) In this example (your example):
drug-rehab-clinics/Los-Angeles/Hollywood.php
drug-rehab-clinics/Los-Angeles/Inglewood.php
drug-rehab-clinics/Los-Angeles/Beverly-Hills.php
Should Hollywood.php, Inglewood.php, and Beverly-Hills.php only link up to the Los-Angeles/ folder?
I would appreciate it if you could explain the first question in a little more detail. Thank you!
C. M.
You’re welcome.
You wouldn’t link to 500 sub-silos; the silo link structure only works if you limit the links. If you try to flow PageRank to 50 pages from one page, you wind up with very little making it to any one page. I believe (based on my tests) that if a page links to only two pages, it flows half its PR to each of them. But if you link to 10 pages, it only flows about 1/20th of the PR to each page. And if you link to 100 pages, I think it only flows about 1/1000th of the PR to each page. Not much makes it through; it’s as if it were on a logarithmic scale, not a direct ratio. Just my theory, but it seems to work.
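For comparison, the textbook PageRank model splits a page’s flowable PR evenly, 1/n per link when there are n links. Here’s a quick JavaScript tabulation of that even split against the falloff I’ve observed; the “observed” numbers are my anecdotal figures from above, not measured data:

```js
// Even-split model: each of n links gets a 1/n share of the page's flowable PR.
// "observed": the rough shares reported above (2 -> 1/2, 10 -> ~1/20, 100 -> ~1/1000).
const rows = [
  { links: 2,   evenSplit: 1 / 2,   observed: 1 / 2 },
  { links: 10,  evenSplit: 1 / 10,  observed: 1 / 20 },
  { links: 100, evenSplit: 1 / 100, observed: 1 / 1000 },
];
for (const r of rows) {
  console.log(`${r.links} links: even split ${r.evenSplit}, observed ~${r.observed}`);
}
```

Either way the lesson is the same: the fewer crawlable links on a page, the more PR each one carries.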
So pick 10 sub-pages (max) to link to from your first silo level, to create a sub-silo. Go down from there to create even more silos, on the 3rd and 4th levels down in your site. I think this stops being effective around the 5th level, but I could be wrong.
You might want to bone up on “taxonomy” as a subject. It’s an actual science of classifying and organizing information (originally in biology), taught in universities, and one I’d like to spend a lot more time studying (but who has the time?). It’s used in Information Science (IT) as well.
You asked:
Should Hollywood.php, Inglewood.php, and Beverly-Hills.php only link up to the Los-Angeles/ folder?
That’s right. They link back UP to Los Angeles, unless they have pages organized under them in your silos.
What about nofollow tags? Are they a viable alternative to iframes, JavaScript, and Flash?
No, the nofollow attribute is not a viable alternative to iframes, JavaScript, or Flash as far as hiding a link from Google goes. And rel=nofollow has an unpleasant side effect.
The main side effect is that any link Google can see will receive a proportionate share of the overall PageRank flowing from the page it is on, even if that link is set to rel=nofollow. So putting a link Google can see on a page, even with the rel=nofollow attribute, completely wastes that piece of the PageRank flowing from the page to the rest of the website. If you limit the links instead, you focus the PageRank where you want it to go.
I very rarely use the rel=nofollow attribute on a link. Usually I use it when there is a valid reason to link from the text of a page to another website, but I don’t want to help that site by flowing it any PageRank. It wastes my PageRank, but I’d rather waste it than help a competitor or a known scammer.
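For what it’s worth, that use case is just an ordinary nofollow link in the body text; the URL here is a made-up example:

```html
<!-- a link I don't want to endorse: nofollow withholds the PR, though (per the
     above) the link still consumes its proportionate share of the page's PR -->
<a href="https://example.com/" rel="nofollow">the site in question</a>
```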
Great post. I have been reading about siloing off and on for a couple of weeks, and this was by far the most helpful. I have even been to one of Bruce Clay’s one-day training sessions, and this was still more explanatory. I do still have a couple of questions, though. What if, after doing keyword research, you are having a hard time finding keywords with decent volume besides the main keyword? Would you still just use the best you can find, or would you restructure your silo around a weird combination (not as closely related in a theme) just to have better-trafficked keywords? Also, do all the subpages need to have the main keyword of that silo in the anchor text, or is it more important that they just relate to the overall theme of that silo? For example: Sewer Repair -> Sewer Line Repair -> Trenchless Sewer Line Repair. Or: Sewer Repair -> Sewer Cleaning -> Sewer Video Inspections, etc. Thanks for the help and the great post.
Good questions, Josh. I’m not sure I can answer all of them, as they are more like a logic puzzle than anything SEO- or silo-specific.
When doing keyword research for use in optimizing a website, you usually find a few generic terms that have a zillion searches. It’s usually a mistake to go after these (but not always). One of my clients, for example, had the main keyword “life insurance”. Lots of competition for that term, and high search volume. The average website is simply not going to rank well for “life insurance”, even if that’s mostly what it sells; there’s only room for 10 sites on the first page of Google search results. My client was on the first page of results for several years; then the company was sold, the websites were changed, and all the good rankings went away. Sigh. (Bruce Clay’s company and mine both worked on that site at different times.)
But the average website should go after the keywords that actually apply to the site: the keywords that, when someone searches for them, would make that person very happy to find your site first at Google. Remember, Google’s first responsibility is to its searchers; they want the searchers to be happy with the first thing in the search results.
So look at it from that viewpoint: not “How many high-volume keyword phrases can I possibly optimize the site for?” but “For how many keywords would searchers want to find our site at the top of the search results?” Optimize for those terms. Typically that will give you a shorter list of much more specific (i.e., “long tail”) keyword phrases.
Back to the Life Insurance example: If your site isn’t Aetna or Met Life, or one of the other top 10 Life Insurance companies, don’t expect a number 1 listing on Google for a competitive term like “life insurance”. But if your site sells life insurance in eastern Kentucky, then by all means go for “life insurance eastern Kentucky”.
Back to your question: I would not restructure the site to chase better-trafficked keywords. In my view, any commercial website is an advertisement for services or products. So once you know what your website’s products or services are, you do your keyword research, and then you organize your site around the keyword phrases that best fit your site. A dog kennel manufacturer’s website would go for keywords related to “dog kennel manufacturer” or “dog kennel supplier”, not “dog kennels”. If I were a searcher at Google looking for a local kennel to put my dog in while I went away for a week, I would not want the dog kennel manufacturer at the top of the search results. And if I were running a dog kennel and needed some new cages, I would not be searching for “dog kennels”; I would be searching for “dog kennel supplier” or “dog kennel manufacturer”.
For your sewer repair example, I would look at the main service your website is selling (sewer repair) and localize it geographically, of course. Then silo toward your services, not necessarily toward the highest-volume search terms related to those keywords.
Someone else might do it differently, trying to capture those high-volume keyword phrases that don’t really apply to your site. I just want the visitors to be happy they found the site, and not go clicking on the next search result because your site wasn’t right for them. Google pays attention to what gets clicked, and to whether people stay at a site or immediately click the next search result. Enough of that user behavior and you can expect the site to be devalued as a spam result and go down in the rankings for that term, and possibly others.
That’s my take on it anyway.
Best,
Jere
Jere,
Thanks for taking the time to respond to my question. Your answers were very valuable. I will put those suggestions into practice and then try to write back with the results in a couple months. Hope to see you at one of the conferences someday. We will be at SMX Advanced in Seattle in a couple months. Take care and thanks again!
Josh
Hi Jere,
Thanks for the explanation of siloing. You have finally made it clear to me. I took Bruce Clay’s 5 day course a few years ago, and have, to this very minute, been confused about it. My sites are massive, with lots of pages that have been around for years, so this is a huge help.
Best regards,
Debbie
Thank you for this incredible explanation of siloing! After reading this it made me re-evaluate my entire website. I plan to do a complete SEO overhaul now.
Cheers!
Zee
Since I understood the importance of siloing a week ago, I’ve spent the past few days finding information about it, but I really couldn’t get the hang of how to sort it out, especially since users mix our main keywords into different phrases, like searching for “A”, “B”, “C”, “A C”, “C B”, “B C A”, and on top of that with other long-tail words as well. This seems a bit clearer now.
A question, though: what about duplicate content? If I sell car parts, how do I organize pages with similar information to avoid Google seeing them as duplicates?
Example:
/ford/galaxy/mirror
/ford/galaxy/transmission
/ford/fiesta/mirror
/ford/fiesta/transmission
With, say, 30 brands and several hundred models, all with the same names for their spare parts, will siloing help keep Google from seeing the nearly identical text on different pages as duplicate content, as long as the similar texts are within different silos or sub-silos?
Hi Jere, informative post, but I have heard that siloing is DEAD? And if I use a good SEO-friendly WordPress theme, wouldn’t all these things get taken care of automatically? Now, about PR sculpting: I have been using the rel="nofollow" attribute to tell Google’s bots not to follow links to sites that I don’t trust, and I have a lot of them on my site. So you mean to say that I won’t have any PR left on my site if I don’t use JavaScript or iframes instead of crawlable links? Oh god, then I need to overhaul my entire site by taking out the nofollow tags and replacing them with JavaScript? But previously nofollow didn’t drain the PR, right? I am doomed :((((
GREAT post!!
It’s been really useful. Thanks a lot
Hi, Deepak –
No, siloing is not dead. It goes by another name now: “PageRank sculpting”. It’s basically a method of maximizing the amount of PageRank your site gets from Google and sending it where you want it to go within the site.
Even Joost de Valk’s SEO plugin for WordPress does not do *anything* with siloing or PR sculpting. It actually does more to hurt the PageRank of individual posts by automatically creating tons of sidebar links. Each one of those leaks PageRank in ways you don’t necessarily want.
I don’t think this is an issue for a small site to worry about, but when you get thousands of pages on a blog or ecommerce site, it can be a real problem, and one that is difficult to solve on the WordPress platform. I’m facing that very issue on my own blog here. There are a few posts to which I’d like to flow a lot more PageRank. A simple link from other posts isn’t going to cut it, not when there are a hundred links on every page. If there are only a few links on every page (because of siloing), then one text link in the body of a page can send a lot of PageRank. Do that a few times, and you can flow a particular page a higher PageRank.
I use siloing on several sites to get the pages we want ranking well for specific keywords.
It’s just one SEO technique, but I find it to be very useful and I wish WordPress were not so eager to make as many links as it does by default.
Good question, and one for which I do not have a definitive answer.
My best guess is that if you have it siloed into silos like Ford/Galaxy/ and Ford/Fiesta/, then Google (and don’t forget Bing/Yahoo!) will be able to tell that you are not putting duplicate content on your site. There would be enough differentiation that I don’t think you’d get any duplicate content penalties.
Hi Jere,
Loved your article on silo structure. I am putting together a WordPress blog and would like to use a JavaScript menu for the human interface. Can you recommend a good plugin that would do that?
Boris
Hi, Boris –
Thanks for your kind words.
Unfortunately, I don’t know of any plugins for WordPress that let you implement silo structuring on your site. We usually skip siloing on WP sites and go with the Yoast SEO plugin. We tend to disable tags and categories; all the internal linking those create runs counter to the whole point of siloing, which is to control where your internal links go. If every page links to every other page of your site through the categories, that’s not good for SEO (siloing) or user experience (UX).
Best,
Jere Matlock
Hi,
a few weeks ago I replaced all the links in a big drop-down menu with JavaScript links. That is, I wrote the folders of the original subcategory links (the ones that normally dropped down) into a JavaScript block in the source code, like this:
var menu = new Object();
menu["137"] = new Object();
menu["137"]["141"] = new Object();
menu["137"]["141"]["lz"] = "/url";        // the link's URL
menu["137"]["141"]["lt"] = "Anchor Text"; // the link's anchor text
…
A JS function called from an external JS file then builds links (A HREF) from those variables and writes them into the document. Everything works fine, and you can only access the drop-down links with JS enabled in your browser.
But a few weeks later I had a look at the “internal links” section in my Google Webmaster Tools. Unfortunately, (nearly) all of the links are listed there. So for every page that is linked in the drop-down menu, the number of internal links is about the total number of indexed pages on my site.
So I wouldn’t say siloing is dead, but link cloaking is dead. Maybe the only way (which I wouldn’t recommend) is to exclude certain user agents and IP addresses.
Please let me know if anyone uses a technique for cloaking internal links and has noticed that they aren’t listed in Webmaster Tools.
Hi, Niko –
Agreed. JavaScript cloaking of links is completely dead. Google now reads, understands, and indexes all links previously cloaked by JavaScript.
The only way I know of to “cloak” internal links is to put them in a file called by an iframe, and prohibit Google from indexing the called file. That still works well, and I have several sites I’ve done that with, where Google does not index the pages thus linked. If you want to hide a link from Google in order to focus their attention on particular pages, that’s the only way I know of to do it that works.
To summarize, if you don’t want Google to follow certain links:
1. create an iframe for your navigation
2. put the links in the HTML file called by your iframe
3. block Google from indexing that HTML file (via robots.txt, as described above).
Ok, I’ll try it. Thank You!
Hey Jere,
I was searching for some articles about Website Silo Architecture and I came across this page.
I noticed you linked to some of my favorite articles: SEO Silos by Bruce Clay
Just wanted to give you a heads up that I created a similar one. It’s like SEO Silos by Bruce Clay, but more thorough and up to date:
Website Silo Architecture
Might be worth a mention on your page.
Either way, keep up the awesome work!
Cheers,
David Alvarez
Thanks, David –
It takes a lot of work to create a page like yours on how to actually do something as complex as siloing a website. Well done on putting yours together, and thanks for sharing your tips and tricks.
I need to revise my old blog post on the subject from 2010 (because Google can definitely index the JavaScripted links now, and no one in their right mind would use Flash). But the iframes still work as a link obfuscation tool.
Best,
Jere Matlock
I cannot articulate how much better this has made my day.
Earlier this week I posted this article on Reddit – let’s just say the response wasn’t as friendly, haha.
Best,
David Alvarez
Hi, David –
Ha. Glad to have brightened up your day.
There’s an old expression, “Tell the truth and shame the devil.” Nowadays people have corrupted the meaning of that one into the expression, “Speaking truth to power,” but that’s pretty boring and generalized, as well as often being misleading, because it usually means the opposite: lying to the powerless.
In the SEO field, that would be the equivalent of telling Google that their denials and misdirection are exactly that, and have no basis in reality. “Yes, Google, we CAN figure some things out about your algorithm, despite lacking a doctorate in Information Science, based on trial and error, and based on correlation.” Trolls are quick to point out that correlation is not causality — to which I am equally quick to reply, “consistent correlation might as well be.”
I’ve always loved this quote:
I guess the funniest part is that people will argue this method doesn’t work or that they already knew how to implement it…yet I see no case studies, no live rankings, no concrete proof of them performing well in Google…
It’s the most bizarre thing.
It’s like a scrawny dude telling a professional bodybuilder that their workout routine doesn’t work.
But I absolutely agree with your quote – “consistent correlation might as well be.”
If something has been tested ad nauseam, and the results have been proven – there must be something to it…
David Alvarez