The default values of 5 extra mixed anchors, 3 generic anchors and 2 domain anchors mean that if you input 5 keywords, you'll end up with 15 anchors in total, each with an equal probability of being 'chosen'. So 33% of your anchors are your primary keywords, 33% 'contain' your primary keywords (mixed up into extended phrases), and 33% are generic and domain anchors (5|5|3+2).
For a 'Penguin-safe' output, you should uncheck the "Use 'Bare' Keywords" option and use 10 mixed keywords, 10 generic anchors, and 10 URL anchors (or more if you're a premium member!) This is a really good (and safe) keyword distribution curve, which will really help you avoid over-optimisation. It avoids using your prime/bare keywords directly (they only appear when extended into multiple variations) and gives a profile of 1/3rd 'keyword-mixed', 1/3rd generic and 1/3rd URL/Domain/Naked anchors.
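As a quick worked illustration of those ratios, here's a minimal sketch (the helper function and its names are hypothetical, not part of the tool; the counts are just the settings described above):

```python
# Rough sketch of how the anchor pool proportions work out for a single URL.
# The counts mirror the settings described above; this is not the tool's own code.

def anchor_pool_breakdown(primary, mixed, generic, domain):
    total = primary + mixed + generic + domain
    return {name: count / total for name, count in [
        ("primary (bare) keywords", primary),
        ("mixed keyword anchors", mixed),
        ("generic anchors", generic),
        ("domain/URL anchors", domain),
    ]}

# Default settings: 5 keywords entered + 5 mixed + 3 generic + 2 domain = 15 anchors,
# roughly a 33% / 33% / 33% split once generic and domain are taken together (5|5|3+2).
print(anchor_pool_breakdown(5, 5, 3, 2))

# 'Penguin-safe' settings: bare keywords off, 10 mixed, 10 generic, 10 URL anchors,
# giving the 1/3rd mixed, 1/3rd generic, 1/3rd URL profile described above.
print(anchor_pool_breakdown(0, 10, 10, 10))
```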
For diluting a current anchor profile (one that is already over-optimised and suffering from the Penguin updates), don't use any bare or mixed keywords, and just output the generic and URL anchors. (Note: You'll still need keywords in the entered list, as the software looks for these to be in place for all eventualities.) Over time, this will dilute your anchor pool by increasing the number of generic and domain/URL/naked anchors.
An 'author name' tracking signature can also be added for subsequently finding indexed pages in the search engines (by searching for the "name" in quotes.) You can also define the spintax characters themselves, so you can alter the output to suit alternate and non-standard spinning systems. Standard members can process 10 URLs in total over the 2 lists, and enter up to 5 keywords for each; Premium members can process up to 100 URLs in total, and enter up to 10 keywords for each.
Most experienced online marketers use Excel (or some other spreadsheet) for storing their URL & keyword lists, and it also makes it easier to keep all your tracking & SEO information in one place. You can normally export or save a spreadsheet in .CSV (comma separated values) format, which is exactly what we use for this tool. The first column should be the full http:// URL, and the second column onward holds the SEO keyword phrases for that URL. You can view an example HERE of a simple demo .CSV file (essentially just a text file with the values separated by commas, and a new-line signifying the start of the next record.)
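For reference, here's a hedged sketch of reading that layout (the file contents, URLs and keywords are invented for illustration; the tool itself does this parsing for you):

```python
# Minimal sketch of the .CSV layout described above: column 1 is the full URL,
# columns 2+ are the keyword phrases for that URL. The sample data is illustrative only.
import csv, io

sample = """\
http://www.example.com/,best widgets,cheap widgets online,widget reviews
http://www.example.com/blog/,widget tips,how to choose a widget
"""

for row in csv.reader(io.StringIO(sample)):
    url, keywords = row[0], row[1:]
    print(url, "->", keywords)
```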
This software takes that simple comma-delimited list of URLs and keywords, and outputs them as nested-spintax 'link blocks' encoded in HTML, BB-Code or Wiki-Links format (whichever you need for the spinning submission system you're using.)
¹ There is also an option to output as a Unique Article Wizard 'ready' list of HTML link-encoded lines: The system will automatically output 10 random signature lines (for the demo), with a blank line between each (as per the UAW requirements.) Standard members usually get 100 lines, Premium members get 250 lines.
² The 1:1 List is a simple text list of lines in the format: URL,Keyword. This is a format used by LFE and other systems.
³ Use 'Bare' Keywords: This means the system will include your keywords 'as-is' as part of the anchor-text spun outputs. If you un-check this box, the bare/original keywords you enter will be ignored, so it will only output the 'mixed keyword' anchors (that we create for you) and the generic & URL anchors you specify. This helps with avoiding over-optimisation since Google Penguin.
For HTML, Wiki-Links & BB-Code options, you end up with 3 'block' outputs (see the sketch after this list):
- A signature/resource style block which has some highly-spun surrounding English text, and one link for each block above (so 2 max.) You'd normally use this for article or content/web 2.0 type submissions. You can add the author tracking name to make it easy to find which pages are later indexed by the search-engines.
- A secondary 'in-content' block, again with some surrounding English text, but this time with only a single link (the 2 boxes are combined.) This is for contextual 'in-content' embedding, i.e. this block could be placed in the middle of a block of text, or at the end of one of the middle paragraphs. Again, this would normally be used for any content type submission which allows 'in-content' contextual links.
- A third 'stripped-bare' block with no surrounding text - just the bare link information from the secondary block - with 1 link (again, the 2 boxes are combined.) This is for when you want to completely control your own surrounding text, or just embed it in your own spun block.
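Here's a rough sketch of what those three block shapes look like in practice (the spintax wording, anchors and URL are invented for illustration; the real tool's surrounding text pool and mixing logic are far more varied):

```python
# Illustrative shapes of the three output blocks (HTML flavour). The spintax
# wording here is invented; the real tool draws on a much larger phrase pool.

anchors = "{best widgets|cheap widgets online|www.example.com|click here}"
link = '<a href="http://www.example.com/">' + anchors + "</a>"

# 1) Signature/resource block: spun surrounding text, for article/web 2.0 submissions.
signature_block = "{For more information|To learn more}, {visit|see} " + link + "."

# 2) In-content block: spun text with a single combined link, for contextual embedding.
in_content_block = "{I found|I came across} " + link + " {recently|the other day}."

# 3) Stripped-bare block: just the link itself, for use inside your own spun text.
stripped_block = link

for block in (signature_block, in_content_block, stripped_block):
    print(block, end="\n\n")
```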
Most submission systems accept some level of HTML, but many bulletin boards and forums etc. use BB-Code instead. Wiki-Links format is obviously for submitting to wiki pages. You should know from your own manual submissions which to use.
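For reference, the same link would typically be encoded like this in each of the three formats (the URL and anchor text are illustrative):

```python
# The same link rendered in each supported output format (illustrative values).
url, anchor = "http://www.example.com/", "best widgets"

html_link = f'<a href="{url}">{anchor}</a>'   # HTML
bbcode_link = f"[url={url}]{anchor}[/url]"    # BB-Code
wiki_link = f"[{url} {anchor}]"               # Wiki-Links (external link syntax)

print(html_link, bbcode_link, wiki_link, sep="\n")
```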
The link output will be a nested spintax set (see the example after this list) including:
- Your primary keyword anchors, as you entered them.
- X extra auto-generated contextual keyword anchors. These are spun out randomly from your supplied primary keywords for that URL.
- Y extra 'domain' anchors. These are root-domain anchors of your URL, e.g. 'www.google.com/subdir/home.php' would give 'www.google.com'.
- Z extra 'generic' anchors. These are randomly selected human text anchors like 'Click here' and 'Try this' and 'Browse this site' etc. We have well over 200 to draw from.
You may choose '0' (zero) for any of these options to remove them from the output.
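Putting those pieces together, the anchor set for one URL might be assembled along these lines (a minimal sketch with invented anchors; the real mixed and generic pools are much larger):

```python
# How the nested anchor spintax for one URL might be assembled (illustrative only).
import random

primary = ["best widgets", "cheap widgets online"]                       # as entered
mixed = ["find the best widgets here", "really cheap widgets online"]    # X contextual anchors
domain = ["www.example.com"]                                             # Y root-domain anchors
generic = ["Click here", "Try this", "Browse this site"]                 # Z generic anchors

anchor_spintax = "{" + "|".join(primary + mixed + domain + generic) + "}"
print(anchor_spintax)

# A spinning submission system would later pick one alternative at random, e.g.:
print(random.choice(primary + mixed + domain + generic))
```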
It's important at this point to quickly cover why all this is necessary...
Originally, SEO meant using keywords and on-page text as close as possible to the keywords you wanted to rank for. This was why everyone was looking for high (and precise) keyword densities in their titles, anchors and content. Over the last few years though, this has become more and more of a limit or 'control' rather than a target, as over-optimisation filters have become easier and easier to trip, particularly with Google. On-page optimisation is nowhere near as important as it used to be - EXCEPT where you're obviously pushing content towards specific keywords. Google has got very good at working out when you're trying to optimise for specific keywords - and they can quickly penalise you when they discover it.
So... What do we need to do? Well, we still need our keywords to feature so we can rank for them, but we now have to add prefixed & suffixed 'modifiers' to expand the keyword set out, keeping the central keywords constant but blurring the edges of the keyword distribution curve.
For example, if you're trying to rank for 'lose ten pounds', then your page title and anchors need to incorporate that keyword phrase, but you want those 3 words to account for about 30-70% of the total keyword 'mass', with plenty of other words surrounding them to 'blur' the edges of the keyword groups. Natural link-building looks like this - it isn't overly concentrated on specific phrases, but there are common themes and keywords across the majority of anchor texts. You can read a little more on over-optimisation here.
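As a hedged sketch of what that 'blurring' looks like in practice (the modifier lists below are invented examples, not the tool's actual pool):

```python
# Sketch of prefix/suffix 'modifiers' blurring the edges of a keyword set
# while keeping the core phrase constant ('lose ten pounds' in this example).
core = "lose ten pounds"
prefixes = ["how to", "best way to", "can you really", "tips to"]
suffixes = ["fast", "in a month", "safely", "without dieting"]

variants = [core] + [f"{p} {core}" for p in prefixes] + [f"{core} {s}" for s in suffixes]
for v in variants:
    print(v)

# The core three words still appear in every anchor, but they now account for a
# smaller share of the total keyword 'mass' across the whole set.
```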
What this page does is automatically build your nested link blocks so that each time you submit, a randomly selected URL & matching keyword is used, keeping it mixed up, natural and organic-looking.
Many new starters in online marketing (and many experienced people too) make the mistake of blaming an aggressive link-building campaign for their sand-boxing or penalties, when more often than not it's because of over-optimised content & anchors, and inconsistent/spiked link-building. Consistency of link-building (what we call 'link-velocity') is VERY important, as it establishes a base-line for the search-engines to judge link-growth against.
The other major advantage of using multiple URLs in the boxes above is that each submission shares many URLs - so link-building is spread out over multiple pages (and hopefully domains as well.) If you plan on submitting to 100 sites each day for 10 URLs in total, then rather than doing 10 submissions with 1 URL in each, do 10 submissions that spread the links across all the URLs, so each URL gets roughly a tenth each day - and the links are built gradually for all pages.
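A quick worked example of that arithmetic (the URLs are placeholders):

```python
# Spreading a daily run across URLs rather than concentrating it:
# 100 submissions per day shared by 10 URLs = ~10 new links per URL per day.
daily_submissions = 100
urls = [f"http://www.example.com/page{i}" for i in range(1, 11)]  # 10 illustrative URLs

per_url = daily_submissions // len(urls)
for url in urls:
    print(f"{url}: ~{per_url} links today")
```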
The 'author name' tracking signature is just a useful extra so you can search for that name "in quotes" on Google. Since it will be unique (if you check our Name Generator page, you'll see that there are 1.7 billion permutations in this format), you'll be able to immediately find those pages that were part of that submission set, and even 'scrape' or 'harvest' them using ScrapeBox. This can create a 'footprint' of your link-building though, so it is better used intermittently for testing your link-building strategies.
Standard members can use 10 URLs in total over the 2 lists, and enter up to 5 keywords for each. So you could enter 7 in the first box to get a lower submission rate per URL (as it's spread over more URLs), and 3 in the second to get a higher submission rate per URL.
Premium members can process up to 100 URLs in total, and enter up to 10 keywords for each, giving incredible power for mixing up huge link-blocks for large-scale submission projects.