This tool uses Natural Language Processing (NLP) techniques based on Markov chains to create large 'super-spun' nested-spintax structures of contextual, computer-generated content.
It works by starting with a content base that is contextual to a specific topic, then uses NLP/Markov techniques to break this into small chunks, which are rebuilt into new sentences. These sentences don't make 'sense' to a human reader (although they can produce somewhat amusing outputs at times!), but they follow normal language rules and 'flow' like real content. It's not just random words, as you'll see; it uses sophisticated language-processing techniques to preserve the structure and 'feel' of real content, and would pass most computer-based content checks.
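As a rough illustration of the Markov build described above (a minimal Python sketch, not the tool's actual code; all names are illustrative): a chain maps each short word-prefix from the primer to the words that follow it, and a random walk over that chain then emits new, plausible-sounding text.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word prefix in the primer to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Random-walk the chain from a random prefix, emitting up to `length` words."""
    rng = random.Random(seed)
    prefix = rng.choice(list(chain.keys()))
    out = list(prefix)
    while len(out) < length:
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break  # dead end: this prefix never recurs in the primer
        out.append(rng.choice(followers))
    return " ".join(out)

primer = ("the quick brown fox jumps over the lazy dog "
          "the quick brown cat sleeps near the lazy dog")
print(generate(build_chain(primer), length=10, seed=1))
```

Because every emitted word follows a prefix that genuinely occurs in the primer, the output inherits the primer's local word order ('flow') even though the overall sentences are new, which is why a larger, more varied primer gives better results.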
This content is obviously NOT for your 'money' sites or top-tier link-building, but many internet marketers need very large quantities of highly unique content for posting to their lower tiers (with embedded links). Good content is always at a premium, so spending money and resources on the high volumes needed for the lower tiers can be impractical. This type of content is also often used for building out large black-hat 'cloaked' sites. Many members have asked us to provide this tool, so we have!
You can select one of the 100+ 'standard' niche categories below as the contextual 'primer', or use your own niche-related content. If you supply your own content, we recommend 2,000-5,000 words of good-quality, varied article writing; you can actually use ANY content, but if it's not related to your keyword/niche, then the final outputs won't be relevant either, of course! Our 'standard' primers are approximately 10,000 words.
Each time you click one of the buttons below, it will do all the required processing and return a fresh super-spun {braces|format} document (prepared in line with the Output Notes² at the bottom of this page), which you can then use with any spintax-based poster.
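The {braces|format} mentioned above is standard spintax: alternatives are wrapped in braces and separated by pipes, and groups may be nested. A minimal resolver (illustrative Python, not the tool's code) picks one alternative per group, innermost first:

```python
import random
import re

INNER_GROUP = re.compile(r"\{([^{}]*)\}")  # matches an innermost {a|b|c} group

def unspin(spintax, rng=random):
    """Resolve the innermost {option|option} group repeatedly until none remain."""
    while True:
        match = INNER_GROUP.search(spintax)
        if not match:
            return spintax
        choice = rng.choice(match.group(1).split("|"))
        spintax = spintax[:match.start()] + choice + spintax[match.end():]

print(unspin("{Hello|Hi} {world|{everyone|all}}"))
```

Each call yields one concrete rendering of the document; a spintax-based poster does essentially this for every post it makes.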
Caveat/Important Note: Without getting into all the ramifications and arguments over content quality (post Google Panda/Penguin), you have to make your own decision about whether or not you wish to use this type of computer-generated content, and that decision will depend on the type and level of marketing you're doing. Many IM power-marketers use and need this type of content, but if it's not for you, then please just ignore this tool!
We strongly recommend that this content is not used in your top tiers of link-building, unless you know what you're letting yourself in for and understand its full implications. We neither recommend nor condemn this type of content; it's purely provided as a technical tool for those members who've asked for it and can use it 'wisely'.
Premium Members:
¹You can use 'primer' content of up to 150k in size, which typically equals around 28,000 words. You should try to use at least 5,000 words of good quality, varied content so that the algorithms have plenty of data to 'key' off.
²For 'Articles', the tool creates 14 'rewrites' of each and every sentence, with 6 sentences per paragraph, and a 3-in-17 (17.6%) chance of ANY sentence being dropped from the final (spun) outputs. There is a random 50% chance of the first paragraph being used or ignored, to both increase the randomness of the 'opening' content and vary the overall paragraph count. This gives 15^6 (11.4 million) spin permutations per paragraph (with double that for the first paragraph, as it alternates with the second). It typically generates outputs of 7-9 paragraphs (depending on Markov build effects and randomised first-paragraph usage) and 500-800 words in length, which yields somewhere around 3 x 10^56 overall article permutations, and it avoids using sentences of less than 25 characters in length.
For 'Snippets', the tool creates 21 'rewrites' of each and every sentence, with 6 sentences in the one-paragraph snippet, and a 4-in-25 (16%) chance of ANY sentence being dropped from the final (spun) outputs. This gives 22^6 (113.4 million) spin permutations.
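The Premium headline figures above can be sanity-checked with a little arithmetic. The sketch below assumes roughly 8 paragraphs (the middle of the stated 7-9 range) and takes the 15- and 22-way per-sentence option counts at face value:

```python
# 'Articles': 15 options per sentence, 6 sentences per paragraph.
per_paragraph = 15 ** 6
print(f"{per_paragraph:,}")          # 11,390,625 — about 11.4 million

# Roughly 8 paragraphs gives the quoted ~3 x 10^56 article permutations.
print(f"{per_paragraph ** 8:.1e}")   # 2.8e+56

# 'Snippets': 22 options per sentence, 6 sentences in one paragraph.
print(f"{22 ** 6:,}")                # 113,379,904 — about 113.4 million
```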
³The 'English Grammar Processing' option (which appears for Premium members only) provides enhanced English-grammar-based language processing. It's only suitable for use with English, and can provide better 'flow' by additionally analysing the grammar structures of the primer content against a standard lexicon, and then using those language rules to 'tune' the subsequent build.
Standard Members:
¹You can use 'primer' content of up to 30k in size, which typically equals around 5,500 words. You should try to use at least 2,000 words of good quality, varied content so that the algorithms have plenty of data to 'key' off.
²For 'Articles', the tool creates 6 'rewrites' of each and every sentence, with 4 sentences per paragraph, and a 1-in-7 (14.3%) chance of ANY sentence being dropped from the final (spun) outputs. There is a random 50% chance of the first paragraph being used or ignored, to both increase the randomness of the 'opening' content and vary the overall paragraph count. This gives 7^4 (2,401) spin permutations per paragraph (with double that for the first paragraph, as it alternates with the second). It typically generates outputs of 7-9 paragraphs (depending on Markov build effects and randomised first-paragraph usage) and 500-800 words in length, which yields somewhere around 10^27 overall article permutations, and it avoids using sentences of less than 25 characters in length.
For 'Snippets', the tool creates 9 'rewrites' of each and every sentence, with 4 sentences in the one-paragraph snippet, and a 1-in-10 (10%) chance of ANY sentence being dropped from the final (spun) outputs. This gives 10^4 (10,000) spin permutations.
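The Standard-tier figures above can also be checked with quick arithmetic. The sketch below again assumes roughly 8 paragraphs from the stated 7-9 range:

```python
# 'Articles': 7 options per sentence, 4 sentences per paragraph.
per_paragraph = 7 ** 4
print(per_paragraph)                 # 2401

# Roughly 8 paragraphs gives the quoted ~10^27 article permutations.
print(f"{per_paragraph ** 8:.1e}")   # 1.1e+27

# 'Snippets': 10 options per sentence, 4 sentences in one paragraph.
print(10 ** 4)                       # 10000
```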