Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. The extracted links are then placed into a scheduler for crawling at a later date.
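The crawl-index-schedule loop described above can be sketched in a few lines of Python. This is only an illustration of the division of labor, not a real search engine: the sample HTML and the word-position "index" are made up for the example.

```python
# Minimal sketch of the spider/indexer/scheduler roles described above.
from collections import deque
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Pulls outgoing links and visible words out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

def index_page(html):
    """The 'indexer' step: which words appear, and where, plus all links."""
    parser = PageParser()
    parser.feed(html)
    positions = {}
    for i, word in enumerate(parser.words):
        positions.setdefault(word.lower(), []).append(i)
    return positions, parser.links

# The 'scheduler': links discovered on this page, queued for a later crawl.
schedule = deque()
positions, links = index_page('<p>hello web</p><a href="/about">about</a>')
schedule.extend(links)
```

A production indexer would also store term weights (e.g. words in titles counting more) and deduplicate URLs before scheduling, but the three-stage shape is the same.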
Novelty wears off. Even if people aren't getting sick of your ads, your product itself will become less revolutionary over time. When Casper launched, a direct-to-consumer mattress-in-a-box company was a hot new take on the traditional sleep industry. Now there are so many competitors that the idea of a mattress showing up at your door in a box just isn't as exciting.

Keyword difficulty is a number that tells you how hard it will be to rank for a certain keyword: the higher the number, the more difficult the keyword is to rank for. Several sites online will report the keyword difficulty of a word or phrase. Record these numbers in the Excel document or Google Sheet you made earlier.
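If you'd rather keep the record as a file than a spreadsheet, the same bookkeeping can be done in a short script. The keywords and difficulty scores below are invented for illustration; substitute the numbers you looked up.

```python
# Sketch of recording keyword-difficulty scores you have already looked up.
# The keywords and scores here are hypothetical examples.
import csv

keywords = {
    "mattress in a box": 72,        # assumed 0-100 difficulty scale
    "best memory foam topper": 41,
    "cooling pillow reviews": 28,
}

# Sort easiest-first: lower difficulty means a better early target.
rows = sorted(keywords.items(), key=lambda kv: kv[1])

with open("keyword_difficulty.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "difficulty"])
    writer.writerows(rows)
```

The resulting CSV opens directly in Excel or imports into Google Sheets, so it fits the workflow described above.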

Remember when you used to rely solely on search engines for traffic? Remember when you worked on SEO and lived and died by your placement in Google? Were you #1? Assured success. Well, okay, maybe not assured. Success only came if the keywords were relevant to your site users, but SEO was the only real roadmap to generating site traffic and revenue.
