In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[68][69]
Search engine marketing recognizes that websites receive a large share of their traffic, roughly 30 percent, from web searches. Search engines such as Google, Yahoo, Bing, and AOL publish guidelines for developing search engine advertising. The two main ways of advertising through search engines are search engine optimization (SEO) and paid search advertising. SEO is the process of improving a website so that its content ranks highly in organic search results. In paid search advertising, companies buy listings for chosen keywords, most often on a pay-per-click (PPC) plan. Every time someone runs a query on a search engine, they see both free, organic results and paid listings. Learning to use both types of advertising will increase traffic to your site.
Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) log file analyzing tool: WebTrends by NetIQ; (b) tag-based analytic tool: WebSideStory's Hitbox; and (c) transaction-based tool: TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
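To make the page-tagging idea concrete, here is a minimal sketch in TypeScript. The collector URL, element ID, and event names are hypothetical stand-ins for illustration only, not the API of Hitbox or any other tool named above.

```typescript
// Minimal page-tag sketch: report a page view and a conversion event to a
// hypothetical analytics collector. Tag-based tools work along these lines:
// a small script (or a 1x1 image request) carries measurement data in a URL.

function sendHit(event: string, data: Record<string, string> = {}): void {
  const params = new URLSearchParams({
    event,
    url: window.location.href,
    referrer: document.referrer,
    ...data,
  });
  // An Image request works even in very old browsers; the "tag" is just
  // a tracking pixel whose URL encodes what happened on the page.
  new Image().src = `https://collector.example.com/hit?${params.toString()}`;
}

// Track the page view on load, and a conversion when a button is clicked.
window.addEventListener("load", () => sendHit("pageview"));
document.getElementById("buy-button")?.addEventListener("click", () =>
  sendHit("conversion", { product: "example-sku" })
);
```

Log-file tools such as WebTrends derive similar metrics from the server side instead, which is one reason combining both classes of tool gives a fuller picture of visitor behavior.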
Building trust within your digital community of potential clients, existing customers, and possible employee candidates should be a goal of your video marketing strategy. In a video marketing funnel, top-of-funnel videos should introduce your company's service or product and explain how it can help solve a problem.

Seriously? Yes, seriously. Search engines are sources of information and not all searchers are super specific. A prime example of this is someone searching for ‘Wine Bars in London’. Whilst you may expect Google to return the likes of Humble Grape or Gordon’s Wine Bar in the results, you will actually find that the main results are dominated by lists.
Links - Links from other websites play a key role in determining the ranking of a site in Google and other search engines, because a link can be seen as a vote of quality: website owners are unlikely to link to sites of poor quality. Sites that acquire links from many other sites gain authority in the eyes of search engines, especially if the sites linking to them are themselves authoritative.
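The "vote of quality" idea is the intuition behind link-analysis algorithms such as PageRank. The sketch below is a toy power iteration over a made-up three-page link graph, not Google's actual ranking code; the function name, damping factor, and iteration count are illustrative assumptions.

```typescript
// Toy power-iteration PageRank: each link passes a share of a page's
// authority to the page it points at, so links from high-authority pages
// are worth more. An illustrative sketch, not a production algorithm.

type LinkGraph = Record<string, string[]>; // page -> pages it links to

function pageRank(
  graph: LinkGraph,
  damping = 0.85,
  iterations = 50
): Record<string, number> {
  const pages = Object.keys(graph);
  const n = pages.length;
  let rank: Record<string, number> = Object.fromEntries(
    pages.map((p) => [p, 1 / n]) // start with equal authority everywhere
  );

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = Object.fromEntries(
      pages.map((p) => [p, (1 - damping) / n]) // baseline "random jump" share
    );
    for (const page of pages) {
      const outLinks = graph[page];
      if (outLinks.length === 0) continue; // dangling pages ignored in this sketch
      const share = (damping * rank[page]) / outLinks.length;
      for (const target of outLinks) {
        if (target in next) next[target] += share; // a link is a weighted vote
      }
    }
    rank = next;
  }
  return rank;
}

// Two pages both link to "c", so "c" accumulates the most authority.
console.log(pageRank({ a: ["c"], b: ["c"], c: ["a"] }));
```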

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms can render and index your content, which can result in suboptimal rankings.
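For illustration, here is a sketch of the kind of robots.txt rule this paragraph warns about, alongside a safer alternative. The directory paths are hypothetical and would need to match your own site's layout.

```
# RISKY: blocking asset directories can prevent Googlebot from rendering pages
# User-agent: Googlebot
# Disallow: /assets/js/
# Disallow: /assets/css/

# BETTER: let crawlers fetch the JavaScript, CSS, and images your pages use,
# and restrict only the areas that genuinely should not be crawled
User-agent: *
Allow: /assets/
Disallow: /admin/
```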
Our SEO professionals are all well-respected thought leaders in the space, with decades of combined experience and credentials that include Search Engine Workshop Certification, Google Analytics and Yahoo Certifications, PMP Certification, UNIX Certification, Computer Engineering degrees, and MBAs. Our SEO team members are acclaimed speakers and bloggers, and IMI's team has delivered keynote presentations at Pubcon, SMX, SEMCon, Etail, and many other influential conferences.