Tag: seo

  • How To Create A Successful Social Design

    How To Create A Successful Social Design

    After spending some time looking through the list of Facebook’s heaviest hitters, I couldn’t help but notice a trend of characteristics that worked in favor of nurturing positive, active social interaction between the brand and its fans. However, nothing is perfect, and the same can be said even for the top three branded Facebook Fan Pages: there are a handful of things that could be improved to make their social efforts even more successful.

     

    Top 3 Branded Facebook Pages

     

    One Chance to Make a First Impression

    First impressions make lasting statements both in reality and in the virtual world. Much like a firm handshake and an inviting smile help break the ice during an introduction, a Facebook welcome page is a great way to say hello to new visitors while inviting them to explore the page’s content and become a fan. Interestingly enough, only one of the top three branded pages welcomes non-fans in this fashion. Both YouTube’s and Facebook’s own Fan Pages go straight to the wall, while Starbucks’ page directs non-fans to a tab inviting visitors to “Join the Starbucks Pumpkin Spice Latte Celebration.” This is a great example of providing non-fan visitors with compelling content that promotes a seasonal product without blatant advertising. The result is an approachable social interaction that gives users’ visits a purpose and directs them to further investigate the page and, ultimately, click the Like button.

     

    Starbucks Pumpkin Spice Latte Celebration

     

    Socializing Requires a Two-Way Street

    Once users eventually arrive on the wall of a Fan Page, whether directly or after visiting a landing page, it’s important that the wall appears organized and uncluttered. Unfortunately, many companies believe keeping a tidy wall requires restricting or eliminating user posts. This can give the impression that a company is too worried about what others might say to allow user posting, and it denies users the freedom to interact with each other without the brand’s direction. A display filter offers greater flexibility: it keeps the wall neat and organized by default without taking away users’ ability to post comments to the page’s wall.

    Facebook’s own Fan Page only allows site-managed comments to appear, with no option for users to post or view comments of their own, while both Starbucks and YouTube default to their own comments but enable a filter that lets users choose between viewing just the brand’s comments, just others’ comments, or all comments. This gives users a clean look at the official posts on the wall, with the added ability to share their own thoughts, feelings and experiences with others, resulting in a more open, inviting social experience than Facebook’s Fan Page offers.

    However, this doesn’t mean that your fans should be given free rein to post anything and everything under the sun. Careful monitoring and moderation are crucial for ensuring that discussions remain on track and free of vulgar or offensive material. Maintaining a balance between healthy discussion and a sense of structure lets visitors know their thoughts are being heard and that someone is keeping comments from getting out of hand, all while encouraging an inviting atmosphere.

  • Optimizing Your Website’s Internal Linking

    Optimizing Your Website’s Internal Linking

    While many webmasters focus on acquiring more external links, it’s easy to forget that internal link building is still a great way to boost the rankings of other pages within your website. Internal links can be an untapped goldmine and the best part is you have complete control over their implementation.


  • How to Keep Your Search Engine Ranking During a Redesign

    How to Keep Your Search Engine Ranking During a Redesign

    301 redirects are essential when you’re redesigning your website and don’t want to lose the search engine traffic that you currently enjoy.

    The unfortunate thing about a 301 redirect is that it sounds so geeky and off-putting that the average business owner is scared away. That’s too bad, because it is a critical tool in search engine optimization. So, to that end, I’m going to attempt to explain the benefits of 301s in the least geeky way possible.

    Search Engines and Trust

    There are a lot of variables that determine why one site ranks higher than another on Google and other search engines. One is how long the site (and a given page) has been in existence; another is how many incoming links a page has. All things being equal (which they never are), older pages rank higher than newer pages, and pages with more inbound links rank higher than ones with fewer inbound links.

    Breaking that Trust

    Often, when rebuilding a site, you end up changing the URLs, or addresses, of your web pages. Maybe it’s because you’re reorganizing your site, or maybe it’s because you’re redeveloping your site on a content management system like WordPress, Drupal or Joomla. In either case, the new URLs don’t have the trust that the old URLs do, even if a lot of the content is the same.

    It’s like moving to a new town. You may have been the greatest manager/plumber/accountant in your old town, but that doesn’t mean anything in the new town. You haven’t changed; you still have an excellent bedside manner or mad sales skills, but you’re starting from scratch in this new town.

    When you take your established content, uproot it and replant it somewhere else on your site, you are resetting the clock on when that content was created and breaking all of the inbound links that pointed to it.

    Reestablishing that Trust

    There are many ways to tell the search engines that you’ve moved your content, but the most search-engine-friendly way is the 301 redirect. By setting up 301 redirects for your content, you show search engines where your content has moved to, and your inbound links will now pass their value on to your new pages.

    How you set up your 301s may depend on the type of host you have. If you have no idea what I’m talking about, it’s time to talk to your web developer and get them involved.

    If you want your web developer to create redirects for you, I recommend writing up a guide for him or her that shows where each old page should be redirected. Here’s a format to use, where the first item is the old page and the second is where you want the traffic to flow:

    • old/old.html -> new/new.php
    • van-halen/david-lee-roth.html -> van-halen/sammy-hagar.html
    • wonka/gene-wilder.php -> wonka/johnny-depp.php
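
    If your site lives on an Apache server, that same mapping could be expressed as 301 rules in an .htaccess file. This is only a rough sketch, assuming the mod_alias module is enabled and that the example paths above exist on your site:

        # Illustrative .htaccess rules (Apache, mod_alias) for the example mappings above.
        # Each rule permanently redirects the old path to its new location.
        Redirect 301 /old/old.html /new/new.php
        Redirect 301 /van-halen/david-lee-roth.html /van-halen/sammy-hagar.html
        Redirect 301 /wonka/gene-wilder.php /wonka/johnny-depp.php

    On other hosts (IIS or Nginx, for example) the syntax is different, which is another reason a plain old-to-new list like the one above is the most useful thing you can hand your developer.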

    The easiest approach, though, is to use a plugin like the WordPress Redirection plugin to set up 301 redirects.

    If you do feel comfortable playing around with 301 redirects, .htaccess and other files on your server, there are plenty of resources online:

    How to Redirect a Web Page Using a 301 Redirect

    301 Redirect – How to Create Redirects

    How to Set Up Redirects Using .htaccess

    These are just a few of the top results.

    Final Thoughts

    301 redirects are also great when you are changing from one domain to another (never a great idea, but sometimes a necessary evil). Even with a 301 redirect, you should expect a dip in traffic during a major overhaul of your website. However, my own experience has been that the numbers get back to normal in a month to three, and then you see increases after that.

  • Improve SEO By Removing Your Duplicate Content

    Improve SEO By Removing Your Duplicate Content

    Duplicate content is like a virus. When a virus enters your system, it begins to replicate itself until it is ready to be released and cause all kinds of nasty havoc within your body. On the web, a little duplicate content isn’t a huge problem, but the more it replicates itself, the bigger the problem you’re going to have. Too much duplicate content and your website will come down with some serious health issues.

    I’m going to break this into three parts. In this post, I’ll discuss the problems that are caused with duplicate content. In Part II, I’ll address the causes of duplicate content, and in Part III, I’ll discuss some duplicate content elimination solutions.

    Duplicate Content Causes Problems. Duh!

    Google and other search engines like to tell us that they have the duplicate content issue all figured out. And, in the cases where they don’t, they provide a couple of band-aid solutions for you to use (we’ll get to these later). While there may be no such thing as a “duplicate content penalty”, there are certainly filters in place in the search engine algorithms that devalue content that is considered duplicate, and make your site as a whole less valuable in the eyes of the search engines.

    If you trust the search engines to handle your site properly, and don’t mind having important pages filtered out of the search results, then go ahead and move on to another story… you’ve got nothing to worry about.

    Too many pages to index

    Theoretically, there is no limit to the number of pages on your site that the search engines can add to their index. In practice, though, if they find too much “junk”, they’ll stop spidering pages and move on to the next site. They may come back and keep grabbing content they missed, but likely at a much slower pace than they otherwise would.

    Duplicate content, in practice, creates “junk” pages. Not that these pages have no value, but compared to the one, two or dozen other pages on your site or throughout the web that carry the same content, there really isn’t anything unique there for the search engines to care about. It’s up to the engines to decide which pages are unnecessary and which is the original source or most valuable page to include in the search results.

    The rest is just clutter that the search engines would rather not have.

    Slows search engine spidering

    With so many duplicate pages to sort through, the search engines tire easily. Instead of indexing hundreds of pages of unique content, they are left sifting through thousands of pages of some original content and a whole lot of duplicate crap. Yeah, you’d tire too!

    Once the engines get a whiff that a site is overrun with dupes, the spidering process will often be reduced to a slow crawl. Why rush? There are plenty of original sites out there they can be gathering information on. Maybe they’ll find a good nugget or two on your site, but it can wait, as long as they are finding gold mines elsewhere.

    Splits valuable link juice

    When more than one page (URL) on your site carries the same content as another, there’s an issue of which page gets the links. In practice, whichever URL the visitor lands on and bookmarks, or passes on via social media, is the page that gets the link value. But each visitor may land on a different URL with that same content.

    If 10 people visit your site and 5 land on and choose to link to one URL, while the other 5 land on and link to the other (both carrying the same content), then instead of having one page with 10 great links, you have 2 pages, each with half the linking value. Now imagine you have 5 duplicate pages and the same scenario happens: instead of 10 links going to a single page, you may end up with 2 links going to each of the 5 duplicate versions.

    So, for each duplicate page on your site, you are cutting the link value that any one of the pages could achieve. When it comes to rankings, this matters. In our second scenario, all it takes, essentially, is a similarly optimized page with 3 links to outrank your page with only 2. Not really fair, because the same content really has 10 links, but it’s your own damn fault for splitting up your link juice like that.

    Inaccessible pages

    We talked above about how duplicate content slows spidering, leaving some content out of the search engines’ index. Leaving duplicate content aside for a moment, let’s consider the page URLs themselves. We’ve all seen those URLs that are so long and complicated that you couldn’t type one out if it was dictated to you. While not all of these URLs are problematic, some of them certainly can be, not to mention URLs that are simply undecipherable as unique pages.

    We’ll talk more about these URLs in Part III, but for now, let’s just consider what it means when a URL cannot be spidered by the search engines. Well, simply put, if the search engines can’t spider it, then it won’t get indexed. The browser may pull open a page the visitors can see, but the search engines get nothin’. And when you multiply the nothin’ the search engines get by the nothin’ they’ll show in the results (don’t forget to carry the nothin’), you get a whole lot of nothin’ going on.

    Pages that are inaccessible to the search engines can’t act as landing pages in the search results. That’s OK if it’s a useless page, but not if it’s something of value that you want to drive traffic to.

    There are a lot of problems caused by duplicate content and bad URL development. These problems may be minor or cataclysmic, depending on the site. Either way, small problem or large, it’s probably a good idea to figure out the cause of your duplicate content problems so you can begin to implement solutions that will pave the way for better search engine rankings.

  • Bing’s Guide To Quality Content

    Bing’s Guide To Quality Content

    Following Google’s Panda slap, Bing is now reasserting its stand on quality content as well.

    When we think of quality content, Google Search is our first automatic association. However, to stake out its own position, a blog post from Bing’s Duane Forrester gives webmasters some tips and tricks for creating quality content, to ensure that both users and search engines respond to your website.

    Unlike Google, which has left webmasters across the globe in the murky waters of reconsideration requests, Bing provides rather quick, easy-to-follow pointers that will help its crawler conclude that your website has quality content.

    Following are the pitfalls Bing suggests you avoid while producing content, quoted from the post:

    “Duplicate content” – don’t use articles or content that appear in other places. Produce your own unique content.

    Thin content – don’t produce pages with little relevant content on them – go deep when producing content – think “authority” when building your pages. Ask yourself if this page of content would be considered an authority on the topic.

    All text/All images – work to find a balance here, including images to help explain the content, or using text to fill in details about images on the page. Remember that text held inside an image isn’t readable by the crawlers.

    Being lonely – enable ways for visitors to share your content through social media.

    Translation tools – rarely does a machine translation tool leave you with content that reads properly and that actually captures the original sentiment. Avoid simply using a tool to translate content from one language to the next and posting that content online.

    Skipping proofreading – when you are finished producing content, take the time to check for spelling errors, grammatical mistakes and for the overall flow when reading. Does it sound like you’re repeating words too frequently? Remove them. Don’t be afraid to rewrite the content, either.

    Long videos – If you produce video content, keep it easily consumable. Even a short 3 – 4 minute video can be packed with useful content, so running a video out to 20 minutes is poor form in most instances. It increases download times and leads to visitor dissatisfaction at having to wait for the video to load. Plus, if you are adding a transcription of your video, even a short video can produce a lengthy transcription.

    Excessively long pages – if your content runs long, move it to a second page. Readers need a break, so be careful here to balance the length of your pages. Make sure your pagination solution doesn’t cause other issues for your search optimization efforts, though.

    Content for content’s sake – if you are producing content, be sure it’s valuable. Don’t just add text to every page to create a deeper page. Be sure the text, images or videos are all relevant to the content of the page.


    When looking to optimize your website, this comprehensive list of don’ts seems like a good place to start. However, some skeptics may question the reason behind Bing’s emphasis on quality at this juncture: is this guide an early indication of Bing’s own Panda-like update? Hmmm…

  • Where to Submit Your XML Sitemap

    Where to Submit Your XML Sitemap

    Sitemaps are an ingredient that completes a website’s SEO package. They are certainly still relevant, since they ensure content is not overlooked by web crawlers and reduce the resource burden on search engines. Sitemaps are a way to “spoon feed” search engines your content to ensure better crawling. Let’s look at how this is done.

    XML Format

    The sitemap file is what search engines look for. The elements available to an XML sitemap are defined by the sitemap protocol and include urlset, url, loc, lastmod, changefreq, and priority. An example document looks like:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://example.com/</loc>
            <lastmod>2006-11-18</lastmod>
            <changefreq>daily</changefreq>
            <priority>0.8</priority>
          </url>
        </urlset>

    Sitemaps have a 10 MB size limit and cannot contain more than 50,000 URLs, but you can split a sitemap across more than one file. The file that lists those individual sitemap files is called a sitemap index, and it has a similar but slightly different format:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap1.xml.gz</loc>
            <lastmod>2004-10-01T18:23:17+00:00</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap2.xml.gz</loc>
            <lastmod>2005-01-01</lastmod>
          </sitemap>
        </sitemapindex>

    There are all kinds of sitemaps: ones for ordinary web pages, ones tailored to sites with videos and other media, mobile sitemaps, geo data, and more. As long as the effort is justified by the SEO benefit, take the time to become familiar with the different types of sitemaps and make the one that best fits your website’s architecture.

    Location

    Sitemaps can be named anything, but the convention is to name the sitemap ‘sitemap.xml’ and place it in the root of the site, e.g. http://example.com/sitemap.xml. If multiple files are needed, they can be named ‘sitemap1.xml’ and ‘sitemap2.xml’. Sitemap files can also be compressed, as in ‘sitemap.xml.gz’. You can also keep sitemaps in subdirectories or submit them for multiple domains, but the cases where that’s needed are very limited.

    Submission

    Sitemaps are recognized by search engines in three ways:

    • Robots.txt
    • Ping request
    • Submission interface

    First, sitemaps can be specified in the robots.txt as follows:
    Sitemap: http://example.com/sitemap.xml

    The robots.txt file is then placed in the root of the domain, http://example.com/robots.txt, and when crawlers read the file, they will find the sitemap reference and use it to improve their understanding of the website’s layout.
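
    To put that in context, a complete robots.txt for a small site might be nothing more than the following sketch. The User-agent and Disallow lines are ordinary robots.txt boilerplate added here for illustration; only the Sitemap line comes from the sitemap protocol:

        # Hypothetical robots.txt placed at http://example.com/robots.txt
        User-agent: *
        Disallow:
        Sitemap: http://example.com/sitemap.xml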

    Second, search engines can be notified through “ping” requests, such as:
    http://searchengine.com/ping?sitemap=http%3A%2F%2Fwww.yoursite.com%2Fsitemap.xml

    These “ping” requests are a standard way for search engines to let websites notify them of updated content. Obviously, the placeholder domain (i.e., “searchengine.com”) is replaced with the actual engine’s domain, such as “google.com”.

    Lastly, every major search engine has a submission tool for notifying the engine that a website’s sitemap has changed. Here are four major search engines and their submission URLs:

    Google – http://www.google.com/webmasters/tools/ping?sitemap=

    Yahoo! – http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=SitemapWriter&url=

    Ask.com – http://submissions.ask.com/ping?sitemap=

    Bing – http://www.bing.com/webmaster/ping.aspx?siteMap=

    The ping requests do not respond with any information besides whether or not the request was received. The submission URLs will respond with information about the sitemap, such as any errors that were found.
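
    If you’d rather automate the notification, a few lines of code will do it. Here is a minimal Python sketch (not tied to any particular tool) that pings the Google and Bing endpoints listed above; the sitemap address is a placeholder for your own:

        # Minimal sketch (Python 3 standard library only): ping the Google and Bing
        # endpoints listed above with an updated sitemap URL.
        from urllib.parse import quote
        from urllib.request import urlopen

        SITEMAP_URL = "http://example.com/sitemap.xml"  # placeholder; use your own sitemap URL

        PING_ENDPOINTS = [
            "http://www.google.com/webmasters/tools/ping?sitemap=",
            "http://www.bing.com/webmaster/ping.aspx?siteMap=",
        ]

        for endpoint in PING_ENDPOINTS:
            # URL-encode the sitemap address before appending it to the query string.
            with urlopen(endpoint + quote(SITEMAP_URL, safe="")) as response:
                # Ping endpoints only acknowledge receipt, so the HTTP status is all we check.
                print(endpoint, response.status)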

    If your website uses WordPress or the like, there are great plugins such as Google XML Sitemaps which will do all this heavy work for you: creating sitemaps and notifying search engines including Google, Bing, Yahoo, and Ask. There are also tools for creating sitemaps such as the XML-Sitemaps.com tool or Google’s Webmaster Tools.

    As we’ve said before, making sitemaps “shouldn’t take precedence over good internal linking, inbound link acquisition, a proper title structure, or content that makes your site a resource and not just a list of pages.” However, taking just a little bit of time with a good tool will help you complete your SEO package with a sitemap. Take this tutorial and make your site known!