When you redesign or enhance your site, you make a lot of changes: the content, the design, the front-end technology, the back-end stack, the user flows, the information architecture. It is tough to know what you have done right and what still needs help, particularly compared to other sites. The tools below can show you both, and how you stack up against the competition. I use them… and so should you.
- https://website.grader.com/ – the gold standard of online web site graders. Shows performance, SEO, mobile capability, and security.
- https://www.semrush.com/ – this site gathers a LOT of marketing information about your site… Monitor this information before and after your cutover.
- https://validator.w3.org/ – Are you W3C Compliant? Are you writing valid HTML? Using this throughout your development will ensure your site is as readable and indexable as possible.
- http://www.webpagetest.org – How long does the first view of my page take? How about the second view? This grader shows you both… just like the Developer Tools in Google Chrome.
- https://developers.google.com/speed/pagespeed/insights/ – another technical site grader that gives you guidance on where to improve performance. Be careful trying to get 100/100, though… not everything NEEDS to be done.
- http://nibbler.silktide.com/en_US – Evaluates your site in four areas – Accessibility, Experience, Marketing, and Technology. A useful way to get another view of your site.
- https://www.woorank.com/ – “Run a review to see how your site can improve across 70+ metrics” – Marketing, SEO, Mobile, Usability, Technology, Crawl Errors, Backlinks, Social, Local, SERP Checker, Visitors.
- http://www.similarweb.com/ – Another great tool for a large, corporate web site, though it does not offer much performance information. Good for monitoring usage and marketing metrics.
- https://moz.com/researchtools/ose – Moz is known for its SEO tools, and this is an easy dashboard of information to monitor before and after your redesign. The free version is useful, but the Pro version is even better. Not a lot of tech help here, though.
- http://www.alexa.com/ – Free for 7 days; only the paid version is really useful. Lots of marketing information is available, though.
- http://builtwith.com/ – Very technical. Shows you the infrastructure and software choices made by the development team. You will be surprised. Helpful for technology and information security teams.
- http://www.google.com/analytics – Free analytics tool. Tells you who uses your site, how much, where they are from, what browsers, what time of day… a plethora of information. Including Page Speed.
- https://www.google.com/webmasters/tools – Free tool that shows you what index errors Google has encountered, things to make your site more indexable, and what your pages look like to the Google Search Crawlers. Use this.
- http://www.bing.com/toolbox/webmaster – Everything that Search Console is for Google, this site is for Bing.
So did I miss any tools that you use? Are any of these ones you have struck off your list? How do you measure results of your site before and after? Leave a comment and let me know!
EDIT: Two more sites were recommended to me that help redesign projects, so I am adding them here:
If you are building a web site on an Agile team, you need to find ways to save time. These two checklists will help you with that. The first checklist, for on-page optimization, is helpful when building a new page or significantly modifying an existing one. This is a good set-up for success criteria for a user story or sprint. The second checklist, for on-site optimization, is good for regression testing or stabilization, and is a good baseline for success criteria for the release.
Do you have any feedback? Things you disagree with? Anything I missed? Please leave feedback.
- Readable by a human
- 115 characters or shorter
- shorter URLs are better for usability
- Head Section Order
- Meta tags are in the right order: Title > Description > Keywords.
- these tags are used to render the title and description in the search engine results pages
- Title Tag
- 6 to 12 words, 70 characters or less
- Unique across the site
- Description Tag
- include the most important info and keywords before the SERP cutoff
- approximately 160 characters including spaces.
- make it compelling – you don’t want to waste your prime real estate
- Unique across the site
- Keywords Tag
- Despite the controversy over their value, include them as a best practice
- List keywords in order of importance, separated by commas.
- Meta Robots tag
- <meta name="robots" content="noindex">
- NoFollow prop on anchor tags
- View State tag
- Heading Tags
- make sure your first heading tag is an <h1>, and that there is only one on the page.
- Canonical tag
- Helps prevent duplicate content within your site
- rel=”alternate” hreflang=”x”
- Tells Google what language to target for search purposes
- Use page level keywords in your image alt attributes
- Ensure your images have proper descriptions for Accessibility Standards
- Alt attributes are also required to validate your HTML code.
- Ensure file names reflect the content of the image
- Geo Meta Tags
- Overall Word Count
- More than 250 words is recommended.
- Quality content is key.
- avoid duplicate content and thin content
- Dashes vs. Underscores in URLs
- Search engines treat underscores like alphanumeric characters, so they do not separate words.
- Dashes (i.e. hyphens) are word separators, but don’t use too many or the URL could look like spam
- use fully qualified links, e.g. http://www.URL.com
- 100-200 links per page is a good upper limit
- Make sure your link text uses keywords and is relevant
- Ensure the most important part of your page is the first thing the bots crawl.
- externalize code to ensure there aren’t unnecessary lines above the body text.
- Make sure there are no misspellings or grammar mistakes
- Make sure your page is W3C valid HTML
- Last but not least, make sure it is relevant content
- Site Map
- Have an HTML sitemap with every page on it
- Every page should link to that sitemap page
- Have an XML Sitemap to submit to search engines
- The site map should always have fully qualified URLs.
- Text Navigation
- Fully qualified domain
- 301 redirect from domain.com to www.domain.com
- Make your site available over http and https
- Robots.txt File
- tells the search engine spiders what to index and what not to index.
- Ensure XML sitemaps are listed in the robots.txt file
- Social Sharing
- Make sure they are all set up and working properly
- Web Analytics
- make sure you have it – GA, Omniture, etc.
- Make sure you have only one of each analytics tag on your page
- Ensure your analytics are set up properly – test with Fiddler, Firebug, etc.
- Monitor them regularly
- Server Configuration
- Regularly check your server logs, looking for 404 errors, 301 redirects and other errors.
- Privacy Statement
- An important element to Bing. It’s best practice to include one anyway
- Static Pages
- Do not use more than two query string parameters
- use mod_rewrite or ISAPI_rewrite to simplify URLs
- use the Canonical tag.
- Check for Duplicate Content
- check out CopyScape.com and use it regularly.
- Find and Fix Broken Links
- Google Search
- Home page should appear first
- Track how many pages are indexed
- 301 redirects
- Do not use multiple 301 redirects
- Site wide Uptime
- Cache your site
- Improve Site Speed
- Improve Site Performance
- Compress images
- Minify CSS and JS files
- Set Up a Google Webmaster Tools Account and check it regularly
- Register all versions of your domains and subdomains
- Check Health and Crawl Errors Reported
- Review Mobile Usability Issues
- Check for Manual Penalties Reported
- Check blocked content
- Ensure CSS and JS are not blocked
- Set up Bing Webmaster Tools as well
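Several of the on-page rules above (title length, description length, a single <h1>) can be spot-checked automatically. Here is a minimal sketch in Python using only the standard library; the length limits come straight from the checklist, and the parser is deliberately simplistic, so treat it as a starting point rather than production tooling.

```python
# Minimal on-page SEO spot-check, standard library only.
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collects the <title> text, meta description, and <h1> count."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of warnings against the checklist's on-page rules."""
    p = HeadAudit()
    p.feed(html)
    warnings = []
    if not p.title:
        warnings.append("missing <title>")
    elif len(p.title) > 70:
        warnings.append("title longer than 70 characters")
    if not p.description:
        warnings.append("missing meta description")
    elif len(p.description) > 160:
        warnings.append("description longer than 160 characters")
    if p.h1_count != 1:
        warnings.append("expected exactly one <h1>")
    return warnings
```

Feed it the raw HTML of any page (fetched however you like) and it returns an empty list when the basics check out.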
SEO Checklist Source URLs
These are just some of the articles I have read in Google Reader over the last month and a half that I found interesting. I thought maybe you would too…
Other Interesting Stuff
I don’t usually do this, but with my recent trip to India, it might be a good idea to do an SEO blogroll post. I have found a number of interesting articles lately, and thought that I would share them with my team, particularly the developers in India. So, without further ado, here is an SEO blogroll from the last 3 months:
Whew! That was a lot of links! But the articles are really interesting, and pertain to my day-to-day job, and I thought some folks I work with would be really interested to read them too.
Let me know if I should do this more often, or if this is annoying and a waste of time. Leave me a message and let me know what you think of these articles!
I have stayed connected to the search industry ever since I was involved with the original launch of the Pravachol web site ten years ago. One of the ways I have stayed connected is through great online resources like Alt Search Engines. This week they covered a great new online tool that helps its users search for synonyms. Visual Thesaurus displays entries in the thesaurus graphically and separates them into individual entries through a tool called Thinkmap. This is very similar to the technology used in the TouchGraph Google Browser. Both of these technologies are similar to some of the social networking graphs used in Web 2.0 sites. Take a look at the new Visual Thesaurus and the TouchGraph Google Browser, and let me know what you think of the use of the technology, and what other ways you might like to see it applied.
I don’t usually do this, but this blog entry will be about an email I received from one of my readers. I got an email from Rachel, who works at a company called SEO Site Checkup. She asked me to take a look at their site. They have created a simple-to-use web site that will analyze your site against a series of SEO based rules. All you do is put in your URL, submit, and let the site do its work. It will return a list of important fixes, recommended fixes, and successful checks. It provides a lot of information, and a great deal of next steps to make your site more SEO friendly. In fact, it was good enough to point out a few changes that we will want to make to some of our major brand sites.
To be fair, there are two other tools that I use in the web site SEO analysis space. WebSite Grader is provided by HubSpot – a company focusing on marketing for small companies. I have also used a site called XinuReturns, which will help you “Find out how well your site is doing in popular search engines, social bookmarking and other site statistics.”
XinuReturns focuses more on aspects outside of your site, including inbound links, search engine results, and social bookmarking. WebSite Grader gives a high-level overview of many different aspects of your site’s SEO, both internally and externally, and gives you an easy “grade” to compare results against other sites. The strength of SEO Site Checkup over these other two sites is that it takes a deep dive into aspects of your web site that you can change to improve your search results. It analyzes your technology and your content, and gives you an action plan for improvement. All three tools are a great way to measure your site’s SEO, but SEO Site Checkup goes a step further to tell you how to improve those measurements.
I recommend using all three of these tools, in conjunction with analytics tools and other metrics, to monitor and improve your site. SEO Site Checkup is a great new tool to add to that arsenal.
If you have ever redesigned, moved, or migrated a web site, then you know how important 301 redirects are. You have worked hard at building up your page rank within all of the search engines. And you don’t want to lose it. Your users have bookmarked your pages, and your partners all have links to your pages. And you don’t want those to break either.
My team and I are currently in the middle of migrating our first major site from one platform to another, and if we are successful there will be many more to come. We need to handle redirects for all the old content, the media pages, the banner advertisements, the existing client side redirects, and the internal analytics tracking pages. Here are some of the resources we are using while managing all the redirects in the site.
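One rule worth automating when the redirect map gets large is the checklist’s “do not use multiple 301 redirects”: a chained redirect (old page → interim page → final page) costs an extra round trip and can dilute the signal search engines pass along. Here is a hedged sketch (not our actual migration tooling; the URLs are made-up examples) that flags chains in an old-URL-to-new-URL map and collapses them so every old URL points directly at its final destination.

```python
# Hedged sketch: detect and collapse chained 301s in a redirect map.
def find_redirect_chains(redirects):
    """Return old URLs whose 301 target is itself another redirect."""
    return sorted(old for old, new in redirects.items() if new in redirects)

def collapse_chains(redirects):
    """Rewrite each entry to point at its final destination."""
    flat = {}
    for old, new in redirects.items():
        seen = {old}
        # Follow the chain to its end; `seen` guards against loops.
        while new in redirects and new not in seen:
            seen.add(new)
            new = redirects[new]
        flat[old] = new
    return flat

# Example: /old-home -> /home -> /welcome is a chain that should
# be collapsed to /old-home -> /welcome before go-live.
redirect_map = {"/old-home": "/home", "/home": "/welcome"}
```

Running `collapse_chains` over the map before deploying means every 301 lands in a single hop, which is kinder to both users and crawlers.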
Are there other resources you use when dealing with 301 redirects? Do you have any lessons learned about page redirects when redesigning or migrating your site? Leave me some feedback and let me know what you think.