SEO Audits and Tools: The Good, The Better and The Best

“SEO audits” can mean different things to different people.

In general, an SEO website audit identifies issues that hinder a site’s ability to be found in search results and recommends changes to fix those issues.

The end goal of a technical SEO audit? To help you improve the site’s search visibility and bring in more organic traffic.

But the approach to SEO audits varies across practitioners and agencies. Which approach fits you best depends on many factors.

TL;DR: Three levels of SEO audits exist. They all aim to uncover ways to improve a website’s visibility in search. From free tools to an expert’s analysis, all audit types have their place. This article lists five auditing tools, explains the different approaches, and clarifies what you can expect to pay and to get from each level of SEO audit.

The 3 Levels of SEO Audits

Three levels of technical SEO audits exist today:

  • The “Good” SEO Audit: A software tool uncovers surface-level SEO issues (many of which are still worth fixing). The tool produces a one-size-fits-all generic report. Appropriate when you don’t have an auditing budget, or you want to check some basics yourself before engaging an agency. Never a waste, but not a deep dive.
  • The “Better” SEO Audit: An SEO vendor or practitioner adds human insight, identifying problems that data analytics alone can’t uncover, but offers few solutions. Without in-depth fixes, this level only points to possible trouble areas — but sometimes that is all you need.
  • The “Best” SEO Audit: An SEO agency performs an in-depth technical audit. It requires the labor and expertise of one or more seasoned SEO analysts who specialize in technical website analysis and SEO business strategy. This is a manual review supported by tools, and it takes many hours.

Of course, you may have different names for each of these levels of audits. Each serves a purpose.

However, when all three levels come together in one powerful and “best” SEO audit, you gain a solid understanding of where your website is today, where it can be tomorrow, and what needs to be done to get it there.

Let’s look at these three levels in detail, starting with the “good.”

The “Good” SEO Audit

Let’s start with the most basic SEO audit.

Mostly automated, this type of SEO audit uses a software tool. The software examines your website against a set of SEO factors and generates a list of things to fix.
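
To make this concrete, here is a minimal sketch of the kind of page-level checks such a tool automates. It is not any vendor’s actual product; the URL and thresholds are illustrative assumptions, and it uses Python’s requests and BeautifulSoup libraries.

```python
# A minimal sketch of automated page-level SEO checks.
# Not any vendor's actual product; URL and thresholds are illustrative.
import requests
from bs4 import BeautifulSoup

def basic_seo_checks(url):
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    issues = []

    # Title tag: present and not overly long
    title = soup.find("title")
    if title is None or not title.get_text(strip=True):
        issues.append("Missing <title> tag")
    elif len(title.get_text(strip=True)) > 60:
        issues.append("Title longer than ~60 characters")

    # Meta description: present and non-empty
    description = soup.find("meta", attrs={"name": "description"})
    if description is None or not description.get("content", "").strip():
        issues.append("Missing meta description")

    # The kind of surface-level issue a tool flags: a missing H1
    if not soup.find("h1"):
        issues.append("No <h1> heading on the page")

    # Images without alt text
    for img in soup.find_all("img"):
        if not img.get("alt"):
            issues.append("Image missing alt text: " + img.get("src", "(no src)"))

    return issues

if __name__ == "__main__":
    for issue in basic_seo_checks("https://www.example.com/"):
        print(issue)
```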

Most often, businesses themselves use these tools to do simple self-audits. But this type of audit tends to be superficial.

What’s lacking here is the knowledge behind the recommendations. You might receive a brief explanation, but the “why” behind each suggestion often remains unclear.

Add to that the fact that every business and website is unique. The tool may say “X” is a problem — but is it really a problem for your site’s situation? And how much priority should you give to it?


For example, if you’re seeing a traffic loss, a tool-generated report offers no understanding of why. Is there a search engine penalty involved? Could projects that your team is running, like a redesign, be affecting rankings?

The self-audit tool doesn’t take into consideration any number of important things that could be impacting your SEO.

Still, it does serve the purpose of a quick-and-dirty website review. And you can send the recommendations to your developer team to make quick fixes.

Software tools generally contribute to any SEO audit procedure, though the more comprehensive audits give much more insight.

SEO Audit Tools I Recommend

Below I’ve listed five software products I like for SEO audit work. All have free versions, and some offer paid tiers, so the price can range from $0 to several hundred dollars per month for a tool subscription.

1. Nibbler (My Favorite)
Nibbler is my preferred auditing tool. It looks at everything from on-page factors to back-end considerations, breadth of content, mobile factors, freshness and more.

[Screenshot: Nibbler report example]

Nibbler’s free version limits you to three reports, each testing up to five webpages. A paid version opens up the report to 100-plus pages; this more comprehensive reporting runs between $50 and $120 per month.

2. SEO-Detective
SEO-Detective is a free tool that analyzes a site one webpage at a time against more than 20 factors, including Alexa rank, server information, keywords and more.

[Screenshot: SEO-Detective sample report]

3. SEOptimer
SEOptimer audit data is available in multiple languages and covers everything from on-page SEO factors to usability and accessibility. It prides itself on speed (being able to analyze a site in 30 seconds or less), and allows users to customize and white label reports.

[Screenshot: SEOptimer sample report]

Paid plans for SEOptimer run from $29 to $59 per month, and you can run a report for free and download the data when you sign up for a 14-day free trial.

4. UpCity
UpCity offers an SEO “report card” that covers things like ranking and on-site analysis including links, trust metrics and accessibility.

[Screenshot: UpCity sample report]

The report card is a free feature. It’s also wrapped into UpCity’s paid SEO software aimed at agencies, which runs $150 to $800 per month.

5. WebPageTest.org
WebPageTest is a free tool that is useful for verifying or identifying speed issues.

[Screenshot: WebPageTest.org results]

As you can see, these five SEO auditing tools cover many best-practice SEO checks. After reviewing the data, make the recommended changes where they make sense for your website.

The downside is that these tools do not listen to your business and website problems, nor do they dive deeper into the data. That is the next level of auditing that I will cover.


The “Better” SEO Audit

Level 2 SEO audits are better because they typically involve an SEO vendor or SEO practitioner.

The vendor or practitioner will likely use a software tool such as those I outlined in the previous section. In addition, they can:

  • Listen to you and learn about your business situation, goals, and past SEO decisions that may be impacting your website.
  • Analyze your Google Analytics and Google Search Console data for more insights (see the sketch after this list).
  • Manually identify SEO issues that are below the surface and not easily recognizable by a tool.
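
For the analytics piece, here is a hedged sketch of what that data pull can look like, using the Search Console API (the “webmasters” v3 service) via the google-api-python-client library. The site URL, date range, and service-account key file are assumptions for illustration.

```python
# A hedged sketch of pulling Search Console query data with the
# google-api-python-client library. The site URL, date range, and
# service-account key file are illustrative assumptions.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # hypothetical key file
service = build("webmasters", "v3", credentials=credentials)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # assumed verified property
    body={
        "startDate": "2018-01-01",
        "endDate": "2018-03-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

# Print the top queries with their clicks and impressions
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```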

This deeper dive can often result in problem identification, sometimes very quickly. You’ll receive important explanations of what’s behind the data uncovered in analytics and auditing tools.

What to Expect from a Level 2 Audit

Some of the things an audit like this may discover are:

Your content is not good enough. Software can only do so much to analyze content (for example, word count or reading score). A human can compare your website content to your competitors’ pages that are ranking in the search results. Such an evaluation can pinpoint where you have room for improvement.

Your website design is bad. A software tool won’t be able to tell you that you have an ugly, hard-to-navigate website. It won’t notice that you have a bunch of junk code that’s preventing search engine spiders from doing their job. Having expert eyes on the overall development and design to identify SEO problems is key.

Your inbound link profile is poisoned. There are many reasons why a link profile can go bad. A website that has been around for a while has probably had multiple webmasters. During that time, the rules of SEO may have changed, or those responsible may not have understood Google guidelines. Whatever the reason, it’s the SEO auditor’s job to interpret potential spam issues hurting the site.

Your server is way too slow. Google cares about how long it takes to get the first byte of information from your webpage, and recommends a server response time of 200 milliseconds or less. An older study found a correlation between an increased “time to first byte” and decreased search rankings.

To clarify, a tool can certainly check page load time. But is the reported load time normal among your competitors? And have you lost traffic due to delays? An audit tool cannot answer those questions, yet the answers are crucial to deciding how to prioritize your speed issues.
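
If you want a quick sanity check of your own before engaging anyone, here is a rough sketch of approximating time to first byte in Python. The URL is an illustrative assumption, and a dedicated tool such as WebPageTest will give more reliable, network-aware numbers.

```python
# A rough sketch of approximating time to first byte (TTFB) in Python.
# The URL is illustrative; dedicated tools give more reliable numbers.
import time
import requests

def time_to_first_byte(url):
    start = time.perf_counter()
    # stream=True makes requests return once response headers arrive,
    # before the body downloads, which approximates TTFB.
    with requests.get(url, stream=True, timeout=10) as response:
        first_byte = time.perf_counter()
        response.raise_for_status()
    return (first_byte - start) * 1000  # milliseconds

ttfb_ms = time_to_first_byte("https://www.example.com/")
print(f"TTFB: {ttfb_ms:.0f} ms")
if ttfb_ms > 200:
    print("Above Google's suggested ~200 ms server response time")
```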

In a Level 2 SEO audit, the human insights piece makes a great difference if you want to understand major SEO issues impacting the site.

If you engage with an SEO vendor, know that the price varies greatly depending upon the issue prompting the audit, site complexity, and the size of the agency’s SEO checklist.

Here’s the rub: Most audits at this level are little more than a problem list without solutions. Those fixes are yours to research.

Normally, a reasonable “better” audit by a professional as described here ranges from $3,000 to $12,000 (USD). This covers a one-time review; afterwards, the engagement is typically over. You can compare this against the cost and the need to employ an SEO analyst in-house.


The “Best” SEO Audit

Each level of audit has its place based on a business’s needs and budget. Luckily, you have options.

At the “best” auditing level, you engage with an SEO vendor that has explicit knowledge of how to perform a thorough technical and strategic SEO website audit. You want to choose an agency that makes audits a core business specialty and has enough hours available to do it right (often 100 hours or more).

Commonly, a Level 3 audit takes advantage of the kinds of tools I’ve mentioned for the other levels. Tools improve the process. But the “best” SEO audit’s power lies in its expert analysis and strategic recommendations.

What to Expect from the Best SEO Audit

A technical SEO audit at this level uncovers everything in the first two levels of auditing, plus it delivers the following:

  • Expert knowledge. The most comprehensive audit means you’re working with a senior SEO analyst with many years of experience versus a junior analyst doing a more “by-the-book” review. The senior analyst will also be apprised of any algorithm changes (known or suspected) that may be impacting the site.
  • Competitive research. The auditor makes a comprehensive review of not only your website, but also competitor websites. This will help create a strategic roadmap of how to compete in the search results based on those that are ranking already.
  • New opportunities. The audit will include in-depth keyword analysis that will expose new opportunities for search visibility based on a business’s goals. This can be paired with a new SEO-friendly site architecture that works to improve a website’s authority through the structure of its content.
  • Prioritization & guidance. The auditor will not only identify the problem and describe exactly why it’s important based on Google’s guidelines, but also explain how to resolve it. You should receive in-depth instructions on how to implement solutions, complete with a priority list of what to tackle first.

You can expect an SEO vendor at this level to spend a lot of time researching and analyzing – sometimes over 100 hours depending on the website – in order to be as thorough as necessary.

At that level of labor, these audits often start at $20,000 and can exceed $50,000 for very large ecommerce sites. (Side note: You can learn more about the cost of SEO and what goes into the price tags.)

Summary

Our search marketing agency mostly does the “best” audits; we also run many auditing tools and tasks internally over the course of every project. Let us know if we can help your business.

At the end of the day, all three levels of SEO audits can be useful. In other words, something is better than nothing.

If you’re deciding which type will work best for you, consider these factors:

  • The age and complexity of your website
  • Problems you’re experiencing, such as severe traffic loss
  • Your budget
  • Projects on the horizon (for example, a site redesign), and more.

In all cases, expect to make improvements to your site following an SEO audit.

Bottom line: Making the right changes uncovered in a technical SEO audit WILL help you compete in the search results.

I want to know: Have you ever had a “best” level of SEO audit, and what were your results? Tell me in the comments below.

Source: bruceclay.com

Why Wikipedia is still visible across Google’s SERPs in 2018

Google is always evolving. But some things in the world of search never change.

One such thing is the presence of Wikipedia across the Google SERPs. From queries about products and brands to celebrities and topical events, Wikipedia still features heavily across Google searches – even while our habits as search engine users change (with voice and mobile increasingly having an impact), and while Google itself works to make its results more intuitive and full of rich features.

Back in May, my piece No need for Google argued that wikis themselves were fantastic search engines in their own right (check out wiki.com if you want search results that delve into the content on Wikipedia as well as numerous other wikis). Wikipedia’s visibility on Google is a testament to the continuing value and usefulness of “the free encyclopedia anyone can edit.”

So how does Wikipedia manage to maintain this visibility in 2018?

Natural ranking

Even in 2018, Google’s SERPs are still dominated by the organic rankings – a list of web pages it deems relevant to your query based on a number of factors such as size, freshness of content, and the number of other sites linking into it.

Unsurprisingly, Wikipedia’s pages still do the job when it comes to appearing in Google’s organic rankings. It has massive authority, having been established for nearly 20 years and now boasting almost 6 million content pages. There has been plenty of time for inbound links to build up and there are an ever-growing number of pages on the domain giving other sites more reason to link back.

So Wikipedia is a massive, well-established site. It also does really well in the fresh-content stakes. Around 35 million registered editors keep tweaking and adding to the site’s content, and countless more users make changes without signing in. Additionally, more than a thousand administrators check and verify whether changes should be kept or reverted. This ensures the site is being amended around the clock – and Google is always keen to rank sites which are actively updated ahead of those which are published and never touched again.

Another element of Wikipedia’s natural ranking prowess is its on-site SEO. Here I’m referring to things like how it uses internal links to reference its own pages, which are handy for both users and Google’s crawlers. Internal links are super easy to add when editing Wiki-pages – and thus appear peppered throughout most articles on the site. Also, note the site’s use of title and header tags, as well as its clean URLs.
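
None of these signals is hard to inspect programmatically. As a hedged illustration (the target URL is just an example, and this is nothing like how Google itself crawls), the sketch below pulls a page’s title, header counts, and internal-link count with Python’s requests and BeautifulSoup:

```python
# A small sketch of inspecting the on-site signals mentioned above:
# title tag, header tags, and internal links. The URL is an example.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

url = "https://en.wikipedia.org/wiki/Search_engine_optimization"
domain = urlparse(url).netloc

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

print("Title:", soup.title.get_text(strip=True) if soup.title else "MISSING")
print("H1 tags:", len(soup.find_all("h1")))
print("H2 tags:", len(soup.find_all("h2")))

# Count links that resolve to the same domain (internal links)
internal_links = [
    href for href in (a["href"] for a in soup.find_all("a", href=True))
    if urlparse(urljoin(url, href)).netloc == domain
]
print("Internal links:", len(internal_links))
```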

These features aren’t the toughest SEO hacks in the world, but you can see how added together they keep Wikipedia visible across Google’s organic rankings in the face of increasing competition and ever-emerging SEO tactics.

Featured snippets

Wikipedia is doing well at remaining visible in other parts of the SERPs too. Featured snippets are the box-outs on Google’s results pages which appear above the natural results. They seek to give a summary answer to the searcher’s question, without the user needing to click beyond the SERP.

There is no special mark-up that Wikipedia includes on its pages to get its content into featured snippets. Rather, it is the strength of the site’s content – which usually features a concise and clear (read: edited) summary at the top of each article page – that helps Google’s crawlers ascertain what information on the page would be useful to the user in that context.

“People also ask”

It follows that if Google is including Wikipedia articles in its featured snippets, the site also retains visibility (albeit small, before a user makes a click) in the search engine’s “People also ask” boxes.

Again, Google is crawling and delivering this content programmatically. When searching for “midterms 2018,” Google’s algorithm is smart enough to understand that searchers are also asking more long-tail questions around that search term – and even if Wikipedia doesn’t have a presence in the organic listings (in this instance, most of those places are given over to news sites), it still receives some visibility and traffic by virtue of its clear, concise and crawlable content.

Knowledge graphs

Knowledge graphs appear towards the right-hand side of the Google SERPs. They typically feature a snippet of summary text, images and/or maps, and a plethora of scannable details and handy links.

They are generated in part from Google drawing on content the algorithm crawls programmatically (as in featured snippets), as well as that which is marked-up to alert the search engine to useful details. Businesses can increase their chances of being included in knowledge graphs by signing up to Google My Business and adding the necessary information to their profile, as well as by using on-site mark up.
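
To illustrate that on-site mark-up, here is a sketch of the kind of schema.org JSON-LD a local business might embed. Every field value below is a hypothetical placeholder, and valid mark-up improves eligibility rather than guaranteeing knowledge graph inclusion.

```python
# A sketch of schema.org LocalBusiness mark-up rendered as JSON-LD.
# All field values are hypothetical placeholders; valid mark-up
# improves eligibility but does not guarantee knowledge graph inclusion.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee House",
    "url": "https://www.example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Plymouth",
        "addressRegion": "MA",
        "postalCode": "02360",
    },
    "openingHours": "Mo-Fr 08:00-18:00",
}

# Embed the output in the page's <head> inside a
# <script type="application/ld+json"> element.
print(json.dumps(local_business, indent=2))
```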

Wikipedia content is frequently used by Google to populate knowledge graphs. Again, this is likely due to the natural authority of the site and its easily crawlable text, rather than any SEO-driven mark-up. But it is a good example of how visible the domain is thanks to the strength of its content. Frequently, as with the “landmarks in plymouth” query, Google will opt to display the informational Wikipedia content (and elements from other sources) in the knowledge graph while giving over the rest of the SERPs to other pages – but it is still visible.

Site links

Another way Wikipedia grabs an extra bit of SERP real estate – while giving searchers more reason to click through to the domain – is by giving Google good reason to display its site links.

These are generated from a mixture of relevant links Google crawls on the page in question (“United States Senate elections,” for example), as well as other related pages on the same domain (“Ras Baraka” is the re-elected Mayor of Newark, but his page is not linked from the elections page).

Wikipedia succeeds here, where the BBC doesn’t, by virtue of its flawless site-structure and liberal use of internal linking – making it easy for Google to draw out the most relevant links for the query.

Takeaways

There are a number of places in Google’s increasingly rich SERPs where Wikipedia doesn’t tend to appear as frequently, if at all. These include image packs, video results, news and social carousels, sponsored (and retail-orientated) content, and local results. The reason for this is obvious in most cases, but not in all. Images and video do, of course, feature across thousands of Wiki-pages, but it is arguable that other sites are that bit better at optimizing this kind of content. After all, wiki software was established when much of the web was text-based, so we can understand why Google may be more likely to display this content from more modern CMSs.

That said, the degree to which Wikipedia is still visible across the SERPs not only highlights the opportunity for SEOs to find visibility amid increasingly competitive results pages, but also shows how important domain authority, good (updated, concise, edited, readable and crawlable) content, and excellent internal linking are to acquiring and maintaining visibility on Google in 2018.

Source: searchenginewatch.com

OQC – Real Estate Marketing for Wohnstätte Krefeld AG

This standalone communication concept was created to market high-end for-sale properties on behalf of the client Wohnstätte Krefeld AG. From the naming of the new residential quarter “OQC – Ostwall Quartier Crefeld” through the brand identity to the sales brochure, the design language is derived from the unique architectural concept and subordinates itself to it in a minimalist, stylish way. Photorealistic architectural visualizations and photographs were also developed in cooperation with partners, rounding out and logically complementing the communication design. In the high-quality brochure, all the components flow together into a contemporary, individual piece of real estate marketing that stands out for its magazine-like character.

Agency
Ungestrichen

3D Visualizations
Christoph Wasserhoven

Photography
Simon Erath

Source: designmadeingermany.de

Relikt – A Magazine about Vanishing Social Phenomena

Each issue of the magazine series “Relikt” is devoted to different phenomena of everyday life at the intersection of urbanity, society and culture.

The definition of “Relikt” (relic) is: something left over from a bygone time; remains or remnants. These leftovers are the magazine’s subject.

In their heyday, these phenomena shaped the streetscape or were present in every household – today, however, they are seen only as relics of a past era. What they have in common is that they may disappear entirely before long, or already have.

The result is a magazine that draws connections between social issues and cultural histories and links them to art and design. This issue takes a somewhat different, multifaceted and whimsical look at these rapidly changing things.

A bachelor’s thesis project from the Münster School of Design.

Designer
Philip Hüwels

Supervision
Prof. Rüdiger Quass von Deyen
Prof. Dipl.-Des. Hermann Dornhege

Source: designmadeingermany.de