Google & Facebook Squeezing Out Partners

Just Make Great Content…

Remember the whole shtick about good, legitimate, high-quality content being created for readers without concern for search engines, as though search engines did not exist?

Whatever happened to that?

We quickly shifted from the above “ideology” to this:

The red triangle/exclamation point icon was arrived at after the Chrome team commissioned research around the world to figure out which symbols alarmed users the most.

Search Engine Engineering Fear

Google is explicitly spreading the message that they are testing how to create maximum fear in order to manipulate & coerce the ecosystem to suit their needs & wants.

At the same time, the Google AMP project is being used as the foundation of effective phishing campaigns.

Scare users off of using HTTP sites AND host phishing campaigns.

Killer job, Google.

Someone deserves a raise & some stock options. Unfortunately that person is in the PR team, not the product team.

Ignore The Eye Candy, It’s Poisoned

I’d like to tell you that I was preparing the launch of https://amp.secured.mobile.seobook.com but awareness of past ecosystem shifts makes me unwilling to make that move.

I see it as arbitrary hoop jumping not worth the pain.

If you are an undifferentiated publisher without much in the way of original thought, then jumping through the hoops makes sense. But if you deeply care about a topic and put a lot of effort into knowing it well, there’s no reason to do the arbitrary hoop jumping.

Remember how mobilegeddon was going to be the biggest thing ever? Well, I never updated our site layout here & we still outrank a company which raised & spent tens of millions of dollars for core industry terms like [seo tools].

Though it is also worth noting that after factoring in the increased ad load, small screen sizes & the scraped featured answer boxes, a #1 ranking no longer gets it done, as we are well below the fold on mobile.

Below the Fold = Out of Mind

In the above example I am not complaining about ranking #5 and wishing I ranked #2, but rather stating that ranking #1 organically has little to no actual value when it is a couple screens down the page.

Google indicated their interstitial penalty might apply to pop-ups that appear on scroll, yet Google takes the liberty of installing a toxic enhanced version of the Diggbar at the top of AMP pages, which persistently eats 15% of the screen & can’t be dismissed. An attempt to dismiss the bar leads the person back to Google to click on a listing other than your site.

As bad as I may have made mobile search results appear earlier, I was perhaps being a little too kind. Google doesn’t even have mass adoption of AMP yet & they already have 4 AdWords ads in their mobile search results AND when you scroll down the page they are testing an ugly “back to top” button which outright blocks a user’s view of the organic search results.

What happens when Google suggests what people should read next as an overlay on your content & sells that as an ad unit where if you’re lucky you get a tiny taste of the revenues?

Is it worth doing anything that makes your desktop website worse in an attempt to try to rank a little higher on mobile devices?

Given the small screen size of phones & the heavy ad load, the answer is no.

I realize that optimizing a site design for mobile or desktop is not mutually exclusive. But it is an issue we will revisit later on in this post.

Coercion Which Failed

Many people new to SEO likely don’t remember the importance of using Google Checkout integration to lower AdWords ad pricing.

You either supported Google Checkout & got about a 10% CTR lift (& thus 10% reduction in click cost) or you failed to adopt it and got priced out of the market on the margin difference.

And if you chose to adopt it, the bad news was you were then spending yet again to undo it when the service was no longer worth running for Google.
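The margin difference above follows from how the AdWords auction prices clicks: in a generalized second-price auction your effective CPC is roughly the ad rank you need to beat divided by your own CTR, so a CTR lift translates almost directly into a CPC discount. A minimal sketch with hypothetical numbers (the real auction also includes quality factors beyond CTR):

```python
# Illustrative sketch of why a CTR lift lowers effective CPC in a
# simplified generalized second-price auction. All numbers are
# hypothetical; this is not Google's actual pricing formula.

def effective_cpc(competitor_ad_rank: float, your_ctr: float) -> float:
    """Ad rank = bid * CTR; to hold your position you pay roughly
    the competitor's ad rank divided by your own CTR."""
    return competitor_ad_rank / your_ctr

competitor_rank = 1.0   # competitor bid * competitor CTR (hypothetical)
base_ctr = 0.05         # your CTR without the trust badge

cpc_without = effective_cpc(competitor_rank, base_ctr)
cpc_with = effective_cpc(competitor_rank, base_ctr * 1.10)  # ~10% CTR lift

savings = 1 - cpc_with / cpc_without
print(f"CPC without badge: {cpc_without:.2f}")
print(f"CPC with badge:    {cpc_with:.2f}")
print(f"Savings: {savings:.1%}")
```

Under these toy numbers a 10% CTR lift works out to roughly a 9% cut in cost per click, enough to price a non-adopter out of a thin-margin market.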

How about when Google first started hyping HTTPS & publishers using AdSense saw their ad revenue crash because the ads were no longer anywhere near as relevant?

Oops.

Not like Google cared much, as it is their goal to shift as much of the ad spend as they can onto Google.com & YouTube.

Google is now testing product ads on YouTube.

It is not an accident that Google funds an ad blocker which allows ads to stream through on Google.com while leaving ads blocked across the rest of the web.

Android Pay might be worth integrating. But then it also might go away.

It could be like Google’s authorship. Hugely important & yet utterly trivial.

  • Faces help people trust the content.
  • Then they are distracting visual clutter that needs to be expunged.
  • Then they once again re-appear, but ONLY on the Google Home Service ad units.
  • They were once again good for users!!!

Neat how that works.

Embrace, Extend, Extinguish

Or it could be like Google Reader. A free service which defunded all competing products & was then shut down because it didn’t have a legitimate business model, having been built explicitly to prevent competition. With the death of Google Reader many blogs also slid into irrelevancy.

Their FeedBurner acquisition was icing on the cake.

Techdirt is known for generally being pro-Google & they recently summed up FeedBurner nicely:

Thanks, Google, For Fucking Over A Bunch Of Media Websites – Mike Masnick

Ultimately Google is a horrible business partner.

And they are an even worse one if there is no formal contract.

Dumb Pipes, Dumb Partnerships

They tried their best to force broadband providers to be dumb pipes. At the same time they promoted regulation which would prevent broadband providers from tracking their own users the way Google does, all while broadening Google’s own privacy policy to allow personally identifiable web tracking across their network. Once Google knew they would retain an indefinite tracking advantage over broadband providers, they were free to rescind their (heavily marketed) free tier of Google Fiber & halt the Google Fiber build out.

When Google routinely acts so anti-competitive & abusive it is no surprise that some of the “standards” they propose go nowhere.

You can only get screwed so many times before you adopt a spirit of ambivalence to the avarice.

Google is the type of “partner” that conducts security opposition research on their leading distribution partner, while conveniently ignoring nearly a billion OTHER Android phones with existing security issues that Google can’t be bothered with patching.

Deliberately screwing direct business partners is far worse than coding algorithms which belligerently penalize some competing services all the while ignoring that the payday loan shop funded by Google leverages doorway pages.

“User” Friendly

BackChannel recently published an article foaming at the mouth promoting the excitement of Google’s AI:

“This 2016-to-2017 transition is going to move us from systems that are explicitly taught to ones that implicitly learn.” … the engineers might make up a rule to test against—for instance, that “usual” might mean a place within a 10-minute drive that you visited three times in the last six months. “It almost doesn’t matter what it is — just make up some rule,” says Huffman. “The machine learning starts after that.”

The part of the article I found most interesting was the following bit:

After three years, Google had a sufficient supply of phonemes that it could begin doing things like voice dictation. So it discontinued the [phone information] service.

Google launches “free” services with an ulterior data motive & then when it suits their needs, they’ll shut it off and leave users in the cold.

As Google keeps advancing their AI, what do you think happens to your AMP content they are hosting? How much do they squeeze down on your payout percentage on those pages? How long until the AI is used to recap / rewrite content? What ad revenue do you get when Google offers voice answers pulled from your content but sends you no visitor?

The Numbers Can’t Work

A recent Wall Street Journal article highlighting the fast ad revenue growth at Google & Facebook also mentioned how the broader online advertising ecosystem was doing:

Facebook and Google together garnered 68% of spending on U.S. online advertising in the second quarter—accounting for all the growth, Mr. Wieser said. When excluding those two companies, revenue generated by other players in the U.S. digital ad market shrank 5%.

The issue is NOT that online advertising has stalled, but rather that Google & Facebook have choked off their partners from tasting any of the revenue growth. This problem will only get worse as mobile grows to a larger share of total online advertising:

By 2018, nearly three-quarters of Google’s net ad revenues worldwide will come from mobile internet ad placements. – eMarketer

Media companies keep trusting these platforms with greater influence over their business & these platforms keep screwing those same businesses repeatedly.

You pay to get likes, but that is no longer enough as edgerank declines. Thanks for adopting Instant Articles, but users would rather see live videos & read posts from their friends. You are welcome to pay once again to advertise to the following you already built. The bigger your audience, the more we will charge you! Oh, and your direct competitors can use people liking your business as an ad targeting group.

Worse yet, Facebook & Google are even partnering on core Internet infrastructure.

In his interview with Obama tonight, @billmaher suggested the news business should be not-for-profit. Mission accomplished, thank Facebook.— Downtown Josh Brown (@ReformedBroker) November 5, 2016

Any hope of AMP turning the corner on the revenue front is a “no go”:

“We want to drive the ecosystem forward, but obviously these things don’t happen overnight,” Mr. Gingras said. “The objective of AMP is to have it drive more revenue for publishers than non-AMP pages. We’re not there yet.”

Publishers who are critical of AMP were reluctant to speak publicly about their frustrations, or to remove their AMP content. One executive said he would not comment on the record for fear that Google might “turn some knob that hurts the company.”

Look at that.

Leadership through fear once again.

At least they are consistent.

As more publishers adopt AMP, each publisher in the program will get a smaller share of the overall pie.

Just look at Google’s quarterly results for their current partners. They keep showing Google growing their ad clicks at 20% to 40% while partners oscillate between -15% and +5% quarter after quarter, year after year.

In the past quarter Google grew their ad clicks 42% YoY by pushing a bunch of YouTube auto play video ads, faster search growth in third world markets with cheaper ad prices, driving a bunch of lower quality mobile search ad clicks (with 3 then 4 ads on mobile) & increasing the percent of ad clicks on “own brand” terms (while sending the FTC after anyone who agrees to not cross bid on competitors’ brands).

The lower quality video ads & mobile ads in turn drove their average CPC on their sites down 13% YoY.

The partner network is relatively squeezed out on mobile, which makes it shocking to see the partner CPC down even more than core Google’s, with a 14% YoY decline.

What ends up happening is eventually the media outlets get sufficiently defunded to where they are sold for a song to a tech company or an executive at a tech company. Alibaba buying SCMP is akin to Jeff Bezos buying The Washington Post.

The Wall Street Journal recently laid off reporters. The New York Times announced they were cutting back local cultural & crime coverage.

If news organizations of that caliber can’t get the numbers to work then the system has failed.

The Guardian is literally incinerating over 5 million pounds per month. ABC is staging fake crime scenes (that’s one way to get an exclusive).

The Tribune Company, already through bankruptcy & perhaps the dumbest of the lot, plans to publish thousands of AI assisted auto-play videos in their articles every day. That will guarantee their user experience on their owned & operated sites is worse than just about anywhere else their content gets distributed to, which in turn means they are not only competing against themselves but they are making their own site absolutely redundant & a chore to use.

That the Denver Guardian (an utterly fake paper running fearmongering false stories) goes viral is just icing on the cake.

Look at this brazen, amazing garbage. Facebook has become the world’s leading distributor of lies.https://t.co/oueWUiydJO— Matt Pearce (@mattdpearce) November 6, 2016

many Facebook users wish to connect with people and things that confirm their pre-existing opinions, whether or not they are true. … Giving people what they want to see will always draw more attention than making them work for it, in rather the same way that making up news is cheaper and more profitable than actually reporting the truth. – Ben Thompson

These tech companies are literally reshaping society & are sucking the life out of the economy, destroying adjacent markets & bulldozing regulatory concerns, all while offloading costs onto everyone else around them.

The crumbling of the American dream is a purple problem, obscured by solely red or solely blue lenses. Its economic and cultural roots are entangled, a mixture of government, private sector, community and personal failings. But the deepest root is our radically shriveled sense of “we.” … Until we treat the millions of kids across America as our own kids, we will pay a major economic price, and talk of the American dream will increasingly seem cynical historical fiction.

And the solution to killing the middle class, is, of course, to kill the middle class:

„We are going to raise taxes on the middle class“ -Hillary Clinton #NeverHilla… (Vine by @USAforTrump2016) https://t.co/veEiZnfbkH— JKO (@jko417) November 6, 2016

An FTC report recommended suing Google for their anti-competitive practices, but no suit was brought. The US Copyright Office Register was relieved of her job after she went against Google’s views on set top boxes. Years ago many people saw where this was headed:

“This is a major affront to copyright,” said songwriter and music publisher Dean Kay. “Google seems to be taking over the world – and politics … Their major position is to allow themselves to use copyright material without remuneration. If the Copyright Office head is toeing the Google line, creators are going to get hurt.”

Singer Don Henley said Pallante’s ouster was “an enormous blow” to artists. “She was a champion of copyright and stood up for the creative community, which is one of the things that got her fired,” he said. … [Pallante’s replacement] Hayden “has a long track record of being an activist librarian who is anti-copyright and a librarian who worked at places funded by Google.”

And in spite of the growing importance of tech, media coverage of the industry is a trainwreck:

This is what it’s like to be a technology reporter in 2016. Freebies are everywhere, but real access is scant. Powerful companies like Facebook and Google are major distributors of journalistic work, meaning newsrooms increasingly rely on tech giants to reach readers, a relationship that’s awkward at best and potentially disastrous at worst.

Being a conduit breeds exclusives. Challenging the grand narrative gets one blackballed.

Mobile Search Index

Google announced they are releasing a mobile first search index:

Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.

There are some forms of content that simply don’t work well on a 350-pixel-wide screen unless they use a pinch-to-zoom format. But using that format is seen as not being mobile friendly.

Imagine you have an auto part database which lists alternate part numbers, price, stock status, nearest store with part in stock, time to delivery, etc. … it is exceptionally hard to get that information to look good on a mobile device. And good luck if you want to add sorting features on such a table.

The theory that ranking mobile results off the desktop version of a page is flawed (because mobile users might land on something only available on desktop) is a valid point. BUT, at the same time, a publisher may need to simplify the mobile site & hide data to improve usability on small screens, only making certain data visible through user interactions. Not showing those automotive part databases to desktop users would make desktop search results worse by leaving huge gaps in the results. And a search engine choosing not to index the desktop version of a site because a mobile version exists is equally short sighted: desktop users would no longer be able to find & compare information from those automotive parts databases.

Once again money drives search “relevancy” signals.

Since Google will soon make 3/4 of their ad revenues on mobile, that should be the primary view of the web for everyone else, & versions of sites which are not mobile friendly should be disappeared from the search index whenever a crappier lite mobile-friendly version of the page is available.

Amazon converts well on mobile in part because people already trust Amazon & already have an account registered with them. Most other merchants won’t be able to convert at anywhere near the rate on mobile that they do on desktop, so if you have to choose between a mobile friendly version that leaves differentiated aspects hidden or a desktop friendly version that is differentiated & establishes a relationship with the consumer, the deeper & more engaging desktop version is the way to go.

The heavy ad load on mobile search results combines with the low conversion rates on mobile to make building a relationship on desktop that much more important.

Even TripAdvisor is struggling to monetize mobile traffic, monetizing it at only about 30% to 33% of the rate they monetize desktop & tablet traffic. Google already owns most of the profits from that market.

Webmasters are better off NOT going mobile friendly than going mobile friendly in a way that compromises the ability of their desktop site.

Mobile-first: with ONLY a desktop site you’ll still be in the results & be findable. Recall how mobilegeddon didn’t send anyone to oblivion?— Gary Illyes (@methode) November 6, 2016

I am not the only one suggesting an over-simplified mobile design that carries over to a desktop site is a losing proposition. Consider Nielsen Norman Group’s take:

in the current world of responsive design, we’ve seen a trend towards insufficient information density and simplifying sites so that they work well on small screens but suboptimally on big screens.

Tracking Users

Publishers are getting squeezed to subsidize the primary web ad networks. But the narrative is that as cross-device tracking improves some of those benefits will eventually spill back out into the partner network.

I am rather skeptical of that theory.

Facebook already makes 84% of their ad revenue from mobile devices where they have great user data.

They are paying to bring new types of content onto their platform, but they are only just now beginning to test pricing their Audience Network traffic based on quality.

Priorities are based on business goals and objectives.

Both Google & Facebook paid fines & faced public backlash for how they track users. Those tracking programs were considered high priority.

When these ad networks are strong & growing quickly they may be able to take a stand, but when growth slows the stock price crumbles, & data security becomes less important during downsizing, when morale is shattered & talent flees. Further, creating alternative revenue streams becomes vital “to save the company,” even if it means selling user data to dangerous dictators.

The other big risk of such tracking is how data can be used by other parties.

Spooks preferred to use the Google cookie to spy on users. And now Google allows personally identifiable web tracking.

Data is being used in all sorts of crazy ways the central ad networks are utterly unaware of. These crazy policies are not limited to other countries. Buying dog food with your credit card can lead to pet licensing fees. Even cheerful „wellness“ programs may come with surprises.

Want to see what the future looks like?

For starters…

About 2 months ago I saw a Facebook post done on behalf of a friend of mine. Gofundme was the plea. Her insurance wouldn’t cover her treatment for a recurring breast cancer and doctors wouldn’t start the treatment unless the full payment was secured in advance. Really? Really. She was gainfully employed, had a full time, well paying job. But guess what? It wasn’t enough although hundreds of people donated.

This last week she died. She was 38 years old. She died not getting access to a treatment that may or may not have saved her life. She died having to hustle folks for funds to just have a chance to get access to another treatment option and she died while worrying about being financially ruined by her illness. Just horrid.

Is this the society we want? People forced to beg friends on gofundme for help so they can get access to medical treatment? Is this the society we are? Is this truly the best we can do?

Source:: seobook.com

Penguin 4.0 Update

On Friday Google’s Gary Illyes announced Penguin 4.0 was now live.

Key points highlighted in their post are:

  • Penguin is a part of their core ranking algorithm
  • Penguin is now real-time, rather than something which periodically refreshes
  • Penguin has shifted from being a sitewide negative ranking factor to a more granular factor

Things not mentioned in the post

  • if it has been tested extensively over the past month
  • if the algorithm is just now rolling out or if it is already done rolling out
  • if the launch of a new version of Penguin rolled into the core ranking algorithm means old sites hit by the older versions of Penguin have recovered or will recover anytime soon

Since the update was announced, the search results have become more stable.

No signs of major SERP movement yesterday – the two days since Penguin started rolling out have been quieter than most of September.— Dr. Pete Meyers (@dr_pete) September 24, 2016

They still may be testing out fine tuning the filters a bit…

Fyi they’re still split testing at least 3 different sets of results. I assume they’re trying to determine how tight to set the filters.— SEOwner (@tehseowner) September 24, 2016

…but what exists now is likely to be what sticks for an extended period of time.

Penguin Algorithm Update History

  • Penguin 1: April 24, 2012
  • Penguin 2: May 26, 2012
  • Penguin 3: October 5, 2012
  • Penguin 4: May 22, 2013 (AKA: Penguin 2.0)
  • Penguin 5: October 4, 2013 (AKA Penguin 2.1)
  • Penguin 6: rolling update which began on October 17, 2014 (AKA Penguin 3.0)
  • Penguin 7: September 23, 2016 (AKA Penguin 4.0)

Now that Penguin is baked into Google’s core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed “quality” updates.

Volatility Over the Long Holiday Weekend

Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations, in spite of the search results being more volatile than they had been in over 4 years.

There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:

  • no media coverage: few journalists on the job & a lack of expectation that the PR team will answer any questions. no official word beyond rumors from self-promotional marketers = no story
  • many SEOs outside of work: few are watching as the algorithms tip their cards.
  • declining search volumes: long holiday weekends generally have less search volume associated with them. Thus anyone who is aggressively investing in SEO may wonder if their site was hit, even if it wasn’t.
    The communications conflicts this causes (between in-house SEOs and their bosses, as well as between SEO companies and their clients) make the job of the SEO more miserable and make the client more likely to pull back on investment, all while work ruins the SEO’s vacation & causes family friction back home.
  • fresh users: as people travel their search usage changes, thus they have fresh sets of eyes & are doing somewhat different types of searches. This in turn makes their search usage data more dynamic and useful as a feedback mechanism on any changes made to the underlying search relevancy algorithm or search result interface.

Algo Flux Testing Tools

Just about any of the algorithm volatility tools showed far more significant shifts earlier in the month than over the past few days.

Take your pick: Mozcast, RankRanger, SERPmetrics, Algaroo, Ayima Pulse, AWR, Accuranker or SERP Watch; the results came out something like this graph from Rank Ranger:

One issue with looking at any of these indexes is that rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the volatility scores are much higher than the actual shifts in search traffic. (The least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news.)

You can use AWR’s flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
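Tools like these generally track a fixed basket of keywords and average day-over-day position changes. A toy sketch of such a metric (my own illustration, not the actual formula used by any of the trackers named above) also shows why widening the tracked window from the top 10 to the top 20 or 50 inflates the score:

```python
# Toy SERP volatility score: mean absolute day-over-day position change
# across the URLs seen in a tracked window. Hypothetical data; not the
# formula used by Mozcast, Rank Ranger, AWR, or any other tool.

def flux(yesterday: dict, today: dict, max_pos: int = 10) -> float:
    """Mean absolute rank change for URLs appearing in the top max_pos
    on either day. Positions beyond the window (including URLs that
    dropped out entirely) are clamped to max_pos + 1."""
    cap = max_pos + 1
    urls = {u for u, p in yesterday.items() if p <= max_pos}
    urls |= {u for u, p in today.items() if p <= max_pos}
    if not urls:
        return 0.0
    total = 0
    for url in urls:
        before = min(yesterday.get(url, cap), cap)
        after = min(today.get(url, cap), cap)
        total += abs(before - after)
    return total / len(urls)

# Hypothetical rankings for one keyword on two consecutive days:
yesterday = {"a.com": 1, "b.com": 2, "c.com": 3, "d.com": 9, "e.com": 15}
today     = {"a.com": 1, "b.com": 2, "c.com": 4, "d.com": 14, "e.com": 8}

# The top 3 barely move while deeper positions churn, so widening the
# tracked window raises the measured volatility:
print(flux(yesterday, today, max_pos=3))   # small
print(flux(yesterday, today, max_pos=10))  # several times larger
```

In this toy data the top-3 window scores about 0.33 while the top-10 window scores 1.2, even though the traffic-carrying #1 and #2 results never moved at all.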

Example Ranking Shifts

I shut down our membership site in April & spend most of my time reading books & news to figure out what’s next after search. But a couple legacy clients I am winding down work with still have me tracking a few keywords, & one of those terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.

Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.

As a comparison, here is that chart over the past 3 months.

Notice the big ranking moves which became common over the past month were not common the 2 months prior.

Negative SEO Was Real

There is a weird sect of alleged SEOs who believe Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.

As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.

@randfish Incredibly important point is the devaluing of links & not “penalization”. That’s huge. Knocks negative SEO out. @dannysullivan— Glenn Gabe (@glenngabe) September 23, 2016

Update != Penalty Recovery

Part of the reason many people think there was no Penguin update, or responded to the update with “that’s it?”, is that few sites hit in the past recovered, relative to the number of sites which had ranked well until recently but just got clipped by this algorithm update.

When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.

Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.

Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new „better than ever“ spam fighting algorithm.

They’ll let some time pass before the penalized sites can recover.

Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they’ve accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.

What to do?

So here are some of the obvious algorithmic holes left by the new Penguin approach…

  • only kidding
  • not sure that would even be a valid mindset in the current market
  • hell, the whole ecosystem is built on quicksand

The trite advice is to make quality content, focus on the user, and build a strong brand.

But you can do all of those well enough that you change the political landscape yet still lose money.

“Mother Jones published groundbreaking story on prisons that contributed to change in govt policy. Cost $350k & generated $5k in ad revenue”— SEA☔☔LE SEO (@searchsleuth998) August 22, 2016

Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.

Even some of the top brands in big money verticals which were known as the canonical examples of SEO success stories are seeing revenue hits and getting squeezed out of the search ecosystem.

And that is without getting hit by a penalty.

It is getting harder to win in search, period.

And it is getting almost impossible to win in search by focusing on search as an isolated channel.

I never understood mentality behind Penguin „recovery“ people. The spam links ranked you, why do you expect to recover once they’re removed?— SEOwner (@tehseowner) September 25, 2016

Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.

Obviously removing them may get you out of algorithm, but then you’ll only have enough power to rank where you started before spam links.— SEOwner (@tehseowner) September 25, 2016

Anyone operating at scale chasing SEO with automation is likely to step into a trap.

When it happens, that player better have some serious savings or some non-Google revenues, because even with „instant“ algorithm updates you can go months or years on reduced revenues waiting for an update.

And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.

“If you want to stop spam, the most straightforward way to do it is to deny people money, because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do is break their spirits.” – Matt Cutts


Neofeudal Web Publishing Best Practices Guide

At the abstract level, if many people believe in something then it will grow.

The opposite is also true.

And in a limitless, virtual world, you cannot see what is not there.

The Yahoo Directory

Before I got into search, the Yahoo! Directory was so important to the field of search that there were entire sessions at SES conferences on how to get listed, & people would even recommend using #1AAA-widgets.com styled domains to alphaspam listings to the top of a category.

The alphaspam technique was a carryover from yellow page directories, many of which have gone through bankruptcy as attention & advertising shifted to the web.

Visit the Yahoo! Directory today and you get either a server error, a security certificate warning, or a redirect to aabacosmallbusiness.com.

Poof.

It’s gone.

Before the Yahoo! Directory disappeared, its quality standards were vastly diminished. As a webmaster who likes to test things, I tried submitting sites of varying size and quality to different places. Some sites which would get rejected by some $10 directories were approved in the Yahoo! Directory.

The Yahoo! Directory also had a somewhat weird policy where if you canceled a directory listing in the middle of the term they would often keep it listed for many years to come – for free. After many SEOs became fearful of links, the directory saw vastly reduced rates of submission & many existing listings canceled their subscriptions, leaving it as a service without much of a business model.

DMOZ

At one point Google’s webmaster guidelines recommended submitting to DMOZ and the Yahoo! Directory, but that recommendation led to many lesser directories sprouting up & every few years Google would play a whack-a-mole game and strip PageRank or stop indexing many of them.

Many have presumed DMOZ was on its last legs many times over the past decade. But on their 18th birthday they did a spiffy new redesign.

Different sections of the site use different color coding and the design looks rather fresh and inviting.

Take a look.

However improved the design is, it is unlikely to reverse this ranking trend.

Lacking Engagement

Why did those rankings decline though? Was it because the sites suck? Or was it because the criteria to rank changed? If the sites were good for many years it is hard to believe the quality of the sites all declined drastically in parallel.

What happened is that as engagement metrics started getting folded in, sites that only point you to other sites became an unneeded step in the conversion funnel, in much the same way that Google scrubbed affiliates from the AdWords ecosystem as unneeded duplication.

What is wrong with the user experience of a general web directory? There isn’t any single factor, but a combination of them…

  • the breadth of general directories means their depth must necessarily be limited.
  • general directory category pages ranking in search results are like search results inside search results; it isn’t great from the user’s perspective.
  • if a user already knows a category well, they would likely prefer to visit a destination site rather than a category page.
  • if a user doesn’t already know a category, they would prefer an information source which prioritizes listing the best results first. the layout of most general web directories is an alphabetical list of results rather than one displaying the best result first.
  • in order to sound authoritative, many directories prefer to use a neutral tone.

If a directory mostly links to lower quality sites Google can choose to either not index it or not trust links from it. And even if a directory generally links to trustworthy sites, Google doesn’t need to rank it to extract most of the value from it.

The trend of lower traffic to the top tier general directory sites has happened across the board.

Many years ago Google’s remote rater guidelines cited Joeant as a trustworthy directory.

Their traffic chart looks like this.

And the same sort of trend is true for BOTW, Business.com, GoGuides.org, etc.

There is basically nothing a general web directory can do to rank well in Google on a sustainable basis, at least not in the English language.

Even if you list every school in the city of Winnipeg that page can’t rank if it isn’t indexed & even if it is indexed it won’t rank well if your site has a Panda-related ranking issue. There are a couple other issues with such a comprehensive page:

  • each additional listing is more editorial content cost in terms of building the page AND maintaining the page
  • the bigger the page gets the more a user needs something other than alphabetical order as a sort option
  • the more listings there are in a tight category the greater the likelihood there will be excessive keyword repetition on the page, which could get the page flagged for algorithmic demotion even if the publisher has no intent to spam. Simply listing things by their name will mean repeating a word like “school” over 100 times on the above linked Winnipeg schools page. If you don’t consciously attempt to lower the count, a page like that could have the term repeated over 300 times.
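
A quick way to sanity-check that kind of repetition before publishing is to count whole-word occurrences in the page copy. A minimal sketch in Python (the `term_counts` helper and the sample listing data are my own illustration, not any Google tool):

```python
import re
from collections import Counter

def term_counts(page_text, terms):
    """Count case-insensitive whole-word occurrences of each term."""
    words = re.findall(r"[a-z']+", page_text.lower())
    counts = Counter(words)
    return {t: counts[t.lower()] for t in terms}

# A toy listing page: 120 entries, each named "... School",
# mimicking an alphabetical Winnipeg-schools style category page.
listings = "\n".join(f"Example School #{i}" for i in range(120))
print(term_counts(listings, ["school"]))  # {'school': 120}
```

If the dominant term lands in the hundreds, rewording entry descriptions or paginating the category are the obvious ways to bring the count down.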

Knock On Effects

In addition to those web directories getting fewer paid submissions, most are likely seeing a rise in link removal requests. Google’s “fear first” approach to relevancy has even led them to listing DMOZ as an unnatural link source in warning emails to webmasters.

What’s more, many people who use automated link clean up tools take the declining traffic charts & low rankings of the sites as proof that the links lack value or quality.

That means anyone who gets hit by a penalty & ends up in warning messages not only ends up with less traffic while penalized, but they also get extra busy work to do while trying to fix whatever the core problem is.

And in many cases fixing the core problem is simply unfeasible without a business model change.

When general web directories are defunded it not only causes many of them to go away, but it also means other related roles, sites and services disappear:

  • Editors of those web directories who were paid to list quality sites for free.
  • Web directory review sites.
  • SEOs, internet marketers & other businesses which listed in those directories

Now perhaps general web directories no longer really add much value to the web & they are largely unneeded.

But there are other things which are disappearing in parallel which were certainly differentiated & valuable, though perhaps not profitable enough to maintain the “relevancy” footprint to compete in a brand-first search ecosystem.

Depth vs Breadth

Unless you are the default search engine (Google) or the default social network everyone is on (Facebook), you can’t be all things to all people.

If you want to be differentiated in a way that turns you into a destination you can’t compete on a similar feature set because it is unlikely you will be able to pay as much for traffic-driven partnerships as the biggest players can.

Can niche directories or vertical directories still rank well? Sure, why not.

Sites like Yelp & TripAdvisor have succeeded in part by adding interactive elements which turned them into sought after destinations.

Part of how they became destinations was intentionally going out of their way to *NOT* be neutral platforms. Consider how many times Yelp has been sued by businesses which claimed the sales team did or was going to manipulate the displayed reviews if the business did not buy ads. Users tend to trust those platforms precisely because other users may leave negative reviews & that (usually) offers something better than a neutral and objective editorial tone.

And that user demand for those reviews, of course, is why Google stole reviews from those sorts of sites to try to prop up the Google local places pages.

It was a point of differentiation which was strong enough that people wanted it over Google. So Google tried to neutralize the advantage.

Blogs

The above section is about general directories, but the same concept applies to almost any type of website.

Consider blogs.

A decade ago feed readers were commonplace, bloggers often cross-linked & bloggers largely drove the conversation which bubbled up through mainstream media.

Google Reader killed off RSS feed readers by creating a fast, free & ad-free competitor. Then Google abruptly shut down Google Reader.

Not only do whimsical blogs like Least Helpful or Cute Overload arbitrarily shut down, but people like Chris Pirillo who know tech well suggest blogging is (at least economically) dead.

Many of the people who are quitting are not the dumb, the lazy, and the undifferentiated. Rather many are the wise trend-aware players who are highly differentiated yet find it impossible to make the numbers work:

The conversation started when revenues were down, and I had to carry payroll for a month or two out of my personal account, which I had not had to do since shortly after we started this whole project. We tweaked some things (added an ad or two which we had stripped back for the redesign, reminded people about ad-blockers and their impact on our ability to turn a profit, etc.) and revenue went back up a bit, but for a hot minute, you’ll remember I was like: “Theoretically, if this industry went further into the ground which it most assuredly will, would we want to keep running the site as a vanity project? Probably not! We would just stop doing it.”

In the current market Google can conduct a public relations campaign on a topic like payday loans, have their PR go viral & then if you mention “oh yeah, so Google is funding the creation of doorway pages to promote payday loans” it goes absolutely nowhere, even if you do it DURING THE NEWS CYCLE.

So much of what exists is fake that anything new is evaluated from the perception of suspicion.

While the real (and important) news stories go nowhere & the PR distortions spread virally, the individual blogger ends up feeling a bit soulless if they try to make ends meet:

“The American Mama reached tens of thousands of readers monthly, and under that name I worked with hundreds of big name brands on sponsored campaigns. I am a member of virtually every ‘blog network‘ and agency that “connects brands with bloggers”. … What’s the point of having your own space to write if you’re being paid to sound like you work for a corporation? … PR Friendly says “For the right price, I will be anyone you want me to be.” … I’m not saying blogging is dying, but this specific little monster branch of it, sponsored content disguised as horribly written “day in the life” stories about your kids and pets? It can’t possibly last. Do you really want to be stuck on the inside when it crumbles?”

If you can’t get your own site to grow enough to matter then maybe it makes sense to contribute to someone else’s to get your name out there.

I recently received this unsolicited email:

“Hello! This is Theodore, a writer and chief editor at SomeSiteName.Com I noticed that you are accepting paid reviews online and you will be glad to know that now you can also publish your Sponsored content to SomeSite via me. SomeSite.Com is a leading website which deals in Technology, Social Media, Internet Stuff and Marketing. It was also tagged as Top 10 _____ websites of 2016 by [a popular magazine]. Website Stats- Alexa Rank: [below 700] Google PageRank: 6/10 Monthly Pageviews: 5+ Million Domain Authority: 85+ Price : $500 via PayPal (Once off Payment) Let me know if you are interested and want to feature your website product like nothing! This will not only increase your traffic but increase in overall SEO Score as well. Thanks”

That person was not actually a member of that site’s team, but they had found a way to get their content published on it.

In part because that sort of stuff exists, Google tries to minimize the ability for reputation to flow across sites.

The large platforms are so smug, so arrogant, they actually state the following sort of crap in interviews:

“There’s a space in the world for art, but that’s different from trying to build products at scale. The one thing that does make me a little nervous is a lot of my designer friends are still focused building websites and I’m not sure that’s a growth business anymore. If you look at people who are doing interesting work, they tend to be building inside these platforms like Facebook and finding ways to do interesting work in there. For instance, journalists. Instant Articles is a really great way for stories to be told.”

Sure you can bust your ass to build up Facebook, but when their business model changes (bye social gaming companies, hello live streaming video) best of luck trying to follow them.

And if you starve during the 7 lean years in between when your business model is once again well aligned with Facebook you can’t go back in time to give yourself a meal to un-starve.

Content Farms

eHow.com has removed *MILLIONS* of pages of content since getting hit by Panda. And yet their ranking chart looks like this.

What is crazy is the above chart actually understates the declines, because the shift of search to mobile & the increasing prevalence of ads in the search results mean estimates of organic search traffic may be significantly overstated compared to a few years prior.

A half-decade ago a bootstrapped eHow competitor named ArticlesBase got some buzz in TechCrunch because they were making about $500,000 a month on about 20 million monthly unique visitors. That business was recently listed on Flippa. They are getting about a half-million unique monthly visitors (off about 97.5%) and about $2,000 a month in revenues (off about 99.6%).
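
Taking the quoted round figures at face value, the percentage declines work out roughly like this (my own back-of-envelope arithmetic, not numbers published by TechCrunch or Flippa):

```python
# Rough decline math from the round ArticlesBase figures (illustrative only).
def pct_decline(before, after):
    return (before - after) / before * 100

visitors = pct_decline(20_000_000, 500_000)  # monthly unique visitors
revenue = pct_decline(500_000, 2_000)        # monthly revenue in dollars

print(f"visitors down {visitors:.1f}%, revenue down {revenue:.1f}%")
# visitors down 97.5%, revenue down 99.6%
```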

The negative karma with that site (in terms of ability to rank) is so bad that the site owner suggested on Flippa to publish any new content from new authors onto different websites: “its not going to get to 0 as most of the traffic is not google today, but we would suggest to push out the fresh daily incoming content to new sites – thats where the growth is.”

Now a person could say “eHow deserves to die” and maybe they are right. BUT one could easily counter that point by noting…

  • the public who owns the shares owns the ongoing losses & many top insiders cashed out long ago
  • Google was getting a vig on eHow on their ride up & is still collecting one on the way down (along with funding other current parallel projects from the very same people with the very same Google ad network)
  • Demand Media’s partner program where they syndicate eHow-like content to newspapers like USA Today keeps growing at 15% to 20% a year (similar process, author, content, business model, etc. … only a different URL hosting the content)
  • look at this and you’ll see how many publishing networks are still building the same sort of content but are cross-marketing across networks of sites. What’s more some of the same names are at the new plays. For example, Demand Media’s founder was the chairman of an SEO firm bought by Hearst publishing & his wife is on the about us page of Evolve Media’s ModernMom.com

The wrappers around the content & masthead logos change, but by and large the people and strategies don’t change anywhere near as quickly.

Web Portals & News Sites

As the mainstream media gets more desperate, they are more willing to partner with the likes of Demand Media to get any revenue they can.

You see the reality of this desperation in the stock charts for newspaper companies.

Or how about this chart for Yahoo.com.

It doesn’t look particularly bad, especially if you consider that Yahoo has shut down many of their vertical sites.

A flat underlying search traffic chart misses declining publisher CPMs and the shift in click mix away from organic toward paid search channels as search traffic moves to mobile devices & Google relentlessly increases the size of the search ads. Yahoo may still rank #3 for keyword x, but if that #3 ranking is below the fold on both mobile and desktop devices they might need to rank #1 to get as much traffic as #3 got a couple years ago.

Yahoo! was once the leading search portal & now they are worth about 1/5th of LinkedIn (after backing out their equity stakes in Alibaba and Yahoo! Japan).

The chart is roughly flat, but the company is up for a fire sale because organic search result displacement & declines in the value of traffic have come quicker than Yahoo! can fire employees, & none of their Hail Mary passes worked.

Ms. Mayer compared the [Polyvore] deal to Google’s acquisition of YouTube in 2006, arguing that “you can never overpay” for a company with the potential to land a huge new base of users.

“Her core mistake was this belief that she could reinvent Yahoo,” says a former senior executive who left the company last year. “There was an element of her being a true believer when everyone else had stopped.”

The same line of thinking was used to justify the Tumblr acquisition, which has gone nowhere fast – just like their 50+ other acquisitions.

Yahoo! shut down many verticals, fired many workers, sold off some real estate & is exploring selling their patents.

Chewing Up the Value Chain

Smaller devices that are harder to use means the gateways have to try to add more features to maintain relevance.

As they add features, publishers get displaced:

The Web will only expand into more aspects of our lives. It will continue to change every industry, every company, and every life on the planet. The Web we build today will be the foundation for generations to come. It’s crucial we get this right. Do we want the experiences of the next billion Web users to be defined by open values of transparency and choice, or the siloed and opaque convenience of the walled garden giants dominating today?

And if converting on mobile is hard or inconvenient, many people will shift to the defaults they know & trust, choosing to buy on Amazon rather than from a smaller ecommerce website. One of my friends who was in ecommerce for many years said this ultimately became the problem with his business. People would email him back and forth about a product and related questions, going all the way through the sales process, getting him to answer every concern & recommend each additional related product needed. Then at the end they would ask him to price match Amazon & if he couldn’t they would buy from Amazon. If he had more scale he might have been able to get a better price from suppliers and compete with Amazon on price, but his largest competitor who took out warehouse space also filed for bankruptcy because they were unable to make the interest payments on their loans.

We live in a society which over-values ease-of-use & scale while under-valuing expertise.

Look at how much consolidation there has been in the travel market since Google Flights launched & Google went pay-to-play with hotel search.

Expedia owns Travelocity & Orbitz. Priceline owns Kayak. Yahoo! Travel simply disappeared. TripAdvisor is strong, but even they were once a part of Expedia.

How different are the remaining OTAs? One could easily argue they are less differentiated from one another than this article about the history of the travel industry makes Skift appear against other travel-related news sites.

How many markets are strong enough to support the creation of that sort of featured editorial content?

Not many.

And most companies which can create that sort of in-depth content leverage the higher margins on shallower & cheaper content to pay for that highly differentiated featured content creation.

But if the knowledge graph and new search features are simply displacing the result set, the number of people who will be able to afford creating that in-depth featured content is only further diminished.

Over 5 years ago Bing’s Stefan Weitz mentioned they wanted to move search from a web of nouns to a web of verbs & to “look at the web as a digital representation of the physical world.” Some platforms are more inclusive than Google is & decide to partner rather than displace, but Bing’s partnership with Yelp or TripAdvisor doesn’t help you if you are a direct competitor of Yelp or TripAdvisor, or if your business was heavily reliant on one of these other channels & you fall out of favor with them.

Chewing Up Real Estate

There are so many enhanced result features in the search results it is hard to even attempt to make an exhaustive list.

As search portals rush to add features they also rush to grab real estate & outright displace the concept of “10 blue links.”

There has perhaps been nothing which captured the sentiment better than this tweet:

.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014

The following is paraphrased, but captures the intent to displace the value chain & the role of publishers.

“the journeys of users. their desire to be taken and sort of led and encouraged to proceed, especially on mobile devices (but I wouldn’t say only on mobile devices).

there are a lot of users who are happy to be provided with encouragement and leads to more and more interesting information and related, grouped in groups, leading lets say from food to restaurants, from restaurants to particular types of restaurants, from particular types of restaurants to locations of those types of restaurants, ordering, reservations.

I’m kind of hungry, and in a few minutes you’ve either ordered food or booked a table. Or I’m kind of bored, and in a few minutes you’ve found a book to read or a film to watch, or some other discovery you are interested in.” – Andrey Lipattsev

What role do publishers have in the above process? Unpaid data sources used to train algorithms at Facebook & Google?

Individually each of these assistive search feature roll outs may sound compelling, but ultimately they defund publishing.

Looks like Symptom Cards will lead to additional, more-focused searches (& not to third party sites.) #seo pic.twitter.com/vhkz5ZflMJ— Glenn Gabe (@glenngabe) June 20, 2016

Not a „Google Only“ Problem

People may think I am unnecessarily harsh toward Google in my views, but this sort of shift is not a Google-only thing. It is something all the large online platforms are doing. I simply give Google more coverage because they have a history of setting standards & moving the market, whereas a player like Yahoo! is acting out of desperation to simply try to stay alive. The market capitalization of the companies reflect this.

Google & Facebook control the ecosystem. Everyone else is just following along.

“digital is eating legacy media, mobile is eating digital, and two companies, Facebook and Google, are eating mobile. … Since 2011, desktop advertising has fallen by about 10 percent, according to Pew. Meanwhile mobile advertising has grown by a factor of 30 … Facebook and Google, control half of net mobile ad revenue.” – Derek Thompson

The same sort of behavior is happening in China, where Google & Facebook are prohibited from competing.

As publishers get displaced and defunded online platforms can literally buy the media: “There’s very little downside. Even if we lose money it won’t be material,” Alibaba’s Mr. Tsai said. “But the upside [in buying SCMP] is quite interesting.”

The above quote was on Alibaba buying the newspaper of record in Hong Kong.

As bad as entire industries becoming token purchases may sound, that is the optimistic view. 😀

Facebook’s Instant Articles and Google’s AMP make a token purchase unnecessary: “I don’t think it’s any secret that you’re going to see a bloodbath in the next 12 months,” Vice Media’s Shane Smith said, referring to digital media and broadcast TV. “Facebook has bought two-thirds of the media companies out there without spending a dime.”

Those services can dictate what gets exposure, how it is monetized, and then adjust the exposure and revenue sharing over time to keep partners desperate & keep them hooked.

“If Thiel and Nick Denton were just a couple of rich guys fighting over a 1st Amendment edge case, it wouldn’t be very interesting. But Silicon Valley has unprecedented, monopolistic power over the future of journalism. So much power that its moral philosophy matters.” – Nate Silver

Give them just enough (false) hope to stay partnered.

All the while track user data more granularly & run AI against it to disintermediate & devalue partners.

TV networks are aware of the risks of disintermediation and view Netflix with more suspicion than informed SEOs view Google:

for all the original shows Netflix has underwritten, it remains dependent on the very networks that fear its potential to destroy their longtime business model in the way that internet competitors undermined the newspaper and music industries. Now that so many entertainment companies see it as an existential threat, the question is whether Netflix can continue to thrive in the new TV universe that it has brought into being.

“‘Breaking Bad‘ was 10 times more popular once it started streaming on Netflix.” – Michael Nathanson

the networks couldn’t walk away from the company either. Many of them needed licensing fees from Netflix to make up for the revenue they were losing as traditional viewership shrank.

And just like Netflix, Facebook will move into original content production.

The Wiki

Wikipedia is certainly imperfect, but it is also a large part of why other directories have gone away. It is basically a directory tied to an encyclopedia which is free and easy to syndicate.

Every large search & discovery platform has an incentive for Wikipedia to be as expansive as possible.

The bigger Wikipedia gets, the more potential answers and features can be sourced from it. More knowledge graph, more instant answers, more organic result displacement, more time on site, more ad clicks.

Even if a knowledge graph listing is wrong, the error doesn’t harm the search service syndicating the content unless people make a big deal of it. And if that happens, people will give feedback on how to fix the error & that becomes a PR lead into the narrative of how quickly search is improving and evolving.

“Wikipedia used to instruct its authors to check if content could be dis-intermediated by a simple rewrite, as part of the criteria for whether an article should be added to wikipedia. There are many rascals on the Internets; none deserving of respect.” – John Andrews

Sergey Brin donates to fund the expansion of Wikipedia. Wikipedia rewrites more webmaster content. Google has more knowledge graph grist and rich answers to further displace publishers.

I recently saw the new gray desktop search results Google is testing. In that treatment the knowledge graph appears inline with the regular search results & even on my huge monitor the organic result set is below the fold.

The problem with that is if your brand name is the same brand name that is in the knowledge graph & you are not the dominant interpretation then you are below the fold on all devices for your core brand UNLESS you pay Google for every single click.

How much should a brand like The Book of Life pay Google for being a roadblock? What sort of tax is appropriate & reasonable? How high will you bid in a casino where the house readjusts the shuffle & deal order in the middle of the hand?

I recently did a search on Bing & inside their organic search results they linked to a Mahalo-like page called Bing Knows. I guess this is a feature in China, but it could certainly spread to other markets.

If they partnered with an eBay or Amazon.com and put a “buy now” button in the search results they’d have just about completely closed the loop there.

Broad Commodification

The reason I started this article with directories is their role is to link to sites. They are categorized collections of links which have been heavily commodified & devalued to the point they are rendered unnecessary and viewed adversely by much of the SEO market (even the ones with decent editorial standards).

Just like links got devalued, so did domain names.

And, as mentioned above in the parts about blogging, content farms, web portals & news sites … the same trend is happening to almost every type of content.

Online ad revenues are still growing quickly, but they are not flowing through to old media & many former leading bloggers consider blogging dead.

Big platform players like Google and Facebook broaden cross-device user tracking to create new relevancy signals and extract most of the value created by publishers. The more information the platform owns, the more of a starving artist the partners become.

As partners become more desperate, they overvalue growth (just like Yahoo! with Polyvore):

“It’s the golden age right now,” [Thrillist CEO Ben Lerer] said. “If you’re a digital publisher, you have every big TV company calling you. When I look at media brands, if a media brand disappeared tomorrow, would I notice?” he said. “And there are a bunch of brands that have scale, and maybe a lot of money raised, and maybe this and that, but, actually, I might not know for a year. There’s so many brands like that. Like, what does it really stand for? Why does it exist?”

Disruption is not a strategy, but the whole point of accelerating it & pushing it (without an adequate plan for „what’s next“) is to re-establish feudal lords.

The web is a virtual land where the commodity which matters most is attention. If you go back in time, lords maintained wealth & control through extracting rents.

A few years ago a quote like the following one may have sounded bizarre or out of place

These are the people who guard the company’s status as what ranking team head Amit Singhal often sees characterised as “the biggest kingmaker on this Earth.”

But if you view it through some historical context it isn’t hard to understand

“The nobles still had the power to write the law, and in a series of moves that took place in different countries at different times, they taxed the bazaar, broke up the guilds, outlawed local currencies, and bestowed monopoly charters on their favorite merchants. … It was never really about efficiency anyway; industrialization was about restoring the power of those at the top by minimizing the value and price of human laborers.” – Douglas Rushkoff

Google funding LendUp & ranking their doorway pages while hitting the rest of the industry is Google bestowing “monopoly charters on their favorite merchants.”

Headwinds

The issue is not that the value of anything drops to zero, but rather that a combined set of factors shrinks the size of the market which can be profitably served. Each of these factors eats at margins…

  • lower CPMs
  • the rise of ad blockers (funded largely by some big ad networks paying to allow their own ads through while blocking competing ad networks)
  • rise of programmatic ads (which shift advertiser budget away from publisher to various forms of management)
  • larger ad sizes: “Based on early testing, some advertisers have reported increases in clickthrough rates of up to 20% compared to current text ads.”
  • increase of vertical search results in Google & more ads + self-hosted content in Facebook’s feed
  • shift of search audience to mobile devices which have no screen real estate for organic search results and lower cost per click (there’s a reason Google AdSense is publishing tips on making more from mobile)
  • increased algorithmic barrier to entry and longer delay times to rank
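
Because these headwinds compound multiplicatively, a handful of modest hits can roughly halve publisher revenue. A toy model of that compounding (the individual percentages below are my own assumptions for illustration, not figures from this article):

```python
# Hypothetical per-factor revenue hits (assumed, for illustration only).
headwinds = {
    "lower CPMs": 0.15,
    "ad blockers": 0.10,
    "programmatic take": 0.10,
    "larger search ads": 0.12,
    "mobile mix shift": 0.15,
}

remaining = 1.0
for factor, hit in headwinds.items():
    remaining *= (1 - hit)  # each factor trims what is left, not the original base

print(f"revenue remaining: {remaining:.0%}")  # revenue remaining: 51%
```

No single factor looks fatal on its own, which is part of why the squeeze is easy to underestimate.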

The least sexy consultant pitch in the world: “Sure I can probably rank your website, but it will take a year or two, cost you at least $80,000 per year, and you will still be below the fold even if we get to #1 because the paid search ads fill up the first screen of results.”

That isn’t going to be an appealing marketing message for a new small business with a limited budget.

The Formula

“The open web is pretty broken. … Railroad, electricity, cable, telephone—all followed this similar pattern toward closedness and monopoly, and government regulated or not, it tends to happen because of the power of network effects and the economies of scale” – Ev Williams.

The above article profiling Ev Williams also states: “An April report from the web-analytics company Parse.ly found that Google and Facebook, just two companies, send more than 80 percent of all traffic to news sites.”

The same general trend is happening to almost every form of content – video, news, social, etc..

  • a big platform over-promotes a vertical to speed up buy-in (perhaps even offering above market rates or other forms of compensation to get the flywheel started)
  • other sources join the market without that compensation & then the compensation stream gets yanked
  • displacement of the source by a watered down copy (eHow or Wikipedia styled rewrite), or some zero-cost licensing arrangement (Facebook Instant Articles, Google AMP, syndicating Wikipedia rewrites)
  • strategic defunding of the content source
  • promise of future gains causing desperate publishers to lean harder into Google or Facebook even as they squeeze more water out of the rock.

Hey, sure your traffic is declining & your revenue is declining faster. You are getting squeezed out, but if you trust the primary players responsible for the shift & rely on Instant Articles or Google’s AMP this time will be different.

…or maybe not…

Facts & Opinions

When I saw some Google shills syndicating Google’s “you can’t copyright facts” pitch without question I cringed, because I knew where that was immediately headed.

A year later the trend was obvious.

The Internet commoditized the distribution of facts. The „news“ media responded by pivoting wholesale into opinions and entertainment.— Naval Ravikant (@naval) May 26, 2016

So now we get story pitches where the author tries to collect a few quote sources to match the narrative already in their head. Surely this has gone on for a long time, but it has rarely been so transparently obvious and cringeworthy as it is today.

How modern journalism works pic.twitter.com/i2CRnwAWZy— Nick Cohen (@NickCohen4) June 15, 2016

And if you stray too far from facts into opinions & are successful, don’t be surprised if you end up on the receiving end of proxy lawsuits:

Can we talk about how strange it is for a group of Silicon Valley startup mentors to embrace secret proxy litigation as a business tactic? To suddenly get sanctimonious about what is published on the internet and called News? To shame another internet company for not following ‘the norms’ of a legacy industry? The hypocrisy is mind bending.

The desperation is so bad news sites don’t even attempt to hide it. And part of what is driving that is bot-driven content further eroding margins on legitimate publishing. Google not only ranks those advertorials, but they also promote some of the auto-generated articles which read like:

As many as 1 analysts, the annual sales target for company name, Inc. (NYSE:ticker) stands at $45.13 and the median is $45.13 for the period closed 3.

The bearish target on sales is $45.13 and the bullish estimate is $45.13, yielding a standard deviation of 1.276%.

Not more than 1 investment entities have updated sales projections on upside over the last week while 1 have downgraded their previously provided sales targets. The estimates highlight a net change of 0% over the last 1 weeks period.

Sales estimated amount is a foremost parameter in judging a firm’s performance. Nearly 1 analysts have revised sales number on the upside in last one month and 1 have lowered their targets. It demonstrates a net cumulative change of 0% in targets against sales forecasts which were given a month ago.

In latest quarterly period, 1 have revised targeted sales on upside and 1 have decreased their projections. It demonstrates change of 4.898%.

I changed a few words in each sentence of that quote to make it harder to find the source as I wasn’t trying to out them specifically. But the auto-generated content was ranked by Google & monetized via inline Google AdSense ads promoting the best marijuana stocks to invest in and warning of a pending 80% stock market crash coming soon this year.

Hey at least it isn’t a TOTALLY fake story!

Publishers get the message loud and clear. Tronc wants to ramp up on AI driven video content at scale:

“There’s all these really new, fun features we’re going to be able to do with artificial intelligence and content to make videos faster,” Ferro told interviewer Andrew Ross Sorkin. “Right now, we’re doing a couple hundred videos a day; we think we should be doing 2,000 videos a day.”

All is well, news & information are just externalities to a search engine ad network.

No big deal.

“With newspapers dying, I worry about the future of the republic. We don’t know yet what’s going to replace them, but we do already know it’s going to be bad.” – Charlie Munger

Build a Brand

Build a brand, that way you are protected from the rapacious tech platforms.

Or so the thinking goes.

But that leads back to the above image where The Book of Life is below the fold on their own branded search query because there is another interpretation Google feels is more dominant.

The big problem with “brand as solution” is you not only have to pay to build a brand, but then you have to pay to protect it.

And the number of search “innovations” to try to siphon off some late funnel branded traffic and move it back up the funnel to competitors (to force the brand to pay again for their own brand to try to displace the “innovations”) will only continue growing.

And if at any point in time Disney makes a movie using your brand name as the name of the movie, you are irrelevant and in need of a rebrand overnight, unless you commit to paying Google for your brand forever.

Having an offline location can be a point of strength and a point of differentiation. But it can also be a reason for Google to re-route user traffic through more Google owned & controlled pages.

Further, most large US offline retailers are doing horribly.

Almost all the offline growth is in stores selling dirt cheap unbranded imported stuff like Dollar General or Family Dollar & stores like Ross and TJ Maxx which sell branded item remainders at discount prices. And as Amazon gets more efficient by the day, other competitors with high cost structures & less efficient operations grow relatively less efficient over time.

The Wall Street Journal recently published an article about a rift between Wal-Mart & Procter & Gamble: “They sell crappy private label, so you buy Swiffer with a crappy refill,” said one of the people familiar with the product changes. “And then you don’t buy again.”

In trying to drive sales growth, P&G is resorting to some Yahoo!-like desperate measures, including meetings where “Some workers donned gladiator-like armor for the occasion.”

Riding on other platforms or partners carries the same sorts of risks as trusting Google or Facebook too much.

Even owning a strong brand name and offline distribution does not guarantee success. Sears already spun out their real estate & they are looking to sell the Kenmore & Craftsman brands.

The big difference between the web and offline platforms is that the marginal cost of information is zero, so web platforms can quickly & cheaply spread into adjacent markets in ways that physically constrained offline players cannot. And some of the big web platforms have far more data on people than governments do. It is worth noting that one of the things that came out of the Snowden leaks is spooks were leveraging Google’s DoubleClick cookies for tracking users.

As desperate stores/platforms see slowing growth they squeeze for margins and seek to accelerate growth any way possible. Chasing growth ultimately leads to the promise of what differentiates them disappearing. I recently bought some “hand crafted” soaps on Etsy, which shipped from Shenzhen.

I am not sure how that impacts other artisanal soap sellers, but it makes me less likely to buy that sort of product from Etsy again.

And for as much as I like shopping on Amazon, I was uninspired when a seller recently sent me this.

Amazon might usually be great for buyers & great for affiliates, but hearing how they are quickly expanding their private label offerings wouldn’t be welcome news for a merchant who is overly-reliant on them for sales in any of those categories.

The above sort of activity is what is going on in the real world even among brands which are not under attack.

The domestic economic landscape is getting quite ugly:

America’s economy today is in some respects more concentrated than it was during the Gilded Age, whose excesses prompted the Progressive Era reforms the FTC exemplifies. In sector after sector, from semiconductors and cable providers to eyeglass manufacturers and hotels, a handful of companies dominate. These giants use their market power to hike prices for consumers and suppress wages for workers, worsening inequality. Consolidation also appears to be driving a dramatic decline in entrepreneurship, closing off opportunity and suppressing growth. Concentration of economic power, in turn, tends to concentrate political power, which incumbents use to sway policies in their favor, further entrenching their dominance.

And the local abusive tech monopolies are now firmly promoting the TPP: “make it more difficult for TPP countries to block Internet sites” = countries should have less influence over the web than individual Facebook or Google engineers do.

In a land of algorithmic false positives that cause personal meltdowns and organizational breakdowns there is nothing wrong at all with that!

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

Luckily the world can depend on China to drive growth and it will save us.

Or maybe there is a small problem with that line of thinking…

Beijing’s intellectual property regulator has ordered Apple Inc. to stop sales of the iPhone 6 and iPhone 6 Plus in the city, ruling that the design is too similar to a Chinese phone, in another setback for the company in a key overseas market.

Can any experts chime in on this?

Let’s see…

First, there is Wal-Mart selling off their Chinese e-commerce operation to the #2 Chinese ecommerce company & then there’s this from the top Chinese ecommerce company:

“The problem is the fake products today are of better quality and better price than the real names. They are exactly the same factories, exactly the same raw materials but they do not use the names.” – Alibaba’s Jack Ma


Source:: seobook.com

The 4 Fundamental Steps of Conversion Optimization

Once upon a time, I was sitting in my office looking over data for one of our new clients and reviewing the conversion project roadmap. The phone rang, and on the other end was the VP of marketing for a multi-billion-dollar company. It is very unusual to get an unannounced call from someone at his level, but he had an urgent problem to solve: a good number of his website visitors were not converting.

His problem did not surprise me. We deal with conversion rate optimization every day.

He invited me to meet with his team to discuss the problem further. The account would be a huge win for Invesp, so we agreed on a time that worked for both of us. When the day came, our team went to the company’s location.

We started the discussion, and things did NOT go as I expected. The VP, who led the meeting, said, “we have a conversion problem.”

“First-time visitors to our website convert at a rate of 48%. Repeat visitors convert at 80%!”

I was puzzled.

Not sure what exactly puzzled me. Was it the high conversion numbers, or the fact that the VP was not happy with them? He wanted more.

I thought he had his conversion numbers wrong. But nope. We looked at his analytics, and he was correct. The numbers were simply amazing by all standards. The VP, however, had a different mindset. The company runs thousands of stores around the US. When someone picks up the phone and calls them, they convert callers at a 90% rate. He was expecting the same conversion rate for his online store.

Let’s face it: a typical e-commerce store converts at an average of 3%. Few websites are able to get anywhere from 10 to 18%. These are considered the stars of the conversion rate world.

The sad truth about a website with 15% conversion rate is that 85% of the visitors simply leave without converting. Money left on the table, cash the store will not be able to capture. Whatever way you think about it, we can agree that there is a huge opportunity, but it is also a very difficult one to conquer.
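The arithmetic above is easy to make concrete. A minimal sketch of the “money left on the table” math, where the visitor count and average order value are hypothetical examples:

```python
# Rough "money left on the table" estimate for a store converting at 15%.
# The visitor count and average order value are hypothetical examples.
monthly_visitors = 100_000
conversion_rate_pct = 15
avg_order_value = 60.0

converting = monthly_visitors * conversion_rate_pct // 100   # 15,000 buyers
leaving = monthly_visitors - converting                      # the other 85%

captured_revenue = converting * avg_order_value
# What a single extra point of conversion (15% -> 16%) would be worth:
one_point_value = monthly_visitors * 1 // 100 * avg_order_value

print(f"{leaving:,} visitors leave without converting")
print(f"captured: ${captured_revenue:,.0f}; one extra point is worth ${one_point_value:,.0f}/mo")
```

Even at a “star” 15% conversion rate, each additional point is worth real money, which is why the VP in the story kept pushing.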

The Problem with Conversion Optimization

Most companies jump into conversion optimization with a lot of excitement. As you talk to teams conducting conversion optimization, you notice a common thread. They take different pages of the website and run tests on them. Some tests produce results; others do not. After a while, the teams run out of ideas. The managers run out of excitement.

The approach of randomly running tests on different pages treats conversion rate optimization as linear. The real problem is that no one shops online in a linear fashion. We do not follow a linear path when we navigate from one area of a website to the next. Most of the time, humans are random, or at least they appear to be.

What does that mean?

The right approach to increasing conversion rates needs to be systematic, because it deals with irrational and random human behavior.

So, how do you do this?

The Four Steps to Breaking Into Double-Digit Conversion Rates

After ten years of doing conversion optimization at Invesp, I can claim that we have a process that works for many online businesses. The truth is that it continues to be a work in progress.

These are the four steps you should follow to achieve your desired conversion rate:

Create Personas for Your Website

I can never stop talking about personas and the impact they have on your website. While most companies talk about their target market, personas help you translate your generalized and somewhat abstract target market data into a personalized experience that impacts your website design, copy and layout.

Let’s take the example of a consulting company that targets “e-commerce companies with a revenue of 10 million dollars or more.” There are two problems with this statement:

  • The statement is too general about the target market (no verticals and no geography, for example)
  • I am not sure how to translate this statement into actionable items on my website or marketing activity

You should first think about the actual person who would hire the services of this consulting company. Most likely, the sale is made to:

  • A business owner for a company with annual revenue from 10 to 20 million dollars.
  • A marketing director for a company with annual revenue from 20 to 50 million dollars.
  • A VP of marketing for a company with annual revenue over 50 million dollars.

Now, translate each of these three different cases into a persona.

So, instead of talking about a business owner for a company that is generating annual revenue from 10 to 20 million dollars, we will talk about:

John Riley, 43 years old, completed his B.A. in physics from the University of Michigan-Ann Arbor. He is a happy father of three. He started the company in 2007 and financed it from his own pocket. His company generated 13.5 million dollars of revenue in 2014 and expects to see a modest 7% increase in sales in 2015. John is highly competitive, but he also cares about his customers and thinks of them as an extended family. He would like to find a way to increase this year’s revenue by 18%, but he is not sure how to do so. He is conservative when it comes to using new marketing techniques. In general, John does not trust consultants and thinks of them as overpaid.

This is an oversimplification of the persona creation process and its final product. But you get the picture. If you are the consulting company that targets John, then what type of website design, copy and visitor flow would you use to persuade him to do business with you?

What data points do you use to create personas for your website? I would start with this:

  • Market research
  • Demographical studies
  • Usability studies
  • Zip code analysis
  • Existing customer surveys
  • Competitive landscape
  • AB and Multivariate testing data

A website or a business should typically target four to seven personas.

Add Traffic Sources

So, you have the personas. These personas should impact your design, copy and visitor flow.

But how?

Let’s start by looking at analytics data. Look at a period of six months to one year and see the top traffic sources/mediums. If your website has been online for a while, then you will probably have hundreds of different sources. Start with your top 10 traffic sources/mediums and create a matrix for each persona/traffic source/landing page combination:

Now, your job is to evaluate each top landing page for each traffic source through the eyes of your website personas. For each page, you will answer eight questions.
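One way to keep that matrix manageable is a simple mapping keyed by (persona, traffic source, landing page), with one cell per combination to hold the answers and resulting actions. The persona names, sources, and pages below are hypothetical placeholders:

```python
# A minimal sketch of the persona / traffic source / landing page matrix.
# Personas, sources, and pages are hypothetical placeholders.
from itertools import product

personas = ["business_owner", "marketing_director", "vp_marketing"]
sources = ["google_organic", "email", "facebook"]        # top sources from analytics
landing_pages = ["/", "/pricing", "/case-studies"]       # top landing pages per source

# One cell per combination; each cell holds the answers to the eight
# persona questions for that page, seen through that persona's eyes.
matrix = {
    (persona, source, page): {"answers": [], "actions": []}
    for persona, source, page in product(personas, sources, landing_pages)
}

matrix[("business_owner", "google_organic", "/pricing")]["answers"].append(
    "Needs trust signals: distrusts consultants, conservative buyer."
)

print(len(matrix))  # 3 personas x 3 sources x 3 pages = 27 cells to evaluate
```

Even this small example yields 27 cells, which is why the article suggests starting with only your top 10 sources rather than all of them.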

The persona questions: Eight questions to ask

  • What type of information would persona “x” need to see to click through to the next page of the website?
  • What would be persona “x”’s top concerns when looking at the page?
  • What kind of copy does persona “x” need to see?
  • What type of trigger words are important to include on the page for persona “x”?
  • What words should I avoid for persona “x”?
  • What kind of headline should I use to persuade persona “x” to stay on my website?
  • What kind of images should I use to capture persona “x”’s attention?
  • What elements on the page could distract persona “x”?

As you answer these questions for each of the personas, you will end up with a large set of answers and actions. The challenge and the art will be to combine all these and make the same landing page work for all different personas. This is not a small task, but this is where the fun begins.

Consider the Buying Stages

You thought the previous work was complex? Well, you haven’t seen anything just yet!

Not every visitor who lands on your website is ready to buy. Visitors come to your website in different buying stages, and only 15-20% are in the action stage. The sequential buying stages of a visitor are:

  • Awareness stage (top of the sales funnel)
  • Research stage
  • Evaluating alternatives
  • Action stage
  • Post action

A typical buying funnel looks like this:

How does that translate into actionable items on your website?

In the previous exercise, we created a list of changes on different screens or sections of your website based on the different personas. Now, we are going to think about each persona landing on the website in one of the first four buying stages.

Instead of thinking of how to adjust a particular screen for John Riley, now you think of a new scenario:
Persona “x” is in the “evaluating alternatives” stage of the buying funnel. He lands on a particular landing page. What do I need to adjust in the website design and copy to persuade persona “x” to convert?

Our previous table looks like this now:

Next, answer all eight persona-questions again, based on the different buying stages.
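Adding the buying stage simply grows the key: each scenario is now a (persona, stage, landing page) combination. A minimal sketch with hypothetical persona and page names, using the first four stages from the list above:

```python
# A minimal sketch of the persona x buying-stage x landing-page scenarios.
# Persona names and pages are hypothetical placeholders; the stages come
# from the buying funnel described in the text (post-action omitted).
from itertools import product

personas = ["business_owner", "marketing_director", "vp_marketing"]
stages = ["awareness", "research", "evaluating_alternatives", "action"]
landing_pages = ["/", "/pricing", "/case-studies"]

scenarios = {
    (persona, stage, page): {"adjustments": []}
    for persona, stage, page in product(personas, stages, landing_pages)
}

scenarios[("business_owner", "evaluating_alternatives", "/pricing")]["adjustments"].append(
    "Show side-by-side comparison and a money-back guarantee."
)

print(len(scenarios))  # 3 personas x 4 stages x 3 pages = 36 scenarios
```

The combinatorial growth (27 cells become 36 scenarios even in this toy example) is exactly why the author warns that this step is more complex than the last.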

Test your different scenarios

This goes without saying; you should NEVER introduce changes to your website without actually testing them. You can find plenty of blogs and books out there on how to conduct testing correctly if you are interested in learning more about AB testing and multivariate testing.

For a start, keep the five No’s of AB testing in mind:

1. No to “Large and complex tests”

Your goal is NOT to conduct large AB or multivariate tests. Your goal is to discover what elements on the page cause visitors to act in a specific way. Break complex tests into smaller ones. The more you can isolate the changes to one or two elements, the easier it will be to understand the impact of different design and copy elements on visitors’ actions.

2. No to “Tests without a hypothesis”

I can never say it enough. A test without a good hypothesis is a gambling exercise. A hypothesis is a predictive statement about a problem or set of problems on your page and the impact of solving these problems on visitor behavior.

3. No to “Polluted data”

Do not run tests for less than seven days or longer than four weeks. In both scenarios, you are leaving yourself open to the chance of inconsistent and polluted data. When you run a test for less than seven days, website data inconsistencies you are not aware of may affect your results. So, give the test results a chance to stabilize. If you run a test for more than four weeks, you are allowing external factors to have a larger impact on your results.

4. No to “Quick fixes”

Human psychology is complex. Conversion optimization is about understanding visitor behavior and adjusting website design, copy and process to persuade these visitors to convert. Conversion optimization is not a light switch you turn on and off. It is a long-term commitment. Some tests will produce results and some will not. Increases in conversion rates are great but what you are looking for is a window to visitor behavior.

5. No to “Tests without marketing insights”

Call it whatever you like: forensic analysis, posttest analysis, test results assessment. You should learn actionable marketing insights from the test to deploy across channels and verticals. The real power of any testing program lies beyond the results.

If you follow the steps outlined in this blog, you will have a lot to do.

So, happy testing!

About the author: This guide was written by Khalid Saleh. He is the CEO of Invesp, a conversion optimization software and services firm with clients in 11 different countries.


Source:: seobook.com

Publisher Blocking: How the Web Was Lost

Streaming Apps

Google recently announced app streaming, where they can showcase & deep link into apps in the search results even if users do not have those apps installed. How it works: rather than the user installing the app, Google has the app installed on a computer in their cloud & shows the user a video stream of the app. Click targets, ads, etc. remain the same.

In writing about the new feature, Danny Sullivan wrote a section on “How The Web Could Have Been Lost”:

Imagine if, in order to use the web, you had to download an app for each website you wanted to visit. To find news from the New York Times, you had to install an app that let you access the site through your web browser. To purchase from Amazon, you first needed to install an Amazon app for your browser. To share on Facebook, installation of the Facebook app for your browser would be required. That would be a nightmare.

The web put an end to this. More specifically, the web browser did. The web browser became a universal app that let anyone open anything on the web.

To meaningfully participate on those sorts of sites you still need an account. You are not going to be able to buy on Amazon without registration. Any popular social network which allows third party IDs to take the place of first party IDs will quickly become a den of spam until they close that loophole.

In short, you still have to register with sites to get real value out of them if you are doing much beyond reading an article. Without registration it is hard for them to personalize your experience & recommend relevant content.

Desktop Friendly Design

App indexing & deep linking of apps is a step in the opposite direction of the open web. It is supporting proprietary non-web channels which don’t link out. Further, if you thought keyword (not provided) heavily obfuscated user data, how much will data be obfuscated if the user isn’t even using your site or app, but rather is interacting via a Google cloud computer?

  • Who visited your app? Not sure. It was a Google cloud computer.
  • Where were they located? Not sure. It was a Google cloud computer.
  • Did they have problems using your app? Not sure. It was a Google cloud computer.
  • What did they look at? Can you retarget them? Not sure. It was a Google cloud computer.

Is an app maker too lazy to create a web equivalent version of their content? If so, let them be at a strategic disadvantage to everyone who put in the extra effort to publish their content online.

If Google has their remote quality raters consider a site as not meeting users’ needs because they don’t publish a “mobile friendly” version of their site, how can one consider a publisher who creates “app only” content as an entity which is trying hard to meet end user needs?

We know Google hates app install interstitials (unless they are sold by Google), thus the only reason Google would have for wanting to promote these sorts of services would be to justify owning, controlling & monetizing the user experience.

App-solutely Not The Answer


Apps are sold as a way to lower channel risk & gain direct access to users, but the companies owning the app stores are firmly in control.

Everyone wants to „own“ the user, but none of the platforms bother to ask if the user wants to be owned:

We’re rapidly moving from an internet where computers are ‘peers‘ (equals) to one where there are consumers and ‘data owners‘, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.

If the current trend persists we’re heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users.

You’ve Got AOL

The AOL analogy is widely used:

Katz of Gogobot says that “SEO is a dying field” as Google uses its “monopoly” power to turn the field of search into Google’s own walled garden like AOL did in the age of dial-up modems.

Almost 4 years ago a Google engineer described SEO as a bug. He suggested one shouldn’t be able to rank highly without paying.

It looks like he was right. Google’s aggressive ad placement on mobile SERPs “has broken the will of users who would have clicked on an organic link if they could find one at the top of the page but are instead just clicking ads because they don’t want to scroll down.”

In the years since then we’ve learned Google’s “algorithm” has concurrent ranking signals & other forms of home cooking which guarantee success for Google’s vertical search offerings. The “reasonable” barrier to entry which applies to third parties does not apply to any new Google offerings.

And “bugs” keep appearing in those “algorithms,” which deliver a steady stream of harm to competing businesses.

From Indy to Brand

The waves of algorithm updates have in effect increased the barrier to entry, along with the cost needed to maintain rankings. The stresses and financial impacts that puts on small businesses makes many of them not worth running. Look no further than MetaFilter’s founder seeing a psychologist, then quitting because he couldn’t handle the process.

When Google engineers are not focused on “breaking spirits” they emphasize the importance of happiness.

The ecosystem instability has made smaller sites effectively disappear while delivering a bland and soulless result set which is heavy on brand:

there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.

If you participate on the web daily, the change washes over you slowly, and the cumulative effects can be imperceptible. But if you were locked in an Iranian jail for years the change is hard to miss.

These sorts of problems not only impact search, but have an impact on all the major tech channels.

iPhone autocorrect inserted „showgirl“ for „shows“ and „POV“ for „PPC“. This crowd sourcing of autocorrect is not welcomed.— john andrews (@searchsleuth998) November 10, 2015

If you live in Goole, these issues strike close to home.

And there are almost no counter-forces to the well established trend:

Eventually they might even symbolically close their websites, finishing the job they started when they all stopped paying attention to what their front pages looked like. Then, they will do a whole lot of what they already do, according to the demands of their new venues. They will report news and tell stories and post garbage and make mistakes. They will be given new metrics that are both more shallow and more urgent than ever before; they will adapt to them, all the while avoiding, as is tradition, honest discussions about the relationship between success and quality and self-respect.

If in five years I’m just watching NFL-endorsed ESPN clips through a syndication deal with a messaging app, and Vice is just an age-skewed Viacom with better audience data, and I’m looking up the same trivia on Genius instead of Wikipedia, and “publications” are just content agencies that solve temporary optimization issues for much larger platforms, what will have been the point of the last twenty years of creating things for the web?

A Deal With the Devil

As ad blocking has grown more pervasive, some publishers believe the solution to the problem is through gaining distribution through the channels which are exempt from the impacts of ad blocking. However those channels have no incentive to offer exceptional payouts. They make more by showing fewer ads within featured content from partners (where they must share ad revenues) and showing more ads elsewhere (where they keep all the ad revenues).

So far publishers have been underwhelmed with both Facebook Instant Articles and Apple News. The former for stringent ad restrictions, and the latter for providing limited user data. Google Now is also increasing the number of news stories they show. And next year Google will roll out their accelerated mobile pages offering.

The problem is if you don’t control the publishing you don’t control the monetization and you don’t control the data flow.

Your website helps make the knowledge graph (and other forms of vertical search) possible. But you are paid nothing when your content appears in the knowledge graph. And the knowledge graph now has a number of ad units embedded in it.

A decade ago, when Google pushed AutoLink to automatically insert links in publishers’ content, webmasters had enough leverage to “just say no.” But now? Not so much. Google considers in-text ad networks spam & embeds their own search in third party apps. As the terms of deals change, and what is considered “best for users” changes, content creators quietly accept, or quit.

Many video sites lost their rich snippets, while YouTube got larger snippets in the search results. Google pays YouTube content creators a far lower revenue share than even the default AdSense agreement offers. And those creators face restrictions which prevent them from using some forms of monetization while forcing them to accept other types of bundling.

The most recent leaked Google rater documents suggested the justification for featured answers was to make mobile search quick, but if that were the extent of it then it still doesn’t explain why they also appear on desktop search results. It also doesn’t explain why the publisher credit links were originally a light gray.

With Google everything comes down to speed, speed, speed. But then they offer interstitial ad units, lock content behind surveys, and transform the user intent behind queries in a way that leads them astray.

As Google obfuscates more data & increasingly redirects and monetizes user intent, they promise to offer advertisers better integration of online to offline conversion data.

At the same time, as Google „speeds up“ your site for you, they may break it with GoogleWebLight.

If you don’t host & control the user experience you are at the whim of (at best, morally agnostic) self-serving platforms which couldn’t care less if any individual publication dies.

It’s White Hat or Bust…


What was that old white hat SEO adage? I forget the precise wording, but I think it went something like…

Don’t buy links, it is too risky & too uncertain. Guarantee strong returns like Google does, by investing directly into undermining the political process by hiring lobbyists, heavy political donations, skirting political donation rules, regularly setting policy, inserting your agents in government, and sponsoring bogus “academic research” without disclosing the payments.

Focus on the user. Put them first. Right behind money.

Source:: seobook.com