Tuesday, 28 June 2016

Finding things to buy on Pinterest is about to get a lot easier


Pinterest announced a collection of new tools on Tuesday that aim to make shopping on the site easier, including the addition of visual search tech that lets you shop online for products you discovered offline, with the help of your camera.


The company announced the new suite of tools, dubbed “Shopping with Pinterest,” at an event held at its company headquarters in San Francisco.



The biggest piece of news from the event is one that isn't quite live yet. In the months ahead, users will have the ability to take a photo of an object in the real world - or, say, a scene (like the decor of a cute cabin you stayed at over the weekend) - and Pinterest will push out related recommendations based on the style. Mashable got a sneak preview of the feature, and the offline-to-online detection was smooth, taking less than four seconds.




10 Illustrations of How Fresh Content May Influence Google Rankings (Updated)

Posted by Cyrus-Shepard

[Estimated read time: 11 minutes]

How fresh is this article?

Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.

In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google's algorithm for years to come.

In his series on the “10 most important search patents of all time,” Bill Slawski shows in an excellent writeup how this patent spawned an entire family of Google child patents - the latest from October 2011.

This post doesn't attempt to describe all the ways that Google may determine freshness to rank web pages; instead, it focuses on the areas we're most likely to influence through SEO.

Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques - often in great detail - we have no guarantee how Google uses them in its algorithm. While we can't be 100% certain, evidence suggests that they use at least some, and possibly many, of these techniques to rank search results.

For another take on these factors, I highly recommend reading Justin Briggs' excellent article Methods for Evaluating Freshness.

When “Queries Deserve Freshness”

Former Google Fellow Amit Singhal once explained how “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query.

Singhal describes the types of keyword searches most likely to require fresh content:


  • Recent events or hot topics: “occupy oakland protest” “nba lockout”

  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”

  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google may determine exactly which queries require fresh content by monitoring the web and their own huge warehouse of data, including:


  1. Search volume: Are queries for a particular term spiking (e.g., “Earthquake Los Angeles”)?

  2. News and blog coverage: If a number of news organizations start writing about the same subject, it's likely a hot topic.

  3. Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”

While some queries need fresh content, other search queries may be better served by older content.

Fresh is often better, but not always. (More on this later.)

Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by inception date

Initially, a web page can be given a “freshness” score based on its inception date, which decays over time. This freshness score may boost a piece of content for certain search queries, but degrades as the content becomes older.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.


"For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set."
– All patent quotes are from the US patent "Document Scoring Based on Document Content Update"
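To make the decay idea concrete, here's a minimal Python sketch. The exponential model and the 180-day half-life are my own illustrative assumptions - the patent only says the inception-date score "decays over time," without naming a function or any constants:

```python
from datetime import datetime, timezone

def freshness_score(inception_date: datetime, half_life_days: float = 180.0) -> float:
    """Score in (0, 1] that decays as a document ages.

    Exponential decay with a half-life is an illustrative assumption;
    the patent does not specify the decay curve.
    """
    age_days = (datetime.now(timezone.utc) - inception_date).days
    return 0.5 ** (age_days / half_life_days)

# A page first indexed ~6 months ago scores ~0.5; a year-old page ~0.25.
print(freshness_score(datetime(2016, 1, 1, tzinfo=timezone.utc)))
```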

2. Amount of change influences freshness: How much

The age of a webpage or domain isn't the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn't change. In this case, the amount of change on your webpage plays a role.

For example, changing a single sentence won't have as big of a freshness impact as a large change to the main body text.


"Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

In fact, Google may choose to ignore small changes completely. That's one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:

"In order to not update every link's freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link's freshness may be updated (or not updated) accordingly."

3. Changes to core content matter more: How important

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.

Less important content includes:


  • JavaScript

  • Comments

  • Advertisements

  • Navigation

  • Boilerplate material

  • Date/time tags

Conversely, “important” content often means the main body text.

So simply swapping out the links in your sidebar or updating your footer copy likely won't be considered a signal of freshness.


"…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA."

This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly - sometimes in an attempt to fake freshness - but there's conflicting evidence on how well this works. Suffice it to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.
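One way to picture this weighting is to score each changed region by how "important" it is. In the sketch below, only the element types come from the patent; the numeric weights are purely hypothetical:

```python
# Hypothetical weights; only the element types come from the patent.
REGION_WEIGHTS = {
    "body": 1.0,           # main content: full freshness weight
    "navigation": 0.1,
    "comments": 0.1,
    "advertisement": 0.0,
    "javascript": 0.0,
    "date_time_tag": 0.0,  # timestamp bumps alone shouldn't count
}

def weighted_change(changes: dict) -> float:
    """`changes` maps a region name to the fraction of it that was edited."""
    return sum(REGION_WEIGHTS.get(region, 0.5) * fraction
               for region, fraction in changes.items())

# Editing half the body text outweighs rewriting the entire navigation.
print(weighted_change({"body": 0.5}))        # 0.5
print(weighted_change({"navigation": 1.0}))  # 0.1
```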

4. The rate of document change: How often

Content that changes more often is scored differently than content that only changes every few years.

For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.


"For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

Google may treat links from these pages differently as well (more on this below). For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains in place more permanently.
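For intuition, the rate of change could be reduced to something as simple as average updates per time window. A sketch, where the 90-day window is an arbitrary choice of mine:

```python
from datetime import datetime

def updates_per_window(edit_timestamps: list, window_days: float = 90.0) -> float:
    """Average number of content updates per window across the edit history.

    A news homepage edited daily scores far higher than a page touched
    once a year; how (or whether) Google buckets these rates isn't public.
    """
    if len(edit_timestamps) < 2:
        return 0.0
    span_days = (max(edit_timestamps) - min(edit_timestamps)).days or 1
    return len(edit_timestamps) * window_days / span_days
```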

5. New page creation

Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.


"UA may also be determined as a function of one or more factors, such as the number of 'new' or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document."

Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don't believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.
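The patent's two factors here - the count of new pages and their ratio to the total - reduce to a couple of lines. A sketch of the second factor (the function name is mine, not Google's):

```python
def new_page_signal(new_pages: int, total_pages: int) -> float:
    """Ratio of new/unique pages to all pages over some period,
    mirroring the second factor in the patent quote above."""
    return new_pages / max(total_pages, 1)

# A blog adding 30 posts to a 120-page site over the period: 0.25
print(new_page_signal(30, 120))
```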

6. Rate of new link growth signals freshness

Not all freshness signals are restricted to the page itself. Many external signals can indicate freshness as well, oftentimes with powerful results.

If a webpage sees an increase in its link growth rate, this could signal relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).


"…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document's score."

Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.
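The trend the patent describes boils down to comparing link acquisition across two equal-length time windows. A minimal sketch, assuming such counts are available:

```python
def link_growth_trend(new_links_recent: int, new_links_older: int) -> float:
    """Ratio > 1.0 suggests accelerating (fresh) link growth; < 1.0 is the
    downward trend that, per the patent, could mark a document as stale."""
    if new_links_older == 0:
        return float("inf") if new_links_recent else 1.0
    return new_links_recent / new_links_older
```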

7. Links from fresh sites pass fresh value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn't been updated in years may not pass the same level of freshness value as a link from a fresh page, e.g. the homepage of Wired. Justin Briggs coined the term FreshRank for this.


"Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh."

8. Traffic and engagement metrics may signal freshness

When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.

For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.


"If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively."

You might interpret this to mean that click-through rate is a ranking factor, but that's not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page - and others like it - happen to match user intent.

For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge's excellent article about CTR as a ranking factor.
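Reduced to code, the patent's dwell-time comparison might look like the sketch below; the 10% tolerance band is an invented placeholder, not a known value:

```python
def dwell_time_signal(avg_time_recent: float, avg_time_baseline: float,
                      tolerance: float = 0.10) -> str:
    """Compare recent average time-on-page for a query against a
    historical baseline; a clear rise hints fresh, a clear drop stale."""
    if avg_time_baseline <= 0:
        return "neutral"
    delta = (avg_time_recent - avg_time_baseline) / avg_time_baseline
    if delta > tolerance:
        return "fresh"
    if delta < -tolerance:
        return "stale"
    return "neutral"
```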

9. Changes in anchor text may devalue links

If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about racing cars, then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.


"The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good."

The lesson here is that if you update a page, don't deviate too much from the original context or you may risk losing equity from your pre-existing links.

10. Older is often better

Google understands the newest result isn't always the best. Consider a search query for “Magna Carta.” An older, authoritative result may be best here.

In this case, having a well-aged document may actually help you.

Google's patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.


"For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set."

A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
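One plausible reading of this adjustment, sketched below: nudge a document's score by its distance from the result set's average age, so a brand-new page is slightly penalized for a query whose results average years old. The weight is arbitrary, purely for illustration:

```python
from statistics import mean

def age_adjusted_score(base_score: float, doc_age_days: float,
                       result_ages_days: list, weight: float = 0.0001) -> float:
    """Reward documents older than the result set's average age for
    queries that favor well-aged content (and vice versa)."""
    avg_age = mean(result_ages_days)
    return base_score + weight * (doc_age_days - avg_age)

# A 3,000-day-old page gets a small boost over a set averaging 1,800 days.
print(age_adjusted_score(0.70, 3000, [1500, 1800, 2100]))  # 0.82
```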

Freshness best practices

The goal here shouldn't be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you'll likely be frustrated with a lack of results.

Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.

Aside from updating older content, other best practices include:


  1. Create new content regularly.

  2. When updating, focus on core content, and not unimportant boilerplate material.

  3. Keep in mind that small changes may be ignored. If you're going to update a link, you may consider updating all the text around the link.

  4. Steady link growth is almost always better than spiky, inconsistent link growth.

  5. All other things being equal, links from fresher pages likely pass more value than links from stale pages.

  6. Engagement metrics are your friend. Work to increase clicks and user satisfaction.

  7. If you change the topic of a page too much, older links to the page may lose value.

Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz.


Be fresh.

Be relevant.

Most important, be useful.



3 Tools to Create Social Media Visuals


Do you create custom images for social media? Looking for tools to streamline the design process? There are some new desktop design tools that make it easy to quickly create multiple graphics for social media. In this article, you'll discover three user-friendly desktop tools to create visuals for social media. Why Create Images via Desktop? If [...]


This post 3 Tools to Create Social Media Visuals first appeared on Social Media Examiner - Your Guide to the Social Media Jungle.

Monday, 27 June 2016

Predicting Intent: What Unnatural Outbound Link Penalties Could Mean for the Future of SEO

Posted by Angular

[Estimated read time: 8 minutes]

As SEOs, we often find ourselves facing new changes implemented by search engines that impact how our clients' websites perform in the SERPs. With each change, it's important that we look beyond its immediate impact and think about its future implications so that we can try to answer this question: "If I were Google, why would I do that?"

Recently, Google implemented a series of manual penalties that affected sites deemed to have unnatural outbound links. Webmasters of affected sites received messages like this in Google Search Console:

Google Outbound Links Penalty

Webmasters were notified in an email that Google had detected a pattern of "unnatural artificial, deceptive, or manipulative outbound links." The manual action itself described the link as being either "unnatural or irrelevant."

The responses from webmasters varied in their usual extreme fashion, with recommendations ranging from "do nothing" to "nofollow every outbound link on your site."

Google's John Mueller posted in product forums that you don't need to nofollow every link on your site, but you should focus on nofollowing links that point to a product, sales, or social media page as the result of an exchange.

Now, on to the fun part of being an SEO: looking at a problem and trying to reverse-engineer Google's intentions to decipher the implications this could have on our industry, clients, and strategy.

The intent of this post is not to dispute the opinion that this was specifically focused on bloggers who placed dofollow links in product/business reviews, but to present a few ideas to incite discussion as to the potential big-picture strategy that could be at play here.

A few concepts that influenced my thought process are as follows:


  • Penguin has repeatedly missed its "launch date," which indicates that Google engineers don't feel it's accurate enough to release into the wild.


  • The growth of negative SEO makes it even more difficult for Google to identify/penalize sites for tactics that are not implemented on their own websites.

  • Penguin temporarily impacted link-building markets in a way Google would want. The decline reached its plateau in July 2015, as shown in this graph from Google Trends:
    Trend of Link Building

If I were Google, I would expect webmasters impacted by the unnatural outbound links penalty to respond in one of these ways:


  1. Do nothing. The penalty is specifically stated to "discount the trust in links on your site." As a webmaster, do you really care if Google trusts the outbound links on your site or not? What about if the penalty does not impact your traffic, rankings, visibility, etc.? What incentive do you have to take any action? Even if you sell links, if the information is not publicly displayed, this does nothing to harm your link-selling business.

  2. Innocent site cleanup effort. A legitimate site that has not exchanged goods for links (or wants to pretend they haven't) would simply go through their site and remove any links that they feel may have triggered the issue and then maybe file a bland reconsideration request stating as much.

  3. Guilty site cleanup effort. A site that has participated in link schemes would know exactly which links are the offenders and remove them. Now, depending on the business owner, some might then file a reconsideration request saying, "I'm sorry, so-and-so paid me to do it, and I'll never do it again." Others may simply state, "Yes, we have identified the problem and corrected it."

In scenario No. 1, Google wins because this helps further the fear, uncertainty, and doubt (FUD) campaigns around link development. It is suddenly impossible to know if a site's outbound links have value because they may possibly have a penalty preventing them from passing value. So link building not only continues to carry the risk of creating a penalty on your site, but it suddenly becomes more obvious that you could exchange goods/money/services for a link that has no value despite its MozRank or any other external "ranking" metric.

In scenarios No. 2 and No. 3, Google wins because they can monitor the links that have been nofollowed/removed and add potential link scheme violators to training data.

In scenario No. 3, they may be able to procure evidence of sites participating in link schemes through admissions by webmasters who sold the links.

If I were Google, I would really love to have a control group of known sites participating in link schemes to further develop my machine-learned algorithm for detecting link profile manipulation. I would take the unnatural outbound link data from scenario No. 3 above and run those sites as a dataset against Penguin, aiming for 100% confidence, since all of those sites definitely participated in link schemes. Then I would tweak Penguin with this training dataset and issue manual actions against the linked sites.

This wouldn't be the first time SEOs have predicted a Google subtext of leveraging webmasters and their data to help them further develop their algorithms for link penalties. In 2012, the SEO industry was skeptical regarding the use of the disavow tool and whether or not Google was crowdsourcing webmasters for their spam team.


"Clearly there are link schemes that cannot be caught through the standard algorithm. That's one of the reasons why there are manual actions. It's within the realm of possibilities that disavow data can be used to confirm how well they're catching spam, as well as identifying spam they couldn't catch automatically. For example, when web publishers disavow sites that were not caught by the algorithm, this can suggest a new area for quality control to look into." - Roger Montti, Martinibuster.com



What objectives could the unnatural outbound links penalties accomplish?


  1. Legit webmasters could become more afraid to sell/place links because they get "penalized."

  2. Spammy webmasters could continue selling links from their penalized sites, which would add to the confusion and devaluation of link markets.

  3. Webmasters could become afraid to buy/exchange links because they could get scammed by penalized sites and be more likely to be outed by the legitimate sites.

  4. The Penguin algorithm could have increased confidence scoring and become ready for real-time implementation.



"There was a time when Google would devalue the PR of a site that was caught selling links. With that signal gone, and Google going after outbound links, it is now more difficult than ever to know whether a link acquired is really of value." -- Russ Jones, Principal Search Scientist at MOZ



Again, if I were Google, the next generation of Penguin would likely heavily weight irrelevantly placed links, and not just commercial keyword-specific anchor text. Testing this first on the sites I think are guilty of providing the links and simply devaluing those links seems much smarter. Of course, at this point, there is no specific evidence to indicate Google's intention behind the unnatural outbound links penalties were intended as a final testing phase for Penguin and to further devalue the manipulated link market. But if I were Google, that's exactly what I would be doing.



"Gone are the days of easily repeatable link building strategies. Acquiring links shouldn't be easy, and Penguin will continue to change the search marketing landscape whether we like it or not. I, for one, welcome our artificially intelligent overlords. Future iterations of the Penguin algorithm will further solidify the “difficulty level” of link acquisition, making spam less popular and forcing businesses toward legitimate marketing strategies." - Tripp Hamilton, Product Manager at Removeem.com

Google's webmaster guidelines show link schemes are interpreted by intent. I wonder what happens if I start nofollowing links from my site with the intent of devaluing another site's rankings? The intent is manipulation. Am I at risk of being considered a participant in link schemes? If I do link building as part of an SEO campaign, am I inherently conducting a link scheme?

Google Webmaster Guidelines for Link Scheme

So, since I'm an SEO, not Google, I have to ask myself and my colleagues, "What does this do to change or reinforce my SEO efforts?" I immediately think back to a Whiteboard Friday from a few years ago that discusses the Rules of Link Building.


"At its best, good link building is indistinguishable from good marketing." - Cyrus Shepard, former Content Astronaut at Moz





When asked what type of impact SEOs should expect from this, Garret French from Citation Labs shared:



"Clearly this new effort by Google will start to dry up the dofollow sponsored post, sponsored review marketplace. Watch for prices to drop over the next few months and then go back and test reviews with nofollowed links to see which ones actually drive converting traffic! If you can't stomach paying for nofollowed links then it's time to get creative and return to old-fashioned, story-driven blog PR. It doesn't scale well, but it works well for natural links."

In conclusion, as SEOs, we are responsible for predicting the future of our industry. We do not simply act in the present. Google does not wish for its results to be gamed and has departments full of data scientists dedicated to building algorithms that identify and devalue manipulative practices. If you are incapable of legitimately building links, then you must mimic legitimate links in all aspects (or consider a new career).

Takeaways

Most importantly, any links that we try to build must provide value. If a URL links to a landing page that is not contextually relevant to its source page, then this irrelevant link is likely to be flagged and devalued. Remember, Google can do topical analysis, too.

In link cleanup mode or Penguin recovery, we've typically approached unnatural links as obvious when they use a commercial keyword (e.g. "insurance quotes") as anchor text, because natural links more often use the URL, brand, or navigational labels as anchor text. It would also be safe to assume that natural links tend to occur in content about the destination the link offers, and that link relevance should be considered.

Finally, we should continue to identify and present clients with methods for naturally building authority by providing value in what they offer and working to build real relationships and brand advocates.

What are your thoughts? Do you agree? Disagree?



How to Target Local Customers With Facebook Ads


Do you want to connect with local customers on Facebook? Have you considered targeting them with Facebook ads? Facebook ads offer a quick, easy, cost-effective way to reach consumers in your local area. In this article, you'll discover how to get your business in front of local customers using Facebook ads. #1: Choose Your Ad [...]


This post How to Target Local Customers With Facebook Ads first appeared on Social Media Examiner - Your Guide to the Social Media Jungle.

Student's photo campaign hits back at body image pressures from social media


LONDON - Social media plays a massive role in our daily lives. But it can also add to the existing pressures we face, particularly when it comes to our bodies.



With Instagram filters, Photoshopped images and curated feeds, social media can feel like an onslaught of unattainable perfection.


Research suggests that too much time spent on Facebook can cause women to dislike their appearance, and can contribute to feelings of loneliness and depression. 


But one British student has created a campaign to fight against the pressures women face from social media, including pressures to lose weight or have cosmetic surgery.




Sunday, 26 June 2016

Qantas joins Snapchat to give you a glimpse behind the baggage claim


The brands are invading Snapchat, and Qantas is the latest to sign up to the social media platform to connect with new (read: younger) demographics.


On Monday, the airline announced it would use Snapchat to take users behind the scenes, following in the footsteps of other Australian brands like the Commonwealth Bank, Oak and Westpac.



"What makes it really unique is that we're handing over the keys to the SnapChat account to our employees," Olivia Wirth, group executive of brand, marketing and corporate affairs at Qantas, said in an emailed statement. The company will share content on the platform around every fortnight. Read more...

