Wednesday, May 31, 2017

10 Google Ranking Factors Every Website Must Focus On In 2017


Q: What do Entrepreneur.com, Forbes.com, Backlinko.com and Jeffbullas.com have in common?

A: All four of these sites rank high on Google SERPs and enjoy enormous authority.

Now, here’s the big question.

How do these sites consistently rank high on Google SERPs?

Well, these sites do two things very well.

Firstly, they focus on the Google ranking factors that have the biggest impact.

Secondly, they put all of their resources into scaling these ranking factors.

By now you’re probably thinking: Google uses more than 200 ranking signals to rank websites. So how do I find out which factors I should focus on to get a higher Google ranking?

Well, you’re in luck, because I’m going to walk you through the 10 most significant Google ranking factors that websites should focus on in 2017.

1. Page authority

Page authority is one of the most critical Google ranking factors for any website owner hoping to rank highly on Google SERPs in 2017.

Page authority refers to the authority/credibility of your page in the eyes of the Google spiders/crawlers. The ‘Big G’ determines the credibility of your webpage by looking at your website’s link profile.

Basically, the higher the quantity and quality of inbound links pointing to your webpage, the more authority it will enjoy on Google SERPs.

Image Source: Backlinko

In fact, if your page isn’t authoritative, Google is unlikely to rank it well at all.

I’m sure you’re now wondering how to increase your website’s authority.

The answer is simpler than you might think: practice white hat link-building strategies. Reach out to similar bloggers, website owners or businesses who enjoy established authority on the web, and pitch for guest posts, infographics, etc.

For each and every page of your website, try to earn as many high-quality inbound links as possible. But don’t take the easy route of buying links! Trust me, this is not going to do you any good.

Instead, try a deep linking strategy to build better page authority.

Image Source: theoffbeatstory

2. Link relevancy

As discussed above, Google determines how authoritative your website is based on the quality and quantity of links connecting back to your website/web page.

But there’s a catch: it’s not just about getting many high-quality inbound links, it’s also about getting relevant inbound links.

Yeah, you heard me right!

Google pays more and more attention to the relevancy of the inbound links that point back to your website.

In fact, as Andre Weyher, who formerly worked on Google’s web spam team, revealed in an interview, link relevance is the new PageRank as far as Google rankings are concerned.

Aim to get as many high-quality inbound links as you can, but make sure all of them are relevant. Focus on building links only from authoritative sites and blogs in the same industry and niche as yours.

For instance, if you’re a digital marketing agency that publishes content on technical SEO, social media marketing and content marketing, you should aim to get inbound links from high-authority sites that publish content on the same topics.

3. Content length

It may surprise you to learn that content is one of the most important ranking factors. In fact, content is the second most important signal that Google uses to rank websites on its SERPs.

But it should be no surprise.

For over a decade, content has been the essential fuel that drives modern-day marketing machines. More and more businesses now rely on content marketing to increase traffic, generate leads and optimize lead conversion rates.

The only problem is that, now that 93% of B2B marketers use content marketing, there’s a lot of white noise on Google. So how do you make sure your content is able to cut through all of that noise, drive traffic to your website and rank higher on Google SERPs?

Here’s how: don’t keep on creating and publishing short-form (aka useless) content, day in and day out, as it is not going to help you rank higher.

Instead, create long-form content that is ideally more than 2000 words in length.

Longer doesn’t necessarily have to mean boring. Carefully craft your content to cut through all the fluff and only include information that your target audience is looking for.

As this SERPIQ study of top-10 Google results shows, long-form content consistently ranks better on Google SERPs.

Image Source: theoffbeatstory.com

4. Content ‘thinness’ 

Now that I’ve spilled the beans on the impact of content length on Google rankings, it’s time to point out that sites with ‘thin’ content, which offers little or no value to readers, are dropped like a stone by Google.

Let’s talk about what ‘thin’ content is.

‘Thin’ content is any content that adds little or no value for a related search query. Remember Google’s Panda update? It sought to identify and penalize content farms (sites with low-value content stuffed with keywords to rank higher) and scraper sites (sites that featured duplicate content). These days, Google’s updates continue to focus on sites with ‘thin’ or near-duplicate content.

Though Google doesn’t formally penalize sites for having ‘thin’ content, such content can confuse Google’s spiders when they try to pick out the unique content on a page, and that can translate into low SERP rankings.

So what should you do?

Google’s crawlers prefer to assign better rankings to sites that provide original, robust content. So remove or ‘no-index’ pages with thin or duplicate content, and include archive and category pages in that review.

After all, these pages often add no value to Google searchers anyway. So, if you are holding on to these pages, you’re actually pushing down your Google ranking.
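If you’re wondering what ‘no-indexing’ a page actually looks like in practice, here’s a minimal sketch using the standard robots meta tag (the comment describes a typical use; your CMS or SEO plugin may write this tag for you):

```html
<!-- Place this inside the <head> of a thin archive or category page
     you want kept out of Google's index but still crawlable. -->
<meta name="robots" content="noindex, follow">
```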

5. Average time spent on page

Here’s a shocker for you: most blogs have a bounce rate somewhere between 70 and 98%.

This means that when someone lands on a blog page, they typically leave without visiting anything else, often within a few seconds.

So how does this impact your Google ranking?

Google pays close attention to how much time people spend on your web page when it comes to ranking your site on its SERPs. If people tend to spend more time there, Google assumes your page is informative and ranks you well.

Image Source: Backlinko.com 

If you’re thinking your bounce rate doesn’t really impact your ranking, please be advised that it is a very important ranking signal for Google.

Now it’s time for the big question: how do I reduce my bounce rate?

Here’s your answer: be clever when creating content. Add a lot of ‘Bucket Brigades’ and compelling headings into your posts. Structure your blog posts well, too, by including data points and quote boxes.

Here’s what Bucket Brigades look like:

Image Source: theoffbeatstory.com

Make sure your web pages are visually compelling and clutter-free. Ensure that your links and calls to action are properly positioned, and leave plenty of white space around your CTA buttons and links.

Remember, the better, more informative and more interesting your website looks, the higher the average amount of time spent on your page and the better your Google SERP rankings will be.

6. Domain authority

You might not know that the domain authority of your website or web page plays a significant role as far as Google SERP ranking is concerned, but it does.

Domain authority is a metric (developed by Moz) that combines around 40 different signals to score websites on a scale of 1 to 100, where 1 is the weakest and 100 is the strongest.

Obviously, the better your domain authority, the higher your ranking.

Image Source: Backlinko.com

While many experts believe you can’t do much to increase your domain authority quickly because it builds over time, it doesn’t depend solely on domain age.

Here are a few things you can do to increase your domain authority: first, extend the registration of your domain name to signal to Google that your site is here to stay. Then work on attracting as many high-quality, relevant inbound links as possible, and sprinkle your target keyword naturally throughout your long-form content.

7. Keywords in title tags

Keywords are so important that you can’t realistically expect your blog or website to rank without them.

But did you know that Google carefully considers keywords used in your title tags when ranking websites on its SERPs? It actually gives more weight to pages that use keywords in the beginning of their title tags.

Image Source: Backlinko.com

This is because Google spiders are able to instantly figure out what the page is all about if target keywords are placed in the beginning of title tags.

For example, let’s say your target keyword is ‘Google Adwords Tips’ and you have two title tags.

Title Tag #1: Google Adwords Tips: 5 Tips to Get More With Google AdWords

Title Tag #2: How to Get More with These 5 Google AdWords Tips

Now, can you guess which title tag will rank better on Google SERPs?

No prizes for guessing which! It’s Title Tag #1. That is because this title tag uses the keyword right at the start of it, making it easy for Google spiders to understand what the page is all about.

8. Keyword positioning and relevance

Now that we’re talking about keywords, let me also reveal that Google pays close attention to how you position your target keyword on your webpage.

This means that Google crawlers carefully check for keywords in your page URL, in your posts, and in your H1, H2 and H3 tags. Placing keywords in these areas makes it clear to Google spiders that your page is focused on the target keyword.

But here’s a shocker: if your page content doesn’t really match your keywords, your site may be penalized by Google. Yeah! That’s true!

The days when Google only paid attention to keyword density are long gone. Today, it looks at keyword relevance too.

So what should you do?

Smartly structure your page using relevant keywords. Place keywords in the first 100 words of your posts, in your page URL, and in your H1/H2/H3 tags. Don’t forget to include keywords in the beginning of your meta descriptions. That’s important, too.
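To make this concrete, here’s a rough sketch of how those placements might look in the page markup, using the ‘Google AdWords Tips’ example from the previous section (the URL, description and copy are invented for illustration):

```html
<!-- Hypothetical URL: https://example.com/google-adwords-tips/ -->
<head>
  <!-- Target keyword at the start of the title tag -->
  <title>Google AdWords Tips: 5 Tips to Get More With Google AdWords</title>
  <!-- Keyword near the beginning of the meta description -->
  <meta name="description"
        content="Google AdWords tips to help you get more clicks and conversions from every campaign.">
</head>
<body>
  <!-- Keyword in the H1 and within the first 100 words of copy -->
  <h1>Google AdWords Tips: 5 Tips to Get More With Google AdWords</h1>
  <p>These Google AdWords tips will help you improve your campaigns...</p>
  <h2>1. Start with tightly themed ad groups</h2>
</body>
```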

9. Page load time

Slower page load time dramatically affects your pages’ performance. In fact, a one-second delay in page loading time means 11% fewer page views.

It gets worse: 25% of visitors will abandon your site if your page doesn’t load within 4 seconds.

Remember, Google is the world’s largest search engine and it takes user experience seriously. So naturally, it penalizes sites with slower load times by drastically dropping their rankings.

Image Source: Backlinko.com

Now, it’s not rocket science to understand why page load time remains one of the most significant Google ranking factors that you should be mindful of in 2017 and beyond.

The question is: how can you improve the load time of your slower web pages?

First, analyze your pages’ loading speed using Google’s PageSpeed Insights tool. Then, based on the results, upgrade your hosting if needed. Don’t forget to minify your HTML, CSS and JavaScript files, and keep redirects to a minimum.

Optimizing images may also help you reduce your page’s load time.
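On the front-end side, a couple of small markup changes often help as well; here’s a rough sketch (the file names and sizes are placeholders):

```html
<!-- Defer non-critical JavaScript so it doesn't block the first render -->
<script src="/js/app.min.js" defer></script>

<!-- Serve appropriately sized images to smaller screens via srcset -->
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w, /img/hero-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Post hero image">
```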

10. Responsive design

Mobile devices account for 51.3% of internet usage globally. Considering how many people now use mobile devices for Google searches, it’s easy to understand why Google takes responsive design so seriously when it comes to ranking websites.

A study conducted by Google revealed that over 67% of users prefer buying from responsive sites and 61% of users abandon non-responsive sites.

Image Source: theoffbeatstory.com

Google penalizes sites that aren’t responsive by dropping their rankings.

Image Source: theoffbeatstory.com

Remember, if you’re looking to rank higher on Google SERPs in 2017, you’ll need to adopt responsive design.
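If you’re starting from scratch, the two building blocks of a responsive page are the viewport meta tag and CSS media queries. A minimal sketch (the class names and breakpoint are illustrative):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* Stack the sidebar under the main content on narrow screens */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```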

Bonus tip: Steer clear of website pop-ups!

As you may already know, if not used correctly, pop-ups can annoy your visitors to such an extent that they will abandon your site and never return. Statistics show that 70% of Americans get annoyed by irrelevant pop-up ads.

Since intrusive pop-ups can ruin the user experience, Google recently announced that it will start penalizing mobile sites that feature them. Google may soon extend this update to desktop sites, too.

So it makes good business sense not to use irrelevant pop-ups on your site. Better yet, avoid using pop-ups at all.

Wrapping up

So, there you have it – 10 of the most significant Google ranking factors to consider when looking to boost your website’s ranking!

Go ahead and put this learning into practice, and get set to rank higher on Google SERPs in 2017 and beyond.

I’d love to hear what you think of these top 10 Google ranking factors – let me know by leaving a comment down below.

Guest Author: Saumya Raghav is the founder of The Offbeat Story, a blog that aspires to help small businesses and startups minimize the marketing costs of generating traffic and improving conversions. A blogger and conversion scientist by day, he is an avid Jeff Bullas and Brian Dean reader at night. He loves to write on topics related to digital marketing, technical and international SEO, web analytics, startup and small business strategies, growth hack strategies and conversion optimization strategies. You can find him on Twitter, Facebook and LinkedIn.

The post 10 Google Ranking Factors Every Website Must Focus On In 2017 appeared first on Jeffbullas’s Blog.




Commissioned content: Make money online with ClickSure

If you’re looking to start your own business and make money online, then you’ve more than likely come across the term ‘Affiliate Marketing’ before.


CAKE by Accelerize to Lead Affiliate Marketing Panel at Afiliados Brasil

Held June 1-3 in Sao Paulo, Brazil, Afiliados Brasil is Latin America’s biggest affiliate marketing event and draws the industry’s top affiliates, merchants …


Missy Ward

This issue of FeedFront Magazine includes Screw Resolutions, Plunge into Affiliate Marketing Now by Missy Ward, Tips for Achieving Tech–Life …


Is the rise of e-com helping affiliate marketing players?

The success of such magnitude and increasing focus on profitability led to affiliate marketing programmes becoming popular. It was a win-win situation …


Optimizing AngularJS Single-Page Applications for Googlebot Crawlers

Posted by jrridley

It’s almost certain that you’ve encountered AngularJS on the web somewhere, even if you weren’t aware of it at the time. Here’s a list of just a few sites using Angular:

  • Upwork.com
  • Freelancer.com
  • Udemy.com
  • Youtube.com

Any of those look familiar? If so, it’s because AngularJS is taking over the Internet. There’s a good reason for that: Angular, React, and similar JavaScript frameworks make for a better user and developer experience on a site. For background, AngularJS and ReactJS are part of a web design movement called single-page applications, or SPAs. While a traditional website loads each individual page as the user navigates the site, including calls to the server and cache, loading resources, and rendering the page, SPAs cut out much of the back-end activity by loading the entire site when a user first lands on a page. Instead of loading a new page each time you click on a link, the site dynamically updates a single HTML page as the user interacts with it.


Image c/o Microsoft

Why is this movement taking over the Internet? With SPAs, users are treated to a screaming fast site through which they can navigate almost instantaneously, while developers have a template that allows them to customize, test, and optimize pages seamlessly and efficiently. AngularJS and ReactJS use advanced JavaScript templates to render the site, which means the HTML/CSS page speed overhead is almost nothing. All site activity runs behind the scenes, out of view of the user.

Unfortunately, anyone who’s tried performing SEO on an Angular or React site knows that the site activity is hidden from more than just site visitors: it’s also hidden from web crawlers. Crawlers like Googlebot rely heavily on HTML/CSS data to render and interpret the content on a site. When that HTML content is hidden behind website scripts, crawlers have no website content to index and serve in search results.

Of course, Google claims it can crawl JavaScript (and SEOs have tested and supported this claim), but even if that is true, Googlebot still struggles to crawl sites built on an SPA framework. One of the first issues we encountered when a client first approached us with an Angular site was that nothing beyond the homepage was appearing in the SERPs. Screaming Frog crawls uncovered the homepage and a handful of other JavaScript resources, and that was it.


Another common issue is recording Google Analytics data. Think about it: Analytics data is tracked by recording pageviews every time a user navigates to a page. How can you track site analytics when there’s no HTML response to trigger a pageview?

After working with several clients on their SPA websites, we’ve developed a process for performing SEO on those sites. By using this process, we’ve not only enabled SPA sites to be indexed by search engines, but even to rank on the first page for keywords.

5-step solution to SEO for AngularJS

  1. Make a list of all pages on the site
  2. Install Prerender
  3. “Fetch as Google”
  4. Configure Analytics
  5. Recrawl the site

1) Make a list of all pages on your site

If this sounds like a long and tedious process, that’s because it definitely can be. For some sites, this will be as easy as exporting the XML sitemap for the site. For other sites, especially those with hundreds or thousands of pages, creating a comprehensive list of all the pages on the site can take hours or days. However, I cannot emphasize enough how helpful this step has been for us. Having an index of all pages on the site gives you a guide to reference and consult as you work on getting your site indexed. It’s almost impossible to predict every issue that you’re going to encounter with an SPA, and if you don’t have an all-inclusive list of content to reference throughout your SEO work, it’s highly likely you’ll inadvertently leave some part of the site un-indexed by search engines.

One solution that might enable you to streamline this process is to divide content into directories instead of individual pages. For example, if you know that you have a list of storeroom pages, include your /storeroom/ directory and make a note of how many pages that includes. Or if you have an e-commerce site, make a note of how many products you have in each shopping category and compile your list that way (though if you have an e-commerce site, I hope for your own sake you have a master list of products somewhere). Regardless of what you do to make this step less time-consuming, make sure you have a full list before continuing to step 2.
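If your site already publishes an XML sitemap, a small script can pull the URL list out for you. Here’s a rough Node.js sketch, assuming the sitemap lives at /sitemap.xml on your domain (the URL is a placeholder):

```javascript
// Rough sketch: list every <loc> entry in an XML sitemap to build a page inventory.
const https = require('https');

const SITEMAP_URL = 'https://www.example.com/sitemap.xml'; // placeholder

https.get(SITEMAP_URL, (res) => {
  let xml = '';
  res.on('data', (chunk) => { xml += chunk; });
  res.on('end', () => {
    // Naive extraction of <loc>...</loc> values; fine for a quick inventory.
    const urls = [];
    const locRe = /<loc>([^<]+)<\/loc>/g;
    let match;
    while ((match = locRe.exec(xml)) !== null) {
      urls.push(match[1]);
    }
    console.log('Found ' + urls.length + ' URLs');
    urls.forEach((url) => console.log(url));
  });
}).on('error', (err) => console.error('Sitemap fetch failed:', err.message));
```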

2) Install Prerender

Prerender is going to be your best friend when performing SEO for SPAs. Prerender is a service that will render your website in a virtual browser, then serve the static HTML content to web crawlers. From an SEO standpoint, this is as good of a solution as you can hope for: users still get the fast, dynamic SPA experience while search engine crawlers can identify indexable content for search results.

Prerender’s pricing varies based on the size of your site and the freshness of the cache served to Google. Smaller sites (up to 250 pages) can use Prerender for free, while larger sites (or sites that update constantly) may need to pay as much as $200+/month. However, having an indexable version of your site that enables you to attract customers through organic search is invaluable. This is where that list you compiled in step 1 comes into play: if you can prioritize what sections of your site need to be served to search engines, or with what frequency, you may be able to save a little bit of money each month while still achieving SEO progress.
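As one possible setup, if your SPA is served by a Node/Express server, Prerender’s prerender-node middleware can be dropped in roughly like this (the token, port, and dist folder are placeholders):

```javascript
// Rough sketch: serve the SPA normally to users, but hand crawlers
// pre-rendered HTML via the prerender-node middleware.
const path = require('path');
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Token from your Prerender.io account (placeholder).
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Serve the built SPA assets.
app.use(express.static('dist'));

// Let the client-side router handle all other routes.
app.get('*', (req, res) => res.sendFile(path.join(__dirname, 'dist', 'index.html')));

app.listen(3000, () => console.log('Listening on port 3000'));
```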

3) “Fetch as Google”

Within Google Search Console is an incredibly useful feature called “Fetch as Google.” “Fetch as Google” allows you to enter a URL from your site and fetch it as Googlebot would during a crawl. “Fetch” returns the HTTP response from the page, which includes a full download of the page source code as Googlebot sees it. “Fetch and Render” will return the HTTP response and will also provide a screenshot of the page as Googlebot saw it and as a site visitor would see it.

This has powerful applications for AngularJS sites. Even with Prerender installed, you may find that Google is still only partially displaying your website, or it may be omitting key features of your site that are helpful to users. Plugging the URL into “Fetch as Google” will let you review how your site appears to search engines and what further steps you may need to take to optimize your keyword rankings. Additionally, after requesting a “Fetch” or “Fetch and Render,” you have the option to “Request Indexing” for that page, which can be a handy catalyst for getting your site to appear in search results.

4) Configure Google Analytics (or Google Tag Manager)

As I mentioned above, SPAs can have serious trouble with recording Google Analytics data since they don’t track pageviews the way a standard website does. Instead of the traditional Google Analytics tracking code, you’ll need to install Analytics through some kind of alternative method.

One method that works well is to use the Angulartics plugin. Angulartics replaces standard pageview events with virtual pageview tracking, which tracks the entire user navigation across your application. Since SPAs dynamically load HTML content, these virtual pageviews are recorded based on user interactions with the site, which ultimately tracks the same user behavior as you would through traditional Analytics. Other people have found success using Google Tag Manager “History Change” triggers or other innovative methods, which are perfectly acceptable implementations. As long as your Google Analytics tracking records user interactions instead of conventional pageviews, your Analytics configuration should suffice.
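As a sketch of the Angulartics route for an AngularJS (1.x) app: you load the angulartics and angulartics-google-analytics scripts after your normal analytics.js snippet and register the modules (‘myApp’ here is a placeholder module name):

```javascript
// Rough sketch: with these modules registered, Angulartics replaces standard
// pageviews with virtual pageviews fired automatically on route changes.
angular.module('myApp', [
  'ngRoute',
  'angulartics',
  'angulartics.google.analytics'
]);
```

No extra event wiring is needed for basic tracking; you can still send custom events on top of this if you need them.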

5) Recrawl the site

After working through steps 1–4, you’re going to want to crawl the site yourself to find those errors that not even Googlebot was anticipating. One issue we discovered early with a client was that after installing Prerender, our crawlers were still running into a spider trap:

As you can probably tell, there were not actually 150,000 pages on that particular site. Our crawlers just found a recursive loop that kept generating longer and longer URL strings for the site content. This is something we would not have found in Google Search Console or Analytics. SPAs are notorious for causing tedious, inexplicable issues that you’ll only uncover by crawling the site yourself. Even if you follow the steps above and take as many precautions as possible, I can still almost guarantee you will come across a unique issue that can only be diagnosed through a crawl.

If you’ve come across any of these unique issues, let me know in the comments! I’d love to hear what other issues people have encountered with SPAs.

Results

As I mentioned earlier in the article, the process outlined above has enabled us to not only get client sites indexed, but even to get those sites ranking on first page for various keywords. Here’s an example of the keyword progress we made for one client with an AngularJS site:

Also, the organic traffic growth for that client over the course of seven months:

All of this goes to show that although SEO for SPAs can be tedious, laborious, and troublesome, it is not impossible. Follow the steps above, and you can have SEO success with your single-page app website.






Tuesday, May 30, 2017

Affiliate Marketing Executive

Home Box Office (S) Pte Ltd is hiring Affiliate Marketing Executive. Monday - Friday, 9.00am - 5.30pm. Walking distance from MRT. 2 years marketing …


Digital Marketing Executive affiliate

View details & apply online for this Digital Marketing Executive affiliate vacancy on reed.co.uk, the UK’s #1 job site.


Social Proof: The Ultimate Guide

If you want your business to grow faster, you’ll need some social proof. Social proof pertains to personal or paid recommendations by real people who …


Performance Marketing Goes Mainstream: What You Need to Know

It used to be that the terms “performance marketing” and “affiliate marketing” were interchangeable. Affiliate programs, after all, are the original …


7 Simple Low-Cost SEO Tips to Boost Your Business Blog


It is a truth universally acknowledged that blogging can be extremely advantageous for your business. In fact, there are few reasons not to keep a business blog.

A blog can drive traffic to your website by incorporating SEO-friendly keywords and skyrocket your search engine rankings. It also opens up a valuable channel of communication between you and your clients, builds trust, and generates new leads that translate into tangible business gains.

Even better, a blog can help you establish yourself as an expert within your field, and share your knowledge and experience in an engaging manner.

While creating content for a business blog undeniably requires considerable time and effort, there are a number of free or low-cost ways to optimize your site for search engines. Here are seven effective and economical ways to become an SEO superhero.

1. Outsource the work

Producing a blog isn’t a part time job – if you really want to optimize your website for search engines, then you’re going to need to post regularly and in depth. As these blogs also need to brilliantly reflect you as a business person, they should be original, high-quality and editorially flawless. And that’s hard work.

Aside from being a huge time commitment, it just may not sit well with your existing talents. For this reason, many businesses choose to delegate their blog writing or content creation to external third parties.

You can easily find a content writer or editor by posting an ad on a site like Upwork or Freelancer. Other websites don’t even require an ad from you – you can simply select the service you’re looking for, and it will be assigned to an expert. Check out UK Writings or Big Assignments for this sort of thing. You can also find a low-cost automated service for writing and proofreading on People Per Hour.

2. Use tools to improve content

Without consistently high-quality content, your SEO rankings will flounder because you will simply lose the interest of readers. To make sure every blog is top-notch before you post it, without the help of an external editor or proofreader, you can use the handy guides available at Australian Help or Academized to ensure your language, tone, and structure are perfect.

It’s also essential to remember that blog posts of around 2000 words are best for SEO. You can keep an eye on this with Easy Word Count.

Furthermore, never forget to reference authoritative outside sources, as Google pays attention to what you link to in an article. Similarly, every article on your site should be accompanied by social share icons to fully optimize the opportunity to earn more inbound links and gain further authority for your site.

3. Don’t forget about keywords

When writing a business blog post, you cannot forget to repeatedly and strategically use keywords. These words should be the cornerstones of your blogs – the glue that holds the writing together.

Make sure the keywords you choose fit seamlessly into your writing, though, and are always used within context. SEO does not improve when you overstuff your content with multiple keywords – concentrate on one or two per blog that are close to your niche and have few competitors.

Google Trends and Google’s Keyword Planner in AdWords are free tools that will give you all the data you need to master keywords.

4. Remember that keyword placement matters

On that note, while keywords are useful on their own, they are at their most effective when properly situated within the text.

There are three main places where your keywords should appear. The first is the title, where you should aim to write something catchy that also incorporates a keyword for SEO purposes. The second is within different sub-headers and the central body text.

The third (and most important) is within the URL itself. Your URLs should follow a breadcrumb trail and feature readable indicators of the content within the blog. Avoid using numbers or random strings of text in your URLs.

5. Make sure your content is mobile-friendly

Increasingly, people access the internet and consume blogs and news while commuting, watching TV or lying in bed. This means they’re generally using their mobile phones. And if your blog isn’t mobile-optimized, that is bad news for your SEO.

The majority of Google search now occurs on mobile devices and Google will not display your website prominently in mobile search results if it isn’t mobile-optimized.

Putting that aside, the internet is the home of short attention spans, and the chances of someone returning to your site on a laptop or computer after being unable to access it properly on their mobile phone are slim.

Use the Mobile-Friendly Test tool to check how Google search sees your pages or read this free guide on building mobile-friendly sites to get started.

6. Don’t write boring essays

While your blog content will ideally be around 2000 words, be wary of making it too long, technical or tedious. Break up your text with images, infographics or videos and format the content in a way that’s proven to be pleasing to online consumers. This means including sub-headers, writing in short paragraphs, bolding certain points and incorporating bullet points/lists.

If your reader can scan through your post quickly to find the information they’re looking for, then they’ll be more likely to revisit it as a resource in the future.

Another SEO advantage of including images, videos, infographics, or even graphs and tables is that you can create alt text for each asset, which improves your overall ranking when people search for image or video content.
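In practice, that just means giving every visual a descriptive alt attribute; a quick sketch (the file name and text are invented):

```html
<img src="/images/low-cost-seo-tips-infographic.png"
     alt="Infographic: 7 simple low-cost SEO tips for business blogs">
```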

7. Make sure you can be contacted

Your contact details need to be included on every page of your blog – this means a phone number, email address, or contact form at a minimum. After all, this isn’t a personal blog – it’s a business blog, which means you need to be available to potential clients and customers.

Similarly, be specific about who you are and what your blog stands for. Have your business name on every landing page of your website and put it into blog posts in the same way that you would include a keyword.

In summary

It’s easy to use a few simple tricks and online tools to improve your SEO for little to no cost. Time and effort need to be dedicated to running a successful blog; however, the above tips will definitely help you achieve better and faster results.

SEO is simpler to understand and master than many people think – and over time it can be your biggest source of traffic.

Guest Author: Sharon Conwell has been a content manager and ghost writer on over 20 online projects; now she is a part-time educator and freelance writer specializing in content creation and optimization. She loves coffee, tulips and her Shih Tzu named Bobby. You can find her on LinkedIn.

The post 7 Simple Low-Cost SEO Tips to Boost Your Business Blog appeared first on Jeffbullas’s Blog.




How Scandinavian online gaming affiliates can become future proof

Pay-per-click, email, SMS and even advertising are more common methods for affiliate marketing today. Social media have also grown more and more …


No, Paid Search Audiences Won’t Replace Keywords

Posted by PPCKirk

I have been chewing on a keyword vs. audience targeting post for roughly two years now. In that time we have seen audience targeting grow in popularity (as expected) and depth.

“Popularity” is somewhat of an understatement here. I would go so far as to say that I’ve heard it lauded in messianic-like “thy kingdom come, thy will be done” reverential awe by some paid search marketers, as if paid search were lacking a heartbeat before life-giving audience targeting arrived and 1-2-3-clear’ed it into relevance.

However, I would argue that despite audience targeting’s popularity (and understandable success), we have also seen the revelation of some weaknesses as well. It turns out it’s not quite the heroic, rescue-the-captives targeting method paid searchers had hoped it would be.

The purpose of this post is to argue against the notion that audience targeting can replace the keyword in paid search.

Now, before we get into the throes of keyword philosophy, I’d like to reduce the number of angry comments this post receives by acknowledging a crucial point.

It is not my intention in any way to set up a false dichotomy. Yes, I believe the keyword is still the most valuable form of targeting for a paid search marketer, but I also believe that audience targeting can play a valuable complementary role in search bidding.

In fact, as I think about it, I would argue that I am writing this post in response to what I have heard become a false dichotomy. That is, that audience targeting is better than keyword targeting and will eventually replace it.

I disagree with this idea vehemently, as I will demonstrate in the rest of this article.

One seasoned (age, not steak) traditional marketer’s point of view

The best illustration I’ve heard on the core weakness of audience targeting was from an older traditional marketer who has probably never accessed the Keyword Planner in his life.

“I have two teenage daughters,” he revealed, with no small amount of pride.

“They are within 18 months of each other, so in age demographic targeting they are the same person.”

“They are both young women, so in gender demographic targeting they are the same person.”

“They are both my daughters in my care, so in income demographic targeting they are the same person.”

“They are both living in my house, so in geographical targeting they are the same person.”

“They share the same friends, so in social targeting they are the same person.”

“However, in terms of personality, they couldn’t be more different. One is artistic and enjoys heels and dresses and makeup. The other loves the outdoors and sports, and spends her time in blue jeans and sneakers.”

If an audience-targeting marketer selling spring dresses saw them in his marketing list, he would (1) see two older high school girls with the same income in the same geographical area, (2) assume they are both interested in what he has to sell, and (3) only make one sale.

The problem isn’t with his targeting, the problem is that not all those forced into an audience persona box will fit.

In September of 2015, Aaron Levy (a brilliant marketing mind; go follow him) wrote a fabulously under-shared post revealing these weaknesses in another way: What You Think You Know About Your Customers’ Persona is Wrong

In this article, Aaron first bravely broaches the subject of audience targeting by describing how it is far from the exact science we all have hoped it to be. He noted a few ways that audience targeting can be erroneous, and even *gasp* used data to formulate his conclusions.

It’s OK to question audience targeting — really!

Let me be clear: I believe audience targeting is popular because there genuinely is value in it (it’s amazing data to have… when it’s accurate!). The insights we can get about personas, which we can then use to power our ads, are quite amazing and powerful.

So, why the heck am I droning on about audience targeting weaknesses? Well, I’m trying to set you up for something. I’m trying to get us to admit that audience targeting itself has some weaknesses, and isn’t the savior of all digital marketing that some make it out to be, and that there is a tried-and-true solution that fits well with demographic targeting, but is not replaced by it. It is a targeting that we paid searchers have used joyfully and successfully for years now.

It is the keyword.

Whereas audience targeting chafes under the law of averages (i.e., “at some point, someone in my demographic targeted list has to actually be interested in what I am selling”), keyword targeting shines in individual-revealing user intent.

Keyword targeting does something an audience can never, ever, ever do…

Keywords: Personal intent powerhouses

A keyword is still my favorite form of targeting in paid search because it reveals individual, personal, and temporal intent. Those aren’t just three buzzwords I pulled out of the air because I needed to stretch this already obesely-long post out further. They are intentional, and worth exploring.

Individual

A keyword is such a powerful targeting method because it is written (or spoken!) by a single person. I mean, let’s be honest, it’s rare to have more than one person huddled around the computer shouting at it. Keywords are generally from the mind of one individual, and because of that they have frightening potential.

Remember, audience targeting is based off of assumptions. That is, you’re taking a group of people who “probably” think the same way in a certain area, but does that mean they cannot have unique tastes? For instance, one person preferring to buy sneakers with another preferring to buy heels?

Keyword targeting is demographic-blind.

It doesn’t care who you are, where you’re from, what you did, as long as you love me… err, I mean, it doesn’t care about your demographic, just about what you’re individually interested in.

Personal

The next aspect of keywords powering their targeting awesomeness is that they reveal personal intent. Whereas the “individual” aspect of keyword targeting narrows our targeting from a group of people to a single person, the “personal” aspect of keyword targeting goes into the very mind of that individual.

Don’t you wish there was a way to market to people in which you could truly discern the intentions of their hearts? Wouldn’t that be a powerful method of targeting? Well, yes — and that is keyword targeting!

Think about it: a keyword is a form of communication. It is a person typing or telling you what is on their mind. For a split second, in their search, you and they are as connected through communication as Alexander Graham Bell and Thomas Watson on the first phone call. That person is revealing to you what’s on her mind, and that’s a power which cannot be underestimated.

When a person tells Google they want to know “how does someone earn a black belt,” that is telling your client — the Jumping Judo Janes of Jordan — this person genuinely wants to learn more about their services and they can display an ad that matches that intent (Ready for that Black Belt? It’s Not Hard, Let Us Help!). Paid search keywords officiate the wedding of personal intent with advertising in a way that previous marketers could only dream of. We aren’t finding random people we think might be interested based upon where they live. We are responding to a person telling us they are interested.

Temporal

The final note of keyword targeting that cannot be underestimated is the temporal aspect. Anyone worth their salt in marketing can tell you “timing is everything”. With keyword targeting, the timing is inseparable from the intent. When is this person interested in learning about your Judo classes? At the time they are searching: NOW!

You are not blasting your ads into your users’ lives, interrupting them as they go about their business or family time, hoping to jumpstart their interest by distracting them from their activities. You are responding to their query, at the very time they are interested in learning more.

Timing. Is. Everything.

The situation settles into stickiness

Thus, to summarize: a “search” is done when an individual reveals his/her personal intent with communication (keywords/queries) at a specific time. Because of that, I maintain that keyword targeting trumps audience targeting in paid search.

Paid search is an evolving industry, but it is still “search,” which requires communication, which requires words (until that time when the emoji takes over the English language, but that’s okay because the rioting in the streets will have gotten us first).

Of course, we would be remiss in ignoring some legitimate questions which inevitably arise. As ideal as the outline I’ve laid out before you sounds, you’re probably beginning to formulate something like the following four questions.

  • What about low search volume keywords?
  • What if the search engines kill keyword targeting?
  • What if IoT monsters kill search engines?
  • What about social ads?

We’ll close by discussing each of these four questions.

Low search volume terms (LSVs)

Low search volume keywords stink like poo (excuse the rather strong language there). I’m not sure if there is any data on this out there (if so, please share it below), but I have run into low search volume terms far more in the past year than when I first started managing PPC campaigns in 2010.

I don’t know all the reasons for this; perhaps it’s worth another blog post, but the reality is it’s getting harder to be creative and target high-value long-tail keywords when so many are getting shut off due to low search volume.

This seems like a fairly smooth way being paved for Google/Bing to eventually “take over” (i.e., “automate for our good”) keyword targeting, at the very least for SMBs (small-medium businesses) where LSVs can be a significant problem. In this instance, the keyword would still be around, it just wouldn’t be managed by us PPCers directly. Boo.

Search engine decrees

I’ve already addressed the power search engines have here, but I will be the first to admit that, as much as I like keyword targeting and as much as I have hopefully proven how valuable it is, it still would be a fairly easy thing for Google or Bing to kill off completely. Major boo.

Since paid search relies on keywords and queries and language to work, I imagine this would look more like an automated solution (think DSAs and shopping), in which they make keyword targeting into a dynamic system that works in conjunction with audience targeting.

While this was about a year and a half ago, it is worth noting that at Hero Conference in London, Bing Ads’ ebullient Tor Crockett did make the public statement that Bing at the time had no plans to sunset the keyword as a bidding option. We can only hope this sentiment remains, and transfers over to Google as well.

But Internet of Things (IoT) Frankenstein devices!

Finally, it could be that search engines won’t be around forever. Perhaps this will look like IoT devices such as Alexa that incorporate some level of search into them, but pull traffic away from using Google/Bing search bars. As an example of this in real life, you don’t need to ask Google where to find (queries, keywords, communication, search) the best price on laundry detergent if you can just push the Dash button, or your smart washing machine can just order you more without a search effort.


On the other hand, I still believe we’re a long way off from this in the same way that the freak-out over mobile devices killing personal computers has slowed down. That is, we still utilize our computers for education & work (even if personal usage revolves around tablets and mobile devices and IoT freaks-of-nature… smart toasters anyone?) and our mobile devices for queries on the go. Computers are still a primary source of search in terms of work and education as well as more intensive personal activities (vacation planning, for instance), and thus computers still rely heavily on search. Mobile devices are still heavily query-centered for various tasks, especially as voice search (still query-centered!) kicks in harder.

The social effect

Social is its own animal in a way, and why I believe it is already and will continue to have an effect on search and keywords (though not in a terribly worrisome way). Social definitely pulls a level of traffic from search, specifically in product queries. “Who has used this dishwasher before, any other recommendations?” Social ads are exploding in popularity as well, and in large part because they are working. People are purchasing more than they ever have from social ads and marketers are rushing to be there for them.

The flip side of this: a social and paid search comparison is apples-to-oranges. There are different motivations and purposes for using search engines and querying your friends.

Audience targeting works great in a social setting since that social network has phenomenally accurate and specific targeting for individuals, but it is the rare individual curious about the ideal condom to purchase who queries his family and friends on Facebook. There will always be elements of social and search that are unique and valuable in their own way, and audience targeting for social and keyword targeting for search complement those unique elements of each.

Idealism incarnate

Thus, it is my belief that as long as we have search, we will still have keywords and keyword targeting will be the best way to target — as long as costs remain low enough to be realistic for budgets and the search engines don’t kill keyword bidding for an automated solution.

Don’t give up, the keyword is not dead. Stay focused, and carry on with your match types!

I want to close by re-acknowledging the crucial point I opened with.

It has not been my intention in any way to set up a false dichotomy. In fact, as I think about it, I would argue that I am writing this in response to what I have heard become a false dichotomy. That is, that audience targeting is better than keyword targeting and will eventually replace it…

I believe the keyword is still the most valuable form of targeting for a paid search marketer, but I also believe that audience demographics can play a valuable complementary role in bidding.

A prime example that we already use is remarketing lists for search ads, in which we can layer on remarketing audiences in both Google and Bing into our search queries. Wouldn’t it be amazing if we could someday do this with massive amounts of audience data? I’ve said this before, but were Bing Ads to use its LinkedIn acquisition to allow us to layer on LinkedIn audiences into our current keyword framework, the B2B angels would surely rejoice over us (Bing has responded, by the way, that something is in the works!).

Either way, I hope I’ve demonstrated that far from being on its deathbed, the keyword is still the most essential tool in the paid search marketer’s toolbox.






Monday, May 29, 2017

Affiliate Marketing; The Secret to Creating a Massive Passive Income Business Online

coohnu: Download & Read ’Affiliate Marketing: The Secret to Creating a … What’s inside: What affiliate marketing is and how to make it work for you …


Why Instagress Shut Down But You Can Still Grow A Massive Instagram Following


By now you’re probably well aware that Instagress, an easy-to-use and affordable website that allowed would-be Instagram influencers to use bots to rack up followers, has folded.

Instagress didn’t make fake accounts, but it created bots that automatically commented on and liked other people’s Instagram photos, artificially increasing engagement and making accounts appear more popular.

In a tweet, Instagress said the ‘sad’ decision to shut down its web service was due to a ‘request’ from Instagram.

It’s no secret that Instagram dislikes the use of automation and bots, but regulating botting on platforms has always been difficult, due to the bot-like behavior of most people. Think about it: we like, comment and click on the same things in a very predictable fashion. Plus, most people want a high volume of Instagram followers, and to get that to be less of a priority, you’d have to change Instagram into a platform that isn’t based on popularity and likes. And nobody’s doing that.

This is what the Instagress dashboard looked like:

So why did Instagress shut down?

It’s the one question that has been trending across social media platforms for a month. Some say it was because the name had the word ‘insta’ in it; others linked it to the fact that the company breached Instagram’s terms of service.

The truth is, we may never know the exact reason. But the key takeaway from the collapse of Instagress is that paid follower growth isn’t going anywhere. Yes, that may sound counterintuitive. But let me tell you why.

Marketing will always be marketing. And what all marketers want to do at the end of the day is reach as many members of their target audience as possible while spending the minimum amount possible.

I hate to be the bearer of bad news, but organic reach on social media is dead. Most content goes nowhere and gets zero shares on its own. Think about it. There are 500 million users on Instagram alone. That makes it bigger than the entire population of the USA!

It means standing out on social media is not easy. Changes to Google’s search results pages have further obscured organic content, especially on competitive commercial searches, and the typical internet marketing conversion rate is less than 1%.

Content doesn’t go viral on Instagram just because there are a lot of people on it. It goes viral because all of those 500 million people are connected to each other through one platform and they share whatever catches their attention. It’s a domino effect, and I would argue that it’s only really possible through paid acquisition and automation.

But why do I need bots?

If you’re still thinking, I can just engage with as many people as possible, as frequently as possible, across the necessary platforms, then you’re still not thinking big enough.

In the instance that your Instagram account experiences a high level of activity – I’m talking literally thousands upon thousands of comments and shares – it would take you all day and all night to respond, and then some. Sure, you might have a social media specialist in-house to take care of:

  • Sharing comments
  • Sharing likes
  • Following people
  • Responding to direct messages
  • Exchanging shout-outs for shout-outs

But a web service like Instagress can take care of a lot of this for you. Imagine the money you will save per year if you outsource your account activity and process of engagement, and have your social media specialist focus more on creating killer content – the kind that commands attention and keeps it.

Here are a couple of Instagram accounts that do a great job of branding themselves with content. And check out their follower counts! They may be organic… But I have an inkling they’re not.

Without a doubt, automation is the way forward for all marketers. It is a form of marketing that, when used correctly, can increase efficiency by a significant margin, and make all the difference between floating and floundering for your business.

So what was great about Instagress?

When I first came across Instagress roughly two years ago, I fell in love with it as it helped me solve my time-management and marketing woes. However, I hated the fact that I had to constantly check the dashboard, adjust the settings and make sure it was working properly. So I decided to revamp the platform in a customized way so that it would work just like I wished it would.

That was when SociallyRich was born.

With Instagress, the user process involved conducting research for hashtags you wished to target. Once done, you would manually type them into the dashboard and the software would begin to ‘like’ pictures using those hashtags at whichever speed you had set the software to do so.

The issue was that the dashboard created a lot of confusion for people who were not well-versed in using Instagram.

We built SociallyRich keeping three issues we identified with Instagress in mind:

  1. Not everyone is an Instagram expert; in fact, many people looking for paid follower growth are small businesses without a designated team to handle their social media accounts
  2. There was no dedicated assistive tool for performing hashtag research, and doing so was time-consuming
  3. There was no customer service team to troubleshoot problems when you started using it.

In building SociallyRich, we removed the dashboard to take the guesswork out of researching hashtags by doing it ourselves in the back end. We aimed to kill two birds with one stone: make using our bot service easier, and make sure every account is 100% optimized. And we succeeded.

At SociallyRich, we look at each account manually to decide how much activity can be generated for optimal performance, and benchmark it to some metrics we look at and calculate periodically. We have a service called ‘Done For You‘ where our marketing team performs in-depth research of your target market so that the following you get is as targeted as possible.

We also offer the best customer service we possibly can for those un-tech-savvy moments. We turned the ‘disadvantage’ of being a small start-up into an advantage and kept it, even when we grew big. To have a connection and be treated as a human through a computer should be standard practice.

Over time, we grew a client list belonging to a diverse range of industries. Clients of SociallyRich include:

  • Restaurateurs
  • Social media influencers
  • Bloggers
  • Marketing agencies
  • Photographers
  • Artists
  • Real estate companies
  • Small start-ups
  • Retail stores
  • Design accounts
  • Ecommerce companies

And many more.

What other tips can you give me?

First and foremost, understand that Instagram is for everyone: there’s not a single industry that it doesn’t suit and can’t serve in terms of marketing.

Anyone who says otherwise has tried it and given up too quickly as it didn’t work for them. A platform with over 500 million users definitely is worthy of your attention!

Which brings me to another point. You may have noticed that in the list of SociallyRich client industries I included above, social media influencers were mentioned. You don’t need to own a business to give yourself permission to expand your following and extend your influence on Instagram. You can monetize your account simply by establishing your own good self as a brand.

One of my favorite ways to monetize your account in a nifty way is by playing to your different niches: for example, I own the Instagram accounts @designselfies, @higherlifestyle and @mens.daily.style.

When you grow or buy an audience for a particular niche using SociallyRich or another web tool, you can then sell them something they would potentially be interested in. The system works flawlessly. Once you have built an audience and have the potential to attract a lot of traffic, you just have to promote the product through your own account – you don’t even have to pay for traffic or influencer marketing anymore.

Sometimes people say to me, “If I launch my business on ‘x’ date, when should I begin my social media marketing?” As my friend Nick Mares, founder of Kettle & Fire Bone Broth once said to me, “It’s never too early to start marketing.”

I apply that mentality to anything marketing-related.

Even if you don’t have plans to launch a business any time soon, start growing your Instagram audience now so that one day if you do decide to launch your own business, you will already have an audience. You will save yourself a lot of time and money as a result.

I understand that it’s hard to appreciate the impact something can have without having experienced it first. And that’s why I want to offer you a completely free 3-day trial of our service so that you can get a first-hand experience of the type of following you can build for yourself, your company, or your clients with the help of SociallyRich.

This 3-day trial is a no commitment, cancel anytime, 100% free trial. If you are still thinking that you’ll continue to do all your social media marketing manually, be sure to bookmark this page because I guarantee you’ll get tired of it fairly fast!

In conclusion

Social media is a cornerstone of modern communication and it’s here to stay. In today’s world, the more people we can connect with, the better. Instagram keeps growing daily and is about to hit the 600 million user mark, if it hasn’t already!

It doesn’t matter what industry you’re in – what you ultimately want to do is increase your popularity. If you build a large enough Instagram following in your industry or niche, you will effectively establish yourself as an authority and increase your sales prospects.

People will trust you and feel they don’t have to keep searching for another company, because a large following validates you in their eyes. Buying follower growth with bots is a proven way to start biasing people towards you.

Think of tools like SociallyRich as the extra firepower and investment for your business that will help you use Instagram and other social media platforms to your advantage. A shift in digital marketing trends is taking place right in front of our very eyes. It’s an exciting time to be alive. My two cents is: Instagram is here to stay, maybe even forever!

Now is the time to get ahead of the curve, see around corners, innovate and stay on top as a digital marketer. Try my free trial offer, and remember the most important skill of all: the ability to adapt and take action today instead of tomorrow.

Guest Author: Ramon Berrios is a multi-national serial entrepreneur based in Puerto Rico, and the co-founder of SociallyRich.co – an Instagram Growth service that helps businesses from any industry grow a targeted following. 

The post Why Instagress Shut Down But You Can Still Grow A Massive Instagram Following appeared first on Jeffbullas’s Blog.





Evidence of the Surprising State of JavaScript Indexing

Posted by willcritchlow

Back when I started in this industry, it was standard advice to tell our clients that the search engines couldn’t execute JavaScript (JS), and anything that relied on JS would be effectively invisible and never appear in the index. Over the years, that has changed gradually, from early work-arounds (such as the horrible escaped fragment approach my colleague Rob wrote about back in 2010) to the actual execution of JS in the indexing pipeline that we see today, at least at Google.

In this article, I want to explore some things we’ve seen about JS indexing behavior in the wild and in controlled tests and share some tentative conclusions I’ve drawn about how it must be working.

A brief introduction to JS indexing

At its most basic, the idea behind JavaScript-enabled indexing is to get closer to the search engine seeing the page as the user sees it. Most users browse with JavaScript enabled, and many sites either fail without it or are severely limited. While traditional indexing considers just the raw HTML source received from the server, users typically see a page rendered based on the DOM (Document Object Model) which can be modified by JavaScript running in their web browser. JS-enabled indexing considers all content in the rendered DOM, not just that which appears in the raw HTML.
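
To make the distinction concrete, here is a minimal, hypothetical snippet (not taken from any real site) that appends a sentence to the page after load. A crawler reading only the raw HTML would never see that sentence; an indexer that renders the DOM would.

```javascript
// Hypothetical example: this text exists only in the rendered DOM.
// A crawler that reads just the raw HTML source will not see it;
// an indexer that executes JavaScript and reads the DOM will.
document.addEventListener('DOMContentLoaded', () => {
  const p = document.createElement('p');
  p.textContent = 'This sentence is injected client-side and is absent from the HTML source.';
  document.body.appendChild(p);
});
```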

There are some complexities even in this basic definition (answers in brackets as I understand them):

  • What about JavaScript that requests additional content from the server? (This will generally be included, subject to timeout limits)
  • What about JavaScript that executes some time after the page loads? (This will generally only be indexed up to some time limit, possibly in the region of 5 seconds)
  • What about JavaScript that executes on some user interaction such as scrolling or clicking? (This will generally not be included)
  • What about JavaScript in external files rather than in-line? (This will generally be included, as long as those external files are not blocked from the robot — though see the caveat in experiments below)

For more on the technical details, I recommend my ex-colleague Justin’s writing on the subject.

A high-level overview of my view of JavaScript best practices

Despite the incredible work-arounds of the past (which always seemed like more effort than graceful degradation to me) the “right” answer has existed since at least 2012, with the introduction of PushState. Rob wrote about this one, too. Back then, however, it was pretty clunky and manual, and it required a concerted effort to ensure that the URL was updated in the user’s browser for each view that should be considered a “page,” that the server could return full HTML for those pages in response to new requests for each URL, and that the back button was handled correctly by your JavaScript.

Along the way, in my opinion, too many sites got distracted by a separate prerendering step. This is an approach that does the equivalent of running a headless browser to generate static HTML pages that include any changes made by JavaScript on page load, then serving those snapshots instead of the JS-reliant page in response to requests from bots. It typically treats bots differently, in a way that Google tolerates, as long as the snapshots do represent the user experience. In my opinion, this approach is a poor compromise that’s too susceptible to silent failures and falling out of date. We’ve seen a bunch of sites suffer traffic drops due to serving Googlebot broken experiences that were not immediately detected because no regular users saw the prerendered pages.

These days, if you need or want JS-enhanced functionality, more of the top frameworks have the ability to work the way Rob described in 2012, which is now called isomorphic (roughly meaning “the same”).

Isomorphic JavaScript serves HTML that corresponds to the rendered DOM for each URL, and updates the URL for each “view” that should exist as a separate page as the content is updated via JS. With this implementation, there is actually no need to render the page to index basic content, as it’s served in response to any fresh request.
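
As a rough illustration of that pattern, here is a sketch of what the client half might look like, assuming a hypothetical server that can return either full HTML or a small JSON fragment for the same URL. It is the shape of the idea rather than a production implementation.

```javascript
// Sketch of the client half of an isomorphic setup. Assumption: the server
// can ALSO return full HTML for every URL handled here, so a fresh request
// from a user or a bot needs no JavaScript at all.
async function navigate(url) {
  // For in-app navigation, fetch just the content for the new view.
  // The { title, bodyHtml } payload shape is illustrative.
  const res = await fetch(url, { headers: { Accept: 'application/json' } });
  const data = await res.json();
  document.title = data.title;
  const main = document.querySelector('main');
  if (main) main.innerHTML = data.bodyHtml;
  // Keep the address bar in sync so each "view" is a real, linkable URL.
  history.pushState({ url: url }, '', url);
}

// Handle the back/forward buttons, which the PushState discussion above
// calls out as an easy thing to get wrong.
window.addEventListener('popstate', (event) => {
  if (event.state && event.state.url) {
    navigate(event.state.url);
  }
});
```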

I was fascinated by this piece of research published recently — you should go and read the whole study. In particular, you should watch this video (recommended in the post) in which the speaker — who is an Angular developer and evangelist — emphasizes the need for an isomorphic approach:

Resources for auditing JavaScript

If you work in SEO, you will increasingly find yourself called upon to figure out whether a particular implementation is correct (hopefully on a staging/development server before it’s deployed live, but who are we kidding? You’ll be doing this live, too).

To do that, here are some resources I’ve found useful:

Some surprising/interesting results

There are likely to be timeouts on JavaScript execution

I already linked above to the ScreamingFrog post that mentions experiments they have done to measure the timeout Google uses to determine when to stop executing JavaScript (they found a limit of around 5 seconds).

It may be more complicated than that, however. This segment of a thread is interesting. It’s from a Hacker News user who goes by the username KMag and who claims to have worked at Google on the JS execution part of the indexing pipeline from 2006–2010. It’s in relation to another user speculating that Google would not care about content loaded “async” (i.e. asynchronously — in other words, loaded as part of new HTTP requests that are triggered in the background while assets continue to download):

“Actually, we did care about this content. I’m not at liberty to explain the details, but we did execute setTimeouts up to some time limit.

If they’re smart, they actually make the exact timeout a function of a HMAC of the loaded source, to make it very difficult to experiment around, find the exact limits, and fool the indexing system. Back in 2010, it was still a fixed time limit.”

What that means is that although it was initially a fixed timeout, he’s speculating (or possibly sharing without directly doing so) that timeouts are programmatically determined (presumably based on page importance and JavaScript reliance) and that they may be tied to the exact source code (the reference to “HMAC” is to do with a technical mechanism for spotting if the page has changed).
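
If you wanted to probe this behaviour yourself, a test page along these lines (entirely hypothetical, with a made-up token) would inject a unique marker after a chosen delay, so you could later search for the token and see whether content added that long after load still made it into the index.

```javascript
// Hypothetical probe for the rendering timeout discussed above: inject a
// unique nonsense token after a delay, then later search for that token to
// see whether content added this long after load was indexed.
const DELAY_MS = 4000; // try values around the ~5 second limit reported above
const token = 'zqxprobe' + DELAY_MS; // made-up token, one per delay value

setTimeout(() => {
  const div = document.createElement('div');
  div.textContent = 'Delayed content marker: ' + token;
  document.body.appendChild(div);
}, DELAY_MS);
```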

It matters how your JS is executed

I referenced this recent study earlier. In it, the author found:

Inline vs. External vs. Bundled JavaScript makes a huge difference for Googlebot

The charts at the end show the extent to which popular JavaScript frameworks perform differently depending on how they’re called, with a range of performance from passing every test to failing almost every test. For example, here’s the chart for Angular:

[Chart: Angular indexing test results by how the JavaScript is embedded]

It’s definitely worth reading the whole thing and reviewing the performance of the different frameworks. There’s more evidence of Google saving computing resources in some areas, as well as surprising results between different frameworks.

CRO tests are getting indexed

When we first started seeing JavaScript-based split-testing platforms designed for testing changes aimed at improving conversion rate (CRO = conversion rate optimization), their inline changes to individual pages were invisible to the search engines. As Google in particular has moved up the JavaScript competency ladder, from executing simple inline JS to handling more complex JS in external files, we are now seeing some CRO-platform-created changes being indexed. A simplified version of what’s happening (with a sketch of the user-side flow after this list) is:

  • For users:
    • CRO platforms typically take a visitor to a page, check for the existence of a cookie, and if there isn’t one, randomly assign the visitor to group A or group B
    • Based on either the cookie value or the new assignment, the user is either served the page unchanged, or sees a version that is modified in their browser by JavaScript loaded from the CRO platform’s CDN (content delivery network)
    • A cookie is then set to make sure that the user sees the same version if they revisit that page later
  • For Googlebot:
    • The reliance on external JavaScript used to prevent both the bucketing and the inline changes from being indexed
    • With external JavaScript now being loaded, and with many of these inline changes being made using standard libraries (such as JQuery), Google is able to index the variant and hence we see CRO experiments sometimes being indexed
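
Here is a rough sketch of that user-side flow. The cookie name, the 50/50 split and the CDN URL are all illustrative, not taken from any real CRO platform.

```javascript
// Sketch of the user-side flow described in the list above. Names are
// illustrative, not any particular CRO platform's API.
function getCookie(name) {
  const match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

let bucket = getCookie('cro_bucket');
if (!bucket) {
  // No cookie yet: randomly assign this visitor to group A or group B.
  bucket = Math.random() < 0.5 ? 'A' : 'B';
  // Persist the assignment so the visitor sees the same version on a revisit.
  document.cookie = 'cro_bucket=' + bucket + '; path=/; max-age=' + 60 * 60 * 24 * 90;
}

if (bucket === 'B') {
  // Group B sees a version modified in the browser by JS loaded from the
  // platform's CDN. Because this externally hosted script can now be fetched
  // and executed by Googlebot, the variant can end up in the index.
  const script = document.createElement('script');
  script.src = 'https://cdn.example-cro-platform.com/experiments/12345.js'; // hypothetical URL
  document.head.appendChild(script);
}
```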

I might have expected the platforms to block their JS with robots.txt, but at least the main platforms I’ve looked at don’t do that. With Google being sympathetic towards testing, however, this shouldn’t be a major problem — just something to be aware of as you build out your user-facing CRO tests. All the more reason for your UX and SEO teams to work closely together and communicate well.

Split tests show SEO improvements from removing a reliance on JS

Although we would like to do a lot more to test the actual real-world impact of relying on JavaScript, we do have some early results. At the end of last week I published a post outlining the uplift we saw from removing a site’s reliance on JS to display content and links on category pages.

[Chart: additional organic sessions after removing the JavaScript reliance on category pages]

A simple test that removed the need for JavaScript on 50% of pages showed a >6% uplift in organic traffic — worth thousands of extra sessions a month. While we haven’t proven that JavaScript is always bad, nor understood the exact mechanism at work here, we have opened up a new avenue for exploration, and at least shown that it’s not a settled matter. To my mind, it highlights the importance of testing. It’s obviously our belief in the importance of SEO split-testing that led to us investing so much in the development of the ODN platform over the last 18 months or so.

Conclusion: How JavaScript indexing might work from a systems perspective

Based on all of the information we can piece together from the external behavior of the search results, public comments from Googlers, tests and experiments, and first principles, here’s how I think JavaScript indexing is working at Google at the moment: I think there is a separate queue for JS-enabled rendering, because the computational cost of trying to run JavaScript over the entire web is unnecessary given that many, many pages don’t need it. In detail (a toy sketch follows the list below), I think:

  • Googlebot crawls and caches HTML and core resources regularly
  • Heuristics (and probably machine learning) are used to prioritize JavaScript rendering for each page:
    • Some pages are indexed with no JS execution. There are many pages that can probably be easily identified as not needing rendering, and others which are such a low priority that it isn’t worth the computing resources.
    • Some pages get immediate rendering – or possibly immediate basic/regular indexing, along with high-priority rendering. This would enable the immediate indexation of pages in news results or other QDF results, but also allow pages that rely heavily on JS to get updated indexation when the rendering completes.
    • Many pages are rendered async in a separate process/queue from both crawling and regular indexing, thereby adding the page to the index for new words and phrases found only in the JS-rendered version when rendering completes, in addition to the words and phrases found in the unrendered version indexed initially.
  • The JS rendering also, in addition to adding pages to the index:
    • May make modifications to the link graph
    • May add new URLs to the discovery/crawling queue for Googlebot
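
To make the shape of that model easier to picture, here is a deliberately toy sketch of the two-stage idea: basic indexing happens immediately, while rendering is queued separately behind a crude priority score. Every field, threshold and weight is invented for illustration; nothing here is claimed about Google’s real systems.

```javascript
// Toy model of the hypothesis above, not Google's actual pipeline.
// Every crawled page gets basic indexing from its raw HTML right away;
// only some pages are queued, by priority, for a later JS-rendering pass.
const renderQueue = [];

function indexPage(page) {
  // page = { url, rawHtml, authorityScore, looksJsReliant }: illustrative fields
  indexFromHtml(page.url, page.rawHtml); // basic indexing, no JS execution

  // Heuristic: skip rendering for pages that clearly don't need it
  // or aren't worth the computing resources.
  if (!page.looksJsReliant && page.authorityScore < 0.2) return;

  const priority = page.authorityScore + (page.looksJsReliant ? 0.5 : 0);
  renderQueue.push({ url: page.url, priority: priority });
  renderQueue.sort((a, b) => b.priority - a.priority);
}

function indexFromHtml(url, html) {
  // Placeholder: add words and phrases from the unrendered HTML to the index.
}
```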

The idea of JavaScript rendering as a distinct and separate part of the indexing pipeline is backed up by this quote from KMag, who I mentioned previously for his contributions to this HN thread (direct link) [emphasis mine]:

“I was working on the lightweight high-performance JavaScript interpretation system that sandboxed pretty much just a JS engine and a DOM implementation that we could run on every web page on the index. Most of my work was trying to improve the fidelity of the system. My code analyzed every web page in the index.

Towards the end of my time there, there was someone in Mountain View working on a heavier, higher-fidelity system that sandboxed much more of a browser, and they were trying to improve performance so they could use it on a higher percentage of the index.”

This was the situation in 2010. It seems likely that they have moved a long way towards the headless browser in all cases, but I’m skeptical about whether it would be worth their while to render every page they crawl with JavaScript given the expense of doing so and the fact that a large percentage of pages do not change substantially when you do.

My best guess is that they’re using a combination of trying to figure out the need for JavaScript execution on a given page, coupled with trust/authority metrics to decide whether (and with what priority) to render a page with JS.

Run a test, get publicity

I have a hypothesis that I would love to see someone test: That it’s possible to get a page indexed and ranking for a nonsense word contained in the served HTML, but not initially ranking for a different nonsense word added via JavaScript; then, to see the JS get indexed some period of time later and rank for both nonsense words. If you want to run that test, let me know the results — I’d be happy to publicize them.
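
If you want a starting point, the JS half of such a test page might look something like this; the nonsense words are placeholders for your own made-up strings.

```javascript
// Sketch of the proposed test. The served HTML already contains one made-up
// word (say "flurbogrammet") in a visible paragraph; this script injects a
// second made-up word so it exists only in the rendered DOM.
const jsOnlyWord = 'quazzlewonk';

const marker = document.createElement('p');
marker.textContent = 'JS-only test term: ' + jsOnlyWord;
document.body.appendChild(marker);
// Then search for each word periodically and note when (or whether) the
// JS-injected one starts ranking alongside the one served in the HTML.
```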






Sunday, May 28, 2017

7 Ways Facebook Keeps You Addicted (and how to apply the lessons to your products)

How Facebook Keeps You Addicted

In 1971 the first app went live.

It wasn’t designed for mobile and it was meant only for geeks and programmers.

That invention was designed and built by Ray Tomlinson. Today that messaging app is used by 4.3 billion people and 269 billion messages are sent every 24 hours.

You may have already guessed what that app is.

Email was the first addictive digital technology that had us checking in to our computers and then decades later our mobile phones.

One of the key reasons why it is so addictive is “operant conditioning”. It is based upon the scientific principle of variable rewards, discovered by B. F. Skinner (an early exponent of the school of behaviourism) in the 1930s while performing experiments with rats.

The secret?

Not rewarding all actions but only randomly.

Most of our emails are boring business emails and occasionally we find an enticing email that keeps us coming back for more. That’s variable reward.

That’s one way Facebook creates addiction.

Addiction is now designed “in”

Social media is no different but it has gone to another level.

In fact, addiction and keeping you hooked are now designed “into” many platforms and apps, because the apps that win are not the best products but the most addictive ones.

In a recent interview on Brain Hacking, Tristan Harris (an ex-Googler) describes how Facebook, Google and others are designing apps for addiction. They want you coming back to their product at least once a day.

But the reality is that users are spending an average of 50 minutes a day just on Facebook. This is up from 40 minutes a day just a year ago.

A tiny habit

Habits are powerful.

They are also behind behaviour change, and one of the leaders in this field is the behavioural scientist B.J. Fogg, who has been lecturing on the subject since 1997. He splits his time between Stanford University and industry work.

Fogg told Ian Leslie in a recent interview in 1843 magazine that he read the classics in the course of a master’s degree in the humanities. He says that when he read Aristotle’s “Rhetoric”, a treatise on the art of persuasion, “It just struck me….this stuff is going to be rolled out in tech one day!”

The reality now is that we are seeing soft and pervasive persuasion used on the social web.

His simple model provides an insight into how to create powerful apps and designs.

Image source: Foggmethod.com 

His recommendation?

Design for the behaviour and not the outcome. That specific behaviour could be a tiny habit. The outcome of becoming healthy is made up of many tiny, simple habits. These could include eating a healthy breakfast, walking every day and getting a good night’s sleep.

Creating a tiny habit could be as simple as:

Trigger: After I walk in the front door

Behaviour: I will hang my keys on the hook

His suggestion is then to celebrate that small habit success. That could be as simple as saying “I am awesome” or doing a happy dance.

The goal is to use daily routines to create tiny habits. Here is his format for creating a tiny habit.

Source: BJ Fogg Slideshare

Using an app is simple: checking into Facebook to see how many likes you have on your latest post, for example.

One of his students at Stanford University was Mike Krieger, who went on to co-found Instagram, where over 700 million users now share sunrises, sunsets and selfies. The concept was simple: upload a photo and add a filter.

For many, using Instagram is now a habit.

Better than cocaine

Recent research by Sang Pil Han at Arizona State University found that mobile social apps foster more dependency than cocaine or alcohol. The team discovered this by looking at the data behind the use of Facebook and the popular Korean game, Anipang.

The slot machine is a perfect example of a machine designed to hook and addict the user. Natasha Dow Schull, an anthropologist and the author of the book “Addiction by Design”, has spent 15 years of field research in Las Vegas studying solitary gambling at electronic machines.

Her findings reveal how the mechanical rhythm of electronic gambling pulls players into a trancelike state they call the “machine zone”, in which daily worries, social demands, and even bodily awareness fade away.

Losing time

Even Skinner likened his Skinner box for rats, with its variable reward, to the one-armed bandits we call slot machines. Beyond the reward, the other elements in the art of seducing the gambler into slowly emptying his pockets over hours and days include the music, the mini games and even the appearance of the spinning wheels.

Money is one thing, but time is another, and time is something you can never buy. Losing time is a worse addiction than losing money.

You can earn more money but you can never get back time.

This is how Facebook creates addiction

Building and developing a product that entices you to use it many times a day is at the heart of the Facebook marketing philosophy. It is core to their product development.

So here are some insights into human behaviour that keep us switching on and logging in.

1. Validation

As human creators and sharers we all feel the need to have our creations validated.

Not many of us are immune to the numeric quantification of attention that appears at the bottom of every post on Facebook.

Just a few “likes” and we feel like no-one cares. But get 100 and you feel like an awesome creative champion.

Recent developments on the platform mean that the streaming love hearts and likes that were initially built into Periscope are now appearing on Facebook. This burst of visual likes is programmed in to keep you hooked. It is “not” an accident.

Facebook has the resources to copy almost any feature of competitors that they feel improves their addiction tactics.

2. Variable reward

Skinner’s discovery showed that rats were more likely to become addicted when the rewards were random.

Diving into your Facebook feed reveals various pieces of content and revelations that keep us hooked. Some are boring, others enticing.

The ever-changing feedback that is the numeric quantification of content success is like a drug.

3. Fear of missing out

We all want to be part of the show, and the fear of missing out is real. It is often abbreviated as “FOMO”. Curiosity is a human condition that keeps us looking, listening and clicking on the little app icon.

There is a bit of a voyeur in all of us and the platforms feed and reward that human behaviour.

4. Sounds

Getting that sound from your phone notifications is one thing that makes most of us “check in”.

But the Facebook messenger sound that happens when you are exchanging private messages builds even more anticipation. It is intoxicating and addictive.

That design is not by accident.

This is now even appearing as a visual on your SMS and text messages. Now those little moving dots reveal that someone is typing at the other end and that one little tactic keeps us glued to our screens.

5. Vibration

Phones also provide us with alerts when on silent mode. It is that vibration in your pocket or purse.

In most cases, when you download an app it is hard “not” to activate these alerts; they are almost hardwired in.

The default is opt-out, not opt-in.

That vibration when someone likes, comments or leaves a message on one of your social media networks is an ever-present temptation.

6. Connection

At a recent social media marketing conference I bumped into a new attendee who revealed that she had found her “tribe”. Being connected to a worldwide community is part of the attraction of social media. It allows us to connect online first and then meet in person later.

Wanting to be connected is a very powerful motivation to use the social web.

The ability to find other passionate humans around the world and to join your global “passion tribe” is compelling… and addictive.

7. Investment

One of the reasons I use Facebook is to record my trips. It is where I post my mobile photos, distilling the highlights of the day in words and images. The timeline then becomes a travelogue that is, in essence, my adventure diary.

It is an investment.

The more I create and the longer I spend posting and publishing, the bigger the emotional investment. Facebook becomes your life-mapping app.

Taking control back

A digital detox is one tactic that seems to be gaining traction and attention but for me there is a simpler solution.

Turn off all alerts and notifications.

Gaining back control of your attention is necessary to get work done. Deep work and creating content of consequence is not achieved when there is constant distraction.

I am writing this with sounds, vibrations and all social media turned off. Even the email is off duty.

How to apply the lessons to your products

In his book “Hooked: How to Build Habit-Forming Products”, Nir Eyal reveals the model for building products that people love. And the products that win are the ones that get us hooked.

Here is an example of how Pinterest keeps you “hooked”:

[Image: the Hook model applied to Pinterest]

Source: BJ Fogg Slideshare

Here is the distillation of his model in 4 steps to keep your prospects and customers engaged.

  1. Create internal and external triggers that bring people to your product
  2. Get them to log-in or sign up to your resources or product
  3. Provide a variable reward that connects to the tribe, provides resources and enables personal mastery
  4. Allow them to build an investment that provides more triggers to keep them coming back.

Over at his website he has a worksheet that is worth checking out.

Over to you

Creating simple and tiny habits over time leads to big outcomes.

Designing and building digital products that bring value to people’s lives and keep them coming back is the principle behind some of the fastest-growing companies the world has ever seen.

How could you apply these principles to your products?

The post 7 Ways Facebook Keeps You Addicted (and how to apply the lessons to your products) appeared first on Jeffbullas’s Blog.


