Showing posts with label SEO News. Show all posts

6 Ways to Get More Results from SEO Right Now


As PR pros today, we are tasked with a mountain of responsibility. Not only do clients demand monthly press mentions, but many lean on their PR firms for social media and search engine optimization (SEO) advice, and in some cases execution. Unfortunately, much of the SEO advice out there is based on anecdotes and myth. Fortunately, a new search engine ranking factors study was recently released that helps shed some light on what's truly important for getting SEO results in 2016.

But since I'm not an SEO expert, I thought it would make sense to interview someone who is: Dmitry Dragilev, founder of Just Reach Out (which I covered a few articles ago). Together, we are going to walk you through the highlights of an important SEO study and show you how you can use its findings to maximize the SEO benefits your clients receive from your PR campaigns. So, without further ado... the Q&A:

Q. As a PR professional, what is the best counsel I can give my clients about SEO, especially considering that the methods/strategies change with each Google algorithm change?

A. Backlinks remain an integral part of Google's algorithm. One of the chief reasons businesses hire PR firms today is that as PR pros we are very good at generating mentions from authoritative sites. More often than not, these mentions come in the form of links.

According to the new study, backlinks were the #1 most important ranking factor it investigated. Specifically, sites that earned links from a diverse range of referring sites came out on top.

For PR pros, this means that mentions from smaller blogs or niche publications can benefit our clients' SEO. Therefore, we shouldn't necessarily shy away from a mention just because it's not on the cover of the Wall Street Journal.

Q. I used to spend a ton of time on Moz analyzing our clients' earned media power and used to put a premium on domain authority. Is that still a thing?

A. Simple answer? Yes! A website's overall link authority boosts all of its pages. The study I referenced above found that a website's overall link authority played a major role in how well each page on that site ranked. In other words, a brand-new page on an established site will tend to outrank a new page on a smaller site.

This impacts our work because we're often unable to get links pointing to specific pages on a client's website (for example, a product or service page). But rest easy - even links pointing to a client's homepage may boost the rankings for those high-priority pages.

Q. I write a ton of contributed content and typically include imagery. Is that imagery helping to move the needle on those articles?

A. Image-rich media ranks very well in search, so yes, it does make a difference. In the old days of faxing press releases to journalists, images didn't matter much. In fact, unless you earned a cover story, your client's media coverage usually didn't include an image at all!

In 2016, things are very different. It's common to see an image on almost every page on the web today. The study we're referencing here confirms that, whenever possible, it's smart to add images to your content. It found that content containing at least one image tended to rank above content without any images.

This means that PR professionals should aim to use images in more places than just contributed articles - consider adding imagery to your online press materials, including press releases and press kits.

As I mentioned up front, as PR professionals we are now often tasked with having at least a basic understanding of SEO. The data-backed SEO tips provided by Dragilev should not only help you and your agency better understand SEO, but actually improve the rankings of your clients in search.

Top 3 Tools You Should Use for Technical SEO Audits




Doing a search engine optimization (SEO) audit is no joke. It used to take time, the patience of a saint and too many spreadsheets. You may grow a white hair or two in the process. But, thanks to technical SEO audit tools, we are no longer doing those insane manual checks that we did in the past. Most SEO experts arm themselves with these tools so they’re no longer rummaging through raw data but making strategic decisions instead.

In this article I’ll share three of my go-to tools for performing a technical SEO audit: DeepCrawl (a cloud-based tool), Screaming Frog (a desktop-based tool) and Search Console (the free web-based tool from Google themselves). They all have their different strengths and use cases. Depending on your requirements, you may need to choose one -- or you may find all three useful in conjunction.

1. DeepCrawl


I really like DeepCrawl because of its flexibility and the depth of the reports it provides. When starting a crawl, you can choose from numerous crawl types, right up to a full gap analysis of your entire site. You can also auto-schedule crawls, which is really useful. Crawls are highly customizable, so you can set the criteria for maximum / minimum links per page, content, title / URL length, load time, etc.


Here are the three things I like the most about DeepCrawl:
It can easily handle crawling millions of pages, so if you need to crawl a huge site, look no further.
It provides competitor reports -- not just the basic headlines, but the real nitty-gritty details on content, pricing and promotions, site architecture and even the brand’s key influencers.
It allows you to project manage your SEO team, creating tickets for issues and assigning them to people. It also alerts you if any issues pop up consistently in crawls. Plus, it maintains a history of all changes, so you can look back and monitor performance and progress over time.

If I had to improve one thing, I’d ask for more mobile-focused reports. (Their support team told me to expect more of these in the next product update, so it looks like that’ll be solved soon anyway.)

2. Screaming Frog


When it comes to desktop crawlers, Screaming Frog is an undisputed leader. The tool has been around for quite some time now, and webmasters managing sites of all sizes swear by it. If you are crawling fewer than 500 URLs, you can even use it for free.

However, if you have a large website with over 10,000 pages, be wary, as desktop crawlers can cause server-response problems. Besides, it doesn’t come with collaboration features, which makes it far less attractive for SEO teams these days.

That said, in this price range, it’s one of the most useful crawlers, and here are the reasons it's in my top three:
It doesn’t simply die when your machine’s memory is running low; it alerts you beforehand. I especially like this feature, because this has happened to me a few times. All you need to do is save the project, increase the random access memory (RAM) allocation and restart it to continue.
Their bulk export option really makes life easier, as you can export all data, including internal links, outbound links, anchor text, image alt text, etc. (see the sketch after this list for one way to work with such an export).
There is an option to accept cookies, so you can also crawl websites that make it compulsory to accept cookies.
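
As a small illustration of that bulk export point, here is a minimal Python sketch, not a Screaming Frog feature itself but simple post-processing of a CSV you have already exported, that flags image references with no alt text. The file name and column headers ("Source", "Destination", "Alt Text") are assumptions and may differ between Screaming Frog versions, so adjust them to match your export; some versions also add a title row above the headers, which you would need to delete first.

```python
# Minimal sketch: scan an exported Screaming Frog CSV for images missing alt text.
# The file name and column names are assumptions -- check your own export.
import csv

missing_alt = []
with open("screaming_frog_images_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Treat an empty or whitespace-only "Alt Text" cell as missing
        if not (row.get("Alt Text") or "").strip():
            missing_alt.append((row.get("Source", ""), row.get("Destination", "")))

print(f"{len(missing_alt)} image references with no alt text")
for source, image in missing_alt[:20]:
    print(f"{source} -> {image}")
```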

And, though I like the tool overall, if I had to change one thing about it, I’d want them to improve the user experience to make it easier to customize crawls.

3. Google Search Console


While SEO veterans might find it funny to see this tool in the list, many SEOs are relying on it more than ever. The tool has come a long way since its earlier days and can offer a fair amount of insights. These are the three things I do love about it:
It gives you estimates on your position for a keyword, plus the number of impressions and clicks for your site on that keyword in Google search results. That may be basic, but it’s important and useful (and, as sketched after this list, you can pull the same numbers programmatically).
It gives a good summary of things that matter -- things like broken links, number of pages indexed, correctness of HTML markup, page loading speed, etc.
It’s free -- and it comes from the horse's mouth! (OK, that’s two things, but they’re both major plus points.)
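
If you prefer pulling those clicks, impressions and position figures out programmatically rather than through the web interface, here is a minimal Python sketch against the Search Console (Webmasters v3) API. It assumes you have enabled the API in a Google Cloud project and created a service account that has been added as a user on the verified property; the credentials file name, site URL and date range below are placeholders.

```python
# Minimal sketch: query Search Console search analytics for clicks,
# impressions and average position per keyword.
# Assumes google-api-python-client and google-auth are installed, and that
# "service-account.json" (placeholder) has access to the verified property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)

service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # placeholder: your verified property
    body={
        "startDate": "2016-01-01",
        "endDate": "2016-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    keyword = row["keys"][0]
    print(f"{keyword}: {row['clicks']} clicks, "
          f"{row['impressions']} impressions, position {row['position']:.1f}")
```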

The only thing I don’t like about Search Console is that it doesn’t always give a complete picture.

Remember, these tools may not be the best fit for your specific needs. All three have their particular unique selling points and solve specific pain points well. You should review them all and choose the one that’s right for you. The main things to consider are the size of your website, the volume of new pages you generate and the kind of insights you are looking for.

3 SEO Tips to Help Your PR Content Rank Highly With Google's Latest Update


Google’s Panda 4.2 algorithm rollout, which began over the weekend, continues Google’s commitment to rewarding high-quality, user-focused content with higher rankings.
Here are three easy tips for optimizing your PR content for better results:

1. Go long. 

Many PR pros got used to writing short content because press release services charged a surcharge if copy ran over 400 words. Google now scans short content and assumes it doesn’t contain much useful information. The result is that shorter content doesn’t rank as well.
It’s now better to “go long,” says SEO-PR Chief Executive Greg Jarboe, citing a favorite football phrase. “Go ahead and write 600 to 800 words—whether for a press release, blog post or summary of a white paper behind a paywall.”

2. Use synonyms. 

Google will penalize you for keyword stuffing, which is using the same search term repeatedly in copy. It rewards what has always been a good writing practice: the use of synonyms.
“If you’re going to use one word or phrase in the first sentence, use a variation in the second sentence or graph,” Jarboe says. “Google understands what synonyms are and rewards them as good writing that’s more likely to be of interest to readers. The result is a higher ranking.”

3. Get visual. 

You’re missing a huge SEO opportunity if you aren’t adding photos and videos to your content. Over 55 percent of Google search results now include videos and over 40 percent include photos, according to Jarboe.
The most important element to optimize in photos is your caption. Make sure to plant your two- or three-word key phrase at the beginning of your caption copy, Jarboe says. “The same rule applies to YouTube titles. They can only be 100 characters long, so make sure your key phrase is at least in the front third of your title.”
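For those who like to sanity-check this as part of their workflow, here is a tiny Python sketch of Jarboe’s rule of thumb (the 100-character limit and front-third placement); the example title and key phrase are placeholders.

```python
# Tiny sketch of the "front third" rule for YouTube titles.
# The title and key phrase below are placeholders.
def check_title(title: str, key_phrase: str) -> None:
    limit = 100
    front_third = title[: len(title) // 3]
    print(f"Within {limit}-character limit: {len(title) <= limit} ({len(title)} chars)")
    print(f"Key phrase in front third: {key_phrase.lower() in front_third.lower()}")

check_title(
    "SEO Tips for PR Pros: How to Optimize a Press Release for Google",
    "SEO tips",
)
```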
Jarboe’s big SEO takeaway for PR pros is that your normal writing skills are more of an asset now than ever before, because Google rewards well-written content. So, stop writing for what you thought Google wanted—and start writing well again.

After Google's updates: how to judge the quality of a link


When you build links to your website, you want lasting results. It does not make sense to invest your time in search engine optimization methods that are just a flash in the pan.

Google's recent algorithm updates have shown that your website can get in major trouble if it has the wrong type of backlinks. How do you judge the quality of a website? What makes a good website, and from which web pages should you get links?

1. All automatic backlinks are bad


All backlinks that have been created automatically have no positive influence on the rankings of your website. If these automatically created backlinks use the rel=nofollow attribute, there's nothing you have to worry about.
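
You can spot-check a single page by viewing its source, but for more than a handful of pages a script helps. Here is a minimal Python sketch (using the requests and beautifulsoup4 packages) that lists the links pointing at your domain on a given page and whether they carry rel="nofollow"; the domain and page URL are placeholders.

```python
# Minimal sketch: list links to your domain on a page and flag nofollow.
# MY_DOMAIN and page_url are placeholders -- replace with real values.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"
page_url = "https://someblog.example/post-that-links-to-you/"

html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    if MY_DOMAIN in link["href"]:
        rel = link.get("rel") or []   # BeautifulSoup returns rel as a list
        status = "nofollow (passes no ranking value)" if "nofollow" in rel else "followed"
        print(f'{link["href"]} -> {status}')
```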

If you used tools that automatically created backlinks in bulk for you, you should try to get rid of these backlinks to avoid a penalty. The link disinfection tool in SEOprofiler can help you to get rid of bad backlinks.

2. Google PageRank and other metrics are not important when you build links


Many webmasters only want to get backlinks from pages with a particular PageRank. While you can use this method, it is usually a waste of time and it makes link building more difficult than it needs to be.

If a website is of high overall quality, then it does not matter if the page with the link to your website has a low Google PageRank:

If a high-quality website adds a new page, the new page will have an initial PageRank of zero. Nevertheless, the page can still be very good.

A page that has a PageRank of zero today can have a high PageRank tomorrow.

If only pages with a high PageRank had a chance, it wouldn't be possible to get new pages into Google's results pages. Experience shows that new pages appear in Google's results every day.

In addition, the PageRank that Google publicly displays is not the actual PageRank that Google uses in its algorithm, and the PageRank value can be manipulated.

3. You will get lasting results if you use your common sense


You do not need special metrics to judge the quality of a web page. When you find a web page that could link to your site, ask yourself the following questions:

Does the linking page look good to the average web surfer?
Does the page have interesting content?
Is the content somewhat related to your website?
Does it make sense for the web page to link to your site?

If you can answer all questions with "yes," then you should try to get a backlink from that page. It doesn't matter if that page has a low PageRank.

Google tries to imitate common sense with its algorithms. If you use common sense to build your links and follow the tips above, you make sure that the backlinks to your website will count in all future updates of Google's algorithm.

If you want high-quality links, you can contact us via our website: Djinn Web Solution

How to Find Out Which of Your Blog Posts are Not Indexed by Google


Google doesn’t index all your content.
Do you know which pages on your blog are not indexed on Google?
Do you know which pages on your website are indexed but shouldn’t be?
In this post, we take you through the process of figuring out which posts are not indexed by Google.   There is no simple way of doing this but, if you follow the process, you can do it.  I’m geeking out a bit because it’s quite technical!

1.  Check your sitemap

The sitemap tells Google what to index.  Google may crawl through your site and find extra pages to index, but a good starting point is to find out what you are telling it to index.
If you’re using a WordPress tool like Yoast WordPress SEO, this will build the sitemap for you.  You’ll probably have a sitemap for your pages and a sitemap for your posts.
The following example shows that we’re telling Google there are 216 posts to index.

Highlight all this information and copy it into a Google spreadsheet. When you have it in the spreadsheet, remove any columns that are not the URLs of your posts.
Note:  If all your blog content is in /blog, then it’s going to be easy to filter out just the blog posts.  If your blog posts and pages are all in one directory then you’re better off comparing all pages/posts to see which ones are not indexed.
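
If copying from the browser is fiddly, here is a minimal Python sketch (standard library only) that pulls the URLs out of a post sitemap and writes them to a CSV you can paste into the spreadsheet. The sitemap URL is a placeholder; Yoast-style installs typically expose something like /post-sitemap.xml, but check yours.

```python
# Minimal sketch: extract post URLs from an XML sitemap into a CSV.
# SITEMAP_URL is a placeholder -- point it at your own post sitemap.
import csv
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/post-sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Each post URL sits inside a <loc> element in the sitemap namespace
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs found in the sitemap")

with open("sitemap_urls.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for url in urls:
        writer.writerow([url])
```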

2.  Check Google Webmaster Tools

Google Webmaster Tools will show you how many of your posts have been indexed by Google.  I’ve never seen this at 100% but you do want to see the majority of your posts indexed.
This example shows a gap where there are a good proportion of posts that are not indexed:

3.  Get Your Google Listings

Now you want to go to Google and find out what it has indexed.  The ‘site’ command in Google can list out the posts that are indexed by Google.  It’s not 100% accurate, so there could be other posts that are indexed that are not on that list, but it will be pretty close to the truth.
Before you run the site command, you need to change the settings in Google so it displays 100 results on a page at a time instead of 10.  We are going to extract the contents of Google search and it’s easier to extract them in bigger batches rather than 10 at a time.
To temporarily change your results so it displays a listing of 100 web pages at a time, select the ‘settings’ option at the very bottom right of the Google.com page.  Then, select the ‘search settings’.

On this screen, adjust the dial so that Google will display 100 results at a time instead of 10.

Next, you need to install a bookmarklet in your browser that will help you to extract just the page names from the results.  Go to this page and drag the bookmarklet to your browser bar ->  Here
In Google, type in site:”URL of your website”, without the quotation marks (e.g. site:www.razorsocial.com).  This will show you up to 100 results of pages indexed on your site.  When you have that listing, click on the bookmarklet you installed.  This will extract only the web addresses from the Google listing!

When you get this listing, copy it to the same spreadsheet where you have the list of pages from the sitemap.  Repeat the above until you have all your pages in the spreadsheet.

4.  Compare the Results

You should now have two listings of web addresses in your spreadsheet.  The first column is what you tell Google to index and the second is what Google has actually indexed!
Go through the list and pick out the posts that are in the sitemap but not the Google listings.
From this list, go to Google and search for these posts; even if they are not found using the ‘site’ command, they may still be indexed.
If you have a very long list of posts, you won’t be able to compare the list manually.  Because of this, you’ll need to figure out a good Excel formula that extracts pages where there isn’t a direct match in both columns.  If anyone wants to share in the comments how to do this, feel free.
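In Excel or Google Sheets, one option is a COUNTIF check such as =IF(COUNTIF(B:B,A2)=0,"not indexed","") dragged down alongside column A. If you would rather script it, here is a minimal Python sketch that does the same comparison as a set difference; the two CSV file names are placeholders for however you exported the two columns, and it trims trailing slashes so near-identical URLs still match.

```python
# Minimal sketch: report sitemap URLs that did not appear in the Google
# "site:" listing. The CSV file names are placeholders.
import csv

def read_urls(path):
    """Read a one-column CSV of URLs into a normalised set."""
    with open(path, newline="") as f:
        return {row[0].strip().rstrip("/")
                for row in csv.reader(f) if row and row[0].strip()}

sitemap_urls = read_urls("sitemap_urls.csv")    # what you tell Google to index
indexed_urls = read_urls("google_listing.csv")  # what the site: command returned

not_indexed = sorted(sitemap_urls - indexed_urls)
print(f"{len(not_indexed)} sitemap URLs not found in the Google listing:")
for url in not_indexed:
    print(url)
```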

What to do with Your Results

When you have a list of pages that are not indexed, there are a couple of things to consider:
a) If it is a poor-quality post that is not offering any value, then delete it.
b) If it’s a good post that should be indexed, then link to it from other posts on your site.  This will help Google pick it up.

Summary

I had hoped there was an easier way to figure out what is not indexed by Google, but this was the simplest solution. If you know of another way, please share!
By going through the process above, you do learn more about your site and you’ll probably identify other issues that are worth considering.
After tidying up my sitemap, I now have only three posts that are not indexed!

I would love to hear your thoughts/comments below!

Nearly A Year Later, Are We Finally Going To Get A Penguin Update Refresh?

Google expects Penguin 3.0 to launch before the end of 2014.


Twenty-two days from now, it will be the one-year anniversary of the release of the fifth update to Google’s Penguin algorithm, code-named Penguin 2.1. As you can imagine, waiting 11 months and 8 days can be excruciating for those who were negatively impacted by the punitive algorithm. They’ve likely cleaned up their links by now, but they are still sitting and hoping their businesses survive long enough to see an improvement when Google unleashes the next Penguin update.
Well, it might not be too far off. Google’s John Mueller said on a Google Hangout this morning that he expects Penguin 3.0, which would be the sixth refresh, to happen within 2014. In fact, John said, “I am pretty confident we will have something in the reasonable future.”
What is the “reasonable future”? At this point, it is not reasonable to go 11+ months without a refresh, but if I had to guess, I’d hope to see one before the anniversary. I think that would be reasonable at this point in time. But your guess is as good as mine.
We thought we saw Penguin updates before, but Google told us it was not a Penguin update. I definitely believe Google has been testing Penguin updates in the live index but again, they have not fully released it yet.
Penguin 3.0 is expected to be a major update, making the algorithm capable of running more frequently so that those impacted wouldn’t have to wait as long between refreshes, much like how Panda now runs monthly.
Google has told us their efforts to update the Penguin algorithm have been met with many challenges. But it seems like we are finally getting to a point where we will see a refresh of Penguin really soon.
As soon as we hear official word from Google on a release date and time, we will be the first to report it to you.
For more on Penguin, see our Penguin update category.