Showing posts with label SEO News. Show all posts

6 Ways to Get More Results from SEO Right Now


As PR pros today, we are tasked with a mountain of responsibility. Not only do clients demand monthly press mentions, but many also lean on their PR firms for social media and search engine optimization (SEO) advice, and in some cases execution. Unfortunately, much of the SEO advice out there is based on anecdote and myth. Fortunately, a new search engine ranking factors study was recently released that sheds some light on what's truly important for getting SEO results in 2016.

But since I'm not an SEO expert, I thought it would make sense to interview someone who is: Dmitry Dragilev, founder of Just Reach Out (which I covered a few articles ago). Together, we'll walk you through the highlights of an important SEO study and show you how to use its findings to maximize the SEO benefit your clients receive from your PR campaigns. So, without further ado, the Q&A:

Q. As a PR professional, what is the best counsel I can give my clients about SEO, especially considering that methods and strategies change with each Google algorithm update?

A. Backlinks remain an integral part of Google's algorithm. One of the chief reasons businesses hire PR firms today is that as PR pros we are very good at generating mentions from authoritative sites. More often than not, these mentions come in the form of links.

According to the new study, backlinks were the #1 most important ranking factor investigated. Specifically, sites that earned links from a diverse range of domains came out on top.

For PR pros, this means that mentions from smaller blogs or niche publications can still benefit our clients' SEO. Therefore, we shouldn't shy away from a mention just because it isn't on the cover of the Wall Street Journal.

Q. I used to spend a ton of time on Moz analyzing our clients' earned media power, and I used to put a premium on domain authority. Is that still a thing?

A. Simple answer? Yes! A website's overall link authority boosts all of its pages. The study referenced above found that a website's overall link authority played a major role in how well each page on that site ranked. In other words, a brand new page on an established site will tend to outrank a new page on a smaller site.

This impacts our work because we're often unable to get links pointing to specific pages on a client's website (for example, a product or service page). But rest easy - even links pointing to a client's homepage may boost the rankings for those high-priority pages.

Q. I write a ton of contributed content and typically include imagery. Is that imagery helping to move the needle on those articles?

A. Image-rich media ranks very well in search, so yes it does make a difference. In the old days of faxing press releases to journalists, images didn't matter much. In fact, unless you earned a cover story, your client's media coverage usually didn't include an image at all!

In 2016, things are very different. It's common to see an image on almost every page of the web. The study we're referencing confirms that, whenever possible, it's smart to add images to your content: content that contained at least one image tended to rank above content without any images.

This means that PR professionals should aim to use images in more areas than just with contributed articles - consider adding imagery to your online press materials, including press releases and press kits.

As I mentioned up front, as PR professionals we are now often tasked with having at least a basic understanding of SEO. The data-backed SEO tips provided by Dragilev should not only help you and your agency better understand SEO, but actually improve the rankings of your clients in search.

Top 3 Tools You Should Use for Technical SEO Audits




Doing a search engine optimization (SEO) audit is no joke. It used to take time, the patience of a saint and too many spreadsheets. You may grow a white hair or two in the process. But, thanks to technical SEO audit tools, we are no longer doing those insane manual checks that we did in the past. Most SEO experts arm themselves with these tools so they’re no longer rummaging through raw data but making strategic decisions instead.

In this article I’ll share three of my go-to tools for performing a technical SEO audit: DeepCrawl (a cloud-based tool), Screaming Frog (a desktop-based tool) and Search Console (the free web-based tool from Google themselves). They all have their different strengths and use cases. Depending on your requirements, you may need to choose one -- or you may find all three useful in conjunction.

1. DeepCrawl


I really like DeepCrawl because of its flexibility and the depth of the reports it provides. When starting a crawl, you can choose from numerous crawl types, right up to a full gap analysis of your entire site. You can also auto-schedule crawls, which is really useful. Crawls are highly customizable, so you can set criteria for maximum/minimum links per page, content, title/URL length, load time and more.


Here are the three things I like the most about DeepCrawl:
It can easily handle crawling millions of pages, so if you need to crawl a huge site, look no further.
It provides competitor reports -- not just the basic headlines, but the real nitty-gritty details on content, pricing and promotions, site architecture and even the brand’s key influencers.
It allows you to project manage your SEO team, creating tickets for issues and assigning them to people. It also alerts you if any issues pop up consistently in crawls. Plus, it maintains a history of all changes, so you can look back and monitor performance and progress over time.

If I had to improve one thing, I'd ask for more mobile-focused reports. (Their support team told me to expect more of these in the next product update, so it looks like that will be addressed soon.)

2. Screaming Frog


When it comes to desktop crawlers, Screaming Frog is an undisputed leader. The tool has been around for quite some time now, and webmasters managing sites of all sizes swear by it. If you are looking at crawling less than 500 URLs, you can even use it for free.

However, if you have a large website with over 10,000 pages, be wary, as desktop crawlers can cause server-response problems. It also doesn’t come with collaboration features, which makes it far less attractive for SEO teams these days.

That said, in this price range, it’s one of the most useful crawlers, and here are the reasons it's in my top three:
It alerts you before memory runs low instead of simply dying off. I especially like this feature because low memory has happened to me a few times. All you need to do is save the project, increase the available random access memory (RAM), and restart to continue.
Their bulk export option really makes life easier, as you can export all data, including internal links, outbound links, anchor text, image alt text, etc.
There is an option to accept cookies, so you can also crawl websites that make it compulsory to accept cookies.

And, though I like the tool overall, if I had to change one thing about it, I’d want them to improve the user experience to make it easier to customize crawls.

3. Google Search Console


While SEO veterans might find it funny to see this tool on the list, many SEOs are relying on it more than ever. The tool has come a long way since its early days and offers a fair amount of insight. These are the three things I love about it:
It gives you estimates on your position for a keyword, plus the number of impressions and clicks for your site on that keyword in Google search results. That may be basic, but it’s important and useful.
It gives a good summary of things that matter -- things like broken links, number of pages indexed, correctness of HTML markup, page loading speed, etc.
It’s free -- and it comes from the horse's mouth! (OK, that’s two things, but they’re both major plus points.)

The only thing I don’t like about Search Console is that it doesn’t always give a complete picture.

Remember, these tools may not be the best fit for your specific needs. All three have their particular unique selling points and solve specific pain points well. You should review them all and choose the one that’s right for you. The main things to consider are the size of your website, the volume of new pages you generate and the kind of insights you are looking for.

3 SEO Tips to Help Your PR Content Rank Highly With Google's Latest Update


Google’s Panda 4.2 algorithm rollout, which began over the weekend, continues Google’s commitment to rewarding high-quality, user-focused content with higher rankings.
Here are three easy tips for optimizing your PR content for better results:

1. Go long. 

Many PR pros got used to writing short content because press release services charged a surcharge if copy ran over 400 words. Google now scans short content and assumes it doesn’t contain much useful information. The result is that shorter content doesn’t rank as well.
It’s now better to “go long,” says SEO-PR Chief Executive Greg Jarboe, citing a favorite football phrase. “Go ahead and write 600 to 800 words—whether for a press release, blog post or summary of a white paper behind a paywall.”

2. Use synonyms. 

Google will penalize you for keyword stuffing, which is using the same search term repeatedly in copy. It rewards what has always been a good writing practice: the use of synonyms.
“If you’re going to use one word or phrase in the first sentence, use a variation in the second sentence or graph,” Jarboe says. “Google understands what synonyms are and rewards them as good writing that’s more likely to be of interest to readers. The result is a higher ranking.”

3. Get visual. 

You’re missing a huge SEO opportunity if you aren’t adding photos and videos to your content. Over 55 percent of Google search results now include videos and over 40 percent include photos, according to Jarboe.
The most important element to optimize in photos is your caption. Make sure to plant your two- or three-word key phrase at the beginning of your caption copy, Jarboe says. “The same rule applies to YouTube titles. They can only be 100 characters long, so make sure your key phrase is at least in the front third of your title.”
Jarboe’s big SEO takeaway for PR pros is that your normal writing skills are more of an asset now than ever before, because Google rewards well-written content. So, stop writing for what you thought Google wanted—and start writing well again.

After Google's updates: how to judge the quality of a link


When you build links to your website, you want lasting results. It does not make sense to invest your time in search engine optimization methods that are just a flash-in-the-pan.

Google's recent algorithm updates have shown that your website can get into major trouble if it has the wrong type of backlinks. So how do you judge the quality of a website? What makes a good website, and from which web pages should you get links?

1. All automatic backlinks are bad


All backlinks that have been created automatically have no positive influence on the rankings of your website. If these automatically created backlinks use the rel=nofollow attribute, there's nothing you have to worry about.

If you used tools that automatically created backlinks in bulk for you, you should try to get rid of these backlinks to avoid a penalty. The link disinfection tool in SEOprofiler can help you to get rid of bad backlinks.
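Since automatically created links that carry rel=nofollow are nothing to worry about, a quick first check on any backlink is its rel attribute. Below is a minimal sketch using only Python's standard library; the LinkAudit class name is mine, and you would pass in page HTML you've already fetched however you like:

```python
# Sketch: list the links on a page and whether each carries rel="nofollow".
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # (href, is_nofollow) pairs, in document order

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        rel = (a.get("rel") or "").lower()
        self.links.append((a.get("href"), "nofollow" in rel))

def audit_links(html):
    """Return (href, is_nofollow) for every <a> tag in the HTML."""
    parser = LinkAudit()
    parser.feed(html)
    return parser.links
```

Run this over the pages that link to you and any link flagged as nofollow can simply be ignored; the rest are the ones worth judging for quality.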

2. Google PageRank and other metrics are not important when you build links


Many webmasters only want to get backlinks from pages with a particular PageRank. While you can use this method, it is usually a waste of time and it makes link building more difficult than it really is.

If a website has overall high quality, then it does not matter if the page with the link to your website has a low Google PageRank:

If a high-quality website adds a new page, the new page will have an initial PageRank of zero. Nevertheless, the page can still be very good.

A page that has a PageRank of zero today can have a high PageRank tomorrow.

If only pages with a high PageRank had a chance, it wouldn't be possible for new pages to appear in Google's results. Experience shows that new pages appear in Google's results every day.

In addition, the PageRank that Google publicly displays is not the actual PageRank that Google uses in its algorithm, and the PageRank value can be manipulated.

3. You will get lasting results if you use your common sense


You do not need special metrics to judge the quality of a web page. When you find a web page that could link to your site, ask yourself the following questions:

Does the linking page look good to the average web surfer?
Does the page have interesting content?
Is the content somewhat related to your website?
Does it make sense for the page to link to your site?

If you can answer all questions with "yes," then you should try to get a backlink from that page. It doesn't matter if that page has a low PageRank.

Google tries to imitate common sense with its algorithms. If you use common sense to build your links and follow the tips above, you can be confident that the backlinks to your website will count in future updates of Google's algorithm.

If you want high-quality links, you can contact us through our website: Djinn Web Solution.

How to Find Out Which of Your Blog Posts are Not Indexed by Google


Google doesn’t index all your content.
Do you know which pages on your blog are not indexed on Google?
Do you know which pages on your website are indexed but shouldn’t be?
In this post, we take you through the process of figuring out which posts are not indexed by Google.   There is no simple way of doing this but, if you follow the process, you can do it.  I’m geeking out a bit because it’s quite technical!

1.  Check your sitemap

The sitemap tells Google what to index.  Google may crawl through your site and find extra pages to index, but a good starting point is to find out what you are telling it to index.
If you’re using a WordPress tool like Yoast WordPress SEO, this will build the sitemap for you.  You’ll probably have a sitemap for your pages and a sitemap for your posts.
The following example shows that we’re telling Google there are 216 posts to index.

Highlight all this information and copy it into a spreadsheet in Google. When you have it in a spreadsheet, remove any columns that are not the URLs of your posts.
Note:  If all your blog content is in /blog, then it’s going to be easy to filter out just the blog posts.  If your blog posts and pages are all in one directory then you’re better off comparing all pages/posts to see which ones are not indexed.
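If you'd rather skip the copy-and-paste step, a few lines of Python can pull the URLs straight out of the sitemap XML. This is a minimal sketch assuming a standard-format sitemap (the kind Yoast generates); the function names and the namespace constant are mine, not part of any tool mentioned here:

```python
# Sketch: pull every <loc> URL out of a standard XML sitemap.
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace (the xmlns declared on <urlset>).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL found in sitemap XML, in order."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

def fetch_sitemap_urls(sitemap_url):
    """Download a sitemap and return its URLs (requires network access)."""
    with urllib.request.urlopen(sitemap_url) as resp:
        return sitemap_urls(resp.read())
```

Paste the resulting list into the first column of your spreadsheet, or keep it in Python for the comparison step later in this process.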

2.  Check Google Webmaster Tools

Google Webmaster Tools will show you how many of your posts were indexed by Google.  I’ve never seen this at 100%, but you do want to see the majority of your posts indexed.
This example shows a gap where there are a good proportion of posts that are not indexed:

3.  Get Your Google Listings

Now you want to go to Google and find out what it has indexed.  The ‘site’ command in Google can list out the posts that are indexed by Google.  It’s not 100% accurate, so there could be other posts that are indexed that are not on that list, but it will be pretty close to the truth.
Before you run the site command, you need to change the settings in Google so it displays 100 results on a page at a time instead of 10.  We are going to extract the contents of Google search and it’s easier to extract them in bigger batches rather than 10 at a time.
To temporarily change your results so it displays a listing of 100 web pages at a time, select the ‘settings’ option at the very bottom right of the Google.com page.  Then, select the ‘search settings’.

On this screen, adjust the dial so that Google will display 100 results at a time instead of 10.

Next, you need to install a bookmarklet in your browser that will help you extract just the page names from the results.  Go to this page and drag the bookmarklet to your browser bar.
In Google, type in site:”URL of your website”, without the quotation marks (e.g. site:www.razorsocial.com).  This will show you up to 100 results of pages indexed on your site.  When you have that listing, click on the bookmarklet you installed.  This will extract only the web addresses from the Google listing!

When you get this listing, copy it to the same spreadsheet where you have the list of pages from the sitemap.  Repeat the above until you have all your pages in the spreadsheet.

4.  Compare the Results

You should now have two listings of web addresses in your spreadsheet.  The first column is what you tell Google to index and the second is what Google has actually indexed!
Go through the list and pick out the posts that are in the sitemap but not the Google listings.
From this list, go to Google and search for these posts; even if they are not found using the ‘site’ command, they may still be indexed.
If you have a very long list of posts, you won’t be able to compare the list manually.  Because of this, you’ll need to figure out a good Excel formula that extracts pages where there isn’t a direct match in both columns.  If anyone wants to share in the comments how to do this, feel free.
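If spreadsheet formulas aren't your thing, the same comparison takes a couple of lines of Python. A sketch, assuming you've exported each column to a plain list of URLs; the trailing-slash normalization is my own addition, since Google and your sitemap don't always agree on it:

```python
# Sketch: sitemap URLs that never showed up in the Google 'site:' listing.
def not_indexed(sitemap_urls, indexed_urls):
    """Return sitemap URLs missing from the indexed list, preserving order."""
    # Normalize trailing slashes so near-identical URLs still match.
    norm = lambda u: u.strip().rstrip("/")
    indexed = {norm(u) for u in indexed_urls}
    return [u for u in sitemap_urls if norm(u) not in indexed]
```

If you'd rather stay in the spreadsheet, a COUNTIF-based formula along the lines of =IF(COUNTIF(B:B,A2)=0,A2,"") in a third column should do the same job.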

What to do with Your Results

When you have a list of pages that are not indexed, there are a couple of things to consider:
a) If it is a poor quality post that is not offering any value then delete it.
b) If it’s a good post that should be indexed, then link to it from other posts on your site.  This will help Google pick it up.

Summary

I had hoped there was an easier way to figure out what is not indexed by Google, but this was the simplest solution. If you know of another way, please share!
By going through the process above, you do learn more about your site and you’ll probably identify other issues that are worth considering.
After tidying up my sitemap, there are now only three posts not indexed!

I would love to hear your thoughts/comments below!

Nearly A Year Later, Are We Finally Going To Get A Penguin Update Refresh?

Google expects Penguin 3.0 to launch within the 2014 year.


In 22 days, it will be the one-year anniversary of the release of the fifth update to Google’s Penguin algorithm, code-named Penguin 2.1. As you can imagine, waiting 11 months and 8 days can be excruciating for those who were negatively impacted by the punitive algorithm. By now, they’ve likely cleaned up their links, but they are still sitting and hoping their businesses survive to see an improvement when Google unleashes the next Penguin update.
Well, it might not be too far off. Google’s John Mueller said on a Google Hangout this morning that he expects Penguin 3.0, which would be the 6th refresh, to happen within the 2014 year. In fact, John said “I am pretty confident we will have something in the reasonable future.”
What is the “reasonable future”? At this point, it is not reasonable to go 11+ months without a refresh, but if I had to guess, I’d hope to see one before the anniversary. I think that would be reasonable at this point in time. But your guess is as good as mine.
We thought we saw Penguin updates before, but Google told us it was not a Penguin update. I definitely believe Google has been testing Penguin updates in the live index but again, they have not fully released it yet.
Penguin 3.0 is expected to be a major update, making the algorithm capable of running more frequently so that those impacted won’t have to wait as long to see a refresh, much like how Panda now runs monthly.
Google has told us their efforts to update the Penguin algorithm have been met with many challenges. But it seems like we are finally getting to a point where we will see a refresh of Penguin really soon.
As soon as we hear official word from Google on a release date and time, we will be the first to report it to you.
For more on Penguin, see our Penguin update category.

3 Simple Ways Email Can Drive SEO Results


The connection between email and SEO is not an obvious one. Of course, Google isn’t crawling and ranking your individual emails, and having a large list of subscribers in your MailChimp account doesn’t exactly make you rank any higher for your favorite terms.
However, email is a useful tool for keeping an audience engaged and “wielding” traffic in a way that no other marketing channel allows for. Using email as a tool to help or enhance a search engine optimization initiative can be extremely effective if done well; and today, we’ll explore three under-utilized strategies for turning email marketing into an SEO booster.

1. Encourage Engagement With Email

You can’t send out 30 emails about “marketing agency services in NYC” and hope to rank any higher for that term. However, if you have a campaign of helpful marketing resources, you might attract searchers for various marketing-related terms.
Email might not help you rank in and of itself, but an email incentivizing comments and sharing can help “move the needle” on the factors that Google wants to see in the first place.
You might, for example, have recently written a great article about lead generation from Facebook. Though your blog might get some organic traffic, an email to your list about how they can use these lead generation strategies in their business might bump the number of views (and therefore shares, tweets, and comments) that your article receives.
You might enhance this further by encouraging an explicit “engagement” call-to-action within your email:
  • You could ask readers to leave a comment at the bottom of your blog about how they might use your insights in their business
  • You could write about a non-profit project that your company is involved with, and encourage people to share the post in order to spread a good cause
  • You could drive people explicitly to your social channels (like Twitter, Facebook or YouTube) to encourage them to comment on those channels, not just your blog
It might make sense to use surveys or past activity to segment your email list, singling out people who are active with comments and social sharing.

2. Drive To RSS & To Regular Content Consumption

While encouraging comments and engagement can certainly be fruitful for SEO, giving subscribers other ways to get “hooked” on your content is important, as well. You don’t want to have to rely on using your email list to drive your monthly page view count up.
The ideal would be for email to not only drive sales, but also — over time — encourage more and more subscribers to stay connected to your content in other ways that wean them off of needing email reminders. Here are some examples:
  • You can link to blog posts in your email, but also make it clear that people who love your content should connect via RSS
  • On your site, you might give people the ability to subscribe to a separate email list dedicated specifically to delivering weekly or daily updates from a particular section of your blog (for a great example of this, see HubSpot)
  • You might tell your subscribers that if they love your videos, they should subscribe on YouTube, or that if they love your pictures, they should stay connected on Instagram
Having these other “content hooks” means that your regular email activity helps keep people connected to your fresh, rotating content time and time again, whether you explicitly drive your subscribers to that content or not.
How do you think your search rankings would improve if half of your email subscribers were also following you on Twitter and Facebook? How many more views and blog comments do you think you’d be getting now if you had ten times as many people subscribed via RSS or email to get your latest and greatest blog posts?

3. Newsletter Content Can Be Saved



Lastly, email takes time to write. If you’re doing it well, you’re sending out thoughtful and useful content to your readers. It’s a bit of a shame that this wonderful newsletter content doesn’t “register” with Google or have any lasting impact on your rankings. Or does it?
Great email newsletter content can and should be reused and repurposed as blog content, and this can be done in a number of simple ways:
  • If your newsletters are long, you can turn them into stand-alone blog posts by simply taking the same content, inserting relevant links, and getting it live on your blog… potentially allowing you to get that same message out via sharing on Facebook and other social media channels
  • If your newsletters are short, you can send a number of related newsletters, and then “bundle” the content together, smooth it out, and turn over three or four emails together as one blog post
This is a win-win because it allows non-subscribers to glean your insights and read your messages, and also because it gives you more quality content to rank for, instead of leaving it tied up only in inboxes. We’ve actually done this with some of our own marketing news-related blogs posts, and it certainly beats having to do the “heavy lifting” of writing up the story a second time.

Final Thoughts

Who says that email can’t help drive SEO results?! All three of the strategies above can leverage your email list in a way that can help boost your search engine rankings.
A smart next step would involve determining what SEO initiatives are most important for your company, and determining a way that you could use email on a weekly or monthly basis to help bolster those SEO initiatives with some of the strategies laid out here.
Be well, and happy marketing!

Google Warns Local Businesses: You Have 3 Weeks to Save Your Places Listing

If you have your business set up on Google local pages, there are a couple changes that you should be aware of when it comes to verifying and updating your Google+ business listing.

You Have 3 Weeks to Save Your Google Places Listing

The most important change is that some business owners are being required to update and save their Google Places listings. Some business owners were confused about whether this email notification was spam; however, it is indeed coming from Google.

Affected users will have three weeks to save and confirm their Google Places listings. Jade Wang, Google Business Community Manager, explained:
We are making some changes to Google Places for Business and Google Maps so we can continue providing people with the best experience when they're looking for local businesses. As part of this process, we're asking business owners to review and confirm some of the information in their Google Places accounts so we can keep showing it to Google users. We know this will be a few extra steps for merchants, and we apologize for any inconvenience and thank you for your time.
We have sent business owners affected by these changes an email entitled "Action Required: You have 3 weeks to save your Google Places Listing".
If you received this email, don't worry. You simply need to login to Places for Business, review your business information, update it if necessary, and click Submit.
You'll need to do this for all listings in your account by February 21 to stay on Google Maps. Otherwise, you'll need to add your business information and undergo PIN verification using Google Places again.

Change to Google+ Local Page Setup

The second change is designed to make it a bit easier for new business owners when setting up their local listing.
When a business owner first creates a Google+ local page, they are required to verify their listing using PIN verification. Google is now making a change whereby business owners will be able to see their Google+ listing features prior to verification.
Important: business owners will still be required to complete the PIN verification before the listing will show up to all users, but it will make it easier to access some of the features while they wait. Business owners must remember that even though they can see the listing, that doesn't mean anybody else can see it.
Verification can be completed by entering the PIN received either via a postcard sent to the business address or via a phone call to your business number. If someone else has already verified the listing, such as an employee who is no longer with the company, you can simply go through the verification process again.
If the listing was created previously, you'll still need to use PIN verification in order to update it and take admin control.
Wang explained:
If you're creating a listing in the new Places for Business dashboard, now, you won't have to wait to complete PIN verification before you can see the +page, for most businesses. Just follow the link from your dashboard to see the new page. You will be able to use Google+ social features on this unverified page, but please note -- you still need to complete PIN verification before the page will start showing up in Google Maps and across other Google properties.
If you've got an unverified local Google+ page (made using Google+ in the local business/place category), then we still encourage you to PIN verify this page so that it can start appearing in Google Maps and across other Google properties.
If you're creating a local Google+ page (using Google+ selecting the local business/place category) for a business that we think is already in Google Maps, then you may need to go through both PIN verification and our admin request flow before you can manage the page.

Google Penalizing Sites for Rich Snippet Spam

If you use rich snippets on your websites, you should be aware that Google is now penalizing websites for spamming structured data markup.

The new warning was first mentioned in a post on the Google Webmaster Central forums by a user asking for clarification about the warning and what the issue could be. It is a manual action penalty based on incorrect usage of markup, regardless of whether it was deliberate spam or simply a mistake.
The warning that appears in a user’s account if manual action has been taken is:
Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Rich Snippet Quality guidelines.
The writing was on the wall for penalties related to rich snippets back in October at Pubcon, when Google's Matt Cutts talked about changes Google was planning in regard to rich snippets and dealing with snippet spam.
Rich snippets could get a revamp, and Google will dial back the number of websites that are able to display them. “More reputable websites will get rich snippets while less reputable ones will see theirs removed,” said Cutts.
The new penalty seems to affect websites that are misusing rich snippets, such as including authorship on homepages and reviews on pages where there are no reviews. But there was evidence that Google was attempting to educate webmasters on how to use it correctly when they made changes in December to add debugging support for structured data.
If you're unsure whether you’re using rich snippets correctly, you should first check your Webmaster Tools account to see if anything shows up, either as issues or in the structured data debugging area. Google also has a fairly comprehensive help area for rich snippets, including videos, to help webmasters implement structured data correctly.

Index Your Content Faster With the Fetch as Google Tool

Utilizing the various functions that Google Webmaster Tools has to offer is a surefire way to help keep your website running like a well-oiled machine. Two tools our SEO team uses on a regular basis and finds to be extremely beneficial are the Crawl Errors report and Sitemap submission tool.

Among the tools is the Fetch as Google option, which also gives users an opportunity to submit their URL to the index. Surprisingly, this tool is often under-utilized by bloggers, webmasters, and SEO strategists. It's a convenient way to speed things up considerably if you have new content that you'd like to be discovered and found in the SERPs.

Website owners and marketers often publish new web pages or blog posts on their website, sit back, and wait for them to show up in the Google search results. But that can take weeks or even months to happen! The more savvy marketers will ensure that any new content is included in their XML sitemap and then resubmit their sitemap to Google and Bing.

Submitting your link to the index using the Fetch as Google tool is like pressing a magic button. Google states that it will usually crawl the URL within a day using this method; however, I've seen web pages and blog posts show up in the SERPs less than 5 minutes after using this tool.

I was once on a phone call with a marketing consultant who was asking me how long it took for a new page to show up in Google search results. He mentioned that he and his webmaster had built a new web page two months prior and it still wasn't showing up in the search results no matter what he Googled. His webmaster kept telling him it could take weeks and to just wait.

I submitted his web page using the Fetch as Google tool during the conversation and before we hung up, it was showing up in the SERPs. He was blown away, and I came out looking like a hero.

Here is a related question Google's Matt Cutts touches on in the video below:

"Google crawls site A every hour and site B once in a day. Site B writes an article, site A copies it changing time stamp. Site A gets crawled first by Googlebot. Whose content is original in Google's eyes and rank highly? If it's A, then how does that do justice to site B?"