Google’s job listings search is now open to all job search sites & developers

It’s now official: Job listings are coming to Google’s search results in a much more prominent way. And the company is now offering a formal path for outsiders to add job listings to the new feature in Google search.
Google announced this morning that they are now opening up job listings within Google search to all developers and site owners. The new jobs display within Google search doesn’t have a formal name. However, it’s part of the overall Google for Jobs initiative that Google previewed last month at the Google I/O conference.
At that time, Google did not say how to get your job listings into this feature. Now Google has published a guide to job posting structured data that spells out what developers need to do to get their job listings into this new Google for Jobs search feature.
There are two basic steps you need to take:
(1) Mark up your job listings with job posting structured data.
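Job posting structured data is typically embedded in the page as JSON-LD inside a script tag, using the schema.org JobPosting vocabulary. Below is a minimal sketch of what such markup might look like, generated with Python; all field values are illustrative placeholders, and the authoritative list of required and recommended properties is in Google's structured data guide.

```python
import json

# Illustrative JobPosting structured data (schema.org vocabulary).
# Every value here is a placeholder, not taken from Google's guide.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Software Engineer",
    "description": "<p>Build and maintain web applications.</p>",
    "datePosted": "2017-06-01",
    "validThrough": "2017-07-01T00:00",
    "employmentType": "FULL_TIME",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Corp",
        "sameAs": "https://www.example.com",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Anytown",
            "addressRegion": "CA",
            "postalCode": "94000",
            "addressCountry": "US",
        },
    },
}

# Wrap the JSON-LD in the script tag that would go in the page markup.
markup = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    job_posting, indent=2
)
print(markup)
```

Once pages carry markup like this, they can be validated with Google's structured data testing tool before the listings are eligible to appear in the jobs feature.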

Search Engine Land Source

Google to fix missing data from Search Console analytics report soon

Google Search Console has a new bug. Today, it is missing a whole day of data in the Search Analytics report. But you don’t have to worry, it is not anything you’ve done wrong. Google is aware of the issue and is working on fixing it.
The day that is missing is March 9, 2017, which is inconvenient timing, since it falls around the same time as the Google Fred update.

John Mueller from Google said on Twitter, “We’re aware of it and should have the missing data point soon.”

@JeremyJavis @googlewmc We’re aware of it and should have the missing data point soon. Thanks for pinging!
— John ☆.o(≧▽≦)o.☆ (@JohnMu) March 13, 2017

This issue is not new; a similar day of data went missing about 11 months ago.
Again, no need to panic, this is just a reporting bug in Google Search Console.
The post Google to fix missing data from Search Console analytics report soon appeared first on Search Engine Land.

Search Engine Land Source

Missed link-building opportunities: Reclaiming broken links

It’s inevitable. Despite your best efforts to prevent them, some 404 errors will likely show up on your website, whether for old pages that have been discontinued or because someone inadvertently mistyped the URL when linking to you.
Certainly, 404 errors aren’t great for search engine indexing, but they also represent potential inbound links that are now broken and lost. Or can those links be reclaimed? I have two techniques you might want to try.
Reclaiming broken links with Google Search Console
Google Search Console is free, making it a popular choice for obtaining link information about a website. However, as Russ Jones, principal search scientist at Moz, wrote in a thought-provoking post about the true reliability of Google Search Console data, much of the data in Google Search Console, especially around linking, isn’t always terribly accurate. This is due in part to the speed at which Google indexes various pages that may contain
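Whatever tool supplies the list of externally linked URLs, the reclamation step itself comes down to finding which of those URLs return 404 and mapping each one to the closest live page, typically via a 301 redirect. Here is a minimal sketch of that matching step using only the standard library; the URL lists and statuses are hypothetical.

```python
import difflib

# Hypothetical data: URLs that external sites link to, with the HTTP
# status a crawl observed, plus the pages that are live on the site today.
linked_urls = {
    "/blog/seo-tips-2015": 404,
    "/blog/link-building-guide": 200,
    "/products/old-widget": 404,
}
live_pages = ["/blog/seo-tips", "/blog/link-building-guide", "/products/widget"]

def suggest_redirects(linked, live, cutoff=0.6):
    """For each broken (404) linked URL, suggest the closest live page
    as a 301 redirect target, based on path similarity."""
    suggestions = {}
    for url, status in linked.items():
        if status != 404:
            continue  # link still resolves; nothing to reclaim
        match = difflib.get_close_matches(url, live, n=1, cutoff=cutoff)
        suggestions[url] = match[0] if match else None
    return suggestions

print(suggest_redirects(linked_urls, live_pages))
```

In practice you would still review each suggested mapping by hand before adding the redirects, since path similarity is only a rough proxy for topical relevance.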

Search Engine Land Source

Google Data Studio now connects to Search Console

Google’s reporting and data visualization tool, Data Studio, now integrates with Google Search Console.
Search Console joins AdWords and Google Analytics, among others, as a data source for report building. Marketers can build reports that include only Search Console data, or combine it with other sources to compare paid versus organic traffic trends, for example.

Search Console metrics can be aggregated either by site or by page in the Data Source creation flow within Google Data Studio, by selecting either “Site Impression” or “URL Impression”.
As of last week, users can build an unlimited number of reports in Google Data Studio. The email used for the Data Studio account needs to have access to Search Console in order to use it as a data source.
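The same site-versus-page choice exists in the Search Console API’s searchanalytics.query endpoint, where it is controlled by the aggregationType field; the Data Studio options appear to correspond roughly to property-level and page-level aggregation there. The sketch below only builds the two request bodies, with placeholder dates, to show the difference.

```python
# Sketch of the two aggregation modes as Search Analytics request bodies.
# Dates are illustrative placeholders; sending these requests requires an
# authenticated Search Console API client, which is omitted here.
def build_query(aggregation_type, start_date="2017-02-01", end_date="2017-02-28"):
    """Build a Search Analytics request body aggregated by property
    (whole site) or by page (individual URL)."""
    assert aggregation_type in ("auto", "byPage", "byProperty")
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query"],
        "aggregationType": aggregation_type,
    }

site_level = build_query("byProperty")  # roughly "Site Impression"
page_level = build_query("byPage")      # roughly "URL Impression"
print(site_level["aggregationType"], page_level["aggregationType"])
```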

Search Engine Land Source

Google explains what “crawl budget” means for webmasters

Gary Illyes from Google has written a blog post titled What Crawl Budget Means for Googlebot. In it, he explains what crawl budget is, how crawl rate limits work, what crawl demand is and what factors impact a site’s crawl budget.
First, Gary explained that most sites do not need to worry about crawl budget. For really large sites, though, it becomes something worth looking at. “Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters,” Gary said.
Here is a short summary of what was published, but I recommend reading the full post.

Crawl rate limit is designed to keep Google from crawling your pages too much or too fast, to the point where it hurts your server.
Crawl demand is how much Google wants to crawl your pages. This is based
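The crawl rate limit described above is essentially a politeness throttle: the crawler caps how quickly it fetches and backs off when server responses slow down. Here is a toy sketch of that idea; the thresholds and multipliers are made up for illustration and are not Googlebot’s actual behavior.

```python
import time

class CrawlRateLimiter:
    """Toy model of a crawl rate limit: a cap on fetches per second
    that backs off when server responses get slow. All numbers are
    illustrative, not Googlebot's real parameters."""

    def __init__(self, fetches_per_second=5.0):
        self.rate = fetches_per_second
        self.last_fetch = 0.0

    def record_response(self, response_seconds):
        # Back off when the server is struggling; recover slowly when healthy.
        if response_seconds > 1.0:
            self.rate = max(0.5, self.rate / 2)
        else:
            self.rate = min(10.0, self.rate * 1.1)

    def wait_time(self):
        """Seconds to wait before the next fetch is allowed."""
        gap = 1.0 / self.rate
        now = time.monotonic()
        remaining = (self.last_fetch + gap) - now
        self.last_fetch = max(now, self.last_fetch + gap)
        return max(0.0, remaining)

limiter = CrawlRateLimiter()
limiter.record_response(2.5)  # a slow response halves the allowed rate
print(limiter.rate)
```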

Search Engine Land Source

Fetch and Horror: 3 examples of how fetch and render in GSC can reveal big SEO problems

In May 2014, some powerful functionality debuted in the “fetch as Google” feature in Google Search Console — the ability to fetch and render.
When you ask Google to fetch and render, its crawler will fetch all necessary resources so it can accurately render a page, including images, CSS and JavaScript. Google then provides a preview snapshot of what Googlebot sees versus what a typical user sees. That’s important to know, since sites could be inadvertently blocking resources, which could impact how much content gets rendered.
Adding fetch and render was a big deal, since it helps reveal issues with how content is being indexed. With this functionality, webmasters can make sure Googlebot is able to fetch all necessary resources for an accurate render. With many webmasters disallowing important directories and files via robots.txt, it’s possible that Googlebot could be seeing a limited view of the page — yet the webmaster wouldn’t even know without fetch and render.
As former Googler
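Part of what fetch and render checks can be approximated directly: take the resource URLs a page loads and test each one against the site’s robots.txt with Googlebot’s user agent. A minimal sketch using only the standard library follows; the robots rules and resource URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that disallows a scripts directory -- a common
# way sites accidentally hide render-critical resources from Googlebot.
robots_txt = """
User-agent: *
Disallow: /scripts/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical resources the page needs in order to render fully.
resources = [
    "https://www.example.com/css/main.css",
    "https://www.example.com/scripts/app.js",
    "https://www.example.com/images/hero.jpg",
]

blocked = [u for u in resources if not parser.can_fetch("Googlebot", u)]
for url in blocked:
    print("Blocked for Googlebot:", url)
```

Any URL this flags is one Googlebot cannot fetch, which is exactly the kind of gap that shows up as a difference between the two preview snapshots in fetch and render.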

Search Engine Land Source

Google Search Console improves Security Issues reports

Google announced they have improved the Security Issues report in the Google Search Console. Specifically, the reports now provide more specific explanations of the security issues detected by Safe Browsing.
The types of security issues that can show include malware, deceptive pages, harmful downloads, and uncommon downloads. The new explanations promise to give more “context and detail” into the security issues. This includes “tailored recommendations for each type of issue, including sample URLs that webmasters can check to identify the source of the issue, as well as specific remediation actions webmasters can take to resolve the issue,” Google said.
Here is a screenshot of some of the alerts one may see in this report:


Search Engine Land Source

Google search analytics report adds the ability to compare queries

Google has quietly added a new option within the Search Analytics report in Google Search Console to compare one query against another. This feature was first spotted by @Jonny_J_.
The new “Compare queries” filter currently lets you compare just two queries at a time. Here is what that report looks like when you do the comparison:

You can access this feature by logging into your Google Search Console account and clicking on Search Analytics. Then click on the Queries section and select “Compare queries.”

Then this pops up, and you can enter your search phrases:
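The same comparison can be approximated outside the UI with the Search Console API’s searchanalytics.query endpoint, by issuing one filtered request per query and lining up the results. This sketch only builds the request bodies; the query strings and dates are placeholders, and actually sending them requires an authenticated API client.

```python
# Build one Search Analytics request body per query to compare.
# Dates and query strings are illustrative placeholders.
def build_query_filter_request(query, start_date="2017-05-01", end_date="2017-05-31"):
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["date"],
        "dimensionFilterGroups": [
            {
                "filters": [
                    {
                        "dimension": "query",
                        "operator": "equals",
                        "expression": query,
                    }
                ]
            }
        ],
    }

# Two requests, one per query, mirroring the two-query limit in the UI.
request_bodies = [
    build_query_filter_request(q) for q in ("seo news", "google seo")
]
print(len(request_bodies))
```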


Search Engine Land Source

Google sent 4 million messages about search spam last year, saw 33% increase in clean-up requests

Google announced today the latest results of their efforts to clean up the search results by fighting webspam. Google explained that in 2015 they saw a 180-percent increase in websites being hacked compared to 2014 and also saw “an increase in the number of sites with thin, low-quality content.”
To combat that, Google released their hacked spam algorithm in October 2015, which resulted in removing “the vast majority” of those issues. They also sent out more than 4.3 million messages to webmasters to notify them of manual actions on their sites; that is a ton of manual notices. With that, they saw a 33-percent increase in the number of sites that went through the reconsideration process. So it is clearly important to make sure to verify your website in the Google Search Console so that you can be alerted of any issues Google finds on your site.
Google also said that users submitted more than 400,000 spam reports. Google acted

Search Engine Land Source

Google Search Console now lets you unsubscribe from some email notifications

Google has added a new feature to allow webmasters to unsubscribe from certain email notifications they get from the Google Search Console.
Google announced the new feature on Google+, saying, “We’ve added a new feature to help you tailor the type of messages you receive.”
Here is a screenshot of the check box for “enable email notifications,” and then the option to “mute” or “unmute” specific notifications:

Google said:

When you receive a message you’d like to disable, click “Unsubscribe from this type of message” in the footer of the email.
Messages you’ve disabled will be listed in your Search Console preferences page — just like on the screenshot.
You can re-enable messages from there, as well.

More details can be found in the Google help documents.

Search Engine Land Source