The more information you can provide about an author, the more proof you're providing to Google that your content is high quality. The guidelines don't explicitly state "you need an author bio page," although it would be a highly effective way to communicate to Google's quality raters who an author is. Moreover, an author bio page further assists with disambiguating authors. Back to the reconciliation technique I referenced earlier, Mueller explains how social media links in bio pages can help Google tell the difference between authors with the same name.

URL parameters are parameters whose values are set dynamically within a page's URL. This enables a single page to show an infinite number of different views.

Active parameters can change page content for users by transforming or sorting a page in a certain way – for example, sorting a category page for dresses in different ways. Passive parameters don't have any effect on how content appears to users, but can track visits or referrals.

In either case, most parameters don't actually affect the content on the page, meaning that in a search engine's eyes, the parameterized versions of a page are duplicates. Click through to read a more in-depth post on common duplicate content issues, including parameterized URLs.

Search Console features a tool that will tell Google which parameters to ignore, which can prevent duplication from parameterized URLs. This is the place where I tell you to use this tool with caution – if you make a mistake in this tool and incorrectly exclude URLs, it could result in pages or your entire site disappearing from search. Also note that if you have parameters in your sitemaps or used in internal linking, this could confuse Google and cause it to index the parameterized URLs anyway.

Note: You can click on all screenshots below to view them at a larger size.

Step 1: Log in to Search Console and click on Crawl, then URL Parameters.

Step 5: Select whether or not the parameter changes how content is seen by the user. The "No" option is for passive parameters – meaning that the page content stays the same with or without the parameter. If you selected No: Doesn't affect page content, Google will just pick the version of the URL it thinks is primary and index that version. The "Yes" option is for active parameters – meaning that the page content is different with each parameter.

Step 6: If you selected Yes: Changes, reorders, or narrows page content, you must then select how the parameter affects content (sorts, narrows, specifies, translates, paginates, or other) and then tell Google which URLs with the parameter Googlebot is allowed to crawl. You can choose whether Googlebot should decide which pages to crawl, whether every URL should be crawled, whether only URLs with specific values should be crawled, or whether no URLs should be crawled.

Changes made using the URL Parameters tool may not be reflected in the SERPs for several months, so it's good practice to run a site: query search every few weeks to verify success.
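To make the active/passive distinction concrete, here is a minimal Python sketch. The domain, the parameter names, and the list of passive parameters are invented for illustration – real sites will use different parameters:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical sketch: example.com, the parameter names, and the PASSIVE set
# are invented for illustration -- real sites will use different parameters.
PASSIVE = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def split_params(url):
    """Separate parameters that change page content from ones that only track."""
    params = parse_qs(urlparse(url).query)
    active = {k: v for k, v in params.items() if k not in PASSIVE}
    passive = {k: v for k, v in params.items() if k in PASSIVE}
    return active, passive

# A category page for dresses, sorted by price and tagged with a tracking code:
active, passive = split_params(
    "https://example.com/dresses?sort=price-asc&utm_source=newsletter"
)
print(active)   # active parameter: changes what the user sees
print(passive)  # passive parameter: content unchanged, visit tracked
```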
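Because passive parameters don't change content, a crawler that ignores them collapses the duplicate URLs into one canonical version – roughly the effect of telling Google to ignore a parameter. A hedged sketch, again with invented parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters -- names invented for illustration.
PASSIVE = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonicalize(url):
    """Drop passive parameters so duplicate views collapse to one URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in PASSIVE]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

urls = [
    "https://example.com/dresses",
    "https://example.com/dresses?utm_source=newsletter",
    "https://example.com/dresses?sessionid=abc123",
]
canonical = {canonicalize(u) for u in urls}
print(canonical)  # all three collapse to a single canonical URL
```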
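The crawl choices in Step 6 can be sketched as a simple allow/deny decision per parameter. The option names below loosely mirror the tool's UI, but this function is an illustration of the logic, not Google's actual implementation:

```python
from urllib.parse import urlparse, parse_qs

def allowed_to_crawl(url, param, option, only_value=None):
    """Illustrative allow/deny decision for one parameter (not Google's code)."""
    values = parse_qs(urlparse(url).query).get(param)
    if values is None:
        return True                      # parameter absent: nothing to restrict
    if option == "every_url":
        return True
    if option == "only_urls_with_value":
        return only_value in values
    if option == "no_urls":
        return False
    return True                          # "let_googlebot_decide": no restriction

print(allowed_to_crawl("https://example.com/dresses?sort=price", "sort", "no_urls"))
print(allowed_to_crawl("https://example.com/dresses?sort=price", "sort",
                       "only_urls_with_value", only_value="price"))
```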