Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Was Moving Up in the SERPs, Then Got Stuck on Page 2
-
Hi,
I was continuously acquiring quality backlinks and my site was moving up in the Google SERPs for three main keywords. Within a few weeks I was on pages 2 and 3 for these keywords, but after reaching there I got stuck at those pages and positions despite no change in my link-building strategy or pattern. I have even increased the number and quality of links I acquire per day, but I am still stuck at exactly the same positions.
The website is 10 months old and in a software niche. I update it once a week.
For one keyword I am stuck at position 1 of page two (you can well imagine the frustration!).
My question: what do I need to do to get out of this "SERP lock"?
-
Thanks all for the answers.
Yes, EGOL, it looks like I need jolt (the rate of change of acceleration) rather than just acceleration or speed, as I am up against entities with $10B+ market caps. These guys love to mop the SERP floor with the small competition.
Are there any other factors you consider relevant in off-page SEO, in addition to the rate of increase in links per day and the quality/relevance of those links?
-
Hi,
This is a great question, and many people find themselves right where you are. Both Ryan and EGOL have given you great responses, and I agree with them both. A lot goes into whether and when your site will move up or down. Personally, I feel the closer you get to page 1, the more fine-tuning and effort it will take to make the leap to the top of page one.
Sounds like you are doing pretty well for a very young site. Good job and good luck!
-
Did you take physics in high school or college?
Your question is similar to..... "What is the difference between speed and acceleration?"
Just because you are "continuously acquiring quality back-links" doesn't mean that your competitors are sitting on their butts.
If you are gaining ten links per day but the guy above you is gaining twenty, you will never catch him...
... and if he is on the third page, then the guys near the top of page two might be gaining hundreds per day.
Every time you move up a position in the SERPs the website directly above you will require a greater level of effort to defeat.
That means you gotta press the accelerator down harder and harder as you move up the SERPs. Will you have it floored before you reach the top of page two?
In lightly competitive SERPs you might be able to defeat everyone... but when you get into the heavyweight SERPs the increasing competition will at some point be more than most people can muster.
That's where you hit "SERP lock" as you call it.
Keep in mind that the people behind you are working hard too... you might tramp the pedal to the floor and still see people from behind passing you by.
My personal opinion is... of the people who hold top SERP positions today, more will be lower in the rankings by this time next year than higher. Really. They are going to be displaced by existing heavyweights who are expanding their reach and new heavyweights who are starting to accelerate.
Pick ten SERPs in different niches that interest you and record who is in position #5. Then come back in a year and look for them. More will go down than up.
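EGOL's speed-versus-acceleration point can be put into numbers with a toy simulation. This is purely illustrative (the daily link rates are invented, and real rankings depend on far more than link counts): a site gaining links at a steady 10 per day is eventually overtaken by one that starts slower but accelerates.

```python
def total_links(start_rate, accel, days):
    """Cumulative links after `days`, with the daily rate growing by `accel` each day."""
    total = 0
    rate = start_rate
    for _ in range(days):
        total += rate
        rate += accel
    return total

# Site A: steady 10 links/day. Site B: starts at 2 links/day but adds
# 1 link/day of acceleration. Find the day B's cumulative total passes A's.
overtake_day = next(
    d for d in range(1, 365)
    if total_links(2, 1, d) > total_links(10, 0, d)
)
print(overtake_day)  # → 18
```

The steady site leads for weeks, which is exactly the "SERP lock" feeling: constant effort looks like progress until a competitor's acceleration compounds past it.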
-
what do I need to do to get out of this "SERP lock"?
Without looking at the site and the keywords involved, we can only offer generic advice.
I would suggest examining all aspects of your on-page factors. Some specifics are:
-
page title: focus on a single keyword
-
header: focus on the same keyword
-
content: the first sentence of your content should also focus on the same keyword
-
site internal linking: where appropriate, pages of your site should link, within their content, to other relevant pages
-
url: clean, friendly, static URLs that make appropriate use of keywords are helpful
Wikipedia is a great example for many of the above steps.
There are other items to check. My point is that link building and site promotion are an ongoing process that plays out over months and years, whereas on-page changes can instantly and dramatically change your ranking. There is a good chance you are stuck due to on-page factors.
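The first three checks above (title, header, and opening copy all focused on the same keyword) are mechanical enough to script. Below is a minimal sketch using only Python's standard-library HTML parser; the `keyword_focused` helper and the sample page are hypothetical, not a Moz tool, and a real audit would also look at URLs and internal links.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the text of the <title> and <h1> tags from a page."""
    def __init__(self):
        super().__init__()
        self._stack = []
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if "title" in self._stack:
            self.title += data
        if "h1" in self._stack:
            self.h1 += data

def keyword_focused(html, keyword):
    """True if the keyword appears in both the <title> and an <h1>."""
    audit = OnPageAudit()
    audit.feed(html)
    kw = keyword.lower()
    return kw in audit.title.lower() and kw in audit.h1.lower()

page = """
<html><head><title>Invoice Software for Freelancers</title></head>
<body><h1>Invoice Software Made Simple</h1>
<p>Our invoice software helps freelancers get paid faster.</p></body></html>
"""
print(keyword_focused(page, "invoice software"))  # → True
```

Running this against each target keyword's landing page is a quick way to spot pages where the title and header have drifted away from the keyword you are trying to rank for.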
-
-
Could you let us know the URL and the keywords you're targeting?