Is “Internal No Response” from ScreamingFrog a significant issue?

During a recent crawl of a website using Screaming Frog, I encountered an issue where 549 links were returning a “no response” status. Interestingly, when I manually checked these URLs in a browser, the pages loaded without any issues.

[Screenshot of Screaming Frog result]

I have a few questions regarding this situation:

  • Could this issue be linked to the configuration of my crawler?
  • If it’s not a crawler configuration issue, what steps should I take to resolve it, especially given that the site is built on WordPress?

Additionally, how would you address this situation if you encountered it?

Thank you for your insights!


2 responses to “Is ‘Internal No Response’ from ScreamingFrog a significant issue?”

  1. When encountering “Response Codes: Internal No Response” notifications in Screaming Frog, it’s understandable to have concerns about potential issues with your website’s links. Below, I’ll address your questions and provide guidance on how to handle this situation effectively.

    Understanding the “No Response” Status

    The “No Response” status in Screaming Frog means that the crawler attempted to connect to a URL but never received an HTTP response before giving up; unlike a 404 or 500, there is no status code to report at all. This can happen for a variety of reasons related to your crawler configuration, server response times, or the website’s infrastructure.
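
    To make the distinction concrete, here is a minimal sketch (assuming Python with the requests library; the URL is a placeholder) of how a timed-out or dropped request differs from an HTTP error status:

    ```python
    import requests

    # Placeholder URL -- substitute one of the flagged pages.
    url = "https://example.com/some-page/"

    try:
        # A deliberately short timeout approximates what the crawler saw:
        # if the server does not answer within the window, there is no
        # status code to report at all.
        response = requests.get(url, timeout=5)
        print(f"Got an HTTP status: {response.status_code}")
    except requests.exceptions.Timeout:
        # No HTTP response arrived in time -- this is what surfaces as
        # "No Response", as opposed to a 4xx/5xx error page.
        print("Timed out: no response received")
    except requests.exceptions.ConnectionError as exc:
        # Refused, dropped, or DNS-failed connections also yield no
        # status code and are reported the same way.
        print(f"Connection failed: {exc}")
    ```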

    Possible Issues and Solutions

    1. Crawler Configuration

    Question: Could this issue be related to my crawler configuration?

    Explanation & Solutions:

    • User-Agent Settings: Sometimes, web servers block certain user-agents. Ensure Screaming Frog is using a user-agent that the server recognizes. You can configure this in Screaming Frog under Configuration > User-Agent.

    • Connection Settings: Check the crawler’s response timeout setting (its exact location under the Configuration menu varies by version) and make sure it isn’t set too low. Increasing the HTTP timeout gives the crawler more time to wait for a slow server response; the retest sketch after this list is one way to check whether a longer timeout would help.

    • Crawl Limits: Review the limits under Configuration > Spider > Limits. Make sure there’s no restriction, such as crawl depth or a maximum URL count, that could leave these requests incomplete.

    • IP Blocking: If you’re running repeated crawls, the server might block or throttle your crawler’s IP address. Consider slowing the crawl rate (under Configuration > Speed) or using a proxy to prevent this from happening.
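
    If the configuration checks above look right, a quick way to test the user-agent and timeout theories together is to re-request the flagged URLs outside Screaming Frog. A hedged sketch, assuming Python with the requests library; the input filename is hypothetical (export the “No Response” URLs from Screaming Frog to a plain-text file, one URL per line):

    ```python
    import requests

    # Hypothetical filename: the "No Response" URLs exported from
    # Screaming Frog, one URL per line.
    URLS_FILE = "no_response_urls.txt"

    # A browser-like User-Agent; swap in the exact string your crawl
    # used to compare crawler vs. browser behaviour.
    BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) "
                  "Chrome/120.0 Safari/537.36")

    with open(URLS_FILE) as fh:
        urls = [line.strip() for line in fh if line.strip()]

    for url in urls:
        try:
            # A generous 30s timeout mirrors raising the HTTP timeout in
            # the crawler config; if these requests succeed, the crawl
            # timeout was probably too aggressive.
            r = requests.get(url, headers={"User-Agent": BROWSER_UA},
                             timeout=30)
            print(f"{r.status_code}  {url}")
        except requests.RequestException as exc:
            print(f"FAIL ({type(exc).__name__})  {url}")
    ```

    If every URL succeeds here with a browser user-agent but fails in the crawl, user-agent filtering or rate limiting is the likely cause; if they fail here too, the problem is server-side.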

    2. Server or Website Issues

    Question: If the problem isn’t related to the crawler configuration, how would you go about fixing it?

    Explanation & Solutions:

    • Server Load: If your server is overloaded, it might be slow to respond to the crawler’s requests. Note that a single page loading fine in a browser doesn’t rule this out: the crawler issues many concurrent requests, a very different load profile. Use server monitoring tools to check the current load and ensure the server has sufficient resources to handle requests.

    • Firewall or Security Plugins: Some WordPress security plugins or server firewalls might block requests that appear to be from a bot. Check your firewall settings or plugin configurations to see if they are inadvertently blocking Screaming Frog.

    • Redirect Chains and Errors: Look for incorrect redirect setups, and investigate the server logs to identify any routing issues or errors returned when these particular URLs are requested (a log-scan sketch follows this list).

    • .htaccess Rules: If using Apache, verify the .htaccess rules to ensure they’re not inadvertently blocking certain types of crawlers or specific URLs.
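
    To act on the server-log suggestion above, here is a rough sketch (assuming Python, an Apache access log in common/combined format, and the same hypothetical URL export as before) that tallies the status codes the server actually recorded for the flagged paths:

    ```python
    from collections import Counter
    from urllib.parse import urlparse

    # Placeholder filenames: the exported "No Response" URLs and a copy
    # of the Apache access log.
    with open("no_response_urls.txt") as fh:
        flagged_paths = {urlparse(line.strip()).path
                         for line in fh if line.strip()}

    tally = Counter()
    with open("access.log") as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 3:
                continue  # skip lines not in common/combined format
            request = parts[1].split()  # e.g. ['GET', '/page/', 'HTTP/1.1']
            fields = parts[2].split()   # status code follows the request
            if len(request) >= 2 and fields and request[1] in flagged_paths:
                tally[(request[1], fields[0])] += 1

    for (path, status), count in sorted(tally.items()):
        print(f"{status}  x{count}  {path}")
    ```

    If the flagged URLs never appear in the log at all, the requests were likely dropped before reaching the web server, which points at a firewall or upstream security layer rather than Apache itself.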

    Handling the Situation

    In short: rule out crawler configuration first (user-agent, timeout, crawl speed), then work through the server-side checks (logs, security plugins, firewall and .htaccess rules). When URLs load fine in a browser but consistently fail for the crawler, user-agent filtering or rate limiting is the most likely culprit.

  2. It’s great that you’re investigating the “Internal No Response” issue with Screaming Frog. This can indeed be a perplexing situation. Based on your description, there are a couple of factors to consider:

    Firstly, the discrepancy between Screaming Frog’s no-response status and the browser’s ability to access the pages can often be traced to how the server handles user agents. Some servers block certain crawlers or return different responses depending on the user-agent string. You can change the user-agent in Screaming Frog (under Configuration > User-Agent) to match a typical web browser and see whether that resolves the issue; a quick comparison sketch follows.
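
    A minimal sketch of that comparison, assuming Python with the requests library; the URL and the crawler user-agent string are illustrative (check Configuration > User-Agent for the exact string your crawl used):

    ```python
    import requests

    URL = "https://example.com/some-flagged-page/"  # placeholder

    # Both strings are illustrative; substitute the exact user-agent
    # from your Screaming Frog configuration.
    USER_AGENTS = {
        "browser-like": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                         "AppleWebKit/537.36 (KHTML, like Gecko) "
                         "Chrome/120.0 Safari/537.36"),
        "crawler-like": "Screaming Frog SEO Spider",
    }

    for label, ua in USER_AGENTS.items():
        try:
            r = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
            print(f"{label}: HTTP {r.status_code}")
        except requests.RequestException as exc:
            # A failure with the crawler string, but not with the
            # browser string, points to user-agent based filtering.
            print(f"{label}: no response ({type(exc).__name__})")
    ```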

    Secondly, consider reviewing your server’s configuration, specifically timeout settings and security rules. A server may be set up to deny requests from crawlers after a burst of rapid requests (rate limiting) or because of specific firewall rules.

    Since your site is on WordPress, you might also look into plugins that could affect crawling, such as caching or security plugins. Try temporarily disabling them to see if that resolves the issue.
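
    Before toggling plugins one by one, it can help to fingerprint what is sitting in front of the site. A hedged sketch, assuming Python with the requests library; the header names below are common signatures of caching and security layers, not an exhaustive list, and the URL is a placeholder:

    ```python
    import requests

    URL = "https://example.com/"  # placeholder -- use a flagged URL

    # Common (but not exhaustive) header fingerprints of caching
    # plugins, CDNs, and security services that may treat crawlers
    # differently from browsers.
    SIGNATURES = {
        "x-cache": "caching proxy / CDN",
        "cf-ray": "Cloudflare",
        "x-sucuri-id": "Sucuri firewall",
        "x-litespeed-cache": "LiteSpeed cache",
        "server": "server software",
    }

    r = requests.get(URL, timeout=15)
    for header, meaning in SIGNATURES.items():
        if header in r.headers:  # requests headers are case-insensitive
            print(f"{header}: {r.headers[header]}  ({meaning})")
    ```

    Anything that shows up here is a candidate for the temporary-disable test described above.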

    If you’re still experiencing problems after these checks, it may be helpful to consult your hosting provider to see if there are any underlying server issues. Finally, checking the server logs could provide insight into why these requests are being met with no response.

    Overall, diagnosing this issue is a multi-step process, but systematically addressing each possibility should lead you closer to a solution. Good luck, and don’t hesitate to share any further updates!
