
RateLimited during url inspection #45

Open
maximepvrt opened this issue Mar 19, 2024 · 2 comments

Comments

@maximepvrt

While running URL inspections for indexing with the tool, I hit an obstacle: the URL Inspection API has a rate limit of 2,000 calls per day per property (https://support.google.com/webmasters/thread/240916045/429-quota-issue-googleapis-com-v1-urlinspection-index-inspect?hl=en), which significantly hampers the inspection process.

👍 Done, here's the status of all 14784 pages:
🚦 RateLimited: 14784 pages
{
  "error": {
    "code": 429,
    "message": "Quota exceeded for sc-domain:xxxxxxxxx.fr.",
    "status": "RESOURCE_EXHAUSTED"
  }
}

Proposed Enhancements

  • Cache-Based Inspection Calls: Use a caching mechanism to reuse previous inspection results and only re-inspect URLs that previously returned a 429 response (see the sketch after this list).

  • Rate Limit Management: Implement a rate limit strategy for inspection calls to ensure adherence to the daily limit of 2000 calls. This could involve throttling the rate of inspection requests to stay within the allowed quota.

  • Error Handling for 429 Responses: Track URLs that were not analyzed because of a 429 error (rate limit exceeded) and hold further inspection requests for them, so the script does not make redundant calls for URLs that are already queued for inspection.

  • Bypass Mechanism for Cached URLs: Introduce a mechanism to bypass inspection for URLs already present in the cache. This would allow for direct indexing of URLs stored in the cache, optimizing the indexing process.
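Something like this minimal sketch of the cache-plus-quota idea in TypeScript. The cache path, the cache shape, and the inspectUrl callback are placeholders for illustration, not the script's actual internals:

import * as fs from "fs";

// Illustrative shape of a cached inspection result.
type InspectionCache = Record<string, { status: string; inspectedAt: string }>;

const CACHE_PATH = ".cache/inspections.json"; // assumed cache location
const DAILY_QUOTA = 2000; // URL Inspection API calls allowed per property per day

function loadCache(): InspectionCache {
  try {
    return JSON.parse(fs.readFileSync(CACHE_PATH, "utf8"));
  } catch {
    return {};
  }
}

function saveCache(cache: InspectionCache) {
  fs.mkdirSync(".cache", { recursive: true });
  fs.writeFileSync(CACHE_PATH, JSON.stringify(cache, null, 2));
}

// `inspectUrl` stands in for the script's real API call and resolves to a status string.
export async function inspectWithQuota(
  urls: string[],
  inspectUrl: (url: string) => Promise<string>
) {
  const cache = loadCache();
  let callsUsed = 0;

  for (const url of urls) {
    // Reuse cached results; only re-inspect URLs that were previously rate limited.
    const cached = cache[url];
    if (cached && cached.status !== "RateLimited") continue;

    // Stop before exceeding the daily quota; the rest can be picked up on the next run.
    if (callsUsed >= DAILY_QUOTA) break;

    const status = await inspectUrl(url);
    cache[url] = { status, inspectedAt: new Date().toISOString() };
    callsUsed++;
  }

  saveCache(cache);
}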

@lundcm

lundcm commented Apr 22, 2024

I have the following locally as a bit of a hack but it works for my purposes. This is in utils.ts:

export async function fetchRetry(url: string, options: RequestInit, retries: number = 5) {
  try {
    const response = await fetch(url, options);
    if (response.status >= 500) {
      const body = await response.text();
      throw new Error(`Server error code ${response.status}\n${body}`);
    }

    if (response.status === 429) {
      console.log("Rate limited. Retrying in 5 seconds...");
      // Retry after 5 seconds if rate limited
      await new Promise((resolve) => setTimeout(resolve, 5000));

      // Retry the request ignoring the retries limit
      return fetchRetry(url, options, retries);
    }
    return response;
  } catch (err) {
    if (retries <= 0) {
      throw err;
    }
    return fetchRetry(url, options, retries - 1);
  }
}
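For context, the helper can replace a plain fetch around the URL Inspection call inside an async function; the request body values and the accessToken variable below are illustrative:

const response = await fetchRetry(
  "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${accessToken}`, // assumed to be in scope
    },
    body: JSON.stringify({
      inspectionUrl: "https://example.com/page",
      siteUrl: "sc-domain:example.com",
    }),
  }
);
const result = await response.json();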

@muditjuneja

This works like a charm for me!! Thanks @lundcm

tcrwt added a commit to tcrwt/google-indexing-script that referenced this issue Aug 11, 2024