Algolia Crawler Action

Easily run the Algolia Crawler after a deployment. Works for free accounts, including DocSearch. Only requires your Crawler credentials: Crawler ID, Crawler User ID, and Crawler API Key.

This action uses the Algolia Crawler API to start a crawl.

- name: 'Algolia Crawler'
  uses: cssnr/algolia-crawler-action@v2
  with:
    crawler_id: ${{ secrets.CRAWLER_ID }}
    crawler_user_id: ${{ secrets.CRAWLER_USER_ID }}
    crawler_api_key: ${{ secrets.CRAWLER_API_KEY }}

Make sure to review the Inputs and check out more Examples.

This is an extremely simple action; for more details, see src/index.js.

An alternative to this action is to use a simple web-request-action.
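
For reference, the underlying request can be reproduced with a plain curl step. The following is a minimal sketch that assumes Algolia's documented Crawler API reindex endpoint with HTTP Basic authentication; it is not taken from this action's source:

- name: 'Start Crawl (curl sketch)'
  run: |
    # POST to the assumed Crawler API reindex endpoint, authenticating
    # with the Crawler User ID and Crawler API Key via HTTP Basic auth
    curl --fail -sS -X POST \
      -u "${{ secrets.CRAWLER_USER_ID }}:${{ secrets.CRAWLER_API_KEY }}" \
      "https://crawler.algolia.com/api/1/crawlers/${{ secrets.CRAWLER_ID }}/reindex"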

Note

Please submit a Feature Request for new features or open an Issue if you find any bugs.

Inputs

| Input           | Short Description of Input                          |
| --------------- | --------------------------------------------------- |
| crawler_id      | Crawlers > Your Crawler > Settings > Crawler ID     |
| crawler_user_id | Data Sources > Crawler > Settings > Crawler User Id |
| crawler_api_key | Data Sources > Crawler > Settings > Crawler API Key |

Note: these are found in the Algolia Dashboard under Data Sources > Crawler.

Crawler ID

To find your crawler_id you need to select your named crawler from the Crawlers list, then under the CONFIGURATION heading, click on Settings. From there you can copy your Crawler ID.

Crawler User Id / Crawler API Key

To find these, do not select a crawler from the Crawlers tab; instead, click on the Settings tab. From there you can copy both the Crawler User Id and Crawler API Key.

These are usually the same across your account if you have multiple crawlers. The only variable is the Crawler ID.

Outputs

| Output  | Description                |
| ------- | -------------------------- |
| status  | API Response Status Code   |
| task_id | Resulting Crawler Task ID  |

- name: 'Algolia Crawler'
  uses: cssnr/algolia-crawler-action@v2
  id: crawler
  with:
    crawler_id: ${{ secrets.CRAWLER_ID }}
    crawler_user_id: ${{ secrets.CRAWLER_USER_ID }}
    crawler_api_key: ${{ secrets.CRAWLER_API_KEY }}

- name: 'Echo Output'
  run: |
    echo "Status Code: ${{ steps.crawler.outputs.status }}"
    echo "Task ID: ${{ steps.crawler.outputs.task_id }}"

Examples

💡 Click on an example heading to expand or collapse the example.

GitHub Pages - VitePress
name: 'Pages'

on:
  push:
    branches:
      - 'master'
    paths:
      - 'docs/**'
      - '.vitepress/**'
      - 'package.json'
      - '.github/workflows/pages.yaml'
  workflow_dispatch:

permissions:
  contents: read

concurrency:
  group: pages
  cancel-in-progress: false

jobs:
  build:
    name: 'Build'
    runs-on: ubuntu-latest
    timeout-minutes: 10

    steps:
      - name: 'Checkout'
        uses: actions/checkout@v5
        with:
          fetch-depth: 0

      - name: 'Setup Node 22'
        uses: actions/setup-node@v5
        with:
          node-version: 22
          cache: npm

      - name: 'Configure Pages'
        uses: actions/configure-pages@v5

      - name: 'Install Dependencies'
        run: |
          npm ci

      - name: 'Run Build'
        run: |
          npm run build

      - name: 'Upload Pages Artifact'
        uses: actions/upload-pages-artifact@v3
        with:
          path: .vitepress/dist

  deploy:
    name: 'Deploy'
    runs-on: ubuntu-latest
    timeout-minutes: 5
    needs: build

    permissions:
      pages: write
      id-token: write

    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}

    steps:
      - name: 'Deploy Pages'
        id: deployment
        uses: actions/deploy-pages@v4

  post:
    name: 'Post-Deploy'
    runs-on: ubuntu-latest
    timeout-minutes: 5
    needs: deploy

    steps:
      - name: 'Algolia Crawler'
        uses: cssnr/algolia-crawler-action@v2
        with:
          crawler_id: ${{ secrets.CRAWLER_ID }}
          crawler_user_id: ${{ secrets.CRAWLER_USER_ID }}
          crawler_api_key: ${{ secrets.CRAWLER_API_KEY }}

For more examples, you can check out other projects using this action:
https://github.com/cssnr/algolia-crawler-action/network/dependents

Tags

The following rolling tags are maintained.

| Tag     | Example | Target | Bugs | Feat. | Description                                                |
| ------- | ------- | ------ | ---- | ----- | ---------------------------------------------------------- |
| Major   | vN      | vN.x.x | ✅   | ✅    | Includes new features but is always backwards compatible.  |
| Minor   | vN.N    | vN.N.x | ✅   | ❌    | Only receives bug fixes. This is the most stable tag.      |
| Release | vN.N.N  | vN.N.N | ❌   | ❌    | Not a rolling tag. Not recommended.                        |
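
For example, to pin to a rolling tag (the version numbers below are illustrative placeholders):

# Major tag: receives bug fixes and new features
uses: cssnr/algolia-crawler-action@v2

# Minor tag: receives bug fixes only (most stable)
uses: cssnr/algolia-crawler-action@v2.N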

You can view the release notes for each version on the releases page.

Support

For general help or to request a feature, see the Discussions or submit a Feature Request.

If you are experiencing an issue/bug or getting unexpected results, you can open an Issue.

For more information, see the CSSNR SUPPORT.md.

Contributing

If you would like to submit a PR, please review the CONTRIBUTING.md.

Please consider making a donation to support the development of this project and additional open source projects.

Ko-fi

Additionally, you can support other GitHub Actions I have published:

❔ Unpublished Actions

These actions are not published on the Marketplace, but may be useful.


📝 Template Actions

These are basic action templates that I use for creating new actions.

Note: The docker-test-action builds, runs, and pushes images to the GitHub Container Registry.


For a full list of current projects visit: https://cssnr.github.io/
