Welcome to HashiCorp Developer! This is the home for HashiCorp product reference documentation and tutorials for our practitioners. For background information on this project, refer to [MKTG-034].
## Content Authors

Please see this documentation for contributing content updates to Developer. Reach out in #proj-dev-portal on Slack if you have any issues or questions.
- Local Development
- Accessibility
- Testing
- Helpers
- Component Organization
- Configuration
- Analytics
- SEO metadata
- Performance
- Remote Content & Application context
## Local Development

There are a few things you need to set up before you can begin developing in this repository.

1. Install the Vercel CLI.

   The CLI is needed for the next two steps.

2. Run `vercel link`.

   This command will prompt you to connect your local copy of the repo to the Vercel `dev-portal` project. The command creates a `.vercel` directory with a JSON file that contains the information that links to the Vercel project.

3. Run `vercel env pull .env.local`.

   This command will pull the development environment variables from the linked Vercel project and write them to a new file called `.env.local`.

4. Remove the line containing `VERCEL="1"` from `.env.local`.

   This step is required to prevent the login flow from using HTTPS URLs.
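If you prefer not to edit `.env.local` by hand, this last step can be scripted. The sketch below assumes a POSIX shell; the sample file is created only for illustration, since `vercel env pull` produces the real one:

```shell
# Stand-in .env.local for illustration only.
printf 'VERCEL="1"\nHASHI_ENV="development"\n' > .env.local

# Drop the VERCEL="1" line so the login flow does not use HTTPS URLs.
grep -v '^VERCEL=' .env.local > .env.local.tmp && mv .env.local.tmp .env.local

cat .env.local
```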
If you're developing in this repository, get started by running:

```shell
npm install
npm start
```

This will give you a development server running on `localhost:3000`.
> **Note:** Historically, the `.io` sites were served from this repository. They have been migrated into the `hashicorp/web` repository. See this RFC for full context.
If using `npm` isn't working for you, you can try running the website through the Docker container instead.

1. Build the Docker container and tag it with something memorable, e.g. `dev-portal`:

   ```shell
   $ docker build -t dev-portal .
   [+] Building 115.2s (16/16) FINISHED
   => => naming to docker.io/library/dev-portal
   ```

2. Run the container, making sure to export port 3000 and mount the local source files into it:

   ```shell
   $ docker run -it -v $(pwd)/src:/app/website-preview/src -p 3000:3000 dev-portal
   ...
   > Ready on http://localhost:3000
   ```

Now you can view the website on http://localhost:3000 and any local edits will be reflected on the rendered page.
In the `.vscode` directory, you'll find an `extensions.json` file that lists recommended VS Code extensions to use for this project.

To add the recommended extensions:

- Open VS Code
- Open the command palette
- Type `Show Recommended Extensions`
- Hit the `Enter` key
- Click the "Install Workspace Recommended Extensions" icon (it looks like a cloud with an arrow pointing down) under the Workspace Recommendations section of the sidebar
In the `.vscode` directory, you'll also find a `settings.json` file with VS Code settings for this project.

- `source.fixAll.eslint` enables auto-fixing of ESLint issues when a file is saved
- `eslint.codeActionsOnSave.rules` specifies which rules can be auto-fixed on save
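For reference, a `settings.json` using those two options might look like the following. The specific rule listed is an illustrative assumption, not the repo's actual configuration:

```json
{
  "editor.codeActionsOnSave": {
    "source.fixAll.eslint": true
  },
  "eslint.codeActionsOnSave.rules": ["import/order"]
}
```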
## Accessibility

`@axe-core/react` is a package that allows us to run accessibility checks against the rendered DOM and see results in a browser dev tools console.

> **Note:** It is recommended to use Chrome; there is limited functionality in Safari and Firefox.

We use it for local accessibility testing of the DOM. It does not replace tools like linting rules, and linting rules do not replace it; both kinds of tools are important for different reasons. Linters can help us write accessible code and form good habits, but they can't check the full output of the code. Tools that check the DOM ensure that the final state of elements is accessible, including all calculated text and colors.

The code is set up in `_app.tsx` to only use `react-dom` and `@axe-core/react` if the `AXE_ENABLED` environment variable is set. We've added an npm script to make setting that variable easy. To run the app locally with `@axe-core/react` enabled, run the following command in your terminal instead of `npm start`:

```shell
npm run start:with-axe
```

After you've got the project running locally with `AXE_ENABLED` set, you can open a browser to the local server, open the dev tools console, and inspect the console logs output by `@axe-core/react`.
## Testing

We use Jest to write unit tests for our code. We also have React Testing Library integrated for writing tests against our rendered React components.

To run tests:

```shell
npm test
```

To run tests in watch mode:

```shell
npm run test:watch
```

Additionally, we use Playwright for end-to-end integration tests. Playwright tests should be used when testing functionality that requires a running Next.js server, such as middleware and redirects.

To run the end-to-end tests:

```shell
npm run test:e2e
```

To view the report for an end-to-end test run:

```shell
npx playwright show-report
```
## Helpers

Auto-populated subdirectories such as `.next` and `node_modules` can sometimes become out of date. Delete all related subdirectories with the `clean` command:

```shell
npm install
npm run clean
```
## Component Organization

In order to create some structure and consistency throughout this project, we're creating some light guidelines around where certain components should live. We have a few top-level folders which should house components:

```
src/
  components/
  views/
  layouts/
  hooks/
  contexts/
```

- `components` - Shareable, smaller components for use across any number of other components
- `views` - Componentry which represents a full site "view." This is a way to abstract out page components and easily co-locate related code. Not necessarily intended for re-use, unless one needs to render the same view on multiple pages. This also allows us to co-locate sub-components and test files with page components, which is otherwise difficult with file-based routing
- `layouts` - Layout components which are generic and possibly used across different pages (see the Next.js docs)
  - **Note:** In support of future app-router adoption, we are no longer using the `.layout` or `.getLayout` pattern, which is not supported in the app directory.
- `hooks` - Shared hooks which are applicable for use across a variety of other components. Hooks which access shared contexts should live in `contexts/` (see below)
- `contexts` - Shared contexts and utilities for accessing and interacting with the context values
An example implementation of components laid out this way:

```tsx
// pages/some/page.tsx
import SomePageView from 'views/some-page'
import SomeLayout from 'layouts/some-layout'

// if we need to adjust props, we can wrap this to make any changes necessary
export default function SomePage(props) {
  return (
    <SomeLayout>
      <SomePageView {...props} />
    </SomeLayout>
  )
}
```
## Configuration

Per-environment configuration values are defined in JSON files in the `config/` folder. Each environment has its own config file, controlled by the `HASHI_ENV` environment variable, currently:

```
config/
  base.json # May be used in any environment, including production (see below)
  development.json
  preview.json
  production.json
```

Each configuration can define an `extends` property, which will cause it to merge its properties with the extended configuration file. If no `extends` property is explicitly defined, the configuration file will extend from `base.json`.
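For example, a preview config that extends the defaults and overrides one value might look like the following. The key shown is illustrative, and whether `extends` takes the bare environment name or a filename is an assumption here:

```json
{
  "extends": "base",
  "my_config_value": "preview-override"
}
```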
The configuration values are available globally within the application. They can be accessed from a global `__config` object:

```js
// config file:
{
  "my_config_value": "foo"
}

// in code:
console.log(__config.my_config_value)
```
Configuration files should be used for any non-sensitive configuration values needed throughout the application which might vary by environment. Consider API endpoints, constants, and flags in scope for the configuration files. Any references to `__config` are replaced at build time with the values from the environment's configuration file using Webpack's DefinePlugin.
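The substitution idea can be sketched with a toy model. This is not the plugin's real implementation, just an illustration of how `__config.<key>` references become literal values at build time:

```typescript
// Toy model of a DefinePlugin-style substitution (illustration only).
const config: Record<string, unknown> = { my_config_value: 'foo' }

function defineReplace(source: string): string {
  // Replace each `__config.<key>` reference with the literal JSON value.
  return source.replace(/__config\.(\w+)/g, (_match, key: string) =>
    JSON.stringify(config[key])
  )
}

console.log(defineReplace('console.log(__config.my_config_value)'))
// → console.log("foo")
```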
We're using Algolia to make the repository searchable. The search index is automatically updated when content changes are pushed in the various content repositories. The scripts to update the search index live in `mktg-content-workflows`: docs, tutorials, and integrations.

The `main` branch and all preview builds use the production Algolia index, `prod_DEVDOT_omni`. To use the staging index, `staging_DEVDOT_omni`, update the `algolia` config value.
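Switching to the staging index might then look like the following config fragment; the nested shape under `algolia` is an assumption, not the repo's actual schema:

```json
{
  "extends": "base",
  "algolia": {
    "indexName": "staging_DEVDOT_omni"
  }
}
```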
## Analytics

Calls to `window.analytics.track()` are logged in development for easy iteration while adding analytics code. If you would prefer to reduce the noise created by these logs, start the app with `NEXT_PUBLIC_ANALYTICS_LOG_LEVEL=0`:

```shell
$ NEXT_PUBLIC_ANALYTICS_LOG_LEVEL=0 npm start
```
## SEO metadata

The meta tags for the site are rendered by the `HeadMetadata` component. Each page which uses `getStaticProps` can return a `metadata` property in its prop object to control the metadata which is ultimately rendered. The root site title is defined in our base config under `dev_dot.meta.title`.

```tsx
export async function getStaticProps() {
  return {
    props: {
      metadata: {
        title: 'My Page', // Will be joined with the root site title
        description: 'This is a cool page',
      },
    },
  }
}
```

Social card images / OpenGraph images live in `/public/og-image/`. Each product should have a `{product}.jpg` file in that folder for its generic card image.
## Performance

We use the Next.js Bundle Analysis GitHub Action to track the size of our JavaScript bundles generated by Next.js's build step. To speed up the execution of the analysis step, we also have a custom build script which prevents the execution of the static generation build step, short-circuiting the Next.js build after the webpack compilation is finished.
## Remote Content & Application Context

This application pulls content from multiple different repositories (remote content) through our Learn API, content API, and integrations API, as well as from the local filesystem and directly from the GitHub API. In order to facilitate development and previewing of this content, the application can be run within the context of one of these source repositories. In this scenario, we want to read content from the filesystem for that specific source. This can be distilled down into three specific contexts that need to be handled for any remote content:

- Running the application in this repository (`hashicorp/dev-portal`): all content is sourced remotely
- Running the application in a content's source repository (e.g. Vault docs in `hashicorp/vault`): all content from that repository is read from the file system
- Running the application in a different source repository (e.g. Waypoint docs in `hashicorp/vault`): content is sourced remotely if not from the current context

> **Note:** For content which is read from the GitHub API, we try to minimize loading this content from the API in source repositories to reduce reliance on GitHub PATs.

If you are wiring up remote data which needs to change its loading strategy depending on the context, you can use `isDeployPreview()` from `lib/env-checks`:

```ts
import { isDeployPreview } from 'lib/env-checks'

isDeployPreview() // in any source repository?
isDeployPreview('vault') // in vault's source repository?
```
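To make the three contexts above concrete, here is a self-contained sketch with the helper stubbed out for illustration. The real `isDeployPreview()` lives in `lib/env-checks` and determines the current repository from the environment; the `currentRepo` parameter below is purely a modeling device:

```typescript
// Stubbed stand-in for lib/env-checks' isDeployPreview() (illustration only).
// `currentRepo` models which source repository, if any, the app is running in.
function isDeployPreview(product?: string, currentRepo?: string): boolean {
  if (!currentRepo) return false // context 1: hashicorp/dev-portal, all remote
  if (!product) return true // we are in *some* source repository
  return product === currentRepo // is this product's own repository?
}

// Context 1: running in hashicorp/dev-portal — content sourced remotely
console.log(isDeployPreview('vault', undefined)) // false → fetch remotely

// Context 2: vault docs built inside hashicorp/vault — read the filesystem
console.log(isDeployPreview('vault', 'vault')) // true → read local files

// Context 3: waypoint docs built inside hashicorp/vault — still remote
console.log(isDeployPreview('waypoint', 'vault')) // false → fetch remotely
```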