```shell
# Run `gem install a b` to install gem packages 'a' and 'b'.
gem install jekyll bundler
# Run `bundle install` to install missing gems added in the Gemfile.
bundle install
jekyll -v
# jekyll serve
bundle exec jekyll serve
```

Then browse to http://localhost:4000.
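`bundle install` reads the project's Gemfile. A minimal sketch of such a Gemfile (gem versions here are placeholders, not necessarily what this site pins):

```ruby
source "https://rubygems.org"

gem "jekyll"
gem "bundler"
```

Running `bundle exec jekyll serve` then uses exactly the gem versions resolved into `Gemfile.lock`, rather than whatever is globally installed.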
```shell
docker build -t my-jekyll-env -f Dockerfile .
docker run --name my-jekyll-env --mount type=bind,source=$(pwd),target=/src -p 4000:4000 -it my-jekyll-env
```

```shell
docker compose build
docker compose up
```

Then navigate to http://127.0.0.1:4000.
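The Dockerfile referenced above is not shown here; a minimal sketch consistent with the `docker run` flags (the base image tag and serve options are assumptions) might look like:

```dockerfile
FROM ruby:3.1

# Install Jekyll and Bundler in the image.
RUN gem install jekyll bundler

# The bind mount in the docker run command targets /src.
WORKDIR /src

EXPOSE 4000

# Bind to all interfaces so the published port 4000 is reachable from the host.
CMD ["sh", "-c", "bundle install && bundle exec jekyll serve --host 0.0.0.0"]
```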
Hosted at https://liuning0820.github.io/
Netlify is a cloud computing company that offers hosting and serverless backend services for static websites. It features continuous deployment from Git across a global application delivery network, serverless form handling, support for AWS Lambda functions, and full integration with Let's Encrypt.
The site can be managed from the Netlify Site Dashboard.
Deployed at https://liuning0820.netlify.com/
```
3:44:21 PM: ffi-1.17.0-x86_64-linux-musl requires rubygems version >= 3.3.22, which is
3:44:21 PM: incompatible with the current version, 3.0.8
3:44:21 PM: Error during gem install
3:44:21 PM: Failing build: Failed to install dependencies
```
- Solution: set a `RUBY_VERSION` environment variable, e.g. `RUBY_VERSION=3.1.1`. See https://docs.netlify.com/configure-builds/manage-dependencies/#ruby
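The variable can be set in the Netlify UI, or in a `netlify.toml` at the repository root. A sketch (the `command` and `publish` values are assumptions about this site's build, not confirmed by the log above):

```toml
[build]
  command = "bundle exec jekyll build"
  publish = "_site"

[build.environment]
  # Pin the Ruby version so the build image's rubygems is new enough for ffi.
  RUBY_VERSION = "3.1.1"
```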
- Jekyll
- Jekyll Themes
- HTML Proofer - Validates HTML links, used in the CI/CD pipeline. Installed as a Ruby gem.
- HTML Proofer Usage
- Markdown Spellcheck - Runs spell check in the CI/CD pipeline. Installed via Node.js npm.
- GitHub Pages
- HTML Presentations - reveal.js
- Search Feature - lunr.js
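As a sketch of how the two checks above might run in a CI/CD pipeline (the workflow file name, action version, and file glob are assumptions, not the repository's actual pipeline):

```yaml
# .github/workflows/checks.yml (hypothetical)
name: checks
on: [push]

jobs:
  checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # HTML Proofer is a Ruby gem: build the site, then validate its links.
      - run: |
          gem install html-proofer
          bundle install
          bundle exec jekyll build
          htmlproofer ./_site --disable-external
      # Markdown Spellcheck is installed via npm; check all Markdown files.
      - run: |
          npm install -g markdown-spellcheck
          mdspell -r -n -a --en-us "**/*.md"
```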
The collection folder should start with an underscore.
Your Pages site will use the layout and styles from the Jekyll theme you have selected in your repository settings. The name of this theme is saved in the Jekyll `_config.yml` configuration file.
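For example (the theme name below is a placeholder, not necessarily the one this site uses):

```yaml
# _config.yml
theme: jekyll-theme-minimal
```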
Silently generate a `sitemap.xml` for your Jekyll site.
https://liuning0820.github.io/sitemap.xml
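This is typically done with the `jekyll-sitemap` plugin (a sketch, assuming that plugin is what generates the file here), enabled in `_config.yml` with the gem also added to the Gemfile:

```yaml
# _config.yml
plugins:
  - jekyll-sitemap
```

With the plugin enabled, the sitemap is regenerated on every build with no further configuration.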
Having trouble with Pages? Check out the documentation and it will help you sort it out.
- e.g. http://localhost:4000/ex_presentations/2021-05_a_clean_dev_env/?print-pdf#/
- e.g. https://liuning0820.github.io/ex_presentations/2021-05_a_clean_dev_env/?print-pdf#/
The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling.
`robots.txt` tells search engines which pages of your site may be crawled and which may not. For example, the following disallows crawling of the entire site:

```
User-agent: *
Disallow: /
```
Track your site's visit counts and related analytics data.