BUG: Fixed DataFrame.__repr__ TypeError when column dtype is np.record (GH 48526) #48637

Merged
merged 17 commits on Apr 21, 2023
Changes from 1 commit
Merged main branch and switched to doc/v2.0.0.rst
RaphSku committed Oct 22, 2022
commit 8204a5845721f37de17856e64a52745b58c886e5
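
For context, the bug this PR fixes (GH 48526) concerns repr of a DataFrame whose column holds np.record values raising a TypeError. Below is a minimal sketch of that kind of frame; it is illustrative only (the column name and setup are assumptions) and may not be the exact reproduction from the issue.

import numpy as np
import pandas as pd

# A NumPy record array; iterating it yields np.record objects.
rec = np.rec.array([(1, 2.0), (3, 4.0)], dtype=[("x", "i4"), ("y", "f8")])

# Illustrative setup (not necessarily the issue's exact reproduction):
# an object-dtype column whose values are np.record instances.
df = pd.DataFrame({"records": list(rec)})

# Before the fix, rendering such a frame could raise a TypeError;
# with the fix, repr(df) returns the usual tabular string.
print(repr(df))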
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature_request.yaml
@@ -68,5 +68,5 @@ body:
attributes:
label: Additional Context
description: >
- Please provide any relevant Github issues, code examples or references that help describe and support
+ Please provide any relevant GitHub issues, code examples or references that help describe and support
the feature request.
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -1,4 +1,4 @@
- - [ ] closes #xxxx (Replace xxxx with the Github issue number)
+ - [ ] closes #xxxx (Replace xxxx with the GitHub issue number)
- [ ] [Tests added and passed](https://pandas.pydata.org/pandas-docs/dev/development/contributing_codebase.html#writing-tests) if fixing a bug or adding a new feature
- [ ] All [code checks passed](https://pandas.pydata.org/pandas-docs/dev/development/contributing_codebase.html#pre-commit).
- [ ] Added [type annotations](https://pandas.pydata.org/pandas-docs/dev/development/contributing_codebase.html#type-hints) to new arguments/methods/functions.
2 changes: 1 addition & 1 deletion .github/workflows/macos-windows.yml
@@ -1,4 +1,4 @@
- name: Windows-MacOS
+ name: Windows-macOS

on:
push:
8 changes: 4 additions & 4 deletions .github/workflows/python-dev.yml
@@ -3,7 +3,7 @@
#
# In general, this file will remain frozen(present, but not running) until:
# - The next unreleased Python version has released beta 1
- # - This version should be available on Github Actions.
+ # - This version should be available on GitHub Actions.
# - Our required build/runtime dependencies(numpy, pytz, Cython, python-dateutil)
# support that unreleased Python version.
# To unfreeze, comment out the ``if: false`` condition, and make sure you update
@@ -54,7 +54,7 @@ jobs:
os: [ubuntu-latest, macOS-latest, windows-latest]

name: actions-311-dev
- timeout-minutes: 80
+ timeout-minutes: 120

concurrency:
#https://github.community/t/concurrecy-not-work-for-push/183068/7
@@ -75,7 +75,7 @@ jobs:
run: |
python --version
python -m pip install --upgrade pip setuptools wheel
- python -m pip install git+https://github.com/numpy/numpy.git
+ python -m pip install -i https://pypi.anaconda.org/scipy-wheels-nightly/simple numpy
python -m pip install git+https://github.com/nedbat/coveragepy.git
python -m pip install python-dateutil pytz cython hypothesis==6.52.1 pytest>=6.2.5 pytest-xdist pytest-cov pytest-asyncio>=0.17
python -m pip list
@@ -84,7 +84,7 @@ jobs:
- name: Build Pandas
run: |
python setup.py build_ext -q -j1
- python -m pip install -e . --no-build-isolation --no-use-pep517
+ python -m pip install -e . --no-build-isolation --no-use-pep517 --no-index

- name: Build Version
run: |
54 changes: 54 additions & 0 deletions .github/workflows/scorecards.yml
@@ -0,0 +1,54 @@
name: Scorecards supply-chain security
on:
# Only the default branch is supported.
branch_protection_rule:
schedule:
- cron: '27 19 * * 4'
push:
branches: [ "main" ]

# Declare default permissions as read only.
permissions: read-all

jobs:
analysis:
name: Scorecards analysis
runs-on: ubuntu-latest
permissions:
# Needed to upload the results to code-scanning dashboard.
security-events: write
# Used to receive a badge.
id-token: write

if: github.repository == 'pandas-dev/pandas' # don't run on forks

steps:
- name: "Checkout code"
uses: actions/checkout@v3
with:
persist-credentials: false

- name: "Run analysis"
uses: ossf/scorecard-action@v2.0.3
with:
results_file: results.sarif
results_format: sarif

# Publish the results for public repositories to enable scorecard badges. For more details, see
# https://github.com/ossf/scorecard-action#publishing-results.
publish_results: true

# Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
# format to the repository Actions tab.
- name: "Upload artifact"
uses: actions/upload-artifact@v3
with:
name: SARIF file
path: results.sarif
retention-days: 5

# Upload the results to GitHub's code scanning dashboard.
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@v1
with:
sarif_file: results.sarif
6 changes: 1 addition & 5 deletions .github/workflows/ubuntu.yml
@@ -31,9 +31,7 @@ jobs:
matrix:
env_file: [actions-38.yaml, actions-39.yaml, actions-310.yaml]
pattern: ["not single_cpu", "single_cpu"]
- # Don't test pyarrow v2/3: Causes timeouts in read_csv engine
- # even if tests are skipped/xfailed
- pyarrow_version: ["5", "6", "7"]
+ pyarrow_version: ["7", "8", "9"]
include:
- name: "Downstream Compat"
env_file: actions-38-downstream_compat.yaml
@@ -77,7 +75,6 @@ jobs:
- name: "Numpy Dev"
env_file: actions-310-numpydev.yaml
pattern: "not slow and not network and not single_cpu"
- pandas_testing_mode: "deprecate"
test_args: "-W error::DeprecationWarning:numpy -W error::FutureWarning:numpy"
exclude:
- env_file: actions-39.yaml
@@ -96,7 +93,6 @@ jobs:
EXTRA_APT: ${{ matrix.extra_apt || '' }}
LANG: ${{ matrix.lang || '' }}
LC_ALL: ${{ matrix.lc_all || '' }}
- PANDAS_TESTING_MODE: ${{ matrix.pandas_testing_mode || '' }}
PANDAS_DATA_MANAGER: ${{ matrix.pandas_data_manager || 'block' }}
PANDAS_COPY_ON_WRITE: ${{ matrix.pandas_copy_on_write || '0' }}
TEST_ARGS: ${{ matrix.test_args || '' }}
201 changes: 201 additions & 0 deletions .github/workflows/wheels.yml
@@ -0,0 +1,201 @@
# Workflow to build wheels for upload to PyPI.
# Inspired by numpy's cibuildwheel config https://github.com/numpy/numpy/blob/main/.github/workflows/wheels.yml
#
# In an attempt to save CI resources, wheel builds do
# not run on each push but only weekly and for releases.
# Wheel builds can be triggered from the Actions page
# (if you have the perms) on a commit to master.
#
# Alternatively, you can add labels to the pull request in order to trigger wheel
# builds.
# The label(s) that trigger builds are:
# - Build
name: Wheel builder

on:
schedule:
# ┌───────────── minute (0 - 59)
# │ ┌───────────── hour (0 - 23)
# │ │ ┌───────────── day of the month (1 - 31)
# │ │ │ ┌───────────── month (1 - 12 or JAN-DEC)
# │ │ │ │ ┌───────────── day of the week (0 - 6 or SUN-SAT)
# │ │ │ │ │
- cron: "27 3 */1 * *"
push:
pull_request:
types: [labeled, opened, synchronize, reopened]
workflow_dispatch:

concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true

jobs:
build_wheels:
name: Build wheel for ${{ matrix.python[0] }}-${{ matrix.buildplat[1] }}
if: >-
github.event_name == 'schedule' ||
github.event_name == 'workflow_dispatch' ||
(github.event_name == 'pull_request' &&
contains(github.event.pull_request.labels.*.name, 'Build')) ||
(github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') && ( ! endsWith(github.ref, 'dev0')))
runs-on: ${{ matrix.buildplat[0] }}
strategy:
# Ensure that a wheel builder finishes even if another fails
fail-fast: false
matrix:
# GitHub Actions doesn't support pairing matrix values together, let's improvise
# https://github.com/github/feedback/discussions/7835#discussioncomment-1769026
buildplat:
- [ubuntu-20.04, manylinux_x86_64]
- [macos-11, macosx_*]
- [windows-2019, win_amd64]
- [windows-2019, win32]
# TODO: support PyPy?
python: [["cp38", "3.8"], ["cp39", "3.9"], ["cp310", "3.10"], ["cp311", "3.11-dev"]]# "pp38", "pp39"]
env:
IS_PUSH: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
IS_SCHEDULE_DISPATCH: ${{ github.event_name == 'schedule' || github.event_name == 'workflow_dispatch' }}
steps:
- name: Checkout pandas
uses: actions/checkout@v3
with:
submodules: true
# versioneer.py requires the latest tag to be reachable. Here we
# fetch the complete history to get access to the tags.
# A shallow clone can work when the following issue is resolved:
# https://github.com/actions/checkout/issues/338
fetch-depth: 0

- name: Build wheels
uses: pypa/cibuildwheel@v2.9.0
env:
CIBW_BUILD: ${{ matrix.python[0] }}-${{ matrix.buildplat[1] }}

# Used to test the built wheels
- uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python[1] }}

- name: Test wheels (Windows 64-bit only)
if: ${{ matrix.buildplat[1] == 'win_amd64' }}
shell: cmd
run: |
python ci/test_wheels.py wheelhouse

- uses: actions/upload-artifact@v3
with:
name: ${{ matrix.python[0] }}-${{ startsWith(matrix.buildplat[1], 'macosx') && 'macosx' || matrix.buildplat[1] }}
path: ./wheelhouse/*.whl

# Used to push the built wheels
# TODO: once Python 3.11 is available on conda, de-dup with
# setup python above
- uses: conda-incubator/setup-miniconda@v2
with:
auto-update-conda: true
# Really doesn't matter what version we upload with
# just the version we test with
python-version: '3.8'
channels: conda-forge
channel-priority: true
mamba-version: "*"

- name: Install anaconda client
if: ${{ success() && (env.IS_SCHEDULE_DISPATCH == 'true' || env.IS_PUSH == 'true') }}
run: conda install -q -y anaconda-client


- name: Upload wheels
if: success()
shell: bash -el {0}
env:
PANDAS_STAGING_UPLOAD_TOKEN: ${{ secrets.PANDAS_STAGING_UPLOAD_TOKEN }}
PANDAS_NIGHTLY_UPLOAD_TOKEN: ${{ secrets.PANDAS_NIGHTLY_UPLOAD_TOKEN }}
run: |
source ci/upload_wheels.sh
set_upload_vars
# trigger an upload to
# https://anaconda.org/scipy-wheels-nightly/pandas
# for cron jobs or "Run workflow" (restricted to main branch).
# Tags will upload to
# https://anaconda.org/multibuild-wheels-staging/pandas
# The tokens were originally generated at anaconda.org
upload_wheels
build_sdist:
name: Build sdist
if: >-
github.event_name == 'schedule' ||
github.event_name == 'workflow_dispatch' ||
(github.event_name == 'pull_request' &&
contains(github.event.pull_request.labels.*.name, 'Build')) ||
(github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') && ( ! endsWith(github.ref, 'dev0')))
runs-on: ubuntu-latest
env:
IS_PUSH: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
IS_SCHEDULE_DISPATCH: ${{ github.event_name == 'schedule' || github.event_name == 'workflow_dispatch' }}
steps:
- name: Checkout pandas
uses: actions/checkout@v3
with:
submodules: true
# versioneer.py requires the latest tag to be reachable. Here we
# fetch the complete history to get access to the tags.
# A shallow clone can work when the following issue is resolved:
# https://github.com/actions/checkout/issues/338
fetch-depth: 0

# Used to push the built sdist
- uses: conda-incubator/setup-miniconda@v2
with:
auto-update-conda: true
# Really doesn't matter what version we upload with
# just the version we test with
python-version: '3.8'
channels: conda-forge
channel-priority: true
mamba-version: "*"

- name: Build sdist
run: |
pip install build
python -m build --sdist
- name: Test the sdist
shell: bash -el {0}
run: |
# TODO: Don't run test suite, and instead build wheels from sdist
# by splitting the wheel builders into a two stage job
# (1. Generate sdist 2. Build wheels from sdist)
# This tests the sdists, and saves some build time
python -m pip install dist/*.gz
pip install hypothesis==6.52.1 pytest>=6.2.5 pytest-xdist pytest-asyncio>=0.17
cd .. # Not a good idea to test within the src tree
python -c "import pandas; print(pandas.__version__);
pandas.test(extra_args=['-m not clipboard and not single_cpu', '--skip-slow', '--skip-network', '--skip-db', '-n=2']);
pandas.test(extra_args=['-m not clipboard and single_cpu', '--skip-slow', '--skip-network', '--skip-db'])"
- uses: actions/upload-artifact@v3
with:
name: sdist
path: ./dist/*

- name: Install anaconda client
if: ${{ success() && (env.IS_SCHEDULE_DISPATCH == 'true' || env.IS_PUSH == 'true') }}
run: |
conda install -q -y anaconda-client

- name: Upload sdist
if: success()
shell: bash -el {0}
env:
PANDAS_STAGING_UPLOAD_TOKEN: ${{ secrets.PANDAS_STAGING_UPLOAD_TOKEN }}
PANDAS_NIGHTLY_UPLOAD_TOKEN: ${{ secrets.PANDAS_NIGHTLY_UPLOAD_TOKEN }}
run: |
source ci/upload_wheels.sh
set_upload_vars
# trigger an upload to
# https://anaconda.org/scipy-wheels-nightly/pandas
# for cron jobs or "Run workflow" (restricted to main branch).
# Tags will upload to
# https://anaconda.org/multibuild-wheels-staging/pandas
# The tokens were originally generated at anaconda.org
upload_wheels
You are viewing a condensed version of this merge commit.