diff --git a/RELEASING/README.md b/RELEASING/README.md
index 46913d55ecad8..470c890ba9a7d 100644
--- a/RELEASING/README.md
+++ b/RELEASING/README.md
@@ -67,6 +67,32 @@ need to be done at every release.
 svn update
 ```
 
+To minimize the risk of interfering with your local development environment, it's recommended to work on the
+release in a different directory than where the devenv is located. In this example, we'll clone
+the repo directly from the main `apache/superset` repo to a new directory `superset-release`:
+
+```bash
+cd <your projects path>
+git clone git@github.com:apache/superset.git superset-release
+cd superset-release
+```
+
+We recommend setting up a virtual environment to isolate the Python dependencies from your main
+setup:
+
+```bash
+virtualenv venv
+source venv/bin/activate
+```
+
+In addition, we recommend using the [`cherrytree`](https://pypi.org/project/cherrytree/) tool for
+automating cherry picking, as it will help speed up the release process. To install `cherrytree`
+and other dependencies that are required for the release process, run the following command:
+
+```bash
+pip install -r RELEASING/requirements.txt
+```
+
 ## Setting up the release environment (do every time)
 
 As the vote process takes a minimum of 72h, sometimes stretching over several weeks
@@ -78,35 +104,41 @@ the wrong files/using wrong names. There's a script to help you set correctly al
 necessary environment variables. Change your current directory to `superset/RELEASING`
 and execute the `set_release_env.sh` script with the relevant parameters:
 
-Usage (BASH):
+Usage (macOS/ZSH):
+
 ```bash
-. set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
+cd RELEASING
+source set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
 ```
 
-Usage (ZSH):
+Usage (BASH):
+
 ```bash
-source set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
+. set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
 ```
 
 Example:
+
 ```bash
-source set_release_env.sh 0.38.0rc1 myid@apache.org
+source set_release_env.sh 1.5.1rc1 myid@apache.org
 ```
 
-The script will output the exported variables. Here's example for 0.38.0rc1:
+The script will output the exported variables. Here's an example for 1.5.1rc1:
 
 ```
+-------------------------------
 Set Release env variables
-SUPERSET_VERSION=0.38.0
+SUPERSET_VERSION=1.5.1
 SUPERSET_RC=1
-SUPERSET_GITHUB_BRANCH=0.38
-SUPERSET_PGP_FULLNAME=myid@apache.org
-SUPERSET_VERSION_RC=0.38.0rc1
-SUPERSET_RELEASE=apache-superset-0.38.0
-SUPERSET_RELEASE_RC=apache-superset-0.38.0rc1
-SUPERSET_RELEASE_TARBALL=apache-superset-0.38.0-source.tar.gz
-SUPERSET_RELEASE_RC_TARBALL=apache-superset-0.38.0rc1-source.tar.gz
-SUPERSET_TMP_ASF_SITE_PATH=/tmp/superset-site-0.38.0
+SUPERSET_GITHUB_BRANCH=1.5
+SUPERSET_PGP_FULLNAME=myid@apache.org
+SUPERSET_VERSION_RC=1.5.1rc1
+SUPERSET_RELEASE=apache-superset-1.5.1
+SUPERSET_RELEASE_RC=apache-superset-1.5.1rc1
+SUPERSET_RELEASE_TARBALL=apache-superset-1.5.1-source.tar.gz
+SUPERSET_RELEASE_RC_TARBALL=apache-superset-1.5.1rc1-source.tar.gz
+SUPERSET_TMP_ASF_SITE_PATH=/tmp/incubator-superset-site-1.5.1
+-------------------------------
 ```
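+
+To double-check that these variables are present in your current shell before proceeding, one quick
+sanity check is to list them (any POSIX shell works here):
+
+```bash
+env | grep '^SUPERSET_'
+```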
 
 ## Crafting a source release
 
@@ -116,41 +148,133 @@ a branch named with the release MAJOR.MINOR version (on this example 0.37).
 This new branch will hold all PATCH and release candidates
 that belong to the MAJOR.MINOR version.
 
+### Creating an initial minor release (e.g. 1.5.0)
+
 The MAJOR.MINOR branch is normally a "cut" from a specific point in time from the master branch.
-Then (if needed) apply all cherries that will make the PATCH.
+When creating the initial minor release (e.g. 1.5.0), create a new branch:
+
+```bash
+git checkout master
+git pull
+git checkout -b ${SUPERSET_GITHUB_BRANCH}
+git push origin $SUPERSET_GITHUB_BRANCH
+```
+
+Note that this initializes a new "release cut", and is NOT needed when creating a patch release
+(e.g. 1.5.1).
+
+### Creating a patch release (e.g. 1.5.1)
+
+When getting ready to bake a patch release, simply check out the relevant branch:
+
+```bash
+git checkout master
+git pull
+git checkout ${SUPERSET_GITHUB_BRANCH}
+```
+
+### Cherry picking
+
+It is customary to label PRs that have been introduced after the cut with the label
+`v<MAJOR>.<MINOR>`. For example, for any PRs that should be included in the 1.5 branch, the
+label `v1.5` should be added.
+
+To see how well the labelled PRs would apply to the current branch, run the following command:
+
+```bash
+cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} ${SUPERSET_GITHUB_BRANCH}
+```
+
+This requires the presence of an environment variable `GITHUB_TOKEN`. Alternatively,
+you can pass the token directly via the `--access-token` parameter (`-at` for short).
+
+#### Happy path: no conflicts
+
+This will show how many cherries will apply cleanly. If there are no conflicts, you can simply apply all cherries
+by adding the `--no-dry-run` flag (`-nd` for short):
 
 ```bash
-git checkout -b $SUPERSET_GITHUB_BRANCH
-git push upstream $SUPERSET_GITHUB_BRANCH
+cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} -nd ${SUPERSET_GITHUB_BRANCH}
 ```
 
+#### Resolving conflicts
+
+If there are conflicts, you can tell `cherrytree` to apply all cherries up until the first conflict and then
+stop by adding the `--error-mode break` flag (`-e break` for short):
+
+```bash
+cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} -nd -e break ${SUPERSET_GITHUB_BRANCH}
+```
+
+After applying the cleanly merged cherries, `cherrytree` will specify the SHA of the conflicted cherry. To resolve the conflict,
+simply issue the following command:
+
+```bash
+git cherry-pick <SHA>
+```
+
+Then fix all conflicts, followed by
+
+```bash
+git add -u # add all changes
+git cherry-pick --continue
+```
+
+After this, rerun all the above steps until all cherries have been picked, finally pushing all new commits to the release branch
+on the main repo:
+
+```bash
+git push
+```
+
+### Updating changelog
+
 Next, update the `CHANGELOG.md` with all the changes that are included in the release.
-Make sure the branch has been pushed to `upstream` to ensure the changelog generator
+Make sure the branch has been pushed to `origin` to ensure the changelog generator
 can pick up changes since the previous release.
 
-Change log script requires a github token and will try to use your env var GITHUB_TOKEN.
-you can also pass the token using the parameter `--access_token`.
+Similar to `cherrytree`, the change log script requires a GitHub token, either as an env var
+(`GITHUB_TOKEN`) or as the parameter `--access_token`.
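+
+Both `cherrytree` and the changelog script read this token from the environment. For example, assuming
+you have already created a GitHub personal access token, you can export it for the current shell session
+before running them:
+
+```bash
+export GITHUB_TOKEN="<your GitHub personal access token>"
+```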
+
+#### Initial release (e.g. 1.5.0)
+
+When generating the changelog for an initial minor release, you should compare with
+the previous release (in the example, the previous release branch is `1.4`, so remember to
+update it accordingly):
 
-Example:
 ```bash
-python changelog.py --previous_version 0.37 --current_version 0.38 changelog
+python changelog.py --previous_version 1.4 --current_version ${SUPERSET_GITHUB_BRANCH} changelog
 ```
 
 You can get a list of pull requests with labels started with blocking, risk, hold, revert and
 security by using the parameter `--risk`.
 Example:
+
 ```bash
 python changelog.py --previous_version 0.37 --current_version 0.38 changelog --access_token {GITHUB_TOKEN} --risk
 ```
 
-The script will checkout both branches and compare all the PR's, copy the output and paste it on the `CHANGELOG.md`
+The script will check out both branches, compare all the PRs, and output the lines that need to be added to the
+`CHANGELOG.md` file in the root of the repo. Remember to also update the branch id (with the above command,
+`1.5` needs to be changed to `1.5.0`).
 
 Then, in `UPDATING.md`, a file that contains a list of notifications around deprecations and upgrading-related
 topics, make sure to move the content now under the `Next Version` section under a new section for the new release.
 
-Finally bump the version number on `superset-frontend/package.json` (replace with whichever version is being released excluding the RC version):
+#### Patch release (e.g. 1.5.1)
+
+To compare the forthcoming patch release with the latest release from the same branch, set
+`--previous_version` to the tag of the previous release (in this example `1.5.0`; remember to update it accordingly):
+
+```bash
+python changelog.py --previous_version 1.5.0 --current_version ${SUPERSET_GITHUB_BRANCH} changelog
+```
+
+### Set version number
+
+Finally, bump the version number on `superset-frontend/package.json` (replace it with whichever version is being released, excluding the RC version):
 
-```json
+```
 "version": "0.38.0"
 ```
 
@@ -162,9 +286,13 @@ git add ...
 git commit ...
 # push new tag
 git tag ${SUPERSET_VERSION_RC}
-git push upstream ${SUPERSET_VERSION_RC}
+git push origin ${SUPERSET_VERSION_RC}
 ```
 
+### Create a release on GitHub
+
+After submitting the tag, follow the steps [here](https://docs.github.com/en/repositories/releasing-projects-on-github/managing-releases-in-a-repository) to create the release. Use the vote email text as the content for the release description. Make sure to check the "This is a pre-release" checkbox for release candidates. You can check previous releases if you need an example.
+
 ## Preparing the release candidate
 
 The first step of preparing an Apache Release is packaging a release candidate
@@ -180,14 +308,15 @@ the tag and create a signed source tarball from it:
 
 Note that `make_tarball.sh`:
 
-- By default assumes you have already executed an SVN checkout to `$HOME/svn/superset_dev`.
-This can be overridden by setting `SUPERSET_SVN_DEV_PATH` environment var to a different svn dev directory
+- By default, the script assumes you have already executed an SVN checkout to `$HOME/svn/superset_dev`.
+ This can be overridden by setting `SUPERSET_SVN_DEV_PATH` environment var to a different svn dev directory - Will refuse to craft a new release candidate if a release already exists on your local svn dev directory - Will check `package.json` version number and fails if it's not correctly set ### Build and test the created source tarball To build and run the **local copy** of the recently created tarball: + ```bash # Build and run a release candidate tarball ./test_run_tarball.sh local @@ -209,6 +338,7 @@ svn update ### Build and test from SVN source tarball To build and run the recently created tarball **from SVN**: + ```bash # Build and run a release candidate tarball ./test_run_tarball.sh @@ -217,6 +347,7 @@ To build and run the recently created tarball **from SVN**: ``` ### Voting + Now you're ready to start the [VOTE] thread. Here's an example of a previous release vote thread: https://lists.apache.org/thread.html/e60f080ebdda26896214f7d3d5be1ccadfab95d48fbe813252762879@ @@ -225,17 +356,10 @@ To easily send a voting request to Superset community, still on the `superset/RE ```bash # Note: use Superset's virtualenv -(venv)$ python send_email.py vote_pmc +(venv)$ python generate_email.py vote_pmc ``` -The script will interactively ask for extra information so it can authenticate on the Apache Email Relay. -The release version and release candidate number are fetched from the previously set environment variables. - -``` -Sender email (ex: user@apache.org): your_apache_email@apache.org -Apache username: your_apache_user -Apache password: your_apache_password -``` +The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version and release candidate number are fetched from the previously set environment variables. Once 3+ binding votes (by PMC members) have been cast and at least 72 hours have past, you can post a [RESULT] thread: @@ -245,23 +369,20 @@ To easily send the result email, still on the `superset/RELEASING` directory: ```bash # Note: use Superset's virtualenv -python send_email.py result_pmc +python generate_email.py result_pmc ``` The script will interactively ask for extra information needed to fill out the email template. Based on the voting description, it will generate a passing, non passing or non conclusive email. -here's an example: +Here's an example: ``` -Sender email (ex: user@apache.org): your_apache_email@apache.org -Apache username: your_apache_user -Apache password: your_apache_password A List of people with +1 binding vote (ex: Max,Grace,Krist): Daniel,Alan,Max,Grace A List of people with +1 non binding vote (ex: Ville): Ville A List of people with -1 vote (ex: John): ``` -Following the result thread, yet another [VOTE] thread should be +The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version and release candidate number are fetched from the previously set environment variables. ### Validating a release @@ -270,6 +391,7 @@ https://www.apache.org/info/verification.html ## Publishing a successful release Upon a successful vote, you'll have to copy the folder into the non-"dev/" folder. + ```bash cp -r ~/svn/superset_dev/${SUPERSET_VERSION_RC}/ ~/svn/superset/${SUPERSET_VERSION}/ cd ~/svn/superset/ @@ -281,6 +403,7 @@ svn update ``` Then tag the final release: + ```bash # Go to the root directory of the repo, e.g. 
`~/src/superset` cd ~/src/superset/ @@ -289,7 +412,7 @@ git branch # Create the release tag git tag -f ${SUPERSET_VERSION} # push the tag to the remote -git push upstream ${SUPERSET_VERSION} +git push origin ${SUPERSET_VERSION} ``` ### Update CHANGELOG and UPDATING on superset @@ -312,9 +435,11 @@ Once it's all done, an [ANNOUNCE] thread announcing the release to the dev@ mail ```bash # Note use Superset's virtualenv -python send_email.py announce +python generate_email.py announce ``` +The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version is fetched from the previously set environment variables. + ### GitHub Release Finally, so the GitHub UI reflects the latest release, you should create a release from the diff --git a/RELEASING/changelog.py b/RELEASING/changelog.py index 5d4f346c8edfb..0729853ba57e9 100644 --- a/RELEASING/changelog.py +++ b/RELEASING/changelog.py @@ -164,6 +164,12 @@ def _is_risk_pull_request(self, labels: List[Any]) -> bool: return False def _get_changelog_version_head(self) -> str: + if not len(self._logs): + print( + f"No changes found between revisions. " + f"Make sure your branch is up to date." + ) + sys.exit(1) return f"### {self._version} ({self._logs[0].time})" def _parse_change_log( diff --git a/RELEASING/email_templates/announce.j2 b/RELEASING/email_templates/announce.j2 index 80038630d5e0b..5d5b2f67aff3c 100644 --- a/RELEASING/email_templates/announce.j2 +++ b/RELEASING/email_templates/announce.j2 @@ -17,7 +17,7 @@ under the License. -#} To: {{ receiver_email }} -From: {{ sender_email }} + Subject: [ANNOUNCE] Apache {{ project_name }} version {{ version }} Released Hello Community, diff --git a/RELEASING/email_templates/result_pmc.j2 b/RELEASING/email_templates/result_pmc.j2 index be88047524d41..37b7eeda532e9 100644 --- a/RELEASING/email_templates/result_pmc.j2 +++ b/RELEASING/email_templates/result_pmc.j2 @@ -17,7 +17,7 @@ under the License. -#} To: {{ receiver_email }} -From: {{ sender_email }} + Subject: [RESULT] [VOTE] Release Apache {{ project_name }} {{ version }} based on Superset {{ version_rc }} Thanks to everyone that participated. The vote to release diff --git a/RELEASING/email_templates/vote_pmc.j2 b/RELEASING/email_templates/vote_pmc.j2 index 3b2cc1363e38d..a6ebda72c24c4 100644 --- a/RELEASING/email_templates/vote_pmc.j2 +++ b/RELEASING/email_templates/vote_pmc.j2 @@ -17,7 +17,7 @@ under the License. -#} To: {{ receiver_email }} -From: {{ sender_email }} + Subject: [VOTE] Release Apache {{ project_name }} {{ version }} based on Superset {{ version_rc }} Hello {{ project_name }} Community, diff --git a/RELEASING/send_email.py b/RELEASING/generate_email.py similarity index 53% rename from RELEASING/send_email.py rename to RELEASING/generate_email.py index a4b4a449665f9..92536670cda6d 100755 --- a/RELEASING/send_email.py +++ b/RELEASING/generate_email.py @@ -15,9 +15,7 @@ # See the License for the specific language governing permissions and # limitations under the License. 
# -import smtplib -import ssl -from typing import Any, Dict, List, Optional +from typing import Any, Dict, List from click.core import Context @@ -30,9 +28,7 @@ except ModuleNotFoundError: exit("Click is a required dependency for this script") - -SMTP_PORT = 587 -SMTP_SERVER = "mail-relay.apache.org" +RECEIVER_EMAIL = "dev@superset.apache.org" PROJECT_NAME = "Superset" PROJECT_MODULE = "superset" PROJECT_DESCRIPTION = "Apache Superset is a modern, enterprise-ready business intelligence web application" @@ -44,25 +40,6 @@ def string_comma_to_list(message: str) -> List[str]: return [element.strip() for element in message.split(",")] -def send_email( - smtp_server: str, - smpt_port: int, - username: str, - password: str, - sender_email: str, - receiver_email: str, - message: str, -) -> None: - """ - Send a simple text email (SMTP) - """ - context = ssl.create_default_context() - with smtplib.SMTP(smtp_server, smpt_port) as server: - server.starttls(context=context) - server.login(username, password) - server.sendmail(sender_email, receiver_email, message) - - def render_template(template_file: str, **kwargs: Any) -> str: """ Simple render template based on named parameters @@ -75,122 +52,49 @@ def render_template(template_file: str, **kwargs: Any) -> str: return template.render(kwargs) -def inter_send_email( - username: str, password: str, sender_email: str, receiver_email: str, message: str -) -> None: - print("--------------------------") - print("SMTP Message") - print("--------------------------") - print(message) - print("--------------------------") - confirm = input("Is the Email message ok? (yes/no): ") - if confirm not in ("Yes", "yes", "y"): - exit("Exit by user request") - - try: - send_email( - SMTP_SERVER, - SMTP_PORT, - username, - password, - sender_email, - receiver_email, - message, - ) - print("Email sent successfully") - except smtplib.SMTPAuthenticationError: - exit("SMTP User authentication error, Email not sent!") - except Exception as e: - exit(f"SMTP exception {e}") - - class BaseParameters(object): def __init__( self, - email: str, - username: str, - password: str, version: str, version_rc: str, ) -> None: - self.email = email - self.username = username - self.password = password self.version = version self.version_rc = version_rc self.template_arguments: Dict[str, Any] = {} def __repr__(self) -> str: - return f"Apache Credentials: {self.email}/{self.username}/{self.version}/{self.version_rc}" + return f"Apache Credentials: {self.version}/{self.version_rc}" @click.group() @click.pass_context -@click.option( - "--apache_email", - prompt="Apache Email", - help="Your Apache email this will be used for SMTP From", -) -@click.option( - "--apache_username", prompt="Apache username", help="Your LDAP Apache username" -) -@click.option( - "--apache_password", - prompt="Apache password", - hide_input=True, - help="Your LDAP Apache password", -) @click.option("--version", envvar="SUPERSET_VERSION") @click.option("--version_rc", envvar="SUPERSET_VERSION_RC") def cli( ctx: Context, - apache_email: str, - apache_username: str, - apache_password: str, version: str, version_rc: str, ) -> None: """Welcome to releasing send email CLI interface!""" - base_parameters = BaseParameters( - apache_email, apache_username, apache_password, version, version_rc - ) + base_parameters = BaseParameters(version, version_rc) + base_parameters.template_arguments["receiver_email"] = RECEIVER_EMAIL base_parameters.template_arguments["project_name"] = PROJECT_NAME 
base_parameters.template_arguments["project_module"] = PROJECT_MODULE base_parameters.template_arguments["project_description"] = PROJECT_DESCRIPTION base_parameters.template_arguments["version"] = base_parameters.version base_parameters.template_arguments["version_rc"] = base_parameters.version_rc - base_parameters.template_arguments["sender_email"] = base_parameters.email ctx.obj = base_parameters @cli.command("vote_pmc") -@click.option( - "--receiver_email", - default="dev@superset.apache.org", - type=str, - prompt="The receiver email (To:)", -) @click.pass_obj -def vote_pmc(base_parameters: BaseParameters, receiver_email: str) -> None: +def vote_pmc(base_parameters: BaseParameters) -> None: template_file = "email_templates/vote_pmc.j2" - base_parameters.template_arguments["receiver_email"] = receiver_email message = render_template(template_file, **base_parameters.template_arguments) - inter_send_email( - base_parameters.username, - base_parameters.password, - base_parameters.template_arguments["sender_email"], - base_parameters.template_arguments["receiver_email"], - message, - ) + print(message) @cli.command("result_pmc") -@click.option( - "--receiver_email", - default="dev@superset.apache.org", - type=str, - prompt="The receiver email (To:)", -) @click.option( "--vote_bindings", default="", @@ -219,14 +123,12 @@ def vote_pmc(base_parameters: BaseParameters, receiver_email: str) -> None: @click.pass_obj def result_pmc( base_parameters: BaseParameters, - receiver_email: str, vote_bindings: str, vote_nonbindings: str, vote_negatives: str, vote_thread: str, ) -> None: template_file = "email_templates/result_pmc.j2" - base_parameters.template_arguments["receiver_email"] = receiver_email base_parameters.template_arguments["vote_bindings"] = string_comma_to_list( vote_bindings ) @@ -238,34 +140,15 @@ def result_pmc( ) base_parameters.template_arguments["vote_thread"] = vote_thread message = render_template(template_file, **base_parameters.template_arguments) - inter_send_email( - base_parameters.username, - base_parameters.password, - base_parameters.template_arguments["sender_email"], - base_parameters.template_arguments["receiver_email"], - message, - ) + print(message) @cli.command("announce") -@click.option( - "--receiver_email", - default="dev@superset.apache.org", - type=str, - prompt="The receiver email (To:)", -) @click.pass_obj -def announce(base_parameters: BaseParameters, receiver_email: str) -> None: +def announce(base_parameters: BaseParameters) -> None: template_file = "email_templates/announce.j2" - base_parameters.template_arguments["receiver_email"] = receiver_email message = render_template(template_file, **base_parameters.template_arguments) - inter_send_email( - base_parameters.username, - base_parameters.password, - base_parameters.template_arguments["sender_email"], - base_parameters.template_arguments["receiver_email"], - message, - ) + print(message) cli() diff --git a/RELEASING/requirements.txt b/RELEASING/requirements.txt new file mode 100644 index 0000000000000..bd3586e04c0d3 --- /dev/null +++ b/RELEASING/requirements.txt @@ -0,0 +1,19 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. 
You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +cherrytree +jinja2 diff --git a/RESOURCES/INTHEWILD.md b/RESOURCES/INTHEWILD.md index f9155d00bb326..61c000bf36f7f 100644 --- a/RESOURCES/INTHEWILD.md +++ b/RESOURCES/INTHEWILD.md @@ -133,6 +133,7 @@ Join our growing community! ### Healthcare - [Amino](https://amino.com) [@shkr] +- [Beans](https://www.beans.fi) [@kakoni] - [Care](https://www.getcare.io/)[@alandao2021] - [Living Goods](https://www.livinggoods.org) [@chelule] - [Maieutical Labs](https://maieuticallabs.it) [@xrmx] diff --git a/docs/docs/databases/clickhouse.mdx b/docs/docs/databases/clickhouse.mdx index 1ece9186f1fbd..e717b60bf0b24 100644 --- a/docs/docs/databases/clickhouse.mdx +++ b/docs/docs/databases/clickhouse.mdx @@ -1,13 +1,13 @@ --- -title: Clickhouse +title: ClickHouse hide_title: true sidebar_position: 15 version: 1 --- -## Clickhouse +## ClickHouse -To use Clickhouse with Superset, you will need to add the following Python libraries: +To use ClickHouse with Superset, you will need to add the following Python libraries: ``` clickhouse-driver==0.2.0 @@ -21,7 +21,7 @@ clickhouse-driver>=0.2.0 clickhouse-sqlalchemy>=0.1.6 ``` -The recommended connector library for Clickhouse is +The recommended connector library for ClickHouse is [sqlalchemy-clickhouse](https://github.com/cloudflare/sqlalchemy-clickhouse). The expected connection string is formatted as follows: diff --git a/docs/docs/installation/sql-templating.mdx b/docs/docs/installation/sql-templating.mdx index 8908d39f0280e..905f99cb9f5de 100644 --- a/docs/docs/installation/sql-templating.mdx +++ b/docs/docs/installation/sql-templating.mdx @@ -273,3 +273,27 @@ Here's a concrete example: superiors order by lineage, level ``` + +**Datasets** + +It's possible to query physical and virtual datasets using the `dataset` macro. This is useful if you've defined computed columns and metrics on your datasets, and want to reuse the definition in adhoc SQL Lab queries. + +To use the macro, first you need to find the ID of the dataset. This can be done by going to the view showing all the datasets, hovering over the dataset you're interested in, and looking at its URL. For example, if the URL for a dataset is https://superset.example.org/superset/explore/table/42/ its ID is 42. + +Once you have the ID you can query it as if it were a table: + +``` +SELECT * FROM {{ dataset(42) }} LIMIT 10 +``` + +If you want to select the metric definitions as well, in addition to the columns, you need to pass an additional keyword argument: + +``` +SELECT * FROM {{ dataset(42, include_metrics=True) }} LIMIT 10 +``` + +Since metrics are aggregations, the resulting SQL expression will be grouped by all non-metric columns. 
You can specify a subset of columns to group by instead: + +``` +SELECT * FROM {{ dataset(42, include_metrics=True, columns=["ds", "category"]) }} LIMIT 10 +``` diff --git a/docs/docs/security.mdx b/docs/docs/security.mdx index 4f2618773163b..0067f196cb1f9 100644 --- a/docs/docs/security.mdx +++ b/docs/docs/security.mdx @@ -49,7 +49,7 @@ to all databases by default, both **Alpha** and **Gamma** users need to be given To allow logged-out users to access some Superset features, you can use the `PUBLIC_ROLE_LIKE` config setting and assign it to another role whose permissions you want passed to this role. -For example, by setting `PUBLIC_ROLE_LIKE = Gamma` in your `superset_config.py` file, you grant +For example, by setting `PUBLIC_ROLE_LIKE = "Gamma"` in your `superset_config.py` file, you grant public role the same set of permissions as for the **Gamma** role. This is useful if one wants to enable anonymous users to view dashboards. Explicit grant on specific datasets is still required, meaning that you need to edit the **Public** role and add the public data sources to the role manually. diff --git a/docs/package.json b/docs/package.json index 420aba6394438..0d0953de1b40b 100644 --- a/docs/package.json +++ b/docs/package.json @@ -38,7 +38,7 @@ "react-dom": "^17.0.1", "react-github-btn": "^1.2.0", "stream": "^0.0.2", - "swagger-ui-react": "^4.1.2", + "swagger-ui-react": "^4.1.3", "url-loader": "^4.1.1" }, "devDependencies": { diff --git a/docs/yarn.lock b/docs/yarn.lock index b87498c11780e..2c788f210c475 100644 --- a/docs/yarn.lock +++ b/docs/yarn.lock @@ -2183,23 +2183,7 @@ "@babel/helper-validator-option" "^7.16.7" "@babel/plugin-transform-typescript" "^7.16.7" -"@babel/runtime-corejs3@^7.11.2", "@babel/runtime-corejs3@^7.16.3": - version "7.16.3" - resolved "https://registry.npmjs.org/@babel/runtime-corejs3/-/runtime-corejs3-7.16.3.tgz" - integrity sha512-IAdDC7T0+wEB4y2gbIL0uOXEYpiZEeuFUTVbdGq+UwCcF35T/tS8KrmMomEwEc5wBbyfH3PJVpTSUqrhPDXFcQ== - dependencies: - core-js-pure "^3.19.0" - regenerator-runtime "^0.13.4" - -"@babel/runtime-corejs3@^7.17.2": - version "7.17.8" - resolved "https://registry.yarnpkg.com/@babel/runtime-corejs3/-/runtime-corejs3-7.17.8.tgz#d7dd49fb812f29c61c59126da3792d8740d4e284" - integrity sha512-ZbYSUvoSF6dXZmMl/CYTMOvzIFnbGfv4W3SEHYgMvNsFTeLaF2gkGAF4K2ddmtSK4Emej+0aYcnSC6N5dPCXUQ== - dependencies: - core-js-pure "^3.20.2" - regenerator-runtime "^0.13.4" - -"@babel/runtime-corejs3@^7.17.8": +"@babel/runtime-corejs3@^7.11.2", "@babel/runtime-corejs3@^7.16.3", "@babel/runtime-corejs3@^7.17.2", "@babel/runtime-corejs3@^7.17.8": version "7.17.9" resolved "https://registry.yarnpkg.com/@babel/runtime-corejs3/-/runtime-corejs3-7.17.9.tgz#3d02d0161f0fbf3ada8e88159375af97690f4055" integrity sha512-WxYHHUWF2uZ7Hp1K+D1xQgbgkGUfA+5UPOegEXGt2Y5SMog/rYCVaifLZDbw8UkNXozEqqrZTy6bglL7xTaCOw== @@ -4960,11 +4944,6 @@ core-js-compat@^3.20.0, core-js-compat@^3.20.2: browserslist "^4.19.1" semver "7.0.0" -core-js-pure@^3.19.0: - version "3.19.1" - resolved "https://registry.npmjs.org/core-js-pure/-/core-js-pure-3.19.1.tgz" - integrity sha512-Q0Knr8Es84vtv62ei6/6jXH/7izKmOrtrxH9WJTHLCMAVeU+8TF8z8Nr08CsH4Ot0oJKzBzJJL9SJBYIv7WlfQ== - core-js-pure@^3.20.2: version "3.21.1" resolved "https://registry.yarnpkg.com/core-js-pure/-/core-js-pure-3.21.1.tgz#8c4d1e78839f5f46208de7230cebfb72bc3bdb51" @@ -10568,10 +10547,10 @@ swagger-client@^3.17.0: traverse "~0.6.6" url "~0.11.0" -swagger-ui-react@^4.1.2: - version "4.1.2" - resolved 
"https://registry.npmjs.org/swagger-ui-react/-/swagger-ui-react-4.1.2.tgz" - integrity sha512-HWsvfDviykATBpUh1Q4lvIBsnMFNaHBIw3nr7zAUUxMLzvlX6cbq4jATtM1v7MWiu9zUiE/Z/LmCc3YufTeTnw== +swagger-ui-react@^4.1.3: + version "4.1.3" + resolved "https://registry.yarnpkg.com/swagger-ui-react/-/swagger-ui-react-4.1.3.tgz#a722ecbe54ef237fa9080447a7c708c4c72d846a" + integrity sha512-o1AoXUTNH40cxWus0QOeWQ8x9tSIEmrLBrOgAOHDnvWJ1qyjT8PjgHjPbUVjMbja18coyuaAAeUdyLKvLGmlDA== dependencies: "@babel/runtime-corejs3" "^7.16.3" "@braintree/sanitize-url" "^5.0.2" diff --git a/helm/superset/Chart.yaml b/helm/superset/Chart.yaml index 8d93ab473195b..70dd6fd162b89 100644 --- a/helm/superset/Chart.yaml +++ b/helm/superset/Chart.yaml @@ -22,7 +22,7 @@ maintainers: - name: craig-rueda email: craig@craigrueda.com url: https://github.com/craig-rueda -version: 0.6.1 +version: 0.6.2 dependencies: - name: postgresql version: 11.1.22 diff --git a/helm/superset/templates/deployment-beat.yaml b/helm/superset/templates/deployment-beat.yaml index 55223defc6421..d46d47ee3f9c4 100644 --- a/helm/superset/templates/deployment-beat.yaml +++ b/helm/superset/templates/deployment-beat.yaml @@ -98,7 +98,11 @@ spec: {{- tpl (toYaml .) $ | nindent 12 -}} {{- end }} resources: + {{- if .Values.supersetCeleryBeat.resources }} +{{ toYaml .Values.supersetCeleryBeat.resources | indent 12 }} + {{- else }} {{ toYaml .Values.resources | indent 12 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{ toYaml . | indent 8 }} diff --git a/helm/superset/templates/deployment-worker.yaml b/helm/superset/templates/deployment-worker.yaml index 07bacd371b773..54eb5d87517e4 100644 --- a/helm/superset/templates/deployment-worker.yaml +++ b/helm/superset/templates/deployment-worker.yaml @@ -99,7 +99,11 @@ spec: {{- tpl (toYaml .) $ | nindent 12 -}} {{- end }} resources: + {{- if .Values.supersetWorker.resources }} +{{ toYaml .Values.supersetWorker.resources | indent 12 }} + {{- else }} {{ toYaml .Values.resources | indent 12 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{ toYaml . | indent 8 }} diff --git a/helm/superset/templates/deployment.yaml b/helm/superset/templates/deployment.yaml index b760d5454da24..4d3a42e8e20a7 100644 --- a/helm/superset/templates/deployment.yaml +++ b/helm/superset/templates/deployment.yaml @@ -115,7 +115,11 @@ spec: containerPort: {{ .Values.service.port }} protocol: TCP resources: + {{- if .Values.supersetNode.resources }} +{{ toYaml .Values.supersetNode.resources | indent 12 }} + {{- else }} {{ toYaml .Values.resources | indent 12 }} + {{- end }} {{- with .Values.nodeSelector }} nodeSelector: {{ toYaml . 
| indent 8 }} diff --git a/helm/superset/values.schema.json b/helm/superset/values.schema.json index 5e419d35a67d7..6c4359a0ff940 100644 --- a/helm/superset/values.schema.json +++ b/helm/superset/values.schema.json @@ -275,6 +275,9 @@ }, "podLabels": { "$ref": "https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master/v1.23.0/_definitions.json##/definitions/io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta/properties/labels" + }, + "resources": { + "type": "object" } }, "required": [ @@ -305,6 +308,9 @@ }, "podLabels": { "$ref": "https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master/v1.23.0/_definitions.json##/definitions/io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta/properties/labels" + }, + "resources": { + "type": "object" } }, "required": [ @@ -336,6 +342,9 @@ }, "podLabels": { "$ref": "https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master/v1.23.0/_definitions.json##/definitions/io.k8s.apimachinery.pkg.apis.meta.v1.ObjectMeta/properties/labels" + }, + "resources": { + "type": "object" } }, "required": [ diff --git a/helm/superset/values.yaml b/helm/superset/values.yaml index 2adc6bf662944..197ec4b3c6e70 100644 --- a/helm/superset/values.yaml +++ b/helm/superset/values.yaml @@ -203,6 +203,8 @@ resources: {} # choice for the user. This also increases chances charts run on environments with little # resources, such as Minikube. If you do want to specify resources, uncomment the following # lines, adjust them as necessary, and remove the curly braces after 'resources:'. + # The limits below will apply to all Superset components. To set individual resource limitations refer to the pod specific values below. + # The pod specific values will overwrite anything that is set here. # limits: # cpu: 100m # memory: 128Mi @@ -253,6 +255,14 @@ supersetNode: podAnnotations: {} ## Labels to be added to supersetNode pods podLabels: {} + # Resource settings for the supersetNode pods - these settings overwrite might existing values from the global resources object defined above. + resources: {} + # limits: + # cpu: 100m + # memory: 128Mi + # requests: + # cpu: 100m + # memory: 128Mi ## ## Superset worker configuration supersetWorker: @@ -275,6 +285,14 @@ supersetWorker: podAnnotations: {} ## Labels to be added to supersetWorker pods podLabels: {} + # Resource settings for the supersetWorker pods - these settings overwrite might existing values from the global resources object defined above. + resources: {} + # limits: + # cpu: 100m + # memory: 128Mi + # requests: + # cpu: 100m + # memory: 128Mi ## ## Superset beat configuration (to trigger scheduled jobs like reports) supersetCeleryBeat: @@ -299,6 +317,14 @@ supersetCeleryBeat: podAnnotations: {} ## Labels to be added to supersetCeleryBeat pods podLabels: {} + # Resource settings for the CeleryBeat pods - these settings overwrite might existing values from the global resources object defined above. + resources: {} + # limits: + # cpu: 100m + # memory: 128Mi + # requests: + # cpu: 100m + # memory: 128Mi ## ## Init job configuration init: diff --git a/requirements/base.in b/requirements/base.in index debdc6762cbaf..9d8313823765a 100644 --- a/requirements/base.in +++ b/requirements/base.in @@ -15,10 +15,5 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +# -e file:. 
-pyrsistent>=0.16.1,<0.17 -zipp==3.4.1 -sasl==0.3.1 -wrapt==1.12.1 # required by astroid<2.9 until we bump pylint -aiohttp==3.8.1 -charset-normalizer==2.0.4 diff --git a/requirements/base.txt b/requirements/base.txt index 6b133eec5e02f..ea9e8f7a8480d 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -1,4 +1,4 @@ -# SHA1:8c236813f9d0bf56d87d845cbffd6ca3f07c3e14 +# SHA1:a9dde048f1ee1f00586264d726d0e89f16e56183 # # This file is autogenerated by pip-compile-multi # To update, run: @@ -8,9 +8,7 @@ -e file:. # via -r requirements/base.in aiohttp==3.8.1 - # via - # -r requirements/base.in - # slackclient + # via slackclient aiosignal==1.2.0 # via aiohttp alembic==1.6.5 @@ -42,9 +40,7 @@ celery==5.2.2 cffi==1.14.6 # via cryptography charset-normalizer==2.0.4 - # via - # -r requirements/base.in - # aiohttp + # via aiohttp click==8.0.4 # via # apache-superset @@ -214,9 +210,7 @@ pyparsing==3.0.6 # apache-superset # packaging pyrsistent==0.16.1 - # via - # -r requirements/base.in - # jsonschema + # via jsonschema python-dateutil==2.8.2 # via # alembic @@ -244,8 +238,6 @@ pyyaml==5.4.1 # apispec redis==3.5.3 # via apache-superset -sasl==0.3.1 - # via -r requirements/base.in selenium==3.141.0 # via apache-superset simplejson==3.17.3 @@ -262,7 +254,6 @@ six==1.16.0 # prison # pyrsistent # python-dateutil - # sasl # sqlalchemy-utils # wtforms-json slackclient==2.5.0 @@ -300,8 +291,6 @@ werkzeug==2.0.3 # via # flask # flask-jwt-extended -wrapt==1.12.1 - # via -r requirements/base.in wtforms==2.3.3 # via # flask-appbuilder @@ -311,8 +300,6 @@ wtforms-json==0.3.3 # via apache-superset yarl==1.6.3 # via aiohttp -zipp==3.4.1 - # via -r requirements/base.in # The following packages are considered to be unsafe in a requirements file: # setuptools diff --git a/requirements/development.in b/requirements/development.in index a9aa89a2c269f..477fff3376b22 100644 --- a/requirements/development.in +++ b/requirements/development.in @@ -15,14 +15,11 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +# -r base.in +-e .[cors,druid,hive,mysql,postgres,thumbnails] flask-cors>=2.0.0 -mysqlclient>=2.1.0 -pillow>=9.0.1,<10 -pydruid>=0.6.1,<0.7 -pyhive[hive]>=0.6.1 -psycopg2-binary==2.9.1 -tableschema -thrift>=0.11.0,<1.0.0 +ipython progress>=1.5,<2 pyinstrument>=4.0.2,<5 +sqloxide diff --git a/requirements/development.txt b/requirements/development.txt index 4702c4b6c3da7..1beebbc9a4cb7 100644 --- a/requirements/development.txt +++ b/requirements/development.txt @@ -1,4 +1,4 @@ -# SHA1:b4a3e0dd12a4937fc5a21bdbf63644be9222c65f +# SHA1:2bd0d7307aeb633b7d97b510eb467285210e783a # # This file is autogenerated by pip-compile-multi # To update, run: @@ -7,7 +7,15 @@ # -r base.txt -e file:. 
- # via -r requirements/base.in + # via + # -r requirements/base.in + # -r requirements/development.in +appnope==0.1.3 + # via ipython +asttokens==2.0.5 + # via stack-data +backcall==0.2.0 + # via ipython boto3==1.18.19 # via tabulator botocore==1.21.19 @@ -18,14 +26,26 @@ cached-property==1.5.2 # via tableschema certifi==2021.10.8 # via requests +chardet==4.0.0 + # via tabulator +decorator==5.1.1 + # via ipython et-xmlfile==1.1.0 # via openpyxl +executing==0.8.3 + # via stack-data flask-cors==3.0.10 - # via -r requirements/development.in + # via + # -r requirements/development.in + # apache-superset future==0.18.2 # via pyhive ijson==3.1.4 # via tabulator +ipython==8.3.0 + # via -r requirements/development.in +jedi==0.18.1 + # via ipython jmespath==0.10.0 # via # boto3 @@ -34,22 +54,36 @@ jsonlines==2.0.0 # via tabulator linear-tsv==1.1.0 # via tabulator +matplotlib-inline==0.1.3 + # via ipython mysqlclient==2.1.0 - # via -r requirements/development.in + # via apache-superset openpyxl==3.0.7 # via tabulator +parso==0.8.3 + # via jedi +pexpect==4.8.0 + # via ipython +pickleshare==0.7.5 + # via ipython pillow==9.1.0 - # via -r requirements/development.in + # via apache-superset progress==1.6 # via -r requirements/development.in psycopg2-binary==2.9.1 - # via -r requirements/development.in + # via apache-superset +ptyprocess==0.7.0 + # via pexpect +pure-eval==0.2.2 + # via stack-data pure-sasl==0.6.2 # via thrift-sasl pydruid==0.6.2 - # via -r requirements/development.in + # via apache-superset +pygments==2.12.0 + # via ipython pyhive[hive]==0.6.4 - # via -r requirements/development.in + # via apache-superset pyinstrument==4.0.2 # via -r requirements/development.in requests==2.26.0 @@ -61,17 +95,27 @@ rfc3986==1.5.0 # via tableschema s3transfer==0.5.0 # via boto3 -tableschema==1.20.2 +sasl==0.3.1 + # via pyhive +sqloxide==0.1.17 # via -r requirements/development.in +stack-data==0.2.0 + # via ipython +tableschema==1.20.2 + # via apache-superset tabulator==1.53.5 # via tableschema thrift==0.13.0 # via - # -r requirements/development.in + # apache-superset # pyhive # thrift-sasl thrift-sasl==0.4.3 # via pyhive +traitlets==5.2.1.post0 + # via + # ipython + # matplotlib-inline unicodecsv==0.14.1 # via # tableschema diff --git a/requirements/docker.in b/requirements/docker.in index b9a2f3b2400be..7310c408bb841 100644 --- a/requirements/docker.in +++ b/requirements/docker.in @@ -15,5 +15,5 @@ # limitations under the License. # -r base.in +-e .[postgres] gevent -psycopg2-binary diff --git a/requirements/docker.txt b/requirements/docker.txt index 545e919791bd2..f9ea766f4e69f 100644 --- a/requirements/docker.txt +++ b/requirements/docker.txt @@ -1,4 +1,4 @@ -# SHA1:e29e1e67c158a87a692dd8ccaf1e343ebb246dc2 +# SHA1:b6943e4be5e050c458e66b470f517acda02c38e6 # # This file is autogenerated by pip-compile-multi # To update, run: @@ -7,16 +7,18 @@ # -r base.txt -e file:. 
- # via -r requirements/base.in + # via + # -r requirements/base.in + # -r requirements/docker.in gevent==21.8.0 # via -r requirements/docker.in greenlet==1.1.1 # via gevent psycopg2-binary==2.9.1 - # via -r requirements/docker.in -zope.event==4.5.0 + # via apache-superset +zope-event==4.5.0 # via gevent -zope.interface==5.4.0 +zope-interface==5.4.0 # via gevent # The following packages are considered to be unsafe in a requirements file: diff --git a/requirements/integration.in b/requirements/integration.in index eff495d881314..9601b3a0d23bf 100644 --- a/requirements/integration.in +++ b/requirements/integration.in @@ -14,8 +14,6 @@ # See the License for the specific language governing permissions and # limitations under the License. # -pip-compile-multi!=1.5.9 +pip-compile-multi pre-commit tox -py>=1.10.0 -click diff --git a/requirements/integration.txt b/requirements/integration.txt index edc39fb151267..fb1d37cd53f91 100644 --- a/requirements/integration.txt +++ b/requirements/integration.txt @@ -1,17 +1,16 @@ -# SHA1:8e2dd1e795bcad7451376b3653eb03465e4f05d3 +# SHA1:39179f2c476f94362aa0705be059a488d7e38b6d # # This file is autogenerated by pip-compile-multi # To update, run: # # pip-compile-multi # -backports.entry-points-selectable==1.1.0 +backports-entry-points-selectable==1.1.0 # via virtualenv cfgv==3.3.0 # via pre-commit click==8.0.4 # via - # -r requirements/integration.in # pip-compile-multi # pip-tools distlib==0.3.2 @@ -39,9 +38,7 @@ pluggy==0.13.1 pre-commit==2.14.0 # via -r requirements/integration.in py==1.10.0 - # via - # -r requirements/integration.in - # tox + # via tox pyparsing==3.0.6 # via packaging pyyaml==5.4.1 diff --git a/requirements/local.txt b/requirements/local.txt index 9cb3f70997bc3..c4bd3cd599b36 100644 --- a/requirements/local.txt +++ b/requirements/local.txt @@ -7,7 +7,9 @@ # -r development.txt -e file:. - # via -r requirements/base.in + # via + # -r requirements/base.in + # -r requirements/development.in # The following packages are considered to be unsafe in a requirements file: # setuptools diff --git a/requirements/testing.in b/requirements/testing.in index 082dbc934ac5c..9a40c90753da1 100644 --- a/requirements/testing.in +++ b/requirements/testing.in @@ -16,27 +16,15 @@ # -r development.in -r integration.in +-e file:.[bigquery,hive,presto,trino] docker flask-testing freezegun -google-cloud-bigquery -ipdb -# pinning ipython as pip-compile-multi was bringing higher version -# of the ipython that was not found in CI -ipython openapi-spec-validator -openpyxl -pandas_gbq parameterized -pybigquery pyfakefs -pyhive[presto]>=0.6.3 -pylint==2.9.6 +pylint pytest pytest-cov -statsd pytest-mock -sqloxide -# DB dependencies --e file:.[bigquery] --e file:.[trino] +statsd diff --git a/requirements/testing.txt b/requirements/testing.txt index dbdb9e5077e87..5c1c2f7fce345 100644 --- a/requirements/testing.txt +++ b/requirements/testing.txt @@ -1,4 +1,4 @@ -# SHA1:e273e8da6bfd5f6f8563fe067e243297cc7c588c +# SHA1:623feb0dd2b6bd376238ecf75069bc82136c2d70 # # This file is autogenerated by pip-compile-multi # To update, run: @@ -10,21 +10,14 @@ -e file:. 
# via # -r requirements/base.in + # -r requirements/development.in # -r requirements/testing.in -appnope==0.1.2 - # via ipython astroid==2.6.6 # via pylint -backcall==0.2.0 - # via ipython cachetools==4.2.4 # via google-auth coverage==5.5 # via pytest-cov -decorator==5.0.9 - # via - # ipdb - # ipython docker==5.0.0 # via -r requirements/testing.in flask-testing==0.8.1 @@ -51,7 +44,7 @@ google-auth-oauthlib==0.4.6 # pydata-google-auth google-cloud-bigquery[bqstorage,pandas]==2.29.0 # via - # -r requirements/testing.in + # apache-superset # pandas-gbq # pybigquery google-cloud-bigquery-storage==2.9.1 @@ -75,24 +68,12 @@ grpcio-status==1.41.1 # via google-api-core iniconfig==1.1.1 # via pytest -ipdb==0.13.9 - # via -r requirements/testing.in -ipython==7.26.0 - # via - # -r requirements/testing.in - # ipdb -ipython-genutils==0.2.0 - # via traitlets isort==5.9.3 # via pylint -jedi==0.18.0 - # via ipython lazy-object-proxy==1.6.0 # via astroid libcst==0.3.21 # via google-cloud-bigquery-storage -matplotlib-inline==0.1.2 - # via ipython mccabe==0.6.1 # via pylint mypy-extensions==0.4.3 @@ -104,15 +85,9 @@ openapi-schema-validator==0.1.5 openapi-spec-validator==0.3.1 # via -r requirements/testing.in pandas-gbq==0.15.0 - # via -r requirements/testing.in + # via apache-superset parameterized==0.8.1 # via -r requirements/testing.in -parso==0.8.2 - # via jedi -pexpect==4.8.0 - # via ipython -pickleshare==0.7.5 - # via ipython proto-plus==1.19.7 # via # google-cloud-bigquery @@ -124,8 +99,6 @@ protobuf==3.19.1 # googleapis-common-protos # grpcio-status # proto-plus -ptyprocess==0.7.0 - # via pexpect pyasn1==0.4.8 # via # pyasn1-modules @@ -133,17 +106,11 @@ pyasn1==0.4.8 pyasn1-modules==0.2.8 # via google-auth pybigquery==0.10.2 - # via -r requirements/testing.in + # via apache-superset pydata-google-auth==1.2.0 # via pandas-gbq -pyfakefs==4.5.0 +pyfakefs==4.5.6 # via -r requirements/testing.in -pygments==2.9.0 - # via ipython -pyhive[hive,presto]==0.6.4 - # via - # -r requirements/development.in - # -r requirements/testing.in pylint==2.9.6 # via -r requirements/testing.in pytest==6.2.4 @@ -159,22 +126,16 @@ requests-oauthlib==1.3.0 # via google-auth-oauthlib rsa==4.7.2 # via google-auth -sqlalchemy-trino==0.4.1 - # via apache-superset -sqloxide==0.1.15 - # via -r requirements/testing.in statsd==3.3.0 # via -r requirements/testing.in -traitlets==5.0.5 - # via - # ipython - # matplotlib-inline -trino==0.306 - # via sqlalchemy-trino +trino==0.313.0 + # via apache-superset typing-inspect==0.7.1 # via libcst websocket-client==1.2.0 # via docker +wrapt==1.12.1 + # via astroid # The following packages are considered to be unsafe in a requirements file: # pip diff --git a/setup.py b/setup.py index e0a6f7556121b..4d1fd3ab4363e 100644 --- a/setup.py +++ b/setup.py @@ -136,11 +136,11 @@ def get_git_sha() -> str: "druid": ["pydruid>=0.6.1,<0.7"], "solr": ["sqlalchemy-solr >= 0.2.0"], "elasticsearch": ["elasticsearch-dbapi>=0.2.0, <0.3.0"], - "exasol": ["sqlalchemy-exasol>=2.1.0, <2.2"], + "exasol": ["sqlalchemy-exasol >= 2.4.0, <3.0"], "excel": ["xlrd>=1.2.0, <1.3"], "firebird": ["sqlalchemy-firebird>=0.7.0, <0.8"], "firebolt": ["firebolt-sqlalchemy>=0.0.1"], - "gsheets": ["shillelagh[gsheetsapi]>=1.0.11, <2"], + "gsheets": ["shillelagh[gsheetsapi]>=1.0.14, <2"], "hana": ["hdbcli==2.4.162", "sqlalchemy_hana==0.4.0"], "hive": ["pyhive[hive]>=0.6.1", "tableschema", "thrift>=0.11.0, <1.0.0"], "impala": ["impyla>0.16.2, <0.17"], diff --git 
a/superset-frontend/cypress-base/cypress/integration/dashboard/dashboard.applitools.test.ts b/superset-frontend/cypress-base/cypress/integration/dashboard/dashboard.applitools.test.ts index 7a61087c1a8b3..d492175a5e3a6 100644 --- a/superset-frontend/cypress-base/cypress/integration/dashboard/dashboard.applitools.test.ts +++ b/superset-frontend/cypress-base/cypress/integration/dashboard/dashboard.applitools.test.ts @@ -41,8 +41,8 @@ describe('Dashboard load', () => { }); it('should load the Dashboard in edit mode', () => { - cy.get('[data-test="dashboard-header"]') - .find('[aria-label=edit-alt]') + cy.get('.header-with-actions') + .find('[aria-label="Edit dashboard"]') .click(); // wait for a chart to appear cy.get('[data-test="grid-container"]').find('.box_plot', { diff --git a/superset-frontend/cypress-base/cypress/integration/dashboard/edit_mode.test.js b/superset-frontend/cypress-base/cypress/integration/dashboard/edit_mode.test.js index d799dccd3bb6e..7a3b82705cebd 100644 --- a/superset-frontend/cypress-base/cypress/integration/dashboard/edit_mode.test.js +++ b/superset-frontend/cypress-base/cypress/integration/dashboard/edit_mode.test.js @@ -22,8 +22,8 @@ describe('Dashboard edit mode', () => { beforeEach(() => { cy.login(); cy.visit(WORLD_HEALTH_DASHBOARD); - cy.get('[data-test="dashboard-header"]') - .find('[aria-label=edit-alt]') + cy.get('.header-with-actions') + .find('[aria-label="Edit dashboard"]') .click(); }); @@ -94,9 +94,9 @@ describe('Dashboard edit mode', () => { .find('[data-test="discard-changes-button"]') .should('be.visible') .click(); - cy.get('[data-test="dashboard-header"]').within(() => { + cy.get('.header-with-actions').within(() => { cy.get('[data-test="dashboard-edit-actions"]').should('not.be.visible'); - cy.get('[aria-label="edit-alt"]').should('be.visible'); + cy.get('[aria-label="Edit dashboard"]').should('be.visible'); }); }); }); diff --git a/superset-frontend/cypress-base/cypress/integration/dashboard/edit_properties.test.ts b/superset-frontend/cypress-base/cypress/integration/dashboard/edit_properties.test.ts index bf4bd319c1ddd..b3061cdb7d40a 100644 --- a/superset-frontend/cypress-base/cypress/integration/dashboard/edit_properties.test.ts +++ b/superset-frontend/cypress-base/cypress/integration/dashboard/edit_properties.test.ts @@ -66,9 +66,13 @@ function openAdvancedProperties() { function openDashboardEditProperties() { // open dashboard properties edit modal - cy.get('#save-dash-split-button').trigger('click', { force: true }); + cy.get( + '.header-with-actions .right-button-panel .ant-dropdown-trigger', + ).trigger('click', { + force: true, + }); cy.get('[data-test=header-actions-menu]') - .contains('Edit dashboard properties') + .contains('Edit properties') .click({ force: true }); } @@ -80,7 +84,7 @@ describe('Dashboard edit action', () => { cy.get('.dashboard-grid', { timeout: 50000 }) .should('be.visible') // wait for 50 secs to load dashboard .then(() => { - cy.get('.dashboard-header [aria-label=edit-alt]') + cy.get('.header-with-actions [aria-label="Edit dashboard"]') .should('be.visible') .click(); openDashboardEditProperties(); @@ -106,7 +110,10 @@ describe('Dashboard edit action', () => { cy.get('.ant-modal-body').should('not.exist'); // assert title has been updated - cy.get('.editable-title input').should('have.value', dashboardTitle); + cy.get('[data-test="editable-title-input"]').should( + 'have.value', + dashboardTitle, + ); }); }); describe('the color picker is changed', () => { diff --git 
a/superset-frontend/cypress-base/cypress/integration/dashboard/markdown.test.ts b/superset-frontend/cypress-base/cypress/integration/dashboard/markdown.test.ts index 7ce391868809b..2964241b184a2 100644 --- a/superset-frontend/cypress-base/cypress/integration/dashboard/markdown.test.ts +++ b/superset-frontend/cypress-base/cypress/integration/dashboard/markdown.test.ts @@ -25,14 +25,12 @@ describe('Dashboard edit markdown', () => { }); it('should add markdown component to dashboard', () => { - cy.get('[data-test="dashboard-header"]') - .find('[aria-label="edit-alt"]') + cy.get('.header-with-actions') + .find('[aria-label="Edit dashboard"]') .click(); // lazy load - need to open dropdown for the scripts to load - cy.get('[data-test="dashboard-header"]') - .find('[aria-label="more-horiz"]') - .click(); + cy.get('.header-with-actions').find('[aria-label="more-horiz"]').click(); cy.get('[data-test="grid-row-background--transparent"]') .first() .as('component-background-first'); diff --git a/superset-frontend/cypress-base/cypress/integration/dashboard/save.test.js b/superset-frontend/cypress-base/cypress/integration/dashboard/save.test.js index 8064f81fa14da..b0e9d1141cd30 100644 --- a/superset-frontend/cypress-base/cypress/integration/dashboard/save.test.js +++ b/superset-frontend/cypress-base/cypress/integration/dashboard/save.test.js @@ -25,9 +25,11 @@ import { } from './dashboard.helper'; function openDashboardEditProperties() { - cy.get('.dashboard-header [aria-label=edit-alt]').click(); - cy.get('#save-dash-split-button').trigger('click', { force: true }); - cy.get('.dropdown-menu').contains('Edit dashboard properties').click(); + cy.get('.header-with-actions [aria-label="Edit dashboard"]').click(); + cy.get( + '.header-with-actions .right-button-panel .ant-dropdown-trigger', + ).trigger('click', { force: true }); + cy.get('.dropdown-menu').contains('Edit properties').click(); } describe('Dashboard save action', () => { @@ -35,8 +37,8 @@ describe('Dashboard save action', () => { cy.login(); cy.visit(WORLD_HEALTH_DASHBOARD); cy.get('#app').then(data => { - cy.get('[data-test="dashboard-header"]').then(headerElement => { - const dashboardId = headerElement.attr('data-test-id'); + cy.get('.dashboard-header-container').then(headerContainerElement => { + const dashboardId = headerContainerElement.attr('data-test-id'); cy.intercept('POST', `/superset/copy_dash/${dashboardId}/`).as( 'copyRequest', @@ -56,7 +58,7 @@ describe('Dashboard save action', () => { // change to what the title should be it('should save as new dashboard', () => { cy.wait('@copyRequest').then(xhr => { - cy.get('[data-test="editable-title-input"]').then(element => { + cy.get('[data-test="editable-title"]').then(element => { const dashboardTitle = element.attr('title'); expect(dashboardTitle).to.not.equal(`World Bank's Data`); }); @@ -68,7 +70,7 @@ describe('Dashboard save action', () => { WORLD_HEALTH_CHARTS.forEach(waitForChartLoad); // remove box_plot chart from dashboard - cy.get('[aria-label="edit-alt"]').click({ timeout: 5000 }); + cy.get('[aria-label="Edit dashboard"]').click({ timeout: 5000 }); cy.get('[data-test="dashboard-delete-component-button"]') .last() .trigger('mouseenter') @@ -79,15 +81,15 @@ describe('Dashboard save action', () => { .should('not.exist'); cy.intercept('PUT', '/api/v1/dashboard/**').as('putDashboardRequest'); - cy.get('[data-test="dashboard-header"]') + cy.get('.header-with-actions') .find('[data-test="header-save-button"]') .contains('Save') .click(); // go back to view mode 
cy.wait('@putDashboardRequest'); - cy.get('[data-test="dashboard-header"]') - .find('[aria-label="edit-alt"]') + cy.get('.header-with-actions') + .find('[aria-label="Edit dashboard"]') .click(); // deleted boxplot should still not exist @@ -142,7 +144,7 @@ describe('Dashboard save action', () => { cy.get('.ant-modal-body').should('not.exist'); // save dashboard changes - cy.get('.dashboard-header').contains('Save').click(); + cy.get('.header-with-actions').contains('Save').click(); // assert success flash cy.contains('saved successfully').should('be.visible'); diff --git a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/download_chart.test.js b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/download_chart.test.js index 42f9c13123bd8..ce4a871f8e2de 100644 --- a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/download_chart.test.js +++ b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/download_chart.test.js @@ -34,7 +34,7 @@ describe('Download Chart > Distribution bar chart', () => { }; cy.visitChartByParams(JSON.stringify(formData)); - cy.get('.right-button-panel .ant-dropdown-trigger').click(); + cy.get('.header-with-actions .ant-dropdown-trigger').click(); cy.get(':nth-child(1) > .ant-dropdown-menu-submenu-title').click(); cy.get( '.ant-dropdown-menu-submenu > .ant-dropdown-menu li:nth-child(3)', diff --git a/superset-frontend/cypress-base/cypress/support/directories.ts b/superset-frontend/cypress-base/cypress/support/directories.ts index 9c783a7220e80..fde9ee0cdeacf 100644 --- a/superset-frontend/cypress-base/cypress/support/directories.ts +++ b/superset-frontend/cypress-base/cypress/support/directories.ts @@ -630,7 +630,8 @@ export const dashboardView = { trashIcon: dataTestLocator('dashboard-delete-component-button'), refreshChart: dataTestLocator('refresh-chart-menu-item'), }, - threeDotsMenuIcon: '#save-dash-split-button', + threeDotsMenuIcon: + '.header-with-actions .right-button-panel .ant-dropdown-trigger', threeDotsMenuDropdown: dataTestLocator('header-actions-menu'), refreshDashboard: dataTestLocator('refresh-dashboard-menu-item'), saveAsMenuOption: dataTestLocator('save-as-menu-item'), @@ -660,7 +661,7 @@ export const dashboardView = { }, sliceThreeDots: '[aria-label="More Options"]', sliceThreeDotsDropdown: '[role="menu"]', - editDashboardButton: '[aria-label=edit-alt]', + editDashboardButton: '[aria-label="Edit dashboard"]', starIcon: dataTestLocator('fave-unfave-icon'), dashboardHeader: dataTestLocator('dashboard-header'), dashboardSectionContainer: dataTestLocator( diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/components/RadioButtonControl.tsx b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/components/RadioButtonControl.tsx index b613fed93aa87..497e331133470 100644 --- a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/components/RadioButtonControl.tsx +++ b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/components/RadioButtonControl.tsx @@ -53,8 +53,12 @@ export default function RadioButtonControl({ '.btn:focus': { outline: 'none', }, + '.control-label': { + color: theme.colors.grayscale.base, + marginBottom: theme.gridUnit, + }, '.control-label + .btn-group': { - marginTop: 1, + marginTop: '1px', }, '.btn-group .btn-default': { color: theme.colors.grayscale.dark1, diff --git 
a/superset-frontend/packages/superset-ui-core/src/color/CategoricalColorScale.ts b/superset-frontend/packages/superset-ui-core/src/color/CategoricalColorScale.ts index c6f37e4ff771f..2a6a4d63a0473 100644 --- a/superset-frontend/packages/superset-ui-core/src/color/CategoricalColorScale.ts +++ b/superset-frontend/packages/superset-ui-core/src/color/CategoricalColorScale.ts @@ -24,6 +24,7 @@ import { ColorsLookup } from './types'; import stringifyAndTrim from './stringifyAndTrim'; import getSharedLabelColor from './SharedLabelColorSingleton'; import { getAnalogousColors } from './utils'; +import { FeatureFlag, isFeatureEnabled } from '../utils'; // Use type augmentation to correct the fact that // an instance of CategoricalScale is also a function @@ -79,13 +80,15 @@ class CategoricalColorScale extends ExtensibleFunction { return forcedColor; } - const multiple = Math.floor( - this.domain().length / this.originColors.length, - ); - if (multiple > this.multiple) { - this.multiple = multiple; - const newRange = getAnalogousColors(this.originColors, multiple); - this.range(this.originColors.concat(newRange)); + if (isFeatureEnabled(FeatureFlag.USE_ANALAGOUS_COLORS)) { + const multiple = Math.floor( + this.domain().length / this.originColors.length, + ); + if (multiple > this.multiple) { + this.multiple = multiple; + const newRange = getAnalogousColors(this.originColors, multiple); + this.range(this.originColors.concat(newRange)); + } } const color = this.scale(cleanedValue); diff --git a/superset-frontend/packages/superset-ui-core/src/color/SharedLabelColorSingleton.ts b/superset-frontend/packages/superset-ui-core/src/color/SharedLabelColorSingleton.ts index d2a59ac7c292a..10a14df075910 100644 --- a/superset-frontend/packages/superset-ui-core/src/color/SharedLabelColorSingleton.ts +++ b/superset-frontend/packages/superset-ui-core/src/color/SharedLabelColorSingleton.ts @@ -18,7 +18,7 @@ */ import { CategoricalColorNamespace } from '.'; -import makeSingleton from '../utils/makeSingleton'; +import { FeatureFlag, isFeatureEnabled, makeSingleton } from '../utils'; import { getAnalogousColors } from './utils'; export class SharedLabelColor { @@ -37,25 +37,39 @@ export class SharedLabelColor { if (colorScheme) { const categoricalNamespace = CategoricalColorNamespace.getNamespace(colorNamespace); - const colors = categoricalNamespace.getScale(colorScheme).range(); const sharedLabels = this.getSharedLabels(); let generatedColors: string[] = []; let sharedLabelMap; if (sharedLabels.length) { - const multiple = Math.ceil(sharedLabels.length / colors.length); - generatedColors = getAnalogousColors(colors, multiple); - sharedLabelMap = sharedLabels.reduce( - (res, label, index) => ({ - ...res, - [label.toString()]: generatedColors[index], - }), - {}, - ); + const colorScale = categoricalNamespace.getScale(colorScheme); + const colors = colorScale.range(); + if (isFeatureEnabled(FeatureFlag.USE_ANALAGOUS_COLORS)) { + const multiple = Math.ceil(sharedLabels.length / colors.length); + generatedColors = getAnalogousColors(colors, multiple); + sharedLabelMap = sharedLabels.reduce( + (res, label, index) => ({ + ...res, + [label.toString()]: generatedColors[index], + }), + {}, + ); + } else { + // reverse colors to reduce color conflicts + colorScale.range(colors.reverse()); + sharedLabelMap = sharedLabels.reduce( + (res, label) => ({ + ...res, + [label.toString()]: colorScale(label), + }), + {}, + ); + } } const labelMap = Object.keys(this.sliceLabelColorMap).reduce( (res, sliceId) => { + // get new color scale 
instance const colorScale = categoricalNamespace.getScale(colorScheme); return { ...res, diff --git a/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts b/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts index a762d8b1f4460..ebd1f6e5688ba 100644 --- a/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts +++ b/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts @@ -27,8 +27,20 @@ function decreaseSizeUntil( ): number { let size = startSize; let dimension = computeDimension(size); + while (!condition(dimension)) { size -= 1; + + // Here, if the size goes below zero, it is most likely because + // additional styles are applied, in which case we assume the user + // knows what they're doing and we just let them use that size. + // Visually it works, although another check could be + // put in place here. + if (size < 0) { + size = startSize; + break; + } + dimension = computeDimension(size); } @@ -66,7 +78,7 @@ export default function computeMaxFontSize( size = decreaseSizeUntil( size, computeDimension, - dim => dim.width <= maxWidth, + dim => dim.width > 0 && dim.width <= maxWidth, ); } @@ -74,7 +86,7 @@ export default function computeMaxFontSize( size = decreaseSizeUntil( size, computeDimension, - dim => dim.height <= maxHeight, + dim => dim.height > 0 && dim.height <= maxHeight, ); } diff --git a/superset-frontend/packages/superset-ui-core/src/utils/featureFlags.ts b/superset-frontend/packages/superset-ui-core/src/utils/featureFlags.ts index 422f88f8f10c0..d6a1f2097f94f 100644 --- a/superset-frontend/packages/superset-ui-core/src/utils/featureFlags.ts +++ b/superset-frontend/packages/superset-ui-core/src/utils/featureFlags.ts @@ -54,6 +54,7 @@ export enum FeatureFlag { ALLOW_FULL_CSV_EXPORT = 'ALLOW_FULL_CSV_EXPORT', UX_BETA = 'UX_BETA', GENERIC_CHART_AXES = 'GENERIC_CHART_AXES', + USE_ANALAGOUS_COLORS = 'USE_ANALAGOUS_COLORS', } export type ScheduleQueriesProps = { JSONSCHEMA: { diff --git a/superset-frontend/packages/superset-ui-core/test/color/CategoricalColorScale.test.ts b/superset-frontend/packages/superset-ui-core/test/color/CategoricalColorScale.test.ts index 1d47cf760e326..91a8f4a3185a7 100644 --- a/superset-frontend/packages/superset-ui-core/test/color/CategoricalColorScale.test.ts +++ b/superset-frontend/packages/superset-ui-core/test/color/CategoricalColorScale.test.ts @@ -18,7 +18,7 @@ */ import { ScaleOrdinal } from 'd3-scale'; -import { CategoricalColorScale } from '@superset-ui/core'; +import { CategoricalColorScale, FeatureFlag } from '@superset-ui/core'; describe('CategoricalColorScale', () => { it('exists', () => { @@ -62,7 +62,36 @@ describe('CategoricalColorScale', () => { expect(c2).not.toBe(c3); expect(c3).not.toBe(c1); }); + it('recycles colors when number of items exceed available colors', () => { + window.featureFlags = { + [FeatureFlag.USE_ANALAGOUS_COLORS]: false, + }; + const colorSet: { [key: string]: number } = {}; + const scale = new CategoricalColorScale(['blue', 'red', 'green']); + const colors = [ + scale.getColor('pig'), + scale.getColor('horse'), + scale.getColor('cat'), + scale.getColor('cow'), + scale.getColor('donkey'), + scale.getColor('goat'), + ]; + colors.forEach(color => { + if (colorSet[color]) { + colorSet[color] += 1; + } else { + colorSet[color] = 1; + } + }); + expect(Object.keys(colorSet)).toHaveLength(3); + ['blue', 'red', 'green'].forEach(color => { + expect(colorSet[color]).toBe(2); + }); + }); it('get analogous colors when number of items
exceed available colors', () => { + window.featureFlags = { + [FeatureFlag.USE_ANALAGOUS_COLORS]: true, + }; const scale = new CategoricalColorScale(['blue', 'red', 'green']); scale.getColor('pig'); scale.getColor('horse'); diff --git a/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts b/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts index 99574f4ccf758..a64d819535c6b 100644 --- a/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts +++ b/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts @@ -59,5 +59,14 @@ describe('computeMaxFontSize(input)', () => { }), ).toEqual(25); }); + it('ensure idealFontSize is used if the maximum font size calculation goes below zero', () => { + expect( + computeMaxFontSize({ + maxWidth: 5, + idealFontSize: 34, + text: SAMPLE_TEXT[0], + }), + ).toEqual(34); + }); }); }); diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-table/TableStories.tsx b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-table/TableStories.tsx index 129c08f505753..b8ea6d9cb5c85 100644 --- a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-table/TableStories.tsx +++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-table/TableStories.tsx @@ -66,6 +66,7 @@ function loadData( alignPn = false, showCellBars = true, includeSearch = true, + allowRearrangeColumns = false, }, ): TableChartProps { if (!props.queriesData || !props.queriesData[0]) return props; @@ -86,6 +87,7 @@ function loadData( page_length: pageLength, show_cell_bars: showCellBars, include_search: includeSearch, + allow_rearrange_columns: allowRearrangeColumns, }, height: window.innerHeight - 130, }; @@ -117,8 +119,12 @@ export const BigTable = ({ width, height }) => { const cols = number('Columns', 8, { range: true, min: 1, max: 20 }); const pageLength = number('Page size', 50, { range: true, min: 0, max: 100 }); const includeSearch = boolean('Include search', true); - const alignPn = boolean('Algin PosNeg', false); + const alignPn = boolean('Align PosNeg', false); const showCellBars = boolean('Show Cell Bars', true); + const allowRearrangeColumns = boolean( + 'Allow end user to drag-and-drop column headers to rearrange them.', + false, + ); const chartProps = loadData(birthNames, { pageLength, rows, @@ -126,6 +132,7 @@ export const BigTable = ({ width, height }) => { alignPn, showCellBars, includeSearch, + allowRearrangeColumns, }); return ( d.dx > 0.005); // 0.005 radians = 0.29 degrees - if (metrics[0] !== metrics[1] && metrics[1] && !colorScheme) { + if (metrics[0] !== metrics[1] && metrics[1]) { colorByCategory = false; const ext = d3.extent(nodes, d => d.m2 / d.m1); linearColorScale = getSequentialSchemeRegistry() diff --git a/superset-frontend/plugins/legacy-plugin-chart-sunburst/src/controlPanel.ts b/superset-frontend/plugins/legacy-plugin-chart-sunburst/src/controlPanel.ts index 065964687bfcf..df50be9c4d16c 100644 --- a/superset-frontend/plugins/legacy-plugin-chart-sunburst/src/controlPanel.ts +++ b/superset-frontend/plugins/legacy-plugin-chart-sunburst/src/controlPanel.ts @@ -17,7 +17,11 @@ * under the License. 
*/ import { t } from '@superset-ui/core'; -import { ControlPanelConfig, sections } from '@superset-ui/chart-controls'; +import { + ControlPanelConfig, + ControlPanelsContainerProps, + sections, +} from '@superset-ui/chart-controls'; const config: ControlPanelConfig = { controlPanelSections: [ @@ -71,11 +75,21 @@ const config: ControlPanelConfig = { description: t( 'When only a primary metric is provided, a categorical color scale is used.', ), + visibility: ({ controls }: ControlPanelsContainerProps) => + Boolean( + !controls?.secondary_metric?.value || + controls?.secondary_metric?.value === controls?.metric.value, + ), }, linear_color_scheme: { description: t( 'When a secondary metric is provided, a linear color scale is used.', ), + visibility: ({ controls }: ControlPanelsContainerProps) => + Boolean( + controls?.secondary_metric?.value && + controls?.secondary_metric?.value !== controls?.metric.value, + ), }, groupby: { label: t('Hierarchy'), diff --git a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/WorldMap.js b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/WorldMap.js index b22c2b6e62493..1d51a7e840513 100644 --- a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/WorldMap.js +++ b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/WorldMap.js @@ -23,8 +23,10 @@ import { extent as d3Extent } from 'd3-array'; import { getNumberFormatter, getSequentialSchemeRegistry, + CategoricalColorNamespace, } from '@superset-ui/core'; import Datamap from 'datamaps/dist/datamaps.world.min'; +import { ColorBy } from './utils'; const propTypes = { data: PropTypes.arrayOf( @@ -55,6 +57,9 @@ function WorldMap(element, props) { showBubbles, linearColorScheme, color, + colorBy, + colorScheme, + sliceId, theme, } = props; const div = d3.select(element); @@ -70,15 +75,27 @@ function WorldMap(element, props) { .domain([extRadius[0], extRadius[1]]) .range([1, maxBubbleSize]); - const colorScale = getSequentialSchemeRegistry() - .get(linearColorScheme) - .createLinearScale(d3Extent(filteredData, d => d.m1)); + let processedData; + let colorScale; + if (colorBy === ColorBy.country) { + colorScale = CategoricalColorNamespace.getScale(colorScheme); - const processedData = filteredData.map(d => ({ - ...d, - radius: radiusScale(Math.sqrt(d.m2)), - fillColor: colorScale(d.m1), - })); + processedData = filteredData.map(d => ({ + ...d, + radius: radiusScale(Math.sqrt(d.m2)), + fillColor: colorScale(d.name, sliceId), + })); + } else { + colorScale = getSequentialSchemeRegistry() + .get(linearColorScheme) + .createLinearScale(d3Extent(filteredData, d => d.m1)); + + processedData = filteredData.map(d => ({ + ...d, + radius: radiusScale(Math.sqrt(d.m2)), + fillColor: colorScale(d.m1), + })); + } const mapData = {}; processedData.forEach(d => { diff --git a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/controlPanel.ts b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/controlPanel.ts index ec8aafc7b872a..93fc1ab1c9c02 100644 --- a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/controlPanel.ts +++ b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/controlPanel.ts @@ -22,6 +22,7 @@ import { formatSelectOptions, sections, } from '@superset-ui/chart-controls'; +import { ColorBy } from './utils'; const config: ControlPanelConfig = { controlPanelSections: [ @@ -106,7 +107,25 @@ const config: ControlPanelConfig = { }, ], ['color_picker'], + [ + { + name: 'color_by', + config: { + type: 'RadioButtonControl', + label: t('Color by'), + 
default: ColorBy.metric, + options: [ + [ColorBy.metric, t('Metric')], + [ColorBy.country, t('Country')], + ], + description: t( + 'Choose whether a country should be shaded by the metric, or assigned a color based on a categorical color palette', + ), + }, + }, + ], ['linear_color_scheme'], + ['color_scheme'], ], }, ], @@ -115,10 +134,6 @@ const config: ControlPanelConfig = { label: t('Country Column'), description: t('3 letter code of the country'), }, - metric: { - label: t('Metric for Color'), - description: t('Metric that defines the color of the country'), - }, secondary_metric: { label: t('Bubble Size'), description: t('Metric that defines the size of the bubble'), @@ -128,6 +143,13 @@ const config: ControlPanelConfig = { }, linear_color_scheme: { label: t('Country Color Scheme'), + visibility: ({ controls }) => + Boolean(controls?.color_by.value === ColorBy.metric), + }, + color_scheme: { + label: t('Country Color Scheme'), + visibility: ({ controls }) => + Boolean(controls?.color_by.value === ColorBy.country), }, }, }; diff --git a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/transformProps.js b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/transformProps.js index 464dd53afa4fc..fd5f109c0d40e 100644 --- a/superset-frontend/plugins/legacy-plugin-chart-world-map/src/transformProps.js +++ b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/transformProps.js @@ -20,8 +20,15 @@ import { rgb } from 'd3-color'; export default function transformProps(chartProps) { const { width, height, formData, queriesData } = chartProps; - const { maxBubbleSize, showBubbles, linearColorScheme, colorPicker } = - formData; + const { + maxBubbleSize, + showBubbles, + linearColorScheme, + colorPicker, + colorBy, + colorScheme, + sliceId, + } = formData; const { r, g, b } = colorPicker; return { @@ -32,5 +39,8 @@ export default function transformProps(chartProps) { showBubbles, linearColorScheme, color: rgb(r, g, b).hex(), + colorBy, + colorScheme, + sliceId, }; } diff --git a/superset-frontend/src/explore/components/useOriginalFormattedTimeColumns.ts b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/utils.ts similarity index 70% rename from superset-frontend/src/explore/components/useOriginalFormattedTimeColumns.ts rename to superset-frontend/plugins/legacy-plugin-chart-world-map/src/utils.ts index b51fef617889d..b558b15b97fe5 100644 --- a/superset-frontend/src/explore/components/useOriginalFormattedTimeColumns.ts +++ b/superset-frontend/plugins/legacy-plugin-chart-world-map/src/utils.ts @@ -16,12 +16,8 @@ * specific language governing permissions and limitations * under the License. */ -import { useSelector } from 'react-redux'; -import { ExplorePageState } from '../reducers/getInitialState'; -export const useOriginalFormattedTimeColumns = (datasourceId?: string) => - useSelector(state => - datasourceId - ? state.explore.originalFormattedTimeColumns?.[datasourceId] ?? 
[] - : [], - ); +export enum ColorBy { + metric = 'metric', + country = 'country', +} diff --git a/superset-frontend/plugins/plugin-chart-table/src/DataTable/DataTable.tsx b/superset-frontend/plugins/plugin-chart-table/src/DataTable/DataTable.tsx index aff71667408ef..85580e7b63a3d 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/DataTable/DataTable.tsx +++ b/superset-frontend/plugins/plugin-chart-table/src/DataTable/DataTable.tsx @@ -29,6 +29,7 @@ import { usePagination, useSortBy, useGlobalFilter, + useColumnOrder, PluginHook, TableOptions, FilterType, @@ -64,6 +65,7 @@ export interface DataTableProps extends TableOptions { sticky?: boolean; rowCount: number; wrapperRef?: MutableRefObject; + onColumnOrderChange: () => void; } export interface RenderHTMLCellProps extends HTMLProps { @@ -95,12 +97,14 @@ export default typedMemo(function DataTable({ hooks, serverPagination, wrapperRef: userWrapperRef, + onColumnOrderChange, ...moreUseTableOptions }: DataTableProps): JSX.Element { const tableHooks: PluginHook[] = [ useGlobalFilter, useSortBy, usePagination, + useColumnOrder, doSticky ? useSticky : [], hooks || [], ].flat(); @@ -172,6 +176,8 @@ export default typedMemo(function DataTable({ setGlobalFilter, setPageSize: setPageSize_, wrapStickyTable, + setColumnOrder, + allColumns, state: { pageIndex, pageSize, globalFilter: filterValue, sticky = {} }, } = useTable( { @@ -211,6 +217,33 @@ export default typedMemo(function DataTable({ const shouldRenderFooter = columns.some(x => !!x.Footer); + let columnBeingDragged = -1; + + const onDragStart = (e: React.DragEvent) => { + const el = e.target as HTMLTableCellElement; + columnBeingDragged = allColumns.findIndex( + col => col.id === el.dataset.columnName, + ); + e.dataTransfer.setData('text/plain', `${columnBeingDragged}`); + }; + + const onDrop = (e: React.DragEvent) => { + const el = e.target as HTMLTableCellElement; + const newPosition = allColumns.findIndex( + col => col.id === el.dataset.columnName, + ); + + if (newPosition !== -1) { + const currentCols = allColumns.map(c => c.id); + const colToBeMoved = currentCols.splice(columnBeingDragged, 1); + currentCols.splice(newPosition, 0, colToBeMoved[0]); + setColumnOrder(currentCols); + // toggle value in TableChart to trigger column width recalc + onColumnOrderChange(); + } + e.preventDefault(); + }; + const renderTable = () => ( @@ -223,6 +256,8 @@ export default typedMemo(function DataTable({ column.render('Header', { key: column.id, ...column.getSortByToggleProps(), + onDragStart, + onDrop, }), )} diff --git a/superset-frontend/plugins/plugin-chart-table/src/DataTable/hooks/useSticky.tsx b/superset-frontend/plugins/plugin-chart-table/src/DataTable/hooks/useSticky.tsx index 9a98fee431817..6fd4d839ce661 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/DataTable/hooks/useSticky.tsx +++ b/superset-frontend/plugins/plugin-chart-table/src/DataTable/hooks/useSticky.tsx @@ -350,6 +350,7 @@ function useInstance(instance: TableInstance) { data, page, rows, + allColumns, getTableSize = () => undefined, } = instance; @@ -370,7 +371,7 @@ function useInstance(instance: TableInstance) { useMountedMemo(getTableSize, [getTableSize]) || sticky; // only change of data should trigger re-render // eslint-disable-next-line react-hooks/exhaustive-deps - const table = useMemo(renderer, [page, rows]); + const table = useMemo(renderer, [page, rows, allColumns]); useLayoutEffect(() => { if (!width || !height) { diff --git 
a/superset-frontend/plugins/plugin-chart-table/src/DataTable/types/react-table.d.ts b/superset-frontend/plugins/plugin-chart-table/src/DataTable/types/react-table.d.ts index 52a18d54e1b0f..c1f49ea396f25 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/DataTable/types/react-table.d.ts +++ b/superset-frontend/plugins/plugin-chart-table/src/DataTable/types/react-table.d.ts @@ -36,6 +36,8 @@ import { UseSortByState, UseTableHooks, UseSortByHooks, + UseColumnOrderState, + UseColumnOrderInstanceProps, Renderer, HeaderProps, TableFooterProps, @@ -64,6 +66,7 @@ declare module 'react-table' { UseRowSelectInstanceProps, UseRowStateInstanceProps, UseSortByInstanceProps, + UseColumnOrderInstanceProps, UseStickyInstanceProps {} export interface TableState @@ -73,6 +76,7 @@ declare module 'react-table' { UsePaginationState, UseRowSelectState, UseSortByState, + UseColumnOrderState, UseStickyState {} // Typing from @types/react-table is incomplete @@ -82,12 +86,19 @@ declare module 'react-table' { onClick?: React.MouseEventHandler; } + interface TableRearrangeColumnsProps { + onDragStart: (e: React.DragEvent) => void; + onDrop: (e: React.DragEvent) => void; + } + export interface ColumnInterface extends UseGlobalFiltersColumnOptions, UseSortByColumnOptions { // must define as a new property because it's not possible to override // the existing `Header` renderer option - Header?: Renderer>; + Header?: Renderer< + TableSortByToggleProps & HeaderProps & TableRearrangeColumnsProps + >; Footer?: Renderer>; } diff --git a/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx b/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx index 296a7125949a0..f0b125940bb83 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx +++ b/superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -import React, { CSSProperties, useCallback, useMemo } from 'react'; +import React, { CSSProperties, useCallback, useMemo, useState } from 'react'; import { ColumnInstance, ColumnWithLooseAccessor, @@ -192,12 +192,16 @@ export default function TableChart( filters, sticky = true, // whether to use sticky header columnColorFormatters, + allowRearrangeColumns = false, } = props; const timestampFormatter = useCallback( value => getTimeFormatterForGranularity(timeGrain)(value), [timeGrain], ); + // keep track of whether column order changed, so that column widths can too + const [columnOrderToggle, setColumnOrderToggle] = useState(false); + const handleChange = useCallback( (filters: { [x: string]: DataRecordValue[] }) => { if (!emitFilter) { @@ -413,7 +417,7 @@ export default function TableChart( // render `Cell`. This saves some time for large tables. return {text}; }, - Header: ({ column: col, onClick, style }) => ( + Header: ({ column: col, onClick, style, onDragStart, onDrop }) => ( @@ -469,6 +482,7 @@ export default function TableChart( toggleFilter, totals, columnColorFormatters, + columnOrderToggle, ], ); @@ -498,6 +512,7 @@ export default function TableChart( height={height} serverPagination={serverPagination} onServerPaginationChange={handleServerPaginationChange} + onColumnOrderChange={() => setColumnOrderToggle(!columnOrderToggle)} // 9 page items in > 340px works well even for 100+ pages maxPageItemCount={width > 340 ? 
9 : 7} noResults={getNoResultsMessage} diff --git a/superset-frontend/plugins/plugin-chart-table/src/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-table/src/controlPanel.tsx index c121547518e46..bb855bd7ccc56 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/controlPanel.tsx +++ b/superset-frontend/plugins/plugin-chart-table/src/controlPanel.tsx @@ -455,6 +455,20 @@ const config: ControlPanelConfig = { }, }, ], + [ + { + name: 'allow_rearrange_columns', + config: { + type: 'CheckboxControl', + label: t('Allow columns to be rearranged'), + renderTrigger: true, + default: false, + description: t( + "Allow end user to drag-and-drop column headers to rearrange them. Note their changes won't persist for the next time they open the chart.", + ), + }, + }, + ], [ { name: 'column_config', diff --git a/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts b/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts index 8b701422bb950..5cf4fd1e83c43 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts +++ b/superset-frontend/plugins/plugin-chart-table/src/transformProps.ts @@ -220,6 +220,7 @@ const transformProps = ( query_mode: queryMode, show_totals: showTotals, conditional_formatting: conditionalFormatting, + allow_rearrange_columns: allowRearrangeColumns, } = formData; const timeGrain = extractTimegrain(formData); @@ -272,6 +273,7 @@ const transformProps = ( onChangeFilter, columnColorFormatters, timeGrain, + allowRearrangeColumns, }; }; diff --git a/superset-frontend/plugins/plugin-chart-table/src/types.ts b/superset-frontend/plugins/plugin-chart-table/src/types.ts index 7c50f99cb4ae6..f5b83fa8bfd7e 100644 --- a/superset-frontend/plugins/plugin-chart-table/src/types.ts +++ b/superset-frontend/plugins/plugin-chart-table/src/types.ts @@ -71,6 +71,7 @@ export type TableChartFormData = QueryFormData & { emit_filter?: boolean; time_grain_sqla?: TimeGranularity; column_config?: Record; + allow_rearrange_columns?: boolean; }; export interface TableChartProps extends ChartProps { @@ -109,6 +110,7 @@ export interface TableChartTransformedProps { emitFilter?: boolean; onChangeFilter?: ChartProps['hooks']['onAddFilter']; columnColorFormatters?: ColorFormatters; + allowRearrangeColumns?: boolean; } export default {}; diff --git a/superset-frontend/src/SqlLab/actions/sqlLab.js b/superset-frontend/src/SqlLab/actions/sqlLab.js index b950d0e37737f..04707d6ae67a8 100644 --- a/superset-frontend/src/SqlLab/actions/sqlLab.js +++ b/superset-frontend/src/SqlLab/actions/sqlLab.js @@ -23,7 +23,7 @@ import invert from 'lodash/invert'; import mapKeys from 'lodash/mapKeys'; import { isFeatureEnabled, FeatureFlag } from 'src/featureFlags'; -import { now } from 'src/modules/dates'; +import { now } from 'src/utils/dates'; import { addDangerToast as addDangerToastAction, addInfoToast as addInfoToastAction, @@ -917,9 +917,13 @@ export function updateSavedQuery(query) { } export function queryEditorSetSql(queryEditor, sql) { + return { type: QUERY_EDITOR_SET_SQL, queryEditor, sql }; +} + +export function queryEditorSetAndSaveSql(queryEditor, sql) { return function (dispatch) { // saved query and set tab state use this action - dispatch({ type: QUERY_EDITOR_SET_SQL, queryEditor, sql }); + dispatch(queryEditorSetSql(queryEditor, sql)); if (isFeatureEnabled(FeatureFlag.SQLLAB_BACKEND_PERSISTENCE)) { return SupersetClient.put({ endpoint: encodeURI(`/tabstateview/${queryEditor.id}`), diff --git a/superset-frontend/src/SqlLab/actions/sqlLab.test.js 
b/superset-frontend/src/SqlLab/actions/sqlLab.test.js index f2d56caee4d33..440df74bf937f 100644 --- a/superset-frontend/src/SqlLab/actions/sqlLab.test.js +++ b/superset-frontend/src/SqlLab/actions/sqlLab.test.js @@ -635,7 +635,7 @@ describe('async actions', () => { }); }); - describe('queryEditorSetSql', () => { + describe('queryEditorSetAndSaveSql', () => { const sql = 'SELECT * '; const expectedActions = [ { @@ -651,7 +651,7 @@ describe('async actions', () => { const store = mockStore({}); return store - .dispatch(actions.queryEditorSetSql(queryEditor, sql)) + .dispatch(actions.queryEditorSetAndSaveSql(queryEditor, sql)) .then(() => { expect(store.getActions()).toEqual(expectedActions); expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(1); @@ -668,7 +668,7 @@ describe('async actions', () => { const store = mockStore({}); - store.dispatch(actions.queryEditorSetSql(queryEditor, sql)); + store.dispatch(actions.queryEditorSetAndSaveSql(queryEditor, sql)); expect(store.getActions()).toEqual(expectedActions); expect(fetchMock.calls(updateTabStateEndpoint)).toHaveLength(0); diff --git a/superset-frontend/src/SqlLab/components/QueryHistory/QueryHistory.test.tsx b/superset-frontend/src/SqlLab/components/QueryHistory/QueryHistory.test.tsx index e63de3fdca869..8d25fca910124 100644 --- a/superset-frontend/src/SqlLab/components/QueryHistory/QueryHistory.test.tsx +++ b/superset-frontend/src/SqlLab/components/QueryHistory/QueryHistory.test.tsx @@ -24,7 +24,7 @@ const NOOP = () => {}; const mockedProps = { queries: [], actions: { - queryEditorSetSql: NOOP, + queryEditorSetAndSaveSql: NOOP, cloneQueryToNewTab: NOOP, fetchQueryResults: NOOP, clearQueryResults: NOOP, diff --git a/superset-frontend/src/SqlLab/components/QueryHistory/index.tsx b/superset-frontend/src/SqlLab/components/QueryHistory/index.tsx index 6820e19d49deb..c41ace1ead01b 100644 --- a/superset-frontend/src/SqlLab/components/QueryHistory/index.tsx +++ b/superset-frontend/src/SqlLab/components/QueryHistory/index.tsx @@ -25,7 +25,7 @@ import QueryTable from 'src/SqlLab/components/QueryTable'; interface QueryHistoryProps { queries: Query[]; actions: { - queryEditorSetSql: Function; + queryEditorSetAndSaveSql: Function; cloneQueryToNewTab: Function; fetchQueryResults: Function; clearQueryResults: Function; diff --git a/superset-frontend/src/SqlLab/components/QuerySearch/index.tsx b/superset-frontend/src/SqlLab/components/QuerySearch/index.tsx index 762f35e89880e..e1e994133a9cc 100644 --- a/superset-frontend/src/SqlLab/components/QuerySearch/index.tsx +++ b/superset-frontend/src/SqlLab/components/QuerySearch/index.tsx @@ -27,7 +27,7 @@ import { epochTimeXHoursAgo, epochTimeXDaysAgo, epochTimeXYearsAgo, -} from 'src/modules/dates'; +} from 'src/utils/dates'; import AsyncSelect from 'src/components/AsyncSelect'; import { Query } from 'src/SqlLab/types'; import { STATUS_OPTIONS, TIME_OPTIONS } from 'src/SqlLab/constants'; @@ -37,7 +37,7 @@ interface QuerySearchProps { actions: { addDangerToast: (msg: string) => void; setDatabases: (data: Record) => Record; - queryEditorSetSql: Function; + queryEditorSetAndSaveSql: Function; cloneQueryToNewTab: Function; fetchQueryResults: Function; clearQueryResults: Function; diff --git a/superset-frontend/src/SqlLab/components/QueryTable/index.tsx b/superset-frontend/src/SqlLab/components/QueryTable/index.tsx index a50779d6eb9c1..90d2219497583 100644 --- a/superset-frontend/src/SqlLab/components/QueryTable/index.tsx +++ b/superset-frontend/src/SqlLab/components/QueryTable/index.tsx @@ -25,7 +25,7 @@ 
import { t, useTheme } from '@superset-ui/core'; import { useSelector } from 'react-redux'; import TableView from 'src/components/TableView'; import Button from 'src/components/Button'; -import { fDuration } from 'src/modules/dates'; +import { fDuration } from 'src/utils/dates'; import Icons from 'src/components/Icons'; import { Tooltip } from 'src/components/Tooltip'; import { Query, RootState } from 'src/SqlLab/types'; @@ -46,7 +46,7 @@ interface QueryTableQuery interface QueryTableProps { columns?: string[]; actions: { - queryEditorSetSql: Function; + queryEditorSetAndSaveSql: Function; cloneQueryToNewTab: Function; fetchQueryResults: Function; clearQueryResults: Function; @@ -94,7 +94,7 @@ const QueryTable = ({ const user = useSelector(state => state.sqlLab.user); const { - queryEditorSetSql, + queryEditorSetAndSaveSql, cloneQueryToNewTab, fetchQueryResults, clearQueryResults, @@ -103,7 +103,7 @@ const QueryTable = ({ const data = useMemo(() => { const restoreSql = (query: Query) => { - queryEditorSetSql({ id: query.sqlEditorId }, query.sql); + queryEditorSetAndSaveSql({ id: query.sqlEditorId }, query.sql); }; const openQueryInNewTab = (query: Query) => { @@ -314,7 +314,7 @@ const QueryTable = ({ clearQueryResults, cloneQueryToNewTab, fetchQueryResults, - queryEditorSetSql, + queryEditorSetAndSaveSql, removeQuery, ]); diff --git a/superset-frontend/src/SqlLab/components/ResultSet/index.tsx b/superset-frontend/src/SqlLab/components/ResultSet/index.tsx index d2c4b41ff7589..39c897c8d4e3c 100644 --- a/superset-frontend/src/SqlLab/components/ResultSet/index.tsx +++ b/superset-frontend/src/SqlLab/components/ResultSet/index.tsx @@ -127,11 +127,8 @@ const MonospaceDiv = styled.div` const ReturnedRows = styled.div` font-size: 13px; line-height: 24px; - .limitMessage { - color: ${({ theme }) => theme.colors.secondary.light1}; - margin-left: ${({ theme }) => theme.gridUnit * 2}px; - } `; + const ResultSetControls = styled.div` display: flex; justify-content: space-between; @@ -148,6 +145,19 @@ const ResultSetErrorMessage = styled.div` padding-top: ${({ theme }) => 4 * theme.gridUnit}px; `; +const ResultSetRowsReturned = styled.span` + white-space: nowrap; + text-overflow: ellipsis; + width: 100%; + overflow: hidden; + display: inline-block; +`; + +const LimitMessage = styled.span` + color: ${({ theme }) => theme.colors.secondary.light1}; + margin-left: ${({ theme }) => theme.gridUnit * 2}px; +`; + const updateDataset = async ( dbId: number, datasetId: number, @@ -608,42 +618,38 @@ export default class ResultSet extends React.PureComponent< limitingFactor === LIMITING_FACTOR.DROPDOWN; if (limitingFactor === LIMITING_FACTOR.QUERY && this.props.csv) { - limitMessage = ( - - {t( - 'The number of rows displayed is limited to %(rows)d by the query', - { rows }, - )} - + limitMessage = t( + 'The number of rows displayed is limited to %(rows)d by the query', + { rows }, ); } else if ( limitingFactor === LIMITING_FACTOR.DROPDOWN && !shouldUseDefaultDropdownAlert ) { - limitMessage = ( - - {t( - 'The number of rows displayed is limited to %(rows)d by the limit dropdown.', - { rows }, - )} - + limitMessage = t( + 'The number of rows displayed is limited to %(rows)d by the limit dropdown.', + { rows }, ); } else if (limitingFactor === LIMITING_FACTOR.QUERY_AND_DROPDOWN) { - limitMessage = ( - - {t( - 'The number of rows displayed is limited to %(rows)d by the query and limit dropdown.', - { rows }, - )} - + limitMessage = t( + 'The number of rows displayed is limited to %(rows)d by the query and limit 
dropdown.', + { rows }, ); } + + const rowsReturnedMessage = t('%(rows)d rows returned', { + rows, + }); + + const tooltipText = `${rowsReturnedMessage}. ${limitMessage}`; + return ( {!limitReached && !shouldUseDefaultDropdownAlert && ( - - {t('%(rows)d rows returned', { rows })} {limitMessage} - + + {rowsReturnedMessage} + {limitMessage} + )} {!limitReached && shouldUseDefaultDropdownAlert && (
@@ -678,6 +684,7 @@ export default class ResultSet extends React.PureComponent< render() { const { query } = this.props; + const limitReached = query?.results?.displayLimitReached; let sql; let exploreDBId = query.dbId; if (this.props.database && this.props.database.explore_database_id) { @@ -747,9 +754,17 @@ export default class ResultSet extends React.PureComponent< } if (query.state === 'success' && query.results) { const { results } = query; + // Accounts for offset needed for height of ResultSetRowsReturned component if !limitReached + const rowMessageHeight = !limitReached ? 32 : 0; + // Accounts for offset needed for height of Alert if this.state.alertIsOpen + const alertContainerHeight = 70; + // We need to calculate the height of this.renderRowsReturned() + // if we want the results panel to be the proper height, because the + // FilterTable component needs an explicit height to render the + // react-virtualized Table component const height = this.state.alertIsOpen - ? this.props.height - 70 - : this.props.height; + ? this.props.height - alertContainerHeight + : this.props.height - rowMessageHeight; let data; if (this.props.cache && query.cached) { ({ data } = this.state); diff --git a/superset-frontend/src/SqlLab/components/SouthPane/SouthPane.test.jsx b/superset-frontend/src/SqlLab/components/SouthPane/SouthPane.test.jsx index 1786a6cf313a6..6dfca33dbc06b 100644 --- a/superset-frontend/src/SqlLab/components/SouthPane/SouthPane.test.jsx +++ b/superset-frontend/src/SqlLab/components/SouthPane/SouthPane.test.jsx @@ -80,7 +80,7 @@ const mockedEmptyProps = { latestQueryId: '', dataPreviewQueries: [], actions: { - queryEditorSetSql: NOOP, + queryEditorSetAndSaveSql: NOOP, cloneQueryToNewTab: NOOP, fetchQueryResults: NOOP, clearQueryResults: NOOP, diff --git a/superset-frontend/src/SqlLab/components/SouthPane/index.tsx b/superset-frontend/src/SqlLab/components/SouthPane/index.tsx index 767b608f3b7d2..ddcd972f9828b 100644 --- a/superset-frontend/src/SqlLab/components/SouthPane/index.tsx +++ b/superset-frontend/src/SqlLab/components/SouthPane/index.tsx @@ -46,7 +46,7 @@ interface SouthPanePropTypes { latestQueryId?: string; dataPreviewQueries: any[]; actions: { - queryEditorSetSql: Function; + queryEditorSetAndSaveSql: Function; cloneQueryToNewTab: Function; fetchQueryResults: Function; clearQueryResults: Function; @@ -62,9 +62,13 @@ interface SouthPanePropTypes { defaultQueryLimit: number; } -const StyledPane = styled.div` - width: 100%; +type StyledPaneProps = { + height: number; +}; +const StyledPane = styled.div` + width: 100%; + height: ${props => props.height}px; .ant-tabs .ant-tabs-content-holder { overflow: visible; } @@ -207,7 +211,7 @@ export default function SouthPane({ return offline ? 
( renderOfflineStatus() ) : ( - + { - if (this.state.sql.trim() !== '') { + if (this.props.queryEditor.sql.trim() !== '') { this.runQuery(); } }, @@ -340,7 +335,7 @@ class SqlEditor extends React.PureComponent { key: 'ctrl+enter', descr: t('Run query'), func: () => { - if (this.state.sql.trim() !== '') { + if (this.props.queryEditor.sql.trim() !== '') { this.runQuery(); } }, @@ -383,8 +378,8 @@ class SqlEditor extends React.PureComponent { this.setState({ showEmptyState: bool }); } - setQueryEditorSql(sql) { - this.props.queryEditorSetSql(this.props.queryEditor, sql); + setQueryEditorAndSaveSql(sql) { + this.props.queryEditorSetAndSaveSql(this.props.queryEditor, sql); } setQueryLimit(queryLimit) { @@ -396,7 +391,7 @@ class SqlEditor extends React.PureComponent { const qe = this.props.queryEditor; const query = { dbId: qe.dbId, - sql: qe.selectedText ? qe.selectedText : this.state.sql, + sql: qe.selectedText ? qe.selectedText : this.props.queryEditor.sql, sqlEditorId: qe.id, schema: qe.schema, templateParams: qe.templateParams, @@ -429,12 +424,12 @@ class SqlEditor extends React.PureComponent { }; } - requestValidation() { + requestValidation(sql) { if (this.props.database) { const qe = this.props.queryEditor; const query = { dbId: qe.dbId, - sql: this.state.sql, + sql, sqlEditorId: qe.id, schema: qe.schema, templateParams: qe.templateParams, @@ -466,7 +461,7 @@ class SqlEditor extends React.PureComponent { const qe = this.props.queryEditor; const query = { dbId: qe.dbId, - sql: qe.selectedText ? qe.selectedText : this.state.sql, + sql: qe.selectedText ? qe.selectedText : qe.sql, sqlEditorId: qe.id, tab: qe.title, schema: qe.schema, @@ -682,7 +677,7 @@ class SqlEditor extends React.PureComponent { runQuery={this.runQuery} selectedText={qe.selectedText} stopQuery={this.stopQuery} - sql={this.state.sql} + sql={this.props.queryEditor.sql} overlayCreateAsMenu={showMenu ? 
runMenuBtn : null} /> @@ -854,6 +849,7 @@ function mapDispatchToProps(dispatch) { queryEditorSetAutorun, queryEditorSetQueryLimit, queryEditorSetSql, + queryEditorSetAndSaveSql, queryEditorSetTemplateParams, runQuery, saveQuery, diff --git a/superset-frontend/src/SqlLab/reducers/sqlLab.js b/superset-frontend/src/SqlLab/reducers/sqlLab.js index 923caaf96152d..d20d34420575a 100644 --- a/superset-frontend/src/SqlLab/reducers/sqlLab.js +++ b/superset-frontend/src/SqlLab/reducers/sqlLab.js @@ -20,7 +20,7 @@ import { t } from '@superset-ui/core'; import getInitialState from './getInitialState'; import * as actions from '../actions/sqlLab'; -import { now } from '../../modules/dates'; +import { now } from '../../utils/dates'; import { addToObject, alterInObject, @@ -340,6 +340,12 @@ export default function sqlLabReducer(state = {}, action) { errorMessage: null, cached: false, }; + + const resultsKey = action?.results?.query?.resultsKey; + if (resultsKey) { + alts.resultsKey = resultsKey; + } + return alterInObject(state, 'queries', action.query, alts); }, [actions.QUERY_FAILED]() { diff --git a/superset-frontend/src/SqlLab/reducers/sqlLab.test.js b/superset-frontend/src/SqlLab/reducers/sqlLab.test.js index cac9376ef52cc..067cba3070ac9 100644 --- a/superset-frontend/src/SqlLab/reducers/sqlLab.test.js +++ b/superset-frontend/src/SqlLab/reducers/sqlLab.test.js @@ -18,7 +18,7 @@ */ import sqlLabReducer from 'src/SqlLab/reducers/sqlLab'; import * as actions from 'src/SqlLab/actions/sqlLab'; -import { now } from 'src/modules/dates'; +import { now } from 'src/utils/dates'; import { table, initialState as mockState } from '../fixtures'; const initialState = mockState.sqlLab; diff --git a/superset-frontend/src/addSlice/AddSliceContainer.tsx b/superset-frontend/src/addSlice/AddSliceContainer.tsx index 7e9f0a1a2ed54..fd22377314ac8 100644 --- a/superset-frontend/src/addSlice/AddSliceContainer.tsx +++ b/superset-frontend/src/addSlice/AddSliceContainer.tsx @@ -19,6 +19,9 @@ import React, { ReactNode } from 'react'; import rison from 'rison'; import { styled, t, SupersetClient, JsonResponse } from '@superset-ui/core'; +import { getUrlParam } from 'src/utils/urlUtils'; +import { URL_PARAMS } from 'src/constants'; +import { isNullish } from 'src/utils/common'; import Button from 'src/components/Button'; import { Select, Steps } from 'src/components'; import { FormLabel } from 'src/components/Form'; @@ -195,10 +198,12 @@ export default class AddSliceContainer extends React.PureComponent< } exploreUrl() { + const dashboardId = getUrlParam(URL_PARAMS.dashboardId); const formData = encodeURIComponent( JSON.stringify({ viz_type: this.state.visType, datasource: this.state.datasource?.value, + ...(!isNullish(dashboardId) && { dashboardId }), }), ); return `/superset/explore/?form_data=${formData}`; diff --git a/superset-frontend/src/assets/images/icons/redo.svg b/superset-frontend/src/assets/images/icons/redo.svg new file mode 100644 index 0000000000000..a35cf022525e7 --- /dev/null +++ b/superset-frontend/src/assets/images/icons/redo.svg @@ -0,0 +1,21 @@ + + + + diff --git a/superset-frontend/src/assets/images/icons/undo.svg b/superset-frontend/src/assets/images/icons/undo.svg new file mode 100644 index 0000000000000..b680a68649681 --- /dev/null +++ b/superset-frontend/src/assets/images/icons/undo.svg @@ -0,0 +1,21 @@ + + + + diff --git a/superset-frontend/src/components/Chart/chartAction.js b/superset-frontend/src/components/Chart/chartAction.js index 8f451444f4633..d52ac79177da0 100644 --- 
a/superset-frontend/src/components/Chart/chartAction.js +++ b/superset-frontend/src/components/Chart/chartAction.js @@ -595,3 +595,17 @@ export function refreshChart(chartKey, force, dashboardId) { ); }; } + +export const getDatasetSamples = async (datasetId, force) => { + const endpoint = `/api/v1/dataset/${datasetId}/samples?force=${force}`; + try { + const response = await SupersetClient.get({ endpoint }); + return response.json.result; + } catch (err) { + const clientError = await getClientErrorObject(err); + throw new Error( + clientError.message || clientError.error || t('Sorry, an error occurred'), + { cause: err }, + ); + } +}; diff --git a/superset-frontend/src/components/Chart/chartReducer.ts b/superset-frontend/src/components/Chart/chartReducer.ts index d6d612fbfaf25..010140584c2b1 100644 --- a/superset-frontend/src/components/Chart/chartReducer.ts +++ b/superset-frontend/src/components/Chart/chartReducer.ts @@ -22,7 +22,7 @@ import { HYDRATE_DASHBOARD } from 'src/dashboard/actions/hydrate'; import { DatasourcesAction } from 'src/dashboard/actions/datasources'; import { ChartState } from 'src/explore/types'; import { getFormDataFromControls } from 'src/explore/controlUtils'; -import { now } from 'src/modules/dates'; +import { now } from 'src/utils/dates'; import * as actions from './chartAction'; export const chart: ChartState = { diff --git a/superset-frontend/src/components/CopyToClipboard/index.jsx b/superset-frontend/src/components/CopyToClipboard/index.jsx index 2047cf4b0fbdb..95b6cdfdc0f70 100644 --- a/superset-frontend/src/components/CopyToClipboard/index.jsx +++ b/superset-frontend/src/components/CopyToClipboard/index.jsx @@ -57,10 +57,10 @@ class CopyToClipboard extends React.Component { onClick() { if (this.props.getText) { this.props.getText(d => { - this.copyToClipboard(d); + this.copyToClipboard(Promise.resolve(d)); }); } else { - this.copyToClipboard(this.props.text); + this.copyToClipboard(Promise.resolve(this.props.text)); } } @@ -72,7 +72,7 @@ class CopyToClipboard extends React.Component { } copyToClipboard(textToCopy) { - copyTextToClipboard(textToCopy) + copyTextToClipboard(() => textToCopy) .then(() => { this.props.addSuccessToast(t('Copied to clipboard!')); }) diff --git a/superset-frontend/src/components/Datasource/DatasourceModal.tsx b/superset-frontend/src/components/Datasource/DatasourceModal.tsx index e03c416505a39..49cb7ae5a4906 100644 --- a/superset-frontend/src/components/Datasource/DatasourceModal.tsx +++ b/superset-frontend/src/components/Datasource/DatasourceModal.tsx @@ -49,10 +49,6 @@ const StyledDatasourceModal = styled(Modal)` .modal-footer { flex: 0 1 auto; } - - .ant-modal-body { - overflow: visible; - } `; interface DatasourceModalProps { diff --git a/superset-frontend/src/components/DynamicEditableTitle/index.tsx b/superset-frontend/src/components/DynamicEditableTitle/index.tsx index 969aea19b7302..d9e7066330130 100644 --- a/superset-frontend/src/components/DynamicEditableTitle/index.tsx +++ b/superset-frontend/src/components/DynamicEditableTitle/index.tsx @@ -92,6 +92,10 @@ export const DynamicEditableTitle = ({ refreshMode: 'debounce', }); + useEffect(() => { + setCurrentTitle(title); + }, [title]); + useEffect(() => { if (isEditing && contentRef?.current) { contentRef.current.focus(); @@ -202,6 +206,7 @@ export const DynamicEditableTitle = ({ className="dynamic-title" aria-label={label ?? 
t('Title')} ref={contentRef} + data-test="editable-title" > {currentTitle} diff --git a/superset-frontend/src/components/EmptyState/index.tsx b/superset-frontend/src/components/EmptyState/index.tsx index 7ba54567e438f..7ee69d7eea5e9 100644 --- a/superset-frontend/src/components/EmptyState/index.tsx +++ b/superset-frontend/src/components/EmptyState/index.tsx @@ -106,6 +106,7 @@ const BigDescription = styled(Description)` const SmallDescription = styled(Description)` ${({ theme }) => css` margin-top: ${theme.gridUnit}px; + line-height: 1.2; `} `; diff --git a/superset-frontend/src/components/Icons/index.tsx b/superset-frontend/src/components/Icons/index.tsx index 08b13404a04d2..27efbe4c2e29f 100644 --- a/superset-frontend/src/components/Icons/index.tsx +++ b/superset-frontend/src/components/Icons/index.tsx @@ -155,6 +155,8 @@ const IconFileNames = [ 'tags', 'ballot', 'category', + 'undo', + 'redo', ]; const iconOverrides: Record = {}; diff --git a/superset-frontend/src/components/Modal/Modal.tsx b/superset-frontend/src/components/Modal/Modal.tsx index 389982cc22021..c6d5b3ee0aac6 100644 --- a/superset-frontend/src/components/Modal/Modal.tsx +++ b/superset-frontend/src/components/Modal/Modal.tsx @@ -89,9 +89,20 @@ export const StyledModal = styled(BaseModal)` max-width: ${maxWidth ?? '900px'}; padding-left: ${theme.gridUnit * 3}px; padding-right: ${theme.gridUnit * 3}px; + padding-bottom: 0; + top: 0; `} + .ant-modal-content { + display: flex; + flex-direction: column; + max-height: ${({ theme }) => `calc(100vh - ${theme.gridUnit * 8}px)`}; + margin-bottom: ${({ theme }) => theme.gridUnit * 4}px; + margin-top: ${({ theme }) => theme.gridUnit * 4}px; + } + .ant-modal-header { + flex: 0 0 auto; background-color: ${({ theme }) => theme.colors.grayscale.light4}; border-radius: ${({ theme }) => theme.borderRadius}px ${({ theme }) => theme.borderRadius}px 0 0; @@ -119,11 +130,13 @@ export const StyledModal = styled(BaseModal)` } .ant-modal-body { + flex: 0 1 auto; padding: ${({ theme }) => theme.gridUnit * 4}px; overflow: auto; ${({ resizable, height }) => !resizable && height && `height: ${height};`} } .ant-modal-footer { + flex: 0 0 1; border-top: ${({ theme }) => theme.gridUnit / 4}px solid ${({ theme }) => theme.colors.grayscale.light2}; padding: ${({ theme }) => theme.gridUnit * 4}px; @@ -325,7 +338,7 @@ const CustomModal = ({ mask={shouldShowMask} draggable={draggable} resizable={resizable} - destroyOnClose={destroyOnClose || resizable || draggable} + destroyOnClose={destroyOnClose} {...rest} > {children} diff --git a/superset-frontend/src/components/PageHeaderWithActions/index.tsx b/superset-frontend/src/components/PageHeaderWithActions/index.tsx index 204a82b235d1d..4449d1c6b3472 100644 --- a/superset-frontend/src/components/PageHeaderWithActions/index.tsx +++ b/superset-frontend/src/components/PageHeaderWithActions/index.tsx @@ -50,7 +50,21 @@ const headerStyles = (theme: SupersetTheme) => css` align-items: center; flex-wrap: nowrap; justify-content: space-between; - height: 100%; + background-color: ${theme.colors.grayscale.light5}; + height: ${theme.gridUnit * 16}px; + padding: 0 ${theme.gridUnit * 4}px; + + .editable-title { + overflow: hidden; + + & > input[type='button'], + & > span { + overflow: hidden; + text-overflow: ellipsis; + max-width: 100%; + white-space: nowrap; + } + } span[role='button'] { display: flex; @@ -113,7 +127,7 @@ export const PageHeaderWithActions = ({ }: PageHeaderWithActionsProps) => { const theme = useTheme(); return ( -
+
{showTitlePanelItems && ( diff --git a/superset-frontend/src/components/ReportModal/HeaderReportDropdown/index.tsx b/superset-frontend/src/components/ReportModal/HeaderReportDropdown/index.tsx index cd741e5c338ba..7beec04ffd7a3 100644 --- a/superset-frontend/src/components/ReportModal/HeaderReportDropdown/index.tsx +++ b/superset-frontend/src/components/ReportModal/HeaderReportDropdown/index.tsx @@ -243,7 +243,11 @@ export default function HeaderReportDropDown({ triggerNode.closest('.action-button') } > - + @@ -253,7 +257,7 @@ export default function HeaderReportDropDown({ role="button" title={t('Schedule email report')} tabIndex={0} - className="action-button" + className="action-button action-schedule-report" onClick={() => setShowModal(true)} > diff --git a/superset-frontend/src/components/Timer/Timer.test.tsx b/superset-frontend/src/components/Timer/Timer.test.tsx index f37004af838f8..a71d0f292fcca 100644 --- a/superset-frontend/src/components/Timer/Timer.test.tsx +++ b/superset-frontend/src/components/Timer/Timer.test.tsx @@ -22,7 +22,7 @@ import React from 'react'; import { render, sleep, waitFor } from 'spec/helpers/testing-library'; import Timer, { TimerProps } from 'src/components/Timer'; -import { now } from 'src/modules/dates'; +import { now } from 'src/utils/dates'; function parseTime(text?: string | null) { return !!text && Number(text.replace(/:/g, '')); diff --git a/superset-frontend/src/components/Timer/index.tsx b/superset-frontend/src/components/Timer/index.tsx index 612c78d12dfc6..b03e1a831f8e8 100644 --- a/superset-frontend/src/components/Timer/index.tsx +++ b/superset-frontend/src/components/Timer/index.tsx @@ -20,7 +20,7 @@ import React, { useEffect, useRef, useState } from 'react'; import { styled } from '@superset-ui/core'; import Label, { Type } from 'src/components/Label'; -import { now, fDuration } from 'src/modules/dates'; +import { now, fDuration } from 'src/utils/dates'; export interface TimerProps { endTime?: number; diff --git a/superset-frontend/src/constants.ts b/superset-frontend/src/constants.ts index 8377c5b5f1a98..60668ddcb865d 100644 --- a/superset-frontend/src/constants.ts +++ b/superset-frontend/src/constants.ts @@ -67,10 +67,22 @@ export const URL_PARAMS = { name: 'slice_id', type: 'string', }, + datasourceId: { + name: 'datasource_id', + type: 'string', + }, datasetId: { name: 'dataset_id', type: 'string', }, + datasourceType: { + name: 'datasource_type', + type: 'string', + }, + dashboardId: { + name: 'dashboard_id', + type: 'string', + }, force: { name: 'force', type: 'boolean', @@ -84,6 +96,8 @@ export const URL_PARAMS = { export const RESERVED_CHART_URL_PARAMS: string[] = [ URL_PARAMS.formDataKey.name, URL_PARAMS.sliceId.name, + URL_PARAMS.datasourceId.name, + URL_PARAMS.datasourceType.name, URL_PARAMS.datasetId.name, ]; export const RESERVED_DASHBOARD_URL_PARAMS: string[] = [ diff --git a/superset-frontend/src/dashboard/actions/sliceEntities.js b/superset-frontend/src/dashboard/actions/sliceEntities.js index 8fc85c9161ffd..be111605202cd 100644 --- a/superset-frontend/src/dashboard/actions/sliceEntities.js +++ b/superset-frontend/src/dashboard/actions/sliceEntities.js @@ -21,7 +21,6 @@ import { t, SupersetClient } from '@superset-ui/core'; import rison from 'rison'; import { addDangerToast } from 'src/components/MessageToasts/actions'; -import { getDatasourceParameter } from 'src/modules/utils'; import { getClientErrorObject } from 'src/utils/getClientErrorObject'; export const SET_ALL_SLICES = 'SET_ALL_SLICES'; @@ -39,6 +38,10 @@ export 
function fetchAllSlicesFailed(error) { return { type: FETCH_ALL_SLICES_FAILED, payload: { error } }; } +export function getDatasourceParameter(datasourceId, datasourceType) { + return `${datasourceId}__${datasourceType}`; +} + const FETCH_SLICES_PAGE_SIZE = 200; export function fetchAllSlices(userId, excludeFilterBox = false) { return (dispatch, getState) => { diff --git a/superset-frontend/src/dashboard/components/DashboardGrid.jsx b/superset-frontend/src/dashboard/components/DashboardGrid.jsx index 4be8d6bc05d0f..72f86fff3210c 100644 --- a/superset-frontend/src/dashboard/components/DashboardGrid.jsx +++ b/superset-frontend/src/dashboard/components/DashboardGrid.jsx @@ -35,6 +35,7 @@ const propTypes = { resizeComponent: PropTypes.func.isRequired, setDirectPathToChild: PropTypes.func.isRequired, width: PropTypes.number.isRequired, + dashboardId: PropTypes.number, }; const defaultProps = {}; @@ -143,6 +144,7 @@ class DashboardGrid extends React.PureComponent { editMode, canEdit, setEditMode, + dashboardId, } = this.props; const columnPlusGutterWidth = (width + GRID_GUTTER_SIZE) / GRID_COLUMN_COUNT; @@ -167,7 +169,11 @@ class DashboardGrid extends React.PureComponent { } buttonAction={() => { - window.open('/chart/add', '_blank', 'noopener noreferrer'); + window.open( + `/chart/add?dashboard_id=${dashboardId}`, + '_blank', + 'noopener noreferrer', + ); }} image="chart.svg" /> @@ -186,7 +192,11 @@ class DashboardGrid extends React.PureComponent { } buttonAction={() => { - window.open('/chart/add', '_blank', 'noopener noreferrer'); + window.open( + `/chart/add?dashboard_id=${dashboardId}`, + '_blank', + 'noopener noreferrer', + ); }} image="chart.svg" /> diff --git a/superset-frontend/src/dashboard/components/Header/Header.test.tsx b/superset-frontend/src/dashboard/components/Header/Header.test.tsx index e5851fb2d500b..730596a9f4b2b 100644 --- a/superset-frontend/src/dashboard/components/Header/Header.test.tsx +++ b/superset-frontend/src/dashboard/components/Header/Header.test.tsx @@ -122,7 +122,7 @@ function setup(props: HeaderProps, initialState = {}) { async function openActionsDropdown() { const btn = screen.getByRole('img', { name: 'more-horiz' }); userEvent.click(btn); - expect(await screen.findByRole('menu')).toBeInTheDocument(); + expect(await screen.findByTestId('header-actions-menu')).toBeInTheDocument(); } test('should render', () => { @@ -134,7 +134,9 @@ test('should render', () => { test('should render the title', () => { const mockedProps = createProps(); setup(mockedProps); - expect(screen.getByText('Dashboard Title')).toBeInTheDocument(); + expect(screen.getByTestId('editable-title')).toHaveTextContent( + 'Dashboard Title', + ); }); test('should render the editable title', () => { @@ -161,21 +163,30 @@ test('should render the "Draft" status', () => { }); test('should publish', () => { - setup(editableProps); + const mockedProps = createProps(); + const canEditProps = { + ...mockedProps, + dashboardInfo: { + ...mockedProps.dashboardInfo, + dash_edit_perm: true, + dash_save_perm: true, + }, + }; + setup(canEditProps); const draft = screen.getByText('Draft'); - expect(editableProps.savePublished).not.toHaveBeenCalled(); + expect(mockedProps.savePublished).toHaveBeenCalledTimes(0); userEvent.click(draft); - expect(editableProps.savePublished).toHaveBeenCalledTimes(1); + expect(mockedProps.savePublished).toHaveBeenCalledTimes(1); }); test('should render the "Undo" action as disabled', () => { setup(editableProps); - expect(screen.getByTitle('Undo').parentElement).toBeDisabled(); + 
expect(screen.getByTestId('undo-action').parentElement).toBeDisabled(); }); test('should undo', () => { setup(undoProps); - const undo = screen.getByTitle('Undo'); + const undo = screen.getByTestId('undo-action'); expect(undoProps.onUndo).not.toHaveBeenCalled(); userEvent.click(undo); expect(undoProps.onUndo).toHaveBeenCalledTimes(1); @@ -191,12 +202,12 @@ test('should undo with key listener', () => { test('should render the "Redo" action as disabled', () => { setup(editableProps); - expect(screen.getByTitle('Redo').parentElement).toBeDisabled(); + expect(screen.getByTestId('redo-action').parentElement).toBeDisabled(); }); test('should redo', () => { setup(redoProps); - const redo = screen.getByTitle('Redo'); + const redo = screen.getByTestId('redo-action'); expect(redoProps.onRedo).not.toHaveBeenCalled(); userEvent.click(redo); expect(redoProps.onRedo).toHaveBeenCalledTimes(1); @@ -212,7 +223,7 @@ test('should redo with key listener', () => { test('should render the "Discard changes" button', () => { setup(editableProps); - expect(screen.getByText('Discard changes')).toBeInTheDocument(); + expect(screen.getByText('Discard')).toBeInTheDocument(); }); test('should render the "Save" button as disabled', () => { @@ -297,8 +308,8 @@ test('should toggle the edit mode', () => { }, }; setup(canEditProps); - const editDashboard = screen.getByTitle('Edit dashboard'); - expect(screen.queryByTitle('Edit dashboard')).toBeInTheDocument(); + const editDashboard = screen.getByText('Edit dashboard'); + expect(screen.queryByText('Edit dashboard')).toBeInTheDocument(); userEvent.click(editDashboard); expect(mockedProps.logEvent).toHaveBeenCalled(); }); diff --git a/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/HeaderActionsDropdown.test.tsx b/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/HeaderActionsDropdown.test.tsx index 57fe7a1333973..eb3c6aeb4e973 100644 --- a/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/HeaderActionsDropdown.test.tsx +++ b/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/HeaderActionsDropdown.test.tsx @@ -29,7 +29,7 @@ import HeaderActionsDropdown from '.'; const createProps = () => ({ addSuccessToast: jest.fn(), addDangerToast: jest.fn(), - customCss: '#save-dash-split-button{margin-left: 100px;}', + customCss: '.ant-menu {margin-left: 100px;}', dashboardId: 1, dashboardInfo: { id: 1, @@ -59,7 +59,10 @@ const createProps = () => ({ userCanEdit: false, userCanSave: false, userCanShare: false, + userCanCurate: false, lastModifiedTime: 0, + isDropdownVisible: true, + dataMask: {}, }); const editModeOnProps = { ...createProps(), @@ -67,50 +70,31 @@ const editModeOnProps = { }; function setup(props: HeaderDropdownProps) { - return ( + return render(
-    </div>
+    </div>,
, + { useRedux: true }, ); } fetchMock.get('glob:*/csstemplateasyncmodelview/api/read', {}); -async function openDropdown() { - const btn = screen.getByRole('img', { name: 'more-horiz' }); - userEvent.click(btn); - expect(await screen.findByRole('menu')).toBeInTheDocument(); -} - test('should render', () => { const mockedProps = createProps(); - const { container } = render(setup(mockedProps)); + const { container } = setup(mockedProps); expect(container).toBeInTheDocument(); }); test('should render the dropdown button', () => { const mockedProps = createProps(); - render(setup(mockedProps)); + setup(mockedProps); expect(screen.getByRole('button')).toBeInTheDocument(); }); -test('should render the dropdown icon', () => { - const mockedProps = createProps(); - render(setup(mockedProps)); - expect(screen.getByRole('img', { name: 'more-horiz' })).toBeInTheDocument(); -}); - -test('should open the dropdown', async () => { - const mockedProps = createProps(); - render(setup(mockedProps)); - await openDropdown(); - expect(await screen.findByRole('menu')).toBeInTheDocument(); -}); - test('should render the menu items', async () => { const mockedProps = createProps(); - render(setup(mockedProps)); - await openDropdown(); + setup(mockedProps); expect(screen.getAllByRole('menuitem')).toHaveLength(4); expect(screen.getByText('Refresh dashboard')).toBeInTheDocument(); expect(screen.getByText('Set auto-refresh interval')).toBeInTheDocument(); @@ -119,13 +103,11 @@ test('should render the menu items', async () => { }); test('should render the menu items in edit mode', async () => { - render(setup(editModeOnProps)); - await openDropdown(); - expect(screen.getAllByRole('menuitem')).toHaveLength(5); - expect(screen.getByText('Refresh dashboard')).toBeInTheDocument(); + setup(editModeOnProps); + expect(screen.getAllByRole('menuitem')).toHaveLength(4); expect(screen.getByText('Set auto-refresh interval')).toBeInTheDocument(); expect(screen.getByText('Set filter mapping')).toBeInTheDocument(); - expect(screen.getByText('Edit dashboard properties')).toBeInTheDocument(); + expect(screen.getByText('Edit properties')).toBeInTheDocument(); expect(screen.getByText('Edit CSS')).toBeInTheDocument(); }); @@ -135,10 +117,9 @@ test('should show the share actions', async () => { ...mockedProps, userCanShare: true, }; - render(setup(canShareProps)); - await openDropdown(); - expect(screen.getByText('Copy permalink to clipboard')).toBeInTheDocument(); - expect(screen.getByText('Share permalink by email')).toBeInTheDocument(); + setup(canShareProps); + + expect(screen.getByText('Share')).toBeInTheDocument(); }); test('should render the "Save Modal" when user can save', async () => { @@ -147,15 +128,13 @@ test('should render the "Save Modal" when user can save', async () => { ...mockedProps, userCanSave: true, }; - render(setup(canSaveProps)); - await openDropdown(); + setup(canSaveProps); expect(screen.getByText('Save as')).toBeInTheDocument(); }); test('should NOT render the "Save Modal" menu item when user cannot save', async () => { const mockedProps = createProps(); - render(setup(mockedProps)); - await openDropdown(); + setup(mockedProps); expect(screen.queryByText('Save as')).not.toBeInTheDocument(); }); @@ -165,43 +144,41 @@ test('should render the "Refresh dashboard" menu item as disabled when loading', ...mockedProps, isLoading: true, }; - render(setup(loadingProps)); - await openDropdown(); + setup(loadingProps); expect(screen.getByText('Refresh dashboard')).toHaveClass( - 'ant-dropdown-menu-item-disabled', + 
'ant-menu-item-disabled', ); }); test('should NOT render the "Refresh dashboard" menu item as disabled', async () => { const mockedProps = createProps(); - render(setup(mockedProps)); - await openDropdown(); + setup(mockedProps); expect(screen.getByText('Refresh dashboard')).not.toHaveClass( - 'ant-dropdown-menu-item-disabled', + 'ant-menu-item-disabled', ); }); test('should render with custom css', () => { const mockedProps = createProps(); const { customCss } = mockedProps; - render(setup(mockedProps)); + setup(mockedProps); injectCustomCss(customCss); - expect(screen.getByRole('button')).toHaveStyle('margin-left: 100px'); + expect(screen.getByTestId('header-actions-menu')).toHaveStyle( + 'margin-left: 100px', + ); }); test('should refresh the charts', async () => { const mockedProps = createProps(); - render(setup(mockedProps)); - await openDropdown(); + setup(mockedProps); userEvent.click(screen.getByText('Refresh dashboard')); expect(mockedProps.forceRefreshAllCharts).toHaveBeenCalledTimes(1); expect(mockedProps.addSuccessToast).toHaveBeenCalledTimes(1); }); test('should show the properties modal', async () => { - render(setup(editModeOnProps)); - await openDropdown(); - userEvent.click(screen.getByText('Edit dashboard properties')); + setup(editModeOnProps); + userEvent.click(screen.getByText('Edit properties')); expect(editModeOnProps.showPropertiesModal).toHaveBeenCalledTimes(1); }); diff --git a/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx b/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx index ad3dd91ec7ee5..a7860af30f378 100644 --- a/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx +++ b/superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx @@ -19,16 +19,15 @@ import React from 'react'; import PropTypes from 'prop-types'; -import { styled, SupersetClient, t } from '@superset-ui/core'; +import { SupersetClient, t } from '@superset-ui/core'; import { Menu } from 'src/components/Menu'; -import { NoAnimationDropdown } from 'src/components/Dropdown'; -import Icons from 'src/components/Icons'; import { URL_PARAMS } from 'src/constants'; import ShareMenuItems from 'src/dashboard/components/menu/ShareMenuItems'; import CssEditor from 'src/dashboard/components/CssEditor'; import RefreshIntervalModal from 'src/dashboard/components/RefreshIntervalModal'; import SaveModal from 'src/dashboard/components/SaveModal'; +import HeaderReportDropdown from 'src/components/ReportModal/HeaderReportDropdown'; import injectCustomCss from 'src/dashboard/util/injectCustomCss'; import { SAVE_TYPE_NEWDASHBOARD } from 'src/dashboard/util/constants'; import FilterScopeModal from 'src/dashboard/components/filterscope/FilterScopeModal'; @@ -91,15 +90,9 @@ const MENU_KEYS = { DOWNLOAD_AS_IMAGE: 'download-as-image', TOGGLE_FULLSCREEN: 'toggle-fullscreen', MANAGE_EMBEDDED: 'manage-embedded', + MANAGE_EMAIL_REPORT: 'manage-email-report', }; -const DropdownButton = styled.div` - margin-left: ${({ theme }) => theme.gridUnit * 2.5}px; - span { - color: ${({ theme }) => theme.colors.grayscale.base}; - } -`; - const SCREENSHOT_NODE_SELECTOR = '.dashboard'; class HeaderActionsDropdown extends React.PureComponent { @@ -112,11 +105,13 @@ class HeaderActionsDropdown extends React.PureComponent { this.state = { css: props.customCss, cssTemplates: [], + showReportSubMenu: null, }; this.changeCss = this.changeCss.bind(this); this.changeRefreshInterval = this.changeRefreshInterval.bind(this); 
this.handleMenuClick = this.handleMenuClick.bind(this); + this.setShowReportSubMenu = this.setShowReportSubMenu.bind(this); } UNSAFE_componentWillMount() { @@ -144,6 +139,12 @@ class HeaderActionsDropdown extends React.PureComponent { } } + setShowReportSubMenu(show) { + this.setState({ + showReportSubMenu: show, + }); + } + changeCss(css) { this.props.onChange(); this.props.updateCss(css); @@ -224,6 +225,9 @@ class HeaderActionsDropdown extends React.PureComponent { addSuccessToast, addDangerToast, filterboxMigrationState, + setIsDropdownVisible, + isDropdownVisible, + ...rest } = this.props; const emailTitle = t('Superset dashboard'); @@ -236,12 +240,47 @@ class HeaderActionsDropdown extends React.PureComponent { hash: window.location.hash, }); - const menu = ( - + return ( + + {!editMode && ( + + {t('Refresh dashboard')} + + )} + {!editMode && ( + + {getUrlParam(URL_PARAMS.standalone) + ? t('Exit fullscreen') + : t('Enter fullscreen')} + + )} + {editMode && ( + + {t('Edit properties')} + + )} + {editMode && ( + + {t('Edit CSS')}} + initialCss={this.state.css} + templates={this.state.cssTemplates} + onChange={this.changeCss} + /> + + )} + {userCanSave && ( )} + {!editMode && ( + + {t('Download as image')} + + )} {userCanShare && ( - + + + + )} + {!editMode && userCanCurate && ( + + {t('Embed dashboard')} + )} - - {t('Refresh dashboard')} - + {!editMode ? ( + this.state.showReportSubMenu ? ( + <> + + + + + + ) : ( + + + + ) + ) : null} + {editMode && + filterboxMigrationState !== FILTER_BOX_MIGRATION_STATES.CONVERTED && ( + + + + )} + {t('Set auto-refresh interval')}} /> - - {editMode && - filterboxMigrationState !== FILTER_BOX_MIGRATION_STATES.CONVERTED && ( - - - - )} - - {editMode && ( - - {t('Edit dashboard properties')} - - )} - - {editMode && ( - - {t('Edit CSS')}} - initialCss={this.state.css} - templates={this.state.cssTemplates} - onChange={this.changeCss} - /> - - )} - - {!editMode && userCanCurate && ( - - {t('Embed dashboard')} - - )} - - {!editMode && ( - - {t('Download as image')} - - )} - - {!editMode && ( - - {getUrlParam(URL_PARAMS.standalone) - ? 
t('Exit fullscreen') - : t('Enter fullscreen')} - - )} ); - return ( - - triggerNode.closest('.dashboard-header') - } - > - - - - - ); } } diff --git a/superset-frontend/src/dashboard/components/Header/index.jsx b/superset-frontend/src/dashboard/components/Header/index.jsx index f60a0d85d7ae2..ab85bcda04f55 100644 --- a/superset-frontend/src/dashboard/components/Header/index.jsx +++ b/superset-frontend/src/dashboard/components/Header/index.jsx @@ -20,8 +20,8 @@ import moment from 'moment'; import React from 'react'; import PropTypes from 'prop-types'; -import { styled, t, getSharedLabelColor } from '@superset-ui/core'; -import ButtonGroup from 'src/components/ButtonGroup'; +import { styled, css, t, getSharedLabelColor } from '@superset-ui/core'; +import { Global } from '@emotion/react'; import { isFeatureEnabled, FeatureFlag } from 'src/featureFlags'; import { LOG_ACTIONS_PERIODIC_RENDER_DASHBOARD, @@ -30,11 +30,10 @@ import { } from 'src/logger/LogUtils'; import Icons from 'src/components/Icons'; import Button from 'src/components/Button'; -import EditableTitle from 'src/components/EditableTitle'; -import FaveStar from 'src/components/FaveStar'; +import { AntdButton } from 'src/components/'; +import { Tooltip } from 'src/components/Tooltip'; import { safeStringify } from 'src/utils/safeStringify'; import HeaderActionsDropdown from 'src/dashboard/components/Header/HeaderActionsDropdown'; -import HeaderReportDropdown from 'src/components/ReportModal/HeaderReportDropdown'; import PublishedStatus from 'src/dashboard/components/PublishedStatus'; import UndoRedoKeyListeners from 'src/dashboard/components/UndoRedoKeyListeners'; import PropertiesModal from 'src/dashboard/components/PropertiesModal'; @@ -51,6 +50,7 @@ import setPeriodicRunner, { import { options as PeriodicRefreshOptions } from 'src/dashboard/components/RefreshIntervalModal'; import findPermission from 'src/dashboard/util/findPermission'; import { FILTER_BOX_MIGRATION_STATES } from 'src/explore/constants'; +import { PageHeaderWithActions } from 'src/components/PageHeaderWithActions'; import { DashboardEmbedModal } from '../DashboardEmbedControls'; const propTypes = { @@ -107,34 +107,59 @@ const defaultProps = { colorScheme: undefined, }; -// Styled Components -const StyledDashboardHeader = styled.div` - background: ${({ theme }) => theme.colors.grayscale.light5}; +const headerContainerStyle = theme => css` + border-bottom: 1px solid ${theme.colors.grayscale.light2}; +`; + +const editButtonStyle = theme => css` + color: ${theme.colors.primary.dark2}; +`; + +const actionButtonsStyle = theme => css` display: flex; - flex-direction: row; align-items: center; - justify-content: space-between; - padding: 0 ${({ theme }) => theme.gridUnit * 6}px; - border-bottom: 1px solid ${({ theme }) => theme.colors.grayscale.light2}; - .action-button > span { - color: ${({ theme }) => theme.colors.grayscale.base}; + .action-schedule-report { + margin-left: ${theme.gridUnit * 2}px; } - button, - .fave-unfave-icon { - margin-left: ${({ theme }) => theme.gridUnit * 2}px; + + .undoRedo { + margin-right: ${theme.gridUnit * 2}px; } - .button-container { - display: flex; - flex-direction: row; - flex-wrap: nowrap; - .action-button { - font-size: ${({ theme }) => theme.typography.sizes.xl}px; - margin-left: ${({ theme }) => theme.gridUnit * 2.5}px; - } +`; + +const StyledUndoRedoButton = styled(AntdButton)` + padding: 0; + &:hover { + background: transparent; + } +`; + +const undoRedoStyle = theme => css` + color: ${theme.colors.grayscale.light1}; + &:hover 
{ + color: ${theme.colors.grayscale.base}; } `; +const undoRedoEmphasized = theme => css` + color: ${theme.colors.grayscale.base}; +`; + +const undoRedoDisabled = theme => css` + color: ${theme.colors.grayscale.light2}; +`; + +const saveBtnStyle = theme => css` + min-width: ${theme.gridUnit * 17}px; + height: ${theme.gridUnit * 8}px; +`; + +const discardBtnStyle = theme => css` + min-width: ${theme.gridUnit * 22}px; + height: ${theme.gridUnit * 8}px; +`; + class Header extends React.PureComponent { static discardChanges() { const url = new URL(window.location.href); @@ -148,7 +173,9 @@ class Header extends React.PureComponent { this.state = { didNotifyMaxUndoHistoryToast: false, emphasizeUndo: false, + emphasizeRedo: false, showingPropertiesModal: false, + isDropdownVisible: false, }; this.handleChangeText = this.handleChangeText.bind(this); @@ -160,6 +187,7 @@ class Header extends React.PureComponent { this.overwriteDashboard = this.overwriteDashboard.bind(this); this.showPropertiesModal = this.showPropertiesModal.bind(this); this.hidePropertiesModal = this.hidePropertiesModal.bind(this); + this.setIsDropdownVisible = this.setIsDropdownVisible.bind(this); } componentDidMount() { @@ -205,6 +233,12 @@ class Header extends React.PureComponent { } } + setIsDropdownVisible(visible) { + this.setState({ + isDropdownVisible: visible, + }); + } + handleCtrlY() { this.props.onRedo(); this.setState({ emphasizeRedo: true }, () => { @@ -450,180 +484,214 @@ class Header extends React.PureComponent { }; return ( - -
- - - {user?.userId && dashboardInfo?.id && ( - - )} -
- -
- {userCanSaveAs && ( -
- {editMode && ( - <> - - + } + rightPanelAdditionalItems={ +
+ {userCanSaveAs && ( +
+ {editMode && ( +
+
+ + + + + + + + + + +
+ + +
+ )} +
+ )} + {editMode ? ( + + ) : ( +
+ {userCanEdit && ( - - - - + )} +
)}
- )} - {editMode ? ( - - ) : ( - <> - {userCanEdit && ( - - - - )} - - - )} - - {this.state.showingPropertiesModal && ( - + triggerNode.closest('.header-with-actions'), + visible: this.state.isDropdownVisible, + onVisibleChange: this.setIsDropdownVisible, + }} + additionalActionsMenu={ + - )} - - {userCanCurate && ( - - )} - - + {this.state.showingPropertiesModal && ( + + )} + + {userCanCurate && ( + -
- + )} + +
); } } diff --git a/superset-frontend/src/dashboard/components/RefreshIntervalModal.test.tsx b/superset-frontend/src/dashboard/components/RefreshIntervalModal.test.tsx index 9f2f68a9d6451..9151275e800de 100644 --- a/superset-frontend/src/dashboard/components/RefreshIntervalModal.test.tsx +++ b/superset-frontend/src/dashboard/components/RefreshIntervalModal.test.tsx @@ -68,7 +68,8 @@ describe('RefreshIntervalModal - Enzyme', () => { const createProps = () => ({ addSuccessToast: jest.fn(), addDangerToast: jest.fn(), - customCss: '#save-dash-split-button{margin-left: 100px;}', + customCss: + '.header-with-actions .right-button-panel .ant-dropdown-trigger{margin-left: 100px;}', dashboardId: 1, dashboardInfo: { id: 1, @@ -100,6 +101,7 @@ const createProps = () => ({ userCanSave: false, userCanShare: false, lastModifiedTime: 0, + isDropdownVisible: true, }); const editModeOnProps = { @@ -116,9 +118,6 @@ const setup = (overrides?: any) => ( fetchMock.get('glob:*/csstemplateasyncmodelview/api/read', {}); const openRefreshIntervalModal = async () => { - const headerActionsButton = screen.getByRole('img', { name: 'more-horiz' }); - userEvent.click(headerActionsButton); - const autoRefreshOption = screen.getByText('Set auto-refresh interval'); userEvent.click(autoRefreshOption); }; diff --git a/superset-frontend/src/dashboard/components/SliceAdder.jsx b/superset-frontend/src/dashboard/components/SliceAdder.jsx index eeb83d7c56e46..22f8038ee494c 100644 --- a/superset-frontend/src/dashboard/components/SliceAdder.jsx +++ b/superset-frontend/src/dashboard/components/SliceAdder.jsx @@ -58,6 +58,7 @@ const propTypes = { editMode: PropTypes.bool, height: PropTypes.number, filterboxMigrationState: FILTER_BOX_MIGRATION_STATES, + dashboardId: PropTypes.number, }; const defaultProps = { @@ -276,7 +277,11 @@ class SliceAdder extends React.Component { buttonStyle="link" buttonSize="xsmall" onClick={() => - window.open('/chart/add', '_blank', 'noopener noreferrer') + window.open( + `/chart/add?dashboard_id=${this.props.dashboardId}`, + '_blank', + 'noopener noreferrer', + ) } > diff --git a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx index 31c8246b19d5f..e5d19e931c58d 100644 --- a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx +++ b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx @@ -272,6 +272,7 @@ export default class Chart extends React.Component { : undefined; const key = await postFormData( this.props.datasource.id, + this.props.datasource.type, this.props.formData, this.props.slice.slice_id, nextTabId, @@ -299,6 +300,7 @@ export default class Chart extends React.Component { resultType: 'full', resultFormat: 'csv', force: true, + ownState: this.props.ownState, }); } diff --git a/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx b/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx index f240d6f525587..d8312cd60a9d0 100644 --- a/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx +++ b/superset-frontend/src/dashboard/components/gridComponents/Tab.jsx @@ -150,6 +150,7 @@ class Tab extends React.PureComponent { isComponentVisible, canEdit, setEditMode, + dashboardId, } = this.props; const shouldDisplayEmptyState = tabComponent.children.length === 0; @@ -183,7 +184,7 @@ class Tab extends React.PureComponent { {t('You can')}{' '} diff --git a/superset-frontend/src/dashboard/components/gridComponents/Tab.test.tsx 
b/superset-frontend/src/dashboard/components/gridComponents/Tab.test.tsx index 82aab17014351..d995595c49100 100644 --- a/superset-frontend/src/dashboard/components/gridComponents/Tab.test.tsx +++ b/superset-frontend/src/dashboard/components/gridComponents/Tab.test.tsx @@ -294,5 +294,5 @@ test('Render tab content with no children, editMode: true, canEdit: true', () => ).toBeVisible(); expect( screen.getByRole('link', { name: 'create a new chart' }), - ).toHaveAttribute('href', '/chart/add'); + ).toHaveAttribute('href', '/chart/add?dashboard_id=23'); }); diff --git a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx index 579f9d4b69077..498009224a5e7 100644 --- a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx +++ b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx @@ -102,9 +102,10 @@ test('Click on "Copy dashboard URL" and succeed', async () => { userEvent.click(screen.getByRole('button', { name: 'Copy dashboard URL' })); - await waitFor(() => { + await waitFor(async () => { expect(spy).toBeCalledTimes(1); - expect(spy).toBeCalledWith('http://localhost/superset/dashboard/p/123/'); + const value = await spy.mock.calls[0][0](); + expect(value).toBe('http://localhost/superset/dashboard/p/123/'); expect(props.addSuccessToast).toBeCalledTimes(1); expect(props.addSuccessToast).toBeCalledWith('Copied to clipboard!'); expect(props.addDangerToast).toBeCalledTimes(0); @@ -128,9 +129,10 @@ test('Click on "Copy dashboard URL" and fail', async () => { userEvent.click(screen.getByRole('button', { name: 'Copy dashboard URL' })); - await waitFor(() => { + await waitFor(async () => { expect(spy).toBeCalledTimes(1); - expect(spy).toBeCalledWith('http://localhost/superset/dashboard/p/123/'); + const value = await spy.mock.calls[0][0](); + expect(value).toBe('http://localhost/superset/dashboard/p/123/'); expect(props.addSuccessToast).toBeCalledTimes(0); expect(props.addDangerToast).toBeCalledTimes(1); expect(props.addDangerToast).toBeCalledWith( diff --git a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx index b196100734cc3..f9016e5263c02 100644 --- a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx +++ b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx @@ -64,8 +64,7 @@ const ShareMenuItems = (props: ShareMenuItemProps) => { async function onCopyLink() { try { - const url = await generateUrl(); - await copyTextToClipboard(url); + await copyTextToClipboard(generateUrl); addSuccessToast(t('Copied to clipboard!')); } catch (error) { logging.error(error); @@ -87,7 +86,7 @@ const ShareMenuItems = (props: ShareMenuItemProps) => { } return ( - <> +
{copyMenuItemTitle} @@ -98,7 +97,7 @@ const ShareMenuItems = (props: ShareMenuItemProps) => { {emailMenuItemTitle}
-    </>
+    </Menu>
); }; diff --git a/superset-frontend/src/dashboard/containers/DashboardGrid.jsx b/superset-frontend/src/dashboard/containers/DashboardGrid.jsx index 96688476112cd..ca214fe878765 100644 --- a/superset-frontend/src/dashboard/containers/DashboardGrid.jsx +++ b/superset-frontend/src/dashboard/containers/DashboardGrid.jsx @@ -30,6 +30,7 @@ function mapStateToProps({ dashboardState, dashboardInfo }) { return { editMode: dashboardState.editMode, canEdit: dashboardInfo.dash_edit_perm, + dashboardId: dashboardInfo.id, }; } diff --git a/superset-frontend/src/dashboard/containers/SliceAdder.jsx b/superset-frontend/src/dashboard/containers/SliceAdder.jsx index 8c02a4a360e7f..078ded23d8ac8 100644 --- a/superset-frontend/src/dashboard/containers/SliceAdder.jsx +++ b/superset-frontend/src/dashboard/containers/SliceAdder.jsx @@ -29,6 +29,7 @@ function mapStateToProps( return { height: ownProps.height, userId: dashboardInfo.userId, + dashboardId: dashboardInfo.id, selectedSliceIds: dashboardState.sliceIds, slices: sliceEntities.slices, isLoading: sliceEntities.isLoading, diff --git a/superset-frontend/src/dashboard/stylesheets/components/header.less b/superset-frontend/src/dashboard/stylesheets/components/header.less index 7db5924b71265..355385d373fd6 100644 --- a/superset-frontend/src/dashboard/stylesheets/components/header.less +++ b/superset-frontend/src/dashboard/stylesheets/components/header.less @@ -55,11 +55,6 @@ color: @almost-black; } -.dashboard-header .dashboard-component-header { - font-weight: @font-weight-normal; - width: auto; -} - .dashboard--editing /* note: sizes should be a multiple of the 8px grid unit so that rows in the grid align */ diff --git a/superset-frontend/src/dashboard/stylesheets/dashboard.less b/superset-frontend/src/dashboard/stylesheets/dashboard.less index b9b2b0aab92f8..cdbdeb6481579 100644 --- a/superset-frontend/src/dashboard/stylesheets/dashboard.less +++ b/superset-frontend/src/dashboard/stylesheets/dashboard.less @@ -19,6 +19,7 @@ /* header has mysterious extra margin */ header.top { margin-bottom: 2px; + z-index: 10; } body { @@ -150,27 +151,6 @@ body { margin: 0 20px; } -.dashboard-header .dashboard-component-header { - display: flex; - flex-direction: row; - align-items: center; - - .editable-title { - margin-right: 8px; - } - - .favstar { - font-size: @font-size-xl; - position: relative; - margin-left: 8px; - } - - .publish { - position: relative; - margin-left: 8px; - } -} - .slice_container .alert { margin: 10px; } diff --git a/superset-frontend/src/embedded/index.tsx b/superset-frontend/src/embedded/index.tsx index 52e0aee8d29b5..c28d416a18aef 100644 --- a/superset-frontend/src/embedded/index.tsx +++ b/superset-frontend/src/embedded/index.tsx @@ -19,7 +19,7 @@ import React, { lazy, Suspense } from 'react'; import ReactDOM from 'react-dom'; import { BrowserRouter as Router, Route } from 'react-router-dom'; -import { makeApi, t } from '@superset-ui/core'; +import { makeApi, t, logging } from '@superset-ui/core'; import { Switchboard } from '@superset-ui/switchboard'; import { bootstrapData } from 'src/preamble'; import setupClient from 'src/setup/setupClient'; @@ -35,7 +35,7 @@ const debugMode = process.env.WEBPACK_MODE === 'development'; function log(...info: unknown[]) { if (debugMode) { - console.debug(`[superset]`, ...info); + logging.debug(`[superset]`, ...info); } } @@ -69,16 +69,16 @@ const appMountPoint = document.getElementById('app')!; const MESSAGE_TYPE = '__embedded_comms__'; +function showFailureMessage(message: string) { + appMountPoint.innerHTML 
= message; +} + if (!window.parent || window.parent === window) { showFailureMessage( 'This page is intended to be embedded in an iframe, but it looks like that is not the case.', ); } -function showFailureMessage(message: string) { - appMountPoint.innerHTML = message; -} - // if the page is embedded in an origin that hasn't // been authorized by the curator, we forbid access entirely. // todo: check the referrer on the route serving this page instead @@ -134,7 +134,7 @@ function start() { }, err => { // something is most likely wrong with the guest token - console.error(err); + logging.error(err); showFailureMessage( 'Something went wrong with embedded authentication. Check the dev console for details.', ); diff --git a/superset-frontend/src/explore/actions/exploreActions.ts b/superset-frontend/src/explore/actions/exploreActions.ts index 8e73b32a9cd63..fe45d0b63e110 100644 --- a/superset-frontend/src/explore/actions/exploreActions.ts +++ b/superset-frontend/src/explore/actions/exploreActions.ts @@ -140,32 +140,6 @@ export function sliceUpdated(slice: Slice) { return { type: SLICE_UPDATED, slice }; } -export const SET_ORIGINAL_FORMATTED_TIME_COLUMN = - 'SET_ORIGINAL_FORMATTED_TIME_COLUMN'; -export function setOriginalFormattedTimeColumn( - datasourceId: string, - columnName: string, -) { - return { - type: SET_ORIGINAL_FORMATTED_TIME_COLUMN, - datasourceId, - columnName, - }; -} - -export const UNSET_ORIGINAL_FORMATTED_TIME_COLUMN = - 'UNSET_ORIGINAL_FORMATTED_TIME_COLUMN'; -export function unsetOriginalFormattedTimeColumn( - datasourceId: string, - columnIndex: number, -) { - return { - type: UNSET_ORIGINAL_FORMATTED_TIME_COLUMN, - datasourceId, - columnIndex, - }; -} - export const SET_FORCE_QUERY = 'SET_FORCE_QUERY'; export function setForceQuery(force: boolean) { return { @@ -189,8 +163,6 @@ export const exploreActions = { updateChartTitle, createNewSlice, sliceUpdated, - setOriginalFormattedTimeColumn, - unsetOriginalFormattedTimeColumn, setForceQuery, }; diff --git a/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx b/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx index 2ce91590b9890..a158bfd7ed518 100644 --- a/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx +++ b/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx @@ -18,7 +18,7 @@ */ import userEvent from '@testing-library/user-event'; import React from 'react'; -import { render, screen } from 'spec/helpers/testing-library'; +import { render, screen, waitFor } from 'spec/helpers/testing-library'; import { CopyToClipboardButton } from '.'; test('Render a button', () => { @@ -28,14 +28,26 @@ test('Render a button', () => { expect(screen.getByRole('button')).toBeInTheDocument(); }); -test('Should copy to clipboard', () => { - document.execCommand = jest.fn(); +test('Should copy to clipboard', async () => { + const callback = jest.fn(); + document.execCommand = callback; + + const originalClipboard = { ...global.navigator.clipboard }; + // @ts-ignore + global.navigator.clipboard = { write: callback, writeText: callback }; render(, { useRedux: true, }); - expect(document.execCommand).toHaveBeenCalledTimes(0); + expect(callback).toHaveBeenCalledTimes(0); userEvent.click(screen.getByRole('button')); - expect(document.execCommand).toHaveBeenCalledWith('copy'); + + await waitFor(() => { + expect(callback).toHaveBeenCalled(); + }); + + jest.resetAllMocks(); + // @ts-ignore + 
global.navigator.clipboard = originalClipboard; }); diff --git a/superset-frontend/src/explore/components/DataTableControl/index.tsx b/superset-frontend/src/explore/components/DataTableControl/index.tsx index cc379eda63602..fb8af865a3914 100644 --- a/superset-frontend/src/explore/components/DataTableControl/index.tsx +++ b/superset-frontend/src/explore/components/DataTableControl/index.tsx @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -import React, { useCallback, useMemo } from 'react'; +import React, { useMemo, useState, useEffect } from 'react'; import { css, GenericDataType, @@ -29,7 +29,6 @@ import { import { Global } from '@emotion/react'; import { Column } from 'react-table'; import debounce from 'lodash/debounce'; -import { useDispatch } from 'react-redux'; import { Space } from 'src/components'; import { Input } from 'src/components/Input'; import { @@ -45,10 +44,7 @@ import Popover from 'src/components/Popover'; import { prepareCopyToClipboardTabularData } from 'src/utils/common'; import CopyToClipboard from 'src/components/CopyToClipboard'; import RowCountLabel from 'src/explore/components/RowCountLabel'; -import { - setOriginalFormattedTimeColumn, - unsetOriginalFormattedTimeColumn, -} from 'src/explore/actions/exploreActions'; +import { getTimeColumns, setTimeColumns } from './utils'; export const CellNull = styled('span')` color: ${({ theme }) => theme.colors.grayscale.light1}; @@ -130,8 +126,8 @@ export const RowCount = ({ }) => ; enum FormatPickerValue { - Formatted, - Original, + Formatted = 'formatted', + Original = 'original', } const FormatPicker = ({ @@ -165,47 +161,26 @@ const FormatPickerLabel = styled.span` const DataTableTemporalHeaderCell = ({ columnName, + onTimeColumnChange, datasourceId, - originalFormattedTimeColumnIndex, }: { columnName: string; + onTimeColumnChange: ( + columnName: string, + columnType: FormatPickerValue, + ) => void; datasourceId?: string; - originalFormattedTimeColumnIndex: number; }) => { const theme = useTheme(); - const dispatch = useDispatch(); - const isTimeColumnOriginalFormatted = originalFormattedTimeColumnIndex > -1; - - const onChange = useCallback( - e => { - if (!datasourceId) { - return; - } - if ( - e.target.value === FormatPickerValue.Original && - !isTimeColumnOriginalFormatted - ) { - dispatch(setOriginalFormattedTimeColumn(datasourceId, columnName)); - } else if ( - e.target.value === FormatPickerValue.Formatted && - isTimeColumnOriginalFormatted - ) { - dispatch( - unsetOriginalFormattedTimeColumn( - datasourceId, - originalFormattedTimeColumnIndex, - ), - ); - } - }, - [ - originalFormattedTimeColumnIndex, - columnName, - datasourceId, - dispatch, - isTimeColumnOriginalFormatted, - ], + const [isOriginalTimeColumn, setIsOriginalTimeColumn] = useState( + getTimeColumns(datasourceId).includes(columnName), ); + + const onChange = (e: any) => { + onTimeColumnChange(columnName, e.target.value); + setIsOriginalTimeColumn(getTimeColumns(datasourceId).includes(columnName)); + }; + const overlayContent = useMemo( () => datasourceId ? ( // eslint-disable-next-line jsx-a11y/no-static-element-interactions @@ -222,14 +197,14 @@ const DataTableTemporalHeaderCell = ({ ) : null, - [datasourceId, isTimeColumnOriginalFormatted, onChange], + [datasourceId, isOriginalTimeColumn], ); return datasourceId ? 
( @@ -288,10 +263,45 @@ export const useTableColumns = ( coltypes?: GenericDataType[], data?: Record[], datasourceId?: string, - originalFormattedTimeColumns: string[] = [], + isVisible?: boolean, moreConfigs?: { [key: string]: Partial }, -) => - useMemo( +) => { + const [originalFormattedTimeColumns, setOriginalFormattedTimeColumns] = + useState(getTimeColumns(datasourceId)); + + const onTimeColumnChange = ( + columnName: string, + columnType: FormatPickerValue, + ) => { + if (!datasourceId) { + return; + } + if ( + columnType === FormatPickerValue.Original && + !originalFormattedTimeColumns.includes(columnName) + ) { + const cols = getTimeColumns(datasourceId); + cols.push(columnName); + setTimeColumns(datasourceId, cols); + setOriginalFormattedTimeColumns(cols); + } else if ( + columnType === FormatPickerValue.Formatted && + originalFormattedTimeColumns.includes(columnName) + ) { + const cols = getTimeColumns(datasourceId); + cols.splice(cols.indexOf(columnName), 1); + setTimeColumns(datasourceId, cols); + setOriginalFormattedTimeColumns(cols); + } + }; + + useEffect(() => { + if (isVisible) { + setOriginalFormattedTimeColumns(getTimeColumns(datasourceId)); + } + }, [datasourceId, isVisible]); + + return useMemo( () => colnames && data?.length ? colnames @@ -313,9 +323,7 @@ export const useTableColumns = ( ) : ( key @@ -352,3 +360,4 @@ export const useTableColumns = ( originalFormattedTimeColumns, ], ); +}; diff --git a/superset-frontend/src/explore/components/DataTableControl/useTableColumns.test.ts b/superset-frontend/src/explore/components/DataTableControl/useTableColumns.test.ts index 0a3a73e6f09ef..8e51732c2e825 100644 --- a/superset-frontend/src/explore/components/DataTableControl/useTableColumns.test.ts +++ b/superset-frontend/src/explore/components/DataTableControl/useTableColumns.test.ts @@ -107,7 +107,7 @@ test('useTableColumns with no options', () => { name: 'DataTableTemporalHeaderCell', }), props: expect.objectContaining({ - originalFormattedTimeColumnIndex: -1, + onTimeColumnChange: expect.any(Function), }), }), accessor: expect.any(Function), @@ -135,7 +135,7 @@ test('useTableColumns with no options', () => { test('useTableColumns with options', () => { const hook = renderHook(() => - useTableColumns(colnames, coltypes, data, undefined, [], { + useTableColumns(colnames, coltypes, data, undefined, true, { col01: { Header: 'Header' }, }), ); @@ -171,7 +171,7 @@ test('useTableColumns with options', () => { name: 'DataTableTemporalHeaderCell', }), props: expect.objectContaining({ - originalFormattedTimeColumnIndex: -1, + onTimeColumnChange: expect.any(Function), }), }), accessor: expect.any(Function), diff --git a/superset-frontend/src/explore/components/DataTableControl/utils.ts b/superset-frontend/src/explore/components/DataTableControl/utils.ts new file mode 100644 index 0000000000000..f8e8e42e31d43 --- /dev/null +++ b/superset-frontend/src/explore/components/DataTableControl/utils.ts @@ -0,0 +1,49 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import { ensureIsArray } from '@superset-ui/core'; +import { + LocalStorageKeys, + setItem, + getItem, +} from 'src/utils/localStorageHelpers'; + +export const getTimeColumns = (datasourceId?: string): string[] => { + const colsMap = getItem( + LocalStorageKeys.explore__data_table_original_formatted_time_columns, + {}, + ); + if (datasourceId === undefined) { + return []; + } + return ensureIsArray(colsMap[datasourceId]); +}; + +export const setTimeColumns = (datasourceId: string, columns: string[]) => { + const colsMap = getItem( + LocalStorageKeys.explore__data_table_original_formatted_time_columns, + {}, + ); + setItem( + LocalStorageKeys.explore__data_table_original_formatted_time_columns, + { + ...colsMap, + [datasourceId]: columns, + }, + ); +}; diff --git a/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx b/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx index cb95e29fd091c..57d599ee82b9a 100644 --- a/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx +++ b/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx @@ -26,6 +26,9 @@ import { screen, waitForElementToBeRemoved, } from 'spec/helpers/testing-library'; +import { DatasourceType } from '@superset-ui/core'; +import { exploreActions } from 'src/explore/actions/exploreActions'; +import { ChartStatus } from 'src/explore/types'; import { DataTablesPane } from '.'; const createProps = () => ({ @@ -55,13 +58,23 @@ const createProps = () => ({ extra_form_data: {}, }, queryForce: false, - chartStatus: 'rendered', + chartStatus: 'rendered' as ChartStatus, onCollapseChange: jest.fn(), queriesResponse: [ { colnames: [], }, ], + datasource: { + id: 0, + name: '', + type: DatasourceType.Table, + columns: [], + metrics: [], + columnFormats: {}, + verboseMap: {}, + }, + actions: exploreActions, }); describe('DataTablesPane', () => { @@ -138,7 +151,7 @@ describe('DataTablesPane', () => { { expect(await screen.findByText('1 row')).toBeVisible(); userEvent.click(screen.getByLabelText('Copy')); - expect(copyToClipboardSpy).toHaveBeenCalledWith( - '2009-01-01 00:00:00\tAction\n', - ); + expect(copyToClipboardSpy).toHaveBeenCalledTimes(1); + const value = await copyToClipboardSpy.mock.calls[0][0](); + expect(value).toBe('2009-01-01 00:00:00\tAction\n'); copyToClipboardSpy.mockRestore(); fetchMock.restore(); }); @@ -190,7 +203,7 @@ describe('DataTablesPane', () => { ` + position: relative; + background-color: ${theme.colors.grayscale.light5}; + z-index: 5; + overflow: hidden; + + .ant-tabs { + height: 100%; + } + + .ant-tabs-content-holder { + height: 100%; + } + + .ant-tabs-content { + height: 100%; + } + + .ant-tabs-tabpane { + display: flex; + flex-direction: column; + height: 100%; + + .table-condensed { + height: 100%; + overflow: auto; + margin-bottom: ${theme.gridUnit * 4}px; + + .table { + margin-bottom: ${theme.gridUnit * 2}px; + } + } + + .pagination-container > ul[role='navigation'] { + margin-top: 0; + } + } + `} +`; + +export const DataTablesPane = ({ + queryFormData, + datasource, + queryForce, + 
onCollapseChange, + chartStatus, + ownState, + errorMessage, + actions, +}: DataTablesPaneProps) => { + const theme = useTheme(); + const [activeTabKey, setActiveTabKey] = useState(ResultTypes.Results); + const [isRequest, setIsRequest] = useState>({ + results: false, + samples: false, + }); + const [panelOpen, setPanelOpen] = useState( + getItem(LocalStorageKeys.is_datapanel_open, false), + ); + + useEffect(() => { + setItem(LocalStorageKeys.is_datapanel_open, panelOpen); + }, [panelOpen]); + + useEffect(() => { + if (!panelOpen) { + setIsRequest({ + results: false, + samples: false, + }); + } + + if ( + panelOpen && + activeTabKey === ResultTypes.Results && + chartStatus === 'rendered' + ) { + setIsRequest({ + results: true, + samples: false, + }); + } + + if (panelOpen && activeTabKey === ResultTypes.Samples) { + setIsRequest({ + results: false, + samples: true, + }); + } + }, [panelOpen, activeTabKey, chartStatus]); + + const handleCollapseChange = useCallback( + (isOpen: boolean) => { + onCollapseChange(isOpen); + setPanelOpen(isOpen); + }, + [onCollapseChange], + ); + + const handleTabClick = useCallback( + (tabKey: string, e: MouseEvent) => { + if (!panelOpen) { + handleCollapseChange(true); + } else if (tabKey === activeTabKey) { + e.preventDefault(); + handleCollapseChange(false); + } + setActiveTabKey(tabKey); + }, + [activeTabKey, handleCollapseChange, panelOpen], + ); + + const CollapseButton = useMemo(() => { + const caretIcon = panelOpen ? ( + + ) : ( + + ); + return ( + + {panelOpen ? ( + handleCollapseChange(false)} + > + {caretIcon} + + ) : ( + handleCollapseChange(true)} + > + {caretIcon} + + )} + + ); + }, [handleCollapseChange, panelOpen, theme.colors.grayscale.base]); + + return ( + + + + + + + + + + + ); +}; diff --git a/superset-frontend/src/explore/components/DataTablesPane/components/DataTableControls.tsx b/superset-frontend/src/explore/components/DataTablesPane/components/DataTableControls.tsx new file mode 100644 index 0000000000000..c898988c90da0 --- /dev/null +++ b/superset-frontend/src/explore/components/DataTablesPane/components/DataTableControls.tsx @@ -0,0 +1,82 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import React, { useMemo } from 'react'; +import { zip } from 'lodash'; +import { css, GenericDataType, styled } from '@superset-ui/core'; +import { + CopyToClipboardButton, + FilterInput, + RowCount, +} from 'src/explore/components/DataTableControl'; +import { applyFormattingToTabularData } from 'src/utils/common'; +import { getTimeColumns } from 'src/explore/components/DataTableControl/utils'; +import { TableControlsProps } from '../types'; + +export const TableControlsWrapper = styled.div` + ${({ theme }) => ` + display: flex; + align-items: center; + justify-content: space-between; + margin-bottom: ${theme.gridUnit * 2}px; + + span { + flex-shrink: 0; + } + `} +`; + +export const TableControls = ({ + data, + datasourceId, + onInputChange, + columnNames, + columnTypes, + isLoading, +}: TableControlsProps) => { + const originalTimeColumns = getTimeColumns(datasourceId); + const formattedTimeColumns = zip( + columnNames, + columnTypes, + ) + .filter( + ([name, type]) => + type === GenericDataType.TEMPORAL && + name && + !originalTimeColumns.includes(name), + ) + .map(([colname]) => colname); + const formattedData = useMemo( + () => applyFormattingToTabularData(data, formattedTimeColumns), + [data, formattedTimeColumns], + ); + return ( + + +
+ + +
+
+ ); +}; diff --git a/superset-frontend/src/explore/components/DataTablesPane/components/ResultsPane.tsx b/superset-frontend/src/explore/components/DataTablesPane/components/ResultsPane.tsx new file mode 100644 index 0000000000000..d69a244430550 --- /dev/null +++ b/superset-frontend/src/explore/components/DataTablesPane/components/ResultsPane.tsx @@ -0,0 +1,177 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import React, { useState, useEffect } from 'react'; +import { ensureIsArray, GenericDataType, styled, t } from '@superset-ui/core'; +import Loading from 'src/components/Loading'; +import { EmptyStateMedium } from 'src/components/EmptyState'; +import TableView, { EmptyWrapperType } from 'src/components/TableView'; +import { + useFilteredTableData, + useTableColumns, +} from 'src/explore/components/DataTableControl'; +import { getChartDataRequest } from 'src/components/Chart/chartAction'; +import { getClientErrorObject } from 'src/utils/getClientErrorObject'; +import { TableControls } from './DataTableControls'; +import { ResultsPaneProps } from '../types'; + +const Error = styled.pre` + margin-top: ${({ theme }) => `${theme.gridUnit * 4}px`}; +`; + +const cache = new WeakSet(); + +export const ResultsPane = ({ + isRequest, + queryFormData, + queryForce, + ownState, + errorMessage, + actions, + dataSize = 50, +}: ResultsPaneProps) => { + const [filterText, setFilterText] = useState(''); + const [data, setData] = useState[][]>([]); + const [colnames, setColnames] = useState([]); + const [coltypes, setColtypes] = useState([]); + const [isLoading, setIsLoading] = useState(true); + const [responseError, setResponseError] = useState(''); + + useEffect(() => { + // it's an invalid formData when gets a errorMessage + if (errorMessage) return; + if (isRequest && !cache.has(queryFormData)) { + setIsLoading(true); + getChartDataRequest({ + formData: queryFormData, + force: queryForce, + resultFormat: 'json', + resultType: 'results', + ownState, + }) + .then(({ json }) => { + const { colnames, coltypes } = json.result[0]; + // Only displaying the first query is currently supported + if (json.result.length > 1) { + // todo: move these code to the backend, shouldn't loop by row in FE + const data: any[] = []; + json.result.forEach((item: { data: any[] }) => { + item.data.forEach((row, i) => { + if (data[i] !== undefined) { + data[i] = { ...data[i], ...row }; + } else { + data[i] = row; + } + }); + }); + setData(data); + setColnames(colnames); + setColtypes(coltypes); + } else { + setData(ensureIsArray(json.result[0].data)); + setColnames(colnames); + setColtypes(coltypes); + } + setResponseError(''); + cache.add(queryFormData); + if (queryForce && actions) { + actions.setForceQuery(false); + } + }) + .catch(response => { + 
getClientErrorObject(response).then(({ error, message }) => { + setResponseError(error || message || t('Sorry, an error occurred')); + }); + }) + .finally(() => { + setIsLoading(false); + }); + } + }, [queryFormData, isRequest]); + + useEffect(() => { + if (errorMessage) { + setIsLoading(false); + } + }, [errorMessage]); + + // this is to preserve the order of the columns, even if there are integer values, + // while also only grabbing the first column's keys + const columns = useTableColumns( + colnames, + coltypes, + data, + queryFormData.datasource, + isRequest, + ); + const filteredData = useFilteredTableData(filterText, data); + + if (isLoading) { + return ; + } + + if (errorMessage) { + const title = t('Run a query to display results'); + return ; + } + + if (responseError) { + return ( + <> + setFilterText(input)} + isLoading={isLoading} + /> + {responseError} + + ); + } + + if (data.length === 0) { + const title = t('No results were returned for this query'); + return ; + } + + return ( + <> + setFilterText(input)} + isLoading={isLoading} + /> + + + ); +}; diff --git a/superset-frontend/src/explore/components/DataTablesPane/components/SamplesPane.tsx b/superset-frontend/src/explore/components/DataTablesPane/components/SamplesPane.tsx new file mode 100644 index 0000000000000..1997acf596ede --- /dev/null +++ b/superset-frontend/src/explore/components/DataTablesPane/components/SamplesPane.tsx @@ -0,0 +1,145 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import React, { useState, useEffect, useMemo } from 'react'; +import { ensureIsArray, GenericDataType, styled, t } from '@superset-ui/core'; +import Loading from 'src/components/Loading'; +import { EmptyStateMedium } from 'src/components/EmptyState'; +import TableView, { EmptyWrapperType } from 'src/components/TableView'; +import { + useFilteredTableData, + useTableColumns, +} from 'src/explore/components/DataTableControl'; +import { getDatasetSamples } from 'src/components/Chart/chartAction'; +import { TableControls } from './DataTableControls'; +import { SamplesPaneProps } from '../types'; + +const Error = styled.pre` + margin-top: ${({ theme }) => `${theme.gridUnit * 4}px`}; +`; + +const cache = new WeakSet(); + +export const SamplesPane = ({ + isRequest, + datasource, + queryForce, + actions, + dataSize = 50, +}: SamplesPaneProps) => { + const [filterText, setFilterText] = useState(''); + const [data, setData] = useState[][]>([]); + const [colnames, setColnames] = useState([]); + const [coltypes, setColtypes] = useState([]); + const [isLoading, setIsLoading] = useState(false); + const [responseError, setResponseError] = useState(''); + const datasourceId = useMemo( + () => `${datasource.id}__${datasource.type}`, + [datasource], + ); + + useEffect(() => { + if (isRequest && queryForce) { + cache.delete(datasource); + } + + if (isRequest && !cache.has(datasource)) { + setIsLoading(true); + getDatasetSamples(datasource.id, queryForce) + .then(response => { + setData(ensureIsArray(response.data)); + setColnames(ensureIsArray(response.colnames)); + setColtypes(ensureIsArray(response.coltypes)); + setResponseError(''); + cache.add(datasource); + if (queryForce && actions) { + actions.setForceQuery(false); + } + }) + .catch(error => { + setData([]); + setColnames([]); + setColtypes([]); + setResponseError(`${error.name}: ${error.message}`); + }) + .finally(() => { + setIsLoading(false); + }); + } + }, [datasource, isRequest, queryForce]); + + // this is to preserve the order of the columns, even if there are integer values, + // while also only grabbing the first column's keys + const columns = useTableColumns( + colnames, + coltypes, + data, + datasourceId, + isRequest, + ); + const filteredData = useFilteredTableData(filterText, data); + + if (isLoading) { + return ; + } + + if (responseError) { + return ( + <> + setFilterText(input)} + isLoading={isLoading} + /> + {responseError} + + ); + } + + if (data.length === 0) { + const title = t('No samples were returned for this dataset'); + return ; + } + + return ( + <> + setFilterText(input)} + isLoading={isLoading} + /> + + + ); +}; diff --git a/superset-frontend/src/modules/utils.js b/superset-frontend/src/explore/components/DataTablesPane/components/index.ts similarity index 55% rename from superset-frontend/src/modules/utils.js rename to superset-frontend/src/explore/components/DataTablesPane/components/index.ts index 07dc43856e522..41623cb572083 100644 --- a/superset-frontend/src/modules/utils.js +++ b/superset-frontend/src/explore/components/DataTablesPane/components/index.ts @@ -16,28 +16,6 @@ * specific language governing permissions and limitations * under the License. 
*/ -/* eslint camelcase: 0 */ - -export function formatSelectOptions(options) { - return options.map(opt => [opt, opt.toString()]); -} - -export function getDatasourceParameter(datasourceId, datasourceType) { - return `${datasourceId}__${datasourceType}`; -} - -export function mainMetric(savedMetrics) { - // Using 'count' as default metric if it exists, otherwise using whatever one shows up first - let metric; - if (savedMetrics && savedMetrics.length > 0) { - savedMetrics.forEach(m => { - if (m.metric_name === 'count') { - metric = 'count'; - } - }); - if (!metric) { - metric = savedMetrics[0].metric_name; - } - } - return metric; -} +export { ResultsPane } from './ResultsPane'; +export { SamplesPane } from './SamplesPane'; +export { TableControls, TableControlsWrapper } from './DataTableControls'; diff --git a/superset-frontend/src/explore/components/DataTablesPane/index.ts b/superset-frontend/src/explore/components/DataTablesPane/index.ts new file mode 100644 index 0000000000000..603cf71e6ff5d --- /dev/null +++ b/superset-frontend/src/explore/components/DataTablesPane/index.ts @@ -0,0 +1,20 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +export { DataTablesPane } from './DataTablesPane'; +export * from './components'; diff --git a/superset-frontend/src/explore/components/DataTablesPane/index.tsx b/superset-frontend/src/explore/components/DataTablesPane/index.tsx deleted file mode 100644 index efa904fd9877c..0000000000000 --- a/superset-frontend/src/explore/components/DataTablesPane/index.tsx +++ /dev/null @@ -1,532 +0,0 @@ -/** - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. 
- */ -import React, { - useCallback, - useEffect, - useMemo, - useState, - MouseEvent, -} from 'react'; -import { - css, - ensureIsArray, - GenericDataType, - JsonObject, - styled, - t, - useTheme, -} from '@superset-ui/core'; -import Icons from 'src/components/Icons'; -import Tabs from 'src/components/Tabs'; -import Loading from 'src/components/Loading'; -import { EmptyStateMedium } from 'src/components/EmptyState'; -import TableView, { EmptyWrapperType } from 'src/components/TableView'; -import { getChartDataRequest } from 'src/components/Chart/chartAction'; -import { getClientErrorObject } from 'src/utils/getClientErrorObject'; -import { - getItem, - setItem, - LocalStorageKeys, -} from 'src/utils/localStorageHelpers'; -import { - CopyToClipboardButton, - FilterInput, - RowCount, - useFilteredTableData, - useTableColumns, -} from 'src/explore/components/DataTableControl'; -import { applyFormattingToTabularData } from 'src/utils/common'; -import { useOriginalFormattedTimeColumns } from '../useOriginalFormattedTimeColumns'; - -const RESULT_TYPES = { - results: 'results' as const, - samples: 'samples' as const, -}; - -const getDefaultDataTablesState = (value: any) => ({ - [RESULT_TYPES.results]: value, - [RESULT_TYPES.samples]: value, -}); - -const DATA_TABLE_PAGE_SIZE = 50; - -const TableControlsWrapper = styled.div` - ${({ theme }) => ` - display: flex; - align-items: center; - justify-content: space-between; - margin-bottom: ${theme.gridUnit * 2}px; - - span { - flex-shrink: 0; - } - `} -`; - -const SouthPane = styled.div` - ${({ theme }) => ` - position: relative; - background-color: ${theme.colors.grayscale.light5}; - z-index: 5; - overflow: hidden; - - .ant-tabs { - height: 100%; - } - - .ant-tabs-content-holder { - height: 100%; - } - - .ant-tabs-content { - height: 100%; - } - - .ant-tabs-tabpane { - display: flex; - flex-direction: column; - height: 100%; - - .table-condensed { - height: 100%; - overflow: auto; - margin-bottom: ${theme.gridUnit * 4}px; - - .table { - margin-bottom: ${theme.gridUnit * 2}px; - } - } - - .pagination-container > ul[role='navigation'] { - margin-top: 0; - } - } - `} -`; - -const Error = styled.pre` - margin-top: ${({ theme }) => `${theme.gridUnit * 4}px`}; -`; - -interface DataTableProps { - columnNames: string[]; - columnTypes: GenericDataType[] | undefined; - datasource: string | undefined; - filterText: string; - data: object[] | undefined; - isLoading: boolean; - error: string | undefined; - errorMessage: React.ReactElement | undefined; - type: 'results' | 'samples'; -} - -const DataTable = ({ - columnNames, - columnTypes, - datasource, - filterText, - data, - isLoading, - error, - errorMessage, - type, -}: DataTableProps) => { - const originalFormattedTimeColumns = - useOriginalFormattedTimeColumns(datasource); - // this is to preserve the order of the columns, even if there are integer values, - // while also only grabbing the first column's keys - const columns = useTableColumns( - columnNames, - columnTypes, - data, - datasource, - originalFormattedTimeColumns, - ); - const filteredData = useFilteredTableData(filterText, data); - - if (isLoading) { - return ; - } - if (error) { - return {error}; - } - if (data) { - if (data.length === 0) { - const title = - type === 'samples' - ? t('No samples were returned for this query') - : t('No results were returned for this query'); - return ; - } - return ( - - ); - } - if (errorMessage) { - const title = - type === 'samples' - ? 
t('Run a query to display samples') - : t('Run a query to display results'); - return ; - } - return null; -}; - -const TableControls = ({ - data, - datasourceId, - onInputChange, - columnNames, - isLoading, -}: { - data: Record[]; - datasourceId?: string; - onInputChange: (input: string) => void; - columnNames: string[]; - isLoading: boolean; -}) => { - const originalFormattedTimeColumns = - useOriginalFormattedTimeColumns(datasourceId); - const formattedData = useMemo( - () => applyFormattingToTabularData(data, originalFormattedTimeColumns), - [data, originalFormattedTimeColumns], - ); - return ( - - -
- - -
-
- ); -}; - -export const DataTablesPane = ({ - queryFormData, - queryForce, - onCollapseChange, - chartStatus, - ownState, - errorMessage, - queriesResponse, -}: { - queryFormData: Record; - queryForce: boolean; - chartStatus: string; - ownState?: JsonObject; - onCollapseChange: (isOpen: boolean) => void; - errorMessage?: JSX.Element; - queriesResponse: Record; -}) => { - const theme = useTheme(); - const [data, setData] = useState(getDefaultDataTablesState(undefined)); - const [isLoading, setIsLoading] = useState(getDefaultDataTablesState(true)); - const [columnNames, setColumnNames] = useState(getDefaultDataTablesState([])); - const [columnTypes, setColumnTypes] = useState(getDefaultDataTablesState([])); - const [error, setError] = useState(getDefaultDataTablesState('')); - const [filterText, setFilterText] = useState(getDefaultDataTablesState('')); - const [activeTabKey, setActiveTabKey] = useState( - RESULT_TYPES.results, - ); - const [isRequestPending, setIsRequestPending] = useState( - getDefaultDataTablesState(false), - ); - const [panelOpen, setPanelOpen] = useState( - getItem(LocalStorageKeys.is_datapanel_open, false), - ); - - const getData = useCallback( - (resultType: 'samples' | 'results') => { - setIsLoading(prevIsLoading => ({ - ...prevIsLoading, - [resultType]: true, - })); - return getChartDataRequest({ - formData: queryFormData, - force: queryForce, - resultFormat: 'json', - resultType, - ownState, - }) - .then(({ json }) => { - // Only displaying the first query is currently supported - if (json.result.length > 1) { - const data: any[] = []; - json.result.forEach((item: { data: any[] }) => { - item.data.forEach((row, i) => { - if (data[i] !== undefined) { - data[i] = { ...data[i], ...row }; - } else { - data[i] = row; - } - }); - }); - setData(prevData => ({ - ...prevData, - [resultType]: data, - })); - } else { - setData(prevData => ({ - ...prevData, - [resultType]: json.result[0].data, - })); - } - - const colNames = ensureIsArray(json.result[0].colnames); - - setColumnNames(prevColumnNames => ({ - ...prevColumnNames, - [resultType]: colNames, - })); - setColumnTypes(prevColumnTypes => ({ - ...prevColumnTypes, - [resultType]: json.result[0].coltypes || [], - })); - setIsLoading(prevIsLoading => ({ - ...prevIsLoading, - [resultType]: false, - })); - setError(prevError => ({ - ...prevError, - [resultType]: undefined, - })); - }) - .catch(response => { - getClientErrorObject(response).then(({ error, message }) => { - setError(prevError => ({ - ...prevError, - [resultType]: error || message || t('Sorry, an error occurred'), - })); - setIsLoading(prevIsLoading => ({ - ...prevIsLoading, - [resultType]: false, - })); - }); - }); - }, - [queryFormData, columnNames], - ); - useEffect(() => { - setItem(LocalStorageKeys.is_datapanel_open, panelOpen); - }, [panelOpen]); - - useEffect(() => { - setIsRequestPending(prevState => ({ - ...prevState, - [RESULT_TYPES.results]: true, - })); - }, [queryFormData]); - - useEffect(() => { - setIsRequestPending(prevState => ({ - ...prevState, - [RESULT_TYPES.samples]: true, - })); - }, [queryFormData?.datasource]); - - useEffect(() => { - if (queriesResponse && chartStatus === 'success') { - const { colnames } = queriesResponse[0]; - setColumnNames(prevColumnNames => ({ - ...prevColumnNames, - [RESULT_TYPES.results]: colnames ?? 
[], - })); - } - }, [queriesResponse, chartStatus]); - - useEffect(() => { - if (panelOpen && isRequestPending[RESULT_TYPES.results]) { - if (errorMessage) { - setIsRequestPending(prevState => ({ - ...prevState, - [RESULT_TYPES.results]: false, - })); - setIsLoading(prevIsLoading => ({ - ...prevIsLoading, - [RESULT_TYPES.results]: false, - })); - return; - } - if (chartStatus === 'loading') { - setIsLoading(prevIsLoading => ({ - ...prevIsLoading, - [RESULT_TYPES.results]: true, - })); - } else { - setIsRequestPending(prevState => ({ - ...prevState, - [RESULT_TYPES.results]: false, - })); - getData(RESULT_TYPES.results); - } - } - if ( - panelOpen && - isRequestPending[RESULT_TYPES.samples] && - activeTabKey === RESULT_TYPES.samples - ) { - setIsRequestPending(prevState => ({ - ...prevState, - [RESULT_TYPES.samples]: false, - })); - getData(RESULT_TYPES.samples); - } - }, [ - panelOpen, - isRequestPending, - getData, - activeTabKey, - chartStatus, - errorMessage, - ]); - - const handleCollapseChange = useCallback( - (isOpen: boolean) => { - onCollapseChange(isOpen); - setPanelOpen(isOpen); - }, - [onCollapseChange], - ); - - const handleTabClick = useCallback( - (tabKey: string, e: MouseEvent) => { - if (!panelOpen) { - handleCollapseChange(true); - } else if (tabKey === activeTabKey) { - e.preventDefault(); - handleCollapseChange(false); - } - setActiveTabKey(tabKey); - }, - [activeTabKey, handleCollapseChange, panelOpen], - ); - - const CollapseButton = useMemo(() => { - const caretIcon = panelOpen ? ( - - ) : ( - - ); - return ( - - {panelOpen ? ( - handleCollapseChange(false)} - > - {caretIcon} - - ) : ( - handleCollapseChange(true)} - > - {caretIcon} - - )} - - ); - }, [handleCollapseChange, panelOpen, theme.colors.grayscale.base]); - - return ( - - - - - setFilterText(prevState => ({ - ...prevState, - [RESULT_TYPES.results]: input, - })) - } - isLoading={isLoading[RESULT_TYPES.results]} - /> - - - - - setFilterText(prevState => ({ - ...prevState, - [RESULT_TYPES.samples]: input, - })) - } - isLoading={isLoading[RESULT_TYPES.samples]} - /> - - - - - ); -}; diff --git a/superset-frontend/src/explore/components/DataTablesPane/types.ts b/superset-frontend/src/explore/components/DataTablesPane/types.ts new file mode 100644 index 0000000000000..f526536640c6e --- /dev/null +++ b/superset-frontend/src/explore/components/DataTablesPane/types.ts @@ -0,0 +1,65 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +import { + Datasource, + GenericDataType, + JsonObject, + QueryFormData, +} from '@superset-ui/core'; +import { ExploreActions } from 'src/explore/actions/exploreActions'; +import { ChartStatus } from 'src/explore/types'; + +export interface DataTablesPaneProps { + queryFormData: QueryFormData; + datasource: Datasource; + queryForce: boolean; + ownState?: JsonObject; + chartStatus: ChartStatus; + onCollapseChange: (isOpen: boolean) => void; + errorMessage?: JSX.Element; + actions: ExploreActions; +} + +export interface ResultsPaneProps { + isRequest: boolean; + queryFormData: QueryFormData; + queryForce: boolean; + ownState?: JsonObject; + errorMessage?: React.ReactElement; + actions?: ExploreActions; + dataSize?: number; +} + +export interface SamplesPaneProps { + isRequest: boolean; + datasource: Datasource; + queryForce: boolean; + actions?: ExploreActions; + dataSize?: number; +} + +export interface TableControlsProps { + data: Record[]; + // {datasource.id}__{datasource.type}, eg: 1__table + datasourceId: string; + onInputChange: (input: string) => void; + columnNames: string[]; + columnTypes: GenericDataType[]; + isLoading: boolean; +} diff --git a/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx b/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx index fa3f5f93542a3..fb85882f02337 100644 --- a/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx +++ b/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx @@ -65,6 +65,7 @@ export const ExploreChartHeader = ({ slice, actions, formData, + ownState, chart, user, canOverwrite, @@ -138,6 +139,7 @@ export const ExploreChartHeader = ({ slice, actions.redirectSQLLab, openPropertiesModal, + ownState, ); const oldSliceName = slice?.slice_name; diff --git a/superset-frontend/src/explore/components/ExploreChartPanel.jsx b/superset-frontend/src/explore/components/ExploreChartPanel.jsx index 49ed20848d687..9fc7caef62803 100644 --- a/superset-frontend/src/explore/components/ExploreChartPanel.jsx +++ b/superset-frontend/src/explore/components/ExploreChartPanel.jsx @@ -198,6 +198,7 @@ const ExploreChartPanel = ({ undefined, ownState, ); + actions.updateQueryFormData(formData, chart.id); }, [actions, chart.id, formData, ownState, timeout]); const onCollapseChange = useCallback(isOpen => { @@ -388,11 +389,12 @@ const ExploreChartPanel = ({ )} diff --git a/superset-frontend/src/explore/components/ExploreChartPanel.test.jsx b/superset-frontend/src/explore/components/ExploreChartPanel.test.jsx index a779773052e69..557b2149e01e1 100644 --- a/superset-frontend/src/explore/components/ExploreChartPanel.test.jsx +++ b/superset-frontend/src/explore/components/ExploreChartPanel.test.jsx @@ -17,12 +17,12 @@ * under the License. 
*/ import React from 'react'; +import userEvent from '@testing-library/user-event'; import { render, screen } from 'spec/helpers/testing-library'; import ChartContainer from 'src/explore/components/ExploreChartPanel'; const createProps = (overrides = {}) => ({ sliceName: 'Trend Line', - vizType: 'line', height: '500px', actions: {}, can_overwrite: false, @@ -30,9 +30,29 @@ const createProps = (overrides = {}) => ({ containerId: 'foo', width: '500px', isStarred: false, - chartIsStale: false, - chart: {}, - form_data: {}, + vizType: 'histogram', + chart: { + id: 1, + latestQueryFormData: { + viz_type: 'histogram', + datasource: '49__table', + slice_id: 318, + url_params: {}, + granularity_sqla: 'time_start', + time_range: 'No filter', + all_columns_x: ['age'], + adhoc_filters: [], + row_limit: 10000, + groupby: null, + color_scheme: 'supersetColors', + label_colors: {}, + link_length: '25', + x_axis_label: 'age', + y_axis_label: 'count', + }, + chartStatus: 'rendered', + queriesResponse: [{ is_cached: true }], + }, ...overrides, }); @@ -83,4 +103,37 @@ describe('ChartContainer', () => { screen.getByText('Required control values have been removed'), ).toBeVisible(); }); + + it('should render cached button and call expected actions', () => { + const setForceQuery = jest.fn(); + const postChartFormData = jest.fn(); + const updateQueryFormData = jest.fn(); + const props = createProps({ + actions: { + setForceQuery, + postChartFormData, + updateQueryFormData, + }, + }); + render(, { useRedux: true }); + + const cached = screen.queryByText('Cached'); + expect(cached).toBeInTheDocument(); + + userEvent.click(cached); + expect(setForceQuery).toHaveBeenCalledTimes(1); + expect(postChartFormData).toHaveBeenCalledTimes(1); + expect(updateQueryFormData).toHaveBeenCalledTimes(1); + }); + + it('should hide cached button', () => { + const props = createProps({ + chart: { + chartStatus: 'rendered', + queriesResponse: [{ is_cached: false }], + }, + }); + render(, { useRedux: true }); + expect(screen.queryByText('Cached')).not.toBeInTheDocument(); + }); }); diff --git a/superset-frontend/src/explore/components/ExploreViewContainer/ExploreViewContainer.test.tsx b/superset-frontend/src/explore/components/ExploreViewContainer/ExploreViewContainer.test.tsx index 7743997a35529..2260346968dd3 100644 --- a/superset-frontend/src/explore/components/ExploreViewContainer/ExploreViewContainer.test.tsx +++ b/superset-frontend/src/explore/components/ExploreViewContainer/ExploreViewContainer.test.tsx @@ -92,7 +92,7 @@ test('generates a new form_data param when none is available', async () => { expect(replaceState).toHaveBeenCalledWith( expect.anything(), undefined, - expect.stringMatching('dataset_id'), + expect.stringMatching('datasource_id'), ); replaceState.mockRestore(); }); @@ -109,7 +109,7 @@ test('generates a different form_data param when one is provided and is mounting expect(replaceState).toHaveBeenCalledWith( expect.anything(), undefined, - expect.stringMatching('dataset_id'), + expect.stringMatching('datasource_id'), ); replaceState.mockRestore(); }); diff --git a/superset-frontend/src/explore/components/ExploreViewContainer/index.jsx b/superset-frontend/src/explore/components/ExploreViewContainer/index.jsx index 97e30d335b7a6..e102f2dc970a8 100644 --- a/superset-frontend/src/explore/components/ExploreViewContainer/index.jsx +++ b/superset-frontend/src/explore/components/ExploreViewContainer/index.jsx @@ -86,26 +86,6 @@ const ExploreContainer = styled.div` height: 100%; `; -const ExploreHeaderContainer = 
styled.div` - ${({ theme }) => css` - background-color: ${theme.colors.grayscale.light5}; - height: ${theme.gridUnit * 16}px; - padding: 0 ${theme.gridUnit * 4}px; - - .editable-title { - overflow: hidden; - - & > input[type='button'], - & > span { - overflow: hidden; - text-overflow: ellipsis; - max-width: 100%; - white-space: nowrap; - } - } - `} -`; - const ExplorePanelContainer = styled.div` ${({ theme }) => css` background: ${theme.colors.grayscale.light5}; @@ -172,14 +152,24 @@ const ExplorePanelContainer = styled.div` `; const updateHistory = debounce( - async (formData, datasetId, isReplace, standalone, force, title, tabId) => { + async ( + formData, + datasourceId, + datasourceType, + isReplace, + standalone, + force, + title, + tabId, + ) => { const payload = { ...formData }; const chartId = formData.slice_id; const additionalParam = {}; if (chartId) { additionalParam[URL_PARAMS.sliceId.name] = chartId; } else { - additionalParam[URL_PARAMS.datasetId.name] = datasetId; + additionalParam[URL_PARAMS.datasourceId.name] = datasourceId; + additionalParam[URL_PARAMS.datasourceType.name] = datasourceType; } const urlParams = payload?.url_params || {}; @@ -193,11 +183,24 @@ const updateHistory = debounce( let key; let stateModifier; if (isReplace) { - key = await postFormData(datasetId, formData, chartId, tabId); + key = await postFormData( + datasourceId, + datasourceType, + formData, + chartId, + tabId, + ); stateModifier = 'replaceState'; } else { key = getUrlParam(URL_PARAMS.formDataKey); - await putFormData(datasetId, key, formData, chartId, tabId); + await putFormData( + datasourceId, + datasourceType, + key, + formData, + chartId, + tabId, + ); stateModifier = 'pushState'; } const url = mountExploreUrl( @@ -249,11 +252,12 @@ function ExploreViewContainer(props) { dashboardId: props.dashboardId, } : props.form_data; - const datasetId = props.datasource.id; + const { id: datasourceId, type: datasourceType } = props.datasource; updateHistory( formData, - datasetId, + datasourceId, + datasourceType, isReplace, props.standalone, props.force, @@ -265,6 +269,7 @@ function ExploreViewContainer(props) { props.dashboardId, props.form_data, props.datasource.id, + props.datasource.type, props.standalone, props.force, tabId, @@ -530,24 +535,23 @@ function ExploreViewContainer(props) { return ( - - - + { checked={this.state.action === 'saveas'} onChange={() => this.changeAction('saveas')} > - {' '} - {t('Save as ...')}   + {t('Save as...')}
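
The `updateHistory` flow above now threads both the datasource id and the datasource type through to the temporary form-data endpoints instead of a bare `dataset_id`. A minimal usage sketch of the updated `postFormData`/`putFormData` signatures (the import path and the wrapper function are illustrative assumptions; the parameter order and return values follow the `formData.ts` changes later in this patch):

```ts
// Sketch only: persisting explore form_data keyed by datasource id + type.
import { postFormData, putFormData } from 'src/explore/exploreUtils/formData';

async function persistExploreState(
  datasourceId: number,
  datasourceType: string, // e.g. 'table'
  formData: Record<string, unknown>,
  chartId?: number,
  tabId?: string,
): Promise<string> {
  // First save creates a new key for this datasource/form_data combination...
  const key = await postFormData(
    datasourceId,
    datasourceType,
    formData,
    chartId,
    tabId,
  );
  // ...subsequent saves reuse the key and update the stored form_data.
  await putFormData(datasourceId, datasourceType, key, formData, chartId, tabId);
  return key;
}
```
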
diff --git a/superset-frontend/src/explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx b/superset-frontend/src/explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx index 973fabaccf86d..8df36c1291ac8 100644 --- a/superset-frontend/src/explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx +++ b/superset-frontend/src/explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx @@ -27,6 +27,7 @@ import { getChartMetadataRegistry, validateNonEmpty, isValidExpression, + styled, withTheme, } from '@superset-ui/core'; @@ -43,6 +44,7 @@ import { } from 'src/modules/AnnotationTypes'; import PopoverSection from 'src/components/PopoverSection'; import ControlHeader from 'src/explore/components/ControlHeader'; +import { EmptyStateSmall } from 'src/components/EmptyState'; const AUTOMATIC_COLOR = ''; @@ -98,6 +100,35 @@ const defaultProps = { close: () => {}, }; +const NotFoundContentWrapper = styled.div` + && > div:first-child { + padding-left: 0; + padding-right: 0; + } +`; + +const NotFoundContent = () => ( + + + {t('Add an annotation layer')}{' '} +
+ {t('here')} + + . +
+ } + image="empty.svg" + /> + +); + class AnnotationLayer extends React.PureComponent { constructor(props) { super(props); @@ -416,6 +447,7 @@ class AnnotationLayer extends React.PureComponent { onChange={this.handleValue} validationErrors={!value ? ['Mandatory'] : []} optionRenderer={this.renderOption} + notFoundContent={} /> ); } @@ -760,9 +792,10 @@ class AnnotationLayer extends React.PureComponent { ariaLabel={t('Annotation source type')} hovered description={t('Choose the source of your annotations')} - label={t('Annotation Source')} + label={t('Annotation source')} name="annotation-source-type" options={supportedSourceTypes} + notFoundContent={} value={sourceType} onChange={this.handleAnnotationSourceType} validationErrors={!sourceType ? [t('Mandatory')] : []} diff --git a/superset-frontend/src/explore/components/controls/DatasourceControl/index.jsx b/superset-frontend/src/explore/components/controls/DatasourceControl/index.jsx index 73aa5e4d913d5..3d6ea2fdd2662 100644 --- a/superset-frontend/src/explore/components/controls/DatasourceControl/index.jsx +++ b/superset-frontend/src/explore/components/controls/DatasourceControl/index.jsx @@ -189,9 +189,9 @@ class DatasourceControl extends React.PureComponent { const isMissingDatasource = datasource.id == null; let isMissingParams = false; if (isMissingDatasource) { - const datasetId = getUrlParam(URL_PARAMS.datasetId); + const datasourceId = getUrlParam(URL_PARAMS.datasourceId); const sliceId = getUrlParam(URL_PARAMS.sliceId); - if (!datasetId && !sliceId) { + if (!datasourceId && !sliceId) { isMissingParams = true; } } diff --git a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/DndMetricSelect.tsx b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/DndMetricSelect.tsx index db76f75fcfc21..0d9591a0c5ae7 100644 --- a/superset-frontend/src/explore/components/controls/DndColumnSelectControl/DndMetricSelect.tsx +++ b/superset-frontend/src/explore/components/controls/DndColumnSelectControl/DndMetricSelect.tsx @@ -237,9 +237,9 @@ export const DndMetricSelect = (props: any) => { const valuesCopy = [...value]; valuesCopy.splice(index, 1); setValue(valuesCopy); - onChange(valuesCopy); + handleChange(valuesCopy); }, - [onChange, value], + [handleChange, value], ); const moveLabel = useCallback( diff --git a/superset-frontend/src/explore/components/controls/SelectControl.jsx b/superset-frontend/src/explore/components/controls/SelectControl.jsx index 53b7440cb4b9a..59141458a4602 100644 --- a/superset-frontend/src/explore/components/controls/SelectControl.jsx +++ b/superset-frontend/src/explore/components/controls/SelectControl.jsx @@ -53,6 +53,7 @@ const propTypes = { placeholder: PropTypes.string, filterOption: PropTypes.func, tokenSeparators: PropTypes.arrayOf(PropTypes.string), + notFoundContent: PropTypes.object, // ControlHeader props label: PropTypes.string, @@ -179,6 +180,7 @@ export default class SelectControl extends React.PureComponent { showHeader, value, tokenSeparators, + notFoundContent, // ControlHeader props description, renderTrigger, @@ -245,6 +247,7 @@ export default class SelectControl extends React.PureComponent { sortComparator: this.props.sortComparator, value: getValue(), tokenSeparators, + notFoundContent, }; return ( diff --git a/superset-frontend/src/explore/components/useExploreAdditionalActionsMenu/index.jsx b/superset-frontend/src/explore/components/useExploreAdditionalActionsMenu/index.jsx index cc416dd56165a..bebe5be269774 100644 --- 
a/superset-frontend/src/explore/components/useExploreAdditionalActionsMenu/index.jsx +++ b/superset-frontend/src/explore/components/useExploreAdditionalActionsMenu/index.jsx @@ -95,6 +95,7 @@ export const useExploreAdditionalActionsMenu = ( slice, onOpenInEditor, onOpenPropertiesModal, + ownState, ) => { const theme = useTheme(); const { addDangerToast, addSuccessToast } = useToasts(); @@ -132,6 +133,7 @@ export const useExploreAdditionalActionsMenu = ( canDownloadCSV ? exportChart({ formData: latestQueryFormData, + ownState, resultType: 'full', resultFormat: 'csv', }) @@ -166,8 +168,7 @@ export const useExploreAdditionalActionsMenu = ( if (!latestQueryFormData) { throw new Error(); } - const url = await getChartPermalink(latestQueryFormData); - await copyTextToClipboard(url); + await copyTextToClipboard(() => getChartPermalink(latestQueryFormData)); addSuccessToast(t('Copied to clipboard!')); } catch (error) { addDangerToast(t('Sorry, something went wrong. Try again later.')); diff --git a/superset-frontend/src/explore/controlPanels/Separator.js b/superset-frontend/src/explore/controlPanels/Separator.js index 588adea2aae7c..2be8594690d05 100644 --- a/superset-frontend/src/explore/controlPanels/Separator.js +++ b/superset-frontend/src/explore/controlPanels/Separator.js @@ -17,7 +17,7 @@ * under the License. */ import { t, validateNonEmpty } from '@superset-ui/core'; -import { formatSelectOptions } from 'src/modules/utils'; +import { formatSelectOptions } from 'src/explore/exploreUtils'; export default { controlPanelSections: [ diff --git a/superset-frontend/src/explore/controlPanels/sections.tsx b/superset-frontend/src/explore/controlPanels/sections.tsx index a6adbf3af23c3..be21747ed63e6 100644 --- a/superset-frontend/src/explore/controlPanels/sections.tsx +++ b/superset-frontend/src/explore/controlPanels/sections.tsx @@ -19,7 +19,7 @@ import React from 'react'; import { t } from '@superset-ui/core'; import { ControlPanelSectionConfig } from '@superset-ui/chart-controls'; -import { formatSelectOptions } from 'src/modules/utils'; +import { formatSelectOptions } from 'src/explore/exploreUtils'; export const druidTimeSeries: ControlPanelSectionConfig = { label: t('Time'), diff --git a/superset-frontend/src/explore/controls.jsx b/superset-frontend/src/explore/controls.jsx index daba78ca6d243..21134d48f3178 100644 --- a/superset-frontend/src/explore/controls.jsx +++ b/superset-frontend/src/explore/controls.jsx @@ -64,7 +64,7 @@ import { legacyValidateInteger, validateNonEmpty, } from '@superset-ui/core'; -import { formatSelectOptions } from 'src/modules/utils'; +import { formatSelectOptions } from 'src/explore/exploreUtils'; import { TIME_FILTER_LABELS } from './constants'; import { StyledColumnOption } from './components/optionRenderers'; diff --git a/superset-frontend/src/explore/exploreUtils/formData.ts b/superset-frontend/src/explore/exploreUtils/formData.ts index 9987b5d8cfa76..36de6640a5c8d 100644 --- a/superset-frontend/src/explore/exploreUtils/formData.ts +++ b/superset-frontend/src/explore/exploreUtils/formData.ts @@ -20,7 +20,8 @@ import { omit } from 'lodash'; import { SupersetClient, JsonObject } from '@superset-ui/core'; type Payload = { - dataset_id: number; + datasource_id: number; + datasource_type: string; form_data: string; chart_id?: number; }; @@ -42,12 +43,14 @@ const assembleEndpoint = (key?: string, tabId?: string) => { }; const assemblePayload = ( - datasetId: number, + datasourceId: number, + datasourceType: string, formData: JsonObject, chartId?: number, ) => { const 
payload: Payload = { - dataset_id: datasetId, + datasource_id: datasourceId, + datasource_type: datasourceType, form_data: JSON.stringify(sanitizeFormData(formData)), }; if (chartId) { @@ -57,18 +60,25 @@ const assemblePayload = ( }; export const postFormData = ( - datasetId: number, + datasourceId: number, + datasourceType: string, formData: JsonObject, chartId?: number, tabId?: string, ): Promise => SupersetClient.post({ endpoint: assembleEndpoint(undefined, tabId), - jsonPayload: assemblePayload(datasetId, formData, chartId), + jsonPayload: assemblePayload( + datasourceId, + datasourceType, + formData, + chartId, + ), }).then(r => r.json.key); export const putFormData = ( - datasetId: number, + datasourceId: number, + datasourceType: string, key: string, formData: JsonObject, chartId?: number, @@ -76,5 +86,10 @@ export const putFormData = ( ): Promise => SupersetClient.put({ endpoint: assembleEndpoint(key, tabId), - jsonPayload: assemblePayload(datasetId, formData, chartId), + jsonPayload: assemblePayload( + datasourceId, + datasourceType, + formData, + chartId, + ), }).then(r => r.json.message); diff --git a/superset-frontend/src/explore/exploreUtils/index.js b/superset-frontend/src/explore/exploreUtils/index.js index 79bcb1a36241e..d5c3d06d5f6b6 100644 --- a/superset-frontend/src/explore/exploreUtils/index.js +++ b/superset-frontend/src/explore/exploreUtils/index.js @@ -348,3 +348,7 @@ export const getSimpleSQLExpression = (subject, operator, comparator) => { } return expression; }; + +export function formatSelectOptions(options) { + return options.map(opt => [opt, opt.toString()]); +} diff --git a/superset-frontend/src/explore/reducers/exploreReducer.js b/superset-frontend/src/explore/reducers/exploreReducer.js index 4dfcc9a1781bc..2897832ff055c 100644 --- a/superset-frontend/src/explore/reducers/exploreReducer.js +++ b/superset-frontend/src/explore/reducers/exploreReducer.js @@ -28,7 +28,6 @@ import { getControlValuesCompatibleWithDatasource, } from 'src/explore/controlUtils'; import * as actions from 'src/explore/actions/exploreActions'; -import { LocalStorageKeys, setItem } from 'src/utils/localStorageHelpers'; export default function exploreReducer(state = {}, action) { const actionHandlers = { @@ -265,52 +264,6 @@ export default function exploreReducer(state = {}, action) { sliceName: action.slice.slice_name ?? 
state.sliceName, }; }, - [actions.SET_ORIGINAL_FORMATTED_TIME_COLUMN]() { - const { datasourceId, columnName } = action; - const newOriginalFormattedColumns = { - ...state.originalFormattedTimeColumns, - }; - const newOriginalFormattedColumnsForDatasource = ensureIsArray( - newOriginalFormattedColumns[datasourceId], - ).slice(); - - newOriginalFormattedColumnsForDatasource.push(columnName); - newOriginalFormattedColumns[datasourceId] = - newOriginalFormattedColumnsForDatasource; - setItem( - LocalStorageKeys.explore__data_table_original_formatted_time_columns, - newOriginalFormattedColumns, - ); - return { - ...state, - originalFormattedTimeColumns: newOriginalFormattedColumns, - }; - }, - [actions.UNSET_ORIGINAL_FORMATTED_TIME_COLUMN]() { - const { datasourceId, columnIndex } = action; - const newOriginalFormattedColumns = { - ...state.originalFormattedTimeColumns, - }; - const newOriginalFormattedColumnsForDatasource = ensureIsArray( - newOriginalFormattedColumns[datasourceId], - ).slice(); - - newOriginalFormattedColumnsForDatasource.splice(columnIndex, 1); - newOriginalFormattedColumns[datasourceId] = - newOriginalFormattedColumnsForDatasource; - - if (newOriginalFormattedColumnsForDatasource.length === 0) { - delete newOriginalFormattedColumns[datasourceId]; - } - setItem( - LocalStorageKeys.explore__data_table_original_formatted_time_columns, - newOriginalFormattedColumns, - ); - return { - ...state, - originalFormattedTimeColumns: newOriginalFormattedColumns, - }; - }, [actions.SET_FORCE_QUERY]() { return { ...state, diff --git a/superset-frontend/src/explore/reducers/getInitialState.ts b/superset-frontend/src/explore/reducers/getInitialState.ts index 659e834d2faad..45440f6f5b4b9 100644 --- a/superset-frontend/src/explore/reducers/getInitialState.ts +++ b/superset-frontend/src/explore/reducers/getInitialState.ts @@ -35,7 +35,6 @@ import { getFormDataFromControls, applyMapStateToPropsToControl, } from 'src/explore/controlUtils'; -import { getItem, LocalStorageKeys } from 'src/utils/localStorageHelpers'; export interface ExplorePageBootstrapData extends JsonObject { can_add: boolean; @@ -78,10 +77,6 @@ export default function getInitialState( initialFormData, ) as ControlStateMapping, controlsTransferred: [], - originalFormattedTimeColumns: getItem( - LocalStorageKeys.explore__data_table_original_formatted_time_columns, - {}, - ), }; // apply initial mapStateToProps for all controls, must execute AFTER diff --git a/superset-frontend/src/modules/utils.test.jsx b/superset-frontend/src/modules/utils.test.jsx deleted file mode 100644 index 0c8515f8ae0b4..0000000000000 --- a/superset-frontend/src/modules/utils.test.jsx +++ /dev/null @@ -1,40 +0,0 @@ -/** - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. 
- */ -import { mainMetric } from 'src/modules/utils'; - -describe('utils', () => { - describe('mainMetric', () => { - it('is null when no options', () => { - expect(mainMetric([])).toBeUndefined(); - expect(mainMetric(null)).toBeUndefined(); - }); - it('prefers the "count" metric when first', () => { - const metrics = [{ metric_name: 'count' }, { metric_name: 'foo' }]; - expect(mainMetric(metrics)).toBe('count'); - }); - it('prefers the "count" metric when not first', () => { - const metrics = [{ metric_name: 'foo' }, { metric_name: 'count' }]; - expect(mainMetric(metrics)).toBe('count'); - }); - it('selects the first metric when "count" is not an option', () => { - const metrics = [{ metric_name: 'foo' }, { metric_name: 'not_count' }]; - expect(mainMetric(metrics)).toBe('foo'); - }); - }); -}); diff --git a/superset-frontend/src/utils/common.js b/superset-frontend/src/utils/common.js index 4efdb205e5a9f..603ec7c54992d 100644 --- a/superset-frontend/src/utils/common.js +++ b/superset-frontend/src/utils/common.js @@ -94,7 +94,7 @@ export function prepareCopyToClipboardTabularData(data, columns) { for (let i = 0; i < data.length; i += 1) { const row = {}; for (let j = 0; j < columns.length; j += 1) { - // JavaScript does not mantain the order of a mixed set of keys (i.e integers and strings) + // JavaScript does not maintain the order of a mixed set of keys (i.e integers and strings) // the below function orders the keys based on the column names. const key = columns[j].name || columns[j]; if (data[i][key]) { @@ -145,4 +145,10 @@ export const detectOS = () => { return 'Unknown OS'; }; +export const isSafari = () => { + const { userAgent } = navigator; + + return userAgent && /^((?!chrome|android).)*safari/i.test(userAgent); +}; + export const isNullish = value => value === null || value === undefined; diff --git a/superset-frontend/src/utils/copy.ts b/superset-frontend/src/utils/copy.ts index 7db289c04037d..0980f2ab17079 100644 --- a/superset-frontend/src/utils/copy.ts +++ b/superset-frontend/src/utils/copy.ts @@ -17,40 +17,79 @@ * under the License. */ -const copyTextToClipboard = async (text: string) => - new Promise((resolve, reject) => { - const selection: Selection | null = document.getSelection(); - if (selection) { - selection.removeAllRanges(); - const range = document.createRange(); - const span = document.createElement('span'); - span.textContent = text; - span.style.position = 'fixed'; - span.style.top = '0'; - span.style.clip = 'rect(0, 0, 0, 0)'; - span.style.whiteSpace = 'pre'; - - document.body.appendChild(span); - range.selectNode(span); - selection.addRange(range); - - try { - if (!document.execCommand('copy')) { - reject(); - } - } catch (err) { - reject(); - } - - document.body.removeChild(span); - if (selection.removeRange) { - selection.removeRange(range); - } else { - selection.removeAllRanges(); - } +import { isSafari } from './common'; + +// Use the new Clipboard API if the browser supports it +const copyTextWithClipboardApi = async (getText: () => Promise) => { + // Safari (WebKit) does not support delayed generation of clipboard. + // This means that writing to the clipboard, from the moment the user + // interacts with the app, must be instantaneous. + // However, neither writeText nor write accepts a Promise, so + // we need to create a ClipboardItem that accepts said Promise to + // delay the text generation, as needed. 
+ // Source: https://bugs.webkit.org/show_bug.cgi?id=222262P + if (isSafari()) { + try { + const clipboardItem = new ClipboardItem({ + 'text/plain': getText(), + }); + await navigator.clipboard.write([clipboardItem]); + } catch { + // Fallback to default clipboard API implementation + const text = await getText(); + await navigator.clipboard.writeText(text); } + } else { + // For Blink, the above method won't work, but we can use the + // default (intended) API, since the delayed generation of the + // clipboard is now supported. + // Source: https://bugs.chromium.org/p/chromium/issues/detail?id=1014310 + const text = await getText(); + await navigator.clipboard.writeText(text); + } +}; + +const copyTextToClipboard = (getText: () => Promise) => + copyTextWithClipboardApi(getText) + // If the Clipboard API is not supported, fallback to the older method. + .catch(() => + getText().then( + text => + new Promise((resolve, reject) => { + const selection: Selection | null = document.getSelection(); + if (selection) { + selection.removeAllRanges(); + const range = document.createRange(); + const span = document.createElement('span'); + span.textContent = text; + span.style.position = 'fixed'; + span.style.top = '0'; + span.style.clip = 'rect(0, 0, 0, 0)'; + span.style.whiteSpace = 'pre'; + + document.body.appendChild(span); + range.selectNode(span); + selection.addRange(range); + + try { + if (!document.execCommand('copy')) { + reject(); + } + } catch (err) { + reject(); + } + + document.body.removeChild(span); + if (selection.removeRange) { + selection.removeRange(range); + } else { + selection.removeAllRanges(); + } + } - resolve(); - }); + resolve(); + }), + ), + ); export default copyTextToClipboard; diff --git a/superset-frontend/src/modules/dates.js b/superset-frontend/src/utils/dates.js similarity index 100% rename from superset-frontend/src/modules/dates.js rename to superset-frontend/src/utils/dates.js diff --git a/superset-frontend/src/modules/dates.test.js b/superset-frontend/src/utils/dates.test.js similarity index 98% rename from superset-frontend/src/modules/dates.test.js rename to superset-frontend/src/utils/dates.test.js index 49325f64f48bf..62973693ea572 100644 --- a/superset-frontend/src/modules/dates.test.js +++ b/superset-frontend/src/utils/dates.test.js @@ -22,7 +22,7 @@ import { epochTimeXHoursAgo, epochTimeXDaysAgo, epochTimeXYearsAgo, -} from 'src/modules/dates'; +} from 'src/utils/dates'; describe('fDuration', () => { it('is a function', () => { diff --git a/superset-frontend/src/views/CRUD/alert/ExecutionLog.tsx b/superset-frontend/src/views/CRUD/alert/ExecutionLog.tsx index 8bdae92eb53d0..37f3e7c2add6d 100644 --- a/superset-frontend/src/views/CRUD/alert/ExecutionLog.tsx +++ b/superset-frontend/src/views/CRUD/alert/ExecutionLog.tsx @@ -25,7 +25,7 @@ import ListView from 'src/components/ListView'; import { Tooltip } from 'src/components/Tooltip'; import SubMenu from 'src/views/components/SubMenu'; import withToasts from 'src/components/MessageToasts/withToasts'; -import { fDuration } from 'src/modules/dates'; +import { fDuration } from 'src/utils/dates'; import AlertStatusIcon from 'src/views/CRUD/alert/components/AlertStatusIcon'; import { useListViewResource, diff --git a/superset-frontend/src/views/CRUD/chart/ChartCard.tsx b/superset-frontend/src/views/CRUD/chart/ChartCard.tsx index 07f9dbf3bfbd8..69812bd641559 100644 --- a/superset-frontend/src/views/CRUD/chart/ChartCard.tsx +++ b/superset-frontend/src/views/CRUD/chart/ChartCard.tsx @@ -43,7 +43,7 @@ interface 
ChartCardProps { saveFavoriteStatus: (id: number, isStarred: boolean) => void; favoriteStatus: boolean; chartFilter?: string; - userId?: number; + userId?: string | number; showThumbnails?: boolean; handleBulkChartExport: (chartsToExport: Chart[]) => void; } @@ -165,11 +165,13 @@ export default function ChartCard({ e.preventDefault(); }} > - + {userId && ( + + )} diff --git a/superset-frontend/src/views/CRUD/chart/ChartList.test.jsx b/superset-frontend/src/views/CRUD/chart/ChartList.test.jsx index fa5d363c5123e..fd9aee16ab06c 100644 --- a/superset-frontend/src/views/CRUD/chart/ChartList.test.jsx +++ b/superset-frontend/src/views/CRUD/chart/ChartList.test.jsx @@ -17,6 +17,7 @@ * under the License. */ import React from 'react'; +import { MemoryRouter } from 'react-router-dom'; import thunk from 'redux-thunk'; import configureStore from 'redux-mock-store'; import { Provider } from 'react-redux'; @@ -34,6 +35,9 @@ import ConfirmStatusChange from 'src/components/ConfirmStatusChange'; import ListView from 'src/components/ListView'; import PropertiesModal from 'src/explore/components/PropertiesModal'; import ListViewCard from 'src/components/ListViewCard'; +import FaveStar from 'src/components/FaveStar'; +import TableCollection from 'src/components/TableCollection'; +import CardCollection from 'src/components/ListView/CardCollection'; // store needed for withToasts(ChartTable) const mockStore = configureStore([thunk]); const store = mockStore({}); @@ -105,13 +109,17 @@ describe('ChartList', () => { }); const mockedProps = {}; - const wrapper = mount( - - - , - ); + let wrapper; beforeAll(async () => { + wrapper = mount( + + + + + , + ); + await waitForComponentToPaint(wrapper); }); @@ -159,6 +167,18 @@ describe('ChartList', () => { await waitForComponentToPaint(wrapper); expect(wrapper.find(ConfirmStatusChange)).toExist(); }); + + it('renders the Favorite Star column in list view for logged in user', async () => { + wrapper.find('[aria-label="list-view"]').first().simulate('click'); + await waitForComponentToPaint(wrapper); + expect(wrapper.find(TableCollection).find(FaveStar)).toExist(); + }); + + it('renders the Favorite Star in card view for logged in user', async () => { + wrapper.find('[aria-label="card-view"]').first().simulate('click'); + await waitForComponentToPaint(wrapper); + expect(wrapper.find(CardCollection).find(FaveStar)).toExist(); + }); }); describe('RTL', () => { @@ -201,3 +221,39 @@ describe('RTL', () => { expect(importTooltip).toBeInTheDocument(); }); }); + +describe('ChartList - anonymous view', () => { + const mockedProps = {}; + const mockUserLoggedOut = {}; + let wrapper; + + beforeAll(async () => { + fetchMock.resetHistory(); + wrapper = mount( + + + + + , + ); + + await waitForComponentToPaint(wrapper); + }); + + afterAll(() => { + cleanup(); + fetch.resetMocks(); + }); + + it('does not render the Favorite Star column in list view for anonymous user', async () => { + wrapper.find('[aria-label="list-view"]').first().simulate('click'); + await waitForComponentToPaint(wrapper); + expect(wrapper.find(TableCollection).find(FaveStar)).not.toExist(); + }); + + it('does not render the Favorite Star in card view for anonymous user', async () => { + wrapper.find('[aria-label="card-view"]').first().simulate('click'); + await waitForComponentToPaint(wrapper); + expect(wrapper.find(CardCollection).find(FaveStar)).not.toExist(); + }); +}); diff --git a/superset-frontend/src/views/CRUD/chart/ChartList.tsx b/superset-frontend/src/views/CRUD/chart/ChartList.tsx index 
2645aa41c74ba..637e9b6ea72c0 100644 --- a/superset-frontend/src/views/CRUD/chart/ChartList.tsx +++ b/superset-frontend/src/views/CRUD/chart/ChartList.tsx @@ -22,7 +22,7 @@ import { SupersetClient, t, } from '@superset-ui/core'; -import React, { useMemo, useState } from 'react'; +import React, { useState, useMemo, useCallback } from 'react'; import rison from 'rison'; import { uniqBy } from 'lodash'; import moment from 'moment'; @@ -146,7 +146,11 @@ const Actions = styled.div` `; function ChartList(props: ChartListProps) { - const { addDangerToast, addSuccessToast } = props; + const { + addDangerToast, + addSuccessToast, + user: { userId }, + } = props; const { state: { @@ -180,7 +184,6 @@ function ChartList(props: ChartListProps) { const [passwordFields, setPasswordFields] = useState([]); const [preparingExport, setPreparingExport] = useState(false); - const { userId } = props.user; // TODO: Fix usage of localStorage keying on the user id const userSettings = dangerouslyGetItemDoNotUse(userId?.toString(), null) as { thumbnails: boolean; @@ -235,27 +238,25 @@ function ChartList(props: ChartListProps) { const columns = useMemo( () => [ - ...(props.user.userId - ? [ - { - Cell: ({ - row: { - original: { id }, - }, - }: any) => ( - - ), - Header: '', - id: 'id', - disableSortBy: true, - size: 'sm', - }, - ] - : []), + { + Cell: ({ + row: { + original: { id }, + }, + }: any) => + userId && ( + + ), + Header: '', + id: 'id', + disableSortBy: true, + size: 'xs', + hidden: !userId, + }, { Cell: ({ row: { @@ -451,10 +452,15 @@ function ChartList(props: ChartListProps) { }, ], [ + userId, canEdit, canDelete, canExport, - ...(props.user.userId ? [favoriteStatus] : []), + saveFavoriteStatus, + favoriteStatus, + refreshData, + addSuccessToast, + addDangerToast, ], ); @@ -552,7 +558,7 @@ function ChartList(props: ChartListProps) { fetchSelects: createFetchDatasets, paginate: true, }, - ...(props.user.userId ? [favoritesFilter] : []), + ...(userId ? 
[favoritesFilter] : []), { Header: t('Certified'), id: 'id', @@ -596,8 +602,8 @@ function ChartList(props: ChartListProps) { }, ]; - function renderCard(chart: Chart) { - return ( + const renderCard = useCallback( + (chart: Chart) => ( - ); - } + ), + [ + addDangerToast, + addSuccessToast, + bulkSelectEnabled, + favoriteStatus, + hasPerm, + loading, + ], + ); + const subMenuButtons: SubMenuProps['buttons'] = []; if (canDelete || canExport) { subMenuButtons.push({ diff --git a/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx b/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx index 73a0f8e9cc080..8e96b95f0dc84 100644 --- a/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx +++ b/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx @@ -65,7 +65,7 @@ export default function SyntaxHighlighterCopy({ language: 'sql' | 'markdown' | 'html' | 'json'; }) { function copyToClipboard(textToCopy: string) { - copyTextToClipboard(textToCopy) + copyTextToClipboard(() => Promise.resolve(textToCopy)) .then(() => { if (addSuccessToast) { addSuccessToast(t('SQL Copied!')); diff --git a/superset-frontend/src/views/CRUD/data/database/DatabaseModal/DatabaseConnectionForm/CommonParameters.tsx b/superset-frontend/src/views/CRUD/data/database/DatabaseModal/DatabaseConnectionForm/CommonParameters.tsx index a07c9c9498158..34c21466bec8d 100644 --- a/superset-frontend/src/views/CRUD/data/database/DatabaseModal/DatabaseConnectionForm/CommonParameters.tsx +++ b/superset-frontend/src/views/CRUD/data/database/DatabaseModal/DatabaseConnectionForm/CommonParameters.tsx @@ -122,7 +122,7 @@ export const passwordField = ({ id="password" name="password" required={required} - type={isEditMode && 'password'} + visibilityToggle={!isEditMode} value={db?.parameters?.password} validationMethods={{ onBlur: getValidation }} errorMessage={validationErrors?.password} diff --git a/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx b/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx index df3f16a858c13..d2dc6aff9c3f7 100644 --- a/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx +++ b/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx @@ -210,8 +210,10 @@ function SavedQueryList({ const copyQueryLink = useCallback( (id: number) => { - copyTextToClipboard( - `${window.location.origin}/superset/sqllab?savedQueryId=${id}`, + copyTextToClipboard(() => + Promise.resolve( + `${window.location.origin}/superset/sqllab?savedQueryId=${id}`, + ), ) .then(() => { addSuccessToast(t('Link Copied!')); diff --git a/superset-frontend/src/views/CRUD/hooks.ts b/superset-frontend/src/views/CRUD/hooks.ts index 18349b305c212..ed49da1e8cfec 100644 --- a/superset-frontend/src/views/CRUD/hooks.ts +++ b/superset-frontend/src/views/CRUD/hooks.ts @@ -611,8 +611,10 @@ export const copyQueryLink = ( addDangerToast: (arg0: string) => void, addSuccessToast: (arg0: string) => void, ) => { - copyTextToClipboard( - `${window.location.origin}/superset/sqllab?savedQueryId=${id}`, + copyTextToClipboard(() => + Promise.resolve( + `${window.location.origin}/superset/sqllab?savedQueryId=${id}`, + ), ) .then(() => { addSuccessToast(t('Link Copied!')); diff --git a/superset-frontend/src/views/CRUD/utils.tsx b/superset-frontend/src/views/CRUD/utils.tsx index 31f3d4c9edeb1..a0004a844b7d3 100644 --- a/superset-frontend/src/views/CRUD/utils.tsx +++ b/superset-frontend/src/views/CRUD/utils.tsx @@ 
-246,7 +246,7 @@ export function handleChartDelete( addDangerToast: (arg0: string) => void, refreshData: (arg0?: FetchDataConfig | null) => void, chartFilter?: string, - userId?: number, + userId?: string | number, ) { const filters = { pageIndex: 0, diff --git a/superset-frontend/src/views/components/MenuRight.tsx b/superset-frontend/src/views/components/MenuRight.tsx index 1c46f6bcf079d..4c34b883491c4 100644 --- a/superset-frontend/src/views/components/MenuRight.tsx +++ b/superset-frontend/src/views/components/MenuRight.tsx @@ -18,7 +18,8 @@ */ import React, { Fragment, useState, useEffect } from 'react'; import rison from 'rison'; -import { MainNav as Menu } from 'src/components/Menu'; +import { useSelector } from 'react-redux'; +import { Link } from 'react-router-dom'; import { t, styled, @@ -26,12 +27,12 @@ import { SupersetTheme, SupersetClient, } from '@superset-ui/core'; +import { MainNav as Menu } from 'src/components/Menu'; import { Tooltip } from 'src/components/Tooltip'; -import { Link } from 'react-router-dom'; import Icons from 'src/components/Icons'; import findPermission, { isUserAdmin } from 'src/dashboard/util/findPermission'; -import { useSelector } from 'react-redux'; import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes'; +import { RootState } from 'src/dashboard/types'; import LanguagePicker from './LanguagePicker'; import DatabaseModal from '../CRUD/data/database/DatabaseModal'; import { uploadUserPerms } from '../CRUD/utils'; @@ -89,6 +90,9 @@ const RightMenu = ({ const user = useSelector( state => state.user, ); + const dashboardId = useSelector( + state => state.dashboardInfo?.id, + ); const { roles } = user; const { @@ -162,7 +166,9 @@ const RightMenu = ({ }, { label: t('Chart'), - url: '/chart/add', + url: Number.isInteger(dashboardId) + ? 
`/chart/add?dashboard_id=${dashboardId}` + : '/chart/add', icon: 'fa-fw fa-bar-chart', perm: 'can_write', view: 'Chart', diff --git a/superset/cachekeys/schemas.py b/superset/cachekeys/schemas.py index a44a7c545add4..3d913e8b5f6e7 100644 --- a/superset/cachekeys/schemas.py +++ b/superset/cachekeys/schemas.py @@ -22,6 +22,7 @@ datasource_type_description, datasource_uid_description, ) +from superset.utils.core import DatasourceType class Datasource(Schema): @@ -36,7 +37,7 @@ class Datasource(Schema): ) datasource_type = fields.String( description=datasource_type_description, - validate=validate.OneOf(choices=("druid", "table", "view")), + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), required=True, ) diff --git a/superset/charts/schemas.py b/superset/charts/schemas.py index 6a05e4d9942bc..8a82e364be47c 100644 --- a/superset/charts/schemas.py +++ b/superset/charts/schemas.py @@ -31,6 +31,7 @@ from superset.utils import pandas_postprocessing, schema as utils from superset.utils.core import ( AnnotationType, + DatasourceType, FilterOperator, PostProcessingBoxplotWhiskerType, PostProcessingContributionOrientation, @@ -198,7 +199,7 @@ class ChartPostSchema(Schema): datasource_id = fields.Integer(description=datasource_id_description, required=True) datasource_type = fields.String( description=datasource_type_description, - validate=validate.OneOf(choices=("druid", "table", "view")), + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), required=True, ) datasource_name = fields.String( @@ -244,7 +245,7 @@ class ChartPutSchema(Schema): ) datasource_type = fields.String( description=datasource_type_description, - validate=validate.OneOf(choices=("druid", "table", "view")), + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), allow_none=True, ) dashboards = fields.List(fields.Integer(description=dashboards_description)) @@ -983,7 +984,7 @@ class ChartDataDatasourceSchema(Schema): ) type = fields.String( description="Datasource type", - validate=validate.OneOf(choices=("druid", "table")), + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), ) diff --git a/superset/commands/exceptions.py b/superset/commands/exceptions.py index 2a60318b46e05..a661ef4d6047d 100644 --- a/superset/commands/exceptions.py +++ b/superset/commands/exceptions.py @@ -115,8 +115,24 @@ def __init__(self) -> None: super().__init__([_("Some roles do not exist")], field_name="roles") +class DatasourceTypeInvalidError(ValidationError): + status = 422 + + def __init__(self) -> None: + super().__init__( + [_("Datasource type is invalid")], field_name="datasource_type" + ) + + class DatasourceNotFoundValidationError(ValidationError): status = 404 def __init__(self) -> None: - super().__init__([_("Dataset does not exist")], field_name="datasource_id") + super().__init__([_("Datasource does not exist")], field_name="datasource_id") + + +class QueryNotFoundValidationError(ValidationError): + status = 404 + + def __init__(self) -> None: + super().__init__([_("Query does not exist")], field_name="datasource_id") diff --git a/superset/common/utils/query_cache_manager.py b/superset/common/utils/query_cache_manager.py index 92fb3561234f4..76aa5ddef32e3 100644 --- a/superset/common/utils/query_cache_manager.py +++ b/superset/common/utils/query_cache_manager.py @@ -187,3 +187,18 @@ def set( """ if key: set_and_log_cache(_cache[region], key, value, timeout, datasource_uid) + + @staticmethod + def delete( + key: Optional[str], + region: CacheRegion = 
CacheRegion.DEFAULT, + ) -> None: + if key: + _cache[region].delete(key) + + @staticmethod + def has( + key: Optional[str], + region: CacheRegion = CacheRegion.DEFAULT, + ) -> bool: + return bool(_cache[region].get(key)) if key else False diff --git a/superset/config.py b/superset/config.py index c4af7bc8a57ee..8a5ec248fb8a3 100644 --- a/superset/config.py +++ b/superset/config.py @@ -425,6 +425,7 @@ def _try_json_readsha(filepath: str, length: int) -> Optional[str]: "UX_BETA": False, "GENERIC_CHART_AXES": False, "ALLOW_ADHOC_SUBQUERY": False, + "USE_ANALAGOUS_COLORS": True, # Apply RLS rules to SQL Lab queries. This requires parsing and manipulating the # query, and might break queries and/or allow users to bypass RLS. Use with care! "RLS_IN_SQLLAB": False, diff --git a/superset/connectors/base/models.py b/superset/connectors/base/models.py index 73b841ac1687a..7809dab4ae9c5 100644 --- a/superset/connectors/base/models.py +++ b/superset/connectors/base/models.py @@ -449,7 +449,7 @@ def external_metadata(self) -> List[Dict[str, str]]: def get_query_str(self, query_obj: QueryObjectDict) -> str: """Returns a query as a string - This is used to be displayed to the user so that she/he can + This is used to be displayed to the user so that they can understand what is taking place behind the scene""" raise NotImplementedError() diff --git a/superset/constants.py b/superset/constants.py index 98ce7c5d112f3..72fcc3fdb2bce 100644 --- a/superset/constants.py +++ b/superset/constants.py @@ -129,6 +129,7 @@ class RouteMethod: # pylint: disable=too-few-public-methods "available": "read", "validate_sql": "read", "get_data": "read", + "samples": "read", } EXTRA_FORM_DATA_APPEND_KEYS = { diff --git a/superset/dao/datasource/dao.py b/superset/dao/datasource/dao.py index 8b4845db3c51b..caa45564aa250 100644 --- a/superset/dao/datasource/dao.py +++ b/superset/dao/datasource/dao.py @@ -39,11 +39,11 @@ class DatasourceDAO(BaseDAO): sources: Dict[DatasourceType, Type[Datasource]] = { - DatasourceType.SQLATABLE: SqlaTable, + DatasourceType.TABLE: SqlaTable, DatasourceType.QUERY: Query, DatasourceType.SAVEDQUERY: SavedQuery, DatasourceType.DATASET: Dataset, - DatasourceType.TABLE: Table, + DatasourceType.SLTABLE: Table, } @classmethod @@ -66,7 +66,7 @@ def get_datasource( @classmethod def get_all_sqlatables_datasources(cls, session: Session) -> List[Datasource]: - source_class = DatasourceDAO.sources[DatasourceType.SQLATABLE] + source_class = DatasourceDAO.sources[DatasourceType.TABLE] qry = session.query(source_class) qry = source_class.default_query(qry) return qry.all() diff --git a/superset/dashboards/filter_state/commands/create.py b/superset/dashboards/filter_state/commands/create.py index 137623027a1fc..18dff8928fe83 100644 --- a/superset/dashboards/filter_state/commands/create.py +++ b/superset/dashboards/filter_state/commands/create.py @@ -14,11 +14,13 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. 
+from typing import cast + from flask import session -from superset.dashboards.dao import DashboardDAO +from superset.dashboards.filter_state.commands.utils import check_access from superset.extensions import cache_manager -from superset.key_value.utils import random_key +from superset.key_value.utils import get_owner, random_key from superset.temporary_cache.commands.create import CreateTemporaryCacheCommand from superset.temporary_cache.commands.entry import Entry from superset.temporary_cache.commands.parameters import CommandParameters @@ -34,10 +36,9 @@ def create(self, cmd_params: CommandParameters) -> str: key = cache_manager.filter_state_cache.get(contextual_key) if not key or not tab_id: key = random_key() - value = cmd_params.value - dashboard = DashboardDAO.get_by_id_or_slug(str(resource_id)) - if dashboard and value: - entry: Entry = {"owner": actor.get_user_id(), "value": value} - cache_manager.filter_state_cache.set(cache_key(resource_id, key), entry) - cache_manager.filter_state_cache.set(contextual_key, key) + value = cast(str, cmd_params.value) # schema ensures that value is not optional + check_access(resource_id) + entry: Entry = {"owner": get_owner(actor), "value": value} + cache_manager.filter_state_cache.set(cache_key(resource_id, key), entry) + cache_manager.filter_state_cache.set(contextual_key, key) return key diff --git a/superset/dashboards/filter_state/commands/delete.py b/superset/dashboards/filter_state/commands/delete.py index 155c63f1084c6..3ddc08fc51900 100644 --- a/superset/dashboards/filter_state/commands/delete.py +++ b/superset/dashboards/filter_state/commands/delete.py @@ -16,8 +16,9 @@ # under the License. from flask import session -from superset.dashboards.dao import DashboardDAO +from superset.dashboards.filter_state.commands.utils import check_access from superset.extensions import cache_manager +from superset.key_value.utils import get_owner from superset.temporary_cache.commands.delete import DeleteTemporaryCacheCommand from superset.temporary_cache.commands.entry import Entry from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError @@ -30,14 +31,13 @@ def delete(self, cmd_params: CommandParameters) -> bool: resource_id = cmd_params.resource_id actor = cmd_params.actor key = cache_key(resource_id, cmd_params.key) - dashboard = DashboardDAO.get_by_id_or_slug(str(resource_id)) - if dashboard: - entry: Entry = cache_manager.filter_state_cache.get(key) - if entry: - if entry["owner"] != actor.get_user_id(): - raise TemporaryCacheAccessDeniedError() - tab_id = cmd_params.tab_id - contextual_key = cache_key(session.get("_id"), tab_id, resource_id) - cache_manager.filter_state_cache.delete(contextual_key) - return cache_manager.filter_state_cache.delete(key) + check_access(resource_id) + entry: Entry = cache_manager.filter_state_cache.get(key) + if entry: + if entry["owner"] != get_owner(actor): + raise TemporaryCacheAccessDeniedError() + tab_id = cmd_params.tab_id + contextual_key = cache_key(session.get("_id"), tab_id, resource_id) + cache_manager.filter_state_cache.delete(contextual_key) + return cache_manager.filter_state_cache.delete(key) return False diff --git a/superset/dashboards/filter_state/commands/get.py b/superset/dashboards/filter_state/commands/get.py index 9cdd5bcddcb48..ca7ffa9879a9f 100644 --- a/superset/dashboards/filter_state/commands/get.py +++ b/superset/dashboards/filter_state/commands/get.py @@ -18,7 +18,7 @@ from flask import current_app as app -from superset.dashboards.dao import DashboardDAO 
+from superset.dashboards.filter_state.commands.utils import check_access from superset.extensions import cache_manager from superset.temporary_cache.commands.get import GetTemporaryCacheCommand from superset.temporary_cache.commands.parameters import CommandParameters @@ -34,7 +34,7 @@ def __init__(self, cmd_params: CommandParameters) -> None: def get(self, cmd_params: CommandParameters) -> Optional[str]: resource_id = cmd_params.resource_id key = cache_key(resource_id, cmd_params.key) - DashboardDAO.get_by_id_or_slug(str(resource_id)) + check_access(resource_id) entry = cache_manager.filter_state_cache.get(key) or {} if entry and self._refresh_timeout: cache_manager.filter_state_cache.set(key, entry) diff --git a/superset/dashboards/filter_state/commands/update.py b/superset/dashboards/filter_state/commands/update.py index d27277f9afb97..7f150aae6bae3 100644 --- a/superset/dashboards/filter_state/commands/update.py +++ b/superset/dashboards/filter_state/commands/update.py @@ -14,13 +14,13 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. -from typing import Optional +from typing import cast, Optional from flask import session -from superset.dashboards.dao import DashboardDAO +from superset.dashboards.filter_state.commands.utils import check_access from superset.extensions import cache_manager -from superset.key_value.utils import random_key +from superset.key_value.utils import get_owner, random_key from superset.temporary_cache.commands.entry import Entry from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError from superset.temporary_cache.commands.parameters import CommandParameters @@ -33,28 +33,23 @@ def update(self, cmd_params: CommandParameters) -> Optional[str]: resource_id = cmd_params.resource_id actor = cmd_params.actor key = cmd_params.key - value = cmd_params.value - dashboard = DashboardDAO.get_by_id_or_slug(str(resource_id)) - if dashboard and value: - entry: Entry = cache_manager.filter_state_cache.get( - cache_key(resource_id, key) - ) - if entry: - user_id = actor.get_user_id() - if entry["owner"] != user_id: - raise TemporaryCacheAccessDeniedError() + value = cast(str, cmd_params.value) # schema ensures that value is not optional + check_access(resource_id) + entry: Entry = cache_manager.filter_state_cache.get(cache_key(resource_id, key)) + owner = get_owner(actor) + if entry: + if entry["owner"] != owner: + raise TemporaryCacheAccessDeniedError() - # Generate a new key if tab_id changes or equals 0 - contextual_key = cache_key( - session.get("_id"), cmd_params.tab_id, resource_id - ) - key = cache_manager.filter_state_cache.get(contextual_key) - if not key or not cmd_params.tab_id: - key = random_key() - cache_manager.filter_state_cache.set(contextual_key, key) + # Generate a new key if tab_id changes or equals 0 + contextual_key = cache_key( + session.get("_id"), cmd_params.tab_id, resource_id + ) + key = cache_manager.filter_state_cache.get(contextual_key) + if not key or not cmd_params.tab_id: + key = random_key() + cache_manager.filter_state_cache.set(contextual_key, key) - new_entry: Entry = {"owner": actor.get_user_id(), "value": value} - cache_manager.filter_state_cache.set( - cache_key(resource_id, key), new_entry - ) + new_entry: Entry = {"owner": owner, "value": value} + cache_manager.filter_state_cache.set(cache_key(resource_id, key), new_entry) return key diff --git a/superset/dashboards/filter_state/commands/utils.py 
b/superset/dashboards/filter_state/commands/utils.py new file mode 100644 index 0000000000000..35f940f4343e1 --- /dev/null +++ b/superset/dashboards/filter_state/commands/utils.py @@ -0,0 +1,35 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +from superset.dashboards.commands.exceptions import ( + DashboardAccessDeniedError, + DashboardNotFoundError, +) +from superset.dashboards.dao import DashboardDAO +from superset.temporary_cache.commands.exceptions import ( + TemporaryCacheAccessDeniedError, + TemporaryCacheResourceNotFoundError, +) + + +def check_access(resource_id: int) -> None: + try: + DashboardDAO.get_by_id_or_slug(str(resource_id)) + except DashboardNotFoundError as ex: + raise TemporaryCacheResourceNotFoundError from ex + except DashboardAccessDeniedError as ex: + raise TemporaryCacheAccessDeniedError from ex diff --git a/superset/databases/dao.py b/superset/databases/dao.py index 5e47772cfc635..892ab86ed21df 100644 --- a/superset/databases/dao.py +++ b/superset/databases/dao.py @@ -24,6 +24,7 @@ from superset.models.dashboard import Dashboard from superset.models.slice import Slice from superset.models.sql_lab import TabState +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) @@ -75,7 +76,8 @@ def get_related_objects(cls, database_id: int) -> Dict[str, Any]: charts = ( db.session.query(Slice) .filter( - Slice.datasource_id.in_(dataset_ids), Slice.datasource_type == "table" + Slice.datasource_id.in_(dataset_ids), + Slice.datasource_type == DatasourceType.TABLE, ) .all() ) diff --git a/superset/datasets/api.py b/superset/datasets/api.py index fb01b6ee8c9cc..17e99959e9675 100644 --- a/superset/datasets/api.py +++ b/superset/datasets/api.py @@ -21,8 +21,9 @@ from typing import Any from zipfile import is_zipfile, ZipFile +import simplejson import yaml -from flask import g, request, Response, send_file +from flask import g, make_response, request, Response, send_file from flask_appbuilder.api import expose, protect, rison, safe from flask_appbuilder.models.sqla.interface import SQLAInterface from flask_babel import ngettext @@ -45,11 +46,13 @@ DatasetInvalidError, DatasetNotFoundError, DatasetRefreshFailedError, + DatasetSamplesFailedError, DatasetUpdateFailedError, ) from superset.datasets.commands.export import ExportDatasetsCommand from superset.datasets.commands.importers.dispatcher import ImportDatasetsCommand from superset.datasets.commands.refresh import RefreshDatasetCommand +from superset.datasets.commands.samples import SamplesDatasetCommand from superset.datasets.commands.update import UpdateDatasetCommand from superset.datasets.dao import DatasetDAO from superset.datasets.filters import DatasetCertifiedFilter, DatasetIsNullOrEmptyFilter @@ -60,7 +63,7 @@ get_delete_ids_schema, get_export_ids_schema, ) 
-from superset.utils.core import parse_boolean_string +from superset.utils.core import json_int_dttm_ser, parse_boolean_string from superset.views.base import DatasourceFilter, generate_download_headers from superset.views.base_api import ( BaseSupersetModelRestApi, @@ -90,6 +93,7 @@ class DatasetRestApi(BaseSupersetModelRestApi): "bulk_delete", "refresh", "related_objects", + "samples", } list_columns = [ "id", @@ -760,3 +764,65 @@ def import_(self) -> Response: ) command.run() return self.response(200, message="OK") + + @expose("//samples") + @protect() + @safe + @statsd_metrics + @event_logger.log_this_with_context( + action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.samples", + log_to_statsd=False, + ) + def samples(self, pk: int) -> Response: + """get samples from a Dataset + --- + get: + description: >- + get samples from a Dataset + parameters: + - in: path + schema: + type: integer + name: pk + - in: query + schema: + type: boolean + name: force + responses: + 200: + description: Dataset samples + content: + application/json: + schema: + type: object + properties: + result: + $ref: '#/components/schemas/ChartDataResponseResult' + 401: + $ref: '#/components/responses/401' + 403: + $ref: '#/components/responses/403' + 404: + $ref: '#/components/responses/404' + 422: + $ref: '#/components/responses/422' + 500: + $ref: '#/components/responses/500' + """ + try: + force = parse_boolean_string(request.args.get("force")) + rv = SamplesDatasetCommand(g.user, pk, force).run() + response_data = simplejson.dumps( + {"result": rv}, + default=json_int_dttm_ser, + ignore_nan=True, + ) + resp = make_response(response_data, 200) + resp.headers["Content-Type"] = "application/json; charset=utf-8" + return resp + except DatasetNotFoundError: + return self.response_404() + except DatasetForbiddenError: + return self.response_403() + except DatasetSamplesFailedError as ex: + return self.response_400(message=str(ex)) diff --git a/superset/datasets/commands/exceptions.py b/superset/datasets/commands/exceptions.py index 34c15721abfb6..b743a4355ea06 100644 --- a/superset/datasets/commands/exceptions.py +++ b/superset/datasets/commands/exceptions.py @@ -173,6 +173,10 @@ class DatasetRefreshFailedError(UpdateFailedError): message = _("Dataset could not be updated.") +class DatasetSamplesFailedError(CommandInvalidError): + message = _("Samples for dataset could not be retrieved.") + + class DatasetForbiddenError(ForbiddenError): message = _("Changing this dataset is forbidden") diff --git a/superset/datasets/commands/samples.py b/superset/datasets/commands/samples.py new file mode 100644 index 0000000000000..79ac729be0801 --- /dev/null +++ b/superset/datasets/commands/samples.py @@ -0,0 +1,83 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+import logging
+from typing import Any, Dict, Optional
+
+from flask_appbuilder.security.sqla.models import User
+
+from superset.commands.base import BaseCommand
+from superset.common.chart_data import ChartDataResultType
+from superset.common.query_context_factory import QueryContextFactory
+from superset.common.utils.query_cache_manager import QueryCacheManager
+from superset.connectors.sqla.models import SqlaTable
+from superset.constants import CacheRegion
+from superset.datasets.commands.exceptions import (
+    DatasetForbiddenError,
+    DatasetNotFoundError,
+    DatasetSamplesFailedError,
+)
+from superset.datasets.dao import DatasetDAO
+from superset.exceptions import SupersetSecurityException
+from superset.utils.core import QueryStatus
+from superset.views.base import check_ownership
+
+logger = logging.getLogger(__name__)
+
+
+class SamplesDatasetCommand(BaseCommand):
+    def __init__(self, user: User, model_id: int, force: bool):
+        self._actor = user
+        self._model_id = model_id
+        self._force = force
+        self._model: Optional[SqlaTable] = None
+
+    def run(self) -> Dict[str, Any]:
+        self.validate()
+        if not self._model:
+            raise DatasetNotFoundError()
+
+        qc_instance = QueryContextFactory().create(
+            datasource={
+                "type": self._model.type,
+                "id": self._model.id,
+            },
+            queries=[{}],
+            result_type=ChartDataResultType.SAMPLES,
+            force=self._force,
+        )
+        results = qc_instance.get_payload()
+        try:
+            sample_data = results["queries"][0]
+            error_msg = sample_data.get("error")
+            if sample_data.get("status") == QueryStatus.FAILED and error_msg:
+                cache_key = sample_data.get("cache_key")
+                QueryCacheManager.delete(cache_key, region=CacheRegion.DATA)
+                raise DatasetSamplesFailedError(error_msg)
+            return sample_data
+        except (IndexError, KeyError) as exc:
+            raise DatasetSamplesFailedError from exc
+
+    def validate(self) -> None:
+        # Validate/populate model exists
+        self._model = DatasetDAO.find_by_id(self._model_id)
+        if not self._model:
+            raise DatasetNotFoundError()
+        # Check ownership
+        try:
+            check_ownership(self._model)
+        except SupersetSecurityException as ex:
+            raise DatasetForbiddenError() from ex
diff --git a/superset/datasets/dao.py b/superset/datasets/dao.py
index 89460f3b43c6d..44ab8efa0ce54 100644
--- a/superset/datasets/dao.py
+++ b/superset/datasets/dao.py
@@ -26,6 +26,7 @@
 from superset.models.core import Database
 from superset.models.dashboard import Dashboard
 from superset.models.slice import Slice
+from superset.utils.core import DatasourceType
 from superset.views.base import DatasourceFilter

 logger = logging.getLogger(__name__)
@@ -56,7 +57,8 @@ def get_related_objects(database_id: int) -> Dict[str, Any]:
         charts = (
             db.session.query(Slice)
             .filter(
-                Slice.datasource_id == database_id, Slice.datasource_type == "table"
+                Slice.datasource_id == database_id,
+                Slice.datasource_type == DatasourceType.TABLE,
             )
             .all()
         )
diff --git a/superset/db_engine_specs/bigquery.py b/superset/db_engine_specs/bigquery.py
index daac53d3ab4a4..e33457c79abb1 100644
--- a/superset/db_engine_specs/bigquery.py
+++ b/superset/db_engine_specs/bigquery.py
@@ -140,6 +140,7 @@ class BigQueryEngineSpec(BaseEngineSpec):
         "PT1H": "{func}({col}, HOUR)",
         "P1D": "{func}({col}, DAY)",
         "P1W": "{func}({col}, WEEK)",
+        "1969-12-29T00:00:00Z/P1W": "{func}({col}, ISOWEEK)",
         "P1M": "{func}({col}, MONTH)",
         "P3M": "{func}({col}, QUARTER)",
         "P1Y": "{func}({col}, YEAR)",
diff --git a/superset/db_engine_specs/gsheets.py b/superset/db_engine_specs/gsheets.py
index ba13389e26898..740c1bc33d367 100644
---
a/superset/db_engine_specs/gsheets.py +++ b/superset/db_engine_specs/gsheets.py @@ -171,6 +171,14 @@ def validate_parameters( if not table_catalog: # Allowing users to submit empty catalogs + errors.append( + SupersetError( + message="Sheet name is required", + error_type=SupersetErrorType.CONNECTION_MISSING_PARAMETERS_ERROR, + level=ErrorLevel.WARNING, + extra={"catalog": {"idx": 0, "name": True}}, + ), + ) return errors # We need a subject in case domain wide delegation is set, otherwise the diff --git a/superset/db_engine_specs/postgres.py b/superset/db_engine_specs/postgres.py index f81e2f7b3e4a6..c5ffc79ee7100 100644 --- a/superset/db_engine_specs/postgres.py +++ b/superset/db_engine_specs/postgres.py @@ -21,6 +21,7 @@ from typing import Any, Dict, List, Optional, Pattern, Tuple, TYPE_CHECKING from flask_babel import gettext as __ +from psycopg2.extensions import binary_types, string_types from sqlalchemy.dialects.postgresql import ARRAY, DOUBLE_PRECISION, ENUM, JSON from sqlalchemy.dialects.postgresql.base import PGInspector from sqlalchemy.types import String @@ -287,6 +288,14 @@ def get_column_spec( native_type, column_type_mappings=column_type_mappings ) + @classmethod + def get_datatype(cls, type_code: Any) -> Optional[str]: + types = binary_types.copy() + types.update(string_types) + if type_code in types: + return types[type_code].name + return None + @classmethod def get_cancel_query_id(cls, cursor: Any, query: Query) -> Optional[str]: """ diff --git a/superset/examples/birth_names.py b/superset/examples/birth_names.py index de0018ce7cd95..6b37fe9d08dcf 100644 --- a/superset/examples/birth_names.py +++ b/superset/examples/birth_names.py @@ -29,6 +29,7 @@ from superset.models.core import Database from superset.models.dashboard import Dashboard from superset.models.slice import Slice +from superset.utils.core import DatasourceType from ..utils.database import get_example_database from .helpers import ( @@ -205,13 +206,16 @@ def create_slices(tbl: SqlaTable, admin_owner: bool) -> Tuple[List[Slice], List[ if admin_owner: slice_props = dict( datasource_id=tbl.id, - datasource_type="table", + datasource_type=DatasourceType.TABLE, owners=[admin], created_by=admin, ) else: slice_props = dict( - datasource_id=tbl.id, datasource_type="table", owners=[], created_by=admin + datasource_id=tbl.id, + datasource_type=DatasourceType.TABLE, + owners=[], + created_by=admin, ) print("Creating some slices") diff --git a/superset/examples/country_map.py b/superset/examples/country_map.py index 049de6650c447..c959a92085fc0 100644 --- a/superset/examples/country_map.py +++ b/superset/examples/country_map.py @@ -24,6 +24,7 @@ from superset import db from superset.connectors.sqla.models import SqlMetric from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .helpers import ( get_example_data, @@ -112,7 +113,7 @@ def load_country_map_data(only_metadata: bool = False, force: bool = False) -> N slc = Slice( slice_name="Birth in France by department in 2016", viz_type="country_map", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) diff --git a/superset/examples/deck.py b/superset/examples/deck.py index f6c7a8c6996cf..418ed9d28ba1e 100644 --- a/superset/examples/deck.py +++ b/superset/examples/deck.py @@ -19,6 +19,7 @@ from superset import db from superset.models.dashboard import Dashboard from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .helpers 
import ( get_slice_json, @@ -213,7 +214,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Scatterplot", viz_type="deck_scatter", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) @@ -248,7 +249,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Screen grid", viz_type="deck_screengrid", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) @@ -284,7 +285,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Hexagons", viz_type="deck_hex", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) @@ -321,7 +322,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Grid", viz_type="deck_grid", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) @@ -410,7 +411,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Polygons", viz_type="deck_polygon", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=polygon_tbl.id, params=get_slice_json(slice_data), ) @@ -460,7 +461,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Arcs", viz_type="deck_arc", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=db.session.query(table) .filter_by(table_name="flights") .first() @@ -512,7 +513,7 @@ def load_deck_dash() -> None: # pylint: disable=too-many-statements slc = Slice( slice_name="Deck.gl Path", viz_type="deck_path", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=db.session.query(table) .filter_by(table_name="bart_lines") .first() diff --git a/superset/examples/energy.py b/superset/examples/energy.py index 137d7fe735010..d88d693651d42 100644 --- a/superset/examples/energy.py +++ b/superset/examples/energy.py @@ -25,6 +25,7 @@ from superset import db from superset.connectors.sqla.models import SqlMetric from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .helpers import ( get_example_data, @@ -81,7 +82,7 @@ def load_energy( slc = Slice( slice_name="Energy Sankey", viz_type="sankey", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=textwrap.dedent( """\ @@ -105,7 +106,7 @@ def load_energy( slc = Slice( slice_name="Energy Force Layout", viz_type="graph_chart", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=textwrap.dedent( """\ @@ -129,7 +130,7 @@ def load_energy( slc = Slice( slice_name="Heatmap", viz_type="heatmap", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=textwrap.dedent( """\ diff --git a/superset/examples/long_lat.py b/superset/examples/long_lat.py index 4245be1057fe7..ba9824bb43fea 100644 --- a/superset/examples/long_lat.py +++ b/superset/examples/long_lat.py @@ -24,6 +24,7 @@ import superset.utils.database as database_utils from superset import db from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .helpers import ( get_example_data, @@ -113,7 +114,7 @@ def 
load_long_lat_data(only_metadata: bool = False, force: bool = False) -> None slc = Slice( slice_name="Mapbox Long/Lat", viz_type="mapbox", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) diff --git a/superset/examples/multi_line.py b/superset/examples/multi_line.py index 1887fd09069e7..6ca023cdcf168 100644 --- a/superset/examples/multi_line.py +++ b/superset/examples/multi_line.py @@ -18,6 +18,7 @@ from superset import db from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .birth_names import load_birth_names from .helpers import merge_slice, misc_dash_slices @@ -35,7 +36,7 @@ def load_multi_line(only_metadata: bool = False) -> None: ] slc = Slice( - datasource_type="table", # not true, but needed + datasource_type=DatasourceType.TABLE, # not true, but needed datasource_id=1, # cannot be empty slice_name="Multi Line", viz_type="line_multi", diff --git a/superset/examples/multiformat_time_series.py b/superset/examples/multiformat_time_series.py index 1e9ee497db6d1..9b8bb22c98e89 100644 --- a/superset/examples/multiformat_time_series.py +++ b/superset/examples/multiformat_time_series.py @@ -21,6 +21,7 @@ from superset import app, db from superset.models.slice import Slice +from superset.utils.core import DatasourceType from ..utils.database import get_example_database from .helpers import ( @@ -120,7 +121,7 @@ def load_multiformat_time_series( # pylint: disable=too-many-locals slc = Slice( slice_name=f"Calendar Heatmap multiformat {i}", viz_type="cal_heatmap", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) diff --git a/superset/examples/random_time_series.py b/superset/examples/random_time_series.py index 0f39b95bd16cb..152b63e1cc326 100644 --- a/superset/examples/random_time_series.py +++ b/superset/examples/random_time_series.py @@ -21,6 +21,7 @@ import superset.utils.database as database_utils from superset import app, db from superset.models.slice import Slice +from superset.utils.core import DatasourceType from .helpers import ( get_example_data, @@ -89,7 +90,7 @@ def load_random_time_series_data( slc = Slice( slice_name="Calendar Heatmap", viz_type="cal_heatmap", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json(slice_data), ) diff --git a/superset/examples/world_bank.py b/superset/examples/world_bank.py index 421818724f83c..39b982aa52468 100644 --- a/superset/examples/world_bank.py +++ b/superset/examples/world_bank.py @@ -29,6 +29,7 @@ from superset.models.dashboard import Dashboard from superset.models.slice import Slice from superset.utils import core as utils +from superset.utils.core import DatasourceType from ..connectors.base.models import BaseDatasource from .helpers import ( @@ -172,7 +173,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Region Filter", viz_type="filter_box", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -201,7 +202,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="World's Population", viz_type="big_number", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -215,7 +216,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Most Populated Countries", viz_type="table", - 
datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -227,7 +228,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Growth Rate", viz_type="line", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -241,7 +242,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="% Rural", viz_type="world_map", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -254,7 +255,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Life Expectancy VS Rural %", viz_type="bubble", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -298,7 +299,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Rural Breakdown", viz_type="sunburst", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -313,7 +314,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="World's Pop Growth", viz_type="area", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -327,7 +328,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Box plot", viz_type="box_plot", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -343,7 +344,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Treemap", viz_type="treemap", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, @@ -357,7 +358,7 @@ def create_slices(tbl: BaseDatasource) -> List[Slice]: Slice( slice_name="Parallel Coordinates", viz_type="para", - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=tbl.id, params=get_slice_json( defaults, diff --git a/superset/explore/form_data/api.py b/superset/explore/form_data/api.py index dc6ee7ea94cc3..00c8730ee411b 100644 --- a/superset/explore/form_data/api.py +++ b/superset/explore/form_data/api.py @@ -21,15 +21,7 @@ from flask_appbuilder.api import BaseApi, expose, protect, safe from marshmallow import ValidationError -from superset.charts.commands.exceptions import ( - ChartAccessDeniedError, - ChartNotFoundError, -) from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.datasets.commands.exceptions import ( - DatasetAccessDeniedError, - DatasetNotFoundError, -) from superset.explore.form_data.commands.create import CreateFormDataCommand from superset.explore.form_data.commands.delete import DeleteFormDataCommand from superset.explore.form_data.commands.get import GetFormDataCommand @@ -37,7 +29,10 @@ from superset.explore.form_data.commands.update import UpdateFormDataCommand from superset.explore.form_data.schemas import FormDataPostSchema, FormDataPutSchema from superset.extensions import event_logger -from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError +from superset.temporary_cache.commands.exceptions import ( + TemporaryCacheAccessDeniedError, + TemporaryCacheResourceNotFoundError, +) from superset.views.base_api import requires_json logger = logging.getLogger(__name__) @@ -109,7 +104,8 @@ def post(self) 
-> Response: tab_id = request.args.get("tab_id") args = CommandParameters( actor=g.user, - dataset_id=item["dataset_id"], + datasource_id=item["datasource_id"], + datasource_type=item["datasource_type"], chart_id=item.get("chart_id"), tab_id=tab_id, form_data=item["form_data"], @@ -118,13 +114,9 @@ def post(self) -> Response: return self.response(201, key=key) except ValidationError as ex: return self.response(400, message=ex.messages) - except ( - ChartAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) @expose("/form_data/", methods=["PUT"]) @@ -132,7 +124,7 @@ def post(self) -> Response: @safe @event_logger.log_this_with_context( action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.put", - log_to_statsd=False, + log_to_statsd=True, ) @requires_json def put(self, key: str) -> Response: @@ -183,7 +175,8 @@ def put(self, key: str) -> Response: tab_id = request.args.get("tab_id") args = CommandParameters( actor=g.user, - dataset_id=item["dataset_id"], + datasource_id=item["datasource_id"], + datasource_type=item["datasource_type"], chart_id=item.get("chart_id"), tab_id=tab_id, key=key, @@ -195,13 +188,9 @@ def put(self, key: str) -> Response: return self.response(200, key=result) except ValidationError as ex: return self.response(400, message=ex.messages) - except ( - ChartAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) @expose("/form_data/", methods=["GET"]) @@ -209,7 +198,7 @@ def put(self, key: str) -> Response: @safe @event_logger.log_this_with_context( action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.get", - log_to_statsd=False, + log_to_statsd=True, ) def get(self, key: str) -> Response: """Retrives a form_data. @@ -250,13 +239,9 @@ def get(self, key: str) -> Response: if not form_data: return self.response_404() return self.response(200, form_data=form_data) - except ( - ChartAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) @expose("/form_data/", methods=["DELETE"]) @@ -264,7 +249,7 @@ def get(self, key: str) -> Response: @safe @event_logger.log_this_with_context( action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.delete", - log_to_statsd=False, + log_to_statsd=True, ) def delete(self, key: str) -> Response: """Deletes a form_data. 
@@ -306,11 +291,7 @@ def delete(self, key: str) -> Response: if not result: return self.response_404() return self.response(200, message="Deleted successfully") - except ( - ChartAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) diff --git a/superset/explore/form_data/commands/create.py b/superset/explore/form_data/commands/create.py index 7b1f866c505df..7946980c82684 100644 --- a/superset/explore/form_data/commands/create.py +++ b/superset/explore/form_data/commands/create.py @@ -22,9 +22,9 @@ from superset.commands.base import BaseCommand from superset.explore.form_data.commands.parameters import CommandParameters from superset.explore.form_data.commands.state import TemporaryExploreState -from superset.explore.utils import check_access +from superset.explore.form_data.commands.utils import check_access from superset.extensions import cache_manager -from superset.key_value.utils import random_key +from superset.key_value.utils import get_owner, random_key from superset.temporary_cache.commands.exceptions import TemporaryCacheCreateFailedError from superset.temporary_cache.utils import cache_key from superset.utils.schema import validate_json @@ -39,20 +39,24 @@ def __init__(self, cmd_params: CommandParameters): def run(self) -> str: self.validate() try: - dataset_id = self._cmd_params.dataset_id + datasource_id = self._cmd_params.datasource_id + datasource_type = self._cmd_params.datasource_type chart_id = self._cmd_params.chart_id tab_id = self._cmd_params.tab_id actor = self._cmd_params.actor form_data = self._cmd_params.form_data - check_access(dataset_id, chart_id, actor) - contextual_key = cache_key(session.get("_id"), tab_id, dataset_id, chart_id) + check_access(datasource_id, chart_id, actor, datasource_type) + contextual_key = cache_key( + session.get("_id"), tab_id, datasource_id, chart_id, datasource_type + ) key = cache_manager.explore_form_data_cache.get(contextual_key) if not key or not tab_id: key = random_key() if form_data: state: TemporaryExploreState = { - "owner": actor.get_user_id(), - "dataset_id": dataset_id, + "owner": get_owner(actor), + "datasource_id": datasource_id, + "datasource_type": datasource_type, "chart_id": chart_id, "form_data": form_data, } diff --git a/superset/explore/form_data/commands/delete.py b/superset/explore/form_data/commands/delete.py index ec537313d2ba0..598ece3f080fc 100644 --- a/superset/explore/form_data/commands/delete.py +++ b/superset/explore/form_data/commands/delete.py @@ -16,6 +16,7 @@ # under the License. 
import logging from abc import ABC +from typing import Optional from flask import session from sqlalchemy.exc import SQLAlchemyError @@ -23,13 +24,15 @@ from superset.commands.base import BaseCommand from superset.explore.form_data.commands.parameters import CommandParameters from superset.explore.form_data.commands.state import TemporaryExploreState -from superset.explore.utils import check_access +from superset.explore.form_data.commands.utils import check_access from superset.extensions import cache_manager +from superset.key_value.utils import get_owner from superset.temporary_cache.commands.exceptions import ( TemporaryCacheAccessDeniedError, TemporaryCacheDeleteFailedError, ) from superset.temporary_cache.utils import cache_key +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) @@ -46,14 +49,15 @@ def run(self) -> bool: key ) if state: - dataset_id = state["dataset_id"] - chart_id = state["chart_id"] - check_access(dataset_id, chart_id, actor) - if state["owner"] != actor.get_user_id(): + datasource_id: int = state["datasource_id"] + chart_id: Optional[int] = state["chart_id"] + datasource_type = DatasourceType(state["datasource_type"]) + check_access(datasource_id, chart_id, actor, datasource_type) + if state["owner"] != get_owner(actor): raise TemporaryCacheAccessDeniedError() tab_id = self._cmd_params.tab_id contextual_key = cache_key( - session.get("_id"), tab_id, dataset_id, chart_id + session.get("_id"), tab_id, datasource_id, chart_id, datasource_type ) cache_manager.explore_form_data_cache.delete(contextual_key) return cache_manager.explore_form_data_cache.delete(key) diff --git a/superset/explore/form_data/commands/get.py b/superset/explore/form_data/commands/get.py index 5b582008218cc..982c8e3b4b7d7 100644 --- a/superset/explore/form_data/commands/get.py +++ b/superset/explore/form_data/commands/get.py @@ -24,9 +24,10 @@ from superset.commands.base import BaseCommand from superset.explore.form_data.commands.parameters import CommandParameters from superset.explore.form_data.commands.state import TemporaryExploreState -from superset.explore.utils import check_access +from superset.explore.form_data.commands.utils import check_access from superset.extensions import cache_manager from superset.temporary_cache.commands.exceptions import TemporaryCacheGetFailedError +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) @@ -45,7 +46,12 @@ def run(self) -> Optional[str]: key ) if state: - check_access(state["dataset_id"], state["chart_id"], actor) + check_access( + state["datasource_id"], + state["chart_id"], + actor, + DatasourceType(state["datasource_type"]), + ) if self._refresh_timeout: cache_manager.explore_form_data_cache.set(key, state) return state["form_data"] diff --git a/superset/explore/form_data/commands/parameters.py b/superset/explore/form_data/commands/parameters.py index 3e830810b5000..fec06a581fb79 100644 --- a/superset/explore/form_data/commands/parameters.py +++ b/superset/explore/form_data/commands/parameters.py @@ -19,11 +19,14 @@ from flask_appbuilder.security.sqla.models import User +from superset.utils.core import DatasourceType + @dataclass class CommandParameters: actor: User - dataset_id: int = 0 + datasource_type: DatasourceType = DatasourceType.TABLE + datasource_id: int = 0 chart_id: int = 0 tab_id: Optional[int] = None key: Optional[str] = None diff --git a/superset/explore/form_data/commands/state.py b/superset/explore/form_data/commands/state.py index 2aba14a8cb28f..470f2e22f5989 
100644 --- a/superset/explore/form_data/commands/state.py +++ b/superset/explore/form_data/commands/state.py @@ -20,7 +20,8 @@ class TemporaryExploreState(TypedDict): - owner: int - dataset_id: int + owner: Optional[int] + datasource_id: int + datasource_type: str chart_id: Optional[int] form_data: str diff --git a/superset/explore/form_data/commands/update.py b/superset/explore/form_data/commands/update.py index 596c5f6e27ef2..fdc75093bef85 100644 --- a/superset/explore/form_data/commands/update.py +++ b/superset/explore/form_data/commands/update.py @@ -24,9 +24,9 @@ from superset.commands.base import BaseCommand from superset.explore.form_data.commands.parameters import CommandParameters from superset.explore.form_data.commands.state import TemporaryExploreState -from superset.explore.utils import check_access +from superset.explore.form_data.commands.utils import check_access from superset.extensions import cache_manager -from superset.key_value.utils import random_key +from superset.key_value.utils import get_owner, random_key from superset.temporary_cache.commands.exceptions import ( TemporaryCacheAccessDeniedError, TemporaryCacheUpdateFailedError, @@ -47,24 +47,25 @@ def __init__( def run(self) -> Optional[str]: self.validate() try: - dataset_id = self._cmd_params.dataset_id + datasource_id = self._cmd_params.datasource_id chart_id = self._cmd_params.chart_id + datasource_type = self._cmd_params.datasource_type actor = self._cmd_params.actor key = self._cmd_params.key form_data = self._cmd_params.form_data - check_access(dataset_id, chart_id, actor) + check_access(datasource_id, chart_id, actor, datasource_type) state: TemporaryExploreState = cache_manager.explore_form_data_cache.get( key ) + owner = get_owner(actor) if state and form_data: - user_id = actor.get_user_id() - if state["owner"] != user_id: + if state["owner"] != owner: raise TemporaryCacheAccessDeniedError() # Generate a new key if tab_id changes or equals 0 tab_id = self._cmd_params.tab_id contextual_key = cache_key( - session.get("_id"), tab_id, dataset_id, chart_id + session.get("_id"), tab_id, datasource_id, chart_id, datasource_type ) key = cache_manager.explore_form_data_cache.get(contextual_key) if not key or not tab_id: @@ -72,8 +73,9 @@ def run(self) -> Optional[str]: cache_manager.explore_form_data_cache.set(contextual_key, key) new_state: TemporaryExploreState = { - "owner": actor.get_user_id(), - "dataset_id": dataset_id, + "owner": owner, + "datasource_id": datasource_id, + "datasource_type": datasource_type, "chart_id": chart_id, "form_data": form_data, } diff --git a/superset/explore/form_data/commands/utils.py b/superset/explore/form_data/commands/utils.py new file mode 100644 index 0000000000000..7927457178c9e --- /dev/null +++ b/superset/explore/form_data/commands/utils.py @@ -0,0 +1,48 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. 
See the License for the +# specific language governing permissions and limitations +# under the License. +from typing import Optional + +from flask_appbuilder.security.sqla.models import User + +from superset.charts.commands.exceptions import ( + ChartAccessDeniedError, + ChartNotFoundError, +) +from superset.datasets.commands.exceptions import ( + DatasetAccessDeniedError, + DatasetNotFoundError, +) +from superset.explore.utils import check_access as explore_check_access +from superset.temporary_cache.commands.exceptions import ( + TemporaryCacheAccessDeniedError, + TemporaryCacheResourceNotFoundError, +) +from superset.utils.core import DatasourceType + + +def check_access( + datasource_id: int, + chart_id: Optional[int], + actor: User, + datasource_type: DatasourceType, +) -> None: + try: + explore_check_access(datasource_id, chart_id, actor, datasource_type) + except (ChartNotFoundError, DatasetNotFoundError) as ex: + raise TemporaryCacheResourceNotFoundError from ex + except (ChartAccessDeniedError, DatasetAccessDeniedError) as ex: + raise TemporaryCacheAccessDeniedError from ex diff --git a/superset/explore/form_data/schemas.py b/superset/explore/form_data/schemas.py index 6d5509d777a3c..192df089e818b 100644 --- a/superset/explore/form_data/schemas.py +++ b/superset/explore/form_data/schemas.py @@ -14,12 +14,20 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. -from marshmallow import fields, Schema +from marshmallow import fields, Schema, validate + +from superset.utils.core import DatasourceType class FormDataPostSchema(Schema): - dataset_id = fields.Integer( - required=True, allow_none=False, description="The dataset ID" + datasource_id = fields.Integer( + required=True, allow_none=False, description="The datasource ID" + ) + datasource_type = fields.String( + required=True, + allow_none=False, + description="The datasource type", + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), ) chart_id = fields.Integer(required=False, description="The chart ID") form_data = fields.String( @@ -28,8 +36,14 @@ class FormDataPostSchema(Schema): class FormDataPutSchema(Schema): - dataset_id = fields.Integer( - required=True, allow_none=False, description="The dataset ID" + datasource_id = fields.Integer( + required=True, allow_none=False, description="The datasource ID" + ) + datasource_type = fields.String( + required=True, + allow_none=False, + description="The datasource type", + validate=validate.OneOf(choices=[ds.value for ds in DatasourceType]), ) chart_id = fields.Integer(required=False, description="The chart ID") form_data = fields.String( diff --git a/superset/explore/permalink/commands/create.py b/superset/explore/permalink/commands/create.py index c09ca3b372121..7bd6365d814bd 100644 --- a/superset/explore/permalink/commands/create.py +++ b/superset/explore/permalink/commands/create.py @@ -22,9 +22,10 @@ from superset.explore.permalink.commands.base import BaseExplorePermalinkCommand from superset.explore.permalink.exceptions import ExplorePermalinkCreateFailedError -from superset.explore.utils import check_access +from superset.explore.utils import check_access as check_chart_access from superset.key_value.commands.create import CreateKeyValueCommand from superset.key_value.utils import encode_permalink_key +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) @@ -39,11 +40,16 @@ def __init__(self, actor: User, state: Dict[str, Any]): def 
run(self) -> str: self.validate() try: - dataset_id = int(self.datasource.split("__")[0]) - check_access(dataset_id, self.chart_id, self.actor) + d_id, d_type = self.datasource.split("__") + datasource_id = int(d_id) + datasource_type = DatasourceType(d_type) + check_chart_access( + datasource_id, self.chart_id, self.actor, datasource_type + ) value = { "chartId": self.chart_id, - "datasetId": dataset_id, + "datasourceId": datasource_id, + "datasourceType": datasource_type, "datasource": self.datasource, "state": self.state, } diff --git a/superset/explore/permalink/commands/get.py b/superset/explore/permalink/commands/get.py index 1e3ea1fdc6f92..f75df69d7a63e 100644 --- a/superset/explore/permalink/commands/get.py +++ b/superset/explore/permalink/commands/get.py @@ -24,10 +24,11 @@ from superset.explore.permalink.commands.base import BaseExplorePermalinkCommand from superset.explore.permalink.exceptions import ExplorePermalinkGetFailedError from superset.explore.permalink.types import ExplorePermalinkValue -from superset.explore.utils import check_access +from superset.explore.utils import check_access as check_chart_access from superset.key_value.commands.get import GetKeyValueCommand from superset.key_value.exceptions import KeyValueGetFailedError, KeyValueParseKeyError from superset.key_value.utils import decode_permalink_id +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) @@ -47,8 +48,9 @@ def run(self) -> Optional[ExplorePermalinkValue]: ).run() if value: chart_id: Optional[int] = value.get("chartId") - dataset_id = value["datasetId"] - check_access(dataset_id, chart_id, self.actor) + datasource_id: int = value["datasourceId"] + datasource_type = DatasourceType(value["datasourceType"]) + check_chart_access(datasource_id, chart_id, self.actor, datasource_type) return value return None except ( diff --git a/superset/explore/permalink/types.py b/superset/explore/permalink/types.py index b396e335104b0..b90b4d760d4d0 100644 --- a/superset/explore/permalink/types.py +++ b/superset/explore/permalink/types.py @@ -24,6 +24,7 @@ class ExplorePermalinkState(TypedDict, total=False): class ExplorePermalinkValue(TypedDict): chartId: Optional[int] - datasetId: int + datasourceId: int + datasourceType: str datasource: str state: ExplorePermalinkState diff --git a/superset/explore/utils.py b/superset/explore/utils.py index 3eeb1bab9964a..f0bfd8f0aa40c 100644 --- a/superset/explore/utils.py +++ b/superset/explore/utils.py @@ -24,11 +24,18 @@ ChartNotFoundError, ) from superset.charts.dao import ChartDAO +from superset.commands.exceptions import ( + DatasourceNotFoundValidationError, + DatasourceTypeInvalidError, + QueryNotFoundValidationError, +) from superset.datasets.commands.exceptions import ( DatasetAccessDeniedError, DatasetNotFoundError, ) from superset.datasets.dao import DatasetDAO +from superset.queries.dao import QueryDAO +from superset.utils.core import DatasourceType from superset.views.base import is_user_admin from superset.views.utils import is_owner @@ -44,10 +51,39 @@ def check_dataset_access(dataset_id: int) -> Optional[bool]: raise DatasetNotFoundError() +def check_query_access(query_id: int) -> Optional[bool]: + if query_id: + query = QueryDAO.find_by_id(query_id) + if query: + security_manager.raise_for_access(query=query) + return True + raise QueryNotFoundValidationError() + + +ACCESS_FUNCTION_MAP = { + DatasourceType.TABLE: check_dataset_access, + DatasourceType.QUERY: check_query_access, +} + + +def check_datasource_access( + datasource_id: 
int, datasource_type: DatasourceType
+) -> Optional[bool]:
+    if datasource_id:
+        try:
+            return ACCESS_FUNCTION_MAP[datasource_type](datasource_id)
+        except KeyError as ex:
+            raise DatasourceTypeInvalidError() from ex
+    raise DatasourceNotFoundValidationError()
+
+
 def check_access(
-    dataset_id: int, chart_id: Optional[int], actor: User
+    datasource_id: int,
+    chart_id: Optional[int],
+    actor: User,
+    datasource_type: DatasourceType,
 ) -> Optional[bool]:
-    check_dataset_access(dataset_id)
+    check_datasource_access(datasource_id, datasource_type)
     if not chart_id:
         return True
     chart = ChartDAO.find_by_id(chart_id)
diff --git a/superset/jinja_context.py b/superset/jinja_context.py
index e365b9a708ddb..42f6809c7402f 100644
--- a/superset/jinja_context.py
+++ b/superset/jinja_context.py
@@ -38,6 +38,7 @@
 from sqlalchemy.types import String
 from typing_extensions import TypedDict

+from superset.datasets.commands.exceptions import DatasetNotFoundError
 from superset.exceptions import SupersetTemplateException
 from superset.extensions import feature_flag_manager
 from superset.utils.core import convert_legacy_filters_into_adhoc, merge_extra_filters
@@ -490,6 +491,7 @@ def set_context(self, **kwargs: Any) -> None:
                 "cache_key_wrapper": partial(safe_proxy, extra_cache.cache_key_wrapper),
                 "filter_values": partial(safe_proxy, extra_cache.filter_values),
                 "get_filters": partial(safe_proxy, extra_cache.get_filters),
+                "dataset": partial(safe_proxy, dataset_macro),
             }
         )

@@ -602,3 +604,34 @@ def get_template_processor(
     else:
         template_processor = NoOpTemplateProcessor
     return template_processor(database=database, table=table, query=query, **kwargs)
+
+
+def dataset_macro(
+    dataset_id: int,
+    include_metrics: bool = False,
+    columns: Optional[List[str]] = None,
+) -> str:
+    """
+    Given a dataset ID, return the SQL that represents it.
+
+    The generated SQL includes all columns (including computed) by default. Optionally
+    the user can also request metrics to be included, and columns to group by.
+    """
+    # pylint: disable=import-outside-toplevel
+    from superset.datasets.dao import DatasetDAO
+
+    dataset = DatasetDAO.find_by_id(dataset_id)
+    if not dataset:
+        raise DatasetNotFoundError(f"Dataset {dataset_id} not found!")
+
+    columns = columns or [column.column_name for column in dataset.columns]
+    metrics = [metric.metric_name for metric in dataset.metrics]
+    query_obj = {
+        "is_timeseries": False,
+        "filter": [],
+        "metrics": metrics if include_metrics else None,
+        "columns": columns,
+    }
+    sqla_query = dataset.get_query_str_extended(query_obj)
+    sql = sqla_query.sql
+    return f"({sql}) AS dataset_{dataset_id}"
diff --git a/superset/key_value/utils.py b/superset/key_value/utils.py
index b2e8e729b0466..db27e505fbd6c 100644
--- a/superset/key_value/utils.py
+++ b/superset/key_value/utils.py
@@ -18,10 +18,11 @@

 from hashlib import md5
 from secrets import token_urlsafe
-from typing import Union
+from typing import Optional, Union
 from uuid import UUID

 import hashids
+from flask_appbuilder.security.sqla.models import User
 from flask_babel import gettext as _

 from superset.key_value.exceptions import KeyValueParseKeyError
@@ -63,3 +64,7 @@ def get_uuid_namespace(seed: str) -> UUID:
     md5_obj = md5()
     md5_obj.update(seed.encode("utf-8"))
     return UUID(md5_obj.hexdigest())
+
+
+def get_owner(user: User) -> Optional[int]:
+    return user.get_user_id() if not user.is_anonymous else None
diff --git a/superset/models/sql_lab.py b/superset/models/sql_lab.py
index 04d5fc9a94359..74c43718ef781 100644
--- a/superset/models/sql_lab.py
+++ b/superset/models/sql_lab.py
@@ -166,6 +166,10 @@ def username(self) -> str:
     def sql_tables(self) -> List[Table]:
         return list(ParsedQuery(self.sql).tables)

+    @property
+    def columns(self) -> List[Table]:
+        return self.extra.get("columns", [])
+
     def raise_for_access(self) -> None:
         """
         Raise an exception if the user cannot access the resource.
diff --git a/superset/reports/notifications/email.py b/superset/reports/notifications/email.py index 20afeae437b00..3991f24b9264d 100644 --- a/superset/reports/notifications/email.py +++ b/superset/reports/notifications/email.py @@ -30,6 +30,7 @@ from superset.reports.notifications.base import BaseNotification from superset.reports.notifications.exceptions import NotificationError from superset.utils.core import send_email_smtp +from superset.utils.decorators import statsd_gauge from superset.utils.urls import modify_url_query logger = logging.getLogger(__name__) @@ -149,6 +150,7 @@ def _get_subject(self) -> str: def _get_to(self) -> str: return json.loads(self._recipient.recipient_config_json)["target"] + @statsd_gauge("reports.email.send") def send(self) -> None: subject = self._get_subject() content = self._get_content() diff --git a/superset/reports/notifications/slack.py b/superset/reports/notifications/slack.py index b833cbd53ddf3..2a198d66453c2 100644 --- a/superset/reports/notifications/slack.py +++ b/superset/reports/notifications/slack.py @@ -29,6 +29,7 @@ from superset.models.reports import ReportRecipientType from superset.reports.notifications.base import BaseNotification from superset.reports.notifications.exceptions import NotificationError +from superset.utils.decorators import statsd_gauge from superset.utils.urls import modify_url_query logger = logging.getLogger(__name__) @@ -147,6 +148,7 @@ def _get_inline_files(self) -> Sequence[Union[str, IOBase, bytes]]: return [] @backoff.on_exception(backoff.expo, SlackApiError, factor=10, base=2, max_tries=5) + @statsd_gauge("reports.slack.send") def send(self) -> None: files = self._get_inline_files() title = self._content.name diff --git a/superset/sql_lab.py b/superset/sql_lab.py index 2eeb2976b4126..785d16327f7f2 100644 --- a/superset/sql_lab.py +++ b/superset/sql_lab.py @@ -548,6 +548,7 @@ def execute_sql_statements( # pylint: disable=too-many-arguments, too-many-loca if store_results and results_backend: key = str(uuid.uuid4()) + payload["query"]["resultsKey"] = key logger.info( "Query %s: Storing results in results backend, key: %s", str(query_id), key ) diff --git a/superset/temporary_cache/api.py b/superset/temporary_cache/api.py index e91a2886691f4..bdbdda302e694 100644 --- a/superset/temporary_cache/api.py +++ b/superset/temporary_cache/api.py @@ -24,20 +24,11 @@ from flask_appbuilder.api import BaseApi from marshmallow import ValidationError -from superset.charts.commands.exceptions import ( - ChartAccessDeniedError, - ChartNotFoundError, -) from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod -from superset.dashboards.commands.exceptions import ( - DashboardAccessDeniedError, - DashboardNotFoundError, -) -from superset.datasets.commands.exceptions import ( - DatasetAccessDeniedError, - DatasetNotFoundError, +from superset.temporary_cache.commands.exceptions import ( + TemporaryCacheAccessDeniedError, + TemporaryCacheResourceNotFoundError, ) -from superset.temporary_cache.commands.exceptions import TemporaryCacheAccessDeniedError from superset.temporary_cache.commands.parameters import CommandParameters from superset.temporary_cache.schemas import ( TemporaryCachePostSchema, @@ -86,14 +77,9 @@ def post(self, pk: int) -> Response: return self.response(201, key=key) except ValidationError as ex: return self.response(400, message=ex.messages) - except ( - ChartAccessDeniedError, - DashboardAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except 
TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DashboardNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) @requires_json @@ -112,14 +98,9 @@ def put(self, pk: int, key: str) -> Response: return self.response(200, key=key) except ValidationError as ex: return self.response(400, message=ex.messages) - except ( - ChartAccessDeniedError, - DashboardAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DashboardNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) def get(self, pk: int, key: str) -> Response: @@ -129,14 +110,9 @@ def get(self, pk: int, key: str) -> Response: if not value: return self.response_404() return self.response(200, value=value) - except ( - ChartAccessDeniedError, - DashboardAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DashboardNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) def delete(self, pk: int, key: str) -> Response: @@ -146,14 +122,9 @@ def delete(self, pk: int, key: str) -> Response: if not result: return self.response_404() return self.response(200, message="Deleted successfully") - except ( - ChartAccessDeniedError, - DashboardAccessDeniedError, - DatasetAccessDeniedError, - TemporaryCacheAccessDeniedError, - ) as ex: + except TemporaryCacheAccessDeniedError as ex: return self.response(403, message=str(ex)) - except (ChartNotFoundError, DashboardNotFoundError, DatasetNotFoundError) as ex: + except TemporaryCacheResourceNotFoundError as ex: return self.response(404, message=str(ex)) @abstractmethod diff --git a/superset/temporary_cache/commands/entry.py b/superset/temporary_cache/commands/entry.py index 0e9ad0a735069..90aa8e1bebf48 100644 --- a/superset/temporary_cache/commands/entry.py +++ b/superset/temporary_cache/commands/entry.py @@ -14,9 +14,11 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +from typing import Optional + from typing_extensions import TypedDict class Entry(TypedDict): - owner: int + owner: Optional[int] value: str diff --git a/superset/temporary_cache/commands/exceptions.py b/superset/temporary_cache/commands/exceptions.py index 0f8c44cb18fd9..8652a732f7ea0 100644 --- a/superset/temporary_cache/commands/exceptions.py +++ b/superset/temporary_cache/commands/exceptions.py @@ -43,3 +43,7 @@ class TemporaryCacheUpdateFailedError(UpdateFailedError): class TemporaryCacheAccessDeniedError(ForbiddenError): message = _("You don't have permission to modify the value.") + + +class TemporaryCacheResourceNotFoundError(ForbiddenError): + message = _("Resource was not found.") diff --git a/superset/utils/cache_manager.py b/superset/utils/cache_manager.py index 3f071b15435b6..d3b2dbdb00d5e 100644 --- a/superset/utils/cache_manager.py +++ b/superset/utils/cache_manager.py @@ -15,15 +15,40 @@ # specific language governing permissions and limitations # under the License. 
import logging +from typing import Any, Optional, Union from flask import Flask from flask_caching import Cache +from markupsafe import Markup + +from superset.utils.core import DatasourceType logger = logging.getLogger(__name__) CACHE_IMPORT_PATH = "superset.extensions.metastore_cache.SupersetMetastoreCache" +class ExploreFormDataCache(Cache): + def get(self, *args: Any, **kwargs: Any) -> Optional[Union[str, Markup]]: + cache = self.cache.get(*args, **kwargs) + + if not cache: + return None + + # rename data keys for existing cache based on new TemporaryExploreState model + if isinstance(cache, dict): + cache = { + ("datasource_id" if key == "dataset_id" else key): value + for (key, value) in cache.items() + } + # add default datasource_type if it doesn't exist + # temporarily defaulting to table until sqlatables are deprecated + if "datasource_type" not in cache: + cache["datasource_type"] = DatasourceType.TABLE + + return cache + + class CacheManager: def __init__(self) -> None: super().__init__() @@ -32,7 +57,7 @@ def __init__(self) -> None: self._data_cache = Cache() self._thumbnail_cache = Cache() self._filter_state_cache = Cache() - self._explore_form_data_cache = Cache() + self._explore_form_data_cache = ExploreFormDataCache() @staticmethod def _init_cache( diff --git a/superset/utils/core.py b/superset/utils/core.py index 4a9992d2a29f3..6c90837959edb 100644 --- a/superset/utils/core.py +++ b/superset/utils/core.py @@ -175,12 +175,13 @@ class GenericDataType(IntEnum): # ROW = 7 -class DatasourceType(Enum): - SQLATABLE = "sqlatable" +class DatasourceType(str, Enum): + SLTABLE = "sl_table" TABLE = "table" DATASET = "dataset" QUERY = "query" SAVEDQUERY = "saved_query" + VIEW = "view" class DatasourceDict(TypedDict): diff --git a/superset/utils/csv.py b/superset/utils/csv.py index 0dc84ff36a3de..31f076bfc680c 100644 --- a/superset/utils/csv.py +++ b/superset/utils/csv.py @@ -19,6 +19,7 @@ from typing import Any, Dict, Optional from urllib.error import URLError +import numpy as np import pandas as pd import simplejson @@ -64,8 +65,12 @@ def df_to_escaped_csv(df: pd.DataFrame, **kwargs: Any) -> Any: # Escape csv headers df = df.rename(columns=escape_values) - # Escape csv rows - df = df.applymap(escape_values) + # Escape csv values + for name, column in df.items(): + if column.dtype == np.dtype(object): + for idx, value in enumerate(column.values): + if isinstance(value, str): + df.at[idx, name] = escape_value(value) return df.to_csv(**kwargs) diff --git a/superset/utils/decorators.py b/superset/utils/decorators.py index ab4ee308787c8..f14335f2cada5 100644 --- a/superset/utils/decorators.py +++ b/superset/utils/decorators.py @@ -19,7 +19,7 @@ import time from contextlib import contextmanager from functools import wraps -from typing import Any, Callable, Dict, Iterator, TYPE_CHECKING, Union +from typing import Any, Callable, Dict, Iterator, Optional, TYPE_CHECKING, Union from flask import current_app, Response @@ -32,6 +32,27 @@ from superset.stats_logger import BaseStatsLogger +def statsd_gauge(metric_prefix: Optional[str] = None) -> Callable[..., Any]: + def decorate(f: Callable[..., Any]) -> Callable[..., Any]: + """ + Handle sending statsd gauge metric from any method or function + """ + + def wrapped(*args: Any, **kwargs: Any) -> Any: + metric_prefix_ = metric_prefix or f.__name__ + try: + result = f(*args, **kwargs) + current_app.config["STATS_LOGGER"].gauge(f"{metric_prefix_}.ok", 1) + return result + except Exception as ex: + 
current_app.config["STATS_LOGGER"].gauge(f"{metric_prefix_}.error", 1) + raise ex + + return wrapped + + return decorate + + @contextmanager def stats_timing(stats_key: str, stats_logger: BaseStatsLogger) -> Iterator[float]: """Provide a transactional scope around a series of operations.""" diff --git a/superset/utils/pandas_postprocessing/boxplot.py b/superset/utils/pandas_postprocessing/boxplot.py index 4436af9182c0f..40ce9200d358e 100644 --- a/superset/utils/pandas_postprocessing/boxplot.py +++ b/superset/utils/pandas_postprocessing/boxplot.py @@ -18,7 +18,7 @@ import numpy as np from flask_babel import gettext as _ -from pandas import DataFrame, Series +from pandas import DataFrame, Series, to_numeric from superset.exceptions import InvalidPostProcessingError from superset.utils.core import PostProcessingBoxplotWhiskerType @@ -122,4 +122,11 @@ def outliers(series: Series) -> Set[float]: for operator_name, operator in operators.items() for metric in metrics } + + # nanpercentile needs numeric values, otherwise the isnan function + # that's used in the underlying function will fail + for column in metrics: + if df.dtypes[column] == np.object: + df[column] = to_numeric(df[column], errors="coerce") + return aggregate(df, groupby=groupby, aggregates=aggregates) diff --git a/superset/views/base.py b/superset/views/base.py index 081951d561aab..1b1c684083ee3 100644 --- a/superset/views/base.py +++ b/superset/views/base.py @@ -620,10 +620,19 @@ def apply(self, query: Query, value: Any) -> Query: return query datasource_perms = security_manager.user_view_menu_names("datasource_access") schema_perms = security_manager.user_view_menu_names("schema_access") + owner_ids_query = ( + db.session.query(models.SqlaTable.id) + .join(models.SqlaTable.owners) + .filter( + security_manager.user_model.id + == security_manager.user_model.get_user_id() + ) + ) return query.filter( or_( self.model.perm.in_(datasource_perms), self.model.schema_perm.in_(schema_perms), + models.SqlaTable.id.in_(owner_ids_query), ) ) diff --git a/superset/views/core.py b/superset/views/core.py index 15ff3b1620e92..f65385fc305a5 100755 --- a/superset/views/core.py +++ b/superset/views/core.py @@ -891,6 +891,8 @@ def explore( if datasource: datasource_data["owners"] = datasource.owners_data + if isinstance(datasource, Query): + datasource_data["columns"] = datasource.columns bootstrap_data = { "can_add": slice_add_perm, diff --git a/tests/integration_tests/charts/api_tests.py b/tests/integration_tests/charts/api_tests.py index 6b8d625d567e3..a37acf6eafc3a 100644 --- a/tests/integration_tests/charts/api_tests.py +++ b/tests/integration_tests/charts/api_tests.py @@ -520,7 +520,13 @@ def test_create_chart_validate_datasource(self): response = json.loads(rv.data.decode("utf-8")) self.assertEqual( response, - {"message": {"datasource_type": ["Must be one of: druid, table, view."]}}, + { + "message": { + "datasource_type": [ + "Must be one of: sl_table, table, dataset, query, saved_query, view." 
+ ] + } + }, ) chart_data = { "slice_name": "title1", @@ -531,7 +537,7 @@ def test_create_chart_validate_datasource(self): self.assertEqual(rv.status_code, 422) response = json.loads(rv.data.decode("utf-8")) self.assertEqual( - response, {"message": {"datasource_id": ["Dataset does not exist"]}} + response, {"message": {"datasource_id": ["Datasource does not exist"]}} ) @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices") @@ -686,7 +692,13 @@ def test_update_chart_validate_datasource(self): response = json.loads(rv.data.decode("utf-8")) self.assertEqual( response, - {"message": {"datasource_type": ["Must be one of: druid, table, view."]}}, + { + "message": { + "datasource_type": [ + "Must be one of: sl_table, table, dataset, query, saved_query, view." + ] + } + }, ) chart_data = {"datasource_id": 0, "datasource_type": "table"} @@ -694,7 +706,7 @@ def test_update_chart_validate_datasource(self): self.assertEqual(rv.status_code, 422) response = json.loads(rv.data.decode("utf-8")) self.assertEqual( - response, {"message": {"datasource_id": ["Dataset does not exist"]}} + response, {"message": {"datasource_id": ["Datasource does not exist"]}} ) db.session.delete(chart) diff --git a/tests/integration_tests/dashboard_utils.py b/tests/integration_tests/dashboard_utils.py index fa6efd60b4dac..41a34fa36edf5 100644 --- a/tests/integration_tests/dashboard_utils.py +++ b/tests/integration_tests/dashboard_utils.py @@ -26,7 +26,7 @@ from superset.models.core import Database from superset.models.dashboard import Dashboard from superset.models.slice import Slice -from superset.utils.core import get_example_default_schema +from superset.utils.core import DatasourceType, get_example_default_schema def get_table( @@ -72,7 +72,7 @@ def create_slice( return Slice( slice_name=title, viz_type=viz_type, - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_id=table.id, params=json.dumps(slices_dict, indent=4, sort_keys=True), ) diff --git a/tests/integration_tests/datasets/api_tests.py b/tests/integration_tests/datasets/api_tests.py index 781ae929b743c..28bb617c17c19 100644 --- a/tests/integration_tests/datasets/api_tests.py +++ b/tests/integration_tests/datasets/api_tests.py @@ -27,7 +27,9 @@ import yaml from sqlalchemy.sql import func +from superset.common.utils.query_cache_manager import QueryCacheManager from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn +from superset.constants import CacheRegion from superset.dao.exceptions import ( DAOCreateFailedError, DAODeleteFailedError, @@ -214,6 +216,27 @@ def test_get_dataset_list_gamma(self): response = json.loads(rv.data.decode("utf-8")) assert response["result"] == [] + def test_get_dataset_list_gamma_owned(self): + """ + Dataset API: Test get dataset list owned by gamma + """ + main_db = get_main_database() + owned_dataset = self.insert_dataset( + "ab_user", [self.get_user("gamma").id], main_db + ) + + self.login(username="gamma") + uri = "api/v1/dataset/" + rv = self.get_assert_metric(uri, "get_list") + assert rv.status_code == 200 + response = json.loads(rv.data.decode("utf-8")) + + assert response["count"] == 1 + assert response["result"][0]["table_name"] == "ab_user" + + db.session.delete(owned_dataset) + db.session.commit() + def test_get_dataset_related_database_gamma(self): """ Dataset API: Test get dataset related databases gamma @@ -1842,3 +1865,93 @@ def test_get_datasets_is_certified_filter(self): db.session.delete(table_w_certification) db.session.commit() + + 
@pytest.mark.usefixtures("create_datasets") + def test_get_dataset_samples(self): + """ + Dataset API: Test get dataset samples + """ + dataset = self.get_fixture_datasets()[0] + + self.login(username="admin") + uri = f"api/v1/dataset/{dataset.id}/samples" + + # 1. should cache data + # feeds data + self.client.get(uri) + # get from cache + rv = self.client.get(uri) + rv_data = json.loads(rv.data) + assert rv.status_code == 200 + assert "result" in rv_data + assert rv_data["result"]["cached_dttm"] is not None + cache_key1 = rv_data["result"]["cache_key"] + assert QueryCacheManager.has(cache_key1, region=CacheRegion.DATA) + + # 2. should through cache + uri2 = f"api/v1/dataset/{dataset.id}/samples?force=true" + # feeds data + self.client.get(uri2) + # force query + rv2 = self.client.get(uri2) + rv_data2 = json.loads(rv2.data) + assert rv_data2["result"]["cached_dttm"] is None + cache_key2 = rv_data2["result"]["cache_key"] + assert QueryCacheManager.has(cache_key2, region=CacheRegion.DATA) + + # 3. data precision + assert "colnames" in rv_data2["result"] + assert "coltypes" in rv_data2["result"] + assert "data" in rv_data2["result"] + + eager_samples = dataset.database.get_df( + f"select * from {dataset.table_name}" + f' limit {self.app.config["SAMPLES_ROW_LIMIT"]}' + ).to_dict(orient="records") + assert eager_samples == rv_data2["result"]["data"] + + @pytest.mark.usefixtures("create_datasets") + def test_get_dataset_samples_with_failed_cc(self): + dataset = self.get_fixture_datasets()[0] + + self.login(username="admin") + failed_column = TableColumn( + column_name="DUMMY CC", + type="VARCHAR(255)", + table=dataset, + expression="INCORRECT SQL", + ) + uri = f"api/v1/dataset/{dataset.id}/samples" + dataset.columns.append(failed_column) + rv = self.client.get(uri) + assert rv.status_code == 400 + rv_data = json.loads(rv.data) + assert "message" in rv_data + if dataset.database.db_engine_spec.engine_name == "PostgreSQL": + assert "INCORRECT SQL" in rv_data.get("message") + + def test_get_dataset_samples_on_virtual_dataset(self): + virtual_dataset = SqlaTable( + table_name="virtual_dataset", + sql=("SELECT 'foo' as foo, 'bar' as bar"), + database=get_example_database(), + ) + TableColumn(column_name="foo", type="VARCHAR(255)", table=virtual_dataset) + TableColumn(column_name="bar", type="VARCHAR(255)", table=virtual_dataset) + SqlMetric(metric_name="count", expression="count(*)", table=virtual_dataset) + + self.login(username="admin") + uri = f"api/v1/dataset/{virtual_dataset.id}/samples" + rv = self.client.get(uri) + assert rv.status_code == 200 + rv_data = json.loads(rv.data) + cache_key = rv_data["result"]["cache_key"] + assert QueryCacheManager.has(cache_key, region=CacheRegion.DATA) + + # remove original column in dataset + virtual_dataset.sql = "SELECT 'foo' as foo" + rv = self.client.get(uri) + assert rv.status_code == 400 + + db.session.delete(virtual_dataset) + db.session.commit() diff --git a/tests/integration_tests/explore/form_data/api_tests.py b/tests/integration_tests/explore/form_data/api_tests.py index c05be00e96186..8b375df56ae38 100644 --- a/tests/integration_tests/explore/form_data/api_tests.py +++ b/tests/integration_tests/explore/form_data/api_tests.py @@ -56,7 +56,7 @@ def admin_id() -> int: @pytest.fixture -def dataset_id() -> int: +def datasource() -> int: with app.app_context() as ctx: session: Session = ctx.app.appbuilder.get_session dataset = ( @@ -64,24 +64,26 @@ def dataset_id() -> int: .filter_by(table_name="wb_health_population") .first() ) - return dataset.id + 
return dataset @pytest.fixture(autouse=True) -def cache(chart_id, admin_id, dataset_id): +def cache(chart_id, admin_id, datasource): entry: TemporaryExploreState = { "owner": admin_id, - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": INITIAL_FORM_DATA, } cache_manager.explore_form_data_cache.set(KEY, entry) -def test_post(client, chart_id: int, dataset_id: int): +def test_post(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": INITIAL_FORM_DATA, } @@ -89,10 +91,11 @@ def test_post(client, chart_id: int, dataset_id: int): assert resp.status_code == 201 -def test_post_bad_request_non_string(client, chart_id: int, dataset_id: int): +def test_post_bad_request_non_string(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": 1234, } @@ -100,10 +103,11 @@ def test_post_bad_request_non_string(client, chart_id: int, dataset_id: int): assert resp.status_code == 400 -def test_post_bad_request_non_json_string(client, chart_id: int, dataset_id: int): +def test_post_bad_request_non_json_string(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": "foo", } @@ -111,10 +115,11 @@ def test_post_bad_request_non_json_string(client, chart_id: int, dataset_id: int assert resp.status_code == 400 -def test_post_access_denied(client, chart_id: int, dataset_id: int): +def test_post_access_denied(client, chart_id: int, datasource: SqlaTable): login(client, "gamma") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": INITIAL_FORM_DATA, } @@ -122,10 +127,11 @@ def test_post_access_denied(client, chart_id: int, dataset_id: int): assert resp.status_code == 404 -def test_post_same_key_for_same_context(client, chart_id: int, dataset_id: int): +def test_post_same_key_for_same_context(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -139,11 +145,12 @@ def test_post_same_key_for_same_context(client, chart_id: int, dataset_id: int): def test_post_different_key_for_different_context( - client, chart_id: int, dataset_id: int + client, chart_id: int, datasource: SqlaTable ): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -151,7 +158,8 @@ def test_post_different_key_for_different_context( data = json.loads(resp.data.decode("utf-8")) first_key = data.get("key") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "form_data": json.dumps({"test": "initial value"}), } resp = client.post("api/v1/explore/form_data?tab_id=1", json=payload) @@ -160,10 +168,11 @@ def test_post_different_key_for_different_context( assert first_key != second_key -def 
test_post_same_key_for_same_tab_id(client, chart_id: int, dataset_id: int): +def test_post_same_key_for_same_tab_id(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": json.dumps({"test": "initial value"}), } @@ -177,11 +186,12 @@ def test_post_same_key_for_same_tab_id(client, chart_id: int, dataset_id: int): def test_post_different_key_for_different_tab_id( - client, chart_id: int, dataset_id: int + client, chart_id: int, datasource: SqlaTable ): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": json.dumps({"test": "initial value"}), } @@ -194,10 +204,11 @@ def test_post_different_key_for_different_tab_id( assert first_key != second_key -def test_post_different_key_for_no_tab_id(client, chart_id: int, dataset_id: int): +def test_post_different_key_for_no_tab_id(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": INITIAL_FORM_DATA, } @@ -210,10 +221,11 @@ def test_post_different_key_for_no_tab_id(client, chart_id: int, dataset_id: int assert first_key != second_key -def test_put(client, chart_id: int, dataset_id: int): +def test_put(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -221,10 +233,11 @@ def test_put(client, chart_id: int, dataset_id: int): assert resp.status_code == 200 -def test_put_same_key_for_same_tab_id(client, chart_id: int, dataset_id: int): +def test_put_same_key_for_same_tab_id(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -237,10 +250,13 @@ def test_put_same_key_for_same_tab_id(client, chart_id: int, dataset_id: int): assert first_key == second_key -def test_put_different_key_for_different_tab_id(client, chart_id: int, dataset_id: int): +def test_put_different_key_for_different_tab_id( + client, chart_id: int, datasource: SqlaTable +): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -253,10 +269,11 @@ def test_put_different_key_for_different_tab_id(client, chart_id: int, dataset_i assert first_key != second_key -def test_put_different_key_for_no_tab_id(client, chart_id: int, dataset_id: int): +def test_put_different_key_for_no_tab_id(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -269,10 +286,11 @@ def test_put_different_key_for_no_tab_id(client, chart_id: int, dataset_id: int) assert first_key != second_key -def test_put_bad_request(client, chart_id: int, dataset_id: int): +def test_put_bad_request(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + 
"datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": 1234, } @@ -280,10 +298,11 @@ def test_put_bad_request(client, chart_id: int, dataset_id: int): assert resp.status_code == 400 -def test_put_bad_request_non_string(client, chart_id: int, dataset_id: int): +def test_put_bad_request_non_string(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": 1234, } @@ -291,10 +310,11 @@ def test_put_bad_request_non_string(client, chart_id: int, dataset_id: int): assert resp.status_code == 400 -def test_put_bad_request_non_json_string(client, chart_id: int, dataset_id: int): +def test_put_bad_request_non_json_string(client, chart_id: int, datasource: SqlaTable): login(client, "admin") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": "foo", } @@ -302,10 +322,11 @@ def test_put_bad_request_non_json_string(client, chart_id: int, dataset_id: int) assert resp.status_code == 400 -def test_put_access_denied(client, chart_id: int, dataset_id: int): +def test_put_access_denied(client, chart_id: int, datasource: SqlaTable): login(client, "gamma") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -313,10 +334,11 @@ def test_put_access_denied(client, chart_id: int, dataset_id: int): assert resp.status_code == 404 -def test_put_not_owner(client, chart_id: int, dataset_id: int): +def test_put_not_owner(client, chart_id: int, datasource: SqlaTable): login(client, "gamma") payload = { - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": UPDATED_FORM_DATA, } @@ -364,12 +386,13 @@ def test_delete_access_denied(client): assert resp.status_code == 404 -def test_delete_not_owner(client, chart_id: int, dataset_id: int, admin_id: int): +def test_delete_not_owner(client, chart_id: int, datasource: SqlaTable, admin_id: int): another_key = "another_key" another_owner = admin_id + 1 entry: TemporaryExploreState = { "owner": another_owner, - "dataset_id": dataset_id, + "datasource_id": datasource.id, + "datasource_type": datasource.type, "chart_id": chart_id, "form_data": INITIAL_FORM_DATA, } diff --git a/tests/integration_tests/explore/form_data/commands_tests.py b/tests/integration_tests/explore/form_data/commands_tests.py new file mode 100644 index 0000000000000..4db48cfa79737 --- /dev/null +++ b/tests/integration_tests/explore/form_data/commands_tests.py @@ -0,0 +1,359 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. 
See the License for the +# specific language governing permissions and limitations +# under the License. + +import json +from unittest.mock import patch + +import pytest + +from superset import app, db, security, security_manager +from superset.commands.exceptions import DatasourceTypeInvalidError +from superset.connectors.sqla.models import SqlaTable +from superset.explore.form_data.commands.create import CreateFormDataCommand +from superset.explore.form_data.commands.delete import DeleteFormDataCommand +from superset.explore.form_data.commands.get import GetFormDataCommand +from superset.explore.form_data.commands.parameters import CommandParameters +from superset.explore.form_data.commands.update import UpdateFormDataCommand +from superset.models.slice import Slice +from superset.models.sql_lab import Query +from superset.utils.core import DatasourceType, get_example_default_schema +from superset.utils.database import get_example_database +from tests.integration_tests.base_tests import SupersetTestCase + + +class TestCreateFormDataCommand(SupersetTestCase): + @pytest.fixture() + def create_dataset(self): + with self.create_app().app_context(): + dataset = SqlaTable( + table_name="dummy_sql_table", + database=get_example_database(), + schema=get_example_default_schema(), + sql="select 123 as intcol, 'abc' as strcol", + ) + session = db.session + session.add(dataset) + session.commit() + + yield dataset + + # rollback + session.delete(dataset) + session.commit() + + @pytest.fixture() + def create_slice(self): + with self.create_app().app_context(): + session = db.session + dataset = ( + session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = Slice( + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + datasource_name="tmp_perm_table", + slice_name="slice_name", + ) + + session.add(slice) + session.commit() + + yield slice + + # rollback + session.delete(slice) + session.commit() + + @pytest.fixture() + def create_query(self): + with self.create_app().app_context(): + session = db.session + + query = Query( + sql="select 1 as foo;", + client_id="sldkfjlk", + database=get_example_database(), + ) + + session.add(query) + session.commit() + + yield query + + # rollback + session.delete(query) + session.commit() + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice") + def test_create_form_data_command(self, mock_g): + mock_g.user = security_manager.find_user("admin") + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + command = CreateFormDataCommand(args) + + assert isinstance(command.run(), str) + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_create_form_data_command_invalid_type(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = 
f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type="InvalidType", + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + with pytest.raises(DatasourceTypeInvalidError) as exc: + CreateFormDataCommand(create_args).run() + + assert "Datasource type is invalid" in str(exc.value) + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_create_form_data_command_type_as_string(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type="table", + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + command = CreateFormDataCommand(create_args) + + assert isinstance(command.run(), str) + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice") + def test_get_form_data_command(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + key = CreateFormDataCommand(create_args).run() + + key_args = CommandParameters(actor=mock_g.user, key=key) + get_command = GetFormDataCommand(key_args) + cache_data = json.loads(get_command.run()) + + assert cache_data.get("datasource") == datasource + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_update_form_data_command(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + query = db.session.query(Query).filter_by(sql="select 1 as foo;").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + key = CreateFormDataCommand(create_args).run() + + query_datasource = f"{dataset.id}__{DatasourceType.TABLE}" + update_args = CommandParameters( + actor=mock_g.user, + datasource_id=query.id, + datasource_type=DatasourceType.QUERY, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": query_datasource}), + key=key, + ) + + update_command = UpdateFormDataCommand(update_args) + new_key = update_command.run() + + # it should return a key + 
assert isinstance(new_key, str) + # the updated key returned should be different from the old one + assert new_key != key + + key_args = CommandParameters(actor=mock_g.user, key=key) + get_command = GetFormDataCommand(key_args) + + cache_data = json.loads(get_command.run()) + + assert cache_data.get("datasource") == query_datasource + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_update_form_data_command_same_form_data(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + key = CreateFormDataCommand(create_args).run() + + update_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + key=key, + ) + + update_command = UpdateFormDataCommand(update_args) + new_key = update_command.run() + + # it should return a key + assert isinstance(new_key, str) + + # the updated key returned should be the same as the old one + assert new_key == key + + key_args = CommandParameters(actor=mock_g.user, key=key) + get_command = GetFormDataCommand(key_args) + + cache_data = json.loads(get_command.run()) + + assert cache_data.get("datasource") == datasource + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_delete_form_data_command(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + dataset = ( + db.session.query(SqlaTable).filter_by(table_name="dummy_sql_table").first() + ) + slice = db.session.query(Slice).filter_by(slice_name="slice_name").first() + + datasource = f"{dataset.id}__{DatasourceType.TABLE}" + create_args = CommandParameters( + actor=mock_g.user, + datasource_id=dataset.id, + datasource_type=DatasourceType.TABLE, + chart_id=slice.id, + tab_id=1, + form_data=json.dumps({"datasource": datasource}), + ) + key = CreateFormDataCommand(create_args).run() + + delete_args = CommandParameters( + actor=mock_g.user, + key=key, + ) + + delete_command = DeleteFormDataCommand(delete_args) + response = delete_command.run() + + assert response == True + + @patch("superset.security.manager.g") + @pytest.mark.usefixtures("create_dataset", "create_slice", "create_query") + def test_delete_form_data_command_key_expired(self, mock_g): + mock_g.user = security_manager.find_user("admin") + app.config["EXPLORE_FORM_DATA_CACHE_CONFIG"] = { + "REFRESH_TIMEOUT_ON_RETRIEVAL": True + } + + delete_args = CommandParameters( + actor=mock_g.user, + key="some_expired_key", + ) + + delete_command = DeleteFormDataCommand(delete_args) + response = delete_command.run() + + assert response == False diff --git a/tests/integration_tests/explore/permalink/api_tests.py b/tests/integration_tests/explore/permalink/api_tests.py index a44bc70a7b49a..b5228ab301b24 100644 --- 
a/tests/integration_tests/explore/permalink/api_tests.py +++ b/tests/integration_tests/explore/permalink/api_tests.py @@ -27,6 +27,7 @@ from superset.key_value.types import KeyValueResource from superset.key_value.utils import decode_permalink_id, encode_permalink_key from superset.models.slice import Slice +from superset.utils.core import DatasourceType from tests.integration_tests.base_tests import login from tests.integration_tests.fixtures.client import client from tests.integration_tests.fixtures.world_bank_dashboard import ( @@ -97,7 +98,8 @@ def test_get_missing_chart(client, chart, permalink_salt: str) -> None: value=pickle.dumps( { "chartId": chart_id, - "datasetId": chart.datasource.id, + "datasourceId": chart.datasource.id, + "datasourceType": DatasourceType.TABLE, "formData": { "slice_id": chart_id, "datasource": f"{chart.datasource.id}__{chart.datasource.type}", diff --git a/tests/integration_tests/import_export_tests.py b/tests/integration_tests/import_export_tests.py index 6d7d581ec6d41..81acda80185ca 100644 --- a/tests/integration_tests/import_export_tests.py +++ b/tests/integration_tests/import_export_tests.py @@ -40,7 +40,7 @@ from superset.datasets.commands.importers.v0 import import_dataset from superset.models.dashboard import Dashboard from superset.models.slice import Slice -from superset.utils.core import get_example_default_schema +from superset.utils.core import DatasourceType, get_example_default_schema from superset.utils.database import get_example_database from tests.integration_tests.fixtures.world_bank_dashboard import ( @@ -103,7 +103,7 @@ def create_slice( return Slice( slice_name=name, - datasource_type="table", + datasource_type=DatasourceType.TABLE, viz_type="bubble", params=json.dumps(params), datasource_id=ds_id, diff --git a/tests/integration_tests/model_tests.py b/tests/integration_tests/model_tests.py index ace75da35a88f..a1791db34bffe 100644 --- a/tests/integration_tests/model_tests.py +++ b/tests/integration_tests/model_tests.py @@ -16,6 +16,7 @@ # under the License. 
# isort:skip_file import json +from superset.utils.core import DatasourceType import textwrap import unittest from unittest import mock @@ -604,7 +605,7 @@ def test_data_for_slices_with_adhoc_column(self): dashboard = self.get_dash_by_slug("births") slc = Slice( slice_name="slice with adhoc column", - datasource_type="table", + datasource_type=DatasourceType.TABLE, viz_type="table", params=json.dumps( { diff --git a/tests/integration_tests/reports/commands_tests.py b/tests/integration_tests/reports/commands_tests.py index 7629bdd5b4583..dd23d291fd69a 100644 --- a/tests/integration_tests/reports/commands_tests.py +++ b/tests/integration_tests/reports/commands_tests.py @@ -22,6 +22,7 @@ from uuid import uuid4 import pytest +from flask import current_app from flask_sqlalchemy import BaseQuery from freezegun import freeze_time from sqlalchemy.sql import func @@ -1026,20 +1027,23 @@ def test_email_dashboard_report_schedule( screenshot_mock.return_value = SCREENSHOT_FILE with freeze_time("2020-01-01T00:00:00Z"): - AsyncExecuteReportScheduleCommand( - TEST_ID, create_report_email_dashboard.id, datetime.utcnow() - ).run() + with patch.object(current_app.config["STATS_LOGGER"], "gauge") as statsd_mock: - notification_targets = get_target_from_report_schedule( - create_report_email_dashboard - ) - # Assert the email smtp address - assert email_mock.call_args[0][0] == notification_targets[0] - # Assert the email inline screenshot - smtp_images = email_mock.call_args[1]["images"] - assert smtp_images[list(smtp_images.keys())[0]] == SCREENSHOT_FILE - # Assert logs are correct - assert_log(ReportState.SUCCESS) + AsyncExecuteReportScheduleCommand( + TEST_ID, create_report_email_dashboard.id, datetime.utcnow() + ).run() + + notification_targets = get_target_from_report_schedule( + create_report_email_dashboard + ) + # Assert the email smtp address + assert email_mock.call_args[0][0] == notification_targets[0] + # Assert the email inline screenshot + smtp_images = email_mock.call_args[1]["images"] + assert smtp_images[list(smtp_images.keys())[0]] == SCREENSHOT_FILE + # Assert logs are correct + assert_log(ReportState.SUCCESS) + statsd_mock.assert_called_once_with("reports.email.send.ok", 1) @pytest.mark.usefixtures( @@ -1094,19 +1098,22 @@ def test_slack_chart_report_schedule( screenshot_mock.return_value = SCREENSHOT_FILE with freeze_time("2020-01-01T00:00:00Z"): - AsyncExecuteReportScheduleCommand( - TEST_ID, create_report_slack_chart.id, datetime.utcnow() - ).run() + with patch.object(current_app.config["STATS_LOGGER"], "gauge") as statsd_mock: - notification_targets = get_target_from_report_schedule( - create_report_slack_chart - ) + AsyncExecuteReportScheduleCommand( + TEST_ID, create_report_slack_chart.id, datetime.utcnow() + ).run() - assert file_upload_mock.call_args[1]["channels"] == notification_targets[0] - assert file_upload_mock.call_args[1]["file"] == SCREENSHOT_FILE + notification_targets = get_target_from_report_schedule( + create_report_slack_chart + ) - # Assert logs are correct - assert_log(ReportState.SUCCESS) + assert file_upload_mock.call_args[1]["channels"] == notification_targets[0] + assert file_upload_mock.call_args[1]["file"] == SCREENSHOT_FILE + + # Assert logs are correct + assert_log(ReportState.SUCCESS) + statsd_mock.assert_called_once_with("reports.slack.send.ok", 1) @pytest.mark.usefixtures( diff --git a/tests/integration_tests/security_tests.py b/tests/integration_tests/security_tests.py index c44335552b012..e66bf02e82cb3 100644 --- 
a/tests/integration_tests/security_tests.py +++ b/tests/integration_tests/security_tests.py @@ -39,6 +39,7 @@ from superset.models.slice import Slice from superset.sql_parse import Table from superset.utils.core import ( + DatasourceType, backend, get_example_default_schema, ) @@ -120,7 +121,7 @@ def setUp(self): ds_slices = ( session.query(Slice) - .filter_by(datasource_type="table") + .filter_by(datasource_type=DatasourceType.TABLE) .filter_by(datasource_id=ds.id) .all() ) @@ -143,7 +144,7 @@ def tearDown(self): ds.schema_perm = None ds_slices = ( session.query(Slice) - .filter_by(datasource_type="table") + .filter_by(datasource_type=DatasourceType.TABLE) .filter_by(datasource_id=ds.id) .all() ) @@ -365,7 +366,7 @@ def test_set_perm_slice(self): # no schema permission slice = Slice( datasource_id=table.id, - datasource_type="table", + datasource_type=DatasourceType.TABLE, datasource_name="tmp_perm_table", slice_name="slice_name", ) diff --git a/tests/integration_tests/sqla_models_tests.py b/tests/integration_tests/sqla_models_tests.py index d23b95f53cd3d..6c5b6736d1a15 100644 --- a/tests/integration_tests/sqla_models_tests.py +++ b/tests/integration_tests/sqla_models_tests.py @@ -52,11 +52,10 @@ from .base_tests import SupersetTestCase - VIRTUAL_TABLE_INT_TYPES: Dict[str, Pattern[str]] = { "hive": re.compile(r"^INT_TYPE$"), "mysql": re.compile("^LONGLONG$"), - "postgresql": re.compile(r"^INT$"), + "postgresql": re.compile(r"^INTEGER$"), "presto": re.compile(r"^INTEGER$"), "sqlite": re.compile(r"^INT$"), } diff --git a/tests/integration_tests/utils/cache_manager_tests.py b/tests/integration_tests/utils/cache_manager_tests.py new file mode 100644 index 0000000000000..c5d4b390f9c90 --- /dev/null +++ b/tests/integration_tests/utils/cache_manager_tests.py @@ -0,0 +1,49 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+import pytest + +from superset.extensions import cache_manager +from superset.utils.core import backend, DatasourceType +from tests.integration_tests.base_tests import SupersetTestCase + + +class UtilsCacheManagerTests(SupersetTestCase): + def test_get_set_explore_form_data_cache(self): + key = "12345" + data = {"foo": "bar", "datasource_type": "query"} + cache_manager.explore_form_data_cache.set(key, data) + assert cache_manager.explore_form_data_cache.get(key) == data + + def test_get_same_context_twice(self): + key = "12345" + data = {"foo": "bar", "datasource_type": "query"} + cache_manager.explore_form_data_cache.set(key, data) + assert cache_manager.explore_form_data_cache.get(key) == data + assert cache_manager.explore_form_data_cache.get(key) == data + + def test_get_set_explore_form_data_cache_no_datasource_type(self): + key = "12345" + data = {"foo": "bar"} + cache_manager.explore_form_data_cache.set(key, data) + # datasource_type should be added because it is not present + assert cache_manager.explore_form_data_cache.get(key) == { + "datasource_type": DatasourceType.TABLE, + **data, + } + + def test_get_explore_form_data_cache_invalid_key(self): + assert cache_manager.explore_form_data_cache.get("foo") == None diff --git a/tests/integration_tests/utils/csv_tests.py b/tests/integration_tests/utils/csv_tests.py index bf6110c639aa4..e514efb1d2108 100644 --- a/tests/integration_tests/utils/csv_tests.py +++ b/tests/integration_tests/utils/csv_tests.py @@ -17,6 +17,7 @@ import io import pandas as pd +import pyarrow as pa import pytest from superset.utils import csv @@ -77,3 +78,6 @@ def test_df_to_escaped_csv(): ["a", "'=b"], # pandas seems to be removing the leading "" ["' =a", "b"], ] + + df = pa.array([1, None]).to_pandas(integer_object_nulls=True).to_frame() + assert csv.df_to_escaped_csv(df, encoding="utf8", index=False) == '0\n1\n""\n' diff --git a/tests/integration_tests/utils/decorators_tests.py b/tests/integration_tests/utils/decorators_tests.py index 98faa8a2ba6a4..d0ab6f98434b3 100644 --- a/tests/integration_tests/utils/decorators_tests.py +++ b/tests/integration_tests/utils/decorators_tests.py @@ -14,7 +14,10 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. 
-from unittest.mock import call, Mock +from unittest.mock import call, Mock, patch + +import pytest +from flask import current_app from superset.utils import decorators from tests.integration_tests.base_tests import SupersetTestCase @@ -41,3 +44,18 @@ def myfunc(arg1: int, arg2: int, kwarg1: str = "abc", kwarg2: int = 2): result = myfunc(1, 0, kwarg1="haha", kwarg2=2) mock.assert_has_calls([call(1, "abc"), call(1, "haha")]) self.assertEqual(result, 3) + + def test_statsd_gauge(self): + @decorators.statsd_gauge("custom.prefix") + def my_func(fail: bool, *args, **kwargs): + if fail: + raise ValueError("Error") + return "OK" + + with patch.object(current_app.config["STATS_LOGGER"], "gauge") as mock: + my_func(False, 1, 2) + mock.assert_called_once_with("custom.prefix.ok", 1) + + with pytest.raises(ValueError): + my_func(True, 1, 2) + mock.assert_called_once_with("custom.prefix.error", 1) diff --git a/tests/unit_tests/dao/datasource_test.py b/tests/unit_tests/dao/datasource_test.py index dd0db265e7a02..a15684d71e699 100644 --- a/tests/unit_tests/dao/datasource_test.py +++ b/tests/unit_tests/dao/datasource_test.py @@ -106,7 +106,7 @@ def test_get_datasource_sqlatable( from superset.dao.datasource.dao import DatasourceDAO result = DatasourceDAO.get_datasource( - datasource_type=DatasourceType.SQLATABLE, + datasource_type=DatasourceType.TABLE, datasource_id=1, session=session_with_data, ) @@ -151,7 +151,9 @@ def test_get_datasource_sl_table(app_context: None, session_with_data: Session) # todo(hugh): This will break once we remove the dual write # update the datsource_id=1 and this will pass again result = DatasourceDAO.get_datasource( - datasource_type=DatasourceType.TABLE, datasource_id=2, session=session_with_data + datasource_type=DatasourceType.SLTABLE, + datasource_id=2, + session=session_with_data, ) assert result.id == 2 diff --git a/tests/unit_tests/db_engine_specs/test_gsheets.py b/tests/unit_tests/db_engine_specs/test_gsheets.py index b050c6fdbf2ab..c2e8346c3c7ac 100644 --- a/tests/unit_tests/db_engine_specs/test_gsheets.py +++ b/tests/unit_tests/db_engine_specs/test_gsheets.py @@ -40,7 +40,14 @@ def test_validate_parameters_simple( "catalog": {}, } errors = GSheetsEngineSpec.validate_parameters(parameters) - assert errors == [] + assert errors == [ + SupersetError( + message="Sheet name is required", + error_type=SupersetErrorType.CONNECTION_MISSING_PARAMETERS_ERROR, + level=ErrorLevel.WARNING, + extra={"catalog": {"idx": 0, "name": True}}, + ), + ] def test_validate_parameters_catalog( diff --git a/tests/unit_tests/explore/utils_test.py b/tests/unit_tests/explore/utils_test.py index 3d12f5e911ee9..9ef92872177ee 100644 --- a/tests/unit_tests/explore/utils_test.py +++ b/tests/unit_tests/explore/utils_test.py @@ -23,12 +23,21 @@ ChartAccessDeniedError, ChartNotFoundError, ) +from superset.commands.exceptions import ( + DatasourceNotFoundValidationError, + DatasourceTypeInvalidError, + OwnersNotFoundValidationError, + QueryNotFoundValidationError, +) from superset.datasets.commands.exceptions import ( DatasetAccessDeniedError, DatasetNotFoundError, ) +from superset.exceptions import SupersetSecurityException +from superset.utils.core import DatasourceType dataset_find_by_id = "superset.datasets.dao.DatasetDAO.find_by_id" +query_find_by_id = "superset.queries.dao.QueryDAO.find_by_id" chart_find_by_id = "superset.charts.dao.ChartDAO.find_by_id" is_user_admin = "superset.explore.utils.is_user_admin" is_owner = "superset.explore.utils.is_owner" @@ -36,88 +45,142 @@ 
"superset.security.SupersetSecurityManager.can_access_datasource" ) can_access = "superset.security.SupersetSecurityManager.can_access" +raise_for_access = "superset.security.SupersetSecurityManager.raise_for_access" +query_datasources_by_name = ( + "superset.connectors.sqla.models.SqlaTable.query_datasources_by_name" +) def test_unsaved_chart_no_dataset_id(app_context: AppContext) -> None: - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access - with raises(DatasetNotFoundError): - check_access(dataset_id=0, chart_id=0, actor=User()) + with raises(DatasourceNotFoundValidationError): + check_chart_access( + datasource_id=0, + chart_id=0, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_unsaved_chart_unknown_dataset_id( mocker: MockFixture, app_context: AppContext ) -> None: - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access with raises(DatasetNotFoundError): mocker.patch(dataset_find_by_id, return_value=None) - check_access(dataset_id=1, chart_id=0, actor=User()) + check_chart_access( + datasource_id=1, + chart_id=0, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) + + +def test_unsaved_chart_unknown_query_id( + mocker: MockFixture, app_context: AppContext +) -> None: + from superset.explore.utils import check_access as check_chart_access + + with raises(QueryNotFoundValidationError): + mocker.patch(query_find_by_id, return_value=None) + check_chart_access( + datasource_id=1, + chart_id=0, + actor=User(), + datasource_type=DatasourceType.QUERY, + ) def test_unsaved_chart_unauthorized_dataset( mocker: MockFixture, app_context: AppContext ) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore import utils + from superset.explore.utils import check_access as check_chart_access with raises(DatasetAccessDeniedError): mocker.patch(dataset_find_by_id, return_value=SqlaTable()) mocker.patch(can_access_datasource, return_value=False) - utils.check_access(dataset_id=1, chart_id=0, actor=User()) + check_chart_access( + datasource_id=1, + chart_id=0, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_unsaved_chart_authorized_dataset( mocker: MockFixture, app_context: AppContext ) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access mocker.patch(dataset_find_by_id, return_value=SqlaTable()) mocker.patch(can_access_datasource, return_value=True) - assert check_access(dataset_id=1, chart_id=0, actor=User()) == True + check_chart_access( + datasource_id=1, + chart_id=0, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_unknown_chart_id( mocker: MockFixture, app_context: AppContext ) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access with raises(ChartNotFoundError): mocker.patch(dataset_find_by_id, return_value=SqlaTable()) mocker.patch(can_access_datasource, return_value=True) mocker.patch(chart_find_by_id, return_value=None) - check_access(dataset_id=1, chart_id=1, actor=User()) + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_unauthorized_dataset( mocker: MockFixture, app_context: AppContext ) -> None: from 
superset.connectors.sqla.models import SqlaTable - from superset.explore import utils + from superset.explore.utils import check_access as check_chart_access with raises(DatasetAccessDeniedError): mocker.patch(dataset_find_by_id, return_value=SqlaTable()) mocker.patch(can_access_datasource, return_value=False) - utils.check_access(dataset_id=1, chart_id=1, actor=User()) + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_is_admin(mocker: MockFixture, app_context: AppContext) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access from superset.models.slice import Slice mocker.patch(dataset_find_by_id, return_value=SqlaTable()) mocker.patch(can_access_datasource, return_value=True) mocker.patch(is_user_admin, return_value=True) mocker.patch(chart_find_by_id, return_value=Slice()) - assert check_access(dataset_id=1, chart_id=1, actor=User()) is True + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_is_owner(mocker: MockFixture, app_context: AppContext) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access from superset.models.slice import Slice mocker.patch(dataset_find_by_id, return_value=SqlaTable()) @@ -125,12 +188,17 @@ def test_saved_chart_is_owner(mocker: MockFixture, app_context: AppContext) -> N mocker.patch(is_user_admin, return_value=False) mocker.patch(is_owner, return_value=True) mocker.patch(chart_find_by_id, return_value=Slice()) - assert check_access(dataset_id=1, chart_id=1, actor=User()) == True + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_has_access(mocker: MockFixture, app_context: AppContext) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access from superset.models.slice import Slice mocker.patch(dataset_find_by_id, return_value=SqlaTable()) @@ -139,12 +207,17 @@ def test_saved_chart_has_access(mocker: MockFixture, app_context: AppContext) -> mocker.patch(is_owner, return_value=False) mocker.patch(can_access, return_value=True) mocker.patch(chart_find_by_id, return_value=Slice()) - assert check_access(dataset_id=1, chart_id=1, actor=User()) == True + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) def test_saved_chart_no_access(mocker: MockFixture, app_context: AppContext) -> None: from superset.connectors.sqla.models import SqlaTable - from superset.explore.utils import check_access + from superset.explore.utils import check_access as check_chart_access from superset.models.slice import Slice with raises(ChartAccessDeniedError): @@ -154,4 +227,66 @@ def test_saved_chart_no_access(mocker: MockFixture, app_context: AppContext) -> mocker.patch(is_owner, return_value=False) mocker.patch(can_access, return_value=False) mocker.patch(chart_find_by_id, return_value=Slice()) - check_access(dataset_id=1, chart_id=1, actor=User()) + check_chart_access( + datasource_id=1, + chart_id=1, + actor=User(), + datasource_type=DatasourceType.TABLE, + ) + + +def 
test_dataset_has_access(mocker: MockFixture, app_context: AppContext) -> None: + from superset.connectors.sqla.models import SqlaTable + from superset.explore.utils import check_datasource_access + + mocker.patch(dataset_find_by_id, return_value=SqlaTable()) + mocker.patch(can_access_datasource, return_value=True) + mocker.patch(is_user_admin, return_value=False) + mocker.patch(is_owner, return_value=False) + mocker.patch(can_access, return_value=True) + assert ( + check_datasource_access( + datasource_id=1, + datasource_type=DatasourceType.TABLE, + ) + == True + ) + + +def test_query_has_access(mocker: MockFixture, app_context: AppContext) -> None: + from superset.explore.utils import check_datasource_access + from superset.models.sql_lab import Query + + mocker.patch(query_find_by_id, return_value=Query()) + mocker.patch(raise_for_access, return_value=True) + mocker.patch(is_user_admin, return_value=False) + mocker.patch(is_owner, return_value=False) + mocker.patch(can_access, return_value=True) + assert ( + check_datasource_access( + datasource_id=1, + datasource_type=DatasourceType.QUERY, + ) + == True + ) + + +def test_query_no_access(mocker: MockFixture, app_context: AppContext) -> None: + from superset.connectors.sqla.models import SqlaTable + from superset.explore.utils import check_datasource_access + from superset.models.core import Database + from superset.models.sql_lab import Query + + with raises(SupersetSecurityException): + mocker.patch( + query_find_by_id, + return_value=Query(database=Database(), sql="select * from foo"), + ) + mocker.patch(query_datasources_by_name, return_value=[SqlaTable()]) + mocker.patch(is_user_admin, return_value=False) + mocker.patch(is_owner, return_value=False) + mocker.patch(can_access, return_value=False) + check_datasource_access( + datasource_id=1, + datasource_type=DatasourceType.QUERY, + ) diff --git a/tests/unit_tests/jinja_context_test.py b/tests/unit_tests/jinja_context_test.py index 1f88f4f1a99c8..75c49f0977bf6 100644 --- a/tests/unit_tests/jinja_context_test.py +++ b/tests/unit_tests/jinja_context_test.py @@ -14,8 +14,15 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. +# pylint: disable=invalid-name, unused-argument -from superset.jinja_context import where_in +import json + +import pytest +from pytest_mock import MockFixture + +from superset.datasets.commands.exceptions import DatasetNotFoundError +from superset.jinja_context import dataset_macro, where_in def test_where_in() -> None: @@ -25,3 +32,95 @@ def test_where_in() -> None: assert where_in([1, "b", 3]) == "(1, 'b', 3)" assert where_in([1, "b", 3], '"') == '(1, "b", 3)' assert where_in(["O'Malley's"]) == "('O''Malley''s')" + + +def test_dataset_macro(mocker: MockFixture, app_context: None) -> None: + """ + Test the ``dataset_macro`` macro. 
+ """ + # pylint: disable=import-outside-toplevel + from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn + from superset.models.core import Database + + columns = [ + TableColumn(column_name="ds", is_dttm=1, type="TIMESTAMP"), + TableColumn(column_name="num_boys", type="INTEGER"), + TableColumn(column_name="revenue", type="INTEGER"), + TableColumn(column_name="expenses", type="INTEGER"), + TableColumn( + column_name="profit", type="INTEGER", expression="revenue-expenses" + ), + ] + metrics = [ + SqlMetric(metric_name="cnt", expression="COUNT(*)"), + ] + + dataset = SqlaTable( + table_name="old_dataset", + columns=columns, + metrics=metrics, + main_dttm_col="ds", + default_endpoint="https://www.youtube.com/watch?v=dQw4w9WgXcQ", # not used + database=Database(database_name="my_database", sqlalchemy_uri="sqlite://"), + offset=-8, + description="This is the description", + is_featured=1, + cache_timeout=3600, + schema="my_schema", + sql=None, + params=json.dumps( + { + "remote_id": 64, + "database_name": "examples", + "import_time": 1606677834, + } + ), + perm=None, + filter_select_enabled=1, + fetch_values_predicate="foo IN (1, 2)", + is_sqllab_view=0, # no longer used? + template_params=json.dumps({"answer": "42"}), + schema_perm=None, + extra=json.dumps({"warning_markdown": "*WARNING*"}), + ) + DatasetDAO = mocker.patch("superset.datasets.dao.DatasetDAO") + DatasetDAO.find_by_id.return_value = dataset + + assert ( + dataset_macro(1) + == """(SELECT ds AS ds, + num_boys AS num_boys, + revenue AS revenue, + expenses AS expenses, + revenue-expenses AS profit +FROM my_schema.old_dataset) AS dataset_1""" + ) + + assert ( + dataset_macro(1, include_metrics=True) + == """(SELECT ds AS ds, + num_boys AS num_boys, + revenue AS revenue, + expenses AS expenses, + revenue-expenses AS profit, + COUNT(*) AS cnt +FROM my_schema.old_dataset +GROUP BY ds, + num_boys, + revenue, + expenses, + revenue-expenses) AS dataset_1""" + ) + + assert ( + dataset_macro(1, include_metrics=True, columns=["ds"]) + == """(SELECT ds AS ds, + COUNT(*) AS cnt +FROM my_schema.old_dataset +GROUP BY ds) AS dataset_1""" + ) + + DatasetDAO.find_by_id.return_value = None + with pytest.raises(DatasetNotFoundError) as excinfo: + dataset_macro(1) + assert str(excinfo.value) == "Dataset 1 not found!" diff --git a/tests/unit_tests/pandas_postprocessing/test_boxplot.py b/tests/unit_tests/pandas_postprocessing/test_boxplot.py index 9252b0da78846..27dff0adeb894 100644 --- a/tests/unit_tests/pandas_postprocessing/test_boxplot.py +++ b/tests/unit_tests/pandas_postprocessing/test_boxplot.py @@ -124,3 +124,28 @@ def test_boxplot_percentile_incorrect_params(): metrics=["cars"], percentiles=[10, 90, 10], ) + + +def test_boxplot_type_coercion(): + df = names_df + df["cars"] = df["cars"].astype(str) + df = boxplot( + df=df, + groupby=["region"], + whisker_type=PostProcessingBoxplotWhiskerType.TUKEY, + metrics=["cars"], + ) + + columns = {column for column in df.columns} + assert columns == { + "cars__mean", + "cars__median", + "cars__q1", + "cars__q3", + "cars__max", + "cars__min", + "cars__count", + "cars__outliers", + "region", + } + assert len(df) == 4
(
           ...style,
         }}
         onClick={onClick}
+        data-column-name={col.id}
+        {...(allowRearrangeColumns && {
+          draggable: 'true',
+          onDragStart,
+          onDragOver: e => e.preventDefault(),
+          onDragEnter: e => e.preventDefault(),
+          onDrop,
+        })}
       >
         {/* can't use `columnWidth &&` because it may also be zero */}
         {config.columnWidth ? (
@@ -434,12 +446,13 @@ export default function TableChart(
           />
         ) : null}
-        {label}
+        {label}