feat: blankstate metrics columns #20755

Merged · 106 commits · Jul 26, 2022
Changes from 1 commit
Commits (106)
6fc435d
add POC ExploreMixin
hughhhh Jun 6, 2022
632ce21
Working POC
hughhhh Jun 6, 2022
6f22618
Created/tested query dataset dropdown
lyndsiWilliams Jun 7, 2022
236c489
Add isValidDatasourceType to @superset-ui/core and hide query dropdown
lyndsiWilliams Jun 8, 2022
24261d1
fix merge conflicts
hughhhh Jun 8, 2022
bd3b6a9
Visual updates to explore datasource panel
lyndsiWilliams Jun 8, 2022
64f4181
Temporarily make Query icon visible
lyndsiWilliams Jun 8, 2022
7c07534
Remove Query icon visibility
lyndsiWilliams Jun 8, 2022
6fab57f
Removed isValidDatasourceType check
lyndsiWilliams Jun 8, 2022
3a46152
Added Query preview Modal from DatasourceControl if the data source t…
eric-briscoe Jun 8, 2022
613d78a
> fix integration point with frontend
hughhhh Jun 8, 2022
e076f13
Merge branch 'master' of https://github.com/preset-io/superset into c…
hughhhh Jun 13, 2022
31dfdd2
Adjusts conditional logic approach to be extensible for additional ty…
eric-briscoe Jun 13, 2022
bbb91e0
Merge branch 'chart-power-query' into lyndsi/dataset-panel-updates
eric-briscoe Jun 13, 2022
8513cc5
Merge branch 'lyndsi/dataset-panel-updates' into ericbriscoe/sc-41493…
eric-briscoe Jun 13, 2022
cd54603
refactor
hughhhh Jun 14, 2022
a173cfa
Merge branch 'master' of https://github.com/preset-io/superset into c…
hughhhh Jun 15, 2022
0295818
Merge pull request #218 from preset-io/ericbriscoe/sc-41493/query-pre…
eric-briscoe Jun 15, 2022
de20276
set field for sql
hughhhh Jun 15, 2022
bd7db0d
Fixes issue where Missing query parameters error was showing in datas…
eric-briscoe Jun 16, 2022
e344197
add query_language
hughhhh Jun 16, 2022
9c5bc48
fix ds main_dttm
hughhhh Jun 17, 2022
b6fbc33
Fixes issue where menu.tsx was blocking access to redux debugging for…
eric-briscoe Jun 17, 2022
6af2961
Fixes issue where database id was not available to save query as data…
eric-briscoe Jun 17, 2022
12546f5
Merge branch 'master' into chart-power-query
eric-briscoe Jun 17, 2022
ff59846
Merge branch 'master' into chart-power-query
hughhhh Jun 21, 2022
bdbf2d1
oops
hughhhh Jun 21, 2022
559e1a2
fix pre-commit to 50 errors now
hughhhh Jun 21, 2022
890be43
fix circuliar dep
hughhhh Jun 21, 2022
4b833be
Disables showing Metrics section in DatasourcePanel when Query is the…
eric-briscoe Jun 21, 2022
4254b7e
adds condition to use query.columns if query.results is not present e…
eric-briscoe Jun 22, 2022
4710b3d
down to 26 now
hughhhh Jun 22, 2022
017bde6
patch for pre-commit
hughhhh Jun 24, 2022
075c5d8
one more pre-commit
hughhhh Jun 24, 2022
d73504c
added explore_json error
AAfghahi Jun 24, 2022
74ae38c
added error messages
AAfghahi Jun 24, 2022
ea49535
add for metrics
pkdotson Jun 27, 2022
3f5bea4
add text for columns
pkdotson Jun 27, 2022
07ee508
add model open/close method
pkdotson Jun 27, 2022
3ab4526
add propogation and methods
pkdotson Jun 28, 2022
d5d8aaa
change link to span
pkdotson Jun 28, 2022
91cd5c4
lint fix
pkdotson Jun 28, 2022
97ee4eb
Fixes frontend lint and TypeScript errors unit test fixes will be nex…
eric-briscoe Jun 28, 2022
276bc28
Aditional TypeScript error fix
eric-briscoe Jun 28, 2022
a8ff273
Fixes unit test failure
eric-briscoe Jun 28, 2022
ddf2c8d
fix some types
pkdotson Jun 28, 2022
044945b
added frontend piece
AAfghahi Jun 28, 2022
a9fb857
fix type
pkdotson Jun 28, 2022
8d8a868
Merge branch master
eric-briscoe Jun 28, 2022
7ace7a4
Fixes bad import caused by merge from master and removes duplicate sh…
eric-briscoe Jun 28, 2022
e436582
Fixes for DartasourceControl Test Suite
eric-briscoe Jun 28, 2022
76c3505
Fix lint error
eric-briscoe Jun 28, 2022
3370439
Fixes unit test issues due to array instead of a component being pass…
eric-briscoe Jun 29, 2022
d1866e8
merged main branch
AAfghahi Jun 29, 2022
18b6fa1
Merge pull request #223 from preset-io/arash/error_messages
AAfghahi Jun 29, 2022
a7ded35
Fixes unit test failure for DatasourceControl and simplifies getDatas…
eric-briscoe Jun 30, 2022
c65c225
Merge branch 'master' into chart-power-query
eric-briscoe Jun 30, 2022
54e4023
fix ts
pkdotson Jun 30, 2022
232d6dc
pylint
AAfghahi Jun 29, 2022
aa2756e
core_test fix
AAfghahi Jul 5, 2022
ebdfcf0
Merge pull request #225 from preset-io/arash/pylint_errors
hughhhh Jul 5, 2022
a52acd9
Merge branch 'master' into chart-power-query
eric-briscoe Jul 5, 2022
3653e40
Fixes line error post merge from master
eric-briscoe Jul 5, 2022
c0ab26b
merge master
hughhhh Jul 6, 2022
655fddf
fixed from master
hughhhh Jul 6, 2022
f752655
fixed from master
hughhhh Jul 6, 2022
c478add
Merge branch 'chart-power-query' of ssh://github.com/preset-io/supers…
eric-briscoe Jul 6, 2022
5e32c24
Fixes issue where Overwrite dataset does not work due to userid error
eric-briscoe Jul 6, 2022
6707808
Resolves TypeScript errors with changes made for SPA merging in and c…
eric-briscoe Jul 6, 2022
b28d79c
fix: top right panel view query functionality
hughhhh Jul 7, 2022
c619618
Merge branch 'master' of https://github.com/preset-io/superset into c…
hughhhh Jul 7, 2022
fa85488
remove unneeded code from core.py
hughhhh Jul 7, 2022
a09d781
working samples endpoint for query
hughhhh Jul 7, 2022
076e4fd
add owner check
hughhhh Jul 8, 2022
c5ca22d
update FE for it
hughhhh Jul 8, 2022
a35038d
handle columns are dict vs object
hughhhh Jul 8, 2022
661bb9e
fix exceptions
hughhhh Jul 8, 2022
b476b70
fix fe lint
hughhhh Jul 8, 2022
56fcdc9
fix test
hughhhh Jul 8, 2022
551ddc8
Merge pull request #233 from preset-io/chart-power-query-samples
hughhhh Jul 8, 2022
6e5443a
add tab_name to payload
hughhhh Jul 8, 2022
97ac731
Enables use of tab name from Query
eric-briscoe Jul 8, 2022
664773a
fix merge conflicts
hughhhh Jul 11, 2022
a073313
fix cypress test
hughhhh Jul 11, 2022
f717e18
save columns on execution
hughhhh Jul 11, 2022
16ea687
fix frontend build test
hughhhh Jul 11, 2022
3f6fa0a
remove parathesis around columns
hughhhh Jul 11, 2022
88a51b4
changing column types
AAfghahi Jul 11, 2022
a105ca6
fixing samples that has literal_columns
AAfghahi Jul 11, 2022
9e09af2
address comments
hughhhh Jul 11, 2022
69777d3
add changes
pkdotson Jul 12, 2022
6334e21
Merge branch 'chart-power-query' of https://github.com/preset-io/supe…
pkdotson Jul 12, 2022
8394881
fix path
pkdotson Jul 14, 2022
996f8d3
Merge branch 'master' of https://github.com/preset-io/superset into f…
pkdotson Jul 18, 2022
e9aaa1f
fix merge
pkdotson Jul 18, 2022
3db3d10
fix types
pkdotson Jul 19, 2022
0cf7000
remove console
pkdotson Jul 19, 2022
85177aa
add type
pkdotson Jul 21, 2022
79836f0
fix linting
hughhhh Jul 24, 2022
6073777
update to enum
hughhhh Jul 24, 2022
dc74d2b
Merge branch 'master' of https://github.com/preset-io/superset into f…
hughhhh Jul 25, 2022
bee3795
fix test
hughhhh Jul 25, 2022
b626843
remove explore from buttons
hughhhh Jul 26, 2022
9aac370
fix logic
hughhhh Jul 26, 2022
c0853cf
fix logic
hughhhh Jul 26, 2022
69009be
oops
hughhhh Jul 26, 2022
fix pre-commit to 50 errors now
hughhhh authored Jun 21, 2022
commit 559e1a252acab11f6ab6650cf47bbf541f307450
2 changes: 1 addition & 1 deletion superset/common/query_context_factory.py
@@ -82,7 +82,7 @@ def create(

# pylint: disable=no-self-use
def _convert_to_model(self, datasource: DatasourceDict) -> BaseDatasource:
from superset.dao.datasource.dao import DatasourceDAO
from superset.datasource.dao import DatasourceDAO
from superset.utils.core import DatasourceType

return DatasourceDAO.get_datasource(
2 changes: 1 addition & 1 deletion superset/common/query_context_processor.py
@@ -124,7 +124,7 @@ def get_df_payload(
invalid_columns=invalid_columns,
)
)

query_result = self.get_query_result(query_obj)
annotation_data = self.get_annotation_data(query_obj)
cache.set_query_result(
91 changes: 67 additions & 24 deletions superset/models/helpers.py
@@ -58,14 +58,20 @@

from superset import app, db, is_feature_enabled, security_manager
from superset.common.db_query_status import QueryStatus
from superset.connectors.sqla import SqlMetric
from superset.db_engine_specs import BaseEngineSpec
from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
from superset.exceptions import SupersetSecurityException
from superset.jinja_context import (
BaseTemplateProcessor,
ExtraCache,
get_template_processor,
)
from superset.models.core import Database
from superset.sql_parse import (
extract_table_references,
has_table_query,
insert_rls,
ParsedQuery,
sanitize_clause,
Table as TableName,
@@ -630,47 +636,64 @@ class ExploreMixin:
}

@property
def owners_data(self):
return []
def owners_data(self) -> List[Any]:
raise NotImplementedError()

@property
def metrics(self):
return []
def metrics(self) -> List[Any]:
raise NotImplementedError()

@property
def uid(self):
return "foo"
def uid(self) -> str:
raise NotImplementedError()

@property
def is_rls_supported(self):
return False
def is_rls_supported(self) -> bool:
raise NotImplementedError()

@property
def cache_timeout(self):
return None
def cache_timeout(self) -> int:
raise NotImplementedError()

@property
def column_names(self):
return [col.get("column_name") for col in self.columns]
def column_names(self) -> List[str]:
raise NotImplementedError()

@property
def offset(self):
return 0
def offset(self) -> int:
raise NotImplementedError()

@property
def main_dttm_col(self) -> str:
for col in self.columns:
if col.get('is_dttm'):
return col.get('column_name')
return None
raise NotImplementedError()

@property
def dttm_cols(self) -> List[str]:
return [col.get('column_name') for col in self.columns if col.get('is_dttm')]
raise NotImplementedError()

@property
def db_engine_spec(self) -> Type["BaseEngineSpec"]:
raise NotImplementedError()

@property
def database(self) -> Type["Database"]:
raise NotImplementedError()

@property
def schema(self) -> str:
raise NotImplementedError()

@property
def sql(self) -> str:
raise NotImplementedError()

@property
def columns(self) -> List[Any]:
raise NotImplementedError()

@staticmethod
def get_extra_cache_keys(query_obj):
return []
def get_extra_cache_keys(query_obj: Dict[str, Any]) -> List[str]:
raise NotImplementedError()

def make_sqla_column_compatible(
self, sqla_col: ColumnElement, label: Optional[str] = None
@@ -1014,6 +1037,26 @@ def handle_single_value(value: Optional[FilterValue]) -> Optional[FilterValue]:
values = values[0] if values else None
return values

def _get_series_orderby(
self,
series_limit_metric: Metric,
metrics_by_name: Dict[str, SqlMetric],
columns_by_name: Dict[str, TableColumn],
) -> Column:
if utils.is_adhoc_metric(series_limit_metric):
assert isinstance(series_limit_metric, dict)
ob = self.adhoc_metric_to_sqla(series_limit_metric, columns_by_name)
elif (
isinstance(series_limit_metric, str)
and series_limit_metric in metrics_by_name
):
ob = metrics_by_name[series_limit_metric].get_sqla_col()
else:
raise QueryObjectValidationError(
_("Metric '%(metric)s' does not exist", metric=series_limit_metric)
)
return ob

def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-many-branches,too-many-statements
self,
apply_fetch_values_predicate: bool = False,
@@ -1506,13 +1549,13 @@ def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-ma
inner_groupby_exprs = []
inner_select_exprs = []
for gby_name, gby_obj in groupby_series_columns.items():
label = get_column_name(gby_name)
label = utils.get_column_name(gby_name)
inner = self.make_sqla_column_compatible(gby_obj, gby_name + "__")
inner_groupby_exprs.append(inner)
inner_select_exprs.append(inner)

inner_select_exprs += [inner_main_metric_expr]
subq = select(inner_select_exprs).select_from(tbl)
subq = sa.select(inner_select_exprs).select_from(tbl)
inner_time_filter = []

if dttm_col and not db_engine_spec.time_groupby_inline:
Expand Down Expand Up @@ -1540,7 +1583,7 @@ def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-ma
# conditionally mutated, as it refers to the column alias in
# the inner query
col_name = db_engine_spec.make_label_compatible(gby_name + "__")
on_clause.append(gby_obj == column(col_name))
on_clause.append(gby_obj == sa.column(col_name))

tbl = tbl.join(subq.alias(), and_(*on_clause))
else:
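Note on the pattern in this hunk: ExploreMixin stops returning placeholder values and instead declares the attributes it expects a concrete model to supply, so a model that forgets one fails loudly rather than silently returning stubs. A minimal sketch of that contract, with illustrative names rather than the actual Superset classes:

from typing import Any, Dict, List


class ExploreContract:
    """Sketch of the mixin contract: properties a concrete model must supply."""

    @property
    def columns(self) -> List[Dict[str, Any]]:
        raise NotImplementedError()

    @property
    def sql(self) -> str:
        raise NotImplementedError()

    def summary(self) -> str:
        # Shared logic (get_sqla_query in the real mixin) relies on the
        # declared properties without knowing which model implements them.
        return f"{len(self.columns)} columns from: {self.sql}"


class QueryLike(ExploreContract):
    """Sketch of a concrete model filling in the contract."""

    @property
    def columns(self) -> List[Dict[str, Any]]:
        return [{"column_name": "ds", "is_dttm": True}]

    @property
    def sql(self) -> str:
        return "SELECT ds FROM some_table"


print(QueryLike().summary())  # "1 columns from: SELECT ds FROM some_table"
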
44 changes: 44 additions & 0 deletions superset/models/sql_lab.py
@@ -40,6 +40,7 @@
from sqlalchemy.orm import backref, relationship

from superset import security_manager
from superset.db_engine_specs import BaseEngineSpec
from superset.models.helpers import (
AuditMixinNullable,
ExploreMixin,
@@ -233,6 +234,49 @@ def raise_for_access(self) -> None:
def db_engine_spec(self) -> Type["BaseEngineSpec"]:
return self.database.db_engine_spec

@property
def owners_data(self):
return []

@property
def metrics(self):
return []

@property
def uid(self):
return "foo"

@property
def is_rls_supported(self):
return False

@property
def cache_timeout(self):
return None

@property
def column_names(self):
return [col.get("column_name") for col in self.columns]

@property
def offset(self):
return 0

@property
def main_dttm_col(self) -> str:
for col in self.columns:
if col.get("is_dttm"):
return col.get("column_name")
return None

@property
def dttm_cols(self) -> List[Any]:
return [col.get("column_name") for col in self.columns if col.get("is_dttm")]

@staticmethod
def get_extra_cache_keys(query_obj: Dict[str, Any]) -> List[str]:
return []


class SavedQuery(Model, AuditMixinNullable, ExtraJSONMixin, ImportExportMixin):
"""ORM model for SQL query"""
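Note: Query satisfies the ExploreMixin contract from plain dict columns, which is why the properties above use .get("column_name") and .get("is_dttm") rather than ORM attribute access. A small standalone sketch of the same logic, with made-up sample data:

from typing import Any, Dict, List, Optional

columns: List[Dict[str, Any]] = [
    {"column_name": "ds", "is_dttm": True},
    {"column_name": "revenue", "is_dttm": False},
]

column_names = [col.get("column_name") for col in columns]
dttm_cols = [col.get("column_name") for col in columns if col.get("is_dttm")]
main_dttm_col: Optional[str] = next(
    (col.get("column_name") for col in columns if col.get("is_dttm")), None
)

print(column_names)   # ['ds', 'revenue']
print(dttm_cols)      # ['ds']
print(main_dttm_col)  # ds
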
13 changes: 7 additions & 6 deletions superset/utils/core.py
@@ -1648,26 +1648,27 @@ def extract_dataframe_dtypes(
"date": GenericDataType.TEMPORAL,
}

# todo(hughhhh): can we make the column_object a Union
# todo(hughhhh): can we make the column_object a Union
if datasource and datasource.type == "query":
columns_by_name = {column.get('column_name'): column for column in datasource.columns}
columns_by_name = {
column.get("column_name"): column for column in datasource.columns
}
else:
columns_by_name = (
{column.column_name: column for column in datasource.columns}
if datasource
else {}
)

generic_types: List[GenericDataType] = []
for column in df.columns:
column_object = columns_by_name.get(column)
series = df[column]
inferred_type = infer_dtype(series)
# todo(hughhhh): can we make the column_object a Union
if datasource.type == "query":
if datasource and datasource.type == "query": # type: ignore
generic_type = (
GenericDataType.TEMPORAL
if column_object and column_object.get('is_dttm')
if column_object and column_object.get("is_dttm")
else inferred_type_map.get(inferred_type, GenericDataType.STRING)
)
else:
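Note: extract_dataframe_dtypes now branches on the datasource type because a "query" datasource exposes its columns as dicts while other datasources expose ORM column objects. A hedged sketch of that lookup-table construction, using stand-in objects rather than real Superset models:

from types import SimpleNamespace
from typing import Any, Dict


def build_columns_by_name(datasource: Any) -> Dict[str, Any]:
    # "query" datasources hold columns as plain dicts.
    if datasource and datasource.type == "query":
        return {
            column.get("column_name"): column for column in datasource.columns
        }
    # Other datasources hold ORM column objects with attributes.
    return (
        {column.column_name: column for column in datasource.columns}
        if datasource
        else {}
    )


query_ds = SimpleNamespace(type="query", columns=[{"column_name": "ds"}])
print(build_columns_by_name(query_ds))  # {'ds': {'column_name': 'ds'}}
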
25 changes: 9 additions & 16 deletions superset/views/core.py
@@ -773,15 +773,23 @@ def explore(
value = GetFormDataCommand(parameters).run()
initial_form_data = json.loads(value) if value else {}

from superset.dao.datasource.dao import DatasourceDAO
from superset.datasource.dao import DatasourceDAO
from superset.models.helpers import ExploreMixin
from superset.utils.core import DatasourceType

# Handle SIP-68 Models or explore view
# API will always use /explore/<datasource_type>/<int:datasource_id>/ to query
# new models to power any viz in explore
datasource: Optional[BaseDatasource] = None
datasource_id = request.args.get("datasource_id", datasource_id)
datasource_type = request.args.get("datasource_type", datasource_type)
dummy_datasource_data: Dict[str, Any] = {
"type": datasource_type,
"name": "[Missing Dataset]",
"columns": [],
"metrics": [],
"database": {"id": 0, "backend": ""},
}

if datasource_id and datasource_type:
# 1. Query datasource object by type and id
@@ -856,13 +864,6 @@
)
standalone_mode = ReservedUrlParameters.is_standalone_mode()
force = request.args.get("force") in {"force", "1", "true"}
dummy_datasource_data: Dict[str, Any] = {
"type": datasource_type,
"name": datasource_name,
"columns": [],
"metrics": [],
"database": {"id": 0, "backend": ""},
}
try:
datasource_data = (
datasource.data if datasource else dummy_datasource_data
@@ -929,7 +930,6 @@
# fallback unkonw datasource to table type
datasource_type = SqlaTable.type

datasource: Optional[BaseDatasource] = None
if datasource_id is not None:
try:
datasource = DatasourceDAO.get_datasource(
@@ -1002,13 +1002,6 @@
)
standalone_mode = ReservedUrlParameters.is_standalone_mode()
force = request.args.get("force") in {"force", "1", "true"}
dummy_datasource_data: Dict[str, Any] = {
"type": datasource_type,
"name": datasource_name,
"columns": [],
"metrics": [],
"database": {"id": 0, "backend": ""},
}
try:
datasource_data = datasource.data if datasource else dummy_datasource_data
except (SupersetException, SQLAlchemyError):
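Note: the explore view now builds dummy_datasource_data once, before its first use, and falls back to it wherever the real datasource cannot be loaded. A simplified sketch of that fallback shape, illustrative rather than the actual view code:

from typing import Any, Dict, Optional


def resolve_datasource_data(
    datasource: Optional[Any], datasource_type: Optional[str]
) -> Dict[str, Any]:
    # Built once, up front, so every failure path below can fall back to it.
    dummy_datasource_data: Dict[str, Any] = {
        "type": datasource_type,
        "name": "[Missing Dataset]",
        "columns": [],
        "metrics": [],
        "database": {"id": 0, "backend": ""},
    }
    try:
        return datasource.data if datasource else dummy_datasource_data
    except Exception:  # the real view catches SupersetException/SQLAlchemyError
        return dummy_datasource_data


print(resolve_datasource_data(None, "query")["name"])  # [Missing Dataset]
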