more mypy checks, docopt-->argparse, formatting, cleanup, ci improvements, ... #405

Merged: 23 commits, Sep 14, 2023
Commits (23, changes from all commits)
b80a88d
ci: check formatting of wizard
Sep 7, 2023
abd302b
ci: remove ci job "CI Test Successful"
Sep 7, 2023
a300934
refactor/ci: remove clang versions in incrementing order
Sep 7, 2023
b8696db
ci: cache python dependencies
Sep 7, 2023
34f05da
chore: cleanup outdated MANIFEST.in requirements.txt
Sep 11, 2023
60f5ad1
refactor/ci: ci and pre-commit use same script to check for license tag
Sep 11, 2023
89971d0
feat: common mypy configuration file
Sep 11, 2023
81c509f
refactor(stronger mypy checks): warn_redundant_casts
Sep 11, 2023
a0eeba2
refactor(stronger mypy checks): warn_return_any
Sep 11, 2023
129eea3
refactor(stronger mypy checks): disallow_subclassing_any
Sep 11, 2023
17a0ca7
refactor(stronger mypy checks): disallow_untyped_decorators
Sep 11, 2023
4ec0184
refactor(stronger mypy checks): no implicit_reexport
Sep 11, 2023
01cedcb
refactor(stronger mypy checks): strict_equality
Sep 11, 2023
0574e24
style: cleanup
Sep 11, 2023
5158fdf
refactor(stronger mypy checks): no_implicit_optional
Sep 11, 2023
85a2373
refactor: remove duplicate definition of get_path
Sep 12, 2023
7394927
refactor(explorer): move stuff out of __init__.py
Sep 12, 2023
b2cbd7a
refactor: separate argument parsing from execution logic
Sep 13, 2023
45ae7a9
refactor: file name instead of directory for Data_CUInst.py (interfac…
Sep 14, 2023
1fbeca4
refactor: move API out of __main__ to discopop_explorer.py
Sep 14, 2023
99809b8
feat: switch explorer to argparse
Sep 14, 2023
91afdaf
chore: fix bad file opening mode, improve CLI
Sep 14, 2023
ae153cc
chore: cleanup (comment only)
Sep 14, 2023
56 changes: 13 additions & 43 deletions .github/workflows/ci.yml
@@ -30,52 +30,28 @@ jobs:
uses: actions/checkout@v3

- name: "Check all files for DiscoPoP License tag"
run: |
ERROR=""
for FILE in $(find . -type f -not -path "**/.git/**" -not -path "**/test/**" \
-not -path "**/docs/**" \
-not -path "**/LICENSE" -not -path "**/VERSION" \
-not -path "**/_version.py" -not -path "**/__init__.py" -not -path "**/py.typed" \
-not -path "**.png" -not -path "**.svg" -not -path "**.ico" \
)
do
FILE_ERROR=""
head -n 20 ${FILE} | grep -q "DiscoPoP software" || FILE_ERROR="yes"
[ -z "$FILE_ERROR" ] || ERROR="yes"
[ -z "$FILE_ERROR" ] || echo "Missing License tag at: ${FILE}"
[ -z "$FILE_ERROR" ] || continue
head -n 20 ${FILE} | grep -q "Technische Universitaet Darmstadt, Germany" || FILE_ERROR="yes"
[ -z "$FILE_ERROR" ] || ERROR="yes"
[ -z "$FILE_ERROR" ] || echo "Missing License tag at: ${FILE}"
[ -z "$FILE_ERROR" ] || continue
head -n 20 ${FILE} | grep -q "3-Clause BSD License" || FILE_ERROR="yes"
[ -z "$FILE_ERROR" ] || ERROR="yes"
[ -z "$FILE_ERROR" ] || echo "Missing License tag at: ${FILE}"
done
# Report error (1), if license tags are missing
[ -z "$ERROR" ] || exit 1
run: ./scripts/dev/check-license.sh $(find . -type f)

- name: Setup Python
uses: actions/setup-python@v3
uses: actions/setup-python@v4
with:
python-version: 3.8
cache: 'pip' # uses requirements.txt

- name: Install Python dependencies
run: |
pip install --upgrade pip
pip install .[ci]
run: pip install -r requirements.txt

- name: "Run MyPy Type Checker - DiscoPoP Explorer"
run: python -m mypy -p discopop_explorer
run: python -m mypy --config-file=mypy.ini -p discopop_explorer

- name: "Run MyPy Type Checker - DiscoPoP Library"
run: python -m mypy -p discopop_library
run: python -m mypy --config-file=mypy.ini -p discopop_library

- name: "Run MyPy Type Checker - DiscoPoP Profiler"
run: python -m mypy -p discopop_profiler
run: python -m mypy --config-file=mypy.ini -p discopop_profiler

- name: "Run MyPy Type Checker - DiscoPoP Wizard"
run: python -m mypy -p discopop_wizard
run: python -m mypy --config-file=mypy.ini -p discopop_wizard

- name: "Check formatting of DiscoPoP Explorer"
run: python -m black -l 100 --check discopop_explorer
@@ -86,6 +62,9 @@ jobs:
- name: "Check formatting of DiscoPoP Profiler"
run: python -m black -l 100 --check discopop_profiler

- name: "Check formatting of DiscoPoP Wizard"
run: python -m black -l 100 --check discopop_wizard

- name: Test DiscoPop Explorer - DISABLED
run: |
if false; then # disable the check temporarily
@@ -111,7 +90,7 @@ jobs:
- name: Setup DiscoPoP Profiler - Install Dependencies
run: |
sudo apt-get update
sudo apt-get remove clang-8 clang-10 clang-9 clang-12 clang-11
sudo apt-get remove clang-8 clang-9 clang-10 clang-11 clang-12
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add -
sudo apt-add-repository "deb http://apt.llvm.org/focal/ llvm-toolchain-focal-11 main"
sudo apt-get update
@@ -145,19 +124,10 @@ jobs:
- name: "Execute DiscoPoP Profiler - simple_pipeline - discopopPass"
run: .github/workflows/tests/profiler.sh simple_pipeline discopopPass

ci_successful:
name: "CI Tests Successful"
runs-on: ubuntu-20.04
needs: execute_tests
steps:
- name: "Report success"
run: exit 0


update_wiki-build:
name: "Update Wiki - Build"
runs-on: ubuntu-20.04
needs: ci_successful
needs: execute_tests
if: github.ref == 'refs/heads/master'
steps:
- name: Checkout
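The inline license check removed above is replaced by a call to scripts/dev/check-license.sh, which is not itself part of this excerpt. A hypothetical sketch of that script, assuming it mirrors the removed inline logic but takes the file list as arguments instead of running find itself:

#!/usr/bin/env bash
# Hypothetical sketch of scripts/dev/check-license.sh (the real script is not
# shown in this diff). Assumption: every file passed as an argument must carry
# all three license markers within its first 20 lines, with the same paths
# excluded as in the old inline CI step.
ERROR=""
for FILE in "$@"; do
    case "$FILE" in
        */.git/*|*/test/*|*/docs/*|*/LICENSE|*/VERSION|*/_version.py|*/__init__.py|*/py.typed|*.png|*.svg|*.ico)
            continue ;;
    esac
    for MARKER in "DiscoPoP software" "Technische Universitaet Darmstadt, Germany" "3-Clause BSD License"; do
        if ! head -n 20 "$FILE" | grep -q "$MARKER"; then
            echo "Missing License tag at: $FILE"
            ERROR="yes"
            break
        fi
    done
done
# Fail if any file was missing a license tag
[ -z "$ERROR" ] || exit 1

CI invokes it as ./scripts/dev/check-license.sh $(find . -type f); the .pre-commit-config.yaml change below wires the same script into the licensetag hook, so both checks stay in sync.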
7 changes: 3 additions & 4 deletions .pre-commit-config.yaml
@@ -30,8 +30,9 @@ repos:
rev: 'v1.5.1' # Use the sha / tag you want to point at
hooks:
- id: mypy # run mypy
#args: [--strict, --ignore-missing-imports]
args: [--config-file=mypy.ini, --ignore-missing-imports]
#additional_dependencies: [dep==version.version.version, ...]
# NOTE: pre-commit runs mypy in a virtualenv, so dependencies are not installed unless explicitly listed here

# Using this mirror lets us use mypyc-compiled black, which is about 2x faster
# more info: https://github.com/psf/black/blob/main/docs/integrations/source_version_control.md
@@ -46,8 +47,6 @@ repos:
hooks:
- id: licensetag
name: Check all files for DiscoPoP License tag
entry: scripts/dev/check_license.sh
entry: scripts/dev/check-license.sh
language: script
# exclude: we could exclude files here, but we do it in the script instead

# TODO the script should also be used by the CI pipeline
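Both the mypy pre-commit hook above and the CI steps now pass --config-file=mypy.ini, so the stricter checks live in one shared file. The mypy.ini itself is not included in this excerpt; a minimal sketch, assuming it simply enables the options named in the commit messages, might look like this:

[mypy]
# Options named in the commit list above; the actual file may set more.
warn_redundant_casts = True
warn_return_any = True
disallow_subclassing_any = True
disallow_untyped_decorators = True
implicit_reexport = False
no_implicit_optional = True
strict_equality = True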
4 changes: 1 addition & 3 deletions MANIFEST.in
@@ -10,8 +10,6 @@
# https://packaging.python.org/guides/using-manifest-in/

include VERSION
include discopop_explorer/requirements.txt
include discopop_library/requirements.txt
include discopop_wizard/requirements.txt
include requirements.txt
recursive-include discopop_wizard *
recursive-include discopop_library *
29 changes: 9 additions & 20 deletions discopop_explorer/PETGraphX.py
@@ -300,9 +300,7 @@ def get_nesting_level(self, pet: PETGraphX, return_invert_result: bool = True) -
parent_nesting_levels.append(
min(
2,
cast(LoopNode, parent_node).get_nesting_level(
pet, return_invert_result=False
),
parent_node.get_nesting_level(pet, return_invert_result=False),
)
)

Expand All @@ -322,7 +320,7 @@ def get_entry_node(self, pet: PETGraphX) -> Optional[Node]:
if pet.node_at(s) not in pet.direct_children(self)
]
if len(predecessors_outside_loop_body) > 0:
return cast(Node, node)
return node
return None


@@ -361,7 +359,7 @@ def get_entry_cu_id(self, pet: PETGraphX) -> NodeID:
def get_exit_cu_ids(self, pet: PETGraphX) -> Set[NodeID]:
exit_cu_ids: Set[NodeID] = set()
if self.children_cu_ids is not None:
for child_cu_id in cast(List[NodeID], self.children_cu_ids):
for child_cu_id in self.children_cu_ids:
if (
len(pet.out_edges(child_cu_id, EdgeType.SUCCESSOR)) == 0
and len(pet.in_edges(child_cu_id, EdgeType.SUCCESSOR)) != 0
@@ -933,7 +931,7 @@ def node_at(self, node_id: NodeID) -> Node:
:param node_id: id of the node
:return: Node
"""
return self.g.nodes[node_id]["data"]
return cast(Node, self.g.nodes[node_id]["data"])

# generic type for subclasses of Node
NodeT = TypeVar("NodeT", bound=Node)
@@ -1154,7 +1152,7 @@ def unused_check_alias(self, s: NodeID, t: NodeID, d: Dependency, root_loop: Nod
parent_func_source = self.get_parent_function(self.node_at(t))

res = False
d_var_name_str = cast(str, str(d.var_name))
d_var_name_str = cast(str, d.var_name)

if self.unused_is_global(d_var_name_str, sub) and not (
self.is_passed_by_reference(d, parent_func_sink)
@@ -1278,7 +1276,7 @@ def get_variables(self, nodes: Sequence[Node]) -> Dict[Variable, Set[MemoryRegio
):
if dep.var_name == var_name.name:
if dep.memory_region is not None:
res[var_name].add(cast(MemoryRegion, dep.memory_region))
res[var_name].add(dep.memory_region)
return res

def get_undefined_variables_inside_loop(
@@ -1523,7 +1521,7 @@ def get_reduction_sign(self, line: str, name: str) -> str:
return rv["operation"]
return ""

def dump_to_pickled_json(self) -> str:
def dump_to_pickled_json(self):
"""Encodes and returns the entire Object into a pickled json string.
The encoded string can be reconstructed into an object by using:
jsonpickle.decode(json_str)
@@ -1545,9 +1543,6 @@ def check_reachability(self, target: Node, source: Node, edge_types: List[EdgeTy
queue = [target]
while len(queue) > 0:
cur_node = queue.pop(0)
if type(cur_node) == list:
cur_node_list = cast(List[Node], cur_node)
cur_node = cur_node_list[0]
visited.append(cur_node.id)
tmp_list = [
(s, t, e)
@@ -1559,7 +1554,7 @@ def check_reachability(self, target: Node, source: Node, edge_types: List[EdgeTy
return True
else:
if e[0] not in visited:
queue.append(cast(Node, self.node_at(e[0])))
queue.append(self.node_at(e[0]))
return False

def is_predecessor(self, source_id: NodeID, target_id: NodeID) -> bool:
@@ -1632,9 +1627,6 @@ def check_reachability_and_get_path_nodes(
queue: List[Tuple[CUNode, List[CUNode]]] = [(target, [])]
while len(queue) > 0:
cur_node, cur_path = queue.pop(0)
if type(cur_node) == list:
cur_node_list = cast(List[CUNode], cur_node)
cur_node = cur_node_list[0]
visited.append(cur_node.id)
tmp_list = [
(s, t, e)
@@ -1696,7 +1688,7 @@ def get_memory_regions(self, nodes: List[CUNode], var_name: str) -> Set[MemoryRe
for s, t, d in out_deps:
if d.var_name == var_name:
if d.memory_region is not None:
mem_regs.add(cast(MemoryRegion, d.memory_region))
mem_regs.add(d.memory_region)
return mem_regs

def get_path_nodes_between(
@@ -1718,9 +1710,6 @@ def get_path_nodes_between(

while len(queue) > 0:
cur_node, cur_path = queue.pop(0)
if type(cur_node) == list:
cur_node_list = cast(List[CUNode], cur_node)
cur_node = cur_node_list[0]
visited.append(cur_node.id)
tmp_list = [
(s, t, e)
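Several hunks in this file swap casts around rather than removing them outright: casts made redundant by the stricter configuration (warn_redundant_casts) are dropped, while node_at() gains one because networkx attribute access is untyped. A minimal stand-alone illustration of the latter, not taken from the project:

from typing import cast

import networkx as nx


class Node:
    """Stand-in for the project's Node class."""


def node_at(g: nx.DiGraph, node_id: str) -> Node:
    # g.nodes[node_id]["data"] is typed as Any; under warn_return_any, returning
    # it directly from a function annotated "-> Node" is reported, so the cast
    # both documents the expected type and satisfies the stricter mypy settings.
    return cast(Node, g.nodes[node_id]["data"])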
67 changes: 0 additions & 67 deletions discopop_explorer/__init__.py
@@ -1,67 +0,0 @@
# This file is part of the DiscoPoP software (http://www.discopop.tu-darmstadt.de)
#
# Copyright (c) 2020, Technische Universitaet Darmstadt, Germany
#
# This software may be modified and distributed under the terms of
# the 3-Clause BSD License. See the LICENSE file in the package base
# directory for details.

from pathlib import Path
from typing import List, Optional

from pluginbase import PluginBase # type:ignore

from .PETGraphX import PETGraphX, NodeType
from .parser import parse_inputs
from .pattern_detection import PatternDetectorX
from discopop_library.result_classes.DetectionResult import DetectionResult


def run(
project_path: str,
cu_xml: str,
dep_file: str,
loop_counter_file: str, # TODO we should be able to read all info from the _dep.txt file (?)
reduction_file: str,
plugins: List[str],
file_mapping: Optional[str] = None,
cu_inst_result_file: Optional[str] = None,
llvm_cxxfilt_path: Optional[str] = None,
discopop_build_path: Optional[str] = None,
enable_task_pattern: bool = False,
) -> DetectionResult:
pet = PETGraphX.from_parsed_input(*parse_inputs(cu_xml, dep_file, reduction_file, file_mapping))
print("PET CREATION FINISHED.")
# pet.show()
# TODO add visualization

plugin_base = PluginBase(package="plugins")

plugin_source = plugin_base.make_plugin_source(searchpath=[Path(__file__).parent / "plugins"])

for plugin_name in plugins:
p = plugin_source.load_plugin(plugin_name)
print("executing plugin before: " + plugin_name)
pet = p.run_before(pet)

pattern_detector = PatternDetectorX(pet)

res: DetectionResult = pattern_detector.detect_patterns(
project_path,
cu_xml,
dep_file,
loop_counter_file,
reduction_file,
file_mapping,
cu_inst_result_file,
llvm_cxxfilt_path,
discopop_build_path,
enable_task_pattern,
)

for plugin_name in plugins:
p = plugin_source.load_plugin(plugin_name)
# print("executing plugin after: " + plugin_name)
pet = p.run_after(pet)

return res
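Per the commit messages, the run() API removed here moves to discopop_explorer.py and the command line switches from docopt to argparse, with argument parsing kept separate from the execution logic. The new files are not shown in this excerpt; a hypothetical sketch of that pattern, with all names below illustrative rather than the actual DiscoPoP interface:

from argparse import ArgumentParser
from dataclasses import dataclass


@dataclass
class ExplorerArguments:
    """Plain container handed from the CLI layer to the execution logic."""
    cu_xml: str
    dep_file: str
    enable_task_pattern: bool


def parse_args() -> ExplorerArguments:
    # argparse replaces docopt; only this function knows about the CLI.
    parser = ArgumentParser(description="Run the DiscoPoP pattern detection.")
    parser.add_argument("--cu-xml", required=True, help="path to the CU xml file")
    parser.add_argument("--dep-file", required=True, help="path to the dependency file")
    parser.add_argument("--enable-task-pattern", action="store_true")
    args = parser.parse_args()
    return ExplorerArguments(args.cu_xml, args.dep_file, args.enable_task_pattern)


def run(arguments: ExplorerArguments) -> None:
    ...  # execution logic lives here, independent of how the arguments were obtained


if __name__ == "__main__":
    run(parse_args())

Separating the two keeps run() importable and testable without going through the command line.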