Improve code-base and coverage #31

Merged: 17 commits, Jan 26, 2022
7 changes: 7 additions & 0 deletions .coveragerc
@@ -1,3 +1,10 @@
[run]
dynamic_context = test_function
omit = fortls/__init__.py

[report]
exclude_lines =
if debug:
log.debug
except:
if not PY3K:
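
As context for the .coveragerc additions above: dynamic_context = test_function makes coverage.py record which test function executed each covered line. A minimal sketch of inspecting those recorded contexts through the coverage API (assumed usage, shown only for illustration; it is not part of this PR):

    # Sketch: run code under coverage with the project's .coveragerc, then list
    # the dynamic contexts (one per test function) stored in the coverage data.
    import coverage

    cov = coverage.Coverage(config_file=".coveragerc")
    cov.start()
    # ... exercise the code under test here ...
    cov.stop()
    cov.save()

    print(sorted(cov.get_data().measured_contexts()))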
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -9,7 +9,7 @@ jobs:
build:
strategy:
matrix:
os: [ubuntu-latest]
os: [ubuntu-latest, windows-latest]
python-version: ["3.7", "3.8", "3.9", "3.10"]
fail-fast: false
runs-on: ${{ matrix.os }}
27 changes: 15 additions & 12 deletions .github/workflows/python-publish.yml
@@ -55,18 +55,21 @@ jobs:
shell: bash
run: sed -i "s@\".*\"@\"${VERSION}\"@g" "fortls/_version.py"

- name: Commit the new version to dev
shell: bash
run: |
git config --global user.name 'gnikit'
git config --global user.email 'giannis.nikiteas@gmail.com'
git fetch origin
git switch dev
git commit -S fortls/_version.py -m "Auto-Update version" -v
git push
git tag -f "${VERSION}"
git push --delete origin "${VERSION}"
git push origin "${VERSION}"
# Disabled the workflow because it messes up the Releases on GitHub:
# releases whose tags are force-pushed end up marked as drafts.
# The version in _version.py will have to be updated manually.
# - name: Commit the new version to dev
# shell: bash
# run: |
# git config --global user.name 'gnikit'
# git config --global user.email 'giannis.nikiteas@gmail.com'
# git fetch origin
# git switch dev
# git commit -S fortls/_version.py -m "Auto-Update version" -v
# git push
# git tag -f "${VERSION}"
# git push --delete origin "${VERSION}"
# git push origin "${VERSION}"

- name: Build package
run: python -m build
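For reference, the version-bump step kept above rewrites the quoted string in fortls/_version.py with the release tag. A rough Python equivalent of that sed one-liner (illustration only; the VERSION environment variable is assumed to be set by the workflow):

    # Sketch: replace whatever sits between double quotes in fortls/_version.py
    # with the release tag, mirroring: sed -i "s@\".*\"@\"${VERSION}\"@g"
    import os
    import re
    from pathlib import Path

    version = os.environ["VERSION"]  # e.g. "v2.1.0"
    path = Path("fortls/_version.py")
    path.write_text(re.sub(r'".*"', f'"{version}"', path.read_text()))
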
11 changes: 10 additions & 1 deletion CHANGELOG.md
@@ -1,10 +1,19 @@
# CHANGELOG

## Unreleased
## 2.1.0

### Added

- Added coverage metric for Codecov
- Added coverage for `WHERE`, `ENUM`, max line/comment diagnostics and multilines
- Added Windows CI

### Fixed

- Fixed global `sort_keywords` option not propagating during parsing on Windows
([gnikit/fortls#36](https://github.com/gnikit/fortls/issues/36))
- Fixed unittests not propagating debugger state
([gnikit/fortls#35](https://github.com/gnikit/fortls/issues/35))

## 2.0.1

4 changes: 3 additions & 1 deletion docs/conf.py
@@ -15,6 +15,8 @@

sys.path.insert(0, os.path.abspath(".."))

from fortls._version import __version__ # noqa: E402

# Generate the agglomerated changes (from the CHANGELOG) between fortls
# and the fortran-language-server project
with open("../CHANGELOG.md", "r") as f:
@@ -57,7 +59,7 @@
author = "Giannis Nikiteas"

# The full version, including alpha/beta/rc tags
release = "2.0.0"
release = __version__


# -- General configuration ---------------------------------------------------
33 changes: 23 additions & 10 deletions docs/options.rst
@@ -84,7 +84,9 @@ source_dirs

.. code-block:: json

"source_dirs": ["./**", "/external/fortran/src"]
{
"source_dirs": ["./**", "/external/fortran/src"]
}

By default all directories under the current project will be recursively parsed
for Fortran sources. Alternatively, one can define a series of directories
@@ -97,7 +99,9 @@ incl_suffixes

.. code-block:: json

"incl_suffixes": [".h", ".FYP"]
{
"incl_suffixes": [".h", ".FYP"]
}

``fortls`` will parse only files with ``incl_suffixes`` extensions found in
``source_dirs``. By default ``incl_suffixes`` are defined as
@@ -112,7 +116,9 @@ excl_suffixes

.. code-block:: json

"excl_suffixes": ["_tmp.f90", "_hdf5.F90"]
{
"excl_suffixes": ["_tmp.f90", "_hdf5.F90"]
}

If certain files or suffixes do not need to be parsed, these can be excluded by
defining ``excl_suffixes``
@@ -130,8 +136,9 @@ its subdirectories from being parsed you should define it like so

.. code-block:: json

"excl_paths": ["exclude_dir/**"]

{
"excl_paths": ["exclude_dir/**"]
}

Preprocessor
############
@@ -141,7 +148,9 @@ pp_suffixes

.. code-block:: json

"pp_suffixes" : [".h", ".F90", ".fpp"]
{
"pp_suffixes" : [".h", ".F90", ".fpp"]
}

By default preprocessor definitions are parsed for all Fortran source files
with uppercase extensions e.g. ``.F90``, ``.F``, ``.F08``, etc. However, the
@@ -153,7 +162,9 @@ include_dirs

.. code-block:: json

"include_dirs": ["include", "preprocessor", "/usr/include"]
{
"include_dirs": ["include", "preprocessor", "/usr/include"]
}

By default ``fortls`` will scan the project's directories for files with extensions
``PP_SUFFIXES`` to parse for **preprocessor definitions**. However, if the preprocessor
@@ -169,9 +180,11 @@ pp_defs

.. code-block:: json

"pp_defs": {
"HAVE_PETSC": ""
"Mat": "type(tMat)"
{
"pp_defs": {
"HAVE_PETSC": ""
"Mat": "type(tMat)"
}
}

Additional **preprocessor definitions**, beyond those specified in files found in
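To make the pp_defs example above concrete: each key is treated as a preprocessor macro and is replaced by its value before the file is parsed. A conceptual sketch of that substitution (not fortls's actual implementation, only an illustration of what the option configures):

    # Conceptual sketch: whole-word macro substitution driven by pp_defs.
    import re

    pp_defs = {"HAVE_PETSC": "", "Mat": "type(tMat)"}

    def apply_defs(line, defs):
        for name, value in defs.items():
            line = re.sub(rf"\b{re.escape(name)}\b", value, line)
        return line

    print(apply_defs("Mat :: A", pp_defs))  # -> "type(tMat) :: A"
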
6 changes: 1 addition & 5 deletions fortls.py
@@ -1,10 +1,6 @@
#!/usr/bin/env python3
# file used for unit testing
if __name__ == "__main__":
import sys
import os

root_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
sys.path.insert(0, root_dir)
import fortls

fortls.main()
6 changes: 3 additions & 3 deletions fortls/__init__.py
@@ -14,8 +14,8 @@
from .interface import commandline_args


def error_exit(error_str):
print("ERROR: {0}".format(error_str))
def error_exit(error_str: str):
print(f"ERROR: {error_str}")
sys.exit(-1)


@@ -25,7 +25,7 @@ def main():
args = commandline_args(__name__).parse_args()

if args.version:
print("{0}".format(__version__))
print(__version__)
sys.exit(0)

debug_server = (
2 changes: 1 addition & 1 deletion fortls/_version.py
@@ -1 +1 @@
__version__ = "v2.0.1"
__version__ = "v2.1.0"
4 changes: 2 additions & 2 deletions fortls/helper_functions.py
@@ -38,7 +38,7 @@ def expand_name(line: str, char_poss: int) -> str:
regexs = [LOGICAL_REGEX, SQ_STRING_REGEX, DQ_STRING_REGEX, WORD_REGEX, NUMBER_REGEX]
for r in regexs:
for num_match in r.finditer(line):
if num_match.start(0) <= char_poss and num_match.end(0) >= char_poss:
if num_match.start(0) <= char_poss <= num_match.end(0):
return num_match.group(0)
return ""

@@ -344,7 +344,7 @@ def get_keywords(keywords, keyword_info={}):
def get_paren_substring(test_str):
i1 = test_str.find("(")
i2 = test_str.rfind(")")
if i1 > -1 and i2 > i1:
if -1 < i1 < i2:
return test_str[i1 + 1 : i2]
else:
return None
15 changes: 1 addition & 14 deletions fortls/intrinsics.py
@@ -54,17 +54,7 @@ def get_snippet(self, name_replace=None, drop_arg=-1):
arg_snip = None
else:
arg_list = self.args.split(",")
place_holders = []
for i, arg in enumerate(arg_list):
opt_split = arg.split("=")
if len(opt_split) > 1:
place_holders.append(
"{1}=${{{0}:{2}}}".format(i + 1, opt_split[0], opt_split[1])
)
else:
place_holders.append("${{{0}:{1}}}".format(i + 1, arg))
arg_str = "({0})".format(", ".join(arg_list))
arg_snip = "({0})".format(", ".join(place_holders))
arg_str, arg_snip = self.get_placeholders(arg_list)
name = self.name
if name_replace is not None:
name = name_replace
@@ -80,9 +70,6 @@ def get_signature(self):
call_sig, _ = self.get_snippet()
return call_sig, self.doc_str, arg_sigs

def get_documentation(self):
return self.doc_str

def get_hover(self, long=False):
return self.doc_str, False

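The get_snippet change above moves the inline placeholder construction into a shared helper. Based on the removed lines, the helper presumably behaves along these lines (a sketch of the assumed behaviour, not a copy of the new fortls code):

    # Sketch: build an LSP snippet string where each argument becomes a numbered
    # placeholder, e.g. x -> ${1:x} and kind=4 -> kind=${2:4}.
    def get_placeholders(arg_list):
        place_holders = []
        for i, arg in enumerate(arg_list):
            opt_split = arg.split("=")
            if len(opt_split) > 1:
                place_holders.append(f"{opt_split[0]}=${{{i + 1}:{opt_split[1]}}}")
            else:
                place_holders.append(f"${{{i + 1}:{arg}}}")
        arg_str = f"({', '.join(arg_list)})"
        arg_snip = f"({', '.join(place_holders)})"
        return arg_str, arg_snip

    print(get_placeholders(["x", "kind=4"]))
    # ('(x, kind=4)', '(${1:x}, kind=${2:4})')
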
47 changes: 7 additions & 40 deletions fortls/jsonrpc.py
@@ -1,19 +1,10 @@
import json

try:
import Queue
except ImportError:
import queue as Queue

import os
import queue
import threading
from collections import deque

try:
from urllib.parse import quote, unquote
except ImportError:
from urllib2 import quote
from urlparse import unquote
from pathlib import Path
from urllib.parse import quote, unquote

from fortls.constants import log

@@ -26,7 +17,7 @@ def path_from_uri(uri: str) -> str:
_, path = uri.split("file:///", 1)
else:
_, path = uri.split("file://", 1)
return os.path.normpath(unquote(path))
return str(Path(unquote(path)).resolve())


def path_to_uri(path: str) -> str:
@@ -185,7 +176,7 @@ def send_request_batch(self, requests):

# We communicate the request ids using a thread safe queue.
# It also allows us to bound the number of concurrent requests.
q = Queue.Queue(100)
q = queue.Queue(100)

def send():
for method, params in requests:
@@ -258,35 +249,11 @@ def write_rpc_notification(method, params):


def read_rpc_messages(content):
def read_header_content_length(line):
if len(line) < 2 or line[-2:] != "\r\n":
raise JSONRPC2ProtocolError("Line endings must be \\r\\n")
if line.startswith("Content-Length: "):
_, value = line.split("Content-Length: ")
value = value.strip()
try:
return int(value)
except ValueError:
raise JSONRPC2ProtocolError(f"Invalid Content-Length header: {value}")

def receive_next():
line = content.readline()
if line == "":
raise EOFError()
length = read_header_content_length(line)
# Keep reading headers until we find the sentinel line
# for the JSON request.
while line != "\r\n":
line = content.readline()
body = content.read(length)
# log.debug("RECV %s", body)
return json.loads(body)

#
conn = JSONRPC2Connection(content)
result_list = []
while True:
try:
result = receive_next()
result = conn._receive()
except EOFError:
break
else:
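
One behavioural note on the path_from_uri change in this file: os.path.normpath only cleans up the path string, while Path.resolve() also makes the path absolute (and resolves symlinks) against the current working directory. A small illustration (assumed example path, POSIX-style output):

    # Sketch: contrast the old and new path normalisation in path_from_uri.
    import os.path
    from pathlib import Path
    from urllib.parse import unquote

    p = unquote("some%20dir/../src/module.f90")
    print(os.path.normpath(p))  # "src/module.f90" -- string-only cleanup
    print(Path(p).resolve())    # absolute path anchored at the current directory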