
Conversation

@akeeste akeeste commented Nov 2, 2023

This is a quick PR to resolve #247. It adds an optional `silent` argument to the CDIP module's `get_netcdf_variables` and `request_parse_workflow` functions that suppresses an extraneous print statement when running in production environments.

@mikebannis I know this is delayed quite a bit, but let me know if this resolves the issue from #247.
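
For reference, usage would look roughly like this (the station number and parameter names below are illustrative, not part of this diff):

```python
from mhkit.wave.io import cdip

# Illustrative station and parameters; any valid CDIP station/variables work.
data = cdip.request_parse_workflow(
    station_number="100",
    parameters=["waveHs", "waveTp"],
    years=2021,
    silent=True,  # suppress the 2D-processing print statement in production
)
```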

@akeeste akeeste requested a review from ssolson November 2, 2023 20:24
@akeeste akeeste linked an issue Nov 3, 2023 that may be closed by this pull request
@ssolson ssolson added the Clean Up Improve code consistency and readability label Nov 3, 2023

@ssolson ssolson left a comment

Adam this looks good to me. I found a small typo in the example notebook that was there from a previous commit.

Please change that, and then this should be good to merge. Thanks for tackling this.

"## 4.b. `request_parse_workflow`\n",
"\n",
"In the previous example we requested the NetCDF file and then processed the data. This workflow has been codified into a single function to streamline the process and adds additional functionality as well. The `request_parse_workflow` function accepts a netCDF object or a station number. This means the user may pass a CDIP ndetCDF file loaded from file, pull the data with `request_netcdf` and then pass, or just pass a station number letting the function know what data to parse and return. Secondly, the `request_parse_workflow` function accepts parameters allowing the user to specify to only return specific parameters reducing processing requirements. This is especially useful for processing 2D data which is only processed is specifically requested due to the amount of time it takes to process all the 2D data. Next, `request_parse_workflow` will slice on time by years, start_date, or end date. Years can be a single integer or a list of integers and is not required to be consecutive. If specified the start date will remove any data prior to the specified string (e.g. '2011-01-01') and end_date will remove any data after the speficied date. start_date and end_date may be used together, seperatly or not at all. Years works indpendently of start and end date. Next, the data_type defaults to historic but specifying this as realtime will return realtime data from the buoy. Lastly, there is a the boolean `all_2D_variables`. If set to true the function will return all of the wave 2D variables. It is not reccomended to do this due to the computational expense to do so, Instead it is reccomended to specify 2D quantities of interest using the `parameters` keyword.\n",
"In the previous example we requested the NetCDF file and then processed the data. This workflow has been codified into a single function to streamline the process and adds additional functionality as well. The `request_parse_workflow` function accepts a netCDF object or a station number. This means the user may pass a CDIP ndetCDF file loaded from file, pull the data with `request_netcdf` and then pass, or just pass a station number letting the function know what data to parse and return. Secondly, the `request_parse_workflow` function accepts parameters allowing the user to specify to only return specific parameters reducing processing requirements. This is especially useful for processing 2D data which is only processed is specifically requested due to the amount of time it takes to process all the 2D data. A print statement indicates whether the function is currently still processing 2D variables. Use the `silent=True` keyword argument to turn off this print statement in production environments.\n",

"ndetCDF" -> netCDF

akeeste commented Nov 6, 2023

@ssolson Thanks. Typo is fixed. Merging

@akeeste akeeste merged commit f5f25a2 into MHKiT-Software:develop Nov 6, 2023
@akeeste akeeste mentioned this pull request Nov 6, 2023
@akeeste akeeste deleted the cdipSilent branch November 6, 2023 16:16
@ssolson ssolson mentioned this pull request Apr 24, 2024
@ssolson ssolson mentioned this pull request May 6, 2024
ssolson added a commit that referenced this pull request May 8, 2024
# MHKiT v0.8.0
We're excited to announce the release of MHKiT v0.8.0, which brings a host of new features, enhancements, and bug fixes across various modules, ensuring compatibility with Python 3.10 and 3.11, and introducing full xarray support for more flexible data handling. Significant updates in the Wave and DOLfYN modules improve functionality and extend capabilities.

## Python 3.10 & 3.11 Support
MHKiT now supports Python 3.10 and 3.11. Support for Python 3.12 will follow in the next minor update.
- #240


## Wave Module
### Enhancements:
**Automatic Threshold Calculation for Peaks-Over-Threshold**: We've introduced a new feature that automatically calculates the "best" threshold for identifying significant wave events. This method, originally developed by Neary, V. S., et al. in their 2020 study, has now been translated from Matlab to Python, enhancing our existing peaks-over-threshold functionality.
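
The release notes do not reproduce the algorithm itself; as a generic illustration of the peaks-over-threshold step this feeds into (not the Neary et al. 2020 threshold method), one might do:

```python
import numpy as np

# Generic peaks-over-threshold sketch (NOT the Neary et al. 2020 automatic
# threshold method): pick a candidate threshold from a high quantile and keep
# the peak of each contiguous exceedance run so one event is counted once.
rng = np.random.default_rng(0)
hs = rng.weibull(1.5, size=10_000)      # synthetic significant wave heights
threshold = np.quantile(hs, 0.99)       # simple quantile heuristic, illustration only

above = hs > threshold
starts = np.flatnonzero(above & ~np.r_[False, above[:-1]])   # exceedance run starts
ends = np.flatnonzero(above & ~np.r_[above[1:], False])      # exceedance run ends
peaks = np.array([hs[s:e + 1].max() for s, e in zip(starts, ends)])
```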

**Wave Heights Analysis**: A new function, `wave_heights`, has been added to extract the heights of individual waves from a time series. This function uses zero up-crossing analysis to accurately measure wave heights, improving upon our previous methods which only provided the maximum value between up-crossings.

**Enhanced Zero Crossing Analysis**: Building on the above, the zero crossing code previously embedded in `global_peaks` has been isolated into a helper function. This modular approach not only refines the codebase but also supports new functionalities such as calculating wave heights, zero crossing periods, and identifying crests.
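
As a generic illustration of zero up-crossing analysis (a numpy sketch, not MHKiT's implementation), individual wave heights and periods can be pulled from a surface elevation record like so:

```python
import numpy as np

# Illustrative zero up-crossing analysis (not MHKiT's implementation).
t = np.linspace(0, 600, 6001)                      # 10-minute record at ~10 Hz
eta = 0.8 * np.sin(2 * np.pi * t / 8) + 0.3 * np.sin(2 * np.pi * t / 5.1)
eta -= eta.mean()                                  # reference to the mean water level

# A zero up-crossing occurs where the surface passes from below to above zero.
up = np.flatnonzero((eta[:-1] < 0) & (eta[1:] >= 0))

# Each individual wave spans two consecutive up-crossings; its height is the
# crest-to-trough excursion within that span.
heights = np.array([eta[a:b].max() - eta[a:b].min() for a, b in zip(up[:-1], up[1:])])
periods = np.diff(t[up])                           # zero up-crossing periods
```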

### Bug Fixes:
**Contour Sampling Error in Wave Contours**: A bug identified in `mhkit.wave.contours.samples_contour` has been resolved. The issue occurred when period samples defined using the maximum period resulted in values outside the interpolation range of the contour data. This was corrected by ensuring that all sampling points are within the interpolation range and adjusting the contour data selection process accordingly.

- #268 
- #252 
- #278


## Xarray Support
MHKiT functions now fully support the use of xarray for passing and returning data.
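
A hedged sketch of what this looks like from the user side (the xarray call assumes a spectrum converted with pandas' `to_xarray()` is accepted; confirm the exact expected dimensions against the v0.8.0 docs):

```python
import numpy as np
from mhkit import wave

f = np.linspace(0.01, 1.0, 100)                      # frequency [Hz]
S_pd = wave.resource.jonswap_spectrum(f, 8, 2.5)     # Tp = 8 s, Hs = 2.5 m (pandas DataFrame)

Hm0_pandas = wave.resource.significant_wave_height(S_pd)   # pandas input, as before

# As of v0.8.0 the same functions are documented to accept xarray objects;
# the conversion below is an assumed-compatible example, not a verified one.
S_xr = S_pd.to_xarray()
Hm0_xarray = wave.resource.significant_wave_height(S_xr)
```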

- #279 
- #282
- #285
- #302
- #310


## DOLfYN

Thanks to the many contributions from MHKiT users, the DOLfYN module includes a significant number of enhancements and bug fixes.

### Enhancements:
**Altimeter Support**: Enhanced the Nortek Signature reader to support reading ADCP dual profile configurations.

**Data Handling Improvements**: Introduced logic to skip messy header data that can accumulate during measurements collected via Nortek software on PCs and Macs.

**Instrument Noise Subtraction**: Added a function to subtract instrument noise from turbulence intensity estimation using RMS calculations, providing results that differ by approximately 1% from the existing standard deviation-based "I" property.
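
The correction is essentially an RMS noise subtraction; a minimal sketch of the idea (illustrative only, not DOLfYN's code):

```python
import numpy as np

# Illustrative noise-corrected turbulence intensity (not DOLfYN's implementation).
rng = np.random.default_rng(1)
u = 1.0 + 0.10 * rng.standard_normal(10_000)   # streamwise velocity [m/s]
noise = 0.03                                   # instrument (Doppler) noise std dev [m/s]

U = abs(u.mean())
ti_raw = u.std() / U
# Remove the noise contribution from the measured variance before taking the square root.
ti_corrected = np.sqrt(max(u.std() ** 2 - noise ** 2, 0.0)) / U
```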

**Improved File Handling**: Updates for RDI files to handle changing "number of cells" and variable "cell sizes," which are now bin-averaged into the largest cell size.

### Bug Fixes:
**Power Spectra Calculation**: Fixed a bug where a given noise value was not being subtracted from the power spectra, and noise was inadvertently added as an input to dissipation rate calculations.

**Improved Header Handling**: Allowed RDI reader to skip junk headers effectively.

**Nortek Reader C Types Update**: Adjusted C types in the Nortek reader to handle below-zero water temperatures and to allow pitch and roll values to go negative.


- #280 
- #289
- #290
- #292
- #293
- #294
- #299


## River & Tidal: D3D
Added limits to `variable_interpolation` and added the capability to pass three arrays as input to `create_points`.
- #271

## Developer Experience
### Black formatting
Black formatting is now enforced on all MHKiT files. This ensures consistent formatting across the MHKiT package.
- #281

### Linting & Type Hints
MHKiT is in the process of enforcing pylint and adding type hints to all functions. So far this has been completed and is enforced in the Loads and Power modules.
- #288 
- #296 

### CI/CD
This release introduces a significant reduction in testing time during development. This is achieved by reducing the number of tests run for pull requests against the develop branch and by only running the hindcast tests when changes are made to the hindcast module. A bug in the hindcast CI was fixed: it previously ran only on changes to the hindcast tests rather than on changes to the hindcast module. Additionally, the wave and wind hindcast tests needed to be separated into two jobs due to the excessive time taken to build the wind cache. This created a number of follow-on PRs to solidify the logic of these jobs. A special case for Python 3.8, pip, and macOS was added that uses Homebrew to install NetCDF and HDF5 so the tests pass.
- #241
- #270
- #306
- #311
- #317
- #318
- #319
- #320
- #324

### Clean Up
MHKiT fixed an implementation error where functions used `assert` instead of built-in exceptions for type and value checking (see the sketch after the PR list below). Additionally, these PRs removed unused files, fixed typos, and added an argument that allows users to run CDIP API calls silently.
- #276
- #272
- #273
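
The `assert`-to-exception change amounts to replacing interpreter-strippable assertions with explicit `TypeError`/`ValueError` checks, e.g. (an illustrative pattern, not a specific MHKiT function):

```python
def scaled_depth(h: float, scale: float = 2.0) -> float:
    """Illustrative pattern only, not an actual MHKiT function."""
    # Before: `assert isinstance(h, (int, float))` -- silently removed under `python -O`.
    if not isinstance(h, (int, float)):
        raise TypeError(f"h must be int or float, got {type(h).__name__}")
    if h <= 0:
        raise ValueError("h must be positive")
    return scale * h
```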