more linting #218
Conversation
And before people think I forgot my good manners, @fperrad: thanks!
closed in favor of #219
@fperrad what kind of static analyzer are you using, btw? Is it freely available for open-source scanning, like for example Coverity?
I use
@fperrad It seems @czurnieden included everything from here in #219.
Yes he did, but the author of the commits is @fperrad and that has to stay.
These are trivial commits due to static analysis; I certainly would not claim any credit for the trivial commits I made. Furthermore, I think these linting commits, merged after the "real code" by @czurnieden, are harmful since they obfuscate the git history. It has complicated things a few times when I was browsing the history and wanted to identify the commit which introduced some change. Closing this PR in favor of #219 would have been the right thing. Ideally the linting would happen in CI before the PR is merged, but that does not seem possible since the linting tool is not freely available. For me the question here is why we are even relying on an obsolete linting tool that has been deprecated by its vendor.
@sjaeckel What I want to say is: I wanted to help you by closing PRs which, to my understanding, are resolved in other ways that I consider appropriate. I certainly don't want to take any credit away from @fperrad for the checking he is doing, which I generally consider important. It would probably help if you clarified what you expect from collaborators. Earlier you mentioned that PR reviews are the bottleneck; does this mean you just want to hear a different opinion on each of the PRs, and no other actions should be taken?
Resolving problems mentioned in #218
very true
any proposal on what to use instead?
TBH I didn't see it from this point of view, and I agree that those are valid objections/points.
phew, you're asking tough things!
Having another person review the PRs was what I hoped for. I definitely didn't expect someone to come around the corner wanting to update the complete project, but I'm definitely not against it. Coming back to my expectations of collaborators, there is one thing I can already say: we don't break userspace, no matter how odd some things from the past look today (or rather, we only break after we warned early, or when it's very important).
Does something like MISRA for Cppcheck exist? I think the most important ones are the warnings of recent GCC, Clang, Clang scan-build and Coverity. But more tools are better in that respect.
For me the objective here is not to update the complete project. I missed a few features like floating-point conversion and two's complement functions, which were added. I am using tommath for embedding in language runtimes, like in Tcl. It is also being discussed to use it in GHC. Language runtimes only use a small subset of the functions, but it is good to have more available if something comes up.

My recent changes were mostly about trying to reduce the amount of legacy stuff a bit. But the priority is correctness, and that means having a well-defined API, a test suite, static analysis, etc. My PR regarding the test suite was made in the hope that it makes it easier to test new functions. The valgrind addition was highly appreciated, and I would like to have #221, which makes the allocation sizes better defined.

I think the quality criteria change with time. In particular, C code should be put under close scrutiny; there are quite a few people switching to Rust or other tools. So the question we should ask here is: does it make sense to use the library in new projects with high quality standards, or should we just give up and use reimplementations directly? Or do we think that what we have now is sufficient and somehow set in stone? I don't think it is, but we have to draw a line between the changes you consider needless and the changes which are critical. For example, a bug should be fixed, no question. On the other side there are changes like #211 which are totally unnecessary but might only
I agree with that. However, I would generally be ok with introducing some breaking points, like a 1.0 to 2.0 transition. At that point users can decide whether they want to upgrade; given the small API, I think this is a quick thing. The story looks a bit more complicated for package maintainers, who might end up having to ship multiple versions. Going via deprecation is at least a way to introduce a grace period, which makes sense given the age of the project.
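A minimal sketch of how such a grace period could look (the macro MP_DEPRECATED and the function names below are illustrative placeholders, not how libtommath actually handles this): the old entry point keeps working, but compilers that support deprecation attributes warn at build time and point callers at the replacement.

```c
/* Sketch only: MP_DEPRECATED, old_api and new_api are made-up names. */
#if defined(__GNUC__) || defined(__clang__)
#  define MP_DEPRECATED(msg) __attribute__((deprecated(msg)))
#else
#  define MP_DEPRECATED(msg)
#endif

int new_api(int x);

MP_DEPRECATED("old_api will be removed in 2.0, use new_api instead")
int old_api(int x);

int new_api(int x) { return x * 2; }

/* During the grace period the old name stays as a thin wrapper. */
int old_api(int x) { return new_api(x); }
```

Callers still build and link, but get a compile-time warning until the old name is finally removed in the next major version.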
after merge of #191 the return value of mp_set_int is never checked in this new file bn_mp_ilogb.c
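For context, mp_set_int returns an error code in the 1.x API, so the expectation is that callers propagate it rather than ignore it. A minimal sketch of the checked pattern (set_base_checked is a hypothetical wrapper, not code taken from bn_mp_ilogb.c):

```c
#include <tommath.h>

/* Sketch: propagate the error codes of mp_init and mp_set_int instead
 * of silently discarding them. */
static int set_base_checked(mp_int *base, unsigned long value)
{
   int err;
   if ((err = mp_init(base)) != MP_OKAY) {
      return err;
   }
   if ((err = mp_set_int(base, value)) != MP_OKAY) {
      mp_clear(base);
      return err;
   }
   return MP_OKAY;
}
```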