This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Added Python scripts to find HTTP URLs and convert them to HTTPS, and processed the resulting HTTP-->HTTPS URL changes #346

Merged 35 commits on Oct 23, 2019
Commits (35 total; the diff below shows changes from 22 commits)
43abc9e
Added utf-8 encoding to NGramExtractor.py
mstfbl Oct 16, 2019
0c9c1c5
Added HTTP to HTTPS finder and converter
mstfbl Oct 21, 2019
e29f6ed
Merge branch 'master' into Issue-52-Actual
mstfbl Oct 21, 2019
afa5f35
Changes made by ChangeHttpURLsToHttps.py
mstfbl Oct 21, 2019
891f931
Added copyright statements
mstfbl Oct 21, 2019
e2a6df3
Merge remote-tracking branch 'upstream/master' into Issue-57
mstfbl Oct 22, 2019
1233fb8
Updated FindHttpURLs.py and ChangeHttpURLsToHttps.py
mstfbl Oct 22, 2019
9121123
Add reports of alterable, nonalterable and invalid URLs
mstfbl Oct 22, 2019
7cec162
Revert "Changes made by ChangeHttpURLsToHttps.py"
mstfbl Oct 22, 2019
b6a2f7f
Add URL changes made by ChangeHttpURLsToHttps.py
mstfbl Oct 22, 2019
786f1d5
Revert "Add URL changes made by ChangeHttpURLsToHttps.py"
mstfbl Oct 22, 2019
168deb5
Revert "Add reports of alterable, nonalterable and invalid URLs"
mstfbl Oct 22, 2019
038262f
Update FindHttpURLs.py and ChangHttpURLsToHttps.py
mstfbl Oct 22, 2019
81c5a96
Add HTTP to HTTPS URL reports
mstfbl Oct 22, 2019
72c85d9
Changes made by ChangeHttpToHttpsURLs.py
mstfbl Oct 22, 2019
136db68
Revert "Changes made by ChangeHttpToHttpsURLs.py"
mstfbl Oct 22, 2019
4f726e1
Revert "Add HTTP to HTTPS URL reports"
mstfbl Oct 22, 2019
c150d33
Revert "Update FindHttpURLs.py and ChangHttpURLsToHttps.py"
mstfbl Oct 22, 2019
e03b38c
Update FindHttpURLs.py and ChangeHttpURLsToHttps.py
mstfbl Oct 22, 2019
9d39f79
Add URL reports
mstfbl Oct 22, 2019
dd6c75c
Add Http-->Https URL changes through ChangeHttpURLsToHttpsURLs.py
mstfbl Oct 22, 2019
ba2742f
Removed if __name__ and main() statements
mstfbl Oct 22, 2019
9d96b11
Revert "Removed if __name__ and main() statements"
mstfbl Oct 22, 2019
90b5dac
Update nimbusml.pyproj
mstfbl Oct 22, 2019
75a6a53
Manually converted two alterable HTTP links to HTTPS.
mstfbl Oct 22, 2019
da2e8d8
Rename ChangeHttpURLsToHttps.py to changeHttpURLsToHttps.py
mstfbl Oct 22, 2019
15edf1e
Rename FindHttpURLs.py to findHttpURLs.py
mstfbl Oct 22, 2019
d16746c
URL in SigmoidKernel.txt is fixed for findHttpURLs.py to recognize it…
mstfbl Oct 23, 2019
735600e
Changed outdated URL as original URL redirected to current URL
mstfbl Oct 23, 2019
840e7e9
Update Report_InvalidUrls_FindHttpURLs.csv
mstfbl Oct 23, 2019
9d45424
Fixing reachable HTTP URLs
mstfbl Oct 23, 2019
5e72ec6
Update findHttpURLs.py
mstfbl Oct 23, 2019
38cfe0a
Updated URL reports, cleared invalid URLs
mstfbl Oct 23, 2019
1ac509e
Update of report for alterable HTTP URLs after running findHttpURLs.p…
mstfbl Oct 23, 2019
32fb17b
Removing URL reports for merge
mstfbl Oct 23, 2019
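The two scripts this PR adds, findHttpURLs.py and changeHttpURLsToHttps.py, are not included in the diff excerpt below, but the commit messages outline their workflow: scan the repository for http:// URLs, classify each one as alterable (its https:// variant is reachable), nonalterable, or invalid, write CSV reports per category, and rewrite the alterable URLs in place. The following is a minimal, hypothetical sketch of that workflow, not the code merged here; the function names, report file names, and the HEAD-request reachability check are assumptions.

```python
# Hypothetical sketch only: the merged findHttpURLs.py / changeHttpURLsToHttps.py
# may be organized differently. Assumed names: classify(), find_http_urls(),
# write_reports_and_convert(), and the Report_<label>_FindHttpURLs.csv files.
import csv
import re
from pathlib import Path
from urllib.request import Request, urlopen

# Match an http:// URL up to whitespace or common delimiters.
HTTP_URL = re.compile(r"""http://[^\s<>"')\]]+""")


def classify(url, timeout=10):
    """'alterable' if the https:// variant answers, 'nonalterable' if only the
    original http:// URL answers, 'invalid' if neither does."""
    for scheme, label in (("https", "alterable"), ("http", "nonalterable")):
        candidate = scheme + url[len("http"):]
        try:
            urlopen(Request(candidate, method="HEAD"), timeout=timeout)
            return label
        except (OSError, ValueError):  # URLError and timeouts are OSError subclasses
            continue
    return "invalid"


def find_http_urls(root, extensions=(".py", ".txt", ".h", ".css")):
    """Yield (file path, url, classification) for every http:// URL found."""
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in extensions:
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        for url in HTTP_URL.findall(text):
            yield path, url, classify(url)


def write_reports_and_convert(root, convert=False):
    """Write one CSV report per category; optionally rewrite alterable URLs."""
    rows = {"alterable": [], "nonalterable": [], "invalid": []}
    for path, url, label in find_http_urls(root):
        rows[label].append((str(path), url))
    for label, entries in rows.items():
        with open(f"Report_{label}_FindHttpURLs.csv", "w", newline="") as report:
            csv.writer(report).writerows(entries)
    if convert:
        for path, url in rows["alterable"]:
            p = Path(path)
            text = p.read_text(encoding="utf-8")
            p.write_text(text.replace(url, "https" + url[len("http"):]),
                         encoding="utf-8")


if __name__ == "__main__":
    write_reports_and_convert("src", convert=True)
```

In this sketch a HEAD request keeps the reachability probe cheap; the actual scripts may use a different check, different report names, or additional URL filtering before rewriting files.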
Diff view
2 changes: 1 addition & 1 deletion src/NativeBridge/DataViewInterop.h
@@ -16,7 +16,7 @@ typedef MANAGED_CALLBACK_PTR(bool, GETLABELS)(DataSourceBlock *source, int col,

// REVIEW: boost_python is not updated at the same speed as swig or pybind11.
// Both have a larger audience now, see about pybind11 https://github.com/davisking/dlib/issues/293
// It handles csr_matrix: http://pybind11-rtdtest.readthedocs.io/en/stable/advanced.html#transparent-conversion-of-dense-and-sparse-eigen-data-types.
// It handles csr_matrix: https://pybind11-rtdtest.readthedocs.io/en/stable/advanced.html#transparent-conversion-of-dense-and-sparse-eigen-data-types.
using namespace boost::python;

// The data source wrapper used for managed interop. Some of the fields of this are visible to managed code.
@@ -45,10 +45,10 @@
<https://en.wikipedia.org/wiki/Perceptron>`_

`Large Margin Classification Using the Perceptron Algorithm
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.48.8200>`_
<https://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.48.8200>`_

`Discriminative Training Methods for Hidden Markov Models
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.18.6725>`_
<https://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.18.6725>`_


:param loss: The default is :py:class:`'hinge' <nimbusml.loss.Hinge>`. Other
@@ -22,7 +22,7 @@
`Field Aware Factorization Machines
<https://www.csie.ntu.edu.tw/~r01922136/slides/ffm.pdf>`_,
`Field-aware Factorization Machines for CTR Prediction
<http://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf>`_,
<https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf>`_,
`Adaptive Subgradient Methods for Online Learning and Stochastic
Optimization
<http://jmlr.org/papers/volume12/duchi11a/duchi11a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/FastForestBinaryClassifier.txt
@@ -33,7 +33,7 @@
**Reference**

`Wikipedia: Random forest
<http://en.wikipedia.org/wiki/Random_forest>`_
<https://en.wikipedia.org/wiki/Random_forest>`_

`Quantile regression forest
<http://jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/FastForestRegressor.txt
@@ -43,7 +43,7 @@
**Reference**

`Wikipedia: Random forest
<http://en.wikipedia.org/wiki/Random_forest>`_
<https://en.wikipedia.org/wiki/Random_forest>`_

`Quantile regression forest
<http://jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/FastTreesBinaryClassifier.txt
@@ -57,7 +57,7 @@
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param optimizer: Default is ``sgd``.

2 changes: 1 addition & 1 deletion src/python/docs/docstrings/FastTreesRegressor.txt
@@ -62,7 +62,7 @@
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param optimizer: Default is ``sgd``.

2 changes: 1 addition & 1 deletion src/python/docs/docstrings/FastTreesTweedieRegressor.txt
@@ -14,7 +14,7 @@
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param optimizer: Default is ``sgd``.

4 changes: 2 additions & 2 deletions src/python/docs/docstrings/GamBinaryClassifier.txt
@@ -21,7 +21,7 @@
functions learned will step between the discretization boundaries.

This implementation is based on the this `paper
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
but diverges from it in several important respects: most
significantly,
in each round of boosting, rather than do one feature at a time, it
@@ -57,7 +57,7 @@
`Generalized additive models
<https://en.wikipedia.org/wiki/Generalized_additive_model>`_,
`Intelligible Models for Classification and Regression
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_


:param normalize: Specifies the type of automatic normalization used:
4 changes: 2 additions & 2 deletions src/python/docs/docstrings/GamRegressor.txt
@@ -21,7 +21,7 @@
functions learned will step between the discretization boundaries.

This implementation is based on the this `paper
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
but diverges from it in several important respects: most
significantly,
in each round of boosting, rather than do one feature at a time, it
@@ -57,7 +57,7 @@
`Generalized additive models
<https://en.wikipedia.org/wiki/Generalized_additive_model>`_,
`Intelligible Models for Classification and Regression
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_


:param normalize: Specifies the type of automatic normalization used:
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/LightLda.txt
@@ -10,7 +10,7 @@
topical vectors. LightLDA is an extremely
efficient implementation of LDA developed in MSR-Asia that
incorporates a number of optimization techniques
`(http://arxiv.org/abs/1412.1576) <http://arxiv.org/abs/1412.1576>`_.
`(https://arxiv.org/abs/1412.1576) <https://arxiv.org/abs/1412.1576>`_.
With the LDA transform, we can
train a topic model to produce 1 million topics with 1 million
vocabulary on a 1-billion-token document set one
4 changes: 2 additions & 2 deletions src/python/docs/docstrings/LocalDeepSvmBinaryClassifier.txt
@@ -39,14 +39,14 @@
More details about LD-SVM can be found in this paper `Local deep
kernel
learning for efficient non-linear SVM prediction
<http://research.microsoft.com/en-
<https://research.microsoft.com/en-
us/um/people/manik/pubs/Jose13.pdf>`_.


**Reference**

`Local deep kernel learning for efficient non-linear SVM prediction
<http://research.microsoft.com/en-
<https://research.microsoft.com/en-
us/um/people/manik/pubs/Jose13.pdf>`_


@@ -69,14 +69,14 @@

**Reference**

`Wikipedia: L-BFGS <http://en.wikipedia.org/wiki/L-BFGS>`_
`Wikipedia: L-BFGS <https://en.wikipedia.org/wiki/L-BFGS>`_

`Wikipedia: Logistic
regression <http://en.wikipedia.org/wiki/Logistic_regression>`_
regression <https://en.wikipedia.org/wiki/Logistic_regression>`_

`Scalable
Training of L1-Regularized Log-Linear Models
<http://research.microsoft.com/apps/pubs/default.aspx?id=78900>`_
<https://research.microsoft.com/apps/pubs/default.aspx?id=78900>`_

`Test Run - L1
and L2 Regularization for Machine Learning
6 changes: 3 additions & 3 deletions src/python/docs/docstrings/LogisticRegressionClassifier.txt
@@ -70,14 +70,14 @@

**Reference**

`Wikipedia: L-BFGS <http://en.wikipedia.org/wiki/L-BFGS>`_
`Wikipedia: L-BFGS <https://en.wikipedia.org/wiki/L-BFGS>`_

`Wikipedia: Logistic
regression <http://en.wikipedia.org/wiki/Logistic_regression>`_
regression <https://en.wikipedia.org/wiki/Logistic_regression>`_

`Scalable
Training of L1-Regularized Log-Linear Models
<http://research.microsoft.com/apps/pubs/default.aspx?id=78900>`_
<https://research.microsoft.com/apps/pubs/default.aspx?id=78900>`_

`Test Run - L1
and L2 Regularization for Machine Learning
4 changes: 2 additions & 2 deletions src/python/docs/docstrings/OneClassSVMAnomalyDetector.txt
@@ -29,10 +29,10 @@
us/library/azure/dn913103.aspx>`_

`Estimating the Support of a High-Dimensional Distribution
<http://research.microsoft.com/pubs/69731/tr-99-87.pdf>`_
<https://research.microsoft.com/pubs/69731/tr-99-87.pdf>`_

`New Support Vector Algorithms
<http://www.stat.purdue.edu/~yuzhu/stat598m3/Papers/NewSVM.pdf>`_
<https://www.stat.purdue.edu/~yuzhu/stat598m3/Papers/NewSVM.pdf>`_

`LIBSVM: A Library for Support Vector Machines
<https://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`_
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/PcaAnomalyDetector.txt
@@ -36,7 +36,7 @@

`Randomized Methods for Computing the Singular Value Decomposition
(SVD) of very large matrices
<http://web.stanford.edu/group/mmds/slides2010/Martinsson.pdf>`_
<https://web.stanford.edu/group/mmds/slides2010/Martinsson.pdf>`_
`A randomized algorithm for principal component analysis
<https://arxiv.org/abs/0809.2274>`_,
`Finding Structure with Randomness: Probabilistic Algorithms for
8 changes: 4 additions & 4 deletions src/python/docs/docstrings/SgdBinaryClassifier.txt
@@ -13,14 +13,14 @@
associated optimization problem is sparse, then Hogwild SGD achieves
a
nearly optimal rate of convergence. For a detailed reference, please
refer to `http://arxiv.org/pdf/1106.5730v2.pdf
<http://arxiv.org/pdf/1106.5730v2.pdf>`_.
refer to `https://arxiv.org/pdf/1106.5730v2.pdf
<https://arxiv.org/pdf/1106.5730v2.pdf>`_.


**Reference**

`http://arxiv.org/pdf/1106.5730v2.pdf
<http://arxiv.org/pdf/1106.5730v2.pdf>`_
`https://arxiv.org/pdf/1106.5730v2.pdf
<https://arxiv.org/pdf/1106.5730v2.pdf>`_


:param normalize: Specifies the type of automatic normalization used:
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/SsaForecaster.txt
@@ -11,7 +11,7 @@
input time-series where each component in the spectrum corresponds to a
trend, seasonal or noise component in the time-series. For details of the
Singular Spectrum Analysis (SSA), refer to `this document
<http://arxiv.org/pdf/1206.6910.pdf>`_.
<https://arxiv.org/pdf/1206.6910.pdf>`_.

.. seealso::
:py:func:`IIDChangePointDetector
4 changes: 2 additions & 2 deletions src/python/docs/docstrings/SsweEmbedding.txt
@@ -7,12 +7,12 @@
versions of `GloVe Models
<https://nlp.stanford.edu/projects/glove/>`_, `FastText
<https://en.wikipedia.org/wiki/FastText>`_, and `Sswe
<http://anthology.aclweb.org/P/P14/P14-1146.pdf>`_.
<https://anthology.aclweb.org/P/P14/P14-1146.pdf>`_.

.. remarks::
Sentiment-specific word embedding (SSWE) is a DNN featurizer
developed
by MSRA (`paper <http://anthology.aclweb.org/P/P14/P14-1146.pdf>`_).
by MSRA (`paper <https://anthology.aclweb.org/P/P14/P14-1146.pdf>`_).
It
incorporates sentiment information into the neural network to learn
sentiment specific word embedding. It proves to be useful in various
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/SupervisedBinner.txt
@@ -24,7 +24,7 @@
the default is to normalize features before training.

``SupervisedBinner`` implements the `Entropy-Based Discretization
<http://www.aaai.org/Papers/KDD/1996/KDD96-019.pdf>`_.
<https://www.aaai.org/Papers/KDD/1996/KDD96-019.pdf>`_.
Partition of the data is performed recursively to select the split
with highest entropy gain with respect to the label.
Therefore, the final binned features will have high correlation with
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/WordEmbedding.txt
@@ -10,7 +10,7 @@
available options are various versions of `GloVe Models
<https://nlp.stanford.edu/projects/glove/>`_, `FastText
<https://en.wikipedia.org/wiki/FastText>`_, and `Sswe
<http://anthology.aclweb.org/P/P14/P14-1146.pdf>`_.
<https://anthology.aclweb.org/P/P14/P14-1146.pdf>`_.


:param model_kind: Pre-trained model used to create the vocabulary.
2 changes: 1 addition & 1 deletion src/python/docs/sphinx/ci_script/_static/mystyle.css
@@ -8432,7 +8432,7 @@ label {
padding: 0px;
}
/* Flexible box model classes */
/* Taken from Alex Russell http://infrequently.org/2009/08/css-3-progress/ */
/* Taken from Alex Russell https://infrequently.org/2009/08/css-3-progress/ */
/* This file is a compatability layer. It allows the usage of flexible box
model layouts accross multiple browsers, including older browsers. The newest,
universal implementation of the flexible box model is used when available (see
4 changes: 2 additions & 2 deletions src/python/docs/sphinx/ci_script/conf.py
@@ -128,8 +128,8 @@
'relative': True,
'reference_url': {
'nimbusml': None,
'matplotlib': 'http://matplotlib.org',
'numpy': 'http://www.numpy.org/',
'matplotlib': 'https://matplotlib.org',
'numpy': 'https://www.numpy.org/',
'scipy': 'https://www.scipy.org/'},
}

4 changes: 2 additions & 2 deletions src/python/docs/sphinx/conf.py
@@ -145,8 +145,8 @@ def install_and_import(package):
'relative': True,
'reference_url': {
'nimbusml': None,
'matplotlib': 'http://matplotlib.org',
'numpy': 'http://www.numpy.org/',
'matplotlib': 'https://matplotlib.org',
'numpy': 'https://www.numpy.org/',
'scipy': 'https://www.scipy.org/'},
}

@@ -44,7 +44,7 @@ class FactorizationMachineBinaryClassifier(
`Field Aware Factorization Machines
<https://www.csie.ntu.edu.tw/~r01922136/slides/ffm.pdf>`_,
`Field-aware Factorization Machines for CTR Prediction
<http://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf>`_,
<https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf>`_,
`Adaptive Subgradient Methods for Online Learning and Stochastic
Optimization
<http://jmlr.org/papers/volume12/duchi11a/duchi11a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/nimbusml/decomposition/pcaanomalydetector.py
@@ -57,7 +57,7 @@ class PcaAnomalyDetector(core, BasePredictor, ClassifierMixin):

`Randomized Methods for Computing the Singular Value Decomposition
(SVD) of very large matrices
<http://web.stanford.edu/group/mmds/slides2010/Martinsson.pdf>`_
<https://web.stanford.edu/group/mmds/slides2010/Martinsson.pdf>`_
`A randomized algorithm for principal component analysis
<https://arxiv.org/abs/0809.2274>`_,
`Finding Structure with Randomness: Probabilistic Algorithms for
2 changes: 1 addition & 1 deletion src/python/nimbusml/ensemble/fastforestbinaryclassifier.py
@@ -55,7 +55,7 @@ class FastForestBinaryClassifier(
**Reference**

`Wikipedia: Random forest
<http://en.wikipedia.org/wiki/Random_forest>`_
<https://en.wikipedia.org/wiki/Random_forest>`_

`Quantile regression forest
<http://jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/nimbusml/ensemble/fastforestregressor.py
@@ -64,7 +64,7 @@ class FastForestRegressor(core, BasePredictor, RegressorMixin):
**Reference**

`Wikipedia: Random forest
<http://en.wikipedia.org/wiki/Random_forest>`_
<https://en.wikipedia.org/wiki/Random_forest>`_

`Quantile regression forest
<http://jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdf>`_
2 changes: 1 addition & 1 deletion src/python/nimbusml/ensemble/fasttreesbinaryclassifier.py
@@ -81,7 +81,7 @@ class FastTreesBinaryClassifier(
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param feature: see `Columns </nimbusml/concepts/columns>`_.

2 changes: 1 addition & 1 deletion src/python/nimbusml/ensemble/fasttreesregressor.py
@@ -83,7 +83,7 @@ class FastTreesRegressor(core, BasePredictor, RegressorMixin):
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param feature: see `Columns </nimbusml/concepts/columns>`_.

2 changes: 1 addition & 1 deletion src/python/nimbusml/ensemble/fasttreestweedieregressor.py
@@ -38,7 +38,7 @@ class FastTreesTweedieRegressor(
<https://en.wikipedia.org/wiki/Gradient_boosting#Gradient_tree_boosting>`_

`Greedy function approximation: A gradient boosting machine.
<http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_
<https://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.aos/1013203451>`_

:param feature: see `Columns </nimbusml/concepts/columns>`_.

4 changes: 2 additions & 2 deletions src/python/nimbusml/ensemble/gambinaryclassifier.py
@@ -42,7 +42,7 @@ class GamBinaryClassifier(core, BasePredictor, ClassifierMixin):
functions learned will step between the discretization boundaries.

This implementation is based on the this `paper
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
but diverges from it in several important respects: most
significantly,
in each round of boosting, rather than do one feature at a time, it
@@ -78,7 +78,7 @@ class GamBinaryClassifier(core, BasePredictor, ClassifierMixin):
`Generalized additive models
<https://en.wikipedia.org/wiki/Generalized_additive_model>`_,
`Intelligible Models for Classification and Regression
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_


:param feature: see `Columns </nimbusml/concepts/columns>`_.
4 changes: 2 additions & 2 deletions src/python/nimbusml/ensemble/gamregressor.py
@@ -41,7 +41,7 @@ class GamRegressor(core, BasePredictor, RegressorMixin):
functions learned will step between the discretization boundaries.

This implementation is based on the this `paper
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_,
but diverges from it in several important respects: most
significantly,
in each round of boosting, rather than do one feature at a time, it
@@ -77,7 +77,7 @@ class GamRegressor(core, BasePredictor, RegressorMixin):
`Generalized additive models
<https://en.wikipedia.org/wiki/Generalized_additive_model>`_,
`Intelligible Models for Classification and Regression
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_
<https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.352.7619>`_


:param feature: see `Columns </nimbusml/concepts/columns>`_.
2 changes: 1 addition & 1 deletion src/python/nimbusml/feature_extraction/text/lightlda.py
@@ -30,7 +30,7 @@ class LightLda(core, BaseTransform, TransformerMixin):
topical vectors. LightLDA is an extremely
efficient implementation of LDA developed in MSR-Asia that
incorporates a number of optimization techniques
`(http://arxiv.org/abs/1412.1576) <http://arxiv.org/abs/1412.1576>`_.
`(https://arxiv.org/abs/1412.1576) <https://arxiv.org/abs/1412.1576>`_.
With the LDA transform, we can
train a topic model to produce 1 million topics with 1 million
vocabulary on a 1-billion-token document set one