
Commit

…ic-learn into score-deprecation
mvargas33 committed Oct 1, 2021
2 parents 675dfcf + 8571f97 commit 2cf4075
Showing 4 changed files with 10 additions and 202 deletions.
65 changes: 0 additions & 65 deletions .travis.yml

This file was deleted.

6 changes: 3 additions & 3 deletions README.rst
@@ -1,4 +1,4 @@
-|Travis-CI Build Status| |License| |PyPI version| |Code coverage|
+|GitHub Actions Build Status| |License| |PyPI version| |Code coverage|

metric-learn: Metric Learning in Python
=======================================
@@ -67,8 +67,8 @@ Bibtex entry::

.. _sphinx documentation: http://contrib.scikit-learn.org/metric-learn/

-.. |Travis-CI Build Status| image:: https://api.travis-ci.org/scikit-learn-contrib/metric-learn.svg?branch=master
-   :target: https://travis-ci.org/scikit-learn-contrib/metric-learn
+.. |GitHub Actions Build Status| image:: https://github.com/scikit-learn-contrib/metric-learn/workflows/CI/badge.svg
+   :target: https://github.com/scikit-learn-contrib/metric-learn/actions?query=event%3Apush+branch%3Amaster
.. |License| image:: http://img.shields.io/:license-mit-blue.svg?style=flat
   :target: http://badges.mit-license.org
.. |PyPI version| image:: https://badge.fury.io/py/metric-learn.svg
132 changes: 0 additions & 132 deletions build_tools/travis/flake8_diff.sh

This file was deleted.

9 changes: 7 additions & 2 deletions test/metric_learn_test.py
@@ -79,12 +79,17 @@ def test_singular_returns_pseudo_inverse(self):
class TestSCML(object):
  @pytest.mark.parametrize('basis', ('lda', 'triplet_diffs'))
  def test_iris(self, basis):
+    """
+    SCML applied to Iris dataset should give better results when
+    computing class separation.
+    """
    X, y = load_iris(return_X_y=True)
+    before = class_separation(X, y)
    scml = SCML_Supervised(basis=basis, n_basis=85, k_genuine=7, k_impostor=5,
                           random_state=42)
    scml.fit(X, y)
-    csep = class_separation(scml.transform(X), y)
-    assert csep < 0.24
+    after = class_separation(scml.transform(X), y)
+    assert before > after + 0.03  # It's better by a margin of 0.03

  def test_big_n_features(self):
    X, y = make_classification(n_samples=100, n_classes=3, n_features=60,
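For context, the updated test no longer compares the transformed data against a fixed threshold of 0.24; it asserts that class separation improves by a margin of at least 0.03 after fitting. Below is a minimal sketch of how that before/after check could be reproduced outside the test suite. The class_separation helper shown here is a hypothetical stand-in (the real helper is defined in the test module and may differ); SCML_Supervised and its parameters are taken directly from the diff above.

# Sketch only: this class_separation is an illustrative stand-in, not the
# helper from test/metric_learn_test.py. It returns the mean ratio of
# within-class to between-class pairwise distances, so lower means
# better-separated classes.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import pairwise_distances
from metric_learn import SCML_Supervised


def class_separation(X, labels):
  classes = np.unique(labels)
  ratio = 0.0
  for c in classes:
    Xc, Xother = X[labels == c], X[labels != c]
    ratio += pairwise_distances(Xc).mean() / pairwise_distances(Xc, Xother).mean()
  return ratio / len(classes)


X, y = load_iris(return_X_y=True)
before = class_separation(X, y)

scml = SCML_Supervised(basis='lda', n_basis=85, k_genuine=7, k_impostor=5,
                       random_state=42)
scml.fit(X, y)
after = class_separation(scml.transform(X), y)

# Mirrors the new assertion: improvement by a margin of at least 0.03.
assert before > after + 0.03, (before, after)

Comparing a before/after ratio is presumably more robust to small numerical differences across library versions than the hard-coded threshold it replaces.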
