graphtools
Maintainers: 4
Versions: 27
graphtools - pypi Package Compare versions

Comparing version 1.5.2 to 1.5.3
+70
-70
graphtools.egg-info/PKG-INFO
Metadata-Version: 2.1
Name: graphtools
Version: 1.5.2
Version: 1.5.3
Summary: graphtools
Home-page: https://github.com/KrishnaswamyLab/graphtools
Download-URL: https://github.com/KrishnaswamyLab/graphtools/archive/v1.5.3.tar.gz
Author: Scott Gigante, Daniel Burkhardt, and Jay Stanley, Yale University
Author-email: scott.gigante@yale.edu
License: GNU General Public License Version 2
Download-URL: https://github.com/KrishnaswamyLab/graphtools/archive/v1.5.2.tar.gz
Description: ==========
graphtools
==========
.. image:: https://img.shields.io/pypi/v/graphtools.svg
:target: https://pypi.org/project/graphtools/
:alt: Latest PyPi version
.. image:: https://anaconda.org/conda-forge/graphtools/badges/version.svg
:target: https://anaconda.org/conda-forge/graphtools/
:alt: Latest Conda version
.. image:: https://api.travis-ci.com/KrishnaswamyLab/graphtools.svg?branch=master
:target: https://travis-ci.com/KrishnaswamyLab/graphtools
:alt: Travis CI Build
.. image:: https://img.shields.io/readthedocs/graphtools.svg
:target: https://graphtools.readthedocs.io/
:alt: Read the Docs
.. image:: https://coveralls.io/repos/github/KrishnaswamyLab/graphtools/badge.svg?branch=master
:target: https://coveralls.io/github/KrishnaswamyLab/graphtools?branch=master
:alt: Coverage Status
.. image:: https://img.shields.io/twitter/follow/KrishnaswamyLab.svg?style=social&label=Follow
:target: https://twitter.com/KrishnaswamyLab
:alt: Twitter
.. image:: https://img.shields.io/github/stars/KrishnaswamyLab/graphtools.svg?style=social&label=Stars
:target: https://github.com/KrishnaswamyLab/graphtools/
:alt: GitHub stars
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Code style: black
Tools for building and manipulating graphs in Python.
Installation
------------
graphtools is available on `pip`. Install by running the following in a terminal::
pip install --user graphtools
Alternatively, graphtools can be installed using `Conda <https://conda.io/docs/>`_ (most easily obtained via the `Miniconda Python distribution <https://conda.io/miniconda.html>`_)::
conda install -c conda-forge graphtools
Or, to install the latest version from github::
pip install --user git+git://github.com/KrishnaswamyLab/graphtools.git
Usage example
-------------
The `graphtools.Graph` class provides an all-in-one interface for k-nearest neighbors, mutual nearest neighbors, exact (pairwise distances) and landmark graphs.
Use it as follows::
from sklearn import datasets
import graphtools
digits = datasets.load_digits()
G = graphtools.Graph(digits['data'])
K = G.kernel
P = G.diff_op
G = graphtools.Graph(digits['data'], n_landmark=300)
L = G.landmark_op
Help
----
If you have any questions or require assistance using graphtools, please contact us at https://krishnaswamylab.org/get-help
Keywords: graphs,big-data,signal processing,manifold-learning
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta

@@ -96,1 +28,69 @@ Classifier: Environment :: Console

Provides-Extra: doc
License-File: LICENSE
==========
graphtools
==========
.. image:: https://img.shields.io/pypi/v/graphtools.svg
:target: https://pypi.org/project/graphtools/
:alt: Latest PyPi version
.. image:: https://anaconda.org/conda-forge/graphtools/badges/version.svg
:target: https://anaconda.org/conda-forge/graphtools/
:alt: Latest Conda version
.. image:: https://img.shields.io/github/workflow/status/KrishnaswamyLab/graphtools/Unit%20Tests/master?label=Github%20Actions
:target: https://travis-ci.com/KrishnaswamyLab/graphtools
:alt: Github Actions Build
.. image:: https://img.shields.io/readthedocs/graphtools.svg
:target: https://graphtools.readthedocs.io/
:alt: Read the Docs
.. image:: https://coveralls.io/repos/github/KrishnaswamyLab/graphtools/badge.svg?branch=master
:target: https://coveralls.io/github/KrishnaswamyLab/graphtools?branch=master
:alt: Coverage Status
.. image:: https://img.shields.io/twitter/follow/KrishnaswamyLab.svg?style=social&label=Follow
:target: https://twitter.com/KrishnaswamyLab
:alt: Twitter
.. image:: https://img.shields.io/github/stars/KrishnaswamyLab/graphtools.svg?style=social&label=Stars
:target: https://github.com/KrishnaswamyLab/graphtools/
:alt: GitHub stars
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Code style: black
Tools for building and manipulating graphs in Python.
Installation
------------
graphtools is available on `pip`. Install by running the following in a terminal::
pip install --user graphtools
Alternatively, graphtools can be installed using `Conda <https://conda.io/docs/>`_ (most easily obtained via the `Miniconda Python distribution <https://conda.io/miniconda.html>`_)::
conda install -c conda-forge graphtools
Or, to install the latest version from github::
pip install --user git+git://github.com/KrishnaswamyLab/graphtools.git
Usage example
-------------
The `graphtools.Graph` class provides an all-in-one interface for k-nearest neighbors, mutual nearest neighbors, exact (pairwise distances) and landmark graphs.
Use it as follows::
from sklearn import datasets
import graphtools
digits = datasets.load_digits()
G = graphtools.Graph(digits['data'])
K = G.kernel
P = G.diff_op
G = graphtools.Graph(digits['data'], n_landmark=300)
L = G.landmark_op
Help
----
If you have any questions or require assistance using graphtools, please contact us at https://krishnaswamylab.org/get-help

@@ -23,3 +23,2 @@ numpy>=1.14.0

anndata
anndata
black

@@ -1,2 +0,4 @@

from .api import Graph, from_igraph, read_pickle
from .api import from_igraph
from .api import Graph
from .api import read_pickle
from .version import __version__

@@ -0,10 +1,11 @@

from . import base
from . import graphs
from scipy import sparse
import numpy as np
import warnings
from scipy import sparse
import pickle
import pygsp
import tasklogger
import warnings
from . import base, graphs
_logger = tasklogger.get_tasklogger("graphtools")

@@ -39,3 +40,3 @@

initialize=True,
**kwargs
**kwargs,
):

@@ -259,3 +260,3 @@ """Create a graph built on data.

_logger.debug(msg)
_logger.log_debug(msg)

@@ -278,3 +279,3 @@ class_names = [p.__name__.replace("Graph", "") for p in parent_classes]

# build graph and return
_logger.debug(
_logger.log_debug(
"Initializing {} with arguments {}".format(

@@ -281,0 +282,0 @@ parent_classes,

@@ -1,20 +0,22 @@

from future.utils import with_metaclass
from . import matrix
from . import utils
from builtins import super
from copy import copy as shallow_copy
import numpy as np
import abc
import pygsp
from future.utils import with_metaclass
from inspect import signature
from sklearn.decomposition import PCA, TruncatedSVD
from scipy import sparse
from scipy.sparse.csgraph import shortest_path
from sklearn.decomposition import PCA
from sklearn.decomposition import TruncatedSVD
from sklearn.preprocessing import normalize
from sklearn.utils.graph import graph_shortest_path
from scipy import sparse
import warnings
import abc
import numbers
import numpy as np
import pickle
import pygsp
import sys
import tasklogger
import warnings
from . import matrix, utils
_logger = tasklogger.get_tasklogger("graphtools")

@@ -176,3 +178,3 @@

n_pca = "auto"
_logger.info(
_logger.log_info(
"Estimating n_pca from matrix rank. "

@@ -241,3 +243,3 @@ "Supply an integer n_pca "

):
with _logger.task("PCA"):
with _logger.log_task("PCA"):
n_pca = self.data.shape[1] - 1 if self.n_pca == "auto" else self.n_pca

@@ -274,3 +276,3 @@ if sparse.issparse(self.data):

)
_logger.info(
_logger.log_info(
"Using rank estimate of {} as n_pca".format(self.n_pca)

@@ -298,4 +300,3 @@ )

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
return {"n_pca": self.n_pca, "random_state": self.random_state}

@@ -476,3 +477,3 @@

initialize=True,
**kwargs
**kwargs,
):

@@ -506,6 +507,6 @@ if gamma is not None:

if initialize:
_logger.debug("Initializing kernel...")
_logger.log_debug("Initializing kernel...")
self.K
else:
_logger.debug("Not initializing kernel.")
_logger.log_debug("Not initializing kernel.")
super().__init__(**kwargs)

@@ -565,9 +566,11 @@

if self.kernel_symm == "+":
_logger.debug("Using addition symmetrization.")
_logger.log_debug("Using addition symmetrization.")
K = (K + K.T) / 2
elif self.kernel_symm == "*":
_logger.debug("Using multiplication symmetrization.")
_logger.log_debug("Using multiplication symmetrization.")
K = K.multiply(K.T)
elif self.kernel_symm == "mnn":
_logger.debug("Using mnn symmetrization (theta = {}).".format(self.theta))
_logger.log_debug(
"Using mnn symmetrization (theta = {}).".format(self.theta)
)
K = self.theta * matrix.elementwise_minimum(K, K.T) + (

@@ -577,3 +580,3 @@ 1 - self.theta

elif self.kernel_symm is None:
_logger.debug("Using no symmetrization.")
_logger.log_debug("Using no symmetrization.")
pass
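The symmetrization branches touched above can be sketched with plain numpy on a toy kernel (illustrative values only; `matrix.elementwise_minimum` and the sparse `K.multiply(K.T)` are approximated with their dense equivalents):

```python
import numpy as np

# Toy asymmetric kernel; values are made up for illustration
K = np.array([[1.0, 0.2],
              [0.6, 1.0]])
theta = 0.5

K_add = (K + K.T) / 2   # kernel_symm == "+": addition symmetrization
K_mult = K * K.T        # kernel_symm == "*": elementwise multiplication
# kernel_symm == "mnn": theta-weighted min/max (MNN symmetrization)
K_mnn = theta * np.minimum(K, K.T) + (1 - theta) * np.maximum(K, K.T)
```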

@@ -600,4 +603,3 @@ else:

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
return {

@@ -709,4 +711,3 @@ "kernel_symm": self.kernel_symm,

def diff_op(self):
"""Synonym for P
"""
"""Synonym for P"""
return self.P

@@ -732,4 +733,3 @@

def kernel(self):
"""Synonym for K
"""
"""Synonym for K"""
return self.K

@@ -864,6 +864,6 @@

distance = "data"
_logger.info("Using ambient data distances.")
_logger.log_info("Using ambient data distances.")
else:
distance = "affinity"
_logger.info("Using negative log affinity distances.")
_logger.log_info("Using negative log affinity distances.")
return distance

@@ -918,3 +918,3 @@

P = graph_shortest_path(D, method=method)
P = shortest_path(D, method=method)
# symmetrize for numerical error
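The hunk above swaps the removed `sklearn.utils.graph.graph_shortest_path` for scipy's equivalent. A minimal sketch of the drop-in call on a toy 3-node chain (not package data):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

# Weighted adjacency of a 3-node chain: 0 -- 1 -- 2, unit edge weights
A = csr_matrix(np.array([[0.0, 1.0, 0.0],
                         [1.0, 0.0, 1.0],
                         [0.0, 1.0, 0.0]]))

# shortest_path accepts the same method codes ("auto", "D", "FW", ...)
D = shortest_path(A, method="D", directed=False)
```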

@@ -1035,4 +1035,3 @@ P = (P + P.T) / 2

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
params = Data.get_params(self)

@@ -1039,0 +1038,0 @@ params.update(BaseGraph.get_params(self))

@@ -1,10 +0,13 @@

import numpy as np
import tasklogger
import pygsp
import abc
from . import api
from . import base
from . import graphs
from . import matrix
from . import utils
from functools import partial
from scipy import sparse
from . import api, graphs, base, utils, matrix
import abc
import numpy as np
import pygsp
import tasklogger

@@ -84,14 +87,14 @@

If `True` or `> 0`, print status messages
n_svd : int, optional (default: 100)
number of singular vectors to compute for landmarking
thresh : float, optional (default: 1e-4)
threshold below which to truncate kernel
kwargs : additional arguments for graphtools.Graph
Attributes
----------
graph : graphtools.Graph

@@ -207,3 +210,3 @@ """

thresh=1e-4,
**kwargs
**kwargs,
):

@@ -253,3 +256,3 @@

except ValueError as e:
_logger.debug("Reset graph due to {}".format(str(e)))
_logger.log_debug("Reset graph due to {}".format(str(e)))
self.graph = None

@@ -260,3 +263,3 @@

"""Trigger a reset of self.graph
Any downstream effects of resetting the graph should override this function

@@ -365,6 +368,6 @@ """

verbose=self.verbose,
**(self.kwargs)
**(self.kwargs),
)
if self.graph is not None:
_logger.info("Using precomputed graph and diffusion operator...")
_logger.log_info("Using precomputed graph and diffusion operator...")

@@ -392,3 +395,3 @@ def fit(self, X, **kwargs):

if precomputed is None:
_logger.info(
_logger.log_info(
"Building graph on {} samples and {} features.".format(

@@ -399,3 +402,3 @@ X.shape[0], X.shape[1]

else:
_logger.info(
_logger.log_info(
"Building graph on precomputed {} matrix with {} samples.".format(

@@ -412,3 +415,3 @@ precomputed, X.shape[0]

if self.graph is None:
with _logger.task("graph and diffusion operator"):
with _logger.log_task("graph and diffusion operator"):
self.graph = api.Graph(

@@ -428,4 +431,4 @@ X,

**(self.kwargs),
**kwargs
**kwargs,
)
return self
from __future__ import division
from . import matrix
from . import utils
from .base import DataGraph
from .base import PyGSPGraph
from builtins import super
import numpy as np
from scipy import sparse
from scipy.spatial.distance import cdist
from scipy.spatial.distance import pdist
from scipy.spatial.distance import squareform
from sklearn.cluster import MiniBatchKMeans
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import normalize
from sklearn.utils.extmath import randomized_svd
from sklearn.preprocessing import normalize
from sklearn.cluster import MiniBatchKMeans
from scipy.spatial.distance import pdist, cdist
from scipy.spatial.distance import squareform
from scipy import sparse
import numbers
import numpy as np
import tasklogger
import warnings
import tasklogger
from . import matrix, utils
from .base import DataGraph, PyGSPGraph
_logger = tasklogger.get_tasklogger("graphtools")

@@ -79,3 +83,3 @@

n_pca=None,
**kwargs
**kwargs,
):

@@ -136,4 +140,3 @@

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
params = super().get_params()

@@ -352,3 +355,3 @@ params.update(

if self.decay is None or self.thresh == 1:
with _logger.task("KNN search"):
with _logger.log_task("KNN search"):
# binary connectivity matrix

@@ -359,3 +362,3 @@ K = self.knn_tree.kneighbors_graph(

else:
with _logger.task("KNN search"):
with _logger.log_task("KNN search"):
# sparse fast alpha decay

@@ -366,3 +369,3 @@ knn_tree = self.knn_tree

self._check_duplicates(distances, indices)
with _logger.task("affinities"):
with _logger.log_task("affinities"):
if bandwidth is None:

@@ -378,3 +381,3 @@ bandwidth = distances[:, knn - 1]

update_idx = np.argwhere(np.max(distances, axis=1) < radius).reshape(-1)
_logger.debug(
_logger.log_debug(
"search_knn = {}; {} remaining".format(search_knn, len(update_idx))

@@ -408,3 +411,3 @@ )

]
_logger.debug(
_logger.log_debug(
"search_knn = {}; {} remaining".format(

@@ -422,3 +425,3 @@ search_knn, len(update_idx)

if search_knn == knn_max:
_logger.debug(
_logger.log_debug(
"knn search to knn_max ({}) on {}".format(

@@ -436,3 +439,3 @@ knn_max, len(update_idx)

else:
_logger.debug("radius search on {}".format(len(update_idx)))
_logger.log_debug("radius search on {}".format(len(update_idx)))
# give up - radius search

@@ -536,4 +539,3 @@ dist_new, ind_new = knn_tree.radius_neighbors(

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
params = super().get_params()

@@ -666,6 +668,6 @@ params.update({"n_landmark": self.n_landmark, "n_pca": self.n_pca})

"""
with _logger.task("landmark operator"):
with _logger.log_task("landmark operator"):
is_sparse = sparse.issparse(self.kernel)
# spectral clustering
with _logger.task("SVD"):
with _logger.log_task("SVD"):
_, _, VT = randomized_svd(

@@ -676,6 +678,7 @@ self.diff_aff,

)
with _logger.task("KMeans"):
with _logger.log_task("KMeans"):
kmeans = MiniBatchKMeans(
self.n_landmark,
init_size=3 * self.n_landmark,
n_init=1,
batch_size=10000,

@@ -693,3 +696,4 @@ random_state=self.random_state,

pnm = normalize(pnm, norm="l1", axis=1)
landmark_op = pmn.dot(pnm) # sparsity agnostic matrix multiplication
# sparsity agnostic matrix multiplication
landmark_op = pmn.dot(pnm)
if is_sparse:
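The `pmn.dot(pnm)` composition above builds the landmark operator from two row-stochastic transition matrices. A toy numpy sketch with hand-picked values, doing the l1 row normalization inline rather than via sklearn's `normalize`:

```python
import numpy as np

# Toy point->landmark (pnm) and landmark->point (pmn) transition counts
pnm = np.array([[1.0, 1.0],
                [0.0, 2.0]])
pmn = np.array([[3.0, 1.0],
                [1.0, 1.0]])

# l1 row normalization (what normalize(..., norm="l1", axis=1) computes)
pnm = pnm / pnm.sum(axis=1, keepdims=True)
pmn = pmn / pmn.sum(axis=1, keepdims=True)

# sparsity agnostic matrix multiplication, as in the diff; the product
# of two row-stochastic matrices is again row-stochastic
landmark_op = pmn.dot(pnm)
```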

@@ -850,3 +854,3 @@ # no need to have a sparse landmark operator

precomputed=None,
**kwargs
**kwargs,
):

@@ -903,4 +907,3 @@ if decay is None and precomputed not in ["affinity", "adjacency"]:

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
params = super().get_params()

@@ -1003,3 +1006,3 @@ params.update(

else:
with _logger.task("affinities"):
with _logger.log_task("affinities"):
if sparse.issparse(self.data_nu):

@@ -1110,3 +1113,3 @@ self.data_nu = self.data_nu.toarray()

else:
with _logger.task("affinities"):
with _logger.log_task("affinities"):
Y = self._check_extension_shape(Y)

@@ -1121,3 +1124,3 @@ pdx = cdist(Y, self.data_nu, metric=self.distance)

pdx = (pdx.T / bandwidth).T
K = np.exp(-1 * pdx ** self.decay)
K = np.exp(-1 * pdx**self.decay)
# handle nan
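The line above (only reformatted by black in this release) is the alpha-decay kernel, `K = exp(-(d / bandwidth) ** alpha)`. A toy sketch of the computation with made-up distances and bandwidths:

```python
import numpy as np

# Made-up pairwise distances and per-point bandwidths
pdx = np.array([[0.0, 1.0],
                [1.0, 0.0]])
bandwidth = np.array([2.0, 2.0])
decay = 10  # the alpha of the alpha-decay kernel

# scale each row of distances by that point's bandwidth
pdx_scaled = (pdx.T / bandwidth).T
K = np.exp(-1 * pdx_scaled**decay)
```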

@@ -1149,3 +1152,3 @@ K = np.where(np.isnan(K), 1, K)

distance = "constant"
_logger.info("Using constant distances.")
_logger.log_info("Using constant distances.")
else:

@@ -1200,3 +1203,3 @@ distance = super()._default_shortest_path_distance()

n_jobs=1,
**kwargs
**kwargs,
):

@@ -1245,4 +1248,3 @@ self.beta = beta

def get_params(self):
"""Get parameters from this object
"""
"""Get parameters from this object"""
params = super().get_params()

@@ -1320,3 +1322,3 @@ params.update(

"""
with _logger.task("subgraphs"):
with _logger.log_task("subgraphs"):
self.subgraphs = []

@@ -1327,3 +1329,3 @@ from .api import Graph

for i, idx in enumerate(self.samples):
_logger.debug(
_logger.log_debug(
"subgraph {}: sample {}, "

@@ -1353,3 +1355,3 @@ "n = {}, knn = {}".format(

with _logger.task("MNN kernel"):
with _logger.log_task("MNN kernel"):
if self.thresh > 0 or self.decay is None:

@@ -1370,3 +1372,3 @@ K = sparse.lil_matrix((self.data_nu.shape[0], self.data_nu.shape[0]))

continue
with _logger.task(
with _logger.log_task(
"kernel from sample {} to {}".format(

@@ -1373,0 +1375,0 @@ self.samples[i], self.samples[j]

@@ -0,7 +1,7 @@

from scipy import sparse
import numbers
import numpy as np
import numbers
from scipy import sparse
def if_sparse(sparse_func, dense_func, *args, **kwargs):

@@ -8,0 +8,0 @@ if sparse.issparse(args[0]):

@@ -0,5 +1,6 @@

from . import matrix
from deprecated import deprecated
import numbers
import warnings
from deprecated import deprecated
from . import matrix

@@ -6,0 +7,0 @@ try:

@@ -1,1 +0,1 @@

__version__ = "1.5.2"
__version__ = "1.5.3"
+70
-70
Metadata-Version: 2.1
Name: graphtools
Version: 1.5.2
Version: 1.5.3
Summary: graphtools
Home-page: https://github.com/KrishnaswamyLab/graphtools
Download-URL: https://github.com/KrishnaswamyLab/graphtools/archive/v1.5.3.tar.gz
Author: Scott Gigante, Daniel Burkhardt, and Jay Stanley, Yale University
Author-email: scott.gigante@yale.edu
License: GNU General Public License Version 2
Download-URL: https://github.com/KrishnaswamyLab/graphtools/archive/v1.5.2.tar.gz
Description: ==========
graphtools
==========
.. image:: https://img.shields.io/pypi/v/graphtools.svg
:target: https://pypi.org/project/graphtools/
:alt: Latest PyPi version
.. image:: https://anaconda.org/conda-forge/graphtools/badges/version.svg
:target: https://anaconda.org/conda-forge/graphtools/
:alt: Latest Conda version
.. image:: https://api.travis-ci.com/KrishnaswamyLab/graphtools.svg?branch=master
:target: https://travis-ci.com/KrishnaswamyLab/graphtools
:alt: Travis CI Build
.. image:: https://img.shields.io/readthedocs/graphtools.svg
:target: https://graphtools.readthedocs.io/
:alt: Read the Docs
.. image:: https://coveralls.io/repos/github/KrishnaswamyLab/graphtools/badge.svg?branch=master
:target: https://coveralls.io/github/KrishnaswamyLab/graphtools?branch=master
:alt: Coverage Status
.. image:: https://img.shields.io/twitter/follow/KrishnaswamyLab.svg?style=social&label=Follow
:target: https://twitter.com/KrishnaswamyLab
:alt: Twitter
.. image:: https://img.shields.io/github/stars/KrishnaswamyLab/graphtools.svg?style=social&label=Stars
:target: https://github.com/KrishnaswamyLab/graphtools/
:alt: GitHub stars
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Code style: black
Tools for building and manipulating graphs in Python.
Installation
------------
graphtools is available on `pip`. Install by running the following in a terminal::
pip install --user graphtools
Alternatively, graphtools can be installed using `Conda <https://conda.io/docs/>`_ (most easily obtained via the `Miniconda Python distribution <https://conda.io/miniconda.html>`_)::
conda install -c conda-forge graphtools
Or, to install the latest version from github::
pip install --user git+git://github.com/KrishnaswamyLab/graphtools.git
Usage example
-------------
The `graphtools.Graph` class provides an all-in-one interface for k-nearest neighbors, mutual nearest neighbors, exact (pairwise distances) and landmark graphs.
Use it as follows::
from sklearn import datasets
import graphtools
digits = datasets.load_digits()
G = graphtools.Graph(digits['data'])
K = G.kernel
P = G.diff_op
G = graphtools.Graph(digits['data'], n_landmark=300)
L = G.landmark_op
Help
----
If you have any questions or require assistance using graphtools, please contact us at https://krishnaswamylab.org/get-help
Keywords: graphs,big-data,signal processing,manifold-learning
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta

@@ -96,1 +28,69 @@ Classifier: Environment :: Console

Provides-Extra: doc
License-File: LICENSE
==========
graphtools
==========
.. image:: https://img.shields.io/pypi/v/graphtools.svg
:target: https://pypi.org/project/graphtools/
:alt: Latest PyPi version
.. image:: https://anaconda.org/conda-forge/graphtools/badges/version.svg
:target: https://anaconda.org/conda-forge/graphtools/
:alt: Latest Conda version
.. image:: https://img.shields.io/github/workflow/status/KrishnaswamyLab/graphtools/Unit%20Tests/master?label=Github%20Actions
:target: https://travis-ci.com/KrishnaswamyLab/graphtools
:alt: Github Actions Build
.. image:: https://img.shields.io/readthedocs/graphtools.svg
:target: https://graphtools.readthedocs.io/
:alt: Read the Docs
.. image:: https://coveralls.io/repos/github/KrishnaswamyLab/graphtools/badge.svg?branch=master
:target: https://coveralls.io/github/KrishnaswamyLab/graphtools?branch=master
:alt: Coverage Status
.. image:: https://img.shields.io/twitter/follow/KrishnaswamyLab.svg?style=social&label=Follow
:target: https://twitter.com/KrishnaswamyLab
:alt: Twitter
.. image:: https://img.shields.io/github/stars/KrishnaswamyLab/graphtools.svg?style=social&label=Stars
:target: https://github.com/KrishnaswamyLab/graphtools/
:alt: GitHub stars
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Code style: black
Tools for building and manipulating graphs in Python.
Installation
------------
graphtools is available on `pip`. Install by running the following in a terminal::
pip install --user graphtools
Alternatively, graphtools can be installed using `Conda <https://conda.io/docs/>`_ (most easily obtained via the `Miniconda Python distribution <https://conda.io/miniconda.html>`_)::
conda install -c conda-forge graphtools
Or, to install the latest version from github::
pip install --user git+git://github.com/KrishnaswamyLab/graphtools.git
Usage example
-------------
The `graphtools.Graph` class provides an all-in-one interface for k-nearest neighbors, mutual nearest neighbors, exact (pairwise distances) and landmark graphs.
Use it as follows::
from sklearn import datasets
import graphtools
digits = datasets.load_digits()
G = graphtools.Graph(digits['data'])
K = G.kernel
P = G.diff_op
G = graphtools.Graph(digits['data'], n_landmark=300)
L = G.landmark_op
Help
----
If you have any questions or require assistance using graphtools, please contact us at https://krishnaswamylab.org/get-help

@@ -11,5 +11,5 @@ ==========

:alt: Latest Conda version
.. image:: https://api.travis-ci.com/KrishnaswamyLab/graphtools.svg?branch=master
.. image:: https://img.shields.io/github/workflow/status/KrishnaswamyLab/graphtools/Unit%20Tests/master?label=Github%20Actions
:target: https://travis-ci.com/KrishnaswamyLab/graphtools
:alt: Travis CI Build
:alt: Github Actions Build
.. image:: https://img.shields.io/readthedocs/graphtools.svg

@@ -16,0 +16,0 @@ :target: https://graphtools.readthedocs.io/

[metadata]
license-file = LICENSE
[flake8]
ignore =
D100, D104,
E203,
H306
per-file-ignores =
__init__.py: F401
openproblems/tasks/*/*/*.py: D103, E203
openproblems/tasks/*/*/__init__.py: F401, D103
max-line-length = 88
exclude =
.git,
__pycache__,
build,
dist,
Snakefile
[isort]
profile = black
force_single_line = true
force_alphabetical_sort = true
[egg_info]

@@ -5,0 +27,0 @@ tag_build =

@@ -0,4 +1,5 @@

from setuptools import setup
import os
import sys
from setuptools import setup

@@ -47,3 +48,5 @@ install_requires = [

author_email="scott.gigante@yale.edu",
packages=["graphtools",],
packages=[
"graphtools",
],
license="GNU General Public License Version 2",

@@ -58,3 +61,8 @@ install_requires=install_requires,

),
keywords=["graphs", "big-data", "signal processing", "manifold-learning",],
keywords=[
"graphs",
"big-data",
"signal processing",
"manifold-learning",
],
classifiers=[

@@ -61,0 +69,0 @@ "Development Status :: 4 - Beta",

from __future__ import print_function
from load_tests import data, build_graph, assert_raises_message, assert_warns_message
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import data
import graphtools
import igraph
import numpy as np
import graphtools
import tempfile
import os
import pickle
import tempfile

@@ -12,0 +15,0 @@

from __future__ import print_function
from load_tests import (
np,
sp,
pd,
graphtools,
nose2,
data,
build_graph,
squareform,
pdist,
)
from load_tests import assert_raises_message, assert_warns_message
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import data
from load_tests import graphtools
from load_tests import nose2
from load_tests import np
from load_tests import pd
from load_tests import pdist
from load_tests import sp
from load_tests import squareform
from nose.tools import assert_raises_regex

@@ -222,3 +222,3 @@

return
G = build_graph(anndata.AnnData(data))
G = build_graph(anndata.AnnData(data, dtype=data.dtype))
assert isinstance(G, graphtools.base.BaseGraph)

@@ -234,3 +234,3 @@ assert isinstance(G.data, np.ndarray)

return
G = build_graph(anndata.AnnData(sp.csr_matrix(data)))
G = build_graph(anndata.AnnData(sp.csr_matrix(data), dtype=data.dtype))
assert isinstance(G, graphtools.base.BaseGraph)

@@ -237,0 +237,0 @@ assert isinstance(G.data, sp.csr_matrix)

@@ -0,10 +1,12 @@

from load_tests import assert_raises_message
from load_tests import data
from parameterized import parameterized
from scipy import sparse
import anndata
import graphtools
import graphtools.estimator
import numpy as np
import pygsp
import anndata
import warnings
import numpy as np
from load_tests import data, assert_raises_message
from scipy import sparse
from parameterized import parameterized

@@ -100,3 +102,3 @@

E2 = Estimator(verbose=0)
E2.fit(anndata.AnnData(X))
E2.fit(anndata.AnnData(X, dtype=X.dtype))
np.testing.assert_allclose(

@@ -103,0 +105,0 @@ E.graph.K.toarray(), E2.graph.K.toarray(), rtol=1e-6, atol=2e-7

from __future__ import print_function
from sklearn.utils.graph import graph_shortest_path
from load_tests import (
graphtools,
np,
sp,
pygsp,
nose2,
data,
build_graph,
squareform,
pdist,
PCA,
TruncatedSVD,
assert_raises_message,
assert_warns_message,
)
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import data
from load_tests import graphtools
from load_tests import nose2
from load_tests import np
from load_tests import PCA
from load_tests import pdist
from load_tests import pygsp
from load_tests import sp
from load_tests import squareform
from load_tests import TruncatedSVD
from nose.tools import assert_warns_regex
from scipy.sparse.csgraph import shortest_path

@@ -144,3 +143,3 @@ #####################################################

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
W = K + K.T

@@ -222,3 +221,3 @@ W = np.divide(W, 2)

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
K[K < thresh] = 0

@@ -294,3 +293,3 @@ W = K + K.T

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
K[K < thresh] = 0

@@ -366,3 +365,3 @@ W = K + K.T

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
K[K < thresh] = 0

@@ -532,3 +531,3 @@ W = K + K.T

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
K = K + K.T

@@ -598,6 +597,7 @@ K = np.divide(K, 2)

def test_shortest_path_affinity():
np.random.seed(42)
data_small = data[np.random.choice(len(data), len(data) // 4, replace=False)]
G = build_graph(data_small, knn=5, decay=15)
D = -1 * np.where(G.K != 0, np.log(np.where(G.K != 0, G.K, np.nan)), 0)
P = graph_shortest_path(D)
P = shortest_path(D)
# sklearn returns 0 if no path exists

@@ -607,7 +607,10 @@ P[np.where(P == 0)] = np.inf

np.fill_diagonal(P, 0)
np.testing.assert_allclose(P, G.shortest_path(distance="affinity"))
np.testing.assert_allclose(P, G.shortest_path())
np.testing.assert_allclose(
P, G.shortest_path(distance="affinity"), atol=1e-4, rtol=1e-3
)
np.testing.assert_allclose(P, G.shortest_path(), atol=1e-4, rtol=1e-3)
def test_shortest_path_affinity_precomputed():
np.random.seed(42)
data_small = data[np.random.choice(len(data), len(data) // 4, replace=False)]

@@ -617,3 +620,3 @@ G = build_graph(data_small, knn=5, decay=15)

D = -1 * np.where(G.K != 0, np.log(np.where(G.K != 0, G.K, np.nan)), 0)
P = graph_shortest_path(D)
P = shortest_path(D)
# sklearn returns 0 if no path exists

@@ -623,4 +626,6 @@ P[np.where(P == 0)] = np.inf

np.fill_diagonal(P, 0)
np.testing.assert_allclose(P, G.shortest_path(distance="affinity"))
np.testing.assert_allclose(P, G.shortest_path())
np.testing.assert_allclose(
P, G.shortest_path(distance="affinity"), atol=1e-4, rtol=1e-3
)
np.testing.assert_allclose(P, G.shortest_path(), atol=1e-4, rtol=1e-3)

@@ -627,0 +632,0 @@

@@ -1,20 +0,23 @@

from __future__ import print_function, division
from sklearn.utils.graph import graph_shortest_path
from scipy.spatial.distance import pdist, squareform
from load_tests import assert_raises_message, assert_warns_message
from nose.tools import assert_raises_regex, assert_warns_regex
from __future__ import division
from __future__ import print_function
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import data
from load_tests import datasets
from load_tests import graphtools
from load_tests import np
from load_tests import PCA
from load_tests import pygsp
from load_tests import sp
from load_tests import TruncatedSVD
from nose.tools import assert_raises_regex
from nose.tools import assert_warns_regex
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import pdist
from scipy.spatial.distance import squareform
import warnings
from load_tests import (
graphtools,
np,
sp,
pygsp,
data,
datasets,
build_graph,
PCA,
TruncatedSVD,
)
#####################################################

@@ -54,3 +57,3 @@ # Check parameters

):
build_graph(np.vstack([data, data[:9]]), n_pca=20, decay=10, thresh=1e-4)
build_graph(np.vstack([data, data[:9]]), n_pca=None, decay=10, thresh=1e-4)

@@ -63,3 +66,3 @@

):
build_graph(np.vstack([data, data[:21]]), n_pca=20, decay=10, thresh=1e-4)
build_graph(np.vstack([data, data[:21]]), n_pca=None, decay=10, thresh=1e-4)

@@ -161,3 +164,4 @@

G2.build_kernel_to_data(
Y=G2.data_nu, knn=data.shape[0] + 1,
Y=G2.data_nu,
knn=data.shape[0] + 1,
)

@@ -238,3 +242,3 @@

pdx = (pdx.T / epsilon).T
K = np.exp(-1 * pdx ** a)
K = np.exp(-1 * pdx**a)
K = K + K.T

@@ -285,3 +289,3 @@ W = np.divide(K, 2)

pdx_scale = (pdx.T / epsilon).T
K = np.where(pdx <= knn_max_dist[:, None], np.exp(-1 * pdx_scale ** a), 0)
K = np.where(pdx <= knn_max_dist[:, None], np.exp(-1 * pdx_scale**a), 0)
K = K + K.T

@@ -438,3 +442,3 @@ W = np.divide(K, 2)

weighted_pdx = (pdx.T / epsilon).T
K = np.exp(-1 * weighted_pdx ** a)
K = np.exp(-1 * weighted_pdx**a)
K[K < thresh] = 0

@@ -534,3 +538,3 @@ K = K + K.T

G = build_graph(data_small, knn=5, decay=None)
P = graph_shortest_path(G.K)
P = shortest_path(G.K)
# sklearn returns 0 if no path exists

@@ -547,3 +551,3 @@ P[np.where(P == 0)] = np.inf

G = graphtools.Graph(G.K, precomputed="affinity")
P = graph_shortest_path(G.K)
P = shortest_path(G.K)
# sklearn returns 0 if no path exists

@@ -561,3 +565,3 @@ P[np.where(P == 0)] = np.inf

D = squareform(pdist(G.data_nu)) * np.where(G.K.toarray() > 0, 1, 0)
P = graph_shortest_path(D)
P = shortest_path(D)
# sklearn returns 0 if no path exists

@@ -564,0 +568,0 @@ P[np.where(P == 0)] = np.inf

from __future__ import print_function
from load_tests import (
graphtools,
np,
nose2,
data,
digits,
build_graph,
generate_swiss_roll,
assert_raises_message,
assert_warns_message,
)
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import data
from load_tests import digits
from load_tests import generate_swiss_roll
from load_tests import graphtools
from load_tests import nose2
from load_tests import np
import pygsp
#####################################################

@@ -71,2 +70,3 @@ # Check parameters

def test_landmark_knn_graph():
np.random.seed(42)
n_landmark = 500

@@ -77,4 +77,7 @@ # knn graph

)
assert G.transitions.shape == (data.shape[0], n_landmark)
assert G.landmark_op.shape == (n_landmark, n_landmark)
n_landmark_out = G.landmark_op.shape[0]
assert n_landmark_out <= n_landmark
assert n_landmark_out >= n_landmark - 3
assert G.transitions.shape == (data.shape[0], n_landmark_out), G.transitions.shape
assert G.landmark_op.shape == (n_landmark_out, n_landmark_out)
assert isinstance(G, graphtools.graphs.kNNGraph)

@@ -81,0 +84,0 @@ assert isinstance(G, graphtools.graphs.LandmarkGraph)

@@ -0,9 +1,10 @@

from load_tests import assert_warns_message
from load_tests import data
from parameterized import parameterized
from scipy import sparse
import graphtools
import graphtools.matrix
import graphtools.utils
from parameterized import parameterized
from scipy import sparse
import numpy as np
import graphtools
from load_tests import data
from load_tests import assert_warns_message

@@ -10,0 +11,0 @@

from __future__ import print_function
import warnings
from load_tests import (
graphtools,
np,
pd,
pygsp,
nose2,
data,
digits,
build_graph,
generate_swiss_roll,
assert_raises_message,
assert_warns_message,
cdist,
)
from load_tests import assert_raises_message
from load_tests import assert_warns_message
from load_tests import build_graph
from load_tests import cdist
from load_tests import data
from load_tests import digits
from load_tests import generate_swiss_roll
from load_tests import graphtools
from load_tests import nose2
from load_tests import np
from load_tests import pd
from load_tests import pygsp
from scipy.linalg import norm
import warnings

@@ -423,3 +422,3 @@ #####################################################

pdxe_ij = pdx_ij / e_ij[:, np.newaxis] # normalize
k_ij = np.exp(-1 * (pdxe_ij ** a)) # apply alpha-decaying kernel
k_ij = np.exp(-1 * (pdxe_ij**a)) # apply alpha-decaying kernel
if si == sj:

@@ -426,0 +425,0 @@ K.iloc[sample_idx == si, sample_idx == sj] = (k_ij + k_ij.T) / 2

@@ -1,5 +0,6 @@

import graphtools
from load_tests import assert_raises_message
import graphtools
def test_check_in():

@@ -6,0 +7,0 @@ graphtools.utils.check_in(["hello", "world"], foo="hello")