my fork of prototorch_models

ProtoTorch Models


Pre-packaged prototype-based machine learning models using ProtoTorch and PyTorch-Lightning.

Installation

To install this plugin, simply run the following command:

pip install prototorch_models

Installing the models plugin should automatically install a suitable version of ProtoTorch. The plugin should then be available for use in your Python environment as prototorch.models.

Available models

LVQ Family

  • Learning Vector Quantization 1 (LVQ1)
  • Generalized Learning Vector Quantization (GLVQ)
  • Generalized Relevance Learning Vector Quantization (GRLVQ)
  • Generalized Matrix Learning Vector Quantization (GMLVQ)
  • Limited-Rank Matrix Learning Vector Quantization (LiRaMLVQ)
  • Localized and Generalized Matrix Learning Vector Quantization (LGMLVQ)
  • Learning Vector Quantization Multi-Layer Network (LVQMLN)
  • Siamese GLVQ
  • Cross-Entropy Learning Vector Quantization (CELVQ)
  • Soft Learning Vector Quantization (SLVQ)
  • Robust Soft Learning Vector Quantization (RSLVQ)
  • Probabilistic Learning Vector Quantization (PLVQ)
  • Median-LVQ
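All members of this family share the same core mechanism: labeled prototypes are attracted toward samples of their own class and (in most variants) repelled from samples of other classes. The classic LVQ1 update can be sketched in a few lines of plain Python — this is an illustration of the idea only, not the library's implementation:

```python
def lvq1_step(prototypes, labels, x, y, lr=0.1):
    """One LVQ1 update: attract or repel the nearest prototype.

    prototypes: list of feature vectors, labels: their class labels,
    x: one training sample, y: its class label.
    """
    # Find the prototype closest to x (squared Euclidean distance).
    dists = [sum((p_i - x_i) ** 2 for p_i, x_i in zip(p, x)) for p in prototypes]
    w = min(range(len(prototypes)), key=dists.__getitem__)
    # Move the winner toward x if the labels match, away from x otherwise.
    sign = 1.0 if labels[w] == y else -1.0
    prototypes[w] = [p_i + sign * lr * (x_i - p_i)
                     for p_i, x_i in zip(prototypes[w], x)]
    return w

# Two prototypes for classes 0 and 1; a class-0 sample pulls prototype 0 closer.
protos = [[0.0, 0.0], [4.0, 4.0]]
winner = lvq1_step(protos, [0, 1], x=[1.0, 1.0], y=0, lr=0.5)
```

The generalized variants (GLVQ and its relatives) replace this heuristic update with the gradient of a differentiable cost function, which is what makes them trainable with PyTorch-Lightning.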

Other

  • k-Nearest Neighbors (KNN)
  • Neural Gas (NG)
  • Growing Neural Gas (GNG)
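For orientation, the decision rule behind the KNN model is simple enough to sketch in plain Python — again an illustration of the idea, not the library's implementation:

```python
from collections import Counter

def knn_predict(X, y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Rank training points by squared Euclidean distance to x.
    nearest = sorted(range(len(X)),
                     key=lambda i: sum((a - b) ** 2 for a, b in zip(X[i], x)))
    # Majority vote over the labels of the k closest points.
    votes = Counter(y[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

X = [[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]]
y = [0, 0, 1, 1]
pred = knn_predict(X, y, x=[0.2, 0.1], k=3)
```

Neural Gas and Growing Neural Gas differ in that they have no class labels: their prototypes adapt by distance rank to model the data distribution itself.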

Work in Progress

  • Classification-By-Components Network (CBC)
  • Learning Vector Quantization 2.1 (LVQ2.1)
  • Self-Organizing-Map (SOM)

Planned models

  • Generalized Tangent Learning Vector Quantization (GTLVQ)
  • Self-Incremental Learning Vector Quantization (SILVQ)

Development setup

It is recommended that you use a virtual environment for development. If you do not use conda, the easiest way to work with virtual environments is by using virtualenvwrapper. Once you've installed it with pip install virtualenvwrapper, you can do the following:

export WORKON_HOME=~/pyenvs
mkdir -p $WORKON_HOME
source /usr/local/bin/virtualenvwrapper.sh  # location may vary
mkvirtualenv pt

Once you have a virtual environment set up, you can install the models plugin with:

workon pt
git clone git@github.com:si-cim/prototorch_models.git
cd prototorch_models
git checkout dev
pip install -e ".[all]"  # quote ".[all]" if you are using zsh or macOS

To assist in the development process, you may also find it useful to install yapf, isort, and autoflake; all three can be installed with pip. Also, please avoid installing TensorFlow in this environment, as it is known to cause problems with PyTorch-Lightning.

Contribution

This repository contains definitions for Git hooks. pre-commit is installed automatically as a development dependency of prototorch, or you can install it manually with pip install pre-commit.

Please install the hooks before creating your first commit:

pre-commit install
pre-commit install --hook-type commit-msg

The commit will fail if the commit message does not follow the project's commit-message specification; the existing history uses Conventional-Commits-style prefixes such as fix:, build:, and ci:.

FAQ

How do I update the plugin?

If you have already cloned and installed prototorch and the prototorch_models plugin with the -e flag via pip, all you have to do is navigate to those folders in your terminal and run git pull to update.