🎉 first commit

This commit is contained in:
yoshoku 2017-09-30 23:17:24 +09:00
commit 65a09d121f
40 changed files with 1327 additions and 0 deletions

12
.gitignore vendored Normal file
View File

@ -0,0 +1,12 @@
/.bundle/
/.yardoc
/Gemfile.lock
/_yardoc/
/coverage/
/doc/
/pkg/
/spec/reports/
/tmp/
# rspec failure tracking
.rspec_status

2
.rspec Normal file
View File

@ -0,0 +1,2 @@
--format documentation
--color

17
.rubocop.yml Normal file
View File

@ -0,0 +1,17 @@
#AllCops:
# TargetRubyVersion: 2.3
Documentation:
Enabled: false
Metrics/LineLength:
Max: 120
Metrics/ModuleLength:
Max: 200
Metrics/ClassLength:
Max: 200
Security/MarshalLoad:
Enabled: false

5
.travis.yml Normal file
View File

@ -0,0 +1,5 @@
sudo: false
language: ruby
rvm:
- 2.4.2
before_install: gem install bundler -v 1.15.4

74
CODE_OF_CONDUCT.md Normal file
View File

@ -0,0 +1,74 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at yoshoku@outlook.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

6
Gemfile Normal file
View File

@ -0,0 +1,6 @@
source "https://rubygems.org"
git_source(:github) {|repo_name| "https://github.com/#{repo_name}" }
# Specify your gem's dependencies in svmkit.gemspec
gemspec

8
HISTORY.md Normal file
View File

@ -0,0 +1,8 @@
# 0.1.0
- Added basic classes.
- Added a utility module.
- Added class for RBF kernel approximation.
- Added class for Support Vector Machine with the Pegasos algorithm.
- Added class that performs multiclass classification with the one-vs-rest strategy.
- Added classes for preprocessing such as min-max scaling, standardization, and L2 normalization.

23
LICENSE.txt Normal file
View File

@ -0,0 +1,23 @@
Copyright (c) 2017 yoshoku
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

84
README.md Normal file
View File

@ -0,0 +1,84 @@
# SVMKit
SVMKit is a library for machine learning in Ruby.
SVMKit implements machine learning algorithms with an interface similar to Scikit-Learn in Python.
However, since SVMKit is an experimental library, only a few machine learning algorithms are implemented so far.
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'svmkit'
```
And then execute:
$ bundle
Or install it yourself as:
$ gem install svmkit
## Usage
Training phase:
```ruby
require 'svmkit'
require 'libsvmloader'
samples, labels = LibSVMLoader.load_libsvm_file('pendigits', stype: :dense)
normalizer = SVMKit::Preprocessing::MinMaxScaler.new
normalized = normalizer.fit_transform(samples)
transformer = SVMKit::KernelApproximation::RBF.new(gamma: 2.0, n_components: 1024, random_seed: 1)
transformed = transformer.fit_transform(normalized)
base_classifier =
SVMKit::LinearModel::PegasosSVC.new(reg_param: 1.0, max_iter: 50, batch_size: 20, random_seed: 1)
classifier = SVMKit::Multiclass::OneVsRestClassifier.new(estimator: base_classifier)
classifier.fit(transformed, labels)
File.open('trained_normalizer.dat', 'wb') { |f| f.write(Marshal.dump(normalizer)) }
File.open('trained_transformer.dat', 'wb') { |f| f.write(Marshal.dump(transformer)) }
File.open('trained_classifier.dat', 'wb') { |f| f.write(Marshal.dump(classifier)) }
```
Testing phase:
```ruby
require 'svmkit'
require 'libsvmloader'
samples, labels = LibSVMLoader.load_libsvm_file('pendigits.t', stype: :dense)
normalizer = Marshal.load(File.binread('trained_normalizer.dat'))
transformer = Marshal.load(File.binread('trained_transformer.dat'))
classifier = Marshal.load(File.binread('trained_classifier.dat'))
normalized = normalizer.transform(samples)
transformed = transformer.transform(normalized)
puts(sprintf("Accuracy: %.1f%%", 100.0 * classifier.score(transformed, labels)))
```
## Development
After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
## Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/yoshoku/svmkit.
This project is intended to be a safe, welcoming space for collaboration,
and contributors are expected to adhere to the [Contributor Covenant](http://contributor-covenant.org) code of conduct.
## License
The gem is available as open source under the terms of the [BSD 2-clause License](https://opensource.org/licenses/BSD-2-Clause).
## Code of Conduct
Everyone interacting in the SVMKit project's codebases, issue trackers,
chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/yoshoku/svmkit/blob/master/CODE_OF_CONDUCT.md).

6
Rakefile Normal file
View File

@ -0,0 +1,6 @@
require "bundler/gem_tasks"
require "rspec/core/rake_task"
RSpec::Core::RakeTask.new(:spec)
task :default => :spec

14
bin/console Executable file
View File

@ -0,0 +1,14 @@
#!/usr/bin/env ruby
require "bundler/setup"
require "svmkit"
# You can add fixtures and/or initialization code here to make experimenting
# with your gem easier. You can also use a different console, if you like.
# (If you use this, don't forget to add pry to your Gemfile!)
# require "pry"
# Pry.start
require "irb"
IRB.start(__FILE__)

8
bin/setup Executable file
View File

@ -0,0 +1,8 @@
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
set -vx
bundle install
# Do any other automated setup that you need to do here

16
lib/svmkit.rb Normal file
View File

@ -0,0 +1,16 @@
begin
require 'nmatrix/nmatrix'
rescue LoadError
end
require 'svmkit/version'
require 'svmkit/utils'
require 'svmkit/base/base_estimator'
require 'svmkit/base/classifier'
require 'svmkit/base/transformer'
require 'svmkit/kernel_approximation/rbf'
require 'svmkit/linear_model/pegasos_svc'
require 'svmkit/multiclass/one_vs_rest_classifier'
require 'svmkit/preprocessing/l2_normalizer'
require 'svmkit/preprocessing/min_max_scaler'
require 'svmkit/preprocessing/standard_scaler'

View File

@ -0,0 +1,11 @@
module SVMKit
# This module consists of basic mix-in modules.
module Base
# Base module for all estimators in SVMKit.
module BaseEstimator
# Parameters for this estimator.
attr_accessor :params
end
end
end

View File

@ -0,0 +1,22 @@
module SVMKit
module Base
# Module for all classifiers in SVMKit.
module Classifier
# An abstract method for fitting a model.
def fit
raise NotImplementedError, "#{__method__} has to be implemented in #{self.class}."
end
# An abstract method for predicting labels.
def predict
raise NotImplementedError, "#{__method__} has to be implemented in #{self.class}."
end
# An abstract method for calculating classification accuracy.
def score
raise NotImplementedError, "#{__method__} has to be implemented in #{self.class}."
end
end
end
end

View File

@ -0,0 +1,17 @@
module SVMKit
module Base
# Module for all transformers in SVMKit.
module Transformer
# An abstract method for fitting a model.
def fit
raise NotImplementedError, "#{__method__} has to be implemented in #{self.class}."
end
# An abstract method for fitting a model and transforming given data.
def fit_transform
raise NotImplementedError, "#{__method__} has to be implemented in #{self.class}."
end
end
end
end

View File

@ -0,0 +1,133 @@
require 'svmkit/base/base_estimator'
require 'svmkit/base/transformer'
module SVMKit
# Module for kernel approximation algorithms.
module KernelApproximation
# Class for RBF kernel feature mapping.
#
# transformer = SVMKit::KernelApproximation::RBF.new(gamma: 1.0, n_components: 128, random_seed: 1)
# new_training_samples = transformer.fit_transform(training_samples)
# new_testing_samples = transformer.transform(testing_samples)
#
# * *Reference*:
# - A. Rahimi and B. Recht, "Random Features for Large-Scale Kernel Machines," Proc. NIPS'07, pp.1177--1184, 2007.
class RBF
include Base::BaseEstimator
include Base::Transformer
DEFAULT_PARAMS = { # :nodoc:
gamma: 1.0,
n_components: 128,
random_seed: nil
}.freeze
# The random matrix for transformation.
attr_reader :random_mat # :nodoc:
# The random vector for transformation.
attr_reader :random_vec # :nodoc:
# The random generator for transformation.
attr_reader :rng # :nodoc:
# Creates a new transformer for mapping to RBF kernel feature space.
#
# call-seq:
# new(gamma: 1.0, n_components: 128, random_seed: 1) -> RBF
#
# * *Arguments* :
# - +:gamma+ (Float) (defaults to: 1.0) -- The parameter of RBF kernel: exp(-gamma * x^2)
# - +:n_components+ (Integer) (defaults to: 128) -- The number of dimensions of the RBF kernel feature space.
# - +:random_seed+ (Integer) (defaults to: nil) -- The seed value used to initialize the random generator.
def initialize(params = {})
self.params = DEFAULT_PARAMS.merge(Hash[params.map { |k, v| [k.to_sym, v] }])
self.params[:random_seed] ||= srand
@rng = Random.new(self.params[:random_seed])
@random_mat = nil
@random_vec = nil
end
# Fit the model with given training data.
#
# call-seq:
# fit(x) -> RBF
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The training data to be used for fitting the model. This method uses only the number of features of the data.
# * *Returns* :
# - The learned transformer itself.
def fit(x, _y = nil)
n_features = x.shape[1]
params[:n_components] = 2 * n_features if params[:n_components] <= 0
@random_mat = rand_normal([n_features, params[:n_components]]) * (2.0 * params[:gamma])**0.5
n_half_components = params[:n_components] / 2
@random_vec = NMatrix.zeros([1, params[:n_components] - n_half_components]).hconcat(
NMatrix.ones([1, n_half_components]) * (0.5 * Math::PI)
)
#@random_vec = rand_uniform([1, self.params[:n_components]]) * (2.0 * Math::PI)
self
end
# Fit the model with training data, and then transform them with the learned model.
#
# call-seq:
# fit_transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The training data to be used for fitting the model.
# * *Returns* :
# - The transformed data (NMatrix, shape: [n_samples, n_components]).
def fit_transform(x, _y = nil)
fit(x).transform(x)
end
# Transform the given data with the learned model.
#
# call-seq:
# transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The data to be transformed with the learned model.
# * *Returns* :
# - The transformed data (NMatrix, shape: [n_samples, n_components]).
def transform(x)
n_samples, = x.shape
projection = x.dot(@random_mat) + @random_vec.repeat(n_samples, 0)
projection.sin * ((2.0 / params[:n_components])**0.5)
end
# Serializes object through Marshal#dump.
def marshal_dump # :nodoc:
{ params: params,
random_mat: Utils.dump_nmatrix(@random_mat),
random_vec: Utils.dump_nmatrix(@random_vec),
rng: @rng }
end
# Deserialize object through Marshal#load.
def marshal_load(obj) # :nodoc:
self.params = obj[:params]
@random_mat = Utils.restore_nmatrix(obj[:random_mat])
@random_vec = Utils.restore_nmatrix(obj[:random_vec])
@rng = obj[:rng]
nil
end
protected
# Generate the uniform random matrix with the given shape.
def rand_uniform(shape) # :nodoc:
rnd_vals = Array.new(NMatrix.size(shape)) { @rng.rand }
NMatrix.new(shape, rnd_vals, dtype: :float64, stype: :dense)
end
# Generate the normal random matrix with the given shape, mean, and standard deviation.
def rand_normal(shape, mu = 0.0, sigma = 1.0) # :nodoc:
a = rand_uniform(shape)
b = rand_uniform(shape)
((a.log * -2.0).sqrt * (b * 2.0 * Math::PI).sin) * sigma + mu
end
end
end
end
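The `rand_normal` helper above draws Gaussian samples via the Box-Muller transform. A minimal plain-Ruby sketch of the same idea follows (no NMatrix; the name `box_muller` is invented for illustration):

```ruby
# Box-Muller transform: two independent uniform samples in (0, 1]
# yield one sample from a normal distribution with mean mu and
# standard deviation sigma. This mirrors RBF#rand_normal above,
# but operates on plain Floats instead of an NMatrix.
def box_muller(rng, mu = 0.0, sigma = 1.0)
  a = 1.0 - rng.rand # shift to (0, 1] so Math.log never sees zero
  b = rng.rand
  mu + sigma * Math.sqrt(-2.0 * Math.log(a)) * Math.sin(2.0 * Math::PI * b)
end

rng = Random.new(1)
samples = Array.new(100_000) { box_muller(rng) }
mean = samples.sum / samples.size
variance = samples.sum { |v| (v - mean)**2 } / samples.size
# mean should be close to 0.0 and variance close to 1.0
```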

View File

@ -0,0 +1,148 @@
require 'svmkit/base/base_estimator'
require 'svmkit/base/classifier'
module SVMKit
# This module consists of the classes that implement generalized linear models.
module LinearModel
# PegasosSVC is a class that implements Support Vector Classifier with the Pegasos algorithm.
#
# estimator =
# SVMKit::LinearModel::PegasosSVC.new(reg_param: 1.0, max_iter: 100, batch_size: 20, random_seed: 1)
# estimator.fit(training_samples, training_labels)
# results = estimator.predict(testing_samples)
#
# * *Reference*:
# - S. Shalev-Shwartz and Y. Singer, "Pegasos: Primal Estimated sub-GrAdient SOlver for SVM," Proc. ICML'07, pp. 807--814, 2007.
#
class PegasosSVC
include Base::BaseEstimator
include Base::Classifier
DEFAULT_PARAMS = { # :nodoc:
reg_param: 1.0,
max_iter: 100,
batch_size: 50,
random_seed: nil
}.freeze
# The weight vector for SVC.
attr_reader :weight_vec
# The random generator for performing random sampling in the Pegasos algorithm.
attr_reader :rng
# Create a new classifier with Support Vector Machine by the Pegasos algorithm.
#
# :call-seq:
# new(reg_param: 1.0, max_iter: 100, batch_size: 50, random_seed: 1) -> PegasosSVC
#
# * *Arguments* :
# - +:reg_param+ (Float) (defaults to: 1.0) -- The regularization parameter.
# - +:max_iter+ (Integer) (defaults to: 100) -- The maximum number of iterations.
# - +:batch_size+ (Integer) (defaults to: 50) -- The size of the mini batches.
# - +:random_seed+ (Integer) (defaults to: nil) -- The seed value used to initialize the random generator.
def initialize(params = {})
self.params = DEFAULT_PARAMS.merge(Hash[params.map { |k, v| [k.to_sym, v] }])
self.params[:random_seed] ||= srand
@weight_vec = nil
@rng = Random.new(self.params[:random_seed])
end
# Fit the model with given training data.
#
# :call-seq:
# fit(x, y) -> PegasosSVC
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The training data to be used for fitting the model.
# - +y+ (NMatrix, shape: [1, n_samples]) -- The labels to be used for fitting the model.
# * *Returns* :
# - The learned classifier itself.
def fit(x, y)
# Generate binary labels
negative_label = y.uniq.sort.shift
bin_y = y.to_flat_a.map { |l| l != negative_label ? 1 : -1 }
# Initialize some variables.
n_samples, n_features = x.shape
rand_ids = [*0..n_samples - 1].shuffle(random: @rng)
@weight_vec = NMatrix.zeros([1, n_features])
# Start optimization.
params[:max_iter].times do |t|
# random sampling
subset_ids = rand_ids.shift(params[:batch_size])
rand_ids.concat(subset_ids)
target_ids = subset_ids.map do |n|
n if @weight_vec.dot(x.row(n).transpose) * bin_y[n] < 1
end
n_subsamples = target_ids.size
next if n_subsamples.zero?
# update the weight vector.
eta = 1.0 / (params[:reg_param] * (t + 1))
mean_vec = NMatrix.zeros([1, n_features])
target_ids.each { |n| mean_vec += x.row(n) * bin_y[n] }
mean_vec *= eta / n_subsamples
@weight_vec = @weight_vec * (1.0 - eta * params[:reg_param]) + mean_vec
# scale the weight vector.
scaler = (1.0 / params[:reg_param]**0.5) / @weight_vec.norm2
@weight_vec *= [1.0, scaler].min
end
self
end
# Calculate confidence scores for samples.
#
# :call-seq:
# decision_function(x) -> NMatrix, shape: [1, n_samples]
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to compute the scores.
# * *Returns* :
# - Confidence score per sample.
def decision_function(x)
@weight_vec.dot(x.transpose)
end
# Predict class labels for samples.
#
# :call-seq:
# predict(x) -> NMatrix, shape: [1, n_samples]
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to predict the labels.
# * *Returns* :
# - Predicted class label per sample.
def predict(x)
decision_function(x).map { |v| v >= 0 ? 1 : -1 }
end
# Calculate the mean accuracy of the given testing data.
#
# :call-seq:
#   score(x, y) -> Float
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- Testing data.
# - +y+ (NMatrix, shape: [1, n_samples]) -- True labels for testing data.
# * *Returns* :
# - Mean accuracy
def score(x, y)
p = predict(x)
n_hits = (y.to_flat_a.map.with_index { |l, n| l == p[n] ? 1 : 0 }).inject(:+)
n_hits / y.size.to_f
end
# Serializes object through Marshal#dump.
def marshal_dump # :nodoc:
{ params: params, weight_vec: Utils.dump_nmatrix(@weight_vec), rng: @rng }
end
# Deserialize object through Marshal#load.
def marshal_load(obj) # :nodoc:
self.params = obj[:params]
@weight_vec = Utils.restore_nmatrix(obj[:weight_vec])
@rng = obj[:rng]
nil
end
end
end
end
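The optimization loop in `fit` condenses nicely to scalar form. Below is a plain-Ruby sketch of the Pegasos update on a hypothetical 1-D toy problem (data and names invented for illustration, no NMatrix): learning rate `eta = 1/(lambda*t)`, a subgradient step over margin violators, then projection onto the ball of radius `1/sqrt(lambda)`, matching the steps in `PegasosSVC#fit` above.

```ruby
# Pegasos on 1-D toy data: data is [feature, label] pairs, w is a
# scalar weight. Each iteration mirrors one pass of PegasosSVC#fit.
lambda_ = 0.01
w = 0.0
data = [[-2.0, -1], [-1.5, -1], [1.5, 1], [2.0, 1]]
100.times do |t|
  eta = 1.0 / (lambda_ * (t + 1))
  # margin violators: samples with w * x * y < 1
  violators = data.select { |xi, yi| w * xi * yi < 1 }
  grad = violators.empty? ? 0.0 : violators.sum { |xi, yi| xi * yi } / violators.size
  # subgradient step on the regularized hinge loss
  w = (1.0 - eta * lambda_) * w + eta * grad
  # project onto the ball of radius 1/sqrt(lambda)
  w *= [1.0, (1.0 / Math.sqrt(lambda_)) / w.abs].min unless w.zero?
end
# sign(w * x) should now agree with every label
```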

View File

@ -0,0 +1,127 @@
require 'svmkit/base/base_estimator.rb'
require 'svmkit/base/classifier.rb'
module SVMKit
# This module consists of the classes that implement multiclass classification strategies.
module Multiclass
# OneVsRestClassifier is a class that implements the one-vs-rest (OvR) strategy for multiclass classification.
#
# base_estimator =
# SVMKit::LinearModel::PegasosSVC.new(reg_param: 1.0, max_iter: 100, batch_size: 20, random_seed: 1)
# estimator = SVMKit::Multiclass::OneVsRestClassifier.new(estimator: base_estimator)
# estimator.fit(training_samples, training_labels)
# results = estimator.predict(testing_samples)
#
class OneVsRestClassifier
include Base::BaseEstimator
include Base::Classifier
DEFAULT_PARAMS = { # :nodoc:
estimator: nil
}.freeze
# The set of estimators.
attr_reader :estimators
# The class labels.
attr_reader :classes
# Create a new multiclass classifier with the one-vs-rest strategy.
#
# :call-seq:
# new(estimator: base_estimator) -> OneVsRestClassifier
#
# * *Arguments* :
# - +:estimator+ (Classifier) (defaults to: nil) -- The binary classifier for constructing a multiclass classifier.
def initialize(params = {})
self.params = DEFAULT_PARAMS.merge(Hash[params.map { |k, v| [k.to_sym, v] }])
@estimators = nil
@classes = nil
end
# Fit the model with given training data.
#
# :call-seq:
# fit(x, y) -> OneVsRestClassifier
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The training data to be used for fitting the model.
# - +y+ (NMatrix, shape: [1, n_samples]) -- The labels to be used for fitting the model.
# * *Returns* :
# - The learned classifier itself.
def fit(x, y)
@classes = y.uniq.sort
@estimators = @classes.map do |label|
bin_y = y.map { |l| l == label ? 1 : -1 }
params[:estimator].dup.fit(x, bin_y)
end
self
end
# Calculate confidence scores for samples.
#
# :call-seq:
# decision_function(x) -> NMatrix, shape: [n_samples, n_classes]
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to compute the scores.
# * *Returns* :
# - Confidence scores per sample for each class.
def decision_function(x)
n_samples, = x.shape
n_classes = @classes.size
NMatrix.new(
[n_classes, n_samples],
Array.new(n_classes) { |m| @estimators[m].decision_function(x).to_a }.flatten
).transpose
end
# Predict class labels for samples.
#
# :call-seq:
# predict(x) -> NMatrix, shape: [1, n_samples]
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to predict the labels.
# * *Returns* :
# - Predicted class label per sample.
def predict(x)
n_samples, = x.shape
decision_values = decision_function(x)
NMatrix.new([1, n_samples],
decision_values.each_row.map { |vals| @classes[vals.to_a.index(vals.to_a.max)] })
end
# Calculate the mean accuracy of the given testing data.
#
# :call-seq:
#   score(x, y) -> Float
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- Testing data.
# - +y+ (NMatrix, shape: [1, n_samples]) -- True labels for testing data.
# * *Returns* :
# - Mean accuracy
def score(x, y)
p = predict(x)
n_hits = (y.to_flat_a.map.with_index { |l, n| l == p[n] ? 1 : 0 }).inject(:+)
n_hits / y.size.to_f
end
# Serializes object through Marshal#dump.
def marshal_dump # :nodoc:
{ params: params,
classes: @classes,
estimators: @estimators.map { |e| Marshal.dump(e) } }
end
# Deserialize object through Marshal#load.
def marshal_load(obj) # :nodoc:
self.params = obj[:params]
@classes = obj[:classes]
@estimators = obj[:estimators].map { |e| Marshal.load(e) }
nil
end
end
end
end
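The prediction rule above reduces to an argmax over per-class confidence scores. A plain-Ruby sketch, with hypothetical stand-in scorers in place of trained PegasosSVC instances:

```ruby
# One-vs-rest decision rule: one binary scorer per class; predict the
# class whose scorer reports the highest confidence, just as
# OneVsRestClassifier#predict does over the decision_function matrix.
classes = %w[small medium large]
scorers = {
  'small'  => ->(x) { 0.5 - x },             # high score for small x
  'medium' => ->(x) { 1.0 - (x - 1.0).abs }, # high score near x = 1
  'large'  => ->(x) { x - 2.0 }              # high score for large x
}
predict = ->(x) { classes.max_by { |c| scorers[c].call(x) } }

predict.call(0.0) # => "small"
predict.call(1.0) # => "medium"
predict.call(3.0) # => "large"
```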

View File

@ -0,0 +1,57 @@
require 'svmkit/base/base_estimator'
require 'svmkit/base/transformer'
module SVMKit
# This module consists of the classes that perform preprocessings.
module Preprocessing
# Normalize samples to unit L2-norm.
#
# normalizer = SVMKit::Preprocessing::L2Normalizer.new
# new_samples = normalizer.fit_transform(samples)
class L2Normalizer
include Base::BaseEstimator
include Base::Transformer
# The vector consisting of the L2-norms of each sample.
attr_reader :norm_vec # :nodoc:
# Create a new normalizer for normalizing to unit L2-norm.
#
# :call-seq:
# new() -> L2Normalizer
def initialize(_params = {})
@norm_vec = nil
end
# Calculate L2 norms of each sample.
#
# :call-seq:
# fit(x) -> L2Normalizer
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate L2-norms.
# * *Returns* :
# - L2Normalizer
def fit(x, _y = nil)
n_samples, = x.shape
@norm_vec = NMatrix.new([1, n_samples],
Array.new(n_samples) { |n| x.row(n).norm2 })
self
end
# Calculate L2 norms of each sample, and then normalize samples to unit L2-norm.
#
# :call-seq:
# fit_transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate L2-norms.
# * *Returns* :
# - The normalized samples (NMatrix)
def fit_transform(x, _y = nil)
fit(x)
x / @norm_vec.transpose.repeat(x.shape[1], 1)
end
end
end
end

View File

@ -0,0 +1,99 @@
require 'svmkit/base/base_estimator'
require 'svmkit/base/transformer'
module SVMKit
# This module consists of the classes that perform preprocessings.
module Preprocessing
# Normalize samples by scaling each feature to a given range.
#
# normalizer = SVMKit::Preprocessing::MinMaxScaler.new(feature_range: [0.0, 1.0])
# new_training_samples = normalizer.fit_transform(training_samples)
# new_testing_samples = normalizer.transform(testing_samples)
class MinMaxScaler
include Base::BaseEstimator
include Base::Transformer
DEFAULT_PARAMS = { # :nodoc:
feature_range: [0.0, 1.0]
}.freeze
# The vector consisting of the minimum value of each feature.
attr_reader :min_vec # :nodoc:
# The vector consisting of the maximum value of each feature.
attr_reader :max_vec # :nodoc:
# Creates a new normalizer for scaling each feature to a given range.
#
# :call-seq:
# new(feature_range: [0.0, 1.0]) -> MinMaxScaler
#
# * *Arguments* :
# - +:feature_range+ (Array) (defaults to: [0.0, 1.0]) -- The desired range of samples.
def initialize(params = {})
@params = DEFAULT_PARAMS.merge(Hash[params.map { |k, v| [k.to_sym, v] }])
@min_vec = nil
@max_vec = nil
end
# Calculate the minimum and maximum value of each feature for scaling.
#
# :call-seq:
# fit(x) -> MinMaxScaler
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate the minimum and maximum values.
# * *Returns* :
# - MinMaxScaler
def fit(x, _y = nil)
@min_vec = x.min(0)
@max_vec = x.max(0)
self
end
# Calculate the minimum and maximum values, and then normalize samples to feature_range.
#
# :call-seq:
# fit_transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate the minimum and maximum values.
# * *Returns* :
# - The scaled samples (NMatrix)
def fit_transform(x, _y = nil)
fit(x).transform(x)
end
# Scale the given samples according to feature_range.
#
# :call-seq:
# transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to be scaled.
# * *Returns* :
# - The scaled samples (NMatrix)
def transform(x)
n_samples, = x.shape
dif_vec = @max_vec - @min_vec
nx = (x - @min_vec.repeat(n_samples, 0)) / dif_vec.repeat(n_samples, 0)
nx * (@params[:feature_range][1] - @params[:feature_range][0]) + @params[:feature_range][0]
end
# Serializes object through Marshal#dump.
def marshal_dump # :nodoc:
{ params: @params,
min_vec: Utils.dump_nmatrix(@min_vec),
max_vec: Utils.dump_nmatrix(@max_vec) }
end
# Deserialize object through Marshal#load.
def marshal_load(obj) # :nodoc:
@params = obj[:params]
@min_vec = Utils.restore_nmatrix(obj[:min_vec])
@max_vec = Utils.restore_nmatrix(obj[:max_vec])
nil
end
end
end
end
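`transform` applies, feature by feature, the standard min-max formula. A plain-Ruby sketch for a single feature column (the helper name `min_max_scale` is invented for illustration):

```ruby
# Min-max scaling for one feature column, as in MinMaxScaler#transform:
# scaled = (x - min) / (max - min) * (hi - lo) + lo
def min_max_scale(values, lo = 0.0, hi = 1.0)
  min, max = values.minmax
  values.map { |v| (v - min) / (max - min) * (hi - lo) + lo }
end

min_max_scale([2.0, 4.0, 6.0])             # => [0.0, 0.5, 1.0]
min_max_scale([2.0, 4.0, 6.0], -1.0, 1.0)  # => [-1.0, 0.0, 1.0]
```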

View File

@ -0,0 +1,87 @@
require 'svmkit/base/base_estimator'
require 'svmkit/base/transformer'
module SVMKit
# This module consists of the classes that perform preprocessings.
module Preprocessing
# Normalize samples by centering and scaling to unit variance.
#
# normalizer = SVMKit::Preprocessing::StandardScaler.new
# new_training_samples = normalizer.fit_transform(training_samples)
# new_testing_samples = normalizer.transform(testing_samples)
class StandardScaler
include Base::BaseEstimator
include Base::Transformer
# The vector consisting of the mean value of each feature.
attr_reader :mean_vec # :nodoc:
# The vector consisting of the standard deviation of each feature.
attr_reader :std_vec # :nodoc:
# Create a new normalizer for centering and scaling to unit variance.
#
# :call-seq:
# new() -> StandardScaler
def initialize(_params = {})
@mean_vec = nil
@std_vec = nil
end
# Calculate the mean value and standard deviation of each feature for scaling.
#
# :call-seq:
# fit(x) -> StandardScaler
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate the mean values and standard deviations.
# * *Returns* :
# - StandardScaler
def fit(x, _y = nil)
@mean_vec = x.mean(0)
@std_vec = x.std(0)
self
end
# Calculate the mean values and standard deviations, and then normalize samples using them.
#
# :call-seq:
# fit_transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to calculate the mean values and standard deviations.
# * *Returns* :
# - The scaled samples (NMatrix)
def fit_transform(x, _y = nil)
fit(x).transform(x)
end
# Standardize the given samples.
#
# :call-seq:
# transform(x) -> NMatrix
#
# * *Arguments* :
# - +x+ (NMatrix, shape: [n_samples, n_features]) -- The samples to be scaled.
# * *Returns* :
# - The scaled samples (NMatrix)
def transform(x)
n_samples, = x.shape
(x - @mean_vec.repeat(n_samples, 0)) / @std_vec.repeat(n_samples, 0)
end
# Serializes object through Marshal#dump.
def marshal_dump # :nodoc:
{ mean_vec: Utils.dump_nmatrix(@mean_vec),
std_vec: Utils.dump_nmatrix(@std_vec) }
end
# Deserialize object through Marshal#load.
def marshal_load(obj) # :nodoc:
@mean_vec = Utils.restore_nmatrix(obj[:mean_vec])
@std_vec = Utils.restore_nmatrix(obj[:std_vec])
nil
end
end
end
end
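Standardization per feature is centering followed by scaling to unit variance. A plain-Ruby sketch for one feature column (this uses the population standard deviation; NMatrix#std may use a different divisor, so treat this as an illustration of the idea rather than a bit-exact replica):

```ruby
# Standardization for one feature column, as in StandardScaler#transform:
# subtract the mean, then divide by the standard deviation.
def standardize(values)
  mean = values.sum / values.size
  std = Math.sqrt(values.sum { |v| (v - mean)**2 } / values.size)
  values.map { |v| (v - mean) / std }
end

scaled = standardize([2.0, 4.0, 6.0])
# the result has mean 0 and unit variance
```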

33
lib/svmkit/utils.rb Normal file
View File

@ -0,0 +1,33 @@
module SVMKit
# Module for utility methods.
module Utils
class << self
# Dump an NMatrix object to a Ruby Hash.
#
# call-seq:
# dump_nmatrix(mat) -> Hash
#
# * *Arguments* :
# - +mat+ -- An NMatrix object to be converted to a Ruby Hash.
# * *Returns* :
# - A Ruby Hash containing matrix information.
def dump_nmatrix(mat)
return nil if mat.class != NMatrix
{ shape: mat.shape, array: mat.to_flat_a, dtype: mat.dtype, stype: mat.stype }
end
# Return the results of converting the dumped data into an NMatrix object.
#
# call-seq:
# restore_nmatrix(dumped_mat) -> NMatrix
#
# * *Arguments* :
# - +dumped_mat+ -- A Ruby Hash describing an NMatrix object, created with the SVMKit::Utils.dump_nmatrix method.
# * *Returns* :
# - An NMatrix object restored from the given Hash.
def restore_nmatrix(dmp = {})
return nil unless dmp.class == Hash && %i[shape array dtype stype].all?(&dmp.method(:has_key?))
NMatrix.new(dmp[:shape], dmp[:array], dtype: dmp[:dtype], stype: dmp[:stype])
end
end
end
end
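`dump_nmatrix` and `restore_nmatrix` exist to support the `marshal_dump`/`marshal_load` hooks used throughout the library. A minimal sketch of that pattern, with a hypothetical plain-Ruby class standing in for an estimator that holds NMatrix members:

```ruby
# Custom Marshal hooks: marshal_dump returns a plain-data Hash and
# marshal_load rebuilds the object from it -- the same round-trip
# SVMKit's estimators perform via Utils.dump_nmatrix/restore_nmatrix.
class TinyModel
  attr_reader :weights, :bias

  def initialize(weights, bias)
    @weights = weights
    @bias = bias
  end

  def marshal_dump
    { weights: @weights, bias: @bias }
  end

  def marshal_load(obj)
    @weights = obj[:weights]
    @bias = obj[:bias]
    nil
  end
end

copied = Marshal.load(Marshal.dump(TinyModel.new([0.5, -1.5], 0.25)))
# copied.weights == [0.5, -1.5] and copied.bias == 0.25
```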

3
lib/svmkit/version.rb Normal file
View File

@ -0,0 +1,3 @@
module SVMKit
VERSION = '0.1.0'.freeze
end

View File

@ -0,0 +1,48 @@
require 'spec_helper'
RSpec.describe SVMKit::KernelApproximation::RBF do
let(:n_samples) { 10 }
let(:n_features) { 4 }
let(:samples) do
rng = Random.new(1)
rnd_vals = Array.new(n_samples * n_features) { rng.rand }
NMatrix.new([n_samples, n_features], rnd_vals, dtype: :float64, stype: :dense)
end
it 'has a small approximation error for the RBF kernel function.' do
# calculate RBF kernel matrix.
kernel_matrix = NMatrix.zeros([n_samples, n_samples])
n_samples.times do |m|
n_samples.times do |n|
distance = (samples.row(m) - samples.row(n)).norm2
kernel_matrix[m, n] = Math.exp(-distance**2)
end
end
# calculate approximate RBF kernel matrix.
transformer = described_class.new(gamma: 1.0, n_components: 4096, random_seed: 1)
new_samples = transformer.fit_transform(samples)
inner_matrix = new_samples.dot(new_samples.transpose)
# evaluate mean absolute error.
mean_error = 0.0
n_samples.times do |m|
n_samples.times do |n|
mean_error += (kernel_matrix[m, n] - inner_matrix[m, n]).abs
end
end
mean_error /= n_samples * n_samples
expect(mean_error).to be < 0.01
end
it 'dumps and restores itself using Marshal module.' do
transformer = described_class.new(gamma: 1.0, n_components: 128, random_seed: 1)
transformer.fit(samples)
copied = Marshal.load(Marshal.dump(transformer))
expect(transformer.class).to eq(copied.class)
expect(transformer.params[:gamma]).to eq(copied.params[:gamma])
expect(transformer.params[:n_components]).to eq(copied.params[:n_components])
expect(transformer.params[:random_seed]).to eq(copied.params[:random_seed])
expect(transformer.random_mat).to eq(copied.random_mat)
expect(transformer.random_vec).to eq(copied.random_vec)
expect(transformer.rng).to eq(copied.rng)
end
end
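The spec above exercises the random Fourier features idea behind RBF kernel approximation (Rahimi and Recht): with w drawn from N(0, 2γI) and b from U(0, 2π), the expected inner product of sqrt(2/D)·cos(wᵀx + b) features equals exp(-γ‖x − y‖²). A pure-Ruby sketch of that idea, with illustrative names and no NMatrix dependency:

```ruby
# Random Fourier features approximating the RBF kernel
# k(x, y) = exp(-gamma * ||x - y||^2).
def gaussian(rng)
  # Box-Muller transform: two uniform samples -> one standard normal sample.
  Math.sqrt(-2.0 * Math.log(1.0 - rng.rand)) * Math.cos(2.0 * Math::PI * rng.rand)
end

# Draw the random projection (w ~ N(0, 2*gamma*I)) and offsets (b ~ U(0, 2*pi)).
def rff_params(gamma, dim, n_components, rng)
  w = Array.new(n_components) { Array.new(dim) { Math.sqrt(2.0 * gamma) * gaussian(rng) } }
  b = Array.new(n_components) { 2.0 * Math::PI * rng.rand }
  [w, b]
end

# Map a sample into the feature space; inner products there approximate k.
def rff_map(x, w, b)
  scale = Math.sqrt(2.0 / w.size)
  w.each_index.map { |i| scale * Math.cos(w[i].zip(x).sum { |wi, xi| wi * xi } + b[i]) }
end

rng = Random.new(1)
x = [0.3, 0.7]
y = [0.5, 0.1]
w, b = rff_params(1.0, 2, 4096, rng)
zx = rff_map(x, w, b)
zy = rff_map(y, w, b)
exact  = Math.exp(-((x[0] - y[0])**2 + (x[1] - y[1])**2))
approx = zx.zip(zy).sum { |a, c| a * c }
# approx converges to exact at rate O(1 / sqrt(n_components))
```

This is why the spec's mean error shrinks as `n_components` grows; 4096 components keeps it comfortably under the 0.01 threshold.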


@ -0,0 +1,25 @@
require 'spec_helper'
RSpec.describe SVMKit::LinearModel::PegasosSVC do
let(:samples) { SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_samples.dat'))) }
let(:labels) { SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_labels.dat'))) }
let(:estimator) { described_class.new(penalty: 1.0, max_iter: 100, batch_size: 20, random_seed: 1) }
it 'classifies two clusters.' do
estimator.fit(samples, labels)
score = estimator.score(samples, labels)
expect(score).to eq(1.0)
end
it 'dumps and restores itself using Marshal module.' do
estimator.fit(samples, labels)
copied = Marshal.load(Marshal.dump(estimator))
expect(estimator.class).to eq(copied.class)
expect(estimator.params[:reg_param]).to eq(copied.params[:reg_param])
expect(estimator.params[:max_iter]).to eq(copied.params[:max_iter])
expect(estimator.params[:batch_size]).to eq(copied.params[:batch_size])
expect(estimator.params[:random_seed]).to eq(copied.params[:random_seed])
expect(estimator.weight_vec).to eq(copied.weight_vec)
expect(estimator.rng).to eq(copied.rng)
end
end
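PegasosSVC presumably follows the Pegasos solver: at step t with learning rate η = 1/(λt), shrink w by (1 − ηλ) and add η·y·x when the sampled example violates the margin y·wᵀx < 1. A toy single-sample sketch on a linearly separable set (no bias term, no projection step; all names illustrative):

```ruby
# Pegasos sub-gradient training for a linear SVM (labels must be +1/-1).
def pegasos_train(samples, labels, reg_param: 0.01, max_iter: 200, seed: 1)
  rng = Random.new(seed)
  w = Array.new(samples.first.size, 0.0)
  1.upto(max_iter) do |t|
    eta = 1.0 / (reg_param * t) # step size decays over iterations
    i = rng.rand(samples.size)
    x, y = samples[i], labels[i]
    margin = y * w.zip(x).sum { |wi, xi| wi * xi }
    w = w.map { |wi| (1.0 - eta * reg_param) * wi }
    # Move toward y * x only when the sampled point violates the margin.
    w = w.zip(x).map { |wi, xi| wi + eta * y * xi } if margin < 1.0
  end
  w
end

def svm_predict(w, x)
  w.zip(x).sum { |wi, xi| wi * xi } >= 0 ? 1 : -1
end

samples = [[2.0, 1.0], [1.0, 2.0], [3.0, 2.0], [-2.0, -1.0], [-1.0, -2.0], [-3.0, -2.0]]
labels  = [1, 1, 1, -1, -1, -1]
w = pegasos_train(samples, labels)
predictions = samples.map { |x| svm_predict(w, x) }
```

The real implementation works on mini-batches of `batch_size` samples per step; the single-sample variant above is the simplest form of the same update.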


@ -0,0 +1,7 @@
(binary data: Marshal dump of an NMatrix, dtype: int32, stype: dense; not displayable as text)

File diff suppressed because one or more lines are too long


@ -0,0 +1,35 @@
require 'spec_helper'
RSpec.describe SVMKit::Multiclass::OneVsRestClassifier do
let(:samples) do
SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_samples_three_clusters.dat')))
end
let(:labels) do
SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_labels_three_clusters.dat')))
end
let(:base_estimator) do
SVMKit::LinearModel::PegasosSVC.new(penalty: 1.0, max_iter: 100, batch_size: 20, random_seed: 1)
end
let(:estimator) { described_class.new(estimator: base_estimator) }
it 'classifies three clusters.' do
estimator.fit(samples, labels)
score = estimator.score(samples, labels)
expect(score).to eq(1.0)
end
it 'dumps and restores itself using Marshal module.' do
estimator.fit(samples, labels)
copied = Marshal.load(Marshal.dump(estimator))
expect(estimator.class).to eq(copied.class)
expect(estimator.estimators.size).to eq(copied.estimators.size)
expect(estimator.estimators[0].class).to eq(copied.estimators[0].class)
expect(estimator.estimators[1].class).to eq(copied.estimators[1].class)
expect(estimator.estimators[2].class).to eq(copied.estimators[2].class)
expect(estimator.estimators[0].weight_vec).to eq(copied.estimators[0].weight_vec)
expect(estimator.estimators[1].weight_vec).to eq(copied.estimators[1].weight_vec)
expect(estimator.estimators[2].weight_vec).to eq(copied.estimators[2].weight_vec)
expect(estimator.classes).to eq(copied.classes)
expect(estimator.params[:estimator].class).to eq(copied.params[:estimator].class)
end
end
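The one-vs-rest strategy tested above fits one binary classifier per class (that class relabeled +1, everything else -1) and predicts the class whose classifier reports the largest decision value. A sketch with a deliberately trivial centroid-based base learner (hypothetical names, not the PegasosSVC used in the spec):

```ruby
# Trivial binary base learner: scores a sample by how much closer it is
# to the positive-class centroid than to the negative-class centroid.
CentroidClassifier = Struct.new(:pos, :neg) do
  def self.fit(samples, bin_labels)
    groups = samples.zip(bin_labels).group_by(&:last)
    new(centroid(groups[1].map(&:first)), centroid(groups[-1].map(&:first)))
  end

  def self.centroid(points)
    points.transpose.map { |col| col.sum / col.size }
  end

  # Larger value = more confident the sample belongs to the positive class.
  def decision_function(x)
    dist(x, neg) - dist(x, pos)
  end

  def dist(a, b)
    Math.sqrt(a.zip(b).sum { |u, v| (u - v)**2 })
  end
end

# One-vs-rest: one binary model per class, relabeling that class +1, rest -1.
def ovr_fit(samples, labels)
  labels.uniq.sort.map do |c|
    [c, CentroidClassifier.fit(samples, labels.map { |y| y == c ? 1 : -1 })]
  end
end

# Predict the class whose binary model returns the largest decision value.
def ovr_predict(models, x)
  models.max_by { |_, m| m.decision_function(x) }.first
end
```

Any base learner that exposes a real-valued decision function slots into this scheme, which is why the spec only checks that the wrapped estimators and their weight vectors survive marshaling.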


@ -0,0 +1,7 @@
(binary data: Marshal dump of an NMatrix, dtype: int32, stype: dense; not displayable as text)

File diff suppressed because one or more lines are too long


@ -0,0 +1,21 @@
require 'spec_helper'
RSpec.describe SVMKit::Preprocessing::L2Normalizer do
let(:n_samples) { 10 }
let(:n_features) { 4 }
let(:samples) do
rng = Random.new(1)
rnd_vals = Array.new(n_samples * n_features) { rng.rand }
NMatrix.new([n_samples, n_features], rnd_vals, dtype: :float64, stype: :dense)
end
it 'normalizes each sample to unit length.' do
normalizer = described_class.new
normalized = normalizer.fit_transform(samples)
sum_norm = 0.0
n_samples.times do |n|
sum_norm += normalized.row(n).norm2
end
expect((sum_norm - n_samples).abs).to be < 1.0e-6
end
end
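L2 normalization divides each sample by its Euclidean norm so every row has unit length, which is exactly what the spec's summed-norms check verifies. A plain-array sketch:

```ruby
# Scale each row vector to unit Euclidean (L2) length.
def l2_normalize(rows)
  rows.map do |row|
    norm = Math.sqrt(row.sum { |v| v * v })
    norm.zero? ? row.dup : row.map { |v| v / norm }
  end
end

normalized = l2_normalize([[3.0, 4.0], [1.0, 1.0]])
# first row becomes [0.6, 0.8]; every row now has norm 1
```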


@ -0,0 +1,35 @@
require 'spec_helper'
RSpec.describe SVMKit::Preprocessing::MinMaxScaler do
let(:n_samples) { 10 }
let(:n_features) { 4 }
let(:samples) do
rng = Random.new(1)
rnd_vals = Array.new(n_samples * n_features) { rng.rand }
NMatrix.new([n_samples, n_features], rnd_vals, dtype: :float64, stype: :dense)
end
it 'normalizes range of features to [0,1].' do
normalizer = described_class.new
normalized = normalizer.fit_transform(samples)
expect(normalized.min.to_a.min).to eq(0)
expect(normalized.max.to_a.max).to eq(1)
end
it 'normalizes range of features to a given range.' do
normalizer = described_class.new(feature_range: [-3, 2])
normalized = normalizer.fit_transform(samples)
expect(normalized.min.to_a.min).to eq(-3)
expect(normalized.max.to_a.max).to eq(2)
end
it 'dumps and restores itself using Marshal module.' do
transformer = described_class.new
transformer.fit(samples)
copied = Marshal.load(Marshal.dump(transformer))
expect(transformer.min_vec).to eq(copied.min_vec)
expect(transformer.max_vec).to eq(copied.max_vec)
expect(transformer.params[:feature_range][0]).to eq(copied.params[:feature_range][0])
expect(transformer.params[:feature_range][1]).to eq(copied.params[:feature_range][1])
end
end
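Min-max scaling maps each feature through x' = lo + (x − min) / (max − min) · (hi − lo), so every column spans exactly [lo, hi], matching the spec's endpoint checks. A plain-array sketch (names illustrative):

```ruby
# Per-feature min-max scaling into a target range [lo, hi].
def min_max_scale(rows, feature_range: [0.0, 1.0])
  lo, hi = feature_range
  cols = rows.transpose
  mins = cols.map(&:min)
  maxs = cols.map(&:max)
  rows.map do |row|
    row.each_index.map do |j|
      span = maxs[j] - mins[j]
      unit = span.zero? ? 0.0 : (row[j] - mins[j]) / span # guard constant columns
      lo + unit * (hi - lo)
    end
  end
end

scaled = min_max_scale([[1.0, 10.0], [2.0, 20.0], [3.0, 40.0]], feature_range: [-3.0, 2.0])
# every column of scaled now spans exactly [-3.0, 2.0]
```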


@ -0,0 +1,28 @@
require 'spec_helper'
RSpec.describe SVMKit::Preprocessing::StandardScaler do
let(:n_samples) { 10 }
let(:n_features) { 4 }
let(:samples) do
rng = Random.new(1)
rnd_vals = Array.new(n_samples * n_features) { rng.rand }
NMatrix.new([n_samples, n_features], rnd_vals, dtype: :float64, stype: :dense)
end
it 'performs standardization of samples.' do
normalizer = described_class.new
normalized = normalizer.fit_transform(samples)
mean_err = (normalized.mean(0) - NMatrix.zeros([1, n_features])).abs.sum(1)[0]
std_err = (normalized.std(0) - NMatrix.ones([1, n_features])).abs.sum(1)[0]
expect(mean_err).to be < 1.0e-8
expect(std_err).to be < 1.0e-8
end
it 'dumps and restores itself using Marshal module.' do
transformer = described_class.new
transformer.fit(samples)
copied = Marshal.load(Marshal.dump(transformer))
expect(transformer.mean_vec).to eq(copied.mean_vec)
expect(transformer.std_vec).to eq(copied.std_vec)
end
end
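Standardization, matching the `(x - @mean_vec) / @std_vec` transform at the top of this section, computes z = (x − μ) / σ per feature. A plain-array sketch using the population standard deviation (an assumption; NMatrix's `std` may use a different denominator):

```ruby
# Per-feature standardization: subtract the column mean and divide by the
# population standard deviation, giving each column mean 0 and std 1.
# (Assumes no column is constant, which would make std zero.)
def standardize(rows)
  cols = rows.transpose
  means = cols.map { |c| c.sum / c.size }
  stds = cols.each_with_index.map do |c, j|
    Math.sqrt(c.sum { |v| (v - means[j])**2 } / c.size)
  end
  rows.map { |row| row.each_index.map { |j| (row[j] - means[j]) / stds[j] } }
end
```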

14
spec/spec_helper.rb Normal file

@ -0,0 +1,14 @@
require 'bundler/setup'
require 'svmkit'
RSpec.configure do |config|
# Enable flags like --only-failures and --next-failure
config.example_status_persistence_file_path = '.rspec_status'
# Disable RSpec exposing methods globally on `Module` and `main`
config.disable_monkey_patching!
config.expect_with :rspec do |c|
c.syntax = :expect
end
end

23
spec/svmkit_spec.rb Normal file

@ -0,0 +1,23 @@
require 'spec_helper'
RSpec.describe SVMKit do
let(:samples) do
SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_samples_xor.dat')))
end
let(:labels) do
SVMKit::Utils.restore_nmatrix(Marshal.load(File.read(__dir__ + '/test_labels_xor.dat')))
end
let(:estimator) do
SVMKit::LinearModel::PegasosSVC.new(penalty: 1.0, max_iter: 100, batch_size: 20, random_seed: 1)
end
let(:transformer) do
SVMKit::KernelApproximation::RBF.new(gamma: 1.0, n_components: 1024, random_seed: 1)
end
it 'classifies xor data.' do
new_samples = transformer.fit_transform(samples)
estimator.fit(new_samples, labels)
score = estimator.score(new_samples, labels)
expect(score).to eq(1.0)
end
end

7
spec/test_labels_xor.dat Normal file

@ -0,0 +1,7 @@
(binary data: Marshal dump of an NMatrix, dtype: int32, stype: dense; not displayable as text)

File diff suppressed because one or more lines are too long

10
spec/utils_spec.rb Normal file

@ -0,0 +1,10 @@
require 'spec_helper'
RSpec.describe SVMKit::Utils do
it 'dumps and restores NMatrix object.' do
mat = NMatrix.rand([3, 3])
dumped = described_class.dump_nmatrix(mat)
restored = described_class.restore_nmatrix(dumped)
expect(mat).to eq(restored)
end
end

37
svmkit.gemspec Normal file

@ -0,0 +1,37 @@
# coding: utf-8
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'svmkit/version'
SVMKit::DESCRIPTION = <<MSG
SVMKit is a library for machine learning in Ruby.
SVMKit implements machine learning algorithms with an interface similar to Scikit-Learn in Python.
However, since SVMKit is an experimental library, only a few machine learning algorithms are implemented.
MSG
Gem::Specification.new do |spec|
spec.name = 'svmkit'
spec.version = SVMKit::VERSION
spec.authors = ['yoshoku']
spec.email = ['yoshoku@outlook.com']
spec.summary = %q{SVMKit is an experimental library of machine learning in Ruby.}
spec.description = SVMKit::DESCRIPTION
spec.homepage = 'https://github.com/yoshoku/svmkit'
spec.license = 'BSD-2-Clause'
spec.files = `git ls-files -z`.split("\x0").reject do |f|
f.match(%r{^(test|spec|features)/})
end
spec.bindir = 'exe'
spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
spec.require_paths = ['lib']
#spec.add_runtime_dependency 'nmatrix', '~> 0.2.3'
spec.add_development_dependency 'bundler', '~> 1.15'
spec.add_development_dependency 'rake', '~> 10.0'
spec.add_development_dependency 'rspec', '~> 3.0'
spec.add_development_dependency 'nmatrix', '~> 0.2.3'
end