PML: Matrix and Machine Learning Library Project

April 21, 2016

Using Python for scientific programming has many benefits. IPython, in particular, makes it very easy for readers to reproduce your research findings. However, when the time comes to run batch experiments, we usually discover that our Python/Matlab prototypes do not scale. I had long thought that coding in C++ should be as easy as coding in Python, so I started a C++ project for developing linear algebra and machine learning applications that speeds up C++ coding; it now has two more contributors.

There are already serious libraries written by professionals, and this project builds on them. Our C++ library simply makes the most commonly used operations, such as matrix multiplication, normalization, divergences, and random sampling from common probability distributions, easier to code. We use GSL, CBLAS, and LAPACK under the hood.

Ours is a header-only library, since usability is the main focus. We make use of powerful C++11 features such as rvalue references to deal with large memory blocks efficiently. You can download the current version of the library from its GitHub page. Below is a sample program using the library:

#include <iostream>

#include <pml.hpp>
#include <pml_random.hpp>

using namespace pml;

int main(){

  Matrix M = Uniform::rand(4, 5);  // 4x5 matrix with uniform random entries

  Vector v = Poisson::rand(1, 4);  // vector of length 4, entries drawn
                                   // from a Poisson with mean 1

  Vector v2 = Dot(M, v);           // matrix-vector dot product

  Vector v3 = Normalize(v2);       // normalize the result

  std::cout << v3;                 // print vector v3 to the screen

  v3.save("/tmp/v3.txt");          // save v3 to a text file

  return 0;
}
