## New Publication: “A Bayesian change point model for detecting SIP-based DDoS attacks”

We have published the results of our DDoS attack detection project, which we have been working on for two years, in the Digital Signal Processing journal. You can access our article here. Abstract: Session Initiation Protocol (SIP), as one of the most common signaling mechanisms for Voice over Internet Protocol (VoIP) applications, is a popular target for… Read More »

## Our Poster at ICML 2016 Anomaly Detection Workshop

We are happy to present our work, A Dictionary Learning Based Anomaly Detection Method for Network Traffic Data, at the ICML 2016 Anomaly Detection Workshop. Here is the poster we will be presenting.

## PML: Matrix and Machine Learning Library Project

Using Python for scientific programming has many benefits. IPython in particular makes it very easy for your readers to reproduce your research findings. However, when the time comes to run batch experiments, we usually find that our Python/Matlab prototypes do not scale. I have always thought that we should be able to code as easily in C++ as we… Read More »

## Censored Gaussian Model

Recently I’ve started working with models that include censored observations. In order to gain some insight into the problem, I’ve prepared a simple censored Gaussian model. Suppose we draw samples from a Gaussian distribution with unknown mean $\mu$ and precision $\tau$ $(= 1/\sigma^2)$ but can only observe the data points that are greater than a given… Read More »
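As a quick sanity check on the setup, here is a minimal NumPy/SciPy sketch (the threshold `c`, the function name `censored_loglik`, and the parameter values are illustrative, not from the post) that simulates such data and evaluates the censored log-likelihood: each observed point contributes its density, and each censored point contributes the CDF mass below the threshold.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, tau, c = 2.0, 1.0, 1.5            # true mean, precision, censoring threshold
sigma = 1.0 / np.sqrt(tau)

x = rng.normal(mu, sigma, size=10_000)
observed = x[x > c]                    # only points above c are seen
n_censored = int((x <= c).sum())       # for the rest, we only know the count

def censored_loglik(mu_, sigma_):
    """Log-likelihood: pdf terms for observed points, Phi(c) mass per censored one."""
    ll = norm.logpdf(observed, mu_, sigma_).sum()
    ll += n_censored * norm.logcdf(c, mu_, sigma_)
    return ll
```

Maximizing this function over $(\mu, \sigma)$ recovers the parameters despite never seeing the left tail.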

## Calculating The Inverse of Digamma Function

The digamma function is the logarithmic derivative of the gamma function, defined for the positive real numbers. When you work with Beta and Dirichlet distributions, you see it frequently. Furthermore, if you want to estimate the parameters of a Dirichlet distribution, you need to take the inverse of the digamma function. I… Read More »
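There is no closed form for the inverse, but a few Newton steps converge very quickly. Here is a short Python sketch (the function name `inv_digamma` is my own; the initialization follows Minka's well-known recipe), using SciPy's `digamma` and `polygamma` (trigamma is `polygamma(1, .)`):

```python
import numpy as np
from scipy.special import digamma, polygamma

def inv_digamma(y, n_iter=5):
    """Solve psi(x) = y for x > 0 with Newton's method."""
    # Initial guess: psi(x) ~ ln(x) for large x, psi(x) ~ -1/x - gamma near 0
    x = np.where(y >= -2.22, np.exp(y) + 0.5, -1.0 / (y - digamma(1.0)))
    for _ in range(n_iter):
        x = x - (digamma(x) - y) / polygamma(1, x)  # Newton step; trigamma in denominator
    return x
```

Five iterations are typically enough for near machine-precision accuracy.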

## Spectral Learning for Mixture of Markov Models (NIPS 2013 Spectral Learning Workshop)

Here is our work with Cem Subakan, Taylan Cemgil and Bulent Sankur.

- The Paper
- Supplementary Material
- Workshop Poster

## Kullback-Leibler Divergence Between Two Dirichlet (and Beta) Distributions

Recently I’ve been working on learning the parameters of a mixture of Dirichlet distributions, and I needed a measure to check how well my algorithm performs on synthetic data. I was advised to use the Kullback-Leibler divergence, but its derivation was a little difficult. Here is the derivation:
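For reference, the derivation lands at the standard closed form for $\mathrm{Dir}(\alpha)$ and $\mathrm{Dir}(\beta)$ over $K$ categories, where $\alpha_0 = \sum_{i=1}^{K} \alpha_i$, $\beta_0 = \sum_{i=1}^{K} \beta_i$, and $\psi$ is the digamma function:

$$\mathrm{KL}\big(\mathrm{Dir}(\alpha)\,\|\,\mathrm{Dir}(\beta)\big) = \ln\frac{\Gamma(\alpha_0)}{\Gamma(\beta_0)} + \sum_{i=1}^{K}\ln\frac{\Gamma(\beta_i)}{\Gamma(\alpha_i)} + \sum_{i=1}^{K}(\alpha_i-\beta_i)\big(\psi(\alpha_i)-\psi(\alpha_0)\big)$$

The Beta case is simply $K=2$.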

## Learning Mixture of Markov Models with EM

Here I present my Matlab implementation for learning Markov mixtures with the EM algorithm. David Barber’s book explains the model and the EM derivation very nicely, and also provides a Matlab code package for it. Both the book and the code are freely available; I recommend having a look at them. However, Barber’s code is written for readability… Read More »
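To fix ideas, here is a compact Python sketch of EM for a mixture of first-order Markov chains (my own illustrative code, not the Matlab implementation discussed above or Barber's package; the small smoothing constant is an assumption to avoid log(0)). The E-step computes per-sequence responsibilities from each chain's likelihood; the M-step re-estimates mixture weights, initial distributions, and transition matrices from responsibility-weighted counts.

```python
import numpy as np

def em_markov_mixture(seqs, K, S, n_iter=50, seed=0):
    """EM for a mixture of K first-order Markov chains over S states.
    seqs: list of 1-D integer arrays (state sequences)."""
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                      # mixture weights
    p0 = rng.dirichlet(np.ones(S), size=K)        # initial-state dists, K x S
    A = rng.dirichlet(np.ones(S), size=(K, S))    # transition matrices, K x S x S

    for _ in range(n_iter):
        # E-step: log joint of each sequence with each component
        logp = np.zeros((len(seqs), K))
        for n, s in enumerate(seqs):
            logp[n] = np.log(pi) + np.log(p0[:, s[0]])
            for t in range(1, len(s)):
                logp[n] += np.log(A[:, s[t - 1], s[t]])
        logp -= logp.max(axis=1, keepdims=True)   # stabilize before exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)         # responsibilities, N x K

        # M-step: responsibility-weighted counts (smoothed to stay positive)
        pi = r.mean(axis=0)
        p0 = np.full((K, S), 1e-6)
        C = np.full((K, S, S), 1e-6)
        for n, s in enumerate(seqs):
            p0[:, s[0]] += r[n]
            for t in range(1, len(s)):
                C[:, s[t - 1], s[t]] += r[n]
        p0 /= p0.sum(axis=1, keepdims=True)
        A = C / C.sum(axis=2, keepdims=True)
    return pi, p0, A, r
```

A readable version like this runs in O(N·T·K) per iteration; a vectorized implementation would replace the inner loops with precomputed transition-count matrices per sequence.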