We have published our work on recovering true network flow distributions from sampled observations using Bayesian nonnegative tensor factorization in Wireless Communications and Mobile Computing. You can reach the article here. Abstract: In this paper, we develop a framework to estimate network flow length distributions in terms of the number of packets. We model the network… Read More »
I’ve prepared a Bayesian Poisson Tensor Factorization tutorial, which includes detailed derivations and a Python implementation using the NumPy, SciPy and TensorLy libraries. You can check it out at https://github.com/bariskurt/bptf.
We have published the results of our DDoS attack detection project, which we’ve been working on for two years, in the Digital Signal Processing journal. You can reach our article here. Abstract: Session Initiation Protocol (SIP), as one of the most common signaling mechanisms for Voice over Internet Protocol (VoIP) applications, is a popular target for… Read More »
We are happy to present our work, A Dictionary Learning Based Anomaly Detection Method for Network Traffic Data, at the ICML 2016 anomaly detection workshop. Here is the poster we are going to present.
Using Python for scientific programming has many benefits. In particular, IPython makes it very easy for your readers to reproduce your research findings. However, when the time comes to run batch experiments, we usually find that our Python/Matlab prototypes are not scalable. I always thought we should be able to code as easily in C++ as we… Read More »
Recently I’ve started working with models that include censored observations. In order to gain some insight into the problem, I’ve prepared a simple censored Gaussian model. Suppose we draw samples from a Gaussian distribution with unknown mean $\mu$ and precision $\tau$ $(= 1/\sigma^2)$ but can only observe the data points that are greater than a given… Read More »
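A minimal sketch of the censoring setup described above, with assumed values for the mean, precision and cutoff (the post itself derives the inference; this only generates the data):

```python
import numpy as np

# Assumed illustrative values: true mean mu, precision tau, and cutoff c.
# We draw from N(mu, 1/tau) but only see the samples above c.
rng = np.random.default_rng(0)
mu, tau, c = 2.0, 1.0, 1.5
sigma = 1.0 / np.sqrt(tau)              # precision tau = 1 / sigma^2
x = rng.normal(mu, sigma, size=10_000)
observed = x[x > c]                     # the points we actually observe
n_censored = int((x <= c).sum())        # below the cutoff, only the count is known
```

The inference task is then to recover $\mu$ and $\tau$ from `observed` together with the censored count, rather than from the full sample.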
The digamma function is the logarithmic derivative of the gamma function, defined for the positive real numbers. When you work with Beta and Dirichlet distributions, you see it frequently. Furthermore, if you want to estimate the parameters of a Dirichlet distribution, you need to invert the digamma function. I… Read More »
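Inverting the digamma function has no closed form, but Newton's method converges in a handful of iterations. A sketch, using the initialization from Minka's "Estimating a Dirichlet distribution" technical report:

```python
import numpy as np
from scipy.special import digamma, polygamma

def inv_digamma(y, n_iter=5):
    """Solve digamma(x) = y for x > 0 by Newton's method."""
    y = np.asarray(y, dtype=float)
    # Minka's piecewise initialization: exp(y) + 0.5 for large y,
    # -1 / (y - digamma(1)) for very negative y.
    x = np.where(y >= -2.22, np.exp(y) + 0.5, -1.0 / (y - digamma(1.0)))
    for _ in range(n_iter):
        # Newton step; polygamma(1, x) is the trigamma function, digamma's derivative.
        x = x - (digamma(x) - y) / polygamma(1, x)
    return x
```

Five iterations are typically enough for near machine-precision accuracy across the positive reals.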
Here is our work with Cem Subakan, Taylan Cemgil and Bulent Sankur. The Paper · Supplementary Material · Workshop Poster
Recently I’ve been working on learning the parameters of a mixture of Dirichlet distributions, and I needed a measure to check how well my algorithm performs on synthetic data. I was advised to use the Kullback-Leibler divergence, but its derivation was a little difficult. Here is the derivation:
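The derivation in the post leads to a standard closed form for the KL divergence between two Dirichlet distributions; a sketch of it in Python (using SciPy's `gammaln` and `digamma`):

```python
import numpy as np
from scipy.special import gammaln, digamma

def kl_dirichlet(alpha, beta):
    """KL( Dir(alpha) || Dir(beta) ) in closed form.

    KL = ln G(a0) - sum ln G(a_i) - ln G(b0) + sum ln G(b_i)
         + sum (a_i - b_i) * (psi(a_i) - psi(a0)),
    where a0 = sum a_i, b0 = sum b_i, G is the gamma function and psi the digamma.
    """
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    a0, b0 = alpha.sum(), beta.sum()
    return (gammaln(a0) - gammaln(alpha).sum()
            - gammaln(b0) + gammaln(beta).sum()
            + ((alpha - beta) * (digamma(alpha) - digamma(a0))).sum())
```

As a sanity check, the divergence is zero when the two parameter vectors coincide and positive otherwise.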
Here I present my Matlab implementation for learning Markov mixtures with the EM algorithm. David Barber’s book explains the model and the EM derivation very nicely, and also provides a Matlab code package for it. Both the book and the code are freely available, and you are encouraged to have a look at them. However, Barber’s code is written for readability… Read More »
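The post's implementation is in Matlab; as an illustration of the same idea, here is a minimal NumPy sketch of EM for a mixture of discrete first-order Markov chains (function and variable names are my own, not the post's):

```python
import numpy as np

def em_markov_mixture(X, K, S, n_iter=50, seed=0):
    """EM for a mixture of K first-order Markov chains over S discrete states.

    X : (N, T) integer array of state sequences.
    Returns mixture weights w (K,), initial distributions pi (K, S),
    and transition matrices A (K, S, S).
    """
    rng = np.random.default_rng(seed)
    N, T = X.shape
    eps = 1e-12
    w = np.full(K, 1.0 / K)
    pi = rng.dirichlet(np.ones(S), size=K)        # (K, S) initial state dists
    A = rng.dirichlet(np.ones(S), size=(K, S))    # (K, S, S) transition matrices
    for _ in range(n_iter):
        # E-step: log joint probability of each sequence under each component.
        logp = np.log(w)[None, :] + np.log(pi[:, X[:, 0]] + eps).T   # (N, K)
        for t in range(1, T):
            logp += np.log(A[:, X[:, t - 1], X[:, t]] + eps).T
        logp -= logp.max(axis=1, keepdims=True)   # stabilize before exponentiating
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)         # responsibilities (N, K)
        # M-step: re-estimate parameters from responsibility-weighted counts.
        w = r.mean(axis=0)
        for k in range(K):
            first = np.bincount(X[:, 0], weights=r[:, k], minlength=S) + eps
            pi[k] = first / first.sum()
            C = np.full((S, S), eps)              # smoothed transition counts
            for t in range(1, T):
                np.add.at(C, (X[:, t - 1], X[:, t]), r[:, k])
            A[k] = C / C.sum(axis=1, keepdims=True)
    return w, pi, A
```

Each E-step scores whole sequences under each chain, and the M-step reduces to responsibility-weighted transition counting, which is what makes the Markov case so convenient for EM.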