Preprint “Prior specification via prior predictive matching: Poisson matrix factorization and beyond”

I will slowly restart blogging a bit about my past year, which included a visit to Prof. Arto Klami's group at the University of Helsinki and a research internship at Curious AI, working under the guidance of Mathias Berglund and Harri Valpola. The first part of my stay resulted in an interesting research direction exploring how the prior predictive distribution can be used to obtain direct relationships between moments of the data (assuming it is generated by the model being specified) and the hyperparameters of the model. I will discuss this further in a later post; for now, here are the abstract and a link to the preprint.

Abstract: Hyperparameter optimization for machine learning models is typically carried out by some sort of cross-validation procedure or global optimization, both of which require running the learning algorithm numerous times. We show that for Bayesian hierarchical models there is an appealing alternative that allows selecting good hyperparameters without learning the model parameters during the process at all, facilitated by the prior predictive distribution that marginalizes out the model parameters. We propose an approach that matches suitable statistics of the prior predictive distribution with ones provided by an expert and apply the general concept for matrix factorization models. For some Poisson matrix factorization models we can analytically obtain exact hyperparameters, including the number of factors, and for more complex models we propose a model-independent optimization procedure.

Link: https://arxiv.org/abs/1910.12263
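
To give a flavor of the idea, here is a minimal sketch in Python of prior predictive matching for a toy Poisson matrix factorization with Gamma priors on both factor matrices. The shared prior, the parameterization, and the single-moment target below are my own illustration, not the exact derivation from the paper:

```python
# A toy Poisson matrix factorization (illustration only, not the paper's exact setup):
#   y_ij ~ Poisson(sum_k theta_ik * beta_jk)
#   theta_ik ~ Gamma(a, b),  beta_jk ~ Gamma(a, b)   (shape a, rate b, shared for simplicity)
# The prior predictive mean and variance of an entry have closed forms in the
# hyperparameters, so we can solve for values that hit expert-provided targets.

import numpy as np
from scipy.optimize import brentq

def prior_predictive_moments(K, a, b):
    """Closed-form prior predictive mean and variance of a single entry y_ij."""
    m = a / b                                   # E[theta] = E[beta]
    v = a / b**2                                # Var[theta] = Var[beta]
    rate_mean = K * m * m                       # E[sum_k theta_ik * beta_jk]
    rate_var = K * ((v + m**2) ** 2 - m**4)     # Var of the Poisson rate
    return rate_mean, rate_mean + rate_var      # law of total variance

def match_shape_to_mean(target_mean, K, b):
    """Solve for the Gamma shape a so that the predictive mean equals target_mean."""
    return brentq(lambda a: prior_predictive_moments(K, a, b)[0] - target_mean, 1e-6, 1e3)

# Expert belief: entries of the count matrix have mean roughly 2, with K = 10 factors.
a_hat = match_shape_to_mean(target_mean=2.0, K=10, b=1.0)
print(a_hat, prior_predictive_moments(10, a_hat, 1.0))

# Monte Carlo sanity check of the closed-form moments.
rng = np.random.default_rng(0)
theta = rng.gamma(a_hat, 1.0, size=(200_000, 10))   # numpy uses scale = 1 / rate
beta = rng.gamma(a_hat, 1.0, size=(200_000, 10))
y = rng.poisson((theta * beta).sum(axis=1))
print(y.mean(), y.var())
```

In the preprint this kind of matching is carried out analytically for some Poisson matrix factorization models, including solving for the number of factors, while for more complex models the same targets are matched with a model-independent optimization procedure.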

Nordic Probabilistic AI School 2020!

After the successful first edition of ProbAI, we are again in full motion organizing this year's summer school. The main change to the program is making it more tight-knit, so that we expect closer alignment between students, topics, and speakers. We are also working towards an extra Q&A session at the end of each day, with a group of teaching assistants prepared to go over the day's content and help clarify any doubts.

Applications: https://probabilistic.ai/application/

Here is the official announcement we released in different open channels online.


Nordic Probabilistic AI School (ProbAI)

Open for applications!

You are welcome to apply for the Nordic Probabilistic AI School (ProbAI) 2019, taking place on June 3-7 in Trondheim (Norway).

About the ProbAI 2019

The Nordic Probabilistic AI School (ProbAI) is a new annual event serving state-of-the-art expertise in machine learning and artificial intelligence to the public, students, academia, and industry.

Our objective is to offer an intermediate-to-advanced-level summer school with a particular focus on probabilistic models and deep generative models, covering latent variable models, inference with sampling and variational approximations, and probabilistic programming and tools.

The intentionally small team of invited lecturers will cover a carefully designed curriculum. Through tight cooperation between our lecturers, and through theoretical lectures and hands-on tutorials, we hope to provide high-quality, continuous, and consistent knowledge transfer.


Registration Fee

The registration fee includes all courses, coffee breaks, lunches, and the banquet.

  • Students (including PhD) → 2500 NOK ~ 256 EUR
  • Academia → 5000 NOK ~ 512 EUR
  • Industry → 10000 NOK ~ 1024 EUR

We are also offering a limited number of scholarships.

Visit our website to learn more.


Preprint of the Temporal Hierarchical Recurrent Neural Network (THRNN) paper (WSDM 2019)

The arXiv preprint of our paper introducing a joint Point Process and Hierarchical RNN model for item and time prediction is now available.


Time is of the Essence: a Joint Hierarchical RNN and Point Process Model for Time and Item Predictions

In recent years session-based recommendation has emerged as an increasingly applicable type of recommendation. As sessions consist of sequences of events, this type of recommendation is a natural fit for Recurrent Neural Networks (RNNs). Several additions have been proposed for extending such models in order to handle specific problems or data. Two such extensions are 1.) modeling of inter-session relations for catching long term dependencies over user sessions, and 2.) modeling temporal aspects of user-item interactions. The former allows the session-based recommendation to utilize extended session history and inter-session information when providing new recommendations. The latter has been used to both provide state-of-the-art predictions for when the user will return to the service and also for improving recommendations. In this work we combine these two extensions in a joint model for the tasks of recommendation and return-time prediction. The model consists of a Hierarchical RNN for the inter-session and intra-session items recommendation extended with a Point Process model for the time-gaps between the sessions. The experimental results indicate that the proposed model improves recommendations significantly on two datasets over a strong baseline, while simultaneously improving return-time predictions over a baseline return-time prediction model.

https://arxiv.org/pdf/1812.01276.pdf
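
For readers who prefer code to prose, here is a rough architectural sketch in PyTorch of the kind of joint model the abstract describes: an intra-session GRU over the items of a session, an inter-session GRU carrying the user state across sessions, and two heads sharing that latent representation, one scoring the next item and one parameterizing a simple intensity for the return-time gap. This is my own simplification with assumed module names and a constant-intensity likelihood, not the authors' implementation:

```python
import torch
import torch.nn as nn

class JointSessionModel(nn.Module):
    """Sketch of a joint next-item / return-time model (illustrative only)."""

    def __init__(self, n_items, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, emb_dim)
        self.intra_rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)  # within a session
        self.inter_rnn = nn.GRUCell(hidden_dim, hidden_dim)             # across sessions
        self.item_head = nn.Linear(hidden_dim, n_items)                 # next-item scores
        self.time_head = nn.Linear(hidden_dim, 1)                       # log-intensity of the gap

    def forward(self, session_items, user_state):
        # session_items: (batch, seq_len) item ids of the current session
        # user_state:    (batch, hidden_dim) latent state carried from previous sessions
        emb = self.item_emb(session_items)
        out, h_n = self.intra_rnn(emb, user_state.unsqueeze(0).contiguous())
        item_scores = self.item_head(out)                       # per-step recommendation scores
        new_user_state = self.inter_rnn(h_n.squeeze(0), user_state)
        log_intensity = self.time_head(new_user_state)          # drives the return-time prediction
        return item_scores, log_intensity, new_user_state

model = JointSessionModel(n_items=1000)
items = torch.randint(0, 1000, (4, 12))   # a batch of 4 sessions, 12 events each
state = torch.zeros(4, 128)               # fresh users
scores, log_lam, state = model(items, state)

# Training would combine a cross-entropy loss on `scores` with the negative
# log-likelihood of the observed gap under the chosen point process, e.g. for a
# constant-intensity (exponential) gap:  -(log_lam - exp(log_lam) * gap).
```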

Paper accepted at WSDM 2019

Our paper “Time is of the essence: A joint Hierarchical RNN and Point Process model for time and item predictions” has been accepted at the 12th ACM International Conference on Web Search and Data Mining (WSDM). This is collaborative work with Bjørnar Vassøy, Massimiliano Ruocco and Erlend Aune. WSDM is one of the top conferences in the domain of data mining, information retrieval, and machine learning on the Web. This year WSDM had 511 submissions and an acceptance rate of 16%. Soon we will provide a link to the preprint and source code.

In this paper, we propose a joint model with a shared latent representation for a Point Process model (for time prediction) and a Hierarchical Recurrent Neural Network (HRNN). By doing so we are able to model a multi-session recommendation problem together with return-time prediction.

This work was developed as part of the Norwegian Open AI Lab in cooperation with Telenor Research.

Looking forward to visiting Melbourne again in the summer!


GIR’17 and visiting RMIT

Our position paper “Poisson Factorization Models for Spatiotemporal Retrieval”, joint work with Dirk Ahlers, got accepted at the 11th Workshop on Geographic Information Retrieval (GIR’17). In this work, we discuss some modelling ideas and possibilities for advancing spatiotemporal retrieval using Poisson factorization models, especially in scenarios where we have multiple sources of count or implicit spatiotemporal user data. Unfortunately, I will not be able to attend the workshop (but Dirk will be there), because I am now in Melbourne, Australia, where I will stay for 3 months as a visiting graduate student in a project with the IR group at RMIT. In particular, I will be working with Dr Yongli Ren and Prof Mark Sanderson, developing joint probabilistic models of spatiotemporal user data for indoor-space recommendations (they have a very interesting dataset that I am curious to explore). Hopefully, over the next couple of months, I will continue working on probabilistic models for recommender systems while incorporating many new and interesting ideas related to location and time.

Post-conference: ECML-PKDD 2017

ECML-PKDD 2017 was very pleasant, and Skopje was a welcome surprise. I am happy with each new conference I attend, always meeting new people doing very good research. The community was very friendly in general!

I presented my paper in the Matrix and Tensor Factorization session, and I was particularly happy with that: even though the application we are working on is recommender systems, we are focusing on the methods themselves and proposing new factorization methods and models. Later in the night, we had the poster session (poster-ecml2017) at the Macedonian Opera & Ballet and afterwards headed to the wine festival just outside.

For those interested, my presentation slides are here:

Recommender Systems and Deep Learning: paper links

This semester I will be advising some master students on their final projects. At this point, they don’t select a specific topic but should look into a given area to find a specific research question, and some of them will definitely work on Deep Learning and Recommender Systems. This is especially motivated by a very nice experience we (the NTNU-AILab group) had last year, where one master student working on RNNs for session-based recommendation got a paper accepted at DLRS 2017. So, I decided to make a small selection of papers related to this topic, focusing on WSDM, WWW, KDD, CIKM, RecSys, ICLR, DLRS, and some other specific conferences in the last three years (2015, 2016, and 2017). The result is a list of 45 papers with many distinct ideas, but also some common threads (from Matrix Factorization to CNNs or LSTMs, session-based methods using RNNs, etc.). I will not discuss the different ideas here, but I am posting the link because some people might be interested in it.

https://github.com/zehsilva/recsys-deeplearning-info

Lisbon Machine Learning Summer School (LxMLS) 2017

Last year I had the opportunity to attend this great summer school in the beautiful and lovely city of Lisbon. It was a great week together with a lot of interesting and intelligent people, all of them interested in the amazing and exciting area of machine learning and NLP. I liked it so much last year that I decided to come back this year to volunteer as an assistant at the summer school. Today was day -1, when we organized some of the registration material, welcomed some students, and had some beers. It looks like it will again be a great time here in Lisbon.

http://lxmls.it.pt/2017/
