Louvain method for optimising modularity
In collaboration with V. Blondel, J.-L. Guillaume and E. Lefebvre, we have developed a fast and efficient method to uncover communities in very large networks. The code is available online, and the method is described in our paper Fast unfolding of communities in large networks, V.D. Blondel, J.-L. Guillaume, R. Lambiotte and E. Lefebvre, J. Stat. Mech. (2008) P10008. The method has also been implemented in network visualisation programs such as Gephi.
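To give a flavour of how the method works, here is a minimal sketch of its first phase, the local-moving step: starting from singleton communities, each node is greedily moved to the neighbouring community that most increases modularity, until no move helps. For clarity this sketch recomputes modularity from scratch at each candidate move; the released code instead uses the fast incremental gain formula and adds a second, aggregation phase, both omitted here.

```python
from collections import defaultdict

def modularity(adj, comm):
    """Newman-Girvan modularity; adj maps node -> {neighbour: weight}."""
    m2 = sum(sum(nbrs.values()) for nbrs in adj.values())   # = 2m
    deg = {u: sum(adj[u].values()) for u in adj}
    intra = sum(w for u in adj for v, w in adj[u].items() if comm[u] == comm[v])
    tot = defaultdict(float)
    for u in adj:
        tot[comm[u]] += deg[u]
    return intra / m2 - sum((t / m2) ** 2 for t in tot.values())

def local_moving(adj):
    """Phase 1 of the Louvain method: greedily move nodes between
    neighbouring communities while modularity improves (brute-force
    recomputation here, for readability)."""
    comm = {u: u for u in adj}          # start from singleton communities
    improved = True
    while improved:
        improved = False
        for u in adj:
            best_c, best_q = comm[u], modularity(adj, comm)
            for v in adj[u]:
                if comm[v] == comm[u]:
                    continue
                trial = dict(comm)
                trial[u] = comm[v]
                q = modularity(adj, trial)
                if q > best_q + 1e-12:
                    best_q, best_c = q, comm[v]
            if best_c != comm[u]:
                comm[u] = best_c
                improved = True
    return comm

# Toy example: two triangles joined by a single edge
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
adj = defaultdict(dict)
for u, v in edges:
    adj[u][v] = adj[v][u] = 1.0
part = local_moving(adj)   # recovers the two triangles, Q = 5/14
```

On this toy graph the sweep converges to the two triangles as communities, with modularity 6/7 - 1/2 = 5/14.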

Optimising "generalised modularities"
In collaboration with J.-C. Delvenne and M. Barahona, we have developed a generalised modularity based on dynamical processes, in which time plays the role of a resolution parameter. Two codes have been written to optimise this generalised modularity, making it possible to uncover modules in networks at the desired resolution. The first code generalises a C implementation of Simulated Annealing developed by R. Guimera. The second generalises the Louvain method described above and is based on a C++ implementation by J.-L. Guillaume. These two codes are available here and are described in a short report. The theoretical aspect of our work is described in Dynamics and Modular Structure in Networks by R. Lambiotte, J.-C. Delvenne and M. Barahona.
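The quality function can be sketched as follows for the discrete-time random walk (the paper also treats a continuous-time version): the stability of a partition at time t sums, over pairs of nodes in the same community, the probability that a walk started at stationarity is at j after t steps given it started at i, minus what is expected at stationarity. At t = 1 this reduces to Newman's modularity, and larger t favours coarser partitions. A minimal numerical sketch, not the released optimisation code:

```python
import numpy as np

def stability(A, labels, t):
    """Discrete-time stability of a partition:
    r(t) = sum over same-community pairs (i, j) of
           [(Pi M^t)_ij - pi_i pi_j],
    with M the random-walk transition matrix, pi its stationary
    distribution and Pi = diag(pi). r(1) equals modularity."""
    deg = A.sum(axis=1)
    pi = deg / deg.sum()
    M = A / deg[:, None]                      # row-stochastic transition matrix
    R = np.diag(pi) @ np.linalg.matrix_power(M, t) - np.outer(pi, pi)
    same = np.equal.outer(labels, labels)     # mask of same-community pairs
    return R[same].sum()

# Two triangles joined by a single edge; natural partition = the triangles
A = np.zeros((6, 6))
for u, v in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
labels = np.array([0, 0, 0, 1, 1, 1])
r1 = stability(A, labels, 1)   # = 5/14, the modularity of this partition
```

A useful sanity check is that the all-in-one-community partition always has stability zero, since each row of Pi M^t sums to pi_i.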

Edge partitions for overlapping communities
In collaboration with T.S. Evans, we propose to partition the edges of a graph in order to uncover overlapping communities of its nodes. Our approach is based on the construction of different types of weighted line graphs, i.e. graphs whose nodes are the edges of the original graph, which encode the relations between edges in different ways. A description of our work can be found in Line Graphs, Link Partitions and Overlapping Communities by T.S. Evans and R. Lambiotte, Phys. Rev. E 80, (2009) 016105. Codes to produce weighted line graphs and optimise their modularity can be found here.
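A minimal sketch of one such construction (assuming the 1/(k-1) weighting, one of the variants discussed in the paper): two edges sharing a node i of degree k_i are linked in the line graph with weight 1/(k_i - 1), so that high-degree nodes do not dominate the projection. Partitioning the line-graph nodes then yields an edge partition of the original graph, and a node belongs to every community containing one of its edges.

```python
from collections import defaultdict
from itertools import combinations

def weighted_line_graph(edges):
    """Weighted line graph: nodes are the edges of the original graph;
    two edges sharing node i get weight 1/(k_i - 1), where k_i is the
    degree of i (contributions from shared endpoints accumulate)."""
    incident = defaultdict(list)
    for u, v in edges:
        incident[u].append((u, v))
        incident[v].append((u, v))
    lg = defaultdict(float)
    for node, elist in incident.items():
        k = len(elist)
        if k < 2:
            continue                      # degree-1 nodes link no edge pairs
        for a, b in combinations(elist, 2):
            lg[frozenset((a, b))] += 1.0 / (k - 1)
    return lg

# A triangle maps to a triangle of unit weights (every node has degree 2)
tri = weighted_line_graph([(0, 1), (1, 2), (0, 2)])
# A 3-leaf star maps to a triangle of weights 1/2 (centre has degree 3)
star = weighted_line_graph([(0, 1), (0, 2), (0, 3)])
```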

TiDeH: Time-Dependent Hawkes model for predicting future retweet activity
In collaboration with Ryota Kobayashi, we propose a time-dependent self-exciting point process to model and predict the dynamics of retweets in social media. A description of our work can be found in TiDeH: Time-Dependent Hawkes Process for Predicting Retweet Dynamics by R. Kobayashi and R. Lambiotte, ICWSM (2016). Code is available here.
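The core quantity in such a model is the event intensity: each past retweet at time t_i by a user with d_i followers contributes d_i * phi(t - t_i) through a memory kernel phi, and the sum is modulated by a time-dependent infectious rate p(t). A minimal sketch of this intensity; the constant p and exponential kernel below are placeholders only, since the paper fits p(t) (including circadian modulation) and the kernel from data:

```python
import math

def intensity(t, events, p, phi):
    """Intensity of a time-dependent Hawkes process:
    lambda(t) = p(t) * sum over past events (t_i, d_i) of d_i * phi(t - t_i),
    where d_i is e.g. the follower count of the i-th retweeter."""
    return p(t) * sum(d * phi(t - ti) for ti, d in events if ti < t)

# Placeholder choices, for illustration only:
p = lambda t: 0.001                              # constant infectious rate
phi = lambda s: math.exp(-s / 300.0) / 300.0     # exponential kernel, 300 s scale
events = [(0.0, 1000), (60.0, 50), (120.0, 10)]  # (time in s, follower count)
lam = intensity(130.0, events, p, phi)
```

Predicting future activity then amounts to integrating this intensity forward, with newly generated events feeding back into the sum.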

KONECT: The Koblenz Network Collection
The KONECT project curates a collection of network datasets of various sizes and from various domains, as well as code for their analysis. The KONECT project has made available the following software under a free software license (GPLv3):

  • KONECT Network Analysis Toolkit: A GNU Octave and Matlab toolbox for the analysis of networks.
  • KONECT Cloud Analysis: A package for running KONECT analyses in parallel. This software runs continuously on KONECT servers at the institute.
  • KONECT Network Extraction: A library for generating network datasets, with sample code for extracting datasets from many individual websites.

Stu: Build automation
Stu is a build automation tool developed by the group for data mining applications. It is designed for large projects with many files, highly parallel execution, and complex interdependencies. It is not, however, specific to data mining and can be used for general build tasks.