Drupal for big data - is it ready?

In the emerging era of the Internet of Things and big data, I will try to look at things from Drupal's perspective.

The whole web thrives on ever bigger data. The latest startup trends reveal a tendency for applications to be built around data - how to better manage or display it. So, as a Drupal professional, you might ask yourself: is this platform ready, and is it even the right one to apply?

Drupal has the immense advantage of being a great CMS for managing content and data for display, but can it actually handle huge amounts of it?

The basis of this talk was formed over the summer of 2014, while I was consulting for a business called Cloudsense. Over the duration of the project I encountered many situations where I was required to parse a real-time stream of large data being imported into Drupal. The challenges I faced ranged from simply altering the data, or doing heavy comparisons and calculations on the fly, to very simple human factors, like the empirical assumption that this great framework can do everything.

Whilst consulting and architecting the technical solution I began to ponder an idea - quite a few questions, actually:

  • How much do we know about Drupal and huge amounts of data?
  • Can this framework, primarily a CMS, play its part in the business of big data?
  • Is this the right technology, or am I too Drupalised?

These and many other thoughts urged me to put together two talks: one at European Drupal Days, the other at Drupal Show and Tell. In the end, while getting ready for both events, I was really amazed at how many projects have tried to make Drupal part of their "Immense Data" solutions.

My thoughts are combined in the presentation below.

The initial presentation from Milan can be found here.