News

The paper "Enabling Wide Area Data Analytics with Collaborative Distributed Processing Pipelines (CDPPs)" by Anja Feldmann, Manfred Hauswirth, and Volker Markl was accepted for publication at ICDCS 2017.

Read more

Prof. Volker Markl, director of the BBDC, will give a keynote speech at the ICDE conference in San Diego on April 19, 2017.

The title of the keynote is "Mosaics: Stratosphere, Flink and beyond".

Read more

On March 15, BBDC Co-Director Klaus-Robert Müller and RIKEN Center for AIP Director Masashi Sugiyama signed a memorandum of understanding between the two research institutions.

Read more

At the EDBT/ICDT 2017 Joint Conference in Venice, Italy, members of the Database and Information Management Group (DIMA), led by Prof. Markl, Director of the BBDC, presented the demonstration "I2: Interactive Real-Time Visualization for Streaming Data".

Read more

Events

Digital Future Science Match

Conference "Digital Future Science Match"

May 12, 2017, Kosmos Berlin

An event by the Berlin Big Data Center together with the Zuse Institut Berlin, the Einstein Center Digital Future, and Der Tagesspiegel.

Register using our promotional code.

Big Data Excellence in Germany and UK

March 1st, 2017 in Berlin

A joint event by the Berlin Big Data Center and the UK Science and Innovation Network took place at the Smart Data Forum.

The First Berlin Big Data Center Symposium Held in Berlin

The Berlin Big Data Center held its first Symposium on November 8th at the Smart Data Forum in Berlin. At this event, members of the BBDC presented the project's interim results from two years of research.

Newsletter published

Big Data Research is a joint newsletter which reflects work done by the Berlin Big Data Center (BBDC), the Dresden/Leipzig Competence Center for Scalable Data Services and Solutions (ScaDS), the Smart Data Innovation Lab (SDIL), and the Smart Data Forum (SDF).

Edition #001 (Nov 2016)

Edition #002 (Feb 2017)

Subscribe to Newsletter


Five Dimensions of Big Data

Big data is often defined as any data set that cannot be handled using today's widely available mainstream techniques and technologies. The challenges of handling big data are often described using the 3-Vs (volume, variety, and velocity): a high volume of data from a variety of data sources arriving at high velocity, analysed to achieve an economic benefit. However, the 3-Vs fail to reflect the complexity of "Big Data" in its entirety.

Read the full article

According to the Harvard Business Review, data scientist is "The Sexiest Job of the 21st Century". Data scientists are often considered to be wizards who deliver value from big data. These wizards need to have knowledge in three very distinct subject areas, namely scalable data management, data analysis, and domain expertise. However, it is a challenge to find these jacks-of-all-trades who cover all three areas. Or, as the Wall Street Journal puts it, "Big Data's Problem Is Little Talent". Naturally, finding talented data scientists is also a requirement if we are to put big data to good use. If data analysis were specified using a declarative language, data scientists would no longer have to worry about low-level programming. Instead, they would be free to concentrate on their data analysis problem. The goal of the Berlin Big Data Center is to help bridge the talent gap of big data by researching and developing novel technology.

Read more about it in the article accompanying the VLDB keynote "Breaking the Chains: On Declarative Data Analysis and Data Independence in the Big Data Era" by Volker Markl.