Monday, December 19, 2016, 2pm

Title: Finding Interesting Regions in Spatial and Spatio-temporal Datasets

Note: the talk has been cancelled.

Speaker:  Christoph F. Eick, Department of Computer Science, University of Houston

Bio: Christoph F. Eick is an Associate Professor in the Department of Computer Science at the University of Houston and the Director of the UH Data Analysis and Intelligent Systems Lab. His research interests include data science, data mining, geographical information systems, artificial intelligence and critical infrastructure resilience. He has published more than 150 papers in these areas and serves on the program committees of top data mining and artificial intelligence conferences.

Abstract:  Due to advances in remote sensors and sensor networks, spatial and spatio-temporal data have become increasingly available. In this talk, we present three different approaches to finding interesting regions in such datasets. The first is a serial, density-based spatio-temporal clustering approach that employs non-parametric density estimation techniques and contouring algorithms to obtain spatial clusters whose scope is described by polygon models; it then identifies spatio-temporal clusters as polygons that persist across consecutive time frames. The second approach generalizes the popular spatial clustering algorithm SNN to create spatio-temporal clusters. The third approach relies on a graph-based interestingness hotspot discovery framework that employs Gabriel graphs to define object neighborhoods and grows hotspots from hotspot seeds, maximizing a plug-in interestingness function. Finally, experimental results obtained by applying the three approaches to air pollution, crime, earthquake and New York taxi cab datasets are presented and evaluated.
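
The Gabriel-graph neighborhoods mentioned in the abstract have a simple geometric definition: two points p and q are neighbors iff no third point lies inside the circle whose diameter is the segment pq. A minimal brute-force Python sketch of that construction (not the framework's actual implementation, and O(n³) rather than the Delaunay-based algorithms used in practice):

```python
from itertools import combinations

def gabriel_graph(points):
    """Return the Gabriel graph of 2-D points as a list of index pairs.

    Points i and j are connected iff no other point k lies strictly
    inside the circle with diameter (i, j). Equivalently, for every
    other point k: d(i,k)^2 + d(j,k)^2 >= d(i,j)^2.
    """
    def sqdist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    n = len(points)
    edges = []
    for i, j in combinations(range(n), 2):
        d_ij = sqdist(points[i], points[j])
        # keep the edge only if every other point is outside the circle
        if all(sqdist(points[i], points[k]) + sqdist(points[j], points[k]) >= d_ij
               for k in range(n) if k not in (i, j)):
            edges.append((i, j))
    return edges
```

For three collinear points, for example, the middle point blocks the long edge, so `gabriel_graph([(0, 0), (1, 0), (2, 0)])` yields only the two short edges.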

Time and Location: Monday, December 19, 2016, 2pm, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.

Thursday, July 7, 2016

Title: Data projects for servitization in Industrie 4.0 scenarios

Speaker:  Mikel Niño, Interoperable Databases Group, University of the Basque Country (UPV/EHU), Spain.

Bio: M.Sc. in Computer Science (’98), Software Development and Artificial Intelligence (’03) and Business Administration (’15). Currently preparing his Ph.D. dissertation on Big Data Analytics applications in Industrie 4.0 scenarios. He has 15+ years of professional experience in IT, business creation and management, in both public organizations (9 years at the Chamber of Commerce, running IT promotion programs for the Regional Government as well as for the European Commission) and private ones (startup founder and advisor to entrepreneurs on IT business strategy).

Abstract: Manufacturing industries have become one of the most prominent sectors for the application of Big Data (together with other key enabling technologies such as Cloud Computing and the Internet of Things), thanks to paradigms such as Industrie 4.0, Industrial Internet or Smart Manufacturing. In these scenarios, the analysis of large-scale volumes of data can provide insights to improve process performance and product quality. Moreover, equipment suppliers can leverage data analytics to attach value-added services to their products (servitization). The talk will illustrate the opportunities and challenges for data analytics applications in this type of manufacturing scenario, reviewing the experiences and lessons learned in devising a data-enabled decision guidance system for a globally distributed chemical manufacturing sector.

Time and Location: July 7th, 3pm, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.

Thursday, June 30, 2016

Title: Smart Cities: From legacy infrastructure to smart services

Speaker:  Oliver Rolle, Project Manager, Urban Software Institute

Abstract: The Urban Software Institute [ui!] advises cities and supports their transformation toward ambitious climate, energy and traffic goals. One pillar of this work is extracting data from legacy infrastructure and making it available to others, enabling them to re-use and re-purpose the data to create valuable services for smart cities. Based on a traffic use case, the talk shows the obstacles encountered and how big data and machine learning technologies helped to overcome them.

Time and Location: June 30th, 2pm, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.

Monday, June 20, 2016

Title: Intellectual Capital: Software innovation and its role in national economies

Speaker:  Gio Wiederhold, Professor Emeritus, Stanford University

Bio: Gio Wiederhold was born in Italy, educated in Germany and the Netherlands, and moved to the US in 1958. He started with numerical computing at SADTC in Holland and adapted his efforts as computing technology progressed into more areas.
Gio obtained a PhD in 1976 and became a professor at Stanford University. During a three-year assignment at DARPA he initiated the Digital Library program, funding research that led, among other outcomes, to Google. Since his formal retirement in 2001 he has served as a government consultant on issues of software exports and their value. In 2011 Gio received an honorary DSc from the National University of Ireland in Galway. He stopped offering courses at Stanford in 2014.
He has authored and coauthored 6 books on diverse topics and over 300 reports and papers, and has supervised 36 PhD theses.
Many more details are at

Abstract: Software has invaded all aspects of our world.
It can no longer just be viewed as a fascinating technology.
Software, and the products that depend on it, from watches to aircraft, social interactions, and sharing services, comprise a large fraction of modern commerce.
The creators and the intellectual property they generate, exploit, and maintain comprise the intellectual capital, an asset that competes with the financial capital that traditional manufacturing industries rely on.
I will present the flow of innovation into our national economies. Rights to profit from intellectual property are poorly documented and are easily transferred among countries. The importance of our intellectual capital is underestimated by economists and planners because the ‘Big Data’ they access is primarily from financially oriented sources.
As a result, governmental policies intended to improve economic activity and the welfare of their people are often naïve and sometimes wrong.
In this world, computing experts have roles beyond the base technology.

Presentation Slides (pdf)

Time and Location: June 20th, 2pm, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.

Monday, May 23, 2016

Title: Applications of Modeling & Simulation in Pharmaceutical Development

Speaker:  Jörg Lippert, Head of Clinical Pharmacometrics at Bayer Pharmaceuticals

· Studied physics in Aachen and Paris
· PhD in Neuroscience – RWTH Aachen
· With Bayer since 2001
· Project lead of the strategic project “Systems Biology” at Bayer Technology Services
· Head of the Competence Center “Systems Biology & Computational Solutions” from its establishment in 2006 until 2011; responsible for method & software development and the consultancy business
· Since 2012, Head of Clinical Pharmacometrics at Bayer Pharmaceuticals, overseeing all Modeling & Simulation activities from preclinical development through clinical development and submission to life cycle management

Abstract: Modeling & Simulation (M&S) has evolved into a cornerstone of drug development and regulatory decision making. The prototypical development challenge is high-impact decision making under the constraint of limited information. M&S-based integration of available knowledge, information and data can help to improve the certainty of predictions and lead to better-informed decisions. In the talk, the concept will be illustrated with applications to real-world clinical development projects.

Time and Location: May 23rd, 2pm, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.

Thursday, April 21, 2016

Title: How to become a Data Hub – Preparing the ground for Big Data Analytics

Speaker:  Karina Bzheumikhova, Data Scientist at PwC. After completing her master’s degree in theoretical particle physics at the Humboldt-University of Berlin, Karina worked as an analyst for Infosys, where she gained extensive experience in large international transformation projects. Currently she works in the Digital Transformation Team at PwC. Her focus areas are Digital Architecture, Big Data and Analytics, and Digital Business Model Development. Together with the team she designed a Capability Framework that provides a clear vision of what a company needs to implement in the digital age. She has extensive experience in defining capabilities and designing reference architectures for data and big data management for large customers in various industries.

Abstract: Due to changes in customer behavior driven by disruptive technologies, companies need to transform their traditional business models and become more data-driven. In the digital age, where buzzwords like Big Data, Data Science, and Analytics are widely used, companies struggle to implement these capabilities in a structured and sustainable manner. A typical approach is to choose the technologies most prominent in the market without analyzing use cases and possible future requirements. Preparing the ground for Big Data and Analytics specifically requires a transparent, high-quality, integrated data landscape, which most companies do not have in place today. We propose an approach that starts by deriving requirements from possible future use cases, analyzes their impact on technology, processes, people and skills, and the organization, and defines an appropriate architecture.

Time and Location: April 21st, 11am, Big Data Lab Frankfurt at the Chair for Databases and Information Systems (DBIS), Goethe-University Frankfurt.