We would like to say thank you and goodbye to Matteo, who will complete his visiting scholarship at the Frankfurt Big Data Lab at the end of April.
“It has been a pleasure having you with us.”
“I’m a PhD student from Perugia (Italy), and I spent five months at the Frankfurt Big Data Lab thanks to the Erasmus+ Programme. During my stay I grew both professionally and personally.
The welcome was incredibly warm, and from the first days I felt like part of the team. I learned a lot from everyone about how to work in academic research.
Furthermore, I had the chance to experience the exciting environment of the tech industry and startups in Frankfurt by participating in seminars and other events at the university and at startups.
Finally, I got to know German culture as well as many others, since here you can meet people from everywhere.
I’m looking forward to coming back as soon as possible.”
On April 19 we had a very successful start for the Data Challenges 2018.
This year, the Data Challenges 2018 are held with Deutsche Bahn (DB) and Procter & Gamble (P&G). We ask students of Goethe University to develop innovative ideas for Smart Cities, Smart Life, Smart Logistics and Smart Supply Chains.
The presentations from the kick-off are online:
- Introduction to the Data Challenges 2018
- Introduction to DB Data Challenge
- Introduction to P&G Data Challenge
- Data Challenge presentation by Deutsche Bahn
- Data Challenge presentation by Procter & Gamble (P&G)
We have successfully completed the first two-day workshop of the LeMO H2020 project, “Setting the Stage on #BigData in #Transportation”, held at the Frankfurt Big Data Lab on 22–23 January 2018.
Transport researchers and policy makers today face several challenges as they work to build efficient, safe, and sustainable transportation systems. From rising congestion to growing demand for public transit, the travel behaviour and transportation preferences of city dwellers are changing fast.
The Leveraging Big Data to Manage Transport Operations (LeMO) project addresses these issues by investigating how Big Data can be used to enhance the economic sustainability and competitiveness of the European transport sector. The project studies and analyses Big Data in the European transport domain, in particular with respect to five transport dimensions: mode, sector, technology, policy and evaluation. LeMO will accomplish this by conducting a series of case studies in order to provide recommendations on the prerequisites for effective Big Data implementation in the transport field.
Link to the project website: https://lemo-h2020.eu/overview/
We would like to welcome Matteo Pergolesi, who is visiting our Lab until the end of March 2018.
Matteo Pergolesi received his master’s degree cum laude in Information and Automation Engineering from the University of Perugia in 2015. He is now a Ph.D. candidate in Industrial and Information Engineering at the University of Perugia, advised by Professor Gianluca Reali. His research interests focus on Big Data analysis for networking.
Matteo will work with us on the DataBench project: http://www.bigdata.uni-frankfurt.de/databench-project/
Professor Roberto V. Zicari spoke on 20.11.2017 at the SIUFrankfurt event: “The Human Side of AI”
In this SIUTranslations event in Frankfurt, titled “The Human Side of AI”, Prof. Dr. Gregory Wheeler and Prof. Dott. Ing. Roberto Zicari discussed how Artificial Intelligence (AI) is transforming our society, what the current and near-future applications of AI are, and how humans can use these technologies more ethically and intelligently.
The Science Innovation Union (www.science-union.org) is a translation-in-science communication and training platform. It is committed to inspiring entrepreneurship among young scientists from any background, and to boosting innovation by bridging the gap between industry, government and academia, with the aim of translating innovative science into disruptive business. We are a young and exciting organisation, and we have big plans for this year.
Author: Vanessa Hübner. Edited by: Burcu Anil Kirmizitas
November 20th, 2017 was a cold and rainy day. In fact, it was a typical day that tempts one to sit under a warm blanket and sip hot tea at home. But SIU Frankfurt organised another exciting SIUTranslations event, “The Human Side of AI: The Role of Artificial Intelligence in Society”. The event was fully booked: more than 80 interested attendees swapped their warm blankets and hot tea for enlightening talks and delicious wine while networking afterwards. And they were not disappointed: at this event, SIU members learned about the shift from ‘logical’ to ‘numerical’ artificial intelligence, its current and future applications, and how these technologies can be used more ethically for the benefit of all. Who presented?
Two Frankfurt experts in the fields of artificial intelligence and big data: Prof. Gregory Wheeler and Prof. Roberto V. Zicari.
“What’s so deep about deep learning?”
Gregory Wheeler is a professor of philosophy and computer science at the Frankfurt School of Finance and Management. He has been a research scientist at several universities and institutes in the USA, Germany and Portugal, is the author of many scientific publications and books, and is an editor of several scientific journals. His interests cover “philosophy, artificial intelligence, statistics and cognitive science”.
He explained that the goal of artificial intelligence is to develop a system that has the capability to either think or act like a human. But how can this be achieved? The first approach, making a system think, involves defining rules to manipulate given representations: the system is made capable of perceiving objects, understanding sentences and evaluating situations, and thus of acting. The second, more feasible approach starts from a given goal that the system should achieve, such as perceiving an object, and involves picking representations that are incorporated into an algorithm. Machine learning falls within the second approach: the system is programmed to pursue a concise goal, and it learns “from data without being explicitly programmed to do so”. This strength makes machine learning capable of touching every part of our lives, by processing vision and language, by improving robotics, science and medicine, and by being used in government and commerce. The underlying principles come from computer science and statistics.

However, as Gregory explained, artificial intelligence is not a ‘new’ phenomenon. It is rather an old concept, initiated in 1956 at the Dartmouth Conference, which long followed the ‘logical’ approach to artificial intelligence. This approach tries to tackle artificial intelligence by making the system think: numerous representations guide systems to solutions, e.g. Deep Blue, the chess-playing computer. The emergence of large data sets at the beginning of the new millennium shifted the ‘logical’ approach to a ‘numerical’ concept of artificial intelligence: large data sets expose interactions between variables and make it possible to predict certain outcomes. This allows a system to act. However, since we are at the beginning of this new, promising approach, a long way of research remains before it reaches its goal of systems that act like a human.
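Gregory’s contrast between ‘logical’ and ‘numerical’ AI can be sketched in a few lines of Python. The example below is purely illustrative (the spam rule, the toy data and all function names are our own, not from the talk): the first function is explicitly programmed with a hand-written rule, while the second estimates its parameters from example data.

```python
# 'Logical' approach: a hand-written rule manipulates an explicit representation.
def rule_based_spam(words):
    # The programmer encodes the knowledge directly as a rule.
    return "lottery" in words or "prize" in words

# 'Numerical' approach: parameters are estimated from data rather than
# being explicitly programmed (here, a one-feature least-squares fit).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# The 'model' (slope and intercept) is learned from examples,
# not written by hand:
slope, intercept = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
# slope == 2.0, intercept == 0.0
```

The same division of labour scales up: Deep Blue’s search over representations is the rule-based style writ large, while deep learning is the data-driven style with far more parameters and far more data.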
Therefore, Gregory urged the audience to appreciate the long-term objectives of artificial intelligence and not to fall into the trap described by Amara’s Law: “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
“Big Data & The Great AI Awakening”
Roberto V. Zicari is a full professor and the founder of the Frankfurt Big Data Lab. For many years he was the Director of the Goethe Unibator, a Frankfurt network supporting bright minds in translating their innovative ideas into market-ready products, and a member of the Global Venture Lab network and the Object Management Group. He is the editor of ODBMS.org, an internet platform covering big data and new trends in data management and data science, and of its blog. Roberto has a sound scientific publishing record in the field of data science and has spent time as a visiting scientist in numerous labs in the USA, Switzerland, Mexico and Denmark. He is also an internationally recognised expert in database and information systems.
Roberto’s talk focussed on the ethical and societal implications of big data and artificial intelligence. He explained that having and handling big data has made the recent developments in artificial intelligence possible. Companies with big data pools, such as Google, Facebook, Microsoft and Apple, can therefore feed specific algorithms with these data to send personalised ads to prospective customers. These ads are based on decisions that are better on average, increasing the odds of making a sale. However, similar algorithms can be implemented to make higher-stakes decisions, such as medical diagnoses, loan approvals, hiring and crime prevention. Here, better on average is no longer good enough: with higher-stakes decisions, individuals’ lives are affected. These crucial decisions require accuracy, fairness and discrimination between different conditions.

This leads to the notion of an auditing tool that makes the technology able to explain itself, its data-driven algorithm and its decisions, and to the questions of how much transparency we desire and whether we wish to have a “human in the loop”. But the involvement of humans in these systems bears further ethical implications: Who would be responsible for the consequences of these decisions? What are the human motivations to interfere in the process of these decisions? Who would regulate these motivations? Is it realistic? And how can human involvement steer the use of big data and artificial intelligence for the common good? For this reason, Roberto and Andrej Zwitter, a professor at the University of Groningen, the Netherlands, initiated the Data for Humanity initiative with the goal to “bring people and institutions together who share the motivation to use data for the common good”. With this initiative, they emphasise not only the need for stronger collaboration between researchers and decision makers in charities and government, but also the urgency of the following ethical rules when using data:
- Do no harm
- Use data to help create peaceful coexistence
- Use data to help vulnerable people and people in need
- Use data to preserve and improve the natural environment
- Use data to help create a fair world without discrimination
Finally, Roberto emphasised that although software designers have a distinctive ethical responsibility, we are all responsible for the use of big data and artificial intelligence: from employees to software developers to politicians to associations. He reiterated that there is a big gap between the idea of doing something good for society and actually doing it. Roberto stressed that this gap will be bridged only if everyone becomes aware in their daily life and takes responsibility, finally becoming an active participant in the development and progress of artificial intelligence for the common good.
This SIUTranslations event was documented in an amazing photo series by the Frankfurt photographer Nikolay Nikolov (www.blindspoteurope.weebly.com). Photos and videos of the event can be viewed on our Facebook page “Science Innovation Union”.
Frankfurt Big Data Lab and Campus Fryslân launch research and education partnership
The Frankfurt Big Data Lab (Goethe University Frankfurt) and the Data Research Centre, Campus Fryslân (University of Groningen), signed a Memorandum of Understanding (MoU) with the goal of working together in research and education related to Big Data and Data Science. The cooperation includes staff exchange as well as joint research and industry projects.
A first joint endeavour is a hands-on training course on Big Data and Data Science to be held in Leeuwarden from 25 to 28 September 2017. The workshop will target IT experts from tech companies, SMEs and educational organisations in the Fryslân region.
Future joint undertakings will include executive courses and research projects.
The team of two students, Patrick Klose and Nicolas Pfeuffer, who won our Data Challenge 2017 with Deutsche Bahn (http://aktuelles.uni-frankfurt.de/studium/wettbewerb-zeichnet-studierende-fuer-intelligentes-datenmanagement-aus/), also won the DB & JRE Open Data Hackathon held on 12–13 May 2017 in Berlin (~200 participants): https://www.mindboxberlin.com/index.php/db-hackathon-may-2017.html
In addition, they were invited (fully financed) by the East Japan Railway Company (JRE) to travel to Japan to work together on JRE’s systems and perhaps implement their idea there.
“Being confronted with current real-world problems allowed us to extend our theoretical knowledge with valuable experience in developing innovative solutions. In our opinion, events of this kind should be offered more often in the future.” – Patrick Klose and Nicolas Pfeuffer
“Corpus Nummorum Thracorum – Klassifizierung der Münztypen und semantische Vernetzung über Nomisma.org”
Dr. Karsten Tolle, Director of the Frankfurt Big Data Lab, has been awarded DFG funding for 36 months to conduct a research project in the area of Digital Humanities.
The project, called „Corpus Nummorum Thracorum – Klassifizierung der Münztypen und semantische Vernetzung über Nomisma.org“ (classification of the coin types and semantic linking via Nomisma.org), will use Linked Open Data to create a typology for the coins of ancient Thrace.
The project is a cooperation between the Frankfurt Big Data Lab (Dr. Karsten Tolle), the “Berlin-Brandenburgische Akademie der Wissenschaften” (Prof. Dr. Dr. hc. mult. Martin Grötschel) and the “Münzkabinett – Staatliche Museen zu Berlin” (Prof. Dr. Bernhard Weisser).
Dr. Tolle will introduce the project with a talk on Wednesday, May 17, 2017 at 2.30pm at the Frankfurt Big Data Lab.
We are glad to share that Todor Ivanov was invited to teach a two-week hands-on course on Data Science Tools as part of the new Master in Data Science at the University of Perugia, Italy: http://masterds.unipg.it/en/index.html
The two-week course (20.03.2017–30.03.2017) combined short lectures with practical hands-on labs on the following topics:
- Introduction to Hadoop and the Hadoop Ecosystem: HDFS, YARN and MapReduce
- Data Movement with Apache Sqoop and Apache Flume
- Managing Data with Impala and Hive
- Data Processing with Spark
- Introduction to Spark MLlib and Spark SQL
- Introduction to NoSQL and MongoDB
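As a flavour of the MapReduce model covered in the first topic, here is a minimal word-count sketch in plain Python (word count is the canonical MapReduce example; the data and function names are illustrative, not from the course materials). Hadoop would distribute the map, shuffle and reduce phases across a cluster; here both phases run locally to show the data flow.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["Hadoop and Spark", "Spark SQL and Spark MLlib"]
counts = reduce_phase(map_phase(lines))
# counts["spark"] == 3, counts["and"] == 2
```

The same program written against Spark would replace the two functions with a `flatMap` over the words followed by a `reduceByKey`, which is one reason Spark featured so prominently in the second week of the course.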