 

Keynote Lectures

Cloud Computing for Enabling Big Data Analysis Services
Domenico Talia, University of Calabria and Fuzhou University, Italy

New Horizons in IoT Workflows Provisioning in Edge and Cloud Datacentres for Fast Data Analytics: The Osmotic Computing Approach
Rajiv Ranjan, Newcastle University, United Kingdom

 

Cloud Computing for Enabling Big Data Analysis Services

Domenico Talia
University of Calabria and Fuzhou University
Italy
 

Brief Bio
Domenico Talia is a full professor of computer engineering at the University of Calabria and an adjunct professor at Fuzhou University. He is a partner of the startup DtoK Lab. His research interests include Big Data analysis, parallel and distributed data mining algorithms, Cloud computing, distributed knowledge discovery, mobile computing, distributed computing, peer-to-peer systems, and parallel programming. Talia has published ten books and about 400 papers in archival journals such as CACM, Computer, IEEE TKDE, IEEE TSE, IEEE TSMC-B, IEEE Micro, ACM Computing Surveys, FGCS, Parallel Computing, and IEEE Internet Computing, as well as in international conference proceedings. He is a member of the editorial boards of IEEE Transactions on Parallel and Distributed Systems, Future Generation Computer Systems, the International Journal on Web and Grid Services, Scalable Computing: Practice and Experience, the Journal of Cloud Computing, and Web Intelligence and Agent Systems. Talia has been a project evaluator for several international institutions, including the European Commission, AERES in France, the Austrian Science Fund, the Croucher Foundation, and the Russian Federation Government. He has served as a program chair, organizer, or program committee member of several international scientific conferences and has given many invited talks and seminars at international conferences and schools. Talia is a member of the ACM and a senior member of the IEEE.


Abstract
The growing use of service-oriented computing is accelerating the adoption of Cloud-based systems for efficient Big Data analysis. Developers and researchers are adopting the three main service models, software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS), to implement Big Data analytics solutions in the Cloud. Indeed, Cloud computing offers scalable facilities for meeting the computational and data-storage needs of Big Data analysis and for implementing data mining services. This keynote addresses the main topics and research issues in using Cloud computing platforms efficiently to implement Big Data mining services over large data sets. The talk discusses the delivery of data analysis software as a service, called data analytics as a service (DAaaS). We present data mining techniques and frameworks designed for developing service-based distributed data analytics applications on Clouds. These systems implement data set storage, analysis tools, data mining algorithms, and knowledge models as single services that are combined into distributed workflows through a visual programming interface. In particular, the talk outlines how to implement Big Data mining services on the Data Mining Cloud Framework, which is designed for developing and executing distributed data analytics applications as workflows of services. Application design and the execution of data analysis use cases are presented. Programming issues and research trends will also be introduced.
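To make the workflow-of-services idea concrete, the following is a minimal, hypothetical sketch (not the Data Mining Cloud Framework's actual API): storage, preprocessing, and mining steps are modelled as independent services that a workflow engine composes and runs in sequence. All names (`Service`, `run_workflow`, the toy steps) are illustrative assumptions.

```python
# Hypothetical sketch of a service-based data-analysis workflow: each
# analysis step is wrapped as a service, and a workflow runs the services
# in order, feeding each service's output to the next one.

class Service:
    """A workflow node wrapping a single data-analysis step."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def run(self, data):
        return self.fn(data)

def run_workflow(services, data=None):
    """Execute the services in sequence, threading the data through."""
    for service in services:
        data = service.run(data)
    return data

# Toy services: load a data set, clean it, and "mine" a summary model.
load  = Service("load",  lambda _:  [3, 1, 4, 1, 5, 9, 2, 6])
clean = Service("clean", lambda xs: [x for x in xs if x > 1])
mine  = Service("mine",  lambda xs: {"count": len(xs),
                                     "mean": sum(xs) / len(xs)})

model = run_workflow([load, clean, mine])
print(model)  # summary "knowledge model" produced by the final service
```

In a Cloud setting each `Service` would correspond to a remotely deployed component (data set store, mining algorithm, model), and the visual interface described in the abstract would generate a composition like the `[load, clean, mine]` list above.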



 

 

New Horizons in IoT Workflows Provisioning in Edge and Cloud Datacentres for Fast Data Analytics: The Osmotic Computing Approach

Rajiv Ranjan
Newcastle University
United Kingdom
 

Brief Bio
Professor Rajiv Ranjan is an Australian-British computer scientist of Indian origin, known for his research in distributed systems (Cloud computing, Big Data, and the Internet of Things). He is University Chair Professor for Internet of Things research in the School of Computing at Newcastle University, United Kingdom. He is an internationally established scientist in the area of distributed systems, having published about 300 scientific papers, and has secured more than AUD $12 million (over GBP £6 million) in competitive research grants from both public and private agencies. He is an innovator with strong and sustained academic and industrial impact and a globally recognized R&D leader with a proven track record. He serves on the editorial boards of top-quality international journals, including IEEE Transactions on Computers (2014-2016), IEEE Transactions on Cloud Computing, ACM Transactions on the Internet of Things, The Computer Journal (Oxford University Press), Computing (Springer), and Future Generation Computer Systems. He led the Blue Skies section (department) of IEEE Cloud Computing from 2014 to 2019, where his principal role was to identify and write about the most important, cutting-edge research issues at the intersection of multiple interdependent disciplines within distributed systems research, including the Internet of Things, Big Data analytics, Cloud computing, and Edge computing. He is one of the most highly cited authors in computer science and software engineering worldwide (h-index 49, g-index 130, and 14,000+ Google Scholar citations; h-index 36 and 7,600+ Scopus citations; h-index 30 and 4,900+ Web of Science citations).


Abstract

Supporting Internet of Things (IoT) workflow enactment and execution on a combination of computational resources at the network edge and in a datacentre remains a challenge. Increasing volumes of data generated by smart phones and IoT devices (which vary significantly in scope and capability) need to be processed in a timely manner. Current practice uses edge nodes (e.g. sensors or other low-capacity devices) as a means to acquire and collect data, i.e. as an "observation" mechanism, and then transmits this data to a datacentre/cloud for analysis and insight. The limitations of a large-scale, centralised datacentre, such as the speed of response for latency-sensitive applications, are increasingly being recognised, and a number of paradigms have emerged to address them, such as fog computing, edge computing, and Cloud-of-Things. All of these propose the use of dedicated servers (with varying capacity and capability) within micro/nano datacentres at the network edge, both to overcome the latency constraints associated with moving data to a central facility and to exploit the increasing, otherwise unused, computational capability within edge devices. These paradigms also closely align with work on content distribution networks (e.g. Akamai CDNs), which attempt to place data servers within one (or a small number of) network hops of end users; currently 85% of users are supported in this way, with more than 175,000 Akamai servers.

A key objective of this keynote talk is to understand how such emerging paradigms can be used to "stretch" cloud systems (supported by large-scale computational facilities) to the network edge, so that data-driven IoT workflows can be enacted efficiently over the combined infrastructure. We propose the combined use of (varying) capability at the network edge, referred to as an "Edge Datacentre" (EDC), with capability within a Cloud Datacentre (CDC). Collectively, IoT devices and edge resources, such as gateways (Raspberry Pi 3), software-defined networking systems (Huawei CloudEngine 6800), and smart phones equipped with sensors, constitute a new set of computing resources and potential components of an EDC. The keynote talk will have the following outline:

1. Overview of the research challenges involved in composing and orchestrating complex IoT workflows on cloud-edge continuum infrastructure.

2. Discussion of two case studies in the healthcare and smart-city domains, to understand how data-driven workflows can be applied to create and compose next-generation IoT applications.

3. Discussion of our experience running the United Kingdom's largest IoT infrastructure, the Urban Observatory (http://www.urbanobservatory.ac.uk/).
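The EDC/CDC trade-off described above can be illustrated with a small, hypothetical placement rule: latency-sensitive tasks with modest payloads stay at the edge, while heavy analytics flow to the cloud. The function name, thresholds, and example tasks are illustrative assumptions, not part of the talk or of any osmotic-computing implementation.

```python
# Hypothetical sketch of an edge-vs-cloud placement rule for IoT workflow
# tasks: too-large inputs exceed constrained edge capacity and go to the
# Cloud Datacentre (CDC); tight latency budgets keep a task in the Edge
# Datacentre (EDC) to avoid the WAN round trip. Thresholds are illustrative.

def place_task(latency_budget_ms, input_mb, edge_capacity_mb=64):
    """Return 'EDC' or 'CDC' for a single IoT workflow task."""
    if input_mb > edge_capacity_mb:
        return "CDC"  # payload too large for constrained edge resources
    if latency_budget_ms < 100:
        return "EDC"  # tight deadline: process near the data source
    return "CDC"      # otherwise use the elastic central facility

# Illustrative tasks: (name, latency budget in ms, input size in MB)
tasks = [
    ("fall-detection alert", 20, 0.5),     # latency-critical, tiny payload
    ("daily traffic model", 60_000, 512),  # batch analytics, large payload
    ("air-quality rollup", 5_000, 2),      # relaxed deadline, small payload
]
for name, budget, size in tasks:
    print(name, "->", place_task(budget, size))
```

A real orchestrator would of course weigh many more factors (device energy, network congestion, data gravity, security), and would migrate tasks dynamically between EDC and CDC rather than deciding once, which is the "osmotic" aspect of the approach.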



