Think Summit Italy 2021
What has happened in technology over the last twelve months? How will the digital transformation process that is ongoing everywhere in the world affect your skills and work? Here comes the 2021 edition of Think Summit Italy, an in-depth technical panel dedicated to developers, IT professionals, and high-level managers.
Think Summit Italy 2021 will be held on October 20th and 21st. It is a unique opportunity to return to challenging keynotes, hear from innovative institutions and industry experts, and be inspired by the success stories of Italian and international companies.
Starting with the most recent innovations in Hybrid Cloud and AI contexts, the event will analyze how these can be applied to accelerate the transformation of companies’ businesses from a perspective of sustainable growth, addressing the challenges and embracing the opportunities of the current market context.
Ways to participate and details relating to the agenda, speakers, and a timetable of scheduled sessions will soon be announced.
Key technologies that have changed or are newly developed will be demonstrated by IBM professionals, guiding the audience through interactive practical experiences related to the key themes. Hands-on workshops will enable participants to combine theory and practice.
The event will be an opportunity to improve skills, network with industry professionals, share experiences and become familiar with IBM technologies.
Focusing on IT practitioners and developers, the Code@Think session (6 p.m. on October 21st) will allow the audience to get hands-on with the very latest in tech from containers to AI-powered automation.
Think Summit Italy 2021 will provide a deep understanding of the various approaches to modernizing and building apps in a hybrid cloud world, so as to bring the speed and interoperability of open, cloud-native tech within organizations.
Focusing on the Code@Think sessions, four subjects are in the spotlight: real-world Data Science, Containerization, AI in cognitive applications, and Robotic Process Automation. All of these primary subjects are covered in permanently available courses on the Coursera MOOC platform: a shortlist of the subjects that will be covered during the summit can be found at the end of this article.
Let’s take a closer look at the four spotlit topics.
1. Data science applied to a real-life scenario
What is data science, and what are the various aspects of a data scientist’s job? Data science underpins many activities in today’s digital world; to approach tasks like a data scientist, one must master the field’s frameworks and tools, along with a way of thinking that differs from other disciplines in computer science.
Data science is an interdisciplinary field that uses scientific methods, processes, algorithms, and systems to extract knowledge and insights from structured and unstructured data, and to apply knowledge and actionable insights from data across a broad range of application domains.
Therefore, data and business understanding are always at the center of a Data Science project: building appropriate skills in dataset management is essential for any Data Scientist who wishes to accomplish common tasks (such as importing and cleaning data sets, analyzing and visualizing data, building machine learning models and pipelines) and today’s Data Scientists can leverage many powerful tools to perform each of these operations.
However, these capabilities are not always enough to guarantee the success of a project: even when individual tasks appear simple, things can get complicated very quickly as the project evolves through various iterations, new goals are set, and the time to deployment stretches out.
Consequently, the right methodology, supported by tool integration, really makes a difference when it comes to simplifying exploratory experiments that aim to enhance prototype quality while drastically reducing the time to deployment.
One viable learning path is to develop hands-on skills using the tools, languages, and libraries used by professional data scientists.
Data is at the center of this approach, so dataset management is vital: importing and cleaning data sets, analyzing and visualizing data, building machine learning models and pipelines are all necessary steps in completing a project and publishing a report.
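The workflow described above can be sketched in a few lines of Python with pandas and scikit-learn. The dataset here is synthetic (loosely inspired by the wildfire use case discussed below) rather than any dataset used at the summit, but the four steps it walks through — import, clean, analyze, model — are the ones listed above.

```python
# Minimal sketch of a data science workflow: import, clean, analyze, model.
# The data is synthetic; a real project would start from a CSV or a database.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# 1. Import: in practice, something like pd.read_csv("observations.csv")
df = pd.DataFrame({
    "temperature": rng.normal(25, 5, 200),
    "humidity": rng.uniform(10, 90, 200),
})
df["fire_risk"] = ((df["temperature"] > 27) & (df["humidity"] < 40)).astype(int)

# 2. Clean: drop duplicates and rows with missing values
df = df.drop_duplicates().dropna()

# 3. Analyze: quick summary statistics before modeling
print(df.describe())

# 4. Model: a pipeline bundles preprocessing and the estimator together,
#    which keeps exploratory iterations reproducible
X_train, X_test, y_train, y_test = train_test_split(
    df[["temperature", "humidity"]], df["fire_risk"], random_state=0
)
model = Pipeline([("scale", StandardScaler()),
                  ("clf", LogisticRegression())])
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

Bundling the scaler and the classifier into a single pipeline is one small example of the tool integration mentioned above: the whole object can be refit, evaluated, and eventually deployed as a unit.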
An example of how important data science is today can be found in the Call for Code Spot Challenge for Wildfires in Australia. Wildfires are among the most common forms of natural disaster in some regions, including Siberia, the United States, and Australia. It is important to improve wildfire forecasting in order to help firefighters both prepare and respond, and to mitigate wildfires in the future.
Three teams competed, each taking a different approach: Data Warriors chose a CNN (convolutional neural network), Yau_Yee_Italy compared three models (XGBoost, random forest, and LightGBM), and NA utilized linear regression and Excel-based pivot-and-smash. Their respective experiences drove each team towards a different approach, underlining the open-mindedness and rich tool knowledge required of a data scientist.
2. Container platform
Containerization is likely the most significant invention in IT since the introduction of virtualization. Open source projects like Kubernetes and products like Red Hat OpenShift have become standards for users looking to deploy and manage containers at scale.
Everyone from small startups to large multinational corporations is transitioning to these technologies, and they are looking for employees who are skilled in these areas.
Knowing how to create and deploy an application on a container-based platform is essential for developers, but Director-level team members are also interested in this branch of knowledge. Everybody who advises on, consults on, builds, moves, or manages cloud solutions should be aware of how these are deployed today.
Basic computer literacy and a foundational knowledge of cloud computing are more than enough to understand the mechanics of containerization.
Nowadays, cloud computing is the central topic for every IT company; features like fault-tolerance, scaling, availability, and resiliency are required for and by client businesses.
These features are provided by container orchestration and service-mesh tools like Docker Swarm, Kubernetes, Red Hat OpenShift, and Istio. These tools are changing the way that applications are developed, from monolithic to microservices patterns (also called the cloud-native approach), and make different kinds of deployments possible, such as public, private, or hybrid cloud.
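To make the orchestration idea concrete, the sketch below builds a Kubernetes Deployment manifest as a Python dictionary. The application name and container image are hypothetical; in practice the equivalent YAML would be handed to `kubectl apply` (or deployed through the OpenShift console), and the orchestrator would keep the requested number of replicas running.

```python
# Sketch of a Kubernetes Deployment manifest, expressed as a Python dict.
# The app name and image are invented for illustration; kubectl would
# consume the equivalent YAML to run and scale the containerized app.
import json

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "demo-app"},
    "spec": {
        # Orchestration in one line: keep three copies running for
        # scaling and resiliency; the platform replaces failed pods.
        "replicas": 3,
        "selector": {"matchLabels": {"app": "demo-app"}},
        "template": {
            "metadata": {"labels": {"app": "demo-app"}},
            "spec": {
                "containers": [{
                    "name": "demo-app",
                    "image": "registry.example.com/demo-app:1.0",  # hypothetical
                    "ports": [{"containerPort": 8080}],
                }]
            },
        },
    },
}

print(json.dumps(deployment, indent=2))
```

The `replicas` field is what separates orchestration from plain containerization: the developer declares a desired state, and Kubernetes or OpenShift continuously reconciles reality against it.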
Learning is easier when hands-on experience is on offer, as is demonstrated in Coursera’s classes on these tools. From getting started with Docker, orchestration and scaling with Kubernetes, to simplifying deployments with OpenShift, all of these actions are now performed through a web browser on IBM Cloud.
3. Cognitive application based on Watson services
Cognitive application understanding and development is not limited to developers or data scientists. In fact, Business Analysts, Line of Business managers, and C-Suite executives are all turning to AI because, in order to compete, they need to be able to innovate at speed. Their goals include:
- Humanizing customer experiences
- Predicting and shaping future outcomes
- Empowering people to focus on higher-value work
- Supporting human capital in their efforts to reimagine new business models by infusing intelligence into their workflows
For many organizations, the strategy that will most help them in achieving these goals is to start infusing AI throughout the organization’s processes, and then work backwards on collecting, organizing, and analyzing data.
This is possible thanks to prebuilt AI applications that can easily be adapted to any specific business situation. Conversational assistants are a good place to start. Many products, including Watson Assistant, allow a customer service agent to be built with minimal expertise and effort.
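The core idea behind such assistants is a mapping from detected user intents to responses. The toy sketch below illustrates that concept only — it is not the Watson Assistant API, and the intents and keywords are invented; a real product would train an ML classifier for intent detection rather than match keywords.

```python
# Toy illustration of the intent -> response pattern behind conversational
# assistants such as Watson Assistant. Keyword matching stands in for a
# trained intent classifier; all intents and phrases here are invented.
import re

INTENTS = {
    "greeting": {"keywords": {"hello", "hi", "morning"},
                 "response": "Hello! How can I help you today?"},
    "opening_hours": {"keywords": {"hours", "open", "close"},
                      "response": "We are open Monday to Friday, 9am to 6pm."},
}
FALLBACK = "Sorry, I did not understand. Could you rephrase?"

def detect_intent(utterance: str):
    """Return the name of the first intent whose keywords appear."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    for name, intent in INTENTS.items():
        if words & intent["keywords"]:
            return name
    return None

def reply(utterance: str) -> str:
    intent = detect_intent(utterance)
    return INTENTS[intent]["response"] if intent else FALLBACK

print(reply("hi there"))            # greeting intent
print(reply("When do you open?"))   # opening_hours intent
print(reply("something unrelated")) # falls back
```

The fallback response is the seam where a Knowledge Base search (the role Watson Discovery plays in the session described above) would typically be plugged in.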
Hands-on experience will be given in this field during the summit: developing a virtual agent on Watson Assistant, enlarging its Knowledge Base with Watson Discovery, enriching user experience with Speech To Text and Text To Speech, and extending through to a multi-channel deployment.
4. RPA: Robotic Process Automation
AI is key to successful process automation today. Many businesses can take advantage of their data better by implementing an AI-powered automation approach. This recent application field must be tackled with care and due attention from the start.
Moving from automating simple back-office tasks to full-scale automation that will manage time-consuming business processes can be a challenge.
IBM’s Robotic Process Automation offering helps users to automate multiple business and IT processes at scale, with the ease and speed of traditional RPA. Software robots, or bots, can perform operations that leverage artificial intelligence (AI) insights to complete tasks with no latency, enabling the user to realize their digital transformation goals.
Understanding how to leverage AI inside multiple bot executions is a particularly important skill that contributes to a consistently-increasing return on investment by freeing qualified personnel from hidden repetitive tasks to focus on scaling up the business.
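The kind of hidden repetitive task mentioned above can be pictured with a small pure-Python sketch: a "bot" that scans documents, extracts a field, and tallies the results. The invoice lines are invented, and IBM RPA Studio would express this as configurable bot commands rather than hand-written code; the point is only the shape of the task being automated.

```python
# Pure-Python illustration of a repetitive back-office task an RPA bot
# automates: scanning documents, extracting a field, tallying results.
# The invoice data is invented for illustration.
import re

INVOICES = [
    "Invoice 001 - Customer: ACME - Total: EUR 1200.50",
    "Invoice 002 - Customer: Globex - Total: EUR 340.00",
    "Invoice 003 - Customer: Initech - Total: EUR 89.99",
]

def extract_total(line: str) -> float:
    """Pull the numeric total out of one invoice line."""
    match = re.search(r"Total:\s*EUR\s*([\d.]+)", line)
    if match is None:
        raise ValueError(f"no total found in: {line!r}")
    return float(match.group(1))

totals = [extract_total(line) for line in INVOICES]
print(f"processed {len(totals)} invoices, grand total EUR {sum(totals):.2f}")
```

A person doing this by hand gains nothing from the repetition; a bot doing it frees that person for higher-value work, which is exactly the return-on-investment argument above.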
5. IBM learning path on the Coursera MOOC and the IBM platform
To help interested parties to take classes in the best possible way, IBM has brought their expertise in these skill areas to the Coursera MOOC (Massive Open Online Courses) platform.
A learning path towards in-depth mastery of all the subjects spotlit in the Code@Think sessions can be organized in the same manner as the four events.
5.1 Data Science Methodology Applied to a Real-Life Scenario
This track will focus on the development of a Data Science project that arises from a real use case. Participants will be required to complete all project phases, from data wrangling to model deployment, applying the right methodology and exploiting the potential of Cloud Pak for Data.
IBM AI Enterprise Workflow specialization and the IBM Data Science Professional Certificate are the suggested classes on the Coursera MOOC platform for this track.
5.2 Red Hat OpenShift: Deployment of containerized applications
Do you want to learn more about how to deploy an application in the cloud? This track involves an in-depth exploration of OpenShift and containers. Discover how easy and fast it can be to deploy and manage containerized applications by leveraging IBM Cloud services.
Introduction to Containers with Docker, Kubernetes, and OpenShift is part of the IBM Full Stack Cloud Developer Professional Certificate hosted on Coursera.
5.3 From “chatbot” to the AI-based customer care revolution!
This path starts with developing a virtual agent based on Watson Assistant, then enlarging its Knowledge Base using native integration with Watson Discovery. Eventually, user experience will be enriched with Speech To Text and Text To Speech and Watson Translator, extending the agent through a multi-channel deployment.
Building AI Applications with Watson APIs is a Coursera class dedicated to the use of Watson services to build a cognitive application.
5.4 Develop and test an intelligent software bot using IBM Robotic Process Automation
Start by developing a simple bot with IBM RPA Studio, then add more capabilities using native commands configuration. This track leads to the release of a complete bot to automate more complex tasks.
All recommended courses can be found on the Coursera site, or on this IBM page.
October is not so far away. Don’t miss the chance to meet at Code@Think during the IBM Think Summit Italy, at 6 p.m. CEST on October 21st.