Schedule
Flink Forward Global 2021 kicks off on October 18 with four days of Flink Forward Training Camp, featuring brand-new training sessions on Apache Flink® Development (4 days), Troubleshooting & Operations (2 days), Stateful Functions (1 day), and Advanced Flink Debugging and Performance Tuning (1 day).
Join one of the four online instructor-led sessions below.
The schedule for October 18-21 is displayed in Pacific Time (UTC-7).
The training starts at 18:00 Central European Time (UTC+1) and at 24:00 in Mainland China (UTC+8).
Deep Dive Masterclass
The masterclass runs in two parallel tracks (Track 1 and Track 2) across three sessions: Day 1 morning, Day 1 afternoon, and Day 2 morning.
Deep Dive Masterclass Trainers
Recently, he has focused mainly on Apache Paimon, a lake storage technology that unifies streaming and batch processing.
With a background primarily in engineering leadership and entrepreneurship, Ben has extensive experience building and managing engineering teams across various sectors, including logistics, gaming, and mobile applications. His expertise lies in developing solutions centered around real-world interactions, particularly in areas such as GPS-based technologies, augmented reality, and multi-user collaboration.
Outside of his professional life, Ben is an avid technology enthusiast. He enjoys playing and creating video games and trading cards. Ben balances his career with family life, being a father to two young children and a caretaker to three pet ducks.
After obtaining his PhD in Computer Science at Politecnico di Milano, where he specialized in stream processing, Lorenzo worked at InfluxData developing languages and runtimes for processing time series data. He also worked as a research engineer at Huawei Technologies (Munich Research Center) on object storage cloud services.
His main interest lies in distributed systems for data processing, storage, and retrieval.
Software Engineer on the Flink Engine team at Ververica, helping deliver core Flink and connector features to Ververica Cloud and Ververica Platform users; previously at AWS, working on its managed service for Apache Flink.
Ververica Bootcamp Program
The Ververica Bootcamp Program is an intensive training initiative that transforms Apache Flink users into proficient data processing professionals. By translating complex Flink concepts into practical exercises rooted in real-world scenarios, we empower participants to tackle their toughest data challenges. Leveraging Ververica Cloud services, participants gain a deep understanding of Flink and learn to optimize the scalability and efficiency of their cloud-based solutions. This program is not just about learning; it’s about mastering Apache Flink and leading the future of data processing.
Level Up Your Stream Processing Skills
This intensive, 2-day face-to-face program is designed for Apache Flink users with 2-4 years of experience who want to take their skills to the intermediate level. We'll delve into advanced Flink concepts and techniques, empowering you to build and deploy highly scalable and efficient real-time data processing pipelines. Leveraging Ververica Cloud services, you'll gain a deeper understanding of Flink and explore best practices for production deployments.
Target Audience:
Apache Flink users with 2-4 years of experience who are comfortable with core concepts and want to become proficient in advanced functionalities.
Key Topics
- Advanced Windowing Operations
- Time Management Strategies
- State Management Techniques
- Serialization Optimization
- Exactly-Once Processing
- Fault Tolerance
- Enrichment Techniques
- Scalability Optimization
- Flink SQL Functions
- Table API Features
- Workflow Design
- Using Paimon Effectively
Learning Outcomes
Master Advanced Windowing Operations in Apache Flink:
- Understand and implement session windows, tumbling/sliding windows with triggers, and time management strategies (Event Time, Processing Time, Ingestion Time).
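For a flavor of what this looks like in practice, here is a minimal DataStream API sketch of an event-time session window; the `Event` type, field names, and timing values are purely illustrative and not taken from the course material:

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.EventTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class SessionWindowSketch {

    // Hypothetical event POJO: user id, value, and an epoch-millisecond timestamp.
    public static class Event {
        public String userId;
        public long value;
        public long timestamp;

        public Event() {}

        public Event(String userId, long value, long timestamp) {
            this.userId = userId;
            this.value = value;
            this.timestamp = timestamp;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Event> events = env.fromElements(
                new Event("alice", 1, 1_000L),
                new Event("alice", 2, 5_000L),
                new Event("bob", 3, 2_000L));

        events
                // Event time with up to 5 seconds of out-of-order events.
                .assignTimestampsAndWatermarks(
                        WatermarkStrategy.<Event>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                                .withTimestampAssigner((event, ts) -> event.timestamp))
                .keyBy(event -> event.userId)
                // A session closes after 30 seconds of inactivity for a given key.
                .window(EventTimeSessionWindows.withGap(Time.seconds(30)))
                .sum("value")
                .print();

        env.execute("Session window sketch");
    }
}
```

Swapping `EventTimeSessionWindows` for `TumblingEventTimeWindows` or `SlidingEventTimeWindows`, or attaching a custom trigger, covers the other window types mentioned above.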
Optimize State Management for High Performance in Flink Applications:
- Apply advanced state management techniques including state partitioning and RocksDB integration.
- Optimize state size and access patterns for enhanced performance.
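As a rough illustration of keyed state plus RocksDB, the sketch below keeps a running total per key in a `ValueState` and enables the embedded RocksDB state backend with incremental checkpoints; the class names, keys, and interval are hypothetical:

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.contrib.streaming.state.EmbeddedRocksDBStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class KeyedStateSketch {

    // Keeps one Long of keyed state per user and emits a running total.
    public static class RunningTotal
            extends KeyedProcessFunction<String, Tuple2<String, Long>, Tuple2<String, Long>> {

        private transient ValueState<Long> total;

        @Override
        public void open(Configuration parameters) {
            total = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("total", Long.class));
        }

        @Override
        public void processElement(Tuple2<String, Long> in, Context ctx,
                                   Collector<Tuple2<String, Long>> out) throws Exception {
            long current = total.value() == null ? 0L : total.value();
            current += in.f1;
            total.update(current);   // state is partitioned by key automatically
            out.collect(Tuple2.of(in.f0, current));
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // RocksDB keeps large keyed state on local disk; incremental checkpoints
        // upload only the changes made since the previous checkpoint.
        env.setStateBackend(new EmbeddedRocksDBStateBackend(true));
        env.enableCheckpointing(60_000);

        env.fromElements(Tuple2.of("alice", 10L), Tuple2.of("alice", 5L), Tuple2.of("bob", 7L))
                .keyBy(value -> value.f0)
                .process(new RunningTotal())
                .print();

        env.execute("Keyed state sketch");
    }
}
```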
Improve Workflow Performance via Advanced Serialization Techniques:
- Learn how to reduce time spent serializing and deserializing data for data sources and sinks (connectors) and over the network.
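One common technique in this area is making sure record types are recognized as Flink POJOs and failing fast when a type would fall back to the slower generic Kryo serializer. A minimal, illustrative sketch (the `Reading` type is hypothetical):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SerializationSketch {

    // A well-formed POJO (public no-arg constructor, public fields or getters/setters)
    // is handled by Flink's PojoSerializer instead of the generic Kryo fallback.
    public static class Reading {
        public String sensorId;
        public double value;

        public Reading() {}

        public Reading(String sensorId, double value) {
            this.sensorId = sensorId;
            this.value = value;
        }
    }

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Fail at job submission if any type would silently fall back to Kryo,
        // so serialization cost shows up during development rather than in production.
        env.getConfig().disableGenericTypes();
    }
}
```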
Deep Dive into Exactly-Once Processing and Failure Recovery:
- Understand the differences between at-least-once, exactly-once, and exactly-once end-to-end. Learn how to effectively use exactly-once processing when faced with bad data, infrastructure failures, and workflow bugs.
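As a small illustration of the Flink side of this, the sketch below enables exactly-once checkpointing; the interval values are arbitrary, and end-to-end guarantees also depend on the sources and sinks used:

```java
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ExactlyOnceSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Take a consistent snapshot of all operator state every 60 seconds;
        // on failure, Flink rolls back to the last snapshot and replays the source,
        // so each event affects state exactly once.
        env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);

        // Leave breathing room between checkpoints so normal processing is not starved.
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(30_000);

        // This guarantees exactly-once for Flink state. Exactly-once end-to-end
        // additionally requires replayable sources and transactional or idempotent sinks.
    }
}
```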
Develop Complex Real-Time Pipelines:
- Build a workflow that processes a continuous stream of events to generate both dashboard and analytics results.
- Learn how best to enrich data from a variety of data sources.
- Optimize complex workflows using pre-filtering, pruning, async I/O, broadcast streams, parallel partial enrichments, and other techniques.
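To give a concrete feel for one of these techniques, here is an illustrative async I/O enrichment sketch; the lookup logic is a placeholder (a `CompletableFuture` standing in for a real non-blocking client), and the timeout and capacity values are arbitrary:

```java
import java.util.Collections;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.async.ResultFuture;
import org.apache.flink.streaming.api.functions.async.RichAsyncFunction;

// Hypothetical enrichment: look up extra data for each order ID without blocking
// the operator thread; up to 100 requests may be in flight at the same time.
public class AsyncEnrichment extends RichAsyncFunction<String, String> {

    @Override
    public void asyncInvoke(String orderId, ResultFuture<String> resultFuture) {
        // The supplier below is a placeholder for any non-blocking client call
        // (database, REST service, cache, ...).
        CompletableFuture
                .supplyAsync(() -> orderId + ":profile")
                .thenAccept(profile -> resultFuture.complete(Collections.singleton(profile)));
    }

    public static DataStream<String> enrich(DataStream<String> orders) {
        // Unordered results maximize throughput; ordered mode is also available.
        return AsyncDataStream.unorderedWait(
                orders, new AsyncEnrichment(), 5, TimeUnit.SECONDS, 100);
    }
}
```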
Use Flink SQL & Table APIs to Implement Workflows:
- Utilize the advanced functionalities of Flink SQL, including UDFs and Table Functions, and master the Flink Table API for unified data transformations and real-time analytics.
- Compare and contrast the resulting workflow with the Java API.
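For illustration, the sketch below registers a simple scalar UDF and calls it from Flink SQL; the `MASK` function, the `orders` table, and the use of the datagen connector are assumptions made for this example, not part of the course material:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;

public class SqlUdfSketch {

    // Minimal scalar UDF: mask all but the last four characters of a string.
    public static class Mask extends ScalarFunction {
        public String eval(String s) {
            if (s == null || s.length() <= 4) {
                return s;
            }
            return "*".repeat(s.length() - 4) + s.substring(s.length() - 4);
        }
    }

    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        tEnv.createTemporarySystemFunction("MASK", Mask.class);

        // A bounded datagen table stands in for a real connector such as Kafka.
        tEnv.executeSql(
                "CREATE TEMPORARY TABLE orders (card_number STRING) " +
                "WITH ('connector' = 'datagen', 'number-of-rows' = '5')");

        tEnv.executeSql("SELECT MASK(card_number) AS masked FROM orders").print();
    }
}
```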
Design Optimized Workflows:
- Learn about situations where splitting a workflow into multiple components improves efficiency and reduces operational complexity.
- Learn how to use Paimon as an efficient and low-overhead data bridge between workflows.
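As a hedged sketch of how such a bridge can be wired up, the example below registers a Paimon catalog from Flink SQL and creates a table that one workflow could write and another could read as a changelog stream; it assumes the paimon-flink connector jar is on the classpath, and the warehouse path and table schema are illustrative:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PaimonBridgeSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Register a Paimon catalog backed by a local warehouse path (in production
        // this would typically be object storage such as S3 or OSS).
        tEnv.executeSql(
                "CREATE CATALOG paimon_catalog WITH (" +
                "  'type' = 'paimon'," +
                "  'warehouse' = 'file:/tmp/paimon')");
        tEnv.executeSql("USE CATALOG paimon_catalog");

        // An upstream workflow writes enriched events into this Paimon table ...
        tEnv.executeSql(
                "CREATE TABLE IF NOT EXISTS enriched_events (" +
                "  event_id STRING," +
                "  payload STRING," +
                "  PRIMARY KEY (event_id) NOT ENFORCED)");

        // ... and a downstream workflow reads the same table as a changelog stream,
        // using Paimon as the low-overhead bridge between the two jobs, e.g.:
        // tEnv.executeSql("SELECT * FROM enriched_events").print();
    }
}
```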