Education & Training
Classroom Training
Experienced trainers provide hands-on training in well-equipped classrooms at our training centers. Trainees interact live with instructors and with experienced professionals working on similar technologies, and gain real-time project experience through the training program.
Hybrid
Looking for on-campus training for your learners? ITVision360 conducts customized training sessions for targeted teams of learners at the client's location, as per client requirements. Our expert instructors deliver onsite training and help students learn on live projects in their own environment.
Online Training
The online curriculum is delivered in real time by the instructors. Students learn from their own location without traveling to classrooms. After every session, assignments are given for practice at home. Training coordinators monitor the completion of assignments through online tools.
Salesforce Syllabus
- Salesforce Standard Objects
- Custom Objects
- Tables and Types of Relationships
- Master-Detail
- Lookup
- Many-to-Many
- Tabs
- Fields and Page layouts
- Sandboxes
- Production
- Testing
- Change sets
- External tools
- Business use
- Data management
- Who can access data
- Users
- Profiles – Standard & Custom
- Organization-Wide Defaults (OWD)
- Sharing rules and levels
- Roles
- Public Groups
- Manual Sharing
- Record types
- Business Automation
- Process builder
- Flows
- Workflows
- Validation rules
- Report types
- Tabular
- Summary
- Matrix
- Joined
- Dashboards
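As a taste of the Validation rules topic above, here is a typical rule formula. It is a sketch using standard Opportunity fields (StageName, Amount); the formula blocks marking an Opportunity "Closed Won" while its amount is still blank:

```text
AND(
    ISPICKVAL(StageName, "Closed Won"),
    ISBLANK(Amount)
)
```

When the formula evaluates to true, Salesforce rejects the save and shows the rule's error message, so the record can only be closed with an amount filled in.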
Trainer
Sr. Architect
22+ years of industry experience
70+ projects implemented, with commendable domain knowledge
Schedule
Please contact info@itvision360.com for upcoming training schedules.
Dell Boomi Syllabus
- Overview of RDBMS
- AtomSphere Basics
- Build Tab
- FTP
- Disk
- Set Properties
- Atom Cloud and Test Atom Cloud
- Enable Regional Test and Production Clouds
- Document Flow
- Profile
- Setting up Map Profiles
- Mapping
- Map Functions
- Branch
- Deploy and manage
- Execute the Process
- Process Reporting
- Help and Support
- Introduction – Overview and Agenda for SaaS Integration
- Integration Scenario Salesforce
- Salesforce Query
- Setting Parameters
- Database Connector
- Configuring the Map
- User Defined Map Functions
- Decision Shape
- Salesforce Update
- Connector Call
- Mail Connector
- Message Shape
- Overview and Agenda for Admin
- Build, Deploy, Manage
- Process Schedule
- Connection Licensing
- Process Reporting and Document Statistics
- Exception Shape
- Process Deactivation
- Development Life Cycle Overview
- Build Components and Reusability
- Change Management
- Tracking Fields Overview
- Properties Overview
- Extensions Overview
- Document Flow Overview
- Document Flow – Shape Types
- Agenda and Environments configuration
- Extensions
- REST Overview
- SOAP Overview
- Process Call Overview
- Business Rules Overview
- Document Cache Overview
- Error Handling Techniques
- Process Route Overview
- Shared Web Server and Event Based Integration
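The Salesforce Query step covered above typically issues a SOQL query against a Salesforce object; a minimal sketch (Opportunity and its fields are standard Salesforce names, the date filter is illustrative):

```sql
SELECT Id, Name, StageName, Amount
FROM Opportunity
WHERE LastModifiedDate = TODAY
```

In the integration process, each returned record becomes a document that flows into the map and downstream connectors.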
Trainer
Integration Consultant
8+ years of experience in integration, with commendable domain knowledge
Schedule
Please contact info@itvision360.com for upcoming training schedules.
Apache Spark (PySpark) Syllabus
- Starting Spark Context - pyspark
- Overview of Spark Read APIs
- Understanding airlines data
- Inferring Schema
- Previewing Airlines Data
- Overview of Data Frame APIs
- Overview of Functions
- Overview of Spark Write APIs
- Overview of Predefined Functions in Spark
- Create Dummy Data Frame
- Categories of Functions
- Special Functions - col and lit
- Common String Manipulation Functions
- Extracting Strings using substring
- Extracting Strings using split
- Padding Characters around Strings
- Trimming Characters from Strings
- Date and Time Manipulation Functions
- Date and Time Arithmetic
- Using Date and Time Trunc Functions
- Date and Time Extract Functions
- Using to_date and to_timestamp
- Using date_format Function
- Dealing with Unix Timestamp
- Dealing with Nulls
- Using CASE and WHEN
- Overview of Basic Transformations
- Data Frame for basic transformations
- Basic Filtering of Data
- Filtering Example using dates
- Boolean Operators
- Using IN Operator or isin Function
- Using LIKE Operator or like Function
- Using BETWEEN Operator
- Dealing with Nulls while Filtering
- Total Aggregations
- Aggregate data using groupBy
- Aggregate data using rollup
- Aggregate data using cube
- Overview of Sorting Data Frames
- Solution - Problem 1 - Get Total Aggregations
- Solution - Problem 2 - Get Aggregations By FlightDate
- Solution - Problem 3 - Row Level Transformations
- Prepare Datasets for Joins
- Analyze Datasets for Joins
- Problem Statements for Joins
- Overview of Joins
- Using Inner Joins
- Left or Right Outer Join
- Solution - Get Flight Count Per US Airport
- Solution - Get Flight Count Per US State
- Solution - Get Dormant US Airports
- Solution - Get Origins without master data
- Solution - Get Count of Flights without master data
- Solution - Get Count of Flights per Airport without master data
- Solution - Get Daily Revenue
- Solution - Get Daily Revenue rolled up till Yearly
- Overview of Spark Metastore
- Exploring Spark Catalog
- Creating Metastore Tables using catalog
- Inferring Schema for Tables
- Define Schema for Tables using StructType
- Inserting into Existing Tables
- Read and Process data from Metastore Tables
- Create Partitioned Tables
- Saving as Partitioned Table
- Creating Temporary Views
- Using Spark SQL
- Introduction to Getting Started with Semi Structured Data using Spark
- Create Spark Metastore Table with Special Data Types
- Overview of ARRAY Type in Spark Metastore Table
- Overview of MAP and STRUCT Type in Spark Metastore Table
- Insert Data into Spark Metastore Table with Special Type Columns
- Create Spark Data Frame with Special Data Types
- Create Spark Data Frame with Special Types using Python List
- Insert Spark Data Frame with Special Types into Spark Metastore Table
- Review Data in the JSON File with Special Data Types
- Setup JSON Data Set to explore Spark APIs on Special Data Type Columns
- Read JSON Data with Special Types into Spark Data Frame
- Flatten Array Fields in Spark Data Frames using explode and explode_outer
- Get Size or Length of Array Type Columns in Spark Data Frame
- Concatenate Array Values into Delimited String using Spark APIs
- Convert Delimited Strings from Spark Data Frame Columns to Arrays
- Setup Data Sets to Build Arrays using Spark
- Read JSON Data into Spark Data Frame and Review Aggregate Operations
- Build Arrays from Flattened Rows of Spark Data Frame
- Getting Started with Spark Data Frames with Struct Columns
- Concatenate Struct Column Values in Spark Data Frame
- Filter Data on Struct Column Attributes in Spark Data Frame
- Create Spark Data Frame using Map Type Column
- Project Map Values as Columns using Spark Data Frame APIs
- Conclusion of Getting Started with Semi Structured Data using Spark
- Introduction to Process Semi Structured Data using Spark Data Frame APIs
- Review the Data Sets to generate denormalized JSON Data using Spark
- Setup JSON Data Sets in HDFS using HDFS Command
- Create Spark Data Frames using Data Frame APIs
- Join Orders and Order Items using Spark Data Frame APIs
- Generate Struct Field for Order Details using Spark
- Generate Array of Struct Field for Order Details using Spark
- Join Data Sets to generate denormalized JSON Data using Spark
- Denormalize Join Results using Spark Data Frame APIs
- Write Denormalized Customer Details to JSON Files using Spark
- Publish JSON Files for downstream applications
- Read Denormalized Data into Spark Data Frame
- Filter Denormalized Data Frame using Spark APIs
- Perform Aggregations on Denormalized Data Frame using Spark
- Flatten Semi Structured Data or Denormalized Data using Spark
- Compute Monthly Customer Revenue using Spark on Denormalized Data
- Conclusion of Processing Semi Structured Data using Spark Data Frame APIs
- Deploying and Monitoring Spark Applications - Introduction
- Overview of Types of Spark Cluster Managers
- Setup EMR Cluster with Hadoop and Spark
- Overall Capacity of Big Data Cluster with Hadoop and Spark
- Understanding YARN Capacity of an Enterprise Cluster
- Overview of Hadoop HDFS and YARN Setup on Multi-node Cluster
- Overview of Spark Setup on top of Hadoop
- Setup Data Set for Word Count application
- [Instructions and Commands] Setup Data Set for Word Count Application
- Develop Word Count Application
- [Code] Develop Word Count Application
- Review Deployment Process of Spark Application
- Overview of Spark Submit Command
- Switching between Python Versions to run Spark Applications or launch Pyspark CLI
- Switching between Pyspark Versions to run Spark Applications or launch Pyspark CLI
- Review Spark Configuration Properties at Run Time
- Develop Shell Script to run Spark Application
- [Code] Develop Shell Script to run Spark Application
- Run Spark Application and review default executors
- Overview of Spark History Server UI
- Setup SSH Proxy to access Spark Application logs - Introduction
- Overview of Private and Public IPs of Servers in the Cluster
- Overview of SSH Proxy
- Setup sshuttle on Mac or Linux
- Proxy using sshuttle on Mac or Linux
- Accessing Spark Application logs via SSH Proxy using sshuttle on Mac or Linux
- Side effects of using SSH Proxy to access Spark Application Logs
- Steps to setup SSH Proxy on Windows to access Spark Application Logs
- Setup PuTTY and PuTTYgen on Windows
- Quick Tour of PuTTY on Windows
- Configure Passwordless Login using PuTTYGen Keys on Windows
- Run Spark Application on Gateway Node using PuTTY
- Configure Tunnel to Gateway Node using PuTTY on Windows for SSH Proxy
- Setup Proxy on Windows and validate using Microsoft Edge browser
- Understanding Proxying Network Traffic overcoming Windows Caveats
- Update Hosts File for Worker Nodes Using Private IPs
- Access Spark Application logs using SSH Proxy
- Overview of performing tasks related to Spark Applications using Mac
- Deployment Modes of Spark Applications - Introduction
- Default Execution Master Type for Spark Applications
- Launch Pyspark using local mode
- Running Spark Applications using Local Mode
- Overview of Spark CLI Commands such as Pyspark
- Accessing Local Files using Spark CLI or Spark Applications
- Overview of submitting spark application using client deployment mode
- Overview of submitting spark application using cluster deployment mode
- Review the default logging while submitting Spark Applications
- Changing Spark Application Log Level using custom log4j properties
- Submit Spark Application using client mode with log level info
- Submit Spark Application using cluster mode with log level info
- Submit Spark Applications using SPARK_CONF_DIR with custom properties files
- Submit Spark Applications using Properties File
- Passing Application Properties Files and External Dependencies - Introduction
- Steps to pass application properties using JSON
- Setup Working Directory to pass application properties using JSON
- Build the JSON with Application Properties
- Explore APIs to process JSON Data using Pyspark
- Refactor the Spark Application Code to use properties from JSON
- Pass Application Properties to Spark Application using local files in client mode
- Pass Application Properties to Spark Application using local files in cluster mode
- Pass Application Properties to Spark Application using HDFS files
- Steps to pass external Python Libraries using pyfiles
- Create required YAML File to externalize application properties
- Install PyYAML into specific folder and build zip
- Explore APIs to process YAML Data using Pyspark
- Refactor the Spark Application Code to use properties from YAML
- Pass External Dependencies to Spark Application using local files in client mode
- Pass External Dependencies to Spark Application using local files in cluster mode
- Pass External Dependencies to Spark Application using HDFS files
- Spark Application Logging - Introduction
- Review of submitting Spark Applications using different Spark Versions
- Overview of Spark 2 Properties Files
- Overview of Spark 3 Properties Files
- Overview of Application Level Logging
- Default Logging for Spark 2 Applications
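For the custom log-level topics above, Spark 2 picks up a `log4j.properties` file from `$SPARK_CONF_DIR`. A minimal fragment that quiets the default INFO output to WARN, sketched from the template Spark 2 ships with, might look like:

```properties
# Raise the root level from the default INFO to WARN
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Note that Spark 3 moved to Log4j 2, so there the equivalent settings live in a `log4j2.properties` file with a different syntax.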
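The Word Count application covered above follows Spark's classic flatMap → map → reduceByKey pipeline. The same logic is sketched here in plain Python (so it runs without a cluster or pyspark installed), with each stage called out:

```python
from collections import Counter

def word_count(lines):
    """Mimic the Spark RDD pipeline:
    flatMap(split) -> map((word, 1)) -> reduceByKey(add)."""
    # flatMap: each line expands into many words
    words = (word for line in lines for word in line.split())
    # map + reduceByKey: Counter aggregates the implicit (word, 1) pairs by key
    return dict(Counter(words))

counts = word_count(["hello spark", "hello world"])
print(counts)  # {'hello': 2, 'spark': 1, 'world': 1}
```

In pyspark itself the equivalent is `sc.textFile(path).flatMap(lambda l: l.split()).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)`.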
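For the "Develop Shell Script to run Spark Application" topic above, a launcher script usually just assembles a `spark-submit` command from a few variables. This sketch echoes the command instead of executing it, and the application name `wordcount.py` is hypothetical:

```shell
#!/bin/bash
# Sketch of a launcher script; it prints the spark-submit command
# rather than running it, so no cluster is required.
APP=wordcount.py
MASTER=yarn
DEPLOY_MODE=client

CMD="spark-submit --master $MASTER --deploy-mode $DEPLOY_MODE $APP"
echo "$CMD"
```

In a real script the last line would be `$CMD` (or `exec $CMD`) so the application actually launches, typically with extra flags such as `--num-executors` and `--conf` settings.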
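For the "pass application properties using JSON" topics above, the usual pattern is to read a small JSON file in the driver with the standard `json` module (no Spark needed) and use its values to configure the job. The property keys below are hypothetical examples, not from the course:

```python
import json
import os
import tempfile

def load_app_properties(path):
    """Read application properties from a JSON file; in a real job this
    runs in the Spark driver before the SparkSession is created."""
    with open(path) as fp:
        return json.load(fp)

# Write a sample properties file (keys are illustrative only)
props = {"input.base.dir": "/public/airlines", "output.base.dir": "/user/out"}
path = os.path.join(tempfile.mkdtemp(), "application.json")
with open(path, "w") as fp:
    json.dump(props, fp)

loaded = load_app_properties(path)
print(loaded["input.base.dir"])  # /public/airlines
```

When submitting in cluster mode, the JSON file can be shipped alongside the job with `--files application.json` so it is available in the container's working directory.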
Schedule
Please contact info@itvision360.com for upcoming training schedules.
Embedded Systems & Linux Syllabus
- Embedded C programming
- Applications
- Communication protocols
- Networking and TCP/IP applications
- Socket Programming
- Modules Programming
- Target board – PIC and STM32 Controllers
- ARM – Advanced RISC Machine
- Getting Started
- Development tools and Utilities
- The GNU Library and System Calls
- Linux Environment
- Building Libraries
- Process Control
- Inter Process Communication
- Managing Signals
- Programming with Threads POSIX
- Operating System/Kernel Concepts
- Introduction to Linux Kernel Programming
- Debugging The Kernel
- Character Drivers
- Interrupts
- Blocked I/O Layer
- RTOS – RT-Linux
Schedule
Please contact info@itvision360.com for upcoming training schedules.
© 2023. All Rights Reserved.