Pentaho Training
Intellipaat
Course Summary
The Pentaho training class by Intellipaat helps you learn the Pentaho BI suite, covering Pentaho Data Integration, Pentaho Report Designer, Pentaho Mondrian Cubes and Dashboards. This Pentaho online course will help you prepare for the Pentaho Data Integration exam, and you will work on real-life projects.
Course Description
About Pentaho Training Course
What will you learn in this Pentaho online certification training?
- Learn the architecture of the Pentaho BI Suite
- Get trained in Pentaho Analytics for creating reports using Pentaho BI Server
- Perform multiple data integration, transformation, and analytics
- Work with Pentaho Dashboards & Pentaho Business Analytics
- Use PDI and ETL design patterns to populate a data warehouse star schema
- Create complex reports and dashboards for analysis
- Develop Mondrian Cube OLAP schemas with the Pentaho Schema Workbench
- Integrate Pentaho with the Big Data stack, including HDFS and MapReduce
- Learn how to performance-tune PDI jobs and transformations
- Use Pentaho Kettle to build and deploy reports in an automated manner
Who should take this Pentaho training course?
- Business Analysts, BI Developers, Data Scientists, DW Programmers and Solution Architects
- Mainframe and Testing Professionals
- Those who want to take up a career in Business Intelligence & Data Analytics
What are the prerequisites for learning Pentaho?
No prior knowledge of the Pentaho BI suite is required.
Why should you go for Pentaho online training?
- Poor data quality costs US businesses up to $600 billion annually – TDWI
- Big Data Analytics to reach $13.95 billion by 2017 – MarketsandMarkets
- Average US Salary for a Pentaho Professional is $104,000 – indeed.com
Course Syllabus
Pentaho Course Content
Introduction to the Pentaho Tool
The Pentaho User Console, overview of Pentaho Business Intelligence and Analytics tools, database dimensional modeling, using a Star Schema for querying large data sets, understanding fact tables and dimension tables, Snowflake Schema, principles of Slowly Changing Dimensions, how high availability is supported for the DI server and BA server, managing Pentaho artifacts, knowledge of big data solution architectures.
Hands-on Exercise – Schedule a report using the User Console, create a model using database dimensional modeling techniques, create a Star Schema for querying large data sets, use fact tables and dimension tables, manage Pentaho artifacts.

Data Architecture
Designing data models for reporting, Pentaho support for predictive analytics, designing a Streamlined Data Refinery (SDR) solution for a client.
Hands-on Exercise – Design data models for reporting, perform predictive analytics on a data set, design a Streamlined Data Refinery (SDR) solution for a dummy client.

Clustering in Pentaho
Understanding the basics of clustering in Pentaho Data Integration, creating a database connection, moving a CSV file input to table output and Microsoft Excel output, moving from Excel to data grid and log.
Hands-on Exercise – Create a database connection, move a CSV file input to table output and Microsoft Excel output, move data from Excel to data grid and log.

Data Transformation
The Pentaho Data Integration transformation steps, adding a sequence, understanding the Calculator step, Pentaho number range, string replace, selecting field values, sorting and splitting rows, string operations, unique rows and value mapper, usage of metadata injection.
Hands-on Exercise – Practice the various data integration transformation steps, add a sequence, use the Calculator, work on number ranges, select field values, sort and split rows, perform string operations, use unique rows, value mapper and metadata injection.

Pentaho Flow
Working with the secure socket command, Pentaho null value and error handling, Pentaho mail, row filter and prioritize streams.
Hands-on Exercise – Work with the secure socket command, handle null values in the data, perform error handling, send email, get row-filtered data, set stream priorities.

Deploying SCD
Understanding Slowly Changing Dimensions, making ETL dynamic, dynamic transformations, creating folders, scripting, bulk loading, file management, working with Pentaho file transfer, Repository, XML, Utility and File encryption.
Hands-on Exercise – Make an ETL transformation dynamic, create folders, write scripts, load bulk data, perform file management operations, work with Pentaho file transfer, XML utility and file encryption.

Types of Repository in Pentaho
Creating dynamic ETL, passing variables and values from a job to a transformation, deploying parameters with a transformation, the importance of the Repository in Pentaho, database connections, environment variables and repository import.
Hands-on Exercise – Create dynamic ETL, pass variables and values from a job to a transformation, deploy parameters with a transformation, connect to a database, set Pentaho environment variables, import a repository into the Pentaho workspace.

Pentaho Repository & Report Designing
Working with Pentaho dashboards and reports, the effect of row banding, designing a report, working with the Pentaho Server, creating line, bar and pie charts in Pentaho, how to achieve localization in reports.
Hands-on Exercise – Create a Pentaho dashboard and report, check the effect of row banding, design a report, work with the Pentaho Server, create line, bar and pie charts in Pentaho, implement localization in a report.
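As a taste of the Slowly Changing Dimensions material in the Deploying SCD module above, the short Python sketch below shows the core of Type 2 SCD handling: when a tracked attribute changes, the current dimension row is closed and a new versioned row is appended. It is an illustrative sketch only, not course material; the customer dimension and its columns are invented for the example.

    from datetime import date

    # In-memory stand-in for a customer dimension table (hypothetical columns).
    # Each row: natural key, tracked attribute, validity dates, current flag.
    dim_customer = [
        {"customer_id": 42, "city": "Boston", "valid_from": date(2020, 1, 1),
         "valid_to": None, "is_current": True},
    ]

    def apply_scd2(dim_rows, customer_id, new_city, load_date):
        """Type 2 SCD: expire the current row and append a new version
        whenever a tracked attribute changes."""
        current = next((r for r in dim_rows
                        if r["customer_id"] == customer_id and r["is_current"]), None)
        if current and current["city"] == new_city:
            return  # no change, nothing to do
        if current:
            current["valid_to"] = load_date   # close the old version
            current["is_current"] = False
        dim_rows.append({"customer_id": customer_id, "city": new_city,
                         "valid_from": load_date, "valid_to": None,
                         "is_current": True})

    apply_scd2(dim_customer, 42, "Chicago", date(2024, 6, 1))
    for row in dim_customer:
        print(row)   # old Boston row is expired, new Chicago row is current

In PDI itself this pattern is handled by the Dimension Lookup/Update step rather than hand-written code; the sketch only makes the underlying idea concrete.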
Pentaho Dashboard
Working with the Pentaho Dashboard, passing parameters in reports and dashboards, drill-down of a report, deploying Cubes for report creation, working with Excel sheets, Pentaho data integration for report creation.
Hands-on Exercise – Pass parameters in a report and dashboard, deploy Cubes for report creation, drill down in a report to understand the entries, import data from an Excel sheet, perform data integration for report creation.

Understanding Cubes
What is a Cube? Creation and benefits of a Cube, working with Cubes, report and dashboard creation with a Cube.
Hands-on Exercise – Create a Cube, create a report and dashboard with the Cube.

Multidimensional Expressions
Understanding the basics of Multidimensional Expressions (MDX), tuples and their implicit dimensions, MDX sets, levels, members, dimension referencing, hierarchical navigation, and metadata.
Hands-on Exercise – Work with MDX; use MDX sets, levels, members, dimension referencing, hierarchical navigation and metadata.

Pentaho Analyzer
Pentaho analytics for discovering and blending data of various types and sizes, including advanced analytics for visualizing data across multiple dimensions, extending Analyzer functionality, embedding BA Server reports, Pentaho REST APIs.
Hands-on Exercise – Blend data of various types and sizes, perform advanced analytics for visualizing data across multiple dimensions, embed a BA Server report.

Pentaho Data Integration (PDI) Development
The PDI steps used to create an ETL job, the PDI steps used to create an ETL transformation, the use of property files.
Hands-on Exercise – Create an ETL transformation using PDI steps, use property files.

Hadoop ETL Connectivity
Deploying ETL capabilities for working on the Hadoop ecosystem, integrating with HDFS and moving data from the local file system to the distributed file system, deploying Apache Hive, designing MapReduce jobs, complete Hadoop integration with the ETL tool.
Hands-on Exercise – Deploy ETL capabilities for working on the Hadoop ecosystem, integrate with HDFS and move data from the local file system to the distributed file system, deploy Apache Hive, design MapReduce jobs.

Creating Dashboards in Pentaho
Creating interactive dashboards with rich graphical representations of data for improving key business performance.
Hands-on Exercise – Create interactive dashboards for graphical representation of data.

Performance Tuning
Managing BA Server logging, tuning Pentaho reports, monitoring the performance of a job or a transformation, auditing in Pentaho.
Hands-on Exercise – Manage logging on the BA Server, fine-tune a Pentaho report, monitor the performance of an ETL job.

Security
Integrating user security with other enterprise systems, extending BA Server content security, securing data, Pentaho's support for multi-tenancy, using Kerberos with Pentaho.
Hands-on Exercise – Configure security settings to implement high-level security.
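The Multidimensional Expressions module above works with tuples, sets and member references. The fragment below is only an illustrative MDX query held in a Python string; the cube, hierarchy and measure names are invented for the example and are not the course's own data set.

    # Illustrative MDX against a hypothetical "Sales" Mondrian cube.
    # Columns: a set of two measures; rows: the children of the year 2023
    # on the Time hierarchy; the WHERE clause is a tuple acting as a slicer.
    MDX_QUERY = """
    SELECT
      {[Measures].[Units Sold], [Measures].[Revenue]} ON COLUMNS,
      [Time].[2023].Children ON ROWS
    FROM [Sales]
    WHERE ([Customer].[Region].[North America])
    """

    print(MDX_QUERY)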
Pentaho Projects

Project 1 – Pentaho Interactive Report
Data – Sales, Customer, Product
Objective – In this Pentaho project you will work exclusively on creating Pentaho interactive reports for sales, customer and product data fields. As part of the project you will learn to create a data source and build a Mondrian cube, which is represented as an XML file. You will gain advanced experience in managing data sources, building and formatting Pentaho reports, changing the report template and scheduling reports.

Project 2 – Pentaho Interactive Report
Domain – Retail
Objective – Build a complex dashboard with drill-down reports and charts for analysing business trends.

Project 3 – Pentaho Interactive Report
Domain – BI
Objective – Perform automation testing in an ETL environment, check the correctness of data transformations, verify data loading into the data warehouse without any loss or truncation, reject, replace and report invalid data, and create unit tests to target exceptions.
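Project 3 centres on automation testing of an ETL flow. As a flavour of that kind of check, here is a small, self-contained Python sketch using the standard sqlite3 module; the source and warehouse table names are invented, and a real project would run the same style of queries against the databases actually loaded by PDI.

    import sqlite3

    # Stand-in source and target tables (hypothetical names) in one SQLite database.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, customer TEXT);
        CREATE TABLE dw_orders  (order_id INTEGER, customer TEXT);
        INSERT INTO src_orders VALUES (1, 'Acme Corporation'), (2, 'Globex');
        INSERT INTO dw_orders  VALUES (1, 'Acme Corporation'), (2, 'Globex');
    """)

    def test_no_row_loss():
        # The load must not drop rows between source and warehouse.
        src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
        tgt = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
        assert src == tgt, f"row count mismatch: {src} source vs {tgt} target"

    def test_no_truncation():
        # Values must arrive unmodified; a silently truncated name would fail here.
        rows = conn.execute("""
            SELECT s.order_id FROM src_orders s
            JOIN dw_orders d ON s.order_id = d.order_id
            WHERE s.customer <> d.customer
        """).fetchall()
        assert not rows, f"truncated or altered rows: {[r[0] for r in rows]}"

    test_no_row_loss()
    test_no_truncation()
    print("ETL validation checks passed")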
This course is listed under Development & Implementations, Data & Information Management and Peripherals.