Victoria Power BI User Group Meeting September 2017
What’s New and Exciting with Power BI / Connecting to Data the Right Way

Agenda
- Introductions
- What’s New and Exciting with Power BI
- Connecting to Data the Right Way
  - Import vs DirectQuery
  - DirectQuery: SQL Server vs Analysis Services
- How to Extend Your Analytical Capabilities Using Power BI and Azure Machine Learning
- Question and Answer
Introductions
The Power BI User Group started meeting in October 2016. We now have over 50 members.
Latest Feature Set Review – Dave Jaycock

Reporting
- Drill-through to another report page
- Explain the increase/decrease insights (preview)
- Ribbon chart
- Theming preview update – chart style controls
- Accessibility improvements
  - Accessible See data
  - Keyboard shortcut helper dialog
- High density scatter chart sampling
- Cartesian gridline style control

New community visuals
- Visio visual (preview)
- Calendar by Tallan
- Enlighten Aquarium
- Impact Bubble Chart

Data connectivity
- Azure Consumption Insights connector
- Improvements to the Dynamics 365 for Financials connector

https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-september-2017-feature-summary/
Connecting to Data the Right Way

Import – selected tables and columns are imported into Power BI. As visualizations are created and interacted with, Power BI uses the imported data. The data must be refreshed – a full re-import of the data set – to see any changes that occurred in the underlying data since the initial import or the most recent refresh.

DirectQuery – no data is imported or copied into Power BI. For relational sources, the selected tables and columns appear in the fields list. For multidimensional sources, the dimensions and measures of the selected source appear in the fields list. As you create or interact with a visualization, Power BI queries the underlying data source, which means you’re always viewing current data.

Live Connection – with Analysis Services tabular or multidimensional sources, a live connection can be established. As with DirectQuery, no data is imported or copied into Power BI, and data is queried directly. Modeling (the semantic layer) is handled in Analysis Services and is unavailable within Power BI.
DirectQuery

- Works around data size limitations; allows for building visualizations over very large datasets
- Always working with current data; requires no refresh

But…
- All tables/queries must come from a single database
- Overly “complex” M queries are not supported
- Relationship filtering is limited to a single direction
- Time Intelligence is not available
- 1 million row limit on returned data (but it is possible to aggregate results over a much larger dataset)
- No Quick Insights

Performance and load are critical considerations – the performance of the source database can dramatically affect usability. Be aware of the impact of row-level security.
DirectQuery Publishing

- Requires the On-Premises Data Gateway, except when connecting to cloud sources (Azure SQL Database, Azure SQL Data Warehouse, Redshift)

Supported sources:
- Amazon Redshift
- Azure HDInsight Spark (Beta)
- Azure SQL Database
- Azure SQL Data Warehouse
- IBM Netezza (Beta)
- Impala (version 2.x)
- Oracle Database (version 12 and above)
- SAP Business Warehouse (Beta)
- SAP HANA
- Snowflake
- Spark (Beta) (version 0.9 and above)
- SQL Server
- Teradata Database
DirectQuery Demo
Live Connection

- Analysis Services Tabular or Multidimensional
- Always working with current data; requires no refresh

But…
- The “model” is entirely that of the underlying cube/tabular model
- No ability to edit relationships
- No calculated columns or tables

A live connection relies entirely on the source for its model/semantic layer, with the exception that new calculated measures can still be created at the Power BI level. You can, however, specify a DAX or MDX query in the source connection.
Live Connection Demo
Questions and Answers
Power BI and Azure Machine Learning

Step 1: Create an ODBC connection to your SQL Server database. This ODBC connection will be used in R to extract the data from SQL Server. The data will then be passed up to the Azure Machine Learning web service to make predictions on.

Step 2: Create an R script that will be used as a data source in Power BI. In the R script, call the web service and send it the data for which the model will predict values.

Step 3: The web service returns a JSON result set to the R script. This JSON result set needs to be loaded into a data frame and formatted so that it can be saved to a table in SQL Server.

Step 4: Save the R data frame of prediction results back to SQL Server using the same ODBC connection used to extract the data.

Step 5: Create a Power BI report and add the R script as a data source first, then add the table of prediction results in SQL Server. Typically I like to include in the Power BI report both the data that was sent up to the Azure Machine Learning model and the prediction results that the model generated.
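A minimal sketch of Steps 1–4 in R, assuming a hypothetical ODBC DSN ("PowerBIDemo"), hypothetical source and target table names, and placeholder scoring URL and API key; the exact JSON request and response shape depends on how the Azure Machine Learning web service was published.

# Sketch of Steps 1-4; DSN, table names, scoring URL and API key are placeholders.
library(RODBC)     # ODBC connection to SQL Server (Steps 1 and 4)
library(httr)      # HTTP call to the Azure ML web service (Step 2)
library(jsonlite)  # JSON encoding/decoding (Steps 2 and 3)

scoring_url <- "https://<region>.services.azureml.net/workspaces/<id>/services/<id>/execute?api-version=2.0"
api_key     <- "<web service API key>"

# Step 1: extract the rows to score from SQL Server over ODBC
conn  <- odbcConnect("PowerBIDemo")
input <- sqlQuery(conn, "SELECT CustomerID, Age, Income FROM dbo.CustomersToScore")

# Step 2: send the data to the Azure ML web service
request_body <- toJSON(list(
  Inputs = list(input1 = list(
    ColumnNames = names(input),
    Values      = unname(lapply(seq_len(nrow(input)),
                                function(i) as.character(unlist(input[i, ]))))
  ))
), auto_unbox = TRUE)

response <- POST(scoring_url,
                 add_headers(Authorization = paste("Bearer", api_key)),
                 content_type_json(),
                 body = request_body)

# Step 3: load the JSON result set into a data frame
result      <- fromJSON(content(response, as = "text"))$Results$output1$value
predictions <- as.data.frame(result$Values, stringsAsFactors = FALSE)
names(predictions) <- result$ColumnNames

# Step 4: save the prediction results back to SQL Server
sqlSave(conn, predictions, tablename = "PredictionResults",
        rownames = FALSE, append = TRUE)
odbcClose(conn)

When this script is used as a Power BI data source (Get Data > R script), the data frames left in the R environment – here input and predictions – are offered as tables to load, which is why Step 5 adds the script first and the SQL Server prediction table second.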
Power BI and Azure Machine Learning - Demo
Power BI and Azure Machine Learning Enterprise Solution Execution of the R script can be shifted out of Power BI and into SQL Server 2016. This centralizes and secures the integration code and moves it into the domain of IT to manage.
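A minimal sketch of what the R body might look like once it is embedded in sp_execute_external_script with SQL Server 2016 R Services: InputDataSet and OutputDataSet are the default data frame names SQL Server supplies and reads back, while call_azure_ml_web_service() is a hypothetical helper standing in for the same httr/jsonlite web service call shown in the earlier sketch.

# R body inside sp_execute_external_script (SQL Server 2016 R Services).
# InputDataSet holds the rows produced by the @input_data_1 T-SQL query;
# the data frame assigned to OutputDataSet is returned to SQL Server as a
# result set that can be inserted into the prediction results table.
# call_azure_ml_web_service() is a hypothetical helper, not a library function.
predictions   <- call_azure_ml_web_service(InputDataSet)
OutputDataSet <- cbind(InputDataSet, predictions)

Because the script now runs on the database server, the ODBC round trip in Steps 1 and 4 is no longer needed: SQL Server feeds the input rows in and persists the output rows, and the Power BI report only reads the finished prediction table.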
Do It Yourself
Follow the steps in my blog posts:
https://www.linkedin.com/pulse/match-made-heaven-part-1-2-anthony-bulk
https://www.linkedin.com/pulse/match-made-heaven-part-2-anthony-bulk
NEW MICROSOFT SITE FOR REGISTRATION
http://www.pbiusergroup.com/home
http://www.pbiusergroup.com/communities/community-home?CommunityKey=6c7102ee-2720-418e-b3c8-fff60a8d78c5&tab=groupdetails

Old site
https://www.meetup.com/Victoria-Power-BI-User-Group/