Microsoft Big Data Certification Exams http://www.jamesserra.com/archive/2017/08/microsoft-big-data-certification-exams/
This is cool…
Last week I had the chance to do something I have not done before: build a Power BI report to be displayed on a big screen hanging on a wall. To make up for the loss of user interactivity, I used the new Drilldown Player custom visual to cycle through different selections and display a new slice of data every few seconds; Devin Knight’s blog post here has a great summary of how to use it. However I wasn’t happy about the look of the Drilldown Player visual in this particular report: the play/stop/pause buttons aren’t much use if you can’t click on them and the visual doesn’t show all of the values that it is cycling through. As a result I hid the visual behind another one and came up with a different way of displaying the currently-displayed selection.
Here’s a simple example of what I did. Imagine you…
I’m a long-time fan of Analysis Services, and this latest feature is a really cool addition to the offering. Currently in preview, Azure Analysis Services Web Designer offers the following functionality that extends Azure Analysis Services, all through a simple web UI:
- Add a new logical server in Azure
- Create a new Analysis Services model from SQL DB, Azure SQL DW and… Power BI workbooks! (More data sources to come)
- Browse an existing Analysis Services model and add new measures for quick validation
- Open an existing Analysis Services model as a Visual Studio Project, in Excel, or in Power BI
- Edit an existing Analysis Services model using TMSL (Tabular Model Scripting Language)
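As an illustration of that last point, a TMSL script is just a JSON command document sent to the server. A minimal sketch of adding a measure with a createOrReplace command might look like the following (the database, table and measure names here are purely illustrative, not from the walkthrough):

```json
{
  "createOrReplace": {
    "object": {
      "database": "AdventureWorksModel",
      "table": "Internet Sales",
      "measure": "Total Sales"
    },
    "measure": {
      "name": "Total Sales",
      "expression": "SUM('Internet Sales'[Sales Amount])"
    }
  }
}
```

The same script format works from SSMS or the Web Designer, which is what makes quick edits possible without opening Visual Studio.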
Example – Creating a Model from Power BI Desktop
In this example, I’ve created a simple Power BI workbook (.pbix file) that’s connecting to an Azure SQL DB instance running the sample AdventureWorks database:
To connect to the new service, I can go via the Azure portal or direct to https://analysisservices.azure.com:
On this occasion, I’ll use the server I already have and instead go straight to adding a new model:
Once created, I can browse the model in the Designer:
Or open it using Power BI, Excel or Visual Studio:
This is in preview right now, with many features still to come. The ability to import Power BI models into Analysis Services is a massive feature in its own right, but beyond that it already shows how you can quickly create and modify models without having to delve into Visual Studio/SQL Server Data Tools. New features are coming every month, so keep an eye on the team blog to follow its progress.
One of the most useful features to come to Power BI in recent months is the Power BI Service Live Connection. You can find out more here.
What is it?
In a nutshell, it allows developers to connect to an already published Power BI dataset and create new reports without having to redo all the queries and data ingestion.
This scenario works well as an Analysis Services “lite” approach, allowing teams to collaborate better and reduce duplication of effort by sharing standardized models and content. As most of the effort with self-service BI tools is spent on the data mashing, this approach lets you “model once and reuse many times”.
How does it work?
As Power BI Service Live Connection is a preview feature, you first have to enable it in “options”:
Once enabled, restart Power BI Desktop and create a new dataset using the normal Get Data functionality. This becomes your data model to publish and share. Instead of building any reports, simply publish the .pbix file to the Power BI Service:
Once published, this dataset can then be connected to from any other Power BI workbook using the new functionality. Developers no longer need to do all the heavy lifting work with the data mashing, and instead just use the already built data model to create their own reports and dashboards.
Browse to the published data set:
You then have access to the published data model, and can build your own content. Note the “Edit Queries” option is grayed out.
This new functionality is a great addition to the Power BI toolset, and allows for a much improved sharing and collaboration experience within teams.
In standard scenarios, ETL developers using SSIS will leverage a data flow that takes data from a source (database, file, etc.) and loads it directly into the target database:
Unfortunately, adopting this pattern with an SSIS data flow into Azure SQL Data Warehouse will not naturally leverage the power of PolyBase. In order to understand this we need to take a little peek under the covers of SQL Data Warehouse:
As part of the underlying architecture, SQL DW has at its head the Control node. This node manages and optimizes queries, and serves as the access point for client applications accessing the SQL DW cluster. What this means for SSIS is that when it connects to SQL DW using standard data connectors, the connection is made to the Control node:
This method introduces a bottleneck as the Control node serves as the single throughput point for the data flow. As the Control node does not scale out, this limits the speed of your ETL flow.
As stated earlier, the most efficient way of loading into Azure SQL DW is to leverage the power of PolyBase in a process known as “Back Loading”:
Using this method, data is bulk loaded in parallel from staging files held in Blob Storage or Azure Data Lake Store, which allows for the full data processing power of PolyBase to be used.
Leveraging PolyBase Within SSIS
Previously, ingesting data from files into SQL DW using PolyBase was a slightly drawn out process of uploading files to storage, defining credentials, file formats and external table definitions before finally loading the data into SQL DW using a CTAS (Create Table As Select) statement. This works perfectly well, but introduces a lot of extra SQL coding on top of the norm.
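For context, the manual process described above looks roughly like this in T-SQL. This is only a sketch: the credential, storage account, object names, column definitions and distribution choice are all illustrative, not from a real deployment:

```sql
-- 1. Credential for the storage account (create a master key first if none exists)
CREATE MASTER KEY;
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

-- 2. External data source pointing at the staging container in Blob Storage
CREATE EXTERNAL DATA SOURCE StagingBlob
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://staging@mystorageaccount.blob.core.windows.net',
    CREDENTIAL = BlobCredential
);

-- 3. File format describing the staged files
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- 4. External table over the staged files
CREATE EXTERNAL TABLE dbo.FactSales_ext (
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION = '/factsales/',
    DATA_SOURCE = StagingBlob,
    FILE_FORMAT = CsvFormat
);

-- 5. Parallel PolyBase load into SQL DW via CTAS
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(SaleDate))
AS SELECT * FROM dbo.FactSales_ext;
```

Every one of these objects has to be scripted and maintained by hand for each feed, which is exactly the overhead being described.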
Thankfully, with the SSIS Azure Feature Pack, the above is now made much, much easier by the introduction of the Azure SQL DW Upload task. This task automates the creation of the SQL scripts required, and can be seamlessly incorporated into an SSIS ETL process that fully leverages PolyBase.
Greetings. This is my first post for the SQL of the North blog. This blog will serve as a platform for sharing the latest and greatest from the Azure data & analytics world, as well as sharing hints, tips and best practices picked up in the field.
By way of background, my name’s Mike and I’m a DSA (Data Solutions Architect) at Microsoft, where I specialise in helping customers on their data journey into the cloud. I have over 15 years’ experience in the data and analytics space, mainly focused on the Microsoft platform, though I’ve picked up a few other useful skills and technologies along the way.
Finally – why “SQL of the North”? Well, I’m a northern (UK) lad by origin, and am also an avid Game of Thrones fan. I wanted something that had a bit of a link to that, and unfortunately “the Data Watch” was already taken 🙂