In addition to all of the obvious business benefits that Microsoft Flow brings to the table, one of the things that initially struck me about it was how useful it would be for data acquisition purposes. The first thing that I did with it was to create a flow that queries weather stations from Weather Underground, stores the data in SQL Azure, and uses Power BI to analyze the weather patterns.
I may blog about that solution in the future, but with the Future of SharePoint event rapidly approaching, my fellow BI Focal collaborator, Jason Himmelstein, convinced me that there was something more interesting that we could do with this. How about near real-time monitoring of Twitter conversations for the event? All of the pieces were in place.
There are essentially three different technologies necessary to make this happen.
Let’s go through each one.
We could store our tweets in a variety of locations (CSV, SharePoint, Excel), and there are already a number of examples out there that demonstrate how to do this. The reason that we want to use a SQL Azure database is twofold. First, Flow has actions for connecting to it and inserting data, which takes care of our storage requirement. Second, and most importantly, SQL Azure databases support DirectQuery in Power BI.
With DirectQuery, Power BI does not cache the data – every interaction results in a query back to the source, in our case the SQL Azure database. This has the effect of making the data available for reporting as soon as it has been delivered by Flow. That’s the theory at least. In reality, Power BI caches certain elements temporarily (dashboard tiles, for example), but this is as close to real time as you can get in Power BI without writing data directly to it via the API. Reports are, for the most part, up to the minute.
You need an Azure subscription to create a database, and the process for creating it is documented in the following video.
We will be using the Twitter trigger with Microsoft Flow, and it has several output variables. We want to store the values of those variables, so we use the following script to create a table to hold them.
CREATE TABLE Twitter (
    id int IDENTITY(1,1),
    RetweetCount int,
    TweetText NVARCHAR(250),
    TweetedBy NVARCHAR(100),
    CreatedAt NVARCHAR(100),
    TweetID NVARCHAR(50),
    SearchTerm NVARCHAR(50)
);
GO
ALTER TABLE Twitter ADD PRIMARY KEY (id);
Once created, we are ready to fill it with tweets.
The recently announced Microsoft Flow is a tool that allows users to automate and integrate processes from different data sources in the cloud. It is based on Azure Logic Apps and is currently in preview, but it already supports a wide variety of actions and triggers. You can sign up for Flow, or access your existing flows, at http://flow.microsoft.com.
Flows consist of two primary objects, triggers and actions. Most triggers, and at the moment all actions, are tied to a data connection. You can register your connections as you go, but you can also view and register them en masse by clicking your person icon and selecting “My connections”.
Once registered, you can use “Browse” to start from a template, or you can go to “My flows” to start from scratch. That’s what we’ll do. To start, click on “Create new flow”, and you will be presented with the trigger selector.
Most of the available triggers are events, and the first four are special cases. The recurrence trigger allows you to schedule your flow. This is what I use for my weather gatherer – it just calls a web page every 5 minutes and passes the result into the next action. The external content source triggers are listed in alphabetical order, so we just scroll down to the Twitter trigger and select it.
If you have already registered a Twitter account, it will be used by default. If you want to change it, or add a new one, just click on “Change connection”. It’s a good idea to use multiple Twitter accounts if you’re doing multiple queries to avoid running afoul of Twitter’s rate limiting. Finally, just enter the search term in the Query Text box. Any new tweet containing that term will launch the flow.
Next, we need to add the “SQL Azure – Insert Row” action. To do so, click on the “+” symbol, click add an action, then click “Load more” at the bottom. Scroll down and select the action.
Again, if you have a database registered, it will be selected by default. If you have multiple databases registered, or want to add more, click on “Change Connection”. Once you have the correct connection selected, you can click on the dropdown and select the correct table (the one created above). Once selected, the fields will load into the action.
Populating the fields is a simple matter of selecting the appropriate output variable from the Twitter trigger. The final field, SearchTerm, is used to distinguish between different Twitter searches. Each flow only triggers on one term, but we want to set up multiple flows. We manually enter the value here (in our case “FutureOfSharePoint”). Later, that will be used as a slicer in Power BI.
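For reference, the row that Flow writes is equivalent to an INSERT statement like the one below. The values shown are purely illustrative – at run time, Flow supplies the actual outputs from the Twitter trigger:

```sql
-- Illustrative values only; Flow fills these from the Twitter trigger's outputs.
INSERT INTO Twitter (RetweetCount, TweetText, TweetedBy, CreatedAt, TweetID, SearchTerm)
VALUES (
    3,
    N'Looking forward to the Future of SharePoint event!',
    N'someuser',
    N'Wed May 04 14:00:00 +0000 2016',
    N'727000000000000000',
    N'FutureOfSharePoint'
);
```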
Once complete, give the flow a name, click on “Create Flow”, and then “Done”. At that point, you really are done – that’s all there is to it. You can query SQL Azure to check for data, and you can also use the information icon to check on the status of flow runs.
All of these steps are well documented in Jason’s video below:
We want to surface this data with Power BI. We can do this directly from the web interface, but we have a lot more options if we design the report with Power BI Desktop. The next step is to launch Power BI Desktop, select “Get Data”, select “Microsoft Azure SQL Database”, and press the “Connect” button. At this point, you enter the details about the Azure SQL Server and database and, most importantly, select the DirectQuery option.
The import option will retrieve data from the SQL database and cache it in an embedded model within the report. Once published, the Power BI service can keep it refreshed, but no more than 8 times per day. This is contrasted with DirectQuery, where no data is persisted in the service, and every interaction results in an immediate call back to the data source. For frequent updates, this is what we need.
A word of caution here – we pay a significant penalty from a feature standpoint when using DirectQuery mode. Most of the functions in Power Query and many of the functions in DAX are unavailable to us in this mode. However, with this particular data set, these restrictions are an acceptable tradeoff for the frequent updates.
Again, Jason has done a great job explaining the steps required to build the reports and dashboards in the video below, so I am not going to repeat them here.
Once the report is published, you may want to present it to a wider audience. You can do that through dashboard sharing if your recipients have access to Power BI, or you can publish it anonymously. Given that this is Twitter data, it’s certainly public, and there is no harm in doing so.
To publish the report anonymously, simply open the report in the Power BI service, and select File – Publish to web.
You will then be presented with a dialog box that will give you both a link and an embed code for 3 different possible renditions of the report. Simply select the one you want to use and paste it into the ultimate destination. My report can be seen below, and I will likely update it from time to time to follow current events.
One thing to keep in mind about reports shared anonymously is that even though the report is using DirectQuery, the visuals are only updated approximately every hour. The above report will lag reality by about an hour.
You can see here the power of these tools working together. Flow is an easy-to-use yet powerful integration tool, SQL Azure is a rock-solid database available in the cloud to other cloud services, and Power BI allows for rapid insights to be built by power users. No code was harmed in the building of this solution, but regardless, it’s still quite powerful.
From here, I can only see it getting better. My ask from the Flow team? A Power BI action that pumps data directly into a Power BI data model, eliminating the need for the Azure database and allowing for self-updating visuals in Power BI – but that’s a topic for another day.
I was recently interviewed by Jonathan Rozenblit from Microsoft Canada about our Election Night application (previously discussed here). Election Night is an application that lives completely in the cloud, using both Windows Azure and SQL Azure.
NOTE – July 17, 2012 – The post below was originally written in early 2011, and represents the effort required to get WordPress working in an Azure Web Role. With the release of the new Azure IaaS features in June 2012, I wanted to note that I do not recommend this approach. I am leaving the post here as it may have historical value, or value for those using the tools described. WordPress can now be run as an Azure Web Site or, as this blog does, within an Azure Virtual Machine.
As I mentioned in a post last week, the blog that you are currently reading is now hosted on Windows Azure. Nothing about the blog platform has changed, it’s still running on WordPress, but along the way I did switch the database from MySQL to SQL Azure. The process of getting this up and running was not exactly straightforward, so I thought that I would share my experience here.
To be clear, I am new to Azure. Brand new. What I’m writing below is simply my experience in getting this up and running, which happily I was able to do. This should not be taken as prescriptive guidance – that MVP badge at the top of this blog is for SharePoint – not Azure. If this helps you, then great. However, I would be happy to receive comments about mistakes, better approaches, or just other approaches.
Your mileage may vary – you’ve been warned…
Since the Windows Azure operating environment is actually a virtual machine running Windows Server 2008 or 2008 R2, you can technically run anything on it that you can on either of those environments. Getting an ASP.NET service up into the cloud is a snap with Visual Studio, but getting other platforms, like PHP, up there requires a bit more effort. Luckily, Microsoft recently published the Windows Azure Companion, which makes it significantly easier to install PHP and PHP-based applications like WordPress and Drupal on Azure. We’ll be working with this tool extensively.
1. Create the Storage Account
The Azure Companion installs a series of files into blob storage, so it is necessary to have a storage account available. Log in to the Azure Dashboard, click on “Hosted Services, Storage Accounts, and CDN”, and select “Storage Accounts”. Once this has loaded, click on “New Storage Account”.
From the following dialog box, choose your subscription, enter a name (URL) for your account, and choose a region.
The URL that you enter can be whatever you like, but it MUST be unique across all Azure storage accounts. I also always choose a specific data center. Given that you are charged for bandwidth in and out of the data center, and my WordPress install will be using SQL Azure, I want to make sure that all data moving between my front end server and my SQL Azure server is within the same data center, and this is the only way that I know to do this. The Storage Account should be created fairly quickly.
2. Create the SQL Azure WordPress database
Since we will be using SQL Azure for data storage, it’s necessary to create the database ahead of time. To do so, log in to the Azure Portal, select Database, drill into your subscription, select your server, and click Create.
Depending on your subscription, you may get different options, but you need to select a database name, edition, and maximum size.
You can select whatever you want for edition and size, but 1 GB should be more than enough for WordPress. Make sure that you remember the name of the database.
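If you prefer a script to the portal UI, the database can also be created with T-SQL by connecting to the server’s master database. The name, edition, and size below are just examples matching the guidance above:

```sql
-- Run against the master database of your SQL Azure server.
-- Edition and size are examples; 1 GB is plenty for WordPress.
CREATE DATABASE WordPress (EDITION = 'web', MAXSIZE = 1 GB);
```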
Finally, you’ll need to make sure that the Firewall rules are configured to allow access for Azure services. From the Server information screen, click on Firewall Rules.
Unless it is already selected, clicking on “Allow other Windows Azure services…” will add a rule permitting your Azure services to access the database.
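If you’d rather script the firewall rule, SQL Azure exposes a stored procedure for this; the special 0.0.0.0 range is the convention that grants access to other Azure services (the rule name here is arbitrary):

```sql
-- Run against the master database. The 0.0.0.0 - 0.0.0.0 range is the
-- documented convention for allowing Windows Azure services through.
EXEC sp_set_firewall_rule N'AllowWindowsAzureServices', '0.0.0.0', '0.0.0.0';
```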
In addition, make note of the following information. You will need it when it comes time to set up WordPress, below:
3. Install the Windows Azure Companion
You can download the Windows Azure Companion from here. You have three choices – the companion without SSL, with SSL, and the source code. The big difference between the first two is what endpoints are configured, and the source code obviously lets you change the entire solution. Unfortunately, since the solution package has not been configured for remote desktop access, working with the source code is necessary.
Why is remote desktop access necessary? Well, if you are absolutely satisfied that you can get everything configured perfectly in the solution package, then it isn’t, but at this stage in the game, I just don’t have that much confidence. RD access lets me tweak things after deployment. The biggest reason for me, however, was that if you use the application installer in Azure Companion, it will want to install your WordPress instance in a subdirectory off the root (i.e. http://blogs.cloudapp.net/MyBlogName). I didn’t want that – I wanted to use http://myblogname.mydomainname.com, and getting that going requires a few modifications to IIS afterwards – hence the need for RDP access. However, if the default behaviour is OK for you, download the precompiled solution package, follow the instructions here, and skip to the next section.
The “AdminWebSite” role is what we’ll be working with. This is the central application for Azure Companion, and from there we’ll install PHP and WordPress. For now, we are primarily concerned with configuring the role, and setting up Remote desktop. In addition, we’ll create a certificate to be used both for the application and for management (Remote Desktop).
First we need to configure the role, and we do that by double-clicking on the role name in the Roles folder. This brings up the Configuration tab.
The defaults on this page are fine, but the Instance Count is worth noting. According to my testing, Azure Companion can only be used with one instance. This is because multiple instances need to share a common file in the Blob storage, and this file is locked by the first process that accesses it. This will generate an availability warning on deployment.
The settings tab is where most of the configuration is performed.
There are 5 values here that MUST be configured:
The name of the storage account created in step 1
The storage account primary key that can be obtained from the portal after the account has been created
A new user name that will be used to administer Azure Companion
A list of available products to install. Azure Companion is extensible, and you can maintain your own list*.
Once these settings are made, we need to modify the endpoints. Azure allows for 5 endpoints, and Remote Desktop occupies one of them. On the surface, the 4 endpoints specified should be fine, but my testing showed that either there is a hidden endpoint somewhere, or the limit is actually 4. Either way, we need to remove one of the endpoints. We have no need for MySQL, so that’s the one that loses.
The rest of the settings are fine, so we don’t need to explore them. At this point it’s a good idea to save the project. If you get a write protected error, you’ll need to go to the folder where the project files are stored and remove the read only attribute from the project files. Don’t forget to come back and save the project.
While Visual Studio can deploy directly to Azure, for the first deployment we have a bit of a chicken-and-egg problem. In order to deploy a solution that has Remote Desktop enabled, you need to have a service certificate and a management certificate already available. The service certificates are installed as a node under the hosted service, but the hosted service hasn’t yet been created. Further, to create a new hosted service from the portal, you need to have the two package files available. Therefore, for our first deployment, we will create the service files and deploy manually.
Right click on your cloud project, and click “Publish”.
In this case, the default value of “Create Service Package Only” is what we want, so click on “OK”. Visual Studio will then create the files that we need to deploy, and open up a Windows Explorer window to the path in which they’re contained. Copy this path to the clipboard, and we are now ready to create our first Service Application.
Open the Windows Azure Portal, click on Hosted Services, and then click “New Hosted Service”. The new Hosted Service dialog appears.
The Service name only matters for management purposes, but the URL Prefix is the way that your application will be addressed from outside, so choose the name wisely. It must be unique among all Azure applications. For the region, make sure that you choose the same region specified for the SQL Azure database above. Since you likely won’t need separate staging/production environments, just deploy to production. We’ll be changing this right after we create it, so we don’t want to start it – that just takes extra time. Finally, select the package and configuration generated by Visual Studio.
Once this is done, Azure will create the virtual machine to host the instance, and install the instance itself. The process will take a few minutes, but when you’re ready to proceed, the portal window should appear something like the window below.
The next thing that we need to do is to create our service and management certificates. The two certificates can be based on the same root certificate, but must be in two different formats. The management certificate will be a .cer file (which is what is created when a self-signed certificate is generated), and the service certificate must be a .pfx file that includes the private key. The latter can be produced by exporting the certificate, as we’ll do below.
If you already have a certificate, you can skip this creation step, but the easiest way to create one is to use the Visual Studio tools. Once again, right-click on your cloud project, and click “Publish”. This gets a little tricky.
Click on the credentials drop-down and select &lt;Add&gt;. The Project Management Authentication dialog then appears. Again, select the drop-down, and if there are no credentials already stored, choose &lt;create&gt;. Then enter a friendly name for the certificate (in this case sfiWordPress). Follow the instructions in step 2 – but note that there is no “Subscription Page”; instead, you’ll be uploading a management certificate into the Management Certificates section. The subscription ID is obtained in the portal, and the purpose of naming the credentials is so that Visual Studio can refer to them at a later date.
When ready, click OK. Visual Studio will connect to Azure to make sure that everything is OK, and load in all of your hosted services. However, we’re still not quite ready to redeploy. First we need to upload a service certificate to the service (we handled the management certificate above). This is because we need such a certificate in order to use Remote Desktop.
First, we need to create our certificate. To do so, click the “Configure Remote Desktop connections” link.
Create a new certificate if necessary, and enter a local machine (for the service) user name. When ready, click OK. We’re almost ready to deploy, but first we must upload this new certificate to the service. However, it’s not yet in an importable format, so we need to export our certificate to a .pfx file. To do so, without closing our deployment dialog, run the Certificate Manager snap-in by clicking on the Start orb and entering certmgr.msc into the search box:
When the certificate manager window opens, open the Personal branch, and click on Certificates. Then, right click on your new certificate, hover over all tasks, and click on Export.
The Certificate Export Wizard will then start. Walk through the wizard, make sure that you select the option to export the private key, enter (and remember!) a password, enter a file name, and save it. When this is done, we’re ready to add it to our service application.
Go back to the Azure Portal, and navigate to your hosted service. Then click on the certificates node, and press the “Add Certificate” button in the ribbon. Browse to the certificate, and enter its password. When ready, click the create button.
Now we’re ready to deploy our Remote Desktop enabled service. Go back to the Visual Studio Publish Dialog, make sure that you have the deploy to Azure option selected, and click the OK button. If all is well you will receive the following prompt:
This is simply warning you that you have another service deployed into the production slot, and that you will be overwriting it with this deployment. Since this is precisely what we want, go ahead and click the “Delete and Continue” button. The deployment process will take several minutes. I suggest going for coffee, or pursuing another vice that requires 5-10 minutes.
4. Use Azure Companion to Set Up PHP and WordPress
Once started, the Azure Companion management service is running on port 8080. You can access it by navigating to your service URL at that port. In this case, it’s http://sfiwhitepages.cloudapp.net:8080. You should see a screen similar to the following:
You log in with the ID created in the service definition in section 3. If you receive errors, chances are that all the services haven’t spun up yet. You will also receive errors here if you configured multiple instances of the service, because both instances are trying to access the same file.
Click on the Applications tab. You will find a number of applications listed, and the one we’re interested in is WordPress. Select it and click Next. The next page will show it selected, along with all of its dependencies, including the PHP runtime and the PHP SQL drivers.
One parameter above is quite important: the installation path. This path will be used to form the URL to your blog, in the form http://serveraddress/InstallationPath. Unless you’ll be taking the additional steps that I describe below to install the blog into the root, you’ll want to choose this name wisely, as you’ll be handing it out.
After you have carefully read all of the licence terms (tee hee), click the “Accept” button. All of the requisite files will be installed for you. This process should take something less than a minute. When done, you will be returned to the application screen. You are now ready to set up WordPress. To begin the process, click on the launch button in the application window:
Alternatively, you can enter the URL for WordPress on your service by adding the installation path to the end of your URL, i.e. http://sfiwordpress.cloudapp.net/wordpress. Doing so will begin the WordPress configuration procedure. You should be presented with a page that indicates that the WordPress configuration file needs to be created. Go ahead and do so. After a confirmation page (click “Let’s go!”), you’ll be prompted for the database connection information. Here you’ll enter the information that you recorded at the end of the SQL Azure setup step above (step 2). The format of the user name is a little non-standard (username@machinename), but the diagram below shows how the information from step 2 maps to the WordPress setup form.
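Behind the scenes, the setup page writes these values into wp-config.php. If you ever need to adjust them by hand, the standard defines look something like this – every value below is a placeholder to be replaced with the details recorded in step 2:

```php
// Placeholders only - substitute the values from your SQL Azure setup (step 2).
define('DB_NAME', 'wordpress');                        // database created in step 2
define('DB_USER', 'username@servername');              // note the non-standard format
define('DB_PASSWORD', 'your-password');
define('DB_HOST', 'servername.database.windows.net');  // fully qualified server name
```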
Clicking Submit causes WordPress to check the connections, and if all is well, you are prompted to Run the install. Clicking the install button brings up the standard WordPress configuration screen, looking for the Site Title (for use on pages), the administrator user name, the administrator password, and your email. WordPress will use your email for things like administrator password resets, but the sendmail function will not work, at least without further setup (which I haven’t done). In other words, don’t forget the password.
Once complete, go ahead and click the “Install WordPress” button. You’ll receive a confirmation message, and clicking on Log In will take you to the login screen. Once logged in (as the administrator), you’ll be taken to the standard WordPress admin screen.
Of course navigating to the blog’s address will take you to the blog itself.
If you’re happy with the URL as is, you’re done. However, if you’re like me, and you want to have the blog at the root, and/or you want to use your own domain to host the blog, then read on…..
5. Move the Blog to its own Application
Once installation is complete, WordPress is stored in a subdirectory of the main PHP host site. If we don’t need the site for anything else, then we can change its port bindings, create a new web application on port 80, and move the WordPress files to it.
The first step is to connect to your Azure instance with Remote Desktop. To do this, open up the Windows Azure portal, navigate to the service instance, and click the connect button in the ribbon.
This will open up the Remote Desktop window. You will log in with the credentials that you created in the Remote Desktop configuration settings in Visual Studio. When entering the user name, make sure to preface it with a backslash (i.e. \jwhite) to indicate a local user. If you used a self-signed certificate, you will receive certificate warnings, but these can safely be ignored.
Once logged in, start the IIS Manager, then open up your virtual server, and open the sites tab. You should see two sites, one is the admin site for Windows Azure Companion, and the other is the PHP Host site. Our first step will be to change the port binding to something other than 80 for the PHP host site, so select it, and click on Bindings.
Select the current binding, click Edit, and change the port to something other than 80 (I used 81). Keep in mind that because this endpoint is not defined in the service, it will be unavailable outside of the host instance.
Next, open Windows Explorer and open up the F: drive. Create a new folder to house the WordPress files (in my case F:\WordPress). This will be our new blog folder; the name will only matter to the server. Next, open up the F:\Applications folder. This is the root of the current WordPress site. Copy the two files stored there (phpinfo.php and web.config) to the folder that you just created. Next, navigate to the WordPress folder under F:\Applications (in my case, F:\Applications\wordpress) and copy everything there into the same new folder.
Once this is done, go back into IIS Manager, right click on the Sites folder, and select Add New Site.
Give the site a unique name, and for the Physical Path, use the new blog folder created above. Ensure that the app is explicitly bound to the server’s IP address, and in addition, ensure that the site is bound to port 80. When finished, the web site should start, but if there is an error, simply restart it.
You should now be able to navigate to the root of your site, but you will likely notice that all of your styles are gone. This is because WordPress still thinks it’s installed in the old folder. To fix this, you have two options. You could connect to your SQL Azure instance with SQL Server Management Studio and edit the siteurl record in the wp_options table. Alternatively, you can navigate to the new blog folder, delete the wp-config.php file, navigate to the blog root, and re-run the setup as above. This is likely the easier option, but be aware that before you do so, you will need to drop and recreate your SQL database.
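If you take the SQL route, the fix is a single UPDATE against the wp_options table, which stores the blog’s base address under the option names siteurl and home (the URL below is illustrative – use your own service address):

```sql
-- Point WordPress at the new root URL.
UPDATE wp_options
SET option_value = 'http://sfiwhitepages.cloudapp.net'
WHERE option_name IN ('siteurl', 'home');
```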
We’re almost there…
6. Implement a custom Domain Name
OK. Now we have a WordPress blog running at the root of our application. However, if you’re like me, you likely want to use your own domain, and not cloudapp.net. Unfortunately, Azure uses variable external IP addresses, so there’s no way to use host headers and DNS A records. However, we can create a CNAME (alias) record that maps a name in your own domain to xxx.cloudapp.net.
The first step is to log in to your domain services provider. I use DynDNS, so the example will be from there. Simply add a CNAME record that maps your desired address to the Azure service name. The DynDNS example is below.
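In zone-file terms, the record looks something like this (the names are examples – substitute your own domain and service name):

```
; maps blog.mydomain.com to the Azure hosted service
blog.mydomain.com.    IN    CNAME    sfiwhitepages.cloudapp.net.
```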
You should now be able to navigate to your blog at the new address. It may take a few minutes for changes to propagate.
We have one step to go. WordPress is answering on this new address, but it will still form all of its URLs using the old one. We need to tell it to use the new address, and we do that by navigating to the admin app. Until we make this change, we need to use the old address, which in my case is http://sfiwhitepages.cloudapp.net/wp-admin.
Once logged in, click on Settings in the left menu, and then select General.
Once those changes are made, you can use the new domain exclusively.
That’s it…we’re done! Easy right? It’s worth it. No longer is my blog at the mercy of my local power provider, or any IT maintenance schedules. As I mentioned at the beginning, please feel free to comment with any better approaches, or any egregious errors that I may have made.