Note: In this blog post, I’m talking about Azure Functions, which are currently in preview. Settings, functionality, pricing, etc. may still change.


I’m constantly trying to optimise my life by automating small tasks, so that I can save some time and get things done without having to remember them. For example, I’m a big fan of both IFTTT and Instapaper, and use the two in combination to have new blog articles posted into my Instapaper account automatically, so I can read them easily and quickly when I’m travelling to and from work. I do not need to check for new articles on a daily basis, and I can start reading new ones the moment I’m on public transport.

Recently, I started looking at Azure Functions. If you’ve never heard of them before, I recommend starting with the official Azure Functions Overview for an introduction, followed by Scott Hanselman’s and Troy Hunt’s recent blog posts on potential use cases. After playing around with them for a short while, I thought I’d give them a chance for a mini-project: I’ve been following Packt Publishing’s Free Learning eBook offer for a while now. When I say follow, I actually mean I open my browser, open the page, and check which new eBook is available.

Packt Publishing Free Learning eBook Offer

This obviously takes some time, and additionally I forget to do it on a daily basis. While I do not get every free daily eBook, I still want to know if there’s something interesting published. So what I wanted to do was implement a small website monitor that checks the offer page once per day and sends me an email with the offer details. Quite simple, and it should be up and running within a short time.



First of all, you obviously need an Azure account. In addition to that, you need a SendGrid account to send out emails. Luckily, Azure subscribers can get a free SendGrid account with up to 25,000 emails each month. More than enough for personal use! Within Azure, click on New, type in SendGrid, and follow the steps to add it:

Azure SendGrid

Once done, create a new API Key in SendGrid and keep it somewhere (as we will need it later).

Furthermore, you also need to add a Function app in Azure if you haven’t already done so. When you create a Function app, you need to specify whether to use a Dynamic App Service Plan or a Classic one. For our small exercise, Dynamic is the best choice (“You don’t have to worry about resource management, and you only pay for the time that your code runs.”). Read more about the differences on How to scale Azure Functions.

Note: You can review the pricing for Functions here (note: Functions are still in Preview, pricing may change). You can also estimate your total cost by using the Azure Pricing calculator. Here’s my estimate for the Function I’m talking about:


Yes, you can safely run your Function quite a few times before getting charged for it.


Once both the Function App and the SendGrid API Key are available, you need to add the key to the Function App’s settings. Open your Function App, go to Function app settings, and select Configure app settings.


Under App settings, add a new setting with AzureWebJobsSendGridApiKey as key and your SendGrid API Key as value.


Once that’s done, save your changes. Your Function App is now capable of sending out emails via SendGrid.
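If you prefer scripting over clicking through the portal, the same app setting could also be added with the AzureRM PowerShell module. Here’s a minimal sketch only; the resource group and app names are placeholders, and since Set-AzureRmWebApp replaces the whole app settings collection, the existing settings are read and merged first:

# Minimal sketch, assuming the AzureRM module is installed and you are logged in (Login-AzureRmAccount)
# Resource group and Function App names below are placeholders - replace them with your own
$rgName  = "my-resource-group"
$appName = "my-function-app"
$sendGridKey = "<your SendGrid API key>"

# Read the current app settings first, as Set-AzureRmWebApp replaces the whole collection
$webApp = Get-AzureRmWebApp -ResourceGroupName $rgName -Name $appName
$settings = @{}
foreach ($setting in $webApp.SiteConfig.AppSettings) {
    $settings[$setting.Name] = $setting.Value
}
$settings["AzureWebJobsSendGridApiKey"] = $sendGridKey

Set-AzureRmWebApp -ResourceGroupName $rgName -Name $appName -AppSettings $settings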

Creating the Function

In your Function App, create a new Function. Microsoft provides several templates; we’ll use the simplest one and do all the plumbing from scratch (but feel free to check out the available templates to learn how you can integrate, for example, Azure Queues or Azure Blob Storage). Select the “Empty – C#” template and give it a name.

Once created, each Function offers you 4 pages where you can manage it – Develop, Integrate, Manage, and Monitor.


Let’s start with setting up the basic “structure” of our Function on the Integrate page. Initially, as we’ve chosen the Empty template, there are no Triggers, Inputs, or Outputs defined. What we want to set up is a Timer trigger (to run our code on a daily basis) and a SendGrid Output (to inform us about the latest offer):


First, add a new Trigger and select the Timer trigger template. Set the Schedule to 0 0 0 * * *, which means that the Function runs every day at midnight (once you add a Timer trigger, the page also provides some documentation on the format in case you want to define your own schedules). Remember to click Save.


Next, add a SendGrid Output. You can configure a couple of required parameters either on the configuration page or in your code. Leave the Message parameter name as it is (this is the variable we’ll use in our code), and define the other values as required. As I’m sending the email to myself, I added my email address to both the To and From fields. The Subject and Body will be defined in the code.


Save again, and click on Develop. At the top of the page you’ll see the Code section where you can define the underlying code of your Function.


Copy the following code sample and replace any existing code in your Function:

#r "SendGridMail"

using System;
using System.Net.Http;
using SendGrid;

static string packtUrl = ""; // set this to the URL of the Packt Free Learning offer page

public static void Run(TimerInfo myTimer, out SendGridMessage message, TraceWriter log)
{
    message = null;
    try
    {
        using (var httpClient = new HttpClient())
        {
            string pageContent = httpClient.GetStringAsync(packtUrl).Result;
            //the content we're looking for comes after a div tag with the class dotd-title
            int tmpIndex = pageContent.IndexOf("dotd-title");
            string offerDetails = pageContent.Substring(tmpIndex + 15, 3000);
            //we've now got more content than required, removing anything that is not part of the article description
            offerDetails = offerDetails.Substring(1, offerDetails.IndexOf("dotd-main-book-form") - 20);
            offerDetails += String.Format("<br/><br/><a href='{0}'>PacktPub Free Learning</a>", packtUrl);
            //parsing the book title here (the text between the <h2> tags)
            string bookTitle = offerDetails.Substring(offerDetails.IndexOf("<h2>") + 4,
                offerDetails.IndexOf("</h2>") - (offerDetails.IndexOf("<h2>") + 4));
            message = new SendGridMessage()
            {
                Subject = String.Format("Packt's Free Learning Offer: {0}", bookTitle),
                Html = offerDetails
            };

            log.Info("Finished successfully");
        }
    }
    catch (Exception exc)
    {
        log.Info("Exception:" + exc.Message + exc.StackTrace);
    }
}
Note the following:

  1. The first line (starting with #r) is used to add a reference to an external assembly (documentation).
  2. public static void Run is the main function which gets called when your Function runs. As you can see, we have 3 arguments for it here. TimerInfo myTimer needs to be added as we’re using a Timer trigger. out SendGridMessage message is the SendGrid email output which we defined in the Integration configuration. We need to instantiate it and set any required properties if we want to send an email. Lastly, TraceWriter log provides a mechanism to log any information that you want during your code execution – errors, warnings, general stuff. Anything written to this log gets displayed in the Log section just below your Code section.
  3. As for the code fetching and extracting the information I want to receive via email, this is pretty straightforward. The few lines I have here instantiate a new HttpClient object, read the offer page content, and fetch the title and description (not in the nicest of ways, but hey, it’s working!).
  4. Lastly, we create our SendGrid email message and assign it the appropriate subject and email body.

Here’s the final result once the code has executed and the email was sent:



That’s it! A couple of minutes of configuration and coding, and I now receive daily email alerts about the latest free eBook.

Yesterday, I talked at the SharePoint Community meetup about “OneDrive for Business – Current State and New Features”. Here are the slides that I used:


From time to time you may want to get an overview of the structure of your SharePoint Online environment. That is, you want to know the number of site collections and subsites, and how they are organised.

While you could review the existing site collections either in the UI (not very convenient once you have more than 20) or via PowerShell, neither approach provides you with details about a site collection’s structure itself. How many subsites are there, and how are they organised? How many levels deep do they go, or are there dozens of subsites located directly underneath a specific site collection?


Just like in my recent post, what you need are the OfficeDev PnP PowerShell cmdlets (review the installation instructions on their page if you haven’t installed those awesome cmdlets yet) and an account which has been assigned either the “SharePoint administrator” or “Global administrator” role in your tenant. Additionally, this account needs to have access to all SharePoint Online site collections – this is where the script to set site collection administrators on all site collections comes in very handy.

The Script

What the script does is the following: it first fetches the list of all site collections (apply a filter if necessary) and then collects the structure for each site collection by going through all subsites. Lastly, it writes the collected information into a CSV file for further manual analysis, e.g. in Excel.
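As an example of such a filter, the Get-SPOTenantSite call in the script below could be restricted to a specific path; the /teams/ path used here is purely illustrative:

# Example filter (illustrative only): limit the run to site collections under /teams/
$tenantSites = Get-SPOTenantSite | Where-Object { $_.Url -like "*/teams/*" }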

#Config Variables - update these as required
$tenant = "mytenant"
$ReportPath = "SPOStructure.csv"

# Note: If you run this script regularly, please have a look at
# the following site to see how you can store credentials securely in Windows
$cred = Get-Credential
Connect-SPOnline -Url "https://$($tenant)-admin.sharepoint.com" -Credentials $cred

write-host "Getting all sites"
#Note: we do not make use of the IncludeOneDriveSites parameter here,
# which would include personal sites as well
#You could also filter here to get only specific sites, or use
# the -Url parameter for the Get-SPOTenantSite to get the structure of a single site collection only
$tenantSites = Get-SPOTenantSite

function Get-SPOSubWebs($Context, $RootWeb){
    $arrWebs = @()
    #load the subwebs of the current web
    $Webs = $RootWeb.Webs
    $Context.Load($Webs)
    $Context.ExecuteQuery()
    ForEach ($sWeb in $Webs)
    {
        $arrWebs += $sWeb.Url
        $arrWebs += Get-SPOSubWebs -RootWeb $sWeb -Context $Context
    }
    return $arrWebs
}

$allWebs = @()
foreach($site in $tenantSites) {
    write-host "Connecting to $($site.Url)"
    Connect-SPOnline -Url $site.Url -Credentials $cred
    $allWebs += $site.Url
    $allWebs += Get-SPOSubWebs (Get-SPOContext) (Get-SPOWeb)
}
write-host "Finished"

$allWebs | Out-File -FilePath $ReportPath
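One note on the output: Out-File simply writes one URL per line. If you’d rather have a real CSV with additional columns, for example the nesting depth of each site, the last line could be swapped for a small variation like this (just a sketch; the depth calculation is my own addition and assumes the standard https://<tenant>.sharepoint.com URLs):

# Sketch: write URL and nesting depth as CSV columns instead of plain lines
$rootPrefix = "https://$($tenant).sharepoint.com"
$allWebs | ForEach-Object {
    [PSCustomObject]@{
        Url   = $_
        Depth = ($_ -replace [regex]::Escape($rootPrefix), "").Trim('/').Split('/').Count
    }
} | Export-Csv -Path $ReportPath -NoTypeInformation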

Here’s the script in action:

Supply values for the following parameters:
Getting all sites
Connecting to 
Connecting to 
Connecting to 
Connecting to 
Connecting to 
Connecting to 
Connecting to 


And the result written to the CSV file:


Imagine the following scenario: A user or a group of people need full access to all site collections in your Office 365 tenant. It could be a service account that gathers some statistics regularly, or a group of users who provide regular detailed support to your organization. How can you ensure that these users have access to all site collections, even newly created ones? And what is the best way to manage this group of users?

Currently, there is no way to automatically assign users or groups as site collection administrators in your tenant. And while you can manage the settings per site collection in the SharePoint Online Administration area of the Office 365 portal, doing so for dozens or hundreds of sites is not a productive use of time.


But there’s a way to automate this process, and as usual PowerShell comes to the rescue. What you need are the OfficeDev PnP PowerShell cmdlets (review the installation instructions on their page if you haven’t installed those awesome cmdlets yet) and an account which has been assigned either the “SharePoint administrator” or “Global administrator” role in your tenant.

The cmdlet we are interested in is called Set-SPOTenantSite. It allows you to manage some selected properties of a SharePoint Online site collection, among them the owners. The ‘Owners’ parameter expects a list of accounts, identified by their login names. But how about groups? If I create an Office 365 group, how can I determine its login name? While you can also achieve the same thing via PowerShell (see the sketch after the list below), this is one way to do it via the browser:

  1. Go to a SharePoint site and grant the group access
  2. While still on the permissions view, click on the Group name so that you access its Personal Settings page
  3. On that page, it will list something like “Account c:0-.f|rolemanager|s-1-5-21-784567607-4288704409-1262486537-2161342”. “c:0-.f|rolemanager|s-1-5-21-784567607-4288704409-1262486537-2161342” is the login name which you need
  4. You can then remove the permissions for the group again
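And here’s what the PowerShell alternative mentioned above could look like, again using the PnP cmdlets. This is only a sketch that lists all users and groups with access to a site (the site URL is a placeholder), so you can copy the claim-encoded login name from there:

# Sketch: list the login names of all users/groups that have access to a site
# (grant the group access first, as in step 1 above; the URL below is a placeholder)
Connect-SPOnline -Url "https://mytenant.sharepoint.com/sites/somesite" -Credentials (Get-Credential)
$ctx = Get-SPOContext
$web = Get-SPOWeb
$ctx.Load($web.SiteUsers)
$ctx.ExecuteQuery()
$web.SiteUsers | Select-Object Title, LoginName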

Next, you also need to think about which users you want to add as site collection administrators. If you have a group of users that should be added to all site collections, it makes sense to add all those users either to an Active Directory group (if you’re synchronising your Active Directory with Office 365) or an Office 365 group. That way, you can manage the group of users fairly easily, and add or remove users simply by managing the group – without having to do anything on the site collections directly.

The Script

Here’s the script that helps you set the site collection administrators on a filtered set of site collections (I’m skipping any personal OneDrive for Business sites in the mytenant-my.sharepoint.com path. Update: I just realised that OneDrive sites aren’t returned by default; the IncludeOneDriveSites parameter has to be set to $true for that. Either way, we’re skipping those sites in this example). Please note that in its current form below, it is meant to be run directly by someone who has account credentials for a “SharePoint administrator” or “Global administrator”. It can be adapted to use stored credentials, e.g. when you want to run it as a daily scheduled task on a server.

#comma separated list of users and groups to be added
$adminAccounts = "",""

#Specify the tenant here
$tenant = "mytenant"

# Note: If you run this script regularly, please have a look at the following site to see how you can store credentials securely in Windows
$cred = Get-Credential

write-host "Connecting to https://$($tenant)-admin.sharepoint.com"
Connect-SPOnline -Url "https://$($tenant)-admin.sharepoint.com" -Credentials $cred
write-host "Getting list of site collections"

#Note: we are only fetching the root site collection and any site collection in the /sites/ path
#Update filters here accordingly to match your requirements
$sitecollections = Get-SPOTenantSite | where {($_.Url -like "*$($tenant).sharepoint.com") -or ($_.Url -like "*$($tenant).sharepoint.com/sites/*")}

foreach($sitecollection in $sitecollections) {
	write-host "Adding administrators to $($sitecollection.Url)"
	Set-SPOTenantSite -Url $sitecollection.Url -Owners $adminAccounts
}

That’s it, just a couple of lines of PowerShell which can save you a lot of time and help you with your support processes.

Lastly, if you want to run this script regularly to ensure that your users/groups are also added to new site collections and re-added to existing ones (if they have been removed), I would recommend following the instructions given on the OfficeDev PnP PowerShell page for setting up credentials in Windows’ Credential Manager and running the script as a scheduled task.
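To give you an idea of what that could look like, here’s a sketch only; the credential label, script path, task name, and schedule are all placeholders. Once a generic credential has been added in the Windows Credential Manager, the PnP cmdlets can resolve it by its label instead of prompting, and the script can then be registered as a daily task:

# Sketch: use a stored credential and register the script as a daily scheduled task
# Assumes a generic credential labelled "SPOAdmin" has been added in the Windows Credential Manager
Connect-SPOnline -Url "https://$($tenant)-admin.sharepoint.com" -Credentials "SPOAdmin"

# Register the script as a daily task (path, task name, and time are placeholders)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\Scripts\Set-SPOSiteCollectionAdmins.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Set SPO site collection administrators" -Action $action -Trigger $trigger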


My presentation from yesterday’s Singapore SharePoint Community event:

Last Saturday, I presented at the local Azure Bootcamp here in Singapore. My session was titled “Introduction to Azure Machine Learning”, and I was extremely happy to see that there was a great number of people interested in it (as it was the first session of the day, some people even had to stand at the back of the room; more chairs were brought in for the sessions afterwards).

Here is the slide deck I used. As for the demo, I have a mini-series in the pipeline.

This coming Saturday, April 16, the Global Azure Bootcamp takes place once again. As I just moved back to Singapore, I’ll be speaking here locally on the topic of Azure Machine Learning:

Introduction to Azure Machine Learning

Machine Learning runs predictive models that learn from existing data in order to forecast future behaviors, outcomes, and trends. A practical example of this is when you swipe your credit card somewhere and the bank verifies via Machine Learning whether the transaction is likely to be fraudulent. Another example is online shopping recommendations based on what you want to buy and what others have purchased before.

In this session, we’ll lay the foundation for understanding the basics of Machine Learning, and see some practical examples of how it can be implemented on Azure.

Visit the Global Azure Bootcamp website to view the agenda and register now!

Well, I realised that I announced my presentation, but never posted the slide deck here. Without further ado, here it is:

A year ago, Microsoft released an updated set of Visio Stencils with icons for Office 365 and related products. They updated the set slightly and also provided the option to download the “older 2012 version”:

These stencils contain more than 300 icons to help you create visual representations of Microsoft Office or Microsoft Office 365 deployments including Skype for Business, Microsoft Exchange Server 2013, Microsoft Lync Server 2013, and Microsoft SharePoint Server 2013. The zip file now includes both stencil sets from 2012 and 2014.

Here’s an overview of all icons in the newest version:

On June 17th and 18th, SPBiz, an online SharePoint conference with a major focus on business-related topics, will take place. I’ll participate with a session called “Practical Advice for developing your SharePoint Roadmap”:

On-premises, cloud, or both? One centralized farm or multiple farms in different geographical regions? SharePoint as an intranet, an Enterprise Content Management System, or as THE central platform for all your company’s applications? Deciding what to do with SharePoint in your organization is an important task that needs to be properly planned and aligned with business goals. Developing a roadmap can help you with getting a shared understanding of the way ahead, the initiatives to be undertaken, and the outcomes to be achieved. In this session, you will receive practical advice on how to get started and how to develop your own SharePoint roadmap.

My session will take place from 3-4pm EDT (9-10pm CET). More details can be found on the conference website; the schedule for the 17th can be seen here:

Hope to see you there!