Serverless Scripting with Azure Functions to Move Data


By: Koen Verbeeck


In this video we look at Azure Functions, how they can help you expand your ETL functionality, and how they can assist with moving data around in Azure.


Video Transcript

Hello everyone. Welcome to this video about moving data around in Azure. In this video, we will talk about Azure Functions and how they can help you to expand the functionality of your ETL flow in Azure.

A quick introduction: my name is Koen Verbeeck. I work as a BI consultant in Belgium for the company AE. I'm a Microsoft Data Platform MVP, I write articles for mssqltips.com, and I have some certifications. If you have any questions, you can drop them in the comment section below. You can also contact me on Twitter, LinkedIn, or on my blog.

All right, Azure Functions. Azure Functions are maybe not the first thing you think of when thinking about moving data around in Azure, because they are little pieces of code that you run serverless in Azure, but you can do almost anything with them, so we can also build some data-related applications with them. So Azure Functions are serverless compute. That means they are little pieces of code which you deploy to Azure, and they run somewhere on a server. You don't really care which server or how it runs. You just care that it runs, obviously. Such a little piece of code will mostly do one task. For example, read a CSV file and transform it to JSON, or read a file from Blob Storage and write it to a SQL Server database. All little pieces of functionality that you can easily implement using existing languages, like for example C#, PowerShell, Python, and many, many others.
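To make that concrete, here is a minimal sketch of such a function in C# using a blob trigger. The container name "input" and the function name are placeholders for illustration, not the ones from the demo:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobToLog
{
    // Runs every time a new blob lands in the "input" container.
    // "AzureWebJobsStorage" refers to the storage account configured
    // for the function app.
    [FunctionName("BlobToLog")]
    public static void Run(
        [BlobTrigger("input/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Received blob {name} ({blob.Length} bytes)");
    }
}
```

That's the whole idea: one trigger, one small task, and Azure takes care of where and how it runs.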

Azure Functions are really useful, because sometimes you have another tool, like Azure Data Factory, that doesn't support all the functionality that you want. For example, sending an email or reading data from SharePoint. When there's such missing functionality, you can easily resolve this using Azure Functions. Someone told me recently that Azure Functions are the Script Task of Integration Services, but then in the Azure cloud. Another advantage of Azure Functions is the Visual Studio integration. This means you can fully develop them in Visual Studio, the environment that you're used to; you can test them there, you can debug them there, and then you can deploy them to Azure.

All right. Let's quickly see one of those Azure Functions in action. Let's go to Visual Studio. Here I've written a little Azure Function that will take data coming into Blob Storage. Every time a new file arrives in a certain container, this function will be triggered, pick up the file, and process it. We have a little configuration in the local.settings.json file, which contains the connection to Azure Blob Storage and also the connection string to an Azure SQL Database where we will dump the data. Now, this local.settings.json file you typically do not include in your source control, because then everyone can see your secrets. It will also not be included when you deploy to Azure, so you need to do your configuration in Azure as well.
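As a rough sketch of how that configuration is typically read in an Azure Function, something like the following helper works both locally and after deployment. The key name SqlConnectionString is an assumption for illustration, not necessarily the name used in the demo:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;

public static class Settings
{
    // Resolves a setting from local.settings.json when running locally,
    // and from the function app's application settings once deployed.
    public static string Get(ExecutionContext context, string key)
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(context.FunctionAppDirectory)
            .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
            .AddEnvironmentVariables()
            .Build();

        return config[key];
    }
}
```

Inside the function you would then call something like Settings.Get(context, "SqlConnectionString"), where context is the ExecutionContext parameter that Azure Functions passes in.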

Right, so I read the configuration file and over here I retrieve my connection string, and then I check over here if the file name ends with .json. If it does, I read the file and try to convert it to a C# DataTable using the Tabulate function, which is a function I found on Stack Overflow. It just goes over the JSON file and dumps it into a table. Once I have my data in a DataTable, I can use the SqlBulkCopy class to write it to my destination table, called topmovies. The file I will read is a JSON file containing the top 250 movies of all time. So here I'm writing to SQL Server, and then I simply log the line, "Okay, I wrote some JSON blobs to Azure SQL." If all of this is going too fast, don't worry. I've written a tip about this that you can find on mssqltips.com and which explains every step in more detail.
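Putting those steps together, a sketch of the whole function could look like the following. Instead of the Tabulate helper from Stack Overflow, this version uses Json.NET's built-in DataTable deserialization, which does the same job for a flat JSON array; the container name and the setting key are assumptions for illustration:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class ProcessTopMovies
{
    [FunctionName("ProcessTopMovies")]
    public static void Run(
        [BlobTrigger("movies/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
        string name,
        ILogger log)
    {
        // Only process files with a .json extension, as in the demo.
        if (!name.ToLower().EndsWith(".json"))
        {
            log.LogInformation($"Skipping {name}: not a JSON file.");
            return;
        }

        // Read the blob and convert the JSON array into a DataTable.
        string json;
        using (var reader = new StreamReader(blob))
        {
            json = reader.ReadToEnd();
        }
        DataTable table = JsonConvert.DeserializeObject<DataTable>(json);

        // "SqlConnectionString" is a placeholder for the app setting
        // holding the Azure SQL Database connection string.
        string connectionString =
            Environment.GetEnvironmentVariable("SqlConnectionString");

        // Bulk load the rows into the destination table from the demo.
        using (var connection = new SqlConnection(connectionString))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.topmovies";
            bulkCopy.WriteToServer(table);
        }

        log.LogInformation("Wrote some JSON blobs to Azure SQL.");
    }
}
```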

Alright, so let's try to debug this. I click here on play to launch an external application. You can debug an Azure Function locally on your own machine; you don't need to deploy it to Azure first. So here you go, it's starting. And here it says, "Okay, now I'm listening on this port and I'm checking if something arrives in that Azure Blob Storage." Here we have an Azure Blob container, and this application is now listening to this location. So let's pick up a file, a JSON file, go to Azure Storage Explorer, which is by the way a free tool you can download, and dump the file over there. You can see here it's transferring. Okay, it's complete. If you go over here, you will see that it found the file and is now executing the Azure Function. So if you go back to Visual Studio, you can see a lot of output, and at the end it says, "Okay, I wrote some JSON blobs to Azure SQL." So if I go check SQL Server, my destination, and I run this query, I can see that indeed there are 250 rows present in this table. Okay, this was my very short demo. Let's go back to the slides.
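For reference, that same row-count check can also be done from code; a small sketch, assuming the dbo.topmovies table from the demo and a connection string stored in an environment variable:

```csharp
using System;
using System.Data.SqlClient;

class VerifyLoad
{
    static void Main()
    {
        string connectionString =
            Environment.GetEnvironmentVariable("SqlConnectionString");

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM dbo.topmovies;", connection))
        {
            connection.Open();
            int rows = (int)command.ExecuteScalar();
            Console.WriteLine($"{rows} rows in dbo.topmovies"); // expect 250
        }
    }
}
```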

This was a very short introduction to Azure Functions. If you're interested in more of the functionality, there are some tips on mssqltips.com where you can check out a couple of use cases that use Azure Functions. The first one, processing blob files automatically using an Azure Function with a blob trigger, is the demo that I just did. I also have a tip about how to connect to a Snowflake database, and there are some other tips as well. I hope you enjoyed this video. Again, if you have any questions, just put them in the comments, and thank you for watching.


Last Update: 4/27/2020




Comments For This Article




Thursday, February 11, 2021 - 12:28:49 PM - Koen Verbeeck
Hi Anand,

Azure Functions are pure code (C#, Python, ...), so anything is possible :)
But yes, you'll be able to read in a flat file. You'll also be able to process rows individually, store them in a DataTable and then use bulk copy, but I don't think processing row-by-row is the best option. Depending on the transformation, you can try a set-based approach (Python can do this), or use SQL in the database. A minimal sketch of the flat-file approach is below.
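This sketch assumes a simple CSV with a header row and no quoted commas; the file name and destination table are placeholders:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvToSql
{
    static void Main()
    {
        // Load the CSV into a DataTable, one column per header field.
        var table = new DataTable();
        using (var reader = new StreamReader("topmovies.csv"))
        {
            foreach (string header in reader.ReadLine().Split(','))
                table.Columns.Add(header);

            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // A row-level transformation could happen here, but for
                // 50K records a set-based approach after loading is
                // usually the better option.
                table.Rows.Add(line.Split(','));
            }
        }

        string connectionString =
            Environment.GetEnvironmentVariable("SqlConnectionString");

        using (var connection = new SqlConnection(connectionString))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.topmovies";
            bulkCopy.WriteToServer(table);
        }
    }
}
```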

Regards,
Koen

Wednesday, February 10, 2021 - 2:45:33 PM - Andy
I am sure the same code can be reused, and I am learning Azure integration coding. Do you think I can reuse the same code to:

#1. Use a CSV (50K records) file instead of JSON and loop through it.
#2. Process each record for some data transformation, before bulkcopy

Thanks,
Anand













