Azure
Overview
What is Azure Functions?
Get Started
Create your first function
Create a webhook function
Create an Azure connected function
Create an event processing function
How To
Plan and design
Choose between Flow, Logic Apps, Functions, and WebJobs
Choose between hosting plans
Develop
Develop function apps
Work with triggers and bindings
Create a function from the Azure portal
Testing Azure Functions
Develop and debug locally
Best practices for Azure Functions
Use Azure Functions to perform a scheduled clean-up task
Manage
Configure settings for a function app
Deploy
Continuous deployment for Azure Functions
Monitor
Monitoring Azure Functions
Resources
Pricing
MSDN forum
Stack Overflow
Service updates
Azure Functions Overview
11/22/2016 • 4 min to read • Edit on GitHub
Contributors
Matthew Henderson • Glenn Gailey • Donna Malayeri • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Sylvan Clebsch • wesmc
• Cory Fowler • Di Hei • Michael S. Collier • Christopher Anderson
Azure Functions is a solution for easily running small pieces of code, or "functions," in the cloud. You can write just
the code you need for the problem at hand, without worrying about a whole application or the infrastructure to
run it. Functions can make development even more productive, and you can use your development language of
choice, such as C#, F#, Node.js, Python or PHP. Pay only for the time your code runs and trust Azure to scale as
needed.
This topic provides a high-level overview of Azure Functions. If you want to jump right in and get started with
Azure Functions, start with Create your first Azure Function. If you are looking for more technical information
about Functions, see the developer reference.
Features
Here are some key features of Azure Functions:
Choice of language - Write functions using C#, F#, Node.js, Python, PHP, batch, bash, or any executable.
Pay-per-use pricing model - Pay only for the time spent running your code. See the Consumption hosting
plan option in the pricing section.
Bring your own dependencies - Functions supports NuGet and NPM, so you can use your favorite libraries.
Integrated security - Protect HTTP-triggered functions with OAuth providers such as Azure Active Directory,
Facebook, Google, Twitter, and Microsoft Account.
Simplified integration - Easily leverage Azure services and software-as-a-service (SaaS) offerings. See the
integrations section for some examples.
Flexible development - Code your functions right in the portal or set up continuous integration and deploy
your code through GitHub, Visual Studio Team Services, and other supported development tools.
Open-source - The Functions runtime is open-source and available on GitHub.
Functions provides templates to get you started with key scenarios, including the following:
BlobTrigger - Process Azure Storage blobs when they are added to containers. You might use this function for
image resizing.
EventHubTrigger - Respond to events delivered to an Azure Event Hub. Particularly useful in application
instrumentation, user experience or workflow processing, and Internet of Things (IoT) scenarios.
Generic webhook - Process webhook HTTP requests from any service that supports webhooks.
GitHub webhook - Respond to events that occur in your GitHub repositories. For an example, see Create a
webhook or API function.
HTTPTrigger - Trigger the execution of your code by using an HTTP request.
QueueTrigger - Respond to messages as they arrive in an Azure Storage queue. For an example, see Create
an Azure Function that binds to an Azure service.
ServiceBusQueueTrigger - Connect your code to other Azure services or on-premises services by listening to
message queues.
ServiceBusTopicTrigger - Connect your code to other Azure services or on-premises services by subscribing
to topics.
TimerTrigger - Execute cleanup or other batch tasks on a predefined schedule. For an example, see Create an
event processing function.
Azure Functions supports triggers, which are ways to start execution of your code, and bindings, which are ways to
simplify coding for input and output data. For a detailed description of the triggers and bindings that Azure
Functions provides, see Azure Functions triggers and bindings developer reference.
Integrations
Azure Functions integrates with various Azure and 3rd-party services. These services can trigger your function and
start execution, or they can serve as input and output for your code. The following service integrations are
supported by Azure Functions.
Azure DocumentDB
Azure Event Hubs
Azure Mobile Apps (tables)
Azure Notification Hubs
Azure Service Bus (queues and topics)
Azure Storage (blob, queues, and tables)
GitHub (webhooks)
On-premises (using Service Bus)
Azure Functions offers two pricing plans:
Consumption plan - When your function runs, Azure provides all of the necessary computational resources.
You don't have to worry about resource management, and you only pay for the time that your code runs. Full
pricing details are available on the Functions Pricing page.
App Service plan - Run your functions just like your web, mobile, and API apps. When you are already using
App Service for your other applications, you can run your functions on the same plan at no additional cost. Full
details can be found on the App Service Pricing page.
For more information about scaling your functions, see How to scale Azure Functions.
Next Steps
Create your first Azure Function
Jump right in and create your first function using the Azure Functions quickstart.
Azure Functions developer reference
Provides more technical information about the Azure Functions runtime and a reference for coding functions
and defining triggers and bindings.
Testing Azure Functions
Describes various tools and techniques for testing your functions.
How to scale Azure Functions
Discusses service plans available with Azure Functions, including the Consumption hosting plan, and how to
choose the right plan.
Learn more about Azure App Service
Azure Functions leverages the Azure App Service platform for core functionality like deployments, environment
variables, and diagnostics.
Create your first Azure Function
Overview
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application
platform with capabilities to implement code triggered by events occurring in other Azure services, SaaS
products, and on-premises systems. With Azure Functions, your applications scale based on demand and you pay
only for the resources you consume. Azure Functions enables you to create scheduled or triggered units of code
implemented in various programming languages. To learn more about Azure Functions, see the Azure Functions
Overview.
This topic shows you how to use the Azure Functions quickstart in the portal to create a simple "hello world"
Node.js function that is invoked by an HTTP trigger. You can also watch a short video to see how these steps are
performed in the portal.
Before you can create your first function, you need to have an active Azure account. If you don't already have an
Azure account, free accounts are available.
1. Go to the Azure Functions portal and sign in with your Azure account.
2. Type a unique Name for your new function app or accept the generated one, select your preferred Region,
then click Create + get started.
3. In the Quickstart tab, click WebHook + API and JavaScript, then click Create a function. A new
predefined Node.js function is created.
4. (Optional) At this point in the quickstart, you can choose to take a quick tour of Azure Functions features in
the portal. After you have completed or skipped the tour, you can test your new function by using the HTTP
trigger.
1. In the Develop tab, review the Code window and notice that this Node.js code expects an HTTP request
with a name value passed either in the message body or in a query string. When the function runs, this
value is returned in the response message.
2. Click Test to display the built-in HTTP test request pane for the function.
3. In the Request body text box, change the value of the name property to your name, and click Run . You
see that execution is triggered by a test HTTP request, information is written to the streaming logs, and the
"hello" response is displayed in the Output .
4. To trigger execution of the same function from another browser window or tab, copy the Function URL
value from the Develop tab and paste it in a browser address bar. Append the query string value
&name=yourname to the URL and press Enter. The same information is written to the logs and the browser
displays the "hello" response as before.
Next steps
This quickstart demonstrates a simple execution of a basic HTTP-triggered function. To learn more about using
Azure Functions in your apps, see the following topics:
Azure Functions is an event-driven, compute-on-demand experience that enables you to create scheduled or
triggered units of code implemented in various programming languages. To learn more about Azure Functions, see
the Azure Functions Overview.
This topic shows you how to create a Node.js function that is invoked by a GitHub webhook. The new function is
created based on a pre-defined template in the Azure Functions portal. You can also watch a short video to see how
these steps are performed in the portal.
1. Go to the Azure Functions portal and sign in with your Azure account.
2. If you have an existing function app to use, select it from Your function apps then click Open . To create a
function app, type a unique Name for your new function app or accept the generated one, select your
preferred Region , then click Create + get started .
3. In your function app, click + New Function > GitHub Webhook - Node > Create . This step creates a
function with a default name that is based on the specified template.
4. In Develop , note the sample express.js function in the Code window. This function receives a GitHub
request from an issue comment webhook, logs the issue text and sends a response to the webhook as
New GitHub comment: <Your issue comment text> .
5. Copy the Function URL and GitHub Secret values. You need these values to create the webhook in
GitHub.
6. Scroll down to Run , note the predefined JSON body of an issue comment in the Request body, then click
Run .
You can always test a new template-based function right in the Develop tab by supplying any expected
body JSON data and clicking the Run button. In this case, the template has a predefined body for an issue
comment.
Next, you will create the actual webhook in your GitHub repository.
3. Paste your function's URL and secret into Payload URL and Secret, then click Let me select individual
events, select Issue comment and click Add webhook.
At this point, the GitHub webhook is configured to trigger your function when a new issue comment is added.
Now, it's time to test it out.
2. In the issue, type a comment and click Comment. At this point, you can go back to your new webhook in
GitHub and under Recent Deliveries see that a webhook request was sent and that the body of response is
New GitHub comment: <Your issue comment text> .
3. Back in the Functions portal, scroll down to the logs and see that the function has been triggered and the
value New GitHub comment: <Your issue comment text> is written to the streaming logs.
Next steps
See these topics for more information about Azure Functions.
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application
platform with capabilities to implement code triggered by events occurring in other Azure services, SaaS products,
and on-premises systems. With Azure Functions, your applications scale based on demand and you pay only for
the resources you consume. Azure Functions provides an intuitive, browser-based user interface allowing you to
create scheduled or triggered pieces of code implemented in a variety of programming languages.
This getting started video shows how to create a function app, modify the code, and interact with the
triggers and bindings.
In this short video, you learn how to create an Azure Function that listens to messages on an Azure Queue and
copies the messages to an Azure Blob.
5. Verify that the function works by viewing activity in the log. You might have to click the Logs link in the
upper right corner to display the log pane.
function myQueueItem() {
    return {
        msg: "some message goes here",
        time: "time goes here"
    };
}
2. Modify the existing function code to call the code added in Step 1. Insert the following code around line 9 of
the function, after the if statement.
This code creates a myQueueItem and sets its time property to the current timestamp. It then adds the
new queue item to the context's myQueue binding.
using System;
This code adds two classes, TableItem and QItem, that you use to read and write to queues. Additionally,
the Run function has been modified to accept a QItem instead of a string, along with the TraceWriter
parameter.
9. Click Save.
10. Verify that the code works by viewing the function's logs and in Visual Studio. To verify in Visual Studio, use the
Cloud Explorer to navigate to the functionbindings Azure Table and verify there are rows in it.
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions is an event-driven, compute-on-demand experience that enables you to create scheduled or
triggered units of code implemented in a variety of programming languages. To learn more about Azure Functions,
see the Azure Functions Overview.
This topic shows you how to create a new function in C# that executes based on an event timer to add messages to
a storage queue.
Prerequisites
Before you can create a function, you need to have an active Azure account. If you don't already have an Azure
account, free accounts are available.
1. Go to the Azure Functions portal and sign in with your Azure account.
2. If you have an existing function app to use, select it from Your function apps then click Open. To create a new
function app, type a unique Name for your new function app or accept the generated one, select your preferred
Region, then click Create + get started.
3. In your function app, click + New Function > TimerTrigger - C# > Create. This creates a function with a
default name that is run on the default schedule of once every minute.
4. In your new function, click the Integrate tab > New Output > Azure Storage Queue > Select.
5. In Azure Storage Queue output, select an existing Storage account connection, or create a new one,
then click Save.
6. Back in the Develop tab, replace the existing C# script in the Code window with the following code:
using System;

public static void Run(TimerInfo myTimer, out string outputQueueItem, TraceWriter log)
{
    // Add a new scheduled message to the queue.
    outputQueueItem = $"Ping message added to the queue at: {DateTime.Now}.";
}
This code adds a new message to the queue with the current date and time when the function is executed.
7. Click Save and watch the Logs window for the next function execution.
8. (Optional) Navigate to the storage account and verify that messages are being added to the queue.
9. Go back to the Integrate tab and change the schedule field to 0 0 * * * * . The function now runs once every
hour.
This is a very simplified example of both a timer trigger and a storage queue output binding. For more information,
see both the Azure Functions timer trigger and the Azure Functions triggers and bindings for Azure Storage topics.
Next steps
See these topics for more information about Azure Functions.
This article compares and contrasts the following services in the Microsoft cloud, which can all solve integration
problems and automation of business processes:
Microsoft Flow
Azure Logic Apps
Azure Functions
Azure App Service WebJobs
All these services are useful when "gluing" together disparate systems. They can all define input, actions, conditions,
and output. You can run each of them on a schedule or trigger. However, each service adds a unique set of value,
and comparing them is not a question of "Which service is the best?" but one of "Which service is best suited for
this situation?" Often, a combination of these services is the best way to rapidly build a scalable, full featured
integration solution.
Flow empowers any office worker to perform simple integrations (e.g. get an SMS for important emails) without
going through developers or IT. On the other hand, Logic Apps can enable advanced or mission-critical integrations
(e.g. B2B processes) where enterprise-level DevOps and security practices are required. It is typical for a business
workflow to grow in complexity over time. Accordingly, you can start with a flow at first, then convert it to a logic
app as needed.
The following table helps you determine whether Flow or Logic Apps is best for a given integration.
Both are built on Azure App Service and enjoy features such as source control, authentication, and monitoring.
Both are developer-focused services.
Both support standard scripting and programming languages.
Both have NuGet and NPM support.
Functions is the natural evolution of WebJobs in that it takes the best things about WebJobs and improves upon
them. The improvements include:
The following table summarizes the differences between Functions and WebJobs:
|                           | FUNCTIONS                                | WEBJOBS                  |
|---------------------------|------------------------------------------|--------------------------|
| Pricing                   | Pay-per-use or part of App Service plan  | Part of App Service plan |
| Trigger events            | Timer, Azure DocumentDB, Azure Event Hubs, HTTP/WebHook (GitHub, Slack), Azure App Service Mobile Apps, Azure Notification Hubs, Azure Service Bus, Azure Storage | Azure Storage, Azure Service Bus |
| In-browser development    | x                                        |                          |
| PowerShell (experimental) | x                                        |                          |
| C#                        | x                                        | x                        |
| F#                        | x                                        |                          |
| Bash (experimental)       | x                                        |                          |
| PHP (experimental)        | x                                        |                          |
| Python (experimental)     | x                                        |                          |
| JavaScript                | x                                        | x                        |
Whether to use Functions or WebJobs ultimately depends on what you're already doing with App Service. If you
have an App Service app for which you want to run code snippets, and you want to manage them together in the
same DevOps environment, you should use WebJobs. If you want to run code snippets for other Azure services or
even 3rd-party apps, or if you want to manage your integration code snippets separately from your App Service
apps, or if you want to call your code snippets from a Logic app, you should take advantage of all the
improvements in Functions.
You can call a logic app in a flow. You can also call a function in a logic app, and a logic app in a function. The
integration between Flow, Logic Apps, and Functions continues to improve over time. You can build something in
one service and use it in the other services. Therefore, any investment you make in these three technologies is
worthwhile.
Next Steps
Get started with each of the services by creating your first flow, logic app, function app, or WebJob. Click any of the
following links:
Or, get more information on these integration services with the following links:
Leveraging Azure Functions & Azure App Service for integration scenarios by Christopher Anderson
Integrations Made Simple by Charles Lamanna
Logic Apps Live Webcast
Microsoft Flow Frequently asked questions
Azure WebJobs documentation resources
Scaling Azure Functions
Introduction
The Azure Functions platform allocates compute power when your code is running, scales out as necessary to
handle load, and then scales in when code is not running. This means you don’t pay for idle VMs or have to
reserve capacity before it is needed. The mechanism for this capability is the Consumption service plan. This
article provides an overview of how the Consumption service plan works.
If you are not yet familiar with Azure Functions, see the Azure Functions overview article.
Consumption plan
In the Consumption plan, your Function Apps are assigned to a compute processing instance. If needed, more
instances are added or removed dynamically. Moreover, your functions run in parallel, minimizing the total time
needed to process requests. Execution time for each function is aggregated by the containing Function App. Cost
is driven by memory size and total execution time across all functions in a Function App as measured in
gigabyte-seconds. This is an excellent option if your compute needs are intermittent or your job times tend to be
very short as it allows you to only pay for compute resources when they are actually in use.
Runtime scaling
The Azure Functions platform uses a central listener to evaluate compute needs based on the configured triggers
and to decide when to scale out or scale in. The central listener constantly processes hints for memory
requirements and trigger-specific data points. For example, in the case of an Azure Queue Storage trigger, the
data points include queue length and queue time for the oldest entry.
The unit of scaling is the Function App. Scaling out in this case means adding more instances of a Function App.
Inversely, as compute demand is reduced, Function App instances are removed, eventually scaling in to zero
when none are running.
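The scale controller's actual heuristics are not public, but the kind of decision described above can be illustrated with a sketch. The thresholds and inputs here are invented for illustration only.

```javascript
// Illustrative only — NOT the real algorithm. Shows the kind of decision the
// central listener could make from queue-trigger data points (queue length
// and age of the oldest message), with invented thresholds.
function scaleDecision(instances, queueLength, oldestMessageAgeSeconds) {
    const backlogPerInstance = queueLength / Math.max(instances, 1);

    if (queueLength === 0) {
        // Nothing to process: scale in, eventually to zero instances.
        return Math.max(instances - 1, 0);
    }
    if (backlogPerInstance > 100 || oldestMessageAgeSeconds > 60) {
        return instances + 1;  // falling behind: scale out
    }
    return instances;          // keeping up: hold steady
}
```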
Billing model
Billing for the Consumption service plan is described in detail on the Azure Functions pricing page. Usage is
reported per Function App, only for time when code is being executed. The following are units for billing:
Resource consumption in GB-s (gigabyte-seconds), computed as a combination of memory size and
execution time for all functions running in a Function App.
Executions, counted each time a function is executed in response to an event triggered by a binding.
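A worked example of the resource-consumption unit described above. The numbers are illustrative; consult the Azure Functions pricing page for actual rates and rounding rules.

```javascript
// GB-s = memory size × execution time, summed across all executions.
function resourceConsumptionGbSeconds(memoryGb, executionSeconds, executions) {
    return memoryGb * executionSeconds * executions;
}

// 200,000 executions, each using 0.5 GB of memory for 1 second:
const gbs = resourceConsumptionGbSeconds(0.5, 1, 200000);  // 100,000 GB-s
```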
Azure Functions developer reference
In Azure Functions, all functions share a few core technical concepts and components, regardless of the
language or binding you use. Before you jump into learning details specific to a given language or binding, be
sure to read through this overview that applies to all of them.
This article assumes that you've already read the Azure Functions overview and are familiar with WebJobs SDK
concepts such as triggers, bindings, and the JobHost runtime. Azure Functions is based on the WebJobs SDK.
Function code
A function is the primary concept in Azure Functions. You write code for a function in a language of your choice
and save the code and configuration files in the same folder. The configuration file is named function.json and
contains JSON configuration data. Various languages are supported, and each one has a slightly different
experience optimized to work best for that language.
The function.json file defines the function bindings and other configuration settings. The runtime uses this file
to determine the events to monitor and how to pass data into and return data from function execution. The
following is an example function.json file.
{
"disabled":false,
"bindings":[
// ... bindings here
{
"type": "bindingType",
"direction": "in",
"name": "myParamName",
// ... more depending on binding
}
]
}
Set the disabled property to true to prevent the function from being executed.
The bindings property is where you configure both triggers and bindings. Each binding shares a few common
settings, plus some settings that are specific to a particular type of binding. Every binding requires a type, a
direction, and a name.
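As the sample function.json above shows, each binding carries at least a type, a direction, and a name. A small validation sketch for these common settings:

```javascript
// Check that every binding in a function.json object carries the settings
// common to all bindings: "type", "direction", and "name" (per the sample
// configuration shown above).
function validateBindings(functionJson) {
    const required = ['type', 'direction', 'name'];
    return (functionJson.bindings || []).every(binding =>
        required.every(key =>
            typeof binding[key] === 'string' && binding[key].length > 0));
}
```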
Function app
A function app is made up of one or more individual functions that are managed together by Azure App
Service. All of the functions in a function app share the same pricing plan, continuous deployment, and runtime
version. Functions written in multiple languages can all share the same function app. Think of a function app as
a way to organize and collectively manage your functions.
To facilitate HTTP triggers, there is also a web host that is designed to sit in front of the script host in
production scenarios. Having two hosts helps to isolate the script host from the front end traffic managed by
the web host.
Folder Structure
The code for all of the functions in a given function app lives in a root folder that contains a host configuration
file and one or more subfolders, each of which contains the code for a separate function, as in the following
example:
wwwroot
| - host.json
| - mynodefunction
| | - function.json
| | - index.js
| | - node_modules
| | | - ... packages ...
| | - package.json
| - mycsharpfunction
| | - function.json
| | - run.csx
The host.json file contains some runtime-specific configuration and sits in the root folder of the function app.
For information on settings that are available, see host.json in the WebJobs.Script repository wiki.
Each function has a folder that contains one or more code files, the function.json configuration and other
dependencies.
When setting up a project for deploying functions to a function app in Azure App Service, you can treat this
folder structure as your site code. You can use existing tools like continuous integration and deployment, or
custom deployment scripts for doing deploy-time package installation or code transpilation.
NOTE
Make sure to deploy your host.json file and function folders directly to the wwwroot folder. Do not include the
wwwroot folder in your deployments. Otherwise, you end up with wwwroot\wwwroot folders.
How to update function app files
The function editor built into the Azure portal lets you update the function.json file and the code file for a
function. To upload or update other files such as package.json, project.json, or dependencies, you have to use
other deployment methods.
Function apps are built on App Service, so all the deployment options available to standard web apps are also
available for function apps. Here are some methods you can use to upload or update function app files.
To use App Service Editor
1. In the Azure Functions portal, click Function app settings.
2. In the Advanced Settings section, click Go to App Service Settings.
3. Click App Service Editor in App Menu Nav under DEVELOPMENT TOOLS.
4. Click Go.
After App Service Editor loads, you'll see the host.json file and function folders under wwwroot.
5. Open files to edit them, or drag and drop from your development machine to upload files.
To use the function app's SCM (Kudu) endpoint
1. Navigate to: https://<function_app_name>.scm.azurewebsites.net .
2. Click Debug Console > CMD .
3. Navigate to D:\home\site\wwwroot\ to update host.json or D:\home\site\wwwroot\<function_name> to update
a function's files.
4. Drag-and-drop a file you want to upload into the appropriate folder in the file grid. There are two areas in
the file grid where you can drop a file. For .zip files, a box appears with the label "Drag here to upload and
unzip." For other file types, drop in the file grid but outside the "unzip" box.
To use FTP
1. Follow the instructions here to get FTP configured.
2. When you're connected to the function app site, copy an updated host.json file to /site/wwwroot or copy
function files to /site/wwwroot/<function_name> .
To use continuous deployment
Follow the instructions in the topic Continuous deployment for Azure Functions.
Parallel execution
When multiple triggering events occur faster than a single-threaded function runtime can process them, the
runtime may invoke the function multiple times in parallel. If a function app is using the Consumption hosting
plan, the function app could scale out automatically. Each instance of the function app, whether the app runs on
the Consumption hosting plan or a regular App Service hosting plan, might process concurrent function
invocations in parallel using multiple threads. The maximum number of concurrent function invocations in
each function app instance varies based on the memory size of the function app.
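The concurrency described above can be illustrated with a small simulation: several events arriving together are all in flight at once, so function code must be safe to run concurrently (for example, avoid unsynchronized shared state).

```javascript
// Illustration of parallel invocations within one instance. The shared
// counters exist only to observe the overlap.
let inFlight = 0;
let peakConcurrency = 0;

async function handleEvent(event) {
    inFlight += 1;
    peakConcurrency = Math.max(peakConcurrency, inFlight);
    await new Promise(resolve => setTimeout(resolve, 10));  // simulated work
    inFlight -= 1;
    return 'processed ' + event;
}

// Five events arriving together are processed concurrently, not one by one.
const allDone = Promise.all([1, 2, 3, 4, 5].map(e => handleEvent(e)));
```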
Repositories
The code for Azure Functions is open source and stored in GitHub repositories:
Bindings
Here is a table of all supported bindings.
Reporting Issues
Next steps
For more information, see the following resources:
The C# experience for Azure Functions is based on the Azure WebJobs SDK. Data flows into your C# function via
method arguments. Argument names are specified in function.json , and there are predefined names for
accessing things like the function logger and cancellation tokens.
This article assumes that you've already read the Azure Functions developer reference.
Binding to arguments
The various bindings are bound to a C# function via the name property in the function.json configuration. Each
binding has its own supported types, which are documented per binding; for instance, a blob trigger can support a
string, a POCO, or several other types. You can use the type that best suits your needs. A POCO object must
have a getter and setter defined for each property.
Logging
To log output to your streaming logs in C#, include an argument typed as TraceWriter. We recommend that
you name it log. Avoid using Console.Write in Azure Functions.
Async
To make a function asynchronous, use the async keyword and return a Task object.
public async static Task ProcessQueueMessageAsync(
    string blobName,
    Stream blobInput,
    Stream blobOutput,
    CancellationToken token)
{
    // Copy the blob input to the blob output, honoring cancellation requests.
    await blobInput.CopyToAsync(blobOutput, 4096, token);
}
Cancellation Token
In certain cases, you may have operations which are sensitive to being shut down. While it's always best to write
code that can handle crashing, in cases where you want to handle graceful shutdown requests, define a
CancellationToken-typed argument. A CancellationToken is provided if a host shutdown is triggered.
Importing namespaces
If you need to import namespaces, you can do so as usual, with the using clause.
using System.Net;
using System.Threading.Tasks;
The following namespaces are automatically imported and are therefore optional:
System
System.Collections.Generic
System.IO
System.Linq
System.Net.Http
System.Threading.Tasks
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
To reference external framework assemblies, use the #r "AssemblyName" directive, as in the following example:

#r "System.Web.Http"
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
The following assemblies are automatically added by the Azure Functions hosting environment:
mscorlib
System
System.Core
System.Xml
System.Net.Http
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
Microsoft.Azure.WebJobs.Extensions
System.Web.Http
System.Net.Http.Formatting
In addition, the following assemblies are special-cased and may be referenced by simple name (e.g.
#r "AssemblyName"):
Newtonsoft.Json
Microsoft.WindowsAzure.Storage
Microsoft.ServiceBus
Microsoft.AspNet.WebHooks.Receivers
Microsoft.AspNet.WebHooks.Common
Microsoft.Azure.NotificationHubs
If you need to reference a private assembly, you can upload the assembly file into a bin folder relative to your
function and reference it by using the file name (e.g. #r "MyAssembly.dll" ). For information on how to upload
files to your function folder, see the following section on package management.
Package management
To use NuGet packages in a C# function, upload a project.json file to the function's folder in the function
app's file system. Here is an example project.json file that adds a reference to Microsoft.ProjectOxford.Face
version 1.1.0:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.ProjectOxford.Face": "1.1.0"
}
}
}
}
Only the .NET Framework 4.6 is supported, so make sure that your project.json file specifies net46 as shown
here.
When you upload a project.json file, the runtime gets the packages and automatically adds references to the
package assemblies. You don't need to add #r "AssemblyName" directives. Just add the required using
statements to your run.csx file to use the types defined in the NuGet packages.
To upload a project.json file, use one of the methods described in the How to update function app files
section of the Azure Functions developer reference topic. After the project.json file is uploaded, package
installation output is displayed in your function's streaming log.
Environment variables
To get an environment variable or an app setting value, use System.Environment.GetEnvironmentVariable .
Reusing .csx code
You can reuse code defined in other .csx files in your run.csx file by using #load directives. Example run.csx:
#load "mylogger.csx"
Example mylogger.csx:
public static void MyLogger(TraceWriter log, string logtext)
{
log.Verbose(logtext);
}
Using a shared .csx file is a common pattern when you want to strongly type the arguments passed between functions
using a POCO object. In the following simplified example, an HTTP trigger and a queue trigger share a POCO object
named Order to strongly type the order data:
#load "..\shared\order.csx"
using System.Net;
using System.Threading.Tasks;

// Signature reconstructed; the binding names req and outputQueueItem are assumed
// to match this function's function.json.
public static async Task<HttpResponseMessage> Run(Order req, IAsyncCollector<Order> outputQueueItem)
{
    if (req.orderId == null)
    {
        return new HttpResponseMessage(HttpStatusCode.BadRequest);
    }
    else
    {
        await outputQueueItem.AddAsync(req);
        return new HttpResponseMessage(HttpStatusCode.OK);
    }
}
#load "..\shared\order.csx"
using System;

// Signature reconstructed; the trigger and output binding names are assumed.
public static void Run(Order myQueueItem, out Order outputQueueItem)
{
    outputQueueItem = myQueueItem;
}
Example order.csx:
public class Order
{
    public string orderId { get; set; }
    public string custName { get; set; }
    public string custAddress { get; set; }
    public string custEmail { get; set; }
    public string cartId { get; set; }
}
#load "..\shared\mylogger.csx" loads a file located in the shared folder at the same level as the function folder,
that is, directly under wwwroot.
The #load directive works only with .csx (C# script) files, not with .cs files.
Next steps
For more information, see the following resources:
Azure Functions NodeJS developer reference
Contributors
Christopher Anderson • wesmc • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Sylvan Clebsch • Glenn Gailey • Tom Dykstra
• Cory Fowler
The Node/JavaScript experience for Azure Functions makes it easy to export a function which is passed a
context object for communicating with the runtime, and for receiving and sending data via bindings.
This article assumes that you've already read the Azure Functions developer reference.
Exporting a function
All JavaScript functions must export a single function via module.exports for the runtime to find the function
and run it. This function must always include a context object.
Bindings of direction === "in" are passed along as function arguments, meaning you can use arguments to
dynamically handle new inputs (for example, by using arguments.length to iterate over all your inputs). This
functionality is very convenient if you only have a trigger with no additional inputs, as you can predictably access
your trigger data without referencing your context object.
The arguments are always passed to the function in the order they occur in function.json, even if you don't
specify them in your exports statement. For example, if you have function(context, a, b) and change it to
function(context, a) , you can still get the value of b in function code by referring to arguments[2] .
All bindings, regardless of direction, are also passed along on the context object (see below).
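The positional behavior described above can be sketched outside the Functions runtime with a plain function and a stub context object (the stub, and the binding names a and b, are assumptions for illustration):

```javascript
// Stand-in for a function whose function.json declares bindings "a" then "b".
function run(context, a) {
  // "b" was dropped from the signature, but it is still passed positionally.
  var b = arguments[2];
  context.log('a=' + a + ', b=' + b);
  context.done(null, { sawB: b });
}

// Stub context simulating the runtime's invocation:
var result;
var stubContext = {
  log: function () {},
  done: function (err, propertyBag) { result = propertyBag; }
};
run(stubContext, 'first', 'second');
// result.sawB holds the second binding's value
```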
context object
The runtime uses a context object to pass data to and from your function and to let you communicate with the
runtime.
The context object is always the first parameter to a function and should always be included, because it has
methods such as context.done and context.log that are required to use the runtime correctly. You can name
the object whatever you like (for example, ctx or c ).
// You must include a context, but other arguments are optional
module.exports = function(context) {
// function logic goes here :)
};
context.bindings
The context.bindings object collects all your input and output data. The data is added onto the
context.bindings object via the name property of the binding. For instance, given the following binding
definition in function.json, you can access the contents of the queue via context.bindings.myInput .
{
"type":"queue",
"direction":"in",
"name":"myInput"
...
}
// myInput contains the input data which may have properties such as "name"
var author = context.bindings.myInput.name;
// Similarly, you can set your output data
context.bindings.myOutput = {
some_text: 'hello world',
a_number: 1 };
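Outside the runtime, the same read-input/write-output flow can be exercised against a stub context (the stub is an assumption; in Azure the runtime supplies the real context):

```javascript
function run(context) {
  // Read the input binding by its function.json name...
  var author = context.bindings.myInput.name;
  // ...and write the output binding the same way.
  context.bindings.myOutput = { some_text: 'hello ' + author, a_number: 1 };
  context.done();
}

// Stub standing in for the runtime-provided context:
var stub = { bindings: { myInput: { name: 'Carla' } }, done: function () {} };
run(stub);
// stub.bindings.myOutput now holds the output data
```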
context.done([err],[propertyBag])
The context.done function tells the runtime that your function has finished running. It's important to call it when
your function completes; if you don't, the runtime never knows that your function finished.
The context.done function allows you to pass back a user-defined error to the runtime, as well as a property bag
of properties which will overwrite the properties on the context.bindings object.
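Both behaviors can be sketched with a stub whose done method models the description above (the stub itself is an assumption, not the runtime's implementation):

```javascript
function succeed(context) {
  // The property bag overwrites matching entries on context.bindings.
  context.done(null, { myOutput: 'from property bag' });
}
function fail(context) {
  // Passing an error signals a failed execution.
  context.done(new Error('boom'));
}

// Minimal stub modeling the documented done(err, propertyBag) behavior:
function makeStub() {
  var ctx = { bindings: {} };
  ctx.done = function (err, bag) {
    ctx.error = err || null;
    if (bag) Object.assign(ctx.bindings, bag);
  };
  return ctx;
}

var ok = makeStub();
succeed(ok);   // ok.bindings.myOutput is overwritten from the property bag
var bad = makeStub();
fail(bad);     // bad.error carries the user-defined error
```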
context.log(message)
The context.log method allows you to output log statements that are correlated together for logging purposes.
If you use console.log , your messages will only show for process level logging, which isn't as useful.
The context.log method supports the same parameter formatting that the Node util.format method supports, so you
can pass a format string followed by values to substitute.
HTTP triggers: context.req and context.res
For HTTP triggers, the request and response are also available directly on the context object:
// You can access your http request off of the context ...
if(context.req.body.emoji === ':pizza:') context.log('Yay!');
// and also set your http response
context.res = { status: 202, body: 'You successfully ordered more coffee!' };
You can include packages in your function by uploading a package.json file to your function's folder in the
function app's file system. For file upload instructions, see the How to update function app files section of
the Azure Functions developer reference topic.
You can also use npm install in the function app's SCM (Kudu) command line interface:
Once the packages you need are installed, you import them into your function in the usual way (that is, via
require('packagename') ):
var _ = require('underscore');
module.exports = function(context) {
    // Using our imported underscore.js library
    var matched_names = _
        .where(context.bindings.myInput.names, {first: 'Carla'});
    context.done();
};
Environment variables
To get an environment variable or an app setting value, use process.env , as shown in the following code
example:
module.exports = function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    // Log an app setting's value; the setting name here is illustrative
    context.log(GetEnvironmentVariable("WEBSITE_SITE_NAME"));
    context.done();
};
function GetEnvironmentVariable(name)
{
return name + ": " + process.env[name];
}
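A quick local check of this helper, using a made-up setting name ( MY_TEST_SETTING ) to stand in for a real app setting:

```javascript
function GetEnvironmentVariable(name) {
  return name + ": " + process.env[name];
}

process.env.MY_TEST_SETTING = 'demo-value'; // simulate an app setting locally
var report = GetEnvironmentVariable('MY_TEST_SETTING');
// report is 'MY_TEST_SETTING: demo-value'
```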
TypeScript/CoffeeScript support
There isn't yet any direct support for compiling TypeScript or CoffeeScript via the runtime, so that compilation
needs to be handled outside the runtime, at deployment time.
Next steps
For more information, see the following resources:
Azure Functions F# developer reference
Contributors
Sylvan Clebsch • wesmc • Ralph Squillace • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin
F# for Azure Functions is a solution for easily running small pieces of code, or "functions," in the cloud. Data
flows into your F# function via function arguments. Argument names are specified in function.json , and there
are predefined names for accessing things like the function logger and cancellation tokens.
This article assumes that you've already read the Azure Functions developer reference.
When you use an .fsx script for an Azure Function, commonly required assemblies are automatically included for
you, allowing you to focus on the function rather than on "boilerplate" code.
Binding to arguments
Each binding supports some set of arguments, as detailed in the Azure Functions triggers and bindings
developer reference. For example, one of the argument bindings a blob trigger supports is a POCO, which can be
expressed using an F# record. For example, a blob-triggered function might copy a string blob input into an
output passed by reference (a minimal sketch; the argument types here are illustrative):
let Run(blob: string, output: byref<string>) =
    output <- blob
Your F# Azure Function will take one or more arguments. When we talk about Azure Functions arguments, we
refer to input arguments and output arguments. An input argument is exactly what it sounds like: input to your
F# Azure Function. An output argument is mutable data or a byref<> argument that serves as a way to pass
data back out of your function.
In the example above, blob is an input argument, and output is an output argument. Notice that we used
byref<> for output (there's no need to add the [<Out>] annotation). Using a byref<> type allows your
function to change which record or object the argument refers to.
When an F# record is used as an input type, the record definition must be marked with [<CLIMutable>] in order
to allow the Azure Functions framework to set the fields appropriately before passing the record to your
function. Under the hood, [<CLIMutable>] generates setters for the record properties. For example:
[<CLIMutable>]
type TestObject =
{ SenderName : string
Greeting : string }
An F# class can also be used for input and output arguments; class properties typically need both getters and
setters. For example:
type Item() =
    member val Id = "" with get,set
    member val Text = "" with get,set
Logging
To log output to your streaming logs in F#, your function should take an argument of type TraceWriter . For
consistency, we recommend that this argument is named log .
Async
The async workflow can be used, but the result needs to return a Task . This can be done with
Async.StartAsTask .
Cancellation Token
If your function needs to handle shutdown gracefully, you can give it a CancellationToken argument, which can be
combined with async .
Importing namespaces
Namespaces can be opened in the usual way:
open System.Net
open System.Threading.Tasks
The following namespaces are automatically opened:
System
System.Collections.Generic
System.IO
System.Linq
System.Net.Http
System.Threading.Tasks
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
If you need to reference additional framework assemblies, add them by using the #r "AssemblyName" directive, for example:
#r "System.Web.Http"
open System.Net
open System.Net.Http
open System.Threading.Tasks
The following assemblies are automatically added by the Azure Functions hosting environment:
mscorlib
System
System.Core
System.Xml
System.Net.Http
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
Microsoft.Azure.WebJobs.Extensions
System.Web.Http
System.Net.Http.Formatting
In addition, the following assemblies are special-cased and may be referenced by simple name (e.g.,
#r "AssemblyName" ):
Newtonsoft.Json
Microsoft.WindowsAzure.Storage
Microsoft.ServiceBus
Microsoft.AspNet.WebHooks.Receivers
Microsoft.AspNet.WebHooks.Common
If you need to reference a private assembly, you can upload the assembly file into a bin folder relative to your
function and reference it by using the file name (e.g. #r "MyAssembly.dll" ). For information on how to upload
files to your function folder, see the following section on package management.
Editor Prelude
An editor that supports F# Compiler Services will not be aware of the namespaces and assemblies that Azure
Functions automatically includes. As such, it can be useful to include a prelude that helps the editor find the
assemblies you are using, and to explicitly open namespaces. For example:
#if !COMPILED
#I "../../bin/Binaries/WebJobs.Script.Host"
#r "Microsoft.Azure.WebJobs.Host.dll"
#endif
open System
open Microsoft.Azure.WebJobs.Host
When Azure Functions executes your code, it processes the source with COMPILED defined, so the editor prelude
will be ignored.
Package management
To use NuGet packages in an F# function, add a project.json file to the function's folder in the function
app's file system. Here is an example project.json file that adds a NuGet package reference to
Microsoft.ProjectOxford.Face version 1.1.0:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.ProjectOxford.Face": "1.1.0"
}
}
}
}
Only the .NET Framework 4.6 is supported, so make sure that your project.json file specifies net46 as shown
here.
When you upload a project.json file, the runtime gets the packages and automatically adds references to the
package assemblies. You don't need to add #r "AssemblyName" directives. Just add the required open
statements to your .fsx file.
You may wish to put automatically referenced assemblies in your editor prelude, to improve your editor's
interaction with F# Compiler Services.
Environment variables
To get an environment variable or an app setting value, use System.Environment.GetEnvironmentVariable , for
example:
open System.Environment
run.fsx
#load "logger.fsx"
logger.fsx
Paths provided to the #load directive are relative to the location of your .fsx file.
#load "..\shared\mylogger.fsx" loads a file located in the shared folder at the same level as the function
folder, that is, directly under wwwroot .
The #load directive only works with .fsx (F# script) files, and not with .fs files.
Next steps
For more information, see the following resources:
F# Guide
Best Practices for Azure Functions
Azure Functions developer reference
Azure Functions C# developer reference
Azure Functions NodeJS developer reference
Azure Functions triggers and bindings
Azure Functions testing
Azure Functions scaling
Azure Functions triggers and bindings developer reference
11/22/2016 • 9 min to read
Contributors
Christopher Anderson • Andy Pasic • Matthew Henderson • wesmc • Cephas Lin • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil
• Sylvan Clebsch • Glenn Gailey • Tom Dykstra
This topic provides general reference for triggers and bindings. It includes some of the advanced binding features and syntax
supported by all binding types.
If you are looking for detailed information around configuring and coding a specific type of trigger or binding, you may want
to click one of the triggers or bindings listed below instead:
These articles assume that you've read the Azure Functions developer reference, and the C#, F#, or Node.js developer
reference articles.
Overview
Triggers are event responses used to trigger your custom code. They allow you to respond to events across the Azure
platform or on-premises. Bindings represent the necessary metadata used to connect your code to the desired trigger or
associated input or output data. The function.json file for each function contains all related bindings. There is no limit to the
number of input and output bindings a function can have. However, only a single trigger binding is supported for each
function.
To get a better idea of the different bindings you can integrate with your Azure Function app, refer to the following table.
To better understand triggers and bindings in general, suppose you want to execute some code to process a new item
dropped into an Azure Storage queue. Azure Functions provides an Azure Queue trigger to support this. To monitor the
queue, you need the following information: the storage account where the queue lives, the name of the queue, and the
credentials (a connection string) used to access it.
A queue trigger binding contains this information for an Azure function. Here is an example function.json containing a queue
trigger binding.
{
"bindings": [
{
"name": "myNewUserQueueItem",
"type": "queueTrigger",
"direction": "in",
"queueName": "queue-newusers",
"connection": "MY_STORAGE_ACCT_APP_SETTING"
}
],
"disabled": false
}
Your code may send different types of output depending on how the new queue item is processed. For example, you might
want to write a new record to an Azure Storage table. To accomplish this, you can set up an output binding to an Azure
Storage table. Here is an example function.json that includes a storage table output binding that could be used with a queue
trigger.
{
"bindings": [
{
"name": "myNewUserQueueItem",
"type": "queueTrigger",
"direction": "in",
"queueName": "queue-newusers",
"connection": "MY_STORAGE_ACCT_APP_SETTING"
},
{
"type": "table",
"name": "myNewUserTableBinding",
"tableName": "newUserTable",
"connection": "MY_TABLE_STORAGE_ACCT_APP_SETTING",
"direction": "out"
}
],
"disabled": false
}
The following C# function responds to a new item being dropped into the queue and writes a new user entry into an Azure
Storage table.
#r "Newtonsoft.Json"
using System;
using Newtonsoft.Json;

// Signature reconstructed; parameter names match the binding names above, and
// Person is assumed to be a POCO with the table's columns.
public static async Task Run(string myNewUserQueueItem, IAsyncCollector<Person> myNewUserTableBinding)
{
    dynamic order = JsonConvert.DeserializeObject(myNewUserQueueItem);
    await myNewUserTableBinding.AddAsync(
        new Person() {
            PartitionKey = "Test",
            RowKey = Guid.NewGuid().ToString(),
            Name = order.name,
            Address = order.address,
            MobileNumber = order.mobileNumber }
    );
}
For more code examples and more specific information regarding Azure storage types that are supported, see Azure
Functions triggers and bindings for Azure Storage.
To use the more advanced binding features in the Azure portal, click the Advanced editor option on the Integrate tab of
your function. The advanced editor allows you to edit the function.json directly in the portal.
Random GUIDs
Azure Functions provides a syntax to generate random GUIDs in your bindings. The following binding configuration
writes output to a new blob with a unique name in an Azure Storage container:
{
"type": "blob",
"name": "blobOutput",
"direction": "out",
"path": "my-output-container/{rand-guid}"
}
You can also have a C# function return its output naturally, without using an out parameter in the function
signature, by naming the output binding $return in function.json. This also works for async functions, and it can
be combined with multiple output parameters by designating a single output with $return .
The Azure Functions runtime resolves app settings to values when the app setting name is enclosed in percent signs,
%your app setting% . The following Twilio binding uses an app setting named TWILIO_ACCT_PHONE for the from field of the
binding.
{
"type": "twilioSms",
"name": "$return",
"accountSid": "TwilioAccountSid",
"authToken": "TwilioAuthToken",
"to": "{mobileNumber}",
"from": "%TWILIO_ACCT_PHONE%",
"body": "Thank you {name}, your order was received Node.js",
"direction": "out"
},
Parameter binding
Instead of a static configuration setting for your output binding properties, you can configure the settings to be dynamically
bound to data that is part of your trigger's input binding. Consider a scenario where new orders are processed using an
Azure Storage queue. Each new queue item is a JSON string containing at least the following properties:
{
"name" : "Customer Name",
"address" : "Customer's Address",
"mobileNumber" : "Customer's mobile number in the format +1XXXYYYZZZZ"
}
You might want to send the customer an SMS text message using your Twilio account as an update that the order was
received. You can configure the body and to field of your Twilio output binding to be dynamically bound to the name and
mobileNumber that were part of the input as follows.
{
"name": "myNewOrderItem",
"type": "queueTrigger",
"direction": "in",
"queueName": "queue-newOrders",
"connection": "orders_STORAGE"
},
{
"type": "twilioSms",
"name": "$return",
"accountSid": "TwilioAccountSid",
"authToken": "TwilioAuthToken",
"to": "{mobileNumber}",
"from": "%TWILIO_ACCT_PHONE%",
"body": "Thank you {name}, your order was received",
"direction": "out"
},
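The substitutions the runtime performs on this binding can be modeled with a small resolver: {property} tokens come from the trigger data and %SETTING% tokens come from app settings (an illustration of the behavior; the values are made up):

```javascript
// Resolve {property} tokens from trigger data and %SETTING% tokens from app settings.
function resolveBindingValue(template, triggerData, appSettings) {
  return template
    .replace(/\{(\w+)\}/g, (m, prop) => triggerData[prop])
    .replace(/%(\w+)%/g, (m, name) => appSettings[name]);
}

const order = { name: 'Carla', mobileNumber: '+15551230000' };
const settings = { TWILIO_ACCT_PHONE: '+15559990000' };

const to = resolveBindingValue('{mobileNumber}', order, settings);
const from = resolveBindingValue('%TWILIO_ACCT_PHONE%', order, settings);
const body = resolveBindingValue('Thank you {name}, your order was received', order, settings);
// to and body come from the queue item; from comes from the app setting
```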
Now your function code only has to initialize the output parameter as follows. During execution, the output
properties are bound to the desired input data.
#r "Newtonsoft.Json"
#r "Twilio.Api"
using System;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Twilio;
// Signature reconstructed; the queue trigger binding is named myNewOrderItem and
// the return value feeds the Twilio output binding named $return.
public static SMSMessage Run(Order myNewOrderItem)
{
    // Even if you want to use a hard coded message and number in the binding, you must at least
    // initialize the SMSMessage variable.
    SMSMessage smsText = new SMSMessage();
    // The following isn't needed since we use parameter binding for this
    //string msg = "Hello " + order.name + ", thank you for your order.";
    //smsText.Body = msg;
    //smsText.To = order.mobileNumber;
    return smsText;
}
Node.js:
// Wrapper reconstructed; the trigger binding is named myNewOrderItem.
module.exports = function (context, myNewOrderItem) {
    // No need to set the properties of the text, we use parameters in the binding. We do need to
    // initialize the object.
    var smsText = {};
    context.done(null, smsText);
};
You can also bind imperatively in your function code by calling BindAsync<T>() with a binding attribute,
new BindingTypeAttribute(...) , where BindingTypeAttribute is the .NET attribute that defines your binding and T is
the input or output type that's supported by that binding type. T also cannot be an out parameter type (such as
out JObject ). For example, the Mobile Apps table output binding supports six output types, but you can only use
ICollector or IAsyncCollector for T .
The following example code creates a Storage blob output binding with blob path that's defined at run time, then writes a
string to the blob.
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;
BlobAttribute defines the Storage blob input or output binding, and TextWriter is a supported output binding type. As is, the
code gets the default app setting for the Storage account connection string (which is AzureWebJobsStorage ). You can specify a
custom app setting to use by adding the StorageAccountAttribute and passing the attribute array into BindAsync<T>() . For
example,
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;
The following table shows you the corresponding .NET attribute to use for each binding type and which package to
reference.
DocumentDB: Microsoft.Azure.WebJobs.DocumentDBAttribute (package: #r "Microsoft.Azure.WebJobs.Extensions.DocumentDB" )
Twilio: Microsoft.Azure.WebJobs.TwilioSmsAttribute (package: #r "Microsoft.Azure.WebJobs.Extensions" )
Next steps
For more information, see the following resources:
Testing a function
Scale a function
Azure Functions Event Hub bindings
11/15/2016 • 3 min to read
Contributors
wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin • Sylvan Clebsch • Jason Card
This article explains how to configure and code Azure Event Hub bindings for Azure Functions. Azure Functions
supports trigger and output bindings for Event Hubs.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
If you are new to Azure Event Hubs, see the Azure Event Hub overview.
The Event Hub trigger to a function uses the following JSON object in the bindings array of function.json:
{
"type": "eventHubTrigger",
"name": "<Name of trigger parameter in function signature>",
"direction": "in",
"path": "<Name of the Event Hub>",
"consumerGroup": "Consumer group to use - see below",
"connection": "<Name of app setting with connection string - see below>"
}
consumerGroup is an optional property used to set the consumer group used to subscribe to events in the hub. If
omitted, the $Default consumer group is used.
connection must be the name of an app setting that contains the connection string to the event hub's
namespace. Copy this connection string by clicking the Connection Information button for the namespace, not
the event hub itself. This connection string must have at least read permissions to activate the trigger.
Additional settings can be provided in a host.json file to further fine-tune Event Hub triggers.
Trigger usage
When an Event Hub trigger function is triggered, the message that triggers it is passed into the function as a
string.
Trigger sample
Suppose you have the following Event Hub trigger in the bindings array of function.json:
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionString"
}
See the language-specific sample that logs the message body of the event hub trigger.
C#
F#
Node.js
Trigger sample in C#
using System;
Trigger sample in F#
The output binding uses the following JSON object in the bindings array of function.json:
{
"type": "eventHub",
"name": "<Name of output parameter in function signature>",
"path": "<Name of event hub>",
"connection": "<Name of app setting with connection string - see below>",
"direction": "out"
}
connection must be the name of an app setting that contains the connection string to the event hub's
namespace. Copy this connection string by clicking the Connection Information button for the namespace, not
the event hub itself. This connection string must have send permissions to send the message to the event stream.
Output sample
Suppose you have the following Event Hub output binding in the bindings array of function.json:
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSend",
"direction": "out"
}
See the language-specific sample that writes an event to the event stream.
C#
F#
Node.js
Output sample in C#
using System;
public static void Run(TimerInfo myTimer, out string outputEventHubMessage, TraceWriter log)
{
String msg = $"TimerTriggerCSharp1 executed at: {DateTime.Now}";
log.Verbose(msg);
outputEventHubMessage = msg;
}
Output sample in F#
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions HTTP and webhook bindings
11/15/2016 • 9 min to read
Contributors
Matthew Henderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • katiecumming • Cephas Lin
• Sylvan Clebsch • Tom Dykstra
This article explains how to configure and code HTTP and webhook triggers and bindings in Azure Functions.
Azure Functions supports trigger and output bindings for HTTP requests and webhooks.
An HTTP trigger lets you invoke a function with an HTTP request. A webhook trigger is an HTTP trigger that's
tailored for a specific webhook provider (e.g. GitHub and Slack).
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
HTTP trigger
Use the HTTP trigger to respond to an HTTP request.
The HTTP trigger to a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of request object/body in function signature>",
"type": "httpTrigger",
"direction": "in",
"authLevel": "<'function', 'anonymous', or 'admin' - see below>"
},
authLevel defines how the HTTP trigger validates the HTTP requests:
Additional settings can be provided in a host.json file to further fine-tune HTTP triggers.
By default, an HTTP trigger function is addressable with a route of the following form:
http://<yourapp>.azurewebsites.net/api/<funcname>
You can customize this route using the optional route property on the HTTP trigger's input binding. As an
example, the following function.json file defines a route property for an HTTP trigger:
{
"bindings": [
{
"type": "httpTrigger",
"name": "req",
"direction": "in",
"methods": [ "get" ],
"route": "products/{category:alpha}/{id:int?}"
},
{
"type": "http",
"name": "res",
"direction": "out"
}
]
}
Using this configuration, the function is now addressable with the following route instead of the original route.
http://<yourapp>.azurewebsites.net/api/products/electronics/357
This allows the function code to support two parameters in the address, category and id . You can use any Web
API Route Constraint with your parameters. The following Node.js function code makes use of both parameters.
// Wrapper reconstructed; route values are available on context.bindingData.
module.exports = function (context, req) {
    var category = context.bindingData.category;
    var id = context.bindingData.id;
    if (!id) {
        context.res = {
            // status: 200, /* Defaults to 200 */
            body: "All " + category + " items were requested."
        };
    }
    else {
        context.res = {
            // status: 200, /* Defaults to 200 */
            body: category + " item with id = " + id + " was requested."
        };
    }
    context.done();
};
By default, all function routes are prefixed with api. You can also customize or remove the prefix using the
http.routePrefix property in your host.json file. The following example removes the api route prefix by using an
empty string for the prefix in the host.json file.
{
"http": {
"routePrefix": ""
}
}
For detailed information on how to update the host.json file for your function, see How to update function app
files.
For information on other properties you can configure in your host.json file, see host.json reference.
For Node.js functions, the Functions runtime provides the request body instead of the request object.
{
"name": "req",
"type": "httpTrigger",
"direction": "in",
"authLevel": "function"
},
See the language-specific sample that looks for a name parameter either in the query string or the body of the
HTTP request.
C#
F#
Node.js
open System.Net
open System.Net.Http
open FSharp.Interop.Dynamic
You need a project.json file that uses NuGet to reference the FSharp.Interop.Dynamic and Dynamitey
assemblies, like this:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
This will use NuGet to fetch your dependencies and will reference them in your script.
Webhook trigger
Use the webhook trigger to respond to a specific webhook provider. A webhook trigger is an HTTP trigger that
has the following features designed for webhooks:
The webhook trigger to a function uses the following JSON object in the bindings array of function.json:
{
"webHookType": "<github|slack|genericJson>",
"name": "<Name of request object/body in function signature>",
"type": "httpTrigger",
"direction": "in"
},
The Slack webhook generates a token for you instead of letting you specify it, so you must configure your
function-specific API key with the token from Slack (to see where to define the API key, see Types of API
keys).
{
"webHookType": "github",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
C#
F#
Node.js
Webhook sample in C#
#r "Newtonsoft.Json"
using System;
using System.Net;
using System.Threading.Tasks;
using Newtonsoft.Json;
Webhook sample in F#
open System.Net
open System.Net.Http
open FSharp.Interop.Dynamic
open Newtonsoft.Json
type Response = {
body: string
}
{
"name": "res",
"type": "http",
"direction": "out"
}
Output usage
You can use the output parameter (e.g. res ) to respond to the HTTP or webhook caller. Alternatively, you can use
the standard Request.CreateResponse() (C#) or context.res (Node.js) pattern to return your response. For examples
of how to use the latter method, see HTTP trigger sample and Webhook trigger sample.
Keys are stored as part of your function app in Azure and are encrypted at rest. To view your keys, create new
ones, or roll keys to new values, navigate to one of your functions within the portal and select "Manage."
Admin keys : These keys are shared by all functions within the function app. When used as an API key, these
allow access to any function within the function app.
Function keys : These keys apply only to the specific functions under which they are defined. When used as
an API key, these only allow access to that function.
Each key is named for reference, and there is a default key (named "default") at the function and admin level. The
master key is a default admin key named "_master" that is defined for each function app and cannot be revoked.
It provides administrative access to the runtime APIs. Using "authLevel": "admin" in the binding JSON requires
this key to be presented on the request; any other key results in an authorization failure.
NOTE
Due to the elevated permissions granted by the master key, you should not share this key with third parties or distribute it
in native client applications. Exercise caution when choosing the admin authorization level.
https://<yourapp>.azurewebsites.net/api/<function>?code=<ApiKey>
The key can be included in a query string variable named code , as above, or it can be included in an
x-functions-key HTTP header. The value of the key can be any function key defined for the function, or any
admin key.
You can choose to allow requests without keys or specify that the master key must be used by changing the
authLevel property in the binding JSON (see HTTP trigger).
Query string : The provider passes the key name in a query string parameter (e.g.,
https://<yourapp>.azurewebsites.net/api/<funcname>?clientid=<keyname> ).
Request header : The provider passes the key name in the x-functions-clientid header.
NOTE
Function keys take precedence over admin keys. If two keys are defined with the same name, the function key will be used.
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions Mobile Apps bindings
11/15/2016 • 6 min to read • Edit on GitHub
Contributors
Glenn Gailey • wesmc • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin • Tom Dykstra
This article explains how to configure and code Azure Mobile Apps bindings in Azure Functions. Azure Functions
supports input and output bindings for Mobile Apps.
The Mobile Apps input and output bindings let you read from and write to data tables in your mobile app.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
The Mobile Apps input to a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of input parameter in function signature>",
"type": "mobileTable",
"tableName": "<Name of your mobile app's data table>",
"id" : "<Id of the record to retrieve - see below>",
"connection": "<Name of app setting that has your mobile app's URL - see below>",
"apiKey": "<Name of app setting that has your mobile app's API key - see below>",
"direction": "in"
}
id can be static, or it can be based on the trigger that invokes the function. For example, if you use a queue
trigger for your function, then "id": "{queueTrigger}" uses the string value of the queue message as the
record ID to retrieve.
connection should contain the name of an app setting in your function app, which in turn contains the URL of
your mobile app. The function uses this URL to construct the required REST operations against your mobile
app. Create an app setting in your function app that contains your mobile app's URL (which looks like
http://<appname>.azurewebsites.net ), then specify the name of the app setting in the connection property in
your input binding.
You need to specify apiKey if you implement an API key in your Node.js mobile app backend, or
implement an API key in your .NET mobile app backend. To do this, create an app setting in your
function app that contains the API key, then add the apiKey property to your input binding with the
name of the app setting.
IMPORTANT
This API key must not be shared with your mobile app clients. It should only be distributed securely to service-side
clients, like Azure Functions.
NOTE
Azure Functions stores your connection information and API keys as app settings so that they are not checked into
your source control repository. This safeguards your sensitive information.
Input usage
This section shows you how to use your Mobile Apps input binding in your function code.
When a record with the specified record ID is found in the specified table, it is passed into the named JObject parameter
(or, in Node.js, into the context.bindings.<name> object). When the record is not found, the parameter
is null .
In C# and F# functions, any changes you make to the input record (input parameter) are automatically sent back to
the Mobile Apps table when the function exits successfully. In Node.js functions, use context.bindings.<name> to
access the input record. You cannot modify a record in Node.js.
Input sample
Suppose you have the following function.json, which retrieves a Mobile Apps table record with the ID of the queue
trigger message:
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"id" : "{queueTrigger}",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "in"
}
],
"disabled": false
}
See the language-specific sample that uses the input record from the binding. The C# and F# samples also modify
the record's text property.
Input sample in C#
#r "Newtonsoft.Json"
using Newtonsoft.Json.Linq;
public static void Run(string myQueueItem, JObject record)
{
    if (record != null)
    {
        record["Text"] = "This has changed.";
    }
}
The Mobile Apps output for a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of output parameter in function signature>",
"type": "mobileTable",
"tableName": "<Name of your mobile app's data table>",
"connection": "<Name of app setting that has your mobile app's URL - see below>",
"apiKey": "<Name of app setting that has your mobile app's API key - see below>",
"direction": "out"
}
connection should contain the name of an app setting in your function app, which in turn contains the URL of
your mobile app. The function uses this URL to construct the required REST operations against your mobile
app. Create an app setting in your function app that contains your mobile app's URL (which looks like
http://<appname>.azurewebsites.net ), then specify the name of the app setting in the connection property in
your output binding.
You need to specify apiKey if you implement an API key in your Node.js mobile app backend, or
implement an API key in your .NET mobile app backend. To do this, create an app setting in your
function app that contains the API key, then add the apiKey property to your output binding with the
name of the app setting.
IMPORTANT
This API key must not be shared with your mobile app clients. It should only be distributed securely to service-side
clients, like Azure Functions.
NOTE
Azure Functions stores your connection information and API keys as app settings so that they are not checked into
your source control repository. This safeguards your sensitive information.
Output usage
This section shows you how to use your Mobile Apps output binding in your function code.
In C# functions, use a named output parameter of type out object to access the output record. In Node.js
functions, use context.bindings.<name> to access the output record.
Output sample
Suppose you have the following function.json, which defines a queue trigger and a Mobile Apps output:
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "out"
}
],
"disabled": false
}
See the language-specific sample that creates a record in the Mobile Apps table endpoint with the content of the
queue message.
Output sample in Node.js
module.exports = function (context, myQueueItem) {
    context.bindings.record = {
        text : "I'm running in a Node function! Data: '" + myQueueItem + "'"
    };
    context.done();
};
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions Notification Hub output binding
11/15/2016 • 8 min to read • Edit on GitHub
Contributors
wesmc • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin • Sylvan Clebsch • Glenn Gailey • Den Delimarsky
• Tom Dykstra
This article explains how to configure and code Azure Notification Hub bindings in Azure Functions.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Your functions can send push notifications using a configured Azure Notification Hub with a few lines of code.
However, the Azure Notification Hub must be configured for the Platform Notification Services (PNS) you want
to use. For more information on configuring an Azure Notification Hub and developing client applications that
register to receive notifications, see Getting started with Notification Hubs and click your target client platform at
the top.
The notifications you send can be native notifications or template notifications. Native notifications target a
specific notification platform as configured in the platform property of the output binding. A template
notification can be used to target multiple platforms.
name : Variable name used in function code for the notification hub message.
type : must be set to "notificationHub".
tagExpression : Tag expressions allow you to specify that notifications be delivered to a set of devices that
have registered to receive notifications matching the tag expression. For more information, see Routing and
tag expressions.
hubName : Name of the notification hub resource in the Azure portal.
connection : This connection string must be an Application Setting connection string set to the
DefaultFullSharedAccessSignature value for your notification hub.
direction : must be set to "out".
platform : The platform property indicates the notification platform your notification targets. Must be one of
the following values:
template : This is the default platform if the platform property is omitted from the output binding.
Template notifications can be used to target any platform configured on the Azure Notification Hub. For
more information on using templates in general to send cross platform notifications with an Azure
Notification Hub, see Templates.
apns : Apple Push Notification Service. For more information on configuring the notification hub for
APNS and receiving the notification in a client app, see Sending push notifications to iOS with Azure
Notification Hubs
adm : Amazon Device Messaging. For more information on configuring the notification hub for ADM
and receiving the notification in a Kindle app, see Getting Started with Notification Hubs for Kindle apps
gcm : Google Cloud Messaging. Firebase Cloud Messaging, which is the new version of GCM, is also
supported. For more information on configuring the notification hub for GCM/FCM and receiving the
notification in an Android client app, see Sending push notifications to Android with Azure Notification
Hubs
wns : Windows Push Notification Services targeting Windows platforms. Windows Phone 8.1 and later
is also supported by WNS. For more information on configuring the notification hub for WNS and
receiving the notification in a Universal Windows Platform (UWP) app, see Getting started with
Notification Hubs for Windows Universal Platform Apps
mpns : Microsoft Push Notification Service. This platform supports Windows Phone 8 and earlier
Windows Phone platforms. For more information on configuring the notification hub for MPNS and
receiving the notification in a Windows Phone app, see Sending push notifications with Azure
Notification Hubs on Windows Phone
Example function.json:
{
"bindings": [
{
"name": "notification",
"type": "notificationHub",
"tagExpression": "",
"hubName": "my-notification-hub",
"connection": "MyHubConnectionString",
"platform": "gcm",
"direction": "out"
}
],
"disabled": false
}
You can also manually add a connection string for an existing hub by adding the
DefaultFullSharedAccessSignature connection string for your notification hub. This connection string gives your
function permission to send notification messages. The DefaultFullSharedAccessSignature connection string
value can be accessed from the Keys button in the main blade of your notification hub resource in the Azure
portal. To manually add a connection string for your hub, use the following steps:
1. On the Function app blade of the Azure portal, click Function App Settings > Go to App Service
settings .
2. In the Settings blade, click Application Settings .
3. Scroll down to the Connection strings section, and add a named entry with the
DefaultFullSharedAccessSignature value for your notification hub. Change the type to Custom .
4. Reference your connection string name in your output bindings, like MyHubConnectionString in
the example above.
#r "Microsoft.Azure.NotificationHubs"
#r "Newtonsoft.Json"
using System;
using Microsoft.Azure.NotificationHubs;
using Newtonsoft.Json;
public static async Task Run(string myQueueItem, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    // In this example the queue item is a new user to be processed in the form of a JSON string with
    // a "name" value.
    //
    // The JSON format for a native APNS notification is ...
    // { "aps": { "alert": "notification message" }}
    dynamic user = JsonConvert.DeserializeObject(myQueueItem);
    string apnsNotificationPayload = "{\"aps\": {\"alert\": \"A new user wants to be processed: " + user.name + "\" }}";
    log.Info($"{apnsNotificationPayload}");
    await notification.AddAsync(new AppleNotification(apnsNotificationPayload));
}
#r "Microsoft.Azure.NotificationHubs"
#r "Newtonsoft.Json"
using System;
using Microsoft.Azure.NotificationHubs;
using Newtonsoft.Json;
public static async Task Run(string myQueueItem, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    // In this example the queue item is a new user to be processed in the form of a JSON string with
    // a "name" value.
    //
    // The JSON format for a native GCM notification is ...
    // { "data": { "message": "notification message" }}
    dynamic user = JsonConvert.DeserializeObject(myQueueItem);
    string gcmNotificationPayload = "{\"data\": {\"message\": \"A new user has been processed: " + user.name + "\" }}";
    log.Info($"{gcmNotificationPayload}");
    await notification.AddAsync(new GcmNotification(gcmNotificationPayload));
}
#r "Microsoft.Azure.NotificationHubs"
#r "Newtonsoft.Json"
using System;
using Microsoft.Azure.NotificationHubs;
using Newtonsoft.Json;
public static async Task Run(string myQueueItem, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    // In this example the queue item is a new user to be processed in the form of a JSON string with
    // a "name" value.
    //
    // The XML format for a native WNS toast notification is ...
    // <?xml version="1.0" encoding="utf-8"?>
    // <toast>
    //   <visual>
    //     <binding template="ToastText01">
    //       <text id="1">notification message</text>
    //     </binding>
    //   </visual>
    // </toast>
    dynamic user = JsonConvert.DeserializeObject(myQueueItem);
    string wnsNotificationPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
        "<toast><visual><binding template=\"ToastText01\">" +
        "<text id=\"1\">A new user wants to be processed: " + user.name + "</text>" +
        "</binding></visual></toast>";
    log.Info($"{wnsNotificationPayload}");
    await notification.AddAsync(new WindowsNotification(wnsNotificationPayload));
}
module.exports = function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    if (myTimer.isPastDue)
    {
        context.log('Node.js is running late!');
    }
    context.log('Node.js timer trigger function ran!', timeStamp);
    context.bindings.notification = {
        location: "Redmond",
        message: "Hello from Node!"
    };
    context.done();
};
using System;
using System.Threading.Tasks;
using System.Collections.Generic;
public static void Run(string myQueueItem, out IDictionary<string, string> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    notification = GetTemplateProperties(myQueueItem);
}
private static IDictionary<string, string> GetTemplateProperties(string message)
{
    Dictionary<string, string> templateProperties = new Dictionary<string, string>();
    templateProperties["message"] = message;
    return templateProperties;
}
using System;
using System.Threading.Tasks;
using System.Collections.Generic;
public static void Run(string myQueueItem, out string notification, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
notification = "{\"message\":\"Hello from C#. Processed a queue item!\"}";
}
#r "Microsoft.Azure.NotificationHubs"
using System;
using System.Threading.Tasks;
using System.Collections.Generic;
using Microsoft.Azure.NotificationHubs;
public static void Run(string myQueueItem, out Notification notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    notification = GetTemplateNotification(myQueueItem);
}
private static Notification GetTemplateNotification(string message)
{
    Dictionary<string, string> templateProperties = new Dictionary<string, string>();
    templateProperties["message"] = message;
    return new TemplateNotification(templateProperties);
}
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions Service Bus bindings
11/15/2016 • 6 min to read • Edit on GitHub
Contributors
Christopher Anderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin • Sylvan Clebsch
• Tom Dykstra
This article explains how to configure and code Azure Service Bus bindings in Azure Functions. Azure Functions
supports trigger and output bindings for Service Bus queues and topics.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
The Service Bus queue and topic triggers to a function use the following JSON objects in the bindings
array of function.json:
queue trigger:
{
"name" : "<Name of input parameter in function signature>",
"queueName" : "<Name of the queue>",
"connection" : "<Name of app setting that has your queue's connection string - see below>",
"accessRights" : "<Access rights for the connection string - see below>",
"type" : "serviceBusTrigger",
"direction" : "in"
}
topic trigger:
{
"name" : "<Name of input parameter in function signature>",
"topicName" : "<Name of the topic>",
"subscriptionName" : "<Name of the subscription>",
"connection" : "<Name of app setting that has your topic's connection string - see below>",
"accessRights" : "<Access rights for the connection string - see below>",
"type" : "serviceBusTrigger",
"direction" : "in"
}
For connection , create an app setting in your function app that contains the connection string for your
Service Bus namespace, then specify the name of the app setting in the connection property in your trigger.
You obtain the connection string by following the steps shown at Obtain the management credentials. The
connection string must be for a Service Bus namespace, not limited to a specific queue or topic. If you leave
connection empty, the trigger assumes that a default Service Bus connection string is specified in an app
setting named AzureWebJobsServiceBus .
For accessRights , available values are manage and listen . The default is manage , which indicates that the
connection has the Manage permission. If you use a connection string that does not have the Manage
permission, set accessRights to listen . Otherwise, the Functions runtime might attempt, and fail, to perform
operations that require Manage rights.
Trigger behavior
Single-threading - By default, the Functions runtime processes multiple messages concurrently. To direct
the runtime to process only a single queue or topic message at a time, set serviceBus.maxConcurrentCalls to
1 in host.json. For information about host.json, see Folder Structure and host.json.
Poison message handling - Service Bus does its own poison message handling, which can't be controlled
or configured in Azure Functions configuration or code.
PeekLock behavior - The Functions runtime receives a message in PeekLock mode and calls Complete on
the message if the function finishes successfully, or calls Abandon if the function fails. If the function runs
longer than the PeekLock timeout, the lock is automatically renewed.
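For example, the single-threading behavior described above can be requested with a host.json fragment like the following (a sketch; your host.json may contain other settings alongside this one):

```json
{
  "serviceBus": {
    "maxConcurrentCalls": 1
  }
}
```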
Trigger usage
This section shows you how to use your Service Bus trigger in your function code.
In C# and F#, the Service Bus trigger message can be deserialized to any of the following input types: any
Object (for a JSON message), string, byte array, or BrokeredMessage.
In Node.js, the Service Bus trigger message is passed into the function as either a string or, in the case of a JSON
message, a JavaScript object.
Trigger sample
Suppose you have the following function.json:
{
"bindings": [
{
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"name": "myQueueItem",
"type": "serviceBusTrigger",
"direction": "in"
}
],
"disabled": false
}
See the language-specific sample that processes a Service Bus queue message.
Trigger sample in C#
Trigger sample in F#
queue output:
{
"name" : "<Name of output parameter in function signature>",
"queueName" : "<Name of the queue>",
"connection" : "<Name of app setting that has your queue's connection string - see below>",
"accessRights" : "<Access rights for the connection string - see below>",
"type" : "serviceBus",
"direction" : "out"
}
topic output:
{
"name" : "<Name of output parameter in function signature>",
"topicName" : "<Name of the topic>",
"subscriptionName" : "<Name of the subscription>",
"connection" : "<Name of app setting that has your topic's connection string - see below>",
"accessRights" : "<Access rights for the connection string - see below>",
"type" : "serviceBus",
"direction" : "out"
}
For connection , create an app setting in your function app that contains the connection string for your
Service Bus namespace, then specify the name of the app setting in the connection property in your output
binding. You obtain the connection string by following the steps shown at Obtain the management
credentials. The connection string must be for a Service Bus namespace, not limited to a specific queue or
topic. If you leave connection empty, the output binding assumes that a default Service Bus connection string
is specified in an app setting named AzureWebJobsServiceBus .
For accessRights , available values are manage and listen . The default is manage , which indicates that the
connection has the Manage permission. If you use a connection string that does not have the Manage
permission, set accessRights to listen . Otherwise, the Functions runtime might attempt, and fail, to perform
operations that require Manage rights.
Output usage
In C# and F#, Azure Functions can create a Service Bus queue message from any of the following types:
Any Object - Parameter definition looks like out T paramName (C#). Functions serializes the object into a
JSON message. If the output value is null when the function exits, Functions creates the message with a null
object.
string - Parameter definition looks like out string paramName (C#). If the parameter value is non-null when
the function exits, Functions creates a message.
byte[] - Parameter definition looks like out byte[] paramName (C#). If the parameter value is non-null when
the function exits, Functions creates a message.
BrokeredMessage - Parameter definition looks like out BrokeredMessage paramName (C#). If the parameter value is non-null
when the function exits, Functions creates a message.
For creating multiple messages in a C# function, you can use ICollector<T> or IAsyncCollector<T> . A message
is created when you call the Add method.
In Node.js, you can assign a string, a byte array, or a JavaScript object (serialized into JSON) to
context.bindings.<paramName> .
Output sample
Suppose you have the following function.json, which defines a timer trigger and a Service Bus queue output:
{
"bindings": [
{
"schedule": "0/15 * * * * *",
"name": "myTimer",
"runsOnStartup": true,
"type": "timerTrigger",
"direction": "in"
},
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
],
"disabled": false
}
See the language-specific sample that sends a message to the Service Bus queue.
Output sample in C#
public static void Run(TimerInfo myTimer, TraceWriter log, out string outputSbQueue)
{
string message = $"Service Bus queue message created at: {DateTime.Now}";
log.Info(message);
outputSbQueue = message;
}
Output sample in F#
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and bindings
developer reference
Azure Functions Storage blob bindings
11/15/2016 • 9 min to read • Edit on GitHub
Contributors
Christopher Anderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin
This article explains how to configure and code Azure Storage blob bindings in Azure Functions. Azure Functions
supports trigger, input, and output bindings for Azure Storage blobs.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
NOTE
Blob-only storage accounts are not supported. Azure Functions requires a general-purpose storage account for use with
blobs.
The Storage blob trigger to a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of input parameter in function signature>",
"type": "blobTrigger",
"direction": "in",
"path": "<container to monitor, and optionally a blob name pattern - see below>",
"connection":"<Name of app setting - see below>"
}
For path , see Name patterns to find out how to format blob name patterns.
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a storage
account or select an existing one. You can also configure this app setting manually.
Name patterns
Blob receipts
Handling poison blobs
Name patterns
You can specify a blob name pattern in the path property. For example:
"path": "input/original-{name}",
This path would find a blob named original-Blob1.txt in the input container, and the value of the name variable in
function code would be Blob1 .
Another example:
"path": "input/{blobname}.{blobextension}",
This path would also find a blob named original-Blob1.txt, and the value of the blobname and blobextension
variables in function code would be original-Blob1 and txt.
You can restrict the file type of blobs by using a fixed value for the file extension. For example:
"path": "samples/{name}.png",
In this case, only .png blobs in the samples container trigger the function.
Curly braces are special characters in name patterns. To specify blob names that have curly braces in the name,
double the curly braces. For example:
"path": "images/{{20140101}}-{name}",
This path would find a blob named {20140101}-soundfile.mp3 in the images container, and the name variable
value in the function code would be soundfile.mp3 .
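The pattern rules above can be illustrated with a small matcher sketch (hypothetical; the Functions runtime implements this matching internally, and this simplified version does not handle the doubled curly-brace escape):

```javascript
// Illustrative only: mimics how a blob path pattern such as
// "input/{blobname}.{blobextension}" captures its {variables}.
// Returns an object of captured values, or null when the path
// does not match the pattern.
function matchBlobPath(pattern, blobPath) {
    // Escape regex metacharacters, then turn each {var} into a
    // named capture group.
    var regexSource = pattern
        .replace(/[.*+?^$()|[\]\\]/g, '\\$&')
        .replace(/\{(\w+)\}/g, '(?<$1>.+)');
    var match = blobPath.match(new RegExp('^' + regexSource + '$'));
    return match ? match.groups : null;
}
```

For instance, the pattern "samples/{name}.png" matches only .png blobs in the samples container, as described above.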
Blob receipts
The Azure Functions runtime makes sure that no blob trigger function gets called more than once for the same
new or updated blob. It does so by maintaining blob receipts to determine if a given blob version has been
processed.
Blob receipts are stored in a container named azure-webjobs-hosts in the Azure storage account for your
function app (specified by the AzureWebJobsStorage app setting). A blob receipt has the following information:
The triggered function ("<function app name>.Functions.<function name>", for example:
"functionsf74b96f7.Functions.CopyBlob")
The container name
The blob type ("BlockBlob" or "PageBlob")
The blob name
The ETag (a blob version identifier, for example: "0x8D1DC6E70A277EF")
To force reprocessing of a blob, delete the blob receipt for that blob from the azure-webjobs-hosts container
manually.
Trigger usage
In C# functions, you bind to the input blob data by using a named parameter in your function signature, like
<T> <name> , where T is the data type that you want to deserialize the data into and <name> is the name you
specified in the trigger JSON. In Node.js functions, you access the input blob data using context.bindings.<name> .
Any Object - useful for JSON-serialized blob data. If you declare a custom input type (e.g. FooType ), Azure
Functions attempts to deserialize the JSON data into your specified type.
String - useful for text blob data.
In C# functions, you can also bind to any of the following types, and the Functions runtime will attempt to
deserialize the blob data using that type:
TextReader
Stream
ICloudBlob
CloudBlockBlob
CloudPageBlob
CloudBlobContainer
CloudBlobDirectory
IEnumerable<CloudBlockBlob>
IEnumerable<CloudPageBlob>
Other types deserialized by ICloudBlobStreamBinder
Trigger sample
Suppose you have the following function.json, which defines a Storage blob trigger:
{
"disabled": false,
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "samples-workitems",
"connection":""
}
]
}
See the language-specific sample that logs the contents of each blob that is added to the monitored container.
Trigger sample in Node.js
module.exports = function(context) {
context.log('Node.js Blob trigger function processed', context.bindings.myBlob);
context.done();
};
The Storage blob input to a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of input parameter in function signature>",
"type": "blob",
"direction": "in",
"path": "<Path of input blob - see below>",
"connection":"<Name of app setting - see below>"
}
path must contain the container name and the blob name. For example, if you have a queue trigger in your
function, you can use "path": "samples-workitems/{queueTrigger}" to point to a blob in the samples-workitems
container with a name that matches the blob name specified in the trigger message.
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a storage
account or select an existing one. You can also configure this app setting manually.
Input usage
In C# functions, you bind to the input blob data by using a named parameter in your function signature, like
<T> <name> , where T is the data type that you want to deserialize the data into and <name> is the name you
specified in the input binding. In Node.js functions, you access the input blob data using context.bindings.<name>
.
Any Object - useful for JSON-serialized blob data. If you declare a custom input type (e.g. FooType ), Azure
Functions attempts to deserialize the JSON data into your specified type.
String - useful for text blob data.
In C# functions, you can also bind to any of the following types, and the Functions runtime will attempt to
deserialize the blob data using that type:
TextReader
Stream
ICloudBlob
CloudBlockBlob
CloudPageBlob
Input sample
Suppose you have the following function.json, which defines a Storage queue trigger, a Storage blob input, and a
Storage blob output:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnection",
"direction": "in"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnection",
"direction": "out"
}
],
"disabled": false
}
See the language-specific sample that copies the input blob to the output blob.
Input sample in C#
public static void Run(string myQueueItem, string myInputBlob, out string myOutputBlob, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
myOutputBlob = myInputBlob;
}
Input sample in Node.js
module.exports = function(context) {
context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
context.bindings.myOutputBlob = context.bindings.myInputBlob;
context.done();
};
The Storage blob output for a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of output parameter in function signature>",
"type": "blob",
"direction": "out",
"path": "<Path of output blob - see below>",
"connection":"<Name of app setting - see below>"
}
path must contain the container name and the blob name to write to. For example, if you have a queue
trigger in your function, you can use "path": "samples-workitems/{queueTrigger}" to point to a blob in the
samples-workitems container with a name that matches the blob name specified in the trigger message.
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a storage
account or select an existing one. You can also configure this app setting manually.
Output usage
In C# functions, you bind to the output blob by using the named out parameter in your function signature, like
out <T> <name> , where T is the data type that you want to serialize the data into and <name> is the name
you specified in the output binding. In Node.js functions, you access the output blob using
context.bindings.<name> .
You can write to the output blob using any of the following types:
Any Object - useful for JSON serialization. If you declare a custom output type (e.g. out FooType paramName ),
Azure Functions attempts to serialize the object into JSON. If the output parameter is null when the function exits,
the Functions runtime creates a blob as a null object.
String - ( out string paramName ) useful for text blob data. The Functions runtime creates a blob only if the
string parameter is non-null when the function exits.
TextWriter
Stream
CloudBlobStream
ICloudBlob
CloudBlockBlob
CloudPageBlob
Output sample
See input sample.
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and
bindings developer reference
Azure Functions Storage queue bindings
11/15/2016 • 5 min to read • Edit on GitHub
Contributors
Christopher Anderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin
This article explains how to configure and code Azure Storage queue bindings in Azure Functions. Azure
Functions supports trigger and output bindings for Azure Storage queues.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
The Storage queue trigger for a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of input parameter in function signature>",
"queueName": "<Name of queue to poll>",
"connection":"<Name of app setting - see below>",
"type": "queueTrigger",
"direction": "in"
}
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a storage
account or select an existing one. To manually create this app setting, see [configure this app setting manually]
().
Additional settings can be provided in a host.json file to further fine-tune storage queue triggers.
If you want to handle poison messages manually, you can get the number of times a message has been picked
up for processing by checking dequeueCount (see Queue trigger metadata).
Trigger usage
In C# functions, you bind to the queue message by using a named parameter in your function signature, like
<T> <name> , where T is the data type that you want to deserialize the message into, and <name> is the name
you specified in the trigger binding. In Node.js functions, you access the queue message using
context.bindings.<name> .
The queue message can be deserialized to any of the following types:
Any Object - useful for JSON-serialized messages. If you declare a custom input type (e.g. FooType ), Azure
Functions attempts to deserialize the JSON data into your specified type.
String
Byte array
CloudQueueMessage (C#)
In addition, the following queue metadata properties are available in your code:
expirationTime
insertionTime
nextVisibleTime
id
popReceipt
dequeueCount
queueTrigger (another way to retrieve the queue message text as a string)
Trigger sample
Suppose you have the following function.json that defines a Storage queue trigger:
{
"disabled": false,
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
}
]
}
See the language-specific sample that retrieves and logs queue metadata.
C#
Node.js
Trigger sample in C#
public static void Run(string myQueueItem,
DateTimeOffset expirationTime,
DateTimeOffset insertionTime,
DateTimeOffset nextVisibleTime,
string queueTrigger,
string id,
string popReceipt,
int dequeueCount,
TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}\n" +
$"queueTrigger={queueTrigger}\n" +
$"expirationTime={expirationTime}\n" +
$"insertionTime={insertionTime}\n" +
$"nextVisibleTime={nextVisibleTime}\n" +
$"id={id}\n" +
$"popReceipt={popReceipt}\n" +
$"dequeueCount={dequeueCount}");
}
The Storage queue output binding for a function uses the following JSON object in the bindings array of function.json:
{
"name": "<Name of output parameter in function signature>",
"queueName": "<Name of queue to write to>",
"connection":"<Name of app setting - see below>",
"type": "queue",
"direction": "out"
}
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a storage
account or select an existing one. To manually create this app setting, see [configure this app setting manually]
().
Output usage
In C# functions, you write a queue message by using the named out parameter in your function signature, like
out <T> <name> , where T is the data type that you want to serialize the message into, and <name> is the
name you specified in the output binding. In Node.js functions, you access the output using
context.bindings.<name> .
You can output a queue message using any of the data types in your code:
Any Object - useful for JSON serialization. If you declare a custom output type (e.g. out FooType paramName ),
Azure Functions attempts to serialize the object into JSON. If the output parameter is null when the function
exits, the Functions runtime creates a queue message as a null object.
String - ( out string paramName ) useful for text messages. The Functions runtime creates a message only if the
string parameter is non-null when the function exits.
Byte array - ( out byte[] )
out CloudQueueMessage - C# only
In C#, you can also bind to ICollector<T> or IAsyncCollector<T> where T is one of the supported types.
Output sample
Example function.json for a Storage queue output binding that uses a queue trigger and writes a queue
message:
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myQueue",
"queueName": "samples-workitems-out",
"connection": "MyStorageConnection",
"type": "queue",
"direction": "out"
}
],
"disabled": false
}
See the language-specific sample that writes an output queue message for each input queue message.
C#
Node.js
Output sample in C#
public static void Run(string myQueueItem, out string myQueue, TraceWriter log)
{
myQueue = myQueueItem + "(next step)";
}
Output sample in Node.js
module.exports = function(context) {
    context.bindings.myQueue = [];
    context.bindings.myQueue.push("(step 1)");
    context.bindings.myQueue.push("(step 2)");
    context.done();
};
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and
bindings developer reference
Azure Functions Storage table bindings
11/15/2016 • 6 min to read • Edit on GitHub
Contributors
Christopher Anderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin
This article explains how to configure and code Azure Storage table triggers and bindings in Azure Functions.
Azure Functions supports input and output bindings for Azure Storage tables.
Read a single row in a C# or Node.js function - Set partitionKey and rowKey . The filter and take
properties are not used in this scenario.
Read multiple rows in a C# function - The Functions runtime provides an IQueryable<T> object bound to
the table. Type T must derive from TableEntity or implement ITableEntity . The partitionKey , rowKey ,
filter , and take properties are not used in this scenario; you can use the IQueryable object to do any
filtering required.
Read multiple rows in a Node function - Set the filter and take properties. Don't set partitionKey
or rowKey .
Write one or more rows in a C# function - The Functions runtime provides an ICollector<T> or
IAsyncCollector<T> bound to the table, where T specifies the schema of the entities you want to add.
Typically, type T derives from TableEntity or implements ITableEntity , but it doesn't have to. The
partitionKey , rowKey , filter , and take properties are not used in this scenario.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
The Storage table input to a function uses the following JSON objects in the bindings array of function.json:
{
"name": "<Name of input parameter in function signature>",
"type": "table",
"direction": "in",
"tableName": "<Name of Storage table>",
"partitionKey": "<PartitionKey of table entity to read - see below>",
"rowKey": "<RowKey of table entity to read - see below>",
"take": "<Maximum number of entities to read in Node.js - optional>",
"filter": "<OData filter expression for table input in Node.js - optional>",
"connection": "<Name of app setting - see below>"
}
Use partitionKey and rowKey together to read a single entity. These properties are optional.
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a Storage
account or select an existing one. To manually create this app setting, see [configure this app setting
manually]().
Input usage
In C# functions, you bind to the input table entity (or entities) by using a named parameter in your function
signature, like <T> <name> , where T is the data type that you want to deserialize the data into, and <name> is
the name you specified in the input binding. In Node.js functions, you access the input table entity (or entities)
using context.bindings.<name> .
The input data can be deserialized in Node.js or C# functions. The deserialized objects have RowKey and
PartitionKey properties.
In C# functions, you can also bind to any of the following types, and the Functions runtime will attempt to
deserialize the table data using that type:
IQueryable<T>
Input sample
Suppose you have the following function.json, which uses a queue trigger to read a single table row. The JSON
specifies a fixed partition key value of Test , and "rowKey": "{queueTrigger}" indicates that the row key comes
from the queue message string.
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "personEntity",
"type": "table",
"tableName": "Person",
"partitionKey": "Test",
"rowKey": "{queueTrigger}",
"connection": "MyStorageConnection",
"direction": "in"
}
],
"disabled": false
}
C#
F#
Node.js
Input sample in C#
public static void Run(string myQueueItem, Person personEntity, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
log.Info($"Name in Person entity: {personEntity.Name}");
}
Input sample in F#
[<CLIMutable>]
type Person = {
PartitionKey: string
RowKey: string
Name: string
}
The Storage table output for a function uses the following JSON objects in the bindings array of function.json:
{
"name": "<Name of output parameter in function signature>",
"type": "table",
"direction": "out",
"tableName": "<Name of Storage table>",
"partitionKey": "<PartitionKey of table entity to write - see below>",
"rowKey": "<RowKey of table entity to write - see below>",
"connection": "<Name of app setting - see below>"
}
Use partitionKey and rowKey together to write a single entity. These properties are optional. You can also
specify PartitionKey and RowKey when you create the entity objects in your function code.
connection must contain the name of an app setting that contains a storage connection string. In the Azure
portal, the standard editor in the Integrate tab configures this app setting for you when you create a Storage
account or select an existing one. To manually create this app setting, see [configure this app setting
manually]().
Output usage
In C# functions, you bind to the table output by using the named out parameter in your function signature, like
out <T> <name> , where T is the data type that you want to serialize the data into, and <name> is the name
you specified in the output binding. In Node.js functions, you access the table output using
context.bindings.<name> .
You can serialize objects in Node.js or C# functions. In C# functions, you can also bind to the following types:
Output sample
The following function.json and run.csx example shows how to write multiple table entities.
{
"bindings": [
{
"name": "input",
"type": "manualTrigger",
"direction": "in"
},
{
"tableName": "Person",
"connection": "MyStorageConnection",
"name": "tableBinding",
"type": "table",
"direction": "out"
}
],
"disabled": false
}
C#
F#
Node.js
Output sample in C#
public static void Run(string input, ICollector<Person> tableBinding, TraceWriter log)
{
for (int i = 1; i < 10; i++)
{
log.Info($"Adding Person entity {i}");
tableBinding.Add(
new Person() {
PartitionKey = "Test",
RowKey = i.ToString(),
Name = "Name" + i.ToString() }
);
}
}
Output sample in F#
[<CLIMutable>]
type Person = {
PartitionKey: string
RowKey: string
Name: string
}
Output sample in Node.js
module.exports = function (context) {
    // Push entity objects onto this array to write them to the table.
    context.bindings.outputTable = [];
    context.done();
};
The C# code adds a reference to the Azure Storage SDK so that the entity type can derive from TableEntity .
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Table;
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and
bindings developer reference
Azure Functions timer trigger
11/22/2016 • 3 min to read • Edit on GitHub
Contributors
Christopher Anderson • wesmc • Glenn Gailey • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Cephas Lin • Tom Dykstra
This article explains how to configure and code timer triggers in Azure Functions. Azure Functions supports
timer triggers, which call functions based on a schedule, either one time or recurring.
The timer trigger supports multi-instance scale-out, but only a single instance of a particular timer function runs
across all instances.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Timer trigger
The timer trigger to a function uses the following JSON object in the bindings array of function.json:
{
"schedule": "<CRON expression - see below>",
"name": "<Name of trigger parameter in function signature>",
"type": "timerTrigger",
"direction": "in"
}
The default time zone used with the CRON expressions is Coordinated Universal Time (UTC). If you want your
CRON expression to be based on another time zone, create a new app setting for your function app named
WEBSITE_TIME_ZONE . Set the value to the name of the desired time zone as shown in the Microsoft Time Zone
Index.
For example, Eastern Standard Time is UTC-05:00. If you want your timer trigger to fire at 10:00 AM EST every
day, you could use the following CRON expression, which accounts for the UTC time zone:
Alternatively, you could add a new app setting for your function app named WEBSITE_TIME_ZONE and set the value
to Eastern Standard Time . Then the following CRON expression could be used for 10:00 AM EST:
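For illustration, the schedule uses a six-field CRON (NCRONTAB) expression of the form {second} {minute} {hour} {day} {month} {day-of-week}. Since 10:00 AM Eastern Standard Time is 15:00 UTC, the two approaches above could use expressions such as (sketched here, not from the original excerpt):

```
0 0 15 * * *    (fires at 15:00 UTC = 10:00 AM EST, without WEBSITE_TIME_ZONE)
0 0 10 * * *    (fires at 10:00 AM local time, with WEBSITE_TIME_ZONE set to Eastern Standard Time)
```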
Trigger usage
When a timer trigger function is invoked, the timer object is passed into the function. The following JSON is an
example representation of the timer object.
{
"Schedule":{
},
"ScheduleStatus": {
"Last":"2016-10-04T10:15:00.012699+00:00",
"Next":"2016-10-04T10:20:00+00:00"
},
"IsPastDue":false
}
Trigger sample
Suppose you have the following timer trigger in the bindings array of function.json:
{
"schedule": "0 */5 * * * *",
"name": "myTimer",
"type": "timerTrigger",
"direction": "in"
}
See the language-specific sample that reads the timer object to see whether it's running late.
C#
F#
Node.js
Trigger sample in C#
Trigger sample in F#
Trigger sample in Node.js
module.exports = function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    if (myTimer.isPastDue)
    {
        context.log('Node.js is running late!');
    }
    context.log('Node.js timer trigger function ran!', timeStamp);
    context.done();
};
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and
bindings developer reference
Azure Functions Twilio output binding
11/15/2016 • 4 min to read • Edit on GitHub
Contributors
wesmc • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil
This article explains how to configure and use Twilio bindings with Azure Functions.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Azure Functions supports Twilio output bindings to enable your functions to send SMS text messages with a few
lines of code and a Twilio account.
name: Variable name used in function code for the Twilio SMS text message.
type : must be set to "twilioSms".
accountSid : This value must be set to the name of an App Setting that holds your Twilio Account Sid.
authToken : This value must be set to the name of an App Setting that holds your Twilio authentication token.
to : This value is set to the phone number that the SMS text is sent to.
from : This value is set to the phone number that the SMS text is sent from.
direction : must be set to "out".
body : This value can be used to hard code the SMS text message if you don't need to set it dynamically in
the code for your function.
Example function.json:
{
"type": "twilioSms",
"name": "message",
"accountSid": "TwilioAccountSid",
"authToken": "TwilioAuthToken",
"to": "+1704XXXXXXX",
"from": "+1425XXXXXXX",
"direction": "out",
"body": "Azure Functions Testing"
}
using System;
using Newtonsoft.Json;
using Twilio;
public static void Run(string myQueueItem, out SMSMessage message, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// Even if you want to use a hard coded message and number in the binding, you must at least
// initialize the SMSMessage variable.
message = new SMSMessage();
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message to the mobile number provided for
// order status updates.
message.Body = msg;
message.To = order.mobileNumber;
}
Asynchronous
This asynchronous example code for an Azure Storage queue trigger sends a text message to a customer who
placed an order.
#r "Newtonsoft.Json"
#r "Twilio.Api"
using System;
using Newtonsoft.Json;
using Twilio;
public static async Task Run(string myQueueItem, IAsyncCollector<SMSMessage> message, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// Even if you want to use a hard coded message and number in the binding, you must at least
// initialize the SMSMessage variable.
SMSMessage smsText = new SMSMessage();
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message to the mobile number provided for
// order status updates.
smsText.Body = msg;
smsText.To = order.mobileNumber;
await message.AddAsync(smsText);
}
This example Node.js code for a Storage queue trigger does the same:
module.exports = function (context, myQueueItem) {
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
var msg = "Hello " + myQueueItem.name + ", thank you for your order.";
// Even if you want to use a hard coded message and number in the binding, you must at least
// initialize the message binding.
context.bindings.message = {};
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message to the mobile number provided for
// order status updates.
context.bindings.message = {
body : msg,
to : myQueueItem.mobileNumber
};
context.done();
};
Next steps
For information about other bindings and triggers for Azure Functions, see Azure Functions triggers and
bindings developer reference
Create a function from the Azure portal
11/22/2016 • 3 min to read • Edit on GitHub
Contributors
Glenn Gailey • tfitzmac • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil
Overview
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application
platform with capabilities to implement code triggered by events occurring in other Azure services, SaaS products,
and on-premises systems. With Azure Functions, your applications scale based on demand and you pay only for
the resources you consume. Azure Functions enables you to create scheduled or triggered units of code
implemented in various programming languages. To learn more about Azure Functions, see the Azure Functions
Overview.
This topic shows you how to use the Azure portal to create a simple "hello world" Node.js Azure Function that is
invoked by an HTTP-trigger. Before you can create a function in the Azure portal, you must explicitly create a
function app in Azure App Service. To have the function app created for you automatically, see the other Azure
Functions quickstart tutorial, which is a simpler quickstart experience and includes a video.
Before you can create your first function, you need to have an active Azure account. If you don't already have an
Azure account, free accounts are available.
Resource Group : Select Create new and enter a name for your new resource group. You can also
choose an existing resource group; however, you may not be able to create a consumption-based App
Service plan for your function app.
Hosting plan , which can be one of the following:
Consumption plan : The default plan type for Azure Functions. When you choose a consumption
plan, you must also choose the Location and set the Memory Allocation (in MB). For
information about how memory allocation affects costs, see Azure Functions pricing.
App Service plan : An App Service plan requires you to create an App Service plan/location or
select an existing one. These settings determine the location, features, cost, and compute resources
associated with your app.
Storage account : Each function app requires a storage account. You can either choose an existing
storage account or create one.
3. Click Create to provision and deploy the new function app.
Now that the function app is provisioned, you can create your first function.
Create a function
These steps create a function from the Azure Functions quickstart.
1. In the Quickstart tab, click WebHook + API and JavaScript , then click Create a function . A new
predefined Node.js function is created.
2. (Optional) At this point in the quickstart, you can choose to take a quick tour of Azure Functions features in
the portal. After you have completed or skipped the tour, you can test your new function by using the HTTP
trigger.
1. In the Develop tab, review the Code window and notice that this Node.js code expects an HTTP request
with a name value passed either in the message body or in a query string. When the function runs, this value
is returned in the response message.
2. Click Test to display the built-in HTTP test request pane for the function.
3. In the Request body text box, change the value of the name property to your name, and click Run . You see
that execution is triggered by a test HTTP request, information is written to the streaming logs, and the
"hello" response is displayed in the Output .
4. To trigger execution of the same function from another browser window or tab, copy the Function URL
value from the Develop tab and paste it in a browser address bar. Append the query string value
&name=yourname to the URL and press enter. The same information is written to the logs and the browser
displays the "hello" response as before.
Next steps
This quickstart demonstrates a simple execution of a basic HTTP-triggered function. To learn more about using
Azure Functions in your apps, see the following topics:
Contributors
wesmc • Joseph Molnar • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Sylvan Clebsch • Cory Fowler • Glenn Gailey
Overview
In this tutorial, we will walk through different approaches to testing functions. We will define an HTTP trigger
function that accepts input through a query string parameter or the request body. The default HttpTrigger
Node.js function template code supports a name query string parameter. We will also add code to support
that parameter along with address information for the user in the request body.
The default function template is basically a hello world function that echoes back the name from the request
body or query string parameter, name=<your name> . We will update the code to also allow you to provide the
name and an address as JSON content in the request body. Then the function will echo these back to the client
when available.
Update the function with the following code which we will use for testing:
module.exports = function(context, req) {
    context.log("Node.js HTTP trigger function processed a request. RequestUri=%s", req.originalUrl);
    context.log("Request Headers = " + JSON.stringify(req.headers));
    // Build the echo string from the name passed in the query string or request body.
    var name = req.query.name || (req.body && req.body.name);
    var echoString = "Hello " + name;
    context.res = {
        // status: 200, /* Defaults to 200 */
        body: echoString
    };
    context.done();
}
To test the function above, copy the Function Url from the portal. It will have the following form:
This is the URL for triggering your function. We can test it by using the cURL command on the command line to
make a GET ( -G or --get ) request against our function:
This example requires a query string parameter, which can be passed as data ( -d ) in the cURL
command:
curl -G https://<Your Function App>.azurewebsites.net/api/<Your Function Name>?code=<your access code> -d name=
<Enter a name here>
Hit enter and you will see the output of the function on the command-line.
In the portal Logs window, output similar to the following is logged while executing the function:
To test the function we defined above, copy the Function Url from the portal. It will have the following form:
Append the name query string parameter as follows, using an actual name for the <Enter a name here>
placeholder.
Paste the URL into your browser and you should get a response similar to the following.
In the portal Logs window, output similar to the following is logged while executing the function:
TIP
Use whichever REST client you are comfortable with. Here are some alternatives to Postman:
Fiddler
Paw
1. Launch Postman from the Apps button in the upper-left corner of a Chrome browser window.
2. Copy your Function Url and paste it into Postman. It includes the access code query string parameter.
3. Change the HTTP method to POST .
4. Click Body > raw and add JSON request body similar to the following:
{
"name" : "Wes testing with Postman",
"address" : "Seattle, W.A. 98101"
}
5. Click Send .
The following image shows testing the simple echo function example in this tutorial.
In the portal Logs window, output similar to the following is logged while executing the function:
2016-03-23T08:04:51 Welcome, you are now connected to log-streaming service.
2016-03-23T08:04:57.107 Function started (Id=dc5db8b1-6f1c-4117-b5c4-f6b602d538f7)
2016-03-23T08:04:57.763 Node.js HTTP trigger function processed a request.
RequestUri=https://functions841def78.azurewebsites.net/api/WesmcHttpTriggerNodeJS1?code=XXXXXXXXXX==
2016-03-23T08:04:57.763 Request Headers = {"cache-control":"no-cache","connection":"Keep-
Alive","accept":"*/*","accept-encoding":"gzip","accept-language":"en-US"}
2016-03-23T08:04:57.763 Processing user info from request body...
2016-03-23T08:04:57.763 Processing User Information...
2016-03-23T08:04:57.763 name = Wes testing with Postman
2016-03-23T08:04:57.763 address = Seattle, W.A. 98101
2016-03-23T08:04:57.795 Function completed (Success, Id=dc5db8b1-6f1c-4117-b5c4-f6b602d538f7)
1. In the Azure Portal for your Functions app, create a new C#, F# or Node blob trigger function. Set the path
to monitor to the name of your blob container. For example:
files
2. Click the + button to select or create the storage account you want to use. Then click Create .
3. Create a text file with the following text and save it:
4. Run Microsoft Azure Storage Explorer and connect to the blob container in the storage account being
monitored.
5. Click the Upload button and upload the text file.
The default blob trigger function code will report the processing of the blob in the logs:
2016-03-24T11:30:10 Welcome, you are now connected to log-streaming service.
2016-03-24T11:30:34.472 Function started (Id=739ebc07-ff9e-4ec4-a444-e479cec2e460)
2016-03-24T11:30:34.472 C# Blob trigger function processed: A text file for blob trigger function
testing.
2016-03-24T11:30:34.472 Function completed (Success, Id=739ebc07-ff9e-4ec4-a444-e479cec2e460)
Test the HTTP trigger function we created earlier by adding a JSON string similar to the following in the Request
body field then click the Run button.
{
"name" : "Wes testing Run button",
"address" : "USA"
}
In the portal Logs window, output similar to the following is logged while executing the function:
You could use a timer trigger configured with a queue output binding. That timer trigger code could then write
the test messages to the queue. This section will walk through an example.
For more in-depth information on using bindings with Azure Functions, see the Azure Functions developer
reference.
Create queue trigger for testing
To demonstrate this approach, we will first create a queue trigger function that we want to test for a queue
named queue-newusers . This function will process name and address information for a new user dropped into
Azure queue storage.
NOTE
If you use a different queue name, make sure the name you use conforms to the Naming Queues and Metadata rules.
Otherwise, you will get an HTTP status code 400: Bad Request.
1. In the Azure Portal for your Functions app, click New Function > QueueTrigger - C# .
2. Enter the queue name to be monitored by the queue function
queue-newusers
3. Click the + (add) button to select or create the storage account you want to use. Then click Create .
4. Leave this portal browser window open so you can monitor the log entries for the default queue function
template code.
Create a timer trigger to drop a message in the queue
1. Open the Azure Portal in a new browser window and navigate to your Function app.
2. Click New Function > TimerTrigger - C# . Enter a CRON expression to set how often the timer code
executes to test your queue function. Then click Create . If you want the test to run every 30 seconds, you
can use the following CRON expression:
*/30 * * * * *
5. Add a queue output binding to the timer function and enter the name to use for the queue message object in code:
myQueue
6. Enter the name of the queue the message is written to:
queue-newusers
7. Click the + (add) button to select the storage account you used previously with the queue trigger. Then click
Save .
8. Click the Develop tab for your timer trigger.
9. You can use the following code for the C# timer function as long as you used the same queue message
object name shown above. Then click Save .
using System;
public static void Run(TimerInfo myTimer, out String myQueue, TraceWriter log)
{
String newUser =
"{\"name\":\"User testing from C# timer function\",\"address\":\"XYZ\"}";
myQueue = newUser;
}
At this point, the C# timer function will execute every 30 seconds if you used the example CRON expression. The
logs for the timer function will report each execution:
Code Example:
var https = require("https");

var nameBodyJSON = {
    name : "Wes testing with Node.JS code",
    address : "Dallas, T.X. 75201"
};
var bodyString = JSON.stringify(nameBodyJSON);

var options = {
    host: "functions841def78.azurewebsites.net",
    //path: "/api/HttpTriggerNodeJS2?code=sc1wt62opn7k9buhrm8jpds4ikxvvj42m5ojdt0p91lz5jnhfr2c74ipoujyq26wab3wk5gkfbt9&" + nameQueryString,
    path: "/api/HttpTriggerNodeJS2?code=sc1wt62opn7k9buhrm8jpds4ikxvvj42m5ojdt0p91lz5jnhfr2c74ipoujyq26wab3wk5gkfbt9",
    method: "POST",
    headers : {
        "Content-Type": "application/json",
        "Content-Length": Buffer.byteLength(bodyString)
    }
};

callback = function(response) {
    var str = "";
    response.on("data", function (chunk) {
        str += chunk;
    });
    response.on("end", function () {
        console.log(str);
    });
};

var req = https.request(options, callback);
req.write(bodyString);
req.end();
Output:
C:\Users\Wesley\testing\Node.js>node testHttpTriggerExample.js
*** Sending name and address in body ***
{"name" : "Wes testing with Node.JS code","address" : "Dallas, T.X. 75201"}
Hello Wes testing with Node.JS code
The address you provided is Dallas, T.X. 75201
In the portal Logs window, output similar to the following is logged while executing the function:
Example C# code:
static void Main(string[] args)
{
string name = null;
string address = null;
string queueName = "queue-newusers";
string JSON = null;
if (args.Length > 0)
{
name = args[0];
}
if (args.Length > 1)
{
address = args[1];
}
    // ... (the remaining code that serializes the name and address to JSON
    // and adds the message to the queue is not shown in this excerpt)
}
In the browser window for the queue function, you will see each message being processed:
Contributors
Donna Malayeri • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Sylvan Clebsch • Glenn Gailey • Tom Dykstra
NOTE
The Azure Functions local development and tooling experience is currently in preview and the experience will be significantly
improved before the final release. The easiest way to run the Azure Functions host locally is to use the Azure Functions CLI and
install it from npm. Currently, only Windows is supported. Full documentation is coming soon. To provide feedback or file
bugs on the Azure Functions CLI, please file an issue in the azure-webjobs-sdk-script GitHub repo.
Best Practices for Azure Functions
11/15/2016 • 3 min to read
Contributors
wesmc • Andy Pasic
Overview
This article provides a collection of best practices to consider when implementing function apps. Keep in
mind that your Azure function app runs on Azure App Service, so App Service best practices also apply.
Whenever possible, refactor large functions into smaller function sets that work together and return fast
responses. For example, a webhook or HTTP trigger function might require an acknowledgment response within a
certain time limit; it is common for webhooks to require an immediate response. You can pass the HTTP trigger
payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work
and return an immediate response.
Individual messages in a storage queue are limited in size to 64 KB. If you need to pass larger messages between
functions, an Azure Service Bus queue could be used to support message sizes up to 256 KB.
Service Bus topics are useful if you require message filtering before processing.
Idempotent functions are especially recommended with timer triggers. For example, if you have something that
absolutely must run once a day, write it so it can run any time during the day with the same results. The function
can exit when there is no work for a particular day. Also if a previous run failed to complete, the next run should
pick up where it left off.
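As an illustration of this kind of idempotent design (a sketch, not code from this article), the function below records the date of the last successful run and exits quietly when today's work is already done. The loadLastRunDate/saveLastRunDate helpers are hypothetical stand-ins for durable storage, such as a blob or a table row; here they use an in-memory stub:

```javascript
// In-memory stub standing in for durable storage of the last-run marker.
var state = { lastRunDate: null };

function loadLastRunDate() { return state.lastRunDate; }
function saveLastRunDate(date) { state.lastRunDate = date; }

// Safe to invoke any number of times per day: the work runs at most once.
function runDailyCleanup(today, doWork) {
    // A previous execution already finished today's work: exit quietly.
    if (loadLastRunDate() === today) {
        return "skipped";
    }
    doWork();
    // Record completion only after the work succeeds, so a failed run
    // leaves the marker untouched and the next execution retries it.
    saveLastRunDate(today);
    return "ran";
}
```

Because the marker is written only on success, a run that fails partway is naturally retried by the next timer execution.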
Depending on how complex your system is, you may encounter misbehaving downstream services, networking
outages, quota limits, and other issues, any of which can affect your function at any time. Design your
functions to be prepared for failure.
How does your code react if a failure occurs after inserting 5,000 of those items into a queue for processing?
Track completed items in a set; otherwise, you might insert them again on the next run, which can have a
serious impact on your workflow.
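One minimal sketch of this bookkeeping (illustrative only; the enqueue callback stands in for a real queue client):

```javascript
// Enqueue only the items not already recorded as completed, so that a
// retried run after a partial failure does not insert duplicates.
function enqueuePending(items, completedIds, enqueue) {
    var inserted = [];
    items.forEach(function (item) {
        if (completedIds.has(item.id)) {
            return; // already handled by a previous (possibly failed) run
        }
        enqueue(item);
        completedIds.add(item.id); // mark as done only after a successful insert
        inserted.push(item.id);
    });
    return inserted;
}
```

In a real function, the completed-ID set would itself live in durable storage (for example, a table) so it survives across executions.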
Take advantage of defensive measures already provided for components you use in the Azure Functions platform.
For example, see Handling poison queue messages in the documentation for Azure Storage Queue triggers.
Don't mix test and production code in the same function app.
Functions within a function app share resources. For example, memory is shared. If you're using a function app in
production, don't add test-related functions and resources to it. It can cause unexpected overhead during
production code execution.
Be careful what you load in your production function apps. Memory is averaged across each function in the app.
If you have a shared assembly referenced in multiple .NET functions, put it in a common shared folder. Reference
the assembly with a statement similar to the following example:
#r "..\Shared\MyAssembly.dll".
Otherwise, it is easy to accidentally deploy multiple test versions of the same binary that behave differently
between functions.
Don't use verbose logging in production code. It has a negative performance impact.
Next steps
For more information, see the following resources:
Contributors
Glenn Gailey • wesmc • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil
This topic shows you how to use Azure Functions to create a new function in C# that runs on an event timer
to clean up rows in a database table. The new function is created from a predefined template in the Azure
Functions portal. To support this scenario, you must also set a database connection string as an App Service setting
in the function app.
Prerequisites
Before you can create a function, you need to have an active Azure account. If you don't already have an Azure
account, free accounts are available.
This topic demonstrates a Transact-SQL command that executes a bulk cleanup operation in a table named
TodoItems in a SQL database. This same TodoItems table is created when you complete the Azure App Service
Mobile Apps quickstart tutorial, or you can use a sample database. If you choose to use a different table, you will
need to modify the command.
You can get the connection string used by a Mobile Apps backend in the portal under All settings > Application
settings > Connection strings > Show connection string values > MS_TableConnectionString. You can
also get the connection string directly from a SQL database in the portal under All settings > Properties > Show
database connection strings > ADO.NET (SQL authentication).
This scenario uses a bulk operation against the database. To have your function process individual CRUD
operations in a Mobile Apps table, you should instead use Mobile Table binding.
1. Go to the Azure Functions portal and sign in with your Azure account.
2. If you have an existing function app to use, select it from Your function apps then click Open. To create a new
function app, type a unique Name for your new function app or accept the generated one, select your preferred
Region, then click Create + get started.
3. In your function app, click Function app settings > Go to App Service settings.
4. In your function app, click All settings, scroll down to Application settings, then under Connection
strings type sqldb_connection for Name, paste the connection string into Value, click Save, then close the
function app blade to return to the Functions portal.
Now, you can add the C# function code that connects to your SQL Database.
5. Click Save, watch the Logs window for the next function execution, then note the number of rows deleted
from the TodoItems table.
6. (Optional) Using the Mobile Apps quickstart app, mark additional items as "completed" then return to the Logs
window and watch the same number of rows get deleted by the function during the next execution.
Next steps
See these topics for more information about Azure Functions.
Contributors
Rachel Appel • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil
Settings overview
You can manage Azure function app settings by clicking the Function App Settings link in the bottom-left
corner of the portal. Function app settings apply to all functions in the app.
Memory size
You can configure how much memory to allocate for your functions in the current function app.
To configure memory, move the slider to the desired amount. The maximum is 128 MB.
Continuous integration
You can integrate your Function App with GitHub, Visual Studio Team Services, and more.
1. Click the Configure continuous integration link. This opens a Deployments pane with options.
2. Click Setup in the Deployments pane to reveal a Deployment Source pane, then click Choose
Source to show the available sources.
3. Click any of the available deployment sources: Visual Studio Team Services, OneDrive, a local Git
repository, GitHub, Bitbucket, Dropbox, or an external repository.
4. Enter your credentials and information as prompted by the various deployment sources. The credentials and
information requested may be slightly different depending on what source you have chosen.
Once you have set up CI, code you push to the configured source is automatically deployed to this
function app.
Authentication/authorization
For functions that use an HTTP trigger, you can require calls to be authenticated.
For details on configuring specific authentication providers, see Azure App Service authentication overview.
CORS
Normally, for security reasons, calls to your hosts (domains) from external sources, such as Ajax calls from a
browser, are not allowed; otherwise, malicious code could be sent to and executed on the backend. The safest
approach is therefore to block all origins except a few trusted ones of your own. In Azure Functions, you configure
which origins are accepted through Cross-Origin Resource Sharing (CORS). CORS lets you list the domains
that serve JavaScript allowed to call functions in your Azure function app.
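To illustrate the decision CORS enforces (in Azure Functions you configure the allowed-origin list in the portal; this sketch is an illustration, not the platform's implementation):

```javascript
// Return the CORS response header for a request's Origin header, given the
// configured allowed-origin list. An unlisted origin gets no header, so the
// browser blocks the cross-origin response.
function corsHeaders(origin, allowedOrigins) {
    if (allowedOrigins.indexOf(origin) !== -1) {
        return { "Access-Control-Allow-Origin": origin };
    }
    return {};
}
```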
API definition
For more information on creating API definitions with Swagger, visit Get Started with API Apps, ASP.NET, and
Swagger in Azure.
Application settings
Manage environment variables, Framework versions, remote debugging, app settings, connection strings, default
docs, etc. These settings are specific to your Function App.
NOTE
You can upload scripts, but first you must configure an FTP client in the Azure Function's Advanced Settings .
Kudu
Kudu allows you to access advanced administrative features of a Function App.
To open Kudu, click Go to Kudu . This action opens an entirely new browser window with the Kudu web admin.
NOTE
You can alternatively launch Kudu by inserting "scm" into your function app's URL, as shown here:
https://<YourFunctionAppName>.scm.azurewebsites.net/
From the Kudu webpage, you can view and manage system information, app settings, environment variables, HTTP
headers, server variables, and more.
Advanced settings
Manage your function app like any other App Service instance. This option gives you access to all the previously
discussed settings, plus several more.
Next steps
Need some help?
Post questions in the Azure forums on MSDN.
Contributors
Glenn Gailey • wesmc • cephalin • Kim Whitlatch (Beyondsoft Corporation) • Tyson Nevil • Seth Reid
Azure Functions makes it easy to configure continuous deployment for your function app. Functions uses Azure
App Service integration with Bitbucket, Dropbox, GitHub, and Visual Studio Team Services (VSTS) to enable a
continuous deployment workflow where Azure pulls updates to your functions code when they are published to
one of these services. If you are new to Azure Functions, start with Azure Functions Overview.
Continuous deployment is a great option for projects where multiple and frequent contributions are being
integrated. It also lets you maintain source control on your functions code. The following deployment sources are
currently supported:
Bitbucket
Dropbox
Git local repo
Git external repo
GitHub
Mercurial external repo
OneDrive
Visual Studio Team Services
Deployments are configured on a per-function-app basis. After continuous deployment is enabled, access to
function code in the portal becomes read-only.
The code for all of the functions in a given function app lives in a root folder that contains a host configuration file
and one or more subfolders, each of which contains the code for a separate function, as in the following example:
wwwroot
| - host.json
| - mynodefunction
| | - function.json
| | - index.js
| | - node_modules
| | | - ... packages ...
| | - package.json
| - mycsharpfunction
| | - function.json
| | - run.csx
The host.json file contains some runtime-specific configuration and sits in the root folder of the function app. For
information on settings that are available, see host.json in the WebJobs.Script repository wiki.
Each function has a folder that contains one or more code files, the function.json configuration file, and other
dependencies.
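As an illustration (not taken from this article), a minimal function.json for the mynodefunction folder shown above might look like the following; the binding names and authLevel value are assumptions:

```json
{
    "disabled": false,
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "authLevel": "function"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
```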
1. In your function app in the Azure Functions portal, click Function app settings > Configure continuous
integration > Setup.
You can also get to the Deployments blade from the Functions quickstart by clicking Start from source
control.
2. In the Deployment source blade, click Choose source, then fill in the information for your chosen
deployment source and click OK.
After continuous deployment is configured, all files in your deployment source are copied to the function
app and a full site deployment is triggered. The site is redeployed whenever files in the source are updated.
Deployment options
The following are some typical deployment scenarios:
The process to configure and work with a staging deployment looks generally like this:
1. Create two function apps in your subscription, one for the production code and one for staging.
2. Create a deployment source, if you don't already have one. This example uses GitHub.
3. For your production function app, complete the above steps in Set up continuous deployment and set
the deployment branch to the master branch of your GitHub repo.
4. Repeat this step for the staging function app, but choose the staging branch instead in your GitHub repo. If
your deployment source doesn't support branching, use a different folder.
5. Make updates to your code in the staging branch or folder, then verify that those changes are reflected in
the staging deployment.
6. After testing, merge changes from the staging branch into the master branch. This will trigger deployment
to the production function app. If your deployment source doesn't support branches, overwrite the files in
the production folder with the files from the staging folder.
NOTE
After you configure continuous integration, you will no longer be able to edit your source files in the Functions portal.
1. In your function app in the Azure Functions portal, click Function app settings > Go to App Service
settings > Deployment credentials.
2. Type in a username and password, then click Save. You can now use these credentials to access your
function app from FTP or the built-in Git repo.
How to: Download files using FTP
1. In your function app in the Azure Functions portal, click Function app settings > Go to App Service
settings > Properties and copy the values for FTP/Deployment User, FTP Host Name, and FTPS Host
Name.
FTP/Deployment User must be entered as displayed in the portal, including the app name, in order to
provide proper context for the FTP server.
2. From your FTP client, use the connection information you gathered to connect to your app and download
the source files for your functions.
How to: Download files using the local Git repository
1. In your function app in the Azure Functions portal, click Function app settings > Configure continuous
integration > Setup.
2. In the Deployments blade, click Choose source > Local Git repository, then click OK.
3. Click Go to App Service settings > Properties and note the value of Git URL.
4. Clone the repo on your local machine using a Git-aware command line or your favorite Git tool, passing the
Git URL you noted, as in: git clone <Git URL>
5. Fetch files from your function app to the clone on your local computer by pulling from the repo, as in:
git pull origin master
If requested, supply the username and password for your function app deployment.
Monitoring Azure Functions
11/22/2016 • 2 min to read
Contributors
wesmc • Andy Pasic
Overview
The Monitor tab for each function allows you to review each execution of a function.
Clicking an execution allows you to review the duration, input data, errors, and associated log files. This is useful
for debugging and performance tuning your functions.
IMPORTANT
When using the Consumption hosting plan for Azure Functions, the Monitoring tile in the Function App overview blade will
not show any data. This is because the platform dynamically scales and manages compute instances for you, so these metrics
are not meaningful on a Consumption plan. To monitor the usage of your Function Apps, you should instead use the
guidance in this article.
The following screenshot shows an example:
Real-time monitoring
Real-time monitoring is available by clicking live event stream as shown below.
The live event stream will be graphed in a new browser tab as shown below.
NOTE
There is a known issue that may prevent your event stream data from being populated. If you experience this,
close the browser tab containing the live event stream, then click live event stream again to allow it to properly
populate your event stream data.
The live event stream will graph the following statistics for your function:
These statistics are real-time, but the actual graphing of the execution data may have around 10 seconds of latency.
Log in to your Azure account using the following command, or any of the other options covered in Log in to Azure
from the Azure CLI.
azure login
Use the following command to enable Azure CLI Service Management (ASM) mode:
azure config mode asm
If you have multiple subscriptions, use the following commands to list your subscriptions and set the current
subscription to the one that contains your function app:
azure account list
azure account set <subscription name or ID>
The following command streams the log files of your function app to the command line:
azure site log tail <function app name>
PS C:\> Add-AzureAccount
If you have multiple subscriptions, you can list them by name with the following command to see if the correct
subscription is the currently selected based on IsCurrent property:
PS C:\> Get-AzureSubscription
If you need to set the active subscription to the one containing your function app, use the following command:
PS C:\> Select-AzureSubscription -SubscriptionName <subscription name>
Stream the logs to your PowerShell session with the following command:
PS C:\> Get-AzureWebsiteLog -Name <function app name> -Tail
For more information, see How to: Stream logs for web apps.
Next steps
For more information, see the following resources:
Testing a function
Scale a function