Azure Functions: Node.js v4 programming model is Generally Available


The Azure Functions team is thrilled to announce General Availability of version 4 of the Node.js programming model! This programming model is part of Azure Functions’ larger effort to provide a more flexible and intuitive experience for all supported languages. You may be aware that we announced General Availability of the new Python programming model for Azure Functions at Microsoft Build this year. The new Node.js experience we ship today is the result of valuable feedback we received from JavaScript and TypeScript developers through GitHub, surveys, and user studies, as well as suggestions from internal Node.js experts working closely with customers. 


 


This blog post aims to highlight the key features of the v4 model and also shed light on the improvements we’ve made since announcing public preview last spring. 


 


What’s improved in the V4 model? 


 


In this section, we highlight several key improvements made in the V4 programming model. 


 


Flexible folder structure  


 


The existing V3 model requires that each trigger be in its own directory, with its own function.json file. This strict structure can make an app with many triggers hard to manage. And if you’re a Durable Functions user, having your orchestration, activity, and client functions in different directories decreases code readability, because you must switch between directories to look at the components of one logical unit. The V4 model removes the strict directory structure and gives users the flexibility to organize triggers in ways that make sense for their function app. For example, you can have multiple related triggers in one file or have triggers in separate files that are grouped in one directory. 
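
For example, a rough sketch of this kind of grouping might look like the following, where all of an app’s order-related triggers live in a single file (the file, function, and queue names here are hypothetical):

import { app, HttpRequest, HttpResponseInit, InvocationContext } from '@azure/functions';

// orders.ts (hypothetical): every order-related trigger for the app lives in this one file.

// HTTP trigger that returns the list of orders.
app.http('getOrders', {
    methods: ['GET'],
    handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
        return { jsonBody: [{ id: 1, status: 'shipped' }] };
    },
});

// Queue trigger that processes new orders, grouped with the HTTP trigger above.
app.storageQueue('processOrder', {
    queueName: 'orders',
    connection: 'AzureWebJobsStorage',
    handler: async (queueItem: unknown, context: InvocationContext): Promise<void> => {
        context.log('Processing order:', queueItem);
    },
});

The same flexibility applies to Durable Functions, where the orchestrator, activity, and client functions for one workflow can sit side by side in a single file (see the “More Examples” section).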


 


Furthermore, in the V4 model you no longer need to keep a function.json file for each trigger, because bindings are configured in code! See the HTTP example in the next section and the Durable Functions example in the “More Examples” section. 


 


Define function in code 


 


The V4 model uses an app object as the entry point for registering functions instead of function.json files. For example, to register an HTTP trigger responding to a GET request, you can call app.http() or app.get(), which was modeled after Node.js frameworks like Express.js that also support app.get(). The following shows what has changed when writing an HTTP trigger in the V4 model: 


 





















JavaScript – V3

module.exports = async function (context, req) {
  context.log('HTTP function processed a request');

  const name = req.query.name
    || req.body
    || 'world';

  context.res = {
    body: `Hello, ${name}!`
  };
};

JavaScript – V4

const { app } = require("@azure/functions");

app.http('helloWorld1', {
  methods: ['GET', 'POST'],
  handler: async (request, context) => {
    context.log('Http function processed request');

    const name = request.query.get('name')
      || await request.text()
      || 'world';

    return { body: `Hello, ${name}!` };
  }
});

function.json – V3

{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}

function.json – V4

Nothing – bindings are configured in code, so no function.json file is needed.

 


 



 


Trigger configuration such as methods and authLevel, which used to be specified in a function.json file, is moved into the code itself in V4. We also set several defaults for you, which is why you don’t see authLevel or an output binding in the V4 example. 
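
If a default doesn’t fit your scenario, you can override it in the same options object. The following is a minimal sketch (the function name and route are hypothetical) that sets authLevel and a custom route explicitly:

import { app, HttpRequest, HttpResponseInit, InvocationContext } from '@azure/functions';

app.http('helloWorld2', {
    methods: ['GET'],
    authLevel: 'function',   // override the default auth level instead of relying on it
    route: 'hello/{name}',   // custom route in place of the function name
    handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
        // route parameters are available on request.params
        return { body: `Hello, ${request.params.name}!` };
    },
});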


 


New HTTP Types 


 


In the V4 model, we’ve adjusted the HTTP request and response types to be a subset of the fetch standard instead of types unique to Azure Functions. We use Node.js’s undici package, which follows the fetch standard and is currently being integrated into Node.js core. 


 


HttpRequest – body 














V3

// returns a string, object, or Buffer
const body = request.body;
// returns a string
const body = request.rawBody;
// returns a Buffer
const body = request.bufferBody;
// returns an object representing a form
const body = await request.parseFormBody();

V4

const body = await request.text();
const body = await request.json();
const body = await request.formData();
const body = await request.arrayBuffer();
const body = await request.blob();


 


HttpResponse – status 














V3

context.res.status(200);
context.res = { status: 200 };
context.res = { statusCode: 200 };
return { status: 200 };
return { statusCode: 200 };

V4

return { status: 200 };


 


To see how other properties like headers, query parameters, etc. have changed, see our developer guide. 
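
As a quick illustration, here is a rough sketch (the function name and route are illustrative) of how route parameters, query parameters, and headers are read with the new fetch-style types:

import { app, HttpRequest, HttpResponseInit, InvocationContext } from '@azure/functions';

app.http('inspectRequest', {
    methods: ['GET'],
    route: 'items/{id}',
    handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
        const id = request.params.id;                             // route parameter
        const filter = request.query.get('filter');               // query string, URLSearchParams-style
        const contentType = request.headers.get('content-type');  // fetch-style Headers object

        return { status: 200, jsonBody: { id, filter, contentType } };
    },
});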


 


Better IntelliSense 


 


If you’re not familiar with IntelliSense, it covers editor features like autocomplete and inline documentation that appear directly while you code. We’re big fans of IntelliSense, and we hope you are too, because it was a priority for us from the initial design stages. The V4 model supports IntelliSense for JavaScript for the first time and improves on the IntelliSense for TypeScript that already existed in V3. Here are a few examples: 


 


[Screenshots: IntelliSense autocompletion and inline documentation for the v4 APIs in the editor.]


 


More Examples 


 


NOTE: One of the priorities of the V4 programming model is to ensure parity between JavaScript and TypeScript support. You can use either language to write all the examples in this article, but we only show one language for the sake of article length. 


 


Timer (TypeScript) 


 


A timer trigger that runs every 5 minutes: 


 


 

import { app, InvocationContext, Timer } from '@azure/functions'; 

export async function timerTrigger1(myTimer: Timer, context: InvocationContext): Promise<void> { 
    context.log('Timer function processed request.'); 
} 

app.timer('timerTrigger1', { 
    schedule: '0 */5 * * * *', 
    handler: timerTrigger1, 
}); 

 


 


Durable Functions (TypeScript) 


 


Like in the V3 model, you need the durable-functions package in addition to @azure/functions to write Durable Functions in the V4 model. The example below shows one of the common patterns Durable Functions is useful for – function chaining. In this case, we’re executing a sequence of (simple) functions in a particular order. 


 


 

import { app, HttpHandler, HttpRequest, HttpResponse, InvocationContext } from '@azure/functions'; 
import * as df from 'durable-functions'; 
import { ActivityHandler, OrchestrationContext, OrchestrationHandler } from 'durable-functions'; 

// Replace with the name of your Durable Functions Activity 
const activityName = 'hello'; 

const orchestrator: OrchestrationHandler = function* (context: OrchestrationContext) { 
    const outputs = []; 
    outputs.push(yield context.df.callActivity(activityName, 'Tokyo')); 
    outputs.push(yield context.df.callActivity(activityName, 'Seattle')); 
    outputs.push(yield context.df.callActivity(activityName, 'Cairo')); 

    return outputs; 
}; 
df.app.orchestration('durableOrchestrator1', orchestrator); 

const helloActivity: ActivityHandler = (input: string): string => { 
    return `Hello, ${input}`; 
}; 
df.app.activity(activityName, { handler: helloActivity }); 

const httpStart: HttpHandler = async (request: HttpRequest, context: InvocationContext): Promise<HttpResponse> => { 
    const client = df.getClient(context); 
    const body: unknown = await request.text(); 
    const instanceId: string = await client.startNew(request.params.orchestratorName, { input: body }); 

    context.log(`Started orchestration with ID = '${instanceId}'.`); 

    return client.createCheckStatusResponse(request, instanceId); 
}; 

app.http('durableOrchestrationStart1', { 
    route: 'orchestrators/{orchestratorName}', 
    extraInputs: [df.input.durableClient()], 
    handler: httpStart, 
}); 

 


 


In the code above, the orchestrator constant sets up and registers an orchestration function. In the V4 model, instead of registering the orchestration trigger in function.json, you simply do it through the app object on the durable-functions module (here df). Similar logic applies to the activity (helloActivity), client (httpStart), and Entity functions. This means you no longer have to manage multiple function.json files just to get a simple Durable Functions app working!  


 


The httpStart function sets up and registers a client function to start the orchestration. To do that, we pass an input object from the durable-functions module to the extraInputs array when registering the function. As in the V3 model, we obtain the durable client using df.getClient() to perform orchestration management operations such as starting a new orchestration. We use an HTTP trigger in this example, but you could use any trigger supported by Azure Functions, such as a timer trigger or Service Bus trigger. 
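
For example, a minimal sketch of a timer-triggered client for the orchestration above might look like the following (the function name and schedule are illustrative):

import { app, InvocationContext, Timer } from '@azure/functions';
import * as df from 'durable-functions';

const timerStart = async (myTimer: Timer, context: InvocationContext): Promise<void> => {
    const client = df.getClient(context);
    // Start the orchestration registered earlier in this article; no input is passed in this sketch.
    const instanceId: string = await client.startNew('durableOrchestrator1');
    context.log(`Started orchestration with ID = '${instanceId}'.`);
};

app.timer('durableTimerStart1', {
    schedule: '0 0 * * * *',                  // hypothetical schedule: top of every hour
    extraInputs: [df.input.durableClient()],
    handler: timerStart,
});

Only the trigger registration changes; the durable client input and the df.getClient() call are the same as in the HTTP example.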


 


Refer to this example to see how to write a Durable Entity with the V4 model. 


 


What’s new for GA?  


 


We made the following improvements to the v4 programming model since the announcement of Public Preview last spring. Most of these improvements were made to ensure full feature parity between the existing v3 and the new v4 programming model. 


 



  1. AzureWebJobsFeatureFlags no longer needs to be set 
    During preview, you needed to set the application setting “AzureWebJobsFeatureFlags” to “EnableWorkerIndexing” to get a v4 model app working. We removed this requirement as part of the General Availability update. This also allows you to use Azure Static Web Apps with the v4 model. You must be on runtime v4.25+ in Azure or core tools v4.0.5382+ if running locally to benefit from this change.
      

  2. Model v4 is now the default

    We’re confident v4 is ready for you to use everywhere, and it’s now the default version on npm, in documentation, and when creating new apps in Azure Functions Core Tools or VS Code.


      

  3. Entry point errors are now exposed via Application Insights
    In the v3 model and in the preview version of the v4 model, errors in entry point files were ignored and weren’t logged in Application Insights. We changed the behavior to make entry point errors more obvious. It’s a breaking change for model v4 as some errors that were previously ignored will now block your app from running. You can use the app setting “FUNCTIONS_NODE_BLOCK_ON_ENTRY_POINT_ERROR” to configure this behavior. We highly recommend setting it to “true” for all v4 apps. For more information, see the App Setting reference documentation.


  4. Support for retry policy 

    We added support for configuring a retry policy when registering a function in the v4 model. The retry policy tells the runtime to rerun a failed execution until either successful completion occurs or the maximum number of retries is reached. A retry policy is evaluated when a Timer, Kafka, Cosmos DB, or Event Hubs-triggered function raises an uncaught exception. As a best practice, you should catch all exceptions in your code and rethrow any errors that you want to result in a retry. Learn more about Azure Functions retry policies; a sketch of configuring one appears after this list.
     



  5. Support for Application Insights npm package
    Add the Application Insights npm package (v2.8.0+) to your app to discover and rapidly diagnose performance and other issues. This package tracks the following out-of-the-box: incoming and outgoing HTTP requests, important system metrics such as CPU usage, unhandled exceptions, and events from many popular third-party libraries.


  6. Support for more binding types
    We added support for SQL and Table input and output bindings. We also added Cosmos DB extension v4 types. A highlight of the latest Cosmos DB extension is that it allows you to use managed identities instead of secrets. Learn how to upgrade your Cosmos DB extension here and how to configure an app to use identities here.


  7. Support for hot reload

    Hot reload ensures your app will automatically restart when a file in your app is changed. This was not working for model v4 when we announced preview, but has been fixed for GA. 
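
Returning to the retry policy in item 4, here is a rough sketch of what that configuration can look like when registering a function in the v4 model. The function name and schedule are illustrative, and the exact shape of the retry options (in particular whether delayInterval is expressed in milliseconds) should be confirmed against the developer guide:

import { app, InvocationContext, Timer } from '@azure/functions';

app.timer('timerWithRetry', {
    schedule: '0 */5 * * * *',
    retry: {
        strategy: 'fixedDelay',  // 'exponentialBackoff' is the other supported strategy
        maxRetryCount: 2,        // rerun a failed execution up to two more times
        delayInterval: 5000,     // assumed here to be milliseconds; confirm against the reference docs
    },
    handler: async (myTimer: Timer, context: InvocationContext): Promise<void> => {
        context.log('Timer function with a retry policy processed a request.');
    },
});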




 


How to get started?


Check out our Quickstarts to get started. 



See our Developer Guide to learn more about the V4 model. We’ve also created an upgrade guide to help migrate existing V3 apps to V4. 


 


Please give the V4 model a try and let us know your thoughts! 


 


If you have questions and/or suggestions, please feel free to drop an issue in our GitHub repo. As this is an open-source project, we welcome any PR contributions from the community. 

Try the new outbound dialing experience in Dynamics 365 Customer Service 


In the fast-paced world of customer service, efficient outbound communication is a cornerstone of success. Dynamics 365 Customer Service has long been a trusted platform for managing customer interactions. With the upcoming October release, we’ve listened to your feedback and delivered a significant enhancement that is set to transform outbound dialing.  

Currently, modifying a dialed number is cumbersome because individual digits can’t be edited. Additionally, the absence of number validation increases the risk of agents dialing incorrect numbers, especially when country codes are missing. 

In the October release, you will find a more intuitive, streamlined, and efficient outbound dialing experience.

Editing flexibility

In the new outbound dialing experience, agents can continue to initiate calls from customer records. What’s changed? Now, modifying the dialed number is a breeze. The enhanced interface empowers agents to effortlessly edit the number before placing the call. This new experience also introduces auto-formatting, automatically structuring the number as agents type it. This functionality not only reduces errors but also highlights incomplete or invalid numbers. This newfound flexibility ensures accurate and effective outbound calling experiences.

Smart use of screen real estate

The improved interface is designed to optimize the available screen space. By default, the keypad is hidden, because most agents prefer to use the keyboard; this also allows a clearer view of essential information. Should agents need to use the keypad, however, it’s just a click away.

Recall recent numbers

Agents now have the power to swiftly call back recent numbers. With access to the last 20 dialed or received numbers, agents can easily reconnect with customers. This feature saves time and helps maintain a seamless communication flow.

Country and region support for outbound dialing

A significant advancement for administrators and agents alike is the support for specific countries and regions. Administrators can customize outbound profiles to allow calls only to selected countries or regions. This prevents accidental calls to unintended destinations, reinforcing precision in customer communication.

Intuitive profile selection and profile matching

Agents with multiple outbound profiles will appreciate the intuitive profile selection process. The dropdown menu displays the collective list of supported countries and regions from all profiles. Simplifying the process even further, agents need only enter the number they wish to dial. The system intelligently identifies the outbound profile supporting the dialed number’s country or region. This feature is coming as a fast follow in October.

The October release of Dynamics 365 Customer Service brings an outbound dialing experience with enhanced editing capabilities, smarter interface design, call history, number auto-formatting and validation, and refined country and region support. Agents can confidently and efficiently connect with customers, bolstering the delivery of exceptional customer service.

Learn more about outbound dialing

Watch a quick video demonstration.

To preview this feature, administrators should update the Settings definition for Enhanced outbound dialer experience to set the environment value to Yes. To learn more, see Call a customer in the voice channel | Microsoft Learn.

The post Try the new outbound dialing experience in Dynamics 365 Customer Service  appeared first on Microsoft Dynamics 365 Blog.


Microsoft Mesh enters preview in October, including a new Teams experience


We are re-imagining the way employees come together with Microsoft Mesh, a new three-dimensional (3D) immersive experience, and we are excited to announce that Mesh will be available in public preview in October.  

The post Microsoft Mesh enters preview in October, including a new Teams experience appeared first on Microsoft 365 Blog.


Leaders in Viva Engage reach employees and beyond with storyline announcements


Announcements have been a core part of Viva Engage for years, but they have recently become a critical way to keep employees informed and engaged with leaders. The broad delivery of announcements across Viva Engage, Viva Connections, Outlook, and Microsoft Teams means that employees can use rich engagement features like reactions, replies, and sharing from within the apps they use every day. Analytics help track the reach of announcements, so customers have come to rely on them to help run their business and measure the impact their communications are having across the network.


 


We’ve taken it one step further by enabling specific employees as leaders, because we know leaders want to share vision, updates, and perspectives to build culture and manage change. Once leaders are selected in Viva Engage and their audiences have been set up, they can send storyline announcements. Leaders and their delegates can now configure multiple audiences and target storyline announcements to them effectively, reaching and connecting with employees well beyond their own organizations.


 




 


Leaders can now send targeted storyline announcements to different audiences


 


Leaders and their delegates can now target storyline announcements to preconfigured audiences, expanding the leader’s ability to reach people beyond their direct reporting organization. Upon creating an announcement, the leader’s default audience is preselected. To add additional audiences, select the Change option. This brings up the Storyline announcements options window, in which the default audience can be changed to any of the configured ones. Once an audience is selected, confirm the channels before you send your message.


 




 


Set up multiple audiences


 


To learn how to configure audiences, visit Identify leaders and manage audiences in Viva Engage. To define a leader’s audience, you add individual users or groups, such as security, distribution, or Microsoft 365 groups. When you add a group, changes to the group’s membership, including nested members, automatically update the audience within 24 hours. This makes it easy to use the existing groups that define a leader’s organization to also define that leader’s audience in Viva Engage. Customers may have existing distribution lists that they use to communicate with an audience by email; you can add those lists to the leader’s audience in Viva Engage for continuous communication.


 




 


Send announcements to your audience across apps



Leaders can make an announcement, select an audience, and reach people across apps. Once the announcement is delivered, audiences can react and reply regardless of which app they receive it in. To make announcements more engaging, attach images or videos, ask a question, pose a poll to your community, or draw attention to specific actions by using rich media within the announcement. Announcements made by leaders are also highlighted in the leadership corner.


 




 


Analyze the impact of your communication


 


When you post an announcement in Viva Engage, you can expect the message to reach your audience through Microsoft Teams notifications, Outlook interactive messages, and Engage notifications, and we want to make sure you understand the impact of your communications by tracking their reach and the sentiment of your audience. With conversation insights, you can view how well your announcement has performed. With personal analytics, you can track effectiveness across multiple posts and announcements. With audience analytics, leaders can monitor sentiment, engagement, contributions, and themes across their audience, beyond what they have sent. This helps you understand what your audience thinks is important and identify what to post next.


 




 


What’s new and next for storyline


 


  • Storyline can be made available for only specific employees
  • Storyline announcements coming to multiple tenant organizations


 


Learn more about Viva Employee Communications and Communities and stay tuned to the M365 Roadmap for the latest updates about what’s coming for Viva Engage.


 


If you are looking to share important news and information with employees, try using announcements on your storyline posts. With the speed of delivery, ability to measure reach, and a way to spark two-way engagement, announcements are an essential way to keep your employees informed.

Azure Database for MariaDB is being retired on 19 September 2025


In the fall of 2018, we announced the general availability (GA) of Azure Database for MariaDB. Since that release five years ago, we’ve invested time and resources in Azure Database for MariaDB to further extend our commitment to the open-source community by providing valuable, enterprise-ready features of Azure for use on open-source database instances.


 


In November 2021 we released Flexible Server, the next-generation deployment option for Azure Database for MySQL. As we continue to invest in Azure Database for MySQL and focus our efforts on Flexible Server to make it the best destination for your open-source MySQL workloads, we’ve decided to retire the Azure Database for MariaDB service in two years (September 2025). This will help us focus on Azure Database for MySQL – Flexible Server to ensure that we are providing the best user experience for our customers.


 


Azure Database for MySQL – Flexible Server has enhanced features, performance, an improved architecture, and more controls to manage costs across all service tiers when compared to Azure Database for MariaDB. As a result, we encourage you to migrate to Azure Database for MySQL – Flexible Server before the Azure MariaDB retirement to experience the new capabilities of the service, including:



  • More ways to optimize costs, including support for burstable tier compute options.

  • Improved performance for business-critical production workloads that require low latency, high concurrency, fast failover, and high scalability.

  • Improved uptime with the ability to configure a hot standby on the same or a different zone, and a one-hour time window for planned server maintenance.


 


For more information about Flexible Server, see the article Azure Database for MySQL – Flexible Server.


 


Migrate from Azure Database for MariaDB to Azure Database for MySQL – Flexible Server


 


For information about how you can migrate your Azure Database for MariaDB server to Azure Database for MySQL – Flexible Server, see the blog post Migrating from Azure Database for MariaDB to Azure Database for MySQL.


 


Retirement announcement FAQs


 


We understand that you may have a lot of questions about what this announcement means for your Azure Database for MariaDB workloads. As a result, we’ve added several “frequently asked questions” in the article What’s happening to Azure Database for MariaDB?


 


For quick reference, we’ve included a few key questions and answers below.


 


Q. Why am I being asked to migrate to Azure Database for MySQL – Flexible Server? 


A. There’s high application compatibility between Azure Database for MariaDB and Azure Database for MySQL, because MariaDB was forked from MySQL. Azure Database for MySQL – Flexible Server is the best platform for running all your MySQL workloads on Azure. MySQL – Flexible Server is economical and provides better performance across all service tiers, together with more ways to control your costs and faster, less costly disaster recovery.


 


Q. After the Azure Database for MariaDB retirement announcement, can I still create new MariaDB servers to meet my business needs?


A. As part of this retirement, we’ll no longer support the ability to create new MariaDB instances by using the Azure portal beginning on December 19, 2023. If you still need to create MariaDB instances to meet business continuity needs, you can use the Azure CLI to do so until March 19, 2024.


 


Q. Can I choose to continue running Azure Database for MariaDB beyond the sunset date?


A. Unfortunately, we don’t plan to support Azure Database for MariaDB beyond the sunset date of September 19, 2025. As a result, we advise you to start planning your migration as soon as possible.


 


Q. I have additional questions about the retirement. How can I find out more?


A. If you have questions, get answers from community experts in Microsoft Q&A. If you have a support plan and you need technical help, create a support request that includes the following information:



  • For Issue type, select Technical.

  • For Subscription, select your subscription.

  • For Service, select My services.

  • For Service type, select Azure Database for MariaDB.

  • For Resource, select your resource.

  • For Problem type, select Migration.

  • For Problem subtype, select Migrating from Azure for MariaDB to Azure for MySQL Flexible Server.


 


If you have questions about the information in this post, please don’t hesitate to reach out to us at AskAzureDBforMariaDB@service.microsoft.com. Thank you!