Azure File Sync Replace Existing Server Endpoint

This article is contributed. See the original author and article here.

Introduction

This is Andrew Coughlin, and I am a Customer Engineer at Microsoft focusing on Azure IaaS.  In this blog I will focus on how to add an additional server to an Azure File Sync deployment and how to remove the old server.  Follow this process if you are looking to retire your old Azure File Sync server and replace it with a newer server.  However, if you are currently having issues with your Azure File Sync server, this blog does not cover your scenario; it is best to open a support request and work with an engineer to solve the problem.

 

Prerequisites

  • An existing Azure File Sync service, set up and configured.
  • A new Windows Server deployment.
  • The Azure File Sync agent, downloaded here.
  • Your Windows Server prepared to use Azure File Sync as documented here.

 

Install the Azure File Sync Agent

To install the Azure File Sync agent, go to the location where you downloaded the executable.

NOTE: Each version of Windows Server has its own version of the installation executable.

  1. Run the installer.
  2. Click Next on the welcome page of the setup wizard.
  3. Accept the terms and conditions and click Next.
  4. Click Next on Feature Selection.
  5. Set your proxy settings if required, then click Next.
  6. Check the box next to Automatically update when a new version becomes available, select the day and time, then click Install.
  7. Wait for the installation to finish.
  8. Click Finish.

 

Install AFS Agent.gif


Register it to the Storage Sync Service

Once you have installed the Azure File Sync agent, you need to register your server with the Storage Sync Service.  To do this, follow these steps on the new server:

  1. Click OK on the Azure File Sync – Agent Update prompt.
  2. Click Sign In.
  3. When prompted, enter your global administrator username and password.
  4. Select the subscription, resource group, and Storage Sync Service, then click Register.
  5. The network connectivity test should show green; you may need to click retest.
  6. Click Close once done.

 

Register AF Agent.gif

 

Add the Server Endpoint

So far we have installed the agent and registered the new server with the Storage Sync Service.  Next, we will add the new server endpoint to our existing sync group.  Go to the portal and search for Storage Sync Service.

 


Then you will click on the Storage Sync Service you are going to add this new server to.

 


Click on Overview and then click on your sync group.

 


Click on the old Server endpoint.

 


Take note of your current tiering policies for volume policy and date policy.

 


Click the X to close the Server Endpoint Properties window.

Click Add server endpoint.

 


On the Add server endpoint screen, select the server from the Registered Server dropdown list, and type the path to be synced as part of the sync group. (Example: E:\Shares)

 


If your current server endpoint is doing cloud tiering, enable Cloud Tiering and configure the new server endpoint to match the old server endpoint configuration, then click Create.

 


You will now see 2 server endpoints, and the new one you just added will be in a pending status.

 


When the setup is complete the new server should show healthy:

 


If you switch to the new server, you will see new files being populated:

 


We now have our new server added and part of the sync group.  As files are created on the old server, they are synced to the Azure file share and then to the second file server.

 

Verify Sync on new Azure File Sync Server

Next you need to verify that the new server has completed a sync.  To do this, open Event Viewer (eventvwr) on the new server.

 

Then navigate to Applications and Services Logs > Microsoft > FileSync Agent > Telemetry.  We want to filter on Event ID 9102, which is logged once a sync session has completed.  For additional information about this event ID, you can visit this page.

 


The next thing to do is to create the shares. This article explains how to export and import share permissions.

 

Remove Server Endpoint

WARNING: Before completing this next step, make sure synchronization, as noted in ‘Verify Sync on new Azure File Sync Server’, has completed and that no client devices are still talking to this server.  Don’t complete these steps until all clients are talking to the new server. There are several ways you can move your client devices:

  • Add the new server to the existing DFS target folder; see my blog here for more information.
  • Shut down the old server and rename the new server to the old server name; be advised the server endpoint name in the portal doesn’t update, as noted here.
  • Create a CNAME record pointing the old server name to the new server; for documentation on creating a CNAME, use this link.

 

Normally, when you remove a server endpoint, you may have only one server in the sync group.  Before you do this, you would make sure to recall any tiered files.  In this scenario we have two servers in the sync group, so we do not need to recall the files.  However, if you are retiring an Azure File Sync server and want to keep the files locally on that file server, recall the files before proceeding, as documented here.

 

Go back to the Azure portal, then to the Storage Sync Service and the sync group. Click on the old server endpoint and click Delete.

 


Type the name of the server in the text box and click Delete.

 


Remove Registered Server

The second-to-last step is to unregister the server from the Storage Sync Service.  To do this, go back to the Azure portal, then to the Storage Sync Service.  Click Registered servers and click the server you want to retire.  Next, click Unregister Server.

 


Type the name of the server in the text box and click Unregister.

 


Uninstall the Azure File Sync agent

The last step is optional; removing the server from the sync group and unregistering it were the important tasks.  If you would like to uninstall the agent, it takes four simple steps:

 

  1. Click Start > Control Panel > Add/Remove Programs (Programs and Features).
  2. Click on Storage Sync Agent, then click Uninstall.
  3. Click Yes on "Are you sure you want to uninstall Storage Sync Agent".
  4. Click Yes to restart now, or No to restart later.

 

Conclusion

In just a few easy steps, we replaced an old Windows Server running Azure File Sync with a new Windows Server. Are you a Premier or Unified customer? If so, and you want to know more about Azure File Sync, ask your CSAM about the Activate Azure with Azure File offering.  Thank you for taking the time to read this blog; I hope it helps you, and see you next time.

 

Launch of unified Azure Certified Device program

This article is contributed. See the original author and article here.

Giving certified IoT devices the ability to stand out from the crowd

 

As we work across the IoT industry, we continue to hear from device builders that you are looking for help connecting with customers who want to find the right device to meet their needs and differentiating your devices by making them solution ready. With over 30 billion active IoT devices in the world and 400 percent growth in devices over the past three years, the industry is moving incredibly fast; the challenge of connecting the right audience with the right product will only become more difficult.

 

To enable you to keep up with this pace, I am pleased to share that a unified and enhanced Azure Certified Device program is now generally available, expanding on previous Microsoft certification offerings.

 

At the heart of this device certification program is a promise—a promise to device builders that they can not only quickly get their devices to market, but also better differentiate and promote their devices. And a promise to buyers that they can easily identify devices that meet their needs and purchase those devices with confidence that they have Microsoft approval. Our promise is to the entire IoT ecosystem: Microsoft is committed to helping a diverse set of partners easily create and find IoT devices built to run on Azure, and we’ll support connecting those devices with the right customers.

 

Visit our Getting Started with the Azure Certified Device program page to learn more.

 

Advantages of certifying IoT devices with Microsoft Azure

At Microsoft, we have been certifying devices for over 20 years, resulting in the creation of an ecosystem of over one billion PCs worldwide, that you, our partners, helped build. Now, we are enhancing how we apply our certification experience to our expertise in the cloud—building an IoT device ecosystem that will be exponentially larger—with tens of millions of devices connected to Azure and tens of thousands of customers utilizing devices built by our rapidly growing IoT device builder community. As we continue to build a thriving IoT ecosystem, we are committed to going even further for IoT builders and buyers through our improved tools and services as well as the following certification commitments:

 

  • Giving customers confidence: Customers can confidently purchase Azure certified devices that carry the Microsoft promise of meeting specific capabilities.

  • Matchmaking customers with the right devices for them: Device builders can set themselves apart with certification that highlights their unique capabilities. And customers can easily find the products that fit their needs based on certification differentiation.

  • Promoting certified devices: Device builders get increased visibility, contact with customers, and usage of the Microsoft Azure Certified Device brand.

 

Three certifications available today, with more coming

This IoT device certification program offers three specific certifications today (with more on the way!). Certifications currently available include Azure Certified Device, IoT Plug and Play, and Edge Managed. 

 

Azure Certified Device

Azure Certified Device certification validates that a device can connect with Azure IoT Hub and securely provision through the Device Provisioning Service (DPS). This certification reflects a device’s functionality and interoperability, which are a necessary baseline for more advanced certifications.

 

IoT Plug and Play

Announced in August, IoT Plug and Play certification validates Digital Twin Definition Language (DTDL) version 2 and interaction based on your device model. It enables a seamless device-to-cloud integration experience and enables hardware partners to build devices that can easily integrate with cloud solutions based on Azure IoT Central as well as third-party solutions. Additionally, Azure IoT platform services and SDKs for IoT Plug and Play will be generally available by the end of this month. View our developer documentation for more information, and join the companies already beginning to prepare and certify their devices for IoT Plug and Play.

 

Edge Managed

Edge Managed certification focuses on device management standards for Azure connected devices. Today, this program certification focuses on Edge runtime compatibility for module deployment and management. Informed by conversations with our partners, our Edge Managed certification will continue to grow in the future with additional customer manageability needs.

 

 

Security and edge AI certifications soon in private preview

In addition to the currently available certifications, we are also working on additional security and edge AI certifications, which will soon be in private preview. These programs reflect our continued engagement with customers and partners to address key customer needs and business opportunities in delivering both secure and high-quality AI perception experiences at the edge. Interested partners can contact the Azure Certified Device team for more information.

 

Accelerate business with the Azure Certified Device Catalog

The Azure Certified Device certification program connects a community of device builders with solution builders and buyers through the Azure Certified Device Catalog. Certified devices are searchable based on which devices meet which capabilities, allowing device builders to differentiate their offerings based on the certification program. By certifying their devices to appear in the Azure Certified Device Catalog, device builders gain access to a worldwide audience looking to reliably purchase devices that are built to run on Azure. Meanwhile, buyers can use the catalog as a one-stop-shop they can trust to find and review a wide array of IoT devices.

 

Next steps for pursuing IoT device certification

If you’re a device builder, now is the right time to start thinking about how IoT device certification can benefit your company—elevating your profile and better positioning your devices to reach a broader market. Begin saving valuable time and make your devices stand out from the crowd by taking part in the Azure Certified Device program.

 

Visit our Getting Started with the Azure Certified Device program page to learn more.

 

TodoMVC Full Stack with Azure Static WebApps, Node and Azure SQL #beginners #node #sql #serverless

This article is contributed. See the original author and article here.

TodoMVC is a very well known (like ~27K GitHub stars known) application among developers, as it is a really great way to start learning a new Model-View-Something framework. It has plenty of samples done with different frameworks, all implementing exactly the same solution. This way it is very easy to compare them against each other and see which one you prefer. Creating a To-Do app is easy enough, but not too easy, to be the perfect playground for learning a new technology.

 


The only issue with the TodoMVC project is that it “only” focuses on front-end solutions. What about having a full-stack implementation of the TodoMVC project, with a back-end API and a database too? Well, it turns out there is an answer for that as well: Todo-Backend. There are more than 100 implementations available! Pretty cool, uh?

If you want to have a test run building a full-stack solution with a new technology stack you want to try, you are pretty much covered.

 

Full Stack with Azure Static Web Apps, Node, Vue and Azure SQL

Lately I was intrigued by the new Azure Static Web Apps, which promises a super-easy Azure deploy experience, integration with Azure Functions and GitHub Actions, and the ability to deploy and manage a full-stack application in just one place, so I really wanted to take the chance to create a 100% serverless TodoMVC full-stack implementation using:

  • Vue.js for the front-end, as I find it really cool and powerful;
  • Azure Static Web Apps, as I can manage the full-stack app from one place and deploy just by doing a git push;
  • Node.js for the back-end, as I’m learning it and I want to keep exercising. Not to mention that it is very common and very scalable;
  • Azure SQL, as I want to have a database ready for anything I may want to throw at it.

I searched in TodoMVC and Todo-Backend but didn’t find this specific stack of technologies…so why not create it myself, I thought? Said and done! Here are some notes I took while building it.

 

Azure Static Web Apps


Still in preview, but I loved it as soon as I saw it. It is just perfect for a full-stack development experience. In one shot you can deploy front-end and back-end, and make sure they are correctly configured to work together (you know, CORS) and correctly secured.

Deployment is as easy as configuring a GitHub Action, which is actually done automatically for you, even though you still have full access to it, so you can customize it if needed (for example to include the database in the CI/CD process).
Azure Static Web Apps will serve static HTML from whatever you specify as the app folder, and will spin up and deploy an Azure Function using Node.js to run the back-end from whatever you specify as the api folder:

 


As you can guess from the configuration, my repo contains the front-end in the client folder and the back-end code in the api folder:

 


Front-End: Vue.js

As I’m still learning Vue too, I kept the code very simple and actually started from the TodoMVC Vue sample you can find on the Vue website: TodoMVC Example.

I like this sample a lot, as it shows the power of Vue.js in a single file. Very easy to understand if you have just started learning it. If you are already an experienced Vue user, you’ll be happy to know that Azure Static Web Apps has native support for Vue, so you can build and deploy a Vue CLI app. I’m honestly not that expert yet, so I really like the super-simple approach that Vue also offers. Plus, I think the super-simple approach is perfect for learning, which makes it just great for this post.

 


Call a REST API

The original TodoMVC sample uses local storage to persist To-Do data. Thanks to the Watchers feature that Vue provides, the JavaScript code you need to write is very simple: any change to a watched list – todos in this case – is automatically persisted locally via the following snippet of code:

 

watch: {
    todos: {
        handler: function(todos) {
            todoStorage.save(todos);
        },
        deep: true
    }
},

 

Of course, to create a real-world full-stack sample, I wanted to send the To-Do list data to a REST API, avoiding the usage of local storage, to enable more interesting scenarios, like collaboration, synchronization on multiple devices and so on.

Instead of relying on a Watcher, which would unfortunately send the entire list to the REST API and not only the changed item, I decided to go a more manual way and call the REST API directly from the declared methods:

 

methods: {
    addTodo: function () {
        var value = this.newTodo && this.newTodo.trim();
        if (!value) {
            return;
        }
        fetch(API + "/", {headers: HEADERS, method: "POST", body: JSON.stringify({title: value})})
        .then(res => {                  
            if (res.ok) {                                               
                this.newTodo = ''
                return res.json();
            }
        }).then(res => {                        
            this.todos.push(res[0]);
        })
    },

 

Connecting the addTodo method to an HTML object is really simple:

 

<header class="header">
    <h1>todos</h1>
    <input class="new-todo" autofocus autocomplete="off" placeholder="What needs to be done?" v-model="newTodo"
        @keyup.enter="addTodo" />
</header>

 

With these changes done, it’s now time to take a look at the back-end.

 

Back-End: Node

Azure Static Web Apps only supports Node.js as a back-end language today. No big deal: Node.js is a great, fast and scalable language that works perfectly with Azure Functions and Azure SQL, so we’re really good here. If you are not familiar with how to run Azure Functions with Node.js and Azure SQL, make sure to read this article: Serverless REST API with Azure Functions, Node, JSON and Azure SQL. As Azure Static Web Apps uses Azure Functions behind the scenes, everything you learned about Azure Functions applies to Azure Static Web Apps back-ends.

The client will send an HTTP request to the back-end REST API, passing the To-Do payload as JSON. For example, to mark a To-Do as done, this JSON

 

{"completed":true}

 

will be sent via a PUT request:

 

https://xyz.azurestaticapps.net/api/todo/29

 

to set the To-Do with id 29 as done. If everything is OK, the REST API will return the entire object, to make sure the client always has the freshest data:

 

[{
    "id":29,
    "title":"Write about Vue",
    "completed":1
}]
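To exercise this contract from the client, a PUT can be issued with the same fetch pattern used in addTodo. Below is a sketch; the buildCompleteRequest helper, the API constant value and the HEADERS content are my own assumptions, not part of the original sample.

```javascript
// Hypothetical constants matching the URL shown above.
const API = "https://xyz.azurestaticapps.net/api/todo";
const HEADERS = { "Content-Type": "application/json" };

// Build the PUT request that marks To-Do `id` as done (hypothetical helper).
function buildCompleteRequest(id) {
    return {
        url: `${API}/${id}`,
        options: {
            headers: HEADERS,
            method: "PUT",
            body: JSON.stringify({ completed: true })
        }
    };
}

// In a Vue method you would then issue it, e.g.:
// const r = buildCompleteRequest(29);
// fetch(r.url, r.options).then(res => res.json()).then(todo => { /* refresh the UI */ });
```

Keeping request construction in a small helper like this also makes the client logic easy to unit test without touching the network.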

 

Thanks to Azure SQL’s support for JSON, the back-end doesn’t have to do a lot: it just turns an HTTP request into a call via the TDS protocol supported by Azure SQL. The JSON is passed through as is, so what the back-end really has to do is make sure that, depending on the HTTP method invoked, the correct Azure SQL operation is executed. For example, a PUT request should call an UPDATE statement. The implementation is very easy:

 

switch(method) {
    case "get":
        payload = req.params.id ? { "id": req.params.id } : null;            
        break;
    case "post":
        payload = req.body;            
        break;
    case "put":
        payload =  { 
            "id": req.params.id,
            "todo": req.body
        };   
        break;
    case "delete":
        payload = { "id": req.params.id };
        break;       
}

 

If you have more complex needs you may decide to implement one function per HTTP method, but in this case that would have been overkill. I really try to follow the KISS principle as much as possible. The simpler the better. But not simpler! (Of course, if this were production code I would check and make sure the JSON is actually valid and harmless before passing it to Azure SQL. Never trust user-provided input, you never know!)
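A minimal version of that defensive check could look like the sketch below: parse the incoming JSON and accept only the keys the To-Do API knows about. The sanitizeTodoPayload name and the allowed-key list are my own illustration, not part of the original sample.

```javascript
// Sketch of a defensive check: reject payloads that are not valid JSON
// or that contain keys the To-Do API does not know about (hypothetical helper).
const ALLOWED_KEYS = new Set(["id", "title", "completed"]);

function sanitizeTodoPayload(rawJson) {
    let parsed;
    try {
        parsed = JSON.parse(rawJson);
    } catch (e) {
        return null; // not valid JSON: refuse to pass it on to Azure SQL
    }
    // Accept either a single object or an array of objects.
    const items = Array.isArray(parsed) ? parsed : [parsed];
    const ok = items.every(item =>
        item !== null &&
        typeof item === "object" &&
        Object.keys(item).every(k => ALLOWED_KEYS.has(k))
    );
    return ok ? parsed : null;
}
```

Returning null lets the function short-circuit with a 400 Bad Request instead of forwarding suspicious input to the database.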

 

Database: Azure SQL

Azure SQL has been created with just one simple table:

 

create table dbo.todos
(
  id int not null primary key 
    default (next value for [global_sequence]),
  todo nvarchar(100) not null,
  completed tinyint not null 
    default (0)
)

 

As a developer I still prefer to use JSON in the back-end and to send data back and forth to Azure SQL as JSON, so that I can minimize the round trips and thus improve performance. All the stored procedures I’m using have this very simple signature:

 

create or alter procedure [web].[get_todo]
@payload nvarchar(max)

 

Then inside the stored procedure I can use OPENJSON or any of the JSON functions to manipulate the JSON. This way it becomes really easy to accept “n” To-Dos as the input payload. For example, let’s say I want to delete three To-Dos at once. I can pass something like

 

[{"id":1}, {"id":2}, {"id":8}]

 

and then just by writing this

 

delete t from dbo.todos t 
where exists (
   select p.id 
   from openjson(@payload) with (id int) as p where p.id = t.id
)

 

I can operate on all the selected To-Dos at once. Super cool, and super fast! The ability of Azure SQL to operate with both relational and non-relational features is really a killer feature!
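On the client side, producing that multi-delete payload is a one-liner; the buildDeletePayload helper below is just my own illustration of the shape the stored procedure expects.

```javascript
// Turn a list of To-Do ids into the JSON payload shape shown above,
// e.g. [1, 2, 8] -> [{"id":1},{"id":2},{"id":8}]
function buildDeletePayload(ids) {
    return JSON.stringify(ids.map(id => ({ id })));
}
```

That string can then be sent as the body of a DELETE request and passed straight through to @payload on the Azure SQL side.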

 

Why Azure SQL and not a NoSQL database?

Answering that question could take a book, so let me try to summarize. A NoSQL database is more than enough for a To-Do list app. But I always try to think about future improvements, and I want to make sure that anything I’d like to do in the future will be reasonably well supported by my database. I might need to have geospatial data, or to aggregate data for some analytics; I may want to use graphs, or I may need to build a concurrent system that allows more than one person to work on the same to-do list, which requires a structure without locks. All these things are available inside Azure SQL without requiring me to use anything other than a technology I already know. This means that I’ll be super productive. I won’t even have scalability issues, as with Azure SQL I can go up to 100 TB.

 

A To-Do list has a pretty well-defined schema, and the performance I can get out of a properly designed relational database is exceptional and covers a huge spectrum of use cases. With a NoSQL database I might squeeze out a bit more performance when I focus on a very specific use case, but at the expense of all the others. I really want to keep the door open to any improvement, so for this time, for my use case and future needs, I think Azure SQL is the best option I have here.

 

Keep in mind that a well-defined schema doesn’t mean carved in stone. I can have all the flexibility I may want, as I can easily store a To-Do as JSON (or just a part of it) in Azure SQL, mixing relational and non-relational features and allowing end users to add custom fields and properties if they want to. Actually, you know what? That looks like a great idea for a post. I’ll definitely write one on this topic, so stay tuned!

 

Conclusion

Creating and deploying a full-stack solution is really easy now, thanks to Azure Static Web Apps. Completely serverless, you can just focus on coding and design while enjoying the simplicity – along with the scalability and flexibility – that serverless solutions offer. Azure SQL will guarantee that your solution is future-proof, providing scalability up to 100 TB with all the perks of a modern post-relational database: multi-model support, built-in security, columnstore, lock-free tables and anything you may need in your wildest dreams.

 

As usual enjoy the full source code here: https://github.com/Azure-Samples/azure-sql-db-todo-mvc

TodoMVC Full Stack with Azure Static WebApps, Node and Azure SQL #beginners #node #sql #serverless

TodoMVC Full Stack with Azure Static WebApps, Node and Azure SQL #beginners #node #sql #serverless

This article is contributed. See the original author and article here.

TodoMVC is a very well known (like ~27K GitHub stars known) application among developers as it is a really great way to start to learn a new Model-View-Something framework. It has plenty of samples done with different frameworks, all implementing exactly the same solution. This way is very easy to compare them against each other and see what is the one you prefer. Creating a To-Do App is easy enough, but not too easy, to be the perfect playground to learn a new technology.

 

pt1vlgtqp6mfjvx5um4v.jpg

 

The only issue with TodoMVC project is that it “only” focus on front-end solutions. What about having a full-stack implementation of the TodoMVC project with also back-end API and a database? Well it turns out that there is also an answer for that: Todo-Backend. There are more than 100 implementations available! Pretty cool, uh?

If you want to have a test run building a full-stack solution using a new technology stack you want to try, you are pretty much covered.

 

Full Stack with Azure Static Web Apps, Node, Vue and Azure SQL

Lately I was intrigued by the new Azure Static Web Apps that promises an super-easy Azure deploy experience, integration with Azure Function and GitHub Actions, and ability to deploy and manage a full-stack application in just one place, so I really wanted to try to take the chance to create a 100% serverless TodoMVC full stack implementation using:

  • Vue.Js for the frontend as I find it really really cool and powerful;
  • Azure Static Web Apps as I can manage the full-stack app just from one place and deploy just by doing a git push;
  • Node.js for the backend, as I’m learning it and I want to keep exercising. Not to mention that is very common and very scalable;
  • Azure SQL as I want to have a database ready for anything I may want to throw at it;

I searched in the TodoMVC and TodoBackend but didn’t find this specific stack of technologies…so why not creating it myself, I thought? Said and done! Here’s some notes I took while building this.

 

Azure Static Web Apps

sadjelj98npdafzg7efo.png

Still in Preview but I loved it as soon as I saw it. Is just perfect for a full-stack development experience. In one shot you can deploy front-end and back-end, make sure they are correctly configured to work together (you know, CORS) and correctly secured.

Deployment is as easy as configuring a GitHub Action, that is actually automatically done for you, even if you still have full access to it, so you can customize it if needed (for example to include the database in the CI/CD process).
Azure Static Web Apps will serve a static HTML whatever you specify as the app and will spin up and deploy an Azure Function using Node.js to run the back-end using anything you instead specify as the api:

 

abcl1d3uzmwmp1z7nnai.png

 

As you can guess from the configuration, my repo contains the front-end in the client folder and the back-end code in the api folder:

 

7xg7nyf52h5476gkqolg.png

 

Front-End: Vue.js

As I’m still learning also Vue I kept the code very simple and actually started from the TodoMVC Vue sample you can find on the Vue website: TodoMVC Example.

I like this sample a lot as it shows the power of Vue.js using a single file. Very easy to understand if you have just started learning it. If you are already an experienced Vue user, you’ll be happy to know the Azure Static Web Apps has a native support for Vue, so that you can build and deploy Vue CLI. I’m honestly not that expert yet so I really like the super-simple approach that Vue also offers. Plus I also think that the super-simple approach is perfect for learning, which make it just great for this post.

 

7lxx3xebiautjipmegnb.jpg

 

Call a REST API

The original TodoMVC sample uses a local storage to persist To-Do data. Thanks to the Watchers feature that Vue provides, the code JavaScript code you need to write is very simple as any changes to a watched list – todo in this case – is automatically persisted locally via the following snipped of code:

 

watch: {
    todos: {
        handler: function(todos) {
            todoStorage.save(todos);
        },
        deep: true
    }
},

 

Of course, to create a real-world full-stack sample, I wanted to send the To-Do list data to a REST API, avoiding the usage of local storage, to enable more interesting scenarios, like collaboration, synchronization on multiple devices and so on.

Instead of relying on a Watcher, which would unfortunately send the entire list to the REST API and not only the changed item, I decided to go for a more manual way and just call the REST API just binding them directly to the declared methods:

 

methods: {
    addTodo: function () {
        var value = this.newTodo && this.newTodo.trim();
        if (!value) {
            return;
        }
        fetch(API + "/", {headers: HEADERS, method: "POST", body: JSON.stringify({title: value})})
        .then(res => {                  
            if (res.ok) {                                               
                this.newTodo = ''
                return res.json();
            }
        }).then(res => {                        
            this.todos.push(res[0]);
        })
    },

 

Connecting the addTodo method to an HTML object is really simple:

 

<header class="header">
    <h1>todos</h1>
    <input class="new-todo" autofocus autocomplete="off" placeholder="What needs to be done?" v-model="newTodo"
        @keyup.enter="addTodo" />
</header>

 

With these changes done, it’s now time to take a look at the back-end.

 

Back-End: Node

Azure Static Web Apps only support Node.js as a backend language today. No big deal, Node.js is a great, fast and scalable language that works perfectly with Azure Function and Azure SQL so we’re really good here. If you are not familiar on how to run Azure Function with Node.js and Azure SQL make sure to read this article: Serverless REST API with Azure Functions, Node, JSON and Azure SQL. As Azure Static Web Apps uses Azure Functions behind the scenes, everything you learned for Azure Function will be applicable to Azure Static Web Apps back-ends.

The client will send an HTTP request to the back-end REST API, passing the To-Do payload as JSON. For example, to mark a To-Do as done, this JSON

 

{"completed":true}

 

will be sent via a PUT request:

 

https://xyz.azurestaticapps.net/api/todo/29

 

to set the To-Do with Id 29 as done. If everything is OK, the REST API will return the entire object, to make sure the client always has the freshest data:

 

[{
    "id":29,
    "title":"Write about Vue",
    "completed":1
}]
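On the client side, that one-element array can be merged straight back into the local list. This `mergeTodo` helper is a hypothetical sketch, not part of the original sample:

```javascript
// Hypothetical client-side helper: the API returns the updated To-Do
// wrapped in a one-element array, so merge it back into the local list
// by matching on id (other items are returned untouched).
function mergeTodo(todos, response) {
    const updated = response[0];
    return todos.map(t => (t.id === updated.id ? { ...t, ...updated } : t));
}
```

A Vue method handling the PUT response could then do `this.todos = mergeTodo(this.todos, res)` and the view would re-render with the freshest server state.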

 

Thanks to Azure SQL’s support for JSON, the back-end doesn’t have to do much: it just turns an HTTP request into a call over the TDS protocol that Azure SQL speaks. The JSON is passed through as-is, so what the back-end really has to do is make sure that each HTTP method triggers the correct Azure SQL operation. For example, a PUT request should execute an UPDATE statement. The implementation is very easy:

 

switch(method) {
    case "get":
        payload = req.params.id ? { "id": req.params.id } : null;            
        break;
    case "post":
        payload = req.body;            
        break;
    case "put":
        payload =  { 
            "id": req.params.id,
            "todo": req.body
        };   
        break;
    case "delete":
        payload = { "id": req.params.id };
        break;       
}
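The same logic, wrapped in a small pure function, becomes easy to unit-test. `buildPayload` is an illustrative name (not from the original sample), and the `req` shape (`{ params, body }`) mirrors the Azure Functions request object:

```javascript
// Illustrative, testable wrapper around the payload-building switch above.
// req mimics the Azure Functions request object: { params, body }.
function buildPayload(method, req) {
    switch (method) {
        case "get":
            // With no id, return null so the procedure lists everything.
            return req.params.id ? { id: req.params.id } : null;
        case "post":
            return req.body;
        case "put":
            return { id: req.params.id, todo: req.body };
        case "delete":
            return { id: req.params.id };
        default:
            return null;
    }
}
```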

 

If you have more complex needs you may decide to implement one function per HTTP method, but in this case that would have been overkill. I really try to follow the KISS principle as much as possible: the simpler, the better. But not simplistic! (Of course, if this were production code I would check that the JSON is actually valid and harmless before passing it to Azure SQL. Never trust user-provided input, you never know!)

 

Database: Azure SQL

Azure SQL has been created with just one simple table:

 

create table dbo.todos
(
  id int not null primary key 
    default (next value for [global_sequence]),
  todo nvarchar(100) not null,
  completed tinyint not null 
    default (0)
)

 

As a developer I still prefer to use JSON in the back-end and to send it back and forth to Azure SQL, so that I can also minimize roundtrips and thus improve performance. All the stored procedures I’m using therefore share this very simple signature:

 

create or alter procedure [web].[get_todo]
@payload nvarchar(max)

 

Then, inside the stored procedure, I can use OPENJSON or any of the other JSON functions to manipulate the payload. This makes it really easy to accept “n” To-Dos as the input payload. For example, let’s say I want to delete three To-Dos at once. I can pass something like

 

[{"id":1}, {"id":2}, {"id":8}]

 

and then just by writing this

 

delete t from dbo.todos t 
where exists (
   select p.id 
   from openjson(@payload) with (id int) as p where p.id = t.id
)

 

I can operate on all the selected To-Dos at once. Super cool, and super fast! The ability of Azure SQL to operate with both relational and non-relational features is really a killer feature!
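On the Node side, producing that bulk payload is a one-liner. This helper is illustrative, not part of the original sample:

```javascript
// Build the JSON payload accepted by the bulk-delete procedure above:
// an array of { "id": n } objects, serialized as a single string that
// can be passed straight into the procedure's @payload parameter.
function bulkDeletePayload(ids) {
    return JSON.stringify(ids.map(id => ({ id })));
}
```

Passing `bulkDeletePayload([1, 2, 8])` as `@payload` deletes all three To-Dos in a single roundtrip.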

 

Why Azure SQL and not a NoSQL database?

Answering that question could take a book, so let me try to summarize. A NoSQL database is more than enough for a To-Do list app. But I always try to think about future improvements, and I want to make sure that anything I’d like to do in the future will be reasonably well supported by my database. I might need geospatial data, I might want to aggregate data for analytics, I might want to use a graph, or I might need a concurrent, lock-free structure so that more than one person can work on the same to-do list. All of these are available inside Azure SQL without requiring anything other than a technology I already know. This means that I’ll be super productive. I won’t even have scalability issues, as Azure SQL can go up to 100 TB.

 

A To-Do list has a pretty well-defined schema, and the performance I can get out of a properly designed relational database is exceptional across a huge spectrum of use cases. With a NoSQL database I might squeeze out a bit more performance for one very specific use case, but at the expense of all the others. I really want to keep the door open to any improvement, so for this time, for my use case and future needs, I think Azure SQL is the best option I have here.

 

Keep in mind that a well-defined schema doesn’t mean one carved in stone. I can have all the flexibility I may want, as I can easily store a To-Do (or just part of it) as JSON in Azure SQL, mixing relational and non-relational features and allowing end users to add custom fields and properties if they want to. Actually, you know what? That looks like a great idea for a post. I’ll definitely write one on this topic, so stay tuned!

 

Conclusion

Creating and deploying a full-stack solution is really easy now, thanks to Azure Static Web Apps. Completely serverless, you can just focus on coding and design while enjoying the simplicity – along with the scalability and flexibility – that serverless solutions offer. Azure SQL will guarantee that your solution is future-proof, scaling up and out to 100 TB with all the perks of a modern post-relational database: multi-model support, built-in security, columnstore, lock-free tables and anything else you may need in your wildest dreams.

 

As usual enjoy the full source code here: https://github.com/Azure-Samples/azure-sql-db-todo-mvc

AKS on Azure Stack HCI – now in Public Preview


This article is contributed. See the original author and article here.

Hi Everyone,

This week we have announced the availability of the initial public preview of Azure Kubernetes Service (AKS) on Azure Stack HCI.

 

You can evaluate AKS on Azure Stack HCI by registering for the Public Preview here: https://aka.ms/AKS-HCI-Evaluate 

 

Azure Kubernetes Service on Azure Stack HCI takes our popular Azure Kubernetes Service (AKS) and makes it available for customers to run on-premises, delivering Azure consistency, a familiar Azure experience, ease of use and strong security for their containerized applications. AKS on Azure Stack HCI enables developers and administrators to deploy and manage containerized apps on Azure Stack HCI. You can use it to develop applications on AKS and deploy them unchanged on-premises, run Arc enabled Data Services on a resilient platform, and modernize Windows Server and Linux applications.

 

With AKS on Azure Stack HCI, Microsoft is delivering an industry-leading experience for modern application development and deployment in the hybrid cloud era. Microsoft is the only company that delivers technology taking you from bare metal to a public-cloud-connected, consistent application and data platform in your datacenter.


 

AKS on Azure Stack HCI can run Windows and Linux containers, all managed and supported by Microsoft. AKS on Azure Stack HCI leverages our experience with AKS, follows the AKS design patterns and best-practices, and uses code directly from AKS. This means that you can use AKS on Azure Stack HCI to develop applications on AKS and deploy them unchanged on-premises. It also means that any skills that you learn with AKS on Azure Stack HCI are transferable to AKS as well.

 

AKS on Azure Stack HCI uses Windows Admin Center and PowerShell to provide an easy to use and familiar deployment experience for any user of Azure Stack HCI. AKS on Azure Stack HCI simplifies the process of setting up Kubernetes on Azure Stack HCI and includes the necessary components to allow you to deploy multiple Kubernetes clusters in your environment.


 

Which all means that you can focus on what matters most to you – your applications.

AKS on Azure Stack HCI is designed such that every layer is secure. Microsoft provides a secure baseline of all components in AKS on Azure Stack HCI and keeps them up to date. We will be adding more security features and further hardening the platform over the course of the public preview.

AKS on Azure Stack HCI fully supports both Linux-based and Windows-based containers. When you create a Kubernetes cluster on Azure Stack HCI, you can choose whether to create node pools (groups of identical virtual machines, like on AKS) to run Linux containers, Windows containers, or both. AKS on Azure Stack HCI creates and maintains these virtual machines so that you don’t have to manage operating systems directly.

 

If you have existing .NET applications that you want to modernize and bring to the latest cloud development patterns, AKS on Azure Stack HCI is the platform for you. It provides an industry-leading experience for Windows containers on Kubernetes. We are also working on great tooling and documentation for moving .NET applications from virtual machines to containers with AKS on Azure Stack HCI.

 

If you are building new cloud native applications on AKS, AKS on Azure Stack HCI provides the easiest way to take those applications and run them in your datacenter. It shares a common code base with AKS, the user experience is consistent across both products, and Microsoft is investing to ensure that applications can move easily between the two environments.

 

If you want to use new Microsoft technologies like Arc enabled Data Services in your datacenter, AKS on Azure Stack HCI delivers a complete solution from Microsoft, validated and supported by Microsoft and designed to deliver the best experience for these applications.

 

You can learn more about AKS on Azure Stack HCI by watching the videos in the original article.

 

Working on this project has been a lot of fun for everyone involved, and we are excited to finally be able to share this with the world. I look forward to seeing what everyone is able to achieve with AKS on Azure Stack HCI!

 

Cheers,

Ben Armstrong