Integrating Appwrite in Azure


Introduction

Technology makes our lives easier and helps us solve complex problems. Every day we look for ways to automate recurring tasks and integrate multiple services to build solutions that help us develop faster.

In this article we will be introducing you to two services:

  • Azure - that makes handling infrastructure easy
  • Appwrite - that makes backend development easy

We will show you how to create a virtual machine in Azure, followed by installing Appwrite and building a demo with it.

First, let’s get you introduced to Appwrite and Azure.


What is Appwrite?

Appwrite is a self-hosted backend-as-a-service platform that provides developers with all the core APIs required to build any application. Appwrite provides you with a set of APIs, tools, and a management console UI to help you build your apps a lot faster and in a much more secure way.

Appwrite is both cross-platform and technology agnostic, meaning it can run on any operating system, coding language, framework, or platform. Although Appwrite can easily fit the definition of a serverless technology, it's designed to run well in multiple configurations. You can integrate Appwrite directly with your client app, use it behind your custom backend or alongside your custom backend server.


What is Azure?

The Azure cloud platform has more than 200 products and cloud services designed to help you bring new solutions to life—to solve today’s challenges and create the future. Build, run, and manage applications across multiple clouds, on-premises, and at the edge, with the tools and frameworks of your choice. It is a huge collection of servers and networking hardware, which runs a complex set of distributed applications. These applications orchestrate the configuration and operation of virtualized hardware and software on those servers. The orchestration of these servers is what makes Azure so powerful.

For this tutorial, we will be using Azure Virtual Machine. A virtual machine, commonly shortened to just VM, is no different than any other physical computer like a laptop, smart phone, or server. It has a CPU, memory, disks to store your files, and can connect to the internet if needed. While the parts that make up your computer (called hardware) are physical and tangible, VMs are often thought of as virtual computers or software-defined computers within physical servers, existing only as code. Azure Virtual Machines is the Azure infrastructure as a service (IaaS) used to deploy persistent VMs with nearly any VM server workload that you want.


Prerequisites

To follow along with this tutorial, you will need:

  1. A free Azure account
  2. Docker installed
  3. Appwrite installed

Setting up Azure

Creating an Azure Account

  1. Register for the free account from here


  2. Click on Start free and sign up using your Microsoft account or GitHub account.

  3. Once the signup process is complete, you will have an Azure subscription ready for you to use.


Creating an Azure Virtual Machine

Now that you have an Azure account, the next step is to create a Virtual Machine:

  1. As soon as you sign in, the portal will open and you will see a list of services.


  2. Click on ‘Create a resource’ and then choose ‘Virtual Machines’.


  3. Click on Create under Virtual Machines and then add the required details to create a VM of your choice.


  4. To know more about the process, you can follow the quickstart tutorial.
  5. Make sure to select SSH public key under Administrator account.
  6. After you have filled in all the details, run the validation check.
  7. When the Generate new key pair window opens, select Download private key and create resource. Your key file will be downloaded as myKey.pem. Make sure you know where the .pem file was downloaded; you will need the path to it in the next step.
  8. After deployment is complete, select Go to resource.
  9. On the page for your new VM, select the public IP address and copy it to your clipboard.


Note - The minimum requirements to run Appwrite are 1 CPU core, 2 GB of RAM, and an operating system that supports Docker.
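
If you prefer the command line to the portal, a VM that meets these requirements can also be created with the Azure CLI. The snippet below is only a minimal sketch with assumed placeholder names: the resource group appwrite-rg, the VM name appwrite-vm, the region, the image alias, and the Standard_B2s size (2 vCPUs, 4 GB RAM) are all choices you can adjust.

# Create a resource group and a small Ubuntu VM with SSH key authentication
az group create --name appwrite-rg --location eastus

az vm create \
    --resource-group appwrite-rg \
    --name appwrite-vm \
    --image Ubuntu2204 \
    --size Standard_B2s \
    --admin-username azureuser \
    --generate-ssh-keys

# Print the VM's public IP address for the SSH step below
az vm show --resource-group appwrite-rg --name appwrite-vm --show-details --query publicIps --output tsv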


Connect to virtual machine

Create an SSH connection with the VM.

  1. If you are on a Mac or Linux machine, open a Bash prompt and set read-only permission on the .pem file using chmod 400 ~/Downloads/myKey.pem. If you are on a Windows machine, open a PowerShell prompt.
  2. At your prompt, open an SSH connection to your virtual machine. Replace the IP address with the one from your VM, and replace the path to the .pem with the path to where the key file was downloaded.

ssh -i ~/Downloads/myKey.pem azureuser@10.111.12.123


Creating an Azure CosmosDB Instance

  1. From the Azure portal menu or the Home page, select Create a resource.
  2. On the New page, search for and select Azure Cosmos DB.
  3. On the Select API option page, select the Create option within the NoSQL section.
  4. In the Create Azure Cosmos DB Account page, enter the basic settings for the new Azure Cosmos DB account.
  5. In the Global Distribution tab, you can leave the default values.
  6. Optionally you can configure more details in the following tabs:
    • Networking
    • Backup Policy
    • Encryption
    • Tags
  7. Select Review+Create. Review the account settings, and then select Create. It takes a few minutes to create the account. Wait for the portal page to display Your deployment is complete.
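
Alternatively, the Cosmos DB account can be created from the Azure CLI. This is a minimal sketch with assumed placeholder names; the account name must be globally unique, and the resource group and region are the same placeholders used earlier:

# Create a Cosmos DB account using the NoSQL (core SQL) API
az cosmosdb create \
    --name appwrite-cosmos-demo \
    --resource-group appwrite-rg \
    --default-consistency-level Session \
    --locations regionName=eastus failoverPriority=0 isZoneRedundant=False

# Retrieve the endpoint and primary key; we will need both later as function variables
az cosmosdb show --name appwrite-cosmos-demo --resource-group appwrite-rg --query documentEndpoint --output tsv
az cosmosdb keys list --name appwrite-cosmos-demo --resource-group appwrite-rg --query primaryMasterKey --output tsv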

Setting up Appwrite

Prerequisites
Before we get started, there are a couple of things you'll need. If you have them set up already, you can skip to the following section.

  • An Appwrite instance: If you haven't set up an Appwrite instance yet, you can follow the getting started guides to get it up and running quickly. Feel free to choose between the one-click installation on DigitalOcean or a manual installation with Docker.

TL;DR: It just takes a single command to install Appwrite.

docker run -it --rm \
    --volume /var/run/docker.sock:/var/run/docker.sock \
    --volume "$(pwd)"/appwrite:/usr/src/code/appwrite:rw \
    --entrypoint="install" \
    appwrite/appwrite:1.2.0
 

Once your server is up and running, head over to the Appwrite Dashboard on your server’s public IP address (or localhost if you installed locally) and create a new admin user account.

In order to enable the .NET 6.0 runtime for Appwrite Cloud Functions, you need to update the .env file in the Appwrite installation folder. Open the file and add dotnet-6.0 to the comma-separated list in the _APP_FUNCTIONS_RUNTIMES environment variable. This will make the .NET runtime available in Appwrite Functions. You can then load the updated configuration using the docker-compose up -d command.
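
For reference, after the edit the relevant line in .env should end with dotnet-6.0. The other runtimes shown below are only an example; your existing list will likely differ, and the important part is simply that dotnet-6.0 is appended:

_APP_FUNCTIONS_RUNTIMES=node-18.0,php-8.1,python-3.10,ruby-3.1,dotnet-6.0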


  • The Appwrite CLI: We'll use the Appwrite CLI during this exercise as it makes the process super simple. If you have Node.js installed, installing it is a single command:
npm install -g appwrite-cli
 

If npm is not your thing, we have numerous installation options you can find in the getting started guide for the CLI.


Initializing Appwrite Function

In order to create an Appwrite Cloud Function, we must first log in to the Appwrite CLI with the appwrite login command, using the credentials we created when setting up the Appwrite instance.

appwrite login
? Enter your email test@test.com
? Enter your password ********
? Enter the endpoint of your Appwrite server http://localhost/v1
✓ Success
 

Next up, we need to create a new Appwrite project (or link an existing one) to work with. This can be achieved via the appwrite init project command.

appwrite init project
? How would you like to start? Create a new Appwrite project
? What would you like to name your project? Project X
? What ID would you like to have for your project? unique()
✓ Success
 

You can give your project any name you'd like. As soon as you create or link a project, you should notice a new appwrite.json file in the current directory which stores all the information about your project.
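
At this stage the file is still fairly minimal. It should look roughly like the snippet below; the project ID is generated for you, and further sections (such as functions) are added as you create resources through the CLI:

{
    "projectId": "<your-project-id>",
    "projectName": "Project X"
}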

Now that our CLI is all set up with our Appwrite project, let's initialize our function using the appwrite init function command.

appwrite init function
? What would you like to name your function? CosmosDBCRUD
? What ID would you like to have for your function? unique()
? What runtime would you like to use? .NET (dotnet-6.0)
✓ Success
 

Give your function a name (I've chosen CosmosDBCRUD for relevance) and choose the .NET 6.0 runtime. This will create a new Appwrite Function in your project and set up all the boilerplate code necessary. The function's files are available in the ./functions/CosmosDBCRUD directory, which is where we'll be working.

If you don't feel comfortable creating Appwrite Cloud Functions with .NET, please make sure to check out our blog before moving on to the next step.


Creating the Azure CosmosDB CRUD Function

After initializing the function, visit the ./functions/CosmosDBCRUD directory. Our file structure here looks as follows.

CosmosDBCRUD
├── Function.csproj
└── src
    └── Index.cs
 

Enter src/Index.cs and replace the boilerplate with the following code:

using System;
using System.Threading.Tasks;
using System.Collections.Generic;
using Microsoft.Azure.Cosmos;
using Newtonsoft.Json;

public async Task<RuntimeResponse> Main(RuntimeRequest req, RuntimeResponse res)
{
  // CosmosDB Client Authentication Details

  var cosmosDBEndpoint = req.Variables["COSMOSDB_ENDPOINT"];
  var cosmosDBKey = req.Variables["COSMOSDB_KEY"];

  if(String.IsNullOrEmpty(cosmosDBEndpoint) || String.IsNullOrEmpty(cosmosDBKey))
  {
    return res.Json(
      data: new()
      {
        { "response", "CosmosDB credentials are missing" },
        { "data", null }
      }, 
      statusCode: 400
    );
  }

  // Database Details

  var databaseId = "ProductsDB";
  var containerId = "Products";
  var partitionKeyPath = "/productCategory";

  CosmosClient client = new CosmosClient(
    accountEndpoint: cosmosDBEndpoint, 
    authKeyOrResourceToken: cosmosDBKey
  );

  // Check If Database and Container Exist

  Container container;

  try
  {
    DatabaseResponse databaseResponse = await client.CreateDatabaseIfNotExistsAsync(databaseId);
    Database database = databaseResponse.Database;
    ContainerResponse containerResponse = await database.CreateContainerIfNotExistsAsync(
      id: containerId,
      partitionKeyPath: partitionKeyPath,
      throughput: 400
    );
    container = containerResponse.Container;
  }
  catch(Exception ex)
  {
    Console.WriteLine(ex.StackTrace);
    return res.Json(
      data: new()
      {
        { "response", ex.Message },
        { "data", "Check logs for stack trace" }
      }, 
      statusCode: 400
    );
  }

  // Deserialize Payload
  FunctionRequest functionRequest;
  try
  {
    functionRequest = JsonConvert.DeserializeObject<FunctionRequest>(req.Payload);
  }
  catch(Exception ex)
  {
    Console.WriteLine(ex.StackTrace);
    return res.Json(
      data: new()
      {
        { "response", ex.Message },
        { "data", "Check logs for stack trace" }
      }, 
      statusCode: 400
    );
  }
  var function = functionRequest.Function;
  var product = functionRequest.Product;

  // CosmosDB Functions

  try
  {
    if(function.Equals("create"))
    {
      if(String.IsNullOrEmpty(product.Id))
      {
        product.Id = Guid.NewGuid().ToString();
      }
      var createdProduct = await CreateProduct(container, product);
      return res.Json(new()
      {
        { "response", "Created Product" },
        { "data", createdProduct }
      });
    }

    else if(function.Equals("read"))
    {
      var readProduct = await ReadProduct(container, product);
      return res.Json(new()
      {
        { "response", "Read Product" },
        { "data", readProduct }
      });
    }

    else if(function.Equals("readall"))
    {
      var readProducts = await ReadAllProducts(container);
      return res.Json(new()
      {
        { "response", "Read All Products" },
        { "data", readProducts }
      });
    }

    else if(function.Equals("update"))
    {
      var updatedProduct = await UpdateProduct(container, product);
      return res.Json(new()
      {
        { "response", "Updated Product" },
        { "data", updatedProduct }
      });
    }

    else if(function.Equals("delete"))
    {
      var deletedProduct = await DeleteProduct(container, product);
      return res.Json(new()
      {
        { "response", "Deleted Product" },
        { "data", deletedProduct }
      });
    }

    // Closing return statement

    else
    {
      return res.Json(
        data: new()
        {
          { "response", "Function value is invalid" },
          { "data", null }
        }, 
        statusCode: 400
      );
    }
  }
  catch(Exception ex)
  {
    Console.WriteLine(ex.StackTrace);
    return res.Json(
      data: new()
      {
        { "response", ex.Message },
        { "data", "Check logs for stack trace" }
      }, 
      statusCode: 500
    );
  }
}

public async Task<Product> CreateProduct(Container container, Product product)
{
  var createdProduct = await container.CreateItemAsync<Product>(
    item: product,
    partitionKey: new PartitionKey(product.ProductCategory)
  );

  return createdProduct.Resource;
}

public async Task<Product> ReadProduct(Container container, Product product)
{
  var readProduct = await container.ReadItemAsync<Product>(
    id: product.Id,
    partitionKey: new PartitionKey(product.ProductCategory)
  );

  return readProduct.Resource;
}

public async Task<List<Product>> ReadAllProducts(Container container)
{
  List<Product> products = new List<Product>();

  FeedIterator<Product> feed = container.GetItemQueryIterator<Product>(
    queryText: "SELECT * FROM Products"
  );

  while (feed.HasMoreResults)
  {
    FeedResponse<Product> response = await feed.ReadNextAsync();

    foreach (Product product in response)
    {
      products.Add(product);
    }
  }

  return products;
}

public async Task<Product> UpdateProduct(Container container, Product product)
{
  var updatedProduct = await container.UpsertItemAsync<Product>(
    item: product,
    partitionKey: new PartitionKey(product.ProductCategory)
  );

  return updatedProduct.Resource;
}

public async Task<Product> DeleteProduct(Container container, Product product)
{
  product = await ReadProduct(container, product);

  await container.DeleteItemAsync<Product>(
    id: product.Id,
    partitionKey: new PartitionKey(product.ProductCategory)
  );

  return product;
}

// Models (can also be kept in a separate class file)

public class FunctionRequest
{
    [JsonProperty(PropertyName = "function")]
    public string Function { get; set; }

    [JsonProperty(PropertyName = "product")]
    public Product Product { get; set; }
}

public class Product
{
    [JsonProperty(PropertyName = "id")]
    public string Id { get; set; }

    [JsonProperty(PropertyName = "productCategory")]
    public string ProductCategory { get; set; }

    [JsonProperty(PropertyName = "productName")]
    public string ProductName { get; set; }
}
 

Let’s go over the code we have here. From the Functions documentation, we see that the payload and function variables are available through the request object. Here, we retrieve the Azure CosmosDB endpoint and key from our function variables and use the Microsoft.Azure.Cosmos library to authenticate our client and communicate with the database. We then take the payload, which we receive as a JSON string, and convert it into an object of the FunctionRequest class using the Newtonsoft.Json library. Based on the function value we pass in, we trigger the corresponding functionality in the CosmosDB SDK, as listed below:

  • create: productCategory and productName are mandatory; id is nullable
  • read: id and productCategory are mandatory; productName is nullable
  • readall: no product fields are required
  • update: id, productCategory, and productName are all mandatory
  • delete: id and productCategory are mandatory; productName is nullable

To make sure that Appwrite installs the Newtonsoft.Json and Microsoft.Azure.Cosmos packages for our function, we must include them in the Function.csproj file.

<ItemGroup>
    <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.31.2" />
    <PackageReference Include="Newtonsoft.Json" Version="13.0.1" />
</ItemGroup>
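
If you prefer the command line, the same package references can be added with the dotnet CLI, run from the ./functions/CosmosDBCRUD directory (the versions match the ones above):

dotnet add Function.csproj package Microsoft.Azure.Cosmos --version 3.31.2
dotnet add Function.csproj package Newtonsoft.Json --version 13.0.1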
 

Now that our function is ready, we can deploy it to our Appwrite instance using the appwrite deploy function command.

appwrite deploy function
? Which functions would you like to deploy? CosmosDBCRUD (639c3fa02529a234c305)
ℹ Info Deploying function CosmosDBCRUD ( 639c3fa02529a234c305 )
ℹ Info Ignoring files using configuration from appwrite.json
✓ Success Deployed CosmosDBCRUD ( 639c3fa02529a234c305 )
 

The function should now be visible within your Appwrite instance.


One last thing we must do before we can test the function is add the necessary function variables to the Function's Settings page.

  • COSMOSDB_ENDPOINT: Your Azure CosmosDB endpoint URI
  • COSMOSDB_KEY: Your Azure CosmosDB primary key


Testing The CosmosDB Cloud Function

Once the function is deployed, we can head over to the Appwrite console and execute the function with the payload in the following format.

{
    "function": "string",
    "product": {
        "id": "string",
        "productCategory": "string",
        "productName": "string"
    }
}
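
For example, a payload that creates a new product could look like the one below. The values are only illustrative; as noted earlier, id can be omitted for create, in which case the function generates a GUID for it:

{
    "function": "create",
    "product": {
        "productCategory": "Laptop",
        "productName": "Dell Inspiron 5000"
    }
}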
 

After adding a few sample products, a successful execution with the function value set to readall returned the following response:

{
    "response": "Read All Products",
    "data": [{
        "Id": "bd05f8c9-4267-4960-9b65-3506cbc7dd60",
        "ProductCategory": "Laptop",
        "ProductName": "Dell Inspiron 5000"
    }, {
        "Id": "1a1d54bf-9523-460e-a78d-1d127af7be67",
        "ProductCategory": "Laptop",
        "ProductName": "ROG Zephyrus Duo 16"
    }, {
        "Id": "2ebd537e-278e-414b-a1b6-b38f1a1f4a13",
        "ProductCategory": "Laptop",
        "ProductName": "Lenovo Legion"
    }, {
        "Id": "24a8d3b0-77e7-44ad-ac20-6fbdb8c5e04d",
        "ProductCategory": "Laptop",
        "ProductName": "HP Pavilion 15"
    }, {
        "Id": "983735cc-8e0e-4c04-ac94-d6ccbc3de6ec",
        "ProductCategory": "Laptop",
        "ProductName": "Alienware 17"
    }]
}
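
If you prefer the terminal over the console, executions can also be triggered through the Appwrite CLI. Treat the command below as a sketch: depending on your CLI version the payload flag may be named --data or --body, and the function ID is the one printed when you deployed the function:

appwrite functions createExecution \
    --functionId 639c3fa02529a234c305 \
    --data '{"function":"readall"}'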
 



Conclusion

And that brings us to the end, folks! We hope you enjoyed learning how you can integrate Appwrite with Azure to build real-world solutions. Feel free to visit the GitHub repo for the Cloud Function we created, adityaoberai/Appwrite-CosmosDB-CRUD-Function-Sample, and give it a star if you like it.


In case you want to learn more about Appwrite, feel free to visit the Appwrite Documentation, and join the Appwrite Discord server if you need any help.

 

